Category Archives: Code for Art

Orbit Player Prototype

While researching my thesis on rhythm, I became fascinated with the front cover of the Wire Magazine book “Undercurrents”. It depicts a record player arm on the rings of a tree, so I wanted to create a piece referencing how the needle of a turntable orbits the sounds, as well as how sounds change over time.

The general idea was to convey that our position (or the position of objects within a space) has a rhythmic relationship to everything else. I understand how far a table might be from me or how close a person walks past me, but I have difficulty fully grasping the simple relationships that all the pieces have together. This piece illustrates how the positions of pieces relate to each other over time.

The colors of each of the notes depicted in the piece are inspired by I. J. Belomont’s understanding of the scale and each note’s association with color.

Here is a demo of the application:

In this sketch, I have several instruments that age as they are played. Currently they have two states (young and old). I hope in future iterations to continue this idea of aging and rhythm, as well as include the idea of movement within a space. I also want to improve performance and program this for the iPad.

If you would like to try it out, you can download the application here. Once you launch it, just press “f” (make sure it’s lower case) to make it full screen. If you do test it out, it would be awesome to hear your feedback, so please post comments. Remember, this is a working prototype.

Visual Sequencer – 2nd Prototype

After getting initial feedback from the class, I added some text that provides information for the user of the sequencer as well as a control for adjusting the tempo.

I am currently using a counter to keep the speed of the pattern, so for the next iteration, I’ll be sure to use actual time (such as milliseconds) to maintain the speed and tempo.
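As a rough sketch of what that time-based approach could look like (the function and variable names here are my own, not from the posted code), the current step of the pattern can be derived from elapsed milliseconds and a tempo in BPM:

```cpp
#include <cstdint>

// Derive the current 16th-note step (0-15) of a one-bar pattern from
// elapsed time in milliseconds and a tempo in beats per minute, instead
// of incrementing a frame counter.
int currentStep(std::int64_t elapsedMs, double bpm, int stepsPerBar = 16) {
    double msPerBeat = 60000.0 / bpm;                  // one quarter note
    double msPerStep = msPerBeat * 4.0 / stepsPerBar;  // 16 steps per 4/4 bar
    return static_cast<int>(elapsedMs / msPerStep) % stepsPerBar;
}
```

Tying the step to the clock this way keeps the tempo steady even when the frame rate fluctuates, which is exactly the problem a per-frame counter runs into.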

Here is a run through of the latest prototype:

Code is posted in the comments.

Visual Sequencer

In Code for Art, we were asked to pick an adjective as a constraint for our next project. Since my core thesis topic is rhythm, I chose “rhythmic” as my word of choice.

I had the “a-ha” moment last week after a combination of events, influences, and interests all seemed to intersect (Maker Faire, Trespass book release show, visual music, openFrameworks, QR codes, psychogeography, etc), and I have decided to create a form of visual music graffiti for my thesis project.

In general, I will be creating a custom application that allows musicians and artists to compose music/sound pieces, export them as images, and paste the visual pieces up. Using an app, the public can “read” these images and translate them into the music that is embedded within the image. There is much more to the project but that is a brief overview.

For this prototype, I took my first step toward illustrating my idea:

This program reads the pixel information from the camera image, stores the pixel RGB values into an array, and, based on those values, maps the image to the sequencer. Basically, it looks for black pixels in specific locations and maps them accordingly.
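A minimal sketch of that scanning step, stripped of the openFrameworks camera plumbing (the names, the single-row grid, and the brightness threshold are illustrative assumptions, not the posted code):

```cpp
#include <vector>

// Sample an RGB pixel buffer (width * height * 3 bytes) at the center of
// each grid cell in one row; a sequencer step is "on" when the sampled
// pixel is near-black.
std::vector<bool> readPattern(const std::vector<unsigned char>& pixels,
                              int width, int cols, int row,
                              int threshold = 40) {
    std::vector<bool> steps(cols, false);
    int cellW = width / cols;
    for (int c = 0; c < cols; ++c) {
        int x = c * cellW + cellW / 2;   // center of each cell
        int i = (row * width + x) * 3;   // index into interleaved RGB data
        int brightness = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
        steps[c] = brightness < threshold;  // black pixel => active step
    }
    return steps;
}
```

In the real application the buffer would come from the camera each frame, but the black-pixel test itself is this simple.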

Since I started running out of time before class, I had to hardcode some of my variables, so I’m going to need to go back and redo some of this work. I would also like to add the ability to change the tempo, adjust the number of sequencer buttons, alter which instruments are played, and include some text for directions.

As a first prototype, it was successful in showing the general idea of the final outcome. Looking forward to doing the next prototype.

Code for Art – Random Proximity Drum Machine

For our third assignment, we were asked to use arrays in an interesting way, so I made a random-beat, proximity-sensitive drum machine. The available beats are scattered about the page in a random assortment, and when the user clicks, the closer the pointer is to an instrument, the louder the sound.
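The proximity-to-volume mapping could be sketched like this (a simple linear falloff; the function name and the shape of the curve are my assumptions, not necessarily what the posted code does):

```cpp
#include <cmath>

// Map the distance between the pointer and an instrument to a volume in
// [0, 1]: full volume right on top of the instrument, silent at maxDist
// or beyond, linear in between.
float proximityVolume(float px, float py, float ix, float iy, float maxDist) {
    float d = std::hypot(px - ix, py - iy);  // Euclidean distance
    if (d >= maxDist) return 0.0f;
    return 1.0f - d / maxDist;               // linear falloff
}
```

A non-linear curve (e.g. squaring the result) would make the falloff feel more natural to the ear, which is one direction the refinement could take.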

I need to refine how the proximity works, but this is a good first prototype. Here is a video example:

Here is a link to my code.

Code for Art: Research Project

For my Code for Art class, we were asked to prepare a brief report about a project that we read about on a blog or saw in our travels that uses code in some way. I picked Usman Haque’s “Primal Source” as my project to discuss and describe.

“Primal Source” was commissioned by the City of Santa Monica, California, for Glow 08, and it was an all-night performance/installation brought to life through the active participation of festival-goers (estimated at approx. 200,000 over the course of the night).

Responding to sounds emanating from the crowd, the system’s modes changed every few minutes depending on how active the crowd participation was (more quickly when there was more noise). Each mode responded in a slightly different way to the individual voices and sounds picked up by 8 microphones distributed towards the front.

Here are a few photos from a video posted on Haque’s website describing the project. The set-up involved microphones that handled the user input, projectors for the visuals, and a water “fountain” constructed on the beach.

I have been completely floored by many of Haque’s projects (ex. Sky Ear), and when I see the dates of these projects, I know I have a lot of work ahead of me to be able to pull off extensive installations like this. The interaction is simple, but after doing a number of projects involving crowd participation, I understand how difficult it is to provide simple interaction.

I was lucky enough to have seen Haque speak at Parsons last year. I know I would have eventually found his work, but it was great to kick off the program seeing such an inspiring artist.

The nature of the project necessitated code-based systems. If there were no interactive elements, projections alone would have sufficed, but because the crowd noise was a key component driving the visuals, code-based systems (in this case, Processing and Pure Data) were necessary.

Here is video documentation of the “Primal Source” project:

Code for Art – Reinterpretation of Plants vs Zombies

Our assignment for Code for Art this week was to write an interactive application that would reveal the identity of a celebrity with keyboard or mouse interaction. I decided to use Plants vs Zombies as my celebrity of choice. Since more than 300,000 copies of the game were sold in the first 9 days, many consider this game the top-grossing app ever launched for the iPhone (link).

Built in openFrameworks, my reinterpretation uses the coloring pattern of the game’s backyard as the first clue to my celebrity’s identity. When the user clicks within the green-colored squares, a plant is placed in the appropriate square. After the plant has been placed, the application automatically places a zombie in another lane. No matter where you place your plant, the zombie will always choose an alternative lane. After the zombie traverses the whole backyard, the phrase “Zombies ate your brains!” appears on the screen.
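The “always choose an alternative lane” behavior can be sketched as drawing from the remaining lanes and skipping over the occupied one (a hypothetical helper for illustration only; the actual assignment code uses no functions at all):

```cpp
// Pick a zombie lane that is guaranteed not to be the plant's lane:
// choose among the numLanes - 1 free lanes, then shift the choice past
// the occupied lane. rnd is any non-negative random number.
int zombieLane(int plantLane, int numLanes, int rnd) {
    int lane = rnd % (numLanes - 1);  // index into the free lanes
    if (lane >= plantLane) ++lane;    // skip over the plant's lane
    return lane;
}
```

The skip-over trick avoids the common pitfall of re-rolling the random number in a loop until it differs from the plant's lane.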

Here is a sample run-through of the application:

I posted my code here. Please note, we were asked to build this whole application using simple shapes, such as ofRect, ofCircle, ofEllipse, and ofVertex, so it might be a little painful to review. In other words, there are plenty of lines of code (no arrays, functions, or classes were used). Also, we were asked not to reveal who our celebrity was anywhere in the code, so there are not many comments describing what is what.

Code for Art – Sample Player

This is my first openFrameworks program, written for the first assignment of my Code for Art class this semester. It is a sample player that uses the boundaries of a box and a bouncing ball to trigger samples for a drum beat (kick, hi-hat, and handclaps) and bass guitar, as well as a clickable keyboard. I included a pitch adjustment, so the notes for the bass and keyboard can be adjusted based on where you click within the box.
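A sketch of how a click position might map to a pitch adjustment (the linear half-speed-to-double-speed range and the function name are my assumptions, not necessarily what the program uses):

```cpp
// Map a click's x position inside the box to a playback speed multiplier:
// left edge plays at 0.5x (roughly an octave down), right edge at 2.0x
// (roughly an octave up), clamped to the box bounds.
float pitchFromClick(float x, float boxLeft, float boxWidth) {
    float t = (x - boxLeft) / boxWidth;  // normalize to [0, 1]
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return 0.5f + t * 1.5f;              // linear 0.5 .. 2.0
}
```

Varying the playback speed of a sample shifts its pitch, which is the simplest way to get a range of notes out of a single recorded bass or keyboard sample.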

Here is a link to the code.