Category Archives: 2011 Spring

Final Thesis Presentation

May 15, 2011 (Sunday)
Interfacing: Dynamic space as alternative to screen interactions
MFA DT Thesis Symposium
Kellen Auditorium

Above is a video of my final Thesis presentation at Parsons The New School for Design. The panel included Matt Ruby, Brett Burton, Burcum Turkmen / Katie Koepfinger, Bryant Davis, Ashley Ahn, and me. The talk concluded with a discussion between the panel and two respondents, Daniel Iglesia and Zach Lieberman.

To see the panel in its entirety (which includes a discussion with respondents), click here.

MFA DT Thesis Show is up!

The MFA DT Thesis Gallery show is up at Parsons The New School for Design (The Sheila C. Johnson Design Center, 2 W 13th Street, Kellen Gallery), and it is open to the public.

The show is open daily from May 7 – 23, 2011 (10am – 6pm). Here is a video walk-through I put together:

For more information on the Thesis show, timing of the talks, and other related events, head to the MFA DT Thesis site here.

RhythmSynthesis Gallery Installation

This Friday was the first day we could begin installing our pieces in the Kellen Gallery at The Sheila C. Johnson Design Center, so I decided to get my project in there early and test it.

A few classmates and I tried it out once I got the amp plugged in. I invited the folks that run the gallery and build most of the displays for the work in Kellen Gallery to test it as well. Watching everyone have such a good time made the long hours over the past few weeks and months all worth it.

Here is a video of a few of the folks who used the piece while I was setting up:

The Gallery show runs from May 7 – 23, 2011 at Parsons The New School for Design (The Sheila C. Johnson Design Center, 2 W 13th Street, Kellen Gallery), and it is open to the public so come on by. The opening party is on the evening of May 8th (7pm), and Symposium talks are on May 14 and 15.

For more information on the Thesis show, timing of the talks, and other related events, head to the MFA DT Thesis site here.

RhythmSynthesis Interface tests

I invited Brett Tieman, J. Nicholas Moy, and Jeremy Gough (with whom I played in the band The Mugs) as well as fellow MFA DT students Haeyoung Kim and Matt Ruby to test the interface of my final project.

I incorporated a number of updates to both the software and hardware, which included:

  • playhead movement tied to the computer clock for more accurate timing
  • “sliders” along the right and left edges to control speed and volume
  • updated oscillators that generate sounds within the openFrameworks application
  • new acrylic top that diffuses light more evenly
  • purposefully designed shapes and colors for the acrylic pieces used to make sounds
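The clock-tied playhead update can be sketched outside of openFrameworks as a small helper. This is a hypothetical illustration (the function and parameter names are mine, not the app's): the angle is derived from elapsed wall-clock time instead of being incremented once per frame.

```cpp
#include <chrono>
#include <cmath>

// Illustrative constant instead of the non-standard M_PI.
constexpr double TWO_PI = 6.283185307179586;

// Derive the playhead angle from elapsed wall-clock time rather than
// incrementing it per frame, so timing stays accurate even if the
// frame rate dips during heavy camera processing.
double playheadAngle(std::chrono::steady_clock::time_point start,
                     std::chrono::steady_clock::time_point now,
                     double periodSeconds) {
    double elapsed = std::chrono::duration<double>(now - start).count();
    double phase = std::fmod(elapsed, periodSeconds) / periodSeconds; // 0..1
    return phase * TWO_PI;
}
```

Because the angle is a pure function of time, dropped frames only skip drawing, never the rotation itself.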

The comments centered on the sounds generated by the application, and they reaffirmed my belief that when the sounds have more distinctive characteristics (such as range of octaves and speed of attack), users are able to compose without obvious or deliberate visual feedback.
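As an illustration of how attack speed can differentiate sounds, here is a minimal, hypothetical voice generator (not the app's actual oscillator code): a sine tone shaped by a linear attack envelope, where a longer attack reads as a softer articulation.

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

// Render a sine tone with a linear attack envelope. Mapping different
// piece shapes to different attack times gives each sound a distinct
// articulation that performers can identify by ear alone.
std::vector<float> renderVoice(double freqHz, double attackSec,
                               double durSec, double sampleRate) {
    const double twoPi = 6.283185307179586;
    std::vector<float> out(static_cast<std::size_t>(durSec * sampleRate));
    for (std::size_t i = 0; i < out.size(); ++i) {
        double t = i / sampleRate;
        double env = attackSec > 0.0 ? std::min(1.0, t / attackSec) : 1.0;
        out[i] = static_cast<float>(env * std::sin(twoPi * freqHz * t));
    }
    return out;
}
```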

Below is a video demonstration I did of this updated interface as well as images from the final tests. Headphones are recommended (low notes do not play well on computer speakers):

AV Systems – SuperCollider 140 Character

Inspired by The Wire magazine SC140 project, our assignment for AV Sys was to create a SuperCollider sketch that was under 140 characters. Based on the “Micromoog” piece included in the SC140 compilation, I made this:


And here’s an audio recording of the first 2:30 of the piece:


AV Systems – Music Video

For this week’s assignment, we were asked to make a music video that used code in an interesting way. It’s been a crazy thesis week, so I figured drum and bass music would be an appropriate genre to explore.

I shot the street footage by mounting a camera to my bike and riding home from school. I ran blob detection on different aspects of the video, and the alpha levels of the blob-colored visuals are driven by the values of FFT ranges. The song is “Urban Shakedown ft. DBO General – Some Justice 95 aka Arsonist (Vocal mix)”.
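The FFT-to-alpha mapping can be sketched roughly like this. It's a hypothetical helper (the bin ranges and gain here are placeholders; they depend on the actual FFT settings): average the magnitudes in a band, scale, clamp, and use the result as a layer's alpha.

```cpp
#include <algorithm>
#include <cstddef>

// Average the magnitudes of an FFT bin range and map the result to an
// alpha value (0-255), so a blob layer fades with that band's loudness.
int bandToAlpha(const float* spectrum, std::size_t lowBin,
                std::size_t highBin, float gain) {
    float sum = 0.0f;
    for (std::size_t i = lowBin; i < highBin; ++i) sum += spectrum[i];
    float avg = sum / static_cast<float>(highBin - lowBin);
    float a = std::min(1.0f, avg * gain); // clamp so alpha never overflows
    return static_cast<int>(a * 255.0f);
}
```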

This video is dedicated to Monday Night Vinyl Club.

Noir Festival Exhibition

My collaborative visual music piece, Modern Romance, was selected to be part of the inaugural New School Arts Festival, Noir, going on this month. The video will be shown as part of the Illuminating Noir gallery exhibition which was curated by faculty members Simone Douglas, Ben Katchor, and Christiane Paul and runs from April 1 – 9.

A combination of film, video, photography, and interactive media art, Illuminating Noir looks to be a significant contribution to the exploration of noir.

Admission is free, and here are details:
Fine Arts Gallery, Parsons The New School for Design
25 East 13th Street, 5th Floor (map)
Open Daily noon-6 pm, and Thursday noon-8 pm

Here are two pieces that caught my eye while I was at the gallery to set up:

There are a ton of events, talks, and screenings, so definitely check the schedule and come to an event.

RhythmSynthesis Sound Tests

This past weekend I invited fellow designers, artists, musicians, programmers, friends, and family to Parsons to try out the latest hardware and software version of my tangible color music instrument. It had been a few weeks since the last set of tests, and I made a big push to get as many updates as possible into the hardware and software for this weekend's Sound Tests and for the final presentations at school.

On the night before the testing, I finished the hardware improvements, which included a new aluminum arm for the camera mount and an enclosure for the camera. I removed the PS3 Eye from its original casing, reducing the camera's weight and footprint.

The main goals for this weekend were to rigorously test the software to see if it was stable enough for long performances and to see how musicians and artists dealt with the interface. I truly feel that the audio is enough feedback for performers to know where the playhead is, but I needed to put the new hardware in front of people and see. Also, up to this point, the majority of the people testing my project had been from the MFA Design and Technology program, and I wanted a fresh set of eyes and ears on the project.

Here are a set of images from both days:



The results were great. I received a lot of feedback (and some encouragement) that will help me with the next steps of this project. The primary comments dealt with how best to give the musician feedback about what is actually happening and to help them decide how to use the pieces. Some people preferred more audio feedback, such as more drastic changes between the sounds when pieces are rotated or differ in shape. Others preferred more visual feedback so the user would know the location of the rotating playhead.

I put together a brief sample of the performances, and here is the video:

I also put together an extended version that includes all of the performances from the weekend, so if you're still feeling like exploring the results from the tests, you can see the long player here.

AV Systems – Granular Synthesis Recorder

For the advanced AV Sys homework, we were asked to build an app that uses live audio input to record data for a granular synthesizer.

I spent most of spring break hammering this out, and the current version works but is far from perfect. Here is a video demonstration:

Basically, I am taking input buffer data, storing it in dynamic float arrays, and then sending the information to a version of Zach Lieberman’s granular synthesis code that I hacked up quite a bit.
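In outline, the record-then-granulate flow looks something like the sketch below. This is a stand-in with names of my own choosing, not the actual hacked-up granular code: incoming blocks are appended to a growing float buffer, and grains are read back out through a Hann window.

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Record incoming audio into a growing float buffer, then read short
// Hann-windowed grains back out of it for granular playback.
struct GrainRecorder {
    std::vector<float> recorded;

    // Append one block of live input (what an audio-in callback delivers).
    void record(const float* input, std::size_t n) {
        recorded.insert(recorded.end(), input, input + n);
    }

    // Copy `length` samples starting at `start`, shaped by a Hann window
    // so each grain fades in and out without clicks.
    std::vector<float> grain(std::size_t start, std::size_t length) const {
        std::vector<float> g(length, 0.0f);
        const double pi = 3.141592653589793;
        const double denom = length > 1 ? static_cast<double>(length - 1) : 1.0;
        for (std::size_t i = 0; i < length && start + i < recorded.size(); ++i) {
            double w = 0.5 * (1.0 - std::cos(2.0 * pi * i / denom));
            g[i] = recorded[start + i] * static_cast<float>(w);
        }
        return g;
    }
};
```

The window is what keeps overlapping grains from clicking; the synthesizer parameters (grain length, start position, density) are what I plan to make adjustable after recording.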

I plan on updating this so that the parameters of the synthesizer can be changed after the sounds are recorded, and I’ll do a post (with code) in the coming weeks.