Category Archives: Thesis Studio

Final Thesis Presentation

May 15, 2011 (Sunday)
Interfacing: Dynamic space as alternative to screen interactions
MFA DT Thesis Symposium
Kellen Auditorium

Above is a video of my final Thesis presentation at Parsons The New School for Design. The panel included Matt Ruby, Brett Burton, Burcum Turkmen / Katie Koepfinger, Bryant Davis, Ashley Ahn, and myself. The talk concluded with a discussion between the panel and two respondents, Daniel Iglesia and Zach Lieberman.

To see the panel in its entirety (which includes a discussion with respondents), click here.

MFA DT Thesis Show is up!

The MFA DT Thesis Gallery show is up at Parsons The New School for Design (The Sheila C. Johnson Design Center, 2 W 13th Street, Kellen Gallery), and it is open to the public.

The show is open daily from May 7 – 23, 2011 (10am – 6pm). Here is a video walk-through I put together:

For more information on the Thesis show, timing of the talks, and other related events, head to the MFA DT Thesis site here.

RhythmSynthesis Gallery Installation


This Friday was the first day we could begin installing our pieces in the Kellen Gallery at The Sheila C. Johnson Design Center, so I decided to get my project in there early and test it.

A few classmates and I tried it out once I got the amp plugged in, and I invited the folks who run the gallery and build most of the displays in Kellen Gallery to test it as well. Watching everyone have such a good time made the long hours over the past few weeks and months all worth it.



Here is a video of a few of the folks who used the piece while I was setting up:

The Gallery show runs from May 7 – 23, 2011 at Parsons The New School for Design (The Sheila C. Johnson Design Center, 2 W 13th Street, Kellen Gallery), and it is open to the public so come on by. The opening party is on the evening of May 8th (7pm), and Symposium talks are on May 14 and 15.

For more information on the Thesis show, timing of the talks, and other related events, head to the MFA DT Thesis site here.

RhythmSynthesis Interface tests


I invited Brett Tieman, J. Nicholas Moy, and Jeremy Gough (with whom I played in the band The Mugs) as well as fellow MFA DT students Haeyoung Kim and Matt Ruby to test the interface of my final project.

I incorporated a number of updates to both the software and hardware, which included:

  • playhead movement tied to the computer clock for more accurate timing (see the sketch after this list)
  • “sliders” along the right and left edges to control speed and volume
  • updated oscillators that generate sounds within the openFrameworks application
  • new acrylic top that diffuses light more evenly
  • purposefully designed shapes and colors for acrylic pieces used to make sounds
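
As an illustration of the first item, here is a minimal sketch of what tying the playhead to the computer clock might look like in openFrameworks: the angle is derived from elapsed time rather than incremented each frame, so dropped frames never slow the sweep. The class, tempo value, and beats-per-revolution are assumptions for the sake of the example, not the project's actual code:

    #include "ofMain.h"

    // Sketch: compute the playhead angle from elapsed clock time instead
    // of incrementing it once per frame.
    class Playhead {
    public:
        float bpm = 120.0f;               // assumed tempo driving the sweep
        float beatsPerRevolution = 4.0f;  // assumed: one rotation per four beats

        // Current angle in degrees, derived from the system clock, so the
        // sweep stays in musical time regardless of the frame rate.
        float angleDegrees() const {
            float beats = ofGetElapsedTimef() * (bpm / 60.0f);
            float revolutions = beats / beatsPerRevolution;
            return fmodf(revolutions * 360.0f, 360.0f);
        }
    };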

The comments centered on the sounds generated by the application, and they reaffirmed my belief that when the sounds have more distinctive characteristics (such as octave range and attack speed), users are able to compose without obvious or deliberate visual feedback.

Below is a video demonstration of the updated interface, along with images from the final tests. Headphones are recommended (low notes do not play well on computer speakers):


RhythmSynthesis Sound Tests

This past weekend I invited fellow designers, artists, musicians, programmers, friends and family to Parsons to try out the latest hardware and software version of my tangible color music instrument. It had been a few weeks since the last set of tests, and I made a big push to get as many updates as possible into the hardware and software for this weekend's Sound Tests and for final presentations at school.


The night before the testing, I finished the hardware improvements, which included a new aluminum arm for the camera mount and an enclosure for the camera. I removed the PS3 Eye from its original casing, reducing the weight and space the camera takes up.

The main goals for this weekend were to rigorously test the software to see if it was stable enough for long performances and to see how musicians and artists dealt with the interface. I truly feel that the audio alone gives performers enough feedback to know where the playhead is, but I needed to put the new hardware in front of people to find out. Also, up to this point, most of the people testing my project had been from the MFA Design and Technology program, and I wanted a fresh set of eyes and ears on it.

Here is a set of images from both days:

SATURDAY


SUNDAY


The results were great. I received a lot of feedback (and some encouragement) that will help me with the next steps of this project. The primary comment dealt with how best to give the musician feedback about what is actually happening and to help them decide how to use the pieces. Some people preferred more audio feedback, such as more drastic changes in sound when pieces are rotated or differ in shape. Others preferred more visual feedback so the user would know the location of the rotating playhead.

I put together a brief sample of the performances, and here is the video:

I also put together an extended version that includes all of the performances from the weekend, so if you're still feeling like exploring the results from the tests, you can see the long player here.

RhythmSynthesis User Testing

This past weekend, I ran a series of user tests with the latest prototype of my thesis project. I reserved a classroom on the 10th floor of 2 W 13th Street (room 1013) and set up the prototype along with a large guitar amp, various effects pedals (delay, distortion, etc), and a drum machine. The group of people who tested the project included fellow students, friends, musicians, and Parsons Faculty.


I made a few updates to the program, including a full 12-tone scale matched to specific hue values across the visible color spectrum, tempo/speed changes, and the use of object area to determine octave (the larger the piece, the lower the octave; the smaller the piece, the higher the octave). I also updated the light platform to provide a better color for camera detection, since the PS3 Eye is biased a little toward the cooler end of the spectrum.
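
As a rough sketch of how such a mapping could work (with assumed area thresholds and an OpenCV-style 0–255 hue range, not the project's exact values), hue can be bucketed into one of the 12 chromatic pitch classes while area selects the octave:

    #include <cmath>

    // Hypothetical mapping from a detected blob's hue and area to a pitch.
    // hue is in [0, 255]; returns a MIDI note number.
    int hueToMidiNote(float hue, float area) {
        int pitchClass = static_cast<int>((hue / 256.0f) * 12.0f) % 12; // 12 equal hue bands
        int octave;
        if (area > 5000.0f)      octave = 3;  // big piece -> low octave (assumed threshold)
        else if (area > 1500.0f) octave = 4;
        else                     octave = 5;  // small piece -> high octave
        return 12 * (octave + 1) + pitchClass; // MIDI convention: C4 = 60
    }

    // MIDI note -> frequency in Hz for the synth (A4 = 440 Hz).
    float midiToFrequency(int note) {
        return 440.0f * powf(2.0f, (note - 69) / 12.0f);
    }

Under this scheme, a large red piece and a small red piece would produce the same pitch class in different octaves, which matches the behavior described above.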




The reactions and results were great, and they provided plenty of feedback for taking the next steps with the interface and the software portion of the project. In general, there was a wide array of approaches to creating musical pieces. Some were methodical, others used the space between the platform and the camera to manipulate the sounds, and others just went crazy.

Here is a video of some of the folks testing this prototype:

Thanks to Ivy, Greg Climber (and his friend), Katherine Moriwaki, Jonah Brucker-Cohen, Matt Ruby, Steven Sclafani, Cara Lynn, Lou Sclafani, Zach Lieberman, George Bixby, Rob Ramirez, Shane Lessa, and Manuel Rueda Iragorri for coming in to test and offering up your valuable feedback. Looking forward to tackling the next set of updates and changes.

The next user testing session will be…
2 W 13th Street, Room 1013 (map)
3/26 (Saturday) 1 – 7pm
3/27 (Sunday) 4 – 10pm

All are welcome, so if you plan on coming, please send me an e-mail so I can make sure you’re all set with security downstairs.

For more information on my thesis project, RhythmSynthesis, you can check here.

RhythmSynthesis Sound Tests This Month


Starting this weekend, I am hosting a series of collaborative sound tests at Parsons The New School for Design. The goals of the series are to test how my prototype can be used as an instrument in individual and group configurations, to receive feedback and discuss the project with my musical and non-musical peers, and to create new musical pieces. Here is the info:

RhythmSynthesis Sound Tests
Where: 2 W 13th Street, Room 1013 (map)
When: On the following weekends…

3/5 (Saturday) 1 – 7pm
3/6 (Sunday) 1 – 10pm
3/26 (Saturday) 1 – 7pm
3/27 (Sunday) 4 – 10pm

I’ll have an assortment of instruments and amplifiers, and I’ll be recording the sessions in their entirety. All are welcome, so if you plan on coming, please send me an e-mail so I can make sure you’re all set with security downstairs.

Thesis Prototype – Additive Synthesis and Controls

New musical interfaces are necessary to further explore the complexities of rhythm. RhythmSynthesis proposes a new instrument for composition and performance to continue such exploration. Originating as an investigation into the relationships between rhythm and technology, RhythmSynthesis applies color, shape, and sound to demonstrate how our understanding of visual music, computation, and tangible, audio-visual interactions can be applied as considerations in musical expression.

The goal for this prototype was to introduce updated sound generation and controls in the application. Using additive synthesis code I developed in openFrameworks for the course Audio Visual Systems with Zach Lieberman and Zach Gage, I analyze the shape and size of each colored object to determine the sounds that should be made. I also included sliders for changing the rate at which the objects are sensed and played.
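
For readers unfamiliar with the technique, here is a minimal additive oscillator in C++: each output sample sums sine partials at integer multiples of a fundamental frequency. The partial count and the 1/k amplitude roll-off are illustrative choices rather than the project's actual code:

    #include <cmath>

    // Minimal additive-synthesis oscillator: the output is a sum of sine
    // partials at integer multiples of a fundamental frequency.
    class AdditiveOsc {
    public:
        AdditiveOsc(float freq, int numPartials, float sampleRate = 44100.0f)
            : freq(freq), sampleRate(sampleRate), numPartials(numPartials) {
            for (int k = 1; k <= numPartials; ++k) norm += 1.0f / k;
        }

        // Produce one audio sample; call once per sample frame.
        float next() {
            const float TWO_PI = 6.28318531f;
            float sample = 0.0f;
            for (int k = 1; k <= numPartials; ++k) {
                sample += sinf(TWO_PI * k * phase) / k; // k-th harmonic, 1/k roll-off
            }
            phase = fmodf(phase + freq / sampleRate, 1.0f); // wrap phase each sample
            return sample / norm; // normalize to roughly [-1, 1]
        }

        float freq; // fundamental in Hz, e.g. derived from a piece's color

    private:
        float sampleRate;
        int numPartials;
        float phase = 0.0f;
        float norm = 0.0f;
    };

Changing the number of partials or their relative weights is what gives each colored shape a distinguishable timbre, which is the kind of distinctiveness the earlier tests suggested performers rely on.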

Here is a brief demonstration:

You can see other prototypes and relevant write-ups here.