Tag Archives: Thesis

Final Thesis Presentation

May 15, 2011 (Sunday)
Interfacing: Dynamic space as alternative to screen interactions
MFA DT Thesis Symposium
Kellen Auditorium

Above is a video of my final Thesis presentation at Parsons The New School for Design. The panel included Matt Ruby, Brett Burton, Burcum Turkmen / Katie Koepfinger, Bryant Davis, Ashley Ahn, and me. The talk concluded with a discussion between the panel and two respondents, Daniel Iglesia and Zach Lieberman.

To see the panel in its entirety (which includes a discussion with respondents), click here.

RhythmSynthesis Gallery Installation


This Friday was the first day we could begin installing our pieces in the Kellen Gallery at The Sheila C. Johnson Design Center, so I decided to get my project in there early and test it.

A few classmates and I tried it out once I got the amp plugged in. I invited the folks who run the gallery and build most of the displays for the work in Kellen Gallery to test it as well. Watching everyone have such a good time made the long hours over the past few weeks and months all worth it.



Here is a video of a few of the folks who used the piece while I was setting up:

The Gallery show runs from May 7 – 23, 2011 at Parsons The New School for Design (The Sheila C. Johnson Design Center, 2 W 13th Street, Kellen Gallery), and it is open to the public so come on by. The opening party is on the evening of May 8th (7pm), and Symposium talks are on May 14 and 15.

For more information on the Thesis show, timing of the talks, and other related events, head to the MFA DT Thesis site here.

RhythmSynthesis Interface Tests


I invited Brett Tieman, J. Nicholas Moy, and Jeremy Gough (whom I played with in the band The Mugs), as well as fellow MFA DT students Haeyoung Kim and Matt Ruby, to test the interface of my final project.

I incorporated a number of updates to both the software and hardware, which included:

  • playhead movement tied to the computer clock for more accurate timing (see the sketch after this list)
  • “sliders” along the right and left edges to control speed and volume
  • updated oscillators that generate sounds within the openFrameworks application
  • new acrylic top that diffuses light more evenly
  • purposefully designed shapes and colors for the acrylic pieces used to make sounds
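
To make the first update concrete, here is a minimal sketch in openFrameworks-style C++ of what clock-driven playhead timing can look like. The class and variable names (Playhead, rotationPeriod) are placeholders I made up for illustration, not the actual project code:

#include "ofMain.h"

// A playhead whose angle is derived from elapsed wall-clock time rather
// than the frame count, so a dropped frame no longer slows the sweep.
class Playhead {
public:
    float rotationPeriod = 4.0f; // seconds per revolution (set by the speed slider)

    float angle() const {
        // Position within the current revolution, mapped to 0..TWO_PI.
        float t = fmodf(ofGetElapsedTimef(), rotationPeriod);
        return (t / rotationPeriod) * TWO_PI;
    }
};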

The comments centered on the sounds generated by the application, and they reaffirmed my belief that when the sounds have more distinctive characteristics (such as octave range and attack speed), users can compose without obvious or deliberate visual feedback.

Below is a video demonstration of the updated interface, along with images from the final tests. Headphones are recommended (low notes do not play well on computer speakers):


RhythmSynthesis Sound Tests

This past weekend I invited fellow designers, artists, musicians, programmers, friends, and family to Parsons to try out the latest hardware and software version of my tangible color music instrument. It had been a few weeks since the last set of tests, so I made a big push to get as many updates as possible into the hardware and software for this weekend's Sound Tests and for final presentations at school.


The night before the testing, I finished the hardware improvements, which include a new aluminum arm for the camera mount and an enclosure for the camera. I removed the PS3 Eye from its original casing, reducing the weight and space the camera takes up.

The main goals for this weekend were to rigorously test the software to see if it was stable enough for long performances and to see how musicians and artists dealt with the interface. I truly feel that the audio is enough feedback for performers to know where the playhead is, but I needed to put the new hardware in front of people and see. Also, up to this point the majority of the people testing my project had been from the MFA Design and Technology program, and I wanted a fresh set of eyes and ears on the project.

Here is a set of images from both days:

SATURDAY


SUNDAY


The results were great. I received a lot of feedback (and some encouragement) that will help me with the next steps of this project. The primary comments dealt with how best to give the musician feedback about what is actually happening and to help them decide how to use the pieces. Some people preferred more audio feedback, such as more drastic changes between the sounds when pieces are rotated or differ in shape. Others preferred more visual feedback, so the user would know the location of the rotating playhead.

I put together a brief sample of the performances, and here is the video:

I also put together an extended version that includes all of the performances from the weekend, so if you're still in the mood to explore the results from the test, you can see the long player here.

RhythmSynthesis User Testing

This past weekend, I ran a series of user tests with the latest prototype of my thesis project. I reserved a classroom on the 10th floor of 2 W 13th Street (room 1013) and set up the prototype along with a large guitar amp, various effects pedals (delay, distortion, etc.), and a drum machine. The group of people who tested the project included fellow students, friends, musicians, and Parsons faculty.


I included a few updates in the program: a full 12-tone scale matched to specific hue values across the visible color spectrum, tempo/speed changes, and the use of object area to determine the appropriate octave (the larger the piece, the lower the octave; the smaller the piece, the higher). I also updated the light platform to provide a better color for camera detection, since the PS3 Eye is biased a little toward the cooler colors.
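
For illustration, here is roughly what that mapping can look like in openFrameworks-style C++. The helper name, the 0-255 hue range, and the area thresholds are assumptions for this sketch, not the project's actual values:

#include "ofMain.h"

// Hue picks one of 12 semitones; blob area picks the octave (the bigger
// the piece, the lower the octave). Thresholds are invented placeholders.
float hueAndAreaToFreq(float hue, float area) {
    int semitone = (int) ofMap(hue, 0, 255, 0, 12, true) % 12;
    int octave   = (area > 8000) ? 3 : (area > 2000) ? 4 : 5;
    int midiNote = (octave + 1) * 12 + semitone;         // MIDI numbering
    return 440.0f * powf(2.0f, (midiNote - 69) / 12.0f); // equal temperament
}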

The reactions and results were great, and they provided plenty of feedback for taking the next steps with the interface and the software portion of the project. In general, there was a wide array of approaches to creating musical pieces. Some were methodical, others used the space between the platform and the camera to manipulate the sounds, and others just went crazy.

Here is a video of some of the folks testing this prototype:

Thanks to Ivy, Greg Climber (and his friend), Katherine Moriwaki, Jonah Brucker-Cohen, Matt Ruby, Steven Sclafani, Cara Lynn, Lou Sclafani, Zach Lieberman, George Bixby, Rob Ramirez, Shane Lessa, and Manuel Rueda Iragorri for coming in to test and offering up your valuable feedback. Looking forward to tackling the next set of updates and changes.

The next user testing session will be…
2 W 13th Street, Room 1013 (map)
3/26 (Saturday) 1 – 7pm
3/27 (Sunday) 4 – 10pm

All are welcome, so if you plan on coming, please send me an e-mail so I can make sure you’re all set with security downstairs.

For more information on my thesis project, RhythmSynthesis, you can check here.

RhythmSynthesis Sound Tests This Month


Starting this weekend, I am hosting a series of collaborative sound tests at Parsons The New School for Design. The goals of the series are to test how my prototype can be used as an instrument in individual and group configurations, to receive feedback and discuss the project with my musical and non-musical peers, and to create new musical pieces. Here is the info:

RhythmSynthesis Sound Tests
Where: 2 W 13th Street, Room 1013 (map)
When: On the following weekends…

3/5 (Saturday) 1 – 7pm
3/6 (Sunday) 1 – 10pm
3/26 (Saturday) 1 – 7pm
3/27 (Sunday) 4 – 10pm

I’ll have an assortment of instruments and amplifiers, and I’ll be recording the sessions in their entirety. All are welcome, so if you plan on coming, please send me an e-mail so I can make sure you’re all set with security downstairs.

Thesis Prototype – Additive Synthesis and Controls

New musical interfaces are necessary to further explore the complexities of rhythm. RhythmSynthesis proposes a new instrument for composition and performance to continue such exploration. Originating as an investigation into the relationships between rhythm and technology, RhythmSynthesis applies color, shape, and sound to demonstrate how our understanding of visual music, computation, and tangible, audio-visual interactions can be applied as considerations in musical expression.

The goal for this prototype was to introduce updated sound generation and controls in the application. Using additive synthesis code I developed in openFrameworks for the course Audio Visual Systems with Zach Lieberman and Zach Gage, I analyze the shape and size of each colored object to determine the sounds that should be made. I also included sliders for changing the rate at which the objects are sensed and played.
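
As a rough sketch of the sound generation, here is a stripped-down additive synth using the standard openFrameworks audio callback. The partial count, rolloff, and fixed fundamental are placeholders; in the actual prototype those values come from the detected objects:

#include "ofMain.h"

// Bare-bones additive synthesis: sum a few sine harmonics, each quieter
// than the last, inside the audio callback.
class ofApp : public ofBaseApp {
public:
    float phase = 0.0f;
    float freq = 220.0f;          // fundamental (placeholder)
    float sampleRate = 44100.0f;

    void setup() {
        ofSoundStreamSetup(2, 0); // two output channels, no input
    }

    void audioOut(ofSoundBuffer& buffer) {
        for (size_t i = 0; i < buffer.getNumFrames(); i++) {
            float sample = 0.0f;
            for (int h = 1; h <= 4; h++) {            // first four harmonics
                sample += sinf(phase * h) * 0.2f / h; // 1/h amplitude rolloff
            }
            phase += TWO_PI * freq / sampleRate;
            if (phase > TWO_PI) phase -= TWO_PI;      // keep phase bounded
            buffer[i * 2] = sample;                   // left channel
            buffer[i * 2 + 1] = sample;               // right channel
        }
    }
};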

Here is a brief demonstration:

You can see other prototypes and relevant write-ups here.

Thesis Prototype – Proximity Player

For this prototype, I focused on three specific aspects of the user and object interaction, which were:

  • proximity
  • user input via physical interaction
  • user relationship to the installation piece

Using an Ardweeny, a small breadboard, a MaxBotix ultrasonic range finder, a small plastic enclosure, hardboard, and a RadioShack speaker, I created a proximity-based sound box that is mounted on the edge of a bicycle wheel. The mount (made of leftover scraps of wood from the Physical Computing room here at Parsons) is attached to the wheel with zip ties.



I wanted to provide some form of physical input for the person interacting with the piece, and to recognize that if multiple people were involved, each person should have a unique sound experience. By using the bicycle wheel as the method for activating the installation piece, along with PVC pipes that can easily be moved, the user has a direct impact on the sounds that they hear.


As the sound box rotates around the hardboard platform, the PVC pipes placed on the platform trigger tones based on their distance from the sound box. The tone rises and falls as each pipe comes into and out of range.
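
For a sense of how simple the sound box logic is, here is a hypothetical Arduino-style sketch (the Ardweeny runs standard Arduino C++ code). The pin assignments, range threshold, and pitch mapping are guesses for illustration, not the values from my build:

const int SENSOR_PIN  = A0; // MaxBotix analog output
const int SPEAKER_PIN = 8;  // RadioShack speaker

void setup() {
  pinMode(SPEAKER_PIN, OUTPUT);
}

void loop() {
  int reading = analogRead(SENSOR_PIN); // smaller reading = closer pipe
  if (reading < 300) {                  // a pipe is in range
    // Closer pipe -> higher pitch, so the tone rises as it approaches.
    int freq = map(reading, 0, 300, 880, 220);
    tone(SPEAKER_PIN, freq);
  } else {
    noTone(SPEAKER_PIN);                // silence when nothing is near
  }
  delay(20);
}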

I also started initial prototyping of possible patterns and designs that could be projected or otherwise provided on the platform. I initially played with simple patterns of circles, but based on a suggestion from classmate Brett Burton, I also used patterns based on the golden ratio and the Fibonacci sequence.

Orbit Player Prototype

While researching my thesis on rhythm, I became fascinated with the front cover of the Wire Magazine book “Undercurrents”. It depicts a record player arm on the rings of a tree, so I wanted to make a piece that referenced both how the needle of a turntable orbits the sounds and how sounds change over time.

The general idea was to convey that our position (or the positions of objects within a space) has a rhythmic relationship to everything around it. I understand how far a table might be from me or how close a person walks past me, but I have difficulty fully grasping the relationships that all the pieces have together. This piece illustrates how the positions of the pieces relate to each other over time.

The colors of each of the notes depicted in the piece are inspired by I. J. Belmont's understanding of the scale and each note's association with a color.

Here is a demo of the application:

In this sketch, I have several instruments that age as they are played. Currently they have two states (young and old). In future iterations I hope to continue this idea of aging and rhythm, as well as to include the idea of movement within a space. I also want to improve performance and port the program to the iPad.
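
Mechanically, the aging is little more than counting plays and flipping a state once a threshold is crossed, along the lines of this toy C++ sketch, where the names and threshold are invented:

// An instrument crosses from "young" to "old" after enough plays.
struct Instrument {
    int plays = 0;

    bool isOld() const { return plays >= 20; }

    void play() {
        plays++;
        // A young instrument might sound bright and look saturated;
        // an old one could be voiced duller and drawn faded.
    }
};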

If you would like to try it out, you can download the application here. Once you launch it, just press “f” (make sure it's lowercase) to make it full screen. If you do test it out, it would be awesome to hear your feedback, so please post comments. Remember, this is a working prototype.

Rhythm and Everyday Life

In response to my design question “What are latent rhythmic aspects of our lives that can be revealed through technology?”, I shot this video on the F/G train platform at Smith and 9th Street. I believe that each moment is the intersection of experiences, and my project investigates the interaction between rhythm and technology. The goal is to reveal and encourage the use of technology to understand the multitude of rhythms that surround us.

The movement of the clouds, the traffic on the BQE, the grass on top of the subway station, the breeze in the microphone, the arriving and departing train, and the chatter of the passengers all form the symphony of rhythms of my everyday life.