This past weekend I invited fellow designers, artists, musicians, programmers, friends, and family to Parsons to try out the latest hardware and software version of my tangible color music instrument. It had been a few weeks since the last set of tests, and I made a big push to get as many updates as possible into the hardware and software for this weekend's Sound Tests and for final presentations at school.
On the night before the testing, I finished the hardware improvements, which include a new aluminum arm for the camera mount and an enclosure for the camera. I removed the PS3 Eye from its original casing, reducing the camera's weight and footprint.
The main goals for this weekend were to rigorously test the software to see if it was stable enough for long performances and to see how musicians and artists dealt with the interface. I truly feel that the audio provides enough feedback for performers to know where the playhead is, but I needed to put the new hardware in front of people and see. Also, up to this point, the majority of the people testing my project were from the MFA Design and Technology program, and I wanted a fresh set of eyes and ears on the project.
Here are a set of images from both days:
The results were great. I received a lot of feedback (and some encouragement) that will help me with the next steps of this project. The primary comment dealt with how best to give the musician feedback about what is actually happening and to help them decide how to use the pieces. Some people preferred more audio feedback, such as more drastic changes between the sounds when pieces are rotated or differ in shape. Others preferred more visual feedback so the user would know the location of the rotating playhead.
I put together a brief sample of the performances, and here is the video:
I also put together an extended version that includes all of the performances from the weekend, so if you're still feeling like exploring the results from the test, you can see the long player here.
This past weekend, I ran a series of user tests with the latest prototype of my thesis project. I reserved a classroom on the 10th floor of 2 W 13th Street (room 1013) and set up the prototype along with a large guitar amp, various effects pedals (delay, distortion, etc), and a drum machine. The group of people who tested the project included fellow students, friends, musicians, and Parsons Faculty.
I included a few updates in the program: a full 12-tone scale matched to specific hue values across the full visible light color spectrum, tempo/speed changes, and the use of object area to determine the appropriate octave (the larger the piece, the lower the octave; the smaller the piece, the higher the octave). I also updated the light platform to provide a better color for camera detection, since the PS3 Eye's color response is biased a little toward the cooler side.
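The hue-to-note and area-to-octave mappings could be sketched roughly as below. This is a minimal, hypothetical illustration, not the prototype's actual code: the specific hue boundaries, area thresholds, and octave range are assumptions for the example.

```cpp
#include <cassert>
#include <cmath>

// Map a hue angle (0-360 degrees) to one of 12 pitch classes
// (0 = C, 1 = C#, ... 11 = B), dividing the color wheel into
// twelve equal slices. The actual hue assignments in the
// prototype may differ.
int hueToPitchClass(float hueDegrees) {
    float wrapped = std::fmod(hueDegrees, 360.0f);
    if (wrapped < 0.0f) wrapped += 360.0f;
    return static_cast<int>(wrapped / 360.0f * 12.0f) % 12;
}

// Map an object's area (e.g. in pixels) to an octave, inverted so
// that larger pieces sound lower. minArea, maxArea, and the octave
// range (2..6) are illustrative values only.
int areaToOctave(float area,
                 float minArea = 100.0f, float maxArea = 10000.0f,
                 int lowOctave = 2, int highOctave = 6) {
    float clamped = std::fmax(minArea, std::fmin(maxArea, area));
    float t = (clamped - minArea) / (maxArea - minArea);  // 0..1
    // Invert: t = 0 (smallest) -> highest octave, t = 1 -> lowest.
    return lowOctave +
           static_cast<int>((1.0f - t) * (highOctave - lowOctave) + 0.5f);
}

// Combine both into a MIDI note number (octave -1 starts at note 0).
int colorObjectToMidiNote(float hueDegrees, float area) {
    return hueToPitchClass(hueDegrees) + 12 * (areaToOctave(area) + 1);
}
```

A small red piece, for example, would land near the top of the range, while a large red piece of the same hue would play the same pitch class several octaves down.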
The reactions and results were great, and they provided me plenty of feedback for taking the next steps with the interface and the software portion of the project. In general, there was a wide array of approaches to creating musical pieces. Some were methodical, others used the space between the platform and the camera to manipulate the sounds, and others just went crazy.
Here is a video of some of the folks testing this prototype:
Thanks to Ivy, Greg Climber (and his friend), Katherine Moriwaki, Jonah Brucker-Cohen, Matt Ruby, Steven Sclafani, Cara Lynn, Lou Sclafani, Zach Lieberman, George Bixby, Rob Ramirez, Shane Lessa, and Manuel Rueda Iragorri for coming in to test and offering up your valuable feedback. Looking forward to tackling the next set of updates and changes.
The next user testing session will be…
2 W 13th Street, Room 1013 (map)
3/26 (Saturday) 1 – 7pm
3/27 (Sunday) 4 – 10pm
All are welcome, so if you plan on coming, please send me an e-mail so I can make sure you’re all set with security downstairs.
For more information on my thesis project, RhythmSynthesis, you can check here.
Starting this weekend, I am hosting a series of collaborative sound tests at Parsons The New School for Design. The goals of the series are to test how my prototype can be used as an instrument in individual and group configurations, receive feedback and discuss the project with my musical and non-musical peers, and create new musical pieces. Here is the info:
RhythmSynthesis Sound Tests
Where: 2 W 13th Street, Room 1013 (map)
When: On the following weekends…
3/5 (Saturday) 1 – 7pm
3/6 (Sunday) 1 – 10pm
3/26 (Saturday) 1 – 7pm
3/27 (Sunday) 4 – 10pm
I’ll have an assortment of instruments and amplifiers, and I’ll be recording the sessions in their entirety. All are welcome, so if you plan on coming, please send me an e-mail so I can make sure you’re all set with security downstairs.
New musical interfaces are necessary to further explore the complexities of rhythm. RhythmSynthesis proposes a new instrument for composition and performance to continue such exploration. Originating as an investigation into the relationships between rhythm and technology, RhythmSynthesis applies color, shape, and sound to demonstrate how our understanding of visual music, computation, and tangible, audio-visual interactions can be applied as considerations in musical expression.
The goal for this prototype was to introduce updated sound generation and controls in the application. Using additive synthesis code I developed in openFrameworks for the course Audio Visual Systems with Zach Lieberman and Zach Gage, I am analyzing the shape and size of each colored object to determine the sounds that should be made. I also included sliders for changing the rate at which the objects are sensed and played.
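At its core, additive synthesis just sums sine-wave partials, each with its own frequency and amplitude. The sketch below is a bare-bones illustration of that idea, not the prototype's actual openFrameworks code; the partial count and amplitudes would in practice be driven by the shape and size analysis described above.

```cpp
#include <cassert>
#include <cmath>
#include <vector>

const float TWO_PI_F = 6.28318530718f;

// One sine-wave partial in an additive voice.
struct Partial {
    float frequency;  // Hz
    float amplitude;  // linear gain, 0..1
};

// Compute one output sample at time t (seconds) by summing
// every partial's sine wave.
float additiveSample(const std::vector<Partial>& partials, float t) {
    float sum = 0.0f;
    for (const Partial& p : partials) {
        sum += p.amplitude * std::sin(TWO_PI_F * p.frequency * t);
    }
    return sum;
}

// Fill an audio buffer at the given sample rate, e.g. for a
// sound card callback.
void fillBuffer(const std::vector<Partial>& partials,
                float* buffer, int numSamples, float sampleRate) {
    for (int i = 0; i < numSamples; ++i) {
        buffer[i] = additiveSample(partials, i / sampleRate);
    }
}
```

Varying the amplitudes of the upper partials relative to the fundamental is what would make differently shaped pieces produce audibly different timbres at the same pitch.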
Here is a brief demonstration:
You can see other prototypes and relevant write-ups here.