Camposer Project
by Tijmen Klein and Wouter van Ackooy
Our initial idea for this project was to create a natural, fluid way to interact with a software synthesizer on a computer. The user should not be restricted to a MIDI keyboard and its controller knobs for playing notes and controlling effect parameters such as delay time, delay feedback, or filter cutoff. Additionally, we wanted to create an interface that would serve as a visualization for live shows and at the same time as feedback for the user, so that every motion would be visible on screen.

To realize this concept, we envisioned a setup using Camspace, Propellerhead Reason, and a webcam. Camspace is a third-party program that provides a solid algorithm for tracking objects in front of the webcam; its main downside is that it has to run alongside our own program to make use of its full potential. Its great strength, however, is its ability to track any object you can hold, as long as it is somewhat distinguishable from its surroundings. Next to this we would run Propellerhead Reason, a music composition program that provides a synthesizer and several sound effects. We chose Reason because we did not intend to build an entire synthesizer ourselves, but only to provide an interface that the user can operate in a natural way; eventually it should work with any program that accepts MIDI input. For the visualization interface mentioned earlier we decided on OpenGL, which in the end turned out to be one of the major bottlenecks of this project.
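As a rough illustration of the kind of mapping we had in mind, a tracked object's horizontal position could select a note while its vertical position drives a controller value such as filter cutoff. The sketch below is hypothetical: the TrackedObject structure and the normalized coordinates are assumptions for illustration and are not taken from the Camspace API.

    // Hypothetical sketch: map a tracked object's position to a MIDI note and a
    // controller value. The coordinates are assumed to be normalized to the
    // camera frame; none of these names come from the (undocumented) Camspace API.
    #include <algorithm>
    #include <cstdint>

    struct TrackedObject {        // hypothetical: position reported by the tracker
        float x;                  // 0.0 (left) .. 1.0 (right) of the camera frame
        float y;                  // 0.0 (top)  .. 1.0 (bottom)
    };

    // Map the horizontal position onto one octave of a C major scale.
    std::uint8_t positionToNote(const TrackedObject& obj) {
        static const std::uint8_t scale[] = {60, 62, 64, 65, 67, 69, 71, 72}; // C4..C5
        int idx = static_cast<int>(obj.x * 8.0f);
        idx = std::clamp(idx, 0, 7);
        return scale[idx];
    }

    // Map the vertical position onto a 0..127 controller value (e.g. filter cutoff).
    std::uint8_t positionToController(const TrackedObject& obj) {
        int value = static_cast<int>((1.0f - obj.y) * 127.0f); // higher hand = higher value
        return static_cast<std::uint8_t>(std::clamp(value, 0, 127));
    }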
While exploring what we could do with this idea, we encountered several issues. First, we were not able to route MIDI directly from our program to Reason; this turned out to be rather difficult on Windows without writing our own virtual MIDI keyboard driver. Second, we were not able to create a nice-looking graphical interface to give the user feedback. Finally, we found no way to hide the Camspace program behind our own. When the program starts, it first shows a separate window (the Camspace program) in which you lock onto the objects you want to use; our program then routes the MIDI to the standard Windows VST (which converts MIDI notes into sound). This is a limitation of Camspace: even with the functionality of its API we were unable to solve it, partly because of missing documentation.
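For the MIDI routing itself, the following is a minimal sketch of how a note can be sent to the default Windows MIDI output device using the winmm API. Whether the project used exactly this route is not stated above, so treat it as an illustration; it assumes the default device is the built-in software synthesizer, and error handling is kept to a minimum.

    // Minimal sketch: send a note-on / note-off pair to the default Windows MIDI
    // output device using the winmm API. Link against winmm.lib.
    #include <windows.h>
    #include <mmsystem.h>

    int main() {
        HMIDIOUT midiOut = nullptr;
        // MIDI_MAPPER selects the default MIDI output device configured in Windows.
        if (midiOutOpen(&midiOut, MIDI_MAPPER, 0, 0, CALLBACK_NULL) != MMSYSERR_NOERROR)
            return 1;

        // A short MIDI message is packed into a DWORD: status | data1 << 8 | data2 << 16.
        const DWORD noteOn  = 0x90 | (60 << 8) | (100 << 16); // channel 1, middle C, velocity 100
        const DWORD noteOff = 0x80 | (60 << 8) | (0 << 16);   // matching note-off

        midiOutShortMsg(midiOut, noteOn);
        Sleep(500);                        // hold the note for half a second
        midiOutShortMsg(midiOut, noteOff);

        midiOutClose(midiOut);
        return 0;
    }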
In the end we were not able to use Reason, had to omit our visualization interface, and still had to run an instance of the Camspace program alongside our own. We now use the Windows VST for MIDI processing, but it cannot apply any effects to the sound.