Touch screens have revolutionized the way that we interact with technology, and yet that interaction is generally limited to the confines of the device and whatever set of controls the app at hand provides.
What if this restriction no longer applied? What if one could draw a control panel on any surface—a wall, a napkin, the back of one’s hand—and immediately interact with it to control a piece of software?
SketchSynth is a step in this direction. Developed by Billy Keyes as a final project for his Interactive Art and Computational Design class at CMU, it allows anyone to create their own control panels with just a marker and a piece of paper. Once drawn, the controller sends Open Sound Control (OSC) messages to anything that can receive them; in this case, a simple synthesizer built in the visual programming language Pure Data.
Any number of sliders, buttons and toggle switches can be drawn, which are then illuminated by a projector. Once highlighted, the controls become interactive, with every motion picked up by a webcam and instantly transmitted to the synthesizer.
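For the curious, each gesture ultimately becomes an OSC message sent over the network. This is not Keyes's actual code, just a rough sketch of what transmitting a single slider value looks like at the protocol level, using only the Python standard library; the address `/slider/1` and port 9000 are made-up placeholders:

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a one-float OSC message: padded address, ",f" type tag, big-endian float32."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# Fire a slider position at a synth listening on localhost:9000 (hypothetical port)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/slider/1", 0.5), ("127.0.0.1", 9000))
```

A Pure Data patch on the receiving end would pick this up with a `[netreceive]`/OSC object and route the value to the synthesis parameters.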
Check out SketchSynth in action below:
Learn more about the project at Billy Keyes’ SketchSynth project page.