Parsing the stream of data from the Leap Motion proved no easy task for me, even though I used Golan’s Leap Visualizer, based on Theo’s ofxLeapMotion addon for openFrameworks. Once I got the data parsed and converted into OSC messages, it was time to route them in Pure Data and map them to the appropriate synthesis parameters. Here I present a few exploratory interactions.
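The OSC hop from parser to Pure Data can be sketched in a few lines of Python. This is a hedged illustration, not the code I used: the /hand/x address and port 9000 are placeholders, and it assumes pd is listening with something like [netreceive -u -b 9000] feeding [oscparse].

```python
# Pack one hand parameter into a raw OSC message and fire it at pd over UDP.
import socket
import struct

def osc_message(address, value):
    """Encode a single-float OSC message: padded address, ",f" type tag, big-endian float."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

msg = osc_message("/hand/x", 0.5)  # e.g. normalized palm x-position
socket.socket(socket.AF_INET, socket.SOCK_DGRAM).sendto(msg, ("127.0.0.1", 9000))
```

In pd, [oscparse] then turns the datagram back into a list you can [route hand] and scale onto synthesis parameters.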
I’m completely aware that these sounds aren’t pleasant. Since this is a long-term project, I’m now in the phase of finding which gestures I like for controlling which style of synthesis, and this is a slow process. Still, these initial explorations provide good insight into the controller’s capabilities and its flaws.
Sometimes the Leap gets confused and decides that your right hand is actually your left, and it can only see from a specific angle, so any gesture where your hands cross will probably confuse it, big time. This is called occlusion.
I fully intend to continue exploring this device for future work, and I’m considering using it for my Master’s Thesis project, yet to be revealed ;).
This project was supported in part by funding from the Carnegie Mellon University Frank-Ratchye Fund for Art @ the Frontier.
I also include an annotated portfolio of research I’ve done for Aisling Kelliher’s Interaction Design Seminar on hand-gesture-based systems for controlling sound synthesis.
I found out that the book is now in the public domain, so it’s free to download:
507 Mechanical Movements PDF
And here’s the site that has animated some of the movements.
Searching for an astronomy-related API, I found this neat site called “Where is the ISS at?”, and they have an API. Since there was no sample code, I modified the kimonolabs Python sample code.
import json, urllib.request  # Python 3; the original Python 2 code used urllib.urlopen
results = json.load(urllib.request.urlopen("https://api.wheretheiss.at/v1/satellites/25544"))
Then you get back something like this:
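Pulling fields out of the response is plain dictionary access. A hedged sketch: the field names follow the wheretheiss.at docs (the real response carries more fields, like velocity and visibility), and the values below are invented for illustration.

```python
# Parse a (made-up) sample of the satellite-position response.
import json

sample = '{"name": "iss", "latitude": 47.6, "longitude": -122.3, "altitude": 420.0}'
position = json.loads(sample)

# altitude is reported in kilometers by default
print("ISS over lat %.1f, lon %.1f at %.0f km" % (
    position["latitude"], position["longitude"], position["altitude"]))
```

With the live request, `results` takes the place of `position` and updates every time you poll the endpoint.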
I made a quick tutorial on how to use OSC to communicate between two devices running pd, using the [pduino] object so they can control each other’s LEDs and solenoids. Yay!
One of my main interests is working with interfaces for sound synthesis. Over the years I’ve been experimenting with a few different techniques to see how different interfaces inspire different styles of performance and how the interface affects the sound produced. Without having a clear goal, I’ve delved into circuit bending/hardware hacking, computer vision, touch screens, and web/text-based systems. Here are some of my findings:
Elvis box – Circuit Bent DS-1 Distortion
Circuit bending provides a great introduction to incorrect electronics: the idea is that you use wires to randomly create connections within existing circuits (using only battery power, for safety) and explore the effect these connections have on the sound (or visuals). I think this wires the brain in a great way, because you expect failure; instead of total control you have only curiosity and luck. This got me thinking about how I was going to control these sounds. Why had I decided to use buttons, patch cables, and knobs?
Dizzy is a friendly-looking teddy bear who actually watches your every move and makes everything it sees public.
Making use of the Raspberry Pi’s GPIO, we hooked up a PIR sensor to trigger a webcam capture event and publish the image to Tumblr.
By far the most challenging part of this project was working with the API. Since it’s a lot easier to find examples of Raspberry Pi projects written in Python than in C++, we decided to use Python. For example, the Tumblr API page has example code for Python but not C++, and Adafruit has a great Python tutorial for hooking up a PIR sensor to a Pi’s GPIO.
pytumblr is Tumblr’s official API client for Python, but the instructions are unclear.
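For what it’s worth, here’s the shape our posting step took. This is a hedged sketch: the blog name, tags, and caption prefix are placeholders, and the pytumblr calls (shown in comments, since they need OAuth keys and a network) follow pytumblr’s README.

```python
# Build the keyword arguments for pytumblr's create_photo call.
import time

def build_post_kwargs(image_path, caption_prefix="Dizzy saw something at "):
    """Assemble a timestamped photo post for a captured webcam frame."""
    stamp = time.strftime("%Y-%m-%d %H:%M:%S")
    return {
        "state": "published",
        "tags": ["raspberrypi", "dizzy"],       # placeholder tags
        "caption": caption_prefix + stamp,
        "data": image_path,                     # local path to the capture
    }

# On the Pi (requires `pip install pytumblr` and the four OAuth values
# from Tumblr's app registration page):
#
#   import pytumblr
#   client = pytumblr.TumblrRestClient(consumer_key, consumer_secret,
#                                      oauth_token, oauth_secret)
#   client.create_photo("your-blog-name", **build_post_kwargs("/tmp/capture.jpg"))
```

Getting the four OAuth credentials is the genuinely confusing part; the client call itself is one line once you have them.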