Update: Project, Part 2

Final Project,Technique — priyaganadas @ 4:17 am

I have been working in Max since last week. I have decided to use a Kinect to map the depth and movement of a person. When the person's location intersects the 3D model of the dataset, sound will be generated.
Below is a demo of the test setup with Kinect, Max, and Ableton working together. The patch generates sound based on hand motion and distance (depth) from the sensor.
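The depth-to-sound mapping in the patch can be sketched in code. This is a minimal Python illustration, not the actual Max patch: the depth range, note range, and the idea of mapping nearer hands to higher pitches are all assumptions for demonstration.

```python
def depth_to_pitch(depth_mm, near=500, far=4000, low_note=36, high_note=84):
    """Map a depth reading in millimetres to a MIDI note number.

    Hypothetical sketch: the Kinect's usable range (near/far) and the
    MIDI note span are assumed values, not taken from the patch.
    A closer hand produces a higher pitch.
    """
    # Clamp the reading to the sensor's assumed usable range.
    depth_mm = max(near, min(far, depth_mm))
    # Normalise to 0.0 (near) .. 1.0 (far).
    t = (depth_mm - near) / (far - near)
    # Invert so that near = high pitch, far = low pitch.
    return round(low_note + (1 - t) * (high_note - low_note))
```

In Max this kind of linear mapping is typically done with a scale object before sending MIDI to Ableton; the Python version just makes the arithmetic explicit.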

The next step is to import a 3D model of the dataset into the patch, so that the interaction between the person and the dataset produces music.
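The planned interaction (person meets dataset, sound triggers) amounts to a proximity test between the tracked hand and points of the 3D model. A rough Python sketch of that test, with made-up coordinates and a hypothetical trigger radius:

```python
import math

def nearest_hit(hand, data_points, radius=0.1):
    """Return the first dataset point within `radius` of the hand, else None.

    Hypothetical sketch: coordinates are assumed to be in metres in the
    Kinect's frame, and `radius` is an arbitrary trigger distance.
    In the real patch this check would drive a sound event.
    """
    for point in data_points:
        if math.dist(hand, point) <= radius:
            return point  # close enough: this point would trigger sound
    return None
```

Each dataset point could then be assigned its own pitch or sample, so moving through the model plays different sounds.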



This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2021 Making Things Interactive | powered by WordPress with Barecity