“Contact” by Felix Faire (2014)

Contact by Felix Faire uses contact microphones to gather touch information from the wrist, fingers, and fingernails on a hard surface and translates it into melodic notes and sounds; the audio is recorded and played back through a custom-built loop pedal.
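The idea of turning surface taps into scale notes can be sketched in a few lines. This is a hypothetical illustration, not Faire's actual implementation: it windows a mono audio buffer from a contact mic, detects taps as rising edges in frame energy, and maps louder taps to higher notes of a pentatonic scale (the scale choice and threshold are assumptions).

```python
# Hypothetical sketch: map tap loudness from a contact-mic buffer to scale notes.
# Window the signal, find energy peaks above a threshold, and pick a pitch
# from a pentatonic scale based on how hard the surface was struck.

PENTATONIC = [60, 62, 64, 67, 69]  # MIDI note numbers (C major pentatonic)

def taps_to_notes(samples, window=64, threshold=0.2):
    """Return a list of MIDI notes, one per detected tap."""
    notes = []
    prev_loud = False
    for i in range(0, len(samples) - window, window):
        frame = samples[i:i + window]
        energy = sum(s * s for s in frame) / window
        loud = energy > threshold
        if loud and not prev_loud:  # rising edge = a new tap
            # Louder taps select higher notes (clamped to the scale length).
            idx = min(int(energy * len(PENTATONIC)), len(PENTATONIC) - 1)
            notes.append(PENTATONIC[idx])
        prev_loud = loud
    return notes
```

Feeding each detected note into a looper buffer would approximate the record-and-playback half of the piece.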


“Skinput” by Chris Harrison, et al. HCI Institute @ CMU and Microsoft Research, 2010

Precedent Analysis — priyaganadas @ 9:47 pm

Skinput uses biosensing to create a user interface in which the skin itself acts as the interface. The user wears an armband containing acoustic sensors that detect tapping. The device uses an array of sensors to spatially differentiate the location of each tap, and the software is designed to minimise errors and clearly distinguish individual finger movements.
This project uses an array of vibration sensors.


The idea of using skin as a user interface is novel. The project makes me realise that acoustic sensing need not be limited to the human hearing range; microphones can be used to receive signals from any frequency range.
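The spatial differentiation described above can be sketched as a template-matching problem. This is only a simplified stand-in: the real Skinput system trains a classifier (an SVM) over richer acoustic features, while here each tap is reduced to a peak-amplitude reading per sensor and matched to the nearest per-location template; the sensor counts, template values, and location names are all illustrative assumptions.

```python
# Hypothetical sketch of spatially differentiating taps, as in Skinput:
# each armband sensor reports a peak amplitude, and the tap location is
# classified by finding the closest stored amplitude profile
# (nearest-centroid; the real system uses a trained classifier).

def classify_tap(reading, templates):
    """Return the location whose template profile is closest (squared Euclidean)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda loc: dist(reading, templates[loc]))

# Templates: average sensor amplitudes recorded while tapping each finger.
templates = {
    "thumb": [0.9, 0.4, 0.1],
    "index": [0.5, 0.8, 0.3],
    "pinky": [0.1, 0.3, 0.9],
}

print(classify_tap([0.85, 0.45, 0.15], templates))  # prints "thumb"
```

Collecting the templates is the calibration step the armband would perform per user, since acoustic propagation through the arm varies from person to person.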

“Mogees” by Bruno Zamborlin, et al. (2012)

This fourth precedent is being posted because Ishin-den-shin was already used, and I wanted to be sure to have three new ones. This technology can turn almost anything into an instrument by picking up on how the object is being touched. Like Ishin-den-shin, this project has something seemingly low-tech do something unexpected, and it truly immerses users in the experience by involving them directly.



“Ishin-den-shin” by Yuri Suzuki, et al. Disney, Pittsburgh (2013)

This project is an incredible manifestation of the technological capabilities we have reached today. The ability to use your own body to contain a hidden message and, by a simple touch, deliver it to another takes interactivity to another level of true immersion in the activity with the technology. The whole system seems very low-tech, but of course that is because the actual technology at work is unseen, adding a layer of complexity that goes completely unnoticed, except for when you touch someone’s ear!




“Acoustic Barcodes” by Chris Harrison, et al. Presented @ Cambridge (2012)

The thoughtfulness of this project is appealing as an object identification device. Though it is difficult to consider a realistic application for this, it is intriguing to imagine a world physically marked with identification that you can hear, feel, and see. The requirement of a haptic interaction with an object in order to get feedback from the audio effect is the true function that entices me.

Interact with one thing, produce a resulting sound from it, and then that sound is heard. The indirect relationship between the “input” object and the direct input to the program is where I find inspiration.
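That indirect input chain can be made concrete with a toy decoder. This is a hypothetical simplification of the Acoustic Barcodes idea: swiping a fingernail across a notched tag produces a series of clicks, and data is read from the spacing between them. Here each inter-click gap is classified as a 0 (short) or 1 (long) relative to the median gap; the actual paper uses a more robust encoding and signal processing, so treat the whole scheme as an assumption for illustration.

```python
# Hypothetical sketch of the Acoustic Barcodes idea: a swipe across a notched
# tag yields a sequence of click timestamps, and an ID is recovered from the
# gaps between clicks (short gap = 0, long gap = 1, split at the median gap).

def decode_clicks(click_times):
    """Turn a list of click timestamps into a list of bits."""
    gaps = [b - a for a, b in zip(click_times, click_times[1:])]
    median = sorted(gaps)[len(gaps) // 2]
    return [1 if g > median else 0 for g in gaps]
```

Using relative gap lengths rather than absolute ones is what would let the same tag decode correctly at different swipe speeds.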


“A Drifting Up” by Syed Reza Ali (2009)

A Drifting Up is a project I was shown over the summer: a flocking particle system that responds to sound. Its response creates particle sub-systems within the larger whole. It is a complicated system that yields an interesting visual effect, but taking something like this a step further and somehow manifesting it physically could be even more intriguing.


“The Visual Microphone” by Abe Davis, et al. @ MIT (2014)

Precedent Analysis — John Mars @ 4:21 am

Extracting sound information from the minute vibrations of objects, picked up by high-speed video.
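The real Visual Microphone recovers audio from tiny local phase variations in a complex steerable pyramid decomposition of the video. As a hugely simplified, hypothetical stand-in for the idea of "frames in, waveform out," the sketch below treats the mean brightness of each high-speed frame as one audio sample and subtracts the DC offset, leaving only the fluctuating (vibration-driven) component.

```python
# Crude, hypothetical stand-in for the Visual Microphone pipeline:
# collapse each video frame (a 2-D grid of pixel intensities) to its mean
# brightness, then remove the DC offset to expose the vibration signal.
# The actual method analyses local phase changes, not global brightness.

def frames_to_signal(frames):
    """frames: list of 2-D lists of pixel intensities -> zero-mean 1-D signal."""
    means = [sum(sum(row) for row in f) / (len(f) * len(f[0])) for f in frames]
    dc = sum(means) / len(means)
    return [m - dc for m in means]
```

At a few thousand frames per second, even this naive reduction would yield a signal sampled fast enough to cover the speech band, which is what makes high-speed capture essential to the technique.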


“RjDj” by Reality Jockey Ltd. (story told by Roman Mars @ 99% Invisible) (2010 – Present)

Precedent Analysis — John Mars @ 4:08 am

Context-aware, reactive, augmented reality music platform being embedded into a series of apps.


“Ishin-Den-Shin” by Olivier Bau, Ivan Poupyrev, and Yuri Suzuki @ Disney Research (2013)

Precedent Analysis — John Mars @ 3:48 am

Using the human body as a speaker, Ishin-Den-Shin converts and amplifies sound spoken into a microphone and passes it to the listener through a touch of the finger.


This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported License.
(c) 2020 Making Things Interactive | powered by WordPress with Barecity