“Skinput” by Chris Harrison et al., HCI Institute @ CMU and Microsoft Research, 2010

Precedent Analysis — priyaganadas @ 9:47 pm

Skinput uses bio-acoustic sensing to create a user interface in which the skin itself acts as the input surface. The user wears an armband containing an array of acoustic (vibration) sensors that pick up the signals produced when a finger taps the arm. Because the sensors are arranged in arrays, the system can spatially differentiate the location of each tap, and the software is designed to minimise errors and reliably distinguish individual finger taps.
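As a rough illustration of the idea (not Skinput's actual pipeline, which trains a machine-learning classifier on richer acoustic features), tap localisation from a multi-sensor armband can be sketched as: extract simple per-channel features (peak amplitude, energy) from each windowed tap recording, then label new taps by the nearest class centroid. All names and the feature choices below are hypothetical:

```python
import numpy as np

def extract_features(channels):
    """Per-channel features from a windowed tap recording:
    peak amplitude and total energy for each sensor."""
    feats = []
    for sig in channels:
        sig = np.asarray(sig, dtype=float)
        feats.append(np.max(np.abs(sig)))  # peak amplitude
        feats.append(np.sum(sig ** 2))     # signal energy
    return np.array(feats)

class NearestCentroidTapClassifier:
    """Toy classifier: average the feature vectors recorded for each
    tap location, then label a new tap by its nearest centroid."""
    def __init__(self):
        self.centroids = {}

    def fit(self, examples):
        # examples: {location_label: [multi-channel recordings]}
        for label, recs in examples.items():
            feats = np.stack([extract_features(r) for r in recs])
            self.centroids[label] = feats.mean(axis=0)

    def predict(self, channels):
        f = extract_features(channels)
        return min(self.centroids,
                   key=lambda lbl: np.linalg.norm(f - self.centroids[lbl]))
```

The key intuition this captures is the one from the project: sensors closer to the tap see stronger vibrations, so the pattern of amplitudes across the array encodes location.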


The idea of using skin as a user interface is novel. The project makes me realise that acoustic sensing need not be limited to the human hearing range; sensors can be used to pick up signals across a much wider frequency range.

