“Skinput” by Chris Harrison et al., HCI Institute @ CMU and Microsoft Research, 2010
Skinput uses bio-acoustic sensing to create a user interface in which the skin itself acts as the input surface. The user wears an armband containing acoustic sensors that detect finger taps on the arm. The device uses an array of vibration sensors to spatially differentiate the location of each tap, and software is built on top to minimise errors and reliably identify individual finger taps.
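To make the location-classification step concrete, here is a minimal, hypothetical Python sketch of the idea: multi-channel vibration signals are reduced to simple per-channel features, which a classifier maps to a tap location. The sensor count, feature choice, synthetic data, and use of an SVM here are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the Skinput code): multi-channel vibration signals ->
# per-channel features -> classifier that predicts which location was tapped.
import numpy as np
from sklearn.svm import SVC

RNG = np.random.default_rng(0)
N_SENSORS = 10      # hypothetical number of armband sensor channels
N_LOCATIONS = 5     # hypothetical number of tap locations (e.g., fingertips)

def synthetic_tap(location):
    """Fake a tap: each location attenuates differently across sensor channels."""
    attenuation = np.exp(-0.3 * np.abs(np.arange(N_SENSORS) - 2 * location))
    return attenuation[:, None] * RNG.normal(0.0, 1.0, (N_SENSORS, 256))

def features(signal):
    """Simple per-channel features: mean absolute amplitude and mean energy."""
    return np.concatenate([np.mean(np.abs(signal), axis=1),
                           np.mean(signal ** 2, axis=1)])

# Build a small training set of simulated taps for each location.
X, y = [], []
for loc in range(N_LOCATIONS):
    for _ in range(40):
        X.append(features(synthetic_tap(loc)))
        y.append(loc)

clf = SVC(kernel="rbf").fit(np.array(X), np.array(y))

# Classify a new (simulated) tap.
print("predicted location:", clf.predict([features(synthetic_tap(3))])[0])
```

The key design point the sketch illustrates is that no single sensor has to localise the tap; the pattern of signal strength across the whole array is what makes locations separable.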
The idea of using the skin as a user interface is novel. The project made me realise that acoustic sensing need not be limited to the human hearing range; sensors can be used to pick up signals across a much wider frequency range.