The Sound of Touch: On-body Touch and Gesture Sensing Based on Transdermal Ultrasound Propagation

Recent work has shown that the human body can serve as a rich interaction platform. We propose a novel on-body touch and gesture sensing technology based on active ultrasound signal transmission. It provides rich contextual information about on-body touches and gestures, such as: 1) continuous indication of touch or gesture presence, 2) continuous localization of touch (potentially supporting slider-like interaction), 3) pressure-sensitive touch (potentially supporting a touch-and-click event), and 4) an arm-grasping hand gesture.


We leverage the phenomenon of transdermal ultrasound propagation in a wearable prototype that emits ultrasound signals at one location on the body and measures the signal at another.


The measured signal varies with the measurement location, forming distinctive propagation profiles that can be used to infer on-body touch locations and on-body gestures.
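The inference step can be sketched as simple template matching: record one propagation profile per touch location during a calibration phase, then label each live measurement with the nearest stored profile. The feature choice (per-band RMS amplitude) and the nearest-profile matching below are illustrative assumptions, not the paper's exact recognition pipeline.

```python
import math

def rms(samples):
    """Root-mean-square amplitude of one received ultrasound frame."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def profile(frames):
    """Feature vector: one RMS amplitude per transmitted frequency band."""
    return [rms(frame) for frame in frames]

def nearest_profile(measured, templates):
    """Label a measured profile with the closest calibrated template.

    `templates` maps a location label (e.g. "wrist") to its profile,
    recorded while the user touched that location during calibration.
    """
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(templates, key=lambda label: sq_dist(measured, templates[label]))
```

For example, with calibrated templates `{"wrist": [1.0, 0.2], "elbow": [0.3, 0.9]}`, a measurement of `[0.9, 0.3]` would be labeled `"wrist"`.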


We developed a proof-of-concept sliding menu application, which also served as the experiment platform in our interactive study. In this study, we investigated event mapping for continuous and pressure-aware touch gestures. In the future, the sensor unit (transmitter and receiver transducers) and the display contents could be embedded into wearable devices, enabling a truly mobile experience.
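The pressure-aware event mapping can be illustrated with a small threshold-based state machine that turns a continuous, normalized signal level into discrete touch-down, click, and touch-up events. The threshold values and event names here are hypothetical placeholders, not the calibrated values used in the study.

```python
class TouchEventMapper:
    """Maps a continuous signal level (0..1) to discrete touch events.

    TOUCH marks light skin contact; PRESS marks the stronger signal
    change used for a touch-and-click event. Both values are assumed
    and would be calibrated per user in practice.
    """
    TOUCH = 0.4
    PRESS = 0.8

    def __init__(self):
        self.touching = False
        self.pressing = False

    def update(self, level):
        """Process one sensor reading; return the events it triggers."""
        events = []
        if not self.touching and level >= self.TOUCH:
            self.touching = True
            events.append("touch_down")
        if self.touching and not self.pressing and level >= self.PRESS:
            self.pressing = True
            events.append("click")
        if self.pressing and level < self.PRESS:
            self.pressing = False          # press released, still touching
        if self.touching and level < self.TOUCH:
            self.touching = False
            events.append("touch_up")
        return events
```

Feeding the mapper a rising-then-falling level sequence such as 0.5, 0.9, 0.5, 0.1 yields touch_down, click, and touch_up in turn, which is the event stream a sliding menu with a touch-and-click selection would consume.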


We envision two exemplary application domains in which our approach can be immediately useful: 1) supporting interaction modalities for visually impaired users, who can accurately touch their own body parts using proprioception (illustrated by figure (1)), and 2) supporting users with limited access to an input device (illustrated by (2) and (3)), e.g. when driving, biking, or running. Other interactive applications uniquely enabled by this technique include: a) a wrist-mounted mobile phone that embeds the sensing unit and appropriates the forearm as an input surface, b) headphones that simultaneously play music and transmit ultrasound signals through the skull, letting the wearer touch different parts of the head, neck, or face to control the music, and c) multiple sensors placed on various body parts to sense whole-body gestures.



Adiyan Mujibiya (primary contact) – Research Scientist, Rakuten Institute of Technology & Rekimoto-Lab, The University of Tokyo
Xiang Cao – Lenovo Research & Technology Beijing, China
Desney S. Tan – Microsoft Research Redmond, USA
Dan Morris – Microsoft Research Redmond, USA
Shwetak N. Patel – Microsoft Research Redmond, USA & University of Washington, Seattle, USA
Jun Rekimoto – The University of Tokyo & Sony CSL


Adiyan Mujibiya, Xiang Cao, Desney S. Tan, Dan Morris, Shwetak N. Patel, and Jun Rekimoto. 2013. The sound of touch: on-body touch and gesture sensing based on transdermal ultrasound propagation. In Proceedings of the 2013 ACM international conference on Interactive tabletops and surfaces (ITS ’13). ACM, New York, NY, USA, 189-198. DOI=10.1145/2512349.2512821

@inproceedings{Mujibiya2013SoundOfTouch,
 author = {Mujibiya, Adiyan and Cao, Xiang and Tan, Desney S. and Morris, Dan and Patel, Shwetak N. and Rekimoto, Jun},
 title = {The Sound of Touch: On-body Touch and Gesture Sensing Based on Transdermal Ultrasound Propagation},
 booktitle = {Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces},
 series = {ITS '13},
 year = {2013},
 isbn = {978-1-4503-2271-3},
 location = {St. Andrews, Scotland, United Kingdom},
 pages = {189--198},
 numpages = {10},
 url = {},
 doi = {10.1145/2512349.2512821},
 acmid = {2512821},
 publisher = {ACM},
 address = {New York, NY, USA},
 keywords = {gestures, on-body sensing, skin, ultrasound propagation},
}