“At present, finger input on touch screens is handled very simplistically – essentially boiled down to an X/Y coordinate. However, human fingers are remarkably sophisticated, both in their anatomy and motor capabilities. TapSense is an enhancement to touch interaction that allows conventional screens to identify how the finger is being used for input. This is achieved by segmenting and classifying sounds resulting from a finger’s impact. Our system can recognize different finger locations – including the tip, pad, nail and knuckle – without the user having to wear any electronics. This opens several new and powerful interaction opportunities for touch input, especially in mobile devices, where input bandwidth is limited due to small screens and fat fingers. For example, a knuckle tap could serve as a ‘right click’ for mobile device touch interaction, effectively doubling input bandwidth. Our system can also be used to identify different sets of passive tools. We conclude with a comprehensive investigation of classification accuracy and training implications. Results show our proof-of-concept system can support sets with four input types at around 95% accuracy. Small, but useful input sets of two (e.g., pen and finger discrimination) can operate in excess of 99% accuracy.”
(Chris Harrison, via dansaffer)
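The abstract above describes a recognizable pipeline: segment the sound of a finger's impact, extract acoustic features, and classify the tap type. The sketch below is a minimal illustration of that idea; the synthetic "tap" generator, the FFT feature set, and the RBF SVM classifier are all assumptions made for demonstration, not details taken from the TapSense paper.

```python
# A minimal sketch of the pipeline the abstract describes: detect a tap onset
# in an audio buffer, extract spectral features from the impact sound, and
# classify the tap type. The synthetic taps, feature set, and SVM are
# illustrative assumptions only; TapSense's actual features and classifier
# are not specified in this excerpt.
import numpy as np
from sklearn.svm import SVC

RATE = 44_100                       # assumed microphone sample rate (Hz)
WIN = int(0.010 * RATE)             # ~10 ms analysis window around the impact

def segment_tap(buf, threshold=0.1):
    """Return the WIN-sample slice starting at the first amplitude spike."""
    hits = np.nonzero(np.abs(buf) > threshold)[0]
    if hits.size == 0:
        raise ValueError("no tap detected")
    return buf[hits[0]:hits[0] + WIN]

def features(tap):
    """Log magnitudes of the lowest 64 FFT bins of the windowed impact."""
    spec = np.abs(np.fft.rfft(tap * np.hanning(tap.size)))
    return np.log1p(spec[:64])

def fake_tap(freq, rng):
    """Synthesize a decaying tone as a stand-in impact so this runs anywhere."""
    t = np.arange(WIN) / RATE
    return (np.sin(2 * np.pi * freq * t) * np.exp(-400 * t)
            + 0.01 * rng.standard_normal(WIN))

# Made-up characteristic timbres for the four finger parts.
rng = np.random.default_rng(0)
classes = {"tip": 2000, "pad": 500, "nail": 4000, "knuckle": 150}
X = [features(fake_tap(f, rng)) for f in classes.values() for _ in range(20)]
y = [name for name in classes for _ in range(20)]
clf = SVC(kernel="rbf").fit(X, y)

# Runtime: segment a tap out of a longer buffer and classify it. A "knuckle"
# result could then be routed as a right click, per the abstract's example.
buf = np.concatenate([np.zeros(1000), fake_tap(150, rng), np.zeros(1000)])
print(clf.predict([features(segment_tap(buf))])[0])   # -> knuckle
```

The same structure would extend naturally to the paper's passive-tool sets: each tool gets its own labeled impact recordings, and smaller sets (such as pen versus finger) are easier to separate, consistent with the reported 99% two-class accuracy.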