Swipe the screen with your finger to bring up a menu. Rap it with your knuckle to select an object. Flick with your fingertip to close.
That's the idea behind TapSense, the latest smart interface from Carnegie Mellon University's Chris Harrison. It uses the distinct sounds that different parts of your hand make when they strike a touchscreen to tell those parts apart.
Attaching a microphone to a touchscreen lets the user assign different actions to each part of the hand used to strike the screen: the system can tell a fingernail, a knuckle, a fingertip and the pad of a finger apart. The researchers say it distinguishes between the four types of finger input with 95 percent accuracy, and between a pen and a finger with 99 percent accuracy.
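The paper describes the approach in detail, but the core idea can be sketched in a few lines. The sketch below is purely illustrative and is not the TapSense implementation: it synthesises toy "tap" waveforms as decaying sinusoids, extracts a single acoustic feature (the dominant frequency of the signal's spectrum), and labels a tap by the nearest prototype. The prototype frequencies and the premise that harder contacts (nail, knuckle) ring higher than softer ones (pad) are assumptions for the demo, not measured values from the research.

```python
# Illustrative sketch only -- NOT the TapSense implementation.
# Classifies a toy "tap" sound by the dominant frequency of its spectrum.
import numpy as np

RATE = 44100  # assumed sample rate, Hz


def synth_tap(freq, dur=0.01):
    """Toy tap sound: a decaying sinusoid at a characteristic frequency."""
    t = np.arange(int(RATE * dur)) / RATE
    return np.sin(2 * np.pi * freq * t) * np.exp(-t * 400)


def dominant_freq(signal):
    """Frequency of the largest peak in the magnitude spectrum."""
    mags = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / RATE)
    return float(freqs[np.argmax(mags)])


# Hypothetical prototype frequencies, chosen for illustration only:
# hard contacts assumed to ring higher than soft ones.
PROTOTYPES = {"nail": 3000.0, "knuckle": 2000.0, "tip": 1200.0, "pad": 500.0}


def classify(signal):
    """Label a tap with the prototype whose frequency is closest."""
    f = dominant_freq(signal)
    return min(PROTOTYPES, key=lambda name: abs(PROTOTYPES[name] - f))


print(classify(synth_tap(2900)))  # -> nail
print(classify(synth_tap(500)))   # -> pad
```

A real system would use richer spectral features and a trained classifier rather than a single-feature nearest-prototype rule, but the pipeline shape -- capture, feature extraction, classification -- is the same.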
Harrison will present TapSense today at the User Interface Software and Technology conference in Santa Barbara, California. He says that tablets like Apple's iPad and smartphones could be upgraded to take advantage of this extended capability for around 25 cents per device - just enough to attach a simple microphone to the screen.
The system can also tell the difference between the sounds made by different materials, such as wood, plastic or metal. This would let users with styluses made from different materials work together, with each person's contribution appearing in a different colour on the screen, for example.
