Wednesday, July 2, 2014

SixthSense



SixthSense is a gestural interface device comprising a neck-worn pendant that contains both a data projector and a camera. Head-worn versions were also built at the MIT Media Lab in 1997; these combined cameras and illumination systems for interactive photographic art and included gesture recognition (e.g., finger tracking using colored tape on the fingers).







SixthSense is also a name for extra information supplied by a wearable computer, such as the device called "WuW" (Wear yoUr World) by Pranav Mistry et al. WuW builds on the concept of the Telepointer, a neck-worn projector and camera combination first proposed and reduced to practice by MIT Media Lab student Steve Mann.




Origin of the "Sixth Sense" name
Sixth Sense technology (a camera combined with a light source) was developed in 1997 (head-worn) and 1998 (neck-worn), but the "Sixth Sense" name for this work was first coined and published in 2001. Mann referred to this wearable computing technology as affording a "Synthetic Synesthesia of the Sixth Sense", i.e., the idea that wearable computing and digital information can act as an additional (sixth) sense.[5] Ten years later, Pattie Maes (also with the MIT Media Lab) used the term "Sixth Sense" in the same context, in her TED talk.





Construction and workings
The SixthSense technology contains a pocket projector, a mirror, and a camera housed in a head-mounted, handheld, or pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector displays visual information, enabling surfaces, walls, and physical objects around the user to act as interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision techniques.

The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) worn on the tips of the user's fingers. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. SixthSense supports multi-touch and multi-user interaction.
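The core of the fiducial tracking described above is simple color segmentation: find the pixels matching a marker's color and take their centroid as the fingertip position. The following is a minimal sketch of that idea, not SixthSense's actual code; the function name, tolerance value, and toy frame are all illustrative assumptions.

```python
import numpy as np

def find_marker(frame, marker_rgb, tol=30):
    """Locate one colored fiducial in an RGB frame.

    Returns the (row, col) centroid of all pixels whose color lies
    within `tol` of `marker_rgb` on every channel, or None if no
    pixel matches. A real system would run this per marker color,
    per video frame, and feed the positions to a gesture recognizer.
    """
    frame = np.asarray(frame, dtype=np.int16)  # avoid uint8 wraparound
    mask = np.all(np.abs(frame - np.array(marker_rgb)) <= tol, axis=-1)
    ys, xs = np.nonzero(mask)
    if len(ys) == 0:
        return None
    return (ys.mean(), xs.mean())

# Toy 100x100 frame with a red "tape" patch centered at (20, 70).
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[18:23, 68:73] = (255, 0, 0)
print(find_marker(frame, (255, 0, 0)))  # → (20.0, 70.0)
```

In practice a robust tracker would threshold in HSV space rather than RGB (so the match survives lighting changes) and smooth the centroid over time, but the pipeline is the same: segment by color, localize, then interpret the marker trajectories as gestures.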


More on this in my next part.




