SixthSense is a wearable gestural interface device that augments the physical world with digital information and lets people use natural hand gestures to interact with that information. It was developed by Pranav Mistry, a PhD student in the Fluid Interfaces Group at the MIT Media Lab.
The SixthSense prototype comprises a pocket projector, a mirror and a camera, coupled in a pendant-like wearable device. Both the projector and the camera are connected to a mobile computing device in the user's pocket. The projector casts visual information onto surfaces, walls and physical objects around the wearer, turning them into interfaces, while the camera recognizes and tracks the user's hand gestures and physical objects using computer-vision techniques. The software processes the video stream captured by the camera and tracks the locations of colored markers (visual tracking fiducials) on the tips of the user's fingers using simple computer-vision techniques. The movements and arrangements of these fiducials are interpreted as gestures that act as interaction instructions for the projected application interfaces. The maximum number of tracked fingers is constrained only by the number of unique fiducials, so SixthSense also supports multi-touch and multi-user interaction.
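The marker-tracking idea described above can be sketched in a few lines. The following is a minimal illustration, not the actual SixthSense software (which has not been published in this form): it thresholds a frame for a marker's color range, takes the centroid of the matching pixels as the fingertip position, and applies a hypothetical distance rule to turn the arrangement of two fingertips into a gesture. The function names, color ranges, and the pinch threshold are all assumptions made for the example.

```python
import numpy as np

def track_marker(frame, lower, upper):
    """Return the (row, col) centroid of pixels whose RGB values fall
    inside [lower, upper], or None if no pixel matches.
    This stands in for the per-fiducial color tracking step."""
    mask = np.all((frame >= lower) & (frame <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return (ys.mean(), xs.mean())

def classify_gesture(p1, p2, pinch_dist=15.0):
    """Hypothetical rule: two fingertip markers close together -> 'pinch',
    otherwise 'point'. Real systems would use richer spatio-temporal rules."""
    d = ((p1[0] - p2[0]) ** 2 + (p1[1] - p2[1]) ** 2) ** 0.5
    return "pinch" if d < pinch_dist else "point"

# Synthetic 100x100 RGB frame with a red and a green "marker" blob.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[40:50, 60:70] = (200, 30, 30)   # red fiducial
frame[42:52, 20:30] = (30, 200, 30)   # green fiducial

red = track_marker(frame, lower=(150, 0, 0), upper=(255, 80, 80))
green = track_marker(frame, lower=(0, 150, 0), upper=(80, 255, 80))
gesture = classify_gesture(red, green)
```

Because each fiducial has a distinct color range, adding more tracked fingers is just a matter of adding more (range, identity) pairs, which mirrors why the prototype's finger count is bounded only by the number of unique fiducials.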