In many intelligent environments, people are looking for an intuitive, immersive and cost-efficient interaction device instead of conventional mice, keyboards and joysticks. We are developing a vision-based gesture interface prototype system, VisualPanel, which employs an arbitrary quadrangle-shaped panel (e.g., an ordinary piece of paper) and a tip pointer (e.g., a fingertip) as an intuitive, wireless and mobile input device. The system tracks the panel and the tip pointer accurately and reliably. The panel tracking continuously determines the projective mapping between the panel at its current position and the display, which in turn maps the tip position to the corresponding position on the display. By detecting clicking and dragging actions, the system can perform many tasks such as controlling a remote large display and simulating a physical keyboard. Users can naturally use their fingers or other tip pointers to issue commands and type text. Furthermore, by tracking the 3D position and orientation of the visual panel, the system can also provide 3D information, serving as a virtual joystick to control 3D virtual objects. The system, which runs at around 22 Hz on a Pentium III 800 MHz PC, is scalable and extensible. Further potential applications include multi-person interaction.
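The projective mapping mentioned above can be sketched as follows. This is a minimal illustration, not the system's actual implementation: it estimates the 3x3 homography from the four tracked panel corners to the display corners using the standard direct linear transform, then maps a fingertip position through it. The corner coordinates and display size are hypothetical.

```python
import numpy as np

def homography(src, dst):
    """Estimate the 3x3 projective mapping taking four src points to dst points.

    Solves the standard 8x8 direct-linear-transform system with the
    bottom-right element of H fixed to 1.
    """
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = np.linalg.solve(np.array(A, float), np.array(b, float))
    return np.append(h, 1.0).reshape(3, 3)

def map_point(H, pt):
    """Map an image point through the homography into display coordinates."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Hypothetical panel corners as detected in the camera image, mapped
# onto a 1024x768 display; the tip position is then mapped the same way.
panel  = [(120, 90), (520, 110), (500, 400), (100, 380)]
screen = [(0, 0), (1024, 0), (1024, 768), (0, 768)]
H = homography(panel, screen)
cursor = map_point(H, (310, 240))   # fingertip position in the image
```

Because the panel is re-tracked every frame, recomputing H continuously keeps the cursor mapping correct as the user moves or tilts the paper.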
A video of a live demo is available here (85.3 MB). The sound is unedited; a beep signals that a Windows event has been generated. A polished version, with explanation, will be available later.
A much shorter video of a different demo in a different setting is available here (3.3 MB).
"Visual Panel: From an Ordinary Paper to a Wireless and Mobile Input Device". Microsoft Research Technical Report MSR-TR-00-112, 2000. PDF file.
"Visual Panel: Virtual Mouse, Keyboard and 3D Controller with an Ordinary Piece of Paper". Workshop on Perceptive User Interfaces (PUI 2001), Nov. 15-16, 2001, Orlando, Florida. PDF file.