Code Space: Combining Touch, Devices, and Skeletal Tracking to Support Developer Meetings

Andrew Bragdon, Robert DeLine, Ken Hinckley, and Meredith Ringel Morris

Abstract

We present Code Space, a system that contributes touch + air gesture hybrid interactions to support co-located, small group developer meetings by democratizing access, control, and sharing of information across multiple personal devices and public displays. Our system uses a combination of a shared multi-touch screen, mobile touch devices, and Microsoft Kinect sensors. We describe cross-device interactions, which use a combination of in-air pointing for social disclosure of commands, targeting and mode setting, combined with touch for command execution and precise gestures. In a formative study, professional developers were positive about the interaction design, and most felt that pointing with hands or devices and forming hand postures are socially acceptable. Users also felt that the techniques adequately disclosed who was interacting and that existing social protocols would help to dictate most permissions, but also felt that our lightweight permission feature helped presenters manage incoming content.

Details

Publication type: Proceedings
Published in: Proceedings of ITS 2011
Publisher: ACM