On May 17, 2011, Ryerson’s Interactive Computing Applications and Design Group (ICAD) demonstrated their latest projects. The session starts with a demonstration of using Microsoft Kinect hardware to control a computer mouse. Next, the group shows the use of a gestural interface to control Google Earth, followed by a demo of using Kinect to control an avatar in Second Life.
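The core of a Kinect mouse interface is mapping a tracked hand joint onto screen coordinates. As a minimal sketch (not the ICAD code, which isn't shown here), assuming the hand position arrives already normalized to the 0–1 range in each axis:

```python
def hand_to_cursor(hand_x, hand_y, screen_w=1920, screen_h=1080):
    """Map a normalized hand position (0..1 in each axis, origin at
    top-left) to a pixel coordinate, clamped so the cursor never
    leaves the display. The screen size here is just an assumption."""
    x = min(max(hand_x, 0.0), 1.0)
    y = min(max(hand_y, 0.0), 1.0)
    return int(x * (screen_w - 1)), int(y * (screen_h - 1))
```

A real driver would also smooth the raw skeleton data (e.g. a moving average) to keep the cursor from jittering, and translate a second gesture, such as a push or a grip, into a click.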
The session continues with a demonstration of a potential application that controls a small Arduino-based robot over Bluetooth using gestures. Following this, the ICAD staff show the use of Kinect as a tracking and control mechanism for a Pan-Tilt-Zoom (PTZ) camera. This approach allows them to track up to five people without active trackers. The data from the Kinect camera is used to tell the PTZ camera where to “look”. Once a person is identified (by putting up their hand), the Kinect tracks that person around the room and keeps the PTZ camera aimed at them. Raising one’s hand switches the tracked person.
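The two pieces of that demo can be sketched in a few lines: detecting a raised hand from skeleton joints, and turning a tracked person's position into a pan angle for the camera. This is an illustrative sketch only, assuming Kinect skeleton-space coordinates (y grows upward, depth z in metres) and a PTZ camera co-located with the Kinect:

```python
import math

def hand_raised(hand_y, head_y):
    """A hand held above the head signals 'track me' / 'switch to me'.
    Coordinates are assumed to be Kinect skeleton-space, y up."""
    return hand_y > head_y

def pan_angle(person_x, person_z):
    """Convert a person's horizontal offset and depth (both metres,
    relative to the camera) into a pan angle in degrees."""
    return math.degrees(math.atan2(person_x, person_z))
```

A full tracker would loop over all detected skeletons each frame, switch the active target when `hand_raised` fires for someone else, and stream the resulting pan/tilt commands to the PTZ camera over its control protocol.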
Their last demo shows a gesture-based keyboard that will eventually be tied into an interactive phonebook application, where the user can type the name of a contact using gestures and automatically dial the number through a VoIP application (e.g., Google Talk).
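One common way to "type" with gestures is dwell selection: a key is entered when the hand hovers over it long enough. As a minimal sketch under that assumption (the ICAD keyboard's actual selection scheme isn't detailed above):

```python
class DwellKeyboard:
    """Dwell-to-select gesture keyboard sketch: a key is 'typed'
    once the hand hovers over it for dwell_s seconds."""

    def __init__(self, dwell_s=1.0):
        self.dwell_s = dwell_s
        self.current = None      # key currently hovered
        self.hover_start = 0.0   # time the hover began
        self.typed = []

    def update(self, key, t):
        """Feed the hovered key (or None) and the current time t.
        Returns the key just typed, if any."""
        if key != self.current:
            self.current, self.hover_start = key, t
            return None
        if key is not None and t - self.hover_start >= self.dwell_s:
            self.typed.append(key)
            self.current = None
            return key
        return None
```

Feeding the typed characters into a contact lookup and handing the matched number to a VoIP client would complete the phonebook flow described above.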
Individual project videos are below:
1) Kinect Windows Mouse Interface
2) Kinect Google Earth Interface
3) Kinect Second Life Interface
4) Kinect Bluetooth Robot Interface
5) Kinect Tracker-Cam Interface
6) Kinect Interactive Phonebook