Archive for ‘Physical Computing’ Category
On May 17, 2011 Ryerson’s Interactive Computing Applications and Design Group (ICAD) demonstrated their latest projects. The session starts with a demonstration of using Microsoft Kinect hardware to control a computer mouse. Next, the group shows the use of a gestural interface to control Google Earth, followed by a demo of using the Kinect to control an avatar in Second Life.
The session continues with a demonstration of a potential application that controls a small Arduino-based robot over Bluetooth using gestures. Following this, the ICAD staff show the use of the Kinect as a tracking and control mechanism for a Pan-Tilt-Zoom (PTZ) camera. This approach allows them to track up to five people without active trackers. The data from the Kinect camera is used to instruct the PTZ camera where to “look”. Once a person is identified (by putting up their hand), the Kinect tries to track that person around the room and keeps the PTZ camera pointed at them as well. Switching the tracked person is done by raising one’s hand.
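The ICAD group hasn’t published their tracking code, so here is a minimal sketch of how the hand-raise trigger and the Kinect-to-PTZ mapping could work. The coordinate conventions, joint values and angle math below are all assumptions, not their implementation:

```python
import math

def hand_raised(hand_y, head_y):
    """A person 'identifies' themselves when a hand is lifted above the head.
    Coordinates are assumed to be metres in Kinect camera space, y pointing up."""
    return hand_y > head_y

def ptz_angles(x, y, z):
    """Map a tracked head position (x right, y up, z away from the sensor)
    to pan/tilt angles in degrees for the PTZ camera, assuming the PTZ
    camera sits at the Kinect's position."""
    pan = math.degrees(math.atan2(x, z))   # left/right
    tilt = math.degrees(math.atan2(y, z))  # up/down
    return pan, tilt

# Someone 2 m away and 1 m to the right raises a hand above their head:
if hand_raised(hand_y=1.2, head_y=0.9):
    pan, tilt = ptz_angles(1.0, 0.0, 2.0)  # pan right ~26.6 degrees, no tilt
```

A real version would smooth these angles over several frames before sending them to the camera, otherwise skeleton jitter makes the PTZ head twitch.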
Their last demo shows a gesture-based keyboard that will eventually be tied into an interactive phonebook application, where the user can type the name of a contact using gestures and automatically dial the number through a VoIP application (e.g. Google Talk).
Individual project videos below….
1) Kinect Windows Mouse Interface
2) Kinect Google Earth Interface
3) Kinect Second Life Interface
4) Kinect Bluetooth Robot Interface
5) Kinect Tracker-Cam Interface
6) Kinect Interactive Phonebook
Posted on 21:40, January 16th, 2011 by Many Ayromlou
Great little video on how to set up AR marker recognition under QC. It even has nice mellow background music :-).
So what happens when an artist combines a 3D gaming engine, the power of Blender and Processing, and a dash of human-powered mechanical abomination? :-) Vince McKelvie describes his project:
Posted on 11:16, May 6th, 2009 by Many Ayromlou
Great video showing a bizarre and novel way of creating a gesture-based interface. You literally touch nothing… air… and the interface does the rest. Pretty interesting project. According to Justin Schunick of the team at Northeastern University, the interface uses an array of copper electrodes to sense changes in the electric field created by the device. The black material covering the electrodes shows how the interface can be hidden beneath surfaces to create a completely invisible interface; it is simple black felt you can buy at any fabric store. The total cost of this prototype was around $60 USD.
They created custom C++ software to communicate with the microcontroller running the show. This enables the use of the device as a new type of XYZ computer mouse. Think Nintendo Wii controller without the controller, or Minority Report without the gloves. This could potentially be revolutionary as far as HCI goes.
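The team’s actual firmware is C++ and hasn’t been released, but the basic idea of turning electrode readings into an XYZ position can be sketched. Assuming four electrodes at the corners of a square pad, where a stronger reading means the hand is closer to that electrode (the electrode layout and signal model here are my guesses, not theirs):

```python
def xyz_from_electrodes(readings):
    """Estimate a hand's position above a square sensing pad from four
    corner electrode readings ordered (top-left, top-right,
    bottom-left, bottom-right). Returns (x, y, z) or None if no hand
    is detected."""
    tl, tr, bl, br = readings
    total = tl + tr + bl + br
    if total == 0:
        return None            # no signal at all: no hand in range
    x = (tr + br) / total      # 0.0 = far left edge, 1.0 = far right edge
    y = (tl + tr) / total      # 0.0 = bottom edge, 1.0 = top edge
    z = 1.0 / total            # weaker overall signal = hand farther away
    return x, y, z

# A hand centred over the pad gives equal readings on all electrodes:
pos = xyz_from_electrodes((1.0, 1.0, 1.0, 1.0))  # -> (0.5, 0.5, 0.25)
```

In practice the raw electric-field readings are noisy and nonlinear, so a real driver would calibrate per-electrode baselines and low-pass filter the result before moving the cursor.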
Posted on 16:10, March 7th, 2009 by Many Ayromlou
If you ever dreamed about a situation where you could just grab a physical representation of the music you want to listen to, say the CD cover, and by placing it on the table have the music automatically played through your stereo, you might want to check out this video. In it you’ll see how Nic used Arduino plus Parallax RFID reader/tags to make his Squeezebox network music player a bit more physically intuitive/interactive. Looks like a fun weekend project. Source code and more detail available on Nic’s blog.
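Nic’s real source code lives on his blog; as a rough idea of the host-side glue, here is a sketch of mapping an RFID tag ID (as the Arduino would send it over serial) to a Squeezebox CLI command. The tag IDs and playlist paths below are made up for illustration:

```python
# Hypothetical tag-to-playlist table; real tag IDs come from your own cards.
TAG_TO_PLAYLIST = {
    "0415AB23F1": "albums/kind_of_blue.m3u",
    "0415AB240C": "albums/ok_computer.m3u",
}

def command_for_tag(tag_id):
    """Turn an RFID tag ID read from the Arduino's serial port into a
    Squeezebox CLI 'playlist play' command string, or None for an
    unknown card (which should simply be ignored)."""
    playlist = TAG_TO_PLAYLIST.get(tag_id.strip())
    if playlist is None:
        return None
    return f"playlist play {playlist}"

# Placing the Kind of Blue card on the reader:
cmd = command_for_tag("0415AB23F1\r\n")
# -> "playlist play albums/kind_of_blue.m3u"
```

The actual command would then be sent to the Squeezebox server’s command-line interface over a TCP socket; check Nic’s blog for the wiring and the exact protocol he used.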
Posted on 10:51, August 14th, 2008 by Many Ayromlou
The Monash University team Down Under has done it again. Nodal is their free (for personal use) generative music software for Mac OS X. It’s a good-looking application that I will be taking out for a test drive soon. From the website:
For now, Nodal generates MIDI data only. This means you need a hardware/software MIDI synth to hear the good stuff. For simple demonstrations try SimpleSynth. Nodal will also work with Apple’s GarageBand software… So what are you waiting for… Bombs awayyyy…
The people over at 5VoltCore have put together a PD installation that really tests your courage and trust in machines. The installation sets up a feedback loop between the computer, the robot and the user. The user is right to assume that the machine can fail; the machine fails precisely because the user assumes it will.
Let me explain: it all starts with a PD patch controlling a knife held by a robot, which tries to stab the spaces between the user’s fingers. Once the user places his/her hand under the robot, the program takes over and the knife movements slowly speed up. At this point the user will either trust the machine, or they will get nervous and start sweating. The sweating then triggers a series of short circuits inside the computer that cause the knife to move in a more erratic manner. The question is: will the user manage to hold still and not break into a sweat while the machine is doing its thing? Pretty scary stuff…
If this is boring you, it’s not my fault… Here are some more goodies :-)
More to come later….
Posted on 22:37, July 12th, 2007 by Many Ayromlou
Okay, this is more for me since I keep having to dig this stuff up… but if any of it is useful to you, please help yourself:
Okay, enough for now… I will add more stuff as I come across it…