I can’t think of any product that gets “hacked” more often than Microsoft’s Kinect. This device — designed for gaming, remember — has been put to so many creative purposes, it’s a little insane. Here are some examples. I bet a Google search could turn up half a dozen more in the time it takes you to blink.
Ubi Interactive has taken the Kinect and tied it to a projector to create a touchscreen that will work on just about any surface. The projector creates an image and the Kinect measures how you interact with it. Your wall could be your new widescreen monitor.
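A setup like this needs a calibration step that maps what the depth camera sees onto the projected image, and a planar homography is the standard tool for that. The sketch below is my own illustration of the general technique, not Ubi Interactive’s actual code; the pixel values and function names are hypothetical. It estimates a homography from four corner correspondences and uses it to convert a detected touch point into screen coordinates.

```python
import numpy as np

def estimate_homography(cam_pts, scr_pts):
    """Direct Linear Transform: solve for the 3x3 homography H that
    maps camera-space points to screen-space points."""
    A = []
    for (x, y), (u, v) in zip(cam_pts, scr_pts):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A (smallest singular vector).
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]          # normalize so H[2, 2] == 1

def camera_to_screen(H, pt):
    """Apply H to a camera-space point, with perspective divide."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return (x / w, y / w)

# Example calibration: the four projected corners as seen by the camera
# (hypothetical pixel values) paired with the projector's own corners.
cam_corners = [(102, 88), (540, 95), (530, 420), (110, 410)]
scr_corners = [(0, 0), (1280, 0), (1280, 720), (0, 720)]
H = estimate_homography(cam_corners, scr_corners)

# A fingertip the depth camera sees at pixel (320, 250) lands here on screen:
print(camera_to_screen(H, (320, 250)))
```

Once the homography is computed during setup, every touch the Kinect detects can be translated into a click or drag at the matching position on the projected “screen.”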
One of the more interesting developments in motion control is how the Kinect has been used and modified for all manner of non-gaming applications. What might have started out as a way to play soccer in your living room has quickly emerged as a motion-detecting, 3D-imaging jack-of-all-trades. The system sold around 8 million units in the first 60 days after launch, and Microsoft is working on a new version.
With the kind of success the Kinect has had, it shouldn’t come as any surprise that other companies have looked at motion control and tried to figure out new ways to improve on the technology. Leap Motion hopes to have succeeded with its upcoming release, the Leap.
If you aren’t much of a console gamer, you might not think that Halo is the best thing to come out of Microsoft’s venture into the gaming market with the Xbox. Increasingly though, it seems like the most interesting item to result from Xbox support is the Kinect. We already talked about Chaotic Moon’s use of the technology (here), and now we’ve stumbled across a couple more ways people are using the Kinect.
In South Korea, the Kinect is being used along with RFID chips, holograms and gigantic 3D screens to create an interactive center called LivePark. The company behind LivePark, D’strict, calls its creation a “4D Art Park” that offers gaming, social activities and live performances.
Designers and developers interested in gesture-based interfaces finally got what they’ve been clamoring for from Redmond: Microsoft’s Kinect for Windows commercial sensor unit is now available, and the company has launched a Kinect Accelerator Program designed to encourage small companies to develop business applications using the technology.
The commercial sensor unit only works with gesture-aware apps built with the Kinect for Windows SDK, which can support up to four Kinect units on a single PC and includes a “near mode” that allows the units to see objects as close as 40 cm.
According to Microsoft, 200 companies are already involved in a pilot program to develop commercial applications for the popular motion-sensing game interface.
Fans of the Tom Cruise film Minority Report will be interested in a new interface technology under development by researchers at Carnegie Mellon and Microsoft Research. The new wearable system, called OmniTouch, can turn any surface into a touchscreen, in much the same way characters in the 2002 film could manipulate data using hand gestures.
OmniTouch combines a laser-based pico projector with a depth-sensing camera (a prototype from PrimeSense) similar to the Kinect camera used with the Xbox 360. The system detects the proximity of the user’s finger to the surface to determine when a “touch” is actually occurring.
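That proximity test is simple to sketch: compare the fingertip’s depth reading against the depth of the surface behind it, and register a touch when the gap falls below a small threshold. The code below is an illustrative toy version of the idea, not OmniTouch’s actual implementation; the threshold and pixel values are assumptions of mine.

```python
import numpy as np

TOUCH_THRESHOLD_MM = 10.0   # illustrative: fingertip within ~1 cm counts as a touch

def is_touching(depth_map, surface_depth, finger_px, threshold_mm=TOUCH_THRESHOLD_MM):
    """Return True if the fingertip at pixel (row, col) is close enough
    to the modeled surface to count as a touch.

    depth_map     -- per-pixel depth readings from the camera, in mm
    surface_depth -- per-pixel depth of the bare surface (captured once,
                     with no hand in view), in mm
    finger_px     -- (row, col) of the detected fingertip
    """
    r, c = finger_px
    gap = surface_depth[r, c] - depth_map[r, c]   # finger sits in front of the surface
    return bool(0 <= gap <= threshold_mm)

# Toy example: a flat surface 600 mm away, finger hovering vs. touching.
surface = np.full((480, 640), 600.0)
hover = surface.copy(); hover[240, 320] = 560.0   # 40 mm above the surface
touch = surface.copy(); touch[240, 320] = 594.0   # 6 mm above the surface

print(is_touching(hover, surface, (240, 320)))   # hovering -> no touch
print(is_touching(touch, surface, (240, 320)))   # near-contact -> touch
```

In practice a system like this also has to segment the hand and locate fingertips in the depth image before this check runs, but the touch decision itself comes down to this depth comparison.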
(For another look at how human-computer interfaces are evolving, see our coverage in Desktop Engineering here.)
OmniTouch was developed by Chris Harrison, a Ph.D. student in the Human-Computer Interaction Institute at Carnegie Mellon University, and a Microsoft Research Ph.D. Fellow, along with Andy Wilson, principal researcher at Microsoft Research, and Hrvoje Benko, also of Microsoft Research.
Microsoft has created a number of similar technologies, including Surface (a multi-touch technology that allowed users to manipulate content using gesture recognition), and LightSpace, which allowed users to project a display interface across an entire room. See how the Kinect camera is being used to interact with 3D digital objects in our previous HoloDesk blog post here.
The project was presented during UIST 2011, the Association for Computing Machinery’s annual Symposium on User Interface Software and Technology, earlier this month.