
Touchless Technology


By Andy Greenberg

In the future of Steven Spielberg's Minority Report, Tom Cruise turns on a wall-sized digital display simply by raising his hands, which are covered with black, wireless gloves. Like an orchestra's conductor, he gestures in empty space to pause, play, magnify and pull apart videos with sweeping hand motions and turns of his wrist.

Minority Report takes place in the year 2054. The touchless technology that it demonstrates may arrive many decades sooner. In fact, John Underkoffler, a graduate of the Massachusetts Institute of Technology's Media Lab who advised the Minority Report filmmakers, has already founded a firm to bring gesture-recognition computing into the real world.

His company, Oblong Industries, is in "stealth" mode, and he declined to comment. But Underkoffler's former professor at MIT, Hiroshi Ishii, says Underkoffler is working on a system of gloves and cameras that allows just the sort of ethereal interface Spielberg imagined, and that he has spoken with major tech firms about licensing the technology.

Oblong is one of a wave of companies hoping to retire the mouse, that ancient piece of hardware that has dominated computer interfaces for nearly 25 years. But while multi-touch interfaces like Apple's iPhone and Microsoft's Surface seem to represent the likeliest mouse-killers in the near term, a few firms are already developing science fiction's next intersection with reality: computers that can be controlled with simple gestures in the air.

On July 14th, Toshiba will release the Qosmio G55, a laptop selling for around $1500 and offering what may be the first integration of gesture recognition and day-to-day computing. With Toshiba's media center software, users can pause or play videos and music by holding an open palm up to the screen. Make a fist, and your hand functions as a mouse, pulling a cursor around the screen. Flip your thumb up and down to click.
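In practice, controls like these reduce to a simple dispatch table: a classifier labels each camera frame with a hand pose, and each pose maps to a media action. The Python sketch below illustrates the idea; the gesture labels and helper names are hypothetical stand-ins for the example, not Toshiba's actual software.

GESTURE_ACTIONS = {
    "open_palm": "toggle_playback",   # holding an open palm pauses or plays
    "fist": "drag_cursor",            # a fist drives the cursor like a mouse
    "thumb_flip": "click",            # flipping the thumb clicks
}

def handle_frame(gesture_label, player):
    # Dispatch a classified hand pose to the matching media action.
    action = GESTURE_ACTIONS.get(gesture_label)
    if action == "toggle_playback":
        player.toggle_playback()
    elif action == "drag_cursor":
        player.follow_hand()
    elif action == "click":
        player.click()

class DemoPlayer:
    def toggle_playback(self): print("play/pause")
    def follow_hand(self):     print("cursor follows fist")
    def click(self):           print("click")

# Simulated stream of classified frames:
player = DemoPlayer()
for label in ["open_palm", "fist", "fist", "thumb_flip"]:
    handle_frame(label, player)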

Mark Lackey, a product manager at Toshiba, imagines a chef using the gestures to control a movie's playback in a kitchen without smearing grease or flour on his keyboard or trackpad. "Everybody uses technology differently," Lackey says. "We're really trying to expand the horizon for how users can choose to connect with their computers."

GestureTek, another player in the nascent gesture-recognition market, has been developing motion-sensing interfaces for around 20 years and builds software for integrating gesture-sensing cameras into video game systems, including the PlayStation 2's EyeToy and the Xbox Live Vision camera. Those cameras interpret players' movements, allowing an onscreen avatar to mimic arm gestures or walking motions.

More recently, GestureTek has been working to bring that touchless tech to the non-gaming world. The Sunnyvale, Calif.-based company has built demonstrations of interfaces that let users control Windows Media Center entirely by hand motions. Flashing an open palm at the screen "wakes up" the gesture interface; moving a hand in a circular motion cycles through a menu, as if turning a large, invisible iPod click wheel. All motions can be performed from as far as 10 feet away from the camera.
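The click-wheel behavior can be approximated by accumulating how far the tracked hand has rotated around a center point and stepping the menu once per fixed slice of arc. The sketch below is a rough illustration under that assumption; the 45-degree step size and the tracking input are invented for the example, not GestureTek's implementation.

import math

def hand_angle(center, point):
    # Angle of the hand around the circle's center, in degrees.
    return math.degrees(math.atan2(point[1] - center[1], point[0] - center[0]))

def cycle_menu(hand_positions, menu, center=(0.0, 0.0), step_deg=45.0):
    index, accumulated = 0, 0.0
    prev = hand_angle(center, hand_positions[0])
    for point in hand_positions[1:]:
        current = hand_angle(center, point)
        delta = (current - prev + 180) % 360 - 180  # shortest signed change
        accumulated += delta
        prev = current
        while accumulated >= step_deg:     # rotation one way: next item
            index = (index + 1) % len(menu)
            accumulated -= step_deg
        while accumulated <= -step_deg:    # rotation the other way: previous item
            index = (index - 1) % len(menu)
            accumulated += step_deg
    return menu[index]

# A hand sweeping 135 degrees around the center advances three menu steps:
positions = [(math.cos(math.radians(a)), math.sin(math.radians(a)))
             for a in range(0, 136, 15)]
print(cycle_menu(positions, ["Music", "Videos", "Photos", "Settings"]))  # Settings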

In other models, GestureTek uses three-dimensional cameras that bounce infrared light off a user's hands and measure the beam's travel time. That allows the system to gauge distance and register motions toward and away from a computer screen, rather than merely side to side or up and down. In these 3D systems, users can "click" on an onscreen object by merely tapping the air in front of them. Hold a finger forward and move it laterally to drag and drop.
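The distance math behind such time-of-flight cameras is simple: light travels to the hand and back, so the range is half the round-trip time multiplied by the speed of light. A quick illustration (the timing value is made up for the example):

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(seconds):
    # One-way distance is half the round trip at the speed of light.
    return SPEED_OF_LIGHT * seconds / 2

# A pulse returning after about 6.7 nanoseconds puts the hand roughly a meter away:
print(f"{distance_from_round_trip(6.7e-9):.2f} m")  # 1.00 m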

GestureTek co-founder Francis MacDougall says the technology will be integrated by a "major PC manufacturer" in the next three months. "This isn't a gimmick," he says. "If I can get 90% of applications like e-mail or checking stocks or the weather through simple gestures, everything starts to meld into a single display like a digital photo frame. The desktop is cleared off and all the extra pieces are eliminated."

As gesture recognition comes to fruition, other touchless interface technologies are quickly moving in to take its place in the realm of "too futuristic to be possible." Among the most eagerly anticipated are--yes, this is nonfiction--headsets that allow computers to be controlled with thought alone.

Gaming hardware makers like OCZ Technology, NeuroSky and Emotiv offer gadgets that wrap around users' heads to read the brain's electrical impulses. OCZ describes its Neural Impulse Actuator, a headband with three sensors, as capable of allowing full control of a video game, while Emotiv chief executive Nam Do describes the technology as "another layer" of control that can integrate mental responses into other kinds of interfaces. Trying to use a headset to replace a mouse, Do says, would lack accuracy and introduce a 150-millisecond delay--too long for most users.

Some researchers, on the other hand, have successfully created brain implants that allow true mind-machine interfaces--accurately piloting an operating system with thoughts alone. Scientists like Brown University's John Donoghue and the University of Pittsburgh's Andrew Schwartz put aspirin-sized electrodes directly on a subject's cortex. The person (or animal) can then control games and other software programs just by thinking. Donoghue has recorded quadriplegic patients moving a cursor around a screen to play games and manipulate MP3 software. Schwartz recently published a paper in Nature demonstrating that monkeys were able to feed themselves with a thought-controlled prosthetic arm.

But for those who'd prefer not to have a piece of their skull removed--or wave their arms in the air--the most fruitful touchless technology under development may be eye-tracking software. Companies like Tobii, LC Technologies and Eye Response Technologies are developing systems that would allow users to control a computer's cursor just by moving their gaze around the screen. The camera-based technology works by tracking a spot of infrared light reflected in a user's eyes to determine the location of his or her head. Then it measures the location of the eye's pupil and uses it to guide a pointer around the computer's display.
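This pupil-and-glint approach boils down to a small calculation: as the eye rotates, the vector between the infrared reflection (the glint) and the pupil center shifts, and a calibration step maps that vector to a point on the screen. The sketch below uses a deliberately simple per-axis linear fit with invented calibration numbers; real trackers use richer models than this.

def gaze_point(pupil, glint, calib):
    # Map the pupil-minus-glint vector to a screen position using a per-axis
    # linear fit (gain and offset) learned during calibration.
    vx, vy = pupil[0] - glint[0], pupil[1] - glint[1]
    (ax, bx), (ay, by) = calib
    return (ax * vx + bx, ay * vy + by)

# Invented calibration: each pixel of glint-to-pupil offset in the camera image
# maps to about 120 screen pixels, with the screen center at (960, 540).
calibration = ((120.0, 960.0), (120.0, 540.0))
print(gaze_point(pupil=(203.0, 151.0), glint=(200.0, 150.0), calib=calibration))
# (1320.0, 660.0): gaze falls right of and below the screen center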

In a study last year, Manu Kumar, a researcher at Stanford, showed that an interface for tracking a user's gaze with a camera was 14% faster than using a mouse in some cases, although it produced about 2.5 times as many errors. A bigger barrier may be price: A fully functioning interface costs around $25,000. Still, Kumar is confident that in two or three years, eye-tracking technology will find its way to the mainstream.

"Currently we can only poke our finger at a mouse or keyboard, but eye contact is a huge source of context about our interaction and attention," he says. "Once you allow a computer to know where we're looking, it really opens up a whole new dialogue between computers and humans."
