Nice Gesture

Raytheon’s gesture-technology demo, called Interactive Gestural Exploitation and Tools (IGET), consists of an off-the-shelf DLP projector and special gloves embedded with reflective beads.

IN THE 2002 movie “Minority Report,” Tom Cruise plays a detective who conducts investigations by standing in front of a chalkboard-sized display and using special gloves to sort through videos, photos, and text. It may seem like science fiction, but it also looks like a viable way to get a grip — literally — on more information than you could make sense of on a PC screen. That’s why companies such as Raytheon are developing gesture-technology systems for use in applications where information overload is common.

“My customers tend to have more information than they can use, and I try to sell them systems that give them even more,” says Allan Mattson, a technologist at Waltham, MA-based Raytheon. “If they can’t start using it more efficiently, they don’t need to buy systems that collect more. So we initiated this system as a demo to show customers that there are different ways to take advantage of information and speed up the processing.”

Raytheon’s gesture-technology demo — called Interactive Gestural Exploitation and Tools (IGET) — consists of an off-the-shelf DLP projector and special gloves embedded with reflective beads. Infrared cameras are suspended from a truss above the screen and monitor the gloves’ reflections in order to track the wearer’s hand movements. A computer collects the tracking information and adjusts the onscreen images accordingly.
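The article doesn’t spell out the tracking math, but a common way a pair of overhead cameras can recover a reflective bead’s distance is stereo triangulation. A minimal sketch, assuming an idealized pair of parallel pinhole cameras (the focal length and baseline values are invented for illustration):

```python
def depth_from_disparity(x_left, x_right, focal_px, baseline_m):
    """Estimate the depth of a reflective bead seen by two parallel cameras.

    x_left/x_right: horizontal pixel coordinate of the bead in each image.
    focal_px: camera focal length expressed in pixels.
    baseline_m: distance between the two camera centers, in meters.
    """
    disparity = x_left - x_right  # pixels; grows as the bead gets closer
    if disparity <= 0:
        raise ValueError("bead must appear farther right in the left image")
    return focal_px * baseline_m / disparity  # classic Z = f * B / d

# Made-up numbers: 1000 px focal length, 0.5 m baseline, 40 px disparity
# puts the bead 12.5 m from the camera pair.
print(depth_from_disparity(620, 580, focal_px=1000, baseline_m=0.5))
```

Repeating this for each bead on the glove, frame after frame, yields the stream of hand positions the computer uses to update the display.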

For example, as in the movie, the user simply waves at an image to move it, or points at it to zoom in. The system works with just about any type of data and multimedia. “Anything you can display, you can interact with,” Mattson says.
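How the demo tells a wave from a point isn’t specified; one plausible approach is to threshold hand speed over a short window of tracked positions. A hypothetical sketch (the threshold, frame rate, and gesture labels are all assumptions, not Raytheon’s design):

```python
import math

def classify_gesture(positions, dt=1 / 60, speed_threshold=0.5):
    """Label a short window of tracked hand positions (x, y in meters).

    A fast sweep across the screen reads as a 'wave' (move the image);
    a hand held nearly still reads as a 'point' (zoom in).
    """
    speeds = []
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    avg_speed = sum(speeds) / len(speeds)
    return "wave" if avg_speed > speed_threshold else "point"

still = [(1.0, 1.0)] * 10                    # hand held steady
sweep = [(0.1 * i, 1.0) for i in range(10)]  # hand moving left to right
print(classify_gesture(still))  # point
print(classify_gesture(sweep))  # wave
```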

That sounds convenient, but it also raises a question: Why wouldn’t a big touchscreen work just as well? One reason is that to use a touchscreen, you can’t be more than an arm’s length away. At that distance, you can’t take in with a single glance all of the information that only a big display can accommodate. Instead, a small screen produces information overload by trying to shoehorn everything into your immediate field of vision.

“You’re tunnel-visioned into the stuff that’s right in front of you,” says John Underkoffler, whose gesture-technology research at the Massachusetts Institute of Technology (MIT) caught the eye of “Minority Report’s” producers. “So you have to stand back. Then what? Do you use a mouse? A mouse isn’t that great for doing things on a regular-sized screen, and if the screen is large, you’re going to roll that mouse for quite a while.”

Just move it

Underkoffler’s gesture technology — called g-speak — is an offshoot of his MIT Luminous Room project, which used a similar combination of projectors and sensors to make desktops and other ordinary surfaces interactive. Part of the goal was to reduce the reliance on computer-aided design (CAD) tools, which require extensive training. In the case of urban design, for example, Underkoffler’s system made a streetscape interactive by projecting digital shadows.

“As you moved the buildings around, the shadows moved around,” Underkoffler says. “You could project a clock on the table and change the time of day, or change the latitude — all the things that are hard to do even in a CAD system. In order to move a building, you didn’t have to memorize any keyboard shortcuts or navigate through a complex menu. You do what every eight-month-old knows how to do: Grab something and move it.”

“Minority Report’s” producers stumbled onto Luminous Room while hunting for ideas and made Underkoffler a technology advisor on the movie. A Raytheon engineer saw the picture and recognized potential applications, so the company hired Underkoffler as a contractor to develop a demo. Meanwhile, buoyed by the response to his work on “Minority Report” and other films, Underkoffler moved to Hollywood and started a company called “g-speak,” which is refining and commercializing gesture-technology systems.

Raytheon’s interest suggests that the technology is closer to reality than its sci-fi pedigree might imply. Indeed, although a human talking to a computer might have seemed far-fetched just a decade or two ago, today it’s difficult to call a company’s customer service number and not be greeted by a machine that asks questions before routing you to a person.

Building a commercial system wouldn’t require major technological stretches because IGET and g-speak leverage a lot of existing equipment, such as DLP projectors. In fact, Raytheon’s demo does everything that the gesture technology system in “Minority Report” does.

“The human-to-computer interface is working as advertised,” Mattson says. “It’s intuitive: You point, and a cursor shows up on the screen where you’re pointing. You don’t have to point precisely. You just point toward the cursor, see where it is, and then move your hands accordingly.”
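What Mattson describes — point roughly, watch the cursor, adjust — is essentially closed-loop relative control, like a mouse held in the air. A minimal sketch of that kind of mapping (the gain and screen resolution are assumptions for illustration):

```python
def update_cursor(cursor, hand_delta, gain=800, screen=(1920, 1080)):
    """Move the on-screen cursor by a scaled hand displacement.

    cursor: current (x, y) cursor position in pixels.
    hand_delta: hand movement since the last frame, in meters.
    gain: pixels of cursor travel per meter of hand travel.
    The user never has to point at an absolute spot on the screen;
    they nudge the cursor toward the target, correcting as they watch it.
    """
    x = min(max(cursor[0] + gain * hand_delta[0], 0), screen[0] - 1)
    y = min(max(cursor[1] + gain * hand_delta[1], 0), screen[1] - 1)
    return (x, y)

# A 10 cm hand move to the right and 5 cm up shifts the cursor
# 80 px right and 40 px up from screen center.
print(update_cursor((960, 540), (0.1, -0.05)))  # (1040.0, 500.0)
```

Clamping to the screen edges means an overshooting sweep simply parks the cursor at the border rather than losing it.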

Gesture technology can be precise enough even for the most demanding applications, such as telesurgery. “What we’re finding is that the precision is tremendous because your hands are capable of that,” Underkoffler says. “For example, standing 10 feet back from our current screen — which is 16 feet wide and 4 feet high — you can position objects on the screen to roughly one pixel resolution.”
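Underkoffler’s one-pixel claim can be sanity-checked with a little geometry. Assuming, hypothetically, that the 16-foot-wide screen shows 1920 horizontal pixels (the article doesn’t state the resolution), each pixel is about a tenth of an inch wide, and from 10 feet away it subtends only a few hundredths of a degree:

```python
import math

SCREEN_WIDTH_IN = 16 * 12  # 16-foot-wide screen, in inches
PIXELS_ACROSS = 1920       # assumed resolution; not stated in the article
DISTANCE_IN = 10 * 12      # user stands 10 feet back

pixel_pitch = SCREEN_WIDTH_IN / PIXELS_ACROSS  # inches per pixel
angle_deg = math.degrees(math.atan2(pixel_pitch, DISTANCE_IN))

print(f"pixel pitch: {pixel_pitch:.2f} in")        # 0.10 in
print(f"one pixel subtends: {angle_deg:.3f} deg")  # ~0.048 deg
```

Resolving a twentieth of a degree of hand angle is demanding but plausible for high-end infrared tracking cameras, which is consistent with the precision Underkoffler reports.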

Although the current demos use gloves, gesture technology could track other limbs, too, for a more immersive experience. “If you wanted to do a virtual tour of a building, you could simulate walking through the building just by your motions,” Mattson says. “The technology could be taken to the extent of instrumenting all of your body.”

From sci-fi to commercialization

Scalability is one issue that still needs to be worked out. Raytheon has two demo units — the largest with a display that’s 20 feet by 20 feet. In both, the user has to stay in a fairly small “sweet spot” for the cameras to pick up the gestures. That’s fine for many applications, but not for, say, a magic act in an auditorium. Mattson maintains that the design and technology could be tweaked to allow for greater distances between the user and the cameras, in turn creating a bigger display.

Another issue is the software that tracks the user’s movements and then adjusts the display accordingly. Mattson says that although the technology works in a demo setting, making it robust enough for commercial applications — especially demanding ones such as air traffic control — will require time and money. Gesture technology’s raison d’être makes that a tall order: Because these systems are intended for applications where information overload is common, they have to be able to work with multiple streams of video and data. But tying together so much information from so many disparate sources while maintaining high reliability could jack up the price.

Those variables make it difficult to estimate how much a commercial gesture-technology system might cost. “The price is also going to be driven by how much it costs to mature it and ‘productize’ it,” Mattson says.

Raytheon’s core customers are in government and defense, which is one reason the first commercial IGET systems will target those sectors rather than wider pro AV applications. “I’d guess that within a couple of years, we’ll start defense-oriented programs that have it as an underlying technology,” Mattson says.

The other reason for initially targeting that market is cost: Although IGET uses off-the-shelf display technology, the infrared cameras are high-end models currently used only in niche applications such as movie animation. Each camera costs hundreds of thousands of dollars, pushing the total system cost deep into six-figure territory.

“For military applications, a couple hundred thousand dollars isn’t a show-stopper,” Mattson says. “For AV applications, we would have to change the way the gloves were sensed and make it more affordable.”

Another way to reduce the price, and thus make the technology viable for more applications, is to sell more systems, with the price tag decreasing as equipment volumes increase — the process that all new technologies go through. That takes time, but judging by the response to Raytheon’s demos thus far, there are already plenty of potential customers, many of whom grasp the benefits on their own rather than only after a sales pitch. For example, Raytheon recently got a call from a Midwestern city. “They were seeing applications for their emergency command center,” says Sabrina Steele, a spokesperson for the company’s Space and Airborne Systems division.

Besides selling IGET systems directly to government agencies, Raytheon also is exploring the possibility of creating a software package that other companies could license to create less expensive gesture-technology systems. For example, it’s already talking with Treadle & Loam, a Los Angeles-based company that develops 2D and 3D human-machine interface technologies for industries such as medical, entertainment, and public-venue design.

g-speak also is leaning toward building systems for sale directly to users and licensing the technology. “We’re choosing two or three specific applications that we’ll develop software and systems for,” Underkoffler says. “The key is going to be finding the appropriate partners.”

Tim Kridel is a freelance writer and analyst who covers telecom and technology. He’s based in Kansas City and can be reached at [email protected]
