As AR and VR devices continue to boom, some of the most interesting research and development isn't going into visuals, but into how we communicate with our devices. Vimal Mollyn, a PhD student at Carnegie Mellon University's Human-Computer Interaction Institute, and his team are developing an interface that turns our own bodies into virtual touchscreens.
Unlike other efforts to overlay interfaces on body parts, Mollyn's method, EgoTouch (still in prototype), accomplishes this with only the RGB optical camera already built into a VR headset. EgoTouch uses the camera to detect the shadows and skin deformations produced when a finger presses down on a body part, turning that area into a pressable "button." According to Mollyn and his team, EgoTouch currently detects touches on the palm with 96% accuracy and classifies whether a touch was soft or hard with 98% accuracy. Impressively, these figures hold across tests with different skin tones, hair densities, and lighting conditions.
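To make the idea concrete: pressing a finger into the palm casts a small contact shadow and deforms the surrounding skin, both of which darken pixels near the contact point. The toy sketch below illustrates that cue with a crude brightness heuristic. It is not EgoTouch's actual model (the team uses trained machine-learning models, which are not reproduced here), and the fingertip location, patch sizes, and threshold are all assumptions for illustration.

```python
# Illustrative sketch only: EgoTouch's real, trained models are not public.
# This toy heuristic shows the general cue the article describes, assuming
# the fingertip's pixel location in the frame is already known.
import cv2
import numpy as np


def press_score(frame_bgr: np.ndarray, fingertip_xy: tuple,
                radius: int = 20) -> float:
    """Return a crude 'press' score in [0, 1] from local shadow darkening.

    A finger pressing into the palm casts a contact shadow and deforms
    the skin, darkening pixels at the contact point relative to the
    surrounding skin.
    """
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    x, y = fingertip_xy
    h, w = gray.shape
    # Inner patch: pixels right at the (assumed) contact point.
    inner = gray[max(0, y - radius):min(h, y + radius),
                 max(0, x - radius):min(w, x + radius)]
    # Outer patch: surrounding skin, used as a brightness baseline.
    outer = gray[max(0, y - 3 * radius):min(h, y + 3 * radius),
                 max(0, x - 3 * radius):min(w, x + 3 * radius)]
    darkening = (outer.mean() - inner.mean()) / max(outer.mean(), 1.0)
    return float(np.clip(darkening * 5.0, 0.0, 1.0))


cap = cv2.VideoCapture(0)  # a webcam as a stand-in for the headset camera
ok, frame = cap.read()
if ok:
    score = press_score(frame, fingertip_xy=(320, 240))
    print("touch" if score > 0.5 else "no touch", f"(score={score:.2f})")
cap.release()
```

A real system would replace this hand-tuned threshold with a classifier trained on labeled frames, which is how it can also distinguish soft from hard presses and stay robust across skin tones and lighting.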
“For the first time, we have a system that just uses a camera that is already in all the headsets. Our models are calibration-free, and they work right out of the box,” says Mollyn. “Now we can build off prior work on on-skin interfaces and actually make them real.”