Virtual Education

May 11, 2012 3:44 PM
By Cynthia Wisehart

Technology gives students a more real experience.

The first thing I notice about the new Virtual Reality Design Lab at the University of Minnesota’s College of Design is that it’s flooded with light. Or—from the viewpoint of an optical motion capture sensor—it’s flooded with noise.

The light/noise streams in from all four exposures through a 360-degree corona of high windows. Light pools on the floors and bounces across the expansive space, striking extrusions and truss work, white walls, furniture, and debris that has accumulated through meetings, soirees, and projects. At the heart of the college’s Rapson Hall, this courtyard is a gathering place, and thoroughfare—and the new virtual reality lab.

In addition to the virtual reality experience, students also have a simple way to project a large-scale image of their plan drawings onto the courtyard floor. In this picture, the Sony projector points toward a screen on the wall, but the Epson points straight down. It’s on the LAN, so students can easily upload drawings to the projector and project them directly onto the floor.

It is unusual for a virtual reality or motion capture space to be this large. It is even more unusual for it to be this bright.

For this reason, says Associate Dean of Architecture Lee Anderson, one of the most established virtual reality/motion capture systems, from Vicon, was not practical for the space—the sensors were not suited to the highly reflective environment. Instead, Anderson took his self-taught passion for virtual reality to tradeshows and seminars and came away from one of IEEE’s VR shows sold on the PhaseSpace Impulse system.

Associate Dean of Architecture Lee Anderson holds the Sensics headset that he modified to position the LED markers in a “tree”.

IEEE’s annual—and long-standing—virtual reality conference was held this year in March in Orange County, Calif., right about the time Google’s augmented reality glasses were in the news. While some may now think Google invented augmented reality, IEEE’s Visualization and Graphics Committee has been putting on this conference for at least the 20 years I’m aware of, and possibly longer. This year it was co-located with IEEE’s Symposium on 3D User Interfaces. The conference includes technical papers and demos; there was discussion of maximizing virtual walking and applying virtual odors. Lest this sound too esoteric, applications included medicine, research, education, and entertainment, and the market applications seem more mainstream than ever before. Sponsors included Christie, Barco, and Canon, alongside the specialized motion capture systems vendors.

Certainly at Rapson Hall it seems that virtual reality will become a daily tool in the design students’ toolbox. Yet it was clear from talking to Anderson that this is still in many ways an improvisational science, one that involves a certain amount of winging it with extra-wide cellophane tape (to attach the LED tree to the headsets, of course).

The Impulse system in use at Rapson Hall is from San Leandro, Calif.-based PhaseSpace. Founded in 1994, the company aims to offer an affordable alternative to $150,000 to $300,000 tracking systems. One of the ways PhaseSpace reduced total cost of ownership for its system was to make it easier to use in rooms with a lot of light. Although cost was definitely a factor for Anderson, it was the system’s ability to work effectively in direct sunlight that made all the difference. Technically and philosophically, Anderson wanted the technology to integrate into the heart of the students’ studies. The university already had a small, dark virtual reality lab tucked into an out-of-the-way place on campus; Anderson wants technology to be right there in the midst of the architecture school, part of the Socratic flow as students meet and mingle in Rapson Hall, reaching for ways to collaborate.

Another factor that favored the Impulse system was its patented active LED technology and realtime processing. In part this eliminates a standard chore of motion capture: when markers overlap, an operator has to override the computer to tell it which marker is which. While many motion capture systems are used in environments (like Hollywood) where postprocessing is expected, and even preferred, the ability to eliminate it and avoid what’s called marker-swapping was key to the application at Rapson Hall. Since all of the PhaseSpace markers transmit a unique ID, there is no marker swapping. Further, the system is portable, simple, and accurate enough to suit a realtime application like the one Anderson had in mind for his architecture students.
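
To make the marker-swapping point concrete, here is a minimal sketch of the bookkeeping involved; the data structures and functions are hypothetical illustrations, not PhaseSpace’s actual API. With anonymous passive markers, a tracker has to guess which observation belongs to which marker from frame to frame; with ID-broadcasting active LEDs, there is nothing to guess.

```python
# Toy illustration of why unique marker IDs prevent "marker swapping".
# Hypothetical sketch only -- not PhaseSpace's API or any real tracker.

from math import dist

def track_with_ids(frame):
    """Active LED markers broadcast an ID, so association is a direct lookup."""
    # frame: {marker_id: (x, y, z)} -- identity arrives with every sample.
    return dict(frame)

def track_without_ids(prev, observations):
    """Anonymous markers must be matched to last frame's labeled positions.
    Nearest-neighbor matching can hand a label to the wrong marker when two
    markers cross paths -- the error an operator would normally fix in post."""
    assigned = {}
    unused = list(observations)
    for label, last_pos in prev.items():
        best = min(unused, key=lambda p: dist(last_pos, p))
        assigned[label] = best
        unused.remove(best)
    return assigned

# Two hand markers that have just crossed: left is now near x=0.9, right near x=0.1.
prev = {"left_hand": (0.0, 0.0, 1.0), "right_hand": (1.0, 0.0, 1.0)}
observations = [(0.9, 0.0, 1.0), (0.1, 0.0, 1.0)]
# The anonymous tracker picks the nearest point and swaps the two labels.
print(track_without_ids(prev, observations))
# An ID-tagged frame needs no matching step at all.
print(track_with_ids({"left_hand": (0.9, 0.0, 1.0), "right_hand": (0.1, 0.0, 1.0)}))
```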

The virtual reality lab is not the only video-based innovation Lee Anderson brought in. Anderson also developed a triple-screen environment, an alternative to the costly poster board presentation pinups that line the mezzanine of Rapson Hall. The triple-screen environment provides three HD (1920×1080) monitors coupled with a PC that can treat the set of three monitors as a single window. The high-resolution panoramic format allows a great deal of information, such as diagrams, flowcharts, callouts, and images, to be presented via PowerPoint using simultaneous or separate slides—like a videowall. The advantages over static, plotted presentations are obvious. The PowerPoint/Photoshop/SketchUp/InDesign templates and instructions that Anderson has developed for the environment allow students to create presentations that can be archived and repurposed for other media, and they allow hundreds of presentations to share a single board.
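
As a rough illustration of what it means to treat the three monitors as one canvas, here is a small Python sketch (a hypothetical layout helper, not part of Anderson’s templates) that computes where each 1920×1080 panel lands on the combined 5760×1080 panoramic surface.

```python
# Hypothetical layout helper: three 1920x1080 monitors treated as one wide canvas.
# Not part of the Rapson Hall templates; it only computes panel geometry.

PANEL_W, PANEL_H, PANELS = 1920, 1080, 3

def panel_rect(index):
    """Return (x, y, width, height) of a panel on the combined canvas."""
    if not 0 <= index < PANELS:
        raise ValueError("panel index out of range")
    return (index * PANEL_W, 0, PANEL_W, PANEL_H)

combined = (PANEL_W * PANELS, PANEL_H)  # the single 5760x1080 "window"
print(combined)
print([panel_rect(i) for i in range(PANELS)])
```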

“One of the hardest things for architecture students to learn is how their designs will feel in three-dimensional space, and how it will feel to move through them. It’s something they learn when the buildings they design are constructed,” he says a little ironically, fully aware that he’s talking about experience gained over years. So how to speed that experience? The virtual reality lab does not simulate the walking; it immerses students in their own drawings and models and lets them do the walking themselves. Paint and finishes are simulated; a student can put on the Sensics headset and walk through the expansive courtyard as if walking through the 3D model of their building. Other students can watch the walkthrough on a monitor, while the student in the headset can switch among multiple models preloaded into the virtual reality system computer.

The installation is relatively simple—a series of cameras mounted on an overhead truss, wired via Cat-5 to the system’s computer. The cameras read the headset sensors wirelessly, determining where the viewer is in the virtual space and reporting to the computer, which then plays the corresponding view back to the viewer’s headset (via a beltpack)—also wirelessly. The virtual reality computer lives on the LAN, so drawings and models can be readily uploaded. The headset and the beltpack are variations on the Sensics system elements. The tree-like configuration of LEDs that sticks up from the headset is an adaptation Anderson picked up from Mark Bolas, the director of the Mixed Reality Lab at the University of Southern California. And the beltpack receivers seem to have come from a consumer electronics store. It’s pretty clear that for this application, tinkering with the wearable interfaces is irresistible and ongoing.
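
A rough sketch of that round trip follows; everything in it is a hypothetical stand-in (the pose math, the renderer, and the model name are illustrations, not the lab’s actual software).

```python
# Toy sketch of the round trip described above: overhead cameras report the
# headset's LED markers, the computer estimates the viewer's pose, renders the
# matching view of the preloaded model, and sends the frame back over the
# wireless beltpack link. Hypothetical stand-ins only, not the lab's software.

from dataclasses import dataclass

@dataclass
class Pose:
    x: float
    y: float

def estimate_pose(marker_positions):
    """Average the LED-tree markers to approximate the viewer's head position."""
    xs = [p[0] for p in marker_positions]
    ys = [p[1] for p in marker_positions]
    return Pose(sum(xs) / len(xs), sum(ys) / len(ys))

def render_view(model_name, pose):
    """Stand-in for the renderer: describe the frame that would go to the headset."""
    return f"{model_name} viewed from ({pose.x:.2f}, {pose.y:.2f})"

def step(model_name, marker_positions):
    """One cycle: marker observations in, rendered frame out to the headset."""
    return render_view(model_name, estimate_pose(marker_positions))

# One frame of a walkthrough: three markers on the headset's LED tree.
print(step("student_model", [(2.0, 3.1), (2.1, 3.0), (1.9, 3.2)]))
```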

The Virtual Reality Design Lab at Rapson Hall was seeded eight years ago with an alumni gift from Ted and Linda Johnson to foster a cooperative effort between architecture and computer science, where Anderson’s counterpart is Victoria Interrante. Subsequent funding came from various sources including NSF grants, and it was envisioned that the lab would support other disciplines within the College of Design as well as architecture. So Anderson points out that landscape architects could walk through their designs, graphic artists could see their billboards, and the fashion department has already experimented with virtual fashion shows in which virtual models walk by your headset as if they were walking down a runway. Well, not the models, but the clothes themselves, animated as if they were on a model.

These applications have yet to be integrated into the daily life of the lab, but from the looks of things, that life is just getting started now that the lab is finished and open. Beyond the College of Design, the lab could contribute to other applications, such as testing and modeling healthcare procedures (how long it takes a nurse to move among workstations, for example). The lab will also contribute to the body of work on human perception and virtual reality. There is still much to be learned, Anderson says, such as why humans perceive virtual distances as shorter than real ones. Who knows, Anderson or one of his students may be a future presenter at VR 2015 or beyond.

Virtual desk

The EXOdesk by the Canadian company EXOPC (in partnership with ViewSonic) debuted as a prototype at CES this year and created a minor sensation. Some called it the poor man’s Microsoft Surface; it’s smaller, at 32in. to 40in., and at a projected $1,000 to $1,500, it’s a fraction of the cost.

The ViewSonic EXOdesk, as it is now known, is an HD LCD touchscreen display that rests flat on the surface of your desk. It hooks up to a Windows or Mac computer and can be used to interact with what’s on the computer monitor. You can launch websites or use a virtual Microsoft Word keyboard. But it can also be used as a giant standalone tablet with a limited number of Surface-like touchscreen apps.

The EXOdesk’s first deployment will be in a Panamanian elementary school physics classroom, as shown in this image. The pilot classroom will feature 20 touchscreen desks, a larger EXOdesk for the teacher, and a larger-still interactive multi-touch “blackboard.” The screens are all connected via Wi-Fi to ease collaboration and help the teacher keep track of what kids are doing with the desks. In this deployment, the desk contains an Intel i5 processor and runs a version of Windows 7 and a custom HTML5 interface with the curriculum. Books, notebooks, writing utensils, and “ink” will be stored within the desk’s memory and accessible via the cloud.
