Cynthia Wisehart on Projection Mapping

Innovation in the projection mapping community has hit a period of acceleration, driven in part by the longtime collaborative relationships between creatives and technologists. In this issue, our reporter Jon Silberg went to the Nature’s Best Photography Awards, which were presented at the Smithsonian last week. As part of the festivities, the iconic rotunda was wrapped in a projection-mapped invocation of the winning photographs. Jon goes into more detail about that in the story.

For myself, I was inspired by something the content creator Danny Firpo said about using disguise as a collaborative tool. This is true of other projection mapping platforms, but since the application we were covering was disguise-based, I checked in by phone with disguise’s Anthony McIntyre.

Not surprisingly, McIntyre confirmed that things are changing fast in projection mapping, even more so over the past 18 months. In that time, features that disguise founder Ash Nehru had envisioned for years became reality, including the OmniCal calibration system, the VR monitoring capabilities, and aspects of previz and third-party integration that make disguise as much about content creation, workflow, and client interaction as it is about calculating and serving images.

Like other projection mapping platforms, disguise is now rapidly evolving past its early role as a playback tool into an experience platform that creatives, clients, and audiences share. The ways in which they share across the disguise platform will no doubt change, driven by the ideas of creatives and the demands of audiences for more interactivity in ever more virtual and large-scale worlds. Projection mapping is the closest thing we have to gaming in Pro AV—with the benefit of huge, high-resolution pictures that fill physical reality.

And speaking of those high-resolution images, the gear to show them doesn’t position or calibrate itself. So new tools that auto-calibrate multiple projectors and provide algorithms for various lumen levels, textures, and lighting conditions are changing how projectors and screens are deployed, providing many more options much further in advance. I don’t need to tell anyone why “further in advance” works.

McIntyre explains that advancements come both from improving workflow (projection calibration, previz, software integration) and from enabling better client/creative interaction. As a former theme park/museum designer, I can only imagine what I could have done with the VR previz headset aspect of disguise. The ability to virtually experience a space would transform my creative process. But equally important, the chance to have my client virtually experience the space would be just as transformative to the development and approval process.

Beyond all that there is another fascinating implication: that one day audiences can join right into the whole scene. Already there is a nascent example this month—at the Hollywood Bowl, the projection mapping was mixed live to the performers, as if it were lighting. But imagine even beyond that, where, as in gaming, audiences can be invited into the same world that creatives share. These were things we merely fantasized about when I was a designer, things we tried to make happen with the very terrestrial tools we had to work with. Now so much of that, and so much more, is and will be possible. Makes me want to go back into design.
