Mapping the Narrative

Storylines push projection mapping in new directions

On this past New Year’s Eve, roughly 45,000 revelers in Los Angeles were transfixed by the 22-story projection-mapping spectacular for Grand Park’s N.Y.E. L.A. celebration. The event was produced by yU+co, a collective of directors, designers, producers, animators, writers, programmers, and visual effects artists with offices in Hollywood, Hong Kong and Shanghai.
The multimedia display brought together 3D projection mapping, interactive sound, and storytelling elements, tied together around themes of Los Angeles, its culture and its people. It culminated in a 10-minute narrative piece projected onto two sides of the city’s iconic City Hall, which yU+co’s artistic director Garson Yu describes as a towering digital sculpture. The “sculpture” carries the storyline through animation and live-action imagery of a boy interacting with the two sides of the building’s façade. Because the project was a one-off, most of the work had to be done long before the company could set up real projectors and refine the geometry to match the specific characteristics of the large building.

Projection mapping generally involves creating a virtual 3D model of the surface the imagery will be projected onto, then using that model as a guide to contort the image so it conforms to the surface’s geometry. At a live event, the imagery and displacement information are fed into servers that crunch the numbers to distort the imagery as needed (sizing, frame area, keystoning and many more variables) and then spread it across multiple projectors to cover the very large spaces these presentations typically use.
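Conceptually, for a single flat surface patch seen by one projector, that contortion reduces to mapping corner correspondences and pre-warping the frame. The snippet below is a minimal sketch of that idea using OpenCV; it is not the pipeline or media-server software described in this article, and the corner coordinates and frame size are hypothetical.

```python
# Minimal sketch: pre-warp content so it lands correctly on one planar facade
# region as seen by a single projector. Real rigs juggle many more variables
# (multiple projectors, non-planar geometry); this only shows the core mapping.
import cv2
import numpy as np

def warp_for_surface(content, content_corners, projector_corners, out_size):
    """Map the content's corners onto the surface's corners in projector space,
    then warp the whole frame accordingly."""
    src = np.array(content_corners, dtype=np.float32)    # corners in the content frame
    dst = np.array(projector_corners, dtype=np.float32)  # same corners in projector pixels
    H, _ = cv2.findHomography(src, dst)                   # planar mapping between the two
    return cv2.warpPerspective(content, H, out_size)      # pre-distorted frame to send out

# Hypothetical numbers: a 1920x1080 clip squeezed onto a tilted quad that the
# projector sees at these pixel coordinates.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
warped = warp_for_surface(
    frame,
    [(0, 0), (1920, 0), (1920, 1080), (0, 1080)],
    [(310, 120), (1650, 200), (1580, 980), (270, 900)],
    (1920, 1080),
)
```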
Grand Park’s programming director Julia Diamond had chosen yU+co because of its work two years earlier on The Interactive New York event at that city’s Pier 57. Preparation for the Grand Park event started the previous spring, as Yu and his yU+co team recorded dozens of hours of live-action footage featuring local arts organizations, city landmarks, public events and Angelenos going about their day. His team of digital artists meanwhile produced graphics and animation for the pre-show as well as for an interactive element in which the intensity of audience cheering would be reflected in the visuals. And, of course, the team was also prepping the grand finale: the narrative piece that would use the 22 floors of City Hall as its canvas.
The project was so massive and in such a publicly traveled space that yU+co’s crew would get very little time to fine-tune the presentation on site before the event. Yet the facade of a particular, rather architecturally ornate building is obviously a very different medium from an industry-standard screen. It has its own reflective characteristics, shapes and imperfections that have developed over time. So Yu and his team had to map as much of their “screen” as possible before the short on-site prep period.
“First, we built models of the building in 3D using Autodesk Maya,” says Yu. “And then within Maya we would project images onto that model.” To simulate a projection spread over two sides of the structure, they positioned two virtual cameras within Maya to capture that additional dimension. “But there were still a lot of unknowns. We didn’t know exactly how the [building’s surface] would receive the image and the light. We started with some ideas that worked well in Maya, but as we progressed, we knew they wouldn’t work in this medium. It was a very site-specific project.”
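In plain terms, projecting an image onto a 3D model from a virtual camera means each vertex receives the texture coordinate of wherever it lands in that camera’s image plane. The snippet below is a simplified NumPy sketch of that projective-texturing step, not yU+co’s actual Maya setup; the camera position, focal length and vertex values are hypothetical.

```python
# Simplified sketch of projecting an image from a virtual camera onto a 3D
# model: each vertex is pushed through a pinhole camera, and its screen
# position becomes that vertex's UV coordinate into the content frame.
import numpy as np

def projective_uvs(vertices, cam_pos, cam_rot, focal, image_size):
    """vertices: (N, 3) world-space points; cam_rot: 3x3 world-to-camera rotation."""
    w, h = image_size
    cam_space = (vertices - cam_pos) @ cam_rot.T   # move points into camera coordinates
    x = focal * cam_space[:, 0] / cam_space[:, 2]  # perspective projection
    y = focal * cam_space[:, 1] / cam_space[:, 2]
    u = x / w + 0.5                                # normalize to a 0..1 UV range
    v = y / h + 0.5
    return np.stack([u, v], axis=1)

# Hypothetical values: three facade vertices and a camera 100 m back from them.
verts = np.array([[0.0, 0.0, 0.0], [10.0, 0.0, 0.0], [10.0, 30.0, 0.0]])
uvs = projective_uvs(verts,
                     cam_pos=np.array([5.0, 15.0, -100.0]),
                     cam_rot=np.eye(3),
                     focal=2000.0,
                     image_size=(1920, 1080))
```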
As the content started to take form, the company took an intermediate step between the CGI model and the real thing by building a three-and-a-half-foot-tall cardboard model of City Hall, which helped bridge the gap between what works in virtual space and what works on a real-world object. Using a single Barco DLP projector and the cardboard model, they could get a step closer to calculating the right displacements for imagery that would ultimately be spread across 14 projectors on site on December 31.

yU+co made use of Hippo Hippotizer HD servers, notes Andrew Burnett, the company’s technical director, who engineered the event. The servers applied all the displacement characteristics to the imagery Yu had designed and then spread the results over the full array of projectors to cover the 22-story, two-sided presentation, blending the seams with blend masks customized for each projector position. The team had a total of four days to load in and prepare the project, and any actual test projection obviously had to be kept to a minimum. Running only a few seconds at a time when there was minimal foot traffic in the vicinity, yU+co’s team was able to make last-minute adjustments to displacement, brightness and other projector characteristics from within the Hippotizer’s software.
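For a static setup like this one, each projector’s blend mask is essentially a brightness ramp across the zone where it overlaps its neighbor, shaped so the combined light stays even. The sketch below illustrates that idea generically; it is not the Hippotizer’s implementation, and the overlap width and gamma value are hypothetical placeholders.

```python
# Generic sketch of a static edge-blend mask: full brightness everywhere
# except a smooth falloff inside the overlap zone shared with a neighboring
# projector. The opposite ramp is applied to the neighbor so their combined
# light output stays roughly constant across the seam.
import numpy as np

def edge_blend_mask(width, height, overlap_px, side="right", gamma=2.2):
    ramp = np.ones(width)
    fall = np.linspace(1.0, 0.0, overlap_px)       # linear light falloff across the overlap
    if side == "right":
        ramp[width - overlap_px:] = fall           # this projector fades out on its right edge
    else:
        ramp[:overlap_px] = fall[::-1]             # or fades in from its left edge
    mask = ramp ** (1.0 / gamma)                   # pre-compensate for display gamma
    return np.tile(mask, (height, 1))              # same ramp on every row

# One mask per projector; each output frame is multiplied by its mask before display.
mask_a = edge_blend_mask(1920, 1080, overlap_px=200, side="right")
mask_b = edge_blend_mask(1920, 1080, overlap_px=200, side="left")
```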
What really excites Yu, who oversaw the entire production, including hours of pre-show interactive entertainment beyond the projection mapping, is not so much the technical advances as the artistic possibilities these types of projects offer. “I’m less interested in something that’s very effects-heavy and doesn’t have strong storytelling,” he says.
Projection mapping, Yu explains, “can really be outdoor theater. It can bring an entire community together to react to a strong story. In the past, we’ve seen people using a lot of abstract patterns and forms but once you’ve seen that, it’s hard to sustain interest. There’s no ‘wow’ factor. I think we can now use these techniques to tell a powerful story.”

Dynamic Blending Masks
Naturally, as projection mapping technology expands, there will always be producers and creatives who want to push it further. This was the case when Feld Entertainment (“Disney on Ice”) took on the touring show “Marvel Universe LIVE!” The show involves stunt performers, motorcycle chases and moving stages in a story featuring Marvel’s major superheroes, including Iron Man, the Hulk and Spider-Man.
For this, Bob Bonniol and his MODE Studios were charged with bringing constant movement into the projection mapping equation. Imagery would be projected onto sets with 3D objects that move. Projections would also have to coordinate perfectly with the stunt performers who could never, despite being among the best in the field, be counted on to hit a precise mark at the exact moment a preprogrammed piece of content came on.
From a technical standpoint, these challenges were addressed using two elements: real-time tracking technology from Cast BlackTrax and Dynamic Soft Edge, a feature within the d3 media servers Bonniol used for the show. “If you’ve got a static object covered by more than one projector, you fade one projector out and the other in so the images blend together seamlessly and you get a flat brightness profile,” notes Ash Nehru, software director at the London-based d3. “So you make blend masks. But if the objects you’re projecting onto are moving, you can’t do that, because the characteristics of the blend need to change as those objects do. With Dynamic Soft Edge, we generate masks automatically and dynamically in real time, so you can move projection surfaces through multiple projector beams in a way that wouldn’t have been possible previously.”
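The difference is easiest to picture in one dimension: when the surface moves, the weights deciding which projector contributes where must be recomputed from the surface’s current position on every frame. The toy sketch below illustrates that principle only; it is not d3’s Dynamic Soft Edge code, and the projector coverage ranges and feather width are hypothetical.

```python
# Toy illustration of dynamic blending in 1D: as a surface slides through two
# overlapping projector beams, per-projector weights are recomputed each frame
# from where the surface currently sits, rather than baked into a fixed mask.
import numpy as np

def dynamic_masks(surface_x, surface_w, proj_a=(0, 1100), proj_b=(900, 2000), feather=100):
    """Return per-projector weights along a surface spanning [surface_x, surface_x + surface_w]."""
    xs = np.linspace(surface_x, surface_x + surface_w, surface_w)

    def coverage(beam):
        lo, hi = beam
        # 1.0 well inside the beam, 0.0 outside, with a soft feathered edge.
        return np.clip(np.minimum(xs - lo, hi - xs) / feather, 0.0, 1.0)

    wa, wb = coverage(proj_a), coverage(proj_b)
    total = np.maximum(wa + wb, 1e-6)
    return wa / total, wb / total                  # weights sum to 1 wherever the surface is lit

# As a set piece slides across the stage, the crossfade region follows it.
for x in (200, 700, 1200):
    weights_a, weights_b = dynamic_masks(x, surface_w=400)
```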
d3 also worked with Cast BlackTrax so that the newest iteration of its servers (the ones used for the Marvel show) can interact with as many as sixty BlackTrax 3D and 6D BTBeacons (infrared LEDs attached to performers, scenery elements and props) and steer the projections to hit the performer or set piece wherever they are, rather than relying on them to hit a predetermined mark.
d3 servers had been used previously to do this, but with only a handful of sensors, Nehru adds. “But Marvel Universe LIVE! requires 60. It was a learning experience to scale up and develop a system that could achieve this and also be easily managed by the user. Projects like Marvel Universe LIVE! really brought this feature to a mature level, so this will be something everyone will be able to benefit from soon.”
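The per-frame loop behind this is simple to picture: a tracked beacon’s 3D position is reprojected into a projector’s pixel space and the relevant content is re-centered there. The sketch below is a generic illustration of that loop, not the BlackTrax or d3 API; the calibration matrix, beacon position and sprite are hypothetical.

```python
# Generic sketch of tracker-driven projection: every frame, reproject the
# tracked beacon into projector pixels and paste the content centered on it,
# instead of relying on the performer to hit a fixed, pre-programmed mark.
import numpy as np

def beacon_to_pixels(beacon_xyz, proj_matrix):
    """proj_matrix: hypothetical 3x4 calibration mapping world space to projector pixels."""
    p = proj_matrix @ np.append(beacon_xyz, 1.0)   # homogeneous projection
    return p[:2] / p[2]                            # perspective divide -> pixel coordinates

def place_content(canvas, sprite, center_xy):
    """Paste the sprite onto the output frame, centered on the tracked position."""
    h, w = sprite.shape[:2]
    x = int(center_xy[0] - w / 2)
    y = int(center_xy[1] - h / 2)
    canvas[y:y + h, x:x + w] = sprite
    return canvas

# Per frame: read the tracker, reproject, re-render near that spot.
P = np.hstack([np.eye(3), np.array([[960.0], [540.0], [1.0]])])   # hypothetical calibration
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
sprite = np.full((64, 64, 3), 255, dtype=np.uint8)
pixel = beacon_to_pixels(np.array([2.0, 1.5, 4.0]), P)
frame = place_content(frame, sprite, pixel)
```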

Product at Work: Barco 3-DLP HDQ-2K40
With its 40,000 lumens of light output and 2048 x 1080 resolution, Barco’s 3-DLP HDQ-2K40 earned Live Design Magazine’s 2013 Projector of the Year award and a reputation for category-topping brightness. Its adjustable frame provides for ease of stacking and rigging. The HDQ-2K40 is equipped with Barco’s ImagePRO technology with Athena scaler for flexible scaling. Features include wireless control and preview, with projector status shown on a built-in color LCD screen. Connected sources can be previewed as well, and the projector can be controlled via smartphone or tablet. Barco’s proprietary BarcoLink technology claims swift signal distribution between Barco’s projectors and image processors. With BarcoLink, signals are distributed over a BNC coax cable, which is not only more durable but usually more cost-effective. The projector is also available with a light-on-demand option, allowing users to tune the light output to the location: it can be programmed anywhere from 26,000 to 40,000 lumens in 2,000-lumen increments. This ability to scale light output gives the user more ways to match the projector’s performance and the life of its 7 kW xenon lamp to temporary or permanent installations.
