
A Glastonbury First – Synergy Makes Its Festival Debut

Avolites’ upcoming software feature set, Synergy, takes the Temple to a whole new level at Glastonbury

The Temple is a haven for late-night ravers at Glastonbury, with a range of big-name performers and DJs gracing its stage. But it’s not just the music and the eclectic atmosphere that make the Temple so special: since its launch in 2017, the face of the Temple has become the center of attention, brought to life every night by a stunning range of visuals from the creative minds of the production team.

The team comprises lighting designer Paul de Villiers, Arran Rothwell-Eyre looking after the media servers, and a group from Limbic Cinema curating the content and operating the live video mixing. Lightwave Productions supplied the lighting equipment with the assistance of SGM.

This year, the team decided to take the production to the next level by introducing Synergy, the upcoming feature set in Avolites’ Titan and Ai v12 software, which integrates lighting and video control into one system. “Due to the video-led nature of the structure, it was great to be able to blend the video content and lighting with Synergy to achieve cohesion between the content and the lighting fixtures,” de Villiers explains.

The Avolites Arena desk running Synergy

For the kit, de Villiers chose the Avolites Arena for lighting control. “The Arena is my go-to desk for many reasons; it has a large number of faders and execute buttons, and the mini screen can be quite handy,” he says. “The native optical output is also really useful; on previous jobs we’ve had a mile-long run of fixtures with no latency at all. The large main screen is also great for the Pixel Mapper and NDI overlay.”

To handle the video, three Avolites Ai Q3 media servers were brought in: one main, one for Synergy, and one as backup. A Titan Net Processor was also installed to create a content distribution and server management network. For software, the Arena ran a beta of Titan v12 with the Q3s running the newly released Ai v11. The servers used five HD outputs with the live input running at 2048 x 2048 resolution.

To bring the set to life, the team needed content, and on a large scale. Limbic Cinema was commissioned back in 2017 to curate the video content, and since then they have been building up a library of video in a range of artistic styles, textures, and colors.

Once the stage was built, a team of five designers was on site creating more content.

Avolites Ai running the projections

Converting video to the server’s preferred codec can be a major issue when working with large teams of designers, but a key feature of Ai allows content of any format to be fed into the software, with the Ai Transcoder automatically converting it into the AiM codec. The size and shape of the content didn’t matter either: once uploaded to the server, it could be processed in the Ai Mapping Editor and mapped onto the structure with ease.

The next stage was to bring the show together. Five HD projectors were used for the mapping and a total of 65 fixtures, including 12 Aqua beams that surrounded the face of the Temple, were brought in. The trick now was to make it all work cohesively.

To create a fully immersive experience for the attendees, it was vital that all of the visuals told the story together, and this was where Synergy took center stage. The LD used Lightmap, a key Synergy feature that allowed him to pixel map the video content directly onto the fixtures. “The structure was mapped to the pixel mapper on the desk and from there I could control how much the video content affected the lighting using a mode two fader. This allowed us to make a smooth transition between the video cues and lighting,” de Villiers says.
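For readers unfamiliar with the concept, the sketch below is a rough, generic illustration of what pixel mapping video onto fixtures involves; it is not Avolites’ implementation, and every name and value in it is hypothetical. Each fixture samples the video frame at its mapped position, and a single fader value crossfades between the programmed lighting look and the video color.

```python
# Generic illustration of pixel-mapping video onto lighting fixtures.
# Hypothetical names and values throughout; not Avolites' code or data.

from dataclasses import dataclass

@dataclass
class Fixture:
    name: str
    u: float            # mapped horizontal position on the video frame, 0..1
    v: float            # mapped vertical position on the video frame, 0..1
    base_rgb: tuple     # color from the programmed lighting cue

def sample_frame(frame, u, v):
    """Return the RGB pixel nearest to the fixture's mapped (u, v) position."""
    height, width = len(frame), len(frame[0])
    x = min(int(u * width), width - 1)
    y = min(int(v * height), height - 1)
    return frame[y][x]

def blend(base, video, fader):
    """Crossfade between the lighting cue and the video color (fader 0..1)."""
    return tuple(round(b * (1 - fader) + c * fader) for b, c in zip(base, video))

def render(fixtures, frame, fader):
    """Compute the output color for every fixture for the current frame."""
    return {f.name: blend(f.base_rgb, sample_frame(frame, f.u, f.v), fader)
            for f in fixtures}

# Example: a tiny 2x2 "frame" and two hypothetical fixtures, fader at 75% video
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
rig = [Fixture("beam_left", 0.1, 0.1, (50, 50, 50)),
       Fixture("beam_right", 0.9, 0.9, (50, 50, 50))]
print(render(rig, frame, 0.75))
```

With the fader at 0 the fixtures simply play the programmed cue, and at 1 they follow the video content entirely, which is the kind of smooth transition the quote above describes.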

Avolites Q3 servers taking control

Getting this map right was vital. The original UV map of the stage was complicated and didn’t directly match up with the stage itself, so a camera was set up in Ai with its live feed going into the Synergy Q3 server. The live output from this was then fed to the Arena, giving de Villiers the picture he needed to design his show.

Projection mapping such a complex structure was also no simple feat; many areas of the outer structure were layered, so regular edge blending was not suitable. Rothwell-Eyre used Salvation patching in Ai to add masks to certain areas, allowing the projectors to map the structure accurately.

Once the preparations were done, it was time to go live. The video was mixed live by multiple operators, each bringing their own style and creativity to the show, while de Villiers and his team ran the Arena.

Synergy stuns the Temple crowds

The weekend was a slick and spectacular success. “The show went really well. Once the map was correct everything went really smoothly,” says de Villiers. “Synergy is surprisingly easy to use as everything is native to the platform. It really is a game changer.”

“It’s changed the way I design my lighting. It’s made it so much easier to work around video. I’m able to frame the video screens and make maximum use of them. You can frame the screen with a group of fixtures and pixel map to amplify the video content and have it blend out.”

Not one to mince his words, de Villiers has had enough industry experience to know a good thing when he sees it. “I was happy to take a risk with the new software after popping into the Avolites office and having Matt Jennings run me through it. You have to use it out in the field to really see if it works. I’m looking forward to using it at Boomtown this year.”

The LD will be using Synergy on the Nucleus at Boomtown; the full release of Titan and Ai v12 is due around September.
