New Adventures at the Intersection of Theme Parks and VFX
By CHRIS McGOWAN
Justice League: Battle for Metropolis (Image courtesy of Sally Corporation)
VFX Voice asked industry experts about the technical demands and continuing evolution of VFX and video used in theme park rides and other fixed-location entertainment. Here they discuss the use of 3D, domed and big screens, and what’s coming in the future. They also talk about dark rides and LBEs (Location-Based Entertainments), such as Justice League: Battle for Metropolis, Avatar: Flight of Passage, King Kong 360 3-D, Panem Aerial Tour (The World of The Hunger Games), Guardians of the Galaxy – Mission: Breakout!, Dream of Anhui and Star Wars: Secrets of the Empire (a VR LBE).
“We are always going to have a balanced mix of practical vs. virtual elements in our rides, though. There is something very satisfying about seeing a practical effect, but virtual effects can be very dynamic. The real trick is showing guests something unexpected and different around every corner in a dark ride.”
—Rich Hill, Creative Director, Sally Corporation
RICH HILL, CREATIVE DIRECTOR, SALLY CORPORATION
Over the past five years or so, Sally has been incorporating more and more CGI into our attractions. Designing a ride that’s all about superheroes (aka our recently opened Justice League: Battle for Metropolis attraction for Six Flags) didn’t lend itself to an army of animatronic figures. They may look great, but I wanted to see them doing dynamic things like running at 100 mph, flying across the sky and fighting villains, and those things are tough to pull off with animatronics.
The Justice League Alien Invasion attraction [which opened in 2011 at Warner Bros. Movie World in Australia] was the first time we used CG gaming in one of our rides. It was very important to have digitally animated scenes due to the “superhero factor.” The dynamic motions really necessitated using CG.
As far as I’m concerned, we are always going to have a balanced mix of practical vs. virtual elements in our rides, though. There is something very satisfying about seeing a practical effect, but virtual effects can be very dynamic. The real trick is showing guests something unexpected and different around every corner in a dark ride.
When we were given the opportunity to create seven Justice League dark rides, we wanted to give each one something special and try to best our last “performance” if you will. Animation, special effects, show programming, lighting – everything was tweaked as we went from park to park. We also went back and applied many of those tweaks to the earlier JL rides to make sure they were all performing at the same level.
There were some very unique moments in the Magic Mountain version of Justice League: Battle for Metropolis [which debuted in 2017]. We changed the queue pre-show to be more of a “batched” system, where guests are held in a series of rooms where they are told about their mission by Superman, Batman and Cyborg. We also brought in massive toroidal projection screens that wrap around you. These changes added up to a “next level” Justice League ride that is really something to experience.
Every dark ride project is a challenge. There are many ways the projects can go “off the rails,” but we have been doing this for 40 years, producing over 60 dark rides during that time, so we are pretty good at keeping the team on task.
King Kong 360 3-D (Image courtesy of Universal Studios Hollywood)
MATT AITKEN, VISUAL EFFECTS SUPERVISOR, WETA DIGITAL
We faced many challenges creating King Kong: 360 3-D for Universal Studios Hollywood. On the technical side, there was the immense scale of the screens we were projecting on. These required very high-resolution imagery to avoid looking pixelated. Though the ride lasts less than two minutes, we had to render more pixels than a standard feature-length CG-animated movie. We also had to contend with designing the ride content to minimize projector cross-talk and manage stereo imagery from multiple rider perspectives so that everyone was able to experience the ride in full 3D.
Creatively, the attraction had to work as a compelling narrative experience for each of the riders in the tram train: King Kong 360 3-D is a stop on the Universal Studios backlot tram tour, and guests experience the ride arranged in a line in front of 175-foot-wide screens. The story had to play out so that everyone was able to get involved in it no matter where they were sitting, which required careful stage management of the action.
In 2010 when we launched this project, the idea of driving the audience into a large, custom-built ‘cinema,’ parking them up on a motion-base between two giant curved screens that filled their entire field of view no matter where they looked, and then taking them on a wild ride through Skull Island was, as far as I know, completely new.
In many ways, our experience working on King Kong 360 3-D was the same as any film digital visual effects project. We used the same set of software tools and produced the work in our standard visual effects pipeline. But there are some key differences to the way we have to approach these projects. For example, we have to come up with a new way of allocating the work across the large team of artists working on the ride. In a typical film project, the work is made up of many short shots and artists can work shot by shot. But the media we are creating for these rides has no edit points and plays out as one continuous take. So we have to set up workflows to enable the team to all work concurrently on what will ultimately be one big shot.
While the environment is rendering, we can continue to work on the animation of the creatures, characters, props, vehicles, etc. that populate the environment and are often one of the final things to get creative approval. We can render these elements as separate layers as they are approved and then composite them into the environment using our deep compositing workflows. Without our innovations in deep compositing, it would have been quite difficult to deliver this project.
3D is a key component in immersing the rider in the world of the ride. In King Kong 360 3-D, the backlot tour tram takes a detour to Skull Island and a CG tram car with digital-double tourists actually appears in this ride. 3D helps to create the sense that what the rider is seeing beyond the windows of the physical tram they are sitting in is a fully dimensional world extending out from their location in all directions.
“Without our innovations in deep compositing, it would have been quite difficult to deliver this project [King Kong 360 3-D].”
—Matt Aitken, Visual Effects Supervisor, Weta Digital
The Panem Aerial Tour at The World of The Hunger Games, Motiongate Dubai. (Image courtesy of Lionsgate)
“AR and VR are amazing tools, but they should only be used if that is the best way to convey or immerse guests in the story you’re trying to tell.”
—Gene Rogers, VP, Global Live & Location Based Entertainment, Lionsgate Entertainment Group
GENE ROGERS, VP, GLOBAL LIVE & LOCATION BASED ENTERTAINMENT, LIONSGATE ENTERTAINMENT GROUP
The Panem Aerial Tour is a hovercraft motion-simulator attraction that takes guests on an exhilarating, in-world ride to and through the Capitol [The World of The Hunger Games, Motiongate Dubai]. Guests are immersed in a 3D media tunnel created by two 6.2-meter x 24-meter screens that curve to envelop the ride vehicle and provide optimum sightlines for all guests. Nine Digital Projection HighLite laser projectors create seamless, stunning 3D images of custom ride media created for the attraction, which includes original cast members from The Hunger Games film series reprising their roles.
In any attraction with multiple projectors, alignment becomes critical. Even a slight misalignment will significantly degrade the guest’s perception of the media. 3D adds a depth that is sorely missed when it’s absent. For the Panem Aerial Tour, the key setup of the ride is being able to see vast expanses of Panem in the air and feel like you’re flying with other hovercraft right next to you. 3D is very effective in selling that reality.
As screen resolution, processing power and speed, and accessibility continue to improve, there are exciting implications for the types of experiences we will soon be able to create. Software advances for animation and rigging also drastically reduce timelines.
AR and VR are amazing tools, but they should only be used if that is the best way to convey or immerse guests in the story you’re trying to tell.
Guardians of the Galaxy – Mission: Breakout! (Image courtesy of Walt Disney Productions)
AMY JUPITER, VFX EXECUTIVE PRODUCER, WDI (WALT DISNEY IMAGINEERING)
In February 2015, we were given the challenge of completely reimagining Tower of Terror and reopening it as Guardians of the Galaxy – Mission: Breakout! by the summer of 2017. We were able to work alongside Marvel Studios while they were shooting Guardians of the Galaxy, Vol. 2, which gave us access to director James Gunn and his incredible team but also limited our schedule and the flexibility of the shooting window. If we could do it, we would be able to open a completely new attraction in the dimensional universe just three weeks after the cinematic universe opened the second installment of the smash film.
The most unusual challenge was taking this preexisting ride system and discovering what it was capable of beyond what had originally been imagined for the previous attraction.
We thought it would be funny to disrupt the show and open the elevator doors while the ride was still moving. More than just being funny, it would also support the visual illusion that the Guardians were actually there in the attraction with us. This meant that we had to figure out a way to shoot our live-action plate so that it would lock the audience’s point of view, their eyepoint, to the elevator movement. Since we had to shoot so early in our schedule, we devised a method of shooting our plates with a vertical stack of seven 4K cameras running at 120fps, with the two cameras at either end of the stack tilted so we could create a huge vertical plate. Once we understood the ride profile, we could derive a fully dimensional 3D camera that matched the ride’s movement exactly and virtually “reshoot” the Guardians from that plate.
There has always been a crossover between large-format film for special venues such as theme park attractions and the standard cinema VFX and animation world. ILM has collaborated with Walt Disney Imagineering for many years on projects, from the original Star Tours to the newest versions of Pirates of the Caribbean and Soarin’ Around the World (Shanghai Disneyland), to the Iron Man Experience attraction in Hong Kong Disneyland.
Large-format displays utilizing ultra-high frame rate and ultra-high-resolution imagery bring with them much more data, many more assets, and much more scrutiny of the imagery. Most of our attractions include multiple minutes of these large-format films with no cuts at all. Add into this mix the multiple other data streams that come along with an attraction that must be synchronized together, like audio tracks, control systems, ride systems, and in-theater effects, and the complexity increases exponentially.
Great attractions are those in which our guests are moved and transported emotionally by the experience they have. All of the show elements in that design, including video and VFX which seamlessly support that experience and do not diminish it, help create a great and immersive guest experience.
“Add into this mix [of large-format films] the multiple other data streams that come along with an attraction that must be synchronized together, like audio tracks, control systems, ride systems, and in-theater effects, and the complexity increases exponentially.”
—Amy Jupiter, VFX Executive Producer, Walt Disney Imagineering
Dream of Anhui (Image courtesy of Tippett Studio)
“It was the 15 fully computer-generated environments we needed to build that posed the biggest technical hurdle [in producing Dream of Anhui].”
—Chris Morley, Visual Effects Supervisor, Tippett Studio
CHRIS MORLEY, VISUAL EFFECTS SUPERVISOR, TIPPETT STUDIO
The Dream of Anhui ride was our first step into the world of theme park entertainment. It was a great project to establish our foundation as a media provider for physical rides. Dream of Anhui featured an 80-person flight deck with six degrees of freedom in front of a 30-meter-wide, 180-degree dome. Tippett Studio had over 30 years of experience dealing with screen media, so we were in pretty good shape on that front. We knew we would need to deal with the spherical projection’s cross-bounce of light, image contrast, saturation, and matching the motion of the image to the motion of the ride. It was the 15 fully computer-generated environments we needed to build that posed the biggest technical hurdle. We spent weeks testing various workflows and decided that the Tippett Studio character pipeline would not be efficient for this type of project. We found a fantastic piece of software by the name of Clarisse and built an asset pipeline that would feed it. Clarisse proved to be a wonderful tool, and the Isotropix team was very helpful and collaborative along the way.
The project lasted about two years in total. We spent about a year in pre-production, research, location scouting and previsualization before we started building the final environments. Once we had a solid previsualization foundation, built using photogrammetry of the real locations, we created the 15 computer-generated environments in about eight months. We had about 60 people on the project from multiple departments throughout the Tippett pipeline. For rides, we are hired to be production: we come up with the concept, write the script, design the characters, create the storyboards, direct the media, compose the music, and work closely with the ride movement and installation team.
We researched a few flying theater rides before starting on Dream of Anhui and knew we wanted to make this one special. We created a character with a dream of flying, we worked hard on creative and seamless transitions between environments, but most of all we created an objective for each scene, something the viewer could follow into the next. This led the viewer’s eye where we wanted and created a thoughtful journey through the beautiful landscape of the Anhui province.
Star Wars: Secrets of the Empire (Image courtesy of ILMxLAB and The Void)
JON WALKENHORST, CHIEF TECHNOLOGY OFFICER, THE VOID
Star Wars: Secrets of the Empire was a true collaboration between The Void and ILMxLAB. Both companies shared the goal of telling location-based hyper-reality immersive stories.
The Void uses custom VR technology along with physical stages to create immersive experiences that inspire exploration and engagement. Participants wear The Void’s proprietary VR equipment, including head-mounted displays (HMDs), backtop computers and haptic vests, which engage the senses and transport them to new virtual worlds.
Star Wars: Secrets of the Empire is a fully immersive VR experience that, with special gear, transports you deep into the Star Wars universe, allowing you to roam freely, hear, touch, feel and see. We utilize many of the same creation tools, technologies and guidelines [as in VFX for feature films].
For a location-based VR experience we might focus on a shorter experience, but offer more detail or opportunity to explore a given room or interact with the characters. The key difference would be leveraging the VR engines in rendering and layering additional guest interaction and other senses.
The Void’s hyper-reality experiences go beyond other location-based VR experiences. We not only blend the physical and virtual worlds, but our experiences incorporate audio, internal haptics and tactile effects – mist, wind and scents – to create a completely immersive perceived reality.
“We not only blend the physical and virtual worlds, but our experiences incorporate audio, internal haptics and tactile effects – mist, wind and scents – to create a completely immersive perceived reality.”
—Jon Walkenhorst, Chief Technology Officer, The Void