Ghosts Of The Future - Bringing Performer And Audience Together In VR

Posted By Curtis Augspurger, Friday, March 15, 2019

At first it looks like theater. From a Victorian-appointed parlor, you are invited into a fully dressed foyer, all dark weathered walls and period furnishings. An actor greets you, declaring herself a physical manifestation of the ghost of Charles Dickens’ Marley, from A Christmas Carol. She engages you curiously, eventually directing you to a small writing desk on which rests an old, leather-bound ledger. Your name is already scribed into the weathered book.

 Seated at the desk, you see your reflection in a mirror. Marley gently places a VR headset on you. If there’s a moment when the theatrical illusion breaks, it’s gone as soon as you see precisely the same room around you, fully replicated in VR. The desk before you, the mirror, the one-to-one nature of the room—everything correlates, marrying what you see with what you feel. The sounds of horse carriages on cobblestones begin to filter in. Marley whispers to you, now beckoning from the other side of the wall.

 “Are you ready to cross over? Yes? You may not recognize me when you do … You may even think yourself an apparition.”

 After a show-stopper moment that seamlessly blends VR with the real world (I don’t want to spoil the surprise for future audience members), you are transported to a different time and place. Seeing is believing, but “feeling what you see” raises the bar to new heights.

 Now alone, you find yourself in Scrooge’s grand sleeping quarters. Thick velvet drapes. A large, intricately carved four-poster bed. An old smoking chair beside a warm fire. Reaching out to touch the digital bedposts, you find that they are physically there. Wandering over to the smoking chair, you can feel the gnarled, walnut arms and soft velvet back. Light filters in through tall windows; 1860s London is outside.

Just as you begin to feel comfortable in the space, you sense the eerie presence of another, whom you cannot see. In a swoosh of smoke, Marley gasps frighteningly to life (or rather, the afterlife) before your VR eyes. You are now in the presence of Marley in her ghostly digital form, driven by an actor in motion capture gear. Flowing hair and tattered clothing expose the empty shell of what was once her body. She addresses you by name, reaching her chained arm forward and placing an empathetic (if decaying) hand on your shoulder. She informs you that on this night you will be visited by three spirits.

And so your journey begins. Over the next 20 minutes, you are immersed in the most memorable set-pieces of A Christmas Carol … Scrooge’s (and your) envisioned past, present and future, ending in an iconic Victorian graveyard, where you’re compelled to confront your own mortality in surprisingly personal fashion. Throughout the show you’re guided by the spirits, each of them rendered by motion capture actors, fantastical avatars who react and speak to you in real time. As with any kind of interactive theater, the more you give in response to the performers, the more you get back from the experience.

This is Chained: A Victorian Nightmare, a one-of-a-kind dive into a fully immersive world that pushes the boundaries of storytelling possibility. The brainchild of creator/director Justin Denton, it’s a single-person immersive theater experience that marries VR technology to live actor-driven performance for a dark reimagining of Charles Dickens’ A Christmas Carol. I was proud to produce Chained for its sold-out inaugural run in downtown Los Angeles this winter and was thrilled when Produced By invited me to reflect on the challenges of creating this unique show.

Chained is a new entry pushing the boundaries of the emerging Location Based Entertainment/VR market. To date, LBE has been dominated by first-person-shooter experiences, esports and ride games. Early adopters have flocked to these experiences, powering a market that came close to $1 billion in revenue in 2018 and is projected to eclipse $12 billion by 2023, comprising 11% of the VR industry. To win broader audiences for this new market, the major challenge before us as producers is to create engagement models that place the technology at the service of storytelling, rather than the other way around.

Developing and rehearsing Chained, from left: cast member Michael Bates, director Justin Denton,
interactive story producer Bruce Straley, cast member Haylee Nichele. Photos courtesy of Curtis Augspurger.

 

Getting Here

As with many of our origin stories, before I discovered my passion for film and digital media production, I was on a different path. As a graduate student in the School of Architecture at Columbia University, I was quick to realize that I couldn’t draw as well as I could see. We were still being taught with the traditional tools of the T-square and triangle, which gave me a foundation of spatial understanding but didn’t allow the freedom and interactivity I desired. Access to the highest-end software and computing systems of the day (now less powerful than the phones in our pockets) proved difficult to find, as a system cost in the neighborhood of $500K. The Apple IIe had just been released, AutoCAD was just coming onto the market, and the 3D visualization tools we take for granted today were then just a concept.

While working for architect Richard Gluckman on the Whitney Museum’s Breuer expansion project (now the Met Breuer), I brokered a deal with leading software and hardware companies and convinced Gluckman and the Breuer’s director, David Ross, to take a risk. After several months of hard work photographing and digitizing the Whitney collection (for the first time) and integrating it into Gluckman’s design on the computer, we were able to give the board a ‘virtual walk-through’ of the design with the collection in place, which helped the project win the build contract.

A demo of the Breuer expansion led to an alumni donation of a $10M computer lab to Columbia’s School of Engineering, where I then taught these same visualization tools to the next generation of student visionaries. The class eventually caught the eye of Hollywood, and after switching coasts from NYC to Los Angeles, I found myself building digital set extensions for Wayne Enterprises in Batman Forever and the digital swamp that Shrek called home, and ultimately producing animated features for Disney and Fox. Fast forward 25+ years, and we are still using bleeding-edge technology to strengthen the connection between story and the human experience.

“This project is actually not possible without having a live actor in the space with you. It’s at the core of how we wrote it and how we workshopped it with the actors. The technology of having the motion capture actor there is the reason why we can adapt to you as the guest and make it unique every time.”

~Justin Denton, Creator / Director

 

Bringing Chained To Life In Real-time VR

Not only would this project be impossible without the live actor in the space; the technical ability to create this kind of narrative experience didn’t exist a few short years ago.

The immersive tech Chained required is still in its nascent days and fraught with bleeding-edge incompatibility problems. So to produce a project of this scope, we had to pull together a faithful team of visionaries, a small village of talented artists, actors, set fabricators and engineers, to bring the curtain up in just under nine months.

With Chained, we were fortunate to have experienced producing partners in Executive Producer Ethan Stearns and Associate Producer Christine Ryan of Madison Wells Media Immersive, along with the support of Executive Producer David Richards of Here Be Dragons. MWM Immersive put tremendous faith in the project early on by backing it as their flagship immersive project. When our milestones slipped or our technology failed (as it did more often than not), Stearns’ previous experience producing Carne y Arena helped us keep the focus on the audience’s experience, with technology in service of the story.

For example, at one point in Chained, the story calls for a prop to be handed to the audience member by our mo-cap actor. Sounds simple, but it ended up being an enormous pain point, one more technical than story-driven. We needed the prop to feel as real as it appeared in the VR rendering, which required a tracking solution. With today’s tech, the two basic options were passive tracking (marble-sized reflective dots) or active tracking (tiny embedded LED emitters). The passive solution is cheap but clunky, which could pull the guest out of the experience; the active solution is elegant but complicated, at 10 times the cost. With an already challenged budget, we could easily have gone the cheap-and-clunky route. But our experienced partners’ commitment to staying true to the audience’s experience of the story led us to trim costs elsewhere and choose the more elegant solution, creating one of the more magical takeaway moments of the experience.
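For the technically curious, the principle behind a tracked prop is straightforward whatever the marker type: each frame, the tracking system’s reported pose is copied onto the prop’s virtual twin, so that what you see stays locked to what you hold. Below is a minimal Python-style sketch; the tracking-system and scene-node interfaces are hypothetical stand-ins, since the article doesn’t specify the engine or SDK used on the show.

    # Minimal sketch: keep a virtual prop locked to its physical twin.
    # `tracking_system` and the scene-node interface are hypothetical
    # stand-ins; the article does not specify the engine or SDK used.

    class TrackedProp:
        def __init__(self, tracker_id, scene_node):
            self.tracker_id = tracker_id  # active-LED (or passive-marker) rig on the prop
            self.scene_node = scene_node  # the prop's renderable object in the VR scene

        def update(self, tracking_system):
            pose = tracking_system.get_pose(self.tracker_id)  # position + rotation
            if pose.is_valid:  # markers can be occluded by hands mid-handoff
                self.scene_node.set_position(pose.position)
                self.scene_node.set_rotation(pose.rotation)

    # Per-frame loop: the handoff reads as magic because the actor, the guest
    # and the prop all share one tracked coordinate space.
    def on_frame(props, tracking_system):
        for prop in props:
            prop.update(tracking_system)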

The prop-tracking problem aside, there were a large number of unknowns and unproven challenges to overcome along the narrative path. In an effort to control that complexity, we broke our goals down into a series of weekly sprints and milestones, aligning the project’s technical requirements with the conceptual design by adopting a game design pipeline. The milestones fell into a few achievable categories: a minimum viable product (known as a ‘grey box’), a vertical slice, green light, and ultimately a final EXE deliverable.

The grey box version merely showed temporary, untextured volumes representing the spaces, props and their interactivity (one month). For the vertical slice milestone (two months), we selected one of the scenes and gave it an approximate finish as a textured, lit environment. Together, the grey box and the vertical slice demonstrated that the technology could marry the look and design with performance interactions while still supporting our actors’ interplay with the narrative.

To further complicate the design process, the visual effects tied to live-action events needed to be triggered on the fly during the performance, from a handheld tablet. Actor workshops held in VR headsets served as an iterative back-and-forth, getting the actors comfortable with the technical demands of bringing the art to life. The story team also used these workshops to refine the story and staging for the mo-cap actor and their counterpart. Our live-action Marley thus transitioned into the stage manager role, orchestrating the live performance, triggering live-action visual effects and cueing scene transitions from a connected tablet.
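The article doesn’t document the protocol behind the stage manager’s tablet, but the pattern is a familiar one: small cue messages sent over the local network, with the VR engine mapping each cue ID to an effect or scene transition. Here is a hedged sketch assuming JSON over UDP; the address, cue names and handler functions are all illustrative, not the production implementation.

    # Hedged sketch of the stage manager's cue path (protocol assumed, not
    # documented in the article): the tablet sends small JSON messages over
    # UDP, and the VR engine fires the matching effect or scene transition.
    import json
    import socket

    ENGINE_ADDR = ("192.168.1.50", 9000)  # hypothetical address of the VR host

    def send_cue(cue_id, params=None):
        # Called from the tablet UI when the stage manager taps a cue button.
        message = json.dumps({"cue": cue_id, "params": params or {}})
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.sendto(message.encode("utf-8"), ENGINE_ADDR)

    # Engine side: a handler table maps cue IDs to effects and transitions.
    # Both handler functions here are hypothetical illustrations.
    CUE_HANDLERS = {
        "marley_entrance": lambda p: spawn_smoke_and_avatar(),
        "scene_graveyard": lambda p: transition_to("graveyard"),
    }

    def on_cue_received(raw_bytes):
        message = json.loads(raw_bytes.decode("utf-8"))
        handler = CUE_HANDLERS.get(message["cue"])
        if handler:
            handler(message["params"])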

To streamline this intertwined production/performance process, we set up our ‘sandbox’ workspace and motion capture camera systems at Aaron Sims Creative, the powerhouse design shop responsible for some of the biggest AAA character work in the film industry, which we chose to lead the creative design and the engineering of the EXE delivery. With imaginary walls lined in tape on the floor and tracking cameras in the ceiling, we were able to iterate directly with VFX supervisor Ryan Cummins and his ASC engineering and creative team to bring the digital design work in line with our creative goals. Cummins’ team collaborated with motion capture vendors IKinema and Dynamixyz to integrate body and facial capture in real time to drive our spirits’ performances.
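Conceptually, "driving" a spirit works like this: each frame, the body-capture stream supplies joint rotations and the facial-capture stream supplies blendshape weights, which are copied onto the avatar’s skeleton and face rig. The sketch below illustrates that loop in Python; the frame and avatar interfaces are hypothetical, as the actual IKinema/Dynamixyz integration is engine-specific.

    # Hedged sketch of real-time mocap retargeting; `mocap_frame` and
    # `avatar` are hypothetical interfaces (the actual IKinema/Dynamixyz
    # integration is engine-specific). Each frame, captured joint rotations
    # and facial blendshape weights are copied onto the ghost's rig.

    def apply_mocap_frame(mocap_frame, avatar):
        # Body: map captured joints (hips, spine, arms ...) onto avatar bones.
        for joint_name, rotation in mocap_frame.body_rotations.items():
            bone = avatar.skeleton.get(joint_name)
            if bone:
                bone.set_rotation(rotation)
        # Face: blendshape weights (jaw open, brow raise ...) drive expression.
        for shape_name, weight in mocap_frame.face_blendshapes.items():
            avatar.face.set_blendshape(shape_name, weight)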

Beyond the one-to-one tactile relationship between the space and its visual cues, the acoustic surroundings were critical for full audience buy-in to our immersive VR world. In most film productions, sound is geared to the element of time; in VR, sound has the added component of spatiality and becomes more complex than a normal stereo mix. Making audio cues emanate from sources within the scene (diegetic sound) or from outside it, such as the score (non-diegetic sound), adds considerable complexity to the mix process.

To help us solve these equations, we partnered with composer and VR sound design wizard Dražen Bošnjak and his team at Q Department. Bošnjak and his talented team scored and mixed our exteriors and interiors to sound like 1860s London. They created the entire audio landscape and score to work in concert with your spatial relationship to the rooms and their objects. As you walk toward the fireplace in Scrooge’s bedroom, the crackle of the fire grows louder, and the ticking grandfather clock responds to your position.
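The ticking clock and crackling fire come down to distance-based attenuation: a diegetic source’s gain rises as the listener approaches it. A real VR spatializer also applies direction and HRTF filtering, but the loudness falloff alone is simple arithmetic, as this runnable Python sketch shows (the rolloff model and positions are assumed for illustration).

    import math

    # Runnable sketch of distance-based gain for a diegetic source (the
    # crackling fire, the ticking clock). A real spatializer adds direction
    # and HRTF filtering; this shows only the loudness falloff. All positions
    # and distances are illustrative.
    def source_gain(listener_pos, source_pos, ref_dist=1.0, max_dist=12.0):
        d = math.dist(listener_pos, source_pos)
        d = max(ref_dist, min(d, max_dist))  # clamp to the rolloff range
        return ref_dist / d                  # inverse-distance: closer = louder

    fire = (4.0, 0.5, 2.0)                      # source position in room space
    print(source_gain((9.0, 1.6, 2.0), fire))   # across the room -> ~0.20
    print(source_gain((5.0, 1.6, 2.0), fire))   # beside the hearth -> ~0.67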

When soaring over London, the sounds and whooshes of the wind and passing clock towers surround your ears. You’ll hear each entry and exit that the spirits make into your virtual world, and as they finally usher you to the end of the experience and you reenter the world of the living, the familiar sounds of the clattering horses on cobblestones bookend the journey from where you began.

Immersive media has us all looking ahead to what is possible, and what is next. As producers, it may feel as though years of learning the craft of storytelling are being overshadowed by the breakneck speed of technological advancement. Yet however far we have progressed, the core techniques of storytelling, even in the face of modern technology, have not changed fundamentally. The latest headset experiences, whether on Magic Leap, Oculus, HoloLens, Vive or whatever comes after, all continue to expand our ability as storytellers, reimagining legendary tales and creating new content that deepens the connection with our audiences in exciting ways. These emerging technologies not only expand opportunity but simultaneously reinforce the foundational needs of the storytelling craft.

In this new frontier, the point to remember is that our role as producers has not changed. The fight to make story our primary focus, regardless of medium or technology, lives on and will carry the growth of the market.

As the epitaph of a life is being prepared and the birth and death dates are carved, Chained’s takeaway message for the audience member is that only “the dash between” counts. How far we’ve come, and just how far we can go with storytelling in the digital age, is up to what story you can dream up. If you dream it, there will likely be a new technology ready to bring it to life. Make the story of your dash count.

PGA member Curtis Augspurger is currently finishing work on a 6DOF (six degrees of freedom) VR version of Othello with Oculus, Here Be Dragons and JuVee Productions, and embarking on a passion project to bring Light Immersion Therapy VR to PTSD sufferers.

- Artwork courtesy of Aaron Sims Creative
