Relative Motion's Christopher Lane writes for Stage Directors UK, August 2020.
As theatre makers, we saw our reality turned upside down in March. The questions we are now grappling with are overwhelming: what can we do to survive the pandemic, keep audiences engaged and artists working, and stop an industry from dying? How can we connect with our audiences, tell stories and keep people entertained and, most importantly, safe? The answers to these questions (and many, many more) are varied and far-reaching. But, with hope at the core of this article, I’d like to propose that virtual reality might be one of the solutions to the problems we face.
Since 2017, our company, Relative Motion, has been exploring the intersection of theatre and virtual reality. If you’re anything like I was, your first reaction to the idea that VR and theatre are a natural fit might be that these two things have as much in common as apples and chainsaws. However, what initially seems a strange pairing actually makes for well-suited companions.
At its most fundamental, VR is a space. Whether built from 360° camera footage or a computer-generated environment, virtual reality gives us places we can inhabit and populate with stories, performers and audiences. The possibilities for VR, from a theatre perspective, are massive and virtually untapped.
With VR being a spatial medium, theatre makers have an advantage in working in it – we are spatial storytellers. Unlike filmmakers – who craft stories through frames and pictures – theatre makers inhabit spaces and tame the interaction between myriad forces that are set free to play there. Space is a cornerstone of our art form and we intrinsically know how to work with it. What may seem like a massive leap for us – moving into the digital realm – is really just a shift into a new kind of performance space.
Another key feature of VR is its connection with the audience, something just as essential in theatre. In VR you craft stories for a participant Single Audience Member (SAM) who sits at the very centre of the story experience (think the inverse of theatre-in-the-round). I use the word participant purposefully. SAM has presence and agency in VR and engages with the story kinesthetically. The amount of engagement depends on the ‘degrees of freedom’ offered by the experience: three degrees (3DoF) let SAM look around from a fixed point, while six (6DoF) let them move through the space as well. This has wide-reaching implications for storytelling and is reminiscent of the way theatre audiences already engage with our work.
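For readers who like things concrete, the two common tiers of freedom can be sketched in a few lines of code. This is purely illustrative – the class names are invented, not from any VR SDK: a 3DoF pose carries head orientation only, while a 6DoF pose adds position, which is what lets SAM physically walk through a scene.

```python
from dataclasses import dataclass

@dataclass
class Pose3DoF:
    """Head rotation only - what a 360° video shoot can offer."""
    yaw: float    # turning left/right, in degrees
    pitch: float  # looking up/down
    roll: float   # tilting the head

@dataclass
class Pose6DoF(Pose3DoF):
    """Rotation plus position - possible in a CGI environment."""
    x: float = 0.0  # metres of physical movement: left/right
    y: float = 0.0  # up/down
    z: float = 0.0  # forward/back

# A seated SAM can only turn to look; a 6DoF SAM can step 1.5 m forward.
seated = Pose3DoF(yaw=90.0, pitch=0.0, roll=0.0)
walking = Pose6DoF(yaw=90.0, pitch=0.0, roll=0.0, z=1.5)
```

The practical upshot for staging: a 360° capture fixes SAM at the camera position, while CGI work has to account for an audience member who may wander.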
Additionally, SAM’s proximity to the story creates a connection, unique to VR, which builds audience empathy and has rightfully earned VR its reputation as an ‘empathy machine’. Furthermore, VR tricks your brain into believing that you have actually visited the locations and interacted with the characters in the virtual experience, creating memories that can feel as authentic as those from real life – an effect supported by neuroscience research on presence and memory. Story-living is a more apt description of the VR audience experience, and one that we, as theatre makers (traditional and immersive), are poised to harness and develop. SAM can also play various roles in the story. In fusing VR with the classics (Shakespeare, Marlowe, Puccini) we’ve discovered that these ‘legacy stories’ deftly mesh with VR audience/actor dynamics. In VR, as in the classics, SAM can be a voyeur who sits outside and watches the story; the conscience of a character; or a character themselves. It’s clear that our theatrical heritage offers up a treasure trove of ideas and practices that can be explored and implemented to the benefit of this new narrative medium.
VR is in its infancy – as unexplored as film was in the early 1900s. But it’s also at a point where the technology and infrastructure can support mass adoption. Having spent three and a half years exploring theatre and VR, we believe the time is right for theatre makers to join the conversation and help shape the future of virtual reality. When we combine this moment in VR’s development with the impact of the current pandemic on our industry, our involvement seems all the more timely.
Virtual reality also offers an exciting way for theatrical organisations to use their expertise and expand their digital remits to develop this new storytelling platform. It allows us to use our existing buildings and infrastructures in dynamic new ways. It lets us take audiences to places where we could never before stage a story, due to location issues or economic viability. It allows us to create vibrant, inspired content that will entice and build audiences for this new medium. The technology also allows us to live stream our work, simply and cost effectively (where 360 camera capture is concerned), to a global audience. While VR will never replace theatre, it can give us an opportunity to use our skills in new, but familiar, ways.
We believe this pandemic provides a catalyst for theatre makers and VR. Virtual reality has been waiting for us to step into the ring – to create content and shape distribution models that advance the medium. The technology is more ubiquitous, sophisticated and affordable than ever, and distribution platforms like YouTube support 360° content and spatial audio. Whether you have a VR headset, a mobile device or just a laptop, you can now engage with immersive digital content – creating and sharing this work has never been easier. In taking up this mantle, we, as theatre makers and organisations, can, to some degree, future-proof ourselves against social disruptions like this pandemic. In doing so, we can foster a new storytelling genre, populate new performance spaces and create new revenue streams that will be advantageous in robust times as well as lean.
We are witnessing the birth of a new way to tell stories – as we did with the printing press, radio, film and TV. At a time when Covid-19 has rendered our traditional spaces not just out of bounds but dangerous, VR provides an answer to some of our unprecedented questions and a new, safe space for us to share stories with our audiences. We have an opportunity to be vanguards, to shape the future of VR, and it’s a baton we at Relative Motion believe we should take with both hands.
Our Shakespeare VR pieces mark a new collaboration between Relative Motion and composer, Tom Adams. Here’s Tom discussing his work with RM:
“I come from a theatre background as composer and performer and was really interested in Relative Motion's approach in combining theatre, film and video game languages. VR is such a brilliant opportunity to allow the audience to feel truly invested in a piece. What I loved most about these three Shakespeare experiences was that each one had a distinct flavour. This gave me a chance to use different methods to try and support the storylines.
My musical approach is to watch each piece again and again and improvise, experimenting with what instruments to use. My main aim is to not be self-conscious when composing – I just follow the story and let the process take care of itself. Being a performer myself, it is a lot like being on stage: trusting in the moment and breathing. Something will happen.
I have amassed quite a collection of sound-making things – from guitars, ukulele, bass, harmonica and upright piano to stranger elements such as crisp packets, gongs, sticks and a brilliant sea drum that sounds like the sea. Put all of these through effects pedals, change the pitch, speed them up and you've got something completely your own and organic, bespoke to the piece. I love this part of the process, where you can match the mood to the sounds.
With King Lear, it might be apparent that I have been watching a lot of Ozark recently! I am inspired by the way the composers Danny Bensi and Saunder Jurriaans use tension in music. I was on edge the whole time. I am also inspired by how organic sounds can be manipulated into something completely new and unrecognisable.
For Romeo & Juliet, the first thing I wrote down when watching the piece for the first time was 'electric guitar'. I liked the subtlety in their performances and the spaces in between dialogue, allowing the scene to breathe. This was my opportunity to insert some sexy, Pink Floyd-style reverb guitar. Chris, the director, gave me one word which I really liked: "clandestine". It was then I knew the guitar had to be gentle and caring, sexy and post-coital. The score ramps up at the end with danger, and I wanted to find a low sub-bass sound that would highlight Romeo's high stakes – death, if caught in her bedroom.
On As You Like It, I knew it needed a more classical but silly edge. A soundtrack that has a method and logic all to itself. We used vocals in this piece a lot and I really enjoyed each instrument playing a character. Inspiration came from the confusion of the characters and the pacey dialogue. This is a piece that grabs you by the horns and doesn't let go.
Enjoy, enjoy, enjoy!"
Once the piece has been mapped out, I like to do a walking test for the camera to make sure that it all feels right for SAM. I watch my stumble-through in headset and tweak as needed until I have movement patterns that feel spatially and textually authentic. I generally like to rehearse on location/set too. My draft staging is essentially my pencil sketch for the piece, and I fill it in with the actors, as they will always bring new insights that shape the work. Working on site means that we can all play with the space and pull inspiration from it. Also, the more comfortable the performers are with the macro and micro aspects of the environment, the easier it will be for them to craft refined, authentic performances. This process also includes working with the lighting and sound designers to make sure that any key effects are built into the work from the start. This is especially critical for SFX, as you’ll need to rehearse these into the staging with your actors. Don’t leave either until the end.
Costumes and props also have to be built in during this part of the process. Clearly, this will have been planned for in pre-production but will come into play at this stage. I should also mention that, before the first location rehearsal, I have at least one rehearsal with the actors/singers to read/sing through the work, talk them through concepts, characters etc. and tweak anything we can in advance of the first staging rehearsal.
Before moving from rehearsals to capture mode, we always do a ‘dress/tech rehearsal’ that is captured and reviewed in headset so we can refine any niggling bits of staging, lighting etc. This also gives the performers time in the headset with their own work should aspects of their performances need attention – often an unconvincing acting moment can be easily sorted by letting the actor experience it for themselves. Each project requires a slightly different approach where rehearsals and capture are concerned. Longer stories/texts will likely need more rehearsals, and with CGI projects it’s important to get the actors into the headset so they can explore the environments and build memories to take into our rehearsal space – a critical step for performers in any CG project – as is extensive physical character work, so that actors can authentically embody their avatars in the mo-cap studio.
When we move into capturing the work, there are some things I am adamant about. First, we work in single takes (I aim to do this with CGI projects as well) so that the energy of the performances and story world stays consistent. With this in mind, actors must have embodied the text (be it dialogue, music or both). Memorisation is critical and can’t be avoided (unlike in film, where things can be cheated). I’m also keen on developing a rating process (a scale of 1 to 10) with the performers so that I can gauge their feelings about each take alongside my own. This helps separate the best takes from the worst. What’s interesting is that we are almost always unanimous about which takes crashed and which soared.
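The rating process described above is simple enough to keep in a spreadsheet, or in a few lines of code. A minimal sketch, with invented take names and scores purely for illustration: each party scores a take from 1 to 10, and averaging the scores surfaces the strongest takes.

```python
# Hypothetical take-rating log: director and performer each score
# every take from 1-10; the mean ranks the takes best-first.
takes = {
    "take_1": {"director": 6, "performer": 7},
    "take_2": {"director": 9, "performer": 9},
    "take_3": {"director": 4, "performer": 5},
}

def rank_takes(ratings):
    """Return take names ordered best-first by mean score."""
    def mean(scores):
        return sum(scores.values()) / len(scores)
    return sorted(ratings, key=lambda t: mean(ratings[t]), reverse=True)

print(rank_takes(takes))  # best take listed first
```

A large gap between the two scores for a single take is itself useful information – it flags a moment worth reviewing in headset together.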
Written by Christopher Lane
As practising theatre-makers, we’re musing daily on the emotional, social, cultural and economic havoc social distancing and travel restrictions have wreaked on the live events industry. This turmoil is testing the resilience of artists, musicians and those in the creative industries to their core. At the same time, remaining hopeful, we’re speculating on what’s next: What do we fight to reconstruct? Can the creative industries regrow into a more robust cultural and economic powerhouse? What might the new parameters and opportunities be?
Our experience of lockdown has reminded many of us of the vitality of a solid and fast data connection. We’ve often been surprised by just how effective and, at times, comforting remote presence can be. Musicians and theatre-makers have so far struggled to fully adopt live video and audio links into their practices – lag, cost, unstable connections and sometimes the sheer inhumanity of mediatisation have meant that working face-to-face is still our default. We wonder if that might now be set to change, rapidly and forever. Will we ever forego the rehearsal room? Yes, there are serious questions to be answered about safety, data security and ownership, but will 5G enable us to meaningfully connect and collaborate with the nuances we rely upon and crave? Will 5G allow us to reach our audiences of the future?
convenient giant pit in which to hide even the smallest of orchestras and conductors! Add into the mix the need to record our singers in high quality at the same time as being able to place their voices correctly in a sphere of audio surrounding our SAM and we had a set of tricky technical and creative challenges to solve.
Pre-recording an orchestra was our initial instinct, but our musical director Joe Louis Robinson helped us find a better solution. He and the singers needed a more responsive relationship, both as their performances developed in rehearsal and during recordings; singing to a backing track would be restrictive. Liveness was crucial. Just as a rehearsal pianist or répétiteur sits in for a full orchestra in an opera house, Joe would accompany the singers live on a keyboard, in rehearsals and during the final recording.
We used a combination of microphone techniques. An Ambisonic mic (which contains four separate mic capsules in a tetrahedral arrangement) sat, along with its recorder, just under our 360 camera, capturing a full sphere of room audio. Our singers were also fitted with radio microphones, their tiny capsules hidden in choice places under clothing. Clean, clear and separate vocal capture was vital, but how could the singers hear Joe’s keyboard and spoken cues without the sound bleeding into the mics? We didn’t want an ever-present “roomy” version of Joe’s keyboard to cause headaches in post-production.
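For the curious: the way those four tetrahedral capsule signals (known as A-format) become a “full sphere of room audio” is a simple sum-and-difference conversion into the four first-order B-format channels W, X, Y and Z. A minimal per-sample sketch of the idea, deliberately omitting the capsule-matching filters and gain calibration a real converter applies:

```python
def a_to_b_format(flu, frd, bld, bru):
    """First-order Ambisonic A-to-B conversion for one sample.

    flu/frd/bld/bru are the four tetrahedral capsule signals:
    front-left-up, front-right-down, back-left-down, back-right-up.
    Returns (W, X, Y, Z): an omnidirectional pressure channel plus
    front/back, left/right and up/down figure-of-eight components.
    """
    w = 0.5 * (flu + frd + bld + bru)  # omni: everything, equally
    x = 0.5 * (flu + frd - bld - bru)  # front minus back
    y = 0.5 * (flu - frd + bld - bru)  # left minus right
    z = 0.5 * (flu - frd - bld + bru)  # up minus down
    return w, x, y, z

# A sample hitting all four capsules equally carries no direction:
print(a_to_b_format(1.0, 1.0, 1.0, 1.0))  # (2.0, 0.0, 0.0, 0.0)
```

Because direction lives in the X/Y/Z channels rather than in fixed speaker feeds, the whole sound field can be rotated in post to follow SAM’s head movement – which is what makes this format so useful for 360° work.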
In-ear monitors are now commonplace in live sound production, but they’re big earplugs with a cable – quite visible on camera to our SAM, which would somewhat spoil the look of the piece. We knew the solution might lie in a new range of extremely tiny wireless earpieces, made by a hearing-aid manufacturer, that sit right in the ear canal. Asking an opera singer to block up one ear with an earpiece, no matter how tiny, is a delicate matter: aural feedback is crucial for a singer’s intonation, or pitching. Our excellently resilient singers spent two days “living” with their earpieces as we made fine adjustments to levels and positioning in the lead-up to a series of single-take captures.
With Joe’s accompaniment and spoken cues transmitted directly to the performers’ ears (with virtually zero latency), and with Joe monitoring his keyboard and the singers in headphones, we experienced the uncanny effect of hearing only their voices in the space – yet perfectly synchronised, responsive to one another and live!
Near-flawless vocal performance capture achieved, musical director Joe took the vocal recordings into his studio, along with his reference keyboard track, to create the time-base for the orchestral recording sessions. Final tweaks were made to his arrangements before we arrived on the doorstep of Sonica Studios in London with our assembled compact chamber orchestra. Sonica’s Mat and Paul are no strangers to projects with somewhat interesting briefs, and they completely understood the production process we were undergoing. Our orchestra were separated into different live rooms as much as possible, all with sightlines through windows to Joe, conducting, who monitored tracks of the singers, a tempo-setting click and his piano from the shoot in headphones. In a series of cues (self-contained sections of the music), the score for Tosca VR came to life, hand in hand, beat to beat, with the vocal performances of our singers.
Written by Andy Purves
What I find exciting is that this immersion opens up several new frontiers. Firstly, the role of audience member becomes a dynamic new landscape to explore. The audience's presence, agency and empathy become important factors in how we include them in the storytelling design and process. Secondly, there is more exciting terrain to traverse in determining what existing rules govern these spaces from a theatrical perspective and what new principles are emerging that will allow us to exploit these environments to their fullest. I want to turn to both of these issues in my next two posts.
Until next time, have a look at the following links as I think they provide a good backdrop for what is to come... I also find Chris Milk's talk rather inspiring. See you later in the week.
Written by Christopher Lane
Click on the picture that heads this post to see more of the amazing work of Adrien M and Claire B