Once again, the Game Developers Conference is almost upon us! GDC 2018 promises to be an awesome event, chock full of great opportunities for us to learn and grow as video game music composers. I always look forward to the comprehensive sessions on offer in the popular GDC audio track, and for the past few years I’ve been honored to be selected as a GDC speaker. Last year I presented a talk that explored how I built suspense and tension through music I composed for such games as God of War and Homefront: The Revolution. This year, I’m tremendously excited that I’ll be presenting the talk, "Music in Virtual Reality." The subject matter is very close to my heart!
Throughout 2016 and 2017, I composed music for many virtual reality projects, some of which have hit retail over the past year, and some of which will be released very soon. I’ve learned a lot about the process of composing music for a VR experience, and I’ve given a lot of thought to what makes music for VR unique. During my GDC talk in March, I’ll be taking my audience through my experiences composing music for four very different VR games – the Bebylon: Battle Royale arena combat game from Kite & Lightning, the Dragon Front strategy game from High Voltage Software, the Fail Factory comedy game from Armature Studio, and the Scraper: First Strike shooter/RPG from Labrodex Inc. I’ll talk about some of the top problems that came up, the solutions that were tried, and the lessons that were learned. Virtual reality is a brave new world for game music composers, and there will be a lot of ground for me to cover in my presentation!
In preparing my talk for GDC, I kept my focus squarely on composition techniques for VR music creation, while making sure to supply an overview of the technologies that would help place these techniques in context. With these considerations in mind, I had to prioritize the information I intended to offer, and some interesting topics simply wouldn’t fit within the time constraints of my GDC presentation. So I thought it would be worthwhile to share some of these extra materials in a couple of articles preceding my talk in March. In this article, I’ll explore some theoretical ideas from experts in the field of VR, and I’ll include some of my own musings about creative directions we might pursue with VR music composition. In the next article, I’ll talk about some practical considerations relating to the technology of VR music.
So, let’s get started!
VIMS
No discussion of virtual reality is complete without some time spent on the perils of Visually Induced Motion Sickness (a.k.a. VIMS). My upcoming GDC talk will include research on this topic pointing to a specific music approach that can play an important role in alleviating VIMS symptoms.
However, there is more to consider about the general role that music plays in relation to the famous VIMS phenomenon, apart from the technique that I’ll be describing in my GDC presentation. So let’s take a look at the general relationship between music and VIMS, starting with the most basic question:
What causes VIMS?
Let’s picture ourselves sitting in a movie theater. We’re watching a silent film that shows a first-person perspective of a high-speed bicycle ride full of wild twists and turns. It looks stressful, but as we sit and watch the visuals, we’re not really all that stressed. Okay, so now let’s imagine that the film isn’t silent anymore. We can hear the bumps and jolts in the road, the air whooshing by, the aurally chaotic soundscape.
It’s a bit more exciting to watch, but we’re still feeling comfortable in our movie-theater seats. Now, let’s imagine that we aren’t looking at a flat 2D screen anymore. Now it’s a 3D stereoscopic image of that wild bicycle ride. Oncoming traffic leaps off the screen at us. Obstacles seem to whip by our heads as the road before us corkscrews madly. Are we still comfortable? Or could all that dizzying 3D motion be finally getting to us?
In their study to better understand the causes of motion sickness, professors Behrang Keshavarz and Heiko Hecht gathered 69 experimental test subjects and exposed them to the visual presentation I described above. There were two variables: viewing mode (2D or 3D stereoscopic) and sound (on or off). The 2D film didn’t cause a problem. Likewise, the presence (or absence) of sound wasn’t an issue. But when 3D visuals were introduced, motion sickness became a big problem.
The findings of the study support the conclusion that immersive visual stimuli have the potential to negatively impact our sense of balance and equilibrium. However, there’s also a secondary conclusion that’s equally interesting to us as game audio folks: sound doesn’t seem to have anything to do with it. Yes, the 3D bicycle ride with sound was pretty nausea-inducing, but according to the study, a silent 3D bike ride has just as much potential to cause motion sickness.
So what does that mean for audio and music in virtual reality? Does it mean that the aural spectrum simply doesn’t matter? Or does it present us with some interesting creative opportunities? Let’s explore that idea a bit further.
Music and VIMS
First, let’s dispense with the notion that music and sound design don’t matter when it comes to VIMS. In fact, the presence of music has a powerful influence on the VIMS state, but that influence is therapeutic rather than harmful.
While there’s a certain musical strategy that has the most beneficial effect (which I’ll define in my GDC talk), the mere presence of music is a therapeutic agent that has been shown to diminish nausea symptoms. In a study conducted by the Arthur G. James Cancer Hospital and Research Center at Ohio State University, researchers found that the use of music during high-dose chemotherapy sessions led to a significant reduction in symptoms of nausea. Music acts both as a diversion and as a targeted therapeutic agent, shifting the listener’s attention away from physical discomfort while also working to reduce the symptoms themselves.
In my talk I’ll be exploring how we can best employ an effective music strategy within the constructs of virtual reality in order to cushion VR players and make them more comfortable in the immersive environment. There is, however, an additional dimension to the relationship between music, audio and VR exploration, which I didn’t have time to include in my upcoming GDC talk. I’d like to share my thoughts on that here.
Presence
What makes virtual reality so real? It can’t be just the encompassing imagery, because then we wouldn’t need VR; we could just go to a 3D movie. No, in order for VR to engage us, it has to make us feel as though we are personally present in the virtual world. This phenomenon is alternately called telepresence or virtual presence, but the end result is the same. We feel as though we’re physically occupying the same world as the imaginary visuals we’re encountering. How does the game make us feel this sense of presence?
According to MIT Professor Thomas B. Sheridan, the sensation of presence depends on the operation of three important factors: a "sufficiently high-fidelity display, a mental attitude of willing acceptance, and a modicum of motor participation." In other words, we need to find the visuals to be sufficiently convincing, we have to be willing to be convinced that they’re real, and we must be able to move about freely and interact with the environment. Unfortunately, it’s that third factor that causes the VIMS problem. Moving around in VR opens us up to motion sickness. How is this problem typically addressed?
According to Steve Bowler, cofounder of the VR game company CloudGate Studio, the community of VR game developers has "zero tolerance for user motion sickness." In an interview with ScienceNews.org, Bowler describes the way in which developers typically solve the problem: rather than letting us walk continuously through their VR worlds, they give us a system of in-game navigation that relies on teleportation. We point our controllers where we want to be, we hit the teleport button and zip! We’re there in a flash. It’s a highly effective way to avoid the perils of VIMS. However, it also sharply curtails our sensation of being able to "move about freely and interact with the environment."
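For the technically curious, here’s a minimal sketch of how teleport locomotion works, written in plain TypeScript rather than any particular engine’s API. The types and names (CameraRig, findTeleportTarget, teleport) are purely illustrative assumptions, not code from any of the games mentioned above:

```typescript
// Illustrative types; a real engine would supply its own camera rig and input APIs.
type Vec3 = { x: number; y: number; z: number };

interface CameraRig {
  position: Vec3;                          // where the player "stands" in the virtual world
  fadeToBlack(ms: number): Promise<void>;  // brief blink used to mask the jump
  fadeFromBlack(ms: number): Promise<void>;
}

// Cast a ray from the hand controller and find where it meets the ground plane (y = 0).
function findTeleportTarget(origin: Vec3, direction: Vec3): Vec3 | null {
  if (direction.y >= 0) return null;   // pointing level or upward: no valid destination
  const t = -origin.y / direction.y;   // distance along the ray to the ground
  return {
    x: origin.x + direction.x * t,
    y: 0,
    z: origin.z + direction.z * t,
  };
}

// Teleport locomotion: the player never sees continuous motion, so the visual/vestibular
// conflict that produces VIMS never gets a chance to build up.
async function teleport(rig: CameraRig, origin: Vec3, direction: Vec3): Promise<void> {
  const target = findTeleportTarget(origin, direction);
  if (!target) return;
  await rig.fadeToBlack(100);   // a quick fade instead of visible movement
  rig.position = target;        // ...and we're "there in a flash"
  await rig.fadeFromBlack(100);
}
```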
So, if developers are forced to limit the personal agency of players in wandering around the environment, is it possible for game audio folks to compensate by making it seem as though the environment is wandering around us? This is a thought I’ve been considering lately, as I contemplate the movement limitations we experience in VR environments. After all, before we had visual virtual reality, we had a kind of audio VR in the form of audio-only games like Papa Sangre.
In games like Papa Sangre, the environment presents a busy soundscape that invisibly drifts around us. If we close our eyes, we’re suddenly fully enveloped in the world that the game developers have created. Merely turning around becomes a radically dramatic act of personal agency as the sonic universe reacts to our movement. I’ve included a non-interactive video clip below that demonstrates some gameplay from Papa Sangre. In this clip, you can watch a gamer interacting with the game’s interface. Notice the somewhat exaggerated nature of the sound design as the player is instructed how to play the game:
[Video: Papa Sangre gameplay demonstration]
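Papa Sangre’s binaural audio engine is proprietary, but the underlying idea – sound sources anchored in the virtual world, with the mix re-rendered around us as we turn – can be sketched using the Web Audio API’s built-in HRTF panner. The example below is a rough illustration of that concept (not the game’s actual implementation), and the onHeadYawChanged callback is an assumed stand-in for whatever head-tracking event a given platform provides:

```typescript
// Minimal sketch: one sound source fixed in the virtual world, re-rendered
// binaurally as the listener turns. (Web Audio API; illustrative only.)
const ctx = new AudioContext();

// A sound "object" placed slightly to the right and two meters ahead of the player.
const panner = new PannerNode(ctx, {
  panningModel: "HRTF",       // binaural rendering, intended for headphones
  distanceModel: "inverse",
  positionX: 1,
  positionY: 0,
  positionZ: -2,
});

// Stand-in source; a real game would play recorded sound design instead.
const source = new OscillatorNode(ctx, { frequency: 220 });
source.connect(panner).connect(ctx.destination);
source.start();

// Hypothetical callback, invoked whenever the player's head yaw changes
// (e.g. from a VR headset or device-orientation event). Simply turning in
// place re-orients the entire soundscape around the listener.
function onHeadYawChanged(yawRadians: number): void {
  const listener = ctx.listener;
  listener.forwardX.value = Math.sin(yawRadians);
  listener.forwardY.value = 0;
  listener.forwardZ.value = -Math.cos(yawRadians);
  listener.upX.value = 0;
  listener.upY.value = 1;
  listener.upZ.value = 0;
}
```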
Source: Gamasutra