Half-Life: Alyx Locomotion Deep-Dive
Author: Half-Life: Alyx Team
[previewyoutube=TX58AbJq-xo;full][/previewyoutube]
[b]Join the Half-Life: Alyx team in a behind-the-scenes deep dive that examines locomotion and player presence in the game. Valve developers Jason Mitchell, Luke Nalker, Greg Coomer, and Roland Shaw share some of our early prototypes, and walk through user interface, audio, and player movement discoveries that led to the game's final designs.[/b]
[h3]Transcription:[/h3]
[i]Jason Mitchell:[/i] My name is Jason Mitchell, and I'm a developer at Valve. Today, some colleagues and I would like to share with you the details of the player movement in Half-Life: Alyx. Traversing space is a fundamental challenge in VR generally, and we have spent considerable effort on our locomotion system. We'd like to take this opportunity to talk about some of our early prototypes and some of the more interesting things we learned along the way.
In Half-Life: Alyx, we decided to support three different types of player locomotion: blink teleport, shift teleport, and continuous locomotion.
Continuous locomotion is the most similar to traditional WASD mouse and keyboard controls. This allows players to move smoothly through the environment without teleporting. In both teleport modes, the player uses a targeting interface to select the location where they would like to move.
In blink teleport, the screen turns black, the player is moved to the target location, and the screen fades back up from black with the player at the new position. This all takes place in a fraction of a second.
Shift teleport is similar, except the screen doesn't go black, and the player is rapidly moved to the selected location. Today, we're going to dive into the details of the blink and shift teleport modes.
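To make the two modes concrete, here is a minimal C++ sketch of the blink and shift sequences. Everything in it (the Vec3 type, the FadeScreen hook, the timings and frame counts) is a hypothetical stand-in; Valve's actual implementation is not public.
[code]
#include <chrono>
#include <cstdio>
#include <thread>

// Hypothetical stand-ins for engine calls; Valve's actual API is not public.
struct Vec3 { float x, y, z; };

void Sleep(float seconds) {
    std::this_thread::sleep_for(std::chrono::duration<float>(seconds));
}

void FadeScreen(float blackness) {  // placeholder for a full-screen fade
    std::printf("screen fade -> %.0f%% black\n", blackness * 100.0f);
}

// Blink: fade to black, reposition instantly, fade back up. The whole
// sequence takes a fraction of a second.
void BlinkTeleport(Vec3& player, const Vec3& target) {
    FadeScreen(1.0f);
    Sleep(0.05f);
    player = target;  // move the player while the screen is dark
    FadeScreen(0.0f);
}

// Shift: no fade; rapidly interpolate the player to the target over a few frames.
void ShiftTeleport(Vec3& player, const Vec3& target, float duration = 0.1f) {
    const int kFrames = 9;  // assumed frame count
    Vec3 start = player;
    for (int i = 1; i <= kFrames; ++i) {
        float t = (float)i / kFrames;
        player = {start.x + (target.x - start.x) * t,
                  start.y + (target.y - start.y) * t,
                  start.z + (target.z - start.z) * t};
        Sleep(duration / kFrames);
    }
}

int main() {
    Vec3 player{0, 0, 0};
    BlinkTeleport(player, {4, 0, 2});
    ShiftTeleport(player, {8, 0, 2});
    std::printf("player at (%.0f, %.0f, %.0f)\n", player.x, player.y, player.z);
}
[/code]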
[i]Luke Nalker: [/i]Since we had already shipped blink teleport in The Lab, it was a natural place for us to start our development. The Lab had a combination of specifically authored teleportation locations and room-scale spaces, and that is where we started our prototypes. While the targeting reticle worked well in The Lab, it turned out to be distracting to players in the more realistic environments of our game.
In order to present an indicator that was more physically grounded in the world, we began experimenting with showing the player's feet at the target location. This had the added benefit of enabling the player to select the desired orientation at the teleport destination. The presentation of the feet implied the volume that the player's body would occupy, which was especially important when moving to cover during combat.
We also experimented with presenting the visualization of player footprints, not just at the destination, but along the current path. Players had a great reaction to this because it allowed them to visualize themselves moving over and around obstacles, as well as understand why some desired movements may not be valid.
[i]Greg Coomer: [/i]Those virtual footsteps you see are a visual representation of our pathing system. Initially, we tried repurposing our existing nav mesh system, which is the same one our AI uses to navigate. In practice, however, this proved to be too rigid. The nav mesh system precomputes all possible movement paths. And while that's efficient, it didn't support the locomotion and interaction desires of our players or our designers, who were building very dynamic environments.
This led us to our next series of tests, using the well-known A* algorithm to build routes. Our A*-based teleport pathing approach was successful in that it freed players from having to micromanage their teleport targeting. Ultimately, however, it was too computationally intensive, so we had to explore more efficient solutions.
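For readers unfamiliar with A*, the sketch below shows a minimal grid-based version in C++. It illustrates the kind of search described here rather than Valve's actual pathing code; the grid representation, uniform step costs, and Manhattan heuristic are assumptions made for this example.
[code]
#include <cstdio>
#include <cstdlib>
#include <queue>
#include <vector>

// Minimal grid-based A* in the spirit described above. The game's actual
// pathing code is not public; the grid, unit costs, and heuristic are assumed.
struct Node { int x, y; float f; };
struct Cmp { bool operator()(const Node& a, const Node& b) const { return a.f > b.f; } };

// Returns the cost of the shortest 4-connected path, or -1 if unreachable.
float AStar(const std::vector<std::vector<int>>& grid, int sx, int sy, int tx, int ty) {
    int h = (int)grid.size(), w = (int)grid[0].size();
    auto heuristic = [&](int x, int y) { return std::abs(x - tx) + std::abs(y - ty); };
    std::vector<std::vector<float>> g(h, std::vector<float>(w, 1e9f));
    std::priority_queue<Node, std::vector<Node>, Cmp> open;
    g[sy][sx] = 0.0f;
    open.push({sx, sy, (float)heuristic(sx, sy)});
    const int dx[4] = {1, -1, 0, 0}, dy[4] = {0, 0, 1, -1};
    while (!open.empty()) {
        Node n = open.top(); open.pop();
        if (n.x == tx && n.y == ty) return g[n.y][n.x];  // reached the target
        for (int i = 0; i < 4; ++i) {
            int nx = n.x + dx[i], ny = n.y + dy[i];
            if (nx < 0 || ny < 0 || nx >= w || ny >= h || grid[ny][nx]) continue;
            float cost = g[n.y][n.x] + 1.0f;  // uniform step cost
            if (cost < g[ny][nx]) {
                g[ny][nx] = cost;
                open.push({nx, ny, cost + heuristic(nx, ny)});
            }
        }
    }
    return -1.0f;  // no route exists
}

int main() {
    // 0 = walkable, 1 = obstacle; the route must go around the wall in row 1.
    std::vector<std::vector<int>> grid = {
        {0, 0, 0, 0},
        {1, 1, 1, 0},
        {0, 0, 0, 0},
    };
    std::printf("path cost: %.0f\n", AStar(grid, 0, 0, 0, 2));  // prints 8
}
[/code]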
During these early experiments, two things became pretty clear to us: First, players have a strong tendency to focus on the destination, and the rest of their environment is seen as a potential obstacle to them getting where they want to go. Second, players imagine themselves to have super-human speed and agility. Playtesters would happily tell us that, yes, in fact, they could reach that spot by squeezing through a gap, catapulting over an object, or sliding under an obstacle.
Ultimately, our initial tests and playtesters' desires led us to the system we have today, the main goals of which are to ensure that the end player position is a valid place for the player's body and that the path to get there is viable. To get to this point, we had to solve a whole variety of problems. Some of the more interesting ones include how player height impacts pathing, how to build player trust in the system, and how to keep players grounded in the world using only audio.
When we talk about a viable path, we mean one where the representation of the player's body in the virtual environment stays intact. Players obviously can't squeeze in between jail bars, fit into a tiny mouse hole, or jump hundreds of feet up onto ledges. But they can crawl, turn, contort, or otherwise engage with their environment in ways that stay true to their physical shape.
Traditional video games put the world in front of you, and, as a player, you have expectations about your ability to navigate and act on the world. But movement in VR generates so many more player expectations because the world completely envelops the player. Players actually lean around corners. They even get down on the floor to try to peek under doors.
It turns out that the height at which players view the virtual environment is very important. We as humans are strongly attuned to the perspective that our height gives us on the real world, and it was essential that we preserve this in the virtual environment in Half-Life: Alyx. The result is that the extent of each player's virtual body is subtly different due to the variation in their real-world height, and this affects the way paths are computed. In fact, early on, we would get hard-to-reproduce bug reports from players banging their virtual heads on low-hanging pipes or being unable to teleport through certain areas, and it was only later that we realized all those bugs were coming from our taller colleagues. At the time, our system required players to crouch down to the minimum height along their desired teleport path, but players frequently failed to notice or understand this. They were focused on their goal, not realizing that, halfway through their path, there was an obstacle they needed to crouch under during their movement. So to solve this, we decided on a standard minimum body size, which is used to validate the middle portion of the teleport path; only the start and end positions must support the actual presence of the full-size player.
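A minimal sketch of that endpoint-versus-midpath rule might look like the following, assuming a hypothetical CapsuleFits() collision query and an assumed standard minimum height of one meter:
[code]
#include <cstdio>
#include <vector>

// Hypothetical sketch of the endpoint-versus-midpath rule; CapsuleFits()
// stands in for the engine's real collision query.
struct Vec3 { float x, y, z; };

const float kStandardMinHeight = 1.0f;  // assumed standard minimum body size (meters)

bool CapsuleFits(const Vec3& pos, float height) {
    // Placeholder: a real implementation would sweep a capsule of the given
    // height against world geometry at this position.
    (void)pos;
    return height <= 1.8f;  // pretend there is 1.8 m of clearance everywhere
}

bool ValidateTeleportPath(const std::vector<Vec3>& path, float playerHeight) {
    if (path.size() < 2) return false;
    // Start and end must fit the player's actual, full-size body...
    if (!CapsuleFits(path.front(), playerHeight)) return false;
    if (!CapsuleFits(path.back(), playerHeight)) return false;
    // ...but intermediate points only need to pass the standard minimum size,
    // so taller players are not blocked by low obstacles mid-path.
    for (size_t i = 1; i + 1 < path.size(); ++i)
        if (!CapsuleFits(path[i], kStandardMinHeight)) return false;
    return true;
}

int main() {
    std::vector<Vec3> path = {{0, 0, 0}, {1, 0, 0}, {2, 0, 0}};
    std::printf("path valid: %s\n", ValidateTeleportPath(path, 1.75f) ? "yes" : "no");
}
[/code]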
[i]Jason Mitchell:[/i] At the core of all Half-Life games is a particular mix of story, puzzle, and combat experiences. In order for the combat experiences in particular to succeed in VR, it was essential that we build players' trust in the teleport system. Ultimately, we wanted the teleport interface to fade into the background, enabling players to concentrate fully on the combat scenarios. This meant that the teleport system had to perform a player's desired movement even if they were making quick and coarse gestures. During development, we discovered two key elements that were essential to gaining players' trust: prioritizing movement along the floor and, under certain conditions, moving players only partially along their desired paths.
If you think about a traditional first-person control scheme, you have forward and backward movement and left and right strafing. All of these movements happen in a plane. When the player wants to leave the plane, they have to do something explicit, like jump or climb a ladder. Prioritizing the floor is the teleportation version of that concept. In combat, the player's focus is, as you would expect, on the enemy and not on the environment. Playtesting has shown us that players' teleport targeting becomes frantic and imprecise in these situations. And even under these conditions, our system needs to match player expectations. Typically, players want to stay in contact with the floor and move out of danger, not jump up and down on all the objects in their environment in the middle of a battle. Floor prioritization achieves this.
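One plausible way to express floor prioritization in code, with hypothetical trace helpers (SurfaceIsWalkable, TraceDownToFloor) standing in for the engine's real queries:
[code]
#include <cstdio>

// Hedged sketch of floor prioritization. SurfaceIsWalkable() and
// TraceDownToFloor() are hypothetical stand-ins for engine traces.
struct Vec3 { float x, y, z; };

bool SurfaceIsWalkable(const Vec3& hit) {
    // Placeholder: real code would classify the surface the ray struck.
    return hit.z < 0.5f;  // pretend only near-ground surfaces are walkable
}

Vec3 TraceDownToFloor(const Vec3& from) {
    return {from.x, from.y, 0.0f};  // placeholder downward raycast
}

// Resolve a teleport ray hit to the position the player most likely intended:
// stay on the floor unless the hit is a genuinely walkable surface, which
// covers the explicit "leave the plane" case described above.
Vec3 ResolveTeleportTarget(const Vec3& hit) {
    if (SurfaceIsWalkable(hit)) return hit;
    return TraceDownToFloor(hit);  // snap targets on clutter down to the floor
}

int main() {
    Vec3 onCrate{2.0f, 1.0f, 0.9f};  // a frantic shot lands on top of a crate
    Vec3 target = ResolveTeleportTarget(onCrate);
    std::printf("target resolved to (%.1f, %.1f, %.1f)\n", target.x, target.y, target.z);
}
[/code]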
Another way that our teleportation system achieves reliable results, even when used quickly and imprecisely, is by moving players partially along their desired path in certain cases. In the heat of battle, players focus on quickly repositioning themselves for tactical advantage. Instead of requiring the player to precisely specify a valid teleport destination, partial movement moves the player as close as possible to the desired location along the valid movement path. As a result, players don't have to shift their mental focus away from their enemy in order to move during battle.
Another variation of partial movement occurs at drop-offs. Heights are very impactful in VR, and dropping into a new environment, especially one that was not visible prior to committing to the teleport, can be very disorienting. To address this, we also perform partial movement when a player's end position is not visible from their starting position. The result is a movement up to the edge of the cliff or the drop-down, creating a natural pause that allows the player to view the terrain below and make a better informed decision about how to proceed.
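A sketch covering both variants of partial movement described above, assuming hypothetical IsValid() and IsVisibleFrom() checks in place of the engine's real path and line-of-sight queries:
[code]
#include <cstdio>
#include <vector>

// Hedged sketch of partial movement. IsValid() and IsVisibleFrom() are
// hypothetical stand-ins for the engine's real queries.
struct Vec3 { float x, y, z; };

bool IsValid(const Vec3& p) { (void)p; return true; }  // placeholder body check

bool IsVisibleFrom(const Vec3& from, const Vec3& to) {
    // Placeholder: a real check would be a line-of-sight trace. Here any
    // large drop in height counts as "not visible" (over a cliff edge).
    return (from.z - to.z) < 1.0f;
}

// Walk the computed path and stop at the last point that is still valid and
// visible from the start, instead of rejecting the move outright.
Vec3 PartialMove(const std::vector<Vec3>& path) {
    Vec3 best = path.front();  // assumes a non-empty path
    for (const Vec3& p : path) {
        if (!IsValid(p)) break;                      // stop before invalid space
        if (!IsVisibleFrom(path.front(), p)) break;  // pause at drop-off edges
        best = p;
    }
    return best;
}

int main() {
    // The path reaches a cliff edge at x=2, then drops 3 m at x=3.
    std::vector<Vec3> path = {{0, 0, 0}, {1, 0, 0}, {2, 0, 0}, {3, 0, -3}};
    Vec3 stop = PartialMove(path);
    std::printf("player stops at (%.0f, %.0f, %.0f)\n", stop.x, stop.y, stop.z);  // (2, 0, 0)
}
[/code]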
While we want the locomotion system to become effectively invisible, it's also important that the player's subconscious remains aware of their presence in the virtual world. Sounds are used as a constant, subtle reminder of the player's presence in the game.
[i]Roland Shaw:[/i] Teleporting in The Lab and SteamVR is supported with a relatively abstract sound. But that's not consistent with the more grounded realism of Half-Life or with the visual and haptic cues we are providing players in Half-Life: Alyx. So rather than a simple, abstract teleport confirmation sound, we use audio to describe the player's locomotion after they teleport.
This isn't something that's limited to teleporting; we use sound to describe and support all kinds of body movement. For example, a player moving their arm to grab ammo, crouching, turning, and twisting are all movements supported with sound. When we were developing this technology, we began by just playing some basic Half-Life footstep sounds after each teleportation. Players responded well, so we experimented with using the distance traveled to drive footstep volume and timing. Players also started providing feedback that they expected to hear a louder, heavier sound when they teleported from a high area to a low one, effectively jumping down in virtual space. This kind of feedback convinced us that players were feeling as if they had a physical presence in VR. They began to expect their weight to impact the environment, and to hear the results. We implemented that feature and iterated based on further feedback.
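As an illustration of that audio design, here is a hedged sketch that scales footstep count and volume with teleport distance and adds a heavier landing sound for drops; the sound names, stride length, and volume constants are invented for this example:
[code]
#include <algorithm>
#include <cmath>
#include <cstdio>

// Hedged sketch of the audio idea above: footstep count and volume scale with
// distance traveled, and significant drops add a heavier landing sound. The
// sound names, stride length, and constants are assumptions.
struct Vec3 { float x, y, z; };

void PlaySound(const char* name, float volume) {  // placeholder for engine audio
    std::printf("play %-12s volume %.2f\n", name, volume);
}

void PlayTeleportAudio(const Vec3& from, const Vec3& to) {
    float dx = to.x - from.x, dy = to.y - from.y;
    float dist = std::sqrt(dx * dx + dy * dy);  // horizontal distance traveled
    float drop = from.z - to.z;                 // positive when moving downward

    // One footstep per assumed stride length, louder for longer moves.
    const float kStride = 0.75f;
    int steps = std::max(1, (int)std::round(dist / kStride));
    float volume = std::min(1.0f, 0.4f + dist * 0.1f);
    for (int i = 0; i < steps; ++i)
        PlaySound("footstep", volume);

    // Teleporting from a high area to a low one gets a heavier landing sound.
    if (drop > 0.5f)
        PlaySound("heavy_land", std::min(1.0f, 0.5f + drop * 0.25f));
}

int main() {
    PlayTeleportAudio(Vec3{0, 0, 2.0f}, Vec3{3.0f, 0, 0});  // 3 m across, 2 m down
}
[/code]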
[i]Jason Mitchell:[/i] Now that the game is in the hands of customers, it's exciting to see that players who prefer different styles of VR locomotion have been able to select the option that's most comfortable for them. Interestingly, the fact that we implemented multiple locomotion systems together meant that each of them was stronger, because they could borrow ideas from the others. For example, the fluidity of continuous locomotion influenced the design of the teleport system. Likewise, the audio cues that we developed to summarize player movement during teleports were integrated back into the continuous locomotion mode to provide the same kind of player feedback. So in the end, no matter which locomotion method players select, they'll be able to move confidently through the complex environments of Half-Life: Alyx.