Gone Home, Firewatch, What Remains of Edith Finch,
and the effects of video game cinematography
My friend M Eifler is very smart. They're smarter than me, and--sorry to be the bearer of bad news--they're probably smarter than you, too. While we were sipping tea in my apartment (and after a long conversation about space elevators), M mentioned how their research into virtual reality showed that common camera angles used in film don't translate the same way in a VR environment.
M referenced a scene in the movie version of Matilda, where Matilda enters the library for the first time. The camera is placed low, tilted up to look at the librarian's desk. It's a technique even a layperson like me is familiar with: when the filmmaker wants to show the balance of power between two characters, adjusting the tilt of the camera gives a sense of their relationship. In this instance, Matilda is looking up at the librarian, putting the latter in a more powerful (here: venerated) position relative to Matilda. Similar shots are used when the Trunchbull comes into play, though those are closer and, thus, more antagonistic-feeling. These shots are immediately readable; they create a gut reaction in the viewer and give the filmmaker a quick visual shorthand for the relationships between characters.
M went on to explain that when they tried to create a similar effect in VR, having the viewer tilt their head up or down wasn't quite enough. To someone playing in VR, a head tilt up or down felt like an exploration of space rather than a definition of a relationship.
Instead, if M wanted to have the viewer feel a sense of superiority over what they were seeing, M had to have the viewer tilt forward a little while looking down. If they wanted the viewer to feel inferior, M had to have the viewer tilt back while looking up. In other words, unlike in film where the imagined angle of the head was enough to convey the relationship, in VR, these feelings were somatically indicated by the movement of the viewer's chest. (You can read more about M's findings on VR cinematography here.)
Though I don't play a lot of VR, my talk with M got me thinking about how we approach cinematography in video games. As a Gentle Gamer, I was instinctively drawn to so-called "walking simulators," wherein storytelling is primarily done through exploring a space. With their lack of time limits and emphasis on quiet story-building, these sorts of games should be right up my alley. However, "walking simulators" are often marketed as a little spooky, even if the gameplay or story has nothing to do with horror. This is certainly the case with three popular and distinctly non-horror games in this genre: Gone Home, Firewatch, and What Remains of Edith Finch. All three feature undercurrents of trademark scares: home-alone-during-a-thunderstorm, there's-something-in-the-woods, and family-with-a-mysterious-curse, respectively.
I don't think the teaser that these games might turn into something horrific is just a marketing ploy to get a wider variety of gamers to play a gentle game (though I do think it works in that capacity). Instead, I suspect it has something to do with the unique cinematography of video games with a first-person perspective, and its inability to completely replicate how we experience our first-person lives in the real world.
(Light spoilers ahead for all three games)
2Spoopy
Gone Home came out in 2013, and I remember being suspicious from the start. The game's cover artwork was a Haunting of Hill House-type dark mansion on a hill. Not to mention that the illustration was done by Emily Carroll, whom I absolutely adore, but who is also known for making some truly spooky pieces. (I recommend His Face All Red.) I had Gone Home in the firm "no thank you, please" pile until a torrent of positive reviews came out, praising its thoughtfulness, tenderness, and not-spooky-at-all ending.
When I turned on the game, I felt like maybe I had misread something. There was the house on the hill, all looming and silent. There were the locked doors and empty rooms, littered with the remains of people who seemingly had JUST left. There was the thunderstorm, booming outside and rattling the windows. The dark corridors mimicked the look and feel of first-person survival horror, and the feeling of something possibly popping out and surprising a very defenseless me was always there.
Even after I finished the game and was thoroughly impressed by the deep storytelling and emotional payoff of the ending (leading me to recommend it to nearly everyone I knew), what remains clearest in my memory five years later is walking through a crawl space between the walls when a lightbulb suddenly popped over my character's head. I remember actually yelling out, surprised by the one jump scare I had been expecting the entire time.
2Spoopy, take 2
With the incredible success of Gone Home came games that sought to emulate it. (There has also been the backlash against "walking simulators" and the unnecessary discussion about whether or not they're "Real Games," but that's an argument for another day.) Two such games are Firewatch and What Remains of Edith Finch, released in 2016 and 2017 respectively. Both followed Gone Home's lead not only in having the player experience the story through first-person exploration, but also in using horror-light teasers in their marketing and gameplay.
Firewatch puts the player in the shoes of Henry, a man trying to escape the heartbreak of his wife's early-onset dementia by retreating to the woods and serving as a fire lookout in a national forest. Henry's one companion is Delilah, a woman in the next firewatch tower over. Henry can't see Delilah, but they speak near-constantly over walkie-talkie. The player decides what Henry says to Delilah, while also leading Henry's exploration of the park. After a bit of time, Henry seems to uncover someone trying to do him harm. He gets threatening messages. Two girls go missing. Someone breaks into his tower. A mystery seems to unfold.
What Remains of Edith Finch casts the player as the eponymous Edith Finch, a woman returning to her family's house in the woods. Everyone in her family has died; Edith is "what remains." She explores the house, telling the stories of her family members and the various ways they died. (For their deaths, the player switches to a first-person view of the doomed person.) The house looks spooky, with its Frankensteined-together appearance. Like the house in Gone Home, it is empty, with creaky floors and hidden passages. When you approach the house for the first time, it is bathed in a grey, sickly light.
However, in both games, just as in Gone Home, the backdrop of horror is used to subvert expectations and ultimately highlight a core story of deeply familiar humanity. The mystery of Firewatch is just a cover for Henry's loneliness and grief. In Edith Finch, as the player sees the death of each family member, it's clear that their last moments are explored with tenderness; Edith returned to the house to tell her family's stories and honor their lives.
So, the question remains: why do these games rely on the touchstones of horror or suspense in the first place? When you look at the history of first-person views in gaming, it starts to make a lot more sense.
Eye of the beholder
There are a few genres of video games where first-person views reign supreme.
The first is shooter/tactical games, wherein the player's goal is relatively simple: take out the other players before they take you out. This genre encompasses games like Counter-Strike and Halo, and even less overtly "war"-like games such as the hyper-popular Overwatch. The first person has also been widely used for survival/horror games like Alien: Isolation, Resident Evil 7: Biohazard, and Amnesia: The Dark Descent, where the player takes over a character trying to escape or outwit a much more powerful enemy, usually while trapped in a series of the much-beloved horror corridors. Finally, puzzle games like Myst and its spiritual offspring, The Witness, have the player see the world through the eyes of a protagonist.
All three genres are neatly covered by Valve's famous Orange Box, which bundled together first-person games from all three: Half-Life 2 (horror/tactical), Team Fortress 2 (team shooter), and Portal (puzzle). In all three, the player saw the world through their character's eyes.
Apply a little logic to the needs of game design, and this all makes sense. All three genres, like the walking simulators described above, require the character to directly interact with the environment around them. Often, the character is picking up objects (weapons, puzzle pieces, clues, ammunition), and keeping the camera in the first person gives the player an unobstructed view of those objects. In a shooter, if the player wants to aim accurately, the camera has to shift to the first person (or a tighter, over-the-shoulder shot) anyway.
In looking for counterexamples to the first-person camera in these genres, I felt a little at a loss. There are some, but most of them still hold to the first person as much as possible. But then, as always, there's Playerunknown's Battlegrounds.
Sandbox
Playerunknown's Battlegrounds (commonly referred to as PUBG) is a battle royale-style shooter. A large number of players are dropped onto a map whose playable area shrinks as time goes on. The last person standing wins. Notably, the game defaults to a third-person view. However, in the middle of last year, PUBG debuted first-person-only servers, and people had Lots Of Feelings about them.
Polygon wrote about the option in an article titled "PUBG's first person-only mode is absolutely brutal." In it, they talk about how the third-person view allowed players to get a better sense of who was around them. They could scout around corners and see who might be lurking nearby. In first person, all of that was gone. Players were more routinely surprised by folks lurking in places they didn't expect or sense. There was a greater sense of "stumbling across" things, partially because of the limited view of the world. Even looking at the above two screenshots, the difference is clear. One feels open and expansive. The other feels like you have blinders on; there could be another player right next to you, and you'd have no idea.
In other words, playing a video game in the first person means looking at a flat screen that displays only what's immediately ahead of you. It completely removes the player's sense of peripheral vision. In our everyday lives, peripheral vision is integral to figuring out how we can interact with a space. We use it to stay on the lookout for danger. There's even some theory that wide-eyed expressions of fear are a way for our bodies to take in more peripheral visual information and better ascertain possible threats. Lacking this central aspect of how we explore the world, our brains go into a panic.
So, even though walking simulators use the first person to make the player better empathize with the main character, they are unable to transcend the limitations of the genre as it currently exists: a game played in the first person cannot accurately capture the true first-person experience of the world, because it cannot replicate our peripheral vision.
First-person games put us into a character who essentially has tunnel vision. It doesn't help that we most commonly experience tunnel vision when we're hyped up on adrenaline due to fear. The first person in video games puts us on edge at a deeper level.
Rather than push against this instinctive response, games like Gone Home, Firewatch, and What Remains of Edith Finch lean into that fear, either on purpose or completely by accident, using that deep feeling to elevate their stories. We expect something terrifying to happen and, when it doesn't, the game prods the player, asking, "C'mon. What did you think would happen?"
better than i know myself
So what does that mean for future walking simulators and their ilk? I suspect (though I've been proven wrong before) that it will be hard for first-person games to ever shake the inherent feelings of fear that come with the territory. Until our brains no longer need peripheral vision, its disappearance will always carry a sense of panic and loss.
I wonder if people like M will change the genre. M's work with VR, augmented reality, and holograms lets us experience gaming in non-flat-screen contexts. What would a game look like if it could accurately replicate peripheral vision? What if it were placed atop the world as we see it (as in augmented reality games)? In those instances, I suspect we'll see a wider variety of stories and a wider variety of tropes played with.
However, that doesn't mean we have to wait. We'd do well to better recognize the medium through which we're enjoying games. Already, we have video games that recognize that they're video games ("meta games," as it were). That's a great jumping-off point, but what about games that recognize the specific cinematography of experiencing games on a screen? What I mean is this: right now, video games seem to view the fact that they have to exist on a screen as a limitation to work within. As with the piece I wrote about time in video games, the medium is seen as a constraint.
Right now, I get the feeling that video games consider themselves most closely tied to movies. Games are built to be cinematic experiences, but this ignores the distinct cinematography that games have. When you're watching a movie, you're aware of the space around you; as someone who plays games regularly, I know all too well the feeling of "falling into" a game. I don't play video games the same way I watch movies, and the two have very different effects. We may be ingesting both on similar-looking (or the exact same) pieces of technology, but that in no way makes them the same.
Just as M pointed out that cinematic camera angles don't translate to VR, the first person in video games is very different from a first-person view in film. But that's just a very small piece of a giant puzzle, one we haven't even begun to truly explore. And that metaphorical giant puzzle is incredibly exciting to me. It's a whole language of media that we haven't yet explored, because we've tried too hard to tie video games to other, existing forms of media.
Let's recognize that language and build it up. (It'll be equally fun to tear down and rebuild once we get too familiar with it.) Maybe we'll finally get a personal, empathetic, exploration game that doesn't also throw in a couple of jump scares because it feels like, deep down, it has to.