I challenge you to find a review of Death Stranding that doesn’t mention how cinematic it feels. Not only does the game employ beautiful perspective shots and dramatic exposition, it stars top-tier actors and directors, including Norman Reedus, Mads Mikkelsen and Guillermo del Toro. Their names are emblazoned across the screen multiple times, packed into the intro credits and appearing again with each character’s debut.
As is usually the case in video game criticism, the comparison to film is presented as a positive feature, driving home the idea that Death Stranding is just as powerful and necessary as any Academy Award nominee. As if Hollywood prestige were the highest bar for artistic criticism. As if, by emulating a movie, a video game earns real value.
By this metric, Death Stranding is a good game because it looks like a good movie, and Kojima is a talented filmmaker.
But is he a good game designer?
Well, yes. The accomplishments of Kojima’s games are varied, numerous and genre-defining. He’s responsible for inviting artistic criticism into a realm of entertainment that’s been historically excluded from mainstream conversations, and his games have plenty to say. Even when those words are coming from the mouth of a rechargeable baby floating inside an artificial womb strapped to the chest of a superpowered delivery man.
However, the conversation around the prestige of AAA games has clear parameters, many of which have been set outside of the video game industry. Red Dead Redemption 2, ostensibly the top game of 2018, was lauded for its cinematography and acting; similar praise has been applied to God of War, The Last of Us, Horizon Zero Dawn, and most major-label games of the past few generations. Discussions about these games in art or academia often adhere to standards set by Hollywood insiders. When we talk about video games as important, artistic or socially valuable, film is the gold standard.
There’s good reason for this. The American film industry provides a template for video games, dictating how these experiences will be received in art, education and public consciousness. Essentially, film has been here before.
It took decades for film to be widely viewed as legitimate artistic expression. Citizen Kane, the movie that set the standards of visual storytelling, came out in 1941, but film classes weren’t offered in most American universities until the mid-1970s. Before that time, movies — especially domestic ones — were viewed by the establishment as less intelligent, and therefore less valuable, than literature.
Time marched on. Incoming generations of students grew up with theaters, foreign movies and film critiques in The Village Voice, and they naturally understood the power of the medium. Universities eventually caught up with their students and started examining film, presenting it on equal footing with novels and other visual arts. Today, even digitally distributed television shows are extended the same respect.
Nearly 50 years later, this adoption process has transferred to video games. Most universities nowadays offer some kind of video game program, including classes that dissect the industry with a critical eye. However, even just four years ago, schools like the Art Center College of Design in California didn’t offer dedicated gaming tracks, and many of their professors were baffled by digital, interactive forms of art. In film-history terms, video games are just now leaving the 1970s.
It makes sense, then, that video games would use film as a guide. Movies are similarly visual, and they’ve successfully infiltrated the bubbles of art, academia and mainstream consumption. The quickest and clearest way to ensure video games are viewed equally is to emulate film.