Gaming Is Hollywood’s Playground Now

The filmmaking juggernauts’ interest in the medium is growing.

How did a figure as inseparable from the world of cinema as Keanu Reeves find his way into video games? It’s not entirely unprecedented for live-action actors to take up the responsibility of portraying a virtual character on-screen, but something about his profile as an unrivaled action star and his legendary status in pop culture made this feel unlike anything we’d seen before. To understand how we got here, we must dig back a little into the technical roots of recent console hardware, and into one major console manufacturer’s pivotal role in making narrative-focused, personality-led games the new hot streak.

The gaming medium had historically drawn massive loads of scrutiny because it couldn’t quite match the fidelity its creators were aiming for. Processing budgets were very limited, and with the sixth generation of consoles (PlayStation 2, GameCube and Xbox), we were only starting to glimpse what a revolution in graphics might look like. Sony boasted that its console’s graphical prowess compared very favorably to Toy Story, which for a movie that took thousands of hours to render on very capable machines was quite a bold claim. That, as is commonly accepted, didn’t turn out to be the case, but it was exactly the kind of aspirational marketing tomfoolery and self-reassuring technical arrogance Sony needed to further push the envelope on what was possible with limited resources, and it would come to define the parameters of the medium for years to follow.

There’s no inherent reason why Sony has historically hoarded the bragging rights to democratizing a narrative-focused brand of interactive storytelling on its platform. Microsoft had tried to blur the lines between live-action presentation and gameplay, but its experiment was mostly met with a collective shrug despite how endearing and well-crafted it was. The PC platform, meanwhile, doesn’t lend itself well to publishers committing their sole attention to it, and when that does happen, the commercial pull of consoles generally hinders their creative output. So when all other doors shut as this revitalization of the medium tried to gain a footing, Sony stepped up to the plate and threw loads of cash at its first-party developers to reinvent the wheel. They deserve much of the credit for the success of interactive cinematic storytelling, so much so that it was enough to convince Keanu Reeves that putting his name next to it was a good idea.

The road to that world was anything but rosy: PlayStation first-party developers were met with numerous challenges trying to optimize for the platform, as the PlayStation 3’s Cell processor proved fickle to deal with. Guerrilla Games — developers of the incredible Horizon Zero Dawn — were among the first to aim for a cinematic presentation when their Killzone 2 demo was shown at E3 2005 to enthusiastic, yet skeptical praise. Killzone failed to match expectations, and the whole industry has had to collectively reckon with false advertising claims and overambitious render targets ever since. All the while, a certain madman was gearing up for a high-profile release on the platform that would be among the first to carry the trappings of Hollywood-infused cinematic storytelling, even if occasionally built on the most pretentious of notions.

David Cage, a controversial figure in the gaming world, was arguably the godfather of cinematic storytelling in video games. That’s not to say no one before him had ever attempted this high-art (at least visually), self-aggrandizing, surreal mode of presentation in a medium that’s traditionally been about instant feedback, but he was undeniably a prominent figure in making that switch happen in Western markets, right where professional cachet matters most.

His first project worthy of the Hollywood name was Beyond: Two Souls, and while there are still extra-textual conclusions to be drawn from its handling of mature themes after launch, the game included a cast of characters that was not exactly an unknown quantity. Ellen Page headlined the project, with Willem Dafoe a close second as a mentor figure. Many debate the artistic merits of David Cage’s convoluted mental trappings rooted in problematic tropes, but few can argue with the cascading effect the game had in emboldening future developers to embrace cinema as a useful and reinvigorating frame for their stories. Fellow Sony first-party studios started to take notice, and one of them would soon deliver a knockout punch to any doubts about whether this format could succeed in the long term.

Enter The Last of Us. Naughty Dog had already seasoned itself in the art of making groundbreaking narratives for its much-beloved character Nathan Drake in the Uncharted series. It then took the talent and expertise amassed from an incredible sequel, Uncharted 2: Among Thieves, straight into what was at the time a well-worn story frame — one that many suspected would meet the same fate as The Walking Dead, Dead Island, Dead Rising, Dead-anything, and the plethora of zombie games like them.

The Last of Us was unlike any game I’d seen before. It took the emotional core of effective storytelling in video games and infused it with its own brand of endearing character moments between Joel — played by Troy Baker — and Ellie — played by Ashley Johnson — who melded over a dozen hours of absolute synergy and near-telepathic charisma. Advancements in motion capture technology made possible, as the game geared up for production, what would have been a pipe dream even five years prior to release. The characters’ emotions came across convincingly, and the whole story felt like the perfect example of what happens when interactivity is leveraged to elicit genuine emotion.

The emotionally resonant core of video games is what makes for solid fanfare. Search for the game on Twitch or YouTube and you’ll be greeted by thousands of walkthroughs from popular game streamers, wherever their interests may lie. The game struck such a deep chord with so many players that its cultural impact far exceeded the textual content of the game. Its fanfiction scene still brews with interest, and there’s no end in sight to the excitement surrounding the IP as Naughty Dog gears up for the release of the sequel next year.

The Last of Us is one of the first reliably documented instances of players forming attachments not to systems of gameplay so much as to characters and story. The game was nothing but a conduit for emotion, and the solid framework of interactivity surrounding it was only the icing on the cake of what was already an accomplished experience from a narrative standpoint. This set the stage for an emerging genre of games in which gameplay is a secondary interest, and while many of them veer away from cinematic framing by sheer virtue of it being costly to produce, the format continued to come from many of the same staples, with a newcomer throwing their hat in the ring every once in a while with their own spin on it.

Three years after the release of the Xbox One and PlayStation 4, two games — one from publisher Microsoft, one from Sony — were closely competing over who would nail the overlap between live-action entertainment and interactive gameplay. Uncharted 4: A Thief’s End was somewhat predictably going to be a blockbuster hit, but Quantum Break was the product of a troubled development, as Microsoft shifted its strategy away from entertainment and back into gaming. Remedy was essentially forced to ship lest it close its doors, having already pushed the release of its very expensive project further than it could comfortably afford.

Quantum Break was a monumental task for Remedy Entertainment. It had to present its gameplay in a convincing photorealistic package, infuse it with action elements you don’t typically see in games of its genre, and make the companion TV show a passable watch between the main playable portions of the story. The cast was star-studded: Aidan Gillen of Game of Thrones played the villain, Shawn Ashmore, who played Iceman in the X-Men films, was the protagonist, and Lance Reddick of John Wick — no stranger to video games — served as a great contrast to Gillen’s vulnerable, perturbed, yet perfectly composed antagonist. There was no shortage of on-screen talent on display, but the game didn’t achieve the commercial expectations its publisher and developers were hoping for.

For me as a passionate gamer, it was a triumphant display of what the company had been aspiring to since its humble beginnings in the crime noir genre, but for many, Quantum Break was the perfect example of what goes wrong when ambitions exceed means and commitment. The game was received modestly well, but not well enough to sell gaming culture on the dream of live-action weaving seamlessly into gameplay. The experiment was sound, but it taught us a lot about how video games shine best on their own merits, rather than when contrasted against the feats of a different medium.

Uncharted 4, which came out only a month later to bookend the Nathan Drake saga, was a potent, poetic answer to Quantum Break’s failure as a live-action/gameplay hybrid. The game retained the emotional core of what players love so much about video games, peppered with astoundingly beautiful panoramas and great cinematography, all while delivering the breadth of interactivity players have traditionally come to enjoy from the medium.

Contrary to past efforts by Sony and Microsoft in the same vein, Uncharted 4 had the distinction of not relying on Hollywood talent as much. In The Last of Us, the presence of Troy Baker — the quintessential voice actor — was offset by Ashley Johnson’s sensibilities for the camera, but Uncharted 4’s Nathan Drake was the culmination of years of acting for Nolan North — another quintessential voice actor — cementing him not as yet another male video game protagonist, but as a true icon standing boldly in the pantheon of the greatest video game characters of all time. The game was particularly special in that regard because it favored traditional voice acting talent over Hollywood clout. Nolan North, Troy Baker, Emily Rose and Richard McGonagle are hardly star-power-loaded actors, but their presence made for an interesting mixture of the sensibilities required for traditional voice acting and the wealth of performance capture necessary to make this one of the most captivating interactive cinematic experiences in gaming history.

Yuri Lowenthal as Peter Parker aka Spider-Man, Christopher Judge as Kratos, and Bryan Dechart as Connor. Courtesy of Insomniac Games, Sony Santa Monica, and Quantic Dream. Published by Sony Interactive Entertainment.

Sony’s 2018 slate of video games achieved similar feats. Detroit: Become Human — while noticeably less reliant on Hollywood talent than Beyond: Two Souls — was an interesting experiment in how cyberpunk stories can be leveraged to dig into questions of identity, racially-coded societal discord, and the language of political polarization, even if it can be crassly racist at times. God of War showed that an artistic commitment to a cinematic technique as novel as the single-shot camera could pay off in major ways; with Stargate actor Christopher Judge newly assuming the role of Kratos, the game in many ways reached into what Naughty Dog and Quantic Dream had tapped before, as the series ditched its arcade-y, over-the-top roots and sailed into the uncharted waters of realism and believability. And Spider-Man marked the first occasion on which the surreal-graphics-enamored Insomniac Games tried to make a human story whose pillars are human emotions, aided by the latest in motion capture technology and a stern focus on photorealistic depictions of Manhattan’s concrete cityscapes over the simplified environments of something rooted in playful fantasy, such as Ratchet & Clank.

This signaled a shift for the entire video game industry. Developers who once made pale copies of real-life figures, unable to effectively transmit glee or sorrow through in-engine rendering, can now deliver Hollywood-adjacent performances by drawing on talent from the film industry, or by elevating voice acting staples into the shoes of performance capture. Where once only a Benedict Cumberbatch or an Andy Serkis was assigned such peculiar roles, traditional actors are now actively considering video game roles, frowned upon as they were in the past.

Mads Mikkelsen — who played Dr. Hannibal Lecter in the hit NBC show ‘Hannibal’ and featured as a villain to Doctor Strange in the self-titled movie — took up the role of a vaguely-threatening figure in the upcoming Death Stranding, and the trailer showed stunning recreations of his likeness alongside a fairly convincing transposition of his model into the game world. The subtleties of his performance are virtually unimpeded by the PlayStation 4’s technical capabilities, and the rest of the cast is masterfully recreated on-screen with remarkable accuracy. Very recently, Sony released an eight-minute trailer that did a lot to dispel lingering concerns about the host hardware’s ability to portray such a visually complex, performance-rich world without fault.

The same can be said of Cameron Monaghan — a recurring cast member of the hit Showtime series ‘Shameless’ and the Joker in FOX’s Gotham — who was picked to play Cal Kestis, the protagonist of a new entry in the Star Wars universe courtesy of Respawn Entertainment. The game was revealed to widespread praise, and EA very recently showcased gameplay footage in which mo-capped cutscenes blended perfectly with a mishmash of Bloodborne-style combat and Assassin’s Creed-style traversal. Monaghan’s performance came through with next to no disturbances, and it further bolstered the presentation of a game that had already benefited from engaging gameplay mechanics, unlike its sister franchise, ‘Battlefront’. It was an indisputable win for EA, which had been struggling to live up to the Star Wars name, and proof that live-action talent in the hands of a masterful developer can unlock great creative potential.

It is because of all this history of technical and creative triumphs in the realm of cinematic storytelling that it’s not entirely inconceivable that one of the most celebrated actors of the modern era, Keanu Reeves — best known as the lead of the Matrix and John Wick franchises — would walk onto the E3 stage where a basketball player or pro wrestler might once have stood, to announce he has a major role in the most anticipated game of the show. Video games have now started to penetrate the common fabric of modern society, agnostic of cultural provenance, wealth, or social status. If you can’t buy any of these games, chances are — unless local infrastructure is excruciatingly bad — you’re within safe distance of internet access where you can watch any of the games announced, at no extra charge, through YouTube or Twitch.

This is markedly more accessible than film and TV ever were. Whereas a movie typically starts rotating on free-to-air channels several years after release, and a TV show carries its heaviest cultural weight while airing through a premium subscription of some kind, video games are the only medium where you get to enjoy these great stories at no cost beyond internet access fees.

That inherent accessibility, and the increasingly vibrant market space it has created for smaller indie experiments while AAA games ride the high wave of growing recognition, is why the vanguards of old media are finally starting to loosen their monopolizing grip on entertainment. Now that Keanu Reeves is part of a video game franchise, the doors are blown wide open for all sorts of talent to come in and take part in the spoils, rather than automatically turning down an opportunity once considered beneath them.