Nearly a decade after “The Blair Witch Project” brought camcorder shakiness to the masses, first-person horror once again took center screen in 2008 courtesy of three releases, two of them zombified: January’s monster mash “Cloverfield,” February’s George A. Romero installment “Diary of the Dead” and October’s rabies-crazed “Quarantine.” Given our YouTubing, cell-phone-vid culture, it’s an unsurprising cinematic trend, and one that seems particularly suited to horror films, where a fixed perspective can be easily manipulated for scares and is capable of creating a sense of immediate, frightened involvement for viewers. Well, that’s what it does in theory — these films didn’t all use their shared viewpoint in truly terrifying ways.
I suspect the reason for this common failure has less to do with any filmmaker’s skill than with the limitations of the approach itself. Despite their gripping, thematically astute you-are-there conceits, features like “Cloverfield” and “Quarantine” let audiences enter the fictional world only through side characters who, for uniformly contrived reasons, have chosen to meet life-or-death situations that demand action with voyeuristic passivity. As an audience member, you’re somewhat immersed in what’s going on, but the artifice doesn’t hold, weighed down as it is by nagging questions about our on-screen surrogate’s behavior — Why is this dolt still filming? How is his camera always pointed in the right direction? Why doesn’t someone smash the twit’s recording device to smithereens? — that undermine the central illusion. We’re there, but far too often it feels like we’re not.
Many of these obstacles are absent in the gaming arena, where “survival horror,” a lucrative subgenre spawned by 1996’s PlayStation blockbuster “Resident Evil,” has regularly employed, to decidedly unsettling effect, first-person P.O.V. “Doom 3,” “Condemned: Criminal Origins” and “F.E.A.R.” are a few of the plentiful titles that have transposed action and horror movie tropes to a first-person-shooter realm where engagement with the proceedings is direct and active. This tactic has been further refined by Valve’s recent Xbox 360 and PC hit “Left 4 Dead.” In the bestseller (currently fifth on the domestic console sales charts), you’re presented with four straightforward, narrative-free cinematic zombie apocalypse campaigns, each one leaving the player, along with three human- or A.I.-controlled cohorts, to reach safety by navigating an environment (airport, downtown city, countryside or farm) overrun by flesh-eating undead whose fleetness recalls that of “28 Days Later”’s hungry monsters.
Between “Left 4 Dead”’s extended opening cut-scene, the corny film posters that kick off each scenario — the airport level is dubbed “Dead Air” and boasts the tagline “Their flight just got delayed. Permanently.” — and the requirement that players assume the role of one of four bedrock genre stereotypes (biker, tough chick, old war vet or businessman), the game overtly attempts to place players in a familiar big-screen (un)reality. But unlike its P.O.V. zombie brethren “Diary of the Dead” and “Quarantine,” “Left 4 Dead” creates no impression of detachment; instead, it relies primarily on sporadically ambushing the player from all directions with massive swarms of the undead to produce a heady blast of panic, fear and excitement.
The game’s visceral immersion is amplified considerably by its multiplayer design. “Left 4 Dead” is meant to be played cooperatively with friends (either online or at home) and triumphantly delivers an in-the-foxhole experience when you’re sitting alongside living, breathing comrades you can verbally strategize with or scream at. Drawing on pistols, shotguns, machine guns and Molotov cocktails to blast your way through hordes of zombies while barking commands at, and requesting help from, fellow survivors — or, in the online Versus mode, assuming the role of a bloodthirsty reanimated ghoul — is as rousing an approximation of what it might actually be like to endure an outbreak of the undead as you’ll currently find. And in terms of urgent, frantic kicks, “Left 4 Dead” significantly outpaces its cinematic counterparts, which can’t help but offer a disconnected view of mayhem, one that typically involves irrational idiots stumbling upon, and then (as repeatedly occurs in the three aforementioned films) hysterically fleeing from, supernatural incidents, bouncy cameras in tow.
So games are just creepier than films, especially when they feature first-person perspectives, right? Well, not exactly. “Left 4 Dead” provides an initial anxious high, yet — as with the cheap jolt tactics of “Doom 3” and the hallucinatory dread of “Condemned” and “F.E.A.R.” — it soon wears thin, due mostly to a problem shared by many of its celluloid equivalents: repetition. While modern horror games frequently generate greater terror than films simply because of players’ direct relationship to the action (and their power to dictate its pace), they’re nonetheless plagued by the same brand of monotonous predictability born of preprogramming. Spend longer than an hour with “Left 4 Dead” and you’ll likely turn numb to its gameplay rhythms and beats, which prove as telegraphed and homogeneous as an ’80s slasher flick’s fatal money shots. These P.O.V. adventures offer encompassing sensory/participatory thrill rides, but those thrills remain narrow, and as dependent as they’ve always been on design and narrative ingenuity — far rarer qualities that have always been responsible for any horror show’s ability to scare and keep scaring.
The Sandbox, a column about the intersection of film and gaming, runs biweekly.
[Additional photo: “Left 4 Dead,” Valve, 2008]