As any summer moviegoer knows all too well, there’s nothing Hollywood likes more than a franchise capable of spawning sequels and tie-in merchandise. The glut of superhero, science fiction, horror and action series may please genre fans (in theory, if not always in practice), but their true admirers are the studios, who rely on them to prop up annual profits, as epitomized by the box office declines Sony Pictures experienced in the off years between “Spider-Man” releases. For the studios, sequels afford a lot less financial risk than stand-alone original films, equipped as they are with built-in audiences and recognizable stars and characters. And for fans, familiarity provides a level of comfort and a sense of knowing what they’re putting their money toward, an increasingly significant factor as the economy continues to tumble and the decision to drop a not-inconsiderable chunk of change at the ticket counter becomes a more considered one.
If movie producers and consumers both tend to find sequels a win-win proposition — despite the debatable quality of these cash-grab endeavors — then it’s no surprise that video game studios view the issue likewise. Since “Super Mario Bros. 2” hit the NES in 1988, gamemakers have been revisiting popular titles with a voracity usually only seen at Coney Island’s annual Nathan’s Hot Dog Eating Contest. Whether warranted or not, virtually every profitable video game eventually winds up receiving a Part II or, if enough time has passed, a “reboot.” Some of these take advantage of bigger budgets and newer next-gen consoles (like the N64’s peerless “The Legend of Zelda: Ocarina of Time”), and plenty of others do nothing more than offer minor gameplay and graphical enhancements that barely justify their $50-$60 price tag (I’m looking your way, “Madden” football).
In the past few years, the combination of hot big-ticket franchises, enormous budgets, tech advances and a more-is-more ethos has finally pushed video games’ “sequelitis” into a distinctly summer movie realm, with the marketplace flooded with — and increasingly defined by — A-list action series installments reminiscent of the slam-bang event pics of Michael Bay. “Halo 3,” “Gears of War 2,” “Metal Gear Solid 4,” “Resident Evil 5,” “Grand Theft Auto IV,” the forthcoming “God of War III” and Sony’s current blockbuster “Killzone 2” epitomize gaming’s new paradigm, in which established brands, bolstered by enormous sums of development money to create bigger, better versions of their predecessors, minimize the chance of losses by offering up a proven and (maybe) enhanced experience. It’s a strategy as old as the hills, and one that, with the qualified exception of the reportedly $100 million-budgeted “GTA IV,” which strove to push narrative and immersive boundaries, seems to be leading the industry down a slippery slope into monotonous regurgitation.
Take the PS3’s “Killzone 2,” the follow-up to 2004’s disappointing PlayStation 2 first-person shooter. It’s been hyped by Sony since the 2005 E3 trade show, when a jaw-dropping teaser trailer promised an evolutionary graphical step forward. Few gamers outright loved the original “Killzone,” but since it featured visually distinctive enemies — heavily armored soldiers with menacing, glowing green eyes — and since first-person shooters remain all the rage, Sony believed the title had the potential for greatness. And sure enough, “Killzone 2” (rumored to have cost $60 million) not only bests its precursor, but at times offers a superlative military-action rush, thrusting you onto massive, chaotic battlefields (à la “Call of Duty”) that thrillingly approximate what it might feel like to be a cog in an active war machine. The graphics are stunningly realistic, the scale is gigantic and, most important, there’s a sense of weighty tangibility to your movements across the corpse-strewn front lines, a heaviness of armor and guns and a resultant sluggishness of stride that lend a you-are-there element to the mayhem.