Den of Geek published a piece Tuesday entitled “The Numbing Ubiquity of Computer Graphics.” Its thesis is one that I’ve personally held for a while: that the better and more widespread computer-generated effects get, the less interesting they become. As Ryan Lambie writes:
“Twenty or 30 years ago, even the tiniest glimpse of a computer-generated effect had an almost magical air of futuristic novelty about it… And yet, since the advent of a holy trinity of groundbreaking movies in the 90s, namely, ‘Terminator 2,’ ‘Jurassic Park’ and ‘The Matrix,’ it has become increasingly difficult to get particularly worked up about special effects of any kind. Audiences may have cooed and gasped over the imagery of ‘Avatar’ and ‘Inception,’ but we’ve now become so numbed by such visual flights of fancy, whether they’re in films or adverts, that they appear to be set to a side almost as quickly as we’ve seen them.”
In other words, the story of special effects in movies is the story of Christopher Nolan’s “Batman”: escalation. Advancements in technology can reap huge benefits for filmmakers and for studios, but as we see in “The Dark Knight,” escalation always comes at a price. The shelf life of an effect’s impact keeps growing shorter and shorter, and even as filmmakers face competition from others trying to outdo them, they always need to be prepared to outdo themselves as well.
This pressure exposes one of the biggest flaws in modern Hollywood’s filmmaking model, which is built around a steady supply of franchises and sequels. But escalation doesn’t sit well with sequels, since sequels are, by their nature, more of the same, and more of the same in the realm of special effects is simply not good enough. So while directors of sequels need to satiate returning audiences’ desire to re-experience what they loved about a first film, they also need to show them something they’ve never seen before. Accomplishing both simultaneously is nearly impossible. If they don’t do the former, they’re told they forgot what made the first film great, and if they fail at the latter, they’re told they made something too much like the first film. It’s a lose-lose scenario.
Take, for example, this year’s “Iron Man 2.” The first “Iron Man” was a surprise hit for two reasons, in my opinion: 1) Robert Downey Jr. at his charismatic best and 2) Iron Man looked cool. Both of these reasons carried with them an element of surprise: after years of problems with the law, many people had forgotten Downey’s charms, and Iron Man was a character a lot of people were unfamiliar with, and director Jon Favreau made meeting him a fun, visually stimulating experience. But a lot of the first “Iron Man” is simply the pleasure of Downey goofing off in his lab with the armor, and impressing us with the effects’ ability to convince us that he can fly. For the second film, Favreau had to top himself. And talk about escalation: “Iron Man 2” had more armored heroes, more armored villains, more non-armored heroes, and more supporting characters. What worked so well in the first film was basically untenable in a sequel (and will be even harder to recreate a third time, perhaps part of the reason Favreau just decided not to direct “Iron Man 3”).
Lambie does see an upside, and that’s the availability to independent filmmakers of equipment and software that was previously prohibitively expensive:
“The fact that it’s now comparatively cheap to create CG effects means that new filmmakers can let their imaginations run riot on a tiny budget. For evidence, look no further than Gareth Edwards’ ‘Monsters,’ a film created with little more than two professional actors, one Sony camera and a copy of 3DSMax. As Edwards put it in a recent interview, ‘You can go into a shop now and buy a laptop that’s faster than the computers they used to make “Jurassic Park.”’”
He has a point, but this can be a double-edged sword for filmmakers too. As I wrote last week, “Monsters” is a remarkable technical achievement, but as Lambie points out, there are two new remarkable technical achievements in multiplexes every Friday. That’s not enough anymore. Lambie hopes that CGI’s decreasing emotional returns will force directors to reinvest themselves in storytelling. It’s a nice thought, but it feels like a pipe dream. We’ve already seen how lowering the bar to entry is encouraging more and more people to make their own films, and more and more special effects artists, like Edwards and the Brothers Strause of “Skyline,” are taking the reins of their own productions. Democratization can be thrilling and maddening: more good voices, and more bad ones too.
The future I hope to see is one populated by filmmakers like David Fincher, who can use CGI as shock and awe (see the opening sequence of “Fight Club”) or as a spy tactic (see the taxi cab sequence in “Zodiac” — or rather, try to see it, because the effects are so perfect you have no idea you’re looking at a soundstage instead of a street corner). To my mind, Fincher’s work is the best example we currently have of Lambie’s vision: a director who uses computer images as just another tool in his toolbox, no more or less important than composition, framing, mise-en-scène, costumes, or lighting. When a director like Fincher integrates digital magic into his films, he does it so seamlessly that we stop looking for the seams at all and return our focus where it belongs: back to the film itself. The effect of that process can be quite special in its own right.