Call me a misty-eyed nostalgic, but for me, the "golden age" of video games is not 2008. Wasn't 2007, wasn't 2004, wasn't 1978. It was the 1990s. A decade in which, as a predominantly PC gamer, I was treated to not only some of the greatest games of all time from companies like LucasArts, Origin and Bullfrog, but some of the most entertaining attempts at storytelling and world-building as well. Forget Half-Life 2's train ride. Forget Halo 3's bombastic aspirations. I'm talking about the cinematic introduction sequence, and its little brother, the cinematic cutscene.
Maybe your experience during the time was different. Maybe it wasn't. Doesn't matter; you're not writing this, I am. And when I was a teenager, growing up amongst a friendship group that did little but eat, sleep and swap (err…) 3.5-inch floppies on the playground, games were rated according to two things. One, how good they were. You know. As a game. And two, how good their intros and cutscenes were. I kid you not. Games would live, and die, on the strength of their introduction sequences and cutscenes. Pre-rendered sequences, soundbites, hand-drawn animation, a rousing musical score: those were the hooks a game needed to grab our attention. Example: Star Wars' scrolling text, a milestone moment in movie history, would on its own have earned little but our scorn if LucasArts hadn't followed it in games like X-Wing and Dark Forces with some of the best animated sequences (for their time) this medium has ever seen. In effect, it was a creative arms race. One that would, just like a real arms race, see everyone spend too much time, energy and money on these sequences until they all but died off. But boy, before things got that bad, wasn't it just great? If you're nodding in agreement, let's walk down memory lane. If you don't know what the hell I'm talking about, start clicking on these clips.
What was so great about a bunch of animated sequences most gamers these days would happily skip? Two things. And really, you can lump the cutscenes and intros into the same basket for the most part, so I'll stop differentiating between them. Firstly, consider this: this was the 90s. Graphics were improving, yes, but they were still AWFUL. Your X-Wing didn't look like an X-Wing, it looked like an X-Wing soap box racer. You could count the pixels making up your X-Com employees on your own appendages. Basically, we still had this enormous gulf between what the games were MEANT to look like and what they actually looked like. Intros and cutscenes bridged that gap. We could see people's faces. We could see the world they lived in. Secondly: these were the days before quicksaves, before ridiculously easy difficulty settings. Many games were still brutal. Many were also repetitive. Taken on their own, these gruelling tests of physical and mental dexterity would, after the 20th re-attempt, stop being fun and start becoming a chore. Enter the cutscene. The carrot to gaming's stick. As an example, the original Command & Conquer's missions were limited in scope, and quickly grew repetitive. Same terrain, same units, mostly the same objectives. They'd be 30 minutes of mouse-clicking, moving some pixels around, and little more. That's not fun. But when we're imagining that we're controlling an army locked in a struggle for the future of the world? OK, that's some context. Some fantasy. And that context was propelled by the game's endless FMV cutscenes. Yeah, they were full of bad CGI and even worse acting, but as explained above, 10-15 years ago, they were also the shit. You'd force yourself through the missions to "unlock" your "reward", a 15-45 second piece of eye candy that, more so than the actual gameplay you'd just been engaged in, would be the most memorable part of the game, the part I'd be bleating excitedly about to my buddies the next day.
That "next day", however, was some time ago. Some time during the last century. In 2008, things have changed. While for some developers – Square Enix, Relic and Blizzard in particular – the idea of the glorious intro movie lives on, for most, it's been replaced with either a cheap "movie" consisting of in-game footage or abandoned in favour of an in-game introduction altogether. It's a similar story for the in-game cutscene. While cutscenes themselves live on, they are (outside again of a few developers, like EA on their C&C games, and platforms like the DS where static cutscenes are common), like intros, commonly reduced to in-engine stuff. Even the Halo series, one of the medium's biggest and most successful proponents of the cinematic cutscene, uses in-engine assets to drive its cutscenes. So if these story-driven indulgences were so damn great, how'd we reach a day where they became so damn rare? We asked a few developers to find out. Doug Lombardi, Valve:
For us, stopping the game to show a movie breaks the experience. To preserve the suspension of disbelief and attempt to keep the player immersed in the game, we've made it a constraint to keep all storytelling in the game engine, and almost always in real time while the player retains control of the camera. Remember how liberating it was to play HL1 and not have to smack the Space Bar so you could keep shooting headcrabs?
Lee Musgrave, Head of Art, Rare Software:
The main reason we tend not to infect our games with too much pre-rendered movie nonsense is that it jars the player out of the game world. Even if you use the same character models, BG geometry and textures in pre-rendered movies as those that exist in a real-time engine, things will look different. There is no doubt that pre-rendered movies have a significant ability to look more like real movies (some would say to look better), but this level of fidelity is not always the desired result when you are trying to create an overall universe for the player that is coherent and cohesive throughout their time in your game.
Ru Weerasuriya, Ready At Dawn:
We take so much care in spurring emotions through gameplay that we run the risk of detaching the player from their experience, especially by making them watch events unfold. Ultimately, interactive gameplay is a form of narration, as are movies. Although the two are not mutually exclusive, one can often diminish the power of the other. Our goal is to make the player always be the catalyst in the story, to play the action rather than watch it whenever possible, hence the reason behind reducing the number of cut-scenes and movies in our games. In addition, by using high-resolution cinematics many games tend to switch between two distinct visual styles, and that can be a detriment to retaining that same gameplay experience. To put it simply, how would you feel if a movie switched constantly between two looks and two different narratives as you were watching it?
So, there's a common thread throughout all three of those responses. One of disconnection from a storyline, from a character, from a world. But…wasn't that the point of the cutscene in the first place?
Sure was. Then again, while that was the point, the reason cutscenes were needed was that in-game graphics couldn't do a good enough job of depicting characters and worlds. Modern graphics can. I mean, when Relic can get away with using in-game character models in a cutscene in a real-time strategy game, you can understand why companies would be reluctant to invest time and money into an animated or pre-rendered sequence. Case closed. And as sad as that may be for misty-eyed nostalgics like myself, all that's left to do these days is look back on some classic animated sequences, adjust our rose-tinted glasses and soak up the craftsmanship. That or go play a Blizzard or Square Enix game.