In this super-sized edition of Speak Up on Kotaku, commenter DocSeuss tells us the difference between being immersed and being engrossed, and explains why he believes the future of gaming depends on games that submerge us in fantasy worlds. You might want to bring a snack.
I read a forum thread somewhere recently—I want to say NeoGAF, but I can't find it 'cause my registration's pending so I can't access search—that talked a bit about words and concepts we'd like to see removed from gaming. It was a pretty fascinating topic, and I was happy to see that the used-to-the-point-of-meaninglessness word "visceral" and the anti-game "cinematic" were frequently cited. It was perfect timing, then, for Kirk to post an article highlighting a video arguing against the use of the term "immersion" in video games the next day.
I disagreed rather vehemently. I still do, which is why I've spent several hours (as opposed to my usual twenty minutes) preparing a response.
Before I get into this, I must warn you that I might be somewhat harsh on Mr. Abraham and those who agree with him. He's gotten so much fluffy praise from people who consider themselves to be on the forefront of games criticism (a field which, from what I've read, is incredibly circlejerky and not nearly as knowledgeable on the subject as it thinks it is) that I think some harshness is in order.
Anyone who believes that "immersion" is a term that should not apply to gaming, or that ideas involving immersive design should be removed from video games, is frighteningly wrong. Not only that, but arguments that "immersion" is a bad term, or that games should not be made with immersion in mind, are as dangerous to the medium as attempts to ban it.
Guess I should back myself up, huh?
I'll be covering two main points, because it appears that these guys either fail to understand what immersion means or genuinely want the concept of immersion to die.
Let's start with the English language.
Okay, so, first things first, a little English language primer (thanks to squibsforsquid's responses to my initial response to Abraham's video):
The English language is incredibly nuanced. Words that seem to be identical to each other can actually have subtly different meanings that aren't covered by others. "Immerse/Immersed/Immersion" is a great example of this. A simple dictionary lookup reveals it to be something along the lines of "engrossed" or "attention-grabbing," but if that were the case, then one would wonder why similar words and phrases would not suffice. Why do "immerse" and its various forms exist?
The answer lies in its other definition: to be submerged entirely in a body of water.
Imagine, if you will, that the English language is all the food in a grocery store. Words like "engrossed" and "immersed" are like varieties of lettuce. Sure, you might think that iceberg and romaine lettuce are both leafy green veggies, so they can be used interchangeably, but nothing could be further from the truth: romaine has a radically different texture and moisture content from iceberg (I prefer the darker, bitter taste of romaine, personally, but some people like the cool crunchiness of iceberg).
An English-language example of this would be the substitution of "good" for the word "like." What we like is something inherently personal and subjective—it's something that matches up to our own personal standards of enjoyment. What is good is something that compares favorably to set standards—usually ones external to us, like cultural standards. Saying something is "good" does not inherently mean that we like it; likewise, saying that we "like" something does not necessarily mean that it is a good thing.
Similar terms are not identical ones.
Immersion isn't simply "paying a lot of attention to a thing." There's more nuance to it than that. Merriam-Webster's example, "We were surprised by his complete immersion in the culture of the island," hints at a level of integration into something. When someone says "he was immersed in the water," they're not talking about being engrossed with water; they're talking about going under.
The people who first used the term "immersion" when applied to game design didn't choose the word lightly. There's a reason that the immersive sim genre of video games is called the immersive sim and not "engrossing games" or something else. The word's unique texture within English makes it uniquely suited to discussing an element of video games that other mediums don't have (you can pay attention to any medium; you can only be immersed in something interactive).
Any game can be engrossing—Tetris is engrossing, for instance—but few games can be truly immersive. Few games can make their players a part of the world within them.
This is an important point, because immersion, in this sense, is something that's entirely unique to video games. Nothing—no movie, no play, no book—can be truly immersive the way a video game can be.
Basically, to sum things up so far, "immersion" is a term that isn't always used correctly. When referring merely to the act of being deeply involved in a game, yes, immersion is an improper term, but we should not remove it from our gaming lexicon entirely, because it's a term that accurately describes one of the primary elements of what separates video games from other entertainment mediums.
Where am I getting this from, you ask?
Right, so, let's jump back to 1974. Gary Gygax and Dave Arneson (sorry, Dave, but while you take alphabetical precedence, Gary wins for having alliteration and an x in his name, which just makes him cooler) created this game called Dungeons & Dragons.
It was a role-playing game.
I'm not talking about stat-based adventure JRPG stuff, either. I'm talking about a true role-playing game (speaking of role-play, there's another thing that will confuse you if you try to find a dictionary definition—understanding the use of the word, specifically regarding its origins and relationship to improvisational theatre, is key to understanding what is and isn't a role-playing game). Basically, they created an instruction set for how to role-play.
The goal was to empower players to have adventures in worlds of their own creation, a radical departure from other games (sports, Milton Bradley-style board games, etc). At the same time, it wasn't a performance thing, like theater. It was just "hey, let's explore a world!"
The rules behind D&D served the purpose of making sure players didn't become overpowered or do absurd things. You don't actually need a turn-based system, stat points, party members, and so on and so forth to have an RPG; those things just make the game a bit easier for a GM to handle.
Jumping forward a bit, we hit 1981 and two games, Ultima and Wizardry. They were effectively the birth of the video game RPG; other games had preceded them (I once read that a computer game called DnD showed up in 1975), but these two were the watershed moment. Ultima and Wizardry used the incredibly limited technology of the time to try to emulate the RPG experience.
A necessary digression: when Japanese developer Yuji Horii saw Wizardry for the first time, he got really excited by the prospect, and, apparently being unaware of the purpose of Wizardry's mechanics, cloned a lot of the ideas and created Dragon Quest, the game from which all JRPGs since have descended. Most of the time, things don't work out quite this well and new genres aren't created, but in the JRPG's case, things worked because Horii is a boss. The lesson here is that you shouldn't go creating a game unless you understand why the mechanics behind it exist. This is also the reason why regenerating health is used in a lot of games it has no business being in.
While the JRPG gained popularity and became its own thing (and confused a bunch of people as to what the RPG actually is), Western devs were still quietly making their own RPGs, but with added computer power. Instead of making turn-based, top-down games with various battle systems, they were focusing on evolving the genre, making it distinct even from the pen and paper games which had birthed it, while at the same time, keeping the spirit of the RPG intact.
Now, I should point out that video game RPGs are still absurdly limited! Computers cannot improvise the way that GMs can. That said, there are some areas where they excel... and that's where Looking Glass comes in.
If you understand one thing about the history of video games, it should be that no game studio on the planet will ever be more important than Looking Glass Studios was. These guys pioneered first-person games, sandbox games (what, you thought Shenmue or GTAIII was the first sandbox game?), flight simulation (when they died, the flight sim industry died), stealth games, and a bunch of other stuff. Their employees have gone off to help invent the Xbox (forever transforming the gaming landscape and eliminating Japan's stranglehold on the console industry), work on Guitar Hero and Rock Band, revitalize The Elder Scrolls (heavy immersive elements in those games), create Deus Ex, work for Valve, and so on and so forth.
Oh, and one of the first games they ever made was Madden, so there's that.
Perhaps their most important contribution to game design, however, was immersion.
The Looking Glass guys, in the early 90s, had a revelation: they could use simulation elements to add new life to their worlds! From this, the immersive sim was born.
Basically, you take that core idea behind role-play (I want to be someone in another world) and use computers to create a world players can interact with. That's really all there is to it. You make the game in first-person, to reiterate the fact that the player is his or her character. You create levels that feel like real spaces, then populate them with complex AI that can do more than just fight. If you can, you try to throw in elements like physics, good graphics, a high degree of interactivity, and so on and so forth. You also cut down as many abstractions as possible (abstractions, in a game context, are basically just mechanics that provide a simpler way of approaching real-life ideas—such as turn-based gameplay when a computer can't handle a real-time approach).
What we've found is that immersive games, provided they are easy enough to get into (Deus Ex, for instance, inundates players with information in its training level and then throws them into the deep end with Liberty Island; this is a bad way to do things), actually have a huge draw and significant lasting appeal. Some recent examples of immersive games include STALKER (more than 4 million units sold—not bad for a Ukrainian studio with next to no marketing), Fallout 3, and Skyrim. Other games, like Assassin's Creed and Dark Souls, use immersive elements to enhance their experience.
People love these games. They love being able to enter a new world and interact with it. They love emergent gameplay—why else do you think GTA is such a popular series? Skyrim was successful because it facilitated exploration. Crysis was unique because it allowed deeper physical interaction with the world. STALKER's advanced AI and player needs (eating, for instance) helped its players sink completely into the role of the amnesiac Marked One.
Far Cry 2, flawed as it was, got the love it got because it let players treat the world as an actual world. Yesterday, I read about someone who stacked up cars in Far Cry 2, blew them up, set fire to a field, caused the base he was attacking to catch on fire (which burned some of his enemies alive and confused others), and then walked in and took what he needed without anyone realizing he was there.
(I realize that I could probably write an entire essay on the power of emergent gameplay and why Dwarf Fortress and STALKER are the greatest games ever made, but I've got enough stuff to talk about as it is).
Immersion is the future of video games.
I realize that "the future of video games" is a phrase that gets used a lot, primarily to describe whatever trend is currently popular (Facebook games, iOS games, casual games, motion control, you name it), but I'm using it in a slightly different context: I'm talking about progress.
Most people don't really think about future advances in tech. What can Kinect really do for us? What does Goal-Oriented Action Planning AI do to enhance video games? What does procedural generation mean for video games? How does the RPG fit in with all this? What can we do with interactivity, that sacred ideal that elevates video games beyond all other mediums by eliminating passivity?
The people arguing that games shouldn't be immersive are as ignorant as the people who argue that Role-Playing Games are nothing more than stat-based adventures. These people want to hold the industry back—to keep it at some larval stage where they're most comfortable. Maybe it's out of fear (after all, I don't doubt that bards objected strongly to novels, nor do I doubt that novelists objected strongly to the medium of film), or maybe they just... really enjoy stat-based adventure games or strategy titles or what have you (I know I do!); I don't really know their motives.
What I do know is that they're trying to fight human nature.
Don't believe me?
Let's go back to the beginning.
The Epic of Gilgamesh is one of humanity's oldest surviving works of fiction. It's a massive adventure story. Fast-forward to ancient Greece and Homer; note the vast influence of his works (basically all of Western fiction owes its existence to Homer and Plato/Aristotle/Socrates). Jump ahead even further, and take a gander at the increasing believability of fiction (Shakespeare, particularly), as well as the increasing accessibility of entertainment. Check out how the integration of music and storytelling in the 1500s led to the birth of the opera. Pay attention to the rise of global exploration during the Renaissance, as well as the scientific leaps and bounds made by a formerly-repressed society. Study the emergence of 19th century literary criticism, as well as the explosive popularity of novels. Read up on the birth of film, radio, television, comics, and their subsequent popularity.
What do these all have in common?
Well, I was hoping to have a word for you, but I don't. Curiosity, maybe? Discovery? Newness? Escapism? None of these really quite sum up what I'm trying to get at, so I'll put it like this: people only enjoy the mundane so much. At some point, every single one of us is going to seek out new experiences. We crave new sensations. We savor them. Experiencing the new is one of the primary motivating factors of human existence.
Humanity, as a whole, has a fascination with the new. When we look back at fiction, we can observe humanity's fascination with the idea of exploring other worlds. C.S. Lewis's Narnia adventures cover this. Lev Grossman's The Magicians explores it too (fun fact: his brother apparently worked at Looking Glass). Fantasy and science fiction stories sell like crazy. There's a reason that films like The Girl With The Dragon Tattoo didn't do nearly as well as Avatar. One is mundane. The other is not.
The fact of the matter is that we, the human race, are a bunch of insatiably curious creatures who constantly desire new experiences. Discovery is humanity's raison d'etre (oh yeah, I can be just as pretentious as the self-styled game critics; if I say it in another language, does that make it profound?).
So what's the future going to be like?
We are creatures driven by discovery. Why do you think Skyrim did so well? Why do you think New Vegas failed? The former facilitated discovery and exploration; the latter was too focused on being a good RPG to care about the world it had created.
The future of games is going to capitalize on this. Arguing that we should eliminate the concept of immersion in games, that the immersive sim should be dead, or anything else along similar lines, is like arguing that we shouldn't have voice acting and ought to stick with scrolling text. It is an argument that says "games should not be more than they already are!"
Modder Robert Yang may consider immersion to be a fallacy, but he's mistaken: the future of video games really is the holodeck. All those things I mentioned earlier—Kinect, procedural technology, better AI, and so on and so forth—are the tools that are slowly pushing us towards that end.
...I haven't even begun to talk about the real-world benefits of creating immersive games. Someone smarter than me could surely go on at length about the possibilities of immersive simulations that allow people to live through various simulated events for... a wide variety of reasons. Someone training to be an EMT could be forced to go through a triage situation, with accurate simulations of panicking people, secondary threats, sensory barrages, and so on and so forth. Researchers could study crowd dynamics (using more advanced AI than anything presently available) in the aftermath of a disaster in order to better understand how to design environments to protect against them. The military already uses immersive sims to save training costs. There are a ton of non-entertainment applications for immersion. Saying we should kill the concept is horrifying, because it's so limiting.
...and so we come to the conclusion.
There will still be room for the [insert any unimmersive game here] of the world. I'm not saying that they should die; there's nothing inherently wrong with them. Instead, I'm looking at this from a long-term perspective—not the next week, or the next month, or the next year, but the next century of game development. Games are... going to become something else. Traditional video games will still exist, but this new thing, this transportation to another world... that's the future. Saying we should kill the concept of immersion and only give credence to attention is a terrible idea.
Considering the way they seem to feel about immersion, it would appear that Ben Abraham, Robert Yang, and Richard Lemarchand don't just misunderstand the term, but want the legitimate usage to die as well. While I don't know a lot about Abraham's personal philosophies, Yang's made his pretty clear in his Dark Past series of blog posts—he thinks the immersive sim should die. Lemarchand's philosophies are made clear by the games he creates.
Do I sound upset?
These guys seem smart—really, they do—but by failing to understand the nuance of the word "immersion," they seem primed to damage the medium.
Look, I may be just a poor college student (I can't even afford a good school) who is trying to learn game design while his school falls down around his head (seriously, I'm not kidding about the good school thing). Unlike Lemarchand and Yang, I've never made a video game in my life. I've worked on some other forms of RPG before, and I'm trying to work on an indie game right now, but I obviously don't have the body of work behind me that these guys do. I may never have the body of work behind me, at the rate things are going.
...but... I feel like they've got it all wrong. If they're the guys who tell us where games should go—if we follow them—I know we'll be worse off for it.
They scare me.
(Also, in case anyone is wondering, yes, this is one of the reasons I prefer Western to Japanese games. Japan tends to prefer to design more abstract, non-immersive games, which is a totally valid method of expression, but not one I personally enjoy)
About Speak Up on Kotaku: Our readers have a lot to say, and sometimes what they have to say has nothing to do with the stories we run. That's why we have a forum on Kotaku called Speak Up. That's the place to post anecdotes, photos, game tips and hints, and anything you want to share with Kotaku at large. Every weekday we'll pull one of the best Speak Up posts we can find and highlight it here.