Javelins, Zerg Rushes, Ninja Looting and Other Social Dilemmas


It's there, and you know how to use it. It's an exploit or a glitch or some imbalance in the AI. Morally, it's wrong. But what if everyone else is doing it? Or what if there's merely the potential that they will?


Jamie Madigan, well known as the gamer with the Ph.D. in psychology, tackles an adaptation of the classic "Prisoner's Dilemma" by applying it to glitching. Writing on his personal blog (and also in his columns for GameSetWatch and Gamasutra), Madigan examines what choices and outcomes - foreseen and unforeseen - govern a gaming community's reaction to the presence of a trump exploit, like Modern Warfare 2's notorious Javelin glitch, so disproportionately powerful that using it got players banned even though no modding was involved.


The conclusion? This is why you game among friends. Hardly a surprise, but one's conscience can't be the only guide. Some accountability to the victims of the glitching is also useful. And I'd argue it's why multiplayer-heavy games bear a higher QA burden: glitches and exploits that destroy the fun have the potential to drive people offline and toward a shorter experience with the game, if not to another title altogether. Self-policing does occur, but the longer the exploits persist, the more likely someone will succumb to temptation.

The Glitcher's Dilemma: Social Dilemmas in Games [The Psychology of Video Games, March 4]

Back in the 1960s, research on these kinds of dilemmas exploded, and out of it came what's known as "the prisoner's dilemma," based on an anecdote about getting confessions from two prisoners held under suspicion for a bank robbery. In his book Rational Choice in an Uncertain World, Robyn Dawes summarizes the classic scenario thusly:

Two men rob a bank. They are apprehended, but in order to obtain a conviction the district attorney needs confessions. He succeeds by proposing to each robber separately that if he confesses and his accomplice does not, he will go free and his accomplice will be sent to jail for ten years; if both confess, both will be sent to jail for five years, and if neither confesses, both will be sent to jail for one year on charges of carrying a concealed weapon. Further, the district attorney informs each man that he is proposing the same deal to his accomplice.

In this case, both prisoners will probably confess if they're rational about it. Why? Because each prisoner gets a better (or no worse) payoff by confessing no matter what the other guy does. Prisoner A thinks, "I don't know what B is going to do, so confessing is the best way to keep myself from getting screwed. If he keeps quiet, I go free. If he also confesses, I get 5 years instead of 10." In other words, confessing is the only way to keep the other guy from being able to screw you over. Notice how this mirrors the javelin glitch dilemma, only with fewer explosions.
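The dominance argument above can be sketched in a few lines. This is a minimal illustration using the jail terms from Dawes's scenario; the function name `best_response` is my own label, not something from the article:

```python
# The prisoner's dilemma payoffs from the scenario above, expressed as
# years in jail for Prisoner A (lower is better).
# Keys are (A's choice, B's choice).
YEARS_A = {
    ("confess", "silent"): 0,   # A goes free, B gets 10 years
    ("confess", "confess"): 5,  # both confess
    ("silent", "silent"): 1,    # concealed-weapon charge only
    ("silent", "confess"): 10,  # A takes the full sentence
}

def best_response(b_choice):
    """A's rational choice given what B does: the option with fewer years."""
    return min(("confess", "silent"), key=lambda a: YEARS_A[(a, b_choice)])

# Confessing is better for A no matter what B does - a dominant strategy:
assert best_response("silent") == "confess"   # 0 years beats 1 year
assert best_response("confess") == "confess"  # 5 years beats 10 years
```

Since the deal is symmetric, the same reasoning holds for Prisoner B, which is why two rational prisoners both confess and both end up worse off than if they had both stayed quiet.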

Or you could apply it to "tick throwing" and "fireball trapping" techniques in fighting games. I could go on, but I think you get the idea. My 2×2 table making machine burnt out, anyway.

What's really more interesting and useful, though, is to look at what psychology has to show us about when people DON'T choose the purely rational option of abusing a glitch or a winning but boring strategy. Generally, people are more likely to forgo the exploit when:

• They know they will be playing against their opponents in the future and face retribution

• They expect to interact with their opponents outside the game

• They don't expect to remain anonymous

• They don't know how many games will be played with the same person

Under these conditions, many players will adopt a strategy where they cooperate at first (for example, they don't glitch or rush), then if the other player abuses that trust they retaliate in kind. This is known as the "tit for tat" strategy. Some researchers with lots of time on their hands even organized tournaments where people were invited to write computer programs to play iterated prisoner's dilemma games, and the programs that adhered to the "tit for tat" strategy tended to do the best.

This is why things like playing with people on your friends list, Steam community group, guild/clan, or a favorite dedicated server are good. And it's one reason why random matches between strangers or pickup groups can be infuriating. Making it easy to submit ratings to the profiles of people you just played also helps resolve these dilemmas to everyone's benefit. It's also the reason I love the way Halo 3 lets you remain in a lobby with the people you just played and go straight into another round with them.

People being the complicated beings they are, it's not a perfect system, though. Some people are just griefers out to disrupt the game no matter what. Some people won't abuse a glitch out of a sense of honor. Some will value their ranking on a leaderboard more than a sense of fair play in any individual match. But even if none of the suggestions above is a silver bullet, they help across large numbers of games.

- Jamie Madigan

Weekend Reader is Kotaku's look at the critical thinking in, and of video games. It appears Sundays at noon. Please take the time to read the full article cited before getting involved in the debate here.




How is it morally wrong? Whatever responsibility there is, it's for the developers to release code that doesn't allow for the exploits in the first place. I agree with Sirlin's views on the matter: if it's in the game, it's in the game, and players shouldn't have the responsibility of reading the developer's minds to figure out how they're really supposed to be playing.