In February of 2011, fresh off nine months of 80-hour work weeks, Jessica Chavez took a pair of scissors to her hair. She’d been working so hard on a video game—14 hours a day, six days a week—that she hadn’t even had a spare hour to go to the barber.
This piece originally appeared 5/15/15.
As soon as the overtime came to an end, so did 18 inches of hair. “[It was] retaliation for the headaches the weight of it had given me while working,” she’d later tell me. “It got so heavy… it was unbearable after a while.”
Chavez, who writes and edits text for the boutique publisher XSEED Games, says she dropped 10% of her body weight during this period, in which she handled just about all the dialogue for the text-heavy role-playing game Legend of Heroes: Trails in the Sky. By the end of the project, she weighed 99 lbs. (She's 5'4".)
Spend any amount of time talking to people who make video games and you’ll hear thousands of stories like this. Crunch, as it’s called, has become status quo for the video game industry, as routine to game developers’ lives as daily commutes or lunch breaks. From multimillion-dollar blockbusters like Call of Duty to niche RPGs like Trails, just about every video game in history is the net result of countless overtime hours, extra weekends, and free time sacrificed for the almighty deadline. This crunch comes in many different forms—sometimes it’s long and drawn-out; other times it’s just a few weeks at the end of a project—but for people who work in video games, it’s always there. And because most game developers work on salaries, it’s almost always unpaid.
Conversations about the morality and necessity of crunch have dominated the industry for over a decade now, ramping up in 2004, when the game designer Erin Hoffman wrote an exposé about practices at video game publisher Electronic Arts. Hoffman, who went by the name EA Spouse, wrote about how various forms of crunch were destroying her significant other's life, hammering EA for practices she said were unethical and illegal. The post went viral, causing widespread outrage and triggering a series of class-action lawsuits that led EA to settle for tens of millions.
Today, however, things haven’t changed much. Developers regularly lament having to suffer through unrelenting crunch cycles where they go weeks or months without seeing their families. A 2014 survey by the International Game Developers Association found that 81% of polled game developers had crunched at some point over the previous two years. (50% felt crunch was expected in their workplaces and a “normal part of the job.”)
Why is this still happening? Why do people so often have to work crazy hours just to make video games? Should companies be doing more to prevent it? Over the past few weeks, I’ve talked to some two dozen current and former game developers—some of whom spoke on the record and others who asked to be kept anonymous—to try to answer some of these questions. The stories are candid and ugly: Some speak of nights sleeping in the office; of going weeks without seeing their families; of losing friendships and relationships because of endless unpaid overtime. Some say crunch drove them away from the video game industry. Some say they’ve taken vows to never work more than 10 hours a day.
To many developers and outside observers, one thing is increasingly clear: the video game industry’s reliance on crunch is unsustainable, and hurts far more than it helps.
Pretend, for a second, that you’re the head of an independent video game company. You’re in charge of ensuring that all of your designers, programmers, and artists hit their deadlines, which seems feasible because you set up a conservative schedule that accounts for standard 40-hour work weeks. So far, you’ve hit all your milestones—game-dev speak for project goals, like having a playable build of the game or hitting beta—without a problem.
One day, you get a call from the publisher financing your game: turns out your hero didn’t test well with focus groups, so they want you to completely redo all of his design, art, and voice acting. Also, they need you to hit the same release date—can’t change that fiscal quarter guidance! What do you do?
1) Tell the publisher you need more time or more money (for extra staff) to do this, at risk of pissing them off and getting your project cancelled.
2) Tell the publisher you need to cut other features to do this, at risk of pissing them off and getting your project cancelled.
3) Tell the publisher you can’t do it, at risk of pissing them off and getting your project cancelled.
4) Tell your staff they have to work overtime, sacrificing nights and weekends, until the new hero is done.
Some studio heads might gamble and pick one of the first three options; others will inevitably go with #4, choosing to sacrifice employees’ free time rather than risk losing the game and having to lay everyone off. There are compelling arguments both ways.
That’s a simple scenario, though. And it assumes that you, the studio head, view crunch as a last resort. In the world of video games, many producers and directors see mandatory overtime not as a contingency plan but as a natural part of game development, to be regularly used as a way to cut costs and make the most ambitious games on the shortest schedules.
“Many teams (indie and AAA alike) seem to start a project already calculating in crunch to the schedule for added content or productivity, which is bizarrely short-sighted and disgusting,” said Tanya X. Short, a co-founder at the indie studio Kitfox Games who has also worked in AAA development.
Short, a prominent critic of game development crunch, says she believes unpaid overtime is the result of poor planning and bad management, not an inevitable part of game-making. One of the issues, she says, is that the people on top of the food chain view crunch as something standard and inevitable rather than a toxic, avoidable practice.
“If your milestone is more than two weeks away and you can tell you’re not going to make it, you have to cut features or extend the milestone,” Short said in an e-mail. “Those are your options. It hurts to cut what feels like limbs off your baby, but sometimes it’s necessary. Certainly more necessary than pointless, burn-out crunch, which if you’re lucky will only leave you sick (physically or creatively)... and if you’re unlucky will make you miss your milestone, get sick, and start you down a path towards bad production practices.”
It can be a self-sustaining cycle, Short argues. Say a designer is able to create a sizable level after working 16 hours a day for three straight weeks. From then on, project managers will equate a level of that size with three weeks of work, and for future schedules they’ll plan accordingly, allotting three weeks of time to tasks that should require six. The designers will again have to crunch to finish those future levels, and the cycle will go on and on.
That’s just one of the reasons crunch has become so prevalent. It’s easy to point fingers at the managers who allow this sort of thing to happen—and critics of crunch have done exactly that—but it’s worth noting that game development is a creative process. From level designers to character modelers to foley artists, every single job behind a video game calls for right-brain work. Every member of a development team has to make countless subjective decisions on a daily basis, and by nature, people will be more creative on some days than they are on others. Some days the words, art, and code flow; other days they don’t.
In other words, it’s very difficult to figure out how long it might take to finish a given task. Even when a project manager is desperate to avoid crunch, it can be impossible to estimate how much time it might take for a narrative designer to write a scene or for a programmer to come up with an AI tree. Making games is messy.
It’s important, when taking a broader look at game development, to recognize the difference between occasional overtime and crunch. Few would take issue with a boss asking his or her employees to work late for a few days or even a week toward the end of a project. It’s when these requests become excessive or even normalized—when standard 40-hour weeks morph into 60, 80, 100 on a regular basis—that it turns into a bigger problem.
There’s no one way to define crunch. It can come in thousands of shapes and sizes, varying based on the schedule, the type of game, the scale of a team, the deadlines, the contracts, the personnel, the publisher, the leadership, the amount of money in the bank, and many other factors.
Often, gamers equate crunch with the final weeks in a game’s development, when everyone on a team has to go into overdrive to ensure they hit their release date—”crunch time” is shorthand for the very last minute of a project. But in reality, according to many of the game developers I talked to for this story, crunch is always there, hanging over studios like a big gloomy rain cloud. Plenty of the people who make video games say they have to crunch all year long.
“Most people think crunch only happens in the final push of a project,” said one AAA game developer who asked not to be identified. “The last [x] months before shipping. Let’s set something straight: That’s complete bullshit.”
In reality, that developer said, many teams find themselves working unpaid overtime all throughout the year, for various reasons. Sometimes it’s by choice; other times it’s because they have no other options.
“Crunch is any time a milestone is behind schedule,” the developer said. “Crunch is any time a project is due for review by management. Crunch is any time an issue comes up that prevents other people from working. Crunch is any time a publisher decides they want to see something now or wants new features that weren’t planned previously. Crunch is when any trade show or article requires a demo/trailer/screenshots/you name it. Crunch is after the public sees said PRE-RELEASE content and starts tearing apart something that’s not finished… Crunch is not uncommon. It is the norm.”
Sometimes crunch is the result of a young team that thinks passion means working extra hours; other times it’s the result of cold upper management making unreasonable requests of employees.
For example, one developer told me about the years he spent working on a massively multiplayer RPG called Hero’s Journey. As they worked on the game—which the developer described as a Sisyphean task—he and his team found themselves crunching not to finish but to build fake demos for publishers and trade shows.
“As most engineers know, demo code is almost always garbage, throwaway code,” said the developer. “So we got to crunch to write code we knew was not actually feasible long-term, but looked good enough for showing it off. Not only was this for trade shows like E3, but once we’d shown stuff off we had publishers coming out and we were trying to court them as well… The longest I ever spent in the office was almost 30 hours. I did not sleep. We had a huge demo for a potential publisher, some division of Sony, and we had to get everything just right. I came into the office early, around 7am, and I didn’t leave until almost noon the next day. I went out for lunch and I went out for breakfast around 6am, but that was it.”
In the video game industry, crunch is ubiquitous, but it’s also different every time it happens. Indie studios crunch—sometimes because they know they’ll run out of money if they don’t. Big studios crunch—sometimes because the publishers behind them force them to add features and hit deadlines. One developer told me their AAA studio’s policy was “We don’t care how many hours you work in the week as long as you get your work done,” which for some people might mean eight-hour days and for others might mean 12. An artist who once worked for a mobile studio said they were told that working evenings and Saturdays was just part of the culture there.
One person who worked on The Elder Scrolls V: Skyrim—one of the most acclaimed games of the last generation—said the last few months of development were a total mess:
We worked long hours. Our bug queue never slowed. We played the game constantly, dismayed at how slowly the iterations came. Guiding Skyrim in the right direction felt like guiding a thousand-tonne tanker through cold water. Through the fog of panic and activity, we watched as the iceberg of our deadline drew close, inscribed with that horrible prophecy: “11-11-11.” And then, under the steady guidance of Todd Howard, we made it. And as with all great undertakings, the end result was both less than we’d hoped for... and also far, far greater.
People crunch on all those shovelware licensed games, too—the ones you see for $10 at GameStop just a few weeks after they come out. “You know what’s worse than crunch? Death march crunch on games no one wants,” said a developer who worked on Sega’s Iron Man 2. “Mandatory 12 hour days, 6 days a week. If you were salaried (like I just now was) you did not get overtime pay. You did get food though. Oh and there was a keg in the office.”
As a reward for those 70-hour work-weeks, the staff all lost their jobs. On April 2, 2010, a month before Iron Man 2 came out, Sega shut down the studio. (For a look at why layoffs are also so common in the video game industry, see our feature from last year.)
On the flip side, some game developers say that being part of something really successful can alleviate the pain of crunching in at least a small way.
“There was a stretch of a month or so where I slept at the office every other day because my two-hour commute meant I’d get more sleep at the office than I would at home,” said one developer who worked on a critically acclaimed AAA game. “So I would get in at about 8am, work until 11pm, sleep until 7am so I could shower and be at my desk again by 7:30am. Then I would leave at 5pm to get home in time to see my wife and kids for a couple of hours before bedtime, then I’d do it again. That sounds far worse than I remember it. The fact that the game is awesome, sold well and reviewed well softens the sting a lot.”
The developer later added that although the critical acclaim was nice, it could never justify radical crunch time. “[Some] studios think that their pedigree enables them to overwork employees and that the employees should be ‘honored’ to work there,” he said. “It has gotten so bad that a lot more experienced devs will see the word ‘passion’ on a job description as a red flag. There are a few studios whose games have been critical and commercial darlings but I would never want to work there because of how bad the crunch and culture is.”
Over the years, as the video game industry has evolved, crunch has, too. Thanks to day-one patches and downloadable content, video games are rarely “finished”—the people behind them can keep zapping bugs and building new features all the way through release and even afterwards, as we’ve seen with games like Driveclub and Halo: The Master Chief Collection, both of which shipped broken in several ways.
“For the record, I hate day-one patches as a consumer,” said a programmer for AAA games who asked not to be named. “Day-one patches are a godsend as a dev, though. Reliance on them is a horrible idea—but just knowing you can fix issues almost immediately for all consumers is a huge stress relief.”
In January of 2014, artist Clarke Nordhauser took a job with a big-name studio that was working on a well-known video game franchise. As Nordhauser recalls, when he arrived at the company for his first day, the producer welcomed him and asked if he wanted dinner—oh yeah, and it’d be great if he could stay a few hours extra.
“It wasn’t just that night, it was every night,” Nordhauser said. “[Then it] turned into requests for weekends. By the third week of working there, I had noticed that I have never seen certain team members leave the office ever… You enter a certain point of depression where a process is comforting, and once I’d felt like another cog I just accepted this as my fate.”
By April, Nordhauser was burnt out. He told the studio he was done, and he walked out the door, vowing never to crunch again, even if it meant leaving the video game industry entirely.
“I’m constantly disappointed in AAA titles but it makes sense with how much they are overworking and under-compensating their workers,” Nordhauser said. “Until there is some sort of union for game developers, I probably won’t find myself working in the industry again.”
Although there are no good statistics for the number of people who have left gaming for less volatile, more lucrative fields, there are plenty of stories. Ex-QA tester Steve Holland, for example, told me he gave up on his dreams of making video games after a stint at Atari working 60-hour weeks on games like Civilization III.
“During my job search after Atari I kept seeing some variation of ‘Weekends and long hours required’ in every job posting at every single development studio or publishing company,” Holland said. “This made me come to the realization that crunch was not some occasional occurrence before you gave the thumbs up on Gold Master for large-budget games, but was instead an off the record mandatory death march required of all employees who wanted a career in the video game industry.”
How many talented people have been scared away from the video game industry because of mandatory overtime? Will we ever actually know?
Ask five developers about crunch and you’ll get 20 different anecdotes—“war stories,” many of them will say, eyes twitching as if suffering from some sort of video game PTSD. It’s human nature, especially in the United States. We like to brag about how hard we work.
It makes for a fun fantasy, too. Can’t you picture it? A team of talented-yet-underpaid developers crunch together, working unimaginable hours to turn their video game dreams into reality, sacrificing their lives to accomplish what everyone else thought was impossible. Painful short-term loss exchanged for glorious long-term gain.
But this sort of mentality can be ineffective and even dangerous, says Tom Ketola, a veteran programmer and director who’s been making video games since the PS1 days. After all, it can lead to awful, self-sustaining habits.
“Hindsight being 20/20, I oftentimes find crunch time stories turning into a bragging match about who worked more hours and suffered more,” he said. “Our facility to forget how bad things actually are can often glamorize bad decisions and bad processes to the point of nostalgia, and as we move up the ranks in development and start managing younger workers, those stories bias our own management towards repeating the same mistakes.”
Some developers say they’ve felt compelled to stay extra hours at the office just because other people were doing it too. Several told me they just couldn’t shake the notion that more hours directly equate to higher-quality games—after all, more hours means more work, which means more features, polish, and testing.
Then again, you don’t need to know C++ to imagine a bleary-eyed, sleep-deprived programmer making mental mistakes that lead to game-breaking bugs and cause even more work for the rest of the team. There isn’t much data on this subject, but a recent study by a group called The Game Outcomes Project did find that mandatory crunch correlated with less successful games—even if one of their gauges for success was the very flawed review aggregator Metacritic.
Wrote Paul Tozour: “Our study seems to reveal that what actually generates ‘extraordinary results’ – the factors that actually make great games great – have nothing to do with mere ‘effort’ and everything to do with focus, team cohesion, a compelling direction, psychological safety, risk management, and a large number of other cultural factors that enhance team effectiveness.”
In other words, all that crunch-time might be making games worse. Anyone who’s worked in a creative field knows that tight deadlines and occasional all-nighters can sometimes make miracles happen, but caffeine and adrenaline aren’t enough to fight off the exhaustion of working outrageous hours for weeks or months on end. Whether you’re writing dialogue or modeling characters, sleep deprivation can have a severe negative impact on what you’re trying to accomplish.
“You can only stretch yourself so far before you start to come apart,” said XSEED’s Chavez, “and quality is the first thing to go at 2am. Long-term crunch is a game of diminishing returns in the end.”
Let’s go back to that hypothetical situation from before. You’re the head of an independent studio, your publisher just asked you to change everything, and you’re quickly running out of time and money. What do you do?
I posed a similar hypothetical to Kitfox’s Tanya X. Short, who is as big a critic of unpaid overtime as it gets. Her answer, unsurprisingly: don’t crunch.
“Crunching for more than two weeks won’t help the studio in your question,” she said. “If I were in that circumstance, I would have to find some way to cut something. There literally is no other option. Crunching will just make your clients or publishers feel better through showing you are ‘working hard,’ but you still won’t make it, so crunching is not the final solution.”
That’s an admirable—if idealistic—stance. But while it’s hard to find a developer who thinks crunch is a good thing, there are many who believe it’s a necessary evil. “I think crunch is the worst approach to managing your projects but at the same time is unavoidable for artists because it seems like it’s just part of our nature,” one freelance developer told me, adding that he’d worked both with studios that crunch a lot and studios that don’t.
Veteran game designer Edward Douglas, a AAA-developer-turned-indie who’s currently directing the RPG Eon Altar, said that he feels awful about asking his team to work unpaid overtime, but that he sees no other options. Three years ago, when he started development on what he describes as his dream project, he over-scoped—or planned out more features than were really feasible with their schedule—and now they’re paying the price.
In a lengthy e-mail, Douglas was remarkably candid about how and why he’s told his team to crunch on the game:
Early in a game cycle the skies are blue and anything is possible. As you get closer to final everything you have to trim, simplify, remove, feels like failure. You also have an optimistic perspective on what can be done in any given schedule, so you over-scope. What they call ‘technical debt’ builds up, and you have more and more features to maintain and fix when it comes to the end.
In a perfect world a game is built on a ‘Minimal Viable Product’ model, where each feature works nicely on its own, and is enhanced by features around it. Although we give internal lip service to this method, to be honest to ourselves we do not do it. We’re making a complex (for us) RPG, with a huge number of moving parts. No feature works in isolation, and nothing can be cut without massive ripple effects throughout other systems. Here, we were overconfident, didn’t scope properly, and now have a huge number of features with lots of bugs to fix and a hard deadline coming up.
We’re completely independent which means we set our own release date, but it also means that when we’re out of money, that’s it. We’re not EA where we can shift resources from studio A to help studio B for the last few months. The only way to finish is to crunch. To ask people who did not commit to the features in the game to spend the evenings, their weekend, their family time, digging us out of this ‘technical debt’ we’ve incurred. Now they’re critical path in getting a game worth shipping, and if they don’t do it they know the game could fail. It’s the absolute worst position to put someone in and it’s shameful that I did it.
Ideally, managers like Douglas could avoid this sort of situation by under-scoping, adding extra buffer time to the schedule to account for catastrophes, and cutting features their teams wouldn’t be able to deliver without working overtime. Critics of crunch say those moves are imperative, not optional. But by nature, game developers are creative, ambitious people with a habit of biting off more than they can chew. It’s hard to envision a video game industry where people aren’t over-promising things.
It’s also tough to imagine a culture where people don’t equate long hours with passion and commitment. Many of the game developers I interviewed for this story said that when some members of their teams voluntarily crunched while others didn’t, the crunchers would grow to resent the people who left at 6 or 7pm every day. Even in studios where crunch is never mandatory, a divide like this can rip teams apart—unless there’s a manager forcing people to go home.
In late 2013, the Twitter account for Crytek’s Ryse triggered a minor controversy with a tone-deaf brag about the overtime the team had poured into the game. “By the time #Ryse ships for #XboxOne, we will have served the crunching team more than 11,500 dinners throughout development,” they wrote.
Developers reacted harshly—and game companies will now likely think twice before tweeting something like that—but it’s indicative of a cultural trend that’s still prevalent for the people who create the games we play.
“Crunch was invented by humans, normalized by humans, and we absolutely can fix it if we try,” said Short. “There’s not much the common game dev at a larger studio can do about imposed crunch from above, but as soon as ‘we don’t crunch’ becomes something a high-profile studio can brag about (without being weirdly accused of ‘lack of passion’), the rush of quality developers might persuade the rest.”
Whatever the proposed solution—more reasonable and capable management, increased transparency, an organized workforce—the problem of crunch is clear, as is why and how it happens. Crunch is an ingrained part of video game culture, and will remain that way until Twitter accounts for games like Ryse start bragging about how many dinners their employees got to eat at home with their families.
Crunch manifests in a thousand different ways, and we’re only beginning to understand its long-term, industry-wide effects. For now, the least we as gamers can do is to understand and respect the great strain required to bring us our favorite games; to support those who would try to dismantle crunch culture, and, even just in some small way, try to change things for the better.
Illustration by Sam Woolley