Art by Sam Woolley.

Suicide Threats Are A Big Problem On Twitch

A few weeks ago, a Twitch stream moderator and health professional who goes by “Badxan” posted on the Twitch subreddit about an experience she’d had with a viewer threatening suicide in a chat. “How does Twitch deal with suicide and self-harm?” she wrote. “Shamefully.”

Badxan was moderating a Twitch chat when a viewer threatened to broadcast an IRL stream in which he’d shoot or hang himself. She DMed the viewer to try to help him out. She also used Twitch’s “report: self harm” feature, but she didn’t expect the result: The user immediately got banned, and Badxan ended up having to scramble to seek him out on Instagram to make sure he didn’t go through with his threats in some other way.

Banning the user, Badxan told Kotaku in an email, might have put them in more danger. “One of the risk factors for mental health disorders such as depression is social isolation,” she wrote. “If contact is suddenly severed with someone in a crisis, it may reinforce the feelings of intense lonesomeness and further escalate the situation.”

In her post, Badxan went on to critique Twitch for a lack of easily accessible resources to aid in the event of a self-harm crisis. In the wake of her thread, as well as discussion and debate stemming from it, Twitch added a mental health page with advice, numbers of suicide prevention hotlines, and specific sections for topics ranging from LGBT+ issues to addiction.

The change was long overdue, but it only goes part of the way toward acknowledging and addressing an issue that has grown more prevalent on Twitch over time. Some viewers use Twitch as an outlet for everything from depression to suicidal thoughts. Streamers, in turn, often want to help, and some even pride themselves on creating safe spaces for people to discuss what they’re going through. But should streamers be doing this? They are usually not mental health professionals, and these situations can be dangerously precarious.

General Mittenz, a streamer who uses streaming to fight his own anxiety and depression, tries to create a community where people can let out their inner demons. As a result, Mittenz says that he deals with viewers who express thoughts of self-harm on a “weekly basis,” often via DMs, but sometimes in public stream chat, as well. He believes it’s important to engage viewers who are struggling, because it can function as a valuable release for them, and it can keep them from isolating themselves. But he makes it clear that his companionship is not a substitute for real professional help.

Art by Jim Cooke.

“I make sure to tell them that I’m not a professional, and that everything I say, I say as someone that went through a similar situation,” Mittenz, who dealt with suicidal ideations after his father took his own life, said in an email. “I always try to find common ground with the individual and then tell them that the best thing that I personally found to help was going to a professional.”

Raw_Genesis, the streamer whose chat Badxan was moderating when the incident she posted about on Reddit occurred, agreed, saying that he thinks sometimes people just need to hear words of encouragement from somebody they perceive as being like them.

“Every time this has happened so far, it’s been a young male viewer in his late teens or early twenties that just felt isolated and alone and was going through personal hardships and felt that they had no one in their life they could talk to,” Raw_Genesis said over Discord, noting that, at age 31, he’s older than a lot of his community, whose members joke that he’s their dad. He pointed out, too, that even big streamers like Dr Disrespect sometimes take on similar roles in their communities, as the Doc himself explained in an emotional acceptance speech at The Esports Awards last month.

Raw_Genesis believes that kind of outreach really can make a difference, online or off. Recently, he said, a fan approached him at an event and explained that he was having a really hard time with some personal issues. Raw_Genesis said he walked with the crying fan back to his hotel and talked with him for a couple hours. “He messaged me recently letting me know that it really meant a lot to him that someone spent that time to listen,” said Raw_Genesis. “It really meant the world to me that I was able to help someone in that way.”

Raw_Genesis added, however, that he understands that not all streamers would want to take such an approach. People often watch streams for light entertainment, and mentions of depression and suicide in chat can irreparably damage a stream. “It can really affect the mood of the entire stream if not handled properly,” Raw_Genesis said.

After a certain point, streamers also have to take their own mental health into account. “It can really take a toll in terms of emotional labor,” said DistractedElf, an openly trans streamer with many viewers who identify as trans, a population with a significantly higher-than-average suicide rate. “You end up feeling responsible for folks.” She reaches out when she can, because she knows how crucial it can be, but there’s a limit. Despite that, she still tries to at least assist in small ways, “even if it’s just searching something up for them, and linking them in that direction.”

Austen Marie, a video game and art streamer, expressed similar concerns. “Some days I’ve had to say I can’t talk about anything due to just feeling emotionally drained, which can be hard cause I don’t want anyone to feel ignored or invalid in their pain,” she said, adding that it’s “delicate territory,” and she usually defaults to telling people that they should seek assistance from a professional.

In extreme situations, if a streamer is too approachable, they could find viewers becoming over-reliant on them, developing romantic feelings for them, or even violating their privacy, which is what happened to longtime streamer Ellohime when a young fan from Singapore unexpectedly showed up on his doorstep one night in 2015.

Art by Jim Cooke.

And, of course, in some situations when an audience member talks about depression and self-harm, they’re being disingenuous or, worse, trying to make edgy “jokes.”

“Due to the nature of the internet and the gaming community in particular, it can sometimes be hard to determine if someone genuinely needs help or if someone is making a bad joke or trolling,” said Raw_Genesis. He added, however, that he errs on the side of caution and always assumes that people are being serious when they bring up these topics.

General Mittenz has a system for weeding out folks who don’t seem entirely serious about their claims. “When someone says, ‘Hey, I’m thinking about killing myself,’ I immediately ask them what’s going on, and within a few minutes I know if I have to worry or not,” he said.

Mittenz also employs a “three strikes” rule in which he listens and offers advice and resources the first two times a particular person threatens self-harm, but bans them on the third. At that point, he says, it’s clear to him that there’s nothing more he can do, and indulging them further could harm the person in question—not to mention other viewers of Mittenz’s stream and Mittenz himself, who was traumatized by his father’s suicide.

Mittenz offered a recent example in which a young guy he’d been talking to on Twitch off and on for years said he was going to take his own life. Mittenz had reason to believe the threat wasn’t serious, so he didn’t respond and decided to ban the fan’s account, per his system. The person then made similar comments to Mittenz’s wife, GollyMsMollie, who is also his manager and a prominent member of Mittenz’s community. A few days later, though, he showed up in Mittenz’s stream chat with a slightly different username, acting like nothing ever happened. Mittenz banned that account, too.

Image: Hellblade.

“It’s a hard choice, but one that has to be made,” said Mittenz. “Either he’s a sick, manipulative person, or he’s someone that I can’t help that refuses to get help elsewhere. And with depression/suicidal thoughts, it only takes one person to start the whole group of chatters spiraling downwards into the void.”

As the platform that brings streamers and viewers together, Twitch itself undoubtedly has a role to play here. It recently added a mental health support page, but all the streamers I spoke to think it can still do more on this front.

Badxan, whose firsthand experience dealing with a suicidal viewer helped inform Twitch’s decision to up its mental health game, appreciates the improvement but thinks Twitch still hasn’t addressed the core issue.

“Suicide and self harm should no longer be terms of service infringements that see lengthy or permanent bans,” she said. She sees the system as counter-intuitive and potentially dangerous, and believes people should immediately receive support, not a ban, upon threatening self-harm.

Raw_Genesis believes that anyone reported for suicide or self-harm should receive a follow-up from a human being. While Twitch’s new mental health page is a useful resource, he doesn’t believe it’s enough, and hopes Twitch will go on to offer “more transparent information on how reports are dealt with” as well as “documentation for streamers on the best ways to deal with mentions of depression, self-harm, and suicide while live on stream.”

Raw_Genesis also hopes that Twitch will make a command to instantly bring up the new mental health support page in chat available to everyone by default. Currently, streamers have to manually create such a command.

A Twitch representative told Kotaku that the goal of the company’s current policy is to “stop promotion of content that can lead to suicide or self harm, which includes mitigating the risk of an individual being exposed to negative encouragement,” and that it is “constantly evaluating” its policies.

Take This, an organization that advocates for mental health awareness in the gaming space, helped Twitch formulate its mental health policy [Correction - 12:30 PM, 12/14/17: Take This consulted on the policy, rather than directly helping create it. We apologize for the error]. The organization’s clinical director, Dr. Raffael Boccamazzo, brought things back around to streamers’ lack of professional training.

“Twitch, for all of its wonderful strengths, isn’t professional help, and we can’t expect it to function as such,” he said in an email. “There is a line between friendly support on an entertainment platform and professional services. Take This has spoken to many streamers, as well as Twitch directly, about how to walk that line, and we’re extremely thankful that so many people are concerned with being inclusive without stepping into the realm of therapy and professional services.”

Where matters of life and death are concerned, Twitch may face legal repercussions if it makes the wrong moves. One law professor, opining on potential legal liability for social media companies if a user streams their suicide, noted that services like Facebook are “not the speakers or publishers of information provided by others” and would “probably” not be held liable. But if, say, Twitch were to get itself involved by speaking to the user, that could change its relationship, and potentially open it up to liability.

Image: Actual Sunlight.

“In a hypothetical situation where an unlicensed and untrained individual provided inappropriate care to a person in crisis, and that person went on to harm themselves or others, there would be severe financial and legal implications for the person who dispensed the treatment AND for the organization by which they were employed to do so,” said Russ Pitts, president of Take This, in an email. Licensed care workers must follow consent laws when intervening, Pitts noted, and they’re classified as “required responders,” which means they’re legally required to step in should an emergency arise—and if they’re not qualified in that particular situation, they’re required to call for help. Because of the potential legal and ethical ramifications of making a mistake, Pitts said, “it is often in the best interests of everyone involved for an organization to do nothing.” That doesn’t mean he is against the idea of companies intervening, however. He thinks they need to do it with extreme care. “Those that choose to take the next step, and actually do something, must thread an incredibly thin needle,” Pitts said.

Badxan still thinks it would be better for Twitch to do something other than cut users off from their community when they threaten self-harm.

“Ultimately, I would think that not doing anything to assist during a medical emergency might have greater liability than providing basic help and alerting emergency services ever could,” she said.

“There seems to be a lot of people out there who think that people with a mental health crisis should keep quiet and deal with it on their own,” she continued. “It is time to shift that way of thinking: suicide is a medical emergency. Let’s start treating it like one.”

Update - 12:30 PM, 12/14/17: We’ve updated the article to clarify that Take This simply consulted on Twitch’s mental health policy. Further, we’ve expanded Dr. Boccamazzo’s quote about the organization’s view on how streamers engage with people facing suicidal issues. While Take This has made clear that organizations such as Twitch may take on liability if they engage with people threatening self-harm directly, the organization also emphasizes the importance of seeking help and has provided a set of resources to help streamers to create supportive communities and resolve crises.


