Raw_Genesis added, however, that he understands that not all streamers would want to take such an approach. People often watch streams for light entertainment, and mentions of depression and suicide in chat can drastically alter a stream’s tone. “It can really affect the mood of the entire stream if not handled properly,” Raw_Genesis said.

After a certain point, streamers also have to take their own mental health into account. “It can really take a toll in terms of emotional labor,” said DistractedElf, an openly trans streamer with many viewers who identify as trans, a population with a significantly higher-than-average suicide rate. “You end up feeling responsible for folks.” She reaches out when she can, because she knows how crucial it can be, but there’s a limit. Despite that, she still tries to at least assist in small ways, “even if it’s just searching something up for them, and linking them in that direction.”

Austen Marie, a video game and art streamer, expressed similar concerns. “Some days I’ve had to say I can’t talk about anything due to just feeling emotionally drained, which can be hard cause I don’t want anyone to feel ignored or invalid in their pain,” she said, adding that it’s “delicate territory,” and she usually defaults to telling people that they should seek assistance from a professional.

In extreme situations, if a streamer is too approachable, they could find viewers becoming over-reliant on them, developing romantic feelings for them, or even violating their privacy, which is what happened to longtime streamer Ellohime when a young fan from Singapore unexpectedly showed up on his doorstep one night in 2015.

Art by Jim Cooke.

And, of course, sometimes when an audience member talks about depression and self-harm, they’re being disingenuous or, worse, trying to make edgy “jokes.”

“Due to the nature of the internet and the gaming community in particular, it can sometimes be hard to determine if someone genuinely needs help or if someone is making a bad joke or trolling,” said Raw_Genesis. He added, however, that he errs on the side of caution and always assumes that people are being serious when they bring up these topics.

General Mittenz has a system for weeding out folks who don’t seem entirely serious about their claims. “When someone says, ‘Hey, I’m thinking about killing myself,’ I immediately ask them what’s going on, and within a few minutes I know if I have to worry or not,” he said.

Mittenz also employs a “three strikes” rule in which he listens and offers advice and resources the first two times a particular person threatens self-harm, but bans them on the third. At that point, he says, it’s clear to him that there’s nothing more he can do, and indulging them further could harm the person in question—not to mention other viewers of Mittenz’s stream and Mittenz himself, who was traumatized by his father’s suicide.

Mittenz offered a recent example in which a young guy he’d been talking to on Twitch off and on for years said he was going to take his own life. Mittenz had reason to believe the threat wasn’t serious, so he didn’t respond and banned the fan’s account, per his system. The person then made similar comments to Mittenz’s wife, GollyMsMollie, who is also his manager and a prominent member of Mittenz’s community. A few days later, he showed up in Mittenz’s stream chat under a slightly different username, acting like nothing had ever happened. Mittenz banned that account, too.

Image: Hellblade.

“It’s a hard choice, but one that has to be made,” said Mittenz. “Either he’s a sick, manipulative person, or he’s someone that I can’t help that refuses to get help elsewhere. And with depression/suicidal thoughts, it only takes one person to start the whole group of chatters spiraling downwards into the void.”

As the platform that brings streamers and viewers together, Twitch itself undoubtedly has a role to play here. It recently added a mental health support page, but all the streamers I spoke to think it can still do more on this front.

Badxan, whose firsthand experience dealing with a suicidal viewer helped inform Twitch’s decision to up its mental health game, appreciates the improvement but thinks Twitch still hasn’t addressed the core issue.

“Suicide and self harm should no longer be terms of service infringements that see lengthy or permanent bans,” she said. She sees the system as counter-intuitive and potentially dangerous, and believes people should immediately receive support, not a ban, upon threatening self-harm.

Raw_Genesis believes that anyone reported for suicide or self-harm should receive a follow-up from a human being. While Twitch’s new mental health page is a useful resource, he doesn’t believe it’s enough, and hopes Twitch will go on to offer “more transparent information on how reports are dealt with” as well as “documentation for streamers on the best ways to deal with mentions of depression, self-harm, and suicide while live on stream.”

Raw_Genesis also hopes that Twitch will give every channel, by default, a chat command that instantly brings up the new mental health support page. Currently, streamers have to create such a command manually.

A Twitch rep told Kotaku that the goal of the company’s current policy is to “stop promotion of content that can lead to suicide or self harm, which includes mitigating the risk of an individual being exposed to negative encouragement,” and that it is “constantly evaluating” its policies.

Take This, an organization that advocates for mental health awareness in the gaming space, helped Twitch formulate its mental health policy [Correction - 12:30 PM, 12/14/17: Take This consulted on the policy, rather than directly helping create it. We apologize for the error]. The organization’s clinical director, Dr. Raffael Boccamazzo, brought things back around to streamers’ lack of professional training.

“Twitch, for all of its wonderful strengths, isn’t professional help, and we can’t expect it to function as such,” he said in an email. “There is a line between friendly support on an entertainment platform and professional services. Take This has spoken to many streamers, as well as Twitch directly, about how to walk that line, and we’re extremely thankful that so many people are concerned with being inclusive without stepping into the realm of therapy and professional services.”

Where matters of life and death are concerned, Twitch may face legal repercussions if it makes the wrong moves. One law professor, opining on potential legal liability for social media companies if a user streams their suicide, noted that services like Facebook are “not the speakers or publishers of information provided by others” and would “probably” not be held liable. But if, say, Twitch were to get involved by speaking to the user directly, that could change its relationship with them, and potentially open it up to liability.

Image: Actual Sunlight.

“In a hypothetical situation where an unlicensed and untrained individual provided inappropriate care to a person in crisis, and that person went on to harm themselves or others, there would be severe financial and legal implications for the person who dispensed the treatment AND for the organization by which they were employed to do so,” said Russ Pitts, president of Take This, in an email. Licensed care workers must follow consent laws when intervening, Pitts noted, and they’re classified as “required responders,” meaning they’re legally required to step in should an emergency arise and, if they’re not qualified for that particular situation, to call for help.

Because of the potential legal and ethical ramifications of making a mistake, Pitts said, “it is often in the best interests of everyone involved for an organization to do nothing.” That doesn’t mean he is against the idea of companies intervening; he just thinks they need to do it with extreme care. “Those that choose to take the next step, and actually do something, must thread an incredibly thin needle,” Pitts said.

Badxan still thinks it would be better for Twitch to do something other than cut users off from their community when they threaten self-harm.

“Ultimately, I would think that not doing anything to assist during a medical emergency might have greater liability than providing basic help and alerting emergency services ever could,” she said.

“There seems to be a lot of people out there who think that people with a mental health crisis should keep quiet and deal with it on their own,” she continued. “It is time to shift that way of thinking: suicide is a medical emergency. Let’s start treating it like one.”

Update - 12:30 PM, 12/14/17: We’ve updated the article to clarify that Take This simply consulted on Twitch’s mental health policy. Further, we’ve expanded Dr. Boccamazzo’s quote about the organization’s view on how streamers engage with people in suicidal crisis. While Take This has made clear that organizations such as Twitch may take on liability if they engage directly with people threatening self-harm, the organization also emphasizes the importance of seeking help and has provided a set of resources to help streamers create supportive communities and resolve crises.