
Xbox Automatically Banned 4 Million Accounts For Cheating Or Botting In 2022

According to Microsoft’s Digital Transparency Report, most bots get proactively kicked off the Xbox platform

Master Chief turns away from a "banned" sign on an alien planet.
Image: Microsoft / 343 / Kotaku / Andrii Yalanskyi (Shutterstock)

Microsoft recently published an Xbox transparency report that offers some details on how it moderated its online services during the first six months of 2022. According to the report, most enforcements were “proactive,” which means that Xbox, in most cases, issued a ban before anyone had reported that an account was misbehaving. Out of those bans, nearly 4.33 million were for “cheating” or “inauthentic accounts.”

This doesn’t necessarily mean that millions of bad actors are roving around trying to muck up your online games. Microsoft claims that, 99.99% of the time, it automatically bans fake accounts “as soon as they’re created.” According to Microsoft, automation “helps to find resolution sooner, reduce the need for human review, and further reduce the impact of toxic content on human moderators.”


Other violations that Xbox machine-moderates include sexual content, fraud, harassment, profanity, and phishing. But those categories make up a minuscule share of total enforcements: over the six-month period, Xbox’s detection software banhammered fewer than half a million accounts for transgressions in those categories.


It probably goes without saying that software has a harder time figuring out what is and isn’t harassment. Perhaps because of that, Microsoft reports that everyday users play a prominent role in shaping the environment of Xbox online gaming, specifically when it comes to comments, user-generated content, and bad sportsmanship. During the reported six-month period, players made 33 million such reports to customer service. These reports are also reviewed by software before being escalated to a human moderator.



Despite handling a whopping 33 million human-generated reports, Microsoft says that number represents a 36% decline compared to the same period last year. It’s not clear if that means that online games on Xbox are less toxic compared to a year ago, or if players are not reporting incidents for other reasons.


Of course, machines have less discerning judgment than human moderators, and Microsoft did review 151,000 appeals from players who felt they had been unfairly banned. But if you ever have to make an appeal, I wouldn’t be too hopeful about your odds. Microsoft’s data shows that only six percent of accounts that go through the appeal process ever get reinstated. The vast majority of the time, Microsoft seems quite sure about its decision to ban you from its service, so think real hard about whether or not you want to say that slur in the chat.