Xbox has released its first-ever Digital Transparency Report, offering insight into the work its community and moderation teams have done over the last six months in the name of player safety – including the news it's taken action on 7.3m community violations during that time.
Microsoft says it's publishing its inaugural Digital Transparency Report as part of its "long-standing commitment to online safety", with the aim of releasing an updated report every six months so it can address key learnings and do more to "help people understand how to play a positive role in the Xbox community."
Much of the document is spent reiterating the various tools and processes Microsoft and Xbox players have at their disposal to ensure users remain safe and community guidelines are adhered to, ranging from parental controls to reporting.
However, the latter half of Microsoft's report goes into more specific detail about its actions over the last six months, breaking down the number of player reports versus the number of enforcements (that is, instances where content is removed, accounts are suspended, or both) across policy areas including cheating, inauthentic accounts, adult sexual content, fraud, harassment and bullying, profanity, and phishing during that time.