Microsoft Corp.’s Xbox released its first transparency report on Monday, detailing how the gaming giant moderates its 3 billion global players.
Xbox took action against more than 4.3 million inauthentic accounts between January and June, according to the report, after a ninefold increase in proactive moderation compared with the same period a year earlier. These inauthentic accounts are typically automated or bot accounts that can be used to trick or harass players through spam, facilitate cheating, inflate friend or follower counts, or launch distributed denial-of-service, or DDoS, attacks.
Dave McCarthy, Xbox corporate vice president of player services, said the increase in proactive moderation is an effort to “weed out” the accounts before they hit the system. Proactive enforcements, which made up 65% of the total, refer to artificial intelligence or human moderators identifying, analyzing and taking action on behavior that violates Xbox’s community standards. Microsoft also relies on players to report inappropriate content, a process it calls reactive moderation.
Inauthentic accounts aren’t just from bots advertising cheat codes for games, McCarthy said. “There’s regular activity by nation-state actors and other funded groups attempting to distribute content that has no place on our services,” he added.
Xbox joins a growing number of gaming and gaming-service providers that plan to release transparency reports on a regular basis in an effort to crack down on toxicity and abuse and to create a safer experience for players. Twitch, the live-streaming gaming site owned by Amazon.com Inc., released its first report in early 2021, and Discord released its first in 2019. Xbox’s Japanese console competitors, Sony Group Corp.’s PlayStation and Nintendo Co., don’t release analogous moderation data. Microsoft said it would release a report every six months.
McCarthy declined to comment on the size or employment status of Xbox’s human moderators. Discord and Twitch provide significantly more detail on their moderation efforts, including the number of subpoenas processed and data on extremist content. McCarthy said Xbox is “learning our way into what a good transparency report looks like for us.”
Microsoft was able to increase proactive moderation so sharply in part because of the company’s 2021 acquisition of content-moderation provider Two Hat, known for its text-filtering software. McCarthy also cited Microsoft’s broader resources, documented in its biannual digital trust report, that allow the company to “harvest more products out of Microsoft Research using video and image detection.”
Unlike Tencent Holdings Ltd.-owned Riot Games, which makes popular games League of Legends and Valorant, Xbox doesn’t capture or analyze players’ voice audio when a report for harassment is submitted. McCarthy said Xbox may increase its resources in that area in the future, but noted there are privacy considerations.
Xbox took action on 1 million accounts for profanity, 814,000 for adult sexual content and 759,000 for harassment or bullying in the first six months of 2022, according to the report. Xbox players submitted more than 33 million reports over that span, a 36% decline from the same period a year earlier.
Microsoft is awaiting regulatory approval for its purchase of Activision Blizzard Inc., which could bring popular games like Call of Duty and Overwatch into its moderation coverage. McCarthy declined to comment on how or whether Xbox will participate in moderating those games, citing the ongoing regulatory review.