Activision has partnered with a company called Modulate to moderate voice chats using an AI technology called ToxMod. According to The Verge, the tool “will work to identify behaviors like hate speech, discrimination, and harassment in real time.” From the report:

ToxMod’s initial beta rollout in North America begins today. It’s active within Call of Duty: Modern Warfare II and Call of Duty: Warzone. A “full worldwide release” (it does not include Asia, the press release notes) will follow on November 10th with the release of Call of Duty: Modern Warfare III, this year’s new entry in the franchise. Modulate’s press release doesn’t include too many details about how exactly ToxMod works. Its website notes that the tool “triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and enables moderators to quickly respond to each incident by supplying relevant and accurate context.”

The company’s CEO said in a recent interview that the tool aims to go beyond mere transcription; it also takes into account factors such as a player’s emotion and volume in order to differentiate harmful statements from playful ones. Notably, the tool (for now, at least) will not take action against players on its own; it will merely submit reports to Activision’s human moderators.
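The flow described above — flag a clip, score it with tonal context so playful trash talk ranks below the same words shouted in anger, then forward anything over a threshold to human moderators rather than acting automatically — can be sketched roughly as follows. This is a hypothetical illustration; the class names, keyword list, weights, and thresholds are all invented for the example and are not Modulate’s actual API.

```python
from dataclasses import dataclass
from typing import Optional

# Placeholder keyword list; a real system would use trained classifiers.
FLAGGED_WORDS = {"slur_a", "slur_b"}

@dataclass
class VoiceClip:
    transcript: str
    volume_db: float    # loudness of the utterance
    anger_score: float  # 0.0-1.0 emotion estimate from a tone model

def triage(clip: VoiceClip) -> bool:
    """Cheap first pass: does the transcript contain a flagged keyword?"""
    return any(w in FLAGGED_WORDS for w in clip.transcript.lower().split())

def toxicity(clip: VoiceClip) -> float:
    """Second pass: weight the flag by tone and volume, so the same words
    spoken calmly score lower than when shouted in anger."""
    base = 1.0 if triage(clip) else 0.0
    tone = 0.5 + 0.5 * clip.anger_score
    loudness = 1.2 if clip.volume_db > 80 else 1.0
    return base * tone * loudness

def report(clip: VoiceClip, threshold: float = 0.6) -> Optional[dict]:
    """Build a report for human review; never act on the player directly."""
    score = toxicity(clip)
    if score < threshold:
        return None
    return {
        "transcript": clip.transcript,
        "score": round(score, 2),
        "action": "forward_to_moderators",  # humans decide, not the tool
    }
```

Under this sketch, a flagged word shouted angrily produces a report, while the same word in a calm, quiet clip falls below the threshold and is dropped — mirroring the CEO’s point about separating harmful statements from playful ones.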
