First-person shooters like Call of Duty are somewhat infamous for the toxicity of their lobbies and voice chats. Surveys have dubbed the franchise’s fan base the most negative in all of gaming; a feud between two players once resulted in the summoning of a SWAT team. Activision has been trying to crack down on this behavior for years, and part of the solution might involve artificial intelligence.
Activision has partnered with a company called Modulate to bring “in-game voice chat moderation” to its titles. The new moderation system, powered by an AI technology called ToxMod, will work to identify behaviors like hate speech, discrimination, and harassment in real time.
ToxMod’s initial beta rollout in North America begins today. It’s active…