Blizzard test deployment machine learning for moderation of Overwatch

in #gaming • 7 years ago

๐๐ฅ๐ข๐ณ๐ณ๐š๐ซ๐ ๐ญ๐ž๐ฌ๐ญ๐ฌ ๐ญ๐ก๐ž ๐ฎ๐ฌ๐ž ๐จ๐Ÿ ๐ฆ๐š๐œ๐ก๐ข๐ง๐ž ๐ฅ๐ž๐š๐ซ๐ง๐ข๐ง๐  ๐ญ๐จ ๐ญ๐š๐œ๐ค๐ฅ๐ž ๐ฎ๐ง๐ฐ๐š๐ง๐ญ๐ž๐ ๐ฅ๐š๐ง๐ ๐ฎ๐š๐ ๐ž ๐ฎ๐ฌ๐ž ๐ข๐ง ๐Ž๐ฏ๐ž๐ซ๐ฐ๐š๐ญ๐œ๐ก. ๐“๐ก๐ข๐ฌ ๐ข๐ฌ ๐๐จ๐ง๐ž ๐ข๐ง ๐ฌ๐ž๐ฏ๐ž๐ซ๐š๐ฅ ๐ฅ๐š๐ง๐ ๐ฎ๐š๐ ๐ž๐ฌ. ๐ˆ๐ง ๐ญ๐ก๐ž ๐ฅ๐จ๐ง๐  ๐ญ๐ž๐ซ๐ฆ, ๐ญ๐ž๐œ๐ก๐ง๐จ๐ฅ๐จ๐ ๐ฒ ๐ฆ๐ฎ๐ฌ๐ญ ๐š๐ฅ๐ฌ๐จ ๐›๐ž ๐š๐›๐ฅ๐ž ๐ญ๐จ ๐š๐ฌ๐ฌ๐ž๐ฌ๐ฌ ๐ฆ๐จ๐ซ๐ž ๐ญ๐ก๐š๐ง ๐ฃ๐ฎ๐ฌ๐ญ ๐ฅ๐š๐ง๐ ๐ฎ๐š๐ ๐ž, ๐ฌ๐ฎ๐œ๐ก ๐š๐ฌ ๐›๐ž๐ก๐š๐ฏ๐ข๐จ๐ฎ๐ซ.

Jeff Kaplan, game director of Blizzard's Overwatch shooter, said in an interview with Kotaku that the company is experimenting with machine learning and is trying to teach a system what unwanted language is. The goal is for the AI to tackle such language faster, without having to wait for a player to report it. The system covers language use in several languages, such as English and Korean. At the moment, Blizzard is using it to deal with the most blatant cases.

"In everything related to reporting and punishing players, you need to start with the most extreme examples and see how the rules can be adjusted," says Kaplan to the site. The detection of unwanted language would not analyze messages between friends directly. In the long term, it must also be possible to detect undesirable behaviour in the game. It is unclear how far Blizzard has come with this. Kaplan says: "That is the next step. For example, how do you know if Mei's ice cream wall in the spawn room has been built by a monkey?

The Overwatch team is also looking at ways to reward positive behaviour in the game. Together with companies such as Twitch and League of Legends maker Riot Games, it is part of the so-called Fair Play Alliance, which works on 'healthy communities' in online games. In LoL, such a system already exists in the form of the Honor system. Machine learning is used for moderation elsewhere too; for example, comments under New York Times articles are moderated with the help of a Google service called Jigsaw.
