Looking back at the past year, the company hasn’t provided exact figures on how many moderators it currently employs, but it did say that the number had increased fourfold, and it celebrated the fact that 95 percent of all live content on the service was viewed by some form of moderator by the end of 2020. With 40 percent more channels on Twitch during the second half of the year, the company reported that chat messages removed by its AutoMod and blocked-terms features increased by 61 percent. Manual deletions by content creators and moderators surged even more, climbing 98 percent. Perhaps most importantly, the platform found that rule enforcements against reported users and channels rose 41 percent, covering categories including violence, gore, nudity, sexual harassment, general hateful conduct, and even terrorist propaganda.
“At Twitch, we believe everyone in our community – creators, viewers, moderators, and Twitch – plays a big role in promoting the health and safety of our community,” the report says. “Through the Community Guidelines, we try to make clear what expression and behavior are allowed on the service, and what is not. We then rely on community moderation actions and user reporting, along with technological solutions, such as machine learning and proactive detection, to ensure the Community Guidelines are upheld. Creators and moderators (colloquially known as ‘mods’) also use tools that we and third-parties provide, such as AutoMod, Mod View, and moderation bots, to enforce Twitch service wide standards, or to set higher standards in their own channels.”
In other tech-related news, Apple’s iPhone 13 now has a rough potential release window.