Discord Comes Under Fire for Alleged Moderator Abuse and Furry Corruption

UPDATE (2/13 3:12 P.M.): Discord announced significant changes to its Terms of Service and policy regarding "cub play" imagery after this story broke. See tweet for info:

Two things.

1. We want to be more transparent about how our Trust and Safety team operates. Read: https://t.co/127Id3jvdl

2. After considering your feedback, we're changing our Community Guidelines to cover all cub cases. More details in above blog.

— Discord (@discordapp) February 13, 2019

Every day, Discord gathers 19 million people in chat rooms who discuss everything from video games to Steven Universe fan fiction. The platform, launched in May 2015, now faces scrutiny over allegations of illegal activity. Forbes reported that the FBI is investigating whether the chat application has been used as a marketplace for stolen items (including online passwords and accounts), hacking tips and even child grooming (befriending a minor online to persuade them into sexual abuse).

In the midst of all this public turmoil, Discord has seen an internal scandal rise: site moderators who fail to enforce the platform's rules because of personal bias, specifically among moderators and community members who identify as Furries. A #ChangeDiscord movement on social media grew once users learned that some moderators, in violation of their own code of conduct, selectively banned communities that shared sexually suggestive art depicting minors.

Discord said in a statement to Newsweek: "We take our community's safety very seriously and are constantly assessing and improving our trust and safety measures. As with any digital communication platform there are risks, and we work closely with law enforcement agencies in their investigations when appropriate to ensure and strengthen the safety of our community members."

Cub Play

Discord has Community Guidelines like all other social media platforms. People who break the rules can get banned by mods, and entire communities can lose their partner status, which includes prompt technical support from Discord and crucial moderation tools.

In February 2018, Discord issued a complete ban on NSFW content for partnered servers, telling Polygon in a statement, "This policy was created so that our community can feel comfortable within a designated partner server, and we will continue to provide this safe environment for all community members."

The Discord community guidelines specifically target "altered pornography which depicts minors" including shotacon and lolicon, or "loli" for short, which are drawings of young girls or boys in sexually suggestive scenarios.

In late January 2019, a user on the Discord subreddit posted a conversation they had with a platform admin known as TinyFeex. In the emails, TinyFeex argues that "cub" content is not in violation of Discord's terms of service. "Cub" is a term used in the furry community for underage members, with "cub play" being used to describe sexually explicit acts. Discord users cried foul in the comments, and soon subreddit admins deleted the post and pinned a response from Discord Trust and Safety mod karrdian.

"There is some overlap between 'cub' and 'loli,'" karrdian wrote. "There is also some segment of 'cub' art that is not, in fact, human or humanlike at all, but instead, for example, mythological creatures. This content is significantly greyer, which is why there isn't a blanket ban on all content that could conceivably fall under the umbrella."

Furries and Discord

Users began to share their interactions with the Terms of Service team at Discord. Twitter user MrTempestilence posted a Twitter thread on February 3 detailing a wide range of accusations, from instances of zoophilia on Discord to moderators allowing "cub play." Once those tweets started to gain traction, Discord's "gay zoo and feral" community was shut down.

"I originally intended for them to just be poking fun at how terrible Discord's staff were, but then once I started to add new information to the thread, I realized that what I was doing was important," MrTempestilence told Newsweek through Twitter DM (he asked that we withhold his real name due to privacy concerns). "People needed to know about how toxic Discord's staff are, and I was able to inform them."

The Twitter thread pulled in more than 8,000 likes and drew attention from the broader Twitter conversation. On February 4, freelance journalist Nick Monroe started to dig deeper into the connections between Discord Terms of Service moderators and the furry community. "The story unraveled itself as I pulled on the thread," Monroe told Newsweek. "It has become clear to me and everyone who reads the thread that something is wrong with Discord's trust and safety operation."

A "#ChangeDiscord" hashtag spread on social media as more users shared their experiences with the moderation team. Monroe and MrTempestilence claimed that a Discord Terms of Service moderator known as allthefoxes, an alleged furry himself, was the one who first created the "cub" policy to protect his favored community (a Discord representative did not respond to these claims).

One YouTuber cited by Monroe was QuackityHQ, whose content centers around encouraging his fans to "raid" video games like Animal Jam (think a flash mob inside a video game), commentary videos and TikTok reactions. Quackity's Discord account was banned on January 1 for allegedly violating Discord's terms of service regarding "raids." Quackity claims he violated no such rule, since Animal Jam isn't a part of Discord. Instead, he believes he was banned for a video he made that mocked people on a furry roleplay server.

"I googled the moderator's name...their bio clearly stated they were a part of the 'furry' community and after checking their following list, I found out that many of the Discord Trust and Safety team members had this same information in their bios," Quackity told Newsweek.

Discord said in a statement to Newsweek the platform "has a Terms of Service and Community Guidelines that all communities and users are required to adhere to. These specifically prohibit harassment, threatening messages, calls to violence or any illegal activity, and they cover more expansive activities than other platforms' rules such as doxxing and sharing private information. Though we do not read people's private messages, we do investigate and take immediate action against any reported violation by a server or user, which can include shutting down offending servers or banning users."

Moderating a social media platform is difficult, but it's crucial to the platform's longevity. If users feel that their bans are unjustified or that the rules are being bent for others' benefit, they may flock elsewhere.

Uncommon Knowledge

Newsweek is committed to challenging conventional wisdom and finding connections in the search for common ground.