Thanks to the endlessly depressing extent to which covid has kept everybody trapped inside, Discord is more relevant than ever. But as the company revealed in its latest transparency report, that has led to new challenges, and to improved efforts to confront other challenges it probably should have put more effort into sooner.
Discord, which is reportedly in talks with Microsoft to sell for around 1.3 Bethesdas, released the transparency report today. Amid standard operational insights about Discord's second half of 2020, a few details stood out. For one, the overall number of user reports increased pretty steadily across 2020 (from 26,886 in January to 65,103 in December), with the number initially jumping up in March. This makes sense; people were trapped in their homes, and Discord was growing rapidly as a result. Spam resulted in the most account deletions (over 3 million), with exploitative content, including nonconsensual pornography, coming in a distant second (129,403) and harassment in third (33,615).
Discord also pointed out that of the reports made, it most frequently took action against issues involving child harm material, cybercrime, doxxing, exploitative content, and extremist or violent content. "This may be partly explained by the team's prioritization of issues in 2020 that were most likely to cause damage in the real world," the company said in the transparency report.
Indeed, according to the report, Discord removed over 1,500 servers for violent extremism in the second half of 2020, which it said was "nearly a 93% increase from the first half of the year." It cited groups like the Boogaloo Boys and QAnon as examples.
"This increase can be attributed to the expansion of our anti-extremism efforts as well as growing trends in the online extremism space," the company wrote. "One of the online trends observed in this period was the growth of QAnon. We adjusted our efforts to address the movement, ultimately removing 334 QAnon-related servers."
Cybercrime server deletions similarly shot up over the course of 2020, increasing by 140% from the first half of the year. In total, Discord removed almost 6,000 servers for cybercrime in the second half of 2020, which it said followed a significant increase in reports. "More cybercrime spaces than ever were flagged to Trust & Safety, and more were ultimately removed from our site," Discord wrote.
Discord also emphasized its focus on methods that allow it to "proactively detect and remove the highest-harm groups from our platform," pointing to its efforts against extremism as an example, but also noting where it made a mistake.
"We were disappointed to realize that in this period one of our tools for proactively detecting [sexualized content related to minors] servers contained an error," Discord wrote. "There were fewer overall flags to our team as a result. That error has since been resolved, and we've resumed removing servers the tool surfaces."
The other issue here is that Discord made a concerted effort to remove QAnon content around the same time other platforms did: after the lion's share of the damage had already been done. While removal may have been proactive according to Discord's internal definition, platforms were slow to even behave reactively when it came to QAnon as a whole, which led to real and lasting damage in the United States and across the world. Back in 2017, Discord also functioned as a major staging ground for the Unite the Right rally in Charlottesville, Virginia, which ultimately led to violence and three deaths. While the platform has tried to clean up its act since, it played host to an abundance of abuse and alt-right activity as recently as 2017.
Some transparency is much better than none, but it remains worth noting that tech companies' transparency reports often provide little insight into how decisions get made and the larger priorities of the platforms that essentially govern our online lives. Earlier this year, for example, Discord banned r/WallStreetBets' server at the height of GameStop stonksapalooza. Onlookers suspected foul play: outside interference of some sort. Speaking to Kotaku, however, two sources made it clear that labyrinthine internal moderation policies ultimately caused Discord to make that decision. Bad timing and substandard transparency before and after took care of the rest.
This is just a minor example of how this dynamic can play out. There are many more. Platforms can say they're being transparent, but ultimately they're just giving people a bunch of barely contextualized numbers. It's hard to say what real transparency looks like in the age of all-encompassing tech platforms, but it's not this.