New rules challenge Google and Facebook to change the way they moderate users

In recent years, content moderation has reached a breaking point. We have seen every form of ugliness thrive on platforms like Facebook, Twitter, and YouTube, whether it is coordinated harassment, impostor accounts, foreign political influence, or strange algorithmic behavior. At the same time, inconsistent and sometimes heavy-handed moderation has become an increasingly partisan issue, with conservative celebrities appearing before Congress to make dark claims of censorship. In both cases, the loss of trust is palpable, fueled by an underlying lack of transparency. A large share of the world's discourse now takes place on closed platforms like Facebook and YouTube, yet users still have little control over, or insight into, the rules that govern that discourse.

Today, a coalition of nonprofit groups attempted to address that gap with a list of baseline moderation standards, called the Santa Clara Principles on Transparency and Accountability in Content Moderation, designed as a set of minimum standards for how user content is treated online. The final product draws on work by the American Civil Liberties Union, the Electronic Frontier Foundation, the Center for Democracy & Technology, and New America's Open Technology Institute, along with a number of independent experts. Together, they call for more thorough notice when a post is taken down, a stronger appeals process, and new transparency reports covering the total number of posts and accounts suspended.

They are simple measures, but they would give users far more information and recourse than they currently get on Facebook, YouTube, and other platforms. The result is a new roadmap for platform moderation and an open challenge to any company that moderates content online.

Under the Santa Clara rules, any time an account or other content is taken down, the user would get a specific explanation of how and why the content was flagged, with a reference to the specific guideline it violated. The user could also challenge the decision and present new evidence to an independent human moderator on appeal. Companies would also produce a regular moderation report, modeled on existing reports on government data requests, listing the total number of accounts flagged and the rationale for each flag.

"What we're talking about is basically the internal law of these platforms," ​​says Open Bank Institute director Kevin Bankston, who worked on the document. "Our goal is to make sure it's as open a process as possible."

So far, companies have kept quiet about the new guidelines. Google and Twitter declined to comment on the new rules; Facebook did not respond to multiple requests for comment.

But while companies have yet to weigh in on the Santa Clara rules, some are moving forward with similar measures on their own. Facebook published its moderation guidelines for the first time last month, laying out the specific rules on violence and nudity that have guided its decisions for years. The company also established its first formal appeals process for users who believe they have been suspended by mistake.

YouTube comes closer to meeting the rules, although it still falls short on transparency. The platform already has a notice-and-appeal process, and its guidelines have been public from the beginning. YouTube launched its first quarterly moderation report in April, detailing the 8.2 million videos removed during the last quarter of 2017. But while the report breaks down the policies involved in human flags, it does not offer the same detail when content is flagged by the automated systems responsible for most of YouTube's content removals.

The Santa Clara document confines itself to questions of process, leaving aside many of the thornier questions surrounding moderation. The rules do not say what content should be removed or whether a particular post should be considered a threat to users' safety. Nor do they address political speech or newsworthiness exceptions, such as the political provocations of world leaders on Twitter.

But many of the experts involved say the rules are meant as a minimum set of standards rather than a final list of demands. "I've been very critical of some specific policies, from nudity to terrorism," says Jillian C. York, who worked on the rules for the EFF. "Ultimately, though, I don't believe content moderation is going away any time soon, so mediating it through transparency and due process is a great start."
