Facebook failed to prevent its platform from being used to auction off a 16-year-old girl for marriage in South Sudan.
Child, early and forced marriage (CEFM) is the most common form of gender-based violence in South Sudan, according to a recent report by Plan International on the myriad risks facing teenage girls living in the war-torn region.
Now it seems girls in that part of the world also have social media to worry about.
Vice reported the story in detail yesterday, noting that Facebook removed the auction post, but not until after the girl had already been married off, and more than two weeks after the family first announced its intention to sell the child via its platform, on October 25.
Facebook said it became aware of the auction post on November 9, after which it says the post was removed within 24 hours. It is not clear how many of those 24 hours it took Facebook to decide to remove the post.
A wealthy businessman from South Sudan's capital won the auction after offering a record "price": 530 cows, three Land Cruiser V8 cars and $10,000 to marry the girl, Nyalong Ngong Deng Jalang.
Plan International told Vice it is the first known incident of Facebook being used to auction off a child bride.
"It's really worrying because, because" It was such a lucrative transaction and it attracted so much attention, we are concerned that this may act as an incentive for others to follow suit, "the development organization told Vice.
A different human rights NGO posted a screengrab of the removed auction post on Twitter, writing: "Despite various appeals made by a human rights group, a 16-year-old girl became the victim of an online auction post in South Sudan which Facebook did not take down."
We asked Facebook to explain how it failed to act in time to prevent the auction, and the company sent us the following statement, attributed to a spokesperson:
Any form of human trafficking, whether posts, pages, ads or groups, is not allowed on Facebook. We removed the post and permanently disabled the account belonging to the person who posted it to Facebook. We are always improving the methods we use to identify content that breaks our policies, including doubling our safety and security team to more than 30,000 and investing in technology.
The more than two-week delay between the auction post going up and Facebook removing it raises serious doubts about its claims of having made substantial investments to improve its moderation processes.
Human rights groups tried to flag the post directly to Facebook. The auction also reportedly attracted the attention of local media. Yet the company did not notice or act until weeks later, by which time it was too late: the girl had been sold and married off.
Facebook does not publish country-level data about its platform, so it is not clear how many users it has in the South Sudan region.
Nor does it provide a breakdown of where the roughly 15,000 people it employs or contracts to carry out content review across its global platform (which has more than 2 billion users) are located.
Facebook admits that the content reviewers it uses do not speak every language spoken where its platform is used. They do not even cover all of the world's widely spoken languages. It is therefore highly unlikely to have reviewers who know the indigenous languages spoken in the South Sudan region.
We asked Facebook how many moderators it employs who speak any of the languages spoken in the (multilingual) South Sudan region. A spokeswoman was unable to provide an immediate answer.
The result of Facebook carrying out retrospective content moderation from afar, relying on a small number of reviewers relative to its total user base, is a company that is not responsive to human needs.
Facebook has not established on-the-ground teams across its international business with the linguistic and cultural sensitivities necessary to respond directly, or even quickly, to the risks its platform creates in each market where it operates. (A large proportion of its reviewers are located in Germany, which passed a social media hate speech law a year ago.)
Artificial intelligence will not fix that hard problem either, not on any human timescale. And in the meantime, Facebook is leaving real humans to take the strain.
But two weeks to notice and remove a child bride auction is not the kind of metric any company wants to be measured by.
It is increasingly clear that Facebook's failure to invest adequately in its international business to monitor and manage the human rights impacts of its technological tools can be very costly.
In South Sudan, that lack of adequate oversight has seen its platform repurposed as the equivalent of a high-tech slave market.
Facebook is also still struggling with serious failures in Myanmar, where its platform has been blamed for spreading hate speech and accelerating ethnic violence.
You do not have to look very far to find other human rights abuses aided and abetted by unchecked access to social media tools.