The company allowed an “out-of-control spread of anti-Rohingya content” despite repeated warnings from civil society groups and human rights activists about its deadly consequences, according to a complaint filed Monday in state court in California.
Separately, members of the Rohingya community from Myanmar who now live in the U.K. and in refugee camps in Bangladesh told Meta they intend to pursue a lawsuit in the U.K.’s High Court over its failure to take action against hate speech on its platform.
The legal challenges add to the public scrutiny facing Meta following a series of critical media reports based on internal documents disclosed by former Facebook product manager-turned-whistle-blower Frances Haugen. The company is battling accusations that it has prioritized the growth of its platforms at the expense of fighting hate speech, disinformation and violent extremism.
“The last five years, and in fact just the last five months, have made it abundantly clear that Facebook’s path to promote the very worst of humanity was not the result of a bug but rather a carefully designed feature,” according to the complaint in San Mateo County Superior Court, near where Meta is based.
Meta is “appalled by the crimes committed against the Rohingya people in Myanmar” and continues to invest in “Burmese-language technology to reduce the prevalence of violating content,” company spokesman Andy Stone said in a statement.
The plaintiff, who isn’t named in the lawsuit, is seeking more than $150 billion in damages on behalf of an estimated 10,000 Rohingya Muslims in the U.S. who fled Myanmar after June 2012 to escape the threat of violence. Her lawsuit seeks class-action status.
In the U.K. legal challenge, the plaintiffs will argue that Facebook used algorithms that amplified hate speech against the Rohingya people and failed to invest in enough content moderators who spoke Burmese or Rohingya, according to the letter they sent the court.
“Despite Facebook’s acknowledgment of its role in such real-world harms and its proclaimed position as a positive force in the world, no meaningful compensation has been offered to any survivor,” the legal notice said.
On Thursday, 16 Rohingya young people and advocates in a refugee camp in Bangladesh plan to hold a press conference about a complaint against Facebook they will submit to the Organisation for Economic Co-operation and Development’s national contact point in Ireland, arguing its social network incited violence against their community.
Facebook instituted reforms after a company-commissioned study in 2018 found that its platform was being used to coordinate violent repression in Myanmar. More broadly, the company in recent years has stepped up its use of artificial intelligence and human-powered systems to rid its networks of problematic speech.
The company bans user-posted content that directs attacks against people on the basis of their race, country of origin, religion, sexual orientation and other sensitive attributes. It also bars users from posting messages that include calls for violence.
Facebook became so popular in Myanmar that, for the majority of its digitally connected residents, the social network became synonymous with the internet itself. Facebook dominated the developing market because it partnered with local mobile operators who agreed not to charge for the data used to support a cheap basic version of the app, and because it supported Myanmar fonts better than other tech platforms, according to the California suit.
By 2012, Myanmar’s military-dominated government and everyday users began spreading fearful and dehumanizing messages about Rohingya Muslims. By 2017, the site was being used to recruit and train “civilian death squads” to perpetrate violence, according to the complaint. In the end, tens of thousands of Rohingya were murdered while hundreds of thousands saw “indescribable violence and misery that they will carry with them for the rest of their lives,” the refugee alleged.
Lawyers for the refugee argue that, unlike the federal Communications Decency Act, which shields internet platforms from lawsuits over user-generated content, “Burmese law does not immunize social media companies for their role in inciting violence and contributing to genocide,” according to the complaint.
Facebook is participating in an international investigation of the Myanmar genocide led by the West African nation of Gambia, but was faulted by a judge in Washington, D.C., earlier this year for resisting disclosure of internal company records.
©2021 Bloomberg L.P. Distributed by Tribune Content Agency, LLC.