Refugees say Facebook knew of the ground situation in Myanmar, but ignored complaints of hate speech
Rohingya refugees in the US and UK are suing Facebook for more than $150bn (£113bn), accusing the social media giant of allowing the spread of hate speech and dangerous misinformation against the community.
Facebook allowed hate speech to fester in Myanmar for years, long after it was informed of the genocide perpetrated against the country’s persecuted minority, according to the lawsuit that was filed on Monday in San Francisco, on behalf of an estimated 10,000 Rohingya people in the US.
The refugees are suing the social media company for “compensatory damages, in excess of $150bn, in addition to punitive damages in an amount to be determined at trial” for promoting violence against the community.
Lawyers in the UK have also submitted a letter of notice to Facebook’s London office in a coordinated effort, Reuters reported.
While the Rohingya have long been persecuted in Myanmar, the lawsuit said the introduction of Facebook into the country in 2011 “contributed to the development and widespread dissemination of anti-Rohingya hate speech, misinformation, and incitement of violence.”
Over the following decade, human rights abuses and sporadic violence escalated into terrorism and mass genocide, the lawsuit said.
The refugees accused Facebook of being “willing to trade the lives of the Rohingya people for better market penetration in a small country in Southeast Asia.”
The lawsuit said Facebook Messenger was used to spread similar but conflicting chain messages to Muslim and Buddhist communities, inciting communal violence in the region in early September 2017.
It cited several Facebook posts, reported by Reuters as early as 2013, including one that said: “We must fight them the way Hitler did the Jews, damn Kalars [a derogatory term for Rohingya people].”
Another post, from 2018, showed a photo of a boat full of Rohingya refugees with the caption: “Pour fuel and set fire so that they can meet Allah faster.”
One of the most dangerous campaigns, according to the lawsuit, came in 2017 when “the military’s intelligence arm spread rumours on Facebook to both Muslim and Buddhist groups that an attack from the other side was imminent… (sic).”
In the aftermath of the online misinformation, over 10,000 people were killed and at least 725,000 Rohingya fled to neighbouring Bangladesh by September 2018.
The UN’s Independent International Fact-Finding Mission on Myanmar reported that during the course of the Myanmar military’s ethnic cleansing campaign, over 40 per cent of all villages in the northern Rakhine State were partially or totally destroyed.
It specifically alleged that Facebook had contributed to the Myanmar military’s ethnic cleansing operations.
The UN mission examined documents, Facebook posts and audio-visual materials that contributed to shaping public opinion on the Rohingya and confirmed that a “carefully crafted hate campaign” developed a negative perception of Muslims among the Myanmar population, stated the lawsuit.
“This discourse created a conducive environment for the 2012 and 2013 anti-Muslim violence in Rakhine State and beyond, without strong opposition from the general population. It also enabled the hardening of repressive measures against the Rohingya and Kaman in Rakhine State and subsequent waves of State-led violence in 2016 and 2017,” it noted.
Citing the work of Alan Davis, an analyst from the Institute for War and Peace Reporting, the lawsuit noted that in the months before the ethnic cleansing operation, posts on Facebook became “more organised and odious, and more militarised,” and the Myanmar military used posts on the social media platform to justify the “clearance operations.”
The allegations made in the lawsuit centre around the claim that Facebook knew of the ground situation in Myanmar, but chose to ignore complaints of hate speech made on the platform.
The lawsuit said that as Facebook’s algorithms amplified hate speech against the Rohingya community, the social media giant failed to invest in local moderators and fact-checkers.
The Rohingya refugees said Facebook’s response to warnings about hate speech was “utterly ineffective”.
They said the platform was completely unprepared, with just one Burmese-speaking content reviewer in 2014, a contractor based in Dublin, and a second Burmese speaker who only began working in early 2015.
Facebook whistleblower Frances Haugen, who testified before the US Congress in October, claimed that the platform prioritised profits over people’s wellbeing. She also alleged that the company was not doing enough to stop ethnic violence in countries including Ethiopia and Myanmar.
She noted that only a small fraction of the company’s spending on combating misinformation goes towards curbing hate speech in regional languages.
But responding to these allegations, Facebook said it employed a “comprehensive” strategy with the use of native speakers and third-party fact-checkers.
The Independent has reached out to Facebook for a comment about the allegations levelled in the Rohingya lawsuit.