Hundreds of Rohingya refugees in the United Kingdom and the United States have sued Facebook, alleging that the social media giant aided the spread of anti-Rohingya hate speech. They are seeking more than $150 billion (£113 billion) in damages, claiming that Facebook’s platforms encouraged violence against persecuted minorities.
During a military crackdown in Buddhist-majority Myanmar in 2017, an estimated 10,000 Rohingya Muslims were killed.
Facebook, which is now known as Meta, did not immediately respond to the allegations.
The company is accused of allowing “the spread of vile and deadly falsehoods to persist for years.”
A British law firm representing some of the refugees in the UK has sent a letter to Facebook, seen by the BBC, alleging that:
- “Hate speech against the Rohingya people was magnified by Facebook’s algorithms.”
- The company “failed to invest” in moderators and fact-checkers who were familiar with Myanmar’s political context.
- The corporation did not take down or delete posts or accounts that incited violence against Rohingya Muslims.
- Despite warnings from NGOs and the media, it failed to “take appropriate and timely action.”
In San Francisco, lawyers filed a lawsuit against Facebook, accusing the company of being “ready to trade the lives of the Rohingya people for improved market penetration in a minor Southeast Asian country.”
They cite Facebook posts uncovered by a Reuters investigation, including one from 2013 that stated, “We must attack them like Hitler did the Jews.”
“Pour fuel and light fire so that they can meet Allah quickly,” another post read.
Myanmar has more than 20 million Facebook users. For many people, social media is their primary or sole source of news and information.
In 2018, Facebook recognized that it had not done enough to prevent incitement to violence and hate speech against Rohingya Muslims.
This came after a Facebook-commissioned independent assessment found that the social media network had created an “enabling environment” for the spread of human rights violations.
What happened in Myanmar was one of Facebook’s earliest red flags.
The social media site was extremely popular in the country, but the company had little grasp of what was happening on its own platform. It was not actively moderating content in Burmese or Rakhine, for example.
Had it done so, it would have seen anti-Muslim bigotry and misinformation about supposed terrorist plots by the Rohingya. Critics say this contributed to the escalation of ethnic tensions into terrible violence.
Mark Zuckerberg has admitted to making mistakes in the run-up to the massive violence in the country.
This is why the case is so intriguing: Facebook isn’t disputing that it could have done more.
However, whether this means the company is legally liable is another matter. Does the lawsuit stand a chance? Possibly, but it is a long shot. Just as Facebook’s parent company, Meta, tries to move attention away from the Facebook brand, it finds itself haunted by past mistakes.
In Myanmar, the Rohingya are considered illegal migrants and have faced discrimination from both the government and the general population for decades.
After Rohingya militants carried out deadly attacks on police posts in Rakhine state in 2017, the Myanmar military launched a harsh crackdown.
Thousands of people were killed, and more than 700,000 Rohingya fled to Bangladesh. Human rights abuses, including arbitrary killings, rape, and the burning of land, have also been widely reported.
In 2018, the United Nations criticized Facebook for being “slow and ineffectual” in combating online hatred.
Under US law, Facebook is largely shielded from liability for content posted by its users. However, the new complaint argues that Myanmar’s law, which offers no such protection, should govern the case.