
Meta’s Facebook Algorithms ‘Proactively’ Promoted Violence Against the Rohingya


A woman checks Facebook on her phone in Yangon on Feb. 4, 2021, days after Myanmar’s generals seized power. STR/AFP/Getty Images

Amnesty International on Wednesday accused Facebook’s parent company Meta of having “substantially contributed” to human rights violations perpetrated against Myanmar’s Rohingya ethnic group.


In a new report, Amnesty claims that Facebook’s algorithms “proactively amplified” anti-Rohingya content. It also alleges that Meta ignored civilians’ and activists’ pleas to curb hate-mongering on the social media platform while profiting from increased engagement.

Facebook’s seeming inability to manage online hate speech and disinformation has become a major offline problem for many countries across the globe. Amnesty is calling for the tech giant to provide reparations to affected communities.


The Rohingya have been persecuted by Myanmar’s Buddhist majority for decades, but Facebook has exacerbated the situation, Amnesty says. The human rights group claims that the Tatmadaw, Myanmar’s armed forces, used Facebook to boost propaganda against the Rohingya and to amass public support for a military campaign of rampant killings, rape and arson targeting the predominantly Muslim minority in August 2017.


In the aftermath, more than 730,000 Rohingya from Myanmar's western Rakhine state were forced to take refuge in camps in neighboring Bangladesh. Today, more than a million Rohingya are living in exile, and Myanmar's military leaders are facing charges of genocide at the International Court of Justice.


A U.N. fact-finding mission in 2018 determined that Facebook had been a "useful instrument" for vilifying the Rohingya in Myanmar, "where, for most users, Facebook is the internet." Months later, Meta released a commissioned human rights impact report in which the company admitted it was not doing enough to stop the sowing of hatred against the Rohingya on the platform. Meta has since said it has hired more Burmese-speaking content moderators and improved its technology to address the problem.


Amnesty analyzed internal Meta documents released by whistleblower Frances Haugen in 2021, along with various public reports, and interviewed Rohingya activists and former Meta staff. It concludes that Facebook's parent company, then known as Facebook Inc., was made aware of its role in fueling atrocities against the Rohingya years before the 2017 violence, yet failed to heed those warnings at the time and took "wholly inadequate" measures to address the harm after the fact.


Lead researcher Pat de Brun told TIME the Amnesty report shows the “clear and severe danger” Meta and its engagement-based business model pose to human rights, at-risk communities and conflict-affected areas.


The report