
Myanmar: Facebook Promotes Violence Against Coup Protesters

Posts ranging from wanted posters to death threats have remained online for months, breaching the platform’s own standards.


The Global Witness report said Facebook’s promotion of content advocating violence against coup protesters in Myanmar showed that self-regulation was not working. Photograph: Stringer/Reuters

Facebook is promoting content that incites violence against Myanmar’s coup protesters and amplifies junta misinformation, despite promising to clamp down on the misuse of its platform, according to a study.


An investigation by the rights group Global Witness found that Facebook’s recommendation algorithm continues to invite users to view content that breaches its own policies. After liking a Myanmar military fan page, which did not contain recent posts violating Facebook’s policies, the rights group found that Facebook suggested several pro-military pages that contained abusive content.


Among the posts featured on one of the pages was an image of a “wanted” poster offering a $10m bounty for the capture “dead or alive” of a young woman. The post claimed she was among protesters who had burned down a factory following a military crackdown. Images of the woman’s face and a screenshot of what appeared to be her Facebook profile were posted alongside a caption reading: “This girl is the one who committed arson in Hlaing Tharyar. Her account has been deactivated. But she cannot run.”


Global Witness said that its report demonstrated that self-regulation by Facebook was not working, and called for Facebook’s recommendation algorithm to be subject to independent audit.


Other posts identified by Global Witness included a death threat, the glorification of military violence and misinformation, such as the incorrect claims that Isis is present in Myanmar, and that the military had seized power due to “voter fraud”. The military has accused Aung San Suu Kyi’s party of vote rigging in last year’s election in order to justify February’s coup – a suggestion that has been discredited by observers, including by the independent monitoring group Asian Network for Free Elections.


Facebook said in February that it would remove false claims of widespread fraud or foreign interference in Myanmar’s November election from its site. It also said it had banned military-controlled state and media entities, and introduced a specific policy for Myanmar “to remove praise, support and advocacy of violence by Myanmar security forces and protestors”. Content that supported the arrests of civilians by the military and security forces in Myanmar would be removed under this policy.


A spokesperson for Facebook said its staff “closely monitor” the situation in Myanmar in real time and that the company has taken action on any posts, pages or groups that break its rules.


However, content identified by Global Witness has remained online for months, according to the rights group.

Separate analysis by the Guardian found numerous recent examples of posts that also appeared to breach Facebook’s standards:

  • In one post from 19 June, which received more than 500 likes, an image showed a man with a bloodied face and a rope tied around his neck. The caption stated: “This is how you should arrest them”, referring to protesters.

  • Posts often mock and encourage violence against protesters. One post, also from 19 June, referred to a recent flower strike, in which protesters wore flowers to mark Aung San Suu Kyi’s 76th birthday, stating: “Every single one of the real men that wore the flowers in public today must be killed … Trash. They all need to be killed so that the children will not have the wrong role models.” The post was liked 175 times.

  • Another post, from 1 June, targeted children. It showed an image of students outside their school, with a sign that stated: “We are students and we will go to school. You are criminals, and you will go to prison.” Many children have not returned to school, despite orders to do so by the junta. The post had been liked more than 4,300 times.

  • Posts often share misinformation, for example blaming pro-democracy politicians for leading “terrorists”. One post stated that the “only real news outlets in this country are MOI, MRTV and MWD and other state-run news”, referring to military-controlled channels.

Facebook has previously acknowledged that its platform has been misused in Myanmar, where it is hugely popular and influential. The site is used by almost half the population and, for many, it is the primary way of accessing the internet.


In 2018, following the massacre of Rohingya Muslims by the military, Facebook admitted that its platform had been used to “foment division and incite offline violence”. A UN fact-finding mission drew similar conclusions the same year, stating that Facebook had been “a useful instrument for those seeking to spread hate” and that the response of the company had been “slow and ineffective”.


In February, Facebook said its staff were working around the clock to keep its platform safe. The coup greatly increased the likelihood “that online threats could lead to offline harm”, Facebook said at the time.


The Global Witness report also called for Facebook to further investigate other types of content it hosted, including the circulation of forced confession videos of political prisoners, military adverts, and posts that amplified military propaganda – such as claims that the army is acting in a measured way.


In a statement, Facebook said: “We proactively detect 99% of the hate speech removed from Facebook in Myanmar, and our ban of the Tatmadaw [military] and repeated disruption of coordinated inauthentic behaviour has made it harder for people to misuse our services to spread harm. This is a highly adversarial issue and we continue to take action on content that violates our policies to help keep people safe.”



© 2021 Guardian News & Media Limited or its affiliated companies.
