A Rohingya refugee from Myanmar sits with two of her four children in their shelter at Nayapara camp, south-east Bangladesh.
IN THE WAKE OF THE MIDTERM ELECTIONS on Tuesday, tempers seem to be running hot at the White House: Attorney General Jeff Sessions, who drew the president’s ire by recusing himself from the Russia investigation, handed in his resignation on Wednesday at the president’s request, not long after Trump shouted down several reporters during a press conference. After Jim Acosta refused to give the microphone to an aide, Trump told the CNN reporter he was “the enemy of the people” because he spread “fake news,” then yelled at two other reporters and told them to sit down when they tried to ask questions. The White House later said Acosta’s official press pass had been withdrawn “until further notice.”
Is this a sign of what the US is in for with a divided Congress, or just more of the same from a notoriously combative president? That remains to be seen, but we shouldn’t let the sideshow antics of the White House distract us from other important developments outside the US political arena—including the fact that Facebook is being held accountable, at least in a small way, for its role in the ongoing violence against the Rohingya people in Myanmar. On Tuesday, the social network released an independent report on the human rights impact of its presence in that country, which it commissioned from the non-profit group Business for Social Responsibility (did Facebook try to bury the report by releasing it on the same day as the midterm elections? Let’s just say some people wondered about the convenient timing).
According to a blog post at Facebook, the report “concludes that, prior to this year, we weren’t doing enough to help prevent our platform from being used to foment division and incite offline violence.” However, the social network says the report acknowledges that it is now “taking the right corrective actions,” including hiring more people in Myanmar to moderate content on the site to prevent hate speech against the Rohingya from spiraling even further out of control. The report also says Facebook should have a formal human rights policy as it applies to content moderation, something the company said it is “looking into” doing.
But is Facebook really doing all it can to try to halt the spread of violence? Not according to the United Nations’ Fact-Finding Mission on Myanmar, which released its own report on Tuesday looking into the causes of the genocide that has taken place against the Rohingya in that country. Among other recommendations, the UN report says any commercial entity doing business in Myanmar should not “enter into an economic or financial relationship with the security forces,” including the military, known as the Tatmadaw. Facebook removed a number of pages and accounts associated with the Myanmar military in August, but Myanmar observers say those pages were used to distribute hate and foment violence for months, if not years, before Facebook finally took action against them.
But the UN’s number one call is for transparency. The UN report recommends that Facebook and other social media platforms allow for “an independent and thorough examination” of how their networks have been used to spread hatred. But the report notes that despite Facebook’s promise to do better on these kinds of issues than it has in the past, it has refused to provide country-specific data about hate speech on its platform, which the UN said “is imperative to assess the problem and the adequacy of its response.” The report also says that before Facebook enters a new market—particularly one with volatile ethnic or other tensions—it should conduct a human-rights assessment and take whatever measures it can to reduce the risk of fomenting violence, something it clearly didn’t do in Myanmar.
Here’s more on Facebook and its connection to violence in countries like Myanmar:
Reporters for The New York Times and Foreign Policy magazine who are based in Asia talked with CJR about the power that Facebook has in countries such as Thailand, Cambodia and Myanmar, where for many people the social network is the internet.
A group of six civil-society organizations in Myanmar wrote an open letter to Facebook CEO Mark Zuckerberg in April, saying the social network’s behavior in that country relied too much on third parties, failed to engage with local human-rights workers on important issues and exhibited “a lack of transparency.”
The Atlantic explained in 2016 how Myanmar went from being a country with a very low penetration of cell phones to one in which almost everyone has a smartphone, and almost all have Facebook installed on them by default.
Facebook-owned WhatsApp has been blamed for multiple deaths in India, but the company says cracking down on such content is difficult because messages are encrypted. Some believe WhatsApp is just a convenient scapegoat for the violence in India, and that the real problems are deeper than that.
© Copyright 2018 Columbia Journalism Review