
Facebook Knew It Was Used to Incite Violence in Ethiopia

It did little to stop the spread, documents show.

By Eliza Mackintosh

Mekelle, the regional capital of Tigray, in northern Ethiopia, seen through a broken window in the Ayder Referral Hospital in May.

Facebook employees repeatedly sounded the alarm on the company's failure to curb the spread of posts inciting violence in "at risk" countries like Ethiopia, where a civil war has raged for the past year, internal documents seen by CNN show.

The social media giant ranks Ethiopia in its highest priority tier for countries at risk of conflict, but the documents reveal that Facebook's moderation efforts were no match for the flood of inflammatory content on its platform.

The documents are among dozens of disclosures made to the US Securities and Exchange Commission (SEC) and provided to Congress in redacted form by Facebook whistleblower Frances Haugen's legal counsel. A consortium of 17 US news organizations, including CNN, has reviewed the redacted versions received by Congress.

They show employees warning managers about how Facebook was being used by "problematic actors," including states and foreign organizations, to spread hate speech and content inciting violence in Ethiopia and other developing countries, where its user base is large and growing. Facebook estimates it has 1.84 billion daily active users -- 72% of whom are outside North America and Europe, according to its annual SEC filing for 2020.

The documents also indicate that the company has, in many cases, failed to adequately scale up staff or add local language resources to protect people in these places.

Facebook used by militias 'to seed calls for violence'

The reports CNN has obtained provide further insights into the scale of the problem in Ethiopia, elements of which were reported by The Wall Street Journal last month.

CNN's publication of these warnings from within Facebook comes seven months after a Facebook team initially shared an internal report entitled "Coordinated Social Harm."

The report, distributed in March, said that armed groups in Ethiopia were using the platform to incite violence against ethnic minorities in the "context of civil war." At that time, a conflict in the country's northern Tigray region between its former ruling party, the Tigray People's Liberation Front (TPLF), and the Ethiopian government had been rumbling on for five months. Intermittent internet blackouts and media restrictions had obscured much of the fighting.

A destroyed tank on a roadside in western Tigray in May.

Ethiopia is an ethnically and religiously diverse nation of about 110 million people who speak scores of languages. Its two largest ethnic groups, the Oromo and Amhara, make up more than 60% of the population. The Tigrayans, the third largest, are around 7%.

One of the groups flagged in the March report was the "Fano," an ethnic Amhara militia group with a reputation for brutality that has been drawn into the war in Tigray, sometimes fighting alongside Ethiopian government forces. Facebook said it had observed a cluster of accounts affiliated with the militia group, including some based in Sudan, using its platform to "seed calls for violence," promote armed conflict, recruit and fundraise.

Since the war started last November, the Fano militia have been linked by displaced Tigrayans to human rights abuses, including the killings of civilians, looting and rape, according to the United Nations rights office, Amnesty International and other human rights groups.

Though the Facebook team said it had recommended the Fano-affiliated network be taken down, it suggested that other bad actors promoting violence on its platform were simultaneously slipping through the cracks. In a headline in bold, the team warned: "Current mitigation strategies are not enough."

The Facebook documents also detail the platform's removal of a cluster of accounts linked to the Oromo diaspora, mostly based in Egypt, which was targeting Ethiopian audiences with highly inflammatory content, including "explicit calls to violence against government officials and other ethnic groups." One inciteful post highlighted in a report shared a photo of what appears to be a Molotov cocktail being lit and the statement: "Burn the whole country down."

The whistleblower, Haugen, said one of her core motivations for gathering the internal documents was bringing to light "how badly Facebook is handling places like Ethiopia," where she suggested engagement-based ranking was fanning ethnic violence.

"I genuinely fear that a huge number of people are going to die in the next five to ten years, or twenty years, because of choices and underfunding" by Facebook, Haugen said.

In comments made to the consortium, Haugen emphasized the vast difference between the integrity and security systems rolled out by Facebook in the United States versus the rest of the world, adding that the company was not adequately policing its platform in most non-English languages.

"The raw version [of Facebook] roaming wild in most of the world doesn't have any of the things that make it kind of palatable in the United States, and I genuinely think there's a lot of lives on the line -- that Myanmar and Ethiopia are like the opening chapter," she said.

A Facebook spokesperson told CNN that the company has been "actively focused on Ethiopia."

"Over the past two years we have actively invested to add more staff with local expertise, operational resources and additional review capacity to expand the number of local languages we support to include Amharic, Oromo, Somali and Tigrinya. We have worked to improve our proactive detection so that we can remove more harmful content at scale. We have also partnered extensively with international and local experts to better understand and mitigate the biggest risks on the platform," the spokesperson said.

Current mitigation strategies are not enough

None of the revelations from the Facebook documents are news to activists and human rights groups, who have warned for years that the social media giant has made insufficient efforts to protect human rights in Ethiopia, Africa's second most populous country.

A busy street in the Ethiopian capital Addis Ababa last December.

Some politicians and civil society groups said that if no action was taken, the platform risked repeating the same mistakes it made in Myanmar -- now a case study in the deadly impact that hate speech shared on Facebook can have.

In 2018, the UN slammed Facebook's role in the Myanmar crisis, which the global body said, "bore the hallmarks of genocide." By promoting violence and hatred against the minority Rohingya population, the UN said Facebook had "turned into a beast." The social media company later acknowledged that it didn't do enough to prevent its platform being used to fuel bloodshed, and Chief Executive Mark Zuckerberg wrote an open letter apologizing to activists and promising to increase its moderation efforts.

Much like in Myanmar, Facebook's rise in popularity in Ethiopia came at a moment of rapid political and societal change, which helped to boost the platform's growth. In 2018, Abiy Ahmed was appointed Prime Minister and launched a series of reforms, including freeing thousands of political prisoners and lifting restrictions on the press.

But as Ethiopians began to use Facebook to engage in public debate, observers saw that the platform was being abused by a variety of actors, including politicians, to incite discrimination and violence.

Former UN special rapporteur for freedom of expression David Kaye told CNN that this problem came up repeatedly in conversations with civil society groups during his trip to Ethiopia in December 2019: "It was on everybody's radar that there could be real spill over from the platform to offline harm."

"Given the experience in Myanmar, it was really incumbent on Facebook to do a human rights impact assessment and evaluate what they needed to do so that Facebook in Ethiopia didn't become a place for incitement to violence," Kaye said, adding that he didn't know what that assessment looked like or if it was done.

In June 2020, a Facebook employee posted a report to an internal group with about 1,500 members called "At Risk Countries FYI," recapping an ongoing audit into how well its artificial intelligence and other signals, like third-party fact-checkers, worked in the most at-risk countries where the platform operates.

"We found significant gaps in our coverage (especially in Myanmar & Ethiopia), showcasing that our current signals may be inadequate," the employee wrote, sharing a spreadsheet with a list of at-risk countries and the languages supported by the platform in each.

The spreadsheet showed that Facebook had failed to build automated systems, called classifiers, to detect misinformation or hate speech in Oromo or Amharic -- two of the most widely spoken languages in Ethiopia.

A Tigrayan man looks for cellular service on a mountain overlooking Um Rakuba refugee camp in eastern Sudan in January.

Even as the conflict in Tigray escalated, Haugen said she had found evidence that Facebook had allocated "even slight language support" for only two of the country's many native languages.

Facebook says it does not believe it should be the "arbiters of truth," so the firm relies on third-party fact-checking organizations to identify, review and rate potential misinformation on its platforms.

Facebook has partnered with two such organizations in Ethiopia: AFP Fact Check and PesaCheck, an East Africa-based non-profit initiative run by Code for Africa.

PesaCheck has five full-time Ethiopian fact-checkers working in four languages -- Oromo, Amharic, Tigrinya and English -- but says it recently had to relocate one staff member from Ethiopia due to intimidation. AFP Fact Check employs one fact-checker in Ethiopia, Amanuel Neguede, who reviews content in Amharic and English.

Neguede told CNN that each day he reviews thousands of posts on an internal Facebook tool, which surfaces content flagged as false or misleading through a combination of AI and human moderators. Originally, Neguede said, AFP only had access to English-language content in Ethiopia through the tool, which would surface only limited content each day.

The tool began serving AFP Amharic-language content in May, and Neguede says the number of claims he sees each day has risen drastically. The tool does not always accurately identify mis- and disinformation, but Neguede says it helps with his work.

"I've seen a lot of a lot of hate speech, that definitely does fuel ethnic violence in Ethiopia," Neguede said.

"Whenever there's a major offensive, for example that's happening in the north, we can see a lot of images of conflict that's happened in different countries used in a misleading context. I would say that most of the time we'll see posts surface -- especially posts that are widely shared -- after real news events."

But researchers like Berhan Taye say Facebook in Ethiopia is in desperate need of more content moderators, pointing to how the platform was used to stoke a wave of deadly violence after the murder of Oromo musician Hachalu Hundessa last year. Taye, then a policy manager at digital rights group Access Now, recalls watching livestreams of lynchings and posts calling for the targeted extermination of certain ethnic groups, following Hundessa's death. She penned an open letter calling on Facebook to take action to protect Ethiopians. She says little has changed since.

Buildings burned by a mob in Ethiopia's Oromo region during a wave of violence following Hachalu Hundessa's murder last July.

For that reason, Taye, now an independent Nairobi-based analyst, works with grassroots volunteers to collate misinformation and hate speech they spot on the platform into Excel spreadsheets, which they then send on to Facebook for removal. But she says that much of what they flag -- including posts calling for the extermination of certain ethnic groups -- does not get taken down and, occasionally, the company has responded by asking the activists to translate the posts themselves.

Facebook says it has improved its reporting tools for people in Ethiopia to make it faster and easier for them to report content they believe violates its community standards, which are now available in Amharic and recently launched in Oromo. It says it has also established reporting channels for international and local human rights groups and civil society organizations to flag potentially harmful content in Ethiopia.

Content moderation is extremely dangerous work in Ethiopia, Taye says, adding that she has personally been accused of siding with a person or group because a post was removed. The task is also mentally exhausting; volunteers spend hours reviewing graphic content, often doing it alongside full-time jobs and raising their kids. Taye said it was unacceptable that poor people have effectively been left to do the "dirty work" of one of the world's richest companies.

Facebook will not reveal exactly how many local language speakers are evaluating content in Ethiopia that has been flagged as possibly violating its standards, or how much it has invested in resources to better police its platform in the country. A Facebook spokesperson said the company had invested "$13 billion and have 40,000 people working on the safety and security on our platform, including 15,000 people who review content in more than 70 languages working in more than 20 locations all across the world to support our community. Our third party fact-checking program includes over 80 partners who review content in over 60 languages, and 70 of those fact checkers are outside of the US."

"Our track record shows that we crack down on abuse outside the US with the same intensity that we apply in the US," the spokesperson added.

A war on social media

Since war broke out in Tigray, supporters of both the government and TPLF have waged a parallel fight on social media, creating a virtual battleground of toxic ethnic and religious hatred. But it's difficult to determine when atrocities unfolding on the ground are the direct result of hateful content shared online.

The head of the US Agency for International Development, Samantha Power, expressed concern in August about the "inflamed" and "dehumanizing rhetoric" that she said Ethiopia's leaders were invoking amid the conflict in Tigray; Tigrayan forces were described by Prime Minister Abiy Ahmed as "weeds" and "cancer" in a post shared on Facebook. Power emphasized that "increasingly virulent speech" used by the prime minister and other officials, also shared on social media, "often accompanies ethnically-motivated atrocities."

Abiy's language was also condemned by the UN special adviser on the prevention of genocide Alice Wairimu Nderitu, who warned that "hate speech, together with its propagation through social media is part of a worrisome trend that contributes to further fuel ethnic tensions in the country."

Activists say that divisive language has been echoed by Abiy's supporters online.

CNN has reached out to the Ethiopian prime minister's office for comment.

In June, days before Ethiopia's national elections, which Abiy won in a landslide, Facebook said it removed a network of fake accounts targeting domestic users primarily in Amharic. Facebook linked the accounts to individuals associated with Ethiopia's Information Network Security Agency (INSA), the cybersecurity agency that Abiy established and ran before becoming Prime Minister.

People watch Prime Minister Abiy Ahmed's swearing-in ceremony at an Addis Ababa coffee shop in October.

A Facebook page with an INSA-affiliated administrator was also flagged in internal Facebook documents in March, but was not recommended for removal.

As the conflict escalates online and off, the Ethiopian government has accused Facebook of blocking users and removing posts "disseminating the true reality about Ethiopia."

And in the latest sign of its efforts to consolidate control over Ethiopia's information landscape, the government announced in August it had begun developing its own social media platform to rival and replace Facebook, WhatsApp and Twitter.

This article is part of a CNN series published on "The Facebook Papers," a trove of over ten thousand pages of leaked internal Facebook documents that give deep insight into the company's internal culture, its approach to misinformation and hate speech moderation, internal research on its newsfeed algorithm, communication related to Jan. 6, and more. You can read the entire series here.

© 2021 Cable News Network.

