Facebook is under renewed scrutiny this weekend, accused of continuing to allow activists to incite ethnic massacres in Ethiopia’s escalating war.
Analysis by the Bureau of Investigative Journalism (TBIJ) and the Observer found Facebook is still allowing users to post content inciting violence through hate and misinformation, despite being aware that such content helps directly fuel tensions. This has prompted accusations of inaction and indifference against the social media giant.
The investigation tracked down relatives who have linked Facebook posts to the killings of loved ones. One senior member of Ethiopia’s media accused the firm of “standing by and watching the country fall apart”.
The accusations arrive amid intensifying focus on Facebook’s content moderation decisions; the company has previously been accused of playing a role in the ethnic persecution of the Rohingya in Myanmar.
On Wednesday, Meta’s Mark Zuckerberg revealed that the former UK deputy prime minister Nick Clegg would become president of global affairs, a move designed to help the rebranded company repair its reputation following the testimony of whistleblower Frances Haugen, who said it was “literally fanning ethnic violence” in Ethiopia.
It also comes as Facebook considers launching an independent inquiry into its work in Ethiopia after its oversight board urged it to investigate how the platform had been used to spread hate speech.
TBIJ and Observer investigators also interviewed a number of fact-checkers, civil society organisations and human rights activists in the country, who described Facebook’s support as falling far short of what is needed.
Others said they felt requests for assistance had been ignored and meetings failed to materialise.
These failures, they said, helped to fuel a conflict in which thousands have died and millions have been displaced since fighting broke out between government forces and armed opposition groups from the Tigray region in November 2020. Both sides have been accused of atrocities.
Rehobot Ayalew, of the Ethiopian fact-checking initiative HaqCheck, said: “Most of the people have low media literacy, so Facebook is considered to be credible.
We come across [Facebook] images that are horrifying and hateful content. You’re not getting the support from the platform itself, that is allowing this kind of content.
They can do more [but] they’re not doing anything.”
Meta rejected the claims, saying it had “invested in safety and security measures” to tackle hate and inflammatory language along with “aggressive steps to stop the spread of misinformation” in Ethiopia.
The cases in which families believe Facebook’s continued promotion of hate makes it responsible for killings include that of Gebremichael Teweldmedhin, a Tigrayan jeweller abducted three months ago in Gonder, a city in the Amhara region.
A relative, who said Teweldmedhin was not political, claimed online hate campaigns and calls for violence – particularly on Facebook – played a key role in his suspected killing and many others.
“The worst thing that contributed to their killing are the so-called activists who have been spreading hate on social media,” he said, requesting anonymity.
Some posts, he claimed, would name individuals or even post photos helping create an atmosphere “inciting attacks, killings and displacements”.
He added that the family have been told that Teweldmedhin – who disappeared after trying to stop a mob looting a nephew’s workshop – had been killed and buried in a mass grave.
Teweldmedhin’s family cited one Facebook user in particular: Solomon Bogale, an online activist with more than 86,000 Facebook followers.
Although listed on Facebook as residing in London, Bogale’s social media indicates he has been in Ethiopia since August 2021, with posts of him carrying an assault rifle often accompanied by statements praising the Fano, an Amharan nationalist vigilante group.
One of Teweldmedhin’s family members believed Bogale’s “inciteful posts” had resulted in many attacks on Tigrayans in Gonder. In the weeks before Teweldmedhin’s killing, Bogale called for people to “cleanse” the Amhara territories of the “junta”, a term used by government supporters to refer to Tigrayan forces and Tigrayans more generally.
The post continued: “We need to cleanse the region of the junta lineage present prior to the war!!”
According to TBIJ, the post could be found on Facebook almost four months later, although Meta said it had since “removed any content which violated our policies”.
When contacted over Facebook, Bogale denied that any Tigrayans were killed in Gonder in early November, saying all Tigrayans in the city were safe. Bogale added that he would delete the posts cited by TBIJ.
Less than a month after Teweldmedhin’s disappearance, Hadush Gebrekirstos, a 45-year-old who lived in Addis Ababa, was arbitrarily detained by police who heard him speaking Tigrinya.
His body was found two days later, on 26 November, close to the police station.
A relative said Gebrekirstos had no political affiliation, but believes that disinformation posted on Facebook played a key role in his killing.
“People do not have the ability to verify what was posted on Facebook. Like calling people to kill Tigrinya speaking residents,” they said.
Compounding the concern is that, according to disclosures provided to the US Congress by Haugen, Meta has known about the risks of such problems for years.
In January 2019 an internal report into “On-FB Badness” – a measure of harmful content on the platform – rated the situation in Ethiopia as “severe”, its second-highest category.
Almost a year later Ethiopia had risen to the top of Facebook’s list of countries where it needed to take action.
A presentation dated 10 December 2020 evaluated the risk of societal violence in Ethiopia as “dire” – Meta’s highest threat warning; Ethiopia was the only country to receive that ranking.
More than a year on, it is alleged the firm has frequently ignored requests for support from fact-checkers based in the country. Some civil society organisations say they have not met with the company in 18 months. Multiple sources told the Bureau that Facebook only appointed its first senior policy executive from Ethiopia to work on East Africa in September.
Meta does run a third-party fact-checking programme, providing partners with access to internal tools and payment for fact checks. Yet it has not partnered with a single organisation in Ethiopia to tackle the misinformation surrounding the country’s conflict.
Abel Wabella, founder of HaqCheck, said Meta had failed to support his organisation despite first approaching executives more than a year ago.
The other major independent fact-checking organisation based in Ethiopia, Ethiopia Check, is also not part of Facebook’s partner programme.
Instead, Facebook works with two fact-checking organisations on content from Ethiopia – PesaCheck, which runs a small team in Nairobi, and Agence France-Presse (AFP) – but both are based outside the country.
Although misinformation flagged by PesaCheck and AFP has often been labelled as false or removed by Facebook, content debunked by HaqCheck has largely remained unaltered and free to spread.
This has included false declarations of military victories on both sides, false allegations of attacks on civilians and false claims of captured infiltrators.
“As far as I know, support for fact checkers in Ethiopia by Facebook is almost non-existent,” said the senior person working in Ethiopian media, requesting anonymity.
“Facebook doesn’t pay the attention Ethiopia needs at this crucial moment, and that’s contributing to the ongoing crisis by inflaming hatred and spreading hate speech.”
A number of civil society groups have similar complaints of feeling ignored and sidelined. Facebook organised a meeting with several groups in June 2020 to discuss how the platform could best regulate content before scheduled elections. As of November, two of the organisations involved said they had heard nothing about any subsequent meetings.
Haben Fecadu, a human rights activist who has worked in Ethiopia, said: “There’s really no excuse. I’ve doubted they have invested enough in their Africa content moderation.”
Mercy Ndegwa, Meta’s public policy director for East & Horn of Africa, said: “For more than two years, we’ve invested in safety and security measures in Ethiopia, adding more staff with local expertise and building our capacity to catch hateful and inflammatory content in the most widely spoken languages, including Amharic, Oromo, Somali and Tigrinya.
“As the situation has escalated, we’ve put additional measures in place and are continuing to monitor activity on our platform, identify issues as they emerge, and quickly remove content that breaks our rules.”
The company added that it worked with 80 fact-checking partners in more than 60 languages to review content on Facebook, including PesaCheck and AFP.
Additional reporting by Kat Hall and Zecharias Zelalem