June 1, 2021 (Financial Times) – Close to 200 Facebook employees have signed an open letter calling for the company’s leadership to address concerns that pro-Palestinian voices on the social network are being suppressed by content moderation systems.
The letter, seen by the Financial Times, urges Facebook to introduce new measures to ensure pro-Palestinian content is not unfairly taken down or down-ranked, as some staff and critics claimed happened during the recent conflict in Gaza.
It calls on management to order a third-party audit of Facebook’s enforcement actions around Arab and Muslim content, and to refer a post by Israel’s prime minister Benjamin Netanyahu — which the letter claimed “mischaracterized Palestinian civilians as terrorists” — to Facebook’s independent oversight board.
It also calls for an internal task force to “investigate and address potential biases” in both its human and automated content moderation systems.
Posted on the company’s internal message board by employee groups called ‘Palestinians@’ and ‘Muslims@’, it had garnered at least 174 anonymous signatures by Tuesday afternoon.
“As highlighted by employees, the press and members of Congress, and as reflected in our declining app store rating, our users and community at large feel that we are falling short on our promise to protect open expression around the situation in Palestine,” the letter said.
“We believe Facebook can and should do more to understand our users and work on rebuilding their trust.”
The letter also calls on Facebook to commit to hiring more Palestinian talent, publish more data on government-sponsored requests for content takedowns, and clarify its policies around anti-Semitism.
During the Gaza conflict, Facebook’s algorithms had labelled words commonly used by Palestinian users, such as “martyr” and “resistance”, as incitements to violence and removed posts about Al-Aqsa mosque after mistakenly associating the third holiest site in Islam with a terrorist organization, according to US media reports.
The Financial Times on Sunday reported that Facebook-owned Instagram was changing its algorithm to show more viral and current affairs posts following concerns that users re-sharing posts about the recent conflict in Gaza were not reaching a wide audience.
“We know there were several issues that impacted people’s ability to share on our apps. While we fixed them, they should never have happened in the first place and we’re sorry to anyone who felt they couldn’t bring attention to important events, or who believed this was a deliberate suppression of their voice,” Facebook said on Tuesday.
“We design our policies to give everyone a voice while keeping them safe on our apps and we apply them equally, regardless of who is posting or what their personal beliefs are.”
Written by Hannah Murphy