Euro-Med Monitor: Independent Study Should Drive Change in Meta’s Policy of Restricting Palestinian Content
Geneva - Meta Platforms, Inc. must address the serious flaws in its systems and the biased policies that impeded millions of users’ freedom of publication and expression during the violent events in the Palestinian territories in May 2021, Euro-Med Monitor said in a statement.
Euro-Med Monitor reviewed an independent study conducted by Business for Social Responsibility (BSR) on the policies of Meta, the company that owns Facebook, Instagram, and WhatsApp. The study concluded that, due to biased practices and censorship of Arabic content during the May 2021 events, Meta’s actions negatively impacted Palestinian users’ rights to freedom of expression, assembly, political participation, and non-discrimination, in part by restricting users’ ability to share information as events unfolded.
“The Meta study should mark a turning point in the company’s dealing with Palestinian and Arab content when it comes to publishing and reporting on human rights violations”, said Euro-Med Monitor’s Chairman Ramy Abdu. “All forms of unfair restrictions imposed on users must be lifted, particularly those resulting from Meta’s blatant discrimination”.
The hostility that Meta has shown towards Palestinian content in recent years—including the closure of hundreds of accounts of media organisations and journalists—has severely limited users’ access to information, as many people rely on specific accounts to obtain information that may help them make decisions related to their physical safety, particularly during times of insecurity and instability.
Abdu emphasised that the company’s practices not only restricted many users’ right to free expression, but also impeded the work of human rights organisations, which rely heavily on content posted by users in conflict areas to track and investigate violations and reach victims.
“In today’s digital world, social media companies largely limit freedom of expression”, Abdu added, “and while they should play a key role in striking a balance between restricting hate speech and allowing freedom of expression, they are often subject to government pressure and allow their platforms to be used as tools of repression”.
Facebook’s practices in restricting human rights work were not the first of their kind. In 2019, Facebook discontinued the Graph Search tool, which had been used for years by dozens of human rights organisations to search for content relevant to human rights investigations.
The Meta study confirmed that Facebook deleted Arabic posts about violence at a much higher rate than their Hebrew counterparts, whether the review was carried out by human moderators or by automated software.
The study also noted that content written by Palestinian users is subject to an algorithmic examination that does not apply to content written by Israeli users, as well as the existence of a filter to detect hate speech in Arabic but not in Hebrew. These findings raise questions about the company’s prejudices, given its apparent presumption that hostile content is posted only in Arabic.
Moreover, due to a lack of linguistic and cultural competence on the part of the human reviewers supervising the screening process, the system does not assess Arabic content written in the Palestinian dialect accurately, which may result in posts being removed, or access to them restricted, without thorough prior review.
The study warned that the consequences of these errors for human rights were all the more serious given the context in which they occurred, as rights such as security, safety, and freedom of expression were becoming increasingly important at the time, particularly for activists and journalists.
The ramifications of Meta’s flawed measures extend beyond the significant restrictions that accompanied the events of May 2021 and affect the rights of Palestinian users and Arabic speakers in general, the study confirmed.
Despite the company’s cash reserves of more than $24 billion, the study attributed Meta’s content policy errors to systemic and technical issues, such as a lack of experience, a shortage of employees who understand other cultures and languages, and reliance on faulty algorithms to moderate discourse around the world.
The study highlighted Meta’s Dangerous Individuals and Organizations (DIO) policy, which lists thousands of people and organisations that billions of Meta users cannot “praise” or “support” and primarily targets Islamic and Middle Eastern entities. Critics have described the list as blatant ethnic and religious prejudice.
According to the study, the DIO policy is systematically biased, as legal classifications of terrorist organisations around the world focus disproportionately on individuals and organisations who have been identified as Muslim. As a result, Meta’s policy is likely to have a much greater impact on Palestinian and Arabic-speaking users than on others.
Meta Platforms, Inc. must amend all policies that may conflict with all users’ right to freedom of expression, opinion, and publication, and ensure that all groups, especially victims, can express their opinions free of unjustified restrictions.
Furthermore, Meta should review all of the thousands of accounts that were deleted or blocked due to the company’s erroneous procedures and policies, and take all necessary steps to reactivate those accounts, so that users can publish as usual and carry out their normal activities in accordance with fairness and legality.