Meta's Oversight Board has urged the company to revise its policy on the Arabic word 'shaheed' (martyr), citing concerns that the current approach suppresses free speech. The board, which is independent of Meta but funded by it, recommends removing posts containing 'shaheed' only when they incite violence or violate other platform rules.
The decision follows criticism of Meta's content moderation in the Middle East, especially during conflicts such as the Israel-Hamas hostilities, with rights groups accusing Meta of silencing pro-Palestinian content. The board found Meta's approach to 'shaheed' overly broad, resulting in the removal of non-violent content.
In response to the board's ruling, Meta acknowledged the need to reconsider its policies and promised to review the feedback within 60 days. Currently, Meta removes posts containing 'shaheed' when they reference designated 'dangerous organisations and individuals', including Hamas, because it treats the word as praise for those entities. The Oversight Board, however, highlights the word's multiple meanings and emphasises the importance of nuanced content moderation. Meta sought the board's input after internal efforts to reassess the policy stalled; 'shaheed' accounted for the highest number of content removals on its platforms.
The board's decision reflects broader debates about censorship and safety on social media platforms. Oversight Board co-chair Helle Thorning-Schmidt warned that such censorship can marginalise entire populations without improving safety. Meta's response to the ruling will be closely watched, as it could shape how the platform handles content tied to sensitive geopolitical issues.