Meta's Oversight Board has urged the company to be more vigilant in addressing content that normalizes gender-based violence. The demand came after the Board discovered a Facebook post mocking an injured woman that remained on the platform for nearly two years without review by a human moderator. The post depicted a woman with visible marks of a physical attack, implied that her husband was responsible, and was accompanied by laughing emojis.
The post was reported to the Board, and Meta removed it after determining that it violated its Bullying and Harassment policy. However, had the woman not been identifiable, the post would not have been considered a violation of Meta's policy. The Board highlighted this gap in the current rules, which allows such content to spread when the victim is not identifiable or is a fictional character, and urged Meta to close it.
Why does it matter?
The issue of gender-based violence, particularly in the digital space, has become increasingly concerning in recent years. The EU recently took a proactive stance by proposing a law to combat various forms of cyberviolence against women and the LGBTQ+ community. Despite such legislative efforts, however, platform accountability for addressing gender-based violence remains a critical and unresolved issue worldwide. A lack of transparency, inconsistent content moderation policies, algorithmic opacity, and inadequate reporting mechanisms all contribute to a hostile online environment.