Four complaints have been filed against Facebook by European human rights organisations, including the international nonprofit Global Witness, Bureau Clara Wichmann, and Fondation des Femmes, alleging that the algorithm used to target users with employment adverts is discriminatory. If the complaints are upheld, Meta (formerly Facebook) could face penalties, sanctions, or further pressure to modify its product.
Naomi Hirst, head of campaign at Global Witness, says that Facebook contributes to discrimination and undermines progress toward gender equality in the workplace.
Global Witness presented evidence indicating that algorithmic bias is a worldwide issue, drawing on employment adverts it published in France, the Netherlands, and the United Kingdom, which showed that ads were frequently served to users along gender lines. In the United Kingdom, women were more likely to be shown job adverts for teacher and psychologist roles, while men were more likely to be shown ads for pilot and mechanic roles. The degree of gender imbalance in how users were targeted for particular jobs differed by country.
Global Witness and its partners also hope that rulings from the human rights bodies will pressure Meta to improve its algorithm, increase transparency, and prevent further discrimination. The company could face significant fines if data protection authorities decide to investigate the matter and find that Meta breached the EU's General Data Protection Regulation.