Facebook employees warned for years that company failed to police abusive content

Facebook employees have warned for years that the company was failing to police abusive content in countries where it was likely to cause harm, according to a report by Reuters.

In recent years, Facebook has faced legal action for its role in enabling violence in certain countries. In 2021, the Gambia brought a genocide claim against Facebook before the International Court of Justice, alleging that Facebook played a key role in Myanmar's genocide against the Rohingya, an ethnic and religious minority. The Delhi Legislative Assembly's Committee on Peace and Harmony also summoned Facebook in 2021 for an inquiry related to the February 2020 Delhi riots.

Reuters interviewed five former Facebook employees and reviewed internal company documents. These documents had been released to the US Securities and Exchange Commission and Congress by former Facebook product manager Frances Haugen.

Reuters reported on Wednesday that these documents showed Facebook knew it had not hired enough workers with the language skills and knowledge of local events needed to identify objectionable user posts in some countries. Reuters also stated that Facebook's artificial intelligence systems were inadequate and that Facebook did not make it easy for global users to flag posts violating site rules.

Employees also reportedly warned about problems with company tools aimed at blocking content that violated its terms. These problems included a lack of screening algorithms in some countries that Facebook itself had identified as most at risk of real-world harm and violence from site abuses. The technology also struggled to moderate languages other than those spoken in the US, Canada, and Europe.

Facebook currently operates in over 190 countries and has over 2.8 billion users who post content in over 160 languages. Over 90 percent of monthly users are located outside the US and Canada. Facebook's community standards are not currently available in about half of the more than 110 languages the platform supports, though the company stated it aims to have these rules available in 59 languages by the end of 2021.