Facebook admitted Monday that it had not done enough "to help prevent our platform from being used to foment division and incite offline violence" in Myanmar.
Facebook said it had commissioned the nonprofit Business for Social Responsibility (BSR) to investigate whether its platform had enabled violent rhetoric. The organization concluded that Facebook's community standards were not strict enough to block such rhetoric, and Facebook announced that it would reform those standards.
According to the press release, Facebook touted its progress in identifying hate speech: the company recently claimed that it can now detect 63 percent of violent rhetoric, up from just 13 percent at this time last year. It has also already updated its "credible violence policy," which governs rhetoric on the platform and punishes any account that spreads hate speech.
Facebook also committed to complying with the UN's Guiding Principles on Business and Human Rights, which outline general recommendations for businesses that wish to engage in ethical practices.
Violence against the Rohingya minority in Myanmar has been ongoing for several years, and the UN recently reported on the many war crimes being perpetrated against civilians there.