Rights group warns Meta moderation policies fuel violence in Bangladesh

Amnesty International on Monday urged tech company Meta to address concerns over harmful online content on Facebook, citing the risk of sectarian tensions, discrimination, and violence against minority communities in Bangladesh.

The rights group described a troubling rise in “misleading and inflammatory content spread in relation to political parties and minority communities.” Alia Al Ghussain, Amnesty International’s head of big tech accountability, said the “warning signs” of a human rights crisis are visible, adding that a “combination of cross-border harmful content, political tension, sectarian narratives, and algorithmic amplification creates a volatile environment that could put freedom of expression and the rights of minority communities at risk.”

Amnesty argued that Meta’s business model, built on surveillance and engagement maximization, incentivizes the amplification of sensational, polarizing, and harmful content. The group called on the company to enact “break the glass” measures that would reduce the power of algorithmic amplification. While Amnesty recommended that states introduce and enforce legislation regulating social media algorithms, it maintained that social media companies have a responsibility to protect human rights independent of state obligations.

In December 2025, the Bangladesh Telecommunication Regulatory Commission (BTRC) wrote a letter to Meta calling on the company to address the dissemination of harmful content on Facebook. The BTRC alleged that violent mobs attacked the offices of The Daily Star and Prothom Alo, two leading media outlets, directly after threats against both outlets circulated on social media. The outlets said there was a direct link between online incitement on Facebook and the mob attacks, and Amnesty voiced concerns that such incidents are not isolated but part of a broader pattern of Facebook’s algorithms amplifying violence.

The BTRC criticized Meta’s failure to limit the circulation of violence-inciting content, writing that Facebook has been used to incite “large-scale violence” and arguing that Meta’s delays in removing content create “opportunity for further incitement and mobilization of violence.” The BTRC asked Meta to treat content moderation as a matter of public responsibility, calling for the company to enforce its community standards “in a stricter, faster, and more contextual manner for Bangladesh-related content,” strengthen Bengali-language moderation, and ensure “immediate action on reported content that incites violence.”

Bangladesh experienced a period of political instability and violence after mass student-led protests in July 2024 forced former prime minister Sheikh Hasina to flee to India; the government’s crackdown on dissent during the protests led to at least 600 deaths. Harmful content and sectarian narratives on Facebook have also been linked in recent years to human rights abuses in Ethiopia and to the Rohingya genocide in neighboring Myanmar.