How algorithms are fueling a new wave of crimes in Bangladesh

Fatema Jarrin Habiba
6 November 2025, 08:48 AM
UPDATED 6 November 2025, 03:07 PM
Algorithmic crime’s journey in Bangladesh began with the spread of misinformation through social media.

In an era where algorithms dictate what we see, read, and believe, the distinction between digital influence and criminal manipulation is becoming increasingly blurred. Instances ranging from Facebook-fueled mob violence to artificial intelligence (AI)-generated deepfakes that ruin reputations prove that a new form of crime is reshaping the criminological landscape in Bangladesh.

Algorithmic crimes are crimes that are influenced or enabled by algorithms, the systems that determine what we see online, influence transactions, and shape our digital interactions. Unlike conventional cybercrimes such as hacking or phishing, they are often indirect: they occur when algorithms propagate false information, manipulate user behaviour, or produce biased decisions that lead to real-world harm. Criminologists around the world are coming to see algorithmic crime as the "dark twin" of digital innovation. In Bangladesh, the phenomenon has begun to manifest in alarming ways, from communal violence sparked by viral posts to financial scams driven by AI bots.

Algorithmic crime's journey in Bangladesh began with the spread of misinformation through social media. In 2017, in Thakurpara village of Rangpur, communal violence erupted after a Facebook post said to hurt religious sentiments was circulated in the name of a Hindu youth. Within hours, mobs had looted and set fire to Hindu homes. Investigations later revealed that the post was fabricated, making this one of Bangladesh's first major cases in which social media algorithms directly fueled real-world violence. Then, in 2019, Bhola's Borhanuddin upazila witnessed serious clashes in which four people were killed and over a hundred injured after screenshots of alleged blasphemous messages from a Facebook user went viral. It was later found that the man's Facebook account had been hacked. Algorithms prioritising sensational content accelerated the spread, reaching thousands within minutes.

Such algorithm-driven misinformation demonstrates how digital systems can manipulate human psychology, triggering mass panic and violence. Beyond physical violence, algorithmic manipulation has also entered Bangladesh's financial and political systems. In 2025, numerous cases of mobile banking fraud involving bKash and Nagad were linked to automated bots capable of predicting and exploiting transactions. Using AI-generated voices and messages, these bots tricked victims into sending money to fake accounts. Deepfakes, realistic fake videos or voices, have also emerged as a new criminal weapon: fake videos of politicians and journalists have circulated online, aiming to damage their reputations and sway public opinion.

This rise of algorithmic crime can be analysed through three key theoretical lenses: sociological, psychological, and biological. Sociological theories propose that crime often arises from social inequalities and a lack of community cohesion, and Bangladesh's online space mirrors that lack of cohesion: algorithm-reinforced echo chambers ensure that opposing views are rarely heard and fake news spreads like wildfire.

Strain theory suggests that in a struggling economy, algorithmic scams such as fake loan ads and cryptocurrency schemes find easy prey among people desperate to make a quick profit. Psychological theories, in turn, point to how algorithms are designed to exploit human psychology: social media platforms trigger dopamine-driven pleasure loops that keep users hooked on outrage and sensationalism. According to the frustration-aggression hypothesis, people who are constantly exposed to digital content that angers them may become aggressive in real life.

Biological theories of criminology suggest that criminal behaviour may be inherited or physiologically determined, with an individual's physical characteristics and genetic makeup inclining them towards deviance. These theories also shed light on how technology reshapes the human brain: constant exposure to algorithm-driven content may hijack the brain's reward system, making it harder to distinguish moral responsibility from online simulation.

Algorithmic crime challenges the very foundation of criminology. It demonstrates that in this modern era, crime is no longer limited to the physical world; it is automated, coded, and often imperceptible until its effects erupt. For Bangladesh, the way forward lies in combining criminological insight with technological awareness. Policymakers, educationalists, and law enforcement must view algorithmic manipulation as a social and psychological phenomenon that influences behaviour and fuels violence—rather than as merely a cyber issue—and fashion their response accordingly. Given its persistent danger, the urgency of a firm policy response cannot be overstated.


Fatema Jarrin Habiba is a student of law at Bangladesh University of Professionals.


Views expressed in this article are the author's own. 
