Online abuse is now a national crisis: Time to act
As Bangladesh moves deeper into an all-digital social life, harassment, intimidation and violence have followed women from street corners and homes into their virtual lives. This abuse travels through Facebook inboxes and comment sections, spreads across Messenger groups, mutates into AI-generated images, and resurfaces long after a woman believes it has ended.
Awareness campaigns, roundtables and activism have highlighted the urgency of ending digital violence against women and girls. Yet survivors and legal experts say that while conversations have expanded, legal protection has not kept pace. For many women, the law remains distant, intimidating and, too often, ineffective.
The scale of the problem
The numbers are stark. A 2022/23 ActionAid study found that about 64 per cent of women faced online violence, up from just over 50 per cent the previous year. NETZ Bangladesh reported that more than 78 per cent of women in eight districts experienced technology-facilitated violence. The impact is severe: 65 per cent of survivors reported psychological trauma, nearly 43 per cent lost confidence online, and many withdrew from public life. Yet only around 15 per cent filed formal complaints, reflecting a broader pattern of underreporting.
A widening digital divide
The violence unfolds against a backdrop of inequality in digital access itself.
This gap complicates efforts to empower women digitally. As Sharmin Islam, Gender Analyst at UNDP, shared, “While both men and women experience cyber violence, women and girls are affected at a significantly higher rate. This cyber violence is a new dimension of the violence women have faced for a long time, such as intimate partner violence and sexual harassment in public places.” She pointed to a culture of impunity. “When even gross acts of physical violence often go unpunished, people assume there will be no legal consequences for online harassment.”
The law on paper
Bangladesh has moved through several legislative phases. The Digital Security Act was heavily criticised before being replaced. The latest framework, the Cyber Security Ordinance 2025 (CSO 2025), criminalises sexual harassment, revenge pornography and certain forms of AI-generated harmful content. It prohibits the publication of non-consensual intimate images and provides penalties that can extend to years of imprisonment and fines.
Barrister Tasnuva Shelley, Deputy Attorney General, Appellate Division, Supreme Court of Bangladesh, noted that the Ordinance provides specific definitions for sexual harassment, revenge porn and sextortion. It recognises repeated requests for nude images, unsolicited sexual content and the transformation of someone’s image into sexualised content without consent.
However, grey areas remain. Deepfake content is recognised, but victims often lack recourse under copyright law because they do not “own” the manipulated material. Section 17 addresses harmful AI outputs, yet identifying the origin of automated, AI-driven harassment remains technically complex.
Farjana Yesmin, Assistant Professor of Law at the University of Chittagong, said older laws such as the Penal Code and the Prevention of Oppression against Women and Children Act were never designed with digital offences in mind. Although the 2025 Ordinance defines sexual harassment and revenge porn, she argued that there is still a gap in defining broader “digital harm”, particularly in relation to AI.
The law in practice
One major obstacle is evidence. Cases often fail due to procedural lapses in handling digital evidence. Under Section 65B(4) of the Evidence Act, digital evidence requires a mandatory certificate. Without it, evidence becomes inadmissible. Digital truth is fragile; content can be altered or deleted if not preserved immediately through technical processes such as hashing (digital fingerprints).
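The "digital fingerprint" idea mentioned above can be illustrated with a short sketch (a minimal illustration using Python's standard library, not forensic software; the sample bytes are hypothetical): a hash computed at the moment of collection lets anyone later verify that the evidence has not been altered, because even a one-byte change produces a completely different digest.

```python
import hashlib

def sha256_fingerprint(data: bytes) -> str:
    """Return the SHA-256 hex digest (the 'fingerprint') of the given bytes."""
    return hashlib.sha256(data).hexdigest()

# At collection time: hash the evidence and record the digest.
evidence = b"exported-chat-or-screenshot-bytes"  # hypothetical sample
original_digest = sha256_fingerprint(evidence)

# Later: re-hashing unchanged bytes reproduces the same digest...
assert sha256_fingerprint(evidence) == original_digest

# ...while any alteration, however small, changes it entirely.
tampered = evidence + b" "
assert sha256_fingerprint(tampered) != original_digest
```

In practice the recorded digest would itself be documented in the Section 65B(4) certificate, so that the copy presented in court can be checked against the fingerprint taken at seizure.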
Barrister Shelley acknowledged a gap between statutory text and judicial practice. Judges must now assess complex metadata and forensic reports, yet digital literacy within the judiciary remains uneven.
There are also structural barriers for complainants. Under Section 40(2) of the CSO 2025, if the police refuse to register a complaint, the complainant may go to a Tribunal, but the Tribunal may dismiss the case if, after examining the complainant, it is not satisfied. Section 28, which punishes false cases with penalties equivalent to the alleged offence, can create fear among survivors that a failed case could rebound against them.
Barriers at the police station
Advocate Humayra Noor, a Supreme Court lawyer, described the first hurdle: filing a General Diary (GD) at a police station. “Women frequently face blackmail, AI-generated fake photos, and the sharing of non-consensual images,” she said. “Many women feel insecure and are unsure of what steps to take.”
Even reaching the stage of filing a GD requires courage. Victims are often asked irrelevant and embarrassing questions. If the duty officer is male, many feel unable to speak freely. Questions about prior romantic relationships frequently surface, reinforcing victim-blaming.
Adv. Noor proposed all-female police cells in every station to ensure privacy and sensitivity. She criticised the complicated online GD process, which forces some women to seek help from local computer shops, compromising their privacy.
Reporting without redress
The Bangladesh Police launched the Cyber Support for Women initiative in late 2020, known as the Police Cyber Support Centre for Women (PCSW). From its inception until May 2024, 60,808 women sought assistance. In 2024 alone, 9,117 cyber harassment complaints were recorded, with spikes in September and October following heightened political activity.
Yet redress remains limited. ActionAid found that 64.71 per cent of women saw no action taken on the complaints they submitted. Many believed the mechanism simply did not work.
A punishment-heavy approach
Sharmin Islam of UNDP noted that international standards focus not only on criminal penalties but also on rapid content removal and survivor support. “In the context of Bangladesh, I see a major gap where policies focus solely on punishment rather than proactive, preventative measures,” she said.
Prof. Farjana Yesmin argued that while laws may be gender-neutral in wording, their impact is unequal. Even if a harasser is jailed, the viral images often remain online indefinitely. She stressed that non-consensual intimate image sharing should be recognised not just as an offence but as a violation of a woman’s fundamental right to privacy and safety.
Towards reform
Proposals for reform converge on several themes.
Barrister Shelley has called for a “correctional jurisprudence” framework that audits and neutralises bias within AI and legal systems. She advocates inclusive data ecosystems and clearer deepfake regulation.
Prof. Yesmin prioritises gender-segregated cyber help desks in every district, mandatory content removal within 24 hours, and compulsory gender-sensitivity training for judges and police officers.
Adv. Noor emphasises people-friendly policing, nationwide awareness campaigns and unified support systems that bring lawyers, doctors, police and mental health professionals under one coordinated structure.
Ystiaque Ahmed is a journalist at The Daily Star.