Teaching in the age of AI: How can the education curriculum keep up?
In 2021, when I started my undergraduate degree in English, I remember spending painstaking hours brainstorming and proofreading my foundation English course assignments to protect my grade. Fast forward to 2025, and the act of writing as a reflection of one’s own thought has changed dramatically.
During my final term as an undergraduate student and as a teaching assistant, while sitting in the crowded writing centre at my university, I noticed how uniformly polished my students’ work had become. It was safe to assume that nearly every script carried significant traces of artificial intelligence (AI) assistance, often at the expense of originality. Needless to say, generative tools, such as ChatGPT, have fundamentally reshaped how students engage with learning. Today, it has become rare to find students who do not rely on these AI tools in some capacity. Yet, despite their widespread use and their potential for both benefit and harm, one question remains: is our education curriculum truly prepared to respond to AI’s growing presence in the classroom?

Illustration: Zarif Faiaz
Reflecting on the country’s national school curriculum, Mahfuzul Haque Sadim Chowdhury, an assistant teacher at Milestone School and College, says it is barely prepared for the rapid growth of AI. He explains, “School students are stuck in rote learning, and that makes them more vulnerable to misusing AI, as they are habituated to treating writing as a product rather than a process developed through reflection and critical thinking. On top of that, teachers are not trained to incorporate AI into the learning process, and their abysmal salary structure does not even incentivise further training, as they are more inclined to offer private tuition. After all, teachers have to meet their basic necessities.”
Even the privilege of international schooling, which can offer some protection against the misuse of generative AI, does not guarantee critical engagement with study materials. Tashfia Ahmed, an instructor at Scholastica, says, “The Cambridge syllabus, with its opinion-based, contextual approach, pushes students to engage personally with the materials.”
Nevertheless, Ahmed finds that when students work at home on their own, they tend to use AI to prepare their materials.

“Earlier, students used to develop synthesising skills by collecting information from different sources and evaluating that information, but now that is being outsourced to AI, and I think that is hampering their critical thinking skills,” she elaborates.
It is important to note that these generative AI tools are not always reliable for research; there are well-documented cases of ChatGPT generating false information. Even a well-designed curriculum, then, can find its learning outcomes shaken by the rapid advancement of the technology.
This underpreparedness does not stop at the school gate; it affects university education as well. Sometimes the trouble shows up as the absence of clear, concrete policy, and sometimes in the study materials themselves.
Dr Nurul Huda Abul Monsur, a history professor, feels an uneasy strain when evaluating his students’ scripts. He states, “I like to place a lot of trust in my students. But when a student’s writing quality improves dramatically in an assignment, in sharp contrast with their classwork, a sense of distrust does creep in. That said, if a student is only using these tools to develop their vocabulary, AI can be a friend. I also have hardly any means of being sure whether cheating has occurred or not.”
“Nevertheless, having no unified framework or guideline on AI usage leaves us with little room for choice,” he continues. “We are bound to penalise the students. I think we need to establish a common framework regarding AI usage in students’ assignments.”

Unfortunately, this lack of preparedness is not limited to the humanities and social sciences; the engineering field is struggling just as much to catch up with AI, an irony in itself.
Muhammad Shafayat Oshman, a lecturer in the Department of Computer Science and Engineering at North South University, outlines how the computer science and engineering (CSE) curriculum in Bangladesh is lagging behind. He explains, “The university curriculum is designed to provide a strong theoretical foundation in programming and computer science, while industry increasingly expects practical AI expertise. As a result, many courses fail to reflect real industry needs, such as Bangladesh’s growing semiconductor and chip design sector, which remains largely theoretical in the curriculum. This pattern of theory-heavy teaching with limited hands-on application appears across many courses. Bridging theory with real-world industry implementation is essential to reduce this disconnect.”
Observing the overall situation, Dr Mohammod Moninoor Roshid, a professor at the Institute of Education and Research (IER), Dhaka University, feels that our universities failed to take the lead when AI first emerged and are now struggling to cope. He says, “The Bangladeshi education system needs to adopt AI discipline by discipline and move past the preconceived notion that AI is merely a cheating tool, and the University Grants Commission of Bangladesh (UGC) can play a crucial role in formulating that. Senior academics also need to familiarise themselves with AI instead of fearing it. Collectively, we have to recognise that our future is AI-driven and that AI, used ethically, can make education more efficient rather than serving only as a tool for cheating.”
The rapid advancement of generative AI has exposed a critical gap between technology and curriculum preparedness. From schools rooted in rote learning to universities lacking clear policies and industry alignment, the system appears reactive rather than adaptive.
While AI holds undeniable potential to make learning more efficient, its uncritical or unregulated use risks eroding originality, critical thinking, and trust within academic spaces. The issue, therefore, is not whether AI should be used, but how it should be meaningfully integrated. Addressing this challenge requires curriculum reform, teacher training, institutional leadership, and a unified framework that treats AI as a pedagogical tool rather than merely a threat.
If education is to remain relevant in an AI-driven future, curricula must evolve to prioritise process over product, critical engagement over convenience, and ethical use over prohibition.
Fariha Lamisa is a contributor at Campus, Rising Stars, and Star Youth.