Don’t outsource your brain to AI. It’ll kill critical thinking & creativity


With the rise of AI in educational settings, students are finding themselves using these tools more often for their assignments, transforming a once invaluable resource into a dependency that stifles originality and deep thinking. Experts caution that this trend may have lasting effects on young minds still in development.

OpenAI presents ChatGPT as an all-purpose aide, specifically for students, highlighting prompts to turn flashcards into stories, convert biology notes into talk show scripts, or coach students on public speaking for in-class presentations.

The pitch is simple: with generative AI, learning is a cakewalk. Students, however, were already ahead of the campaign. In Gurgaon, 14-year-old Sachi Aggarwal started using generative AI platforms like Claude, Perplexity and ChatGPT last year to speed up research, brainstorm ideas and complete homework. Her peers at The Shri Ram School, Aravali, did the same. However, what began as light assistance turned into full-scale offloading.

“Students weren’t using AI as a last resort any more,” says Sachi.

“It became their first option.” Finding it hard to balance coursework, extracurriculars, tuition classes, exam prep, Model UNs, and community service – often pursued for college applications abroad – many students capitulated to the easy appeal of AI. Sachi admits she used AI so frequently that it felt as if she wasn’t thinking for herself anymore. “My thought process became robotic, and I was no longer framing answers the way I originally did, using my own creativity and perspective, because I grew accustomed to the AI’s way of formulating answers,” she says.

The tipping point came when her English teacher asked her to explain an analysis she’d submitted — and she went blank because AI had done the thinking for her.

AI Or You?

Generative AI is promoted as an enabler and pathfinder, a tool that builds on human logic and creativity to prompt new ideas and forms of expression. Critics call it the death knell of critical thinking and creativity. A recent MIT Media Lab study, Your Brain on ChatGPT, found that users who relied on ChatGPT to write essays consistently underperformed neurologically, linguistically, and behaviourally compared with those who used only Google or no tools at all, with the latter outperforming the rest.

The study was criticised for its small sample size (54 participants aged 18–39) and lack of peer review, but lead author Nataliya Kosmyna told Time the findings were released early out of concern over the rapid rollout of LLMs without thorough evaluation, and the speed with which they had entered mainstream use.

She warned that long-term brain development could be affected, especially in young people, who are the biggest adopters of generative AI apps. Over half of ChatGPT users in India are under 24, and they use the app mainly to study. Neurologist Dr Sid Warrier says the use of this technology can cut both ways. On the upside, says the Mumbai-based doctor: “Someone growing up with AI may have a better ability to harness it, rather than if they were exposed to it later in life.” On the downside: “If (at an early age) you outsource your core critical thinking skills, why would your brain bother to develop those skills at all? For instance, we stopped remembering phone numbers when we had phones to store them.”

This ‘cognitive offloading’ — delegating mental work to notes or to devices — can upend daily life with a growing inability to focus, analyse, reason, reflect and question. “That is the real danger, because we’re talking about our core purpose as human beings,” Dr Warrier points out.

AI generated image

Young people develop these skills gradually, shaped through what they learn and experience. “And if those cognitive stimuli are unidimensional or restrictive, it could affect brain development impacting, among other things, creativity and memory,” says Dr Rajesh Sagar, professor of psychiatry at AIIMS, Delhi.

“The brains of young people are neuroplastic — they’re still developing; what they do or see during the developing years will affect their neuroplasticity, and its implications will be felt later in adulthood.

Adults, on the other hand, have already reached developmental maturity.” Yet today’s adults were yesterday’s teens raised on multiple tabs, who know a thing or two about shrinking attention spans, even if they engage with LLMs less than teens do today.

Lessons From Social Media

Mainstream use of LLMs may be new, but social media offers an indication, as its effect on young people was already well researched. “We have not yet reached the stage where cognitive decline is evident in society because there’s usually a lag between the availability of a service and its effects,” explains Dr Warrier. “But we can reasonably predict the direction we’re headed as a society.” A study published in Nature in 2024, on the long-term impact of digital media on brain development in children, found that “social media users often contend with constant distractions, which can significantly impact their behaviour, leading to inattention symptoms.

Additionally, these users can become easily diverted from tasks like reading or homework, etc.” Social media incentivises quick, superficial interactions, with bite-sized content meant to be skimmed rather than absorbed. It triggers and rewards immediate emotional reactions rather than deep reflection and analysis. Predictably, attention deficit is a major concern reported by psychologists who treat kids with tech addiction — it impacts memory and can lead to learning difficulties.

And since cognitive skills and mental health are symbiotic, one impacts the other.

“Emotions are the brain’s instinctive response to stress. Cognitive skills are what help us deal with our emotions effectively,” says Dr Warrier. “Without these skills, emotions overrule rationality and lead to poor decision-making, which can in turn impact mental health.”

AI For Education

AI is already entering classrooms. In August, OpenAI launched its Learning Accelerator in India “to bring advanced AI to India’s educators and learners nationwide through AI research, training, and deployment.”

It plans to distribute 500,000 ChatGPT licences to students and educators, including those in govt schools. Some schools are far ahead of this, as AI is already teaching Rabindranath Tagore and quadratic equations. Indus International, an IB school with campuses in Pune, Bengaluru and Hyderabad, introduced an AI-driven humanoid as a supplementary teacher in 2019. “This humanoid helps deliver content and handles routine instructional tasks, allowing teachers to focus on more meaningful and emotionally supportive interactions with their students… cultivating ethics, building character, and developing entrepreneurial competencies,” Akshaya KB, head of curriculum, wrote in an email. Their Collaborative Learning Model, as it’s called, began with the humanoid teaching pre-programmed lessons, later progressed to semi-programmed lessons with basic generative AI, and now works with full conversational generative AI. The school claims this has improved average student performance by 15% every year.

Developing Protocols

As LLMs evolve, so do efforts to guide their use. ChatGPT-5’s Study Mode is designed to encourage learning by guiding students step by step towards an answer, instead of providing complete answers. The benefits cannot be denied — personalised learning, clearer explanations, tailored feedback — but without strong protocols, risks over its ethical use and privacy remain. Several countries have developed AI frameworks. The European Commission and OECD, for example, have drafted an AI Literacy Framework that “ensures students know how to evaluate, question, and apply AI responsibly in their academic lives.” In India, equitable access is an additional concern. Osama Manzar, founder of the Digital Empowerment Foundation, warns that intense focus on AI and digital tools in education could marginalise students in areas that lack proper digital infrastructure. “While the National Education Policy emphasises inclusivity, the emphasis on digital education could undermine the quality of teaching in schools, especially if teachers are not adequately trained to use digital tools effectively.

This can also lead to a one-size-fits-all approach, ignoring local needs and educational contexts.” Raju Kendre, founder of Eklavya Foundation — a nonprofit that empowers students from marginalised communities to access higher education — highlights another concern: “Current LLMs often carry the biases of their makers, who come from mainstream, urban, high socioeconomic backgrounds. Without diversity, elite perspectives risk perpetuating stereotypes,” he cautions.

This can affect the way marginalised communities view themselves and their cultures through the monocular lens of AI.

He has suggested that India should build its own AI models and that the govt must use policy and oversight to correct biases and ensure that AI serves everyone. Even as larger questions of access, accountability and cognitive trade-offs are being threshed out in policy, school and psychology circles, students are negotiating their own boundaries. After her analysis debacle, Sachi now uses AI only to study, not as an assignment aide. “I had ChatGPT use the Feynman technique to help me learn about India’s Election Commission,” she says. Better time management is what has helped her stay on course. “I write down my priorities, and I approach school assignments with a different mindset. Earlier, my attitude was: ‘I’m not going to look at this assignment in 10 years, so why bother?’ Now I know it all adds up.”

GUIDELINES

For Schools:

1. Insist on a disclosure/process note for any substantial AI help. Ultimately, a student’s work must remain authentic
2. Teach students how to acknowledge AI properly and set clear dos/don’ts for use cases, in subject guides and task sheets
3. Design for assessment integrity by prioritising in-class writing and hands-on STEAM tasks that affirm conceptual understanding
4. Support ongoing teacher development on generative AI pedagogy and academic integrity norms

For Parents:

1. Create a family AI agreement (define the tools, time and extent of use, and how to acknowledge AI assistance)
2. Review the child’s process notes (how AI was used) more than the final product; praise authentic effort and revision
3. Focus on building relationships and experiences, watch for emotional outsourcing to chatbots, and route wellbeing concerns to teachers/counsellors

(Compiled from Indus International’s AI policy recommendations)
