Most people don’t think about their thinking tools. When AI systems offer explanations, solve problems, or generate decisions, the convenience is obvious, but there is a cognitive cost: the more we offload reasoning to machines, the less we engage in it ourselves.
This companion piece reflects on a study that maps how AI tool use can affect critical thinking skills, especially among younger users. The effect is mediated by a behaviour known as cognitive offloading - the transfer of mental tasks to external systems. When AI becomes a routine substitute for analysis or judgement, people start to lose touch with the skills those tasks require. Over time, the loss is behavioural: what we don’t practise, we stop doing, and what we stop doing, we eventually forget how to do.
The implications go beyond individual skills, because AI tools are increasingly embedded in professional workflows, educational platforms, and everyday decision-making. If they are designed to optimise speed, fluency, or task completion without attention to cognitive engagement, they risk incentivising dependence over development. Critical thinking is a conscious practice, and preserving it requires more than access to tools. We need friction in the right places, prompts that provoke reflection, and systems that support cognitive effort.
As always, there are two sides to everything - this is an alternative view to last week’s episode on the positive aspects of expanding our minds with Generative AI.
Key points
Frequent use of AI tools is associated with lower critical thinking skills through a mechanism called cognitive offloading - delegating reasoning tasks to external systems.
Trust in AI increases the likelihood of offloading and reinforces the habit, while education mitigates the effect - but only when it fosters active cognitive engagement.
Over time, offloading reduces cognitive resilience and weakens independent judgement.
Companion notes based on the article
1. What is cognitive offloading?
Cognitive offloading refers to the act of transferring mental tasks - recalling facts, comparing options, solving problems - to external aids. These can be physical (a notebook), digital (a search engine), or algorithmic (an AI assistant). Offloading isn’t inherently bad. In many cases, it improves efficiency by freeing up mental capacity. But when it becomes habitual, especially for tasks that require analysis or reflection, it can weaken the very abilities it was meant to support.
This is where the risk to critical thinking emerges. Offloading displaces the need for deep engagement. If the answer is always available, there’s less incentive to weigh, compare, or interpret. Over time, that lack of practice erodes both skill and inclination.
2. AI as a cognitive shortcut
The study at the centre of this episode shows a clear negative correlation between frequent AI tool use and critical thinking ability, but the link isn’t simply about screen time or exposure - it’s about substitution. When AI tools become the default source of answers, users stop engaging in the mental processes that lead to judgement.
This is especially pronounced in tasks involving uncertainty, ambiguity, or trade-offs - the very contexts where critical thinking is most needed. As AI-generated content becomes more fluent and confident, it’s easier to trust the output than to interrogate it. This fluency creates a surface coherence that can mask shallowness, bias, or omission. If users take that coherence at face value, they offload not just the task but the responsibility to evaluate.
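For readers who want a concrete picture of what "AI use affects thinking via offloading" means as a mediation claim, here is a minimal sketch on synthetic data: AI use predicts offloading, offloading predicts lower critical thinking, and the direct link between use and thinking weakens once the mediator is accounted for. The variable names, effect sizes, and regression steps below are illustrative assumptions, not the study's actual data or analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500

# Hypothetical, standardised survey-style measures (assumptions, not study data)
ai_use = rng.normal(size=n)                                   # frequency of AI tool use
offloading = 0.6 * ai_use + rng.normal(scale=0.8, size=n)     # offloading rises with AI use
critical = -0.5 * offloading + rng.normal(scale=0.8, size=n)  # critical thinking falls with offloading

# Total effect: regress critical thinking on AI use alone
total = sm.OLS(critical, sm.add_constant(ai_use)).fit()

# Direct effect: add the mediator (offloading) to the model
direct = sm.OLS(critical, sm.add_constant(np.column_stack([ai_use, offloading]))).fit()

print(f"total effect of AI use:  {total.params[1]:+.2f}")   # clearly negative
print(f"direct effect of AI use: {direct.params[1]:+.2f}")  # shrinks toward zero once offloading is controlled
```

The point of the sketch is not the numbers but the shape of the argument: if the negative link largely disappears when offloading is held constant, the substitution mechanism, rather than mere exposure, is doing the work.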
3. Why education matters, yet isn’t enough
The study found that educational attainment moderates the effect. People with more education were better at maintaining critical thinking despite AI use. But the protection is not automatic. It depends on whether education includes training in active reasoning, reflection, and metacognitive awareness - skills that counteract the pull of offloading.
Without that scaffolding, education alone may do little. Some participants with lower educational attainment reported relying heavily on AI suggestions without questioning them. Others expressed concern about their own fading ability to analyse or remember information. These findings suggest that the real variable is not knowledge level, but engagement style: passive use weakens thinking; active use sustains it.
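The moderation finding can be pictured the same way. In regression terms, education shows up as a positive interaction with AI use: the negative slope is buffered for people who engage actively. The sketch below uses synthetic data and assumed coefficients purely for illustration; it is not a reproduction of the paper's analysis.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500

# Hypothetical, standardised measures (assumptions, not study data)
ai_use = rng.normal(size=n)
education = rng.normal(size=n)

# Critical thinking drops with AI use, but the drop is smaller at higher education levels
critical = -0.5 * ai_use + 0.3 * ai_use * education + rng.normal(scale=0.8, size=n)

X = sm.add_constant(np.column_stack([ai_use, education, ai_use * education]))
model = sm.OLS(critical, X).fit()

print(f"AI use main effect:             {model.params[1]:+.2f}")  # negative
print(f"AI use x education interaction: {model.params[3]:+.2f}")  # positive: education buffers the effect
```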
Until next time.
Source: Gerlich, M. (2025). AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking. Societies, 15(1), 6. (open access)