The Cognitive Costs of AI
In the space of two years, the discourse around AI and knowledge work has produced an entire family of concepts: Cognitive Offloading, Cognitive Debt, Cognitive Atrophy, Cognitive Drift, Cognitive Surrender. Each more alarming than the last. Read them in sequence and you are not reading a series of independent observations. You are reading an escalation.
That escalation is worth examining. The individual concepts are not wrong (most describe something real). But the pattern they form tells a story that may not be the one we think it is.
Cognitive Offloading
The oldest concept in the family is also the calmest. Cognitive Offloading describes the act of delegating mental processes to external tools: writing things down instead of memorizing them, using a calculator instead of doing mental arithmetic, letting GPS navigate instead of building a mental map.
The idea goes back to Andy Clark and David Chalmers' Extended Mind Thesis from 1998.1 Their argument: external tools can become genuine parts of our cognitive system. A notebook is not just a storage device. For someone who uses it consistently, it is part of how they think.
This framing is neutral. Nobody panics about writing. Socrates did, famously, but we got over it. Cognitive Offloading simply names what humans have always done: extend their minds into the world. AI is the latest step in that line, not a rupture.
Cognitive Debt
The tone shifts with Cognitive Debt, a term coined by John Willshire in spring 2025.2 The word “debt” had appeared in cognitive research before (in Alzheimer’s studies on repetitive negative thinking), but Willshire gave it its current meaning by drawing a parallel to Ward Cunningham’s concept of technical debt in software development: just as teams accumulate technical debt when they ship code without proper architecture, we accumulate cognitive debt when we skip the thinking and go straight to the answer.
“Cognitive Debt is where you forgo the thinking in order just to get the answers, but have no real idea of why the answers are what they are,” Willshire writes. The concept resonated quickly. People described it as finally having language for something they had been observing in their organizations. By late 2025, MIT Media Lab researchers had picked up the thread, empirically studying how AI-assisted writing affects cognitive processing.3
The metaphor is useful, but it has a blind spot. Technical debt is a conscious trade-off. A team decides to ship fast and clean up later. Cognitive debt, in most cases, is not that. It accumulates under pressure, not by choice: forty more emails, three more deliverables due by Friday. The distinction matters.4
Cognitive Atrophy
With Cognitive Atrophy, the alarm gets louder. A Microsoft study presented at CHI 2025 surveyed 319 knowledge workers about how AI tools affect their reasoning. The findings: participants reported measurable reductions in critical thinking effort, particularly on routine tasks. Some described a creeping dependency, a sense that their own analytical capacity was weakening from disuse.5
The word “atrophy” is medical. Muscles waste away when you stop using them. Applied to cognition, it suggests damage that may not be easily reversed. But the same body of research points to something the framing tends to miss: the effects depend heavily on context. Where learning is volitional (someone choosing difficulty because they want to grow), AI tends to help. Where institutions reward outputs over understanding, AI becomes a shortcut.
Cognitive Drift
Cognitive Drift, coined in a 2026 IQ Mindware analysis, describes something subtler.6 Where atrophy implies measurable decline, drift suggests a gradual, imperceptible shift. Higher-order capacities like systems thinking and resilience erode not through dramatic events but through the quiet accumulation of small concessions: a prompt here, an automated summary there, a decision delegated because the calendar was full.
Drift is harder to detect than atrophy precisely because each individual concession is rational. The problem only becomes visible in aggregate. And by then, the baseline has moved.
The fatalism of the term is notable. Offloading is something you do. Debt is something you accumulate. Atrophy is something that happens to you. Drift is something you do not even notice.
Cognitive Surrender
The most recent entry comes from Shaw and Nave’s 2026 paper Thinking: Fast, Slow, and Artificial.7 Where drift is passive, surrender carries the weight of capitulation. At some point, the effort of maintaining independent judgment becomes too costly, and people stop trying.
A 2026 survey of 200 UK executives illustrates the territory: 62 percent reported using AI for “most decisions,” and 46 percent said they now trust AI recommendations over their colleagues’ judgment.8
Five concepts have emerged in under two years, each moving further from neutral observation toward capitulation.
| Concept | What happens | Your agency |
|---|---|---|
| Cognitive Offloading | You delegate mental tasks to tools | Active choice |
| Cognitive Debt | You skip thinking for speed | Pressured trade-off |
| Cognitive Atrophy | Your reasoning weakens from disuse | Passive decline |
| Cognitive Drift | Higher-order thinking erodes imperceptibly | Unnoticed |
| Cognitive Surrender | You stop maintaining independent judgment | Capitulation |
The Escalation
So what does this escalation tell us?
Two things stand out. The first is the trajectory of agency. Each concept in the sequence strips away a little more of it, from active choice (Offloading) through unnoticed erosion (Drift) to outright capitulation (Surrender). In under two years, the discourse has moved from describing a decision to describing defeat.
The second is where each concept locates the problem. Every one of them diagnoses the individual mind. You are offloading too much. You are accumulating debt. Your skills are atrophying. The prescription, implicitly, is personal too: be more disciplined, think harder, resist the shortcuts.
These two observations connect. Look at the conditions under which people reach for those shortcuts. Their inboxes are overflowing, deadlines leave no room for reflection, and the organizations they work in measure throughput, not understanding. The progressive loss of agency that the five concepts describe is not a character flaw. It is what happens when work leaves no room for deliberation. Consider the executive who, before AI, never questioned the McKinsey slide she put in her deck without reading the underlying study. She was already skipping the thinking. AI just made the shortcut faster and the gap more obvious. It is functioning less as a cause and more as a contrast agent: revealing cognitive compromises that were already embedded in how we work, long before anyone had a ChatGPT account.
Intent matters here. Choosing what to delegate and what to think through yourself is how the cognitive costs stay manageable. But intent is not free-floating. It requires conditions that make deliberation possible: time to reflect and incentives that reward understanding over output. Without those conditions, what the discourse calls “surrender” is simply a rational response to a system that never made room for the thinking it now mourns losing.
Connections
Meaningmaking describes the other side of this question. Where the Cognitive Costs map what we risk losing, meaningmaking (as Vaughn Tan defines it) names what is worth protecting: the capacity to make subjective value judgments that no AI system can replicate. The two lenses are complementary. One asks what we are giving up. The other asks what we must hold on to.
Open Questions
If the real issue is organizational rather than individual, the interesting question shifts. What would a systemic response look like, beyond “use less AI” or “think harder”? Who in an organization has the authority, and the incentive, to redesign the conditions under which knowledge work happens?
1. Andy Clark and David Chalmers, "The Extended Mind," Analysis 58, no. 1 (1998): 7–19.
2. John Willshire, "Cognitive Debt," Smithery, 2025.
3. Nataliya Kosmyna et al., "Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay Writing Task," MIT Media Lab, 2025.
4. Discussed in depth on the Follow the Rabbit podcast, Season 4, Episode 20: "Cognitive Debt: Are We Mortgaging Our Thinking to AI?" with John V Willshire.
5. Microsoft Research, "The Future of AI in Knowledge Work: Tools for Thought," presented at CHI 2025.
6. IQ Mindware, "The Real AI Challenge for Knowledge Workers," 2026.
7. Steven D. Shaw and Gideon Nave, "Thinking: Fast, Slow, and Artificial," SSRN, 2026.
8. 3Gem Research, survey of 200 UK executives, reported in The Register, 2026.