Researchers Find Heavy Reliance on AI Could Dull Human Reasoning Skills

Artificial intelligence has transformed how we live, learn, and connect. It’s the whisper in our phones that answers our questions before we’ve even finished asking them. It’s the invisible assistant drafting our emails, correcting our grammar, summarizing the news, and sometimes even predicting our emotions. Yet beneath that convenience lies a quiet paradox — the more AI thinks for us, the less we may be thinking for ourselves. A recent study from Zurich, published in the journal Societies, has found that people who frequently use AI tools tend to show lower critical thinking abilities. It’s a finding that doesn’t demonize technology, but rather challenges us to look inward and ask: in our pursuit of ease, what are we letting go of?

Michael Gerlich, a researcher at SBS Swiss Business School, led this investigation into what psychologists call cognitive offloading — the tendency to hand over mental effort to an external tool. The study’s results suggest that when people rely too heavily on AI to generate ideas, solve problems, or interpret meaning, their ability to engage deeply and critically with information begins to fade. It’s not that AI makes us less intelligent; it’s that we stop exercising the part of the mind that wrestles with complexity. And when we stop wrestling, we stop growing. The question, then, isn’t whether AI is powerful — it clearly is. The question is whether our relationship with that power is making us more passive, less capable of deep thought, and ultimately, less human.

The Comfort of Convenience

Humanity has always been addicted to convenience. From the invention of the wheel to the rise of the smartphone, every leap forward in technology has promised liberation — liberation from work, from time, from effort itself. And for the most part, those promises have been kept. The washing machine freed us from endless labor, the Internet brought libraries to our pockets, and now AI can write a convincing essay in seconds. Yet every time we outsource effort, we risk outsourcing engagement. Convenience, for all its beauty, has a hidden cost: it slowly separates us from the process of learning, discovering, and truly understanding.

Gerlich describes cognitive offloading as the moment when we choose not to think because something else can do it faster. It’s the act of deferring effort — not because we can’t think, but because we no longer feel the need to. It begins innocently enough. We use GPS instead of reading a map. We ask Google instead of recalling a fact. We let an AI summarize a book rather than reading it ourselves. These seem like harmless choices, but repeated over years, they reshape how we approach information itself. The mind becomes conditioned to expect answers instead of seeking them. Curiosity — that ancient fire that drives invention and insight — begins to dim.

What’s most striking about this shift is how subtle it is. You don’t feel your critical thinking fading; it doesn’t vanish in a moment of crisis. It erodes slowly, like sand slipping through fingers. You start accepting results without questioning their sources. You begin trusting algorithms more than your own intuition. The brain’s natural skepticism — its habit of asking “why?” and “how?” — softens. We start consuming information the way we scroll social media: fast, reactive, and unreflective. In the age of AI, convenience isn’t just a tool. It’s a temptation — one that invites us to trade reflection for immediacy, depth for speed.

The Study That Sparked the Debate

Gerlich’s research took a closer look at this dynamic, surveying 666 participants across different ages and education levels. Each participant was asked about their use of AI tools, from chatbots to content generators, and then evaluated on measures of critical thinking and reasoning. The results revealed a fascinating pattern: those who used AI most frequently — typically younger, more tech-savvy individuals — scored lower on critical thinking tasks. In contrast, older participants, particularly those aged 46 and above, who used AI less, tended to score higher. Education level, too, played a major role; people with higher educational attainment demonstrated stronger critical thinking regardless of their AI usage.

At first glance, these findings may appear to paint a simple picture: older generations think more deeply, younger ones rely more on technology. But that interpretation misses the nuance. This study isn’t about age — it’s about adaptation. Younger generations have grown up in a world of constant connection and instant answers. Their minds have been shaped by environments where attention is fragmented and information is infinite. When knowledge is always a click away, the incentive to struggle with questions diminishes. Older generations, by contrast, developed their thinking habits in a slower world — one that required patience, persistence, and memory.

Gerlich himself emphasizes that the results should be read with caution. The study is correlational and based on self-reported data. We don’t yet know whether AI use directly reduces critical thinking, or whether people who naturally avoid deep thinking are simply more drawn to AI. Yet the correlation raises a vital question: what happens to a society when the majority of its cognitive work is delegated to machines? If thinking becomes optional, does wisdom become obsolete? These aren’t futuristic concerns — they play out now, every time someone asks AI to “explain this for me” instead of wrestling with understanding.

When Thinking Becomes Outsourced

AI promises freedom from tedium — and it delivers. It can summarize, translate, ideate, and compose with breathtaking speed. But the line between assistance and replacement is perilously thin. When AI stops being a tool and becomes a substitute, we lose the very friction that forms thought. Gerlich’s findings echo a phenomenon identified more than a decade ago: the “Google effect,” documented by researchers at Columbia University in 2011. That study found people were more likely to remember where to find information online than the information itself. We stopped memorizing facts because we trusted that we could always retrieve them. Now, with AI, the problem goes deeper — we risk forgetting how to reason, not just what to recall.

The real cost of cognitive offloading is invisible: it’s the quiet loss of intellectual struggle. The process of thinking — of weighing, doubting, revising, synthesizing — is what transforms raw data into insight. When that process is removed, learning becomes mere consumption. The student who lets AI write their essay might get an A, but they lose the invisible reward that comes from wrestling with uncertainty and building understanding. The writer who lets AI draft every paragraph may produce quickly, but loses the ability to discern their own voice. Efficiency gives us output, but not growth.

The danger isn’t that AI will replace humans. It’s that humans will stop practicing the habits that make them irreplaceable. When machines provide not just information but interpretation, the temptation to disengage grows stronger. We start deferring judgment to the algorithm. We stop checking the facts, trusting that the system already has. We begin to let the machine not only think faster but think for us — and in doing so, we risk dulling the edge of human discernment that took millennia to evolve.

The Mind Needs Resistance

The mind, like the body, thrives on challenge. Muscles grow through strain; so does intelligence. When we engage with complex ideas, we activate cognitive resistance — a kind of mental weightlifting that strengthens our reasoning. But when AI removes the need for that struggle, we lose the opportunity to grow stronger. Gerlich warns that while cognitive offloading can reduce mental strain, it also stunts cognitive development. It’s a trade-off that feels good in the short term but impoverishes us in the long term.

Imagine a gym that promises you a perfect body while you sit perfectly still. It would be absurd — yet that’s how many of us now approach thinking. We let AI do the lifting, and then we admire the results as if they were ours. But true understanding doesn’t come from consumption; it comes from confrontation — confronting confusion, contradiction, and complexity. When we allow AI to remove those confrontations, we lose the tension that builds depth. The mind becomes smooth but shallow, efficient but fragile.

This doesn’t mean we should abandon AI. It means we must redefine our relationship with it. Use AI to spark ideas, not to replace them. Let it organize your thoughts, not dictate them. Ask it to challenge your assumptions, not confirm them. The key is active use, not passive reliance. AI should serve as a partner in your thinking process — a mirror that reflects your understanding, not a crutch that replaces it. The more deliberately we use it, the more it can expand our thinking rather than shrink it.

Reclaiming the Power to Think

Gerlich’s study points toward one simple truth: the future of thinking depends on education — not just formal schooling, but the everyday education of awareness. We must teach ourselves and others not merely to use AI, but to interrogate it. Schools should not only integrate AI literacy but foster digital skepticism: an understanding that every generated answer carries hidden assumptions, biases, and blind spots. Imagine a classroom where students debate AI-generated arguments, critique their logic, and identify their weaknesses. That kind of engagement turns passive consumers into active thinkers.

But this transformation doesn’t belong solely to schools or institutions. Each of us has the power to reclaim the habit of deep thought. The next time you reach for AI to solve a problem, pause and ask yourself: what do I think? How would I approach this without it? Reflect on the process before outsourcing it. That small moment of pause is a form of resistance — a conscious act of reclaiming your mental autonomy.

The truth is, thinking is hard. It’s uncomfortable, uncertain, and slow. But that’s exactly what makes it sacred. The discomfort is where depth is born. Gerlich’s study is not a call to fear AI — it’s a reminder that the most valuable form of intelligence is still the human kind: intuitive, emotional, reflective, imperfect. The kind that asks why before it accepts what. The kind that learns not because it must, but because it can.

The Takeaway: Don’t Outsource Your Soul

This isn’t a story about technology taking over the world. It’s a story about us taking our minds for granted. Every era has faced this moment — when new inventions threaten to erode old disciplines. The printing press made memorization obsolete, the calculator made mental arithmetic rare, the Internet made information abundant but understanding scarce. Now, AI threatens something deeper: the erosion of our inner dialogue, the silent conversation between thought and self that defines consciousness itself.

AI can generate ideas, but it cannot wonder. It can answer questions, but it cannot ask them with curiosity or fear or hope. It can mimic empathy, but it cannot feel it. It can process data, but it cannot make meaning. That is the human domain — fragile, beautiful, irreplaceable. So use AI, but do not surrender to it. Challenge it. Debate it. Let it show you where your mind ends and your imagination begins.

Because the mind is like fire. It burns brighter when fed by friction, by curiosity, by resistance. And AI — like wind — can either fan that flame or extinguish it. The difference lies in how consciously we use it. The future doesn’t belong to the machines. It belongs to the thinkers who remember that no algorithm can replicate the wild, wondrous act of being human — of sitting in silence and daring to think for themselves.