Natural Intelligence Is Free. Simulated Intellect Is Exponentially Expensive.
Not metaphorically. Thermodynamically. And in our lived experience.
But let’s clarify a few terms before we get into the crux of this essay.
"Free" does not mean costless.
A human brain burns about twenty watts. Add the body and you're at a hundred. That energy comes from food, which comes from sunlight, which arrives whether we ask for it or not.
The system renews itself. It knows when to stop. It runs cool.
Over an 80-year lifetime, a human brain consumes roughly 14 megawatt-hours of energy (20 watts, continuously, for about 700,000 hours).
"Expensive" does not mean bad.
It means energy-hungry, brittle, and dependent on surplus. Training a single recent frontier model consumed roughly 50,000 megawatt-hours, about 3,600 times the energy a human brain uses in an entire lifetime. Projections for the next generation run to 1.5 million megawatt-hours: roughly 107,000 times a human lifetime.
A single ChatGPT query uses 0.3 watt-hours—ten times more than a Google search. Daily global ChatGPT usage consumes 1 gigawatt-hour, equivalent to the daily electricity consumption of 33,000 US households.*
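For readers who want to check the arithmetic, here is a back-of-envelope sketch in Python. The inputs are the rounded figures used above (a 20-watt brain, an 80-year lifetime, 50,000 and 1.5 million megawatt-hour training runs, 1 gigawatt-hour of daily usage); the 30 kWh-per-day household figure is the assumption implied by the comparison above. These are estimates, not measurements.

```python
# Back-of-envelope check of the comparisons above. All inputs are rounded
# estimates from the text, not measurements.

HOURS_PER_YEAR = 8_760

brain_watts = 20                      # resting human brain
lifetime_years = 80
brain_lifetime_mwh = brain_watts * lifetime_years * HOURS_PER_YEAR / 1e6
print(f"Brain over a lifetime: {brain_lifetime_mwh:.1f} MWh")          # ~14 MWh

training_current_mwh = 50_000         # recent frontier-model training run
training_next_mwh = 1_500_000         # projected next-generation run
print(f"Current training vs. lifetime brain:   {training_current_mwh / brain_lifetime_mwh:,.0f}x")
print(f"Projected training vs. lifetime brain: {training_next_mwh / brain_lifetime_mwh:,.0f}x")

daily_usage_gwh = 1.0                 # estimated daily global ChatGPT usage
household_kwh_per_day = 30            # assumption implied by the comparison above
households = daily_usage_gwh * 1e6 / household_kwh_per_day
print(f"Daily usage equals roughly {households:,.0f} US households")   # ~33,000
```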
The cost curve is linear at best, superlinear when you push hard. There is no off switch because the system doesn't know when enough is enough.
The distinction matters because we are about to make a choice we don't yet realize we're making.
*Note: Exact figures vary and some remain undisclosed, but the order-of-magnitude differences are not in dispute.
Most conversations about AI split into two camps: those who believe machines will liberate us from drudgery, and those who fear they'll eliminate us entirely. Both camps assume the same thing: that what AI does is fundamentally intelligent.
It’s not.
What AI does is intellect: symbolic manipulation, pattern matching, recursive abstraction. It is powerful. It is useful. But it is not the same thing as the embodied, contextual, self-limiting process we call intelligence.
We've been making this mistake about ourselves for centuries.
Consider a nurse walking into a hospital room. She glances at the patient—posture, breathing, skin tone—and knows something is wrong before touching a chart.
No algorithm ran and no checklist was consulted. Her body read the room the way a root reads soil: through contact, context, feedback.
Now consider the same nurse, later, filling out electronic health records. Dropdown menus. Mandatory fields. Checkboxes that must align with billing codes. The system demands she translate embodied knowing into abstract categories it can process. This takes energy. It takes time. It produces exhaustion.
The first act was intelligence. The second was intellect.
The first scales sublinearly: more patients do not require proportionally more effort, because embodied knowing improves with exposure.
The second scales linearly or worse: every patient demands the same bureaucratic translation, and the cognitive load accumulates.
This is the cost curve divergence: wisdom feels light; overthinking feels like drowning.
What doesn't show up in energy measurements: The first act ends naturally. The second never does.
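A toy model makes the divergence concrete. This is a sketch, not data: the sublinear exponent of 0.75 and the unit cost per case are arbitrary assumptions chosen only to show the shape of the two curves.

```python
# Toy illustration of the two cost curves. The exponent and the per-case cost
# are arbitrary assumptions chosen only to show the shape of the divergence.

def embodied_effort(cases: int) -> float:
    """Sublinear: skill compounds with exposure, so marginal effort falls."""
    return cases ** 0.75

def bureaucratic_effort(cases: int, cost_per_case: float = 1.0) -> float:
    """Linear: every case demands the same translation into abstract categories."""
    return cases * cost_per_case

for n in (10, 100, 1_000):
    print(f"{n:>5} cases | embodied effort {embodied_effort(n):7.1f} | bureaucratic effort {bureaucratic_effort(n):7.1f}")
```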
The Regulatory Signal We Stopped Sensing
Intelligence operates in contact with reality. It is event-driven. A farmer reads weather and soil. A parent senses when a child needs silence instead of advice. A musician knows when the song is finished. These acts require attention, but not analysis. They stop naturally when meaning resolves.
That word: meaning.
Not the word as content or interpretation. Meaning as the internal signal that tells a system when effort can end. When understanding arrives. When care is complete. When enough is enough.
Meaning is a regulator, not a luxury.
In living systems, it plays the same role as satiety, rest, or coherence. It’s what allows processes to self-terminate. When meaning is present (remember, we’re talking about the internal signal, not intellectual interpretation), energy expenditure drops. Attention settles. Action completes itself.
When meaning is absent, processes loop.
Intellect operates in symbolic space, one layer removed from meaning. It recurses. It abstracts. It builds models of models. A committee drafts a framework to evaluate frameworks. An analyst projects next quarter's projections. A student memorizes test strategies for memorizing test strategies.
None of these processes have an internal stopping rule. They continue until an external limit—time, money, collapse—intervenes.
Because they're operating without the internal regulatory signal that natural intelligence uses to know when to stop.
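The contrast can be sketched as two loops. This is a schematic, not a model of any real system: "understanding" and the resolution threshold stand in for the meaning signal, and the only point is that one loop bounds its own effort while the other runs until an outside budget stops it.

```python
# Schematic only: two loops, one with an internal stopping signal, one without.
# "understanding" and the threshold stand in for the meaning signal; the numbers
# are arbitrary.

def with_internal_stop(threshold: float = 0.95) -> int:
    """Ends when the task resolves: effort is bounded by the task itself."""
    understanding, steps = 0.0, 0
    while understanding < threshold:
        understanding += (1 - understanding) * 0.3   # diminishing returns per step
        steps += 1
    return steps

def without_internal_stop(external_budget: int) -> int:
    """No resolution signal: runs until an outside limit (time, money, grid) intervenes."""
    steps = 0
    while steps < external_budget:                   # the only brake is external
        steps += 1
    return steps

print("internal stopping rule:", with_internal_stop(), "steps")
print("external limit only:   ", without_internal_stop(external_budget=10_000), "steps")
```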
The human brain uses roughly 20% of the body's total energy despite being only 2% of body weight. But active mental tasks only increase brain energy consumption by 5-8% above the baseline. Most of the brain's energy goes to maintaining communication networks, not to computation itself.
The brain is optimized for connection and context, not raw processing. Intelligence relates to efficient signal routing, not burning more glucose on abstract symbol manipulation.
When you force the brain into extended abstract processing—the kind modern knowledge work demands—you're pushing it into a metabolically expensive mode it wasn't designed to sustain. Not just physically expensive. Expensive in a deeper sense: you're asking it to operate without the stopping signal that makes natural intelligence efficient.
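In watt terms, the marginal cost of concentrating is small. A quick calculation using the figures just quoted (a 20-watt brain that is about 20% of a roughly 100-watt body, with active tasks adding 5-8%):

```python
# Marginal energy cost of focused mental work, using the figures quoted above.

body_watts = 100
brain_watts = 20                          # ~20% of the body's budget at ~2% of its mass
active_increase = (0.05, 0.08)            # 5-8% rise during demanding cognitive tasks

low, high = (brain_watts * f for f in active_increase)
print(f"Extra power while concentrating: {low:.1f}-{high:.1f} W")               # ~1.0-1.6 W
print(f"As a share of the whole body:    {low / body_watts:.1%}-{high / body_watts:.1%}")
# Roughly one extra watt: the hardware is built for routing and context,
# not for brute-force symbol crunching.
```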
A human who knows why they are doing something:
- Expends less energy
- Requires less supervision
- Does not need infinite choice
- Does not need constant optimization
- Does not need abstraction layers to justify existence
A system without meaning has only one regulator left: throughput.
Throughput demands more energy, more abstraction, more simulation, more intellect.
This is why the care economy, which we'll examine shortly, operates at 100 watts per person. Not because it's primitive, but because its work closes its own loop. Care ends when care is given. There is no growth curve for love, presence, or sufficiency.
The Invisible Economy
But the largest error isn't what we chose to value. It's what we chose not to see at all.
Economist Riane Eisler calls it the care economy: the vast infrastructure of nurturing, caregiving, relationship maintenance, and life-sustaining work that makes all other economic activity possible. It includes childcare, eldercare, emotional labor, food preparation, community building, teaching, nursing—the foundational work of keeping humans alive, healthy, and capable of participating in the formal economy.
When measured, this "invisible economy" is estimated at $10.8 trillion globally. Household work alone, if valued, would represent 20-50% of GDP in developed nations.
This is not a small rounding error. This is the largest sector of human economic activity, operating continuously, and it doesn't appear in any economic statistics.
Why? Because GDP only counts market transactions and government spending. It ignores:
- The household economy (unpaid care work)
- The volunteer/community economy (social maintenance)
- The natural economy (ecosystem services)
It gets worse: GDP counts negatives as positives. An oil spill increases GDP (cleanup costs). Tobacco sales increase GDP (healthcare costs come later). Arms manufacturing increases GDP. Meanwhile, a parent caring for a child, a neighbour checking on an elder, a community garden feeding families—none of this registers as "productive."
This is not accidental. It's structural design through omission.
The formal economy—the part we measure and value—is built on a foundation of care work that is:
- Performed primarily by women and children
- Unpaid or severely underpaid
- Invisible in economic accounting
- Essential for all other economic activity to function
But here's why it matters for our argument about intelligence vs. intellect:
The care economy is the largest-scale expression of natural intelligence we have. And it operates at precisely the metabolic efficiency we’re describing.
A parent sensing when a child needs comfort. An elder passing on knowledge through story. A nurse reading a patient's unspoken distress. A teacher recognizing which student needs encouragement vs. space. A community organizing mutual aid.
These are all acts of embodied, contextual, relational intelligence.
They require:
- Direct contact with reality
- Context-sensitivity
- Immediate feedback loops
- Knowing when enough is enough
- Operating at human metabolic scale (100 watts)
They cannot be automated by symbolic processing. They cannot be scaled through abstraction. They are irreducibly embodied.
And they are the foundation of everything else.
The formal economy—the part we measure—is the smaller, more energy-expensive layer built on top.
Abstract coordination. Financial engineering. Management consulting. Legal frameworks. Bureaucratic processing. These roles exist to coordinate activity after the foundational care work has already produced capable human beings.
When we say "the economy," we mean this top layer. The intellectual superstructure. The part that requires GDP-generating transactions.
When we don't say "the economy," we mean the foundational care work. The natural intelligence layer. The part that makes humans functional enough to participate in market transactions.
We built our entire economic system to ignore the largest, most efficient, most essential economy that exists.
And here's the insight hiding in the pattern: the care economy operates efficiently because its work carries meaning. Care ends when care is given.
The formal economy operates expensively because it operates one layer removed from meaning. Abstract coordination never ends naturally. Optimization never stops.
Now look at what's happening with AI.
The industry is automating the expensive intellectual superstructure—consultants, analysts, coordinators, the abstract roles that earn $100,000 while producing PowerPoint frameworks.
While the care economy—the $10.8 trillion foundation of natural intelligence—remains unpaid, invisible, and impossible to automate with current AI.
You can't train an LLM to:
- Sense when a child is anxious vs. tired
- Know when an elder needs company vs. solitude
- Read the emotional state of a classroom
- Provide the embodied presence that calms a patient
- Build trust through repeated contact over years
These require 100-watt natural intelligence operating in direct contact with reality.
The automation pattern is backwards.
Investment in AI is scaling the most energy-expensive, least essential work (abstract symbolic processing). AI cannot scale the most energy-efficient, most essential work (care and relationship maintenance).
But because we don't measure the care economy, we don't see the pattern.
We think we're automating "the economy" when we're really automating the expensive superstructure while leaving the invisible foundation untouched.
And here's what makes it exponentially expensive:
We’re asking AI to simulate meaning, the core of natural intelligence.
You can simulate reasoning. You can simulate language. You can simulate coordination.
You cannot simulate "being enough."
So the system keeps running. Every query costs energy. Every inference requires computation. There is no internal signal that says "this answer is complete." There is no metabolic satiety. There is no rest.
The cost goes exponential not because intelligence is expensive, but because there’s no signal for when to stop.
And both economies—the invisible care foundation and the measured formal superstructure—exist within a larger metabolic boundary we've already exceeded.
Since roughly 1970, humanity has operated in ecological overshoot, consuming resources and generating waste faster than Earth's systems can regenerate or absorb. We currently draw down ecosystems at roughly three times the planet's regenerative capacity annually. The feedback from this overshoot is already visible: vertebrate populations have declined 73%, and wild mammal biomass has fallen by 85% over the past 100,000 years.
These aren’t predictions; they’re measurements.
A superorganism encountering the boundaries of its system.
The Experiment We Already Ran
We already ran this experiment. We just didn't call it that. And so we missed its lessons.
Modern education systems reward symbolic manipulation and penalize embodied knowing. A child who can sit still and reproduce abstract procedures is "smart." A child who learns through movement, play, and direct contact with materials is "behind." We train intellect and call it intelligence. Then we wonder why anxiety, burnout, and alienation are epidemic.
The data is unambiguous:
- 70% of teenagers cite anxiety and depression as a major problem
- Students experience 15% higher cortisol levels before standardized tests
- High-anxiety students fall 2 years behind by the end of elementary school
- Test anxiety accounts for 2-15% of the variance in test scores
- Students spending 5+ hours online per day are 71% more likely to have suicide risk factors
No Child Left Behind pushed high-stakes testing down into the earliest grades. Hours previously spent in art, music, physical education, and recess were eliminated or severely limited. The focus shifted to "teaching to the test": training children in abstract symbol manipulation divorced from embodied experience, divorced from meaning.
The result: entire generations conditioned to mistake intellectual performance for intelligence. To operate without the regulatory signal that says "this is enough."
The economy made the same error. It overvalues abstract coordination roles—management, consulting, financial engineering—and undervalues care, maintenance, craft, and judgment.
The wage ratios tell the story:
- Management consultants/analysts: $100,000 median salary (top firms pay $150,000-$200,000+)
- Registered nurses: $93,000 median salary
- Manual labor: $31,000 average salary
We pay about 3.2 times as much for abstract symbolic work as for embodied physical work.
Consultants, who produce frameworks and analyses, earn more than nurses, who make life-or-death judgments based on direct embodied perception.
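For completeness, the ratios follow directly from the rounded salary figures above:

```python
# Wage ratios from the rounded salary figures above.

consultant, nurse, manual = 100_000, 93_000, 31_000
print(f"Consultant vs. manual labor: {consultant / manual:.1f}x")   # ~3.2x
print(f"Consultant vs. nurse:        {consultant / nurse:.2f}x")    # ~1.08x
```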
And the people doing the most abstract work are burning out at unprecedented rates:
- 76-82% of knowledge workers report burnout symptoms
- 50% of project managers experience burnout
- 47-83% of software developers report burnout
- Burned-out employees are 63% more likely to take sick days
- They're 2.6 times more likely to actively seek new employment
And productivity growth, the supposed justification for all this intellectual labor, has collapsed:
- Productivity growth fell from 3% per year in the 1960s to 0.7% in the 2020s
- Knowledge workers spend 20% of their workweek (one full day in five) searching for information
- They process 121 emails plus 226 work messages per day
- Email volume increased from 1,000 messages per year in the 1970s to 80,000 per year in 2022—an 80-fold increase
- 84% report digital exhaustion, despite 70% using AI tools weekly
This is the productivity paradox in its starkest form: we invested trillions in information technology and output per hour worked grew more slowly than before. More digital tools produced slower productivity growth, plus epidemic levels of burnout and mental health collapse.
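To see what the slowdown compounds to, run both growth rates over a working generation. The 30-year horizon is an arbitrary assumption; the rates are the ones quoted above.

```python
# What the slowdown compounds to over a 30-year working generation.
# The horizon is an arbitrary assumption; the growth rates are the ones quoted above.

years = 30
fast = 1.03 ** years       # 1960s-style productivity growth
slow = 1.007 ** years      # 2020s-style productivity growth
print(f"3.0%/yr for {years} years: output per hour x{fast:.2f}")   # ~2.4x
print(f"0.7%/yr for {years} years: output per hour x{slow:.2f}")   # ~1.2x
print(f"Foregone gain: {fast / slow:.1f}x")                        # ~2x
```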
What happened? We automated and amplified intellect—symbolic processing without stopping rules—while systematically ignoring and undervaluing the work that carries meaning.
AI did not create this problem. It simply inherits it, scales it, and makes the cost more visible.
A Tale of Two Stories
The AI industry is making projections about scale, capability, and market penetration. What they're not projecting is the thermodynamic wall they're building toward.
The numbers tell two stories. Only one of them is being sold.
The Market Story (what's being projected):
- AI market value: $757 billion (2025) → $1.8-3.7 trillion (2030-2031) → $4.8 trillion (2033)
- Projected economic impact: $15.7-22.3 trillion added to global GDP by 2030
- Industry CAGR: 19-37% annual growth
- Enterprise adoption: 78% of companies already using AI, 90% deploying generative AI
- Job creation: 97 million AI-related jobs needed by 2025
These projections assume unlimited scaling. The presentations are full of exponential curves that go up and to the right forever.
The Energy Story (what's actually happening):
- Global data center electricity: 448 TWh (2024) → 980 TWh (2030) = 119% increase in 6 years
- US data centers: 183 TWh (2024) → 426 TWh (2030) = 133% increase
- By 2030: data centers will consume 7.8% of US electricity (up from 4% in 2025)
- By 2030: equivalent to Japan's entire electricity consumption
- AI-optimized servers growing 30% annually
- The largest planned data center campuses now requesting up to 5 gigawatts each
But here's what those smooth projections don't show:
The grid can't keep up:
- Utility connection delays: up to 5 years
- Ohio data center requests dropped from 30 GW to 13 GW after new tariffs
- Bay Area data centers sitting empty, not expecting power until 2027-2028
- IEA estimates 20% of planned data center projects may be delayed due to grid constraints
- Grid upgrades require $720 billion in investment through 2030
- Coal retirement delays and nuclear plant restarts to supply power
The scaling math doesn't work:
Training trajectory:
- GPT-3: 1,287 MWh
- GPT-4: 50,000 MWh (39× increase)
- GPT-5 (projected): 1,500,000 MWh (30× increase)
That's a roughly 1,165× increase from GPT-3 to the GPT-5 projection. Even with 10× efficiency improvements per generation (which aren't materializing), total energy consumption grows superlinearly.
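The multiples above are just the ratios of those three estimates (the GPT-5 number, remember, is a projection, not a measurement):

```python
# Generation-over-generation growth in training energy (MWh), per the estimates above.
# The GPT-5 figure is a projection, not a measurement.

training_mwh = {"GPT-3": 1_287, "GPT-4": 50_000, "GPT-5 (projected)": 1_500_000}

names = list(training_mwh)
for prev, curr in zip(names, names[1:]):
    print(f"{prev} -> {curr}: {training_mwh[curr] / training_mwh[prev]:.0f}x")
print(f"{names[0]} -> {names[-1]}: {training_mwh[names[-1]] / training_mwh[names[0]]:,.1f}x overall")
```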
And here's the part nobody's talking about: inference costs dominate. Training gets the headlines, but running models in production accounts for 80-90% of AI's total electricity use. Every ChatGPT query, every Claude conversation, every AI coding assistant—all drawing power, linearly, with no economy of scale.
The ecosystem cannot sustain it:
The pattern holds globally: despite decades of 'decoupling' rhetoric, GDP remains tightly coupled to carbon emissions, material extraction continues rising, and cropland has expanded 65% since 1990, the opposite of dematerialization.
The hidden pattern:
The industry is building systems that have no internal "enough." No stopping signal. No metabolic satiety. They will run until external constraints—energy, grid capacity, cost—force them to stop.
It's the predictable systems-level result of trying to simulate intelligence without the metabolic satiety that meaning provides.
What AI Actually Automates
AI is most economically viable for automating the kinds of work that are already the most energy-expensive and metabolically unsustainable.
Knowledge work. Abstract coordination. Symbolic processing. The very categories that produce epidemic burnout in humans. The work that operates without internal stopping rules.
From a market perspective, this makes sense. If you've already built an economy that overvalues abstraction and underpays embodied work, AI targets the highest-margin work first.
But it means AI is inheriting—and scaling—the exact error that produced the productivity paradox in the first place.
We're automating intellect and calling it intelligence.
The arbitrage works because:
- Energy is currently cheap and centralized: data center electricity rates don't reflect true infrastructure costs
- Labor costs include all of human survival: housing, healthcare, food, rest, family time
- AI energy costs are externalized: distributed across utilities, subsidized by ratepayers
But each of these conditions is temporary. Energy prices are rising. Infrastructure can't keep pace. And both humans and AI systems operating without meaning will continue escalating energy consumption until external limits force adjustment.
When energy tightens, that 100-watt human who knows when enough is enough starts looking very efficient again.
Two Futures
Path B already exists. It's called the care economy.
$10.8 trillion in activity. 100 watts per person. Self-terminating through meaning. The largest economic foundation we have, operating efficiently precisely because its work carries internal stopping signals.
This isn't speculative. It's the infrastructure that makes everything else possible.
The fork: recognize this and build systems that operate the same way—or keep scaling simulated intellect without regulatory signals until energy constraints force the choice.
The grid is already teaching us which path physics supports.
Path A: Simulated intellect everywhere
In this future, AI continues scaling. More models, more parameters, more data centres. Knowledge work gets automated. Humans who once did abstract coordination either exit the workforce or shift to lower-paid embodied work.
But the energy demand keeps growing. Inference costs scale with usage. Training costs grow superlinearly with capability. The grid strains. Energy becomes increasingly expensive and rationed.
In this path, AI doesn't free humans from drudgery. It creates a new form of it: supplying the energy and labor, directly or indirectly, to keep running systems that have no internal stopping rule.
Systems optimizing without meaning, scaling without satiety, growing because they cannot feel complete.
The result: Energy slavery.
Path B: Intelligence-centred systems
In this future, AI becomes a tool for reducing unnecessary intellect, not scaling it.
Use AI to offload brute-force abstraction—the exhausting symbolic manipulation that humans do poorly and hate doing. Use it for search, translation, summarization, initial analysis. Let it handle the recursive symbolic processing that burns humans out.
But keep humans in the loop for:
- Context-sensing (what's actually happening here?)
- Meaning-making (what does this mean for us?)
- Judgment under uncertainty (what should we do?)
- Stopping rules (when is enough, enough?)
Design systems that slow down where needed instead of optimizing for speed everywhere. Recognize that not all problems benefit from faster processing. Some require time, embodied reflection, direct contact with reality, the presence of meaning.
Every pre-industrial society built stopping signals into their rhythms: Sabbath rest, fallow fields, seasonal work cycles, craft completion over speed.
Nature demonstrates it everywhere: predator-prey cycles that self-stabilize, trees that fruit and stop, animals that feed to satiation.
The work ends when the signal says it's done—not when optimization demands more. Modern systems severed these feedback loops and called it progress.
In this path, overall energy use drops because you're avoiding unnecessary computation.
You're not using a megawatt data centre to do what a 100-watt human can do better and cheaper. You're using AI where it's actually more efficient than humans (bulk symbolic processing) and humans where they're more efficient than AI (contextual intelligence with internal stopping rules).
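What might that division of labor look like in software? A minimal sketch, under stated assumptions: the function names, the pass limit, and the stand-in summarizer below are hypothetical, and the only point is the shape of the loop. The machine does the bulk symbolic pass; a human supplies context and the stopping decision; an external cap exists only as a backstop.

```python
# Minimal sketch of a Path B workflow. Everything here is hypothetical: the names,
# the pass limit, and the stand-in "summarizer" are illustrative only.

from dataclasses import dataclass

@dataclass
class Draft:
    text: str
    machine_passes: int = 0

def machine_summarize(source: str) -> Draft:
    """Offload the brute-force symbolic step (stand-in for a call to an AI model)."""
    return Draft(text=source[:200], machine_passes=1)

def human_judges_complete(draft: Draft) -> bool:
    """The stopping rule lives with the human: is this enough for the purpose at hand?"""
    answer = input(f"Good enough after {draft.machine_passes} machine pass(es)? [y/n] ")
    return answer.strip().lower() == "y"

def run(source: str, max_machine_passes: int = 3) -> Draft:
    draft = machine_summarize(source)
    # The loop ends when the human says "enough"; the hard cap is only a backstop,
    # never a target the machine optimizes toward.
    while not human_judges_complete(draft) and draft.machine_passes < max_machine_passes:
        refined = machine_summarize(draft.text)
        refined.machine_passes = draft.machine_passes + 1
        draft = refined
    return draft

if __name__ == "__main__":
    run("A long report that a human needs the gist of, not a perfect abstraction of.")
```

The design point is that termination belongs to the party that can sense sufficiency; the machine handles throughput, and the external cap exists only as a last resort.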
The Quiet Thesis
The cheapest intelligence we know is already alive.
It runs on 20 watts. Over 80 years, it consumes roughly 14 megawatt-hours, less energy than it takes to train a single large language model.
And it has already built the largest economy that exists.
The care economy—valued at $10.8 trillion globally, equal to 20-50% of GDP if counted—operates entirely on natural intelligence at human metabolic scale. It operates efficiently not because it's primitive, but because its work carries meaning. Care ends when care is given.
It's invisible. Not because it doesn't exist, but because we built an accounting system that only measures market transactions—the layer of activity that operates without clear stopping rules, that scales through abstraction, that mistakes motion for progress.
This is the real story:
We didn't just choose intellect over intelligence. We chose to systematically ignore the largest expression of natural intelligence, the work that operates with internal regulatory signals, the foundation that makes all other activity possible.
At the same time, we built an expensive intellectual superstructure on top: abstract coordination, symbolic processing, bureaucratic translation. This is what GDP measures. This is what produces 76-82% burnout rates and the collapse of productivity growth from 3% to 0.7% per year. This is what operates without meaning.
And this is what investment in AI is automating.
The industry projects $1.8-4.8 trillion in market value by 2030-2033. $15.7-22.3 trillion added to global GDP. Exponential growth curves going up and to the right.
The physics shows energy demand doubling by 2030, 20% of data center projects delayed by grid constraints, 5-year utility connection waits, infrastructure that can't expand fast enough.
These aren't competing narratives. They're information about boundaries.
A superorganism is running an experiment: can systems that operate without internal stopping rules—without meaning as a regulator—scale faster than the physical infrastructure that supports them?
The feedback is already arriving. Not as judgment, but as information.
Grid delays and infrastructure constraints are one signal. The 73% decline in vertebrate populations since 1970 is another. The fact that absorbing just this year's carbon surplus would require three times current global forest capacity is another.
These aren't separate crises—they're all manifestations of the same pattern: a superorganism operating without internal stopping signals, discovering its boundaries through direct encounter.
This is how living systems learn what's metabolically sustainable. Through direct encounter with limits.
And here's what the feedback is revealing:
The largest, most essential, most energy-efficient economy we have—the one that makes all other economic activity possible—cannot be automated because it operates with meaning. It requires embodied presence, context-sensitivity, direct contact with reality, the ability to sense when enough is enough. Natural intelligence at 100 watts.
AI can replace consultants producing PowerPoint frameworks. It cannot replace a parent sensing when a child needs comfort, a nurse reading unspoken distress, a teacher recognizing which student needs encouragement.
We're automating work that never stops on its own, while the work that completes itself naturally remains unchanged.
The pattern that worked at one scale (ignoring care work, overvaluing abstraction without stopping rules) produces different feedback at another scale (grid delays, energy constraints, infrastructure that can't keep pace).
The system is teaching itself what's sustainable through direct encounter with physical limits.
The thermodynamics are clear:
- Natural intelligence / care economy: efficient, self-limiting, meaning-driven
- Simulated intellect / formal economy: expensive, endless, meaning-absent
What we're learning:
The economy we didn't count is larger, more essential, and more efficient than the one we did.
The work that operates at 100 watts per person with internal stopping rules cannot be replaced by work that requires megawatts and never stops on its own.
The foundation—where meaning provides regulation—makes the superstructure possible, not the reverse.
Natural intelligence is cheap because meaning is metabolically efficient.
A system that knows why it exists and when its work is done:
- Expends minimal energy
- Self-terminates naturally
- Requires no external optimization
- Scales sublinearly
- Survives
Simulated intellect is expensive because meaning is missing.
A system without internal stopping signals:
- Consumes energy proportional to throughput
- Requires external limits
- Optimizes indefinitely
- Scales linearly or worse
- Requires constant intervention
And a superorganism discovers its own boundaries not through theory, but through the direct experience of encountering them.
The feedback is already here. The question is whether we recognize it as information about what carries meaning and what doesn't—about what can self-regulate and what requires external constraint—or whether we continue the experiment until the limits become inescapable.
What knows when to stop is what survives.
Everything else is just information about where the boundaries are.
This essay examines thermodynamic principles, metabolic efficiency, and the role of meaning as a regulatory signal in living systems. The numbers are real. The feedback is observable. The pattern is recognizable. What remains is integration.
Citations & Sources
Care Economy & Invisible Work:
- Riane Eisler: "The Real Wealth of Nations: Creating a Caring Economics" (2007)
- Center for Partnership Studies: Social Wealth Economic Indicators (SWEI)
- Global invisible economy: $10.8 trillion when measured
- Household work alone: 20-50% of GDP if valued
- Nancy Folbre: "The Invisible Heart: Economics and Family Values"
Ecological Overshoot & Biosphere Decline:
- National Ecological Footprint and Biocapacity Accounts 2025
- WWF Living Planet Report 2024: 73% decline in vertebrate populations since 1970
- Bar-On et al. (2018), PNAS: Wild mammal biomass decline over 100,000 years
- Global Material Flows Database: material extraction trends
- FAOSTAT: cropland expansion data (65% increase since 1990)
- Art Berman: "Ecomodernism: Modernity Without Ecology" (synthesis of overshoot data)
AI Energy Consumption:
- IEA Global Data Centre Energy Report 2024
- Goldman Sachs "AI to drive 165% increase in data center power demand by 2030"
- GPT-3/4/5 training estimates, ChatGPT daily usage
- Inference accounts for 80-90% of total AI electricity use
Human Brain & Metabolic Data:
- Neuroscience consensus: human brain ~20 watts baseline
- Active cognitive tasks increase brain energy by 5-8%
- Lifetime energy: 20 W × 80 years × 8,760 hours/year ≈ 14 MWh
Grid Constraints & Infrastructure:
- Bain & Company, S&P Global: utility delays up to 5 years
- Ohio requests: 30 GW → 13 GW after tariffs
- $720 billion grid investment needed through 2030
- 20% of projects may be delayed (IEA)
- Coal retirements delayed, nuclear restarts
Burnout, Productivity, Wages:
- Deloitte, Asana, NEA: 76-82% knowledge worker burnout
- BLS: productivity growth 3% (1960s) → 0.7% (2020s)
- BLS wage data 2024
- Knowledge workers: 20% of time searching, 121 emails + 226 messages daily
Student Mental Health:
- Pew Research: 70% of teens cite anxiety/depression
- APA: 15% higher cortisol before tests
- High-anxiety students 2 years behind
AI Market Projections:
- Multiple sources: $757B-4.8T (2025-2033)
- IDC: $22.3T economic impact by 2030
© 2026 Zaheer Merali.