What the Industry Projects vs. What the Physics Shows
A Side-by-Side Comparison of AI Growth Narratives
THE MARKET STORY
What AI companies and analysts are projecting:
Financial Projections
- 2025 Global AI Market: $757 billion
- 2030 Projection: $1.8-3.7 trillion
- 2033 Projection: $4.8 trillion
- Economic Impact by 2030: $15.7-22.3 trillion added to global GDP
- Growth Rate: 19-37% CAGR
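These headline figures hang together arithmetically. A minimal sketch, using only the numbers quoted above (the $757B 2025 base, the 19-37% CAGR range, and the $4.8T 2033 target); the `compound` helper is illustrative, not from any cited source:

```python
# Compound-growth check of the industry's own projections.
def compound(base, rate, years):
    """Project a value forward at a fixed annual growth rate."""
    return base * (1 + rate) ** years

base_2025 = 757  # global AI market, $B (2025)

low_2030 = compound(base_2025, 0.19, 5)   # ~1,806 -> the ~$1.8T low case
high_2030 = compound(base_2025, 0.37, 5)  # ~3,653 -> the ~$3.7T high case

# CAGR implied by the $4.8T-by-2033 projection (8 years out):
implied_cagr_2033 = (4800 / base_2025) ** (1 / 8) - 1  # ~0.26, i.e. ~26%
```

So the 2030 range follows directly from compounding the stated CAGR bounds, and the 2033 figure sits near the middle of that growth-rate band.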
Adoption Projections
- Current Enterprise Adoption: 78% of companies using AI
- Generative AI Deployment: 90% of organizations
- Job Creation: 97 million new AI-related jobs projected by 2025
- Industry Transformation: Manufacturing gains $3.78T, Financial services $1.15T, Professional services $1.85T
Narrative
- Exponential growth curves going up and to the right
- AI as inevitable economic force
- Massive productivity gains across all sectors
- Job creation and transformation
- Unlimited scaling potential
Key Assumption: Energy and infrastructure will scale to meet demand
THE PHYSICS STORY
What the energy data actually shows:
Energy Requirements
- 2024 Data Center Consumption: 448 TWh globally
- 2030 Projection: 980 TWh (119% increase in 6 years)
- US Alone: 183 TWh (2024) → 426 TWh (2030) = 133% increase
- By 2030: Data centers = 7.8% of US electricity (up from 4% in 2025)
- By 2030: Equivalent to Japan's entire electricity consumption
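The percentage increases above follow from the TWh figures themselves; a quick sketch using only the numbers quoted (448 to 980 TWh global, 183 to 426 TWh US):

```python
# Percentage-increase check for the data-center consumption figures.
def pct_increase(start, end):
    """Growth from start to end, as a percentage of the starting value."""
    return (end - start) / start * 100

global_growth = pct_increase(448, 980)  # ~119% (2024 -> 2030, global)
us_growth = pct_increase(183, 426)      # ~133% (2024 -> 2030, US)
```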
Training Cost Trajectory
- GPT-3: 1,287 MWh
- GPT-4: 50,000 MWh (39× increase)
- GPT-5 (projected): 1,500,000 MWh (30× increase)
- Total Scaling: 1,165× increase across three generations
Reality Check: A human brain uses 8.76 MWh over an entire 80-year lifetime. GPT-4 training used roughly 5,700 times as much energy.
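The generation-to-generation multipliers and the brain comparison can be checked directly from the MWh figures above (GPT-5's figure is the projection quoted in the list, not a measurement):

```python
# Ratio check for the training-energy trajectory (all MWh values from the text).
gpt3, gpt4, gpt5 = 1_287, 50_000, 1_500_000
brain_lifetime_mwh = 8.76  # human brain over 80 years, as stated above

gen_3_to_4 = gpt4 / gpt3              # ~39x
gen_4_to_5 = gpt5 / gpt4              # 30x (projected)
total_scaling = gpt5 / gpt3           # ~1,166x, quoted as 1,165x
brain_ratio = gpt4 / brain_lifetime_mwh  # ~5,708x, quoted as 5,700x

# Side note: 8.76 MWh spread over 80 years is ~12.5 W of continuous draw.
avg_watts = 8.76e6 / (80 * 8760)
```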
Grid Constraints (Current Reality)
- Utility Connection Delays: Up to 5 years
- Ohio Example: Data center requests dropped from 30 GW to 13 GW after the state imposed a tariff requiring payment for 85% of contracted energy, used or not
- Bay Area: Data centers sitting empty, no power until 2027-2028
- IEA Warning: 20% of planned projects may be delayed due to grid constraints
- Required Investment: $720 billion in grid upgrades just to meet demand
What's Actually Happening
- Gas turbine orders hit 20-year high (14 GW in 2024, 18 GW in H1 2025), driven largely by data center demand
- At least 10 GW of coal retirements delayed or converted to gas since early 2025
- Nuclear plant restarts: ~3 GW coming back online by 2030
- States implementing protective tariffs to shield residential customers from infrastructure costs
- Utility analysts report data centers "underperforming expectations" due to cancellations and delays
Inference Reality
- 80-90% of AI electricity use is inference (running models), not training
- Daily ChatGPT usage: ~1 GWh, roughly the daily consumption of 33,000 US households
- No economy of scale: every query costs energy, so total consumption scales linearly with adoption
- If usage grows 10×, energy cost grows 10×
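That linear relationship is the whole model; a sketch using the ~1 GWh/day figure quoted above (the 10× scenario is simply a multiplier, not a forecast):

```python
# Linear inference-cost model: with no economy of scale,
# k-times the usage costs k-times the energy.
DAILY_ENERGY_GWH = 1.0  # ChatGPT's quoted daily inference load

annual_twh = DAILY_ENERGY_GWH * 365 / 1000  # ~0.37 TWh/year at current usage

def scaled_annual_twh(growth_factor):
    """Annual inference energy if usage grows by growth_factor."""
    return annual_twh * growth_factor

tenfold = scaled_annual_twh(10)  # 10x usage -> ~3.65 TWh/year
```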
Key Reality: Energy infrastructure cannot scale at the rate AI companies are projecting
THE CONTRADICTION
The Arbitrage That's Closing
Why AI Looks Cheap (For Now):
- Energy prices don't reflect true infrastructure costs
  - Data center rates are subsidized
  - Grid upgrade costs externalized to ratepayers
  - Environmental costs not priced in
- Human labor costs include total metabolic overhead
  - Housing, healthcare, food, rest, family time
- AI gets to externalize its "metabolic" costs (electricity, cooling, infrastructure)
- Capital abundance masks operating costs
  - Venture funding absorbs losses during training
  - True operational costs hidden in cloud bills
- Temporary regulatory environment
  - Infrastructure costs being socialized
  - But states are now pushing back (Ohio, Bay Area, others)
What Happens When:
- Energy prices rise (already happening)
- States refuse to socialize infrastructure costs (already happening)
- Grid capacity constraints delay projects (already happening)
- Environmental costs get priced in (coming)
- Capital becomes scarce (cyclical)
That 100-watt human starts looking very efficient again.
THE THERMODYNAMIC REALITY
Natural Intelligence (Human)
- Power: 100 watts total (brain: 20W)
- Fuel: Food (ultimately solar-captured energy)
- Lifetime Energy: 8.76 MWh (80 years)
- Scaling: Sublinear (experience improves efficiency)
- Stopping Rule: Internal (knows when enough is enough)
- Context: Direct contact with reality
Simulated Intellect (AI)
- Training Energy: 50,000+ MWh (GPT-4)
- Inference Energy: ~0.3 Wh per query, scaling linearly with usage
- Infrastructure: Megawatts for data centers, cooling, transmission
- Scaling: Linear or superlinear
- Stopping Rule: None (external limits only)
- Context: Symbolic space, one layer removed
Energy Ratio: GPT-4 training = 5,700 human brain lifetimes
THE PATTERN WE'RE REPEATING
The Economy We Don't Count
Before we even get to the formal economy, there's a larger economy we've chosen not to see.
The Care Economy (Invisible):
- Global value: $10.8 trillion (when measured)
- Household work alone: 20-50% of GDP
- Who performs it: Primarily women, unpaid or underpaid
- What it includes: Childcare, eldercare, emotional labor, food prep, community maintenance, teaching, nursing
- Energy profile: 100-watt human scale, self-renewing
- Can AI replace it? No—requires embodied presence, context-sensitivity, direct contact with reality
The Formal Economy (What We Measure):
- Built on top of the care economy foundation
- Only counts market transactions and government spending
- Rewards abstract coordination over embodied care
- GDP counts negatives (oil spills, tobacco, arms) as positives
- Ignores household, volunteer, and natural economies
What This Means:
The largest economy that exists—the one that makes all other economic activity possible—operates at 100-watt human metabolic scale and is completely invisible in GDP accounting.
Meanwhile, we're building AI infrastructure that requires megawatts to automate the expensive superstructure—the abstract intellectual work that sits on top of the care foundation.
The automation pattern is backwards: AI replaces the energy-expensive intellectual work while the energy-efficient care work (which is larger and more essential) remains unchanged.
We Already Ran This Experiment
Education:
- Chose abstract symbol manipulation over embodied learning
- Result: 70% of teens cite anxiety and depression as a major problem
- High-anxiety students fall roughly two years behind by the end of elementary school
Economy:
- Paid consultants ($101K) more than nurses ($93.6K)
- Overvalued abstract coordination, undervalued embodied judgment
- Result: 76-82% knowledge worker burnout
Productivity:
- Invested trillions in IT
- Knowledge workers now spend 20% of time searching for information
- Process 121 emails + 226 messages daily
- Email volume increased 80× since 1970s
- Result: Productivity growth collapsed from 3% (1960s) to 0.7% (2020s)
What AI Inherits: All of it. The same confusion between intellect and intelligence. The same overvaluation of abstraction. The same linear or superlinear scaling. Just at data center scale, with the energy cost visible in gigawatt-hours instead of hidden in glucose and cortisol.
THE FEEDBACK LOOP
Two projections. One experiment.
Industry projects: $1-4 trillion by 2030-2033, exponential growth, $15-22 trillion economic impact
Physics shows: Energy doubling by 2030, 20% of projects delayed, 5-year waits, states pushing back, coal staying open
This isn't a competition between narratives. It's a superorganism discovering its own boundaries through direct encounter.
The delays aren't failures. They're information. The constraints aren't obstacles. They're feedback about what's sustainable at scale. The rising costs, the grid limitations, the protective tariffs—these are how living systems learn metabolic limits.
What the Feedback Is Revealing
The pattern that worked at one scale produces different results at another:
At individual scale:
- Ignore care work, reward abstraction
- Result: Some burnout, manageable
At institutional scale:
- Same pattern, GDP measures only markets
- Result: 76-82% burnout, productivity collapse
At infrastructure scale:
- Same pattern, automated via AI
- Result: Megawatt demand, grid constraints, delays
The superorganism is teaching itself: The accounting method that made care work invisible doesn't scale to data center infrastructure. The energy cost that could be hidden in glucose and unpaid labor becomes visible as gigawatt-hours and infrastructure delays.
The Foundation That Cannot Be Automated
While the experiment runs, the $10.8 trillion care economy continues:
- Operating at 100 watts per person
- Making all formal economic activity possible
- Invisible in GDP accounting
- Cannot be replaced by megawatt infrastructure
This isn't ideology. It's metabolism.
A superorganism running an experiment: can symbolic processing at data center scale deliver more value than embodied intelligence at human scale?
The feedback is arriving. Not as judgment. As constraint. As information about boundaries.
THE LEARNING PROCESS
We've already made one error: building an economic system that ignores the largest, most efficient, most essential economy—the $10.8 trillion care economy operating at 100-watt human scale.
Now we're running a second experiment: automating the expensive intellectual superstructure while calling it "the future of work."
The superorganism is discovering:
- The superstructure depends on the invisible care foundation
- The superstructure is energy-expensive and produces burnout
- The foundation is energy-efficient and irreplaceable
- AI can only touch the superstructure, not the foundation
- The feedback arrives as delays, constraints, rising costs
This is how living systems learn boundaries—not through prediction, but through direct encounter with limits.
What happens next isn't a choice between paths. It's feedback integration:
Pattern A: Continue Until Constraint Forces Adjustment
- Scale AI to automate formal economy
- Energy demand doubles, infrastructure lags
- Delays accumulate, costs rise, projects cancel
- The 100-watt humans doing care work remain unchanged
- System discovers: the foundation was load-bearing all along
- Learning: hard constraint, forced recognition
Pattern B: Recognize Feedback Early
- See delays and constraints as information about boundaries
- Notice: $10.8 trillion foundation operates at 100W, can't be automated
- Use AI to reduce unnecessary intellectual overhead
- Support humans in the energy-efficient mode
- Learning: soft constraint, adaptive response
Neither is a moral choice. Both are learning processes.
The difference is whether the superorganism integrates feedback before or after encountering hard metabolic limits.
THE PATTERN
We built an accounting method that treats care as zero and destruction as growth. Oil spills add to GDP. Parents caring for children contribute nothing.
We're now scaling that accounting error to megawatt infrastructure.
The feedback is arriving:
- 20% of projects delayed
- 5-year utility waits
- States protecting grids
- Coal staying open
- $720B upgrades needed
- Infrastructure can't keep pace
This isn't failure. It's information.
A superorganism discovering: the pattern that made care work invisible produces different feedback at data center scale.
The thermodynamics:
- Natural intelligence: already built $10.8T economy at 100W per person
- Simulated intellect: requires megawatts, scales linearly, hitting constraints
- Care foundation: makes survival possible, can't be automated
- Formal superstructure: depends on foundation, increasingly automated
- Physical limits: arriving as delays, costs, infrastructure gaps
What the system is teaching itself:
The economy we didn't count is larger, more essential, and more efficient than the one we did.
The work that operates at 100 watts cannot be replaced by work that requires megawatts.
The foundation makes the superstructure possible, not the reverse.
Natural intelligence is cheap.
Simulated intellect is not.
The learning process is already underway.
The only question is how much constraint is required before the pattern becomes visible—before a superorganism recognizes its own boundaries not through theory, but through the direct experience of encountering them.
The feedback is information about what's sustainable. Whether we call it "the grid deciding" or "nature forcing us" or "hitting limits"—it's the same phenomenon: a living system learning the edges of its metabolic niche through direct encounter.
The data is clear. The constraints are real. The foundation remains unchanged.
What remains is integration: will the superorganism adjust its pattern before hard limits force the adjustment, or discover its boundaries the way all living systems eventually do, by reaching them?
Sources
- AI market projections: Grand View Research, Statista, IDC, ABI Research, UNCTAD
- Energy data: IEA, Goldman Sachs, S&P Global, EPRI, Bain & Company
- Grid constraints: utility company reports, regulatory filings, industry analysis
- Human brain data: neuroscience consensus literature
- Burnout/productivity: BLS, Deloitte, Asana, NEA
- Student data: Pew Research, APA
The numbers don't lie. The question is whether we're willing to see what they're telling us.