Signal/Noise

2025-11-16

Three major forces are reshaping power structures in tech: AI hardware is becoming the new oil, with massive infrastructure bets creating winner-take-all dynamics; talent displacement is hitting knowledge workers harder than expected while creating new forms of economic dependency; and a quiet regulatory arbitrage is emerging as companies shop jurisdictions for favorable AI rules. The real story isn’t about AI capabilities—it’s about who controls the infrastructure, labor, and legal frameworks that will define the next economic era.

Infrastructure as Empire: The New Digital Colonialism

When Google drops $40 billion on Texas data centers and crypto miners abandon Bitcoin for AI compute, we’re witnessing the birth of a new imperial structure. These aren’t just business investments—they’re territorial claims in the infrastructure that will define economic power for decades.

The numbers tell the story: AI data centers will consume $580 billion this year, outpacing global oil exploration spending. FMC’s €100 million raise for next-gen memory chips signals that even the components feeding AI are becoming geopolitical assets. Meanwhile, Bitfarms’ pivot from crypto mining to AI represents something deeper—the reallocation of speculative capital toward infrastructure that actually matters.

What’s emerging isn’t just ‘AI companies’ but infrastructure empires. Google’s Texas investment creates not just data centers but economic dependencies—local grids, jobs, tax bases all tethered to Google’s continued success. When Nvidia’s hyperlink agent can search your entire PC locally, it’s not just a privacy win; it’s Intel Inside 2.0, embedding Nvidia deep into personal computing infrastructure.

The real prize isn’t the AI models—it’s controlling the pipes, power, and processing that make AI possible. Countries and states are essentially bidding to become the extraction zones for the new digital economy, offering land, power, and tax breaks in exchange for being essential to someone else’s empire. The AI ‘race’ is actually a race to own the infrastructure layer of the next economy.

The Knowledge Worker Extinction Event

While everyone debates whether AI will replace jobs, the displacement is already happening—and it’s hitting exactly where economists said it wouldn’t. The old narrative was that AI would automate blue-collar work first, but architects, developers, and analysts are seeing their workflows fundamentally altered right now.

The AI safety founder arguing that the field ‘undervalues’ builders reveals the deeper issue: we’re training people for a world that’s disappearing faster than educational institutions can adapt. When 95% of generative AI pilots fail, it’s not because the tech doesn’t work—it’s because organizations don’t know how to reorganize themselves around AI-augmented labor.

Mercor’s wage cut from $21 to $16 per hour for essentially identical work shows the real mechanism: AI doesn’t eliminate jobs directly, it makes human labor more substitutable, driving down wages even when humans remain necessary. The AI companies themselves are treating workers ‘like human garbage,’ cutting pay while increasing workloads under the guise of ‘steadier task volumes.’

Meanwhile, the Internet Archive’s trillion-webpage milestone represents something profound: human knowledge becoming raw material for AI training, with the original creators receiving nothing. We’re witnessing the enclosure of intellectual commons, where decades of human creativity and knowledge become inputs for systems that then compete with their creators.

The cruel irony is that the same people building AI safety systems and discussing alignment problems are being displaced by the very technologies they’re trying to make beneficial. Knowledge work isn’t being automated—it’s being commoditized at scale.

Regulatory Shopping and the Race to the Bottom

While everyone focuses on AI capabilities, the real action is in jurisdictions competing to offer the most permissive regulatory environment for AI development. Companies aren’t just choosing where to build data centers based on power costs—they’re jurisdiction-shopping for favorable AI rules.

The shadow AI phenomenon—employees using unauthorized AI tools—reflects a deeper regulatory challenge. When workers resort to shadow AI because official tools are too restricted or expensive, they’re effectively voting with their workflows for more permissive AI environments. Organizations can’t enforce policies they haven’t written, and they can’t write policies for technologies moving faster than their legal departments.

Meanwhile, debates about AI safety and alignment are happening in academic circles while the actual deployment decisions are being made by people optimizing for quarterly metrics. The ‘modest proposal’ for NHS age limits reveals how economic pressure will drive AI adoption regardless of philosophical concerns—when systems are stretched, automation becomes politically viable even in sensitive areas like healthcare.

The EU’s attempt to regulate AI is already creating regulatory arbitrage, with companies choosing development locations based on compliance costs rather than technical capabilities. China’s conversion of farmland to data centers shows how quickly regulatory environments can shift when AI infrastructure becomes a national priority.

This creates a race to the bottom where jurisdictions compete by offering weaker oversight, faster approvals, and fewer restrictions. The winner won’t be whoever builds the best AI safety framework—it’ll be whoever can deploy AI fastest while maintaining plausible deniability about risks.

Questions

  • If AI infrastructure becomes as strategically important as oil refineries, shouldn’t we be treating data center locations as national security decisions rather than business ones?
  • When shadow AI usage surges because official tools are too restrictive, are organizations creating the conditions for their own regulatory capture?
  • If knowledge workers are being commoditized by AI while infrastructure owners capture most of the value, are we creating a new feudal economy with digital landlords and intellectual serfs?
