Signal/Noise
2025-11-16
Three major forces are reshaping power structures in tech: AI hardware is becoming the new oil, with massive infrastructure bets creating winner-take-all dynamics; talent displacement is hitting knowledge workers harder than expected while creating new forms of economic dependency; and a quiet regulatory arbitrage is emerging as companies shop jurisdictions for favorable AI rules. The real story isn't about AI capabilities. It's about who controls the infrastructure, labor, and legal frameworks that will define the next economic era.
Infrastructure as Empire: The New Digital Colonialism
When Google drops $40 billion on Texas data centers and crypto miners abandon Bitcoin for AI compute, we’re witnessing the birth of a new imperial structure. These aren’t just business investments—they’re territorial claims in the infrastructure that will define economic power for decades.
The numbers tell the story: AI data centers will absorb $580 billion in spending this year, outpacing global oil exploration. FMC's €100 million raise for next-gen memory chips signals that even the components feeding AI are becoming geopolitical assets. Meanwhile, Bitfarms' pivot from crypto mining to AI compute represents something deeper: the reallocation of speculative capital toward infrastructure that actually matters.
What’s emerging isn’t just ‘AI companies’ but infrastructure empires. Google’s Texas investment creates not just data centers but economic dependencies: local grids, jobs, and tax bases all tethered to Google’s continued success. When Nvidia’s hyperlink agent can search your entire PC locally, it’s not just a privacy win; it’s Intel Inside 2.0, embedding Nvidia deep into personal computing infrastructure.
The real prize isn’t the AI models—it’s controlling the pipes, power, and processing that make AI possible. Countries and states are essentially bidding to become the extraction zones for the new digital economy, offering land, power, and tax breaks in exchange for being essential to someone else’s empire. The AI ‘race’ is actually a race to own the infrastructure layer of the next economy.
The Knowledge Worker Extinction Event
While everyone debates whether AI will replace jobs, the displacement is already happening—and it’s hitting exactly where economists said it wouldn’t. The old narrative was that AI would automate blue-collar work first, but architects, developers, and analysts are seeing their workflows fundamentally altered right now.
The AI safety founder arguing that the field ‘undervalues’ builders reveals the deeper issue: we’re training people for a world that’s disappearing faster than educational institutions can adapt. When 95% of generative AI pilots fail, it’s not because the tech doesn’t work—it’s because organizations don’t know how to reorganize around AI-augmented labor.
Mercor’s wage cut from $21 to $16 per hour for essentially identical work shows the real mechanism: AI doesn’t eliminate jobs directly, it makes human labor more substitutable, driving down wages even when humans remain necessary. The AI companies themselves are treating workers ‘like human garbage,’ cutting pay while increasing workloads under the guise of ‘steadier task volumes.’
Meanwhile, the Internet Archive’s trillion-webpage milestone represents something profound: human knowledge becoming raw material for AI training, with the original creators receiving nothing. We’re witnessing the enclosure of intellectual commons, where decades of human creativity and knowledge become inputs for systems that then compete with their creators.
The cruel irony is that the same people building AI safety systems and discussing alignment problems are being displaced by the very technologies they’re trying to make beneficial. Knowledge work isn’t being automated—it’s being commoditized at scale.
Regulatory Shopping and the Race to the Bottom
While everyone focuses on AI capabilities, the real action is in jurisdictions competing to offer the most permissive regulatory environment for AI development. Companies aren’t just choosing where to build data centers based on power costs—they’re jurisdiction-shopping for favorable AI rules.
The shadow AI phenomenon—employees using unauthorized AI tools—reflects a deeper regulatory challenge. When workers resort to shadow AI because official tools are too restrictive or expensive, they’re effectively voting with their workflows for more permissive AI environments. Organizations can’t enforce policies they haven’t written, and they can’t write policies for technologies moving faster than their legal departments.
Meanwhile, debates about AI safety and alignment are happening in academic circles while the actual deployment decisions are being made by people optimizing for quarterly metrics. The ‘modest proposal’ for NHS age limits reveals how economic pressure will drive AI adoption regardless of philosophical concerns—when systems are stretched, automation becomes politically viable even in sensitive areas like healthcare.
The EU’s attempt to regulate AI is already creating regulatory arbitrage, with companies choosing development locations based on compliance costs rather than technical capabilities. China’s conversion of farmland to data centers shows how quickly regulatory environments can shift when AI infrastructure becomes a national priority.
This creates a race to the bottom where jurisdictions compete by offering weaker oversight, faster approvals, and fewer restrictions. The winner won’t be whoever builds the best AI safety framework—it’ll be whoever can deploy AI fastest while maintaining plausible deniability about risks.
Questions
- If AI infrastructure becomes as strategically important as oil refineries, shouldn’t we be treating data center locations as national security decisions rather than business ones?
- When shadow AI usage surges because official tools are too restrictive, are organizations creating the conditions for their own regulatory capture?
- If knowledge workers are being commoditized by AI while infrastructure owners capture most of the value, are we creating a new feudal economy with digital landlords and intellectual serfs?