Signal/Noise

2025-11-06

Three seismic shifts are converging to reshape how value is created in the AI economy. While everyone’s watching the flashy partnerships and IPO drama, the real story is infrastructure capture—from physical data centers to attention networks—determining who controls the chokepoints of the next economic era.

The Great Infrastructure Land Grab

Google’s announcement that it’s planning orbital data centers by 2027 isn’t just engineering ambition—it’s infrastructure warfare disguised as innovation. While Chinese autonomous vehicle companies struggle with 12% share drops in Hong Kong debuts and need $860 million just to stay competitive, tech giants are racing to lock up the fundamental building blocks of AI computation.

The pattern is everywhere: Duke Energy backing satellite-based vegetation monitoring, the UK approving AI factories of “national importance” in Derbyshire, utilities scrambling to partner with AI startups just to keep the lights on. This isn’t about meeting demand—it’s about controlling supply chains that don’t exist yet.

Space-based data centers solve problems that terrestrial facilities can't: near-continuous solar power and freedom from land, water, and grid constraints. But more critically, they create a moat that's literally astronomical. Once you own orbital infrastructure, competitors face launch costs, regulatory approval, and physics itself. Google isn't just building data centers; it's claiming the high ground for the next century of computation.

Meanwhile, Palantir trades at 624 times earnings not because investors are "batshit crazy," but because they understand infrastructure capture. The company isn't selling software; it's positioning itself as the nervous system for how governments and enterprises will think. When your platform becomes the interface between human decision-making and AI computation, what you charge matters less than the fact that no one can route around you.

The infrastructure winners aren’t just providing compute—they’re defining the geography of intelligence itself.

The Attention Economy’s New Landlords

Snap’s $400 million deal with Perplexity reveals the next battlefield: who controls the interface between humans and AI-generated answers. This isn’t partnership—it’s platform colonization.

Google’s AI Overviews already cost News Corp referral traffic while the media giant scrambles to license content to multiple LLMs just to survive. The brutal math is simple: when AI answers questions directly, publishers become invisible unless they’re cited. News Corp’s “woo and sue” strategy acknowledges this reality—you either get paid for training data or you disappear.

But Snap’s move is more sophisticated than content licensing. By embedding Perplexity directly into Snapchat’s chat interface, they’re capturing attention at the moment of query formation. Users won’t search Google then get AI answers—they’ll ask questions inside Snapchat and never leave. The $400 million isn’t buying search technology; it’s buying the right to intermediate between users and all human knowledge.

The parallels to historical media consolidation are striking, but the stakes are higher. When newspapers controlled distribution, alternatives existed. When AI systems control answers, there’s nowhere else to go. Each platform that successfully embeds AI search becomes a sovereign territory in the attention economy.

SAP’s embrace of “agentic AI” and enterprise integration shows how this plays out in B2B markets. Once AI agents are embedded in business workflows, switching costs become prohibitive. You’re not just changing software—you’re rewiring institutional memory and decision-making processes.

The companies winning these integration races aren’t just capturing market share—they’re positioning themselves as the middlemen for human-AI interaction.

The Labor Replacement Ultimatum

Geoffrey Hinton’s warning cuts through Silicon Valley optimism with uncomfortable precision: “To make money you’re going to have to replace human labor.” The godfather of AI isn’t predicting the future—he’s describing the business model.

OpenAI's Sam Altman claiming he doesn't want government bailouts while projecting $1.4 trillion in infrastructure commitments reveals the cognitive dissonance. In the same week that data showed employees who use AI earning 40% more than those who don't, AI executives openly acknowledged that their technology only works economically by eliminating jobs. The math is simple but brutal: AI's productivity gains must exceed human labor costs, or the investments don't pencil.

The evidence is already visible in educational policy panic. When students face false accusations of AI use from detection tools with massive error rates, and universities respond with punitive measures rather than adaptation, the institutions are revealing their own obsolescence anxiety. If AI can produce work indistinguishable from human output, what exactly are schools selling?

Tabnine’s “org-native” AI agents represent the endgame: systems that understand company repositories, tools, and policies well enough to complete entire workflows autonomously. This isn’t augmentation—it’s replacement with better integration.

The geopolitical implications are staggering. AUKUS partnerships between SubSea Craft and Greenroom Robotics show how military contractors are already building AI-powered systems designed to “keep people out of harm’s way.” The same efficiency logic that drives corporate adoption becomes national security doctrine.

Hinton's call to rethink how "we organize society" sounds almost quaint against this backdrop. The organization is already happening, driven by companies building systems that require fewer humans to operate. The social consequences will follow the technical capabilities, not precede them.

Questions

  • If orbital data centers become critical infrastructure, what happens when geopolitical tensions extend into space?
  • Are we building AI systems that make human expertise irrelevant, or just redistributing it to whoever controls the interfaces?
  • When the entire economy runs through AI intermediaries, who has the power to turn it off?
