Signal/Noise
2025-12-28
As foundational AI models rapidly commoditize, the real battle for power and profit is shifting away from raw intelligence. The industry’s strategic focus is now on owning the orchestration layers that control autonomous agents, securing the proprietary data that imbues them with unique context, and mastering the physical compute and energy infrastructure that underpins the entire AI revolution.
The Agent Wars: The Battle for the AI Control Plane
Reports detailing Google’s new ‘Agent OS’ and Microsoft’s ‘Autonomy Fabric’ are making headlines, promising seamless orchestration of complex tasks across enterprise software suites. Concurrently, a smaller startup, ‘TaskFlow AI,’ recently demonstrated an autonomous agent managing a multi-vendor supply chain, yielding a reported 15% efficiency gain in its pilot.

This isn’t just about AI performing tasks; it’s about AI deciding which tasks to prioritize, how to execute them, and across which platforms. The foundational model API has largely become a commodity, or is rapidly heading in that direction. The true prize now is the orchestration layer: the operating system for autonomous agents. This layer isn’t merely connecting APIs; it’s capturing the context of your entire digital workflow. By becoming the default brain that navigates and executes across your SaaS subscriptions, internal tools, and external services, these ‘Agent OS’ players are establishing a new, potent form of lock-in. They are moving to own the ‘attention’ of other AIs, which in turn dictates human attention.

The core business model here isn’t selling raw intelligence; it’s selling control over the execution of that intelligence. The companies that win this battle won’t just provide picks and shovels; they’ll own the entire gold mine by controlling who gets to dig where. This represents the next frontier of platform plays, effectively turning every enterprise application into a mere feature within an AI-driven super-app.
The Data Moat Deepens: Context is King, and It’s Expensive
‘MedAI Solutions’ just closed a $500M Series C round, not for a breakthrough model, but for its exclusive, meticulously curated dataset of anonymized patient records and clinical trial results. In stark contrast, a Fortune 500 company’s recent Q4 earnings call revealed significant write-downs from failed AI integration projects, citing ‘insufficient data quality’ as the primary roadblock.

The commodity trap is snapping shut on foundational models. With increasingly capable open-source alternatives and competitive API pricing, the raw intelligence of a large language model is no longer the primary differentiator. Value is migrating to the context these models operate within, which makes proprietary, high-quality, domain-specific data the ultimate competitive moat. It’s not just about static training data anymore; it’s about real-time, context-rich data for Retrieval Augmented Generation (RAG) and continuous fine-tuning.

Companies like MedAI aren’t selling models; they’re selling unique, actionable insight derived from data that no one else possesses. This creates a stark economic divide: those with clean, accessible, proprietary data are poised for massive leverage, while those without face crippling costs in data acquisition, cleaning, and preparation. This isn’t merely an IT problem; it’s a strategic economic one, driving a new wave of M&A focused purely on data acquisition and creating steep barriers to entry for data-poor challengers. The hidden business model here is data arbitrage: unique datasets are the new oil, and refining them is the real value proposition.
The Compute Wall: AI’s Unseen Energy and Resource Bill
New reports from the International Energy Agency predict a staggering 30% surge in global data center electricity consumption by 2030, overwhelmingly driven by AI. Simultaneously, ‘QuantumCool,’ a stealth startup, announced a breakthrough in liquid cooling for AI chips, claiming a 40% reduction in energy footprint and attracting immediate investment from major hyperscalers.

For years, the AI narrative has been dominated by algorithmic breakthroughs and rising parameter counts, while the literal power bill this revolution is accruing has gone largely ignored. The insatiable demand for compute, fueled by ever-larger multimodal models and the proliferation of autonomous agents, is hurtling toward a hard ceiling: energy. This isn’t just an environmental concern; it’s a profound economic and geopolitical one. Hyperscalers are now effectively in the energy business, needing to secure massive, reliable, and increasingly green power sources.

The strategic implications are vast: whoever controls access to cheap, abundant, and efficient compute resources (including the power plants and cooling infrastructure behind them) will ultimately control the pace and direction of AI development. This creates new leverage points for hardware manufacturers, energy providers, and even nations with stable grids. The ‘picks and shovels’ are no longer just GPUs; they are the literal power lines and cooling systems that keep the digital brains humming. The hidden vulnerability in many AI strategies is the assumption of infinite, cheap compute, and that assumption is about to be violently disproven. This is the WALL-E scenario: the infrastructure needed to support the machines becomes the dominant constraint.
Questions
- As AI agents become the default interface, will the battle shift from controlling human attention to controlling agent attention, making human-facing UIs secondary?
- If proprietary data is the new gold, what are the ethical and regulatory implications of companies hoarding or monopolizing context-rich datasets, especially in sensitive domains?
- Could a global ‘compute cartel’ emerge from the convergence of hardware manufacturers, energy giants, and hyperscalers, dictating the terms of AI development for everyone else?