Napkin math: Rapidly declining inference costs are making AI assistants cheaper than ever

The rapid decline in AI inference costs is making personal AI assistants, like those depicted in science fiction, increasingly affordable to run.

Current cost analysis: The computational cost for an AI assistant that interacts with users as frequently as they check their smartphones has dropped to remarkably low levels.

  • Based on average smartphone usage patterns of 144 daily interactions, with four exchanges per interaction over 30 days, the raw computing cost would be approximately 75 cents per month
  • This calculation assumes processing about 5 million tokens monthly at a rate of 15 cents per million tokens, which works out to 5 × $0.15 ≈ $0.75 (the napkin-math sketch after this list walks through the arithmetic)
  • Even with a typical commercial markup of 10x, the consumer price would be around $7 monthly, less than half the cost of a Netflix subscription
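
To make the arithmetic explicit, here is a minimal sketch of the napkin math, assuming the figures cited above (144 smartphone checks per day, four exchanges per check, 30 days, roughly 5 million tokens a month at 15 cents per million, and a 10x markup). The tokens-per-exchange number is back-calculated for illustration and is not stated in the source.

```python
# Napkin math for the raw inference cost of an always-on AI assistant.
# Figures come from the bullets above; the tokens-per-exchange value is
# back-calculated here and is an assumption, not stated in the source.

INTERACTIONS_PER_DAY = 144        # average smartphone checks per day
EXCHANGES_PER_INTERACTION = 4     # user/assistant exchanges per check
DAYS_PER_MONTH = 30
TOKENS_PER_MONTH = 5_000_000      # ~5 million tokens processed monthly
PRICE_PER_MILLION_TOKENS = 0.15   # USD, small-model inference pricing
COMMERCIAL_MARKUP = 10            # typical 10x markup to consumer price

exchanges_per_month = INTERACTIONS_PER_DAY * EXCHANGES_PER_INTERACTION * DAYS_PER_MONTH
tokens_per_exchange = TOKENS_PER_MONTH / exchanges_per_month  # ~290 tokens

raw_cost = (TOKENS_PER_MONTH / 1_000_000) * PRICE_PER_MILLION_TOKENS  # $0.75
consumer_price = raw_cost * COMMERCIAL_MARKUP                         # $7.50

print(f"{exchanges_per_month:,} exchanges/month, ~{tokens_per_exchange:.0f} tokens each")
print(f"Raw compute cost: ${raw_cost:.2f}/month")
print(f"With {COMMERCIAL_MARKUP}x markup: ${consumer_price:.2f}/month")
```

Running it prints roughly 17,280 exchanges a month at about 290 tokens each, a raw compute cost of $0.75, and a marked-up consumer price of $7.50, matching the figures above.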

Technical context: AI inference costs (the computing resources needed to run AI models) have been declining dramatically year over year.

  • Inference costs have been falling by roughly 90% per year, a compounding decline sketched after this list
  • Smaller, more efficient AI models are making these services particularly cost-effective
  • The token-based pricing model (where tokens are basic units of text processing) allows for precise cost calculations based on usage
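
For a sense of how a roughly 90% annual decline compounds, the short sketch below projects the 15-cents-per-million-tokens figure forward a few years. The constant decline rate and the three-year horizon are illustrative assumptions, not a forecast.

```python
# Illustrative projection of a ~90% annual decline in inference pricing.
# Starting point is the $0.15 per million tokens figure cited above; the
# constant decline rate is an assumption for illustration only.

price_per_million = 0.15   # USD per million tokens, today
annual_decline = 0.90      # costs fall ~90% each year

for year in range(1, 4):
    price_per_million *= (1 - annual_decline)
    print(f"Year {year}: ~${price_per_million:.5f} per million tokens")
```

At that rate, the per-token price becomes a rounding error within a couple of years, which is why the analysis treats raw compute as a negligible part of the consumer price.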

Market implications: The extremely low operational costs are likely to drive widespread AI assistant adoption across various platforms and devices.

  • Hardware manufacturers could integrate AI assistants as differentiating features in their products
  • Software companies might incorporate AI assistants to enhance their subscription services or advertising platforms
  • The cost-to-benefit ratio strongly favors implementation, as the user experience improvements far outweigh the minimal operational expenses

Looking ahead: While basic computational costs have become negligible, the real challenge is building AI assistants capable of the sophisticated interactions portrayed in science fiction. Implementation costs and technical hurdles, rather than computing expenses, will likely be the primary factors determining widespread adoption.

75 Cents per Month by @ttunguz
