Why enterprises are increasingly using small language models

The growing prominence of smaller AI models in enterprise applications is reshaping how businesses approach artificial intelligence implementation, with a focus on efficiency and cost-effectiveness.

Key findings from industry research: Databricks’ State of Data + AI report finds that 77% of enterprise AI implementations use smaller models with fewer than 13 billion parameters, while large models exceeding 100 billion parameters account for only 15% of deployments.

  • Enterprise buyers are increasingly scrutinizing the return on investment of larger AI models, particularly in production environments
  • The cost differential between small and large models is significant, with pricing climbing steeply as parameter counts rise (a rough cost comparison follows this list)
  • This trend reflects a broader shift toward practical, cost-effective AI solutions in business settings
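
For a concrete sense of that gap, here is a minimal back-of-the-envelope sketch in Python. The per-token prices and traffic volume are hypothetical assumptions chosen for illustration, not figures from the report or from any vendor's price list.

```python
# Illustrative only: hypothetical per-token prices showing how inference
# costs diverge between a small and a large model at enterprise volume.
HYPOTHETICAL_PRICE_PER_1K_TOKENS = {
    "small-7b": 0.0002,       # assumed USD per 1,000 tokens
    "large-100b-plus": 0.01,  # assumed USD per 1,000 tokens
}

def monthly_cost(tokens_per_request: int, requests_per_day: int, price_per_1k: float) -> float:
    """Estimate monthly spend for a given traffic profile."""
    tokens_per_month = tokens_per_request * requests_per_day * 30
    return tokens_per_month / 1_000 * price_per_1k

for model, price in HYPOTHETICAL_PRICE_PER_1K_TOKENS.items():
    cost = monthly_cost(tokens_per_request=1_000, requests_per_day=50_000, price_per_1k=price)
    print(f"{model}: ~${cost:,.0f}/month")
```

At these assumed prices the small model serves the same traffic for a few hundred dollars a month versus tens of thousands for the large one; the exact figures matter less than how quickly the multiplier compounds at production volume.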

Performance advantages of smaller models: Recent advancements have significantly improved the capabilities of smaller AI models, making them increasingly attractive alternatives to their larger counterparts.

  • Smaller models now approach the performance levels of larger models in many applications
  • The reduced cost allows organizations to run the same query multiple times and keep the consensus answer, similar to using multiple human reviewers (a minimal sketch of this pattern follows this list)
  • This redundancy capability enhances accuracy and reliability while maintaining cost advantages
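
A minimal sketch of that verification pattern is below; query_small_model is a hypothetical placeholder for whatever inference client an organization actually uses, and the sample count of five is an arbitrary choice for illustration.

```python
from collections import Counter

def query_small_model(prompt: str) -> str:
    """Hypothetical stand-in for a call to a small model's inference API."""
    raise NotImplementedError("replace with your provider's client call")

def answer_with_redundancy(prompt: str, samples: int = 5) -> str:
    """Ask the model several times and return the most common answer.

    Several calls to a ~7B model can still cost less than one call to a
    100B+ model, while the majority vote catches one-off errors.
    """
    answers = [query_small_model(prompt) for _ in range(samples)]
    most_common_answer, _count = Counter(answers).most_common(1)[0]
    return most_common_answer
```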

Latency considerations: Response time measurements reveal substantial performance advantages for smaller AI models.

  • 7 billion parameter models demonstrate an 18ms latency per token
  • 13 billion parameter models show 21ms latency per token
  • 70 billion parameter models require 47ms per token
  • Larger 405 billion parameter models range from 70ms to 750ms per token
  • These differences compound over the length of a response and significantly impact user experience, as faster response times lead to better engagement (see the calculation below)
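
Because per-token latency accumulates across every generated token, the gap widens quickly with response length. A rough calculation using the figures above, assuming a 500-token response and taking the 405 billion parameter model at the low end of its range:

```python
# Back-of-the-envelope: total generation time for a 500-token response,
# using the per-token latencies cited above (405B taken at its 70ms floor).
LATENCY_MS_PER_TOKEN = {"7B": 18, "13B": 21, "70B": 47, "405B": 70}
RESPONSE_TOKENS = 500

for model, ms in LATENCY_MS_PER_TOKEN.items():
    total_seconds = ms * RESPONSE_TOKENS / 1000
    print(f"{model}: {total_seconds:.1f}s to generate {RESPONSE_TOKENS} tokens")
```

Under these assumptions the 7 billion parameter model finishes in about 9 seconds, while the 405 billion parameter model needs 35 seconds or more; that is the kind of difference users notice.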

Business implications: The combination of cost savings and performance benefits makes smaller AI models particularly attractive for enterprise deployment.

  • Organizations can achieve comparable results at a fraction of the cost of larger models
  • Reduced latency translates to improved user experiences and higher productivity
  • The efficiency gains allow for broader implementation across various business functions

Looking ahead: The trend toward smaller, more efficient AI models suggests a maturing market where practical considerations are beginning to outweigh the pursuit of ever-larger models, potentially signaling a new phase in enterprise AI adoption focused on optimization rather than scale.

Small but Mighty AI by @ttunguz
