Build or buy? Scaling your enterprise gen AI pipeline in 2025

The core challenge: Enterprises in 2025 face complex decisions about scaling generative AI, moving beyond simple deployment to focus on operational transformation and cost optimization.
Leading examples: Wayfair and Expedia demonstrate successful hybrid approaches to large language model (LLM) adoption, combining external platforms with custom solutions.
- Wayfair uses Google’s Vertex AI for general applications while building proprietary tools for specific needs like product tagging
- Expedia runs a multi-vendor LLM proxy layer so applications can call models from different providers through a single interface (see the sketch after this list)
- Both companies emphasize the importance of matching technology choices to specific business requirements
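The article does not describe the internals of Expedia’s proxy layer, but the underlying pattern is straightforward: every vendor sits behind the same interface, and a thin router dispatches each request to the right adapter. The minimal Python sketch below illustrates that pattern; the class names, model prefixes, and stubbed responses are hypothetical, not Expedia’s actual code.

```python
# Minimal sketch of a multi-vendor LLM proxy layer (illustrative only).
# Each provider is wrapped in an adapter exposing the same complete()
# signature, and the proxy routes requests by model-name prefix.
from abc import ABC, abstractmethod
from dataclasses import dataclass


@dataclass
class Completion:
    model: str
    text: str


class ProviderAdapter(ABC):
    """Uniform interface every vendor-specific client must implement."""

    @abstractmethod
    def complete(self, model: str, prompt: str, **kwargs) -> Completion:
        ...


class OpenAIAdapter(ProviderAdapter):
    def complete(self, model: str, prompt: str, **kwargs) -> Completion:
        # The vendor SDK / REST call would go here; stubbed for the sketch.
        return Completion(model=model, text=f"[openai:{model}] response")


class VertexAdapter(ProviderAdapter):
    def complete(self, model: str, prompt: str, **kwargs) -> Completion:
        # The Vertex AI call would go here; stubbed for the sketch.
        return Completion(model=model, text=f"[vertex:{model}] response")


class LLMProxy:
    """Routes each request to the adapter registered for the requested model."""

    def __init__(self) -> None:
        self._routes: dict[str, ProviderAdapter] = {}

    def register(self, model_prefix: str, adapter: ProviderAdapter) -> None:
        self._routes[model_prefix] = adapter

    def complete(self, model: str, prompt: str, **kwargs) -> Completion:
        for prefix, adapter in self._routes.items():
            if model.startswith(prefix):
                return adapter.complete(model, prompt, **kwargs)
        raise ValueError(f"No provider registered for model '{model}'")


proxy = LLMProxy()
proxy.register("gpt-", OpenAIAdapter())
proxy.register("gemini-", VertexAdapter())

print(proxy.complete("gemini-1.5-flash", "Summarize this itinerary...").text)
```

Because callers only see the proxy’s interface, swapping a vendor or adding a new model becomes a registration change rather than an application rewrite, which is the flexibility both companies cite.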
Operational implementation: Companies are finding success by targeting specific high-value applications rather than pursuing broad, unfocused AI adoption.
- Wayfair leverages AI to enhance product catalogs and analyze legacy database structures
- Expedia has integrated AI into customer service, delivering responses to 90% of travelers within 30 seconds
- Both organizations focus on measurable impacts and clear business outcomes
Infrastructure considerations: Hardware and infrastructure choices play a crucial role in sustainable AI scaling.
- Cloud infrastructure currently dominates, with both companies primarily relying on major providers
- A proxy layer helps balance performance with cost efficiency by routing each request to an appropriately priced model (a routing sketch follows this list)
- Companies are evaluating future needs for localized infrastructure to handle real-time applications
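One way a proxy layer trades performance against cost is a routing policy that picks the cheapest model expected to handle each request. The sketch below is an assumed design, not either company’s implementation; the model names, per-token prices, and thresholds are placeholders.

```python
# Illustrative cost-aware routing policy for an LLM proxy layer.
# All prices and thresholds are made-up placeholders; a real deployment
# would use current vendor pricing and task-specific quality evaluations.
from dataclasses import dataclass


@dataclass
class ModelOption:
    name: str
    usd_per_1k_tokens: float  # blended input/output price (placeholder)
    quality_tier: int         # 1 = cheapest/fastest, 3 = highest quality


CATALOG = [
    ModelOption("small-fast-model", usd_per_1k_tokens=0.0005, quality_tier=1),
    ModelOption("mid-tier-model", usd_per_1k_tokens=0.003, quality_tier=2),
    ModelOption("frontier-model", usd_per_1k_tokens=0.015, quality_tier=3),
]


def route(prompt: str, needs_reasoning: bool) -> ModelOption:
    """Pick the cheapest model expected to handle the request."""
    required_tier = 3 if needs_reasoning else (2 if len(prompt) > 2000 else 1)
    candidates = [m for m in CATALOG if m.quality_tier >= required_tier]
    return min(candidates, key=lambda m: m.usd_per_1k_tokens)


choice = route("What is my baggage allowance?", needs_reasoning=False)
print(choice.name, choice.usd_per_1k_tokens)
```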
Organizational readiness: Technical deployment represents only part of the scaling challenge; cultural adaptation and governance are equally critical.
- Comprehensive training programs ensure employees can effectively work with AI tools
- Expedia’s Responsible AI Council oversees major AI decisions and ensures ethical implementation
- Traditional KPIs are being rethought to better measure AI effectiveness
Looking ahead: The enterprise AI landscape continues to evolve, requiring ongoing adaptation and strategic planning.
- Inference costs and real-time processing capabilities remain key considerations (a back-of-envelope cost estimate follows this list)
- Organizations must balance innovation with practical implementation
- Success requires continuous evaluation of use cases and investment priorities to maintain competitive advantage
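To make the inference-cost point concrete, here is a back-of-envelope estimate. Every number in it (token counts, traffic volume, per-token prices) is an illustrative assumption, not a figure from the article; the point is that per-request token counts and traffic dominate the economics.

```python
# Back-of-envelope inference cost estimate (all numbers are illustrative
# assumptions). Cost scales linearly with tokens per request and requests
# per day, which is why targeted, high-value use cases matter.
avg_input_tokens = 800          # assumed prompt size incl. retrieved context
avg_output_tokens = 300         # assumed response size
requests_per_day = 1_000_000    # assumed traffic

price_per_1k_input = 0.0005     # placeholder USD per 1K input tokens
price_per_1k_output = 0.0015    # placeholder USD per 1K output tokens

cost_per_request = (
    avg_input_tokens / 1000 * price_per_1k_input
    + avg_output_tokens / 1000 * price_per_1k_output
)
daily_cost = cost_per_request * requests_per_day
print(f"~${cost_per_request:.5f}/request, ~${daily_cost:,.0f}/day")
```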
Strategic implications: In 2025’s AI landscape, organizations that successfully scale their AI operations will be those that maintain flexibility while targeting specific, high-value use cases rather than pursuing broad, unfocused implementations. The experiences of Wayfair and Expedia suggest that a hybrid approach, combining custom solutions with external tools, offers the most promising path forward.