Oumi, a new AI platform developed by former Google and Apple engineers, has launched with $10 million in seed funding to provide fully open-source access to AI model development tools.
The platform’s core offering: Oumi provides comprehensive access to AI model code, weights, and training data, backed by a consortium of 13 leading research universities.
- The platform delivers a complete toolkit for building, evaluating, and deploying foundation models, supporting models ranging from 10M to 405B parameters
- Advanced training capabilities include Supervised Fine-Tuning (SFT), Low-Rank Adaptation (LoRA), Quantized LoRA (QLoRA), and Direct Preference Optimization (DPO)
- The system accommodates both text and multimodal models, with built-in tools for data synthesis and curation
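Of the training methods listed above, LoRA is the easiest to illustrate: the pretrained weights are frozen, and only a small low-rank correction is trained. A minimal NumPy sketch of the idea (illustrative only, not Oumi's actual API; all names here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen pretrained weight matrix (d_out x d_in) standing in for one layer.
d_out, d_in, r, alpha = 64, 64, 8, 16
W = rng.normal(size=(d_out, d_in))

# Trainable low-rank factors. B starts at zero, so the adapted
# layer initially behaves exactly like the frozen base layer.
A = rng.normal(size=(r, d_in)) * 0.01
B = np.zeros((d_out, r))

def lora_forward(x):
    # Base output plus the scaled low-rank correction (alpha / r).
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=d_in)

# With B = 0, the LoRA output matches the base model output.
assert np.allclose(lora_forward(x), W @ x)

# The efficiency win: train ~2*r*d parameters instead of d_out*d_in.
print(A.size + B.size, "trainable vs", W.size, "frozen")
```

The same low-rank structure is why QLoRA works on modest hardware: the large frozen matrix can be stored quantized while only the small factors are trained in full precision.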
Technical accessibility: Oumi’s architecture enables users to begin development on basic hardware and scale up as needed.
- The platform integrates with popular inference engines like vLLM and SGLang for deployment flexibility
- A distributed computing approach helps reduce costs compared to traditional centralized infrastructure
- Comprehensive model evaluation tools are included to assess performance and capabilities
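At its core, model evaluation of the kind such tools automate reduces to scoring model outputs against reference answers. A toy exact-match harness (a generic sketch, not Oumi's evaluation API; the data is invented):

```python
def exact_match_accuracy(predictions, references):
    """Fraction of predictions that exactly match the reference
    answer after basic normalization (case and whitespace)."""
    norm = lambda s: " ".join(s.lower().split())
    matches = sum(norm(p) == norm(r) for p, r in zip(predictions, references))
    return matches / len(references)

# Hypothetical model outputs vs. gold answers.
preds = ["Paris", "4", "blue whale"]
golds = ["paris", "four", "Blue Whale"]
print(exact_match_accuracy(preds, golds))  # 2 of 3 match
```

Real evaluation suites layer many such metrics (accuracy, perplexity, pass rates) over standard benchmarks, but the scoring loop follows this shape.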
Competitive differentiation: Oumi sets itself apart from other “open” AI models through its commitment to complete transparency.
- Unlike models such as DeepSeek and Llama, which withhold certain components (such as full training data), Oumi provides unrestricted access to all model components
- The platform’s unconditional open-source approach aims to foster greater collaboration and innovation in the AI community
- Enterprise offerings for production deployments are in development to support commercial applications
Future implications: The emergence of fully open-source AI platforms like Oumi could reshape the AI development landscape by democratizing access to advanced machine learning tools and reducing dependence on large tech companies’ proprietary solutions.