Platform innovations help enterprise customers prepare for rapid growth in their data, ML, and AI initiatives
Outerbounds, builders of the modern, human-centric machine learning (ML) infrastructure stack, today announced four new features for the Outerbounds Platform. The features strengthen customers’ ability to innovate on their existing data and ML projects and to experiment with the latest AI techniques as interest in LLMs and Generative AI grows. The Outerbounds Platform gives organizations access to a full stack for ML and AI: customers can design and develop applications with open-source Metaflow and deploy them on scalable, enterprise-grade infrastructure powered by Outerbounds.
Since 2019, Metaflow has helped hundreds of organizations, from startups to Fortune 500 companies, ship data, ML, and AI projects faster. Use cases range from forecasting and fraud detection to computer vision for self-driving vehicles, drones, and cancer research.
“The tech around LLMs and Generative AI is still very immature and chaotic. While it’s too early for most companies to adopt today, this area is maturing quickly — we are likely overestimating the impact of AI in the short term and underestimating it in the long term,” said Ville Tuulos, CEO and co-founder of Outerbounds. “Concurrently, companies need help with their existing data pipelines and traditional ML projects as well as with setting the foundation for future AI projects. We have helped hundreds of companies achieve this balance, and our new features make it even easier.”
Battle-tested innovators | Metaflow, Netflix, and hundreds of enterprises
Founded in 2021, the company has its roots at Netflix, where the founders started open-source Metaflow, a framework that helps people with diverse domain expertise design, experiment with, and deliver data-intensive projects that make a difference.
The company’s mission is rooted in the founders’ original vision for Metaflow: Data scientists and ML engineers are asked to develop and deploy models for diverse ML-powered use cases. To do this, they need to leverage foundational infrastructure – data access, scalable compute, and workflow orchestration, all tracked and versioned to allow efficient collaboration and incremental improvement. Rather than assembling multiple point solutions for the job, a more effective approach is a cohesive toolchain with a human-friendly API that provides support throughout the project lifecycle.
“We saw that successful projects were delivered by data scientists and engineers who can work on end-to-end workflows independently, focusing on data, science, and business logic rather than infrastructure,” added Tuulos. “We have worked with hundreds of companies and over a thousand data scientists and engineers to make this a reality in their environments. We are now making this business value accessible to many, as ML and AI will come in many forms and be used in myriad ways as we are seeing today.”
The Outerbounds Platform is an enterprise-ready, fully managed ML platform. Through working with hundreds of organizations on their Metaflow journeys, the founders recognized a pattern: while Metaflow usage looks similar across organizations, a great deal of variation exists on the infrastructure side. Based on lessons learned from the vibrant open-source community, Outerbounds set out to bake all infrastructural concerns into one platform, which has been available since the beginning of this year.
New features announced today:
- GPU compute and generative AI use cases: Many companies will want to retain control over their data and models, differentiating their offerings from unrefined foundation models and generic APIs. Outerbounds has tested popular models with Metaflow and published recipes for open-source foundation models, such as Stable Diffusion, Whisper, Dolly, and LLaMA. Beyond the modeling layer, the company has worked on the compute layer, making sure GPUs are easily and cost-effectively accessible to all users of the platform. In addition to supporting GPUs offered by AWS, GCP, and Azure, the company has partnered with CoreWeave to expand the pool of available GPU resources.
- Built-in support for reactive, event-based ML and AI workflows: Customers can build sophisticated production systems powered by ML, integrating workflows with data warehouses such as Snowflake and Databricks, open-source data lakes, and other downstream systems.
- Cloud workstations: Administrators can provide developers with cloud workstations that combine local IDEs with cloud-based environments.
- Bank-grade security and compliance: Integrated security features, including a cloud security perimeter; authentication and authorization integrated with SSO systems; machine-to-machine tokens for programmatic authorization; integration with centrally managed secrets; a complete audit trail of all user and infrastructure events; and SOC 2 Type 2 compliance.
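The GPU and event-triggering features above map directly onto Metaflow’s public API. As an illustrative sketch only — not an official Outerbounds recipe — the flow below uses Metaflow’s `@resources` decorator to request a GPU for a training step and the `@trigger` decorator to start a run when an upstream event fires. It assumes Metaflow is installed and deployed with event triggering configured; the event name `warehouse_table_updated` and the flow itself are hypothetical.

```python
# Illustrative sketch only -- not an official Outerbounds recipe.
# Assumes Metaflow is installed and the deployment supports event
# triggering; the event name "warehouse_table_updated" is hypothetical.
from metaflow import FlowSpec, step, resources, trigger

@trigger(event="warehouse_table_updated")  # start a run when this event fires
class TrainFlow(FlowSpec):

    @step
    def start(self):
        # Load training data from the warehouse here.
        self.next(self.train)

    @resources(gpu=1, memory=16000)  # request a GPU instance for this step
    @step
    def train(self):
        # Fine-tune or train a model on the GPU here.
        self.next(self.end)

    @step
    def end(self):
        print("Training flow finished.")

if __name__ == "__main__":
    TrainFlow()
```

Deployed to an orchestrator (for example, with `python train_flow.py argo-workflows create`), a flow like this runs automatically whenever the named event is published, rather than on a manual schedule.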
“Ultimately, the outer bounds of common engineering concerns – security, scalability, cost-efficiency, high availability, and integrations with surrounding systems – look similar. All happy infrastructures are alike at the highest level,” said Savin Goyal, CTO and co-founder of Outerbounds. “We hope Outerbounds will help organizations produce ML-powered value faster and apply ML to new business domains. The platform lets them shortcut years of effort that they would otherwise incur.”
Supporting Customer Quotes
“The Outerbounds Platform empowers ML scientists to quickly scale our model training and less technical users to easily solve ‘mundane’ analytical tasks in a clean and repeatable manner,” said Marc Millstone, principal engineering manager at Convoy. “By unifying our data and ML pipelines, we can now automatically train models when an underlying warehouse table is updated, as opposed to relying on manually scheduled independent pipelines, which drastically improves efficiency and collaboration.”
Connect with Outerbounds