LiteLLM is an open-source AI Gateway developed by BerriAI with 18,000+ GitHub stars, providing unified access to 100+ LLM APIs through an OpenAI-compatible interface. Founded as a Y Combinator company with USD 1.6 million in seed funding, LiteLLM is trusted by companies such as Rocket Money, Samsara, Lemonade, and Adobe. The platform provides retry and fallback logic, cost tracking, guardrails, and load balancing, with MIT licensing for the core proxy. While the open-source version is free, running LiteLLM yourself typically adds USD 200-500 in monthly infrastructure costs plus DevOps labor, monitoring tools, and incident response. The Enterprise tier, at USD 30,000 annually, adds SSO, RBAC, and team-level budget enforcement. Users praise LiteLLM's unified API and the auditability that comes with open source, but note production complexity: 20-40 ms of latency overhead and the operational burden of self-hosting.
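The retry-and-fallback behavior described above follows a common gateway pattern that can be sketched in plain Python. This is a hypothetical illustration, not LiteLLM's actual API: the function name, the stub providers, and the `retries`/`backoff` parameters are all assumptions made for the example.

```python
import time

def call_with_fallback(providers, prompt, retries=2, backoff=0.0):
    """Try each provider in order, retrying transient errors, then fall back.

    `providers` maps a name to a callable(prompt) -> str.
    Illustrative sketch only; real gateways add timeouts, budgets, and logging.
    """
    errors = {}
    for name, call in providers.items():
        for attempt in range(retries + 1):
            try:
                return name, call(prompt)  # success: report which provider answered
            except Exception as exc:
                errors[name] = exc
                time.sleep(backoff * (2 ** attempt))  # exponential backoff between retries
    raise RuntimeError(f"all providers failed: {errors}")

# Stub providers standing in for real LLM backends:
def flaky_primary(prompt):
    raise TimeoutError("primary provider timed out")

def stable_fallback(prompt):
    return f"echo: {prompt}"

provider, answer = call_with_fallback(
    {"primary": flaky_primary, "fallback": stable_fallback},
    "hello",
)
```

Here the primary provider is retried until its attempts are exhausted, after which the request falls through to the next provider in order, so the caller sees a single successful response rather than the intermediate failures.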
Free trial available
Engineering teams who want an open-source, self-hosted LLM proxy for provider management
Integrate LiteLLM's open-source gateway with Respan for unified access to 100+ LLM providers. Leverage LiteLLM's retry logic and fallback capabilities alongside Respan's orchestration. Combine open-source flexibility with enterprise-grade reliability for production AI applications.
Top LLM Gateway providers you can use instead of LiteLLM.
Companies from adjacent layers in the AI stack that work well with LiteLLM.
Last verified: March 10, 2026