Keywords AI
LiteLLM is an open-source LLM proxy that translates OpenAI-format API calls into the native formats of 100+ LLM providers. It provides a standardized interface for calling models from Anthropic, Google, Azure, AWS Bedrock, and dozens more. LiteLLM is popular as a self-hosted gateway with features like spend tracking, rate limiting, and team management.
It is best suited for engineering teams who want an open-source, self-hosted LLM proxy for provider management.
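The unified interface described above can be sketched as follows. This is a minimal illustration, assuming the `litellm` Python package is installed and provider API keys are set in the environment; the model names shown are examples of LiteLLM's provider-prefix convention, and the actual network call is left commented out.

```python
# The same OpenAI-format request body works across providers;
# only the model string changes.
request = {
    "messages": [{"role": "user", "content": "Hello"}],
}

# LiteLLM selects the provider from the model-string prefix:
models = [
    "gpt-4o",                                # OpenAI (no prefix needed)
    "anthropic/claude-3-5-sonnet-20240620",  # Anthropic
    "gemini/gemini-1.5-pro",                 # Google
    "bedrock/anthropic.claude-3-sonnet-20240229-v1:0",  # AWS Bedrock
]

# With litellm installed and API keys configured, each call below
# returns a response in the OpenAI format:
# from litellm import completion
# for model in models:
#     response = completion(model=model, **request)
#     print(response.choices[0].message.content)
```

Because every provider's response is normalized back into the OpenAI schema, application code written against one provider can switch to another by changing only the model string.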
Top companies in the LLM gateway category that you can use instead of LiteLLM.
Companies from adjacent layers of the AI stack that work well alongside LiteLLM.