LiteLLM
LLM Gateways · Layer 1 · Open Source

What is LiteLLM?

LiteLLM is an open-source LLM proxy that translates OpenAI-format API calls into the native request formats of 100+ LLM providers. It provides a standardized interface for calling models from Anthropic, Google, Azure, AWS Bedrock, and dozens more, and it is commonly run as a self-hosted gateway with features like spend tracking, rate limiting, and team management.
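A minimal sketch of why an OpenAI-compatible proxy is useful: the request body stays identical no matter which provider ultimately serves it, so switching providers is just a change to the model string. The model names and proxy URL below are illustrative, not taken from any particular deployment.

```python
import json

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-format /chat/completions request body."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# The same helper serves an OpenAI model and an Anthropic model;
# model names here are illustrative examples.
openai_req = build_chat_request("gpt-4o", "Summarize this ticket.")
claude_req = build_chat_request("claude-3-5-sonnet-20240620", "Summarize this ticket.")

# Apart from the "model" field the payloads are structurally identical;
# the proxy handles provider-specific translation server-side.
assert openai_req["messages"] == claude_req["messages"]

# In a real deployment this JSON would be POSTed to the LiteLLM proxy,
# e.g. http://localhost:4000/v1/chat/completions, with an API key header.
print(json.dumps(openai_req, indent=2))
```

Because the caller only ever speaks the OpenAI format, application code does not change when the backing provider does.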

Key Features

  • Open-source LLM proxy
  • OpenAI-compatible API for 100+ providers
  • Budget management and rate limiting
  • Self-hostable
  • Automatic retries and fallbacks
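The self-hosted proxy is typically driven by a `config.yaml`. The sketch below follows the shape of LiteLLM's documented `model_list` and settings sections, but exact option names vary by version, so verify against the docs for the release you deploy; all model names and keys here are placeholders.

```yaml
# Illustrative proxy config (key names follow LiteLLM's config.yaml
# conventions; verify against the version you deploy).
model_list:
  - model_name: gpt-4o                # alias that callers use
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: claude-sonnet
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20240620
      api_key: os.environ/ANTHROPIC_API_KEY

litellm_settings:
  num_retries: 3                      # automatic retries on transient errors
  fallbacks:
    - gpt-4o: [claude-sonnet]         # fall back if the primary model fails

general_settings:
  master_key: sk-1234                 # placeholder; used to mint scoped virtual keys
```

Per-team budgets and rate limits are then attached to virtual keys issued by the proxy rather than hard-coded into applications.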

Common Use Cases

Engineering teams that want an open-source, self-hosted LLM proxy for provider management. Typical scenarios include:

  • Self-hosted LLM gateway for data control
  • Standardizing LLM access across teams
  • Budget enforcement per team or project
  • Provider migration without code changes
  • Open-source LLM infrastructure
