LLM APIs

LiteLLM

Python · Open Source · Multi-provider · Proxy

Single Python interface and proxy server for 100+ LLM providers — call any model with the OpenAI SDK format.

License

MIT

Language

Python

Trust

82 / 100 (Strong)

Why LiteLLM?

You need to switch between LLM providers without rewriting code

You're building a proxy/gateway to centralize API key management and logging

You're experimenting with model cost and latency tradeoffs
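Provider switching comes down to the model string LiteLLM routes on. As a rough, hypothetical illustration of the "provider/model" naming convention (this is not LiteLLM's actual routing code), a prefixed name selects a provider explicitly, while a bare name is treated as an OpenAI model:

```python
# Hypothetical sketch of LiteLLM's "provider/model" naming convention --
# not the library's real routing logic.
def provider_for(model: str) -> str:
    """Return the provider implied by a LiteLLM-style model string."""
    if "/" in model:
        # Prefixed names like "gemini/gemini-pro" name the provider explicitly.
        return model.split("/", 1)[0]
    # Bare names like "gpt-4o" are treated as OpenAI models.
    return "openai"

print(provider_for("gpt-4o"))             # openai
print(provider_for("gemini/gemini-pro"))  # gemini
```

Because only the model string changes, swapping providers becomes a configuration change rather than a code change.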

Signal Breakdown

What drives the Trust Score

PyPI downloads
900k / wk
Commits (90d)
400
GitHub stars
18k ★
Stack Overflow
300 questions
Community
Growing
Weighted Trust Score: 82 / 100

Tradeoffs & Caveats

Know before you commit

Your app is TypeScript-only — use Vercel AI SDK instead

You only use one provider and don't need abstraction overhead

Pricing

Free tier & paid plans

Free tier

Open source, free to use

Paid

LiteLLM Enterprise: custom pricing for SSO, audit logs, RBAC

Alternative Tools

Other options worth considering

Vercel AI SDK: 88 (Strong)

Unified TypeScript SDK for building AI-powered streaming UIs with any LLM provider — OpenAI, Anthropic, Google, and more.

OpenRouter: 76 (Good)

Unified API gateway to 200+ LLMs with automatic fallbacks, cost routing, and a single API key — drop-in OpenAI SDK compatible.

OpenAI API: 87 (Strong)

The most widely used LLM API. Powers GPT-4o and o1 models with best-in-class reasoning, vision, and structured outputs. Largest ecosystem of tutorials, integrations, and community support.

Often Used Together

Complementary tools that pair well with LiteLLM

Langfuse (LLM Observability): 85 (Strong)

LangSmith (LLM Observability): 80 (Strong)

Ollama (LLM APIs): 85 (Strong)

Get Started

Repository and installation options

View on GitHub

github.com/BerriAI/litellm

pip install litellm

Quick Start

Copy and adapt to get going fast

import litellm

# Requires the relevant provider key in the environment, e.g. OPENAI_API_KEY.
response = litellm.completion(
  model="gpt-4o",
  messages=[{"role": "user", "content": "Hello!"}]
)

# The same code works for claude-3-5-sonnet, gemini/gemini-pro, etc.
print(response.choices[0].message.content)
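Because every provider sits behind the same call signature, simple resilience patterns are easy to layer on top. LiteLLM also ships a built-in Router with fallback support; the hand-rolled sketch below just illustrates the pattern, with the completion function injected so the logic can be exercised without API keys (function and parameter names here are hypothetical):

```python
def complete_with_fallback(models, messages, call_fn):
    """Try each model in order, returning the first successful response.

    In real use call_fn would be litellm.completion; injecting it keeps
    the fallback logic testable without network access or API keys.
    """
    last_err = None
    for model in models:
        try:
            return call_fn(model=model, messages=messages)
        except Exception as err:
            last_err = err  # remember the failure and try the next model
    raise last_err

# Real use might look like:
# complete_with_fallback(["gpt-4o", "claude-3-5-sonnet"],
#                        messages, litellm.completion)
```

Injecting the call also makes it easy to add per-model logging or cost tracking at the same seam.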
