liteLLM

One library to standardize all LLM APIs
2 reviews · 3 shoutouts · 99 followers

What is liteLLM?

Simplify using the OpenAI, Azure, Cohere, Anthropic, Replicate, and Google LLM APIs. TL;DR:

- Call all LLM APIs using the ChatGPT format: completion(model, messages)
- Consistent outputs and exceptions for all LLM APIs
- Logging and error tracking for all models
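The one-function call shape described above can be sketched as follows. This is a hedged sketch: build_request is a hypothetical helper (not part of liteLLM), and the model names are examples; actually invoking litellm.completion requires `pip install litellm` and the relevant provider API keys in your environment.

```python
# Minimal sketch of the unified call shape. build_request is a hypothetical
# helper; litellm.completion(model, messages) accepts exactly this payload,
# but calling it for real requires the litellm package and provider API keys.

def build_request(model: str, user_prompt: str) -> dict:
    """Build the ChatGPT-style payload that completion(model, messages) takes."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_prompt}],
    }

# The same payload shape works whether the model lives on OpenAI, Azure,
# Anthropic, Cohere, Replicate, or Google; only the model string changes.
for model in ("gpt-3.5-turbo", "claude-2", "command-nightly"):
    request = build_request(model, "Hello, world")
    # response = litellm.completion(**request)  # uncomment with keys set
    print(request["model"])
```

Because every provider is driven through the same payload, swapping providers is a one-string change rather than a rewrite of the calling code.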



5/5 based on 2 reviews

Maker reviews of liteLLM

Hengjia Wang (Me.bot), used this to build Second.Me by Me.bot (451 points):
"Versatile API calling center."
Shiv Sakhuja (Athina AI), used this to build Athina (581 points):
"LiteLLM has made it super easy to switch between different models, and support custom models!"
Miguel Salinas (CamelAI), used this to build CamelAI (147 points):
"liteLLM is a must for working with different models. We use different models for different tasks and subtasks. With liteLLM the code stays exactly the same, and we can just focus on choosing the right prompts and models for the task."

Reviews

Ali Avci (8 reviews):
"I find myself recommending this library to serious LLM-powered app developers who are trying to standardize their codebase by unifying all the APIs they use. Love it!"
Timoa (4 reviews):
"Used as an LLM proxy, it enables caching and load balancing across multiple AI services (Groq, OpenRouter, etc.), and even local models via Ollama. It exposes an OpenAI-compatible API, so any app or service that lets you set the base URL can use it. I run it with Langfuse, which provides performance analysis (monitoring) of each prompt/session."
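The proxy pattern in the review above can be sketched with the standard library alone. This is a hedged sketch under stated assumptions: the proxy address and port are placeholders (point PROXY_BASE_URL at wherever your proxy actually runs), and the model string is an example of one the proxy might route.

```python
import json
from urllib import request as urlrequest

# Hedged sketch of talking to a LiteLLM proxy through its OpenAI-compatible
# endpoint. PROXY_BASE_URL is an assumption; set it to your proxy's address.
PROXY_BASE_URL = "http://localhost:4000"

def chat_request(model: str, prompt: str) -> urlrequest.Request:
    """Build an OpenAI-style chat-completion request aimed at the proxy."""
    body = json.dumps({
        # The proxy routes the model string to Groq, OpenRouter, Ollama, etc.
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")
    return urlrequest.Request(
        f"{PROXY_BASE_URL}/v1/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

req = chat_request("ollama/llama2", "Hello")
print(req.full_url)  # → http://localhost:4000/v1/chat/completions
# With the proxy running, send it via urllib.request.urlopen(req).
```

Because the request body is plain OpenAI chat format, any client that lets you override the base URL can be pointed at the proxy without code changes.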