Get Started with LiteLLM
Proxy server for managing auth, load balancing, and spend tracking across multiple LLMs.
Getting Started
1
Read the official documentation
The LiteLLM team maintains comprehensive docs that cover installation, configuration, and common patterns.
Open LiteLLM Docs ↗
2
Create an account
Visit the LiteLLM website to create your account and explore pricing options.
Visit LiteLLM ↗
3
Review strengths, tradeoffs, and alternatives
Our full tool profile covers LiteLLM's strengths, weaknesses, pricing, and how it compares to alternatives.
View full profile →
Best For
Enterprises needing to manage costs and performance of multiple AI services
Development teams that need to call many different LLM providers through a single OpenAI-compatible interface
Scenarios where real-time spend tracking is critical for budget management
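As a sketch of the proxy setup described above, a minimal `config.yaml` might register one public model name backed by two deployments so the proxy can load-balance between them. The deployment names and environment variable names below are illustrative assumptions, not values from this profile:

```yaml
model_list:
  # Two deployments registered under the same model_name are
  # load-balanced by the proxy's router.
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
  - model_name: gpt-4o
    litellm_params:
      model: azure/my-gpt4o-deployment   # illustrative Azure deployment name
      api_base: os.environ/AZURE_API_BASE
      api_key: os.environ/AZURE_API_KEY
```

Started with `litellm --config config.yaml`, the proxy exposes an OpenAI-compatible endpoint that standard OpenAI SDKs can point at, which is how it centralizes auth, routing, and spend tracking across providers. See the official docs linked above for the full configuration reference.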