Get Started with FastChat
Distributed multi-model LLM serving system with web UI and OpenAI-compatible APIs.
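As a sketch of what "distributed multi-model serving" looks like in practice, a typical local FastChat deployment launches a controller, one model worker per model, and the web UI and API servers as separate processes (the model path and port below are illustrative):

```shell
# Launch the controller, which tracks registered model workers.
python3 -m fastchat.serve.controller

# Launch one worker per model you want to serve (model path is illustrative).
python3 -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.5

# Serve the Gradio web UI on top of the controller.
python3 -m fastchat.serve.gradio_web_server

# Expose the OpenAI-compatible REST API (host/port are illustrative).
python3 -m fastchat.serve.openai_api_server --host localhost --port 8000
```

Each command runs as its own long-lived process, which is what lets the system scale workers for multiple models independently of the UI and API front ends.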
Getting Started
1. Read the official documentation
   The FastChat team maintains comprehensive docs that cover installation, configuration, and common patterns.
   Open FastChat Docs ↗

2. Create an account
   Visit the FastChat website to create your account and explore pricing options.
   Visit FastChat ↗

3. Review strengths, tradeoffs, and alternatives
   Our full tool profile covers FastChat's strengths, weaknesses, pricing, and how it compares to alternatives.
   View full profile →

Best For
Teams needing to serve and scale multiple large language models simultaneously
Developers looking for an OpenAI-compatible API for their projects
Projects requiring a web interface alongside RESTful APIs for LLM access
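Because FastChat's REST API follows the OpenAI wire format, any OpenAI-style HTTP client can talk to it. A minimal sketch, assuming a FastChat `openai_api_server` running locally on port 8000; the base URL and model name are assumptions for a local deployment:

```python
import json
from urllib import request

# Assumed local FastChat OpenAI-compatible endpoint (adjust to your deployment).
API_BASE = "http://localhost:8000/v1"


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.7,
    }


def send_chat(payload: dict) -> dict:
    """POST the payload to the FastChat server and return the parsed JSON reply."""
    req = request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Only build and print the payload here; send_chat() requires a running server.
    body = build_chat_request("vicuna-7b-v1.5", "Say hello")
    print(json.dumps(body, indent=2))
```

Since the request and response shapes match OpenAI's, existing OpenAI client libraries can also be pointed at the same base URL instead of hand-rolling HTTP calls.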