
Get Started with LLM Gateway

LLM Gateway is the main proxy server for LLM Gateway Core.

Getting Started

1. Read the official documentation

The LLM Gateway team maintains comprehensive docs that cover installation, configuration, and common patterns.

Open LLM Gateway Docs

2. Create an account

Visit the LLM Gateway website to create your account and explore pricing options.

Visit LLM Gateway

3. Review strengths, tradeoffs, and alternatives

Our full tool profile covers LLM Gateway's strengths, weaknesses, pricing, and how it compares to alternatives.

View full profile

Best For

Developers who need to integrate multiple AI models into their applications without complex setup

Teams looking for enhanced observability and management of their AI services

Projects requiring self-hosted solutions for full control over infrastructure
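The first use case above, routing requests to multiple AI models through a single proxy, is typically done by exposing one OpenAI-style chat endpoint and switching providers via the model name. A minimal sketch of that pattern, where the gateway URL and model names are hypothetical placeholders rather than LLM Gateway's actual values:

```python
import json

# Hypothetical gateway endpoint -- consult the official LLM Gateway
# docs for the real URL and supported model identifiers.
GATEWAY_URL = "https://example-gateway.local/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat completion payload.

    A gateway with an OpenAI-compatible API routes the request to the
    matching upstream provider based on the model name, so switching
    providers is a one-field change in otherwise identical code.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

# Same application code, different upstream models (names are examples):
for model in ("provider-a/fast-model", "provider-b/large-model"):
    payload = build_chat_request(model, "Summarize today's error logs.")
    print(model, "->", GATEWAY_URL, json.dumps(payload)[:40])
```

The point of the sketch is the shape of the integration, not the specific endpoint: the application only ever talks to the gateway, and provider selection collapses into the `model` field.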

Resources