
Get Started with llm-guard

A TypeScript library for validating and securing LLM prompts

Getting Started

1. Read the official documentation

The llm-guard team maintains comprehensive docs that cover installation, configuration, and common patterns.

Open llm-guard Docs
2. Create an account

Visit the llm-guard website to create your account and explore pricing options.

Visit llm-guard
3. Review strengths, tradeoffs, and alternatives

Our full tool profile covers llm-guard's strengths, weaknesses, pricing, and how it compares to alternatives.

View full profile

Best For

TypeScript developers building secure and compliant AI applications

Teams needing to validate user inputs before processing by LLMs
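To make the second use case concrete, here is a minimal sketch of the kind of pre-LLM input validation such a library performs: a length limit plus a denylist of common prompt-injection phrases. This is plain illustrative TypeScript, not llm-guard's actual API; every name below (`validatePrompt`, `INJECTION_PATTERNS`, and so on) is hypothetical. Consult the official docs linked above for the real interface.

```typescript
// Illustrative sketch only; not llm-guard's API. Shows generic
// pre-LLM input validation: length checks and an injection denylist.
type ValidationResult = { valid: boolean; reasons: string[] };

// Hypothetical denylist of common prompt-injection phrasings.
const INJECTION_PATTERNS: RegExp[] = [
  /ignore (all )?previous instructions/i,
  /reveal (the )?system prompt/i,
];

function validatePrompt(prompt: string, maxLength = 4000): ValidationResult {
  const reasons: string[] = [];
  if (prompt.trim().length === 0) {
    reasons.push("prompt is empty");
  }
  if (prompt.length > maxLength) {
    reasons.push(`prompt exceeds ${maxLength} characters`);
  }
  for (const pattern of INJECTION_PATTERNS) {
    if (pattern.test(prompt)) {
      reasons.push(`matched denylisted pattern: ${pattern}`);
    }
  }
  return { valid: reasons.length === 0, reasons };
}
```

A caller would run this before forwarding user input to the model, rejecting or logging any prompt where `valid` is false.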

Resources