Getting Started
Run LangDB AI Gateway locally.
The LangDB AI Gateway is available as an open-source repository that you can configure locally. Own your LLM data and route to 250+ models.
The repository is available at https://github.com/langdb/ai-gateway
The gateway provides the following OpenAI-compatible endpoints:
- POST /v1/chat/completions – Chat completions
- GET /v1/models – List available models
- POST /v1/embeddings – Generate embeddings
- POST /v1/images/generations – Generate images
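Because the endpoints above are OpenAI-compatible, any OpenAI-style client can talk to the gateway. The sketch below builds a chat completion request payload and shows how it would be posted to a locally running gateway; the base URL, port, and model name are assumptions for a local setup, not values fixed by the gateway itself.

```python
# Minimal sketch of calling the gateway's chat completions endpoint.
# BASE_URL and the model name are assumptions; adjust to your deployment.
import json
import urllib.request

BASE_URL = "http://localhost:8080/v1"  # assumed local gateway address


def build_chat_request(model: str, user_message: str) -> dict:
    """Build an OpenAI-compatible chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }


payload = build_chat_request("gpt-4o-mini", "Hello!")

# To send the request against a running gateway, uncomment:
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=json.dumps(payload).encode(),
#     headers={"Content-Type": "application/json"},
# )
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])

print(json.dumps(payload, indent=2))
```

The same payload shape works for any of the 250+ routed models; only the model field changes.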
LangDB offers advanced configuration options to customize its behavior. The three main configuration areas are:
Limits – Control API usage with rate limiting and cost control.
Routing – Define how requests are routed across multiple LLM providers.
Observability – Enable logging and tracing to monitor API performance.
These configurations can be set up using a configuration file (config.yaml) or overridden via command line options.
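As an illustration of how the three areas above might appear in config.yaml, here is a hedged sketch. The key names and values below are assumptions for illustration only, not the authoritative LangDB schema; the sample configuration in the repository is the source of truth.

```yaml
# Illustrative structure only — key names are assumptions, not the
# authoritative schema. See the sample config in the repo for real keys.
limits:
  requests_per_minute: 60     # rate limiting (assumed key name)
  daily_cost_limit_usd: 10    # cost control (assumed key name)

routing:
  strategy: fallback          # route across providers (assumed key name)

observability:
  tracing: true               # enable tracing (assumed key name)
  logging: true               # enable logging (assumed key name)
```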
Download the sample configuration from our repo.
Copy the example config file:
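A typical sequence is sketched below; the sample config filename is an assumption, so check the repository for the exact name before copying.

```shell
# Fetch the gateway source and copy the sample config into place.
# The sample filename (config.sample.yaml) is an assumption — verify it
# against the repository contents.
git clone https://github.com/langdb/ai-gateway
cd ai-gateway
cp config.sample.yaml config.yaml
```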
Command line options will override corresponding config file settings when both are specified.
Visit the repository for more details.