Getting Started

Run LangDB AI Gateway locally.

The LangDB AI Gateway is available as an open-source repository that you can run and configure locally. Own your LLM data and route requests to 250+ models.

Here is the link to the repo - https://github.com/langdb/ai-gateway

Using LangDB Locally through ai-gateway

Running Locally

  1. Run Docker and log in.

  2. Start the server.

  3. Make your first request (see the example below).
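Once the server is running, the first request can go through any OpenAI-compatible client. Below is a minimal sketch in Python; the base URL (http://localhost:8080), the placeholder API key, and the model name are assumptions to adjust for your setup.

```python
from openai import OpenAI

# Point the standard OpenAI client at the locally running gateway.
# Base URL and port are assumptions -- match them to your server.
client = OpenAI(
    base_url="http://localhost:8080/v1",
    api_key="not-needed-locally",  # placeholder; supply a real key if your setup requires one
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative; use any model your configured providers support
    messages=[{"role": "user", "content": "Say hello from LangDB!"}],
)
print(response.choices[0].message.content)
```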

The gateway provides the following OpenAI-compatible endpoints:

  • POST /v1/chat/completions - Chat completions

  • GET /v1/models - List available models

  • POST /v1/embeddings - Generate embeddings

  • POST /v1/images/generations - Generate images
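Because these endpoints mirror OpenAI's API shapes, you can also call them directly over HTTP. A hedged sketch, again assuming a local gateway at http://localhost:8080 with no auth header required:

```python
import requests

BASE = "http://localhost:8080/v1"  # assumed local gateway address

# GET /v1/models -- list the models the gateway can route to
models = requests.get(f"{BASE}/models").json()
print([m["id"] for m in models.get("data", [])])

# POST /v1/embeddings -- generate embeddings (model name is illustrative)
resp = requests.post(
    f"{BASE}/embeddings",
    json={"model": "text-embedding-3-small", "input": "hello world"},
)
print(len(resp.json()["data"][0]["embedding"]))
```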

Advanced Configuration

LangDB supports advanced configuration options to customize its behavior. The three main configuration areas are:

  1. Limits – Control API usage with rate limiting and cost control.

  2. Routing – Define how requests are routed across multiple LLM providers.

  3. Observability – Enable logging and tracing to monitor API performance.

These configurations can be set in a configuration file (config.yaml) or overridden via command-line options.
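For illustration only, the sketch below shows how such a file might be organized around those three areas. The key names are hypothetical, not the gateway's actual schema; consult the sample configuration in the repository for the real options.

```yaml
# Hypothetical config.yaml sketch -- key names are illustrative only.
limits:
  requests_per_minute: 60   # rate limiting
  daily_cost_usd: 10.0      # cost control

routing:
  strategy: fallback        # e.g. try providers in order
  targets:
    - openai/gpt-4o-mini
    - anthropic/claude-3-haiku

observability:
  logging: true
  tracing: true
```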

Setting up

Download the sample configuration from our repo.

  1. Copy the example config file from the repository into your working directory as config.yaml.

Command line options will override corresponding config file settings when both are specified.

Visit the ai-gateway repository for more details.
