Running Locally

Self-host LangDB AI Gateway Enterprise locally with ClickHouse, PostgreSQL, and Redis for full control over tracing, caching, and analytics.

Dependencies

  • ClickHouse (for request tracing & analytics)

  • PostgreSQL (for metadata and user management)

  • Redis (for caching and rate‑limiting)
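The three dependencies can be brought up together for local development. The sketch below is an assumption, not an official LangDB compose file: it uses the standard Docker Hub images, and the ports and credentials are placeholders — the gateway's actual connection settings come from your `ai-gateway.yaml`.

```yaml
# Sketch only: standard Docker Hub images, placeholder ports/credentials.
services:
  clickhouse:
    image: clickhouse/clickhouse-server:latest
    ports: ["8123:8123", "9000:9000"]
  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: example   # placeholder; match ai-gateway.yaml
    ports: ["5432:5432"]
  redis:
    image: redis:7
    ports: ["6379:6379"]
```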

Launch Options

You can self-host the enterprise version in either of two ways: directly from the binary, or via Docker.

Using the Binary

Supported Platforms:

  • x86_64

  • aarch64

ai-gateway-enterprise serve -c ai-gateway.yaml

Using Docker

docker run -it \
    -p 8080:8080 \
    <private-url>/ai-gateway-enterprise serve \
    -c ai-gateway.yaml

Make Your First Request

Test the gateway with a simple chat completion:

# Chat completion with GPT-4o mini
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "What is the capital of France?"}]
  }'

# Or try Claude
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-opus",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
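The same requests can be made from Python. The sketch below uses only the standard library and assumes the gateway is listening on the default port shown above; the function names (`build_chat_request`, `chat`) are illustrative, not part of any LangDB SDK.

```python
import json
import urllib.request

# Assumes the gateway from this guide is running on its default port.
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> bytes:
    """Serialize an OpenAI-compatible chat completion body."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")

def chat(model: str, prompt: str) -> dict:
    """POST a chat completion to the local gateway and decode the reply."""
    req = urllib.request.Request(
        GATEWAY_URL,
        data=build_chat_request(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```

For example, `chat("gpt-4o-mini", "What is the capital of France?")` mirrors the first curl call above.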

Using MCP Servers

Invoke an MCP server alongside your request:

curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Ping the server using the tool and return the response"}],
    "mcp_servers": [{"server_url": "http://localhost:3004"}]
  }'
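The `mcp_servers` field is a gateway extension to the OpenAI-style body. A minimal sketch of building such a request in Python (the helper name is hypothetical; only the field shape comes from the curl example above):

```python
def chat_body_with_mcp(model: str, prompt: str, mcp_urls: list) -> dict:
    """Chat completion body with the gateway's mcp_servers extension."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # Each entry points the gateway at an MCP server to expose as tools.
        "mcp_servers": [{"server_url": u} for u in mcp_urls],
    }
```

Serialize this dict as JSON and POST it to `/v1/chat/completions` exactly as in the earlier examples.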

Next Steps

Refer to https://docs.langdb.ai/ to explore the gateway's other features.
