Running Locally

Self-host LangDB AI Gateway Enterprise locally with ClickHouse, PostgreSQL, and Redis for full control over tracing, caching, and analytics.

Dependencies

  • ClickHouse (for request tracing & analytics)

  • PostgreSQL (for metadata and user management)

  • Redis (for caching and rate‑limiting)
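For local development, the three backing services can be started together with Docker Compose. The snippet below is a minimal sketch; the image tags, ports, and credentials are illustrative and should be adjusted to match the connection settings in your ai-gateway.yaml.

```yaml
# Minimal sketch of the gateway's backing services (illustrative values).
services:
  clickhouse:
    image: clickhouse/clickhouse-server:24.3
    ports:
      - "8123:8123"   # HTTP interface
      - "9000:9000"   # native protocol
  postgres:
    image: postgres:16
    environment:
      POSTGRES_USER: langdb
      POSTGRES_PASSWORD: langdb
      POSTGRES_DB: langdb
    ports:
      - "5432:5432"
  redis:
    image: redis:7
    ports:
      - "6379:6379"
```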

Launch Options

You can self-host the enterprise version in two ways: as a standalone binary or with Docker.

Using the binary

Supported Platforms:

  • x86_64

  • aarch64

ai-gateway-enterprise serve -c ai-gateway.yaml

Using Docker

docker run -it \
    -p 8080:8080 \
    -v $(pwd)/ai-gateway.yaml:/ai-gateway.yaml \
    <private-url>/ai-gateway-enterprise serve \
    -c ai-gateway.yaml

Make Your First Request

Test the gateway with a simple chat completion:

# Chat completion with GPT-4
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "What is the capital of France?"}]
  }'

# Or try Claude
curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-opus",
    "messages": [
      {"role": "user", "content": "What is the capital of France?"}
    ]
  }'
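The same request can be made from code. The sketch below is a minimal Python client for the gateway's OpenAI-compatible endpoint; the URL and model name mirror the curl examples above, and `send` assumes a gateway is already running locally.

```python
import json
import urllib.request

# Local gateway endpoint from the curl examples above.
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"


def build_chat_request(model, prompt):
    """Build an OpenAI-compatible chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def send(payload):
    """POST the payload to the gateway and return the parsed JSON response."""
    req = urllib.request.Request(
        GATEWAY_URL,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


payload = build_chat_request("gpt-4o-mini", "What is the capital of France?")
# send(payload) will work once the gateway is up on port 8080.
```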

Using MCP Servers

Invoke an MCP server alongside your request:

curl http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-4o-mini",
    "messages": [{"role": "user", "content": "Ping the server using the tool and return the response"}],
    "mcp_servers": [{"server_url": "http://localhost:3004"}]
  }'
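Attaching MCP servers from code follows the same pattern: the `mcp_servers` field is simply added to the request body. A minimal Python sketch, with the field names and the placeholder server URL taken from the curl example above:

```python
import json
import urllib.request

# Chat completion payload with an MCP server attached, mirroring the
# curl example above (field names come from the documented request body).
payload = {
    "model": "gpt-4o-mini",
    "messages": [
        {
            "role": "user",
            "content": "Ping the server using the tool and return the response",
        }
    ],
    "mcp_servers": [{"server_url": "http://localhost:3004"}],
}

request = urllib.request.Request(
    "http://localhost:8080/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(request) sends it once the gateway is running.
```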

Next Steps

Refer to https://docs.langdb.ai/ for an overview of the gateway's features.