Run the LangDB AI Gateway locally.
1. Run Docker

   docker run -it \
     -p 8080:8080 \
     -e LANGDB_OPENAI_API_KEY=your-openai-key-here \
     langdb/ai-gateway serve
2. Make your first request
   # Chat completion with GPT-4o mini
   curl http://localhost:8080/v1/chat/completions \
     -H "Content-Type: application/json" \
     -d '{
       "model": "gpt-4o-mini",
       "messages": [{"role": "user", "content": "What is the capital of France?"}]
     }'
   # Or try Claude
   curl http://localhost:8080/v1/chat/completions \
     -H "Content-Type: application/json" \
     -d '{
       "model": "claude-3-opus",
       "messages": [
         {"role": "user", "content": "What is the capital of France?"}
       ]
     }'
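The same chat request can be issued from code. Below is a minimal sketch using only the Python standard library; it assumes the gateway is running on localhost:8080, and the `chat_payload`/`chat` helper names are ours for illustration, not part of the gateway.

```python
import json
import urllib.request

def chat_payload(model: str, prompt: str) -> dict:
    # OpenAI-style chat completion request body, as in the curl examples above
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def chat(model: str, prompt: str, base_url: str = "http://localhost:8080/v1") -> str:
    # POST the payload to the gateway and return the assistant's reply text
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(chat_payload(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__":
    print(chat("gpt-4o-mini", "What is the capital of France?"))
```

Because the gateway mirrors the OpenAI wire format, any OpenAI-compatible client should work the same way by overriding its base URL.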
The gateway provides the following OpenAI-compatible endpoints:

- POST /v1/chat/completions - Chat completions
- GET /v1/models - List available models
- POST /v1/embeddings - Generate embeddings
- POST /v1/images/generations - Generate images
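The models endpoint is a convenient way to confirm the gateway is up and see what it can route to. A stdlib-only Python sketch (again assuming the gateway on localhost:8080; the helper names are ours):

```python
import json
import urllib.request

def model_ids(models_json: dict) -> list:
    # Extract ids from an OpenAI-style model list:
    # {"object": "list", "data": [{"id": "..."}, ...]}
    return [m["id"] for m in models_json.get("data", [])]

def list_models(base_url: str = "http://localhost:8080/v1") -> list:
    # GET /v1/models and return just the model identifiers
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        return model_ids(json.load(resp))

if __name__ == "__main__":
    print(list_models())
```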
Visit https://github.com/langdb/ai-gateway for more details.