Connecting to OSS Models
Connect to open-source models using Ollama or vLLM with LangDB AI Gateway.
Configuration
```yaml
- model: gpt-oss
  model_provider: ollama
  inference_provider:
    provider: ollama
    model_name: gpt-oss
    endpoint: https://my-ollama-server.localhost
  price:
    per_input_token: 0.0
    per_output_token: 0.0
  input_formats:
    - text
  output_formats:
    - text
  limits:
    max_context_size: 128000
  capabilities: ['tools']
  type: completions
  description: OpenAI's open-weight models designed for powerful reasoning, agentic tasks, and versatile developer use cases.
```

Configuration Fields
| Field | Description | Required | Example Usage |
| --- | --- | --- | --- |
| `model` | Name the model is exposed under through the gateway | Yes | `gpt-oss` |
| `model_provider` | Provider the model is attributed to | Yes | `ollama` |
| `inference_provider` | Backend that serves the model: `provider`, the `model_name` known to that backend, and the `endpoint` URL of the server | Yes | `endpoint: https://my-ollama-server.localhost` |
| `price` | Per-token pricing used for cost tracking; keep at `0.0` for self-hosted backends | No | `per_input_token: 0.0` |
| `input_formats` / `output_formats` | Modalities the model accepts and produces | No | `text` |
| `limits` | Model limits such as `max_context_size` (in tokens) | No | `max_context_size: 128000` |
| `capabilities` | Additional capabilities, such as tool calling | No | `['tools']` |
| `type` | Endpoint type the model is served as | Yes | `completions` |
| `description` | Human-readable description shown for the model | No | `OpenAI's open-weight models...` |
Supported Providers
Ollama
Ollama runs open-weight models locally and exposes an HTTP API (on port 11434 by default). Point `inference_provider.endpoint` at that API and set `provider: ollama`, as in the sketch below.
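A minimal sketch of the same schema pointed at a local Ollama instance, trimmed to the fields that identify the backend. The endpoint assumes Ollama's default port, `llama3` is a placeholder model name, and the remaining fields from the full example above (price, formats, limits, capabilities) apply unchanged.

```yaml
- model: llama3
  model_provider: ollama
  inference_provider:
    provider: ollama
    model_name: llama3                # placeholder; must match a model pulled into Ollama
    endpoint: http://localhost:11434  # Ollama's default local API address
  type: completions
```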
vLLM
vLLM serves models behind an OpenAI-compatible HTTP server (`vllm serve`, port 8000 by default), which the gateway can call in the same way. Configure the inference provider to target the vLLM server, as in the sketch below.
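The vLLM case follows the same schema. This is a sketch under the assumption that the gateway accepts `vllm` as the provider key; the endpoint, model name, and context size are placeholders for whatever `vllm serve` was launched with.

```yaml
- model: my-oss-model
  model_provider: vllm
  inference_provider:
    provider: vllm                        # assumption: provider key for vLLM backends
    model_name: my-org/my-oss-model       # placeholder; use the model id vLLM was started with
    endpoint: http://my-vllm-server:8000  # vLLM's OpenAI-compatible server (default port 8000)
  price:
    per_input_token: 0.0
    per_output_token: 0.0
  input_formats:
    - text
  output_formats:
    - text
  limits:
    max_context_size: 32768               # placeholder; set to the context window you serve
  type: completions
  description: Example open-weight model served through vLLM.
```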
Best Practices