Routing

Dynamic Model Routing

LangDB AI Gateway enables sophisticated routing strategies for LLM requests. You can optimize AI traffic by implementing fallback routing, script-based routing, and latency-based routing.

The self-hosted option also supports routing through configuration. Check out the full Routing documentation for more details.

Example Configuration:

{
    "model": "router/dynamic",
    "messages": [
        { "role": "system", "content": "You are a helpful assistant." },
        { "role": "user", "content": "What is the formula of a square plot?" }
    ],
    "router": {
        "router": "router",
        "type": "fallback",
        "targets": [
            { "model": "openai/gpt-4o-mini", "temperature": 0.9, "max_tokens": 500, "top_p": 0.9 },
            { "model": "deepseek/deepseek-chat", "frequency_penalty": 1, "presence_penalty": 0.6 }
        ]
    },
    "stream": false
}

This configuration defines a fallback route with multiple targets, each carrying its own generation parameters. If the primary target fails, the request falls back to the next target in the list.
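As a minimal sketch, the configuration above can be sent to the gateway's OpenAI-compatible chat completions endpoint. The base URL, endpoint path, and Bearer authorization header below are assumptions for a local self-hosted deployment; substitute the values for your own LangDB setup.

import requests

# Assumed endpoint for a self-hosted LangDB AI Gateway; adjust host, port, and path for your deployment.
GATEWAY_URL = "http://localhost:8080/v1/chat/completions"
API_KEY = "YOUR_LANGDB_API_KEY"  # placeholder credential

# Payload mirrors the example configuration above: a fallback router with two targets.
payload = {
    "model": "router/dynamic",
    "messages": [
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the formula of a square plot?"},
    ],
    "router": {
        "router": "router",
        "type": "fallback",
        "targets": [
            {"model": "openai/gpt-4o-mini", "temperature": 0.9, "max_tokens": 500, "top_p": 0.9},
            {"model": "deepseek/deepseek-chat", "frequency_penalty": 1, "presence_penalty": 0.6},
        ],
    },
    "stream": False,
}

response = requests.post(
    GATEWAY_URL,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
response.raise_for_status()

# Print the assistant's reply from whichever target handled the request.
print(response.json()["choices"][0]["message"]["content"])

Because the gateway exposes an OpenAI-compatible API, the same payload can also be sent with any OpenAI-style client by pointing its base URL at the gateway.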
