2025 LangDB. All rights reserved.

MCP Support

Create, manage, and connect MCP servers easily to integrate dynamic tools and enhance your AI workflows with full tracing.


Last updated 17 days ago


LangDB simplifies how you work with MCP (Model Context Protocol) servers, whether you want to use a built-in Virtual MCP Server or connect to an external MCP server.

Model Context Protocol (MCP) is an open standard that enables AI models to seamlessly communicate with external systems. It allows models to dynamically process contextual data, ensuring efficient, adaptive, and scalable interactions. MCP simplifies request orchestration across distributed AI systems, enhancing interoperability and context-awareness.

With native tool integrations, MCP connects AI models to APIs, databases, local files, automation tools, and remote services through a standardized protocol. Developers can effortlessly integrate MCP with IDEs, business workflows, and cloud platforms, while retaining the flexibility to switch between LLM providers. This enables the creation of intelligent, multi-modal workflows where AI securely interacts with real-world data and tools.

Using Virtual MCPs

Using API

Here's an example of how you can use a Virtual MCP Server in your project:

from openai import OpenAI
from uuid import uuid4

client = OpenAI(
    base_url="https://api.us-east-1.langdb.ai/LangDBProjectID/v1",
    api_key="xxxx",
    default_headers={"x-thread-id": str(uuid4())},
)

mcpServerUrl = "Virtual MCP Server URL"

response = client.chat.completions.create(
    model="openai/gpt-4.1",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What are the databases available?"},
    ],
    # mcp_servers is a LangDB extension, passed through extra_body
    extra_body={
        "mcp_servers": [
            {
                "server_url": mcpServerUrl,
                "type": "sse"
            }
        ]
    }
)

print(response.choices[0].message.content)
The same request in TypeScript:

import OpenAI from 'openai';
import { v4 as uuid4 } from 'uuid';

const client = new OpenAI({
  baseURL: "https://api.us-east-1.langdb.ai/LangDBProjectID/v1",
  apiKey: "xxxx",
  defaultHeaders: {
    "x-thread-id": uuid4()
  }
});

const mcpServerUrl = 'Virtual MCP Server URL';

async function getAssistantReply() {
  const { choices } = await client.chat.completions.create({
    model: "openai/gpt-4.1-nano",
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: "What are the databases on ClickHouse?" }
    ],
    // @ts-expect-error mcp_servers is a LangDB extension
    mcp_servers: [
      { server_url: mcpServerUrl, type: 'sse' }
    ]
  });
  console.log('Assistant:', choices[0].message.content);
}

getAssistantReply();

Using MCP Clients

You can instantly connect LangDB’s Virtual MCP servers to editors like Cursor, Claude, or Windsurf.

Run this in your terminal to set up MCP in Cursor:

npx @langdb/mcp setup <server_name> <mcp_url> --client cursor

You can now call tools directly in your editor, with full tracing on LangDB.
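Under the hood, the setup command registers the server with your editor's MCP configuration. For Cursor, the result is roughly equivalent to an entry like the following in `~/.cursor/mcp.json` (the exact layout depends on your Cursor version, and the server name and URL here are placeholders):

```json
{
  "mcpServers": {
    "my-virtual-mcp": {
      "url": "https://your-virtual-mcp-url"
    }
  }
}
```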

Connecting to External MCP Servers

If you already have an MCP server hosted externally — like Smithery’s Exa MCP — you can plug it straight into LangDB with zero extra setup.

Just pass your external MCP server URL in extra_body when you make a chat completion request. For example, with Smithery:

extra_body = {
    "mcp_servers": [
        {
            "server_url": "wss://your-mcp-server.com/ws?config=your_encoded_config",
            "type": "ws"
        }
    ]
}
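Since mcp_servers is a plain list, a request can in principle reference more than one server at once. As an illustrative sketch (the mcp_extra_body helper below is not part of any SDK, and the URLs are placeholders), you can build the payload programmatically and mix a Virtual MCP with an external one:

```python
VALID_TRANSPORTS = {"sse", "ws"}

def mcp_extra_body(*servers: tuple) -> dict:
    """Build a LangDB `extra_body` payload from (server_url, transport) pairs."""
    entries = []
    for url, transport in servers:
        if transport not in VALID_TRANSPORTS:
            raise ValueError(f"unsupported transport: {transport}")
        entries.append({"server_url": url, "type": transport})
    return {"mcp_servers": entries}

# A LangDB Virtual MCP (sse) alongside an external server (ws):
extra_body = mcp_extra_body(
    ("Virtual MCP Server URL", "sse"),
    ("wss://your-mcp-server.com/ws?config=your_encoded_config", "ws"),
)
```

The resulting dictionary can be passed as extra_body in the chat completion calls shown earlier.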

For more details, visit the Model Context Protocol official page and explore the Anthropic MCP documentation.

LangDB allows you to create Virtual MCP Servers directly from the dashboard. You can instantly select and bundle tools like database queries, search APIs, or automation tasks into a single MCP URL, with no external setup needed.

Check out the Virtual MCP and Guides sections for use cases.

For a complete example of how to use an external MCP server, refer to the Smithery guide.
