2025 LangDB. All rights reserved.

Smithery

Integrate Smithery's EXA MCP server into LangDB to enhance AI workflows with real-time, tool-driven interactions via WebSocket.


LangDB supports MCP servers provided by Smithery.

This particular example uses Smithery's EXA MCP server.

from openai import OpenAI
from dotenv import load_dotenv
import os
import base64
import json
from urllib.parse import quote

load_dotenv()

def url_encode(data_dict):
    """Serialize a dict to JSON, base64-encode it, then URL-encode the result."""
    return quote(base64.b64encode(json.dumps(data_dict).encode()).decode())

# Smithery MCP servers receive their configuration as a base64-encoded
# JSON object passed in the `config` query parameter.
config = {
    "exaApiKey": os.getenv("EXA_API_KEY")
}
config_str = url_encode(config)

web_socket_url = "wss://your-mcp-server.com/ws"

# Attach the MCP server to the request via the `mcp_servers` extension.
extra_body = {
    "mcp_servers": [
        {
            "server_url": f"{web_socket_url}?config={config_str}",
            "type": "ws"
        }
    ]
}

client = OpenAI(
    api_key=os.getenv("LANGDB_API_KEY"),
    base_url=os.getenv("LANGDB_API_URL")
)

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "what is langdb?"}],
    extra_body=extra_body
)
print(response)
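The `config` query parameter is just the server's JSON configuration, base64-encoded and then percent-encoded. A quick standard-library round-trip (a sketch with a placeholder key, not a real EXA credential) shows what the MCP server ultimately receives:

```python
import base64
import json
from urllib.parse import quote, unquote

def encode_config(data_dict):
    """JSON-serialize, base64-encode, then percent-encode a config dict."""
    return quote(base64.b64encode(json.dumps(data_dict).encode()).decode())

def decode_config(encoded):
    """Reverse the encoding: percent-decode, base64-decode, JSON-parse."""
    return json.loads(base64.b64decode(unquote(encoded)))

config = {"exaApiKey": "exa-test-key"}  # placeholder value for illustration
encoded = encode_config(config)

# The server can recover the original configuration exactly.
assert decode_config(encoded) == config
print(encoded)
```

Because the encoded string rides in the URL, keep the configuration small and remember that anything placed here (including API keys) travels with every connection request.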