LlamaIndex

Connect LangDB to LlamaIndex using an OpenAI-compatible setup to enhance LLM workflows with full tracing and streamlined monitoring.
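
The snippet in step 2 reads the standard OpenAI environment variables, so point them at your LangDB project's OpenAI-compatible base URL and API key. You can export them in your shell, or set them in Python before configuring the LLM; a minimal sketch with placeholder values (substitute the base URL and key for your own LangDB project):

import os

# Placeholders only: replace with the OpenAI-compatible base URL and API key
# for your LangDB project.
os.environ["OPENAI_API_BASE"] = "https://<your-langdb-gateway>/v1"
os.environ["OPENAI_API_KEY"] = "<your-langdb-api-key>"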

1. Install LlamaIndex

pip install llama-index openai

2. Add LangDB to your code

import os

from llama_index.core import Settings, SimpleDirectoryReader, VectorStoreIndex
from llama_index.llms.openai import OpenAI

# Route LLM calls through LangDB's OpenAI-compatible API:
# OPENAI_API_BASE is your LangDB base URL, OPENAI_API_KEY your LangDB API key.
Settings.llm = OpenAI(
    base_url=os.getenv("OPENAI_API_BASE"),
    api_key=os.getenv("OPENAI_API_KEY"),
    model="gpt-4o-mini"
)

# Load documents from the local "data" directory.
documents = SimpleDirectoryReader("data").load_data()
# Rest of your LlamaIndex workflow
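
From here the rest of the workflow is standard LlamaIndex, for example building an index over the loaded documents and querying it; the query text below is only illustrative:

# Build a vector index over the loaded documents and run a query against it.
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()
response = query_engine.query("What are these documents about?")
print(response)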

Check out full examples in the samples.