Working with LangGraph
Automatically instrument LangChain chains and agents with LangDB to gain live traces, cost analytics, and latency insights through a single init() call.
LangDB provides seamless tracing and observability for LangChain-based applications.
Installation
Install the LangDB client with LangChain support:
pip install 'pylangdb[langchain]'
Quick Start
Export Environment Variables
export LANGDB_API_KEY="<your_langdb_api_key>"
export LANGDB_PROJECT_ID="<your_langdb_project_id>"
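Before calling init(), it can help to fail fast if either variable is missing. A minimal sketch (require_env is a hypothetical helper, not part of pylangdb):

```python
import os

def require_env(*names: str) -> dict[str, str]:
    """Return the requested environment variables, raising if any are unset."""
    values = {name: os.getenv(name) for name in names}
    missing = [name for name, value in values.items() if not value]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return values

# Usage, after exporting the variables above:
# creds = require_env("LANGDB_API_KEY", "LANGDB_PROJECT_ID")
```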
Initialize LangDB
Import and run init() before configuring your LangChain/LangGraph components:
from pylangdb.langchain import init
# Initialize LangDB
init()
Define your Agent
# Your existing LangChain code works with proper configuration
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
import os
api_base = "https://api.us-east-1.langdb.ai"
api_key = os.getenv("LANGDB_API_KEY")
project_id = os.getenv("LANGDB_PROJECT_ID")
# Default headers for API requests
default_headers: dict[str, str] = {
"x-project-id": project_id
}
# Initialize the ChatOpenAI LLM with the LangDB configuration
llm = ChatOpenAI(
model_name="gpt-4o",
temperature=0.3,
openai_api_base=api_base,
openai_api_key=api_key,
default_headers=default_headers,
)
result = llm.invoke([HumanMessage(content="Hello, LangDB!")])
Once LangDB is initialized, all calls to llm, intermediate steps, tool executions, and nested chains are automatically traced and linked under a single session.