Building a Web Search Agent with Google ADK
Set up a web search agent with Google ADK and LangDB. Guide covers creating virtual MCP servers, integrating search tools, model routing, and full observability.
This guide walks you through setting up a powerful WebSearch agent that leverages Google ADK for orchestration and LangDB for LLM access, tracing, and flexible routing.
Overview
Google ADK: Manages agent workflows and tool execution.
LangDB: Provides access to 350+ models (OpenAI, Anthropic, Gemini, etc.), with full tracing, tool integration, and cost controls.
MCP/Tools: Use LangDB Virtual MCP system to integrate external tools like Tavily Search into your agent’s workflow.
You’ll learn how to:
Configure a Google ADK agent to use LangDB as its LLM provider
Set up and use MCP tool calls (e.g., for real-time web search)
Route and trace all agent activity using LangDB
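Before building the agents, it helps to have the LangDB credentials in place. Below is a minimal pre-flight sketch, assuming the API key and project ID are exposed as the same environment variables read by the agent definitions later in this guide:
import os

# LangDB credentials; the agent definitions below read these via os.getenv(...)
LANGDB_API_KEY = os.getenv("LANGDB_API_KEY")
LANGDB_PROJECT_ID = os.getenv("LANGDB_PROJECT_ID")

if not (LANGDB_API_KEY and LANGDB_PROJECT_ID):
    raise RuntimeError("Set LANGDB_API_KEY and LANGDB_PROJECT_ID before running the agent.")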
Agents Involved
Critic Agent (critic_agent)
Role: Web search and analysis
Step: First in a 2-step pipeline
Functionality:
Receives the user query
Performs a web search using MCP tools (e.g., Tavily)
Annotates its response with grounding references (links, citations)
Reviser Agent (reviser_agent)
Role: Synthesis and refinement
Step: Second in the pipeline
Functionality:
Takes the output from Critic Agent
Refines, synthesizes, and clarifies the response
Architecture Overview
Two-Step Pipeline:
Critic Agent analyzes and gathers evidence via web search
Reviser Agent polishes the answer for delivery
Shared Context: Both agents use shared thread/run IDs to maintain continuity (see the sketch after the diagram below)
Extensible: More agents or steps can be added to the pipeline as needed
Tech Stack: Built on LlmAgent from google.adk.agents, using the LangDB LLM interface and MCP integration
User Query
│
▼
[Critic Agent] --(web search, analysis, references)-->
│
▼
[Reviser Agent] --(synthesis, refinement)-->
│
▼
Final Answer to User
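The shared context is simply a pair of identifiers that both agents send as extra headers, so LangDB groups their traces into one conversation. A minimal sketch, assuming uuid-generated values (any stable strings would do); the names match the agent definitions below:
import uuid

# Passed to both agents as x-thread-id / x-run-id extra headers so LangDB
# groups the critic and reviser calls into a single thread and run.
SHARED_THREAD_ID = str(uuid.uuid4())
SHARED_RUN_ID = str(uuid.uuid4())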
Enabling Web Search Agents with Virtual MCP Server
To enable real-time web search for your agent, you must use a Virtual MCP Server in LangDB. This ensures all web search tools (such as Tavily) are available and can be integrated seamlessly into your agent's workflow.
Steps to Create a Virtual MCP Server
Click + New Virtual MCP Server:
Name:
web-search-mcp
Underlying MCP: Tavily Search MCP
Requires API Key: Make sure your Tavily API key is configured in your environment to authenticate this operation.
After Creating a Virtual MCP Server
You get the MCP URL from the Code Snippets section. We will use this MCP server URL in the Critic Agent:
import os

from google.adk.agents import LlmAgent
# LangDBLlm is the LangDB model wrapper provided by the langdb-adk package;
# prompt and _render_reference are defined elsewhere in the agent package.

server_url = "https://api.us-east-1.langdb.ai/mcp/tavily_4ykdv5fj"

critic_agent = LlmAgent(
    model=LangDBLlm(
        model="openai/gpt-4.1",
        api_key=os.getenv("LANGDB_API_KEY"),
        project_id=os.getenv("LANGDB_PROJECT_ID"),
        extra_headers={
            # Shared IDs keep both agents in the same LangDB thread/run trace.
            "x-thread-id": SHARED_THREAD_ID,
            "x-run-id": SHARED_RUN_ID
        },
        # Attach the Virtual MCP Server so the model can call Tavily web search.
        mcp_servers=[{
            "server_url": server_url,
            "type": "sse",
            "name": "Tavily"
        }]
    ),
    name="critic_agent",
    instruction=prompt.CRITIC_PROMPT,
    after_model_callback=_render_reference,  # annotates the answer with grounding references
)
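The Reviser Agent described above can be defined the same way, just without the MCP servers, since it only refines the Critic Agent's output. A hedged sketch; prompt.REVISER_PROMPT is an assumed counterpart to prompt.CRITIC_PROMPT:
reviser_agent = LlmAgent(
    model=LangDBLlm(
        model="openai/gpt-4.1",
        api_key=os.getenv("LANGDB_API_KEY"),
        project_id=os.getenv("LANGDB_PROJECT_ID"),
        extra_headers={
            # Same IDs as the critic so both steps share one thread/run trace.
            "x-thread-id": SHARED_THREAD_ID,
            "x-run-id": SHARED_RUN_ID
        }
        # No mcp_servers: the reviser synthesizes and refines, it does not search.
    ),
    name="reviser_agent",
    instruction=prompt.REVISER_PROMPT,
)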
Running the Agent
Prerequisites:
Install Google ADK and LangDB:
pip install google-adk langdb-adk
What is langdb-adk?
langdb-adk is an extension package for Google ADK. It provides additional utilities and integrations that make it easier to use LangDB as your LLM backend within Google ADK workflows.
Executing the Agent:
adk run web-search
or
adk web
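adk run web-search expects the web-search package to expose a root agent. Here is a minimal sketch of how the two agents could be wired into the two-step pipeline, assuming SequentialAgent from google.adk.agents and the agent definitions above; the file name and layout follow the usual ADK agent-package convention:
# web-search/agent.py (sketch)
from google.adk.agents import SequentialAgent

# Critic runs first (web search via MCP), then the reviser polishes its output.
root_agent = SequentialAgent(
    name="web_search_pipeline",
    sub_agents=[critic_agent, reviser_agent],
)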
Output
Here is a conversation with the Web-Search agent: