Tracing

LangDB Gateway provides detailed tracing to monitor, debug, and optimize LLM workflows.
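
Traces are captured automatically for any request routed through the gateway. Below is a minimal sketch of sending a chat completion through LangDB with an OpenAI-compatible client so the call shows up in a trace; the base URL, header name, and model are illustrative assumptions, so substitute the endpoint and credentials from your own project settings.

```python
from openai import OpenAI

# Point an OpenAI-compatible client at the LangDB Gateway.
# NOTE: the base URL and project header are assumptions for illustration;
# use the values shown in your LangDB project settings.
client = OpenAI(
    base_url="https://api.us-east-1.langdb.ai/v1",      # assumed gateway endpoint
    api_key="<LANGDB_API_KEY>",
    default_headers={"x-project-id": "<PROJECT_ID>"},   # assumed project header
)

# Any call made through the gateway is traced, including timing and token usage.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[{"role": "user", "content": "Summarize our latest support tickets."}],
)
print(response.choices[0].message.content)
```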

Below is an example of a trace visualization from the dashboard, showcasing a detailed breakdown of the request stages:

LangDB's Tracing provides a clear view of your workflow. The trace above demonstrates:

  • Multiple Model Calls: Each model invocation is tracked with timing and token usage.

  • Agent Flows: Each agent step is recorded and can be marked with custom labels for easier debugging (see the sketch after this list).

  • Tool Integrations: Tool calls such as tavily_search_results_json and python_repl_tool are logged with their execution time, so you can see where time is spent across the workflow.
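
As referenced in the agent flows item above, individual calls can be tagged with custom labels so related spans are easy to find in the dashboard. The sketch below assumes labels are passed as a per-request header; `x-label` is an assumed header name, so check your gateway configuration for the exact mechanism.

```python
from openai import OpenAI

client = OpenAI(
    base_url="https://api.us-east-1.langdb.ai/v1",  # assumed gateway endpoint
    api_key="<LANGDB_API_KEY>",
)

# Attach a custom label to this call so the resulting span is easy to locate
# in the trace view. "x-label" and its value are illustrative assumptions.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model name
    messages=[{"role": "user", "content": "Plan the research steps for this query."}],
    extra_headers={"x-label": "planner-agent"},
)
print(response.choices[0].message.content)
```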
