[bug] Traces appear to be broken when used with agent.ainvoke(..) async function in langgraph #2190

@MD-AZMAL

Description

Describe the bug
Traces appear to be broken when used with agent.ainvoke(..) async function in langgraph

To Reproduce
Run a multi-step agent using ainvoke:

result = await chat_agent.ainvoke(
    {
        "question": message.query,
        "messages": HumanMessage(content=message.query),
        "observations": [],
    },
    config=config,
)

Expected behavior
Traces should appear in correct order with proper nesting and aggregation

Screenshots
Without async (using invoke(..))

[screenshot]

With async (using ainvoke(..))

[screenshot]

Desktop (please complete the following information):

  • OS: Windows 11

Additional context
Here is my code for running the agent

async def run():
    # ....
    with (
        using_attributes(session_id=str(thread_id), user_id=uid)
        if settings.ENABLE_TRACING
        else nullcontext()
    ):

        chat_agent_workflow = AgfiChatV2Agent(memory=checkpointer)

        chat_agent = chat_agent_workflow.agent

        callbacks = []
        if settings.ENABLE_LANGGRAPH_DEBUG:
            callbacks.append(ConsoleCallbackHandler())

        config = {
            "configurable": {
                "thread_id": thread_id,
            },
            "metadata": {"uid": uid, "chat_type": "global"},
            "callbacks": callbacks,
        }

        result = await chat_agent.ainvoke(
            {
                "question": message.query,
                "messages": HumanMessage(content=message.query),
                "observations": [],
            },
            config=config,
        )

        result["final_answer"].sources = []
# ....

Here is the tracing setup, which runs on startup:

from openinference.instrumentation.langchain import LangChainInstrumentor
from opentelemetry.instrumentation.asyncio import AsyncioInstrumentor
from phoenix.otel import BatchSpanProcessor, register

from app.config import settings


def setup_tracing_pheonix():

    if not settings.ENABLE_TRACING:
        print("Tracing is disabled")
        return

    # AsyncioInstrumentor().instrument()

    tracer_provider = register(
        project_name="mlb", auto_instrument=True, batch=True, protocol="http/protobuf"
    )

    # Note: register(..., batch=True) above should already install a batch
    # span processor, so this likely adds a second exporter for every span.
    batch_processor = BatchSpanProcessor(protocol="http/protobuf")
    tracer_provider.add_span_processor(batch_processor)

    LangChainInstrumentor(tracer_provider=tracer_provider).instrument(
        skip_dep_check=True,
    )

    print("Tracing enabled for langgraph with arize phoenix")

Other Information

  • I am using FastAPI as the backend server
  • If I use the blocking invoke it works perfectly, but with ainvoke the resulting traces are neither grouped nor in order

Metadata

Status: In Progress