[bug] SmolagentsInstrumentor does not create model spans when using CodeAgent #1636

Open
igor-gorohovsky opened this issue May 15, 2025 · 10 comments
Assignees
Labels
bug Something isn't working instrumentation: smolagents

Comments

@igor-gorohovsky

igor-gorohovsky commented May 15, 2025

Describe the bug
opentelemetry-instrumentation-smolagents instruments the model layer by monkey-patching Model.__call__.
The CodeAgent (and any logic that reuses its _step_stream / step implementation) never calls __call__; it uses model.generate(...) or model.generate_stream(...) directly.
Because of this, no model spans are produced when a CodeAgent is executed – only the agent-level and tool-level spans appear.
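
To make the failure mode concrete, here is a minimal self-contained sketch (toy classes, not the real smolagents or OpenTelemetry code) of why a wrapper installed only on `__call__` never fires when the agent calls `generate()` directly:

```python
# Toy reproduction of the instrumentation gap: patching __call__ only.
calls = []

class Model:
    def generate(self, prompt):
        return f"completion for {prompt!r}"

    def __call__(self, prompt):
        # Like smolagents' Model, __call__ simply delegates to generate.
        return self.generate(prompt)

# Instrumentor-style monkey patch on __call__ only.
original_call = Model.__call__

def traced_call(self, prompt):
    calls.append("model span")  # stand-in for opening an OTel span
    return original_call(self, prompt)

Model.__call__ = traced_call

model = Model()
model("hi")           # goes through __call__  -> span recorded
model.generate("hi")  # CodeAgent's path: bypasses __call__ -> no span
print(len(calls))     # 1: only one span despite two LLM calls
```

Two LLM invocations, but only one "span" — exactly the behavior reported above.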

To Reproduce

  1. Create a Python script:
from smolagents import CodeAgent, tool, OpenAIServerModel

model = OpenAIServerModel(model_id="gpt-3.5-turbo")

@tool
def get_tasks() -> dict:
    """Returns the user's tasks."""
    return {"title": "some work"}

agent = CodeAgent(tools=[get_tasks], model=model)

agent.run("What tasks do I have?")
  2. Run the script with opentelemetry-instrument:
opentelemetry-instrument python3 script.py

Expected behavior
Every call to the underlying LLM (generate / generate_stream) should be wrapped in a span so that prompt, completion, latency, token counts, etc. are recorded.

Versions / Environment

  • smolagents: 1.15.0
  • opentelemetry-instrumentation-smolagents: 0.1.11
  • opentelemetry-sdk/api: 1.33.0
  • Python 3.11.12
  • OS: Ubuntu
  • Version: 24.04

Additional context
The instrumentor only wraps:

wrap_function_wrapper("smolagents", name=model_subclass.__name__ + ".__call__", ...)

but CodeAgent._step_stream uses:

chat_message = self.model.generate(...)
# or
output_stream = self.model.generate_stream(...)

Possible fix

  • Wrap Model.generate and Model.generate_stream instead of Model.__call__.
    (__call__ simply delegates to generate)
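
For illustration, the same toy classes show that wrapping generate covers both entry points. This is a sketch using plain monkey patching; the real instrumentor would use wrapt's wrap_function_wrapper as in the snippets above:

```python
# Toy sketch of the proposed fix: wrap generate instead of __call__,
# so both __call__ and direct generate() calls produce a span.
spans = []

class Model:
    def generate(self, prompt):
        return f"completion for {prompt!r}"

    def __call__(self, prompt):
        return self.generate(prompt)

original_generate = Model.generate

def traced_generate(self, prompt):
    spans.append("model span")  # stand-in for opening an OTel span
    return original_generate(self, prompt)

Model.generate = traced_generate

model = Model()
model("hi")           # __call__ delegates to generate -> span
model.generate("hi")  # direct call                    -> span
print(len(spans))     # 2: one span per LLM call
```

Because `__call__` delegates to `generate`, wrapping `generate` yields exactly one span per LLM call regardless of which entry point the agent uses.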
@igor-gorohovsky igor-gorohovsky added bug Something isn't working triage Issues that require triage labels May 15, 2025

dosubot bot commented May 15, 2025

Hello @igor-gorohovsky! I'm Dosu, a bot here to help you with bugs, answer questions, and assist you in becoming a contributor. I'm ready to help while you wait for a human maintainer. Let's dive into your issue!

To address the issue where SmolagentsInstrumentor does not create model spans for CodeAgent, you can modify the instrumentor to wrap the generate and generate_stream methods instead of __call__. Here's how you can implement the fix:

  1. Update the _instrument method in the SmolagentsInstrumentor to wrap generate and generate_stream for each model subclass:
def _instrument(self, **kwargs: Any) -> None:
    import smolagents
    from smolagents import CodeAgent, MultiStepAgent, Tool, ToolCallingAgent, models

    if not (tracer_provider := kwargs.get("tracer_provider")):
        tracer_provider = trace_api.get_tracer_provider()
    if not (config := kwargs.get("config")):
        config = TraceConfig()
    else:
        assert isinstance(config, TraceConfig)
    self._tracer = OITracer(
        trace_api.get_tracer(__name__, __version__, tracer_provider),
        config=config,
    )

    run_wrapper = _RunWrapper(tracer=self._tracer)
    self._original_run_method = getattr(MultiStepAgent, "run", None)
    wrap_function_wrapper(
        module="smolagents",
        name="MultiStepAgent.run",
        wrapper=run_wrapper,
    )

    self._original_step_methods: Optional[dict[type, Optional[Callable[..., Any]]]] = {}
    step_wrapper = _StepWrapper(tracer=self._tracer)
    for step_cls in [CodeAgent, ToolCallingAgent]:
        self._original_step_methods[step_cls] = getattr(step_cls, "step", None)
        wrap_function_wrapper(
            module="smolagents",
            name=f"{step_cls.__name__}.step",
            wrapper=step_wrapper,
        )

    self._original_model_call_methods: Optional[dict[type, dict[str, Optional[Callable[..., Any]]]]] = {}

    exported_model_subclasses = [
        attr
        for _, attr in vars(smolagents).items()
        if isinstance(attr, type) and issubclass(attr, models.Model)
    ]
    for model_subclass in exported_model_subclasses:
        model_subclass_wrapper = _ModelWrapper(tracer=self._tracer)
        self._original_model_call_methods[model_subclass] = {
            "generate": getattr(model_subclass, "generate", None),
            "generate_stream": getattr(model_subclass, "generate_stream", None)
        }
        wrap_function_wrapper(
            module="smolagents",
            name=model_subclass.__name__ + ".generate",
            wrapper=model_subclass_wrapper,
        )
        # Guard: not every model subclass necessarily defines generate_stream
        if getattr(model_subclass, "generate_stream", None) is not None:
            wrap_function_wrapper(
                module="smolagents",
                name=model_subclass.__name__ + ".generate_stream",
                wrapper=model_subclass_wrapper,
            )

    tool_call_wrapper = _ToolCallWrapper(tracer=self._tracer)
    self._original_tool_call_method = getattr(Tool, "__call__", None)
    wrap_function_wrapper(
        module="smolagents",
        name="Tool.__call__",
        wrapper=tool_call_wrapper,
    )

This modification ensures that the generate and generate_stream methods are wrapped, allowing spans to be created for these methods when a CodeAgent is executed [1].


@victorlearned

victorlearned commented May 22, 2025

Any update on this? This basically makes our CodeAgent traces worthless.

@mikeldking
Contributor

Hey @victorlearned @igor-gorohovsky - sorry for the delay. I believe if you add openinference-instrumentation-litellm in addition to smolagents you should get all the details. I have not tested this, however, so if you could give that a try and let me know.

Apologies for the delay. Will work with @Arize-ai/dev-rel to see if we can reproduce.

@mikeldking mikeldking removed the triage Issues that require triage label May 23, 2025
@Dylancouzon Dylancouzon moved this to Todo in Instrumentation May 27, 2025
@Dylancouzon

@mikeldking I was able to replicate this issue.
SmolagentsInstrumentor only wraps Model.__call__, but CodeAgent uses model.generate() directly, resulting in no model spans being created.

I tested the workaround of adding openinference-instrumentation-litellm, but it didn’t seem to resolve the issue. From what I can tell, smolagents doesn’t use LiteLLM internally, so I’m not sure how this instrumentation would take effect.

The proposed fix in the original issue seems correct and should work:

# In SmolagentsInstrumentor._instrument()
for method_name in ["__call__", "generate", "generate_stream"]:
    wrap_function_wrapper(
        module="smolagents",
        name=f"{model_subclass.__name__}.{method_name}",
        wrapper=model_subclass_wrapper,
    )

@MkYacine

MkYacine commented May 29, 2025

Hi @Dylancouzon! I see you're assigned to this and have already confirmed the fix.
I've run into the same issue yesterday, implemented the proposed fix locally and tested it successfully. Would you like me to open a PR, or are you already working on this? Happy to help if needed

@Mandark-droid

Yup, I'm running into the same issue as well, so let me know whether I should try the workaround listed above or a fix will be provided.

@axiomofjoy
Contributor

Hi @Dylancouzon! I see you're assigned to this and have already confirmed the fix. I've run into the same issue yesterday, implemented the proposed fix locally and tested it successfully. Would you like me to open a PR, or are you already working on this? Happy to help if needed

@MkYacine Feel free to open a PR 😃

@igor-gorohovsky
Author

@axiomofjoy Please review my pull request; it fixes my problem with CodeAgent tracing.

@github-project-automation github-project-automation bot moved this from Todo to Done in Instrumentation Jun 4, 2025
@victorlearned

why was this closed? The PR is still out for review.

@axiomofjoy
Contributor

why was this closed? The PR is still out for review.

I think that was probably a mistake. Reopening.

@axiomofjoy axiomofjoy reopened this Jun 4, 2025
7 participants