
[bug] missing llm input messages in smolagents #2125

@andstor

Description


Describe the bug
"Input Messages" are no longer available in traces from smolagents.

As the following image shows, only the raw "Input" payload is rendered as JSON.

[Image: span details showing only the raw "Input" JSON]

This is hard to inspect and follow. The LLM Input Messages should therefore also be available, showing the chat messages used in the turn. The following image shows the desired output.

[Image: span details with a structured "Input Messages" view, one entry per chat message]
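For reference, the structured view above is driven by flattened span attributes following the OpenInference semantic convention (`llm.input_messages.<i>.message.role` / `.content`). The sketch below is a minimal, hypothetical illustration of that mapping — the `flatten_input_messages` helper is not part of any of the libraries listed here — showing what attributes the instrumentation would need to emit for the UI to render per-message input:

```python
def flatten_input_messages(messages):
    """Map a list of chat messages to flat OpenInference-style
    span-attribute keys (llm.input_messages.<i>.message.*)."""
    attributes = {}
    for i, msg in enumerate(messages):
        prefix = f"llm.input_messages.{i}.message"
        attributes[f"{prefix}.role"] = msg["role"]
        attributes[f"{prefix}.content"] = msg["content"]
    return attributes

messages = [
    {"role": "system", "content": "You are a helpful agent."},
    {"role": "user", "content": "What is 2 + 2?"},
]
attrs = flatten_input_messages(messages)
# attrs["llm.input_messages.0.message.role"] == "system"
# attrs["llm.input_messages.1.message.content"] == "What is 2 + 2?"
```

When these attributes are absent and only a single serialized "Input" blob is set on the span, the trace viewer has nothing structured to display, which matches the behavior reported above.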

Environment

  • openai==1.101.0
  • smolagents==1.21.2
  • opentelemetry-sdk==1.36.0
  • opentelemetry-exporter-otlp==1.36.0
  • openinference-instrumentation-smolagents==0.1.16
