feat(semcov): Add Image Token Count and Cost Semantic Conventions #1787
base: main
Conversation
@arizeai/openinference-core
@arizeai/openinference-instrumentation-beeai
@arizeai/openinference-instrumentation-langchain
@arizeai/openinference-instrumentation-mcp
@arizeai/openinference-instrumentation-openai
@arizeai/openinference-mastra
@arizeai/openinference-semantic-conventions
@arizeai/openinference-vercel
Hey @sallyannarize - I've reviewed your changes - here's some feedback:
- Remove the duplicated prompt_details and completion_details constant definitions in the Python SpanAttributes class to avoid redefinition conflicts and keep the attribute grouping consistent (see the sketch after this list).
- Consider splitting out the input/output cost prefix refactoring (LLM_COST_INPUT, LLM_COST_OUTPUT, etc.) into a separate PR so this one stays focused solely on adding the image token/count and cost conventions.
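For illustration, a minimal sketch of the redefinition problem the first point describes; the constant name is from the PR, but the surrounding class contents are assumed:

```python
class SpanAttributes:
    LLM_TOKEN_COUNT_PROMPT_DETAILS = "llm.token_count.prompt_details"
    # ... other attribute groups ...

    # A second class-level assignment of the same name is silently
    # last-wins in Python; if the two values ever diverge, only the
    # later one takes effect, and the earlier grouping misleads readers.
    LLM_TOKEN_COUNT_PROMPT_DETAILS = "llm.token_count.prompt_details"
```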
```python
Total cost of all output tokens generated by the LLM in USD. This includes all tokens that were
generated in response to the prompt, including the main response and any additional output.
"""
LLM_COST_COMPLETION_DETAILS = "llm.cost.completion_details"
```
Is removing or renaming `LLM_COST_COMPLETION_DETAILS` intentional? If so, the tests in `test_attributes.py` should be updated to reflect the change; otherwise CI won't pass.
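For illustration, a sketch of the kind of assertion such a test might make; the actual structure of test_attributes.py may differ:

```python
from openinference.semconv.trace import SpanAttributes

def test_llm_cost_completion_details() -> None:
    # If the constant were removed or renamed, this assertion would fail in CI.
    assert SpanAttributes.LLM_COST_COMPLETION_DETAILS == "llm.cost.completion_details"
```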
Must have accidentally deleted these; I've added them back.
```python
Total cost of all input tokens sent to the LLM in USD. This includes all tokens that were
processed as part of the prompt, including system messages, user messages, and any other input.
"""
LLM_COST_PROMPT_DETAILS = "llm.cost.prompt_details"
```
Same comments here about `LLM_COST_PROMPT_DETAILS`.
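For context, a small sketch of how the `*_details` constants compose with the per-category keys this PR adds; the composition shown is inferred from the attribute names:

```python
LLM_COST_PROMPT_DETAILS = "llm.cost.prompt_details"

# Per-category breakdowns nest under the details prefix; the image key
# added in this PR composes as:
assert f"{LLM_COST_PROMPT_DETAILS}.image" == "llm.cost.prompt_details.image"
```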
js/packages/openinference-semantic-conventions/src/trace/SemanticConventions.ts (outdated; resolved)
…ticConventions.ts Co-authored-by: graphite-app[bot] <96075541+graphite-app[bot]@users.noreply.github.com>
```typescript
  AWS = "aws",
  AZURE = "azure",
}
}
```
The file is missing a trailing newline. Ending files with a newline is standard practice for POSIX compliance and helps prevent issues with certain tools and version control systems. Consider adding one to stay consistent with standard file formatting conventions.
Spotted by Diamond
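As a side note, a minimal sketch of adding the missing newline programmatically; any editor setting or formatter (e.g. EditorConfig's insert_final_newline) achieves the same:

```python
from pathlib import Path

# Append a final newline only if the file doesn't already end with one.
path = Path("js/packages/openinference-semantic-conventions/src/trace/SemanticConventions.ts")
text = path.read_text()
if not text.endswith("\n"):
    path.write_text(text + "\n")
```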
This PR adds support for image token counting and cost tracking across OpenInference semantic conventions.
Changes Made
New Semantic Conventions:
- `llm.token_count.prompt_details.image` - Number of image tokens in the prompt
- `llm.token_count.completion_details.image` - Number of image tokens in the completion
- `llm.cost.prompt_details.image` - Cost of image tokens in the prompt (USD)
- `llm.cost.completion_details.image` - Cost of image tokens in the completion (USD)

resolves #1780
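For illustration, a minimal sketch of recording these attributes on an OpenTelemetry span; the span name and the token/cost figures are made up, and real instrumentations would derive them from provider usage and pricing data:

```python
from opentelemetry import trace

tracer = trace.get_tracer(__name__)

# Illustrative values only; in practice these come from the LLM
# provider's usage report for the request.
with tracer.start_as_current_span("llm_call") as span:
    span.set_attribute("llm.token_count.prompt_details.image", 1105)
    span.set_attribute("llm.token_count.completion_details.image", 0)
    span.set_attribute("llm.cost.prompt_details.image", 0.0042)
    span.set_attribute("llm.cost.completion_details.image", 0.0)
```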
Summary by Sourcery
Add image-specific token counting and cost tracking to OpenInference semantic conventions across Python and TypeScript implementations.