fix: preserve callProviderMetadata, providerExecuted, and title on tool parts #1013
Merged
threepointone merged 2 commits into main on Feb 28, 2026
Conversation
…ol parts

Fixes #1009 — Gemini's `thought_signature` was being silently dropped during server-side message building, causing 'Function call is missing a thought_signature in functionCall parts' errors when `addToolOutput` was invoked from the client side.

Root cause: `applyChunkToParts` in `message-builder.ts` was not preserving `providerMetadata` from tool-input stream chunks as `callProviderMetadata` on the resulting tool UIMessage parts. The AI SDK's own client-side `updateToolPart` correctly maps `chunk.providerMetadata` to `part.callProviderMetadata`, but our server-side builder was dropping it. When `convertToModelMessages` later reads `callProviderMetadata` to pass as `providerOptions` back to the model, the field was undefined, so Gemini never received its `thought_signature` back on the continuation call. The same gap existed for `providerExecuted` (used by provider-executed tools like Gemini code execution) and `title` (tool display name).

Changes:
- message-builder.ts: Preserve `callProviderMetadata`, `providerExecuted`, and `title` on tool parts in `tool-input-start`, `tool-input-available`, and `tool-input-error` handlers (both create and update paths)
- message-builder.ts: Add `providerExecuted` to the `StreamChunkData` type explicitly for discoverability
- react.tsx: Fix stale comment that described `tool-input-start` and `tool-input-delta` as 'unrecognized types' (they are handled)
- message-builder.test.ts: Add 13 regression tests covering all create/update paths for the three fields

Verified:
- `_sanitizeMessageForPersistence` only strips openai-keyed metadata, so `google.thoughtSignature` survives persistence
- `_applyToolResult` uses spread, preserving these fields through updates
- `_persistOrphanedStream` uses `applyChunkToParts`, so orphaned stream rebuilds also benefit
- All consumers (server streaming, client react, orphaned streams) go through the single `applyChunkToParts` codepath
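The mapping the commit message describes can be sketched as follows. This is a minimal TypeScript illustration with simplified shapes: the `ToolPart` type and the `createToolPart` helper are assumptions for demonstration, not the actual `message-builder.ts` code.

```typescript
// Simplified, assumed shapes — the real types live in the AI SDK and
// packages/ai-chat; only the field names quoted in the PR are taken as given.
type ProviderMetadata = Record<string, Record<string, unknown>>;

interface StreamChunkData {
  type: string;
  toolCallId: string;
  toolName?: string;
  input?: unknown;
  providerMetadata?: ProviderMetadata;
  providerExecuted?: boolean;
  title?: string;
}

interface ToolPart {
  type: string;
  toolCallId: string;
  state: "input-streaming" | "input-available" | "output-available";
  input?: unknown;
  // The three fields this PR preserves:
  callProviderMetadata?: ProviderMetadata;
  providerExecuted?: boolean;
  title?: string;
}

// Sketch of the fixed create path for a `tool-input-start` chunk: the chunk's
// providerMetadata is stored as callProviderMetadata on the part (mirroring
// the AI SDK's client-side updateToolPart) instead of being dropped.
function createToolPart(chunk: StreamChunkData): ToolPart {
  return {
    type: `tool-${chunk.toolName ?? "unknown"}`,
    toolCallId: chunk.toolCallId,
    state: "input-streaming",
    input: chunk.input,
    ...(chunk.providerMetadata !== undefined && {
      callProviderMetadata: chunk.providerMetadata
    }),
    ...(chunk.providerExecuted !== undefined && {
      providerExecuted: chunk.providerExecuted
    }),
    ...(chunk.title !== undefined && { title: chunk.title })
  };
}

const part = createToolPart({
  type: "tool-input-start",
  toolCallId: "call_1",
  toolName: "getWeather",
  providerMetadata: { google: { thoughtSignature: "sig-abc" } }
});

console.log(part.callProviderMetadata?.google.thoughtSignature); // → sig-abc
```

Before the fix, the bug was simply that the three optional spreads were absent, so the fields never made it onto the part and `convertToModelMessages` had nothing to hand back to Gemini.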
🦋 Changeset detected — latest commit: e7dde8d. The changes in this PR will be included in the next version bump. This PR includes changesets to release 1 package.
Fixes #1009
Problem
When using Gemini with client-side tools (e.g. `addToolOutput`), the continuation call after tool output fails with:

> Function call is missing a thought_signature in functionCall parts

This affects agents SDK v0.5.1+ / `@cloudflare/ai-chat` v0.1.3+ with the Gemini API.

Reproduction: https://github.com/Suman085/agents-issue-reproduction

Root Cause
`applyChunkToParts` in `message-builder.ts` — the server-side message builder used by `AIChatAgent` — was silently dropping `providerMetadata` from tool-input stream chunks instead of storing it as `callProviderMetadata` on the resulting tool UIMessage parts.

The AI SDK's own client-side `updateToolPart` correctly does this mapping, but our server-side `applyChunkToParts` was only storing `state` and `input`.

Later, when `convertToModelMessages` reads the persisted messages for the continuation call, it looks for `callProviderMetadata` and passes it as `providerOptions` to the model. Since it was never stored, Gemini doesn't receive its `thought_signature` back and throws the error.

Two additional fields were also missing:
`providerExecuted` (used by `convertToModelMessages` for provider-executed tools like Gemini code execution) and `title` (tool display name).

Changes
packages/ai-chat/src/message-builder.ts
- `tool-input-start`: Store `callProviderMetadata`, `providerExecuted`, and `title` on new tool parts
- `tool-input-available`: Store all three fields on both create and update paths
- `tool-input-error`: Store `callProviderMetadata` and `providerExecuted` on both create and update paths
- Add `providerExecuted` to the `StreamChunkData` type explicitly (it was previously only accessible via the `[key: string]: unknown` index signature)

packages/ai-chat/src/react.tsx
- Fix stale comment that described `tool-input-start` and `tool-input-delta` as "unrecognized types" — they are handled by `applyChunkToParts`

packages/ai-chat/src/tests/message-builder.test.ts
- Add 13 regression tests covering all create/update paths for the three fields, using `google.thoughtSignature` metadata to match the real-world failure case

Notes for reviewers
**Why only `tool-input-*` handlers?** The AI SDK's stream only includes `providerMetadata` on `tool-input-start`, `tool-input-available`, and `tool-input-error` events. The `tool-output-available` and `tool-output-error` events do not carry `providerMetadata`, so those handlers don't need changes.

**Sanitization is safe.** `_sanitizeMessageForPersistence` only strips openai-keyed metadata from `callProviderMetadata`. Google/Gemini metadata (keyed under `google`) passes through untouched.

**`_applyToolResult` is safe.** It uses a `{ ...part, state: "output-available", output }` spread, so `callProviderMetadata`, `providerExecuted`, and `title` from the original tool part are preserved through the update.

**All consumers benefit.** Server-side streaming (`_streamSSEReply`), client-side react (`react.tsx`), and orphaned stream persistence (`_persistOrphanedStream`) all go through `applyChunkToParts`, so the fix covers all codepaths.

This is a field-by-field parity fix with the AI SDK's `updateToolPart`. I audited every field the AI SDK stores and confirmed no other fields are missing after this change.

Testing
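A regression check along the lines described above can be sketched as below. The shapes and helpers (`onToolInputStart`, `applyToolResult`) are illustrative stand-ins for the builder logic, not the PR's actual test code.

```typescript
// Assumed, simplified shapes — a stand-in for the packages/ai-chat builder.
interface ToolPart {
  type: string;
  toolCallId: string;
  state: string;
  input?: unknown;
  output?: unknown;
  callProviderMetadata?: Record<string, Record<string, unknown>>;
  providerExecuted?: boolean;
  title?: string;
}

// Stand-in for the fixed tool-input-start handling: the chunk's
// providerMetadata lands on the part as callProviderMetadata.
function onToolInputStart(chunk: {
  toolCallId: string;
  toolName: string;
  providerMetadata?: Record<string, Record<string, unknown>>;
}): ToolPart {
  return {
    type: `tool-${chunk.toolName}`,
    toolCallId: chunk.toolCallId,
    state: "input-streaming",
    callProviderMetadata: chunk.providerMetadata
  };
}

// Stand-in for _applyToolResult: a spread-based update, so fields set
// earlier (including callProviderMetadata) survive the state transition.
function applyToolResult(part: ToolPart, output: unknown): ToolPart {
  return { ...part, state: "output-available", output };
}

const part = onToolInputStart({
  toolCallId: "call_1",
  toolName: "getWeather",
  providerMetadata: { google: { thoughtSignature: "sig-123" } }
});

const updated = applyToolResult(part, { tempC: 21 });

// The metadata Gemini needs on the continuation call is still there.
console.assert(
  updated.callProviderMetadata?.google.thoughtSignature === "sig-123"
);
```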