
fix: preserve callProviderMetadata, providerExecuted, and title on tool parts #1013

Merged

threepointone merged 2 commits into main from preserve-provider-metadata on Feb 28, 2026

Conversation

@threepointone (Contributor)

Fixes #1009

Problem

When using Gemini with client-side tools (e.g. addToolOutput), the continuation call after tool output fails with:

Function call is missing a thought_signature in functionCall parts. This is required for tools to work correctly, and missing thought_signature may lead to degraded model performance.

This affects agents SDK v0.5.1+ / @cloudflare/ai-chat v0.1.3+ with the Gemini API. Reproduction: https://github.com/Suman085/agents-issue-reproduction

Root Cause

applyChunkToParts in message-builder.ts — the server-side message builder used by AIChatAgent — was silently dropping providerMetadata from tool-input stream chunks instead of storing it as callProviderMetadata on the resulting tool UIMessage parts.

The AI SDK's own client-side updateToolPart correctly does this mapping:

// AI SDK client-side (correct)
if (anyOptions.providerMetadata != null) {
  part.callProviderMetadata = anyOptions.providerMetadata;
}

But our server-side applyChunkToParts was only storing state and input:

// Our code (before fix — missing callProviderMetadata)
parts.push({
  type: `tool-${chunk.toolName}`,
  toolCallId: chunk.toolCallId,
  toolName: chunk.toolName,
  state: "input-available",
  input: chunk.input
});

Later, when convertToModelMessages reads the persisted messages for the continuation call, it looks for callProviderMetadata and passes it as providerOptions to the model. Since it was never stored, Gemini doesn't receive its thought_signature back and throws the error.
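Concretely, the fix mirrors the AI SDK's client-side mapping on the server. A minimal sketch, assuming simplified stand-ins for the real StreamChunkData and tool-part types in message-builder.ts (field names follow the PR description; the actual implementation differs):

```typescript
// Simplified stand-in for the chunk shape described in the PR.
type StreamChunkData = {
  type: string;
  toolCallId: string;
  toolName: string;
  input?: unknown;
  providerMetadata?: Record<string, unknown>;
  providerExecuted?: boolean;
  title?: string;
};

// Simplified stand-in for the tool UIMessage part shape.
type ToolPart = {
  type: string;
  toolCallId: string;
  toolName: string;
  state: string;
  input?: unknown;
  callProviderMetadata?: Record<string, unknown>;
  providerExecuted?: boolean;
  title?: string;
};

// After the fix: build the part as before, but also carry through the
// three previously-dropped fields when the chunk provides them.
function toToolPart(chunk: StreamChunkData): ToolPart {
  const part: ToolPart = {
    type: `tool-${chunk.toolName}`,
    toolCallId: chunk.toolCallId,
    toolName: chunk.toolName,
    state: "input-available",
    input: chunk.input
  };
  if (chunk.providerMetadata != null) {
    // providerMetadata on the chunk becomes callProviderMetadata on the part,
    // which convertToModelMessages later forwards as providerOptions.
    part.callProviderMetadata = chunk.providerMetadata;
  }
  if (chunk.providerExecuted != null) {
    part.providerExecuted = chunk.providerExecuted;
  }
  if (chunk.title != null) {
    part.title = chunk.title;
  }
  return part;
}
```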

Two additional fields were also missing: providerExecuted (used by convertToModelMessages for provider-executed tools like Gemini code execution) and title (tool display name).

Changes

packages/ai-chat/src/message-builder.ts

  • tool-input-start: Store callProviderMetadata, providerExecuted, and title on new tool parts
  • tool-input-available: Store all three fields on both create and update paths
  • tool-input-error: Store callProviderMetadata and providerExecuted on both create and update paths
  • Added providerExecuted to StreamChunkData type explicitly (was previously only accessible via the [key: string]: unknown index signature)

packages/ai-chat/src/react.tsx

  • Fixed stale comment that described tool-input-start and tool-input-delta as "unrecognized types" — they are handled by applyChunkToParts

packages/ai-chat/src/tests/message-builder.test.ts

  • Added 13 regression tests covering all create/update paths for the three fields
  • Tests use Gemini-style google.thoughtSignature metadata to match the real-world failure case
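The shape of these regression tests can be sketched as follows, with a local `applyChunk` as a hypothetical stand-in for the real applyChunkToParts (the actual tests live in message-builder.test.ts):

```typescript
// Stand-in tool-part type; only the fields the sketch needs.
type ToolPart = {
  type: string;
  toolCallId: string;
  state: string;
  input?: unknown;
  callProviderMetadata?: Record<string, unknown>;
};

// Hypothetical stand-in for applyChunkToParts handling tool-input-available.
function applyChunk(
  parts: ToolPart[],
  chunk: {
    toolCallId: string;
    toolName: string;
    input?: unknown;
    providerMetadata?: Record<string, unknown>;
  }
): void {
  parts.push({
    type: `tool-${chunk.toolName}`,
    toolCallId: chunk.toolCallId,
    state: "input-available",
    input: chunk.input,
    // The behavior under test: metadata must not be dropped.
    ...(chunk.providerMetadata != null
      ? { callProviderMetadata: chunk.providerMetadata }
      : {})
  });
}

// Feed a chunk carrying Gemini-style metadata, then assert it survives.
const parts: ToolPart[] = [];
applyChunk(parts, {
  toolCallId: "call_1",
  toolName: "getWeather",
  input: { city: "Lisbon" },
  providerMetadata: { google: { thoughtSignature: "sig-abc" } }
});

console.assert(
  JSON.stringify(parts[0].callProviderMetadata) ===
    JSON.stringify({ google: { thoughtSignature: "sig-abc" } })
);
```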

Notes for reviewers

  1. Why only tool-input-* handlers? The AI SDK's stream only includes providerMetadata on tool-input-start, tool-input-available, and tool-input-error events. The tool-output-available and tool-output-error events do not carry providerMetadata, so those handlers don't need changes.

  2. Sanitization is safe. _sanitizeMessageForPersistence only strips openai-keyed metadata from callProviderMetadata. Google/Gemini metadata (keyed under google) passes through untouched.

  3. _applyToolResult is safe. It uses { ...part, state: "output-available", output } spread, so callProviderMetadata, providerExecuted, and title from the original tool part are preserved through the update.

  4. All consumers benefit. Server-side streaming (_streamSSEReply), client-side react (react.tsx), and orphaned stream persistence (_persistOrphanedStream) all go through applyChunkToParts, so the fix covers all codepaths.

  5. This is a field-by-field parity fix with the AI SDK's updateToolPart. I audited every field the AI SDK stores and confirmed no other fields are missing after this change.
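Notes 2 and 3 can be illustrated with a minimal sketch. The helpers below are hypothetical stand-ins for _sanitizeMessageForPersistence and _applyToolResult, reduced to the behavior the notes describe:

```typescript
// Stand-in tool-part type with the three preserved fields.
type ToolPart = {
  type: string;
  toolCallId: string;
  state: string;
  output?: unknown;
  callProviderMetadata?: Record<string, unknown>;
  providerExecuted?: boolean;
  title?: string;
};

// Note 2: only openai-keyed metadata is stripped during persistence;
// google-keyed metadata (thoughtSignature) passes through untouched.
function sanitizeCallProviderMetadata(
  meta: Record<string, unknown>
): Record<string, unknown> {
  const { openai: _dropped, ...rest } = meta;
  return rest;
}

// Note 3: spreading the existing part first means any field not explicitly
// overwritten (callProviderMetadata, providerExecuted, title) survives the
// transition to "output-available".
function applyToolResult(part: ToolPart, output: unknown): ToolPart {
  return { ...part, state: "output-available", output };
}

const part: ToolPart = {
  type: "tool-getWeather",
  toolCallId: "call_1",
  state: "input-available",
  callProviderMetadata: sanitizeCallProviderMetadata({
    google: { thoughtSignature: "sig-abc" },
    openai: { itemId: "item_1" }
  }),
  providerExecuted: false,
  title: "Get weather"
};

const updated = applyToolResult(part, { tempC: 21 });
// updated keeps the google metadata (openai was stripped), and
// providerExecuted / title carry through unchanged.
```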

Testing

  • 226 tests pass (13 new), 2 pre-existing MCP timeout flakes unrelated to this change
  • Build clean
  • oxfmt clean

fix: preserve callProviderMetadata, providerExecuted, and title on tool parts

Fixes #1009 — Gemini's thought_signature was being silently dropped
during server-side message building, causing 'Function call is missing
a thought_signature in functionCall parts' errors when addToolOutput
was invoked from the client side.

Root cause: applyChunkToParts in message-builder.ts was not preserving
providerMetadata from tool-input stream chunks as callProviderMetadata
on the resulting tool UIMessage parts. The AI SDK's own client-side
updateToolPart correctly maps chunk.providerMetadata to
part.callProviderMetadata, but our server-side builder was dropping it.

When convertToModelMessages later reads callProviderMetadata to pass as
providerOptions back to the model, the field was undefined, so Gemini
never received its thought_signature back on the continuation call.

The same gap existed for providerExecuted (used by provider-executed
tools like Gemini code execution) and title (tool display name).

Changes:
- message-builder.ts: Preserve callProviderMetadata, providerExecuted,
  and title on tool parts in tool-input-start, tool-input-available,
  and tool-input-error handlers (both create and update paths)
- message-builder.ts: Add providerExecuted to StreamChunkData type
  explicitly for discoverability
- react.tsx: Fix stale comment that described tool-input-start and
  tool-input-delta as 'unrecognized types' (they are handled)
- message-builder.test.ts: Add 13 regression tests covering all
  create/update paths for the three fields

Verified:
- _sanitizeMessageForPersistence only strips openai-keyed metadata,
  so google.thoughtSignature survives persistence
- _applyToolResult uses spread, preserving these fields through updates
- _persistOrphanedStream uses applyChunkToParts, so orphaned stream
  rebuilds also benefit
- All consumers (server streaming, client react, orphaned streams) go
  through the single applyChunkToParts codepath

changeset-bot bot commented Feb 27, 2026

🦋 Changeset detected

Latest commit: e7dde8d

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 1 package:

Name | Type
@cloudflare/ai-chat | Patch



pkg-pr-new bot commented Feb 27, 2026


npm i https://pkg.pr.new/cloudflare/agents@1013
npm i https://pkg.pr.new/cloudflare/agents/@cloudflare/ai-chat@1013
npm i https://pkg.pr.new/cloudflare/agents/@cloudflare/codemode@1013
npm i https://pkg.pr.new/cloudflare/agents/hono-agents@1013

commit: e7dde8d

@threepointone threepointone merged commit 11aaaff into main Feb 28, 2026
4 checks passed
@threepointone threepointone deleted the preserve-provider-metadata branch February 28, 2026 07:41
@github-actions github-actions bot mentioned this pull request Feb 28, 2026


Development

Successfully merging this pull request may close these issues.

Error while invoking the addToolOutput from the client side