`.changeset/five-colts-eat.md`
The `AiInput` and `AiResponse` types have been refactored to allow the inclusion of additional information and metadata from model providers where possible, such as reasoning output and prompt cache token utilization.
In addition, for an `AiResponse` you can now access metadata that is specific to a given provider. For example, when using OpenAi to generate audio, you can check the input and output audio tokens used:
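A rough sketch of what this might look like is shown below. Model provisioning is omitted, and the `getProviderMetadata` accessor, the `OpenAiLanguageModel.ProviderMetadata` tag, and the token field names are assumptions made for illustration, not verbatim API:

```ts
import { AiLanguageModel } from "@effect/ai"
import { OpenAiLanguageModel } from "@effect/ai-openai"
import { Effect, Option } from "effect"

const main = Effect.gen(function*() {
  // Generate audio output with an audio-capable OpenAi model
  // (assumes an OpenAi language model layer is provided elsewhere)
  const response = yield* AiLanguageModel.generateText({
    prompt: "Tell me a short bedtime story"
  })

  // Hypothetical accessor for OpenAi-specific response metadata
  const metadata = response.getProviderMetadata(OpenAiLanguageModel.ProviderMetadata)

  Option.match(metadata, {
    onNone: () => console.log("No OpenAi-specific metadata available"),
    onSome: (meta) => {
      // Hypothetical field names for audio token usage
      console.log("Input audio tokens:", meta.inputAudioTokens)
      console.log("Output audio tokens:", meta.outputAudioTokens)
    }
  })
})
```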
The `AiToolkit` has been completely refactored to simplify creating a collection of tools and using those tools in requests to model providers. A new `AiTool` data type has also been introduced to simplify defining tools for a toolkit. `AiToolkit.implement` has been renamed to `AiToolkit.toLayer` for clarity, and defining handlers is now very similar to the way handlers are defined in the `@effect/rpc` library.
In addition, you can now control how many sequential steps are performed by `AiLanguageModel.generateText` and `AiLanguageModel.streamText` via the `maxSteps` option. For example, if `maxSteps` is set to a value greater than `1` and the language model invokes any tools, these methods will take care of resolving the tool calls and returning the results to the language model for subsequent generation (up to the maximum number of steps specified).
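As a rough sketch of the call shape (reusing the `DadJokeTools` toolkit defined in the complete example below; the `toolkit` option name is an assumption):

```ts
import { AiLanguageModel } from "@effect/ai"

// `DadJokeTools` is defined in the complete toolkit example below.
// With `maxSteps: 5`, up to five generate -> resolve tool calls ->
// generate rounds are performed before the final result is returned.
const withTools = AiLanguageModel.generateText({
  prompt: "Tell me a dad joke about pirates",
  toolkit: DadJokeTools,
  maxSteps: 5
})
```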
A complete example of an `AiToolkit` implementation and usage can be found below:
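Since the original snippet is not preserved here, the following is a reconstruction under stated assumptions: the `AiTool.make` option names (`description`, `parameters`, `success`), the handler record passed to `toLayer`, and the `toolkit` option on `generateText` mirror the `@effect/rpc` style described above but are illustrative rather than verbatim:

```ts
import { AiLanguageModel, AiTool, AiToolkit } from "@effect/ai"
import { Effect, Schema } from "effect"

// Define a toolkit containing a single tool. The option names passed
// to `AiTool.make` are assumptions based on the description above.
class DadJokeTools extends AiToolkit.make(
  AiTool.make("GetDadJoke", {
    description: "Get a hilarious dad joke",
    parameters: {
      searchTerm: Schema.String.annotations({
        description: "The search term to use when looking for a joke"
      })
    },
    success: Schema.String
  })
) {}

// Provide handlers for the toolkit's tools. As in `@effect/rpc`, the
// handlers form a record keyed by tool name, and `toLayer` (formerly
// `AiToolkit.implement`) turns them into a `Layer`.
const DadJokeToolHandlers = DadJokeTools.toLayer({
  GetDadJoke: ({ searchTerm }) =>
    Effect.succeed(`Here is a (stubbed) dad joke about ${searchTerm}`)
})

// Use the toolkit in a request. With `maxSteps` greater than 1, any
// tool calls are resolved and fed back to the model automatically.
const main = AiLanguageModel.generateText({
  prompt: "Tell me a dad joke about computers",
  toolkit: DadJokeTools,
  maxSteps: 3
}).pipe(Effect.provide(DadJokeToolHandlers))
```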