ai 6.0.31 → 6.0.33
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/CHANGELOG.md +12 -0
- package/dist/index.js +12 -2
- package/dist/index.js.map +1 -1
- package/dist/index.mjs +12 -2
- package/dist/index.mjs.map +1 -1
- package/dist/internal/index.js +1 -1
- package/dist/internal/index.mjs +1 -1
- package/docs/00-introduction/index.mdx +76 -0
- package/docs/02-foundations/01-overview.mdx +43 -0
- package/docs/02-foundations/02-providers-and-models.mdx +163 -0
- package/docs/02-foundations/03-prompts.mdx +620 -0
- package/docs/02-foundations/04-tools.mdx +160 -0
- package/docs/02-foundations/05-streaming.mdx +62 -0
- package/docs/02-foundations/index.mdx +43 -0
- package/docs/02-getting-started/00-choosing-a-provider.mdx +110 -0
- package/docs/02-getting-started/01-navigating-the-library.mdx +85 -0
- package/docs/02-getting-started/02-nextjs-app-router.mdx +556 -0
- package/docs/02-getting-started/03-nextjs-pages-router.mdx +542 -0
- package/docs/02-getting-started/04-svelte.mdx +627 -0
- package/docs/02-getting-started/05-nuxt.mdx +566 -0
- package/docs/02-getting-started/06-nodejs.mdx +512 -0
- package/docs/02-getting-started/07-expo.mdx +766 -0
- package/docs/02-getting-started/08-tanstack-start.mdx +583 -0
- package/docs/02-getting-started/index.mdx +44 -0
- package/docs/03-agents/01-overview.mdx +96 -0
- package/docs/03-agents/02-building-agents.mdx +367 -0
- package/docs/03-agents/03-workflows.mdx +370 -0
- package/docs/03-agents/04-loop-control.mdx +350 -0
- package/docs/03-agents/05-configuring-call-options.mdx +286 -0
- package/docs/03-agents/index.mdx +40 -0
- package/docs/03-ai-sdk-core/01-overview.mdx +33 -0
- package/docs/03-ai-sdk-core/05-generating-text.mdx +600 -0
- package/docs/03-ai-sdk-core/10-generating-structured-data.mdx +662 -0
- package/docs/03-ai-sdk-core/15-tools-and-tool-calling.mdx +1102 -0
- package/docs/03-ai-sdk-core/16-mcp-tools.mdx +375 -0
- package/docs/03-ai-sdk-core/20-prompt-engineering.mdx +144 -0
- package/docs/03-ai-sdk-core/25-settings.mdx +198 -0
- package/docs/03-ai-sdk-core/30-embeddings.mdx +247 -0
- package/docs/03-ai-sdk-core/31-reranking.mdx +218 -0
- package/docs/03-ai-sdk-core/35-image-generation.mdx +341 -0
- package/docs/03-ai-sdk-core/36-transcription.mdx +173 -0
- package/docs/03-ai-sdk-core/37-speech.mdx +167 -0
- package/docs/03-ai-sdk-core/40-middleware.mdx +480 -0
- package/docs/03-ai-sdk-core/45-provider-management.mdx +349 -0
- package/docs/03-ai-sdk-core/50-error-handling.mdx +149 -0
- package/docs/03-ai-sdk-core/55-testing.mdx +218 -0
- package/docs/03-ai-sdk-core/60-telemetry.mdx +313 -0
- package/docs/03-ai-sdk-core/65-devtools.mdx +107 -0
- package/docs/03-ai-sdk-core/index.mdx +88 -0
- package/docs/04-ai-sdk-ui/01-overview.mdx +44 -0
- package/docs/04-ai-sdk-ui/02-chatbot.mdx +1313 -0
- package/docs/04-ai-sdk-ui/03-chatbot-message-persistence.mdx +535 -0
- package/docs/04-ai-sdk-ui/03-chatbot-resume-streams.mdx +263 -0
- package/docs/04-ai-sdk-ui/03-chatbot-tool-usage.mdx +682 -0
- package/docs/04-ai-sdk-ui/04-generative-user-interfaces.mdx +389 -0
- package/docs/04-ai-sdk-ui/05-completion.mdx +186 -0
- package/docs/04-ai-sdk-ui/08-object-generation.mdx +344 -0
- package/docs/04-ai-sdk-ui/20-streaming-data.mdx +397 -0
- package/docs/04-ai-sdk-ui/21-error-handling.mdx +190 -0
- package/docs/04-ai-sdk-ui/21-transport.mdx +174 -0
- package/docs/04-ai-sdk-ui/24-reading-ui-message-streams.mdx +104 -0
- package/docs/04-ai-sdk-ui/25-message-metadata.mdx +152 -0
- package/docs/04-ai-sdk-ui/50-stream-protocol.mdx +477 -0
- package/docs/04-ai-sdk-ui/index.mdx +64 -0
- package/docs/05-ai-sdk-rsc/01-overview.mdx +45 -0
- package/docs/05-ai-sdk-rsc/02-streaming-react-components.mdx +209 -0
- package/docs/05-ai-sdk-rsc/03-generative-ui-state.mdx +279 -0
- package/docs/05-ai-sdk-rsc/03-saving-and-restoring-states.mdx +105 -0
- package/docs/05-ai-sdk-rsc/04-multistep-interfaces.mdx +282 -0
- package/docs/05-ai-sdk-rsc/05-streaming-values.mdx +158 -0
- package/docs/05-ai-sdk-rsc/06-loading-state.mdx +273 -0
- package/docs/05-ai-sdk-rsc/08-error-handling.mdx +96 -0
- package/docs/05-ai-sdk-rsc/09-authentication.mdx +42 -0
- package/docs/05-ai-sdk-rsc/10-migrating-to-ui.mdx +722 -0
- package/docs/05-ai-sdk-rsc/index.mdx +58 -0
- package/docs/06-advanced/01-prompt-engineering.mdx +96 -0
- package/docs/06-advanced/02-stopping-streams.mdx +184 -0
- package/docs/06-advanced/03-backpressure.mdx +173 -0
- package/docs/06-advanced/04-caching.mdx +169 -0
- package/docs/06-advanced/05-multiple-streamables.mdx +68 -0
- package/docs/06-advanced/06-rate-limiting.mdx +60 -0
- package/docs/06-advanced/07-rendering-ui-with-language-models.mdx +213 -0
- package/docs/06-advanced/08-model-as-router.mdx +120 -0
- package/docs/06-advanced/09-multistep-interfaces.mdx +115 -0
- package/docs/06-advanced/09-sequential-generations.mdx +55 -0
- package/docs/06-advanced/10-vercel-deployment-guide.mdx +117 -0
- package/docs/06-advanced/index.mdx +11 -0
- package/docs/07-reference/01-ai-sdk-core/01-generate-text.mdx +2142 -0
- package/docs/07-reference/01-ai-sdk-core/02-stream-text.mdx +3215 -0
- package/docs/07-reference/01-ai-sdk-core/03-generate-object.mdx +780 -0
- package/docs/07-reference/01-ai-sdk-core/04-stream-object.mdx +1140 -0
- package/docs/07-reference/01-ai-sdk-core/05-embed.mdx +190 -0
- package/docs/07-reference/01-ai-sdk-core/06-embed-many.mdx +171 -0
- package/docs/07-reference/01-ai-sdk-core/06-rerank.mdx +309 -0
- package/docs/07-reference/01-ai-sdk-core/10-generate-image.mdx +227 -0
- package/docs/07-reference/01-ai-sdk-core/11-transcribe.mdx +138 -0
- package/docs/07-reference/01-ai-sdk-core/12-generate-speech.mdx +214 -0
- package/docs/07-reference/01-ai-sdk-core/15-agent.mdx +203 -0
- package/docs/07-reference/01-ai-sdk-core/16-tool-loop-agent.mdx +449 -0
- package/docs/07-reference/01-ai-sdk-core/17-create-agent-ui-stream.mdx +148 -0
- package/docs/07-reference/01-ai-sdk-core/18-create-agent-ui-stream-response.mdx +168 -0
- package/docs/07-reference/01-ai-sdk-core/18-pipe-agent-ui-stream-to-response.mdx +144 -0
- package/docs/07-reference/01-ai-sdk-core/20-tool.mdx +196 -0
- package/docs/07-reference/01-ai-sdk-core/22-dynamic-tool.mdx +175 -0
- package/docs/07-reference/01-ai-sdk-core/23-create-mcp-client.mdx +410 -0
- package/docs/07-reference/01-ai-sdk-core/24-mcp-stdio-transport.mdx +68 -0
- package/docs/07-reference/01-ai-sdk-core/25-json-schema.mdx +94 -0
- package/docs/07-reference/01-ai-sdk-core/26-zod-schema.mdx +109 -0
- package/docs/07-reference/01-ai-sdk-core/27-valibot-schema.mdx +55 -0
- package/docs/07-reference/01-ai-sdk-core/28-output.mdx +342 -0
- package/docs/07-reference/01-ai-sdk-core/30-model-message.mdx +415 -0
- package/docs/07-reference/01-ai-sdk-core/31-ui-message.mdx +246 -0
- package/docs/07-reference/01-ai-sdk-core/32-validate-ui-messages.mdx +101 -0
- package/docs/07-reference/01-ai-sdk-core/33-safe-validate-ui-messages.mdx +113 -0
- package/docs/07-reference/01-ai-sdk-core/40-provider-registry.mdx +182 -0
- package/docs/07-reference/01-ai-sdk-core/42-custom-provider.mdx +121 -0
- package/docs/07-reference/01-ai-sdk-core/50-cosine-similarity.mdx +52 -0
- package/docs/07-reference/01-ai-sdk-core/60-wrap-language-model.mdx +59 -0
- package/docs/07-reference/01-ai-sdk-core/61-wrap-image-model.mdx +64 -0
- package/docs/07-reference/01-ai-sdk-core/65-language-model-v2-middleware.mdx +46 -0
- package/docs/07-reference/01-ai-sdk-core/66-extract-reasoning-middleware.mdx +68 -0
- package/docs/07-reference/01-ai-sdk-core/67-simulate-streaming-middleware.mdx +71 -0
- package/docs/07-reference/01-ai-sdk-core/68-default-settings-middleware.mdx +80 -0
- package/docs/07-reference/01-ai-sdk-core/69-add-tool-input-examples-middleware.mdx +155 -0
- package/docs/07-reference/01-ai-sdk-core/70-extract-json-middleware.mdx +147 -0
- package/docs/07-reference/01-ai-sdk-core/70-step-count-is.mdx +84 -0
- package/docs/07-reference/01-ai-sdk-core/71-has-tool-call.mdx +120 -0
- package/docs/07-reference/01-ai-sdk-core/75-simulate-readable-stream.mdx +94 -0
- package/docs/07-reference/01-ai-sdk-core/80-smooth-stream.mdx +145 -0
- package/docs/07-reference/01-ai-sdk-core/90-generate-id.mdx +43 -0
- package/docs/07-reference/01-ai-sdk-core/91-create-id-generator.mdx +89 -0
- package/docs/07-reference/01-ai-sdk-core/index.mdx +159 -0
- package/docs/07-reference/02-ai-sdk-ui/01-use-chat.mdx +446 -0
- package/docs/07-reference/02-ai-sdk-ui/02-use-completion.mdx +179 -0
- package/docs/07-reference/02-ai-sdk-ui/03-use-object.mdx +178 -0
- package/docs/07-reference/02-ai-sdk-ui/31-convert-to-model-messages.mdx +230 -0
- package/docs/07-reference/02-ai-sdk-ui/32-prune-messages.mdx +108 -0
- package/docs/07-reference/02-ai-sdk-ui/40-create-ui-message-stream.mdx +151 -0
- package/docs/07-reference/02-ai-sdk-ui/41-create-ui-message-stream-response.mdx +113 -0
- package/docs/07-reference/02-ai-sdk-ui/42-pipe-ui-message-stream-to-response.mdx +73 -0
- package/docs/07-reference/02-ai-sdk-ui/43-read-ui-message-stream.mdx +57 -0
- package/docs/07-reference/02-ai-sdk-ui/46-infer-ui-tools.mdx +99 -0
- package/docs/07-reference/02-ai-sdk-ui/47-infer-ui-tool.mdx +75 -0
- package/docs/07-reference/02-ai-sdk-ui/50-direct-chat-transport.mdx +333 -0
- package/docs/07-reference/02-ai-sdk-ui/index.mdx +89 -0
- package/docs/07-reference/03-ai-sdk-rsc/01-stream-ui.mdx +767 -0
- package/docs/07-reference/03-ai-sdk-rsc/02-create-ai.mdx +90 -0
- package/docs/07-reference/03-ai-sdk-rsc/03-create-streamable-ui.mdx +91 -0
- package/docs/07-reference/03-ai-sdk-rsc/04-create-streamable-value.mdx +48 -0
- package/docs/07-reference/03-ai-sdk-rsc/05-read-streamable-value.mdx +78 -0
- package/docs/07-reference/03-ai-sdk-rsc/06-get-ai-state.mdx +50 -0
- package/docs/07-reference/03-ai-sdk-rsc/07-get-mutable-ai-state.mdx +70 -0
- package/docs/07-reference/03-ai-sdk-rsc/08-use-ai-state.mdx +26 -0
- package/docs/07-reference/03-ai-sdk-rsc/09-use-actions.mdx +42 -0
- package/docs/07-reference/03-ai-sdk-rsc/10-use-ui-state.mdx +35 -0
- package/docs/07-reference/03-ai-sdk-rsc/11-use-streamable-value.mdx +46 -0
- package/docs/07-reference/03-ai-sdk-rsc/20-render.mdx +262 -0
- package/docs/07-reference/03-ai-sdk-rsc/index.mdx +67 -0
- package/docs/07-reference/04-stream-helpers/01-ai-stream.mdx +89 -0
- package/docs/07-reference/04-stream-helpers/02-streaming-text-response.mdx +79 -0
- package/docs/07-reference/04-stream-helpers/05-stream-to-response.mdx +108 -0
- package/docs/07-reference/04-stream-helpers/07-openai-stream.mdx +77 -0
- package/docs/07-reference/04-stream-helpers/08-anthropic-stream.mdx +79 -0
- package/docs/07-reference/04-stream-helpers/09-aws-bedrock-stream.mdx +91 -0
- package/docs/07-reference/04-stream-helpers/10-aws-bedrock-anthropic-stream.mdx +96 -0
- package/docs/07-reference/04-stream-helpers/10-aws-bedrock-messages-stream.mdx +96 -0
- package/docs/07-reference/04-stream-helpers/11-aws-bedrock-cohere-stream.mdx +93 -0
- package/docs/07-reference/04-stream-helpers/12-aws-bedrock-llama-2-stream.mdx +93 -0
- package/docs/07-reference/04-stream-helpers/13-cohere-stream.mdx +78 -0
- package/docs/07-reference/04-stream-helpers/14-google-generative-ai-stream.mdx +85 -0
- package/docs/07-reference/04-stream-helpers/15-hugging-face-stream.mdx +84 -0
- package/docs/07-reference/04-stream-helpers/16-langchain-adapter.mdx +98 -0
- package/docs/07-reference/04-stream-helpers/16-llamaindex-adapter.mdx +70 -0
- package/docs/07-reference/04-stream-helpers/17-mistral-stream.mdx +81 -0
- package/docs/07-reference/04-stream-helpers/18-replicate-stream.mdx +83 -0
- package/docs/07-reference/04-stream-helpers/19-inkeep-stream.mdx +80 -0
- package/docs/07-reference/04-stream-helpers/index.mdx +103 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-api-call-error.mdx +30 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-download-error.mdx +27 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-empty-response-body-error.mdx +24 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-invalid-argument-error.mdx +26 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-invalid-data-content-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-invalid-data-content.mdx +26 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-invalid-message-role-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-invalid-prompt-error.mdx +47 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-invalid-response-data-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-invalid-tool-approval-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-invalid-tool-input-error.mdx +27 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-json-parse-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-load-api-key-error.mdx +24 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-load-setting-error.mdx +24 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-message-conversion-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-no-content-generated-error.mdx +24 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-no-image-generated-error.mdx +36 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-no-object-generated-error.mdx +43 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-no-speech-generated-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-no-such-model-error.mdx +26 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-no-such-provider-error.mdx +28 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-no-such-tool-error.mdx +26 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-no-transcript-generated-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-retry-error.mdx +27 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-too-many-embedding-values-for-call-error.mdx +27 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-tool-call-not-found-for-approval-error.mdx +26 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-tool-call-repair-error.mdx +28 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-type-validation-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/ai-unsupported-functionality-error.mdx +25 -0
- package/docs/07-reference/05-ai-sdk-errors/index.mdx +38 -0
- package/docs/07-reference/index.mdx +34 -0
- package/docs/08-migration-guides/00-versioning.mdx +46 -0
- package/docs/08-migration-guides/24-migration-guide-6-0.mdx +823 -0
- package/docs/08-migration-guides/25-migration-guide-5-0-data.mdx +882 -0
- package/docs/08-migration-guides/26-migration-guide-5-0.mdx +3427 -0
- package/docs/08-migration-guides/27-migration-guide-4-2.mdx +99 -0
- package/docs/08-migration-guides/28-migration-guide-4-1.mdx +14 -0
- package/docs/08-migration-guides/29-migration-guide-4-0.mdx +1157 -0
- package/docs/08-migration-guides/36-migration-guide-3-4.mdx +14 -0
- package/docs/08-migration-guides/37-migration-guide-3-3.mdx +64 -0
- package/docs/08-migration-guides/38-migration-guide-3-2.mdx +46 -0
- package/docs/08-migration-guides/39-migration-guide-3-1.mdx +168 -0
- package/docs/08-migration-guides/index.mdx +22 -0
- package/docs/09-troubleshooting/01-azure-stream-slow.mdx +33 -0
- package/docs/09-troubleshooting/02-client-side-function-calls-not-invoked.mdx +22 -0
- package/docs/09-troubleshooting/03-server-actions-in-client-components.mdx +40 -0
- package/docs/09-troubleshooting/04-strange-stream-output.mdx +36 -0
- package/docs/09-troubleshooting/05-streamable-ui-errors.mdx +16 -0
- package/docs/09-troubleshooting/05-tool-invocation-missing-result.mdx +106 -0
- package/docs/09-troubleshooting/06-streaming-not-working-when-deployed.mdx +31 -0
- package/docs/09-troubleshooting/06-streaming-not-working-when-proxied.mdx +31 -0
- package/docs/09-troubleshooting/06-timeout-on-vercel.mdx +60 -0
- package/docs/09-troubleshooting/07-unclosed-streams.mdx +34 -0
- package/docs/09-troubleshooting/08-use-chat-failed-to-parse-stream.mdx +26 -0
- package/docs/09-troubleshooting/09-client-stream-error.mdx +25 -0
- package/docs/09-troubleshooting/10-use-chat-tools-no-response.mdx +32 -0
- package/docs/09-troubleshooting/11-use-chat-custom-request-options.mdx +149 -0
- package/docs/09-troubleshooting/12-typescript-performance-zod.mdx +46 -0
- package/docs/09-troubleshooting/12-use-chat-an-error-occurred.mdx +59 -0
- package/docs/09-troubleshooting/13-repeated-assistant-messages.mdx +73 -0
- package/docs/09-troubleshooting/14-stream-abort-handling.mdx +73 -0
- package/docs/09-troubleshooting/14-tool-calling-with-structured-outputs.mdx +48 -0
- package/docs/09-troubleshooting/15-abort-breaks-resumable-streams.mdx +55 -0
- package/docs/09-troubleshooting/15-stream-text-not-working.mdx +33 -0
- package/docs/09-troubleshooting/16-streaming-status-delay.mdx +63 -0
- package/docs/09-troubleshooting/17-use-chat-stale-body-data.mdx +141 -0
- package/docs/09-troubleshooting/18-ontoolcall-type-narrowing.mdx +66 -0
- package/docs/09-troubleshooting/19-unsupported-model-version.mdx +50 -0
- package/docs/09-troubleshooting/20-no-object-generated-content-filter.mdx +72 -0
- package/docs/09-troubleshooting/30-model-is-not-assignable-to-type.mdx +21 -0
- package/docs/09-troubleshooting/40-typescript-cannot-find-namespace-jsx.mdx +24 -0
- package/docs/09-troubleshooting/50-react-maximum-update-depth-exceeded.mdx +39 -0
- package/docs/09-troubleshooting/60-jest-cannot-find-module-ai-rsc.mdx +22 -0
- package/docs/09-troubleshooting/index.mdx +11 -0
- package/package.json +8 -4
package/docs/08-migration-guides/36-migration-guide-3-4.mdx

@@ -0,0 +1,14 @@
+---
+title: Migrate AI SDK 3.3 to 3.4
+description: Learn how to upgrade AI SDK 3.3 to 3.4.
+---
+
+# Migrate AI SDK 3.3 to 3.4
+
+<Note>
+  Check out the [AI SDK 3.4 release blog
+  post](https://vercel.com/blog/ai-sdk-3-4) for more information about the
+  release.
+</Note>
+
+No breaking changes in this release.
package/docs/08-migration-guides/37-migration-guide-3-3.mdx

@@ -0,0 +1,64 @@
+---
+title: Migrate AI SDK 3.2 to 3.3
+description: Learn how to upgrade AI SDK 3.2 to 3.3.
+---
+
+# Migrate AI SDK 3.2 to 3.3
+
+<Note>
+  Check out the [AI SDK 3.3 release blog
+  post](https://vercel.com/blog/vercel-ai-sdk-3-3) for more information about
+  the release.
+</Note>
+
+No breaking changes in this release.
+
+The following changelog encompasses all changes made in the 3.2.x series,
+introducing significant improvements and new features across the AI SDK and its associated libraries:
+
+## New Features
+
+### OpenTelemetry Support
+
+- Added experimental [OpenTelemetry support](/docs/ai-sdk-core/telemetry#telemetry) for all [AI SDK Core functions](/docs/ai-sdk-core/overview#ai-sdk-core-functions), enabling better observability and tracing capabilities.
+
+### AI SDK UI Improvements
+
+- Introduced the experimental **`useObject`** hook (for React) that can be used in conjunction with **`streamObject`** on the backend to enable seamless streaming of structured data.
+- Enhanced **`useChat`** with experimental support for attachments and streaming tool calls, providing more versatile chat functionality.
+- Patched **`useChat`** to prevent empty submissions, ensuring that only intended inputs are processed.
+- Fixed **`useChat`**'s **`reload`** function to correctly send data, body, and headers.
+- Implemented the **`setThreadId`** helper for **`useAssistant`**, simplifying thread management.
+- Documented the stream data protocol for **`useChat`** and **`useCompletion`**, allowing developers to use these functions with any backend. The stream data protocol also enables the use of custom frontends with **`streamText`**.
+- Added support for custom fetch functions and request body customization, offering greater control over API interactions.
+- Added **`onFinish`** to the **`useChat`** hook for access to token usage and finish reason.
+
+### Core Enhancements
+
+- Implemented support for sending custom request headers, enabling more tailored API requests.
+- Added raw JSON schema support alongside existing Zod support, providing more options for schema and data validation.
+- Introduced usage information for the **`embed`** and **`embedMany`** functions, offering insights into token usage.
+- Added support for additional settings including **`stopSequences`** and **`topK`**, allowing for finer control over text generation.
+- Provided access to information for all steps of **`generateText`**, including intermediate tool calls and results.
+
+### New Providers
+
+- [AWS Bedrock provider](/providers/ai-sdk-providers/amazon-bedrock).
+
+### Provider Improvements
+
+- Enhanced existing providers including Anthropic, Google, Azure, and OpenAI with various improvements and bug fixes.
+- Upgraded the LangChain adapter with StreamEvent v2 support and introduced the **`toDataStreamResponse`** function, enabling conversion of LangChain output streams to data stream responses.
+- Added legacy function calling support to the OpenAI provider.
+- Updated the Mistral AI provider with fixes and improvements for tool calling support.
+
+### UI Framework Support Expansion
+
+- SolidJS: Updated **`useChat`** and **`useCompletion`** to achieve feature parity with the React implementations.
+- Vue.js: Introduced the **`useAssistant`** hook.
+- Vue.js / Nuxt: [Updated examples](https://github.com/vercel/ai/tree/main/examples/nuxt-openai) to showcase the latest features and best practices.
+- Svelte: Added tool calling support to **`useChat`**.
+
+## Fixes and Improvements
+
+- Resolved various issues across different components of the SDK, including race conditions, error handling, and state management.
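The `embed` and `embedMany` additions mentioned above pair naturally with the SDK's `cosineSimilarity` helper (listed in the reference file list earlier in this diff), which scores how close two embedding vectors are. As a minimal, self-contained sketch of that computation (illustrative only, not the SDK's implementation):

```typescript
// Cosine similarity between two embedding vectors: dot(a, b) / (|a| * |b|).
// Returns 1 for identical directions, 0 for orthogonal vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  if (a.length !== b.length) {
    throw new Error('vectors must have the same length');
  }
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0, 0], [1, 0, 0])); // 1 (identical direction)
console.log(cosineSimilarity([1, 0, 0], [0, 1, 0])); // 0 (orthogonal)
```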
package/docs/08-migration-guides/38-migration-guide-3-2.mdx

@@ -0,0 +1,46 @@
+---
+title: Migrate AI SDK 3.1 to 3.2
+description: Learn how to upgrade AI SDK 3.1 to 3.2.
+---
+
+# Migrate AI SDK 3.1 to 3.2
+
+<Note>
+  Check out the [AI SDK 3.2 release blog
+  post](https://vercel.com/blog/introducing-vercel-ai-sdk-3-2) for more
+  information about the release.
+</Note>
+
+This guide will help you upgrade to AI SDK 3.2:
+
+- Experimental `StreamingReactResponse` functionality has been removed
+- Several features have been deprecated
+- UI framework integrations have moved to their own Node modules
+
+## Upgrading
+
+### AI SDK
+
+To update to AI SDK version 3.2, run the following command using your preferred package manager:
+
+<Snippet text="pnpm add ai@latest" />
+
+## Removed Functionality
+
+The experimental `StreamingReactResponse` has been removed. You can use [AI SDK RSC](/docs/ai-sdk-rsc/overview) to build streaming UIs.
+
+## Deprecated Functionality
+
+The `nanoid` export has been deprecated. Please use [`generateId`](/docs/reference/ai-sdk-core/generate-id) instead.
+
+## UI Package Separation
+
+AI SDK UI supports several frameworks: [React](https://react.dev/), [Svelte](https://svelte.dev/), [Vue.js](https://vuejs.org/), and [SolidJS](https://www.solidjs.com/).
+
+The integrations (other than React and RSC) have moved to separate Node modules. You need to update the import and require statements as follows:
+
+- Change `ai/svelte` to `@ai-sdk/svelte`
+- Change `ai/vue` to `@ai-sdk/vue`
+- Change `ai/solid` to `@ai-sdk/solid`
+
+The old exports are still available but will be removed in a future release.
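The `nanoid` deprecation above is a straight rename for most callers: `generateId` likewise returns a short alphanumeric string. A rough, hypothetical sketch of the shape of such a generator (the real `generateId` alphabet and implementation may differ):

```typescript
// Hypothetical stand-in for an ID generator like generateId:
// random characters drawn from an alphanumeric alphabet.
const ALPHABET =
  'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789';

function makeId(length = 16): string {
  let id = '';
  for (let i = 0; i < length; i++) {
    id += ALPHABET[Math.floor(Math.random() * ALPHABET.length)];
  }
  return id;
}

console.log(makeId()); // e.g. "aZ3kP0qR7sT1uVwX"
```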
package/docs/08-migration-guides/39-migration-guide-3-1.mdx

@@ -0,0 +1,168 @@
+---
+title: Migrate AI SDK 3.0 to 3.1
+description: Learn how to upgrade AI SDK 3.0 to 3.1.
+---
+
+# Migrate AI SDK 3.0 to 3.1
+
+<Note>
+  Check out the [AI SDK 3.1 release blog
+  post](https://vercel.com/blog/vercel-ai-sdk-3-1-modelfusion-joins-the-team)
+  for more information about the release.
+</Note>
+
+This guide will help you:
+
+- Upgrade to AI SDK 3.1
+- Migrate from Legacy Providers to AI SDK Core
+- Migrate from [`render`](/docs/reference/ai-sdk-rsc/render) to [`streamUI`](/docs/reference/ai-sdk-rsc/stream-ui)
+
+Upgrading to AI SDK 3.1 does not require using the newly released AI SDK Core API or [`streamUI`](/docs/reference/ai-sdk-rsc/stream-ui) function.
+
+## Upgrading
+
+### AI SDK
+
+To update to AI SDK version 3.1, run the following command using your preferred package manager:
+
+<Snippet text="pnpm add ai@3.1" />
+
+## Next Steps
+
+The release of AI SDK 3.1 introduces several new features that improve the way you build AI applications with the SDK:
+
+- AI SDK Core, a brand new unified API for interacting with large language models (LLMs).
+- [`streamUI`](/docs/reference/ai-sdk-rsc/stream-ui), a new abstraction built upon AI SDK Core functions that simplifies building streaming UIs.
+
+## Migrating from Legacy Providers to AI SDK Core
+
+Prior to AI SDK Core, you had to use a model provider's SDK to query their models.
+
+In the following Route Handler, you use the OpenAI SDK to query their model. You then pipe that response into the [`OpenAIStream`](/docs/reference/stream-helpers/openai-stream) function, which returns a [`ReadableStream`](https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream) that you can pass to the client using a new [`StreamingTextResponse`](/docs/reference/stream-helpers/streaming-text-response).
+
+```tsx
+import OpenAI from 'openai';
+import { OpenAIStream, StreamingTextResponse } from 'ai';
+
+const openai = new OpenAI({
+  apiKey: process.env.OPENAI_API_KEY!,
+});
+
+export async function POST(req: Request) {
+  const { messages } = await req.json();
+
+  const response = await openai.chat.completions.create({
+    model: 'gpt-4.1',
+    stream: true,
+    messages,
+  });
+
+  const stream = OpenAIStream(response);
+
+  return new StreamingTextResponse(stream);
+}
+```
+
+With AI SDK Core, you have a unified API for any provider that implements the [AI SDK Language Model Specification](/providers/community-providers/custom-providers).
+
+Let's take a look at the example above, refactored to use the AI SDK Core API alongside the AI SDK OpenAI provider. In this example, you import the LLM function you want to use from the `ai` package, import the OpenAI provider from `@ai-sdk/openai`, and then call the model and return the response using the `toUIMessageStreamResponse()` helper function.
+
+```tsx
+import { streamText } from 'ai';
+import { openai } from '@ai-sdk/openai';
+__PROVIDER_IMPORT__;
+
+export async function POST(req: Request) {
+  const { messages } = await req.json();
+
+  const result = await streamText({
+    model: __MODEL__,
+    messages,
+  });

+  return result.toUIMessageStreamResponse();
+}
+```
+
+## Migrating from `render` to `streamUI`
+
+The AI SDK RSC API was launched as part of version 3.0. This API introduced the [`render`](/docs/reference/ai-sdk-rsc/render) function, a helper function to create streamable UIs with OpenAI models. The new AI SDK Core API made streamable UIs possible with any compatible provider.
+
+The following example Server Action uses the `render` function with the model provider directly from OpenAI. You first create an OpenAI provider instance with the OpenAI SDK. Then, you pass it to the `provider` key of the `render` function alongside a tool that returns a React Server Component, defined in the `render` key of the tool.
+
+```tsx
+import { render } from '@ai-sdk/rsc';
+import OpenAI from 'openai';
+import { z } from 'zod';
+import { Spinner, Weather } from '@/components';
+import { getWeather } from '@/utils';
+
+const openai = new OpenAI();
+
+async function submitMessage(userInput = 'What is the weather in SF?') {
+  'use server';
+
+  return render({
+    provider: openai,
+    model: 'gpt-4.1',
+    messages: [
+      { role: 'system', content: 'You are a helpful assistant' },
+      { role: 'user', content: userInput },
+    ],
+    text: ({ content }) => <p>{content}</p>,
+    tools: {
+      get_city_weather: {
+        description: 'Get the current weather for a city',
+        parameters: z
+          .object({
+            city: z.string().describe('the city'),
+          })
+          .required(),
+        render: async function* ({ city }) {
+          yield <Spinner />;
+          const weather = await getWeather(city);
+          return <Weather info={weather} />;
+        },
+      },
+    },
+  });
+}
+```
+
+With the new [`streamUI`](/docs/reference/ai-sdk-rsc/stream-ui) function, you can now use any compatible AI SDK provider. In this example, you import the AI SDK OpenAI provider. Then, you pass it to the [`model`](/docs/reference/ai-sdk-rsc/stream-ui#model) key of the new [`streamUI`](/docs/reference/ai-sdk-rsc/stream-ui) function. Finally, you declare a tool and return a React Server Component, defined in the [`generate`](/docs/reference/ai-sdk-rsc/stream-ui#tools-generate) key of the tool.
+
+```tsx
+import { streamUI } from '@ai-sdk/rsc';
+import { openai } from '@ai-sdk/openai';
+import { z } from 'zod';
+import { Spinner, Weather } from '@/components';
+import { getWeather } from '@/utils';
+
+async function submitMessage(userInput = 'What is the weather in SF?') {
+  'use server';
+
+  const result = await streamUI({
+    model: __MODEL__,
+    system: 'You are a helpful assistant',
+    messages: [{ role: 'user', content: userInput }],
+    text: ({ content }) => <p>{content}</p>,
+    tools: {
+      get_city_weather: {
+        description: 'Get the current weather for a city',
+        parameters: z
+          .object({
+            city: z.string().describe('Name of the city'),
+          })
+          .required(),
+        generate: async function* ({ city }) {
+          yield <Spinner />;
+          const weather = await getWeather(city);
+          return <Weather info={weather} />;
+        },
+      },
+    },
+  });
+
+  return result.value;
+}
```
@@ -0,0 +1,22 @@
---
title: Migration Guides
description: Learn how to upgrade between Vercel AI versions.
collapsed: true
---

# Migration Guides

- [Migrate AI SDK 5.x to 6.0](/docs/migration-guides/migration-guide-6-0)
- [Migrate AI SDK 4.x to 5.0](/docs/migration-guides/migration-guide-5-0)
- [Migrate your data to AI SDK 5.0](/docs/migration-guides/migration-guide-5-0-data)
- [Migrate AI SDK 4.1 to 4.2](/docs/migration-guides/migration-guide-4-2)
- [Migrate AI SDK 4.0 to 4.1](/docs/migration-guides/migration-guide-4-1)
- [Migrate AI SDK 3.4 to 4.0](/docs/migration-guides/migration-guide-4-0)
- [Migrate AI SDK 3.3 to 3.4](/docs/migration-guides/migration-guide-3-4)
- [Migrate AI SDK 3.2 to 3.3](/docs/migration-guides/migration-guide-3-3)
- [Migrate AI SDK 3.1 to 3.2](/docs/migration-guides/migration-guide-3-2)
- [Migrate AI SDK 3.0 to 3.1](/docs/migration-guides/migration-guide-3-1)

## Versioning

- [Versioning](/docs/migration-guides/versioning)
@@ -0,0 +1,33 @@
---
title: Azure OpenAI Slow to Stream
description: Learn to troubleshoot Azure OpenAI slow to stream issues.
---

# Azure OpenAI Slow To Stream

## Issue

When using OpenAI hosted on Azure, streaming is slow and arrives in large chunks.

## Cause

This is a Microsoft Azure issue. Some users have reported the following solution:

- **Update Content Filtering Settings**:
  Inside [Azure AI Studio](https://ai.azure.com/), within "Shared resources" > "Content filters", create a new
  content filter and set the "Streaming mode (Preview)" under "Output filter" from "Default"
  to "Asynchronous Filter".

## Solution

You can use the [`smoothStream` transformation](/docs/ai-sdk-core/generating-text#smoothing-streams) to stream each word individually.

```tsx highlight="6"
import { smoothStream, streamText } from 'ai';

const result = streamText({
  model,
  prompt,
  experimental_transform: smoothStream(),
});
```
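As a rough illustration of what word-level smoothing does (this is a sketch, not the SDK's actual implementation), a large chunk arriving from the provider can be split into word-sized pieces before it reaches the UI:

```typescript
// Illustration only: split one large provider chunk into word-sized
// pieces, preserving the whitespace that follows each word.
function toWordChunks(chunk: string): string[] {
  return chunk.match(/\S+\s*/g) ?? [];
}
```

Emitting the pieces with a small delay between them is what turns a few big chunks into a smooth word-by-word stream.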
@@ -0,0 +1,22 @@
---
title: Client-Side Function Calls Not Invoked
description: Troubleshooting client-side function calls not being invoked.
---

# Client-Side Function Calls Not Invoked

## Issue

I upgraded the AI SDK to v3.0.20 or newer. I am using [`OpenAIStream`](/docs/reference/stream-helpers/openai-stream). Client-side function calls are no longer invoked.

## Solution

You will need to add a stub for `experimental_onFunctionCall` to [`OpenAIStream`](/docs/reference/stream-helpers/openai-stream) to enable the correct forwarding of the function calls to the client.

```tsx
const stream = OpenAIStream(response, {
  async experimental_onFunctionCall() {
    return;
  },
});
```
@@ -0,0 +1,40 @@
---
title: Server Actions in Client Components
description: Troubleshooting errors related to server actions in client components.
---

# Server Actions in Client Components

You may use Server Actions in client components, but sometimes you may encounter the following issues.

## Issue

It is not allowed to define inline `"use server"` annotated Server Actions in Client Components.

## Solution

To use Server Actions in a Client Component, you can either:

- Export them from a separate file with `"use server"` at the top.
- Pass them down through props from a Server Component.
- Implement a combination of [`createAI`](/docs/reference/ai-sdk-rsc/create-ai) and [`useActions`](/docs/reference/ai-sdk-rsc/use-actions) hooks to access them.

Learn more about [Server Actions and Mutations](https://nextjs.org/docs/app/api-reference/functions/server-actions#with-client-components).

```ts file='actions.ts'
'use server';

import { generateText } from 'ai';
__PROVIDER_IMPORT__;

export async function getAnswer(question: string) {
  'use server';

  const { text } = await generateText({
    model: __MODEL__,
    prompt: question,
  });

  return { answer: text };
}
```
@@ -0,0 +1,36 @@
---
title: useChat/useCompletion stream output contains 0:... instead of text
description: How to fix strange stream output in the UI
---

# useChat/useCompletion stream output contains 0:... instead of text

## Issue

I am using custom client code to process a server response that is sent using [`StreamingTextResponse`](/docs/reference/stream-helpers/streaming-text-response). I am using version `3.0.20` or newer of the AI SDK. When I send a query, the UI streams text such as `0: "Je"`, `0: " suis"`, `0: "des"...` instead of the text that I'm looking for.

## Background

The AI SDK switched to the stream data protocol in version `3.0.20`. It sends different stream parts to support data, tool calls, etc. What you see is the raw stream data protocol response.

## Solution

You have several options:

1. Use the AI Core [`streamText`](/docs/reference/ai-sdk-core/stream-text) function to send a raw text stream:

   ```tsx
   import { streamText } from 'ai';
   import { openai } from '@ai-sdk/openai';

   export async function POST(req: Request) {
     const { prompt } = await req.json();

     const result = streamText({
       model: openai.completion('gpt-3.5-turbo-instruct'),
       maxOutputTokens: 2000,
       prompt,
     });

     return result.toTextStreamResponse();
   }
   ```

2. Pin the AI SDK version to `3.0.19`. This will keep the raw text stream.
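To make the protocol described in the Background section concrete: each line of the stream data protocol is a type prefix followed by JSON, and text parts use the `0:` prefix. A minimal, hypothetical decoder for just the text parts might look like this (real SDK clients handle many more part types):

```typescript
// Illustration only: extract the text parts from a raw stream data
// protocol payload, where lines like `0:"Je"` carry text deltas and
// other prefixes (data, tool calls, etc.) carry other part types.
function extractTextParts(raw: string): string {
  return raw
    .split('\n')
    .filter(line => line.startsWith('0:'))
    .map(line => JSON.parse(line.slice(2)) as string)
    .join('');
}
```

This is why processing the response with custom raw-text client code shows the prefixes verbatim: the protocol must be decoded, which `useChat`/`useCompletion` do for you.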
@@ -0,0 +1,16 @@
---
title: Streamable UI Errors
description: Troubleshooting errors related to streamable UI.
---

# Streamable UI Component Error

## Issue

- Variable Not Found
- Cannot find `div`
- `Component` refers to a value, but is being used as a type

## Solution

If you encounter these errors when working with streamable UIs within server actions, it is likely because the file ends in `.ts` instead of `.tsx`.
@@ -0,0 +1,106 @@
---
title: Tool Invocation Missing Result Error
description: How to fix the "ToolInvocation must have a result" error when using tools without execute functions
---

# Tool Invocation Missing Result Error

## Issue

When using `generateText()` or `streamText()`, you may encounter the error "ToolInvocation must have a result" when a tool without an `execute` function is called.

## Cause

The error occurs when you define a tool without an `execute` function and don't provide the result through other means (like `useChat`'s `onToolCall` or `addToolOutput` functions).

Each time a tool is invoked, the model expects to receive a result before continuing the conversation. Without a result, the model cannot determine whether the tool call succeeded or failed, and the conversation state becomes invalid.

## Solution

You have two options for handling tool results:

1. Server-side execution using tools with an `execute` function:

   ```tsx
   const tools = {
     weather: tool({
       description: 'Get the weather in a location',
       parameters: z.object({
         location: z
           .string()
           .describe('The city and state, e.g. "San Francisco, CA"'),
       }),
       execute: async ({ location }) => {
         // Fetch and return weather data
         return { temperature: 72, conditions: 'sunny', location };
       },
     }),
   };
   ```

2. Client-side execution with `useChat` (omitting the `execute` function), where you must provide results using `addToolOutput`:

   ```tsx
   import { useChat } from '@ai-sdk/react';
   import {
     DefaultChatTransport,
     lastAssistantMessageIsCompleteWithToolCalls,
   } from 'ai';

   const { messages, sendMessage, addToolOutput } = useChat({
     // Automatically submit when all tool results are available
     sendAutomaticallyWhen: lastAssistantMessageIsCompleteWithToolCalls,

     // Handle tool calls in onToolCall
     onToolCall: async ({ toolCall }) => {
       if (toolCall.toolName === 'getLocation') {
         try {
           const result = await getLocationData();

           // Important: Don't await addToolOutput inside onToolCall to avoid deadlocks
           addToolOutput({
             tool: 'getLocation',
             toolCallId: toolCall.toolCallId,
             output: result,
           });
         } catch (err) {
           // Important: Don't await addToolOutput inside onToolCall to avoid deadlocks
           addToolOutput({
             tool: 'getLocation',
             toolCallId: toolCall.toolCallId,
             state: 'output-error',
             errorText: 'Failed to get location',
           });
         }
       }
     },
   });
   ```

   ```tsx
   // For interactive UI elements:
   const { messages, sendMessage, addToolOutput } = useChat({
     transport: new DefaultChatTransport({ api: '/api/chat' }),
     sendAutomaticallyWhen: lastAssistantMessageIsCompleteWithToolCalls,
   });

   // Inside your JSX, when rendering tool calls:
   <button
     onClick={() =>
       addToolOutput({
         tool: 'myTool',
         toolCallId, // must provide tool call ID
         output: {
           /* your tool result */
         },
       })
     }
   >
     Confirm
   </button>;
   ```

<Note type="warning">
  Whether handling tools on the server or client, each tool call must have a
  corresponding result before the conversation can continue.
</Note>
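The invariant above can be expressed as a small check. The part shapes below are simplified, hypothetical stand-ins for the SDK's message types, used only to illustrate the rule:

```typescript
// Simplified, hypothetical message-part shapes for illustration only.
type Part =
  | { type: 'tool-call'; toolCallId: string }
  | { type: 'tool-result'; toolCallId: string };

// The conversation may only continue when every tool call
// has a matching tool result.
function canContinue(parts: Part[]): boolean {
  const results = new Set(
    parts.filter(p => p.type === 'tool-result').map(p => p.toolCallId),
  );
  return parts
    .filter(p => p.type === 'tool-call')
    .every(p => results.has(p.toolCallId));
}
```

Both solutions above are ways of satisfying this check: `execute` produces the result on the server, and `addToolOutput` supplies it from the client.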
@@ -0,0 +1,31 @@
---
title: Streaming Not Working When Deployed
description: Troubleshooting streaming issues in deployed apps.
---

# Streaming Not Working When Deployed

## Issue

Streaming with the AI SDK works in my local development environment.
However, when deploying, streaming does not work in the deployed app.
Instead of streaming, only the full response is returned after a while.

## Cause

The causes of this issue are varied and depend on the deployment environment.

## Solution

You can try the following:

- Add `'Transfer-Encoding': 'chunked'` and/or `Connection: 'keep-alive'` headers:

  ```tsx
  return result.toUIMessageStreamResponse({
    headers: {
      'Transfer-Encoding': 'chunked',
      Connection: 'keep-alive',
    },
  });
  ```
@@ -0,0 +1,31 @@
---
title: Streaming Not Working When Proxied
description: Troubleshooting streaming issues in proxied apps.
---

# Streaming Not Working When Proxied

## Issue

Streaming with the AI SDK doesn't work in the local development environment, or when deployed behind some proxies.
Instead of streaming, only the full response is returned after a while.

## Cause

This issue is caused by the proxy middleware.

If the middleware is configured to compress the response, streaming will fail.

## Solution

You can try the following; this solution only affects the streaming API:

- Add a `'Content-Encoding': 'none'` header:

  ```tsx
  return result.toUIMessageStreamResponse({
    headers: {
      'Content-Encoding': 'none',
    },
  });
  ```