ai 5.0.0-canary.20 → 5.0.0-canary.22

package/CHANGELOG.md CHANGED
@@ -1,5 +1,65 @@
  # ai
 
+ ## 5.0.0-canary.22
+
+ ### Major Changes
+
+ - e7dc6c7: chore (ai): remove onResponse callback
+ - a34eb39: chore (ai): remove `data` and `allowEmptySubmit` from `ChatRequestOptions`
+ - b33ed7a: chore (ai): rename DataStream* to UIMessage*
+ - 765f1cd: chore (ai): remove deprecated useChat isLoading helper
+
+ ## 5.0.0-canary.21
+
+ ### Major Changes
+
+ - d964901: - remove setting temperature to `0` by default
+   - remove `null` option from `DefaultSettingsMiddleware`
+   - remove setting defaults for `temperature` and `stopSequences` in `ai` to enable middleware changes
+ - 0560977: chore (ai): improve consistency of generate text result, stream text result, and step result
+ - 516be5b: ### Move Image Model Settings into generate options
+
+   Image models no longer have settings. Instead, `maxImagesPerCall` can be passed directly to `generateImage()`. All other image settings can be passed to `providerOptions[provider]`.
+
+   Before
+
+   ```js
+   await generateImage({
+     model: luma.image('photon-flash-1', {
+       maxImagesPerCall: 5,
+       pollIntervalMillis: 500,
+     }),
+     prompt,
+     n: 10,
+   });
+   ```
+
+   After
+
+   ```js
+   await generateImage({
+     model: luma.image('photon-flash-1'),
+     prompt,
+     n: 10,
+     maxImagesPerCall: 5,
+     providerOptions: {
+       luma: { pollIntervalMillis: 5 },
+     },
+   });
+   ```
+
+   Pull Request: https://github.com/vercel/ai/pull/6180
+
+ - bfbfc4c: feat (ai): streamText/generateText: totalUsage contains usage for all steps. usage is for a single step.
+ - ea7a7c9: feat (ui): UI message metadata
+ - 1409e13: chore (ai): remove experimental continueSteps
+
+ ### Patch Changes
+
+ - 66af894: fix (ai): respect content order in toResponseMessages
+ - Updated dependencies [ea7a7c9]
+   - @ai-sdk/provider-utils@3.0.0-canary.17
+
  ## 5.0.0-canary.20
 
  ### Major Changes
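
The bfbfc4c entry changes how token usage is reported: each step's `usage` covers only that step, while `totalUsage` aggregates across all steps. A minimal sketch of that relationship, with hypothetical field names rather than the SDK's actual result types:

```javascript
// Sketch only: hypothetical usage shape, not the SDK's actual types.
// Each step carries its own `usage`; `totalUsage` is the sum over steps.
function totalUsage(steps) {
  return steps.reduce(
    (total, step) => ({
      inputTokens: total.inputTokens + step.usage.inputTokens,
      outputTokens: total.outputTokens + step.usage.outputTokens,
    }),
    { inputTokens: 0, outputTokens: 0 },
  );
}

const steps = [
  { usage: { inputTokens: 100, outputTokens: 20 } }, // e.g. a tool-call step
  { usage: { inputTokens: 130, outputTokens: 45 } }, // e.g. the final answer step
];

console.log(totalUsage(steps)); // { inputTokens: 230, outputTokens: 65 }
```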
package/README.md CHANGED
@@ -106,7 +106,7 @@ export async function POST(req: Request) {
      messages,
    });
 
-   return result.toDataStreamResponse();
+   return result.toUIMessageStreamResponse();
  }
  ```
 
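
The 516be5b changelog entry moves provider-specific image settings under `providerOptions[provider]`. A rough sketch of that namespacing pattern (illustrative helper, not the SDK's implementation):

```javascript
// Illustrative only: provider-specific settings live under
// providerOptions, keyed by provider id, so each provider reads
// only its own options bag and ignores the rest.
function optionsForProvider(callOptions, providerId) {
  return (callOptions.providerOptions ?? {})[providerId] ?? {};
}

const callOptions = {
  n: 10,
  maxImagesPerCall: 5, // now a top-level generateImage() option
  providerOptions: {
    luma: { pollIntervalMillis: 500 },
  },
};

console.log(optionsForProvider(callOptions, 'luma'));   // { pollIntervalMillis: 500 }
console.log(optionsForProvider(callOptions, 'openai')); // {}
```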