@ai-sdk/provider 0.0.0 → 0.0.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1 +1 @@
- # Vercel AI SDK - Language Model Specification
+ # Vercel AI SDK - Provider Language Model Specification
package/dist/index.d.mts CHANGED
@@ -307,23 +307,13 @@ type LanguageModelV1CallSettings = {
  */
  maxTokens?: number;
  /**
- * Temperature setting. This is a number between 0 (almost no randomness) and
- * 1 (very random).
+ * Temperature setting.
  *
- * Different LLM providers have different temperature
- * scales, so they'd need to map it (without mapping, the same temperature has
- * different effects on different models). The provider can also chose to map
- * this to topP, potentially even using a custom setting on their model.
- *
- * Note: This is an example of a setting that requires a clear specification of
- * the semantics.
+ * It is recommended to set either `temperature` or `topP`, but not both.
  */
  temperature?: number;
  /**
- * Nucleus sampling. This is a number between 0 and 1.
- *
- * E.g. 0.1 would mean that only tokens with the top 10% probability mass
- * are considered.
+ * Nucleus sampling.
  *
  * It is recommended to set either `temperature` or `topP`, but not both.
  */
@@ -331,17 +321,11 @@ type LanguageModelV1CallSettings = {
  /**
  * Presence penalty setting. It affects the likelihood of the model to
  * repeat information that is already in the prompt.
- *
- * The presence penalty is a number between -1 (increase repetition)
- * and 1 (maximum penalty, decrease repetition). 0 means no penalty.
  */
  presencePenalty?: number;
  /**
  * Frequency penalty setting. It affects the likelihood of the model
  * to repeatedly use the same words or phrases.
- *
- * The frequency penalty is a number between -1 (increase repetition)
- * and 1 (maximum penalty, decrease repetition). 0 means no penalty.
  */
  frequencyPenalty?: number;
  /**
package/dist/index.d.ts CHANGED
@@ -307,23 +307,13 @@ type LanguageModelV1CallSettings = {
  */
  maxTokens?: number;
  /**
- * Temperature setting. This is a number between 0 (almost no randomness) and
- * 1 (very random).
+ * Temperature setting.
  *
- * Different LLM providers have different temperature
- * scales, so they'd need to map it (without mapping, the same temperature has
- * different effects on different models). The provider can also chose to map
- * this to topP, potentially even using a custom setting on their model.
- *
- * Note: This is an example of a setting that requires a clear specification of
- * the semantics.
+ * It is recommended to set either `temperature` or `topP`, but not both.
  */
  temperature?: number;
  /**
- * Nucleus sampling. This is a number between 0 and 1.
- *
- * E.g. 0.1 would mean that only tokens with the top 10% probability mass
- * are considered.
+ * Nucleus sampling.
  *
  * It is recommended to set either `temperature` or `topP`, but not both.
  */
@@ -331,17 +321,11 @@ type LanguageModelV1CallSettings = {
  /**
  * Presence penalty setting. It affects the likelihood of the model to
  * repeat information that is already in the prompt.
- *
- * The presence penalty is a number between -1 (increase repetition)
- * and 1 (maximum penalty, decrease repetition). 0 means no penalty.
  */
  presencePenalty?: number;
  /**
  * Frequency penalty setting. It affects the likelihood of the model
  * to repeatedly use the same words or phrases.
- *
- * The frequency penalty is a number between -1 (increase repetition)
- * and 1 (maximum penalty, decrease repetition). 0 means no penalty.
  */
  frequencyPenalty?: number;
  /**
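
The doc-comment changes above drop the numeric ranges and keep only the recommendation to set either `temperature` or `topP`, but not both. As a minimal sketch of call settings that follow that recommendation, the snippet below uses a hypothetical `SamplingSettings` type that mirrors only the fields touched in this diff (the package's actual `LanguageModelV1CallSettings` type has additional members); the value comments restate the removed 0.0.0 doc text, not the new 0.0.1 semantics.

// Hypothetical, pared-down mirror of the LanguageModelV1CallSettings fields
// touched in this diff; the type shipped in the package has more members.
type SamplingSettings = {
  maxTokens?: number;
  temperature?: number;
  topP?: number;
  presencePenalty?: number;
  frequencyPenalty?: number;
};

// Option A: set `temperature` and leave `topP` unset.
const lowRandomnessSettings: SamplingSettings = {
  maxTokens: 256,
  temperature: 0, // per the removed 0.0.0 docs, 0 meant almost no randomness
  presencePenalty: 0, // per the removed 0.0.0 docs, 0 meant no penalty
  frequencyPenalty: 0,
};

// Option B: set `topP` instead of `temperature` (not both).
const nucleusSamplingSettings: SamplingSettings = {
  maxTokens: 256,
  topP: 0.1, // per the removed 0.0.0 docs, only the top 10% probability mass is considered
};

The values are illustrative only; either object shape lines up with the optional number fields shown in the declaration files above.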
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@ai-sdk/provider",
- "version": "0.0.0",
+ "version": "0.0.1",
  "license": "Apache-2.0",
  "sideEffects": false,
  "main": "./dist/index.js",