@sap-ai-sdk/foundation-models 1.0.0 → 1.0.1-20240921013041.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +70 -69
  2. package/package.json +3 -7
package/README.md CHANGED
@@ -2,41 +2,77 @@

  This package incorporates generative AI foundation models into your AI activities in SAP AI Core and SAP AI Launchpad.

- ### Installation
+ ## Table of Contents
+
+ 1. [Installation](#installation)
+ 2. [Prerequisites](#prerequisites)
+ 3. [Usage](#usage)
+    - [Client Initialization](#client-initialization)
+    - [Azure OpenAI Client](#azure-openai-client)
+      - [Chat Client](#chat-client)
+      - [Embedding Client](#embedding-client)
+ 4. [Support, Feedback, Contribution](#support-feedback-contribution)
+ 5. [License](#license)
+
+ ## Installation

  ```
  $ npm install @sap-ai-sdk/foundation-models
  ```

- ## Azure OpenAI Client
+ ## Prerequisites

- To make a generative AI model available for use, you need to create a deployment.
- You can create a deployment for each model and model version, as well as for each resource group that you want to use with generative AI hub.
+ - [Enable the AI Core service in BTP](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/initial-setup).
+ - Project configured with Node.js v20 or higher and native ESM support enabled.
+ - A deployed OpenAI model in SAP generative AI hub.
+ - Use the [`DeploymentApi`](../ai-api/README.md#deploymentapi) from `@sap-ai-sdk/ai-api` to deploy a model to SAP generative AI hub. For more information, see [here](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/create-deployment-for-generative-ai-model-in-sap-ai-core).
+   A deployment can be set up for each model and model version, as well as for each resource group intended for use with the generative AI hub.
+ - Once a deployment is complete, the model can be accessed via the `deploymentUrl`.

- After the deployment is complete, you have a `deploymentUrl`, which can be used to access the model.
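For context on the `DeploymentApi` prerequisite above, the following is a minimal sketch of looking up the `deploymentUrl` of a running deployment with `@sap-ai-sdk/ai-api`; the scenario ID, resource group, and status filter are illustrative assumptions, not values taken from this diff.

```ts
import { DeploymentApi } from '@sap-ai-sdk/ai-api';

// Query deployments in a resource group; the scenario ID and
// resource group below are placeholder values.
const deploymentList = await DeploymentApi.deploymentQuery(
  { scenarioId: 'foundation-models', status: 'RUNNING' },
  { 'AI-Resource-Group': 'default' }
).execute();

// Each running deployment exposes the URL under which the model is served.
const deploymentUrl = deploymentList.resources[0]?.deploymentUrl;
```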
+ ## Usage

- The Azure OpenAI client allows you to send chat completion or embedding requests to OpenAI models deployed in SAP generative AI hub.
+ ### Client Initialization

- ### Prerequisites
+ You can pass the model name as a parameter to a client; the SDK will implicitly fetch the deployment ID for the model from the AI Core service and use it in the request.

- - [Enable the AI Core service in BTP](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/initial-setup).
- - Project configured with Node.js v20 or higher and native ESM support enabled.
- - A deployed OpenAI model in SAP generative AI hub.
- - You can use the [`DeploymentApi`](../ai-api/README.md#deploymentapi) from `@sap-ai-sdk/ai-api` to deploy a model to SAP generative AI hub. For more information, see [here](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/create-deployment-for-generative-ai-model-in-sap-ai-core).
- - `sap-ai-sdk/foundation-models` package installed in your project.
+ By default, the SDK caches the deployment information, including the deployment ID, model name, and version, for 5 minutes to avoid performance issues from fetching this data with each request.

- ### Usage of Azure OpenAI Chat Client
+ ```ts
+ import {
+   AzureOpenAiChatClient,
+   AzureOpenAiEmbeddingClient
+ } from '@sap-ai-sdk/foundation-models';

- Use the `AzureOpenAiChatClient` to send chat completion requests to an OpenAI model deployed in SAP generative AI hub.
- You can pass the model name as a parameter to the client, the SDK will implicitly fetch the deployment ID for the model from the AI Core service and use it to send the request.
+ // For a chat client
+ const chatClient = new AzureOpenAiChatClient({ modelName: 'gpt-4o' });
+ // For an embedding client
+ const embeddingClient = new AzureOpenAiEmbeddingClient({ modelName: 'text-embedding-ada-002' });
+ ```
+
+ [Resource groups](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/resource-groups?q=resource+group) represent a virtual collection of related resources within the scope of one SAP AI Core tenant.
+
+ The deployment ID and resource group can be used as an alternative to the model name for obtaining a model.

- By default, the SDK caches the deployment information, which includes the deployment ID and properties such as the model name and model version, for 5 minutes to prevent performance impact from fetching the deployment information for every request.
+ ```ts
+ const chatClient = new AzureOpenAiChatClient({
+   deploymentId: 'd1234',
+   resourceGroup: 'rg1234'
+ });
+ ```
+
+ ### Azure OpenAI Client

- ```TS
+ The Azure OpenAI client can then be used to send chat completion or embedding requests to models deployed in the SAP generative AI hub.
+
+ #### Chat Client
+
+ Use the `AzureOpenAiChatClient` to send chat completion requests to an OpenAI model deployed in SAP generative AI hub.
+
+ ```ts
  import { AzureOpenAiChatClient } from '@sap-ai-sdk/foundation-models';

- const client = new AzureOpenAiChatClient('gpt-35-turbo');
- const response = await client.run({
+ const chatClient = new AzureOpenAiChatClient('gpt-4o');
+ const response = await chatClient.run({
    messages: [
      {
        role: 'user',
@@ -46,16 +82,13 @@ const response = await client.run({
  });

  const responseContent = response.getContent();
-
  ```

- Use the following snippet to send a chat completion request with system messages:
+ Multiple messages can be sent in a single request, enabling the model to reference the conversation history.
+ Include parameters like `max_tokens` and `temperature` in the request to control the completion behavior:

- ```TS
- import { AzureOpenAiChatClient } from '@sap-ai-sdk/foundation-models';
-
- const client = new AzureOpenAiChatClient('gpt-35-turbo');
- const response = await client.run({
+ ```ts
+ const response = await chatClient.run({
    messages: [
      {
        role: 'system',
@@ -67,7 +100,8 @@ const response = await client.run({
      },
      {
        role: 'assistant',
-       content: 'Hi Isa! It is nice to meet you. Is there anything I can help you with today?'
+       content:
+         'Hi Isa! It is nice to meet you. Is there anything I can help you with today?'
      },
      {
        role: 'user',
@@ -83,60 +117,27 @@ const tokenUsage = response.getTokenUsage();

  logger.info(
    `Total tokens consumed by the request: ${tokenUsage.total_tokens}\n` +
-   `Input prompt tokens consumed: ${tokenUsage.prompt_tokens}\n` +
-   `Output text completion tokens consumed: ${tokenUsage.completion_tokens}\n`
+     `Input prompt tokens consumed: ${tokenUsage.prompt_tokens}\n` +
+     `Output text completion tokens consumed: ${tokenUsage.completion_tokens}\n`
  );
-
  ```

- It is possible to send multiple messages in a single request.
- This feature is useful for providing a history of the conversation to the model.
+ Refer to the `AzureOpenAiChatCompletionParameters` interface for other parameters that can be passed to the chat completion request.
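To make the parameters mentioned above concrete, a request with `max_tokens` and `temperature` might look like the sketch below; the prompt and values are illustrative, and `chatClient` is assumed to be the client created in the initialization snippet.

```ts
// Illustrative completion parameters alongside the messages.
const tunedResponse = await chatClient.run({
  messages: [
    {
      role: 'user',
      content: 'Summarize the purpose of resource groups in one sentence.'
    }
  ],
  max_tokens: 100,
  temperature: 0.2
});
```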
 
- Pass parameters like `max_tokens` and `temperature` to the request to control the completion behavior.
- Refer to `AzureOpenAiChatCompletionParameters` interface for knowing more parameters that can be passed to the chat completion request.
-
- #### Obtaining a Client using Resource Groups
-
- [Resource groups](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/resource-groups?q=resource+group) represent a virtual collection of related resources within the scope of one SAP AI Core tenant.
-
- You can use the deployment ID and resource group as an alternative to obtaining a model by using the model name.
-
- ```TS
- import { AzureOpenAiChatClient } from '@sap-ai-sdk/foundation-models';
-
- const response = await new AzureOpenAiChatClient({ deploymentId: 'd1234' , resourceGroup: 'rg1234' }).run({
-   messages: [
-     {
-       'role':'user',
-       'content': 'What is the capital of France?'
-     }
-   ]
- });
-
- ```
-
- ### Usage of Azure OpenAI Embedding Client
+ #### Embedding Client

  Use the `AzureOpenAiEmbeddingClient` to send embedding requests to an OpenAI model deployed in SAP generative AI hub.
- You can pass the model name as a parameter to the client, the SDK will implicitly fetch the deployment ID for the model from the AI Core service and use it to send the request.
-
- By default, the SDK caches the deployment information, which includes the deployment ID and properties such as the model name and model version, for 5 minutes to prevent performance impact from fetching the deployment information for every request.

- ```TS
+ ```ts
  import { AzureOpenAiEmbeddingClient } from '@sap-ai-sdk/foundation-models';

- const client = new AzureOpenAiEmbeddingClient('text-embedding-ada-002');
- const response = await client.run({
+ const embeddingClient = new AzureOpenAiEmbeddingClient(
+   'text-embedding-ada-002'
+ );
+ const response = await embeddingClient.run({
    input: 'AI is fascinating'
  });
  const embedding = response.getEmbedding();
-
- ```
-
- Like in [Azure OpenAI Chat client](#obtaining-a-client-using-resource-groups), you could also pass the [resource group](https://help.sap.com/docs/sap-ai-core/sap-ai-core-service-guide/resource-groups?q=resource+group) name to the client along with the deployment ID instead of the model name.
-
- ```TS
- const client = new AzureOpenAiEmbeddingClient({ deploymentId: 'd1234' , resourceGroup: 'rg1234' })
  ```
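As the removed lines above noted, the embedding client, like the chat client, also accepts a deployment ID and resource group instead of a model name; a minimal sketch with the same placeholder IDs:

```ts
import { AzureOpenAiEmbeddingClient } from '@sap-ai-sdk/foundation-models';

// Placeholder deployment ID and resource group, as in the snippets above.
const client = new AzureOpenAiEmbeddingClient({
  deploymentId: 'd1234',
  resourceGroup: 'rg1234'
});
```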
 
  ## Support, Feedback, Contribution
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@sap-ai-sdk/foundation-models",
-   "version": "1.0.0",
+   "version": "1.0.1-20240921013041.0",
    "description": "",
    "license": "Apache-2.0",
    "keywords": [
@@ -20,17 +20,13 @@
      "internal.d.ts"
    ],
    "dependencies": {
-     "@sap-cloud-sdk/connectivity": "^3.21.0",
      "@sap-cloud-sdk/http-client": "^3.21.0",
-     "@sap-cloud-sdk/openapi": "^3.21.0",
      "@sap-cloud-sdk/util": "^3.21.0",
-     "@sap-ai-sdk/ai-api": "^1.0.0",
-     "@sap-ai-sdk/core": "^1.0.0"
+     "@sap-ai-sdk/core": "^1.0.1-20240921013041.0",
+     "@sap-ai-sdk/ai-api": "^1.0.1-20240921013041.0"
    },
    "devDependencies": {
-     "nock": "^13.5.5",
      "ts-to-zod": "^3.13.0",
-     "typescript": "^5.6.2",
      "zod": "^3.23.8"
    },
    "scripts": {