@azure/ai-projects 2.0.0-alpha.20260108.1 → 2.0.0-alpha.20260112.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +211 -7
  2. package/package.json +2 -2
package/README.md CHANGED
@@ -4,6 +4,23 @@ The AI Projects client library (in preview) is part of the Microsoft Foundry SDK
  resources in your Microsoft Foundry Project. Use it to:

  - **Create and run Agents** using the `.agents` property on the client.
+ * **Enhance Agents with specialized tools**:
+   * Agent Memory Search
+   * Agent-to-Agent (A2A)
+   * Azure AI Search
+   * Bing Custom Search
+   * Bing Grounding
+   * Browser Automation
+   * Code Interpreter
+   * Computer Use
+   * File Search
+   * Function Tool
+   * Image Generation
+   * Microsoft Fabric
+   * Model Context Protocol (MCP)
+   * OpenAPI
+   * SharePoint
+   * Web Search
  - **Get an OpenAI client** using the `.getOpenAIClient()` method to run Responses, Conversations, Evals and FineTuning operations with your Agent.
  * **Manage memory stores** for Agent conversations, using the `.memoryStores` operations.
  - **Explore additional evaluation tools** to assess the performance of your generative AI application, using the `.evaluationRules`,
@@ -37,12 +54,16 @@ The client library uses version `2025-11-15-preview` of the Microsoft Foundry [d
  - [Using Agent tools](#using-agent-tools)
  - [Built-in Tools](#built-in-tools)
  - [Connection-Based Tools](#connection-based-tools)
+ - [Evaluation operations](#evaluation)
  - [Deployments operations](#deployments-operations)
  - [Connections operations](#connections-operations)
  - [Dataset operations](#dataset-operations)
  - [Files operations](#files-operations)
  - [Indexes operations](#indexes-operations)
+ - [Fine-tuning operations](#fine-tuning-operations)
  - [Tracing](#tracing)
+ - [Installation](#installation)
+ - [How to enable tracing](#how-to-enable-tracing)
  - [Troubleshooting](#troubleshooting)
  - [Exceptions](#exceptions)
  - [Reporting issues](#reporting-issues)
@@ -56,11 +77,12 @@ The client library uses version `2025-11-15-preview` of the Microsoft Foundry [d
  - [LTS versions of Node.js](https://github.com/nodejs/release#release-schedule)
  - An [Azure subscription][azure_sub].
  - A [project in Microsoft Foundry](https://learn.microsoft.com/azure/ai-studio/how-to/create-projects?tabs=ai-studio).
+ - The project endpoint URL of the form `https://your-ai-services-account-name.services.ai.azure.com/api/projects/your-project-name`. It can be found on your Microsoft Foundry Project overview page. Below we will assume the environment variable `AZURE_AI_PROJECT_ENDPOINT` was defined to hold this value.

  ### Authorization

  - [Entra ID][entra_id] is needed to authenticate the client. Your application needs an object that implements the [TokenCredential](https://learn.microsoft.com/javascript/api/@azure/core-auth/tokencredential) interface. Code samples here use [DefaultAzureCredential][default_azure_credential]. To get that working, you will need:
- - The `Contributor` role. Role assigned can be done via the "Access Control (IAM)" tab of your Azure AI Project resource in the Azure portal. Learn more about role assignments [here](https://learn.microsoft.com/azure/role-based-access-control/role-assignments-portal).
+ - An appropriate role assignment. See [Role-based access control in Microsoft Foundry portal](https://learn.microsoft.com/azure/ai-foundry/concepts/rbac-ai-foundry). Role assignment can be done via the "Access Control (IAM)" tab of your Azure AI Project resource in the Azure portal.
  - [Azure CLI](https://learn.microsoft.com/cli/azure/install-azure-cli) installed.
  - You are logged into your Azure account by running `az login`.
  - Note that if you have multiple Azure subscriptions, the subscription that contains your Azure AI Project resource must be your default subscription. Run `az account list --output table` to list all your subscriptions and see which one is the default. Run `az account set --subscription "Your Subscription ID or Name"` to change your default subscription (see the consolidated commands below).
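
For convenience, here are the Azure CLI commands from the list above collected in one place (the subscription name is a placeholder):

```bash
# Sign in to Azure
az login

# List your subscriptions and check which one is the default
az account list --output table

# Make the subscription that contains your Azure AI Project resource the default
az account set --subscription "Your Subscription ID or Name"
```
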
@@ -68,12 +90,14 @@ The client library uses version `2025-11-15-preview` of the Microsoft Foundry [d
  ### Install the package

  ```bash
- npm install @azure/ai-projects @azure/identity
+ npm install @azure/ai-projects @azure/identity dotenv
  ```
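
The samples in this README read settings such as `AZURE_AI_PROJECT_ENDPOINT` from environment variables. If you keep those values in a local `.env` file, a minimal sketch of loading them with the `dotenv` package installed above could look like this (the `.env` file itself is an assumption of the sketch):

```ts
// Loads key=value pairs from a local .env file into process.env, e.g.
// AZURE_AI_PROJECT_ENDPOINT="https://your-ai-services-account-name.services.ai.azure.com/api/projects/your-project-name"
import "dotenv/config";

const projectEndpoint = process.env["AZURE_AI_PROJECT_ENDPOINT"];
if (!projectEndpoint) {
  throw new Error("Set AZURE_AI_PROJECT_ENDPOINT before running the samples.");
}
```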

  ## Key concepts

- ### Create and authenticate the client
+ ### Create and authenticate the client with Entra ID
+
+ Entra ID is currently the only authentication method supported by the client.

  To construct an `AIProjectClient`, the `projectEndpoint` value can be fetched from [projectEndpoint][ai_project_client_endpoint]. Below we will assume the environment variable `AZURE_AI_PROJECT_ENDPOINT` was defined to hold this value:

@@ -85,17 +109,15 @@ const projectEndpoint = process.env["AZURE_AI_PROJECT_ENDPOINT"] || "<project en
  const client = new AIProjectClient(projectEndpoint, new DefaultAzureCredential());
  ```

- The client uses API version `2025-11-15-preview`, refer to the [API documentation][ai_foundry_data_plane_rest_apis] to learn more about the supported features.
-
  ## Examples

  ### Performing Responses operations using OpenAI client

- Your Microsoft Foundry project may have one or more OpenAI models deployed that support chat completions. Use the code below to get an authenticated [OpenAI](https://github.com/openai/openai-node?tab=readme-ov-file#microsoft-azure-openai) from the [openai](https://www.npmjs.com/package/openai) package, and execute a chat completions call.
+ Your Microsoft Foundry project may have one or more AI models deployed. These could be OpenAI models, Microsoft models, or models from other providers. Use the code below to get an authenticated [OpenAI](https://github.com/openai/openai-node?tab=readme-ov-file#microsoft-azure-openai) client from the [openai](https://www.npmjs.com/package/openai) package, and execute a chat completions call.

  Run the code below. Here we assume `deploymentName` (string) is defined. It's the deployment name of an AI model in your Foundry Project, as shown in the "Models + endpoints" tab under the "Name" column.

- For openai logging, please refer [OpenAI Logging](https://github.com/openai/openai-node/tree/master?tab=readme-ov-file#logging).
+ See the "responses" folder in the [package samples][samples] for additional samples, including streaming responses.


  ```ts snippet:openAI
@@ -342,6 +364,34 @@ console.log(`Response: ${response.output_text}`);

  See the full sample code in [agentWebSearch.ts](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-projects/samples-dev/agents/tools/agentWebSearch.ts).

+ **Computer Use**
+
+ Enable agents to interact directly with computer systems for task automation and system operations:
+
+ ```ts snippet:agent-computer-use
+ const agent = await project.agents.createVersion("ComputerUseAgent", {
+   kind: "prompt" as const,
+   model: deploymentName,
+   instructions: `
+ You are a computer automation assistant.
+
+ Be direct and efficient. When you reach the search results page, read and describe the actual search result titles and descriptions you can see.
+   `.trim(),
+   tools: [
+     {
+       type: "computer_use_preview",
+       display_width: 1026,
+       display_height: 769,
+       environment: "windows" as const,
+     },
+   ],
+ });
+ console.log(`Agent created (id: ${agent.id}, name: ${agent.name}, version: ${agent.version})`);
+ ```
+
+ *After calling `responses.create()`, process the response in an interaction loop. Handle `computer_call` output items and provide screenshots as `computer_call_output` with `computer_screenshot` type to continue the interaction.*
+
+ See the full sample code in [agentComputerUse.ts](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-projects/samples-dev/agents/tools/agentComputerUse.ts).
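
The note above describes the loop only in prose; here is a minimal, hedged sketch of what it can look like. It is not the full sample: the item and request shapes (`computer_call`, `computer_call_output`, `previous_response_id`) follow the OpenAI Responses computer-use preview and may differ slightly, and `takeScreenshot()` is a hypothetical helper that returns a base64-encoded PNG of the current screen.

```ts
// Sketch of the computer-use interaction loop (assumes `openAIClient`, `deploymentName`,
// and a hypothetical `takeScreenshot()` helper are defined elsewhere).
const computerTool = {
  type: "computer_use_preview",
  display_width: 1026,
  display_height: 769,
  environment: "windows",
};

let response = await openAIClient.responses.create({
  model: deploymentName,
  tools: [computerTool],
  truncation: "auto",
  input: "Search the web for today's weather and describe the first results you see.",
} as any);

while (true) {
  // Stop when the model no longer requests a computer action.
  const computerCall = response.output.find((item: any) => item.type === "computer_call") as any;
  if (!computerCall) break;

  // Execute computerCall.action (click, type, scroll, ...) against your environment,
  // then return a screenshot so the model can plan its next step.
  const screenshotBase64 = await takeScreenshot();
  response = await openAIClient.responses.create({
    model: deploymentName,
    tools: [computerTool],
    truncation: "auto",
    previous_response_id: response.id,
    input: [
      {
        type: "computer_call_output",
        call_id: computerCall.call_id,
        output: { type: "computer_screenshot", image_url: `data:image/png;base64,${screenshotBase64}` },
      },
    ],
  } as any);
}

console.log(`Final response: ${response.output_text}`);
```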

  **Model Context Protocol (MCP)**

@@ -454,6 +504,54 @@ console.log(`Agent created (id: ${agent.id}, name: ${agent.name}, version: ${age

  See the full sample code in [agentFunctionTool.ts](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-projects/samples-dev/agents/agentFunctionTool.ts).

+ **Memory Search Tool**
+
+ The Memory Search Tool adds memory to an Agent, allowing the Agent's AI model to search for past information related to the current user prompt.
+
+ The `embeddingModelDeployment` is the name of the model used to create vector embeddings for storing and searching memories.
+
+ ```ts snippet:agent-memory-search
+ const memoryStoreName = "AgentMemoryStore";
+ const embeddingModelDeployment =
+   process.env["AZURE_AI_EMBEDDING_MODEL_DEPLOYMENT_NAME"] || "<embedding model>";
+ const scope = "user_123";
+ const memoryStore = await project.memoryStores.create(
+   memoryStoreName,
+   {
+     kind: "default",
+     chat_model: deploymentName,
+     embedding_model: embeddingModelDeployment,
+     options: {
+       user_profile_enabled: true,
+       chat_summary_enabled: true,
+     },
+   },
+   {
+     description: "Memory store for agent conversations",
+   },
+ );
+ console.log(
+   `Created memory store: ${memoryStore.name} (${memoryStore.id}) using chat model '${deploymentName}'`,
+ );
+ // Create an agent that will use the Memory Search tool
+ const agent = await project.agents.createVersion("MemorySearchAgent", {
+   kind: "prompt",
+   model: deploymentName,
+   instructions:
+     "You are a helpful assistant that remembers user preferences using the memory search tool.",
+   tools: [
+     {
+       type: "memory_search",
+       memory_store_name: memoryStore.name,
+       scope,
+       update_delay: 1, // wait briefly after conversation inactivity before updating memories
+     },
+   ],
+ });
+ ```
+
+ See the full sample code in [agentMemorySearch.ts](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-projects/samples-dev/agents/tools/agentMemorySearch.ts).
+
  #### Connection-Based Tools

  These tools require configuring connections in your AI Foundry project and use `projectConnectionId`.
@@ -732,6 +830,41 @@ See the full sample code in [agentOpenApiConnectionAuth.ts](https://github.com/A

  For complete working examples of all tools, see the [samples-dev directory](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/ai/ai-projects/samples-dev).

+ ### Evaluation
+
+ Evaluation in the Azure AI Projects client library provides quantitative, AI-assisted quality and safety metrics to assess performance and evaluate LLM models, GenAI applications, and Agents. Metrics are defined as evaluators. Built-in or custom evaluators can provide comprehensive evaluation insights.
+
+ The code below shows some evaluation operations. A full list of samples can be found under the "evaluations" folder in the [package samples][samples].
+
+ ```ts snippet:evaluations
+ const openAIClient = await project.getOpenAIClient();
+ const dataSourceConfig = {
+   type: "custom" as const,
+   item_schema: {
+     type: "object",
+     properties: { query: { type: "string" } },
+     required: ["query"],
+   },
+   include_sample_schema: true,
+ };
+ const evalObject = await openAIClient.evals.create({
+   name: "Agent Evaluation",
+   data_source_config: dataSourceConfig,
+   testing_criteria: [
+     {
+       type: "azure_ai_evaluator",
+       name: "violence_detection",
+       evaluator_name: "builtin.violence",
+       data_mapping: { query: "{{item.query}}", response: "{{item.response}}" },
+     } as any,
+   ],
+ });
+ console.log(`Evaluation created (id: ${evalObject.id}, name: ${evalObject.name})`);
+ ```
+
+ See the full sample code in [agentEvaluation.ts](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-projects/samples-dev/evaluations/agentEvaluation.ts).
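
Once an evaluation is defined, you typically start a run against it. The sketch below is a hedged illustration only: it assumes the `evals.runs.create` operation from the openai package with inline JSONL content, and the query/response pair is made up. See the "evaluations" folder in the [package samples][samples] for the supported data sources.

```ts
// Start a run for the evaluation created above using inline JSONL content (sketch).
const evalRun = await openAIClient.evals.runs.create(evalObject.id, {
  name: "Agent Evaluation Run",
  data_source: {
    type: "jsonl",
    source: {
      type: "file_content",
      content: [
        {
          item: {
            query: "How do I reset my password?",
            response: "Use the 'Forgot password' link on the sign-in page.",
          },
        },
      ],
    },
  },
} as any);
console.log(`Evaluation run created (id: ${evalRun.id}, status: ${evalRun.status})`);
```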
+
  ### Deployments operations

  The code below shows some Deployments operations, which allow you to enumerate the AI models deployed to your Microsoft Foundry Projects. These models can be seen in the "Models + endpoints" tab in your Microsoft Foundry Project. Full samples can be found under the "deployment" folder in the [package samples][samples].
@@ -977,12 +1110,83 @@ console.log("Delete the Index versions created above:");
  await project.indexes.delete(indexName, version);
  ```

+ ### Fine-tuning operations
+
+ The code below shows how to create fine-tuning jobs using the OpenAI client. These operations support various fine-tuning techniques like Supervised Fine-Tuning (SFT), Reinforcement Fine-Tuning (RFT), and Direct Preference Optimization (DPO). Full samples can be found under the "finetuning" folder in the [package samples][samples].
+
+ ```ts snippet:finetuning
+ import fs from "node:fs";
+ import { JobCreateParams } from "openai/resources/fine-tuning/jobs";
+
+ const trainingFilePath = "training_data_path.jsonl";
+ const validationFilePath = "validation_data_path.jsonl";
+ const openAIClient = await project.getOpenAIClient();
+ // 1) Upload the training and validation files
+ const trainingFile = await openAIClient.files.create({
+   file: fs.createReadStream(trainingFilePath),
+   purpose: "fine-tune",
+ });
+ console.log(`Uploaded file with ID: ${trainingFile.id}`);
+ const validationFile = await openAIClient.files.create({
+   file: fs.createReadStream(validationFilePath),
+   purpose: "fine-tune",
+ });
+ console.log(`Uploaded file with ID: ${validationFile.id}`);
+ // 2) Wait for the files to be processed
+ await openAIClient.files.waitForProcessing(trainingFile.id);
+ await openAIClient.files.waitForProcessing(validationFile.id);
+ console.log("Files processed.");
+ // 3) Create a supervised fine-tuning job
+ const fineTuningJob = await openAIClient.fineTuning.jobs.create({} as JobCreateParams, {
+   body: {
+     trainingType: "Standard",
+     training_file: trainingFile.id,
+     validation_file: validationFile.id,
+     model: deploymentName,
+     method: {
+       type: "supervised",
+       supervised: {
+         hyperparameters: {
+           n_epochs: 3,
+           batch_size: 1,
+           learning_rate_multiplier: 1.0,
+         },
+       },
+     },
+   },
+ });
+ console.log("Created fine-tuning job:\n", JSON.stringify(fineTuningJob));
+ ```
+
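
Fine-tuning jobs run asynchronously. Here is a minimal sketch of monitoring the job created above until it finishes; the polling interval is arbitrary, and the status values follow the OpenAI fine-tuning API:

```ts
// Poll the fine-tuning job until it reaches a terminal state (sketch).
let job = await openAIClient.fineTuning.jobs.retrieve(fineTuningJob.id);
while (job.status === "validating_files" || job.status === "queued" || job.status === "running") {
  await new Promise((resolve) => setTimeout(resolve, 30_000)); // wait 30 seconds between polls
  job = await openAIClient.fineTuning.jobs.retrieve(fineTuningJob.id);
  console.log(`Fine-tuning job status: ${job.status}`);
}
console.log(`Fine-tuning job ended with status '${job.status}'; fine-tuned model: ${job.fine_tuned_model}`);
```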
  ## Tracing

  **Note:** Tracing functionality is in preliminary preview and is subject to change. Spans, attributes, and events may be modified in future versions.

  You can add an Application Insights Azure resource to your Microsoft Foundry project. See the Tracing tab in your Microsoft Foundry project. If one is enabled, you can get the Application Insights connection string, configure your AI Projects client, and observe the full execution path through Azure Monitor. Typically, you might want to start tracing before you create a client or Agent.

+ ### Installation
+
+ ```bash
+ npm install @azure/monitor-opentelemetry@^1.14.2 @opentelemetry/api@^1.9.0
+ ```
+
+ ### How to enable tracing
+
+ Here is a code sample that shows how to enable Azure Monitor tracing:
+
+ ```ts snippet:tracing
+ import { AzureMonitorOpenTelemetryOptions, useAzureMonitor } from "@azure/monitor-opentelemetry";
+
+ const TELEMETRY_CONNECTION_STRING = process.env["TELEMETRY_CONNECTION_STRING"];
+ const options: AzureMonitorOpenTelemetryOptions = {
+   azureMonitorExporterOptions: {
+     connectionString: TELEMETRY_CONNECTION_STRING,
+   },
+ };
+ useAzureMonitor(options);
+ ```
+
+ See the full sample code in [remoteTelemetry.ts](https://github.com/Azure/azure-sdk-for-js/blob/main/sdk/ai/ai-projects/samples-dev/telemetry/remoteTelemetry.ts).
+
  ## Troubleshooting

  ### Exceptions
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@azure/ai-projects",
- "version": "2.0.0-alpha.20260108.1",
+ "version": "2.0.0-alpha.20260112.1",
  "description": "Azure AI Projects client library.",
  "engines": {
  "node": ">=20.0.0"
@@ -99,8 +99,8 @@
  "vitest": "^4.0.8",
  "@azure-tools/test-credential": "^2.1.2",
  "@azure-tools/test-recorder": "^4.1.1",
- "@azure-tools/test-utils-vitest": "^2.0.1",
  "@azure/dev-tool": "^1.0.0",
+ "@azure-tools/test-utils-vitest": "^2.0.1",
  "@azure/eslint-plugin-azure-sdk": "^3.0.0"
  },
  "exports": {