@aigne/example-workflow-handoff 1.12.6 → 1.13.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/.env.local.example CHANGED
@@ -1,5 +1,44 @@
  # Change the name of this file to .env.local and fill in the following values

- DEBUG=aigne:mcp
+ # Uncomment the lines below to enable debug logging
+ # DEBUG="aigne:*"

- OPENAI_API_KEY="" # Your OpenAI API key
+ # Use different Models
+
+ # OpenAI
+ MODEL="openai:gpt-4.1"
+ OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
+
+ # Anthropic claude
+ # MODEL="anthropic:claude-3-7-sonnet-latest"
+ # ANTHROPIC_API_KEY=""
+
+ # Gemini
+ # MODEL="gemini:gemini-2.0-flash"
+ # GEMINI_API_KEY=""
+
+ # Bedrock nova
+ # MODEL=bedrock:us.amazon.nova-premier-v1:0
+ # AWS_ACCESS_KEY_ID=""
+ # AWS_SECRET_ACCESS_KEY=""
+ # AWS_REGION=us-west-2
+
+ # DeepSeek
+ # MODEL="deepseek:deepseek-chat"
+ # DEEPSEEK_API_KEY=""
+
+ # OpenRouter
+ # MODEL="openrouter:openai/gpt-4o"
+ # OPEN_ROUTER_API_KEY=""
+
+ # xAI
+ # MODEL="xai:grok-2-latest"
+ # XAI_API_KEY=""
+
+ # Ollama
+ # MODEL="ollama:llama3.2"
+ # OLLAMA_DEFAULT_BASE_URL="http://localhost:11434/v1"
+
+
+ # Setup proxy if needed
+ # HTTPS_PROXY=http://localhost:7890
package/README.md CHANGED
@@ -23,7 +23,6 @@ class agentB processing

  Workflow of a handoff between user and two agents:

-
  ```mermaid
  sequenceDiagram

@@ -42,11 +41,11 @@ end

  ## Prerequisites

- - [Node.js](https://nodejs.org) and npm installed on your machine
- - An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
- - Optional dependencies (if running the example from source code):
-   - [Bun](https://bun.sh) for running unit tests & examples
-   - [Pnpm](https://pnpm.io) for package management
+ * [Node.js](https://nodejs.org) and npm installed on your machine
+ * An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
+ * Optional dependencies (if running the example from source code):
+   * [Bun](https://bun.sh) for running unit tests & examples
+   * [Pnpm](https://pnpm.io) for package management

  ## Quick Start (No Installation Required)

@@ -87,6 +86,21 @@ Setup your OpenAI API key in the `.env.local` file:
  OPENAI_API_KEY="" # Set your OpenAI API key here
  ```

+ #### Using Different Models
+
+ You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+
+ * **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
+ * **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
+ * **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
+ * **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
+ * **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
+ * **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
+ * **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
+ * **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+
+ For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+
  ### Run the Example

  ```bash
@@ -106,7 +120,7 @@ The example supports the following command-line parameters:
  | Parameter | Description | Default |
  |-----------|-------------|---------|
  | `--chat` | Run in interactive chat mode | Disabled (one-shot mode) |
- | `--model <provider[:model]>` | AI model to use in format 'provider[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
+ | `--model <provider[:model]>` | AI model to use in format 'provider\[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
  | `--temperature <value>` | Temperature for model generation | Provider default |
  | `--top-p <value>` | Top-p sampling value | Provider default |
  | `--presence-penalty <value>` | Presence penalty value | Provider default |
@@ -132,12 +146,10 @@ echo "transfer to agent b" | pnpm start
  The following example demonstrates how to build a handoff workflow:

  ```typescript
- import assert from "node:assert";
  import { AIAgent, AIGNE } from "@aigne/core";
  import { OpenAIChatModel } from "@aigne/core/models/openai-chat-model.js";

  const { OPENAI_API_KEY } = process.env;
- assert(OPENAI_API_KEY, "Please set the OPENAI_API_KEY environment variable");

  const model = new OpenAIChatModel({
    apiKey: OPENAI_API_KEY,
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@aigne/example-workflow-handoff",
-   "version": "1.12.6",
+   "version": "1.13.0",
    "description": "A demonstration of using AIGNE Framework to build a handoff workflow",
    "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
    "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/workflow-handoff",
@@ -17,14 +17,14 @@
    ],
    "dependencies": {
      "zod": "^3.24.4",
-     "@aigne/agent-library": "^1.17.3",
-     "@aigne/cli": "^1.16.0",
-     "@aigne/core": "^1.26.0",
-     "@aigne/openai": "^0.4.3"
+     "@aigne/core": "^1.28.0",
+     "@aigne/cli": "^1.18.0",
+     "@aigne/openai": "^0.6.0",
+     "@aigne/agent-library": "^1.17.5"
    },
    "devDependencies": {
      "@types/bun": "^1.2.9",
-     "@aigne/test-utils": "^0.4.10"
+     "@aigne/test-utils": "^0.4.12"
    },
    "scripts": {
      "start": "bun run index.ts",
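
The `provider[:model]` spec accepted by both the `MODEL` environment variable and the `--model` flag in the diff above can be sketched with plain POSIX shell parameter expansion. This is an illustrative sketch only — `spec`, `provider`, and `model` are local names for this example, not variables used by the AIGNE CLI:

```shell
# Split a "provider[:model]" spec at the first colon; the model part is optional.
spec="openai:gpt-4o-mini"
provider="${spec%%:*}"   # everything before the first colon
model="${spec#*:}"       # everything after the first colon
if [ "$spec" = "$provider" ]; then
  model=""               # no colon present: only the provider was given
fi
echo "provider=$provider model=$model"   # → provider=openai model=gpt-4o-mini
```

Splitting at the first colon also handles specs whose model id itself contains a colon, such as `bedrock:us.amazon.nova-premier-v1:0` from the env example.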