@aigne/example-workflow-router 1.19.4 → 1.19.5-beta.1

This diff shows the changes between two publicly released versions of the package, as published to a supported registry. It is provided for informational purposes only.
Files changed (2)
  1. package/README.md +43 -25
  2. package/package.json +7 -7
package/README.md CHANGED
@@ -45,9 +45,9 @@ class other processing
 
 ## Quick Start (No Installation Required)
 
-```bash
-export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key
+### Run the Example
 
+```bash
 # Run in one-shot mode (default)
 npx -y @aigne/example-workflow-router
 
@@ -58,44 +58,62 @@ npx -y @aigne/example-workflow-router --chat
 echo "How do I return a product?" | npx -y @aigne/example-workflow-router
 ```
 
-## Installation
+### Connect to an AI Model
 
-### Clone the Repository
+As an example, running `npx -y @aigne/example-workflow-router --chat` requires an AI model. If this is your first run, you need to connect one.
+
+![run example](./run-example.png)
+
+- Connect via the official AIGNE Hub
+
+  Choose the first option and your browser will open the official AIGNE Hub page. Follow the prompts to complete the connection. If you're a new user, the system automatically grants 400,000 tokens for you to use.
+
+  ![connect to official aigne hub](../images/connect-to-aigne-hub.png)
+
+- Connect via a self-hosted AIGNE Hub
+
+  Choose the second option, enter the URL of your self-hosted AIGNE Hub, and follow the prompts to complete the connection. If you need to set up a self-hosted AIGNE Hub, visit the Blocklet Store to install and deploy it: [Blocklet Store](https://store.blocklet.dev/blocklets/z8ia3xzq2tMq8CRHfaXj1BTYJyYnEcHbqP8cJ?utm_source=www.arcblock.io&utm_medium=blog_link&utm_campaign=default&utm_content=store.blocklet.dev#:~:text=%F0%9F%9A%80%20Get%20Started%20in%20Minutes).
+
+  ![connect to self hosted aigne hub](../images/connect-to-self-hosted-aigne-hub.png)
+
+- Connect via a third-party model provider
+
+  Using OpenAI as an example, you can configure the provider's API key via environment variables. After configuration, run the example again:
 
 ```bash
-git clone https://github.com/AIGNE-io/aigne-framework
+export OPENAI_API_KEY="" # Set your OpenAI API key here
 ```
+For more details on third-party model configuration (e.g., OpenAI, DeepSeek, Google Gemini), see [.env.local.example](./.env.local.example).
 
-### Install Dependencies
+After configuration, run the example again.
 
-```bash
-cd aigne-framework/examples/workflow-router
+### Debugging
 
-pnpm install
-```
+The `aigne observe` command starts a local web server to monitor and analyze agent execution data. It provides a user-friendly interface to inspect traces, view detailed call information, and understand your agent's behavior during runtime. This tool is essential for debugging, performance tuning, and gaining insight into how your agent processes information and interacts with tools and models.
+
+Start the observation server.
 
-### Setup Environment Variables
+![aigne-observe-execute](../images/aigne-observe-execute.png)
+
+View a list of recent executions.
+
+![aigne-observe-list](../images/aigne-observe-list.png)
+
+## Installation
 
-Setup your OpenAI API key in the `.env.local` file:
+### Clone the Repository
 
 ```bash
-OPENAI_API_KEY="" # Set your OpenAI API key here
+git clone https://github.com/AIGNE-io/aigne-framework
 ```
 
-#### Using Different Models
-
-You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+### Install Dependencies
 
-* **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
-* **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
-* **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
-* **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
-* **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
-* **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
-* **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
-* **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+```bash
+cd aigne-framework/examples/workflow-router
 
-For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+pnpm install
+```
 
 ### Run the Example
 
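The updated Quick Start moves provider configuration out of the install flow: you export the provider's API key, then re-run the example. A minimal shell sketch of that flow is below; the key value is a placeholder, and the `npx` invocation is left commented out because it needs network access and a real key.

```shell
# Configure a third-party provider (OpenAI here) via an environment variable.
# The key below is a placeholder, not a real credential.
export OPENAI_API_KEY="sk-your-key-here"

# One-shot mode (default); shown for reference, requires network access:
# npx -y @aigne/example-workflow-router

# Confirm the variable will be visible to child processes such as npx:
echo "OPENAI_API_KEY is ${OPENAI_API_KEY:+set}"
```

The `${VAR:+word}` expansion prints `set` only when the variable is non-empty, which is a cheap sanity check before launching the example.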
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@aigne/example-workflow-router",
-  "version": "1.19.4",
+  "version": "1.19.5-beta.1",
   "description": "A demonstration of using AIGNE Framework to build a router workflow",
   "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
   "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/workflow-router",
@@ -16,15 +16,15 @@
     "README.md"
   ],
   "dependencies": {
-    "@aigne/cli": "^1.57.3",
-    "@aigne/agent-library": "^1.22.4",
-    "@aigne/default-memory": "^1.3.4",
-    "@aigne/openai": "^0.16.14",
-    "@aigne/core": "^1.70.1"
+    "@aigne/core": "^1.71.0-beta.1",
+    "@aigne/agent-library": "^1.23.0-beta.1",
+    "@aigne/cli": "^1.58.0-beta.1",
+    "@aigne/default-memory": "^1.3.5-beta.1",
+    "@aigne/openai": "^0.16.15-beta.1"
   },
   "devDependencies": {
     "@types/bun": "^1.2.22",
-    "@aigne/test-utils": "^0.5.68-beta.1"
+    "@aigne/test-utils": "^0.5.68-beta.1"
   },
   "scripts": {
     "start": "bun run index.ts",
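The version bump from 1.19.4 to 1.19.5-beta.1 follows the semver pre-release convention: everything after the first hyphen is the pre-release tag. An illustrative shell sketch of splitting such a version string (a simplification, not the full semver grammar):

```shell
# Split a semver string into its core version and pre-release tag.
v="1.19.5-beta.1"
core="${v%%-*}"      # strip the longest suffix starting at '-': 1.19.5
pre="${v#"$core"}"   # what remains after the core: -beta.1 (empty for stable)
echo "core=$core pre=${pre#-}"
```

For a stable release such as `1.19.4`, `pre` expands to the empty string, so the same two expansions handle both cases.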