@aigne/example-workflow-concurrency 1.16.86 → 1.16.87-beta.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +43 -25
  2. package/package.json +6 -6
package/README.md CHANGED
@@ -42,9 +42,9 @@ class aggregator processing
 
 ## Quick Start (No Installation Required)
 
-```bash
-export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key
+### Run the Example
 
+```bash
 # Run in one-shot mode (default)
 npx -y @aigne/example-workflow-concurrency
 
@@ -55,44 +55,62 @@ npx -y @aigne/example-workflow-concurrency --chat
 echo "Analyze product: Smart home assistant with voice control and AI learning capabilities" | npx -y @aigne/example-workflow-concurrency
 ```
 
-## Installation
+### Connect to an AI Model
 
-### Clone the Repository
+As an example, running `npx -y @aigne/example-workflow-concurrency --chat` requires an AI model. If this is your first run, you need to connect one.
+
+![run example](./run-example.png)
+
+- Connect via the official AIGNE Hub
+
+Choose the first option and your browser will open the official AIGNE Hub page. Follow the prompts to complete the connection. If you're a new user, the system automatically grants 400,000 tokens for you to use.
+
+![connect to official aigne hub](../images/connect-to-aigne-hub.png)
+
+- Connect via a self-hosted AIGNE Hub
+
+Choose the second option, enter the URL of your self-hosted AIGNE Hub, and follow the prompts to complete the connection. If you need to set up a self-hosted AIGNE Hub, visit the Blocklet Store to install and deploy it: [Blocklet Store](https://store.blocklet.dev/blocklets/z8ia3xzq2tMq8CRHfaXj1BTYJyYnEcHbqP8cJ?utm_source=www.arcblock.io&utm_medium=blog_link&utm_campaign=default&utm_content=store.blocklet.dev#:~:text=%F0%9F%9A%80%20Get%20Started%20in%20Minutes).
+
+![connect to self hosted aigne hub](../images/connect-to-self-hosted-aigne-hub.png)
+
+- Connect via a third-party model provider
+
+Using OpenAI as an example, you can configure the provider's API key via environment variables. After configuration, run the example again:
 
 ```bash
-git clone https://github.com/AIGNE-io/aigne-framework
+export OPENAI_API_KEY="" # Set your OpenAI API key here
 ```
+For more details on third-party model configuration (e.g., OpenAI, DeepSeek, Google Gemini), see [.env.local.example](./.env.local.example).
 
-### Install Dependencies
+After configuration, run the example again.
 
-```bash
-cd aigne-framework/examples/workflow-concurrency
+### Debugging
 
-pnpm install
-```
+The `aigne observe` command starts a local web server to monitor and analyze agent execution data. It provides a user-friendly interface to inspect traces, view detailed call information, and understand your agent’s behavior during runtime. This tool is essential for debugging, performance tuning, and gaining insight into how your agent processes information and interacts with tools and models.
+
+Start the observation server.
 
-### Setup Environment Variables
+![aigne-observe-execute](../images/aigne-observe-execute.png)
+
+View a list of recent executions.
+
+![aigne-observe-list](../images/aigne-observe-list.png)
+
+## Installation
 
-Setup your OpenAI API key in the `.env.local` file:
+### Clone the Repository
 
 ```bash
-OPENAI_API_KEY="" # Set your OpenAI API key here
+git clone https://github.com/AIGNE-io/aigne-framework
 ```
 
-#### Using Different Models
-
-You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+### Install Dependencies
 
-* **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
-* **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
-* **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
-* **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
-* **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
-* **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
-* **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
-* **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+```bash
+cd aigne-framework/examples/workflow-concurrency
 
-For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+pnpm install
+```
 
 ### Run the Example
 
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@aigne/example-workflow-concurrency",
-  "version": "1.16.86",
+  "version": "1.16.87-beta.1",
   "description": "A demonstration of using AIGNE Framework to build a concurrency workflow",
   "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
   "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/workflow-concurrency",
@@ -16,14 +16,14 @@
     "README.md"
   ],
   "dependencies": {
-    "@aigne/agent-library": "^1.22.4",
-    "@aigne/cli": "^1.57.3",
-    "@aigne/core": "^1.70.1",
-    "@aigne/openai": "^0.16.14"
+    "@aigne/agent-library": "^1.23.0-beta.1",
+    "@aigne/cli": "^1.58.0-beta.1",
+    "@aigne/core": "^1.71.0-beta.1",
+    "@aigne/openai": "^0.16.15-beta.1"
   },
   "devDependencies": {
     "@types/bun": "^1.2.22",
-    "@aigne/test-utils": "^0.5.67"
+    "@aigne/test-utils": "^0.5.68-beta.1"
   },
   "scripts": {
     "start": "bun run index.ts",