@aigne/example-workflow-sequential 1.17.87-beta → 1.17.87-beta.2
This diff compares publicly available package versions as released to one of the supported registries. It is provided for informational purposes only and reflects the changes between those versions as they appear in the public registry.
- package/README.md +43 -25
- package/package.json +6 -6
package/README.md
CHANGED
@@ -40,9 +40,9 @@ class formatProof processing

 ## Quick Start (No Installation Required)

-
-export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key
+### Run the Example

+```bash
 # Run in one-shot mode (default)
 npx -y @aigne/example-workflow-sequential

@@ -53,44 +53,62 @@ npx -y @aigne/example-workflow-sequential --chat
 echo "Create marketing content for our new AI-powered fitness app" | npx -y @aigne/example-workflow-sequential
 ```

-
+### Connect to an AI Model

-
+As an example, running `npx -y @aigne/example-workflow-sequential --chat` requires an AI model. If this is your first run, you need to connect one.
+
+
+
+- Connect via the official AIGNE Hub
+
+Choose the first option and your browser will open the official AIGNE Hub page. Follow the prompts to complete the connection. If you're a new user, the system automatically grants 400,000 tokens for you to use.
+
+
+
+- Connect via a self-hosted AIGNE Hub
+
+Choose the second option, enter the URL of your self-hosted AIGNE Hub, and follow the prompts to complete the connection. If you need to set up a self-hosted AIGNE Hub, visit the Blocklet Store to install and deploy it: [Blocklet Store](https://store.blocklet.dev/blocklets/z8ia3xzq2tMq8CRHfaXj1BTYJyYnEcHbqP8cJ?utm_source=www.arcblock.io&utm_medium=blog_link&utm_campaign=default&utm_content=store.blocklet.dev#:~:text=%F0%9F%9A%80%20Get%20Started%20in%20Minutes).
+
+
+
+- Connect via a third-party model provider
+
+Using OpenAI as an example, you can configure the provider's API key via environment variables. After configuration, run the example again:

 ```bash
-
+export OPENAI_API_KEY="" # Set your OpenAI API key here
 ```
+For more details on third-party model configuration (e.g., OpenAI, DeepSeek, Google Gemini), see [.env.local.example](./.env.local.example).

-
+After configuration, run the example again.

-
-cd aigne-framework/examples/workflow-sequential
+### Debugging

-
-
+The `aigne observe` command starts a local web server to monitor and analyze agent execution data. It provides a user-friendly interface to inspect traces, view detailed call information, and understand your agent’s behavior during runtime. This tool is essential for debugging, performance tuning, and gaining insight into how your agent processes information and interacts with tools and models.
+
+Start the observation server.

-
+
+
+View a list of recent executions.
+
+
+
+## Installation

-
+### Clone the Repository

 ```bash
-
+git clone https://github.com/AIGNE-io/aigne-framework
 ```

-
-
-You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+### Install Dependencies

-
-
-* **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
-* **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
-* **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
-* **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
-* **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
-* **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+```bash
+cd aigne-framework/examples/workflow-sequential

-
+pnpm install
+```

 ### Run the Example

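Taken together, the README changes replace the single OPENAI_API_KEY quick-start with a fuller walkthrough: run the example via npx, connect a model (official AIGNE Hub, a self-hosted hub, or a third-party provider), and inspect runs with `aigne observe`. Below is a minimal sketch of that flow, assuming OpenAI as the third-party provider and an available AIGNE CLI for the observe step; the key value is a placeholder.

```bash
# Run the example without installing anything (one-shot mode)
npx -y @aigne/example-workflow-sequential

# Or configure a third-party provider and use interactive chat mode
export OPENAI_API_KEY="YOUR_OPENAI_API_KEY"   # placeholder value
npx -y @aigne/example-workflow-sequential --chat

# Start the local observation server to inspect traces of recent runs
aigne observe
```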
package/package.json
CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@aigne/example-workflow-sequential",
-  "version": "1.17.87-beta",
+  "version": "1.17.87-beta.2",
   "description": "A demonstration of using AIGNE Framework to build a sequential workflow",
   "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
   "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/workflow-sequential",
@@ -16,14 +16,14 @@
     "README.md"
   ],
   "dependencies": {
-    "@aigne/agent-library": "^1.23.0-beta",
-    "@aigne/cli": "^1.58.0-beta",
-    "@aigne/core": "^1.71.0-beta",
-    "@aigne/openai": "^0.16.15-beta"
+    "@aigne/agent-library": "^1.23.0-beta.2",
+    "@aigne/cli": "^1.58.0-beta.2",
+    "@aigne/core": "^1.71.0-beta.1",
+    "@aigne/openai": "^0.16.15-beta.1"
   },
   "devDependencies": {
     "@types/bun": "^1.2.22",
-    "@aigne/test-utils": "^0.5.68-beta"
+    "@aigne/test-utils": "^0.5.68-beta.1"
   },
   "scripts": {
     "start": "bun run index.ts",
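The package.json changes are pure version bumps: the example moves from 1.17.87-beta to 1.17.87-beta.2, and the @aigne/* dependencies move to newer beta patch releases. To try this specific release rather than whatever the dist-tag currently resolves to, the version can be pinned in the npx invocation shown in the README; the pinned form below is a standard npm version specifier, not a command documented in this package.

```bash
# Pin the exact beta release of the example when running it via npx
npx -y @aigne/example-workflow-sequential@1.17.87-beta.2
```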