@aigne/example-workflow-reflection 1.15.87 → 1.15.88-beta.1

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)
  1. package/README.md +43 -25
  2. package/package.json +6 -6
package/README.md CHANGED
@@ -39,9 +39,9 @@ class reviewer processing
  
  ## Quick Start (No Installation Required)
  
- ```bash
- export OPENAI_API_KEY=YOUR_OPENAI_API_KEY # Set your OpenAI API key
+ ### Run the Example
  
+ ```bash
  # Run in one-shot mode (default)
  npx -y @aigne/example-workflow-reflection
  
@@ -52,44 +52,62 @@ npx -y @aigne/example-workflow-reflection --chat
  echo "Write a function to validate email addresses" | npx -y @aigne/example-workflow-reflection
  ```
  
- ## Installation
+ ### Connect to an AI Model
  
- ### Clone the Repository
+ As an example, running `npx -y @aigne/example-workflow-reflection --chat` requires an AI model. If this is your first run, you need to connect one.
+
+ ![run example](./run-example.png)
+
+ - Connect via the official AIGNE Hub
+
+ Choose the first option and your browser will open the official AIGNE Hub page. Follow the prompts to complete the connection. If you're a new user, the system automatically grants 400,000 tokens for you to use.
+
+ ![connect to official aigne hub](../images/connect-to-aigne-hub.png)
+
+ - Connect via a self-hosted AIGNE Hub
+
+ Choose the second option, enter the URL of your self-hosted AIGNE Hub, and follow the prompts to complete the connection. If you need to set up a self-hosted AIGNE Hub, visit the Blocklet Store to install and deploy it: [Blocklet Store](https://store.blocklet.dev/blocklets/z8ia3xzq2tMq8CRHfaXj1BTYJyYnEcHbqP8cJ?utm_source=www.arcblock.io&utm_medium=blog_link&utm_campaign=default&utm_content=store.blocklet.dev#:~:text=%F0%9F%9A%80%20Get%20Started%20in%20Minutes).
+
+ ![connect to self hosted aigne hub](../images/connect-to-self-hosted-aigne-hub.png)
+
+ - Connect via a third-party model provider
+
+ Using OpenAI as an example, you can configure the provider's API key via environment variables. After configuration, run the example again:
  
  ```bash
- git clone https://github.com/AIGNE-io/aigne-framework
+ export OPENAI_API_KEY="" # Set your OpenAI API key here
  ```
+ For more details on third-party model configuration (e.g., OpenAI, DeepSeek, Google Gemini), see [.env.local.example](./.env.local.example).
  
- ### Install Dependencies
+ After configuration, run the example again.
  
- ```bash
- cd aigne-framework/examples/workflow-reflection
+ ### Debugging
  
- pnpm install
- ```
+ The `aigne observe` command starts a local web server to monitor and analyze agent execution data. It provides a user-friendly interface to inspect traces, view detailed call information, and understand your agent’s behavior during runtime. This tool is essential for debugging, performance tuning, and gaining insight into how your agent processes information and interacts with tools and models.
+
+ Start the observation server.
  
- ### Setup Environment Variables
+ ![aigne-observe-execute](../images/aigne-observe-execute.png)
+
+ View a list of recent executions.
+
+ ![aigne-observe-list](../images/aigne-observe-list.png)
+
+ ## Installation
  
- Setup your OpenAI API key in the `.env.local` file:
+ ### Clone the Repository
  
  ```bash
- OPENAI_API_KEY="" # Set your OpenAI API key here
+ git clone https://github.com/AIGNE-io/aigne-framework
  ```
  
- #### Using Different Models
-
- You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+ ### Install Dependencies
  
- * **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
- * **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
- * **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
- * **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
- * **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
- * **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
- * **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
- * **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+ ```bash
+ cd aigne-framework/examples/workflow-reflection
  
- For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+ pnpm install
+ ```
  
  ### Run the Example
  
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@aigne/example-workflow-reflection",
- "version": "1.15.87",
+ "version": "1.15.88-beta.1",
  "description": "A demonstration of using AIGNE Framework to build a reflection workflow",
  "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
  "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/workflow-reflection",
@@ -17,14 +17,14 @@
  ],
  "dependencies": {
  "zod": "^3.25.67",
- "@aigne/agent-library": "^1.22.4",
- "@aigne/cli": "^1.57.3",
- "@aigne/core": "^1.70.1",
- "@aigne/openai": "^0.16.14"
+ "@aigne/agent-library": "^1.23.0-beta.1",
+ "@aigne/cli": "^1.58.0-beta.1",
+ "@aigne/core": "^1.71.0-beta.1",
+ "@aigne/openai": "^0.16.15-beta.1"
  },
  "devDependencies": {
  "@types/bun": "^1.2.22",
- "@aigne/test-utils": "^0.5.67"
+ "@aigne/test-utils": "^0.5.68-beta.1"
  },
  "scripts": {
  "start": "bun run index.ts",
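
The README changes above spread the run and debugging steps across several hunks. Collected as one runnable sketch, using only commands the new README names; the final line assumes the `aigne` binary from `@aigne/cli` is on your PATH, since the Debugging section refers to the `aigne observe` command without showing its invocation:

```shell
# Optional: configure a third-party provider instead of AIGNE Hub
export OPENAI_API_KEY=""   # Set your OpenAI API key here

# Run the example in one-shot mode (default)
npx -y @aigne/example-workflow-reflection

# Or pipe a prompt in non-interactively
echo "Write a function to validate email addresses" | npx -y @aigne/example-workflow-reflection

# Start the local observability server from the Debugging section
aigne observe
```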