@aigne/example-chat-bot 1.12.1-6 → 1.13.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,18 +1,44 @@
  # Change the name of this file to .env.local and fill in the following values

- # Use this for OpenAI models
- # MODEL="openai:gpt-4.1"
- OPENAI_API_KEY="" # Your OpenAI API key
+ # Uncomment the lines below to enable debug logging
+ # DEBUG="aigne:*"

- # Use this for Anthropic models
+ # Use different Models
+
+ # OpenAI
+ MODEL="openai:gpt-4.1"
+ OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
+
+ # Anthropic claude
  # MODEL="anthropic:claude-3-7-sonnet-latest"
- # ANTHROPIC_API_KEY="" # Your Anthropic API key
+ # ANTHROPIC_API_KEY=""

- # Use this for AWS Bedrock models
+ # Gemini
+ # MODEL="gemini:gemini-2.0-flash"
+ # GEMINI_API_KEY=""
+
+ # Bedrock nova
+ # MODEL=bedrock:us.amazon.nova-premier-v1:0
  # AWS_ACCESS_KEY_ID=""
  # AWS_SECRET_ACCESS_KEY=""
  # AWS_REGION=us-west-2
- # MODEL=Bedrock:us.amazon.nova-premier-v1:0
+
+ # DeepSeek
+ # MODEL="deepseek:deepseek-chat"
+ # DEEPSEEK_API_KEY=""
+
+ # OpenRouter
+ # MODEL="openrouter:openai/gpt-4o"
+ # OPEN_ROUTER_API_KEY=""
+
+ # xAI
+ # MODEL="xai:grok-2-latest"
+ # XAI_API_KEY=""
+
+ # Ollama
+ # MODEL="ollama:llama3.2"
+ # OLLAMA_DEFAULT_BASE_URL="http://localhost:11434/v1";
+

  # Setup proxy if needed
  # HTTPS_PROXY=http://localhost:7890
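The updated example env file above documents one provider per block. As a minimal sketch, assuming the variable names shown in the diff and that only one `MODEL` line is active at a time, a working `.env.local` for a single provider might look like this (the key value is a placeholder, not a real credential):

```bash
# Minimal .env.local sketch: enable exactly one provider block.
MODEL="anthropic:claude-3-7-sonnet-latest"
ANTHROPIC_API_KEY="your-anthropic-api-key"   # placeholder, not a real key

# Optional verbose framework logging, as documented in the example file above
# DEBUG="aigne:*"
```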
package/CHANGELOG.md CHANGED
@@ -1,5 +1,19 @@
  # Changelog

+ ## [1.13.0](https://github.com/AIGNE-io/aigne-framework/compare/example-chat-bot-v1.12.0...example-chat-bot-v1.13.0) (2025-07-01)
+
+
+ ### Features
+
+ * **example:** use AIGNE cli to run chat-bot example ([#198](https://github.com/AIGNE-io/aigne-framework/issues/198)) ([7085541](https://github.com/AIGNE-io/aigne-framework/commit/708554100692f2a557f7329ea78e46c3c870ce10))
+
+
+ ### Dependencies
+
+ * The following workspace dependencies were updated
+   * dependencies
+     * @aigne/cli bumped to 1.18.0
+
  ## [1.12.0](https://github.com/AIGNE-io/aigne-framework/compare/example-chat-bot-v1.11.0...example-chat-bot-v1.12.0) (2025-07-01)

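The headline change in 1.13.0 is that the example is now driven by the AIGNE CLI (see the README diff below). To try the published package without cloning the repository, the README's quick-start command still applies; pinning the release is an optional refinement that relies on standard npx version syntax:

```bash
# Quick start from the README; the @1.13.0 pin uses standard npx version syntax
echo "Tell me about AIGNE Framework" | npx -y @aigne/example-chat-bot@1.13.0
```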
package/README.md CHANGED
@@ -4,11 +4,8 @@ This example demonstrates how to create and run an agent-based chatbot using the

  ## Prerequisites

- - [Node.js](https://nodejs.org) and npm installed on your machine
- - An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services
- - Optional dependencies (if running the example from source code):
-   - [Pnpm](https://pnpm.io) for package management
-   - [Bun](https://bun.sh) for running unit tests & examples
+ * [Node.js](https://nodejs.org) and npm installed on your machine
+ * An [OpenAI API key](https://platform.openai.com/api-keys) for interacting with OpenAI's services

  ## Quick Start (No Installation Required)

@@ -27,38 +24,53 @@ echo "Tell me about AIGNE Framework" | npx -y @aigne/example-chat-bot

  ## Installation

- ### Clone the Repository
+ ### Install AIGNE CLI

  ```bash
- git clone https://github.com/AIGNE-io/aigne-framework
+ npm install -g @aigne/cli
  ```

- ### Install Dependencies
+ ### Clone the Repository

  ```bash
- cd aigne-framework/examples/chat-bot
+ git clone https://github.com/AIGNE-io/aigne-framework

- pnpm install
+ cd aigne-framework/examples/chat-bot
  ```

  ### Setup Environment Variables

- Setup your OpenAI API key in the `.env.local` file:
+ Setup your OpenAI API key in the `.env.local` file (you can rename `.env.local.example` to `.env.local`):

  ```bash
  OPENAI_API_KEY="" # Set your OpenAI API key here
  ```

+ #### Using Different Models
+
+ You can use different AI models by setting the `MODEL` environment variable along with the corresponding API key. The framework supports multiple providers:
+
+ * **OpenAI**: `MODEL="openai:gpt-4.1"` with `OPENAI_API_KEY`
+ * **Anthropic**: `MODEL="anthropic:claude-3-7-sonnet-latest"` with `ANTHROPIC_API_KEY`
+ * **Google Gemini**: `MODEL="gemini:gemini-2.0-flash"` with `GEMINI_API_KEY`
+ * **AWS Bedrock**: `MODEL="bedrock:us.amazon.nova-premier-v1:0"` with AWS credentials
+ * **DeepSeek**: `MODEL="deepseek:deepseek-chat"` with `DEEPSEEK_API_KEY`
+ * **OpenRouter**: `MODEL="openrouter:openai/gpt-4o"` with `OPEN_ROUTER_API_KEY`
+ * **xAI**: `MODEL="xai:grok-2-latest"` with `XAI_API_KEY`
+ * **Ollama**: `MODEL="ollama:llama3.2"` with `OLLAMA_DEFAULT_BASE_URL`
+
+ For detailed configuration examples, please refer to the `.env.local.example` file in this directory.
+
  ### Run the Example

  ```bash
- pnpm start # Run in one-shot mode (default)
+ aigne run # Run in one-shot mode (default)

  # Run in interactive chat mode
- pnpm start -- --chat
+ aigne run --chat

  # Use pipeline input
- echo "Tell me about AIGNE Framework" | pnpm start
+ echo "Tell me about AIGNE Framework" | aigne run
  ```

  ### Run Options
@@ -68,23 +80,10 @@ The example supports the following command-line parameters:
  | Parameter | Description | Default |
  |-----------|-------------|---------|
  | `--chat` | Run in interactive chat mode | Disabled (one-shot mode) |
- | `--model <provider[:model]>` | AI model to use in format 'provider[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
+ | `--model <provider[:model]>` | AI model to use in format 'provider\[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini' | openai |
  | `--temperature <value>` | Temperature for model generation | Provider default |
  | `--top-p <value>` | Top-p sampling value | Provider default |
  | `--presence-penalty <value>` | Presence penalty value | Provider default |
  | `--frequency-penalty <value>` | Frequency penalty value | Provider default |
  | `--log-level <level>` | Set logging level (ERROR, WARN, INFO, DEBUG, TRACE) | INFO |
  | `--input`, `-i <input>` | Specify input directly | None |
-
- #### Examples
-
- ```bash
- # Run in chat mode (interactive)
- pnpm start -- --chat
-
- # Set logging level
- pnpm start -- --log-level DEBUG
-
- # Use pipeline input
- echo "Tell me about AIGNE Framework" | pnpm start
- ```
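The README's dedicated Examples section was dropped along with the `pnpm start` entrypoint. For reference, the same scenarios expressed with the new `aigne run` entrypoint might look like the sketch below; the individual flags come from the Run Options table above, and combining them in a single invocation is an assumption rather than something the README shows:

```bash
# Interactive chat mode
aigne run --chat

# One-shot run with an explicit model and verbose logging
# (flags as documented in the Run Options table; combining them is assumed)
aigne run --model openai:gpt-4o-mini --log-level DEBUG

# Pipeline input
echo "Tell me about AIGNE Framework" | aigne run
```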
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
    "name": "@aigne/example-chat-bot",
-   "version": "1.12.1-6",
+   "version": "1.13.0",
    "description": "A demonstration of using AIGNE Framework to build a chat bot",
    "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
    "homepage": "https://github.com/AIGNE-io/aigne-framework/tree/main/examples/chat-bot",
@@ -12,9 +12,10 @@
    "type": "module",
    "bin": "index.js",
    "dependencies": {
-     "@aigne/cli": "^1.17.0"
+     "@aigne/cli": "^1.18.0"
    },
    "scripts": {
-     "test": "aigne test"
+     "test": "aigne test",
+     "test:llm": "aigne run"
    }
  }
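Besides the version and `@aigne/cli` bumps, the scripts block gains a `test:llm` entry that simply invokes `aigne run`. A rough sketch of how the two scripts would be used, assuming dependencies are installed and, for `test:llm`, that a valid API key is configured in `.env.local`:

```bash
# From the example directory (aigne-framework/examples/chat-bot):
npm run test      # runs "aigne test"
npm run test:llm  # runs "aigne run", which calls a real model and needs the key from .env.local
```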
package/.env.test DELETED
@@ -1,4 +0,0 @@
- LISTR_FORCE_TTY=1
- INITIAL_CALL="Show your main function by performing a basic task you're designed for."
- SKIP_LOOP=true
- AIGNE_OBSERVABILITY_DISABLED=true