@aigne/cli 1.17.0 → 1.18.0

package/CHANGELOG.md CHANGED
@@ -1,5 +1,28 @@
  # Changelog

+ ## [1.18.0](https://github.com/AIGNE-io/aigne-framework/compare/cli-v1.17.0...cli-v1.18.0) (2025-07-01)
+
+
+ ### Features
+
+ * **example:** use AIGNE cli to run chat-bot example ([#198](https://github.com/AIGNE-io/aigne-framework/issues/198)) ([7085541](https://github.com/AIGNE-io/aigne-framework/commit/708554100692f2a557f7329ea78e46c3c870ce10))
+
+
+ ### Dependencies
+
+ * The following workspace dependencies were updated
+   * dependencies
+     * @aigne/agent-library bumped to 1.17.5
+     * @aigne/anthropic bumped to 0.5.0
+     * @aigne/bedrock bumped to 0.5.0
+     * @aigne/core bumped to 1.28.0
+     * @aigne/deepseek bumped to 0.4.0
+     * @aigne/gemini bumped to 0.4.0
+     * @aigne/ollama bumped to 0.4.0
+     * @aigne/open-router bumped to 0.4.0
+     * @aigne/openai bumped to 0.6.0
+     * @aigne/xai bumped to 0.4.0
+
  ## [1.17.0](https://github.com/AIGNE-io/aigne-framework/compare/cli-v1.16.0...cli-v1.17.0) (2025-07-01)


package/README.md CHANGED
@@ -16,12 +16,12 @@ Command-line tool for [AIGNE Framework](https://github.com/AIGNE-io/aigne-framew

  ## Features

- - **Project Creation**: Quickly create new AIGNE projects with predefined file structures and configurations
- - **Agent Running**: Easily run and test AIGNE agents
- - **Testing Support**: Built-in test command for unit testing and integration testing
- - **MCP Services**: Support for launching agents as MCP servers for integration with external systems
- - **Interactive Interface**: Beautiful command-line interface providing an intuitive user experience
- - **Multi-model Support**: Support for OpenAI, Claude, XAI, and other model providers
+ * **Project Creation**: Quickly create new AIGNE projects with predefined file structures and configurations
+ * **Agent Running**: Easily run and test AIGNE agents
+ * **Testing Support**: Built-in test command for unit testing and integration testing
+ * **MCP Services**: Support for launching agents as MCP servers for integration with external systems
+ * **Interactive Interface**: Beautiful command-line interface providing an intuitive user experience
+ * **Multi-model Support**: Support for OpenAI, Claude, XAI, and other model providers

  ## Installation

@@ -81,8 +81,8 @@ aigne create my-project

  The interactive creation process will ask for:

- - Project name
- - Project template (currently supports the default template)
+ * Project name
+ * Project template (currently supports the default template)

  ## Run Command

@@ -104,10 +104,10 @@ aigne run --entry-agent myAgent

  Available options:

- - `--entry-agent <entry-agent>` - Specify the agent name to run (defaults to the first agent found)
- - `--cache-dir <dir>` - Specify the directory to download the package to (used in URL mode)
- - `--model <provider[:model]>` - Specify the AI model in format 'provider\[:model]' where model is optional (e.g., 'openai' or 'openai:gpt-4o-mini')
- - `--verbose` - Enable verbose logging
+ * `--entry-agent <entry-agent>` - Specify the agent name to run (defaults to the first agent found)
+ * `--cache-dir <dir>` - Specify the directory to download the package to (used in URL mode)
+ * `--model <provider[:model]>` - Specify the AI model in format 'provider\[:model]' where model is optional (e.g., 'openai' or 'openai:gpt-4o-mini')
+ * `--verbose` - Enable verbose logging

  ## Test Command

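The `provider[:model]` format documented for `--model` can be sketched as a small parser. This is illustrative only, not the CLI's actual implementation:

```javascript
// Illustrative sketch of the documented "provider[:model]" format:
// the provider comes before the first colon, and the model part is optional.
function parseModelOption(value) {
  const index = value.indexOf(":");
  if (index === -1) return { provider: value, model: undefined };
  return { provider: value.slice(0, index), model: value.slice(index + 1) };
}

console.log(parseModelOption("openai:gpt-4o-mini")); // provider "openai", model "gpt-4o-mini"
console.log(parseModelOption("openai"));             // provider "openai", no model
```

Splitting on only the first colon matters because some model identifiers themselves contain colons (e.g. the Bedrock entry in the env template).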
package/README.zh.md CHANGED
@@ -16,12 +16,12 @@

  ## Features

- - **Project Creation**: Quickly create new AIGNE projects with predefined file structures and configurations
- - **Agent Running**: Easily run and test AIGNE agents
- - **Testing Support**: Built-in test command for unit testing and integration testing
- - **MCP Services**: Support for launching agents as MCP servers for integration with external systems
- - **Interactive Interface**: Beautiful command-line interface providing an intuitive user experience
- - **Multi-model Support**: Support for OpenAI, Claude, XAI, and other model providers
+ * **Project Creation**: Quickly create new AIGNE projects with predefined file structures and configurations
+ * **Agent Running**: Easily run and test AIGNE agents
+ * **Testing Support**: Built-in test command for unit testing and integration testing
+ * **MCP Services**: Support for launching agents as MCP servers for integration with external systems
+ * **Interactive Interface**: Beautiful command-line interface providing an intuitive user experience
+ * **Multi-model Support**: Support for OpenAI, Claude, XAI, and other model providers

  ## Installation

@@ -81,8 +81,8 @@ aigne create my-project

  The interactive creation process will ask for:

- - Project name
- - Project template (currently supports the default template)
+ * Project name
+ * Project template (currently supports the default template)

  ## Run Command (run)

@@ -104,10 +104,10 @@ aigne run --entry-agent myAgent

  Available options:

- - `--entry-agent <agent>` - Specify the agent name to run (defaults to the first agent found)
- - `--cache-dir <dir>` - Specify the directory to download the package to (used in URL mode)
- - `--model <provider[:model]>` - Specify the AI model in format 'provider\[:model]' where model is optional (e.g., 'openai' or 'openai:gpt-4o-mini')
- - `--verbose` - Enable verbose logging
+ * `--entry-agent <agent>` - Specify the agent name to run (defaults to the first agent found)
+ * `--cache-dir <dir>` - Specify the directory to download the package to (used in URL mode)
+ * `--model <provider[:model]>` - Specify the AI model in format 'provider\[:model]' where model is optional (e.g., 'openai' or 'openai:gpt-4o-mini')
+ * `--verbose` - Enable verbose logging

  ## Test Command (test)

@@ -2,7 +2,7 @@ import { Command } from "commander";
  import { AIGNE_CLI_VERSION } from "../constants.js";
  import { asciiLogo } from "../utils/ascii-logo.js";
  import { createCreateCommand } from "./create.js";
- import { createObservabilityCommand } from "./observability.js";
+ import { createObservabilityCommand } from "./observe.js";
  import { createRunCommand } from "./run.js";
  import { createServeCommand } from "./serve.js";
  import { createTestCommand } from "./test.js";
@@ -1,6 +1,6 @@
  import { tryOrThrow } from "@aigne/core/utils/type-utils.js";
- import { startObservabilityCLIServer } from "@aigne/observability/cli";
- import getObservabilityDbPath from "@aigne/observability/db-path";
+ import { startObservabilityCLIServer } from "@aigne/observability-api/cli";
+ import getObservabilityDbPath from "@aigne/observability-api/db-path";
  import { Command } from "commander";
  import detectPort from "detect-port";
  const DEFAULT_PORT = () => tryOrThrow(() => {
@@ -20,7 +20,7 @@ export function createObservabilityCommand() {
  .action(async (options) => {
  const port = await detectPort(options.port || DEFAULT_PORT());
  const dbUrl = getObservabilityDbPath();
- console.log("DB PATH:", dbUrl);
+ console.log("Observability database path:", dbUrl);
  await startObservabilityCLIServer({ port, dbUrl });
  })
  .showHelpAfterError(true)
@@ -148,7 +148,7 @@ export class TerminalTracer {
  }
  formatResult(agent, context, m = {}) {
  const { isTTY } = process.stdout;
- const outputKey = agent instanceof AIAgent ? agent.outputKey : undefined;
+ const outputKey = this.outputKey || (agent instanceof AIAgent ? agent.outputKey : undefined);
  const prefix = logger.enabled(LogLevel.INFO)
  ? `${chalk.grey(figures.tick)} 🤖 ${this.formatTokenUsage(context.usage)}`
  : null;
@@ -6,5 +6,6 @@ export interface ChatLoopOptions {
  defaultQuestion?: string;
  inputKey?: string;
  skipLoop?: boolean;
+ outputKey?: string;
  }
  export declare function runChatLoopInTerminal(userAgent: UserAgent<any, any>, options?: ChatLoopOptions): Promise<void>;
@@ -42,7 +42,7 @@ export async function runChatLoopInTerminal(userAgent, options = {}) {
  }
  }
  async function callAgent(userAgent, input, options) {
- const tracer = new TerminalTracer(userAgent.context);
+ const tracer = new TerminalTracer(userAgent.context, options);
  await tracer.run(userAgent, typeof input === "string" ? { [options.inputKey || DEFAULT_CHAT_INPUT_KEY]: input } : input);
  }
  const COMMANDS = {
@@ -13,6 +13,7 @@ export interface RunAIGNECommandOptions {
  input?: string[];
  format?: "text" | "json" | "yaml";
  output?: string;
+ outputKey?: string;
  logLevel?: LogLevel;
  force?: boolean;
  }
@@ -124,7 +124,7 @@ export async function runWithAIGNE(agentCreator, { argv = process.argv, chatLoop
  }
  await runAgentWithAIGNE(aigne, agent, {
  ...options,
- outputKey,
+ outputKey: outputKey || options.outputKey,
  chatLoopOptions,
  modelOptions,
  input,
@@ -168,6 +168,7 @@ export async function runAgentWithAIGNE(aigne, agent, { outputKey, chatLoopOptio
  const userAgent = agent instanceof UserAgent ? agent : aigne.invoke(agent);
  await runChatLoopInTerminal(userAgent, {
  ...chatLoopOptions,
+ outputKey,
  });
  return;
  }
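The code changes above thread an optional `outputKey` from the run options down into the terminal tracer, preferring an explicitly configured key over the agent's own. A minimal sketch of that precedence, using hypothetical stand-in objects rather than the framework's real classes:

```javascript
// Hypothetical stand-in mirroring the precedence introduced above:
// an outputKey passed via options wins; otherwise fall back to the
// agent's own outputKey, if it defines one.
class TracerSketch {
  constructor(options = {}) {
    this.outputKey = options.outputKey;
  }
  resolveOutputKey(agent) {
    return this.outputKey || agent.outputKey;
  }
}

const agent = { outputKey: "message" };
console.log(new TracerSketch({ outputKey: "answer" }).resolveOutputKey(agent)); // "answer"
console.log(new TracerSketch().resolveOutputKey(agent)); // "message"
```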
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "@aigne/cli",
- "version": "1.17.0",
+ "version": "1.18.0",
  "description": "cli for AIGNE framework",
  "publishConfig": {
  "access": "public"
@@ -55,17 +55,17 @@
  "wrap-ansi": "^9.0.0",
  "yaml": "^2.7.1",
  "zod": "^3.24.4",
- "@aigne/agent-library": "^1.17.4",
- "@aigne/bedrock": "^0.4.0",
- "@aigne/anthropic": "^0.4.0",
- "@aigne/core": "^1.27.0",
- "@aigne/gemini": "^0.3.11",
- "@aigne/ollama": "^0.3.11",
- "@aigne/observability": "^0.3.0",
- "@aigne/deepseek": "^0.3.11",
- "@aigne/open-router": "^0.3.11",
- "@aigne/openai": "^0.5.0",
- "@aigne/xai": "^0.3.11"
+ "@aigne/agent-library": "^1.17.5",
+ "@aigne/anthropic": "^0.5.0",
+ "@aigne/core": "^1.28.0",
+ "@aigne/deepseek": "^0.4.0",
+ "@aigne/gemini": "^0.4.0",
+ "@aigne/bedrock": "^0.5.0",
+ "@aigne/observability-api": "^0.3.0",
+ "@aigne/open-router": "^0.4.0",
+ "@aigne/ollama": "^0.4.0",
+ "@aigne/openai": "^0.6.0",
+ "@aigne/xai": "^0.4.0"
  },
  "devDependencies": {
  "@types/archiver": "^6.0.3",
@@ -1,18 +1,44 @@
  # Change the name of this file to .env.local and fill in the following values

- # Use this for OpenAI models
+ # Uncomment the lines below to enable debug logging
+ # DEBUG="aigne:*"
+
+ # Use different Models
+
+ # OpenAI
  MODEL="openai:gpt-4.1"
- OPENAI_API_KEY=""
+ OPENAI_API_KEY="YOUR_OPENAI_API_KEY"

- # Use this for Anthropic models
+ # Anthropic claude
  # MODEL="anthropic:claude-3-7-sonnet-latest"
- # ANTHROPIC_API_KEY="" # Your Anthropic API key
+ # ANTHROPIC_API_KEY=""

- # Use this for AWS Bedrock models
+ # Gemini
+ # MODEL="gemini:gemini-2.0-flash"
+ # GEMINI_API_KEY=""
+
+ # Bedrock nova
+ # MODEL=bedrock:us.amazon.nova-premier-v1:0
  # AWS_ACCESS_KEY_ID=""
  # AWS_SECRET_ACCESS_KEY=""
  # AWS_REGION=us-west-2
- # MODEL=Bedrock:us.amazon.nova-premier-v1:0
+
+ # DeepSeek
+ # MODEL="deepseek:deepseek-chat"
+ # DEEPSEEK_API_KEY=""
+
+ # OpenRouter
+ # MODEL="openrouter:openai/gpt-4o"
+ # OPEN_ROUTER_API_KEY=""
+
+ # xAI
+ # MODEL="xai:grok-2-latest"
+ # XAI_API_KEY=""
+
+ # Ollama
+ # MODEL="ollama:llama3.2"
+ # OLLAMA_DEFAULT_BASE_URL="http://localhost:11434/v1";
+

  # Setup proxy if needed
  # HTTPS_PROXY=http://localhost:7890
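The env template above drives provider selection through a single `MODEL` variable. A hedged sketch of how a script might resolve it, with the template's OpenAI entry as the default (illustrative only; `resolveModel` is a hypothetical helper, not the CLI's actual loading code):

```javascript
// Illustrative: fall back to the template's default entry when MODEL is unset.
function resolveModel(env) {
  return env.MODEL || "openai:gpt-4.1";
}

console.log(resolveModel(process.env));
console.log(resolveModel({ MODEL: "deepseek:deepseek-chat" })); // "deepseek:deepseek-chat"
```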
@@ -17,6 +17,16 @@ This is the default project template for the AIGNE framework, providing a basic
  npm install -g aigne
  ```

+ ### Setup Environment Variables
+
+ Copy the `.env.local.example` file to `.env.local` and set your OpenAI API key:
+
+ ```shell
+ # OpenAI
+ MODEL="openai:gpt-4.1"
+ OPENAI_API_KEY="YOUR_OPENAI_API_KEY"
+ ```
+
  ### Start the Project

  ```bash