@aigne/cli 1.9.1 → 1.10.1

This diff compares the contents of the two publicly released package versions as they appear in their public registry; it is provided for informational purposes only.
package/CHANGELOG.md CHANGED
@@ -1,5 +1,50 @@
1
1
  # Changelog
2
2
 
3
+ ## [1.10.1](https://github.com/AIGNE-io/aigne-framework/compare/cli-v1.10.0...cli-v1.10.1) (2025-05-25)
4
+
5
+
6
+ ### Dependencies
7
+
8
+ * The following workspace dependencies were updated
9
+ * dependencies
10
+ * @aigne/anthropic bumped to 0.2.0
11
+ * @aigne/bedrock bumped to 0.2.0
12
+ * @aigne/core bumped to 1.17.0
13
+ * @aigne/deepseek bumped to 0.2.0
14
+ * @aigne/gemini bumped to 0.2.0
15
+ * @aigne/ollama bumped to 0.2.0
16
+ * @aigne/open-router bumped to 0.2.0
17
+ * @aigne/openai bumped to 0.2.0
18
+ * @aigne/xai bumped to 0.2.0
19
+
20
+ ## [1.10.0](https://github.com/AIGNE-io/aigne-framework/compare/cli-v1.9.1...cli-v1.10.0) (2025-05-23)
21
+
22
+
23
+ ### Features
24
+
25
+ * add `--chat` option for `run` command ([#120](https://github.com/AIGNE-io/aigne-framework/issues/120)) ([7699550](https://github.com/AIGNE-io/aigne-framework/commit/76995507001ca33b09b29f72ff588dae513cb340))
26
+ * **models:** publish model adapters as standalone packages ([#126](https://github.com/AIGNE-io/aigne-framework/issues/126)) ([588b8ae](https://github.com/AIGNE-io/aigne-framework/commit/588b8aea6abcee5fa87def1358bf51f84021c6ef))
27
+
28
+
29
+ ### Bug Fixes
30
+
31
+ * **cli:** listr ctx maybe undefined ([#130](https://github.com/AIGNE-io/aigne-framework/issues/130)) ([dfc7b13](https://github.com/AIGNE-io/aigne-framework/commit/dfc7b139e05cf9b6e0314f42f308d25e9b70ea5c))
32
+
33
+
34
+ ### Dependencies
35
+
36
+ * The following workspace dependencies were updated
37
+ * dependencies
38
+ * @aigne/anthropic bumped to 0.1.0
39
+ * @aigne/bedrock bumped to 0.1.0
40
+ * @aigne/core bumped to 1.16.0
41
+ * @aigne/deepseek bumped to 0.1.0
42
+ * @aigne/gemini bumped to 0.1.0
43
+ * @aigne/ollama bumped to 0.1.0
44
+ * @aigne/open-router bumped to 0.1.0
45
+ * @aigne/openai bumped to 0.1.0
46
+ * @aigne/xai bumped to 0.1.0
47
+
3
48
  ## [1.9.1](https://github.com/AIGNE-io/aigne-framework/compare/cli-v1.9.0...cli-v1.9.1) (2025-05-15)
4
49
 
5
50
 
package/README.md CHANGED
@@ -16,12 +16,12 @@ Command-line tool for [AIGNE Framework](https://github.com/AIGNE-io/aigne-framew
16
16
 
17
17
  ## Features
18
18
 
19
- - **Project Creation**: Quickly create new AIGNE projects with predefined file structures and configurations
20
- - **Agent Running**: Easily run and test AIGNE agents
21
- - **Testing Support**: Built-in test command for unit testing and integration testing
22
- - **MCP Services**: Support for launching agents as MCP servers for integration with external systems
23
- - **Interactive Interface**: Beautiful command-line interface providing an intuitive user experience
24
- - **Multi-model Support**: Support for OpenAI, Claude, XAI, and other model providers
19
+ * **Project Creation**: Quickly create new AIGNE projects with predefined file structures and configurations
20
+ * **Agent Running**: Easily run and test AIGNE agents
21
+ * **Testing Support**: Built-in test command for unit testing and integration testing
22
+ * **MCP Services**: Support for launching agents as MCP servers for integration with external systems
23
+ * **Interactive Interface**: Beautiful command-line interface providing an intuitive user experience
24
+ * **Multi-model Support**: Support for OpenAI, Claude, XAI, and other model providers
25
25
 
26
26
  ## Installation
27
27
 
@@ -77,8 +77,9 @@ aigne create my-project
77
77
  ```
78
78
 
79
79
  The interactive creation process will ask for:
80
- - Project name
81
- - Project template (currently supports the default template)
80
+
81
+ * Project name
82
+ * Project template (currently supports the default template)
82
83
 
83
84
  ## Run Command
84
85
 
@@ -93,14 +94,17 @@ aigne run path/to/agents
93
94
 
94
95
  # Run the agent from a remote URL
95
96
  aigne run https://example.com/aigne-project
97
+
98
+ # Run a specific agent
99
+ aigne run --entry-agent myAgent
96
100
  ```
97
101
 
98
102
  Available options:
99
- - `--agent <agent>` - Specify the agent name to use (defaults to the first agent found)
100
- - `--download-dir <dir>` - Specify the directory to download the package to (used in URL mode)
101
- - `--model-provider <provider>` - Specify the model provider (openai, claude, xai)
102
- - `--model-name <model>` - Specify the model name
103
- - `--verbose` - Enable verbose logging
103
+
104
+ * `--entry-agent <entry-agent>` - Specify the agent name to run (defaults to the first agent found)
105
+ * `--cache-dir <dir>` - Specify the directory to download the package to (used in URL mode)
106
+ * `--model <provider[:model]>` - Specify the AI model in format 'provider\[:model]' where model is optional (e.g., 'openai' or 'openai:gpt-4o-mini')
107
+ * `--verbose` - Enable verbose logging
104
108
 
105
109
  ## Test Command
106
110
 
package/README.zh.md CHANGED
@@ -16,12 +16,12 @@
16
16
 
17
17
  ## 特性
18
18
 
19
- - **项目创建**:快速创建新的 AIGNE 项目,包含预设的文件结构和配置
20
- - **代理运行**:轻松运行和测试 AIGNE 代理
21
- - **测试支持**:内置测试命令,方便进行单元测试和集成测试
22
- - **MCP 服务**:支持将代理作为 MCP 服务器启动,与外部系统集成
23
- - **交互式界面**:美观的命令行界面,提供直观的使用体验
24
- - **多模型支持**:支持 OpenAI、Claude、XAI 等多种模型提供商
19
+ * **项目创建**:快速创建新的 AIGNE 项目,包含预设的文件结构和配置
20
+ * **代理运行**:轻松运行和测试 AIGNE 代理
21
+ * **测试支持**:内置测试命令,方便进行单元测试和集成测试
22
+ * **MCP 服务**:支持将代理作为 MCP 服务器启动,与外部系统集成
23
+ * **交互式界面**:美观的命令行界面,提供直观的使用体验
24
+ * **多模型支持**:支持 OpenAI、Claude、XAI 等多种模型提供商
25
25
 
26
26
  ## 安装
27
27
 
@@ -77,8 +77,9 @@ aigne create my-project
77
77
  ```
78
78
 
79
79
  交互式创建过程会询问:
80
- - 项目名称
81
- - 项目模板(目前支持 default 模板)
80
+
81
+ * 项目名称
82
+ * 项目模板(目前支持 default 模板)
82
83
 
83
84
  ## 运行命令 (run)
84
85
 
@@ -93,14 +94,17 @@ aigne run path/to/agents
93
94
 
94
95
  # 运行远程 URL 中的代理
95
96
  aigne run https://example.com/aigne-project
97
+
98
+ # 指定特定代理运行
99
+ aigne run --entry-agent myAgent
96
100
  ```
97
101
 
98
102
  可用选项:
99
- - `--agent <agent>` - 指定要使用的代理名称(默认为找到的第一个代理)
100
- - `--download-dir <dir>` - 指定下载包的目录(URL模式下使用)
101
- - `--model-provider <provider>` - 指定模型提供商(openai、claude、xai)
102
- - `--model-name <model>` - 指定模型名称
103
- - `--verbose` - 启用详细日志记录
103
+
104
+ * `--entry-agent <代理>` - 指定要运行的代理名称(默认为找到的第一个代理)
105
+ * `--cache-dir <目录>` - 指定下载包的目录(URL模式下使用)
106
+ * `--model <提供商[:模型]>` - 指定AI模型,格式为'提供商\[:模型]',其中模型是可选的(如'openai'或'openai:gpt-4o-mini')
107
+ * `--verbose` - 启用详细日志记录
104
108
 
105
109
  ## 测试命令 (test)
106
110
 
@@ -1,2 +1,2 @@
1
- import { Command } from "commander";
1
+ import type { Command } from "commander";
2
2
  export declare function createRunCommand(): Command;
@@ -7,36 +7,33 @@ import { loadModel } from "@aigne/core/loader/index.js";
7
7
  import { logger } from "@aigne/core/utils/logger.js";
8
8
  import { isNonNullable } from "@aigne/core/utils/type-utils.js";
9
9
  import { Listr, PRESET_TIMER } from "@aigne/listr2";
10
- import { Command } from "commander";
10
+ import { availableModels } from "../constants.js";
11
11
  import { isV1Package, toAIGNEPackage } from "../utils/agent-v1.js";
12
12
  import { downloadAndExtract } from "../utils/download.js";
13
- import { runChatLoopInTerminal } from "../utils/run-chat-loop.js";
13
+ import { createRunAIGNECommand, parseModelOption, runAgentWithAIGNE, } from "../utils/run-with-aigne.js";
14
14
  export function createRunCommand() {
15
- return new Command("run")
16
- .description("Run a chat loop with the specified agent")
15
+ return createRunAIGNECommand()
16
+ .description("Run AIGNE from the specified agent")
17
17
  .argument("[path]", "Path to the agents directory or URL to aigne project", ".")
18
- .option("--agent <agent>", "Name of the agent to use (defaults to the first agent found)")
19
- .option("--download-dir <dir>", "Directory to download the package to (defaults to the ~/.aigne/xxx)")
20
- .option("--model-provider <provider>", "Model provider to use, available providers: openai, claude, xai (defaults to the aigne.yaml definition or openai)")
21
- .option("--model-name <model>", "Model name to use, available models depend on the provider (defaults to the aigne.yaml definition or gpt-4o-mini)")
22
- .option("--verbose", "Enable verbose logging", false)
18
+ .option("--entry-agent <entry-agent>", "Name of the agent to run (defaults to the first agent found)")
19
+ .option("--cache-dir <dir>", "Directory to download the package to (defaults to the ~/.aigne/xxx)")
23
20
  .action(async (path, options) => {
24
- if (options.verbose)
25
- logger.enable("*");
26
- const { downloadDir, dir } = prepareDirs(path, options);
21
+ if (options.logLevel)
22
+ logger.level = options.logLevel;
23
+ const { cacheDir, dir } = prepareDirs(path, options);
27
24
  const { aigne, agent } = await new Listr([
28
25
  {
29
26
  title: "Prepare environment",
30
27
  task: (_, task) => {
31
- if (downloadDir) {
28
+ if (cacheDir) {
32
29
  return task.newListr([
33
30
  {
34
31
  title: "Download package",
35
- task: () => downloadPackage(path, downloadDir),
32
+ task: () => downloadPackage(path, cacheDir),
36
33
  },
37
34
  {
38
35
  title: "Extract package",
39
- task: () => extractPackage(downloadDir, dir),
36
+ task: () => extractPackage(cacheDir, dir),
40
37
  },
41
38
  ]);
42
39
  }
@@ -53,24 +50,24 @@ export function createRunCommand() {
53
50
  task: (ctx) => {
54
51
  const { aigne } = ctx;
55
52
  assert(aigne);
56
- let agent;
57
- if (options.agent) {
58
- agent = aigne.agents[options.agent];
59
- if (!agent) {
60
- console.error(`Agent "${options.agent}" not found in ${path}`);
53
+ let entryAgent;
54
+ if (options.entryAgent) {
55
+ entryAgent = aigne.agents[options.entryAgent];
56
+ if (!entryAgent) {
57
+ console.error(`Agent "${options.entryAgent}" not found in ${path}`);
61
58
  console.log("Available agents:");
62
59
  for (const agent of aigne.agents) {
63
60
  console.log(`- ${agent.name}`);
64
61
  }
65
- throw new Error(`Agent "${options.agent}" not found in ${path}`);
62
+ throw new Error(`Agent "${options.entryAgent}" not found in ${path}`);
66
63
  }
67
64
  }
68
65
  else {
69
- agent = aigne.agents[0];
70
- if (!agent)
66
+ entryAgent = aigne.agents[0];
67
+ if (!entryAgent)
71
68
  throw new Error(`No agents found in ${path}`);
72
69
  }
73
- ctx.agent = agent;
70
+ ctx.agent = entryAgent;
74
71
  },
75
72
  },
76
73
  ], {
@@ -81,53 +78,53 @@ export function createRunCommand() {
81
78
  }).run();
82
79
  assert(aigne);
83
80
  assert(agent);
84
- const user = aigne.invoke(agent);
85
- await runChatLoopInTerminal(user);
86
- await aigne.shutdown();
81
+ try {
82
+ await runAgentWithAIGNE(aigne, agent, { ...options });
83
+ }
84
+ finally {
85
+ await aigne.shutdown();
86
+ }
87
87
  })
88
88
  .showHelpAfterError(true)
89
89
  .showSuggestionAfterError(true);
90
90
  }
91
91
  async function loadAIGNE(path, options) {
92
- if (options.modelName && !options.modelProvider) {
93
- throw new Error("please specify --model-provider when using the --model-name option");
94
- }
95
- const model = options.modelProvider
96
- ? await loadModel({ provider: options.modelProvider, name: options.modelName })
92
+ const model = options.model
93
+ ? await loadModel(availableModels, parseModelOption(options.model))
97
94
  : undefined;
98
- return await AIGNE.load(path, { model });
95
+ return await AIGNE.load(path, { models: availableModels, model });
99
96
  }
100
- async function downloadPackage(url, downloadDir) {
101
- await rm(downloadDir, { recursive: true, force: true });
102
- await mkdir(downloadDir, { recursive: true });
103
- await downloadAndExtract(url, downloadDir);
97
+ async function downloadPackage(url, cacheDir) {
98
+ await rm(cacheDir, { recursive: true, force: true });
99
+ await mkdir(cacheDir, { recursive: true });
100
+ await downloadAndExtract(url, cacheDir);
104
101
  }
105
- async function extractPackage(downloadDir, dir) {
102
+ async function extractPackage(cacheDir, dir) {
106
103
  await mkdir(dir, { recursive: true });
107
- if (await isV1Package(downloadDir)) {
108
- await toAIGNEPackage(downloadDir, dir);
104
+ if (await isV1Package(cacheDir)) {
105
+ await toAIGNEPackage(cacheDir, dir);
109
106
  }
110
107
  else {
111
- await cp(downloadDir, dir, { recursive: true, force: true });
108
+ await cp(cacheDir, dir, { recursive: true, force: true });
112
109
  }
113
110
  }
114
111
  function prepareDirs(path, options) {
115
112
  let dir;
116
- let downloadDir;
113
+ let cacheDir;
117
114
  if (!path.startsWith("http")) {
118
115
  dir = isAbsolute(path) ? path : resolve(process.cwd(), path);
119
116
  }
120
- else if (options.downloadDir) {
121
- dir = isAbsolute(options.downloadDir)
122
- ? options.downloadDir
123
- : resolve(process.cwd(), options.downloadDir);
124
- downloadDir = join(dir, ".download");
117
+ else if (options.cacheDir) {
118
+ dir = isAbsolute(options.cacheDir)
119
+ ? options.cacheDir
120
+ : resolve(process.cwd(), options.cacheDir);
121
+ cacheDir = join(dir, ".download");
125
122
  }
126
123
  else {
127
124
  dir = getLocalPackagePathFromUrl(path);
128
- downloadDir = getLocalPackagePathFromUrl(path, { subdir: ".download" });
125
+ cacheDir = getLocalPackagePathFromUrl(path, { subdir: ".download" });
129
126
  }
130
- return { downloadDir, dir };
127
+ return { cacheDir, dir };
131
128
  }
132
129
  function getLocalPackagePathFromUrl(url, { subdir } = {}) {
133
130
  const root = [homedir(), ".aigne", subdir].filter(isNonNullable);
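Taken together, the reworked `run` command resolves a model from the shared `availableModels` list, looks up the entry agent by name (falling back to the first agent), and always shuts the engine down. A condensed sketch of that flow using only calls visible in this diff; the project path, agent name, and model values are placeholders:

```ts
// Condensed sketch of the flow above; "./my-agents", "myAgent" and the model values are placeholders.
import { AIGNE } from "@aigne/core";
import { loadModel } from "@aigne/core/loader/index.js";
import { availableModels } from "../constants.js";              // in-package path used by run.js
import { runAgentWithAIGNE } from "../utils/run-with-aigne.js"; // in-package path used by run.js

const model = await loadModel(availableModels, { provider: "openai", name: "gpt-4o-mini" });
const aigne = await AIGNE.load("./my-agents", { models: availableModels, model });

// Same lookup as the entry-agent task: an explicit --entry-agent name wins, otherwise the first agent.
const agent = aigne.agents["myAgent"] ?? aigne.agents[0];
if (!agent) throw new Error("No agents found in ./my-agents");

try {
  await runAgentWithAIGNE(aigne, agent, { input: "hello" });
} finally {
  await aigne.shutdown(); // mirrors the new try/finally in the action handler
}
```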
@@ -2,6 +2,7 @@ import { isAbsolute, resolve } from "node:path";
2
2
  import { AIGNE } from "@aigne/core";
3
3
  import { tryOrThrow } from "@aigne/core/utils/type-utils.js";
4
4
  import { Command } from "commander";
5
+ import { availableModels } from "../constants.js";
5
6
  import { serveMCPServer } from "../utils/serve-mcp.js";
6
7
  const DEFAULT_PORT = () => tryOrThrow(() => {
7
8
  const { PORT } = process.env;
@@ -23,7 +24,7 @@ export function createServeCommand() {
23
24
  .action(async (path, options) => {
24
25
  const absolutePath = isAbsolute(path) ? path : resolve(process.cwd(), path);
25
26
  const port = options.port || DEFAULT_PORT();
26
- const aigne = await AIGNE.load(absolutePath);
27
+ const aigne = await AIGNE.load(absolutePath, { models: availableModels });
27
28
  if (options.mcp)
28
29
  await serveMCPServer({
29
30
  aigne,
@@ -1 +1,5 @@
1
+ import { AnthropicChatModel } from "@aigne/anthropic";
2
+ import { BedrockChatModel } from "@aigne/bedrock";
3
+ import { OpenAIChatModel } from "@aigne/openai";
1
4
  export declare const AIGNE_CLI_VERSION: string;
5
+ export declare const availableModels: (typeof OpenAIChatModel | typeof AnthropicChatModel | typeof BedrockChatModel)[];
package/dist/constants.js CHANGED
@@ -1,2 +1,20 @@
1
+ import { AnthropicChatModel } from "@aigne/anthropic";
2
+ import { BedrockChatModel } from "@aigne/bedrock";
3
+ import { DeepSeekChatModel } from "@aigne/deepseek";
4
+ import { GeminiChatModel } from "@aigne/gemini";
5
+ import { OllamaChatModel } from "@aigne/ollama";
6
+ import { OpenRouterChatModel } from "@aigne/open-router";
7
+ import { OpenAIChatModel } from "@aigne/openai";
8
+ import { XAIChatModel } from "@aigne/xai";
1
9
  import pkg from "../package.json" with { type: "json" };
2
10
  export const AIGNE_CLI_VERSION = pkg.version;
11
+ export const availableModels = [
12
+ OpenAIChatModel,
13
+ AnthropicChatModel,
14
+ BedrockChatModel,
15
+ DeepSeekChatModel,
16
+ GeminiChatModel,
17
+ OllamaChatModel,
18
+ OpenRouterChatModel,
19
+ XAIChatModel,
20
+ ];
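These model classes are what the CLI now advertises for `--model`. A small illustration of how provider keywords can be derived from the class names, using the same expression that `createRunAIGNECommand()` uses later in this diff; the printed list is what those class names would produce:

```ts
// Illustrative: derive the provider keywords from the exported chat-model class names.
import { availableModels } from "./constants.js"; // path within the built package

const providers = availableModels.map((m) => m.name.toLowerCase().replace(/ChatModel$/i, ""));
// Expected, given the class names imported above:
// openai, anthropic, bedrock, deepseek, gemini, ollama, openrouter, xai
console.log(providers.join(", "));
```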
@@ -1,18 +1,18 @@
1
1
  import { type Agent, type Context, type ContextUsage, MESSAGE_KEY, type Message } from "@aigne/core";
2
- import { type DefaultRenderer, Listr, type ListrRenderer, type ListrTaskWrapper } from "@aigne/listr2";
2
+ import { type Listr } from "@aigne/listr2";
3
+ import { type AIGNEListrTaskWrapper } from "../utils/listr.js";
3
4
  import { promiseWithResolvers } from "../utils/promise-with-resolvers.js";
4
5
  export interface TerminalTracerOptions {
5
- aiResponsePrefix?: (context: Context) => string;
6
+ printRequest?: boolean;
6
7
  }
7
8
  export declare class TerminalTracer {
8
9
  readonly context: Context;
9
10
  readonly options: TerminalTracerOptions;
10
11
  constructor(context: Context, options?: TerminalTracerOptions);
11
- private spinner;
12
12
  private tasks;
13
13
  run(agent: Agent, input: Message): Promise<{
14
14
  result: Message;
15
- context: Context;
15
+ context: Context<import("@aigne/core").UserContext>;
16
16
  }>;
17
17
  formatTokenUsage(usage: Partial<ContextUsage>, extra?: {
18
18
  [key: string]: string;
@@ -24,13 +24,14 @@ export declare class TerminalTracer {
24
24
  time?: boolean;
25
25
  }): string;
26
26
  private marked;
27
- formatAIResponse({ [MESSAGE_KEY]: msg, ...message }?: Message): string;
27
+ formatRequest(_context: Context, { [MESSAGE_KEY]: msg, ...message }?: Message): string | undefined;
28
+ formatResult(context: Context, { [MESSAGE_KEY]: msg, ...message }?: Message): string;
28
29
  }
29
30
  type Task = ReturnType<typeof promiseWithResolvers<void>> & {
30
31
  listr: ReturnType<typeof promiseWithResolvers<{
31
32
  ctx: object;
32
33
  subtask: Listr;
33
- taskWrapper: ListrTaskWrapper<unknown, typeof DefaultRenderer, typeof ListrRenderer>;
34
+ taskWrapper: AIGNEListrTaskWrapper;
34
35
  }>>;
35
36
  startTime?: number;
36
37
  endTime?: number;
@@ -1,13 +1,12 @@
1
1
  import { EOL } from "node:os";
2
- import { format, inspect } from "node:util";
2
+ import { inspect } from "node:util";
3
3
  import { ChatModel, MESSAGE_KEY, } from "@aigne/core";
4
- import { logger } from "@aigne/core/utils/logger.js";
5
- import { mergeAgentResponseChunk } from "@aigne/core/utils/stream-utils.js";
6
- import { Listr, ListrDefaultRendererLogLevels, Spinner, } from "@aigne/listr2";
4
+ import { LogLevel, logger } from "@aigne/core/utils/logger.js";
5
+ import { figures } from "@aigne/listr2";
7
6
  import { markedTerminal } from "@aigne/marked-terminal";
8
7
  import chalk from "chalk";
9
8
  import { Marked } from "marked";
10
- import wrap from "wrap-ansi";
9
+ import { AIGNEListr } from "../utils/listr.js";
11
10
  import { promiseWithResolvers } from "../utils/promise-with-resolvers.js";
12
11
  import { parseDuration } from "../utils/time.js";
13
12
  export class TerminalTracer {
@@ -17,26 +16,13 @@ export class TerminalTracer {
17
16
  this.context = context;
18
17
  this.options = options;
19
18
  }
20
- spinner = new Spinner();
21
19
  tasks = {};
22
20
  async run(agent, input) {
23
- this.spinner.start();
24
21
  const context = this.context.newContext({ reset: true });
25
22
  const listr = new AIGNEListr({
26
- formatResult: (result) => {
27
- return [this.options.aiResponsePrefix?.(context) || "", this.formatAIResponse(result)];
28
- },
29
- }, [], {
30
- concurrent: true,
31
- rendererOptions: {
32
- collapseSubtasks: false,
33
- writeBottomBarDirectly: true,
34
- icon: {
35
- [ListrDefaultRendererLogLevels.PENDING]: () => this.spinner.fetch(),
36
- [ListrDefaultRendererLogLevels.OUTPUT_WITH_BOTTOMBAR]: "",
37
- },
38
- },
39
- });
23
+ formatRequest: () => this.options.printRequest ? this.formatRequest(context, input) : undefined,
24
+ formatResult: (result) => [this.formatResult(context, result)].filter(Boolean),
25
+ }, [], { concurrent: true });
40
26
  const onAgentStarted = async ({ contextId, parentContextId, agent, timestamp, }) => {
41
27
  const task = {
42
28
  ...promiseWithResolvers(),
@@ -99,12 +85,10 @@ export class TerminalTracer {
99
85
  context.on("agentSucceed", onAgentSucceed);
100
86
  context.on("agentFailed", onAgentFailed);
101
87
  try {
102
- const stream = await context.invoke(agent, input, { streaming: true });
103
- const result = await listr.run(stream);
88
+ const result = await listr.run(() => context.invoke(agent, input, { streaming: true }));
104
89
  return { result, context };
105
90
  }
106
91
  finally {
107
- this.spinner.stop();
108
92
  context.off("agentStarted", onAgentStarted);
109
93
  context.off("agentSucceed", onAgentSucceed);
110
94
  context.off("agentFailed", onAgentFailed);
@@ -137,62 +121,22 @@ export class TerminalTracer {
137
121
  return title;
138
122
  }
139
123
  marked = new Marked().use(markedTerminal({ forceHyperLink: false }));
140
- formatAIResponse({ [MESSAGE_KEY]: msg, ...message } = {}) {
124
+ formatRequest(_context, { [MESSAGE_KEY]: msg, ...message } = {}) {
125
+ if (!logger.enabled(LogLevel.INFO))
126
+ return;
127
+ const prefix = `${chalk.grey(figures.pointer)} 💬 `;
141
128
  const text = msg && typeof msg === "string" ? this.marked.parse(msg, { async: false }).trim() : undefined;
142
129
  const json = Object.keys(message).length > 0 ? inspect(message, { colors: true }) : undefined;
143
- return [text, json].filter(Boolean).join("\n");
144
- }
145
- }
146
- class AIGNEListr extends Listr {
147
- myOptions;
148
- result = {};
149
- isStreamRunning = false;
150
- logs = [];
151
- constructor(myOptions, ...args) {
152
- super(...args);
153
- this.myOptions = myOptions;
154
- const renderer = new this.rendererClass(this.tasks, this.rendererClassOptions, this.events);
155
- const spinner = renderer.spinner;
156
- // Override the `create` method of renderer to customize the output
157
- const create = renderer.create;
158
- renderer.create = (...args) => {
159
- const [tasks, output] = create.call(renderer, ...args);
160
- const l = [
161
- "",
162
- tasks,
163
- "",
164
- ...this.myOptions.formatResult(this.result).map((i) => this.wrap(i)),
165
- this.isStreamRunning ? spinner.fetch() : "",
166
- ];
167
- return [l.join(EOL), [output, ...this.logs.splice(0)].filter(Boolean).join(EOL)];
168
- };
169
- // @ts-ignore initialize the renderer
170
- this.renderer = renderer;
171
- }
172
- async run(stream) {
173
- const originalLog = logger.log;
174
- try {
175
- logger.log = (...args) => this.logs.push(format(...args));
176
- this.add({ task: () => this.extractStream(stream) });
177
- return await super.run().then(() => ({ ...this.result }));
178
- }
179
- finally {
180
- logger.log = originalLog;
181
- }
130
+ return [prefix, [text, json].filter(Boolean).join(EOL)].join(" ");
182
131
  }
183
- async extractStream(stream) {
184
- this.isStreamRunning = true;
185
- this.result = {};
186
- for await (const value of stream) {
187
- mergeAgentResponseChunk(this.result, value);
188
- }
189
- this.isStreamRunning = false;
190
- return this.result;
191
- }
192
- wrap(str) {
193
- return wrap(str, process.stdout.columns ?? 80, {
194
- hard: true,
195
- trim: false,
196
- });
132
+ formatResult(context, { [MESSAGE_KEY]: msg, ...message } = {}) {
133
+ const prefix = logger.enabled(LogLevel.INFO)
134
+ ? `${chalk.grey(figures.tick)} 🤖 ${this.formatTokenUsage(context.usage)}`
135
+ : null;
136
+ const text = msg && typeof msg === "string" ? this.marked.parse(msg, { async: false }).trim() : undefined;
137
+ const json = Object.keys(message).length > 0
138
+ ? inspect(message, { colors: process.stdout.isTTY })
139
+ : undefined;
140
+ return [prefix, text, json].filter(Boolean).join(EOL);
197
141
  }
198
142
  }
@@ -0,0 +1,46 @@
1
+ import type { AgentResponseStream, Message } from "@aigne/core";
2
+ import type { PromiseOrValue } from "@aigne/core/utils/type-utils";
3
+ import { DefaultRenderer, Listr, type ListrDefaultRendererOptions, ListrLogger, type ListrSimpleRendererOptions, type ListrTaskWrapper, SimpleRenderer, Spinner } from "@aigne/listr2";
4
+ import type { createLogUpdate } from "log-update";
5
+ export type AIGNEListrTaskWrapper = ListrTaskWrapper<unknown, typeof AIGNEListrRenderer, typeof AIGNEListrFallbackRenderer>;
6
+ export declare class AIGNEListr extends Listr<object, typeof AIGNEListrRenderer, typeof AIGNEListrFallbackRenderer> {
7
+ myOptions: {
8
+ formatRequest: () => string | undefined;
9
+ formatResult: (result: Message) => string[];
10
+ };
11
+ private result;
12
+ private logs;
13
+ private spinner;
14
+ constructor(myOptions: {
15
+ formatRequest: () => string | undefined;
16
+ formatResult: (result: Message) => string[];
17
+ }, ...[task, options, parentTask]: ConstructorParameters<typeof Listr<object, typeof AIGNEListrRenderer, typeof AIGNEListrFallbackRenderer>>);
18
+ run(stream: () => PromiseOrValue<AgentResponseStream<Message>>): Promise<Message>;
19
+ private extractStream;
20
+ }
21
+ export interface AIGNEListrRendererOptions extends ListrDefaultRendererOptions {
22
+ aigne?: {
23
+ getStdoutLogs?: () => string[];
24
+ getBottomBarLogs?: () => string[];
25
+ };
26
+ }
27
+ export declare class AIGNEListrRenderer extends DefaultRenderer {
28
+ static rendererOptions: AIGNEListrRendererOptions;
29
+ get _updater(): ReturnType<typeof createLogUpdate>;
30
+ get _logger(): ListrLogger;
31
+ get _options(): AIGNEListrRendererOptions;
32
+ get _spinner(): Spinner;
33
+ update(): void;
34
+ create({ running, ...options }: Parameters<DefaultRenderer["create"]>[0] & {
35
+ running?: boolean;
36
+ }): string;
37
+ _wrap(str: string): string;
38
+ }
39
+ export interface AIGNEListrFallbackRendererOptions extends ListrSimpleRendererOptions, Pick<AIGNEListrRendererOptions, "aigne"> {
40
+ }
41
+ export declare class AIGNEListrFallbackRenderer extends SimpleRenderer {
42
+ static rendererOptions: AIGNEListrFallbackRendererOptions;
43
+ get _logger(): ListrLogger;
44
+ get _options(): AIGNEListrFallbackRendererOptions;
45
+ end(): void;
46
+ }
@@ -0,0 +1,139 @@
1
+ import { EOL } from "node:os";
2
+ import { format } from "node:util";
3
+ import { LogLevel, logger } from "@aigne/core/utils/logger.js";
4
+ import { mergeAgentResponseChunk } from "@aigne/core/utils/stream-utils.js";
5
+ import { DefaultRenderer, Listr, ListrDefaultRendererLogLevels, ListrLogger, SimpleRenderer, Spinner, } from "@aigne/listr2";
6
+ import wrap from "wrap-ansi";
7
+ export class AIGNEListr extends Listr {
8
+ myOptions;
9
+ result = {};
10
+ logs = [];
11
+ spinner;
12
+ constructor(myOptions, ...[task, options, parentTask]) {
13
+ const aigneOptions = {
14
+ getStdoutLogs: () => {
15
+ return this.logs.splice(0);
16
+ },
17
+ getBottomBarLogs: () => {
18
+ return this.myOptions.formatResult(this.result);
19
+ },
20
+ };
21
+ super(task, {
22
+ ...options,
23
+ renderer: AIGNEListrRenderer,
24
+ rendererOptions: {
25
+ collapseSubtasks: false,
26
+ icon: {
27
+ [ListrDefaultRendererLogLevels.PENDING]: () => this.spinner.fetch(),
28
+ },
29
+ aigne: aigneOptions,
30
+ },
31
+ fallbackRenderer: AIGNEListrFallbackRenderer,
32
+ fallbackRendererOptions: {
33
+ aigne: aigneOptions,
34
+ logger: new (class extends ListrLogger {
35
+ log(...[level, ...args]) {
36
+ // ignore stdout logs if level `INFO` is not enabled
37
+ if (!this.options?.toStderr?.includes(level) && !logger.enabled(LogLevel.INFO)) {
38
+ return;
39
+ }
40
+ super.log(level, ...args);
41
+ }
42
+ })(),
43
+ },
44
+ }, parentTask);
45
+ this.myOptions = myOptions;
46
+ this.spinner = new Spinner();
47
+ }
48
+ async run(stream) {
49
+ const originalLog = logger.logMessage;
50
+ try {
51
+ this.ctx = {};
52
+ this.spinner.start();
53
+ logger.logMessage = (...args) => this.logs.push(format(...args));
54
+ if (logger.enabled(LogLevel.INFO)) {
55
+ const request = this.myOptions.formatRequest();
56
+ if (request)
57
+ console.log(request);
58
+ }
59
+ const _stream = await stream();
60
+ this.add({ task: () => this.extractStream(_stream) });
61
+ return await super.run().then(() => ({ ...this.result }));
62
+ }
63
+ finally {
64
+ logger.logMessage = originalLog;
65
+ this.spinner.stop();
66
+ }
67
+ }
68
+ async extractStream(stream) {
69
+ this.result = {};
70
+ for await (const value of stream) {
71
+ mergeAgentResponseChunk(this.result, value);
72
+ }
73
+ return this.result;
74
+ }
75
+ }
76
+ export class AIGNEListrRenderer extends DefaultRenderer {
77
+ static rendererOptions = {
78
+ ...DefaultRenderer.rendererOptions,
79
+ };
80
+ get _updater() {
81
+ // @ts-ignore `updater` is a private property
82
+ return this.updater;
83
+ }
84
+ get _logger() {
85
+ // @ts-ignore `logger` is a private property
86
+ return this.logger;
87
+ }
88
+ get _options() {
89
+ // @ts-ignore `options` is a private property
90
+ return this.options;
91
+ }
92
+ get _spinner() {
93
+ // @ts-ignore `spinner` is a private property
94
+ return this.spinner;
95
+ }
96
+ update() {
97
+ this._updater(this.create({ running: true }));
98
+ }
99
+ create({ running = false, ...options }) {
100
+ const logs = this._options.aigne?.getStdoutLogs?.();
101
+ if (logs?.length) {
102
+ this._updater.clear();
103
+ this._logger.toStdout(logs.join(EOL));
104
+ }
105
+ const tasks = [super.create(options)];
106
+ const bottomBar = this._options.aigne?.getBottomBarLogs?.();
107
+ if (bottomBar?.length) {
108
+ tasks.push([...bottomBar, running ? this._spinner.fetch() : ""].map((i) => this._wrap(i)).join(EOL));
109
+ }
110
+ return tasks.join(EOL.repeat(2));
111
+ }
112
+ _wrap(str) {
113
+ return wrap(str, process.stdout.columns ?? 80, {
114
+ hard: true,
115
+ trim: false,
116
+ });
117
+ }
118
+ }
119
+ export class AIGNEListrFallbackRenderer extends SimpleRenderer {
120
+ static rendererOptions = {
121
+ ...SimpleRenderer.rendererOptions,
122
+ };
123
+ get _logger() {
124
+ // @ts-ignore `logger` is a private property
125
+ return this.logger;
126
+ }
127
+ get _options() {
128
+ // @ts-ignore `options` is a private property
129
+ return this.options;
130
+ }
131
+ end() {
132
+ const logs = [this._options.aigne?.getStdoutLogs?.(), this._options.aigne?.getBottomBarLogs?.()]
133
+ .filter(Boolean)
134
+ .flat()
135
+ .join(EOL);
136
+ if (logs)
137
+ this._logger.toStdout(logs);
138
+ }
139
+ }
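The new `AIGNEListr` folds the agent's streaming response into a single `Message` (see `extractStream` above) while the renderer keeps the task list and the accumulated output on screen. A minimal sketch of just the accumulation step, using the same helper from `@aigne/core`:

```ts
// Minimal sketch of the extractStream() idea: merge streamed chunks into one Message.
import type { AgentResponseStream, Message } from "@aigne/core";
import { mergeAgentResponseChunk } from "@aigne/core/utils/stream-utils.js";

async function collect(stream: AgentResponseStream<Message>): Promise<Message> {
  const result: Message = {};
  for await (const chunk of stream) {
    mergeAgentResponseChunk(result, chunk); // mutates `result` in place, as AIGNEListr does
  }
  return result;
}
```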
@@ -1,4 +1,4 @@
1
- import { type Message, type UserAgent as input } from "@aigne/core";
1
+ import { type Message, type UserAgent } from "@aigne/core";
2
2
  export interface ChatLoopOptions {
3
3
  initialCall?: Message | string;
4
4
  welcome?: string;
@@ -6,4 +6,4 @@ export interface ChatLoopOptions {
6
6
  inputKey?: string;
7
7
  skipLoop?: boolean;
8
8
  }
9
- export declare function runChatLoopInTerminal(userAgent: input, options?: ChatLoopOptions): Promise<void>;
9
+ export declare function runChatLoopInTerminal(userAgent: UserAgent, options?: ChatLoopOptions): Promise<void>;
@@ -1,6 +1,4 @@
1
1
  import { createMessage } from "@aigne/core";
2
- import { figures } from "@aigne/listr2";
3
- import chalk from "chalk";
4
2
  import inquirer from "inquirer";
5
3
  import { TerminalTracer } from "../tracer/terminal.js";
6
4
  export async function runChatLoopInTerminal(userAgent, options = {}) {
@@ -44,11 +42,7 @@ export async function runChatLoopInTerminal(userAgent, options = {}) {
44
42
  }
45
43
  }
46
44
  async function callAgent(userAgent, input, options) {
47
- const tracer = new TerminalTracer(userAgent.context, {
48
- aiResponsePrefix: (context) => {
49
- return `${chalk.grey(figures.tick)} 🤖 ${tracer.formatTokenUsage(context.usage)}`;
50
- },
51
- });
45
+ const tracer = new TerminalTracer(userAgent.context);
52
46
  await tracer.run(userAgent, options.inputKey && typeof input === "string"
53
47
  ? { [options.inputKey]: input }
54
48
  : createMessage(input));
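`runChatLoopInTerminal` now builds its `TerminalTracer` without the custom response prefix, since the tracer owns that formatting. A hedged sketch of calling the chat loop directly with a `UserAgent` obtained via `aigne.invoke(agent)`, as the run command does; the `welcome` text and `inputKey` value are placeholders:

```ts
// Sketch only: start the interactive chat loop for an already-loaded agent.
import type { Agent, AIGNE } from "@aigne/core";
import { runChatLoopInTerminal } from "./run-chat-loop.js"; // in-package path

async function chat(aigne: AIGNE, agent: Agent): Promise<void> {
  // Invoking without input returns a UserAgent bound to a fresh context (see runAgentWithAIGNE below).
  const userAgent = aigne.invoke(agent);
  await runChatLoopInTerminal(userAgent, {
    welcome: "Chat started - type a message to begin.", // placeholder copy
    inputKey: "message",                                // placeholder key
  });
}
```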
@@ -0,0 +1,32 @@
1
+ import { AIGNE, type Agent, type ChatModelOptions } from "@aigne/core";
2
+ import { LogLevel } from "@aigne/core/utils/logger.js";
3
+ import { type PromiseOrValue } from "@aigne/core/utils/type-utils.js";
4
+ import { Command } from "commander";
5
+ import { type ChatLoopOptions } from "./run-chat-loop.js";
6
+ export interface RunAIGNECommandOptions {
7
+ chat?: boolean;
8
+ model?: string;
9
+ temperature?: number;
10
+ topP?: number;
11
+ presencePenalty?: number;
12
+ frequencyPenalty?: number;
13
+ input?: string;
14
+ logLevel?: LogLevel;
15
+ }
16
+ export declare const createRunAIGNECommand: (name?: string) => Command;
17
+ export declare const parseModelOption: (model?: string) => {
18
+ provider: string | undefined;
19
+ name: string | undefined;
20
+ };
21
+ export declare function runWithAIGNE(agentCreator: ((aigne: AIGNE) => PromiseOrValue<Agent>) | Agent, { argv, chatLoopOptions, modelOptions, }?: {
22
+ argv?: typeof process.argv;
23
+ chatLoopOptions?: ChatLoopOptions;
24
+ modelOptions?: ChatModelOptions;
25
+ }): Promise<void>;
26
+ export declare function runAgentWithAIGNE(aigne: AIGNE, agent: Agent, { chatLoopOptions, modelOptions, ...options }?: {
27
+ chatLoopOptions?: ChatLoopOptions;
28
+ modelOptions?: ChatModelOptions;
29
+ } & RunAIGNECommandOptions): Promise<{
30
+ result: import("@aigne/core").Message;
31
+ context: import("@aigne/core").Context<import("@aigne/core").UserContext>;
32
+ } | undefined>;
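`runWithAIGNE` is the new helper that wraps a single agent in the same command-line behaviour as the CLI itself (model flags, `--chat`, `--input`, log level) and runs it. A hypothetical usage sketch; the import specifier, the `AIAgent.from(...)` construction, and the option values are assumptions rather than something shown in this diff:

```ts
// Hypothetical usage; the "@aigne/cli/..." specifier and AIAgent.from(...) call are assumptions.
import { AIAgent } from "@aigne/core";
import { runWithAIGNE } from "@aigne/cli/utils/run-with-aigne.js";

await runWithAIGNE(
  () =>
    AIAgent.from({
      name: "example-agent",             // assumed option
      instructions: "Answer concisely.", // assumed option
    }),
  {
    chatLoopOptions: { welcome: "Example agent ready." },
    modelOptions: { temperature: 0.2 },  // assumes ChatModelOptions accepts temperature
  },
);
```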
@@ -0,0 +1,90 @@
1
+ import { fstat } from "node:fs";
2
+ import { isatty } from "node:tty";
3
+ import { promisify } from "node:util";
4
+ import { AIGNE, UserAgent, createMessage } from "@aigne/core";
5
+ import { loadModel } from "@aigne/core/loader/index.js";
6
+ import { LogLevel, logger } from "@aigne/core/utils/logger.js";
7
+ import { readAllString } from "@aigne/core/utils/stream-utils.js";
8
+ import { tryOrThrow } from "@aigne/core/utils/type-utils.js";
9
+ import { Command } from "commander";
10
+ import PrettyError from "pretty-error";
11
+ import { ZodError, z } from "zod";
12
+ import { availableModels } from "../constants.js";
13
+ import { TerminalTracer } from "../tracer/terminal.js";
14
+ import { runChatLoopInTerminal } from "./run-chat-loop.js";
15
+ export const createRunAIGNECommand = (name = "run") => new Command(name)
16
+ .description("Run agent with AIGNE in terminal")
17
+ .option("--chat", "Run chat loop in terminal", false)
18
+ .option("--model <provider[:model]>", `AI model to use in format 'provider[:model]' where model is optional. Examples: 'openai' or 'openai:gpt-4o-mini'. Available providers: ${availableModels.map((i) => i.name.toLowerCase().replace(/ChatModel$/i, "")).join(", ")} (default: openai)`)
19
+ .option("--temperature <temperature>", "Temperature for the model (controls randomness, higher values produce more random outputs). Range: 0.0-2.0", customZodError("--temperature", (s) => z.coerce.number().min(0).max(2).parse(s)))
20
+ .option("--top-p <top-p>", "Top P (nucleus sampling) parameter for the model (controls diversity). Range: 0.0-1.0", customZodError("--top-p", (s) => z.coerce.number().min(0).max(1).parse(s)))
21
+ .option("--presence-penalty <presence-penalty>", "Presence penalty for the model (penalizes repeating the same tokens). Range: -2.0 to 2.0", customZodError("--presence-penalty", (s) => z.coerce.number().min(-2).max(2).parse(s)))
22
+ .option("--frequency-penalty <frequency-penalty>", "Frequency penalty for the model (penalizes frequency of token usage). Range: -2.0 to 2.0", customZodError("--frequency-penalty", (s) => z.coerce.number().min(-2).max(2).parse(s)))
23
+ .option("--input -i <input>", "Input to the agent")
24
+ .option("--log-level <level>", `Log level for detailed debugging information. Values: ${Object.values(LogLevel).join(", ")}`, customZodError("--log-level", (s) => z.nativeEnum(LogLevel).parse(s)), LogLevel.INFO);
25
+ export const parseModelOption = (model) => {
26
+ const { provider, name } = model?.match(/(?<provider>[^:]+)(:(?<name>(\S+)))?/)?.groups ?? {};
27
+ return { provider, name };
28
+ };
29
+ export async function runWithAIGNE(agentCreator, { argv = process.argv, chatLoopOptions, modelOptions, } = {}) {
30
+ await createRunAIGNECommand()
31
+ .showHelpAfterError(true)
32
+ .showSuggestionAfterError(true)
33
+ .action(async (options) => {
34
+ if (options.logLevel) {
35
+ logger.level = options.logLevel;
36
+ }
37
+ const model = await loadModel(availableModels, {
38
+ ...parseModelOption(options.model),
39
+ temperature: options.temperature,
40
+ topP: options.topP,
41
+ presencePenalty: options.presencePenalty,
42
+ frequencyPenalty: options.frequencyPenalty,
43
+ }, modelOptions);
44
+ const aigne = new AIGNE({ model });
45
+ try {
46
+ const agent = typeof agentCreator === "function" ? await agentCreator(aigne) : agentCreator;
47
+ await runAgentWithAIGNE(aigne, agent, { ...options, chatLoopOptions, modelOptions });
48
+ }
49
+ finally {
50
+ await aigne.shutdown();
51
+ }
52
+ })
53
+ .parseAsync(argv)
54
+ .catch((error) => {
55
+ console.error(new PrettyError().render(error));
56
+ process.exit(1);
57
+ });
58
+ }
59
+ function customZodError(label, fn) {
60
+ return ((...args) => tryOrThrow(() => fn(...args), (e) => new Error(`${label} ${e instanceof ZodError ? e.issues[0]?.message : e.message}`)));
61
+ }
62
+ export async function runAgentWithAIGNE(aigne, agent, { chatLoopOptions, modelOptions, ...options } = {}) {
63
+ if (options.chat) {
64
+ if (!isatty(process.stdout.fd)) {
65
+ throw new Error("--chat mode requires a TTY terminal");
66
+ }
67
+ const userAgent = agent instanceof UserAgent ? agent : aigne.invoke(agent);
68
+ await runChatLoopInTerminal(userAgent, {
69
+ ...chatLoopOptions,
70
+ });
71
+ return;
72
+ }
73
+ const input = options.input ||
74
+ (isatty(process.stdin.fd) || !(await stdinHasData())
75
+ ? null
76
+ : await readAllString(process.stdin)) ||
77
+ chatLoopOptions?.initialCall ||
78
+ chatLoopOptions?.defaultQuestion ||
79
+ {};
80
+ const tracer = new TerminalTracer(aigne.newContext(), {
81
+ printRequest: logger.enabled(LogLevel.INFO),
82
+ });
83
+ return await tracer.run(agent, chatLoopOptions?.inputKey && typeof input === "string"
84
+ ? { [chatLoopOptions.inputKey]: input }
85
+ : createMessage(input));
86
+ }
87
+ async function stdinHasData() {
88
+ const stats = await promisify(fstat)(0);
89
+ return stats.isFIFO() || stats.isFile();
90
+ }
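The numeric model flags (`--temperature`, `--top-p`, `--presence-penalty`, `--frequency-penalty`) are validated with zod through the `customZodError` wrapper, so an out-of-range value surfaces as a readable commander error instead of a raw `ZodError`. A standalone sketch of the same pattern; the helper name `zodArg` and the demo command are made up for illustration:

```ts
// Sketch of the customZodError() pattern above: wrap a zod parser for use as a commander coercion.
import { Command } from "commander";
import { ZodError, z } from "zod";

function zodArg<T>(label: string, parse: (value: string) => T): (value: string) => T {
  return (value) => {
    try {
      return parse(value);
    } catch (e) {
      const message = e instanceof ZodError ? e.issues[0]?.message : (e as Error).message;
      throw new Error(`${label} ${message}`);
    }
  };
}

new Command("demo")
  .option(
    "--temperature <temperature>",
    "Range: 0.0-2.0",
    zodArg("--temperature", (s) => z.coerce.number().min(0).max(2).parse(s)),
  )
  .parse(["node", "demo", "--temperature", "0.7"]);
```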
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@aigne/cli",
3
- "version": "1.9.1",
3
+ "version": "1.10.1",
4
4
  "description": "cli for AIGNE framework",
5
5
  "publishConfig": {
6
6
  "access": "public"
@@ -35,7 +35,7 @@
35
35
  }
36
36
  },
37
37
  "dependencies": {
38
- "@aigne/listr2": "^1.0.9",
38
+ "@aigne/listr2": "^1.0.10",
39
39
  "@aigne/marked-terminal": "^7.3.1",
40
40
  "@modelcontextprotocol/sdk": "^1.11.0",
41
41
  "chalk": "^5.4.1",
@@ -45,12 +45,19 @@
45
45
  "gradient-string": "^3.0.0",
46
46
  "inquirer": "^12.6.0",
47
47
  "marked": "^15.0.11",
48
- "openai": "^4.97.0",
49
48
  "prettier": "^3.5.3",
50
49
  "pretty-error": "^4.0.0",
51
50
  "tar": "^7.4.3",
52
51
  "zod": "^3.24.4",
53
- "@aigne/core": "^1.15.0"
52
+ "@aigne/anthropic": "^0.2.0",
53
+ "@aigne/bedrock": "^0.2.0",
54
+ "@aigne/core": "^1.17.0",
55
+ "@aigne/deepseek": "^0.2.0",
56
+ "@aigne/gemini": "^0.2.0",
57
+ "@aigne/ollama": "^0.2.0",
58
+ "@aigne/openai": "^0.2.0",
59
+ "@aigne/open-router": "^0.2.0",
60
+ "@aigne/xai": "^0.2.0"
54
61
  },
55
62
  "devDependencies": {
56
63
  "@types/archiver": "^6.0.3",