@easynet/agent-model 1.0.77 → 1.0.78

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (2)

  1. package/README.md +12 -54
  2. package/package.json +1 -1
package/README.md CHANGED
@@ -1,76 +1,34 @@
 # @easynet/agent-model
 
-Unified Model Hub: create LangChain ChatModel, Embedding, and other models from `model.yaml`.
+## Introduction
 
-## Minimal interface
+`@easynet/agent-model` initializes the framework model layer from `model.yaml` and registers the configured chat model, embedding model, and optional VLM model into the default `AgentContext`.
 
-```ts
-import { AgentContextTokens, getDefaultAgentContext } from "@easynet/agent-common";
-import { createAgentModel } from "@easynet/agent-model";
+## API Reference
 
-await createAgentModel();
-const llm = getDefaultAgentContext().get(AgentContextTokens.ChatModel);
-const result = await llm.invoke("hello");
-console.log(result.content);
-```
+| API | What it does | Minimal usage |
+| --- | --- | --- |
+| `createAgentModel` | Load model config and register models into `AgentContext`. | `await createAgentModel({ configPath: "./model.yaml" })` |
 
-## Minimal YAML (start with this)
+## Usage
 
-`model.yaml`
+Create `model.yaml`:
 
 ```yaml
 llm:
-  # Use the small instance by default
   default: small
-
-  # Keep strong if you need it; delete it otherwise
-  strong:
-    provider: openai
-    base_url: ${LLM_BASE_URL}
-    model: ${LLM_MODEL}
-
-  # The most common local Ollama setup
   small:
     provider: openai
     base_url: http://localhost:11434/v1
     model: qwen3:0.6b
-
-embed:
-  default: gemma
-  gemma:
-    provider: openai
-    base_url: https://ollama-nvidia-8g-2.easynet.world/v1
-    model: embeddinggemma:latest
-    apiKey: ollama
-
-runtime:
-  check_connectivity: false
 ```
 
-## Simplest example (with comments)
+Initialize the model layer:
 
 ```ts
-import { AgentContextTokens, getDefaultAgentContext } from "@easynet/agent-common";
 import { createAgentModel } from "@easynet/agent-model";
 
-async function main() {
-  // 1) Read the config from model.yaml in the current directory and create the models
-  await createAgentModel();
-  const llm = getDefaultAgentContext().get(AgentContextTokens.ChatModel);
-
-  // 2) Call the model directly
-  const response = await llm.invoke("Introduce yourself in one sentence.");
-
-  // 3) Print the text
-  console.log(response.content);
-}
-
-main().catch(console.error);
+await createAgentModel({
+  configPath: "./model.yaml",
+});
 ```
-
-## Optional extensions
-
-- Custom provider: `registerChatModelProvider(...)`
-- Get only the default model config: `getDefaultLlmConfig(...)`
-- Build a model directly from an object: `createChatModelFromLlmConfig(...)`
-- Create an EmbedFn from model.yaml: `createEmbedFnFromModelsConfig(...)`
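The 1.0.78 README drops the end-to-end quick start that 1.0.77 carried. For readers migrating, a minimal sketch stitching the two versions' snippets together, assuming `AgentContextTokens` and `getDefaultAgentContext` are still exported from `@easynet/agent-common` as documented in 1.0.77:

```typescript
import { AgentContextTokens, getDefaultAgentContext } from "@easynet/agent-common";
import { createAgentModel } from "@easynet/agent-model";

async function main() {
  // Load model.yaml (explicit path, as in the 1.0.78 README) and
  // register the configured models into the default AgentContext.
  await createAgentModel({ configPath: "./model.yaml" });

  // Resolve the chat model from the context, as in the 1.0.77 quick start,
  // then call it like any LangChain ChatModel.
  const llm = getDefaultAgentContext().get(AgentContextTokens.ChatModel);
  const response = await llm.invoke("Introduce yourself in one sentence.");
  console.log(response.content);
}

main().catch(console.error);
```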
package/package.json CHANGED
@@ -1,6 +1,6 @@
 {
   "name": "@easynet/agent-model",
-  "version": "1.0.77",
+  "version": "1.0.78",
   "description": "Agent LLM: multi-provider, multi-model, simple chat/image API. Consumes agent.yaml llm section.",
   "type": "module",
   "bin": {