langchain 0.2.8 → 0.2.10

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -2,14 +2,13 @@
  
  ⚡ Building applications with LLMs through composability ⚡
  
- [![CI](https://github.com/langchain-ai/langchainjs/actions/workflows/ci.yml/badge.svg)](https://github.com/langchain-ai/langchainjs/actions/workflows/ci.yml) ![npm](https://img.shields.io/npm/dm/langchain) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) [![Twitter](https://img.shields.io/twitter/url/https/twitter.com/langchainai.svg?style=social&label=Follow%20%40LangChainAI)](https://twitter.com/langchainai) [![](https://dcbadge.vercel.app/api/server/6adMQxSpJS?compact=true&style=flat)](https://discord.gg/6adMQxSpJS) [![Open in Dev Containers](https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/langchain-ai/langchainjs)
+ [![CI](https://github.com/langchain-ai/langchainjs/actions/workflows/ci.yml/badge.svg)](https://github.com/langchain-ai/langchainjs/actions/workflows/ci.yml) ![npm](https://img.shields.io/npm/dm/langchain) [![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](https://opensource.org/licenses/MIT) [![Twitter](https://img.shields.io/twitter/url/https/twitter.com/langchainai.svg?style=social&label=Follow%20%40LangChainAI)](https://twitter.com/langchainai) [![Open in Dev Containers](https://img.shields.io/static/v1?label=Dev%20Containers&message=Open&color=blue&logo=visualstudiocode)](https://vscode.dev/redirect?url=vscode://ms-vscode-remote.remote-containers/cloneInVolume?url=https://github.com/langchain-ai/langchainjs)
  [<img src="https://github.com/codespaces/badge.svg" title="Open in Github Codespace" width="150" height="20">](https://codespaces.new/langchain-ai/langchainjs)
  
  Looking for the Python version? Check out [LangChain](https://github.com/langchain-ai/langchain).
  
  To help you ship LangChain apps to production faster, check out [LangSmith](https://smith.langchain.com).
  [LangSmith](https://smith.langchain.com) is a unified developer platform for building, testing, and monitoring LLM applications.
- Fill out [this form](https://airtable.com/appwQzlErAS2qiP0L/shrGtGaVBVAz7NcV2) to get off the waitlist or speak with our sales team.
  
  ## ⚡️ Quick Install
  
@@ -17,10 +16,6 @@ You can use npm, yarn, or pnpm to install LangChain.js
  
  `npm install -S langchain` or `yarn add langchain` or `pnpm add langchain`
  
- ```typescript
- import { ChatOpenAI } from "langchain/chat_models/openai";
- ```
- 
  ## 🌐 Supported Environments
  
  LangChain is written in TypeScript and can be used in:
@@ -39,19 +34,20 @@ LangChain is written in TypeScript and can be used in:
  - **Reason**: rely on a language model to reason (about how to answer based on provided context, what actions to take, etc.)
  
  This framework consists of several parts.
- - **LangChain Libraries**: The Python and JavaScript libraries. Contains interfaces and integrations for a myriad of components, a basic runtime for combining these components into chains and agents, and off-the-shelf implementations of chains and agents.
- - **[LangChain Templates](https://github.com/langchain-ai/langchain/tree/master/templates)**: (currently Python-only) A collection of easily deployable reference architectures for a wide variety of tasks.
- - **[LangServe](https://github.com/langchain-ai/langserve)**: (currently Python-only) A library for deploying LangChain chains as a REST API.
- - **[LangSmith](https://smith.langchain.com)**: A developer platform that lets you debug, test, evaluate, and monitor chains built on any LLM framework and seamlessly integrates with LangChain.
+ - **Open-source libraries**: Build your applications using LangChain's open-source [building blocks](https://js.langchain.com/v0.2/docs/concepts#langchain-expression-language), [components](https://js.langchain.com/v0.2/docs/concepts), and [third-party integrations](https://js.langchain.com/v0.2/docs/integrations/platforms/).
+ Use [LangGraph.js](https://js.langchain.com/v0.2/docs/concepts/#langgraphjs) to build stateful agents with first-class streaming and human-in-the-loop support.
+ - **Productionization**: Use [LangSmith](https://docs.smith.langchain.com/) to inspect, monitor and evaluate your chains, so that you can continuously optimize and deploy with confidence.
+ - **Deployment**: Turn your LangGraph applications into production-ready APIs and Assistants with [LangGraph Cloud](https://langchain-ai.github.io/langgraph/cloud/) (currently Python-only).
  
  The LangChain libraries themselves are made up of several different packages.
  - **[`@langchain/core`](https://github.com/langchain-ai/langchainjs/blob/main/langchain-core)**: Base abstractions and LangChain Expression Language.
  - **[`@langchain/community`](https://github.com/langchain-ai/langchainjs/blob/main/libs/langchain-community)**: Third party integrations.
  - **[`langchain`](https://github.com/langchain-ai/langchainjs/blob/main/langchain)**: Chains, agents, and retrieval strategies that make up an application's cognitive architecture.
+ - **[LangGraph.js](https://langchain-ai.github.io/langgraphjs/)**: A library for building robust and stateful multi-actor applications with LLMs by modeling steps as edges and nodes in a graph. Integrates smoothly with LangChain, but can be used without it.
  
  Integrations may also be split into their own compatible packages.
  
- ![LangChain Stack](https://github.com/langchain-ai/langchainjs/blob/main/docs/core_docs/static/img/langchain_stack_feb_2024.webp)
+ ![LangChain Stack](https://github.com/langchain-ai/langchainjs/blob/main/docs/core_docs/static/svg/langchain_stack_062024.svg)
  
  This library aims to assist in the development of those types of applications. Common examples of these applications include:
  
@@ -86,15 +82,15 @@ Data Augmented Generation involves specific types of chains that first interact
  
  **🤖 Agents:**
  
- Agents involve an LLM making decisions about which Actions to take, taking that Action, seeing an Observation, and repeating that until done. LangChain provides a standard interface for agents, a selection of agents to choose from, and examples of end-to-end agents.
+ Agents allow an LLM autonomy over how a task is accomplished. Agents make decisions about which Actions to take, then take that Action, observe the result, and repeat until the task is complete. LangChain provides a [standard interface for agents](https://js.langchain.com/v0.2/docs/concepts/#agents), along with [LangGraph.js](https://github.com/langchain-ai/langgraphjs/) for building custom agents.
  
  ## 📖 Documentation
  
- Please see [here](https://js.langchain.com/v0.2/) for full documentation, which includes:
+ Please see [here](https://js.langchain.com) for full documentation, which includes:
  
  - [Getting started](https://js.langchain.com/v0.2/docs/introduction): installation, setting up the environment, simple examples
- - [Tutorials](https://js.langchain.com/v0.2/docs/tutorials/): interactive guides and walkthroughs of common use cases/tasks.
- - [Use case](https://js.langchain.com/v0.2/docs/how_to/) walkthroughs and best practices for every component of the LangChain library.
+ - Overview of the [interfaces](https://js.langchain.com/v0.2/docs/how_to/lcel_cheatsheet/), [modules](https://js.langchain.com/v0.2/docs/concepts) and [integrations](https://js.langchain.com/v0.2/docs/integrations/platforms/)
+ - [Tutorial](https://js.langchain.com/v0.2/docs/tutorials/) walkthroughs
  - [Reference](https://api.js.langchain.com): full API docs
  
  ## 💁 Contributing
@@ -108,4 +104,3 @@ Please report any security issues or concerns following our [security guidelines
  ## 🖇️ Relationship with Python LangChain
  
  This is built to integrate as seamlessly as possible with the [LangChain Python package](https://github.com/langchain-ai/langchain). Specifically, this means all objects (prompts, LLMs, chains, etc) are designed in a way where they can be serialized and shared between languages.
- 
@@ -478,6 +478,9 @@ class AgentExecutor extends base_js_1.BaseChain {
  observation = tool
  ? await tool.invoke(action.toolInput, (0, runnables_1.patchConfig)(config, { callbacks: runManager?.getChild() }))
  : `${action.tool} is not a valid tool, try another one.`;
+ if (typeof observation !== "string") {
+ throw new Error("Received unsupported non-string response from tool call.");
+ }
  }
  catch (e) {
  // eslint-disable-next-line no-instanceof/no-instanceof
@@ -573,6 +576,9 @@ class AgentExecutor extends base_js_1.BaseChain {
  const tool = nameToolMap[agentAction.tool];
  try {
  observation = await tool.call(agentAction.toolInput, runManager?.getChild());
+ if (typeof observation !== "string") {
+ throw new Error("Received unsupported non-string response from tool call.");
+ }
  }
  catch (e) {
  // eslint-disable-next-line no-instanceof/no-instanceof
@@ -473,6 +473,9 @@ export class AgentExecutor extends BaseChain {
  observation = tool
  ? await tool.invoke(action.toolInput, patchConfig(config, { callbacks: runManager?.getChild() }))
  : `${action.tool} is not a valid tool, try another one.`;
+ if (typeof observation !== "string") {
+ throw new Error("Received unsupported non-string response from tool call.");
+ }
  }
  catch (e) {
  // eslint-disable-next-line no-instanceof/no-instanceof
@@ -568,6 +571,9 @@ export class AgentExecutor extends BaseChain {
  const tool = nameToolMap[agentAction.tool];
  try {
  observation = await tool.call(agentAction.toolInput, runManager?.getChild());
+ if (typeof observation !== "string") {
+ throw new Error("Received unsupported non-string response from tool call.");
+ }
  }
  catch (e) {
  // eslint-disable-next-line no-instanceof/no-instanceof
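The four hunks above all add the same guard: tool observations must now be plain strings, and anything else fails fast instead of flowing into the agent's scratchpad. A minimal sketch of the guard's effect in isolation (`checkObservation` is a hypothetical stand-in, not part of the library):

```typescript
// Hypothetical stand-in for the guard added to AgentExecutor above:
// a string observation passes through, anything else throws immediately.
function checkObservation(observation: unknown): string {
  if (typeof observation !== "string") {
    throw new Error("Received unsupported non-string response from tool call.");
  }
  return observation;
}

// A string result is returned unchanged...
console.log(checkObservation("72 degrees and sunny"));

// ...while a structured result now surfaces as an error at the call site.
try {
  checkObservation({ temperature: 72 });
} catch (e) {
  console.log((e as Error).message);
}
```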
@@ -28,6 +28,7 @@ test("Test CSV loader from blob", async () => {
  expect(docs.length).toBe(2);
  expect(docs[0]).toMatchInlineSnapshot(`
  Document {
+ "id": undefined,
  "metadata": {
  "blobType": "text/csv",
  "line": 1,
@@ -39,6 +40,7 @@ test("Test CSV loader from blob", async () => {
  `);
  expect(docs[1]).toMatchInlineSnapshot(`
  Document {
+ "id": undefined,
  "metadata": {
  "blobType": "text/csv",
  "line": 2,
@@ -24,6 +24,7 @@ test("Test JSON loader from blob", async () => {
  expect(docs.length).toBe(2);
  expect(docs[0]).toMatchInlineSnapshot(`
  Document {
+ "id": undefined,
  "metadata": {
  "blobType": "application/json",
  "line": 1,
@@ -34,6 +35,7 @@ test("Test JSON loader from blob", async () => {
  `);
  expect(docs[1]).toMatchInlineSnapshot(`
  Document {
+ "id": undefined,
  "metadata": {
  "blobType": "application/json",
  "line": 2,
@@ -66,6 +68,7 @@ test("Test JSON loader from blob", async () => {
  expect(docs.length).toBe(10);
  expect(docs[0]).toMatchInlineSnapshot(`
  Document {
+ "id": undefined,
  "metadata": {
  "blobType": "application/json",
  "line": 1,
@@ -76,6 +79,7 @@ test("Test JSON loader from blob", async () => {
  `);
  expect(docs[1]).toMatchInlineSnapshot(`
  Document {
+ "id": undefined,
  "metadata": {
  "blobType": "application/json",
  "line": 2,
@@ -23,6 +23,7 @@ test("Test JSONL loader from blob", async () => {
  expect(docs.length).toBe(2);
  expect(docs[0]).toMatchInlineSnapshot(`
  Document {
+ "id": undefined,
  "metadata": {
  "blobType": "application/jsonl+json",
  "line": 1,
@@ -33,6 +34,7 @@ test("Test JSONL loader from blob", async () => {
  `);
  expect(docs[1]).toMatchInlineSnapshot(`
  Document {
+ "id": undefined,
  "metadata": {
  "blobType": "application/jsonl+json",
  "line": 2,
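The loader snapshots above all gain an `"id": undefined` key because `Document` instances now carry an optional `id` field that serializes even when no loader assigns it. A simplified stand-in (not the real `Document` implementation) showing why the key appears on every instance:

```typescript
// Simplified stand-in for the Document shape the snapshots now reflect:
// an optional `id` sits alongside pageContent and metadata, and is present
// (as undefined) on every instance, so inline snapshots print it.
class Doc {
  id?: string;
  pageContent: string;
  metadata: Record<string, unknown>;
  constructor(fields: { pageContent: string; metadata?: Record<string, unknown>; id?: string }) {
    this.pageContent = fields.pageContent;
    this.metadata = fields.metadata ?? {};
    this.id = fields.id; // assigns undefined when the loader sets no id
  }
}

const fromLoader = new Doc({ pageContent: "a,b\n1,2", metadata: { blobType: "text/csv", line: 1 } });
console.log("id" in fromLoader, fromLoader.id);
```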
@@ -160,6 +160,17 @@ class ParentDocumentRetriever extends multi_vector_js_1.MultiVectorRetriever {
  parentDocs.push(...retrievedDocs);
  return parentDocs.slice(0, this.parentK);
  }
+ async _storeDocuments(parentDoc, childDocs, addToDocstore) {
+ if (this.childDocumentRetriever) {
+ await this.childDocumentRetriever.addDocuments(childDocs);
+ }
+ else {
+ await this.vectorstore.addDocuments(childDocs);
+ }
+ if (addToDocstore) {
+ await this.docstore.mset(Object.entries(parentDoc));
+ }
+ }
  /**
  * Adds documents to the docstore and vectorstores.
  * If a retriever is provided, it will be used to add documents instead of the vectorstore.
@@ -192,8 +203,6 @@ class ParentDocumentRetriever extends multi_vector_js_1.MultiVectorRetriever {
  if (parentDocs.length !== parentDocIds.length) {
  throw new Error(`Got uneven list of documents and ids.\nIf "ids" is provided, should be same length as "documents".`);
  }
- const embeddedDocs = [];
- const fullDocs = {};
  for (let i = 0; i < parentDocs.length; i += 1) {
  const parentDoc = parentDocs[i];
  const parentDocId = parentDocIds[i];
@@ -202,17 +211,7 @@ class ParentDocumentRetriever extends multi_vector_js_1.MultiVectorRetriever {
  pageContent: subDoc.pageContent,
  metadata: { ...subDoc.metadata, [this.idKey]: parentDocId },
  }));
- embeddedDocs.push(...taggedSubDocs);
- fullDocs[parentDocId] = parentDoc;
- }
- if (this.childDocumentRetriever) {
- await this.childDocumentRetriever.addDocuments(embeddedDocs);
- }
- else {
- await this.vectorstore.addDocuments(embeddedDocs);
- }
- if (addToDocstore) {
- await this.docstore.mset(Object.entries(fullDocs));
+ await this._storeDocuments({ [parentDocId]: parentDoc }, taggedSubDocs, addToDocstore);
  }
  }
  }
@@ -63,6 +63,7 @@ export declare class ParentDocumentRetriever extends MultiVectorRetriever {
  documentCompressorFilteringFn?: ParentDocumentRetrieverFields["documentCompressorFilteringFn"];
  constructor(fields: ParentDocumentRetrieverFields);
  _getRelevantDocuments(query: string): Promise<Document[]>;
+ _storeDocuments(parentDoc: Record<string, Document>, childDocs: Document[], addToDocstore: boolean): Promise<void>;
  /**
  * Adds documents to the docstore and vectorstores.
  * If a retriever is provided, it will be used to add documents instead of the vectorstore.
@@ -134,6 +134,17 @@ export class ParentDocumentRetriever extends MultiVectorRetriever {
  parentDocs.push(...retrievedDocs);
  return parentDocs.slice(0, this.parentK);
  }
+ async _storeDocuments(parentDoc, childDocs, addToDocstore) {
+ if (this.childDocumentRetriever) {
+ await this.childDocumentRetriever.addDocuments(childDocs);
+ }
+ else {
+ await this.vectorstore.addDocuments(childDocs);
+ }
+ if (addToDocstore) {
+ await this.docstore.mset(Object.entries(parentDoc));
+ }
+ }
  /**
  * Adds documents to the docstore and vectorstores.
  * If a retriever is provided, it will be used to add documents instead of the vectorstore.
@@ -166,8 +177,6 @@ export class ParentDocumentRetriever extends MultiVectorRetriever {
  if (parentDocs.length !== parentDocIds.length) {
  throw new Error(`Got uneven list of documents and ids.\nIf "ids" is provided, should be same length as "documents".`);
  }
- const embeddedDocs = [];
- const fullDocs = {};
  for (let i = 0; i < parentDocs.length; i += 1) {
  const parentDoc = parentDocs[i];
  const parentDocId = parentDocIds[i];
@@ -176,17 +185,7 @@ export class ParentDocumentRetriever extends MultiVectorRetriever {
  pageContent: subDoc.pageContent,
  metadata: { ...subDoc.metadata, [this.idKey]: parentDocId },
  }));
- embeddedDocs.push(...taggedSubDocs);
- fullDocs[parentDocId] = parentDoc;
- }
- if (this.childDocumentRetriever) {
- await this.childDocumentRetriever.addDocuments(embeddedDocs);
- }
- else {
- await this.vectorstore.addDocuments(embeddedDocs);
- }
- if (addToDocstore) {
- await this.docstore.mset(Object.entries(fullDocs));
+ await this._storeDocuments({ [parentDocId]: parentDoc }, taggedSubDocs, addToDocstore);
  }
  }
  }
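The `ParentDocumentRetriever` refactor above replaces the batched `embeddedDocs`/`fullDocs` accumulation with a `_storeDocuments` helper called once per parent inside the loop. A minimal sketch of that control-flow change, using toy stand-ins rather than the real retriever, vectorstore, or docstore:

```typescript
// Toy stand-ins illustrating the refactor: instead of accumulating every
// child and parent doc and storing them once after the loop, each iteration
// now hands one parent (plus its children) to a storage helper.
type Doc = { pageContent: string };

class ToyParentRetriever {
  childStore: Doc[] = [];            // stands in for the vectorstore
  docstore = new Map<string, Doc>(); // stands in for docstore.mset

  // Mirrors the extracted _storeDocuments: one parent per call.
  async storeDocuments(parentDoc: Record<string, Doc>, childDocs: Doc[], addToDocstore: boolean) {
    this.childStore.push(...childDocs);
    if (addToDocstore) {
      for (const [id, doc] of Object.entries(parentDoc)) {
        this.docstore.set(id, doc);
      }
    }
  }

  async addDocuments(parents: Doc[]) {
    for (let i = 0; i < parents.length; i += 1) {
      // A real splitter would produce several children; one is enough here.
      const children = [{ pageContent: parents[i].pageContent.slice(0, 10) }];
      await this.storeDocuments({ [`parent-${i}`]: parents[i] }, children, true);
    }
  }
}

const r = new ToyParentRetriever();
await r.addDocuments([{ pageContent: "alpha document" }, { pageContent: "beta document" }]);
console.log(r.childStore.length, r.docstore.size);
```

Storing per-parent also means a failure partway through leaves earlier parents fully stored, rather than nothing stored at all.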
@@ -11,7 +11,7 @@ test("Should work with a question input", async () => {
  "Cars are made out of plastic",
  "mitochondria is the powerhouse of the cell",
  "mitochondria is made of lipids",
- ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings());
+ ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings({ model: "embed-english-v3.0" }));
  const retriever = new EnsembleRetriever({
  retrievers: [vectorstore.asRetriever()],
  });
@@ -28,7 +28,7 @@ test("Should work with multiple retriever", async () => {
  "Cars are made out of plastic",
  "mitochondria is the powerhouse of the cell",
  "mitochondria is made of lipids",
- ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings());
+ ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings({ model: "embed-english-v3.0" }));
  const vectorstore2 = await MemoryVectorStore.fromTexts([
  "Buildings are made out of brick",
  "Buildings are made out of wood",
@@ -37,7 +37,7 @@ test("Should work with multiple retriever", async () => {
  "Cars are made out of plastic",
  "mitochondria is the powerhouse of the cell",
  "mitochondria is made of lipids",
- ], [{ id: 6 }, { id: 7 }, { id: 8 }, { id: 9 }, { id: 10 }], new CohereEmbeddings());
+ ], [{ id: 6 }, { id: 7 }, { id: 8 }, { id: 9 }, { id: 10 }], new CohereEmbeddings({ model: "embed-english-v3.0" }));
  const retriever = new EnsembleRetriever({
  retrievers: [vectorstore.asRetriever(), vectorstore2.asRetriever()],
  });
@@ -54,7 +54,7 @@ test("Should work with weights", async () => {
  "Cars are made out of plastic",
  "mitochondria is the powerhouse of the cell",
  "mitochondria is made of lipids",
- ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings());
+ ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings({ model: "embed-english-v3.0" }));
  const vectorstore2 = await MemoryVectorStore.fromTexts([
  "Buildings are made out of brick",
  "Buildings are made out of wood",
@@ -63,7 +63,7 @@ test("Should work with weights", async () => {
  "Cars are made out of plastic",
  "mitochondria is the powerhouse of the cell",
  "mitochondria is made of lipids",
- ], [{ id: 6 }, { id: 7 }, { id: 8 }, { id: 9 }, { id: 10 }], new CohereEmbeddings());
+ ], [{ id: 6 }, { id: 7 }, { id: 8 }, { id: 9 }, { id: 10 }], new CohereEmbeddings({ model: "embed-english-v3.0" }));
  const retriever = new EnsembleRetriever({
  retrievers: [vectorstore.asRetriever(), vectorstore2.asRetriever()],
  weights: [0.5, 0.9],
@@ -12,7 +12,7 @@ test("Should work with a question input", async () => {
  "Cars are made out of plastic",
  "mitochondria is the powerhouse of the cell",
  "mitochondria is made of lipids",
- ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings());
+ ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings({ model: "embed-english-v3.0" }));
  const model = new ChatOpenAI({});
  const retriever = MultiQueryRetriever.fromLLM({
  llm: model,
@@ -32,7 +32,7 @@ test("Should work with a keyword", async () => {
  "Cars are made out of plastic",
  "mitochondria is the powerhouse of the cell",
  "mitochondria is made of lipids",
- ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings());
+ ], [{ id: 1 }, { id: 2 }, { id: 3 }, { id: 4 }, { id: 5 }], new CohereEmbeddings({ model: "embed-english-v3.0" }));
  const model = new ChatOpenAI({});
  const retriever = MultiQueryRetriever.fromLLM({
  llm: model,
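Every test update above passes an explicit `model` to `CohereEmbeddings` instead of relying on an implicit default. The underlying pattern, making a model name a required constructor field, can be sketched with a hypothetical client (this is not the real `CohereEmbeddings` implementation):

```typescript
// Hypothetical embeddings client illustrating the pattern behind the test
// change: the model name must be supplied up front, so a missing model fails
// at construction time rather than at request time.
class ExplicitModelEmbeddings {
  readonly model: string;
  constructor(fields: { model: string }) {
    if (!fields || !fields.model) {
      throw new Error('A model name is required, e.g. { model: "embed-english-v3.0" }.');
    }
    this.model = fields.model;
  }
}

const embeddings = new ExplicitModelEmbeddings({ model: "embed-english-v3.0" });
console.log(embeddings.model);
```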
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "langchain",
- "version": "0.2.8",
+ "version": "0.2.10",
  "description": "Typescript bindings for langchain",
  "type": "module",
  "engines": {
@@ -888,7 +888,7 @@
  }
  },
  "dependencies": {
- "@langchain/core": ">=0.2.9 <0.3.0",
+ "@langchain/core": ">=0.2.11 <0.3.0",
  "@langchain/openai": ">=0.1.0 <0.3.0",
  "@langchain/textsplitters": "~0.0.0",
  "binary-extensions": "^2.2.0",
@@ -900,7 +900,7 @@
  "ml-distance": "^4.0.0",
  "openapi-types": "^12.1.3",
  "p-retry": "4",
- "uuid": "^9.0.0",
+ "uuid": "^10.0.0",
  "yaml": "^2.2.1",
  "zod": "^3.22.4",
  "zod-to-json-schema": "^3.22.3"
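The dependency changes raise the `@langchain/core` floor from `0.2.9` to `0.2.11` while keeping the `<0.3.0` ceiling. How such a `>=min <max` range constrains resolution can be sketched with a hand-rolled comparator (real resolution is done by npm's `semver` package, not this helper, and prerelease tags are ignored here):

```typescript
// Hand-rolled sketch of the ">=0.2.11 <0.3.0" style range used above.
// Handles plain x.y.z versions only; npm's semver does the real work.
function satisfiesRange(version: string, min: string, maxExclusive: string): boolean {
  const parse = (v: string): number[] => v.split(".").map(Number);
  const compare = (a: number[], b: number[]): number => {
    for (let i = 0; i < 3; i += 1) {
      if (a[i] !== b[i]) return a[i] - b[i];
    }
    return 0;
  };
  const v = parse(version);
  return compare(v, parse(min)) >= 0 && compare(v, parse(maxExclusive)) < 0;
}

// 0.2.9 no longer satisfies the bumped floor, while 0.2.11 does.
console.log(satisfiesRange("0.2.9", "0.2.11", "0.3.0"), satisfiesRange("0.2.11", "0.2.11", "0.3.0"));
```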