@aigne/ollama 0.1.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md ADDED
@@ -0,0 +1,18 @@
+ # Changelog
+
+ ## [0.1.0](https://github.com/AIGNE-io/aigne-framework/compare/ollama-v0.0.1...ollama-v0.1.0) (2025-05-23)
+
+
+ ### Features
+
+ * **models:** publish model adapters as standalone packages ([#126](https://github.com/AIGNE-io/aigne-framework/issues/126)) ([588b8ae](https://github.com/AIGNE-io/aigne-framework/commit/588b8aea6abcee5fa87def1358bf51f84021c6ef))
+
+
+ ### Dependencies
+
+ * The following workspace dependencies were updated
+   * dependencies
+     * @aigne/openai bumped to 0.1.0
+   * devDependencies
+     * @aigne/core bumped to 1.16.0
+     * @aigne/test-utils bumped to 0.3.0
package/LICENSE.md ADDED
@@ -0,0 +1,93 @@
+ Elastic License 2.0
+
+ URL: https://www.elastic.co/licensing/elastic-license
+
+ ## Acceptance
+
+ By using the software, you agree to all of the terms and conditions below.
+
+ ## Copyright License
+
+ The licensor grants you a non-exclusive, royalty-free, worldwide,
+ non-sublicensable, non-transferable license to use, copy, distribute, make
+ available, and prepare derivative works of the software, in each case subject to
+ the limitations and conditions below.
+
+ ## Limitations
+
+ You may not provide the software to third parties as a hosted or managed
+ service, where the service provides users with access to any substantial set of
+ the features or functionality of the software.
+
+ You may not move, change, disable, or circumvent the license key functionality
+ in the software, and you may not remove or obscure any functionality in the
+ software that is protected by the license key.
+
+ You may not alter, remove, or obscure any licensing, copyright, or other notices
+ of the licensor in the software. Any use of the licensor’s trademarks is subject
+ to applicable law.
+
+ ## Patents
+
+ The licensor grants you a license, under any patent claims the licensor can
+ license, or becomes able to license, to make, have made, use, sell, offer for
+ sale, import and have imported the software, in each case subject to the
+ limitations and conditions in this license. This license does not cover any
+ patent claims that you cause to be infringed by modifications or additions to
+ the software. If you or your company make any written claim that the software
+ infringes or contributes to infringement of any patent, your patent license for
+ the software granted under these terms ends immediately. If your company makes
+ such a claim, your patent license ends immediately for work on behalf of your
+ company.
+
+ ## Notices
+
+ You must ensure that anyone who gets a copy of any part of the software from you
+ also gets a copy of these terms.
+
+ If you modify the software, you must include in any modified copies of the
+ software prominent notices stating that you have modified the software.
+
+ ## No Other Rights
+
+ These terms do not imply any licenses other than those expressly granted in
+ these terms.
+
+ ## Termination
+
+ If you use the software in violation of these terms, such use is not licensed,
+ and your licenses will automatically terminate. If the licensor provides you
+ with a notice of your violation, and you cease all violation of this license no
+ later than 30 days after you receive that notice, your licenses will be
+ reinstated retroactively. However, if you violate these terms after such
+ reinstatement, any additional violation of these terms will cause your licenses
+ to terminate automatically and permanently.
+
+ ## No Liability
+
+ *As far as the law allows, the software comes as is, without any warranty or
+ condition, and the licensor will not be liable to you for any damages arising
+ out of these terms or the use or nature of the software, under any kind of
+ legal claim.*
+
+ ## Definitions
+
+ The **licensor** is the entity offering these terms, and the **software** is the
+ software the licensor makes available under these terms, including any portion
+ of it.
+
+ **you** refers to the individual or entity agreeing to these terms.
+
+ **your company** is any legal entity, sole proprietorship, or other kind of
+ organization that you work for, plus all organizations that have control over,
+ are under the control of, or are under common control with that
+ organization. **control** means ownership of substantially all the assets of an
+ entity, or the power to direct its management and policies by vote, contract, or
+ otherwise. Control can be direct or indirect.
+
+ **your licenses** are all the licenses granted to you for the software under
+ these terms.
+
+ **use** means anything you do with the software requiring one of your licenses.
+
+ **trademark** means trademarks, service marks, and similar rights.
package/README.md ADDED
@@ -0,0 +1,113 @@
+ # @aigne/ollama
+
+ [![GitHub star chart](https://img.shields.io/github/stars/AIGNE-io/aigne-framework?style=flat-square)](https://star-history.com/#AIGNE-io/aigne-framework)
+ [![Open Issues](https://img.shields.io/github/issues-raw/AIGNE-io/aigne-framework?style=flat-square)](https://github.com/AIGNE-io/aigne-framework/issues)
+ [![codecov](https://codecov.io/gh/AIGNE-io/aigne-framework/graph/badge.svg?token=DO07834RQL)](https://codecov.io/gh/AIGNE-io/aigne-framework)
+ [![NPM Version](https://img.shields.io/npm/v/@aigne/ollama)](https://www.npmjs.com/package/@aigne/ollama)
+ [![Elastic-2.0 licensed](https://img.shields.io/npm/l/@aigne/ollama)](https://github.com/AIGNE-io/aigne-framework/blob/main/LICENSE.md)
+
+ **English** | [中文](README.zh.md)
+
+ AIGNE Ollama SDK for integrating with locally hosted AI models via Ollama within the [AIGNE Framework](https://github.com/AIGNE-io/aigne-framework).
+
+ ## Introduction
+
+ `@aigne/ollama` provides seamless integration between the AIGNE Framework and locally hosted AI models via Ollama. This package enables developers to easily leverage open-source language models running locally through Ollama in their AIGNE applications, providing a consistent interface across the framework while offering private, offline access to AI capabilities.
+
+ ## Features
+
+ * **Ollama Integration**: Direct connection to a local Ollama instance
+ * **Local Model Support**: Support for a wide variety of open-source models hosted via Ollama
+ * **Chat Completions**: Support for the chat completions API with all available Ollama models
+ * **Streaming Responses**: Support for streaming responses for more responsive applications
+ * **Type-Safe**: Comprehensive TypeScript typings for all APIs and models
+ * **Consistent Interface**: Compatible with the AIGNE Framework's model interface
+ * **Privacy-Focused**: Run models locally without sending data to external API services
+ * **Full Configuration**: Extensive configuration options for fine-tuning behavior
+
+ ## Installation
+
+ ### Using npm
+
+ ```bash
+ npm install @aigne/ollama @aigne/core
+ ```
+
+ ### Using yarn
+
+ ```bash
+ yarn add @aigne/ollama @aigne/core
+ ```
+
+ ### Using pnpm
+
+ ```bash
+ pnpm add @aigne/ollama @aigne/core
+ ```
+
+ ## Prerequisites
+
+ Before using this package, you need [Ollama](https://ollama.ai/) installed and running on your machine, with at least one model pulled. Follow the instructions on the [Ollama website](https://ollama.ai/) to set it up.
+
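Before the first `invoke`, it helps to confirm that the local server is reachable and has a model available. A minimal readiness-check sketch (not part of this package) using Ollama's `/api/tags` endpoint, which lists locally pulled models; it assumes Node 18+ for the global `fetch`:

```typescript
// Quick readiness check for a local Ollama server (illustrative sketch).
const response = await fetch("http://localhost:11434/api/tags");
if (!response.ok) {
  throw new Error(`Ollama is not reachable: HTTP ${response.status}`);
}

// /api/tags returns the models that have been pulled locally.
const { models } = (await response.json()) as { models: { name: string }[] };
if (models.length === 0) {
  throw new Error("No models pulled yet; run `ollama pull llama3.2` first.");
}
console.log("Locally available models:", models.map((m) => m.name));
```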
+ ## Basic Usage
+
+ ```typescript file="test/ollama-chat-model.test.ts" region="example-ollama-chat-model"
+ import { OllamaChatModel } from "@aigne/ollama";
+
+ const model = new OllamaChatModel({
+   // Specify base URL (defaults to http://localhost:11434/v1)
+   baseURL: "http://localhost:11434/v1",
+   // Specify Ollama model to use (defaults to 'llama3.2')
+   model: "llama3",
+   modelOptions: {
+     temperature: 0.8,
+   },
+ });
+
+ const result = await model.invoke({
+   messages: [{ role: "user", content: "Tell me what model you're using" }],
+ });
+
+ console.log(result);
+ /* Output:
+ {
+   text: "I'm an AI assistant running on Ollama with the llama3 model.",
+   model: "llama3"
+ }
+ */
+ ```
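The base URL can also be supplied through the `OLLAMA_BASE_URL` environment variable; the constructor falls back to it before the built-in default (see `lib/esm/ollama-chat-model.js` later in this diff). A short sketch, with a hypothetical host name:

```typescript
import { OllamaChatModel } from "@aigne/ollama";

// Resolution order: explicit baseURL option, then OLLAMA_BASE_URL, then http://localhost:11434/v1.
process.env.OLLAMA_BASE_URL = "http://gpu-box:11434/v1"; // hypothetical remote Ollama host

const model = new OllamaChatModel(); // uses OLLAMA_BASE_URL and the default model "llama3.2"
```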
+
+ ## Streaming Responses
+
+ ```typescript file="test/ollama-chat-model.test.ts" region="example-ollama-chat-model-streaming"
+ import { OllamaChatModel } from "@aigne/ollama";
+
+ const model = new OllamaChatModel({
+   baseURL: "http://localhost:11434/v1",
+   model: "llama3",
+ });
+
+ const stream = await model.invoke(
+   {
+     messages: [{ role: "user", content: "Tell me what model you're using" }],
+   },
+   undefined,
+   { streaming: true },
+ );
+
+ let fullText = "";
+ const json = {};
+
+ for await (const chunk of stream) {
+   const text = chunk.delta.text?.text;
+   if (text) fullText += text;
+   if (chunk.delta.json) Object.assign(json, chunk.delta.json);
+ }
+
+ console.log(fullText); // Output: "I'm an AI assistant running on Ollama with the llama3 model."
+ console.log(json); // { model: "llama3" }
+ ```
+
+ ## License
+
+ Elastic-2.0
package/README.zh.md ADDED
@@ -0,0 +1,113 @@
+ # @aigne/ollama
+
+ [![GitHub star chart](https://img.shields.io/github/stars/AIGNE-io/aigne-framework?style=flat-square)](https://star-history.com/#AIGNE-io/aigne-framework)
+ [![Open Issues](https://img.shields.io/github/issues-raw/AIGNE-io/aigne-framework?style=flat-square)](https://github.com/AIGNE-io/aigne-framework/issues)
+ [![codecov](https://codecov.io/gh/AIGNE-io/aigne-framework/graph/badge.svg?token=DO07834RQL)](https://codecov.io/gh/AIGNE-io/aigne-framework)
+ [![NPM Version](https://img.shields.io/npm/v/@aigne/ollama)](https://www.npmjs.com/package/@aigne/ollama)
+ [![Elastic-2.0 licensed](https://img.shields.io/npm/l/@aigne/ollama)](https://github.com/AIGNE-io/aigne-framework/blob/main/LICENSE.md)
+
+ [English](README.md) | **中文**
+
+ AIGNE Ollama SDK,用于在 [AIGNE 框架](https://github.com/AIGNE-io/aigne-framework) 中通过 Ollama 集成本地托管的 AI 模型。
+
+ ## 简介
+
+ `@aigne/ollama` 提供了 AIGNE 框架与通过 Ollama 本地托管的 AI 模型之间的无缝集成。该包使开发者能够在 AIGNE 应用程序中轻松利用通过 Ollama 在本地运行的开源语言模型,同时提供框架内一致的接口,并提供私有、离线的 AI 能力访问。
+
+ ## 特性
+
+ * **Ollama 集成**:直接连接到本地 Ollama 实例
+ * **本地模型支持**:支持通过 Ollama 托管的各种开源模型
+ * **聊天完成**:支持所有可用 Ollama 模型的聊天完成 API
+ * **流式响应**:支持流式响应,提供更高响应性的应用程序体验
+ * **类型安全**:为所有 API 和模型提供全面的 TypeScript 类型定义
+ * **一致接口**:兼容 AIGNE 框架的模型接口
+ * **注重隐私**:本地运行模型,无需将数据发送到外部 API 服务
+ * **完整配置**:丰富的配置选项用于微调行为
+
+ ## 安装
+
+ ### 使用 npm
+
+ ```bash
+ npm install @aigne/ollama @aigne/core
+ ```
+
+ ### 使用 yarn
+
+ ```bash
+ yarn add @aigne/ollama @aigne/core
+ ```
+
+ ### 使用 pnpm
+
+ ```bash
+ pnpm add @aigne/ollama @aigne/core
+ ```
+
+ ## 前提条件
+
+ 在使用此包之前,您需要在机器上安装并运行 [Ollama](https://ollama.ai/),并至少拉取一个模型。请按照 [Ollama 网站](https://ollama.ai/) 上的说明设置 Ollama。
+
+ ## 基本用法
+
+ ```typescript file="test/ollama-chat-model.test.ts" region="example-ollama-chat-model"
+ import { OllamaChatModel } from "@aigne/ollama";
+
+ const model = new OllamaChatModel({
+   // Specify base URL (defaults to http://localhost:11434/v1)
+   baseURL: "http://localhost:11434/v1",
+   // Specify Ollama model to use (defaults to 'llama3.2')
+   model: "llama3",
+   modelOptions: {
+     temperature: 0.8,
+   },
+ });
+
+ const result = await model.invoke({
+   messages: [{ role: "user", content: "Tell me what model you're using" }],
+ });
+
+ console.log(result);
+ /* Output:
+ {
+   text: "I'm an AI assistant running on Ollama with the llama3 model.",
+   model: "llama3"
+ }
+ */
+ ```
+
+ ## 流式响应
+
+ ```typescript file="test/ollama-chat-model.test.ts" region="example-ollama-chat-model-streaming"
+ import { OllamaChatModel } from "@aigne/ollama";
+
+ const model = new OllamaChatModel({
+   baseURL: "http://localhost:11434/v1",
+   model: "llama3",
+ });
+
+ const stream = await model.invoke(
+   {
+     messages: [{ role: "user", content: "Tell me what model you're using" }],
+   },
+   undefined,
+   { streaming: true },
+ );
+
+ let fullText = "";
+ const json = {};
+
+ for await (const chunk of stream) {
+   const text = chunk.delta.text?.text;
+   if (text) fullText += text;
+   if (chunk.delta.json) Object.assign(json, chunk.delta.json);
+ }
+
+ console.log(fullText); // Output: "I'm an AI assistant running on Ollama with the llama3 model."
+ console.log(json); // { model: "llama3" }
+ ```
+
+ ## 许可证
+
+ Elastic-2.0
package/lib/cjs/index.d.ts ADDED
@@ -0,0 +1 @@
+ export * from "./ollama-chat-model.js";
package/lib/cjs/index.js ADDED
@@ -0,0 +1,17 @@
+ "use strict";
+ var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
+     if (k2 === undefined) k2 = k;
+     var desc = Object.getOwnPropertyDescriptor(m, k);
+     if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
+       desc = { enumerable: true, get: function() { return m[k]; } };
+     }
+     Object.defineProperty(o, k2, desc);
+ }) : (function(o, m, k, k2) {
+     if (k2 === undefined) k2 = k;
+     o[k2] = m[k];
+ }));
+ var __exportStar = (this && this.__exportStar) || function(m, exports) {
+     for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
+ };
+ Object.defineProperty(exports, "__esModule", { value: true });
+ __exportStar(require("./ollama-chat-model.js"), exports);
package/lib/cjs/ollama-chat-model.d.ts ADDED
@@ -0,0 +1,22 @@
+ import { OpenAIChatModel, type OpenAIChatModelOptions } from "@aigne/openai";
+ /**
+  * Implementation of the ChatModel interface for Ollama
+  *
+  * This model allows you to run open-source LLMs locally using Ollama,
+  * with an OpenAI-compatible API interface.
+  *
+  * Default model: 'llama3.2'
+  *
+  * @example
+  * Here's how to create and use an Ollama chat model:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model}
+  *
+  * @example
+  * Here's an example with streaming response:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model-streaming}
+  */
+ export declare class OllamaChatModel extends OpenAIChatModel {
+     constructor(options?: OpenAIChatModelOptions);
+     protected apiKeyEnvName: string;
+     protected apiKeyDefault: string;
+ }
package/lib/cjs/ollama-chat-model.js ADDED
@@ -0,0 +1,34 @@
+ "use strict";
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.OllamaChatModel = void 0;
+ const openai_1 = require("@aigne/openai");
+ const OLLAMA_DEFAULT_BASE_URL = "http://localhost:11434/v1";
+ const OLLAMA_DEFAULT_CHAT_MODEL = "llama3.2";
+ /**
+  * Implementation of the ChatModel interface for Ollama
+  *
+  * This model allows you to run open-source LLMs locally using Ollama,
+  * with an OpenAI-compatible API interface.
+  *
+  * Default model: 'llama3.2'
+  *
+  * @example
+  * Here's how to create and use an Ollama chat model:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model}
+  *
+  * @example
+  * Here's an example with streaming response:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model-streaming}
+  */
+ class OllamaChatModel extends openai_1.OpenAIChatModel {
+     constructor(options) {
+         super({
+             ...options,
+             model: options?.model || OLLAMA_DEFAULT_CHAT_MODEL,
+             baseURL: options?.baseURL || process.env.OLLAMA_BASE_URL || OLLAMA_DEFAULT_BASE_URL,
+         });
+     }
+     apiKeyEnvName = "OLLAMA_API_KEY";
+     apiKeyDefault = "ollama";
+ }
+ exports.OllamaChatModel = OllamaChatModel;
package/lib/cjs/package.json ADDED
@@ -0,0 +1 @@
+ {"type": "commonjs"}
package/lib/dts/index.d.ts ADDED
@@ -0,0 +1 @@
+ export * from "./ollama-chat-model.js";
package/lib/dts/ollama-chat-model.d.ts ADDED
@@ -0,0 +1,22 @@
+ import { OpenAIChatModel, type OpenAIChatModelOptions } from "@aigne/openai";
+ /**
+  * Implementation of the ChatModel interface for Ollama
+  *
+  * This model allows you to run open-source LLMs locally using Ollama,
+  * with an OpenAI-compatible API interface.
+  *
+  * Default model: 'llama3.2'
+  *
+  * @example
+  * Here's how to create and use an Ollama chat model:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model}
+  *
+  * @example
+  * Here's an example with streaming response:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model-streaming}
+  */
+ export declare class OllamaChatModel extends OpenAIChatModel {
+     constructor(options?: OpenAIChatModelOptions);
+     protected apiKeyEnvName: string;
+     protected apiKeyDefault: string;
+ }
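As the declaration above shows, `OllamaChatModel` reuses `OpenAIChatModelOptions` from `@aigne/openai`, so an options object can be typed independently of the constructor call. A sketch restricted to the option fields demonstrated elsewhere in this package:

```typescript
import { OllamaChatModel } from "@aigne/ollama";
import type { OpenAIChatModelOptions } from "@aigne/openai";

// Only fields shown in this package's README are used here.
const options: OpenAIChatModelOptions = {
  baseURL: "http://localhost:11434/v1",
  model: "llama3.2",
  modelOptions: { temperature: 0.8 },
};

const model = new OllamaChatModel(options);
```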
package/lib/esm/index.d.ts ADDED
@@ -0,0 +1 @@
+ export * from "./ollama-chat-model.js";
package/lib/esm/index.js ADDED
@@ -0,0 +1 @@
+ export * from "./ollama-chat-model.js";
package/lib/esm/ollama-chat-model.d.ts ADDED
@@ -0,0 +1,22 @@
+ import { OpenAIChatModel, type OpenAIChatModelOptions } from "@aigne/openai";
+ /**
+  * Implementation of the ChatModel interface for Ollama
+  *
+  * This model allows you to run open-source LLMs locally using Ollama,
+  * with an OpenAI-compatible API interface.
+  *
+  * Default model: 'llama3.2'
+  *
+  * @example
+  * Here's how to create and use an Ollama chat model:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model}
+  *
+  * @example
+  * Here's an example with streaming response:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model-streaming}
+  */
+ export declare class OllamaChatModel extends OpenAIChatModel {
+     constructor(options?: OpenAIChatModelOptions);
+     protected apiKeyEnvName: string;
+     protected apiKeyDefault: string;
+ }
package/lib/esm/ollama-chat-model.js ADDED
@@ -0,0 +1,30 @@
+ import { OpenAIChatModel } from "@aigne/openai";
+ const OLLAMA_DEFAULT_BASE_URL = "http://localhost:11434/v1";
+ const OLLAMA_DEFAULT_CHAT_MODEL = "llama3.2";
+ /**
+  * Implementation of the ChatModel interface for Ollama
+  *
+  * This model allows you to run open-source LLMs locally using Ollama,
+  * with an OpenAI-compatible API interface.
+  *
+  * Default model: 'llama3.2'
+  *
+  * @example
+  * Here's how to create and use an Ollama chat model:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model}
+  *
+  * @example
+  * Here's an example with streaming response:
+  * {@includeCode ../test/ollama-chat-model.test.ts#example-ollama-chat-model-streaming}
+  */
+ export class OllamaChatModel extends OpenAIChatModel {
+     constructor(options) {
+         super({
+             ...options,
+             model: options?.model || OLLAMA_DEFAULT_CHAT_MODEL,
+             baseURL: options?.baseURL || process.env.OLLAMA_BASE_URL || OLLAMA_DEFAULT_BASE_URL,
+         });
+     }
+     apiKeyEnvName = "OLLAMA_API_KEY";
+     apiKeyDefault = "ollama";
+ }
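The constructor above resolves its configuration in a fixed order, and `apiKeyDefault = "ollama"` gives the underlying OpenAI-compatible client a placeholder key, since a local Ollama server does not require one. An illustrative sketch of the fallback chain (host name hypothetical):

```typescript
import { OllamaChatModel } from "@aigne/ollama";

// Built-in defaults from the source above: model "llama3.2", baseURL http://localhost:11434/v1.
const local = new OllamaChatModel();

// OLLAMA_BASE_URL takes precedence over the built-in default base URL.
process.env.OLLAMA_BASE_URL = "http://gpu-box:11434/v1"; // hypothetical host
const remote = new OllamaChatModel();

// Explicit options win over both the environment variable and the defaults.
const custom = new OllamaChatModel({ model: "qwen2.5", baseURL: "http://localhost:11434/v1" });
```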
package/lib/esm/package.json ADDED
@@ -0,0 +1 @@
+ {"type": "module"}
package/package.json ADDED
@@ -0,0 +1,54 @@
+ {
+   "name": "@aigne/ollama",
+   "version": "0.1.0",
+   "description": "AIGNE Ollama SDK for integrating with locally hosted AI models via Ollama",
+   "publishConfig": {
+     "access": "public"
+   },
+   "author": "Arcblock <blocklet@arcblock.io> https://github.com/blocklet",
+   "homepage": "https://github.com/AIGNE-io/aigne-framework",
+   "license": "Elastic-2.0",
+   "repository": {
+     "type": "git",
+     "url": "git+https://github.com/AIGNE-io/aigne-framework"
+   },
+   "files": [
+     "lib/cjs",
+     "lib/dts",
+     "lib/esm",
+     "LICENSE",
+     "README.md",
+     "CHANGELOG.md"
+   ],
+   "type": "module",
+   "main": "./lib/cjs/index.js",
+   "module": "./lib/esm/index.js",
+   "types": "./lib/dts/index.d.ts",
+   "exports": {
+     ".": {
+       "import": "./lib/esm/index.js",
+       "require": "./lib/cjs/index.js",
+       "types": "./lib/dts/index.d.ts"
+     }
+   },
+   "dependencies": {
+     "@aigne/openai": "^0.1.0"
+   },
+   "devDependencies": {
+     "@types/bun": "^1.2.12",
+     "@types/node": "^22.15.15",
+     "npm-run-all": "^4.1.5",
+     "rimraf": "^6.0.1",
+     "typescript": "^5.8.3",
+     "@aigne/core": "^1.16.0",
+     "@aigne/test-utils": "^0.3.0"
+   },
+   "scripts": {
+     "lint": "tsc --noEmit",
+     "build": "tsc --build scripts/tsconfig.build.json",
+     "clean": "rimraf lib test/coverage",
+     "test": "bun test",
+     "test:coverage": "bun test --coverage --coverage-reporter=lcov --coverage-reporter=text",
+     "postbuild": "echo '{\"type\": \"module\"}' > lib/esm/package.json && echo '{\"type\": \"commonjs\"}' > lib/cjs/package.json"
+   }
+ }
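The `exports` map wires the dual build outputs together, and the `postbuild` script writes the per-directory `package.json` type markers seen earlier in this diff. A brief consumption sketch under that layout:

```typescript
// ESM consumers resolve ./lib/esm/index.js through the "import" condition:
import { OllamaChatModel } from "@aigne/ollama";

// CJS consumers resolve ./lib/cjs/index.js through the "require" condition:
// const { OllamaChatModel } = require("@aigne/ollama");

const model = new OllamaChatModel();
console.log(model instanceof OllamaChatModel); // true
```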