@mnemai/memory-server 0.1.1 → 0.1.4

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/CHANGELOG.md CHANGED
@@ -2,6 +2,26 @@
 
  All notable changes to `@mnemai/memory-server` are documented here.
 
+ ## [0.1.4] - 2026-04-02
+
+ ### Fixed
+
+ - **npm README:** Link to `docs/` and repo files with **absolute GitHub URLs** so they work on [npmjs.com](https://www.npmjs.com/package/@mnemai/memory-server) (the `docs/` tree is not shipped in the package tarball).
+ - **package.json:** Add `repository`, `homepage`, and `bugs` for discoverability.
+
+ ## [0.1.3] - 2026-04-02
+
+ ### Changed
+
+ - **Branding / env:** Prefer **`MNEMAI_MEMORY_*`** environment variables (documented on npm README). Legacy **`TENGU_MEMORY_*`** names remain supported for all memory tuning and embeddings.
+ - **Default DB path:** New installs use **`~/.mnemai/memory.db`**. If **`~/.tengu/memory.db`** already exists, it is used until you migrate or set **`MNEMAI_MEMORY_DB`** explicitly.
+
+ ## [0.1.2] - 2026-04-02
+
+ ### Fixed
+
+ - Include **`LICENSE`** (MIT) in the published tarball via the `files` list so npm consumers get an explicit license file.
+
  ## [0.1.1] - 2026-04-02
 
  ### Fixed
package/README.md CHANGED
@@ -2,10 +2,12 @@
 
  **Memory 2.0** — MCP server for an **evidence-linked memory graph**: typed nodes, relationships, freshness decay, contradiction hints, and ranked retrieval with explicit provenance fields. **Local-first** (SQLite file on disk).
 
+ **Where the docs live:** This page is the npm **README** only. Deeper guides (security, competitive benchmark, RFC, performance, host matrix) are Markdown files in the **[`docs/` folder on GitHub](https://github.com/ashahi10/src/tree/main/docs)** — readable by anyone **when the repository is public**. They are **not** bundled inside the npm tarball.
+
  ## Capabilities
 
  | Area | What you get |
- |------|----------------|
+ |------|--------------|
  | Structure | Typed nodes, edges (`supports`, `contradicts`, …), evidence rows |
  | Retrieval | Hybrid **substring + BM25-style token index** + **optional embeddings**; bounded candidates on large graphs |
  | Trust / time | Freshness decay, refresh, spaced **review queue**, confidence rules with evidence |
@@ -26,13 +28,13 @@ MCP config (set an absolute DB path):
     "mnemai-memory": {
       "command": "npx",
       "args": ["--yes", "@mnemai/memory-server"],
-      "env": { "TENGU_MEMORY_DB": "/absolute/path/to/memory.db" }
+      "env": { "MNEMAI_MEMORY_DB": "/absolute/path/to/memory.db" }
     }
   }
 }
 ```
 
- Publishing and version bumps: see repo [RELEASING.md](../../RELEASING.md). `prepublishOnly` runs **`pnpm run verify:ship`**.
+ Publishing and version bumps: see **[RELEASING.md](https://github.com/ashahi10/src/blob/main/RELEASING.md)**. `prepublishOnly` runs **`pnpm run verify:ship`**.
 
  ## What `verify:ship` checks
 
@@ -43,7 +45,7 @@ Publishing and version bumps: see repo [RELEASING.md](../../RELEASING.md). `prep
 
  **Scope:** Proves the **reference SDK wire path** on CI’s OS (Linux). Hosts can still differ (timeouts, stderr). Not a guarantee for every proprietary MCP client.
 
- Repo CI also builds the workspace, typechecks, verifies **mission** + **verification** servers, **npm pack** sanity, and the **platform demo** (see root [README.md](../../README.md)).
+ Repo CI also builds the workspace, typechecks, verifies **mission** + **verification** servers, **npm pack** sanity, and the **platform demo** (see root **[README.md](https://github.com/ashahi10/src/blob/main/README.md)**).
 
  ## Requirements
 
@@ -62,19 +64,21 @@ node packages/memory-server/dist/index.js
 
  ## Environment variables
 
+ Use the **`MNEMAI_MEMORY_*`** names below. The same settings also accept legacy **`TENGU_MEMORY_*`** names (for older configs).
+
  | Variable | Purpose |
  |----------|---------|
- | `TENGU_MEMORY_DB` | Path to the SQLite file (default: `~/.tengu/memory.db`) |
- | `TENGU_MEMORY_SYNC_WRITES=1` | Flush to disk immediately after writes (stronger durability, slower) |
- | `TENGU_MEMORY_MATCH_LEXICAL` | Weight for substring overlap in hybrid `matchScore` (default `0.35`; renormalized with index/semantic) |
- | `TENGU_MEMORY_MATCH_INDEX` | Weight for BM25-style token index (default `0.45`) |
- | `TENGU_MEMORY_MATCH_SEMANTIC` | Weight for embedding cosine channel when enabled (default `0.2`) |
- | `TENGU_MEMORY_EMBED_URL` | OpenAI-compatible **POST** embeddings endpoint |
- | `TENGU_MEMORY_EMBED_KEY` | Bearer token for that endpoint |
- | `TENGU_MEMORY_EMBED_MODEL` | Embedding model id (default `text-embedding-3-small`) |
- | `TENGU_MEMORY_QUERY_FULL_SCAN_MAX_NODES` | Graphs larger than this use **index-bounded candidates** + recent seed (default `1600`; `0` = always use smart path when the index matches) |
- | `TENGU_MEMORY_QUERY_INDEX_CANDIDATE_CAP` | Floor for max BM25-hit ids per query; effective cap is `max(this, limit×25)` (default `600`) |
- | `TENGU_MEMORY_QUERY_RECENT_SEED` | Union this many most recently updated nodes (after filters) (default `200`) |
+ | `MNEMAI_MEMORY_DB` | Path to the SQLite file. If unset: uses existing `~/.tengu/memory.db` when present, otherwise creates **`~/.mnemai/memory.db`**. |
+ | `MNEMAI_MEMORY_SYNC_WRITES=1` | Flush to disk immediately after writes (stronger durability, slower). Legacy: `TENGU_MEMORY_SYNC_WRITES=1`. |
+ | `MNEMAI_MEMORY_MATCH_LEXICAL` | Weight for substring overlap in hybrid `matchScore` (default `0.35`; renormalized with index/semantic). Legacy `TENGU_*` alias. |
+ | `MNEMAI_MEMORY_MATCH_INDEX` | Weight for BM25-style token index (default `0.45`). Legacy alias. |
+ | `MNEMAI_MEMORY_MATCH_SEMANTIC` | Weight for embedding cosine channel when enabled (default `0.2`). Legacy alias. |
+ | `MNEMAI_MEMORY_EMBED_URL` | OpenAI-compatible **POST** embeddings endpoint. Legacy alias. |
+ | `MNEMAI_MEMORY_EMBED_KEY` | Bearer token for that endpoint. Legacy alias. |
+ | `MNEMAI_MEMORY_EMBED_MODEL` | Embedding model id (default `text-embedding-3-small`). Legacy alias. |
+ | `MNEMAI_MEMORY_QUERY_FULL_SCAN_MAX_NODES` | Graphs larger than this use **index-bounded candidates** + recent seed (default `1600`; `0` = always use smart path when the index matches). Legacy alias. |
+ | `MNEMAI_MEMORY_QUERY_INDEX_CANDIDATE_CAP` | Floor for max BM25-hit ids per query; effective cap is `max(this, limit×25)` (default `600`). Legacy alias. |
+ | `MNEMAI_MEMORY_QUERY_RECENT_SEED` | Union this many most recently updated nodes (after filters) (default `200`). Legacy alias. |
 
  ## MCP config (clone / local `node`)
 
@@ -84,7 +88,7 @@ node packages/memory-server/dist/index.js
     "mnemai-memory": {
       "command": "node",
       "args": ["/absolute/path/to/repo/packages/memory-server/dist/index.js"],
-      "env": { "TENGU_MEMORY_DB": "/absolute/path/to/my-memory.db" }
+      "env": { "MNEMAI_MEMORY_DB": "/absolute/path/to/my-memory.db" }
     }
   }
 }
@@ -103,7 +107,7 @@ Use **absolute paths** — hosts often use a cwd that is not your repo.
  | `memory.refresh` | Reaffirm / boost freshness; optional `reaffirmationNote` |
  | `memory.list_review_queue` | Spaced verification queue |
  | `memory.verify_node` | Confirm a memory still holds |
- | `memory.embed_node` | Store embedding (requires `TENGU_MEMORY_EMBED_*`) |
+ | `memory.embed_node` | Store embedding (requires `MNEMAI_MEMORY_EMBED_*` or legacy `TENGU_MEMORY_EMBED_*`) |
  | `memory.stats` | Aggregate stats (+ index / embedding counts) |
 
  ## Resources
@@ -115,17 +119,17 @@ Use **absolute paths** — hosts often use a cwd that is not your repo.
 
  ## Security, performance, limits
 
- - [Security & privacy](../../docs/Memory-2.0-Security-Privacy.md)
- - [Performance (test SLOs)](../../docs/Memory-2.0-Performance.md)
- - [Host matrix](../../docs/Memory-2.0-Host-Matrix.md)
+ - [Security & privacy](https://github.com/ashahi10/src/blob/main/docs/Memory-2.0-Security-Privacy.md)
+ - [Performance (test SLOs)](https://github.com/ashahi10/src/blob/main/docs/Memory-2.0-Performance.md)
+ - [Host matrix](https://github.com/ashahi10/src/blob/main/docs/Memory-2.0-Host-Matrix.md)
 
  **Limits (honest):** **sql.js** build has **no FTS5**; lexical search uses **`node_search_tokens`** + BM25-style scoring plus substring match. **Embeddings** are optional and require your own API. Large graphs use **bounded retrieval** (see env vars above). These tradeoffs favor **portable, predictable** local operation over mimicking a hosted search product.
 
  ## Related docs
 
- - [RFC: Memory 2.0](../../docs/RFC-Memory-2.0.md)
- - [Runtime compatibility](../../docs/Memory-2.0-Runtime-Compatibility.md)
- - [Competitive benchmark](../../docs/Memory-2.0-Competitive-Benchmark.md)
+ - [RFC: Memory 2.0](https://github.com/ashahi10/src/blob/main/docs/RFC-Memory-2.0.md)
+ - [Runtime compatibility](https://github.com/ashahi10/src/blob/main/docs/Memory-2.0-Runtime-Compatibility.md)
+ - [Competitive benchmark](https://github.com/ashahi10/src/blob/main/docs/Memory-2.0-Competitive-Benchmark.md)
 
  ## License
 
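The candidate-cap row in the env table above can be made concrete with a small sketch (illustrative only; this mirrors the `effectiveIndexHitCap` helper visible later in the `dist/index.js` diff):

```javascript
// Effective per-query cap on BM25 index hits: the configured
// MNEMAI_MEMORY_QUERY_INDEX_CANDIDATE_CAP acts as a floor, and the cap
// scales with the requested result limit as max(capFloor, limit * 25).
function effectiveIndexHitCap(limit, capFloor = 600) {
  return Math.max(capFloor, limit * 25);
}

console.log(effectiveIndexHitCap(10)); // 600 — small limits hit the floor
console.log(effectiveIndexHitCap(50)); // 1250 — larger limits scale past it
```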
package/dist/index.js CHANGED
@@ -37,9 +37,7 @@ import { v4 as uuidv4 } from "uuid";
 
  // src/graph/store.ts
  import initSqlJs from "sql.js";
- import { homedir } from "os";
- import { join } from "path";
- import { mkdirSync, readFileSync, writeFileSync, existsSync } from "fs";
+ import { readFileSync, writeFileSync, existsSync as existsSync2 } from "fs";
 
  // src/retrieval/tokenize.ts
  var STOPWORDS = /* @__PURE__ */ new Set([
@@ -206,23 +204,39 @@ function searchIndexTableExists(database) {
    return ok;
  }
 
+ // src/envMemory.ts
+ import { existsSync, mkdirSync } from "fs";
+ import { homedir } from "os";
+ import { join } from "path";
+ function preferMnemaiEnv(mnemaiName, tenguName) {
+   const a = process.env[mnemaiName]?.trim();
+   if (a) return a;
+   const b = process.env[tenguName]?.trim();
+   return b || void 0;
+ }
+ function getMemoryDbPath() {
+   const explicit = preferMnemaiEnv("MNEMAI_MEMORY_DB", "TENGU_MEMORY_DB");
+   if (explicit) return explicit;
+   const legacyFile = join(homedir(), ".tengu", "memory.db");
+   if (existsSync(legacyFile)) return legacyFile;
+   const dir = join(homedir(), ".mnemai");
+   mkdirSync(dir, { recursive: true });
+   return join(dir, "memory.db");
+ }
+ function isMemorySyncWrites() {
+   return process.env.MNEMAI_MEMORY_SYNC_WRITES === "1" || process.env.TENGU_MEMORY_SYNC_WRITES === "1";
+ }
+
  // src/graph/store.ts
  var db = null;
  var dbPath = null;
  var saveTimer = null;
- var SYNC_WRITES = process.env.TENGU_MEMORY_SYNC_WRITES === "1";
- function getDbPath() {
-   const envPath = process.env.TENGU_MEMORY_DB;
-   if (envPath) return envPath;
-   const dir = join(homedir(), ".tengu");
-   mkdirSync(dir, { recursive: true });
-   return join(dir, "memory.db");
- }
+ var SYNC_WRITES = isMemorySyncWrites();
  async function initDb() {
    if (db) return db;
    const SQL = await initSqlJs();
-   dbPath = getDbPath();
-   if (existsSync(dbPath)) {
+   dbPath = getMemoryDbPath();
+   if (existsSync2(dbPath)) {
      const buffer = readFileSync(dbPath);
      db = new SQL.Database(buffer);
    } else {
@@ -779,17 +793,25 @@ var INTENT_WEIGHTS = {
    incident_triage: { relevance: 0.35, freshness: 0.35, evidence: 0.2, salience: 0.1 },
    preference_personalization: { relevance: 0.35, freshness: 0.2, evidence: 0.15, salience: 0.3 }
  };
- function parseEnvWeight(name, fallback) {
-   const v = process.env[name];
-   if (v == null || v === "") return fallback;
-   const n = Number(v);
+ function parseEnvWeightRaw(raw, fallback) {
+   if (raw == null || raw === "") return fallback;
+   const n = Number(raw);
    if (!Number.isFinite(n) || n < 0) return fallback;
    return n;
  }
  function matchBlendWeights(useSemantic) {
- let lex = parseEnvWeight("TENGU_MEMORY_MATCH_LEXICAL", 0.35);
- let idx = parseEnvWeight("TENGU_MEMORY_MATCH_INDEX", 0.45);
- let sem = parseEnvWeight("TENGU_MEMORY_MATCH_SEMANTIC", 0.2);
+ let lex = parseEnvWeightRaw(
+   preferMnemaiEnv("MNEMAI_MEMORY_MATCH_LEXICAL", "TENGU_MEMORY_MATCH_LEXICAL"),
+   0.35
+ );
+ let idx = parseEnvWeightRaw(
+   preferMnemaiEnv("MNEMAI_MEMORY_MATCH_INDEX", "TENGU_MEMORY_MATCH_INDEX"),
+   0.45
+ );
+ let sem = parseEnvWeightRaw(
+   preferMnemaiEnv("MNEMAI_MEMORY_MATCH_SEMANTIC", "TENGU_MEMORY_MATCH_SEMANTIC"),
+   0.2
+ );
  if (!useSemantic) {
    const total = lex + idx;
    if (total <= 0) return { lex: 0.55, idx: 0.45, sem: 0 };
@@ -948,15 +970,17 @@ function computeConflict(node, edges, nodeById) {
 
  // src/retrieval/embedding.ts
  var DEFAULT_MODEL = "text-embedding-3-small";
  function isEmbeddingsConfigured() {
-   return Boolean(process.env.TENGU_MEMORY_EMBED_URL?.trim() && process.env.TENGU_MEMORY_EMBED_KEY?.trim());
+   const url = preferMnemaiEnv("MNEMAI_MEMORY_EMBED_URL", "TENGU_MEMORY_EMBED_URL");
+   const key = preferMnemaiEnv("MNEMAI_MEMORY_EMBED_KEY", "TENGU_MEMORY_EMBED_KEY");
+   return Boolean(url && key);
  }
  function getEmbeddingModel() {
-   return process.env.TENGU_MEMORY_EMBED_MODEL?.trim() || DEFAULT_MODEL;
+   return preferMnemaiEnv("MNEMAI_MEMORY_EMBED_MODEL", "TENGU_MEMORY_EMBED_MODEL") || DEFAULT_MODEL;
  }
  async function embedText(text) {
    if (!isEmbeddingsConfigured()) return null;
-   const url = process.env.TENGU_MEMORY_EMBED_URL.trim();
-   const key = process.env.TENGU_MEMORY_EMBED_KEY.trim();
+   const url = preferMnemaiEnv("MNEMAI_MEMORY_EMBED_URL", "TENGU_MEMORY_EMBED_URL");
+   const key = preferMnemaiEnv("MNEMAI_MEMORY_EMBED_KEY", "TENGU_MEMORY_EMBED_KEY");
    const model = getEmbeddingModel();
    const body = JSON.stringify({ model, input: text.slice(0, 8e3) });
    const res = await fetch(url, {
@@ -1042,9 +1066,27 @@ function parseNonNegativeInt(raw, fallback) {
  }
  function parseQueryRetrievalBudget() {
    return {
-     fullScanMaxNodes: parseNonNegativeInt(process.env.TENGU_MEMORY_QUERY_FULL_SCAN_MAX_NODES, 1600),
-     indexCandidateCapFloor: Math.max(1, parseNonNegativeInt(process.env.TENGU_MEMORY_QUERY_INDEX_CANDIDATE_CAP, 600)),
-     recentSeedSize: parseNonNegativeInt(process.env.TENGU_MEMORY_QUERY_RECENT_SEED, 200)
+     fullScanMaxNodes: parseNonNegativeInt(
+       preferMnemaiEnv(
+         "MNEMAI_MEMORY_QUERY_FULL_SCAN_MAX_NODES",
+         "TENGU_MEMORY_QUERY_FULL_SCAN_MAX_NODES"
+       ),
+       1600
+     ),
+     indexCandidateCapFloor: Math.max(
+       1,
+       parseNonNegativeInt(
+         preferMnemaiEnv(
+           "MNEMAI_MEMORY_QUERY_INDEX_CANDIDATE_CAP",
+           "TENGU_MEMORY_QUERY_INDEX_CANDIDATE_CAP"
+         ),
+         600
+       )
+     ),
+     recentSeedSize: parseNonNegativeInt(
+       preferMnemaiEnv("MNEMAI_MEMORY_QUERY_RECENT_SEED", "TENGU_MEMORY_QUERY_RECENT_SEED"),
+       200
+     )
    };
  }
  function effectiveIndexHitCap(limit, capFloor) {
@@ -1183,10 +1225,10 @@ var queryMemorySchema = z2.object({
    "Include BM25-style portable token index in hybrid rank (default true when index exists)."
  ),
  useSemantic: z2.boolean().optional().describe(
-   "Allow query-time embedding call when TENGU_MEMORY_EMBED_* is configured (default true)."
+   "Allow query-time embedding call when MNEMAI_MEMORY_EMBED_* (or TENGU_MEMORY_EMBED_*) is configured (default true)."
  ),
  fullScan: z2.boolean().optional().describe(
-   "Load the entire filtered graph for ranking (slower, strongest recall). Default uses index-bounded candidates when the graph is larger than TENGU_MEMORY_QUERY_FULL_SCAN_MAX_NODES."
+   "Load the entire filtered graph for ranking (slower, strongest recall). Default uses index-bounded candidates when the graph is larger than MNEMAI_MEMORY_QUERY_FULL_SCAN_MAX_NODES (or TENGU_* alias)."
  )
  });
  async function handleQueryMemory(args) {
@@ -1564,7 +1606,7 @@ async function handleEmbedNode(args) {
    type: "text",
    text: JSON.stringify({
      error: "embeddings_not_configured",
-     hint: "Set TENGU_MEMORY_EMBED_URL (OpenAI-compatible embeddings endpoint), TENGU_MEMORY_EMBED_KEY, and optionally TENGU_MEMORY_EMBED_MODEL."
+     hint: "Set MNEMAI_MEMORY_EMBED_URL (OpenAI-compatible embeddings endpoint), MNEMAI_MEMORY_EMBED_KEY, and optionally MNEMAI_MEMORY_EMBED_MODEL (legacy TENGU_MEMORY_EMBED_* still work)."
    }, null, 2)
  }
  ],
@@ -1622,7 +1664,7 @@ ${node.tags.join(" ")}`.slice(0, 8e3);
 
  // src/index.ts
  var server = new McpServer({
    name: "mnemai-memory",
-   version: "0.1.1"
+   version: "0.1.4"
  });
  server.tool(
    "memory.create_node",
@@ -1674,7 +1716,7 @@ server.tool(
  );
  server.tool(
    "memory.embed_node",
-   "Store an embedding vector for a node (requires TENGU_MEMORY_EMBED_URL + TENGU_MEMORY_EMBED_KEY); enables semantic channel in memory.query",
+   "Store an embedding vector for a node (requires MNEMAI_MEMORY_EMBED_URL + MNEMAI_MEMORY_EMBED_KEY, or legacy TENGU_* names); enables semantic channel in memory.query",
    embedNodeSchema.shape,
    async (args) => handleEmbedNode(args)
  );
@@ -1745,7 +1787,7 @@ Usage: mnemai-memory
 
  Starts the Model Context Protocol server on stdin/stdout (no HTTP port).
 
  Docs: https://www.npmjs.com/package/@mnemai/memory-server
- Environment: TENGU_MEMORY_DB (SQLite path), TENGU_MEMORY_EMBED_* (optional embeddings).
+ Environment: MNEMAI_MEMORY_DB (SQLite path; legacy TENGU_MEMORY_DB supported), MNEMAI_MEMORY_EMBED_* (optional embeddings; TENGU_* aliases).
  `);
  process.exit(0);
  }
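The weight plumbing in the hunks above can be exercised with a small standalone sketch. Note that the `!useSemantic` renormalization branch is only partially visible in this diff, so the proportional rescale below is an assumption for illustration, not a copy of the shipped code:

```javascript
// Hybrid matchScore blend with the documented defaults
// (lexical 0.35, index 0.45, semantic 0.2). When the semantic channel is
// off, the remaining weights are rescaled to sum to 1 (assumed behavior);
// the total <= 0 fallback values come straight from the diff above.
function matchBlendWeights(useSemantic, lex = 0.35, idx = 0.45, sem = 0.2) {
  if (!useSemantic) {
    const total = lex + idx;
    if (total <= 0) return { lex: 0.55, idx: 0.45, sem: 0 };
    return { lex: lex / total, idx: idx / total, sem: 0 };
  }
  const total = lex + idx + sem;
  if (total <= 0) return { lex: 0.35, idx: 0.45, sem: 0.2 };
  return { lex: lex / total, idx: idx / total, sem: sem / total };
}

console.log(matchBlendWeights(false)); // semantic off: lexical/index rescaled, sem 0
```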
package/package.json CHANGED
@@ -1,8 +1,17 @@
 {
   "name": "@mnemai/memory-server",
-  "version": "0.1.1",
+  "version": "0.1.4",
   "description": "MCP server providing evidence-linked memory graph with typed nodes, freshness decay, and contradiction handling",
   "license": "MIT",
+  "repository": {
+    "type": "git",
+    "url": "git+https://github.com/ashahi10/src.git",
+    "directory": "packages/memory-server"
+  },
+  "homepage": "https://github.com/ashahi10/src/tree/main/packages/memory-server#readme",
+  "bugs": {
+    "url": "https://github.com/ashahi10/src/issues"
+  },
   "type": "module",
   "main": "dist/index.js",
   "bin": {
@@ -12,7 +21,8 @@
     "dist",
     "bin",
     "CHANGELOG.md",
-    "README.md"
+    "README.md",
+    "LICENSE"
   ],
   "publishConfig": {
     "access": "public"