@releasekit/notes 0.1.0 → 0.1.1-next.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,175 +1,113 @@
- # changelog-creator
+ # @releasekit/notes

- A CLI tool for generating changelogs with LLM-powered enhancement and flexible templating.
+ Changelog generation with LLM-powered enhancement and flexible templating.

  ## Features

- - **Multiple input sources**: package-versioner JSON, git log, manual JSON
- - **Flexible templating**: Liquid, Handlebars, EJS - single file or composable
- - **LLM enhancement** (optional): Summarize, categorize, enhance descriptions, generate release notes
- - **Monorepo support**: Root aggregation, per-package changelogs, or both
- - **Multiple outputs**: Markdown, JSON, GitHub Releases API
+ - **Multiple input sources** — `@releasekit/version` JSON, git log, or manual JSON
+ - **Flexible templating** — Liquid, Handlebars, or EJS with single-file or composable templates
+ - **LLM enhancement** (optional) — summarize, categorize, enhance descriptions, generate release notes
+ - **Monorepo support** — root aggregation, per-package changelogs, or both
+ - **Multiple outputs** — Markdown, JSON, or GitHub Releases API
+ - **Dry-run mode** — preview without writing files

  ## Installation

  ```bash
- npm install -g changelog-creator
+ npm install -g @releasekit/notes
  # or
- pnpm add -g changelog-creator
+ pnpm add -g @releasekit/notes
  ```

  ## Quick Start

  ```bash
- # Pipe from package-versioner
- npx package-versioner --json | changelog-creator
+ # Pipe from @releasekit/version
+ releasekit-version --json | releasekit-notes

- # From file
- changelog-creator --input version-data.json
+ # From a file
+ releasekit-notes --input version-data.json

  # With LLM enhancement
- changelog-creator --input version-data.json --llm-provider openai --llm-model gpt-4o-mini
- ```
+ releasekit-notes --input version-data.json --llm-provider openai --llm-model gpt-4o-mini

- ## CLI Commands
+ # Preview without writing
+ releasekit-notes --dry-run
+ ```

- ### `changelog-creator generate` (default)
+ ## CLI Reference

- Generate changelog from input data.
+ | Flag | Description | Default |
+ |------|-------------|---------|
+ | `-i, --input <file>` | Input file path | stdin |
+ | `-o, --output <spec>` | Output spec (`format:file`) | config |
+ | `-t, --template <path>` | Template file or directory | built-in |
+ | `-e, --engine <engine>` | Template engine: `handlebars`, `liquid`, `ejs` | `liquid` |
+ | `--monorepo <mode>` | Monorepo mode: `root`, `packages`, `both` | — |
+ | `--llm-provider <name>` | LLM provider | — |
+ | `--llm-model <model>` | LLM model | — |
+ | `--llm-tasks <tasks>` | Comma-separated LLM tasks | — |
+ | `--no-llm` | Disable LLM processing | `false` |
+ | `--config <path>` | Config file path | `releasekit.config.json` |
+ | `--dry-run` | Preview without writing | `false` |
+ | `--regenerate` | Regenerate entire changelog | `false` |
+ | `-v, --verbose` | Verbose logging | `false` |
+ | `-q, --quiet` | Suppress non-error output | `false` |

- ```bash
- changelog-creator [options]
-
- Options:
- -i, --input <file> Input file (default: stdin)
- -o, --output <spec> Output spec (format:file)
- -t, --template <path> Template file or directory
- -e, --engine <engine> Template engine (handlebars|liquid|ejs)
- --monorepo <mode> Monorepo mode (root|packages|both)
- --llm-provider <provider> LLM provider
- --llm-model <model> LLM model
- --llm-tasks <tasks> Comma-separated LLM tasks
- --no-llm Disable LLM processing
- --config <path> Config file path
- --dry-run Preview without writing
- --regenerate Regenerate entire changelog
- -v, --verbose Increase verbosity
- -q, --quiet Suppress non-error output
- ```
+ ## Subcommands

- ### `changelog-creator init`
+ ### `releasekit-notes init`

  Create a default configuration file.

  ```bash
- changelog-creator init [--force]
+ releasekit-notes init [--force]
  ```

- ### `changelog-creator auth <provider>`
+ ### `releasekit-notes auth <provider>`

  Configure API key for an LLM provider.

  ```bash
- changelog-creator auth openai --key sk-...
- changelog-creator auth anthropic
+ releasekit-notes auth openai --key sk-...
+ releasekit-notes auth anthropic
  ```

- ### `changelog-creator providers`
+ ### `releasekit-notes providers`

  List available LLM providers.

  ## Configuration

- Create `changelog.config.json` in your project root:
+ Configure via `releasekit.config.json`:

  ```json
  {
- "output": [
- { "format": "markdown", "file": "CHANGELOG.md" }
- ],
- "updateStrategy": "prepend",
- "templates": {
- "path": "./templates/",
- "engine": "liquid"
- },
- "llm": {
- "provider": "openai",
- "model": "gpt-4o-mini",
- "tasks": {
- "summarize": true,
- "enhance": true
+ "notes": {
+ "output": [
+ { "format": "markdown", "file": "CHANGELOG.md" }
+ ],
+ "updateStrategy": "prepend",
+ "templates": {
+ "path": "./templates/",
+ "engine": "liquid"
+ },
+ "llm": {
+ "provider": "openai",
+ "model": "gpt-4o-mini",
+ "tasks": {
+ "summarize": true,
+ "enhance": true
+ }
  }
  }
  }
  ```

- ### Config Locations (precedence)
-
- 1. `CHANGELOG_CONFIG_CONTENT` env var
- 2. `--config` CLI flag
- 3. `changelog.config.json` in project
- 4. `~/.config/changelog-creator/config.json`
-
- ## Input Sources
-
- ### package-versioner JSON
-
- ```bash
- npx package-versioner --json | changelog-creator
- ```
-
- ### Git Log
-
- ```bash
- changelog-creator --input-source git-log --from v1.0.0 --to HEAD
- ```
-
- ### Manual JSON
-
- ```json
- {
- "packages": [{
- "packageName": "my-app",
- "version": "1.2.0",
- "entries": [
- { "type": "added", "description": "New feature" }
- ]
- }]
- }
- ```
-
- ## Templates
-
- ### Single File
-
- ```bash
- changelog-creator --template ./my-changelog.liquid
- ```
-
- ### Composable
-
- ```bash
- changelog-creator --template ./templates/
- ```
-
- Directory structure:
- ```
- templates/
- ├── document.liquid
- ├── version.liquid
- └── entry.liquid
- ```
-
- ### Built-in Templates
-
- - `keep-a-changelog` - Default, Keep a Changelog format
- - `angular` - Angular-style changelog
- - `github-release` - GitHub release notes
-
  ## LLM Providers

- | Provider | Config | Notes |
- |----------|--------|-------|
+ | Provider | Config Key | Notes |
+ |----------|------------|-------|
  | OpenAI | `openai` | Requires `OPENAI_API_KEY` |
  | Anthropic | `anthropic` | Requires `ANTHROPIC_API_KEY` |
  | Ollama | `ollama` | Local, no API key needed |
@@ -184,38 +122,44 @@ templates/
  | `categorize` | Group entries by category |
  | `releaseNotes` | Generate release notes |

- ## Monorepo Support
+ ## Templates

- ```bash
- # Root changelog only (aggregates all packages)
- changelog-creator --monorepo root
+ ### Built-in

- # Per-package changelogs
- changelog-creator --monorepo packages
+ - `keep-a-changelog` — Keep a Changelog format (default)
+ - `angular` — Angular-style changelog
+ - `github-release` — GitHub release notes

- # Both
- changelog-creator --monorepo both
- ```
-
- ## Output Formats
-
- ### Markdown
+ ### Custom Templates

  ```bash
- changelog-creator -o markdown:CHANGELOG.md
+ # Single file
+ releasekit-notes --template ./my-changelog.liquid
+
+ # Composable directory
+ releasekit-notes --template ./templates/
  ```

- ### JSON
+ Composable directory structure:

- ```bash
- changelog-creator -o json:changelog.json
+ ```
+ templates/
+ ├── document.liquid
+ ├── version.liquid
+ └── entry.liquid
  ```

- ### GitHub Release
+ ## Monorepo Support

  ```bash
- changelog-creator -o github-release
- # Requires GITHUB_TOKEN env var
+ # Root changelog only (aggregates all packages)
+ releasekit-notes --monorepo root
+
+ # Per-package changelogs
+ releasekit-notes --monorepo packages
+
+ # Both
+ releasekit-notes --monorepo both
  ```

  ## License
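The configuration shown in this diff moves under a top-level `notes` key in the shared `releasekit.config.json`. A hypothetical helper illustrating that lookup (the function name and the empty-object fallback are mine, not from the package):

```javascript
// Hypothetical: pull this tool's section out of the shared
// releasekit.config.json, where it now nests under "notes".
function notesSection(config) {
  return config.notes ?? {}; // fallback is an assumption, not documented package behavior
}
```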
@@ -195,6 +195,10 @@ function formatVersion(context) {
  lines.push(`[Full Changelog](${context.compareUrl})`);
  lines.push("");
  }
+ if (context.enhanced?.summary) {
+ lines.push(context.enhanced.summary);
+ lines.push("");
+ }
  const grouped = groupEntriesByType(context.entries);
  for (const [type, entries] of grouped) {
  if (entries.length === 0) continue;
@@ -379,15 +383,17 @@ var OllamaProvider = class extends BaseLLMProvider {
  name = "ollama";
  baseURL;
  model;
+ apiKey;
  constructor(config = {}) {
  super();
  this.baseURL = config.baseURL ?? process.env.OLLAMA_BASE_URL ?? "http://localhost:11434";
  this.model = config.model ?? LLM_DEFAULTS.models.ollama;
+ this.apiKey = config.apiKey ?? process.env.OLLAMA_API_KEY;
  }
  async complete(prompt, options) {
  const requestBody = {
  model: this.model,
- prompt,
+ messages: [{ role: "user", content: prompt }],
  stream: false,
  options: {
  num_predict: this.getMaxTokens(options),
@@ -395,11 +401,16 @@ var OllamaProvider = class extends BaseLLMProvider {
  }
  };
  try {
- const response = await fetch(`${this.baseURL}/api/generate`, {
+ const headers = {
+ "Content-Type": "application/json"
+ };
+ if (this.apiKey) {
+ headers["Authorization"] = `Bearer ${this.apiKey}`;
+ }
+ const baseUrl = this.baseURL.endsWith("/api") ? this.baseURL.slice(0, -4) : this.baseURL;
+ const response = await fetch(`${baseUrl}/api/chat`, {
  method: "POST",
- headers: {
- "Content-Type": "application/json"
- },
+ headers,
  body: JSON.stringify(requestBody)
  });
  if (!response.ok) {
@@ -407,10 +418,10 @@ var OllamaProvider = class extends BaseLLMProvider {
  throw new LLMError(`Ollama request failed: ${response.status} ${text}`);
  }
  const data = await response.json();
- if (!data.response) {
+ if (!data.message?.content) {
  throw new LLMError("Empty response from Ollama");
  }
- return data.response;
+ return data.message.content;
  } catch (error) {
  if (error instanceof LLMError) throw error;
  throw new LLMError(`Ollama error: ${error instanceof Error ? error.message : String(error)}`);
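The hunks above migrate the Ollama provider from the legacy `/api/generate` completion endpoint to `/api/chat`, add optional bearer auth, and tolerate a configured base URL that already ends in `/api`. A minimal sketch of that URL normalization and the new request shape (function names are mine, not from the package):

```javascript
// Strip a trailing "/api" so configured URLs like
// "http://localhost:11434/api" don't produce ".../api/api/chat".
function normalizeOllamaBase(baseURL) {
  return baseURL.endsWith("/api") ? baseURL.slice(0, -4) : baseURL;
}

// Build the chat-endpoint request the new code sends:
// a messages array instead of the old bare "prompt" field.
function buildChatRequest(baseURL, model, prompt, apiKey) {
  const headers = { "Content-Type": "application/json" };
  if (apiKey) headers.Authorization = `Bearer ${apiKey}`; // optional, e.g. for a proxied Ollama
  return {
    url: `${normalizeOllamaBase(baseURL)}/api/chat`,
    headers,
    body: { model, messages: [{ role: "user", content: prompt }], stream: false },
  };
}
```

The response is then read from `data.message.content` rather than the old `data.response` field, matching the chat endpoint's shape.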
@@ -493,7 +504,7 @@ var OpenAICompatibleProvider = class extends BaseLLMProvider {

  // src/llm/tasks/categorize.ts
  import { warn } from "@releasekit/core";
- var CATEGORIZE_PROMPT = `You are categorizing changelog entries for a software release.
+ var DEFAULT_CATEGORIZE_PROMPT = `You are categorizing changelog entries for a software release.

  Given the following entries, group them into meaningful categories (e.g., "Core", "UI", "API", "Performance", "Bug Fixes", "Documentation").

@@ -503,24 +514,63 @@ Entries:
  {{entries}}

  Output only valid JSON, nothing else:`;
- async function categorizeEntries(provider, entries, _context) {
+ function buildCustomCategorizePrompt(categories) {
+ const categoryList = categories.map((c) => `- "${c.name}": ${c.description}`).join("\n");
+ return `You are categorizing changelog entries for a software release.
+
+ Given the following entries, group them into the specified categories. Only use the categories listed below.
+
+ Categories:
+ ${categoryList}
+
+ For entries in categories that involve internal/developer changes, set a "scope" field on those entries with a short subcategory label (e.g., "CI", "Dependencies", "Testing", "Code Quality", "Build System").
+
+ Output a JSON object with two fields:
+ - "categories": an object where keys are category names and values are arrays of entry indices (0-based)
+ - "scopes": an object where keys are entry indices (as strings) and values are scope labels
+
+ Entries:
+ {{entries}}
+
+ Output only valid JSON, nothing else:`;
+ }
+ async function categorizeEntries(provider, entries, context) {
  if (entries.length === 0) {
  return [];
  }
  const entriesText = entries.map((e, i) => `${i}. [${e.type}]${e.scope ? ` (${e.scope})` : ""}: ${e.description}`).join("\n");
- const prompt = CATEGORIZE_PROMPT.replace("{{entries}}", entriesText);
+ const hasCustomCategories = context.categories && context.categories.length > 0;
+ const promptTemplate = hasCustomCategories ? buildCustomCategorizePrompt(context.categories) : DEFAULT_CATEGORIZE_PROMPT;
+ const prompt = promptTemplate.replace("{{entries}}", entriesText);
  try {
  const response = await provider.complete(prompt);
  const cleaned = response.replace(/^```(?:json)?\n?/, "").replace(/\n?```$/, "").trim();
  const parsed = JSON.parse(cleaned);
  const result = [];
- for (const [category, indices] of Object.entries(parsed)) {
- const categoryEntries = indices.map((i) => entries[i]).filter((e) => e !== void 0);
- if (categoryEntries.length > 0) {
- result.push({
- category,
- entries: categoryEntries
- });
+ if (hasCustomCategories && parsed.categories) {
+ const categoryMap = parsed.categories;
+ const scopeMap = parsed.scopes || {};
+ for (const [indexStr, scope] of Object.entries(scopeMap)) {
+ const idx = Number.parseInt(indexStr, 10);
+ if (entries[idx] && scope) {
+ entries[idx] = { ...entries[idx], scope };
+ }
+ }
+ for (const [category, rawIndices] of Object.entries(categoryMap)) {
+ const indices = Array.isArray(rawIndices) ? rawIndices : [];
+ const categoryEntries = indices.map((i) => entries[i]).filter((e) => e !== void 0);
+ if (categoryEntries.length > 0) {
+ result.push({ category, entries: categoryEntries });
+ }
+ }
+ } else {
+ const categoryMap = parsed;
+ for (const [category, rawIndices] of Object.entries(categoryMap)) {
+ const indices = Array.isArray(rawIndices) ? rawIndices : [];
+ const categoryEntries = indices.map((i) => entries[i]).filter((e) => e !== void 0);
+ if (categoryEntries.length > 0) {
+ result.push({ category, entries: categoryEntries });
+ }
  }
  }
  return result;
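The parsing step in this hunk first strips an optional markdown code fence from the model output before `JSON.parse`, then maps the returned entry indices back onto the original entries, dropping any out-of-range indices. A standalone sketch of those two pieces (function names are mine, not from the package):

```javascript
// Remove a wrapping ```json ... ``` fence, mirroring the cleanup
// the categorize task performs before JSON.parse.
function stripJsonFence(response) {
  return response.replace(/^```(?:json)?\n?/, "").replace(/\n?```$/, "").trim();
}

// Map { category: [indices] } back onto the original entries,
// skipping indices the model may have hallucinated.
function mapCategories(categoryMap, entries) {
  const result = [];
  for (const [category, rawIndices] of Object.entries(categoryMap)) {
    const indices = Array.isArray(rawIndices) ? rawIndices : [];
    const categoryEntries = indices.map((i) => entries[i]).filter((e) => e !== undefined);
    if (categoryEntries.length > 0) result.push({ category, entries: categoryEntries });
  }
  return result;
}
```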
@@ -538,10 +588,10 @@ Given a technical commit message, rewrite it as a clear, user-friendly changelog

  Rules:
  - Be concise (1-2 sentences max)
- - Use present tense ("Add feature" not "Added feature")
  - Focus on user impact, not implementation details
  - Don't use technical jargon unless necessary
  - Preserve the scope if mentioned (e.g., "core:", "api:")
+ {{style}}

  Original entry:
  Type: {{type}}
@@ -550,7 +600,8 @@ Description: {{description}}

  Rewritten description (only output the new description, nothing else):`;
  async function enhanceEntry(provider, entry, _context) {
- const prompt = ENHANCE_PROMPT.replace("{{type}}", entry.type).replace("{{#if scope}}Scope: {{scope}}{{/if}}", entry.scope ? `Scope: ${entry.scope}` : "").replace("{{description}}", entry.description);
+ const styleText = _context.style ? `- ${_context.style}` : '- Use present tense ("Add feature" not "Added feature")';
+ const prompt = ENHANCE_PROMPT.replace("{{style}}", styleText).replace("{{type}}", entry.type).replace("{{#if scope}}Scope: {{scope}}{{/if}}", entry.scope ? `Scope: ${entry.scope}` : "").replace("{{description}}", entry.description);
  const response = await provider.complete(prompt);
  return response.trim();
  }
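This change turns the hard-coded "present tense" rule into a default that a configured `llm.style` string can override through a `{{style}}` placeholder in the prompt. A small sketch of that substitution (the helper name is mine, not from the package):

```javascript
const DEFAULT_STYLE = '- Use present tense ("Add feature" not "Added feature")';

// Fill the {{style}} placeholder: a configured style rule wins,
// otherwise the previous hard-coded present-tense rule is kept.
function applyStyle(promptTemplate, style) {
  const styleText = style ? `- ${style}` : DEFAULT_STYLE;
  return promptTemplate.replace("{{style}}", styleText);
}
```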
@@ -652,6 +703,7 @@ function createProvider(config) {
  });
  case "ollama":
  return new OllamaProvider({
+ apiKey,
  baseURL: config.baseURL,
  model: config.model
  });
@@ -1111,37 +1163,58 @@ async function processWithLLM(context, config) {
  packageName: context.packageName,
  version: context.version,
  previousVersion: context.previousVersion ?? void 0,
- date: context.date
+ date: context.date,
+ categories: config.llm.categories,
+ style: config.llm.style
  };
  const enhanced = {
  entries: context.entries
  };
  try {
+ info4(`Using LLM provider: ${config.llm.provider}${config.llm.model ? ` (${config.llm.model})` : ""}`);
+ if (config.llm.baseURL) {
+ info4(`LLM base URL: ${config.llm.baseURL}`);
+ }
  const rawProvider = createProvider(config.llm);
  const retryOpts = config.llm.retry ?? LLM_DEFAULTS.retry;
  const provider = {
  name: rawProvider.name,
  complete: (prompt, opts) => withRetry(() => rawProvider.complete(prompt, opts), retryOpts)
  };
+ const activeTasks = Object.entries(tasks).filter(([, enabled]) => enabled).map(([name]) => name);
+ info4(`Running LLM tasks: ${activeTasks.join(", ")}`);
  if (tasks.enhance) {
- debug("Enhancing entries with LLM");
+ info4("Enhancing entries with LLM...");
  enhanced.entries = await enhanceEntries(provider, context.entries, llmContext, config.llm.concurrency);
+ info4(`Enhanced ${enhanced.entries.length} entries`);
  }
  if (tasks.summarize) {
- debug("Summarizing entries with LLM");
+ info4("Summarizing entries with LLM...");
  enhanced.summary = await summarizeEntries(provider, enhanced.entries, llmContext);
+ if (enhanced.summary) {
+ info4("Summary generated successfully");
+ debug(`Summary: ${enhanced.summary.substring(0, 100)}...`);
+ } else {
+ warn2("Summary generation returned empty result");
+ }
  }
  if (tasks.categorize) {
- debug("Categorizing entries with LLM");
+ info4("Categorizing entries with LLM...");
  const categorized = await categorizeEntries(provider, enhanced.entries, llmContext);
  enhanced.categories = {};
  for (const cat of categorized) {
  enhanced.categories[cat.category] = cat.entries;
  }
+ info4(`Created ${categorized.length} categories`);
  }
  if (tasks.releaseNotes) {
- debug("Generating release notes with LLM");
+ info4("Generating release notes with LLM...");
  enhanced.releaseNotes = await generateReleaseNotes(provider, enhanced.entries, llmContext);
+ if (enhanced.releaseNotes) {
+ info4("Release notes generated successfully");
+ } else {
+ warn2("Release notes generation returned empty result");
+ }
  }
  return {
  ...context,
@@ -1193,11 +1266,9 @@ async function generateWithTemplate(contexts, config, outputPath, dryRun) {
  async function runPipeline(input, config, dryRun) {
  debug(`Processing ${input.packages.length} package(s)`);
  let contexts = input.packages.map(createTemplateContext);
- if (config.llm && !process.env.CHANGELOG_NO_LLM && !dryRun) {
+ if (config.llm && !process.env.CHANGELOG_NO_LLM) {
  info4("Processing with LLM enhancement");
  contexts = await Promise.all(contexts.map((ctx) => processWithLLM(ctx, config)));
- } else if (config.llm && dryRun) {
- info4("Skipping LLM processing in dry-run mode");
  }
  for (const output of config.output) {
  info4(`Generating ${output.format} output`);
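The last hunk removes the `!dryRun` guard, so LLM processing now runs during dry runs too (only the file writes are skipped elsewhere); the `CHANGELOG_NO_LLM` env var remains the opt-out. A sketch of the new gating decision (the function name is mine, not from the package):

```javascript
// Decide whether the pipeline should call the LLM.
// After this change, dry-run no longer disables enhancement;
// only a missing llm config or CHANGELOG_NO_LLM does.
function shouldRunLLM(config, env) {
  return Boolean(config.llm) && !env.CHANGELOG_NO_LLM;
}
```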