lilmd 0.1.0 → 0.2.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/BENCHMARK.md CHANGED
@@ -1,6 +1,6 @@
  # Benchmark summary

- Why `mdq` uses a hand-rolled scanner instead of an off-the-shelf markdown
+ Why `lilmd` uses a hand-rolled scanner instead of an off-the-shelf markdown
  parser. Numbers captured on Bun 1.3.11 against MDN content (`mdn/content`,
  sparse-checkout of `files/en-us/web/{javascript,css,html,api}`, concatenated
  into fixtures of ~100 KB, ~1 MB, and ~10 MB).
@@ -15,7 +15,7 @@ into fixtures of ~100 KB, ~1 MB, and ~10 MB).
  | marked lexer | ❌ | >90 s on a 1 MB input (unusable) |
  | md4w (WASM) | ❌ | ~42 MB/s, errors on 10 MB JSON output |

- ## End-to-end `mdq read` (10 MB, find a section + slice its body)
+ ## End-to-end `lilmd read` (10 MB, find a section + slice its body)

  | strategy | time |
  |---|---:|
@@ -36,7 +36,7 @@ the scanner is correct, not just fast.

  ## Why the scanner wins

- `mdq`'s read-path commands (`toc`, `read`, `ls`, `grep`) only need two facts
+ `lilmd`'s read-path commands (`toc`, `read`, `ls`, `grep`) only need two facts
  from the markdown:

  1. ATX headings — level, text, line number
@@ -45,7 +45,7 @@ from the markdown:
  Everything else — links, emphasis, tables, footnotes, nested lists, HTML
  blocks — is irrelevant to "list the headings" and "slice the body between
  line N and line M". A full CommonMark parser spends 95% of its budget on
- grammar `mdq` immediately throws away. The scanner skips all of that, runs
+ grammar `lilmd` immediately throws away. The scanner skips all of that, runs
  in a single pass over character codes, and is IO-bound on 10 MB of prose.

  ## The final stack
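The single pass over the two facts above can be sketched in a few lines of TypeScript. This is a simplified toy, not the package's actual `src/scan.ts`: it toggles fence state on every fence-looking line and skips the CRLF, fence-length, and closing-hash handling the real scanner does.

```typescript
// Toy single-pass heading scan: walk line by line, track whether we are
// inside a fenced code block, and collect ATX headings with line numbers.
type Heading = { level: number; title: string; line: number };

function scanHeadings(src: string): Heading[] {
  const out: Heading[] = [];
  let inFence = false;
  src.split("\n").forEach((line, idx) => {
    // 0-3 spaces of indent, then 3+ backticks or tildes: a fence line.
    if (/^ {0,3}(`{3,}|~{3,})/.test(line)) {
      inFence = !inFence; // naive toggle; the real scanner matches char + run length
      return;
    }
    if (inFence) return; // `#` inside code must not count as a heading
    // 1-6 hashes followed by end-of-line or whitespace + content.
    const m = /^ {0,3}(#{1,6})(?:[ \t]+(.*?))?[ \t]*$/.exec(line);
    if (m) out.push({ level: m[1].length, title: m[2] ?? "", line: idx + 1 });
  });
  return out;
}
```

The real scanner additionally requires a closing fence of the same character with at least the opening run length, and strips trailing `#` sequences from titles.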
package/README.md CHANGED
@@ -1,8 +1,10 @@
- ## `mdq` - Markdown as a Database for Agents
- `mdq` is a CLI for working with large MD files designed for agents.
+ <img src="lilmd-logo.webp" alt="lilmd logo" width="400" />
+
+ ## `lilmd` - Markdown as a Database for Agents
+ `lilmd` is a CLI for working with large MD files designed for agents.

  *Wait, but why?* Agent knowledge, docs, memory keeps growing.
- MDQ allows you to dump it all in one file and effeciently read/write/navigate its contents.
+ lilmd allows you to dump it all in one file and efficiently read/write/navigate its contents.

  Features:
  - fast navigation, complex read selectors, link extraction
@@ -14,13 +16,24 @@ Features:
  - can be used by humans and **agents**
  - uses Bun as tooling: to test, control deps etc.

+ ### Install
+
+ ```bash
+ # global install
+ > npm i -g lilmd
+
+ # or run without installing
+ > npx lilmd file.md
+ > bunx lilmd file.md
+ ```
+
  ### Help

  ```bash
  # start here!
  # both commands print short documentation for the agent
- > mdq
- > mdq --help
+ > lilmd
+ > lilmd --help
  ```

  ### Overview & table of contents
@@ -30,10 +43,10 @@ First, the agent gets file overview and table of contents.
  ```bash
  # renders toc + stats; line ranges are inclusive, 1-indexed
  # --depth=N to limit nesting, --flat for a flat list
- > mdq file.md
+ > lilmd file.md

  file.md L1-450 12 headings
- # MDQ L1-450
+ # lilmd L1-450
  ## Getting Started L5-80
  ### Installation L31-80
  ## Community L301-450
@@ -42,26 +55,26 @@ file.md L1-450 12 headings
  ### Reading sections

  ```bash
- > mdq read file.md "# MDQ"
- > mdq file.md "# MDQ" # alias!
- # prints the contents of the MDQ section
+ > lilmd read file.md "# lilmd"
+ > lilmd file.md "# lilmd" # alias!
+ # prints the contents of the lilmd section

  # descendant selector (any depth under the parent)
- > mdq file.md "MDQ > Installation"
+ > lilmd file.md "lilmd > Installation"

  # direct child only
- > mdq file.md "MDQ >> Installation"
+ > lilmd file.md "lilmd >> Installation"

  # level filter (H2 only)
- > mdq file.md "##Installation"
+ > lilmd file.md "##Installation"

  # exact match (default is fuzzy, case-insensitive)
- > mdq file.md "=Installation"
+ > lilmd file.md "=Installation"

  # regex
- > mdq file.md "/install(ation)?/"
+ > lilmd file.md "/install(ation)?/"

- # by default no more than 25 matches are printed; if more, mdq prints a hint
+ # by default no more than 25 matches are printed; if more, lilmd prints a hint
  # about --max-results=N
  # --max-lines=N truncates long bodies (shows "… N more lines")
  # --body-only skips subsections, --no-body prints headings only
@@ -72,7 +85,7 @@ file.md L1-450 12 headings
  ```bash
  # --pretty renders the section body as syntax-highlighted terminal markdown
  # (for humans; piped output stays plain unless FORCE_COLOR is set)
- > mdq file.md --pretty "Installation"
+ > lilmd file.md --pretty "Installation"

  # nicely formatted markdown
  ```
@@ -80,28 +93,28 @@ file.md L1-450 12 headings
  ### Searching & extracting

  ```bash
- > mdq ls file.md "Getting Started" # direct children of a section
- > mdq grep file.md "pattern" # regex search, grouped by section
- > mdq links file.md ["selector"] # extract links with section path
- > mdq code file.md "Install" [--lang=ts] # extract code blocks
+ > lilmd ls file.md "Getting Started" # direct children of a section
+ > lilmd grep file.md "pattern" # regex search, grouped by section
+ > lilmd links file.md ["selector"] # extract links with section path
+ > lilmd code file.md "Install" [--lang=ts] # extract code blocks
  ```

  ### Writing

- `mdq` treats sections as addressable records: you can replace, append,
+ `lilmd` treats sections as addressable records: you can replace, append,
  insert, move, or rename them without rewriting the whole file. Every write
  supports `--dry-run`, which prints a unified diff instead of touching disk —
  perfect for agent-authored edits that a human (or another agent) reviews
  before applying.

  ```bash
- > mdq set file.md "Install" < body.md # replace section body
- > mdq append file.md "Install" < body.md
- > mdq insert file.md --after "Install" < new.md
- > mdq rm file.md "Old"
- > mdq mv file.md "From" "To" # re-parent, fixes heading levels
- > mdq rename file.md "Old" "New"
- > mdq promote|demote file.md "Section" # shift heading level ±1
+ > lilmd set file.md "Install" < body.md # replace section body
+ > lilmd append file.md "Install" < body.md
+ > lilmd insert file.md --after "Install" < new.md
+ > lilmd rm file.md "Old"
+ > lilmd mv file.md "From" "To" # re-parent, fixes heading levels
+ > lilmd rename file.md "Old" "New"
+ > lilmd promote|demote file.md "Section" # shift heading level ±1
  ```

  ### Output
@@ -109,5 +122,5 @@ before applying.
  ```bash
  # human-readable by default; --json for machine output
  # use - as filename to read from stdin
- > cat big.md | mdq - "Install"
+ > cat big.md | lilmd - "Install"
  ```
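The `>` vs `>>` semantics the selectors above rely on (the last segment matches the section itself; earlier segments match ancestors, with `>>` requiring the immediate parent) can be sketched as a walk up the parent chain. A simplified, fuzzy-only toy; the package's real parser also handles `##` level filters, `=` exact matching, and `/regex/` segments:

```typescript
// Toy matcher for "A > B" (any-depth descendant) and "A >> B" (direct child).
type Section = { title: string; parent: Section | null };

function matchesSelector(sec: Section, selector: string): boolean {
  // Split "A >> B > C" into segment texts plus the operator before each one.
  const parts = selector.split(/\s*(>>|>)\s*/);
  const segs: { op: ">" | ">>"; text: string }[] = [];
  for (let i = 0; i < parts.length; i += 2) {
    segs.push({ op: (parts[i - 1] as ">" | ">>") ?? ">", text: parts[i].toLowerCase() });
  }
  const hit = (s: Section, t: string) => s.title.toLowerCase().includes(t);

  // The last segment must match the candidate section itself.
  if (!hit(sec, segs[segs.length - 1].text)) return false;

  // Earlier segments walk the ancestor chain upward.
  let cursor = sec.parent;
  for (let i = segs.length - 2; i >= 0; i--) {
    const op = segs[i + 1].op; // operator between segs[i] and segs[i+1]
    if (op === ">>") {
      // Direct child: the immediate parent must match.
      if (!cursor || !hit(cursor, segs[i].text)) return false;
      cursor = cursor.parent;
    } else {
      // Descendant: any ancestor may match.
      while (cursor && !hit(cursor, segs[i].text)) cursor = cursor.parent;
      if (!cursor) return false;
      cursor = cursor.parent;
    }
  }
  return true;
}
```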
package/dist/index.d.cts CHANGED
@@ -2,7 +2,7 @@
  * Markdown heading scanner — the engine behind every read-path command.
  *
  * Instead of building a full CommonMark AST we walk the source line by line
- * and recognize only what `mdq` actually needs: ATX headings and fenced code
+ * and recognize only what `lilmd` actually needs: ATX headings and fenced code
  * blocks (so `#` inside code doesn't count as a heading).
  *
  * Numbers on MDN content (see BENCHMARK.md): ~180 MB/s end-to-end on a
@@ -10,7 +10,7 @@
  * mdast-util-from-markdown while returning the exact same section.
  *
  * Deliberate limitations:
- * - Setext headings (`===` / `---` underlines) are NOT recognized. mdq is
+ * - Setext headings (`===` / `---` underlines) are NOT recognized. lilmd is
  *   aimed at agent-authored markdown where ATX is ubiquitous.
  * - HTML blocks are not detected. A `<pre>` containing an ATX-looking line
  *   would be misread as a heading. That's an acceptable tradeoff for 100x
@@ -82,7 +82,7 @@ type Segment = {
  declare function parseSelector(input: string): Segment[];
  declare function match(sections: Section[], selector: Segment[]): Section[];
  /**
- * Pretty printing for `mdq read --pretty`. Lazy-loads marked +
+ * Pretty printing for `lilmd read --pretty`. Lazy-loads marked +
  * marked-terminal on first use so the default (plain-text) path keeps its
  * ~16ms cold start.
  */
package/dist/index.d.ts CHANGED
@@ -2,7 +2,7 @@
  * Markdown heading scanner — the engine behind every read-path command.
  *
  * Instead of building a full CommonMark AST we walk the source line by line
- * and recognize only what `mdq` actually needs: ATX headings and fenced code
+ * and recognize only what `lilmd` actually needs: ATX headings and fenced code
  * blocks (so `#` inside code doesn't count as a heading).
  *
  * Numbers on MDN content (see BENCHMARK.md): ~180 MB/s end-to-end on a
@@ -10,7 +10,7 @@
  * mdast-util-from-markdown while returning the exact same section.
  *
  * Deliberate limitations:
- * - Setext headings (`===` / `---` underlines) are NOT recognized. mdq is
+ * - Setext headings (`===` / `---` underlines) are NOT recognized. lilmd is
  *   aimed at agent-authored markdown where ATX is ubiquitous.
  * - HTML blocks are not detected. A `<pre>` containing an ATX-looking line
  *   would be misread as a heading. That's an acceptable tradeoff for 100x
@@ -82,7 +82,7 @@ type Segment = {
  declare function parseSelector(input: string): Segment[];
  declare function match(sections: Section[], selector: Segment[]): Section[];
  /**
- * Pretty printing for `mdq read --pretty`. Lazy-loads marked +
+ * Pretty printing for `lilmd read --pretty`. Lazy-loads marked +
  * marked-terminal on first use so the default (plain-text) path keeps its
  * ~16ms cold start.
  */
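For context, the section tree that `match` consumes is built from the flat heading list in a single stack pass. A simplified sketch of that approach (mirroring, with fields reduced, the `buildSections` logic in the package's `src/sections.ts`):

```typescript
// Toy stack-based section-tree build: each new heading closes every open
// section at the same-or-shallower level, setting its inclusive line_end
// to the line before the new heading. Unclosed sections run to totalLines.
type Heading = { level: number; title: string; line: number };
type Section = {
  level: number;
  title: string;
  line_start: number; // 1-indexed line of the heading itself
  line_end: number;   // 1-indexed inclusive end of the subtree
  parent: Section | null;
};

function buildSections(headings: Heading[], totalLines: number): Section[] {
  const out: Section[] = [];
  const stack: Section[] = []; // ancestors whose subtree is still open
  for (const h of headings) {
    while (stack.length > 0 && stack[stack.length - 1].level >= h.level) {
      stack.pop()!.line_end = h.line - 1; // close at the line before h
    }
    const sec: Section = {
      level: h.level,
      title: h.title,
      line_start: h.line,
      line_end: totalLines, // provisional; kept if nothing closes this section
      parent: stack[stack.length - 1] ?? null,
    };
    out.push(sec);
    stack.push(sec);
  }
  return out;
}
```

Every section is pushed once and popped at most once, so the build is O(n) in the number of headings.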
package/dist/index.js.map CHANGED
@@ -2,10 +2,10 @@
  "version": 3,
  "sources": ["src/scan.ts", "src/sections.ts", "src/select.ts", "src/render.ts"],
  "sourcesContent": [
- "/**\n * Markdown heading scanner — the engine behind every read-path command.\n *\n * Instead of building a full CommonMark AST we walk the source line by line\n * and recognize only what `mdq` actually needs: ATX headings and fenced code\n * blocks (so `#` inside code doesn't count as a heading).\n *\n * Numbers on MDN content (see BENCHMARK.md): ~180 MB/s end-to-end on a\n * 10 MB fixture, roughly 7x faster than markdown-it and ~1000x faster than\n * mdast-util-from-markdown while returning the exact same section.\n *\n * Deliberate limitations:\n * - Setext headings (`===` / `---` underlines) are NOT recognized. mdq is\n * aimed at agent-authored markdown where ATX is ubiquitous.\n * - HTML blocks are not detected. A `<pre>` containing an ATX-looking line\n * would be misread as a heading. That's an acceptable tradeoff for 100x\n * speed; a future `--strict` flag could hand off to markdown-it.\n * - Fenced code blocks *inside a list item* that are indented 4+ spaces are\n * not recognized as fences — we only look at the first 3 columns for the\n * fence opener. A `# fake` line inside such a block would be scanned as a\n * heading. Rare in practice; document-your-way-out rather than fix.\n * - An unclosed fence at EOF leaves the scanner in \"still in fence\" state\n * to the end of the file, so any `#`-looking lines after it are ignored.\n * That's the conservative choice — prefer under-counting to over-counting.\n */\n\nexport type Heading = {\n /** 1..6 */\n level: number;\n /** Heading text with trailing closing hashes stripped. */\n title: string;\n /** 1-indexed line number. 
*/\n line: number;\n};\n\n/**\n * Return every ATX heading in `src`, in document order.\n * Runs in a single pass; O(n) in source length, O(headings) in space.\n */\nexport function scan(src: string): Heading[] {\n const out: Heading[] = [];\n const len = src.length;\n\n let i = 0;\n let lineNo = 0;\n\n // Fence state: when inFence is true every line is ignored until we see a\n // matching closing fence (same char, length >= opening length).\n let inFence = false;\n let fenceChar = 0; // charCode of ` or ~\n let fenceLen = 0;\n\n while (i <= len) {\n // Slice one line without the trailing newline. A trailing \\r from CRLF\n // is stripped below.\n const start = i;\n while (i < len && src.charCodeAt(i) !== 10 /* \\n */) i++;\n let line = src.slice(start, i);\n if (line.length > 0 && line.charCodeAt(line.length - 1) === 13 /* \\r */) {\n line = line.slice(0, line.length - 1);\n }\n lineNo++;\n\n const fence = matchFence(line);\n if (fence) {\n if (!inFence) {\n inFence = true;\n fenceChar = fence.char;\n fenceLen = fence.len;\n } else if (fence.char === fenceChar && fence.len >= fenceLen) {\n inFence = false;\n }\n } else if (!inFence) {\n const h = matchHeading(line, lineNo);\n if (h) out.push(h);\n }\n\n if (i >= len) break;\n i++; // skip the \\n\n }\n\n return out;\n}\n\n/**\n * If `line` opens or closes a fenced code block, return the fence char code\n * (` or ~) and the number of fence characters. Otherwise null.\n *\n * A fence is 0–3 spaces, then 3+ of a single fence char, then optional info\n * string. 
We only care about the opening run length; the info string is\n * ignored.\n */\nfunction matchFence(line: string): { char: number; len: number } | null {\n // Skip up to 3 leading spaces.\n let p = 0;\n while (p < 3 && line.charCodeAt(p) === 32) p++;\n const ch = line.charCodeAt(p);\n if (ch !== 96 /* ` */ && ch !== 126 /* ~ */) return null;\n let run = 0;\n while (line.charCodeAt(p + run) === ch) run++;\n if (run < 3) return null;\n // For backtick fences, CommonMark forbids backticks in the info string,\n // but we don't parse info; we only need to know this line is a fence.\n return { char: ch, len: run };\n}\n\n/**\n * If `line` is an ATX heading, return it. Otherwise null.\n *\n * Rules (CommonMark, simplified):\n * - 0–3 spaces of indent\n * - 1–6 `#`\n * - EITHER end-of-line OR a space/tab followed by content\n * - optional closing sequence: whitespace + trailing `#`s (stripped)\n */\nfunction matchHeading(line: string, lineNo: number): Heading | null {\n // Skip up to 3 leading spaces.\n let p = 0;\n while (p < 3 && line.charCodeAt(p) === 32) p++;\n if (line.charCodeAt(p) !== 35 /* # */) return null;\n\n let hashes = 0;\n while (line.charCodeAt(p + hashes) === 35) hashes++;\n if (hashes < 1 || hashes > 6) return null;\n\n const after = p + hashes;\n const afterCh = line.charCodeAt(after);\n\n // After the hashes we need either end-of-line or a space/tab. 
Anything else\n // (including `#` which is caught above by the hashes loop) disqualifies.\n if (after < line.length && afterCh !== 32 && afterCh !== 9 /* \\t */) {\n return null;\n }\n\n // Trim leading whitespace of content and trailing whitespace + closing #s.\n let contentStart = after;\n while (\n contentStart < line.length &&\n (line.charCodeAt(contentStart) === 32 || line.charCodeAt(contentStart) === 9)\n ) {\n contentStart++;\n }\n\n let end = line.length;\n // Trim trailing whitespace first.\n while (\n end > contentStart &&\n (line.charCodeAt(end - 1) === 32 || line.charCodeAt(end - 1) === 9)\n ) {\n end--;\n }\n // Strip closing `#`s only if they are preceded by whitespace (CommonMark\n // requires the closing sequence to be separated from the content).\n let closing = end;\n while (closing > contentStart && line.charCodeAt(closing - 1) === 35) closing--;\n if (\n closing < end &&\n (closing === contentStart ||\n line.charCodeAt(closing - 1) === 32 ||\n line.charCodeAt(closing - 1) === 9)\n ) {\n end = closing;\n while (\n end > contentStart &&\n (line.charCodeAt(end - 1) === 32 || line.charCodeAt(end - 1) === 9)\n ) {\n end--;\n }\n }\n\n const title = line.slice(contentStart, end);\n return { level: hashes, title, line: lineNo };\n}\n",
+ "/**\n * Markdown heading scanner — the engine behind every read-path command.\n *\n * Instead of building a full CommonMark AST we walk the source line by line\n * and recognize only what `lilmd` actually needs: ATX headings and fenced code\n * blocks (so `#` inside code doesn't count as a heading).\n *\n * Numbers on MDN content (see BENCHMARK.md): ~180 MB/s end-to-end on a\n * 10 MB fixture, roughly 7x faster than markdown-it and ~1000x faster than\n * mdast-util-from-markdown while returning the exact same section.\n *\n * Deliberate limitations:\n * - Setext headings (`===` / `---` underlines) are NOT recognized. lilmd is\n * aimed at agent-authored markdown where ATX is ubiquitous.\n * - HTML blocks are not detected. A `<pre>` containing an ATX-looking line\n * would be misread as a heading. That's an acceptable tradeoff for 100x\n * speed; a future `--strict` flag could hand off to markdown-it.\n * - Fenced code blocks *inside a list item* that are indented 4+ spaces are\n * not recognized as fences — we only look at the first 3 columns for the\n * fence opener. A `# fake` line inside such a block would be scanned as a\n * heading. Rare in practice; document-your-way-out rather than fix.\n * - An unclosed fence at EOF leaves the scanner in \"still in fence\" state\n * to the end of the file, so any `#`-looking lines after it are ignored.\n * That's the conservative choice — prefer under-counting to over-counting.\n */\n\nexport type Heading = {\n /** 1..6 */\n level: number;\n /** Heading text with trailing closing hashes stripped. */\n title: string;\n /** 1-indexed line number. 
*/\n line: number;\n};\n\n/**\n * Return every ATX heading in `src`, in document order.\n * Runs in a single pass; O(n) in source length, O(headings) in space.\n */\nexport function scan(src: string): Heading[] {\n const out: Heading[] = [];\n const len = src.length;\n\n let i = 0;\n let lineNo = 0;\n\n // Fence state: when inFence is true every line is ignored until we see a\n // matching closing fence (same char, length >= opening length).\n let inFence = false;\n let fenceChar = 0; // charCode of ` or ~\n let fenceLen = 0;\n\n while (i <= len) {\n // Slice one line without the trailing newline. A trailing \\r from CRLF\n // is stripped below.\n const start = i;\n while (i < len && src.charCodeAt(i) !== 10 /* \\n */) i++;\n let line = src.slice(start, i);\n if (line.length > 0 && line.charCodeAt(line.length - 1) === 13 /* \\r */) {\n line = line.slice(0, line.length - 1);\n }\n lineNo++;\n\n const fence = matchFence(line);\n if (fence) {\n if (!inFence) {\n inFence = true;\n fenceChar = fence.char;\n fenceLen = fence.len;\n } else if (fence.char === fenceChar && fence.len >= fenceLen) {\n inFence = false;\n }\n } else if (!inFence) {\n const h = matchHeading(line, lineNo);\n if (h) out.push(h);\n }\n\n if (i >= len) break;\n i++; // skip the \\n\n }\n\n return out;\n}\n\n/**\n * If `line` opens or closes a fenced code block, return the fence char code\n * (` or ~) and the number of fence characters. Otherwise null.\n *\n * A fence is 0–3 spaces, then 3+ of a single fence char, then optional info\n * string. 
We only care about the opening run length; the info string is\n * ignored.\n */\nfunction matchFence(line: string): { char: number; len: number } | null {\n // Skip up to 3 leading spaces.\n let p = 0;\n while (p < 3 && line.charCodeAt(p) === 32) p++;\n const ch = line.charCodeAt(p);\n if (ch !== 96 /* ` */ && ch !== 126 /* ~ */) return null;\n let run = 0;\n while (line.charCodeAt(p + run) === ch) run++;\n if (run < 3) return null;\n // For backtick fences, CommonMark forbids backticks in the info string,\n // but we don't parse info; we only need to know this line is a fence.\n return { char: ch, len: run };\n}\n\n/**\n * If `line` is an ATX heading, return it. Otherwise null.\n *\n * Rules (CommonMark, simplified):\n * - 0–3 spaces of indent\n * - 1–6 `#`\n * - EITHER end-of-line OR a space/tab followed by content\n * - optional closing sequence: whitespace + trailing `#`s (stripped)\n */\nfunction matchHeading(line: string, lineNo: number): Heading | null {\n // Skip up to 3 leading spaces.\n let p = 0;\n while (p < 3 && line.charCodeAt(p) === 32) p++;\n if (line.charCodeAt(p) !== 35 /* # */) return null;\n\n let hashes = 0;\n while (line.charCodeAt(p + hashes) === 35) hashes++;\n if (hashes < 1 || hashes > 6) return null;\n\n const after = p + hashes;\n const afterCh = line.charCodeAt(after);\n\n // After the hashes we need either end-of-line or a space/tab. 
Anything else\n // (including `#` which is caught above by the hashes loop) disqualifies.\n if (after < line.length && afterCh !== 32 && afterCh !== 9 /* \\t */) {\n return null;\n }\n\n // Trim leading whitespace of content and trailing whitespace + closing #s.\n let contentStart = after;\n while (\n contentStart < line.length &&\n (line.charCodeAt(contentStart) === 32 || line.charCodeAt(contentStart) === 9)\n ) {\n contentStart++;\n }\n\n let end = line.length;\n // Trim trailing whitespace first.\n while (\n end > contentStart &&\n (line.charCodeAt(end - 1) === 32 || line.charCodeAt(end - 1) === 9)\n ) {\n end--;\n }\n // Strip closing `#`s only if they are preceded by whitespace (CommonMark\n // requires the closing sequence to be separated from the content).\n let closing = end;\n while (closing > contentStart && line.charCodeAt(closing - 1) === 35) closing--;\n if (\n closing < end &&\n (closing === contentStart ||\n line.charCodeAt(closing - 1) === 32 ||\n line.charCodeAt(closing - 1) === 9)\n ) {\n end = closing;\n while (\n end > contentStart &&\n (line.charCodeAt(end - 1) === 32 || line.charCodeAt(end - 1) === 9)\n ) {\n end--;\n }\n }\n\n const title = line.slice(contentStart, end);\n return { level: hashes, title, line: lineNo };\n}\n",
  "/**\n * Turn a flat list of headings into a section tree with line ranges and\n * parent links.\n *\n * Each Section covers its heading line through the line before the next\n * heading at the same-or-higher level (or the end of the file if none).\n * That's the \"whole subtree\" range — it includes the section's body *and*\n * its descendants. `--body-only` rendering is derived at render time.\n */\n\nimport type { Heading } from \"./scan\";\n\nexport type Section = {\n level: number;\n title: string;\n /** 1-indexed line of the heading itself. */\n line_start: number;\n /** 1-indexed inclusive end of the subtree. */\n line_end: number;\n /** Nearest enclosing section, or null for top-level. */\n parent: Section | null;\n};\n\n/**\n * Build the section tree in a single pass. Preserves document order.\n *\n * Runs in O(n): every section is pushed once and popped once, and we set\n * its `line_end` at pop time. Sections still on the stack when we run out\n * of headings keep their provisional `line_end = totalLines`.\n */\nexport function buildSections(headings: Heading[], totalLines: number): Section[] {\n const out: Section[] = [];\n /** Ancestors whose subtree is still open. */\n const stack: Section[] = [];\n\n for (const h of headings) {\n // Every section on the stack with the same-or-shallower level closes at\n // h.line - 1 (the line before the new heading).\n while (stack.length > 0 && stack[stack.length - 1]!.level >= h.level) {\n const closing = stack.pop()!;\n closing.line_end = h.line - 1;\n }\n const parent = stack.length > 0 ? stack[stack.length - 1]! 
: null;\n\n const sec: Section = {\n level: h.level,\n title: h.title,\n line_start: h.line,\n // Provisional: if nothing closes this section we leave it at totalLines.\n line_end: totalLines,\n parent,\n };\n out.push(sec);\n stack.push(sec);\n }\n\n return out;\n}\n\n/**\n * Walk `sec` up to the root, collecting ancestor titles in top-down order.\n * Returns [] for a root section.\n */\nexport function pathOf(sec: Section): string[] {\n const path: string[] = [];\n let cur = sec.parent;\n while (cur) {\n path.push(cur.title);\n cur = cur.parent;\n }\n return path.reverse();\n}\n\n/**\n * Count lines in a source string. Empty string is 0; otherwise every line\n * (including the last one, whether or not it ends with a newline) is 1.\n * A trailing newline does NOT add a phantom line.\n */\nexport function countLines(src: string): number {\n if (src.length === 0) return 0;\n let n = 1;\n for (let i = 0; i < src.length; i++) {\n if (src.charCodeAt(i) === 10) n++;\n }\n // If the source ends with a newline, the line-count should equal the\n // number of newlines (not newlines + 1) since the final \"line\" is empty.\n if (src.charCodeAt(src.length - 1) === 10) n--;\n return n;\n}\n",
  "/**\n * Selector grammar parser and matcher.\n *\n * SELECTOR := SEGMENT ( SEP SEGMENT )*\n * SEP := \">\" (descendant, any depth)\n * | \">>\" (direct child)\n * SEGMENT := LEVEL? MATCHER\n * LEVEL := \"#\"{1,6} (optional level filter)\n * MATCHER := TEXT (fuzzy, case-insensitive substring)\n * | \"=\" TEXT (exact, case-insensitive equality)\n * | \"/\" PATTERN \"/\" FLAGS? (JS regex; defaults to /.../i)\n *\n * Matching semantics:\n * - The *last* segment must match the candidate section itself.\n * - Earlier segments must match an ancestor chain walking upward from that\n * candidate, respecting each separator between them: `A >> B` requires A\n * to be B's *immediate* parent; `A > B` only requires A to be *some*\n * ancestor of B.\n */\n\nimport type { Section } from \"./sections\";\n\nexport type Op = \"descendant\" | \"child\";\nexport type Kind = \"fuzzy\" | \"exact\" | \"regex\";\n\nexport type Segment = {\n /** Operator that connects this segment to the *previous* one.\n * For the first segment this is always \"descendant\" (unused). */\n op: Op;\n /** Optional 1..6 level filter. */\n level: number | null;\n kind: Kind;\n /** The raw value (without level/kind prefix). */\n value: string;\n /** Present only for kind === \"regex\". */\n regex?: RegExp;\n};\n\nexport function parseSelector(input: string): Segment[] {\n const trimmed = input.trim();\n if (trimmed.length === 0) return [];\n\n // Split on > / >>. We walk the string character by character so we can\n // distinguish the two operators without confusing a `>>` with two\n // consecutive `>`s. We intentionally ignore `>` that appear inside a\n // regex delimiter pair because users may write `/a>b/`.\n //\n // `atSegmentStart` tracks whether the running buffer is still whitespace\n // only — only in that state can a `/` open a regex literal. 
Using\n // `cur.length === 0` instead is wrong because `>` splits leave the loop\n // pointing at a leading space that then lands in `cur` before the next\n // non-space char.\n const rawSegments: string[] = [];\n const ops: Op[] = [\"descendant\"];\n let cur = \"\";\n let i = 0;\n let inRegex = false;\n let atSegmentStart = true;\n while (i < trimmed.length) {\n const ch = trimmed[i];\n if (ch === \"/\" && (atSegmentStart || inRegex)) {\n inRegex = !inRegex;\n cur += ch;\n atSegmentStart = false;\n i++;\n continue;\n }\n if (!inRegex && ch === \">\") {\n rawSegments.push(cur.trim());\n cur = \"\";\n atSegmentStart = true;\n if (trimmed[i + 1] === \">\") {\n ops.push(\"child\");\n i += 2;\n } else {\n ops.push(\"descendant\");\n i += 1;\n }\n continue;\n }\n cur += ch;\n if (ch !== \" \" && ch !== \"\\t\") atSegmentStart = false;\n i++;\n }\n rawSegments.push(cur.trim());\n\n return rawSegments.map((s, idx) => parseSegment(s, ops[idx] ?? \"descendant\"));\n}\n\nfunction parseSegment(raw: string, op: Op): Segment {\n let s = raw;\n let level: number | null = null;\n\n // Level prefix — exactly 1..6 `#`s followed by something that is NOT\n // another `#`. The negative lookahead matters: without it, \"#######foo\"\n // would silently match level=6 value=\"#foo\".\n const levelMatch = /^(#{1,6})(?!#)\\s*(.*)$/.exec(s);\n if (levelMatch) {\n level = levelMatch[1]!.length;\n s = levelMatch[2] ?? 
\"\";\n }\n\n // Regex literal: /pattern/flags — flags default to \"i\".\n const regexMatch = /^\\/(.+)\\/([gimsuy]*)$/.exec(s);\n if (regexMatch) {\n const pattern = regexMatch[1]!;\n const flags = regexMatch[2] || \"i\";\n return {\n op,\n level,\n kind: \"regex\",\n value: pattern,\n regex: new RegExp(pattern, flags),\n };\n }\n\n // Exact match: =value.\n if (s.startsWith(\"=\")) {\n return { op, level, kind: \"exact\", value: s.slice(1).trim() };\n }\n\n return { op, level, kind: \"fuzzy\", value: s.trim() };\n}\n\nexport function match(sections: Section[], selector: Segment[]): Section[] {\n if (selector.length === 0) return [];\n const out: Section[] = [];\n for (const sec of sections) {\n if (matches(sec, selector)) out.push(sec);\n }\n return out;\n}\n\nfunction matches(sec: Section, segs: Segment[]): boolean {\n // Last segment matches the candidate itself.\n const last = segs[segs.length - 1];\n if (!last || !segmentMatchesSection(last, sec)) return false;\n\n // Walk the ancestor chain backward alongside the earlier segments.\n let cursor: Section | null = sec.parent;\n for (let i = segs.length - 2; i >= 0; i--) {\n // The separator BEFORE segs[i+1] is stored on segs[i+1].op; that's the\n // relationship we need to honor when walking from segs[i+1] back to\n // segs[i] in the ancestor chain.\n const op = segs[i + 1]!.op;\n const seg = segs[i]!;\n\n if (op === \"child\") {\n if (!cursor || !segmentMatchesSection(seg, cursor)) return false;\n cursor = cursor.parent;\n } else {\n // Descendant: find any matching ancestor.\n let found: Section | null = null;\n while (cursor) {\n if (segmentMatchesSection(seg, cursor)) {\n found = cursor;\n break;\n }\n cursor = cursor.parent;\n }\n if (!found) return false;\n cursor = found.parent;\n }\n }\n return true;\n}\n\nfunction segmentMatchesSection(seg: Segment, sec: Section): boolean {\n if (seg.level !== null && seg.level !== sec.level) return false;\n const title = sec.title;\n switch (seg.kind) {\n case 
\"exact\":\n return title.toLowerCase() === seg.value.toLowerCase();\n case \"regex\":\n return seg.regex!.test(title);\n case \"fuzzy\":\n return title.toLowerCase().includes(seg.value.toLowerCase());\n }\n}\n",
- "/**\n * Output formatting for mdq.\n *\n * Two targets today: `renderToc` for the TOC view and `renderSection` for a\n * single section read. Both emit grep-friendly plain text with stable\n * delimiters that agents can split on.\n *\n * A future `--json` pipeline lives in cli.ts; the shapes (Heading, Section)\n * are already JSON-clean so it's a direct serialization.\n */\n\nimport type { Section } from \"./sections\";\nimport { countLines } from \"./sections\";\nimport type { PrettyFormatter } from \"./pretty\";\n\nexport type TocOptions = {\n depth?: number;\n flat?: boolean;\n};\n\nexport function renderToc(\n file: string,\n src: string,\n sections: Section[],\n opts: TocOptions,\n): string {\n const totalLines = countLines(src);\n const headerCount = sections.length;\n const headerRange = totalLines === 0 ? \"L0\" : `L1-${totalLines}`;\n const plural = headerCount === 1 ? \"heading\" : \"headings\";\n\n const out: string[] = [];\n out.push(`${file} ${headerRange} ${headerCount} ${plural}`);\n\n for (const sec of sections) {\n if (opts.depth != null && sec.level > opts.depth) continue;\n const indent = opts.flat ? \"\" : \" \".repeat(Math.max(0, sec.level - 1));\n const hashes = \"#\".repeat(sec.level);\n const range = `L${sec.line_start}-${sec.line_end}`;\n out.push(`${indent}${hashes} ${sec.title} ${range}`);\n }\n return out.join(\"\\n\");\n}\n\nexport type SectionOptions = {\n bodyOnly?: boolean;\n noBody?: boolean;\n raw?: boolean;\n maxLines?: number;\n /** Required when bodyOnly is true so we can find the first child. */\n allSections?: Section[];\n /** Optional markdown→ANSI formatter applied to the body before delimiters. 
*/\n pretty?: PrettyFormatter;\n};\n\nexport function renderSection(\n file: string,\n srcLines: string[],\n sec: Section,\n opts: SectionOptions,\n): string {\n const start = sec.line_start;\n let end = sec.line_end;\n\n if (opts.bodyOnly && opts.allSections) {\n const firstChild = findFirstChild(sec, opts.allSections);\n if (firstChild) end = firstChild.line_start - 1;\n }\n\n if (opts.noBody) {\n end = start;\n }\n\n // Clamp to source length so a stale `line_end` (e.g. countLines and\n // splitLines disagreeing on a trailing newline) can't overrun.\n const clampedEnd = Math.min(end, srcLines.length);\n let body = srcLines.slice(start - 1, clampedEnd).join(\"\\n\");\n\n // Truncate before pretty-printing so ANSI escapes can't land mid-cut.\n if (opts.maxLines != null && opts.maxLines > 0) {\n body = truncateBody(body, opts.maxLines);\n }\n\n if (opts.pretty) {\n body = opts.pretty(body);\n }\n\n if (opts.raw) return body;\n\n const hashes = \"#\".repeat(sec.level);\n const header = `── ${file} L${start}-${end} ${hashes} ${sec.title} ${\"─\".repeat(8)}`;\n const footer = `── end ${\"─\".repeat(40)}`;\n return `${header}\\n${body}\\n${footer}`;\n}\n\n/**\n * Cut `body` to the first `maxLines` lines. If anything was dropped, append\n * a marker line telling the agent how to get the rest. `maxLines <= 0`\n * disables truncation.\n */\nexport function truncateBody(body: string, maxLines: number): string {\n if (maxLines <= 0) return body;\n const lines = body.split(\"\\n\");\n if (lines.length <= maxLines) return body;\n const kept = lines.slice(0, maxLines).join(\"\\n\");\n const remaining = lines.length - maxLines;\n return `${kept}\\n\\n… ${remaining} more lines (use --max-lines=0 for full)`;\n}\n\nfunction findFirstChild(sec: Section, all: Section[]): Section | null {\n for (const candidate of all) {\n if (candidate.parent === sec) return candidate;\n }\n return null;\n}\n"
+ "/**\n * Output formatting for lilmd.\n *\n * Two targets today: `renderToc` for the TOC view and `renderSection` for a\n * single section read. Both emit grep-friendly plain text with stable\n * delimiters that agents can split on.\n *\n * A future `--json` pipeline lives in cli.ts; the shapes (Heading, Section)\n * are already JSON-clean so it's a direct serialization.\n */\n\nimport type { Section } from \"./sections\";\nimport { countLines } from \"./sections\";\nimport type { PrettyFormatter } from \"./pretty\";\n\nexport type TocOptions = {\n depth?: number;\n flat?: boolean;\n};\n\nexport function renderToc(\n file: string,\n src: string,\n sections: Section[],\n opts: TocOptions,\n): string {\n const totalLines = countLines(src);\n const headerCount = sections.length;\n const headerRange = totalLines === 0 ? \"L0\" : `L1-${totalLines}`;\n const plural = headerCount === 1 ? \"heading\" : \"headings\";\n\n const out: string[] = [];\n out.push(`${file} ${headerRange} ${headerCount} ${plural}`);\n\n for (const sec of sections) {\n if (opts.depth != null && sec.level > opts.depth) continue;\n const indent = opts.flat ? \"\" : \" \".repeat(Math.max(0, sec.level - 1));\n const hashes = \"#\".repeat(sec.level);\n const range = `L${sec.line_start}-${sec.line_end}`;\n out.push(`${indent}${hashes} ${sec.title} ${range}`);\n }\n return out.join(\"\\n\");\n}\n\nexport type SectionOptions = {\n bodyOnly?: boolean;\n noBody?: boolean;\n raw?: boolean;\n maxLines?: number;\n /** Required when bodyOnly is true so we can find the first child. */\n allSections?: Section[];\n /** Optional markdown→ANSI formatter applied to the body before delimiters. 
*/\n pretty?: PrettyFormatter;\n};\n\nexport function renderSection(\n file: string,\n srcLines: string[],\n sec: Section,\n opts: SectionOptions,\n): string {\n const start = sec.line_start;\n let end = sec.line_end;\n\n if (opts.bodyOnly && opts.allSections) {\n const firstChild = findFirstChild(sec, opts.allSections);\n if (firstChild) end = firstChild.line_start - 1;\n }\n\n if (opts.noBody) {\n end = start;\n }\n\n // Clamp to source length so a stale `line_end` (e.g. countLines and\n // splitLines disagreeing on a trailing newline) can't overrun.\n const clampedEnd = Math.min(end, srcLines.length);\n let body = srcLines.slice(start - 1, clampedEnd).join(\"\\n\");\n\n // Truncate before pretty-printing so ANSI escapes can't land mid-cut.\n if (opts.maxLines != null && opts.maxLines > 0) {\n body = truncateBody(body, opts.maxLines);\n }\n\n if (opts.pretty) {\n body = opts.pretty(body);\n }\n\n if (opts.raw) return body;\n\n const hashes = \"#\".repeat(sec.level);\n const header = `── ${file} L${start}-${end} ${hashes} ${sec.title} ${\"─\".repeat(8)}`;\n const footer = `── end ${\"─\".repeat(40)}`;\n return `${header}\\n${body}\\n${footer}`;\n}\n\n/**\n * Cut `body` to the first `maxLines` lines. If anything was dropped, append\n * a marker line telling the agent how to get the rest. `maxLines <= 0`\n * disables truncation.\n */\nexport function truncateBody(body: string, maxLines: number): string {\n if (maxLines <= 0) return body;\n const lines = body.split(\"\\n\");\n if (lines.length <= maxLines) return body;\n const kept = lines.slice(0, maxLines).join(\"\\n\");\n const remaining = lines.length - maxLines;\n return `${kept}\\n\\n… ${remaining} more lines (use --max-lines=0 for full)`;\n}\n\nfunction findFirstChild(sec: Section, all: Section[]): Section | null {\n for (const candidate of all) {\n if (candidate.parent === sec) return candidate;\n }\n return null;\n}\n"
  ],
  "mappings": ";;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAuCO,SAAS,IAAI,CAAC,KAAwB;AAAA,EAC3C,MAAM,MAAiB,CAAC;AAAA,EACxB,MAAM,MAAM,IAAI;AAAA,EAEhB,IAAI,IAAI;AAAA,EACR,IAAI,SAAS;AAAA,EAIb,IAAI,UAAU;AAAA,EACd,IAAI,YAAY;AAAA,EAChB,IAAI,WAAW;AAAA,EAEf,OAAO,KAAK,KAAK;AAAA,IAGf,MAAM,QAAQ;AAAA,IACd,OAAO,IAAI,OAAO,IAAI,WAAW,CAAC,MAAM;AAAA,MAAa;AAAA,IACrD,IAAI,OAAO,IAAI,MAAM,OAAO,CAAC;AAAA,IAC7B,IAAI,KAAK,SAAS,KAAK,KAAK,WAAW,KAAK,SAAS,CAAC,MAAM,IAAa;AAAA,MACvE,OAAO,KAAK,MAAM,GAAG,KAAK,SAAS,CAAC;AAAA,IACtC;AAAA,IACA;AAAA,IAEA,MAAM,QAAQ,WAAW,IAAI;AAAA,IAC7B,IAAI,OAAO;AAAA,MACT,IAAI,CAAC,SAAS;AAAA,QACZ,UAAU;AAAA,QACV,YAAY,MAAM;AAAA,QAClB,WAAW,MAAM;AAAA,MACnB,EAAO,SAAI,MAAM,SAAS,aAAa,MAAM,OAAO,UAAU;AAAA,QAC5D,UAAU;AAAA,MACZ;AAAA,IACF,EAAO,SAAI,CAAC,SAAS;AAAA,MACnB,MAAM,IAAI,aAAa,MAAM,MAAM;AAAA,MACnC,IAAI;AAAA,QAAG,IAAI,KAAK,CAAC;AAAA,IACnB;AAAA,IAEA,IAAI,KAAK;AAAA,MAAK;AAAA,IACd;AAAA,EACF;AAAA,EAEA,OAAO;AAAA;AAWT,SAAS,UAAU,CAAC,MAAoD;AAAA,EAEtE,IAAI,IAAI;AAAA,EACR,OAAO,IAAI,KAAK,KAAK,WAAW,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,MAAM,KAAK,KAAK,WAAW,CAAC;AAAA,EAC5B,IAAI,OAAO,MAAc,OAAO;AAAA,IAAa,OAAO;AAAA,EACpD,IAAI,MAAM;AAAA,EACV,OAAO,KAAK,WAAW,IAAI,GAAG,MAAM;AAAA,IAAI;AAAA,EACxC,IAAI,MAAM;AAAA,IAAG,OAAO;AAAA,EAGpB,OAAO,EAAE,MAAM,IAAI,KAAK,IAAI;AAAA;AAY9B,SAAS,YAAY,CAAC,MAAc,QAAgC;AAAA,EAElE,IAAI,IAAI;AAAA,EACR,OAAO,IAAI,KAAK,KAAK,WAAW,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,IAAI,KAAK,WAAW,CAAC,MAAM;AAAA,IAAY,OAAO;AAAA,EAE9C,IAAI,SAAS;AAAA,EACb,OAAO,KAAK,WAAW,IAAI,MAAM,MAAM;AAAA,IAAI;AAAA,EAC3C,IAAI,SAAS,KAAK,SAAS;AAAA,IAAG,OAAO;AAAA,EAErC,MAAM,QAAQ,IAAI;AAAA,EAClB,MAAM,UAAU,KAAK,WAAW,KAAK;AAAA,EAIrC,IAAI,QAAQ,KAAK,UAAU,YAAY,MAAM,YAAY,GAAY;AAAA,IACnE,OAAO;AAAA,EACT;AAAA,EAGA,IAAI,eAAe;AAAA,EACnB,OACE,eAAe,KAAK,WACnB,KAAK,WAAW,YAAY,MAAM,MAAM,KAAK,WAAW,YAAY,MAAM,IAC3E;AAAA,IACA;AAAA,EACF;AAAA,EAEA,IAAI,MAAM,KAAK;AAAA,EAEf,OACE,MAAM,iBACL,KAAK,WAAW,MAAM,CAAC,MAAM,MAAM,KAAK,WAAW,MAAM,CAAC,MAAM,IACjE;AAAA,IACA;AAAA,EACF;AAAA,EAGA,IAAI,UAAU;AAAA,EACd,OAAO,UAAU,gBAAgB,KAAK,WAAW,UAAU,CAAC,MAAM;AAAA,IAA
I;AAAA,EACtE,IACE,UAAU,QACT,YAAY,gBACX,KAAK,WAAW,UAAU,CAAC,MAAM,MACjC,KAAK,WAAW,UAAU,CAAC,MAAM,IACnC;AAAA,IACA,MAAM;AAAA,IACN,OACE,MAAM,iBACL,KAAK,WAAW,MAAM,CAAC,MAAM,MAAM,KAAK,WAAW,MAAM,CAAC,MAAM,IACjE;AAAA,MACA;AAAA,IACF;AAAA,EACF;AAAA,EAEA,MAAM,QAAQ,KAAK,MAAM,cAAc,GAAG;AAAA,EAC1C,OAAO,EAAE,OAAO,QAAQ,OAAO,MAAM,OAAO;AAAA;;AC7IvC,SAAS,aAAa,CAAC,UAAqB,YAA+B;AAAA,EAChF,MAAM,MAAiB,CAAC;AAAA,EAExB,MAAM,QAAmB,CAAC;AAAA,EAE1B,WAAW,KAAK,UAAU;AAAA,IAGxB,OAAO,MAAM,SAAS,KAAK,MAAM,MAAM,SAAS,GAAI,SAAS,EAAE,OAAO;AAAA,MACpE,MAAM,UAAU,MAAM,IAAI;AAAA,MAC1B,QAAQ,WAAW,EAAE,OAAO;AAAA,IAC9B;AAAA,IACA,MAAM,SAAS,MAAM,SAAS,IAAI,MAAM,MAAM,SAAS,KAAM;AAAA,IAE7D,MAAM,MAAe;AAAA,MACnB,OAAO,EAAE;AAAA,MACT,OAAO,EAAE;AAAA,MACT,YAAY,EAAE;AAAA,MAEd,UAAU;AAAA,MACV;AAAA,IACF;AAAA,IACA,IAAI,KAAK,GAAG;AAAA,IACZ,MAAM,KAAK,GAAG;AAAA,EAChB;AAAA,EAEA,OAAO;AAAA;AAOF,SAAS,MAAM,CAAC,KAAwB;AAAA,EAC7C,MAAM,OAAiB,CAAC;AAAA,EACxB,IAAI,MAAM,IAAI;AAAA,EACd,OAAO,KAAK;AAAA,IACV,KAAK,KAAK,IAAI,KAAK;AAAA,IACnB,MAAM,IAAI;AAAA,EACZ;AAAA,EACA,OAAO,KAAK,QAAQ;AAAA;AAQf,SAAS,UAAU,CAAC,KAAqB;AAAA,EAC9C,IAAI,IAAI,WAAW;AAAA,IAAG,OAAO;AAAA,EAC7B,IAAI,IAAI;AAAA,EACR,SAAS,IAAI,EAAG,IAAI,IAAI,QAAQ,KAAK;AAAA,IACnC,IAAI,IAAI,WAAW,CAAC,MAAM;AAAA,MAAI;AAAA,EAChC;AAAA,EAGA,IAAI,IAAI,WAAW,IAAI,SAAS,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,OAAO;AAAA;;ACjDF,SAAS,aAAa,CAAC,OAA0B;AAAA,EACtD,MAAM,UAAU,MAAM,KAAK;AAAA,EAC3B,IAAI,QAAQ,WAAW;AAAA,IAAG,OAAO,CAAC;AAAA,EAYlC,MAAM,cAAwB,CAAC;AAAA,EAC/B,MAAM,MAAY,CAAC,YAAY;AAAA,EAC/B,IAAI,MAAM;AAAA,EACV,IAAI,IAAI;AAAA,EACR,IAAI,UAAU;AAAA,EACd,IAAI,iBAAiB;AAAA,EACrB,OAAO,IAAI,QAAQ,QAAQ;AAAA,IACzB,MAAM,KAAK,QAAQ;AAAA,IACnB,IAAI,OAAO,QAAQ,kBAAkB,UAAU;AAAA,MAC7C,UAAU,CAAC;AAAA,MACX,OAAO;AAAA,MACP,iBAAiB;AAAA,MACjB;AAAA,MACA;AAAA,IACF;AAAA,IACA,IAAI,CAAC,WAAW,OAAO,KAAK;AAAA,MAC1B,YAAY,KAAK,IAAI,KAAK,CAAC;AAAA,MAC3B,MAAM;AAAA,MACN,iBAAiB;AAAA,MACjB,IAAI,QAAQ,IAAI,OAAO,KAAK;AAAA,QAC1B,IAAI,KAAK,OAAO;AAAA,QAChB,KAAK;AAAA,MACP,EAAO;AAAA,QACL,IAAI,KAAK,YAAY;AAAA,QACrB,KAAK;AAAA;AAAA,MAEP;AAAA,IACF;AAAA,IACA,OAAO;AAAA,IACP,
IAAI,OAAO,OAAO,OAAO;AAAA,MAAM,iBAAiB;AAAA,IAChD;AAAA,EACF;AAAA,EACA,YAAY,KAAK,IAAI,KAAK,CAAC;AAAA,EAE3B,OAAO,YAAY,IAAI,CAAC,GAAG,QAAQ,aAAa,GAAG,IAAI,QAAQ,YAAY,CAAC;AAAA;AAG9E,SAAS,YAAY,CAAC,KAAa,IAAiB;AAAA,EAClD,IAAI,IAAI;AAAA,EACR,IAAI,QAAuB;AAAA,EAK3B,MAAM,aAAa,yBAAyB,KAAK,CAAC;AAAA,EAClD,IAAI,YAAY;AAAA,IACd,QAAQ,WAAW,GAAI;AAAA,IACvB,IAAI,WAAW,MAAM;AAAA,EACvB;AAAA,EAGA,MAAM,aAAa,wBAAwB,KAAK,CAAC;AAAA,EACjD,IAAI,YAAY;AAAA,IACd,MAAM,UAAU,WAAW;AAAA,IAC3B,MAAM,QAAQ,WAAW,MAAM;AAAA,IAC/B,OAAO;AAAA,MACL;AAAA,MACA;AAAA,MACA,MAAM;AAAA,MACN,OAAO;AAAA,MACP,OAAO,IAAI,OAAO,SAAS,KAAK;AAAA,IAClC;AAAA,EACF;AAAA,EAGA,IAAI,EAAE,WAAW,GAAG,GAAG;AAAA,IACrB,OAAO,EAAE,IAAI,OAAO,MAAM,SAAS,OAAO,EAAE,MAAM,CAAC,EAAE,KAAK,EAAE;AAAA,EAC9D;AAAA,EAEA,OAAO,EAAE,IAAI,OAAO,MAAM,SAAS,OAAO,EAAE,KAAK,EAAE;AAAA;AAG9C,SAAS,KAAK,CAAC,UAAqB,UAAgC;AAAA,EACzE,IAAI,SAAS,WAAW;AAAA,IAAG,OAAO,CAAC;AAAA,EACnC,MAAM,MAAiB,CAAC;AAAA,EACxB,WAAW,OAAO,UAAU;AAAA,IAC1B,IAAI,QAAQ,KAAK,QAAQ;AAAA,MAAG,IAAI,KAAK,GAAG;AAAA,EAC1C;AAAA,EACA,OAAO;AAAA;AAGT,SAAS,OAAO,CAAC,KAAc,MAA0B;AAAA,EAEvD,MAAM,OAAO,KAAK,KAAK,SAAS;AAAA,EAChC,IAAI,CAAC,QAAQ,CAAC,sBAAsB,MAAM,GAAG;AAAA,IAAG,OAAO;AAAA,EAGvD,IAAI,SAAyB,IAAI;AAAA,EACjC,SAAS,IAAI,KAAK,SAAS,EAAG,KAAK,GAAG,KAAK;AAAA,IAIzC,MAAM,KAAK,KAAK,IAAI,GAAI;AAAA,IACxB,MAAM,MAAM,KAAK;AAAA,IAEjB,IAAI,OAAO,SAAS;AAAA,MAClB,IAAI,CAAC,UAAU,CAAC,sBAAsB,KAAK,MAAM;AAAA,QAAG,OAAO;AAAA,MAC3D,SAAS,OAAO;AAAA,IAClB,EAAO;AAAA,MAEL,IAAI,QAAwB;AAAA,MAC5B,OAAO,QAAQ;AAAA,QACb,IAAI,sBAAsB,KAAK,MAAM,GAAG;AAAA,UACtC,QAAQ;AAAA,UACR;AAAA,QACF;AAAA,QACA,SAAS,OAAO;AAAA,MAClB;AAAA,MACA,IAAI,CAAC;AAAA,QAAO,OAAO;AAAA,MACnB,SAAS,MAAM;AAAA;AAAA,EAEnB;AAAA,EACA,OAAO;AAAA;AAGT,SAAS,qBAAqB,CAAC,KAAc,KAAuB;AAAA,EAClE,IAAI,IAAI,UAAU,QAAQ,IAAI,UAAU,IAAI;AAAA,IAAO,OAAO;AAAA,EAC1D,MAAM,QAAQ,IAAI;AAAA,EAClB,QAAQ,IAAI;AAAA,SACL;AAAA,MACH,OAAO,MAAM,YAAY,MAAM,IAAI,MAAM,YAAY;AAAA,SAClD;AAAA,MACH,OAAO,IAAI,MAAO,KAAK,KAAK;AAAA,SACzB;AAAA,MACH,OAAO,MAAM,YAAY,EAAE,SAAS,IAAI,MAAM,YAAY,CAAC;AAAA;AAAA;;AC5J1D,SAAS,SAAS,CACvB,MACA,KACA,UACA,MAC
Q;AAAA,EACR,MAAM,aAAa,WAAW,GAAG;AAAA,EACjC,MAAM,cAAc,SAAS;AAAA,EAC7B,MAAM,cAAc,eAAe,IAAI,OAAO,MAAM;AAAA,EACpD,MAAM,SAAS,gBAAgB,IAAI,YAAY;AAAA,EAE/C,MAAM,MAAgB,CAAC;AAAA,EACvB,IAAI,KAAK,GAAG,SAAS,gBAAgB,eAAe,QAAQ;AAAA,EAE5D,WAAW,OAAO,UAAU;AAAA,IAC1B,IAAI,KAAK,SAAS,QAAQ,IAAI,QAAQ,KAAK;AAAA,MAAO;AAAA,IAClD,MAAM,SAAS,KAAK,OAAO,KAAK,KAAK,OAAO,KAAK,IAAI,GAAG,IAAI,QAAQ,CAAC,CAAC;AAAA,IACtE,MAAM,SAAS,IAAI,OAAO,IAAI,KAAK;AAAA,IACnC,MAAM,QAAQ,IAAI,IAAI,cAAc,IAAI;AAAA,IACxC,IAAI,KAAK,GAAG,SAAS,UAAU,IAAI,UAAU,OAAO;AAAA,EACtD;AAAA,EACA,OAAO,IAAI,KAAK;AAAA,CAAI;AAAA;AAcf,SAAS,aAAa,CAC3B,MACA,UACA,KACA,MACQ;AAAA,EACR,MAAM,QAAQ,IAAI;AAAA,EAClB,IAAI,MAAM,IAAI;AAAA,EAEd,IAAI,KAAK,YAAY,KAAK,aAAa;AAAA,IACrC,MAAM,aAAa,eAAe,KAAK,KAAK,WAAW;AAAA,IACvD,IAAI;AAAA,MAAY,MAAM,WAAW,aAAa;AAAA,EAChD;AAAA,EAEA,IAAI,KAAK,QAAQ;AAAA,IACf,MAAM;AAAA,EACR;AAAA,EAIA,MAAM,aAAa,KAAK,IAAI,KAAK,SAAS,MAAM;AAAA,EAChD,IAAI,OAAO,SAAS,MAAM,QAAQ,GAAG,UAAU,EAAE,KAAK;AAAA,CAAI;AAAA,EAG1D,IAAI,KAAK,YAAY,QAAQ,KAAK,WAAW,GAAG;AAAA,IAC9C,OAAO,aAAa,MAAM,KAAK,QAAQ;AAAA,EACzC;AAAA,EAEA,IAAI,KAAK,QAAQ;AAAA,IACf,OAAO,KAAK,OAAO,IAAI;AAAA,EACzB;AAAA,EAEA,IAAI,KAAK;AAAA,IAAK,OAAO;AAAA,EAErB,MAAM,SAAS,IAAI,OAAO,IAAI,KAAK;AAAA,EACnC,MAAM,SAAS,MAAK,UAAU,SAAS,QAAQ,UAAU,IAAI,SAAS,IAAI,OAAO,CAAC;AAAA,EAClF,MAAM,SAAS,UAAS,IAAI,OAAO,EAAE;AAAA,EACrC,OAAO,GAAG;AAAA,EAAW;AAAA,EAAS;AAAA;AAQzB,SAAS,YAAY,CAAC,MAAc,UAA0B;AAAA,EACnE,IAAI,YAAY;AAAA,IAAG,OAAO;AAAA,EAC1B,MAAM,QAAQ,KAAK,MAAM;AAAA,CAAI;AAAA,EAC7B,IAAI,MAAM,UAAU;AAAA,IAAU,OAAO;AAAA,EACrC,MAAM,OAAO,MAAM,MAAM,GAAG,QAAQ,EAAE,KAAK;AAAA,CAAI;AAAA,EAC/C,MAAM,YAAY,MAAM,SAAS;AAAA,EACjC,OAAO,GAAG;AAAA;AAAA,IAAY;AAAA;AAGxB,SAAS,cAAc,CAAC,KAAc,KAAgC;AAAA,EACpE,WAAW,aAAa,KAAK;AAAA,IAC3B,IAAI,UAAU,WAAW;AAAA,MAAK,OAAO;AAAA,EACvC;AAAA,EACA,OAAO;AAAA;",
  "debugId": "F78549B744E4995264756E2164756E21",
@@ -356,22 +356,22 @@ async function buildFormatter() {
  console.error = originalError;
  }
  if (typeof rendered !== "string") {
- throw new Error("mdq: pretty renderer returned a Promise unexpectedly");
+ throw new Error("lilmd: pretty renderer returned a Promise unexpectedly");
  }
  return rendered.replace(/\n+$/, "");
  };
  }
 
  // src/cli.ts
- var HELP = `mdq — CLI for working with large Markdown files
+ var HELP = `lilmd — CLI for working with large Markdown files
 
  Usage:
- mdq show this help
- mdq <file> print table of contents
- mdq <file> <selector> alias for 'mdq read'
- mdq read <file> <selector> print sections matching selector
- mdq ls <file> <selector> list direct child headings
- mdq grep <file> <pattern> regex-search section bodies
+ lilmd show this help
+ lilmd <file> print table of contents
+ lilmd <file> <selector> alias for 'lilmd read'
+ lilmd read <file> <selector> print sections matching selector
+ lilmd ls <file> <selector> list direct child headings
+ lilmd grep <file> <pattern> regex-search section bodies
 
  Selector grammar:
  Install fuzzy, case-insensitive substring
@@ -425,7 +425,7 @@ async function run(argv) {
  strict: true
  });
  } catch (e) {
- return err(`mdq: ${e.message}
+ return err(`lilmd: ${e.message}
  ${HELP}`, 2);
  }
  const { values, positionals } = parsed;
@@ -451,7 +451,7 @@ async function dispatch(cmd, rest, values) {
  case "grep":
  return cmdGrep(rest, values);
  default:
- return err(`mdq: unknown command '${cmd}'
+ return err(`lilmd: unknown command '${cmd}'
  ${HELP}`, 2);
  }
  }
@@ -460,8 +460,8 @@ function loadFile(file) {
  const src = file === "-" ? readFileSync(0, "utf8") : readFileSync(file, "utf8");
  return { src };
  } catch (e) {
- const msg = e.code === "ENOENT" ? `mdq: cannot open '${file}': not found
- ` : `mdq: cannot open '${file}': ${e.message}
+ const msg = e.code === "ENOENT" ? `lilmd: cannot open '${file}': not found
+ ` : `lilmd: cannot open '${file}': ${e.message}
  `;
  return err(msg, 2);
  }
@@ -480,7 +480,7 @@ function readFlag(v, name, fallback) {
  return { value: fallback };
  const n = parseIntOrNull(raw);
  if (n == null || n < 0) {
- return err(`mdq: --${name} expects a non-negative integer, got '${raw}'
+ return err(`lilmd: --${name} expects a non-negative integer, got '${raw}'
  `, 2);
  }
  return { value: n };
@@ -488,7 +488,7 @@ function readFlag(v, name, fallback) {
  function cmdToc(rest, v) {
  const file = rest[0];
  if (file == null)
- return err(`mdq toc: missing <file>
+ return err(`lilmd toc: missing <file>
  `, 2);
  const loaded = loadFile(file);
  if ("code" in loaded)
@@ -514,11 +514,11 @@ async function cmdRead(rest, v) {
  const file = rest[0];
  const selectorStr = rest[1];
  if (file == null || selectorStr == null) {
- return err(`mdq read: missing <file> or <selector>
+ return err(`lilmd read: missing <file> or <selector>
  `, 2);
  }
  if (v.pretty && v.json) {
- return err(`mdq read: --pretty cannot be combined with --json
+ return err(`lilmd read: --pretty cannot be combined with --json
  `, 2);
  }
  const loaded = loadFile(file);
@@ -547,7 +547,7 @@ async function cmdRead(rest, v) {
  try {
  pretty = await loadPrettyFormatter();
  } catch (e) {
- return err(`mdq read: ${e.message}
+ return err(`lilmd read: ${e.message}
  `, 2);
  }
  }
@@ -585,7 +585,7 @@ function cmdLs(rest, v) {
  const file = rest[0];
  const selectorStr = rest[1];
  if (file == null || selectorStr == null) {
- return err(`mdq ls: missing <file> or <selector>
+ return err(`lilmd ls: missing <file> or <selector>
  `, 2);
  }
  const loaded = loadFile(file);
@@ -639,7 +639,7 @@ function cmdGrep(rest, v) {
  const file = rest[0];
  const pattern = rest[1];
  if (file == null || pattern == null) {
- return err(`mdq grep: missing <file> or <pattern>
+ return err(`lilmd grep: missing <file> or <pattern>
  `, 2);
  }
  const loaded = loadFile(file);
@@ -651,7 +651,7 @@ function cmdGrep(rest, v) {
  try {
  re = new RegExp(pattern);
  } catch (e) {
- return err(`mdq grep: invalid regex: ${e.message}
+ return err(`lilmd grep: invalid regex: ${e.message}
  `, 2);
  }
  const srcLines = src.split(`
@@ -720,7 +720,7 @@ function sectionToJSON(sec) {
  };
  }
 
- // bin/mdq.ts
+ // bin/lilmd.ts
  var result = await run(process.argv.slice(2));
  if (result.stdout) {
  process.stdout.write(result.stdout.endsWith(`
@@ -731,5 +731,5 @@ if (result.stderr)
  process.stderr.write(result.stderr);
  process.exit(result.code);
 
- //# debugId=21A3DDC69C33B5A064756E2164756E21
735
- //# sourceMappingURL=mdq.js.map
734
+ //# debugId=3F3B5F923576EA7C64756E2164756E21
+ //# sourceMappingURL=lilmd.js.map
@@ -0,0 +1,16 @@
+ {
+ "version": 3,
+ "sources": ["src/cli.ts", "src/scan.ts", "src/sections.ts", "src/select.ts", "src/render.ts", "src/pretty.ts", "bin/lilmd.ts"],
+ "sourcesContent": [
+ "#!/usr/bin/env bun\n/**\n * lilmd CLI — composes scan + sections + select + render into commands.\n *\n * Uses node:util.parseArgs only: zero runtime dependencies. Subcommand\n * dispatch is a tiny switch on argv[0]; that's about 20 lines of glue we'd\n * reinvent on top of any framework anyway, and the cold-start benchmark\n * (see BENCHMARK.md) showed cac and parseArgs tied at ~16ms.\n *\n * Public commands (MVP):\n * lilmd [file] toc\n * lilmd <file> <selector> alias for: lilmd read\n * lilmd read <file> <selector> read section(s) matching selector\n * lilmd ls <file> <selector> direct children of matching section\n * lilmd grep <file> <pattern> regex search inside section bodies\n * lilmd --help | -h help\n *\n * File argument accepts `-` to read stdin.\n *\n * Exit codes:\n * 0 found something / toc printed / help\n * 1 ran successfully but found no matches (grep-style, so `&&` chaining\n * works the way agents expect)\n * 2 usage error (bad flag, missing file, invalid regex, file not found)\n */\n\nimport { parseArgs, type ParseArgsConfig } from \"node:util\";\nimport { readFileSync } from \"node:fs\";\n\nimport { scan } from \"./scan\";\nimport { buildSections, countLines, pathOf, type Section } from \"./sections\";\nimport { match, parseSelector } from \"./select\";\nimport { renderSection, renderToc, truncateBody } from \"./render\";\nimport { loadPrettyFormatter, type PrettyFormatter } from \"./pretty\";\n\nconst HELP = `lilmd — CLI for working with large Markdown files\n\nUsage:\n lilmd show this help\n lilmd <file> print table of contents\n lilmd <file> <selector> alias for 'lilmd read'\n lilmd read <file> <selector> print sections matching selector\n lilmd ls <file> <selector> list direct child headings\n lilmd grep <file> <pattern> regex-search section bodies\n\nSelector grammar:\n Install fuzzy, case-insensitive substring\n =Install exact, case-insensitive equality\n /^inst/i regex (JS syntax); flags default to 'i'\n ##Install level filter (1..6 
'#'s)\n Guide > Install descendant, any depth under 'Guide'\n Guide >> Install direct child of 'Guide'\n\nOptions:\n --depth <n> TOC: max heading depth to show (0 = none)\n --flat TOC: flat list, no indentation\n --max-results <n> cap matches for read/ls (default 25)\n --max-lines <n> truncate long bodies (0 = unlimited)\n --body-only read: skip subsections\n --no-body read: print headings only\n --raw read: drop delimiter lines\n --pretty read: render markdown with ANSI styling (for humans)\n --json machine-readable JSON output\n\nUse '-' as <file> to read from stdin. Exit code is 1 when no matches.\n`;\n\nexport type CliResult = { code: number; stdout: string; stderr: string };\n\nfunction ok(s: string): CliResult {\n return { code: 0, stdout: s, stderr: \"\" };\n}\nfunction noMatch(s: string): CliResult {\n // Successful run that found nothing: exit 1 so `lilmd ... && foo` works the\n // way agents expect, but keep the friendly message on stdout so humans see\n // it too.\n return { code: 1, stdout: s, stderr: \"\" };\n}\nfunction err(s: string, code = 1): CliResult {\n return { code, stdout: \"\", stderr: s };\n}\n\nconst OPTIONS = {\n depth: { type: \"string\" },\n flat: { type: \"boolean\" },\n \"max-results\": { type: \"string\" },\n \"max-lines\": { type: \"string\" },\n \"body-only\": { type: \"boolean\" },\n \"no-body\": { type: \"boolean\" },\n raw: { type: \"boolean\" },\n pretty: { type: \"boolean\" },\n json: { type: \"boolean\" },\n help: { type: \"boolean\", short: \"h\" },\n} satisfies NonNullable<ParseArgsConfig[\"options\"]>;\n\n/** CLI entry point. Async because `--pretty` lazy-loads marked. 
*/\nexport async function run(argv: string[]): Promise<CliResult> {\n let parsed;\n try {\n parsed = parseArgs({\n args: argv,\n options: OPTIONS,\n allowPositionals: true,\n strict: true,\n });\n } catch (e) {\n return err(`lilmd: ${(e as Error).message}\\n${HELP}`, 2);\n }\n\n const { values, positionals } = parsed;\n if (values.help || positionals.length === 0) {\n return ok(HELP);\n }\n\n // Detect explicit subcommand by the first positional.\n const head = positionals[0];\n if (head === \"read\" || head === \"ls\" || head === \"grep\" || head === \"toc\") {\n return dispatch(head, positionals.slice(1), values);\n }\n\n // Positional form:\n // lilmd <file> -> toc\n // lilmd <file> <selector> -> read\n if (positionals.length === 1) return dispatch(\"toc\", positionals, values);\n return dispatch(\"read\", positionals, values);\n}\n\ntype Values = ReturnType<typeof parseArgs<{ options: typeof OPTIONS }>>[\"values\"];\n\nasync function dispatch(\n cmd: string,\n rest: string[],\n values: Values,\n): Promise<CliResult> {\n switch (cmd) {\n case \"toc\":\n return cmdToc(rest, values);\n case \"read\":\n return cmdRead(rest, values);\n case \"ls\":\n return cmdLs(rest, values);\n case \"grep\":\n return cmdGrep(rest, values);\n default:\n return err(`lilmd: unknown command '${cmd}'\\n${HELP}`, 2);\n }\n}\n\n/**\n * Read `file` from disk or stdin. Returns a CliResult on failure so callers\n * can just forward it — we swallow raw ENOENT stack traces here and emit a\n * friendly \"lilmd: cannot open 'foo.md'\" message instead.\n */\nfunction loadFile(file: string): { src: string } | CliResult {\n try {\n const src = file === \"-\" ? readFileSync(0, \"utf8\") : readFileSync(file, \"utf8\");\n return { src };\n } catch (e) {\n const msg = (e as NodeJS.ErrnoException).code === \"ENOENT\"\n ? `lilmd: cannot open '${file}': not found\\n`\n : `lilmd: cannot open '${file}': ${(e as Error).message}\\n`;\n return err(msg, 2);\n }\n}\n\n/**\n * Parse `v` as a base-10 integer. 
Returns null (not fallback) on invalid\n * input so callers can distinguish \"not provided\" from \"bad value\".\n */\nfunction parseIntOrNull(v: string | undefined): number | null {\n if (v == null) return null;\n const n = Number.parseInt(v, 10);\n if (Number.isNaN(n) || String(n) !== v.trim()) return null;\n return n;\n}\n\nfunction readFlag(\n v: Values,\n name: \"depth\" | \"max-results\" | \"max-lines\",\n fallback: number | null,\n): { value: number | null } | CliResult {\n const raw = v[name];\n if (raw == null) return { value: fallback };\n const n = parseIntOrNull(raw);\n if (n == null || n < 0) {\n return err(`lilmd: --${name} expects a non-negative integer, got '${raw}'\\n`, 2);\n }\n return { value: n };\n}\n\n// ---- commands ----------------------------------------------------------\n\nfunction cmdToc(rest: string[], v: Values): CliResult {\n const file = rest[0];\n if (file == null) return err(\"lilmd toc: missing <file>\\n\", 2);\n const loaded = loadFile(file);\n if (\"code\" in loaded) return loaded;\n const { src } = loaded;\n const sections = buildSections(scan(src), countLines(src));\n\n if (v.json) {\n return ok(\n JSON.stringify(\n {\n file,\n total_lines: countLines(src),\n headings: sections.map(sectionToJSON),\n },\n null,\n 2,\n ),\n );\n }\n\n const depth = readFlag(v, \"depth\", null);\n if (\"code\" in depth) return depth;\n return ok(\n renderToc(file, src, sections, {\n depth: depth.value ?? 
undefined,\n flat: !!v.flat,\n }),\n );\n}\n\nasync function cmdRead(rest: string[], v: Values): Promise<CliResult> {\n const file = rest[0];\n const selectorStr = rest[1];\n if (file == null || selectorStr == null) {\n return err(\"lilmd read: missing <file> or <selector>\\n\", 2);\n }\n // --pretty styles output with ANSI for humans; --json is for machines.\n // Reject the combo before we do any I/O.\n if (v.pretty && v.json) {\n return err(\"lilmd read: --pretty cannot be combined with --json\\n\", 2);\n }\n const loaded = loadFile(file);\n if (\"code\" in loaded) return loaded;\n const { src } = loaded;\n const sections = buildSections(scan(src), countLines(src));\n\n const maxResults = readFlag(v, \"max-results\", 25);\n if (\"code\" in maxResults) return maxResults;\n const maxLines = readFlag(v, \"max-lines\", 0);\n if (\"code\" in maxLines) return maxLines;\n\n const selector = parseSelector(selectorStr);\n const matches = match(sections, selector);\n\n // Split source once and pass srcLines through to render. Previously\n // renderSection/sliceBody each re-split for every match, turning\n // `lilmd read file.md sel` into O(matches × file_size).\n const srcLines = src.split(\"\\n\");\n\n if (v.json) {\n return emitReadJson(\n file,\n srcLines,\n sections,\n matches,\n maxResults.value ?? 25,\n maxLines.value ?? 0,\n v,\n );\n }\n\n if (matches.length === 0) return noMatch(\"(no match)\\n\");\n\n let pretty: PrettyFormatter | undefined;\n if (v.pretty) {\n try {\n pretty = await loadPrettyFormatter();\n } catch (e) {\n return err(`lilmd read: ${(e as Error).message}\\n`, 2);\n }\n }\n\n const cap = maxResults.value ?? 25;\n const toPrint = matches.slice(0, cap);\n const out: string[] = [];\n if (matches.length > cap) {\n out.push(\n `${matches.length} matches, showing first ${cap}. 
Use --max-results=N to raise the cap.`,\n );\n }\n for (const sec of toPrint) {\n out.push(\n renderSection(file, srcLines, sec, {\n bodyOnly: !!v[\"body-only\"],\n noBody: !!v[\"no-body\"],\n raw: !!v.raw,\n maxLines: maxLines.value ?? 0,\n allSections: sections,\n pretty,\n }),\n );\n }\n return ok(out.join(\"\\n\"));\n}\n\nfunction emitReadJson(\n file: string,\n srcLines: string[],\n all: Section[],\n matches: Section[],\n maxResults: number,\n maxLines: number,\n v: Values,\n): CliResult {\n const body = JSON.stringify(\n {\n file,\n matches: matches.slice(0, maxResults).map((s) => ({\n ...sectionToJSON(s),\n body: v[\"no-body\"]\n ? \"\"\n : maybeTruncate(sliceBody(srcLines, s, all, !!v[\"body-only\"]), maxLines),\n })),\n truncated: matches.length > maxResults,\n },\n null,\n 2,\n );\n // JSON path keeps exit 0 regardless — the shape itself signals no matches.\n return matches.length === 0 ? { code: 1, stdout: body, stderr: \"\" } : ok(body);\n}\n\nfunction cmdLs(rest: string[], v: Values): CliResult {\n const file = rest[0];\n const selectorStr = rest[1];\n if (file == null || selectorStr == null) {\n return err(\"lilmd ls: missing <file> or <selector>\\n\", 2);\n }\n const loaded = loadFile(file);\n if (\"code\" in loaded) return loaded;\n const { src } = loaded;\n const sections = buildSections(scan(src), countLines(src));\n\n const maxResults = readFlag(v, \"max-results\", 25);\n if (\"code\" in maxResults) return maxResults;\n const cap = maxResults.value ?? 
25;\n\n const selector = parseSelector(selectorStr);\n const matches = match(sections, selector).slice(0, cap);\n\n // Index children by parent once so we don't repeat an O(n) filter per\n // match — inexpensive in practice but easy to do right.\n const childrenOf = new Map<Section, Section[]>();\n for (const sec of sections) {\n if (sec.parent) {\n const list = childrenOf.get(sec.parent);\n if (list) list.push(sec);\n else childrenOf.set(sec.parent, [sec]);\n }\n }\n\n if (v.json) {\n const results = matches.map((parent) => ({\n parent: sectionToJSON(parent),\n children: (childrenOf.get(parent) ?? []).map(sectionToJSON),\n }));\n const body = JSON.stringify({ file, results }, null, 2);\n return matches.length === 0 ? { code: 1, stdout: body, stderr: \"\" } : ok(body);\n }\n\n if (matches.length === 0) return noMatch(\"(no match)\\n\");\n\n const out: string[] = [];\n for (const parent of matches) {\n const children = childrenOf.get(parent) ?? [];\n out.push(\n `${\"#\".repeat(parent.level)} ${parent.title} L${parent.line_start}-${parent.line_end}`,\n );\n if (children.length === 0) {\n out.push(\" (no children)\");\n } else {\n for (const c of children) {\n out.push(\n ` ${\"#\".repeat(c.level)} ${c.title} L${c.line_start}-${c.line_end}`,\n );\n }\n }\n }\n return ok(out.join(\"\\n\"));\n}\n\nfunction cmdGrep(rest: string[], v: Values): CliResult {\n const file = rest[0];\n const pattern = rest[1];\n if (file == null || pattern == null) {\n return err(\"lilmd grep: missing <file> or <pattern>\\n\", 2);\n }\n const loaded = loadFile(file);\n if (\"code\" in loaded) return loaded;\n const { src } = loaded;\n const sections = buildSections(scan(src), countLines(src));\n\n let re: RegExp;\n try {\n re = new RegExp(pattern);\n } catch (e) {\n return err(`lilmd grep: invalid regex: ${(e as Error).message}\\n`, 2);\n }\n\n const srcLines = src.split(\"\\n\");\n type Hit = { section: Section | null; line: number; text: string };\n const hits: Hit[] = [];\n\n // Walk all 
lines once and attribute each match to the innermost enclosing\n // section. Sections are in document order, so a monotonic cursor is enough.\n // Matches before the first heading get `section: null`.\n let secIdx = -1;\n for (let lineNo = 1; lineNo <= srcLines.length; lineNo++) {\n while (\n secIdx + 1 < sections.length &&\n sections[secIdx + 1]!.line_start <= lineNo\n ) {\n secIdx++;\n }\n const line = srcLines[lineNo - 1]!;\n if (re.test(line)) {\n const section = secIdx >= 0 ? sections[secIdx] ?? null : null;\n hits.push({ section, line: lineNo, text: line });\n }\n }\n\n if (v.json) {\n const body = JSON.stringify(\n hits.map((h) => ({\n file,\n line: h.line,\n text: h.text,\n section: h.section ? sectionToJSON(h.section) : null,\n })),\n null,\n 2,\n );\n return hits.length === 0 ? { code: 1, stdout: body, stderr: \"\" } : ok(body);\n }\n\n if (hits.length === 0) return noMatch(\"(no match)\\n\");\n\n // Group consecutive hits by section for readable output.\n const out: string[] = [];\n let lastSection: Section | null | undefined = undefined;\n for (const hit of hits) {\n if (hit.section !== lastSection) {\n if (hit.section) {\n const path = pathOf(hit.section).concat(hit.section.title).join(\" > \");\n out.push(\n `── ${path} L${hit.section.line_start}-${hit.section.line_end}`,\n );\n } else {\n out.push(`── ${file} (no enclosing heading)`);\n }\n lastSection = hit.section;\n }\n out.push(` L${hit.line}: ${hit.text}`);\n }\n return ok(out.join(\"\\n\"));\n}\n\n// ---- shared helpers ----------------------------------------------------\n\nfunction sliceBody(\n srcLines: string[],\n sec: Section,\n all: Section[],\n bodyOnly: boolean,\n): string {\n let end = sec.line_end;\n if (bodyOnly) {\n const firstChild = all.find((s) => s.parent === sec);\n if (firstChild) end = firstChild.line_start - 1;\n }\n return srcLines.slice(sec.line_start - 1, end).join(\"\\n\");\n}\n\nfunction maybeTruncate(body: string, maxLines: number): string {\n return maxLines > 0 ? 
truncateBody(body, maxLines) : body;\n}\n\nfunction sectionToJSON(sec: Section): Record<string, unknown> {\n return {\n level: sec.level,\n title: sec.title,\n line_start: sec.line_start,\n line_end: sec.line_end,\n path: pathOf(sec),\n };\n}\n\n// Entry point lives in bin/lilmd.ts; this module only exports `run`.\n",
6
+ "/**\n * Markdown heading scanner — the engine behind every read-path command.\n *\n * Instead of building a full CommonMark AST we walk the source line by line\n * and recognize only what `lilmd` actually needs: ATX headings and fenced code\n * blocks (so `#` inside code doesn't count as a heading).\n *\n * Numbers on MDN content (see BENCHMARK.md): ~180 MB/s end-to-end on a\n * 10 MB fixture, roughly 7x faster than markdown-it and ~1000x faster than\n * mdast-util-from-markdown while returning the exact same section.\n *\n * Deliberate limitations:\n * - Setext headings (`===` / `---` underlines) are NOT recognized. lilmd is\n * aimed at agent-authored markdown where ATX is ubiquitous.\n * - HTML blocks are not detected. A `<pre>` containing an ATX-looking line\n * would be misread as a heading. That's an acceptable tradeoff for 100x\n * speed; a future `--strict` flag could hand off to markdown-it.\n * - Fenced code blocks *inside a list item* that are indented 4+ spaces are\n * not recognized as fences — we only look at the first 3 columns for the\n * fence opener. A `# fake` line inside such a block would be scanned as a\n * heading. Rare in practice; document-your-way-out rather than fix.\n * - An unclosed fence at EOF leaves the scanner in \"still in fence\" state\n * to the end of the file, so any `#`-looking lines after it are ignored.\n * That's the conservative choice — prefer under-counting to over-counting.\n */\n\nexport type Heading = {\n /** 1..6 */\n level: number;\n /** Heading text with trailing closing hashes stripped. */\n title: string;\n /** 1-indexed line number. 
*/\n line: number;\n};\n\n/**\n * Return every ATX heading in `src`, in document order.\n * Runs in a single pass; O(n) in source length, O(headings) in space.\n */\nexport function scan(src: string): Heading[] {\n const out: Heading[] = [];\n const len = src.length;\n\n let i = 0;\n let lineNo = 0;\n\n // Fence state: when inFence is true every line is ignored until we see a\n // matching closing fence (same char, length >= opening length).\n let inFence = false;\n let fenceChar = 0; // charCode of ` or ~\n let fenceLen = 0;\n\n while (i <= len) {\n // Slice one line without the trailing newline. A trailing \\r from CRLF\n // is stripped below.\n const start = i;\n while (i < len && src.charCodeAt(i) !== 10 /* \\n */) i++;\n let line = src.slice(start, i);\n if (line.length > 0 && line.charCodeAt(line.length - 1) === 13 /* \\r */) {\n line = line.slice(0, line.length - 1);\n }\n lineNo++;\n\n const fence = matchFence(line);\n if (fence) {\n if (!inFence) {\n inFence = true;\n fenceChar = fence.char;\n fenceLen = fence.len;\n } else if (fence.char === fenceChar && fence.len >= fenceLen) {\n inFence = false;\n }\n } else if (!inFence) {\n const h = matchHeading(line, lineNo);\n if (h) out.push(h);\n }\n\n if (i >= len) break;\n i++; // skip the \\n\n }\n\n return out;\n}\n\n/**\n * If `line` opens or closes a fenced code block, return the fence char code\n * (` or ~) and the number of fence characters. Otherwise null.\n *\n * A fence is 0–3 spaces, then 3+ of a single fence char, then optional info\n * string. 
We only care about the opening run length; the info string is\n * ignored.\n */\nfunction matchFence(line: string): { char: number; len: number } | null {\n // Skip up to 3 leading spaces.\n let p = 0;\n while (p < 3 && line.charCodeAt(p) === 32) p++;\n const ch = line.charCodeAt(p);\n if (ch !== 96 /* ` */ && ch !== 126 /* ~ */) return null;\n let run = 0;\n while (line.charCodeAt(p + run) === ch) run++;\n if (run < 3) return null;\n // For backtick fences, CommonMark forbids backticks in the info string,\n // but we don't parse info; we only need to know this line is a fence.\n return { char: ch, len: run };\n}\n\n/**\n * If `line` is an ATX heading, return it. Otherwise null.\n *\n * Rules (CommonMark, simplified):\n * - 0–3 spaces of indent\n * - 1–6 `#`\n * - EITHER end-of-line OR a space/tab followed by content\n * - optional closing sequence: whitespace + trailing `#`s (stripped)\n */\nfunction matchHeading(line: string, lineNo: number): Heading | null {\n // Skip up to 3 leading spaces.\n let p = 0;\n while (p < 3 && line.charCodeAt(p) === 32) p++;\n if (line.charCodeAt(p) !== 35 /* # */) return null;\n\n let hashes = 0;\n while (line.charCodeAt(p + hashes) === 35) hashes++;\n if (hashes < 1 || hashes > 6) return null;\n\n const after = p + hashes;\n const afterCh = line.charCodeAt(after);\n\n // After the hashes we need either end-of-line or a space/tab. 
Anything else\n // (including `#` which is caught above by the hashes loop) disqualifies.\n if (after < line.length && afterCh !== 32 && afterCh !== 9 /* \\t */) {\n return null;\n }\n\n // Trim leading whitespace of content and trailing whitespace + closing #s.\n let contentStart = after;\n while (\n contentStart < line.length &&\n (line.charCodeAt(contentStart) === 32 || line.charCodeAt(contentStart) === 9)\n ) {\n contentStart++;\n }\n\n let end = line.length;\n // Trim trailing whitespace first.\n while (\n end > contentStart &&\n (line.charCodeAt(end - 1) === 32 || line.charCodeAt(end - 1) === 9)\n ) {\n end--;\n }\n // Strip closing `#`s only if they are preceded by whitespace (CommonMark\n // requires the closing sequence to be separated from the content).\n let closing = end;\n while (closing > contentStart && line.charCodeAt(closing - 1) === 35) closing--;\n if (\n closing < end &&\n (closing === contentStart ||\n line.charCodeAt(closing - 1) === 32 ||\n line.charCodeAt(closing - 1) === 9)\n ) {\n end = closing;\n while (\n end > contentStart &&\n (line.charCodeAt(end - 1) === 32 || line.charCodeAt(end - 1) === 9)\n ) {\n end--;\n }\n }\n\n const title = line.slice(contentStart, end);\n return { level: hashes, title, line: lineNo };\n}\n",
7
+ "/**\n * Turn a flat list of headings into a section tree with line ranges and\n * parent links.\n *\n * Each Section covers its heading line through the line before the next\n * heading at the same-or-higher level (or the end of the file if none).\n * That's the \"whole subtree\" range — it includes the section's body *and*\n * its descendants. `--body-only` rendering is derived at render time.\n */\n\nimport type { Heading } from \"./scan\";\n\nexport type Section = {\n level: number;\n title: string;\n /** 1-indexed line of the heading itself. */\n line_start: number;\n /** 1-indexed inclusive end of the subtree. */\n line_end: number;\n /** Nearest enclosing section, or null for top-level. */\n parent: Section | null;\n};\n\n/**\n * Build the section tree in a single pass. Preserves document order.\n *\n * Runs in O(n): every section is pushed once and popped once, and we set\n * its `line_end` at pop time. Sections still on the stack when we run out\n * of headings keep their provisional `line_end = totalLines`.\n */\nexport function buildSections(headings: Heading[], totalLines: number): Section[] {\n const out: Section[] = [];\n /** Ancestors whose subtree is still open. */\n const stack: Section[] = [];\n\n for (const h of headings) {\n // Every section on the stack with the same-or-shallower level closes at\n // h.line - 1 (the line before the new heading).\n while (stack.length > 0 && stack[stack.length - 1]!.level >= h.level) {\n const closing = stack.pop()!;\n closing.line_end = h.line - 1;\n }\n const parent = stack.length > 0 ? stack[stack.length - 1]! 
: null;\n\n const sec: Section = {\n level: h.level,\n title: h.title,\n line_start: h.line,\n // Provisional: if nothing closes this section we leave it at totalLines.\n line_end: totalLines,\n parent,\n };\n out.push(sec);\n stack.push(sec);\n }\n\n return out;\n}\n\n/**\n * Walk `sec` up to the root, collecting ancestor titles in top-down order.\n * Returns [] for a root section.\n */\nexport function pathOf(sec: Section): string[] {\n const path: string[] = [];\n let cur = sec.parent;\n while (cur) {\n path.push(cur.title);\n cur = cur.parent;\n }\n return path.reverse();\n}\n\n/**\n * Count lines in a source string. Empty string is 0; otherwise every line\n * (including the last one, whether or not it ends with a newline) is 1.\n * A trailing newline does NOT add a phantom line.\n */\nexport function countLines(src: string): number {\n if (src.length === 0) return 0;\n let n = 1;\n for (let i = 0; i < src.length; i++) {\n if (src.charCodeAt(i) === 10) n++;\n }\n // If the source ends with a newline, the line-count should equal the\n // number of newlines (not newlines + 1) since the final \"line\" is empty.\n if (src.charCodeAt(src.length - 1) === 10) n--;\n return n;\n}\n",
8
+ "/**\n * Selector grammar parser and matcher.\n *\n * SELECTOR := SEGMENT ( SEP SEGMENT )*\n * SEP := \">\" (descendant, any depth)\n * | \">>\" (direct child)\n * SEGMENT := LEVEL? MATCHER\n * LEVEL := \"#\"{1,6} (optional level filter)\n * MATCHER := TEXT (fuzzy, case-insensitive substring)\n * | \"=\" TEXT (exact, case-insensitive equality)\n * | \"/\" PATTERN \"/\" FLAGS? (JS regex; defaults to /.../i)\n *\n * Matching semantics:\n * - The *last* segment must match the candidate section itself.\n * - Earlier segments must match an ancestor chain walking upward from that\n * candidate, respecting each separator between them: `A >> B` requires A\n * to be B's *immediate* parent; `A > B` only requires A to be *some*\n * ancestor of B.\n */\n\nimport type { Section } from \"./sections\";\n\nexport type Op = \"descendant\" | \"child\";\nexport type Kind = \"fuzzy\" | \"exact\" | \"regex\";\n\nexport type Segment = {\n /** Operator that connects this segment to the *previous* one.\n * For the first segment this is always \"descendant\" (unused). */\n op: Op;\n /** Optional 1..6 level filter. */\n level: number | null;\n kind: Kind;\n /** The raw value (without level/kind prefix). */\n value: string;\n /** Present only for kind === \"regex\". */\n regex?: RegExp;\n};\n\nexport function parseSelector(input: string): Segment[] {\n const trimmed = input.trim();\n if (trimmed.length === 0) return [];\n\n // Split on > / >>. We walk the string character by character so we can\n // distinguish the two operators without confusing a `>>` with two\n // consecutive `>`s. We intentionally ignore `>` that appear inside a\n // regex delimiter pair because users may write `/a>b/`.\n //\n // `atSegmentStart` tracks whether the running buffer is still whitespace\n // only — only in that state can a `/` open a regex literal. 
Using\n // `cur.length === 0` instead is wrong because `>` splits leave the loop\n // pointing at a leading space that then lands in `cur` before the next\n // non-space char.\n const rawSegments: string[] = [];\n const ops: Op[] = [\"descendant\"];\n let cur = \"\";\n let i = 0;\n let inRegex = false;\n let atSegmentStart = true;\n while (i < trimmed.length) {\n const ch = trimmed[i];\n if (ch === \"/\" && (atSegmentStart || inRegex)) {\n inRegex = !inRegex;\n cur += ch;\n atSegmentStart = false;\n i++;\n continue;\n }\n if (!inRegex && ch === \">\") {\n rawSegments.push(cur.trim());\n cur = \"\";\n atSegmentStart = true;\n if (trimmed[i + 1] === \">\") {\n ops.push(\"child\");\n i += 2;\n } else {\n ops.push(\"descendant\");\n i += 1;\n }\n continue;\n }\n cur += ch;\n if (ch !== \" \" && ch !== \"\\t\") atSegmentStart = false;\n i++;\n }\n rawSegments.push(cur.trim());\n\n return rawSegments.map((s, idx) => parseSegment(s, ops[idx] ?? \"descendant\"));\n}\n\nfunction parseSegment(raw: string, op: Op): Segment {\n let s = raw;\n let level: number | null = null;\n\n // Level prefix — exactly 1..6 `#`s followed by something that is NOT\n // another `#`. The negative lookahead matters: without it, \"#######foo\"\n // would silently match level=6 value=\"#foo\".\n const levelMatch = /^(#{1,6})(?!#)\\s*(.*)$/.exec(s);\n if (levelMatch) {\n level = levelMatch[1]!.length;\n s = levelMatch[2] ?? 
\"\";\n }\n\n // Regex literal: /pattern/flags — flags default to \"i\".\n const regexMatch = /^\\/(.+)\\/([gimsuy]*)$/.exec(s);\n if (regexMatch) {\n const pattern = regexMatch[1]!;\n const flags = regexMatch[2] || \"i\";\n return {\n op,\n level,\n kind: \"regex\",\n value: pattern,\n regex: new RegExp(pattern, flags),\n };\n }\n\n // Exact match: =value.\n if (s.startsWith(\"=\")) {\n return { op, level, kind: \"exact\", value: s.slice(1).trim() };\n }\n\n return { op, level, kind: \"fuzzy\", value: s.trim() };\n}\n\nexport function match(sections: Section[], selector: Segment[]): Section[] {\n if (selector.length === 0) return [];\n const out: Section[] = [];\n for (const sec of sections) {\n if (matches(sec, selector)) out.push(sec);\n }\n return out;\n}\n\nfunction matches(sec: Section, segs: Segment[]): boolean {\n // Last segment matches the candidate itself.\n const last = segs[segs.length - 1];\n if (!last || !segmentMatchesSection(last, sec)) return false;\n\n // Walk the ancestor chain backward alongside the earlier segments.\n let cursor: Section | null = sec.parent;\n for (let i = segs.length - 2; i >= 0; i--) {\n // The separator BEFORE segs[i+1] is stored on segs[i+1].op; that's the\n // relationship we need to honor when walking from segs[i+1] back to\n // segs[i] in the ancestor chain.\n const op = segs[i + 1]!.op;\n const seg = segs[i]!;\n\n if (op === \"child\") {\n if (!cursor || !segmentMatchesSection(seg, cursor)) return false;\n cursor = cursor.parent;\n } else {\n // Descendant: find any matching ancestor.\n let found: Section | null = null;\n while (cursor) {\n if (segmentMatchesSection(seg, cursor)) {\n found = cursor;\n break;\n }\n cursor = cursor.parent;\n }\n if (!found) return false;\n cursor = found.parent;\n }\n }\n return true;\n}\n\nfunction segmentMatchesSection(seg: Segment, sec: Section): boolean {\n if (seg.level !== null && seg.level !== sec.level) return false;\n const title = sec.title;\n switch (seg.kind) {\n case 
\"exact\":\n return title.toLowerCase() === seg.value.toLowerCase();\n case \"regex\":\n return seg.regex!.test(title);\n case \"fuzzy\":\n return title.toLowerCase().includes(seg.value.toLowerCase());\n }\n}\n",
9
+ "/**\n * Output formatting for lilmd.\n *\n * Two targets today: `renderToc` for the TOC view and `renderSection` for a\n * single section read. Both emit grep-friendly plain text with stable\n * delimiters that agents can split on.\n *\n * A future `--json` pipeline lives in cli.ts; the shapes (Heading, Section)\n * are already JSON-clean so it's a direct serialization.\n */\n\nimport type { Section } from \"./sections\";\nimport { countLines } from \"./sections\";\nimport type { PrettyFormatter } from \"./pretty\";\n\nexport type TocOptions = {\n depth?: number;\n flat?: boolean;\n};\n\nexport function renderToc(\n file: string,\n src: string,\n sections: Section[],\n opts: TocOptions,\n): string {\n const totalLines = countLines(src);\n const headerCount = sections.length;\n const headerRange = totalLines === 0 ? \"L0\" : `L1-${totalLines}`;\n const plural = headerCount === 1 ? \"heading\" : \"headings\";\n\n const out: string[] = [];\n out.push(`${file} ${headerRange} ${headerCount} ${plural}`);\n\n for (const sec of sections) {\n if (opts.depth != null && sec.level > opts.depth) continue;\n const indent = opts.flat ? \"\" : \" \".repeat(Math.max(0, sec.level - 1));\n const hashes = \"#\".repeat(sec.level);\n const range = `L${sec.line_start}-${sec.line_end}`;\n out.push(`${indent}${hashes} ${sec.title} ${range}`);\n }\n return out.join(\"\\n\");\n}\n\nexport type SectionOptions = {\n bodyOnly?: boolean;\n noBody?: boolean;\n raw?: boolean;\n maxLines?: number;\n /** Required when bodyOnly is true so we can find the first child. */\n allSections?: Section[];\n /** Optional markdown→ANSI formatter applied to the body before delimiters. 
*/\n pretty?: PrettyFormatter;\n};\n\nexport function renderSection(\n file: string,\n srcLines: string[],\n sec: Section,\n opts: SectionOptions,\n): string {\n const start = sec.line_start;\n let end = sec.line_end;\n\n if (opts.bodyOnly && opts.allSections) {\n const firstChild = findFirstChild(sec, opts.allSections);\n if (firstChild) end = firstChild.line_start - 1;\n }\n\n if (opts.noBody) {\n end = start;\n }\n\n // Clamp to source length so a stale `line_end` (e.g. countLines and\n // splitLines disagreeing on a trailing newline) can't overrun.\n const clampedEnd = Math.min(end, srcLines.length);\n let body = srcLines.slice(start - 1, clampedEnd).join(\"\\n\");\n\n // Truncate before pretty-printing so ANSI escapes can't land mid-cut.\n if (opts.maxLines != null && opts.maxLines > 0) {\n body = truncateBody(body, opts.maxLines);\n }\n\n if (opts.pretty) {\n body = opts.pretty(body);\n }\n\n if (opts.raw) return body;\n\n const hashes = \"#\".repeat(sec.level);\n const header = `── ${file} L${start}-${end} ${hashes} ${sec.title} ${\"─\".repeat(8)}`;\n const footer = `── end ${\"─\".repeat(40)}`;\n return `${header}\\n${body}\\n${footer}`;\n}\n\n/**\n * Cut `body` to the first `maxLines` lines. If anything was dropped, append\n * a marker line telling the agent how to get the rest. `maxLines <= 0`\n * disables truncation.\n */\nexport function truncateBody(body: string, maxLines: number): string {\n if (maxLines <= 0) return body;\n const lines = body.split(\"\\n\");\n if (lines.length <= maxLines) return body;\n const kept = lines.slice(0, maxLines).join(\"\\n\");\n const remaining = lines.length - maxLines;\n return `${kept}\\n\\n… ${remaining} more lines (use --max-lines=0 for full)`;\n}\n\nfunction findFirstChild(sec: Section, all: Section[]): Section | null {\n for (const candidate of all) {\n if (candidate.parent === sec) return candidate;\n }\n return null;\n}\n",
10
+ "/**\n * Pretty printing for `lilmd read --pretty`. Lazy-loads marked +\n * marked-terminal on first use so the default (plain-text) path keeps its\n * ~16ms cold start.\n */\n\nexport type PrettyFormatter = (markdown: string) => string;\n\nlet formatterPromise: Promise<PrettyFormatter> | null = null;\n\n/** Returns a cached formatter, importing marked + marked-terminal on first call. */\nexport function loadPrettyFormatter(): Promise<PrettyFormatter> {\n return (formatterPromise ??= buildFormatter());\n}\n\nasync function buildFormatter(): Promise<PrettyFormatter> {\n const [{ marked }, { markedTerminal }] = await Promise.all([\n import(\"marked\"),\n import(\"marked-terminal\"),\n ]);\n\n marked.use(\n markedTerminal({\n reflowText: false,\n tab: 2,\n // Unicode em-dash for <hr>, matches lilmd's delimiter style.\n hr: \"─\",\n }),\n );\n\n return (md: string) => {\n // highlight.js hits console.error when it sees an unknown code-fence\n // language like `js-nolint` (common in MDN). The throw is swallowed by\n // marked-terminal, but the stderr line isn't — mute that one channel\n // for the duration of the parse.\n const originalError = console.error;\n console.error = (...args: unknown[]) => {\n if (\n typeof args[0] === \"string\" &&\n /Could not find the language/i.test(args[0])\n )\n return;\n originalError.apply(console, args as Parameters<typeof console.error>);\n };\n let rendered: string | Promise<string>;\n try {\n rendered = marked.parse(md);\n } finally {\n console.error = originalError;\n }\n if (typeof rendered !== \"string\") {\n throw new Error(\"lilmd: pretty renderer returned a Promise unexpectedly\");\n }\n // marked-terminal appends a trailing newline; trim so delimiter spacing\n // matches the plain path.\n return rendered.replace(/\\n+$/, \"\");\n };\n}\n",
11
+ "#!/usr/bin/env node\nimport { run } from \"../src/cli\";\n\nconst result = await run(process.argv.slice(2));\nif (result.stdout) {\n process.stdout.write(\n result.stdout.endsWith(\"\\n\") ? result.stdout : result.stdout + \"\\n\",\n );\n}\nif (result.stderr) process.stderr.write(result.stderr);\nprocess.exit(result.code);\n"
12
+ ],
13
+ "mappings": ";;;;;AA0BA;AACA;;;ACYO,SAAS,IAAI,CAAC,KAAwB;AAAA,EAC3C,MAAM,MAAiB,CAAC;AAAA,EACxB,MAAM,MAAM,IAAI;AAAA,EAEhB,IAAI,IAAI;AAAA,EACR,IAAI,SAAS;AAAA,EAIb,IAAI,UAAU;AAAA,EACd,IAAI,YAAY;AAAA,EAChB,IAAI,WAAW;AAAA,EAEf,OAAO,KAAK,KAAK;AAAA,IAGf,MAAM,QAAQ;AAAA,IACd,OAAO,IAAI,OAAO,IAAI,WAAW,CAAC,MAAM;AAAA,MAAa;AAAA,IACrD,IAAI,OAAO,IAAI,MAAM,OAAO,CAAC;AAAA,IAC7B,IAAI,KAAK,SAAS,KAAK,KAAK,WAAW,KAAK,SAAS,CAAC,MAAM,IAAa;AAAA,MACvE,OAAO,KAAK,MAAM,GAAG,KAAK,SAAS,CAAC;AAAA,IACtC;AAAA,IACA;AAAA,IAEA,MAAM,QAAQ,WAAW,IAAI;AAAA,IAC7B,IAAI,OAAO;AAAA,MACT,IAAI,CAAC,SAAS;AAAA,QACZ,UAAU;AAAA,QACV,YAAY,MAAM;AAAA,QAClB,WAAW,MAAM;AAAA,MACnB,EAAO,SAAI,MAAM,SAAS,aAAa,MAAM,OAAO,UAAU;AAAA,QAC5D,UAAU;AAAA,MACZ;AAAA,IACF,EAAO,SAAI,CAAC,SAAS;AAAA,MACnB,MAAM,IAAI,aAAa,MAAM,MAAM;AAAA,MACnC,IAAI;AAAA,QAAG,IAAI,KAAK,CAAC;AAAA,IACnB;AAAA,IAEA,IAAI,KAAK;AAAA,MAAK;AAAA,IACd;AAAA,EACF;AAAA,EAEA,OAAO;AAAA;AAWT,SAAS,UAAU,CAAC,MAAoD;AAAA,EAEtE,IAAI,IAAI;AAAA,EACR,OAAO,IAAI,KAAK,KAAK,WAAW,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,MAAM,KAAK,KAAK,WAAW,CAAC;AAAA,EAC5B,IAAI,OAAO,MAAc,OAAO;AAAA,IAAa,OAAO;AAAA,EACpD,IAAI,MAAM;AAAA,EACV,OAAO,KAAK,WAAW,IAAI,GAAG,MAAM;AAAA,IAAI;AAAA,EACxC,IAAI,MAAM;AAAA,IAAG,OAAO;AAAA,EAGpB,OAAO,EAAE,MAAM,IAAI,KAAK,IAAI;AAAA;AAY9B,SAAS,YAAY,CAAC,MAAc,QAAgC;AAAA,EAElE,IAAI,IAAI;AAAA,EACR,OAAO,IAAI,KAAK,KAAK,WAAW,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,IAAI,KAAK,WAAW,CAAC,MAAM;AAAA,IAAY,OAAO;AAAA,EAE9C,IAAI,SAAS;AAAA,EACb,OAAO,KAAK,WAAW,IAAI,MAAM,MAAM;AAAA,IAAI;AAAA,EAC3C,IAAI,SAAS,KAAK,SAAS;AAAA,IAAG,OAAO;AAAA,EAErC,MAAM,QAAQ,IAAI;AAAA,EAClB,MAAM,UAAU,KAAK,WAAW,KAAK;AAAA,EAIrC,IAAI,QAAQ,KAAK,UAAU,YAAY,MAAM,YAAY,GAAY;AAAA,IACnE,OAAO;AAAA,EACT;AAAA,EAGA,IAAI,eAAe;AAAA,EACnB,OACE,eAAe,KAAK,WACnB,KAAK,WAAW,YAAY,MAAM,MAAM,KAAK,WAAW,YAAY,MAAM,IAC3E;AAAA,IACA;AAAA,EACF;AAAA,EAEA,IAAI,MAAM,KAAK;AAAA,EAEf,OACE,MAAM,iBACL,KAAK,WAAW,MAAM,CAAC,MAAM,MAAM,KAAK,WAAW,MAAM,CAAC,MAAM,IACjE;AAAA,IACA;AAAA,EACF;AAAA,EAGA,IAAI,UAAU;AAAA,EACd,OAAO,UAAU,gBAAgB,KAAK,WAAW,UAAU,CAAC,MAAM;AAAA,IAAI;AAAA,EACtE,IACE,UAAU,QACT,
YAAY,gBACX,KAAK,WAAW,UAAU,CAAC,MAAM,MACjC,KAAK,WAAW,UAAU,CAAC,MAAM,IACnC;AAAA,IACA,MAAM;AAAA,IACN,OACE,MAAM,iBACL,KAAK,WAAW,MAAM,CAAC,MAAM,MAAM,KAAK,WAAW,MAAM,CAAC,MAAM,IACjE;AAAA,MACA;AAAA,IACF;AAAA,EACF;AAAA,EAEA,MAAM,QAAQ,KAAK,MAAM,cAAc,GAAG;AAAA,EAC1C,OAAO,EAAE,OAAO,QAAQ,OAAO,MAAM,OAAO;AAAA;;;AC7IvC,SAAS,aAAa,CAAC,UAAqB,YAA+B;AAAA,EAChF,MAAM,MAAiB,CAAC;AAAA,EAExB,MAAM,QAAmB,CAAC;AAAA,EAE1B,WAAW,KAAK,UAAU;AAAA,IAGxB,OAAO,MAAM,SAAS,KAAK,MAAM,MAAM,SAAS,GAAI,SAAS,EAAE,OAAO;AAAA,MACpE,MAAM,UAAU,MAAM,IAAI;AAAA,MAC1B,QAAQ,WAAW,EAAE,OAAO;AAAA,IAC9B;AAAA,IACA,MAAM,SAAS,MAAM,SAAS,IAAI,MAAM,MAAM,SAAS,KAAM;AAAA,IAE7D,MAAM,MAAe;AAAA,MACnB,OAAO,EAAE;AAAA,MACT,OAAO,EAAE;AAAA,MACT,YAAY,EAAE;AAAA,MAEd,UAAU;AAAA,MACV;AAAA,IACF;AAAA,IACA,IAAI,KAAK,GAAG;AAAA,IACZ,MAAM,KAAK,GAAG;AAAA,EAChB;AAAA,EAEA,OAAO;AAAA;AAOF,SAAS,MAAM,CAAC,KAAwB;AAAA,EAC7C,MAAM,OAAiB,CAAC;AAAA,EACxB,IAAI,MAAM,IAAI;AAAA,EACd,OAAO,KAAK;AAAA,IACV,KAAK,KAAK,IAAI,KAAK;AAAA,IACnB,MAAM,IAAI;AAAA,EACZ;AAAA,EACA,OAAO,KAAK,QAAQ;AAAA;AAQf,SAAS,UAAU,CAAC,KAAqB;AAAA,EAC9C,IAAI,IAAI,WAAW;AAAA,IAAG,OAAO;AAAA,EAC7B,IAAI,IAAI;AAAA,EACR,SAAS,IAAI,EAAG,IAAI,IAAI,QAAQ,KAAK;AAAA,IACnC,IAAI,IAAI,WAAW,CAAC,MAAM;AAAA,MAAI;AAAA,EAChC;AAAA,EAGA,IAAI,IAAI,WAAW,IAAI,SAAS,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,OAAO;AAAA;;;ACjDF,SAAS,aAAa,CAAC,OAA0B;AAAA,EACtD,MAAM,UAAU,MAAM,KAAK;AAAA,EAC3B,IAAI,QAAQ,WAAW;AAAA,IAAG,OAAO,CAAC;AAAA,EAYlC,MAAM,cAAwB,CAAC;AAAA,EAC/B,MAAM,MAAY,CAAC,YAAY;AAAA,EAC/B,IAAI,MAAM;AAAA,EACV,IAAI,IAAI;AAAA,EACR,IAAI,UAAU;AAAA,EACd,IAAI,iBAAiB;AAAA,EACrB,OAAO,IAAI,QAAQ,QAAQ;AAAA,IACzB,MAAM,KAAK,QAAQ;AAAA,IACnB,IAAI,OAAO,QAAQ,kBAAkB,UAAU;AAAA,MAC7C,UAAU,CAAC;AAAA,MACX,OAAO;AAAA,MACP,iBAAiB;AAAA,MACjB;AAAA,MACA;AAAA,IACF;AAAA,IACA,IAAI,CAAC,WAAW,OAAO,KAAK;AAAA,MAC1B,YAAY,KAAK,IAAI,KAAK,CAAC;AAAA,MAC3B,MAAM;AAAA,MACN,iBAAiB;AAAA,MACjB,IAAI,QAAQ,IAAI,OAAO,KAAK;AAAA,QAC1B,IAAI,KAAK,OAAO;AAAA,QAChB,KAAK;AAAA,MACP,EAAO;AAAA,QACL,IAAI,KAAK,YAAY;AAAA,QACrB,KAAK;AAAA;AAAA,MAEP;AAAA,IACF;AAAA,IACA,OAAO;AAAA,IACP,IAAI,OAAO,OAAO,OAAO;AAAA,M
AAM,iBAAiB;AAAA,IAChD;AAAA,EACF;AAAA,EACA,YAAY,KAAK,IAAI,KAAK,CAAC;AAAA,EAE3B,OAAO,YAAY,IAAI,CAAC,GAAG,QAAQ,aAAa,GAAG,IAAI,QAAQ,YAAY,CAAC;AAAA;AAG9E,SAAS,YAAY,CAAC,KAAa,IAAiB;AAAA,EAClD,IAAI,IAAI;AAAA,EACR,IAAI,QAAuB;AAAA,EAK3B,MAAM,aAAa,yBAAyB,KAAK,CAAC;AAAA,EAClD,IAAI,YAAY;AAAA,IACd,QAAQ,WAAW,GAAI;AAAA,IACvB,IAAI,WAAW,MAAM;AAAA,EACvB;AAAA,EAGA,MAAM,aAAa,wBAAwB,KAAK,CAAC;AAAA,EACjD,IAAI,YAAY;AAAA,IACd,MAAM,UAAU,WAAW;AAAA,IAC3B,MAAM,QAAQ,WAAW,MAAM;AAAA,IAC/B,OAAO;AAAA,MACL;AAAA,MACA;AAAA,MACA,MAAM;AAAA,MACN,OAAO;AAAA,MACP,OAAO,IAAI,OAAO,SAAS,KAAK;AAAA,IAClC;AAAA,EACF;AAAA,EAGA,IAAI,EAAE,WAAW,GAAG,GAAG;AAAA,IACrB,OAAO,EAAE,IAAI,OAAO,MAAM,SAAS,OAAO,EAAE,MAAM,CAAC,EAAE,KAAK,EAAE;AAAA,EAC9D;AAAA,EAEA,OAAO,EAAE,IAAI,OAAO,MAAM,SAAS,OAAO,EAAE,KAAK,EAAE;AAAA;AAG9C,SAAS,KAAK,CAAC,UAAqB,UAAgC;AAAA,EACzE,IAAI,SAAS,WAAW;AAAA,IAAG,OAAO,CAAC;AAAA,EACnC,MAAM,MAAiB,CAAC;AAAA,EACxB,WAAW,OAAO,UAAU;AAAA,IAC1B,IAAI,QAAQ,KAAK,QAAQ;AAAA,MAAG,IAAI,KAAK,GAAG;AAAA,EAC1C;AAAA,EACA,OAAO;AAAA;AAGT,SAAS,OAAO,CAAC,KAAc,MAA0B;AAAA,EAEvD,MAAM,OAAO,KAAK,KAAK,SAAS;AAAA,EAChC,IAAI,CAAC,QAAQ,CAAC,sBAAsB,MAAM,GAAG;AAAA,IAAG,OAAO;AAAA,EAGvD,IAAI,SAAyB,IAAI;AAAA,EACjC,SAAS,IAAI,KAAK,SAAS,EAAG,KAAK,GAAG,KAAK;AAAA,IAIzC,MAAM,KAAK,KAAK,IAAI,GAAI;AAAA,IACxB,MAAM,MAAM,KAAK;AAAA,IAEjB,IAAI,OAAO,SAAS;AAAA,MAClB,IAAI,CAAC,UAAU,CAAC,sBAAsB,KAAK,MAAM;AAAA,QAAG,OAAO;AAAA,MAC3D,SAAS,OAAO;AAAA,IAClB,EAAO;AAAA,MAEL,IAAI,QAAwB;AAAA,MAC5B,OAAO,QAAQ;AAAA,QACb,IAAI,sBAAsB,KAAK,MAAM,GAAG;AAAA,UACtC,QAAQ;AAAA,UACR;AAAA,QACF;AAAA,QACA,SAAS,OAAO;AAAA,MAClB;AAAA,MACA,IAAI,CAAC;AAAA,QAAO,OAAO;AAAA,MACnB,SAAS,MAAM;AAAA;AAAA,EAEnB;AAAA,EACA,OAAO;AAAA;AAGT,SAAS,qBAAqB,CAAC,KAAc,KAAuB;AAAA,EAClE,IAAI,IAAI,UAAU,QAAQ,IAAI,UAAU,IAAI;AAAA,IAAO,OAAO;AAAA,EAC1D,MAAM,QAAQ,IAAI;AAAA,EAClB,QAAQ,IAAI;AAAA,SACL;AAAA,MACH,OAAO,MAAM,YAAY,MAAM,IAAI,MAAM,YAAY;AAAA,SAClD;AAAA,MACH,OAAO,IAAI,MAAO,KAAK,KAAK;AAAA,SACzB;AAAA,MACH,OAAO,MAAM,YAAY,EAAE,SAAS,IAAI,MAAM,YAAY,CAAC;AAAA;AAAA;;;AC5J1D,SAAS,SAAS,CACvB,MACA,KACA,UACA,MACQ;AAAA,EACR,MAAM,aAAa,WAA
W,GAAG;AAAA,EACjC,MAAM,cAAc,SAAS;AAAA,EAC7B,MAAM,cAAc,eAAe,IAAI,OAAO,MAAM;AAAA,EACpD,MAAM,SAAS,gBAAgB,IAAI,YAAY;AAAA,EAE/C,MAAM,MAAgB,CAAC;AAAA,EACvB,IAAI,KAAK,GAAG,SAAS,gBAAgB,eAAe,QAAQ;AAAA,EAE5D,WAAW,OAAO,UAAU;AAAA,IAC1B,IAAI,KAAK,SAAS,QAAQ,IAAI,QAAQ,KAAK;AAAA,MAAO;AAAA,IAClD,MAAM,SAAS,KAAK,OAAO,KAAK,KAAK,OAAO,KAAK,IAAI,GAAG,IAAI,QAAQ,CAAC,CAAC;AAAA,IACtE,MAAM,SAAS,IAAI,OAAO,IAAI,KAAK;AAAA,IACnC,MAAM,QAAQ,IAAI,IAAI,cAAc,IAAI;AAAA,IACxC,IAAI,KAAK,GAAG,SAAS,UAAU,IAAI,UAAU,OAAO;AAAA,EACtD;AAAA,EACA,OAAO,IAAI,KAAK;AAAA,CAAI;AAAA;AAcf,SAAS,aAAa,CAC3B,MACA,UACA,KACA,MACQ;AAAA,EACR,MAAM,QAAQ,IAAI;AAAA,EAClB,IAAI,MAAM,IAAI;AAAA,EAEd,IAAI,KAAK,YAAY,KAAK,aAAa;AAAA,IACrC,MAAM,aAAa,eAAe,KAAK,KAAK,WAAW;AAAA,IACvD,IAAI;AAAA,MAAY,MAAM,WAAW,aAAa;AAAA,EAChD;AAAA,EAEA,IAAI,KAAK,QAAQ;AAAA,IACf,MAAM;AAAA,EACR;AAAA,EAIA,MAAM,aAAa,KAAK,IAAI,KAAK,SAAS,MAAM;AAAA,EAChD,IAAI,OAAO,SAAS,MAAM,QAAQ,GAAG,UAAU,EAAE,KAAK;AAAA,CAAI;AAAA,EAG1D,IAAI,KAAK,YAAY,QAAQ,KAAK,WAAW,GAAG;AAAA,IAC9C,OAAO,aAAa,MAAM,KAAK,QAAQ;AAAA,EACzC;AAAA,EAEA,IAAI,KAAK,QAAQ;AAAA,IACf,OAAO,KAAK,OAAO,IAAI;AAAA,EACzB;AAAA,EAEA,IAAI,KAAK;AAAA,IAAK,OAAO;AAAA,EAErB,MAAM,SAAS,IAAI,OAAO,IAAI,KAAK;AAAA,EACnC,MAAM,SAAS,MAAK,UAAU,SAAS,QAAQ,UAAU,IAAI,SAAS,IAAI,OAAO,CAAC;AAAA,EAClF,MAAM,SAAS,UAAS,IAAI,OAAO,EAAE;AAAA,EACrC,OAAO,GAAG;AAAA,EAAW;AAAA,EAAS;AAAA;AAQzB,SAAS,YAAY,CAAC,MAAc,UAA0B;AAAA,EACnE,IAAI,YAAY;AAAA,IAAG,OAAO;AAAA,EAC1B,MAAM,QAAQ,KAAK,MAAM;AAAA,CAAI;AAAA,EAC7B,IAAI,MAAM,UAAU;AAAA,IAAU,OAAO;AAAA,EACrC,MAAM,OAAO,MAAM,MAAM,GAAG,QAAQ,EAAE,KAAK;AAAA,CAAI;AAAA,EAC/C,MAAM,YAAY,MAAM,SAAS;AAAA,EACjC,OAAO,GAAG;AAAA;AAAA,IAAY;AAAA;AAGxB,SAAS,cAAc,CAAC,KAAc,KAAgC;AAAA,EACpE,WAAW,aAAa,KAAK;AAAA,IAC3B,IAAI,UAAU,WAAW;AAAA,MAAK,OAAO;AAAA,EACvC;AAAA,EACA,OAAO;AAAA;;;ACzGT,IAAI,mBAAoD;AAGjD,SAAS,mBAAmB,GAA6B;AAAA,EAC9D,OAAQ,qBAAqB,eAAe;AAAA;AAG9C,eAAe,cAAc,GAA6B;AAAA,EACxD,SAAS,YAAY,oBAAoB,MAAM,QAAQ,IAAI;AAAA,IAClD;AAAA,IACA;AAAA,EACT,CAAC;AAAA,EAED,OAAO,IACL,eAAe;AAAA,IACb,YAAY;AAAA,IACZ,KAAK;AAAA,IAEL,IAAI;AAAA,EACN,CAAC,CACH;AAAA,EAEA,O
AAO,CAAC,OAAe;AAAA,IAKrB,MAAM,gBAAgB,QAAQ;AAAA,IAC9B,QAAQ,QAAQ,IAAI,SAAoB;AAAA,MACtC,IACE,OAAO,KAAK,OAAO,YACnB,+BAA+B,KAAK,KAAK,EAAE;AAAA,QAE3C;AAAA,MACF,cAAc,MAAM,SAAS,IAAwC;AAAA;AAAA,IAEvE,IAAI;AAAA,IACJ,IAAI;AAAA,MACF,WAAW,OAAO,MAAM,EAAE;AAAA,cAC1B;AAAA,MACA,QAAQ,QAAQ;AAAA;AAAA,IAElB,IAAI,OAAO,aAAa,UAAU;AAAA,MAChC,MAAM,IAAI,MAAM,wDAAwD;AAAA,IAC1E;AAAA,IAGA,OAAO,SAAS,QAAQ,QAAQ,EAAE;AAAA;AAAA;;;ALpBtC,IAAM,OAAO;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAkCb,SAAS,EAAE,CAAC,GAAsB;AAAA,EAChC,OAAO,EAAE,MAAM,GAAG,QAAQ,GAAG,QAAQ,GAAG;AAAA;AAE1C,SAAS,OAAO,CAAC,GAAsB;AAAA,EAIrC,OAAO,EAAE,MAAM,GAAG,QAAQ,GAAG,QAAQ,GAAG;AAAA;AAE1C,SAAS,GAAG,CAAC,GAAW,OAAO,GAAc;AAAA,EAC3C,OAAO,EAAE,MAAM,QAAQ,IAAI,QAAQ,EAAE;AAAA;AAGvC,IAAM,UAAU;AAAA,EACd,OAAO,EAAE,MAAM,SAAS;AAAA,EACxB,MAAM,EAAE,MAAM,UAAU;AAAA,EACxB,eAAe,EAAE,MAAM,SAAS;AAAA,EAChC,aAAa,EAAE,MAAM,SAAS;AAAA,EAC9B,aAAa,EAAE,MAAM,UAAU;AAAA,EAC/B,WAAW,EAAE,MAAM,UAAU;AAAA,EAC7B,KAAK,EAAE,MAAM,UAAU;AAAA,EACvB,QAAQ,EAAE,MAAM,UAAU;AAAA,EAC1B,MAAM,EAAE,MAAM,UAAU;AAAA,EACxB,MAAM,EAAE,MAAM,WAAW,OAAO,IAAI;AACtC;AAGA,eAAsB,GAAG,CAAC,MAAoC;AAAA,EAC5D,IAAI;AAAA,EACJ,IAAI;AAAA,IACF,SAAS,UAAU;AAAA,MACjB,MAAM;AAAA,MACN,SAAS;AAAA,MACT,kBAAkB;AAAA,MAClB,QAAQ;AAAA,IACV,CAAC;AAAA,IACD,OAAO,GAAG;AAAA,IACV,OAAO,IAAI,UAAW,EAAY;AAAA,EAAY,QAAQ,CAAC;AAAA;AAAA,EAGzD,QAAQ,QAAQ,gBAAgB;AAAA,EAChC,IAAI,OAAO,QAAQ,YAAY,WAAW,GAAG;AAAA,IAC3C,OAAO,GAAG,IAAI;AAAA,EAChB;AAAA,EAGA,MAAM,OAAO,YAAY;AAAA,EACzB,IAAI,SAAS,UAAU,SAAS,QAAQ,SAAS,UAAU,SAAS,OAAO;AAAA,IACzE,OAAO,SAAS,MAAM,YAAY,MAAM,CAAC,GAAG,MAAM;AAAA,EACpD;AAAA,EAKA,IAAI,YAAY,WAAW;AAAA,IAAG,OAAO,SAAS,OAAO,aAAa,MAAM;AAAA,EACxE,OAAO,SAAS,QAAQ,aAAa,MAAM;AAAA;AAK7C,eAAe,QAAQ,CACrB,KACA,MACA,QACoB;AAAA,EACpB,QAAQ;AAAA,SACD;AAAA,MACH,OAAO,OAAO,MAAM,MAAM;AAAA,SACvB;AAAA,MACH,OAAO,QAAQ,MAAM,MAAM;AAAA,SACxB;AAAA,MACH,OAAO,MAAM,MAAM,MAAM;AAAA,SACtB;AAAA,MACH,OAAO,QAAQ,MAAM,MAAM;AAAA;AAAA,MAE3B,OAAO,IAAI,2BAA2B;AAAA,EAA
S,QAAQ,CAAC;AAAA;AAAA;AAS9D,SAAS,QAAQ,CAAC,MAA2C;AAAA,EAC3D,IAAI;AAAA,IACF,MAAM,MAAM,SAAS,MAAM,aAAa,GAAG,MAAM,IAAI,aAAa,MAAM,MAAM;AAAA,IAC9E,OAAO,EAAE,IAAI;AAAA,IACb,OAAO,GAAG;AAAA,IACV,MAAM,MAAO,EAA4B,SAAS,WAC9C,uBAAuB;AAAA,IACvB,uBAAuB,UAAW,EAAY;AAAA;AAAA,IAClD,OAAO,IAAI,KAAK,CAAC;AAAA;AAAA;AAQrB,SAAS,cAAc,CAAC,GAAsC;AAAA,EAC5D,IAAI,KAAK;AAAA,IAAM,OAAO;AAAA,EACtB,MAAM,IAAI,OAAO,SAAS,GAAG,EAAE;AAAA,EAC/B,IAAI,OAAO,MAAM,CAAC,KAAK,OAAO,CAAC,MAAM,EAAE,KAAK;AAAA,IAAG,OAAO;AAAA,EACtD,OAAO;AAAA;AAGT,SAAS,QAAQ,CACf,GACA,MACA,UACsC;AAAA,EACtC,MAAM,MAAM,EAAE;AAAA,EACd,IAAI,OAAO;AAAA,IAAM,OAAO,EAAE,OAAO,SAAS;AAAA,EAC1C,MAAM,IAAI,eAAe,GAAG;AAAA,EAC5B,IAAI,KAAK,QAAQ,IAAI,GAAG;AAAA,IACtB,OAAO,IAAI,YAAY,6CAA6C;AAAA,GAAU,CAAC;AAAA,EACjF;AAAA,EACA,OAAO,EAAE,OAAO,EAAE;AAAA;AAKpB,SAAS,MAAM,CAAC,MAAgB,GAAsB;AAAA,EACpD,MAAM,OAAO,KAAK;AAAA,EAClB,IAAI,QAAQ;AAAA,IAAM,OAAO,IAAI;AAAA,GAA+B,CAAC;AAAA,EAC7D,MAAM,SAAS,SAAS,IAAI;AAAA,EAC5B,IAAI,UAAU;AAAA,IAAQ,OAAO;AAAA,EAC7B,QAAQ,QAAQ;AAAA,EAChB,MAAM,WAAW,cAAc,KAAK,GAAG,GAAG,WAAW,GAAG,CAAC;AAAA,EAEzD,IAAI,EAAE,MAAM;AAAA,IACV,OAAO,GACL,KAAK,UACH;AAAA,MACE;AAAA,MACA,aAAa,WAAW,GAAG;AAAA,MAC3B,UAAU,SAAS,IAAI,aAAa;AAAA,IACtC,GACA,MACA,CACF,CACF;AAAA,EACF;AAAA,EAEA,MAAM,QAAQ,SAAS,GAAG,SAAS,IAAI;AAAA,EACvC,IAAI,UAAU;AAAA,IAAO,OAAO;AAAA,EAC5B,OAAO,GACL,UAAU,MAAM,KAAK,UAAU;AAAA,IAC7B,OAAO,MAAM,SAAS;AAAA,IACtB,MAAM,CAAC,CAAC,EAAE;AAAA,EACZ,CAAC,CACH;AAAA;AAGF,eAAe,OAAO,CAAC,MAAgB,GAA+B;AAAA,EACpE,MAAM,OAAO,KAAK;AAAA,EAClB,MAAM,cAAc,KAAK;AAAA,EACzB,IAAI,QAAQ,QAAQ,eAAe,MAAM;AAAA,IACvC,OAAO,IAAI;AAAA,GAA8C,CAAC;AAAA,EAC5D;AAAA,EAGA,IAAI,EAAE,UAAU,EAAE,MAAM;AAAA,IACtB,OAAO,IAAI;AAAA,GAAyD,CAAC;AAAA,EACvE;AAAA,EACA,MAAM,SAAS,SAAS,IAAI;AAAA,EAC5B,IAAI,UAAU;AAAA,IAAQ,OAAO;AAAA,EAC7B,QAAQ,QAAQ;AAAA,EAChB,MAAM,WAAW,cAAc,KAAK,GAAG,GAAG,WAAW,GAAG,CAAC;AAAA,EAEzD,MAAM,aAAa,SAAS,GAAG,eAAe,EAAE;AAAA,EAChD,IAAI,UAAU;AAAA,IAAY,OAAO;AAAA,EACjC,MAAM,WAAW,SAAS,GAAG,aAAa,CAAC;AAAA,EAC3C,IAAI,UAAU;AAAA,IAAU,OAAO;AAAA,EAE/B,MAAM,WAAW,cAAc,WAAW;AAAA,EAC1C,MAAM,WAAU,MAAM,UAAU,
QAAQ;AAAA,EAKxC,MAAM,WAAW,IAAI,MAAM;AAAA,CAAI;AAAA,EAE/B,IAAI,EAAE,MAAM;AAAA,IACV,OAAO,aACL,MACA,UACA,UACA,UACA,WAAW,SAAS,IACpB,SAAS,SAAS,GAClB,CACF;AAAA,EACF;AAAA,EAEA,IAAI,SAAQ,WAAW;AAAA,IAAG,OAAO,QAAQ;AAAA,CAAc;AAAA,EAEvD,IAAI;AAAA,EACJ,IAAI,EAAE,QAAQ;AAAA,IACZ,IAAI;AAAA,MACF,SAAS,MAAM,oBAAoB;AAAA,MACnC,OAAO,GAAG;AAAA,MACV,OAAO,IAAI,eAAgB,EAAY;AAAA,GAAa,CAAC;AAAA;AAAA,EAEzD;AAAA,EAEA,MAAM,MAAM,WAAW,SAAS;AAAA,EAChC,MAAM,UAAU,SAAQ,MAAM,GAAG,GAAG;AAAA,EACpC,MAAM,MAAgB,CAAC;AAAA,EACvB,IAAI,SAAQ,SAAS,KAAK;AAAA,IACxB,IAAI,KACF,GAAG,SAAQ,iCAAiC,4CAC9C;AAAA,EACF;AAAA,EACA,WAAW,OAAO,SAAS;AAAA,IACzB,IAAI,KACF,cAAc,MAAM,UAAU,KAAK;AAAA,MACjC,UAAU,CAAC,CAAC,EAAE;AAAA,MACd,QAAQ,CAAC,CAAC,EAAE;AAAA,MACZ,KAAK,CAAC,CAAC,EAAE;AAAA,MACT,UAAU,SAAS,SAAS;AAAA,MAC5B,aAAa;AAAA,MACb;AAAA,IACF,CAAC,CACH;AAAA,EACF;AAAA,EACA,OAAO,GAAG,IAAI,KAAK;AAAA,CAAI,CAAC;AAAA;AAG1B,SAAS,YAAY,CACnB,MACA,UACA,KACA,UACA,YACA,UACA,GACW;AAAA,EACX,MAAM,OAAO,KAAK,UAChB;AAAA,IACE;AAAA,IACA,SAAS,SAAQ,MAAM,GAAG,UAAU,EAAE,IAAI,CAAC,OAAO;AAAA,SAC7C,cAAc,CAAC;AAAA,MAClB,MAAM,EAAE,aACJ,KACA,cAAc,UAAU,UAAU,GAAG,KAAK,CAAC,CAAC,EAAE,YAAY,GAAG,QAAQ;AAAA,IAC3E,EAAE;AAAA,IACF,WAAW,SAAQ,SAAS;AAAA,EAC9B,GACA,MACA,CACF;AAAA,EAEA,OAAO,SAAQ,WAAW,IAAI,EAAE,MAAM,GAAG,QAAQ,MAAM,QAAQ,GAAG,IAAI,GAAG,IAAI;AAAA;AAG/E,SAAS,KAAK,CAAC,MAAgB,GAAsB;AAAA,EACnD,MAAM,OAAO,KAAK;AAAA,EAClB,MAAM,cAAc,KAAK;AAAA,EACzB,IAAI,QAAQ,QAAQ,eAAe,MAAM;AAAA,IACvC,OAAO,IAAI;AAAA,GAA4C,CAAC;AAAA,EAC1D;AAAA,EACA,MAAM,SAAS,SAAS,IAAI;AAAA,EAC5B,IAAI,UAAU;AAAA,IAAQ,OAAO;AAAA,EAC7B,QAAQ,QAAQ;AAAA,EAChB,MAAM,WAAW,cAAc,KAAK,GAAG,GAAG,WAAW,GAAG,CAAC;AAAA,EAEzD,MAAM,aAAa,SAAS,GAAG,eAAe,EAAE;AAAA,EAChD,IAAI,UAAU;AAAA,IAAY,OAAO;AAAA,EACjC,MAAM,MAAM,WAAW,SAAS;AAAA,EAEhC,MAAM,WAAW,cAAc,WAAW;AAAA,EAC1C,MAAM,WAAU,MAAM,UAAU,QAAQ,EAAE,MAAM,GAAG,GAAG;AAAA,EAItD,MAAM,aAAa,IAAI;AAAA,EACvB,WAAW,OAAO,UAAU;AAAA,IAC1B,IAAI,IAAI,QAAQ;AAAA,MACd,MAAM,OAAO,WAAW,IAAI,IAAI,MAAM;AAAA,MACtC,IAAI;AAAA,QAAM,KAAK,KAAK,GAAG;AAAA,MAClB;AAAA,mBAAW,IAAI,IAAI,QAAQ,CAAC,GAAG,CAAC;AAAA,IACvC;AAAA,EA
CF;AAAA,EAEA,IAAI,EAAE,MAAM;AAAA,IACV,MAAM,UAAU,SAAQ,IAAI,CAAC,YAAY;AAAA,MACvC,QAAQ,cAAc,MAAM;AAAA,MAC5B,WAAW,WAAW,IAAI,MAAM,KAAK,CAAC,GAAG,IAAI,aAAa;AAAA,IAC5D,EAAE;AAAA,IACF,MAAM,OAAO,KAAK,UAAU,EAAE,MAAM,QAAQ,GAAG,MAAM,CAAC;AAAA,IACtD,OAAO,SAAQ,WAAW,IAAI,EAAE,MAAM,GAAG,QAAQ,MAAM,QAAQ,GAAG,IAAI,GAAG,IAAI;AAAA,EAC/E;AAAA,EAEA,IAAI,SAAQ,WAAW;AAAA,IAAG,OAAO,QAAQ;AAAA,CAAc;AAAA,EAEvD,MAAM,MAAgB,CAAC;AAAA,EACvB,WAAW,UAAU,UAAS;AAAA,IAC5B,MAAM,WAAW,WAAW,IAAI,MAAM,KAAK,CAAC;AAAA,IAC5C,IAAI,KACF,GAAG,IAAI,OAAO,OAAO,KAAK,KAAK,OAAO,WAAW,OAAO,cAAc,OAAO,UAC/E;AAAA,IACA,IAAI,SAAS,WAAW,GAAG;AAAA,MACzB,IAAI,KAAK,iBAAiB;AAAA,IAC5B,EAAO;AAAA,MACL,WAAW,KAAK,UAAU;AAAA,QACxB,IAAI,KACF,KAAK,IAAI,OAAO,EAAE,KAAK,KAAK,EAAE,WAAW,EAAE,cAAc,EAAE,UAC7D;AAAA,MACF;AAAA;AAAA,EAEJ;AAAA,EACA,OAAO,GAAG,IAAI,KAAK;AAAA,CAAI,CAAC;AAAA;AAG1B,SAAS,OAAO,CAAC,MAAgB,GAAsB;AAAA,EACrD,MAAM,OAAO,KAAK;AAAA,EAClB,MAAM,UAAU,KAAK;AAAA,EACrB,IAAI,QAAQ,QAAQ,WAAW,MAAM;AAAA,IACnC,OAAO,IAAI;AAAA,GAA6C,CAAC;AAAA,EAC3D;AAAA,EACA,MAAM,SAAS,SAAS,IAAI;AAAA,EAC5B,IAAI,UAAU;AAAA,IAAQ,OAAO;AAAA,EAC7B,QAAQ,QAAQ;AAAA,EAChB,MAAM,WAAW,cAAc,KAAK,GAAG,GAAG,WAAW,GAAG,CAAC;AAAA,EAEzD,IAAI;AAAA,EACJ,IAAI;AAAA,IACF,KAAK,IAAI,OAAO,OAAO;AAAA,IACvB,OAAO,GAAG;AAAA,IACV,OAAO,IAAI,8BAA+B,EAAY;AAAA,GAAa,CAAC;AAAA;AAAA,EAGtE,MAAM,WAAW,IAAI,MAAM;AAAA,CAAI;AAAA,EAE/B,MAAM,OAAc,CAAC;AAAA,EAKrB,IAAI,SAAS;AAAA,EACb,SAAS,SAAS,EAAG,UAAU,SAAS,QAAQ,UAAU;AAAA,IACxD,OACE,SAAS,IAAI,SAAS,UACtB,SAAS,SAAS,GAAI,cAAc,QACpC;AAAA,MACA;AAAA,IACF;AAAA,IACA,MAAM,OAAO,SAAS,SAAS;AAAA,IAC/B,IAAI,GAAG,KAAK,IAAI,GAAG;AAAA,MACjB,MAAM,UAAU,UAAU,IAAI,SAAS,WAAW,OAAO;AAAA,MACzD,KAAK,KAAK,EAAE,SAAS,MAAM,QAAQ,MAAM,KAAK,CAAC;AAAA,IACjD;AAAA,EACF;AAAA,EAEA,IAAI,EAAE,MAAM;AAAA,IACV,MAAM,OAAO,KAAK,UAChB,KAAK,IAAI,CAAC,OAAO;AAAA,MACf;AAAA,MACA,MAAM,EAAE;AAAA,MACR,MAAM,EAAE;AAAA,MACR,SAAS,EAAE,UAAU,cAAc,EAAE,OAAO,IAAI;AAAA,IAClD,EAAE,GACF,MACA,CACF;AAAA,IACA,OAAO,KAAK,WAAW,IAAI,EAAE,MAAM,GAAG,QAAQ,MAAM,QAAQ,GAAG,IAAI,GAAG,IAAI;AAAA,EAC5E;AAAA,EAEA,IAAI,KAAK,WAAW;AAAA,IAAG,OAAO,QAAQ;A
AAA,CAAc;AAAA,EAGpD,MAAM,MAAgB,CAAC;AAAA,EACvB,IAAI,cAA0C;AAAA,EAC9C,WAAW,OAAO,MAAM;AAAA,IACtB,IAAI,IAAI,YAAY,aAAa;AAAA,MAC/B,IAAI,IAAI,SAAS;AAAA,QACf,MAAM,OAAO,OAAO,IAAI,OAAO,EAAE,OAAO,IAAI,QAAQ,KAAK,EAAE,KAAK,KAAK;AAAA,QACrE,IAAI,KACF,MAAK,UAAU,IAAI,QAAQ,cAAc,IAAI,QAAQ,UACvD;AAAA,MACF,EAAO;AAAA,QACL,IAAI,KAAK,MAAK,8BAA8B;AAAA;AAAA,MAE9C,cAAc,IAAI;AAAA,IACpB;AAAA,IACA,IAAI,KAAK,MAAM,IAAI,UAAU,IAAI,MAAM;AAAA,EACzC;AAAA,EACA,OAAO,GAAG,IAAI,KAAK;AAAA,CAAI,CAAC;AAAA;AAK1B,SAAS,SAAS,CAChB,UACA,KACA,KACA,UACQ;AAAA,EACR,IAAI,MAAM,IAAI;AAAA,EACd,IAAI,UAAU;AAAA,IACZ,MAAM,aAAa,IAAI,KAAK,CAAC,MAAM,EAAE,WAAW,GAAG;AAAA,IACnD,IAAI;AAAA,MAAY,MAAM,WAAW,aAAa;AAAA,EAChD;AAAA,EACA,OAAO,SAAS,MAAM,IAAI,aAAa,GAAG,GAAG,EAAE,KAAK;AAAA,CAAI;AAAA;AAG1D,SAAS,aAAa,CAAC,MAAc,UAA0B;AAAA,EAC7D,OAAO,WAAW,IAAI,aAAa,MAAM,QAAQ,IAAI;AAAA;AAGvD,SAAS,aAAa,CAAC,KAAuC;AAAA,EAC5D,OAAO;AAAA,IACL,OAAO,IAAI;AAAA,IACX,OAAO,IAAI;AAAA,IACX,YAAY,IAAI;AAAA,IAChB,UAAU,IAAI;AAAA,IACd,MAAM,OAAO,GAAG;AAAA,EAClB;AAAA;;;AMreF,IAAM,SAAS,MAAM,IAAI,QAAQ,KAAK,MAAM,CAAC,CAAC;AAC9C,IAAI,OAAO,QAAQ;AAAA,EACjB,QAAQ,OAAO,MACb,OAAO,OAAO,SAAS;AAAA,CAAI,IAAI,OAAO,SAAS,OAAO,SAAS;AAAA,CACjE;AACF;AACA,IAAI,OAAO;AAAA,EAAQ,QAAQ,OAAO,MAAM,OAAO,MAAM;AACrD,QAAQ,KAAK,OAAO,IAAI;",
14
+ "debugId": "3F3B5F923576EA7C64756E2164756E21",
15
+ "names": []
16
+ }
package/package.json CHANGED
@@ -1,10 +1,10 @@
1
1
  {
2
2
  "name": "lilmd",
3
- "version": "0.1.0",
3
+ "version": "0.2.0",
4
4
  "description": "CLI for working with large MD files — Markdown is the new DB",
5
5
  "type": "module",
6
6
  "bin": {
7
- "mdq": "./dist/mdq.js"
7
+ "lilmd": "./dist/lilmd.js"
8
8
  },
9
9
  "main": "./dist/index.cjs",
10
10
  "module": "./dist/index.js",
package/dist/mdq.js.map DELETED
@@ -1,16 +0,0 @@
1
- {
2
- "version": 3,
3
- "sources": ["src/cli.ts", "src/scan.ts", "src/sections.ts", "src/select.ts", "src/render.ts", "src/pretty.ts", "bin/mdq.ts"],
4
- "sourcesContent": [
5
- "#!/usr/bin/env bun\n/**\n * mdq CLI — composes scan + sections + select + render into commands.\n *\n * Uses node:util.parseArgs only: zero runtime dependencies. Subcommand\n * dispatch is a tiny switch on argv[0]; that's about 20 lines of glue we'd\n * reinvent on top of any framework anyway, and the cold-start benchmark\n * (see BENCHMARK.md) showed cac and parseArgs tied at ~16ms.\n *\n * Public commands (MVP):\n * mdq [file] toc\n * mdq <file> <selector> alias for: mdq read\n * mdq read <file> <selector> read section(s) matching selector\n * mdq ls <file> <selector> direct children of matching section\n * mdq grep <file> <pattern> regex search inside section bodies\n * mdq --help | -h help\n *\n * File argument accepts `-` to read stdin.\n *\n * Exit codes:\n * 0 found something / toc printed / help\n * 1 ran successfully but found no matches (grep-style, so `&&` chaining\n * works the way agents expect)\n * 2 usage error (bad flag, missing file, invalid regex, file not found)\n */\n\nimport { parseArgs, type ParseArgsConfig } from \"node:util\";\nimport { readFileSync } from \"node:fs\";\n\nimport { scan } from \"./scan\";\nimport { buildSections, countLines, pathOf, type Section } from \"./sections\";\nimport { match, parseSelector } from \"./select\";\nimport { renderSection, renderToc, truncateBody } from \"./render\";\nimport { loadPrettyFormatter, type PrettyFormatter } from \"./pretty\";\n\nconst HELP = `mdq — CLI for working with large Markdown files\n\nUsage:\n mdq show this help\n mdq <file> print table of contents\n mdq <file> <selector> alias for 'mdq read'\n mdq read <file> <selector> print sections matching selector\n mdq ls <file> <selector> list direct child headings\n mdq grep <file> <pattern> regex-search section bodies\n\nSelector grammar:\n Install fuzzy, case-insensitive substring\n =Install exact, case-insensitive equality\n /^inst/i regex (JS syntax); flags default to 'i'\n ##Install level filter (1..6 '#'s)\n Guide > Install 
descendant, any depth under 'Guide'\n Guide >> Install direct child of 'Guide'\n\nOptions:\n --depth <n> TOC: max heading depth to show (0 = none)\n --flat TOC: flat list, no indentation\n --max-results <n> cap matches for read/ls (default 25)\n --max-lines <n> truncate long bodies (0 = unlimited)\n --body-only read: skip subsections\n --no-body read: print headings only\n --raw read: drop delimiter lines\n --pretty read: render markdown with ANSI styling (for humans)\n --json machine-readable JSON output\n\nUse '-' as <file> to read from stdin. Exit code is 1 when no matches.\n`;\n\nexport type CliResult = { code: number; stdout: string; stderr: string };\n\nfunction ok(s: string): CliResult {\n return { code: 0, stdout: s, stderr: \"\" };\n}\nfunction noMatch(s: string): CliResult {\n // Successful run that found nothing: exit 1 so `mdq ... && foo` works the\n // way agents expect, but keep the friendly message on stdout so humans see\n // it too.\n return { code: 1, stdout: s, stderr: \"\" };\n}\nfunction err(s: string, code = 1): CliResult {\n return { code, stdout: \"\", stderr: s };\n}\n\nconst OPTIONS = {\n depth: { type: \"string\" },\n flat: { type: \"boolean\" },\n \"max-results\": { type: \"string\" },\n \"max-lines\": { type: \"string\" },\n \"body-only\": { type: \"boolean\" },\n \"no-body\": { type: \"boolean\" },\n raw: { type: \"boolean\" },\n pretty: { type: \"boolean\" },\n json: { type: \"boolean\" },\n help: { type: \"boolean\", short: \"h\" },\n} satisfies NonNullable<ParseArgsConfig[\"options\"]>;\n\n/** CLI entry point. Async because `--pretty` lazy-loads marked. 
*/\nexport async function run(argv: string[]): Promise<CliResult> {\n let parsed;\n try {\n parsed = parseArgs({\n args: argv,\n options: OPTIONS,\n allowPositionals: true,\n strict: true,\n });\n } catch (e) {\n return err(`mdq: ${(e as Error).message}\\n${HELP}`, 2);\n }\n\n const { values, positionals } = parsed;\n if (values.help || positionals.length === 0) {\n return ok(HELP);\n }\n\n // Detect explicit subcommand by the first positional.\n const head = positionals[0];\n if (head === \"read\" || head === \"ls\" || head === \"grep\" || head === \"toc\") {\n return dispatch(head, positionals.slice(1), values);\n }\n\n // Positional form:\n // mdq <file> -> toc\n // mdq <file> <selector> -> read\n if (positionals.length === 1) return dispatch(\"toc\", positionals, values);\n return dispatch(\"read\", positionals, values);\n}\n\ntype Values = ReturnType<typeof parseArgs<{ options: typeof OPTIONS }>>[\"values\"];\n\nasync function dispatch(\n cmd: string,\n rest: string[],\n values: Values,\n): Promise<CliResult> {\n switch (cmd) {\n case \"toc\":\n return cmdToc(rest, values);\n case \"read\":\n return cmdRead(rest, values);\n case \"ls\":\n return cmdLs(rest, values);\n case \"grep\":\n return cmdGrep(rest, values);\n default:\n return err(`mdq: unknown command '${cmd}'\\n${HELP}`, 2);\n }\n}\n\n/**\n * Read `file` from disk or stdin. Returns a CliResult on failure so callers\n * can just forward it — we swallow raw ENOENT stack traces here and emit a\n * friendly \"mdq: cannot open 'foo.md'\" message instead.\n */\nfunction loadFile(file: string): { src: string } | CliResult {\n try {\n const src = file === \"-\" ? readFileSync(0, \"utf8\") : readFileSync(file, \"utf8\");\n return { src };\n } catch (e) {\n const msg = (e as NodeJS.ErrnoException).code === \"ENOENT\"\n ? `mdq: cannot open '${file}': not found\\n`\n : `mdq: cannot open '${file}': ${(e as Error).message}\\n`;\n return err(msg, 2);\n }\n}\n\n/**\n * Parse `v` as a base-10 integer. 
Returns null (not fallback) on invalid\n * input so callers can distinguish \"not provided\" from \"bad value\".\n */\nfunction parseIntOrNull(v: string | undefined): number | null {\n if (v == null) return null;\n const n = Number.parseInt(v, 10);\n if (Number.isNaN(n) || String(n) !== v.trim()) return null;\n return n;\n}\n\nfunction readFlag(\n v: Values,\n name: \"depth\" | \"max-results\" | \"max-lines\",\n fallback: number | null,\n): { value: number | null } | CliResult {\n const raw = v[name];\n if (raw == null) return { value: fallback };\n const n = parseIntOrNull(raw);\n if (n == null || n < 0) {\n return err(`mdq: --${name} expects a non-negative integer, got '${raw}'\\n`, 2);\n }\n return { value: n };\n}\n\n// ---- commands ----------------------------------------------------------\n\nfunction cmdToc(rest: string[], v: Values): CliResult {\n const file = rest[0];\n if (file == null) return err(\"mdq toc: missing <file>\\n\", 2);\n const loaded = loadFile(file);\n if (\"code\" in loaded) return loaded;\n const { src } = loaded;\n const sections = buildSections(scan(src), countLines(src));\n\n if (v.json) {\n return ok(\n JSON.stringify(\n {\n file,\n total_lines: countLines(src),\n headings: sections.map(sectionToJSON),\n },\n null,\n 2,\n ),\n );\n }\n\n const depth = readFlag(v, \"depth\", null);\n if (\"code\" in depth) return depth;\n return ok(\n renderToc(file, src, sections, {\n depth: depth.value ?? 
undefined,\n flat: !!v.flat,\n }),\n );\n}\n\nasync function cmdRead(rest: string[], v: Values): Promise<CliResult> {\n const file = rest[0];\n const selectorStr = rest[1];\n if (file == null || selectorStr == null) {\n return err(\"mdq read: missing <file> or <selector>\\n\", 2);\n }\n // --pretty styles output with ANSI for humans; --json is for machines.\n // Reject the combo before we do any I/O.\n if (v.pretty && v.json) {\n return err(\"mdq read: --pretty cannot be combined with --json\\n\", 2);\n }\n const loaded = loadFile(file);\n if (\"code\" in loaded) return loaded;\n const { src } = loaded;\n const sections = buildSections(scan(src), countLines(src));\n\n const maxResults = readFlag(v, \"max-results\", 25);\n if (\"code\" in maxResults) return maxResults;\n const maxLines = readFlag(v, \"max-lines\", 0);\n if (\"code\" in maxLines) return maxLines;\n\n const selector = parseSelector(selectorStr);\n const matches = match(sections, selector);\n\n // Split source once and pass srcLines through to render. Previously\n // renderSection/sliceBody each re-split for every match, turning\n // `mdq read file.md sel` into O(matches × file_size).\n const srcLines = src.split(\"\\n\");\n\n if (v.json) {\n return emitReadJson(\n file,\n srcLines,\n sections,\n matches,\n maxResults.value ?? 25,\n maxLines.value ?? 0,\n v,\n );\n }\n\n if (matches.length === 0) return noMatch(\"(no match)\\n\");\n\n let pretty: PrettyFormatter | undefined;\n if (v.pretty) {\n try {\n pretty = await loadPrettyFormatter();\n } catch (e) {\n return err(`mdq read: ${(e as Error).message}\\n`, 2);\n }\n }\n\n const cap = maxResults.value ?? 25;\n const toPrint = matches.slice(0, cap);\n const out: string[] = [];\n if (matches.length > cap) {\n out.push(\n `${matches.length} matches, showing first ${cap}. 
Use --max-results=N to raise the cap.`,\n );\n }\n for (const sec of toPrint) {\n out.push(\n renderSection(file, srcLines, sec, {\n bodyOnly: !!v[\"body-only\"],\n noBody: !!v[\"no-body\"],\n raw: !!v.raw,\n maxLines: maxLines.value ?? 0,\n allSections: sections,\n pretty,\n }),\n );\n }\n return ok(out.join(\"\\n\"));\n}\n\nfunction emitReadJson(\n file: string,\n srcLines: string[],\n all: Section[],\n matches: Section[],\n maxResults: number,\n maxLines: number,\n v: Values,\n): CliResult {\n const body = JSON.stringify(\n {\n file,\n matches: matches.slice(0, maxResults).map((s) => ({\n ...sectionToJSON(s),\n body: v[\"no-body\"]\n ? \"\"\n : maybeTruncate(sliceBody(srcLines, s, all, !!v[\"body-only\"]), maxLines),\n })),\n truncated: matches.length > maxResults,\n },\n null,\n 2,\n );\n // JSON path keeps exit 0 regardless — the shape itself signals no matches.\n return matches.length === 0 ? { code: 1, stdout: body, stderr: \"\" } : ok(body);\n}\n\nfunction cmdLs(rest: string[], v: Values): CliResult {\n const file = rest[0];\n const selectorStr = rest[1];\n if (file == null || selectorStr == null) {\n return err(\"mdq ls: missing <file> or <selector>\\n\", 2);\n }\n const loaded = loadFile(file);\n if (\"code\" in loaded) return loaded;\n const { src } = loaded;\n const sections = buildSections(scan(src), countLines(src));\n\n const maxResults = readFlag(v, \"max-results\", 25);\n if (\"code\" in maxResults) return maxResults;\n const cap = maxResults.value ?? 
25;\n\n const selector = parseSelector(selectorStr);\n const matches = match(sections, selector).slice(0, cap);\n\n // Index children by parent once so we don't repeat an O(n) filter per\n // match — inexpensive in practice but easy to do right.\n const childrenOf = new Map<Section, Section[]>();\n for (const sec of sections) {\n if (sec.parent) {\n const list = childrenOf.get(sec.parent);\n if (list) list.push(sec);\n else childrenOf.set(sec.parent, [sec]);\n }\n }\n\n if (v.json) {\n const results = matches.map((parent) => ({\n parent: sectionToJSON(parent),\n children: (childrenOf.get(parent) ?? []).map(sectionToJSON),\n }));\n const body = JSON.stringify({ file, results }, null, 2);\n return matches.length === 0 ? { code: 1, stdout: body, stderr: \"\" } : ok(body);\n }\n\n if (matches.length === 0) return noMatch(\"(no match)\\n\");\n\n const out: string[] = [];\n for (const parent of matches) {\n const children = childrenOf.get(parent) ?? [];\n out.push(\n `${\"#\".repeat(parent.level)} ${parent.title} L${parent.line_start}-${parent.line_end}`,\n );\n if (children.length === 0) {\n out.push(\" (no children)\");\n } else {\n for (const c of children) {\n out.push(\n ` ${\"#\".repeat(c.level)} ${c.title} L${c.line_start}-${c.line_end}`,\n );\n }\n }\n }\n return ok(out.join(\"\\n\"));\n}\n\nfunction cmdGrep(rest: string[], v: Values): CliResult {\n const file = rest[0];\n const pattern = rest[1];\n if (file == null || pattern == null) {\n return err(\"mdq grep: missing <file> or <pattern>\\n\", 2);\n }\n const loaded = loadFile(file);\n if (\"code\" in loaded) return loaded;\n const { src } = loaded;\n const sections = buildSections(scan(src), countLines(src));\n\n let re: RegExp;\n try {\n re = new RegExp(pattern);\n } catch (e) {\n return err(`mdq grep: invalid regex: ${(e as Error).message}\\n`, 2);\n }\n\n const srcLines = src.split(\"\\n\");\n type Hit = { section: Section | null; line: number; text: string };\n const hits: Hit[] = [];\n\n // Walk all lines 
once and attribute each match to the innermost enclosing\n // section. Sections are in document order, so a monotonic cursor is enough.\n // Matches before the first heading get `section: null`.\n let secIdx = -1;\n for (let lineNo = 1; lineNo <= srcLines.length; lineNo++) {\n while (\n secIdx + 1 < sections.length &&\n sections[secIdx + 1]!.line_start <= lineNo\n ) {\n secIdx++;\n }\n const line = srcLines[lineNo - 1]!;\n if (re.test(line)) {\n const section = secIdx >= 0 ? sections[secIdx] ?? null : null;\n hits.push({ section, line: lineNo, text: line });\n }\n }\n\n if (v.json) {\n const body = JSON.stringify(\n hits.map((h) => ({\n file,\n line: h.line,\n text: h.text,\n section: h.section ? sectionToJSON(h.section) : null,\n })),\n null,\n 2,\n );\n return hits.length === 0 ? { code: 1, stdout: body, stderr: \"\" } : ok(body);\n }\n\n if (hits.length === 0) return noMatch(\"(no match)\\n\");\n\n // Group consecutive hits by section for readable output.\n const out: string[] = [];\n let lastSection: Section | null | undefined = undefined;\n for (const hit of hits) {\n if (hit.section !== lastSection) {\n if (hit.section) {\n const path = pathOf(hit.section).concat(hit.section.title).join(\" > \");\n out.push(\n `── ${path} L${hit.section.line_start}-${hit.section.line_end}`,\n );\n } else {\n out.push(`── ${file} (no enclosing heading)`);\n }\n lastSection = hit.section;\n }\n out.push(` L${hit.line}: ${hit.text}`);\n }\n return ok(out.join(\"\\n\"));\n}\n\n// ---- shared helpers ----------------------------------------------------\n\nfunction sliceBody(\n srcLines: string[],\n sec: Section,\n all: Section[],\n bodyOnly: boolean,\n): string {\n let end = sec.line_end;\n if (bodyOnly) {\n const firstChild = all.find((s) => s.parent === sec);\n if (firstChild) end = firstChild.line_start - 1;\n }\n return srcLines.slice(sec.line_start - 1, end).join(\"\\n\");\n}\n\nfunction maybeTruncate(body: string, maxLines: number): string {\n return maxLines > 0 ? 
truncateBody(body, maxLines) : body;\n}\n\nfunction sectionToJSON(sec: Section): Record<string, unknown> {\n return {\n level: sec.level,\n title: sec.title,\n line_start: sec.line_start,\n line_end: sec.line_end,\n path: pathOf(sec),\n };\n}\n\n// Entry point lives in bin/mdq.ts; this module only exports `run`.\n",
6
- "/**\n * Markdown heading scanner — the engine behind every read-path command.\n *\n * Instead of building a full CommonMark AST we walk the source line by line\n * and recognize only what `mdq` actually needs: ATX headings and fenced code\n * blocks (so `#` inside code doesn't count as a heading).\n *\n * Numbers on MDN content (see BENCHMARK.md): ~180 MB/s end-to-end on a\n * 10 MB fixture, roughly 7x faster than markdown-it and ~1000x faster than\n * mdast-util-from-markdown while returning the exact same section.\n *\n * Deliberate limitations:\n * - Setext headings (`===` / `---` underlines) are NOT recognized. mdq is\n * aimed at agent-authored markdown where ATX is ubiquitous.\n * - HTML blocks are not detected. A `<pre>` containing an ATX-looking line\n * would be misread as a heading. That's an acceptable tradeoff for 100x\n * speed; a future `--strict` flag could hand off to markdown-it.\n * - Fenced code blocks *inside a list item* that are indented 4+ spaces are\n * not recognized as fences — we only look at the first 3 columns for the\n * fence opener. A `# fake` line inside such a block would be scanned as a\n * heading. Rare in practice; document-your-way-out rather than fix.\n * - An unclosed fence at EOF leaves the scanner in \"still in fence\" state\n * to the end of the file, so any `#`-looking lines after it are ignored.\n * That's the conservative choice — prefer under-counting to over-counting.\n */\n\nexport type Heading = {\n /** 1..6 */\n level: number;\n /** Heading text with trailing closing hashes stripped. */\n title: string;\n /** 1-indexed line number. 
*/\n line: number;\n};\n\n/**\n * Return every ATX heading in `src`, in document order.\n * Runs in a single pass; O(n) in source length, O(headings) in space.\n */\nexport function scan(src: string): Heading[] {\n const out: Heading[] = [];\n const len = src.length;\n\n let i = 0;\n let lineNo = 0;\n\n // Fence state: when inFence is true every line is ignored until we see a\n // matching closing fence (same char, length >= opening length).\n let inFence = false;\n let fenceChar = 0; // charCode of ` or ~\n let fenceLen = 0;\n\n while (i <= len) {\n // Slice one line without the trailing newline. A trailing \\r from CRLF\n // is stripped below.\n const start = i;\n while (i < len && src.charCodeAt(i) !== 10 /* \\n */) i++;\n let line = src.slice(start, i);\n if (line.length > 0 && line.charCodeAt(line.length - 1) === 13 /* \\r */) {\n line = line.slice(0, line.length - 1);\n }\n lineNo++;\n\n const fence = matchFence(line);\n if (fence) {\n if (!inFence) {\n inFence = true;\n fenceChar = fence.char;\n fenceLen = fence.len;\n } else if (fence.char === fenceChar && fence.len >= fenceLen) {\n inFence = false;\n }\n } else if (!inFence) {\n const h = matchHeading(line, lineNo);\n if (h) out.push(h);\n }\n\n if (i >= len) break;\n i++; // skip the \\n\n }\n\n return out;\n}\n\n/**\n * If `line` opens or closes a fenced code block, return the fence char code\n * (` or ~) and the number of fence characters. Otherwise null.\n *\n * A fence is 0–3 spaces, then 3+ of a single fence char, then optional info\n * string. 
We only care about the opening run length; the info string is\n * ignored.\n */\nfunction matchFence(line: string): { char: number; len: number } | null {\n // Skip up to 3 leading spaces.\n let p = 0;\n while (p < 3 && line.charCodeAt(p) === 32) p++;\n const ch = line.charCodeAt(p);\n if (ch !== 96 /* ` */ && ch !== 126 /* ~ */) return null;\n let run = 0;\n while (line.charCodeAt(p + run) === ch) run++;\n if (run < 3) return null;\n // For backtick fences, CommonMark forbids backticks in the info string,\n // but we don't parse info; we only need to know this line is a fence.\n return { char: ch, len: run };\n}\n\n/**\n * If `line` is an ATX heading, return it. Otherwise null.\n *\n * Rules (CommonMark, simplified):\n * - 0–3 spaces of indent\n * - 1–6 `#`\n * - EITHER end-of-line OR a space/tab followed by content\n * - optional closing sequence: whitespace + trailing `#`s (stripped)\n */\nfunction matchHeading(line: string, lineNo: number): Heading | null {\n // Skip up to 3 leading spaces.\n let p = 0;\n while (p < 3 && line.charCodeAt(p) === 32) p++;\n if (line.charCodeAt(p) !== 35 /* # */) return null;\n\n let hashes = 0;\n while (line.charCodeAt(p + hashes) === 35) hashes++;\n if (hashes < 1 || hashes > 6) return null;\n\n const after = p + hashes;\n const afterCh = line.charCodeAt(after);\n\n // After the hashes we need either end-of-line or a space/tab. 
Anything else\n // (including `#` which is caught above by the hashes loop) disqualifies.\n if (after < line.length && afterCh !== 32 && afterCh !== 9 /* \\t */) {\n return null;\n }\n\n // Trim leading whitespace of content and trailing whitespace + closing #s.\n let contentStart = after;\n while (\n contentStart < line.length &&\n (line.charCodeAt(contentStart) === 32 || line.charCodeAt(contentStart) === 9)\n ) {\n contentStart++;\n }\n\n let end = line.length;\n // Trim trailing whitespace first.\n while (\n end > contentStart &&\n (line.charCodeAt(end - 1) === 32 || line.charCodeAt(end - 1) === 9)\n ) {\n end--;\n }\n // Strip closing `#`s only if they are preceded by whitespace (CommonMark\n // requires the closing sequence to be separated from the content).\n let closing = end;\n while (closing > contentStart && line.charCodeAt(closing - 1) === 35) closing--;\n if (\n closing < end &&\n (closing === contentStart ||\n line.charCodeAt(closing - 1) === 32 ||\n line.charCodeAt(closing - 1) === 9)\n ) {\n end = closing;\n while (\n end > contentStart &&\n (line.charCodeAt(end - 1) === 32 || line.charCodeAt(end - 1) === 9)\n ) {\n end--;\n }\n }\n\n const title = line.slice(contentStart, end);\n return { level: hashes, title, line: lineNo };\n}\n",
7
- "/**\n * Turn a flat list of headings into a section tree with line ranges and\n * parent links.\n *\n * Each Section covers its heading line through the line before the next\n * heading at the same-or-higher level (or the end of the file if none).\n * That's the \"whole subtree\" range — it includes the section's body *and*\n * its descendants. `--body-only` rendering is derived at render time.\n */\n\nimport type { Heading } from \"./scan\";\n\nexport type Section = {\n level: number;\n title: string;\n /** 1-indexed line of the heading itself. */\n line_start: number;\n /** 1-indexed inclusive end of the subtree. */\n line_end: number;\n /** Nearest enclosing section, or null for top-level. */\n parent: Section | null;\n};\n\n/**\n * Build the section tree in a single pass. Preserves document order.\n *\n * Runs in O(n): every section is pushed once and popped once, and we set\n * its `line_end` at pop time. Sections still on the stack when we run out\n * of headings keep their provisional `line_end = totalLines`.\n */\nexport function buildSections(headings: Heading[], totalLines: number): Section[] {\n const out: Section[] = [];\n /** Ancestors whose subtree is still open. */\n const stack: Section[] = [];\n\n for (const h of headings) {\n // Every section on the stack with the same-or-shallower level closes at\n // h.line - 1 (the line before the new heading).\n while (stack.length > 0 && stack[stack.length - 1]!.level >= h.level) {\n const closing = stack.pop()!;\n closing.line_end = h.line - 1;\n }\n const parent = stack.length > 0 ? stack[stack.length - 1]! 
: null;\n\n const sec: Section = {\n level: h.level,\n title: h.title,\n line_start: h.line,\n // Provisional: if nothing closes this section we leave it at totalLines.\n line_end: totalLines,\n parent,\n };\n out.push(sec);\n stack.push(sec);\n }\n\n return out;\n}\n\n/**\n * Walk `sec` up to the root, collecting ancestor titles in top-down order.\n * Returns [] for a root section.\n */\nexport function pathOf(sec: Section): string[] {\n const path: string[] = [];\n let cur = sec.parent;\n while (cur) {\n path.push(cur.title);\n cur = cur.parent;\n }\n return path.reverse();\n}\n\n/**\n * Count lines in a source string. Empty string is 0; otherwise every line\n * (including the last one, whether or not it ends with a newline) is 1.\n * A trailing newline does NOT add a phantom line.\n */\nexport function countLines(src: string): number {\n if (src.length === 0) return 0;\n let n = 1;\n for (let i = 0; i < src.length; i++) {\n if (src.charCodeAt(i) === 10) n++;\n }\n // If the source ends with a newline, the line-count should equal the\n // number of newlines (not newlines + 1) since the final \"line\" is empty.\n if (src.charCodeAt(src.length - 1) === 10) n--;\n return n;\n}\n",
8
- "/**\n * Selector grammar parser and matcher.\n *\n * SELECTOR := SEGMENT ( SEP SEGMENT )*\n * SEP := \">\" (descendant, any depth)\n * | \">>\" (direct child)\n * SEGMENT := LEVEL? MATCHER\n * LEVEL := \"#\"{1,6} (optional level filter)\n * MATCHER := TEXT (fuzzy, case-insensitive substring)\n * | \"=\" TEXT (exact, case-insensitive equality)\n * | \"/\" PATTERN \"/\" FLAGS? (JS regex; defaults to /.../i)\n *\n * Matching semantics:\n * - The *last* segment must match the candidate section itself.\n * - Earlier segments must match an ancestor chain walking upward from that\n * candidate, respecting each separator between them: `A >> B` requires A\n * to be B's *immediate* parent; `A > B` only requires A to be *some*\n * ancestor of B.\n */\n\nimport type { Section } from \"./sections\";\n\nexport type Op = \"descendant\" | \"child\";\nexport type Kind = \"fuzzy\" | \"exact\" | \"regex\";\n\nexport type Segment = {\n /** Operator that connects this segment to the *previous* one.\n * For the first segment this is always \"descendant\" (unused). */\n op: Op;\n /** Optional 1..6 level filter. */\n level: number | null;\n kind: Kind;\n /** The raw value (without level/kind prefix). */\n value: string;\n /** Present only for kind === \"regex\". */\n regex?: RegExp;\n};\n\nexport function parseSelector(input: string): Segment[] {\n const trimmed = input.trim();\n if (trimmed.length === 0) return [];\n\n // Split on > / >>. We walk the string character by character so we can\n // distinguish the two operators without confusing a `>>` with two\n // consecutive `>`s. We intentionally ignore `>` that appear inside a\n // regex delimiter pair because users may write `/a>b/`.\n //\n // `atSegmentStart` tracks whether the running buffer is still whitespace\n // only — only in that state can a `/` open a regex literal. 
Using\n // `cur.length === 0` instead is wrong because `>` splits leave the loop\n // pointing at a leading space that then lands in `cur` before the next\n // non-space char.\n const rawSegments: string[] = [];\n const ops: Op[] = [\"descendant\"];\n let cur = \"\";\n let i = 0;\n let inRegex = false;\n let atSegmentStart = true;\n while (i < trimmed.length) {\n const ch = trimmed[i];\n if (ch === \"/\" && (atSegmentStart || inRegex)) {\n inRegex = !inRegex;\n cur += ch;\n atSegmentStart = false;\n i++;\n continue;\n }\n if (!inRegex && ch === \">\") {\n rawSegments.push(cur.trim());\n cur = \"\";\n atSegmentStart = true;\n if (trimmed[i + 1] === \">\") {\n ops.push(\"child\");\n i += 2;\n } else {\n ops.push(\"descendant\");\n i += 1;\n }\n continue;\n }\n cur += ch;\n if (ch !== \" \" && ch !== \"\\t\") atSegmentStart = false;\n i++;\n }\n rawSegments.push(cur.trim());\n\n return rawSegments.map((s, idx) => parseSegment(s, ops[idx] ?? \"descendant\"));\n}\n\nfunction parseSegment(raw: string, op: Op): Segment {\n let s = raw;\n let level: number | null = null;\n\n // Level prefix — exactly 1..6 `#`s followed by something that is NOT\n // another `#`. The negative lookahead matters: without it, \"#######foo\"\n // would silently match level=6 value=\"#foo\".\n const levelMatch = /^(#{1,6})(?!#)\\s*(.*)$/.exec(s);\n if (levelMatch) {\n level = levelMatch[1]!.length;\n s = levelMatch[2] ?? 
\"\";\n }\n\n // Regex literal: /pattern/flags — flags default to \"i\".\n const regexMatch = /^\\/(.+)\\/([gimsuy]*)$/.exec(s);\n if (regexMatch) {\n const pattern = regexMatch[1]!;\n const flags = regexMatch[2] || \"i\";\n return {\n op,\n level,\n kind: \"regex\",\n value: pattern,\n regex: new RegExp(pattern, flags),\n };\n }\n\n // Exact match: =value.\n if (s.startsWith(\"=\")) {\n return { op, level, kind: \"exact\", value: s.slice(1).trim() };\n }\n\n return { op, level, kind: \"fuzzy\", value: s.trim() };\n}\n\nexport function match(sections: Section[], selector: Segment[]): Section[] {\n if (selector.length === 0) return [];\n const out: Section[] = [];\n for (const sec of sections) {\n if (matches(sec, selector)) out.push(sec);\n }\n return out;\n}\n\nfunction matches(sec: Section, segs: Segment[]): boolean {\n // Last segment matches the candidate itself.\n const last = segs[segs.length - 1];\n if (!last || !segmentMatchesSection(last, sec)) return false;\n\n // Walk the ancestor chain backward alongside the earlier segments.\n let cursor: Section | null = sec.parent;\n for (let i = segs.length - 2; i >= 0; i--) {\n // The separator BEFORE segs[i+1] is stored on segs[i+1].op; that's the\n // relationship we need to honor when walking from segs[i+1] back to\n // segs[i] in the ancestor chain.\n const op = segs[i + 1]!.op;\n const seg = segs[i]!;\n\n if (op === \"child\") {\n if (!cursor || !segmentMatchesSection(seg, cursor)) return false;\n cursor = cursor.parent;\n } else {\n // Descendant: find any matching ancestor.\n let found: Section | null = null;\n while (cursor) {\n if (segmentMatchesSection(seg, cursor)) {\n found = cursor;\n break;\n }\n cursor = cursor.parent;\n }\n if (!found) return false;\n cursor = found.parent;\n }\n }\n return true;\n}\n\nfunction segmentMatchesSection(seg: Segment, sec: Section): boolean {\n if (seg.level !== null && seg.level !== sec.level) return false;\n const title = sec.title;\n switch (seg.kind) {\n case 
\"exact\":\n return title.toLowerCase() === seg.value.toLowerCase();\n case \"regex\":\n return seg.regex!.test(title);\n case \"fuzzy\":\n return title.toLowerCase().includes(seg.value.toLowerCase());\n }\n}\n",
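The SEGMENT grammar documented in the removed selector module above is compact enough to illustrate standalone. A minimal sketch of the per-segment parse — names here are illustrative, not the package's exports:

```typescript
// Minimal sketch of the SEGMENT grammar from the removed source: an
// optional "#"{1,6} level prefix, then an exact ("=…"), regex ("/…/"),
// or fuzzy matcher. Illustrative only — not the package's API.
type Kind = "fuzzy" | "exact" | "regex";

function parseSegmentSketch(raw: string): {
  level: number | null;
  kind: Kind;
  value: string;
} {
  let s = raw.trim();
  let level: number | null = null;
  // The (?!#) lookahead rejects "#######foo" instead of silently
  // reading it as level=6 value="#foo".
  const levelMatch = /^(#{1,6})(?!#)\s*(.*)$/.exec(s);
  if (levelMatch) {
    level = levelMatch[1]!.length;
    s = levelMatch[2] ?? "";
  }
  const regexMatch = /^\/(.+)\/([gimsuy]*)$/.exec(s);
  if (regexMatch) return { level, kind: "regex", value: regexMatch[1]! };
  if (s.startsWith("=")) {
    return { level, kind: "exact", value: s.slice(1).trim() };
  }
  return { level, kind: "fuzzy", value: s };
}
```

So `## =API Reference` parses to a level-2 exact matcher, while `/deprecat/` becomes a regex matcher (the removed source then defaults the flags to `i`).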
9
- "/**\n * Output formatting for mdq.\n *\n * Two targets today: `renderToc` for the TOC view and `renderSection` for a\n * single section read. Both emit grep-friendly plain text with stable\n * delimiters that agents can split on.\n *\n * A future `--json` pipeline lives in cli.ts; the shapes (Heading, Section)\n * are already JSON-clean so it's a direct serialization.\n */\n\nimport type { Section } from \"./sections\";\nimport { countLines } from \"./sections\";\nimport type { PrettyFormatter } from \"./pretty\";\n\nexport type TocOptions = {\n depth?: number;\n flat?: boolean;\n};\n\nexport function renderToc(\n file: string,\n src: string,\n sections: Section[],\n opts: TocOptions,\n): string {\n const totalLines = countLines(src);\n const headerCount = sections.length;\n const headerRange = totalLines === 0 ? \"L0\" : `L1-${totalLines}`;\n const plural = headerCount === 1 ? \"heading\" : \"headings\";\n\n const out: string[] = [];\n out.push(`${file} ${headerRange} ${headerCount} ${plural}`);\n\n for (const sec of sections) {\n if (opts.depth != null && sec.level > opts.depth) continue;\n const indent = opts.flat ? \"\" : \" \".repeat(Math.max(0, sec.level - 1));\n const hashes = \"#\".repeat(sec.level);\n const range = `L${sec.line_start}-${sec.line_end}`;\n out.push(`${indent}${hashes} ${sec.title} ${range}`);\n }\n return out.join(\"\\n\");\n}\n\nexport type SectionOptions = {\n bodyOnly?: boolean;\n noBody?: boolean;\n raw?: boolean;\n maxLines?: number;\n /** Required when bodyOnly is true so we can find the first child. */\n allSections?: Section[];\n /** Optional markdown→ANSI formatter applied to the body before delimiters. 
*/\n pretty?: PrettyFormatter;\n};\n\nexport function renderSection(\n file: string,\n srcLines: string[],\n sec: Section,\n opts: SectionOptions,\n): string {\n const start = sec.line_start;\n let end = sec.line_end;\n\n if (opts.bodyOnly && opts.allSections) {\n const firstChild = findFirstChild(sec, opts.allSections);\n if (firstChild) end = firstChild.line_start - 1;\n }\n\n if (opts.noBody) {\n end = start;\n }\n\n // Clamp to source length so a stale `line_end` (e.g. countLines and\n // splitLines disagreeing on a trailing newline) can't overrun.\n const clampedEnd = Math.min(end, srcLines.length);\n let body = srcLines.slice(start - 1, clampedEnd).join(\"\\n\");\n\n // Truncate before pretty-printing so ANSI escapes can't land mid-cut.\n if (opts.maxLines != null && opts.maxLines > 0) {\n body = truncateBody(body, opts.maxLines);\n }\n\n if (opts.pretty) {\n body = opts.pretty(body);\n }\n\n if (opts.raw) return body;\n\n const hashes = \"#\".repeat(sec.level);\n const header = `── ${file} L${start}-${end} ${hashes} ${sec.title} ${\"─\".repeat(8)}`;\n const footer = `── end ${\"─\".repeat(40)}`;\n return `${header}\\n${body}\\n${footer}`;\n}\n\n/**\n * Cut `body` to the first `maxLines` lines. If anything was dropped, append\n * a marker line telling the agent how to get the rest. `maxLines <= 0`\n * disables truncation.\n */\nexport function truncateBody(body: string, maxLines: number): string {\n if (maxLines <= 0) return body;\n const lines = body.split(\"\\n\");\n if (lines.length <= maxLines) return body;\n const kept = lines.slice(0, maxLines).join(\"\\n\");\n const remaining = lines.length - maxLines;\n return `${kept}\\n\\n… ${remaining} more lines (use --max-lines=0 for full)`;\n}\n\nfunction findFirstChild(sec: Section, all: Section[]): Section | null {\n for (const candidate of all) {\n if (candidate.parent === sec) return candidate;\n }\n return null;\n}\n",
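The truncation-marker behavior in the removed `truncateBody` is self-contained and can be reproduced verbatim (logic copied from the source above; `maxLines <= 0` disables truncation):

```typescript
// Standalone copy of the truncation logic from the removed format.ts:
// keep the first maxLines lines and, if anything was dropped, append a
// marker telling the caller how to get the rest.
function truncateBody(body: string, maxLines: number): string {
  if (maxLines <= 0) return body; // 0 or negative disables truncation
  const lines = body.split("\n");
  if (lines.length <= maxLines) return body;
  const kept = lines.slice(0, maxLines).join("\n");
  const remaining = lines.length - maxLines;
  return `${kept}\n\n… ${remaining} more lines (use --max-lines=0 for full)`;
}
```

Note the ordering in the removed `renderSection`: truncation runs before the pretty formatter, so the cut can never land mid-ANSI-escape.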
10
- "/**\n * Pretty printing for `mdq read --pretty`. Lazy-loads marked +\n * marked-terminal on first use so the default (plain-text) path keeps its\n * ~16ms cold start.\n */\n\nexport type PrettyFormatter = (markdown: string) => string;\n\nlet formatterPromise: Promise<PrettyFormatter> | null = null;\n\n/** Returns a cached formatter, importing marked + marked-terminal on first call. */\nexport function loadPrettyFormatter(): Promise<PrettyFormatter> {\n return (formatterPromise ??= buildFormatter());\n}\n\nasync function buildFormatter(): Promise<PrettyFormatter> {\n const [{ marked }, { markedTerminal }] = await Promise.all([\n import(\"marked\"),\n import(\"marked-terminal\"),\n ]);\n\n marked.use(\n markedTerminal({\n reflowText: false,\n tab: 2,\n // Unicode em-dash for <hr>, matches mdq's delimiter style.\n hr: \"─\",\n }),\n );\n\n return (md: string) => {\n // highlight.js hits console.error when it sees an unknown code-fence\n // language like `js-nolint` (common in MDN). The throw is swallowed by\n // marked-terminal, but the stderr line isn't — mute that one channel\n // for the duration of the parse.\n const originalError = console.error;\n console.error = (...args: unknown[]) => {\n if (\n typeof args[0] === \"string\" &&\n /Could not find the language/i.test(args[0])\n )\n return;\n originalError.apply(console, args as Parameters<typeof console.error>);\n };\n let rendered: string | Promise<string>;\n try {\n rendered = marked.parse(md);\n } finally {\n console.error = originalError;\n }\n if (typeof rendered !== \"string\") {\n throw new Error(\"mdq: pretty renderer returned a Promise unexpectedly\");\n }\n // marked-terminal appends a trailing newline; trim so delimiter spacing\n // matches the plain path.\n return rendered.replace(/\\n+$/, \"\");\n };\n}\n",
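The lazy-load pattern in the removed pretty module — cache the build *promise*, not the built formatter — ensures the heavy dynamic imports run at most once even when callers race. A minimal sketch with hypothetical names (not the package's API):

```typescript
// Sketch of the memoized lazy-loader pattern from the removed pretty.ts:
// `??=` assigns only while the cache is still null, so buildFormatter
// runs at most once no matter how many callers arrive concurrently.
type Formatter = (s: string) => string;

let builds = 0; // stands in for observing the expensive import()s
let formatterPromise: Promise<Formatter> | null = null;

async function buildFormatter(): Promise<Formatter> {
  builds++; // hypothetical stand-in for `await import("marked")` etc.
  return (s) => s.toUpperCase();
}

function loadFormatter(): Promise<Formatter> {
  return (formatterPromise ??= buildFormatter());
}
```

Every caller receives the same promise object, so the build body executes once; callers that arrive before the build settles simply await the in-flight promise.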
11
- "#!/usr/bin/env node\nimport { run } from \"../src/cli\";\n\nconst result = await run(process.argv.slice(2));\nif (result.stdout) {\n process.stdout.write(\n result.stdout.endsWith(\"\\n\") ? result.stdout : result.stdout + \"\\n\",\n );\n}\nif (result.stderr) process.stderr.write(result.stderr);\nprocess.exit(result.code);\n"
12
- ],
13
- "mappings": ";;;;;AA0BA;AACA;;;ACYO,SAAS,IAAI,CAAC,KAAwB;AAAA,EAC3C,MAAM,MAAiB,CAAC;AAAA,EACxB,MAAM,MAAM,IAAI;AAAA,EAEhB,IAAI,IAAI;AAAA,EACR,IAAI,SAAS;AAAA,EAIb,IAAI,UAAU;AAAA,EACd,IAAI,YAAY;AAAA,EAChB,IAAI,WAAW;AAAA,EAEf,OAAO,KAAK,KAAK;AAAA,IAGf,MAAM,QAAQ;AAAA,IACd,OAAO,IAAI,OAAO,IAAI,WAAW,CAAC,MAAM;AAAA,MAAa;AAAA,IACrD,IAAI,OAAO,IAAI,MAAM,OAAO,CAAC;AAAA,IAC7B,IAAI,KAAK,SAAS,KAAK,KAAK,WAAW,KAAK,SAAS,CAAC,MAAM,IAAa;AAAA,MACvE,OAAO,KAAK,MAAM,GAAG,KAAK,SAAS,CAAC;AAAA,IACtC;AAAA,IACA;AAAA,IAEA,MAAM,QAAQ,WAAW,IAAI;AAAA,IAC7B,IAAI,OAAO;AAAA,MACT,IAAI,CAAC,SAAS;AAAA,QACZ,UAAU;AAAA,QACV,YAAY,MAAM;AAAA,QAClB,WAAW,MAAM;AAAA,MACnB,EAAO,SAAI,MAAM,SAAS,aAAa,MAAM,OAAO,UAAU;AAAA,QAC5D,UAAU;AAAA,MACZ;AAAA,IACF,EAAO,SAAI,CAAC,SAAS;AAAA,MACnB,MAAM,IAAI,aAAa,MAAM,MAAM;AAAA,MACnC,IAAI;AAAA,QAAG,IAAI,KAAK,CAAC;AAAA,IACnB;AAAA,IAEA,IAAI,KAAK;AAAA,MAAK;AAAA,IACd;AAAA,EACF;AAAA,EAEA,OAAO;AAAA;AAWT,SAAS,UAAU,CAAC,MAAoD;AAAA,EAEtE,IAAI,IAAI;AAAA,EACR,OAAO,IAAI,KAAK,KAAK,WAAW,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,MAAM,KAAK,KAAK,WAAW,CAAC;AAAA,EAC5B,IAAI,OAAO,MAAc,OAAO;AAAA,IAAa,OAAO;AAAA,EACpD,IAAI,MAAM;AAAA,EACV,OAAO,KAAK,WAAW,IAAI,GAAG,MAAM;AAAA,IAAI;AAAA,EACxC,IAAI,MAAM;AAAA,IAAG,OAAO;AAAA,EAGpB,OAAO,EAAE,MAAM,IAAI,KAAK,IAAI;AAAA;AAY9B,SAAS,YAAY,CAAC,MAAc,QAAgC;AAAA,EAElE,IAAI,IAAI;AAAA,EACR,OAAO,IAAI,KAAK,KAAK,WAAW,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,IAAI,KAAK,WAAW,CAAC,MAAM;AAAA,IAAY,OAAO;AAAA,EAE9C,IAAI,SAAS;AAAA,EACb,OAAO,KAAK,WAAW,IAAI,MAAM,MAAM;AAAA,IAAI;AAAA,EAC3C,IAAI,SAAS,KAAK,SAAS;AAAA,IAAG,OAAO;AAAA,EAErC,MAAM,QAAQ,IAAI;AAAA,EAClB,MAAM,UAAU,KAAK,WAAW,KAAK;AAAA,EAIrC,IAAI,QAAQ,KAAK,UAAU,YAAY,MAAM,YAAY,GAAY;AAAA,IACnE,OAAO;AAAA,EACT;AAAA,EAGA,IAAI,eAAe;AAAA,EACnB,OACE,eAAe,KAAK,WACnB,KAAK,WAAW,YAAY,MAAM,MAAM,KAAK,WAAW,YAAY,MAAM,IAC3E;AAAA,IACA;AAAA,EACF;AAAA,EAEA,IAAI,MAAM,KAAK;AAAA,EAEf,OACE,MAAM,iBACL,KAAK,WAAW,MAAM,CAAC,MAAM,MAAM,KAAK,WAAW,MAAM,CAAC,MAAM,IACjE;AAAA,IACA;AAAA,EACF;AAAA,EAGA,IAAI,UAAU;AAAA,EACd,OAAO,UAAU,gBAAgB,KAAK,WAAW,UAAU,CAAC,MAAM;AAAA,IAAI;AAAA,EACtE,IACE,UAAU,QACT,
YAAY,gBACX,KAAK,WAAW,UAAU,CAAC,MAAM,MACjC,KAAK,WAAW,UAAU,CAAC,MAAM,IACnC;AAAA,IACA,MAAM;AAAA,IACN,OACE,MAAM,iBACL,KAAK,WAAW,MAAM,CAAC,MAAM,MAAM,KAAK,WAAW,MAAM,CAAC,MAAM,IACjE;AAAA,MACA;AAAA,IACF;AAAA,EACF;AAAA,EAEA,MAAM,QAAQ,KAAK,MAAM,cAAc,GAAG;AAAA,EAC1C,OAAO,EAAE,OAAO,QAAQ,OAAO,MAAM,OAAO;AAAA;;;AC7IvC,SAAS,aAAa,CAAC,UAAqB,YAA+B;AAAA,EAChF,MAAM,MAAiB,CAAC;AAAA,EAExB,MAAM,QAAmB,CAAC;AAAA,EAE1B,WAAW,KAAK,UAAU;AAAA,IAGxB,OAAO,MAAM,SAAS,KAAK,MAAM,MAAM,SAAS,GAAI,SAAS,EAAE,OAAO;AAAA,MACpE,MAAM,UAAU,MAAM,IAAI;AAAA,MAC1B,QAAQ,WAAW,EAAE,OAAO;AAAA,IAC9B;AAAA,IACA,MAAM,SAAS,MAAM,SAAS,IAAI,MAAM,MAAM,SAAS,KAAM;AAAA,IAE7D,MAAM,MAAe;AAAA,MACnB,OAAO,EAAE;AAAA,MACT,OAAO,EAAE;AAAA,MACT,YAAY,EAAE;AAAA,MAEd,UAAU;AAAA,MACV;AAAA,IACF;AAAA,IACA,IAAI,KAAK,GAAG;AAAA,IACZ,MAAM,KAAK,GAAG;AAAA,EAChB;AAAA,EAEA,OAAO;AAAA;AAOF,SAAS,MAAM,CAAC,KAAwB;AAAA,EAC7C,MAAM,OAAiB,CAAC;AAAA,EACxB,IAAI,MAAM,IAAI;AAAA,EACd,OAAO,KAAK;AAAA,IACV,KAAK,KAAK,IAAI,KAAK;AAAA,IACnB,MAAM,IAAI;AAAA,EACZ;AAAA,EACA,OAAO,KAAK,QAAQ;AAAA;AAQf,SAAS,UAAU,CAAC,KAAqB;AAAA,EAC9C,IAAI,IAAI,WAAW;AAAA,IAAG,OAAO;AAAA,EAC7B,IAAI,IAAI;AAAA,EACR,SAAS,IAAI,EAAG,IAAI,IAAI,QAAQ,KAAK;AAAA,IACnC,IAAI,IAAI,WAAW,CAAC,MAAM;AAAA,MAAI;AAAA,EAChC;AAAA,EAGA,IAAI,IAAI,WAAW,IAAI,SAAS,CAAC,MAAM;AAAA,IAAI;AAAA,EAC3C,OAAO;AAAA;;;ACjDF,SAAS,aAAa,CAAC,OAA0B;AAAA,EACtD,MAAM,UAAU,MAAM,KAAK;AAAA,EAC3B,IAAI,QAAQ,WAAW;AAAA,IAAG,OAAO,CAAC;AAAA,EAYlC,MAAM,cAAwB,CAAC;AAAA,EAC/B,MAAM,MAAY,CAAC,YAAY;AAAA,EAC/B,IAAI,MAAM;AAAA,EACV,IAAI,IAAI;AAAA,EACR,IAAI,UAAU;AAAA,EACd,IAAI,iBAAiB;AAAA,EACrB,OAAO,IAAI,QAAQ,QAAQ;AAAA,IACzB,MAAM,KAAK,QAAQ;AAAA,IACnB,IAAI,OAAO,QAAQ,kBAAkB,UAAU;AAAA,MAC7C,UAAU,CAAC;AAAA,MACX,OAAO;AAAA,MACP,iBAAiB;AAAA,MACjB;AAAA,MACA;AAAA,IACF;AAAA,IACA,IAAI,CAAC,WAAW,OAAO,KAAK;AAAA,MAC1B,YAAY,KAAK,IAAI,KAAK,CAAC;AAAA,MAC3B,MAAM;AAAA,MACN,iBAAiB;AAAA,MACjB,IAAI,QAAQ,IAAI,OAAO,KAAK;AAAA,QAC1B,IAAI,KAAK,OAAO;AAAA,QAChB,KAAK;AAAA,MACP,EAAO;AAAA,QACL,IAAI,KAAK,YAAY;AAAA,QACrB,KAAK;AAAA;AAAA,MAEP;AAAA,IACF;AAAA,IACA,OAAO;AAAA,IACP,IAAI,OAAO,OAAO,OAAO;AAAA,M
AAM,iBAAiB;AAAA,IAChD;AAAA,EACF;AAAA,EACA,YAAY,KAAK,IAAI,KAAK,CAAC;AAAA,EAE3B,OAAO,YAAY,IAAI,CAAC,GAAG,QAAQ,aAAa,GAAG,IAAI,QAAQ,YAAY,CAAC;AAAA;AAG9E,SAAS,YAAY,CAAC,KAAa,IAAiB;AAAA,EAClD,IAAI,IAAI;AAAA,EACR,IAAI,QAAuB;AAAA,EAK3B,MAAM,aAAa,yBAAyB,KAAK,CAAC;AAAA,EAClD,IAAI,YAAY;AAAA,IACd,QAAQ,WAAW,GAAI;AAAA,IACvB,IAAI,WAAW,MAAM;AAAA,EACvB;AAAA,EAGA,MAAM,aAAa,wBAAwB,KAAK,CAAC;AAAA,EACjD,IAAI,YAAY;AAAA,IACd,MAAM,UAAU,WAAW;AAAA,IAC3B,MAAM,QAAQ,WAAW,MAAM;AAAA,IAC/B,OAAO;AAAA,MACL;AAAA,MACA;AAAA,MACA,MAAM;AAAA,MACN,OAAO;AAAA,MACP,OAAO,IAAI,OAAO,SAAS,KAAK;AAAA,IAClC;AAAA,EACF;AAAA,EAGA,IAAI,EAAE,WAAW,GAAG,GAAG;AAAA,IACrB,OAAO,EAAE,IAAI,OAAO,MAAM,SAAS,OAAO,EAAE,MAAM,CAAC,EAAE,KAAK,EAAE;AAAA,EAC9D;AAAA,EAEA,OAAO,EAAE,IAAI,OAAO,MAAM,SAAS,OAAO,EAAE,KAAK,EAAE;AAAA;AAG9C,SAAS,KAAK,CAAC,UAAqB,UAAgC;AAAA,EACzE,IAAI,SAAS,WAAW;AAAA,IAAG,OAAO,CAAC;AAAA,EACnC,MAAM,MAAiB,CAAC;AAAA,EACxB,WAAW,OAAO,UAAU;AAAA,IAC1B,IAAI,QAAQ,KAAK,QAAQ;AAAA,MAAG,IAAI,KAAK,GAAG;AAAA,EAC1C;AAAA,EACA,OAAO;AAAA;AAGT,SAAS,OAAO,CAAC,KAAc,MAA0B;AAAA,EAEvD,MAAM,OAAO,KAAK,KAAK,SAAS;AAAA,EAChC,IAAI,CAAC,QAAQ,CAAC,sBAAsB,MAAM,GAAG;AAAA,IAAG,OAAO;AAAA,EAGvD,IAAI,SAAyB,IAAI;AAAA,EACjC,SAAS,IAAI,KAAK,SAAS,EAAG,KAAK,GAAG,KAAK;AAAA,IAIzC,MAAM,KAAK,KAAK,IAAI,GAAI;AAAA,IACxB,MAAM,MAAM,KAAK;AAAA,IAEjB,IAAI,OAAO,SAAS;AAAA,MAClB,IAAI,CAAC,UAAU,CAAC,sBAAsB,KAAK,MAAM;AAAA,QAAG,OAAO;AAAA,MAC3D,SAAS,OAAO;AAAA,IAClB,EAAO;AAAA,MAEL,IAAI,QAAwB;AAAA,MAC5B,OAAO,QAAQ;AAAA,QACb,IAAI,sBAAsB,KAAK,MAAM,GAAG;AAAA,UACtC,QAAQ;AAAA,UACR;AAAA,QACF;AAAA,QACA,SAAS,OAAO;AAAA,MAClB;AAAA,MACA,IAAI,CAAC;AAAA,QAAO,OAAO;AAAA,MACnB,SAAS,MAAM;AAAA;AAAA,EAEnB;AAAA,EACA,OAAO;AAAA;AAGT,SAAS,qBAAqB,CAAC,KAAc,KAAuB;AAAA,EAClE,IAAI,IAAI,UAAU,QAAQ,IAAI,UAAU,IAAI;AAAA,IAAO,OAAO;AAAA,EAC1D,MAAM,QAAQ,IAAI;AAAA,EAClB,QAAQ,IAAI;AAAA,SACL;AAAA,MACH,OAAO,MAAM,YAAY,MAAM,IAAI,MAAM,YAAY;AAAA,SAClD;AAAA,MACH,OAAO,IAAI,MAAO,KAAK,KAAK;AAAA,SACzB;AAAA,MACH,OAAO,MAAM,YAAY,EAAE,SAAS,IAAI,MAAM,YAAY,CAAC;AAAA;AAAA;;;AC5J1D,SAAS,SAAS,CACvB,MACA,KACA,UACA,MACQ;AAAA,EACR,MAAM,aAAa,WAA
W,GAAG;AAAA,EACjC,MAAM,cAAc,SAAS;AAAA,EAC7B,MAAM,cAAc,eAAe,IAAI,OAAO,MAAM;AAAA,EACpD,MAAM,SAAS,gBAAgB,IAAI,YAAY;AAAA,EAE/C,MAAM,MAAgB,CAAC;AAAA,EACvB,IAAI,KAAK,GAAG,SAAS,gBAAgB,eAAe,QAAQ;AAAA,EAE5D,WAAW,OAAO,UAAU;AAAA,IAC1B,IAAI,KAAK,SAAS,QAAQ,IAAI,QAAQ,KAAK;AAAA,MAAO;AAAA,IAClD,MAAM,SAAS,KAAK,OAAO,KAAK,KAAK,OAAO,KAAK,IAAI,GAAG,IAAI,QAAQ,CAAC,CAAC;AAAA,IACtE,MAAM,SAAS,IAAI,OAAO,IAAI,KAAK;AAAA,IACnC,MAAM,QAAQ,IAAI,IAAI,cAAc,IAAI;AAAA,IACxC,IAAI,KAAK,GAAG,SAAS,UAAU,IAAI,UAAU,OAAO;AAAA,EACtD;AAAA,EACA,OAAO,IAAI,KAAK;AAAA,CAAI;AAAA;AAcf,SAAS,aAAa,CAC3B,MACA,UACA,KACA,MACQ;AAAA,EACR,MAAM,QAAQ,IAAI;AAAA,EAClB,IAAI,MAAM,IAAI;AAAA,EAEd,IAAI,KAAK,YAAY,KAAK,aAAa;AAAA,IACrC,MAAM,aAAa,eAAe,KAAK,KAAK,WAAW;AAAA,IACvD,IAAI;AAAA,MAAY,MAAM,WAAW,aAAa;AAAA,EAChD;AAAA,EAEA,IAAI,KAAK,QAAQ;AAAA,IACf,MAAM;AAAA,EACR;AAAA,EAIA,MAAM,aAAa,KAAK,IAAI,KAAK,SAAS,MAAM;AAAA,EAChD,IAAI,OAAO,SAAS,MAAM,QAAQ,GAAG,UAAU,EAAE,KAAK;AAAA,CAAI;AAAA,EAG1D,IAAI,KAAK,YAAY,QAAQ,KAAK,WAAW,GAAG;AAAA,IAC9C,OAAO,aAAa,MAAM,KAAK,QAAQ;AAAA,EACzC;AAAA,EAEA,IAAI,KAAK,QAAQ;AAAA,IACf,OAAO,KAAK,OAAO,IAAI;AAAA,EACzB;AAAA,EAEA,IAAI,KAAK;AAAA,IAAK,OAAO;AAAA,EAErB,MAAM,SAAS,IAAI,OAAO,IAAI,KAAK;AAAA,EACnC,MAAM,SAAS,MAAK,UAAU,SAAS,QAAQ,UAAU,IAAI,SAAS,IAAI,OAAO,CAAC;AAAA,EAClF,MAAM,SAAS,UAAS,IAAI,OAAO,EAAE;AAAA,EACrC,OAAO,GAAG;AAAA,EAAW;AAAA,EAAS;AAAA;AAQzB,SAAS,YAAY,CAAC,MAAc,UAA0B;AAAA,EACnE,IAAI,YAAY;AAAA,IAAG,OAAO;AAAA,EAC1B,MAAM,QAAQ,KAAK,MAAM;AAAA,CAAI;AAAA,EAC7B,IAAI,MAAM,UAAU;AAAA,IAAU,OAAO;AAAA,EACrC,MAAM,OAAO,MAAM,MAAM,GAAG,QAAQ,EAAE,KAAK;AAAA,CAAI;AAAA,EAC/C,MAAM,YAAY,MAAM,SAAS;AAAA,EACjC,OAAO,GAAG;AAAA;AAAA,IAAY;AAAA;AAGxB,SAAS,cAAc,CAAC,KAAc,KAAgC;AAAA,EACpE,WAAW,aAAa,KAAK;AAAA,IAC3B,IAAI,UAAU,WAAW;AAAA,MAAK,OAAO;AAAA,EACvC;AAAA,EACA,OAAO;AAAA;;;ACzGT,IAAI,mBAAoD;AAGjD,SAAS,mBAAmB,GAA6B;AAAA,EAC9D,OAAQ,qBAAqB,eAAe;AAAA;AAG9C,eAAe,cAAc,GAA6B;AAAA,EACxD,SAAS,YAAY,oBAAoB,MAAM,QAAQ,IAAI;AAAA,IAClD;AAAA,IACA;AAAA,EACT,CAAC;AAAA,EAED,OAAO,IACL,eAAe;AAAA,IACb,YAAY;AAAA,IACZ,KAAK;AAAA,IAEL,IAAI;AAAA,EACN,CAAC,CACH;AAAA,EAEA,O
AAO,CAAC,OAAe;AAAA,IAKrB,MAAM,gBAAgB,QAAQ;AAAA,IAC9B,QAAQ,QAAQ,IAAI,SAAoB;AAAA,MACtC,IACE,OAAO,KAAK,OAAO,YACnB,+BAA+B,KAAK,KAAK,EAAE;AAAA,QAE3C;AAAA,MACF,cAAc,MAAM,SAAS,IAAwC;AAAA;AAAA,IAEvE,IAAI;AAAA,IACJ,IAAI;AAAA,MACF,WAAW,OAAO,MAAM,EAAE;AAAA,cAC1B;AAAA,MACA,QAAQ,QAAQ;AAAA;AAAA,IAElB,IAAI,OAAO,aAAa,UAAU;AAAA,MAChC,MAAM,IAAI,MAAM,sDAAsD;AAAA,IACxE;AAAA,IAGA,OAAO,SAAS,QAAQ,QAAQ,EAAE;AAAA;AAAA;;;ALpBtC,IAAM,OAAO;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAAA;AAkCb,SAAS,EAAE,CAAC,GAAsB;AAAA,EAChC,OAAO,EAAE,MAAM,GAAG,QAAQ,GAAG,QAAQ,GAAG;AAAA;AAE1C,SAAS,OAAO,CAAC,GAAsB;AAAA,EAIrC,OAAO,EAAE,MAAM,GAAG,QAAQ,GAAG,QAAQ,GAAG;AAAA;AAE1C,SAAS,GAAG,CAAC,GAAW,OAAO,GAAc;AAAA,EAC3C,OAAO,EAAE,MAAM,QAAQ,IAAI,QAAQ,EAAE;AAAA;AAGvC,IAAM,UAAU;AAAA,EACd,OAAO,EAAE,MAAM,SAAS;AAAA,EACxB,MAAM,EAAE,MAAM,UAAU;AAAA,EACxB,eAAe,EAAE,MAAM,SAAS;AAAA,EAChC,aAAa,EAAE,MAAM,SAAS;AAAA,EAC9B,aAAa,EAAE,MAAM,UAAU;AAAA,EAC/B,WAAW,EAAE,MAAM,UAAU;AAAA,EAC7B,KAAK,EAAE,MAAM,UAAU;AAAA,EACvB,QAAQ,EAAE,MAAM,UAAU;AAAA,EAC1B,MAAM,EAAE,MAAM,UAAU;AAAA,EACxB,MAAM,EAAE,MAAM,WAAW,OAAO,IAAI;AACtC;AAGA,eAAsB,GAAG,CAAC,MAAoC;AAAA,EAC5D,IAAI;AAAA,EACJ,IAAI;AAAA,IACF,SAAS,UAAU;AAAA,MACjB,MAAM;AAAA,MACN,SAAS;AAAA,MACT,kBAAkB;AAAA,MAClB,QAAQ;AAAA,IACV,CAAC;AAAA,IACD,OAAO,GAAG;AAAA,IACV,OAAO,IAAI,QAAS,EAAY;AAAA,EAAY,QAAQ,CAAC;AAAA;AAAA,EAGvD,QAAQ,QAAQ,gBAAgB;AAAA,EAChC,IAAI,OAAO,QAAQ,YAAY,WAAW,GAAG;AAAA,IAC3C,OAAO,GAAG,IAAI;AAAA,EAChB;AAAA,EAGA,MAAM,OAAO,YAAY;AAAA,EACzB,IAAI,SAAS,UAAU,SAAS,QAAQ,SAAS,UAAU,SAAS,OAAO;AAAA,IACzE,OAAO,SAAS,MAAM,YAAY,MAAM,CAAC,GAAG,MAAM;AAAA,EACpD;AAAA,EAKA,IAAI,YAAY,WAAW;AAAA,IAAG,OAAO,SAAS,OAAO,aAAa,MAAM;AAAA,EACxE,OAAO,SAAS,QAAQ,aAAa,MAAM;AAAA;AAK7C,eAAe,QAAQ,CACrB,KACA,MACA,QACoB;AAAA,EACpB,QAAQ;AAAA,SACD;AAAA,MACH,OAAO,OAAO,MAAM,MAAM;AAAA,SACvB;AAAA,MACH,OAAO,QAAQ,MAAM,MAAM;AAAA,SACxB;AAAA,MACH,OAAO,MAAM,MAAM,MAAM;AAAA,SACtB;AAAA,MACH,OAAO,QAAQ,MAAM,MAAM;AAAA;AAAA,MAE3B,OAAO,IAAI,yBAAyB;AAAA,EAA
S,QAAQ,CAAC;AAAA;AAAA;AAS5D,SAAS,QAAQ,CAAC,MAA2C;AAAA,EAC3D,IAAI;AAAA,IACF,MAAM,MAAM,SAAS,MAAM,aAAa,GAAG,MAAM,IAAI,aAAa,MAAM,MAAM;AAAA,IAC9E,OAAO,EAAE,IAAI;AAAA,IACb,OAAO,GAAG;AAAA,IACV,MAAM,MAAO,EAA4B,SAAS,WAC9C,qBAAqB;AAAA,IACrB,qBAAqB,UAAW,EAAY;AAAA;AAAA,IAChD,OAAO,IAAI,KAAK,CAAC;AAAA;AAAA;AAQrB,SAAS,cAAc,CAAC,GAAsC;AAAA,EAC5D,IAAI,KAAK;AAAA,IAAM,OAAO;AAAA,EACtB,MAAM,IAAI,OAAO,SAAS,GAAG,EAAE;AAAA,EAC/B,IAAI,OAAO,MAAM,CAAC,KAAK,OAAO,CAAC,MAAM,EAAE,KAAK;AAAA,IAAG,OAAO;AAAA,EACtD,OAAO;AAAA;AAGT,SAAS,QAAQ,CACf,GACA,MACA,UACsC;AAAA,EACtC,MAAM,MAAM,EAAE;AAAA,EACd,IAAI,OAAO;AAAA,IAAM,OAAO,EAAE,OAAO,SAAS;AAAA,EAC1C,MAAM,IAAI,eAAe,GAAG;AAAA,EAC5B,IAAI,KAAK,QAAQ,IAAI,GAAG;AAAA,IACtB,OAAO,IAAI,UAAU,6CAA6C;AAAA,GAAU,CAAC;AAAA,EAC/E;AAAA,EACA,OAAO,EAAE,OAAO,EAAE;AAAA;AAKpB,SAAS,MAAM,CAAC,MAAgB,GAAsB;AAAA,EACpD,MAAM,OAAO,KAAK;AAAA,EAClB,IAAI,QAAQ;AAAA,IAAM,OAAO,IAAI;AAAA,GAA6B,CAAC;AAAA,EAC3D,MAAM,SAAS,SAAS,IAAI;AAAA,EAC5B,IAAI,UAAU;AAAA,IAAQ,OAAO;AAAA,EAC7B,QAAQ,QAAQ;AAAA,EAChB,MAAM,WAAW,cAAc,KAAK,GAAG,GAAG,WAAW,GAAG,CAAC;AAAA,EAEzD,IAAI,EAAE,MAAM;AAAA,IACV,OAAO,GACL,KAAK,UACH;AAAA,MACE;AAAA,MACA,aAAa,WAAW,GAAG;AAAA,MAC3B,UAAU,SAAS,IAAI,aAAa;AAAA,IACtC,GACA,MACA,CACF,CACF;AAAA,EACF;AAAA,EAEA,MAAM,QAAQ,SAAS,GAAG,SAAS,IAAI;AAAA,EACvC,IAAI,UAAU;AAAA,IAAO,OAAO;AAAA,EAC5B,OAAO,GACL,UAAU,MAAM,KAAK,UAAU;AAAA,IAC7B,OAAO,MAAM,SAAS;AAAA,IACtB,MAAM,CAAC,CAAC,EAAE;AAAA,EACZ,CAAC,CACH;AAAA;AAGF,eAAe,OAAO,CAAC,MAAgB,GAA+B;AAAA,EACpE,MAAM,OAAO,KAAK;AAAA,EAClB,MAAM,cAAc,KAAK;AAAA,EACzB,IAAI,QAAQ,QAAQ,eAAe,MAAM;AAAA,IACvC,OAAO,IAAI;AAAA,GAA4C,CAAC;AAAA,EAC1D;AAAA,EAGA,IAAI,EAAE,UAAU,EAAE,MAAM;AAAA,IACtB,OAAO,IAAI;AAAA,GAAuD,CAAC;AAAA,EACrE;AAAA,EACA,MAAM,SAAS,SAAS,IAAI;AAAA,EAC5B,IAAI,UAAU;AAAA,IAAQ,OAAO;AAAA,EAC7B,QAAQ,QAAQ;AAAA,EAChB,MAAM,WAAW,cAAc,KAAK,GAAG,GAAG,WAAW,GAAG,CAAC;AAAA,EAEzD,MAAM,aAAa,SAAS,GAAG,eAAe,EAAE;AAAA,EAChD,IAAI,UAAU;AAAA,IAAY,OAAO;AAAA,EACjC,MAAM,WAAW,SAAS,GAAG,aAAa,CAAC;AAAA,EAC3C,IAAI,UAAU;AAAA,IAAU,OAAO;AAAA,EAE/B,MAAM,WAAW,cAAc,WAAW;AAAA,EAC1C,MAAM,WAAU,MAAM,UAAU,
QAAQ;AAAA,EAKxC,MAAM,WAAW,IAAI,MAAM;AAAA,CAAI;AAAA,EAE/B,IAAI,EAAE,MAAM;AAAA,IACV,OAAO,aACL,MACA,UACA,UACA,UACA,WAAW,SAAS,IACpB,SAAS,SAAS,GAClB,CACF;AAAA,EACF;AAAA,EAEA,IAAI,SAAQ,WAAW;AAAA,IAAG,OAAO,QAAQ;AAAA,CAAc;AAAA,EAEvD,IAAI;AAAA,EACJ,IAAI,EAAE,QAAQ;AAAA,IACZ,IAAI;AAAA,MACF,SAAS,MAAM,oBAAoB;AAAA,MACnC,OAAO,GAAG;AAAA,MACV,OAAO,IAAI,aAAc,EAAY;AAAA,GAAa,CAAC;AAAA;AAAA,EAEvD;AAAA,EAEA,MAAM,MAAM,WAAW,SAAS;AAAA,EAChC,MAAM,UAAU,SAAQ,MAAM,GAAG,GAAG;AAAA,EACpC,MAAM,MAAgB,CAAC;AAAA,EACvB,IAAI,SAAQ,SAAS,KAAK;AAAA,IACxB,IAAI,KACF,GAAG,SAAQ,iCAAiC,4CAC9C;AAAA,EACF;AAAA,EACA,WAAW,OAAO,SAAS;AAAA,IACzB,IAAI,KACF,cAAc,MAAM,UAAU,KAAK;AAAA,MACjC,UAAU,CAAC,CAAC,EAAE;AAAA,MACd,QAAQ,CAAC,CAAC,EAAE;AAAA,MACZ,KAAK,CAAC,CAAC,EAAE;AAAA,MACT,UAAU,SAAS,SAAS;AAAA,MAC5B,aAAa;AAAA,MACb;AAAA,IACF,CAAC,CACH;AAAA,EACF;AAAA,EACA,OAAO,GAAG,IAAI,KAAK;AAAA,CAAI,CAAC;AAAA;AAG1B,SAAS,YAAY,CACnB,MACA,UACA,KACA,UACA,YACA,UACA,GACW;AAAA,EACX,MAAM,OAAO,KAAK,UAChB;AAAA,IACE;AAAA,IACA,SAAS,SAAQ,MAAM,GAAG,UAAU,EAAE,IAAI,CAAC,OAAO;AAAA,SAC7C,cAAc,CAAC;AAAA,MAClB,MAAM,EAAE,aACJ,KACA,cAAc,UAAU,UAAU,GAAG,KAAK,CAAC,CAAC,EAAE,YAAY,GAAG,QAAQ;AAAA,IAC3E,EAAE;AAAA,IACF,WAAW,SAAQ,SAAS;AAAA,EAC9B,GACA,MACA,CACF;AAAA,EAEA,OAAO,SAAQ,WAAW,IAAI,EAAE,MAAM,GAAG,QAAQ,MAAM,QAAQ,GAAG,IAAI,GAAG,IAAI;AAAA;AAG/E,SAAS,KAAK,CAAC,MAAgB,GAAsB;AAAA,EACnD,MAAM,OAAO,KAAK;AAAA,EAClB,MAAM,cAAc,KAAK;AAAA,EACzB,IAAI,QAAQ,QAAQ,eAAe,MAAM;AAAA,IACvC,OAAO,IAAI;AAAA,GAA0C,CAAC;AAAA,EACxD;AAAA,EACA,MAAM,SAAS,SAAS,IAAI;AAAA,EAC5B,IAAI,UAAU;AAAA,IAAQ,OAAO;AAAA,EAC7B,QAAQ,QAAQ;AAAA,EAChB,MAAM,WAAW,cAAc,KAAK,GAAG,GAAG,WAAW,GAAG,CAAC;AAAA,EAEzD,MAAM,aAAa,SAAS,GAAG,eAAe,EAAE;AAAA,EAChD,IAAI,UAAU;AAAA,IAAY,OAAO;AAAA,EACjC,MAAM,MAAM,WAAW,SAAS;AAAA,EAEhC,MAAM,WAAW,cAAc,WAAW;AAAA,EAC1C,MAAM,WAAU,MAAM,UAAU,QAAQ,EAAE,MAAM,GAAG,GAAG;AAAA,EAItD,MAAM,aAAa,IAAI;AAAA,EACvB,WAAW,OAAO,UAAU;AAAA,IAC1B,IAAI,IAAI,QAAQ;AAAA,MACd,MAAM,OAAO,WAAW,IAAI,IAAI,MAAM;AAAA,MACtC,IAAI;AAAA,QAAM,KAAK,KAAK,GAAG;AAAA,MAClB;AAAA,mBAAW,IAAI,IAAI,QAAQ,CAAC,GAAG,CAAC;AAAA,IACvC;AAAA,EAC
F;AAAA,EAEA,IAAI,EAAE,MAAM;AAAA,IACV,MAAM,UAAU,SAAQ,IAAI,CAAC,YAAY;AAAA,MACvC,QAAQ,cAAc,MAAM;AAAA,MAC5B,WAAW,WAAW,IAAI,MAAM,KAAK,CAAC,GAAG,IAAI,aAAa;AAAA,IAC5D,EAAE;AAAA,IACF,MAAM,OAAO,KAAK,UAAU,EAAE,MAAM,QAAQ,GAAG,MAAM,CAAC;AAAA,IACtD,OAAO,SAAQ,WAAW,IAAI,EAAE,MAAM,GAAG,QAAQ,MAAM,QAAQ,GAAG,IAAI,GAAG,IAAI;AAAA,EAC/E;AAAA,EAEA,IAAI,SAAQ,WAAW;AAAA,IAAG,OAAO,QAAQ;AAAA,CAAc;AAAA,EAEvD,MAAM,MAAgB,CAAC;AAAA,EACvB,WAAW,UAAU,UAAS;AAAA,IAC5B,MAAM,WAAW,WAAW,IAAI,MAAM,KAAK,CAAC;AAAA,IAC5C,IAAI,KACF,GAAG,IAAI,OAAO,OAAO,KAAK,KAAK,OAAO,WAAW,OAAO,cAAc,OAAO,UAC/E;AAAA,IACA,IAAI,SAAS,WAAW,GAAG;AAAA,MACzB,IAAI,KAAK,iBAAiB;AAAA,IAC5B,EAAO;AAAA,MACL,WAAW,KAAK,UAAU;AAAA,QACxB,IAAI,KACF,KAAK,IAAI,OAAO,EAAE,KAAK,KAAK,EAAE,WAAW,EAAE,cAAc,EAAE,UAC7D;AAAA,MACF;AAAA;AAAA,EAEJ;AAAA,EACA,OAAO,GAAG,IAAI,KAAK;AAAA,CAAI,CAAC;AAAA;AAG1B,SAAS,OAAO,CAAC,MAAgB,GAAsB;AAAA,EACrD,MAAM,OAAO,KAAK;AAAA,EAClB,MAAM,UAAU,KAAK;AAAA,EACrB,IAAI,QAAQ,QAAQ,WAAW,MAAM;AAAA,IACnC,OAAO,IAAI;AAAA,GAA2C,CAAC;AAAA,EACzD;AAAA,EACA,MAAM,SAAS,SAAS,IAAI;AAAA,EAC5B,IAAI,UAAU;AAAA,IAAQ,OAAO;AAAA,EAC7B,QAAQ,QAAQ;AAAA,EAChB,MAAM,WAAW,cAAc,KAAK,GAAG,GAAG,WAAW,GAAG,CAAC;AAAA,EAEzD,IAAI;AAAA,EACJ,IAAI;AAAA,IACF,KAAK,IAAI,OAAO,OAAO;AAAA,IACvB,OAAO,GAAG;AAAA,IACV,OAAO,IAAI,4BAA6B,EAAY;AAAA,GAAa,CAAC;AAAA;AAAA,EAGpE,MAAM,WAAW,IAAI,MAAM;AAAA,CAAI;AAAA,EAE/B,MAAM,OAAc,CAAC;AAAA,EAKrB,IAAI,SAAS;AAAA,EACb,SAAS,SAAS,EAAG,UAAU,SAAS,QAAQ,UAAU;AAAA,IACxD,OACE,SAAS,IAAI,SAAS,UACtB,SAAS,SAAS,GAAI,cAAc,QACpC;AAAA,MACA;AAAA,IACF;AAAA,IACA,MAAM,OAAO,SAAS,SAAS;AAAA,IAC/B,IAAI,GAAG,KAAK,IAAI,GAAG;AAAA,MACjB,MAAM,UAAU,UAAU,IAAI,SAAS,WAAW,OAAO;AAAA,MACzD,KAAK,KAAK,EAAE,SAAS,MAAM,QAAQ,MAAM,KAAK,CAAC;AAAA,IACjD;AAAA,EACF;AAAA,EAEA,IAAI,EAAE,MAAM;AAAA,IACV,MAAM,OAAO,KAAK,UAChB,KAAK,IAAI,CAAC,OAAO;AAAA,MACf;AAAA,MACA,MAAM,EAAE;AAAA,MACR,MAAM,EAAE;AAAA,MACR,SAAS,EAAE,UAAU,cAAc,EAAE,OAAO,IAAI;AAAA,IAClD,EAAE,GACF,MACA,CACF;AAAA,IACA,OAAO,KAAK,WAAW,IAAI,EAAE,MAAM,GAAG,QAAQ,MAAM,QAAQ,GAAG,IAAI,GAAG,IAAI;AAAA,EAC5E;AAAA,EAEA,IAAI,KAAK,WAAW;AAAA,IAAG,OAAO,QAAQ;AA
AA,CAAc;AAAA,EAGpD,MAAM,MAAgB,CAAC;AAAA,EACvB,IAAI,cAA0C;AAAA,EAC9C,WAAW,OAAO,MAAM;AAAA,IACtB,IAAI,IAAI,YAAY,aAAa;AAAA,MAC/B,IAAI,IAAI,SAAS;AAAA,QACf,MAAM,OAAO,OAAO,IAAI,OAAO,EAAE,OAAO,IAAI,QAAQ,KAAK,EAAE,KAAK,KAAK;AAAA,QACrE,IAAI,KACF,MAAK,UAAU,IAAI,QAAQ,cAAc,IAAI,QAAQ,UACvD;AAAA,MACF,EAAO;AAAA,QACL,IAAI,KAAK,MAAK,8BAA8B;AAAA;AAAA,MAE9C,cAAc,IAAI;AAAA,IACpB;AAAA,IACA,IAAI,KAAK,MAAM,IAAI,UAAU,IAAI,MAAM;AAAA,EACzC;AAAA,EACA,OAAO,GAAG,IAAI,KAAK;AAAA,CAAI,CAAC;AAAA;AAK1B,SAAS,SAAS,CAChB,UACA,KACA,KACA,UACQ;AAAA,EACR,IAAI,MAAM,IAAI;AAAA,EACd,IAAI,UAAU;AAAA,IACZ,MAAM,aAAa,IAAI,KAAK,CAAC,MAAM,EAAE,WAAW,GAAG;AAAA,IACnD,IAAI;AAAA,MAAY,MAAM,WAAW,aAAa;AAAA,EAChD;AAAA,EACA,OAAO,SAAS,MAAM,IAAI,aAAa,GAAG,GAAG,EAAE,KAAK;AAAA,CAAI;AAAA;AAG1D,SAAS,aAAa,CAAC,MAAc,UAA0B;AAAA,EAC7D,OAAO,WAAW,IAAI,aAAa,MAAM,QAAQ,IAAI;AAAA;AAGvD,SAAS,aAAa,CAAC,KAAuC;AAAA,EAC5D,OAAO;AAAA,IACL,OAAO,IAAI;AAAA,IACX,OAAO,IAAI;AAAA,IACX,YAAY,IAAI;AAAA,IAChB,UAAU,IAAI;AAAA,IACd,MAAM,OAAO,GAAG;AAAA,EAClB;AAAA;;;AMreF,IAAM,SAAS,MAAM,IAAI,QAAQ,KAAK,MAAM,CAAC,CAAC;AAC9C,IAAI,OAAO,QAAQ;AAAA,EACjB,QAAQ,OAAO,MACb,OAAO,OAAO,SAAS;AAAA,CAAI,IAAI,OAAO,SAAS,OAAO,SAAS;AAAA,CACjE;AACF;AACA,IAAI,OAAO;AAAA,EAAQ,QAAQ,OAAO,MAAM,OAAO,MAAM;AACrD,QAAQ,KAAK,OAAO,IAAI;",
14
- "debugId": "21A3DDC69C33B5A064756E2164756E21",
15
- "names": []
16
- }