@houtini/lm 2.1.0 → 2.4.1
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +115 -76
- package/dist/index.d.ts +1 -1
- package/dist/index.js +343 -85
- package/dist/index.js.map +1 -1
- package/package.json +1 -1
package/README.md
CHANGED
@@ -3,39 +3,50 @@
 [](https://www.npmjs.com/package/@houtini/lm)
 [](https://opensource.org/licenses/MIT)

-
+I built this because I kept leaving Claude Code running overnight on big refactors and the token bill was painful. A huge chunk of that spend goes on bounded tasks any decent model handles fine - generating boilerplate, explaining code, drafting commit messages, converting formats. Stuff that doesn't need Claude's reasoning or tool access.

-
+Houtini LM connects Claude Code to a local LLM on your network. Claude keeps doing the hard work - architecture, planning, multi-file changes - and offloads the grunt work to your local model. Free. No rate limits. Private.

-
+The session footer tracks everything Claude offloads, so you can watch the savings stack up.

-
+## How it works

-
+```
+Claude Code (orchestrator)
+    │
+    ├─ Complex reasoning, planning, architecture → Claude API (your tokens)
+    │
+    └─ Bounded grunt work → houtini-lm ──HTTP/SSE──> Your local LLM (free)
+         • Boilerplate & test stubs        Qwen, Llama, Mistral, DeepSeek...
+         • Code review & explanations      LM Studio, Ollama, vLLM, llama.cpp
+         • Commit messages & docs
+         • Format conversion
+         • Mock data & type definitions
+```
+
+Claude's the architect. Your local model's the drafter. Claude QAs everything.

-
-- Extract structured data from long documents
-- Run a second opinion on generated code
-- Summarise research before Claude synthesises it
-- Delegate code review to a local model while Claude handles other work
+## Quick start

-
+### Claude Code
+
+```bash
+claude mcp add houtini-lm -- npx -y @houtini/lm
+```

-
-- **New `code_task` tool** — purpose-built for code analysis with an optimised system prompt and sensible defaults (temp 0.2, 500 token cap)
-- **Delegation guidance** — each tool description tells Claude when to use it, what output to expect, and what to avoid (e.g. never send truncated code to a local model)
+That's it. If LM Studio's running on `localhost:1234` (the default), Claude can start delegating straight away.

-
+### LLM on a different machine

-
+I've got a GPU box on my local network running Qwen 3 Coder Next in LM Studio. If you've got a similar setup, point the URL at it:

 ```bash
-claude mcp add houtini-lm -e LM_STUDIO_URL=http://
+claude mcp add houtini-lm -e LM_STUDIO_URL=http://192.168.1.50:1234 -- npx -y @houtini/lm
 ```

 ### Claude Desktop

-
+Drop this into your `claude_desktop_config.json`:

 ```json
 {
@@ -51,93 +62,127 @@ Add to `claude_desktop_config.json`:
 }
 ```

-
+## What gets offloaded

-
-npx @houtini/lm
-```
+**Delegate to the local model** - bounded, well-defined tasks:

-
+| Task | Why it works locally |
+|------|---------------------|
+| Generate test stubs | Clear input (source), clear output (tests) |
+| Explain a function | Summarisation doesn't need tool access |
+| Draft commit messages | Diff in, message out |
+| Code review | Paste full source, ask for bugs |
+| Convert formats | JSON↔YAML, snake_case↔camelCase |
+| Generate mock data | Schema in, data out |
+| Write type definitions | Source in, types out |
+| Brainstorm approaches | Doesn't commit to anything |

-
+**Keep on Claude** - anything that needs reasoning, tool access, or multi-step orchestration:

-
-
-
-
-
+- Architectural decisions
+- Reading/writing files
+- Running tests and interpreting results
+- Multi-file refactoring plans
+- Anything that needs to call other tools

-
+The tool descriptions are written to nudge Claude into planning delegation at the start of large tasks, not just using it when it happens to think of it.

-
+## Token tracking

-
+Every response includes a session footer:

 ```
-
-system — persona (be specific: "Senior TypeScript dev", not "helpful assistant")
-temperature — 0.1 for code, 0.3 for analysis (default), 0.5 for suggestions
-max_tokens — match to expected output: 150 for quick answers, 300 for explanations, 500 for code gen
+Model: qwen/qwen3-coder-next | This call: 145→248 tokens | Session: 12,450 tokens offloaded across 23 calls
 ```

-
+The `discover` tool reports cumulative session stats too. Claude sees this data and (I've found) it reinforces the delegation habit throughout long-running tasks. The more it sees it's saving tokens, the more it looks for things to offload.
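The accounting behind that footer is deliberately small. A condensed, runnable sketch mirroring the `recordUsage` and `sessionSummary` helpers that ship in `dist/index.js` in this release (the usage numbers below are made up):

```javascript
// Session-level token accounting: cumulative prompt/completion tokens
// across every call, rendered as the footer's "Session:" segment.
const session = { calls: 0, promptTokens: 0, completionTokens: 0 };

function recordUsage(usage) {
  session.calls++;
  if (usage) {
    session.promptTokens += usage.prompt_tokens;
    session.completionTokens += usage.completion_tokens;
  }
}

function sessionSummary() {
  if (session.calls === 0) return '';
  const total = session.promptTokens + session.completionTokens;
  return `Session: ${total.toLocaleString()} tokens offloaded across ` +
    `${session.calls} call${session.calls === 1 ? '' : 's'}`;
}

recordUsage({ prompt_tokens: 145, completion_tokens: 248 });
console.log(sessionSummary()); // → "Session: 393 tokens offloaded across 1 call"
```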
+
+## Tools
+
+### `chat`
+
+The workhorse. Send a task, get an answer. The description includes planning triggers that nudge Claude to identify offloadable work when it's starting a big task.
+
+| Parameter | Required | Default | What it does |
+|-----------|----------|---------|-------------|
+| `message` | yes | - | The task. Be specific about output format. |
+| `system` | no | - | Persona - "Senior TypeScript dev" not "helpful assistant" |
+| `temperature` | no | 0.3 | 0.1 for code, 0.3 for analysis, 0.7 for creative |
+| `max_tokens` | no | 2048 | Lower for quick answers, higher for generation |

 ### `custom_prompt`

-
+Three-part prompt: system, context, instruction. Keeping them separate prevents context bleed - consistently outperforms stuffing everything into one message, especially with local models.

-
-
-
-
-
-
-
+| Parameter | Required | Default | What it does |
+|-----------|----------|---------|-------------|
+| `instruction` | yes | - | What to produce. Under 50 words works best. |
+| `system` | no | - | Persona + constraints, under 30 words |
+| `context` | no | - | Complete data to analyse. Never truncate. |
+| `temperature` | no | 0.3 | 0.1 for review, 0.3 for analysis |
+| `max_tokens` | no | 2048 | Match to expected output length |
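Putting the table together, an illustrative set of `custom_prompt` arguments for a review call (the snippet in `context` is a made-up example, not from this package):

```json
{
  "system": "Expert Node.js developer focused on error handling and edge cases.",
  "context": "export function parseConfig(s) {\n  return JSON.parse(s);\n}",
  "instruction": "List up to 3 bugs or missing error checks as bullet points.",
  "temperature": 0.1,
  "max_tokens": 400
}
```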

 ### `code_task`

-
+Built for code analysis. Pre-configured system prompt, locked to temperature 0.2 for focused output.

-
-
-
-
-
-
+| Parameter | Required | Default | What it does |
+|-----------|----------|---------|-------------|
+| `code` | yes | - | Complete source code. Never truncate. |
+| `task` | yes | - | "Find bugs", "Explain this", "Write tests" |
+| `language` | no | - | "typescript", "python", "rust", etc. |
+| `max_tokens` | no | 2048 | Match to expected output length |

-
+### `discover`

-
+Health check. Returns model name, context window, latency, and cumulative session stats. Call before delegating if you're not sure the LLM's available.

 ### `list_models`

-
+Lists everything loaded on the LLM server with context window sizes.
+
+## Getting good results from local models

-
+Qwen, Llama, DeepSeek - they score brilliantly on coding benchmarks now. The gap between a good and bad result is almost always **prompt quality**, not model capability. I've spent a fair bit of time on this.

-
+**Send complete code.** Local models hallucinate details when you give them truncated input. If a file's too large, send the relevant function - not a snippet with `...` in the middle.

-
+**Be explicit about output format.** "Return a JSON array" or "respond in bullet points" - don't leave it open-ended. Smaller models need this.

-
+**Set a specific persona.** "Expert Rust developer who cares about memory safety" gets noticeably better results than "helpful assistant."

-
-|------------|--------------|----------|
-| 150 | ~45 seconds | Quick questions, classifications |
-| 300 | ~100 seconds | Code explanations, summaries |
-| 500 | ~170 seconds | Code review, generation |
+**State constraints.** "No preamble", "reference line numbers", "max 5 bullet points" - tell the model what *not* to do as well as what to do.

-
+**Include surrounding context.** For code generation, send imports, types, and function signatures - not just the function body.
+
+**One call at a time.** If your LLM server runs a single model, parallel calls queue up and stack timeouts. Send them sequentially.
+
+## Configuration
+
+| Variable | Default | What it does |
+|----------|---------|-------------|
+| `LM_STUDIO_URL` | `http://localhost:1234` | Base URL of the OpenAI-compatible API |
+| `LM_STUDIO_MODEL` | *(auto-detect)* | Model identifier - leave blank to use whatever's loaded |
+| `LM_STUDIO_PASSWORD` | *(none)* | Bearer token for authenticated endpoints |
+| `LM_CONTEXT_WINDOW` | `100000` | Fallback context window if the API doesn't report it |
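For a non-default setup these can be exported before launching the server. A sketch with placeholder values (substitute your own host and token):

```bash
# Placeholder host and token - adjust for your network
export LM_STUDIO_URL=http://192.168.1.50:1234
export LM_STUDIO_PASSWORD=your-token   # only if the endpoint requires auth
npx -y @houtini/lm
```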

 ## Compatible endpoints

-
-
+Works with anything that speaks the OpenAI `/v1/chat/completions` API:
+
+| What | URL | Notes |
+|------|-----|-------|
 | [LM Studio](https://lmstudio.ai) | `http://localhost:1234` | Default, zero config |
-| [Ollama](https://ollama.com) | `http://localhost:11434` |
+| [Ollama](https://ollama.com) | `http://localhost:11434` | Set `LM_STUDIO_URL` |
 | [vLLM](https://docs.vllm.ai) | `http://localhost:8000` | Native OpenAI API |
 | [llama.cpp](https://github.com/ggml-org/llama.cpp) | `http://localhost:8080` | Server mode |
-
+| Any OpenAI-compatible API | Any URL | Set URL + password |
+
+## Streaming and timeouts
+
+All inference uses Server-Sent Events streaming. Tokens arrive incrementally, keeping the connection alive. If generation takes longer than 55 seconds, you get a partial result instead of a timeout error - the footer shows `⚠ TRUNCATED` when this happens.
+
+The 55-second soft timeout exists because the MCP SDK has a hard ~60s client-side timeout. Without streaming, any response that took longer than 60 seconds just vanished. Not ideal.
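The chunk-parsing half of that loop is easy to isolate. A simplified standalone sketch of the SSE line-buffering used in `dist/index.js` in this release (soft-timeout handling omitted):

```javascript
// Feed one network chunk into the parser: complete "data: " lines are
// consumed, the trailing partial line is carried over in `buffer`.
function parseSseChunk(buffer, chunk) {
  let content = '';
  buffer += chunk;
  const lines = buffer.split('\n');
  buffer = lines.pop() || ''; // keep the incomplete line for the next chunk
  for (const line of lines) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data: ') || trimmed === 'data: [DONE]') continue;
    try {
      const delta = JSON.parse(trimmed.slice(6)).choices?.[0]?.delta;
      if (delta?.content) content += delta.content;
    } catch {
      // partial JSON or comment line - skip
    }
  }
  return { buffer, content };
}

const chunk =
  'data: {"choices":[{"delta":{"content":"Hel"}}]}\n' +
  'data: {"choices":[{"delta":{"content":"lo"}}]}\n' +
  'data: [DONE]\n';
console.log(parseSseChunk('', chunk).content); // → "Hello"
```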

 ## Development

@@ -148,12 +193,6 @@ npm install
 npm run build
 ```

-
-
-```bash
-node test.mjs
-```
-
-## License
+## Licence

 MIT
package/dist/index.d.ts
CHANGED
@@ -3,6 +3,6 @@
  * Houtini LM — MCP Server for Local LLMs via OpenAI-compatible API
  *
  * Connects to LM Studio (or any OpenAI-compatible endpoint) and exposes
- * chat, custom prompts, and model
+ * chat, custom prompts, code tasks, and model discovery as MCP tools.
  */
 export {};
package/dist/index.js
CHANGED
|
@@ -3,7 +3,7 @@
|
|
|
3
3
|
* Houtini LM — MCP Server for Local LLMs via OpenAI-compatible API
|
|
4
4
|
*
|
|
5
5
|
* Connects to LM Studio (or any OpenAI-compatible endpoint) and exposes
|
|
6
|
-
* chat, custom prompts, and model
|
|
6
|
+
* chat, custom prompts, code tasks, and model discovery as MCP tools.
|
|
7
7
|
*/
|
|
8
8
|
import { Server } from '@modelcontextprotocol/sdk/server/index.js';
|
|
9
9
|
import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
|
|
@@ -11,117 +11,343 @@ import { CallToolRequestSchema, ListToolsRequestSchema, } from '@modelcontextpro
|
|
|
11
11
|
const LM_BASE_URL = process.env.LM_STUDIO_URL || 'http://localhost:1234';
|
|
12
12
|
const LM_MODEL = process.env.LM_STUDIO_MODEL || '';
|
|
13
13
|
const LM_PASSWORD = process.env.LM_STUDIO_PASSWORD || '';
|
|
14
|
-
const DEFAULT_MAX_TOKENS =
|
|
14
|
+
const DEFAULT_MAX_TOKENS = 2048;
|
|
15
15
|
const DEFAULT_TEMPERATURE = 0.3;
|
|
16
|
+
const CONNECT_TIMEOUT_MS = 5000;
|
|
17
|
+
const INFERENCE_CONNECT_TIMEOUT_MS = 30_000; // generous connect timeout for inference
|
|
18
|
+
const SOFT_TIMEOUT_MS = 55_000; // return partial results before MCP SDK ~60s timeout
|
|
19
|
+
const READ_CHUNK_TIMEOUT_MS = 30_000; // max wait for a single SSE chunk
|
|
20
|
+
const FALLBACK_CONTEXT_LENGTH = parseInt(process.env.LM_CONTEXT_WINDOW || '100000', 10);
|
|
21
|
+
// ── Session-level token accounting ───────────────────────────────────
|
|
22
|
+
// Tracks cumulative tokens offloaded to the local LLM across all calls
|
|
23
|
+
// in this session. Shown in every response footer so Claude can reason
|
|
24
|
+
// about cost savings and continue delegating strategically.
|
|
25
|
+
const session = {
|
|
26
|
+
calls: 0,
|
|
27
|
+
promptTokens: 0,
|
|
28
|
+
completionTokens: 0,
|
|
29
|
+
};
|
|
30
|
+
function recordUsage(usage) {
|
|
31
|
+
session.calls++;
|
|
32
|
+
if (usage) {
|
|
33
|
+
session.promptTokens += usage.prompt_tokens;
|
|
34
|
+
session.completionTokens += usage.completion_tokens;
|
|
35
|
+
}
|
|
36
|
+
}
|
|
37
|
+
function sessionSummary() {
|
|
38
|
+
const total = session.promptTokens + session.completionTokens;
|
|
39
|
+
if (session.calls === 0)
|
|
40
|
+
return '';
|
|
41
|
+
return `Session: ${total.toLocaleString()} tokens offloaded across ${session.calls} call${session.calls === 1 ? '' : 's'}`;
|
|
42
|
+
}
|
|
16
43
|
function apiHeaders() {
|
|
17
44
|
const h = { 'Content-Type': 'application/json' };
|
|
18
45
|
if (LM_PASSWORD)
|
|
19
46
|
h['Authorization'] = `Bearer ${LM_PASSWORD}`;
|
|
20
47
|
return h;
|
|
21
48
|
}
|
|
22
|
-
|
|
49
|
+
/**
|
|
50
|
+
* Fetch with a connect timeout so Claude doesn't hang when the host is offline.
|
|
51
|
+
*/
|
|
52
|
+
async function fetchWithTimeout(url, options, timeoutMs = CONNECT_TIMEOUT_MS) {
|
|
53
|
+
const controller = new AbortController();
|
|
54
|
+
const timer = setTimeout(() => controller.abort(), timeoutMs);
|
|
55
|
+
try {
|
|
56
|
+
return await fetch(url, { ...options, signal: controller.signal });
|
|
57
|
+
}
|
|
58
|
+
finally {
|
|
59
|
+
clearTimeout(timer);
|
|
60
|
+
}
|
|
61
|
+
}
|
|
62
|
+
/**
|
|
63
|
+
* Read from a stream with a per-chunk timeout.
|
|
64
|
+
* Prevents hanging forever if the LLM stalls mid-generation.
|
|
65
|
+
*/
|
|
66
|
+
async function timedRead(reader, timeoutMs) {
|
|
67
|
+
let timer;
|
|
68
|
+
const timeout = new Promise((resolve) => {
|
|
69
|
+
timer = setTimeout(() => resolve('timeout'), timeoutMs);
|
|
70
|
+
});
|
|
71
|
+
try {
|
|
72
|
+
return await Promise.race([reader.read(), timeout]);
|
|
73
|
+
}
|
|
74
|
+
finally {
|
|
75
|
+
clearTimeout(timer);
|
|
76
|
+
}
|
|
77
|
+
}
|
|
78
|
+
/**
|
|
79
|
+
* Streaming chat completion with soft timeout.
|
|
80
|
+
*
|
|
81
|
+
* Uses SSE streaming (`stream: true`) so tokens arrive incrementally.
|
|
82
|
+
* If we approach the MCP SDK's ~60s timeout (soft limit at 55s), we
|
|
83
|
+
* return whatever content we have so far with `truncated: true`.
|
|
84
|
+
* This means large code reviews return partial results instead of nothing.
|
|
85
|
+
*/
|
|
86
|
+
async function chatCompletionStreaming(messages, options = {}) {
|
|
23
87
|
const body = {
|
|
24
88
|
messages,
|
|
25
89
|
temperature: options.temperature ?? DEFAULT_TEMPERATURE,
|
|
26
90
|
max_tokens: options.maxTokens ?? DEFAULT_MAX_TOKENS,
|
|
27
|
-
stream:
|
|
91
|
+
stream: true,
|
|
28
92
|
};
|
|
29
93
|
if (options.model || LM_MODEL) {
|
|
30
94
|
body.model = options.model || LM_MODEL;
|
|
31
95
|
}
|
|
32
|
-
const
|
|
33
|
-
|
|
34
|
-
headers: apiHeaders(),
|
|
35
|
-
body: JSON.stringify(body),
|
|
36
|
-
});
|
|
96
|
+
const startTime = Date.now();
|
|
97
|
+
const res = await fetchWithTimeout(`${LM_BASE_URL}/v1/chat/completions`, { method: 'POST', headers: apiHeaders(), body: JSON.stringify(body) }, INFERENCE_CONNECT_TIMEOUT_MS);
|
|
37
98
|
if (!res.ok) {
|
|
38
99
|
const text = await res.text().catch(() => '');
|
|
39
100
|
throw new Error(`LM Studio API error ${res.status}: ${text}`);
|
|
40
101
|
}
|
|
41
|
-
|
|
102
|
+
if (!res.body) {
|
|
103
|
+
throw new Error('Response body is null — streaming not supported by endpoint');
|
|
104
|
+
}
|
|
105
|
+
const reader = res.body.getReader();
|
|
106
|
+
const decoder = new TextDecoder();
|
|
107
|
+
let content = '';
|
|
108
|
+
let model = '';
|
|
109
|
+
let usage;
|
|
110
|
+
let finishReason = '';
|
|
111
|
+
let truncated = false;
|
|
112
|
+
let buffer = '';
|
|
113
|
+
try {
|
|
114
|
+
while (true) {
|
|
115
|
+
// Check soft timeout before each read
|
|
116
|
+
const elapsed = Date.now() - startTime;
|
|
117
|
+
if (elapsed > SOFT_TIMEOUT_MS) {
|
|
118
|
+
truncated = true;
|
|
119
|
+
process.stderr.write(`[houtini-lm] Soft timeout at ${elapsed}ms, returning ${content.length} chars of partial content\n`);
|
|
120
|
+
break;
|
|
121
|
+
}
|
|
122
|
+
// Read with per-chunk timeout (handles stalled generation)
|
|
123
|
+
const remaining = SOFT_TIMEOUT_MS - elapsed;
|
|
124
|
+
const chunkTimeout = Math.min(READ_CHUNK_TIMEOUT_MS, remaining);
|
|
125
|
+
const result = await timedRead(reader, chunkTimeout);
|
|
126
|
+
if (result === 'timeout') {
|
|
127
|
+
truncated = true;
|
|
128
|
+
process.stderr.write(`[houtini-lm] Chunk read timeout, returning ${content.length} chars of partial content\n`);
|
|
129
|
+
break;
|
|
130
|
+
}
|
|
131
|
+
if (result.done)
|
|
132
|
+
break;
|
|
133
|
+
buffer += decoder.decode(result.value, { stream: true });
|
|
134
|
+
// Parse SSE lines
|
|
135
|
+
const lines = buffer.split('\n');
|
|
136
|
+
buffer = lines.pop() || ''; // Keep incomplete line in buffer
|
|
137
|
+
for (const line of lines) {
|
|
138
|
+
const trimmed = line.trim();
|
|
139
|
+
if (!trimmed || trimmed === 'data: [DONE]')
|
|
140
|
+
continue;
|
|
141
|
+
if (!trimmed.startsWith('data: '))
|
|
142
|
+
continue;
|
|
143
|
+
try {
|
|
144
|
+
const json = JSON.parse(trimmed.slice(6));
|
|
145
|
+
if (json.model)
|
|
146
|
+
model = json.model;
|
|
147
|
+
const delta = json.choices?.[0]?.delta;
|
|
148
|
+
if (delta?.content)
|
|
149
|
+
content += delta.content;
|
|
150
|
+
const reason = json.choices?.[0]?.finish_reason;
|
|
151
|
+
if (reason)
|
|
152
|
+
finishReason = reason;
|
|
153
|
+
// Some endpoints include usage in the final streaming chunk
|
|
154
|
+
if (json.usage)
|
|
155
|
+
usage = json.usage;
|
|
156
|
+
}
|
|
157
|
+
catch {
|
|
158
|
+
// Skip unparseable chunks (partial JSON, comments, etc.)
|
|
159
|
+
}
|
|
160
|
+
}
|
|
161
|
+
}
|
|
162
|
+
}
|
|
163
|
+
finally {
|
|
164
|
+
// Release the reader — don't await cancel() as it can hang
|
|
165
|
+
reader.releaseLock();
|
|
166
|
+
}
|
|
167
|
+
return { content, model, usage, finishReason, truncated };
|
|
42
168
|
}
|
|
43
|
-
async function
|
|
44
|
-
const res = await
|
|
169
|
+
async function listModelsRaw() {
|
|
170
|
+
const res = await fetchWithTimeout(`${LM_BASE_URL}/v1/models`, { headers: apiHeaders() });
|
|
45
171
|
if (!res.ok)
|
|
46
172
|
throw new Error(`Failed to list models: ${res.status}`);
|
|
47
173
|
const data = (await res.json());
|
|
48
|
-
return data.data
|
|
174
|
+
return data.data;
|
|
175
|
+
}
|
|
176
|
+
function getContextLength(model) {
|
|
177
|
+
// LM Studio uses context_length, vLLM uses max_model_len, fall back to env/100k
|
|
178
|
+
return model.context_length ?? model.max_model_len ?? FALLBACK_CONTEXT_LENGTH;
|
|
179
|
+
}
|
|
180
|
+
/**
|
|
181
|
+
* Format a footer line for streaming results showing model, usage, and truncation status.
|
|
182
|
+
*/
|
|
183
|
+
function formatFooter(resp, extra) {
|
|
184
|
+
// Record usage for session tracking before formatting
|
|
185
|
+
recordUsage(resp.usage);
|
|
186
|
+
const parts = [];
|
|
187
|
+
if (resp.model)
|
|
188
|
+
parts.push(`Model: ${resp.model}`);
|
|
189
|
+
if (resp.usage)
|
|
190
|
+
parts.push(`This call: ${resp.usage.prompt_tokens}→${resp.usage.completion_tokens} tokens`);
|
|
191
|
+
if (extra)
|
|
192
|
+
parts.push(extra);
|
|
193
|
+
if (resp.truncated)
|
|
194
|
+
parts.push('⚠ TRUNCATED (soft timeout — partial result)');
|
|
195
|
+
const sessionLine = sessionSummary();
|
|
196
|
+
if (sessionLine)
|
|
197
|
+
parts.push(sessionLine);
|
|
198
|
+
return parts.length > 0 ? `\n\n---\n${parts.join(' | ')}` : '';
|
|
49
199
|
}
|
|
50
200
|
// ── MCP Tool definitions ─────────────────────────────────────────────
|
|
51
201
|
const TOOLS = [
|
|
52
202
|
{
|
|
53
203
|
name: 'chat',
|
|
54
|
-
description: '
|
|
55
|
-
'
|
|
56
|
-
'
|
|
57
|
-
'
|
|
58
|
-
'
|
|
59
|
-
'
|
|
60
|
-
'
|
|
204
|
+
description: 'Send a task to a local LLM running on a separate machine. This is a FREE, parallel worker — ' +
|
|
205
|
+
'use it to offload bounded work while you continue doing other things. The local LLM runs independently ' +
|
|
206
|
+
'and does not consume your tokens or rate limits.\n\n' +
|
|
207
|
+
'PLANNING: When you start a large task (refactoring, migrations, test suites, documentation), ' +
|
|
208
|
+
'break it into steps and identify which ones are bounded grunt work you can delegate here. ' +
|
|
209
|
+
'The more you offload, the more tokens you save. The session footer tracks cumulative savings.\n\n' +
|
|
210
|
+
'WHEN TO USE (delegate generously — it costs nothing):\n' +
|
|
211
|
+
'• Explain or summarise code/docs you just read\n' +
|
|
212
|
+
'• Generate boilerplate, test stubs, type definitions, mock data\n' +
|
|
213
|
+
'• Answer factual questions about languages, frameworks, APIs\n' +
|
|
214
|
+
'• Draft commit messages, PR descriptions, comments\n' +
|
|
215
|
+
'• Translate or reformat content (JSON↔YAML, snake_case↔camelCase)\n' +
|
|
216
|
+
'• Brainstorm approaches before you commit to one\n' +
|
|
217
|
+
'• Any self-contained subtask that does not need tool access\n\n' +
|
|
218
|
+
'PROMPT QUALITY (the local model is highly capable — results depend on your prompt):\n' +
|
|
219
|
+
'(1) Always send COMPLETE code/context — never truncate, the local LLM cannot access files.\n' +
|
|
220
|
+
'(2) Be explicit about output format ("respond as a JSON array", "return only the function").\n' +
|
|
221
|
+
'(3) Set a specific persona in the system field — "Senior TypeScript dev" beats "helpful assistant".\n' +
|
|
222
|
+
'(4) State constraints: "no preamble", "reference line numbers", "max 5 bullet points".\n' +
|
|
223
|
+
'(5) For code generation, include the surrounding context (imports, types, function signatures).\n\n' +
|
|
224
|
+
'QA: Always review the local LLM\'s output before using it. Verify correctness, check edge cases, ' +
|
|
225
|
+
'and fix any issues. You are the architect — the local model is a fast drafter, not the final authority.\n\n' +
|
|
226
|
+
'The local model, context window, and speed vary — call the discover tool to check what is loaded.',
|
|
61
227
|
inputSchema: {
|
|
62
228
|
type: 'object',
|
|
63
229
|
properties: {
|
|
64
|
-
message: {
|
|
65
|
-
|
|
66
|
-
|
|
67
|
-
|
|
230
|
+
message: {
|
|
231
|
+
type: 'string',
|
|
232
|
+
description: 'The task. Be specific about expected output format. Include COMPLETE code/context — never truncate.',
|
|
233
|
+
},
|
|
234
|
+
system: {
|
|
235
|
+
type: 'string',
|
|
236
|
+
description: 'Persona for the local LLM. Be specific: "Senior TypeScript dev" not "helpful assistant".',
|
|
237
|
+
},
|
|
238
|
+
temperature: {
|
|
239
|
+
type: 'number',
|
|
240
|
+
description: '0.1 for factual/code, 0.3 for analysis (default), 0.7 for creative. Stay under 0.5 for code.',
|
|
241
|
+
},
|
|
242
|
+
max_tokens: {
|
|
243
|
+
type: 'number',
|
|
244
|
+
description: 'Max response tokens. Default 2048. Use higher for code generation, lower for quick answers.',
|
|
245
|
+
},
|
|
68
246
|
},
|
|
69
247
|
required: ['message'],
|
|
70
248
|
},
|
|
71
249
|
},
|
|
72
250
|
{
|
|
73
251
|
name: 'custom_prompt',
|
|
74
|
-
description: 'Structured analysis
|
|
75
|
-
'This 3-part format gets the best results from local models
|
|
76
|
-
'
|
|
77
|
-
'
|
|
78
|
-
'
|
|
79
|
-
'
|
|
80
|
-
'
|
|
252
|
+
description: 'Structured analysis via the local LLM with explicit system/context/instruction separation. ' +
|
|
253
|
+
'This 3-part format prevents context bleed and gets the best results from local models.\n\n' +
|
|
254
|
+
'USE THIS for complex tasks where prompt structure matters — it consistently outperforms ' +
|
|
255
|
+
'stuffing everything into a single message. The separation helps the local model focus.\n\n' +
|
|
256
|
+
'WHEN TO USE:\n' +
|
|
257
|
+
'• Code review — paste full source, ask for bugs/improvements\n' +
|
|
258
|
+
'• Comparison — paste two implementations, ask which is better and why\n' +
|
|
259
|
+
'• Refactoring suggestions — paste code, ask for a cleaner version\n' +
|
|
260
|
+
'• Content analysis — paste text, ask for structure/tone/issues\n' +
|
|
261
|
+
'• Any task where separating context from instruction improves clarity\n\n' +
|
|
262
|
+
'PROMPT STRUCTURE (each field has a job — keep them focused):\n' +
|
|
263
|
+
'• System: persona + constraints, under 30 words. "Expert Python developer focused on performance and correctness."\n' +
|
|
264
|
+
'• Context: COMPLETE data. Full source code, full logs, full text. NEVER truncate or summarise.\n' +
|
|
265
|
+
'• Instruction: exactly what to produce, under 50 words. Specify format: "Return a JSON array of {line, issue, fix}."\n\n' +
|
|
266
|
+
'QA: Review the output. The local model is a capable drafter — verify its analysis before acting on it.',
|
|
81
267
|
inputSchema: {
|
|
82
268
|
type: 'object',
|
|
83
269
|
properties: {
|
|
84
|
-
system: {
|
|
85
|
-
|
|
86
|
-
|
|
87
|
-
|
|
88
|
-
|
|
270
|
+
system: {
|
|
271
|
+
type: 'string',
|
|
272
|
+
description: 'Persona. Be specific: "Expert Node.js developer focused on error handling and edge cases."',
|
|
273
|
+
},
|
|
274
|
+
context: {
|
|
275
|
+
type: 'string',
|
|
276
|
+
description: 'The COMPLETE data to analyse. Full source code, full logs, full text. NEVER truncate.',
|
|
277
|
+
},
|
|
278
|
+
instruction: {
|
|
279
|
+
type: 'string',
|
|
280
|
+
description: 'What to produce. Specify format: "List 3 bugs as bullet points" or "Return a JSON array of {line, issue, fix}".',
|
|
281
|
+
},
|
|
282
|
+
temperature: {
|
|
283
|
+
type: 'number',
|
|
284
|
+
description: '0.1 for bugs/review, 0.3 for analysis (default), 0.5 for suggestions.',
|
|
285
|
+
},
|
|
286
|
+
max_tokens: {
|
|
287
|
+
type: 'number',
|
|
288
|
+
description: 'Max response tokens. Default 2048.',
|
|
289
|
+
},
|
|
89
290
|
},
|
|
90
291
|
required: ['instruction'],
|
|
91
292
|
},
|
|
92
293
|
},
|
|
93
294
|
{
|
|
94
295
|
name: 'code_task',
|
|
95
|
-
description: '
|
|
96
|
-
'
|
|
97
|
-
'
|
|
98
|
-
'
|
|
99
|
-
'
|
|
100
|
-
'
|
|
296
|
+
description: 'Send a code analysis task to the local LLM. Wraps the request with an optimised code-review system prompt.\n\n' +
|
|
297
|
+
'This is the fastest way to offload code-specific work. Temperature is locked to 0.2 for ' +
|
|
298
|
+
'focused, deterministic output. The system prompt is pre-configured for code review.\n\n' +
|
|
299
|
+
'WHEN TO USE:\n' +
|
|
300
|
+
'• Explain what a function/class does\n' +
|
|
301
|
+
'• Find bugs or suggest improvements\n' +
|
|
302
|
+
'• Generate unit tests or type definitions for existing code\n' +
|
|
303
|
+
'• Add error handling, logging, or validation\n' +
|
|
304
|
+
'• Convert between languages or patterns\n\n' +
|
|
305
|
+
'GETTING BEST RESULTS:\n' +
|
|
306
|
+
'• Provide COMPLETE source code — the local LLM cannot read files.\n' +
|
|
307
|
+
'• Include imports and type definitions so the model has full context.\n' +
|
|
308
|
+
'• Be specific in the task: "Write 3 Jest tests for the error paths in fetchUser" beats "Write tests".\n' +
|
|
309
|
+
'• Set the language field — it shapes the system prompt and improves accuracy.\n\n' +
|
|
310
|
+
        'QA: Always verify generated code compiles, handles edge cases, and follows project conventions.',
      inputSchema: {
        type: 'object',
        properties: {
-         code: {
-
-
-
+         code: {
+           type: 'string',
+           description: 'COMPLETE source code. Never truncate. Include imports and full function bodies.',
+         },
+         task: {
+           type: 'string',
+           description: 'What to do: "Find bugs", "Explain this", "Add error handling to fetchData", "Write tests".',
+         },
+         language: {
+           type: 'string',
+           description: 'Programming language: "typescript", "python", "rust", etc.',
+         },
+         max_tokens: {
+           type: 'number',
+           description: 'Max response tokens. Default 2048.',
+         },
        },
        required: ['code', 'task'],
      },
    },
    {
-     name: '
-     description: '
+     name: 'discover',
+     description: 'Check whether the local LLM is online and what model is loaded. Returns model name, context window size, ' +
+       'response latency, and cumulative session stats (tokens offloaded so far). ' +
+       'Call this if you are unsure whether the local LLM is available before delegating work. ' +
+       'Fast — typically responds in under 1 second, or returns an offline status within 5 seconds if the host is unreachable.',
      inputSchema: { type: 'object', properties: {} },
    },
    {
-     name: '
-     description: '
+     name: 'list_models',
+     description: 'List all models currently loaded in the local LLM server, with context window sizes. ' +
+       'Use discover instead for a quick availability check.',
      inputSchema: { type: 'object', properties: {} },
    },
  ];
  // ── MCP Server ───────────────────────────────────────────────────────
- const server = new Server({ name: 'houtini-lm', version: '2.1
+ const server = new Server({ name: 'houtini-lm', version: '2.4.1' }, { capabilities: { tools: {} } });
  server.setRequestHandler(ListToolsRequestSchema, async () => ({ tools: TOOLS }));
  server.setRequestHandler(CallToolRequestSchema, async (request) => {
    const { name, arguments: args } = request.params;
@@ -133,15 +359,12 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {
        if (system)
          messages.push({ role: 'system', content: system });
        messages.push({ role: 'user', content: message });
-       const resp = await
+       const resp = await chatCompletionStreaming(messages, {
          temperature,
          maxTokens: max_tokens,
        });
-       const
-
-         ? `\n\n---\nModel: ${resp.model} | Tokens: ${resp.usage.prompt_tokens}→${resp.usage.completion_tokens}`
-         : '';
-       return { content: [{ type: 'text', text: reply + usage }] };
+       const footer = formatFooter(resp);
+       return { content: [{ type: 'text', text: resp.content + footer }] };
      }
      case 'custom_prompt': {
        const { system, context, instruction, temperature, max_tokens } = args;
@@ -152,12 +375,13 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {
        if (context)
          userContent = `Context:\n${context}\n\nInstruction:\n${instruction}`;
        messages.push({ role: 'user', content: userContent });
-       const resp = await
+       const resp = await chatCompletionStreaming(messages, {
          temperature,
          maxTokens: max_tokens,
        });
+       const footer = formatFooter(resp);
        return {
-         content: [{ type: 'text', text: resp.
+         content: [{ type: 'text', text: resp.content + footer }],
        };
      }
      case 'code_task': {
@@ -173,40 +397,74 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {
            content: `Task: ${task}\n\n\`\`\`${lang}\n${code}\n\`\`\``,
          },
        ];
-       const codeResp = await
+       const codeResp = await chatCompletionStreaming(codeMessages, {
          temperature: 0.2,
-         maxTokens: codeMaxTokens ??
+         maxTokens: codeMaxTokens ?? DEFAULT_MAX_TOKENS,
        });
-       const
-
-         ? `\n\n---\nModel: ${codeResp.model} | Tokens: ${codeResp.usage.prompt_tokens}→${codeResp.usage.completion_tokens} | ${lang}`
-         : '';
-       return { content: [{ type: 'text', text: codeReply + codeUsage }] };
+       const codeFooter = formatFooter(codeResp, lang);
+       return { content: [{ type: 'text', text: codeResp.content + codeFooter }] };
      }
-     case '
-       const
+     case 'discover': {
+       const start = Date.now();
+       let models;
+       try {
+         models = await listModelsRaw();
+       }
+       catch (err) {
+         const ms = Date.now() - start;
+         const reason = err instanceof Error && err.name === 'AbortError'
+           ? `Host unreachable (timed out after ${ms}ms)`
+           : `Connection failed: ${err instanceof Error ? err.message : String(err)}`;
+         return {
+           content: [{
+             type: 'text',
+             text: `Status: OFFLINE\nEndpoint: ${LM_BASE_URL}\n${reason}\n\nThe local LLM is not available right now. Do not attempt to delegate tasks to it.`,
+           }],
+         };
+       }
+       const ms = Date.now() - start;
+       if (models.length === 0) {
+         return {
+           content: [{
+             type: 'text',
+             text: `Status: ONLINE (no model loaded)\nEndpoint: ${LM_BASE_URL}\nLatency: ${ms}ms\n\nThe server is running but no model is loaded. Ask the user to load a model in LM Studio.`,
+           }],
+         };
+       }
+       const lines = models.map((m) => {
+         const ctx = getContextLength(m);
+         return ` • ${m.id} (context: ${ctx.toLocaleString()} tokens)`;
+       });
+       const primary = models[0];
+       const ctx = getContextLength(primary);
+       const sessionStats = session.calls > 0
+         ? `\nSession stats: ${(session.promptTokens + session.completionTokens).toLocaleString()} tokens offloaded across ${session.calls} call${session.calls === 1 ? '' : 's'}`
+         : '\nSession stats: no calls yet — delegate tasks to start saving tokens';
        return {
-         content: [
-           {
+         content: [{
            type: 'text',
-           text:
-
-             :
-
-
+           text: `Status: ONLINE\n` +
+             `Endpoint: ${LM_BASE_URL}\n` +
+             `Latency: ${ms}ms\n` +
+             `Model: ${primary.id}\n` +
+             `Context window: ${ctx.toLocaleString()} tokens\n` +
+             `\nLoaded models:\n${lines.join('\n')}` +
+             `${sessionStats}\n\n` +
+             `The local LLM is available. You can delegate tasks using chat, custom_prompt, or code_task.`,
+           }],
        };
      }
-     case '
-       const
-
-
+     case 'list_models': {
+       const models = await listModelsRaw();
+       if (!models.length) {
+         return { content: [{ type: 'text', text: 'No models currently loaded.' }] };
+       }
+       const lines = models.map((m) => {
+         const ctx = getContextLength(m);
+         return ` • ${m.id}${ctx ? ` (context: ${ctx.toLocaleString()} tokens)` : ''}`;
+       });
        return {
-         content: [
-           {
-             type: 'text',
-             text: `Connected to ${LM_BASE_URL} (${ms}ms)\nAuth: ${LM_PASSWORD ? 'enabled' : 'none'}\nModels loaded: ${models.length}${models.length ? '\n' + models.join(', ') : ''}`,
-           },
-         ],
+         content: [{ type: 'text', text: `Loaded models:\n${lines.join('\n')}` }],
        };
      }
      default:
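Context for this diff: the handlers above switch from a buffered client to `chatCompletionStreaming`, which consumes the Server-Sent Events wire format that OpenAI-compatible servers (LM Studio, vLLM, llama.cpp) emit from `/v1/chat/completions` when `stream: true` is set. Each event is a `data: {json}` line carrying a `choices[0].delta.content` fragment, with token usage typically on the final chunk before a `data: [DONE]` sentinel. A minimal, self-contained sketch of that parsing, under the assumption of a fully buffered body (`parseSseBody` and `StreamResult` are illustrative names, not this package's API):

```typescript
// Illustrative SSE chat-completion parsing, similar in spirit to the streaming
// client this diff introduces. Not the package's actual implementation.

interface StreamResult {
  content: string;
  model?: string;
  usage?: { prompt_tokens: number; completion_tokens: number };
}

// Parse a buffered SSE body from an OpenAI-compatible /chat/completions stream:
// each event is a "data: {json}" line; "data: [DONE]" marks the end.
function parseSseBody(body: string): StreamResult {
  const result: StreamResult = { content: "" };
  for (const raw of body.split("\n")) {
    const line = raw.trim();
    if (!line || line === "data: [DONE]") continue;
    if (!line.startsWith("data:")) continue;
    try {
      const chunk = JSON.parse(line.slice(5));
      if (chunk.model) result.model = chunk.model;
      const delta = chunk.choices?.[0]?.delta?.content;
      if (delta) result.content += delta;          // accumulate streamed text
      if (chunk.usage) result.usage = chunk.usage; // usually on the final chunk
    } catch {
      // tolerate partial or malformed lines, as a lenient client would
    }
  }
  return result;
}

// Example stream shaped like what an OpenAI-compatible server sends (abridged):
const body = [
  'data: {"model":"qwen2.5-7b","choices":[{"delta":{"content":"Hello"}}]}',
  'data: {"choices":[{"delta":{"content":" world"}}]}',
  'data: {"choices":[{"delta":{}}],"usage":{"prompt_tokens":12,"completion_tokens":2}}',
  "data: [DONE]",
].join("\n");

const parsed = parseSseBody(body);
console.log(parsed.content); // "Hello world"
```

A real client reads the response body incrementally (the diff's timeouts guard exactly that read loop), but the per-line handling is the same as above.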
package/dist/index.js.map
CHANGED