@houtini/lm 2.0.1 → 2.3.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/README.md +170 -51
- package/dist/index.d.ts +1 -1
- package/dist/index.js +317 -56
- package/dist/index.js.map +1 -1
- package/package.json +1 -1
package/README.md
CHANGED
@@ -3,32 +3,23 @@
 [](https://www.npmjs.com/package/@houtini/lm)
 [](https://opensource.org/licenses/MIT)
 
-MCP server that connects Claude to
+An MCP server that connects Claude to any OpenAI-compatible LLM - LM Studio, Ollama, vLLM, llama.cpp, whatever you've got running locally.
 
-
+The idea's simple. Claude's brilliant at orchestration and reasoning, but you're burning tokens on stuff a local model handles just fine. Boilerplate, code review, summarisation, classification - hand it off. Claude keeps working on the hard stuff while your local model chews through the grunt work. Free, parallel, no API keys.
 
-##
+## Quick start
 
-
-
-**Common use cases:**
-
-- Classify or tag hundreds of items without burning Claude tokens
-- Extract structured data from long documents
-- Run a second opinion on generated code
-- Summarise research before Claude synthesises it
-
-## Install
-
-### Claude Code (recommended)
+### Claude Code
 
 ```bash
-claude mcp add houtini-lm
+claude mcp add houtini-lm -- npx -y @houtini/lm
 ```
 
+That's it. If LM Studio's running on `localhost:1234` (the default), Claude can start delegating straight away.
+
 ### Claude Desktop
 
-
+Drop this into your `claude_desktop_config.json`:
 
 ```json
 {
@@ -44,64 +35,198 @@ Add to `claude_desktop_config.json`:
 }
 ```
 
-###
+### LLM on a different machine
+
+If you've got a GPU box on your network (I run mine on a separate machine called hopper), point the URL at it:
 
 ```bash
-npx @houtini/lm
+claude mcp add houtini-lm -e LM_STUDIO_URL=http://192.168.1.50:1234 -- npx -y @houtini/lm
 ```
 
-##
+## What's it good for?
 
-
+Real examples you can throw at it right now.
 
-
-
-
-
-
+**Explain something you just read**
+```
+"Explain what this function does in 2-3 sentences."
++ paste the function
+```
+
+**Second opinion on generated code**
+```
+"Find bugs in this TypeScript module. Return a JSON array of {line, issue, fix}."
++ paste the module
+```
+
+**Draft a commit message**
+```
+"Write a concise commit message for this diff. One line summary, then bullet points."
++ paste the diff
+```
+
+**Generate boilerplate**
+```
+"Write a Jest test file for this React component. Cover the happy path and one error case."
++ paste the component
+```
+
+**Extract structured data**
+```
+"Extract all API endpoints from this Express router. Return as JSON: {method, path, handler}."
++ paste the router file
+```
+
+**Translate formats**
+```
+"Convert this JSON config to YAML. Return only the YAML, no explanation."
++ paste the JSON
+```
+
+**Brainstorm before committing to an approach**
+```
+"I need to add caching to this API client. List 3 approaches with trade-offs. Be brief."
++ paste the client code
+```
 
 ## Tools
 
 ### `chat`
 
-Send a
+The workhorse. Send a task, get an answer. Optional system persona if you want to steer the model's perspective.
 
+| Parameter | Required | Default | What it does |
+|-----------|----------|---------|-------------|
+| `message` | yes | - | The task. Be specific about output format. |
+| `system` | no | - | Persona - "Senior TypeScript dev", not "helpful assistant" |
+| `temperature` | no | 0.3 | 0.1 for code, 0.3 for analysis, 0.7 for creative |
+| `max_tokens` | no | 2048 | Lower for quick answers, higher for generation |
+
+**Quick factual question:**
+```json
+{
+  "message": "What HTTP status code means 'too many requests'? Just the number and name.",
+  "max_tokens": 50
+}
 ```
-
-
-
-
+
+**Code explanation with persona:**
+```json
+{
+  "message": "Explain this function. What does it do, what are the edge cases?\n\n```ts\nfunction debounce(fn, ms) { ... }\n```",
+  "system": "Senior TypeScript developer"
+}
 ```
 
 ### `custom_prompt`
 
-
+Three-part prompt: system, context, instruction. Keeping them separate stops context bleed - you'll get better results than stuffing everything into one message, especially with smaller models.
 
+| Parameter | Required | Default | What it does |
+|-----------|----------|---------|-------------|
+| `instruction` | yes | - | What to produce. Under 50 words works best. |
+| `system` | no | - | Persona, specific and under 30 words |
+| `context` | no | - | Complete data to analyse. Never truncate. |
+| `temperature` | no | 0.3 | 0.1 for review, 0.3 for analysis |
+| `max_tokens` | no | 2048 | Match to expected output length |
+
+**Code review:**
+```json
+{
+  "system": "Expert Node.js developer focused on error handling and edge cases.",
+  "context": "< full source code here >",
+  "instruction": "List the top 3 bugs as bullet points. For each: line number, what's wrong, how to fix it."
+}
+```
+
+**Compare two implementations:**
+```json
+{
+  "system": "Performance-focused Python developer.",
+  "context": "Implementation A:\n...\n\nImplementation B:\n...",
+  "instruction": "Which is faster for 10k+ items? Why? One paragraph."
+}
 ```
-
-
-
-temperature
-
+
+### `code_task`
+
+Built specifically for code analysis. Wraps your request with an optimised code-review system prompt and drops the temperature to 0.2 so the model stays focused.
+
+| Parameter | Required | Default | What it does |
+|-----------|----------|---------|-------------|
+| `code` | yes | - | Complete source code. Never truncate. |
+| `task` | yes | - | "Find bugs", "Explain this", "Write tests" |
+| `language` | no | - | "typescript", "python", "rust", etc. |
+| `max_tokens` | no | 2048 | Match to expected output length |
+
+```json
+{
+  "code": "< full source file >",
+  "task": "Find bugs and suggest improvements. Reference line numbers.",
+  "language": "typescript"
+}
 ```
 
+### `discover`
+
+Checks if the local LLM's online. Returns the model name, context window size, and response latency. Typically under a second, or an offline status within 5 seconds if the host isn't reachable.
+
+No parameters. Call it before delegating if you're not sure the LLM's available.
+
 ### `list_models`
 
-
+Lists everything loaded on the LLM server with context window sizes.
+
+## How it works
 
-
+```
+Claude ──MCP──> houtini-lm ──HTTP/SSE──> LM Studio (or any OpenAI-compatible API)
+                    │
+                    ├─ Streaming: tokens arrive incrementally via SSE
+                    ├─ Soft timeout: returns partial results at 55s
+                    └─ Graceful failure: returns "offline" if host unreachable
+```
 
-
+All inference calls use Server-Sent Events streaming (since v2.3.0). In practice, this means:
+
+- Tokens arrive as they're generated, keeping the connection alive
+- If generation takes longer than 55 seconds, you get a partial result instead of a timeout error - the footer shows `⚠ TRUNCATED` when this happens
+- If the host is off or unreachable, you get a clean "offline" message within 5 seconds instead of hanging
+
+The 55-second soft timeout exists because the MCP SDK has a hard ~60s client-side timeout. Without streaming, any response that took longer than 60 seconds just vanished. Now you get whatever the model managed to generate before the deadline.
+
+## Configuration
+
+| Variable | Default | What it does |
+|----------|---------|-------------|
+| `LM_STUDIO_URL` | `http://localhost:1234` | Base URL of the OpenAI-compatible API |
+| `LM_STUDIO_MODEL` | *(auto-detect)* | Model identifier - leave blank to use whatever's loaded |
+| `LM_STUDIO_PASSWORD` | *(none)* | Bearer token for authenticated endpoints |
+| `LM_CONTEXT_WINDOW` | `100000` | Fallback context window if the API doesn't report it |
+
+## Getting good results
+
+**Send complete code.** Local models hallucinate details when you give them truncated input. If a file's too large, send the relevant function - not a snippet with `...` in the middle.
+
+**Be explicit about output format.** "Return a JSON array" or "respond in bullet points" - don't leave it open-ended. Smaller models especially need this.
+
+**One call at a time.** If your LLM server runs a single model, parallel calls queue up and stack timeouts. Send them sequentially.
+
+**Match max_tokens to expected output.** 200 for quick answers, 500 for explanations, 2048 for code generation. Lower values mean faster responses.
+
+**Set a specific persona.** "Expert Rust developer who cares about memory safety" gets noticeably better results than "helpful assistant" (or no persona at all).
 
 ## Compatible endpoints
 
-
-
+Works with anything that speaks the OpenAI `/v1/chat/completions` API:
+
+| What | URL | Notes |
+|------|-----|-------|
 | [LM Studio](https://lmstudio.ai) | `http://localhost:1234` | Default, zero config |
-| [Ollama](https://ollama.com) | `http://localhost:11434` |
+| [Ollama](https://ollama.com) | `http://localhost:11434` | Set `LM_STUDIO_URL` |
 | [vLLM](https://docs.vllm.ai) | `http://localhost:8000` | Native OpenAI API |
 | [llama.cpp](https://github.com/ggml-org/llama.cpp) | `http://localhost:8080` | Server mode |
-
+| Any OpenAI-compatible API | Any URL | Set URL + password |
 
 ## Development
 
@@ -112,12 +237,6 @@ npm install
 npm run build
 ```
 
-
-
-```bash
-node test.mjs
-```
-
-## License
+## Licence
 
 MIT
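The streaming behaviour the README describes boils down to parsing OpenAI-style `data:` lines off an SSE stream. As a standalone illustration of that per-line parsing step (a minimal sketch — `parseSseLines` is a hypothetical helper name, not part of the package's API):

```javascript
// Sketch of OpenAI-style SSE delta parsing, as described in the README above.
// Accumulates delta.content tokens and records the finish_reason, skipping
// blank lines, non-data lines, and the terminal "[DONE]" sentinel.
function parseSseLines(text) {
  let content = '';
  let finishReason = '';
  for (const raw of text.split('\n')) {
    const line = raw.trim();
    // Only "data: {...}" payload lines matter; "[DONE]" ends the stream
    if (!line.startsWith('data: ') || line === 'data: [DONE]') continue;
    try {
      const json = JSON.parse(line.slice(6));
      const choice = json.choices?.[0];
      if (choice?.delta?.content) content += choice.delta.content;
      if (choice?.finish_reason) finishReason = choice.finish_reason;
    } catch {
      // Ignore unparseable chunks; a real client buffers partial lines
    }
  }
  return { content, finishReason };
}
```

A real client additionally keeps the trailing partial line in a buffer between network chunks, which is what the `buffer = lines.pop()` step in `dist/index.js` below handles.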
package/dist/index.d.ts
CHANGED
@@ -3,6 +3,6 @@
  * Houtini LM — MCP Server for Local LLMs via OpenAI-compatible API
  *
  * Connects to LM Studio (or any OpenAI-compatible endpoint) and exposes
- * chat, custom prompts, and model
+ * chat, custom prompts, code tasks, and model discovery as MCP tools.
  */
 export {};
package/dist/index.js
CHANGED
@@ -3,7 +3,7 @@
  * Houtini LM — MCP Server for Local LLMs via OpenAI-compatible API
  *
  * Connects to LM Studio (or any OpenAI-compatible endpoint) and exposes
- * chat, custom prompts, and model
+ * chat, custom prompts, code tasks, and model discovery as MCP tools.
  */
 import { Server } from '@modelcontextprotocol/sdk/server/index.js';
 import { StdioServerTransport } from '@modelcontextprotocol/sdk/server/stdio.js';
@@ -11,86 +11,296 @@ import { CallToolRequestSchema, ListToolsRequestSchema, } from '@modelcontextpro
 const LM_BASE_URL = process.env.LM_STUDIO_URL || 'http://localhost:1234';
 const LM_MODEL = process.env.LM_STUDIO_MODEL || '';
 const LM_PASSWORD = process.env.LM_STUDIO_PASSWORD || '';
-const DEFAULT_MAX_TOKENS =
+const DEFAULT_MAX_TOKENS = 2048;
 const DEFAULT_TEMPERATURE = 0.3;
+const CONNECT_TIMEOUT_MS = 5000;
+const INFERENCE_CONNECT_TIMEOUT_MS = 30_000; // generous connect timeout for inference
+const SOFT_TIMEOUT_MS = 55_000; // return partial results before MCP SDK ~60s timeout
+const READ_CHUNK_TIMEOUT_MS = 30_000; // max wait for a single SSE chunk
+const FALLBACK_CONTEXT_LENGTH = parseInt(process.env.LM_CONTEXT_WINDOW || '100000', 10);
 function apiHeaders() {
     const h = { 'Content-Type': 'application/json' };
     if (LM_PASSWORD)
         h['Authorization'] = `Bearer ${LM_PASSWORD}`;
     return h;
 }
-
+/**
+ * Fetch with a connect timeout so Claude doesn't hang when the host is offline.
+ */
+async function fetchWithTimeout(url, options, timeoutMs = CONNECT_TIMEOUT_MS) {
+    const controller = new AbortController();
+    const timer = setTimeout(() => controller.abort(), timeoutMs);
+    try {
+        return await fetch(url, { ...options, signal: controller.signal });
+    }
+    finally {
+        clearTimeout(timer);
+    }
+}
+/**
+ * Read from a stream with a per-chunk timeout.
+ * Prevents hanging forever if the LLM stalls mid-generation.
+ */
+async function timedRead(reader, timeoutMs) {
+    let timer;
+    const timeout = new Promise((resolve) => {
+        timer = setTimeout(() => resolve('timeout'), timeoutMs);
+    });
+    try {
+        return await Promise.race([reader.read(), timeout]);
+    }
+    finally {
+        clearTimeout(timer);
+    }
+}
+/**
+ * Streaming chat completion with soft timeout.
+ *
+ * Uses SSE streaming (`stream: true`) so tokens arrive incrementally.
+ * If we approach the MCP SDK's ~60s timeout (soft limit at 55s), we
+ * return whatever content we have so far with `truncated: true`.
+ * This means large code reviews return partial results instead of nothing.
+ */
+async function chatCompletionStreaming(messages, options = {}) {
     const body = {
         messages,
         temperature: options.temperature ?? DEFAULT_TEMPERATURE,
         max_tokens: options.maxTokens ?? DEFAULT_MAX_TOKENS,
-        stream:
+        stream: true,
     };
     if (options.model || LM_MODEL) {
         body.model = options.model || LM_MODEL;
     }
-    const
-
-        headers: apiHeaders(),
-        body: JSON.stringify(body),
-    });
+    const startTime = Date.now();
+    const res = await fetchWithTimeout(`${LM_BASE_URL}/v1/chat/completions`, { method: 'POST', headers: apiHeaders(), body: JSON.stringify(body) }, INFERENCE_CONNECT_TIMEOUT_MS);
     if (!res.ok) {
         const text = await res.text().catch(() => '');
         throw new Error(`LM Studio API error ${res.status}: ${text}`);
     }
-
+    if (!res.body) {
+        throw new Error('Response body is null — streaming not supported by endpoint');
+    }
+    const reader = res.body.getReader();
+    const decoder = new TextDecoder();
+    let content = '';
+    let model = '';
+    let usage;
+    let finishReason = '';
+    let truncated = false;
+    let buffer = '';
+    try {
+        while (true) {
+            // Check soft timeout before each read
+            const elapsed = Date.now() - startTime;
+            if (elapsed > SOFT_TIMEOUT_MS) {
+                truncated = true;
+                process.stderr.write(`[houtini-lm] Soft timeout at ${elapsed}ms, returning ${content.length} chars of partial content\n`);
+                break;
+            }
+            // Read with per-chunk timeout (handles stalled generation)
+            const remaining = SOFT_TIMEOUT_MS - elapsed;
+            const chunkTimeout = Math.min(READ_CHUNK_TIMEOUT_MS, remaining);
+            const result = await timedRead(reader, chunkTimeout);
+            if (result === 'timeout') {
+                truncated = true;
+                process.stderr.write(`[houtini-lm] Chunk read timeout, returning ${content.length} chars of partial content\n`);
+                break;
+            }
+            if (result.done)
+                break;
+            buffer += decoder.decode(result.value, { stream: true });
+            // Parse SSE lines
+            const lines = buffer.split('\n');
+            buffer = lines.pop() || ''; // Keep incomplete line in buffer
+            for (const line of lines) {
+                const trimmed = line.trim();
+                if (!trimmed || trimmed === 'data: [DONE]')
+                    continue;
+                if (!trimmed.startsWith('data: '))
+                    continue;
+                try {
+                    const json = JSON.parse(trimmed.slice(6));
+                    if (json.model)
+                        model = json.model;
+                    const delta = json.choices?.[0]?.delta;
+                    if (delta?.content)
+                        content += delta.content;
+                    const reason = json.choices?.[0]?.finish_reason;
+                    if (reason)
+                        finishReason = reason;
+                    // Some endpoints include usage in the final streaming chunk
+                    if (json.usage)
+                        usage = json.usage;
+                }
+                catch {
+                    // Skip unparseable chunks (partial JSON, comments, etc.)
+                }
+            }
+        }
+    }
+    finally {
+        // Release the reader — don't await cancel() as it can hang
+        reader.releaseLock();
+    }
+    return { content, model, usage, finishReason, truncated };
 }
-async function
-    const res = await
+async function listModelsRaw() {
+    const res = await fetchWithTimeout(`${LM_BASE_URL}/v1/models`, { headers: apiHeaders() });
     if (!res.ok)
         throw new Error(`Failed to list models: ${res.status}`);
     const data = (await res.json());
-    return data.data
+    return data.data;
+}
+function getContextLength(model) {
+    // LM Studio uses context_length, vLLM uses max_model_len, fall back to env/100k
+    return model.context_length ?? model.max_model_len ?? FALLBACK_CONTEXT_LENGTH;
+}
+/**
+ * Format a footer line for streaming results showing model, usage, and truncation status.
+ */
+function formatFooter(resp, extra) {
+    const parts = [];
+    if (resp.model)
+        parts.push(`Model: ${resp.model}`);
+    if (resp.usage)
+        parts.push(`Tokens: ${resp.usage.prompt_tokens}→${resp.usage.completion_tokens}`);
+    if (extra)
+        parts.push(extra);
+    if (resp.truncated)
+        parts.push('⚠ TRUNCATED (soft timeout — partial result)');
+    return parts.length > 0 ? `\n\n---\n${parts.join(' | ')}` : '';
 }
 // ── MCP Tool definitions ─────────────────────────────────────────────
 const TOOLS = [
     {
         name: 'chat',
-        description: 'Send a
+        description: 'Send a task to a local LLM running on a separate machine. This is a FREE, parallel worker — ' +
+            'use it to offload bounded work while you continue doing other things. The local LLM runs independently ' +
+            'and does not consume your tokens or rate limits.\n\n' +
+            'WHEN TO USE (delegate generously — it costs nothing):\n' +
+            '• Explain or summarise code/docs you just read\n' +
+            '• Generate boilerplate, test stubs, type definitions, mock data\n' +
+            '• Answer factual questions about languages, frameworks, APIs\n' +
+            '• Draft commit messages, PR descriptions, comments\n' +
+            '• Translate or reformat content (JSON↔YAML, snake_case↔camelCase)\n' +
+            '• Brainstorm approaches before you commit to one\n' +
+            '• Any self-contained subtask that does not need tool access\n\n' +
+            'RULES:\n' +
+            '(1) Always send COMPLETE code/context — never truncate, the local LLM cannot access files.\n' +
+            '(2) Be explicit about output format ("respond as a JSON array", "return only the function").\n' +
+            '(3) Call discover first if you are unsure whether the local LLM is online.\n\n' +
+            'The local model, context window, and speed vary — call the discover tool to check what is loaded.',
         inputSchema: {
             type: 'object',
             properties: {
-                message: {
-
-
-
+                message: {
+                    type: 'string',
+                    description: 'The task. Be specific about expected output format. Include COMPLETE code/context — never truncate.',
+                },
+                system: {
+                    type: 'string',
+                    description: 'Persona for the local LLM. Be specific: "Senior TypeScript dev" not "helpful assistant".',
+                },
+                temperature: {
+                    type: 'number',
+                    description: '0.1 for factual/code, 0.3 for analysis (default), 0.7 for creative. Stay under 0.5 for code.',
+                },
+                max_tokens: {
+                    type: 'number',
+                    description: 'Max response tokens. Default 2048. Use higher for code generation, lower for quick answers.',
+                },
             },
             required: ['message'],
         },
     },
     {
         name: 'custom_prompt',
-        description: '
+        description: 'Structured analysis via the local LLM with explicit system/context/instruction separation. ' +
+            'This 3-part format prevents context bleed and gets the best results from local models.\n\n' +
+            'WHEN TO USE:\n' +
+            '• Code review — paste full source, ask for bugs/improvements\n' +
+            '• Comparison — paste two implementations, ask which is better and why\n' +
+            '• Refactoring suggestions — paste code, ask for a cleaner version\n' +
+            '• Content analysis — paste text, ask for structure/tone/issues\n' +
+            '• Any task where separating context from instruction improves clarity\n\n' +
+            'System sets persona. Context provides COMPLETE data (never truncate). ' +
+            'Instruction states exactly what to produce.',
         inputSchema: {
             type: 'object',
             properties: {
-                system: {
-
-
-
-
+                system: {
+                    type: 'string',
+                    description: 'Persona. Be specific: "Expert Node.js developer focused on error handling and edge cases."',
+                },
+                context: {
+                    type: 'string',
+                    description: 'The COMPLETE data to analyse. Full source code, full logs, full text. NEVER truncate.',
+                },
+                instruction: {
+                    type: 'string',
+                    description: 'What to produce. Specify format: "List 3 bugs as bullet points" or "Return a JSON array of {line, issue, fix}".',
+                },
+                temperature: {
+                    type: 'number',
+                    description: '0.1 for bugs/review, 0.3 for analysis (default), 0.5 for suggestions.',
+                },
+                max_tokens: {
+                    type: 'number',
+                    description: 'Max response tokens. Default 2048.',
+                },
             },
             required: ['instruction'],
         },
     },
     {
-        name: '
-        description: '
+        name: 'code_task',
+        description: 'Send a code analysis task to the local LLM. Wraps the request with an optimised code-review system prompt.\n\n' +
+            'WHEN TO USE:\n' +
+            '• Explain what a function/class does\n' +
+            '• Find bugs or suggest improvements\n' +
+            '• Generate unit tests or type definitions for existing code\n' +
+            '• Add error handling, logging, or validation\n' +
+            '• Convert between languages or patterns\n\n' +
+            'Provide COMPLETE source code (the local LLM cannot read files) and a specific task.',
+        inputSchema: {
+            type: 'object',
+            properties: {
+                code: {
+                    type: 'string',
+                    description: 'COMPLETE source code. Never truncate. Include imports and full function bodies.',
+                },
+                task: {
+                    type: 'string',
+                    description: 'What to do: "Find bugs", "Explain this", "Add error handling to fetchData", "Write tests".',
+                },
+                language: {
+                    type: 'string',
+                    description: 'Programming language: "typescript", "python", "rust", etc.',
+                },
+                max_tokens: {
+                    type: 'number',
+                    description: 'Max response tokens. Default 2048.',
+                },
+            },
+            required: ['code', 'task'],
+        },
+    },
+    {
+        name: 'discover',
+        description: 'Check whether the local LLM is online and what model is loaded. Returns model name, context window size, ' +
+            'and response latency. Call this if you are unsure whether the local LLM is available before delegating work. ' +
+            'Fast — typically responds in under 1 second, or returns an offline status within 5 seconds if the host is unreachable.',
         inputSchema: { type: 'object', properties: {} },
     },
     {
-        name: '
-        description: '
+        name: 'list_models',
+        description: 'List all models currently loaded in the local LLM server, with context window sizes. ' +
+            'Use discover instead for a quick availability check.',
        inputSchema: { type: 'object', properties: {} },
     },
 ];
 // ── MCP Server ───────────────────────────────────────────────────────
-const server = new Server({ name: 'houtini-lm', version: '2.0
+const server = new Server({ name: 'houtini-lm', version: '2.3.0' }, { capabilities: { tools: {} } });
 server.setRequestHandler(ListToolsRequestSchema, async () => ({ tools: TOOLS }));
 server.setRequestHandler(CallToolRequestSchema, async (request) => {
     const { name, arguments: args } = request.params;
@@ -102,15 +312,12 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {
             if (system)
                 messages.push({ role: 'system', content: system });
             messages.push({ role: 'user', content: message });
-            const resp = await
+            const resp = await chatCompletionStreaming(messages, {
                 temperature,
                 maxTokens: max_tokens,
             });
-            const
-
-                ? `\n\n---\nModel: ${resp.model} | Tokens: ${resp.usage.prompt_tokens}→${resp.usage.completion_tokens}`
-                : '';
-            return { content: [{ type: 'text', text: reply + usage }] };
+            const footer = formatFooter(resp);
+            return { content: [{ type: 'text', text: resp.content + footer }] };
         }
         case 'custom_prompt': {
             const { system, context, instruction, temperature, max_tokens } = args;
@@ -121,38 +328,92 @@ server.setRequestHandler(CallToolRequestSchema, async (request) => {
|
|
|
121
328
|
if (context)
|
|
122
329
|
userContent = `Context:\n${context}\n\nInstruction:\n${instruction}`;
|
|
123
330
|
messages.push({ role: 'user', content: userContent });
|
|
124
|
-
const resp = await
|
|
331
|
+
const resp = await chatCompletionStreaming(messages, {
|
|
125
332
|
temperature,
|
|
126
333
|
maxTokens: max_tokens,
|
|
127
334
|
});
|
|
335
|
+
const footer = formatFooter(resp);
|
|
128
336
|
return {
|
|
129
|
-
content: [{ type: 'text', text: resp.
|
|
337
|
+
content: [{ type: 'text', text: resp.content + footer }],
|
|
130
338
|
};
|
|
131
339
|
}
|
|
132
|
-
case '
|
|
133
|
-
const
|
|
134
|
-
|
|
135
|
-
|
|
136
|
-
|
|
137
|
-
|
|
138
|
-
|
|
139
|
-
|
|
140
|
-
|
|
141
|
-
|
|
142
|
-
|
|
143
|
-
|
|
340
|
+
case 'code_task': {
|
|
341
|
+
const { code, task, language, max_tokens: codeMaxTokens } = args;
|
|
342
|
+
const lang = language || 'unknown';
|
|
343
|
+
const codeMessages = [
|
|
344
|
+
{
|
|
345
|
+
role: 'system',
|
|
346
|
+
content: `Expert ${lang} developer. Analyse the provided code and complete the task. Be specific — reference line numbers, function names, and concrete fixes. No preamble.`,
|
|
347
|
+
},
|
|
348
|
+
{
|
|
349
|
+
role: 'user',
|
|
350
|
+
content: `Task: ${task}\n\n\`\`\`${lang}\n${code}\n\`\`\``,
|
|
351
|
+
},
|
|
352
|
+
];
|
|
353
|
+
const codeResp = await chatCompletionStreaming(codeMessages, {
|
|
354
|
+
temperature: 0.2,
|
|
355
|
+
maxTokens: codeMaxTokens ?? DEFAULT_MAX_TOKENS,
|
|
356
|
+
});
|
|
357
|
+
const codeFooter = formatFooter(codeResp, lang);
|
|
358
|
+
return { content: [{ type: 'text', text: codeResp.content + codeFooter }] };
|
             }
-            case '
+            case 'discover': {
                 const start = Date.now();
-
+                let models;
+                try {
+                    models = await listModelsRaw();
+                }
+                catch (err) {
+                    const ms = Date.now() - start;
+                    const reason = err instanceof Error && err.name === 'AbortError'
+                        ? `Host unreachable (timed out after ${ms}ms)`
+                        : `Connection failed: ${err instanceof Error ? err.message : String(err)}`;
+                    return {
+                        content: [{
+                            type: 'text',
+                            text: `Status: OFFLINE\nEndpoint: ${LM_BASE_URL}\n${reason}\n\nThe local LLM is not available right now. Do not attempt to delegate tasks to it.`,
+                        }],
+                    };
+                }
                 const ms = Date.now() - start;
+                if (models.length === 0) {
+                    return {
+                        content: [{
+                            type: 'text',
+                            text: `Status: ONLINE (no model loaded)\nEndpoint: ${LM_BASE_URL}\nLatency: ${ms}ms\n\nThe server is running but no model is loaded. Ask the user to load a model in LM Studio.`,
+                        }],
+                    };
+                }
+                const lines = models.map((m) => {
+                    const ctx = getContextLength(m);
+                    return ` • ${m.id} (context: ${ctx.toLocaleString()} tokens)`;
+                });
+                const primary = models[0];
+                const ctx = getContextLength(primary);
                 return {
-                    content: [
-                        {
+                    content: [{
                         type: 'text',
-                        text: `
-
-
+                        text: `Status: ONLINE\n` +
+                            `Endpoint: ${LM_BASE_URL}\n` +
+                            `Latency: ${ms}ms\n` +
+                            `Model: ${primary.id}\n` +
+                            `Context window: ${ctx.toLocaleString()} tokens\n` +
+                            `\nLoaded models:\n${lines.join('\n')}\n\n` +
+                            `The local LLM is available. You can delegate tasks using chat, custom_prompt, or code_task.`,
+                    }],
+                };
+            }
+            case 'list_models': {
+                const models = await listModelsRaw();
+                if (!models.length) {
+                    return { content: [{ type: 'text', text: 'No models currently loaded.' }] };
+                }
+                const lines = models.map((m) => {
+                    const ctx = getContextLength(m);
+                    return ` • ${m.id}${ctx ? ` (context: ${ctx.toLocaleString()} tokens)` : ''}`;
+                });
+                return {
+                    content: [{ type: 'text', text: `Loaded models:\n${lines.join('\n')}` }],
                 };
             }
             default:
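The rebuilt `discover` handler above does two things: probe the server's model list under a timeout, then format a plain-text status report Claude can act on. Here is a standalone sketch of that pattern; `probeModels`, `formatDiscoverReport`, and the `contextLength` field are illustrative names, not the package's actual internals (the real handler uses `listModelsRaw` and `getContextLength`):

```javascript
// Sketch of the discover pattern: probe an OpenAI-compatible endpoint
// with a hard timeout, then render a plain-text status report.
// The function names and model shape here are illustrative assumptions.

async function probeModels(baseUrl, timeoutMs = 5000) {
  // AbortSignal.timeout (Node 18+) aborts the fetch if the host hangs,
  // which surfaces as an AbortError the caller can map to "OFFLINE".
  const res = await fetch(`${baseUrl}/v1/models`, {
    signal: AbortSignal.timeout(timeoutMs),
  });
  if (!res.ok) throw new Error(`Model list failed: ${res.status}`);
  const body = await res.json();
  return body.data; // OpenAI-compatible servers return { data: [...] }
}

function formatDiscoverReport(models, latencyMs, baseUrl) {
  if (models.length === 0) {
    return `Status: ONLINE (no model loaded)\nEndpoint: ${baseUrl}\nLatency: ${latencyMs}ms`;
  }
  const lines = models.map(
    (m) => ` • ${m.id} (context: ${m.contextLength.toLocaleString()} tokens)`,
  );
  return [
    'Status: ONLINE',
    `Endpoint: ${baseUrl}`,
    `Latency: ${latencyMs}ms`,
    `Model: ${models[0].id}`,
    '',
    'Loaded models:',
    ...lines,
  ].join('\n');
}
```

The shipped handler additionally distinguishes `AbortError` from other failures, so Claude gets an explicit OFFLINE signal with the elapsed time rather than a raw stack trace.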
package/dist/index.js.map
CHANGED

@@ -1 +1 @@
(source map regenerated for the rebuilt index.js; machine-generated VLQ mappings omitted)
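Beyond the handlers shown above, this release swaps the old buffered completion call for `chatCompletionStreaming`, which reads the response as Server-Sent Events and accumulates each chunk's delta text. A minimal sketch of that accumulation step, assuming OpenAI-style chunk framing (`data: {...}` lines ending with a `data: [DONE]` sentinel); the real implementation also tracks usage stats, watchdog timeouts, and partially received lines:

```javascript
// Minimal parser for OpenAI-style SSE chat chunks. Each event line
// looks like:
//   data: {"choices":[{"delta":{"content":"..."}}]}
// and the stream ends with `data: [DONE]`. Simplified sketch only.
function extractSseContent(raw) {
  let content = '';
  for (const line of raw.split('\n')) {
    const trimmed = line.trim();
    if (!trimmed.startsWith('data:')) continue; // skip blanks and comments
    const payload = trimmed.slice(5).trim();
    if (payload === '[DONE]') break; // end-of-stream sentinel
    try {
      const chunk = JSON.parse(payload);
      const delta = chunk.choices?.[0]?.delta?.content;
      if (delta) content += delta;
    } catch {
      // Ignore malformed or partially received JSON lines.
    }
  }
  return content;
}
```

Streaming matters here because a local model can take minutes on a long generation: the watchdog can distinguish "still producing tokens" from "hung" instead of sitting behind one blocking request.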