thoughtgear 0.1.0

package/README.md ADDED
@@ -0,0 +1,207 @@

# ThoughtGear

A small agent loop: send a prompt, the model can call tools, and the loop persists the transcript and state, iterating until the model is done. It is modeled (quite literally) on the OpenClaw source code.

For the full design walkthrough, see [PROMPT_HANDLER.md](./PROMPT_HANDLER.md).

## Install

```bash
npm install thoughtgear
```

## Quick start

```ts
import { PromptHandler } from "thoughtgear";

const handler = new PromptHandler({
  context: "You are a friendly assistant. Reply in one short sentence.",
  tools: [],
  model: { name: "gpt-4o-mini", provider: "openai", apiKey: process.env.OPENAI_API_KEY! },
  db: { type: "memory" },
  callbacks: {
    onPartialReply: (chunk) => process.stdout.write(chunk),
    onDone: () => console.log("\n[done]"),
  },
});

await handler.handlePrompt({ text: "Hello" });
```

That's the whole loop: no tools, the model streams a reply, and `onDone` fires.

## Adding a tool

A tool is a `{ key, description, content, handler }` object. The `handler` runs when the model calls it; its return string is sent back to the model.

```ts
const rollDice = {
  key: "roll_dice",
  description: "Roll a 6-sided die.",
  content: "Returns a random integer between 1 and 6.",
  handler: async () => String(Math.floor(Math.random() * 6) + 1),
};

const handler = new PromptHandler({
  context: "When the user asks for a dice roll, call the roll_dice tool.",
  tools: [rollDice],
  model: { name: "gpt-4o-mini", provider: "openai", apiKey: process.env.OPENAI_API_KEY! },
  db: { type: "memory" },
});

await handler.handlePrompt({ text: "Roll a die for me." });
```

The model calls `roll_dice`, the handler runs, the result is fed back, and the model replies with the number.

## Tool with typed arguments

Add a `parameters` JSON Schema so the model knows what to pass. The arguments arrive on `params`.

```ts
const addNumbers = {
  key: "add_numbers",
  description: "Add two numbers.",
  content: "Returns a + b.",
  parameters: {
    type: "object",
    properties: { a: { type: "number" }, b: { type: "number" } },
    required: ["a", "b"],
  },
  handler: async ({ params }) => {
    const { a, b } = params as { a: number; b: number };
    return String(a + b);
  },
};
```

## Chaining tools

The loop handles chaining automatically. Give the model multiple tools and it'll call them in sequence, feeding each result into the next call:

```ts
const handler = new PromptHandler({
  context: "Use the tools for every arithmetic step. Never compute in your head.",
  tools: [addNumbers, multiplyNumbers, squareRoot],
  model: { name: "gpt-4o-mini", provider: "openai", apiKey: process.env.OPENAI_API_KEY! },
  db: { type: "memory" },
});

await handler.handlePrompt({
  text: "Take 7 + 9, multiply by 12, then square root it. Give me the final answer.",
});
// → add_numbers(7,9)=16 → multiply_numbers(16,12)=192 → square_root(192)=13.856
```

## Sessions (multi-turn conversations)

By default every `handlePrompt` call is an **independent run** — the model has no memory of earlier calls. Pass a `sessionId` in the constructor to bind the handler to a session: every prompt, assistant reply, and tool result is persisted under that `sessionId`, and on the next call the model is fed the entire prior history.

```ts
const handler = new PromptHandler({
  sessionId: "user-42", // ← bind to a session
  context: "You are a helpful assistant.",
  tools: [],
  model: { name: "gpt-4o-mini", provider: "openai", apiKey: process.env.OPENAI_API_KEY! },
  db: { type: "memory" },
});

await handler.handlePrompt({ text: "My name is Diego." });
await handler.handlePrompt({ text: "What's my name?" });
// → the model sees the prior exchange and answers "Diego".
```

Notes:

- The `sessionId` is just a string you choose (e.g. a user ID, a chat thread ID, a UUID).
- Without a `sessionId`, each `handlePrompt` is isolated — the model only sees that single prompt and the loop's own tool calls.
- Session history is loaded via `orm.getSessionHistory(sessionId)`. The in-memory adapter implements this; the Mongo / SQL adapters are stubs (a one-liner query you fill in).
- Resume a session in a different process by passing the same `sessionId` and pointing at the same persistent DB (e.g. `db: { type: "mongodb", ... }`).
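The effect of binding to a session can be sketched without a model or a database. A dependency-free simulation of the history the model is fed in each mode (the `Msg` shape and the `historyFor` helper are illustrative, not part of the library):

```typescript
// Trimmed stand-in for the library's Message type (illustrative only).
type Msg = { runId: string; sessionId?: string; text: string };

// Three messages persisted across two runs of the same session.
const stored: Msg[] = [
  { runId: "run-1", sessionId: "user-42", text: "My name is Diego." },
  { runId: "run-1", sessionId: "user-42", text: "Nice to meet you, Diego." },
  { runId: "run-2", sessionId: "user-42", text: "What's my name?" },
];

// With a sessionId, the model sees every message in the session;
// without one, it only sees the current run.
function historyFor(runId: string, sessionId?: string): Msg[] {
  return sessionId
    ? stored.filter((m) => m.sessionId === sessionId) // session mode
    : stored.filter((m) => m.runId === runId);        // isolated mode
}

console.log(historyFor("run-2", "user-42").length); // 3 (prior turns included)
console.log(historyFor("run-2").length);            // 1 (only the new prompt)
```

This is why the second `handlePrompt` above can answer "Diego": in session mode the filter spans runs, not just the current one.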

## Streaming callbacks

Subscribe to whichever you need:

```ts
callbacks: {
  onPartialReply: (chunk, runId) => { /* text tokens as they stream */ },
  onToolStart: (call, runId) => { /* model invoked a tool */ },
  onToolResult: (res, runId) => { /* handler returned */ },
  onDone: (runId) => { /* run finished */ },
}
```

## Persistence: `orm` or `db`

`PromptHandler` accepts **either** a pre-built `orm` **or** raw `db` settings. The `db` form is just a shortcut: internally the handler constructs an `ORM` from your `DbConfig` and picks the matching adapter (`memory` / `mongodb` / `sql`).

### Shortcut: pass `db` settings

```ts
const handler = new PromptHandler({
  context: "You are a helpful assistant.",
  tools: [],
  model: { name: "gpt-4o-mini", provider: "openai", apiKey: process.env.OPENAI_API_KEY! },
  db: { type: "mongodb", uri: "mongodb://localhost:27017", database: "thoughtgear" },
});

const { runId } = await handler.handlePrompt({ text: "Hello" });
```

Other supported `db` shapes:

```ts
db: { type: "memory" }
db: { type: "mongodb", uri: "...", database: "..." }
db: { type: "sql", dialect: "postgres", uri: "..." }
```

### Bring your own ORM

Use this when you want to share one ORM across multiple handlers, or when you need to read the transcript back yourself:

```ts
import { ORM, PromptHandler } from "thoughtgear";

const orm = new ORM({
  type: "mongodb",
  uri: "mongodb://localhost:27017",
  database: "thoughtgear",
});

const handler = new PromptHandler({
  context: "You are a helpful assistant.",
  tools: [],
  model: { name: "gpt-4o-mini", provider: "openai", apiKey: process.env.OPENAI_API_KEY! },
  orm,
});

const { runId } = await handler.handlePrompt({ text: "Hello" });

// Later — even from a different process — read the transcript back:
const history = await orm.getHistory(runId);
const state = await orm.getRunState(runId);
```

Mongo collections used: `messages`, `run_states`, `cache`, `memory`. The Mongo and SQL adapters are stubbed in `src/classes/PromptHandler.ts` — fill in the nine `OrmAdapter` methods using the `mongodb` / `pg` / `kysely` drivers to make them live.
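As a starting point for those stubs, here is a hedged sketch of the three message methods. An in-memory array stands in for the `messages` collection so the sketch runs standalone; the comments show the shape of the mongodb-driver one-liners you would substitute, and the `SketchMongoAdapter` name and trimmed `Message` type are assumptions for illustration:

```typescript
// Trimmed Message shape; see the package's type declarations for the full one.
type Message = { id: string; runId: string; sessionId?: string; createdAt: Date };

class SketchMongoAdapter {
  // Stand-in for db.collection("messages").
  private messages: Message[] = [];

  async saveMessage(msg: Message): Promise<void> {
    // Real driver: await db.collection("messages").insertOne(msg);
    this.messages.push(msg);
  }

  async getHistory(runId: string): Promise<Message[]> {
    // Real driver: db.collection("messages").find({ runId }).sort({ createdAt: 1 }).toArray()
    return this.messages
      .filter((m) => m.runId === runId)
      .sort((a, b) => a.createdAt.getTime() - b.createdAt.getTime());
  }

  async getSessionHistory(sessionId: string): Promise<Message[]> {
    // Real driver: the same find/sort query, filtered on { sessionId } instead.
    return this.messages
      .filter((m) => m.sessionId === sessionId)
      .sort((a, b) => a.createdAt.getTime() - b.createdAt.getTime());
  }
}
```

The remaining methods (`saveRunState`, `getRunState`, `cacheGet`, `cacheSet`, `saveMemory`, `getMemory`) follow the same pattern against their own collections.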

## Switching providers

Just change `model.provider`:

```ts
model: { name: "claude-opus-4-7", provider: "anthropic", apiKey: "..." }
model: { name: "gemini-2.5-pro", provider: "google", apiKey: "..." }
model: { name: "gpt-4o-mini", provider: "openai", apiKey: "..." }
model: { name: "mock", provider: "mock", apiKey: "" } // for tests
```

## Running the tests

```bash
# Put OPENAI_API_KEY in tests/.env
npm test
```

The test suite covers a plain greeting, a single-tool call, and a 3-tool chain.
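Tool handlers themselves are plain functions, so the cheapest checks need no model or API key at all: call the handler directly. A small sketch reusing the `roll_dice` tool from above (an extra illustration, not part of the shipped suite):

```typescript
// The roll_dice tool from earlier; its handler is just an async function.
const rollDice = {
  key: "roll_dice",
  description: "Roll a 6-sided die.",
  content: "Returns a random integer between 1 and 6.",
  handler: async () => String(Math.floor(Math.random() * 6) + 1),
};

// Exercise the handler directly, with no model in the loop.
const result = await rollDice.handler();
const n = Number(result);
console.log(Number.isInteger(n) && n >= 1 && n <= 6); // true
```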
@@ -0,0 +1,263 @@
/**
 * PromptHandler — minimal emulation of openclaw's "chain of thoughts" loop.
 *
 * Shape mirrors src/agents/pi-embedded-runner:
 *   handlePrompt() persists the inbound message + a RunState, then asks the
 *   Executor to drive the loop. Each iteration (continueRun) is stateless:
 *   it rebuilds the prompt from ORM history, opens a model stream, executes
 *   any tool calls, persists results, and either finishes or schedules the
 *   next iteration. In Lambda mode the next iteration is a fresh invocation.
 */
export type Tool = {
    key: string;
    description: string;
    /** Skill body / instructions returned when the model invokes the tool. */
    content: string;
    /**
     * Optional JSONSchema describing the tool's input. Sent to the model so it
     * knows what arguments to pass. Defaults to an open object schema.
     */
    parameters?: Record<string, unknown>;
    /**
     * Optional function-style handler. If provided, its return value is sent
     * back to the model as the tool_result instead of `content`. The skill's
     * `content` is still rendered into the system prompt so the model knows
     * what the tool does.
     *
     * Receives an object with a `params` attribute (the arguments produced by
     * the model). Wrapped in an object so we can inject additional context
     * later (e.g. runId, callbacks, abort signal) without breaking handlers.
     */
    handler?: (args: ToolHandlerArgs) => Promise<string> | string;
};
export type ToolHandlerArgs = {
    /** Arguments the model passed when invoking this tool. */
    params: unknown;
};
export type ModelConfig = {
    name: string;
    provider: "anthropic" | "openai" | "google" | "mock";
    apiKey: string;
    baseUrl?: string;
};
export type FileAttachment = {
    name: string;
    mimeType: string;
    /** base64-encoded bytes */
    data: string;
};
export type ContentBlock = {
    type: "text";
    text: string;
} | {
    type: "reasoning";
    text: string;
} | {
    type: "tool_call";
    id: string;
    name: string;
    input: unknown;
} | {
    type: "tool_result";
    toolCallId: string;
    output: string;
    isError?: boolean;
} | {
    type: "file";
    file: FileAttachment;
};
export type Role = "system" | "user" | "assistant" | "tool";
export type Message = {
    id: string;
    runId: string;
    /** Optional session this run belongs to. When set, history is loaded
     * across all runs in the same session instead of a single run. */
    sessionId?: string;
    role: Role;
    blocks: ContentBlock[];
    createdAt: Date;
    iteration: number;
};
export type RunStatus = "pending" | "streaming" | "awaiting_tools" | "compacting" | "done" | "failed";
export type RunState = {
    runId: string;
    sessionId?: string;
    status: RunStatus;
    iteration: number;
    /** Last stop reason from the model: end_turn | tool_use | max_tokens | error */
    lastStopReason?: string;
    lastError?: string;
    createdAt: Date;
    updatedAt: Date;
};
export type StreamCallbacks = {
    onPartialReply?: (chunk: string, runId: string) => void;
    onReasoningStream?: (chunk: string, runId: string) => void;
    onBlockReply?: (block: ContentBlock, runId: string) => void;
    onToolStart?: (call: Extract<ContentBlock, {
        type: "tool_call";
    }>, runId: string) => void;
    onToolResult?: (result: Extract<ContentBlock, {
        type: "tool_result";
    }>, runId: string) => void;
    onDone?: (runId: string) => void;
};
export type DbConfig = {
    type: "memory";
} | {
    type: "mongodb";
    uri: string;
    database: string;
} | {
    type: "sql";
    dialect: "postgres" | "mysql" | "sqlite";
    uri: string;
};
export interface OrmAdapter {
    saveMessage(msg: Message): Promise<void>;
    getHistory(runId: string): Promise<Message[]>;
    /** Returns every message persisted under a sessionId, in createdAt order,
     * across all runs in that session. Used to feed prior turns back into the
     * model when the handler was constructed with a sessionId. */
    getSessionHistory(sessionId: string): Promise<Message[]>;
    saveRunState(state: RunState): Promise<void>;
    getRunState(runId: string): Promise<RunState | null>;
    cacheGet<T>(key: string): Promise<T | null>;
    cacheSet<T>(key: string, value: T, ttlSeconds?: number): Promise<void>;
    saveMemory(scope: string, key: string, value: unknown): Promise<void>;
    getMemory<T = unknown>(scope: string, key: string): Promise<T | null>;
}
export declare class ORM implements OrmAdapter {
    readonly config: DbConfig;
    private adapter;
    constructor(config: DbConfig);
    saveMessage(msg: Message): Promise<void>;
    getHistory(runId: string): Promise<Message[]>;
    getSessionHistory(sessionId: string): Promise<Message[]>;
    saveRunState(state: RunState): Promise<void>;
    getRunState(runId: string): Promise<RunState | null>;
    cacheGet<T>(key: string): Promise<T | null>;
    cacheSet<T>(key: string, value: T, ttl?: number): Promise<void>;
    saveMemory(scope: string, key: string, value: unknown): Promise<void>;
    getMemory<T = unknown>(scope: string, key: string): Promise<T | null>;
}
export type StreamResult = {
    blocks: ContentBlock[];
    stopReason: "end_turn" | "tool_use" | "max_tokens" | "error";
};
export interface LLMProvider {
    stream(args: {
        system: string;
        messages: Message[];
        tools: Tool[];
        callbacks?: StreamCallbacks;
        runId: string;
    }): Promise<StreamResult>;
}
export declare function createLLMProvider(model: ModelConfig): LLMProvider;
export interface Executor {
    scheduleNextIteration(runId: string): Promise<void>;
}
export declare class LocalExecutor implements Executor {
    private handler;
    bind(handler: PromptHandler): void;
    scheduleNextIteration(runId: string): Promise<void>;
}
export type LambdaInvoker = (payload: {
    runId: string;
    action: "continue";
}) => Promise<void>;
export declare class LambdaExecutor implements Executor {
    private invoke;
    constructor(invoke: LambdaInvoker);
    scheduleNextIteration(runId: string): Promise<void>;
}
export type PromptHandlerOptions = {
    context: string;
    tools: Tool[];
    model: ModelConfig;
    /**
     * Optional session this handler is bound to. When set, every prompt
     * handled by this instance is persisted under the sessionId and prior
     * turns are fed back as context on the next `handlePrompt` call.
     * When omitted, each call is an independent run with no shared history.
     */
    sessionId?: string;
    executor?: Executor;
    callbacks?: StreamCallbacks;
    /** Hard ceiling on iterations per run; mirrors retry-limit.ts. */
    maxIterations?: number;
    /** Token/char budget that triggers a compaction pass. */
    compactionCharThreshold?: number;
} & ({
    /** Pre-built ORM instance. Use this when you want to share an ORM
     * across handlers or have already wired adapters. */
    orm: ORM;
    db?: never;
} | {
    /** DB settings — the handler constructs the ORM for you using the
     * matching adapter (memory / mongodb / sql). */
    db: DbConfig;
    orm?: never;
});
export declare class PromptHandler {
    private readonly context;
    private readonly tools;
    private readonly model;
    private readonly orm;
    private readonly executor;
    private readonly callbacks;
    private readonly maxIterations;
    private readonly compactionCharThreshold;
    private readonly llm;
    /** Session this handler is bound to. Undefined means "no session" — each
     * prompt is an independent run with no shared history. */
    readonly sessionId?: string;
    constructor(opts: PromptHandlerOptions);
    /**
     * Entry point. Persists the inbound user message + initial RunState,
     * then hands control to the Executor to drive the first iteration.
     */
    handlePrompt(input: {
        text: string;
        files?: FileAttachment[];
    }): Promise<{
        runId: string;
    }>;
    /**
     * One iteration of the loop. Public so a Lambda entry-point can route
     * { action: 'continue', runId } straight here.
     *
     * Phases — mirrors run/attempt.ts:
     *   1. Load state + history.
     *   2. Compact if oversized.
     *   3. Build system prompt + payload.
     *   4. Open model stream; collect blocks.
     *   5. Persist assistant message.
     *   6. If tool_use: execute tools, persist tool_result messages,
     *      schedule next iteration. Else: finalize.
     */
    continueRun(runId: string): Promise<void>;
    private buildSystemPrompt;
    private executeTool;
    /**
     * Cheap stand-in for pi-embedded-runner compaction. If the transcript is
     * over the char budget, summarize older messages into a single system note
     * and keep the tail. Real implementation would call the model to summarize.
     */
    private maybeCompact;
    private updateState;
    private fail;
}
export type LambdaEvent = {
    action: "start";
    text: string;
    files?: FileAttachment[];
} | {
    action: "continue";
    runId: string;
};
export declare function makeLambdaHandler(handler: PromptHandler): (event: LambdaEvent) => Promise<{
    runId: string;
} | undefined>;
//# sourceMappingURL=PromptHandler.d.ts.map