@openfluke/welvet 0.1.8 → 0.3.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
package/README.md CHANGED
@@ -1,18 +1,57 @@
1
- # @openfluke/welvet
2
-
3
- Isomorphic TypeScript/JavaScript wrapper for the LOOM WebAssembly neural network framework.
4
-
5
- ## Features
6
-
7
- - 🎉 **NEW: Simple API** - Streamlined functions with cross-platform consistency
8
- - 🚀 **Isomorphic WASM Wrapper** - Works in Node.js and browser with same API
9
- - 🔄 **Mirrors main.go** - Direct 1:1 mapping to WASM exports
10
- - 🎯 **Type-Safe** - Full TypeScript type definitions for all Network methods
11
- - 🤖 **Multi-Agent Networks** - Grid scatter architecture for heterogeneous agents
12
- - 📦 **JSON Configuration** - Build networks from simple JSON configs
13
- - ⚡ **Fast Training** - Optimized training with configurable parameters
14
- - 💾 **Model Persistence** - Save and load trained models as JSON
15
- - ✅ **Cross-Platform Consistency** - Same API as Python, C#, C, WASM
1
+ # @openfluke/welvet - LOOM TypeScript/WASM Bindings
2
+
3
+ **Wrapper for Embedding Loom Via External (WASM) Toolchain**
4
+
5
+ High-performance neural network library with **full training in browser/Node.js** via WebAssembly. Zero external dependencies: just import and go.
6
+
7
+ > **v0.3.0 Update**: Now includes a **Universal Test Suite** (2298 tests) with 100% parity across Browser and Node.js environments.
8
+
9
+
10
+ ## Framework Comparison
11
+
12
+ | Feature Category | Feature | **Loom/welvet** | **TensorFlow.js** | **Brain.js** | **ONNX.js** | **Candle (WASM)** |
13
+ | :--- | :--- | :---: | :---: | :---: | :---: | :---: |
14
+ | **Core** | **Runtime** | WASM (Pure Go) | JS + WebGL | Pure JS | WASM | WASM (Rust) |
15
+ | | **Runtime Dependency** | **None** | Heavy | Light | Light | None |
16
+ | **Loading** | **Safetensors** | ✅ **Native** | ❌ | ❌ | ❌ | ✅ |
17
+ | | **Structure Inference** | ✅ **Auto-Detect** | ❌ | ❌ | ❌ | ❌ |
18
+ | **Training** | **Browser Training** | ✅ **Full** | ✅ (Slow) | ✅ | ❌ | ✅ |
19
+ | | **Neural Tweening** | ✅ **Hybrid Engine** | ❌ | ❌ | ❌ | ❌ |
20
+ | | **LR Schedulers** | ✅ **7 Types** | ✅ | ⚠️ | ❌ | ✅ |
21
+ | **Layer Support** | **Dense (MLP)** | ✅ | ✅ | ✅ | ✅ | ✅ |
22
+ | | **Conv1D/2D** | ✅ **Native** | ✅ | ❌ | ✅ | ✅ |
23
+ | | **RNN / LSTM** | ✅ **Full Gate** | ✅ | ✅ | ✅ | ✅ |
24
+ | | **Transformer (MHA)** | ✅ (Explicit) | ✅ | ❌ | ✅ | ✅ |
25
+ | | **Parallel / MoE** | ✅ **Structure** | ❌ (Manual) | ❌ | ❌ | ❌ |
26
+ | | **Sequential Layers** | ✅ **Native** | ⚠️ | ⚠️ | ❌ | ⚠️ |
27
+ | **Advanced** | **Step-Based Forward** | ✅ **Unique** | ❌ | ❌ | ❌ | ❌ |
28
+ | | **Stitch Layers** | ✅ **Native** | ❌ | ❌ | ❌ | ❌ |
29
+ | | **K-Means / Stats** | ✅ **Parallel** | ❌ | ❌ | ❌ | ❌ |
30
+ | | **Cross-Lang ABI** | ✅ **Universal** | ❌ | ❌ | ❌ | ⚠️ |
31
+ | **Streaming** | **LLM Streaming** | ✅ | ✅ | ❌ | ❌ | ✅ |
32
+ | | **Pure Go Tokenizer** | ✅ | ❌ | ❌ | ❌ | ❌ |
33
+
34
+ For a detailed comparison, see [`docs/loom_assessment_comparison.md`](../docs/loom_assessment_comparison.md).
35
+
36
+ ## 🌍 Cross-Ecosystem Compatibility
37
+
38
+ Models trained in TypeScript can be loaded in Python, C#, Go, or other WASM environments, with **bit-for-bit identical results** across all platforms.
39
+
40
+ | Platform | Package | Install |
41
+ |:---------|:--------|:--------|
42
+ | **TypeScript/Node** | [NPM](https://www.npmjs.com/package/@openfluke/welvet) | `npm install @openfluke/welvet` |
43
+ | **Python** | [PyPI](https://pypi.org/project/welvet/) | `pip install welvet` |
44
+ | **C#/.NET** | [NuGet](https://www.nuget.org/packages/Welvet) | `dotnet add package Welvet` |
45
+ | **Go** | [GitHub](https://github.com/openfluke/loom) | `go get github.com/openfluke/loom` |
46
+
47
+ ### Key Strengths
48
+
49
+ - **True Embeddability**: Single WASM binary. Works in Node.js and browsers with the same API.
50
+ - **Hybrid Gradient/Geometric Engine**: "Neural Tweening" combines geometric gap-closing with backpropagation-guided momentum.
51
+ - **Structural Parallelism**: Native support for Inception, ResNeXt, Siamese, and MoE architectures via `LayerParallel`.
52
+ - **Native Mixed-Precision**: Generic tensor backend supports `int8`, `uint16`, and `float32`.
53
+ - **Complete Evaluation Suite**: Deviation metrics, training milestones, and adaptation tracking.
54
+ - **Network Telemetry**: Runtime introspection with `GetMethodsJSON()` and `ExtractNetworkBlueprint()`.
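Assuming the telemetry methods named above follow the same JSON-array calling convention as the rest of the Network API (see the API Reference below), a minimal sketch looks like this; the exact shape of the returned JSON is not documented here:

```typescript
import { init, createNetworkFromJSON } from "@openfluke/welvet";

await init();
const net = createNetworkFromJSON(JSON.stringify({
  batch_size: 1,
  layers: [{ type: "dense", input_size: 4, output_size: 2, activation: "relu" }],
}));

// Cast to any because reflectively exposed methods may not appear on the typed interface.
const [methods] = JSON.parse((net as any).GetMethodsJSON(JSON.stringify([])));
const [blueprint] = JSON.parse((net as any).ExtractNetworkBlueprint(JSON.stringify([])));
console.log("Runtime methods:", methods);
console.log("Blueprint:", blueprint);
```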
16
55
 
17
56
  ## Installation
18
57
 
@@ -20,18 +59,19 @@ Isomorphic TypeScript/JavaScript wrapper for the LOOM WebAssembly neural network
20
59
  npm install @openfluke/welvet
21
60
  ```
22
61
 
62
+ **Using Bun:**
63
+ ```bash
64
+ bun add @openfluke/welvet
65
+ ```
66
+
23
67
  ## Quick Start
24
68
 
25
- ### 🎉 NEW: Simple API (Recommended)
69
+ ### 🎉 Simple API (Recommended)
26
70
 
27
- The simple API provides streamlined functions with consistent behavior across all platforms:
71
+ The simple API provides streamlined functions with **cross-platform consistency**:
28
72
 
29
73
  ```typescript
30
- import {
31
- init,
32
- createNetworkFromJSON,
33
- loadLoomNetwork,
34
- } from "@openfluke/welvet";
74
+ import { init, createNetworkFromJSON, loadLoomNetwork } from "@openfluke/welvet";
35
75
 
36
76
  // Initialize LOOM WASM
37
77
  await init();
@@ -47,33 +87,17 @@ const config = {
47
87
  {
48
88
  type: "parallel",
49
89
  combine_mode: "grid_scatter",
50
- grid_output_rows: 3,
51
- grid_output_cols: 1,
52
- grid_output_layers: 1,
90
+ grid_output_rows: 3, grid_output_cols: 1, grid_output_layers: 1,
53
91
  grid_positions: [
54
92
  { branch_index: 0, target_row: 0, target_col: 0, target_layer: 0 },
55
93
  { branch_index: 1, target_row: 1, target_col: 0, target_layer: 0 },
56
94
  { branch_index: 2, target_row: 2, target_col: 0, target_layer: 0 },
57
95
  ],
58
96
  branches: [
59
- {
60
- type: "parallel",
61
- combine_mode: "add",
62
- branches: [
63
- {
64
- type: "dense",
65
- input_size: 16,
66
- output_size: 8,
67
- activation: "relu",
68
- },
69
- {
70
- type: "dense",
71
- input_size: 16,
72
- output_size: 8,
73
- activation: "gelu",
74
- },
75
- ],
76
- },
97
+ { type: "parallel", combine_mode: "add", branches: [
98
+ { type: "dense", input_size: 16, output_size: 8, activation: "relu" },
99
+ { type: "dense", input_size: 16, output_size: 8, activation: "gelu" },
100
+ ]},
77
101
  { type: "lstm", input_size: 16, hidden_size: 8, seq_length: 1 },
78
102
  { type: "rnn", input_size: 16, hidden_size: 8, seq_length: 1 },
79
103
  ],
@@ -88,86 +112,55 @@ const network = createNetworkFromJSON(JSON.stringify(config));
88
112
  const batches = [
89
113
  { Input: [0.2, 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8], Target: [1.0, 0.0] },
90
114
  { Input: [0.9, 0.9, 0.9, 0.9, 0.1, 0.1, 0.1, 0.1], Target: [0.0, 1.0] },
91
- { Input: [0.7, 0.7, 0.7, 0.7, 0.3, 0.3, 0.3, 0.3], Target: [0.0, 1.0] },
92
- { Input: [0.3, 0.3, 0.3, 0.3, 0.7, 0.7, 0.7, 0.7], Target: [1.0, 0.0] },
93
115
  ];
94
116
 
95
117
  const trainingConfig = {
96
118
  Epochs: 800,
97
119
  LearningRate: 0.15,
98
- UseGPU: false,
99
- PrintEveryBatch: 0,
100
- GradientClip: 1.0,
101
120
  LossType: "mse",
102
- Verbose: false,
121
+ GradientClip: 1.0,
103
122
  };
104
123
 
105
- const [result] = network.Train(JSON.stringify([batches, trainingConfig]));
106
- console.log("Training complete!");
124
+ network.Train(JSON.stringify([batches, trainingConfig]));
107
125
 
108
126
  // Forward pass
109
- const [output] = network.ForwardCPU(
110
- JSON.stringify([[0.2, 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8]])
111
- );
112
- console.log("Output:", JSON.parse(output)); // [0.950, 0.050]
113
-
114
- // Evaluate network
115
- const inputs = batches.map((b) => b.Input);
116
- const expected = [0, 1, 1, 0];
117
- const [metrics] = network.EvaluateNetwork(JSON.stringify([inputs, expected]));
118
- const metricsData = JSON.parse(metrics);
119
- console.log(
120
- `Quality: ${metricsData.score}/100, Deviation: ${metricsData.avg_deviation}%`
121
- );
127
+ const [output] = network.ForwardCPU(JSON.stringify([[0.2, 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8]]));
128
+ console.log("Output:", JSON.parse(output)); // [0.95, 0.05]
122
129
 
123
130
  // Save/Load
124
131
  const [modelJSON] = network.SaveModelToString(JSON.stringify(["my_model"]));
125
- console.log(`Model saved (${modelJSON.length} bytes)`);
126
-
127
- // Load model
128
132
  const loadedNetwork = loadLoomNetwork(modelJSON, "my_model");
129
- const [output2] = loadedNetwork.ForwardCPU(
130
- JSON.stringify([[0.2, 0.2, 0.2, 0.2, 0.8, 0.8, 0.8, 0.8]])
131
- );
132
- // output2 === output (bit-for-bit identical!)
133
133
  ```
134
134
 
135
135
  **Simple API Functions:**
136
136
 
137
- - `createNetworkFromJSON(jsonConfig)` - Create network from JSON
138
- - `loadLoomNetwork(jsonString, modelID)` - Load saved model
139
- - `network.ForwardCPU(inputJSON)` - Forward pass
140
- - `network.BackwardCPU(gradientsJSON)` - Backward pass
141
- - `network.Train(paramsJSON)` - Train network
142
- - `network.SaveModelToString(idJSON)` - Save to JSON string
143
- - `network.EvaluateNetwork(paramsJSON)` - Evaluate with metrics
144
- - `network.UpdateWeights(lrJSON)` - Update weights
145
-
146
- **Cross-Platform Results:**
147
-
148
- - ✅ Same training: 99.5% improvement, 100/100 quality score
149
- - ✅ Same save/load: 0.00 difference in predictions
150
- - ✅ Same evaluation: Identical deviation metrics
151
- - ✅ Same behavior as Python, C#, C, and WASM
152
-
153
- See `example/grid-scatter.ts` for a complete working example.
137
+ | Function | Description |
138
+ |:---------|:------------|
139
+ | `createNetworkFromJSON(config)` | Create network from JSON |
140
+ | `loadLoomNetwork(json, id)` | Load saved model |
141
+ | `network.ForwardCPU(input)` | Forward pass |
142
+ | `network.BackwardCPU(gradients)` | Backward pass |
143
+ | `network.Train(params)` | Train network |
144
+ | `network.SaveModelToString(id)` | Save to JSON string |
145
+ | `network.EvaluateNetwork(params)` | Evaluate with metrics |
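The 0.1.8 README included a short `EvaluateNetwork` example; it is condensed here for reference, reusing the `network` and `batches` from the Quick Start above (the `score` and `avg_deviation` fields come from that earlier example):

```typescript
// Evaluate the trained network against expected class indices.
const inputs = batches.map((b) => b.Input);
const expected = [0, 1]; // class index per batch
const [metrics] = network.EvaluateNetwork(JSON.stringify([inputs, expected]));
const metricsData = JSON.parse(metrics);
console.log(`Quality: ${metricsData.score}/100, Deviation: ${metricsData.avg_deviation}%`);
```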
154
146
 
155
147
  ### ⚡ Stepping API - Fine-Grained Execution Control
156
148
 
157
- **NEW:** Execute networks one step at a time for online learning:
149
+ Execute networks one step at a time for online learning:
158
150
 
159
151
  ```typescript
160
152
  import { init, createNetwork, StepState } from "@openfluke/welvet";
161
153
 
162
154
  await init();
163
155
 
164
- // Create network
165
- const config = { batch_size: 1, layers: [
166
- { type: "dense", input_height: 4, output_height: 8, activation: "relu" },
167
- { type: "lstm", input_size: 8, hidden_size: 12, seq_length: 1 },
168
- { type: "dense", input_height: 12, output_height: 3, activation: "softmax" }
169
- ]};
170
- const network = createNetwork(config);
156
+ const network = createNetwork({
157
+ batch_size: 1,
158
+ layers: [
159
+ { type: "dense", input_height: 4, output_height: 8, activation: "relu" },
160
+ { type: "lstm", input_size: 8, hidden_size: 12, seq_length: 1 },
161
+ { type: "dense", input_height: 12, output_height: 3, activation: "softmax" }
162
+ ]
163
+ });
171
164
 
172
165
  // Initialize stepping state
173
166
  const state: StepState = network.createStepState(4);
@@ -178,12 +171,8 @@ for (let step = 0; step < 100000; step++) {
178
171
  state.stepForward();
179
172
  const output = state.getOutput();
180
173
 
181
- // Calculate gradients
182
- const gradients = new Float32Array(output.length);
183
- for (let i = 0; i < output.length; i++)
184
- gradients[i] = output[i] - target[i];
185
-
186
174
  // Backward pass
175
+ const gradients = output.map((o, i) => o - target[i]);
187
176
  state.stepBackward(gradients);
188
177
 
189
178
  // Update weights immediately
@@ -191,19 +180,9 @@ for (let step = 0; step < 100000; step++) {
191
180
  }
192
181
  ```
193
182
 
194
- **Stepping API:**
195
- - `network.createStepState(inputSize)` - Initialize stepping state
196
- - `state.setInput(data)` - Set input for current step
197
- - `state.stepForward()` - Execute forward pass
198
- - `state.getOutput()` - Get output from last layer
199
- - `state.stepBackward(gradients)` - Execute backward pass
200
- - `network.ApplyGradients(paramsJSON)` - Update network weights
201
-
202
- See `example/step_train_v3.ts` for a complete example achieving 100% accuracy.
203
-
204
183
  ### 🧠 Neural Tweening API - Gradient-Free Learning
205
184
 
206
- **NEW:** Direct weight adjustment without backpropagation:
185
+ Direct weight adjustment without backpropagation:
207
186
 
208
187
  ```typescript
209
188
  import { init, createNetwork, TweenState } from "@openfluke/welvet";
@@ -217,212 +196,215 @@ const tweenState: TweenState = network.createTweenState(true); // useChainRule=t
217
196
 
218
197
  // Training loop - direct weight updates
219
198
  for (let step = 0; step < 10000; step++) {
220
- const input = new Float32Array([0.1, 0.2, 0.3, 0.4]);
221
- const targetClass = 1; // Target output class
222
-
223
- // Single-step tween learning
224
- const loss = tweenState.TweenStep(input, targetClass, 4, 0.02);
199
+ const loss = tweenState.TweenStep(
200
+ new Float32Array([0.1, 0.2, 0.3, 0.4]),
201
+ 1, // targetClass
202
+ 4, // outputSize
203
+ 0.02 // learningRate
204
+ );
225
205
  }
226
206
  ```
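The 0.1.8 README also listed a few `TweenState` helpers; a short sketch based on that list, assuming they are still present on the `TweenState` interface:

```typescript
// Helpers carried over from the 0.1.8 documentation (sketch only).
tweenState.setChainRule(false);          // toggle chain-rule mode
console.log(tweenState.getChainRule());  // current chain-rule status
console.log(tweenState.getTweenSteps()); // total tween steps performed
```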
227
207
 
228
- **Tweening API:**
229
- - `network.createTweenState(useChainRule)` - Initialize tween state
230
- - `tweenState.TweenStep(input, targetClass, outputSize, lr)` - Train step
231
- - `tweenState.setChainRule(enabled)` - Toggle chain rule
232
- - `tweenState.getChainRule()` - Get chain rule status
233
- - `tweenState.getTweenSteps()` - Get total steps performed
234
-
235
- ### 📊 Adaptation Benchmark - Multi-Architecture Testing
236
+ ### 📊 Adaptation Benchmark
236
209
 
237
- **NEW:** Run the full Test 18 Multi-Architecture Adaptation Benchmark:
210
+ Run the full multi-architecture adaptation benchmark:
238
211
 
239
212
  ```bash
240
213
  cd example
241
214
  bun run test18_adaptation.ts
242
215
  ```
243
216
 
244
- Tests 5 architectures × 3 depths × 5 training modes (75 tests total):
217
+ Tests **5 architectures × 3 depths × 5 training modes** (75 tests total):
245
218
  - **Architectures:** Dense, Conv2D, RNN, LSTM, Attention
246
219
  - **Depths:** 3, 5, 9 layers
247
220
  - **Modes:** NormalBP, StepBP, Tween, TweenChain, StepTweenChain
248
221
 
249
- Measures adaptation speed when tasks change mid-stream (chase→avoid→chase).
222
+ ## Complete Test Suite
223
+
224
+ The `universal_test.ts` example demonstrates all framework capabilities with **100% parity** to the Go/C core.
225
+
226
+ ### Running in Browser (v0.3.0+)
227
+
228
+ The universal test suite now runs directly in the browser with a full DOM report:
229
+
230
+ ```bash
231
+ cd typescript
232
+ python3 serve.py
233
+ # Open http://localhost:8081
234
+ ```
235
+
236
+ ### Running in Node/Bun
237
+
238
+ ```bash
239
+ cd example
240
+ bun run universal_test.ts
241
+ ```
242
+
243
+ **Test Coverage (2298 Tests):**
244
+ - ✅ **Serialization**: 12 Layer Types × 15 Data Types (2100 combinations)
245
+ - ✅ **In-Memory WASM**: SafeTensors without filesystem (144 tests)
246
+ - ✅ **Advanced Math**: K-Means, Correlation, Grafting, Ensembles
247
+ - ✅ **GPU Parity**: Determinism checks for forward/backward passes
248
+ - ✅ **Core**: Architecture generation, combinators, sequential layers
249
+
250
+ See [`example/universal_test.ts`](./example/universal_test.ts) for the complete test implementation.
250
251
 
251
- See `example/test18_adaptation.ts` for the full implementation.
252
-
253
- ```
254
-
255
- ## API Reference
256
-
257
- ### Functions
258
-
259
- #### `async init(): Promise<void>`
260
-
261
- Initialize LOOM WASM module for Node.js environment.
252
+ ## Layer Types
262
253
 
263
- #### `async initBrowser(): Promise<void>`
254
+ | Layer | Type String | Description |
255
+ |:------|:------------|:------------|
256
+ | Dense | `dense` | Fully connected layer |
257
+ | LSTM | `lstm` | Long Short-Term Memory |
258
+ | RNN | `rnn` | Recurrent Neural Network |
259
+ | GRU | `gru` | Gated Recurrent Unit |
260
+ | Conv2D | `conv2d` | 2D Convolution |
261
+ | Conv1D | `conv1d` | 1D Convolution |
262
+ | Multi-Head Attention | `multi_head_attention` | Transformer attention |
263
+ | LayerNorm | `layer_norm` | Layer normalization |
264
+ | RMSNorm | `rms_norm` | RMS normalization |
265
+ | SwiGLU | `swiglu` | SwiGLU activation layer |
266
+ | Softmax | `softmax` | Softmax classification |
267
+ | Embedding | `embedding` | Token embedding |
268
+ | Parallel | `parallel` | Branching with combine modes |
269
+ | Sequential | `sequential` | Grouped sub-layers |
270
+
271
+ **Parallel Combine Modes:** `add`, `concat`, `multiply`, `average`, `grid_scatter`, `filter`
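As a quick illustration of a non-default combine mode, a `parallel` layer that concatenates two dense branches might be configured like this (sizes are illustrative; the field names mirror the Quick Start example):

```typescript
// Two dense branches whose outputs are concatenated into one vector.
const parallelLayer = {
  type: "parallel",
  combine_mode: "concat",
  branches: [
    { type: "dense", input_size: 16, output_size: 8, activation: "relu" },
    { type: "dense", input_size: 16, output_size: 8, activation: "tanh" },
  ],
};
```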
264
272
 
265
- Initialize LOOM WASM module for browser environment.
273
+ ## Activation Functions
266
274
 
267
- #### `createNetwork(config: object | string): Network`
275
+ `relu`, `sigmoid`, `tanh`, `softmax`, `gelu`, `swish`, `mish`, `leaky_relu`, `elu`, `selu`, `linear`
268
276
 
269
- Create a new neural network from JSON configuration object or string.
277
+ ## API Reference
270
278
 
271
- **Note:** This is the only global function exposed by the WASM (mirrors `createLoomNetwork` from main.go). To load a saved model, just pass the saved JSON string to `createNetwork()`.
279
+ ### Initialization
272
280
 
273
- ### Network Interface
281
+ ```typescript
282
+ import { init, initBrowser, createNetwork, createNetworkFromJSON } from "@openfluke/welvet";
274
283
 
275
- The `Network` object returned by `createNetwork()` has all methods from the Go `nn.Network` type automatically exposed via reflection.
284
+ // Node.js
285
+ await init();
276
286
 
277
- **Important:** All Network methods follow the WASM calling convention:
287
+ // Browser
288
+ await initBrowser();
289
+ ```
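Both initializers accept an optional `wasmUrl` (see `dist/index.d.ts`). In the browser the loader falls back to `/dist/main.wasm` when no URL is given; in Node.js the bundled `main.wasm` is always loaded from the package's `dist/` directory. The path below is only a placeholder:

```typescript
// Serve main.wasm from a custom location (illustrative path).
await initBrowser("/assets/loom/main.wasm");
```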
278
290
 
279
- - Take a single parameter: JSON string of an array of parameters
280
- - Return a JSON string of an array of results
291
+ ### Network Methods
281
292
 
282
- Example:
293
+ All Network methods follow the WASM calling convention:
294
+ - **Input:** JSON string of an array of parameters
295
+ - **Return:** JSON string of an array of results
283
296
 
284
297
  ```typescript
285
298
  // Method with no parameters
286
299
  const info = network.GetNetworkInfo(JSON.stringify([]));
287
- const parsed = JSON.parse(info)[0];
288
300
 
289
301
  // Method with parameters
290
302
  const result = network.Train(JSON.stringify([batches, config]));
291
- const data = JSON.parse(result)[0];
292
303
 
293
- // Save model (requires modelID parameter)
304
+ // Save model
294
305
  const saved = network.SaveModelToString(JSON.stringify(["my-model"]));
295
- const json = JSON.parse(saved)[0];
296
- ````
297
-
298
- #### Available Network Methods
299
-
300
- - `ForwardCPU(paramsJSON)` - CPU forward pass: `[inputs]`
301
- - `ForwardGPU(paramsJSON)` - GPU forward pass: `[inputs]`
302
- - `BackwardCPU(paramsJSON)` - CPU backward pass: `[gradients]`
303
- - `BackwardGPU(paramsJSON)` - GPU backward pass: `[gradients]`
304
- - `UpdateWeights(paramsJSON)` - Update weights: `[learningRate]`
305
- - `Train(paramsJSON)` - Train network: `[batches, config]`
306
- - `SaveModelToString(paramsJSON)` - Save model: `["modelID"]`
307
- - `GetWeights(paramsJSON)` - Get layer weights: `[row, col, layer]`
308
- - `SetWeights(paramsJSON)` - Set layer weights: `[row, col, layer, weights]`
309
- - `GetBiases(paramsJSON)` - Get layer biases: `[row, col, layer]`
310
- - `SetBiases(paramsJSON)` - Set layer biases: `[row, col, layer, biases]`
311
- - `GetActivation(paramsJSON)` - Get activation: `[row, col, layer]`
312
- - `GetLayerType(paramsJSON)` - Get layer type: `[row, col, layer]`
313
- - `GetLayerSizes(paramsJSON)` - Get layer sizes: `[row, col, layer]`
314
- - `GetBatchSize(paramsJSON)` - Get batch size: `[]`
315
- - `GetGridDimensions(paramsJSON)` - Get grid dimensions: `[]`
316
- - `GetNetworkInfo(paramsJSON)` - Get network info: `[]`
317
- - `GetTotalParameters(paramsJSON)` - Get parameter count: `[]`
318
- - `InitializeWeights(paramsJSON)` - Initialize weights: `[]` or `[method]`
319
- - `Clone(paramsJSON)` - Clone network: `[]`
320
- - And 10+ more methods...
321
- - `GetLastOutput(): string` - Get last forward pass output
322
-
323
- ### Types
324
-
325
- #### `NetworkConfig`
326
-
327
- ```typescript
328
- interface NetworkConfig {
329
- batch_size: number;
330
- grid_rows?: number; // Required for grid networks (use 1 for sequential)
331
- grid_cols?: number; // Required for grid networks (use 1 for sequential)
332
- layers_per_cell?: number; // Required for grid networks
333
- layers: LayerConfig[];
334
- }
335
306
  ```
336
307
 
337
- #### `LayerConfig`
308
+ **Available Methods:**
309
+
310
+ | Method | Parameters | Description |
311
+ |:-------|:-----------|:------------|
312
+ | `ForwardCPU` | `[inputs]` | CPU forward pass |
313
+ | `ForwardGPU` | `[inputs]` | GPU forward pass |
314
+ | `BackwardCPU` | `[gradients]` | CPU backward pass |
315
+ | `Train` | `[batches, config]` | Train network |
316
+ | `SaveModelToString` | `["modelID"]` | Save to JSON |
317
+ | `GetWeights` | `[row, col, layer]` | Get layer weights |
318
+ | `SetWeights` | `[row, col, layer, weights]` | Set layer weights |
319
+ | `GetBiases` | `[row, col, layer]` | Get layer biases |
320
+ | `SetBiases` | `[row, col, layer, biases]` | Set layer biases |
321
+ | `GetNetworkInfo` | `[]` | Get network info |
322
+ | `GetTotalParameters` | `[]` | Get parameter count |
323
+ | `Clone` | `[]` | Clone network |
324
+ | `TotalLayers` | `[]` | Get total layer count |
325
+
326
+ ### Statistical Tools
338
327
 
339
328
  ```typescript
340
- interface LayerConfig {
341
- type: string;
342
- input_size?: number;
343
- output_size?: number;
344
- hidden_size?: number;
345
- seq_length?: number;
346
- activation?: string;
347
- combine_mode?: string;
348
- grid_output_rows?: number;
349
- grid_output_cols?: number;
350
- grid_output_layers?: number;
351
- grid_positions?: GridPosition[];
352
- branches?: LayerConfig[];
353
- }
329
+ import welvet from "@openfluke/welvet";
330
+
331
+ // K-Means Clustering
332
+ const data = [[1, 1], [1.1, 1.1], [5, 5], [5.1, 5.1]];
333
+ const result = welvet.kmeans(data, 2, 100);
334
+ console.log(`Centroids: ${result.centroids}`);
335
+ console.log(`Silhouette Score: ${result.silhouette_score}`);
336
+
337
+ // Correlation Matrix
338
+ const matrix = [[1, 2, 3], [4, 5, 6], [7, 8, 9]];
339
+ const corr = welvet.correlation(matrix);
340
+ console.log(`Pearson: ${corr.pearson}`);
354
341
  ```
355
342
 
356
- #### `TrainingBatch`
357
-
358
- ```typescript
359
- interface TrainingBatch {
360
- Input: number[];
361
- Target: number[];
362
- }
363
- ```
343
+ ### Network Grafting
364
344
 
365
- #### `TrainingConfig`
345
+ Combine multiple trained networks:
366
346
 
367
347
  ```typescript
368
- interface TrainingConfig {
369
- Epochs: number;
370
- LearningRate: number;
371
- LossType?: string;
372
- Verbose?: boolean;
373
- UseGPU?: boolean;
374
- PrintEveryBatch?: number;
375
- GradientClip?: number;
376
- }
348
+ const h1 = welvet.createKHandle(config);
349
+ const h2 = welvet.createKHandle(config);
350
+
351
+ const result = welvet.graft([h1, h2], "concat");
352
+ console.log(`Grafted: ${result.num_branches} branches`);
377
353
  ```
378
354
 
379
355
  ## Examples
380
356
 
381
- ### Grid Scatter Multi-Agent
382
-
383
357
  ```bash
384
358
  cd example
385
- bun install
386
- bun run grid-scatter.ts
387
- ```
388
359
 
389
- ### Stepping Training (LSTM)
360
+ # Grid Scatter Multi-Agent
361
+ bun run grid-scatter.ts
390
362
 
391
- ```bash
363
+ # Stepping Training (LSTM)
392
364
  bun run step_train_v3.ts
393
- ```
394
-
395
- ### Adaptation Benchmark (75 tests)
396
365
 
397
- ```bash
366
+ # Adaptation Benchmark (75 tests)
398
367
  bun run test18_adaptation.ts
399
- ```
400
368
 
401
- > **Note:** Full benchmark takes ~12.5 minutes (10 seconds per test)
369
+ # Full Test Suite (77 tests)
370
+ bun run universal_test.ts
371
+ ```
402
372
 
403
- ## Layer Types
373
+ ## TypeScript Types
404
374
 
405
- - `dense` - Fully connected layer
406
- - `lstm` - Long Short-Term Memory layer
407
- - `rnn` - Recurrent Neural Network layer
408
- - `gru` - Gated Recurrent Unit layer
409
- - `cnn` - Convolutional layer
410
- - `parallel` - Parallel branches with combine modes:
411
- - `add` - Element-wise addition
412
- - `concat` - Concatenation
413
- - `multiply` - Element-wise multiplication
414
- - `grid_scatter` - Multi-agent grid routing
375
+ ```typescript
376
+ interface NetworkConfig {
377
+ batch_size: number;
378
+ grid_rows?: number;
379
+ grid_cols?: number;
380
+ layers_per_cell?: number;
381
+ layers: LayerConfig[];
382
+ dtype?: "float32" | "float64" | "int32" | "int16" | "int8" | "uint8";
383
+ }
415
384
 
416
- ## Activation Functions
385
+ interface TrainingConfig {
386
+ Epochs: number;
387
+ LearningRate: number;
388
+ LossType?: string;
389
+ Verbose?: boolean;
390
+ UseGPU?: boolean;
391
+ GradientClip?: number;
392
+ }
417
393
 
418
- `relu`, `sigmoid`, `tanh`, `softmax`, `gelu`, `swish`, `mish`, `leaky_relu`, `elu`, `selu`
394
+ interface TrainingBatch {
395
+ Input: number[];
396
+ Target: number[];
397
+ }
398
+ ```
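These interfaces can type the Quick Start objects directly, assuming they are re-exported from the package root (dist/index re-exports everything from `./types.js`):

```typescript
import type { TrainingBatch, TrainingConfig } from "@openfluke/welvet";

const batch: TrainingBatch = { Input: [0.1, 0.2], Target: [1, 0] };
const config: TrainingConfig = { Epochs: 100, LearningRate: 0.05, LossType: "mse" };
```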
419
399
 
420
400
  ## License
421
401
 
422
- APACHE2
402
+ Apache-2.0
423
403
 
424
404
  ## Links
425
405
 
426
- - [GitHub](https://github.com/openfluke/loom)
427
- - [WASM Documentation](../wasm/README.md)
428
- - [Go Examples](../examples/)
406
+ - **GitHub**: [github.com/openfluke/loom](https://github.com/openfluke/loom)
407
+ - **NPM**: [@openfluke/welvet](https://www.npmjs.com/package/@openfluke/welvet)
408
+ - **PyPI**: [welvet](https://pypi.org/project/welvet/)
409
+ - **NuGet**: [Welvet](https://www.nuget.org/packages/Welvet)
410
+ - **Documentation**: [`docs/loom_assessment_comparison.md`](../docs/loom_assessment_comparison.md)
package/dist/index.d.ts CHANGED
@@ -4,24 +4,53 @@
4
4
  * Direct wrapper around Loom WASM that mirrors main.go exports exactly.
5
5
  * Provides the same API in both Node.js and browser environments.
6
6
  */
7
- import { Network } from "./types.js";
8
- import { loadLoomWASM } from "./loader.js";
7
+ import { Network, GraftResult, KMeansResult, CorrelationResult, EnsembleMatch, AdaptationTracker } from "./types.js";
9
8
  import { loadLoomWASMBrowser } from "./loader.browser.js";
10
9
  export * from "./types.js";
11
- export { loadLoomWASM, loadLoomWASMBrowser };
10
+ export { loadLoomWASMBrowser };
11
+ export declare function loadLoomWASM(): Promise<void>;
12
12
  /**
13
- * Initialize WASM for Node.js environment
13
+ * Initialize WASM
14
+ * Auto-detects environment (Browser vs Node.js)
14
15
  */
15
- export declare function init(): Promise<void>;
16
+ export declare function init(wasmUrl?: string): Promise<void>;
16
17
  /**
17
18
  * Initialize WASM for Browser environment
18
19
  */
19
- export declare function initBrowser(): Promise<void>;
20
+ export declare function initBrowser(wasmUrl?: string): Promise<void>;
20
21
  /**
21
22
  * Create a network from JSON config
22
23
  * Wrapper around the global createLoomNetwork function exposed by WASM
23
24
  */
24
25
  export declare function createNetwork(config: object | string): Network;
26
+ /**
27
+ * Load a network from JSON string and ID
28
+ */
29
+ export declare function loadNetwork(jsonString: string, modelID: string): Network;
30
+ /**
31
+ * Create a network handle for grafting
32
+ */
33
+ export declare function createKHandle(config: object | string): number;
34
+ /**
35
+ * Graft multiple networks together
36
+ */
37
+ export declare function graft(ids: number[], combineMode: string): GraftResult;
38
+ /**
39
+ * Perform K-Means Clustering
40
+ */
41
+ export declare function kmeans(data: number[][], k: number, iter: number): KMeansResult;
42
+ /**
43
+ * Compute Correlation Matrix
44
+ */
45
+ export declare function correlation(matrixA: number[][], matrixB?: number[][]): CorrelationResult;
46
+ /**
47
+ * Find Complementary Ensemble Matches
48
+ */
49
+ export declare function ensemble(models: object[], minCoverage: number): EnsembleMatch[];
50
+ /**
51
+ * Create Adaptation Tracker
52
+ */
53
+ export declare function tracker(windowMs: number, totalMs: number): AdaptationTracker;
25
54
  /**
26
55
  * Default export with all functions
27
56
  */
@@ -29,5 +58,12 @@ declare const _default: {
29
58
  init: typeof init;
30
59
  initBrowser: typeof initBrowser;
31
60
  createNetwork: typeof createNetwork;
61
+ loadNetwork: typeof loadNetwork;
62
+ createKHandle: typeof createKHandle;
63
+ graft: typeof graft;
64
+ kmeans: typeof kmeans;
65
+ correlation: typeof correlation;
66
+ ensemble: typeof ensemble;
67
+ tracker: typeof tracker;
32
68
  };
33
69
  export default _default;
package/dist/index.js CHANGED
@@ -4,21 +4,29 @@
4
4
  * Direct wrapper around Loom WASM that mirrors main.go exports exactly.
5
5
  * Provides the same API in both Node.js and browser environments.
6
6
  */
7
- import { loadLoomWASM } from "./loader.js";
8
7
  import { loadLoomWASMBrowser } from "./loader.browser.js";
9
8
  export * from "./types.js";
10
- export { loadLoomWASM, loadLoomWASMBrowser };
9
+ export { loadLoomWASMBrowser };
10
+ export async function loadLoomWASM() {
11
+ const mod = await import("./loader.js");
12
+ await mod.loadLoomWASM();
13
+ }
11
14
  /**
12
- * Initialize WASM for Node.js environment
15
+ * Initialize WASM
16
+ * Auto-detects environment (Browser vs Node.js)
13
17
  */
14
- export async function init() {
15
- await loadLoomWASM();
18
+ export async function init(wasmUrl) {
19
+ // Check for browser environment (needs window and document for loader.browser.ts)
20
+ if (typeof window !== "undefined" && typeof document !== "undefined") {
21
+ return initBrowser(wasmUrl);
22
+ }
23
+ return loadLoomWASM();
16
24
  }
17
25
  /**
18
26
  * Initialize WASM for Browser environment
19
27
  */
20
- export async function initBrowser() {
21
- await loadLoomWASMBrowser();
28
+ export async function initBrowser(wasmUrl) {
29
+ await loadLoomWASMBrowser(wasmUrl);
22
30
  }
23
31
  /**
24
32
  * Create a network from JSON config
@@ -30,11 +38,73 @@ export function createNetwork(config) {
30
38
  : JSON.stringify(config);
31
39
  return createLoomNetwork(jsonConfig);
32
40
  }
41
+ /**
42
+ * Load a network from JSON string and ID
43
+ */
44
+ export function loadNetwork(jsonString, modelID) {
45
+ return loadLoomNetwork(jsonString, modelID);
46
+ }
47
+ /**
48
+ * Create a network handle for grafting
49
+ */
50
+ export function createKHandle(config) {
51
+ const jsonConfig = typeof config === "string" ? config : JSON.stringify(config);
52
+ return createNetworkForGraft(jsonConfig);
53
+ }
54
+ /**
55
+ * Graft multiple networks together
56
+ */
57
+ export function graft(ids, combineMode) {
58
+ const idsJSON = JSON.stringify(ids);
59
+ const resJSON = graftNetworks(idsJSON, combineMode);
60
+ return JSON.parse(resJSON);
61
+ }
62
+ /**
63
+ * Perform K-Means Clustering
64
+ */
65
+ export function kmeans(data, k, iter) {
66
+ const resJSON = kmeansCluster(JSON.stringify(data), k, iter);
67
+ return JSON.parse(resJSON);
68
+ }
69
+ /**
70
+ * Compute Correlation Matrix
71
+ */
72
+ export function correlation(matrixA, matrixB) {
73
+ const jsonA = JSON.stringify(matrixA);
74
+ const jsonB = matrixB ? JSON.stringify(matrixB) : "null"; // Use "null" string for nil
75
+ const resJSON = computeCorrelation(jsonA, jsonB);
76
+ const raw = JSON.parse(resJSON);
77
+ // Transform to match interface
78
+ return {
79
+ pearson: raw.correlation?.matrix || raw.Correlation?.Matrix || raw.matrix || [],
80
+ spearman: raw.spearman?.matrix || raw.Spearman?.Matrix || []
81
+ };
82
+ }
83
+ /**
84
+ * Find Complementary Ensemble Matches
85
+ */
86
+ export function ensemble(models, minCoverage) {
87
+ const resJSON = findComplementaryMatches(JSON.stringify(models), minCoverage);
88
+ return JSON.parse(resJSON);
89
+ }
90
+ /**
91
+ * Create Adaptation Tracker
92
+ */
93
+ export function tracker(windowMs, totalMs) {
94
+ return createAdaptationTracker(windowMs, totalMs);
95
+ }
33
96
  /**
34
97
  * Default export with all functions
35
98
  */
36
99
  export default {
37
100
  init,
38
101
  initBrowser,
39
- createNetwork
102
+ createNetwork,
103
+ loadNetwork,
104
+ createKHandle,
105
+ graft,
106
+ kmeans,
107
+ correlation,
108
+ ensemble,
109
+ tracker
40
110
  };
@@ -2,4 +2,4 @@
2
2
  * LOOM WASM Browser Loader
3
3
  * Browser-only version without Node.js dependencies
4
4
  */
5
- export declare function loadLoomWASMBrowser(): Promise<void>;
5
+ export declare function loadLoomWASMBrowser(wasmUrl?: string): Promise<void>;
@@ -2,7 +2,7 @@
2
2
  * LOOM WASM Browser Loader
3
3
  * Browser-only version without Node.js dependencies
4
4
  */
5
- export async function loadLoomWASMBrowser() {
5
+ export async function loadLoomWASMBrowser(wasmUrl) {
6
6
  // For browser environments - load wasm_exec.js first if not already loaded
7
7
  if (typeof globalThis.Go === "undefined") {
8
8
  // Load wasm_exec.js dynamically from /dist/
@@ -14,7 +14,7 @@ export async function loadLoomWASMBrowser() {
14
14
  document.head.appendChild(script);
15
15
  });
16
16
  }
17
- const response = await fetch("/dist/main.wasm");
17
+ const response = await fetch(wasmUrl || "/dist/main.wasm");
18
18
  const wasmBuffer = await response.arrayBuffer();
19
19
  // @ts-ignore - Go is defined by wasm_exec.js
20
20
  const go = new Go();
package/dist/loader.js CHANGED
@@ -2,12 +2,13 @@
2
2
  * LOOM WASM Loader
3
3
  * Loads and initializes the LOOM WebAssembly module (Node.js only)
4
4
  */
5
- import { readFileSync } from "fs";
6
- import { fileURLToPath } from "url";
7
- import { dirname, join } from "path";
8
- const __filename = fileURLToPath(import.meta.url);
9
- const __dirname = dirname(__filename);
5
+ // Node.js only loader - using dynamic imports to allow bundling for browser (where this file is not used but might be analyzed)
10
6
  export async function loadLoomWASM() {
7
+ const fs = await import("fs");
8
+ const url = await import("url");
9
+ const path = await import("path");
10
+ const __filename = url.fileURLToPath(import.meta.url);
11
+ const __dirname = path.dirname(__filename);
11
12
  // __dirname points to:
12
13
  // - dist/ β†’ in production
13
14
  // - src/ β†’ when running via Bun, ts-node, or example files
@@ -19,16 +20,16 @@ export async function loadLoomWASM() {
19
20
  else {
20
21
  // Running from src/ or example/
21
22
  // Point to project’s dist/ directory
22
- root = join(__dirname, "..", "dist");
23
+ root = path.join(__dirname, "..", "dist");
23
24
  }
24
25
  // Load wasm_exec.js
25
- const wasmExecPath = join(root, "wasm_exec.js");
26
- const wasmExecCode = readFileSync(wasmExecPath, "utf-8");
26
+ const wasmExecPath = path.join(root, "wasm_exec.js");
27
+ const wasmExecCode = fs.readFileSync(wasmExecPath, "utf-8");
27
28
  // Execute wasm_exec.js to get the Go runtime
28
29
  eval(wasmExecCode);
29
30
  // Load main.wasm
30
- const wasmPath = join(root, "main.wasm");
31
- const wasmBuffer = readFileSync(wasmPath);
31
+ const wasmPath = path.join(root, "main.wasm");
32
+ const wasmBuffer = fs.readFileSync(wasmPath);
32
33
  // @ts-ignore - Go runtime from wasm_exec.js
33
34
  const go = new Go();
34
35
  const { instance } = await WebAssembly.instantiate(wasmBuffer, go.importObject);
package/dist/main.wasm CHANGED
Binary file
package/dist/types.d.ts CHANGED
@@ -6,13 +6,33 @@ export interface LayerConfig {
6
6
  input_size?: number;
7
7
  output_size?: number;
8
8
  hidden_size?: number;
9
- seq_length?: number;
10
9
  activation?: string;
11
10
  combine_mode?: string;
12
11
  grid_output_rows?: number;
13
12
  grid_output_cols?: number;
14
13
  grid_output_layers?: number;
15
14
  grid_positions?: GridPosition[];
15
+ input_height?: number;
16
+ output_height?: number;
17
+ input_width?: number;
18
+ input_channels?: number;
19
+ output_channels?: number;
20
+ kernel_size?: number;
21
+ stride?: number;
22
+ padding?: number;
23
+ filters?: number;
24
+ d_model?: number;
25
+ num_heads?: number;
26
+ seq_length?: number;
27
+ norm_size?: number;
28
+ vocab_size?: number;
29
+ embedding_dim?: number;
30
+ epsilon?: number;
31
+ softmax_variant?: string;
32
+ temperature?: number;
33
+ residual?: boolean;
34
+ filter_gate?: LayerConfig;
35
+ filter_temperature?: number;
16
36
  branches?: LayerConfig[];
17
37
  }
18
38
  export interface NetworkConfig {
@@ -137,7 +157,36 @@ export interface TweenState {
137
157
  */
138
158
  declare global {
139
159
  function createLoomNetwork(jsonConfig: string): Network;
160
+ function loadLoomNetwork(jsonString: string, modelID: string): Network;
140
161
  function createAdaptationTracker(windowMs: number, totalMs: number): AdaptationTracker;
162
+ function createNetworkForGraft(jsonConfig: string): number;
163
+ function graftNetworks(idsJSON: string, combineMode: string): string;
164
+ function kmeansCluster(dataJSON: string, k: number, iter: number): string;
165
+ function computeCorrelation(matrixAJSON: string, matrixBJSON: string): string;
166
+ function findComplementaryMatches(modelsJSON: string, minCoverage: number): string;
167
+ }
168
+ export interface EnsembleMatch {
169
+ ModelA: string;
170
+ ModelB: string;
171
+ Coverage: number;
172
+ Overlap: number;
173
+ WeightedAcc?: number;
174
+ }
175
+ export interface GraftResult {
176
+ success: boolean;
177
+ type: string;
178
+ num_branches: number;
179
+ combine_mode: string;
180
+ error?: string;
181
+ }
182
+ export interface KMeansResult {
183
+ centroids: number[][];
184
+ assignment: number[];
185
+ silhouette_score: number;
186
+ }
187
+ export interface CorrelationResult {
188
+ pearson: number[][];
189
+ spearman: number[][];
141
190
  }
142
191
  /**
143
192
  * AdaptationTracker interface for tracking accuracy during task changes
package/package.json CHANGED
@@ -1,6 +1,6 @@
1
1
  {
2
2
  "name": "@openfluke/welvet",
3
- "version": "0.1.8",
3
+ "version": "0.3.0",
4
4
  "description": "TypeScript/JavaScript bindings for LOOM neural network framework with WebAssembly support - GPU-accelerated machine learning in the browser",
5
5
  "type": "module",
6
6
  "main": "./dist/index.js",
@@ -22,13 +22,41 @@
22
22
  "gpu",
23
23
  "typescript",
24
24
  "javascript",
25
- "go",
26
- "golang",
27
25
  "ml",
28
26
  "ai",
29
27
  "training",
30
28
  "inference",
31
- "browser"
29
+ "browser",
30
+ "nodejs",
31
+ "isomorphic",
32
+ "transformer",
33
+ "lstm",
34
+ "rnn",
35
+ "gru",
36
+ "attention",
37
+ "multi-head-attention",
38
+ "conv2d",
39
+ "conv1d",
40
+ "convolution",
41
+ "embedding",
42
+ "layer-norm",
43
+ "rms-norm",
44
+ "swiglu",
45
+ "moe",
46
+ "mixture-of-experts",
47
+ "tweening",
48
+ "gradient-free",
49
+ "safetensors",
50
+ "cross-platform",
51
+ "clustering",
52
+ "kmeans",
53
+ "k-means",
54
+ "correlation",
55
+ "statistics",
56
+ "streaming",
57
+ "llm",
58
+ "tokenizer",
59
+ "bpe"
32
60
  ],
33
61
  "license": "Apache-2.0",
34
62
  "repository": {