@aleph-ai/tinyaleph 1.0.0

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (58)
  1. package/LICENSE +21 -0
  2. package/README.md +278 -0
  3. package/backends/cryptographic/index.js +196 -0
  4. package/backends/index.js +15 -0
  5. package/backends/interface.js +89 -0
  6. package/backends/scientific/index.js +272 -0
  7. package/backends/semantic/index.js +527 -0
  8. package/backends/semantic/surface.js +393 -0
  9. package/backends/semantic/two-layer.js +375 -0
  10. package/core/fano.js +127 -0
  11. package/core/hilbert.js +564 -0
  12. package/core/hypercomplex.js +141 -0
  13. package/core/index.js +133 -0
  14. package/core/llm.js +132 -0
  15. package/core/prime.js +184 -0
  16. package/core/resonance.js +695 -0
  17. package/core/rformer-tf.js +1086 -0
  18. package/core/rformer.js +806 -0
  19. package/core/sieve.js +350 -0
  20. package/data.json +8163 -0
  21. package/docs/EXAMPLES_PLAN.md +293 -0
  22. package/docs/README.md +159 -0
  23. package/docs/design/ALEPH_CHAT_ARCHITECTURE.md +499 -0
  24. package/docs/guide/01-quickstart.md +298 -0
  25. package/docs/guide/02-semantic-computing.md +409 -0
  26. package/docs/guide/03-cryptographic.md +420 -0
  27. package/docs/guide/04-scientific.md +494 -0
  28. package/docs/guide/05-llm-integration.md +568 -0
  29. package/docs/guide/06-advanced.md +996 -0
  30. package/docs/guide/README.md +188 -0
  31. package/docs/reference/01-core.md +695 -0
  32. package/docs/reference/02-physics.md +601 -0
  33. package/docs/reference/03-backends.md +892 -0
  34. package/docs/reference/04-engine.md +632 -0
  35. package/docs/reference/README.md +252 -0
  36. package/docs/theory/01-prime-semantics.md +327 -0
  37. package/docs/theory/02-hypercomplex-algebra.md +421 -0
  38. package/docs/theory/03-phase-synchronization.md +364 -0
  39. package/docs/theory/04-entropy-reasoning.md +348 -0
  40. package/docs/theory/05-non-commutativity.md +402 -0
  41. package/docs/theory/06-two-layer-meaning.md +414 -0
  42. package/docs/theory/07-resonant-field-interface.md +419 -0
  43. package/docs/theory/08-semantic-sieve.md +520 -0
  44. package/docs/theory/09-temporal-emergence.md +298 -0
  45. package/docs/theory/10-quaternionic-memory.md +415 -0
  46. package/docs/theory/README.md +162 -0
  47. package/engine/aleph.js +418 -0
  48. package/engine/index.js +7 -0
  49. package/index.js +23 -0
  50. package/modular.js +254 -0
  51. package/package.json +99 -0
  52. package/physics/collapse.js +95 -0
  53. package/physics/entropy.js +88 -0
  54. package/physics/index.js +65 -0
  55. package/physics/kuramoto.js +91 -0
  56. package/physics/lyapunov.js +80 -0
  57. package/physics/oscillator.js +95 -0
  58. package/types/index.d.ts +575 -0
@@ -0,0 +1,402 @@
# Non-Commutativity: Why Word Order Matters

## The Critical Insight

Consider these two sentences:
- "Dog bites man"
- "Man bites dog"

They contain the exact same words, but their meanings are completely different. If we encode concepts as unordered sets of primes, we lose this distinction:

```javascript
// WRONG: Treats as unordered set
encode("dog bites man") → {dog, bites, man} → [2, 3, 5]
encode("man bites dog") → {man, bites, dog} → [2, 3, 5]

// Same encoding! But completely different meanings!
```

This is the **non-commutativity problem**.
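
To see the collapse concretely, here is a minimal standalone sketch (plain JavaScript with made-up word-to-prime assignments, not the package's `encode` API): any commutative combination of the word codes, such as a product of primes, maps both sentences to the same value.

```javascript
// Hypothetical word-to-prime assignments, for illustration only
const PRIMES = { dog: 2, bites: 3, man: 5 };

// A commutative "bag of words" encoding: multiply the word primes together
function bagEncode(sentence) {
  return sentence
    .split(" ")
    .map(word => PRIMES[word])
    .reduce((product, p) => product * p, 1);
}

console.log(bagEncode("dog bites man")); // 30
console.log(bagEncode("man bites dog")); // 30  (identical: order is lost)
```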

---

## Semantic Non-Commutativity

Word order encodes meaning that cannot be captured by bags of words:

| Expression A | Expression B | Same Words? | Same Meaning? |
|-------------|-------------|-------------|---------------|
| "dog bites man" | "man bites dog" | ✓ | ✗ |
| "A causes B" | "B causes A" | ✓ | ✗ |
| "Alice loves Bob" | "Bob loves Alice" | ✓ | ✗ |
| "time creates change" | "change creates time" | ✓ | ✗ |
| "theory → experiment" | "experiment → theory" | ✓ | ✗ |

### What Order Encodes

1. **Agent/Patient** - who does what to whom
2. **Causation** - what causes what
3. **Temporal sequence** - what happens first
4. **Topic/Comment** - what we're talking about vs. what we say
5. **Given/New** - what's assumed vs. what's introduced

---

## Hypercomplex Non-Commutativity

The solution lies in hypercomplex multiplication, which is non-commutative:

```
For quaternions and beyond:

e₁ × e₂ = e₃
e₂ × e₁ = -e₃

A × B ≠ B × A (in general)
```

This non-commutativity is exactly the mechanism for encoding order, but only if we actually use it.
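
As a quick standalone check of the rule above, the snippet below multiplies the quaternion units i and j (standing in for e₁ and e₂) both ways using Hamilton's product; it is an illustration, not the package's `Hypercomplex` class.

```javascript
// Quaternions as [w, x, y, z] with Hamilton's product
function qmul([aw, ax, ay, az], [bw, bx, by, bz]) {
  return [
    aw * bw - ax * bx - ay * by - az * bz,
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
  ];
}

const e1 = [0, 1, 0, 0]; // i
const e2 = [0, 0, 1, 0]; // j

console.log(qmul(e1, e2)); // [0, 0, 0, 1]  -> e₃
console.log(qmul(e2, e1)); // [0, 0, 0, -1] -> -e₃
```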

---

## The Flawed Approach

The old implementation used summation (commutative):

```javascript
// OLD: Commutative sum - order doesn't matter
primesToState(primes) {
  const state = Hypercomplex.zero(this.dimension);
  for (const p of primes) { // Order is incidental
    const angle = primeToAngle(p);
    for (let i = 0; i < this.dimension; i++) {
      state.c[i] += Math.cos(angle * (i + 1)) / Math.sqrt(primes.length);
    }
  }
  return state.normalize();
}
```

The `+=` operation is commutative. Summing in any order gives the same result.
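
A small standalone check (plain JavaScript with an arbitrary stand-in for `primeToAngle`, not the package implementation) makes the point: summing the same primes in two different orders yields the same vector.

```javascript
// Arbitrary stand-in angle map (golden-angle spacing), for illustration only
const primeToAngle = p => (p * 2.399963) % (2 * Math.PI);

function sumState(primes, dimension = 4) {
  const c = new Array(dimension).fill(0);
  for (const p of primes) {
    const angle = primeToAngle(p);
    for (let i = 0; i < dimension; i++) {
      c[i] += Math.cos(angle * (i + 1)) / Math.sqrt(primes.length);
    }
  }
  return c;
}

console.log(sumState([2, 3, 5]));
console.log(sumState([5, 3, 2])); // Same vector (up to rounding): addition forgets order
```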

---

## Solution: Sequential Multiplication

Use hypercomplex multiplication instead of addition:

```javascript
orderedPrimesToState(orderedTokens) {
  // Start with identity element (1 in hypercomplex)
  let state = Hypercomplex.basis(this.dimension, 0, 1);

  for (let i = 0; i < orderedTokens.length; i++) {
    const token = orderedTokens[i];
    const primes = token.primes;

    // Convert primes to hypercomplex rotation
    const tokenH = this.primesToHypercomplex(primes);

    // Apply position-dependent phase shift
    const positioned = this.applyPositionPhase(tokenH, i);

    // MULTIPLY (non-commutative!) - order matters!
    state = state.mul(positioned);
  }

  return state.normalize();
}
```

Now:
```javascript
encode("dog bites man")   // [dog, bites, man]
state1 = e_dog × e_bites × e_man

encode("man bites dog")   // [man, bites, dog]
state2 = e_man × e_bites × e_dog

state1 ≠ state2 // Different! Non-commutative multiplication preserves order.
```

---

## Converting Primes to Hypercomplex

Each prime becomes a rotation in hypercomplex space:

```javascript
primesToHypercomplex(primes) {
  // Start with 1 (identity)
  let h = Hypercomplex.basis(this.dimension, 0, 1);

  for (const p of primes) {
    const angle = primeToAngle(p);
    const axis = (p % (this.dimension - 1)) + 1; // Use prime to select axis

    // Create rotation: cos(θ) + sin(θ)·eₐₓᵢₛ
    const rot = Hypercomplex.zero(this.dimension);
    rot.c[0] = Math.cos(angle);
    rot.c[axis] = Math.sin(angle);

    // Accumulate by multiplication
    h = h.mul(rot);
  }

  return h.normalize();
}
```

Each prime contributes a rotation in a different plane of hypercomplex space.
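
As a standalone illustration of why the order of these rotations matters (4-dimensional case, with an arbitrary stand-in for `primeToAngle`; this is not the package's `Hypercomplex` class), composing the rotors for two primes in opposite orders produces different states:

```javascript
// Quaternion product, states as [c0, c1, c2, c3]
function mul([aw, ax, ay, az], [bw, bx, by, bz]) {
  return [
    aw * bw - ax * bx - ay * by - az * bz,
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
  ];
}

// Build the rotor cos(θ) + sin(θ)·e_axis for a prime (hypothetical angle map)
function primeRotor(p, dimension = 4) {
  const angle = (p * 2.399963) % (2 * Math.PI);
  const axis = (p % (dimension - 1)) + 1; // Same axis-selection rule as above
  const rotor = [Math.cos(angle), 0, 0, 0];
  rotor[axis] = Math.sin(angle);
  return rotor;
}

const r2 = primeRotor(2); // rotation in the e₃ plane
const r3 = primeRotor(3); // rotation in the e₁ plane

console.log(mul(r2, r3)); // state for "prime 2, then prime 3"
console.log(mul(r3, r2)); // state for "prime 3, then prime 2" (differs)
```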

---

## Position-Dependent Phase

Position further breaks symmetry:

```javascript
applyPositionPhase(h, position) {
  // Use position-th prime for phase shift
  const posPrime = nthPrime(position + 1);
  const angle = primeToAngle(posPrime) * 0.5; // Half angle

  // Rotate in position-dependent plane
  const posAxis = (position % (this.dimension - 2)) + 1;

  const rot = Hypercomplex.zero(this.dimension);
  rot.c[0] = Math.cos(angle);
  rot.c[posAxis] = Math.sin(angle);

  return h.mul(rot);
}
```

This ensures that even if two words have the same primes, their positions affect the state differently.

---

## The Ordered Encoding Pipeline

```
ORDERED INPUT
"dog bites man"
         │
         ▼
┌─────────────────┐
│ Ordered Encode  │
│ [{word: "dog",  │
│   primes: [...],│
│   position: 0}, │
│  {word: "bites",│
│   primes: [...],│
│   position: 1}, │
│  {word: "man",  │
│   primes: [...],│
│   position: 2}] │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│ Prime to        │
│ Hypercomplex    │
│ (for each       │
│  token)         │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│ Position        │
│ Phase Shift     │ ← Breaks residual symmetry
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│ Sequential      │
│ Multiplication  │ ← Non-commutative!
│ H₀ × H₁ × H₂    │
└────────┬────────┘
         │
         ▼
┌─────────────────┐
│ State Vector    │
│ (16D)           │ ← Order is now encoded
└─────────────────┘
```

---

## Verification

Test that order is preserved:

```javascript
const backend = new SemanticBackend({ dimension: 16 });

// Encode ordered sentences
const tokens1 = backend.encodeOrdered("dog bites man");
const tokens2 = backend.encodeOrdered("man bites dog");

const state1 = backend.orderedPrimesToState(tokens1);
const state2 = backend.orderedPrimesToState(tokens2);

const similarity = state1.coherence(state2);
console.log(`Similarity: ${similarity}`); // Should be < 1.0

if (similarity > 0.95) {
  throw new Error("Order not being encoded - states too similar!");
}

console.log("✓ Non-commutativity preserved");
```

---

## What This Enables

With order properly encoded:

| Capability | Example |
|-----------|---------|
| **Subject-Object distinction** | "A loves B" vs "B loves A" |
| **Causal direction** | "rain causes flood" vs "flood causes rain" |
| **Narrative sequence** | Story order affects meaning |
| **Argument structure** | "if A then B" vs "if B then A" |
| **Dependency parsing** | Word relationships are directed |
| **Temporal reasoning** | "before" and "after" are opposites |

---

## Alternative Approaches

### Position-Dependent Encoding

Encode position into the prime representation:

```javascript
encodeOrdered(tokens) {
  const orderedPrimes = [];

  for (let i = 0; i < tokens.length; i++) {
    const basePrimes = this.vocabulary.get(tokens[i]);

    // Pair each concept prime with a position prime
    const positionalPrimes = basePrimes.map(p => {
      return [p, nthPrime(i + 1)]; // Tuple: (concept, position)
    });

    orderedPrimes.push(...positionalPrimes);
  }

  return orderedPrimes;
}
```
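
A worked example of this approach, using a hypothetical vocabulary (dog → [2], bites → [3], man → [5]) and the first few primes as position tags, shows that the two sentences now produce different ordered lists even though their concept primes form the same multiset:

```javascript
// Standalone sketch with a hypothetical vocabulary (not the package's backend)
const vocabulary = new Map([["dog", [2]], ["bites", [3]], ["man", [5]]]);
const POSITION_PRIMES = [2, 3, 5, 7, 11]; // nthPrime(1)..nthPrime(5)

function encodeOrderedSketch(tokens) {
  return tokens.flatMap((token, i) =>
    vocabulary.get(token).map(p => [p, POSITION_PRIMES[i]])
  );
}

console.log(encodeOrderedSketch(["dog", "bites", "man"]));
// [ [2, 2], [3, 3], [5, 5] ]
console.log(encodeOrderedSketch(["man", "bites", "dog"]));
// [ [5, 2], [3, 3], [2, 5] ]  -> different lists, so order survives
```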

### Ordered Pairs Algebra

Treat ordered pairs of concepts as new concepts:

```javascript
encodeRelational(tokens) {
  const primes = [];

  // Encode individual concepts
  for (const token of tokens) {
    primes.push(...this.encode(token));
  }

  // Encode PAIRS (order matters)
  for (let i = 0; i < tokens.length - 1; i++) {
    const pairPrime = this.getRelationPrime(tokens[i], tokens[i + 1]);
    primes.push(pairPrime); // Asymmetric relation
  }

  return primes;
}

getRelationPrime(from, to) {
  const fromPrimes = this.encode(from);
  const toPrimes = this.encode(to);

  // Asymmetric combination: (A, B) ≠ (B, A). The "from" side is multiplied
  // and the "to" side is summed, so swapping the arguments changes the result.
  // Despite the name, the value is a composite key rather than a literal prime.
  const fromProduct = fromPrimes.reduce((a, b) => a * b, 1);
  const toSum = toPrimes.reduce((a, b) => a + b, 0);

  return fromProduct * 1000 + toSum;
}
```
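
A quick worked example of the asymmetry, mirroring `getRelationPrime` above but with hypothetical prime assignments (dog → [2, 3], bites → [5]) rather than the package's real vocabulary:

```javascript
// Standalone version with hypothetical prime assignments (illustration only)
const encodeSketch = word => ({ dog: [2, 3], bites: [5] }[word]);

function relationKey(from, to) {
  const fromProduct = encodeSketch(from).reduce((a, b) => a * b, 1);
  const toSum = encodeSketch(to).reduce((a, b) => a + b, 0);
  return fromProduct * 1000 + toSum;
}

console.log(relationKey("dog", "bites")); // 6 * 1000 + 5 = 6005
console.log(relationKey("bites", "dog")); // 5 * 1000 + 5 = 5005  (different)
```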

---

## The Deep Insight

**Syntax IS meaning.** Word order is not separate from meaning; it IS part of the meaning.

Non-commutative algebras (quaternions, sedenions) build this asymmetry directly into their structure. When A × B ≠ B × A:
- A × B can encode "A acts on B"
- B × A can encode "B acts on A"
- The order encodes the DIRECTION of the relationship

This is exactly what natural language needs for semantic computing.

---

## Quaternionic Sentences

A more advanced approach represents each word as a quaternion:

```javascript
class SemanticQuaternion {
  constructor(w, x, y, z) {
    this.w = w; // Scalar: "being" aspect
    this.x = x; // i: "doing" aspect
    this.y = y; // j: "receiving" aspect
    this.z = z; // k: "relating" aspect
  }

  mul(other) {
    // Hamilton's quaternion multiplication (non-commutative!)
    return new SemanticQuaternion(
      this.w * other.w - this.x * other.x - this.y * other.y - this.z * other.z,
      this.w * other.x + this.x * other.w + this.y * other.z - this.z * other.y,
      this.w * other.y - this.x * other.z + this.y * other.w + this.z * other.x,
      this.w * other.z + this.x * other.y - this.y * other.x + this.z * other.w
    );
  }
}

// "dog" emphasizes subject (w)
const dog = new SemanticQuaternion(0.8, 0.2, 0.1, 0.1);

// "bites" emphasizes action (x)
const bites = new SemanticQuaternion(0.1, 0.9, 0.1, 0.1);

// "man" can be subject or object
const man = new SemanticQuaternion(0.5, 0.2, 0.5, 0.1);

// "dog bites man"
const dbm = dog.mul(bites).mul(man);

// "man bites dog"
const mbd = man.mul(bites).mul(dog);

// dbm ≠ mbd due to non-commutativity!
```
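
Continuing the example, printing the two states (values rounded to three decimals, computed directly from the components chosen above) confirms that they differ in every component:

```javascript
// Continuing from the SemanticQuaternion example above
console.log(dbm); // ≈ SemanticQuaternion { w: -0.290, x: 0.352, y: -0.050, z: 0.336 }
console.log(mbd); // ≈ SemanticQuaternion { w: -0.234, x: 0.424, y: -0.008, z: -0.298 }
```

Every component differs between the two products, so "dog bites man" and "man bites dog" land on distinct states.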

---

## Summary

Non-commutativity is essential for semantic computing:

1. **Word order encodes meaning** that cannot be captured by bags of words
2. **Hypercomplex multiplication is non-commutative** by mathematical necessity
3. **Sequential multiplication** of token representations preserves order
4. **Position-dependent phases** add further asymmetry
5. **Non-commutativity is a feature**, not a bug

Without order encoding, we can only compute with bags of words: not sentences, not arguments, not narratives.

---

## Next: [Two-Layer Meaning →](./06-two-layer-meaning.md)