@aleph-ai/tinyaleph 1.2.1 → 1.3.0

@@ -0,0 +1,559 @@
1
+ # Prime Resonant Graph Memory: A Quantum-Inspired Content-Addressable Memory System Using Prime Number Theory and Hypercomplex Algebra
2
+
3
+ **Sebastian Schepis**
4
+ *Coherent Observer Research*
5
+
6
+ ---
7
+
8
+ ## Abstract
9
+
10
+ We present **PRGraphMemory**, a novel content-addressable memory architecture that combines prime number theory with quaternionic algebra to achieve semantic similarity-based storage and retrieval. Unlike conventional associative memories that rely on vector similarity in Euclidean space, our approach encodes information as sparse superpositions over prime-indexed basis states, where each activation carries both complex amplitude (encoding phase) and quaternionic orientation (encoding directional semantics). Retrieval is performed via a resonance score that unifies set-theoretic overlap, phase coherence, and orientation alignment into a single similarity metric. We further introduce entropy-driven dynamics that enable temporal forgetting and memory consolidation without explicit garbage collection. Theoretical analysis demonstrates that the prime-indexed representation provides logarithmic addressing efficiency while the quaternionic extension captures non-commutative compositional semantics. We describe implementations in both JavaScript and AssemblyScript (WebAssembly), and discuss applications to semantic memory, knowledge graphs, and distributed storage systems.
11
+
12
+ **Keywords:** content-addressable memory, prime numbers, quaternions, resonance, semantic similarity, sparse representation, entropy dynamics
13
+
14
+ ---
15
+
16
+ ## 1. Introduction
17
+
18
+ ### 1.1 Motivation
19
+
20
+ Traditional memory systems operate on exact-match addressing: a key uniquely identifies a value. While efficient for structured data, this paradigm fails to capture the fuzzy, associative nature of human memory and semantic knowledge. Content-addressable memories (CAMs) address this limitation by enabling retrieval based on partial or similar patterns, but conventional CAM implementations face scaling challenges and lack principled mechanisms for temporal dynamics.
21
+
22
+ We propose a fundamentally different approach grounded in two mathematical observations:
23
+
24
+ 1. **Prime Factorization as Semantic Addressing**: The Fundamental Theorem of Arithmetic guarantees unique factorization of integers into primes. By mapping semantic content to prime-indexed activations, we obtain a representation where compositional structure (products of concepts) corresponds to additive structure in the activation space.
25
+
26
+ 2. **Quaternions for Non-Commutative Composition**: Natural language and temporal reasoning exhibit order-sensitivity ("Alice called Bob" ≠ "Bob called Alice"). Quaternionic algebra, being non-commutative under multiplication, naturally encodes this asymmetry in the representation itself.
27
+
28
+ ### 1.2 Contributions
29
+
30
+ This paper makes the following contributions:
31
+
32
+ 1. **Formal Definition**: We define the Prime Resonant Graph Memory (PRGraphMemory) as a content-addressable store operating in the tensor product space H_P ⊗ ℍ, where H_P is a prime-indexed Hilbert space and ℍ is the quaternion algebra.
33
+
34
+ 2. **Resonance Score**: We introduce a novel similarity metric combining Jaccard set overlap, quaternionic alignment, and phase coherence.
35
+
36
+ 3. **Entropy Dynamics**: We describe a temporal evolution mechanism where memories decay according to entropy reduction, with high-coherence memories becoming "locked" (protected from forgetting).
37
+
38
+ 4. **Algorithm Analysis**: We provide computational complexity bounds and discuss the trade-offs between sparsity, expressiveness, and retrieval efficiency.
39
+
40
+ 5. **Implementation**: We present concrete implementations in JavaScript and AssemblyScript, demonstrating practical applicability.
41
+
42
+ ### 1.3 Paper Organization
43
+
44
+ Section 2 reviews related work. Section 3 establishes the mathematical preliminaries. Section 4 defines the PRGraphMemory data structures. Section 5 presents the core algorithms. Section 6 analyzes computational properties. Section 7 discusses applications. Section 8 concludes.
45
+
46
+ ---
47
+
48
+ ## 2. Related Work
49
+
50
+ ### 2.1 Content-Addressable Memory
51
+
52
+ Hopfield networks (Hopfield, 1982) introduced the concept of associative memory in neural systems, where patterns are stored as attractors of a dynamical system. However, capacity scales as O(n) for n neurons, limiting applicability. Modern approaches using attention mechanisms (Vaswani et al., 2017) achieve content-based addressing but require O(n²) computation for n items.
53
+
54
+ ### 2.2 Prime Number Representations
55
+
56
+ The use of prime numbers in computing traces to Gödel numbering, where statements are encoded as products of primes raised to powers. More recently, prime-indexed representations have appeared in the context of hyperdimensional computing (Kanerva, 2009) and compositional semantics (Smolensky, 1990). Our work extends these ideas by incorporating phase and orientation information.
57
+
58
+ ### 2.3 Quaternions in Machine Learning
59
+
60
+ Quaternions have been applied to rotation representation in computer graphics and robotics. In machine learning, quaternion neural networks (Parcollet et al., 2019) demonstrate improved performance on tasks with inherent 3D structure. We apply quaternions not for geometric rotation but for encoding non-commutative semantic relationships.
61
+
62
+ ### 2.4 Quantum-Inspired Computing
63
+
64
+ Quantum-inspired algorithms (Tang, 2019) apply quantum mechanical formalism to classical systems. Our use of superposition, amplitude, and phase follows this paradigm, treating memory states as classical analogues of quantum states without requiring quantum hardware.
65
+
66
+ ---
67
+
68
+ ## 3. Mathematical Preliminaries
69
+
70
+ ### 3.1 Prime Hilbert Space
71
+
72
+ **Definition 3.1 (Prime Hilbert Space).** Let ℙ = {2, 3, 5, 7, 11, ...} denote the set of prime numbers. The *Prime Hilbert Space* H_P is the Hilbert space with orthonormal basis {|p⟩ : p ∈ ℙ}, equipped with the standard inner product:
73
+
74
+ $$\langle p | q \rangle = \delta_{pq}$$
75
+
76
+ where δ_{pq} is the Kronecker delta.
77
+
78
+ **Definition 3.2 (Sparse Prime State).** A *sparse prime state* with sparsity parameter k is a vector in H_P with at most k non-zero amplitudes:
79
+
80
+ $$|\Psi\rangle = \sum_{p \in P_{\text{active}}} c_p |p\rangle$$
81
+
82
+ where |P_active| ≤ k and c_p ∈ ℂ.
83
+
84
+ ### 3.2 Quaternion Algebra
85
+
86
+ **Definition 3.3 (Quaternions).** The quaternion algebra ℍ is the 4-dimensional real algebra generated by {1, i, j, k} subject to:
87
+
88
+ $$i^2 = j^2 = k^2 = ijk = -1$$
89
+
90
+ A quaternion q = w + xi + yj + zk has:
91
+ - **Conjugate**: q* = w - xi - yj - zk
92
+ - **Norm**: |q|² = qq* = w² + x² + y² + z²
93
+ - **Inverse**: q⁻¹ = q*/|q|²
94
+
95
+ **Proposition 3.1 (Non-Commutativity).** For general quaternions q₁, q₂ ∈ ℍ:
96
+
97
+ $$q_1 q_2 \neq q_2 q_1$$
98
+
99
+ The *commutator* [q₁, q₂] = q₁q₂ - q₂q₁ measures the degree of non-commutativity.
100
+
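Proposition 3.1 can be checked directly. A minimal JavaScript sketch of the Hamilton product and commutator on `[w, x, y, z]` arrays — the array representation is an illustrative choice here, not the type used by the shipped implementations:

```javascript
// Quaternion as [w, x, y, z]; Hamilton product and commutator (sketch).
function hamilton(a, b) {
  const [aw, ax, ay, az] = a, [bw, bx, by, bz] = b;
  return [
    aw * bw - ax * bx - ay * by - az * bz,
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
  ];
}
function commutator(a, b) {
  // [a, b] = ab - ba, componentwise
  const ab = hamilton(a, b), ba = hamilton(b, a);
  return ab.map((v, i) => v - ba[i]);
}
```

For the unit quaternions i and j, `hamilton` reproduces ij = k and ji = -k, so the commutator is non-zero.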
101
+ ### 3.3 Prime-Quaternion State Space
102
+
103
+ **Definition 3.4 (Prime-Quaternion State Space).** The *Prime-Quaternion State Space* is the tensor product:
104
+
105
+ $$H_Q = H_P \otimes \mathbb{H}$$
106
+
107
+ An element of H_Q is a sparse mapping from primes to quaternions:
108
+
109
+ $$|\Psi\rangle = \sum_{p \in P_{\text{active}}} \alpha_p \cdot q_p \cdot |p\rangle$$
110
+
111
+ where α_p ∈ ℂ (complex amplitude) and q_p ∈ ℍ (quaternion orientation).
112
+
113
+ ### 3.4 Entropy
114
+
115
+ **Definition 3.5 (Shannon Entropy over Primes).** For a normalized state |Ψ⟩ with amplitudes {c_p}, the entropy is:
116
+
117
+ $$S(\Psi) = -\sum_{p} |c_p|^2 \log_2 |c_p|^2$$
118
+
119
+ Entropy measures the "spread" of the state across prime bases. Maximum entropy occurs for uniform superposition; minimum (zero) entropy occurs for basis states.
120
+
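Definition 3.5 follows directly from the squared amplitude magnitudes. A minimal JavaScript sketch, assuming amplitudes stored as `{re, im}` records in a `Map` (an illustrative shape, not the implementations' internal type):

```javascript
// Definition 3.5: Shannon entropy of a normalized sparse prime state.
function entropy(amplitudes) {
  let S = 0;
  for (const c of amplitudes.values()) {
    const p = c.re * c.re + c.im * c.im; // |c_p|² — assumes Σ|c_p|² = 1
    if (p > 0) S -= p * Math.log2(p);
  }
  return S;
}
```

A uniform superposition over four primes gives S = 2 bits; a single basis state gives S = 0, matching the remarks above.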
121
+ ---
122
+
123
+ ## 4. PRGraphMemory Data Structures
124
+
125
+ ### 4.1 Memory Entry
126
+
127
+ **Definition 4.1 (Memory Entry).** A memory entry E is a tuple:
128
+
129
+ $$E = (id, \Psi, M, S, \lambda, t_c, n_a)$$
130
+
131
+ where:
132
+ - id ∈ ℕ: unique identifier (hash of key)
133
+ - Ψ ∈ H_Q: the prime-quaternion state
134
+ - M: metadata (arbitrary structured data)
135
+ - S ∈ ℝ⁺: current entropy
136
+ - λ ∈ {true, false}: locked status
137
+ - t_c ∈ ℝ: creation timestamp
138
+ - n_a ∈ ℕ: access count
139
+
140
+ ### 4.2 Memory Store
141
+
142
+ **Definition 4.2 (PRGraphMemory).** A PRGraphMemory G is a tuple:
143
+
144
+ $$G = (\mathcal{E}, \mathbb{P}_N, \tau_{\text{lock}}, \gamma)$$
145
+
146
+ where:
147
+ - ℰ: Map⟨id, Entry⟩ is the entry store
148
+ - ℙ_N = {p₁, ..., p_N} is the vocabulary of the first N primes
149
+ - τ_lock ∈ [0, 1]: threshold for memory locking
150
+ - γ ∈ [0, 1]: decay rate per access
151
+
152
+ ---
153
+
154
+ ## 5. Algorithms
155
+
156
+ ### 5.1 Prime-Entropy Hash Function
157
+
158
+ The hash function Π maps a string key to a numeric identifier that determines prime activations.
159
+
160
+ **Algorithm 1: Prime-Entropy Hash**
161
+ ```
+ function Π(key: String) → ℕ:
+     hash ← 0
+     for i = 0 to |key| - 1:
+         hash ← ((hash << 5) - hash + charCode(key[i])) mod 2³²
+     return |hash|
+ ```
168
+
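Algorithm 1 renders naturally in JavaScript, where `<<` coerces to signed 32-bit integers; the `| 0` keeps intermediates in that range, so the final `Math.abs` plays the role of the |hash| step. The function name `primeEntropyHash` is an illustrative choice:

```javascript
// Sketch of Algorithm 1 using JavaScript's signed 32-bit arithmetic.
function primeEntropyHash(key) {
  let hash = 0;
  for (let i = 0; i < key.length; i++) {
    // djb2-style mix: hash * 31 + charCode, truncated to 32 bits
    hash = ((hash << 5) - hash + key.charCodeAt(i)) | 0;
  }
  return Math.abs(hash);
}
```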
169
+ **Proposition 5.1.** The prime-entropy hash distributes uniformly over [0, 2³²) for sufficiently long random strings.
170
+
171
+ ### 5.2 State Encoding
172
+
173
+ Given a text string, we construct a sparse prime state by:
174
+
175
+ 1. Computing the hash to select k primes
176
+ 2. Assigning phase-encoded amplitudes based on position
177
+ 3. Generating quaternion orientations from character statistics
178
+
179
+ **Algorithm 2: Text to SparsePrimeState**
180
+ ```
+ function encode(text: String, k: int, primes: Prime[]) → SparsePrimeState:
+     hash ← Π(text)
+     state ← new SparsePrimeState()
+     charSum ← Σ charCode(text[i])
+
+     for i = 0 to k - 1:
+         idx ← (hash × (i + 1) × 31337) mod |primes|
+         p ← primes[idx]
+         phase ← 2π × i / k
+         amplitude ← Complex.fromPolar(1/√k, phase)
+
+         axis ← [sin(charSum + i), cos(charSum × i), sin(i)]
+         angle ← (charSum × i) mod 2π
+         quaternion ← Quaternion.fromAxisAngle(axis, angle)
+
+         state.set(p, amplitude, quaternion.normalize())
+
+     return state
+ ```
200
+
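A self-contained JavaScript sketch of Algorithm 2. The `Map<prime, {amp, quat}>` state shape and the inlining of `Complex.fromPolar` and `Quaternion.fromAxisAngle` are assumptions for illustration, not the shipped implementations' types; note that duplicate prime indices overwrite, so the state holds at most k entries:

```javascript
// Sketch of Algorithm 2: text → sparse prime state.
function encode(text, k, primes) {
  // Algorithm 1 hash, in signed 32-bit arithmetic
  let hash = 0;
  for (let i = 0; i < text.length; i++) {
    hash = ((hash << 5) - hash + text.charCodeAt(i)) | 0;
  }
  hash = Math.abs(hash);

  let charSum = 0;
  for (let i = 0; i < text.length; i++) charSum += text.charCodeAt(i);

  const state = new Map();
  for (let i = 0; i < k; i++) {
    const idx = (hash * (i + 1) * 31337) % primes.length;
    const phase = (2 * Math.PI * i) / k;
    const amp = {
      re: Math.cos(phase) / Math.sqrt(k), // Complex.fromPolar(1/√k, phase)
      im: Math.sin(phase) / Math.sqrt(k),
    };

    // Axis-angle → unit quaternion (Quaternion.fromAxisAngle + normalize)
    const axis = [Math.sin(charSum + i), Math.cos(charSum * i), Math.sin(i)];
    const n = Math.hypot(...axis) || 1;
    const angle = (charSum * i) % (2 * Math.PI);
    const half = Math.sin(angle / 2);
    let quat = [Math.cos(angle / 2),
                (axis[0] / n) * half, (axis[1] / n) * half, (axis[2] / n) * half];
    const qn = Math.hypot(...quat) || 1;
    quat = quat.map((v) => v / qn);

    state.set(primes[idx], { amp, quat });
  }
  return state;
}
```

Every amplitude has magnitude 1/√k and every quaternion is unit-norm, so the encoded state is normalized by construction.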
201
+ ### 5.3 Resonance Score
202
+
203
+ The resonance score quantifies similarity between two prime-quaternion states.
204
+
205
+ **Definition 5.1 (Resonance Score).** For states Ψ_i and Ψ_j with active prime sets P_i and P_j:
206
+
207
+ $$\text{Res}(\Psi_i, \Psi_j) = \alpha \cdot J(P_i, P_j) + \beta \cdot Q(\Psi_i, \Psi_j) + \gamma \cdot \Phi(\Psi_i, \Psi_j)$$
208
+
209
+ where α + β + γ = 1 and:
210
+
211
+ **Jaccard Similarity (Set Overlap):**
212
+ $$J(P_i, P_j) = \frac{|P_i \cap P_j|}{|P_i \cup P_j|}$$
213
+
214
+ **Quaternion Alignment:**
215
+ $$Q(\Psi_i, \Psi_j) = \frac{1}{|P_i \cap P_j|} \sum_{p \in P_i \cap P_j} |q_i^p \cdot q_j^p|$$
216
+
217
+ where q_i^p · q_j^p denotes the quaternion dot product (as 4-vectors).
218
+
219
+ **Phase Coherence:**
220
+ $$\Phi(\Psi_i, \Psi_j) = \frac{1}{2} \left( 1 + \frac{1}{|P_i \cap P_j|} \sum_{p \in P_i \cap P_j} \cos(\phi_i^p - \phi_j^p) \right)$$
221
+
222
+ where φ^p is the phase of the complex amplitude at prime p.
223
+
224
+ **Algorithm 3: Resonance Score**
225
+ ```
+ function resonanceScore(Ψ_i, Ψ_j, α=0.33, β=0.33, γ=0.34) → ℝ:
+     P_i ← Ψ_i.getActivePrimes()
+     P_j ← Ψ_j.getActivePrimes()
+
+     intersection ← P_i ∩ P_j
+     union ← P_i ∪ P_j
+
+     if |intersection| = 0:
+         return 0    // no shared primes; also avoids division by zero for empty states
+
+     // Jaccard
+     jaccard ← |intersection| / |union|
+
+     // Quaternion alignment
+     quatSum ← 0
+     for p in intersection:
+         quatSum ← quatSum + |Ψ_i.get(p).quaternion · Ψ_j.get(p).quaternion|
+     quatAlign ← quatSum / |intersection|
+
+     // Phase coherence
+     phaseSum ← 0
+     for p in intersection:
+         Δφ ← Ψ_i.get(p).amplitude.phase() - Ψ_j.get(p).amplitude.phase()
+         phaseSum ← phaseSum + cos(Δφ)
+     phaseCoherence ← (phaseSum / |intersection| + 1) / 2
+
+     return α × jaccard + β × quatAlign + γ × phaseCoherence
+ ```
254
+
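To make Definition 5.1 concrete, here is a minimal JavaScript sketch of the resonance score. The `Map<prime, {amp, quat}>` state shape, with `amp = {re, im}` and `quat = [w, x, y, z]` unit quaternions, is an assumption for illustration:

```javascript
// Sketch of the resonance score (Definition 5.1). Weights are the defaults above.
function resonanceScore(a, b, alpha = 0.33, beta = 0.33, gamma = 0.34) {
  const shared = [...a.keys()].filter((p) => b.has(p));
  const unionSize = new Set([...a.keys(), ...b.keys()]).size;
  if (shared.length === 0 || unionSize === 0) return 0;

  const jaccard = shared.length / unionSize;

  let quatSum = 0, phaseSum = 0;
  for (const p of shared) {
    const A = a.get(p), B = b.get(p);
    // |quaternion dot product| as 4-vectors
    quatSum += Math.abs(A.quat.reduce((s, v, i) => s + v * B.quat[i], 0));
    // phase difference of the complex amplitudes
    const dphi = Math.atan2(A.amp.im, A.amp.re) - Math.atan2(B.amp.im, B.amp.re);
    phaseSum += Math.cos(dphi);
  }
  const quatAlign = quatSum / shared.length;
  const phaseCoherence = (phaseSum / shared.length + 1) / 2;

  return alpha * jaccard + beta * quatAlign + gamma * phaseCoherence;
}
```

A state compared with itself scores 1 (all three components maximal); states with disjoint prime supports score 0.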
255
+ ### 5.4 Storage (PRG-Put)
256
+
257
+ **Algorithm 4: Memory Storage**
258
+ ```
+ function put(G: PRGraphMemory, key: String, Ψ: SparsePrimeState, M: Metadata) → id:
+     id ← Π(key)
+     S ← Ψ.entropy()
+
+     entry ← {
+         id: id,
+         key: key,
+         state: Ψ,
+         metadata: M,
+         entropy: S,
+         locked: false,
+         createdAt: now(),
+         accessCount: 0
+     }
+
+     G.ℰ.set(id, entry)
+     return id
+ ```
277
+
278
+ ### 5.5 Retrieval (PRG-Get)
279
+
280
+ **Algorithm 5: Content-Addressable Retrieval**
281
+ ```
+ function get(G: PRGraphMemory, query: SparsePrimeState, k: int) → Entry[]:
+     results ← []
+
+     for (id, entry) in G.ℰ:
+         score ← resonanceScore(query, entry.state)
+
+         // Apply entropy decay
+         entry.entropy ← entry.entropy × (1 - G.γ)
+
+         // Lock condition
+         if entry.entropy < 0.5 and score > G.τ_lock:
+             entry.locked ← true
+
+         entry.accessCount ← entry.accessCount + 1
+
+         results.append({id, score, entry})
+
+     // Sort by score descending
+     results.sort(by: score, order: descending)
+
+     return results[0:k]
+ ```
304
+
305
+ ### 5.6 Hamilton Composition
306
+
307
+ For combining states with order-sensitivity, we use the Hamilton product:
308
+
309
+ **Algorithm 6: Hamilton Composition**
310
+ ```
+ function hamiltonCompose(Ψ_A, Ψ_B) → SparsePrimeState:
+     result ← new SparsePrimeState()
+     allPrimes ← Ψ_A.primes() ∪ Ψ_B.primes()
+
+     for p in allPrimes:
+         // A prime absent from one operand contributes the multiplicative
+         // identity (amplitude 1, quaternion 1), preserving the other side
+         a_A ← Ψ_A.get(p) or identity
+         a_B ← Ψ_B.get(p) or identity
+
+         // Complex amplitude multiplication
+         newAmp ← a_A.amplitude × a_B.amplitude
+
+         // Quaternion Hamilton product (non-commutative!)
+         newQuat ← a_A.quaternion × a_B.quaternion
+
+         result.set(p, newAmp, newQuat.normalize())
+
+     return result.normalize()
+ ```
329
+
330
+ **Theorem 5.1 (Order Sensitivity).** For general states Ψ_A, Ψ_B:
331
+
332
+ $$\text{hamiltonCompose}(\Psi_A, \Psi_B) \neq \text{hamiltonCompose}(\Psi_B, \Psi_A)$$
333
+
334
+ *Proof.* The Hamilton product on quaternions is non-commutative. For any prime p where both states have non-zero quaternion amplitudes q_A and q_B with non-zero commutator, the composed quaternion differs under permutation. ∎
335
+
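Theorem 5.1 can be exercised directly. The following JavaScript sketch mirrors Algorithm 6; filling primes missing from one operand with the multiplicative identity is an assumption (the pseudocode leaves this case implicit), and final renormalization is omitted for brevity:

```javascript
// Sketch of Hamilton composition over Map<prime, {amp, quat}> states.
const IDENT = { amp: { re: 1, im: 0 }, quat: [1, 0, 0, 0] };

function hprod(a, b) { // Hamilton product on [w, x, y, z]
  const [aw, ax, ay, az] = a, [bw, bx, by, bz] = b;
  return [
    aw * bw - ax * bx - ay * by - az * bz,
    aw * bx + ax * bw + ay * bz - az * by,
    aw * by - ax * bz + ay * bw + az * bx,
    aw * bz + ax * by - ay * bx + az * bw,
  ];
}

function hamiltonCompose(A, B) {
  const out = new Map();
  for (const p of new Set([...A.keys(), ...B.keys()])) {
    const a = A.get(p) ?? IDENT, b = B.get(p) ?? IDENT;
    const amp = { // complex amplitude multiplication
      re: a.amp.re * b.amp.re - a.amp.im * b.amp.im,
      im: a.amp.re * b.amp.im + a.amp.im * b.amp.re,
    };
    out.set(p, { amp, quat: hprod(a.quat, b.quat) }); // non-commutative!
  }
  return out;
}
```

Composing i-oriented and j-oriented states in the two orders yields k and -k at the shared prime, exhibiting the order sensitivity the theorem asserts.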
336
+ ### 5.7 Memory Pruning
337
+
338
+ For bounded-capacity systems, we employ utility-based pruning:
339
+
340
+ **Algorithm 7: Memory Pruning**
341
+ ```
+ function prune(G: PRGraphMemory, maxEntries: int):
+     if |G.ℰ| ≤ maxEntries:
+         return
+
+     utilities ← []
+     for (id, entry) in G.ℰ:
+         if entry.locked: continue    // locked memories are protected from forgetting
+         decayFactor ← (1 - G.γ)^(now() - entry.createdAt)    // age-based decay
+         utility ← decayFactor × (1 + log(entry.accessCount + 1))
+         utilities.append({id, utility})
+
+     utilities.sort(by: utility, order: ascending)
+
+     toRemove ← min(|utilities|, |G.ℰ| - maxEntries + ⌊maxEntries / 10⌋)
+     for i = 0 to toRemove - 1:
+         G.ℰ.delete(utilities[i].id)
+ ```
357
+
358
+ ### 5.8 Memory Consolidation
359
+
360
+ Similar memories can be merged to reduce redundancy:
361
+
362
+ **Algorithm 8: Memory Consolidation**
363
+ ```
+ function consolidate(G: PRGraphMemory, θ: float) → int:
+     merged ← 0
+     toRemove ← {}
+
+     for (id_1, e_1) in G.ℰ:
+         if id_1 in toRemove: continue
+
+         for (id_2, e_2) in G.ℰ:
+             if id_2 ≤ id_1 or id_2 in toRemove: continue
+
+             if resonanceScore(e_1.state, e_2.state) > θ:
+                 // Merge e_2 into e_1
+                 e_1.state ← (e_1.state + e_2.state).normalize()
+                 e_1.accessCount ← e_1.accessCount + e_2.accessCount
+                 toRemove.add(id_2)
+                 merged ← merged + 1
+
+     for id in toRemove:
+         G.ℰ.delete(id)
+
+     return merged
+ ```
386
+
387
+ ---
388
+
389
+ ## 6. Theoretical Analysis
390
+
391
+ ### 6.1 Complexity Analysis
392
+
393
+ **Theorem 6.1 (Storage Complexity).** For a memory with n entries, each with sparsity k:
394
+
395
+ - Space: O(n × k) quaternion-amplitude pairs
396
+ - Put: O(k) for state construction, O(1) for storage
397
+ - Get: O(n × k) for resonance scoring (dominates)
398
+
399
+ **Theorem 6.2 (Resonance Score Complexity).** Computing Res(Ψ_i, Ψ_j) requires:
400
+
401
+ - O(k log k) for set intersection (using sorted prime lists)
402
+ - O(|P_i ∩ P_j|) for quaternion and phase components
403
+ - Total: O(k log k)
404
+
405
+ ### 6.2 Capacity and Collision Analysis
406
+
407
+ **Definition 6.1 (Prime Vocabulary).** The vocabulary ℙ_N contains the first N primes.
408
+
409
+ **Proposition 6.1.** For vocabulary size N and sparsity k, the number of distinct states is:
410
+
411
+ $$\binom{N}{k} \times |\mathcal{A}|^k$$
412
+
413
+ where |𝒜| is the cardinality of the amplitude discretization.
414
+
415
+ For N = 4096 and k = 32, the binomial factor alone is C(4096, 32) ≈ 10⁸⁰, and with continuous amplitudes the state space is uncountable. Distinct contents therefore collide with vanishing probability, although exact-key addressing remains bounded by the 32-bit id hash.
416
+
417
+ ### 6.3 Entropy Dynamics
418
+
419
+ **Theorem 6.3 (Entropy Decay Convergence).** Under repeated access with decay rate γ ∈ (0, 1), the entropy of an entry converges to zero:
420
+
421
+ $$S_t = S_0 \times (1 - \gamma)^t \xrightarrow{t \to \infty} 0$$
422
+
423
+ **Corollary 6.1.** Any memory that is accessed sufficiently often, and that at some point scores above τ_lock against a query, eventually becomes locked.
424
+
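The closed form in Theorem 6.3 also tells us how many decaying accesses are needed before an entry can satisfy the entropy half of the lock condition (threshold 0.5 in Algorithm 5). A small sketch, assuming γ ∈ (0, 1):

```javascript
// Theorem 6.3: S_t = S_0 (1 - γ)^t.
function entropyAfter(S0, gamma, t) {
  return S0 * Math.pow(1 - gamma, t);
}

// Smallest t with S_0 (1 - γ)^t < threshold; assumes 0 < gamma < 1.
function accessesUntilBelow(S0, gamma, threshold = 0.5) {
  let t = 0, S = S0;
  while (S >= threshold) {
    S *= 1 - gamma;
    t += 1;
  }
  return t;
}
```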
425
+ ### 6.4 Non-Commutativity Measure
426
+
427
+ **Definition 6.2 (Average Commutator Norm).** For states Ψ_A, Ψ_B sharing primes P = P_A ∩ P_B:
428
+
429
+ $$\text{NonComm}(\Psi_A, \Psi_B) = \frac{1}{|P|} \sum_{p \in P} \|[q_A^p, q_B^p]\|$$
430
+
431
+ **Proposition 6.2.** NonComm = 0 if and only if every shared quaternion pair commutes, i.e., the two vector parts are parallel (which includes the case where either quaternion is real).
432
+
433
+ ---
434
+
435
+ ## 7. Applications
436
+
437
+ ### 7.1 Semantic Memory for Language Models
438
+
439
+ PRGraphMemory can serve as external memory for large language models:
440
+
441
+ 1. **Fact Storage**: Encode facts as prime-indexed states
442
+ 2. **Retrieval Augmentation**: Query memory during generation
443
+ 3. **Temporal Coherence**: Older, less-accessed facts naturally fade
444
+ 4. **Consolidation**: Related facts merge into unified representations
445
+
446
+ ### 7.2 Knowledge Graph Completion
447
+
448
+ Traditional knowledge graphs require explicit edges. PRGraphMemory enables:
449
+
450
+ - **Implicit Relationships**: Similarity via resonance score
451
+ - **Fuzzy Queries**: Partial match retrieval
452
+ - **Dynamic Strength**: Relationship strength evolves with access patterns
453
+
454
+ ### 7.3 Distributed Storage via CRT
455
+
456
+ Using the Chinese Remainder Theorem, states can be sharded across nodes indexed by distinct primes:
457
+
458
+ **Algorithm 9: CRT Reconstruction**
459
+ ```
+ function reconstructFromResidues(residues: int[], primes: Prime[]) → int:
+     M ← Π_{p ∈ primes} p
+     result ← 0
+
+     for i = 0 to |primes| - 1:
+         M_i ← M / primes[i]
+         y_i ← modularInverse(M_i, primes[i])
+         result ← (result + residues[i] × M_i × y_i) mod M
+
+     return result
+ ```
471
+
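Algorithm 9 is directly runnable in JavaScript with `BigInt` to avoid overflow for large moduli; `modInverse` below is an extended-Euclid helper standing in for the `modularInverse` the pseudocode calls, and the moduli are assumed pairwise coprime:

```javascript
// Modular inverse of a (mod m) via the extended Euclidean algorithm (BigInt).
function modInverse(a, m) {
  let [oldR, r] = [a % m, m];
  let [oldS, s] = [1n, 0n];
  while (r !== 0n) {
    const q = oldR / r; // BigInt division truncates
    [oldR, r] = [r, oldR - q * r];
    [oldS, s] = [s, oldS - q * s];
  }
  return ((oldS % m) + m) % m; // oldR = gcd = 1 for coprime inputs
}

// CRT reconstruction sketch (Algorithm 9): residues[i] = x mod primes[i].
function reconstructFromResidues(residues, primes) {
  const M = primes.reduce((acc, p) => acc * p, 1n);
  let result = 0n;
  for (let i = 0; i < primes.length; i++) {
    const Mi = M / primes[i];
    const yi = modInverse(Mi, primes[i]);
    result = (result + residues[i] * Mi * yi) % M;
  }
  return result;
}
```

For example, the residues of 23 modulo 3, 5, and 7 are 2, 3, and 2, and the reconstruction recovers 23.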
472
+ This enables:
473
+ - **Redundancy**: State recoverable from subset of shards
474
+ - **Locality**: Access patterns determine physical placement
475
+ - **Security**: No single node holds complete information
476
+
477
+ ### 7.4 Temporal Reasoning
478
+
479
+ The order-sensitive Hamilton composition enables:
480
+
481
+ - **Event Sequencing**: "A then B" ≠ "B then A"
482
+ - **Causal Inference**: Asymmetric relationships encoded in quaternions
483
+ - **Narrative Memory**: Stories encoded with temporal structure
484
+
485
+ ---
486
+
487
+ ## 8. Conclusion
488
+
489
+ We have presented PRGraphMemory, a content-addressable memory system grounded in prime number theory and quaternionic algebra. The key innovations are:
490
+
491
+ 1. **Prime-Indexed Representation**: Compositional semantics via sparse superpositions over prime bases
492
+ 2. **Quaternionic Orientation**: Non-commutative structure for order-sensitive composition
493
+ 3. **Resonance-Based Retrieval**: Unified similarity metric combining set overlap, orientation alignment, and phase coherence
494
+ 4. **Entropy Dynamics**: Natural temporal forgetting with consolidation of stable memories
495
+
496
+ The approach offers a principled alternative to vector-based semantic memory, with theoretical foundations in number theory, hypercomplex algebra, and information theory. Future work will explore integration with neural architectures, scaling to billions of entries, and applications to multi-modal memory systems.
497
+
498
+ ---
499
+
500
+ ## References
501
+
502
+ 1. Hopfield, J.J. (1982). Neural networks and physical systems with emergent collective computational abilities. *Proceedings of the National Academy of Sciences*, 79(8), 2554-2558.
503
+
504
+ 2. Kanerva, P. (2009). Hyperdimensional computing: An introduction to computing in distributed representation with high-dimensional random vectors. *Cognitive Computation*, 1(2), 139-159.
505
+
506
+ 3. Parcollet, T., et al. (2019). Quaternion recurrent neural networks. *ICLR 2019*.
507
+
508
+ 4. Smolensky, P. (1990). Tensor product variable binding and the representation of symbolic structures in connectionist systems. *Artificial Intelligence*, 46(1-2), 159-216.
509
+
510
+ 5. Tang, E. (2019). A quantum-inspired classical algorithm for recommendation systems. *STOC 2019*.
511
+
512
+ 6. Vaswani, A., et al. (2017). Attention is all you need. *NeurIPS 2017*.
513
+
514
+ ---
515
+
516
+ ## Appendix A: Implementation Notes
517
+
518
+ ### A.1 JavaScript Implementation
519
+
520
+ Located at [`core/rformer.js`](../../core/rformer.js), the JavaScript implementation prioritizes readability and integration with Node.js ecosystems. Key design choices:
521
+
522
+ - String keys hashed to numeric IDs
523
+ - Real timestamps for temporal tracking
524
+ - Unbounded storage (no automatic pruning)
525
+ - CRT reconstruction for distributed use cases
526
+
527
+ ### A.2 AssemblyScript Implementation
528
+
529
+ Located at [`@sschepis/resolang/assembly/rformer.ts`](../../node_modules/@sschepis/resolang/assembly/rformer.ts), the AssemblyScript version targets WebAssembly for performance:
530
+
531
+ - Integer IDs for efficient storage
532
+ - Virtual time for deterministic testing
533
+ - Capacity limits with automatic pruning
534
+ - Memory consolidation for efficiency
535
+
536
+ ### A.3 Usage Example
537
+
538
+ ```javascript
+ const { PRGraphMemory, SparsePrimeState } = require('@aleph-ai/tinyaleph');
+
+ const memory = new PRGraphMemory(4096, 0.8);
+
+ // Store semantic content
+ memory.put('capital', SparsePrimeState.fromHash('Paris is the capital of France'),
+            { domain: 'geography' });
+
+ // Content-addressable retrieval
+ const query = SparsePrimeState.fromHash('capital city of France');
+ const results = memory.get(query, 3);
+
+ // Results ranked by resonance score (each result is {id, score, entry})
+ for (const r of results) {
+   console.log(`Score: ${r.score.toFixed(4)}, Locked: ${r.entry.locked}`);
+ }
+ ```
556
+
557
+ ---
558
+
559
+ *© 2025 Coherent Observer Research. This work is released under the MIT License.*