@aleph-ai/tinyaleph 1.0.0
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- package/LICENSE +21 -0
- package/README.md +278 -0
- package/backends/cryptographic/index.js +196 -0
- package/backends/index.js +15 -0
- package/backends/interface.js +89 -0
- package/backends/scientific/index.js +272 -0
- package/backends/semantic/index.js +527 -0
- package/backends/semantic/surface.js +393 -0
- package/backends/semantic/two-layer.js +375 -0
- package/core/fano.js +127 -0
- package/core/hilbert.js +564 -0
- package/core/hypercomplex.js +141 -0
- package/core/index.js +133 -0
- package/core/llm.js +132 -0
- package/core/prime.js +184 -0
- package/core/resonance.js +695 -0
- package/core/rformer-tf.js +1086 -0
- package/core/rformer.js +806 -0
- package/core/sieve.js +350 -0
- package/data.json +8163 -0
- package/docs/EXAMPLES_PLAN.md +293 -0
- package/docs/README.md +159 -0
- package/docs/design/ALEPH_CHAT_ARCHITECTURE.md +499 -0
- package/docs/guide/01-quickstart.md +298 -0
- package/docs/guide/02-semantic-computing.md +409 -0
- package/docs/guide/03-cryptographic.md +420 -0
- package/docs/guide/04-scientific.md +494 -0
- package/docs/guide/05-llm-integration.md +568 -0
- package/docs/guide/06-advanced.md +996 -0
- package/docs/guide/README.md +188 -0
- package/docs/reference/01-core.md +695 -0
- package/docs/reference/02-physics.md +601 -0
- package/docs/reference/03-backends.md +892 -0
- package/docs/reference/04-engine.md +632 -0
- package/docs/reference/README.md +252 -0
- package/docs/theory/01-prime-semantics.md +327 -0
- package/docs/theory/02-hypercomplex-algebra.md +421 -0
- package/docs/theory/03-phase-synchronization.md +364 -0
- package/docs/theory/04-entropy-reasoning.md +348 -0
- package/docs/theory/05-non-commutativity.md +402 -0
- package/docs/theory/06-two-layer-meaning.md +414 -0
- package/docs/theory/07-resonant-field-interface.md +419 -0
- package/docs/theory/08-semantic-sieve.md +520 -0
- package/docs/theory/09-temporal-emergence.md +298 -0
- package/docs/theory/10-quaternionic-memory.md +415 -0
- package/docs/theory/README.md +162 -0
- package/engine/aleph.js +418 -0
- package/engine/index.js +7 -0
- package/index.js +23 -0
- package/modular.js +254 -0
- package/package.json +99 -0
- package/physics/collapse.js +95 -0
- package/physics/entropy.js +88 -0
- package/physics/index.js +65 -0
- package/physics/kuramoto.js +91 -0
- package/physics/lyapunov.js +80 -0
- package/physics/oscillator.js +95 -0
- package/types/index.d.ts +575 -0
@@ -0,0 +1,414 @@
# Two-Layer Meaning

## The Fundamental Distinction

A profound insight underlies Aleph's architecture: **meaning is different from words**.

Consider: we have many words for the same concept:
- "awesome", "splendid", "magnificent", "fire 🔥", "magnifique"

All different surface forms. All pointing to the same underlying meaning.

This reveals a two-layer structure:

```
┌──────────────────────────────────────────────────────┐
│ LAYER 1: PRIME SUBSTRATE                             │
│                                                      │
│ Universal, pre-linguistic meaning structure          │
│ Same primes = same meaning, regardless of language   │
│                                                      │
│ AWESTRUCK = [2, 5, 11]                               │
│ (existence + form + psyche)                          │
└──────────────────────────────────────────────────────┘
                           ↕
┌──────────────────────────────────────────────────────┐
│ LAYER 2: SURFACE VOCABULARY                          │
│                                                      │
│ Language-specific, culture-dependent word choice     │
│ Same primes → different words in different contexts  │
│                                                      │
│ [2, 5, 11] → "awesome"    (American casual)          │
│ [2, 5, 11] → "splendid"   (British formal)           │
│ [2, 5, 11] → "fire 🔥"    (Gen Z slang)              │
│ [2, 5, 11] → "magnifique" (French)                   │
└──────────────────────────────────────────────────────┘
```

---

## Layer 1: The Prime Substrate

The prime substrate is **invariant meaning**: what something actually IS.

### Properties

| Property | Description |
|----------|-------------|
| **Pre-linguistic** | Exists before words |
| **Cross-cultural** | Same in every language |
| **Cross-species** | A dog feels fear too |
| **Cross-temporal** | Ancient humans felt love |
| **Invariant** | The meaning doesn't change |

### Example Encoding

```
AWESTRUCK = [2, 5, 11]
  2  = existence (something IS)
  5  = form (impressive structure)
  11 = psyche (affects the soul)

→ "that which exists with impressive form affecting the soul"
```

This prime signature is the **same** regardless of what word you use to express it.
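Because a signature is just a multiset of primes, unique factorization makes the product of those primes a compact meaning fingerprint: two words mean the same thing exactly when their products match. A minimal sketch (the `vocab` entries and the `signature` helper are illustrative assumptions, not the package's API):

```javascript
// Toy vocabulary: several surface forms sharing one prime signature.
// (Illustrative assumption: the package's real vocabularies are richer.)
const vocab = {
  'awesome':    [2, 5, 11],
  'splendid':   [2, 5, 11],
  'magnifique': [2, 5, 11],
  'truth':      [7, 11, 13],
};

// Unique factorization: a multiset of primes has a unique product,
// so the product serves as a compact meaning fingerprint.
const signature = primes => primes.reduce((acc, p) => acc * p, 1);

const sameMeaning = (w1, w2) => signature(vocab[w1]) === signature(vocab[w2]);

console.log(signature(vocab['awesome']));        // 110 (= 2 * 5 * 11)
console.log(sameMeaning('awesome', 'splendid')); // true
console.log(sameMeaning('awesome', 'truth'));    // false
```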
---

## Layer 2: The Surface Vocabulary

The surface layer maps primes to words, with **biases** that depend on context.

### Multiple Words, Same Primes

| Word | Language/Context | Prime Signature |
|------|------------------|-----------------|
| awesome | American casual | [2, 5, 11] |
| splendid | British formal | [2, 5, 11] |
| fire 🔥 | Gen Z slang | [2, 5, 11] |
| magnifique | French | [2, 5, 11] |
| sugoi | Japanese | [2, 5, 11] |
| wunderbar | German | [2, 5, 11] |

### Properties

| Property | Description |
|----------|-------------|
| **Linguistic** | Requires language |
| **Cultural** | Varies by community |
| **Temporal** | Changes over time |
| **Arbitrary** | Any symbol can map |
| **Learned** | Acquired through exposure |

---

## The Translation Machine

Translation is NOT: Word₁ → Word₂ (error-prone, loses nuance)

Translation IS: Word₁ → Primes → Word₂ (meaning-preserving)

```javascript
translate(word, fromLanguage, toLanguage) {
  // Go down to the prime layer
  const primes = this.vocabularies[fromLanguage].encode(word);

  // Come back up in the new language
  return this.vocabularies[toLanguage].decode(primes);
}
```

This explains why translation is possible (same primes) and why it's hard (different surface forms).
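The round trip can be sketched end to end with two toy vocabularies. Everything here (the `makeVocabulary` helper, the words, the prime assignments) is an illustrative assumption, not the package's shipped data:

```javascript
// Each vocabulary maps words to prime signatures; decode inverts the map.
function makeVocabulary(entries) {
  const toPrimes = new Map(Object.entries(entries));
  return {
    encode: word => toPrimes.get(word),
    decode: primes => {
      const key = primes.join(',');
      for (const [word, sig] of toPrimes) {
        if (sig.join(',') === key) return word;
      }
      return null;
    }
  };
}

const vocabularies = {
  'en-casual': makeVocabulary({ 'real talk': [7, 11, 13], 'smarts': [2, 7, 11] }),
  'en-formal': makeVocabulary({ 'verity':    [7, 11, 13], 'wisdom': [2, 7, 11] }),
};

function translate(word, fromLanguage, toLanguage) {
  const primes = vocabularies[fromLanguage].encode(word); // down to primes
  return vocabularies[toLanguage].decode(primes);         // up in the new language
}

console.log(translate('real talk', 'en-casual', 'en-formal')); // 'verity'
console.log(translate('wisdom', 'en-formal', 'en-casual'));    // 'smarts'
```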
---

## The TwoLayerEngine

Aleph implements this architecture:

```javascript
class TwoLayerEngine {
  constructor(config) {
    // Layer 1: Core meaning engine
    this.core = new SemanticBackend(config.core);

    // Layer 2: Surface word selection
    this.surfaces = new SurfaceManager();
    this.biasEngine = new BiasEngine();
  }

  process(input, options = {}) {
    // Layer 1: Extract meaning
    const tokens = this.core.encodeOrdered(input);
    const meaningState = this.core.orderedPrimesToState(tokens);

    // Layer 2: Select words
    const selectedWords = this.selectWords(tokens, options);

    return {
      meaning: meaningState,
      tokens,
      surface: selectedWords
    };
  }
}
```

---

## Surface Vocabularies (Registers)

Different registers map the same primes to different words:

### Formal Register

```javascript
surfaces.create('formal', {
  vocabulary: {
    'truth':    { primes: [7, 11, 13], bias: 1.0 },
    'verity':   { primes: [7, 11, 13], bias: 0.8 },
    'veracity': { primes: [7, 11, 13], bias: 0.6 },

    'wisdom':   { primes: [2, 7, 11], bias: 1.0 },
    'sagacity': { primes: [2, 7, 11], bias: 0.6 },
    'prudence': { primes: [2, 7, 11], bias: 0.8 },
  }
});
```

### Casual Register

```javascript
surfaces.create('casual', {
  vocabulary: {
    'truth':       { primes: [7, 11, 13], bias: 0.5 },
    'real talk':   { primes: [7, 11, 13], bias: 1.0 },
    'straight up': { primes: [7, 11, 13], bias: 0.8 },
    'no cap':      { primes: [7, 11, 13], bias: 0.6 },

    'wisdom':      { primes: [2, 7, 11], bias: 0.6 },
    'smarts':      { primes: [2, 7, 11], bias: 1.0 },
    'big brain':   { primes: [2, 7, 11], bias: 0.7 },
  }
});
```

### Technical Register

```javascript
surfaces.create('technical', {
  vocabulary: {
    'truth':    { primes: [7, 11, 13], bias: 0.8 },
    'validity': { primes: [7, 11, 13], bias: 1.0 },
    'accuracy': { primes: [7, 11, 13], bias: 0.9 },

    'think':    { primes: [5, 7, 11], bias: 0.6 },
    'compute':  { primes: [5, 7, 11], bias: 1.0 },
    'process':  { primes: [5, 7, 11], bias: 0.9 },
  }
});
```

---

## Bias-Based Word Selection

Words are selected based on bias weights:

```javascript
decode(primes, options = {}) {
  const candidates = [];

  for (const [word, entry] of this.vocabulary) {
    // Match primes
    const match = this.matchScore(primes, entry.primes);
    if (match < 0.5) continue;

    // Apply bias
    let score = match * entry.bias;

    // Context boost
    if (options.contexts?.some(c => entry.contexts?.includes(c))) {
      score *= 1.5;
    }

    // Avoid recently used (for variety)
    if (options.avoid?.includes(word)) {
      score *= 0.5;
    }

    candidates.push({ word, score });
  }

  // Select the highest-scoring candidate
  candidates.sort((a, b) => b.score - a.score);
  return candidates[0]?.word || this.fallback(primes);
}
```
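The `matchScore` helper used above is not shown; one reasonable choice, assumed here rather than taken from the package source, is the Jaccard overlap between the requested primes and an entry's primes:

```javascript
// Jaccard similarity between two prime lists: |A ∩ B| / |A ∪ B|.
// Returns 1.0 for identical signatures, 0.0 for disjoint ones.
function matchScore(primesA, primesB) {
  const a = new Set(primesA);
  const b = new Set(primesB);
  const shared = [...a].filter(p => b.has(p)).length;
  const total = new Set([...a, ...b]).size;
  return total === 0 ? 0 : shared / total;
}

matchScore([7, 11, 13], [7, 11, 13]); // 1.0
matchScore([7, 11, 13], [7, 11, 17]); // 0.5 (2 shared / 4 total)
matchScore([2, 3], [5, 7]);           // 0.0
```

With the 0.5 cutoff in `decode`, this accepts entries that share at least half of the combined primes.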
---

## Register Translation

Same meaning, different register:

```javascript
translate(input, fromRegister, toRegister) {
  // Encode in source register
  this.surfaces.use(fromRegister);
  const meaning = this.process(input);

  // Decode in target register
  this.surfaces.use(toRegister);
  const translated = this.selectWords(meaning.tokens);

  return {
    original: input,
    translated: translated.join(' '),
    meaning: meaning.meaning // Same meaning!
  };
}
```

### Examples

```
Formal → Casual:
"The verity of the matter is sagacious."
→ "Real talk, that's big brain."

Technical → Poetic:
"Process the data to compute validity."
→ "Muse upon the essence to divine the verity."

Casual → Formal:
"That's fire, no cap."
→ "That is truly magnificent."
```
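The Formal → Casual example can be reproduced end to end with two small registers and highest-bias decoding. This is a toy sketch: the vocabulary entries and the `translateRegister` helper are assumptions, not the engine's real data or API:

```javascript
// Two toy registers mapping words to { primes, bias } entries.
const registers = {
  formal: {
    'verity':    { primes: [7, 11, 13], bias: 0.8 },
    'truth':     { primes: [7, 11, 13], bias: 1.0 },
    'sagacious': { primes: [2, 7, 11],  bias: 0.9 },
  },
  casual: {
    'real talk': { primes: [7, 11, 13], bias: 1.0 },
    'no cap':    { primes: [7, 11, 13], bias: 0.6 },
    'big brain': { primes: [2, 7, 11],  bias: 0.7 },
  },
};

function encode(register, word) {
  return registers[register][word]?.primes ?? null;
}

// Decode: among words sharing the signature, pick the highest bias.
function decode(register, primes) {
  const key = primes.join(',');
  let best = null;
  let bestBias = -1;
  for (const [word, entry] of Object.entries(registers[register])) {
    if (entry.primes.join(',') === key && entry.bias > bestBias) {
      bestBias = entry.bias;
      best = word;
    }
  }
  return best;
}

function translateRegister(word, from, to) {
  return decode(to, encode(from, word));
}

console.log(translateRegister('verity', 'formal', 'casual'));    // 'real talk'
console.log(translateRegister('sagacious', 'formal', 'casual')); // 'big brain'
```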
---

## Why This Matters

The two-layer model explains:

### Why Translation Is Possible
Same primes → same meaning → different words work.

### Why Translation Is Hard
Surface forms carry connotations that don't transfer.

### Why Poetry Survives Translation
Prime resonance transcends specific words.

### Why Puns Don't Translate
Puns depend on surface form, not meaning.

### Why Babies Learn Meaning Before Words
They absorb primes first, words second.

### Why Music Is Universal
Frequencies are primes; no words needed.

---

## Cross-Species Communication

If animals also operate on primes:

```javascript
const dogVocab = {
  'wag':   [2, 3, 5],   // existence + unity + form → happiness
  'growl': [5, 7, 23],  // form + boundary + intensity → threat
  'whine': [3, 11, 19], // unity + psyche + lack → need
};

// Pick the dog signal whose primes overlap most with the input
function findBestMatch(vocab, primes) {
  let best = null;
  let bestScore = -1;
  for (const [signal, sig] of Object.entries(vocab)) {
    const score = sig.filter(p => primes.includes(p)).length;
    if (score > bestScore) { bestScore = score; best = signal; }
  }
  return best;
}

// Assumes a humanVocab with an encode(word) → primes method
function humanToDog(humanWord) {
  const primes = humanVocab.encode(humanWord);
  return findBestMatch(dogVocab, primes);
}

humanToDog('good boy'); // → 'wag' (same prime signature!)
```
---

## The Platonic Connection

The prime layer is the **Platonic realm of meaning**.

Words are shadows on the cave wall. Different cultures, languages, and eras cast different shadows. But the object casting the shadow, the prime signature, remains constant.

```
          MEANING
      (Platonic Form)
             ↓
     ┌───────────────┐
     │    PRIMES     │
     │ [2,3,5,7...]  │
     │   universal   │
     └───────┬───────┘
             │
 ┌───────────┼───────────┐
 ↓           ↓           ↓
┌───────┐ ┌────────┐ ┌───────┐
│English│ │Mandarin│ │  Dog  │
│ words │ │  汉字  │ │sounds │
└───────┘ └────────┘ └───────┘
    ↓          ↓         ↓
 spoken    written    barked
  sound     symbol     sound
```

---

## Implementation in Aleph

### Backend Architecture

```javascript
class SemanticBackend {
  // Layer 1: Universal prime substrate
  primeSubstrate = {
    2: { essence: 'existence', resonance: 1.0 },
    3: { essence: 'unity', resonance: 1.0 },
    5: { essence: 'form', resonance: 1.0 },
    // ...
  };

  // Layer 2: Cultural vocabulary mappings (each a Map of word → primes)
  vocabularies = {
    'en-US-casual': new Map(),
    'en-GB-formal': new Map(),
    'fr-FR': new Map(),
    'emoji': new Map(),
    // ...
  };
}
```

### Cross-Cultural Bridging

```javascript
function findSharedMeaning(culture1, word1, culture2, word2) {
  const primes1 = cultures[culture1].encode(word1);
  const primes2 = cultures[culture2].encode(word2);

  const overlap = intersection(primes1, primes2);

  return {
    shared: overlap,                       // Common ground
    unique1: difference(primes1, overlap), // What culture1 adds
    unique2: difference(primes2, overlap), // What culture2 adds
    compatibility: overlap.length / union(primes1, primes2).length
  };
}
```
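The `intersection`, `difference`, and `union` helpers are left implicit above; in plain JavaScript they might look like this (hypothetical helpers, not exports of the package). The resulting `compatibility` is just the Jaccard index of the two signatures:

```javascript
// Set operations over prime lists, keeping plain arrays throughout.
const intersection = (a, b) => a.filter(x => b.includes(x));
const difference   = (a, b) => a.filter(x => !b.includes(x));
const union        = (a, b) => [...new Set([...a, ...b])];

// Example signatures (illustrative values, not shipped data):
const primes1 = [2, 7, 11];  // e.g. "wisdom"
const primes2 = [7, 11, 13]; // e.g. "truth"
const overlap = intersection(primes1, primes2);

console.log(overlap);                                         // [7, 11]
console.log(difference(primes1, overlap));                    // [2]
console.log(overlap.length / union(primes1, primes2).length); // 0.5
```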
---

## Summary

The two-layer model:

1. **Separates meaning from words**: primes are meaning, words are pointers
2. **Enables translation**: through the prime layer
3. **Supports registers**: same meaning, different style
4. **Explains universality**: some concepts transcend culture
5. **Models acquisition**: primes first, words second

Aleph is a GPS for meaning: it works regardless of which map (language) you're using.

---

## Next: [Resonant Field Interface →](./07-resonant-field-interface.md)