ruvector 0.1.24 → 0.1.26

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,7 +1,7 @@
  {
- "startTime": 1764220734177,
- "sessionId": "session-1764220734177",
- "lastActivity": 1764220734177,
+ "startTime": 1764543814799,
+ "sessionId": "session-1764543814799",
+ "lastActivity": 1764543814799,
  "sessionDuration": 0,
  "totalTasks": 1,
  "successfulTasks": 1,
@@ -1,10 +1,10 @@
  [
  {
- "id": "cmd-hooks-1764220734296",
+ "id": "cmd-hooks-1764543814926",
  "type": "hooks",
  "success": true,
- "duration": 9.616918999999996,
- "timestamp": 1764220734306,
+ "duration": 11.31845599999997,
+ "timestamp": 1764543814938,
  "metadata": {}
  }
  ]
package/README.md CHANGED
@@ -484,6 +484,119 @@ npx ruvector gnn search -q "[1.0,0.0,0.0]" -c candidates.json -k 5
  # -t, --temperature Softmax temperature (default: 1.0)
  ```

+ ### Attention Commands
+
+ Ruvector includes high-performance attention mechanisms for transformer-based operations, hyperbolic embeddings, and graph attention.
+
+ ```bash
+ # Install the attention module (optional)
+ npm install @ruvector/attention
+ ```
+
+ #### Attention Mechanisms Reference
+
+ | Mechanism | Type | Complexity | When to Use |
+ |-----------|------|------------|-------------|
+ | **DotProductAttention** | Core | O(n²) | Standard scaled dot-product attention for transformers |
+ | **MultiHeadAttention** | Core | O(n²) | Parallel attention heads for capturing different relationships |
+ | **FlashAttention** | Core | O(n²) IO-optimized | Memory-efficient attention for long sequences |
+ | **HyperbolicAttention** | Core | O(n²) | Hierarchical data, tree-like structures, taxonomies |
+ | **LinearAttention** | Core | O(n) | Very long sequences where O(n²) is prohibitive |
+ | **MoEAttention** | Core | O(n*k) | Mixture of Experts routing, specialized attention |
+ | **GraphRoPeAttention** | Graph | O(n²) | Graph data with rotary position embeddings |
+ | **EdgeFeaturedAttention** | Graph | O(n²) | Graphs with rich edge features/attributes |
+ | **DualSpaceAttention** | Graph | O(n²) | Combined Euclidean + hyperbolic representation |
+ | **LocalGlobalAttention** | Graph | O(n*k) | Large graphs with local + global context |
+
+ #### Attention Info
+
+ ```bash
+ # Show attention module information
+ npx ruvector attention info
+
+ # Output:
+ # Attention Module Information
+ # Status: Available
+ # Version: 0.1.0
+ # Platform: linux
+ # Architecture: x64
+ #
+ # Core Attention Mechanisms:
+ # • DotProductAttention - Scaled dot-product attention
+ # • MultiHeadAttention - Multi-head self-attention
+ # • FlashAttention - Memory-efficient IO-aware attention
+ # • HyperbolicAttention - Poincaré ball attention
+ # • LinearAttention - O(n) linear complexity attention
+ # • MoEAttention - Mixture of Experts attention
+ ```
+
+ #### Attention List
+
+ ```bash
+ # List all available attention mechanisms
+ npx ruvector attention list
+
+ # With verbose details
+ npx ruvector attention list -v
+ ```
+
+ #### Attention Benchmark
+
+ ```bash
+ # Benchmark attention mechanisms
+ npx ruvector attention benchmark -d 256 -n 100 -i 100
+
+ # Options:
+ # -d, --dimension Vector dimension (default: 256)
+ # -n, --num-vectors Number of vectors (default: 100)
+ # -i, --iterations Benchmark iterations (default: 100)
+ # -t, --types Attention types to benchmark (default: dot,flash,linear)
+
+ # Example output:
+ # Dimension: 256
+ # Vectors: 100
+ # Iterations: 100
+ #
+ # dot: 0.012ms/op (84,386 ops/sec)
+ # flash: 0.012ms/op (82,844 ops/sec)
+ # linear: 0.066ms/op (15,259 ops/sec)
+ ```
+
+ #### Hyperbolic Operations
+
+ ```bash
+ # Calculate Poincaré distance between two points
+ npx ruvector attention hyperbolic -a distance -v "[0.1,0.2,0.3]" -b "[0.4,0.5,0.6]"
+
+ # Project vector to Poincaré ball
+ npx ruvector attention hyperbolic -a project -v "[1.5,2.0,0.8]"
+
+ # Möbius addition in hyperbolic space
+ npx ruvector attention hyperbolic -a mobius-add -v "[0.1,0.2]" -b "[0.3,0.4]"
+
+ # Exponential map (tangent space → Poincaré ball)
+ npx ruvector attention hyperbolic -a exp-map -v "[0.1,0.2,0.3]"
+
+ # Options:
+ # -a, --action Action: distance|project|mobius-add|exp-map|log-map
+ # -v, --vector Input vector as JSON array (required)
+ # -b, --vector-b Second vector for binary operations
+ # -c, --curvature Poincaré ball curvature (default: 1.0)
+ ```
+
+ #### When to Use Each Attention Type
+
+ | Use Case | Recommended Attention | Reason |
+ |----------|----------------------|--------|
+ | **Standard NLP/Transformers** | MultiHeadAttention | Industry standard, well-tested |
+ | **Long Documents (>4K tokens)** | FlashAttention or LinearAttention | Memory efficient |
+ | **Hierarchical Classification** | HyperbolicAttention | Captures tree-like structures |
+ | **Knowledge Graphs** | GraphRoPeAttention | Position-aware graph attention |
+ | **Multi-Relational Graphs** | EdgeFeaturedAttention | Leverages edge attributes |
+ | **Taxonomy/Ontology Search** | DualSpaceAttention | Best of both Euclidean + hyperbolic |
+ | **Large-Scale Graphs** | LocalGlobalAttention | Efficient local + global context |
+ | **Model Routing/MoE** | MoEAttention | Expert selection and routing |
+
  ## 📊 Performance Benchmarks

  Tested on AMD Ryzen 9 5950X, 128-dimensional vectors:
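The attention commands documented above wrap a programmatic API. As a minimal sketch of driving the module directly from Node, based on how `bin/cli.js` (diffed below) calls it — the `DotProductAttention` constructor and `forward(queries, keys, values)` signature are taken from that CLI code and should be verified against the installed `@ruvector/attention` version:

```js
// Sketch only: mirrors the CLI's usage in bin/cli.js. Queries are passed as
// a matrix (array of vectors); values default to the keys here. Treat the
// exact constructor arguments and forward() signature as assumptions.
const { DotProductAttention } = require('@ruvector/attention');

const query = [0.1, 0.2, 0.3, 0.4];
const keys = [
  [0.9, 0.1, 0.0, 0.0],
  [0.0, 0.8, 0.1, 0.1],
  [0.2, 0.2, 0.5, 0.1],
];

const attn = new DotProductAttention();
const output = attn.forward([query], keys, keys); // one output row per query
console.log(output[0]); // attention-weighted combination of the key vectors
```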
package/bin/cli.js CHANGED
@@ -47,6 +47,46 @@ try {
  // GNN not available - commands will show helpful message
  }

+ // Import Attention (optional - graceful fallback if not available)
+ let DotProductAttention, MultiHeadAttention, HyperbolicAttention, FlashAttention, LinearAttention, MoEAttention;
+ let GraphRoPeAttention, EdgeFeaturedAttention, DualSpaceAttention, LocalGlobalAttention;
+ let benchmarkAttention, computeAttentionAsync, batchAttentionCompute, parallelAttentionCompute;
+ let expMap, logMap, mobiusAddition, poincareDistance, projectToPoincareBall;
+ let attentionInfo, attentionVersion;
+ let attentionAvailable = false;
+ try {
+ const attention = require('@ruvector/attention');
+ // Core mechanisms
+ DotProductAttention = attention.DotProductAttention;
+ MultiHeadAttention = attention.MultiHeadAttention;
+ HyperbolicAttention = attention.HyperbolicAttention;
+ FlashAttention = attention.FlashAttention;
+ LinearAttention = attention.LinearAttention;
+ MoEAttention = attention.MoEAttention;
+ // Graph attention
+ GraphRoPeAttention = attention.GraphRoPeAttention;
+ EdgeFeaturedAttention = attention.EdgeFeaturedAttention;
+ DualSpaceAttention = attention.DualSpaceAttention;
+ LocalGlobalAttention = attention.LocalGlobalAttention;
+ // Utilities
+ benchmarkAttention = attention.benchmarkAttention;
+ computeAttentionAsync = attention.computeAttentionAsync;
+ batchAttentionCompute = attention.batchAttentionCompute;
+ parallelAttentionCompute = attention.parallelAttentionCompute;
+ // Hyperbolic math
+ expMap = attention.expMap;
+ logMap = attention.logMap;
+ mobiusAddition = attention.mobiusAddition;
+ poincareDistance = attention.poincareDistance;
+ projectToPoincareBall = attention.projectToPoincareBall;
+ // Meta
+ attentionInfo = attention.info;
+ attentionVersion = attention.version;
+ attentionAvailable = true;
+ } catch (e) {
+ // Attention not available - commands will show helpful message
+ }
+
  const program = new Command();

  // Get package version from package.json
@@ -316,9 +356,10 @@ program

  // Try to load ruvector for implementation info
  if (loadRuvector()) {
- const info = getVersion();
- console.log(chalk.white(` Core Version: ${chalk.yellow(info.version)}`));
- console.log(chalk.white(` Implementation: ${chalk.yellow(info.implementation)}`));
+ const version = typeof getVersion === 'function' ? getVersion() : 'unknown';
+ const impl = typeof getImplementationType === 'function' ? getImplementationType() : 'native';
+ console.log(chalk.white(` Core Version: ${chalk.yellow(version)}`));
+ console.log(chalk.white(` Implementation: ${chalk.yellow(impl)}`));
  } else {
  console.log(chalk.white(` Core: ${chalk.gray('Not loaded (install @ruvector/core)')}`));
  }
@@ -855,6 +896,429 @@ gnnCmd
  console.log(chalk.gray(` binary (freq <= 0.01) - ~32x compression, archive`));
  });

+ // =============================================================================
+ // Attention Commands
+ // =============================================================================
+
+ // Helper to require attention module
+ function requireAttention() {
+ if (!attentionAvailable) {
+ console.error(chalk.red('Error: @ruvector/attention is not installed'));
+ console.error(chalk.yellow('Install it with: npm install @ruvector/attention'));
+ process.exit(1);
+ }
+ }
+
+ // Attention parent command
+ const attentionCmd = program
+ .command('attention')
+ .description('High-performance attention mechanism operations');
+
+ // Attention compute command - run attention on input vectors
+ attentionCmd
+ .command('compute')
+ .description('Compute attention over input vectors')
+ .requiredOption('-q, --query <json>', 'Query vector as JSON array')
+ .requiredOption('-k, --keys <file>', 'Keys file (JSON array of vectors)')
+ .option('-v, --values <file>', 'Values file (JSON array of vectors, defaults to keys)')
+ .option('-t, --type <type>', 'Attention type (dot|multi-head|flash|hyperbolic|linear)', 'dot')
+ .option('-h, --heads <number>', 'Number of attention heads (for multi-head)', '4')
+ .option('-d, --head-dim <number>', 'Head dimension (for multi-head)', '64')
+ .option('--curvature <number>', 'Curvature for hyperbolic attention', '1.0')
+ .option('-o, --output <file>', 'Output file for results')
+ .action((options) => {
+ requireAttention();
+ const spinner = ora('Loading keys...').start();
+
+ try {
+ const query = JSON.parse(options.query);
+ const keysData = JSON.parse(fs.readFileSync(options.keys, 'utf8'));
+ const keys = keysData.map(k => k.vector || k);
+
+ let values = keys;
+ if (options.values) {
+ const valuesData = JSON.parse(fs.readFileSync(options.values, 'utf8'));
+ values = valuesData.map(v => v.vector || v);
+ }
+
+ spinner.text = `Computing ${options.type} attention...`;
+
+ let result;
+ let attentionWeights;
+
+ switch (options.type) {
+ case 'dot': {
+ const attn = new DotProductAttention();
+ const queryMat = [query];
+ const output = attn.forward(queryMat, keys, values);
+ result = output[0];
+ attentionWeights = attn.getLastWeights ? attn.getLastWeights()[0] : null;
+ break;
+ }
+ case 'multi-head': {
+ const numHeads = parseInt(options.heads);
+ const headDim = parseInt(options.headDim);
+ const attn = new MultiHeadAttention(query.length, numHeads, headDim);
+ const queryMat = [query];
+ const output = attn.forward(queryMat, keys, values);
+ result = output[0];
+ break;
+ }
+ case 'flash': {
+ const attn = new FlashAttention(query.length);
+ const queryMat = [query];
+ const output = attn.forward(queryMat, keys, values);
+ result = output[0];
+ break;
+ }
+ case 'hyperbolic': {
+ const curvature = parseFloat(options.curvature);
+ const attn = new HyperbolicAttention(query.length, curvature);
+ const queryMat = [query];
+ const output = attn.forward(queryMat, keys, values);
+ result = output[0];
+ break;
+ }
+ case 'linear': {
+ const attn = new LinearAttention(query.length);
+ const queryMat = [query];
+ const output = attn.forward(queryMat, keys, values);
+ result = output[0];
+ break;
+ }
+ default:
+ throw new Error(`Unknown attention type: ${options.type}`);
+ }
+
+ spinner.succeed(chalk.green(`Attention computed (${options.type})`));
+
+ console.log(chalk.cyan('\nAttention Results:'));
+ console.log(chalk.white(` Type: ${chalk.yellow(options.type)}`));
+ console.log(chalk.white(` Query dim: ${chalk.yellow(query.length)}`));
+ console.log(chalk.white(` Num keys: ${chalk.yellow(keys.length)}`));
+ console.log(chalk.white(` Output dim: ${chalk.yellow(result.length)}`));
+ console.log(chalk.white(` Output: ${chalk.gray(`[${result.slice(0, 4).map(v => v.toFixed(4)).join(', ')}...]`)}`));
+
+ if (attentionWeights) {
+ console.log(chalk.cyan('\nAttention Weights:'));
+ attentionWeights.slice(0, 5).forEach((w, i) => {
+ console.log(chalk.gray(` Key ${i}: ${w.toFixed(4)}`));
+ });
+ if (attentionWeights.length > 5) {
+ console.log(chalk.gray(` ... and ${attentionWeights.length - 5} more`));
+ }
+ }
+
+ if (options.output) {
+ const outputData = { result, attentionWeights };
+ fs.writeFileSync(options.output, JSON.stringify(outputData, null, 2));
+ console.log(chalk.green(`\nResults saved to: ${options.output}`));
+ }
+ } catch (error) {
+ spinner.fail(chalk.red('Failed to compute attention'));
+ console.error(chalk.red(error.message));
+ process.exit(1);
+ }
+ });
+
+ // Attention benchmark command
+ attentionCmd
+ .command('benchmark')
+ .description('Benchmark attention mechanisms')
+ .option('-d, --dimension <number>', 'Vector dimension', '256')
+ .option('-n, --num-vectors <number>', 'Number of vectors', '100')
+ .option('-i, --iterations <number>', 'Benchmark iterations', '100')
+ .option('-t, --types <list>', 'Attention types to benchmark (comma-separated)', 'dot,flash,linear')
+ .action((options) => {
+ requireAttention();
+ const spinner = ora('Setting up benchmark...').start();
+
+ try {
+ const dim = parseInt(options.dimension);
+ const numVectors = parseInt(options.numVectors);
+ const iterations = parseInt(options.iterations);
+ const types = options.types.split(',').map(t => t.trim());
+
+ // Generate random test data
+ spinner.text = 'Generating test data...';
+ const query = Array.from({ length: dim }, () => Math.random());
+ const keys = Array.from({ length: numVectors }, () =>
+ Array.from({ length: dim }, () => Math.random())
+ );
+
+ console.log(chalk.cyan('\n═══════════════════════════════════════════════════════════════'));
+ console.log(chalk.cyan(' Attention Mechanism Benchmark'));
+ console.log(chalk.cyan('═══════════════════════════════════════════════════════════════\n'));
+
+ console.log(chalk.white(` Dimension: ${chalk.yellow(dim)}`));
+ console.log(chalk.white(` Vectors: ${chalk.yellow(numVectors)}`));
+ console.log(chalk.white(` Iterations: ${chalk.yellow(iterations)}`));
+ console.log('');
+
+ const results = [];
+
+ // Convert to Float32Arrays for compute()
+ const queryF32 = new Float32Array(query);
+ const keysF32 = keys.map(k => new Float32Array(k));
+
+ for (const type of types) {
+ spinner.text = `Benchmarking ${type} attention...`;
+ spinner.start();
+
+ let attn;
+ try {
+ switch (type) {
+ case 'dot':
+ attn = new DotProductAttention(dim);
+ break;
+ case 'flash':
+ attn = new FlashAttention(dim, 64); // dim, block_size
+ break;
+ case 'linear':
+ attn = new LinearAttention(dim, 64); // dim, num_features
+ break;
+ case 'hyperbolic':
+ attn = new HyperbolicAttention(dim, 1.0);
+ break;
+ case 'multi-head':
+ attn = new MultiHeadAttention(dim, 4); // dim, num_heads
+ break;
+ default:
+ console.log(chalk.yellow(` Skipping unknown type: ${type}`));
+ continue;
+ }
+ } catch (e) {
+ console.log(chalk.yellow(` ${type}: not available (${e.message})`));
+ continue;
+ }
+
+ // Warm up
+ for (let i = 0; i < 5; i++) {
+ try {
+ attn.compute(queryF32, keysF32, keysF32);
+ } catch (e) {
+ // Some mechanisms may fail warmup
+ }
+ }
+
+ // Benchmark
+ const start = process.hrtime.bigint();
+ for (let i = 0; i < iterations; i++) {
+ attn.compute(queryF32, keysF32, keysF32);
+ }
+ const end = process.hrtime.bigint();
+ const totalMs = Number(end - start) / 1_000_000;
+ const avgMs = totalMs / iterations;
+ const opsPerSec = 1000 / avgMs;
+
+ results.push({ type, avgMs, opsPerSec });
+ spinner.succeed(chalk.green(`${type}: ${avgMs.toFixed(3)} ms/op (${opsPerSec.toFixed(1)} ops/sec)`));
+ }
+
+ // Summary
+ if (results.length > 0) {
+ console.log(chalk.cyan('\n═══════════════════════════════════════════════════════════════'));
+ console.log(chalk.cyan(' Summary'));
+ console.log(chalk.cyan('═══════════════════════════════════════════════════════════════\n'));
+
+ const fastest = results.reduce((a, b) => a.avgMs < b.avgMs ? a : b);
+ console.log(chalk.green(` Fastest: ${fastest.type} (${fastest.avgMs.toFixed(3)} ms/op)\n`));
+
+ console.log(chalk.white(' Relative Performance:'));
+ for (const r of results) {
+ const relPerf = (fastest.avgMs / r.avgMs * 100).toFixed(1);
+ const bar = '█'.repeat(Math.round(relPerf / 5));
+ console.log(chalk.white(` ${r.type.padEnd(12)} ${chalk.cyan(bar)} ${relPerf}%`));
+ }
+ }
+ } catch (error) {
+ spinner.fail(chalk.red('Benchmark failed'));
+ console.error(chalk.red(error.message));
+ process.exit(1);
+ }
+ });
+
+ // Hyperbolic math command
+ attentionCmd
+ .command('hyperbolic')
+ .description('Hyperbolic geometry operations')
+ .requiredOption('-a, --action <type>', 'Action: exp-map|log-map|distance|project|mobius-add')
+ .requiredOption('-v, --vector <json>', 'Input vector(s) as JSON')
+ .option('-b, --vector-b <json>', 'Second vector for binary operations')
+ .option('-c, --curvature <number>', 'Poincaré ball curvature', '1.0')
+ .option('-o, --origin <json>', 'Origin point for exp/log maps')
+ .action((options) => {
+ requireAttention();
+
+ try {
+ const vecArray = JSON.parse(options.vector);
+ const vec = new Float32Array(vecArray);
+ const curvature = parseFloat(options.curvature);
+
+ let result;
+ let description;
+
+ switch (options.action) {
+ case 'exp-map': {
+ const originArray = options.origin ? JSON.parse(options.origin) : Array(vec.length).fill(0);
+ const origin = new Float32Array(originArray);
+ result = expMap(origin, vec, curvature);
+ description = 'Exponential map (tangent → Poincaré ball)';
+ break;
+ }
+ case 'log-map': {
+ const originArray = options.origin ? JSON.parse(options.origin) : Array(vec.length).fill(0);
+ const origin = new Float32Array(originArray);
+ result = logMap(origin, vec, curvature);
+ description = 'Logarithmic map (Poincaré ball → tangent)';
+ break;
+ }
+ case 'distance': {
+ if (!options.vectorB) {
+ throw new Error('--vector-b required for distance calculation');
+ }
+ const vecBArray = JSON.parse(options.vectorB);
+ const vecB = new Float32Array(vecBArray);
+ result = poincareDistance(vec, vecB, curvature);
+ description = 'Poincaré distance';
+ break;
+ }
+ case 'project': {
+ result = projectToPoincareBall(vec, curvature);
+ description = 'Project to Poincaré ball';
+ break;
+ }
+ case 'mobius-add': {
+ if (!options.vectorB) {
+ throw new Error('--vector-b required for Möbius addition');
+ }
+ const vecBArray = JSON.parse(options.vectorB);
+ const vecB = new Float32Array(vecBArray);
+ result = mobiusAddition(vec, vecB, curvature);
+ description = 'Möbius addition';
+ break;
+ }
+ default:
+ throw new Error(`Unknown action: ${options.action}`);
+ }
+
+ console.log(chalk.cyan('\nHyperbolic Operation:'));
+ console.log(chalk.white(` Action: ${chalk.yellow(description)}`));
+ console.log(chalk.white(` Curvature: ${chalk.yellow(curvature)}`));
+
+ if (typeof result === 'number') {
+ console.log(chalk.white(` Result: ${chalk.green(result.toFixed(6))}`));
+ } else {
+ const resultArray = Array.from(result);
+ console.log(chalk.white(` Input dim: ${chalk.yellow(vec.length)}`));
+ console.log(chalk.white(` Output dim: ${chalk.yellow(resultArray.length)}`));
+ console.log(chalk.white(` Result: ${chalk.gray(`[${resultArray.slice(0, 5).map(v => v.toFixed(4)).join(', ')}...]`)}`));
+
+ // Compute norm to verify it's in the ball
+ const norm = Math.sqrt(resultArray.reduce((sum, x) => sum + x * x, 0));
+ console.log(chalk.white(` Norm: ${chalk.yellow(norm.toFixed(6))} ${norm < 1 ? chalk.green('(inside ball)') : chalk.red('(outside ball)')}`));
+ }
+ } catch (error) {
+ console.error(chalk.red('Hyperbolic operation failed:'), error.message);
+ process.exit(1);
+ }
+ });
+
+ // Attention info command
+ attentionCmd
+ .command('info')
+ .description('Show attention module information')
+ .action(() => {
+ if (!attentionAvailable) {
+ console.log(chalk.yellow('\nAttention Module: Not installed'));
+ console.log(chalk.white('Install with: npm install @ruvector/attention'));
+ return;
+ }
+
+ console.log(chalk.cyan('\nAttention Module Information'));
+ console.log(chalk.white(` Status: ${chalk.green('Available')}`));
+ console.log(chalk.white(` Version: ${chalk.yellow(attentionVersion ? attentionVersion() : 'unknown')}`));
+ console.log(chalk.white(` Platform: ${chalk.yellow(process.platform)}`));
+ console.log(chalk.white(` Architecture: ${chalk.yellow(process.arch)}`));
+
+ console.log(chalk.cyan('\nCore Attention Mechanisms:'));
+ console.log(chalk.white(` • DotProductAttention - Scaled dot-product attention`));
+ console.log(chalk.white(` • MultiHeadAttention - Multi-head self-attention`));
+ console.log(chalk.white(` • FlashAttention - Memory-efficient IO-aware attention`));
+ console.log(chalk.white(` • HyperbolicAttention - Poincaré ball attention`));
+ console.log(chalk.white(` • LinearAttention - O(n) linear complexity attention`));
+ console.log(chalk.white(` • MoEAttention - Mixture of Experts attention`));
+
+ console.log(chalk.cyan('\nGraph Attention:'));
+ console.log(chalk.white(` • GraphRoPeAttention - Rotary position embeddings for graphs`));
+ console.log(chalk.white(` • EdgeFeaturedAttention - Edge feature-enhanced attention`));
+ console.log(chalk.white(` • DualSpaceAttention - Euclidean + hyperbolic dual space`));
+ console.log(chalk.white(` • LocalGlobalAttention - Local-global graph attention`));
+
+ console.log(chalk.cyan('\nHyperbolic Math:'));
+ console.log(chalk.white(` • expMap, logMap - Exponential/logarithmic maps`));
+ console.log(chalk.white(` • mobiusAddition - Möbius addition in Poincaré ball`));
+ console.log(chalk.white(` • poincareDistance - Hyperbolic distance metric`));
+ console.log(chalk.white(` • projectToPoincareBall - Project vectors to ball`));
+
+ console.log(chalk.cyan('\nTraining Utilities:'));
+ console.log(chalk.white(` • AdamOptimizer, AdamWOptimizer, SgdOptimizer`));
+ console.log(chalk.white(` • InfoNceLoss, LocalContrastiveLoss`));
+ console.log(chalk.white(` • CurriculumScheduler, TemperatureAnnealing`));
+ console.log(chalk.white(` • HardNegativeMiner, InBatchMiner`));
+ });
+
+ // Attention list command - list available mechanisms
+ attentionCmd
+ .command('list')
+ .description('List all available attention mechanisms')
+ .option('-v, --verbose', 'Show detailed information')
+ .action((options) => {
+ console.log(chalk.cyan('\n═══════════════════════════════════════════════════════════════'));
+ console.log(chalk.cyan(' Available Attention Mechanisms'));
+ console.log(chalk.cyan('═══════════════════════════════════════════════════════════════\n'));
+
+ const mechanisms = [
+ { name: 'DotProductAttention', type: 'core', complexity: 'O(n²)', available: !!DotProductAttention },
+ { name: 'MultiHeadAttention', type: 'core', complexity: 'O(n²)', available: !!MultiHeadAttention },
+ { name: 'FlashAttention', type: 'core', complexity: 'O(n²) IO-optimized', available: !!FlashAttention },
+ { name: 'HyperbolicAttention', type: 'core', complexity: 'O(n²)', available: !!HyperbolicAttention },
+ { name: 'LinearAttention', type: 'core', complexity: 'O(n)', available: !!LinearAttention },
+ { name: 'MoEAttention', type: 'core', complexity: 'O(n*k)', available: !!MoEAttention },
+ { name: 'GraphRoPeAttention', type: 'graph', complexity: 'O(n²)', available: !!GraphRoPeAttention },
+ { name: 'EdgeFeaturedAttention', type: 'graph', complexity: 'O(n²)', available: !!EdgeFeaturedAttention },
+ { name: 'DualSpaceAttention', type: 'graph', complexity: 'O(n²)', available: !!DualSpaceAttention },
+ { name: 'LocalGlobalAttention', type: 'graph', complexity: 'O(n*k)', available: !!LocalGlobalAttention },
+ ];
+
+ console.log(chalk.white(' Core Attention:'));
+ mechanisms.filter(m => m.type === 'core').forEach(m => {
+ const status = m.available ? chalk.green('✓') : chalk.red('✗');
+ console.log(chalk.white(` ${status} ${m.name.padEnd(22)} ${chalk.gray(m.complexity)}`));
+ });
+
+ console.log(chalk.white('\n Graph Attention:'));
+ mechanisms.filter(m => m.type === 'graph').forEach(m => {
+ const status = m.available ? chalk.green('✓') : chalk.red('✗');
+ console.log(chalk.white(` ${status} ${m.name.padEnd(22)} ${chalk.gray(m.complexity)}`));
+ });
+
+ if (!attentionAvailable) {
+ console.log(chalk.yellow('\n Note: @ruvector/attention not installed'));
+ console.log(chalk.white(' Install with: npm install @ruvector/attention'));
+ }
+
+ if (options.verbose) {
+ console.log(chalk.cyan('\n Usage Examples:'));
+ console.log(chalk.gray(' # Compute dot-product attention'));
+ console.log(chalk.white(' npx ruvector attention compute -q "[1,2,3]" -k keys.json -t dot'));
+ console.log(chalk.gray('\n # Benchmark attention mechanisms'));
+ console.log(chalk.white(' npx ruvector attention benchmark -d 256 -n 100'));
+ console.log(chalk.gray('\n # Hyperbolic distance'));
+ console.log(chalk.white(' npx ruvector attention hyperbolic -a distance -v "[0.1,0.2]" -b "[0.3,0.4]"'));
+ }
+ });
+
  // =============================================================================
  // Doctor Command - Check system health and dependencies
  // =============================================================================
@@ -942,8 +1406,10 @@ program

  // Check if native binding works
  if (coreAvailable && loadRuvector()) {
- const info = getVersion();
- console.log(chalk.green(` ✓ Native binding working (${info.implementation})`));
+ const version = typeof getVersion === 'function' ? getVersion() : null;
+ const impl = typeof getImplementationType === 'function' ? getImplementationType() : 'native';
+ const versionStr = version ? `, v${version}` : '';
+ console.log(chalk.green(` ✓ Native binding working (${impl}${versionStr})`));
  } else if (coreAvailable) {
  console.log(chalk.yellow(` ! Native binding failed to load`));
  warnings++;
@@ -956,6 +1422,13 @@ program
  console.log(chalk.gray(` ○ @ruvector/gnn not installed (optional)`));
  }

+ // Check @ruvector/attention
+ if (attentionAvailable) {
+ console.log(chalk.green(` ✓ @ruvector/attention installed`));
+ } else {
+ console.log(chalk.gray(` ○ @ruvector/attention not installed (optional)`));
+ }
+
  // Check @ruvector/graph-node
  try {
  require.resolve('@ruvector/graph-node');
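The new `hyperbolic` subcommand above delegates to the module's `poincareDistance`, `mobiusAddition`, `expMap`/`logMap`, and `projectToPoincareBall` helpers. For the default curvature of 1.0, the Poincaré distance has the standard closed form arcosh(1 + 2·‖u−v‖² / ((1−‖u‖²)(1−‖v‖²))), which is useful as a sanity check; the sketch below is that textbook formula in plain JavaScript, not the library's implementation.

```js
// Reference implementation of the unit-curvature Poincaré distance,
// usable as a cross-check for `ruvector attention hyperbolic -a distance`.
// Assumes both points lie strictly inside the unit ball.
function poincareDistanceRef(u, v) {
  const normSq = (a) => a.reduce((s, x) => s + x * x, 0);
  const diffSq = u.reduce((s, x, i) => s + (x - v[i]) * (x - v[i]), 0);
  const denom = (1 - normSq(u)) * (1 - normSq(v));
  return Math.acosh(1 + (2 * diffSq) / denom);
}

console.log(poincareDistanceRef([0.1, 0.2, 0.3], [0.4, 0.5, 0.6]).toFixed(6));
```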
package/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "ruvector",
- "version": "0.1.24",
+ "version": "0.1.26",
  "description": "High-performance vector database for Node.js with automatic native/WASM fallback",
  "main": "dist/index.js",
  "types": "dist/index.d.ts",
@@ -29,7 +29,11 @@
  "wasm",
  "native",
  "ruv",
- "ruvector"
+ "ruvector",
+ "attention",
+ "transformer",
+ "flash-attention",
+ "hyperbolic"
  ],
  "author": "ruv.io Team <info@ruv.io> (https://ruv.io)",
  "homepage": "https://ruv.io",
@@ -43,12 +47,15 @@
  "directory": "npm/packages/ruvector"
  },
  "dependencies": {
- "@ruvector/core": "^0.1.15",
+ "@ruvector/core": "^0.1.16",
  "@ruvector/gnn": "^0.1.15",
- "commander": "^11.1.0",
  "chalk": "^4.1.2",
+ "commander": "^11.1.0",
  "ora": "^5.4.1"
  },
+ "optionalDependencies": {
+ "@ruvector/attention": "^0.1.1"
+ },
  "devDependencies": {
  "@types/node": "^20.10.5",
  "typescript": "^5.3.3"