roxify 1.13.5 → 1.13.7

package/Cargo.toml CHANGED
@@ -1,6 +1,6 @@
  [package]
  name = "roxify_native"
- version = "1.13.5"
+ version = "1.13.6"
  edition = "2021"
  publish = false
 
package/README.md CHANGED
@@ -57,40 +57,39 @@ The core compression and image-processing logic is written in Rust and exposed t
 
  ## Benchmarks
 
- All measurements below use Roxify native Rust CLI (`roxify_native`) with streaming directory packing, Zstd L3, multi-threading, long-distance matching, and `window_log(30)`.
-
- ### Cold-cache throughput on ext4
-
- Measured with targeted page-cache eviction (`POSIX_FADV_DONTNEED`) before both encode and decode. Raw manifest lives in `docs/COLD_BENCHMARK_2026-04-15.json`.
-
- | Dataset | Files | Source | Output PNG | Encode | Encode throughput | Decode | Decode throughput |
- | --- | --- | --- | --- | --- | --- | --- | --- |
- | Glados-Disc | 19,645 | 208.18 MiB | 54.83 MiB | 2.883 s | 72.22 MiB/s | 0.954 s | 218.16 MiB/s |
- | Gmod | 3,936 | 1.36 GiB | 411.09 MiB | 6.127 s | 227.69 MiB/s | 5.850 s | 238.48 MiB/s |
-
- ### High-latency source filesystem encode
-
- Roxify 1.13.4 adds adaptive parallel preload for small files before feeding Zstd. This specifically targets metadata-heavy trees on slower filesystems such as NTFS, APFS, exFAT, and network-backed mounts.
-
- | Dataset | Source FS | Before 1.13.4 | Roxify 1.13.4 | Speedup |
- | --- | --- | --- | --- | --- |
- | Glados-Disc (19,645 files) | NTFS under Linux | 81.608 s | 2.189 s | 37.3x |
- | Gmod (3,936 files) | NTFS under Linux | 22.578 s | 4.517 s | 5.0x |
-
- ### Portal 2 comparative reference: ZIP vs PNG
-
- Measured on the full `Portal 2` game directory (`3,731 files`, `193 folders`, `12.83 GiB` logical source) to compare classic ZIP packaging against Roxify PNG packing on the same dataset.
-
- | Format | Time (s) | Time (min:sec) | Throughput | Compression ratio |
- | --- | ---: | --- | ---: | ---: |
- | ZIP Encode | 633,87 | 10 min 33 s | 21,73 Mo/s | 36,08% |
- | ZIP Decode | 232,88 | 3 min 52 s | 59,15 Mo/s | - |
- | PNG Encode | 157,80 | 2 min 37 s | 87,30 Mo/s | 41,09% |
- | PNG Decode | 156,00 | 2 min 36 s | 88,30 Mo/s | - |
+ All measurements below use the Roxify native Rust CLI (`roxify_native`) against `zip -qry` / `unzip -qq`, with targeted page-cache eviction (`POSIX_FADV_DONTNEED`) before both encode and decode. `Saved` = `100 × (1 - final_size / source_size)`. ZIP runs preserve symlinks so extracted trees stay logically identical to source.
+
+ ### Comparative archive benchmark on ext4
+
+ | Dataset / Format | Files | Source | Final size | Saved | Encode | Encode throughput | Decode | Decode throughput |
+ | --- | --- | --- | --- | --- | --- | --- | --- | --- |
+ | **Glados-Disc** | **19,645** | **208.18 MiB** | - | - | - | - | - | - |
+ | PNG (Roxify) | - | - | 54.83 MiB | 73.66% | 1.63 s | 127.36 MiB/s | 1.98 s | 104.94 MiB/s |
+ | ZIP | - | - | 82.44 MiB | 60.40% | 13.27 s | 15.69 MiB/s | 2.68 s | 77.69 MiB/s |
+ | **Gmod** | **3,936** | **1.36 GiB** | - | - | - | - | - | - |
+ | PNG (Roxify) | - | - | 411.09 MiB | 70.53% | 7.06 s | 197.59 MiB/s | 8.14 s | 171.29 MiB/s |
+ | ZIP | - | - | 516.44 MiB | 62.98% | 44.07 s | 31.66 MiB/s | 12.51 s | 111.54 MiB/s |
+ | **Portal 2** | **3,731** | **12.83 GiB** | - | - | - | - | - | - |
+ | PNG (Roxify) | - | - | 7.62 GiB | 40.60% | 1 min 33.07 s | 141.16 MiB/s | 2 min 07.51 s | 103.03 MiB/s |
+ | ZIP | - | - | 8.20 GiB | 36.08% | 9 min 00.66 s | 24.30 MiB/s | 3 min 18.63 s | 66.14 MiB/s |
+
+ ### Comparative archive benchmark on NTFS
+
+ | Dataset / Format | Files | Source | Final size | Saved | Encode | Encode throughput | Decode | Decode throughput |
+ | --- | --- | --- | --- | --- | --- | --- | --- | --- |
+ | **Glados-Disc** | **19,645** | **208.18 MiB** | - | - | - | - | - | - |
+ | PNG (Roxify) | - | - | 54.64 MiB | 73.75% | 1 min 11.55 s | 2.91 MiB/s | 3.90 s | 53.31 MiB/s |
+ | ZIP | - | - | 82.44 MiB | 60.40% | 1 min 55.28 s | 1.81 MiB/s | 11.99 s | 17.36 MiB/s |
+ | **Gmod** | **3,936** | **1.36 GiB** | - | - | - | - | - | - |
+ | PNG (Roxify) | - | - | 409.10 MiB | 70.67% | 19.68 s | 70.87 MiB/s | 22.47 s | 62.08 MiB/s |
+ | ZIP | - | - | 516.45 MiB | 62.98% | 57.07 s | 24.44 MiB/s | 33.86 s | 41.19 MiB/s |
+ | **Portal 2** | **3,731** | **12.83 GiB** | - | - | - | - | - | - |
+ | PNG (Roxify) | - | - | 7.56 GiB | 41.09% | 2 min 40.95 s | 81.62 MiB/s | 3 min 13.14 s | 68.02 MiB/s |
+ | ZIP | - | - | 8.20 GiB | 36.08% | 10 min 58.95 s | 19.94 MiB/s | 4 min 01.80 s | 54.33 MiB/s |
 
  ### Data integrity
 
- All benchmark runs completed with byte-exact roundtrip validation. Decode output matched original logical source bytes on every dataset.
+ All benchmark runs completed with successful roundtrip extraction on the measured datasets. ZIP runs used `-y` to preserve symlinks instead of dereferencing them during archive creation.
 
  ---
 
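The README's benchmark methodology evicts the page cache with `POSIX_FADV_DONTNEED` before each timed run. A minimal sketch of that step, assuming a Linux/Unix host (the helper name is illustrative; the actual Roxify benchmark harness is not included in this diff):

```python
import os

def drop_page_cache(path: str) -> None:
    """Ask the kernel to evict cached pages for `path` (Linux/Unix only).

    Mirrors the POSIX_FADV_DONTNEED step from the README methodology: a
    harness would call this on every input file before timing a run.
    """
    fd = os.open(path, os.O_RDONLY)
    try:
        os.fsync(fd)  # flush any dirty pages first, or DONTNEED can be a no-op
        os.posix_fadvise(fd, 0, 0, os.POSIX_FADV_DONTNEED)  # offset 0, length 0 = whole file
    finally:
        os.close(fd)
```

Note that `os.posix_fadvise` is advisory: the kernel may keep pages anyway, which is why careful harnesses verify eviction (e.g. via `mincore`) or reboot-level cold runs.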
package/dist/cli.js CHANGED
@@ -1,5 +1,5 @@
  #!/usr/bin/env node
- import { mkdirSync, readdirSync, readFileSync, statSync, writeFileSync, } from 'fs';
+ import { readdirSync, readFileSync, statSync, writeFileSync } from 'fs';
  import { open } from 'fs/promises';
  import { basename, dirname, join, resolve } from 'path';
  import * as cliProgress from './stub-progress.js';
@@ -20,7 +20,7 @@ async function loadJsEngine() {
  VFSIndexEntry: undefined,
  };
  }
- const VERSION = '1.13.2';
+ const VERSION = '1.13.6';
  function getDirectorySize(dirPath) {
  let totalSize = 0;
  try {
@@ -86,6 +86,7 @@ Options:
  -e, --encrypt <type> auto|aes|xor|none
  --no-compress Disable compression
  --dict <file> Use zstd dictionary when compressing
+ --ram-budget-mb <mb> Max RAM budget used by native encode/decode paths
  --force-ts Force TypeScript encoder (slower but supports encryption)
  -o, --output <path> Output file path
  -s, --sizes Show file sizes in 'list' output (default)
@@ -196,6 +197,15 @@ function parseArgs(args) {
  parsed.dict = args[i + 1];
  i += 2;
  }
+ else if (key === 'ram-budget-mb') {
+ const v = Number(args[i + 1]);
+ if (!Number.isFinite(v) || v <= 0) {
+ console.error(`Invalid --ram-budget-mb: ${args[i + 1]}`);
+ process.exit(1);
+ }
+ parsed.ramBudgetMb = Math.floor(v);
+ i += 2;
+ }
  else {
  const value = args[i + 1];
  parsed[key] = value;
@@ -320,7 +330,7 @@ async function encodeCommand(args) {
  encodeBar.start(100, 0, { step: 'Encoding', elapsed: '0' });
  const encryptType = parsed.encrypt === 'xor' ? 'xor' : 'aes';
  const fileName = basename(inputPaths[0]);
- await encodeWithRustCLI(inputPaths.length === 1 ? resolvedInputs[0] : resolvedInputs[0], resolvedOutput, 19, parsed.passphrase, encryptType, fileName, (current, total, step) => {
+ await encodeWithRustCLI(inputPaths.length === 1 ? resolvedInputs[0] : resolvedInputs[0], resolvedOutput, 19, parsed.passphrase, encryptType, fileName, parsed.ramBudgetMb, (current, total, step) => {
  const pct = total > 0 ? Math.floor((current / total) * 100) : 0;
  const elapsed = Math.floor((Date.now() - startTime) / 1000);
  encodeBar.update(Math.min(pct, 99), {
@@ -588,204 +598,36 @@ async function decodeCommand(args) {
  }
  const resolvedInput = resolve(inputPath);
  const resolvedOutput = parsed.output || outputPath || '.';
- if (isRustBinaryAvailable() && !parsed.forceTs && !parsed.lossyResilient) {
- try {
- console.log(' ');
- console.log('Decoding... (Using native Rust decoder)\n');
- const startTime = Date.now();
- const decodeBar = new cliProgress.SingleBar({ format: ' {bar} {percentage}% | {step} | {elapsed}s' }, cliProgress.Presets.shades_classic);
- decodeBar.start(100, 0, { step: 'Decoding', elapsed: '0' });
- await decodeWithRustCLI(resolvedInput, resolvedOutput, parsed.passphrase, parsed.files, parsed.dict, (current, total, step) => {
- const pct = total > 0 ? Math.floor((current / total) * 100) : 0;
- const elapsed = Math.floor((Date.now() - startTime) / 1000);
- decodeBar.update(Math.min(pct, 99), {
- step: step || 'Decoding',
- elapsed: String(elapsed),
- });
- });
- const decodeTime = Date.now() - startTime;
- decodeBar.update(100, { step: 'done', elapsed: String(Math.floor(decodeTime / 1000)) });
- decodeBar.stop();
- console.log(`\nSuccess!`);
- console.log(` Time: ${decodeTime}ms`);
- console.log(` Output: ${resolve(resolvedOutput)}`);
- console.log(' ');
- return;
- }
- catch (err) {
- console.warn('\nRust decoder failed, falling back to TypeScript decoder...');
- console.warn(`Reason: ${err.message}\n`);
- }
+ if (!isRustBinaryAvailable()) {
+ console.error('Error: Rust decoder binary not found');
+ process.exit(1);
  }
  try {
- const options = {};
- if (parsed.passphrase) {
- options.passphrase = parsed.passphrase;
- }
- if (parsed.debug) {
- options.debugDir = dirname(resolvedInput);
- }
- if (parsed.files) {
- options.files = parsed.files;
- }
- if (parsed.dict) {
- try {
- options.dict = readFileSync(parsed.dict);
- }
- catch (e) {
- console.error(`Failed to read dictionary file: ${parsed.dict}`);
- process.exit(1);
- }
- }
- console.log(' ');
- console.log(`Decoding...`);
  console.log(' ');
- const decodeBar = new cliProgress.SingleBar({
- format: ' {bar} {percentage}% | {step} | {elapsed}s',
- }, cliProgress.Presets.shades_classic);
- let barStarted = false;
- const startDecode = Date.now();
- let currentPct = 0;
- let targetPct = 0;
- let currentStep = 'Decoding';
- const heartbeat = setInterval(() => {
- if (currentPct < targetPct) {
- currentPct = Math.min(currentPct + 2, targetPct);
- }
- if (!barStarted && targetPct > 0) {
- decodeBar.start(100, Math.floor(currentPct), {
- step: currentStep,
- elapsed: String(Math.floor((Date.now() - startDecode) / 1000)),
- });
- barStarted = true;
- }
- else if (barStarted) {
- decodeBar.update(Math.floor(currentPct), {
- step: currentStep,
- elapsed: String(Math.floor((Date.now() - startDecode) / 1000)),
- });
- }
- }, 100);
- options.onProgress = (info) => {
- if (info.phase === 'decompress_start') {
- targetPct = 50;
- currentStep = 'Decompressing';
- }
- else if (info.phase === 'decompress_progress' &&
- info.loaded &&
- info.total) {
- targetPct = 50 + Math.floor((info.loaded / info.total) * 40);
- currentStep = `Decompressing (${info.loaded}/${info.total})`;
- }
- else if (info.phase === 'decompress_done') {
- targetPct = 90;
- currentStep = 'Decompressed';
- }
- else if (info.phase === 'done') {
- targetPct = 100;
- currentStep = 'Done';
- }
- };
- const inputBuffer = await readLargeFile(resolvedInput);
- const js = await loadJsEngine();
- const result = await js.decodePngToBinary(inputBuffer, options);
- const decodeTime = Date.now() - startDecode;
- clearInterval(heartbeat);
- if (barStarted) {
- currentPct = 100;
- decodeBar.update(100, {
- step: 'done',
- elapsed: String(Math.floor(decodeTime / 1000)),
- });
- decodeBar.stop();
- }
- if (result.files) {
- const baseDir = parsed.output || outputPath || '.';
- const totalBytes = result.files.reduce((s, f) => s + f.buf.length, 0);
- const extractBar = new cliProgress.SingleBar({ format: ' {bar} {percentage}% | {step} | {elapsed}s' }, cliProgress.Presets.shades_classic);
- const extractStart = Date.now();
- extractBar.start(totalBytes, 0, { step: 'Writing files', elapsed: '0' });
- let written = 0;
- for (const file of result.files) {
- const fullPath = join(baseDir, file.path);
- const dir = dirname(fullPath);
- mkdirSync(dir, { recursive: true });
- writeFileSync(fullPath, file.buf);
- written += file.buf.length;
- extractBar.update(written, {
- step: `Writing ${file.path}`,
- elapsed: String(Math.floor((Date.now() - extractStart) / 1000)),
- });
- }
- extractBar.update(totalBytes, {
- step: 'Done',
- elapsed: String(Math.floor((Date.now() - extractStart) / 1000)),
+ console.log('Decoding... (Using native Rust decoder)\n');
+ const startTime = Date.now();
+ const decodeBar = new cliProgress.SingleBar({ format: ' {bar} {percentage}% | {step} | {elapsed}s' }, cliProgress.Presets.shades_classic);
+ decodeBar.start(100, 0, { step: 'Decoding', elapsed: '0' });
+ await decodeWithRustCLI(resolvedInput, resolvedOutput, parsed.passphrase, parsed.files, parsed.dict, parsed.ramBudgetMb, (current, total, step) => {
+ const pct = total > 0 ? Math.floor((current / total) * 100) : 0;
+ const elapsed = Math.floor((Date.now() - startTime) / 1000);
+ decodeBar.update(Math.min(pct, 99), {
+ step: step || 'Decoding',
+ elapsed: String(elapsed),
  });
- extractBar.stop();
- console.log(`\nSuccess!`);
- console.log(`Unpacked ${result.files.length} files to directory : ${resolve(baseDir)}`);
- console.log(`Time: ${decodeTime}ms`);
- }
- else if (result.buf) {
- const unpacked = js.unpackBuffer(result.buf);
- if (unpacked) {
- const baseDir = parsed.output || outputPath || '.';
- for (const file of unpacked.files) {
- const fullPath = join(baseDir, file.path);
- const dir = dirname(fullPath);
- mkdirSync(dir, { recursive: true });
- writeFileSync(fullPath, file.buf);
- }
- console.log(`\nSuccess!`);
- console.log(`Time: ${decodeTime}ms`);
- console.log(`Unpacked ${unpacked.files.length} files to current directory`);
- }
- else {
- let finalOutput = resolvedOutput;
- if (!parsed.output && !outputPath && result.meta?.name) {
- finalOutput = result.meta.name;
- }
- writeFileSync(finalOutput, result.buf);
- console.log(`\nSuccess!`);
- if (result.meta?.name) {
- console.log(` Original name: ${result.meta.name}`);
- }
- const outputSize = (result.buf.length / 1024 / 1024).toFixed(2);
- console.log(` Output size: ${outputSize} MB`);
- console.log(` Time: ${decodeTime}ms`);
- console.log(` Saved: ${finalOutput}`);
- }
- }
- else {
- console.log(`\nSuccess!`);
- console.log(`Time: ${decodeTime}ms`);
- }
+ });
+ const decodeTime = Date.now() - startTime;
+ decodeBar.update(100, { step: 'done', elapsed: String(Math.floor(decodeTime / 1000)) });
+ decodeBar.stop();
+ console.log(`\nSuccess!`);
+ console.log(` Time: ${decodeTime}ms`);
+ console.log(` Output: ${resolve(resolvedOutput)}`);
  console.log(' ');
  }
  catch (err) {
- if ((err.message && err.message.includes('passphrase required')) ||
- (err.message && err.message.includes('passphrase') && !parsed.passphrase)) {
- console.log(' ');
- console.error('File appears to be encrypted. Provide a passphrase with -p');
- }
- else if ((err.message && err.message.includes('Incorrect passphrase')) ||
- (err.message && err.message.includes('Incorrect passphrase'))) {
- console.log(' ');
- console.error('Incorrect passphrase');
- }
- else if ((err.message && err.message.includes('data format error')) ||
- (err.message &&
- (err.message.includes('decompression failed') ||
- err.message.includes('missing ROX1') ||
- err.message.includes('Pixel payload truncated') ||
- err.message.includes('Marker START not found')))) {
- console.log(' ');
- console.error('Data corrupted or unsupported format. Use --verbose for details.');
- }
- else {
- console.log(' ');
- console.error('Failed to decode file. Use --verbose for details.');
- }
+ console.log(' ');
+ console.error('Error: Rust decoder failed.');
+ console.error(`Reason: ${err.message}`);
  if (parsed.verbose) {
  console.error('Details:', err.stack || err.message);
  }
Binary files CHANGED (5)
@@ -1,30 +1,10 @@
  import { DecodeOptions, DecodeResult } from './types.js';
- /**
- * Un-stretch an image that was nearest-neighbor scaled.
- * 1. Crops to non-background bounding box
- * 2. Collapses horizontal runs of identical pixels into single pixels
- * 3. Removes duplicate consecutive rows
- *
- * Returns null if the image doesn't appear to be stretched.
- */
- export declare function unstretchImage(rawRGB: Buffer, width: number, height: number, tolerance?: number): {
- data: Buffer;
- width: number;
- height: number;
- } | null;
  /**
  * Decode a ROX PNG or buffer into the original binary payload or files list.
+ * This function uses the Rust native implementation exclusively.
  *
  * @param input - Buffer or path to a PNG file.
  * @param opts - Optional decode options.
  * @returns A Promise resolving to DecodeResult ({ buf, meta } or { files }).
- *
- * @example
- * ```js
- * import { decodePngToBinary } from 'roxify';
- * const png = fs.readFileSync('out.png');
- * const res = await decodePngToBinary(png);
- * console.log(res.meta?.name, res.buf.toString('utf8'));
- * ```
  */
  export declare function decodePngToBinary(input: Buffer | string, opts?: DecodeOptions): Promise<DecodeResult>;
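As a sanity check on the `Saved` column in the README tables above: it is the percentage of source bytes eliminated, computed from the two sizes in each row. A small illustrative helper (not part of the package):

```python
def saved_percent(final_size: float, source_size: float) -> float:
    """Percent of source bytes eliminated: 100 * (1 - final / source)."""
    return 100.0 * (1.0 - final_size / source_size)

# Glados-Disc on ext4, sizes in MiB taken from the README table:
print(round(saved_percent(54.83, 208.18), 2))  # PNG (Roxify), table shows 73.66%
print(round(saved_percent(82.44, 208.18), 2))  # ZIP, table shows 60.40%
```

This is the complement of a compression ratio expressed as `final/source`, which is why the old README's 36-41% "Compression ratio" figures reappear here as `Saved` values for Portal 2.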