ruborg 0.9.0 → 0.9.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz: a4d69dff5edaf281b1014e64653462df7d7af84be0f341db0c4fe389978f25b1
-  data.tar.gz: b206422e04dab022cd19a8d4ea6f9bd399988fc27b6e810fd673dfb7de0fb9ba
+  metadata.gz: 8f3c9f239a72ec2f321419eb6f4349c25fe4cec4ca5174c156e2f1d3fe84ec68
+  data.tar.gz: 4930dbbc8dc2d4e2040628a8c57e62d7bf3ac20ca58a0e7e363a649f6f8d6546
 SHA512:
-  metadata.gz: f881b5b0908afaa16729339263d1584d861cd3a3d6b613599cbdb120340a86a2dc69408dc3f630d347769b990ea6c36ced1b538d17ba1517bebfb347c5c9ef2b
-  data.tar.gz: ffd3ee5bd76b08ac9753d66ae95ac1aebaed2f4ba6db38ddc720e013970d2799a99202e4cf7295f969dd93ddf4f8b8b7df6d8730352a33f8ff86ce76eecbaff6
+  metadata.gz: ba9bd867f83d24c88abf435f4b64368b9c414080290747f107a6b5e94db6e82ddba0a59b0f700a4625e689b54724aefceecccc56f5e51f3fc8271be74601bc5e
+  data.tar.gz: e95dd232d73c7c63968c0105170928d0ab28d689342ad875a81fbc30ac56d508179a1866b476b79125fe43a85fc69cc68baef14a45cd220019be5612ce458bea
data/CHANGELOG.md CHANGED
@@ -7,6 +7,58 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0
 
 ## [Unreleased]
 
+## [0.9.3] - 2026-05-09
+
+### Added
+- **`ruborg lock` command**: Check for and optionally break stale Borg repository locks
+  - `ruborg lock --repository NAME` — exits 0 if no lock, exits 1 if lock detected
+  - `ruborg lock --repository NAME --break --yes` — breaks the lock via `borg break-lock`
+  - `ruborg lock --repository NAME --force --yes` — force-removes lock files directly without invoking Borg (useful when Borg itself can't run)
+  - `--break` and `--force` are mutually exclusive; both require `--yes` as a safety guard
+  - `Repository#locked?` — pure filesystem check on `lock.exclusive` / `lock.roster`, no Borg invocation or passphrase required
+  - `Repository#break_lock` — delegates to `borg break-lock`; requires Borg >= 1.4.0
+  - `Repository#force_break_lock` — direct filesystem removal of lock files/dirs; no Borg needed
+  - Status output (lock present/absent) goes to stdout for scriptability; warning messages go to `$stderr`
+- **Pre-flight lock detection during backup**: If a repository is locked when `backup` starts, ruborg waits and retries instead of failing immediately
+  - Polls every 5 seconds, prints elapsed time via the spinner
+  - Aborts with a clear error message after `lock_wait` seconds (default 300), suggesting `ruborg lock` to inspect or clear
+- **`lock_wait` config key**: Optional integer (seconds). When set, also passed as `--lock-wait` to all Borg commands so Borg itself waits for mid-operation locks. Omitting the key leaves Borg at its own default (1 second)
+- **Minimum Borg version**: Raised to 1.4.0; `break_lock` verifies this before invoking `borg break-lock`
+- **`CLI::DEFAULT_LOCK_WAIT = 300`**: Named constant for the pre-flight wait timeout
+- Fixes [#8](https://github.com/mpantel/ruborg/issues/8)
+
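The filesystem-only check described for `Repository#locked?` can be sketched in a few lines of plain Ruby. This is a hypothetical standalone version for illustration, not ruborg's actual implementation:

```ruby
require "fileutils"
require "tmpdir"

# A Borg repo is treated as locked when lock.exclusive or lock.roster
# exists inside the repository directory — no borg call, no passphrase.
def borg_repo_locked?(repo_path)
  ["lock.exclusive", "lock.roster"].any? do |name|
    File.exist?(File.join(repo_path, name))
  end
end

Dir.mktmpdir do |repo|
  puts borg_repo_locked?(repo)                    # no lock files yet
  FileUtils.touch(File.join(repo, "lock.roster")) # simulate a stale lock
  puts borg_repo_locked?(repo)
end
```

Because the check never invokes Borg, it works even when the repository passphrase is unavailable.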
+## [0.9.2] - 2026-05-09
+
+### Added
+- **CLI progress display**: Real-time feedback during backup operations
+  - Named stages printed to `$stderr`: `[1/3] Verifying repository`, `[2/3] Backing up files`, `[3/3] Pruning`
+  - Animated spinner for indeterminate operations (cache loading, `borg create`, pruning)
+  - Inline progress bar for per-file backup mode: `[=========> ] 42/120 filename.jpg`
+  - Stage count adapts to the operation: 2 stages without pruning, 3 with
+  - Degrades gracefully to plain text lines when output is piped or redirected (non-TTY)
+  - All progress output goes to `$stderr` — `--json` stdout and piped output remain clean
+  - No external gem dependencies — pure Ruby with ANSI `\r` rewrite
+- Fixes [#6](https://github.com/mpantel/ruborg/issues/6)
+
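The carriage-return technique behind the inline bar can be reproduced generically. The sketch below is not ruborg's actual progress class, just the pattern the changelog describes (rewrite one `$stderr` line on a TTY, fall back to plain lines otherwise):

```ruby
# Build a fixed-width bar such as "[====>               ] 1/5".
def progress_bar(current, total, width: 20)
  filled = (current * width) / total
  head = filled < width ? ">" : ""
  bar = ("=" * filled + head).ljust(width)[0, width]
  "[#{bar}] #{current}/#{total}"
end

5.times do |i|
  line = progress_bar(i + 1, 5)
  if $stderr.tty?
    $stderr.print "\r#{line}"   # rewrite the same terminal line
  else
    $stderr.puts line           # degrade to plain lines when piped
  end
end
$stderr.puts
```

Writing to `$stderr` keeps stdout clean for `--json` output and pipelines.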
+## [0.9.1] - 2026-05-09
+
+### Added
+- **Archive metadata cache**: New `ArchiveCache` class eliminates N+1 `borg info` calls during per-file backup runs
+  - Metadata cached in `<repo_path>.ruborg_cache.json` — sibling file to the repository
+  - Cache is shared across machines: any host with access to the repo path shares the same cache
+  - Local repos use `File::LOCK_EX` for safe concurrent reads/writes with merge-on-conflict
+  - SSH repos (`user@host:/path`, `ssh://user@host/path`) fetch/push the cache via `scp` with optimistic locking (fetch-fresh → merge → push), avoiding deadlocks on process crashes
+  - Only archives not yet in cache trigger a `borg info` call; warm runs reduce subprocess overhead from O(n) to O(new archives)
+  - Fixes [#4](https://github.com/mpantel/ruborg/issues/4)
+- **Catalog command**: New `ruborg catalog` command for fast, offline browsing of backed-up files
+  - Reads the local cache file — no `borg` subprocess calls needed
+  - `--search PATTERN` — filter entries by file path using a regex
+  - `--stats` — show aggregate statistics (total archives, unique files, total size, source dirs)
+  - `--json` — machine-readable JSON output; default is a human-friendly text table
+  - Works per-repository like all other commands (`--repository`)
+  - Supports SSH repos transparently via the same `scp`-based cache fetch
+- **Bug fix**: `ArchiveCache` now normalises all loaded metadata to symbol keys, ensuring cache hits and cache misses return identical key types
+
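Based on the `ArchiveCache#save_local` code further down in this diff, the on-disk cache file takes roughly this shape (archive name and values here are invented for illustration):

```json
{
  "version": 1,
  "archives": {
    "docs-report_pdf-1a2b3c4d-1715241600": {
      "path": "/data/docs/report.pdf",
      "size": 104857600,
      "hash": "9f86d081884c7d65...",
      "source_dir": "/data/docs"
    }
  }
}
```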
 ## [0.9.0] - 2025-10-14
 
 ### Changed
data/README.md CHANGED
@@ -25,13 +25,15 @@ A friendly Ruby frontend for [Borg Backup](https://www.borgbackup.org/). Ruborg
 - 📈 **Summary View** - Quick overview of all repositories and their configurations
 - 🔧 **Custom Borg Path** - Support for custom Borg executable paths per repository
 - 🏠 **Hostname Validation** - NEW! Restrict backups to specific hosts (global or per-repository)
-- **Well-tested** - Comprehensive test suite with RSpec (297 examples, 0 failures)
+- 🔒 **Lock Management** - Detect and break stale Borg repository locks with `ruborg lock`
+- ⏳ **Lock-aware Backups** - Pre-flight lock detection with configurable wait timeout before backup
+- ✅ **Well-tested** - Comprehensive test suite with RSpec (412 examples, 0 failures)
 - 🔒 **Security-focused** - Path validation, safe YAML loading, command injection protection
 
 ## Prerequisites
 
 - Ruby >= 3.2.0
-- [Borg Backup](https://www.borgbackup.org/) installed and available in PATH
+- [Borg Backup](https://www.borgbackup.org/) >= 1.4.0 installed and available in PATH
 - [Passbolt CLI](https://github.com/passbolt/go-passbolt-cli) (optional, for password management)
 
 ### Installing Borg Backup
@@ -169,8 +171,9 @@ repositories:
 - **Source Deletion Safety**: `allow_remove_source` flag to explicitly enable `--remove-source` option (default: disabled)
 - **Skip Hash Check**: Optional `skip_hash_check` flag to skip content hash verification for faster backups (per-file mode only)
 - **Type-Safe Booleans**: Strict boolean validation prevents configuration errors (must use `true`/`false`, not strings)
-- **Global Settings**: Hostname, compression, encryption, auto_init, allow_remove_source, skip_hash_check, log_file, borg_path, borg_options, and retention apply to all repositories
-- **Per-Repository Overrides**: Any global setting can be overridden at the repository level (including hostname, allow_remove_source, skip_hash_check, and custom borg_path)
+- **Lock Wait Timeout**: Optional `lock_wait` (integer, seconds) controlling how long ruborg waits for a locked repository before aborting. Also passed as `--lock-wait` to Borg when set. Default: 300 seconds (pre-flight); Borg default: 1 second (when not configured)
+- **Global Settings**: Hostname, compression, encryption, auto_init, allow_remove_source, skip_hash_check, lock_wait, log_file, borg_path, borg_options, and retention apply to all repositories
+- **Per-Repository Overrides**: Any global setting can be overridden at the repository level (including hostname, allow_remove_source, skip_hash_check, lock_wait, and custom borg_path)
 - **Custom Borg Path**: Specify a custom Borg executable path if borg is not in PATH or to use a specific version
 - **Retention Policies**: Define how many backups to keep (hourly, daily, weekly, monthly, yearly)
 - **Multiple Sources**: Each repository can have multiple backup sources with their own exclude patterns
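The global/override interaction for the new `lock_wait` key can be sketched like this (repository names, paths, and values are illustrative, and the exact config shape should be checked against the gem's config reference):

```yaml
# Global: every repository waits up to 120s for a lock before aborting;
# --lock-wait 120 is also passed to borg commands.
lock_wait: 120

repositories:
  - name: documents
    path: /mnt/backup/documents
    # inherits lock_wait: 120 from the global settings
  - name: database
    path: /mnt/backup/database
    lock_wait: 600   # per-repository override for slow, long-running dumps
```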
@@ -472,6 +475,38 @@ Group: postgres
 Type: regular file
 ```
 
+### Manage Repository Locks
+
+Borg uses lock files to prevent concurrent access. If a backup crashes, stale locks can block all subsequent operations. Use `ruborg lock` to inspect and clear them.
+
+```bash
+# Check if a repository is locked (exits 0 = no lock, 1 = locked)
+ruborg lock --repository documents
+
+# Break the lock via borg break-lock (requires Borg >= 1.4.0)
+ruborg lock --repository documents --break --yes
+
+# Force-remove lock files directly (no Borg required, last resort)
+ruborg lock --repository documents --force --yes
+```
+
+**Lock-aware backups:** When `ruborg backup` starts and detects a lock, it waits up to `lock_wait` seconds (default 300) for the lock to clear before aborting:
+
+```
+[1/2] Verifying repository: documents
+  Repository locked — waiting for lock to clear (5s / 300s)…
+  Repository locked — waiting for lock to clear (10s / 300s)…
+  ✓ Lock cleared
+[2/2] Creating archive
+```
+
+Configure the timeout in `ruborg.yml` (also passed as `--lock-wait` to Borg when set):
+
+```yaml
+# Wait up to 60s for a lock before aborting; also passes --lock-wait 60 to borg commands
+lock_wait: 60
+```
+
 ### Validate Repository Compatibility
 
 ```bash
@@ -659,6 +694,7 @@ See [SECURITY.md](SECURITY.md) for detailed security information and best practices
 | `list` | List archives or files in repository | `--config`, `--repository`, `--archive`, `--log` |
 | `restore ARCHIVE` | Restore files from archive | `--config`, `--repository`, `--destination`, `--path`, `--log` |
 | `metadata ARCHIVE` | Get file metadata from archive | `--config`, `--repository`, `--file`, `--log` |
+| `lock` | Check for and optionally break a repository lock | `--config`, `--repository`, `--break`, `--force`, `--yes`, `--log` |
 | `info` | Show repository information | `--config`, `--repository`, `--log` |
 | `version` | Show ruborg version | None |
 
@@ -0,0 +1,189 @@
+# frozen_string_literal: true
+
+require "json"
+require "open3"
+require "tempfile"
+
+module Ruborg
+  # Persistent cache of per-archive metadata, stored as a JSON file sibling to the
+  # Borg repository. Eliminates repeated `borg info` calls across runs.
+  #
+  # Supports local paths (File::LOCK_EX) and SSH paths (optimistic merge via scp).
+  # All metadata is stored and returned with symbol keys (:path, :size, :hash, :source_dir).
+  class ArchiveCache
+    SSH_PATTERN = %r{\A(?:ssh://|[^\s/]+@[^\s:]+:)}
+
+    def initialize(repo_path)
+      @repo_path = repo_path
+      @data = {}
+      @snapshot = {}
+      @loaded = false
+    end
+
+    def fetch
+      return self if @loaded
+
+      if ssh?
+        load_remote
+      else
+        load_local
+      end
+
+      @snapshot = snapshot(@data)
+      @loaded = true
+      self
+    end
+
+    def [](archive_name)
+      @data[archive_name]
+    end
+
+    def store(archive_name, metadata)
+      @data[archive_name] = symbolize(metadata)
+    end
+
+    # Returns all cached entries as an array of hashes, each including :archive_name.
+    def entries
+      @data.map { |archive_name, metadata| metadata.merge(archive_name: archive_name) }
+    end
+
+    def save_if_changed
+      return unless dirty?
+
+      if ssh?
+        save_remote
+      else
+        save_local
+      end
+    end
+
+    private
+
+    def dirty?
+      @data != @snapshot
+    end
+
+    def snapshot(hash)
+      hash.transform_values(&:dup)
+    end
+
+    def ssh?
+      SSH_PATTERN.match?(@repo_path)
+    end
+
+    def cache_path_for(path)
+      "#{path}.ruborg_cache.json"
+    end
+
+    def symbolize(metadata)
+      metadata.transform_keys(&:to_sym)
+    end
+
+    def normalize_archives(raw)
+      (raw || {}).transform_values { |v| symbolize(v) }
+    end
+
+    def load_local
+      path = cache_path_for(@repo_path)
+      return unless File.exist?(path)
+
+      File.open(path, "r") do |f|
+        f.flock(File::LOCK_SH)
+        parsed = JSON.parse(f.read)
+        @data = normalize_archives(parsed["archives"])
+      end
+    rescue JSON::ParserError
+      @data = {}
+    end
+
+    def save_local
+      path = cache_path_for(@repo_path)
+      File.open(path, File::RDWR | File::CREAT, 0o600) do |f|
+        f.flock(File::LOCK_EX)
+        existing = read_existing_local(f)
+        merged = existing.merge(@data)
+        f.rewind
+        f.write(JSON.generate({ "version" => 1, "archives" => stringify_for_storage(merged) }))
+        f.truncate(f.pos)
+      end
+    end
+
+    def read_existing_local(file)
+      content = file.read
+      return {} if content.empty?
+
+      normalize_archives(JSON.parse(content)["archives"])
+    rescue JSON::ParserError
+      {}
+    end
+
+    # JSON requires string keys; convert symbol keys back before writing.
+    def stringify_for_storage(data)
+      data.transform_values { |v| v.transform_keys(&:to_s) }
+    end
+
+    def parse_ssh
+      if @repo_path.start_with?("ssh://")
+        require "uri"
+        uri = URI.parse(@repo_path)
+        host = uri.user ? "#{uri.user}@#{uri.host}" : uri.host
+        host = "#{host}:#{uri.port}" if uri.port && uri.port != 22
+        [host, uri.path]
+      else
+        match = @repo_path.match(%r{\A([^\s/]+@[^\s:]+):(.+)\z})
+        return [nil, nil] unless match
+
+        [match[1], match[2]]
+      end
+    end
+
+    def load_remote
+      host, path = parse_ssh
+      return unless host
+
+      remote = "#{host}:#{cache_path_for(path)}"
+      loaded = nil
+      Tempfile.create(["ruborg_cache", ".json"]) do |tmp|
+        _, status = Open3.capture2e("scp", "-q", "-B", remote, tmp.path)
+        next unless status.success?
+
+        begin
+          loaded = normalize_archives(JSON.parse(File.read(tmp.path))["archives"])
+        rescue JSON::ParserError
+          loaded = {}
+        end
+      end
+      @data = loaded if loaded
+    end
+
+    def save_remote
+      host, path = parse_ssh
+      return unless host
+
+      remote = "#{host}:#{cache_path_for(path)}"
+      fresh = fetch_remote_fresh(remote)
+      merged = fresh.merge(@data)
+
+      Tempfile.create(["ruborg_cache_upload", ".json"]) do |tmp|
+        tmp.write(JSON.generate({ "version" => 1, "archives" => stringify_for_storage(merged) }))
+        tmp.flush
+        Open3.capture2e("scp", "-q", "-B", tmp.path, remote)
+      end
+    end
+
+    def fetch_remote_fresh(remote)
+      result = {}
+      Tempfile.create(["ruborg_cache_fresh", ".json"]) do |tmp|
+        _, status = Open3.capture2e("scp", "-q", "-B", remote, tmp.path)
+        next unless status.success?
+
+        begin
+          result = normalize_archives(JSON.parse(File.read(tmp.path))["archives"])
+        rescue JSON::ParserError
+          result = {}
+        end
+      end
+      result
+    end
+  end
+end
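The locked merge-on-write pattern in `save_local` above can be isolated into a small standalone sketch (a hypothetical helper, simplified from the class): take an exclusive flock, re-read whatever is on disk, merge the new entries over it, then rewrite in place, so concurrent writers never clobber each other's additions.

```ruby
require "json"
require "tmpdir"

# Merge-on-write under an exclusive flock, as in ArchiveCache#save_local.
def save_cache(path, entries)
  File.open(path, File::RDWR | File::CREAT, 0o600) do |f|
    f.flock(File::LOCK_EX)
    content = f.read
    existing = content.empty? ? {} : (JSON.parse(content)["archives"] || {})
    f.rewind
    f.write(JSON.generate({ "version" => 1, "archives" => existing.merge(entries) }))
    f.truncate(f.pos)
  end
end

path = File.join(Dir.tmpdir, "demo.ruborg_cache.json")
save_cache(path, { "a-1" => { "path" => "/data/a.txt" } })
save_cache(path, { "b-1" => { "path" => "/data/b.txt" } })
puts JSON.parse(File.read(path))["archives"].keys.sort.inspect # both writes survive
```

The second write re-reads the file before writing, which is why `a-1` is still present afterwards.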
data/lib/ruborg/backup.rb CHANGED
@@ -3,13 +3,15 @@
 module Ruborg
   # Backup operations using Borg
   class Backup
-    def initialize(repository, config:, retention_mode: "standard", repo_name: nil, logger: nil, skip_hash_check: false)
+    def initialize(repository, config:, retention_mode: "standard", repo_name: nil, logger: nil,
+                   skip_hash_check: false, progress: nil)
       @repository = repository
       @config = config
       @retention_mode = retention_mode
       @repo_name = repo_name
       @logger = logger
       @skip_hash_check = skip_hash_check
+      @progress = progress
     end
 
     def create(name: nil, remove_source: false)
@@ -27,36 +29,31 @@ module Ruborg
     def create_standard_archive(name, remove_source)
       archive_name = name || Time.now.strftime("%Y-%m-%d_%H-%M-%S")
 
-      # Show repository header in console only
       print_repository_header
-
-      # Show progress in console
-      puts "Creating archive: #{archive_name}"
+      @progress&.spin("Creating archive: #{archive_name}")
 
       cmd = build_create_command(archive_name)
-
       execute_borg_command(cmd)
 
-      # Log successful action
+      @progress&.done("Archive created: #{archive_name}")
       @logger&.info("[#{@repo_name}] Created archive #{archive_name} with #{@config.backup_paths.size} source(s)")
-      puts "✓ Archive created successfully"
 
       remove_source_files if remove_source
     end
 
     # rubocop:disable Metrics/AbcSize, Metrics/MethodLength, Metrics/PerceivedComplexity, Metrics/BlockNesting
     def create_per_file_archives(name_prefix, remove_source)
-      # Collect all files from backup paths
+      @progress&.spin("Collecting files...")
      files_to_backup = collect_files_from_paths(@config.backup_paths, @config.exclude_patterns)
+      @progress&.stop_spin
 
      raise BorgError, "No files found to backup" if files_to_backup.empty?
 
-      # Get list of existing archives for duplicate detection
+      @progress&.spin("Loading archive catalog...")
      existing_archives = get_existing_archive_names
+      @progress&.done("Catalog loaded — #{existing_archives.size} archive(s) known")
 
-      # Show repository header in console only
      print_repository_header
-
      puts "Found #{files_to_backup.size} file(s) to backup"
 
      backed_up_count = 0
@@ -78,8 +75,8 @@ module Ruborg
      # Ensure archive name doesn't exceed 255 characters (filesystem limit)
      archive_name = name_prefix || build_archive_name(@repo_name, sanitized_filename, path_hash, file_mtime)
 
-      # Show progress in console
-      print " [#{index + 1}/#{files_to_backup.size}] Backing up: #{file_path}"
+      @progress&.bar(index + 1, files_to_backup.size, File.basename(file_path))
+      $stderr.print " [#{index + 1}/#{files_to_backup.size}] Backing up: #{file_path}" unless @progress
 
      # Check if archive already exists AND contains this exact file
      if existing_archives.key?(archive_name)
@@ -149,7 +146,7 @@ module Ruborg
      cmd = build_per_file_create_command(archive_name, file_path, source_dir)
 
      execute_borg_command(cmd)
-      puts ""
+      puts "" unless @progress
 
      # Log successful action with details
      @logger&.info("[#{@repo_name}] Archived #{file_path} in archive #{archive_name}")
@@ -160,11 +157,13 @@ module Ruborg
      end
      # rubocop:enable Metrics/BlockLength
 
-      if skipped_count.positive?
-        puts "✓ Per-file backup completed: #{backed_up_count} file(s) backed up, #{skipped_count} skipped (unchanged)"
-      else
-        puts "✓ Per-file backup completed: #{backed_up_count} file(s) backed up"
-      end
+      summary = if skipped_count.positive?
+                  "#{backed_up_count} file(s) backed up, #{skipped_count} skipped (unchanged)"
+                else
+                  "#{backed_up_count} file(s) backed up"
+                end
+      @progress&.done(summary)
+      puts "✓ Per-file backup completed: #{summary}" unless @progress
    end
    # rubocop:enable Metrics/AbcSize, Metrics/MethodLength, Metrics/PerceivedComplexity, Metrics/BlockNesting
 
@@ -459,12 +458,11 @@ module Ruborg
      puts "=" * 60
    end
 
-    # rubocop:disable Metrics/AbcSize, Metrics/MethodLength, Metrics/PerceivedComplexity
+    # rubocop:disable Metrics/AbcSize, Metrics/MethodLength
    def get_existing_archive_names
      require "json"
      require "open3"
 
-      # First get list of archives
      cmd = [@repository.borg_path, "list", @repository.path, "--json"]
      env = {}
      passphrase = @repository.instance_variable_get(:@passphrase)
@@ -475,73 +473,52 @@ module Ruborg
      stdout, stderr, status = Open3.capture3(env, *cmd)
      raise BorgError, "Failed to list archives: #{stderr}" unless status.success?
 
-      json_data = JSON.parse(stdout)
-      archives = json_data["archives"] || []
+      archives = JSON.parse(stdout)["archives"] || []
+      cache = ArchiveCache.new(@repository.path).fetch
 
-      # Build hash by querying each archive individually for comment
-      # This is necessary because 'borg list' doesn't include comments
-      archives.each_with_object({}) do |archive, hash|
+      result = archives.each_with_object({}) do |archive, hash|
        archive_name = archive["name"]
 
-        # Query this specific archive to get the comment
-        info_cmd = [@repository.borg_path, "info", "#{@repository.path}::#{archive_name}", "--json"]
-        info_stdout, _, info_status = Open3.capture3(env, *info_cmd)
-
-        unless info_status.success?
-          # If we can't get info for this archive, skip it with defaults
-          hash[archive_name] = { path: "", size: 0, hash: "", source_dir: "" }
+        if (cached = cache[archive_name])
+          hash[archive_name] = cached
          next
        end
 
-        info_data = JSON.parse(info_stdout)
-        archive_info = info_data["archives"]&.first || {}
-        comment = archive_info["comment"] || ""
-
-        # Parse comment based on format
-        # The comment field stores metadata as: path|||size|||hash|||source_dir (using ||| as delimiter)
-        # For backward compatibility, handle old formats:
-        # - Old format 1: plain path (no |||)
-        # - Old format 2: path|||hash (2 parts)
-        # - Old format 3: path|||size|||hash (3 parts)
-        # - New format: path|||size|||hash|||source_dir (4 parts)
-        if comment.include?("|||")
-          parts = comment.split("|||")
-          file_path = parts[0]
-          if parts.length >= 4
-            # New format: path|||size|||hash|||source_dir
-            file_size = parts[1].to_i
-            file_hash = parts[2] || ""
-            source_dir = parts[3] || ""
-          elsif parts.length >= 3
-            # Format 3: path|||size|||hash (no source_dir)
-            file_size = parts[1].to_i
-            file_hash = parts[2] || ""
-            source_dir = ""
-          else
-            # Old format: path|||hash (size and source_dir not available)
-            file_size = 0
-            file_hash = parts[1] || ""
-            source_dir = ""
-          end
-        else
-          # Oldest format: comment is just the path string
-          file_path = comment
-          file_size = 0
-          file_hash = ""
-          source_dir = ""
-        end
+        info_cmd = [@repository.borg_path, "info", "#{@repository.path}::#{archive_name}", "--json"]
+        info_stdout, _, info_status = Open3.capture3(env, *info_cmd)
+
+        metadata = if info_status.success?
+                     parse_archive_comment(JSON.parse(info_stdout).dig("archives", 0, "comment") || "")
+                   else
+                     { path: "", size: 0, hash: "", source_dir: "" }
+                   end
 
-        hash[archive_name] = {
-          path: file_path,
-          size: file_size,
-          hash: file_hash,
-          source_dir: source_dir
-        }
+        cache.store(archive_name, metadata)
+        hash[archive_name] = metadata
      end
+
+      cache.save_if_changed
+      result
    rescue JSON::ParserError => e
      raise BorgError, "Failed to parse archive info: #{e.message}"
    end
-    # rubocop:enable Metrics/AbcSize, Metrics/MethodLength, Metrics/PerceivedComplexity
+    # rubocop:enable Metrics/AbcSize, Metrics/MethodLength
+
+    def parse_archive_comment(comment)
+      if comment.include?("|||")
+        parts = comment.split("|||")
+        file_path = parts[0]
+        if parts.length >= 4
+          { path: file_path, size: parts[1].to_i, hash: parts[2] || "", source_dir: parts[3] || "" }
+        elsif parts.length >= 3
+          { path: file_path, size: parts[1].to_i, hash: parts[2] || "", source_dir: "" }
+        else
+          { path: file_path, size: 0, hash: parts[1] || "", source_dir: "" }
+        end
      else
        { path: comment, size: 0, hash: "", source_dir: "" }
      end
    end
 
    def find_next_version_name(base_name, existing_archives)
      version = 2
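The `|||`-delimited comment handled by `parse_archive_comment` above covers four generations of metadata. A standalone sketch mirroring the method (a hypothetical helper for illustration, not part of the gem) makes the fallbacks concrete:

```ruby
# Comment formats, oldest to newest:
#   "path"                            -> path only
#   "path|||hash"                     -> plus content hash
#   "path|||size|||hash"              -> plus file size
#   "path|||size|||hash|||source_dir" -> plus source directory
def parse_comment(comment)
  return { path: comment, size: 0, hash: "", source_dir: "" } unless comment.include?("|||")

  parts = comment.split("|||")
  if parts.length >= 4
    { path: parts[0], size: parts[1].to_i, hash: parts[2] || "", source_dir: parts[3] || "" }
  elsif parts.length >= 3
    { path: parts[0], size: parts[1].to_i, hash: parts[2] || "", source_dir: "" }
  else
    { path: parts[0], size: 0, hash: parts[1] || "", source_dir: "" }
  end
end

p parse_comment("/data/report.pdf|||1048576|||9f86d081|||/data")
```

Missing fields simply default to `0` or `""`, so old archives keep working without migration.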
@@ -0,0 +1,36 @@
+# frozen_string_literal: true
+
+module Ruborg
+  # Read-only view over the ArchiveCache for searching and reporting.
+  # Never writes back to the cache.
+  class Catalog
+    def initialize(repo_path)
+      @cache = ArchiveCache.new(repo_path).fetch
+    end
+
+    # Returns all cached entries sorted by file path.
+    def list
+      @cache.entries.sort_by { |e| e[:path].to_s }
+    end
+
+    # Returns entries whose :path matches +pattern+ (a Regexp or regex string).
+    # Raises CatalogError on invalid regex.
+    def search(pattern)
+      regex = pattern.is_a?(Regexp) ? pattern : Regexp.new(pattern)
+      list.select { |e| regex.match?(e[:path].to_s) }
+    rescue RegexpError => e
+      raise CatalogError, "Invalid regex pattern: #{e.message}"
+    end
+
+    # Returns a summary hash with aggregate statistics.
+    def stats
+      all = list
+      {
+        total_archives: all.size,
+        unique_paths: all.map { |e| e[:path] }.uniq.size,
+        total_size: all.sum { |e| e[:size].to_i },
+        source_dirs: all.map { |e| e[:source_dir] }.uniq.reject(&:empty?).size
+      }
+    end
+  end
+end
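The aggregation `Catalog#stats` performs can be run over a hand-made entries array to see what each field counts (sample data and the standalone helper below are hypothetical, mirroring the method above):

```ruby
def catalog_stats(entries)
  {
    total_archives: entries.size,
    unique_paths: entries.map { |e| e[:path] }.uniq.size,
    total_size: entries.sum { |e| e[:size].to_i },
    source_dirs: entries.map { |e| e[:source_dir] }.uniq.reject(&:empty?).size
  }
end

entries = [
  { path: "/data/a.txt", size: 100, source_dir: "/data", archive_name: "a-1" },
  { path: "/data/a.txt", size: 120, source_dir: "/data", archive_name: "a-2" },
  { path: "/data/b.txt", size: 50,  source_dir: "",      archive_name: "b-1" }
]

p catalog_stats(entries)
```

Two archives of the same path count once toward `unique_paths` but both toward `total_archives`, and empty `source_dir` values are excluded from the directory count.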
data/lib/ruborg/cli.rb CHANGED
@@ -1,10 +1,12 @@
1
1
  # frozen_string_literal: true
2
2
 
3
3
  require "thor"
4
+ require "json"
4
5
 
5
6
  module Ruborg
6
7
  # Command-line interface for ruborg
7
8
  class CLI < Thor
9
+ DEFAULT_LOCK_WAIT = 300
8
10
  class_option :config, type: :string, default: "ruborg.yml", desc: "Path to configuration file"
9
11
  class_option :log, type: :string, desc: "Path to log file"
10
12
  class_option :repository, type: :string, aliases: "-r", desc: "Repository name (for multi-repo configs)"
@@ -74,11 +76,7 @@ module Ruborg
74
76
  merged_config = global_settings.merge(repo_config)
75
77
  validate_hostname(merged_config)
76
78
  passphrase = fetch_passphrase_for_repo(merged_config)
77
- borg_opts = merged_config["borg_options"] || {}
78
- borg_path = merged_config["borg_path"]
79
-
80
- repo = Repository.new(repo_config["path"], passphrase: passphrase, borg_options: borg_opts, borg_path: borg_path,
81
- logger: @logger)
79
+ repo = build_repo(repo_config["path"], merged_config, passphrase)
82
80
 
83
81
  # Auto-initialize repository if configured
84
82
  # Use strict boolean checking: only true enables, everything else disables
@@ -121,11 +119,7 @@ module Ruborg
121
119
  merged_config = global_settings.merge(repo_config)
122
120
  validate_hostname(merged_config)
123
121
  passphrase = fetch_passphrase_for_repo(merged_config)
124
- borg_opts = merged_config["borg_options"] || {}
125
- borg_path = merged_config["borg_path"]
126
-
127
- repo = Repository.new(repo_config["path"], passphrase: passphrase, borg_options: borg_opts, borg_path: borg_path,
128
- logger: @logger)
122
+ repo = build_repo(repo_config["path"], merged_config, passphrase)
129
123
 
130
124
  # Create backup config wrapper for compatibility
131
125
  backup_config = BackupConfig.new(repo_config, merged_config)
@@ -161,11 +155,7 @@ module Ruborg
161
155
  global_settings = config.global_settings
162
156
  merged_config = global_settings.merge(repo_config)
163
157
  passphrase = fetch_passphrase_for_repo(merged_config)
164
- borg_opts = merged_config["borg_options"] || {}
165
- borg_path = merged_config["borg_path"]
166
-
167
- repo = Repository.new(repo_config["path"], passphrase: passphrase, borg_options: borg_opts, borg_path: borg_path,
168
- logger: @logger)
158
+ repo = build_repo(repo_config["path"], merged_config, passphrase)
169
159
 
170
160
  # Auto-initialize repository if configured
171
161
  # Use strict boolean checking: only true enables, everything else disables
@@ -311,11 +301,7 @@ module Ruborg
311
301
  merged_config = global_settings.merge(repo_config)
312
302
  validate_hostname(merged_config)
313
303
  passphrase = fetch_passphrase_for_repo(merged_config)
314
- borg_opts = merged_config["borg_options"] || {}
315
- borg_path = merged_config["borg_path"]
316
-
317
- repo = Repository.new(repo_config["path"], passphrase: passphrase, borg_options: borg_opts, borg_path: borg_path,
318
- logger: @logger)
304
+ repo = build_repo(repo_config["path"], merged_config, passphrase)
319
305
 
320
306
  unless repo.exists?
321
307
  puts " ✗ Repository does not exist at #{repo_config["path"]}"
@@ -358,6 +344,90 @@ module Ruborg
358
344
 
359
345
  public
360
346
 
347
+ desc "catalog", "Search or browse the local archive metadata catalog (no borg calls)"
348
+ option :search, type: :string, desc: "Regex pattern to filter by file path"
349
+ option :stats, type: :boolean, default: false, desc: "Show catalog statistics instead of listing entries"
350
+ option :json, type: :boolean, default: false, desc: "Output as JSON"
351
+ def catalog
352
+ config = Config.new(options[:config])
353
+
354
+ raise ConfigError, "Please specify --repository" unless options[:repository]
355
+
356
+ repo_config = config.get_repository(options[:repository])
357
+ raise ConfigError, "Repository '#{options[:repository]}' not found" unless repo_config
358
+
359
+ global_settings = config.global_settings
360
+ merged_config = global_settings.merge(repo_config)
361
+ cat = Catalog.new(repo_config["path"])
362
+
363
+ if options[:stats]
364
+ print_catalog_stats(cat.stats, options[:json])
365
+ elsif options[:search]
366
+ results = cat.search(options[:search])
367
+ print_catalog_entries(results, options[:json])
368
+ else
369
+ print_catalog_entries(cat.list, options[:json])
370
+ end
371
+
372
+ @logger.info("Catalog query on repository '#{merged_config["name"]}'")
373
+ rescue Error => e
374
+ @logger.error("Catalog failed: #{e.message}")
375
+ raise
376
+ end
377
+
378
+ desc "lock", "Check for and optionally break a Borg repository lock"
379
+ option :break, type: :boolean, default: false,
380
+ desc: "Break the lock via borg break-lock (requires --yes)"
381
+ option :force, type: :boolean, default: false,
382
+ desc: "Force-remove lock files directly without invoking borg (requires --yes)"
383
+ option :yes, type: :boolean, default: false, desc: "Confirm the destructive operation"
384
+ def lock
385
+ config = Config.new(options[:config])
386
+
387
+ raise ConfigError, "Please specify --repository" unless options[:repository]
388
+ raise ConfigError, "Use --break or --force, not both" if options[:break] && options[:force]
389
+
390
+ repo_config = config.get_repository(options[:repository])
391
+ raise ConfigError, "Repository '#{options[:repository]}' not found" unless repo_config
392
+
393
+ global_settings = config.global_settings
394
+ merged_config = global_settings.merge(repo_config)
395
+ passphrase = fetch_passphrase_for_repo(merged_config)
396
+ repo = build_repo(repo_config["path"], merged_config, passphrase)
397
+
398
+ unless repo.locked?
399
+ puts "No lock found for repository '#{repo_config["name"]}'"
400
+ @logger.info("Lock check: no lock found for '#{repo_config["name"]}'")
401
+ return
402
+ end
403
+
404
+ warn "Lock detected on repository '#{repo_config["name"]}' (#{repo_config["path"]})"
405
+ @logger.warn("Lock detected on repository '#{repo_config["name"]}'")
406
+
407
+ unless options[:break] || options[:force]
408
+ warn " Run with --break --yes (via borg) or --force --yes (direct removal)."
409
+ exit 1
410
+ end
411
+
412
+ unless options[:yes]
413
+ warn " Add --yes to confirm."
414
+ exit 1
415
+ end
416
+
417
+ if options[:force]
418
+ removed = repo.force_break_lock
419
+ puts "Force-removed lock files for '#{repo_config["name"]}': #{removed.join(", ")}"
420
+ @logger.info("Force-removed lock files for '#{repo_config["name"]}'")
421
+ else
422
+ repo.break_lock
423
+ puts "Lock broken for repository '#{repo_config["name"]}'"
424
+ @logger.info("Lock broken for repository '#{repo_config["name"]}'")
425
+ end
426
+ rescue Error => e
427
+ @logger.error("Lock command failed: #{e.message}")
428
+ raise
429
+ end
430
+
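The branch order in `lock` above (no lock, lock without flags, missing `--yes`, then `--force` vs `--break`) can be sketched as a pure decision function. `lock_action` is a hypothetical helper for illustration, not a ruborg method:

```ruby
# Hypothetical sketch: the lock command's decision table as a pure
# function, so each branch can be exercised in isolation.
# Returns :ok, :report_and_fail, :need_confirmation, :break, or :force.
def lock_action(locked:, break_lock: false, force: false, yes: false)
  raise ArgumentError, "Use --break or --force, not both" if break_lock && force
  return :ok unless locked
  return :report_and_fail unless break_lock || force
  return :need_confirmation unless yes

  force ? :force : :break
end
```

The CLI maps :ok to exit 0 and the two failure states to exit 1 before anything destructive runs.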
361
431
  desc "version", "Show ruborg and borg versions"
362
432
  def version
363
433
  require_relative "version"
@@ -408,11 +478,7 @@ module Ruborg
408
478
  merged_config = global_settings.merge(repo_config)
409
479
  validate_hostname(merged_config)
410
480
  passphrase = fetch_passphrase_for_repo(merged_config)
411
- borg_opts = merged_config["borg_options"] || {}
412
- borg_path = merged_config["borg_path"]
413
-
414
- repo = Repository.new(repo_config["path"], passphrase: passphrase, borg_options: borg_opts, borg_path: borg_path,
415
- logger: @logger)
481
+ repo = build_repo(repo_config["path"], merged_config, passphrase)
416
482
 
417
483
  raise BorgError, "Repository does not exist at #{repo_config["path"]}" unless repo.exists?
418
484
 
@@ -441,6 +507,49 @@ module Ruborg
441
507
 
442
508
  private
443
509
 
510
+ def print_catalog_entries(entries, as_json)
511
+ if as_json
512
+ puts JSON.generate(entries.map { |e| stringify_entry(e) })
513
+ return
514
+ end
515
+
516
+ if entries.empty?
517
+ puts "No entries found."
518
+ return
519
+ end
520
+
521
+ puts "\n#{"FILE PATH".ljust(55)} #{"SIZE".ljust(10)} ARCHIVE"
522
+ puts "-" * 90
523
+ entries.each do |e|
524
+ puts "#{truncate(e[:path].to_s, 55).ljust(55)} #{format_size(e[:size].to_i).ljust(10)} #{e[:archive_name]}"
525
+ end
526
+ puts "\n#{entries.size} #{entries.size == 1 ? "entry" : "entries"}."
527
+ end
528
+
529
+ def print_catalog_stats(stats, as_json)
530
+ if as_json
531
+ puts JSON.generate(stats)
532
+ return
533
+ end
534
+
535
+ puts "\n═══════════════════════════════════════════════════════════════"
536
+ puts " CATALOG STATISTICS"
537
+ puts "═══════════════════════════════════════════════════════════════\n\n"
538
+ puts " Total archives : #{stats[:total_archives]}"
539
+ puts " Unique files : #{stats[:unique_paths]}"
540
+ puts " Source dirs : #{stats[:source_dirs]}"
541
+ puts " Total size : #{format_size(stats[:total_size])}"
542
+ puts ""
543
+ end
544
+
545
+ def stringify_entry(entry)
546
+ entry.transform_keys(&:to_s)
547
+ end
548
+
549
+ def truncate(str, max)
550
+ str.length > max ? "...#{str[-(max - 3)..]}" : str
551
+ end
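The `truncate` helper above keeps the tail of the string, so the filename end of a long path stays visible in the catalog table; a standalone copy for illustration:

```ruby
# Truncate from the left: overlong strings become "...<last max-3 chars>",
# preserving the most specific part of a path.
def truncate(str, max)
  str.length > max ? "...#{str[-(max - 3)..]}" : str
end
```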
552
+
444
553
  def show_repositories_summary(config)
445
554
  repositories = config.repositories
446
555
  global_settings = config.global_settings
@@ -529,7 +638,7 @@ module Ruborg
529
638
  unit_index += 1
530
639
  end
531
640
 
532
- format("%.2f %s", size, units[unit_index])
641
+ "#{format("%.2f", size)} #{units[unit_index]}"
533
642
  end
534
643
 
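The hunk above only changes how the formatted string is assembled; for context, the whole formatter can be reproduced as a standalone sketch (the B/KB/MB/GB/TB unit list is an assumption inferred from the surrounding loop, not shown in this hunk):

```ruby
# Human-readable byte count: divide by 1024 until the value fits a unit.
def format_size(bytes)
  units = %w[B KB MB GB TB]
  size = bytes.to_f
  unit_index = 0
  while size >= 1024 && unit_index < units.size - 1
    size /= 1024.0
    unit_index += 1
  end
  "#{format("%.2f", size)} #{units[unit_index]}"
end
```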
535
644
  def get_passphrase(passphrase, passbolt_id)
@@ -588,30 +697,31 @@ module Ruborg
588
697
  puts "\n--- Backing up repository: #{repo_name} ---"
589
698
  @logger.info("Backing up repository: #{repo_name}")
590
699
 
591
- # Merge global settings with repo-specific settings (repo-specific takes precedence)
592
700
  merged_config = global_settings.merge(repo_config)
593
701
  validate_hostname(merged_config)
594
702
 
703
+ retention_mode = merged_config["retention_mode"] || "standard"
704
+ auto_prune = merged_config["auto_prune"] == true
705
+ retention_policy = merged_config["retention"]
706
+ will_prune = auto_prune && retention_policy && !retention_policy.empty?
707
+ stage_total = will_prune ? 3 : 2
708
+
709
+ progress = Progress.new
710
+ progress.stage(1, stage_total, "Verifying repository: #{repo_name}")
711
+
595
712
  passphrase = fetch_passphrase_for_repo(merged_config)
596
- borg_opts = merged_config["borg_options"] || {}
597
- borg_path = merged_config["borg_path"]
598
- repo = Repository.new(repo_config["path"], passphrase: passphrase, borg_options: borg_opts, borg_path: borg_path,
599
- logger: @logger)
713
+ lock_wait = (merged_config["lock_wait"] || DEFAULT_LOCK_WAIT).to_i
714
+ repo = build_repo(repo_config["path"], merged_config, passphrase)
600
715
 
601
- # Auto-initialize if configured
602
- # Use strict boolean checking: only true enables, everything else disables
603
- auto_init = merged_config["auto_init"]
604
- auto_init = false unless auto_init == true
716
+ auto_init = merged_config["auto_init"] == true
605
717
  if auto_init && !repo.exists?
606
718
  @logger.info("Auto-initializing repository at #{repo_config["path"]}")
607
719
  repo.create
608
720
  puts "Repository auto-initialized at #{repo_config["path"]}"
609
721
  end
610
722
 
611
- # Get retention mode (defaults to standard)
612
- retention_mode = merged_config["retention_mode"] || "standard"
723
+ wait_for_lock_clear(repo, repo_name, lock_wait, progress)
613
724
 
614
- # Validate remove_source permission with strict type checking
615
725
  if options[:remove_source]
616
726
  allow_remove_source = merged_config["allow_remove_source"]
617
727
  unless allow_remove_source.is_a?(TrueClass)
@@ -622,14 +732,14 @@ module Ruborg
622
732
  end
623
733
  end
624
734
 
625
- # Get skip_hash_check setting (defaults to false)
626
- skip_hash_check = merged_config["skip_hash_check"]
627
- skip_hash_check = false unless skip_hash_check == true
735
+ skip_hash_check = merged_config["skip_hash_check"] == true
736
+
737
+ backup_label = retention_mode == "per_file" ? "Backing up files (per-file mode)" : "Creating archive"
738
+ progress.stage(2, stage_total, backup_label)
628
739
 
629
- # Create backup config wrapper
630
740
  backup_config = BackupConfig.new(repo_config, merged_config)
631
741
  backup = Backup.new(repo, config: backup_config, retention_mode: retention_mode, repo_name: repo_name,
632
- logger: @logger, skip_hash_check: skip_hash_check)
742
+ logger: @logger, skip_hash_check: skip_hash_check, progress: progress)
633
743
 
634
744
  archive_name = options[:name] ? sanitize_archive_name(options[:name]) : nil
635
745
  @logger.info("Creating archive#{"s" if retention_mode == "per_file"}: #{archive_name || "auto-generated"}")
@@ -640,27 +750,55 @@ module Ruborg
640
750
  backup.create(name: archive_name, remove_source: options[:remove_source])
641
751
  @logger.info("Backup created successfully")
642
752
 
643
- if retention_mode == "per_file"
644
- puts "✓ Per-file backups created"
645
- else
646
- puts "✓ Backup created: #{archive_name || "auto-generated"}"
647
- end
648
753
  puts " Sources removed" if options[:remove_source]
649
754
 
650
- # Auto-prune if configured and retention policy exists
651
- # Use strict boolean checking: only true enables, everything else disables
652
- auto_prune = merged_config["auto_prune"]
653
- auto_prune = false unless auto_prune == true
654
- retention_policy = merged_config["retention"]
655
-
656
- return unless auto_prune && retention_policy && !retention_policy.empty?
755
+ return unless will_prune
657
756
 
658
757
  mode_desc = retention_mode == "per_file" ? "per-file mode" : "standard mode"
758
+ progress.stage(3, stage_total, "Pruning old archives (#{mode_desc})")
759
+ progress.spin("Pruning...")
659
760
  @logger.info("Auto-pruning repository: #{repo_name} (#{mode_desc})")
660
- puts " Pruning old backups (#{mode_desc})..."
661
761
  repo.prune(retention_policy, retention_mode: retention_mode)
662
762
  @logger.info("Pruning completed successfully for #{repo_name}")
663
- puts "Pruning completed"
763
+ progress.done("Pruning completed")
764
+ end
765
+
766
+ def wait_for_lock_clear(repo, repo_name, lock_wait, progress)
767
+ return unless repo.locked?
768
+
769
+ @logger.warn("Repository '#{repo_name}' is locked — waiting up to #{lock_wait}s")
770
+ elapsed = 0
771
+ interval = 5
772
+
773
+ progress.spin("Repository locked — waiting for lock to clear (0s / #{lock_wait}s)…")
774
+
775
+ while repo.locked? && elapsed < lock_wait
776
+ sleep interval
777
+ elapsed += interval
778
+ progress.spin("Repository locked — waiting for lock to clear (#{elapsed}s / #{lock_wait}s)…")
779
+ end
780
+
781
+ progress.stop_spin
782
+
783
+ if repo.locked?
784
+ raise BorgError,
785
+ "Repository '#{repo_name}' is still locked after #{lock_wait}s. " \
786
+ "Run 'ruborg lock --repository #{repo_name}' to inspect, or " \
787
+ "'ruborg lock --repository #{repo_name} --break --yes' to clear."
788
+ end
789
+
790
+ @logger.info("Lock cleared for '#{repo_name}' after #{elapsed}s")
791
+ end
792
+
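The wait loop above follows a generic poll-with-timeout pattern; a minimal standalone sketch (`wait_until` is a hypothetical name, and ruborg's loop uses a fixed 5 s interval):

```ruby
# Poll the block every `interval` seconds for up to `timeout` seconds.
# Returns the elapsed time once the block is truthy, or nil on timeout.
def wait_until(timeout:, interval: 5)
  elapsed = 0
  until yield
    return nil if elapsed >= timeout

    sleep interval
    elapsed += interval
  end
  elapsed
end
```

In `wait_for_lock_clear` the nil (timeout) case becomes a BorgError that points the operator at `ruborg lock`.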
793
+ def build_repo(repo_path, merged_config, passphrase)
794
+ Repository.new(
795
+ repo_path,
796
+ passphrase: passphrase,
797
+ borg_options: merged_config["borg_options"] || {},
798
+ borg_path: merged_config["borg_path"],
799
+ lock_wait: merged_config["lock_wait"]&.to_i,
800
+ logger: @logger
801
+ )
664
802
  end
665
803
 
666
804
  def fetch_passphrase_for_repo(repo_config)
data/lib/ruborg/config.rb CHANGED
@@ -41,7 +41,7 @@ module Ruborg
41
41
 
42
42
  def global_settings
43
43
  @data.slice("passbolt", "compression", "encryption", "auto_init", "borg_options", "log_file", "retention",
44
- "auto_prune", "hostname", "allow_remove_source", "borg_path", "skip_hash_check")
44
+ "auto_prune", "hostname", "allow_remove_source", "borg_path", "skip_hash_check", "lock_wait")
45
45
  end
46
46
 
47
47
  private
@@ -54,12 +54,12 @@ module Ruborg
54
54
  # Valid configuration keys at each level
55
55
  VALID_GLOBAL_KEYS = %w[
56
56
  hostname compression encryption auto_init auto_prune allow_remove_source
57
- log_file borg_path passbolt borg_options retention repositories skip_hash_check
57
+ log_file borg_path passbolt borg_options retention repositories skip_hash_check lock_wait
58
58
  ].freeze
59
59
 
60
60
  VALID_REPOSITORY_KEYS = %w[
61
61
  name description path hostname retention_mode passbolt retention sources
62
- compression encryption auto_init auto_prune borg_options allow_remove_source skip_hash_check
62
+ compression encryption auto_init auto_prune borg_options allow_remove_source skip_hash_check lock_wait
63
63
  ].freeze
64
64
 
65
65
  VALID_SOURCE_KEYS = %w[name paths exclude].freeze
@@ -0,0 +1,81 @@
1
+ # frozen_string_literal: true
2
+
3
+ module Ruborg
4
+ # Terminal progress display: named stages, inline progress bar, and spinner.
5
+ # Writes to $stderr so stdout remains clean for --json or piped output.
6
+ # Degrades to plain text lines when output is not a TTY (piped / redirected).
7
+ class Progress
8
+ SPINNER_FRAMES = %w[⠋ ⠙ ⠹ ⠸ ⠼ ⠴ ⠦ ⠧ ⠇ ⠏].freeze
9
+ BAR_WIDTH = 28
10
+ LINE_WIDTH = 80
11
+
12
+ def initialize(output: $stderr)
13
+ @output = output
14
+ @tty = output.respond_to?(:isatty) && output.isatty
15
+ @spinner_thread = nil
16
+ end
17
+
18
+ # Print a numbered stage header: "[2/3] Label"
19
+ def stage(index, total, label)
20
+ stop_spin
21
+ clear_line if @tty
22
+ @output.puts "[#{index}/#{total}] #{label}"
23
+ end
24
+
25
+ # Start a spinner on the current line for an indeterminate operation.
26
+ # Call stop_spin (or done) to halt it.
27
+ def spin(label)
28
+ stop_spin
29
+ return unless @tty
30
+
31
+ frame = 0
32
+ @spinner_thread = Thread.new do
33
+ loop do
34
+ @output.print "\r #{SPINNER_FRAMES[frame % SPINNER_FRAMES.size]} #{label}"
35
+ frame += 1
36
+ sleep 0.1
37
+ end
38
+ end
39
+ end
40
+
41
+ # Stop the spinner and erase its line.
42
+ def stop_spin
43
+ return unless @spinner_thread
44
+
45
+ @spinner_thread.kill
46
+ @spinner_thread.join(0.2)
47
+ @spinner_thread = nil
48
+ clear_line if @tty
49
+ end
50
+
51
+ # Redraw an inline progress bar. Call once per item in a loop.
52
+ # label is truncated to fit the terminal line.
53
+ def bar(current, total, label = "")
54
+ return unless @tty
55
+
56
+ pct = total.positive? ? (current.to_f / total) : 0
57
+ filled = (BAR_WIDTH * pct).round
58
+ bar_str = filled.positive? ? "#{"=" * (filled - 1)}>" : ""
59
+ bar_str = bar_str.ljust(BAR_WIDTH)
60
+ short_label = truncate_left(label.to_s, 28)
61
+ @output.print "\r [#{bar_str}] #{current}/#{total} #{short_label.ljust(28)}"
62
+ end
63
+
64
+ # Halt any in-progress display and print a completion line.
65
+ def done(label = nil)
66
+ stop_spin
67
+ clear_line if @tty
68
+ @output.puts " ✓ #{label}" if label
69
+ end
70
+
71
+ private
72
+
73
+ def clear_line
74
+ @output.print "\r#{" " * LINE_WIDTH}\r"
75
+ end
76
+
77
+ def truncate_left(str, max)
78
+ str.length > max ? "...#{str[-(max - 3)..]}" : str
79
+ end
80
+ end
81
+ end
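The bar geometry in `Progress#bar` ("=" fill capped with a ">" head, space-padded to a fixed width) can be isolated for testing; `bar_segment` is a hypothetical extraction, assuming BAR_WIDTH = 28 as above:

```ruby
BAR_WIDTH = 28

# Build the fixed-width bar body used inside "\r [<bar>] current/total".
def bar_segment(current, total)
  pct = total.positive? ? (current.to_f / total) : 0
  filled = (BAR_WIDTH * pct).round
  bar = filled.positive? ? "#{"=" * (filled - 1)}>" : ""
  bar.ljust(BAR_WIDTH)
end
```

Guarding on `total.positive?` keeps an empty backlog from dividing by zero.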
@@ -6,11 +6,13 @@ module Ruborg
6
6
  class Repository
7
7
  attr_reader :path, :borg_path
8
8
 
9
- def initialize(path, passphrase: nil, borg_options: {}, borg_path: nil, logger: nil)
9
+ def initialize(path, passphrase: nil, borg_options: {}, borg_path: nil, lock_wait: nil, logger: nil)
10
+ @original_path = path
10
11
  @path = validate_repo_path(path)
11
12
  @passphrase = passphrase
12
13
  @borg_options = borg_options
13
14
  @borg_path = validate_borg_path(borg_path || "borg")
15
+ @lock_wait = lock_wait&.to_i
14
16
  @logger = logger
15
17
  end
16
18
 
@@ -18,6 +20,37 @@ module Ruborg
18
20
  File.directory?(@path) && File.exist?(File.join(@path, "config"))
19
21
  end
20
22
 
23
+ MINIMUM_BORG_VERSION = "1.4.0"
24
+
25
+ def locked?
26
+ File.exist?(File.join(@path, "lock.exclusive")) ||
27
+ File.exist?(File.join(@path, "lock.roster"))
28
+ end
29
+
30
+ def break_lock
31
+ raise BorgError, "Repository does not exist at #{@path}" unless exists?
32
+
33
+ check_borg_version!
34
+ cmd = [@borg_path, "break-lock", @path]
35
+ execute_borg_command(cmd)
36
+ @logger&.info("Lock broken for repository at #{@path}")
37
+ end
38
+
39
+ def force_break_lock
40
+ raise BorgError, "Repository does not exist at #{@path}" unless exists?
41
+
42
+ require "fileutils"
43
+ removed = %w[lock.exclusive lock.roster].select do |name|
44
+ target = File.join(@path, name)
45
+ next false unless File.exist?(target)
46
+
47
+ FileUtils.rm_rf(target)
48
+ true
49
+ end
50
+ @logger&.info("Force-removed lock files at #{@path}: #{removed.join(", ")}")
51
+ removed
52
+ end
53
+
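`locked?` and `force_break_lock` above operate purely on the filesystem: for a local Borg 1.x repository the lock state lives in `lock.exclusive` (a directory) and `lock.roster` (a file) inside the repo. A standalone sketch of the same checks, for illustration:

```ruby
require "fileutils"

# A local Borg 1.x repository is treated as locked when either lock
# artifact exists inside the repository directory.
def locked?(repo_path)
  File.exist?(File.join(repo_path, "lock.exclusive")) ||
    File.exist?(File.join(repo_path, "lock.roster"))
end

# Remove lock artifacts directly (rm_rf because lock.exclusive is a
# directory). Returns the names that were actually removed.
def force_break_lock(repo_path)
  %w[lock.exclusive lock.roster].select do |name|
    target = File.join(repo_path, name)
    next false unless File.exist?(target)

    FileUtils.rm_rf(target)
    true
  end
end
```

This is why `--force` works even when Borg itself cannot run, at the cost of bypassing Borg's own safety checks.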
21
54
  def create
22
55
  raise BorgError, "Repository already exists at #{@path}" if exists?
23
56
 
@@ -278,62 +311,67 @@ module Ruborg
278
311
  nil # Failed to parse, skip this archive
279
312
  end
280
313
 
314
+ # rubocop:disable Metrics/AbcSize, Metrics/MethodLength
281
315
  def get_archives_grouped_by_source_dir
282
316
  require "json"
283
317
  require "time"
284
318
  require "open3"
285
319
 
286
- # Get list of all archives
287
320
  cmd = [@borg_path, "list", @path, "--json"]
288
321
  env = build_borg_env
289
322
 
290
323
  stdout, stderr, status = Open3.capture3(env, *cmd)
291
324
  raise BorgError, "Failed to list archives: #{stderr}" unless status.success?
292
325
 
293
- json_data = JSON.parse(stdout)
294
- archives = json_data["archives"] || []
295
-
296
- # Group archives by source directory from metadata
326
+ archives = JSON.parse(stdout)["archives"] || []
327
+ cache = ArchiveCache.new(@original_path).fetch
297
328
  archives_by_source = Hash.new { |h, k| h[k] = [] }
298
329
 
299
330
  archives.each do |archive|
300
331
  archive_name = archive["name"]
301
-
302
- # Get archive info to read comment (metadata)
303
- info_cmd = [@borg_path, "info", "#{@path}::#{archive_name}", "--json"]
304
- info_stdout, _, info_status = Open3.capture3(env, *info_cmd)
305
-
306
- unless info_status.success?
307
- # If we can't get info, put in legacy group
308
- archives_by_source[""] << {
309
- name: archive_name,
310
- time: Time.parse(archive["time"])
311
- }
312
- next
313
- end
314
-
315
- info_data = JSON.parse(info_stdout)
316
- comment = info_data.dig("archives", 0, "comment") || ""
317
-
318
- # Parse source_dir from comment
319
- # Format: path|||size|||hash|||source_dir
320
- source_dir = if comment.include?("|||")
321
- parts = comment.split("|||")
322
- parts.length >= 4 ? (parts[3] || "") : ""
332
+ archive_time = Time.parse(archive["time"])
333
+
334
+ metadata = if (cached = cache[archive_name])
335
+ cached
336
+ else
337
+ info_cmd = [@borg_path, "info", "#{@path}::#{archive_name}", "--json"]
338
+ info_stdout, _, info_status = Open3.capture3(env, *info_cmd)
339
+
340
+ if info_status.success?
341
+ comment = JSON.parse(info_stdout).dig("archives", 0, "comment") || ""
342
+ parsed = parse_archive_comment(comment)
343
+ cache.store(archive_name, parsed)
344
+ parsed
323
345
  else
324
- ""
346
+ cache.store(archive_name, { path: "", size: 0, hash: "", source_dir: "" })
347
+ { path: "", size: 0, hash: "", source_dir: "" }
325
348
  end
349
+ end
326
350
 
327
- archives_by_source[source_dir] << {
328
- name: archive_name,
329
- time: Time.parse(archive["time"])
330
- }
351
+ archives_by_source[metadata[:source_dir] || ""] << { name: archive_name, time: archive_time }
331
352
  end
332
353
 
354
+ cache.save_if_changed
333
355
  archives_by_source
334
356
  rescue JSON::ParserError => e
335
357
  raise BorgError, "Failed to parse archive metadata: #{e.message}"
336
358
  end
359
+ # rubocop:enable Metrics/AbcSize, Metrics/MethodLength
360
+
361
+ def parse_archive_comment(comment)
362
+ if comment.include?("|||")
363
+ parts = comment.split("|||")
364
+ if parts.length >= 4
365
+ { path: parts[0], size: parts[1].to_i, hash: parts[2] || "", source_dir: parts[3] || "" }
366
+ elsif parts.length >= 3
367
+ { path: parts[0], size: parts[1].to_i, hash: parts[2] || "", source_dir: "" }
368
+ else
369
+ { path: parts[0], size: 0, hash: parts[1] || "", source_dir: "" }
370
+ end
371
+ else
372
+ { path: comment, size: 0, hash: "", source_dir: "" }
373
+ end
374
+ end
337
375
 
338
376
  def prune_per_directory_standard(retention_policy)
339
377
  # Apply standard retention policies (keep_daily, etc.) per source directory
@@ -587,6 +625,26 @@ module Ruborg
587
625
  borg_path
588
626
  end
589
627
 
628
+ def inject_lock_wait(cmd)
629
+ return cmd if @lock_wait.nil? || cmd[1] == "break-lock"
630
+
631
+ [cmd[0], "--lock-wait", @lock_wait.to_s] + cmd[1..]
632
+ end
633
+
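The injection helper above rewrites the command argv; a standalone copy shows the placement (right after the executable, with break-lock skipped since it releases a lock rather than acquiring one):

```ruby
# Insert "--lock-wait N" after the borg executable; leave break-lock
# commands and nil lock_wait untouched.
def inject_lock_wait(cmd, lock_wait)
  return cmd if lock_wait.nil? || cmd[1] == "break-lock"

  [cmd[0], "--lock-wait", lock_wait.to_s] + cmd[1..]
end
```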
634
+ def check_borg_version!
635
+ version = self.class.borg_version(@borg_path)
636
+ return if version_sufficient?(version, MINIMUM_BORG_VERSION)
637
+
638
+ raise BorgError,
639
+ "Borg #{MINIMUM_BORG_VERSION}+ is required but found #{version}. Please upgrade Borg."
640
+ end
641
+
642
+ def version_sufficient?(actual, minimum)
643
+ actual_parts = actual.split(".").map(&:to_i)
644
+ minimum_parts = minimum.split(".").map(&:to_i)
645
+ # Zero-pad the shorter list so a short version such as "1.4" compares equal to "1.4.0"
+ width = [actual_parts.size, minimum_parts.size].max
+ actual_parts += [0] * (width - actual_parts.size)
+ minimum_parts += [0] * (width - minimum_parts.size)
+ (actual_parts <=> minimum_parts) >= 0
646
+ end
647
+
590
648
  def find_in_path(command)
591
649
  ENV["PATH"].split(File::PATH_SEPARATOR).each do |directory|
592
650
  path = File.join(directory, command)
@@ -610,6 +668,9 @@ module Ruborg
610
668
  env["BORG_RELOCATED_REPO_ACCESS_IS_OK"] = allow_relocated ? "yes" : "no"
611
669
  env["BORG_UNKNOWN_UNENCRYPTED_REPO_ACCESS_IS_OK"] = allow_unencrypted ? "yes" : "no"
612
670
 
671
+ # Inject --lock-wait for all commands except break-lock (which breaks, not acquires)
672
+ cmd = inject_lock_wait(cmd)
673
+
613
674
  # Redirect stdin from /dev/null to prevent interactive prompts
614
675
  result = system(env, *cmd, in: "/dev/null")
615
676
  raise BorgError, "Borg command failed: #{cmd.join(" ")}" unless result
@@ -1,5 +1,5 @@
1
1
  # frozen_string_literal: true
2
2
 
3
3
  module Ruborg
4
- VERSION = "0.9.0"
4
+ VERSION = "0.9.3"
5
5
  end
data/lib/ruborg.rb CHANGED
@@ -3,6 +3,9 @@
3
3
  require_relative "ruborg/version"
4
4
  require_relative "ruborg/logger"
5
5
  require_relative "ruborg/config"
6
+ require_relative "ruborg/archive_cache"
7
+ require_relative "ruborg/catalog"
8
+ require_relative "ruborg/progress"
6
9
  require_relative "ruborg/repository"
7
10
  require_relative "ruborg/backup"
8
11
  require_relative "ruborg/passbolt"
@@ -13,4 +16,5 @@ module Ruborg
13
16
  class ConfigError < Error; end
14
17
  class BorgError < Error; end
15
18
  class PassboltError < Error; end
19
+ class CatalogError < Error; end
16
20
  end
metadata CHANGED
@@ -1,14 +1,13 @@
1
1
  --- !ruby/object:Gem::Specification
2
2
  name: ruborg
3
3
  version: !ruby/object:Gem::Version
4
- version: 0.9.0
4
+ version: 0.9.3
5
5
  platform: ruby
6
6
  authors:
7
7
  - Michail Pantelelis
8
- autorequire:
9
8
  bindir: exe
10
9
  cert_chain: []
11
- date: 2025-10-14 00:00:00.000000000 Z
10
+ date: 1980-01-02 00:00:00.000000000 Z
12
11
  dependencies:
13
12
  - !ruby/object:Gem::Dependency
14
13
  name: psych
@@ -144,11 +143,14 @@ files:
144
143
  - SECURITY.md
145
144
  - exe/ruborg
146
145
  - lib/ruborg.rb
146
+ - lib/ruborg/archive_cache.rb
147
147
  - lib/ruborg/backup.rb
148
+ - lib/ruborg/catalog.rb
148
149
  - lib/ruborg/cli.rb
149
150
  - lib/ruborg/config.rb
150
151
  - lib/ruborg/logger.rb
151
152
  - lib/ruborg/passbolt.rb
153
+ - lib/ruborg/progress.rb
152
154
  - lib/ruborg/repository.rb
153
155
  - lib/ruborg/version.rb
154
156
  - ruborg.gemspec
@@ -161,7 +163,6 @@ metadata:
161
163
  source_code_uri: https://github.com/mpantel/ruborg.git
162
164
  changelog_uri: https://github.com/mpantel/ruborg/blob/main/CHANGELOG.md
163
165
  rubygems_mfa_required: 'true'
164
- post_install_message:
165
166
  rdoc_options: []
166
167
  require_paths:
167
168
  - lib
@@ -176,8 +177,7 @@ required_rubygems_version: !ruby/object:Gem::Requirement
176
177
  - !ruby/object:Gem::Version
177
178
  version: '0'
178
179
  requirements: []
179
- rubygems_version: 3.5.22
180
- signing_key:
180
+ rubygems_version: 3.7.1
181
181
  specification_version: 4
182
182
  summary: A friendly Ruby frontend for Borg backup
183
183
  test_files: []