appydave-tools 0.18.2 → 0.18.4
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- checksums.yaml +4 -4
- data/CHANGELOG.md +15 -0
- data/docs/dam/usage.md +58 -5
- data/docs/dam/vat-testing-plan.md +56 -0
- data/docs/development/CODEX-recommendations.md +14 -1
- data/lib/appydave/tools/dam/manifest_generator.rb +133 -32
- data/lib/appydave/tools/dam/sync_from_ssd.rb +3 -3
- data/lib/appydave/tools/version.rb +1 -1
- data/package.json +1 -1
- metadata +1 -1
checksums.yaml
CHANGED
@@ -1,7 +1,7 @@
 ---
 SHA256:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 13697577cf21ab548b3ab152ad3a12194cf26c034eeaf2fdf5325b49730fe120
+  data.tar.gz: b1d7ee7daf7b60c00b80b0ec399ca7d5664d38d41cdacec959f8f8ba3f440d64
 SHA512:
-  metadata.gz:
-  data.tar.gz:
+  metadata.gz: 8d6d9643703e76b3946ca0f5923b74972e04b3981a5cd2ca3ebbac315d2d42c98064dc5e6dc12dbc0833073e1b7b36c5b41f1276b38acb5a01dd86802f7aabce
+  data.tar.gz: 390852362fe0ecc9c45bf4c45bfb87ef8fc697051026b89e3f93eacdfd55a8ed9796f2892870e2d661ab4491fc0183ac07d4046ae6051588fe4111b1b0140c21
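For context, these digests let a consumer verify downloaded release artifacts. A minimal sketch of that check, assuming the files have already been fetched into the current directory under the names listed in checksums.yaml:

```ruby
# Hypothetical local verification of the published SHA256 digest.
# The file name and expected value come from checksums.yaml above.
require 'digest'

expected = 'b1d7ee7daf7b60c00b80b0ec399ca7d5664d38d41cdacec959f8f8ba3f440d64'
actual   = Digest::SHA256.file('data.tar.gz').hexdigest

puts(actual == expected ? 'data.tar.gz matches the published checksum' : 'checksum MISMATCH')
```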
data/CHANGELOG.md
CHANGED
@@ -1,3 +1,18 @@
+## [0.18.3](https://github.com/appydave/appydave-tools/compare/v0.18.2...v0.18.3) (2025-11-10)
+
+
+### Bug Fixes
+
+* resolve archived structure detection and range folder calculation for DAM manifest and sync ([dec1400](https://github.com/appydave/appydave-tools/commit/dec1400c561f11959fb6aecd7761e916ca525082))
+* resolve rubocop violations in manifest_generator (refactor build_project_entry, simplify SSD check) ([ced61be](https://github.com/appydave/appydave-tools/commit/ced61be41c3da352831d46e3486968f5ee55842c))
+
+## [0.18.2](https://github.com/appydave/appydave-tools/compare/v0.18.1...v0.18.2) (2025-11-10)
+
+
+### Bug Fixes
+
+* resolve Windows compatibility by removing hardcoded SSL certificate path ([fb4e6f7](https://github.com/appydave/appydave-tools/commit/fb4e6f74437e898229e1863f361d8ed4274a4ca6))
+
 ## [0.18.1](https://github.com/appydave/appydave-tools/compare/v0.18.0...v0.18.1) (2025-11-10)
 
 
data/docs/dam/usage.md
CHANGED
@@ -561,17 +561,70 @@ dam s3-down
 dam s3-status
 ```
 
-###
+### Command Safety Features
 
-
+#### Dry-Run Support
+
+All filesystem-modifying commands support `--dry-run` to preview changes before execution:
+
+| Command | Dry-Run Support | Force Flag Required | What It Previews |
+|---------|----------------|---------------------|------------------|
+| **s3-up** | ✅ Yes | No | Files to upload to S3 |
+| **s3-down** | ✅ Yes | No | Files to download from S3 |
+| **s3-cleanup-remote** | ✅ Yes | **Yes (`--force`)** | S3 files to delete |
+| **s3-cleanup-local** | ✅ Yes | **Yes (`--force`)** | Local s3-staging files to delete |
+| **archive** | ✅ Yes | Optional (`--force` = delete local) | Project to copy to SSD |
+| **sync-ssd** | ✅ Yes | No | Light files to restore from SSD |
+
+#### Read-Only Commands (No Dry-Run Needed)
+
+These commands only read data and don't modify files:
+
+| Command | Type | What It Does |
+|---------|------|--------------|
+| **list** | Read-only | List brands/projects |
+| **manifest** | Generates JSON | Generate `projects.json` manifest |
+| **s3-status** | Read-only | Check sync status |
+| **help** | Read-only | Show help information |
+
+#### Safety Workflow Examples
+
+**Always preview destructive operations first:**
 
 ```bash
+# Preview S3 upload
 dam s3-up appydave b65 --dry-run
-
-dam s3-
-
+# Review output, then execute
+dam s3-up appydave b65
+
+# Preview S3 cleanup (requires --force)
+dam s3-cleanup-remote appydave b65 --force --dry-run
+# Review output, then execute
+dam s3-cleanup-remote appydave b65 --force
+
+# Preview archive (with or without local deletion)
+dam archive appydave b63 --dry-run          # Copy only
+dam archive appydave b63 --force --dry-run  # Copy + delete local
+# Review output, then execute
+dam archive appydave b63          # Copy only
+dam archive appydave b63 --force  # Copy + delete local
+
+# Preview SSD sync
+dam sync-ssd appydave --dry-run
+# Review output, then execute
+dam sync-ssd appydave
 ```
 
+#### Force Flag Behavior
+
+Commands requiring `--force` provide extra protection for destructive operations:
+
+- **s3-cleanup-remote**: Must use `--force` to delete S3 files (prevents accidental deletion)
+- **s3-cleanup-local**: Must use `--force` to delete local staging files
+- **archive**: Optional `--force` flag deletes local copy after successful SSD backup
+  - Without `--force`: Copies to SSD, keeps local copy intact
+  - With `--force`: Copies to SSD, then deletes local copy (frees disk space)
+
 ### Interactive Selection
 
 When multiple projects match short name:
data/docs/dam/vat-testing-plan.md
CHANGED
@@ -113,6 +113,62 @@ dam list
 
 ---
 
+## Command Safety Features Reference
+
+### Dry-Run and Force Flag Support
+
+All filesystem-modifying commands support `--dry-run` to preview changes before execution:
+
+| Command | Dry-Run Support | Force Flag Required | What It Does |
+|---------|----------------|---------------------|--------------|
+| **s3-up** | ✅ Yes | No | Preview files to upload to S3 |
+| **s3-down** | ✅ Yes | No | Preview files to download from S3 |
+| **s3-cleanup-remote** | ✅ Yes | **Yes (`--force`)** | Preview S3 files to delete |
+| **s3-cleanup-local** | ✅ Yes | **Yes (`--force`)** | Preview local s3-staging files to delete |
+| **archive** | ✅ Yes | Optional (`--force` = delete local) | Preview project copy to SSD |
+| **sync-ssd** | ✅ Yes | No | Preview light files to restore from SSD |
+
+### Read-Only Commands (No Dry-Run Needed)
+
+These commands only read data and don't modify files:
+
+| Command | Type | What It Does |
+|---------|------|--------------|
+| **list** | Read-only | List brands/projects |
+| **manifest** | Generates JSON | Generate `projects.json` manifest |
+| **s3-status** | Read-only | Check sync status |
+| **help** | Read-only | Show help information |
+
+### Force Flag Behavior
+
+Commands requiring `--force` provide extra protection for destructive operations:
+
+- **s3-cleanup-remote**: Must use `--force` to delete S3 files (prevents accidental deletion)
+- **s3-cleanup-local**: Must use `--force` to delete local staging files
+- **archive**: Optional `--force` flag deletes local copy after successful SSD backup
+  - Without `--force`: Copies to SSD, keeps local copy intact
+  - With `--force`: Copies to SSD, then deletes local copy (frees disk space)
+
+### Safety Testing Workflow
+
+**Always test with dry-run first:**
+
+```bash
+# 1. Preview with dry-run
+dam s3-up appydave b65 --dry-run
+dam s3-cleanup-remote appydave b65 --force --dry-run
+dam archive appydave b63 --dry-run
+
+# 2. Review output carefully
+
+# 3. Execute if safe
+dam s3-up appydave b65
+dam s3-cleanup-remote appydave b65 --force
+dam archive appydave b63
+```
+
+---
+
 ## Test Suite
 
 ### Phase 1: Unit Tests (Automated - RSpec)
data/docs/development/CODEX-recommendations.md
CHANGED
@@ -1,6 +1,6 @@
 # CODEX Recommendations - Review & Status
 
-> Last updated: 2025-11-10
+> Last updated: 2025-11-10 08:12:59 UTC
 > Original recommendations provided by Codex (GPT-5) on 2025-11-09
 
 This document captures Codex's architectural recommendations with implementation status and verdicts after engineering review.
@@ -146,6 +146,19 @@ end
 
 **Note:** These are legitimate technical debt items, not style preferences. Recommend creating GitHub issues for tracking.
 
+### 🔍 DAM Manifest & Sync Addendum (2025-11-10)
+
+New DAM code mirrors the VAT manifest/sync flow but reintroduces several bugs plus new inconsistencies:
+
+- **Archived projects still missing:** `collect_project_ids` explicitly skips the `archived` directory (`lib/appydave/tools/dam/manifest_generator.rb:70-99`), so the later logic that probes `brand_path/archived/<range>/<project>` never runs; manifests omit the majority of historical work. This also means `SyncFromSsd.should_sync_project?` (`lib/appydave/tools/dam/sync_from_ssd.rb:77-96`) will think everything is already local because manifests never flag archived presence.
+- **Range math diverges between components:** Manifest uses `project_id =~ /^[a-z](\d+)/` to build ranges (`lib/appydave/tools/dam/manifest_generator.rb:214-224`), but the SSD sync hard-codes `/^b(\d+)/` (`lib/appydave/tools/dam/sync_from_ssd.rb:123-138`). Projects outside the `b` prefix (aitldr, voz, etc.) will all collapse into the fallback `000-099`, creating collisions.
+- **SSD paths lose grouping info:** Manifests record `path: project_id` for SSD entries (`lib/appydave/tools/dam/manifest_generator.rb:119-126`), ignoring the range folders that exist on disk. The sync tool then assumes `ssd/<project_id>` (line 98) and will fail whenever the SSD organizes projects under range subdirectories.
+- **Disk usage ignores archived location:** Even when a project only exists under `archived/<range>`, `calculate_disk_usage` points at `File.join(brand_path, project[:id])` (`lib/appydave/tools/dam/manifest_generator.rb:131-146`), so archived-only projects report 0 bytes. Need to reuse the resolved `local_path` (flat vs archived) instead of rebuilding the path blindly.
+- **Heavy file detection still shallow:** `heavy_files?` only inspects direct children (`lib/appydave/tools/dam/manifest_generator.rb:233-239`), while `light_files?` walks `**/*`. Any team that keeps footage under nested folders (e.g., `/final/video.mp4`) gets `has_heavy_files: false`, which downstream sync logic relies on.
+- **Sync exclusion filter misidentifies generated folders:** `EXCLUDE_PATTERNS` contain glob syntax (`**/node_modules/**`, `**/.DS_Store`), but `excluded_file?` strips `**/` and compares raw path segments (`lib/appydave/tools/dam/sync_from_ssd.rb:160-182`), so patterns like `.DS_Store` or `.turbo` may still slip through or block unrelated files. Consider using `File.fnmatch` with the original glob rather than manual string surgery.
+
+Action: Fold these findings into the existing VAT manifest backlog or spin up DAM-specific tickets so both manifest implementations converge on a single, tested service.
+
 ---
 
 ### ⚠️ CLI Standardization (Worth Auditing)
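The exclusion-filter item above recommends matching the original globs with `File.fnmatch` instead of stripping `**/`. A minimal sketch of that approach, using an illustrative pattern list and paths (not the gem's actual `EXCLUDE_PATTERNS`):

```ruby
# Illustrative only: the pattern list and paths are assumptions for this sketch.
EXCLUDE_PATTERNS = ['**/node_modules/**', '**/.DS_Store', '**/.turbo/**'].freeze

def excluded_file?(relative_path)
  EXCLUDE_PATTERNS.any? do |pattern|
    # Without File::FNM_PATHNAME a '*' also spans '/', so the '**/...' globs
    # match at any depth; FNM_DOTMATCH lets wildcards match dot-directories.
    File.fnmatch?(pattern, relative_path, File::FNM_DOTMATCH)
  end
end

excluded_file?('assets/node_modules/lib/index.js') # => true
excluded_file?('recordings/.DS_Store')             # => true
excluded_file?('b63-flivideo/final.mp4')           # => false
```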
data/lib/appydave/tools/dam/manifest_generator.rb
CHANGED
@@ -86,12 +86,23 @@ module Appydave
 
         # Scan SSD (if available)
         if ssd_available
-          Dir.glob(File.join(ssd_backup, '*/')).each do |
-
+          Dir.glob(File.join(ssd_backup, '*/')).each do |ssd_path|
+            basename = File.basename(ssd_path)
+
+            if range_folder?(basename)
+              # Scan projects within SSD range folders
+              Dir.glob(File.join(ssd_path, '*/')).each do |project_path|
+                project_id = File.basename(project_path)
+                all_project_ids << project_id if valid_project_id?(project_id)
+              end
+            elsif valid_project_id?(basename)
+              # Direct project in SSD root (legacy structure)
+              all_project_ids << basename
+            end
           end
         end
 
-        # Scan local (
+        # Scan local flat structure (active projects only)
         Dir.glob(File.join(brand_path, '*/')).each do |path|
           basename = File.basename(path)
           # Skip hidden and special directories
@@ -101,36 +112,75 @@ module Appydave
           all_project_ids << basename if valid_project_id?(basename)
         end
 
+        # Scan archived structure (restored/archived projects)
+        archived_base = File.join(brand_path, 'archived')
+        if Dir.exist?(archived_base)
+          # Scan range folders (e.g., archived/a50-a99/, archived/b50-b99/)
+          Dir.glob(File.join(archived_base, '*/')).each do |range_folder|
+            # Scan projects within each range folder
+            Dir.glob(File.join(range_folder, '*/')).each do |project_path|
+              basename = File.basename(project_path)
+              all_project_ids << basename if valid_project_id?(basename)
+            end
+          end
+        end
+
         all_project_ids.uniq.sort
       end
 
       def build_project_entries(all_project_ids, ssd_backup, ssd_available)
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+        all_project_ids.map { |project_id| build_project_entry(project_id, ssd_backup, ssd_available) }
+      end
+
+      def build_project_entry(project_id, ssd_backup, ssd_available)
+        # Check flat structure (active projects)
+        flat_path = File.join(brand_path, project_id)
+        flat_exists = Dir.exist?(flat_path)
+
+        # Check archived structure (restored/archived projects)
+        range = determine_range(project_id)
+        archived_path = File.join(brand_path, 'archived', range, project_id)
+        archived_exists = Dir.exist?(archived_path)
+
+        # Determine which path to use for file detection
+        local_path = if flat_exists
+                       flat_path
+                     else
+                       (archived_exists ? archived_path : flat_path)
+                     end
+        local_exists = flat_exists || archived_exists
+
+        # Determine structure type
+        structure = if flat_exists
+                      'flat'
+                    elsif archived_exists
+                      'archived'
+                    end
+
+        # Check SSD (try both flat and range-based structures)
+        ssd_exists = if ssd_available
+                       flat_ssd_path = File.join(ssd_backup, project_id)
+                       range_ssd_path = File.join(ssd_backup, range, project_id)
+                       Dir.exist?(flat_ssd_path) || Dir.exist?(range_ssd_path)
+                     else
+                       false
+                     end
+
+        {
+          id: project_id,
+          storage: {
+            ssd: {
+              exists: ssd_exists,
+              path: ssd_exists ? project_id : nil
+            },
+            local: {
+              exists: local_exists,
+              structure: structure,
+              has_heavy_files: local_exists ? heavy_files?(local_path) : false,
+              has_light_files: local_exists ? light_files?(local_path) : false
             }
           }
-
-
-        projects
+        }
       end
 
       def calculate_disk_usage(projects, ssd_backup)
@@ -139,13 +189,27 @@ module Appydave
 
         projects.each do |project|
           if project[:storage][:local][:exists]
-
-
+            # Try flat structure first, then archived structure
+            flat_path = File.join(brand_path, project[:id])
+            if Dir.exist?(flat_path)
+              local_bytes += calculate_directory_size(flat_path)
+            else
+              range = determine_range(project[:id])
+              archived_path = File.join(brand_path, 'archived', range, project[:id])
+              local_bytes += calculate_directory_size(archived_path) if Dir.exist?(archived_path)
+            end
           end
 
-
-
-
+          next unless project[:storage][:ssd][:exists]
+
+          # Try flat structure first, then range-based structure
+          flat_ssd_path = File.join(ssd_backup, project[:id])
+          if Dir.exist?(flat_ssd_path)
+            ssd_bytes += calculate_directory_size(flat_ssd_path)
+          else
+            range = determine_range(project[:id])
+            range_ssd_path = File.join(ssd_backup, range, project[:id])
+            ssd_bytes += calculate_directory_size(range_ssd_path) if Dir.exist?(range_ssd_path)
+          end
         end
 
@@ -190,6 +254,26 @@ module Appydave
       end
 
       # Helper methods
+
+      # Determine range folder for project
+      # Both SSD and local archived use 50-number ranges with letter prefixes:
+      # b00-b49, b50-b99, a01-a49, a50-a99
+      def determine_range(project_id)
+        # FliVideo/Modern pattern: b40, a82, etc.
+        if project_id =~ /^([a-z])(\d+)/
+          letter = Regexp.last_match(1)
+          number = Regexp.last_match(2).to_i
+          # 50-number ranges (0-49, 50-99)
+          range_start = (number / 50) * 50
+          range_end = range_start + 49
+          # Format with leading zeros and letter prefix
+          format("#{letter}%02d-#{letter}%02d", range_start, range_end)
+        else
+          # Legacy pattern or unknown
+          '000-099'
+        end
+      end
+
       def valid_project_id?(project_id)
         # Valid formats:
         # - Modern: letter + 2 digits + dash + name (e.g., b63-flivideo)
@@ -197,6 +281,23 @@ module Appydave
         !!(project_id =~ /^[a-z]\d{2}-/ || project_id =~ /^\d/)
       end
 
+      def range_folder?(folder_name)
+        # Range folder patterns with letter prefixes:
+        # - b00-b49, b50-b99, a00-a49, a50-a99 (letter + 2 digits + dash + same letter + 2 digits)
+        # - 000-099 (3 digits + dash + 3 digits)
+        # Must match: same letter on both sides (b00-b49, not b00-a49)
+        return true if folder_name =~ /^\d{3}-\d{3}$/
+
+        if folder_name =~ /^([a-z])(\d{2})-([a-z])(\d{2})$/
+          letter1 = Regexp.last_match(1)
+          letter2 = Regexp.last_match(3)
+          # Must be same letter on both sides
+          return letter1 == letter2
+        end
+
+        false
+      end
+
       def heavy_files?(dir)
         return false unless Dir.exist?(dir)
 
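As a worked example of the range math added above: `determine_range` buckets a project id's number into a 50-wide range carrying the same letter prefix. A standalone sketch of the same calculation (the project ids are made-up examples):

```ruby
# Mirrors the determine_range logic from the diff above, for illustration.
def determine_range(project_id)
  if project_id =~ /^([a-z])(\d+)/
    letter = Regexp.last_match(1)
    number = Regexp.last_match(2).to_i
    range_start = (number / 50) * 50            # 0-49 -> 0, 50-99 -> 50
    format("#{letter}%02d-#{letter}%02d", range_start, range_start + 49)
  else
    '000-099'                                   # legacy or unrecognised ids
  end
end

determine_range('b63-flivideo')  # => "b50-b99"
determine_range('a07-shorts')    # => "a00-a49"
determine_range('2023-legacy')   # => "000-099"
```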
data/lib/appydave/tools/dam/sync_from_ssd.rb
CHANGED
@@ -149,11 +149,11 @@
 
      # Determine if project should be synced
      def should_sync_project?(project)
-        # Only sync if project exists on SSD but NOT
+        # Only sync if project exists on SSD but NOT locally (either flat or archived)
        return false unless project[:storage][:ssd][:exists]
 
-        # Skip if exists locally in flat
-        return false if project[:storage][:local][:exists]
+        # Skip if exists locally in any structure (flat or archived)
+        return false if project[:storage][:local][:exists]
 
        true
      end
data/package.json
CHANGED