appydave-tools 0.18.1 → 0.18.3

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
  # CODEX Recommendations - Review & Status

- > Last updated: 2025-11-10
+ > Last updated: 2025-11-10 08:12:59 UTC
  > Original recommendations provided by Codex (GPT-5) on 2025-11-09

  This document captures Codex's architectural recommendations with implementation status and verdicts after engineering review.
@@ -146,6 +146,19 @@ end

  **Note:** These are legitimate technical debt items, not style preferences. Recommend creating GitHub issues for tracking.

+ ### 🔍 DAM Manifest & Sync Addendum (2025-11-10)
+
+ New DAM code mirrors the VAT manifest/sync flow but reintroduces several bugs and adds new inconsistencies:
+
+ - **Archived projects still missing:** `collect_project_ids` explicitly skips the `archived` directory (`lib/appydave/tools/dam/manifest_generator.rb:70-99`), so the later logic that probes `brand_path/archived/<range>/<project>` never runs; manifests omit the majority of historical work. This also means `SyncFromSsd.should_sync_project?` (`lib/appydave/tools/dam/sync_from_ssd.rb:77-96`) will think everything is already local because manifests never flag archived presence.
+ - **Range math diverges between components:** Manifest uses `project_id =~ /^[a-z](\d+)/` to build ranges (`lib/appydave/tools/dam/manifest_generator.rb:214-224`), but the SSD sync hard-codes `/^b(\d+)/` (`lib/appydave/tools/dam/sync_from_ssd.rb:123-138`). Projects outside the `b` prefix (aitldr, voz, etc.) will all collapse into the fallback `000-099`, creating collisions.
+ - **SSD paths lose grouping info:** Manifests record `path: project_id` for SSD entries (`lib/appydave/tools/dam/manifest_generator.rb:119-126`), ignoring the range folders that exist on disk. The sync tool then assumes `ssd/<project_id>` (line 98) and will fail whenever the SSD organizes projects under range subdirectories.
+ - **Disk usage ignores archived location:** Even when a project only exists under `archived/<range>`, `calculate_disk_usage` points at `File.join(brand_path, project[:id])` (`lib/appydave/tools/dam/manifest_generator.rb:131-146`), so archived-only projects report 0 bytes. Need to reuse the resolved `local_path` (flat vs archived) instead of rebuilding the path blindly.
+ - **Heavy file detection still shallow:** `heavy_files?` only inspects direct children (`lib/appydave/tools/dam/manifest_generator.rb:233-239`), while `light_files?` walks `**/*`. Any team that keeps footage under nested folders (e.g., `/final/video.mp4`) gets `has_heavy_files: false`, which downstream sync logic relies on.
+ - **Sync exclusion filter misidentifies generated folders:** `EXCLUDE_PATTERNS` contains glob syntax (`**/node_modules/**`, `**/.DS_Store`), but `excluded_file?` strips `**/` and compares raw path segments (`lib/appydave/tools/dam/sync_from_ssd.rb:160-182`), so patterns like `.DS_Store` or `.turbo` may still slip through or block unrelated files. Consider using `File.fnmatch` with the original glob rather than manual string surgery (see the sketch after this hunk).
+
+ Action: Fold these findings into the existing VAT manifest backlog or spin up DAM-specific tickets so both manifest implementations converge on a single, tested service.
+
  ---

  ### ⚠️ CLI Standardization (Worth Auditing)
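
The exclusion-filter bullet above suggests honoring the full glob instead of splitting it apart. `File.fnmatch` needs care here: with `File::FNM_PATHNAME`, a leading `**/` does not match zero directories, so a root-level `node_modules/` would slip past the unmodified patterns. One alternative, sketched below, is to compile each glob to a regexp once. The pattern list mirrors the `EXCLUDE_PATTERNS` constant added later in this diff; the translation and the expected results are assumptions, not code shipped in the gem.

```ruby
# Sketch only: not the gem's shipped implementation.
EXCLUDE_PATTERNS = %w[
  **/node_modules/** **/.git/** **/.next/** **/dist/** **/build/** **/out/**
  **/.cache/** **/coverage/** **/.turbo/** **/.vercel/** **/tmp/** **/.DS_Store
].freeze

EXCLUDE_REGEXPS = EXCLUDE_PATTERNS.map do |pattern|
  source = Regexp.escape(pattern)
                 .gsub('\*\*/', '(?:.*/)?') # '**/'  -> zero or more leading directories
                 .gsub('/\*\*', '(?:/.*)?') # '/**'  -> the directory and anything below it
                 .gsub('\*', '[^/]*')       # any remaining '*' stays within one path segment
  Regexp.new("\\A#{source}\\z")
end.freeze

def excluded_path?(relative_path)
  EXCLUDE_REGEXPS.any? { |re| re.match?(relative_path) }
end

excluded_path?('node_modules/pkg/index.js') # => true  (root-level node_modules is caught)
excluded_path?('apps/web/.DS_Store')        # => true
excluded_path?('src/distribution_notes.md') # => false ('dist' must be a whole segment)
```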
@@ -86,12 +86,23 @@ module Appydave

  # Scan SSD (if available)
  if ssd_available
- Dir.glob(File.join(ssd_backup, '*/')).each do |project_path|
- all_project_ids << File.basename(project_path)
+ Dir.glob(File.join(ssd_backup, '*/')).each do |ssd_path|
+ basename = File.basename(ssd_path)
+
+ if range_folder?(basename)
+ # Scan projects within SSD range folders
+ Dir.glob(File.join(ssd_path, '*/')).each do |project_path|
+ project_id = File.basename(project_path)
+ all_project_ids << project_id if valid_project_id?(project_id)
+ end
+ elsif valid_project_id?(basename)
+ # Direct project in SSD root (legacy structure)
+ all_project_ids << basename
+ end
  end
  end

- # Scan local (all projects in brand directory)
+ # Scan local flat structure (active projects only)
  Dir.glob(File.join(brand_path, '*/')).each do |path|
  basename = File.basename(path)
  # Skip hidden and special directories
@@ -101,36 +112,75 @@ module Appydave
  all_project_ids << basename if valid_project_id?(basename)
  end

+ # Scan archived structure (restored/archived projects)
+ archived_base = File.join(brand_path, 'archived')
+ if Dir.exist?(archived_base)
+ # Scan range folders (e.g., archived/a50-a99/, archived/b50-b99/)
+ Dir.glob(File.join(archived_base, '*/')).each do |range_folder|
+ # Scan projects within each range folder
+ Dir.glob(File.join(range_folder, '*/')).each do |project_path|
+ basename = File.basename(project_path)
+ all_project_ids << basename if valid_project_id?(basename)
+ end
+ end
+ end
+
  all_project_ids.uniq.sort
  end

  def build_project_entries(all_project_ids, ssd_backup, ssd_available)
- projects = []
-
- all_project_ids.each do |project_id|
- local_path = File.join(brand_path, project_id)
- ssd_path = ssd_available ? File.join(ssd_backup, project_id) : nil
-
- local_exists = Dir.exist?(local_path)
- ssd_exists = ssd_path && Dir.exist?(ssd_path)
-
- projects << {
- id: project_id,
- storage: {
- ssd: {
- exists: ssd_exists,
- path: ssd_exists ? project_id : nil
- },
- local: {
- exists: local_exists,
- has_heavy_files: local_exists ? heavy_files?(local_path) : false,
- has_light_files: local_exists ? light_files?(local_path) : false
- }
+ all_project_ids.map { |project_id| build_project_entry(project_id, ssd_backup, ssd_available) }
+ end
+
+ def build_project_entry(project_id, ssd_backup, ssd_available)
+ # Check flat structure (active projects)
+ flat_path = File.join(brand_path, project_id)
+ flat_exists = Dir.exist?(flat_path)
+
+ # Check archived structure (restored/archived projects)
+ range = determine_range(project_id)
+ archived_path = File.join(brand_path, 'archived', range, project_id)
+ archived_exists = Dir.exist?(archived_path)
+
+ # Determine which path to use for file detection
+ local_path = if flat_exists
+ flat_path
+ else
+ (archived_exists ? archived_path : flat_path)
+ end
+ local_exists = flat_exists || archived_exists
+
+ # Determine structure type
+ structure = if flat_exists
+ 'flat'
+ elsif archived_exists
+ 'archived'
+ end
+
+ # Check SSD (try both flat and range-based structures)
+ ssd_exists = if ssd_available
+ flat_ssd_path = File.join(ssd_backup, project_id)
+ range_ssd_path = File.join(ssd_backup, range, project_id)
+ Dir.exist?(flat_ssd_path) || Dir.exist?(range_ssd_path)
+ else
+ false
+ end
+
+ {
+ id: project_id,
+ storage: {
+ ssd: {
+ exists: ssd_exists,
+ path: ssd_exists ? project_id : nil
+ },
+ local: {
+ exists: local_exists,
+ structure: structure,
+ has_heavy_files: local_exists ? heavy_files?(local_path) : false,
+ has_light_files: local_exists ? light_files?(local_path) : false
  }
  }
- end
-
- projects
+ }
  end

  def calculate_disk_usage(projects, ssd_backup)
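
For reference, a manifest entry built by `build_project_entry` above looks roughly like this. The key layout comes from the hash in the hunk; the id and boolean values are invented for the example.

```ruby
# Illustrative manifest entry (values are hypothetical)
entry = {
  id: 'b63-flivideo',
  storage: {
    ssd: { exists: true, path: 'b63-flivideo' },
    local: {
      exists: true,
      structure: 'archived', # 'flat', 'archived', or nil when the project is not local
      has_heavy_files: false,
      has_light_files: true
    }
  }
}
```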
@@ -139,13 +189,27 @@ module Appydave

  projects.each do |project|
  if project[:storage][:local][:exists]
- local_path = File.join(brand_path, project[:id])
- local_bytes += calculate_directory_size(local_path)
+ # Try flat structure first, then archived structure
+ flat_path = File.join(brand_path, project[:id])
+ if Dir.exist?(flat_path)
+ local_bytes += calculate_directory_size(flat_path)
+ else
+ range = determine_range(project[:id])
+ archived_path = File.join(brand_path, 'archived', range, project[:id])
+ local_bytes += calculate_directory_size(archived_path) if Dir.exist?(archived_path)
+ end
  end

- if project[:storage][:ssd][:exists]
- ssd_path = File.join(ssd_backup, project[:id])
- ssd_bytes += calculate_directory_size(ssd_path)
+ next unless project[:storage][:ssd][:exists]
+
+ # Try flat structure first, then range-based structure
+ flat_ssd_path = File.join(ssd_backup, project[:id])
+ if Dir.exist?(flat_ssd_path)
+ ssd_bytes += calculate_directory_size(flat_ssd_path)
+ else
+ range = determine_range(project[:id])
+ range_ssd_path = File.join(ssd_backup, range, project[:id])
+ ssd_bytes += calculate_directory_size(range_ssd_path) if Dir.exist?(range_ssd_path)
  end
  end

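
The flat-versus-archived resolution above repeats the probing already done in `build_project_entry`. A small shared helper along these lines (hypothetical, not part of the diff) would keep the two call sites consistent and address the addendum's point about reusing the resolved `local_path`; `range` is whatever `determine_range` returns for the project id.

```ruby
# Hypothetical helper, not in the gem: resolve a project's local directory once.
def resolve_local_path(brand_path, project_id, range)
  flat_path = File.join(brand_path, project_id)
  return flat_path if Dir.exist?(flat_path)

  archived_path = File.join(brand_path, 'archived', range, project_id)
  Dir.exist?(archived_path) ? archived_path : nil
end
```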
@@ -190,6 +254,26 @@ module Appydave
  end

  # Helper methods
+
+ # Determine range folder for project
+ # Both SSD and local archived use 50-number ranges with letter prefixes:
+ # b00-b49, b50-b99, a01-a49, a50-a99
+ def determine_range(project_id)
+ # FliVideo/Modern pattern: b40, a82, etc.
+ if project_id =~ /^([a-z])(\d+)/
+ letter = Regexp.last_match(1)
+ number = Regexp.last_match(2).to_i
+ # 50-number ranges (0-49, 50-99)
+ range_start = (number / 50) * 50
+ range_end = range_start + 49
+ # Format with leading zeros and letter prefix
+ format("#{letter}%02d-#{letter}%02d", range_start, range_end)
+ else
+ # Legacy pattern or unknown
+ '000-099'
+ end
+ end
+
  def valid_project_id?(project_id)
  # Valid formats:
  # - Modern: letter + 2 digits + dash + name (e.g., b63-flivideo)
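
To make the bucket math above concrete, here is a standalone restatement of `determine_range` with a few expected results; the outcomes are inferred from the regex and the 50-wide buckets shown in the hunk, not taken from the gem's test suite.

```ruby
# Standalone restatement for illustration; behavior assumed from the hunk above.
def determine_range(project_id)
  if project_id =~ /^([a-z])(\d+)/
    letter = Regexp.last_match(1)
    number = Regexp.last_match(2).to_i
    range_start = (number / 50) * 50
    format("#{letter}%02d-#{letter}%02d", range_start, range_start + 49)
  else
    '000-099'
  end
end

determine_range('b63-flivideo') # => "b50-b99" (63 falls in the 50-99 bucket)
determine_range('a07-intro')    # => "a00-a49"
determine_range('2024-legacy')  # => "000-099" (no letter prefix, fallback bucket)
```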
@@ -197,6 +281,23 @@ module Appydave
  !!(project_id =~ /^[a-z]\d{2}-/ || project_id =~ /^\d/)
  end

+ def range_folder?(folder_name)
+ # Range folder patterns with letter prefixes:
+ # - b00-b49, b50-b99, a00-a49, a50-a99 (letter + 2 digits + dash + same letter + 2 digits)
+ # - 000-099 (3 digits + dash + 3 digits)
+ # Must match: same letter on both sides (b00-b49, not b00-a49)
+ return true if folder_name =~ /^\d{3}-\d{3}$/
+
+ if folder_name =~ /^([a-z])(\d{2})-([a-z])(\d{2})$/
+ letter1 = Regexp.last_match(1)
+ letter2 = Regexp.last_match(3)
+ # Must be same letter on both sides
+ return letter1 == letter2
+ end
+
+ false
+ end
+
  def heavy_files?(dir)
  return false unless Dir.exist?(dir)

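
Similarly, a standalone restatement of `range_folder?` with a few expected outcomes, inferred from the regexes above rather than from the gem's tests.

```ruby
# Standalone restatement for illustration only.
def range_folder?(folder_name)
  return true if folder_name =~ /^\d{3}-\d{3}$/

  if folder_name =~ /^([a-z])(\d{2})-([a-z])(\d{2})$/
    return Regexp.last_match(1) == Regexp.last_match(3)
  end

  false
end

range_folder?('b50-b99')      # => true
range_folder?('000-099')      # => true  (legacy 3-digit range)
range_folder?('b00-a49')      # => false (letters differ)
range_folder?('b63-flivideo') # => false (project folder, not a range)
```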
@@ -12,6 +12,22 @@ module Appydave
  class S3Operations
  attr_reader :brand_info, :brand, :project_id, :brand_path, :s3_client

+ # Directory patterns to exclude from archive/upload (generated/installable content)
+ EXCLUDE_PATTERNS = %w[
+ **/node_modules/**
+ **/.git/**
+ **/.next/**
+ **/dist/**
+ **/build/**
+ **/out/**
+ **/.cache/**
+ **/coverage/**
+ **/.turbo/**
+ **/.vercel/**
+ **/tmp/**
+ **/.DS_Store
+ ].freeze
+
  def initialize(brand, project_id, brand_info: nil, brand_path: nil, s3_client: nil)
  @project_id = project_id

@@ -37,9 +53,11 @@ module Appydave
  Aws::S3::Client.new(
  credentials: credentials,
  region: brand_info.aws.region,
- http_wire_trace: false,
- ssl_verify_peer: true,
- ssl_ca_bundle: '/etc/ssl/cert.pem' # macOS system certificates
+ http_wire_trace: false
+ # AWS SDK auto-detects SSL certificates on all platforms:
+ # - Windows: Uses Windows Certificate Store
+ # - macOS: Finds system certificates automatically
+ # - Linux: Finds OpenSSL certificates
  )
  end

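
If certificate auto-detection ever fails on a particular machine, the `ssl_ca_bundle` option removed above can still be supplied explicitly. A minimal sketch; the region and bundle path are placeholders, not values from the gem, and credentials resolve from the SDK's default chain.

```ruby
require 'aws-sdk-s3'

# Only needed when the SDK cannot locate system certificates on its own.
s3 = Aws::S3::Client.new(
  region: 'ap-southeast-2',          # placeholder region
  ssl_ca_bundle: '/etc/ssl/cert.pem' # platform-specific path, adjust as needed
)
```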
@@ -500,20 +518,23 @@ module Appydave
  end

  size = calculate_directory_size(source_dir)
- puts '📋 Copy to SSD:'
+ puts '📋 Copy to SSD (excluding generated files):'
  puts " Source: #{source_dir}"
  puts " Dest: #{dest_dir}"
  puts " Size: #{file_size_human(size)}"
  puts ''

  if dry_run
- puts ' [DRY-RUN] Would copy entire project to SSD'
+ puts ' [DRY-RUN] Would copy project to SSD (excluding node_modules, .git, etc.)'
  return true
  end

- FileUtils.mkdir_p(File.dirname(dest_dir))
- FileUtils.cp_r(source_dir, dest_dir, preserve: true)
- puts ' ✅ Copied to SSD'
+ FileUtils.mkdir_p(dest_dir)
+
+ # Copy files with exclusion filtering
+ stats = copy_with_exclusions(source_dir, dest_dir)
+
+ puts " ✅ Copied to SSD (#{stats[:files]} files, excluded #{stats[:excluded]} generated files)"

  true
  rescue StandardError => e
@@ -521,6 +542,47 @@ module Appydave
  false
  end

+ # Copy directory contents with exclusion filtering
+ def copy_with_exclusions(source_dir, dest_dir)
+ stats = { files: 0, excluded: 0 }
+
+ Dir.glob(File.join(source_dir, '**', '*'), File::FNM_DOTMATCH).each do |source_path|
+ next if File.directory?(source_path)
+ next if ['.', '..'].include?(File.basename(source_path))
+
+ relative_path = source_path.sub("#{source_dir}/", '')
+
+ if excluded_path?(relative_path)
+ stats[:excluded] += 1
+ next
+ end
+
+ dest_path = File.join(dest_dir, relative_path)
+ FileUtils.mkdir_p(File.dirname(dest_path))
+ FileUtils.cp(source_path, dest_path, preserve: true)
+ stats[:files] += 1
+ end
+
+ stats
+ end
+
+ # Check if path should be excluded (generated/installable content)
+ def excluded_path?(relative_path)
+ EXCLUDE_PATTERNS.any? do |pattern|
+ # Extract directory/file name from pattern (remove **)
+ excluded_name = pattern.gsub('**/', '').chomp('/**')
+ path_segments = relative_path.split('/')
+
+ if excluded_name.include?('*')
+ # Pattern with wildcards - use fnmatch on filename
+ File.fnmatch(excluded_name, File.basename(relative_path))
+ else
+ # Check if any path segment matches the excluded name
+ path_segments.include?(excluded_name)
+ end
+ end
+ end
+
  # Delete local project directory
  def delete_local_project(project_dir, dry_run: false)
  size = calculate_directory_size(project_dir)
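
To see what the segment-based matching above accepts and rejects, here is a standalone restatement of `excluded_path?` with a shortened pattern list; the expected results are inferred from the logic in the hunk, not taken from the gem's tests.

```ruby
# Standalone restatement so the segment matching can be exercised directly.
EXCLUDE_PATTERNS = %w[**/node_modules/** **/.git/** **/dist/** **/.DS_Store].freeze

def excluded_path?(relative_path)
  EXCLUDE_PATTERNS.any? do |pattern|
    excluded_name = pattern.gsub('**/', '').chomp('/**')
    if excluded_name.include?('*')
      File.fnmatch(excluded_name, File.basename(relative_path))
    else
      relative_path.split('/').include?(excluded_name)
    end
  end
end

excluded_path?('node_modules/pkg/index.js') # => true  ('node_modules' is a path segment)
excluded_path?('docs/.DS_Store')            # => true  ('.DS_Store' is the final segment)
excluded_path?('assets/dist.mp4')           # => false ('dist' is not a whole segment)
```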
@@ -35,6 +35,22 @@ module Appydave
  *.webm
  ].freeze

+ # Directory patterns to exclude (generated/installable content)
+ EXCLUDE_PATTERNS = %w[
+ **/node_modules/**
+ **/.git/**
+ **/.next/**
+ **/dist/**
+ **/build/**
+ **/out/**
+ **/.cache/**
+ **/coverage/**
+ **/.turbo/**
+ **/.vercel/**
+ **/tmp/**
+ **/.DS_Store
+ ].freeze
+
  def initialize(brand, brand_info: nil, brand_path: nil)
  @brand_info = brand_info || load_brand_info(brand)
  @brand = @brand_info.key # Use resolved brand key, not original input
@@ -133,11 +149,11 @@ module Appydave

  # Determine if project should be synced
  def should_sync_project?(project)
- # Only sync if project exists on SSD but NOT in local flat structure
+ # Only sync if project exists on SSD but NOT locally (either flat or archived)
  return false unless project[:storage][:ssd][:exists]

- # Skip if exists locally in flat structure
- return false if project[:storage][:local][:exists] && project[:storage][:local][:structure] == 'flat'
+ # Skip if exists locally in any structure (flat or archived)
+ return false if project[:storage][:local][:exists]

  true
  end
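
The sync decision above reduces to "on SSD and not yet local in any structure". A minimal restatement against the manifest entry layout shown earlier in this diff (not the gem's code verbatim):

```ruby
# Minimal restatement of the decision for illustration.
def should_sync_project?(project)
  project[:storage][:ssd][:exists] && !project[:storage][:local][:exists]
end

should_sync_project?({ storage: { ssd: { exists: true },  local: { exists: false } } }) # => true
should_sync_project?({ storage: { ssd: { exists: true },  local: { exists: true } } })  # => false, already local
should_sync_project?({ storage: { ssd: { exists: false }, local: { exists: false } } }) # => false, nothing to pull
```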
@@ -184,6 +200,7 @@ module Appydave
  LIGHT_FILE_PATTERNS.each do |pattern|
  Dir.glob(File.join(ssd_path, pattern)).each do |source_file|
  next if heavy_file?(source_file)
+ next if excluded_file?(source_file, ssd_path)

  copy_stats = copy_light_file(source_file, ssd_path, local_dir, dry_run: dry_run)
  stats[:files] += copy_stats[:files]
@@ -199,6 +216,30 @@ module Appydave
  HEAVY_FILE_PATTERNS.any? { |pattern| File.fnmatch(pattern, File.basename(source_file)) }
  end

+ # Check if file should be excluded (generated/installable content)
+ def excluded_file?(source_file, ssd_path)
+ relative_path = source_file.sub("#{ssd_path}/", '')
+
+ EXCLUDE_PATTERNS.any? do |pattern|
+ # Extract directory/file name from pattern (remove **)
+ # **/node_modules/** → node_modules
+ # **/.git/** → .git
+ # **/.DS_Store → .DS_Store
+ excluded_name = pattern.gsub('**/', '').chomp('/**')
+
+ # Check path segments for matches
+ path_segments = relative_path.split('/')
+
+ if excluded_name.include?('*')
+ # Pattern with wildcards - use fnmatch on filename
+ File.fnmatch(excluded_name, File.basename(relative_path))
+ else
+ # Check if any path segment matches the excluded name
+ path_segments.include?(excluded_name)
+ end
+ end
+ end
+
  # Copy a single light file
  def copy_light_file(source_file, ssd_path, local_dir, dry_run: false)
  relative_path = source_file.sub("#{ssd_path}/", '')
@@ -2,6 +2,6 @@

  module Appydave
  module Tools
- VERSION = '0.18.1'
+ VERSION = '0.18.3'
  end
  end
data/package.json CHANGED
@@ -1,6 +1,6 @@
  {
  "name": "appydave-tools",
- "version": "0.18.1",
+ "version": "0.18.3",
  "description": "AppyDave YouTube Automation Tools",
  "scripts": {
  "release": "semantic-release"
metadata CHANGED
@@ -1,7 +1,7 @@
  --- !ruby/object:Gem::Specification
  name: appydave-tools
  version: !ruby/object:Gem::Version
- version: 0.18.1
+ version: 0.18.3
  platform: ruby
  authors:
  - David Cruwys
@@ -227,6 +227,10 @@ files:
  - bin/youtube_automation.rb
  - bin/youtube_manager.rb
  - docs/README.md
+ - docs/SESSION-SUMMARY-WINDOWS-PREP.md
+ - docs/WINDOWS-COMPATIBILITY-REPORT.md
+ - docs/WINDOWS-SETUP.md
+ - docs/WINDOWS-START-HERE.md
  - docs/archive/codebase-audit-2025-01.md
  - docs/archive/documentation-framework-proposal.md
  - docs/archive/purpose-and-philosophy.md
@@ -241,6 +245,7 @@ files:
  - docs/dam/session-summary-2025-11-09.md
  - docs/dam/usage.md
  - docs/dam/vat-testing-plan.md
+ - docs/dam/windows-testing-guide.md
  - docs/development/CODEX-recommendations.md
  - docs/development/README.md
  - docs/development/cli-architecture-patterns.md