timet 1.5.2 → 1.5.4

This diff shows the content of publicly available package versions as released to one of the supported registries. It is provided for informational purposes only and reflects the changes between the two versions as they appear in their respective public registries.
checksums.yaml CHANGED
@@ -1,7 +1,7 @@
  ---
  SHA256:
- metadata.gz: f538523396dc470f91334a3abe44915e63dec009a60901a1ae4ef7b432a9a37f
- data.tar.gz: 262da1d62b413102d4ff33932fa7fb05c10c4b13abcb832d6382683d2d314d8d
+ metadata.gz: 9d3673e96ff9410f3b77d1ecdd59b74cbfe794bd50b9f6fad75701200c911742
+ data.tar.gz: d405e584648c18a4119b7f4c242923b74e28e433779ad3929ff22da3e569bba1
  SHA512:
- metadata.gz: 4d1d90414bda675c2f09827ec99c63b4e86e04f826e969d07af15a757e29b1cb868d412d17b7995012b1091173cd50e2093009600bb55888b06ffd1c1a4397e0
- data.tar.gz: cad0f0569224af29941301173dcfa1045c4419db21265b5cdb2d64e5e14f9eeca52d5ada931be7fed1757305ba55bb05823111d974a40c2e57ca0a25c4f8cff7
+ metadata.gz: 63267d2db22ceabb15c2ca4e8114a25854bf71446799992de552ce91f6ff7d8487fb7204f1ea76df8be77dcfa71ac89d21f26084c87fb17831469406920f1862
+ data.tar.gz: 2f7d784e8f9b3385d37c90a04fce2b4ea869c5fb3b1e62be613e835b898a2b6f4f8742dbb38785295e695309c7b6b6f6b3fe804a8e5afb7f03655510612ce05d
data/.rubocop.yml CHANGED
@@ -9,3 +9,9 @@ Metrics/MethodLength:
  Max: 12

  require: rubocop-rspec
+
+ RSpec/ExampleLength:
+ Max: 10
+
+ Metrics/CyclomaticComplexity:
+ Max: 8
data/CHANGELOG.md CHANGED
@@ -1,5 +1,42 @@
  ## [Unreleased]

+ ## [1.5.4] - 2025-02-11
+
+ **Improvements:**
+ - Added `.env` file creation in CI workflow for testing environment variables.
+ - Updated Code Climate coverage reporting to use `simplecov-lcov` for LCOV format compatibility.
+ - Refactored validation error message tests in `ValidationEditHelper` for clarity and maintainability.
+ - Simplified environment variable validation in `S3Supabase` by introducing a helper method `check_env_var`.
+ - Improved integration tests for `play_sound_and_notify` with better stubbing and platform-specific behavior checks.
+ - Added comprehensive tests for `#process_existing_item`, `#update_item_from_hash`, and `#insert_item_from_hash` in `DatabaseSyncer`.
+ - Enhanced database synchronization logic and moved `ITEM_FIELDS` to `DatabaseSyncer` for better organization.
+ - Added `.format_time_string` method to `TimeHelper` with tests for various input formats.
+ - Updated AWS SDK dependencies and improved database sync logging.
+ - Consolidated integration tests and improved readability.
+ - Improved error handling and added tests for `S3Supabase`.
+
+ **Bug Fixes:**
+ - Fixed environment variable validation in `S3Supabase` to handle `nil` values.
+ - Resolved issues with database synchronization logic in `DatabaseSyncer`.
+ - Fixed test setup and cleanup in `S3Supabase` and `DatabaseSyncer` specs.
+ - Addressed edge cases in `#process_and_update_time_field` and `#valid_time_value?`.
+
+ ## [1.5.3] - 2025-01-02
+
+ **Improvements:**
+
+ - Upgraded dependencies in `Gemfile` and `Gemfile.lock` to their latest compatible versions, including `aws-sdk-s3`, `csv`, `sqlite3`, and others.
+ - Refactored `TimeBlockChart` from a module to a class, encapsulating `start_hour` and `end_hour` as instance variables for better state management.
+ - Added a `print_footer` method to `tag_distribution.rb` to provide clear explanations for summary metrics (total duration, average duration, and standard deviation).
+ - Improved documentation for the `TimeBlockChart` class, including example usage and `attr_reader` for `start_hour` and `end_hour`.
+ - Simplified method signatures in `TimeBlockChart` by removing redundant parameters and leveraging instance variables.
+
+ **Bug Fixes:**
+
+ - Fixed an issue in `tag_distribution.rb` where the calculation of `percentage_value` was redundant and simplified the logic.
+ - Corrected the display of the day abbreviation in `TimeBlockChart` to include the full three-letter abbreviation (e.g., "Mon" instead of "Mo").
+ - Removed unused code in `time_statistics.rb` that was not contributing to the `totals` method.
+
  ## [1.5.2] - 2024-12-12

  **Improvements:**
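
The 1.5.4 notes above mention a new `TimeHelper.format_time_string`; its implementation is not part of the hunks shown in this diff, so the following is only a rough sketch of what such a normalizer might look like (module name, accepted inputs, and validation rules are assumptions, not timet's actual code):

```ruby
# Hypothetical sketch only -- not the gem's actual TimeHelper implementation.
module TimeHelperSketch
  # Pads a digit-only time string out to "HH:MM:SS", returning nil when the
  # digits cannot form a valid time of day.
  def self.format_time_string(input)
    digits = input.to_s.scan(/\d/).join
    return nil if digits.empty? || digits.length > 6

    hours, minutes, seconds = digits.ljust(6, '0').scan(/\d{2}/).map(&:to_i)
    return nil if hours > 23 || minutes > 59 || seconds > 59

    format('%02d:%02d:%02d', hours, minutes, seconds)
  end
end

TimeHelperSketch.format_time_string('0930')   # => "09:30:00"
TimeHelperSketch.format_time_string('093015') # => "09:30:15"
TimeHelperSketch.format_time_string('99')     # => nil
```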
@@ -52,15 +52,14 @@ module Timet
  def initialize(*args)
  super

- # Initialize database without validation in test environment
  if defined?(RSpec)
  @db = Database.new
  else
- command_name = args[2][:current_command].name
+ command_name = args.dig(2, :current_command, :name)
  if VALID_ARGUMENTS.include?(command_name)
  @db = Database.new
  else
- puts 'Invalid arguments provided. Please check your input.'
+ warn 'Invalid arguments provided. Please check your input.'
  exit(1)
  end
  end
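
The `args.dig(2, :current_command, :name)` change above swaps chained indexing for a nil-safe lookup. A small standalone illustration with toy data (not timet's actual Thor argument structure):

```ruby
# Toy data only: shows why Array#dig avoids the NoMethodError that chained
# indexing raises when an intermediate value is missing.
args = [[], {}, nil]                 # third element (options hash) absent
# args[2][:current_command]          # => NoMethodError: undefined method `[]' for nil
args.dig(2, :current_command, :name) # => nil, no exception

args = [[], {}, { current_command: { name: 'stop' } }]
args.dig(2, :current_command, :name) # => "stop"
```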
@@ -110,6 +109,7 @@ module Timet

  desc 'stop', 'Stop time tracking'
  # Stops the current tracking session if there is one in progress.
+ # After stopping the tracking session, it displays a summary of the tracked time.
  #
  # @return [void] This method does not return a value; it performs side effects such as updating the tracking item
  # and generating a summary.
@@ -120,15 +120,14 @@ module Timet
  # @note The method checks if the last tracking item is in progress by calling `@db.item_status`.
  # @note If the last item is in progress, it fetches the last item's ID using `@db.fetch_last_id` and updates it
  # with the current timestamp.
- # @note The method then fetches the last item using `@db.last_item` and generates a summary if the result
- # is not nil.
- def stop(display = nil)
+ # @note The method always generates a summary after stopping the tracking session.
+ def stop
  return unless @db.item_status == :in_progress

  last_id = @db.fetch_last_id
  @db.update_item(last_id, 'end', TimeHelper.current_timestamp)

- summary unless display
+ summary
  end

  desc 'resume (r) [id]', 'Resume last task (id is an optional parameter) => tt resume'
@@ -306,6 +305,7 @@ module Timet
  desc 'sync', 'Sync local db with supabase external db'
  def sync
  puts 'Syncing database with remote storage...'
+ puts 'Sync method called'
  DatabaseSyncHelper.sync(@db, BUCKET)
  end
  end
@@ -111,7 +111,7 @@ module Timet
  def run_linux_session(time, tag)
  notification_command = "notify-send --icon=clock '#{show_message(tag)}'"
  command = "sleep #{time} && tput bel && tt stop 0 && #{notification_command} &"
- pid = spawn(command)
+ pid = Kernel.spawn(command)
  Process.detach(pid)
  end

@@ -123,7 +123,7 @@ module Timet
  def run_mac_session(time, tag)
  notification_command = "osascript -e 'display notification \"#{show_message(tag)}\"'"
  command = "sleep #{time} && afplay /System/Library/Sounds/Basso.aiff && tt stop 0 && #{notification_command} &"
- pid = spawn(command)
+ pid = Kernel.spawn(command)
  Process.detach(pid)
  end
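
Both notification helpers above switch from a bare `spawn` call to the explicit `Kernel.spawn` receiver, which lines up with the changelog note about better stubbing in the `play_sound_and_notify` tests. The underlying fire-and-forget pattern, reduced to a minimal standalone example:

```ruby
# Minimal fire-and-forget pattern: run a shell command in the background and
# detach so the parent never blocks on it or leaves a zombie process behind.
command = 'sleep 2 && echo "session finished"'
pid = Kernel.spawn(command)
Process.detach(pid) # returns a thread that reaps the child when it exits
```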
@@ -285,8 +285,6 @@ module Timet
  :complete
  end

- private
-
  # Moves the old database file to the new location if it exists.
  #
  # @param database_path [String] The path to the new SQLite database file.
@@ -314,7 +312,7 @@ module Timet
  # @raise [StandardError] If there is an issue executing the SQL queries, an error may be raised.
  #
  def update_time_columns
- result = execute_sql('SELECT * FROM items where updated_at is null or created_at is null')
+ result = execute_sql('SELECT * FROM items WHERE updated_at IS NULL OR created_at IS NULL')
  result.each do |item|
  id = item[0]
  end_time = item[2]
@@ -2,13 +2,13 @@

  require 'tempfile'
  require 'digest'
+ require_relative 'database_syncer'

  module Timet
  # Helper module for database synchronization operations
  # Provides methods for comparing and syncing local and remote databases
  module DatabaseSyncHelper
- # Fields used in item operations
- ITEM_FIELDS = %w[start end tag notes pomodoro updated_at created_at deleted].freeze
+ extend DatabaseSyncer

  # Main entry point for database synchronization
  #
@@ -57,6 +57,8 @@ module Timet
  # @note This method ensures proper resource cleanup by using ensure block
  def self.with_temp_file
  temp_file = Tempfile.new('remote_db')
+ raise 'Temporary file path is nil' unless temp_file.path
+
  yield temp_file
  ensure
  temp_file.close
@@ -74,209 +76,5 @@ module Timet
  local_md5 = Digest::MD5.file(local_path).hexdigest
  remote_md5 == local_md5
  end
-
- # Handles the synchronization process when differences are detected between databases
- #
- # @param local_db [SQLite3::Database] The local database connection
- # @param remote_storage [S3Supabase] The remote storage client for cloud operations
- # @param bucket [String] The S3 bucket name
- # @param local_db_path [String] Path to the local database file
- # @param remote_path [String] Path to the downloaded remote database file
- # @return [void]
- # @note This method attempts to sync the databases and handles any errors that occur during the process
- def self.handle_database_differences(*args)
- local_db, remote_storage, bucket, local_db_path, remote_path = args
- puts 'Differences detected between local and remote databases'
- begin
- sync_with_remote_database(local_db, remote_path, remote_storage, bucket, local_db_path)
- rescue SQLite3::Exception => e
- handle_sync_error(e, remote_storage, bucket, local_db_path)
- end
- end
-
- # Performs the actual database synchronization by setting up connections and syncing data
- #
- # @param local_db [SQLite3::Database] The local database connection
- # @param remote_path [String] Path to the remote database file
- # @param remote_storage [S3Supabase] The remote storage client for cloud operations
- # @param bucket [String] The S3 bucket name
- # @param local_db_path [String] Path to the local database file
- # @return [void]
- # @note Configures both databases to return results as hashes for consistent data handling
- def self.sync_with_remote_database(*args)
- local_db, remote_path, remote_storage, bucket, local_db_path = args
- db_remote = open_remote_database(remote_path)
- db_remote.results_as_hash = true
- local_db.instance_variable_get(:@db).results_as_hash = true
- sync_databases(local_db, db_remote, remote_storage, bucket, local_db_path)
- end
-
- # Opens and validates a connection to the remote database
- #
- # @param remote_path [String] Path to the remote database file
- # @return [SQLite3::Database] The initialized database connection
- # @raise [RuntimeError] If the database connection cannot be established
- # @note Validates that the database connection is properly initialized
- def self.open_remote_database(remote_path)
- db_remote = SQLite3::Database.new(remote_path)
- raise 'Failed to initialize remote database' unless db_remote
-
- db_remote
- end
-
- # Handles errors that occur during database synchronization
- #
- # @param error [SQLite3::Exception] The error that occurred during sync
- # @param remote_storage [S3Supabase] The remote storage client for cloud operations
- # @param bucket [String] The S3 bucket name
- # @param local_db_path [String] Path to the local database file
- # @return [void]
- # @note When sync fails, this method falls back to uploading the local database
- def self.handle_sync_error(error, remote_storage, bucket, local_db_path)
- puts "Error opening remote database: #{error.message}"
- puts 'Uploading local database to replace corrupted remote database'
- remote_storage.upload_file(bucket, local_db_path, 'timet.db')
- end
-
- # Converts database items to a hash indexed by ID
- #
- # @param items [Array<Hash>] Array of database items
- # @return [Hash] Items indexed by ID
- def self.items_to_hash(items)
- items.to_h { |item| [item['id'], item] }
- end
-
- # Determines if remote item should take precedence
- #
- # @param remote_item [Hash] Remote database item
- # @param remote_time [Integer] Remote item timestamp
- # @param local_time [Integer] Local item timestamp
- # @return [Boolean] true if remote item should take precedence
- def self.remote_wins?(remote_item, remote_time, local_time)
- remote_time > local_time && (remote_item['deleted'].to_i == 1 || remote_time > local_time)
- end
-
- # Formats item status message
- #
- # @param id [Integer] Item ID
- # @param item [Hash] Database item
- # @param source [String] Source of the item ('Remote' or 'Local')
- # @return [String] Formatted status message
- def self.format_status_message(id, item, source)
- deleted = item['deleted'].to_i == 1 ? ' and deleted' : ''
- "#{source} item #{id} is newer#{deleted} - #{source == 'Remote' ? 'updating local' : 'will be uploaded'}"
- end
-
- # Processes an item that exists in both databases
- #
- # @param id [Integer] Item ID
- # @param local_item [Hash] Local database item
- # @param remote_item [Hash] Remote database item
- # @param local_db [SQLite3::Database] Local database connection
- # @return [Symbol] :local_update if local was updated, :remote_update if remote needs update
- def self.process_existing_item(*args)
- id, local_item, remote_item, local_db = args
- local_time = local_item['updated_at'].to_i
- remote_time = remote_item['updated_at'].to_i
-
- if remote_wins?(remote_item, remote_time, local_time)
- puts format_status_message(id, remote_item, 'Remote')
- update_item_from_hash(local_db, remote_item)
- :local_update
- elsif local_time > remote_time
- puts format_status_message(id, local_item, 'Local')
- :remote_update
- end
- end
-
- # Processes items from both databases and syncs them
- #
- # @param local_db [SQLite3::Database] The local database connection
- # @param remote_db [SQLite3::Database] The remote database connection
- # @return [void]
- def self.process_database_items(local_db, remote_db)
- remote_items = remote_db.execute('SELECT * FROM items ORDER BY updated_at DESC')
- local_items = local_db.execute_sql('SELECT * FROM items ORDER BY updated_at DESC')
-
- sync_items_by_id(
- local_db,
- items_to_hash(local_items),
- items_to_hash(remote_items)
- )
- end
-
- # Syncs items between local and remote databases based on their IDs
- #
- # @param local_db [SQLite3::Database] The local database connection
- # @param local_items_by_id [Hash] Local items indexed by ID
- # @param remote_items_by_id [Hash] Remote items indexed by ID
- # @return [void]
- def self.sync_items_by_id(local_db, local_items_by_id, remote_items_by_id)
- all_item_ids = (remote_items_by_id.keys + local_items_by_id.keys).uniq
-
- all_item_ids.each do |id|
- if !remote_items_by_id[id]
- puts "Local item #{id} will be uploaded"
- elsif !local_items_by_id[id]
- puts "Adding remote item #{id} to local"
- insert_item_from_hash(local_db, remote_items_by_id[id])
- else
- process_existing_item(id, local_items_by_id[id], remote_items_by_id[id], local_db)
- end
- end
- end
-
- # Synchronizes the local and remote databases by comparing and merging their items
- #
- # @param local_db [SQLite3::Database] The local database connection
- # @param remote_db [SQLite3::Database] The remote database connection
- # @param remote_storage [S3Supabase] The remote storage client for cloud operations
- # @param bucket [String] The S3 bucket name
- # @param local_db_path [String] Path to the local database file
- # @return [void]
- # @note This method orchestrates the entire database synchronization process
- def self.sync_databases(*args)
- local_db, remote_db, remote_storage, bucket, local_db_path = args
- process_database_items(local_db, remote_db)
- remote_storage.upload_file(bucket, local_db_path, 'timet.db')
- puts 'Database sync completed'
- end
-
- # Gets the values array for database operations
- #
- # @param item [Hash] Hash containing item data
- # @param include_id [Boolean] Whether to include ID at start (insert) or end (update)
- # @return [Array] Array of values for database operation
- def self.get_item_values(item, include_id_at_start: false)
- values = ITEM_FIELDS.map { |field| item[field] }
- include_id_at_start ? [item['id'], *values] : [*values, item['id']]
- end
-
- # Updates an existing item in the database with values from a hash
- #
- # @param db [SQLite3::Database] The database connection
- # @param item [Hash] Hash containing item data
- # @return [void]
- def self.update_item_from_hash(db, item)
- fields = "#{ITEM_FIELDS.join(' = ?, ')} = ?"
- db.execute_sql(
- "UPDATE items SET #{fields} WHERE id = ?",
- get_item_values(item)
- )
- end
-
- # Inserts a new item into the database from a hash
- #
- # @param db [SQLite3::Database] The database connection
- # @param item [Hash] Hash containing item data
- # @return [void]
- def self.insert_item_from_hash(db, item)
- fields = ['id', *ITEM_FIELDS].join(', ')
- placeholders = Array.new(ITEM_FIELDS.length + 1, '?').join(', ')
- db.execute_sql(
- "INSERT INTO items (#{fields}) VALUES (#{placeholders})",
- get_item_values(item, include_id_at_start: true)
- )
- end
  end
  end
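
The net effect of the hunks above is that `DatabaseSyncHelper` drops some two hundred lines of `self.` sync methods and instead does `extend DatabaseSyncer`, picking the same methods back up from the new module shown below. A toy reduction of that `extend` pattern:

```ruby
# Toy reduction of the refactor: instance methods defined in a mixin become
# module-level (singleton) methods on whatever module extends it.
module SyncerMixin
  def handle_database_differences(*args)
    "stub called with #{args.inspect}"
  end
end

module SyncHelper
  extend SyncerMixin
end

SyncHelper.handle_database_differences(:local_db, :remote_db)
# => "stub called with [:local_db, :remote_db]"
```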
@@ -0,0 +1,214 @@
+ # frozen_string_literal: true
+
+ module Timet
+ # Module responsible for synchronizing local and remote databases
+ module DatabaseSyncer
+ # Fields used in item operations
+ ITEM_FIELDS = %w[start end tag notes pomodoro updated_at created_at deleted].freeze
+
+ # Handles the synchronization process when differences are detected between databases
+ #
+ # @param local_db [SQLite3::Database] The local database connection
+ # @param remote_storage [S3Supabase] The remote storage client for cloud operations
+ # @param bucket [String] The S3 bucket name
+ # @param local_db_path [String] Path to the local database file
+ # @param remote_path [String] Path to the downloaded remote database file
+ # @return [void]
+ # @note This method attempts to sync the databases and handles any errors that occur during the process
+ def handle_database_differences(*args)
+ local_db, remote_storage, bucket, local_db_path, remote_path = args
+ puts 'Differences detected between local and remote databases'
+ begin
+ sync_with_remote_database(local_db, remote_path, remote_storage, bucket, local_db_path)
+ rescue SQLite3::Exception => e
+ handle_sync_error(e, remote_storage, bucket, local_db_path)
+ end
+ end
+
+ # Handles errors that occur during database synchronization
+ #
+ # @param error [SQLite3::Exception] The error that occurred during sync
+ # @param remote_storage [S3Supabase] The remote storage client for cloud operations
+ # @param bucket [String] The S3 bucket name
+ # @param local_db_path [String] Path to the local database file
+ # @return [void]
+ # @note When sync fails, this method falls back to uploading the local database
+ def handle_sync_error(error, remote_storage, bucket, local_db_path)
+ puts "Error opening remote database: #{error.message}"
+ puts 'Uploading local database to replace corrupted remote database'
+ remote_storage.upload_file(bucket, local_db_path, 'timet.db')
+ end
+
+ # Performs the actual database synchronization by setting up connections and syncing data
+ #
+ # @param local_db [SQLite3::Database] The local database connection
+ # @param remote_path [String] Path to the remote database file
+ # @param remote_storage [S3Supabase] The remote storage client for cloud operations
+ # @param bucket [String] The S3 bucket name
+ # @param local_db_path [String] Path to the local database file
+ # @return [void]
+ # @note Configures both databases to return results as hashes for consistent data handling
+ def sync_with_remote_database(*args)
+ local_db, remote_path, remote_storage, bucket, local_db_path = args
+ db_remote = open_remote_database(remote_path)
+ db_remote.results_as_hash = true
+ local_db.instance_variable_get(:@db).results_as_hash = true
+ sync_databases(local_db, db_remote, remote_storage, bucket, local_db_path)
+ end
+
+ # Opens and validates a connection to the remote database
+ #
+ # @param remote_path [String] Path to the remote database file
+ # @return [SQLite3::Database] The initialized database connection
+ # @raise [RuntimeError] If the database connection cannot be established
+ # @note Validates that the database connection is properly initialized
+ def open_remote_database(remote_path)
+ db_remote = SQLite3::Database.new(remote_path)
+ raise 'Failed to initialize remote database' unless db_remote
+
+ db_remote
+ end
+
+ # Synchronizes the local and remote databases by comparing and merging their items
+ #
+ # @param local_db [SQLite3::Database] The local database connection
+ # @param remote_db [SQLite3::Database] The remote database connection
+ # @param remote_storage [S3Supabase] The remote storage client for cloud operations
+ # @param bucket [String] The S3 bucket name
+ # @param local_db_path [String] Path to the local database file
+ # @return [void]
+ # @note This method orchestrates the entire database synchronization process
+ def sync_databases(*args)
+ local_db, remote_db, remote_storage, bucket, local_db_path = args
+ process_database_items(local_db, remote_db)
+ remote_storage.upload_file(bucket, local_db_path, 'timet.db')
+ puts 'Database sync completed'
+ end
+
+ # Processes items from both databases and syncs them
+ #
+ # @param local_db [SQLite3::Database] The local database connection
+ # @param remote_db [SQLite3::Database] The remote database connection
+ # @return [void]
+ def process_database_items(local_db, remote_db)
+ remote_items = remote_db.execute('SELECT * FROM items ORDER BY updated_at DESC')
+ local_items = local_db.execute_sql('SELECT * FROM items ORDER BY updated_at DESC')
+
+ sync_items_by_id(
+ local_db,
+ items_to_hash(local_items),
+ items_to_hash(remote_items)
+ )
+ end
+
+ # Syncs items between local and remote databases based on their IDs
+ #
+ # @param local_db [SQLite3::Database] The local database connection
+ # @param local_items_by_id [Hash] Local items indexed by ID
+ # @param remote_items_by_id [Hash] Remote items indexed by ID
+ # @return [void]
+ def sync_items_by_id(local_db, local_items_by_id, remote_items_by_id)
+ all_item_ids = (remote_items_by_id.keys + local_items_by_id.keys).uniq
+
+ all_item_ids.each do |id|
+ if !remote_items_by_id[id]
+ puts "Local item #{id} will be uploaded"
+ elsif !local_items_by_id[id]
+ puts "Adding remote item #{id} to local"
+ insert_item_from_hash(local_db, remote_items_by_id[id])
+ else
+ process_existing_item(id, local_items_by_id[id], remote_items_by_id[id], local_db)
+ end
+ end
+ end
+
+ # Inserts a new item into the database from a hash
+ #
+ # @param db [SQLite3::Database] The database connection
+ # @param item [Hash] Hash containing item data
+ # @return [void]
+ def insert_item_from_hash(db, item)
+ fields = ['id', *ITEM_FIELDS].join(', ')
+ placeholders = Array.new(ITEM_FIELDS.length + 1, '?').join(', ')
+ db.execute_sql(
+ "INSERT INTO items (#{fields}) VALUES (#{placeholders})",
+ get_item_values(item, include_id_at_start: true)
+ )
+ end
+
+ # Processes an item that exists in both databases
+ #
+ # @param id [Integer] Item ID
+ # @param local_item [Hash] Local database item
+ # @param remote_item [Hash] Remote database item
+ # @param local_db [SQLite3::Database] Local database connection
+ # @return [Symbol] :local_update if local was updated, :remote_update if remote needs update
+ def process_existing_item(*args)
+ id, local_item, remote_item, local_db = args
+ local_time = local_item['updated_at'].to_i
+ remote_time = remote_item['updated_at'].to_i
+
+ if remote_wins?(remote_item, remote_time, local_time)
+ puts format_status_message(id, remote_item, 'Remote')
+ update_item_from_hash(local_db, remote_item)
+ :local_update
+ elsif local_time > remote_time
+ puts format_status_message(id, local_item, 'Local')
+ :remote_update
+ end
+ end
+
+ # Converts database items to a hash indexed by ID
+ #
+ # @param items [Array<Hash>] Array of database items
+ # @return [Hash] Items indexed by ID
+ def items_to_hash(items)
+ items.to_h { |item| [item['id'], item] }
+ end
+
+ # Determines if remote item should take precedence
+ #
+ # @param remote_item [Hash] Remote database item
+ # @param remote_time [Integer] Remote item timestamp
+ # @param local_time [Integer] Local item timestamp
+ # @return [Boolean] true if remote item should take precedence
+ def remote_wins?(remote_item, remote_time, local_time)
+ remote_time > local_time && (remote_item['deleted'].to_i == 1 || remote_time > local_time)
+ end
+
+ # Formats item status message
+ #
+ # @param id [Integer] Item ID
+ # @param item [Hash] Database item
+ # @param source [String] Source of the item ('Remote' or 'Local')
+ # @return [String] Formatted status message
+ def format_status_message(id, item, source)
+ deleted = item['deleted'].to_i == 1 ? ' and deleted' : ''
+ "#{source} item #{id} is newer#{deleted} - #{source == 'Remote' ? 'updating local' : 'will be uploaded'}"
+ end
+
+ # Updates an existing item in the database with values from a hash
+ #
+ # @param db [SQLite3::Database] The database connection
+ # @param item [Hash] Hash containing item data
+ # @return [void]
+ def update_item_from_hash(db, item)
+ fields = "#{ITEM_FIELDS.join(' = ?, ')} = ?"
+ db.execute_sql(
+ "UPDATE items SET #{fields} WHERE id = ?",
+ get_item_values(item)
+ )
+ end
+
+ # Gets the values array for database operations
+ #
+ # @param item [Hash] Hash containing item data
+ # @param include_id [Boolean] Whether to include ID at start (insert) or end (update)
+ # @return [Array] Array of values for database operation
+ def get_item_values(item, include_id_at_start: false)
+ @database_fields ||= ITEM_FIELDS
+ values = @database_fields.map { |field| item[field] }
+ include_id_at_start ? [item['id'], *values] : values
+ end
+ end
+ end
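
To see how the new module's `sync_items_by_id` and `process_existing_item` resolve conflicts, here is a toy walk-through with hand-made rows trimmed to the fields the merge logic actually inspects (real rows carry the full `ITEM_FIELDS` set):

```ruby
local_items_by_id  = {
  1 => { 'id' => 1, 'updated_at' => 100, 'deleted' => 0 },
  2 => { 'id' => 2, 'updated_at' => 300, 'deleted' => 0 }
}
remote_items_by_id = {
  2 => { 'id' => 2, 'updated_at' => 200, 'deleted' => 0 },
  3 => { 'id' => 3, 'updated_at' => 150, 'deleted' => 1 }
}

(remote_items_by_id.keys + local_items_by_id.keys).uniq.sort # => [1, 2, 3]
# id 1: missing remotely             -> "Local item 1 will be uploaded"
# id 2: in both, local copy is newer -> :remote_update (local row wins on upload)
# id 3: missing locally              -> inserted into the local database, deleted flag included
```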
@@ -179,6 +179,7 @@ module Timet
  @logger.info "Object '#{object_key}' deleted successfully."
  rescue Aws::S3::Errors::ServiceError => e
  @logger.error "Error deleting object: #{e.message}"
+ raise e
  end

  # Deletes a bucket and all its contents.
@@ -197,6 +198,7 @@ module Timet
  @logger.info "Bucket '#{bucket_name}' deleted successfully."
  rescue Aws::S3::Errors::ServiceError => e
  @logger.error "Error deleting bucket: #{e.message}"
+ raise e
  end

  private
@@ -207,14 +209,19 @@ module Timet
  # @return [void]
  def validate_env_vars
  missing_vars = []
- missing_vars << 'S3_ENDPOINT' if S3_ENDPOINT.empty?
- missing_vars << 'S3_ACCESS_KEY' if S3_ACCESS_KEY.empty?
- missing_vars << 'S3_SECRET_KEY' if S3_SECRET_KEY.empty?
+ missing_vars.concat(check_env_var('S3_ENDPOINT', S3_ENDPOINT))
+ missing_vars.concat(check_env_var('S3_ACCESS_KEY', S3_ACCESS_KEY))
+ missing_vars.concat(check_env_var('S3_SECRET_KEY', S3_SECRET_KEY))

  return if missing_vars.empty?

- error_message = "Missing required environment variables (.env): #{missing_vars.join(', ')}"
- raise CustomError, error_message
+ raise CustomError, "Missing required environment variables (.env): #{missing_vars.join(', ')}"
+ end
+
+ def check_env_var(name, value)
+ return [] if value && !value.empty?
+
+ [name]
  end

  # Custom error class that suppresses the backtrace for cleaner error messages.
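
The `validate_env_vars` change above is what the 1.5.4 changelog calls handling `nil` values: the old `S3_ENDPOINT.empty?` style checks would raise `NoMethodError` if a variable were unset and its constant ended up `nil`, whereas the new helper treats `nil` and `''` alike. A standalone copy of the helper's logic, for illustration only:

```ruby
def check_env_var(name, value)
  return [] if value && !value.empty?

  [name]
end

check_env_var('S3_ENDPOINT', nil)        # => ["S3_ENDPOINT"]  (no NoMethodError on nil)
check_env_var('S3_ACCESS_KEY', '')       # => ["S3_ACCESS_KEY"]
check_env_var('S3_SECRET_KEY', 'abc123') # => []
```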