winrm-fs 1.0.1 → 1.0.2

data/changelog.md CHANGED
@@ -1,51 +1,54 @@
- # WinRM-fs Gem Changelog
- # 1.0.1
- - Call ClearScriptBlockCache to prevent OutOfMemoryExceptions
-
- # 1.0.0
- - Using winrm v2. File uploads just got a whole lot faster!
-
- # 0.4.3
- - Fix error handling with wmf5, filtering out progress output from inspected stderr.
-
- # 0.4.2
- - Improved PowerShell error handling in metadata checking.
-
- # 0.4.1
- - Fixes a regression on Windows 2008 R2/Windows 7 and below where the WinRM service corrupts the check file's metadata, resulting in malformed destination paths.
-
- # 0.4.0
- - Correct the destination path of individual files. Always assume it is the full destination path unless it is an existing directory. This may potentially break some callers expecting the remote path to be a directory that winrm-fs will create if missing as the destination of the local file. A new directory will not be created and the local file will be uploaded directly to the remote path.
-
- # 0.3.2
- - Fix re-extraction of cached directories from the temp folder when there is more than one "clean" directory deleted from the destination
-
- # 0.3.1
- - Widen logging version constraints to include 2.0 (matching the WinRM core gem)
-
- # 0.3.0
- - Jettisons `CommandExecutor`, which now lives in the core WinRM gem, and swaps in the implementation currently used in the winrm-transport gem. These changes should have little visible effect on current consumers of the `FileManager` class, with these exceptions:
- - BREAKING CHANGE: When uploading a directory and the destination directory exists on the endpoint, the source base directory will be created below the destination directory on the endpoint and the source directory contents will be unzipped to that location. Prior to this release, the contents of the source directory would be unzipped to an existing destination directory without creating the source base directory. This new behavior is more consistent with SCP and other well-known shell copy commands.
- - `Upload` may now receive an array of source files and directories rather than just a single file or directory path.
-
- # 0.2.4
- - Fix issue #21, downloading files is extremely slow.
- - Add zip file creation debug logging.
-
- # 0.2.3
- - Fix yielding progress data, issue #23
-
- # 0.2.2
- - Fix PowerShell streams leaking to standard error, breaking Windows 10, issue #18
-
- # 0.2.1
- - Fixed issue #16, creating zip file on Windows
-
- # 0.2.0
- - Redesigned temp zip file creation system
- - Fixed lots of small edge case issues, especially with directory uploads
- - Simplified the file manager upload method API to take only a single source file or directory
- - Expanded acceptable usernames and hostnames for rwinrmcp
-
- # 0.1.0
- - Initial alpha quality release
+ # WinRM-fs Gem Changelog
+ # 1.0.2
+ - Fix `Pathname.glob` expansion of shortnames.
+
+ # 1.0.1
+ - Call ClearScriptBlockCache to prevent OutOfMemoryExceptions
+
+ # 1.0.0
+ - Using winrm v2. File uploads just got a whole lot faster!
+
+ # 0.4.3
+ - Fix error handling with wmf5, filtering out progress output from inspected stderr.
+
+ # 0.4.2
+ - Improved PowerShell error handling in metadata checking.
+
+ # 0.4.1
+ - Fixes a regression on Windows 2008 R2/Windows 7 and below where the WinRM service corrupts the check file's metadata, resulting in malformed destination paths.
+
+ # 0.4.0
+ - Correct the destination path of individual files. Always assume it is the full destination path unless it is an existing directory. This may potentially break some callers expecting the remote path to be a directory that winrm-fs will create if missing as the destination of the local file. A new directory will not be created and the local file will be uploaded directly to the remote path.
+
+ # 0.3.2
+ - Fix re-extraction of cached directories from the temp folder when there is more than one "clean" directory deleted from the destination
+
+ # 0.3.1
+ - Widen logging version constraints to include 2.0 (matching the WinRM core gem)
+
+ # 0.3.0
+ - Jettisons `CommandExecutor`, which now lives in the core WinRM gem, and swaps in the implementation currently used in the winrm-transport gem. These changes should have little visible effect on current consumers of the `FileManager` class, with these exceptions:
+ - BREAKING CHANGE: When uploading a directory and the destination directory exists on the endpoint, the source base directory will be created below the destination directory on the endpoint and the source directory contents will be unzipped to that location. Prior to this release, the contents of the source directory would be unzipped to an existing destination directory without creating the source base directory. This new behavior is more consistent with SCP and other well-known shell copy commands.
+ - `Upload` may now receive an array of source files and directories rather than just a single file or directory path.
+
+ # 0.2.4
+ - Fix issue #21, downloading files is extremely slow.
+ - Add zip file creation debug logging.
+
+ # 0.2.3
+ - Fix yielding progress data, issue #23
+
+ # 0.2.2
+ - Fix PowerShell streams leaking to standard error, breaking Windows 10, issue #18
+
+ # 0.2.1
+ - Fixed issue #16, creating zip file on Windows
+
+ # 0.2.0
+ - Redesigned temp zip file creation system
+ - Fixed lots of small edge case issues, especially with directory uploads
+ - Simplified the file manager upload method API to take only a single source file or directory
+ - Expanded acceptable usernames and hostnames for rwinrmcp
+
+ # 0.1.0
+ - Initial alpha quality release
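The 0.4.0 entry above changed how a file's destination is resolved: the remote path is treated as the full target path unless it names an existing directory, in which case the source basename is appended. A minimal local sketch of that rule, mirroring the `reconcile_destinations!` logic in the transporter diff further down; `resolve_destination` is a hypothetical helper name, not part of the gem's API:

```ruby
require 'fileutils'
require 'tmpdir'

# Hypothetical helper illustrating the 0.4.0 destination rule: when the
# destination is an existing directory, the source basename is appended;
# otherwise the destination is taken verbatim as the target file path.
def resolve_destination(src, dst)
  File.directory?(dst) ? File.join(dst, File.basename(src)) : dst
end

Dir.mktmpdir do |dir|
  existing_dir = File.join(dir, 'uploads')
  FileUtils.mkdir_p(existing_dir)

  # An existing directory: the file lands inside it.
  puts resolve_destination('app.conf', existing_dir)
  # A non-existent path: used as-is, no directory is created for you.
  puts resolve_destination('app.conf', File.join(dir, 'renamed.conf'))
end
```

Callers that previously relied on winrm-fs creating a missing remote directory must now either create it first or pass the complete destination file path.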
data/lib/winrm-fs.rb CHANGED
@@ -1,28 +1,28 @@
- # encoding: UTF-8
- #
- # Copyright 2015 Shawn Neal <sneal@sneal.net>
- #
- # Licensed under the Apache License, Version 2.0 (the "License");
- # you may not use this file except in compliance with the License.
- # You may obtain a copy of the License at
- #
- # http://www.apache.org/licenses/LICENSE-2.0
- #
- # Unless required by applicable law or agreed to in writing, software
- # distributed under the License is distributed on an "AS IS" BASIS,
- # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- # See the License for the specific language governing permissions and
- # limitations under the License.
-
- require 'winrm'
- require 'logger'
- require 'pathname'
- require_relative 'winrm-fs/exceptions'
- require_relative 'winrm-fs/file_manager'
-
- module WinRM
-   # WinRM File System
-   module FS
-     # Top level module code
-   end
- end
+ # encoding: UTF-8
+ #
+ # Copyright 2015 Shawn Neal <sneal@sneal.net>
+ #
+ # Licensed under the Apache License, Version 2.0 (the "License");
+ # you may not use this file except in compliance with the License.
+ # You may obtain a copy of the License at
+ #
+ # http://www.apache.org/licenses/LICENSE-2.0
+ #
+ # Unless required by applicable law or agreed to in writing, software
+ # distributed under the License is distributed on an "AS IS" BASIS,
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
+ # See the License for the specific language governing permissions and
+ # limitations under the License.
+
+ require 'winrm'
+ require 'logger'
+ require 'pathname'
+ require_relative 'winrm-fs/exceptions'
+ require_relative 'winrm-fs/file_manager'
+
+ module WinRM
+   # WinRM File System
+   module FS
+     # Top level module code
+   end
+ end
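The `FileTransporter` diff below streams files by Base64-encoding fixed-size chunks, sizing each raw read so the encoded payload plus the destination path fits in one PSRP fragment: `read_size = ((max_encoded_write - dest.length) / 4) * 3`. A standalone sketch of that arithmetic; the `max_encoded_write` value here is an illustrative placeholder, not the gem's real fragment budget (which it derives from `shell.max_fragment_blob_size`):

```ruby
require 'base64'

# Illustrative per-write budget; in the gem this is derived from
# shell.max_fragment_blob_size minus the size of an empty command envelope.
max_encoded_write = 32_768
dest = 'C:\\Users\\demo\\upload.zip'

# Base64 turns every 3 raw bytes into 4 encoded characters, so reading
# ((budget) / 4) * 3 raw bytes guarantees the encoded chunk fits the budget.
read_size = ((max_encoded_write - dest.length) / 4) * 3
raw_chunk = 'x' * read_size
encoded = [raw_chunk].pack('m0') # 'm0' is BASE64_PACK: Base64 with no newlines

puts read_size                                          # raw bytes read per shell.run
puts encoded.length                                     # encoded size actually sent
puts encoded.length <= max_encoded_write - dest.length  # always true by construction
```

Because `read_size` is a multiple of 3, each chunk encodes without padding and the `bytes += buffer.bytesize / 3 * 4` accounting in `stream_upload` stays exact for every full chunk.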
@@ -1,527 +1,527 @@
1
- # -*- encoding: utf-8 -*-
2
- #
3
- # Author:: Fletcher (<fnichol@nichol.ca>)
4
- #
5
- # Copyright (C) 2015, Fletcher Nichol
6
- #
7
- # Licensed under the Apache License, Version 2.0 (the "License");
8
- # you may not use this file except in compliance with the License.
9
- # You may obtain a copy of the License at
10
- #
11
- # http://www.apache.org/licenses/LICENSE-2.0
12
- #
13
- # Unless required by applicable law or agreed to in writing, software
14
- # distributed under the License is distributed on an "AS IS" BASIS,
15
- # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
16
- # See the License for the specific language governing permissions and
17
- # limitations under the License.
18
-
19
- require 'benchmark'
20
- require 'csv'
21
- require 'digest'
22
- require 'securerandom'
23
- require 'stringio'
24
-
25
- require 'winrm-fs/core/tmp_zip'
26
-
27
- module WinRM
28
- module FS
29
- module Core
30
- # Wrapped exception for any internally raised WinRM-related errors.
31
- #
32
- # @author Fletcher Nichol <fnichol@nichol.ca>
33
- class FileTransporterFailed < ::WinRM::WinRMError; end
34
- # rubocop:disable MethodLength, AbcSize, ClassLength
35
-
36
- # Object which can upload one or more files or directories to a remote
37
- # host over WinRM using PowerShell scripts and CMD commands. Note that
38
- # this form of file transfer is *not* ideal and extremely costly on both
39
- # the local and remote sides. Great pains are made to minimize round
40
- # trips to the remote host and to minimize the number of PowerShell
41
- # sessions being invoked which can be 2 orders of magnitude more
42
- # expensive than vanilla CMD commands.
43
- #
44
- # This object is supported by a `PowerShell` instance as it
45
- # depends on the `#run` API contract.
46
- #
47
- # An optional logger can be supplied, assuming it can respond to the
48
- # `#debug` and `#debug?` messages.
49
- #
50
- # @author Fletcher Nichol <fnichol@nichol.ca>
51
- # @author Matt Wrock <matt@mattwrock.com>
52
- class FileTransporter
53
- # Creates a FileTransporter given a PowerShell object.
54
- #
55
- # @param shell [PowerShell] a winrm PowerShell object
56
- def initialize(shell, opts = {})
57
- @shell = shell
58
- @logger = shell.logger
59
- @id_generator = opts.fetch(:id_generator) { -> { SecureRandom.uuid } }
60
- end
61
-
62
- # Uploads a collection of files and/or directories to the remote host.
63
- #
64
- # **TODO Notes:**
65
- # * options could specify zip mode, zip options, etc.
66
- # * maybe option to set tmpfile base dir to override $env:PATH?
67
- # * progress yields block like net-scp progress
68
- # * final API: def upload(locals, remote, _options = {}, &_progress)
69
- #
70
- # @param locals [Array<String>,String] one or more local file or
71
- # directory paths
72
- # @param remote [String] the base destination path on the remote host
73
- # @return [Hash] report hash, keyed by the local MD5 digest
74
- def upload(locals, remote)
75
- files = nil
76
- report = nil
77
- remote = remote.to_s
78
-
79
- elapsed1 = Benchmark.measure do
80
- files = make_files_hash(Array(locals), remote)
81
- report = check_files(files)
82
- merge_with_report!(files, report)
83
- reconcile_destinations!(files)
84
- end
85
- total_size = total_base64_transfer_size(files)
86
-
87
- elapsed2 = Benchmark.measure do
88
- report = stream_upload_files(files) do |local_path, xfered|
89
- yield xfered, total_size, local_path, remote if block_given?
90
- end
91
- merge_with_report!(files, report)
92
- end
93
-
94
- elapsed3 = Benchmark.measure do
95
- report = extract_files(files)
96
- merge_with_report!(files, report)
97
- cleanup(files)
98
- end
99
-
100
- logger.debug(
101
- "Uploaded #{files.keys.size} items " \
102
- "dirty_check: #{duration(elapsed1.real)} " \
103
- "stream_files: #{duration(elapsed2.real)} " \
104
- "extract: #{duration(elapsed3.real)} " \
105
- )
106
-
107
- [total_size, files]
108
- end
109
-
110
- private
111
-
112
- # @return [String] the Array pack template for Base64 encoding a stream
113
- # of data
114
- # @api private
115
- BASE64_PACK = 'm0'.freeze
116
-
117
- # @return [String] the directory where temporary upload artifacts are
118
- # persisted
119
- # @api private
120
- TEMP_UPLOAD_DIRECTORY = '$env:TEMP\\winrm-upload'.freeze
121
-
122
- # @return [#debug,#debug?] the logger
123
- # @api private
124
- attr_reader :logger
125
-
126
- # @return [Winrm::Shells::Powershell] a WinRM Powershell shell
127
- # @api private
128
- attr_reader :shell
129
-
130
- # @return [Integer] the maximum number of bytes to send per request
131
- # when streaming a file. This is optimized to send as much data
132
- # as allowed in a single PSRP fragment
133
- # @api private
134
- def max_encoded_write
135
- @max_encoded_write ||= begin
136
- empty_command = WinRM::PSRP::MessageFactory.create_pipeline_message(
137
- '00000000-0000-0000-0000-000000000000',
138
- '00000000-0000-0000-0000-000000000000',
139
- stream_command('')
140
- )
141
- shell.max_fragment_blob_size - empty_command.bytes.length
142
- end
143
- end
144
-
145
- # Examines the files and corrects the file destination if it is
146
- # targeting an existing folder. In this case, the destination path
147
- # will have the base name of the source file appended. This only
148
- # applies to file uploads and not to folder uploads.
149
- #
150
- # @param files [Hash] files hash, keyed by the local MD5 digest
151
- # @return [Hash] a report hash, keyed by the local MD5 digest
152
- # @api private
153
- def reconcile_destinations!(files)
154
- files.each do |_, data|
155
- if data['target_is_folder'] == 'True'
156
- data['dst'] = File.join(data['dst'], File.basename(data['src']))
157
- end
158
- end
159
- end
160
-
161
- # Adds an entry to a files Hash (keyed by local MD5 digest) for a
162
- # directory. When a directory is added, a temporary Zip file is created
163
- # containing the contents of the directory and any file-related data
164
- # such as MD5 digest, size, etc. will be referring to the Zip file.
165
- #
166
- # @param hash [Hash] hash to be mutated
167
- # @param dir [String] directory path to be Zipped and added
168
- # @param remote [String] path to destination on remote host
169
- # @api private
170
- def add_directory_hash!(hash, dir, remote)
171
- logger.debug "creating hash for directory #{remote}"
172
- zip_io = TmpZip.new(dir, logger)
173
- zip_md5 = md5sum(zip_io.path)
174
-
175
- hash[zip_md5] = {
176
- 'src' => dir,
177
- 'src_zip' => zip_io.path.to_s,
178
- 'zip_io' => zip_io,
179
- 'tmpzip' => "#{TEMP_UPLOAD_DIRECTORY}\\tmpzip-#{zip_md5}.zip",
180
- 'dst' => "#{remote}\\#{File.basename(dir)}",
181
- 'size' => File.size(zip_io.path)
182
- }
183
- end
184
-
185
- # Adds an entry to a files Hash (keyed by local MD5 digest) for a file.
186
- #
187
- # @param hash [Hash] hash to be mutated
188
- # @param local [String] file path
189
- # @param remote [String] path to destination on remote host
190
- # @api private
191
- def add_file_hash!(hash, local, remote)
192
- logger.debug "creating hash for file #{remote}"
193
-
194
- hash[md5sum(local)] = {
195
- 'src' => local,
196
- 'dst' => remote,
197
- 'size' => File.size(local)
198
- }
199
- end
200
-
201
- # Runs the check_files PowerShell script against a collection of
202
- # destination path/MD5 checksum pairs. The PowerShell script returns
203
- # its results as a CSV-formatted report which is converted into a Ruby
204
- # Hash.
205
- #
206
- # @param files [Hash] files hash, keyed by the local MD5 digest
207
- # @return [Hash] a report hash, keyed by the local MD5 digest
208
- # @api private
209
- def check_files(files)
210
- logger.debug 'Running check_files.ps1'
211
- hash_file = check_files_ps_hash(files)
212
- script = WinRM::FS::Scripts.render('check_files', hash_file: hash_file)
213
- parse_response(shell.run(script))
214
- end
215
-
216
- # Constructs a collection of destination path/MD5 checksum pairs as a
217
- # String representation of the contents of a PowerShell Hash Table.
218
- #
219
- # @param files [Hash] files hash, keyed by the local MD5 digest
220
- # @return [String] the inner contents of a PowerShell Hash Table
221
- # @api private
222
- def check_files_ps_hash(files)
223
- hash = files.map do |md5, data|
224
- [
225
- md5,
226
- {
227
- 'target' => data.fetch('tmpzip', data['dst']),
228
- 'src_basename' => File.basename(data['src']),
229
- 'dst' => data['dst']
230
- }
231
- ]
232
- end
233
- ps_hash(Hash[hash])
234
- end
235
-
236
- # Performs any final cleanup on the report Hash and removes any
237
- # temporary files/resources used in the upload task.
238
- #
239
- # @param files [Hash] a files hash
240
- # @api private
241
- def cleanup(files)
242
- files.select { |_, data| data.key?('zip_io') }.each do |md5, data|
243
- data.fetch('zip_io').unlink
244
- files.fetch(md5).delete('zip_io')
245
- logger.debug "Cleaned up src_zip #{data['src_zip']}"
246
- end
247
- end
248
-
249
- # Runs the extract_files PowerShell script against a collection of
250
- # temporary file/destination path pairs. The PowerShell script returns
251
- # its results as a CSV-formatted report which is converted into a Ruby
252
- # Hash. The script will not be invoked if there are no zip files
253
- # present in the incoming files Hash.
254
- #
255
- # @param files [Hash] files hash, keyed by the local MD5 digest
256
- # @return [Hash] a report hash, keyed by the local MD5 digest
257
- # @api private
258
- def extract_files(files)
259
- extracted_files = extract_files_ps_hash(files)
260
-
261
- if extracted_files == ps_hash({})
262
- logger.debug 'No remote files to extract, skipping'
263
- {}
264
- else
265
- logger.debug 'Running extract_files.ps1'
266
- script = WinRM::FS::Scripts.render('extract_files', hash_file: extracted_files)
267
-
268
- parse_response(shell.run(script))
269
- end
270
- end
271
-
272
- # Constructs a collection of temporary file/destination path pairs for
273
- # all zipped folders as a String representation of the contents of a
274
- # PowerShell Hash Table.
275
- #
276
- # @param files [Hash] files hash, keyed by the local MD5 digest
277
- # @return [String] the inner contents of a PowerShell Hash Table
278
- # @api private
279
- def extract_files_ps_hash(files)
280
- file_data = files.select { |_, data| data.key?('tmpzip') }
281
-
282
- result = file_data.map do |md5, data|
283
- val = { 'dst' => data['dst'] }
284
- val['tmpzip'] = data['tmpzip'] if data['tmpzip']
285
-
286
- [md5, val]
287
- end
288
-
289
- ps_hash(Hash[result])
290
- end
291
-
292
- # Returns a formatted string representing a duration in seconds.
293
- #
294
- # @param total [Integer] the total number of seconds
295
- # @return [String] a formatted string of the form (XmYY.00s)
296
- def duration(total)
297
- total = 0 if total.nil?
298
- minutes = (total / 60).to_i
299
- seconds = (total - (minutes * 60))
300
- format('(%dm%.2fs)', minutes, seconds)
301
- end
302
-
303
- # Contructs a Hash of files or directories, keyed by the local MD5
304
- # digest. Each file entry has a source and destination set, at a
305
- # minimum.
306
- #
307
- # @param locals [Array<String>] a collection of local files or
308
- # directories
309
- # @param remote [String] the base destination path on the remote host
310
- # @return [Hash] files hash, keyed by the local MD5 digest
311
- # @api private
312
- def make_files_hash(locals, remote)
313
- hash = {}
314
- locals.each do |local|
315
- local = local.to_s
316
- expanded = File.expand_path(local)
317
- expanded += local[-1] if local.end_with?('/', '\\')
318
-
319
- if File.file?(expanded)
320
- add_file_hash!(hash, expanded, remote)
321
- elsif File.directory?(expanded)
322
- add_directory_hash!(hash, expanded, remote)
323
- else
324
- fail Errno::ENOENT, "No such file or directory #{expanded}"
325
- end
326
- end
327
- hash
328
- end
329
-
330
- # @return [String] the MD5 digest of a local file
331
- # @api private
332
- def md5sum(local)
333
- Digest::MD5.file(local).hexdigest
334
- end
335
-
336
- # Destructively merges a report Hash into an existing files Hash.
337
- # **Note:** this method mutates the files Hash.
338
- #
339
- # @param files [Hash] files hash, keyed by the local MD5 digest
340
- # @param report [Hash] report hash, keyed by the local MD5 digest
341
- # @api private
342
- def merge_with_report!(files, report)
343
- files.merge!(report) { |_, oldval, newval| oldval.merge(newval) }
344
- end
345
-
346
- # @param depth [Integer] number of padding characters (default: `0`)
347
- # @return [String] a whitespace padded string of the given length
348
- # @api private
349
- def pad(depth = 0)
350
- ' ' * depth
351
- end
352
-
353
- # Parses response of a PowerShell script or CMD command which contains
354
- # a CSV-formatted document in the standard output stream.
355
- #
356
- # @param output [WinRM::Output] output object with stdout, stderr, and
357
- # exit code
358
- # @return [Hash] report hash, keyed by the local MD5 digest
359
- # @api private
360
- def parse_response(output)
361
- exitcode = output.exitcode
362
- stderr = output.stderr
363
-
364
- if exitcode != 0
365
- fail FileTransporterFailed, "[#{self.class}] Upload failed " \
366
- "(exitcode: #{exitcode})\n#{stderr}"
367
- elsif stderr != '\r\n' && stderr != ''
368
- fail FileTransporterFailed, "[#{self.class}] Upload failed " \
369
- "(exitcode: 0), but stderr present\n#{stderr}"
370
- end
371
-
372
- logger.debug 'Parsing CSV Response'
373
- logger.debug output.stdout
374
-
375
- array = CSV.parse(output.stdout, headers: true).map(&:to_hash)
376
- array.each { |h| h.each { |key, value| h[key] = nil if value == '' } }
377
- Hash[array.map { |entry| [entry.fetch('src_md5'), entry] }]
378
- end
379
-
380
- # Converts a Ruby hash into a PowerShell hash table, represented in a
381
- # String.
382
- #
383
- # @param obj [Object] source Hash or object when used in recursive
384
- # calls
385
- # @param depth [Integer] padding depth, used in recursive calls
386
- # (default: `0`)
387
- # @return [String] a PowerShell hash table
388
- # @api private
389
- def ps_hash(obj, depth = 0)
390
- if obj.is_a?(Hash)
391
- obj.map do |k, v|
392
- %(#{pad(depth + 2)}#{ps_hash(k)} = #{ps_hash(v, depth + 2)})
393
- end.join(";\n").insert(0, "@{\n").insert(-1, "\n#{pad(depth)}}")
394
- else
395
- %("#{obj}")
396
- end
397
- end
398
-
399
- # Uploads an IO stream to a Base64-encoded destination file.
400
- #
401
- # **Implementation Note:** Some of the code in this method may appear
402
- # slightly too dense and while adding additional variables would help,
403
- # the code is written very precisely to avoid unwanted allocations
404
- # which will bloat the Ruby VM's object space (and memory footprint).
405
- # The goal here is to stream potentially large files to a remote host
406
- # while not loading the entire file into memory first, then Base64
407
- # encoding it--duplicating the file in memory again.
408
- #
409
- # @param input_io [#read] a readable stream or object to be uploaded
410
- # @param dest [String] path to the destination file on the remote host
411
- # @return [Integer,Integer] the number of resulting upload chunks and
412
- # the number of bytes transferred to the remote host
413
- # @api private
414
- def stream_upload(input_io, dest)
415
- read_size = ((max_encoded_write - dest.length) / 4) * 3
416
- chunk, bytes = 1, 0
417
- buffer = ''
418
- shell.run(<<-EOS
419
- $to = $ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath("#{dest}")
420
- $parent = Split-Path $to
421
- if(!(Test-path $parent)) { mkdir $parent | Out-Null }
422
- $fileStream = New-Object -TypeName System.IO.FileStream -ArgumentList @(
423
- $to,
424
- [system.io.filemode]::Create,
425
- [System.io.FileAccess]::Write,
426
- [System.IO.FileShare]::ReadWrite
427
- )
428
-
429
- # Powershell caches ScrpitBlocks in a dictionary
430
- # keyed on the script block text. Thats just great
431
- # unless the script is super large and called a gillion
432
- # times like we might do. In such a case it will saturate the
433
- # Large Object Heap and lead to Out Of Memory exceptions
434
- # for large files or folders. So we call the internal method
435
- # ClearScriptBlockCache to clear it.
436
- $bindingFlags= [Reflection.BindingFlags] "NonPublic,Static"
437
- $method = [scriptblock].GetMethod("ClearScriptBlockCache", $bindingFlags)
438
- EOS
439
- )
440
-
441
- while input_io.read(read_size, buffer)
442
- bytes += (buffer.bytesize / 3 * 4)
443
- shell.run(stream_command([buffer].pack(BASE64_PACK)))
444
- logger.debug "Wrote chunk #{chunk} for #{dest}" if chunk % 25 == 0
445
- chunk += 1
446
- yield bytes if block_given?
447
- end
448
- shell.run('$fileStream.Dispose()')
449
- buffer = nil # rubocop:disable Lint/UselessAssignment
450
-
451
- [chunk - 1, bytes]
452
- end
453
-
454
- def stream_command(encoded_bytes)
455
- <<-EOS
456
- if($method) { $method.Invoke($Null, $Null) }
457
- $bytes=[Convert]::FromBase64String('#{encoded_bytes}')
458
- $fileStream.Write($bytes, 0, $bytes.length)
459
- EOS
460
- end
461
-
462
- # Uploads a local file.
463
- #
464
- # @param src [String] path to a local file
465
- # @param dest [String] path to the file on the remote host
466
- # @return [Integer,Integer] the number of resulting upload chunks and
467
- # the number of bytes transferred to the remote host
468
- # @api private
469
- def stream_upload_file(src, dest, &block)
470
- logger.debug "Uploading #{src} to #{dest}"
471
- chunks, bytes = 0, 0
472
- elapsed = Benchmark.measure do
473
- File.open(src, 'rb') do |io|
474
- chunks, bytes = stream_upload(io, dest, &block)
475
- end
476
- end
477
- logger.debug(
478
- "Finished uploading #{src} to #{dest} " \
479
- "(#{bytes.to_f / 1000} KB over #{chunks} chunks) " \
480
- "in #{duration(elapsed.real)}"
481
- )
482
-
483
- [chunks, bytes]
484
- end
485
-
486
- # Uploads a collection of "dirty" files to the remote host as
487
- # Base64-encoded temporary files. A "dirty" file is one which has the
488
- # `"chk_dirty"` option set to `"True"` in the incoming files Hash.
489
- #
490
- # @param files [Hash] files hash, keyed by the local MD5 digest
491
- # @return [Hash] a report hash, keyed by the local MD5 digest
492
- # @api private
493
- def stream_upload_files(files)
494
- response = {}
495
- files.each do |md5, data|
496
- src = data.fetch('src_zip', data['src'])
497
- if data['chk_dirty'] == 'True'
498
- response[md5] = { 'dest' => data['tmpzip'] || data['dst'] }
499
- chunks, bytes = stream_upload_file(src, data['tmpzip'] || data['dst']) do |xfered|
500
- yield data['src'], xfered
501
- end
502
- response[md5]['chunks'] = chunks
503
- response[md5]['xfered'] = bytes
504
- else
505
- logger.debug "File #{data['dst']} is up to date, skipping"
506
- end
507
- end
508
- response
509
- end
510
-
511
- # Total by byte count to be transferred.
512
- # Calculates count based on the sum of base64 encoded content size
513
- # of all files base 64 that are dirty.
514
- #
515
- # @param files [Hash] files hash, keyed by the local MD5 digest
516
- # @return [Fixnum] total byte size
517
- # @api private
518
- def total_base64_transfer_size(files)
519
- size = 0
520
- files.values.each { |file| size += file['size'] if file['chk_dirty'] == 'True' }
521
- size / 3 * 4
522
- end
523
- end
524
- # rubocop:enable MethodLength, AbcSize, ClassLength
525
- end
526
- end
527
- end
1
+ # -*- encoding: utf-8 -*-
2
+ #
3
+ # Author:: Fletcher (<fnichol@nichol.ca>)
4
+ #
5
+ # Copyright (C) 2015, Fletcher Nichol
6
+ #
7
+ # Licensed under the Apache License, Version 2.0 (the "License");
8
+ # you may not use this file except in compliance with the License.
9
+ # You may obtain a copy of the License at
10
+ #
11
+ # http://www.apache.org/licenses/LICENSE-2.0
12
+ #
13
+ # Unless required by applicable law or agreed to in writing, software
14
+ # distributed under the License is distributed on an "AS IS" BASIS,
15
+ # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
16
+ # See the License for the specific language governing permissions and
17
+ # limitations under the License.
18
+
19
+ require 'benchmark'
20
+ require 'csv'
21
+ require 'digest'
22
+ require 'securerandom'
23
+ require 'stringio'
24
+
25
+ require 'winrm-fs/core/tmp_zip'
26
+
27
+ module WinRM
28
+ module FS
29
+ module Core
30
+ # Wrapped exception for any internally raised WinRM-related errors.
31
+ #
32
+ # @author Fletcher Nichol <fnichol@nichol.ca>
33
+ class FileTransporterFailed < ::WinRM::WinRMError; end
34
+ # rubocop:disable MethodLength, AbcSize, ClassLength
35
+
36
+ # Object which can upload one or more files or directories to a remote
37
+ # host over WinRM using PowerShell scripts and CMD commands. Note that
38
+ # this form of file transfer is *not* ideal and extremely costly on both
39
+ # the local and remote sides. Great pains are made to minimize round
40
+ # trips to the remote host and to minimize the number of PowerShell
41
+ # sessions being invoked which can be 2 orders of magnitude more
42
+ # expensive than vanilla CMD commands.
43
+ #
44
+ # This object is supported by a `PowerShell` instance as it
45
+ # depends on the `#run` API contract.
46
+ #
47
+ # An optional logger can be supplied, assuming it can respond to the
48
+ # `#debug` and `#debug?` messages.
49
+ #
50
+ # @author Fletcher Nichol <fnichol@nichol.ca>
51
+ # @author Matt Wrock <matt@mattwrock.com>
52
+ class FileTransporter
53
+ # Creates a FileTransporter given a PowerShell object.
54
+ #
55
+ # @param shell [PowerShell] a winrm PowerShell object
56
+ def initialize(shell, opts = {})
57
+ @shell = shell
58
+ @logger = shell.logger
59
+ @id_generator = opts.fetch(:id_generator) { -> { SecureRandom.uuid } }
60
+ end
61
+
62
+ # Uploads a collection of files and/or directories to the remote host.
63
+ #
64
+ # **TODO Notes:**
65
+ # * options could specify zip mode, zip options, etc.
66
+ # * maybe option to set tmpfile base dir to override $env:PATH?
67
+ # * progress yields block like net-scp progress
+ # * final API: def upload(locals, remote, _options = {}, &_progress)
+ #
+ # @param locals [Array<String>,String] one or more local file or
+ # directory paths
+ # @param remote [String] the base destination path on the remote host
+ # @return [Array(Integer, Hash)] the total Base64 transfer size and the
+ # files hash, keyed by the local MD5 digest
+ def upload(locals, remote)
+ files = nil
+ report = nil
+ remote = remote.to_s
+
+ elapsed1 = Benchmark.measure do
+ files = make_files_hash(Array(locals), remote)
+ report = check_files(files)
+ merge_with_report!(files, report)
+ reconcile_destinations!(files)
+ end
+ total_size = total_base64_transfer_size(files)
+
+ elapsed2 = Benchmark.measure do
+ report = stream_upload_files(files) do |local_path, xfered|
+ yield xfered, total_size, local_path, remote if block_given?
+ end
+ merge_with_report!(files, report)
+ end
+
+ elapsed3 = Benchmark.measure do
+ report = extract_files(files)
+ merge_with_report!(files, report)
+ cleanup(files)
+ end
+
+ logger.debug(
+ "Uploaded #{files.keys.size} items " \
+ "dirty_check: #{duration(elapsed1.real)} " \
+ "stream_files: #{duration(elapsed2.real)} " \
+ "extract: #{duration(elapsed3.real)} " \
+ )
+
+ [total_size, files]
+ end
+
+ private
+
+ # @return [String] the Array pack template for Base64 encoding a stream
+ # of data
+ # @api private
+ BASE64_PACK = 'm0'.freeze
+
+ # @return [String] the directory where temporary upload artifacts are
+ # persisted
+ # @api private
+ TEMP_UPLOAD_DIRECTORY = '$env:TEMP\\winrm-upload'.freeze
+
+ # @return [#debug,#debug?] the logger
+ # @api private
+ attr_reader :logger
+
+ # @return [Winrm::Shells::Powershell] a WinRM Powershell shell
+ # @api private
+ attr_reader :shell
+
+ # @return [Integer] the maximum number of bytes to send per request
+ # when streaming a file. This is optimized to send as much data
+ # as allowed in a single PSRP fragment
+ # @api private
+ def max_encoded_write
+ @max_encoded_write ||= begin
+ empty_command = WinRM::PSRP::MessageFactory.create_pipeline_message(
+ '00000000-0000-0000-0000-000000000000',
+ '00000000-0000-0000-0000-000000000000',
+ stream_command('')
+ )
+ shell.max_fragment_blob_size - empty_command.bytes.length
+ end
+ end
+
+ # Examines the files and corrects the file destination if it is
+ # targeting an existing folder. In this case, the destination path
+ # will have the base name of the source file appended. This only
+ # applies to file uploads and not to folder uploads.
+ #
+ # @param files [Hash] files hash, keyed by the local MD5 digest
+ # @return [Hash] a report hash, keyed by the local MD5 digest
+ # @api private
+ def reconcile_destinations!(files)
+ files.each do |_, data|
+ if data['target_is_folder'] == 'True'
+ data['dst'] = File.join(data['dst'], File.basename(data['src']))
+ end
+ end
+ end
+
+ # Adds an entry to a files Hash (keyed by local MD5 digest) for a
+ # directory. When a directory is added, a temporary Zip file is created
+ # containing the contents of the directory, and any file-related data
+ # such as the MD5 digest and size will refer to the Zip file.
+ #
+ # @param hash [Hash] hash to be mutated
+ # @param dir [String] directory path to be Zipped and added
+ # @param remote [String] path to destination on remote host
+ # @api private
+ def add_directory_hash!(hash, dir, remote)
+ logger.debug "creating hash for directory #{remote}"
+ zip_io = TmpZip.new(dir, logger)
+ zip_md5 = md5sum(zip_io.path)
+
+ hash[zip_md5] = {
+ 'src' => dir,
+ 'src_zip' => zip_io.path.to_s,
+ 'zip_io' => zip_io,
+ 'tmpzip' => "#{TEMP_UPLOAD_DIRECTORY}\\tmpzip-#{zip_md5}.zip",
+ 'dst' => "#{remote}\\#{File.basename(dir)}",
+ 'size' => File.size(zip_io.path)
+ }
+ end
+
+ # Adds an entry to a files Hash (keyed by local MD5 digest) for a file.
+ #
+ # @param hash [Hash] hash to be mutated
+ # @param local [String] file path
+ # @param remote [String] path to destination on remote host
+ # @api private
+ def add_file_hash!(hash, local, remote)
+ logger.debug "creating hash for file #{remote}"
+
+ hash[md5sum(local)] = {
+ 'src' => local,
+ 'dst' => remote,
+ 'size' => File.size(local)
+ }
+ end
+
+ # Runs the check_files PowerShell script against a collection of
+ # destination path/MD5 checksum pairs. The PowerShell script returns
+ # its results as a CSV-formatted report which is converted into a Ruby
+ # Hash.
+ #
+ # @param files [Hash] files hash, keyed by the local MD5 digest
+ # @return [Hash] a report hash, keyed by the local MD5 digest
+ # @api private
+ def check_files(files)
+ logger.debug 'Running check_files.ps1'
+ hash_file = check_files_ps_hash(files)
+ script = WinRM::FS::Scripts.render('check_files', hash_file: hash_file)
+ parse_response(shell.run(script))
+ end
+
+ # Constructs a collection of destination path/MD5 checksum pairs as a
+ # String representation of the contents of a PowerShell Hash Table.
+ #
+ # @param files [Hash] files hash, keyed by the local MD5 digest
+ # @return [String] the inner contents of a PowerShell Hash Table
+ # @api private
+ def check_files_ps_hash(files)
+ hash = files.map do |md5, data|
+ [
+ md5,
+ {
+ 'target' => data.fetch('tmpzip', data['dst']),
+ 'src_basename' => File.basename(data['src']),
+ 'dst' => data['dst']
+ }
+ ]
+ end
+ ps_hash(Hash[hash])
+ end
+
+ # Performs any final cleanup on the report Hash and removes any
+ # temporary files/resources used in the upload task.
+ #
+ # @param files [Hash] a files hash
+ # @api private
+ def cleanup(files)
+ files.select { |_, data| data.key?('zip_io') }.each do |md5, data|
+ data.fetch('zip_io').unlink
+ files.fetch(md5).delete('zip_io')
+ logger.debug "Cleaned up src_zip #{data['src_zip']}"
+ end
+ end
+
+ # Runs the extract_files PowerShell script against a collection of
+ # temporary file/destination path pairs. The PowerShell script returns
+ # its results as a CSV-formatted report which is converted into a Ruby
+ # Hash. The script will not be invoked if there are no zip files
+ # present in the incoming files Hash.
+ #
+ # @param files [Hash] files hash, keyed by the local MD5 digest
+ # @return [Hash] a report hash, keyed by the local MD5 digest
+ # @api private
+ def extract_files(files)
+ extracted_files = extract_files_ps_hash(files)
+
+ if extracted_files == ps_hash({})
+ logger.debug 'No remote files to extract, skipping'
+ {}
+ else
+ logger.debug 'Running extract_files.ps1'
+ script = WinRM::FS::Scripts.render('extract_files', hash_file: extracted_files)
+
+ parse_response(shell.run(script))
+ end
+ end
+
+ # Constructs a collection of temporary file/destination path pairs for
+ # all zipped folders as a String representation of the contents of a
+ # PowerShell Hash Table.
+ #
+ # @param files [Hash] files hash, keyed by the local MD5 digest
+ # @return [String] the inner contents of a PowerShell Hash Table
+ # @api private
+ def extract_files_ps_hash(files)
+ file_data = files.select { |_, data| data.key?('tmpzip') }
+
+ result = file_data.map do |md5, data|
+ val = { 'dst' => data['dst'] }
+ val['tmpzip'] = data['tmpzip'] if data['tmpzip']
+
+ [md5, val]
+ end
+
+ ps_hash(Hash[result])
+ end
+
+ # Returns a formatted string representing a duration in seconds.
+ #
+ # @param total [Integer] the total number of seconds
+ # @return [String] a formatted string of the form (XmYY.00s)
+ def duration(total)
+ total = 0 if total.nil?
+ minutes = (total / 60).to_i
+ seconds = (total - (minutes * 60))
+ format('(%dm%.2fs)', minutes, seconds)
+ end
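The `duration` helper above is self-contained, so it can be exercised outside the class; this sketch reproduces it verbatim to show the `(XmYY.00s)` output format:

```ruby
# Standalone copy of the duration helper above, reproduced here only to
# illustrate the formatted output.
def duration(total)
  total = 0 if total.nil?
  minutes = (total / 60).to_i
  seconds = total - (minutes * 60)
  format('(%dm%.2fs)', minutes, seconds)
end

puts duration(125.5) # => "(2m5.50s)"
puts duration(nil)   # => "(0m0.00s)"
```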
+
+ # Constructs a Hash of files or directories, keyed by the local MD5
+ # digest. Each file entry has a source and destination set, at a
+ # minimum.
+ #
+ # @param locals [Array<String>] a collection of local files or
+ # directories
+ # @param remote [String] the base destination path on the remote host
+ # @return [Hash] files hash, keyed by the local MD5 digest
+ # @api private
+ def make_files_hash(locals, remote)
+ hash = {}
+ locals.each do |local|
+ local = local.to_s
+ expanded = File.expand_path(local)
+ expanded += local[-1] if local.end_with?('/', '\\')
+
+ if File.file?(expanded)
+ add_file_hash!(hash, expanded, remote)
+ elsif File.directory?(expanded)
+ add_directory_hash!(hash, expanded, remote)
+ else
+ fail Errno::ENOENT, "No such file or directory #{expanded}"
+ end
+ end
+ hash
+ end
+
+ # @return [String] the MD5 digest of a local file
+ # @api private
+ def md5sum(local)
+ Digest::MD5.file(local).hexdigest
+ end
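`Digest::MD5.file` streams the file from disk rather than slurping it into memory, which is why `md5sum` stays cheap even for large sources. A quick check against a throwaway file:

```ruby
require 'digest'
require 'tempfile'

# Write a small file and hash it the same way md5sum above does.
digest = Tempfile.create('md5-demo') do |f|
  f.write('hello')
  f.flush
  Digest::MD5.file(f.path).hexdigest
end

puts digest # => "5d41402abc4b2a76b9719d911017c592"
```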
+
+ # Destructively merges a report Hash into an existing files Hash.
+ # **Note:** this method mutates the files Hash.
+ #
+ # @param files [Hash] files hash, keyed by the local MD5 digest
+ # @param report [Hash] report hash, keyed by the local MD5 digest
+ # @api private
+ def merge_with_report!(files, report)
+ files.merge!(report) { |_, oldval, newval| oldval.merge(newval) }
+ end
+
+ # @param depth [Integer] number of padding characters (default: `0`)
+ # @return [String] a whitespace padded string of the given length
+ # @api private
+ def pad(depth = 0)
+ ' ' * depth
+ end
+
+ # Parses response of a PowerShell script or CMD command which contains
+ # a CSV-formatted document in the standard output stream.
+ #
+ # @param output [WinRM::Output] output object with stdout, stderr, and
+ # exit code
+ # @return [Hash] report hash, keyed by the local MD5 digest
+ # @api private
+ def parse_response(output)
+ exitcode = output.exitcode
+ stderr = output.stderr
+
+ if exitcode != 0
+ fail FileTransporterFailed, "[#{self.class}] Upload failed " \
+ "(exitcode: #{exitcode})\n#{stderr}"
+ elsif stderr != '\r\n' && stderr != ''
+ fail FileTransporterFailed, "[#{self.class}] Upload failed " \
+ "(exitcode: 0), but stderr present\n#{stderr}"
+ end
+
+ logger.debug 'Parsing CSV Response'
+ logger.debug output.stdout
+
+ array = CSV.parse(output.stdout, headers: true).map(&:to_hash)
+ array.each { |h| h.each { |key, value| h[key] = nil if value == '' } }
+ Hash[array.map { |entry| [entry.fetch('src_md5'), entry] }]
+ end
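The CSV-to-Hash conversion above can be illustrated with a canned stdout payload. The report rows are keyed by `src_md5`; the sample row and column values below are invented for the demo:

```ruby
require 'csv'

# Hypothetical CSV payload in the shape the PowerShell scripts emit.
stdout = "src_md5,chk_exists,chk_dirty\nabc123,True,False\n"

# Same parsing steps as parse_response above.
array = CSV.parse(stdout, headers: true).map(&:to_hash)
array.each { |h| h.each { |key, value| h[key] = nil if value == '' } }
report = Hash[array.map { |entry| [entry.fetch('src_md5'), entry] }]

puts report['abc123']['chk_dirty'] # => "False"
```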
+
+ # Converts a Ruby hash into a PowerShell hash table, represented in a
+ # String.
+ #
+ # @param obj [Object] source Hash or object when used in recursive
+ # calls
+ # @param depth [Integer] padding depth, used in recursive calls
+ # (default: `0`)
+ # @return [String] a PowerShell hash table
+ # @api private
+ def ps_hash(obj, depth = 0)
+ if obj.is_a?(Hash)
+ obj.map do |k, v|
+ %(#{pad(depth + 2)}#{ps_hash(k)} = #{ps_hash(v, depth + 2)})
+ end.join(";\n").insert(0, "@{\n").insert(-1, "\n#{pad(depth)}}")
+ else
+ %("#{obj}")
+ end
+ end
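Standalone copies of the `pad` and `ps_hash` helpers above show the PowerShell hash table literal they emit (the sample keys are made up):

```ruby
# Verbatim copies of the helpers above, lifted out of the class.
def pad(depth = 0)
  ' ' * depth
end

def ps_hash(obj, depth = 0)
  if obj.is_a?(Hash)
    obj.map do |k, v|
      %(#{pad(depth + 2)}#{ps_hash(k)} = #{ps_hash(v, depth + 2)})
    end.join(";\n").insert(0, "@{\n").insert(-1, "\n#{pad(depth)}}")
  else
    %("#{obj}")
  end
end

puts ps_hash('dst' => 'C:\\dest', 'size' => 42)
# @{
#   "dst" = "C:\dest";
#   "size" = "42"
# }
```

Non-Hash leaves are simply stringified and double-quoted, which is why the size appears as `"42"`.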
+
+ # Uploads an IO stream to a Base64-encoded destination file.
+ #
+ # **Implementation Note:** Some of the code in this method may appear
+ # slightly too dense and while adding additional variables would help,
+ # the code is written very precisely to avoid unwanted allocations
+ # which will bloat the Ruby VM's object space (and memory footprint).
+ # The goal here is to stream potentially large files to a remote host
+ # while not loading the entire file into memory first, then Base64
+ # encoding it--duplicating the file in memory again.
+ #
+ # @param input_io [#read] a readable stream or object to be uploaded
+ # @param dest [String] path to the destination file on the remote host
+ # @return [Integer,Integer] the number of resulting upload chunks and
+ # the number of bytes transferred to the remote host
+ # @api private
+ def stream_upload(input_io, dest)
+ read_size = ((max_encoded_write - dest.length) / 4) * 3
+ chunk, bytes = 1, 0
+ buffer = ''
+ shell.run(<<-EOS
+ $to = $ExecutionContext.SessionState.Path.GetUnresolvedProviderPathFromPSPath("#{dest}")
+ $parent = Split-Path $to
+ if(!(Test-path $parent)) { mkdir $parent | Out-Null }
+ $fileStream = New-Object -TypeName System.IO.FileStream -ArgumentList @(
+ $to,
+ [system.io.filemode]::Create,
+ [System.io.FileAccess]::Write,
+ [System.IO.FileShare]::ReadWrite
+ )
+
+ # PowerShell caches ScriptBlocks in a dictionary
+ # keyed on the script block text. That's just great
+ # unless the script is super large and called a gazillion
+ # times like we might do. In such a case it will saturate the
+ # Large Object Heap and lead to Out Of Memory exceptions
+ # for large files or folders. So we call the internal method
+ # ClearScriptBlockCache to clear it.
+ $bindingFlags = [Reflection.BindingFlags] "NonPublic,Static"
+ $method = [scriptblock].GetMethod("ClearScriptBlockCache", $bindingFlags)
+ EOS
+ )
+
+ while input_io.read(read_size, buffer)
+ bytes += (buffer.bytesize / 3 * 4)
+ shell.run(stream_command([buffer].pack(BASE64_PACK)))
+ logger.debug "Wrote chunk #{chunk} for #{dest}" if chunk % 25 == 0
+ chunk += 1
+ yield bytes if block_given?
+ end
+ shell.run('$fileStream.Dispose()')
+ buffer = nil # rubocop:disable Lint/UselessAssignment
+
+ [chunk - 1, bytes]
+ end
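The read/encode loop above keeps `read_size` a multiple of 3, so every chunk Base64-encodes without internal `=` padding and the remote side can simply append decoded chunks to the file stream. This local sketch reproduces the same loop, writing into a string instead of a remote `$fileStream`; the 64-byte budget is an arbitrary stand-in for `max_encoded_write`:

```ruby
require 'stringio'

BASE64_PACK = 'm0'.freeze # same strict-Base64 pack template as above
max_encoded_write = 64    # hypothetical PSRP fragment budget for the demo
read_size = (max_encoded_write / 4) * 3

io = StringIO.new('The quick brown fox jumps over the lazy dog' * 3)
decoded = +''
buffer = ''
while io.read(read_size, buffer)
  encoded = [buffer].pack(BASE64_PACK)    # what stream_command would send
  decoded << encoded.unpack1(BASE64_PACK) # what the remote side decodes
end

puts decoded == io.string # => true
```

Reusing `buffer` via `IO#read(length, outbuf)` is the allocation trick the Implementation Note describes: one buffer is recycled for every chunk instead of allocating a fresh string per read.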
+
+ def stream_command(encoded_bytes)
+ <<-EOS
+ if($method) { $method.Invoke($Null, $Null) }
+ $bytes=[Convert]::FromBase64String('#{encoded_bytes}')
+ $fileStream.Write($bytes, 0, $bytes.length)
+ EOS
+ end
+
+ # Uploads a local file.
+ #
+ # @param src [String] path to a local file
+ # @param dest [String] path to the file on the remote host
+ # @return [Integer,Integer] the number of resulting upload chunks and
+ # the number of bytes transferred to the remote host
+ # @api private
+ def stream_upload_file(src, dest, &block)
+ logger.debug "Uploading #{src} to #{dest}"
+ chunks, bytes = 0, 0
+ elapsed = Benchmark.measure do
+ File.open(src, 'rb') do |io|
+ chunks, bytes = stream_upload(io, dest, &block)
+ end
+ end
+ logger.debug(
+ "Finished uploading #{src} to #{dest} " \
+ "(#{bytes.to_f / 1000} KB over #{chunks} chunks) " \
+ "in #{duration(elapsed.real)}"
+ )
+
+ [chunks, bytes]
+ end
+
+ # Uploads a collection of "dirty" files to the remote host as
+ # Base64-encoded temporary files. A "dirty" file is one which has the
+ # `"chk_dirty"` option set to `"True"` in the incoming files Hash.
+ #
+ # @param files [Hash] files hash, keyed by the local MD5 digest
+ # @return [Hash] a report hash, keyed by the local MD5 digest
+ # @api private
+ def stream_upload_files(files)
+ response = {}
+ files.each do |md5, data|
+ src = data.fetch('src_zip', data['src'])
+ if data['chk_dirty'] == 'True'
+ response[md5] = { 'dest' => data['tmpzip'] || data['dst'] }
+ chunks, bytes = stream_upload_file(src, data['tmpzip'] || data['dst']) do |xfered|
+ yield data['src'], xfered
+ end
+ response[md5]['chunks'] = chunks
+ response[md5]['xfered'] = bytes
+ else
+ logger.debug "File #{data['dst']} is up to date, skipping"
+ end
+ end
+ response
+ end
+
+ # Total number of bytes to be transferred, computed as the sum of the
+ # Base64-encoded content size of all dirty files.
+ #
+ # @param files [Hash] files hash, keyed by the local MD5 digest
+ # @return [Integer] total byte size
+ # @api private
+ def total_base64_transfer_size(files)
+ size = 0
+ files.values.each { |file| size += file['size'] if file['chk_dirty'] == 'True' }
+ size / 3 * 4
+ end
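Base64 represents every 3 input bytes as 4 output characters, which is where the `size / 3 * 4` estimate comes from; for sizes that are multiples of 3 it is exact:

```ruby
# 3000 raw bytes encode to exactly 3000 / 3 * 4 = 4000 Base64 bytes
# with the strict 'm0' pack template used above.
payload = 'x' * 3000
encoded = [payload].pack('m0')

puts encoded.bytesize # => 4000
```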
+ end
+ # rubocop:enable MethodLength, AbcSize, ClassLength
+ end
+ end
+ end