cloud-files 5.5.0__py3-none-any.whl → 5.6.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
@@ -1,6 +1,6 @@
 Metadata-Version: 2.1
 Name: cloud-files
-Version: 5.5.0
+Version: 5.6.0
 Summary: Fast access to cloud storage and local FS.
 Home-page: https://github.com/seung-lab/cloud-files/
 Author: William Silversmith
@@ -30,6 +30,7 @@ Requires-Dist: google-auth >=1.10.0
 Requires-Dist: google-cloud-core >=1.1.0
 Requires-Dist: google-cloud-storage >=1.31.1
 Requires-Dist: google-crc32c >=1.0.0
+Requires-Dist: intervaltree
 Requires-Dist: orjson
 Requires-Dist: pathos
 Requires-Dist: protobuf >=3.3.0
@@ -41,6 +42,12 @@ Requires-Dist: urllib3 >=1.26.3
 Requires-Dist: zstandard
 Requires-Dist: rsa >=4.7.2
 Requires-Dist: fasteners
+Provides-Extra: apache
+Requires-Dist: lxml ; extra == 'apache'
+Provides-Extra: monitoring
+Requires-Dist: psutil ; extra == 'monitoring'
+Requires-Dist: intervaltree ; extra == 'monitoring'
+Requires-Dist: matplotlib ; extra == 'monitoring'
 Provides-Extra: numpy
 Requires-Dist: numpy ; extra == 'numpy'
 Provides-Extra: test
@@ -97,8 +104,20 @@ cf.touch([ "example", "example2" ])
 cf = CloudFile("gs://bucket/file1")
 info = cf.head()
 binary = cf.get()
+obj = cf.get_json()
 cf.put(binary)
+cf.put_json(obj)
 cf[:30] # get first 30 bytes of file
+
+num_bytes = cf.size() # get size in bytes (also in head)
+exists = cf.exists() # true or false
+cf.delete() # deletes the file
+cf.touch() # create the file if it doesn't exist
+cf.move("gs://example/destination/directory") # copy then delete source
+cf.transfer_from("gs://example/source/file.txt") # copies file efficiently
+cf.transfer_to("gs://example/dest/file.txt") # copies file efficiently
+
+path = cf.join([ path1, path2, path3 ]) # use the appropriate path separator
 ```
 
 CloudFiles was developed to access files from object storage without ever touching disk. The goal was to reliably and rapidly access a petabyte of image data broken down into tens to hundreds of millions of files being accessed in parallel across thousands of cores. CloudFiles has been used to process dozens of images, many of which were in the hundreds of terabytes range. It has reliably read and written tens of billions of files to date.
@@ -124,6 +143,7 @@ CloudFiles was developed to access files from object storage without ever touchi
 ```bash
 pip install cloud-files
 pip install cloud-files[test] # to enable testing with pytest
+pip install cloud-files[monitoring] # enable plotting network performance
 ```
 
 If you run into trouble installing dependencies, make sure you're using at least Python 3.6 and have updated pip. On Linux, some dependencies require manylinux2010 or manylinux2014 binaries which earlier versions of pip do not search for. MacOS, Linux, and Windows are supported platforms.
@@ -176,7 +196,9 @@ You can create the `google-secret.json` file [here](https://console.cloud.google
 
 ## API Documentation
 
-Note that the "Cloud Costs" mentioned below are current as of June 2020 and are subject to change. As of this writing, S3 and Google use identical cost structures for these operations.
+Note that the "Cloud Costs" mentioned below are current as of June 2020 and are subject to change. As of this writing, S3 and Google use identical cost structures for these operations.
+
+`CloudFile` is a more intuitive version of `CloudFiles` designed for managing single files instead of groups of files. See examples above. There is an analogous method for each `CloudFiles` method (where it makes sense).
 
 ### Constructor
 ```python
@@ -236,6 +258,10 @@ binary = cf['filename'][0:5] # same result, fetches 11 bytes
 >> b'hello' # represents byte range 0-4 inclusive of filename
 
 binaries = cf[:100] # download the first 100 files
+
+# Get the TransmissionMonitor object that records
+# the flight time of each file.
+binaries, tm = cf.get(..., return_recording=True)
 ```
 
 `get` supports several different styles of input. The simplest takes a scalar filename and returns the contents of that file. However, you can also specify lists of filenames, a byte range request, and lists of byte range requests. You can provide a generator or iterator as input as well. Order is not guaranteed.
@@ -265,6 +291,10 @@ cf.puts([{
 cf.puts([ (path, content), (path, content) ], compression='gzip')
 cf.put_jsons(...)
 
+# Get the TransmissionMonitor object that records the
+# flight times of each object.
+_, tm = cf.puts(..., return_recording=True)
+
 # Definition of put, put_json is identical
 def put(
   self,
@@ -469,6 +499,12 @@ cloudfiles -p 2 cp --progress -r s3://bkt/ gs://bkt2/
 cloudfiles cp -c br s3://bkt/file.txt gs://bkt2/
 # decompress
 cloudfiles cp -c none s3://bkt/file.txt gs://bkt2/
+# save chart of file flight times
+cloudfiles cp --flight-time s3://bkt/file.txt gs://bkt2/
+# save a chart of estimated bandwidth usage from these files alone
+cloudfiles cp --io-rate s3://bkt/file.txt gs://bkt2/
+# save a chart of measured bandwidth usage for the machine
+cloudfiles cp --machine-io-rate s3://bkt/file.txt gs://bkt2/
 # move or rename files
 cloudfiles mv s3://bkt/file.txt gs://bkt2/
 # create an empty file if not existing
@@ -528,6 +564,40 @@ cloudfiles alias rm example # remove example://
 
 The alias file is only accessed (and cached) if CloudFiles encounters an unknown protocol. If you stick to default protocols and use the syntax `s3://https://example.com/` for alternative endpoints, you can still use CloudFiles in environments without filesystem access.
 
+## Performance Monitoring
+
+CloudFiles now comes with two tools inside `cloudfiles.monitoring` for measuring the performance of transfer operations via both the CLI and the programmatic interface.
+
+A `TransmissionMonitor` object is created during each download or upload call (e.g. `cf.get` or `cf.puts`). You can access this object by passing the `return_recording=True` flag. The object saves the flight time of each object along with its size in an interval tree data structure. It comes with methods for estimating the peak bits per second and can plot both time of flight and the estimated transfer rates (assuming the transfer is evenly divided over the flight of an object, an assumption that is not always true). This allows you to estimate the contribution of a given CloudFiles operation to a machine's network IO.
+
+```python
+from cloudfiles import CloudFiles
+
+...
+
+results, tm = cf.get([ ... some files ... ], return_recording=True)
+
+value = tm.peak_Mbps() # estimated peak transfer rate
+value = tm.total_Mbps() # estimated average transfer rate
+tm.plot_gantt() # time of flight chart
+tm.plot_histogram() # transfer rate chart
+```
+
+A second object, `IOSampler`, can sample the OS network counters using a background thread and provides a global view of the machine's network performance during the life of the transfer. It is enabled on the CLI for the `cp` command when the `--machine-io-rate` flag is passed, but must be started manually when used programmatically. This avoids accidentally starting unnecessary sampling threads. The samples accumulate in a circular buffer, so make sure to set the buffer length long enough to capture your points of interest.
+
+```python
+from cloudfiles.monitoring import IOSampler
+
+sampler = IOSampler(buffer_sec=600, interval=0.25)
+sampler.start_sampling()
+
+...
+
+sampler.stop_sampling()
+sampler.plot_histogram()
+```
+
 ## Credits
 
 CloudFiles is derived from the [CloudVolume.Storage](https://github.com/seung-lab/cloud-volume/tree/master/cloudvolume/storage) system.
@@ -1,11 +1,12 @@
 cloudfiles/__init__.py,sha256=pLB4CcV2l3Jgv_ni1520Np1pfzFj8Cpr87vNxFT3rNI,493
-cloudfiles/cloudfiles.py,sha256=nUIW-SDyeldvmUu_SETJXS4x38Nv3Kj7TBwMwLQhjns,51196
+cloudfiles/cloudfiles.py,sha256=tPG1PBLEjABPu-KLe93yf6xW_zbafPsQ6z5NuofyUoU,56743
 cloudfiles/compression.py,sha256=WXJHnoNLJ_NWyoY9ygZmFA2qMou35_9xS5dzF7-2H-M,6262
 cloudfiles/connectionpools.py,sha256=aL8RiSjRepECfgAFmJcz80aJFKbou7hsbuEgugDKwB8,4814
 cloudfiles/exceptions.py,sha256=N0oGQNG-St6RvnT8e5p_yC_E61q2kgAe2scwAL0F49c,843
 cloudfiles/gcs.py,sha256=unqu5KxGKaPq6N4QeHSpCDdtnK1BzPOAerTZ8FLt2_4,3820
-cloudfiles/interfaces.py,sha256=3_ybsCKf74DPpI5Vv-BwFLLzc1cLUMfZLBtIKWkfoHQ,44478
+cloudfiles/interfaces.py,sha256=5rUh2DWOVlg13fAxyZ0wAaQyfW04xc2zlUfTItFV-zQ,45325
 cloudfiles/lib.py,sha256=HHjCvjmOjA0nZWSvHGoSeYpxqd6FAG8xk8LM212LAUA,5382
+cloudfiles/monitoring.py,sha256=N5Xq0PYZK1OxoYtwBFsnnfaq7dElTgY8Rn2Ez_I3aoo,20897
 cloudfiles/paths.py,sha256=HOvtdLSIYGwlwvnZt9d_Ez3TXOe7WWd18bZNDpExUDQ,12231
 cloudfiles/resumable_tools.py,sha256=NyuSoGh1SaP5akrHCpd9kgy2-JruEWrHW9lvJxV7jpE,6711
 cloudfiles/scheduler.py,sha256=ioqBT5bMPCVHEHlnL-SW_wHulxGgjeThiKHlnaDOydo,3831
@@ -15,12 +16,12 @@ cloudfiles/threaded_queue.py,sha256=Nl4vfXhQ6nDLF8PZpSSBpww0M2zWtcd4DLs3W3BArBw,
 cloudfiles/typing.py,sha256=f3ZYkNfN9poxhGu5j-P0KCxjCCqSn9HAg5KiIPkjnCg,416
 cloudfiles_cli/LICENSE,sha256=Jna4xYE8CCQmaxjr5Fs-wmUBnIQJ1DGcNn9MMjbkprk,1538
 cloudfiles_cli/__init__.py,sha256=Wftt3R3F21QsHtWqx49ODuqT9zcSr0em7wk48kcH0WM,29
-cloudfiles_cli/cloudfiles_cli.py,sha256=L9xr_z9GCf5PcpmA21Iandim7uMviaLxTZ4Pf1B9Ui0,34865
-cloud_files-5.5.0.dist-info/AUTHORS,sha256=BFVmobgAhaVFI5fqbuqAY5XmBQxe09ZZAsAOTy87hKQ,318
-cloud_files-5.5.0.dist-info/LICENSE,sha256=Jna4xYE8CCQmaxjr5Fs-wmUBnIQJ1DGcNn9MMjbkprk,1538
-cloud_files-5.5.0.dist-info/METADATA,sha256=m3Rlk29z4XnqeLhpKX613g7KMiT-5vGVNsWpjXNfXmU,26970
-cloud_files-5.5.0.dist-info/WHEEL,sha256=GJ7t_kWBFywbagK5eo9IoUwLW6oyOeTKmQ-9iHFVNxQ,92
-cloud_files-5.5.0.dist-info/entry_points.txt,sha256=xlirb1FVhn1mbcv4IoyMEGumDqKOA4VMVd3drsRQxIg,51
-cloud_files-5.5.0.dist-info/pbr.json,sha256=k1LwgnAaBImOMXKnlbOx2sBAWZ_Jm5AdFAhAugTuvrY,46
-cloud_files-5.5.0.dist-info/top_level.txt,sha256=xPyrST3okJbsmdCF5IC2gYAVxg_aD5AYVTnNo8UuoZU,26
-cloud_files-5.5.0.dist-info/RECORD,,
+cloudfiles_cli/cloudfiles_cli.py,sha256=IT6CHH7rI2B79ecV526uFa6ruAC8Yg25OO36Uwfxenc,37464
+cloud_files-5.6.0.dist-info/AUTHORS,sha256=BFVmobgAhaVFI5fqbuqAY5XmBQxe09ZZAsAOTy87hKQ,318
+cloud_files-5.6.0.dist-info/LICENSE,sha256=Jna4xYE8CCQmaxjr5Fs-wmUBnIQJ1DGcNn9MMjbkprk,1538
+cloud_files-5.6.0.dist-info/METADATA,sha256=vXFDHg7VwfvoIlQlQZR6bqS-SqlPI7O7vUe8gSNcnGI,30509
+cloud_files-5.6.0.dist-info/WHEEL,sha256=GJ7t_kWBFywbagK5eo9IoUwLW6oyOeTKmQ-9iHFVNxQ,92
+cloud_files-5.6.0.dist-info/entry_points.txt,sha256=xlirb1FVhn1mbcv4IoyMEGumDqKOA4VMVd3drsRQxIg,51
+cloud_files-5.6.0.dist-info/pbr.json,sha256=YPRBUS6pO0tlB43cAfdmW2Lb_kpuL4KYmVzrlj7hjr8,46
+cloud_files-5.6.0.dist-info/top_level.txt,sha256=xPyrST3okJbsmdCF5IC2gYAVxg_aD5AYVTnNo8UuoZU,26
+cloud_files-5.6.0.dist-info/RECORD,,
@@ -0,0 +1 @@
+{"git_version": "4a9dd39", "is_release": true}