datago-2026.2.1-cp314-cp314-manylinux_2_31_x86_64.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
README.md ADDED
@@ -0,0 +1,319 @@
1
+ # datago
2
+
3
+ [![Rust](https://github.com/Photoroom/datago/actions/workflows/rust.yml/badge.svg)](https://github.com/Photoroom/datago/actions/workflows/rust.yml)
4
+ [![Rust-py](https://github.com/Photoroom/datago/actions/workflows/ci-cd.yml/badge.svg)](https://github.com/Photoroom/datago/actions/workflows/ci-cd.yml)
5
+
6
+ A data loader written in Rust that can be used as a Python module. It handles several data sources, from local files to webdataset or a VectorDB-focused HTTP stack [soon-to-be open sourced](https://github.com/Photoroom/dataroom). It is focused on image data at the moment, but could easily be made more generic.
7
+
8
+ Datago handles, outside of the Python GIL:
9
+
10
+ - per-sample IO
11
+ - deserialization (jpg and png decompression)
12
+ - some optional vision processing (aligning different image payloads)
13
+ - optional serialization
14
+
15
+ Samples are exposed in the Python scope as native Python objects, using PIL and NumPy base types. Speed will be network dependent, but GB/s is typical. Depending on the front end, datago can be rank and world-size aware, in which case samples are dispatched according to their hash.
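The rank/world-size dispatch above can be pictured as hashing each sample ID to a single owning rank. A minimal sketch of the idea (datago's actual hashing scheme is internal and may differ):

```python
# Illustrative only: route each sample to exactly one rank via a stable hash.
import hashlib

def owning_rank(sample_id: str, world_size: int) -> int:
    # Use a stable hash (Python's built-in hash() is salted per process)
    digest = hashlib.sha256(sample_id.encode()).digest()
    return int.from_bytes(digest[:8], "big") % world_size

world_size = 4
samples = [f"sample_{i}" for i in range(1000)]
shards = [[s for s in samples if owning_rank(s, world_size) == r]
          for r in range(world_size)]

# Every sample lands on exactly one rank
assert sum(len(shard) for shard in shards) == len(samples)
```

Because the hash only depends on the sample ID, every rank agrees on the assignment without any coordination.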
16
+
17
+ ![Datago organization](assets/447175851-2277afcb-8abf-4d17-b2db-dae27c6056d0.png)
18
+
19
+ <details> <summary><strong>Use it</strong></summary>
20
+
21
+ You can simply install datago with `pip install datago` (or `uv pip install datago`)
22
+
23
+ ## Use the package from Python
24
+ Please note that in all of the following cases, you can directly get an `IterableDataset` (torch compatible) with the following code snippet:
25
+
26
+ ```python
27
+ from dataset import DatagoIterDataset
28
+ client_config = {} # See below for examples
29
+ datago_dataset = DatagoIterDataset(client_config, return_python_types=True)
30
+ ```
31
+
32
+ `return_python_types` enforces that images will be returned as `PIL.Image` objects, for instance; datago being an external binary module should stay transparent.
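In all of the examples below, `DatagoClient` takes its configuration as a JSON string, so the config dicts are simply serialized with `json.dumps`. A minimal sketch of that round trip (values mirror the file-source example below and are illustrative):

```python
import json

# Same shape as the file-source example below; values are illustrative.
config = {
    "source_type": "file",
    "source_config": {"root_path": "myPath", "random_sampling": False},
    "limit": 200,
    "samples_buffer_size": 32,
}

payload = json.dumps(config)          # what DatagoClient receives
assert json.loads(payload) == config  # lossless round trip
```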
33
+
34
+ <details> <summary><strong>Dataroom</strong></summary>
35
+
36
+ ```python
37
+ from datago import DatagoClient, initialize_logging
38
+ import os
39
+ import json
40
+
41
+ # Respects RUST_LOG=INFO env var for setting log level
42
+ # If omitted the logger will be initialized when the client starts.
43
+ initialize_logging()
44
+
45
+ config = {
46
+ "source_config": {
47
+ "sources": os.environ.get("DATAROOM_TEST_SOURCE", ""),
48
+ "page_size": 500,
49
+ "rank": 0,
50
+ "world_size": 1,
51
+ },
52
+ "limit": 200,
53
+ "samples_buffer_size": 32,
54
+ }
55
+
56
+ client = DatagoClient(json.dumps(config))
57
+
58
+ for _ in range(10):
59
+ sample = client.get_sample()
60
+ ```
61
+
62
+ Please note that the image buffers will be passed around as raw pointers; see below (we provide Python utils to convert them to PIL types).
63
+
64
+ </details><details> <summary><strong>Local files</strong></summary>
65
+
66
+ To test datago while serving local files (jpg, png, ...), the code would look like the following.
67
+ **Note that datago serving files with a lot of concurrent threads means that, even if random_sampling is not set,
68
+ there will be some randomness in the sample ordering.**
69
+
70
+ ```python
71
+ from datago import DatagoClient, initialize_logging
72
+ import os
73
+ import json
74
+
75
+ # Can also set the log level directly instead of using RUST_LOG env var
76
+ initialize_logging(log_level="warn")
77
+
78
+ config = {
79
+ "source_type": "file",
80
+ "source_config": {
81
+ "root_path": "myPath",
82
+ "random_sampling": False, # True if used directly for training
83
+ "rank": 0, # Optional, distributed workloads are possible
84
+ "world_size": 1,
85
+ },
86
+ "limit": 200,
87
+ "samples_buffer_size": 32,
88
+ }
89
+
90
+ client = DatagoClient(json.dumps(config))
91
+
92
+ for _ in range(10):
93
+ sample = client.get_sample()
94
+ ```
95
+
96
+ </details><details> <summary><strong>[experimental] Webdataset</strong></summary>
97
+
98
+ Please note that this implementation is very new and probably still has significant limitations. It has not yet been tested at scale.
99
+ Please also note that there is a more complete example in `/python/benchmark_webdataset.py`, which shows how to convert everything to more pythonic types (PIL images).
100
+
101
+ ```python
102
+ from datago import DatagoClient, initialize_logging
103
+ import os
104
+ import json
105
+
106
+ # Can also set the log level directly instead of using RUST_LOG env var
107
+ initialize_logging(log_level="warn")
108
+
109
+ # URL of the test bucket
110
+ bucket = "https://storage.googleapis.com/webdataset/fake-imagenet"
111
+ dataset = "/imagenet-train-{000000..001281}.tar"
112
+ url = bucket + dataset
113
+
114
+ client_config = {
115
+ "source_type": "webdataset",
116
+ "source_config": {
117
+ "url": url,
118
+ "random_sampling": False,
119
+ "concurrent_downloads": 8, # The number of TarballSamples which should be handled concurrently
120
+ "rank": 0,
121
+ "world_size": 1,
122
+ },
123
+ "prefetch_buffer_size": 128,
124
+ "samples_buffer_size": 64,
125
+ "limit": 1_000_000, # Dummy example, max number of samples you would like to serve
126
+ }
127
+
128
+ client = DatagoClient(json.dumps(client_config))
129
+
130
+ for _ in range(10):
131
+ sample = client.get_sample()
132
+ ```
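The `{000000..001281}` part of the URL above is a shard range, naming one tar file per index. Expanding such a pattern by hand is straightforward; this helper is purely illustrative and not part of the datago API:

```python
import re

def expand_shards(url: str) -> list[str]:
    # Expand a single "{AAAAAA..BBBBBB}" range, preserving zero padding
    match = re.search(r"\{(\d+)\.\.(\d+)\}", url)
    if match is None:
        return [url]
    start, end = match.group(1), match.group(2)
    width = len(start)
    return [
        url[: match.start()] + str(i).zfill(width) + url[match.end():]
        for i in range(int(start), int(end) + 1)
    ]

urls = expand_shards(
    "https://storage.googleapis.com/webdataset/fake-imagenet"
    "/imagenet-train-{000000..001281}.tar"
)
assert len(urls) == 1282
assert urls[0].endswith("imagenet-train-000000.tar")
```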
133
+
134
+ </details>
135
+
136
+ ## Process images on the fly
137
+
138
+ Datago can also process images on the fly, for instance to align different image payloads. This is done by adding an `image_config` section to the configuration, as in the following example.
139
+
140
+ Processing can be very CPU heavy, but it will be distributed over all CPU cores without requiring multiple python processes. I.e., you can keep a single python process using `get_sample()` on the client and still saturate all CPU cores.
141
+
142
+ There are three main processing topics that you can choose from:
143
+
144
+ - crop the images to within an aspect ratio bucket (which is very handy for all Transformer / patch based architectures)
145
+ - resize the images (the setting applies to the square aspect ratio bucket; other buckets will differ accordingly)
146
+ - pre-encode the images to a specific format (jpg, png, ...)
147
+
148
+ ```python
149
+ config = {
150
+ "source_type": "file",
151
+ "source_config": {
152
+ "root_path": "myPath",
153
+ "random_sampling": False, # True if used directly for training
154
+ },
155
+ # Optional pre-processing of the images, placing them in an aspect ratio bucket to preserve as much as possible of the original content
156
+ "image_config": {
157
+ "crop_and_resize": True, # False to turn it off, or just omit this part of the config
158
+ "default_image_size": 1024,
159
+ "downsampling_ratio": 32,
160
+ "min_aspect_ratio": 0.5,
161
+ "max_aspect_ratio": 2.0,
162
+ "pre_encode_images": False,
163
+ },
164
+ "limit": 200,
165
+ "samples_buffer_size": 32,
166
+ }
167
+ ```
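One way to picture the bucketing above: bucket dimensions are kept to multiples of `downsampling_ratio`, with an area close to `default_image_size`² and an aspect ratio within the configured bounds. The sketch below is illustrative only, not datago's exact algorithm:

```python
def make_buckets(default_image_size=1024, downsampling_ratio=32,
                 min_aspect_ratio=0.5, max_aspect_ratio=2.0):
    # Enumerate (width, height) pairs that are multiples of the downsampling
    # ratio, stay within the aspect-ratio bounds, and roughly preserve the
    # target pixel area of default_image_size ** 2.
    target_area = default_image_size ** 2
    buckets = []
    width = downsampling_ratio
    while width <= default_image_size * max_aspect_ratio:
        # Closest multiple of downsampling_ratio matching the target area
        height = round(target_area / width / downsampling_ratio) * downsampling_ratio
        if height > 0 and min_aspect_ratio <= width / height <= max_aspect_ratio:
            buckets.append((width, height))
        width += downsampling_ratio
    return buckets

buckets = make_buckets()
assert (1024, 1024) in buckets  # the square bucket
```

Each incoming image is then cropped and resized to the bucket closest to its native aspect ratio, which keeps as much of the original content as possible while producing patch-friendly dimensions.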
168
+
169
+ ## Match the raw exported buffers with typical python types
170
+
171
+ See the helper functions provided in `raw_types.py`, which should be self-explanatory; check the python benchmarks for examples. As mentioned above, we also provide a wrapper so that you get a `dataset` directly.
172
+
173
+ ## Logging
174
+
175
+ We are using the [log](https://docs.rs/log/latest/log/) crate with [env_logger](https://docs.rs/env_logger/latest/env_logger/).
176
+ You can set the log level using the RUST_LOG environment variable. E.g. `RUST_LOG=INFO`.
177
+
178
+ When using the library from Python, `env_logger` will be initialized automatically when creating a `DatagoClient`. There is also an `initialize_logging` function in the `datago` module which, if called before using a client, lets you customize the log level. This only works if `RUST_LOG` is not set.
179
+
180
+ ## Env variables
181
+
182
+ There are a couple of env variables which change the behavior of the library, for settings which felt too low level to be exposed in the config.
183
+
184
+ - `DATAGO_MAX_TASKS`: the number of threads used to load samples. Defaults to a multiple of the CPU cores.
185
+ - `RUST_LOG`: see above; changes the log level for the whole library, useful for debugging or to report an issue here.
186
+ - `DATAGO_MAX_RETRIES`: number of retries for a failed sample load, defaults to 3.
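These can be set from the shell, or from Python before the first client is created; the values here are illustrative:

```python
import os

# Must be set before the first DatagoClient is created to take effect.
os.environ.setdefault("DATAGO_MAX_TASKS", "16")   # worker thread count
os.environ.setdefault("DATAGO_MAX_RETRIES", "5")  # per-sample retry count
os.environ.setdefault("RUST_LOG", "info")         # library log level
```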
187
+
188
+ </details><details> <summary><strong>Build it</strong></summary>
189
+
190
+ ## Preamble
191
+
192
+ Just install the Rust toolchain via rustup.
193
+
194
+ ## [Apple Silicon macOS only]
195
+
196
+ If you are using an Apple Silicon macOS machine, create a `.cargo/config` file and paste the following:
197
+
198
+ ```toml
199
+ [target.x86_64-apple-darwin]
200
+ rustflags = [
201
+ "-C", "link-arg=-undefined",
202
+ "-C", "link-arg=dynamic_lookup",
203
+ ]
204
+
205
+ [target.aarch64-apple-darwin]
206
+ rustflags = [
207
+ "-C", "link-arg=-undefined",
208
+ "-C", "link-arg=dynamic_lookup",
209
+ ]
210
+ ```
211
+
212
+ ## Build a benchmark CLI
213
+
214
+ `cargo run --release -- -h` to get all the information; it should be fairly straightforward.
215
+
216
+ ## Run the rust test suite
217
+
218
+ From the datago folder
219
+
220
+ ```bash
221
+ cargo test
222
+ ```
223
+
224
+ ## Generate the python package binaries manually
225
+
226
+ Build a wheel usable locally:
227
+
228
+ ```bash
229
+ maturin build -i python3.11 --release --target "x86_64-unknown-linux-gnu"
230
+ ```
231
+
232
+ Build a wheel which can be uploaded to PyPI or similar:
233
+
234
+ - either use a manylinux docker image
235
+
236
+ - or cross compile using zig
237
+
238
+ ```bash
239
+ maturin build -i python3.11 --release --target "x86_64-unknown-linux-gnu" --manylinux 2014 --zig
240
+ ```
241
+
242
+ Then you can `pip install` the wheel from `target/wheels`.
243
+
244
+ ## Update the PyPI release (maintainers)
245
+
246
+ Create a new tag and a new release in this repo; a new package will be pushed automatically.
247
+
248
+ </details>
249
+
250
+ <details> <summary><strong>Benchmarks</strong></summary>
251
+ As usual, benchmarks are a tricky game and you shouldn't read too much into the following plots; do your own tests. Some python benchmark examples are provided in the [python](./python/) folder.
252
+
253
+ In general, Datago will be impactful if you want to load a lot of images very fast; if you consume them as you go at a more leisurely pace, then it's not really needed. The more CPU work there is on the images and the higher quality they are, the more Datago will shine.
254
+
255
+ ## From disk: ImageNet
256
+
257
+ The following benchmarks use ImageNet 1k, which is very low resolution and thus kind of a worst case scenario. Data is served from cache (i.e. the OS cache) and the images are not pre-processed. In this case the receiving python process is typically the bottleneck, capping at around 3000 images per second.
258
+
259
+ ### AMD Zen3 laptop - IN1k - disk - no processing
260
+ ![AMD Zen3 laptop & M2 SSD](assets/zen3_ssd.png)
261
+
262
+ ### AMD EPYC 9454 - IN1k - disk - no processing
263
+ ![AMD EPYC 9454](assets/epyc_vast.png)
264
+
265
+ ## Webdataset: FakeIN
266
+
267
+ This benchmark uses low resolution images, accessed through the webdataset front end; datago is compared with the popular python webdataset library. Note that datago will start streaming the images faster here (almost instantly!), which emphasizes throughput differences depending on how long you test for.
268
+
269
+ Note also that this can be bottlenecked by your external bandwidth to the remote storage where the webdataset is hosted, in which case both solutions would yield comparable numbers.
270
+
271
+ ### AMD Zen3 laptop - webdataset - no processing
272
+ ![AMD Zen3 laptop](assets/zen3_wds_fakein.png)
273
+
274
+
275
+ ## Webdataset: PD12M
276
+
277
+ This benchmark uses high resolution images, accessed through the webdataset front end; datago is compared with the popular python webdataset library. Note that datago will start streaming the images faster here (almost instantly!), which emphasizes throughput differences depending on how long you test for.
278
+
279
+ Note also that this can be bottlenecked by your external bandwidth to the remote storage where the webdataset is hosted, in which case both solutions would yield comparable numbers.
280
+
281
+ ### AMD Zen3 laptop - webdataset - no processing
282
+ ![AMD Zen3 laptop](assets/zen3_wds_pd12m.png)
283
+
284
+
285
+ ### AMD EPYC 9454 - pd12m - webdataset - no processing
286
+ ![AMD EPYC 9454](assets/epyc_wds_pd12m.png)
287
+
288
+
289
+ ### AMD Zen3 laptop - webdataset - processing
290
+ Adding image processing (crop and resize to Transformer compatible size buckets) changes the picture, as spreading out the work becomes more important. If you're training a diffusion model or an image encoder on a diverse set of images, this is likely the most realistic micro-benchmark.
291
+
292
+ ![AMD Zen3 laptop](assets/zen3_wds_pd12m_processing.png)
293
+
294
+ </details>
295
+
296
+
297
+ ## License
298
+
299
+ MIT License
300
+
301
+ Copyright (c) 2025 Photoroom
302
+
303
+ Permission is hereby granted, free of charge, to any person obtaining a copy
304
+ of this software and associated documentation files (the "Software"), to deal
305
+ in the Software without restriction, including without limitation the rights
306
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
307
+ copies of the Software, and to permit persons to whom the Software is
308
+ furnished to do so, subject to the following conditions:
309
+
310
+ The above copyright notice and this permission notice shall be included in all
311
+ copies or substantial portions of the Software.
312
+
313
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
314
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
315
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
316
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
317
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
318
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
319
+ SOFTWARE.
datago/__init__.py ADDED
@@ -0,0 +1,5 @@
1
+ from .datago import *
2
+
3
+ __doc__ = datago.__doc__
4
+ if hasattr(datago, "__all__"):
5
+ __all__ = datago.__all__
datago-2026.2.1.dist-info/METADATA ADDED
@@ -0,0 +1,338 @@
1
+ Metadata-Version: 2.4
2
+ Name: datago
3
+ Version: 2026.2.1
4
+ Classifier: Programming Language :: Rust
5
+ Classifier: Programming Language :: Python :: Implementation :: CPython
6
+ Classifier: Programming Language :: Python :: Implementation :: PyPy
7
+ Classifier: Programming Language :: Python :: 3
8
+ Classifier: License :: OSI Approved :: MIT License
9
+ Requires-Dist: pillow
10
+ License-File: LICENSE
11
+ Summary: A high performance dataloader for Python, written in Rust
12
+ Author: Benjamin Lefaudeux, Roman Frigg, Tarek Ayed, Marco Forte
13
+ Author-email: Photoroom <team@photoroom.com>
14
+ Requires-Python: >=3.8
15
+ Description-Content-Type: text/markdown; charset=UTF-8; variant=GFM
16
+ Project-URL: Homepage, https://github.com/photoroom/datago
17
+ Project-URL: Issues, https://github.com/photoroom/datago/issues
18
+
+ (long description: identical to README.md above)
datago-2026.2.1.dist-info/RECORD ADDED
@@ -0,0 +1,7 @@
1
+ README.md,sha256=XwJ68dy_mOJNETAX0ZT4BRY0At9f2feXXlYrysdoBZo,12828
2
+ datago/__init__.py,sha256=kNnGM4GPRHPbxtWqYjr-57Td4rOTtl3kLaKExPLuE7g,107
3
+ datago/datago.cpython-314-x86_64-linux-gnu.so,sha256=Nk3o5_YpDF3lzcxIdMRVm-3Xm0LN3Qm2uj1laaL1PD0,14261032
4
+ datago-2026.2.1.dist-info/METADATA,sha256=K9VN7TYe46mZCNipfoy1ZhxD39O4ziF5TnUBv_8mOu0,13599
5
+ datago-2026.2.1.dist-info/WHEEL,sha256=-qirJpzzEpvr1JG6jPvE2fewx9ESdlB9RVdVwnc0E_k,109
6
+ datago-2026.2.1.dist-info/licenses/LICENSE,sha256=5BkY8hpbSq1ldBOzHL1XHW4h-ognKi_lZa2zeWailV8,1066
7
+ datago-2026.2.1.dist-info/RECORD,,
datago-2026.2.1.dist-info/WHEEL ADDED
@@ -0,0 +1,4 @@
1
+ Wheel-Version: 1.0
2
+ Generator: maturin (1.11.5)
3
+ Root-Is-Purelib: false
4
+ Tag: cp314-cp314-manylinux_2_31_x86_64
datago-2026.2.1.dist-info/licenses/LICENSE ADDED
@@ -0,0 +1,21 @@
1
+ MIT License
2
+
3
+ Copyright (c) 2024 Photoroom
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.