hafnia 0.1.27__py3-none-any.whl → 0.2.1__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (42)
  1. cli/__main__.py +2 -2
  2. cli/config.py +17 -4
  3. cli/dataset_cmds.py +60 -0
  4. cli/runc_cmds.py +1 -1
  5. hafnia/data/__init__.py +2 -2
  6. hafnia/data/factory.py +12 -56
  7. hafnia/dataset/dataset_helpers.py +91 -0
  8. hafnia/dataset/dataset_names.py +72 -0
  9. hafnia/dataset/dataset_recipe/dataset_recipe.py +327 -0
  10. hafnia/dataset/dataset_recipe/recipe_transforms.py +53 -0
  11. hafnia/dataset/dataset_recipe/recipe_types.py +140 -0
  12. hafnia/dataset/dataset_upload_helper.py +468 -0
  13. hafnia/dataset/hafnia_dataset.py +624 -0
  14. hafnia/dataset/operations/dataset_stats.py +15 -0
  15. hafnia/dataset/operations/dataset_transformations.py +82 -0
  16. hafnia/dataset/operations/table_transformations.py +183 -0
  17. hafnia/dataset/primitives/__init__.py +16 -0
  18. hafnia/dataset/primitives/bbox.py +137 -0
  19. hafnia/dataset/primitives/bitmask.py +182 -0
  20. hafnia/dataset/primitives/classification.py +56 -0
  21. hafnia/dataset/primitives/point.py +25 -0
  22. hafnia/dataset/primitives/polygon.py +100 -0
  23. hafnia/dataset/primitives/primitive.py +44 -0
  24. hafnia/dataset/primitives/segmentation.py +51 -0
  25. hafnia/dataset/primitives/utils.py +51 -0
  26. hafnia/experiment/hafnia_logger.py +7 -7
  27. hafnia/helper_testing.py +108 -0
  28. hafnia/http.py +5 -3
  29. hafnia/platform/__init__.py +2 -2
  30. hafnia/platform/datasets.py +197 -0
  31. hafnia/platform/download.py +85 -23
  32. hafnia/torch_helpers.py +180 -95
  33. hafnia/utils.py +21 -2
  34. hafnia/visualizations/colors.py +267 -0
  35. hafnia/visualizations/image_visualizations.py +202 -0
  36. {hafnia-0.1.27.dist-info → hafnia-0.2.1.dist-info}/METADATA +209 -99
  37. hafnia-0.2.1.dist-info/RECORD +50 -0
  38. cli/data_cmds.py +0 -53
  39. hafnia-0.1.27.dist-info/RECORD +0 -27
  40. {hafnia-0.1.27.dist-info → hafnia-0.2.1.dist-info}/WHEEL +0 -0
  41. {hafnia-0.1.27.dist-info → hafnia-0.2.1.dist-info}/entry_points.txt +0 -0
  42. {hafnia-0.1.27.dist-info → hafnia-0.2.1.dist-info}/licenses/LICENSE +0 -0
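Most of the change above is new code under `hafnia/dataset/`, which replaces the Hugging Face `datasets`-based loader with a native `HafniaDataset` format that stores annotations as `annotations.jsonl` and `annotations.parquet` (see the README diff below). As a rough stdlib-only sketch of reading such a JSON-Lines file, with hypothetical rows whose field names are taken from the columns shown in the README diff (the actual schema may differ):

```python
import io
import json

# Hypothetical annotation rows in a JSON-Lines layout; field names follow
# the columns shown in the README diff (sample_index, file_name, height, ...).
jsonl_file = io.StringIO(
    '{"sample_index": 0, "file_name": "data/a.png", "height": 1080, "width": 1920, "split": "train"}\n'
    '{"sample_index": 100, "file_name": "data/b.png", "height": 1080, "width": 1920, "split": "test"}\n'
)

# Each non-empty line is one JSON object, i.e. one sample.
samples = [json.loads(line) for line in jsonl_file if line.strip()]
train_samples = [s for s in samples if s["split"] == "train"]
```

The same rows are mirrored in `annotations.parquet` for columnar access (the new `polars` dependency below suggests that is the primary path).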
@@ -1,22 +1,27 @@
  Metadata-Version: 2.4
  Name: hafnia
- Version: 0.1.27
+ Version: 0.2.1
  Summary: Python SDK for communication with Hafnia platform.
  Author-email: Milestone Systems <hafniaplatform@milestone.dk>
  License-File: LICENSE
  Requires-Python: >=3.10
  Requires-Dist: boto3>=1.35.91
  Requires-Dist: click>=8.1.8
- Requires-Dist: datasets>=3.2.0
  Requires-Dist: emoji>=2.14.1
  Requires-Dist: flatten-dict>=0.4.2
+ Requires-Dist: more-itertools>=10.7.0
+ Requires-Dist: opencv-python-headless>=4.11.0.86
  Requires-Dist: pathspec>=0.12.1
  Requires-Dist: pillow>=11.1.0
+ Requires-Dist: polars>=1.30.0
  Requires-Dist: pyarrow>=18.1.0
+ Requires-Dist: pycocotools>=2.0.10
  Requires-Dist: pydantic>=2.10.4
  Requires-Dist: rich>=13.9.4
+ Requires-Dist: s5cmd>=0.2.0
  Requires-Dist: seedir>=0.5.0
  Requires-Dist: tqdm>=4.67.1
+ Requires-Dist: xxhash>=3.5.0
  Description-Content-Type: text/markdown
 
  # Hafnia
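The dependency list changes substantially in 0.2.1 (`polars`, `opencv-python-headless`, `pycocotools`, `s5cmd`, and `xxhash` added; `datasets` dropped). Such core-metadata `Requires-Dist` lines split mechanically into name and version specifier; a minimal sketch (not a full PEP 508 parser, and the helper name is made up for illustration):

```python
import re

def parse_requires_dist(line: str) -> tuple[str, str]:
    """Split a simple 'Requires-Dist: name>=x.y' line into (name, specifier)."""
    value = line.split(":", 1)[1].strip()
    # Project names use letters, digits, '.', '_' and '-'; the rest is the specifier.
    match = re.match(r"([A-Za-z0-9._-]+)\s*(.*)", value)
    return match.group(1), match.group(2).strip()

parse_requires_dist("Requires-Dist: polars>=1.30.0")  # ('polars', '>=1.30.0')
```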
@@ -28,8 +33,8 @@ The package includes the following interfaces:
 
  - `cli`: A Command Line Interface (CLI) to 1) configure/connect to Hafnia's [Training-aaS](https://hafnia.readme.io/docs/training-as-a-service) and 2) create and
  launch recipe scripts.
- - `hafnia`: A python package with helper functions to load and interact with sample datasets and an experiment
- tracker (`HafniaLogger`).
+ - `hafnia`: A python package including `HafniaDataset` to manage datasets and `HafniaLogger` to do
+ experiment tracking.
 
 
  ## The Concept: Training as a Service (Training-aaS)
@@ -76,7 +81,7 @@ Copy the key and save it for later use.
  1. Download `mnist` from terminal to verify that your configuration is working.
 
  ```bash
- hafnia data download mnist --force
+ hafnia dataset download mnist --force
  ```
 
  ## Getting started: Loading datasets samples
@@ -84,115 +89,220 @@ With Hafnia configured on your local machine, it is now possible to download
  and explore the dataset sample with a python script:
 
  ```python
- from hafnia.data import load_dataset
+ from hafnia.data import load_dataset, get_dataset_path
+ from hafnia.dataset.hafnia_dataset import HafniaDataset
 
- dataset_splits = load_dataset("mnist")
+ # To download the sample dataset use:
+ path_dataset = get_dataset_path("midwest-vehicle-detection")
  ```
 
- ### Dataset Format
- The returned sample dataset is a [hugging face dataset](https://huggingface.co/docs/datasets/index)
- and contains train, validation and test splits.
+ This will download the dataset sample `midwest-vehicle-detection` to the local `.data/datasets/` folder
+ in a human readable format.
+
+ Images are stored in the `data` folder, general dataset information is stored in `dataset_info.json`
+ and annotations are stored as both `annotations.jsonl` (jsonl) and `annotations.parquet`.
+
+ ```bash
+ $ cd .data/datasets/
+ $ tree midwest-vehicle-detection
+ midwest-vehicle-detection
+ └── sample
+     ├── annotations.jsonl
+     ├── annotations.parquet
+     ├── data
+     │   ├── video_0026a86b-2f43-49f2-a17c-59244d10a585_1fps_mp4_frame_00000.png
+     │   ....
+     │   ├── video_ff17d777-e783-44e2-9bff-a4adac73de4b_1fps_mp4_frame_00000.png
+     │   └── video_ff17d777-e783-44e2-9bff-a4adac73de4b_1fps_mp4_frame_00100.png
+     └── dataset_info.json
+
+ 3 directories, 217 files
+ ```
+
+ You can interact with data as you want, but we also provide `HafniaDataset`
+ for loading/saving, managing and interacting with the dataset.
+
+ We recommend visiting and potentially executing the example script [examples/example_hafnia_dataset.py](examples/example_hafnia_dataset.py)
+ to see how to use the `HafniaDataset` class and its methods.
+
+ Below is a short introduction to the `HafniaDataset` class.
+
+ ```python
+ from hafnia.dataset.hafnia_dataset import HafniaDataset, Sample
+
+ # Load dataset
+ dataset = HafniaDataset.read_from_path(path_dataset)
+
+ # Alternatively, you can use the 'load_dataset' function to download and load the dataset in one go.
+ # dataset = load_dataset("midwest-vehicle-detection")
+
+ # Print dataset information
+ dataset.print_stats()
+
+ # Create a dataset split for training
+ dataset_train = dataset.create_split_dataset("train")
+ ```
+
+ The `HafniaDataset` object provides a convenient way to interact with the dataset, including methods for
+ creating splits, accessing samples, printing statistics, and saving to and loading from disk.
+
+ In essence, the `HafniaDataset` class contains `dataset.info` with dataset information
+ and `dataset.samples` with annotations as a polars DataFrame.
 
  ```python
- print(dataset_splits)
-
- # Output:
- >>> DatasetDict({
-     train: Dataset({
-         features: ['image_id', 'image', 'height', 'width', 'objects', 'Weather', 'Surface Conditions'],
-         num_rows: 172
-     })
-     validation: Dataset({
-         features: ['image_id', 'image', 'height', 'width', 'objects', 'Weather', 'Surface Conditions'],
-         num_rows: 21
-     })
-     test: Dataset({
-         features: ['image_id', 'image', 'height', 'width', 'objects', 'Weather', 'Surface Conditions'],
-         num_rows: 21
-     })
- })
+ # Annotations are stored in a polars DataFrame
+ print(dataset.samples.head(2))
+ shape: (2, 14)
+ ┌──────────────┬─────────────────────────────────┬────────┬───────┬───┬─────────────────────────────────┬──────────┬──────────┬─────────────────────────────────┐
+ │ sample_index ┆ file_name                       ┆ height ┆ width ┆ … ┆ objects                         ┆ bitmasks ┆ polygons ┆ meta                            │
+ │ ---          ┆ ---                             ┆ ---    ┆ ---   ┆   ┆ ---                             ┆ ---      ┆ ---      ┆ ---                             │
+ │ u32          ┆ str                             ┆ i64    ┆ i64   ┆   ┆ list[struct[11]]                ┆ null     ┆ null     ┆ struct[5]                       │
+ ╞══════════════╪═════════════════════════════════╪════════╪═══════╪═══╪═════════════════════════════════╪══════════╪══════════╪═════════════════════════════════╡
+ │ 0            ┆ /home/ubuntu/code/hafnia/.data… ┆ 1080   ┆ 1920  ┆ … ┆ [{0.0492,0.0357,0.2083,0.23,"V… ┆ null     ┆ null     ┆ {120.0,1.0,"2024-07-10T18:30:0… │
+ │ 100          ┆ /home/ubuntu/code/hafnia/.data… ┆ 1080   ┆ 1920  ┆ … ┆ [{0.146382,0.078704,0.42963,0.… ┆ null     ┆ null     ┆ {120.0,1.0,"2024-07-10T18:30:0… │
+ └──────────────┴─────────────────────────────────┴────────┴───────┴───┴─────────────────────────────────┴──────────┴──────────┴─────────────────────────────────┘
+ ```
 
+ ```python
+ # General dataset information is stored in `dataset.info`
+ rich.print(dataset.info)
+ DatasetInfo(
+     dataset_name='midwest-vehicle-detection',
+     version='1.0.0',
+     tasks=[
+         TaskInfo(
+             primitive=<class 'hafnia.dataset.primitives.Bbox'>,
+             class_names=[
+                 'Person',
+                 'Vehicle.Bicycle',
+                 'Vehicle.Motorcycle',
+                 'Vehicle.Car',
+                 'Vehicle.Van',
+                 'Vehicle.RV',
+                 'Vehicle.Single_Truck',
+                 'Vehicle.Combo_Truck',
+                 'Vehicle.Pickup_Truck',
+                 'Vehicle.Trailer',
+                 'Vehicle.Emergency_Vehicle',
+                 'Vehicle.Bus',
+                 'Vehicle.Heavy_Duty_Vehicle'
+             ],
+             name='bboxes'
+         ),
+         TaskInfo(primitive=<class 'hafnia.dataset.primitives.Classification'>, class_names=['Clear', 'Foggy'], name='Weather'),
+         TaskInfo(primitive=<class 'hafnia.dataset.primitives.Classification'>, class_names=['Dry', 'Wet'], name='Surface Conditions')
+     ],
+     meta={
+         'n_videos': 109,
+         'n_cameras': 20,
+         'duration': 13080.0,
+         'duration_average': 120.0,
+         ...
+     }
+ )
  ```
 
- A Hugging Face dataset is a dictionary with splits, where each split is a `Dataset` object.
- Each `Dataset` is structured as a table with a set of columns (also called features) and a row for each sample.
+ You can iterate and access samples in the dataset using the `HafniaDataset` object.
+ Each sample contains image and annotation information.
 
- The features of the dataset can be viewed with the `features` attribute.
  ```python
- # View features of the train split
- pprint.pprint(dataset["train"].features)
- {'Surface Conditions': ClassLabel(names=['Dry', 'Wet'], id=None),
-  'Weather': ClassLabel(names=['Clear', 'Foggy'], id=None),
-  'height': Value(dtype='int64', id=None),
-  'image': Image(mode=None, decode=True, id=None),
-  'image_id': Value(dtype='int64', id=None),
-  'objects': Sequence(feature={'bbox': Sequence(feature=Value(dtype='int64', id=None),
-                                                length=-1,
-                                                id=None),
-                               'class_idx': ClassLabel(names=['Vehicle.Bicycle',
-                                                              'Vehicle.Motorcycle',
-                                                              'Vehicle.Car',
-                                                              'Vehicle.Van',
-                                                              'Vehicle.RV',
-                                                              'Vehicle.Single_Truck',
-                                                              'Vehicle.Combo_Truck',
-                                                              'Vehicle.Pickup_Truck',
-                                                              'Vehicle.Trailer',
-                                                              'Vehicle.Emergency_Vehicle',
-                                                              'Vehicle.Bus',
-                                                              'Vehicle.Heavy_Duty_Vehicle'],
-                                                       id=None),
-                               'class_name': Value(dtype='string', id=None),
-                               'id': Value(dtype='string', id=None)},
-                      length=-1,
-                      id=None),
-  'width': Value(dtype='int64', id=None)}
+ from hafnia.dataset.hafnia_dataset import HafniaDataset, Sample
+ # Access the first sample in the dataset either by index or by iterating over the dataset
+ sample_dict = dataset[0]
+
+ for sample_dict in dataset:
+     sample = Sample(**sample_dict)
+     print(sample.sample_id, sample.objects)
+     break
  ```
+ Note that it is possible to create a `Sample` object from the sample dictionary.
+ This is useful for accessing the image and annotations in a structured way.
 
- View the first sample in the training set:
  ```python
- # Print sample from the training set
- pprint.pprint(dataset["train"][0])
-
- {'image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=1920x1080 at 0x79D6292C5ED0>,
-  'image_id': 4920,
-  'height': 1080,
-  'Weather': 0,
-  'Surface Conditions': 0,
-  'objects': {'bbox': [[441, 180, 121, 126],
-                       [549, 151, 131, 103],
-                       [1845, 722, 68, 130],
-                       [1810, 571, 110, 149]],
-              'class_idx': [7, 7, 2, 2],
-              'class_name': ['Vehicle.Pickup_Truck',
-                             'Vehicle.Pickup_Truck',
-                             'Vehicle.Car',
-                             'Vehicle.Car'],
-              'id': ['HW6WiLAJ', 'T/ccFpRi', 'CS0O8B6W', 'DKrJGzjp']},
-  'width': 1920}
+ # By unpacking the sample dictionary, you can create a `Sample` object.
+ sample = Sample(**sample_dict)
+
+ # Use the `Sample` object to easily read image and draw annotations
+ image = sample.read_image()
+ image_annotations = sample.draw_annotations()
+ ```
 
+ Note that the `Sample` object contains all information about the sample, including image and metadata.
+ It also contains annotations as primitive types such as `Bbox` and `Classification`.
+
+ ```python
+ rich.print(sample)
+ Sample(
+     sample_index=120,
+     file_name='/home/ubuntu/code/hafnia/.data/datasets/midwest-vehicle-detection/data/343403325f27e390.png',
+     height=1080,
+     width=1920,
+     split='train',
+     is_sample=True,
+     collection_index=None,
+     collection_id=None,
+     remote_path='s3://mdi-production-midwest-vehicle-detection/sample/data/343403325f27e390.png',
+     classifications=[
+         Classification(
+             class_name='Clear',
+             class_idx=0,
+             object_id=None,
+             confidence=None,
+             ground_truth=True,
+             task_name='Weather',
+             meta=None
+         ),
+         Classification(
+             class_name='Day',
+             class_idx=3,
+             object_id=None,
+             confidence=None,
+             ground_truth=True,
+             task_name='Time of Day',
+             meta=None
+         ),
+         ...
+     ],
+     objects=[
+         Bbox(
+             height=0.0492,
+             width=0.0357,
+             top_left_x=0.2083,
+             top_left_y=0.23,
+             class_name='Vehicle.Car',
+             class_idx=3,
+             object_id='cXT4NRVu',
+             confidence=None,
+             ground_truth=True,
+             task_name='bboxes',
+             meta=None
+         ),
+         Bbox(
+             height=0.0457,
+             width=0.0408,
+             top_left_x=0.2521,
+             top_left_y=0.2153,
+             class_name='Vehicle.Car',
+             class_idx=3,
+             object_id='MelbIIDU',
+             confidence=None,
+             ground_truth=True,
+             task_name='bboxes',
+             meta=None
+         ),
+         ...
+     ],
+     bitmasks=None,  # Optionally a list of Bitmask objects: List[Bitmask]
+     polygons=None,  # Optionally a list of Polygon objects: List[Polygon]
+     meta={
+         'video.data_duration': 120.0,
+         'video.data_fps': 1.0,
+         ...
+     }
+ )
  ```
 
- For hafnia based datasets, we want to standardized how a dataset and dataset tasks are represented.
- We have defined a set of features that are common across all datasets in the Hafnia data library.
-
- - `image`: The image itself, stored as a PIL image
- - `height`: The height of the image in pixels
- - `width`: The width of the image in pixels
- - `[IMAGE_CLASSIFICATION_TASK]`: [Optional] Image classification tasks are top-level `ClassLabel` feature.
-   `ClassLabel` is a Hugging Face feature that maps class indices to class names.
-   In above example we have two classification tasks:
-   - `Weather`: Classifies the weather conditions in the image, with possible values `Clear` and `Foggy`
-   - `Surface Conditions`: Classifies the surface conditions in the image, with possible values `Dry` and `Wet`
- - `objects`: A dictionary containing information about objects in the image, including:
-   - `bbox`: Bounding boxes for each object, represented with a list of bounding box coordinates
-     `[xmin, ymin, bbox_width, bbox_height]`. Each bounding box is defined with a top-left corner coordinate
-     `(xmin, ymin)` and bounding box width and height `(bbox_width, bbox_height)` in pixels.
-   - `class_idx`: Class indices for each detected object. This is a
-     `ClassLabel` feature that maps to the `class_name` feature.
-   - `class_name`: Class names for each detected object
-   - `id`: Unique identifiers for each detected object
+ To learn more, view and potentially execute the example script [examples/example_hafnia_dataset.py](examples/example_hafnia_dataset.py).
 
  ### Dataset Locally vs. Training-aaS
  An important feature of `load_dataset` is that it will return the full dataset
@@ -354,7 +464,7 @@ curl -LsSf https://astral.sh/uv/install.sh | sh
  Create virtual environment and install python dependencies
 
  ```bash
- uv sync
+ uv sync --dev
  ```
 
  Run tests:
hafnia-0.2.1.dist-info/RECORD ADDED
@@ -0,0 +1,50 @@
+ cli/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ cli/__main__.py,sha256=pKYvJOk0toACDMrgEeUYT3P5EQPFmXdMRIQuLZLd3dc,1603
+ cli/config.py,sha256=-BTdljjC42hXHb1P0yewea9knzgSBFsb909qJ5DEkCo,5531
+ cli/consts.py,sha256=sj0MRwbbCT2Yl77FPddck1VWkFxp7QY6I9l1o75j_aE,963
+ cli/dataset_cmds.py,sha256=VUMhnHGYPtNNJUK9aobKTx2zpVzLex4gTMmyQXuzCVw,1623
+ cli/experiment_cmds.py,sha256=L-k_ZJ4B7I4cA8OvHcheSwXM6nx9aTF9G7eKBzAcOzQ,1961
+ cli/profile_cmds.py,sha256=-HQcFgYI6Rqaefi0Nj-91KhiqPKUj7zOaiJWbHx_bac,3196
+ cli/recipe_cmds.py,sha256=qnMfF-te47HXNkgyA0hm9X3etDQsqMnrVEGDCrzVjZU,1462
+ cli/runc_cmds.py,sha256=QqhQe2sd7tK1Bl2aGfIWRyJjpP6F7Tducg7HULrHsZ4,4958
+ hafnia/__init__.py,sha256=Zphq-cQoX95Z11zm4lkrU-YiAJxddR7IBfwDkxeHoDE,108
+ hafnia/helper_testing.py,sha256=GnaNhXdY81arjCT9M2RUAmvn2-aIzRqlCtbWwGbOIaY,3901
+ hafnia/http.py,sha256=HoPB03IL6e-nglTrw1NGT6sDx1T8VNas5HjTT1QZHnU,3035
+ hafnia/log.py,sha256=sWF8tz78yBtwZ9ddzm19L1MBSBJ3L4G704IGeT1_OEU,784
+ hafnia/torch_helpers.py,sha256=ho65B0WIu_SjbaKPRL4wabDNrnVumWH8QSXVH4r7NAY,11605
+ hafnia/utils.py,sha256=aTZaeHldXn4Jx_AR2BYATxtLCRrBKBjjDFmpSZTSvV4,5138
+ hafnia/data/__init__.py,sha256=o9QjiGbEcNa6r-qDmwwmxPXf-1UitNl5-WxFNcujqsg,111
+ hafnia/data/factory.py,sha256=OY6l6c9UKk6OUDhG4Akb2VgcSaTRLHlbSndAe1HuW2U,813
+ hafnia/dataset/dataset_helpers.py,sha256=WVCpbUfNbHy7MZJqJ3OyJF8k1hSObo3kScxpXT17Sj8,3510
+ hafnia/dataset/dataset_names.py,sha256=mp7A_TOqgoqHUEBCPC4ReKNJ93cxwQB451owoCqD6yM,2120
+ hafnia/dataset/dataset_upload_helper.py,sha256=D1BGaeEar4McpUvXj4Yy8nk1tr12IEVhP_Ma47OoWmU,21150
+ hafnia/dataset/hafnia_dataset.py,sha256=4SJUq7pAqLkcFzgnOUUx8ERraE_sABctOAsONBJExME,27664
+ hafnia/dataset/dataset_recipe/dataset_recipe.py,sha256=DbPLlmshF6DC98Cwko04XtBaXgSg966LZKR6JXD_9Sg,13632
+ hafnia/dataset/dataset_recipe/recipe_transforms.py,sha256=wh1y2XyX0PwOwfuzJ3_17KKng2Rk0zLlgdfSHfS1SyM,1305
+ hafnia/dataset/dataset_recipe/recipe_types.py,sha256=6LxfanhX9ihof1gGSonoC-56zSWsI8k2aS4Uw_QgXoM,5176
+ hafnia/dataset/operations/dataset_stats.py,sha256=tSHPmkXt4WNgjf5-j3jIrsSy1Ajld3619AkUHaesXb4,445
+ hafnia/dataset/operations/dataset_transformations.py,sha256=4ibC11upEtRGJgoFLv8lUnglv2xANZVfNdsvI1BMvfM,2960
+ hafnia/dataset/operations/table_transformations.py,sha256=kCLbLRdiFSx1JG0IWtaKkhWcMtM7hy8zgm0Ehz0zO_g,7639
+ hafnia/dataset/primitives/__init__.py,sha256=LAdTeK5GgmaF1se8f0Yj1lOTNqplXGL87kLeOnv1D8Q,627
+ hafnia/dataset/primitives/bbox.py,sha256=HXYYy5BLNZwh-bO7aiAWg3z0OurUev8ISa-vYey8b8A,6055
+ hafnia/dataset/primitives/bitmask.py,sha256=mq_wchMqGupJDc-a-mJh9uBO_mjHcXpLH49g591doAM,7619
+ hafnia/dataset/primitives/classification.py,sha256=7UmePtrf3imfU56YUro21JvHGoktomMous1vTVKLjHU,2341
+ hafnia/dataset/primitives/point.py,sha256=JCRwb-E4sDafodkg6wqyuAS1Yj-yaJbwiD8aB69_Ros,635
+ hafnia/dataset/primitives/polygon.py,sha256=vhPrYHv6TqQZMTAyv9r3NV8Hu6YRSSD0srB0wOCIwQ4,4289
+ hafnia/dataset/primitives/primitive.py,sha256=7jxcyFADVGf95pjeQHEOqAnR9eucLpxA2h8Blz3ppXI,1253
+ hafnia/dataset/primitives/segmentation.py,sha256=jUMjOmYr9j4An3YSCw5CJC1W8ihXAbus3CXaTOpc7Xw,1905
+ hafnia/dataset/primitives/utils.py,sha256=3gT1as-xXEj8CamoIuBb9gQwUN9Ae9qnqtqF_uEe0zo,1993
+ hafnia/experiment/__init__.py,sha256=OEFE6HqhO5zcTCLZcPcPVjIg7wMFFnvZ1uOtAVhRz7M,85
+ hafnia/experiment/hafnia_logger.py,sha256=dnV3VPzJK7DSeUh0g4Hk9w1g-eSXcVqJD9If0h2d2GE,6885
+ hafnia/platform/__init__.py,sha256=zJsR6Hy_0iUcC9xL-lBnqR0mLfF4EUr_VXa_XQA7SlA,455
+ hafnia/platform/builder.py,sha256=_g8ykQWETz5Y4Np9QU1a6wIzbbJwXCkbiOCA6JcF5Rc,5742
+ hafnia/platform/datasets.py,sha256=J252hrejrBWUdS6hY4lRc9_SbYy7CMD92068lLHjPC8,6953
+ hafnia/platform/download.py,sha256=oJzdxSIDTuw1an7maC6I7A5nZvDaZPhUkuAmyRwN9Kc,6843
+ hafnia/platform/experiment.py,sha256=-nAfTmn1c8sE6pHDCTNZvWDTopkXndarJAPIGvsnk60,2389
+ hafnia/visualizations/colors.py,sha256=003eAJVnBal4abaYIIpsrT7erIOIjTUHHYVJ1Tj1CDc,5226
+ hafnia/visualizations/image_visualizations.py,sha256=RuFFj2fJCm9dxl2Lq0MumJHF81ZnX-IsDsTxm8ZFV9A,7313
+ hafnia-0.2.1.dist-info/METADATA,sha256=A1_OEYNslARBFGoYBPm7_-3YivfUeA8adwUUbsM3UsY,19040
+ hafnia-0.2.1.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
+ hafnia-0.2.1.dist-info/entry_points.txt,sha256=FCJVIQ8GP2VE9I3eeGVF5eLxVDNW_01pOJCpG_CGnMM,45
+ hafnia-0.2.1.dist-info/licenses/LICENSE,sha256=wLZw1B7_mod_CO1H8LXqQgfqlWD6QceJR8--LJYRZGE,1078
+ hafnia-0.2.1.dist-info/RECORD,,
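Each RECORD row above has the form `path,sha256=<digest>,size`, where the digest is an unpadded urlsafe-base64 SHA-256 per the wheel RECORD format. A small sketch of computing such a digest with the standard library; note that the empty `cli/__init__.py` (size 0) yields the well-known empty-input digest shown in the first row:

```python
import base64
import hashlib

def record_digest(data: bytes) -> str:
    """Wheel RECORD-style digest: urlsafe base64 of SHA-256, '=' padding stripped."""
    return base64.urlsafe_b64encode(hashlib.sha256(data).digest()).rstrip(b"=").decode()

# cli/__init__.py is 0 bytes, so its RECORD digest is the digest of empty input.
assert record_digest(b"") == "47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU"
```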
cli/data_cmds.py DELETED
@@ -1,53 +0,0 @@
- from pathlib import Path
- from typing import Optional
-
- import click
- from rich import print as rprint
-
- import cli.consts as consts
- from cli.config import Config
-
-
- @click.group()
- def data():
-     """Manage data interaction"""
-     pass
-
-
- @data.command("get")
- @click.argument("url")
- @click.argument("destination")
- @click.pass_obj
- def data_get(cfg: Config, url: str, destination: click.Path) -> None:
-     """Download resource from Hafnia platform"""
-
-     from hafnia.platform import download_resource
-
-     try:
-         result = download_resource(resource_url=url, destination=str(destination), api_key=cfg.api_key)
-     except Exception:
-         raise click.ClickException(consts.ERROR_GET_RESOURCE)
-
-     rprint(result)
-
-
- @data.command("download")
- @click.argument("dataset_name")
- @click.argument("destination", default=None, required=False)
- @click.option("--force", is_flag=True, default=False, help="Force download")
- @click.pass_obj
- def data_download(cfg: Config, dataset_name: str, destination: Optional[click.Path], force: bool) -> Path:
-     """Download dataset from Hafnia platform"""
-
-     from hafnia.data.factory import download_or_get_dataset_path
-
-     try:
-         path_dataset = download_or_get_dataset_path(
-             dataset_name=dataset_name,
-             cfg=cfg,
-             output_dir=destination,
-             force_redownload=force,
-         )
-     except Exception:
-         raise click.ClickException(consts.ERROR_GET_RESOURCE)
-     return path_dataset
hafnia-0.1.27.dist-info/RECORD DELETED
@@ -1,27 +0,0 @@
- cli/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
- cli/__main__.py,sha256=MX0RT1BP3t59rzCvdUqfw39Kw05HOF4OEtjDTwIU9h8,1594
- cli/config.py,sha256=R9w0NKIOtIxRKNs7ieeUrIKwRkrTlK5PqOVjc5VYljE,4923
- cli/consts.py,sha256=sj0MRwbbCT2Yl77FPddck1VWkFxp7QY6I9l1o75j_aE,963
- cli/data_cmds.py,sha256=BQiythAPwAwudgdUa68v50a345uw5flrcDiBHLGp9lo,1460
- cli/experiment_cmds.py,sha256=L-k_ZJ4B7I4cA8OvHcheSwXM6nx9aTF9G7eKBzAcOzQ,1961
- cli/profile_cmds.py,sha256=-HQcFgYI6Rqaefi0Nj-91KhiqPKUj7zOaiJWbHx_bac,3196
- cli/recipe_cmds.py,sha256=qnMfF-te47HXNkgyA0hm9X3etDQsqMnrVEGDCrzVjZU,1462
- cli/runc_cmds.py,sha256=6fHMi_dEd8g3Cx9PEfU4gJMZf5-G0IUPDcZh6DNq8Mw,4953
- hafnia/__init__.py,sha256=Zphq-cQoX95Z11zm4lkrU-YiAJxddR7IBfwDkxeHoDE,108
- hafnia/http.py,sha256=psCWdNKfKYiBrYD6bezat1AeHh77JJtJrPePiUAjTIk,2948
- hafnia/log.py,sha256=sWF8tz78yBtwZ9ddzm19L1MBSBJ3L4G704IGeT1_OEU,784
- hafnia/torch_helpers.py,sha256=P_Jl4IwqUebKVCOXNe6iTorJZA3S-3d92HV274UHIko,7456
- hafnia/utils.py,sha256=mJ5aOjSVSOrrQnpmaKLK71ld5jYpmtd3HciTIk_Wk88,4659
- hafnia/data/__init__.py,sha256=Pntmo_1fst8OhyrHB60jQ8mhJJ4hL38tdjLvt0YXEJo,73
- hafnia/data/factory.py,sha256=4fZDkWNyOK1QNCmsxsXfSztPJkJW_HBIa_PTdGCYHCM,2551
- hafnia/experiment/__init__.py,sha256=OEFE6HqhO5zcTCLZcPcPVjIg7wMFFnvZ1uOtAVhRz7M,85
- hafnia/experiment/hafnia_logger.py,sha256=usL5pl7XLJP-g1vZrwvbky5YiD6Bg-rOODYYAX5z43I,6830
- hafnia/platform/__init__.py,sha256=I-VIVXDxwBAUzxx8Zx0g_wykyDdFGTsjb_mYLmvxk2Y,443
- hafnia/platform/builder.py,sha256=_g8ykQWETz5Y4Np9QU1a6wIzbbJwXCkbiOCA6JcF5Rc,5742
- hafnia/platform/download.py,sha256=t055axPNHlXTYCQgZHOS2YMQt1I2_bc4G8dltsOKttY,4760
- hafnia/platform/experiment.py,sha256=-nAfTmn1c8sE6pHDCTNZvWDTopkXndarJAPIGvsnk60,2389
- hafnia-0.1.27.dist-info/METADATA,sha256=zClGBhUHMs5pW19BX5avM5tBWvF5KAQxmAvEwRKNeSw,14990
- hafnia-0.1.27.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
- hafnia-0.1.27.dist-info/entry_points.txt,sha256=FCJVIQ8GP2VE9I3eeGVF5eLxVDNW_01pOJCpG_CGnMM,45
- hafnia-0.1.27.dist-info/licenses/LICENSE,sha256=wLZw1B7_mod_CO1H8LXqQgfqlWD6QceJR8--LJYRZGE,1078
- hafnia-0.1.27.dist-info/RECORD,,