hafnia 0.1.26__py3-none-any.whl → 0.2.0__py3-none-any.whl

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
Files changed (38)
  1. cli/__main__.py +2 -2
  2. cli/dataset_cmds.py +60 -0
  3. cli/runc_cmds.py +1 -1
  4. hafnia/data/__init__.py +2 -2
  5. hafnia/data/factory.py +9 -56
  6. hafnia/dataset/dataset_helpers.py +91 -0
  7. hafnia/dataset/dataset_names.py +71 -0
  8. hafnia/dataset/dataset_transformation.py +187 -0
  9. hafnia/dataset/dataset_upload_helper.py +468 -0
  10. hafnia/dataset/hafnia_dataset.py +453 -0
  11. hafnia/dataset/primitives/__init__.py +16 -0
  12. hafnia/dataset/primitives/bbox.py +137 -0
  13. hafnia/dataset/primitives/bitmask.py +182 -0
  14. hafnia/dataset/primitives/classification.py +56 -0
  15. hafnia/dataset/primitives/point.py +25 -0
  16. hafnia/dataset/primitives/polygon.py +100 -0
  17. hafnia/dataset/primitives/primitive.py +44 -0
  18. hafnia/dataset/primitives/segmentation.py +51 -0
  19. hafnia/dataset/primitives/utils.py +51 -0
  20. hafnia/dataset/table_transformations.py +183 -0
  21. hafnia/experiment/hafnia_logger.py +2 -2
  22. hafnia/helper_testing.py +63 -0
  23. hafnia/http.py +5 -3
  24. hafnia/platform/__init__.py +2 -2
  25. hafnia/platform/builder.py +25 -19
  26. hafnia/platform/datasets.py +184 -0
  27. hafnia/platform/download.py +85 -23
  28. hafnia/torch_helpers.py +180 -95
  29. hafnia/utils.py +1 -1
  30. hafnia/visualizations/colors.py +267 -0
  31. hafnia/visualizations/image_visualizations.py +202 -0
  32. {hafnia-0.1.26.dist-info → hafnia-0.2.0.dist-info}/METADATA +212 -99
  33. hafnia-0.2.0.dist-info/RECORD +46 -0
  34. cli/data_cmds.py +0 -53
  35. hafnia-0.1.26.dist-info/RECORD +0 -27
  36. {hafnia-0.1.26.dist-info → hafnia-0.2.0.dist-info}/WHEEL +0 -0
  37. {hafnia-0.1.26.dist-info → hafnia-0.2.0.dist-info}/entry_points.txt +0 -0
  38. {hafnia-0.1.26.dist-info → hafnia-0.2.0.dist-info}/licenses/LICENSE +0 -0
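A "files changed" summary like the one above can be derived mechanically by comparing the RECORD files of the two wheels, since every RECORD line pairs a path with its content hash. The sketch below is illustrative and not part of the hafnia package; `parse_record`/`diff_records` are hypothetical helpers and the sample RECORD strings are shortened stand-ins:

```python
# Sketch: derive added/removed/changed files from two wheel RECORD files.
# RECORD lines have the form: path,sha256=<digest>,size
def parse_record(text: str) -> dict:
    entries = {}
    for line in text.strip().splitlines():
        path, digest, *_ = line.split(",")
        entries[path] = digest
    return entries

def diff_records(old: str, new: str):
    old_e, new_e = parse_record(old), parse_record(new)
    added = sorted(set(new_e) - set(old_e))
    removed = sorted(set(old_e) - set(new_e))
    # A file "changed" if present in both RECORDs but with different hashes
    changed = sorted(p for p in old_e.keys() & new_e.keys() if old_e[p] != new_e[p])
    return added, removed, changed

old_record = "cli/__main__.py,sha256=AAA,1594\ncli/data_cmds.py,sha256=BBB,1460\n"
new_record = "cli/__main__.py,sha256=CCC,1603\ncli/dataset_cmds.py,sha256=DDD,1623\n"
added, removed, changed = diff_records(old_record, new_record)
print(added)    # ['cli/dataset_cmds.py']
print(removed)  # ['cli/data_cmds.py']
print(changed)  # ['cli/__main__.py']
```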
{hafnia-0.1.26.dist-info → hafnia-0.2.0.dist-info}/METADATA
@@ -1,22 +1,27 @@
  Metadata-Version: 2.4
  Name: hafnia
- Version: 0.1.26
+ Version: 0.2.0
  Summary: Python SDK for communication with Hafnia platform.
  Author-email: Milestone Systems <hafniaplatform@milestone.dk>
  License-File: LICENSE
  Requires-Python: >=3.10
  Requires-Dist: boto3>=1.35.91
  Requires-Dist: click>=8.1.8
- Requires-Dist: datasets>=3.2.0
  Requires-Dist: emoji>=2.14.1
  Requires-Dist: flatten-dict>=0.4.2
+ Requires-Dist: more-itertools>=10.7.0
+ Requires-Dist: opencv-python-headless>=4.11.0.86
  Requires-Dist: pathspec>=0.12.1
  Requires-Dist: pillow>=11.1.0
+ Requires-Dist: polars>=1.30.0
  Requires-Dist: pyarrow>=18.1.0
+ Requires-Dist: pycocotools>=2.0.10
  Requires-Dist: pydantic>=2.10.4
  Requires-Dist: rich>=13.9.4
+ Requires-Dist: s5cmd>=0.2.0
  Requires-Dist: seedir>=0.5.0
  Requires-Dist: tqdm>=4.67.1
+ Requires-Dist: xxhash>=3.5.0
  Description-Content-Type: text/markdown

  # Hafnia
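The dependency delta in this hunk (the `datasets` requirement dropped; `more-itertools`, `opencv-python-headless`, `polars`, `pycocotools`, `s5cmd`, `xxhash` added) can be read programmatically from the `Requires-Dist` headers of the two METADATA files. A minimal stdlib-only sketch, with `requires_dist` as a hypothetical helper and shortened METADATA stand-ins:

```python
# Sketch: collect Requires-Dist entries from a core-metadata text and diff two versions.
def requires_dist(metadata: str) -> set:
    return {
        line.split(":", 1)[1].strip()
        for line in metadata.splitlines()
        if line.startswith("Requires-Dist:")
    }

old_meta = "Requires-Dist: click>=8.1.8\nRequires-Dist: datasets>=3.2.0\n"
new_meta = "Requires-Dist: click>=8.1.8\nRequires-Dist: polars>=1.30.0\n"
print(sorted(requires_dist(new_meta) - requires_dist(old_meta)))  # ['polars>=1.30.0']
print(sorted(requires_dist(old_meta) - requires_dist(new_meta)))  # ['datasets>=3.2.0']
```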
@@ -28,8 +33,8 @@ The package includes the following interfaces:

  - `cli`: A Command Line Interface (CLI) to 1) configure/connect to Hafnia's [Training-aaS](https://hafnia.readme.io/docs/training-as-a-service) and 2) create and
  launch recipe scripts.
- - `hafnia`: A python package with helper functions to load and interact with sample datasets and an experiment
- tracker (`HafniaLogger`).
+ - `hafnia`: A python package including `HafniaDataset` to manage datasets and `HafniaLogger` to do
+ experiment tracking.


  ## The Concept: Training as a Service (Training-aaS)
@@ -76,7 +81,7 @@ Copy the key and save it for later use.
  1. Download `mnist` from terminal to verify that your configuration is working.

  ```bash
- hafnia data download mnist --force
+ hafnia dataset download mnist --force
  ```

  ## Getting started: Loading dataset samples
@@ -84,115 +89,223 @@ With Hafnia configured on your local machine, it is now possible to download
  and explore the dataset sample with a python script:

  ```python
- from hafnia.data import load_dataset
+ from hafnia.data import load_dataset, get_dataset_path
+ from hafnia.dataset.hafnia_dataset import HafniaDataset

- dataset_splits = load_dataset("mnist")
+ # To download the sample dataset use:
+ path_dataset = get_dataset_path("midwest-vehicle-detection")
  ```

- ### Dataset Format
- The returned sample dataset is a [hugging face dataset](https://huggingface.co/docs/datasets/index)
- and contains train, validation and test splits.
+ This will download the dataset sample `midwest-vehicle-detection` to the local `.data/datasets/` folder
+ in a human-readable format.
+
+ Images are stored in the `data` folder, general dataset information is stored in `dataset_info.json`,
+ and annotations are stored as both `annotations.jsonl` (JSONL) and `annotations.parquet`.
+
+ ```bash
+ $ cd .data/datasets/
+ $ tree midwest-vehicle-detection
+ midwest-vehicle-detection
+ └── sample
+     ├── annotations.jsonl
+     ├── annotations.parquet
+     ├── data
+     │   ├── video_0026a86b-2f43-49f2-a17c-59244d10a585_1fps_mp4_frame_00000.png
+     │   ....
+     │   ├── video_ff17d777-e783-44e2-9bff-a4adac73de4b_1fps_mp4_frame_00000.png
+     │   └── video_ff17d777-e783-44e2-9bff-a4adac73de4b_1fps_mp4_frame_00100.png
+     └── dataset_info.json
+
+ 3 directories, 217 files
+ ```
+
+ You can interact with the data as you want, but we also provide `HafniaDataset`
+ for loading/saving, managing and interacting with the dataset.
+
+ We recommend visiting and potentially executing the example script [examples/example_hafnia_dataset.py](examples/example_hafnia_dataset.py)
+ to see how to use the `HafniaDataset` class and its methods.
+
+ Below is a short introduction to the `HafniaDataset` class.
+
+ ```python
+ from hafnia.dataset.hafnia_dataset import HafniaDataset, Sample
+
+ # Load dataset
+ dataset = HafniaDataset.read_from_path(path_dataset)
+
+ # Alternatively, you can use the 'load_dataset' function to download and load the dataset in one go.
+ # dataset = load_dataset("midwest-vehicle-detection")
+
+ # Print dataset information
+ dataset.print_stats()
+
+ # Create a dataset split for training
+ dataset_train = dataset.create_split_dataset("train")
+ ```
+
+ The `HafniaDataset` object provides a convenient way to interact with the dataset, including methods for
+ creating splits, accessing samples, printing statistics, and saving to and loading from disk.
+
+ In essence, the `HafniaDataset` class contains `dataset.info` with dataset information
+ and `dataset.table` with annotations as a polars DataFrame.

  ```python
- print(dataset_splits)
-
- # Output:
- >>> DatasetDict({
-     train: Dataset({
-         features: ['image_id', 'image', 'height', 'width', 'objects', 'Weather', 'Surface Conditions'],
-         num_rows: 172
-     })
-     validation: Dataset({
-         features: ['image_id', 'image', 'height', 'width', 'objects', 'Weather', 'Surface Conditions'],
-         num_rows: 21
-     })
-     test: Dataset({
-         features: ['image_id', 'image', 'height', 'width', 'objects', 'Weather', 'Surface Conditions'],
-         num_rows: 21
-     })
- })
+ # Annotations are stored in a polars DataFrame
+ print(dataset.table.head(2))
+ shape: (2, 14)
+ ┌──────────┬────────────────────────────────┬────────┬───────┬───┬───────────────────────────────┬──────────┬──────────┬───────────────────────────────┐
+ │ image_id ┆ file_name                      ┆ height ┆ width ┆ … ┆ objects                       ┆ bitmasks ┆ polygons ┆ meta                          │
+ │ ---      ┆ ---                            ┆ ---    ┆ ---   ┆   ┆ ---                           ┆ ---      ┆ ---      ┆ ---                           │
+ │ str      ┆ str                            ┆ i64    ┆ i64   ┆   ┆ list[struct[12]]              ┆ null     ┆ null     ┆ struct[5]                     │
+ ╞══════════╪════════════════════════════════╪════════╪═══════╪═══╪═══════════════════════════════╪══════════╪══════════╪═══════════════════════════════╡
+ │ 7800     ┆ /home/ubuntu/code/hafnia/.data ┆ 1080   ┆ 1920  ┆ … ┆ [{0.0492,0.0357,0.2083,0.23," ┆ null     ┆ null     ┆ {120.0,1.0,"2024-07-10T18:30: │
+ │          ┆ …                              ┆        ┆       ┆   ┆ V…                            ┆          ┆          ┆ 0…                            │
+ │ 7900     ┆ /home/ubuntu/code/hafnia/.data ┆ 1080   ┆ 1920  ┆ … ┆ [{0.146382,0.078704,0.42963,0 ┆ null     ┆ null     ┆ {120.0,1.0,"2024-07-10T18:30: │
+ │          ┆ …                              ┆        ┆       ┆   ┆ .…                            ┆          ┆          ┆ 0…                            │
+ └──────────┴────────────────────────────────┴────────┴───────┴───┴───────────────────────────────┴──────────┴──────────┴───────────────────────────────┘
+ ```

+ ```python
+ # General dataset information is stored in `dataset.info`
+ rich.print(dataset.info)
+ DatasetInfo(
+     dataset_name='midwest-vehicle-detection',
+     version='1.0.0',
+     tasks=[
+         TaskInfo(
+             primitive=<class 'hafnia.dataset.primitives.Bbox'>,
+             class_names=[
+                 'Person',
+                 'Vehicle.Bicycle',
+                 'Vehicle.Motorcycle',
+                 'Vehicle.Car',
+                 'Vehicle.Van',
+                 'Vehicle.RV',
+                 'Vehicle.Single_Truck',
+                 'Vehicle.Combo_Truck',
+                 'Vehicle.Pickup_Truck',
+                 'Vehicle.Trailer',
+                 'Vehicle.Emergency_Vehicle',
+                 'Vehicle.Bus',
+                 'Vehicle.Heavy_Duty_Vehicle'
+             ],
+             name='bboxes'
+         ),
+         TaskInfo(primitive=<class 'hafnia.dataset.primitives.Classification'>, class_names=['Clear', 'Foggy'], name='Weather'),
+         TaskInfo(primitive=<class 'hafnia.dataset.primitives.Classification'>, class_names=['Dry', 'Wet'], name='Surface Conditions')
+     ],
+     meta={
+         'n_videos': 109,
+         'n_cameras': 20,
+         'duration': 13080.0,
+         'duration_average': 120.0,
+         ...
+     }
+ )
  ```

- A Hugging Face dataset is a dictionary with splits, where each split is a `Dataset` object.
- Each `Dataset` is structured as a table with a set of columns (also called features) and a row for each sample.
+ You can iterate and access samples in the dataset using the `HafniaDataset` object.
+ Each sample contains image and annotation information.

- The features of the dataset can be viewed with the `features` attribute.
  ```python
- # View features of the train split
- pprint.pprint(dataset["train"].features)
- {'Surface Conditions': ClassLabel(names=['Dry', 'Wet'], id=None),
-  'Weather': ClassLabel(names=['Clear', 'Foggy'], id=None),
-  'height': Value(dtype='int64', id=None),
-  'image': Image(mode=None, decode=True, id=None),
-  'image_id': Value(dtype='int64', id=None),
-  'objects': Sequence(feature={'bbox': Sequence(feature=Value(dtype='int64',
-                                                              id=None),
-                                                length=-1,
-                                                id=None),
-                               'class_idx': ClassLabel(names=['Vehicle.Bicycle',
-                                                              'Vehicle.Motorcycle',
-                                                              'Vehicle.Car',
-                                                              'Vehicle.Van',
-                                                              'Vehicle.RV',
-                                                              'Vehicle.Single_Truck',
-                                                              'Vehicle.Combo_Truck',
-                                                              'Vehicle.Pickup_Truck',
-                                                              'Vehicle.Trailer',
-                                                              'Vehicle.Emergency_Vehicle',
-                                                              'Vehicle.Bus',
-                                                              'Vehicle.Heavy_Duty_Vehicle'],
-                                                       id=None),
-                               'class_name': Value(dtype='string', id=None),
-                               'id': Value(dtype='string', id=None)},
-                      length=-1,
-                      id=None),
-  'width': Value(dtype='int64', id=None)}
+ from hafnia.dataset.hafnia_dataset import HafniaDataset, Sample
+ # Access the first sample in the dataset either by index or by iterating over the dataset
+ sample_dict = dataset[0]
+
+ for sample_dict in dataset:
+     sample = Sample(**sample_dict)
+     print(sample.sample_id, sample.objects)
+     break
  ```
+ Note that it is possible to create a `Sample` object from the sample dictionary.
+ This is useful for accessing the image and annotations in a structured way.

- View the first sample in the training set:
  ```python
- # Print sample from the training set
- pprint.pprint(dataset["train"][0])
-
- {'image': <PIL.PngImagePlugin.PngImageFile image mode=RGB size=1920x1080 at 0x79D6292C5ED0>,
-  'image_id': 4920,
-  'height': 1080,
-  'Weather': 0,
-  'Surface Conditions': 0,
-  'objects': {'bbox': [[441, 180, 121, 126],
-                       [549, 151, 131, 103],
-                       [1845, 722, 68, 130],
-                       [1810, 571, 110, 149]],
-              'class_idx': [7, 7, 2, 2],
-              'class_name': ['Vehicle.Pickup_Truck',
-                             'Vehicle.Pickup_Truck',
-                             'Vehicle.Car',
-                             'Vehicle.Car'],
-              'id': ['HW6WiLAJ', 'T/ccFpRi', 'CS0O8B6W', 'DKrJGzjp']},
-  'width': 1920}
+ # By unpacking the sample dictionary, you can create a `Sample` object.
+ sample = Sample(**sample_dict)
+
+ # Use the `Sample` object to easily read image and draw annotations
+ image = sample.read_image()
+ image_annotations = sample.draw_annotations()
+ ```

+ Note that the `Sample` object contains all information about the sample, including image and metadata.
+ It also contains annotations as primitive types such as `Bbox` and `Classification`.
+
+ ```python
+ rich.print(sample)
+ Sample(
+     image_id='7800',
+     file_name='data/video_0026a86b-2f43-49f2-a17c-59244d10a585_1fps_mp4_frame_0
+ 0000.png',
+     height=1080,
+     width=1920,
+     split='test',
+     is_sample=True,
+     frame_number=None,
+     video_name=None,
+     remote_path=None,
+     classifications=[
+         Classification(
+             class_name='Clear',
+             class_idx=0,
+             object_id=None,
+             confidence=None,
+             ground_truth=True,
+             task_name='Weather',
+             meta=None
+         ),
+         Classification(
+             class_name='Day',
+             class_idx=3,
+             object_id=None,
+             confidence=None,
+             ground_truth=True,
+             task_name='Time of Day',
+             meta=None
+         ),
+         ...
+     ],
+     objects=[
+         Bbox(
+             height=0.0492,
+             width=0.0357,
+             top_left_x=0.2083,
+             top_left_y=0.23,
+             class_name='Vehicle.Car',
+             class_idx=3,
+             object_id='cXT4NRVu',
+             confidence=None,
+             ground_truth=True,
+             task_name='bboxes',
+             meta=None
+         ),
+         Bbox(
+             height=0.0457,
+             width=0.0408,
+             top_left_x=0.2521,
+             top_left_y=0.2153,
+             class_name='Vehicle.Car',
+             class_idx=3,
+             object_id='MelbIIDU',
+             confidence=None,
+             ground_truth=True,
+             task_name='bboxes',
+             meta=None
+         ),
+         ...
+     ],
+     bitmasks=None,  # Optionally a list of Bitmask objects List[Bitmask]
+     polygons=None,  # Optionally a list of Polygon objects List[Polygon]
+     meta={
+         'video.data_duration': 120.0,
+         'video.data_fps': 1.0,
+         ...
+     }
+ )
  ```

- For hafnia based datasets, we want to standardized how a dataset and dataset tasks are represented.
- We have defined a set of features that are common across all datasets in the Hafnia data library.
-
- - `image`: The image itself, stored as a PIL image
- - `height`: The height of the image in pixels
- - `width`: The width of the image in pixels
- - `[IMAGE_CLASSIFICATION_TASK]`: [Optional] Image classification tasks are top-level `ClassLabel` feature.
-   `ClassLabel` is a Hugging Face feature that maps class indices to class names.
-   In above example we have two classification tasks:
-   - `Weather`: Classifies the weather conditions in the image, with possible values `Clear` and `Foggy`
-   - `Surface Conditions`: Classifies the surface conditions in the image, with possible values `Dry` and `Wet`
- - `objects`: A dictionary containing information about objects in the image, including:
-   - `bbox`: Bounding boxes for each object, represented with a list of bounding box coordinates
-     `[xmin, ymin, bbox_width, bbox_height]`. Each bounding box is defined with a top-left corner coordinate
-     `(xmin, ymin)` and bounding box width and height `(bbox_width, bbox_height)` in pixels.
-   - `class_idx`: Class indices for each detected object. This is a
-     `ClassLabel` feature that maps to the `class_name` feature.
-   - `class_name`: Class names for each detected object
-   - `id`: Unique identifiers for each detected object
+ To learn more, view and potentially execute the example script [examples/example_hafnia_dataset.py](examples/example_hafnia_dataset.py).

  ### Dataset Locally vs. Training-aaS
  An important feature of `load_dataset` is that it will return the full dataset
@@ -354,7 +467,7 @@ curl -LsSf https://astral.sh/uv/install.sh | sh
  Create virtual environment and install python dependencies

  ```bash
- uv sync
+ uv sync --dev
  ```

  Run tests:
hafnia-0.2.0.dist-info/RECORD ADDED
@@ -0,0 +1,46 @@
+ cli/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
+ cli/__main__.py,sha256=pKYvJOk0toACDMrgEeUYT3P5EQPFmXdMRIQuLZLd3dc,1603
+ cli/config.py,sha256=R9w0NKIOtIxRKNs7ieeUrIKwRkrTlK5PqOVjc5VYljE,4923
+ cli/consts.py,sha256=sj0MRwbbCT2Yl77FPddck1VWkFxp7QY6I9l1o75j_aE,963
+ cli/dataset_cmds.py,sha256=VUMhnHGYPtNNJUK9aobKTx2zpVzLex4gTMmyQXuzCVw,1623
+ cli/experiment_cmds.py,sha256=L-k_ZJ4B7I4cA8OvHcheSwXM6nx9aTF9G7eKBzAcOzQ,1961
+ cli/profile_cmds.py,sha256=-HQcFgYI6Rqaefi0Nj-91KhiqPKUj7zOaiJWbHx_bac,3196
+ cli/recipe_cmds.py,sha256=qnMfF-te47HXNkgyA0hm9X3etDQsqMnrVEGDCrzVjZU,1462
+ cli/runc_cmds.py,sha256=QqhQe2sd7tK1Bl2aGfIWRyJjpP6F7Tducg7HULrHsZ4,4958
+ hafnia/__init__.py,sha256=Zphq-cQoX95Z11zm4lkrU-YiAJxddR7IBfwDkxeHoDE,108
+ hafnia/helper_testing.py,sha256=4pIG5sA1FycbZRFmfG7CyhZMdQgWob86JCzX8ALuHfQ,2344
+ hafnia/http.py,sha256=HoPB03IL6e-nglTrw1NGT6sDx1T8VNas5HjTT1QZHnU,3035
+ hafnia/log.py,sha256=sWF8tz78yBtwZ9ddzm19L1MBSBJ3L4G704IGeT1_OEU,784
+ hafnia/torch_helpers.py,sha256=ho65B0WIu_SjbaKPRL4wabDNrnVumWH8QSXVH4r7NAY,11605
+ hafnia/utils.py,sha256=DFMALTCOxGsTWZ6qIVbm55Wr8xxwukj87sb0oYB6w64,4660
+ hafnia/data/__init__.py,sha256=o9QjiGbEcNa6r-qDmwwmxPXf-1UitNl5-WxFNcujqsg,111
+ hafnia/data/factory.py,sha256=RC36LSPCULJqHn93vGeadzNXoItHhsYZA8YxURY1nd0,717
+ hafnia/dataset/dataset_helpers.py,sha256=WVCpbUfNbHy7MZJqJ3OyJF8k1hSObo3kScxpXT17Sj8,3510
+ hafnia/dataset/dataset_names.py,sha256=DhnFDlaq-PDNeGpvE1pFmhZ2M7HKeQNyRo6PwuchVe8,2074
+ hafnia/dataset/dataset_transformation.py,sha256=LyXt4LOQaZ4EkYspKlqp_W3IbHJxB1AxVNkuuShiKg0,7761
+ hafnia/dataset/dataset_upload_helper.py,sha256=D1BGaeEar4McpUvXj4Yy8nk1tr12IEVhP_Ma47OoWmU,21150
+ hafnia/dataset/hafnia_dataset.py,sha256=IJfPQivzM0wAZgGP7fx7-Mpoe8mhrEpLzBJ9JP3N6D4,19970
+ hafnia/dataset/table_transformations.py,sha256=WuAbEXaNKyacmY4FShNkUoG3-rVC5pTKntO8Z7CJNoI,7628
+ hafnia/dataset/primitives/__init__.py,sha256=LAdTeK5GgmaF1se8f0Yj1lOTNqplXGL87kLeOnv1D8Q,627
+ hafnia/dataset/primitives/bbox.py,sha256=HXYYy5BLNZwh-bO7aiAWg3z0OurUev8ISa-vYey8b8A,6055
+ hafnia/dataset/primitives/bitmask.py,sha256=mq_wchMqGupJDc-a-mJh9uBO_mjHcXpLH49g591doAM,7619
+ hafnia/dataset/primitives/classification.py,sha256=7UmePtrf3imfU56YUro21JvHGoktomMous1vTVKLjHU,2341
+ hafnia/dataset/primitives/point.py,sha256=JCRwb-E4sDafodkg6wqyuAS1Yj-yaJbwiD8aB69_Ros,635
+ hafnia/dataset/primitives/polygon.py,sha256=vhPrYHv6TqQZMTAyv9r3NV8Hu6YRSSD0srB0wOCIwQ4,4289
+ hafnia/dataset/primitives/primitive.py,sha256=7jxcyFADVGf95pjeQHEOqAnR9eucLpxA2h8Blz3ppXI,1253
+ hafnia/dataset/primitives/segmentation.py,sha256=jUMjOmYr9j4An3YSCw5CJC1W8ihXAbus3CXaTOpc7Xw,1905
+ hafnia/dataset/primitives/utils.py,sha256=3gT1as-xXEj8CamoIuBb9gQwUN9Ae9qnqtqF_uEe0zo,1993
+ hafnia/experiment/__init__.py,sha256=OEFE6HqhO5zcTCLZcPcPVjIg7wMFFnvZ1uOtAVhRz7M,85
+ hafnia/experiment/hafnia_logger.py,sha256=TNGsb5aP_FE0t2NQofik6chK6dDWckYJMikiGS9GAd0,6855
+ hafnia/platform/__init__.py,sha256=zJsR6Hy_0iUcC9xL-lBnqR0mLfF4EUr_VXa_XQA7SlA,455
+ hafnia/platform/builder.py,sha256=_g8ykQWETz5Y4Np9QU1a6wIzbbJwXCkbiOCA6JcF5Rc,5742
+ hafnia/platform/datasets.py,sha256=Sc2vkT3QYMwaSWp2Pf2E2BLW4YG40Tqdhn6vgkTgpQk,6643
+ hafnia/platform/download.py,sha256=oJzdxSIDTuw1an7maC6I7A5nZvDaZPhUkuAmyRwN9Kc,6843
+ hafnia/platform/experiment.py,sha256=-nAfTmn1c8sE6pHDCTNZvWDTopkXndarJAPIGvsnk60,2389
+ hafnia/visualizations/colors.py,sha256=003eAJVnBal4abaYIIpsrT7erIOIjTUHHYVJ1Tj1CDc,5226
+ hafnia/visualizations/image_visualizations.py,sha256=mNlRfG-ooJw0vqKyNCuAZOBpjiFuqJH3mjd6spInJs4,7318
+ hafnia-0.2.0.dist-info/METADATA,sha256=sYxIR2FnabM4KApFnNw_zKHPEDjQ9abU3dpon4RTXOY,19153
+ hafnia-0.2.0.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
+ hafnia-0.2.0.dist-info/entry_points.txt,sha256=FCJVIQ8GP2VE9I3eeGVF5eLxVDNW_01pOJCpG_CGnMM,45
+ hafnia-0.2.0.dist-info/licenses/LICENSE,sha256=wLZw1B7_mod_CO1H8LXqQgfqlWD6QceJR8--LJYRZGE,1078
+ hafnia-0.2.0.dist-info/RECORD,,
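A side note on the RECORD format above: each entry is `path,sha256=<digest>,size`, where the digest is the URL-safe base64 encoding of the file's SHA-256 hash with trailing `=` padding stripped (per the wheel spec). The sketch below (not part of the hafnia package) reproduces the hash of the empty `cli/__init__.py` entry in the listing:

```python
# Sketch: compute a wheel-RECORD-style hash entry for a file's bytes.
import base64
import hashlib

def record_hash(data: bytes) -> str:
    digest = hashlib.sha256(data).digest()
    # URL-safe base64, padding stripped, as used in RECORD files
    return "sha256=" + base64.urlsafe_b64encode(digest).rstrip(b"=").decode()

# An empty file (size 0) yields the digest seen for cli/__init__.py above
print(record_hash(b""))  # sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU
```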
cli/data_cmds.py DELETED
@@ -1,53 +0,0 @@
- from pathlib import Path
- from typing import Optional
-
- import click
- from rich import print as rprint
-
- import cli.consts as consts
- from cli.config import Config
-
-
- @click.group()
- def data():
-     """Manage data interaction"""
-     pass
-
-
- @data.command("get")
- @click.argument("url")
- @click.argument("destination")
- @click.pass_obj
- def data_get(cfg: Config, url: str, destination: click.Path) -> None:
-     """Download resource from Hafnia platform"""
-
-     from hafnia.platform import download_resource
-
-     try:
-         result = download_resource(resource_url=url, destination=str(destination), api_key=cfg.api_key)
-     except Exception:
-         raise click.ClickException(consts.ERROR_GET_RESOURCE)
-
-     rprint(result)
-
-
- @data.command("download")
- @click.argument("dataset_name")
- @click.argument("destination", default=None, required=False)
- @click.option("--force", is_flag=True, default=False, help="Force download")
- @click.pass_obj
- def data_download(cfg: Config, dataset_name: str, destination: Optional[click.Path], force: bool) -> Path:
-     """Download dataset from Hafnia platform"""
-
-     from hafnia.data.factory import download_or_get_dataset_path
-
-     try:
-         path_dataset = download_or_get_dataset_path(
-             dataset_name=dataset_name,
-             cfg=cfg,
-             output_dir=destination,
-             force_redownload=force,
-         )
-     except Exception:
-         raise click.ClickException(consts.ERROR_GET_RESOURCE)
-     return path_dataset
hafnia-0.1.26.dist-info/RECORD DELETED
@@ -1,27 +0,0 @@
- cli/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
- cli/__main__.py,sha256=MX0RT1BP3t59rzCvdUqfw39Kw05HOF4OEtjDTwIU9h8,1594
- cli/config.py,sha256=R9w0NKIOtIxRKNs7ieeUrIKwRkrTlK5PqOVjc5VYljE,4923
- cli/consts.py,sha256=sj0MRwbbCT2Yl77FPddck1VWkFxp7QY6I9l1o75j_aE,963
- cli/data_cmds.py,sha256=BQiythAPwAwudgdUa68v50a345uw5flrcDiBHLGp9lo,1460
- cli/experiment_cmds.py,sha256=L-k_ZJ4B7I4cA8OvHcheSwXM6nx9aTF9G7eKBzAcOzQ,1961
- cli/profile_cmds.py,sha256=-HQcFgYI6Rqaefi0Nj-91KhiqPKUj7zOaiJWbHx_bac,3196
- cli/recipe_cmds.py,sha256=qnMfF-te47HXNkgyA0hm9X3etDQsqMnrVEGDCrzVjZU,1462
- cli/runc_cmds.py,sha256=6fHMi_dEd8g3Cx9PEfU4gJMZf5-G0IUPDcZh6DNq8Mw,4953
- hafnia/__init__.py,sha256=Zphq-cQoX95Z11zm4lkrU-YiAJxddR7IBfwDkxeHoDE,108
- hafnia/http.py,sha256=psCWdNKfKYiBrYD6bezat1AeHh77JJtJrPePiUAjTIk,2948
- hafnia/log.py,sha256=sWF8tz78yBtwZ9ddzm19L1MBSBJ3L4G704IGeT1_OEU,784
- hafnia/torch_helpers.py,sha256=P_Jl4IwqUebKVCOXNe6iTorJZA3S-3d92HV274UHIko,7456
- hafnia/utils.py,sha256=mJ5aOjSVSOrrQnpmaKLK71ld5jYpmtd3HciTIk_Wk88,4659
- hafnia/data/__init__.py,sha256=Pntmo_1fst8OhyrHB60jQ8mhJJ4hL38tdjLvt0YXEJo,73
- hafnia/data/factory.py,sha256=4fZDkWNyOK1QNCmsxsXfSztPJkJW_HBIa_PTdGCYHCM,2551
- hafnia/experiment/__init__.py,sha256=OEFE6HqhO5zcTCLZcPcPVjIg7wMFFnvZ1uOtAVhRz7M,85
- hafnia/experiment/hafnia_logger.py,sha256=usL5pl7XLJP-g1vZrwvbky5YiD6Bg-rOODYYAX5z43I,6830
- hafnia/platform/__init__.py,sha256=I-VIVXDxwBAUzxx8Zx0g_wykyDdFGTsjb_mYLmvxk2Y,443
- hafnia/platform/builder.py,sha256=OFPnOjE3bAbWjUgYErXtffDKTiW_9ol95eVzKqL27WM,5433
- hafnia/platform/download.py,sha256=t055axPNHlXTYCQgZHOS2YMQt1I2_bc4G8dltsOKttY,4760
- hafnia/platform/experiment.py,sha256=-nAfTmn1c8sE6pHDCTNZvWDTopkXndarJAPIGvsnk60,2389
- hafnia-0.1.26.dist-info/METADATA,sha256=Lds8gx_ffd8_l9kByvK_e-HPFehSSUv8E_85d8ZelSE,14990
- hafnia-0.1.26.dist-info/WHEEL,sha256=qtCwoSJWgHk21S1Kb4ihdzI2rlJ1ZKaIurTj_ngOhyQ,87
- hafnia-0.1.26.dist-info/entry_points.txt,sha256=FCJVIQ8GP2VE9I3eeGVF5eLxVDNW_01pOJCpG_CGnMM,45
- hafnia-0.1.26.dist-info/licenses/LICENSE,sha256=wLZw1B7_mod_CO1H8LXqQgfqlWD6QceJR8--LJYRZGE,1078
- hafnia-0.1.26.dist-info/RECORD,,