datamint 2.3.5__py3-none-any.whl → 2.4.0__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- datamint/api/base_api.py +42 -8
- datamint/api/client.py +2 -0
- datamint/apihandler/base_api_handler.py +0 -1
- datamint/apihandler/dto/annotation_dto.py +2 -0
- datamint/dataset/base_dataset.py +4 -0
- datamint/lightning/__init__.py +1 -0
- datamint/lightning/datamintdatamodule.py +103 -0
- datamint/mlflow/__init__.py +46 -0
- datamint/mlflow/artifact/__init__.py +1 -0
- datamint/mlflow/artifact/datamint_artifacts_repo.py +8 -0
- datamint/mlflow/env_utils.py +109 -0
- datamint/mlflow/env_vars.py +5 -0
- datamint/mlflow/lightning/callbacks/__init__.py +1 -0
- datamint/mlflow/lightning/callbacks/modelcheckpoint.py +338 -0
- datamint/mlflow/models/__init__.py +94 -0
- datamint/mlflow/tracking/datamint_store.py +46 -0
- datamint/mlflow/tracking/default_experiment.py +27 -0
- datamint/mlflow/tracking/fluent.py +78 -0
- datamint-2.4.0.dist-info/METADATA +320 -0
- {datamint-2.3.5.dist-info → datamint-2.4.0.dist-info}/RECORD +22 -9
- datamint-2.4.0.dist-info/entry_points.txt +18 -0
- datamint-2.3.5.dist-info/METADATA +0 -125
- datamint-2.3.5.dist-info/entry_points.txt +0 -4
- {datamint-2.3.5.dist-info → datamint-2.4.0.dist-info}/WHEEL +0 -0
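The new files above all sit under the `datamint.lightning` and `datamint.mlflow` subpackages. A minimal smoke-test sketch, assuming only the module paths listed above and the import paths used in the bundled README below, to confirm the new subpackages are importable after upgrading:

```python
# Hedged sketch: imports the 2.4.0 additions by module path.
# Paths are taken from the file list above; adjust if your install differs.
import importlib

for mod in (
    "datamint.lightning",                   # DatamintDataModule
    "datamint.mlflow",                      # MLflow plugin package
    "datamint.mlflow.lightning.callbacks",  # MLFlowModelCheckpoint
):
    importlib.import_module(mod)
    print(f"ok: {mod}")
```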
@@ -0,0 +1,320 @@
+Metadata-Version: 2.4
+Name: datamint
+Version: 2.4.0
+Summary: A library for interacting with the Datamint API, designed for efficient data management, processing and Deep Learning workflows.
+Requires-Python: >=3.10
+Classifier: Programming Language :: Python :: 3
+Classifier: Programming Language :: Python :: 3.10
+Classifier: Programming Language :: Python :: 3.11
+Classifier: Programming Language :: Python :: 3.12
+Classifier: Programming Language :: Python :: 3.13
+Classifier: Programming Language :: Python :: 3.14
+Provides-Extra: dev
+Provides-Extra: docs
+Requires-Dist: Deprecated (>=1.2.0)
+Requires-Dist: aiohttp (>=3.0.0,<4.0.0)
+Requires-Dist: aioresponses (>=0.7.8,<0.8.0) ; extra == "dev"
+Requires-Dist: albumentations (>=2.0.0)
+Requires-Dist: backports-strenum ; python_version < "3.11"
+Requires-Dist: datamintapi (==0.0.*)
+Requires-Dist: httpx
+Requires-Dist: humanize (>=4.0.0,<5.0.0)
+Requires-Dist: lazy-loader (>=0.3.0)
+Requires-Dist: lightning (>=2.0.0,!=2.5.1,!=2.5.1.post0)
+Requires-Dist: matplotlib
+Requires-Dist: medimgkit (>=0.7.3)
+Requires-Dist: mlflow (>=2.0.0,<3.0.0)
+Requires-Dist: nest-asyncio (>=1.0.0,<2.0.0)
+Requires-Dist: nibabel (>=4.0.0)
+Requires-Dist: numpy
+Requires-Dist: opencv-python (>=4.0.0)
+Requires-Dist: pandas (>=2.0.0)
+Requires-Dist: platformdirs (>=4.0.0,<5.0.0)
+Requires-Dist: pydantic (>=2.6.4)
+Requires-Dist: pydicom (>=3.0.0,<4.0.0)
+Requires-Dist: pylibjpeg (>=2.0.0,<3.0.0)
+Requires-Dist: pylibjpeg-libjpeg (>=2.0.0,<3.0.0)
+Requires-Dist: pytest (>=7.0.0,<8.0.0) ; extra == "dev"
+Requires-Dist: pytest-cov (>=4.0.0,<5.0.0) ; extra == "dev"
+Requires-Dist: pyyaml (>=5.0.0)
+Requires-Dist: requests (>=2.0.0,<3.0.0)
+Requires-Dist: responses (>=0.20.0,<0.21.0) ; extra == "dev"
+Requires-Dist: rich (>=10.0.0)
+Requires-Dist: setuptools (>=57.0) ; extra == "docs"
+Requires-Dist: sphinx (>=5.0) ; extra == "docs"
+Requires-Dist: sphinx-tabs (>=3.0.0) ; extra == "docs"
+Requires-Dist: sphinx_rtd_theme (>=2.0.0) ; extra == "docs"
+Requires-Dist: torch (>=1.2.0,!=2.3.0)
+Requires-Dist: torchvision (>=0.18.0)
+Requires-Dist: tqdm (>=4.0.0,<5.0.0)
+Requires-Dist: typing_extensions (>=4.0.0)
+Description-Content-Type: text/markdown
+
+# Datamint Python API
+
+
+[](https://www.python.org/downloads/)
+
+A comprehensive Python SDK for interacting with the Datamint platform, providing seamless integration for medical imaging workflows, dataset management, and machine learning experiments.
+
+## 📋 Table of Contents
+
+- [Features](#-features)
+- [Installation](#-installation)
+- [Setup API key](#setup-api-key)
+- [Documentation](#-documentation)
+- [Key Components](#-key-components)
+- [Command Line Tools](#️-command-line-tools)
+- [Examples](#-examples)
+- [Support](#-support)
+
+## 🚀 Features
+
+- **Dataset Management**: Download, upload, and manage medical imaging datasets
+- **Annotation Tools**: Create, upload, and manage annotations (segmentations, labels, measurements)
+- **Experiment Tracking**: Integrated MLflow support for experiment management
+- **PyTorch Lightning Integration**: Streamlined ML workflows with Lightning DataModules and callbacks
+- **DICOM Support**: Native handling of DICOM files with anonymization capabilities
+- **Multi-format Support**: PNG, JPEG, NIfTI, and other medical imaging formats
+
+See the full documentation at https://sonanceai.github.io/datamint-python-api/
+
+## 📦 Installation
+
+> [!NOTE]
+> We recommend using a virtual environment to avoid package conflicts.
+
+### From PyPI
+
+To be released soon.
+
+### From Source
+
+```bash
+pip install git+https://github.com/SonanceAI/datamint-python-api
+```
+
+### Virtual Environment Setup
+
+<details>
+<summary>Click to expand virtual environment setup instructions</summary>
+
+We recommend installing Datamint in a dedicated virtual environment to avoid conflicts with your system packages.
+Create the environment once and activate it whenever you need it:
+
+1. **Create the environment** (one-time setup):
+   ```bash
+   python3 -m venv datamint-env
+   ```
+
+2. **Activate the environment** (run whenever you need it):
+
+   | Platform | Command |
+   |----------|---------|
+   | Linux/macOS | `source datamint-env/bin/activate` |
+   | Windows CMD | `datamint-env\Scripts\activate.bat` |
+   | Windows PowerShell | `datamint-env\Scripts\Activate.ps1` |
+
+3. **Install the package**:
+   ```bash
+   pip install git+https://github.com/SonanceAI/datamint-python-api
+   ```
+
+</details>
+
+## Setup API key
+
+To use the Datamint API, you need to set up your API key (ask your administrator if you don't have one). Use one of the following methods:
+
+### Method 1: Command-line tool (recommended)
+
+Run ``datamint-config`` in the terminal and follow the instructions. See [command_line_tools](https://sonanceai.github.io/datamint-python-api/command_line_tools.html) for more details.
+
+### Method 2: Environment variable
+
+Specify the API key as an environment variable.
+
+**Bash:**
+```bash
+export DATAMINT_API_KEY="my_api_key"
+# run your commands (e.g., `datamint-upload`, `python script.py`)
+```
+
+**Python:**
+```python
+import os
+os.environ["DATAMINT_API_KEY"] = "my_api_key"
+```
+
+## 📚 Documentation
+
+| Resource | Description |
+|----------|-------------|
+| [🚀 Getting Started](docs/source/getting_started.rst) | Step-by-step setup and basic usage |
+| [📖 API Reference](docs/source/client_api.rst) | Complete API documentation |
+| [🔥 PyTorch Integration](docs/source/pytorch_integration.rst) | ML workflow integration |
+| [💡 Examples](examples/) | Practical usage examples |
+
+## 🔗 Key Components
+
+### Dataset Management
+
+```python
+from datamint import Dataset
+
+# Load dataset with annotations
+dataset = Dataset(
+    project_name="medical-segmentation",
+)
+
+# Access data
+for sample in dataset:
+    image = sample['image']          # torch.Tensor
+    mask = sample['segmentation']    # torch.Tensor (if available)
+    metadata = sample['metainfo']    # dict
+```
+
+
+### PyTorch Lightning Integration
+
+```python
+import lightning as L
+from datamint.lightning import DatamintDataModule
+from datamint.mlflow.lightning.callbacks import MLFlowModelCheckpoint
+
+# Data module
+datamodule = DatamintDataModule(
+    project_name="your-project",
+    batch_size=16,
+    train_split=0.8
+)
+
+# ML tracking callback
+checkpoint_callback = MLFlowModelCheckpoint(
+    monitor="val_loss",
+    save_top_k=1,
+    register_model_name="best-model"
+)
+
+# Trainer with MLflow logging
+trainer = L.Trainer(
+    max_epochs=100,
+    callbacks=[checkpoint_callback],
+    logger=L.pytorch.loggers.MLFlowLogger(
+        experiment_name="medical-segmentation"
+    )
+)
+```
+
+
+### Annotation Management
+
+
+```python
+from datamint import Api
+
+api = Api()  # assumes an API key configured via datamint-config or DATAMINT_API_KEY; otherwise pass api_key=...
+
+# Upload segmentation masks
+api.upload_segmentations(
+    resource_id="resource-123",
+    file_path="segmentation.nii.gz",
+    name="liver_segmentation",
+    frame_index=0
+)
+
+# Add categorical annotations
+api.add_image_category_annotation(
+    resource_id="resource-123",
+    identifier="diagnosis",
+    value="positive"
+)
+
+# Add geometric annotations
+api.add_line_annotation(
+    point1=(10, 20),
+    point2=(50, 80),
+    resource_id="resource-123",
+    identifier="measurement",
+    frame_index=5
+)
+```
+
+
+## 🛠️ Command Line Tools
+
+### Upload Resources
+
+**Upload DICOM files with anonymization:**
+```bash
+datamint-upload \
+    --path /path/to/dicoms \
+    --recursive \
+    --channel "training-data" \
+    --anonymize \
+    --publish
+```
+
+**Upload with segmentation masks:**
+```bash
+datamint-upload \
+    --path /path/to/images \
+    --segmentation_path /path/to/masks \
+    --segmentation_names segmentation_config.yaml
+```
+
+### Configuration Management
+
+```bash
+# Interactive setup
+datamint-config
+
+# Set API key
+datamint-config --api-key "your-key"
+```
+
+## 🔍 Examples
+
+### Medical Image Segmentation Pipeline
+
+```python
+import torch
+import lightning as L
+from datamint.lightning import DatamintDataModule
+from datamint.mlflow.lightning.callbacks import MLFlowModelCheckpoint
+
+class SegmentationModel(L.LightningModule):
+    def __init__(self):
+        super().__init__()
+        # Model definition...
+
+    def training_step(self, batch, batch_idx):
+        # Training logic...
+        pass
+
+# Setup data
+datamodule = DatamintDataModule(
+    project_name="liver-segmentation",
+    batch_size=8,
+    train_split=0.8
+)
+
+# Setup model with MLflow tracking
+model = SegmentationModel()
+checkpoint_cb = MLFlowModelCheckpoint(
+    monitor="val_dice",
+    mode="max",
+    register_model_name="liver-segmentation-model"
+)
+
+# Train
+trainer = L.Trainer(
+    max_epochs=50,
+    callbacks=[checkpoint_cb],
+    logger=L.pytorch.loggers.MLFlowLogger()
+)
+trainer.fit(model, datamodule)
+```
+
+## 🆘 Support
+
+- [Full Documentation](https://datamint-python-api.readthedocs.io/)
+- [GitHub Issues](https://github.com/SonanceAI/datamint-python-api/issues)
+
+
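The README's Dataset example above iterates samples one at a time. A minimal sketch of batching it with a standard PyTorch `DataLoader`, under the assumption (not stated in the metadata) that `datamint.Dataset` follows the usual PyTorch dataset protocol and yields dict samples as shown:

```python
# Hedged sketch: assumes datamint.Dataset can be consumed by a PyTorch
# DataLoader and that each sample is a dict like {'image': tensor, ...}.
from torch.utils.data import DataLoader
from datamint import Dataset

dataset = Dataset(project_name="medical-segmentation")

# Keep each batch as a plain list of sample dicts to avoid assumptions about
# how the 'metainfo' dicts would collate; swap in a custom collate_fn if your
# samples are uniform tensors.
loader = DataLoader(dataset, batch_size=4, collate_fn=lambda samples: samples)

for batch in loader:
    images = [s["image"] for s in batch]  # list of torch.Tensor
    break
```

The `DatamintDataModule` shown in the Lightning examples is presumably the higher-level way to obtain train/val loaders; this sketch only covers plain-PyTorch use.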
@@ -1,7 +1,7 @@
 datamint/__init__.py,sha256=ucsnxrYClh6pdy7psRJXWam_9rjAQB4NXzvy7xLovmo,824
 datamint/api/__init__.py,sha256=7QYkmDBXbKh8-zchV7k6Lpolaw6h-IK6ezfXROIWh2A,43
-datamint/api/base_api.py,sha256=
-datamint/api/client.py,sha256=
+datamint/api/base_api.py,sha256=q-TQKFbupUPCaOTihoFEljxCyTF76ojsDoGkhtIJRPc,20194
+datamint/api/client.py,sha256=x-hkoDoKBWqwVCLbFZjHEWEwfqxo8bwVUSIWB4aebYQ,4018
 datamint/api/dto/__init__.py,sha256=fUi901Zs-q5XHyWwZ4dMi2fEO8-CUEVEdYbpd17lahc,416
 datamint/api/endpoints/__init__.py,sha256=wi4liAb5-wOohwyzKUD6TxHGeZmUPaZerFUGa2IUju4,529
 datamint/api/endpoints/annotations_api.py,sha256=B21oGgNXO7xoir3ynNvhnbo1JyRvWZ9sbsc7VVjghhM,49713
@@ -15,9 +15,9 @@ datamint/api/endpoints/users_api.py,sha256=pnkuTZ1B9Y0FtwwvXO8J64e02RSkRxnBmTl9U
 datamint/api/entity_base_api.py,sha256=-8SIt4M8P9G2b8SQznuWpFuFE8zEQjQxkRkw0s_w0Y4,11692
 datamint/apihandler/annotation_api_handler.py,sha256=W3vV4z3BqX1OQe1r7zr8dI-IVu4zUDxED4QttdiWV-E,57098
 datamint/apihandler/api_handler.py,sha256=mL0gMaWePYa7zwkw92E-VMK2WjpcPt7au0KqnmsWSYw,439
-datamint/apihandler/base_api_handler.py,sha256=
+datamint/apihandler/base_api_handler.py,sha256=VrQvCkCmWPHLVcK-zl764Ay7toiEJ2W9c2hymz63pro,11668
 datamint/apihandler/dto/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
-datamint/apihandler/dto/annotation_dto.py,sha256=
+datamint/apihandler/dto/annotation_dto.py,sha256=4tpcet0-ShCMhrflk9H3CEX5tnhWZqw4Doft1B2E260,7341
 datamint/apihandler/exp_api_handler.py,sha256=hFUgUgBc5rL7odK7gTW3MnrvMY1pVfJUpUdzRNobMQE,6226
 datamint/apihandler/root_api_handler.py,sha256=jBof_XPTeq4o41CW-l-I5GHQKVa76kaX75RovS_qAM4,63384
 datamint/client_cmd_tools/__init__.py,sha256=47DEQpj8HBSa-_TImW-5JCeuQeRkm5NMpJWZG3hSuFU,0
@@ -26,7 +26,7 @@ datamint/client_cmd_tools/datamint_upload.py,sha256=zGNEylfS2P5XUyzBz44ruHjoyAoK
 datamint/configs.py,sha256=ArVD5QxuohLcM6NB0dMxcEfrD1-x5lPRDLcBdY8jhMU,1625
 datamint/dataset/__init__.py,sha256=4PlUKSvVhdfQvvuq8jQXrkdqnot-iTTizM3aM1vgSwg,47
 datamint/dataset/annotation.py,sha256=qN1IMjdfLD2ceQ6va3l76jOXA8Vb_c-eBk1oWQu6hW0,7994
-datamint/dataset/base_dataset.py,sha256=
+datamint/dataset/base_dataset.py,sha256=3sfYVxvxCaLfuTl5XQs48avuvhfmBUDxiYL_FYx4cS0,49941
 datamint/dataset/dataset.py,sha256=c0887PcI6fEWG3FjM3gz_EYb9maH8v4ormZhm7vf4gE,28929
 datamint/entities/__init__.py,sha256=ePkFeNATE5tJ9Lulsn2VTGWBJGwPtoRT1zsfbg1bkPw,504
 datamint/entities/annotation.py,sha256=4qsm3yO8bTgu5SNxY91P0LGXx3d7mxF1OGWpMNir1CA,8956
@@ -43,12 +43,25 @@ datamint/exceptions.py,sha256=Or-NNj8pgChzAZNaWtkX1WyHJ2q5GziyvHdFLYymvX0,1661
 datamint/experiment/__init__.py,sha256=5qQOMzoG17DEd1YnTF-vS0qiM-DGdbNh42EUo91CRhQ,34
 datamint/experiment/_patcher.py,sha256=ZgbezoevAYhJsbiJTvWPALGTcUiMT371xddcTllt3H4,23296
 datamint/experiment/experiment.py,sha256=aHK9dRFdQTi569xgUg1KqlCZLHZpDmSH3g3ndPIZvXw,44546
+datamint/lightning/__init__.py,sha256=8Od8e-nWi4pNn1m5CQsIV8ElitVEUoHxEBiT2GVfJUc,50
+datamint/lightning/datamintdatamodule.py,sha256=yTch-H0QCAQh_lwls8sCfofKqP0nPssy0CpSsO8QeEI,4038
 datamint/logging.yaml,sha256=tOMxtc2UmwlIMTK6ljtnBwTco1PNrPeq3mx2iMuSbiw,482
+datamint/mlflow/__init__.py,sha256=UCY6-e7WX65vj7_wPIVtDcDQTby-9wLsUOaXFvriLMs,1718
+datamint/mlflow/artifact/__init__.py,sha256=XwZ4u63VoB9EdAx3mTwoOp-2PmMfa7KPBVPV0kZFUh0,64
+datamint/mlflow/artifact/datamint_artifacts_repo.py,sha256=e97U4SF1bq8j_OTiZofuY92eIdgBuw9J8cyhW72ez6A,353
+datamint/mlflow/env_utils.py,sha256=YOomkkr2HxFk6nIOaICamvbcUYEGqgZlQ5s02N4k2ok,3499
+datamint/mlflow/env_vars.py,sha256=H3iteY7o_Jj3_EU9vREBE4s62SZMmiO6-ubC5_xG2kU,144
+datamint/mlflow/lightning/callbacks/__init__.py,sha256=4Y-54OFIKBiwD_AwbWr2JRNyWBYsz9vnagfZfKgdKA0,50
+datamint/mlflow/lightning/callbacks/modelcheckpoint.py,sha256=WadefDxvNW9CF0IMfyCgJ2gV4UfDX-QVe7ubnswKkEw,15225
+datamint/mlflow/models/__init__.py,sha256=z7WbQZiK05MIjNmmjEs-s0zhIiLJ_uMynz60yFIf7f4,3629
+datamint/mlflow/tracking/datamint_store.py,sha256=ULJhdPQ5fvy3-vM5SZv0Hnt95FzSq9-oyiMscz17AqY,1907
+datamint/mlflow/tracking/default_experiment.py,sha256=gzrIZng0DpI2GWa8X30JVAq2fteq4w8cg4nz3UxYCvc,934
+datamint/mlflow/tracking/fluent.py,sha256=xO1UJYHX0_vJINYzRV01PL6gimPhU_F6OvSV3ffE7as,2696
 datamint/types.py,sha256=2OaY5QJvQIJKxyMNJYzxBksKCa9ZS2gb_ayJrByvu2Y,410
 datamint/utils/logging_utils.py,sha256=9pRoaPrWu2jOdDCiAoUsjEdP5ZwaealWL3hjUqFvx9g,4022
 datamint/utils/torchmetrics.py,sha256=lwU0nOtsSWfebyp7dvjlAggaqXtj5ohSEUXOg3L0hJE,2837
 datamint/utils/visualization.py,sha256=yaUVAOHar59VrGUjpAWv5eVvQSfztFG0eP9p5Vt3l-M,4470
-datamint-2.
-datamint-2.
-datamint-2.
-datamint-2.
+datamint-2.4.0.dist-info/METADATA,sha256=g4WkKWDq3mBETvQibcoISGXTtQCRdebfrw4leyurg3E,8974
+datamint-2.4.0.dist-info/WHEEL,sha256=zp0Cn7JsFoX2ATtOhtaFYIiE2rmFAD4OcMhtUki8W3U,88
+datamint-2.4.0.dist-info/entry_points.txt,sha256=wTr9DXteCfAFqLUstN1Qx_SqC_rFLrAUIcDx9q1eXZw,845
+datamint-2.4.0.dist-info/RECORD,,
@@ -0,0 +1,18 @@
+[console_scripts]
+datamint-config=datamint.client_cmd_tools.datamint_config:main
+datamint-upload=datamint.client_cmd_tools.datamint_upload:main
+
+[mlflow.artifact_repository]
+datamint=datamint.mlflow.artifact.datamint_artifacts_repo:DatamintArtifactsRepository
+http=datamint.mlflow.artifact.datamint_artifacts_repo:DatamintArtifactsRepository
+https=datamint.mlflow.artifact.datamint_artifacts_repo:DatamintArtifactsRepository
+mlflow-artifacts=datamint.mlflow.artifact.datamint_artifacts_repo:DatamintArtifactsRepository
+
+[mlflow.default_experiment_provider]
+datamint=datamint.mlflow.tracking.default_experiment:DatamintExperimentProvider
+
+[mlflow.tracking_store]
+datamint=datamint.mlflow.tracking.datamint_store:DatamintStore
+http=datamint.mlflow.tracking.datamint_store:DatamintStore
+https=datamint.mlflow.tracking.datamint_store:DatamintStore
+
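The entry points above register Datamint-backed implementations of MLflow's tracking store, artifact repository, and default experiment provider, which is how the SDK hooks into MLflow's plugin system. A minimal sketch, assuming only the standard MLflow fluent API and that this wheel is installed (the tracking URI is a placeholder, not a documented endpoint):

```python
import mlflow

# Placeholder URI: with the plugin entry points above installed, MLflow
# resolves the store/artifact classes for the URI scheme via the Datamint plugin.
mlflow.set_tracking_uri("https://your-datamint-server")

with mlflow.start_run(run_name="demo-run"):
    mlflow.log_param("learning_rate", 1e-3)
    mlflow.log_metric("val_loss", 0.42)
```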
@@ -1,125 +0,0 @@
-Metadata-Version: 2.4
-Name: datamint
-Version: 2.3.5
-Summary: A library for interacting with the Datamint API, designed for efficient data management, processing and Deep Learning workflows.
-Requires-Python: >=3.10
-Classifier: Programming Language :: Python :: 3
-Classifier: Programming Language :: Python :: 3.10
-Classifier: Programming Language :: Python :: 3.11
-Classifier: Programming Language :: Python :: 3.12
-Classifier: Programming Language :: Python :: 3.13
-Classifier: Programming Language :: Python :: 3.14
-Provides-Extra: dev
-Provides-Extra: docs
-Requires-Dist: Deprecated (>=1.2.0)
-Requires-Dist: aiohttp (>=3.0.0,<4.0.0)
-Requires-Dist: aioresponses (>=0.7.8,<0.8.0) ; extra == "dev"
-Requires-Dist: albumentations (>=2.0.0)
-Requires-Dist: backports-strenum ; python_version < "3.11"
-Requires-Dist: datamintapi (==0.0.*)
-Requires-Dist: httpx
-Requires-Dist: humanize (>=4.0.0,<5.0.0)
-Requires-Dist: lazy-loader (>=0.3.0)
-Requires-Dist: lightning
-Requires-Dist: matplotlib
-Requires-Dist: medimgkit (>=0.7.3)
-Requires-Dist: nest-asyncio (>=1.0.0,<2.0.0)
-Requires-Dist: nibabel (>=4.0.0)
-Requires-Dist: numpy
-Requires-Dist: opencv-python (>=4.0.0)
-Requires-Dist: pandas (>=2.0.0)
-Requires-Dist: platformdirs (>=4.0.0,<5.0.0)
-Requires-Dist: pydantic (>=2.6.4)
-Requires-Dist: pydicom (>=3.0.0,<4.0.0)
-Requires-Dist: pylibjpeg (>=2.0.0,<3.0.0)
-Requires-Dist: pylibjpeg-libjpeg (>=2.0.0,<3.0.0)
-Requires-Dist: pytest (>=7.0.0,<8.0.0) ; extra == "dev"
-Requires-Dist: pytest-cov (>=4.0.0,<5.0.0) ; extra == "dev"
-Requires-Dist: pyyaml (>=5.0.0)
-Requires-Dist: requests (>=2.0.0,<3.0.0)
-Requires-Dist: responses (>=0.20.0,<0.21.0) ; extra == "dev"
-Requires-Dist: rich (>=10.0.0)
-Requires-Dist: setuptools (>=57.0) ; extra == "docs"
-Requires-Dist: sphinx (>=5.0) ; extra == "docs"
-Requires-Dist: sphinx-tabs (>=3.0.0) ; extra == "docs"
-Requires-Dist: sphinx_rtd_theme (>=2.0.0) ; extra == "docs"
-Requires-Dist: torch (>=1.2.0,!=2.3.0)
-Requires-Dist: torchvision (>=0.18.0)
-Requires-Dist: tqdm (>=4.0.0,<5.0.0)
-Requires-Dist: typing_extensions (>=4.0.0)
-Description-Content-Type: text/markdown
-
-
-# Datamint python API
-
-
-
-See the full documentation at https://sonanceai.github.io/datamint-python-api/
-
-## Installation
-
-Datamint requires Python 3.10+.
-You can install/update Datamint and its dependencies using pip
-
-```bash
-pip install -U datamint
-```
-
-We recommend that you install Datamint in a dedicated virtual environment, to avoid conflicting with your system packages.
-Create the enviroment once with `python3 -m venv datamint-env` and then activate it whenever you need it with:
-- `source datamint-env/bin/activate` (Linux/MAC)
-- `datamint-env\Scripts\activate.bat` (Windows CMD)
-- `datamint-env\Scripts\Activate.ps1` (Windows PowerShell)
-
-
-## Setup API key
-
-To use the Datamint API, you need to setup your API key (ask your administrator if you don't have one). Use one of the following methods to setup your API key:
-
-### Method 1: Command-line tool (recommended)
-
-Run ``datamint-config`` in the terminal and follow the instructions. See [command_line_tools](https://sonanceai.github.io/datamint-python-api/command_line_tools.html) for more details.
-
-### Method 2: Environment variable
-
-Specify the API key as an environment variable.
-
-**Bash:**
-```bash
-export DATAMINT_API_KEY="my_api_key"
-# run your commands (e.g., `datamint-upload`, `python script.py`)
-```
-
-**Python:**
-```python
-import os
-os.environ["DATAMINT_API_KEY"] = "my_api_key"
-```
-
-### Method 3: Api constructor
-
-Specify API key in the Api constructor:
-
-```python
-from datamint import Api
-api = Api(api_key='my_api_key')
-```
-
-## Tutorials
-
-
-You can find example notebooks in the `notebooks` folder:
-
-- [Uploading your resources](notebooks/upload_data.ipynb)
-- [Uploading model segmentations](notebooks/upload_model_segmentations.ipynb)
-
-and example scripts in [examples](examples) folder:
-
-- [API usage examples](examples/api_usage.ipynb)
-- [Project and entity usage](examples/project_entity_usage.ipynb)
-- [Channels example](examples/channels_example.ipynb)
-
-## Full documentation
-
-See all functionalities in the full documentation at https://sonanceai.github.io/datamint-python-api/
-
{datamint-2.3.5.dist-info → datamint-2.4.0.dist-info}/WHEEL: file without changes