thds.adls 4.4.20251028175225__py3-none-any.whl → 4.4.20251106191543__py3-none-any.whl



@@ -0,0 +1,79 @@
+ Metadata-Version: 2.4
+ Name: thds.adls
+ Version: 4.4.20251106191543
+ Summary: ADLS tools
+ Author-email: Trilliant Health <info@trillianthealth.com>
+ License: MIT
+ Project-URL: Repository, https://github.com/TrilliantHealth/trilliant-data-science
+ Requires-Python: >=3.9
+ Description-Content-Type: text/markdown
+ Requires-Dist: aiohttp>=3.8.1
+ Requires-Dist: aiostream>=0.4.5
+ Requires-Dist: azure-identity>=1.9
+ Requires-Dist: azure-storage-file-datalake>=12.6
+ Requires-Dist: blake3
+ Requires-Dist: filelock>=3.0
+ Requires-Dist: xxhash
+ Requires-Dist: thds-core
+
+ # thds.adls
+
+ A high-performance Azure Data Lake Storage (ADLS Gen2) client for the THDS monorepo. It wraps the Azure
+ SDK with hash-aware caching, azcopy acceleration, and shared client/credential plumbing so applications
+ can transfer large blob datasets quickly and reliably.
+
+ ## Highlights
+
+ - **Environment-aware paths first:** Almost every consumer starts by importing `fqn`, `AdlsFqn`, and
+ `defaults.env_root()` to build storage-account/container URIs that follow the current THDS environment.
+ - **Cache-backed reads:** `download_to_cache` is the standard entry point for pulling blobs down with a
+ verified hash so local workflows, tests, and pipelines can operate on read-only copies.
+ - **Bulk filesystem helpers:** `ADLSFileSystem` powers scripts and jobs that need to walk directories,
+ fetch batches of files, or mirror hive tables without re-implementing Azure SDK plumbing.
+ - **Spark/Databricks bridges:** `abfss` and `uri` conversions keep analytics code agnostic to whether it
+ needs an `adls://`, `abfss://`, `https://`, or `dbfs://` view of the same path.
+ - **Composable utilities:** Higher-level modules (cache, upload, copy, list) layer on top of those
+ imports so teams can opt into more advanced behavior without leaving the public API surface.
+
+ ## Key Modules
+
+ | Component | Typical usage in the monorepo |
+ | ------------------------------------- | ---------------------------------------------------------------------------------------------------------- |
+ | `fqn` | Parse, validate, and join ADLS paths; used when materializing model datasets and configuring pipelines. |
+ | `AdlsFqn` | Strongly typed value passed between tasks and tests to represent a single blob or directory. |
+ | `defaults` / `named_roots` | Resolve environment-specific storage roots (`defaults.env_root()`, `named_roots.require(...)`). |
+ | `download_to_cache` (`cached` module) | Bring a blob down to the shared read-only cache before analytics, feature builds, or test fixtures run. |
+ | `ADLSFileSystem` (`impl` module) | Fetch or list entire directory trees and integrate with caching inside scripts and notebooks. |
+ | `abfss` | Translate `AdlsFqn` objects into `abfss://` URIs for Spark/Databricks jobs. |
+ | `uri` | Normalize `adls://`, `abfss://`, `https://`, and `dbfs://` strings into `AdlsFqn` values (and vice versa). |
+ | `global_client` / `shared_credential` | Shared, fork-safe Azure clients and credentials backing the public helpers above. |
+
+ ## Example Usage
+
+ 1. Use the caching helpers and Source integration:
+
+ ```python
+ from thds.adls import cached, upload, source
+
+ cache_path = cached.download_to_cache("adls://acct/container/path/to/file")
+ src = upload("adls://acct/container/path/out.parquet", cache_path)
+ verified = source.get_with_hash(src.uri)
+ ```
+
+ 2. For CLI usage, run (from repo root):
+
+ ```bash
+ uv run python -m thds.adls.tools.download adls://acct/container/path/file
+ ```
+
+ ## Operational Notes
+
+ - **Hash metadata:** Uploads attach `hash_xxh3_128_b64` automatically when the bytes are known. Download
+ completion back-fills missing hashes when permissions allow.
+ - **Locks and concurrency:** Large transfers acquire per-path file locks to keep azcopy instances
+ cooperative. Global HTTP connection pools default to 100 but are configurable via `thds.core.config`.
+ - **Error handling:** `BlobNotFoundError` and other ADLS-specific exceptions translate into custom error
+ types to simplify retries and diagnostics.
+ - **Extensibility:** Additional hash algorithms can be registered by importing dependent packages (e.g.,
+ `blake3`). Named roots can be populated dynamically via environment-specific modules
+ (`thds.adls._thds_defaults` hook).
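Editor's note: the new README's table says the `abfss` and `uri` modules translate between `adls://` and `abfss://` views of the same blob. The `abfss://<container>@<account>.dfs.core.windows.net/<path>` shape is the standard ADLS Gen2 URI form; the sketch below illustrates that mapping in plain Python and is not the package's actual implementation (the function name is hypothetical).

```python
# Hypothetical sketch: translate adls://<account>/<container>/<path> into the
# equivalent abfss:// URI used by Spark/Databricks. Not thds.adls code.
from urllib.parse import urlparse

def adls_to_abfss(adls_uri: str) -> str:
    parsed = urlparse(adls_uri)
    if parsed.scheme != "adls":
        raise ValueError(f"expected an adls:// URI, got {adls_uri!r}")
    account = parsed.netloc
    # The first path segment names the container (filesystem); the rest is the blob path.
    container, _, path = parsed.path.lstrip("/").partition("/")
    return f"abfss://{container}@{account}.dfs.core.windows.net/{path}"

print(adls_to_abfss("adls://acct/container/path/to/file"))
# abfss://container@acct.dfs.core.windows.net/path/to/file
```

The reverse direction works the same way: parse the container out of the userinfo position and the account out of the host.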
@@ -39,8 +39,8 @@ thds/adls/tools/download.py,sha256=CW2cWbCRdUqisVVVoqqvqk5Ved7pPGTkwnZj3uV0jy4,1
  thds/adls/tools/ls.py,sha256=OgEaIfTK359twlZIj-A0AW_nv81Z6zi0b9Tw6OJJfWA,1083
  thds/adls/tools/ls_fast.py,sha256=Nowc-efAL_Y4ybPwZzKIeh7KGIjfecRzdWvJZcBzq_8,585
  thds/adls/tools/upload.py,sha256=5WyWkpuVp2PETZ3O3ODlq8LXszSHU73ZMnIDZXPJdC8,442
- thds_adls-4.4.20251028175225.dist-info/METADATA,sha256=i2-9fxSOeCZX38RhXZW7ge5t_oDHOf_UNF8WagFmNpc,587
- thds_adls-4.4.20251028175225.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
- thds_adls-4.4.20251028175225.dist-info/entry_points.txt,sha256=rtVF0A2MMTYUsBScF6b3AlOuk2Vm02QK7Tc2bDcDpk0,200
- thds_adls-4.4.20251028175225.dist-info/top_level.txt,sha256=LTZaE5SkWJwv9bwOlMbIhiS-JWQEEIcjVYnJrt-CriY,5
- thds_adls-4.4.20251028175225.dist-info/RECORD,,
+ thds_adls-4.4.20251106191543.dist-info/METADATA,sha256=WsSi-KMq1I3WAkNJ4pq9H8kYybXhAIeIljIKvy06eu8,4586
+ thds_adls-4.4.20251106191543.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+ thds_adls-4.4.20251106191543.dist-info/entry_points.txt,sha256=rtVF0A2MMTYUsBScF6b3AlOuk2Vm02QK7Tc2bDcDpk0,200
+ thds_adls-4.4.20251106191543.dist-info/top_level.txt,sha256=LTZaE5SkWJwv9bwOlMbIhiS-JWQEEIcjVYnJrt-CriY,5
+ thds_adls-4.4.20251106191543.dist-info/RECORD,,
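Editor's note: the README's "Locks and concurrency" note describes per-path locks that keep concurrent transfers cooperative. The real package uses file locks (it depends on `filelock`) so separate processes coordinate; the process-local sketch below only illustrates the keyed-by-path idea with `threading` and is not the package's mechanism.

```python
# Hypothetical sketch: a registry of per-path locks so that at most one
# transfer touches a given destination path at a time (in-process only).
import threading
from collections import defaultdict

_path_locks: dict = defaultdict(threading.Lock)
_registry_lock = threading.Lock()

def lock_for(path: str) -> threading.Lock:
    # The defaultdict mutation itself must be serialized so two threads
    # asking for the same path always receive the same lock object.
    with _registry_lock:
        return _path_locks[path]

results = []

def transfer(path: str, n: int) -> None:
    with lock_for(path):  # transfers to the same path run one at a time
        results.append((path, n))

threads = [threading.Thread(target=transfer, args=("adls://acct/c/f", i)) for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(len(results))  # 4
```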
@@ -1,21 +0,0 @@
- Metadata-Version: 2.4
- Name: thds.adls
- Version: 4.4.20251028175225
- Summary: ADLS tools
- Author-email: Trilliant Health <info@trillianthealth.com>
- License: MIT
- Project-URL: Repository, https://github.com/TrilliantHealth/trilliant-data-science
- Requires-Python: >=3.9
- Description-Content-Type: text/markdown
- Requires-Dist: aiohttp>=3.8.1
- Requires-Dist: aiostream>=0.4.5
- Requires-Dist: azure-identity>=1.9
- Requires-Dist: azure-storage-file-datalake>=12.6
- Requires-Dist: blake3
- Requires-Dist: filelock>=3.0
- Requires-Dist: xxhash
- Requires-Dist: thds-core
-
- # adls Library
-
- A port of `core.adls`.
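Editor's note: the cache-backed read pattern the new README highlights (`download_to_cache` plus the `hash_xxh3_128_b64` metadata) amounts to "materialize a blob once, keyed and verified by its content hash." The sketch below shows that pattern with stdlib SHA-256 standing in for xxh3_128; the function names and signature here are illustrative, not the package's API.

```python
# Hypothetical sketch of a hash-verified download cache. Not thds.adls code:
# SHA-256 stands in for the xxh3_128 digest the real package records.
import base64
import hashlib
import tempfile
from pathlib import Path

def content_hash_b64(data: bytes) -> str:
    return base64.b64encode(hashlib.sha256(data).digest()).decode("ascii")

def download_to_cache(fetch, expected_hash_b64: str, cache_dir: Path) -> Path:
    """Return a local file whose bytes verifiably match expected_hash_b64."""
    cache_dir.mkdir(parents=True, exist_ok=True)
    # Key the cache entry by the hash ("/" is not filename-safe in base64).
    cached = cache_dir / expected_hash_b64.replace("/", "_")
    if cached.exists() and content_hash_b64(cached.read_bytes()) == expected_hash_b64:
        return cached  # verified cache hit; no network round trip needed
    data = fetch()  # the real package downloads via the Azure SDK or azcopy
    if content_hash_b64(data) != expected_hash_b64:
        raise ValueError("downloaded bytes do not match the expected hash")
    cached.write_bytes(data)
    return cached

blob = b"example bytes"
cache = Path(tempfile.mkdtemp())
path = download_to_cache(lambda: blob, content_hash_b64(blob), cache)
print(path.read_bytes() == blob)  # True
```

A second call with the same hash returns the cached copy without invoking `fetch` again, which is what makes the read-only cache safe to share across workflows.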