sawnergy 1.0.3__py3-none-any.whl → 1.0.9__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- sawnergy/__init__.py +3 -1
- sawnergy/embedding/SGNS_pml.py +324 -51
- sawnergy/embedding/SGNS_torch.py +282 -39
- sawnergy/embedding/__init__.py +26 -1
- sawnergy/embedding/embedder.py +426 -203
- sawnergy/embedding/visualizer.py +251 -0
- sawnergy/logging_util.py +1 -1
- sawnergy/rin/rin_builder.py +4 -4
- sawnergy/visual/visualizer.py +6 -6
- sawnergy/visual/visualizer_util.py +3 -0
- sawnergy/walks/walker.py +43 -22
- {sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/METADATA +91 -57
- sawnergy-1.0.9.dist-info/RECORD +23 -0
- sawnergy-1.0.3.dist-info/RECORD +0 -22
- {sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/WHEEL +0 -0
- {sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/licenses/LICENSE +0 -0
- {sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/licenses/NOTICE +0 -0
- {sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/top_level.txt +0 -0
{sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/METADATA

@@ -1,6 +1,6 @@
 Metadata-Version: 2.4
 Name: sawnergy
-Version: 1.0.3
+Version: 1.0.9
 Summary: Toolkit for transforming molecular dynamics (MD) trajectories into rich graph representations
 Home-page: https://github.com/Yehor-Mishchyriak/SAWNERGY
 Author: Yehor Mishchyriak
@@ -19,7 +19,7 @@ Requires-Dist: zarr>=3.0
 Requires-Dist: threadpoolctl>=3.0
 Requires-Dist: matplotlib>=3.7
 Requires-Dist: psutil>=5.9
-Requires-Dist: ym-pure-ml
+Requires-Dist: ym-pure-ml==1.2.4
 Dynamic: author
 Dynamic: classifier
 Dynamic: description
@@ -39,19 +39,68 @@ Dynamic: summary
 
 
 A toolkit for transforming molecular dynamics (MD) trajectories into rich graph representations, sampling
-random and self-avoiding walks, learning node embeddings, and
+random and self-avoiding walks, learning node embeddings, and visualizing residue interaction networks (RINs). SAWNERGY
 keeps the full workflow — from `cpptraj` output to skip-gram embeddings (node2vec approach) — inside Python, backed by efficient Zarr-based archives and optional GPU acceleration.
 
 ---
 
+## Installation
+
+```bash
+pip install sawnergy
+```
+
+> **Optional:** For GPU training, install PyTorch separately (e.g., `pip install torch`).
+> **Note:** RIN building requires `cpptraj` (AmberTools). Ensure it is discoverable via `$PATH` or the `CPPTRAJ`
+> environment variable. The easiest route is to install AmberTools via Conda and activate that environment; SAWNERGY will then locate the `cpptraj` executable on its own.
+
+---
+
+# UPDATES:
+
+## v1.0.9 — What’s new:
+- **`SGNS_Torch` is no longer deprecated.**
+  - The root cause was weight initialization; it has been fixed.
+- **`SG_Torch` and `SG_PureML` no longer use biases.**
+  - Affine/Linear layers no longer translate embeddings away from the origin.
+- **Warm starts for frame embeddings.**
+  - Each frame initializes from the preceding frame’s representation. This speeds up convergence and keeps the basis approximately consistent.
+- **Alignment function for comparing embeddings from different latent spaces.**
+  - Based on the Orthogonal Procrustes solution: it finds the best-fit orthogonal map between two embedding sets. Orthogonality preserves angles and relative distances, enabling direct comparison across bases (sketched after this list).
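For readers who want the idea behind the alignment function described above, here is a minimal NumPy sketch of the Orthogonal Procrustes computation. It is illustrative only; SAWNERGY ships its own helper, and the function name and signature below are assumptions.

```python
# Illustrative sketch of Orthogonal Procrustes alignment (not SAWNERGY's helper).
import numpy as np

def procrustes_align(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    """Map embeddings A (N, D) onto B's basis via the best-fit orthogonal matrix.

    Solves min_R ||A @ R - B||_F subject to R.T @ R = I.
    """
    M = A.T @ B                      # (D, D) cross-covariance
    U, _, Vt = np.linalg.svd(M)      # M = U @ diag(S) @ Vt
    return A @ (U @ Vt)              # apply the optimal rotation/reflection

# e.g. align frame t's embeddings to frame t-1's before comparing residues:
# aligned_t = procrustes_align(emb_t, emb_prev)
```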
+
+## v1.0.8 — What’s new:
+- **Temporary deprecation of `SGNS_Torch`**
+  - `sawnergy.embedding.SGNS_Torch` currently produces noisy embeddings in practice. The issue likely stems from **weight initialization**, although the root cause has not yet been conclusively determined.
+  - **Action:** The class and its `__init__` docstring now carry a deprecation notice. Constructing the class emits a **`DeprecationWarning`** and logs a **warning**.
+  - **Use instead:** Prefer **`SG_Torch`** (plain Skip-Gram with full softmax) or the PureML backends **`SGNS_PureML`** / **`SG_PureML`**.
+  - **Compatibility:** No breaking API changes; imports remain stable. PureML backends are unaffected.
+- **Embedding visualizer update**
+  - Embeddings can now be L2-normalized before display (sketched after this list).
+- **Small improvements in the embedding module**
+  - Improved API with sensible defaults to ease out-of-the-box usage.
+  - Small internal model tweaks.
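The L2 normalization mentioned in the visualizer update is just row-wise unit scaling; a minimal NumPy sketch (not the library's own code) looks like this.

```python
# Row-wise L2 normalization applied before display (illustrative only).
import numpy as np

def l2_normalize_rows(E: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    norms = np.linalg.norm(E, axis=1, keepdims=True)  # (N, 1) row norms
    return E / np.maximum(norms, eps)                 # unit-length rows
```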
+
+## v1.0.7 — What’s new:
+- **Added plain Skip-Gram model**
+  - The user can now choose between the negative-sampling objective (two binary classifiers) and a single classifier trained over the full vocabulary (full softmax); the two losses are sketched after this list. For more detail, see: [node2vec](https://arxiv.org/pdf/1607.00653), [word2vec](https://arxiv.org/pdf/1301.3781), and [negative_sampling](https://arxiv.org/pdf/1402.3722).
+- **Stricter default for pruning low interaction energies during RIN construction**
+  - The lowest 85% of interaction energies are now zeroed out, up from the previous 30% default, which leads to more meaningful embeddings.
+- **BUG FIX: Visualizer**
+  - Previously, the visualizer silently drew zero-magnitude edges: they were invisible (full transparency, zero width) but still rendered, which made the displayed image/animation very laggy. This has been fixed, and with the stricter pruning default the displayed interaction networks stay clean and smooth under rotation, dragging, etc.
+- **New Embedding Visualizer (3D)**
+  - A lightweight viewer for per-frame embeddings that projects them with PCA to a **3D** scatter. It supports the same node-coloring semantics, optional node labels, and the same antialiasing/depthshade controls, works in headless setups using the same backend guard, and uses a blocking `show=True` for scripts.
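To make the objective choice concrete, here is a small PyTorch sketch contrasting the two losses. It is illustrative only, not SAWNERGY's internal implementation; tensor names and sizes are made up.

```python
# Illustrative sketch of the two Skip-Gram objectives (not SAWNERGY's code).
import torch
import torch.nn.functional as F

V, D = 1000, 128                      # vocabulary (node count) and embedding dim
emb_in = torch.nn.Embedding(V, D)     # "input" embeddings
emb_out = torch.nn.Embedding(V, D)    # "output" embeddings

center = torch.randint(0, V, (32,))   # batch of center nodes
context = torch.randint(0, V, (32,))  # observed context nodes
neg = torch.randint(0, V, (32, 5))    # 5 negative samples per pair

# Plain Skip-Gram ("sg"): one classifier over the full vocabulary (softmax).
logits = emb_in(center) @ emb_out.weight.T                         # (32, V)
loss_sg = F.cross_entropy(logits, context)

# Skip-Gram with negative sampling ("sgns"): binary classification of pairs.
pos_score = (emb_in(center) * emb_out(context)).sum(-1)            # (32,)
neg_score = torch.bmm(emb_out(neg),
                      emb_in(center).unsqueeze(-1)).squeeze(-1)    # (32, 5)
loss_sgns = -(F.logsigmoid(pos_score).mean() + F.logsigmoid(-neg_score).mean())
```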
+
+---
+
 
 ## Why SAWNERGY?
 
 - **Bridge simulations and graph ML**: Convert raw MD trajectories into residue interaction networks ready for graph
 algorithms and downstream machine learning tasks.
-- **Deterministic, shareable
-- **High-performance data handling**: Heavy arrays live in shared memory during walk sampling to allow parallel processing without
-- **Flexible
-- **Visualization out of the box**: Plot and animate residue networks without leaving Python, using the data produced by RINBuilder
+- **Deterministic, shareable artifacts**: Every stage produces compressed Zarr archives that contain both data and metadata so runs can be reproduced, shared, or inspected later.
+- **High-performance data handling**: Heavy arrays live in shared memory during walk sampling to allow parallel processing without serialization overhead; archives are written in chunked, compressed form for fast read/write.
+- **Flexible objectives & backends**: Train Skip-Gram with **negative sampling** (`objective="sgns"`) or **plain Skip-Gram** (`objective="sg"`), using either **PureML** (default) or **PyTorch**.
+- **Visualization out of the box**: Plot and animate residue networks without leaving Python, using the data produced by RINBuilder.
 
 ---
 
@@ -91,9 +140,9 @@ node indexing, and RNG seeds stay consistent across the toolchain.
 * Wraps the AmberTools `cpptraj` executable to:
   - compute per-frame electrostatic (EMAP) and van der Waals (VMAP) energy matrices at the atomic level,
   - project atom–atom interactions to residue–residue interactions using compositional masks,
-  - prune,
-  - compute per-residue
-* Outputs a compressed Zarr archive with transition matrices, optional
+  - prune, symmetrize, remove self-interactions, and L1-normalize the matrices,
+  - compute per-residue centers of mass (COM) over the same frames.
+* Outputs a compressed Zarr archive with transition matrices, optional pre-normalized energies, COM snapshots, and rich
 metadata (frame range, pruning quantile, molecule ID, etc.).
 * Supports parallel `cpptraj` execution, batch processing, and keeps temporary stores tidy via
 `ArrayStorage.compress_and_cleanup`.
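As a rough illustration of the matrix post-processing listed above (quantile pruning, symmetrization, removal of self-interactions, row-wise L1 normalization), a NumPy sketch might look like the following; the actual order of operations inside `rin_builder` may differ.

```python
# Sketch of the RIN matrix post-processing described above (not the library code).
import numpy as np

def to_transition_matrix(E: np.ndarray, prune_frac: float = 0.85) -> np.ndarray:
    """Energy matrix (N, N) -> row-stochastic transition matrix."""
    A = np.abs(E).astype(np.float32)
    A = 0.5 * (A + A.T)                         # symmetrize
    np.fill_diagonal(A, 0.0)                    # drop self-interactions
    cutoff = np.quantile(A[A > 0], prune_frac)  # prune the weakest interactions
    A[A < cutoff] = 0.0
    row_sums = A.sum(axis=1, keepdims=True)     # L1-normalize each row
    return np.divide(A, row_sums, out=np.zeros_like(A), where=row_sums > 0)
```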
@@ -103,7 +152,7 @@ node indexing, and RNG seeds stay consistent across the toolchain.
 * Opens RIN archives, resolves dataset names from attributes, and renders nodes plus attractive/repulsive edge bundles
 in 3D using Matplotlib.
 * Allows both static frame visualization and trajectory animation.
-* Handles backend selection (`Agg` fallback in headless environments) and offers convenient
+* Handles backend selection (`Agg` fallback in headless environments) and offers convenient color palettes via
 `visualizer_util`.
 
 ### `sawnergy.walks.Walker`
@@ -116,13 +165,10 @@ node indexing, and RNG seeds stay consistent across the toolchain.
 
 ### `sawnergy.embedding.Embedder`
 
-* Consumes walk archives, generates skip-gram pairs, and
-*
-
-
-* Both `SGNS_PureML` and `SGNS_Torch` accept training hyperparameters such as batch_size, LR, optimizer and LR_scheduler, etc.
-* Exposes `embed_frame` (single frame) and `embed_all` (all frames, deterministic seeding per frame) which return the
-learned input embedding matrices and write them to disk when requested.
+* Consumes walk archives, generates skip-gram pairs, and normalizes them to 0-based indices.
+* Selects skip-gram (SG / SGNS) backends dynamically via `model_base="pureml"|"torch"` with per-backend overrides supplied through `model_kwargs`.
+* Handles deterministic per-frame seeding and returns the requested embedding `kind` (`"in"`, `"out"`, or `"avg"`) from `embed_frame` and `embed_all`.
+* Persists per-frame matrices with rich provenance (walk metadata, objective, hyperparameters, RNG seeds) when `embed_all` targets an output archive.
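A hypothetical call assembled from the parameters documented elsewhere in this README; the exact `embed_all` signature (in particular the `kind` argument and the in-memory return type) is an assumption, not a verified API.

```python
# Hypothetical usage sketch built from parameters mentioned in this README.
from sawnergy.embedding import Embedder

embedder = Embedder("./WALKS_demo.zip", seed=7)
frame_embeddings = embedder.embed_all(
    RIN_type="attr",        # use the attractive network
    using="RW",             # random walks only
    model_base="pureml",    # default backend; no PyTorch required
    window_size=4,
    num_epochs=5,
    kind="avg",             # assumed: average of input and output embeddings
)
```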
 
 ### Supporting Utilities
 
@@ -140,23 +186,13 @@ node indexing, and RNG seeds stay consistent across the toolchain.
 |---|---|---|
 | **RIN** | `ATTRACTIVE_transitions` → **(T, N, N)**, float32 • `REPULSIVE_transitions` → **(T, N, N)**, float32 (optional) • `ATTRACTIVE_energies` → **(T, N, N)**, float32 (optional) • `REPULSIVE_energies` → **(T, N, N)**, float32 (optional) • `COM` → **(T, N, 3)**, float32 | `time_created` (ISO) • `com_name` = `"COM"` • `molecule_of_interest` (int) • `frame_range` = `(start, end)` inclusive • `frame_batch_size` (int) • `prune_low_energies_frac` (float in [0,1]) • `attractive_transitions_name` / `repulsive_transitions_name` (dataset names or `None`) • `attractive_energies_name` / `repulsive_energies_name` (dataset names or `None`) |
 | **Walks** | `ATTRACTIVE_RWs` → **(T, N·num_RWs, L+1)**, int32 (optional) • `REPULSIVE_RWs` → **(T, N·num_RWs, L+1)**, int32 (optional) • `ATTRACTIVE_SAWs` → **(T, N·num_SAWs, L+1)**, int32 (optional) • `REPULSIVE_SAWs` → **(T, N·num_SAWs, L+1)**, int32 (optional) <br/>_Note:_ node IDs are **1-based**.| `time_created` (ISO) • `seed` (int) • `rng_scheme` = `"SeedSequence.spawn_per_batch_v1"` • `num_workers` (int) • `in_parallel` (bool) • `batch_size_nodes` (int) • `num_RWs` / `num_SAWs` (ints) • `node_count` (N) • `time_stamp_count` (T) • `walk_length` (L) • `walks_per_node` (int) • `attractive_RWs_name` / `repulsive_RWs_name` / `attractive_SAWs_name` / `repulsive_SAWs_name` (dataset names or `None`) • `walks_layout` = `"time_leading_3d"` |
-| **Embeddings** | `FRAME_EMBEDDINGS` → **(
+| **Embeddings** | `FRAME_EMBEDDINGS` → **(T, N, D)**, float32 | `created_at` (ISO) • `frame_embeddings_name` = `"FRAME_EMBEDDINGS"` • `time_stamp_count` = T • `node_count` = N • `embedding_dim` = D • `model_base` = `"torch"` or `"pureml"` • `embedding_kind` = `"in"|"out"|"avg"` • `objective` = `"sgns"` or `"sg"` • `negative_sampling` (bool) • `num_negative_samples` (int) • `num_epochs` (int) • `batch_size` (int) • `window_size` (int) • `alpha` (float) • `lr_step_per_batch` (bool) • `shuffle_data` (bool) • `device_hint` (str) • `model_kwargs_repr` (repr string) • `RIN_type` = `"attr"` or `"repuls"` • `using` = `"RW"|"SAW"|"merged"` • `source_WALKS_path` (str) • `walk_length` (int) • `num_RWs` / `num_SAWs` (ints) • `attractive_*_name` / `repulsive_*_name` (dataset names or `None`) • `master_seed` (int) • `per_frame_seeds` (list[int]) • `arrays_per_chunk` (int) • `compression_level` (int) |
 
 **Notes**
 
-- In **RIN**, `T` equals the number of frame **batches** written (i.e., `frame_range` swept in steps of `frame_batch_size`). `ATTRACTIVE/REPULSIVE_energies` are **pre-
+- In **RIN**, `T` equals the number of frame **batches** written (i.e., `frame_range` swept in steps of `frame_batch_size`). `ATTRACTIVE/REPULSIVE_energies` are **pre-normalized** absolute energies (written only when `keep_prenormalized_energies=True`), whereas `ATTRACTIVE/REPULSIVE_transitions` are the **row-wise L1-normalized** versions used for sampling.
 - All archives are Zarr v3 groups. ArrayStorage also maintains per-block metadata in root attrs: `array_chunk_size_in_block`, `array_shape_in_block`, and `array_dtype_in_block` (dicts keyed by dataset name). You’ll see these in every archive.
-
----
-
-## Installation
-
-```bash
-pip install sawnergy
-```
-
-> **Note:** RIN building requires `cpptraj` (AmberTools). Ensure it is discoverable via `$PATH` or the `CPPTRAJ`
-> environment variable.
+- In **Embeddings**, `alpha` and `num_negative_samples` apply to **SGNS** only and are ignored for `objective="sg"`.
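Because every archive is a Zarr v3 group written to a zip file, it can also be inspected directly. A minimal sketch assuming zarr-python 3's `ZipStore` follows; SAWNERGY's own `ArrayStorage` is the supported interface.

```python
# Sketch: peeking into an embeddings archive with zarr directly
# (assumes a Zarr v3 zip store; ArrayStorage is the supported way).
import zarr

store = zarr.storage.ZipStore("./EMBEDDINGS_demo.zip", mode="r")
group = zarr.open_group(store=store, mode="r")

name = group.attrs["frame_embeddings_name"]   # "FRAME_EMBEDDINGS"
emb = group[name][:]                          # ndarray of shape (T, N, D)
print(emb.shape, group.attrs["model_base"], group.attrs["objective"])
```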
 
 ---
 
@@ -181,10 +217,10 @@ rin_builder.build_rin(
     molecule_of_interest=1,
     frame_range=(1, 100),
     frame_batch_size=10,
-    prune_low_energies_frac=0.
+    prune_low_energies_frac=0.85,
     output_path=rin_path,
     include_attractive=True,
-    include_repulsive=False
+    include_repulsive=False
 )
 
 # 2. Sample walks from the RIN
@@ -192,52 +228,43 @@ walker = Walker(rin_path, seed=123)
 walks_path = Path("./WALKS_demo.zip")
 walker.sample_walks(
     walk_length=16,
-    walks_per_node=
+    walks_per_node=100,
     saw_frac=0.25,
     include_attractive=True,
     include_repulsive=False,
     time_aware=False,
     output_path=walks_path,
-    in_parallel=False
+    in_parallel=False
 )
 walker.close()
 
 # 3. Train embeddings per frame (PyTorch backend)
 import torch
 
-embedder = Embedder(walks_path,
+embedder = Embedder(walks_path, seed=999)
 embeddings_path = embedder.embed_all(
     RIN_type="attr",
     using="merged",
+    num_epochs=10,
+    negative_sampling=False,
     window_size=4,
-
-
-
-    dimensionality=128,
-    shuffle_data=True,
-    output_path="./EMBEDDINGS_demo.zip",
-    sgns_kwargs={
-        "optim": torch.optim.Adam,
-        "optim_kwargs": {"lr": 1e-3},
-        "lr_sched": torch.optim.lr_scheduler.LambdaLR,
-        "lr_sched_kwargs": {"lr_lambda": lambda _: 1.0},
-        "device": "cuda" if torch.cuda.is_available() else "cpu",
-    },
+    device="cuda" if torch.cuda.is_available() else "cpu",
+    model_base="torch",
+    output_path="./EMBEDDINGS_demo.zip"
 )
 print("Embeddings written to", embeddings_path)
 ```
 
-> For the PureML backend,
-> (for example `optim=pureml.optimizers.Adam`, `lr_sched=pureml.optimizers.CosineAnnealingLR`).
+> For the PureML backend, set `model_base="pureml"` and pass the optimizer / scheduler classes inside `model_kwargs`.
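A hypothetical PureML-backend variant of the call above; the `model_kwargs` keys mirror the optimizer/scheduler names shown in the 1.0.3 README and may differ in the current API.

```python
# Hypothetical PureML-backend call; kwarg names are assumptions based on the
# 1.0.3 README's sgns_kwargs example, not a verified 1.0.9 signature.
import pureml

embeddings_path = embedder.embed_all(
    RIN_type="attr",
    using="merged",
    window_size=4,
    model_base="pureml",
    model_kwargs={
        "optim": pureml.optimizers.Adam,
        "lr_sched": pureml.optimizers.CosineAnnealingLR,
    },
    output_path="./EMBEDDINGS_pureml.zip",
)
```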
 
 ---
 
-##
+## Visualization
 
 ```python
 from sawnergy.visual import Visualizer
 
-v =
+v = Visualizer("./RIN_demo.zip")
 v.build_frame(1,
     node_colors="rainbow",
     displayed_nodes="ALL",
@@ -250,14 +277,20 @@ v.build_frame(1,
 
 `Visualizer` lazily loads datasets and works even in headless environments (falls back to the `Agg` backend).
 
+```python
+from sawnergy.embedding import Visualizer
+
+viz = Visualizer("./EMBEDDINGS_demo.zip", normalize_rows=True)
+viz.build_frame(1, show=True)
+```
+
 ---
 
 ## Advanced Notes
 
 - **Time-aware walks**: Set `time_aware=True`, provide `stickiness` and `on_no_options` when calling `Walker.sample_walks` (sketched after this list).
 - **Shared memory lifecycle**: Call `Walker.close()` (or use a context manager) to release shared-memory segments.
-- **PureML vs PyTorch**:
-constructor kwargs through `sgns_kwargs` (optimizer, scheduler, device).
+- **PureML vs PyTorch**: Select the backend at call time with `model_base="pureml"|"torch"` (defaults to `"pureml"`) and pass optimizer / scheduler overrides through `model_kwargs`.
 - **ArrayStorage utilities**: Use `ArrayStorage` directly to peek into archives, append arrays, or manage metadata.
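A sketch of a time-aware run using the context-manager form mentioned above; the values passed for `stickiness` and `on_no_options` are placeholders, not a documented API.

```python
# Hypothetical values: `stickiness` and `on_no_options` are named in the notes
# above, but the accepted values shown here are placeholders, not a verified API.
from sawnergy.walks import Walker

with Walker("./RIN_demo.zip", seed=123) as walker:   # context manager frees shared memory
    walker.sample_walks(
        walk_length=16,
        walks_per_node=50,
        time_aware=True,          # walk across consecutive frames
        stickiness=0.5,           # placeholder: bias toward staying in the current frame
        on_no_options="stop",     # placeholder: behavior when no moves remain
        include_attractive=True,
        output_path="./WALKS_time_aware.zip",
    )
```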
 
 ---
@@ -268,8 +301,9 @@ v.build_frame(1,
 ├── sawnergy/
 │   ├── rin/        # RINBuilder and cpptraj integration helpers
 │   ├── walks/      # Walker class and shared-memory utilities
-│   ├── embedding/  # Embedder + SGNS backends (PureML / PyTorch)
+│   ├── embedding/  # Embedder + SG/SGNS backends (PureML / PyTorch)
 │   ├── visual/     # Visualizer and palette utilities
+│   │
 │   ├── logging_util.py
 │   └── sawnergy_util.py
 │
@@ -278,7 +312,7 @@ v.build_frame(1,
 
 ---
 
-##
+## Acknowledgments
 
 SAWNERGY builds on the AmberTools `cpptraj` ecosystem, NumPy, Matplotlib, Zarr, and PyTorch (for GPU acceleration if necessary; PureML is available by default).
 Big thanks to the upstream communities whose work makes this toolkit possible.
sawnergy-1.0.9.dist-info/RECORD
ADDED

@@ -0,0 +1,23 @@
+sawnergy/__init__.py,sha256=Dq1U38ah6nPRFEDKN41mYphcTynKfnItca6QkYkpSbs,248
+sawnergy/logging_util.py,sha256=mfYw8IsYtOfCXayjkd4g9jHuupluxRNbqyFegRkiAhQ,1476
+sawnergy/sawnergy_util.py,sha256=Htx9wr0S8TXt5aHT2mtEdYf1TCo_BC1IUwNNuZdIR-4,49432
+sawnergy/embedding/SGNS_pml.py,sha256=Oo6fVfGHgTMwZNgz-Kk4Pi9iapTkcYsi7TcnLOdS4a0,16868
+sawnergy/embedding/SGNS_torch.py,sha256=mSQ38Yq1EoVFzjQump5yTJqAXp1Moo1xdpy_62K1uCk,16821
+sawnergy/embedding/__init__.py,sha256=EH3PoCX5FQ1-f75V3iHn32_vZDrMX3KqVNRjgQn42PA,1798
+sawnergy/embedding/embedder.py,sha256=Qa0jvZ2rVd7-Qv0FhYYHMKeY9qrW1A68G673GGk40v8,35987
+sawnergy/embedding/visualizer.py,sha256=x0BiSG9_nk9AUQm9RsZ2syKeCiaxX1gTlC85aYycMXY,8830
+sawnergy/rin/__init__.py,sha256=z19hLfEIp3bwzY-eCHQBQf0NRTCJzVz_FLIpVV5q0W4,162
+sawnergy/rin/rin_builder.py,sha256=d1cC4KKY9zzNlqhxHWTFM-QyXRXubd2zlCrSM-dV5pc,44624
+sawnergy/rin/rin_util.py,sha256=5TKywA5qfm76Gl4Cyz7oBPasmE5chclR7UM4hawwQOg,14939
+sawnergy/visual/__init__.py,sha256=p_ByFtfrP19b5_qiJlkAnYesZN3M1LjIo421LUgVVbw,502
+sawnergy/visual/visualizer.py,sha256=GVD_rFavDXFz9-h28eFf5nPBujUvRncn_zYoHcFHZ3Q,33155
+sawnergy/visual/visualizer_util.py,sha256=7y3kWjHxDQMoG0dmimceHKTC5veVChoyvW7d0qXH23k,15100
+sawnergy/walks/__init__.py,sha256=Z_Kaffhn3oUX13z9jbY0V5Ncdwj9Cnr--n9D-s7gh5k,250
+sawnergy/walks/walker.py,sha256=scvfZFrSL4AwpmspD0Jb0uhnrVIRRwE_hPCE3bG6zpg,37729
+sawnergy/walks/walker_util.py,sha256=ETdyPNIDwDQCA8Z5t38keBhYBJ56_ksT_0NhOCY-tHE,15361
+sawnergy-1.0.9.dist-info/licenses/LICENSE,sha256=cElK4bCsDhyAEON3H05s35bQZvxBcXBiCOrOdiUhDCY,11346
+sawnergy-1.0.9.dist-info/licenses/NOTICE,sha256=eVTbuSasZrmMJVtKoWOzsKyu4ZNm7Ks7dzI3Tx5tEHc,109
+sawnergy-1.0.9.dist-info/METADATA,sha256=t1LF9o767YZdSPhWrE8D4jIBzGnUhnN02i7KVl0YDj4,16844
+sawnergy-1.0.9.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
+sawnergy-1.0.9.dist-info/top_level.txt,sha256=-67FQD6FD9Gjt74WTmO9hNYA3MLB4HaSxci0sEKC5Lo,9
+sawnergy-1.0.9.dist-info/RECORD,,
sawnergy-1.0.3.dist-info/RECORD
DELETED

@@ -1,22 +0,0 @@
-sawnergy/__init__.py,sha256=VXONQ6OVlREGONOhpP29GVHYyl2W-6tfifnyK2GnlUo,207
-sawnergy/logging_util.py,sha256=tnhToHchnWaORHU73dxzBuL1e_C-AXFdPExDZTEI6tE,1474
-sawnergy/sawnergy_util.py,sha256=Htx9wr0S8TXt5aHT2mtEdYf1TCo_BC1IUwNNuZdIR-4,49432
-sawnergy/embedding/SGNS_pml.py,sha256=ZwzGogopOX9xjKXXuoaP1dp8TjxN9dnHiKLt6nj10mQ,4933
-sawnergy/embedding/SGNS_torch.py,sha256=3Pa_mk5mzsl27M87q4tNmitOouxDdG5ZzxpdaOSyGt8,6411
-sawnergy/embedding/__init__.py,sha256=sxUh2RcZyPs8aCdvec8x843Bm3DBaYQNrBF8VyvLQ-k,965
-sawnergy/embedding/embedder.py,sha256=0hCEVtO-qwNFa-qEwJ6GKnirZ7nYptChwhhFL-QnCho,25810
-sawnergy/rin/__init__.py,sha256=z19hLfEIp3bwzY-eCHQBQf0NRTCJzVz_FLIpVV5q0W4,162
-sawnergy/rin/rin_builder.py,sha256=mdKojh6qKtZ8n10fJtTh-3s2f4zZ5bkDBLwaoLEOUpg,44625
-sawnergy/rin/rin_util.py,sha256=5TKywA5qfm76Gl4Cyz7oBPasmE5chclR7UM4hawwQOg,14939
-sawnergy/visual/__init__.py,sha256=p_ByFtfrP19b5_qiJlkAnYesZN3M1LjIo421LUgVVbw,502
-sawnergy/visual/visualizer.py,sha256=qqggoLRNi6t0awXEt-Hy2ut9S0Y8_uKznyozlGLR1Q8,33131
-sawnergy/visual/visualizer_util.py,sha256=C9W22CJmfJuTV5_uYsEnG8YChR4nH7OHKbNz26hAyB0,15028
-sawnergy/walks/__init__.py,sha256=Z_Kaffhn3oUX13z9jbY0V5Ncdwj9Cnr--n9D-s7gh5k,250
-sawnergy/walks/walker.py,sha256=Yj8FlIGYXNR1lK5ORE-TICr74k8ifargp1Ddswl5aFc,36808
-sawnergy/walks/walker_util.py,sha256=ETdyPNIDwDQCA8Z5t38keBhYBJ56_ksT_0NhOCY-tHE,15361
-sawnergy-1.0.3.dist-info/licenses/LICENSE,sha256=cElK4bCsDhyAEON3H05s35bQZvxBcXBiCOrOdiUhDCY,11346
-sawnergy-1.0.3.dist-info/licenses/NOTICE,sha256=eVTbuSasZrmMJVtKoWOzsKyu4ZNm7Ks7dzI3Tx5tEHc,109
-sawnergy-1.0.3.dist-info/METADATA,sha256=m3cQkU2KNUCGJuNYne1mAyHkasiTyw6MeRZMbknuzsI,13367
-sawnergy-1.0.3.dist-info/WHEEL,sha256=_zCd3N1l69ArxyTb8rzEoP9TpbYXkqRFSNOD5OuxnTs,91
-sawnergy-1.0.3.dist-info/top_level.txt,sha256=-67FQD6FD9Gjt74WTmO9hNYA3MLB4HaSxci0sEKC5Lo,9
-sawnergy-1.0.3.dist-info/RECORD,,
{sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/WHEEL: File without changes
{sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/licenses/LICENSE: File without changes
{sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/licenses/NOTICE: File without changes
{sawnergy-1.0.3.dist-info → sawnergy-1.0.9.dist-info}/top_level.txt: File without changes