spd-learn 0.1.0__tar.gz
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- spd_learn-0.1.0/CHANGELOG.md +146 -0
- spd_learn-0.1.0/LICENSE.txt +33 -0
- spd_learn-0.1.0/MANIFEST.in +6 -0
- spd_learn-0.1.0/PKG-INFO +244 -0
- spd_learn-0.1.0/README.md +172 -0
- spd_learn-0.1.0/pyproject.toml +164 -0
- spd_learn-0.1.0/setup.cfg +4 -0
- spd_learn-0.1.0/spd_learn/__init__.py +66 -0
- spd_learn-0.1.0/spd_learn/functional/__init__.py +187 -0
- spd_learn-0.1.0/spd_learn/functional/autograd.py +165 -0
- spd_learn-0.1.0/spd_learn/functional/batchnorm.py +217 -0
- spd_learn-0.1.0/spd_learn/functional/bilinear.py +134 -0
- spd_learn-0.1.0/spd_learn/functional/core.py +785 -0
- spd_learn-0.1.0/spd_learn/functional/covariance.py +98 -0
- spd_learn-0.1.0/spd_learn/functional/dropout.py +83 -0
- spd_learn-0.1.0/spd_learn/functional/metrics/__init__.py +85 -0
- spd_learn-0.1.0/spd_learn/functional/metrics/affine_invariant.py +347 -0
- spd_learn-0.1.0/spd_learn/functional/metrics/bures_wasserstein.py +588 -0
- spd_learn-0.1.0/spd_learn/functional/metrics/log_cholesky.py +489 -0
- spd_learn-0.1.0/spd_learn/functional/metrics/log_euclidean.py +417 -0
- spd_learn-0.1.0/spd_learn/functional/numerical.py +635 -0
- spd_learn-0.1.0/spd_learn/functional/parallel_transport.py +672 -0
- spd_learn-0.1.0/spd_learn/functional/regularize.py +147 -0
- spd_learn-0.1.0/spd_learn/functional/riemannian_pgd.py +164 -0
- spd_learn-0.1.0/spd_learn/functional/utils.py +57 -0
- spd_learn-0.1.0/spd_learn/functional/wavelet.py +137 -0
- spd_learn-0.1.0/spd_learn/init.py +168 -0
- spd_learn-0.1.0/spd_learn/logging.py +334 -0
- spd_learn-0.1.0/spd_learn/models/__init__.py +22 -0
- spd_learn-0.1.0/spd_learn/models/eegspdnet.py +144 -0
- spd_learn-0.1.0/spd_learn/models/green.py +210 -0
- spd_learn-0.1.0/spd_learn/models/matt.py +221 -0
- spd_learn-0.1.0/spd_learn/models/phase_spdnet.py +145 -0
- spd_learn-0.1.0/spd_learn/models/spdnet.py +111 -0
- spd_learn-0.1.0/spd_learn/models/tensorcsp.py +207 -0
- spd_learn-0.1.0/spd_learn/models/tsmnet.py +119 -0
- spd_learn-0.1.0/spd_learn/modules/__init__.py +45 -0
- spd_learn-0.1.0/spd_learn/modules/batchnorm.py +623 -0
- spd_learn-0.1.0/spd_learn/modules/bilinear.py +331 -0
- spd_learn-0.1.0/spd_learn/modules/covariance.py +196 -0
- spd_learn-0.1.0/spd_learn/modules/dropout.py +142 -0
- spd_learn-0.1.0/spd_learn/modules/manifold.py +167 -0
- spd_learn-0.1.0/spd_learn/modules/modeig.py +414 -0
- spd_learn-0.1.0/spd_learn/modules/regularize.py +319 -0
- spd_learn-0.1.0/spd_learn/modules/residual.py +150 -0
- spd_learn-0.1.0/spd_learn/modules/utils.py +191 -0
- spd_learn-0.1.0/spd_learn/modules/wavelet.py +206 -0
- spd_learn-0.1.0/spd_learn/version.py +5 -0
- spd_learn-0.1.0/spd_learn.egg-info/PKG-INFO +244 -0
- spd_learn-0.1.0/spd_learn.egg-info/SOURCES.txt +61 -0
- spd_learn-0.1.0/spd_learn.egg-info/dependency_links.txt +1 -0
- spd_learn-0.1.0/spd_learn.egg-info/requires.txt +50 -0
- spd_learn-0.1.0/spd_learn.egg-info/top_level.txt +1 -0
- spd_learn-0.1.0/tests/test_attacks.py +43 -0
- spd_learn-0.1.0/tests/test_batchnorm.py +228 -0
- spd_learn-0.1.0/tests/test_correctness.py +200 -0
- spd_learn-0.1.0/tests/test_dropout.py +54 -0
- spd_learn-0.1.0/tests/test_functional.py +377 -0
- spd_learn-0.1.0/tests/test_integration.py +273 -0
- spd_learn-0.1.0/tests/test_models.py +61 -0
- spd_learn-0.1.0/tests/test_modules.py +571 -0
- spd_learn-0.1.0/tests/test_numerical.py +412 -0
- spd_learn-0.1.0/tests/test_transport.py +551 -0

spd_learn-0.1.0/CHANGELOG.md
ADDED
@@ -0,0 +1,146 @@
# Changelog

All notable changes to SPD Learn will be documented in this file.

The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.1.0] - 2026-02-05

### Initial Release

We are excited to announce the first public release of **SPD Learn** – a pure PyTorch library for geometric deep learning on Symmetric Positive Definite (SPD) matrices.

SPD Learn provides differentiable Riemannian operations, broadcast-compatible layers, and reference implementations of published neural network architectures for SPD data, with a focus on EEG/BCI applications.

---

### Features

#### Neural Network Models

Seven state-of-the-art deep learning architectures for SPD matrix data:

| Model | Description | Reference |
|-------|-------------|-----------|
| **SPDNet** | Foundational architecture for deep learning on SPD manifolds with dimension reduction | Huang & Van Gool, AAAI 2017 |
| **EEGSPDNet** | Specialized for EEG classification, combining covariance estimation with SPD layers | - |
| **TSMNet** | Tangent Space Mapping Network with convolutional features and SPD batch normalization | Kobler et al., 2022 |
| **TensorCSPNet** | Multi-band EEG feature extraction using Tensor Common Spatial Patterns | - |
| **PhaseSPDNet** | Leverages instantaneous phase information from analytic signals | - |
| **GREEN** | Gabor Riemann EEGNet combining Gabor wavelets with Riemannian geometry | Chambon et al. |
| **MAtt** | Manifold Attention mechanism for SPD matrices | - |
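
The models are exposed through `spd_learn.models`. As a minimal sketch, the snippet below instantiates one of them with the constructor arguments used in the Getting Started example of the README; the full set of options is documented in the API reference.

```python
# Minimal sketch: instantiate a bundled model.
# Constructor arguments follow the README's Getting Started example;
# see the API reference for the complete signature.
from spd_learn.models import SPDNet

model = SPDNet(n_chans=22, n_outputs=4, subspacedim=16)
print(sum(p.numel() for p in model.parameters()))  # number of trainable parameters
```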

#### SPD Neural Network Layers

A comprehensive set of differentiable layers that respect SPD geometry:

- **BiMap** – Bilinear mapping layer for SPD dimension reduction
- **ReEig** – Eigenvalue rectification to ensure positive definiteness
- **LogEig** – Logarithmic map to tangent space (Euclidean)
- **ExpEig** – Exponential map from tangent space back to the SPD manifold
- **SqrtEig** – Matrix square root via eigendecomposition
- **InvSqrtEig** – Inverse matrix square root
- **PowerEig** – Matrix power function
- **VecMat** – Vectorization/matricization operations
- **CovLayer** – Differentiable covariance matrix estimation
- **Shrinkage** – Regularized covariance estimation (Ledoit-Wolf, Oracle)
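
To make the eigenvalue-based layers concrete, here is a plain-PyTorch sketch of the operations that ReEig and LogEig compute on an SPD input: rectify the eigenvalues with a floor, or take their logarithm, and reassemble the matrix. This illustrates the underlying math only, not the library's implementation.

```python
import torch

def reeig(spd: torch.Tensor, eps: float = 1e-4) -> torch.Tensor:
    """Eigenvalue rectification: clamp eigenvalues from below so the output stays SPD."""
    eigvals, eigvecs = torch.linalg.eigh(spd)
    return eigvecs @ torch.diag_embed(eigvals.clamp_min(eps)) @ eigvecs.transpose(-2, -1)

def logeig(spd: torch.Tensor) -> torch.Tensor:
    """Matrix logarithm: map an SPD matrix to the Euclidean tangent space at the identity."""
    eigvals, eigvecs = torch.linalg.eigh(spd)
    return eigvecs @ torch.diag_embed(eigvals.log()) @ eigvecs.transpose(-2, -1)

# Example: a random SPD matrix built as A @ A.T plus a small ridge.
a = torch.randn(8, 8)
spd = a @ a.T + 1e-3 * torch.eye(8)
print(logeig(reeig(spd)).shape)  # torch.Size([8, 8])
```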

#### Batch Normalization

SPD-specific batch normalization layers respecting Riemannian geometry:

- **SPDBatchNorm** – Standard SPD batch normalization
- **TSMBatchNorm** – Batch normalization for Tangent Space Mapping
- **AdaMomSPDBatchNorm** – Adaptive momentum batch normalization
- **DomainSPDBatchNorm** – Domain-specific batch normalization for transfer learning
- **TrackingMeanBatchNorm** – Batch normalization with running mean tracking
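
The idea these layers share can be sketched in a few lines of plain PyTorch: estimate a batch mean on the manifold and re-center every matrix by congruence with its inverse square root, so that the batch mean becomes the identity. The sketch below uses a log-Euclidean mean purely for illustration; the layers listed above additionally handle running statistics, momentum, and domain-specific means.

```python
import torch

def matrix_log(spd):     # matrix logarithm via eigendecomposition
    w, v = torch.linalg.eigh(spd)
    return v @ torch.diag_embed(w.log()) @ v.transpose(-2, -1)

def matrix_exp(sym):     # matrix exponential via eigendecomposition
    w, v = torch.linalg.eigh(sym)
    return v @ torch.diag_embed(w.exp()) @ v.transpose(-2, -1)

def matrix_invsqrt(spd): # inverse matrix square root via eigendecomposition
    w, v = torch.linalg.eigh(spd)
    return v @ torch.diag_embed(w.rsqrt()) @ v.transpose(-2, -1)

def spd_batch_center(batch):
    """Re-center a (N, n, n) batch of SPD matrices so its log-Euclidean mean is the identity."""
    mean = matrix_exp(matrix_log(batch).mean(dim=0))
    whitener = matrix_invsqrt(mean)
    return whitener @ batch @ whitener

# Toy batch of SPD matrices.
a = torch.randn(32, 6, 6)
batch = a @ a.transpose(-2, -1) + 1e-3 * torch.eye(6)
print(spd_batch_center(batch).shape)  # torch.Size([32, 6, 6])
```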

#### Riemannian Metrics

Four Riemannian metrics for SPD manifolds:

| Metric | Description |
|--------|-------------|
| **AffineInvariantRiemannian** | The canonical metric on SPD manifolds (AIRM) |
| **LogEuclidean** | Computationally efficient metric via matrix logarithm |
| **LogCholesky** | Metric based on Cholesky decomposition |
| **BuresWasserstein** | Optimal transport metric between Gaussians |

Each metric provides:
- Geodesic distance computation
- Exponential and logarithmic maps
- Parallel transport along geodesics
- Fréchet/Karcher mean computation
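
Under the affine-invariant metric, for example, the geodesic distance between SPD matrices A and B is the Frobenius norm of log(A^(-1/2) B A^(-1/2)). A minimal plain-PyTorch sketch of that quantity (an illustration of the math, not the library's implementation):

```python
import torch

def _eig_fn(spd, fn):
    """Apply an elementwise function to the eigenvalues of a symmetric matrix."""
    w, v = torch.linalg.eigh(spd)
    return v @ torch.diag_embed(fn(w)) @ v.transpose(-2, -1)

def airm_distance(a: torch.Tensor, b: torch.Tensor) -> torch.Tensor:
    """Affine-invariant distance: Frobenius norm of logm(a^{-1/2} b a^{-1/2})."""
    a_invsqrt = _eig_fn(a, torch.rsqrt)
    middle = a_invsqrt @ b @ a_invsqrt
    return torch.linalg.matrix_norm(_eig_fn(middle, torch.log))

x, y = torch.randn(5, 5), torch.randn(5, 5)
a = x @ x.T + 1e-2 * torch.eye(5)
b = y @ y.T + 1e-2 * torch.eye(5)
print(airm_distance(a, b))  # positive distance
print(airm_distance(a, a))  # approximately zero
```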

#### Functional Operations

Low-level differentiable operations in `spd_learn.functional`:

- **Matrix Operations**: `logm`, `expm`, `sqrtm`, `invsqrtm`, `powm`
- **Geodesics**: `geodesic`, `log_map`, `exp_map`
- **Statistics**: `frechet_mean`, `log_euclidean_mean`
- **Parallel Transport**: `parallel_transport`
- **Covariance**: `covariance`, `scm`, `shrinkage_covariance`
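
The covariance helpers target the typical EEG workflow: turn a multichannel window into an SPD matrix, then regularize it. The quantities involved are easy to state in plain PyTorch; the sketch below shows a sample covariance and shrinkage toward a scaled identity (the Ledoit-Wolf estimator chooses the shrinkage weight automatically, so the fixed `alpha` here is only illustrative).

```python
import torch

def sample_covariance(x: torch.Tensor) -> torch.Tensor:
    """Sample covariance of a (channels, times) window."""
    x = x - x.mean(dim=-1, keepdim=True)
    return x @ x.transpose(-2, -1) / (x.shape[-1] - 1)

def shrink(cov: torch.Tensor, alpha: float = 0.1) -> torch.Tensor:
    """Shrink toward a scaled identity; alpha is a fixed, illustrative weight."""
    n = cov.shape[-1]
    mu = torch.diagonal(cov, dim1=-2, dim2=-1).mean(-1)
    return (1 - alpha) * cov + alpha * mu * torch.eye(n)

eeg = torch.randn(22, 512)                    # 22 channels, 512 samples
cov = shrink(sample_covariance(eeg))
print(torch.linalg.eigvalsh(cov).min() > 0)   # positive definite
```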

#### Additional Features

- **GPU Acceleration** – Full CUDA support with efficient batched operations
- **Automatic Differentiation** – Seamless gradient computation on manifolds via PyTorch
- **scikit-learn Compatible** – Integration with ML pipelines via skorch/Braindecode wrappers
- **Comprehensive Documentation** – Tutorials, API reference, and theoretical background
- **Examples Gallery** – Ready-to-run examples for common use cases

---

### Documentation

- **Installation Guide** – Step-by-step setup instructions
- **User Guide** – Comprehensive introduction to SPD matrices and Riemannian geometry
- **Theory Section** – Mathematical background, layer descriptions, and metric details
- **API Reference** – Complete documentation of all modules and functions
- **Examples Gallery** – Practical examples including EEG classification

---

### Technical Details

- **Python**: 3.11, 3.12, 3.13
- **PyTorch**: 2.0+
- **License**: BSD-3-Clause

---

### Acknowledgments

SPD Learn is developed and maintained by researchers from:

- Inria (French National Institute for Research in Digital Science and Technology)
- CNRS (French National Centre for Scientific Research)
- CEA (French Alternative Energies and Atomic Energy Commission)
- Université Paris-Saclay
- ATR (Advanced Telecommunications Research Institute International)
- Université Savoie Mont Blanc

---

### Citation

If you use SPD Learn in your research, please cite:

```bibtex
@article{aristimunha2025spdlearn,
  title   = {SPDlearn: A Geometric Deep Learning Python Library for Neural
             Decoding Through Trivialization},
  author  = {Aristimunha, Bruno and Ju, Ce and Collas, Antoine and
             Bouchard, Florent and Thirion, Bertrand and
             Chevallier, Sylvain and Kobler, Reinmar},
  journal = {To be submitted},
  year    = {2026},
  url     = {https://github.com/spdlearn/spd_learn}
}
```

---

**Full Changelog**: https://github.com/spdlearn/spd_learn/commits/v0.1.0

spd_learn-0.1.0/LICENSE.txt
ADDED
@@ -0,0 +1,33 @@
BSD 3-Clause License, unless the header of the code section
explicitly states otherwise.

Copyright (c) 2024-now SPD Learn Developers.

All rights reserved

Redistribution and use in source and binary forms, with or without modification,
are permitted provided that the following conditions are met:

1. Redistributions of source code must retain the above copyright notice,
this list of conditions and the following disclaimer.

2. Redistributions in binary form must reproduce the above copyright notice,
this list of conditions and the following disclaimer in the documentation
and/or other materials provided with the distribution.

3. Neither the name of the copyright holder nor the names of its contributors
may be used to endorse or promote products derived from this software
without specific prior written permission.

THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO,
THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A
PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER
OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL,
EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO,
PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR
PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY
WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE
POSSIBILITY OF SUCH DAMAGE.

spd_learn-0.1.0/PKG-INFO
ADDED
@@ -0,0 +1,244 @@
Metadata-Version: 2.4
Name: spd_learn
Version: 0.1.0
Summary: Deep Riemannian Learning for Symmetric Positive Definite matrices
Author-email: Bruno Aristimunha <b.aristimunha@gmail.com>
Maintainer-email: Bruno Aristimunha <b.aristimunha@gmail.com>
Keywords: python,deep-learning,pytorch,spdnet,torch
Classifier: Intended Audience :: Science/Research
Classifier: Intended Audience :: Developers
Classifier: Programming Language :: Python
Classifier: Topic :: Software Development
Classifier: Topic :: Scientific/Engineering
Classifier: Development Status :: 3 - Alpha
Classifier: Operating System :: Microsoft :: Windows
Classifier: Operating System :: POSIX
Classifier: Operating System :: Unix
Classifier: Operating System :: MacOS
Classifier: Programming Language :: Python :: 3
Classifier: Programming Language :: Python :: 3.11
Classifier: Programming Language :: Python :: 3.12
Classifier: Programming Language :: Python :: 3.13
Requires-Python: >=3.11
Description-Content-Type: text/markdown
License-File: LICENSE.txt
Requires-Dist: torch
Requires-Dist: einops
Requires-Dist: numpy
Provides-Extra: tests
Requires-Dist: pytest; extra == "tests"
Requires-Dist: pytest-cov; extra == "tests"
Requires-Dist: pytest-codeblocks; extra == "tests"
Requires-Dist: sybil; extra == "tests"
Requires-Dist: codecov; extra == "tests"
Requires-Dist: pytest_cases; extra == "tests"
Requires-Dist: mypy; extra == "tests"
Provides-Extra: docs
Requires-Dist: sphinx; extra == "docs"
Requires-Dist: pydata-sphinx-theme; extra == "docs"
Requires-Dist: sphinx_gallery; extra == "docs"
Requires-Dist: sphinx-copybutton; extra == "docs"
Requires-Dist: sphinx-sitemap; extra == "docs"
Requires-Dist: sphinxcontrib-bibtex; extra == "docs"
Requires-Dist: numpydoc; extra == "docs"
Requires-Dist: memory_profiler; extra == "docs"
Requires-Dist: ipython; extra == "docs"
Requires-Dist: sphinx_design; extra == "docs"
Requires-Dist: sphinx-autodoc-typehints; extra == "docs"
Requires-Dist: hydra-core; extra == "docs"
Requires-Dist: omegaconf; extra == "docs"
Requires-Dist: nbformat; extra == "docs"
Provides-Extra: brain
Requires-Dist: moabb; extra == "brain"
Requires-Dist: braindecode; extra == "brain"
Requires-Dist: moabb; extra == "brain"
Requires-Dist: nilearn; extra == "brain"
Requires-Dist: pyriemann; extra == "brain"
Requires-Dist: skada; extra == "brain"
Provides-Extra: dev
Requires-Dist: spd_learn[tests]; extra == "dev"
Requires-Dist: pre-commit; extra == "dev"
Requires-Dist: ruff; extra == "dev"
Requires-Dist: mypy; extra == "dev"
Requires-Dist: rich; extra == "dev"
Requires-Dist: types-PyYAML; extra == "dev"
Requires-Dist: types-requests; extra == "dev"
Provides-Extra: all
Requires-Dist: spd_learn[tests]; extra == "all"
Requires-Dist: spd_learn[docs]; extra == "all"
Requires-Dist: spd_learn[brain]; extra == "all"
Requires-Dist: spd_learn[dev]; extra == "all"
Dynamic: license-file

<p align="center">
  <img src="docs/source/_static/spd_learn.png" alt="SPD Learn Logo" width="400">
</p>

<h3 align="center">Deep Learning on Symmetric Positive Definite Matrices</h3>

<p align="center">
  <strong>SPD Learn</strong> is a pure PyTorch library for geometric deep learning on Symmetric Positive Definite (SPD) matrices.<br>
  The library provides differentiable Riemannian operations, broadcast-compatible layers, and reference implementations of published neural network architectures for SPD data.
</p>

<p align="center">
  <a href="https://spdlearn.org/">Docs</a> •
  <a href="https://spdlearn.org/installation.html">Install</a> •
  <a href="https://spdlearn.org/generated/auto_examples/index.html">Examples</a> •
  <a href="https://spdlearn.org/api.html">API</a>
</p>

<p align="center">
  <a href="https://github.com/spdlearn/spd_learn"><img src="https://img.shields.io/badge/python-3.11%20%7C%203.12%20%7C%203.13-blue?style=flat-square" alt="Python"></a>
  <a href="https://github.com/spdlearn/spd_learn/blob/main/LICENSE.txt"><img src="https://img.shields.io/badge/license-BSD--3--Clause-green?style=flat-square" alt="License"></a>
  <a href="https://github.com/spdlearn/spd_learn"><img src="https://img.shields.io/badge/PyTorch-2.0%2B-red?style=flat-square&logo=pytorch" alt="PyTorch"></a>
  <a href="https://github.com/spdlearn/spd_learn"><img src="https://img.shields.io/github/stars/spdlearn/spd_learn?style=social" alt="GitHub Stars"></a>
</p>

---

## Why SPD Learn?

A PyTorch library providing differentiable Riemannian operations and neural network layers for SPD matrix-valued data.

| **Pure PyTorch** | **Riemannian Geometry** | **Model Zoo** |
|:---:|:---:|:---:|
| Built entirely on PyTorch for seamless integration, automatic differentiation, and GPU acceleration out of the box. | Efficient exponential maps, logarithms, parallel transport, and geodesic distance computations on SPD manifolds. | Implementations of SPDNet, TensorCSPNet, EEGSPDNet, TSMNet, and more state-of-the-art architectures. |

---

## Model Architectures

State-of-the-art deep learning models for SPD matrix data.

| Model | Description | Tags |
|-------|-------------|------|
| **[SPDNet](https://spdlearn.org/generated/spd_learn.models.SPDNet.html)** | The foundational architecture for deep learning on SPD manifolds. Performs dimension reduction while preserving the SPD structure. | `BiMap` `ReEig` `LogEig` |
| **[EEGSPDNet](https://spdlearn.org/generated/spd_learn.models.EEGSPDNet.html)** | Specialized for EEG signal classification. Combines covariance estimation with SPD network layers for BCI applications. | `Covariance` `BiMap` `ReEig` |
| **[TSMNet](https://spdlearn.org/generated/spd_learn.models.TSMNet.html)** | Tangent Space Mapping Network combining convolutional features with SPD batch normalization. | `BatchNorm` `LogEig` `Transfer` |
| **[TensorCSPNet](https://spdlearn.org/generated/spd_learn.models.TensorCSPNet.html)** | SPDNet variant with Tensor Common Spatial Patterns for multi-band EEG feature extraction. | `Multi-band` `CSP` `BiMap` |
| **[PhaseSPDNet](https://spdlearn.org/generated/spd_learn.models.PhaseSPDNet.html)** | Phase-based SPDNet that leverages instantaneous phase information from analytic signals. | `Phase` `Hilbert` `BiMap` |
| **[GREEN](https://spdlearn.org/generated/spd_learn.models.Green.html)** | Gabor Riemann EEGNet combining Gabor wavelets with Riemannian geometry for robust EEG decoding. | `Gabor` `Wavelet` `Shrinkage` |

---

## Key Features

Core components for constructing and training geometric neural networks on SPD manifolds.

- **SPD Layers** – Specialized neural network layers for SPD matrices: BiMap for bilinear mappings, ReEig for eigenvalue rectification, and LogEig for tangent space projection.
- **Riemannian Operations** – Complete toolkit for SPD manifold computations: exponential/logarithmic maps, geodesic distances, Log-Euclidean mean, and geodesic interpolation.
- **GPU Accelerated** – Full CUDA support with efficient batched operations. Leverage PyTorch's automatic differentiation for seamless gradient computation on manifolds.
- **scikit-learn Compatible** – Seamlessly integrate with scikit-learn pipelines, cross-validation, and hyperparameter tuning via skorch/Braindecode wrappers.
- **Batch Normalization** – SPD-specific batch normalization layers that respect the Riemannian geometry, enabling stable training of deep SPD networks.
- **Open Source** – BSD-3 licensed, actively maintained, and welcoming contributions. Comprehensive documentation and examples to get you started quickly.

---

## Getting Started

Three simple steps to start using SPD Learn.

### 1. Install

```bash
pip install spd_learn
```

Or install from source:

```bash
git clone https://github.com/spdlearn/spd_learn
cd spd_learn && pip install -e .
```

Works with Python 3.11+ and PyTorch 2.0+.

### 2. Import & Create

```python
from spd_learn.models import SPDNet
from spd_learn.modules import BiMap, ReEig

# Create your model
model = SPDNet(n_chans=22, n_outputs=4, subspacedim=16)
```

### 3. Train & Evaluate

```python
import torch

# Standard PyTorch training (X_train, y_train: your training data and labels as tensors)
criterion = torch.nn.CrossEntropyLoss()  # e.g. cross-entropy for the 4-class output
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

for epoch in range(100):
    optimizer.zero_grad()
    output = model(X_train)
    loss = criterion(output, y_train)
    loss.backward()
    optimizer.step()
```
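
To complete the "Train & Evaluate" step, here is a minimal evaluation sketch, assuming held-out tensors `X_test` and `y_test` shaped like the training data:

```python
# Evaluation sketch (X_test, y_test are assumed held-out tensors).
model.eval()
with torch.no_grad():
    predictions = model(X_test).argmax(dim=1)
    accuracy = (predictions == y_test).float().mean()
print(f"test accuracy: {accuracy:.3f}")
```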

---

## Ecosystem Integration

Works seamlessly with your favorite tools.

- **[PyTorch](https://pytorch.org/)** – Built entirely on PyTorch 2.0+
- **[scikit-learn](https://scikit-learn.org/)** – ML pipelines and cross-validation (via skorch/Braindecode wrappers)
- **[Braindecode](https://braindecode.org/)** – Deep learning for EEG
- **[MOABB](https://moabb.neurotechx.com/)** – EEG benchmark datasets
- **[pyRiemann](https://pyriemann.readthedocs.io/)** – Riemannian geometry for BCI
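
One way to use the scikit-learn route mentioned above is to wrap a model in skorch's `NeuralNetClassifier`, which gives it a fit/predict interface for pipelines and cross-validation. The snippet below is only a sketch: the constructor arguments mirror the Getting Started example and are forwarded to the model through skorch's `module__` prefix.

```python
from skorch import NeuralNetClassifier
from spd_learn.models import SPDNet

# scikit-learn compatible estimator; module__* arguments are forwarded to SPDNet.
net = NeuralNetClassifier(
    SPDNet,
    module__n_chans=22,
    module__n_outputs=4,
    module__subspacedim=16,
    max_epochs=100,
    lr=1e-4,
)
# net.fit(X_train, y_train) / net.predict(X_test) then behave like any scikit-learn estimator.
```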

---

## Citation

If you use SPD Learn in your research, please cite:

```bibtex
@article{aristimunha2025spdlearn,
  title   = {SPDlearn: A Geometric Deep Learning Python Library for Neural
             Decoding Through Trivialization},
  author  = {Aristimunha, Bruno and Ju, Ce and Collas, Antoine and
             Bouchard, Florent and Thirion, Bertrand and
             Chevallier, Sylvain and Kobler, Reinmar},
  journal = {To be submitted},
  year    = {2026},
  url     = {https://github.com/spdlearn/spd_learn}
}
```

---

## Open Source & Community Driven

SPD Learn is an open-source project developed by researchers, for researchers.
Join our community and help advance deep learning on Riemannian manifolds.

- [GitHub Repository](https://github.com/spdlearn/spd_learn)
- [Report Issues](https://github.com/spdlearn/spd_learn/issues)
- [Contributing Guide](CONTRIBUTING.md)

### Supported by

<p align="center">
  <a href="https://www.inria.fr/"><img src="docs/source/_static/support/inria_red.png" alt="Inria" height="60"></a>
  <a href="https://www.cnrs.fr/"><img src="docs/source/_static/support/cnrs_dark.png" alt="CNRS" height="60"></a>
  <a href="https://www.cea.fr/"><img src="docs/source/_static/support/cea_dark.png" alt="CEA" height="60"></a>
  <a href="https://www.universite-paris-saclay.fr/"><img src="docs/source/_static/support/paris_saclay_dark.jpg" alt="Université Paris-Saclay" height="60"></a>
  <a href="https://www.atr.jp/"><img src="docs/source/_static/support/atr_logo_white.png" alt="ATR" height="60"></a>
  <a href="https://www.univ-smb.fr"><img src="docs/source/_static/support/usmb_dark.png" alt="USMB" height="60"></a>
</p>

---

## License

This project is licensed under the BSD 3-Clause License, unless the header of the code section explicitly states otherwise. See [LICENSE.txt](LICENSE.txt) for details.