congrads 0.1.0__py3-none-any.whl → 0.2.0__py3-none-any.whl
This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
- congrads/__init__.py +7 -6
- congrads/constraints.py +182 -300
- congrads/core.py +158 -144
- congrads/datasets.py +12 -559
- congrads/descriptor.py +20 -35
- congrads/metrics.py +37 -52
- congrads/networks.py +5 -6
- congrads/utils.py +310 -0
- congrads-0.2.0.dist-info/LICENSE +26 -0
- congrads-0.2.0.dist-info/METADATA +222 -0
- congrads-0.2.0.dist-info/RECORD +13 -0
- congrads/learners.py +0 -233
- congrads-0.1.0.dist-info/LICENSE +0 -34
- congrads-0.1.0.dist-info/METADATA +0 -196
- congrads-0.1.0.dist-info/RECORD +0 -13
- {congrads-0.1.0.dist-info → congrads-0.2.0.dist-info}/WHEEL +0 -0
- {congrads-0.1.0.dist-info → congrads-0.2.0.dist-info}/top_level.txt +0 -0
congrads-0.1.0.dist-info/METADATA
DELETED

@@ -1,196 +0,0 @@
Metadata-Version: 2.1
Name: congrads
Version: 0.1.0
Summary: A toolbox for using Constraint Guided Gradient Descent when training neural networks.
Author-email: Wout Rombouts <wout.rombouts@kuleuven.be>, Quinten Van Baelen <quinten.vanbaelen@kuleuven.be>
License: MIT License

Copyright (c) 2024 DTAI - KU Leuven

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

"Commons Clause" License Condition v1.0

The Software is provided to you by the Licensor under the License, as defined below, subject to the following condition.

Without limiting other conditions in the License, the grant of rights under the License will not include, and the License does not grant to you, the right to Sell the Software.

For purposes of the foregoing, "Sell" means practicing any or all of the rights granted to you under the License to provide to third parties, for a fee or other consideration (including without limitation fees for hosting or consulting/ support services related to the Software), a product or service whose value derives, entirely or substantially, from the functionality of the Software. Any license notice or attribution required by the License must also include this Commons Clause License Condition notice.

Software: All CGGD-Toolbox associated files.
License: MIT
Licensor: DTAI - KU Leuven
Requires-Python: >=3.9
Description-Content-Type: text/markdown
License-File: LICENSE
Requires-Dist: torch>=1.12.0
Requires-Dist: pytorch-lightning>=2.0.0
Requires-Dist: pandas>=2.2.2
Requires-Dist: numpy>=1.26.4

# Congrads

**Congrads** is a Python toolbox that brings **constraint-guided gradient descent** capabilities to your machine learning projects. Built for seamless integration with PyTorch and PyTorch Lightning, Congrads lets you enhance training and optimization by incorporating constraints into your training pipeline.

Whether you're working with simple inequality constraints, combinations of input-output relations, or custom constraint formulations, Congrads provides the tools and flexibility needed to build more robust and generalized models.

> <strong>Note:</strong> The Congrads toolbox is currently in alpha phase. Expect significant changes, potential bugs, and incomplete features as we continue to develop and improve its functionality. Feedback is highly appreciated during this phase to help us refine the toolbox and ensure its reliability in later stages.

## Key Features

- **Constraint-Guided Training**: Add constraints that guide the optimization process, helping your model generalize better by trying to satisfy them.
- **Flexible Constraint Definition**: Define constraints on inputs, outputs, or combinations thereof, using an intuitive and extendable interface. Use the pre-programmed constraint classes or write your own.
- **Seamless PyTorch Integration**: Use Congrads within your existing PyTorch workflows with minimal setup.
- **PyTorch Lightning Support**: Easily plug into PyTorch Lightning projects for scalable and structured model training.
- **Flexible and Extensible**: Write your own custom network, constraint, and dataset classes to easily extend the functionality of the toolbox.

## Installation

Currently, the **Congrads** toolbox can only be installed using pip. We plan to expand to other package managers, such as conda, later.

```bash
pip install congrads
```

## Getting Started

### 1. **Prerequisites**

Before you can use **Congrads**, make sure you have the following installed:

- Python 3.9+
- **PyTorch** (install with CUDA support for GPU training; refer to the [getting started guide](https://pytorch.org/get-started/locally/))
- **PyTorch Lightning** (preferred version 2.4, [installation guide](https://lightning.ai/docs/pytorch/stable/starter/installation.html))

### 2. **Installation**

Install **Congrads** via pip:

```bash
pip install congrads
```

### 3. **Basic Usage**

#### 1. Import the toolbox

```python
from operator import gt, le  # comparison operators used when defining constraints

from congrads.descriptor import Descriptor
from congrads.constraints import Constraint, ScalarConstraint, BinaryConstraint
from congrads.learners import Learner
```

#### 2. Instantiate and configure descriptor

The descriptor describes your specific use case. It assigns names to specific neurons so you can easily reference them when defining constraints. By setting flags, you can specify whether a layer is fixed (constant) or whether it is an output layer.

```python
# Descriptor setup
descriptor = Descriptor()
descriptor.add("input", ["I1", "I2", "I3", "I4"], constant=True)
descriptor.add("output", ["O1", "O2"], output=True)
```
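For intuition, the descriptor's bookkeeping can be pictured as a name-to-index lookup per layer. The stand-in below is hypothetical (`ToyDescriptor` and `locate` are invented for illustration and are not the real `congrads` API):

```python
# Hypothetical stand-in, NOT the real congrads Descriptor: records, per layer,
# the neuron names plus the constant/output flags, and resolves a neuron name
# to (layer name, column index within that layer's tensor).
class ToyDescriptor:
    def __init__(self):
        self.layers = {}

    def add(self, layer, names, constant=False, output=False):
        self.layers[layer] = {"names": names, "constant": constant, "output": output}

    def locate(self, name):
        # Find which layer a neuron name belongs to and its position there.
        for layer, spec in self.layers.items():
            if name in spec["names"]:
                return layer, spec["names"].index(name)
        raise KeyError(name)

d = ToyDescriptor()
d.add("input", ["I1", "I2", "I3", "I4"], constant=True)
d.add("output", ["O1", "O2"], output=True)
print(d.locate("O2"))  # -> ('output', 1)
```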

#### 3. Define constraints on your network

You can define constraints on your network using the names previously configured in the descriptor. A set of predefined constraint classes can be used to define inequalities on input or output data.

```python
# Constraints definition
Constraint.descriptor = descriptor
constraints = [
    ScalarConstraint("O1", gt, 0),  # O1 > 0
    BinaryConstraint("O1", le, "O2"),  # O1 <= O2
]
```
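The `gt` and `le` arguments are assumed here to be Python's standard `operator.gt` and `operator.le`. The toy check below (the `satisfied` helper is hypothetical, not part of congrads) shows how such an operator-based inequality evaluates against a named sample:

```python
from operator import gt, le

# Toy constraint check, for illustration only: a constraint holds for a
# sample when op(lhs, rhs) is True. A string rhs names another neuron;
# any other rhs is treated as a scalar.
def satisfied(values, op, lhs, rhs):
    left = values[lhs]
    right = values[rhs] if isinstance(rhs, str) else rhs
    return op(left, right)

sample = {"O1": 0.3, "O2": 0.7}
print(satisfied(sample, gt, "O1", 0))     # O1 > 0   -> True
print(satisfied(sample, le, "O1", "O2"))  # O1 <= O2 -> True
```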

#### 4. Adjust network

Your regular PyTorch network can be used with this toolbox. We only require that the output of your model's forward pass is a dictionary of layers. The keys must match the descriptor settings.

```python
def forward(self, X):
    input = X
    output = self.out(self.hidden(self.input(X)))

    return {"input": input, "output": output}
```

You can then use your own network and directly assign it to the learner.

#### 5. Set up network and data

Next, instantiate the adjusted network and the data. At the moment, we require the data to be implemented as a `LightningDataModule` class.

```python
# Data and network setup
network = YourOwnNetwork(n_inputs=4, n_outputs=2, n_hidden_layers=3, hidden_dim=10)
data = YourOwnData(batch_size=100)
```

#### 6. Set up learner

You can specify your own loss function and optimizer, each with their own settings, to be used for training the model.

```python
# Learner setup
loss_function = MSELoss()
optimizer = Adam(network.parameters(), lr=0.001)

learner = Learner(network, descriptor, constraints, loss_function, optimizer)
```

#### 7. Set up trainer

Finally, set up a trainer to start the actual training of the model.

```python
# Trainer setup
trainer = Trainer(max_epochs=100)

# Train model
trainer.fit(learner, data)
```

## Example Use Cases

- **Optimization with Domain Knowledge**: Ensure outputs meet real-world restrictions or safety standards.
- **Physics-Informed Neural Networks (PINNs)**: Enforce physical laws as constraints in your models.
- **Improved Training**: Inject domain knowledge into the training stage, increasing learning efficiency.

## Roadmap

- [ ] Documentation and notebook examples
- [ ] Add support for a constraint parser that can interpret equations
- [ ] Add better handling of metric logging and visualization
- [ ] Evaluate whether PyTorch Lightning is preferable over plain PyTorch
- [ ] Determine whether it is feasible to add unit and/or functional tests

## Contributing

We welcome contributions to Congrads! Report problems, suggest features, or contribute code via issues and pull requests.

## License

Congrads is licensed under the [MIT License with a Commons Clause](LICENSE). This means you are free to use, modify, and distribute the software, but you may not sell it or offer it as part of a paid service without permission. We encourage companies interested in collaborating on a specific topic to contact the authors for more information.

---

Elevate your neural networks with Congrads! 🚀
congrads-0.1.0.dist-info/RECORD
DELETED

@@ -1,13 +0,0 @@
congrads/__init__.py,sha256=6d2_3fjj0R7r7jS76WRbOuzHlahPHhUvyqHHYG6lTMc,444
congrads/constraints.py,sha256=ppdVZNm6x8D8ggT-fhupaq5irWlKqZaGSH2Yp8Gl4f8,21787
congrads/core.py,sha256=guqMQuGZd-HtQkQ94IuFHSB371hmlnoQ_Jwf9d4Gra8,8714
congrads/datasets.py,sha256=epvAvi_2a5yeASkZyNY0rGGttA58vd-40f6anVHJtsk,29661
congrads/descriptor.py,sha256=Zfd42036OpD5Ts2pDGR0TZEChGLsnlGyLi-NrwaYaaU,2618
congrads/learners.py,sha256=-1uhQbWmtpI8xpz9P1KBSrgQRZQ8dURJJFcjsnyy__s,7740
congrads/metrics.py,sha256=FIhK-HElTlt6DxEojjU3pcnoBfcLQw7UFeL_qxOkP3s,2533
congrads/networks.py,sha256=LOLK7oVBP7hrBNcF6g0mx53lE2KySDX9l7LPXYR0kGo,3288
congrads-0.1.0.dist-info/LICENSE,sha256=ibijcrj5FKNWxQCp0P3ZKFAjKgs4G5rexUXSl9n5o88,2012
congrads-0.1.0.dist-info/METADATA,sha256=AC4YWubXlpJXuATzfZBJdqgi7N_KBdCsoYR7bVKlSiE,8767
congrads-0.1.0.dist-info/WHEEL,sha256=PZUExdf71Ui_so67QXpySuHtCi3-J3wvF4ORK6k_S8U,91
congrads-0.1.0.dist-info/top_level.txt,sha256=B8M9NmtHbmzp-3APHe4C0oo7aRIWRHWoba9FIy9XeYM,9
congrads-0.1.0.dist-info/RECORD,,

{congrads-0.1.0.dist-info → congrads-0.2.0.dist-info}/WHEEL
File without changes

{congrads-0.1.0.dist-info → congrads-0.2.0.dist-info}/top_level.txt
File without changes
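The 0.1.0 README recorded above centers on constraint-guided gradient descent. As rough intuition for that idea, here is a toy update rule in plain Python: the gradient step is nudged only while a constraint is violated. This is an illustrative sketch under assumed behavior, not congrads's actual algorithm, and every name in it (`cggd_step`, `guide`, the toy loss) is hypothetical:

```python
# Toy constraint-guided descent on two scalar "outputs" o1, o2 with a toy
# loss L = (o1 - 1)^2 + (o2 - 2)^2 and constraints o1 > 0 and o1 <= o2.
def cggd_step(o1, o2, lr=0.1, guide=0.5):
    # Plain gradient of the toy loss.
    g1, g2 = 2 * (o1 - 1), 2 * (o2 - 2)
    # Guidance terms: active only while a constraint is violated.
    if o1 <= 0:      # violates o1 > 0 -> push o1 upward
        g1 -= guide
    if o1 > o2:      # violates o1 <= o2 -> pull the pair together
        g1 += guide
        g2 -= guide
    return o1 - lr * g1, o2 - lr * g2

o1, o2 = -0.5, 0.0  # start in violation of o1 > 0
for _ in range(200):
    o1, o2 = cggd_step(o1, o2)
print(round(o1, 3), round(o2, 3))  # -> 1.0 2.0 (loss minimum; both constraints hold)
```

Here the guided run ends at the loss minimum (1, 2), which happens to satisfy both constraints; the guidance terms only matter during the early violated steps.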