congrads-0.1.0.tar.gz

This diff represents the content of publicly available package versions that have been released to one of the supported registries. The information contained in this diff is provided for informational purposes only and reflects changes between package versions as they appear in their respective public registries.
congrads-0.1.0/LICENSE ADDED
@@ -0,0 +1,34 @@
1
+ MIT License
2
+
3
+ Copyright (c) 2024 DTAI - KU Leuven
4
+
5
+ Permission is hereby granted, free of charge, to any person obtaining a copy
6
+ of this software and associated documentation files (the "Software"), to deal
7
+ in the Software without restriction, including without limitation the rights
8
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
9
+ copies of the Software, and to permit persons to whom the Software is
10
+ furnished to do so, subject to the following conditions:
11
+
12
+ The above copyright notice and this permission notice shall be included in all
13
+ copies or substantial portions of the Software.
14
+
15
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
16
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
17
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
18
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
19
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
20
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
21
+ SOFTWARE.
22
+
23
+
24
+ "Commons Clause" License Condition v1.0
25
+
26
+ The Software is provided to you by the Licensor under the License, as defined below, subject to the following condition.
27
+
28
+ Without limiting other conditions in the License, the grant of rights under the License will not include, and the License does not grant to you, the right to Sell the Software.
29
+
30
+ For purposes of the foregoing, "Sell" means practicing any or all of the rights granted to you under the License to provide to third parties, for a fee or other consideration (including without limitation fees for hosting or consulting/ support services related to the Software), a product or service whose value derives, entirely or substantially, from the functionality of the Software. Any license notice or attribution required by the License must also include this Commons Clause License Condition notice.
31
+
32
+ Software: All CGGD-Toolbox associated files.
33
+ License: MIT
34
+ Licensor: DTAI - KU Leuven
@@ -0,0 +1,196 @@
1
+ Metadata-Version: 2.1
2
+ Name: congrads
3
+ Version: 0.1.0
4
+ Summary: A toolbox for using Constraint Guided Gradient Descent when training neural networks.
5
+ Author-email: Wout Rombouts <wout.rombouts@kuleuven.be>, Quinten Van Baelen <quinten.vanbaelen@kuleuven.be>
6
+ License: MIT License
7
+
8
+ Copyright (c) 2024 DTAI - KU Leuven
9
+
10
+ Permission is hereby granted, free of charge, to any person obtaining a copy
11
+ of this software and associated documentation files (the "Software"), to deal
12
+ in the Software without restriction, including without limitation the rights
13
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
14
+ copies of the Software, and to permit persons to whom the Software is
15
+ furnished to do so, subject to the following conditions:
16
+
17
+ The above copyright notice and this permission notice shall be included in all
18
+ copies or substantial portions of the Software.
19
+
20
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
21
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
22
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
23
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
24
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
25
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
26
+ SOFTWARE.
27
+
28
+
29
+ "Commons Clause" License Condition v1.0
30
+
31
+ The Software is provided to you by the Licensor under the License, as defined below, subject to the following condition.
32
+
33
+ Without limiting other conditions in the License, the grant of rights under the License will not include, and the License does not grant to you, the right to Sell the Software.
34
+
35
+ For purposes of the foregoing, "Sell" means practicing any or all of the rights granted to you under the License to provide to third parties, for a fee or other consideration (including without limitation fees for hosting or consulting/ support services related to the Software), a product or service whose value derives, entirely or substantially, from the functionality of the Software. Any license notice or attribution required by the License must also include this Commons Clause License Condition notice.
36
+
37
+ Software: All CGGD-Toolbox associated files.
38
+ License: MIT
39
+ Licensor: DTAI - KU Leuven
40
+ Requires-Python: >=3.9
41
+ Description-Content-Type: text/markdown
42
+ License-File: LICENSE
43
+ Requires-Dist: torch>=1.12.0
44
+ Requires-Dist: pytorch-lightning>=2.0.0
45
+ Requires-Dist: pandas>=2.2.2
46
+ Requires-Dist: numpy>=1.26.4
47
+
48
+ # Congrads
49
+
50
+ **Congrads** is a Python toolbox that brings **constraint-guided gradient descent** capabilities to your machine learning projects. Built with seamless integration into PyTorch and PyTorch Lightning, Congrads empowers you to enhance the training and optimization process by incorporating constraints into your training pipeline.
51
+
52
+ Whether you're working with simple inequality constraints, combinations of input-output relations, or custom constraint formulations, Congrads provides the tools and flexibility needed to build more robust and generalized models.
53
+
54
+ > <strong>Note:</strong> The Congrads toolbox is currently in alpha phase. Expect significant changes, potential bugs, and incomplete features as we continue to develop and improve the functionality. Feedback is highly appreciated during this phase to help us refine the toolbox and ensure its reliability in later stages.
55
+
56
+ ## Key Features
57
+
58
+ - **Constraint-Guided Training**: Add constraints to guide the optimization process, ensuring that your model generalizes better by trying to satisfy the constraints.
59
+ - **Flexible Constraint Definition**: Define constraints on inputs, outputs, or combinations thereof, using an intuitive and extendable interface. Make use of pre-programmed constraint classes or write your own.
60
+ - **Seamless PyTorch Integration**: Use Congrads within your existing PyTorch workflows with minimal setup.
61
+ - **PyTorch Lightning Support**: Easily plug into PyTorch Lightning projects for scalable and structured model training.
62
+ - **Flexible and extensible**: Write your own custom networks, constraints, and dataset classes to easily extend the functionality of the toolbox.
63
+
64
+ ## Installation
65
+
66
+ Currently, the **Congrads** toolbox can only be installed using pip. We will later expand to other package managers such as conda.
67
+
68
+ ```bash
69
+ pip install congrads
70
+ ```
71
+
72
+ ## Getting Started
73
+
74
+ ### 1. **Prerequisites**
75
+
76
+ Before you can use **Congrads**, make sure you have the following installed:
77
+
78
+ - Python 3.9+
79
+ - **PyTorch** (install with CUDA support for GPU training, refer to the [getting started guide](https://pytorch.org/get-started/locally/))
80
+ - **PyTorch Lightning** (preferred version 2.4, [installation guide](https://lightning.ai/docs/pytorch/stable/starter/installation.html))
81
+
82
+ ### 2. **Installation**
83
+
84
+ Please install **Congrads** via pip:
85
+
86
+ ```bash
87
+ pip install congrads
88
+ ```
89
+
90
+ ### 3. **Basic Usage**
91
+
92
+ #### 1. Import the toolbox
93
+
94
+ ```python
95
+ from congrads.descriptor import Descriptor
96
+ from congrads.constraints import Constraint, ScalarConstraint, BinaryConstraint
97
+ from congrads.learners import Learner
+ from operator import gt, le  # standard-library comparison operators used below
98
+ ```
99
+
100
+ #### 2. Instantiate and configure descriptor
101
+
102
+ The descriptor describes your specific use case. It assigns names to specific neurons so you can easily reference them when defining constraints. By setting flags, you can specify whether a layer is constant (fixed) or an output layer.
103
+
104
+ ```python
105
+ # Descriptor setup
106
+ descriptor = Descriptor()
107
+ descriptor.add("input", ["I1", "I2", "I3", "I4"], constant=True)
108
+ descriptor.add("output", ["O1", "O2"], output=True)
109
+ ```
110
+
111
+ #### 3. Define constraints on your network
112
+
113
+ You can define constraints on your network using the names previously configured in the descriptor. A set of predefined constraint classes can be used to define inequalities on input or output data.
114
+
115
+ ```python
116
+ # Constraints definition
117
+ Constraint.descriptor = descriptor
118
+ constraints = [
119
+ ScalarConstraint("O1", gt, 0), # O1 > 0
120
+ BinaryConstraint("O1", le, "O2"), # O1 <= O2
121
+ ]
122
+ ```
123
+
124
+ #### 4. Adjust network
125
+
126
+ Your regular PyTorch network can be used with this toolbox. We only require that your model's forward pass returns a dictionary of layer outputs whose keys match the names configured in the descriptor.
127
+
128
+ ```python
129
+ def forward(self, X):
130
+ input = X
131
+ output = self.out(self.hidden(self.input(X)))
132
+
133
+ return {"input": input, "output": output}
134
+ ```
135
+
136
+ You can then use your own network and pass it directly to the learner.
137
+
138
+ #### 5. Set up network and data
139
+
140
+ Next, instantiate the adjusted network and the data. At the moment, we require the data to be implemented as a `LightningDataModule` class.
141
+
142
+ ```python
143
+ # Data and network setup
144
+ network = YourOwnNetwork(n_inputs=4, n_outputs=2, n_hidden_layers=3, hidden_dim=10)
145
+ data = YourOwnData(batch_size=100)
146
+ ```
147
+
148
+ #### 6. Set up learner
149
+
150
+ You can specify your own loss function and optimizer with their own settings to be used for learning the model.
151
+
152
+ ```python
153
+ from torch.nn import MSELoss
+ from torch.optim import Adam
+
+ # Learner setup
154
+ loss_function = MSELoss()
155
+ optimizer = Adam(network.parameters(), lr=0.001)
156
+
157
+ learner = Learner(network, descriptor, constraints, loss_function, optimizer)
158
+ ```
159
+
160
+ #### 7. Set up trainer
161
+
162
+ Finally, set up a trainer to start the actual training of the model.
163
+
164
+ ```python
165
+ from pytorch_lightning import Trainer
+
+ # Trainer setup
166
+ trainer = Trainer(max_epochs=100)
167
+
168
+ # Train model
169
+ trainer.fit(learner, data)
170
+ ```
171
+
172
+ ## Example Use Cases
173
+
174
+ - **Optimization with Domain Knowledge**: Ensure outputs meet real-world restrictions or safety standards.
175
+ - **Physics-Informed Neural Networks (PINNs)**: Enforce physical laws as constraints in your models.
176
+ - **Improve Training Process**: Inject domain knowledge in the training stage, increasing learning efficiency.
177
+
178
+ ## Roadmap
179
+
180
+ - [ ] Documentation and Notebook examples
181
+ - [ ] Add support for constraint parser that can interpret equations
182
+ - [ ] Add better handling of metric logging and visualization
183
+ - [ ] Revise whether PyTorch Lightning is preferable over plain PyTorch
184
+ - [ ] Determine if it is feasible to add unit and/or functional tests
185
+
186
+ ## Contributing
187
+
188
+ We welcome contributions to Congrads! Report issues, suggest features, or contribute code via issues and pull requests.
189
+
190
+ ## License
191
+
192
+ Congrads is licensed under the [MIT License with a Commons Clause](LICENSE). This means you are free to use, modify, and distribute the software, but you may not sell or offer it as part of a paid service without permission. We encourage companies that are interested in a collaboration for a specific topic to contact the authors for more information.
193
+
194
+ ---
195
+
196
+ Elevate your neural networks with Congrads! 🚀
@@ -0,0 +1,149 @@
1
+ # Congrads
2
+
3
+ **Congrads** is a Python toolbox that brings **constraint-guided gradient descent** capabilities to your machine learning projects. Built with seamless integration into PyTorch and PyTorch Lightning, Congrads empowers you to enhance the training and optimization process by incorporating constraints into your training pipeline.
4
+
5
+ Whether you're working with simple inequality constraints, combinations of input-output relations, or custom constraint formulations, Congrads provides the tools and flexibility needed to build more robust and generalized models.
6
+
7
+ > <strong>Note:</strong> The Congrads toolbox is currently in alpha phase. Expect significant changes, potential bugs, and incomplete features as we continue to develop and improve the functionality. Feedback is highly appreciated during this phase to help us refine the toolbox and ensure its reliability in later stages.
8
+
9
+ ## Key Features
10
+
11
+ - **Constraint-Guided Training**: Add constraints to guide the optimization process, ensuring that your model generalizes better by trying to satisfy the constraints.
12
+ - **Flexible Constraint Definition**: Define constraints on inputs, outputs, or combinations thereof, using an intuitive and extendable interface. Make use of pre-programmed constraint classes or write your own.
13
+ - **Seamless PyTorch Integration**: Use Congrads within your existing PyTorch workflows with minimal setup.
14
+ - **PyTorch Lightning Support**: Easily plug into PyTorch Lightning projects for scalable and structured model training.
15
+ - **Flexible and extensible**: Write your own custom networks, constraints, and dataset classes to easily extend the functionality of the toolbox.
16
+
17
+ ## Installation
18
+
19
+ Currently, the **Congrads** toolbox can only be installed using pip. We will later expand to other package managers such as conda.
20
+
21
+ ```bash
22
+ pip install congrads
23
+ ```
24
+
25
+ ## Getting Started
26
+
27
+ ### 1. **Prerequisites**
28
+
29
+ Before you can use **Congrads**, make sure you have the following installed:
30
+
31
+ - Python 3.9+
32
+ - **PyTorch** (install with CUDA support for GPU training, refer to the [getting started guide](https://pytorch.org/get-started/locally/))
33
+ - **PyTorch Lightning** (preferred version 2.4, [installation guide](https://lightning.ai/docs/pytorch/stable/starter/installation.html))
34
+
35
+ ### 2. **Installation**
36
+
37
+ Please install **Congrads** via pip:
38
+
39
+ ```bash
40
+ pip install congrads
41
+ ```
42
+
43
+ ### 3. **Basic Usage**
44
+
45
+ #### 1. Import the toolbox
46
+
47
+ ```python
48
+ from congrads.descriptor import Descriptor
49
+ from congrads.constraints import Constraint, ScalarConstraint, BinaryConstraint
50
+ from congrads.learners import Learner
+ from operator import gt, le  # standard-library comparison operators used below
51
+ ```
52
+
53
+ #### 2. Instantiate and configure descriptor
54
+
55
+ The descriptor describes your specific use case. It assigns names to specific neurons so you can easily reference them when defining constraints. By setting flags, you can specify whether a layer is constant (fixed) or an output layer.
56
+
57
+ ```python
58
+ # Descriptor setup
59
+ descriptor = Descriptor()
60
+ descriptor.add("input", ["I1", "I2", "I3", "I4"], constant=True)
61
+ descriptor.add("output", ["O1", "O2"], output=True)
62
+ ```
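Conceptually, the descriptor maps human-readable neuron names onto positions within a layer's tensor. The following toy snippet illustrates that bookkeeping idea only; it is not the real `Descriptor` implementation, and the `locate` helper is hypothetical:

```python
# Toy name-to-index bookkeeping, mirroring the descriptor setup above.
# Illustration only -- not the actual Congrads Descriptor internals.
groups = {
    "input": ["I1", "I2", "I3", "I4"],
    "output": ["O1", "O2"],
}

def locate(name):
    """Return (group, column index) for a named neuron."""
    for group, names in groups.items():
        if name in names:
            return group, names.index(name)
    raise KeyError(name)
```

With such a mapping, a constraint like `O1 <= O2` can be evaluated against the right columns of a batch of outputs without hard-coding indices.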
63
+
64
+ #### 3. Define constraints on your network
65
+
66
+ You can define constraints on your network using the names previously configured in the descriptor. A set of predefined constraint classes can be used to define inequalities on input or output data.
67
+
68
+ ```python
69
+ # Constraints definition
70
+ Constraint.descriptor = descriptor
71
+ constraints = [
72
+ ScalarConstraint("O1", gt, 0), # O1 > 0
73
+ BinaryConstraint("O1", le, "O2"), # O1 <= O2
74
+ ]
75
+ ```
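The comparison functions `gt` and `le` come from Python's standard `operator` module. As a plain-Python illustration, independent of Congrads, the two constraints above encode the following checks:

```python
from operator import gt, le

# Hypothetical model output values, for illustration only
outputs = {"O1": 0.5, "O2": 1.2}

# The same inequalities the constraints express: O1 > 0 and O1 <= O2
satisfied = gt(outputs["O1"], 0) and le(outputs["O1"], outputs["O2"])
```

During training, Congrads uses such constraints to guide the gradient updates rather than merely reporting violations.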
76
+
77
+ #### 4. Adjust network
78
+
79
+ Your regular PyTorch network can be used with this toolbox. We only require that your model's forward pass returns a dictionary of layer outputs whose keys match the names configured in the descriptor.
80
+
81
+ ```python
82
+ def forward(self, X):
83
+ input = X
84
+ output = self.out(self.hidden(self.input(X)))
85
+
86
+ return {"input": input, "output": output}
87
+ ```
88
+
89
+ You can then use your own network and pass it directly to the learner.
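The snippet above shows only the `forward` method. A self-contained sketch of such a network could look like the following; the layer names and sizes are illustrative assumptions, not requirements of Congrads:

```python
import torch
from torch import nn

class ToyNetwork(nn.Module):
    """Minimal sketch of a network whose forward pass returns a
    dictionary keyed by the descriptor group names ("input", "output")."""

    def __init__(self, n_inputs=4, n_outputs=2, hidden_dim=10):
        super().__init__()
        self.input = nn.Linear(n_inputs, hidden_dim)
        self.hidden = nn.Sequential(nn.Linear(hidden_dim, hidden_dim), nn.ReLU())
        self.out = nn.Linear(hidden_dim, n_outputs)

    def forward(self, X):
        # Return every group the descriptor names, keyed identically
        output = self.out(self.hidden(self.input(X)))
        return {"input": X, "output": output}
```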
90
+
91
+ #### 5. Set up network and data
92
+
93
+ Next, instantiate the adjusted network and the data. At the moment, we require the data to be implemented as a `LightningDataModule` class.
94
+
95
+ ```python
96
+ # Data and network setup
97
+ network = YourOwnNetwork(n_inputs=4, n_outputs=2, n_hidden_layers=3, hidden_dim=10)
98
+ data = YourOwnData(batch_size=100)
99
+ ```
100
+
101
+ #### 6. Set up learner
102
+
103
+ You can specify your own loss function and optimizer with their own settings to be used for learning the model.
104
+
105
+ ```python
106
+ from torch.nn import MSELoss
+ from torch.optim import Adam
+
+ # Learner setup
107
+ loss_function = MSELoss()
108
+ optimizer = Adam(network.parameters(), lr=0.001)
109
+
110
+ learner = Learner(network, descriptor, constraints, loss_function, optimizer)
111
+ ```
112
+
113
+ #### 7. Set up trainer
114
+
115
+ Finally, set up a trainer to start the actual training of the model.
116
+
117
+ ```python
118
+ from pytorch_lightning import Trainer
+
+ # Trainer setup
119
+ trainer = Trainer(max_epochs=100)
120
+
121
+ # Train model
122
+ trainer.fit(learner, data)
123
+ ```
124
+
125
+ ## Example Use Cases
126
+
127
+ - **Optimization with Domain Knowledge**: Ensure outputs meet real-world restrictions or safety standards.
128
+ - **Physics-Informed Neural Networks (PINNs)**: Enforce physical laws as constraints in your models.
129
+ - **Improve Training Process**: Inject domain knowledge in the training stage, increasing learning efficiency.
130
+
131
+ ## Roadmap
132
+
133
+ - [ ] Documentation and Notebook examples
134
+ - [ ] Add support for constraint parser that can interpret equations
135
+ - [ ] Add better handling of metric logging and visualization
136
+ - [ ] Revise whether PyTorch Lightning is preferable over plain PyTorch
137
+ - [ ] Determine if it is feasible to add unit and/or functional tests
138
+
139
+ ## Contributing
140
+
141
+ We welcome contributions to Congrads! Report issues, suggest features, or contribute code via issues and pull requests.
142
+
143
+ ## License
144
+
145
+ Congrads is licensed under the [MIT License with a Commons Clause](LICENSE). This means you are free to use, modify, and distribute the software, but you may not sell or offer it as part of a paid service without permission. We encourage companies that are interested in a collaboration for a specific topic to contact the authors for more information.
146
+
147
+ ---
148
+
149
+ Elevate your neural networks with Congrads! 🚀
@@ -0,0 +1,21 @@
1
+ # __init__.py
2
+
3
+ # Only expose the submodules, not individual classes
4
+ from . import core
5
+ from . import constraints
6
+ from . import datasets
7
+ from . import descriptor
8
+ from . import learners
9
+ from . import metrics
10
+ from . import networks
11
+
12
+ # Define __all__ to specify that the submodules are accessible, but not classes directly.
13
+ __all__ = [
14
+ "core",
15
+ "constraints",
16
+ "datasets",
17
+ "descriptor",
18
+ "learners",
19
+ "metrics",
20
+ "networks"
21
+ ]