Congrads is a Python toolbox that brings constraint-guided gradient descent capabilities to your machine learning projects. Built for seamless integration with PyTorch, Congrads lets you enhance training and optimization by incorporating constraints directly into your training pipeline.
Whether you're working with simple inequality constraints, combinations of input-output relations, or custom constraint formulations, Congrads provides the tools and flexibility needed to build more robust and generalized models.
Notice: All previous v1.x releases are yanked. The library is still in active development, and backwards compatibility is not guaranteed. Please use the new v0.x series for ongoing updates.
- Constraint-Guided Training: Add constraints to guide the optimization process, ensuring that your model generalizes better by trying to satisfy the constraints.
- Flexible Constraint Definition: Define constraints on inputs, outputs, or combinations thereof, using an intuitive and extensible interface. Make use of pre-programmed constraint classes or write your own (see the short sketch after this list).
- Seamless PyTorch Integration: Use Congrads within your existing PyTorch workflows with minimal setup.
- Flexible and Extensible: Write your own custom networks, constraints, and dataset classes to easily extend the functionality of the toolbox.
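For instance, the pre-built ScalarConstraint and BinaryConstraint classes used in the quickstart below can express bounds on a single named output as well as relations between two outputs. The following is a minimal sketch; it assumes the quickstart setup below (the Congrads constraint classes are imported and a Descriptor mapping the names "Tmin" and "Tmax" to output neurons has been attached):

```python
# Minimal sketch; assumes the quickstart setup below, i.e. the Congrads
# constraint classes are imported and a Descriptor mapping "Tmin" and "Tmax"
# to output neurons has been attached to Constraint.
constraints = [
    ScalarConstraint("Tmin", ">=", 0),      # lower bound on a single output
    ScalarConstraint("Tmin", "<=", 1),      # upper bound on the same output
    BinaryConstraint("Tmax", ">", "Tmin"),  # relation between two outputs
]
```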
First, install PyTorch, since Congrads builds directly on its deep learning framework. Please refer to PyTorch's getting started guide, and make sure to install with CUDA support if you want to train on a GPU (a short snippet to verify your setup follows the dependency list below).
Next, install the Congrads toolbox. The recommended way to install it is to use pip:
```bash
pip install congrads
```

You can also install Congrads together with the extra packages required to run the examples:

```bash
pip install "congrads[examples]"
```

This should automatically install all required dependencies for you. If you would like to install the dependencies manually, Congrads depends on the following:
- Python 3.11 - 3.13
- PyTorch (install with CUDA support for GPU training, refer to PyTorch's getting started guide)
- NumPy (install with `pip install numpy`, or refer to NumPy's install guide)
- Pandas (install with `pip install pandas`, or refer to Pandas' install guide)
- Tqdm (install with `pip install tqdm`)
- Torchvision (install with `pip install torchvision`)
- Optional: Tensorboard (install with `pip install tensorboard`)
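After installing, you can quickly check that PyTorch sees your GPU before running the examples. This is a plain PyTorch snippet, not part of the Congrads API:

```python
import torch

# Report whether a CUDA-capable GPU is visible to PyTorch; the quickstart
# below falls back to CPU training when no GPU is available.
if torch.cuda.is_available():
    print(f"CUDA available: {torch.cuda.get_device_name(0)}")
else:
    print("CUDA not available; training will run on the CPU.")
```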
Before diving into the toolbox, it is recommended to familiarize yourself with Congrads' core concepts. Please read the documentation at https://congrads.readthedocs.io/en/latest/ to get up to speed.
Below is a basic example that illustrates how to work with the Congrads toolbox. For additional examples, refer to the examples and notebooks folders in the repository.
```python
import torch
from torch.nn import MSELoss
from torch.optim import Adam

# The Congrads classes used below (datasets, networks, Descriptor, constraints,
# MetricManager, CongradsCore, etc.) come from the congrads package; see the
# examples folder for the exact import statements.

# Select the GPU if available, otherwise fall back to the CPU.
use_cuda = torch.cuda.is_available()
device = torch.device("cuda:0" if use_cuda else "cpu")

# Download and preprocess the BiasCorrection dataset.
data = BiasCorrection(
    "./datasets", preprocess_BiasCorrection, download=True
)

# Split the dataset into train, validation, and test data loaders.
loaders = split_data_loaders(
    data,
    loader_args={"batch_size": 100, "shuffle": True},
    valid_loader_args={"shuffle": False},
    test_loader_args={"shuffle": False},
)

# Build an MLP with 25 inputs, 2 outputs, and 3 hidden layers of 35 units.
network = MLPNetwork(25, 2, n_hidden_layers=3, hidden_dim=35)
network = network.to(device)

# Standard PyTorch loss function and optimizer.
criterion = MSELoss()
optimizer = Adam(network.parameters(), lr=0.001)

# Describe the output layer so constraints can refer to neurons by name.
descriptor = Descriptor()
descriptor.add_layer("output")
descriptor.add("Tmax", "output", 0)
descriptor.add("Tmin", "output", 1)

# Attach the descriptor and device to the constraint classes.
Constraint.descriptor = descriptor
Constraint.device = device

# Keep both outputs in [0, 1] and require Tmax to exceed Tmin.
constraints = [
    ScalarConstraint("Tmin", ">=", 0),
    ScalarConstraint("Tmin", "<=", 1),
    ScalarConstraint("Tmax", ">=", 0),
    ScalarConstraint("Tmax", "<=", 1),
    BinaryConstraint("Tmax", ">", "Tmin"),
]

# Collect metrics during training.
metric_manager = MetricManager()

# Assemble the constraint-guided training core. Note that checkpoint_manager
# is assumed to have been created with the toolbox's checkpointing utilities
# beforehand (see the examples folder); it is not constructed in this snippet.
core = CongradsCore(
    descriptor,
    constraints,
    loaders,
    network,
    criterion,
    optimizer,
    metric_manager,
    device,
    checkpoint_manager,
)

# Train for at most 50 epochs.
core.fit(max_epochs=50)
```

- Optimization with Domain Knowledge: Ensure outputs meet real-world restrictions or safety standards.
- Improve Training Process: Inject domain knowledge into the training stage, increasing learning efficiency.
- Physics-Informed Neural Networks (PINNs): Enforce physical laws as constraints in your models (coming soon).
- Add ODE/PDE constraints to support PINNs
- Rework callback system
- Add support for a constraint parser that can interpret equations
If you make use of this package or its concepts in your research, please consider citing the following papers.
- Van Baelen, Q., & Karsmakers, P. (2023). Constraint guided gradient descent: Training with inequality constraints with applications in regression and semantic segmentation. Neurocomputing, 556, 126636. doi:10.1016/j.neucom.2023.126636 [ pdf | bibtex ]
- Meire, M., Van Baelen, Q., Ooijevaar, T., & Karsmakers, P. (2026). Constraint Guided Recurrent Convolutional AutoEncoders for Condition Indicator Estimation. In: Proceedings of ESANN. Presented at the European Symposium on Artificial Neural Networks, Computational Intelligence and Machine Learning, Bruges, Belgium.
- Rombouts, W. (corresponding author), Van Baelen, Q., & Karsmakers, P. (2025). Constraint-Guided PINNs: A Constrained Optimization Approach. In: CEUR Workshop Proceedings, vol. 4125, 124-130. Presented at the ECAI workshop ANSyA, Bologna, Italy, 25-30 Oct 2025. [ pdf | bibtex ]
- Tefera, Y. (corresponding author), Van Baelen, Q., Meire, M., Luca, S., & Karsmakers, P. (2025). Constraint-Guided Learning of Data-Driven Health Indicator Models: An Application on Bearings. International Journal of Prognostics and Health Management, 16(2), Art. No. 4268. doi:10.36001/IJPHM.2025.v16i2.4268 [ pdf | bibtex ]
- Meire, M. (corresponding author), Van Baelen, Q., Ooijevaar, T., & Karsmakers, P. (2025). Constraint guided autoencoders for joint optimization of condition indicator estimation and anomaly detection in machine condition monitoring. Machine Learning, 114(7), Art. No. 153. doi:10.1007/s10994-025-06779-0 [ pdf | bibtex ]
- Meire, M., Van Baelen, Q., Ooijevaar, T., & Karsmakers, P. (2023). Constraint-Guided Autoencoders to Enforce a Predefined Threshold on Anomaly Scores: An Application in Machine Condition Monitoring. Journal of Dynamics, Monitoring and Diagnostics, 2(2), 144-154. doi:10.37965/jdmd.2023.234 [ pdf | bibtex ]
We welcome contributions to Congrads! Whether you want to report issues, suggest features, or contribute code, feel free to open an issue or pull request.
Congrads is licensed under the 3-Clause BSD License. We encourage companies interested in collaborating on a specific topic to contact the authors for more information or to set up joint research projects.
Feel free to contact any of the people listed below for more information or details about the project. Companies interested in a collaboration or in setting up joint research projects are also encouraged to get in touch with us.
Below is a list of people who contributed to making the toolbox. Feel free to contact them with any repository- or code-specific questions, suggestions, or remarks.
- Wout Rombouts [ email | github profile ]
- Quinten Van Baelen [ email | github profile ]
Elevate your neural networks with Congrads! 🚀