Welcome to the repository for our paper *HadamRNN: Binary and Sparse Ternary Orthogonal RNNs*, accepted at ICLR 2025.
This repository contains the code and resources to build binary or ternary orthogonal RNNs and to run the experiments presented in the paper.

HadamRNN is a Python toolkit dedicated to binary and sparse ternary orthogonal Recurrent Neural Networks.
- 📚 Table of contents
- 🚀 Overview
- 📦 What's Included
- 👀 See Also
- 🙏 Acknowledgments
- 👨🎓 Creators
- 🗞️ Citation
- 📝 License
This code uses the PyTorch framework and the deel-torchlip library.
Other dependencies are described in the next section.
-
Clone the repository:
```bash
git clone https://github.com/deel-ai-papers/hadamRNN.git
cd hadamRNN
```
-
Create a virtual environment (optional but recommended):
```bash
python -m venv venv
source venv/bin/activate  # On Windows use `venv\Scripts\activate`
```
-
Install dependencies:
```bash
pip install -r requirements.txt
pip install deel-torchlip
```
See also the Readme.md files under config/ for additional dependencies required by specific experiments.
To train a model, use the train.py script located in the src/ directory:
```bash
python src/train.py --help
```
```
usage: train.py [-h] [--config CONFIG]

options:
  -h, --help       Show this help message and exit.
  --config CONFIG  Path to the configuration file.
```
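The command-line interface shown above can be reproduced with a few lines of `argparse`. The sketch below is an illustration of that interface, not the repository's actual code:

```python
import argparse

def build_parser():
    # Minimal parser mirroring the usage string above (illustrative only).
    parser = argparse.ArgumentParser(prog="train.py")
    parser.add_argument("--config", help="Path to the configuration file.")
    return parser

args = build_parser().parse_args(
    ["--config", "config/pmnist/pmnist_hadamRNN_paper.yaml"]
)
print(args.config)  # config/pmnist/pmnist_hadamRNN_paper.yaml
```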
For example:
```bash
python src/train.py --config config/pmnist/pmnist_hadamRNN_paper.yaml
```
This command trains a model using the configuration specified in `config/pmnist/pmnist_hadamRNN_paper.yaml`.
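Each YAML file groups the hyperparameters for one experiment. The fragment below is purely illustrative; the keys are hypothetical, and the actual files under config/ are the reference:

```yaml
# Hypothetical structure -- the real keys live in the files under config/.
dataset: pmnist
model:
  hidden_size: 256
  quantization: binary   # binary or sparse ternary recurrent weights
training:
  epochs: 100
  learning_rate: 1.0e-3
```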
```
.
├── config                   # all configurations, grouped by sub-directory
│   ├── pmnist/pmnist_hadamRNN_paper.yaml
│   ├── smnist/smnist_hadamRNN_paper.yaml
│   ├── copytask/copy_task_hadamRNN_paper.yaml
│   ├── imdb/imdb_hadamRNN_paper.yaml
│   ├── glue/sst2_hadamRNN_paper.yaml
│   └── glue/qqp_hadamRNN_paper.yaml
├── README.md
├── requirements.txt
└── src                      # all code is here
    ├── launch_train.py
    ├── config.py            # YAML config loading
    ├── getters.py           # from config to classes
    ├── quantized_layers.py  # Hadamard and quantized layers
    ├── training.py          # training and validation steps
    ├── utils.py
    ├── dataset_tools        # dataset management
    ├── extra_layers.py      # layers for post-training quantization of activations
    └── notebook/expe_quantif_activ_hadam.ipynb  # post-training quantization of activations
```
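As background on the Hadamard layers in `quantized_layers.py` (this sketch is ours, not the repository's code): a Sylvester-construction Hadamard matrix has entries that are exactly ±1, and rescaling it by 1/√n yields an orthogonal matrix, which is the property HadamRNN exploits to get binary orthogonal recurrent kernels:

```python
import numpy as np

def sylvester_hadamard(k):
    """Return the 2**k x 2**k Hadamard matrix via Sylvester's construction."""
    H = np.array([[1.0]])
    for _ in range(k):
        # Each step doubles the size: H_{2n} = [[H, H], [H, -H]].
        H = np.block([[H, H], [H, -H]])
    return H

n = 8
H = sylvester_hadamard(3)   # 8x8 matrix with entries exactly +1 / -1
W = H / np.sqrt(n)          # rescaled so that W @ W.T == I (orthogonal)
print(np.allclose(W @ W.T, np.eye(n)))  # True
```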
- config/: Configuration files for training models.
- requirements.txt: Python dependencies.
- src/: Source code directory.
  - launch_train.py: Main entry point for training.
  - training.py: Training and validation steps.
We welcome contributions to improve this repository. Please submit a pull request or open an issue to discuss your proposed changes.
This library is one approach among many.
Other tools proposed by the DEEL project:
- Xplique, a Python library exclusively dedicated to explaining neural networks.
- deel-lip, a Python library for training k-Lipschitz neural networks with TensorFlow and Keras 3.
- Influenciae, a Python toolkit dedicated to computing influence values for the discovery of potentially problematic samples in a dataset.
- deel-torchlip, a Python library for training k-Lipschitz neural networks with PyTorch.
- oodeel, a Python library for post-hoc deep OOD (Out-of-Distribution) detection on already-trained neural network image classifiers.
- The DEEL White Paper, a summary by the DEEL team of the challenges of certifiable AI and the role of data quality, representativity, and explainability for this purpose.
This code was created by Franck Mamalet and Armand Foucault.
If you use this code as part of your workflow in a scientific publication, please consider citing 🗞️ our paper:
```bibtex
@inproceedings{
foucault2025hadamrnn,
title={{HadamRNN}: {B}inary and {S}parse {T}ernary {O}rthogonal {RNN}s},
author={Armand Foucault and Francois Malgouyres and Franck Mamalet},
booktitle={The Thirteenth International Conference on Learning Representations},
year={2025},
}
```
The package is released under the MIT license.