
RecurrentLayers.jl

RecurrentLayers.jl extends the recurrent layers offered by Flux.jl, providing implementations of additional recurrent layers not available in base deep learning libraries.

Features 🚀

The package offers multiple layers for Flux.jl. Currently there are 30+ cells implemented, together with multiple higher-level layer wrappers (a short usage sketch follows the table):

| Short name | Publication venue | Official implementation |
|---|---|---|
| AntisymmetricRNN/GatedAntisymmetricRNN | ICLR 2019 | |
| ATR | EMNLP 2018 | bzhangGo/ATR |
| BR/BRC | PLOS ONE 2021 | nvecoven/BRC |
| CFN | ICLR 2017 | |
| coRNN | ICLR 2021 | tk-rusch/coRNN |
| FastRNN/FastGRNN | NeurIPS 2018 | Microsoft/EdgeML |
| IndRNN | CVPR 2018 | Sunnydreamrain/IndRNN_Theano_Lasagne |
| JANET | arXiv 2018 | JosvanderWesthuizen/janet |
| LEM | ICLR 2022 | tk-rusch/LEM |
| LiGRU | IEEE Transactions on Emerging Topics in Computing 2018 | mravanelli/theano-kaldi-rnn |
| LightRU | MDPI Electronics 2023 | |
| MinimalRNN | NeurIPS 2017 | |
| MultiplicativeLSTM | ICLR 2017 Workshop | benkrause/mLSTM |
| MGU | International Journal of Automation and Computing 2016 | |
| MUT1/MUT2/MUT3 | ICML 2015 | |
| NAS | arXiv 2016 | tensorflow_addons/rnn |
| OriginalLSTM | Neural Computation 1997 | - |
| PeepholeLSTM | JMLR 2002 | |
| RAN | arXiv 2017 | kentonl/ran |
| RHN | ICML 2017 | jzilly/RecurrentHighwayNetworks |
| SCRN | ICLR 2015 | facebookarchive/SCRNNs |
| SGRN | IET 2018 | |
| STAR | IEEE Transactions on Pattern Analysis and Machine Intelligence 2022 | 0zgur0/STAckable-Recurrent-network |
| Typed RNN / GRU / LSTM | ICML 2016 | |
| UGRNN | ICLR 2017 | - |
| UnICORNN | ICML 2021 | tk-rusch/unicornn |
| WMCLSTM | Neural Networks 2021 | |
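
The sketch below illustrates the distinction between a single-step cell and its full-sequence wrapper, assuming the package mirrors Flux's cell/layer naming (e.g. `MGUCell` for one step, `MGU` for a whole sequence). The constructor signatures shown are assumptions based on Flux's usual API; check the documentation for the exact forms.

```julia
using RecurrentLayers

# Hedged sketch: names and call signatures assume the Flux-style cell/layer split.
input_size, hidden_size = 3, 5

cell = MGUCell(input_size => hidden_size)   # single-step cell
x_t = rand(Float32, input_size)             # input at one time step
h0 = zeros(Float32, hidden_size)            # initial hidden state
h1 = cell(x_t, h0)                          # one recurrence step

layer = MGU(input_size => hidden_size)      # full-sequence wrapper
x_seq = rand(Float32, input_size, 12)       # (features, time steps)
h_seq = layer(x_seq)                        # hidden states for every step
```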

Installation 💻

You can install RecurrentLayers using either of:

```julia
using Pkg
Pkg.add("RecurrentLayers")
```

or, from the Julia REPL's package mode:

```julia-repl
julia> ]
pkg> add RecurrentLayers
```

Getting started 🛠️

The workflow is identical to that of any other recurrent Flux layer: just plug one of the new recurrent layers into your model and try it out! A minimal sketch is shown below.
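
As a hedged example, the snippet below drops one of the package's layers into a Flux `Chain`. The layer name (`MGU`) and the `(in => out)` constructor are assumptions based on the cell list above and Flux's usual API; any other cell from the table should slot in the same way.

```julia
using Flux, RecurrentLayers

# Minimal sketch: layer name and constructor are assumptions, see the docs.
input_size, hidden_size = 4, 8

model = Chain(
    MGU(input_size => hidden_size),   # recurrent layer over the full sequence
    x -> x[:, end, :],                # keep only the last hidden state
    Dense(hidden_size => 1),
)

x = rand(Float32, input_size, 10, 16) # (features, time steps, batch)
y = model(x)                          # (1, batch) predictions
```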

Citation

If you use RecurrentLayers.jl in your work, please consider citing

```bibtex
@misc{martinuzzi2025unified,
  doi = {10.48550/ARXIV.2510.21252},
  url = {https://arxiv.org/abs/2510.21252},
  author = {Martinuzzi, Francesco},
  keywords = {Machine Learning (cs.LG), Software Engineering (cs.SE), FOS: Computer and information sciences},
  title = {Unified Implementations of Recurrent Neural Networks in Multiple Deep Learning Frameworks},
  publisher = {arXiv},
  year = {2025},
  copyright = {Creative Commons Attribution 4.0 International}
}
```

License 📜

This project is licensed under the MIT License, except for nas_cell.jl, which is licensed under the Apache License, Version 2.0.

  • nas_cell.jl is a reimplementation of the NASCell from TensorFlow and is licensed under the Apache License 2.0. See the file header and LICENSE-APACHE for details.
  • All other files are licensed under the MIT License. See LICENSE-MIT for details.

See also

LuxRecurrentLayers.jl: Equivalent library, providing recurrent layers for Lux.jl.

torchrecurrent: Recurrent layers for PyTorch.

ReservoirComputing.jl: Reservoir computing utilities for scientific machine learning; essentially neural networks trained without gradients.