LuxRecurrentLayers.jl extends the recurrent layer offering of Lux.jl by providing implementations of additional recurrent layers not available in base deep learning libraries.
LuxRecurrentLayers.jl is registered in the General registry! To install it, use either of:

```julia
julia> ]
Pkg> add LuxRecurrentLayers
```

or

```julia
using Pkg
Pkg.add("LuxRecurrentLayers")
```

LuxRecurrentLayers aims to provide additional recurrent layers to use with Lux.jl. All the cells are easy drop-in replacements for any existing Lux workflow! More specifically, this library provides:
- 25+ additional cells that provide alternative computation to the default `RNNCell`, `GRUCell`, and `LSTMCell`.
- Additional toggles and options per cell.
- Additional, more complex layers and wrappers.
```julia
using Lux, LuxRecurrentLayers, Random

# Seeding
rng = Random.default_rng()
Random.seed!(rng, 0)

# Define the recurrent model (a cell in this case)
rnn = AntisymmetricRNNCell(3 => 5)

# Get parameters and states
ps, st = Lux.setup(rng, rnn)

# Random input
inp = rand(Float32, 3)

# Forward pass with random input
output, st = Lux.apply(rnn, inp, ps, st)
```
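Since the cells are drop-in replacements for Lux's own, they should also compose with Lux's sequence-processing wrappers. The following is a minimal sketch, assuming `AntisymmetricRNNCell` conforms to the standard Lux cell interface (as the drop-in claim suggests), that uses `Lux.Recurrence` to run a cell over a full input sequence:

```julia
using Lux, LuxRecurrentLayers, Random

rng = Random.default_rng()
Random.seed!(rng, 0)

# Wrap the cell in Lux's Recurrence to fold it over a sequence
model = Recurrence(AntisymmetricRNNCell(3 => 5))
ps, st = Lux.setup(rng, model)

# Input of shape (features, timesteps, batch_size); Lux's default
# BatchLastIndex ordering expects the batch dimension last
seq = rand(Float32, 3, 10, 2)

# Returns the cell output at the final timestep, here of shape (5, 2)
out, st = Lux.apply(model, seq, ps, st)
```

Passing `return_sequence=true` to `Recurrence` returns the outputs at every timestep instead of only the last one.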
See also:

- RecurrentLayers.jl: Equivalent library, providing recurrent layers for Flux.jl.
- ReservoirComputing.jl: Reservoir computing utilities for scientific machine learning; essentially gradient-free trained neural networks.
