# LuxRecurrentLayers.jl

LuxRecurrentLayers.jl extends the recurrent layer offering of Lux.jl, providing implementations of additional recurrent layers not available in base deep learning libraries.

## Installation

LuxRecurrentLayers.jl is registered in the Julia General registry. To install it, use either of the following:

```julia-repl
julia> ]
pkg> add LuxRecurrentLayers
```

or

```julia
using Pkg
Pkg.add("LuxRecurrentLayers")
```

## Features

LuxRecurrentLayers.jl provides additional recurrent layers to use with Lux.jl. All cells are easy drop-in replacements in any existing Lux workflow. More specifically, this library provides:

- 25+ additional cells offering alternative computations to the default `RNNCell`, `GRUCell`, and `LSTMCell`.
- Additional toggles and options per cell.
- More complex layers and wrappers.

## Quick Example

```julia
using Lux, LuxRecurrentLayers, Random

# Seeding
rng = Random.default_rng()
Random.seed!(rng, 0)

# Define the recurrent model (a cell in this case)
rnn = AntisymmetricRNNCell(3 => 5)

# Get parameters and states
ps, st = Lux.setup(rng, rnn)

# Random input
inp = rand(Float32, 3)

# Forward pass with random input
output, st = Lux.apply(rnn, inp, ps, st)
```
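A single cell call processes one timestep; to run over a whole sequence, the returned carry is threaded back into the next call. The sketch below assumes the cells here follow Lux's standard recurrent cell calling convention (as `Lux.LSTMCell` and `Lux.RNNCell` do), where the first call takes the raw input and subsequent calls take an `(input, carry)` tuple; the sequence length of 4 is arbitrary:

```julia
using Lux, LuxRecurrentLayers, Random

rng = Random.default_rng()
Random.seed!(rng, 0)

rnn = AntisymmetricRNNCell(3 => 5)
ps, st = Lux.setup(rng, rnn)

# A toy sequence of 4 timesteps, each a length-3 Float32 vector
seq = [rand(rng, Float32, 3) for _ in 1:4]

# First step: the cell initializes its own hidden state
(y, carry), st = rnn(seq[1], ps, st)

# Later steps: pass the previous carry back in alongside the input
for x in seq[2:end]
    (y, carry), st = rnn((x, carry), ps, st)
end

# `y` now holds the length-5 output for the final timestep
```

For sequence-level processing without the manual loop, Lux's `Recurrence` wrapper can also be placed around a cell.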

## See also

- RecurrentLayers.jl: equivalent library, providing recurrent layers for Flux.jl.
- ReservoirComputing.jl: reservoir computing utilities for scientific machine learning; essentially, neural networks trained without gradient-based optimization.