The `LMU` layer currently accepts `return_sequences` and passes it through to `tf.keras.layers.RNN` along with the created `LMUCell`. However, there are a number of other RNN flags that are ignored (https://www.tensorflow.org/api_docs/python/tf/keras/layers/RNN):

- `return_state`
- `go_backwards`
- `stateful`
- `unroll`
- `time_major`
To use any of these additional flags, the `LMUCell` must currently be instantiated directly and passed to `tf.keras.layers.RNN` alongside the flags. Supporting the flags at the layer level would mirror the pattern for other recurrent layers such as `tf.keras.layers.LSTM`.