LSTM with working memory

A Pulver, S Lyu - … Joint Conference on Neural Networks (IJCNN …, 2017 - ieeexplore.ieee.org
… begin with a review of LSTM. Since nearly all modern implementations use forget gates, we
will call LSTM with forget gates standard LSTM or just LSTM. The term “memory cell” will be …
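The "standard LSTM" this snippet refers to adds a forget gate to the original formulation, so the memory cell state is a gated blend of its previous value and a candidate update. A minimal single-step sketch (illustrative weights and shapes, not from the paper):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b):
    """One time step of a standard LSTM cell (with forget gate).

    W: (4*H, D+H) stacked weights for [input gate, forget gate,
       candidate, output gate]; b: (4*H,) stacked biases.
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    i = sigmoid(z[0 * H:1 * H])    # input gate
    f = sigmoid(z[1 * H:2 * H])    # forget gate: the "standard LSTM" addition
    g = np.tanh(z[2 * H:3 * H])    # candidate cell update
    o = sigmoid(z[3 * H:4 * H])    # output gate
    c = f * c_prev + i * g         # new cell ("memory cell") state
    h = o * np.tanh(c)             # new hidden state
    return h, c

# Tiny usage example with random weights (hypothetical sizes D=3, H=2)
rng = np.random.default_rng(0)
D, H = 3, 2
W = rng.standard_normal((4 * H, D + H)) * 0.1
b = np.zeros(4 * H)
h, c = lstm_step(rng.standard_normal(D), np.zeros(H), np.zeros(H), W, b)
```

With the forget gate `f` fixed at 1 and `i` at 1, the update reduces to the original constant-error-carousel formulation discussed in the Staudemeyer and Morris tutorial below.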

Working memory connections for LSTM

F Landi, L Baraldi, M Cornia, R Cucchiara - Neural Networks, 2021 - Elsevier
… We show through extensive experimental evaluation that Working Memory Connections
constantly improve the performance of LSTMs on a variety of tasks. Numerical results suggest …

Cognitive analysis of working memory load from EEG, by a deep recurrent neural network

S Kuanar, V Athitsos, N Pradhan… - … on Acoustics, Speech …, 2018 - ieeexplore.ieee.org
… Given our dataset limits, we used two LSTM layers, each with 64 memory cells. The
complete LSTM sequence of frames was propagated to the FC layer (Figure 5) and the prediction was …

A robust deep neural network for denoising task-based fMRI data: An application to working memory and episodic memory

Z Yang, X Zhuang, K Sreenivasan, V Mishra… - Medical Image …, 2020 - Elsevier
memory (LSTM) layer, one time-distributed fully-connected layer, and one unconventional
selection layer in sequential order. The LSTM … weights the output of the LSTM layer, and the …

Understanding LSTM--a tutorial into long short-term memory recurrent neural networks

RC Staudemeyer, ER Morris - arXiv preprint arXiv:1909.09586, 2019 - arxiv.org
… In order to preserve the CEC in LSTM memory block cells, the original formulation of LSTM
used a combination of two learning algorithms: BPTT to train network components located …

Decoding working memory load from EEG with LSTM networks

S Goldstein, Z Hu, M Ding - arXiv preprint arXiv:1910.05621, 2019 - arxiv.org
… Using SVM to define four time periods of interest, which corresponded to different stages of
the working memory process, we then applied LSTM-RNN to find that the decoding accuracy …

SAM: a unified self-adaptive multicompartmental spiking neuron model for learning with working memory

S Yang, T Gao, J Wang, B Deng, MR Azghadi… - Frontiers in …, 2022 - frontiersin.org
… Learning with working memory has been implemented by LSTM models in the field of artificial
intelligence, but the neuron-level mechanisms underlying this formulation have not been …

How working memory and reinforcement learning are intertwined: A cognitive, neural, and computational perspective

AH Yoo, AGE Collins - Journal of cognitive neuroscience, 2022 - direct.mit.edu
Reinforcement learning and working memory are two core processes of human cognition and
are often considered cognitively, neuroscientifically, and algorithmically distinct. Here, we …

Making working memory work: a computational model of learning in the prefrontal cortex and basal ganglia

RC O'Reilly, MJ Frank - Neural computation, 2006 - direct.mit.edu
… Moreover, the similar good performance of PBWM and LSTM models across a range of
complex tasks clearly demonstrates the advantages of dynamic gating systems for …

Do RNN and LSTM have long memory?

J Zhao, F Huang, J Lv, Y Duan, Z Qin… - International …, 2020 - proceedings.mlr.press
… memory filter structure into RNN and LSTM and introduce Memory-augmented RNN (MRNN)
and Memory-… Thus, we propose to revise the cell states of LSTM by adding the long memory …