Long short-term memory (LSTM)

Long short-term memory (LSTM) is an artificial recurrent neural network (RNN) architecture used in the field of deep learning.
Unlike standard feedforward neural networks, LSTM has feedback connections. It can not only process single data points (such as images), but also entire sequences of data (such as speech or video).
For example, LSTM is applicable to tasks such as unsegmented, connected handwriting recognition, speech recognition, and anomaly detection in network traffic or in IDSs (intrusion detection systems).
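To make the feedback connections concrete, here is a minimal sketch of a single LSTM step in NumPy. The gate layout (input, forget, output gates plus the candidate cell state) follows the standard LSTM formulation; the function name `lstm_step` and the toy dimensions are illustrative, not from any particular library.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    W maps the concatenated [h_prev; x] vector to the four
    gate pre-activations, stacked as [input; forget; output; candidate].
    """
    z = W @ np.concatenate([h_prev, x]) + b
    H = h_prev.shape[0]
    i = sigmoid(z[0:H])        # input gate: how much new info enters the cell
    f = sigmoid(z[H:2*H])      # forget gate: how much old cell state is kept
    o = sigmoid(z[2*H:3*H])    # output gate: how much cell state is exposed
    g = np.tanh(z[3*H:4*H])    # candidate cell state
    c = f * c_prev + i * g     # new cell state (the recurrent "memory")
    h = o * np.tanh(c)         # new hidden state / output
    return h, c

# Run the cell over a toy sequence of 3 input vectors (hypothetical sizes).
rng = np.random.default_rng(0)
X_DIM, H_DIM = 4, 3
W = rng.standard_normal((4 * H_DIM, H_DIM + X_DIM)) * 0.1
b = np.zeros(4 * H_DIM)
h = np.zeros(H_DIM)
c = np.zeros(H_DIM)
for x in rng.standard_normal((3, X_DIM)):
    h, c = lstm_step(x, h, c, W, b)  # h, c carry state across time steps
```

The loop is what distinguishes this from a feedforward network: `h` and `c` produced at one time step are fed back in at the next, which is how the LSTM processes an entire sequence rather than a single data point.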

Extra reading:

Long short-term memory

Papers:

LSTM: A Search Space Odyssey : In this paper, the authors provide an ablation study of the main LSTM components (input gate, output gate, forget gate, peephole connections, activation functions) and show what each of these modules contributes to the final accuracy of LSTM. A must-read paper for anyone trying to dive deeper into the LSTM architecture.
