Recurrent Neural Network
www.cs.toronto.edu/~tingwuwang/rnn_tutorial

1. A new type of RNN cell (Gated Feedback Recurrent Neural Networks); a minimal code sketch of the cell follows this list.
   1. Very similar to the LSTM.
   2. It merges the cell state and hidden state.
   3. It combines the forget and input gates into a single "update gate".
   4. Computationally more efficient:
      1. fewer parameters, less complex structure.
      2. Gaining popularity nowadays [15, 16].
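The gating structure listed above (a single hidden state instead of separate cell and hidden states, and one update gate replacing the LSTM's forget and input gates) matches a GRU-style cell. Below is a minimal sketch of such a cell in NumPy, assuming the standard update-gate / reset-gate / candidate-state equations; the class name GRUCell, the weight shapes, and the toy usage at the end are illustrative and not taken from the tutorial.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Minimal GRU-style cell: one hidden state, update + reset gates."""

    def __init__(self, input_size, hidden_size):
        rng = np.random.default_rng(0)
        # Three weight blocks (update gate z, reset gate r, candidate state),
        # versus four in an LSTM (input, forget, output gates + candidate),
        # which is where the "fewer parameters" claim comes from.
        def block():
            W = rng.standard_normal((hidden_size, input_size + hidden_size)) * 0.1
            b = np.zeros(hidden_size)
            return W, b
        self.W_z, self.b_z = block()
        self.W_r, self.b_r = block()
        self.W_h, self.b_h = block()

    def step(self, x, h_prev):
        xh = np.concatenate([x, h_prev])
        z = sigmoid(self.W_z @ xh + self.b_z)   # update gate (merged forget/input)
        r = sigmoid(self.W_r @ xh + self.b_r)   # reset gate
        xh_reset = np.concatenate([x, r * h_prev])
        h_cand = np.tanh(self.W_h @ xh_reset + self.b_h)  # candidate state
        # Single state vector h: no separate cell state as in the LSTM.
        return (1.0 - z) * h_prev + z * h_cand

# Usage: run a short random sequence through the cell.
cell = GRUCell(input_size=8, hidden_size=16)
h = np.zeros(16)
for x in np.random.default_rng(1).standard_normal((5, 8)):
    h = cell.step(x, h)
print(h.shape)  # (16,)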