Gated recurrent unit
Gated recurrent units (GRUs) are a gating mechanism in recurrent neural networks, introduced in 2014 by Kyunghyun Cho et al. Their performance on polyphonic music modeling and speech signal modeling was found to be similar to that of long short-term memory (LSTM). However, GRUs have been shown to exhibit better performance on smaller datasets.
They have fewer parameters than LSTM, as they lack an output gate.
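To make the parameter difference concrete, here is a rough count for a single recurrent layer, a sketch under the usual formulation in which the LSTM has four weight blocks (input, forget, and output gates plus the cell candidate) and the GRU has three (update gate, reset gate, and candidate state); exact counts vary by implementation, for example in how biases are handled.

```python
# Rough parameter-count comparison for one recurrent layer (illustrative only).
def lstm_params(input_dim, hidden_dim):
    # 4 blocks: input, forget, output gates + cell candidate.
    return 4 * (hidden_dim * (input_dim + hidden_dim) + hidden_dim)

def gru_params(input_dim, hidden_dim):
    # 3 blocks: update gate, reset gate, candidate state (no output gate).
    return 3 * (hidden_dim * (input_dim + hidden_dim) + hidden_dim)

print(lstm_params(128, 256))  # 394240
print(gru_params(128, 256))   # 295680
```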
GRU overview
The GRU is a cell structure that retains the strengths of the LSTM while substantially reducing computational complexity. Like the LSTM, the GRU mitigates the vanishing and exploding gradient problems, but it does so with fewer gates. The GRU has two main gates: an update gate and a reset gate.
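The following is a minimal sketch of a single GRU time step in NumPy, written to illustrate the roles of the two gates; the weight names (W_z, U_z, and so on) and the particular interpolation convention used in the last line are assumptions for illustration, not the original paper's code.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(x_t, h_prev, p):
    """One GRU step: x_t has shape (input_dim,), h_prev has shape (hidden_dim,)."""
    # Update gate: how much of the candidate state to mix into the new state.
    z = sigmoid(p["W_z"] @ x_t + p["U_z"] @ h_prev + p["b_z"])
    # Reset gate: how much of the previous state the candidate is allowed to see.
    r = sigmoid(p["W_r"] @ x_t + p["U_r"] @ h_prev + p["b_r"])
    # Candidate hidden state, computed from the reset-scaled previous state.
    h_tilde = np.tanh(p["W_h"] @ x_t + p["U_h"] @ (r * h_prev) + p["b_h"])
    # Interpolate between the previous state and the candidate (one common convention).
    return (1.0 - z) * h_prev + z * h_tilde

# Toy usage with random weights.
rng = np.random.default_rng(0)
input_dim, hidden_dim = 4, 3
p = {}
for g in ("z", "r", "h"):
    p[f"W_{g}"] = rng.standard_normal((hidden_dim, input_dim))
    p[f"U_{g}"] = rng.standard_normal((hidden_dim, hidden_dim))
    p[f"b_{g}"] = np.zeros(hidden_dim)

h = np.zeros(hidden_dim)
for x_t in rng.standard_normal((5, input_dim)):  # a short toy sequence
    h = gru_step(x_t, h, p)
print(h)
```

Because the update gate both retains the old state and admits the new candidate, the GRU does the work of the LSTM's input and forget gates with a single gate, which is where the parameter savings come from.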