Simplification of RNN and Its Performance Evaluation in Machine Translation
Tomohiro Fujita, Zhiwei Luo, Changqin Quan, Kohei Mori
pp. 267-274
DOI: 10.5687/iscie.33.267

Abstract
In this paper, we study the simplification of RNNs and propose new structures that enable faster learning and improved performance while reducing the number of learning parameters. We construct four types of RNNs with new gated structures and call them "SGR (Simple Gated RNN)". An SGR has either one or two gates, and either applies a weight to the input or takes it unweighted. Comparison studies are performed to verify the effectiveness of our proposal. In machine translation experiments on a relatively small corpus, the proposed SGR achieves higher scores than both LSTM and GRU. Furthermore, SGR learns approximately 1.7 times faster than GRU. However, as the number of learning layers and input weights increases, the learning scores of SGR do not improve as much as expected; this should be studied in our future work. A more detailed analysis is needed of performance on larger datasets and of the performance differences due to multi-layering, input weighting, and the number of gates.
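The abstract does not give the SGR cell equations, so the sketch below is only an illustrative assumption: a GRU-like cell reduced to a single update gate (no reset gate), which is one plausible reading of a "simple gated RNN" with fewer parameters. The class name SimpleGatedRNNCell and the specific gating scheme are hypothetical, not the authors' definitions.

import torch
import torch.nn as nn

class SimpleGatedRNNCell(nn.Module):
    """Hypothetical single-gate RNN cell (not the paper's exact SGR).

    Compared with a GRU (update gate, reset gate, candidate), this
    variant keeps only an update gate and a candidate, cutting the
    number of weight matrices from three to two.
    """

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Update gate: how much of the candidate replaces the old state.
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)
        # Candidate hidden state.
        self.candidate = nn.Linear(input_size + hidden_size, hidden_size)

    def forward(self, x: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
        xh = torch.cat([x, h], dim=-1)
        z = torch.sigmoid(self.gate(xh))          # update gate in (0, 1)
        h_tilde = torch.tanh(self.candidate(xh))  # candidate state
        return (1.0 - z) * h + z * h_tilde        # gated interpolation

# Usage: one time step on a batch of 8 sequences.
cell = SimpleGatedRNNCell(input_size=16, hidden_size=32)
x = torch.randn(8, 16)
h = torch.zeros(8, 32)
h_next = cell(x, h)
print(h_next.shape)  # torch.Size([8, 32])

Dropping the reset gate removes roughly one third of a GRU cell's parameters, which is consistent with the abstract's goal of fewer learning parameters and faster training, though the paper's actual four SGR variants may differ.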