Improving recurrent neural networks with predictive propagation for sequence labelling
conference contribution
posted on 2023-05-23, 14:43 authored by Son Tran, Zhang, Q, Nguyen, A, Vu, X-S, Ngo, S
Recurrent neural networks (RNNs) are a useful tool for sequence labelling tasks in natural language processing. Although in practice RNNs suffer from the vanishing/exploding gradient problem, their compactness still offers efficiency and makes them less prone to overfitting. In this paper we show that by propagating the predictions of previous labels we can improve the performance of RNNs while keeping the number of parameters unchanged and adding only one more step for inference. As a result, the models remain more compact and efficient than other models with complex memory gates. In our experiments, we evaluate the idea on optical character recognition and chunking, achieving promising results. © 2018, Springer Nature Switzerland AG.
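The abstract describes the label-propagation mechanism only at a high level. As a rough illustration of feeding previous label predictions back into an RNN tagger, here is a minimal PyTorch sketch; the class name, layer sizes, and the softmax-feedback-via-concatenation scheme are illustrative assumptions, not the paper's exact formulation (in particular, this sketch adds feedback weights, whereas the paper keeps the parameter count unchanged).

```python
# Minimal sketch: an Elman-style RNN tagger whose input at each step is the
# current feature vector concatenated with the previous step's (soft) label
# prediction. All names and dimensions here are hypothetical.
import torch
import torch.nn as nn

class LabelFeedbackTagger(nn.Module):
    def __init__(self, input_dim, hidden_dim, num_labels):
        super().__init__()
        # The cell consumes the input plus the previous label distribution.
        self.cell = nn.RNNCell(input_dim + num_labels, hidden_dim)
        self.out = nn.Linear(hidden_dim, num_labels)
        self.num_labels = num_labels

    def forward(self, x):  # x: (seq_len, batch, input_dim)
        seq_len, batch, _ = x.shape
        h = x.new_zeros(batch, self.cell.hidden_size)
        prev_label = x.new_zeros(batch, self.num_labels)  # no label at t=0
        logits = []
        for t in range(seq_len):
            h = self.cell(torch.cat([x[t], prev_label], dim=-1), h)
            step_logits = self.out(h)
            # Propagate the current prediction to the next time step.
            prev_label = torch.softmax(step_logits, dim=-1).detach()
            logits.append(step_logits)
        return torch.stack(logits)  # (seq_len, batch, num_labels)

# Usage: tag a random 5-step sequence for a batch of 2.
tagger = LabelFeedbackTagger(input_dim=16, hidden_dim=32, num_labels=4)
scores = tagger(torch.randn(5, 2, 16))
print(scores.shape)  # torch.Size([5, 2, 4])
```

At inference, each step's prediction must be computed before the next step can run, which matches the abstract's note that the approach adds only one extra step per position.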
History
Publication title
Proceedings of the 25th International Conference on Neural Information Processing (ICONIP 2018), Lecture Notes in Computer Science, volume 11301
Volume
11301
Editors
L Cheng, ACS Leung & S Ozawa
Pagination
452-462
ISBN
978-3-030-04166-3
Department/School
School of Information and Communication Technology
Publisher
Springer
Place of publication
Cham, Switzerland
Event title
25th International Conference on Neural Information Processing (ICONIP 2018)
Event Venue
Siem Reap, Cambodia
Date of Event (Start Date)
2018-12-13
Date of Event (End Date)
2018-12-16
Rights statement
Copyright 2018 Springer
Repository Status
Restricted