University of Tasmania
File(s) under permanent embargo

Improving recurrent neural networks with predictive propagation for sequence labelling

conference contribution
posted on 2023-05-23, 14:43 authored by Son Tran, Zhang, Q, Nguyen, A, Vu, X-S, Ngo, S
Recurrent neural networks (RNNs) are a useful tool for sequence labelling tasks in natural language processing. Although in practice RNNs suffer from the vanishing/exploding gradient problem, their compactness still offers efficiency and makes them less prone to overfitting. In this paper we show that by propagating the predictions of previous labels we can improve the performance of RNNs while keeping the number of parameters unchanged and adding only one more step for inference. As a result, the models remain more compact and efficient than other models with complex memory gates. In our experiments, we evaluate the idea on optical character recognition and chunking, achieving promising results. © 2018, Springer Nature Switzerland AG.
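To make the abstract's idea concrete, the following is a minimal NumPy sketch of label-prediction feedback in a simple Elman-style RNN: at each time step the predicted label distribution from the previous step is fed back alongside the input. All dimensions, weight names (`W_xh`, `W_hh`, `W_yh`, `W_hy`), and the feedback scheme here are illustrative assumptions, not the authors' exact method; in particular, this sketch adds a feedback weight matrix, whereas the paper reports keeping the parameter count unchanged.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (assumptions, not from the paper)
n_in, n_hid, n_lab = 8, 16, 4

# Elman RNN weights plus one label-feedback matrix W_yh.
# NOTE: adding W_yh increases the parameter count; the paper's
# scheme avoids that, but its exact mechanism is not shown here.
W_xh = rng.normal(scale=0.1, size=(n_hid, n_in))
W_hh = rng.normal(scale=0.1, size=(n_hid, n_hid))
W_yh = rng.normal(scale=0.1, size=(n_hid, n_lab))
W_hy = rng.normal(scale=0.1, size=(n_lab, n_hid))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def label_feedback_rnn(xs):
    """Forward pass where each step also receives the predicted
    label distribution from the previous step."""
    h = np.zeros(n_hid)
    y_prev = np.zeros(n_lab)          # no prediction before t = 0
    labels = []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + W_yh @ y_prev)
        y_prev = softmax(W_hy @ h)    # propagated to the next step
        labels.append(int(y_prev.argmax()))
    return labels

seq = [rng.normal(size=n_in) for _ in range(5)]
print(label_feedback_rnn(seq))  # one predicted label index per time step
```

The extra inference step mentioned in the abstract corresponds to computing the label distribution at each step so it can be consumed by the next one.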

History

Publication title

Proceedings of the 25th International Conference on Neural Information Processing (ICONIP 2018), Lecture Notes in Computer Science, volume 11301

Volume

11301

Editors

L Cheng, ACS Leung & S Ozawa

Pagination

452-462

ISBN

978-3-030-04166-3

Department/School

School of Information and Communication Technology

Publisher

Springer

Place of publication

Cham, Switzerland

Event title

25th International Conference on Neural Information Processing (ICONIP 2018)

Event Venue

Siem Reap, Cambodia

Date of Event (Start Date)

2018-12-13

Date of Event (End Date)

2018-12-16

Rights statement

Copyright 2018 Springer

Repository Status

  • Restricted

Socio-economic Objectives

Health related to ageing
