eCite Digital Repository
Sequence classification restricted Boltzmann machines with gated units
Citation
Tran, SN and d'Avila Garcez, A and Weyde, T and Yin, J and Zhang, Q and Karunanithi, M, Sequence classification restricted Boltzmann machines with gated units, IEEE Transactions on Neural Networks and Learning Systems pp. 1-10. ISSN 2162-237X (2020) [Refereed Article]
PDF (2 MB)
Copyright Statement
© 2020 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
DOI: 10.1109/TNNLS.2019.2958103
Abstract
For the classification of sequential data, dynamic Bayesian networks and recurrent neural networks (RNNs) are the preferred models. The former can explicitly model temporal dependencies between variables, while the latter can learn representations. The recurrent temporal restricted Boltzmann machine (RTRBM) is a model that combines these two features. However, learning and inference in RTRBMs can be difficult because of the exponential nature of its gradient computations when maximizing log likelihoods. In this article, first, we address this intractability by optimizing a conditional rather than a joint probability distribution when performing sequence classification. This results in the "sequence classification restricted Boltzmann machine" (SCRBM). Second, we introduce gated SCRBMs (gSCRBMs), which use an information processing gate, as an integration of SCRBMs with long short-term memory (LSTM) models. In the experiments reported in this article, we evaluate the proposed models on optical character recognition, chunking, and multiresident activity recognition in smart homes. The experimental results show that gSCRBMs achieve performance comparable to that of the state of the art in all three tasks. gSCRBMs require far fewer parameters than other recurrent networks with memory gates, in particular LSTMs and gated recurrent units (GRUs).
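The gating idea sketched in the abstract, where a learned gate controls how much newly computed hidden activation replaces the previous hidden state, can be illustrated with a minimal toy example. This is not the authors' gSCRBM implementation; all weights, dimensions, and the exact update rule below are illustrative assumptions combining an RBM-style conditional hidden activation with an LSTM-style gate.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dimensions (assumptions, not taken from the paper).
n_visible, n_hidden = 8, 4

# RBM-style weights for the candidate hidden activation, plus a
# separate gate that decides how much of the new activation is kept.
W  = rng.standard_normal((n_visible, n_hidden)) * 0.1  # visible-to-hidden
U  = rng.standard_normal((n_hidden, n_hidden)) * 0.1   # recurrent weights
Wg = rng.standard_normal((n_visible, n_hidden)) * 0.1  # gate, input side
Ug = rng.standard_normal((n_hidden, n_hidden)) * 0.1   # gate, recurrent side
b  = np.zeros(n_hidden)
bg = np.zeros(n_hidden)

def step(v_t, h_prev):
    """One time step: gated update of the hidden state."""
    h_cand = sigmoid(v_t @ W + h_prev @ U + b)    # RBM-like conditional mean
    g = sigmoid(v_t @ Wg + h_prev @ Ug + bg)      # information-processing gate
    return g * h_cand + (1.0 - g) * h_prev        # convex, gated combination

h = np.zeros(n_hidden)
for t in range(5):              # toy input sequence of length 5
    v = rng.random(n_visible)
    h = step(v, h)
print(h.shape)
```

Because a single gate interpolates between the old state and the candidate (rather than the separate input, forget, and output gates of an LSTM), a model of this shape needs noticeably fewer parameters per hidden unit, which is the trade-off the abstract highlights.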
Item Details
Item Type: | Refereed Article |
---|---|
Keywords: | recurrent neural networks (RNNs), restricted Boltzmann machines (RBMs), sequence classification, temporal learning, sequence labelling |
Research Division: | Information and Computing Sciences |
Research Group: | Artificial intelligence |
Research Field: | Intelligent robotics |
Objective Division: | Health |
Objective Group: | Specific population health (excl. Indigenous health) |
Objective Field: | Health related to ageing |
UTAS Author: | Tran, SN (Dr Son Tran) |
ID Code: | 139112 |
Year Published: | 2020 |
Web of Science® Times Cited: | 3 |
Deposited By: | Information and Communication Technology |
Deposited On: | 2020-05-27 |
Last Modified: | 2020-09-10 |
Downloads: | 12 View Download Statistics |