eCite Digital Repository
Real-time automated detection of older adults’ hand gestures in home and clinical settings
Citation
Huang, G and Tran, SN and Bai, Q and Alty, J, Real-time automated detection of older adults' hand gestures in home and clinical settings, Neural Computing and Applications pp. 1-14. ISSN 0941-0643 (2022) [Refereed Article]
PDF (3 MB) – pending copyright assessment, request a copy | PDF, online ahead of print (1 MB) – pending copyright assessment, request a copy
DOI: 10.1007/s00521-022-08090-8
Abstract
There is an urgent need, accelerated by the COVID-19 pandemic, for methods that allow clinicians and neuroscientists to evaluate hand movements remotely. This would help detect and monitor degenerative brain disorders that are particularly prevalent in older adults. With the wide accessibility of computer cameras, a vision-based real-time hand gesture detection method would facilitate online assessments in home and clinical settings. However, motion blur is one of the most challenging problems when collecting video of fast-moving hands. The objective of this study was to develop a computer-vision-based method that accurately detects older adults' hand gestures using video data collected in real-life settings. We invited adults over 50 years old to complete validated hand movement tests (fast finger tapping and hand opening–closing) at home or in clinic. Data were collected without researcher supervision via a website programme using standard laptop and desktop cameras. We processed and labelled images, split the data into training, validation and test sets, and then analysed how well different network structures detected hand gestures. We recruited 1,900 adults (age range 50–90 years) as part of the TAS Test project and developed UTAS7k, a new dataset of 7071 hand gesture images, split 4:1 into clear and motion-blurred images. Our new network, RGRNet, achieved 0.782 mean average precision (mAP) on clear images, outperforming the state-of-the-art network structure (YOLOv5-P6, mAP 0.776), and achieved mAP 0.771 on blurred images. RGRNet, a robust real-time automated network that detects static gestures from a single camera, and UTAS7k, a new database comprising the largest range of individual hands, both show strong potential for medical and research applications.
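The dataset figures in the abstract can be illustrated with a short sketch. Below, `partition` reproduces the stated 4:1 clear-to-blurred split of UTAS7k's 7071 images, and `train_val_test` shows one common way to divide images into training, validation and test sets. The 70/15/15 ratios and all function names are assumptions for illustration only; the paper does not state its exact split procedure.

```python
import random

def partition(n_total, clear_ratio=4, blurred_ratio=1):
    """Split a dataset count into clear vs motion-blurred subsets (default 4:1)."""
    clear = round(n_total * clear_ratio / (clear_ratio + blurred_ratio))
    return clear, n_total - clear

def train_val_test(items, seed=0, fracs=(0.70, 0.15, 0.15)):
    """Shuffle items and split them into train/validation/test lists."""
    rng = random.Random(seed)          # fixed seed for a reproducible split
    items = list(items)
    rng.shuffle(items)
    n = len(items)
    n_train = int(n * fracs[0])
    n_val = int(n * fracs[1])
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

clear, blurred = partition(7071)
# 7071 images at 4:1 gives 5657 clear and 1414 motion-blurred images
train, val, test = train_val_test(range(7071))
```

In a real pipeline the split would be applied per participant (so no individual's hands appear in both training and test data), but the counting logic is the same.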
Item Details
| Item Type: | Refereed Article |
|---|---|
| Keywords: | hand gesture classification, similar object detection, motion blur, dementia |
| Research Division: | Information and Computing Sciences |
| Research Group: | Artificial intelligence |
| Research Field: | Artificial life and complex adaptive systems |
| Objective Division: | Information and Communication Services |
| Objective Group: | Information systems, technologies and services |
| Objective Field: | Human-computer interaction |
| UTAS Author: | Huang, G (Mr Guan Huang) |
| UTAS Author: | Tran, SN (Dr Son Tran) |
| UTAS Author: | Bai, Q (Dr Quan Bai) |
| UTAS Author: | Alty, J (Associate Professor Jane Alty) |
| ID Code: | 154600 |
| Year Published: | 2022 |
| Funding Support: | National Health and Medical Research Council (2004051) |
| Deposited By: | Information and Communication Technology |
| Deposited On: | 2022-12-14 |
| Last Modified: | 2023-01-10 |
| Downloads: | 0 |