eCite Digital Repository

High-resolution estimates of fire severity - an evaluation of UAS image and LiDAR mapping approaches on a sedgeland forest boundary in Tasmania, Australia


Hillman, S and Hally, B and Wallace, L and Turner, D and Lucieer, A and Reinke, K and Jones, S, High-resolution estimates of fire severity - an evaluation of UAS image and LiDAR mapping approaches on a sedgeland forest boundary in Tasmania, Australia, Fire, 4(1), Article 14. ISSN 2571-6255 (2021) [Refereed Article]


Copyright Statement

Copyright 2021 the authors. Licensed under a Creative Commons Attribution 4.0 International (CC BY 4.0) licence.

DOI: 10.3390/fire4010014


With an increase in the frequency and severity of wildfires across the globe, and resultant changes to long-established fire regimes, the mapping of fire severity is a vital part of monitoring ecosystem resilience and recovery. The emergence of unoccupied aircraft systems (UAS) and compact sensors (RGB and LiDAR) provides new opportunities to map fire severity. This paper compares metrics derived from UAS Light Detection and Ranging (LiDAR) point clouds and UAS image-based products for classifying fire severity. A workflow is developed that derives novel metrics describing vegetation structure and fire severity from UAS remote sensing data, fully utilising the vegetation information available in both data sources. UAS imagery and LiDAR data were captured pre- and post-fire over a 300 m by 300 m study area in Tasmania, Australia. The study area featured a vegetation gradient from sedgeland vegetation (e.g., buttongrass, 0.2 m) to forest (e.g., Eucalyptus obliqua and Eucalyptus globulus, 50 m). To classify vegetation and fire severity, a comprehensive set of variables describing structural, textural, and spectral characteristics was gathered from the UAS image and UAS LiDAR datasets. A recursive feature elimination process was used to identify the subsets of variables to include in random forest classifiers, which were then used to map vegetation and severity across the study area. The results indicate that UAS LiDAR provided overall accuracy similar to that of the UAS image and combined (UAS LiDAR and UAS image predictor values) data streams in classifying vegetation (UAS image: 80.6%; UAS LiDAR: 78.9%; combined: 83.1%) and severity in areas of forest (UAS image: 76.6%; UAS LiDAR: 74.5%; combined: 78.5%) and areas of sedgeland (UAS image: 72.4%; UAS LiDAR: 75.2%; combined: 76.6%). These results indicate that UAS Structure from Motion (SfM) and LiDAR point clouds can be used to assess fire severity at very high spatial resolution.
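The feature-selection-plus-classification workflow described in the abstract (recursive feature elimination to choose a subset of predictors, then random forest classification) can be sketched with scikit-learn. This is a minimal illustration on synthetic data, not the authors' implementation: the feature matrix, class labels, and all parameter values below are assumptions standing in for the real structural, textural, and spectral metrics derived from the UAS point clouds and imagery.

```python
# Hypothetical sketch of the RFE + random forest workflow on synthetic data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n_samples, n_features = 500, 12

# Synthetic stand-ins for per-pixel predictors (e.g., canopy height,
# texture measures, spectral indices); the paper derives these from
# UAS LiDAR and image products instead.
X = rng.normal(size=(n_samples, n_features))

# A binary severity label driven by a few informative features plus noise.
y = (X[:, 0] + 0.8 * X[:, 3] - 0.5 * X[:, 7]
     + rng.normal(scale=0.5, size=n_samples) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Recursive feature elimination ranks predictors by random forest
# importance and keeps the strongest subset (5 here, arbitrarily).
rfe = RFE(RandomForestClassifier(n_estimators=100, random_state=0),
          n_features_to_select=5)
rfe.fit(X_tr, y_tr)

# Refit a random forest on the selected subset and evaluate it.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr[:, rfe.support_], y_tr)
acc = accuracy_score(y_te, clf.predict(X_te[:, rfe.support_]))
print(f"selected features: {np.flatnonzero(rfe.support_)}, accuracy: {acc:.2f}")
```

In practice the same pattern would be run per data stream (image-only, LiDAR-only, combined predictor tables) to produce the comparative accuracies reported in the abstract.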

Item Details

Item Type: Refereed Article
Keywords: photogrammetry, UAS, LiDAR, 3D remote sensing, vegetation, RPAS, drone, structure, fuel structure, fire severity
Research Division: Engineering
Research Group: Geomatic engineering
Research Field: Photogrammetry and remote sensing
Objective Division: Environmental Policy, Climate Change and Natural Hazards
Objective Group: Natural hazards
Objective Field: Climatological hazards (e.g. extreme temperatures, drought and wildfires)
UTAS Author: Wallace, L (Dr Luke Wallace)
UTAS Author: Turner, D (Dr Darren Turner)
UTAS Author: Lucieer, A (Professor Arko Lucieer)
ID Code: 144834
Year Published: 2021
Web of Science® Times Cited: 10
Deposited By: Geography and Spatial Science
Deposited On: 2021-06-16
Last Modified: 2022-08-23
Downloads: 14
