eCite Digital Repository
Pushbroom hyperspectral imaging from an unmanned aircraft system (UAS) - geometric processing workflow and accuracy assessment
Citation
Turner, D and Lucieer, A and McCabe, M and Parkes, S and Clarke, I, Pushbroom hyperspectral imaging from an unmanned aircraft system (UAS) - geometric processing workflow and accuracy assessment, The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, 4-7 September 2017, Bonn, Germany, pp. 379-384. ISSN 1682-1750 (2018) [Refereed Conference Paper]
PDF (Online published version), 884 KB
Copyright Statement
Copyright 2017 the Authors. Licensed under Creative Commons Attribution 4.0 International (CC BY 4.0) https://creativecommons.org/licenses/by/4.0/
DOI: 10.5194/isprs-archives-XLII-2-W6-379-2017
Abstract
In this study, we assess two pushbroom hyperspectral sensors carried by small (10–15 kg) multi-rotor Unmanned Aircraft Systems (UAS). We used a Headwall Photonics micro-Hyperspec pushbroom sensor with 324 spectral bands (4–5 nm FWHM) and a Headwall Photonics nano-Hyperspec sensor with 270 spectral bands (6 nm FWHM), both covering the VNIR spectral range (400–1000 nm). A gimbal was used to stabilise the sensors against the aircraft flight dynamics. For the micro-Hyperspec, a tightly coupled dual-frequency Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU), and a Machine Vision Camera (MVC) were used for attitude and position determination; for the nano-Hyperspec, a navigation-grade GNSS system and IMU provided position and attitude data. This study presents the geometric results of one flight over a grass oval on which a dense Ground Control Point (GCP) network was deployed, with the aim of ascertaining the geometric accuracy achievable with the system. Using the PARGE software package (ReSe – Remote Sensing Applications), we ortho-rectify the pushbroom hyperspectral image strips and then quantify the accuracy of the ortho-rectification by using the GCPs as check points.
The orientation (roll, pitch, and yaw) of the sensor is measured by the IMU. Alternatively, imagery from an MVC running at 15 Hz, together with accurate camera position data, can be processed with Structure from Motion (SfM) software to estimate the camera orientation. In this study, we examine which of these data sources yields a flight strip with the highest geometric accuracy.
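The accuracy assessment described in the abstract — comparing surveyed GCP check-point coordinates against the positions of the same targets measured in the ortho-rectified strip — amounts to computing a planimetric RMSE over the residuals. A minimal sketch, with hypothetical coordinates and helper names (none of these values come from the paper):

```python
import math

# Hypothetical check-point data: surveyed GCP easting/northing (m) versus the
# coordinates of the same targets digitised in the ortho-rectified strip.
surveyed = [(500010.00, 5250020.00), (500035.50, 5250041.25), (500060.75, 5250012.40)]
measured = [(500010.12, 5250019.88), (500035.38, 5250041.40), (500060.90, 5250012.31)]

def check_point_rmse(surveyed, measured):
    """Root-mean-square error of the planimetric residuals (same CRS, metres)."""
    sq_errors = [(xs - xm) ** 2 + (ys - ym) ** 2
                 for (xs, ys), (xm, ym) in zip(surveyed, measured)]
    return math.sqrt(sum(sq_errors) / len(sq_errors))

print(f"Planimetric RMSE: {check_point_rmse(surveyed, measured):.3f} m")
```

The same residual computation would be repeated per data source (IMU-derived versus SfM-derived orientation) so the two resulting flight strips can be compared on a common footing.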
Item Details
Item Type: Refereed Conference Paper
Keywords: UAS, push broom, hyperspectral, geometric accuracy, PARGE
Research Division: Engineering
Research Group: Geomatic Engineering
Research Field: Photogrammetry and Remote Sensing
Objective Division: Environment
Objective Group: Ecosystem Assessment and Management
Objective Field: Ecosystem Assessment and Management at Regional or Larger Scales
Author: Turner, D (Mr Darren Turner)
Author: Lucieer, A (Associate Professor Arko Lucieer)
Author: Clarke, I (Mr Iain Clarke)
ID Code: 124566
Year Published: 2018
Deposited By: Geography and Spatial Science
Deposited On: 2018-02-27
Last Modified: 2018-03-29
Downloads: 3