Deep Learning Approach for Mapping Arctic Vegetation using Multi-Sensor Remote Sensing Fusion


Bibliographic Details
Main Authors: Kumar, Jitendra; Langford, Zachary L.; Hoffman, Forrest M.
Other Authors: 10th International Conference on Ecological Informatics: Translating Ecological Data into Knowledge and Decisions in a Rapidly Changing World, 24-28 September 2018, Jena, Germany.
Format: Text
Language: English
Published: 2018
Online Access:https://doi.org/10.22032/dbt.37838
https://www.db-thueringen.de/receive/dbt_mods_00037838
https://www.db-thueringen.de/servlets/MCRFileNodeServlet/dbt_derivate_00043965/Kumar_S2.4_ICEI.pdf
Description
Summary: Land cover datasets are essential for modeling Arctic ecosystem structure and function and for understanding land–atmosphere interactions at high spatial resolutions. However, most Arctic land cover products are generated at a coarse resolution, and finding quality satellite remote sensing datasets to produce such maps is difficult due to cloud cover, polar darkness, and poor availability of high-resolution imagery. A multi-sensor remote sensing-based deep learning approach was developed for generating high-resolution (5 m) vegetation maps for the western Alaskan Arctic on the Seward Peninsula, Alaska. Datasets from hyperspectral, multispectral, and synthetic aperture radar (SAR) platforms, together with terrain data, were fused using unsupervised and supervised classification techniques over a 343 km² region to generate high-resolution (5 m) vegetation type maps. An unsupervised technique was developed to classify the high-dimensional remote sensing datasets into cohesive clusters, and a quantitative technique was employed to add supervision to the unlabeled clusters, producing a fully labeled vegetation map. Deep neural networks (DNNs) were then developed using the multi-sensor remote sensing datasets to map vegetation distributions, trained on both the original labels and the labels produced by the unsupervised method [1]. Fourteen different combinations of remote sensing imagery were analyzed to explore the optimization of multi-sensor fusion. To validate the resulting DNN-based vegetation maps, field vegetation observations were conducted at 30 plots during the summer of 2016, and the developed vegetation maps were evaluated against them for accuracy. Our analysis showed that DNN models based on hyperspectral EO-1 Hyperion data, integrated with the other remote sensing data, provided the most accurate mapping of vegetation types, increasing the average validation score from 0.56 to 0.70 against the field-observed vegetation.
REFERENCES:
1. Langford, Z. L., Kumar, J., and Hoffman, F. M., "Convolutional ...
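
As a rough illustration of the workflow described in the summary (per-pixel fusion of multi-sensor rasters, unsupervised clustering, labeling of clusters from sparse reference data, and supervised DNN classification), the Python sketch below uses k-means and a small fully connected network from scikit-learn as stand-ins for the methods described above. It is not the published pipeline: all array shapes, band counts, cluster numbers, layer sizes, and the helper names fuse_sensors and label_clusters are illustrative assumptions.

# Minimal sketch (not the authors' code): fuse co-registered sensor rasters,
# cluster pixels without labels, attach labels to clusters from sparse
# reference data, then train a small network to map vegetation types.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

def fuse_sensors(hyperspectral, multispectral, sar, terrain):
    """Stack co-registered rasters of shape (rows, cols, bands) into a
    pixel-by-feature matrix."""
    stack = np.concatenate([hyperspectral, multispectral, sar, terrain], axis=-1)
    rows, cols, bands = stack.shape
    return stack.reshape(rows * cols, bands)

def label_clusters(cluster_ids, sparse_labels, n_clusters):
    """Assign each unsupervised cluster the majority vegetation class among
    its pixels that carry a reference label (-1 marks unlabeled pixels)."""
    cluster_to_class = {}
    for k in range(n_clusters):
        labels_k = sparse_labels[(cluster_ids == k) & (sparse_labels >= 0)]
        if labels_k.size:
            cluster_to_class[k] = np.bincount(labels_k).argmax()
    return np.array([cluster_to_class.get(k, -1) for k in cluster_ids])

# Illustrative data shapes; a real run would read co-registered imagery.
rows, cols = 128, 128
hyp = np.random.rand(rows, cols, 198)   # hyperspectral bands (e.g. Hyperion-like)
mul = np.random.rand(rows, cols, 8)     # multispectral bands
sar = np.random.rand(rows, cols, 2)     # SAR backscatter channels
dem = np.random.rand(rows, cols, 3)     # terrain: elevation, slope, aspect

X = StandardScaler().fit_transform(fuse_sensors(hyp, mul, sar, dem))

# Unsupervised step: group pixels into spectrally cohesive clusters.
n_clusters = 12
cluster_ids = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)

# Sparse reference labels (e.g. from field plots); -1 means no label.
sparse = np.full(rows * cols, -1)
idx = np.random.choice(rows * cols, 2000, replace=False)
sparse[idx] = np.random.randint(0, 5, idx.size)

y = label_clusters(cluster_ids, sparse, n_clusters)   # fully labeled map

# Supervised step: train a small fully connected network on the fused features.
mask = y >= 0
dnn = MLPClassifier(hidden_layer_sizes=(256, 128, 64), max_iter=200, random_state=0)
dnn.fit(X[mask], y[mask])
veg_map = dnn.predict(X).reshape(rows, cols)   # predicted vegetation type per pixel

Note that the referenced work [1] describes convolutional networks, which operate on image patches rather than single pixels; a per-pixel classifier is used here only to keep the sketch short and self-contained.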