Texture-based classification of sub-Antarctic vegetation communities on Heard Island.


Bibliographic Details
Published in: International Journal of Applied Earth Observation and Geoinformation
Main Authors: Murray, H, Lucieer, A, Williams, R
Format: Article in Journal/Newspaper
Language: English
Published: 2010
Subjects:
Online Access:https://eprints.utas.edu.au/9907/
https://eprints.utas.edu.au/9907/1/MurrayLucieerWilliams2010press_IJAEOG_texture_Heard.pdf
https://doi.org/10.1016/j.jag.2010.01.006
Description
Summary: This study was the first to use high-resolution IKONOS imagery to classify vegetation communities on sub-Antarctic Heard Island. We focused on the use of texture measures, in addition to standard multispectral information, to improve the classification of sub-Antarctic vegetation communities. Heard Island’s pristine and rapidly changing environment makes it a relevant and exciting location to study the regional effects of climate change. This study uses IKONOS imagery to provide an automated, up-to-date, and non-invasive means of mapping vegetation as an important indicator of environmental change. Three classification techniques were compared: multispectral classification, texture-based classification, and a combination of both. Texture features were calculated using the Grey Level Co-occurrence Matrix (GLCM). We investigated the effect of the texture window size on classification accuracy. The combined approach produced a higher accuracy than using multispectral bands alone. It was also found that the selection of GLCM texture features is critical. The highest accuracy (85%) was produced using all original spectral bands and three uncorrelated texture features. Incorporating texture improved classification accuracy by 6%.
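
The sketch below illustrates, in general terms, the kind of GLCM texture computation the abstract describes: texture properties are derived from grey-level co-occurrence matrices in a sliding window and can then be stacked with the original spectral bands before classification. It is not the authors' code; the window size, grey-level quantisation, and the three properties shown (contrast, homogeneity, correlation) are illustrative assumptions, since the abstract does not name the selected features.

```python
# Hypothetical sketch of per-pixel GLCM texture features using scikit-image.
# Assumes a single band (e.g. an IKONOS NIR band) is already loaded as a 2-D
# NumPy array of non-negative values.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_texture(band, window=7, levels=32,
                 props=("contrast", "homogeneity", "correlation")):
    """Return one texture image per GLCM property, computed in a sliding window."""
    # Quantise to a small number of grey levels to keep the co-occurrence matrix small.
    q = (band.astype(float) / band.max() * (levels - 1)).astype(np.uint8)
    half = window // 2
    padded = np.pad(q, half, mode="reflect")
    out = {p: np.zeros(band.shape, dtype=float) for p in props}
    for i in range(band.shape[0]):
        for j in range(band.shape[1]):
            win = padded[i:i + window, j:j + window]
            # One-pixel horizontal offset, symmetric and normalised co-occurrence matrix.
            glcm = graycomatrix(win, distances=[1], angles=[0],
                                levels=levels, symmetric=True, normed=True)
            for p in props:
                out[p][i, j] = graycoprops(glcm, p)[0, 0]
    return out

# Example use: combine texture layers with the multispectral bands for classification.
# bands = ...  # (rows, cols, 4) IKONOS multispectral array
# textures = glcm_texture(bands[:, :, 3], window=7)
# features = np.dstack([bands] + [textures[p] for p in ("contrast", "homogeneity", "correlation")])
```

Varying the `window` argument corresponds to the window-size experiment mentioned in the abstract; larger windows capture coarser texture at the cost of blurring community boundaries.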