Forests 2019, 10, 1025
Abstract: Although burn severity can be analyzed by comparing images acquired before and after a fire, unmanned aerial vehicle (UAV)-based remote sensing is limited in its ability to acquire images before a forest fire. The burned surface is a challenging class in the analysis of burn severity because it looks unburned in images from aircraft or satellites. This study analyzes the availability of
multispectral UAV images that can be used to classify burn severity, including the burned surface
class. RedEdge multispectral UAV image was acquired after a forest fire, which was then processed
into a mosaic reflectance image. Hundreds of samples were collected for each burn severity class, and
they were used as training and validation samples for classification. Maximum likelihood (MLH),
spectral angle mapper (SAM), and thresholding of a normalized difference vegetation index (NDVI)
were used as classifiers. In the results, all classifiers showed high overall accuracy. The classifiers also
showed high accuracy for classification of the burned surface, even though there was some confusion between the spectrally similar classes, unburned pine and unburned deciduous. Therefore, multispectral
UAV images can be used to analyze burn severity after a forest fire. Additionally, NDVI thresholding
can also be an easy and accurate method, although thresholds should be generalized in the future.
1. Introduction
Fire is a primary disaster in forests, disturbing biodiversity and forest resources. Forest fires sometimes destroy human settlements and cause loss of life and property. Forest fires in South Korea occur mainly in the dry season (from winter to spring), and are mostly caused by humans.
As a forest fire burns off vegetation, soil, organic matter, and moisture, there is a danger of landslides
or other secondary disasters during the summer rainy season. In the Republic of Korea, there were
6,588 forest fires from 2004 to 2018. The total area affected was 11,065 hectares, and the damage
amounted to US$ 252 million [1].
A strategy is needed to recover from the damage and to respond to secondary disasters by
rapidly investigating the burn severity. Burn severity is mainly investigated by field survey or visual
interpretation of satellite imagery. Field surveys are labor-intensive, costly, and time-consuming. Satellite imagery is limited by weather conditions and image resolution. Therefore, a rapid
and efficient method is needed to investigate burn severity. The unmanned aerial vehicle (UAV) is
widely used in various fields. UAVs and sensors provide high-resolution data when users want them,
and they are less affected by atmospheric conditions [2–5]. In most cases, UAVs can acquire images
right after a forest fire even though the location and time of a forest fire cannot be anticipated [6,7].
Previous studies used spaceborne or airborne multispectral imagery to analyze burn severity.
The traditional methods are comparisons of spectral indices pre- and post-fire [8,9]. The normalized
difference vegetation index (NDVI) and the normalized burn ratio (NBR) are well-known as spectral
indices sensitive to forest fire damage [10–13]. Recent studies widely used NBR and the burned area
index (BAI) because shortwave infrared (SWIR) bands are more sensitive to forest fire damage [14–16].
UAV or high-resolution satellite images have to use visible-near infrared (VNIR) bands because they
do not have SWIR bands.
Burn severity incorporates both short- and long-term post-fire effects on the local and regional
environment. Burn severity is defined as the degree to which an ecosystem has changed as a result
of the fire. Vegetation rehabilitation may specifically vary based on burn severity after a fire [17–22].
Previous studies classified burn severity into four or five classes, such as extreme, high, moderate, low,
and unburned, using remote sensing data based on the composite burn index (CBI) suggested by the
United States Forest Service [23–25]. Those classes might not be clear enough to define burn severity
with remote sensing data. One study suggested a Korean CBI (KCBI) by adjusting the CBI classes to
burned crown, boiled crown, moderate (a mix of burned crown and burned surface), low (burned
surface only), and unburned [26]. However, low and unburned are challenging classes. They look similar from nadir views of the crown, because the low class means a burned surface under an unburned crown, and Korean forests have very high canopy density. These characteristics make it difficult to separate low from unburned severity. Therefore, a method is needed to classify low and unburned severity
using remotely sensed imagery that might contribute to estimating damaged areas and establishing a
recovery plan.
This study tries to analyze multispectral UAV images that can be used to classify burn severity,
including surfaces in the study area that are classed as low under the KCBI. Sample pixels were
collected from UAV multispectral images based on visual interpretation and field surveys. Spectral
characteristics of the samples were analyzed, after which burn severity was classified using a spectral
index and supervised classifiers. The suitability of multispectral UAV imaging for burn severity
analysis is shown by the classification accuracy.
Figure 1. Location of (a) the city of Gangneung, and (b) the study area (red box) in the burned forest
shown in a KOMPSAT-3A satellite image taken on 5 April 2019.
Figure 2 shows some of the damaged locations in the study area. The burned crown looks black
along one ridge, and the boiled crown shows brown-colored needles in Figure 2a. In the burned surface
area, green needles are distributed in the crown, although the ground surface and the bark of the lower
trunks that burned are black (Figure 2b). A burned surface stresses the trees owing to a lack of organic
matter and moisture in the soil.
Figure 2. The study area is forest damaged by fire from 4 to 5 April 2019, near the city of Gangneung:
(a) burned forest where there are various burned (damaged) types mixed in the area; and (b) a typical
burned surface area where some tree trunks were burned, even though the crown was not burned.
2.2. Data
Figure 3 shows the mosaic reflectance image as (a) a natural color composite and (b) a pseudo-infrared composite. The image shows some distinguishable colors
in Figure 3a, which are dark brown, light brown, dark green, and light green for the burned crown,
boiled crown, unburned pine trees, and deciduous trees, respectively. However, the burned surface and unburned (pine) trees cannot be distinguished by visual interpretation. Figure 3b shows more clearly distinguished colors using the NIR band. Figure 4 shows an NDVI-transformed image.
Figure 3. RedEdge multispectral unmanned aerial vehicle (UAV) image of some of the burned area
(red box in Figure 1) near the city of Gangneung: (a) natural color composite: RGB = band 3, 2, 1; and
(b) pseudo-infrared composite: RGB = band 5, 3, 2.
Figure 4. A normalized difference vegetation index (NDVI) transformed image that was stretched from
0.10 to 0.95.
3. Methods
Figure 6. Examples of sample collections for (a) an unburned area and (b) a burned crown and burned
surface area.
Vegetation indices were calculated from the mean reflectance of each class. Equations (1) to (3) show the definitions of each vegetation index,
where ρ is the mean reflectance of each band:
NDVI = (ρNIR − ρRed) / (ρNIR + ρRed), (1)
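As a quick sketch (not part of the original processing chain), Equation (1) can be computed per pixel with NumPy; the reflectance values below are illustrative:

```python
import numpy as np

def ndvi(nir, red, eps=1e-10):
    """NDVI per pixel from NIR and red reflectance (Equation (1))."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    # eps guards against division by zero on dark pixels
    return (nir - red) / (nir + red + eps)

# Healthy vegetation reflects strongly in NIR and weakly in red,
# so its NDVI approaches 1; burned ground gives much lower values.
print(ndvi(0.50, 0.05))  # ~0.82
```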
Table 1. Mean of burn severity classes for each vegetation index. NDVI, normalized difference
vegetation index; RE-NDVI, red edge NDVI; VDVI, visible-band difference vegetation index.
P(X|wi) = 1 / ((2π)^(n/2) |Vi|^(1/2)) · exp[−(1/2)(X − Mi)^T Vi^(−1) (X − Mi)], (4)
where n, X, Vi, and Mi denote the number of multispectral bands, the unknown measurement vector, the covariance matrix of each training class, and the mean vector of each training class, respectively, and
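The MLH rule in Equation (4) assigns a pixel to the class with the highest Gaussian likelihood. A minimal sketch follows, comparing log-likelihoods (equivalent and numerically safer) and using hypothetical two-band samples rather than the paper's five-band RedEdge data:

```python
import numpy as np

def fit_mlh(samples_by_class):
    """Estimate the mean vector Mi and covariance matrix Vi per class."""
    return {cls: (np.mean(X, axis=0), np.cov(np.asarray(X), rowvar=False))
            for cls, X in samples_by_class.items()}

def classify_mlh(x, stats):
    """Assign x to the class maximizing the Gaussian log-likelihood (Eq. (4))."""
    best_cls, best_ll = None, -np.inf
    for cls, (mean, cov) in stats.items():
        d = x - mean
        _, logdet = np.linalg.slogdet(cov)  # log|Vi|, stable for small determinants
        ll = -0.5 * (logdet + d @ np.linalg.solve(cov, d))
        if ll > best_ll:
            best_cls, best_ll = cls, ll
    return best_cls

# Hypothetical (NIR, red) training reflectances for two classes.
train = {
    "burned surface": [[0.10, 0.10], [0.12, 0.09], [0.11, 0.12], [0.09, 0.11]],
    "unburned pine":  [[0.50, 0.05], [0.48, 0.06], [0.52, 0.04], [0.49, 0.05]],
}
stats = fit_mlh(train)
print(classify_mlh(np.array([0.11, 0.10]), stats))  # burned surface
```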
α = cos^(−1)[ Σ(i=1..n) ti·ri / ( (Σ(i=1..n) ti^2)^(1/2) · (Σ(i=1..n) ri^2)^(1/2) ) ], (5)
where n, ti, and ri denote the number of multispectral bands, the unknown measurement vector, and the reference spectrum vector, respectively; and α denotes the angle between the r and t vectors.
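As a sketch of Equation (5): SAM measures only the angle between spectra, so two spectra that differ by a brightness scale factor get an angle of zero. The band values below are illustrative:

```python
import numpy as np

def spectral_angle(t, r):
    """Angle in radians between test spectrum t and reference r (Equation (5))."""
    t, r = np.asarray(t, dtype=float), np.asarray(r, dtype=float)
    cos_a = np.dot(t, r) / (np.linalg.norm(t) * np.linalg.norm(r))
    return np.arccos(np.clip(cos_a, -1.0, 1.0))  # clip guards rounding errors

def classify_sam(t, references):
    """Assign t to the reference class with the smallest spectral angle."""
    return min(references, key=lambda cls: spectral_angle(t, references[cls]))

# A spectrum and the same spectrum at half brightness: angle ~0,
# which is why SAM confuses classes with similar spectral *patterns*.
print(spectral_angle([0.1, 0.2, 0.4], [0.05, 0.1, 0.2]))  # ~0.0
```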
Figure 7. Distribution of NDVI values with mean ± one standard deviation for each burn severity class.
Table 2. Thresholds between burn severity classes using the median of the overlapped range.
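The thresholds in Table 2 can be read as the midpoint (median) of the range where the mean ± one standard deviation intervals of two adjacent classes overlap. A sketch under that interpretation, with hypothetical NDVI statistics rather than the paper's measured values:

```python
def ndvi_threshold(lower, upper):
    """Threshold between two adjacent burn severity classes, taken as the
    median of the range where their (mean - std, mean + std) intervals overlap."""
    lower_mean, lower_std = lower
    upper_mean, upper_std = upper
    overlap_bottom = upper_mean - upper_std  # lower edge of the upper class
    overlap_top = lower_mean + lower_std     # upper edge of the lower class
    return (overlap_bottom + overlap_top) / 2.0

# Hypothetical NDVI (mean, std) for two adjacent severity classes.
burned_crown = (0.25, 0.05)
burned_surface = (0.45, 0.08)
print(ndvi_threshold(burned_crown, burned_surface))  # ~0.335
```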
4. Results
In the burned surface class, the wavelength of the red edge between the red and NIR bands shifts slightly toward a shorter wavelength [29].
The reason for the red-edge shift is known to be decreased vitality owing to stress. Therefore, a burned
surface and unburned pine might be classified using slightly different spectral characteristics.
Figure 8. Mean spectral reflectance curve for five classes in the training samples.
Figure 9. Range of mean ± one standard deviation for burned surface, unburned pine, and
unburned deciduous.
SAM showed poor results with the burned surface, which was mostly misclassified as unburned pine. MLH showed better results with a burned
surface, although it was overestimated by misclassification between unburned pine and burned
surface. Additionally, it showed confusion between unburned pine and unburned deciduous. NDVI
thresholding showed moderate results among the three classification methods. The burned surface
class was underestimated, and burned crown was confused with boiled crown.
Figure 10. Results from classification of burn severity using (a) maximum likelihood (MLH), (b) spectral
angle mapper (SAM), and (c) NDVI thresholding.
Confusion matrices (Tables 3–5) show the classification accuracy for each classifier. Accuracy was highest for MLH, followed by SAM and then NDVI thresholding. Overall accuracies were 89%,
81%, and 71% for MLH, SAM, and NDVI thresholding, respectively. Kappa coefficients were also
similar to overall accuracy at 0.86, 0.76, and 0.64, respectively. As seen in Table 3, MLH showed
very high accuracy of more than 85% for four classes, except unburned pine. We can see that 33%
of the unburned pine was misclassified as burned surface, demonstrating that burned surface was
overestimated, as seen in the visual analysis of classification results in Figure 10a. In Table 4, we see
that SAM had more than 85% classification accuracy in three classes except for unburned pine and
unburned deciduous. Those were misclassified as burned surface, or confused with each other, because
the SAM algorithm uses a pattern of spectral reflectance rather than an absolute value of reflectance.
In Table 5, NDVI thresholding showed lower overall accuracy than the supervised classifiers, MLH
and SAM. Low overall accuracy was caused by confusion between burned crown and boiled crown
from similar NDVI values for burned out or discolored needles. However, the classification accuracy
of burned surface, unburned pine, and unburned deciduous showed similar or higher levels than the
two supervised classifiers.
Table 3. Confusion matrix for classification accuracy assessment using MLH (%).
Table 4. Confusion matrix for classification accuracy assessment using spectral angle mapper (SAM) (%).
Table 5. Confusion matrix for classification accuracy assessment using NDVI thresholding (%).
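For reference, the overall accuracy and kappa coefficient reported for Tables 3–5 can be derived from a confusion matrix of sample counts as follows; the small matrix below is illustrative, not the paper's data:

```python
import numpy as np

def overall_accuracy(cm):
    """Fraction of correctly classified samples (diagonal / total)."""
    cm = np.asarray(cm, dtype=float)
    return np.trace(cm) / cm.sum()

def kappa(cm):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    p_observed = np.trace(cm) / n
    p_chance = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2
    return (p_observed - p_chance) / (1.0 - p_chance)

# Illustrative 2-class confusion matrix (rows: reference, cols: predicted).
cm = [[45, 5],
      [10, 40]]
print(overall_accuracy(cm), kappa(cm))  # 0.85 0.7
```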
5. Discussion
In this study, two supervised classifiers and NDVI thresholding were compared for the classification
of forest burn severity, including burned surface. As a result, the classification of burned surface
showed accuracy of more than 85%. This means that a UAV multispectral image can be used to accurately classify a burned surface, even though the burned surface and unburned pine classes are very similar when viewed from the air.
Comparing the classification methods, the supervised classifiers (MLH and SAM) showed more
than 80% overall accuracy, while the NDVI thresholding accuracy was 71%. However, 20% to 30% of
unburned pine and deciduous trees were misclassified as a burned surface, and the amount of burned
surface was overestimated. In other words, a supervised classifier such as MLH requires a high degree of expertise and considerable time to collect dozens to hundreds of training samples, yet the classification accuracy between burned surface and unburned pine is still not high. SAM can classify with
one training sample, such as a collected pixel or a spectrum from a spectral library. It showed a high
number of misclassifications between classes with similar spectral patterns, although hundreds of
training samples were used.
NDVI thresholding showed similar or higher accuracy with the burned surface class, compared
with the supervised classifiers, even though it showed confusion between burned crown and boiled
crown. Previous studies reported that burned crown and boiled crown can be classified easily with
various methods, including visual interpretation. Detection or classification of a burned surface
should be the focus for accurate assessment of damaged areas and burn severity. Therefore, the NDVI
thresholding method is expected to be able to estimate forest fire damage more easily and accurately.
Further studies are needed to generalize the threshold for application in the field. In other words,
independent thresholds should be defined for different regions and times when classifying burn
severity of a forest.
UAV multispectral images have very high spatial resolution and multiple spectral bands, and in the future they can serve as training samples for classification of high-resolution satellite images with deep learning algorithms. Deep learning can enhance accuracy and convenience, although it requires a long-term approach to collecting a large number of training samples. High-resolution earth observation
satellites can also be a useful tool for analyzing burn severity and damaged areas. The Korean
government is developing a new earth observation satellite program, the Compact Advanced Satellite
500 (CAS 500), to shorten the observation interval. The first and second twin satellites will be launched
in 2020, and will have a 0.5 m panchromatic band and 2 m multispectral bands. We expect acquisition of high-resolution multispectral images within two to three days. Future work should examine the feasibility of classifying burn severity using high-resolution satellite images in comparison with UAV images.
6. Conclusions
This study tried to analyze the use of multispectral UAV images to classify burn severity, including
burned surfaces. A RedEdge multispectral image was acquired for a region of the Gangneung forest
fire in April 2019. Spectral characteristics showed differences among burn severity classes, although
some of them were similar. Burn severity was classified using two supervised classifiers, MLH and
SAM, as well as the NDVI thresholding method. Classification accuracies were about 80% to 90%
using the supervised classifiers and about 70% using NDVI thresholding. All methods showed an accuracy of more than 85% for the burned surface class, indicating that a multispectral UAV image can differentiate a burned surface from unburned pine or deciduous trees. The NDVI thresholding method also showed
high classification accuracy for burned surface, unburned pine, and deciduous. It can be useful as an
easier and more accurate tool for the estimation of burn severity and damaged areas than a supervised
classifier. Supervised classification approaches might be applied to other regions through collection
of corresponding training samples. However, the NDVI values of the burn severity classes may differ with regional characteristics. Further studies are needed to generalize NDVI or the thresholds for
application in other regions. In the future, multispectral UAV images can also serve as training data for deep learning techniques and for classifying high-resolution satellite images.
Author Contributions: Conceptualization, C.-s.W. and J.-i.S.; data curation, C.-s.W.; formal analysis, J.-i.S. and
W.-w.S.; funding acquisition, J.P. and T.K.; investigation, J.-i.S., C.-s.W., and W.-w.S.; methodology, J.-i.S.; project
administration, J.P. and T.K.; supervision, C.-s.W.; visualization, J.-i.S. and W.-w.S.; writing—original draft, J.-i.S.
and W.-w.S.; writing—review and editing, J.-i.S., W.-w.S. and T.K.
Funding: This research was funded by “National Institute of Forest Science grant, number FE0500-2018-04”
and “Satellite Information Utilization Center Establishment Program (18SIUE-B148326-01) by Ministry of Land,
Infrastructure, and Transport of Republic of Korea”.
Acknowledgments: We thank In-Sup Kim of the National Institute of Forest Science, who helped with our UAV image acquisition, and the Korea Aerospace Research Institute for providing a KOMPSAT-3A satellite image.
Conflicts of Interest: The authors declare they have no conflicts of interest. The funders had no role in the design
of the study; in the collection, analyses, or interpretation of the data; in the writing of the manuscript; or in the
decision to publish the results.
References
1. Statistical yearbook of wild fire 2018, Statistics Korea. Available online: https://www.index.go.kr/potal/stts/
idxMain/selectPoSttsIdxMainPrint.do?idx_cd=1309&board_cd=INDX_001 (accessed on 16 September 2019).
2. Tang, L.; Shao, G. Drone remote sensing for forestry research and practices. J. For. Res. 2015, 26, 791–797.
[CrossRef]
3. Torresan, C.; Berton, A.; Carotenuto, F.; Di Gennaro, S.F.; Gioli, B.; Matese, A.; Wallace, L. Forestry applications
of UAVs in Europe: A review. Int. J. Remote Sens. 2017, 38, 2427–2447. [CrossRef]
4. Sankey, T.; Donager, J.; McVay, J.; Sankey, J.B. UAV lidar and hyperspectral fusion for forest monitoring in
the southwestern USA. Remote Sens. Environ. 2017, 195, 30–43. [CrossRef]
5. Carvajal-Ramírez, F.; Marques da Silva, J.R.; Agüera-Vega, F.; Martínez-Carricondo, P.; Serrano, J.; Moral, F.J.
Evaluation of Fire Severity Indices Based on Pre-and Post-Fire Multispectral Imagery Sensed from UAV.
Remote Sens. 2019, 11, 993. [CrossRef]
6. Pla, M.; Duane, A.; Brotons, L. Potential of UAV images as ground-truth data for burn severity classification
of Landsat imagery: Approaches to an useful product for post-fire management. Rev. Teledetec. 2017, 49,
91–102. [CrossRef]
7. Fernández-Guisuraga, J.; Sanz-Ablanedo, E.; Suárez-Seoane, S.; Calvo, L. Using unmanned aerial vehicles in
postfire vegetation survey campaigns through large and heterogeneous areas: Opportunities and challenges.
Sensors 2018, 18, 586. [CrossRef]
8. García, M.L.; Caselles, V. Mapping burns and natural reforestation using Thematic Mapper data. Geocarto
Int. 1991, 6, 31–37. [CrossRef]
9. Barbosa, P.M.; Stroppiana, D.; Grégoire, J.M.; Cardoso Pereira, J.M. An assessment of vegetation fire in Africa
(1981–1991): Burned areas, burned biomass, and atmospheric emissions. Glob. Biogeochem. Cycles 1999, 13,
933–950. [CrossRef]
10. Chafer, C.J.; Noonan, M.; Macnaught, E. The post-fire measurement of fire severity and intensity in the
Christmas 2001 Sydney wildfires. Int. J. Wildland Fire 2004, 13, 227–240. [CrossRef]
11. Epting, J.; Verbyla, D.; Sorbel, B. Evaluation of remotely sensed indices for assessing burn severity in interior
Alaska using Landsat TM and ETM+. Remote Sens. Environ. 2005, 96, 328–339. [CrossRef]
12. Collins, B.M.; Kelly, M.; van Wagtendonk, J.W.; Stephens, S.L. Spatial patterns of large natural fires in Sierra
Nevada wilderness areas. Landsc. Ecol. 2007, 22, 545–557. [CrossRef]
13. Tran, B.; Tanase, M.; Bennett, L.; Aponte, C. Evaluation of Spectral Indices for Assessing Fire Severity in
Australian Temperate Forests. Remote Sens. 2018, 10, 1680. [CrossRef]
14. Amos, C.; Petropoulos, G.P.; Ferentinos, K.P. Determining the use of Sentinel-2A MSI for wildfire burning &
severity detection. Int. J. Remote Sens. 2019, 40, 905–930.
15. Filipponi, F. BAIS2: Burned Area Index for Sentinel-2. Proc. Int. Electron. Conf. Remote Sens. 2018, 2, 364.
[CrossRef]
16. Chuvieco, E.; Martin, M.P.; Palacios, A. Assessment of different spectral indices in the red-near-infrared
spectral domain for burned land discrimination. Int. J. Remote Sens. 2002, 23, 5103–5110. [CrossRef]
17. Key, C.H.; Benson, N.C. Fire Effects Monitoring and Inventory Protocol—Landscape Assessment; USDA Forest
Service Fire Science Laboratory: Missoula, MT, USA, 2002; pp. 1–16.
18. Van Wagtendonk, J.W.; Root, R.R.; Key, C.H. Comparison of AVIRIS and Landsat ETM+ detection capabilities
for burn severity. Remote Sens. Environ. 2004, 92, 397–408. [CrossRef]
19. Hartford, R.A.; Frandsen, W.H. When it’s hot, it’s hot... or maybe it’s not! (Surface flaming may not portend
extensive soil heating). Int. J. Wildland Fire 1992, 2, 139–144. [CrossRef]
20. Neary, D.G.; Klopatek, C.C.; DeBano, L.F.; Ffolliott, P.F. Fire effects on belowground sustainability: A review
and synthesis. For. Ecol. Manag. 1999, 122, 51–71. [CrossRef]
21. Miller, J.D.; Yool, S.R. Mapping forest post-fire canopy consumption in several overstory types using
multi-temporal Landsat TM and ETM data. Remote Sens. Environ. 2002, 82, 481–496. [CrossRef]
22. Smith, A.M.; Wooster, M.J. Remote classification of head and backfire types from MODIS fire radiative power
and smoke plume observations. Int. J. Wildland Fire 2005, 14, 249–254. [CrossRef]
23. McKenna, P.; Erskine, P.D.; Lechner, A.M.; Phinn, S. Measuring fire severity using UAV imagery in semi-arid
central Queensland, Australia. Int. J. Remote Sens. 2017, 38, 4244–4264. [CrossRef]
24. Fraser, R.; Van Der Sluijs, J.; Hall, R. Calibrating satellite-based indices of burn severity from UAV-derived
metrics of a burned boreal forest in NWT, Canada. Remote Sens. 2017, 9, 279. [CrossRef]
25. Rossi, F.; Fritz, A.; Becker, G. Combining satellite and UAV imagery to delineate forest cover and basal area
after mixed-severity fires. Sustainability 2018, 10, 2227. [CrossRef]
26. Won, M. Analysis of Burn Severity in Large-Fire Area Using Satellite Imagery. Ph.D. Thesis, Korea University,
Seoul, Korea, July 2013.
27. Swain, P.H.; Davis, S.M. Remote sensing: The quantitative approach. IEEE Trans. Pattern Anal. Mach. Intell.
1981, 6, 713–714. [CrossRef]
28. Kruse, F.A.; Lefkoff, A.B.; Boardman, J.W.; Heidebrecht, K.B.; Shapiro, A.T.; Barloon, P.J.; Goetz, A.F.H.
The spectral image processing system (SIPS)—Interactive visualization and analysis of imaging spectrometer
data. Remote Sens. Environ. 1993, 44, 145–163. [CrossRef]
29. Clevers, J.P.G.W.; Jongschaap, R. Imaging spectrometry for agricultural applications. In Imaging
Spectrometry—Basic Priciples and Prospective Applications, 1st ed.; Van Der Meer, F.D., Ed.; Kluwer Academic
Publishers: Dordrecht, The Netherlands, 2003; pp. 157–199.