Sensors: Computer Vision and IoT-Based Sensors in Flood Monitoring and Mapping: A Systematic Review
Review
Computer Vision and IoT-Based Sensors in Flood
Monitoring and Mapping: A Systematic Review
Bilal Arshad 1, * , Robert Ogie 1 , Johan Barthelemy 1 , Biswajeet Pradhan 2,3 ,
Nicolas Verstaevel 4 and Pascal Perez 1
1 SMART Infrastructure Facility, University of Wollongong, Wollongong 2522, NSW, Australia;
[email protected] (R.O.); [email protected] (J.B.); [email protected] (P.P.)
2 The Centre for Advanced Modelling and Geospatial Information Systems (CAMGIS),
Faculty of Engineering and Information Technology, University of Technology Sydney,
Sydney 2007, NSW, Australia; [email protected]
3 Department of Energy and Mineral Resources Engineering, Sejong University, Choongmu-gwan,
209 Neungdong-ro, Gwangjingu, Seoul 05006, Korea
4 UMR 5505 CNRS-IRIT, Université Toulouse 1 Capitole, 31062 Toulouse, France; [email protected]
* Correspondence: [email protected]; Tel.: +61-2-4239-2329
Received: 4 October 2019; Accepted: 12 November 2019; Published: 16 November 2019
Abstract: Floods are amongst the most common and devastating of all natural hazards. The alarming
number of flood-related deaths and financial losses suffered annually across the world call for improved
response to flood risks. Interestingly, the last decade has presented great opportunities with a series
of scholarly activities exploring how camera images and wireless sensor data from Internet-of-Things
(IoT) networks can improve flood management. This paper presents a systematic review of the
literature regarding IoT-based sensors and computer vision applications in flood monitoring and
mapping. The paper contributes by highlighting the main computer vision techniques and IoT sensor
approaches utilised in the literature for real-time flood monitoring, flood modelling, mapping and
early warning systems including the estimation of water level. The paper further contributes by
providing recommendations for future research. In particular, the study recommends ways in which
computer vision and IoT sensor techniques can be harnessed to better monitor and manage coastal
lagoons—an aspect that is under-explored in the literature.
Keywords: remote sensing; flood; disaster management; coastal; environmental sensor network
(ESN); IoT; drones; UAV; computer vision; wireless sensor network
1. Introduction
Natural hazards such as floods, storms, tsunamis and others pose a significant threat to lives
and property around the world [1]. Without proper monitoring and effective mitigation measures,
these natural perils often culminate in disasters that have severe implications in terms of economic loss,
social disruptions, and damage to the urban environment [2,3]. Historical records have shown that
flood is the most frequent natural hazard (see Figure 1), accounting for 41% of all natural perils that
occurred globally in the last decade [4]. In this period alone (2009 to 2019), there were over 1566 flood
occurrences affecting 754 million people around the world, with 51,002 deaths recorded and damage
estimated at $371.8 billion [4]. Put in context, these statistics only account for “reported” cases of
large-scale floods, typically considered flood disasters. A flood disaster is defined as a flood that
significantly disrupts or interferes with human and societal activity, whereas a flood is the presence of
water in areas that are usually dry [5,6]. The global impact of floods would be even more alarming if these
statistics incorporated the numerous small-scale floods in which fewer than 10 people died, fewer than
100 people were affected, and no state of emergency or call for international assistance was declared.
Nevertheless, the current situation calls for improved ways
of monitoring and responding to floods. The importance of improved flood monitoring cannot be
overemphasized given the growing uncertainty associated with climate change and the increasing
numbers of people living in flood-prone areas [7].
Significant efforts have been made globally to develop cost-effective and robust flood monitoring
solutions. A common approach is based on computer vision, wherein relevant images from existing
urban surveillance cameras are captured and processed to improve decision making about floods [8].
These types of camera-based applications involve low equipment cost and wide aerial coverage,
thereby enabling the detection of flood levels at multiple points. The wider coverage gives the
computer vision approach an advantage over the traditional flood monitoring method that relies on
fixed-point sensors [9]. Computer vision is based on image processing techniques that have been
widely applied in many fields, including aerospace, medicine, traffic monitoring, and environmental
object analysis [10]. In the last decade, research efforts have intensified in exploring computer vision
to improve flood monitoring, flood inundation mapping, debris flow estimation, and post-flood
damage estimation. To effectively harness this knowledge and foster rapid research progress, it is
important to review the relevant literature and provide a constructively critical appraisal of scientific
production, including recommended directions for future research.
Figure 1. Comparison of different disaster types reported from 2009 to 2019: (a) total number of
reported disasters; (b) total number of deaths; (c) total number of people affected; and (d) total
economic loss [4].
Another method of flood monitoring and prediction is the use of wireless sensors powered by
the Internet of Things (IoT) technology. IoT and computational models such as artificial neural
networks (ANNs) [11] have opened up new doorways, allowing the design of new hardware and
software to provide real-time water-level data as required for flood monitoring and forecasting [12].
Today, many flood-prone countries, including the tropical nation of Indonesia that suffers from
annual monsoonal rainfall, are exploring IoT sensors to gather intelligence for issuing early warnings
and evacuation orders to people at risk of major floods [13]. The IoT has gained increased popularity
in the last decade, particularly within the context of smart city applications such as real-time
monitoring of urban drainage networks using wireless sensors [14]. A review of the relevant
literature is needed to provide an in-depth understanding of the research scope and progress
achieved in the last decade of using IoT sensors for flood monitoring in both occupied lands and
other coastal sites such as lakes and lagoons.
This study provides an opportunity to update readers on recent advancements in flood monitoring,
and how technology is used in the literature to map flood events. The motivation behind this study
is to highlight existing solutions and adapt them to better manage coastal lagoons, which pose a flood
threat to local communities. This study presents a systematic review of the literature focusing
on the use of computer vision and IoT-based sensors in flood monitoring, mapping and prediction
for both occupied lands and coastal sites such as lagoons. The main contributions of this article are
as follows:
• A detailed survey is presented on the use of computer vision and IoT-based sensors for flood
monitoring, prediction and inundation mapping. The scope covers the state-of-the-art applications
of computer vision and sensor integrated approaches for managing coastal sites and other
flood-prone urban areas.
• The study highlights gaps in the literature and recommends directions for future research.
The following section presents the methodology adopted in conducting this systematic
literature review.
2. Methodology
This section provides details for the procedure involved in the selection, inclusion and exclusion
of research articles. The review was conducted using the Preferred Reporting Items for Systematic
Reviews and Meta-Analyses (PRISMA) guidelines [15]. Overall, three databases were selected to
conduct this review, namely, Scopus, IEEE Xplore and Science Direct. The keywords utilised to
select relevant articles from the databases are listed in Table 1 along with the number of retrieved
research papers.
The research articles were manually screened by reading the title and abstract. The database
search returned (n = 13,875) records from three online databases. After removal of duplicate articles,
only 2823 articles were left for review. The titles and abstracts of these 2823 articles were manually
screened for relevance, resulting in the exclusion of 2415 records. The remaining 408 articles were
selected for full-text review and content analysis. For inclusion in the final list, articles were required
to be published between 2009 and 2019 and to be related to flood monitoring, forecasting or mapping.
These inclusion criteria resulted in 91 relevant articles. In regard to exclusion criteria, the articles about
IoT protocols in flood monitoring were not included in this review, as this is not the core focus of
this study. Furthermore, duplicated articles in different databases were also discarded, and only the
articles written in English were considered in this review. The PRISMA flow diagram for the systematic
literature review can be seen in Figure 2.
Figure 2. PRISMA flow diagram for the literature review [15].
Another study utilised a physical measuring ruler along with different computational models
in computer vision, including the differencing method, dictionary learning and convolutional neural
networks (CNNs) [20]. The dictionary method is based on classifying the ROI into two classes, i.e., ruler
and water region. The features of the water and ruler are stored in the dictionary. By analysing the
boundary line between the ruler and water classes, the water level can be calculated. The CNN delivered
the most promising results. A CNN is a computer vision technique which involves convolving the
image with a filter. The role of the filter is to extract important features from the image. The algorithm
was trained on raw images; during prediction, instead of using preserved features from the
dictionary, it extracted features directly from the input image. Having tested the algorithms on six
different locations, the study concluded that the CNN outperformed the accuracy of both the dictionary
learning technique and the differencing method. The average error and variance of error recorded for
the three different methods can be seen in Table 2.
Table 2. Comparison of the average and variance of error for different computer vision techniques [20].
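The boundary-finding idea shared by the dictionary-learning and CNN methods above can be illustrated with a deliberately simplified sketch. Here, the row-mean intensity "features" and the two template values stand in for the learned dictionary atoms of ruler and water; they are illustrative assumptions, not values from the cited study:

```python
import numpy as np

def estimate_water_line(image, ruler_template, water_template):
    """Classify each row of a grayscale gauge ROI as 'ruler' or 'water'
    by nearest-template matching on the row-mean intensity, then return
    the first water row, i.e. the ruler/water boundary from which the
    water level is read off."""
    labels = []
    for row in image:
        d_ruler = abs(row.mean() - ruler_template)
        d_water = abs(row.mean() - water_template)
        labels.append("water" if d_water < d_ruler else "ruler")
    for i, lab in enumerate(labels):
        if lab == "water":
            return i  # index of the boundary row
    return len(labels)  # no water visible in the ROI

# Synthetic ROI: bright ruler (rows 0-5) above dark water (rows 6-9).
roi = np.vstack([np.full((6, 4), 200.0), np.full((4, 4), 40.0)])
boundary_row = estimate_water_line(roi, ruler_template=200.0, water_template=40.0)
```

A real implementation would replace the row-mean feature with learned patch features, but the boundary search over per-row class labels is the same.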
The task of computationally differentiating a water body in an image can be challenging. A vital
step is to rely on the intensity data from the water body to develop a mathematical model that contains
the water body reflection coefficients [21]. Rankin et al. [22] considered the low texture part of the
image as the water body. Low texture in an image can be found by converting the red/green/blue
(RGB) image to grayscale and convolving a grayscale image with a 5 × 5 intensity variance filter.
The study utilized the intensity data from the water body to extract the reflection coefficient from
surface reflection. In contrast to using only intensity information, Park et al. [23] proposed the
segmentation technique to identify the water level. The proposed algorithm uses an accumulated
histogram approach and a bandpass filter. The bandpass filter is fine-tuned to reduce the noise in
the image. For this reason, images taken from a charge-coupled device (CCD) camera are converted
from time series to a frequency domain using discrete cosine transform (DCT). In the accumulated
histogram approach, the image is compared with previous frames and a histogram plotted, so that the
changes in the histogram can be tracked, and the water level estimated from variation in the histogram.
In a similar approach, Udomsiri et al. [24] proposed the edge detector finite impulse response (FIR)
filters along with bandpass filter to find the boundary between water and ground. The water level was
detected by finding features of horizontal straight lines. The error of the detection was calculated by
measuring the water level manually and comparing the results with the output of the algorithm.
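The low-texture criterion of Rankin et al. can be sketched directly: convert to grayscale, compute the variance of each 5 × 5 neighbourhood, and flag low-variance pixels as candidate water. The variance threshold below is an illustrative assumption; the naive sliding-window loop stands in for an optimised convolution:

```python
import numpy as np

def local_variance(gray, k=5):
    """Intensity variance of each k-by-k neighbourhood (valid region only)."""
    h, w = gray.shape
    out = np.zeros((h - k + 1, w - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = gray[i:i + k, j:j + k].var()
    return out

def water_mask(gray, var_threshold=25.0):
    """Low-texture (low-variance) pixels are flagged as candidate water."""
    return local_variance(gray) < var_threshold

# Smooth 'water' (constant intensity) next to noisy 'land'.
rng = np.random.default_rng(0)
scene = np.hstack([np.full((8, 8), 100.0), rng.normal(100, 30, (8, 8))])
mask = water_mask(scene)
```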
Moreover, Zhang et al. [25] proposed a real-time flow and water level measurement system
based on near-infrared (NIR) imaging, OSF-based adaptive thresholding and image ortho-rectification
techniques. The proposed framework consists of the following nine steps: (i) camera calibration to obtain
intrinsic and distortion matrix; (ii) correction of non-linear distortion of an image; (iii) selection of
staff gauge ruler as ROI; (iv) design of binary orthographic template image based on chosen ROI;
(v) selection of corresponding points on staff gauge; (vi) determining the transformation matrix of a
staff gauge with respect to the camera; (vii) ortho-rectification of ROI image; (viii) segmentation of an
image with adaptive thresholding; and (ix) locating the water line in the image by accumulating grey
values in a row.
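The final step of this pipeline, locating the water line by accumulating grey values row by row, can be sketched as follows. The synthetic binarised gauge image is an illustrative stand-in for the ortho-rectified staff-gauge ROI:

```python
import numpy as np

def locate_water_line(rectified_gauge):
    """Sum grey values along each row of the ortho-rectified, binarised
    gauge image; the water line is taken as the row where the accumulated
    value changes most sharply (gauge markings above, uniform water below)."""
    row_sums = rectified_gauge.sum(axis=1)
    return int(np.argmax(np.abs(np.diff(row_sums)))) + 1

# Binarised gauge: alternating graduations above, uniform water from row 6.
gauge = np.zeros((10, 8))
gauge[:6, ::2] = 255
water_row = locate_water_line(gauge)
```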
In contrast to utilizing visual information based only on ground or wall-mounted cameras [26],
Ridolfi et al. [27] deployed unmanned aerial vehicles (UAVs) to monitor the water level in dams.
The water level was estimated by applying a Canny edge detector to greyscale images. The minimum
and maximum threshold parameters (0.019, 0.047) were predefined, and the objective was to draw a boundary
between water and surface. By comparing the water level retrieved from the images with a benchmarked
value obtained from a traditional device, the method was found to achieve an overall mean error of
0.05 m between the estimated and actual water levels. This outcome is quite encouraging,
considering that testing in four different locations within an Italian artificial lake has reaffirmed the
reliability of the method for extracting the water level from images.
Image parsing is another key challenge in the use of computer vision for flood monitoring.
Lo et al. [28] designed an image parser to
analyse images that have significant perceptual recognizability. Firstly, the image parser looks for dark
sample pixels or blank images, where the intensity of the pixel is the luminance in the hue, saturation
and value (HSV) colour domain. Images with an intensity less than a specified threshold are discarded.
The second step is to check the image visibility by calculating the overall luminance of an image.
Afterwards, the next step is to draw some reference sampling points on an image, check the visibility
at those points and then check the presence of fog/haze on site. The final phase involves checking for
the presence of water in the ROI by finding geometric boundaries and edges in the resultant image.
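The first stage of such a parser, rejecting dark or blank frames via the HSV value channel, can be sketched as below. The 0.2 threshold is an illustrative assumption, not the value used by Lo et al.:

```python
import numpy as np

def is_usable(rgb, dark_threshold=0.2):
    """Reject dark/blank frames: the HSV value (V) channel of a pixel is
    max(R, G, B), so a frame whose mean V falls below the threshold is
    discarded before any water detection is attempted."""
    value = rgb.max(axis=2) / 255.0  # V channel, normalised to [0, 1]
    return value.mean() >= dark_threshold

night_frame = np.full((4, 4, 3), 10, dtype=np.uint8)   # nearly black
day_frame = np.full((4, 4, 3), 180, dtype=np.uint8)    # well lit
```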
One of the most significant findings to emerge from this subsection is that a computer vision
approach can be used to extract the water level at multiple points within the field of view (FOV) of the
camera. The water level readings can be validated by analysing the visual data acquired from the
visual sensor. This provides an inexpensive way to forecast floods by relying merely on remote sensing
data. This subsection has also contributed to the understanding of how different computer vision
methods are used in the literature.
from the sensors were taken into consideration to predict the flash flood. Furthermore, the multilayer
perceptron (MLP) algorithm was trained to reduce the number of false alarms [35].
There are several studies that show how to establish and harness a network of connected
sensors for water level monitoring. Noar et al. [36] show how the Blynk platform can be utilised to
connect the ultrasonic sensor with the internet and obtain real-time information on mobile phones.
The proposed approach utilises NodeMCU as a medium to connect the range finder sensor with the
Internet and receive information about the status of water level in real time. In a similar approach,
Purkovic et al. [37] designed a low-cost ultrasonic sensor that was utilised along with other sensors
from EnOcean. The data was transmitted every 5 min and the maximum range of the sensor was
10 m, with a resolution of 10 mm. However, the paper does not provide information about the results
obtained from the experiment. Kafli et al. [38] proposed an IoT platform along with several sensors
including rangefinder, humidity, carbon monoxide and a GPS sensor to monitor water level. The study
was designed to be able to monitor water level in real time and issue early warnings to the local
community. Chandanala et al. [39] proposed a technique to make the wireless system more energy
efficient by optimising the parameters of network coding and duty cycling. Flooding was predicted by
executing active monitoring through available off-the-shelf sensors such as an ultrasonic/sonar for
estimating the water level and a precipitation sensor for estimating the intensity of rainfall.
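As background to the ultrasonic/sonar rangefinders used throughout these studies, the conversion from an echo time to a water level can be sketched as follows. The 343 m/s speed of sound and the mounting geometry are illustrative assumptions, not parameters from any cited system:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees Celsius

def water_level_m(echo_time_s, mount_height_m):
    """An ultrasonic rangefinder mounted above the channel measures the
    round-trip echo time; the distance to the surface is c * t / 2, and
    the water level is the mount height minus that distance."""
    distance = SPEED_OF_SOUND * echo_time_s / 2.0
    return mount_height_m - distance

# Sensor mounted 5 m above the channel bed; an echo after ~20.4 ms means
# the surface is ~3.5 m below the sensor, i.e. a level of ~1.5 m.
level = water_level_m(0.0204, 5.0)
```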
Furthermore, an early flood detection system can be implemented through real-time monitoring
of the flood-prone area via sensors deployed in optimal locations at the site. This approach provides
a convenient and cost-effective way to monitor flood-prone sites in real time [40]. Furthermore,
Thekkil et al. [41] and Balaji et al. [42] utilised ZigBee and Global System for Mobile (GSM) to transmit
acquired camera images and generate flood-related warnings. The study also utilised the scale-invariant
feature transform (SIFT) algorithm for the autonomous monitoring of floods. In a similar approach,
Pratama et al. [43] utilised Mamdani fuzzy logic along with ZigBee and water level sensors to detect
and transmit the flood-related data. The study suggests that the maximum error for the proposed
approach falls within an acceptable range of five percent. Waleed et al. [44] proposed a microchip-based
solution using an array of piezoelectric pressure sensors that measure the pressure exerted by water
and ZigBee for transmitting and receiving the data. The sensors were prototyped on Altera’s Cyclone
board. The study also suggested that placement of the sensors is of extreme importance to forecast
flood accurately. Ogie et al. [45] proposed a solution for the best placement of water-level sensors.
The study puts a considerable emphasis on the optimal placement of the sensors, as it is important to
gain situational awareness of water level in a large area of interest. The NSGA-II algorithm, which has
gained wide application in many real-world problems, was used to find the best spot for the sensors.
Using the sensor placement algorithm, four locally fabricated sensors were deployed to monitor water
levels at different points in the waterways in Jakarta, Indonesia. In situations where accessibility is
constrained, drones can be utilised to deploy sensors. For example, Abdelkader et al. [46] utilised UAVs
to deploy cheap disposable sensors that can transmit data to UAVs about the monitored lake/valley.
Monitoring of water level has spurred the design and implementation of several wireless
sensor networks (WSNs). For example, Wen-Yao et al. [47] utilised water level sensors along with
analogue-to-digital converter (ADC) and an 8051 microprocessor in a ZigBee WSN to estimate water
level. The study was executed to monitor and control the distribution substation in low-lying areas,
providing early warning to the local community in case the water level increases above a predefined
threshold value. Other similar studies have provided real-time signals from a WSN to inform an early
warning system [48–50]. These studies have mostly relied on a web server to visualise the data coming
from the flood monitoring station. Additionally, Jayashree et al. [51] proposed an early warning system
based on real-time monitoring of dams via flow and water level sensors. The data collected from
sensors is accessible and available to the public and can be fetched through an Android app designed
for the research. Similarly, Teixidó et al. [52] and Smith et al. [53] presented a WSN system to notify
the user in case of flooding. Similarly, Yumang et al. [54] designed a sensor network system capable
of issuing warnings to locals in the event of flooding. The proposed system is based on sensors to
monitor water level, a renewable power source to power the system and a GSM shield to transmit data.
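The threshold-based warning logic common to these WSN stations can be sketched in a few lines. The two-tier warn/danger thresholds are illustrative assumptions, not values from any of the cited deployments:

```python
def warning_level(level_m, warn_m=1.0, danger_m=1.5):
    """Three-state early warning used by threshold-based stations:
    each reading is compared against predefined warn/danger levels,
    and the resulting state drives the GSM/web alert channel."""
    if level_m >= danger_m:
        return "DANGER"
    if level_m >= warn_m:
        return "WARN"
    return "NORMAL"

states = [warning_level(x) for x in (0.4, 1.2, 1.8)]
```

In practice a small hysteresis band around each threshold is usually added so that a level oscillating near the boundary does not trigger repeated alerts.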
Data from sensor networks need to be validated and machine learning techniques can be quite
useful in this regard. Machine learning techniques can be used in conjunction with ultrasonic/rangefinder
sensors to predict flooding probability as needed for early warning [55]. Widiasari et al. [56] utilised
the machine learning technique, Multilayer Perceptron (MLP), to analyse the time-series data coming
from ultrasonic and precipitation sensors. The study was conducted to increase the accuracy of
predicting flood events and also attributing floods in the region. Khan et al. [57] proposed an AI-based
multi-modal network to alert locals to any upcoming flood event. The proposed approach is based on
the sensor network, which consists of rangefinder, pressure, temperature, and gas sensors. The study
indicated that the proposed system delivers accurate results with minimal false alerts, although it would
have been helpful to evaluate the system's performance across more locations. In a different study,
Cruz et al. [58] developed a system to collect data from sensors such as a rain gauge, water level
sensor, and soil moisture sensor. Using an artificial neural network (ANN) technique, real-time data
from the flood monitoring station can be analysed to inform flood risk. The novelty in the study was
the introduction of measuring river slope through the rangefinder sensor. The same authors,
Mousa et al. [59], progressed their work further in [60] by introducing L1 regularization for fault detection
and missing data points in real-time sensor applications. The proposed study also utilised ANN to
compensate for the change in the environmental condition, accounting for how such change affects the
readings obtained from sensors. In this ANN approach, the readings from multiple temperature sensors
were obtained and the temperature variation between the ultrasonic sensor and the ground was determined in
order to compensate for the error. The study highlights the fact that acquired data from sensors may
not always be reliable as sensors may be damaged or covered with dirt; thus, early warning monitoring
systems can issue false alarms.
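The physical effect that this temperature compensation targets can be made concrete with the standard approximation for the speed of sound in air, c ≈ 331.3 + 0.606 T (m/s). This is a textbook relation used here for illustration; it is not necessarily the model learned by the ANN in the cited work:

```python
def speed_of_sound(temp_c):
    """Approximate speed of sound in air (m/s) as a function of air
    temperature in degrees Celsius."""
    return 331.3 + 0.606 * temp_c

def compensated_distance(echo_time_s, temp_c):
    """Ultrasonic distance with the temperature-dependent sound speed."""
    return speed_of_sound(temp_c) * echo_time_s / 2.0

# The same 20 ms echo reads about 21 cm differently at 0 C vs 35 C,
# which matters for centimetre-scale water-level monitoring.
d_cold = compensated_distance(0.020, 0.0)
d_hot = compensated_distance(0.020, 35.0)
```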
The need to overcome the problem of false alarms has been of interest to several researchers.
For example, Ancona et al. [61] proposed a technique that comprises intelligent sensors and 3D map
techniques to forecast flooding while minimising false alarms. Horita et al. [62] validated WSN data
about flooding with data reported by the citizens. In most cases, the sensors either were out of order
or were not able to take the measurement. In a similar approach, Neal et al. [63] proposed a Kalman
filter with WSN to improve the accuracy of the data coming from the sensors for flood forecasting.
Ray et al. [64] discussed the IoT protocols utilized in the literature. Perumal et al. [65] proposed
an IoT-enabled water monitoring station. Furthermore, Moreno et al. [66] and Purnomo et al. [67] proposed
an early detection system by embedding rainfall, river slope and temperature sensors to monitor a
continuous change in water level and forecast flash flooding. Moreover, Mostafa et al. [68] proposed a
WSN along with a multi-agent system to classify whether the data coming from the sensors are valid or
invalid. The study suggested an optimal model for aggregation and classification, which is divided into
three steps, namely, a sensor verification phase, a data aggregation and classification phase, and a
database interaction step.
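A rule-based stand-in for such a sensor verification phase can be sketched as below: a reading is rejected if it falls outside the sensor's physical range or jumps implausibly fast between consecutive samples. The range and jump limits are illustrative assumptions, not the multi-agent classifier of the cited study:

```python
def validate_reading(reading, previous, min_m=0.0, max_m=10.0, max_jump_m=0.5):
    """Flag a water-level reading as invalid if it lies outside the
    sensor's physical range, or if it differs from the previous sample
    by more than a plausible per-interval change."""
    if not (min_m <= reading <= max_m):
        return False
    if previous is not None and abs(reading - previous) > max_jump_m:
        return False
    return True
```

Readings that fail these checks would be excluded from aggregation rather than forwarded to the early warning logic, reducing false alarms from damaged or dirty sensors.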
The review of the IoT-related literature presented above has revealed the potential of IoT-based
sensors in early warning systems. The most obvious finding to emerge from this subsection is that
sensor-based approaches are more accurate in terms of calculating water level. However, the limitation
of such approaches is that they only offer a reading at a single point and the only way to validate
the reading is to visit the site due to the unavailability of visual data. Furthermore, we highlight the
relevant studies that have focused on the IoT-based sensors as shown in Table 3.
coastal areas. The experimental results showed that the proposed approach of UAV photogrammetry
and GIS offers cheaper and faster information without compromising accuracy. In a similar approach,
Beni et al. [90] aimed to extract the water surface from images taken by UAVs [91]. The DEM was
generated by utilizing the data points collected via the UAV. The data was then compared with the
LiDAR sensor data from a satellite. The study found that data collected from UAVs are more accurate
than LiDAR sensor data with an approximately 30 cm difference between the models.
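The model comparison described here reduces to a simple statistic over two co-registered elevation grids; a minimal sketch (the toy DEM values are illustrative, not from the study):

```python
import numpy as np

def mean_vertical_difference(dem_a, dem_b):
    """Mean absolute elevation difference between two co-registered
    DEM grids, the kind of statistic used to compare a UAV-derived
    surface model against a LiDAR reference."""
    return float(np.mean(np.abs(np.asarray(dem_a) - np.asarray(dem_b))))

uav_dem = np.array([[10.1, 10.4], [10.2, 10.3]])
lidar_dem = np.array([[10.4, 10.7], [10.5, 10.6]])
diff = mean_vertical_difference(uav_dem, lidar_dem)  # 0.3 m on this toy grid
```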
There is also research progress in classifying water surface from Landsat images. Landsat provides
open-source data, but it suffers from low resolution. Isikdogan et al. [92] proposed an algorithm to
segment out the surface water from land, clouds, ice, snow and shadows by using only Landsat bands
as an input. Currently, classification of the water surface from Landsat images suffers from false
positives. This situation arises mainly from the presence of cloud and terrain shadows; other causes
include variations in ice and snow thresholds across regions. The classification model takes
the context of an image into account during the classification of an image. The proposed approach
emphasized that the DeepWaterMap classification model works well across different terrain types and
changing atmospheric conditions. The comparison between different models (conventional MLP and
DeepWaterMap with one, three and five CNN layer blocks) can be seen in Table 4.
Table 4. Comparison of the conventional MLP and DeepWaterMap models with one, three and five
convolutional blocks [92].
Moreover, Kang et al. [93] introduced an FCN-16 model based on fully convolutional networks
(FCNs) for the mapping of flood events. The proposed approach improved the overall accuracy over
FCNs by 0.0015 to 0.0058 under different test environments. The comparison
between FCNs and FCN-16 can be seen in Table 5. Furthermore, in a study by Gebrehiwot et al. [94],
the pre-trained FCN-16 model was further trained to extract flooded areas from UAV imagery.
The FCN-16 model achieved an accuracy of 95% as compared to the 87.4% accuracy obtained with
support vector machine (SVM). The confusion matrix was used to analyse the performance of the
algorithm [94].
Table 5. Comparison between fully convolutional networks (FCNs) and the proposed algorithm
FCN-16 [93].
Instead of utilizing a satellite-based approach, wall-mounted cameras can be utilized for mapping of
the flooded areas [95]. Lo et al. [96] introduced an image-based early warning system to instantaneously
monitor and map a flooded area. This utilizes the existing video surveillance system and image
processing techniques. The proposed method overcomes the need for a “staff gauge” or ruler to measure
the water level. In this approach, the GrowCut region-segmentation method was used to map the
flooded area. During segmentation, the boundary between background and foreground was determined
with the aid of the cellular automata (CA) algorithm. In a similar approach to
GrowCut, Horng et al. [97] proposed a mean-shift clustering algorithm and region growing image
segmentation algorithm to identify flooded areas and calculate the flood risk associated with the rise in
water level. The proposed approach works well, as the purpose of applying region growing on top
of mean-shift is to group the pixels into meaningful clusters and analyse the variation in the growing
region by comparing it with previous frames. Taking a slightly different approach, Narayanan et al. [98]
utilized the feature-matching scale-invariant feature transform (SIFT) algorithm to find common
features between two pictures of the same building, one taken before the flood and the other after.
To improve the generalizability of the algorithm,
this study can be repeated on multiple images and sites.
Admittedly, some sites that require frequent monitoring are harder to access, but UAVs can provide
a cost-effective alternative approach for real-time monitoring. Images taken from UAVs can support
the localization, detection, segmentation and modelling of the flood [99]. Feng et al. [100] utilized
drones to survey urban land to predict flood events. The reason for choosing UAV over static cameras
was the ease of data collection at different locations. The study proposed a hybrid method combining
texture analysis with the random forest algorithm. The proposed solution achieved an overall accuracy
of 87.3% with a kappa index of 0.746, and the study showed that adding texture analysis of the images
increased the accuracy by up to 11.2%. Important
highlights of this study include an emphasis on utilizing a UAV platform for the monitoring of complex
urban landscapes as well as the use of object-based information analysis (OBIA) to further increase the
accuracy. Similarly, Popescu et al. [101] proposed an approach based on the analysis of texture feature
and sliding box method via UAVs. The input image was divided into sub-images and classified into
two classes, i.e., flooded or not flooded. The proposed algorithm was evaluated on ten images and
achieved an accuracy of 98.57%. Even though the evaluation of this method could have benefitted
from the use of a larger number of images, this level of performance is considered outstanding.
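The sliding-box idea can be illustrated as below. Intensity variance is used here as a crude stand-in for the texture features of the cited study, and the box size and threshold are invented for the example:

```python
import numpy as np

def classify_boxes(img, box=4, var_thresh=100.0):
    """Slide a non-overlapping box over the image; label each sub-image
    flooded (True) when its intensity variance is low, i.e. the surface
    is smooth like standing water."""
    h, w = img.shape
    labels = {}
    for r in range(0, h - box + 1, box):
        for c in range(0, w - box + 1, box):
            patch = img[r:r + box, c:c + box].astype(float)
            labels[(r, c)] = patch.var() < var_thresh
    return labels

rng = np.random.default_rng(0)
img = rng.integers(0, 255, size=(8, 8)).astype(np.uint8)  # textured "land"
img[0:4, 0:4] = 120                                       # smooth "water"
labels = classify_boxes(img, box=4)
print(labels[(0, 0)])  # True: the uniform patch is classified as flooded
```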
Similarly, Sumalan et al. [102] developed an algorithm to classify images taken from UAVs into three
classes, namely, grass, buildings and flooded area. The algorithm is based on a local binary pattern
extended to the red and green channels of the RGB domain and to the H channel of HSV. A UAV was
used to collect the images, and the dataset was grouped into the three categories. The histograms of
each class were grouped together, and the histogram of a new image was compared against the three
groups to predict its class. Instead of using the histogram approach, deep learning can be
utilized to classify the images and the videos collected from UAVs autonomously into disaster and
non-disaster categories [103]. Kamilaris et al. [104] utilized a deep learning model based on visual
geometry group (VGG) to determine whether an image should be categorized as depicting a disaster.
The study used 544 images, some of the non-disaster type and the others relating to disasters such as
fires, earthquakes, collapsed buildings, tsunamis and flooding. By employing data augmentation
techniques on this small dataset, an accuracy of 91% was achieved, with a suggestion
that this accuracy could reach 95% with a larger dataset.
that this accuracy can reach 95% with a larger dataset. However, little is known about how the accuracy
of VGG architecture compares with other existing state-of-the-art CNN architectures.
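The grouped-histogram comparison described above for Sumalan et al. [102] amounts to a nearest-histogram classifier, sketched below. The reference histograms, class names and bin count are synthetic placeholders, not the cited study's data:

```python
import numpy as np

def histogram(img, bins=8):
    """Normalized intensity histogram used as the image descriptor."""
    h, _ = np.histogram(img, bins=bins, range=(0, 256))
    return h / h.sum()

def classify(img, class_hists, bins=8):
    """Assign the class whose reference histogram is closest (L1 distance)."""
    h = histogram(img, bins)
    return min(class_hists, key=lambda c: np.abs(class_hists[c] - h).sum())

# Hypothetical reference histograms for three classes, each built from one
# synthetic exemplar (a real system would average many training images).
dark = np.full((4, 4), 40, dtype=np.uint8)     # "flooded area"
mid = np.full((4, 4), 130, dtype=np.uint8)     # "grass"
bright = np.full((4, 4), 230, dtype=np.uint8)  # "buildings"
refs = {"flooded": histogram(dark), "grass": histogram(mid),
        "buildings": histogram(bright)}
query = np.full((4, 4), 45, dtype=np.uint8)
print(classify(query, refs))  # flooded
```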
This subsection presented a critical analysis of the cited literature in terms of the proposed technologies
and the types of experimental setup used for mapping flood events. The analysis shows that data
for mapping flood events can be collected from ground, spaceborne and airborne sources. The review
of the cited literature also indicates that there is no single, general approach that always works;
the performance of the chosen method is highly dependent on the application and the location of
the visual sensor.
of luminance/chrominance (YUV) transforms, defining a region of interest (ROI) that helps to improve
the accuracy of debris flow identification. To improve the generalizability of the algorithm, it will be
important to consider how to handle the definition of thresholds for the different filters, which will
change with time and place. Langhammer et al. [116] presented a novel approach to detect objects during flooding
events through UAVs equipped with panchromatic cameras. The study proposed a workflow that
uses a method of texture analysis, photogrammetric analysis and a classification model based on a
2D ortho-photograph and a 3D digital elevation model (DEM). The accuracy of the model depends
upon the combination of image information (RGB, texture analysis, terrain ruggedness index (TRI) and
DEM) that the model uses during evaluation. The comparison of the classification accuracy for the
different combinations of input features can be seen in Table 6.
Flood debris detection, monitoring and assessment of the damage caused by debris flow are
applications of computer vision. This subsection highlighted how computer vision approaches can be
used to detect and map debris flow in a running stream.
4.4. Computer Vision in Estimating Surface Water Velocity for Hydrodynamic Modelling of Floods
This subsection includes research where computer vision is used to estimate the flow rate and
surface water velocity for hydrodynamic modelling. Finding the flow rate of water is of extreme
importance in hydrological modelling and flood inundation mapping [117]. Optical flow is a method in
computer vision that has been used to detect the movement of objects between two consecutive frames
in a video sequence [118]. Harjoko et al. [119] successfully utilised the pyramidal Lucas–Kanade optical
flow method for determining the flow rate of water in a case study of a dam. The directional vectors
within the ROI are derived from the coordinates of the moving objects, and objects whose directional
vector is parallel to the flow direction are used to calculate the flow velocity. Discharge is also considered
useful for modelling the relationship between rainfall and flash floods [120]. Al-Mamari et al. [120]
utilised the large-scale particle image velocimetry (LSPIV) and the space-time image velocimetry
(STIV) techniques to model the river discharge and established the relationship between high-intensity
rainfall and flash floods. The study concluded that the flow was two-dimensional and time varying.
However, the direction of the flow pattern was still determined with reasonable accuracy.
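The directional filtering used to turn tracked motion into a surface velocity can be sketched as follows. The displacement vectors, frame rate and pixel-to-metre scale below are invented for the example, not measurements from the cited studies:

```python
import numpy as np

def surface_velocity(displacements, flow_dir, fps, m_per_px, cos_tol=0.9):
    """Average the speed of tracked features whose displacement is roughly
    parallel to the known flow direction; other vectors (ripples, glare)
    are discarded, echoing the directional filtering described above."""
    d = np.asarray(displacements, dtype=float)   # (N, 2) px per frame
    u = np.asarray(flow_dir, dtype=float)
    u = u / np.linalg.norm(u)
    norms = np.linalg.norm(d, axis=1)
    cos = (d @ u) / np.where(norms == 0, 1, norms)
    keep = cos > cos_tol                         # nearly parallel only
    speed_px = norms[keep].mean()                # px per frame
    return speed_px * fps * m_per_px             # metres per second

# Hypothetical tracked displacements between two consecutive frames; the
# last vector moves cross-stream and is rejected by the filter.
disp = [(4.0, 0.1), (3.8, -0.2), (0.2, 3.0)]
v = surface_velocity(disp, flow_dir=(1, 0), fps=25, m_per_px=0.01)
print(round(v, 2))  # 0.98
```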
In a similar approach, Fujita et al. [121] studied the impact of snow-melting on floods by measuring
the velocity and direction of the water. The far infrared (FIR) camera was utilized along with STIV
techniques to conduct this study. Comparisons were made among readily available sensors such
as acoustic Doppler current profilers (ADCPs), radio-wave velocity meters and image processing
techniques. The study supported the use of image-based techniques, as the discrepancy between the
ADCP and FIR camera measurements was less than 10%. The suggested direction for future work includes examining the
effect of rainfall and wind on the accuracy of STIV measurements.
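The space-time image idea behind STIV can be illustrated with a much-simplified sketch: a space-time image (rows = time, columns = position along a line in the flow direction) is built from a synthetic drifting pattern, and the drift speed is recovered via circular cross-correlation rather than the streak-angle detection used in practice. All values are illustrative:

```python
import numpy as np

def sti_velocity(sti, m_per_px, fps):
    """Estimate advection speed from a space-time image by finding the
    circular cross-correlation shift between its first and last rows."""
    first = sti[0] - sti[0].mean()
    last = sti[-1] - sti[-1].mean()
    corr = np.real(np.fft.ifft(np.fft.fft(last) * np.conj(np.fft.fft(first))))
    shift_px = np.argmax(corr)          # total displacement in pixels
    frames = sti.shape[0] - 1
    return (shift_px / frames) * fps * m_per_px

# Synthetic space-time image: a random texture drifting 2 px per frame.
rng = np.random.default_rng(1)
pattern = rng.random(64)
sti = np.stack([np.roll(pattern, 2 * t) for t in range(10)])
v = sti_velocity(sti, m_per_px=0.05, fps=20)
print(round(v, 3))  # 2.0 (2 px/frame x 20 fps x 0.05 m/px)
```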
Addressed requirements (Accuracy, Generalization): + = Average; ++ = Good; +++ = State of the Art.

| Purpose | Article | Type of Information | Proposed Method | Accuracy | Generalization | Scope of the Study |
|---|---|---|---|---|---|---|
| A (water level estimation/early warning system) | [18] | Static Ground Camera | Difference Method | + | + | Real-world, tested on one river |
| | [19] | Static Ground Camera | Logistic Regression and WSN | ++ | +++ | Real-world, tested on thirteen rivers |
| | [20] | Static Ground Camera | CNN Architecture | ++ | ++ | Real-world, tested on six scenes |
| | [22] | Static Ground Camera | Image Texture Features | + | + | Real-world, tested on one river |
| | [23] | Static Ground Camera | Accumulated Histogram and Bandpass Filter | + | Not Addressed | In-lab experiment |
| | [24] | Static Ground Camera | Edge Detector and Far Infrared (FIR) Filter | + | Not Addressed | In-lab experiment |
| | [25] | Static Ground Near Infrared (NIR) Camera | OSF-based Adaptive Thresholding | +++ | ++ | Real-world, tested on one river |
| | [27] | UAV-Mounted Camera | Canny Filter Thresholding | ++ | + | Real-world, tested on one dam |
| | [28] | Static Ground IP Cameras | Image Texture-based Segmentation | ++ | ++ | Real-world, tested on one river |
| B (surface water velocity for hydrodynamic modelling) | [119] | Static Ground Camera | Pyramidal Lucas–Kanade Optical Flow Method | ++ | ++ | Real-world, tested on one river |
| | [120] | Static Ground Camera | LSPIV and STIV Techniques | +++ | ++ | Real-world, tested on one river |
| | [121] | Static Ground FIR Camera | STIV Technique | +++ | ++ | Real-world, tested on one river |
Addressed requirements (Accuracy, Generalization): + = Average; ++ = Good; +++ = State of the Art.

| Purpose | Article | Type of Information | Proposed Method | Accuracy | Generalization | Scope of the Study |
|---|---|---|---|---|---|---|
| C (flood-related data collection) | [70] | Static Ground Camera | Tiramisu image segmentation algorithm along with database | ++ | +++ | Real-world, multiple locations |
| | [71,72] | Social Media | Flood image segmentation dataset | Not Addressed | Not Addressed | Real-world, multiple locations |
| | [73] | Spaceborne | ResNet-50 along with flood image database | +++ | +++ | Real-world, multiple locations |
| | [76] | UAV-Mounted Camera | Digital Terrain Elevation (DTE) dataset collection | Not Addressed | Not Addressed | Real-world, multiple locations |
| | [77] | UAV-Mounted Camera | Fuzzy C-means model to cluster images and database collection | ++ | ++ | Real-world, multiple locations |
| | [122] | UAV-Mounted Camera | Stereo image collection for floods | Not Addressed | Not Addressed | Real-world, multiple locations |
| D (flood risk management) | [89] | UAV-Mounted Camera | Aerial image inspection with Geographical Information System (GIS) data points | ++ | ++ | Real-world, tested on coastal environment |
| | [90] | UAV-Mounted Camera | Digital Elevation Model (DEM) data collection via UAVs | ++ | ++ | Real-world, tested on one site but can expand out to other sites |
| | [100] | UAV-Mounted Camera | Fusion of random forest and texture analysis | ++ | ++ | Real-world, multiple locations |
| E (debris flow detection) | [115] | Static Ground Camera | Spatial filtering and luminance/chrominance (YUV) transforms | ++ | + | Real-world, tested on one site |
| | [116] | UAV Panchromatic Camera | Texture analysis and DEM | ++ | ++ | Real-world, tested on one site |
Addressed requirements (Accuracy, Generalization): + = Average; ++ = Good; +++ = State of the Art.

| Purpose | Article | Type of Information | Proposed Method | Accuracy | Generalization | Scope of the Study |
|---|---|---|---|---|---|---|
| F (flood detection and inundation mapping) | [75] | Spaceborne | Near real-time monitoring by triggering TerraSAR-X | ++ | +++ | Real-world, multiple locations |
| | [82] | Spaceborne | Fusion of MMI with DSM | ++ | ++ | Real-world, multiple locations |
| | [83] | Spaceborne | Image retrieval and classification software based on CNN | +++ | +++ | Real-world, multiple locations |
| | [85] | Spaceborne | Modest AdaBoost and spatiotemporal context | ++ | ++ | Real-world, multiple locations |
| | [86] | Spaceborne | Gaussian kernels and Support Vector Machine (SVM) | ++ | +++ | Real-world, multiple locations |
| | [87] | UAV | Optimized route planning for UAV | + | + | Real-world, UAV path planning for flood monitoring |
| | [88] | UAV-Mounted Camera | Texture analysis and fractal technique | ++ | + | Real-world, tested on a big dataset |
| | [92] | Spaceborne | Convolutional Neural Network (CNN) architecture | ++ | ++ | Real-world, multiple locations |
| | [93] | Spaceborne | FCN-16 CNN | ++ | ++ | Real-world, multiple locations |
| | [96] | Static Ground Camera | GrowCut method and cellular automata (CA) algorithm | ++ | + | Real-world, tested on one river |
| | [97] | Static Ground Camera | Mean-shift and region growing | + | + | Real-world, tested on one river |
| | [98] | Static Ground Camera | SIFT algorithm | + | Not Addressed | In-lab experiment |
| | [101] | UAV-Mounted Camera | Texture feature analysis | + | Not Addressed | Real-world, tested on ten images |
| | [102] | UAV-Mounted Camera | Accumulated histogram and clustering of images into groups | + | + | Real-world, multiple locations |
| | [104] | UAV-Mounted Camera | VGG–CNN with a custom dense layer | ++ | + | Real-world, CNN trained on 444 images and tested on 100 images |
| | [106] | Spaceborne | Fusion of radar SAR and optical data | ++ | ++ | Real-world, multiple locations |
| | [107] | Spaceborne | Fusion of Landsat images with DEM | ++ | ++ | Real-world, multiple locations |
| | [108] | Spaceborne | Hierarchical clustering approach | + | ++ | Real-world, multiple locations |
| | [109] | Spaceborne | Fusion of water-level sensor and satellite images | + | + | Real-world, tested on one site |
| | [111] | Spaceborne | Fusion of static ground cameras and satellite images | ++ | ++ | Real-world, multiple locations |
| | [112] | UAV and Ultrasonic Sensor | Fusion of ultrasonic and DEM data collected from UAV to make a 3D model | ++ | ++ | Real-world, tested on one site but can expand out to other sites |
| | [113] | UAV-Mounted Camera | Fusion of GIS and aerial photography | ++ | ++ | Real-world, tested in urban environment |
| | [123] | Social Media | Pre-trained CNN on ImageNet with the addition of meta-data analysis | ++ | ++ | Real-world, tested on real images posted online |
| | [124–130] | Social Media | Fusion of contextual information with images | ++ | + | Real-world, tested on real images posted online |
| | [131] | Social Media | CNN architecture and meta-data analysis | ++ | + | Real-world, tested on real images posted online |
| | [94] | UAV-Mounted Camera | FCN-16 architecture | + | ++ | Real-world, tested on a big dataset |
Table 10. Main challenges addressed against possible solutions and future research.
As this review has shown, regular monitoring of flood-prone areas is a challenging task and a costly
activity for local governments [132]. This review has focused on studies that explore computer vision
or IoT-based sensors to monitor or map floods. The findings have covered water-level monitoring in
different sites that are of interest to understanding flood risks including residential street areas, rivers,
urban drainage networks, seas, dams, lakes, etc. However, there is still a lack of studies on computer
vision applications for the monitoring and management of coastal lagoons. Similarly, IoT-based sensors
have not been widely applied in lagoon monitoring. Coastal lagoons provide a variety of essential
services that are highly valued by society, including storm defence, boating, recreation, fishing,
tourism and natural habitats for aquatic life [133]. However, coastal lagoons also pose a significant
flood risk to residential areas adjacent to the lagoon foreshore. This risk is heightened by intense
rainfall that causes water to build up behind a closed lagoon entrance. Hence, in the following
sub-section we provide some recommendations for adopting computer vision and IoT sensors to
improve the monitoring of lagoon sites.
Recommended Future Research of Computer Vision and IoT Sensors in Monitoring Coastal Lagoons
Typically, coastal lagoons or lakes alternate between being closed and open to the ocean,
forming what is commonly referred to as intermittently closed and open lakes and lagoons (ICOLLs).
These are characterized by a berm, formed from sand and sediments deposited by winds, tides and
waves from the ocean. This berm helps to prevent further flow of ocean water into the lagoon, but rainfall
can cause the lagoon to overflow and inundate low-lying residential development. Knowing when to
dredge the berm is therefore crucial for effective flood management and this would require regular
monitoring of the water level in the lagoon, the berm height, berm composition and permeability
and any activity related to artificial opening of the sand berm entrance. Hence, we recommend that
future research explore the adoption of existing technology and techniques in computer vision and/or
IoT-based sensors to monitor ICOLLs including obtaining berm height, water level measurement
and improving decisions on when to open/close a lagoon entrance. The linear regression technique
presented in Reference [19] can serve as a starting point for finding berm height. This study estimated
the water level by finding the upper and lower limits of the dike area [19]. The adaptive method of
finding the dike area assumed the upper limit to be a straight line because of the noise and thresholding
limits in the proposed approach. This approach can be optimized to find the berm height. Moreover,
the coastal lagoon entrance can be segmented out from the water region by utilizing the inundation
mapping techniques such as region growing [97] and CNN architecture [92]. These techniques are
utilized to segment the water surface; to segment out the lagoon area, the CNN can be retrained with
the addition of one more class, i.e., the lagoon entrance. Adding this class requires collecting additional
data, from which a new feature map of the image must be learned. Hence, further research should be
undertaken to investigate semantic segmentation.
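The straight-line assumption for the upper limit of the dike/berm area described above can be sketched as a least-squares line fit over the topmost segmented pixel of each image column. The mask below is a toy example, not data from the cited study:

```python
import numpy as np

def upper_limit_line(mask):
    """Fit a straight line (least squares) to the topmost segmented pixel
    of each column, mirroring the straight-line assumption for the upper
    limit of the dike/berm area."""
    tops = []
    for col in mask.T:                    # iterate over image columns
        rows = np.flatnonzero(col)
        tops.append(rows[0] if rows.size else -1)
    tops = np.array(tops)
    cols = np.arange(mask.shape[1])
    valid = tops >= 0                     # columns with segmented pixels
    slope, intercept = np.polyfit(cols[valid], tops[valid], deg=1)
    return slope, intercept

# Toy mask: the segmented berm occupies every row from row 3 downwards,
# so the fitted upper limit is the horizontal line y = 3.
mask = np.zeros((10, 8), dtype=bool)
mask[3:, :] = True
slope, intercept = upper_limit_line(mask)
print(round(abs(slope), 6), round(intercept, 6))  # 0.0 3.0
```

The fitted intercept, converted through the camera calibration, would then serve as the berm-height estimate.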
Furthermore, future research should be undertaken to automate the control process of a lagoon
entrance by incorporating remote sensing and computer vision techniques. This will allow relevant
data to be collected and visualized to understand the impacts of a change in weather conditions on berm
height. At present, berm height is understood to be a product of wave run-up: a berm continues to
build until it reaches the maximum height of the wave run-up [134]. The wave run-up varies from
beach to beach and is directly affected by several factors such as beach slope, wave period, wave height
and weather conditions [134]. Future research can explore the numerical computation of
berm height using a mathematical model derived from experimentation. A data-oriented approach
could produce interesting findings that will let researchers generalize the findings from one site to
another lagoon site which may be behaving differently under different environmental conditions.
In monitoring lagoon water levels, it might be possible to utilize the data fusion approach where
data from the sensor can be fused with data from the camera to reduce false positives in water level
readings. The reason for data fusion is that water level derived from the camera can be adjusted
according to the single point IoT-based physical sensor so that readings can be obtained at multiple
points without having to deploy physical sensors at several locations. In other words, the field of view
(FOV) of the camera would give multiple points in the image, and each point can be considered as
one physical sensor. This would represent an improvement over current coastal monitoring, which is
done either manually or from images taken from space.
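The fusion idea described above can be sketched as a simple offset correction in which the camera-derived levels are anchored to the single physical gauge. The point names and readings below are illustrative assumptions:

```python
def fuse_levels(camera_levels, ref_point, gauge_reading):
    """Shift camera-derived water levels so that the value at the pixel
    co-located with the physical gauge matches the gauge reading; every
    other point in the field of view then acts as a corrected virtual
    sensor."""
    offset = gauge_reading - camera_levels[ref_point]
    return {p: level + offset for p, level in camera_levels.items()}

# Camera-derived levels (metres) at three points in the FOV; the single
# IoT gauge sits at point "P2" and reads 1.62 m.
camera = {"P1": 1.40, "P2": 1.55, "P3": 1.71}
fused = fuse_levels(camera, ref_point="P2", gauge_reading=1.62)
print(round(fused["P1"], 2))  # 1.47
```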
7. Conclusions
This paper presented a systematic review of the literature regarding computer vision and IoT-based
sensors for flood monitoring and mapping. The review found a wide range of applications in which
computer vision techniques and IoT-based sensors support improved monitoring and mapping of
floods, including, but not limited to, early warning systems, debris flow estimation, flood risk
management, flood inundation mapping and surface water velocity estimation. It was observed that
computer vision is advantageous for covering a broader area, since each point in the camera's field of
view (FOV) can be treated as one sensor when estimating water level, whereas IoT sensors are more
accurate but only deliver point-based readings. The shortcomings of each can therefore be addressed
through complementary use, fusing data from the two independent sources of information to improve
the accuracy of flood monitoring stations. It can be concluded that IoT-based sensor networks are
essential in real-time monitoring of flooding, as they provide instant information about water levels
thereby helping the responsible authorities to understand the impact of heavy rainfall on the carrying
capacity of waterways so that adequate strategies can be put in place including the need for proactive
emergency evacuation. Importantly, this study has revealed a lack of research focused on exploring
computer vision or IoT-based sensors for improving the monitoring and management of coastal lagoon
sites. Hence, some recommendations were made to direct future research, particularly in relation to
monitoring berm heights in coastal lagoons.
Author Contributions: Conceptualization and Methodology: B.A. and R.O.; Original draft preparation: B.A. and
R.O.; writing—review and editing: B.A., R.O., B.P., N.V., J.B.; supervision: J.B. and P.P.
Funding: This work was supported by the Australian Government Department of Infrastructure, Transport,
Cities and Regional Development through the Smart Cities and Suburbs Program-Round two (Grant Number:
SCS69244).
Acknowledgments: Special thanks to the Illawarra-Shoalhaven Smart Water Management team for their support
and guidance.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Dilley, M.; Chen, R.S.; Deichmann, U.; Lerner-Lam, A.L.; Arnold, M. Natural Disaster Hotspots: A Global Risk
Analysis; The World Bank: Washington, DC, USA, 2005.
2. Mosquera-Machado, S.; Dilley, M. A comparison of selected global disaster risk assessment results.
Nat. Hazards 2009, 48, 439–456. [CrossRef]
3. Kuenzer, C.; Guo, H.; Huth, J.; Leinenkugel, P.; Li, X.; Dech, S. Flood mapping and flood dynamics of the
Mekong Delta: ENVISAT-ASAR-WSM based time series analyses. Remote Sens. 2013, 5, 687–715. [CrossRef]
4. CRED. EM-DAT: The OFDA/CRED International Disaster Database; Catholic University of Leuven: Brussels,
Belgium, 2019.
5. Basha, E.; Rus, D. Design of early warning flood detection systems for developing countries. In Proceedings
of the 2007 International Conference on Information and Communication Technologies and Development,
Bangalore, India, 15–16 December 2007.
6. Jonkman, S.N.; Kelman, I. An Analysis of the Causes and Circumstances of Flood Disaster Deaths. Disasters
2005, 29, 75–97. [CrossRef]
7. Hirabayashi, Y.; Mahendran, R.; Koirala, S.; Konoshima, L.; Yamazaki, D.; Watanabe, S.; Kim, H.; Kanae, S.
Global flood risk under climate change. Nat. Clim. Chang. 2013, 3, 816–821. [CrossRef]
8. Ko, B.; Kwak, S. Survey of computer vision–based natural disaster warning systems. Opt. Eng. 2012, 51, 70901.
[CrossRef]
9. Kanwal, K.; Liaquat, A.; Mughal, M.; Abbasi, A.R.; Aamir, M. Towards development of a low cost early fire
detection system using wireless sensor network and machine vision. Wirel. Pers. Commun. 2017, 95, 475–489.
[CrossRef]
10. Blaschke, T. Object-based image analysis for remote sensing. ISPRS J. Photogramm. Remote Sens. 2010, 65, 2–16.
[CrossRef]
11. Bande, S.; Shete, V.V. Smart flood disaster prediction system using IoT & neural networks. In Proceedings
of the 2017 International Conference on Smart Technologies for Smart Nation (SmartTechCon), Bangalore,
India, 17–19 August 2017.
12. Barthélemy, J.; Verstaevel, N.; Forehead, H.; Perez, P. Edge-Computing Video Analytics for Real-Time Traffic
Monitoring in a Smart City. Sensors 2019, 19, 2048. [CrossRef] [PubMed]
13. Yuliandoko, H.; Subono, S.; Wardhani, V.A.; Pramono, S.H.; Suwindarto, P. Design of Flood Warning System
Based IoT and Water Characteristics. Telkomnika (Telecommun. Comput. Electron. Control.) 2018, 16, 2101–2110.
[CrossRef]
14. Keung, K.L.; Lee, C.K.M.; Ng, K.K.H.; Yeung, C.K. Smart city application and analysis: Real-time urban
drainage monitoring by iot sensors: A case study of Hong Kong. In Proceedings of the 2018 IEEE
International Conference on Industrial Engineering and Engineering Management (IEEM), Bangkok, Thailand,
16–19 December 2018.
15. McInnes, M.D.F.; Moher, D.; Thombs, B.D.; McGrath, T.A.; Bossuyt, P.M.; the PRISMA-DTA Group; Clifford, T.;
Cohen, J.F.; Deeks, J.J.; Gatsonis, C.; et al. Preferred Reporting Items for a Systematic Review and Meta-analysis
of Diagnostic Test Accuracy Studies: The PRISMA-DTA Statement. JAMA 2018, 319, 388–396. [CrossRef]
16. Mocanu, B.; Tapu, R.; Zaharia, T. When ultrasonic sensors and computer vision join forces for efficient
obstacle detection and recognition. Sensors 2016, 16, 1807. [CrossRef] [PubMed]
17. Basha, E.A.; Ravela, S.; Rus, D. Model-based monitoring for early warning flood detection. In Proceedings of
the 6th ACM Conference on Embedded Network Sensor Systems, Raleigh, NC, USA, 5–7 November 2008.
18. Yu, J.; Hahn, H. Remote detection and monitoring of a water level using narrow band channel. J. Inf. Sci. Eng.
2010, 26, 71–82.
19. Hiroi, K.; Kawaguchi, N. FloodEye: Real-time flash flood prediction system for urban complex water flow.
In Proceedings of the 2016 IEEE SENSORS, Orlando, FL, USA, 30 October–3 November 2016.
20. Pan, J.; Yin, Y.; Xiong, J.; Luo, W.; Gui, G.; Sari, H. Deep Learning-based unmanned surveillance systems for
observing water levels. IEEE Access 2018, 6, 73561–73571. [CrossRef]
21. Rankin, A.L.; Matthies, L.H.; Huertas, A. Daytime water detection by fusing multiple cues for autonomous
off-road navigation. In Transformational Science and Technology for the Current and Future Force; World Scientific:
Singapore, 2006; pp. 177–184.
22. Rankin, A.; Matthies, L. Daytime water detection based on color variation. In Proceedings of the 2010
IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010.
23. Park, S.; Lee, N.; Han, Y.; Hahn, H. The water level detection algorithm using the accumulated histogram
with band pass filter. Acad. Sci. Eng. Technol. 2009, 56, 193–197.
24. Udomsiri, S.; Iwahashi, M. Design of FIR filter for water level detection. World Acad. Sci. Technol. 2008,
48, 47–52.
25. Zhang, Z.; Zhou, Y.; Liu, H.; Gao, H. In-situ water level measurement using NIR-imaging video camera.
Flow Meas. Instrum. 2019, 67, 95–106. [CrossRef]
26. Hsu, S.-Y.; Chen, T.-B.; Du, W.-C.; Wu, J.-H.; Chen, S.-C. Integrate Weather Radar and Monitoring Devices for
Urban Flooding Surveillance. Sensors 2019, 19, 825. [CrossRef]
27. Ridolfi, E.; Manciola, P. Water Level Measurements from Drones: A Pilot Case Study at a Dam Site. Water
2018, 10, 297. [CrossRef]
28. Lo, S.-W.; Wu, J.-H.; Lin, F.-P.; Hsu, C.-H. Visual Sensing for Urban Flood Monitoring. Sensors 2015,
15, 20006–20029. [CrossRef]
29. Krzhizhanovskaya, V.; Shirshov, G.; Melnikova, N.; Belleman, R.; Rusadi, F.; Broekhuijsen, B.; Gouldby, B.;
Lhomme, J.; Balis, B.; Bubak, M.; et al. Flood early warning system: Design, implementation and computational
modules. Procedia Comput. Sci. 2011, 4, 106–115. [CrossRef]
30. Bączyk, A.; Piwiński, J.; Kłoda, R.; Grygoruk, M. Survey on river water level measuring technologies:
Case study for flood management purposes of the C2-SENSE project. In Proceedings of the International
Conference on Systems, Control and Information Technologies, Warsaw, Poland, 20–21 May 2016.
31. Rachman, S.; Pratomo, I.; Mita, N. Design of low cost wireless sensor networks-based environmental
monitoring system for developing country. In Proceedings of the 2008 14th Asia-Pacific Conference on
Communications, Tokyo, Japan, 14–16 October 2008.
32. Hagedorn, P.; Wallaschek, J. Travelling wave ultrasonic motors, Part I: Working principle and mathematical
modelling of the stator. J. Sound Vib. 1992, 155, 31–46. [CrossRef]
33. Ward, D.; Petty, A.; Setterfield, S.; Douglas, M.; Ferdinands, K.; Hamilton, S.; Phinn, S. Floodplain inundation
and vegetation dynamics in the Alligator Rivers region (Kakadu) of northern Australia assessed using optical
and radar remote sensing. Remote Sens. Environ. 2014, 147, 43–55. [CrossRef]
34. Lin, L.; Di, L.; Yu, E.G.; Kang, L.; Shrestha, R.; Rahman, M.S.; Tang, J.; Deng, M.; Sun, Z.; Zhang, C.; et al.
A review of remote sensing in flood assessment. In Proceedings of the 2016 Fifth International Conference
on Agro-Geoinformatics (Agro-Geoinformatics), Tianjin, China, 18–20 July 2016.
35. Khan, T.A.; Alam, M.; Kadir, K.; Shahid, Z.; Mazliham, S.M. A Novel approach for the investigation of
flash floods using soil flux and CO2 : An implementation of MLP with less false alarm rate. In Proceedings
of the 2018 2nd International Conference on Smart Sensors and Application (ICSSA), Kuching, Malaysia,
24–26 July 2018.
36. Noar, N.A.Z.M.; Kamal, M.M. The development of smart flood monitoring system using ultrasonic sensor with
blynk applications. In Proceedings of the 2017 IEEE 4th International Conference on Smart Instrumentation,
Measurement and Application (ICSIMA), Putrajaya, Malaysia, 28–30 November 2017.
37. Purkovic, D.; Coates, L.; Hönsch, M.; Lumbeck, D.; Schmidt, F. Smart river monitoring and early flood
detection system in Japan developed with the EnOcean long range sensor technology. In Proceedings of the
2019 2nd International Colloquium on Smart Grid Metrology (SMAGRIMET), Split, Croatia, 9–12 April 2019.
38. Kafli, N.; Isa, K. Internet of Things (IoT) for measuring and monitoring sensors data of water surface platform.
In Proceedings of the 2017 IEEE 7th International Conference on Underwater System Technology: Theory
and Applications (USYS), Kuala Lumpur, Malaysia, 18–20 December 2017.
39. Chandanala, R.; Zhang, W.; Stoleru, R.; Won, M. On combining network coding with duty-cycling in
flood-based wireless sensor networks. Ad Hoc Netw. 2013, 11, 490–507. [CrossRef]
40. Lin, Y.B.; Lai, J.S.; Chang, K.C.; Li, L.S. Flood scour monitoring system using fiber Bragg grating sensors.
Smart Mater. Struct. 2006, 15, 1950. [CrossRef]
41. Thekkil, T.M.; Prabakaran, N. Real-time WSN based early flood detection and control monitoring system.
In Proceedings of the 2017 International Conference on Intelligent Computing, Instrumentation and Control
Technologies (ICICICT), Kannur, India, 6–7 July 2017.
42. Balaji, V.; Akshaya, A.; Jayashree, N.; Karthika, T. Design of ZigBee based wireless sensor network for early
flood monitoring and warning system. In Proceedings of the 2017 IEEE Technological Innovations in ICT for
Agriculture and Rural Development (TIAR), Chennai, India, 7–8 April 2017.
43. Pratama, A.; Munadi, R.; Mayasari, R. Design and implementation of flood detector using wireless sensor
network with mamdani’s fuzzy logic method. In Proceedings of the 2017 2nd International conferences on
Information Technology, Information Systems and Electrical Engineering (ICITISEE), Yogyakarta, Indonesia,
1–2 November 2017.
44. Al-Assadi, W.K.; Gandla, S.; Sedigh, S.; Dugganapally, I.P. Design of a flood prediction system. In Proceedings
of the 2009 12th International IEEE Conference on Intelligent Transportation Systems, St. Louis, MO, USA,
4–7 October 2009.
45. Ogie, R.; Shukla, N.; Sedlar, F.; Holderness, T. Optimal placement of water-level sensors to facilitate
data-driven management of hydrological infrastructure assets in coastal mega-cities of developing nations.
Sustain. Cities Soc. 2017, 35, 385–395. [CrossRef]
46. Abdelkader, M.; Shaqura, M.; Claudel, C.G.; Gueaieb, W. A UAV based system for real time flash flood
monitoring in desert environments using Lagrangian microsensors. In Proceedings of the 2013 International
Conference on Unmanned Aircraft Systems (ICUAS), Atlanta, GA, USA, 28–31 May 2013.
47. Zhuang, W.Y.; Junior, M.C.; Cheong, P.; Tam, K.W. Flood monitoring of distribution substation in low-lying
areas using Wireless Sensor Network. In Proceedings of the 2011 International Conference on System Science
and Engineering, Macao, China, 8–10 June 2011.
48. Garcia, F.C.C.; Retamar, A.E.; Javier, J.C. A real time urban flood monitoring system for metro Manila.
In Proceedings of the TENCON 2015-2015 IEEE Region 10 Conference, Macao, China, 1–4 November 2015.
49. Napiah, M.N.; Idris, M.Y.I.; Ahmedy, I.; Ngadi, M.A. Flood alerts system with android application.
In Proceedings of the 2017 6th ICT International Student Project Conference (ICT-ISPC), Skudai, Malaysia,
23–24 May 2017.
50. Intharasombat, O.; Khoenkaw, P. A low-cost flash flood monitoring system. In Proceedings of the 2015
7th International Conference on Information Technology and Electrical Engineering (ICITEE), Chiang Mai,
Thailand, 29–30 October 2015.
51. Jayashree, S.; Sarika, S.; Solai, A.L.; Prathibha, S. A novel approach for early flood warning using android
and IoT. In Proceedings of the 2017 2nd International Conference on Computing and Communications
Technologies (ICCCT), Chennai, India, 23–24 February 2017.
52. Teixidó, P.; Gómez-Galán, J.A.; Gómez-Bravo, F.; Sánchez-Rodríguez, T.; Alcina, J.; Aponte, J. Low-Power
Low-Cost Wireless Flood Sensor for Smart Home Systems. Sensors 2018, 18, 3817. [CrossRef]
53. Smith, P.J.; Hughes, D.; Beven, K.J.; Cross, P.; Tych, W.; Coulson, G.; Blair, G.S. Towards the provision of site specific flood warnings using wireless sensor networks. Meteorol. Appl. 2009, 16, 57–64. [CrossRef]
54. Yumang, A.N.; Paglinawan, C.C.; Paglinawan, A.C.; Avendaño, G.O.; Esteves, J.A.C.; Pagaduan, J.R.P.;
Selda, J.D.S. Real-time flood water level monitoring system with SMS notification. In Proceedings of the 2017
IEEE 9th International Conference on Humanoid, Nanotechnology, Information Technology, Communication
and Control, Environment and Management (HNICEM), Manila, Philippines, 1–3 December 2017.
Sensors 2019, 19, 5012 25 of 28
55. Terzic, J.; Nagarajah, C.; Alamgir, M. Fluid level measurement in dynamic environments using a single
ultrasonic sensor and Support Vector Machine (SVM). Sens. Actuators A Phys. 2010, 161, 278–287. [CrossRef]
56. Widiasari, I.R.; Nugroho, L.E. Deep learning multilayer perceptron (MLP) for flood prediction model
using wireless sensor network based hydrology time series data mining. In Proceedings of the 2017
International Conference on Innovative and Creative Information Technology (ICITech), Salatiga, Indonesia,
2–4 November 2017.
57. Khan, T.A.; Alam, M.; Shahid, Z.; Ahmed, S.F.; Mazliham, M.S. Artificial Intelligence based Multi-modal
sensing for flash flood investigation. In Proceedings of the 2018 IEEE 5th International Conference on
Engineering Technologies and Applied Sciences (ICETAS), Bangkok, Thailand, 22–23 November 2018.
58. Cruz, F.R.G.; Binag, M.G.; Ga, M.R.G.; Uy, F.A.A. Flood Prediction Using Multi-Layer Artificial Neural
Network in Monitoring System with Rain Gauge, Water Level, Soil Moisture Sensors. In Proceedings of the
TENCON 2018-2018 IEEE Region 10 Conference, Jeju, Korea, 28–31 October 2018.
59. Mousa, M.; Oudat, E.; Claudel, C. A novel dual traffic/flash flood monitoring system using passive
infrared/ultrasonic sensors. In Proceedings of the 2015 IEEE 12th International Conference on Mobile Ad
Hoc and Sensor Systems, Dallas, TX, USA, 19–22 October 2015.
60. Mousa, M.; Zhang, X.; Claudel, C. Flash Flood Detection in Urban Cities Using Ultrasonic and Infrared Sensors. IEEE Sens. J. 2016, 16, 7204–7216. [CrossRef]
61. Ancona, M.; Corradi, N.; Dellacasa, A.; Delzanno, G.; Dugelay, J.-L.; Federici, B.; Gourbesville, P.; Guerrini, G.; La Camera, A.; Rosso, P.; et al. On the design of an intelligent sensor network for flash flood monitoring, diagnosis and management in urban areas: Position paper. Procedia Comput. Sci. 2014, 32, 941–946. [CrossRef]
62. Horita, F.E.; De Albuquerque, J.P.; Degrossi, L.C.; Mendiondo, E.M.; Ueyama, J. Development of a spatial
decision support system for flood risk management in Brazil that combines volunteered geographic
information with wireless sensor networks. Comput. Geosci. 2015, 80, 84–94. [CrossRef]
63. Neal, J.C.; Atkinson, P.M.; Hutton, C.W. Adaptive space–time sampling with wireless sensor nodes for flood
forecasting. J. Hydrol. 2012, 414, 136–147. [CrossRef]
64. Ray, P.P.; Mukherjee, M.; Shu, L. Internet of Things for Disaster Management: State-of-the-Art and Prospects.
IEEE Access 2017, 5, 18818–18835. [CrossRef]
65. Perumal, T.; Sulaiman, M.N.; Leong, C.Y. Internet of Things (IoT) enabled water monitoring system.
In Proceedings of the 2015 IEEE 4th Global Conference on Consumer Electronics (GCCE), Osaka, Japan,
27–30 October 2015.
66. Moreno, C.; Aquino, R.; Ibarreche, J.; Pérez, I.; Castellanos, E.; Álvarez, E.; Rentería, R.; Anguiano, L.;
Edwards, A.; Lepper, P.; et al. RiverCore: IoT Device for River Water Level Monitoring over Cellular
Communications. Sensors 2019, 19, 127. [CrossRef]
67. Purnomo, R.; Pamungkas, M.H.; Arrofi, D.; Goni, A. Flood prediction using integrated sensor based on
internet of thing and radio frequency as flood risk reduction. AIP Conf. Proc. 2018, 1987, 020070.
68. Mostafa, E.; Mohamed, E. Intelligent data classification and aggregation in wireless sensors for flood
forecasting system. In Proceedings of the 2014 Mediterranean Microwave Symposium (MMS2014), Marrakech,
Morocco, 12–14 December 2014.
69. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet classification with deep convolutional neural networks. Available online: http://papers.nips.cc/paper/4824-imagenet-classification-with-deep-convolutional-neural-networks.pdf (accessed on 16 November 2019).
70. Lopez-Fuentes, L.; Rossi, C.; Skinnemoen, H. River segmentation for flood monitoring. In Proceedings of
the 2017 IEEE International Conference on Big Data (Big Data), Boston, MA, USA, 11–14 December 2017;
pp. 3746–3749.
71. Wang, R.-Q.; Mao, H.; Wang, Y.; Rae, C.; Shaw, W. Hyper-resolution monitoring of urban flooding with social
media and crowdsourcing data. Comput. Geosci. 2018, 111, 139–147. [CrossRef]
72. Arthur, R.; Boulton, C.A.; Shotton, H.; Williams, H.T. Social sensing of floods in the UK. PLoS ONE 2018,
13, e0189327. [CrossRef]
73. Helber, P.; Bischke, B.; Dengel, A.; Borth, D. EuroSAT: A novel dataset and deep learning benchmark for land use and land cover classification. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 2217–2226. [CrossRef]
74. Rahnemoonfar, M.; Murphy, R.; Miquel, M.V.; Dobbs, D.; Adams, A. Flooded area detection from uav images
based on densely connected recurrent neural networks. In Proceedings of the IGARSS 2018-2018 IEEE
International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018.
75. Martinis, S.; Twele, A.; Strobl, C.; Kersten, J.; Stein, E. A Multi-Scale Flood Monitoring System Based on Fully
Automatic MODIS and TerraSAR-X Processing Chains. Remote. Sens. 2013, 5, 5598–5619. [CrossRef]
76. Mourato, S.; Fernandez, P.; Pereira, L.; Moreira, M. Improving a DSM Obtained by Unmanned Aerial Vehicles
for Flood Modelling. IOP Conf. Ser. Earth Environ. Sci. 2017, 95, 22014. [CrossRef]
77. Wang, Y.; Zhang, C.; Zhang, Y.; Huang, H.; Feng, L. Obtaining land cover type for urban storm flood model
in UAV images using MRF and MKFCM clustering techniques. ISPRS Int. J. Geo-Inf. 2019, 8, 205. [CrossRef]
78. Wagner, W.; Hollaus, M.; Briese, C.; Ducic, V. 3D vegetation mapping using small-footprint full-waveform
airborne laser scanners. Int. J. Remote. Sens. 2008, 29, 1433–1452. [CrossRef]
79. Jain, S.K.; Saraf, A.K.; Goswami, A.; Ahmad, T. Flood inundation mapping using NOAA AVHRR data.
Water Resour. Manag. 2006, 20, 949–959. [CrossRef]
80. Mueller, N.; Lewis, A.; Roberts, D.; Ring, S.; Melrose, R.; Sixsmith, J.; Lymburner, L.; McIntyre, A.; Tan, P.;
Curnow, S.; et al. Water observations from space: Mapping surface water from 25 years of Landsat imagery
across Australia. Remote. Sens. Environ. 2016, 174, 341–352. [CrossRef]
81. Van der Sande, C.J.; De Jong, S.M.; De Roo, A.P.J. A segmentation and classification approach of IKONOS-2
imagery for land cover mapping to assist flood risk and flood damage assessment. Int. J. Appl. Earth
Obs. Geoinf. 2003, 4, 217–229. [CrossRef]
82. Horkaew, P.; Puttinaovarat, S. Entropy-Based Fusion of Water Indices and DSM Derivatives for Automatic
Water Surfaces Extraction and Flood Monitoring. ISPRS Int. J. Geo-Inf. 2017, 6, 301. [CrossRef]
83. Li, S.; Sun, D.; Goldberg, M.D.; Sjoberg, B.; Santek, D.; Hoffman, J.P.; Deweese, M.; Restrepo, P.; Lindsey, S.;
Holloway, E. Automatic near real-time flood detection using Suomi-NPP/VIIRS data. Remote. Sens. Environ.
2018, 204, 672–689. [CrossRef]
84. Shen, L.; Li, C. Water body extraction from Landsat ETM+ imagery using adaboost algorithm. In Proceedings
of the 2010 18th International Conference on Geoinformatics, Beijing, China, 18–20 June 2010.
85. Liu, X.; Sahli, H.; Meng, Y.; Huang, Q.; Lin, L. Flood inundation mapping from optical satellite images using
Spatiotemporal Context Learning and Modest AdaBoost. Remote. Sens. 2017, 9, 617. [CrossRef]
86. Hakdaoui, S.; Emran, A.; Pradhan, B.; Lee, C.-W.; Fils, S.C.N. A collaborative change detection approach
on multi-sensor spatial imagery for desert wetland monitoring after a flash flood in Southern Morocco.
Remote. Sens. 2019, 11, 1042. [CrossRef]
87. Malandrino, F.; Chiasserini, C.-F.; Casetti, C.; Chiaraviglio, L.; Senacheribbe, A. Planning UAV activities for
efficient user coverage in disaster areas. Ad Hoc Netw. 2019, 89, 177–185. [CrossRef]
88. Popescu, D.; Ichim, L.; Stoican, F. Unmanned aerial vehicle systems for remote estimation of flooded areas
based on complex image processing. Sensors 2017, 17, 446. [CrossRef] [PubMed]
89. Casella, E.; Rovere, A.; Pedroncini, A.; Mucerino, L.; Casella, M.; Cusati, L.A.; Vacchi, M.; Ferrari, M.; Firpo, M.
Study of wave runup using numerical models and low-altitude aerial photogrammetry: A tool for coastal
management. Estuar. Coast. Shelf Sci. 2014, 149, 160–167. [CrossRef]
90. Hashemi-Beni, L.; Jones, J.; Thompson, G.; Johnson, C.; Gebrehiwot, A. Challenges and opportunities for UAV-based digital elevation model generation for flood-risk management: A case of Princeville, North Carolina. Sensors 2018, 18, 3843. [CrossRef] [PubMed]
91. Langhammer, J.; Lendzioch, T.; Miřijovský, J.; Hartvich, F. UAV-based optical granulometry as tool for
detecting changes in structure of flood depositions. Remote. Sens. 2017, 9, 240. [CrossRef]
92. Isikdogan, F.; Bovik, A.C.; Passalacqua, P. Surface water mapping by deep learning. IEEE J. Sel. Top. Appl.
Earth Obs. Remote. Sens. 2017, 10, 4909–4918. [CrossRef]
93. Kang, W.; Xiang, Y.; Wang, F.; Wan, L.; You, H. Flood detection in gaofen-3 SAR images via fully convolutional
networks. Sensors 2018, 18, 2915. [CrossRef]
94. Gebrehiwot, A.; Hashemi-Beni, L.; Thompson, G.; Kordjamshidi, P.; Langan, T.E. Deep convolutional neural
network for flood extent mapping using unmanned aerial vehicles data. Sensors 2019, 19, 1486. [CrossRef]
95. Moy de Vitry, M.; Dicht, S.; Leitão, J.P. floodX: Urban flash flood experiments monitored with conventional and alternative sensors. Earth Syst. Sci. Data 2017, 9, 657–666. [CrossRef]
96. Lo, S.-W.; Wu, J.-H.; Lin, F.-P.; Hsu, C.-H. Cyber surveillance for flood disasters. Sensors 2015, 15, 2369–2387.
[CrossRef]
97. Jyh-Horng, W.; Chien-Hao, T.; Lun-Chi, C.; Shi-Wei, L.; Fang-Pang, L. Automated Image Identification
Method for Flood Disaster Monitoring in Riverine Environments: A Case Study in Taiwan. In Proceedings
of the AASRI International Conference on Industrial Electronics and Applications (IEA 2015), London, UK,
27–28 June 2015.
98. Narayanan, R.; Lekshmy, V.M.; Rao, S.; Sasidhar, K. A novel approach to urban flood monitoring using
computer vision. In Proceedings of the Fifth International Conference on Computing, Communications and
Networking Technologies (ICCCNT), Hefei, China, 11–13 July 2014.
99. Zakaria, S.; Mahadi, M.R.; Abdullah, A.F.; Abdan, K. Aerial platform reliability for flood monitoring under
various weather conditions: A review. In Proceedings of the GeoInformation for Disaster Management
Conference, Istanbul, Turkey, 18–21 March 2018.
100. Feng, Q.; Liu, J.; Gong, J. Urban flood mapping based on unmanned aerial vehicle remote sensing and
random forest classifier—A case of Yuyao, China. Water 2015, 7, 1437–1455. [CrossRef]
101. Popescu, D.; Ichim, L.; Caramihale, T. Flood areas detection based on UAV surveillance system.
In Proceedings of the 2015 19th International Conference on System Theory, Control and Computing
(ICSTCC), Cheile Gradistei, Romania, 14–16 October 2015.
102. Sumalan, A.L.; Popescu, D.; Ichim, L. Flooded areas detection based on LBP from UAV images. In Recent Advances on Systems, Signals, Control, Communications and Computers; WSEAS Press: Budapest, Hungary, 2015; pp. 186–191.
103. Carrio, A.; Sampedro, C.; Rodriguez-Ramos, A.; Campoy, P. A review of deep learning methods and
applications for unmanned aerial vehicles. J. Sens. 2017, 2017, 1–13. [CrossRef]
104. Kamilaris, A.; Prenafeta-Boldú, F.X. Disaster monitoring using unmanned aerial vehicles and deep learning.
arXiv 2018, arXiv:1807.11805.
105. Zeng, Y.; Huang, W.; Liu, M.; Zhang, H.; Zou, B. Fusion of satellite images in urban area: Assessing the
quality of resulting images. In Proceedings of the 2010 18th International Conference on Geoinformatics,
Beijing, China, 18–20 June 2010.
106. Zoka, M.; Psomiadis, E.; Dercas, N. The complementary use of optical and SAR data in monitoring flood events and their effects. Proceedings 2018, 2, 644. [CrossRef]
107. Chaouch, N.; Temimi, M.; Hagen, S.; Weishampel, J.; Medeiros, S.; Khanbilvardi, R. A synergetic use of
satellite imagery from SAR and optical sensors to improve coastal flood mapping in the Gulf of Mexico.
Hydrol. Process. 2012, 26, 1617–1628. [CrossRef]
108. Senthilnath, J.; Rajendra, R.; Suresh, S.; Kulkarni, S.; Benediktsson, J.A. Hierarchical clustering approaches
for flood assessment using multi-sensor satellite images. Int. J. Image Data Fusion 2019, 10, 28–44. [CrossRef]
109. Khan, S.I.; Hong, Y.; Gourley, J.J.; Khattak, M.U.; De Groeve, T. Multi-sensor imaging and space-ground
cross-validation for 2010 flood along Indus River, Pakistan. Remote. Sens. 2014, 6, 2393–2407. [CrossRef]
110. Xu, B.; Da Xu, L.; Cai, H.; Xie, C.; Hu, J.; Bu, F. Ubiquitous Data Accessing Method in IoT-Based Information
System for Emergency Medical Services. IEEE Trans. Ind. Inform. 2014, 10, 1578–1586.
111. Balkaya, C.; Casciati, F.; Casciati, S.; Faravelli, L.; Vece, M. Real-time identification of disaster areas by an
open-access vision-based tool. Adv. Eng. Softw. 2015, 88, 83–90. [CrossRef]
112. Langhammer, J.; Bernsteinová, J.; Miřijovský, J. Building a high-precision 2D hydrodynamic flood model
using UAV photogrammetry and sensor network monitoring. Water 2017, 9, 861. [CrossRef]
113. Zhu, Z.J.; Jiang, A.Z.; Lai, J.; Xiang, Y.; Baird, B.; McBean, E. Towards efficient use of an unmanned aerial
vehicle for urban flood monitoring. J. Water Manag. Model. 2017. [CrossRef]
114. Liu, C.C.; Chen, P.L.; Matsuo, T.; Chen, C.Y. Rapidly responding to landslides and debris flow events using a
low-cost unmanned aerial vehicle. J. Appl. Remote. Sens. 2015, 9, 96016. [CrossRef]
115. Kao, H.M.; Ren, H.; Lee, C.S.; Chen, Y.L.; Lin, Y.S.; Su, Y. Monitoring debris flows using spatial filtering and
entropy determination approaches. Terr. Atmos. Ocean. Sci. 2013, 24, 773. [CrossRef]
116. Langhammer, J.; Vacková, T. Detection and mapping of the geomorphic effects of flooding using UAV
photogrammetry. Pure Appl. Geophys. 2018, 175, 83–105. [CrossRef]
117. Goderniaux, P.; Brouyère, S.; Fowler, H.J.; Blenkinsop, S.; Therrien, R.; Orban, P.; Dassargues, A. Large scale
surface–subsurface hydrological model to assess climate change impacts on groundwater reserves. J. Hydrol.
2009, 373, 122–138. [CrossRef]
118. Yacoob, Y.; Davis, L.S. Recognizing human facial expressions from long image sequences using optical flow.
IEEE Trans. Pattern Anal. Mach. Intell. 1996, 18, 636–642. [CrossRef]
119. Harjoko, A.; Awaludin, L.; Hujja, R.M. The flow rate of debris estimation on the Sabo Dam area with video
processing. In Proceedings of the 2017 International Conference on Signals and Systems (ICSigSys), Sanur,
Indonesia, 16–18 May 2017.
120. Al-Mamari, M.M.; Kantoush, S.A.; Kobayashi, S.; Sumi, T.; Saber, M. Real-Time Measurement of Flash-Flood
in a Wadi Area by LSPIV and STIV. Hydrology 2019, 6, 27. [CrossRef]
121. Fujita, I. Discharge measurements of snowmelt flood by Space-Time Image Velocimetry during the night
using far-infrared camera. Water 2017, 9, 269. [CrossRef]
122. Sullivan, J.L.; McFaden, S.; Engel, T. Using Remote Data Collection to Identify Bridges and Culverts Susceptible
to Blockage during Flooding Events; University of Vermont, Transportation Research Center: Burlington,
VT, USA, 2016.
123. Ahmad, S.; Ahmad, K.; Ahmad, N.; Conci, N. Convolutional neural networks for disaster images retrieval.
In Proceedings of the MediaEval, Dublin, Ireland, 13–15 September 2017.
124. Dao, M.S.; Pham, Q.N.M.; Nguyen, D.; Tien, D. A domain-based late-fusion for disaster image retrieval from
social media. In Proceedings of the MediaEval 2017 Multimedia Benchmark Workshop, Dublin, Ireland,
13–15 September 2017.
125. Zhao, Z.; Larson, M. Retrieving Social Flooding Images Based on Multimodal Information. 2017. Available
online: http://ceur-ws.org/Vol-1984/Mediaeval_2017_paper_40.pdf (accessed on 16 November 2019).
126. Flood Detection Using Social Media Data and Spectral Regression Based Kernel Discriminant Analysis.
Available online: http://slim-sig.irisa.fr/me17/Mediaeval_2017_paper_43.pdf (accessed on 11 November 2019).
127. BMC@MediaEval 2017 Multimedia Satellite Task via Regression Random Forest. Available online: http://slim-sig.irisa.fr/me17/Mediaeval_2017_paper_46.pdf (accessed on 11 November 2019).
128. CNN and GAN Based Satellite and Social Media Data Fusion for Disaster Detection. Available online: https://www.researchgate.net/profile/Michael_Riegler/publication/319774098_CNN_and_GAN_Based_Satellite_and_Social_Media_Data_Fusion_for_Disaster_Detection/links/59bc1161a6fdcca8e5624836/CNN-and-GAN-Based-Satellite-and-Social-Media-Data-Fusion-for-Disaster-Detection.pdf (accessed on 11 November 2019).
129. Visual and Textual Analysis of Social Media and Satellite Images for Flood Detection@ Multimedia Satellite
Task MediaEval 2017. Available online: http://slim-sig.irisa.fr/me17/Mediaeval_2017_paper_31.pdf (accessed
on 11 November 2019).
130. Detection of Flooding Events in Social Multimedia and Satellite Imagery Using Deep Neural Networks. Available online: https://pdfs.semanticscholar.org/3118/eed6edfc0ecabf14968906832510e4898e7f.pdf (accessed on 11 November 2019).
131. Feng, Y.; Sester, M. Extraction of pluvial flood relevant volunteered geographic information (VGI) by deep
learning from user generated texts and photos. ISPRS Int. J. Geo-Inf. 2018, 7, 39. [CrossRef]
132. Bourgeau-Chavez, L.L.; Smith, K.B.; Brunzell, S.M.; Kasischke, E.S.; Romanowicz, E.A.; Richardson, C.J.
Remote monitoring of regional inundation patterns and hydroperiod in the Greater Everglades using
Synthetic Aperture Radar. Wetlands 2005, 25, 176–191. [CrossRef]
133. Opaluch, J.J.; Anthony, A.; Atwood, J.; August, P.; Byron, C.; Cobb, S.; Foster, C.; Fry, C.; Hagos, K.;
Heffner, L.; et al. Coastal lagoons and climate change: Ecological and social ramifications in the US Atlantic
and Gulf coast ecosystems. Ecol. Soc. 2009, 14, 8.
134. Hanslow, D.J.; Davis, G.A.; You, B.Z.; Zastawny, J. Berm height at coastal lagoon entrances in NSW. Available online: https://www.researchgate.net/profile/David_Hanslow/publication/258918589_BERM_HEIGHT_AT_COASTAL_LAGOON_ENTRANCES_IN_NSW/links/0c96052966e61cc447000000/BERM-HEIGHT-AT-COASTAL-LAGOON-ENTRANCES-IN-NSW.pdf (accessed on 16 November 2019).
© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).