Remote Sensing: acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy and processing, analyzing, and applying that information.
Components of Remote Sensing
• Energy Source or Illumination (A) - The first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.
The shorter the wavelength, the higher the frequency; the longer the wavelength, the lower the frequency.
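As a quick numeric illustration of this inverse relationship, the sketch below uses the relation frequency = c / wavelength; the two wavelengths are just example values.

# Wavelength and frequency are inversely related: c = wavelength * frequency
C = 3.0e8  # speed of light, m/s

def frequency_hz(wavelength_m):
    """Frequency (Hz) of electromagnetic radiation with the given wavelength (m)."""
    return C / wavelength_m

print(frequency_hz(0.45e-6))  # ~6.7e14 Hz for 0.45 µm (blue light): short wavelength, high frequency
print(frequency_hz(0.05))     # ~6.0e9 Hz for 5 cm (microwave): long wavelength, low frequency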
• What is the obvious source of electromagnetic energy that
you can think of? What "remote sensing device" do you
personally use to detect this energy?
• The Electromagnetic Spectrum
The electromagnetic spectrum ranges from the shorter wavelengths (including gamma
and x-rays) to the longer wavelengths (including microwaves and broadcast radio waves).
There are several regions of the electromagnetic spectrum which are useful for remote
sensing.
• The ultraviolet or UV portion of the
spectrum has the shortest wavelengths
which are practical for remote sensing.
This radiation is just beyond the violet
portion of the visible wavelengths, hence
its name. Some Earth surface materials,
primarily rocks and minerals, fluoresce or
emit visible light when illuminated by UV
radiation.
• The light which our eyes - our "remote
sensors" - can detect is part of the visible
spectrum. It is important to recognize how
small the visible portion is relative to the
rest of the spectrum.
• There is a lot of radiation around us which
is "invisible" to our eyes, but can be
detected by other remote sensing
instruments and used to our advantage.
• The visible wavelengths cover a range
from approximately 0.4 to 0.7 µm. The
longest visible wavelength is red and the
shortest is violet.
• Blue, green, and red are the
primary colours or wavelengths
of the visible spectrum. They
are defined as such because no
single primary colour can be
created from the other two, but
all other colours can be formed
by combining blue, green, and
red in various proportions.
• Rayleigh scattering occurs when particles
are very small compared to the wavelength
of the radiation. These could be particles
such as small specks of dust or nitrogen
and oxygen molecules. Rayleigh scattering
causes shorter wavelengths of energy to be
scattered much more than longer
wavelengths. Rayleigh scattering is the
dominant scattering mechanism in the
upper atmosphere.
• Mie scattering occurs when the particles
are just about the same size as the
wavelength of the radiation. Dust, pollen,
smoke and water vapour are common
causes of Mie scattering which tends to
affect longer wavelengths than those
affected by Rayleigh scattering. Mie
scattering occurs mostly in the lower
portions of the atmosphere where larger
particles are more abundant, and
dominates when cloud conditions are
overcast.
• Nonselective scattering occurs
when the particles are much larger
than the wavelength of the
radiation. Water droplets and large
dust particles can cause this type of
scattering. Nonselective scattering
gets its name from the fact that all
wavelengths are scattered about
equally. This type of scattering
causes fog and clouds to appear
white to our eyes because blue,
green, and red light are all scattered
in approximately equal quantities
(blue+green+red light = white light).
Absorption is the other main mechanism at work when electromagnetic radiation
interacts with the atmosphere. In contrast to scattering, this phenomenon causes
molecules in the atmosphere to absorb energy at various wavelengths. Ozone,
carbon dioxide, and water vapour are the three main atmospheric constituents
which absorb radiation.
Most remote sensing systems avoid detecting and recording wavelengths in the ultraviolet and blue portions of the spectrum. Why?
Hint: the sky is blue!
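One way to see why: Rayleigh scattering grows roughly as 1/wavelength⁴, so the blue end of the spectrum is scattered far more strongly than the red end. The sketch below is illustrative only; the two wavelengths are example values.

# Rayleigh scattering strength is roughly proportional to 1 / wavelength**4,
# which is why short (blue) wavelengths are scattered so heavily by the atmosphere.
def relative_rayleigh(wavelength_um):
    return 1.0 / wavelength_um ** 4

blue = relative_rayleigh(0.45)  # blue light
red = relative_rayleigh(0.65)   # red light
print(blue / red)  # ~4.4: blue is scattered about four times as strongly as red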
Radiation - Target Interactions
• Radiation that is not absorbed or
scattered in the atmosphere can reach
and interact with the Earth's surface.
There are three (3) forms of interaction
that can take place when energy strikes, or
is incident (I) upon the surface. These are:
absorption (A); transmission (T); and
reflection (R).
• Absorption (A) occurs when radiation
(energy) is absorbed into the target while
transmission (T) occurs when radiation
passes through a target. Reflection (R)
occurs when radiation "bounces" off the
target and is redirected.
• In remote sensing, we are most interested
in measuring the radiation reflected from
targets.
• Depending on the complex make-up of the target that is being looked at, and the
wavelengths of radiation involved, we can observe very different responses to the
mechanisms of absorption, transmission, and reflection.
• By measuring the energy that is reflected (or emitted) by targets on the Earth's surface
over a variety of different wavelengths, we can build up a spectral response for that
object.
• By comparing the response patterns of different features we may be able to distinguish
between them, where we might not be able to, if we only compared them at one
wavelength. For example, water and vegetation may reflect somewhat similarly in the
visible wavelengths but are almost always separable in the infrared.
• Spectral response can be quite variable, even for the same target type, and can also vary
with time (e.g. "green-ness" of leaves) and location.
• Knowing where to "look" spectrally and understanding the factors which influence the
spectral response of the features of interest are critical to correctly interpreting the
interaction of electromagnetic radiation with the surface.
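As a concrete illustration of the water/vegetation example above, a simple band ratio such as NDVI exploits the infrared difference; this is a minimal sketch with made-up reflectance values, not output from any particular sensor.

# NDVI = (NIR - Red) / (NIR + Red): vegetation reflects strongly in the near-infrared,
# water hardly at all, so the ratio separates the two even if they look similar in the visible.
def ndvi(nir, red):
    return (nir - red) / (nir + red)

print(ndvi(nir=0.45, red=0.08))  # ~0.70 for healthy vegetation (illustrative values)
print(ndvi(nir=0.02, red=0.05))  # ~-0.43 for open water (illustrative values)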
Passive vs. Active Sensing
Remote sensing systems which measure energy
that is naturally available are called passive
sensors.
Passive sensors can only be used to detect energy
when the naturally occurring energy is available.
For all reflected energy, this can only take place
during the time when the sun is illuminating the
Earth.
Active sensors, on the other hand, provide their own energy source for
illumination. The sensor emits radiation which is directed toward the target to
be investigated. The radiation reflected from that target is detected and
measured by the sensor. Advantages for active sensors include the ability to
obtain measurements anytime, regardless of the time of day or season.
Some examples of active sensors are a laser fluorosensor and a synthetic
aperture radar (SAR).
• Did you know?
• a camera provides an excellent example of both
passive and active sensors. During a bright sunny day,
enough sunlight is illuminating the targets and then
reflecting toward the camera lens, that the camera
simply records the radiation provided (passive
mode). On a cloudy day or inside a room, there is
often not enough sunlight for the camera to record
the targets adequately. Instead, it uses its own
energy source - a flash - to illuminate the targets and
record the radiation reflected from them (active
mode).
Characteristics of Images
• Electromagnetic energy may be detected either
photographically or electronically. The
photographic process uses chemical reactions
on the surface of light-sensitive film to detect
and record energy variations.
• It is important to distinguish between the
terms images and photographs in remote
sensing.
• An image refers to any pictorial representation,
regardless of what wavelengths or remote
sensing device has been used to detect and
record the electromagnetic energy.
• A photograph refers specifically to images that have been detected as well as recorded on photographic film.
• Photos are normally recorded over the wavelength range from 0.3 µm to 0.9 µm - the visible and reflected infrared. Based on these definitions, we can say that all photographs are images, but not all images are photographs.
• A photograph could also be represented and displayed in a digital format by subdividing the image into small equal-sized and shaped areas, called picture elements or pixels, and representing the brightness of each area with a numeric value or digital number.
• Can you imagine what the world would look like if we could only see very
narrow ranges of wavelengths or colours? That is how many sensors work.
• The information from a narrow wavelength range is gathered and stored in a
channel, also sometimes referred to as a band.
• We can combine and display channels of information digitally using the three
primary colours (blue, green, and red).
• The data from each channel is represented as one of the primary colours and,
depending on the relative brightness (i.e. the digital value) of each pixel in each
channel, the primary colours combine in different proportions to represent
different colours.
• When we use this method to display a single channel or range of wavelengths,
we are actually displaying that channel through all three primary colours.
Because the brightness level of each pixel is the same for each primary colour,
they combine to form a black and white image, showing various shades of gray
from black to white.
• When we display more than one channel each as a different primary colour,
then the brightness levels may be different for each channel/primary colour
combination and they will combine to form a colour image.
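A minimal NumPy sketch of the display logic described above, assuming three band arrays have already been read into memory; the pixel values here are synthetic.

import numpy as np

# Synthetic single-band images (digital numbers 0-255) standing in for three channels.
rng = np.random.default_rng(0)
blue_band = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
green_band = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)
red_band = rng.integers(0, 256, size=(4, 4), dtype=np.uint8)

# One channel displayed through all three primary colours -> every pixel has equal R, G, B,
# so the result is a black-and-white (grey-scale) image.
grayscale = np.stack([green_band, green_band, green_band], axis=-1)

# A different channel through each primary colour -> brightness levels differ per colour,
# so the channels combine into a colour composite.
colour = np.stack([red_band, green_band, blue_band], axis=-1)  # (rows, cols, RGB)

print(grayscale.shape, colour.shape)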
DID YOU KNOW?
Photographic film has the clear advantage of recording extremely fine spatial detail, since individual silver halide molecules can record light sensitivity differently than their neighbouring molecules. But when it comes to spectral and radiometric qualities, digital sensors outperform film, by being able to use extremely fine spectral bands (for spectral 'fingerprinting' of targets), and recording up to many thousands of levels of brightness.
Satellites and Sensors
• On the Ground, In the Air, In Space
• Ground-based sensors are often used to record detailed information about the
surface which is compared with information collected from aircraft or satellite
sensors. In some cases, this can be used to better characterize the target which is
being imaged by these other sensors, making it possible to better understand the
information in the imagery.
• Sensors may be placed on a ladder, scaffolding, tall building, crane, etc. Aerial
platforms are primarily stable wing aircraft, although helicopters are occasionally
used. Aircraft are often used to collect very detailed images and facilitate the
collection of data over virtually any portion of the Earth's surface at any time.
• In space, remote sensing is sometimes conducted from
the space shuttle or, more commonly, from satellites.
• Satellites are objects which revolve around another
object - in this case, the Earth. For example, the moon is
a natural satellite, whereas man-made satellites include
those platforms launched for remote sensing,
communication, and telemetry (location and navigation)
purposes.
• Because of their orbits, satellites permit repetitive
coverage of the Earth's surface on a continuing basis.
Cost is often a significant factor in choosing among the various platform options.
• Satellite Characteristics: Orbits and Swaths
• The path followed by a satellite is referred to as its
orbit.
• Satellite orbits are matched to the capability and
objective of the sensor(s) they carry.
• Orbit selection can vary in terms of altitude (their
height above the Earth's surface) and their
orientation and rotation relative to the Earth.
• Satellites at very high altitudes, which view the
same portion of the Earth's surface at all times
have geostationary orbits.
These geostationary satellites, at altitudes of approximately 36,000
kilometres, revolve at speeds which match the rotation of the Earth so
they seem stationary, relative to the Earth's surface. This allows the
satellites to observe and collect information continuously over specific
areas. Weather and communications satellites commonly have these types of orbit.
• As a satellite revolves around the Earth, the sensor
"sees" a certain portion of the Earth's surface. The
area imaged on the surface, is referred to as the
swath.
• Imaging swaths for spaceborne sensors generally
vary between tens and hundreds of kilometres wide.
• As the satellite orbits the Earth from pole to pole, its
east-west position wouldn't change if the Earth
didn't rotate. However, as seen from the Earth, it
seems that the satellite is shifting westward because
the Earth is rotating (from west to east) beneath it.
This apparent movement allows the satellite swath
to cover a new area with each consecutive pass.
• The satellite's orbit and the rotation of the Earth
work together to allow complete coverage of the
Earth's surface, after it has completed one complete
cycle of orbits
• If we start with any randomly selected pass in a
satellite's orbit, an orbit cycle will be completed
when the satellite retraces its path, passing over the
same point on the Earth's surface directly below the
satellite (called the nadir point) for a second time.
• The exact length of time of the orbital cycle will vary
with each satellite. The interval of time required for
the satellite to complete its orbit cycle is not the same as the "revisit period".
Spatial Resolution, Pixel Size, and Scale
• The detail discernible in an image is
dependent on the spatial resolution of the
sensor and refers to the size of the smallest
possible feature that can be detected.
• Spatial resolution of passive sensors (we will
look at the special case of active microwave
sensors later) depends primarily on their
Instantaneous Field of View (IFOV).
• The IFOV is the angular cone of visibility of
the sensor (A) and determines the area on
the Earth's surface which is "seen" from a
given altitude at one particular moment in
time (B).
• The size of the area viewed is determined by
multiplying the IFOV by the distance from
the ground to the sensor (C). This area on
the ground is called the resolution cell and
determines a sensor's maximum spatial
resolution.
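A back-of-the-envelope version of step (C): under the small-angle approximation the resolution cell size is roughly the IFOV (in radians) multiplied by the sensor altitude. The numbers below are illustrative, not tied to any particular sensor.

# Ground resolution cell size from IFOV and altitude (small-angle approximation).
def resolution_cell_m(ifov_mrad, altitude_km):
    ifov_rad = ifov_mrad * 1e-3      # milliradians -> radians
    altitude_m = altitude_km * 1e3   # kilometres -> metres
    return ifov_rad * altitude_m

# e.g. an IFOV of 0.086 mrad viewed from 700 km altitude gives a cell of roughly 60 m
print(resolution_cell_m(0.086, 700))  # ~60.2 m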
• Most remote sensing images are composed of a matrix
of picture elements, or pixels, which are the smallest
units of an image.
• Image pixels are normally square and represent a certain
area on an image.
• It is important to distinguish between pixel size and
spatial resolution - they are not interchangeable.
• If a sensor has a spatial resolution of 20 metres and an
image from that sensor is displayed at full resolution,
each pixel represents an area of 20m x 20m on the
ground. In this case the pixel size and resolution are the
same. However, it is possible to display an image with a
pixel size different than the resolution.
• Many posters of satellite images of the Earth have their
pixels averaged to represent larger areas, although the
original spatial resolution of the sensor that collected
the imagery remains the same.
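The poster example above amounts to block-averaging pixels for display; a minimal sketch, assuming the image dimensions divide evenly by the averaging factor.

import numpy as np

def block_average(image, factor):
    """Average factor x factor blocks so each displayed pixel represents a larger ground area."""
    rows, cols = image.shape
    return image.reshape(rows // factor, factor, cols // factor, factor).mean(axis=(1, 3))

image = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 image
print(block_average(image, 2))  # 2x2 display image; the sensor's spatial resolution is unchanged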
• Images where only large features are visible are said to have
coarse or low resolution.
• In fine or high resolution images, small objects can be detected.
• Military sensors for example, are designed to view as much detail
as possible, and therefore have very fine resolution.
• Commercial satellites provide imagery with resolutions varying
from a few metres to several kilometres.
• Generally speaking, the finer the resolution, the less total ground
area can be seen.
• The ratio of distance on an image or map, to actual
ground distance is referred to as scale. If you had a map
with a scale of 1:100,000, an object of 1cm length on
the map would actually be an object 100,000cm (1km)
long on the ground.
• Maps or images with small "map-to-ground ratios" are
referred to as small scale (e.g. 1:100,000), and those
with larger ratios (e.g. 1:5,000) are called large scale.
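The scale arithmetic above can be written out directly; a minimal sketch.

# Convert a distance measured on a map or image to a ground distance using the scale.
def ground_distance_m(map_distance_cm, scale_denominator):
    return map_distance_cm * scale_denominator / 100.0  # centimetres -> metres

print(ground_distance_m(1, 100_000))  # 1 cm on a 1:100,000 map = 1000 m (1 km) on the ground
print(ground_distance_m(1, 5_000))    # 1 cm on a 1:5,000 map = 50 m on the ground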
Spectral Resolution
• Different classes of features and details in an image can often be distinguished by
comparing their responses over distinct wavelength ranges. Broad classes, such as water
and vegetation, can usually be separated using very broad wavelength ranges - the visible
and near infrared .
• Other more specific classes, such as different rock types, may not be easily
distinguishable using either of these broad wavelength ranges and would require
comparison at much finer wavelength ranges to separate them. Thus, we would require a
sensor with higher spectral resolution. Spectral resolution describes the ability of a
sensor to define fine wavelength intervals. The finer the spectral resolution, the narrower
the wavelength range for a particular channel or band.
• Black and white film records wavelengths extending over
much, or all of the visible portion of the electromagnetic
spectrum. Its spectral resolution is fairly coarse, as the
various wavelengths of the visible spectrum are not
individually distinguished and the overall reflectance in the
entire visible portion is recorded.
• Colour film is also sensitive to the reflected energy over the
visible portion of the spectrum, but has higher spectral
resolution, as it is individually sensitive to the reflected
energy at the blue, green, and red wavelengths of the
spectrum. Thus, it can represent features of various colours
based on their reflectance in each of these distinct
wavelength ranges.
• Many remote sensing systems record energy over several separate wavelength ranges
at various spectral resolutions. These are referred to as multi-spectral sensors.
• Advanced multi-spectral sensors called hyperspectral sensors, detect hundreds of very
narrow spectral bands throughout the visible, near-infrared, and mid-infrared portions
of the electromagnetic spectrum.
• Their very high spectral resolution facilitates fine discrimination between different
targets based on their spectral response in each of the narrow bands.
Radiometric Resolution
• The radiometric characteristics describe the
actual information content in an image.
Every time an image is acquired on film or by
a sensor, its sensitivity to the magnitude of
the electromagnetic energy determines the
radiometric resolution.
• The radiometric resolution of an imaging
system describes its ability to discriminate
very slight differences in energy. The finer
the radiometric resolution of a sensor, the
more sensitive it is to detecting small
differences in reflected or emitted energy.
If a sensor used 8 bits to record the data, there would be 2⁸ = 256 digital values available, ranging from 0 to 255. However, if only 4 bits were used, then only 2⁴ = 16 values ranging from 0 to 15 would be available. Thus, the radiometric resolution would be much less.
Image data are generally displayed in a range of grey tones, with black representing a digital number of 0 and white representing the maximum value (for example, 255 in 8-bit data).
• The number of shades or brightness levels at a given wavelength
• The smallest change in intensity level that can be detected by the sensing system
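A small sketch of the bit-depth arithmetic above, including what is lost when 8-bit data are requantized to 4 bits; the digital numbers are made up for illustration.

import numpy as np

def grey_levels(bits):
    return 2 ** bits  # number of distinguishable brightness values

print(grey_levels(8), grey_levels(4))  # 256 and 16

# Requantizing an 8-bit value (0-255) to 4 bits (0-15) discards subtle brightness differences.
dn_8bit = np.array([0, 17, 18, 250], dtype=np.uint8)
dn_4bit = dn_8bit // 16
print(dn_4bit)  # [ 0  1  1 15] - the distinction between 17 and 18 is lost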
Image classification
• Image classification is the process of assigning land cover classes to
pixels
• For example, global land cover data sets classify images into forest, urban, agriculture, and other classes
• In general, there are three main image classification techniques in remote sensing:
• Unsupervised image classification
• Supervised image classification
• Object-based image analysis
• In unsupervised classification, the algorithm first groups pixels into “clusters” based on their properties
• In order to create “clusters”, analysts use image clustering
algorithms such as K-means and ISODATA
• For the most part, analysts can use free remote sensing software to create land cover maps this way.
• After picking a clustering algorithm, you identify the number of
groups you want to generate
• For example, you can create 8, 20 or 42 clusters. To be clear, these
are unclassified clusters because in the next step, you manually
identify each cluster with land cover classes
• For example, if you want to classify vegetation and non-vegetation,
you’ll have to merge clusters into only 2 clusters
• Overall, unsupervised classification is the most basic technique.
Because you don’t need samples for unsupervised classification, it’s
an easy way to segment and understand an image
• Unsupervised Classification Steps:
• Generate clusters
• Assign classes
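A minimal sketch of these two steps using K-means from scikit-learn (one of the clustering algorithms named above); the two-band image is synthetic, and the cluster-to-class mapping shown is hypothetical, something the analyst supplies by inspecting each cluster.

import numpy as np
from sklearn.cluster import KMeans

# Synthetic image: 100 x 100 pixels with 2 spectral bands, flattened to (n_pixels, n_bands).
rng = np.random.default_rng(42)
pixels = rng.random((100 * 100, 2))

# Step 1: generate clusters (here, 8 spectral clusters).
kmeans = KMeans(n_clusters=8, n_init=10, random_state=0).fit(pixels)
cluster_image = kmeans.labels_.reshape(100, 100)

# Step 2: the analyst assigns (or merges) land cover classes for each cluster, e.g.:
cluster_to_class = {0: "vegetation", 1: "water", 2: "urban"}  # hypothetical labels
classified = np.array(
    [cluster_to_class.get(c, "unlabelled") for c in kmeans.labels_]
).reshape(100, 100)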
• In supervised classification, you select representative samples for
each land cover class
• The software then uses these “training sites” and applies them to
the entire image
• Supervised classification uses the spectral signature defined in the
training set
• That is, each pixel is assigned to the class it most resembles in the training set
• The common supervised classification algorithms are maximum
likelihood and minimum-distance classification
Supervised Classification Steps:
• Select training areas
• Generate signature file
• Classify
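A minimal sketch of minimum-distance classification, one of the algorithms named above: the mean spectral signature of each training class is computed (the "signature file"), then every pixel is assigned to the nearest class mean. All values here are synthetic.

import numpy as np

# Training samples: two-band spectral values for pixels the analyst labelled by hand.
training = {
    "water":      np.array([[0.05, 0.02], [0.06, 0.03]]),
    "vegetation": np.array([[0.08, 0.45], [0.10, 0.50]]),
}

# Generate the signature file: the mean spectrum of each class.
signatures = {name: samples.mean(axis=0) for name, samples in training.items()}

def classify(pixel):
    """Assign the pixel to the class whose mean signature is closest (Euclidean distance)."""
    return min(signatures, key=lambda name: np.linalg.norm(pixel - signatures[name]))

print(classify(np.array([0.07, 0.40])))  # -> "vegetation"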
Unsupervised vs Supervised vs Object-Based
Classification
• A case study from the University of Arkansas compared object-
based vs pixel-based classification
• The goal was to compare high and medium spatial resolution
imagery
• Overall, object-based classification outperformed both unsupervised and supervised pixel-based classification methods; because OBIA used both spectral and contextual information, it achieved higher accuracy.
• This study is a good example of some of the limitations of pixel-
based image classification techniques.
Applications of Remote Sensing
Change Detection - Flooding
Borneo