Remote sensing

Remote sensing is the science (and to some extent, art) of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy and processing, analyzing, and applying that information. In much of remote sensing, the process involves an interaction between incident radiation and the targets of interest.

Components of Remote Sensing
• Energy Source or Illumination (A) - The first requirement for remote sensing is to have an energy source which illuminates or provides electromagnetic energy to the target of interest.

• Radiation and the Atmosphere (B) - As the energy travels from its source to the target, it will come in contact with and interact with the atmosphere it passes through. This interaction may take place a second time as the energy travels from the target to the sensor.
• Interaction with the Target (C) - Once the energy makes its way to the target through the atmosphere, it interacts with the target depending on the properties of both the target and the radiation.
• Recording of Energy by the Sensor (D) - After the energy has been scattered by, or emitted from, the target, we require a sensor (remote - not in contact with the target) to collect and record the electromagnetic radiation.
• Transmission, Reception, and Processing (E) - The energy recorded by the sensor has to be transmitted, often in electronic form, to a receiving and processing station where the data are processed into an image (hardcopy and/or digital).
• Interpretation and Analysis (F) - The processed image is interpreted, visually and/or digitally or electronically, to extract information about the target which was illuminated.
• Application (G) - The final element of the remote sensing process is achieved when we apply the information we have been able to extract from the imagery about the target in order to better understand it, reveal some new information, or assist in solving a particular problem.

Did you know?
Of our five senses (sight, hearing, taste, smell, touch), three may be considered forms of "remote sensing", where the source of information is at some distance. The other two rely on direct contact with the source of information - which are they?
Electromagnetic Radiation
• The first requirement for remote sensing is to have an energy source to illuminate the target (unless the sensed energy is being emitted by the target). This energy is in the form of electromagnetic radiation.
• All electromagnetic radiation has fundamental properties and behaves in predictable ways according to the basics of wave theory.
• Electromagnetic radiation consists of an electrical field (E) which varies in magnitude in a direction perpendicular to the direction in which the radiation is traveling, and a magnetic field (M) oriented at right angles to the electrical field. Both these fields travel at the speed of light (c).

Can "remote sensing" employ anything other than electromagnetic radiation? The use of sound is an obvious alternative; thus you can claim that your telephone conversation is indeed 'remote sensing'.
Wavelength and Frequency
• The wavelength is the length of one wave cycle, which can be measured as the distance between successive wave crests.
• Wavelength is usually represented by the Greek letter lambda (λ).
• Wavelength is measured in metres (m) or some factor of metres such as nanometres (nm, 10⁻⁹ metres), micrometres (μm, 10⁻⁶ metres) or centimetres (cm, 10⁻² metres).
• Frequency refers to the number of cycles of a wave passing a fixed point per unit of time. Frequency is normally measured in hertz (Hz), equivalent to one cycle per second, and various multiples of hertz.

Wavelength and frequency are related by the following formula: c = λν, where c is the speed of light (3×10⁸ m/s), λ is the wavelength, and ν is the frequency.

The shorter the wavelength, the higher the frequency. The longer the wavelength, the lower the frequency.
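As a quick illustration of the wavelength-frequency relation (a minimal Python sketch; the function name and example values are ours, not from the slides):

```python
# c = wavelength * frequency, so frequency = c / wavelength
C = 3.0e8  # speed of light, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Return the frequency (Hz) for a given wavelength (m)."""
    return C / wavelength_m

print(frequency_hz(0.7e-6))  # red light    (~0.7 um): ~4.3e14 Hz
print(frequency_hz(0.4e-6))  # violet light (~0.4 um): ~7.5e14 Hz - shorter wavelength, higher frequency
```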
• What is the obvious source of electromagnetic energy that
you can think of? What "remote sensing device" do you
personally use to detect this energy?

The Electromagnetic Spectrum
The electromagnetic spectrum ranges from the shorter wavelengths (including gamma
and x-rays) to the longer wavelengths (including microwaves and broadcast radio waves).
There are several regions of the electromagnetic spectrum which are useful for remote
sensing.

• The ultraviolet or UV portion of the
spectrum has the shortest wavelengths
which are practical for remote sensing.
This radiation is just beyond the violet
portion of the visible wavelengths, hence
its name. Some Earth surface materials,
primarily rocks and minerals, fluoresce or
emit visible light when illuminated by UV
radiation.
• The light which our eyes - our "remote
sensors" - can detect is part of the visible
spectrum. It is important to recognize how
small the visible portion is relative to the
rest of the spectrum.
• There is a lot of radiation around us which
is "invisible" to our eyes, but can be
detected by other remote sensing
instruments and used to our advantage.
• The visible wavelengths cover a range
from approximately 0.4 to 0.7 µm. The
longest visible wavelength is red and the
shortest is violet.

• Blue, green, and red are the
primary colours or wavelengths
of the visible spectrum. They
are defined as such because no
single primary colour can be
created from the other two, but
all other colours can be formed
by combining blue, green, and
red in various proportions.

Although we see sunlight as a


uniform or homogeneous colour, it
is actually composed of various
wavelengths of radiation in
primarily the ultraviolet, visible and
infrared portions of the spectrum
10
• The infrared (IR) region covers the wavelength range from approximately 0.7 µm to 100 µm - more than 100 times as wide as the visible portion!
• The infrared region can be divided
into two categories based on their
radiation properties - the reflected
IR, and the emitted or thermal IR.
• Radiation in the reflected IR region is
used for remote sensing purposes in
ways very similar to radiation in the
visible portion. The reflected IR
covers wavelengths from
approximately 0.7 µm to 3.0 µm.
• The thermal IR region is quite
different than the visible and
reflected IR portions, as this energy is
essentially the radiation that is
emitted from the Earth's surface in
the form of heat. The thermal IR
covers wavelengths from
approximately 3.0 µm to 100 µm.
Did you know?
Hue and saturation are independent characteristics of colour. Hue refers to the wavelength of light, which we commonly call "colour", while saturation indicates how pure the colour is, or how much white is mixed in with it. For instance, "pink" can be considered a less saturated version of "red".
Interactions with the Atmosphere
• Before radiation used for remote sensing reaches the Earth's
surface it has to travel through some distance of the Earth's
atmosphere. Particles and gases in the atmosphere can affect
the incoming light and radiation. These effects are caused by the
mechanisms of scattering and absorption.
• Scattering occurs when particles or large gas molecules present
in the atmosphere interact with and cause the electromagnetic
radiation to be redirected from its original path.
• How much scattering takes place depends on several factors
including the wavelength of the radiation, the abundance of
particles or gases, and the distance the radiation travels through
the atmosphere.

There are three (3) types of scattering which take place.


Rayleigh scattering
Mie scattering
nonselective scattering

12
• Rayleigh scattering occurs when particles
are very small compared to the wavelength
of the radiation. These could be particles
such as small specks of dust or nitrogen
and oxygen molecules. Rayleigh scattering
causes shorter wavelengths of energy to be
scattered much more than longer
wavelengths. Rayleigh scattering is the
dominant scattering mechanism in the
upper atmosphere.
• Mie scattering occurs when the particles
are just about the same size as the
wavelength of the radiation. Dust, pollen,
smoke and water vapour are common
causes of Mie scattering which tends to
affect longer wavelengths than those
affected by Rayleigh scattering. Mie
scattering occurs mostly in the lower
portions of the atmosphere where larger
particles are more abundant, and
dominates when cloud conditions are
overcast.
• Nonselective scattering occurs
when the particles are much larger
than the wavelength of the
radiation. Water droplets and large
dust particles can cause this type of
scattering. Nonselective scattering
gets its name from the fact that all
wavelengths are scattered about
equally. This type of scattering
causes fog and clouds to appear
white to our eyes because blue,
green, and red light are all scattered
in approximately equal quantities
(blue+green+red light = white light).
Absorption is the other main mechanism at work when electromagnetic radiation
interacts with the atmosphere. In contrast to scattering, this phenomenon causes
molecules in the atmosphere to absorb energy at various wavelengths. Ozone,
carbon dioxide, and water vapour are the three main atmospheric constituents
which absorb radiation.

Most remote sensing systems avoid detecting and recording wavelengths in the ultraviolet and blue portions of the spectrum. Why? Hint: the sky is blue!
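The hint points at Rayleigh scattering. As a rough back-of-the-envelope illustration (a sketch using the standard λ⁻⁴ proportionality for Rayleigh scattering, which the slides imply but do not state):

```python
# Rayleigh scattering strength is roughly proportional to 1 / wavelength**4,
# so blue and UV light are scattered far more strongly than red light -
# which is why the sky looks blue, and why sensors avoid these bands.
def rayleigh_relative(wl_um: float, ref_um: float = 0.70) -> float:
    """Scattering strength at wl_um relative to red light at ref_um."""
    return (ref_um / wl_um) ** 4

print(rayleigh_relative(0.45))  # blue: ~5.9x more scattering than red
print(rayleigh_relative(0.30))  # UV:   ~30x more scattering than red
```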
Radiation - Target Interactions
• Radiation that is not absorbed or
scattered in the atmosphere can reach
and interact with the Earth's surface.
There are three (3) forms of interaction
that can take place when energy strikes, or
is incident (I) upon the surface. These are:
absorption (A); transmission (T); and
reflection (R).
• Absorption (A) occurs when radiation
(energy) is absorbed into the target while
transmission (T) occurs when radiation
passes through a target. Reflection (R)
occurs when radiation "bounces" off the
target and is redirected.
• In remote sensing, we are most interested
in measuring the radiation reflected from
targets.

When a surface is smooth we get specular or mirror-like reflection, where all (or almost all) of the energy is directed away from the surface in a single direction. Diffuse reflection occurs when the surface is rough and the energy is reflected almost uniformly in all directions.

Most earth surface features lie somewhere between perfectly specular and perfectly diffuse reflectors.
Interaction of Light with Different Objects
Leaves:
• Chlorophyll strongly absorbs radiation in
the red and blue wavelengths but reflects
green wavelengths. Leaves appear
"greenest" to us in the summer, when
chlorophyll content is at its maximum.
• In autumn, there is less chlorophyll in the
leaves, so there is less absorption and
proportionately more reflection of the red
wavelengths, making the leaves appear
red or yellow (yellow is a combination of
red and green wavelengths).
• The internal structure of healthy leaves acts as an excellent diffuse reflector of near-infrared wavelengths.
• If our eyes were sensitive to near-infrared, trees would appear
extremely bright to us at these wavelengths.
• In fact, measuring and monitoring the near-IR reflectance is one way
that scientists can determine how healthy (or unhealthy) vegetation
may be.
Water:
• Longer wavelength visible and near infrared
radiation is absorbed more by water than shorter
visible wavelengths. Thus water typically looks
blue or blue-green due to stronger reflectance at
these shorter wavelengths, and darker if viewed at
red or near infrared wavelengths.
• If there is suspended sediment present in the upper layers of the water body, the water will reflect more strongly and appear brighter.
• The apparent colour of the water will show a slight
shift to longer wavelengths.
• Suspended sediment (S) can be easily confused
with shallow (but clear) water, since these two
phenomena appear very similar.
• Chlorophyll in algae absorbs more of the blue wavelengths and reflects the green,
making the water appear more green in colour when algae is present.
• The topography of the water surface (rough, smooth, floating materials, etc.) can
also lead to complications for water-related interpretation due to potential
problems of specular reflection and other influences on colour and brightness.
• The colours we perceive are a combination of these radiation
interactions (absorption, transmission, reflection), and
represent the wavelengths being reflected. If all visible
wavelengths are reflected from an object, it will appear white,
while an object absorbing all visible wavelengths will appear
colourless, or black.

• Depending on the complex make-up of the target that is being looked at, and the
wavelengths of radiation involved, we can observe very different responses to the
mechanisms of absorption, transmission, and reflection.
• By measuring the energy that is reflected (or emitted) by targets on the Earth's surface
over a variety of different wavelengths, we can build up a spectral response for that
object.
• By comparing the response patterns of different features we may be able to distinguish
between them, where we might not be able to, if we only compared them at one
wavelength. For example, water and vegetation may reflect somewhat similarly in the
visible wavelengths but are almost always separable in the infrared.
• Spectral response can be quite variable, even for the same target type, and can also vary
with time (e.g. "green-ness" of leaves) and location.
• Knowing where to "look" spectrally and understanding the factors which influence the
spectral response of the features of interest are critical to correctly interpreting the
interaction of electromagnetic radiation with the surface.
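For instance, a simple two-band comparison (a sketch with invented reflectance values, illustrating the visible-vs-infrared separability described above):

```python
# Hypothetical mean reflectances (0-1) in a red (visible) and a near-IR band;
# the numbers are illustrative, not measurements.
samples = {
    "water":      {"red": 0.06, "nir": 0.03},  # water absorbs strongly in the NIR
    "vegetation": {"red": 0.05, "nir": 0.50},  # healthy leaves reflect NIR strongly
}

for name, r in samples.items():
    # Normalized difference: clearly positive for vegetation, near zero or negative for water
    nd = (r["nir"] - r["red"]) / (r["nir"] + r["red"])
    print(f"{name}: index = {nd:+.2f}")  # water: -0.33, vegetation: +0.82
```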
Passive vs. Active Sensing
Remote sensing systems which measure energy
that is naturally available are called passive
sensors.
Passive sensors can only be used to detect energy
when the naturally occurring energy is available.
For all reflected energy, this can only take place
during the time when the sun is illuminating the
Earth.

Active sensors, on the other hand, provide their own energy source for
illumination. The sensor emits radiation which is directed toward the target to
be investigated. The radiation reflected from that target is detected and
measured by the sensor. Advantages for active sensors include the ability to
obtain measurements anytime, regardless of the time of day or season.
Some examples of active sensors are a laser fluorosensor and a synthetic
aperture radar (SAR).

• Did you know?
• A camera provides an excellent example of both
passive and active sensors. During a bright sunny day,
enough sunlight is illuminating the targets and then
reflecting toward the camera lens, that the camera
simply records the radiation provided (passive
mode). On a cloudy day or inside a room, there is
often not enough sunlight for the camera to record
the targets adequately. Instead, it uses its own
energy source - a flash - to illuminate the targets and
record the radiation reflected from them (active
mode).

• Radar used by police to measure the speed of traveling vehicles is a use of active remote sensing.
The radar device is pointed at a vehicle, pulses of
radiation are emitted, and the reflection of that
radiation from the vehicle is detected and timed. The
speed of the vehicle is determined by calculating
time delays between the repeated emissions and
reception of the pulses. This can be calculated very
accurately because the radiation travels much, much
faster than most vehicles...unless you're driving at
the speed of light!
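A toy version of that calculation (a sketch with invented numbers, following the pulse-timing description above; real police radar typically relies on the Doppler shift):

```python
C = 3.0e8  # speed of light, m/s

def speed_from_pulses(t_round1: float, t_round2: float, pulse_interval: float) -> float:
    """Estimate target speed from the round-trip times of two successive pulses.

    Each pulse gives a range of c * t_round / 2; the speed is the change in
    range divided by the time between pulses (positive = approaching).
    """
    r1 = C * t_round1 / 2
    r2 = C * t_round2 / 2
    return (r1 - r2) / pulse_interval

# A car 150 m away closing at 30 m/s, with pulses 10 ms apart (invented values):
t1 = 2 * 150.0 / C
t2 = 2 * (150.0 - 0.3) / C  # 0.3 m closer by the second pulse
print(speed_from_pulses(t1, t2, 0.010))  # ~30.0 m/s
```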

Characteristics of Images
• Electromagnetic energy may be detected either
photographically or electronically. The
photographic process uses chemical reactions
on the surface of light-sensitive film to detect
and record energy variations.
• It is important to distinguish between the
terms images and photographs in remote
sensing.
• An image refers to any pictorial representation,
regardless of what wavelengths or remote
sensing device has been used to detect and
record the electromagnetic energy.
• A photograph refers specifically to images that have been detected as well as recorded on photographic film.
• Photos are normally recorded over the wavelength range from 0.3 µm to 0.9 µm - the visible and reflected infrared. Based on these definitions, we can say that all photographs are images, but not all images are photographs.
• A photograph could also be represented and displayed in a digital format by subdividing the image into small equal-sized and shaped areas, called picture elements or pixels, and representing the brightness of each area with a numeric value or digital number.
• Can you imagine what the world would look like if we could only see very
narrow ranges of wavelengths or colours? That is how many sensors work.
• The information from a narrow wavelength range is gathered and stored in a
channel, also sometimes referred to as a band.
• We can combine and display channels of information digitally using the three
primary colours (blue, green, and red).
• The data from each channel are represented as one of the primary colours and,
depending on the relative brightness (i.e. the digital value) of each pixel in each
channel, the primary colours combine in different proportions to represent
different colours.
• When we use this method to display a single channel or range of wavelengths,
we are actually displaying that channel through all three primary colours.
Because the brightness level of each pixel is the same for each primary colour,
they combine to form a black and white image, showing various shades of gray
from black to white.
• When we display more than one channel each as a different primary colour,
then the brightness levels may be different for each channel/primary colour
combination and they will combine to form a colour image.
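A minimal sketch of that display logic with NumPy (the array names and toy shapes are our assumptions, not from the slides):

```python
import numpy as np

# Three single-band images of 8-bit digital numbers, e.g. blue, green, red channels.
h, w = 4, 4
rng = np.random.default_rng(0)
blue, green, red = (rng.integers(0, 256, (h, w), dtype=np.uint8) for _ in range(3))

# Colour composite: each channel drives one primary colour.
colour = np.dstack([red, green, blue])   # shape (h, w, 3)

# Single-channel display: the same band sent to all three primaries gives
# equal brightness per primary, i.e. a black-and-white (grayscale) image.
gray = np.dstack([green, green, green])
```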
DID YOU KNOW?
Photographic film has the clear advantage of recording extremely fine spatial detail, since individual silver halide molecules can record light sensitivity differently than their neighbouring molecules. But when it comes to spectral and radiometric qualities, digital sensors outperform film, by being able to use extremely fine spectral bands (for spectral 'fingerprinting' of targets), and recording up to many thousands of levels of brightness.
Satellites and Sensors
• On the Ground, In the Air, In Space
• Ground-based sensors are often used to record detailed information about the
surface which is compared with information collected from aircraft or satellite
sensors. In some cases, this can be used to better characterize the target which is
being imaged by these other sensors, making it possible to better understand the
information in the imagery.
• Sensors may be placed on a ladder, scaffolding, tall building, crane, etc. Aerial
platforms are primarily stable wing aircraft, although helicopters are occasionally
used. Aircraft are often used to collect very detailed images and facilitate the
collection of data over virtually any portion of the Earth's surface at any time.
• In space, remote sensing is sometimes conducted from
the space shuttle or, more commonly, from satellites.
• Satellites are objects which revolve around another
object - in this case, the Earth. For example, the moon is
a natural satellite, whereas man-made satellites include
those platforms launched for remote sensing,
communication, and telemetry (location and navigation)
purposes.
• Because of their orbits, satellites permit repetitive
coverage of the Earth's surface on a continuing basis.
Cost is often a significant factor in choosing among the various platform options.
• Satellite Characteristics: Orbits and Swaths
• The path followed by a satellite is referred to as its
orbit.
• Satellite orbits are matched to the capability and
objective of the sensor(s) they carry.
• Orbit selection can vary in terms of altitude (their
height above the Earth's surface) and their
orientation and rotation relative to the Earth.
• Satellites at very high altitudes, which view the same portion of the Earth's surface at all times, have geostationary orbits.
These geostationary satellites, at altitudes of approximately 36,000
kilometres, revolve at speeds which match the rotation of the Earth so
they seem stationary, relative to the Earth's surface. This allows the
satellites to observe and collect information continuously over specific
areas. Weather and communications satellites commonly have these
types of orbit.
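The ~36,000 km figure follows from basic orbital mechanics (a sketch; the circular-orbit formula r³ = GMT²/4π² is standard physics, not from the slides):

```python
import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M_EARTH = 5.972e24   # mass of the Earth, kg
T = 86164.0          # one Earth rotation (sidereal day), s
R_EARTH = 6.371e6    # Earth radius, m

# Circular orbit whose period matches the Earth's rotation:
r = (G * M_EARTH * T**2 / (4 * math.pi**2)) ** (1 / 3)
print(f"altitude ~ {(r - R_EARTH) / 1000:.0f} km")  # ~35,786 km, i.e. the quoted ~36,000 km
```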
• As a satellite revolves around the Earth, the sensor
"sees" a certain portion of the Earth's surface. The
area imaged on the surface, is referred to as the
swath.
• Imaging swaths for spaceborne sensors generally
vary between tens and hundreds of kilometres wide.
• As the satellite orbits the Earth from pole to pole, its
east-west position wouldn't change if the Earth
didn't rotate. However, as seen from the Earth, it
seems that the satellite is shifting westward because
the Earth is rotating (from west to east) beneath it.
This apparent movement allows the satellite swath
to cover a new area with each consecutive pass.
• The satellite's orbit and the rotation of the Earth
work together to allow complete coverage of the
Earth's surface, after it has completed one complete
cycle of orbits
• If we start with any randomly selected pass in a
satellite's orbit, an orbit cycle will be completed
when the satellite retraces its path, passing over the
same point on the Earth's surface directly below the
satellite (called the nadir point) for a second time.
• The exact length of time of the orbital cycle will vary
with each satellite. The interval of time required for
the satellite to complete its orbit cycle is not the
same as the "revisit period".
Spatial Resolution, Pixel Size, and Scale
• The detail discernible in an image is dependent on the spatial resolution of the sensor, which refers to the size of the smallest possible feature that can be detected.
• Spatial resolution of passive sensors (we will
look at the special case of active microwave
sensors later) depends primarily on their
Instantaneous Field of View (IFOV).
• The IFOV is the angular cone of visibility of
the sensor (A) and determines the area on
the Earth's surface which is "seen" from a
given altitude at one particular moment in
time (B).
• The size of the area viewed is determined by
multiplying the IFOV by the distance from
the ground to the sensor (C). This area on
the ground is called the resolution cell and
determines a sensor's maximum spatial
resolution.
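A quick sketch of that resolution-cell calculation (the numbers are illustrative; the small-angle product IFOV × altitude follows from the definition above):

```python
def resolution_cell_m(ifov_mrad: float, altitude_m: float) -> float:
    """Ground size of the resolution cell: IFOV (radians) times altitude."""
    return (ifov_mrad / 1000.0) * altitude_m

# e.g. a 0.086 mrad IFOV viewed from a 705 km orbit (illustrative numbers):
print(resolution_cell_m(0.086, 705_000))  # ~60.6 m on the ground
```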
• Most remote sensing images are composed of a matrix
of picture elements, or pixels, which are the smallest
units of an image.
• Image pixels are normally square and represent a certain
area on an image.
• It is important to distinguish between pixel size and
spatial resolution - they are not interchangeable.
• If a sensor has a spatial resolution of 20 metres and an
image from that sensor is displayed at full resolution,
each pixel represents an area of 20m x 20m on the
ground. In this case the pixel size and resolution are the
same. However, it is possible to display an image with a
pixel size different than the resolution.
• Many posters of satellite images of the Earth have their
pixels averaged to represent larger areas, although the
original spatial resolution of the sensor that collected
the imagery remains the same.
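The poster example amounts to block-averaging pixels for display (a minimal NumPy sketch, assuming a 2D array of digital numbers):

```python
import numpy as np

def block_average(img: np.ndarray, factor: int) -> np.ndarray:
    """Average factor x factor blocks into one display pixel.

    The displayed pixel size grows by `factor`, but the sensor's
    original spatial resolution is unchanged.
    """
    h, w = img.shape
    h2, w2 = h - h % factor, w - w % factor   # trim to a multiple of factor
    blocks = img[:h2, :w2].reshape(h2 // factor, factor, w2 // factor, factor)
    return blocks.mean(axis=(1, 3))

img = np.arange(16, dtype=float).reshape(4, 4)  # toy 4x4 image
print(block_average(img, 2))  # [[2.5, 4.5], [10.5, 12.5]]
```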
• Images where only large features are visible are said to have
coarse or low resolution.
• In fine or high resolution images, small objects can be detected.
• Military sensors for example, are designed to view as much detail
as possible, and therefore have very fine resolution.
• Commercial satellites provide imagery with resolutions varying
from a few metres to several kilometres.
• Generally speaking, the finer the resolution, the less total ground
area can be seen.

• The ratio of distance on an image or map, to actual
ground distance is referred to as scale. If you had a map
with a scale of 1:100,000, an object of 1cm length on
the map would actually be an object 100,000cm (1km)
long on the ground.
• Maps or images with small "map-to-ground ratios" are
referred to as small scale (e.g. 1:100,000), and those
with larger ratios (e.g. 1:5,000) are called large scale.
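The same ratio arithmetic in code (a trivial sketch mirroring the 1:100,000 example):

```python
def ground_length_m(map_length_cm: float, scale_denominator: int) -> float:
    """Convert a length measured on the map to metres on the ground."""
    return map_length_cm * scale_denominator / 100.0  # cm -> m

print(ground_length_m(1, 100_000))  # 1 cm at 1:100,000 -> 1000 m (1 km)
print(ground_length_m(1, 5_000))    # 1 cm at 1:5,000   -> 50 m (large scale)
```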

Spectral Resolution
• Different classes of features and details in an image can often be distinguished by
comparing their responses over distinct wavelength ranges. Broad classes, such as water
and vegetation, can usually be separated using very broad wavelength ranges - the visible
and near infrared .
• Other more specific classes, such as different rock types, may not be easily
distinguishable using either of these broad wavelength ranges and would require
comparison at much finer wavelength ranges to separate them. Thus, we would require a
sensor with higher spectral resolution. Spectral resolution describes the ability of a
sensor to define fine wavelength intervals. The finer the spectral resolution, the narrower
the wavelength range for a particular channel or band.
• Black and white film records wavelengths extending over
much, or all of the visible portion of the electromagnetic
spectrum. Its spectral resolution is fairly coarse, as the
various wavelengths of the visible spectrum are not
individually distinguished and the overall reflectance in the
entire visible portion is recorded.
• Colour film is also sensitive to the reflected energy over the
visible portion of the spectrum, but has higher spectral
resolution, as it is individually sensitive to the reflected
energy at the blue, green, and red wavelengths of the
spectrum. Thus, it can represent features of various colours
based on their reflectance in each of these distinct
wavelength ranges.
• Many remote sensing systems record energy over several separate wavelength ranges
at various spectral resolutions. These are referred to as multi-spectral sensors.
• Advanced multi-spectral sensors called hyperspectral sensors, detect hundreds of very
narrow spectral bands throughout the visible, near-infrared, and mid-infrared portions
of the electromagnetic spectrum.
• Their very high spectral resolution facilitates fine discrimination between different
targets based on their spectral response in each of the narrow bands.
Radiometric Resolution
• The radiometric characteristics describe the
actual information content in an image.
Every time an image is acquired on film or by
a sensor, its sensitivity to the magnitude of
the electromagnetic energy determines the
radiometric resolution.
• The radiometric resolution of an imaging
system describes its ability to discriminate
very slight differences in energy. The finer
the radiometric resolution of a sensor, the
more sensitive it is to detecting small
differences in reflected or emitted energy.
If a sensor used 8 bits to record the data, there would be 2⁸ = 256 digital values available, ranging from 0 to 255. However, if only 4 bits were used, then only 2⁴ = 16 values ranging from 0 to 15 would be available. Thus, the radiometric resolution would be much less.
Image data are generally displayed in a range of grey tones, with black representing a digital number of 0 and white representing the maximum value (for example, 255 in 8-bit data).
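The bit-depth arithmetic as a one-liner (a trivial sketch):

```python
def grey_levels(bits: int) -> int:
    """Number of distinct digital values an n-bit sensor can record."""
    return 2 ** bits

for bits in (4, 8):
    print(f"{bits} bits -> {grey_levels(bits)} levels (0 to {grey_levels(bits) - 1})")
# 4 bits -> 16 levels (0 to 15); 8 bits -> 256 levels (0 to 255)
```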
• Number of shades or brightness levels at a given wavelength
• Smallest change in intensity level that can be detected by the sensing system
Image classification
• Image classification is the process of assigning land cover classes to
pixels
• For example, global land cover data sets classify images into forest, urban, agriculture, and other classes
• In general, there are three main image classification techniques in remote sensing:
• Unsupervised image classification
• Supervised image classification
• Object-based image analysis

• In unsupervised classification, the algorithm first groups pixels into "clusters" based on their properties
• To create "clusters", analysts use image clustering algorithms such as K-means and ISODATA
• For the most part, analysts can use free remote sensing software to create land cover maps this way
• After picking a clustering algorithm, you identify the number of
groups you want to generate
• For example, you can create 8, 20 or 42 clusters. To be clear, these
are unclassified clusters because in the next step, you manually
identify each cluster with land cover classes
• For example, if you want to classify vegetation and non-vegetation,
you’ll have to merge clusters into only 2 clusters
• Overall, unsupervised classification is the most basic technique.
Because you don’t need samples for unsupervised classification, it’s
an easy way to segment and understand an image
• Unsupervised Classification Steps:
• Generate clusters
• Assign classes
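A compact sketch of those two steps using scikit-learn's KMeans (the package choice, band count, and cluster-to-class mapping are our assumptions; any K-means or ISODATA implementation would do):

```python
import numpy as np
from sklearn.cluster import KMeans

# Toy 3-band image, shape (rows, cols, bands); real values would come from a sensor.
rng = np.random.default_rng(0)
image = rng.random((100, 100, 3))

# Step 1: generate clusters from the per-pixel spectral values.
pixels = image.reshape(-1, 3)                        # one row per pixel
labels = KMeans(n_clusters=8, n_init=10).fit_predict(pixels)
clusters = labels.reshape(100, 100)                  # unclassified cluster map

# Step 2: assign classes - the analyst names (or merges) clusters manually,
# e.g. collapsing eight clusters into vegetation vs non-vegetation.
cluster_to_class = {i: "vegetation" if i < 4 else "non-vegetation" for i in range(8)}
```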

• In supervised classification, you select representative samples for
each land cover class
• The software then uses these “training sites” and applies them to
the entire image
• Supervised classification uses the spectral signature defined in the
training set
• For example, it assigns each pixel to the class it resembles most in the training set
• The common supervised classification algorithms are maximum
likelihood and minimum-distance classification

Supervised Classification Steps:
• Select training areas
• Generate signature file
• Classify
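A minimal sketch of the minimum-distance classifier named above (the training means and pixel values are invented for illustration):

```python
import numpy as np

# Mean spectral signature per class, as derived from analyst-selected training areas.
signatures = {
    "water":      np.array([0.06, 0.08, 0.03]),
    "vegetation": np.array([0.04, 0.10, 0.50]),
    "urban":      np.array([0.25, 0.24, 0.28]),
}

def classify_pixel(pixel: np.ndarray) -> str:
    """Assign the class whose mean signature is nearest in spectral space."""
    return min(signatures, key=lambda c: float(np.linalg.norm(pixel - signatures[c])))

print(classify_pixel(np.array([0.05, 0.09, 0.45])))  # -> vegetation
```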

Unsupervised vs Supervised vs Object-Based
Classification
• A case study from the University of Arkansas compared object-
based vs pixel-based classification
• The goal was to compare high and medium spatial resolution
imagery
• Overall, object-based classification outperformed both
unsupervised and supervised pixel-based classification methods.
Because OBIA used both spectral and contextual information, it had
higher accuracy.
• This study is a good example of some of the limitations of pixel-
based image classification techniques.
Applications of Remote Sensing

• Images serve as base maps
• Observe or measure properties or condition of the land, oceans, and atmosphere
• Map spatial distribution of "features"
• Record spatial changes
Change Detection - Flooding
Landsat imagery of the 1993 Mississippi flood

Quantifying Urban Sprawl
San Francisco Bay

Change Detection - Urban Sprawl

Monitoring Weather
GOES-8 Water Vapor

Detecting and Monitoring Wildland Fires
Borneo; Arizona, June 2002

Monitoring Sea Surface Temperature

Remote Sensing Process
