20 Sensors for Missions

Luis Mejias, John Lai, and Troy Bruggemann

Australian Research Centre for Aerospace Automation, Queensland University of Technology, Brisbane, QLD, Australia
e-mail: [email protected]; [email protected]; [email protected]

Contents
20.1 Introduction
20.2 Navigation Sensors
20.2.1 Electro-Optical (EO)
20.2.2 Radio-Wave Sensors
20.3 Applications
20.3.1 Collision Avoidance
20.3.2 Remote Sensing: Power Line Inspection and Vegetation Management
References

Abstract
An onboard payload may be seen in most instances as the "raison d'être"
of a UAV: it defines the vehicle's capabilities, usability, and hence market value.
Large and medium UAV payloads exhibit significant differences in size and
computing capability when compared with those of small UAVs. The latter face
stringent size, weight, and power requirements, typically referred to as SWaP,
while the former still exhibit an endless appetite for computing capability. The
tendency for these larger UAVs (Global Hawk, Hunter, Fire Scout, etc.) is to
increase payload density and hence processing capability. An example of this
approach is the Northrop Grumman MQ-8 Fire Scout helicopter, which has a
modular payload architecture that incorporates off-the-shelf components.
Regardless of UAV size and capabilities, advances in the miniaturization of
electronics are enabling the replacement of multiprocessing, power-hungry
general-purpose processors with more integrated and compact electronics
(e.g., FPGAs).

The payload plays a significant role in the quality of ISR (intelligence,
surveillance, and reconnaissance) data and in how quickly that information
can be delivered to the end user. At a high level, payloads are important enablers
of greater mission autonomy, which is the ultimate aim for every UAV.
This section describes common payload sensors and introduces two cases in
which onboard payloads were used to solve real-world problems. A collision
avoidance payload based on electro-optical (EO) sensors is first introduced,
followed by a remote sensing application for power line inspection and
vegetation management.

20.1 Introduction

There are two main categories of payloads onboard a UAV: those that allow the
vehicle to navigate and those that allow the vehicle to perform its main task. The
distinction between them is sometimes very subtle. In some cases, a payload intended
to perform a task can be used for navigation purposes (e.g., cameras), and navigation
payloads may in some instances be integrated with other sensors to perform a
specific task (e.g., GPS/INS with LIDAR). This section will describe common
sensors that are used in many typical payloads onboard UAVs.

20.2 Navigation Sensors

At the core of most UAV guidance and navigation systems, one can find Global
Navigation Satellite Systems (GNSS) (including the Global Positioning System –
GPS) and Inertial Navigation Systems (INS). Their complementary nature has been
recognized, and as a result, GPS and INS sensors are the preferred sensor couple for
the majority of autopilot systems.
The integration of GPS and INS is without doubt the area in which researchers
have spent considerable effort, proposing approaches such as uncoupled integration,
loosely coupled integration, tightly coupled integration, and deeply coupled
integration (Grewal et al. 2007). GPS and INS are not the only two sensors used
for navigation. They can be complemented with altimeters (laser-based, barometric,
etc.) to enhance the estimation of the vehicle state. Additionally, infrared attitude
sensors are typically found in micro UAVs. A survey of UAV autopilot alternatives
and typical sensor combinations was recently presented in Cao et al. (2010).
At the heart of the integration scheme lies an estimator (usually a form of Kalman
filter) which estimates position, velocity, attitude, GPS errors, and inertial sensor
errors. Due to their complementary nature, GPS and INS are often the preferred
core sensor suite. However, researchers have also investigated the integration of
other sensor combinations, such as GPS with computer vision (Dusha et al. 2011;
Wein et al. 2011; Dusha and Mejias 2012) and INS with computer vision (Merz
et al. 2006). Additionally, factors such as the trade-off between cost and accuracy
of INS sensors and the susceptibility of GPS to spoofing and jamming have also
contributed to increased interest in alternative sensor combinations.
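To make the loosely coupled idea concrete, the following is a minimal sketch of a one-dimensional GPS/INS filter: INS mechanization propagates the state at a high rate, and occasional GPS position fixes correct the accumulated drift while estimating the accelerometer bias. The state model, rates, and noise values are illustrative placeholders, not those of any particular autopilot.

```python
import numpy as np

# Minimal 1-D loosely coupled GPS/INS Kalman filter sketch.
# State x = [position, velocity, accelerometer bias]; all noise
# levels below are illustrative tuning values.
dt = 0.01                                  # INS propagation rate: 100 Hz
F = np.array([[1, dt, -0.5 * dt**2],       # bias feeds back into position...
              [0, 1, -dt],                 # ...and velocity
              [0, 0, 1]])                  # bias modeled as (near) constant
B = np.array([0.5 * dt**2, dt, 0.0])       # input map for the measured accel
H = np.array([[1.0, 0.0, 0.0]])            # GPS observes position only
Q = np.diag([1e-6, 1e-4, 1e-8])            # process noise
R = np.array([[4.0]])                      # GPS position variance, (2 m)^2

x, P = np.zeros(3), np.eye(3)              # initial state and covariance

def ins_propagate(x, P, accel_meas):
    """Propagate the state with the (biased) accelerometer measurement."""
    return F @ x + B * accel_meas, F @ P @ F.T + Q

def gps_update(x, P, gps_pos):
    """Correct the INS drift with a GPS position fix."""
    y = gps_pos - H @ x                    # innovation
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    return x + (K @ y).ravel(), (np.eye(3) - K @ H) @ P

# Typical loop: propagate at the IMU rate, update when GPS arrives (1 Hz).
for k in range(1000):
    x, P = ins_propagate(x, P, accel_meas=0.1)           # stand-in IMU sample
    if k % 100 == 0:
        x, P = gps_update(x, P, np.array([x[0] + 0.5]))  # stand-in GPS fix
```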
While alternative integration schemes using other sensors are an attractive option,
the low cost and the future navigation availability and integrity that Space-Based
Augmentation Systems (SBAS) such as WAAS, EGNOS, GAGAN, and MSAS will
provide cannot be ignored. Submeter accuracy for civilian users will also become
possible with the commissioning of Galileo and Compass and the modernization of
GLONASS. This in turn will encourage interoperability and the possibility of a
triple-frequency civilian-band GPS over the next decades.

20.2.1 Electro-Optical (EO)

Nowadays, it is difficult to imagine a UAV without an EO sensor; they have
become a standard fit-out onboard many aerial vehicles. The challenge now lies in the
processing and interpretation of the information acquired by EO sensors. Perception
through EO sensors can be seen as one of the most important tasks in a UAV.
Whether it is performed for navigation or for surveillance (as an end application), it
defines the necessary peripherals to process the EO data. This section introduces
some of the most common EO sensors typically found in UAVs.

20.2.1.1 Visible Spectrum


A visible-spectrum camera operates at wavelengths from approximately 390 nm
(0.39 µm) to 750 nm (0.75 µm). These cameras can be found in two main categories: digital still
cameras and machine vision cameras (including surveillance and webcam). Digital
still cameras offer very high resolution but cannot provide a continuous stream of
images; the number of images they can capture usually depends on the amount
of internal memory. This type of camera sees application in remote sensing and
aerial photography. Machine vision cameras have relatively lower resolution but can
provide a continuous stream of images at up to a few hundred frames per second;
the achievable speed is related to the resolution and output format used (digital or
analog). They are suitable for processes or tasks that require very fast perception of
the environment. Common output protocols and interfaces for machine vision
cameras include IEEE 1394, Camera Link, SD/HD analog, USB, GigE Vision,
and, in coming years, Thunderbolt.
These cameras can provide color or gray-level images (or both). The data
representation or color space usually varies from one manufacturer to another;
typical color spaces or models used in most machine vision cameras are RGB,
YUV, YPbPr, and YCbCr.
Regardless of the end use and camera type, camera geometric models are
necessary. They provide the parameters needed to correct lens distortion,
perform the mapping of 3D objects onto the 2D surface called the image, and,
overall, allow manipulation of the data acquired by these devices. These
parameters are normally estimated through a calibration process that is regularly
performed. For more details on camera models and calibration theory, refer to
Forsyth and Ponce (2002) and Szeliski (2011).
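To illustrate what such a model provides, the sketch below projects a 3D point through an assumed pinhole intrinsic matrix and applies a simple two-term radial distortion model; the intrinsics and distortion coefficients are made-up stand-ins for the values a calibration would estimate.

```python
import numpy as np

# Pinhole projection sketch: 3-D camera-frame point -> pixel coordinates.
# fx, fy (focal lengths) and cx, cy (principal point) are illustrative.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def project(point_cam):
    """Project a 3-D point (camera frame, Z forward) onto the image plane."""
    uvw = K @ point_cam
    return uvw[:2] / uvw[2]                   # perspective divide

def distort(xn, yn, k1=-0.2, k2=0.05):
    """Two-term radial distortion on normalized image coordinates
    (coefficients are illustrative calibration outputs)."""
    r2 = xn**2 + yn**2
    scale = 1 + k1 * r2 + k2 * r2**2
    return xn * scale, yn * scale

pixel = project(np.array([1.0, 0.5, 10.0]))   # -> approximately [400., 280.]
```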

20.2.1.2 Infrared
An infrared (IR) camera is a device that detects and converts light in the same way
as common visible-spectrum cameras, but is sensitive to light at longer wavelengths.
They form an image using infrared radiation at wavelengths from
5,000 nm (5 µm) to 14,000 nm (14 µm). Infrared cameras are used to convey a
measure of thermal radiation of bodies. The intensity of each pixel can be converted
for use in temperature measurement, where the brightest parts of the image (colored
white) represent the warmest temperatures. Intermediate temperatures are shown as
reds and yellows, and the coolest parts are shown in blue. Often, IR images are
accompanied by a scale next to a false color image to relate colors to temperatures.
The resolution of IR cameras (up to a maximum of 640 × 480 pixels) is considerably
lower than the resolution of optical cameras. Furthermore, the price of IR cameras
is considerably higher than their visible-spectrum counterparts.
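As a concrete illustration of the pixel-to-temperature conversion and false-color display described above, a first-order approach is a linear radiometric calibration from raw counts to temperature followed by a color ramp; the gain, offset, and temperature limits below are invented stand-ins for a real camera's calibration.

```python
import numpy as np

# Radiometric sketch: raw 14-bit IR counts -> temperature -> false color.
GAIN, OFFSET = 0.04, -273.15       # illustrative linear calibration constants

def counts_to_celsius(raw):
    return GAIN * raw.astype(np.float64) + OFFSET

def false_color(temp_c, t_min=0.0, t_max=120.0):
    """Map temperature to a crude blue->red->yellow->white ramp (RGB in [0, 1])."""
    t = np.clip((temp_c - t_min) / (t_max - t_min), 0.0, 1.0)
    r = np.clip(3 * t, 0, 1)               # red saturates first (warm)
    g = np.clip(3 * t - 1, 0, 1)           # then yellow and white (warmest)
    b = np.clip(1 - 3 * t, 0, 1)           # coolest pixels stay blue
    return np.stack([r, g, b], axis=-1)

frame = np.random.randint(0, 2**14, (480, 640))     # stand-in raw IR frame
rgb = false_color(counts_to_celsius(frame))         # (480, 640, 3) image
```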
Infrared cameras can be categorized in two main groups:

Cooled Infrared Detectors


Cooled IR detectors are normally housed in a vacuum-sealed container that is
cryogenically cooled. Cooling is needed for the efficient operation of the
semiconductors used. Typical operating temperatures (in kelvin (K)) range from
4 to 293 K, with most modern IR detectors operating in the 60–100 K range. The
operating temperature is related to the type of technology used and the performance
expected. Cooled infrared cameras provide superior image quality compared to
uncooled ones. They provide greater sensitivity, which allows the use of higher
F-number lenses, making high-performance long-focal-length lenses both smaller
and cheaper for cooled detectors. The drawbacks of cooled infrared cameras are that
they are expensive both to produce and to run: cooling is power hungry and time
consuming, and a camera may need several minutes to cool down before it can begin
working.

Uncooled Infrared Detectors


This type of camera uses sensors that operate at or close to room temperature. For
this reason, they do not require bulky, expensive cryogenic coolers and are therefore
smaller and cheaper than cooled cameras, with the drawback that their resolution
and image quality tend to be lower than those of cooled detectors. This is due to a
difference in fabrication processes, limited by currently available technology.

Given the ability of IR cameras to reveal a world hidden from the human eye or a
visible-spectrum camera, they see application in visually challenging tasks such as
night vision, bad-weather scenarios, inspection, surveillance (e.g., bushfire
monitoring), and search and rescue.

20.2.1.3 Hyperspectral Imaging


Sensors in this category acquire image data simultaneously in multiple adjacent
spectral bands. This gives a wealth of data, but its processing and interpretation
require good knowledge of what specific properties are to be measured and how
they relate to the actual measurements made by the sensor. For example, a
single cell position in an image will have a set of brightness (or intensity) levels
for each wavelength (spectral band). Different materials under examination by
a hyperspectral sensor will often exhibit different intensity versus wavelength
relationships. Hence, with some prior knowledge, hyperspectral image data can be
useful for identifying the type or composition of materials.
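One simple and widely used way to exploit these intensity-versus-wavelength signatures is the spectral angle mapper, which scores each pixel spectrum against a library of reference material spectra by the angle between them. The sketch below uses a made-up cube size and random stand-in spectra.

```python
import numpy as np

# Spectral angle mapper sketch: classify each pixel of a hyperspectral cube
# by the angle between its spectrum and known reference spectra.
def spectral_angles(cube, library):
    """cube: (H, W, B) image with B bands; library: (M, B) reference spectra.
    Returns (H, W, M) angles in radians (smaller angle = better match)."""
    pix = cube.reshape(-1, cube.shape[-1])
    pix = pix / np.linalg.norm(pix, axis=1, keepdims=True)
    lib = library / np.linalg.norm(library, axis=1, keepdims=True)
    cos = np.clip(pix @ lib.T, -1.0, 1.0)
    return np.arccos(cos).reshape(cube.shape[0], cube.shape[1], -1)

cube = np.random.rand(100, 100, 224)        # stand-in 224-band cube
library = np.random.rand(3, 224)            # stand-in material spectra
labels = spectral_angles(cube, library).argmin(axis=-1)   # per-pixel match
```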
Cost and complexity are the two major drawbacks of this technology. Storage is
another limiting factor given the multidimensional nature of hyperspectral datasets.
The availability of graphics processing units (GPUs) as powerful parallel processing
hardware could bring this technology a step closer to widespread adoption.
However, further research efforts are needed to create analytical techniques and
algorithms to unleash the full potential of hyperspectral imaging.

20.2.2 Radio-Wave Sensors

20.2.2.1 Airborne Radio Detection and Ranging (Radar)


Radar is a radio system used to determine the range, altitude, direction, or speed of
objects. The system transmits controlled radio pulses, which are reflected back by
objects, and the distance to an object is estimated by measuring the signal return
time. The received power declines as the fourth power of the range (assuming the
transmitting and receiving antennas are in the same location), hence the need for
high transmit power in most cases. Speed can be estimated by tracking the change
in distance over time or by exploiting the Doppler effect (Stimson 1998).
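The two relationships above can be written down directly: range follows from the round-trip time of the pulse, and the received power follows the standard monostatic radar equation with its inverse fourth-power dependence on range. All parameter values in this sketch are illustrative.

```python
import math

C = 299_792_458.0                          # speed of light, m/s

def range_from_echo(t_round_trip):
    """Target range from the round-trip time of the reflected pulse."""
    return C * t_round_trip / 2.0

def received_power(p_tx, gain, wavelength, rcs, r):
    """Monostatic radar equation (shared transmit/receive antenna):
    Pr = Pt * G^2 * lambda^2 * sigma / ((4*pi)^3 * R^4)."""
    return p_tx * gain**2 * wavelength**2 * rcs / ((4 * math.pi)**3 * r**4)

r = range_from_echo(66.7e-6)               # ~10 km round trip
pr = received_power(p_tx=1e3, gain=1e3, wavelength=0.03, rcs=1.0, r=r)
```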
Airborne radar has been in operation since WWII and can be considered an
integral part of systems such as Ground Proximity Warning Systems (GPWS) and
TCAS. In the automotive industry, radar is starting to appear in the form of collision
warning systems (www.ford.com).
In a UAV context, the main drawback of radar is its high power consumption;
the size, weight, and power (SWaP) challenges faced by UAVs are well
known. However, new systems such as Synthetic Aperture Radar (SAR) (Soumekh
1999) are beginning to make radar technology a feasible option onboard UAVs
(Hanlon 2008).

20.2.2.2 Light Detection and Ranging (LIDAR)


LIDAR operates in a similar manner to radar, in that it estimates distance by
measuring the time of return of a signal reflected from an object. LIDARs have been
widely used in atmospheric and meteorological research (Couch et al. 1991; Kiemle
et al. 1997) and in remote sensing (Dubayah and Drake 2000; Lefsky et al. 2002).
Given its ability to provide very high resolution (under 2.5 cm (Terranean 2011)),
LIDAR has become a common sensor for mapping and infrastructure inspection
(Yuee et al. 2009).
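For mapping uses such as these, each LIDAR return must be georeferenced by combining the measured range and scan angle with the platform pose from GPS/INS. The sketch below is deliberately simplified (nadir-pointing scanner, yaw-only attitude, no lever-arm or boresight calibration), all of which a real system would have to model.

```python
import numpy as np

# Simplified LIDAR georeferencing sketch: one range/angle return plus the
# GPS/INS pose -> a point in a local NED (north-east-down) frame.
def rot_z(yaw):
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def georeference(rng, scan_angle, platform_pos_ned, yaw):
    """rng: measured range (m); scan_angle: across-track angle (rad);
    platform_pos_ned: (3,) platform position; yaw: heading (rad)."""
    # Beam direction in the body frame for a nadir-pointing scanner:
    # y across track, z down; roll/pitch are ignored in this sketch.
    beam_body = rng * np.array([0.0, np.sin(scan_angle), np.cos(scan_angle)])
    return platform_pos_ned + rot_z(yaw) @ beam_body

point = georeference(rng=120.0, scan_angle=np.deg2rad(10),
                     platform_pos_ned=np.array([0.0, 0.0, -100.0]),
                     yaw=np.deg2rad(45))
```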
However, LIDAR shares many of the SWaP obstacles faced by radar
technology. Hence, LIDARs are not often found on micro/small UAVs, although
they have been flown in light general aviation aircraft (Ara 2011). Recent advances,
such as the Riegl LMS-Q160 LIDAR, which weighs under 15 kg and draws around
60 W (Riegl 2011), are paving the way for the use of LIDARs on smaller UAVs in
the future.
The pace of innovation seems to be accelerating. Perhaps this is a result of
the challenging economy, but without doubt, the miniaturization of electronics and
the availability of small, power-efficient, and computationally capable hardware will
facilitate the adoption of UAVs in many civilian contexts.

20.3 Applications

20.3.1 Collision Avoidance

A sense-and-avoid capability equivalent to human pilot see-and-avoid is one
of the key prerequisites for UAVs gaining routine access to civil airspace. Sense-
and-avoid begins with the detection of potentially conflicting air traffic, and as a
result, considerable research and development has been dedicated to addressing the
"sensing" aspect of sense-and-avoid.
An important design choice in the development of sensing and detection systems
is the type of sensor used to collect information about the surrounding environment;
this is a choice that must account for the physical and resource limitations of the
UAV platform and has implications for the target detection algorithms and other
data processing techniques that are employed. Furthermore, the cooperative or
uncooperative nature of the system will also have an impact on the choice of sensors.
Cooperative implies that all vehicles in the air share information through common
communication links. Uncooperative means that vehicles in the sky do not
communicate with each other, and therefore implies that there is no other way to
detect other vehicles than with a self-contained, passive detection system.

20.3.1.1 Vision-Based Collision Avoidance


Electro-optical (EO) sensors possess a unique combination of characteristics that
make them an attractive sensor type to utilize in the sense-and-avoid application.
The passive means by which EO sensors obtain information about the environment
allows EO-based sense-and-avoid systems to detect other aircraft regardless of their
onboard equipment. In contrast, the Traffic Alert and Collision Avoidance System
(TCAS) is an active sensing approach that relies on the presence of transponder
equipment onboard other aircraft (FAA 2000). Robustness against noncooperative
threats is one of the key advantages of an EO sensing approach.
The maturity of EO sensor technology is also at a stage suitable for the UAV
sense-and-avoid application. The current state-of-the-art EO sensors tend to be
compact, light, and low power (due to their passive nature of operation), allowing
them to be operationally feasible even in relatively small UAV platforms with
conservative size, weight, and power (SWaP) budgets. Furthermore, sensors
supporting high-speed IEEE 1394 and IEEE 802.3-2008 (Gigabit Ethernet)
communication interfaces are readily available as commercial off-the-shelf (COTS)
products and can easily facilitate real-time capture and transfer of image data at high
resolutions. Currently, the market offers four preferred standards for transferring
digital video from the camera to the image-processing computer or workstation:
FireWire (IEEE 1394), USB 2.0, Gigabit Ethernet, and Camera Link. Although
not covered in this section, analog interfaces have constituted the backbone of the
imaging industry as it is known today, and they are still present in the analog video
links of many UAV payloads.
The information provided by EO sensors can be exploited for purposes other
than the detection and localization of targets in the image plane. Relative bearing
information derived from target location on the image plane can be used to evaluate
the risk of collision (constant relative bearings correspond to high risk, whereas
large variations in relative bearings correspond to low risk) (Angelov et al. 2008).
Furthermore, range information useful for control purposes can be derived from a
single sensor when combined with appropriate aircraft maneuvers (Karlsson and
Gustafsson 2005).
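The constant-bearing criterion can be sketched directly: convert the target's image column into a relative bearing using the camera geometry, then flag a near-zero bearing rate as high collision risk. The focal length, threshold, and pixel track below are illustrative.

```python
import numpy as np

# Constant-bearing risk sketch: a target whose relative bearing barely
# changes from frame to frame is on a near-collision course.
FX, CX = 800.0, 320.0                        # illustrative camera intrinsics
RATE_THRESHOLD = np.deg2rad(0.2)             # rad/s, illustrative

def bearing_from_pixel(u):
    """Horizontal relative bearing of a target at image column u."""
    return np.arctan((u - CX) / FX)

def high_collision_risk(u_track, dt):
    """u_track: target column over consecutive frames; dt: frame period (s)."""
    bearings = bearing_from_pixel(np.asarray(u_track, dtype=float))
    rates = np.abs(np.diff(bearings)) / dt
    return bool(np.all(rates < RATE_THRESHOLD))   # True = near-constant bearing

# A target drifting only a fraction of a pixel per frame: high risk.
print(high_collision_risk([400.0, 400.1, 400.2, 400.3], dt=1 / 25))
```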
An EO sensing approach offers many benefits, and studies have suggested that
EO-based sense-and-avoid systems hold the greatest potential for gaining regulatory
approval (Karhoff et al. 2006). However, there are also many challenges associated
with an EO sensing approach. One of the most significant challenges lies in
the unpredictable and constantly changing nature of the airborne environment. In
particular, for visible light spectrum EO sensors, detection algorithms must be able
to handle a variety of image backgrounds (from blue-sky to cloud and ground
clutter), lighting conditions, and possibly image artifacts (e.g., lens flare).
Another factor to contend with when using an EO sensing approach is the
presence of image jitter. Image jitter is caused by movement of the camera
sensor and hence is exacerbated in an airborne environment, where the platform
is in constant motion and subject to unpredictable aerodynamic disturbances. For
detection algorithms that exploit target motion dynamics in the image plane,
image jitter represents an undesirable noise component that can have a significant
impact on performance (Utt et al. 2004; Lai et al. 2011a). Various jitter
compensation techniques based on aircraft state information and image features
have been proposed to reduce the effects of image jitter, although it cannot be
completely eliminated.
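One common image-feature form of such compensation is to estimate the global frame-to-frame motion and warp the current frame back into the previous frame's coordinates before detection. The OpenCV-based sketch below is one possible realization, not the method of the systems cited above.

```python
import cv2

# Feature-based jitter compensation sketch: fit a global similarity
# transform between consecutive frames and undo it.
def stabilize(prev_gray, curr_gray):
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=10)
    if pts_prev is None:
        return curr_gray                       # nothing to track
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts_prev, None)
    good = status.ravel() == 1
    # Robustly fit rotation + translation + scale from tracked features.
    M, _ = cv2.estimateAffinePartial2D(pts_curr[good], pts_prev[good])
    if M is None:
        return curr_gray
    h, w = curr_gray.shape
    return cv2.warpAffine(curr_gray, M, (w, h))   # jitter-compensated frame
```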
Finally, it can also be a challenge to achieve real-time processing of EO
sensor image data, given the computationally intensive nature of image-processing
algorithms. However, this is gradually being overcome with the development of
specialized parallel processing hardware (such as graphics processing units (GPU),
field-programmable gate arrays (FPGA), and dedicated digital signal processors
(DSP)). Several attempts have been made, aiming to characterize the performance
of these specialize parallel processing hardware (Chase et al. 2008; Thomas et al.
2009), but their utility is highly dependent on the application and its degree of
parallelization.
Over the past decade, government, university, and private research groups
have demonstrated EO-based sense-and-avoid technologies of varying levels of
sophistication (Lai et al. 2012). One of the most mature EO-based sense-and-
avoid technology programs has been conducted by a partnership between Defense
Research Associates, Inc. (DRA), the Air Force Research Laboratory (AFRL),
and Aeronautical Systems Center (ASC) (Utt et al. 2004, 2005). Over a number
of development phases, detection and tracking of noncooperative general aviation
aircraft out to a range of approximately 7 nautical miles has been demonstrated onboard
an Aerostar UAV. The program aims to ultimately provide a combined cooperative
and noncooperative collision avoidance capability for seamless integration onto
Department of Defense platforms.
Concurrently, the Australian Research Centre for Aerospace Automation
(ARCAA) has been undertaking a program to develop a cost-effective vision-
based sense-and-avoid system for civilian UAV applications (Lai et al. 2011a,b).
Extensive air-to-air target image data as well as nontarget data has been collected
to characterize the detection range and false alarm performance of the proposed
detection subsystem. Furthermore, closed-loop flight trials have been conducted,
demonstrating the ability of the prototype system to automatically detect intruder
aircraft and command the host aircraft autopilot to perform avoidance maneuvers.
These development programs and other studies over the past decade have advanced
the knowledge and understanding of trade-offs between EO sensor parameters (such
as camera field of view) and system performance measures (such as detection range,
detection probability, and false alarm rate) to a large extent. For example, many
studies have shown that, in general, increasing the field of view reduces detection
range and vice versa (Geyer et al. 2009; Lai et al. 2012).
However, the knowledge so far gained is far from complete, and more research
and experimentation is required to determine the design trade-offs that will allow
an EO-based sense-and-avoid system to demonstrate an equivalent level of safety
to human pilot see-and-avoid. Figure 20.1 presents two examples of images taken
from an onboard collision avoidance system.

20.3.1.2 Collision Avoidance Using Traffic Alert and Collision Avoidance System (TCAS)
TCAS is a system designed to reduce the incidence of air-to-air collisions and hence
improve overall aircraft safety. TCAS was originally designed for manned aviation;
however, its adoption for unmanned aviation is feasible and will be dictated by
the degree of integration of UAVs into civil airspace. Its current cost (between
$25,000 and $150,000) might nevertheless prevent widespread adoption of TCAS
in the UAV sector. TCAS consists of electronics that generate and interpret the
radio signals of transponders on nearby aircraft. TCAS can provide auditory alarms and
visual information through cockpit displays. The capabilities of the system in terms
of the type of information provided to pilots and the conflict resolution strategies
employed are defined by the TCAS generation (e.g., TCAS I or II). TCAS can
provide Traffic Advisories (TA) that warn pilots of aircraft flying in the vicinity.
Fig. 20.1 Example images of EO target detection taken from a collision avoidance system onboard a Cessna 172 (Greer et al. 2010). The detected aircraft is highlighted by rectangular boxes (the large box is a zoomed version for clarity)

Furthermore, the system can issue Resolution Advisories (RA) that suggest actions
the pilot should follow to avoid a collision. These may be corrective or preventive,
depending on whether or not a deviation in course is suggested. For full
technical details and capabilities of TCAS, readers are referred to Harman (1989)
and FAA (2000).
TCAS can only be fully effective as a means of collision avoidance if all
aircraft in a given airspace are equipped with it, as one might expect of any
cooperative collision avoidance system.

20.3.1.3 Automatic Dependent Surveillance-Broadcast (ADS-B) for Collision Avoidance
ADS-B is a relatively new technology that offers great promise for collision
avoidance. ADS-B is not restricted to air-to-air surveillance (unlike TCAS), as it
can also be utilized in air-to-ground communications with the potential to replace
secondary surveillance radars. It resembles TCAS in the way that radio signals
are used to transmit and receive information from nearby aircraft. One important
and distinctive feature of ADS-B is the type of information exchanged. Each
aircraft would have the capability to share messages including 3D position, speed,
heading, time, and intent. This information is highly valuable in the design of
collision avoidance systems. However, ADS-B is often criticized for its excessive
verbosity, which is not always necessary for collision avoidance. The high volume
of information exchanged in ADS-B limits the number of aircraft that can use the
system in a given time due to channel data bandwidth constraints. Nonetheless, the
affordability of ADS-B will continue to motivate future research and development
into its application for UAVs (Lai et al. 2009).
There are ADS-B systems already in the market for under U.S. $3,000 (Kunzi
2009; NavWorx 2011). A detailed description of ADS-B systems, including its
operational capabilities, can be found in RTCA (2002).
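Because ADS-B state messages carry position and velocity directly, a receiving aircraft can run a simple closest-point-of-approach test against each intruder, which is one straightforward way such messages can feed a collision avoidance function. The local Cartesian frame, values, and alert thresholds below are illustrative, and conversion from the broadcast latitude/longitude/altitude is assumed to happen upstream.

```python
import numpy as np

# Closest-point-of-approach (CPA) sketch from ADS-B-style state vectors,
# assuming constant velocity and a shared local frame (m, m/s).
def cpa(own_pos, own_vel, intr_pos, intr_vel):
    """Return (time of CPA in s, miss distance in m)."""
    dp = np.asarray(intr_pos, float) - np.asarray(own_pos, float)
    dv = np.asarray(intr_vel, float) - np.asarray(own_vel, float)
    dv2 = dv @ dv
    t = 0.0 if dv2 < 1e-9 else max(0.0, -(dp @ dv) / dv2)
    return t, float(np.linalg.norm(dp + dv * t))

t_cpa, miss = cpa(own_pos=[0, 0, 1000], own_vel=[60, 0, 0],
                  intr_pos=[5000, 200, 1000], intr_vel=[-60, 0, 0])
alert = miss < 500.0 and t_cpa < 60.0       # illustrative alert thresholds
```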

20.3.2 Remote Sensing: Power Line Inspection and Vegetation Management

Power line infrastructure is a large and costly asset to manage, costing many millions
of dollars per year. Inspecting power lines for failures (or worn, broken parts) and
vegetation encroachment is a significant task, particularly for electricity companies
such as Ergon Energy, which manages one of the largest networks in the world,
consisting of over 150,000 km of power lines covering an area of 120,000 km² of
Queensland, Australia (Li et al. 2012).
Traditionally, power line inspection is carried out by crews visiting
the power lines by ground vehicle. However, this is time consuming, and terrain
or proximity limitations can make the task difficult. Unpiloted aircraft are an
ideal technology for streamlining the inspection task carried out by ground crews.
UAVs can achieve similar results to ground inspection but with greater efficiency,
and in many cases they can gain visibility of power lines in difficult locations that
are inaccessible by ground vehicle.
Alternatively, inspecting power lines with a manned aerial inspection aircraft
involves precise flying at low altitude above the infrastructure to be inspected. This
is a potentially hazardous task for a human pilot, owing to fatigue from sitting in the
aircraft for long periods and the intense concentration and effort required
to ensure that the infrastructure remains within the swath or field of view of the
sensors (usually a LIDAR and a high-resolution camera) (Li et al. 2012). A LIDAR
and camera combination provides excellent sources of data with complementary
characteristics. However, LIDAR data files are large, often in the order of hundreds
of gigabytes or more, and the processing of LIDAR data captured by a UAV is an
important task that currently must be done after flight missions.
A future avenue is to perform LIDAR processing onboard the aerial vehicle.
Processing may be done by a number of techniques. During processing, one of
the tasks is to distinguish wanted features from unwanted features in the data.
For example, trees, buildings, and other obstacles that are not near the power
line infrastructure are unwanted, while information such as the precise height and
location of the power lines and their displacement from potentially hazardous
objects such as vegetation (which may start bushfires) is among the wanted data
to be extracted.

Fig. 20.2 Typical infrared image with 10 cm spatial resolution taken from an aircraft onboard camera (image courtesy of Ergon Energy and the Cooperative Research Centre for Spatial Information (CRCSI))
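A first processing step of this kind, separating corridor features from the rest of the cloud, can be sketched as a lateral-distance filter around a known line corridor followed by a clearance check below the wire height. The corridor geometry and thresholds below are illustrative.

```python
import numpy as np

# Corridor filtering sketch: keep LIDAR points near a known power line
# corridor, then flag points that encroach on the clearance envelope.
def corridor_filter(points, line_start, line_end, half_width=15.0):
    """points: (N, 3) x/y/z; line_start/line_end: corridor centerline ends."""
    a, b = np.asarray(line_start, float), np.asarray(line_end, float)
    ab = b[:2] - a[:2]
    t = np.clip(((points[:, :2] - a[:2]) @ ab) / (ab @ ab), 0.0, 1.0)
    nearest = a[:2] + t[:, None] * ab              # closest centerline point
    lateral = np.linalg.norm(points[:, :2] - nearest, axis=1)
    return points[lateral < half_width]            # drop unwanted points

def encroaching(points, wire_height, clearance=3.0):
    """Points (e.g., vegetation) within the clearance envelope below the wire."""
    return points[points[:, 2] > wire_height - clearance]

cloud = np.random.rand(10000, 3) * [200, 200, 30]  # stand-in point cloud
corridor = corridor_filter(cloud, line_start=[0, 100, 0], line_end=[200, 100, 0])
hazards = encroaching(corridor, wire_height=20.0)
```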
LIDAR and images (IR or visible spectrum) can be combined to provide a
comprehensive analysis of the area under examination. Figure 20.2 shows a typical
infrared image taken from an aircraft: vegetation and other objects are clearly
distinguishable, which facilitates the identification of potentially dangerous
vegetation in the vicinity of power lines. Image data can be combined in a
multilayered processing approach with other sources of information, as depicted in
Fig. 20.3. Here, an image (top left) has been used to detect power lines, creating a
region of interest around the lines; the same image, after georeferencing, is then
merged with LIDAR data (top right) to improve feature detection rates. A typical
LIDAR data representation can be seen in the middle, while at the bottom, an
example representation of the merged data can be observed.
Because the task requires tracking linear infrastructure from the air (not
merely en route waypoint-to-waypoint navigation), specialized UAV control systems
may be required (Bruggemann and Ford 2011; Bruggemann et al. 2011; Li et al.
2012). Without a human pilot to control the aircraft, stabilizing and controlling
the platform so that the assets to be inspected fall within the field of view
of the LIDAR or EO sensor becomes more challenging. The control system must
also maintain steady altitude and speed in accordance with the LIDAR/camera
scan/frame rate and swath/field-of-view requirements. In addition, accurate spatial
knowledge of the power line infrastructure to be tracked is required, and the UAV's
own flight management, navigation, and sensing systems must be capable of
supporting the task.
Fig. 20.3 Example of merging image and LIDAR data (source data courtesy of Ergon Energy and the Cooperative Research Centre for Spatial Information (CRCSI))

Currently, however, there are a number of outstanding challenges impeding the
use of UAVs for power line inspection. The UAV has to be dynamically capable of
flying at low altitudes in windy or turbulent conditions and able to carry a sizeable
navigation and sensor payload (including onboard power and LIDAR/image
data storage systems), all while remaining steady as it flies over the power line
infrastructure.
Intuitively, this means the UAV must be of reasonably large size, for a
number of reasons. Firstly, unless the sensor payload can compensate for unwanted
dynamic motion, there is a requirement for stable flight over the power lines; this
rules out many small to medium UAVs, which are sensitive to wind and turbulence
at low altitudes. Secondly, the LIDAR/camera sensor payloads needed to capture fine
details (such as tree branches and leaves, wires, and bolts) are quite large (on the
order of 80 kg in weight and of large volume), which also rules out many smaller
UAV platforms. Thirdly, the use in civilian airspace of a large UAV weighing 100 kg
or more presents its own technical, legal, and regulatory challenges. In the future,
smaller and lighter LIDAR/camera sensor payloads that offer the resolutions
required for power line inspection could help facilitate the use of smaller UAV
platforms for this task. This application illustrates the important coupling between
the UAV platform and the sensor payload.

Conclusion
This chapter has outlined some of the most common payloads for civilian
applications, along with two examples of the experimental use of sensors for
civilian missions: collision avoidance and remote sensing. Collision avoidance
represents one of the critical capabilities UAVs need in order to routinely perform
missions in shared airspace, while in a civilian context, remote sensing is
considered the field that can benefit the most from UAV technology.
However, the establishment of UAVs in the civilian arena will be dictated by
how regulators define and establish requirements and guidelines for UAVs.
This will similarly affect payload manufacturers, UAV manufacturers, and UAV
service providers.
Overall, the UAV sector increasingly demands sensors that can provide
more onboard autonomy, less unnecessary weight, and more ISR capability
without an exponential increase in cost or weight. Applications such as ISR will
require or emphasize sensors such as cameras, signals intelligence receivers, and
robust radio relays, while commercial payloads will emphasize very high data rate
communications and sensors such as visible-light and multispectral (hyperspectral)
cameras.
In the near future, enduring and persistent UAVs flying for days, months, or
years at high altitude will require a completely new approach to the design and
manufacture of payloads. These new payloads will form new communications
and sensing infrastructures. UAVs flying in the upper atmosphere will behave
much like satellites, but with the clear advantage that if something goes wrong,
the vehicle can be commanded to land while its replacement is sent to continue
the mission. This capability would be very difficult to replicate with traditional
satellites; herein lies the gain in using UAVs for missions traditionally assigned
to satellites.
