Remote Sensing of the Environment with Small Unmanned Aircraft Systems
(UASs), Part 1: A review of progress and challenges
Ken Whitehead1* and Chris H. Hugenholtz1
1 Department of Geography, University of Calgary, 2500 University Drive NW, Calgary AB
T2N 1N4
* Corresponding author: [email protected]
Abstract: The recent development and proliferation of Unmanned Aircraft Systems (UASs) has
made it possible to examine environmental processes and changes occurring at spatial and
temporal scales that would be difficult or impossible to detect using conventional remote sensing
platforms. This review article highlights new developments in UAS-based remote sensing,
focusing mainly on small UASs (< 25 kg). Because this class is generally less expensive and more versatile than larger systems, the use of small UASs for civil, commercial and scientific applications is expected to expand considerably in the future.
In order to highlight different environmental applications, we provide an overview of
recent progress in remote sensing with small UASs, including photogrammetry, multispectral
and hyperspectral imaging, thermal imaging, SAR, and LiDAR. We also draw on the literature and
our own research experience in order to identify some key research challenges, including
limitations of the current generation of platforms and sensors, and the development of optimal
methodologies for processing and analysis. While much of the potential of small UASs for
remote sensing remains to be realised, it is likely that the next few years will see such systems
being used to provide data for an ever-increasing range of environmental applications.
Keywords: UAS, remote sensing, environment, review
1. Introduction
Remote sensing data acquired from satellites and piloted aircraft are powerful tools for
measuring the state and trends of environmental changes associated with natural processes and
human-induced alterations of the environment. For many situations, data from these platforms
provide the only way to measure features or processes on the Earth’s surface and in the
atmosphere, and to evaluate how they are changing. To address the growing demand for data on
the state of the environment, the science of remote sensing is continually evolving. A wide
variety of sensors are now available (Jensen 2000), including hyperspectral imaging systems that
characterise reflections from surface objects across a vast portion of the electromagnetic
spectrum (e.g. Huesca et al. 2013), and LiDAR systems (Light Detection and Ranging) that
provide detailed three-dimensional representations of Earth surface features and topography (e.g.
Glennie et al. 2013).
However, the conventional airborne and satellite remote sensing platforms upon which
most sensors are mounted have not always met the demands of researchers and environmental
professionals. For many environmental applications these platforms pose challenges or require
tradeoffs, such as high cost, lack of operational flexibility, limited versatility and/or poor spatial
and temporal resolution. The ability to identify, measure, and forecast environmental changes
requires remote sensing data that match the resolution of the changes and the associated
processes. Often data acquired from conventional remote sensing platforms do not have the
resolution and operational flexibility to address this challenge effectively or affordably. Attempts
have been made with different types of low-cost platforms to overcome this gap (e.g. telescoping
masts: Schlitz 2004; balloons: Vierling et al. 2006; and kites: Wundram and Loffler 2007) but
limited adoption suggests these platforms have not met the demands of the research and
professional communities. We suspect this is because they involve customised designs, are
operationally impractical for some environments, or rely on manual control.
One area of recent innovation in terms of remote sensing platforms is the development
and application of small Unmanned Aircraft Systems (UASs), also known as drones, Unmanned
Aerial Vehicles (UAVs), or Remotely-Piloted Aircraft (RPA). UASs are emerging as flexible
platforms that, in many cases, have the potential to supplement and/or complement remote
sensing measurements acquired from satellites or manned aircraft. Spurred by growth in military
applications, UAV technology is now mainstream and affordable, with UASs now being applied
to a broad spectrum of environmental applications: from elephant enumeration in Africa
(Vermeulen et al. 2013) to glacier measurements in Canada’s high arctic (Whitehead et al. 2013).
Relative to conventional satellite and manned aircraft platforms, there are several characteristics
that make UASs highly attractive to remote sensing researchers and professionals alike,
including their: (a) low cost, particularly the “small” (< 25 kg) category (cf. Hugenholtz et al.
2012); (b) ability to perform missions and acquire data autonomously so that human operation is
minimised; (c) manoeuvrability, which is ideal for low altitude flying and navigating complex
environments; (d) ability to operate in adverse weather and dangerous environments; (e) reduced
exposure risk to pilots; and (f) a regulatory framework in many countries (like Canada) that
enables research and commercial applications. For these reasons, as well as many others,
tremendous growth in this sector is anticipated over the next decade (Hugenholtz et al. 2012;
Watts et al. 2012; Nex and Remondino 2013).
Another indicator of the growth of UAS remote sensing research is the scientific
literature. An analysis using the Web of Science and Scopus databases reveals that there has been
an upward trend of papers published in scholarly journals since the early 2000s (Figure 1). The
data were compiled by searching for titles, abstracts or keywords of journal articles and reviews
in each database that contained “UAV” or “unmanned”, and “remote sensing” or “image”. The
analysis is not exhaustive because it does not include conference papers and book chapters;
nevertheless, it shows that the pace of scientific adoption of UAS-based remote sensing has
increased in the last decade. We surmise that this increase reflects growing awareness of UAS
technology and its remote sensing capabilities, its straightforward operation, as well as the
growing commercial UAS manufacturing sector, which now offers a broad selection of small
UASs that increasingly fall within the reach of research budgets.
In anticipation of the continued growth and expansion of small UASs for remote sensing
of the environment (cf. Hugenholtz et al. 2012), this paper presents an overview of recent
developments in UAS-based remote sensing systems and methods. In order to expand on
previous review articles (e.g. Hardin and Jensen 2011; Watts et al. 2012; Nex and Remondino
2013), this paper emphasises the remote sensing technology and highlights some of the major
research challenges that lay ahead. After a brief overview of the technology, we outline the
UAS-based remote sensing workflow and follow with recent developments in sensor technology
and applications. Based on the literature and our own research experience using UASs and
working with the remote sensing data they acquire, we then attempt to provide some guidance on
future research by outlining some of the key challenges that currently exist in the context of
environmental measurements and monitoring.
2. UAS Basics
Several key papers and reviews have already outlined the main features of UASs (e.g. Hardin
and Jensen 2011; Watts et al. 2012; Nex and Remondino 2013), so here we provide a very brief
overview of the technology. First, it is important to clarify that a UAS consists of several
components (Eisenbeiss 2009): the aircraft or UAV, a Ground Control System (GCS), a pilot or
navigator who operates the UAS from the GCS, and one or more spotters who monitor the UAS
and other aircraft and hazards in the area.
For most remote sensing and mapping applications, UASs are autonomous and controlled
via an onboard autopilot. Positional information is provided using a Global Navigation Satellite System (GNSS), which yields regularly-sampled measurements of 3D position. The
majority of UAS platforms will also have an Inertial Measurement Unit (IMU) that provides
information on the aircraft attitude at any given time. The position and attitude of the aircraft are fed into the autopilot, which then makes the necessary adjustments to keep the aircraft on course, either by varying the throttle or by adjusting control surfaces as appropriate. The autopilot may also be used to generate a signal that triggers a camera at predetermined positions. The operator can override the autopilot at any time via the GCS.
UASs may be either fixed-wing or rotary-wing, with fixed-wing UASs typically having greater speed and longer range. Rotary-wing UASs include miniature helicopters and multirotor platforms. They typically have shorter flight durations, but offer greater manoeuvrability. Fixed-wing UASs are typically launched by hand or by catapult, and land with or without some form of
arresting mechanism, such as a parachute or by flying into a net. Rotary-wing UASs typically
require some manual operation for take-off, and may or may not require manual operation for
landing.
UASs may also be divided into different classes, depending on weight and capabilities. In
the US, UASs are classified as micro (< 0.9 kg), mini (0.9 – 13.6 kg), tactical (13.6 – 454.5 kg),
Medium Altitude Long Endurance (MALE: 454.5 – 13,636.4 kg), and High Altitude Long
Endurance (HALE: > 13,636.4 kg). The U.S. Federal Aviation Administration Modernisation
and Reform Act of 2012 (FAAMRA) also recognises a further class of small unmanned aircraft systems (< 25 kg) (Hugenholtz et al. 2012). Within Canada, UASs with a maximum takeoff
weight less than 35 kg are currently permitted, subject to regulatory approval (Transport Canada
2008). We estimate that the greatest uptake for commercial and remote sensing applications will
occur at the lighter end of the scale, for platforms weighing less than 5 kg, since this is the region
where the cost advantages of UASs are likely to be most significant and where risk associated
with blunt force impact is also minimised.
3. UAS regulations and remote sensing
There are several characteristics of remote sensing surveys with small UASs that stand out from
those performed by satellites or manned aircraft. For the time being, many of the differences are
driven by aviation regulations that are in place to address safety issues arising from civil uses of
UASs. However, the regulations also place restrictions on how UASs are operated, which in turn
has a major impact on the types of data that can be acquired from these platforms. In this section,
we outline some of the key regulatory criteria that distinguish remote sensing data acquired by
UASs.
At the time of writing, many countries and organisations are in the process of establishing
or revamping regulatory frameworks for integrating UASs into civil airspace (e.g. U.S.;
Hugenholtz et al. 2012), which makes it difficult to define a consistent set of regulations
affecting UAS-based remote sensing. Nevertheless, according to existing rules in countries like
Canada, the U.S. and the UK, there are three criteria that are likely to persist into the future:
limited flying altitude, flying within visual range, and proximity to built-up areas.
In Canada and the UK, operators of small UASs are required to fly below 400 feet (122
m) Above Ground Level (AGL) unless otherwise specified. In the U.S., similar height
restrictions generally apply, although Rango and Laliberte (2010) were able to obtain permission
to fly at up to 1,000 feet (305 m) AGL. For remote sensing data, the low flying height enables
the acquisition of ultra-high resolution imagery (centimetre-scale), which is a major benefit for
some applications, but it also introduces a trade-off in that a large number of images, perhaps
several hundred, may be required to completely cover the area of interest. The trade-off emerges
in the image processing, such that variations in viewing geometry, as well as the roll, pitch and
yaw of the aircraft, can yield radiometric and geometric distortions in the final image mosaic.
Requirements that the aircraft be in visual range at all times place an effective limit on
the distance between the operator and the aircraft, which varies according to the shape and colour
of the aircraft, and the atmospheric conditions during flight. This requirement places a limit on
the size of the survey area, thereby often necessitating extra flights to cover larger areas. Future
advances in sense and avoid technology may permit UAS flights beyond visual range, thus
enabling UAS-based remote sensing of larger areas.
The final regulatory criterion that is likely to persist into the future is the restriction of
UAS flights near built-up areas. Without doubt, the high resolution of UAS surveys would be of
benefit for many engineering and construction projects in urban environments. However, even
with the development of reliable sense and avoid technology, public safety (and possibly
privacy) considerations are likely to rule out UAS remote sensing in this context.
4. UAS remote sensing
4.1. Survey and flight planning
The remote sensing workflow for small UASs is essentially an adaptation of the same steps and
processes used for piloted aircraft surveys, and in both cases, aviation regulations place certain
restrictions on how the surveys are configured.
Though each UAS survey is unique in nature, the same generic workflow is normally
followed. Typically, a UAS survey starts with flight planning (Hugenholtz et al. 2013). This
stage relies on specialised flight-planning software and uses a background map or satellite image
to define the survey area. Additional information is then added, such as the desired flying height,
the focal length and orientation of the camera, the desired amount of overlap between images,
and the desired flight direction. The flight-planning software will then calculate the optimal
solution to obtain overlapping stereo imagery covering the area of interest. During this process,
the various parameters can be adjusted, until the operator is satisfied with the flight plan. An
example of a flight plan with image footprints is shown in Figure 2.
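The footprint and overlap arithmetic underlying this calculation is straightforward. The following sketch, with illustrative camera parameters rather than values from any particular system, shows how a flight planner might derive the nominal ground footprint of a nadir image and the spacing between successive exposures:

```python
def footprint(flying_height_m, focal_length_mm, sensor_width_mm, sensor_height_mm):
    """Nominal ground footprint (across-track, along-track) of one nadir image,
    assuming flat terrain at the planned flying height."""
    scale = flying_height_m / (focal_length_mm / 1000.0)  # photo scale factor
    return (sensor_width_mm / 1000.0 * scale,
            sensor_height_mm / 1000.0 * scale)

def photo_spacing(footprint_along_m, forward_overlap):
    """Distance between successive exposures for a given forward overlap."""
    return footprint_along_m * (1.0 - forward_overlap)

# Illustrative values: 120 m AGL, a 5 mm lens on a small-sensor compact camera.
across, along = footprint(120.0, 5.0, 6.2, 4.6)   # ~148.8 m x ~110.4 m
spacing = photo_spacing(along, 0.8)               # ~22.1 m between exposures
```

In practice the flight-planning software performs this calculation for every strip, adding lateral overlap between adjacent flight lines in the same way.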
As part of the planning stage, the camera shutter speed may need to be set manually, but
many consumer-grade cameras now use pre-defined automatic modes for different lighting
conditions. When setting the shutter speed manually, experience tends to trump any given rule of thumb, because the setting will depend on the ground surface cover and the ambient light levels. If the exposure time is too long, the imagery will be blurred or over-exposed, but if it is too short, the imagery might be too dark to discriminate key features of interest.
Once a flight plan has been generated, it is uploaded to the UAS autopilot. The
instructions contained in the flight plan are used by the autopilot to calculate the necessary climb
rates and positional adjustments that enable the aircraft to follow the planned course as closely as
possible. Readings from the GNSS and IMU are typically logged by the autopilot several times
per second throughout the flight.
On completion of the flight a log file is usually downloaded from the aircraft autopilot
(Hugenholtz et al. 2013). This file contains details about the recorded aircraft position and
attitude throughout the flight, as well as details about when the camera was triggered. This log
file is typically used to provide initial estimates for image centre positions and camera
orientations, which are then used as inputs to the photogrammetric process. These initial
estimates will reflect the accuracy of onboard instrumentation. For example, a low-cost UAS
using inexpensive mapping-grade navigational sensors will typically have positional accuracies
in the 2-5 m range (Turner et al. 2013). Further errors will be introduced if there is uncertainty in
the timing of the camera shutter release. Such errors can be significant, with a one second delay
potentially resulting in an error of 30-50 m in the direction of flight for a fast-flying fixed-wing
UAS.
4.2. Photogrammetry
Other than the collection of airborne video footage, the most common non-military application of
UASs to date has been for large-scale photogrammetric mapping (e.g. Haala et al. 2011;
d'Oleire-Oltmanns et al. 2012; Hugenholtz et al. 2013; Whitehead et al. 2013). Issues such as
platform stability and the use of non-metric cameras usually mean that the geometry of the
imagery collected is of a lower quality than that obtained during traditional photogrammetric
surveys carried out from manned aircraft (Hardin and Jensen 2011). UAS surveys also tend to
collect images with large amounts of overlap. This is partly because the low flying height and
comparatively low accuracy of onboard navigational sensors can lead to significant differences
between the image footprints estimated during flight planning and the actual ground coverage of
each image, especially in undulating terrain (Haala et al. 2011; Zhang et al. 2011). Image
footprints can also drift from expectation due to changes in the roll, pitch and yaw of the aircraft
caused by wind and navigation corrections. In spite of these drawbacks, the low flying heights
normally make it possible to gather imagery with sub-decimeter spatial resolution. This level of
detail combined with low costs, flexibility in the timing of image acquisition, and short turn-
around times makes UAS-based photogrammetry an attractive option for many potential users
across a broad spectrum of research and professional applications.
Within the field of aerial photography in general, the last twenty years have seen high-resolution digital imagery largely replace analog aerial photography, along with the development of onboard navigational systems that provide accurate positional and attitude information. This
has spurred the parallel development of automated photogrammetric processing packages, such
as Inpho (e.g. Haala et al. 2011; Hugenholtz et al. 2013; Whitehead et al. 2013) and LPS (e.g.
Laliberte and Rango 2011; d'Oleire-Oltmanns et al. 2012). These software packages provide a
semi-automated workflow, allowing for the production of Digital Elevation Models (DEMs) and
orthophoto mosaics with limited operator intervention. The photogrammetric processing chain
for a typical UAS survey is described in detail by Hugenholtz et al. (2013) and by Whitehead et
al. (2013), who describe processing using Trimble’s Inpho software. The process is, however, the same for most photogrammetric software packages. The log file from the UAS autopilot is used
to provide initial estimates for the position and orientation of each image. In addition, it is usual
to include a number of accurately-surveyed Ground Control Points (GCPs) in the
photogrammetric adjustment (see Figure 3). These usually consist of specially-placed targets that
are surveyed with a GNSS at the time of the UAS survey (e.g. Hugenholtz et al. 2013).
Aerial Triangulation (AT) refers to the process by which the true positions and
orientations of the images from an aerial survey are re-established. This process includes project
setup, measurement of GCPs and manual tie points, and bundle-block adjustment (Hugenholtz et
al. 2013; Whitehead et al. 2013). During AT, a large number of automated tie points are
generated for conjugate points identified across multiple images. A bundle-block adjustment then
uses these automated tie points, along with manually observed GCPs and tie points, to optimise
the photo positions and orientations, with the goal being to recreate the positions and orientations
associated with each image at the time of its capture (Hugenholtz et al. 2013). The bundle-block
adjustment process generates a high number of redundant observations, which are used to derive
an optimal solution through a rigorous least squares adjustment. It is also common practice to
include a number of check points, which are not used during the AT process and which can be
used to later provide an independent check on the accuracy of the adjustment.
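The independent accuracy check described above usually reduces to computing the root-mean-square error (RMSE) of the check-point residuals. A minimal sketch, with hypothetical residuals:

```python
import math

def rmse(residuals_m):
    """Root-mean-square error of check-point residuals, in metres."""
    return math.sqrt(sum(r * r for r in residuals_m) / len(residuals_m))

# Hypothetical vertical residuals (surveyed minus modelled) at five check points:
dz = [0.04, -0.06, 0.02, 0.05, -0.03]
rmse_z = rmse(dz)   # ~0.042 m
```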
After AT, the oriented images may be used to generate a Digital Surface Model (DSM),
which provides a detailed representation of the terrain surface, including the elevations of raised
objects, such as trees and buildings. The DSM production process first creates a dense point
cloud, by matching features across multiple image pairs (Whitehead et al. 2013). Another
product that can be generated at this stage is a Digital Terrain Model (DTM), which is often
referred to as a bare-Earth model. For most purposes a DTM is a more useful product than a
surface model, since the high-frequency noise associated with vegetation cover is removed. A
DTM can be produced in a number of ways, including filtering of the dense point cloud used to
produce a DSM, or interpolation of a sparse point cloud (Arefi et al. 2009). DTMs often require
manual editing in order to remove the influence of larger buildings and heavily-vegetated areas,
which are generally not adequately filtered during their creation. Break lines and additional
points are often added during this process, in order to augment the quality of the final DTM.
While the terms are often used interchangeably, DEM is used in this review as a generic term, and can thus refer to either a DSM or a DTM.
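One crude illustration of the point-cloud filtering mentioned above is a grid-minimum filter, which keeps the lowest return in each grid cell as a candidate ground point. This is a sketch only; production DTM filters use more robust morphological or progressive densification approaches:

```python
def grid_minimum(points, cell_size):
    """Keep the lowest point in each grid cell as a crude ground estimate.
    points: iterable of (x, y, z) tuples; cell_size in the same units as x, y."""
    lowest = {}
    for x, y, z in points:
        key = (int(x // cell_size), int(y // cell_size))
        if key not in lowest or z < lowest[key][2]:
            lowest[key] = (x, y, z)
    return list(lowest.values())

# Two returns in the same 1 m cell: a canopy point at 3.2 m and ground at 0.4 m.
ground = grid_minimum([(0.2, 0.3, 3.2), (0.6, 0.7, 0.4)], 1.0)
```

The canopy return is discarded and only the lower (ground) point survives, illustrating why vegetation noise is suppressed in a DTM.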
After a DTM has been created, it can then be used to orthorectify the original images.
Orthorectification refers to the removal of distortions caused by relief, which result from the
central-perspective geometry associated with photography. Once orthorectified, the images have
an orthogonal geometry and can be used for direct measurement. After orthorectification, the
individual images can be combined into a mosaic, in order to provide a seamless image of the
survey area at the desired resolution. Orthorectification can also be carried out using a DSM, but
the amount of noise associated with dense vegetation can often cause the resulting orthoimage to
have a choppy and irregular appearance.
Photogrammetric processing of UAS imagery poses a number of challenges, since in
many ways the characteristics of such imagery are more akin to those encountered in terrestrial
photogrammetry than to conventional aerial photography (Turner et al. 2013). UAS imagery is
subject to variable scales, high amounts of overlap, variable image orientations, and often has
high amounts of relief displacement arising from the low flying heights relative to the variation
in topographic relief (Zhang et al. 2011). Under such circumstances, traditional photogrammetric processing approaches developed for well-calibrated metric cameras and regularly-spaced photography may not be optimal (Hardin and Jensen 2011).
Over the last few years a number of Structure from Motion (SfM) software packages
have been developed (e.g. Westoby et al. 2012; Fonstad et al. 2013; Turner et al. 2013). The SfM
approach uses algorithms originally developed for computer vision, such as the Scale Invariant
Feature Transform (SIFT), which identifies similar features in conjugate images. Unlike
conventional photogrammetry, which is bound by rigid geometric constraints, SfM is able to
accommodate large variations in scale and image acquisition geometry. The use of SfM packages
such as Bundler, along with its web implementation Photosynth (e.g. Fonstad et al. 2013; Turner
et al. 2013) and Photoscan (Turner et al. 2013) allows for a high degree of automation, and
makes it possible for non-specialists to produce accurate DSMs and orthophoto mosaics in less time than it would take using conventional photogrammetric software. Due to their comparatively
low cost and ability to handle unconventional imagery, it is likely that SfM packages will
increasingly become the software of choice for UAS photogrammetric surveys.
While most UAS photogrammetric surveys require GCPs to provide the required
horizontal and vertical accuracies, there have been a number of examples of surveys that have
obtained high-accuracy results without the use of ground control (e.g. Bláha et al. 2011; Turner
et al. 2013). These “direct georeferencing” systems use high-accuracy, carrier-phase GPS
measurements to obtain positional values for the aircraft which are in the 10 – 20 cm range, and
these are complemented by extremely accurate measurements of the aircraft attitude. In order to
obtain accurate results using this approach, the precise offsets between the camera and the GPS
antenna must be known, and the camera calibration must be accurately determined. Turner et al.
(2013) describe a system developed for an Oktokopter platform, which was able to obtain
horizontal and vertical accuracies in the 10 cm range, with no GCPs. Direct georeferencing
systems have also been described by Nagai et al. (2009) and Bláha et al. (2011).
Direct georeferencing is still comparatively rare. Because of the extreme sensitivity such
systems have to timing errors, they tend to use slow-moving Vertical Take-Off and Landing (VTOL) platforms, rather than fixed-wing UASs (e.g. Bláha et al. 2011; Turner et al. 2013). Direct georeferencing systems also
require the use of high-end survey-grade components, and to achieve good results they also need
to carry heavier cameras. These factors mean that such systems are generally expensive, heavy,
and have limited range (e.g. Nagai et al. 2009; Turner et al. 2013), but we surmise that UASs
with these capabilities will become more common in the near future.
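The lever-arm correction mentioned above amounts to rotating the camera-antenna offset from the aircraft body frame into the mapping frame using the measured attitude, then adding it to the antenna position. The sketch below is deliberately simplified to level flight (roll and pitch of zero) with illustrative coordinates; a full implementation would apply the complete roll-pitch-yaw rotation:

```python
import math

def camera_position(antenna_xyz, lever_arm_body, yaw_deg):
    """Antenna position plus the body-frame lever arm rotated by heading.
    lever_arm_body is (forward, right, up); level flight is assumed."""
    ax, ay, az = antenna_xyz
    fwd, rgt, up = lever_arm_body
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    # Rotate the horizontal components by the heading; the height offset is direct.
    return (ax + c * fwd - s * rgt, ay + s * fwd + c * rgt, az + up)

# Camera 0.10 m behind and 0.20 m below the antenna, aircraft heading 90 degrees:
cam = camera_position((500000.0, 5600000.0, 1200.0), (-0.10, 0.0, -0.20), 90.0)
```

Even this simplified form shows why attitude accuracy matters: any error in the measured heading rotates the lever arm incorrectly and propagates directly into the derived image positions.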
4.3. Multispectral and hyperspectral
A major thrust of remote sensing research and application is the analysis of the spectral content
of the imagery, and how it relates to land cover and other biophysical properties. In traditional
satellite and airborne remote sensing, it is usual to work with multiple image bands, covering
different parts of the electromagnetic spectrum. Payload weight restrictions and the cost of high-
end miniaturised imaging devices mean that small UASs are often restricted to carrying
consumer-grade cameras that are typically designed only to record spectral reflectance between
400 and 700 nm, the visible region of the spectrum (Lebourgeois et al. 2008; Rump et al. 2011).
The dynamic range of such cameras is also limited, and this can pose additional problems for
spectral analysis (Hardin and Jensen 2011).
For applications that investigate vegetation health, Near InfraRed (NIR) reflectance is
particularly important. Reflectance from healthy vegetation is at its highest in the region between
750 nm and 1250 nm, while this peak tends to be depressed in stressed vegetation. The sensor
arrays of many cameras are sensitive to radiation in this region of the spectrum, but it is normally
blocked out by the use of an internal “hot mirror” filter, which is designed to limit the camera
response only to the visible part of the spectrum. Hunt et al. (2010) used a Fuji FinePix S3 Pro
UVIR camera, which does not filter the NIR region, and instead used a customised red filter to
produce composite images made up from the blue, green, and NIR portions of the spectrum. This
camera was flown on a UAS and was used to monitor leaf area index for two fields of winter
wheat. A similar approach was taken by Knoth et al. (2013), who used a specially-converted
compact camera to classify vegetation types in a peat bog. This approach provides a low-cost
alternative to true multispectral sensors, but limitations in the quality of imagery that can be
obtained generally mean that such cameras do not provide suitable imagery for quantitative
analysis. In Figure 4 we present an example of multispectral imagery from an abandoned oil well
undergoing reclamation in southern Alberta. The area was flown twice: first to acquire RGB
imagery with a consumer-grade camera (Canon ELPH 115) and then a second time with a NIR
filter. The resulting data were used to create a false-colour infrared image and normalised
difference vegetation index (NDVI) map of leafy vegetation.
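The NDVI values in such a map follow from the standard band ratio, computed for each co-registered pixel. A minimal per-pixel sketch, with illustrative reflectance values:

```python
def ndvi(nir, red):
    """Normalised Difference Vegetation Index for one pair of co-registered
    reflectance values; healthy vegetation yields values approaching 1."""
    if nir + red == 0:
        return 0.0
    return (nir - red) / (nir + red)

# Illustrative reflectances: vegetation is bright in the NIR and dark in the red.
v_canopy = ndvi(0.50, 0.08)   # ~0.72
v_soil = ndvi(0.30, 0.25)     # ~0.09
```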
In recent years, a number of lightweight multispectral sensors have been developed for
UAS platforms. While these typically are bulkier, more expensive, and have lower resolutions
than the converted cameras described above, they have higher dynamic ranges, and their spectral
characteristics can often be customised. Many of these sensors also have multiple bands and
have adjustable spectral ranges. The calibration of a six-band Tetracam MCA-6 camera for UAS-
specific surveys is described by Kelsey and Lucier (2012). The same camera was used to
produce vegetation indices in order to monitor the health of a vineyard (Turner et al. 2011).
Baluja et al. (2012) also investigated vineyard water status using a UAS-mounted Tetracam
MCA-6 camera, along with a FLIR thermal camera. Another example is provided by Berni et al.
(2009), who used six visible and NIR bands in combination with thermal imagery in order to investigate crop canopy metrics, including chlorophyll content, leaf water
content, carotenoid concentration, dry mass, as well as structural parameters such as Leaf Area
Index (LAI). The imagery was obtained from a rotary-wing UAS, over three test plots consisting
of olive trees, peach trees, and cornfields.
Hyperspectral sensors sacrifice spatial resolution for spectral resolution and can provide a
measure of spectral response across hundreds of narrowly-defined spectral bands simultaneously.
Recent advances in sensor miniaturisation, along with low flying heights, mean that
hyperspectral surveys with ground resolutions of 0.2 m or better can now be carried out from
UAS platforms (e.g. Zarco-Tejada et al. 2012; Uto et al. 2013). The widespread collection of
airborne hyperspectral information with this level of spatial resolution has not hitherto been
practical, and this represents an example of a novel application for which UASs are uniquely
suited. Such surveys are able to provide detailed information on vegetation health, and can also be used as a basis for mapping vegetation species. The testing and calibration of a
miniaturised hyperspectral imaging system designed for UAS use was described by Duan et al.
(2013). Uto et al. (2013) were able to use data collected by a lightweight hyperspectral sensor to
monitor chlorophyll density in rice paddies, while Zarco-Tejada et al. (2013) were able to
estimate water stress and vegetation health metrics in a citrus orchard, using a number of derived
vegetation indices. Saari et al. (2011) describe the development of a lightweight hyperspectral
system, suitable for use on a small UAS, and optimised for forestry and agricultural applications.
Kaivosoja et al. (2013) investigated the use of UAS acquired hyperspectral imagery, along with
farm history records, for precision agriculture.
Despite the growing potential for multi- and hyperspectral remote sensing with UASs,
there are also notable drawbacks and challenges in terms of the commensurability of these
sensors relative to existing sensor systems developed for the field and piloted aircraft. In
particular, a study conducted at the US Department of Energy’s Idaho National Laboratory by
Hruska et al. (2012) showed that one type of hyperspectral imaging spectrometer designed for
small UASs was of limited use for quantitative vegetation remote sensing applications,
including vegetation stress studies requiring red edge or specific bands for photochemical
reflectance indices. They also found significant geometric errors that resulted in obvious image
distortions (Fig. 5 in Hruska et al. 2012). Their research demonstrates that multi- or
hyperspectral sensors designed for use on UASs must be tested to ensure they adhere to the
radiometric and geometric benchmarks necessary to be useful in a quantitative sense.
4.4. Thermal
Thermal imaging is one area that is particularly suited to the use of small low-flying aerial
platforms. Traditionally, thermal imaging devices have required bulky and expensive cooling
systems. However, the use of new materials in the design of thermal sensors has led to the
development of a new generation of thermal imaging devices that operate at ambient
temperatures. These tend to be considerably smaller and less-expensive than the traditional
cooled thermal imaging sensors, making it practical to include them as part of a UAS payload,
alongside a regular camera. However, in spite of these technological advances, thermal imagers
are still comparatively expensive, and this has limited their application to date.
Thermal imaging is commonly used in vegetation monitoring. Berni et al. (2009) describe
a combined system which used UAS-acquired thermal imagery to monitor the temperature of a
number of agricultural test plots at different times of day, providing an indication of water uptake
by the different crops. Turner and Lucieer (2011) used a UAS-mounted thermal camera to
measure soil moisture as part of a study of vineyard health. Bellvert et al. (2013)
also used UAS-acquired thermal imagery to assess vineyard health. Sullivan et al. (2007) used
thermal imagery collected from a fixed-wing UAS to assess the response of cotton to irrigation
and crop residue management. Gonzalez-Dugo et al. (2013) used UAS-acquired thermal imagery
over a commercial orchard in Spain to investigate water uptake in five different species of fruit
trees. The imagery acquired was of sufficient resolution to enable water stress to be measured at
the individual tree level.
The potential of UAS thermal imaging for archaeology was demonstrated by Poirier et al.
(2013), who were able to detect previously unknown roads and walls dating back to Roman
times. UAS thermal surveys also show considerable potential for wildlife tracking and for anti-
poaching operations. An example of wildlife detection is provided by Israel (2011), who describes
a system to detect Roe Deer fawns in grassland. By using a multi-rotor VTOL UAS equipped
with a thermal camera, researchers were able to detect fawns prior to grass cutting, allowing
them to be moved out of harm’s way. Another application of thermal imagery is shown in Figure
5, where temperature difference was used to delineate the waterline along a river channel.
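The waterline example above reduces, in essence, to thresholding a brightness-temperature image on the thermal contrast between water and land. The following minimal sketch illustrates the idea; the temperature array and threshold are hypothetical, not values from the survey shown in Figure 5.

```python
import numpy as np

def delineate_waterline(temps_c, water_threshold_c):
    """Return a boolean mask of pixels classified as water.

    temps_c: 2-D array of brightness temperatures (deg C).
    water_threshold_c: pixels at or below this value are labelled water,
    exploiting the fact that a river channel is typically cooler than
    the surrounding land during the day.
    """
    return temps_c <= water_threshold_c

# Hypothetical 3x4 thermal scene: a cool channel (column 1) through warm land.
scene = np.array([[24.0, 12.5, 23.8, 25.1],
                  [23.5, 11.9, 24.2, 24.9],
                  [24.1, 12.2, 23.9, 25.3]])
mask = delineate_waterline(scene, water_threshold_c=15.0)
```

The mask boundary then traces the waterline; in practice the threshold would be chosen from the image histogram rather than fixed in advance.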
Thermal imagery is also well suited for real-time applications. A prototype near real-time
fire monitoring system was described by Ambrosia et al. (2003). Using a variation of the military
Predator drone, which was specially equipped with a combination of visible and thermal infrared
detectors, the system was able to penetrate thick smoke and identify several thermal hot spots
within a test burn. Imagery was relayed via satellite telemetry to a control centre, where it
was orthorectified and supplied to decision makers as a series of quick-look images within six
minutes of being acquired. Another near-real time application of thermal imagery acquired from
a UAS for fire monitoring is described by Wu et al. (2007).
Potential applications for thermal imagery also exist in search and rescue and in wildlife
counts, although few publications have documented such applications to date. Airborne remote
sensing has been used to detect thermal plumes from
power stations for many years (e.g. Scarpace and Green 1973) and this is also an application
which offers considerable potential for UAS surveying. Another potential application of UASs is
the use of thermal imagery for heat-loss surveys of buildings. Martinez-de Dios and
Ollero (2006) were able to detect areas of significant heat loss during a test overflight of an
office building using a small helicopter UAS. This is an area with considerable potential for the
future, although safety and privacy concerns may need to be addressed before such applications
can become mainstream.
4.5. SAR and LiDAR
Synthetic Aperture Radar (SAR) is a mature technology for manned aircraft and satellite
platforms. In 2004 NASA’s Jet Propulsion Laboratory began developing an L-band polarimetric
SAR payload called UAVSAR, specifically designed to collect repeat-track SAR data for
differential interferometric measurements (Madsen et al. 2005). Since 2009 UAVSAR has been
acquiring data for a broad range of science applications, including more than 160 flights across
the globe. For the majority of applications the UAVSAR payload, or pod, has been flown on a
Gulfstream G-III aircraft, which is not a fully autonomous UAS; however, a recent campaign
over the Canadian Arctic involved the UAVSAR pod attached to NASA’s high-altitude, long-
endurance (HALE) Global Hawk UAS.
SAR systems suitable for small UASs are generally still in the development phase, and
we are unaware of any published case studies on specific applications. To be suitable for
incorporation into a small UAS, SAR systems must be lightweight and have low power
consumption. Two proof-of-concept technology examples of miniature SARs for UASs include
the Ka-band MISAR described by Edrich (2004) and the C-band, single VV-polarization SAR
developed by Koo et al. (2012). This remains an area of active research with considerable
potential for the future.
LiDAR measurements from small UASs are also still in a proof-of-concept phase, but are
further along than SAR in terms of case studies and miniaturisation. One of the main challenges
is that the accuracy of LiDAR data is highly dependent on positional information from the
aircraft GNSS and IMU sensors. It follows that if the aircraft position and attitude are not known
to a high level of accuracy, the corresponding accuracy of any measurements made by a LiDAR
sensor will be affected. This is a limitation for most small UASs, which are typically equipped
with navigation-grade GNSS and IMU sensors (Turner et al. 2013). Thus, to obtain high absolute
accuracies, these platforms need to be equipped with high-end carrier-phase GPS units (e.g.
Turner et al. 2013). Nevertheless, initial proof-of-concept case studies demonstrate the potential
for acquiring LiDAR measurements from small UASs.
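The dependence of LiDAR accuracy on the navigation sensors can be illustrated with a back-of-envelope error budget (our simplification, not a model from the cited studies): an attitude error at slant range R displaces the ground point by roughly R tan(error), which combines with the GNSS positioning error, here in quadrature on the assumption that the two are independent.

```python
import math

def lidar_ground_error(slant_range_m, attitude_err_deg, gnss_err_m):
    """Rough horizontal error of a LiDAR ground point: the pointing error
    (range * tan(attitude error)) combined in quadrature with the GNSS
    positioning error, assuming the two are independent."""
    pointing_err_m = slant_range_m * math.tan(math.radians(attitude_err_deg))
    return math.hypot(pointing_err_m, gnss_err_m)

# Hypothetical sensor grades at 40 m slant range:
nav_grade = lidar_ground_error(40.0, attitude_err_deg=1.0, gnss_err_m=2.0)
survey_grade = lidar_ground_error(40.0, attitude_err_deg=0.02, gnss_err_m=0.02)
```

In this sketch the navigation-grade configuration yields metre-level errors even at a 40 m range, while the survey-grade configuration with carrier-phase positioning stays at the centimetre level, which is why high-end positioning hardware matters for UAS LiDAR.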
Lin et al. (2011) developed a lightweight UAS-mounted LiDAR system and assessed its
utility for ground height determination and tree height estimation. Because flying heights
ranged between 10 m and 40 m above the ground, an extremely high density of points was
generated for the area of interest, allowing for a high percentage of ground returns through the
forest canopy, as well as accurate estimation of tree heights and discrimination of utility poles.
Similarly, Wallace et al. (2012) describe another UAS-LiDAR system and its use for the
measurement of tree structure. Both studies used VTOL UAS platforms, owing to their ability to
incorporate and support the LiDAR payload. While results from both these studies were
encouraging, both systems were restricted to low flying heights and could cover only small areas
during the course of a single flight. With continued development of sensor technology, it is likely
that more powerful UAS-mounted LiDAR systems will become available in the near future.
Such systems will allow LiDAR surveys to be carried out from greater heights, making it
possible to cover larger areas per flight. The accuracy achievable using such systems will,
however, depend on the accuracy of the on-board navigation sensors.
An alternative application of UAS-mounted LiDAR was demonstrated by Chisholm et al.
(2013). In this case, a LiDAR system was developed for below-canopy use on a quadcopter. This
system was developed for areas where GPS signals are poor or completely absent, and obtains
measurements only in the horizontal plane. Using this approach, it was found that 70% of trees
with a diameter of greater than 20 cm could be detected for a patch of trees 20 m by 20 m in size.
This system is still in the proof-of-concept phase, and the authors point out that some form of
(non GPS) location device will be necessary for this application to become practical.
5. Challenges and research needs
5.1. Camera shortcomings
While the applications of small UASs have seen tremendous growth in recent years (Figure 1),
there remain a number of outstanding issues that need to be addressed in order for their full
potential to be realised for environmental measurements and monitoring. Chief among these are
the radiometric and geometric limitations imposed by the current generation of lightweight,
consumer-grade digital cameras (cf. Hardin and Jensen 2011). These are designed for the general
market and are not optimised for photogrammetric or remote sensing applications. Higher-end
instruments tend to be too bulky to be used with current lightweight UASs, and for those that do
exist, there is still a question of calibration commensurability with conventional sensors (cf.
Hruska et al. 2012).
Spectral limitations include the fact that spectral response curves from consumer-grade
cameras are usually poorly calibrated (Hakala et al. 2010; Rump et al. 2011), making it difficult
to convert brightness values to radiance (Lebourgeois et al. 2008), which is important for
comparative studies. However, even sensors designed specifically for UASs may not meet the
necessary scientific benchmarks (e.g. Hruska et al. 2012). With consumer cameras, the detectors
may also become saturated when there are high contrasts (Figure 6), such as when an image
covers both a dark forest canopy and a snow-covered field. Wavebands tend to be broad, with
considerable overlap between all bands in the visible (400 nm – 700 nm) portion of the spectrum
(Lebourgeois et al. 2008; Rump et al. 2011), making it difficult to obtain useful spectral
signatures from different cover types. The lack of a near infrared band is also a serious drawback
for vegetation surveys, although some progress has been achieved with the six-band
multispectral Tetracam mini-MCA-6 (e.g. Turner et al. 2011; Calderon et al. 2013; Garcia-Ruiz
et al. 2013; Torres-Sanchez et al. 2013). The spectral characteristics of most cameras therefore
tend to limit their applicability towards vegetation analysis, and the imagery they produce is
seldom suitable for doing much more than distinguishing vegetated from non-vegetated areas.
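The importance of a near-infrared band can be seen in the standard NDVI calculation (a generic formula, not tied to any particular sensor discussed above): vegetation is separable because it absorbs red light while reflecting strongly in the NIR, a contrast a visible-only camera cannot capture.

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalised Difference Vegetation Index: (NIR - red) / (NIR + red).
    Requires a near-infrared band, which standard consumer cameras lack."""
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectances: healthy vegetation reflects strongly in the
# NIR and absorbs red light; bare soil reflects both moderately.
red = np.array([0.05, 0.30])   # [vegetation, soil]
nir = np.array([0.50, 0.35])
values = ndvi(nir, red)        # high for vegetation, near zero for soil
```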
Another drawback is that many consumer cameras are prone to vignetting (Figure 6),
where the edges of images appear darker than the centres (Lebourgeois et al. 2008; Hakala et al.
2010; Kelcey and Lucieer 2012). This effect occurs because rays of light at the edges of the
image have to pass through a greater optical thickness of the camera lens, and are thus more
strongly attenuated than light rays in the centre of the image. Lebourgeois et al. (2008) noted that
for modified cameras, vignetting tends to be more pronounced in the near infrared band. Custom-
built multispectral and hyperspectral imaging sensors are less likely to be significantly affected
by this problem, although some degree of vignetting will always be present. Hakala et al. (2010)
estimated vignetting accounted for a 25% variation in brightness between the centre and corners
of images acquired using a consumer camera. However, if the effects can be quantified, then they
can be removed using a flat-field correction (Hakala et al. 2010). The low cost lenses used by
many consumer-grade cameras can also cause different wavelengths of light to be refracted
differently (Figure 6). Known as chromatic aberration, this effect can cause separation of colours
at the edges of images, presenting further difficulties for spectral analysis (e.g. Eisenbeiss 2006;
Van Achteren et al. 2007; Seidl et al. 2011).
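The flat-field correction mentioned above can be sketched as follows: an image of a uniformly illuminated target records the lens fall-off alone, and dividing subsequent images by this normalised flat field removes it. The numbers below are hypothetical, chosen to mimic the roughly 25% corner fall-off reported by Hakala et al. (2010).

```python
import numpy as np

def flat_field_correct(image, flat_field):
    """Remove vignetting using a flat-field reference image.

    flat_field: an image of a uniformly illuminated target, in which any
    brightness fall-off toward the corners is due to the lens alone.
    Dividing by the flat field, normalised so its brightest region is 1.0,
    restores the corners to their true relative brightness.
    """
    gain = flat_field / flat_field.max()
    return image / gain

# Hypothetical lens with 25% fall-off at the edges:
flat = np.array([[0.75, 1.00, 0.75]])
img = np.array([[75.0, 100.0, 75.0]])  # a uniform scene, as recorded
corrected = flat_field_correct(img, flat)
```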
While it is possible to compensate for many of these effects, and to create a mosaic that
appears seamless, ad-hoc colour balancing is likely to impact the performance of automated
image classification algorithms. These limitations necessarily degrade the quality of the spectral
information that can be recovered from a typical UAS survey. For multispectral and
hyperspectral imagery, the lens characteristics of the sensor are normally well established, and
this information can be used to compensate for the effects of vignetting. The use of vegetation
indices, which are typically based on the ratios of different wavebands, can also help to reduce
this effect. Nonetheless, as a general rule, imagery acquired from consumer-grade cameras
cannot be considered suitable for quantitative spectral analysis (Lebourgeois et al. 2008).
The geometry of consumer-grade cameras also presents challenges (Haala et al. 2011;
Hardin and Jensen 2011). Even under ideal conditions it can be difficult to obtain reliable
calibrations for such cameras. This is especially true for retractable lens cameras, where the focal
length may be affected by extraneous factors, such as particles of dust in the mechanism. The
small size of the image sensor and short focal length of such cameras mean that the effects of
microscopic errors or misalignments are greatly magnified, compared to similar errors in
conventional full-frame cameras. The low costs of such cameras also mean that traditional, high-
end calibration using a precision test field is seldom carried out. Instead, calibration is typically
done using an inexpensive and quick flat target algorithm, which in many cases can yield
inconsistent results. Consumer cameras rarely remain stable for any length of time (Habib and
Morgan 2005; Hardin and Jensen 2011), and therefore frequent recalibration may be required.
While such cameras can give acceptable results for many applications, users need to be aware of
the above constraints when assessing whether a camera can meet the accuracy requirements of a
project. However, it is worth noting that in many cases these limitations are more than
compensated for by the low flying heights from which UAS surveys are typically conducted.
While the issues relating to camera geometry and image quality can never be entirely
eliminated, there are a number of steps which can be taken to improve the quality of the final
product. Consumer-grade cameras are often used for UAS surveying because of their light
weight. A good alternative is to use a micro four thirds camera (e.g. Hugenholtz et
al. 2013; Whitehead et al. 2013). This class of camera has similar characteristics to a full single-
lens reflex camera, but in a much smaller body. Instead of a retractable lens, micro four
thirds cameras use fixed interchangeable lenses, allowing for much-improved calibrations
and image quality. Other ways to improve image quality include removing photos which are
blurred, under- or over-exposed, or saturated. This is a simple step which can make a
big difference in the processing stage. Future research into improving image quality is likely to
involve the use of higher-resolution fixed-lens cameras, as well as finding ways to improve
camera stabilisation during flight, possibly through the use of miniaturised gyro-stabilised
gimbal systems.
5.2. Image classification
UASs can be used to gather images at a considerably higher spatial resolution than has hitherto
been achieved, often to centimetre level (e.g. d'Oleire-Oltmanns et al. 2012; Harwin and Lucieer
2012; Turner et al. 2013). While this resolution offers a number of advantages, the amount of
detail presents new challenges from the point of view of image classification. The brightness of
an individual pixel represents an aggregate of the reflected solar radiation from the different
cover types making up that pixel. Traditionally in remote sensing, the low resolutions of satellite
imagery and high-altitude aerial imagery have tended to result in comparatively homogeneous
clusters of pixels, which are well-suited to pixel-based analysis techniques. However, at
resolutions of only a few centimetres, the individual component parts of plants and trees often
become apparent, with separate pixels often representing leaves, branches, and underlying
ground cover. Because of the high contrast differences between these features, mixed pixels,
comprising various combinations of the above components, will also tend to show greater
variation than would be apparent for lower-resolution imagery. In such circumstances, pixel-
based image classification algorithms are unlikely to give good results.
An alternative workflow is to use an object-based analysis strategy (e.g. Rango et al.
2009; Laliberte et al. 2010; Laliberte and Rango 2011). Such a strategy amalgamates groups of
pixels into discrete objects, based on spectral, textural, and shape characteristics. These objects
are then classified based on their inherent properties. If the objects thus defined represent
meaningful units, such as individual trees, then such a strategy can potentially deliver much
improved classification results. Variations in brightness across image mosaics can also present
problems for a purely spectral analysis. By integrating structural and textural parameters into the
analysis, object-based analysis is likely to considerably improve classification accuracies.
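As a toy illustration of the object-based strategy (a deliberately minimal sketch; operational workflows such as those cited above also use textural and shape criteria, typically in dedicated segmentation software), adjacent above-threshold pixels can be amalgamated into objects and each object classified on its mean value rather than pixel by pixel:

```python
import numpy as np
from collections import deque

def label_objects(mask):
    """Amalgamate 4-connected True pixels into discrete objects,
    returning an integer label image and the object count."""
    labels = np.zeros(mask.shape, dtype=int)
    rows, cols = mask.shape
    current = 0
    for i in range(rows):
        for j in range(cols):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                labels[i, j] = current
                queue = deque([(i, j)])
                while queue:                      # flood fill one object
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and mask[ny, nx] and labels[ny, nx] == 0):
                            labels[ny, nx] = current
                            queue.append((ny, nx))
    return labels, current

def classify_objects(index_img, labels, n_objects, tree_threshold=0.6):
    """Assign each object a class from its mean index value, so that
    bright leaves and dark branches within one crown stay together."""
    classes = {}
    for k in range(1, n_objects + 1):
        mean_val = index_img[labels == k].mean()
        classes[k] = "tree" if mean_val >= tree_threshold else "shrub"
    return classes

# Hypothetical vegetation-index image: a tree crown (left) and a low
# shrub patch (right) separated by bare ground (zeros).
ndvi = np.array([[0.8, 0.7, 0.0, 0.2],
                 [0.9, 0.6, 0.0, 0.3],
                 [0.0, 0.0, 0.0, 0.2]])
labels, n = label_objects(ndvi > 0.1)
classes = classify_objects(ndvi, labels, n)
```

A pixel-based classifier would split the crown into bright and dark classes; averaging over the object recovers the meaningful unit.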
5.3. Illumination issues
One factor that has thus far received little attention in the published literature is the effect of
variable illumination on the photogrammetric processing of UAS imagery (cf. Figure 3, Figure
6). Differences between sunlit and shaded areas can be significant on a bright sunny day,
especially where there are cumulus clouds overhead, which give sharp well-defined shadows.
From our combined experience, such conditions can pose significant challenges for the
automated image matching algorithms used in both triangulation and DEM generation. Where
clouds are moving rapidly, shaded areas can vary significantly between images obtained during
the same flight, potentially causing the AT process to fail for some images, and also resulting in
errors in automatically generated DEMs. Moreover, patterns of light and shade across images
can confuse automated colour balancing algorithms used in the creation of image mosaics. This
can result in output mosaics of poor visual quality, which have obvious variations in contrast or
colour (e.g. Figure 6), and which may be excessively dark or light in places.
The best recommendation is to avoid flying under such conditions. However, there may
be no other time available to carry out the survey. Structure-from-motion software is
generally more robust when it comes to coping with image variations (Turner et al. 2012;
Westoby et al. 2012), and it may be possible to carry out a successful triangulation using an SfM
package, when conventional photogrammetric software fails. One possibility is to generate band
ratio images from the original images and use these for triangulation. Ratioing will normally
reduce shadowing considerably. However, there is no published literature on the use of band ratio
images in photogrammetry, suggesting that the potential of this method remains untested.
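The band-ratio suggestion can be sketched as follows. In the idealised case where a cast shadow scales all bands by the same factor (real shadows are not perfectly spectrally neutral, so the cancellation is only approximate in practice), the ratio of two bands is unchanged between sunlit and shaded pixels:

```python
import numpy as np

def band_ratio(band_a, band_b, eps=1e-6):
    """Per-pixel ratio of two bands; illumination that scales both
    bands equally cancels out of the ratio."""
    return band_a / (band_b + eps)

# Idealised surface: the same material, with the second pixel in shadow
# (all bands scaled by 0.4).  Hypothetical reflectance values.
red = np.array([0.30, 0.30 * 0.4])
nir = np.array([0.60, 0.60 * 0.4])
ratio = band_ratio(nir, red)  # near-identical for sunlit and shaded pixels
```

The raw bands differ by a factor of 2.5 between the sunlit and shaded pixels, yet the ratio image is almost uniform, which is why ratio images could in principle be easier to match across shadow boundaries.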
Another commonly seen illumination effect is the presence of image hotspots, where a
bright spot appears in the image (Figure 6). These are due to the effects of bidirectional
reflectance, which is dependent on the relative position of the image sensor and the sun (Hakala
et al. 2010; Grenzdorffer and Niemeyer 2011; Laliberte et al. 2011). Hotspots occur at the
antisolar point, which is the point where the line defined by the sensor position and the sun
intersects with the ground.
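The geometry just described fixes where the hotspot will appear: at a horizontal distance of h / tan(sun elevation) from the nadir point, on the side opposite the sun (along the aircraft's shadow). A small sketch, in which the coordinate conventions are our own assumption (x east, y north, azimuth clockwise from north):

```python
import math

def antisolar_point(nadir_x, nadir_y, flying_height,
                    sun_azimuth_deg, sun_elevation_deg):
    """Ground coordinates of the antisolar point (the image hotspot).

    The hotspot lies where the sun-to-sensor line meets the ground: at a
    horizontal distance h / tan(elevation) from the nadir point, in the
    direction opposite the solar azimuth.
    """
    dist = flying_height / math.tan(math.radians(sun_elevation_deg))
    az = math.radians(sun_azimuth_deg + 180.0)
    return (nadir_x + dist * math.sin(az), nadir_y + dist * math.cos(az))

# Sun due south (azimuth 180 deg) at 45 deg elevation, flying at 100 m:
# the hotspot falls 100 m due north of the nadir point.
x, y = antisolar_point(0.0, 0.0, 100.0, 180.0, 45.0)
```

Knowing this location in advance allows flight lines to be planned so that hotspots fall in the image overlap and can be cropped out.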
5.4. Relief displacement
Because of the low flying heights used, UAS imagery is particularly prone to the effects of relief
displacement (Eisenbeiss 2009; Mozas-Calvache et al. 2012; Niethammer et al. 2012). For non-
vegetated areas, such displacement is removed during the orthorectification process, assuming
that the DSM or DTM used correctly represents the terrain. The situation is more complicated
when dealing with trees and buildings (e.g. Mozas-Calvache et al. 2012). In such cases, local
displacement is often considerable (Figure 6), and there can often be hidden areas where no data
has been captured. If a DSM is used to orthorectify such images, the result can often be a choppy-
looking, irregular image, due to the noise present in the surface (Figure 6). Using a DTM will
typically result in a smoother-looking image; however, locally-elevated features will often still be
subject to the effects of relief displacement. As such it is often difficult to produce a true
orthoimage, which accurately represents all features.
There are a number of work-arounds to this problem. These include obtaining high
overlaps and only using the centres of each image, flying higher, and using a longer focal length.
All of these options will help to reduce, but not eliminate the effects of relief displacement.
Problems with relief displacement will often surface at the mosaicking stage, where images with
different amounts and directions of relief displacement are combined (Figure 6). This can result
in features being displaced in the final image, with linear features in rapidly changing areas, such
as in the case of a road surrounded by high trees, showing sudden jumps in horizontal position.
Certain mosaicking packages may also produce effects such as blurring and ghosting (Figure 6),
where the same feature appears multiple times in the final mosaic in slightly different positions.
Some of these issues can be avoided by manual selection of seam lines, but this can often add
considerably to the time required for processing.
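These work-arounds all follow from the standard photogrammetric relief-displacement relation: a feature of height h at radial ground distance R from the nadir point appears displaced outward by R·h/(H − h) at flying height H (approximately R·h/H when h is much smaller than H), so increasing H or keeping R small by using only the image centres both reduce the effect. A numerical sketch with hypothetical values:

```python
def relief_displacement(radial_dist_m, feature_height_m, flying_height_m):
    """Horizontal displacement (m) of a feature's top relative to its base
    in a vertical image.

    A feature of height h at radial ground distance R from the nadir point
    is imaged along a ray that meets the ground at R * H / (H - h), so the
    top appears displaced outward by R * h / (H - h), which is close to
    R * h / H when h << H.
    """
    return radial_dist_m * feature_height_m / (flying_height_m - feature_height_m)

# A 10 m tree standing 50 m from the nadir point:
low = relief_displacement(50.0, 10.0, 120.0)    # low-flying UAS
high = relief_displacement(50.0, 10.0, 1200.0)  # conventional survey height
```

At the low flying height the tree top is displaced by several metres, an order of magnitude more than at conventional survey height, which is why UAS imagery is so prone to this effect.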
5.5. Mosaic artifacts
The production of mosaics can introduce further problems. In addition to those resulting
from vignetting, relief displacement, misregistration, and ghosting, image artifacts are
often created where the colour balancing algorithms fail to work properly. These can occur
where the contrasts of individual image bands fall outside the range of the image histograms
used for image matching. Another common occurrence is striping on the final mosaic. This will
often occur where there is insufficient overlap between flight lines to allow colour matching to
be carried out successfully.
Geometric artifacts may also occur where poorly orthorectified adjacent images are used
as inputs to the final mosaic. This can result in mismatching of features across the mosaic, as
well as holes in the final mosaic (Figure 6). In general such features are indicative of problems at
the orthorectification stage.
6. Conclusions and Outlook
In this review we have described the current state of remote sensing with UASs, specifically
focusing on image-based environmental measurements and monitoring. In addition to reviewing
recent progress on various types of remote sensing systems for UASs, we have also highlighted
some of the major research challenges. The use of UASs for environmental monitoring is still in
its infancy. While the last few years have seen a tremendous leap forward in the availability and
sophistication of aerial platforms, the imaging and sensor technology has not kept pace with this
surge. The result is that most UASs still carry a basic compact camera as their primary payload.
The rapid growth of the UAS industry has also created a heterogeneous patchwork of UAS
regulations, with regulatory frameworks moving more slowly in some countries (e.g. USA) than
others (e.g. Canada).
Technological improvements will undoubtedly play an important part in the development
of commercial and civil applications. Improvements to platform stability, ease of operation,
and operating range are likely to expand the scope of UAS surveys beyond the limited extents
that can currently be covered in a single survey. The development of automated sense and avoid
systems will also help to mitigate safety concerns, and will most likely result in a less restrictive
regulatory environment, allowing UAS remote sensing surveys to be carried out over wider areas
and at greater operating altitudes.
For widespread adoption of advanced sensor payloads such as hyperspectral scanners,
LiDAR, and SAR, improvements need to be made to UAS navigation sensors. This process is
now starting to happen, although UASs using carrier-phase GPS are still rare. Such
sensors are also bulky at present, and inexpensive, miniaturised versions with lower power
requirements than those currently available will need to be developed. If these issues
can be successfully addressed, then the versatility of UASs as data collection platforms will be
considerably enhanced.
Until many of these issues can be addressed, it is likely that the primary use of UAS
surveys will continue to be for large-scale topographic surveys. The low cost and operational
flexibility offered by UAS platforms, along with the recent development of SfM-based
photogrammetric packages, provide unique advantages compared with traditional aerial photo
surveys. However, looking to the medium term it is likely that UASs will start to be used in ways
which cannot yet be conceived, as the technology of both the platforms and the sensors
undergoes a process of continual development.
Acknowledgements
The authors are grateful for the financial support received for the case studies. Funding was
provided by the Natural Sciences and Engineering Research Council of Canada, Cenovus
Energy, Alberta Innovates, the Canadian Foundation for Innovation, and the University of
Calgary. Jordan Walker (Isis Geomatics) is acknowledged for his role in providing assistance
with some of the image processing.
References
Ambrosia, V.G., Wegener, S.S., Sullivan, D.V., Buechel, S.W., Dunagan, S.E., Brass, J A., Stoneburner,
J. and Schoenung, S.M. 2003. Demonstrating UAV-acquired real-time thermal data over fires.
Photogramm. Eng. Remote Sens. 69(4): 391-402.
Arefi, H., d'Angelo, P., Mayer, H. and Reinartz, P. 2009. Automatic generation of digital terrain models
from CARTOSAT-1 stereo images. Proceedings of the International Archives of the
Photogrammetry, Remote Sensing and Spatial Information Sciences. Available online
at: www.isprs.org/proceedings/XXXVIII-1-4-7_W5/paper/Arefi-169.pdf.
Baluja, J., Diago, M.P., Balda, P., Zorer, R., Meggio, F., Morales, F. and Tardaguila, J. 2012. Assessment
of vineyard water status variability by thermal and multispectral imagery using an unmanned
aerial vehicle (UAV). Irrig. Sci. 30(6): 511-522.
Bellvert, J., Zarco-Tejada, P.J., Girona, J. and Fereres, E. 2013. Mapping crop water stress index in a
Pinot-noir vineyard: comparing ground measurements with thermal remote sensing imagery from
an unmanned aerial vehicle. Precision Agriculture. doi:10.1007/s11119-013-9334-5.
Berni, J., Zarco-Tejada, P.J., Suarez, L. and Fereres, E. 2009. Thermal and narrowband multispectral
remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Transactions on
Geoscience and Remote Sensing. 47(3): 722-738.
Bláha, M., Eisenbeiss, H., Grimm, D. and Limpach, P. 2011. Direct georeferencing of UAVs.
Proceedings of Conference on Unmanned Aerial Vehicle in Geomatics, Zurich, Switzerland, 14-
16 September 2011. 38(1/C22): 1-6.
Calderón, R., Navas-Cortés, J.A., Lucena, C. and Zarco-Tejada, P.J. 2013. High-resolution airborne
hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using
fluorescence, temperature and narrow-band spectral indices. Remote Sensing of Environment.
139: 231-245. doi: 10.1016/j.rse.2013.07.031.
Chisholm, R.A., Cui, J., Lum, S.K.Y. and Chen, B.M. 2013. UAV LiDAR for below-canopy forest
surveys. Journal of Unmanned Vehicle Systems. 01(01): 61-68.
d'Oleire-Oltmanns, S., Marzolff, I., Peter, K.D. and Ries, J.B. 2012. Unmanned Aerial Vehicle (UAV) for
monitoring soil erosion in Morocco. Remote Sens. 4(11): 3390-3416.
Duan, S.B., Li, Z.L., Tang, B.H., Wu, H., Ma, L., Zhao, E. and Li, C. 2013. Land surface reflectance
retrieval from hyperspectral data collected by an unmanned aerial vehicle over the Baotou test
site. PLoS One. 8(6): e66972.
Eisenbeiss, H. 2006. Applications of photogrammetric processing using an autonomous model helicopter.
International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences.
36(185): 51-56.
Eisenbeiss, H. 2009. UAV Photogrammetry. Eidgenössische Technische Hochschule, Zürich.
Fonstad, M.A., Dietrich, J.T., Courville, B.C., Jensen, J.L. and Carbonneau, P.E. 2013. Topographic
structure from motion: a new development in photogrammetric measurement. Earth Surf.
Processes Landforms. 38(4): 421–430.
Garcia-Ruiz, F., Sankaran, S., Maja, J.M., Lee, W.S., Rasmussen, J. and Ehsani, R. 2013. Comparison of
two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Computers
and Electronics in Agriculture. 91(0): 106-115.
Glennie, C.L., Carter, W.E., Shrestha, R.L. and Dietrich, W.E. 2013. Geodetic imaging with airborne
LiDAR: the Earth's surface revealed. Rep. Prog. Phys. 76(8): 086801.
Gonzalez-Dugo, V., Zarco-Tejada, P., Nicolás, E., Nortes, P.A., Alarcón, J.J., Intrigliolo, D.S. and
Fereres, E. 2013. Using high resolution UAV thermal imagery to assess the variability in the
water status of five fruit tree species within a commercial orchard. Precision Agriculture. 14(6):
660-678.
Grenzdörffer, G.J. and Niemeyer, F. 2011. UAV based BRDF-measurements of agricultural surfaces with
PFIFFikus. International Archives of the Photogrammetry, Remote Sensing and Spatial
Information Sciences. 38(1/C22): 229-234.
Haala, N., Cramer, M., Weimer, F. and Trittler, M. 2011. Performance test on UAV-based
photogrammetric data collection. Proceedings of the International Archives of the
Photogrammetry, Remote Sensing and Spatial Information Sciences. 38(1/C22): 7-12.
Habib, A. and Morgan, M. 2005. Stability analysis and geometric calibration of off-the-shelf digital
cameras. Photogramm. Eng. Remote Sens. 71(6): 733-741.
Hakala, T., Suomalainen, J. and Peltoniemi, J.I. 2010. Acquisition of Bidirectional Reflectance Factor
Dataset Using a Micro Unmanned Aerial Vehicle and a Consumer Camera. Remote Sens. 2(3):
819-832.
Hardin, P.J. and Jensen, R.R. 2011. Small-Scale Unmanned Aerial Vehicles in Environmental Remote
Sensing: Challenges and Opportunities. GIScience & Remote Sensing. 48(1): 99-111.
Harwin, S. and Lucieer, A. 2012. Assessing the accuracy of georeferenced point clouds produced via
multi-view stereopsis from unmanned aerial vehicle (UAV) imagery. Remote Sens. 4(6): 1573-
1599.
Hruska, R., Mitchell, J., Anderson, M. and Glenn, N.F. 2012. Radiometric and Geometric Analysis of
Hyperspectral Imagery Acquired from an Unmanned Aerial Vehicle. Remote Sens. 4(9): 2736-
2752.
Huesca, M., Merino-de-Miguel, S., González-Alonso, F., Martinez, S., Miguel Cuevas, J. and Calle, A.
2013. Using AHS hyper-spectral images to study forest vegetation recovery after a fire.
International Journal of Remote Sensing. 34(11): 4025-4048.
Hugenholtz, C.H., Moorman, B.J., Riddell, K. and Whitehead, K. 2012. Small unmanned aircraft systems
for remote sensing and earth science research. Eos, Transactions American Geophysical Union.
93(25): 236-236.
Hugenholtz, C.H., Whitehead, K., Barchyn, T.E., Brown, O.W., Moorman, B.J., LeClair, A., Hamilton, T.
and Riddell, K. 2013. Geomorphological mapping with a small unmanned aircraft system
(sUAS): feature detection and accuracy assessment of a photogrammetrically-derived digital
terrain model. Geomorphology. 194: 16-24.
Hunt, E.R., Hively, W.D., Fujikawa, S.J., Linden, D.S., Daughtry, C.S.T. and McCarty, G.W. 2012.
Acquisition of NIR-green-blue digital photographs from unmanned aircraft for crop monitoring.
Remote Sens. 2(1): 290-305.
Israel, M. 2011. A UAV-based roe deer fawn detection system. International Archives of the
Photogrammetry, Remote Sensing and Spatial Information Sciences. 38(1/C22): 51-55.
Jensen, J.R. 2000. Remote Sensing of the Environment: An Earth Resource Perspective. Prentice Hall.
Kaivosoja, J., Pesonen, L., Kleemola, J., Pölönen, I., Salo, H., Honkavaara, E., Saari, H., Mäkynen, J. and
Rajala, A. 2013. A case study of a precision fertilizer application task generation for wheat based
on classified hyperspectral data from UAV combined with farm history data. Proc. SPIE 8887.
doi: 10.1117/12.2029165.
Kelcey, J. and Lucieer, A. 2012. Sensor Correction of a 6-Band Multispectral imaging sensor for UAV
remote sensing. Remote Sens. 4(5): 1462-1493.
Knoth, C., Klein, B., Prinz, T. and Kleinebecker, T. 2013. Unmanned aerial vehicles as innovative remote
sensing platforms for high-resolution infrared imagery to support restoration monitoring in cut-
over bogs. Applied Vegetation Science. 16(3): 509-517.
Laliberte, A.S., Goforth, M.A., Steele, C.M. and Rango, A. 2011. Multispectral Remote Sensing from
Unmanned Aircraft: Image Processing Workflows and Applications for Rangeland Environments.
Remote Sens. 3(11): 2529-2551.
Laliberte, A.S., Herrick, J.E., Rango, A. and Winters, C. 2010. Acquisition, orthorectification, and object-
based classification of unmanned aerial vehicle (UAV) imagery for rangeland monitoring.
Photogramm. Eng. Remote Sens. 76(6): 661-672.
Laliberte, A.S. and Rango, A. 2011. Image processing and classification procedures for analysis of sub-
decimeter imagery acquired with an unmanned aircraft over arid rangelands. GIScience &
Remote Sensing. 48(1): 4-23.
Lebourgeois, V., Bégué, A., Labbé, S., Mallavan, B., Prévot, L. and Roux, B. 2008. Can commercial
digital cameras be used as multispectral sensors? A crop monitoring test. Sensors. 8(11): 7300-
7322.
Lin, Y., Hyyppä, J. and Jaakkola, A. 2011. Mini-UAV-borne LIDAR for fine-scale mapping. IEEE
Geoscience and Remote Sensing Letters. 8(3): 426-430.
Madsen, S.N., Hensley, S., Wheeler, K., Sadowy, G.A., Miller, T., Muellerschoen, R., Lou, Y. and
Rosen, P. A. 2005. UAV-based L-band SAR with precision flight path control. Fourth
International Asia-Pacific Environmental Remote Sensing Symposium 2004: Remote Sensing of
the Atmosphere, Ocean, Environment, and Space. International Society for Optics and Photonics.
doi:10.1117/12.578373.
Martinez-De Dios, J.R. and Ollero, A. 2006. Automatic Detection of Windows Thermal Heat Losses in
Buildings Using UAVs. Automation Congress, 2006. WAC '06. doi:
10.1109/WAC.2006.375998.
Mozas-Calvache, A.T., Pérez-García, J.L., Cardenal-Escarcena, F.J., Mata-Castro, E. and Delgado-
García, J. 2012. Method for photogrammetric surveying of archaeological sites with light aerial
platforms. J. Archaeol. Sci. 39(2): 521-530.
Nagai, M., Chen, T., Shibasaki, R., Kumagai, H. and Ahmed, A. 2009. UAV-borne 3-D mapping system
by multisensor integration. IEEE Transactions on Geoscience and Remote Sensing. 47(3): 701-
708.
Nex, F. and Remondino, F. 2013. UAV for 3D mapping applications: a review. Applied Geomatics. 6(1):
1-15.
Niethammer, U., James, M.R., Rothmund, S., Travelletti, J. and Joswig, M. 2012. UAV-based remote
sensing of the Super-Sauze landslide: Evaluation and results. Eng. Geol. 128(0): 2-11.
Poirier, N., Hautefeuille, F. and Calastrenc, C. 2013. Low Altitude Thermal Survey by Means of an
Automated Unmanned Aerial Vehicle for the Detection of Archaeological Buried Structures.
Archaeological Prospection. 20(4): 303-307.
Rango, A., Laliberte, A., Herrick, J.E., Winters, C., Havstad, K., Steele, C. and Browning, D. 2009.
Unmanned aerial vehicle-based remote sensing for rangeland assessment, monitoring, and
management. Journal of Applied Remote Sensing. 3(1). doi:10.1117/1.3216822.
Rango, A. and Laliberte, A.S. 2010. Impact of flight regulations on effective use of unmanned aircraft
systems for natural resources applications. Journal of Applied Remote Sensing. 4(1).
doi:10.1117/1.3474649.
Rump, M., Zinke, A. and Klein, R. 2011. Practical spectral characterization of trichromatic cameras.
ACM Transactions on Graphics (TOG) - Proceedings of ACM SIGGRAPH Asia 2011. doi:
10.1145/2070781.2024204.
Saari, H., Pellikka, I., Pesonen, L., Tuominen, S., Heikkilä, J., Holmlund, C., Mäkynen, J., Ojala, K. and
Antila, T. 2011. Unmanned Aerial Vehicle (UAV) operated spectral camera system for forest and
agriculture applications. Proc. SPIE 8174, Remote Sensing for Agriculture, Ecosystems, and
Hydrology. doi:10.1117/12.897585.
Scarpace, F. L. and Green, T. 1973. Dynamic surface temperature structure of thermal plumes. Water
Resour. Res. 9(1): 138-153.
Schlitz, M. 2004. A review of low-level aerial archaeology and its application in Australia. Australian
Archaeology. (59): 51-58.
Seidl, K., Richter, K., Knobbe, J. and Maas, H.G. 2011. Wide field-of-view all-reflective objectives
designed for multispectral image acquisition in photogrammetric applications. Proc SPIE 8172
Optical Systems Design. International Society for Optics and Photonics. doi:10.1117/12.896754.
Sullivan, D.G., Fulton, J.P., Shaw, J.N. and Bland, G. 2007. Evaluating the sensitivity of an unmanned
thermal infrared aerial system to detect water stress in a cotton canopy. Trans. Am. Soc. Agric.
Eng. 50(6): 1955-1962.
Torres-Sánchez, J., López-Granados, F., De Castro, A.I. and Peña-Barragán, J.M. 2013. Configuration
and specifications of an unmanned aerial vehicle (UAV) for early site specific weed management.
PLoS One. 8(3): e58210.
Transport Canada. 2008. The review and processing of an application for a Special Flight Operations
Certificate for the Operation of an Unmanned Air Vehicle (UAV) System. Available from
http://www.tc.gc.ca/eng/civilaviation/opssvs/managementservices-referencecentre-documents-
600-623-001-972.htm [Accessed 26 June 2014].
Turner, D., Lucieer, A. and Wallace, L. 2014. Direct Georeferencing of Ultrahigh-Resolution UAV
Imagery. IEEE Transactions on Geoscience and Remote Sensing. 52(5): 2738-2745. doi:
10.1109/TGRS.2013.2265295.
Turner, D., Lucieer, A. and Watson, C. 2011. Development of an Unmanned Aerial Vehicle (UAV) for
hyper resolution vineyard mapping based on visible, multispectral, and thermal imagery.
Proceedings of 34th International Symposium on Remote Sensing of Environment, Sydney,
Australia, 10–15 April 2011.
Turner, D., Lucieer, A. and Watson, C. 2012. An automated technique for generating georectified
mosaics from ultra-high resolution Unmanned Aerial Vehicle (UAV) imagery, based on Structure
from Motion (SfM) point clouds. Remote Sens. 4(5): 1392-1410.
Uto, K., Seki, H., Saito, G. and Kosugi, Y. 2013. Characterization of Rice Paddies by a UAV-Mounted
Miniature Hyperspectral Sensor System. IEEE Journal of Selected Topics in Applied Earth
Observations and Remote Sensing. 6(2): 851-860.
Van Achteren, T., Delauré, B., Everaerts, J., Beghuin, D. and Ligot, R. 2007. MEDUSA: an ultra-
lightweight multi-spectral camera for a HALE UAV. Proc. SPIE 6744, Sensors, Systems, and
Next-Generation Satellites. doi:10.1117/12.737718.
Vermeulen, C.D., Lejeune, P., Lisein, J., Sawadogo, P. and Bouché, P. 2013. Unmanned Aerial Survey of
Elephants. PLoS One. 8(2): e54700.
Vierling, L.A., Fersdahl, M., Chen, X., Li, Z. and Zimmerman, P. 2006. The Short Wave Aerostat-
Mounted Imager (SWAMI): A novel platform for acquiring remotely sensed data from a tethered
balloon. Remote Sensing of Environment. 103(3): 255-264.
Wallace, L., Lucieer, A., Watson, C. and Turner, D. 2012. Development of a UAV-LiDAR system with
application to forest inventory. Remote Sens. 4(6):1519-1543.
Watts, A.C., Ambrosia, V.G. and Hinkley, E.A. 2012. Unmanned aircraft systems in remote sensing and
scientific research: Classification and considerations of use. Remote Sens. 4(6): 1671-1692.
Westoby, M.J., Brasington, J., Glasser, N.F., Hambrey, M.J. and Reynolds, J.M. 2012. Structure-from-
Motion photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology.
179: 300-314. doi: 10.1016/j.geomorph.2012.08.021.
Whitehead, K., Moorman, B.J. and Hugenholtz, C.H. 2013. Low-cost, on-demand aerial photogrammetry
for glaciological measurement. The Cryosphere. 7(6): 1879-1884.
Wu, J., Dong, Z., Liu, Z. and Zhou, G. 2007. Geo-registration and mosaic of UAV video for quick-
response to forest fire disaster. Proc. SPIE 6788, MIPPR 2007: Pattern Recognition and
Computer Vision. doi:10.1117/12.748824.
Wundram, D. and Löffler, J. 2007. High resolution spatial analysis of mountain landscapes using a low
altitude remote sensing approach. International Journal of Remote Sensing. 29(4): 961-974.
Zarco-Tejada, P.J., González-Dugo, V. and Berni, J. A. J. 2012. Fluorescence, temperature and narrow-
band indices acquired from a UAV platform for water stress detection using a micro-
hyperspectral imager and a thermal camera. Remote Sensing of Environment. 117: 322-337.
Zarco-Tejada, P.J., Guillén-Climent, M.L., Hernández-Clemente, R., Catalina, A., González, M.R. and
Martín, P. 2013. Estimating leaf carotenoid content in vineyards using high resolution
hyperspectral imagery acquired from an unmanned aerial vehicle (UAV). Agricultural and Forest
Meteorology. 171: 281-294.
Zhang, Y., Xiong, J. and Hao, L. 2011. Photogrammetric processing of low-altitude images acquired by
unpiloted aerial vehicles. The Photogrammetric Record. 26(134): 190-211.
Figure 1: Frequency of scholarly journal articles published before 31/01/2014. This list was compiled from
keyword searches in Scopus and Web of Science and does not include book chapters or conference papers.
Figure 2: Flight planning example: (a) image waypoints and flight lines, and (b) image footprints with
overlap. The home point denotes the location of the GCS, which also serves as the takeoff and landing point.
Figure 3: Orthomosaic of an aggregate quarry showing locations of GCPs. Instead of using targets, these GCPs
were marked with biodegradable spray paint (inset image).
Figure 4: (a) False-color infrared image (NIR, red, green) and (b) corresponding NDVI map of an oil well site
undergoing reclamation. In (b) the leafy vegetation is shown in green.
Figure 5: A mosaic of thermal imagery acquired by a FLIR camera onboard a quadcopter UAV. The imagery was
used to enhance detection of the waterline along the river channel.
Figure 6: Common image artifacts and distortions in UAV remote sensing: (a) saturated image; (b) vignetting; (c)
chromatic aberration; (d) mosaic blurring in overlap area; (e) incorrect colour balancing; (f) hotspots on mosaic
due to bidirectional reflectance effects; (g) relief displacement (tree lean) effects in final image mosaic; (h)
image distortion due to DSM errors; (i) mosaic gaps caused by incorrect orthorectification or missing images.