Chapter 22: Uncrewed Aircraft Systems (UAS) for Predicting, Visualizing, and Modelling Environmental Change
Figure 22.1
Portion of an aeronautical sectional chart for Phoenix, Arizona, USA. The chart indicates physical landforms, obstacles and key landmarks as well as areas where the airspace is restricted.

Figure 22.2
Schematic showing how the number and spacing of flight lines affects the amount of side overlap (sidelap) between images. The image on the right has the flight lines positioned closer together, which creates greater sidelap. The pink rectangles are the image swaths; darker areas represent overlap between consecutive images (left) and adjacent flight lines (right).
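
A minimal sketch of the geometry behind this relationship, assuming a simple pinhole camera model; the sensor, lens, and flight-line values below are illustrative assumptions, not parameters from the chapter.

```python
# Sketch (assumed parameter values) of how flight-line spacing
# controls side overlap between adjacent image swaths.

def sidelap_percent(swath_width_m: float, line_spacing_m: float) -> float:
    """Side overlap as a percentage of the image swath width."""
    return max(0.0, (1.0 - line_spacing_m / swath_width_m) * 100.0)

# Ground swath (footprint) width from pinhole geometry:
# width = sensor_width * altitude / focal_length (consistent units).
sensor_width_mm = 13.2   # hypothetical 1-inch sensor
focal_length_mm = 8.8    # hypothetical lens
altitude_m = 100.0
swath_m = sensor_width_mm * altitude_m / focal_length_mm  # ~150 m

for spacing in (105.0, 60.0):  # widely spaced vs. closely spaced flight lines
    print(f"{spacing:5.1f} m spacing -> {sidelap_percent(swath_m, spacing):4.1f}% sidelap")
```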

Figure 22.3
A true-colour composite image (left) created from an RGB image showing a crop field. A false-colour composite image (right) displaying the near-infrared band in red, the red band in green, and the green band in blue. Certain details of the crops are more easily visualized when the NIR band is included, such as the areas of greater vigour shown in red in the image on the right.
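
A minimal sketch of how such a false-colour composite can be assembled, assuming the three bands are already co-registered NumPy arrays scaled to 0–1 reflectance; the array shapes and example values are illustrative.

```python
import numpy as np

def false_colour_composite(nir: np.ndarray, red: np.ndarray, green: np.ndarray) -> np.ndarray:
    """Map NIR to the red channel, red to green, and green to blue."""
    composite = np.dstack([nir, red, green])  # shape (rows, cols, 3)
    return np.clip(composite, 0.0, 1.0)       # keep values in a displayable range

# Synthetic 2x2 bands: healthy vegetation (high NIR reflectance) appears red.
nir   = np.array([[0.8, 0.2], [0.7, 0.1]])
red   = np.array([[0.1, 0.3], [0.1, 0.4]])
green = np.array([[0.2, 0.3], [0.2, 0.3]])
rgb = false_colour_composite(nir, red, green)
```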

Figure 22.4
Example of (a) a dense point cloud derived from overlapping UAS images and (b) a polygon mesh derived from the point cloud.

Figure 22.5
A digital surface model (DSM) derived from UAS imagery for a portion of the University of Georgia campus that includes the Old Athens Cemetery. Buildings, trees and other surface features are coloured according to their elevation above ground. The DSM was derived from data available in Bernardes and Madden (2021).
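
A minimal sketch of how elevation above ground can be obtained from a DSM, assuming a co-registered bare-earth terrain model (DTM) is also available as a NumPy array in metres; the values below are synthetic.

```python
import numpy as np

def height_above_ground(dsm: np.ndarray, dtm: np.ndarray) -> np.ndarray:
    """Normalised DSM: surface elevation minus ground elevation."""
    ndsm = dsm - dtm
    return np.clip(ndsm, 0.0, None)  # treat small negative differences as ground

# Synthetic example: a 5 m tree and a 12 m building on terrain at 200 m elevation.
dtm = np.full((2, 2), 200.0)
dsm = np.array([[205.0, 200.0],
                [212.0, 200.2]])
print(height_above_ground(dsm, dtm))
```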

Figure 22.6
Examples of (a) an RGB image of a natural area, (b) the same image spectrally enhanced to highlight vegetation (blue areas) using the Visible Atmospherically Resistant Index (VARI), (c) object-based image segments created for a portion of the image, and (d) the classified image based on the segmented objects.
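
A minimal sketch of the VARI calculation, VARI = (G − R) / (G + R − B), assuming the red, green and blue bands are NumPy arrays scaled to 0–1 reflectance; the example pixel values are synthetic.

```python
import numpy as np

def vari(red: np.ndarray, green: np.ndarray, blue: np.ndarray) -> np.ndarray:
    """Visible Atmospherically Resistant Index, guarded against division by zero."""
    denom = green + red - blue
    safe_denom = np.where(np.abs(denom) < 1e-6, 1e-6, denom)
    return (green - red) / safe_denom

# Synthetic pixels: a green-dominant (vegetated) pixel gives a positive VARI,
# a red-dominant (bare soil) pixel gives a negative VARI.
red   = np.array([0.15, 0.40])
green = np.array([0.35, 0.30])
blue  = np.array([0.10, 0.25])
print(vari(red, green, blue))
```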
