Summer 2019: The inaugural flights

The flagship project for GMAP is the Regional Geospatial Modeling project, a cooperative effort across the Gulf states with a number of universities and institutions. We began collecting data for this project in 2019, and our first flight was in Cedar Key, Florida, to support an oyster habitat restoration project by the Wildlife Ecology & Conservation program at the University of Florida.

As sea levels rise, do the elevations of oyster reefs keep pace? Healthy reefs are expected to accrete and stay within a healthy intertidal range, while less healthy reefs, more vulnerable to erosion, should subside. Oysters need to spend just the right amount of time above and below water to remain healthy, and minor changes in sea level or reef elevation can make a big difference.

Before GMAP joined the effort, UF WEC was collecting elevation data with RTK GNSS surveys. We hoped to demonstrate the usefulness of UAS lidar for monitoring elevation changes over time while covering more area, giving a finer-scale picture of how the reefs are changing.

As I mentioned before, this was our first flight. We were delayed in receiving our UAS mapping system, and once we had the system in hand, we had to wait a number of weeks before we could receive training on the system and the software. We decided we couldn’t wait that long before collecting our first dataset, so we took a ride to Deer Island to make our inaugural flight.

Sunset over Deer Island near Cedar Key, Florida, July 27, 2019.

Take a closer look at the cover image. If you are familiar with Phoenix LiDAR Systems’ SCOUT systems, you will notice that we put the payload on backwards. That would explain why our data came out so badly when we first tried to process it. But we made the needed adjustments to the lever arm and rotation matrix (both of which are defined in the aircraft’s body frame, so everything needed to be rotated around the airframe’s Z-axis) and we were able to salvage the flight. We weren’t sure what other mistakes we might be making, so to play it safe, we decided not to attempt more flights until after our training in August.
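For the curious, here is a minimal sketch of the kind of correction involved. The numbers and variable names are invented for illustration; this is not our actual calibration or Phoenix LiDAR Systems’ software. A payload mounted backwards amounts to a 180-degree rotation about the body-frame Z-axis, so the lever arm and the rotation matrix both receive the same correction:

```python
import numpy as np

# Hypothetical values for illustration only, not our actual calibration.
lever_arm_body = np.array([0.10, 0.02, -0.15])   # sensor offset in the body frame (m)
boresight_body = np.eye(3)                        # sensor-to-body rotation matrix

# A backwards payload is a 180-degree rotation about the body-frame Z-axis,
# so both quantities are corrected with the same rotation.
theta = np.pi
Rz = np.array([
    [np.cos(theta), -np.sin(theta), 0.0],
    [np.sin(theta),  np.cos(theta), 0.0],
    [0.0,            0.0,           1.0],
])

lever_arm_corrected = Rz @ lever_arm_body
boresight_corrected = Rz @ boresight_body

print(lever_arm_corrected)   # X and Y components flip sign; Z is unchanged
```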

After receiving thorough instructions on how to properly attach the payload to the bottom of the drone, we set out for Cedar Key once more to map oyster habitat at Little Trout Creek and Big Trout Creek, and to revisit the Deer Island site to finish what we started in July.

Geomatics student and GMAP pilot Connor Bass breaks down the GNSS static base receiver as the sun sets on a low tide data collection at Little Trout Creek near Cedar Key, Florida.

From 2.5D to (roughly) 2.9D

Until just a few years ago, aerial lidar was collected from relatively high altitudes using conventional aircraft. The flying-height-to-object-height ratio was quite large, and the resulting data looked less three-dimensional than, as many in this field call it, 2.5-dimensional. This term is equal parts irreverent and accurate: aerial lidar did not capture well the sides of buildings, the undersides of bridges or tree canopies, and so on. Aerial lidar was more akin to a two-dimensional blanket of light draped over a lumpy world, and the result was, well, not truly three-dimensional. Even multi-return and full waveform systems, while providing a much richer view of vegetated areas, still deliver an essentially top-down view and are not fully 3D.

Unoccupied aerial systems change the nature of the lidar point cloud. That flying-height-to-object-height ratio is much smaller with UAS, so laser beams reflect back from surfaces at steeper angles. The sides of buildings and cliffs are sensed better, and it’s becoming more feasible to see around and under obstructions. The two-dimensional blanket is beginning to feel more like, say, shrink wrap. This is still not truly three-dimensional, at least not in my mind. I’m calling it 2.9D.

There are myriad implications of this new way of illuminating objects from the low altitudes possible with UAS, and most of them are flashy and exciting. But the fundamentals of how we filter and process our point clouds shouldn’t get lost in the shuffle.

Scan angles != incidence angles
It is common practice to filter out of UAS point clouds those returns that have a high scan angle. From a 2.5D standpoint, this makes sense: a laser return from a steep angle will be less accurate. (This isn’t a problem in conventional aerial lidar because the angular field of view is already limited from the start.) But we have to remember that scan angle is the angle of the laser pulse as it was sent from the scanner, measured in the scanner’s coordinate system. Unless the scanner is parallel to perfectly flat ground, the scan angle does not equal the incidence angle. The incidence angle is the angle between the laser pulse and the surface normal of the target, and it is much more pertinent to how reliable that return is. Naively filtering returns with high scan angles will surely get rid of many undesirable returns, but it could mean losing valuable information, particularly on vegetation, steep topography, and buildings in the scene.
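To make the distinction concrete, here is a small sketch with invented vectors. A pulse fired 30 degrees off nadir strikes flat ground at a 30-degree incidence angle, but strikes a wall facing the sensor at roughly 60 degrees, even though the scan angle is identical in both cases:

```python
import numpy as np

def incidence_angle_deg(pulse_direction, surface_normal):
    """Angle between the incoming laser pulse and the target's surface normal."""
    d = pulse_direction / np.linalg.norm(pulse_direction)
    n = surface_normal / np.linalg.norm(surface_normal)
    # Take the absolute value so the normal's sign convention doesn't matter.
    cos_theta = abs(np.dot(d, n))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# A pulse fired 30 degrees off nadir, traveling forward and down.
pulse = np.array([np.sin(np.radians(30)), 0.0, -np.cos(np.radians(30))])

# Flat ground (normal straight up): incidence angle equals the scan angle.
print(incidence_angle_deg(pulse, np.array([0.0, 0.0, 1.0])))   # ~30

# A vertical wall facing the sensor: incidence angle is ~60,
# even though the scan angle is still 30.
print(incidence_angle_deg(pulse, np.array([-1.0, 0.0, 0.0])))  # ~60
```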

“Points per square meter” doesn’t make sense
Another way we talk about point clouds is to speak of their density in terms of points per square meter. When it comes to UAS lidar, unless we’re scanning a barren landscape, this number can be somewhat misleading. And though the end user might not care about the difference between surface density and volume density, the point cloud processor ought to care very much.
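A toy example (invented numbers, plain NumPy) makes the gap obvious: spread the same returns through 20 meters of canopy and the per-square-meter figure stays flattering while the per-cubic-meter figure tells a different story.

```python
import numpy as np

def surface_density(points, cell=1.0):
    """Points per square meter of occupied XY footprint."""
    cells = np.unique(np.floor(points[:, :2] / cell), axis=0)
    return len(points) / (len(cells) * cell ** 2)

def volume_density(points, voxel=1.0):
    """Points per cubic meter of occupied 3D voxels."""
    voxels = np.unique(np.floor(points / voxel), axis=0)
    return len(points) / (len(voxels) * voxel ** 3)

# A toy cloud over a 10 m x 10 m plot with returns spread through 20 m of canopy.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(50_000, 2))
z = rng.uniform(0, 20, size=(50_000, 1))
cloud = np.hstack([xy, z])

print(surface_density(cloud))   # ~500 pts/m^2
print(volume_density(cloud))    # ~25 pts/m^3
```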

One of the steps many of us take when processing UAS lidar point clouds is to thin the cloud to make it more reasonable to analyze or to deliver to another researcher. After all, most UAS laser scanners are spitting out hundreds of thousands of pulses per second, and depending on who you ask, that’s way too much for most applications. Decisions about thinning a conventional aerial point cloud could be made in two dimensions, but decisions about UAS point clouds should be made in three. There are tools in ArcGIS and LAStools, and surely other software packages, that allow the processor to thin more thoughtfully. But at the very least, one should decide on a desired volume density (points per cubic meter) before thinning.
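Here is a minimal sketch of what voxel-based thinning can look like, not tied to any particular tool; the function name and parameters are invented for illustration. Each occupied cubic-meter voxel keeps at most the number of returns implied by the target volume density, so canopy, walls, and ground are thinned independently rather than competing within a single XY cell:

```python
import numpy as np
from collections import defaultdict

def thin_to_volume_density(points, target_ppm3, voxel=1.0, seed=0):
    """Randomly keep at most target_ppm3 * voxel**3 returns per occupied voxel."""
    rng = np.random.default_rng(seed)
    keep_per_voxel = max(1, int(round(target_ppm3 * voxel ** 3)))
    # Group point indices by the voxel they fall in.
    groups = defaultdict(list)
    for i, key in enumerate(map(tuple, np.floor(points / voxel).astype(int))):
        groups[key].append(i)
    keep = []
    for idx in groups.values():
        if len(idx) > keep_per_voxel:
            idx = rng.choice(idx, keep_per_voxel, replace=False).tolist()
        keep.extend(idx)
    return points[np.sort(keep)]

# Example: thin a toy 50,000-point cloud to roughly 20 points per cubic meter.
cloud = np.random.default_rng(1).uniform([0, 0, 0], [10, 10, 20], size=(50_000, 3))
thinned = thin_to_volume_density(cloud, target_ppm3=20)
print(len(cloud), "->", len(thinned))
```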