Abstract—We have developed a method employing an autonomous unmanned aerial system (UAS) to provide a more robust measure of the field terrain correction (FTC) for gravity measurements than is offered by traditional methods. The resolution of digital terrain data typically available for the United States (10 or 30 m) is too low to adequately estimate the FTC in steep terrain. The FTC corresponds to the innermost zone around the gravity station (e.g., extending to a 68 m radius for the Hayford-Bowie zones A and B) and is most often estimated in the field with the aid of templates and charts that approximate sectors of the terrain as uniform slopes. These techniques can incur significant error if they are not performed by experienced practitioners. In our approach, we dispatch a UAS to collect camera images around the gravity station, which we use to construct a digital elevation model (DEM) of the area with the structure from motion (SfM) method. The resulting DEMs allow us to precisely calculate the FTC. We have developed software to automate most of the procedure, including the autonomous flight and image capture by the UAS. We have experimented with a variety of flight paths at several sites spanning a range of terrain conditions to determine the most efficient flight characteristics for this application. Selecting field sites with existing light detection and ranging (LiDAR) data has enabled us to characterize errors in the DEMs derived from SfM and to assess the tradeoffs between flight time, processing time, and accuracy of the resulting FTCs. Our methodology is fast (flight time: 3–4 minutes) and robust, primarily because the UAS flight is automated. It can be used to calculate the FTC for a variety of terrain conditions and delivers results that are much more precise than existing methods that do not make use of high-resolution digital terrain data.
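To illustrate the kind of computation the abstract describes, the sketch below sums per-cell contributions from a gridded DEM out to the 68 m radius of the Hayford-Bowie zones A and B. This is a minimal illustration, not the authors' software: the function names are ours, and each DEM cell is approximated as a vertical line mass, whose attraction at horizontal distance r is G·ρ·A·(1/r − 1/√(r² + Δh²)) — positive for both hills and valleys, as a terrain correction must be.

```python
import numpy as np

G = 6.674e-11    # gravitational constant [m^3 kg^-1 s^-2]
RHO = 2670.0     # standard crustal density [kg m^-3]

def terrain_correction(dem, cell, station_ij, r_max=68.0, rho=RHO):
    """Field terrain correction (mGal) at a station from a gridded DEM.

    dem        : 2-D elevation array [m]
    cell       : grid spacing [m]
    station_ij : (row, col) index of the gravity station in `dem`
    r_max      : outer radius of the innermost zone [m]

    Each cell is treated as a vertical line mass of cross-section
    cell**2 whose height is its elevation difference from the station.
    Cells beyond r_max are ignored, as is the station's own cell.
    """
    ny, nx = dem.shape
    i0, j0 = station_ij
    jj, ii = np.meshgrid(np.arange(nx), np.arange(ny))
    r = cell * np.hypot(ii - i0, jj - j0)     # horizontal distance [m]
    dh = dem - dem[i0, j0]                    # elevation difference [m]
    mask = (r > 0) & (r <= r_max)
    tc = G * rho * cell**2 * (1.0 / r[mask]
                              - 1.0 / np.sqrt(r[mask]**2 + dh[mask]**2))
    return tc.sum() * 1e5                     # m/s^2 -> mGal
```

A flat DEM yields a zero correction, while a hill and a valley of equal size contribute identically, since only |Δh| enters the formula.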
A DJI Innovations Matrice 100 with a Zenmuse X3 gimbal and camera flying at one of the test sites.
(Above) Screenshot of the AgiSoft PhotoScan processing screen; blue patterns represent the camera positions, illustrating the flight path. (Below) Example comparison of the 3D model from the UAS against LiDAR data.