Results:
Tag: Drone aircraft
  • Vertical and slanted sound propagation in the near-ground atmosphere: amplitude and phase fluctuations

    ABSTRACT: Sound propagation along vertical and slanted paths through the near-ground atmosphere impacts detection and localization of low-altitude sound sources, such as small unmanned aerial vehicles, from ground-based microphone arrays. This article experimentally investigates the amplitude and phase fluctuations of acoustic signals propagating along such paths. The experiment involved nine microphones on three horizontal booms mounted at different heights to a 135-m meteorological tower at the National Wind Technology Center (Boulder, CO). A ground-based loudspeaker was placed at the base of the tower for vertical propagation or 56 m from the base of the tower for slanted propagation. Phasor scatterplots qualitatively characterize the amplitude and phase fluctuations of the received signals during different meteorological regimes. The measurements are also compared to a theory describing the log-amplitude and phase variances based on the spectrum of shear and buoyancy driven turbulence near the ground. Generally, the theory correctly predicts the measured log-amplitude variances, which are affected primarily by small-scale, isotropic turbulent eddies. However, the theory overpredicts the measured phase variances, which are affected primarily by large-scale, anisotropic, buoyantly driven eddies. Ground blocking of these large eddies likely explains the overprediction.
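The log-amplitude and phase statistics compared against theory above can be sketched numerically. A minimal illustration with synthetic phasors (the signal model and all names here are illustrative assumptions, not the paper's measured data): log-amplitude fluctuation is χ = ln(A/⟨A⟩), and the two variances are taken over the received-phasor ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic received phasors: weak log-normal amplitude and Gaussian
# phase fluctuations (a stand-in for demodulated microphone signals).
n = 4096
amp = np.exp(0.1 * rng.standard_normal(n))   # amplitude fluctuations
phase = 0.3 * rng.standard_normal(n)         # phase fluctuations, rad
phasor = amp * np.exp(1j * phase)

# Log-amplitude fluctuation chi = ln(A / <A>); its variance is the
# quantity the abstract compares with turbulence theory.
log_amp = np.log(np.abs(phasor))
chi = log_amp - log_amp.mean()
log_amp_var = np.var(chi)

# Phase variance about the mean (assumes phase is already unwrapped).
phi = np.angle(phasor)
phase_var = np.var(phi - phi.mean())

print(log_amp_var, phase_var)
```

Plotting `phasor` in the complex plane gives the phasor scatterplot the abstract describes: amplitude fluctuations spread points radially, phase fluctuations spread them azimuthally.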
  • guiBathy: A Graphical User Interface to Estimate Nearshore Bathymetry from Hovering Unmanned Aerial System Imagery

    Abstract: This US Army Engineer Research and Development Center, Coastal and Hydraulics Laboratory, technical report details guiBathy, a graphical user interface to estimate nearshore bathymetry from imagery collected via a hovering Unmanned Aerial System (UAS). guiBathy provides an end-to-end solution for non-subject-matter-experts to utilize commercial-off-the-shelf UAS to collect quantitative imagery of the nearshore by packaging robust photogrammetric and signal-processing algorithms into an easy-to-use software interface. This report begins by providing brief background on coastal imaging and the photogrammetry and bathymetric inversion algorithms guiBathy utilizes, as well as UAS data collection requirements. The report then describes guiBathy software specifications, features, and workflow. Example guiBathy applications conclude the report with UAS bathymetry measurements taken during the 2020 Atlantic Hurricane Season, which compare favorably (root mean square error = 0.44 to 0.72 m; bias = -0.35 to -0.11 m) with in situ survey measurements. guiBathy is a standalone executable software for Windows 10 platforms and will be freely available at www.github.com/erdc.
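guiBathy's inversion algorithms are not detailed in this listing, but depth-inversion methods of this family typically recover depth h from the linear dispersion relation ω² = g·k·tanh(kh), with wave frequency ω and wavenumber k estimated spectrally from the imagery. A minimal sketch of that final root-finding step only; the input values are assumed examples, not guiBathy's API:

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def depth_from_dispersion(omega, k, h_max=100.0, tol=1e-6):
    """Solve the linear dispersion relation omega^2 = g*k*tanh(k*h)
    for depth h by bisection. omega (rad/s) and k (rad/m) would come
    from spectral analysis of the imagery."""
    f = lambda h: G * k * math.tanh(k * h) - omega ** 2
    lo, hi = 1e-6, h_max
    if f(hi) < 0:
        raise ValueError("no solution shallower than h_max")
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) < 0:   # f is increasing in h
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: 8-s swell with an observed 60-m wavelength.
omega = 2 * math.pi / 8.0
k = 2 * math.pi / 60.0
print(round(depth_from_dispersion(omega, k), 2))
```

In shallow water tanh(kh) → kh and the relation degenerates, which is one reason imagery-based inversions quote depth-dependent error bars like the RMSE range above.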
  • Evaluation of Unmanned Aircraft System Coastal Data Collection and Horizontal Accuracy: A Case Study at Garden City Beach, South Carolina

    Abstract: The US Army Corps of Engineers (USACE) aims to evaluate unmanned aircraft system (UAS) technology to support flood risk management applications, examining data collection and processing methods and exploring potential coastal capabilities. Foundational evaluation of the technology is critical for understanding data applications and determining best practices for data collection and processing. This study demonstrated the efficacy of UAS multispectral (MS) and red-green-blue (RGB) imagery for coastal monitoring using Garden City Beach, South Carolina, as a case study. Relative impacts on horizontal accuracy were evaluated under varying field scenarios (flying altitude, viewing angle, and use of an onboard Real-Time Kinematic–Global Positioning System), levels of commercial off-the-shelf software processing precision (default optimal versus high or low levels) and processing time, and number of ground control points applied during postprocessing (default number versus additional points). Many data sets met the minimum horizontal accuracy requirements designated by USACE Engineering Manual 2015. The data collection and processing methods highlight procedures that yield high-resolution UAS MS and RGB imagery meeting a variety of USACE project monitoring needs: site plans, beach renourishment and hurricane protection projects, project conditions, planning and feasibility studies, floodplain mapping, water quality analysis, flood control studies, emergency management, and ecosystem restoration.
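Horizontal-accuracy figures of the kind evaluated in this study are conventionally computed from checkpoint residuals between image-derived and surveyed coordinates. A minimal sketch; the residual values and function name are illustrative, not taken from the study:

```python
import math

def horizontal_accuracy(dx, dy):
    """Horizontal RMSE and mean bias from easting/northing residuals (m)
    between image-derived and surveyed checkpoint coordinates."""
    n = len(dx)
    rmse_x = math.sqrt(sum(e * e for e in dx) / n)
    rmse_y = math.sqrt(sum(e * e for e in dy) / n)
    rmse_r = math.sqrt(rmse_x ** 2 + rmse_y ** 2)  # radial horizontal RMSE
    bias = (sum(dx) / n, sum(dy) / n)
    return rmse_r, bias

# Illustrative residuals (m) for five checkpoints.
dx = [0.03, -0.05, 0.02, 0.04, -0.01]
dy = [-0.02, 0.01, -0.04, 0.03, 0.02]
rmse, bias = horizontal_accuracy(dx, dy)
print(rmse, bias)
```

Accuracy standards such as the USACE manual cited above specify thresholds on exactly this kind of radial RMSE statistic.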
  • PUBLICATION NOTICE: Use of Convolutional Neural Networks for Semantic Image Segmentation Across Different Computing Systems

    ABSTRACT: The advent of powerful computing platforms coupled with deep learning architectures has resulted in novel approaches to many traditional computer vision problems, automating the interpretation of large and complex geospatial data. Such tasks are particularly important as data become widely available and UAS are increasingly used. This document presents a workflow that leverages CNNs and GPUs to automate pixel-wise segmentation of UAS imagery for faster image processing. GPU-based computing and parallelization are explored on multi-core GPUs to reduce development time, mitigate the need for extensive model training, and facilitate exploitation of mission-critical information. VGG-16 model training times are compared among different systems (single, virtual, and multiple GPUs) to investigate each platform's capabilities. CNN results show a precision of 88% when applied to ground truth data. Coupling the VGG-16 model with GPU-accelerated processing and parallelization across multiple GPUs decreases model training time while preserving accuracy. This signifies that the GPU memory and cores available within a system are critical to preprocessing and processing speed. This workflow can be leveraged for future segmentation efforts, serve as a baseline to benchmark future CNNs, and efficiently support critical image processing tasks for the military.
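The per-pixel precision quoted above is a standard segmentation metric: of the pixels the model labels positive, the fraction that agree with ground truth. A minimal binary-mask sketch (the toy arrays are assumptions for illustration, not the study's data):

```python
import numpy as np

def pixel_precision(pred, truth):
    """Precision for binary segmentation masks: TP / (TP + FP)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()   # predicted 1, truth 1
    fp = np.logical_and(pred, ~truth).sum()  # predicted 1, truth 0
    return tp / (tp + fp) if (tp + fp) else float("nan")

# Toy 4x4 predicted and ground-truth masks.
pred = np.array([[1, 1, 0, 0],
                 [1, 0, 0, 0],
                 [0, 0, 1, 1],
                 [0, 0, 1, 0]])
truth = np.array([[1, 1, 0, 0],
                  [0, 0, 0, 0],
                  [0, 0, 1, 1],
                  [0, 0, 0, 0]])
print(pixel_precision(pred, truth))
```

For multi-class segmentation of the kind a VGG-16-based model produces, the same computation is applied per class and averaged.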