UAVs and Computer Vision Enhance Remote Runway Inspections

In a paper published in the journal Drones, researchers introduced an innovative automated system for monitoring and maintaining remote gravel runways in Northern Canada.

Study: UAVs and Computer Vision Enhance Remote Runway Inspections. Image Credit: Supavadee butradee/Shutterstock

This system addressed the challenges of isolation and harsh weather by using unmanned aerial vehicles (UAVs) and computer vision with deep learning (DL) algorithms. It detected runway defects like water pooling, vegetation growth, and surface irregularities by analyzing high-resolution UAV imagery through a combined vision transformer model and image processing techniques.

Beyond defect detection, it also evaluated runway smoothness to enhance air transport safety and reliability in these regions. Real-world experiments across multiple remote airports validated the effectiveness of this UAV and DL-based approach over traditional manual inspection methods.

Background

Past work has focused on using deep learning techniques like convolutional neural networks (CNNs) and computer vision to detect defects on airport runways with asphalt or concrete surfaces. Some studies have utilized UAVs to capture aerial imagery and apply image segmentation algorithms for automated crack detection and pavement condition assessment. However, gravel runway inspection has received comparatively little research attention, and it requires different approaches due to the unique characteristics of gravel surfaces. Existing methods for gravel runway evaluation still rely heavily on manual processes.

Gravel Runway Analysis

The smoothness of a gravel runway is evaluated by quantifying surface irregularities while distinguishing them from the normal runway texture. This is achieved using the bilateral filter, an edge-preserving smoothing algorithm: comparing the filtered image against the original highlights irregularities while keeping genuine edges intact.
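The idea can be illustrated with a minimal sketch. The naive bilateral filter below (all parameter values are illustrative, not taken from the paper) smooths a synthetic "runway patch" while preserving intensity edges; the absolute difference between the original and filtered images then peaks at an isolated irregularity:

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.5):
    """Naive bilateral filter for a float grayscale image in [0, 1]."""
    h, w = img.shape
    pad = np.pad(img, radius, mode="edge")
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(xs**2 + ys**2) / (2 * sigma_s**2))  # spatial kernel
    out = np.empty_like(img)
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: down-weight pixels whose intensity differs
            # from the centre pixel, which is what preserves strong edges.
            rng = np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma_r**2))
            w_ij = spatial * rng
            out[i, j] = (w_ij * patch).sum() / w_ij.sum()
    return out

# A flat patch with one isolated bump: the difference between the image
# and its smoothed version is largest at the irregularity.
img = np.full((9, 9), 0.5)
img[4, 4] = 1.0
diff = np.abs(img - bilateral_filter(img))
```

In a production setting this loop would be replaced by an optimized implementation such as OpenCV's `cv2.bilateralFilter`; the brute-force version above is only meant to make the weighting explicit.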

Morphological operations such as erosion and dilation, together with the Ramer-Douglas-Peucker algorithm for contour approximation, refine the results so that only relevant irregularities are retained. Finally, a modified sigmoid function rates the runway condition on a 1-5 scale, with higher values indicating a greater need for maintenance.
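The paper does not give the exact form of its modified sigmoid, but a plausible sketch is a logistic curve rescaled to the 1-5 range, with the steepness `k` and `midpoint` as hypothetical tuning parameters:

```python
import math

def smoothness_rating(score, k=1.0, midpoint=3.0):
    """Hypothetical modified sigmoid: maps a nonnegative irregularity
    score to a 1-5 maintenance rating (higher = more maintenance needed).
    The parameters k and midpoint are illustrative, not from the paper."""
    return 1.0 + 4.0 / (1.0 + math.exp(-k * (score - midpoint)))

# A score at the midpoint maps to the middle of the scale (3),
# and the rating saturates toward 5 as irregularities accumulate.
print(round(smoothness_rating(3.0), 2))   # midpoint of the 1-5 scale
print(round(smoothness_rating(10.0), 2))  # near the top of the scale
```

The appeal of a sigmoid here is that ratings saturate gracefully: a runway that is twice as rough as a "bad" one still maps near 5 rather than off the scale.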

The methodology involves training the model on a dataset of 4K RGB images captured by UAVs at 40-70 m altitude over six remote airports in Northern Canada. Images were pre-processed by resizing to 1024x1024 pixels and augmented through flipping, saturation, and exposure adjustments. Key features like water pooling, vegetation, and runway edges were manually annotated for supervised learning.

Performance is evaluated using standard metrics for image segmentation tasks: Intersection over Union (IoU), accuracy, F-score, precision, and recall. IoU measures the overlap between predicted and ground truth regions, accuracy counts correct predictions, F-score combines precision and recall, precision captures true positives among predicted positives, and recall finds true positives among actual positives.
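These definitions can be made concrete by computing all five metrics from a pair of binary masks; this is a generic implementation of the standard formulas, not code from the study:

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Standard binary-mask segmentation metrics from the confusion counts."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.sum(pred & gt)        # predicted positive, actually positive
    fp = np.sum(pred & ~gt)       # predicted positive, actually negative
    fn = np.sum(~pred & gt)       # missed positives
    tn = np.sum(~pred & ~gt)      # correctly rejected negatives
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return {
        "iou": tp / (tp + fp + fn),
        "accuracy": (tp + tn) / pred.size,
        "precision": precision,
        "recall": recall,
        "f_score": 2 * precision * recall / (precision + recall),
    }

# Toy example: the prediction covers 4 of the 6 ground-truth pixels.
pred = np.zeros((4, 4), dtype=bool); pred[1:3, 1:3] = True
gt = np.zeros((4, 4), dtype=bool);   gt[1:3, 1:4] = True
m = segmentation_metrics(pred, gt)
```

Here precision is perfect (no false positives) while recall and IoU are penalized for the two missed pixels, which is exactly the distinction the metrics are meant to expose.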

The bilateral filter, morphological operations, Ramer-Douglas-Peucker algorithm, and modified sigmoid function analyze runway imagery, identify defects and irregularities, and provide an automated rating of the gravel runway's smoothness condition. This novel vision-based system aims to enhance inspection capabilities for remote airports.

Automated Runway Inspection

The selected image segmentation models for the project were Mask R-CNN, PointRend, and Mask2Former. Initially, these models were trained and tested using the LARD dataset, which consisted of 1500 aerial front-view images of runways taken during the aircraft landing phase. The dataset was resized and split into an 8:2 train-validate ratio. After obtaining the UAV dataset of six remote airports, the team retrained the three models to compare their performance.

The UAV dataset contained 6832 images, and the models were trained with a batch size of 2 for 100 epochs. Mask2Former outperformed the other two models in accuracy and intersection over union (IoU). The team conducted a final training session for Mask2Former to ensure the best fit before deployment. The dataset was augmented by manually adding water pools to improve water-pooling detection, which led to better IoU and accuracy overall.

To streamline the analysis of runway images, an automated pipeline was developed that integrated the essential stages of slicing, detecting, and merging. Large orthorectified images were segmented into smaller pieces for detailed analysis, and each sliced segment underwent a detailed detection process using the trained Mask2Former model. This step identified and classified points of interest (POIs) such as surface irregularities, water pooling, and vegetation encroachment. The system also evaluated the smoothness of the runways.
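The slice-detect-merge pipeline can be sketched as follows. The `detect` function here is a dummy threshold stand-in for the trained Mask2Former model, and the tile size is illustrative; the structure of slicing a large orthomosaic, running detection per tile, and stitching the masks back together is the point:

```python
import numpy as np

TILE = 4  # illustrative tile size; real orthomosaics use much larger tiles

def slice_image(img, tile=TILE):
    """Cut a large image into non-overlapping tiles with their offsets."""
    h, w = img.shape[:2]
    return [((i, j), img[i:i + tile, j:j + tile])
            for i in range(0, h, tile) for j in range(0, w, tile)]

def detect(tile_img):
    """Dummy stand-in for the trained Mask2Former model: flags bright
    pixels as points of interest (POIs)."""
    return tile_img > 0.8

def merge(detections, shape, tile=TILE):
    """Stitch the per-tile POI masks back into a full-size mask."""
    full = np.zeros(shape, dtype=bool)
    for (i, j), mask in detections:
        full[i:i + mask.shape[0], j:j + mask.shape[1]] = mask
    return full

img = np.random.default_rng(0).random((8, 8))       # toy "orthomosaic"
tiles = slice_image(img)
merged = merge([(off, detect(t)) for off, t in tiles], img.shape)
```

Slicing keeps each inference within the model's input resolution, while carrying the tile offsets through to the merge step lets every detected POI be mapped back to its true position on the runway.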

The research has significant potential for improving aviation safety and operational efficiency at remote airports with unpaved runways, and its implications extend well beyond Northern Canada. Countries with vast, sparsely populated areas, such as the United States, Australia, and New Zealand, rely on gravel runways to connect remote communities. Developing nations with limited road and rail infrastructure could also benefit from the automated runway inspection and maintenance system.

Conclusion

The paper introduced a novel approach for automating the monitoring and maintenance of gravel runways at remote airports using UAV imagery and advanced computer vision techniques. The approach accurately detected and segmented runway defects such as water pooling, vegetation, and rough surfaces.

Extensive experimentation with diverse aerial images demonstrated the approach's effectiveness and robustness. The approach's potential applications extended beyond aviation, including infrastructure, agriculture, and environmental monitoring. The automated system offered a universal, effective, and user-friendly solution for airports globally.


Written by

Silpaja Chandrasekar

Dr. Silpaja Chandrasekar has a Ph.D. in Computer Science from Anna University, Chennai. Her research expertise lies in analyzing traffic parameters under challenging environmental conditions. Additionally, she has gained valuable exposure to diverse research areas, such as detection, tracking, classification, medical image analysis, cancer cell detection, chemistry, and Hamiltonian walks.

Citations

Please use one of the following formats to cite this article in your essay, paper or report:

  • APA

    Chandrasekar, Silpaja. (2024, June 05). UAVs and Computer Vision Enhance Remote Runway Inspections. AZoAi. Retrieved on July 17, 2024 from https://www.azoai.com/news/20240605/UAVs-and-Computer-Vision-Enhance-Remote-Runway-Inspections.aspx.

  • MLA

    Chandrasekar, Silpaja. "UAVs and Computer Vision Enhance Remote Runway Inspections". AZoAi. 17 July 2024. <https://www.azoai.com/news/20240605/UAVs-and-Computer-Vision-Enhance-Remote-Runway-Inspections.aspx>.

  • Chicago

    Chandrasekar, Silpaja. "UAVs and Computer Vision Enhance Remote Runway Inspections". AZoAi. https://www.azoai.com/news/20240605/UAVs-and-Computer-Vision-Enhance-Remote-Runway-Inspections.aspx. (accessed July 17, 2024).

  • Harvard

    Chandrasekar, Silpaja. 2024. UAVs and Computer Vision Enhance Remote Runway Inspections. AZoAi, viewed 17 July 2024, https://www.azoai.com/news/20240605/UAVs-and-Computer-Vision-Enhance-Remote-Runway-Inspections.aspx.

