Aerial UAV with optical payloads


Main objective

An unmanned aerial vehicle (UAV) is defined as a powered aerial vehicle that does not carry a human operator, uses aerodynamic forces to fly, and is piloted remotely or by means of autonomous control [1].

In order to measure and monitor the environment, these vehicles can carry imaging sensors, such as Light Detection and Ranging (LiDAR), Synthetic Aperture Radar (SAR), and other NDT payloads. The measurements can be georeferenced by the navigation system and the attitude sensors of the vehicle, generally based on Global Navigation Satellite Systems (GNSS) and Inertial Measurement Units (IMU).

The main contribution of UAV technology is the capability to fly in difficult and inaccessible areas, lowering the risks to the crews of manned aircraft.

In recent years, these systems have enabled the so-called Aerial Robotics, which can obtain measurements and perform inspection tasks on bridges, providing georeferenced images and raw data from onboard sensors.

This technology consists of a platform that acts as a carrier for multiple sensors that have been widely used for bridge diagnosis, such as LiDAR, GPR, and others. We refer interested readers to the reports on the specific sensor technologies.


Functioning mode

The behaviour of a UAV mainly depends on its system properties and, in general, its functioning mode heavily depends on its subsystems, which briefly consist of:


Frame

The frame is the main structural element of a UAS. It supports the rest of the components, such as motors, electronic subsystems, batteries, and payload.

The frame design reflects a trade-off between a smooth geometry with low aerodynamic resistance and the need for actuation and hovering in the air. Accordingly, aerodynamics is more important in fixed-wing than in rotary-wing UASs.

The classification of frames also depends on the aerospace materials used to build the UAV: from foam frames used in low-end fixed-wing UAVs to carbon fiber and other expensive materials used in high-end vehicles. For rotary-wing UAVs, plastic, aluminium, and carbon fiber frames are frequent materials, depending on the quality of the aircraft.

Motors and batteries

The strong adoption of rotary-wing UAVs is partly explained by the performance and popularization of brushless motors. These motors can be controlled straightforwardly with off-the-shelf electronics and are very energy-efficient, being powered by light batteries through an integrated inverter that generates the AC signals that control and drive the motor. Heavy UAVs that must carry large payloads need combustion engines, which are more complex to use and maintain.


Propellers

The propellers convert the rotational motion of the motors into thrust following Bernoulli's principle. Depending on the type and size of the UAV, propellers differ in size, pitch, number of blades, and material. For professional small UAVs, carbon fiber propellers are preferred because they are more rigid and produce less vibration when spinning.

Flight control

The flight of the UAV is controlled by the so-called autopilot, which performs a mission based on a planned path. This is a challenging task and the basis for the different applications of UAVs beyond remote control by a human operator.

Path planning must consider several procedures and constraints, such as obstacle avoidance, maximum coverage, sensor limitations, and vehicle motion, as well as time and cost efficiency. For optimal flight paths, several algorithms have been proposed based on spanning trees or neural networks [2].
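As a minimal sketch of such planning, the following generates a boustrophedon (lawnmower) coverage path over a flat rectangular area, ignoring obstacles and vehicle dynamics; the function name and parameters are illustrative assumptions, not part of any specific planner cited above:

```python
# Sketch: boustrophedon (lawnmower) coverage path over a rectangular
# inspection area. Illustrative only; real planners also handle obstacle
# avoidance, sensor footprints, and vehicle dynamics.

def coverage_waypoints(width_m, height_m, spacing_m):
    """Return a list of (x, y) waypoints sweeping the area in parallel lines."""
    waypoints = []
    x = 0.0
    direction = 1  # +1 sweeps up, -1 sweeps down
    while x <= width_m:
        y_start, y_end = (0.0, height_m) if direction == 1 else (height_m, 0.0)
        waypoints.append((x, y_start))
        waypoints.append((x, y_end))
        direction *= -1
        x += spacing_m
    return waypoints

path = coverage_waypoints(20.0, 10.0, 5.0)
print(path[:4])  # first two sweep lines
```

The line spacing would in practice be derived from the sensor footprint and the required image overlap.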

The flight control strongly depends on the Navigation (GNSS) and attitude (IMU) sensors of the aircraft.

Payloads and Data Processing

The payload is defined as all the components that are not used to fly but are specific to the mission and its objectives. Payloads of civil UASs are mainly classified into optical payloads, which enable remote sensing of the environment, and NDT payloads, which support aerial robotics [1].


Optical payloads are the basis for remote inspection, one of the key features enabled by UAVs. Typical payloads that support a wide range of inspection tasks include [3]:

  • Passive sensors: These sensors (mainly cameras) use ambient light, recording its reflection on the object to be imaged, and cover different ranges of the spectrum.

    • RGB cameras. These are the most frequent cameras and may be considered ubiquitous sensors, since a high number of modern devices include cameras. Depending on their applicability to measurement tasks, we can distinguish metric, semi-metric, and non-metric cameras [4].
    • Multispectral and hyperspectral cameras. These cameras can capture images in visible and non-visible parts of the spectrum, from the ultraviolet (UV) to the long-wave infrared (LWIR), passing through the optical frequencies, near infrared (NIR), and short-wave infrared (SWIR). Depending on the number of bands that the camera is able to capture, we distinguish multispectral (up to 10 bands) from hyperspectral cameras (tens to hundreds of bands).
    • Thermal cameras. Given their application, we distinguish this type of camera based on sensors that are sensitive to the long-infrared spectrum, the so-called Thermal Infrared (TIR), which covers from 8 to 14 micrometres [5].
  • Active sensors: These sensors use an internal source of energy to illuminate the object to be inspected and record the reflection of that specific signal. As a consequence, these sensors are sensitive only to a specific range of the spectrum, being monochromatic in the case of laser-based systems. Active sensors include [6]:
    • Light detection and ranging (LiDAR) is probably the most up-to-date system for object inspection and monitoring. More information can be found in the Laser Scanning report.
    • Radio detection and ranging (RADAR) / synthetic aperture RADAR (SAR). In this case, the source of energy is a pulsed radio wave. More information can be found in the GPR report.
    • Sound navigation and ranging (SONAR). Using pressure waves, SONAR is typically applied to underwater measurements, providing information on underwater objects, although its object-detection capability is not limited to that environment.




There are several types of UAVs, and they can be classified according to a number of characteristics, as developed in the following tables [7] and paragraphs.

Depending on the weight:

Class       Type                      Weight range
Class I(a)  Nano drones               W ≤ 200 g
Class I(b)  Micro drones              200 g < W ≤ 2 kg
Class I(c)  Mini drones               2 kg < W ≤ 20 kg
Class I(d)  Small drones              20 kg < W ≤ 150 kg
Class II    Tactical drones           150 kg < W ≤ 600 kg
Class III   MALE/HALE/Strike drones   W > 600 kg

Table 1. Classification of UAVs depending on their weight
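The weight classes of Table 1 can be expressed as a simple lookup; this sketch is only a restatement of the table, and the function name is illustrative:

```python
# Sketch: map take-off weight (kg) to the UAV classes of Table 1.
def uav_class(weight_kg):
    if weight_kg <= 0.2:
        return "Class I(a) Nano"
    if weight_kg <= 2:
        return "Class I(b) Micro"
    if weight_kg <= 20:
        return "Class I(c) Mini"
    if weight_kg <= 150:
        return "Class I(d) Small"
    if weight_kg <= 600:
        return "Class II Tactical"
    return "Class III MALE/HALE/Strike"

print(uav_class(1.5))  # Class I(b) Micro
```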

Depending on the application:

  • Military
  • Civilian
    • Agriculture and forestry
    • Disaster monitoring and management
    • Surveillance
    • Environmental monitoring
    • 3D mapping
    • Atmospheric monitoring
    • Wildlife monitoring

Depending on the flying principle

Type Description
Fixed wing An airfoil generates the lift of the plane. These frames are controlled using surfaces built into the wing (ailerons, elevators, and rudder)
Flapping wing Inspired by insects and small birds, these consist of flexible, flapping wings driven by an actuation mechanism. Most flapping-wing designs have flexible and light wings, as observed in birds and insects, which indicates that the flexibility and weight of the wings are important for aerodynamic proficiency and flight stability
Fixed/flapping wing Hybrid designs which use fixed wings for lift and flapping wings for propulsion
Rotary wing These systems are based on the control of the motors and propellers that provide the lift force. Their manoeuvrability is very high and allows them to hover and, subsequently, to fly in and inspect confined spaces. These characteristics make rotary wings the key UAV type for surveying hardly accessible areas and components in high-rise applications

Table 2. UAV classification based on flying principle

Process/event to be detected or monitored

UAVs combine the advantages of robot inspection and remote sensing inspection. As such, the use of these systems for documentation, inspection, and monitoring has gained significant attention. Even though RPAS/UAV platforms can help practitioners make measurements in difficult environments, the detection of damage and failures depends on the specific sensors onboard the platform. Please refer to the templates about sensor technologies for more information.

Physical quantity to be measured

The quantities to be measured using UAVs depend mainly on the payload sensors attached to the frame. The most frequent sensors include:

  • Optical cameras for visual inspection through image processing
  • Multispectral and thermographic cameras
  • LiDAR for collecting point clouds
  • RADAR payloads

Please review the other templates regarding specific sensors for a more detailed description of the measurement techniques.

Visual inspection

Traditional visual bridge inspection methods are time-consuming and unsafe for practitioners because of work at height and the risk of falling. The use of UASs for visual inspection is a mitigation measure that reduces risk and improves the efficiency of field inspection. As a generally feasible procedure, UASs are able to obtain high-quality image data to be analysed by inspectors.

In this context, the visual inspection of the assets is based on optical payloads onboard the frame of the UAV and depends on the image acquisition system, the surveying process in the field, and the trajectory followed by the system. To limit the dependence on field work, a UAS can include control algorithms based on computer vision to navigate an unknown 3D environment, with saturation functions to keep the object to be inspected within the camera's field of view.

Unattended or automated image processing can contribute to detecting a range of damage on the surface of the asset, such as moderate cracks with thicknesses ranging from 0.5 mm to several mm. The extent of a crack can be obtained with statistical analysis and sufficient data to cover a representative inspection area. Depending on the thickness, a standard photogrammetric survey can be a useful tool for quantifying the damage.

To detect thinner cracks, higher-quality images (in terms of resolution, sharpness, and entropy) followed by image post-processing can reveal cracks as thin as 0.1 mm.
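The link between detectable crack thickness and image quality can be illustrated with the ground sample distance (GSD), the object-space size of one pixel; the numeric values below (pixel pitch, focal length, distance) are illustrative assumptions:

```python
# Sketch: ground sample distance (GSD) for a pinhole camera.
# GSD = pixel size * object distance / focal length.

def ground_sample_distance(pixel_size_mm, focal_length_mm, distance_m):
    """GSD in mm per pixel: the object-space size of one pixel."""
    return pixel_size_mm * (distance_m * 1000.0) / focal_length_mm

# 4 micrometre pixel, 35 mm lens, 5 m stand-off distance
gsd = ground_sample_distance(0.004, 35.0, 5.0)
print(round(gsd, 3))  # ~0.571 mm/pixel
```

A crack much thinner than the GSD cannot be resolved reliably, so detecting 0.1 mm cracks requires reducing the stand-off distance or using a longer focal length.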

In addition, the UAS can ease access to structural components at high altitude, such as girders, combining optical cameras with infrared, motion, and modelling sensors.

Other damage, such as gaps between the end of a kerf plate and the sawn kerf in the brace, can be identified on the bridge components.

In general, and due to the higher resolution of optical cameras, image processing techniques may be preferred over other sensors, such as LiDAR or RADAR, for detecting the boundaries of defects.

Induced damage to the structure during the measurement

Not Applicable

General characteristics

Measurement type (static or dynamic, local or global, short-term or continuous, etc.)

This depends on the payload attached to the system and is subject to the flight time of the platform.

The flight time of typical UAVs ranges from 15–45 minutes for multirotor systems to more than one hour for fixed-wing aircraft.

Given that multirotor UAVs are more adequate for bridge inspection, the typical measurement time can be considered to be around 30 minutes, restricted by the autonomy of the UAV and the payload.

Measurement range

This depends on the payload attached to the system and is subject to the flight time of the platform.

For optical systems (i.e. cameras) onboard the UAV, the range is restricted by the optical scale of the images.

The optical scale is the ratio between the distance to the object and the principal distance (focal length) of the camera. This optical scale limits the expected accuracy of the measurements and, in general, the longer the principal distance, the higher the expected accuracy.

The drawback of the optical scale lies in the field of view (FoV) of the camera: for a given body and sensor, a longer principal distance reduces the FoV and, thus, increases the time needed for measurement in the field and requires more complex planning of the operation.
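The trade-off described above can be sketched numerically: for a fixed sensor, the horizontal FoV angle narrows as the principal distance grows. A full-frame 36 mm sensor width is an illustrative assumption:

```python
import math

# Sketch: horizontal field of view of a pinhole camera,
# FoV = 2 * atan(sensor_width / (2 * focal_length)).

def horizontal_fov_deg(sensor_width_mm, focal_length_mm):
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# Longer principal distance -> better optical scale but narrower FoV
for f in (24, 50, 100):  # focal lengths in mm, assuming a 36 mm wide sensor
    print(f, "mm ->", round(horizontal_fov_deg(36.0, f), 1), "deg")
```

A halved FoV roughly doubles the number of images (and flight time) needed to cover the same surface.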

For more information on active sensors (LiDAR and RADAR) please refer to the technology documentation.

Measurement accuracy

Regarding the accuracy of the data collected with optical payloads, two main sources of error should be taken into consideration:

  • The quality of the collected images
  • The accuracy of the data processing procedure, mainly the photogrammetric process

Image quality assessment

After the inspection, the image quality assessment should be completed in the field in order to limit the number of visits to the inspection site.

Ideally, the entire dataset should be analysed, which is demanding for photogrammetric projects because the number of pictures is very high.

The quality of the images can be quantified by calculating the sharpness and entropy of each image and comparing the results to the average values for the dataset. This analysis can be carried out with image processing tools included in photogrammetric software bundles. The higher-quality images are the portion of the dataset presenting improved sharpness without a variation in entropy.

The quality of the images largely depends on the lighting conditions: sufficient illumination leads to sharper images with increased entropy [8,9].
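A minimal sketch of the two metrics, assuming greyscale images stored as NumPy arrays; Laplacian variance for sharpness and histogram entropy are common choices, though photogrammetric bundles may implement different estimators:

```python
import numpy as np

def sharpness(img):
    """Variance of a 4-neighbour Laplacian; higher means sharper edges."""
    lap = (-4.0 * img[1:-1, 1:-1] + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return lap.var()

def entropy(img, bins=256):
    """Shannon entropy of the grey-level histogram, in bits."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

# Synthetic demonstration: blurring lowers the sharpness score
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64)).astype(float)
blurred = (img[:-1, :-1] + img[1:, :-1] + img[:-1, 1:] + img[1:, 1:]) / 4.0
print(sharpness(img) > sharpness(blurred))  # True
```

In practice the scores are compared against the dataset average, as described above, rather than used as absolute thresholds.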

Accuracy of the photogrammetric process

Beyond the quality of each single image, when dealing with a photogrammetric project it is important to quantify the accuracy of the final 3D measurements. The internal and external accuracy of the measurements is one of the results of the photogrammetric data processing and, thus, surveying control and test points is mandatory to obtain both. Moreover, the final accuracy depends on the photogrammetric image network, which should be defined prior to the field work, as described in the next sections.

To summarize, the photogrammetric process is based on a number of procedures that affect the accuracy of the measurements and should be considered:

  • Alignment of the images. In the current state of the art, the alignment solves the relative orientation of the images and the absolute orientation to a global frame (i.e., a global coordinate system). It is based on a bundle block adjustment that solves the unknowns for certain 3D points on the inspection site, considered interest points, obtained from salient features detected in the images. As a result, the alignment provides precision figures based on redundant observations of these interest points, which are classified into key points and tie points.

  • A dense point cloud can be calculated from the alignment of the images. In this procedure, the disparity map of the images is transformed into a depth map and a 3D point cloud. The accuracy of this procedure depends not only on the image network and the overall accuracy of the interest points, but also on the reliability of each pixel in the image, which relates to the image quality assessment. Nevertheless, there are a number of methods to detect outliers in the point cloud calculation and, accordingly, to filter out the points with lower precision.
  • Depending on the inspection task, obtaining a mesh of the surface may be mandatory to model the site. This process is subject to the filtering of unknowns and may result in lower-frequency models. A visual inspection of such models is necessary to resolve the trade-off between the number of triangles in the surface and the actual optical resolution of the model. Correcting the texture of this triangulation can lead to a more accurate model in terms of radiometric properties.
  • The most frequently used method for assessing the photogrammetric results is a classical survey using GPS, depending on the accuracy objectives. The surveyed points must be divided into control points and check points to establish the internal and external accuracy figures of the photogrammetric process.
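The external accuracy check at the check points can be sketched as a per-axis RMSE between photogrammetric coordinates and the independent GNSS survey; all coordinate values below are hypothetical:

```python
import numpy as np

def rmse(measured, reference):
    """Per-axis root-mean-square error between measured and reference XYZ."""
    d = np.asarray(measured) - np.asarray(reference)
    return np.sqrt((d ** 2).mean(axis=0))

# Hypothetical check points: photogrammetric vs GNSS coordinates (metres)
photo = [[10.02, 5.01, 2.98], [20.01, 4.99, 3.02], [30.00, 5.02, 2.99]]
gnss  = [[10.00, 5.00, 3.00], [20.00, 5.00, 3.00], [30.00, 5.00, 3.00]]
print(rmse(photo, gnss))  # per-axis external accuracy estimate
```

Control points participate in the adjustment (internal accuracy), while check points are withheld and used only for this kind of independent verification.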

Background (evolution through the years)

Military applications

Drones or UASs are an example of how military systems can be applied to civil requirements after a long period of time.

The first UAS attempts started during World War I when the Dayton-Wright Airplane Company invented an unmanned aerial torpedo that would explode at a pre-set time.

During World War II, Reginald Denny Industries produced the first large-scale production drone. About 15,000 drones were made to be used as targets for anti-aircraft gunner training.

The first target drone converted for an unmanned aerial photographic reconnaissance mission on the battlefield was a version of the MQM-57 Falconer (first flown in 1955).

UASs were not widely applied until the 1980s, when the coordinated use of UASs with manned aircraft became popular in applications such as electronic decoys, electronic jammers, and reconnaissance tools.

Military UASs such as the Predator RQ-1L have been deployed from the Balkan wars in the 1990s to current conflicts.

Civil applications in recent times

Recent improvements in the field of Aerial Robots have enabled the use of various multi-rotor platforms (for example, four- and eight-rotor platforms) in the field of SHM, with various implementations focusing on inspection and maintenance of bridges, mostly based on visual inspection methods. However, some of the recent studies have attempted to explore different ways in which aerial robots can be modified to provide contact and perch-based inspection capabilities.

Several recent studies have also proposed the development of hybrid robots, which can provide multiple functionalities (for example, mechanisms for flying and walking and a number of different approaches based on contact and flight).

Some of these platforms have provided a proof of concept with considerable potential for successful use for future bridge inspection.

This is a relatively new field of research to take full advantage of the flexibility and versatility of aerial robotic platforms to access and monitor different components of the bridge infrastructure.

Trends for Aerial Robotics

The trend for UASs is their evolution into Aerial Robotic Systems, using the UAV for bridge inspection tasks that require physical contact between the aerial platform and bridge surfaces [13,14], such as beam deflection analysis or crack depth measurement with an ultrasonic sensor.

These systems take advantage of the aerodynamic ceiling effect when the multirotor approaches the bridge surface. As a consequence, a UAV can be used as a sensor capable of flying and touching the bridge to take measurements during a contact inspection. The numerous practical applications of these systems include measuring beam girder deflection from a bridge using a laser tracking station.

Another common approach is a UAS design based on aerial manipulators, composed of an aerial platform and an articulated robotic arm attached to the top of the multirotor, which has been used to perform inspection tasks that require contact with the bridge. However, the payload requirements for this type of manipulator are high, resulting in large, heavy platforms that are slower and more complex to control.


General points of attention and requirements

Design criteria and requirements for the design of the survey

Procedures for defining layout of the survey

Design constraints (e.g. related to the measurement principles of the monitoring technologies)

Data collection and surveying using UAVs requires specific measures to prevent risky situations and obtain useful data.

Workflow step Description
Survey objectives Determine the target areas and data
Site pre-checking Revise the available geographical information of the location and the operational constraints of the environment
Flight planning Offline flight planning, including: take-off and landing locations; flight speeds and heights; image scale constraints (distance vs focal length); and a preliminary operation safety study
Risk assessment Enumerate, evaluate, and foresee mitigation measures related to the operational risks
Permission application Study the regulatory constraints depending on the risk level of the operation: obtain permission for the flight and/or communicate the specified flight plan to the authority
Data collection Notify any potentially impacted populations about when the aerial survey will start; follow the devised flight plan for data collection, and if any emergency occurs, land the UAV safely

Table 3. Workflow for the planning of a UAV-based survey, following [12]

The field work for bridge inspection must be clearly defined and scheduled in order to reduce the time spent in the field, especially for risky operations. The process consists of the following steps:

  1. Path planning. This consists of defining the six degrees of freedom of the system along the trajectory for the inspection. It depends on the payload sensor, because the field of view of the object must be maintained and the distance to the structure is critical. It can be considered a collection of poses (position and orientation):
    1. Define critical sections: potential locations of cracks or other surface defects, which can be estimated using structural analysis or based on experience. These sections define the Regions of Interest (RoI) of the inspection.
    2. Define critical waypoints: depending on the critical sections and the payload sensor, these are the poses of the best locations for surveying that are likely to detect the damage in the RoI.
    3. Select the optimal path:
      1. Generate paths. Taking into account the RoI and the waypoints, a number of trajectories are proposed.
      2. Calculate coverage and cost.
      3. Select the optimal trajectory or path in terms of cost, including the energy cost of the flight and the coverage needed to obtain good image quality.
  2. Data collection and image quality assessment.
  3. Data analysis and overall accuracy assessment.
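The path selection step (generate, score, select) can be sketched as a weighted score that rewards coverage and penalizes energy cost; the weights and candidate values are illustrative assumptions:

```python
# Sketch: choose among candidate trajectories by rewarding RoI coverage
# and penalizing energy cost. Weights are illustrative assumptions.

def select_path(candidates, w_cost=1.0, w_cov=10.0):
    """candidates: list of dicts with 'name', 'energy_cost', 'coverage' (0-1)."""
    def score(c):
        return w_cov * c["coverage"] - w_cost * c["energy_cost"]
    return max(candidates, key=score)

paths = [
    {"name": "A", "energy_cost": 4.0, "coverage": 0.80},
    {"name": "B", "energy_cost": 6.0, "coverage": 0.95},
    {"name": "C", "energy_cost": 3.0, "coverage": 0.60},
]
print(select_path(paths)["name"])  # A
```

Real planners would add hard constraints (battery reserve, minimum coverage) rather than relying on a single weighted sum.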

Sensibility of measurements to environmental conditions

Since UAVs are platforms that carry sensors as the so-called payload, the environmental conditions affect the trajectory followed by the vehicle, and we should consider:

  • Wind speed: the maximum speed at which the UAV can maintain its stability and flight capabilities. It depends mainly on the type of frame, the size and weight of the platform, and the power of the propellers and motors. The flight controller should account for variations in wind speed, particularly for surveying with high accuracy requirements.
  • Visibility: even though UAVs can perform the mission with a very high degree of autonomy, safety issues and regulations require that a pilot is in charge of the operation. Low visibility also affects optical payloads, which are the most common in the state of the practice.
  • IP protection grade with respect to rain and water: the robustness of the system depends on the ambient moisture/rain.


Procedures for calibration, initialisation, and post-installation verification

Data collection by UAV is highly dependent on the aircraft's navigation system, which is mainly comprised of a GNSS solution for location and an IMU for measuring drone attitude. A lack of proper calibration of the navigation system would result in incorrect or useless data.

The pre-calibration of the navigation system is part of the Pre-Flight Setup, which confirms that the flight will be carried out safely and efficiently [9].

Components to be inspected include motors, propellers, batteries, the ground station and/or remote control, the payload gimbal, and communications. The components to be calibrated are mainly the IMU and, specifically, the magnetometers and compass. To calibrate the compass, several rotations must be made around the three axes of the aircraft.

Regarding image quality and image processing, the state of the technique permits the so-called self-calibration of the camera during the alignment of the images in the photogrammetric process.

Self-calibration takes advantage of automated interest point detection based on different techniques to obtain feature descriptors, such as SIFT, SURF, or ORB. Using these methods, thousands of points can be detected and matched across different images. This database can be used to model the properties of the camera during the data collection, outperforming traditional laboratory or in-field calibration using a known pattern of points.

The components of the calibration model for optical cameras are the principal distance (i.e., focal length), the principal point, and the distortion parameters of the camera.
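A sketch of such a calibration model, assuming a pinhole camera with two radial (Brown) distortion coefficients; all numeric values are illustrative:

```python
# Sketch: pinhole projection with Brown radial distortion.
# f: principal distance (px), (cx, cy): principal point (px),
# k1, k2: radial distortion coefficients. Values are illustrative.

def project(X, Y, Z, f, cx, cy, k1, k2):
    """Project a 3D point in camera coordinates to distorted pixel coords."""
    x, y = X / Z, Y / Z                    # normalized image coordinates
    r2 = x * x + y * y
    d = 1.0 + k1 * r2 + k2 * r2 * r2       # radial distortion factor
    return f * x * d + cx, f * y * d + cy

u, v = project(0.5, 0.2, 5.0,
               f=3500.0, cx=2000.0, cy=1500.0, k1=-0.05, k2=0.01)
print(round(u, 1), round(v, 1))
```

Self-calibration estimates f, (cx, cy), and the k coefficients as additional unknowns of the bundle adjustment, rather than from a laboratory pattern.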

Procedures for estimating the component of measurement uncertainty resulting from calibration of the data acquisition system (calibration uncertainty)

Not Applicable. The operational uncertainty in navigation position and attitude of the UAV is higher than the calibration uncertainty and strongly depends on surveying conditions.

Requirements for data acquisition depending on measured physical quantity (e.g. based on the variation rate)

The operational uncertainty in navigation position and attitude strongly depends on surveying conditions.


Requirements and recommendations for maintenance during operation (in case of continuous maintenance)

Criteria are needed for the successive surveying campaigns used to update the sensor data. The campaigns include: (i) the georeferenced frame, i.e. the global location on the bridge; (ii) alignment of sensor data, i.e. relative alignment of the data collected in a survey; (iii) multi-temporal registration to previous campaigns; and (iv) diagnostics.

General constraints

The sensors onboard a UAS can obtain their position and orientation (POSE) from the navigation sensors used by the UAV platform. Consequently, the absolute accuracy of the data will largely depend on the UAV navigation system.

The positioning provided by standard GPS units is generally adequate for the navigation of small RPASs, which entail a lower risk during navigation. In a number of applications, including those based on visual inspection, the image quality is more important than the POSE accuracy and, thus, standard coarse GPS is sufficient.

If we focus on photogrammetric applications, where the dimensional surveying of the bridge is key, the general workflow consists of complementing the coarse navigation GPS with survey-grade GPS in the field. As described in the photogrammetric discussion, these accurate GPS points must be divided into control points and test points in order to quantify the internal and external accuracy of the data.

This GPS surveying increases the cost of the inspection and is time-consuming. In recent times, small RPAS systems have included RTK positioning systems that can provide centimetric positioning accuracy. As a result, manually surveyed checkpoints are no longer necessary for a number of applications that do not need very high precision, increasing the cost-effectiveness of UAV platforms.

Nevertheless, although network-based RTK GNSS positioning can improve the precision of the survey if needed, the altitude component of the measurements is worse than the planimetric component due to the nature of satellite navigation.

The POSE of the UAV will affect the different sensors onboard the platform. In this document we focus on image-based surveying and we refer the reader to the specific document for other sensors, such as LiDAR and RADAR.

  1. Georeferenced frame

The use of GNSS based POSE makes UAS based surveying a repeatable method, subject to the accuracy of the positioning system.

The general framework for traditional surveying based on external GPS consists of selecting a number of control points, with a minimum of three (to know the internal precision), and performing a transformation to the Global Reference Frame or Global Coordinate System (GCS).

Global coordinates can be achieved from the local 3D photogrammetric coordinates through a conformal 3D Helmert or 7-parameter transformation.

For the transformation between GCSs, there is a need to change both the datum and the projection system.
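A minimal sketch of the conformal 3D Helmert (7-parameter) transformation mentioned above, under a small-angle approximation of the rotation; all parameter values are illustrative:

```python
import numpy as np

# Sketch: 7-parameter (conformal 3D Helmert) transformation
# X_global = s * R * X_local + t, with small rotation angles (radians).

def helmert(points, s, rx, ry, rz, t):
    """Apply the transformation to an (n, 3) array of local coordinates."""
    R = np.array([[1.0, -rz,  ry],
                  [ rz, 1.0, -rx],
                  [-ry,  rx, 1.0]])       # linearised rotation matrix
    return s * (np.asarray(points) @ R.T) + np.asarray(t)

local = [[100.0, 200.0, 10.0], [150.0, 180.0, 12.0]]
out = helmert(local, s=1.0001, rx=1e-5, ry=-2e-5, rz=3e-5,
              t=[500000.0, 4700000.0, 300.0])
print(out.shape)  # (2, 3)
```

In practice the seven parameters are estimated by least squares from the control points, and large rotations require the full (non-linearised) rotation matrix.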

  2. Alignment of sensor data

The alignment of sensor data or relative orientation is possible if there is a redundancy in the spatial distribution of data.

In the case of the photogrammetric process, we described this alignment based on redundant points on the images. This process can be automatized through interest point detection and matching using feature descriptors.

Even though the POSE of the UAV during data collection is not mandatory for the alignment of the data, it can largely speed up the process, providing initial guesses for the solution of a non-linear bundle adjustment that includes the POSEs of the cameras and the 3D coordinates of the thousands of points detected in the images.

  3. Multi-temporal registration to previous campaigns

Multi-temporal registration of successive data collection campaigns is based on the control points gathered during the GPS surveying phase. Having the results of the photogrammetric process (point clouds, images, and orthomosaics) in a GCS enables direct multi-temporal analysis and change detection.

If the UAV platform includes an RTK positioning system and the GNSS survey is not performed, the registration must be based on points that remain fixed between campaigns. These points can be used as initial guesses for image alignment, as fixed points in a Helmert 3D transformation, or as control points in an Iterative Closest Point (ICP) process to register point clouds.

Previous knowledge about fixed points can speed up the GNSS surveying, using these points as control points and reducing the data acquisition to test points that can be analysed to obtain the external accuracy of the final data.

  4. Diagnostics


Lifespan of the technology (if applied for continuous monitoring)

Not Applicable.

Interpretation and validation of results

Expected output (Format, e.g. numbers in a .txt file)

Interpretation (e.g. each number of the file symbolizes the acceleration of a degree of freedom in the bridge)


Specific methods used for validation of results depending on the technique

Quantification of the error

Quantitative or qualitative evaluation

Detection accuracy


Advantages

  • Lower risk for practitioners and operators
  • Access to hard-to-reach areas such as deck bottoms
  • Robust data acquisition on high components
  • Better site visibility and an aerial point of view of the system
  • Cost-effective technique for surveying
  • Faster field work compared to manual data acquisition
  • Pre-defined risk scenarios for the operation that simplify the planning phase


Disadvantages

The main disadvantages in the use of drones are the regulations related to aviation safety and the operational constraints:

  • Limited flight time (autonomy of the system)
  • Payload weight limited by the Maximum Take-Off Weight and regulations
  • Even though there is an EU regulation defined by the European Aviation Safety Agency (EASA), national regulations and local permissions are not standardized
    • A mix between operational safety and system certification
  • Operational constraints, including:
    • Visual Line of Sight (VLOS) restrictions
    • Field conditions, including weather conditions
    • Navigation solutions for areas of interest, such as deck bottoms
  • More complex planning phase
  • Low readiness level for a number of inspection applications, such as Aerial Robotics
  • Limitations for real-time data and post-processing procedures

Possibility of automatising the measurements

Although current regulations require that a pilot be in charge of the operation of remotely piloted aircraft systems (RPAS), the technology allows the consideration of autonomous unmanned aerial systems (UAS) comprising vehicles (UAVs) and other subsystems and components that enable the sensing of bridge properties automatically.

The measurement procedure using UAVs includes the definition of a precise path that the vehicle will follow thanks to its navigation system, based on GNSS and IMU, in what is known as mission planning. The precision of the components of the navigation system determines how closely the real trajectory matches the one defined in the mission plan. The mission includes the planning of infrastructure sampling, which is determined by the position and orientation (pose) of the sensors involved. When the UAV reaches a marked position, the flight controller sends a trigger signal to the sensor to collect data. Optionally, the sensor is oriented towards the area of the asset to be monitored by means of a so-called gimbal.
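As a sketch of the sampling plan, camera trigger positions along a straight flight line can be derived from the desired forward overlap and the image footprint on the target; the function name and all values are illustrative, not a specific flight-planning API:

```python
import math

def plan_strip(start, end, overlap, footprint_along):
    """Compute camera trigger positions along a straight flight line.
    start, end: (x, y, z) waypoints in a local metric frame.
    overlap: desired forward overlap fraction, e.g. 0.8.
    footprint_along: image footprint length on the target, in metres."""
    spacing = footprint_along * (1.0 - overlap)   # distance between exposures
    dx, dy, dz = (e - s for s, e in zip(start, end))
    length = math.sqrt(dx * dx + dy * dy + dz * dz)
    n = int(length // spacing)                    # triggers that fit in the line
    return [(start[0] + dx * i * spacing / length,
             start[1] + dy * i * spacing / length,
             start[2] + dz * i * spacing / length) for i in range(n + 1)]
```

For example, a 100 m strip with a 10 m footprint and 80 % forward overlap yields one exposure every 2 m.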

The georeferencing of the acquired data is done automatically at acquisition time, relating each individual measurement to the vehicle's pose. The accuracy of this automatic georeferencing depends on the accuracy of the navigation subsystem. There are solutions on the market based on real-time kinematic (RTK) differential GNSS that achieve centimetre-level precision in the positioning of the measurements.
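A minimal sketch of this direct georeferencing, assuming a ZYX (yaw-pitch-roll) attitude convention and a known sensor lever arm in the body frame; names and values are illustrative:

```python
import numpy as np

def georeference(p_gnss, roll, pitch, yaw, lever_arm):
    """Direct georeferencing of a sensor origin:
    world position = GNSS antenna position + body-to-world rotation @ lever arm.
    Angles in radians; lever_arm is the sensor offset in the body frame."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    # ZYX (yaw-pitch-roll) rotation from body frame to world frame
    R = np.array([
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ])
    return np.asarray(p_gnss) + R @ np.asarray(lever_arm)
```

The quality of the recovered sensor position is bounded by the GNSS/IMU accuracy, which is why RTK-grade positioning matters for this workflow.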

Automatic data georeferencing can speed up the photogrammetric process: since the positioning of the images is known, it is easier to establish neighbourhood relationships between the images to be processed. In this way, the automatic calculation of the relative orientation, or alignment, of the images can start from an initial solution, and many checks of the geometry of the images are avoided.
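A minimal sketch of how known image positions prune the matching step: instead of running expensive feature matching on every image pair, only pairs whose geotags lie within a chosen radius are proposed as candidates (function and image names are hypothetical):

```python
from itertools import combinations
import math

def candidate_pairs(positions, radius):
    """Select image pairs to match based on geotag proximity, so the
    costly feature-matching stage runs only on nearby images.
    positions: {image_name: (x, y, z)} in a local metric frame."""
    pairs = []
    for (a, pa), (b, pb) in combinations(positions.items(), 2):
        if math.dist(pa, pb) <= radius:   # cheap distance test
            pairs.append((a, b))
    return pairs
```

The distance test itself is cheap; the saving comes from skipping feature matching on the distant pairs it rejects.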

Today, most of the photogrammetric process is automated. The automatic detection of points of interest, or keypoints, in the images allows obtaining the alignment of the images and the coordinates of those keypoints through the intersection of their projection rays. Automatic image processing techniques also support the calculation of disparity maps between each pair of images and, therefore, of the depth of each pixel in the real world. In this way, a dense point cloud is computed, with precision and resolution figures comparable to those obtained by a LiDAR system.
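The per-pixel depth recovered from a disparity map follows the standard stereo relation Z = f·B/d; a minimal sketch with illustrative numbers:

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    """Stereo depth for one pixel: Z = f * B / d, with the focal length f
    in pixels, the baseline B in metres and the disparity d in pixels."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px
```

For instance, a 2000 px focal length, a 5 m baseline between exposures and a 100 px disparity give a depth of 100 m; the relation also shows why small disparities (distant surfaces) carry larger depth uncertainty.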


Open challenges for the adoption of these systems include:

  • Assessment of the cost-effectiveness.
  • Safety.
  • Lack of harmonized regulation.
    • Reactive UAS traffic control that hinders operations.
  • Applicability and robustness of the results.
  • Low TRL of well-known applications of UAS and Aerial Robotics.
  • Results subject to field conditions, including weather and the quality of remotely sensed data.

Existing standards

International Standards

ISO/TC 20/SC 16 Unmanned aircraft systems

Standardization in the field of unmanned aircraft systems (UAS) including, but not limited to, classification, design, manufacture, operation (including maintenance) and safety management of UAS operations.

Code Topic
ISO/TC 20/SC 16/AG 5 Detect And Avoid (DAA)
ISO/TC 20/SC 16/AHG 1 Counter UAS
ISO/TC 20/SC 16/JWG 7 Joint ISO/TC 20/SC 16 - ISO/TC 43/SC 1 WG: Noise measurements for UAS (Unmanned Aircraft systems)
ISO/TC 20/SC 16/WG 1 General
ISO/TC 20/SC 16/WG 2 Product manufacturing and maintenance
ISO/TC 20/SC 16/WG 3 Operations and procedures
ISO/TC 20/SC 16/WG 4 UAS Traffic Management
ISO/TC 20/SC 16/WG 5 Testing and evaluation
ISO/TC 20/SC 16/WG 6 UAS subsystems

Table 4. Working groups in ISO/TC 20/SC 16

CEN prEN 4709 - Aerospace series - Unmanned Aircraft Systems

Part 001: Product requirements and verification

Part 002: Direct Remote identification

Part 003: Geo-awareness requirements

Part 004: Lighting requirements

National Standards

Country Code
... ...


Relevant knowledge fields

Performance Indicators

Type of structure

Spatial scales addressed (whole structure vs specific asset elements)


Available knowledge

Reference projects

  • GIS-Based Infrastructure Management System for Optimized Response to Extreme Events of Terrestrial Transport Networks - SAFEWAY. 2018 - 2022 | European Union | H2020-MG-2016-2017 Ref. 769255-2
  • Healthy and Efficient Routes In Massive Open-Data Based Smart Cities: Smart 3D Modelling: HERMES-S3D. 2014 - 2016 | MINECO | Ref. TIN2013-46801-C4-4-R
  • SITEGI project: Application of Geotechnologies to Infrastructure Management and Inspection. 2011 - 2013 | Technology Centre for Industrial Development, CDTI.



[1]H. González-Jorge, J. Martínez-Sánchez, M. Bueno, and P. Arias, “Unmanned aerial systems for civil applications: A review,” Drones, vol. 1, no. 1, pp. 1–19, 2017, doi: 10.3390/drones1010002.
[2]N. Bolourian, M. M. Soltani, A. H. Albahri, and A. Hammad, “High level framework for bridge inspection using LiDAR-equipped UAV,” ISARC 2017 - Proc. 34th Int. Symp. Autom. Robot. Constr., no. July, pp. 683–688, 2017, doi: 10.22260/isarc2017/0095.
[3]W. W. Greenwood, J. P. Lynch, and D. Zekkos, “Applications of UAVs in Civil Infrastructure,” J. Infrastruct. Syst., vol. 25, no. 2, p. 04019002, 2019, doi: 10.1061/(asce)is.1943-555x.0000464.
[4]J. Martínez-Sánchez, P. Arias, and J. C. Caamaño, “Close Range Photogrammetry: Fundamentals, Principles and Applications in Structures,” Non-Destructive Tech. Eval. Struct. Infrastruct., vol. 11, p. 35, 2016.
[5]S. Lagüela, J. Martínez, J. Armesto, and P. Arias, “Energy efficiency studies through 3D laser scanning and thermographic technologies,” Energy Build., vol. 43, no. 6, 2011, doi: 10.1016/j.enbuild.2010.12.031.
[6]S. Dorafshan and M. Maguire, Bridge inspection: human performance, unmanned aerial systems and automation, vol. 8, no. 3. Springer Berlin Heidelberg, 2018.
[7]M. Hassanalian and A. Abdelkefi, “Classifications, applications, and design challenges of drones: A review,” Prog. Aerosp. Sci., vol. 91, no. April, pp. 99–131, 2017, doi: 10.1016/j.paerosci.2017.04.003.
[8]L. Duque, J. Seo, and J. Wacker, “Synthesis of Unmanned Aerial Vehicle Applications for Infrastructures,” J. Perform. Constr. Facil., vol. 32, no. 4, p. 04018046, 2018, doi: 10.1061/(asce)cf.1943-5509.0001185.
[9]J. Seo, L. Duque, and J. P. Wacker, “Field Application of UAS-Based Bridge Inspection,” Transp. Res. Rec., vol. 2672, no. 12, pp. 72–81, 2018, doi: 10.1177/0361198118780825.
[10]J. Seo, L. Duque, and J. Wacker, “Drone-enabled bridge inspection methodology and application,” Autom. Constr., vol. 94, no. May, pp. 112–126, 2018, doi: 10.1016/j.autcon.2018.06.006.
[11]C. Ordóñez, J. Martínez, P. Arias, and J. Armesto, “A software program for semi-automated measurement of building façades,” Meas. J. Int. Meas. Confed., vol. 43, no. 9, 2010, doi: 10.1016/j.measurement.2010.05.013.
[12]S. Chen, D. F. Laefer, E. Mangina, S. M. I. Zolanvari, and J. Byrne, “UAV Bridge Inspection through Evaluated 3D Reconstructions,” J. Bridg. Eng., vol. 24, no. 4, p. 05019001, 2019, doi: 10.1061/(asce)be.1943-5592.0001343.
[13]L. M. González-Desantos, J. Martínez-Sánchez, H. González-Jorge, M. Ribeiro, J. B. de Sousa, and P. Arias, “Payload for contact inspection tasks with UAV systems,” Sensors (Switzerland), vol. 19, no. 17, 2019, doi: 10.3390/s19173752.
[14]P. J. Sanchez-Cuevas, P. Ramon-Soria, B. Arrue, A. Ollero, and G. Heredia, “Robotic system for inspection by contact of bridge beams using UAVs,” Sensors, vol. 19, no. 2, p. 305, 2019.