Satellite remote sensing

Goal(s)

Main objective

Remote sensing is the term that encompasses, among other techniques, the acquisition of satellite images: the basis of the technique is to collect information about the object to be measured without making physical contact with it, in contrast to on-site observation.

The use of satellite images for monitoring different types of infrastructure (for example bridges) offers two great advantages: the wide coverage achieved in a single analysis and the possibility of recovering historical data through archived images.

These virtues make its use attractive compared to campaigns where on-site data are more difficult and expensive to obtain.

The satellites used to monitor infrastructure have as their main objective the observation and mapping of the Earth's surface.

Description

Data acquisition by remote sensing, through the different satellites, can be of two types, depending on the signal source used to explore the object:

  • Active: the sensor generates its own radiation and receives the signal reflected back. Most active devices use microwaves because they are relatively immune to weather conditions. Active sensors differ in what they transmit (light or radio waves) and in what they determine (for example distance, height, etc.).
  • Passive: the sensor receives radiation emitted or reflected by the Earth. It depends on natural energy (mainly sunlight) bouncing off the target, so it only works under adequate illumination; otherwise there is nothing to reflect. Passive remote sensing uses multispectral or hyperspectral sensors that measure the acquired radiation in multiple bands, which differ in their number of channels, wavelengths, etc. The bands cover spectral regions within and beyond human vision (IR, NIR, TIR, microwave, etc.).

Functioning mode

In order to capture the data, the different satellites employ electromagnetic waves in the following regions [30]; a short classification sketch is given after the list:

1) Photographic ultraviolet: with λ between 0.3 and 0.4 µm. Only this portion of the ultraviolet can be captured with photographic emulsions; the rest is absorbed by the atmosphere and does not reach the Earth's surface.

2) Visible: with λ from 0.4 to 0.7 µm. It is the operating range of most image-producing sensors and the best known, since it corresponds to the sensitivity of the human eye, thus facilitating the interpretation of images.

3) Photographic: with λ between 0.3 and 0.9 µm. It corresponds to the sensitivity range of the photographic films currently in use. It lies within the atmospheric window between 0.3 and 1.35 µm and is the range used in multispectral photography.

4) Reflective region: with λ between 0.3 and 3 µm. It corresponds to the capture of radiation reflected by natural bodies at the ordinary temperatures of the Earth's surface.

5) Emissive region: with λ between 3 and 14 µm. In this region the sensors capture the energy emitted by bodies as a function of their temperature; thermal sensors operate here, and the region is called thermal or emissive IR.

6) Reflective infrared: with λ between 0.7 and 3 µm. It is the region of the IR in which the reflected radiation is captured; photographic systems and multispectral scanners operate here. Two subregions can be considered:

6.a) Near IR, with λ between 0.7 and 1.3 µm, in which high-sensitivity photographic emulsions operate; it corresponds to an atmospheric window spanning the UV, the visible and the near IR.

6.b) Medium IR, with λ between 1.3 and 3 µm. It is the region with the greatest influence of the absorption bands of electromagnetic radiation; the sensors must operate in two atmospheric windows, between 1.5 and 1.8 µm and between 2.0 and 2.4 µm.

7) Optical region: with λ from 0.3 to 15 µm. It includes the entire application range of optical systems such as lenses, prisms and mirrors. Multispectral scanners can operate throughout this region.

8) Microwave: with λ between 0.3 and 300 cm. It corresponds to side-looking airborne radar (SLAR) and synthetic aperture radar (SAR), both active sensors, and to the radiometer as a passive sensor.
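As a quick aid when working with these partly overlapping regions, the minimal sketch below encodes the wavelength ranges listed above and returns every region containing a given wavelength; the dictionary keys and the function name are merely illustrative transcriptions of the list, not part of any standard.

```python
# Spectral regions used in remote sensing, transcribed from the list above.
# Wavelengths are in micrometres (the microwave entry converts cm to µm).
SPECTRAL_REGIONS = {
    "photographic ultraviolet": (0.3, 0.4),
    "visible": (0.4, 0.7),
    "photographic": (0.3, 0.9),
    "reflective": (0.3, 3.0),
    "emissive (thermal IR)": (3.0, 14.0),
    "reflective infrared": (0.7, 3.0),
    "optical": (0.3, 15.0),
    "microwave": (0.3e4, 300.0e4),  # 0.3 cm to 300 cm expressed in µm
}

def regions_for_wavelength(wavelength_um: float) -> list[str]:
    """Return all spectral regions whose range contains the given wavelength (µm)."""
    return [
        name
        for name, (lower, upper) in SPECTRAL_REGIONS.items()
        if lower <= wavelength_um <= upper
    ]

if __name__ == "__main__":
    # Example: 0.55 µm (green light) falls in the visible, photographic,
    # reflective and optical regions.
    print(regions_for_wavelength(0.55))
```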

The class of passive sensors includes photographic sensors, optical-electronic sensors (which combine optics similar to those of a photographic camera with an electronic detection system, as in push broom and whisk broom sensors), imaging spectrometers and antenna sensors (microwave radiometers). The active sensors are LiDAR and radar [31].

There are two types of optical-electronic systems [31]:

  • Whisk broom sensors are the most common in remote sensing. They have a movable mirror that oscillates perpendicular to the flight direction, allowing swaths of land on both sides of the track to be scanned. Each sweep of the mirror sends information from a different swath to the set of detectors.
  • Push broom sensors eliminate the oscillating mirror by using a linear array of many detectors that covers the entire field of view of the sensor. This increases the spatial resolution and reduces geometric errors, since the mobile and less robust part of the whisk broom design is removed; however, calibration becomes more complex, as all detectors must be calibrated at the same time to achieve a homogeneous behaviour.


In addition to the Push Broom and Whisk Broom, there is the microwave radiometer [31]:

  • Microwave radiometers are composed of an antenna, which receives and amplifies the (very weak) microwave signal, and a detector. In this type of system the spatial resolution is inversely proportional to the diameter of the antenna and directly proportional to the wavelength; the resulting resolution is coarse, so radiometers should only be applied in global studies (a rough footprint estimate is sketched below).
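As a rough illustration of that proportionality (the formula is a standard diffraction-limit approximation, not one given in the source), the ground footprint of a radiometer antenna can be estimated as λ·H/D, with H the orbit height and D the antenna diameter; the numbers below are example values only.

```python
def radiometer_footprint_m(wavelength_m: float, height_m: float, antenna_diameter_m: float) -> float:
    """Approximate ground footprint of a microwave radiometer (diffraction limit).

    The footprint grows with wavelength and orbit height and shrinks with
    antenna diameter, matching the proportionality described above.
    """
    return wavelength_m * height_m / antenna_diameter_m

# Example (illustrative values only): a 1.4 GHz (λ ≈ 0.21 m) radiometer with a
# 2 m antenna at 700 km altitude has a footprint on the order of tens of km.
print(f"{radiometer_footprint_m(0.21, 700e3, 2.0) / 1e3:.0f} km")  # ≈ 74 km
```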


Finally, satellites capture information in two different ways, depending on the position of their orbit [31]:

  • Geosynchronous or geostationary: they are located over the Equator in an orbit about 36,000 km above the Earth. They always remain above the vertical of a given point, accompanying the Earth in its rotation (the orbital period for this altitude is checked in the sketch after this list).
  • Sun-synchronous (heliosynchronous) satellites move in generally circular, near-polar orbits (the orbital plane is roughly parallel to the Earth's axis of rotation) so that, taking advantage of the Earth's rotation, they capture images of different points each time they pass through the same point of the orbit. These orbits are only possible between about 300 and 1500 km of altitude, and they are designed so that the satellite always passes over the same point at the same local solar time.
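A short check, using standard values of the Earth's gravitational parameter and radius rather than figures from the source, shows why an orbit roughly 36,000 km above the Equator is geostationary: Kepler's third law gives an orbital period of about 24 h, matching the Earth's rotation, whereas sun-synchronous altitudes give periods of roughly 90 to 120 minutes.

```python
import math

MU_EARTH = 3.986e14      # Earth's gravitational parameter [m^3/s^2]
EARTH_RADIUS = 6.371e6   # mean Earth radius [m]

def orbital_period_hours(altitude_m: float) -> float:
    """Orbital period from Kepler's third law, T = 2*pi*sqrt(a^3 / mu)."""
    semi_major_axis = EARTH_RADIUS + altitude_m
    return 2 * math.pi * math.sqrt(semi_major_axis**3 / MU_EARTH) / 3600

# Geostationary altitude (~36,000 km) gives roughly one day,
# while sun-synchronous altitudes (300-1500 km) give ~1.5-2 hour orbits.
print(f"{orbital_period_hours(36_000e3):.1f} h")   # ≈ 24 h
print(f"{orbital_period_hours(700e3):.2f} h")      # ≈ 1.6 h
```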


In addition, depending on the orientation with which the sensor captures the images, a distinction is made between sensors with [31]:

  • Vertical orientation, typical for satellites of low or medium spatial resolution
  • Oblique orientation, typical of radar
  • Modifiable (pointable) orientation, which appears on high-resolution sensors. It allows a high spatial resolution to be maintained while also achieving a high temporal resolution. Images of the entire Earth's surface are no longer taken systematically; instead, the sensor is oriented on request. The downside is that it is difficult to find archive images afterwards, since only previously ordered images are acquired.

Types

Remote sensing uses certain regions of the electromagnetic spectrum for different systems. According to the type of energy that the systems capture, the sensors can be classified as photographic, optical or microwave sensors.

Photographic systems are all those that capture images with cameras using photographic emulsions sensitive to wavelengths from 0.3 to 0.9 µm (UV to near IR). Optical systems are sensors that capture images at wavelengths from 0.3 to 15 µm. Both types of system capture the electromagnetic energy reflected or emitted by the ground, and the spectral response is recorded in the image.

The microwave system operates from approximately 0.8 mm to 100 cm. The information contained in each pixel of the images captured by the satellite is the result of the fraction of the emitted waves that return to the satellite after reaching the Earth; this information is called backscatter. Its study requires a good knowledge of the underlying physics [30].

Process/event to be detected or monitored

In the optical-electronic system, the incoming radiation is decomposed by the optical components into several wavelengths. Each one is sent to the detector for that region of the spectrum, converted into an electrical signal and finally into a numeric value. These values can be converted into radiance values if the calibration coefficients are known. The mission of imaging spectrometers is to obtain images in a large number of spectral bands, thereby obtaining an almost continuous spectrum of radiation. Radar works in the band between 1 mm and 1 m: artificial microwaves are sent in a certain direction, collide with targets and scatter. The scattered energy is received, amplified and analysed to determine the location and properties of the targets. Because the time it takes the radiation pulse to travel to the target and return can be measured, the distance travelled can be derived and, for example, a DTM can be generated. Radar works in almost any weather condition, so it is a good option in cloudy areas [31].
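The range measurement described above follows directly from the two-way travel time of the pulse, d = c·Δt/2. The sketch below uses an illustrative travel time rather than a value from the source.

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_two_way_time(delay_s: float) -> float:
    """Distance to the target from the two-way travel time of a radar pulse."""
    return SPEED_OF_LIGHT * delay_s / 2

# Example: a pulse returning after ~4.67 ms corresponds to a slant range of
# about 700 km, typical of a low-Earth-orbit SAR satellite.
print(f"{range_from_two_way_time(4.67e-3) / 1e3:.0f} km")  # ≈ 700 km
```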

Physical quantity to be measured

The resolution of the different satellites limits infrastructure monitoring to a general assessment of condition, making it impossible to detect structural problems in detail. In any case, the objective is to detect changes over time, for example movements of a few mm/year (radar satellites) or visual changes larger than roughly 50 cm (optical satellites).

Induced damage to the structure during the measurement

As the images are taken remotely, this type of technique does not cause any damage during measurement.

General characteristics

Measurement type (static or dynamic, local or global, short-term or continuous, etc.)

The emission or reflection of radiation from the Earth's surface is a continuous phenomenon in four dimensions (space, time, wavelength and radiance).

Measurement range

Due to the great diversity of measurements between satellites, there is a wide measurement range for each type of data used. Section 1.8 specifies the measurement resolutions obtained from the different satellites and therefore the existing measurement ranges.

Measurement accuracy

The following are the four types of resolution used in remote sensing [30],[31], which determine the measurement accuracy:

  • Spatial resolution (pixel size): for photographic sensors, the resolution depends on the photographic scale and the flight height. For optical-electronic sensors it also depends on the flight height of the platform, the scanning speed and the number of detectors. In antenna sensors such as radar, it depends on the aperture of the antenna, the flight height and the wavelength at which they work.
  • Spectral resolution (the number and width of the spectral regions for which the sensor collects data): among space sensors, the lowest spectral resolution corresponds to photographic systems operating in the visible and to radar operating in the microwaves.
  • Radiometric resolution (the number of intensity intervals that can be captured): in photographic systems, the radiometric resolution is indicated by the number of grey levels captured by the film. In optical-electronic sensors, the radiometric resolution is given by the number of digital levels into which the sensor converts the analog signal (see the short example after this list).
  • Temporal resolution: the time that elapses between successive acquisitions of images of the same area.
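As a short example of the radiometric case, the number of digital levels is simply two raised to the bit depth; the sketch below (with an illustrative function name) reproduces, among others, the 12-bit quantisation quoted for Sentinel-2 later in Table 9.

```python
def digital_levels(bit_depth: int) -> int:
    """Number of intensity intervals available at a given bit depth (2**bits)."""
    return 2 ** bit_depth

# An 8-bit sensor distinguishes 256 grey levels, a 12-bit sensor
# (e.g. Sentinel-2) distinguishes 4096.
for bits in (8, 10, 12, 16):
    print(bits, "bit ->", digital_levels(bits), "levels")
```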

Background

Currently it is easy to access a large amount of data, but in the beginning satellite information was inaccessible or too expensive. The field has thus evolved from a scarcity of data (when, for example, users might have only a single image available per year) to the current excess of available data.

The first satellite providing images for Earth observation was launched in 1972 by the United States through the Landsat program.

Europe, through the European Union and the European Space Agency (ESA), launched the first satellite of the Sentinel constellation, Sentinel-1A, in 2014 as part of its Copernicus Earth observation program, which continuously offers free images and is one of the most important data sources available to users for their analyses.

In addition, other international agencies have launched their own satellites, enriching the availability of information on the earth.

Currently, the rapid development of electronics and of increasingly smaller, cheaper and more powerful computing devices allows the launch of small satellites or nanosatellites (often by private companies). Added to the images already available from the past, these represent an exponential increase in the availability of Earth observation images, so the present challenge is knowing how to use all this information in an effective and useful way [32].

Performance

General points of attention and requirements

Design criteria and requirements for the design of the survey

Does not apply.

Procedures for defining layout of the survey

Does not apply.

Design constraints (e.g. related to the measurement principles of the monitoring technologies)

To obtain the data or satellite images, it is necessary to analyse in advance the objective of the study to be carried out: depending on what is to be measured, one type of satellite or another is preferable.

It is therefore necessary to be clear about the resolutions offered by the different satellites on the market, as well as the type of data capture (optical or radar satellites, etc.).

Starting from this first decision, it must be considered that meteorological conditions affect data capture. It is therefore necessary to review, and filter by, the acquisition date of the images intended to be used.

Finally, it is necessary to know the different data sources, where the images can be downloaded for use, whether public or private.

Sensitivity of measurements to environmental conditions.

To reduce distortion and other accuracy issues, consider the following factors [40]:
  • Atmospheric conditions: Changes in the atmosphere, sun illumination, and viewing geometries during image capture can impact data accuracy, and result in distortions that can hinder automated information extraction and change detection processes. Humidity, water vapor, and light are common culprits for errors and distortion.


When atmospheric conditions change, reference points can be obscured or lost, which impacts efforts to create accurate measurements from images. Differences in light temperature can lead to color changes that distort data quality and create inconsistencies in derived products such as 3D maps.


  • Altitude and reflectance: Light collected at high elevation goes through a larger column of air before it reaches the sensor. The result is surface reflectance, a phenomenon which can diminish color quality and detail in images.


The difference in reflectance near the surface and at top-of-atmosphere creates substantial changes in color, image resolution, and perspective that may need to be accounted for in normalization. Even on a small scale, variances in altitude between data sets should raise a red flag for cross-referencing and review.

Preparation

Procedures for calibration, initialization, and post-installation verification

Does not apply.

Procedures for estimating the component of measurement uncertainty resulting from calibration of the data acquisition system (calibration uncertainty)

Does not apply.

Requirements for data acquisition depending on measured physical quantity (e.g. based on the variation rate)

  • Study of the environment in which the object / infrastructure is placed.
  • Choice of the type of satellite image appropriate for the case study.
  • Study of the number of images required for analysis.
  • Analysis of the climatic conditions in the study area and discarding of satellite images acquired on days without good weather conditions.
  • Choice of the download platform, as well as of the software used to process the images.

Performance

Requirements and recommendations for maintenance during operation (in case of continuous maintenance)

Since a user of the satellite data cannot carry out maintenance work on the satellites or on the image data sources, the best recommendation that can be given is to continuously verify the availability of the product to be used, always considering the alternative image products on the market.

Criteria for the successive surveying campaigns for updating the sensors. The campaigns include: (i) Georeferenced frame, i.e. the global location on the bridge; (ii) Alignment of sensor data, relative alignment of the data collected in a surveying; (iii) Multi-temporal registration to previous campaigns; and (iv) Diagnostics.

Due to the great diversity of data sources and data types, it is recommended to define an ontology in a "standardized" way, as shown in Figure 35 [33]:

SatRemSens01.png
Figure 35. LOD Life Cycle

Reporting

Does not apply.

Lifespan of the technology (if applied for continuous monitoring)

It is necessary to know the useful life of the different satellites on the market, shown in Figure 36.

SatRemSens02.jpg
Figure 36. Optical (a) and SAR (b) satellite timeline for major Earth Observing satellites with systematic and global coverage. Source: [34].


Many satellites exceed their design life, but it is also possible to start an analysis with images from one satellite and then be unable to continue it due to the lack of further images. In this sense, a good alternative is to combine images from different satellites in the analysis.

Interpretation and validation of results

Expected output (Format, e.g. numbers in a .txt file)

Due to the diversity of satellites and their different uses and technologies, there is a great variety of file output formats, which are very difficult to unify. However, there are several commonly used file formats that can be considered standards [35],[36]:

HDF5. Addresses some of the limitations and deficiencies of older versions of HDF to meet the requirements of current and anticipated computing systems and applications. The improvements in HDF5 include larger file sizes, more objects, multi-thread and parallel I/O, and unified, flexible data models and interfaces. Although it inherits the old version numbering, HDF5 is a new data format and is not backward compatible with older versions of HDF. HDF5 consists of a software package for manipulating HDF5 files, a file format specification describing the low-level objects in a physical disk file, and a user's guide describing the high-level objects exposed by the HDF5 APIs.

The physical layout of HDF5. At the lowest level, an HDF5 file consists of the following components: a super block, B-tree nodes, object headers, a global heap, local heaps, and free space. The HDF5 physical format is specified with three levels of information: level 0 is file identification and definition information; level 1 provides information about the infrastructure of an HDF5 file; level 2 contains the actual data objects and the metadata about those objects (NCSA, 2003c).

The National Imagery Transmission Format (NITF) was designed primarily by the National Geospatial-Intelligence Agency (NGA), formerly the National Imagery and Mapping Agency (NIMA), and is a component of the National Imagery Transmission Format Standard (NITFS). It has been adopted by ISO as an international standard known as the Basic Image Interchange Format (BIIF) (ISO/IEC 12087-5). NITF aims primarily to be a comprehensive format for sharing various kinds of imagery and associated data, including images, graphics, texts, geo- and non-geo-coordinate systems, and metadata, among diverse computing systems and user groups. The format is comprehensive in contents, implementable among different computer systems, extensible for additional data types, simple for pre- and post-processing, and minimal in terms of formatting overhead.

The physical layout of NITF. The top-level NITF file structure includes a file header and one or more data segments, which can be image, graphics, text, data extension, or reserved extension segments.

TIFF and GeoTIFF. The Tagged-Image File Format (TIFF) is designed for raster image data. It is primarily used to describe unsigned-integer bi-level, gray-scale, palette pseudo-color, and three-band full-color image data, but it can also be used to store other types of raster data. Although TIFF is not considered a geospatial data format, its extension GeoTIFF, which includes a standardized definition of geolocation information, is one of the most popular formats for Earth observing remote sensing data.

The TIFF physical layout includes four components: (1) an 8-byte TIFF header containing byte order, TIFF file identifier, and the offset address (in byte) of the first Image File Directory (IFD) in the file; (2) one or more IFDs, each containing the number of directory entries, a sequence of 12-byte directory entries, and the address of the next IFD; (3) directory entries each having a tag number indicating the meaning of the tag, a data type identifier, a data value count containing number of values included in this tag, and an offset containing the file address of the value or value array; and (4) the actual data of a tag. Because the offset is of 4-byte size, the actual value of a tag is directly put in the offset field if and only if there is only one value and the value fits into 4 bytes.
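As a brief, non-authoritative illustration of how such files are typically opened in practice, the sketch below uses the third-party h5py and rasterio Python libraries (a common choice, but only one of many); the file names are placeholders rather than products referenced in this document.

```python
import h5py      # HDF5 access (pip install h5py)
import rasterio  # GeoTIFF access (pip install rasterio)

# Inspect the object hierarchy of an HDF5 granule (placeholder file name).
with h5py.File("example_granule.h5", "r") as hdf:
    hdf.visit(print)  # print the path of every group/dataset in the file

# Read the first band of a GeoTIFF together with its geolocation metadata.
with rasterio.open("example_scene.tif") as src:
    band1 = src.read(1)                   # 2-D array of pixel values
    print(src.crs, src.transform)         # coordinate reference system + affine transform
    print(src.width, src.height, src.count)
```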

Interpretation (e.g. each number of the file symbolizes the acceleration of a degree of freedom in the bridge)

Depending on the analysis to be made of the infrastructure, the type of satellite must be chosen accordingly, since each pixel of the image will contain different information: for example, optical images carry different pixel information than radar images for the same study area or target.

For example (Table 9), the European Space Agency runs the Copernicus program, which operates a constellation of satellites called Sentinel. To monitor infrastructure with the Copernicus program, usually only Sentinel-1 (radar images) or Sentinel-2 (optical images) are used.

Table 9. Example of two types of infrastructure monitoring satellites (instrument characteristics). Source: [37].


Parameter | Sentinel-1 | Sentinel-2
Launch A-unit/B-unit | 2013 / 18 months after A-unit | 2013 / 18 months after A-unit
Design lifetime per unit | 7.25 yrs (consumables for 12 yrs) | 7.25 yrs (consumables for 12 yrs)
Orbit | Sun-sync, 693 km / incl. 98.18° / LTAN 18:00 | Sun-sync, 786 km / LTDN 10:30
Instrument | C-band SAR | MSI (multi-spectral instrument)
Coverage | Global, 20 min imaging per orbit | All land surfaces and coastal waters plus the full Mediterranean Sea between −56° and +84° latitude, 40 min imaging per orbit
Revisit | 12 days (6 days with A- and B-units) | 10 days (5 days with A- and B-units)
Spatial resolution / swath width | Strip mode: 5 × 5 m / 80 km; interferometric wide-swath mode: 5 × 20 m / 250 km (standard mode); extra-wide-swath mode: 20 × 40 m / 400 km; wave mode: 5 × 5 m / 20 × 20 km | Depending on spectral band: 10–20–60 m / 290 km
Spectral coverage / resolution | 5.405 GHz, VV + VH, HH + HV | 13 spectral bands: 443–2190 nm (incl. 3 bands at 60 m for atmospheric correction)
Radiometric resolution / accuracy | 1 dB (3σ) | 12 bit / < 5%

These characteristics mean that, for example, Sentinel-2 is usually used only to visually detect the effects of a collapse, whereas Sentinel-1 is usually used to continuously monitor the movement/displacement of the infrastructure surface.

Validation

Specific methods used for validation of results depending on the technique

Different technical approaches have been developed in the Earth Observation (EO) communities to address the validation problem, resulting in a large variety of methods and terminology. Figure 37 shows the generic structure of the comparison part of a validation process [38]:

SatRemSens03.png
Figure 37. Schematic overview of the general validation process. Source: [38].

Quantification of the error

It is important to clarify what exactly is understood by the terms “error” and “measurement uncertainty”, which are often used interchangeably within the scientific community. The VIM defines measurement uncertainty as a non-negative parameter describing the dispersion of the quantity values attributed to a measurand. Measurement error, on the other hand, is the difference between the measured value and the true value, i.e. a single draw from the probability density function (PDF) determined by the measurement uncertainty. The measurement error can contain both a random and a systematic component: while the former averages out over multiple measurements, the latter does not. Uncertainties in the reference and EO measurements are derived from a consideration of the calibration chain in each system and the statistical properties of the outputs of the measurement system [38].
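A minimal numerical illustration of that decomposition, using synthetic numbers rather than data from [38]: when EO values are compared against reference values, the mean of the differences estimates the systematic component and their standard deviation estimates the random component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: a "true" quantity observed by a reference system and by an
# EO system that has a +0.5 systematic offset and random noise of sigma = 0.2.
true_values = rng.uniform(10.0, 20.0, size=1000)
reference = true_values + rng.normal(0.0, 0.05, size=true_values.size)
eo_measurements = true_values + 0.5 + rng.normal(0.0, 0.2, size=true_values.size)

differences = eo_measurements - reference
print(f"systematic component (bias):   {differences.mean():.3f}")      # ≈ 0.5
print(f"random component (std. dev.):  {differences.std(ddof=1):.3f}")  # ≈ 0.2
```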

Quantitative or qualitative evaluation

The Cal/Val process (Figure 38) starts before the launch of the platform, because that is the only opportunity to directly calibrate and physically characterise the satellite. After launch, the process continues in order to obtain reliable, calibrated Level 1 and Level 2 data. The Cal/Val of a mission includes sensor calibration, algorithm verification, geophysical data validation and intercomparison with other missions, all of which feed into the quantification of uncertainties. The process is improved by comparing multiple independent sources, so that confidence is generated in the veracity of the data [39].

SatRemSens04.png
Figure 38. Steps necessary for comprehensive Cal/Val activities for satellite missions. Source: [39].

Detection accuracy

In addition to the factors mentioned in section 1.4.1.4 (atmospheric conditions, altitude and reflectance), there are other factors that can lead to incorrect satellite images [40]:

  • Documented metadata for cross-referencing: many data errors come from sources that are difficult to pinpoint, such as momentary glitches in connectivity, inconsistencies in light, or other atmospheric distortions in remote sensing; documented metadata helps to trace them.
  • False accuracy: good data practices involve regularly layering and cross-referencing data sets against existing data to pinpoint errors and ensure accuracy.


In addition to these factors, the level of detail is determined by the resolution of the images, since the more pixels (higher resolution) an image has, the more detailed it is; Table 10 relates resolution to approximate accuracy [41].

Table 10. Approximate relationship between satellite resolution and satellite accuracy. Source: [41].


Satellite resolution | Approximate satellite accuracy
0.31 m | < 5.0 m
0.41 m | 3.0 m
0.55 m | 23 m
0.82 m | 9 m
1.50 m | 35 m
0.40 m | 7.8 m
0.50 m | 9.5 m
However, the precision of an image is not directly related to the resolution and is specified less often (and less clearly) than the resolution of an image [41].

Advantages

The use of satellite images offers a series of advantages over other technologies such as:

Wide geographical and temporal coverage of the study area.

Access to free information, depending on the resolution that needs to be reached.

Possibility, depending on the satellite employed, of obtaining images under any climatic and geographical conditions, and therefore high accessibility of information.

Easy to complement or combine with other on-site techniques.

Disadvantages

High-resolution images often come at a high price.

Satellite images require a large storage capacity, as well as considerable computational power to process them.

Need for experts for their use and interpretation.

Depending on the type of satellite image and weather conditions, certain images may not be valid for use.

Possibility of automatising the measurements

The operation of all satellites is conceived with periodic, automated data collection already established, with the exception of data captured on express request (for example after natural disasters); see Table 11.

Table 11. Automation of data collection by the main satellites.


Satellite | Sensor | Spatial resolution | Temporal resolution | Free or charge
Landsat | MSS + TM (Landsat-5), ETM+ (Landsat-7), OLI (Landsat-8) | 30 m | 16 days | Free
Terra/Aqua | MODIS | 250–1000 m | 1–2 days | Free
HJ-1A/B | CCD1/2 | 30 m | 2–4 days | Free
SPOT | HRV (SPOT 1–3), VGT (SPOT-4), HRG/HRS/VGT (SPOT-5) | 1 km | 1 day | Charge
Sentinel-2 | MSI | 10–20 m | 5 days | Free
Sentinel-1 | SAR | 5–40 m | 12 days | Free
COSMO-SkyMed | SAR | 3–15 m | 16 days | Charge
TerraSAR-X | SAR | 3–10 m | 11 days | Charge
ENVISAT | ASAR | 20–500 m | 35 days | Free
RADARSAT-1 | SAR | 10–100 m | 24 days | Charge
RADARSAT-2 | SAR | 3–100 m | 24 days | Charge
ALOS-2 | PALSAR-2 | 25 m | 14 days | Charge

For automating the download of images by the user, there are numerous freely accessible tools and code libraries (for example Google Earth Engine, DIAS, USGS Earth Explorer, etc.).
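As one hedged illustration of such automation, the sketch below uses the Google Earth Engine Python API to count the cloud-filtered Sentinel-2 scenes available over a point of interest. The coordinates, date range and cloud threshold are placeholder values, the collection ID is the Sentinel-2 surface-reflectance collection as named at the time of writing, and an authenticated Earth Engine account is assumed.

```python
import ee  # Google Earth Engine Python API (pip install earthengine-api)

ee.Authenticate()  # one-time interactive authentication
ee.Initialize()

# Placeholder area of interest (lon, lat) and date range.
aoi = ee.Geometry.Point([-8.61, 41.15])

collection = (
    ee.ImageCollection("COPERNICUS/S2_SR")                 # Sentinel-2 surface reflectance
    .filterBounds(aoi)                                     # only scenes covering the AOI
    .filterDate("2022-01-01", "2022-12-31")                # one year of acquisitions
    .filter(ee.Filter.lt("CLOUDY_PIXEL_PERCENTAGE", 20))   # discard very cloudy scenes
)

print("Scenes available:", collection.size().getInfo())
```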

Barriers

The resolution of the images has a limit, and certain infrastructure monitoring tasks require finer detail than this limit allows, making the use of satellite images unfeasible for them.

The images of certain satellites are restricted for civil use.

Existing standards

Since the 1990s, many national and international organizations have participated in the development of spatial data and information infrastructures for facilitating the sharing of spatial data and information among broad geospatial data producers and consumers and for supporting geospatial applications in multiple disciplines. Since remote sensing is one of the major methods for acquiring geospatial data, remote sensing standards are always one of the core standards for construction of any spatial data infrastructure.

For example:

  • The National Spatial Data Infrastructure (NSDI) initiative of the United States uses the remote sensing standards discussed in [42] for the construction of the NSDI (FGDC 2004).
  • Internationally, the intergovernmental Group on Earth Observations (GEO) is leading a worldwide effort to build a Global Earth Observation System of Systems (GEOSS).


According to the principles of information engineering, remote sensing standards can be classified into four general categories based on the subject a standard tries to address: data, processes, organizations and technology [42].

Applicability

Relevant knowledge fields

The purpose of the analysis will be linked to the sensor available on board the satellite. Each satellite has one or more instruments that allow obtaining conventional optical images, radar data, presence of pollutants, temperatures, etc.

That is why, focused on the monitoring of infrastructures, the following topics can be considered as the fields of action of satellites:

  • Civil Engineering
  • Geosciences
  • Civil and environmental protection
  • Climate change
  • Archaeology
  • Urbanism

Performance Indicators

  • Cracks
  • Obstruction/impeding
  • Crushing
  • Debonding
  • Holes
  • Rupture
  • Displacement
  • Deformation

Type of structure

Bridges, roads, railways, buildings, docks or ports, airports, etc.

Spatial scales addressed (whole structure vs specific asset elements)

Satellite-based infrastructure monitoring is more efficient and advisable for large study areas, since this is where the technology is really advantageous compared to on-site techniques.

Materials

Environment in general

Available knowledge

Reference projects

SIRMA:

Strengthening infrastructure risk management in the Atlantic Area

Other

https://skygeo.com/ → Radar images for mining, energy, civil engineering, underground gas storage

https://kartenspace.com/ → satellite monitoring for LINEAR INFRASTRUCTURES, AGRICULTURE FIELDS, MINES, LOGISTIC CENTRES, FOREST, NATURAL DISASTERS, CITIES, OCEANS & PORTS

https://www.orbitaleos.com/ → satellite monitoring for Urban Planning, Deforestation, Infrastructure Monitoring, Gas Leaks, Catastrophe Claims, Power Lines and others.

http://dares.tech/ → Radar images for Mining, Infrastructure, Oil and Gas

https://site.tre-altamira.com/ → Radar images for Mining, Oil and Gas, civil engineering, geohazards

https://satsense.com/ → Radar images for Residential & Commercial Properties, Insurance, Infrastructure and Geotechnical

Bibliography

[1] International vocabulary of metrology - Basic and general concepts and associated terms (VIM). ISO 99. 2007, Technical Committee of Management Board, p. 92..
[2] Bases for design of structures - Assessment of existing structures. ISO 13822. 2010, Technical Committee of Reliability of structures..
[3] Sensors & Circuits: Sensors, Transducers, & Supporting Circuits for Electronic Instrumentation Measurement and Control. J Carr, Joseph. 1993, Prentice Hall..
[4] Surry, D. W. and Stanfield, A. K. Performance technology. [book auth.] J. M. Spector. The Foundations of Instructional Technology. s.l. : M. K. Barbour & M. Orey, 2008..
[5] A. Grosso, D. Inaudi, K. Bergmeister, U. Santa. Monitoring of bridges and concrete stroctures with Fibre Optic Sensors in Europe. IABSE Symposium, Seoul 2001: Cable-Supported Bridges - Challenging Technical Limits. January 2001..
[6] Rafał Sieńko, Łukasz Bednarski. Monitorowanie obiektów budowlanych w sąsiedztwie budowy . Geoinzynieria . 2016, pp. 60-65..
[7] Samuel Vurpillot, Daniele Inaudi, Jean - Marc Ducret. Smart Structures and Materials vol. 2719. Bridge Monitoring by fiber optic deformation sensors: design, emplacemnet and results. San Diego , USA : Society of Photo-Optical Instrumentation Engineers, April 22, 1996..
[8] Jose Luis Santos, Faramarz Farahi. Handbook of Optical Sensors. Boca Raton : CRC Press Taylor & Francis Group, 2015..
[9] Romaniuk, Ryszard S. Miernictwo Światłowodowe Wydanie II Uzupełnione. Warszawa : Instytut Systemów elektronicznych Politechnika Warszawska, 2001..
[10] Glisic, Branko. Fibre optic sensors and behaviour in concrete at early age . PhD thesis. s.l., Switzerland : Laboratory of Stress Analysis (IMAC) Swiss Federal Institute of Technology , 2000..
[11] H.Hartog, Arthur. An Introduction to Distributed Optical Fibre Sensors. An Introduction to Distributed Optical Fibre Sensors. Boca Raton : CRC Press Taylor & Francis Group, 2017, 2, pp. 115-128..
[12] Udd, Eric. Fiber Optic Sensors And Introduction for Engineers and Scientists. New Jersey  : John Wiley & Sons, Inc. , 2006..
[13] Measures, Raymond M. Structural Monitoring with Fiber Optic Technology . Ontario : Institute of Aerospace Studies , 2001..
[14] David Krohn, Trevor MacDougall, Alexis Mendez. Fiber Optic Sensors Fundamentals and Applications . Washington : SPIE PRESS , 2014..
[15] Overview of fiber optic sensors for NDT applications. Alexis Mendez, Tom Graver. Buenos Aires : s.n., 2012. IV NDT Panamerican Conference Buenos Aires..
[16] KEYENCE. Fibre optic sensors FS-N40 Series. Digital Fibre optic Sensor. [Online] 11 30, 2021. https://www.keyence.co.uk/products/sensor/fiber-optic/fs-n40/downloads/..
[17] NERVE-SENSORS. NERVE SENSORS Epsilon Rebar strain sensor. NERVE SENSORS. [Online] 11 30, 2021. [Cited: 11 30, 2021.] www.nerve-sensors.com..
[18] Wybrane zagadnienia monitorowania konstrukcji. Łukasz Bednarski, Rafał Sieńko, Tomasz Howiacki. Szczyrk  : Politechnika Krakowska, Akademia Górniczo-Hutnicza, 2015. XXX Jubileuszowe Ogólnopolskie Warsztaty Pracy Projektanta Konstrukcji. pp. 28-30..
[19] D. Inaudi, S. Vurpillot, G. Martinola, G. Steinmann. SOFO: Structural Monitoring with Fiber Optic Sensors . FIB, Monitoring and Safety Evaluation of Existing Concrete Structures. February 12-13, 1999..
[20] Fiber Optic Sensors and their applications. Fidanboylu, K., Efendioglu, H.S. Karabuk : Fauniversity , 2009. 5th Intenrational Advanced Technologies Symposium ..
[21] Andrea del Grosso, Konrad Bergmeister, Daniele Inaudi, Ulrich Santa. Monitoring of Bridges and Concrete Structures with Fibre Optic Sensors in Europe. s.l. : IABSE Symposium Report , 2001..
[22] Mei, Ying. Error analysis for distributed fibre optic sensing tehcnology based on Brillouin scattering. s.l., USA : University of Cambridge, Department of Philosophy , August 2018..
[24] R. B. Figuera, C. J. R. Silva, P. A. S. Jorge. Development of Advanced Fibre Optic Sensors to Monitor the Durability of Concrete and Reinforced Concrete Structures - SolSensors. s.l. : Universidade do Minho, INESC-TEC and Engiprojects, Lda, 2021..
[25] T. H. Nguyen, T. Venugopala, S. Chen, T. Sun. Fluorescence based fibre optic pH sensor for the pH 10–13 range suitable for corrosion monitoring in concrete structures. Sensors and Actuators B Chemical 191. February 2014, pp. 498-507..
[26] SHM SYSTEMS. SHM Systems Projects. SHM Systems. [Online] 7 12 2021. [Cited: 7 12 2021.] https://www.shmsystem.pl/projects/..
[27] Narodowe Centurm Badań i Rozwoju, Politechnika Rzeszowska, . OptiDeck. Inteligentny system pomostowy z kompozytów polimerowych do budowy i modernizacji drogowych obiektów mostowych. [Online] 08 12 2021. https://optideck.prz.edu.pl/..
[28] J. Biliszczuk, W. Barcik, R.Sieńko. System monitorowania mostu w Puławach. Mosty 4 . 2009, pp. 12-17..
[29] W. Barcik, R. Sieńko, J. Biliszczuk. System monitorowania konstrukcji Mostu Rędzińskiego we Wrocławiu. Mosty 2. 2012, pp. 56-62..
[31] “Plataformas, sensores y canales”. [Online] [Cited: June 24, 2021.] https://www.um.es/geograf/sigmur/teledet/tema03.pdf.
[32] Pultarova, Tereza. The Evolution Of Earth Observation. The Evolution Of Earth Observation. [Online] December 2018. [Cited: 22 Oct 2021.] https://www.ingenia.org.uk/ingenia/issue-77/the-evolution-of-earth-observation..
[33] "Open data from earth observation: From big data to linked open data, through INSPIRE". Zotti, Massimo and La Mantia, Claudio . 2, 2014, Journal of e-Learning and Knowledge Society, Vol. 10..
[34] The role of space-based observation in understanding and responding to active tectonics and earthquakes. Elliott, J. R., Walters, R. J. and Wright, T. J. 1, 2016, Nature communications, Vol. 7, pp. 1-16..
[35] NASA's Earth Science Data Systems (ESDS). Standards and Practices. [Online] [Cited: June 24, 2021.] https://earthdata.nasa.gov/esdis/esco/standards-and-references#deprecated.
[36] A review of remote sensing data formats for earth system observations. Yang, W. 2006, Earth Science Satellite Remote Sensing, pp. 120-145..
[37] ESA's sentinel missions in support of Earth system science. Berger, M., et al. 2012, Remote Sensing of Environment, Vol. 120, pp. 84-90..
[38] Validation practices for satellite‐based Earth observation data across communities. Loew, A., et al. 3, 2017, Reviews of Geophysics, Vol. 55, pp. 779-817..
[39] Towards a European Cal/Val service for earth observation. Sterckx, S, et al. 12, 2020, International Journal of Remote Sensing, Vol. 41, pp. 4496-4511..
[40] mapwaredev. Understanding Errors and Distortion in Remote Sensing. [Online] 4 Jun 2020. [Cited: 24 Jun 2021.] https://mapware.ai/blog/understanding-errors-and-distortion-in-remote-sensing/..
[41] Setyawan, Eric. Intermap. Satellite Imagery: Resolution vs. Accuracy. [Online] 12 August 2019. [Cited: 15 Dec 2021.] https://www.intermap.com/blog/satellite-imagery-resolution-vs.-accuracy..
[42] Critical Evaluation of Remote Sensing Standards. Di, L. 2017, Encyclopedia of GIS, Springer International Publishing, pp. 2187–2196.
[43] Annan, P. GPR Principles, Procedures & Applications. Mississauga, ON, Canada : Sensors and Software Inc., 2003..
[44] Daniels , D. J. Ground Penetrating Radar. London, UK : The institution of Electrical Engineers, 2004..
[45] Jol, H. M. Ground Penetrating Radar: Theory and Applications. Amsterdam, The Netherlands : Elsevier Science, 2009..
[46] Solla, M., Lorenzo, H. and Pérez-Gracia, V. Ground Penetrating Radar: Fundamentals, Methodologies and Applications in Structures and Infrastructures. [book auth.] B. Riveiro and M. Solla. Non-Destructive Techniques for the Evaluation of Structures and Infrastructures. EH Leiden, The Netherlands : CRC Press/Balkema, 2016..
[47] Pajewski, L., Fontul, S. and Solla, M. Ground-penetrating radar for the evaluation and monitoring of transport infrastructures. Innovation in Near-Surface Geophysics. Instrumentation, Application, and Data Processing Methods. Amsterdam, The Netherlands : Elsevier, 2019..
[48] Benedetto, A. and Pajewski, L. Civil Engineering Applications of Ground Penetrating Radar. New York, NY, USA : Springer International, 2015..
[49] A Review of Ground Penetrating Radar Application in Civil Engineering: A 30-Year Journey from Locating and Testing to Imaging and Diagnosis. Wai-Lok Lai, W., Dérobert, X. and Annan, P. 2018, NDT E Int., Vol. 96, pp. 58–78..
[50] A review of GPR application on transport infrastructures: troubleshooting and best practices. Solla, M., Pérez-Gracia, V. and Fontul, S. 672, 2021, Remote Sensing, Vol. 13..
[51] Uncertainty evaluation of the 1 GHz GPR antenna for the estimation of concrete asphalt thickness. Solla, M., et al. 2013, Measurement, Vol. 46, pp. 3032–3040..
[52] Assessment of modern roadways using non-destructive geophysical surveying techniques. Plati, C., Loizos, A. and Gkyrtis, K. 2020, Surveys in Geophysics, Vol. 41, pp. 395-430..
[53] GPR uncertainty modelling and analysis of object depth based on constrained least squares. Xie, F., Lai, W.W.L. and Dérobert, X. 109799, 2021, Measurement..
[54] Leimbach, G. and Löwy, H. Verfahren zur systematischen Erforschung desErdinnern größerer Gebiete mittels elektrischer Wellen. 237944 Germany, 1910..
[55] —. Verfahren zum Nachweis unterirdischer Erzlager oder von Grundwasser mittels elektrischer Wellen. 246836 Germany, 1910..
[56] Hülsenbeck. Prospector Inst Fuer Praktisch Geol, Huelsenbeck&Co Dr. 489434 Germany, 1926..
[57] Stepped frequency ground-penetrating radar survey with a multi-element array antenna: Results from field application on archaeological sites. Linford, N., et al. 2010, Archaeological Prospection..
[58] Underground asset mapping with dualfrequency dual-polarized GPR massive array. Simi, Al., et al. Lecce, Italy : s.n., 2010. Proceedings 13rd International Conference on Ground-Penetrating Radar. pp. 1001-1005..
[59] Processing strategies for high-resolution GPR concrete inspections. Hugenschmidt, J., et al. 2010, NDT E Int., Vol. 43, pp. 334–342..
[60] Autonomous airborne 3D SAR imaging system for subsurface sensing: UWB-GPR on board a UAV for landmine and IED detection. Garcia-Fernandez, M., Alvarez-Lopez, Y. and Las Heras, F. 2357, 2019, Remote Sens., Vol. 11..
[61] A new drone-borne GPR for soil moisture mapping. Wu, K., et al. 111456, 2019, Remote Sens. Environ., Vol. 235..
[62] Towards fully automated tunnel inspection: A survey and future trends. Balaguer, C., et al. Sydney, Australia : s.n., 2014. Proceedings of the 31st International Symposium on Automation and Robotics in Construction and Mining..
[63] Data processing of backfill grouting detected by GPR in shield tunnel and research on equipment of GPR antenna. Xie, X.Y., Chen, Y.F. and Zhou, B. Hong Kong, China : s.n., 2016. Proceedings of the 16th International Conference on Ground Penetrating Radar..
[64] A Train-mounted GPR System for Fast and Efficient Monitoring of Tunnel Health Conditions. Zan, Y.W., Su, G.F. and Li, Z.L. Hong Kong, China : s.n., 2016. Proceedings of the 16th International Conference on Ground Penetrating Radar..
[65] Persico, R., et al. Recommendations for the safety of people and instruments in ground-penetrating radar and near-surface geophysical prospecting. s.l. : European Association of Geoscientists and Engineers, EAGE Publications, 2015..
[66] Comparative measurements of ground penetrating radars used for road and bridge diagnostics in the Czech Republic and France. Stryk, J., et al. 2017, Constr. Build. Mater., Vol. 154, pp. 1199–1206..
[67] Saarenketo, T. Recommendations for guidelines for the use of GPR in asphalt air voids content measurement. Mara Nord Project; Europeiska Unionen. Brussels, Belgium : s.n., 2012..
[68] DMRB3.1.7. Department for Transport, Highway Agency. DMRB 3.1.7.: Design Manual for Roads and Bridges, Advice Notes on the Non-Destructive Testing of Highway Structures—Advice Note 3.5 BA 86/2006: Ground Penetrating Radar (GPR). Birmingham, UK : Department for Transport, Highway Agency, 2006..
[69] DMRB7.3.2. DMRB 7.3.2.: Design Manual for Roads and Bridges, Data for Pavement Assessment—Annex 6 HD 29/2008: Ground-Penetrating Radar (GPR). Birmingham, UK : Highway Agency, 2008..
[70] ME91/16. ME91/16: Methodologies for the Use of Ground-Penetrating Radar in Pavement Condition Surveys. Brussels, Belgium : Belgian Road 445 Research Centre, 2016..
[71] ASTM. Standard Test Method for Determining the Thickness of Bound Pavement Layers Using Short-Pulse Radar; Non-destructive testing of pavement structures; ASTM D4748. West Conshohocken, PA, USA : ASTM, 2004..
[72] GPR system performance compliance according to COST Action TU1208 guidelines. Pajewski, L., et al. 2018, Ground Penetrating Radar, Vol. 1, pp. 104–122..
[73] ASTM. ASTM D6087-08(2015)e1. In Standard Test Method for Evaluating Asphalt-Covered Concrete Bridge Decks Using Ground Penetrating Radar. West Conshohocken, PA, USA : ASTM, 2015..
[74] Estimating active layer thickness and volumetric water content from ground penetrating radar measurements in Barrow, Alaska. Jafarov, E.E., et al. 2017, Geoscience Data Journal, Vol. 4(2), pp. 72-79..
[75] Processing strategies for high-resolution GPR concrete inspections. Hugenschmidt, J., et al. 2010, NDT E Int., Vol. 43, pp. 334–342..
[76] Autonomous airborne 3D SAR imaging system for subsurface sensing: UWB-GPR on board a UAV for landmine and IED detection. Garcia-Fernandez, M., Alvarez-Lopez, Y. and Las Heras, F. 2357, 2019, Remote Sens, Vol. 11..
[77] Development of a Wall Climbing Robotic Ground Penetrating Radar System for Inspection of Vertical Concrete Structures. Howlader, M.O.F., Sattar, T.P. and Dudley, S. 2016, Int. J. Mech. Mechatron. Eng., Vol. 10, pp. 1382–1388..
[78] Wall-climbing robot for visual and GPR inspection. Yangí, L., et al. Wuhan, China : s.n., 2018. Proceedings of the 2018 13th IEEE Conference on Industrial Electronics and Applications (ICIEA). pp. 1004–1009..
[79] Towards 3D Simulation for Disaster Intervention Robot Behaviour Assessment. Bertolino, M. and Tanzi, T.J. 2020, Adv. Radio Sci., Vol. 18, pp. 23–32..
[80] Nondestructive evaluation sensor fusion with autonomous robotic system for civil infrastructure inspection. Gibb, S., et al. 2018, J. Field Robot., Vol. 35, pp. 988–1004..
[81] SHRP. SHRP 2- report S2-R06A-RR-1. In Nondestructive Testing to Identify Concrete Bridge Deck Deterioration. Washington, DC, USA : Transportation Research Board, 2013..
[82] Saarenketo, T., Maijala, P. and Leppäl¨, A. Recommendations for Guidelines for the Use of GPR in Bridge Deck Surveys. s.l. : Publications of Mara Nord Project, 2011..
[83] BASt. BASt-Report B55. Examination of GPR in Combination with Magnetic Techniques for the Determination of Moisture and Salinity of Concrete Bridge Decks with Asphalt Cover. Bundesanstalt für Straßenwesen : Federal Highway Research Institute, 2007..
[84] B10. Document B10—Recommendation for Nondestructive Testing of Civil Engineering Structures by GPR. Berlin, Germany : German Society for Non- Destructive Testing (DGZfP), 2008..
[85] AASHTO. AASHTO R 37-04. Standard Practice for Application of Ground Penetrating Radar (GPR) to Highways. Washington, DC, USA : American Association of State and Highway Transportation Officials, 2004..
[86] ACI. ACI 228.2R-98 (Reapproved 2004). Nondestructive Test Methods for Evaluation of Concrete in Structures. Farmington Hills, MI, USA : American Concrete Institute, 1998..
[87] NCHRP. NCHRP Research Report 848. In Inspection Guidelines for Bridge Post-Tensioning and Stay Cable Systems Using NDE Methods. Washington, DC, USA : TRB’s National Cooperative Highway Research Program, 2017..
[88] Regtien, Paul. Measurement science for engineers. s.l. : Elsevier, 2004.
[89] Semyon, G. Evaluating Measurement Accuracy: A Practical Approach. s.l. : Springer, 2018..
[90] ISO 19159. Geographic information - Calibration and validation of remote sensing imagery sensors and data - Part 1: Optical sensors. s.l. : Technical Committee of Geographic information, 2014..
[91] ISO 7078. Buildings and civil engineering works - Procedures for setting out, measurement and surveying - Vocabulary. s.l. : Technical Committee of Terminology and harmonization of languages, 2020..
[92] ISO 17123. Optics and optical instruments - Field procedures for testing geodetic and surveying instruments - Part 1: Theory. s.l. : Technical Committee of Geodetic and surveying instruments, 2014..
[93] Evaluation of commercially available remote sensors for highway bridge condition assessment. Khatereh, Vaghefi. 2012, Journal of Bridge Engineering, p. 10.
[94] An Evaluation of Commercially Available Remote Sensors for Assessing Highway Bridge Condition. Ahlborn, T. 2010, Michigan Tech Transportation Institute, p. 73..
[95] Convolutional neural network-based safety evaluation method for structures with dynamic responses. Park, Hyo. 2020, Expert Systems with Applications, p. 14.
[96] IEEE Intelligent TranSPOrtation Systems. Chang, Remy. 2004, San Diego, USA, October 3-6, p. 971..
[97] Research on Data Correlation in Structural Health Monitoring System. Qiang, Li. 2013, Advanced Materials Research, p. 11..
[98] Static and Dynamic Structural Health. Radoi, Andrei. 2021, Romanian Journal of Transport Infraestructure, p. 16..
[99] Guideline for Structural Health Monitoring. Rücker, W. 2006, Federal Institute of Materials Research and Testing, p. 63..
[100] Condition monitoring and diagnostics of machines - Vocabulary. ISO 13372. 2012, Technical Committee of Condition monitoring and diagnostics of machine systems, p. 15..
[101] Non-destructive testing - Eddy current testing - Vocabulary. ISO 12718. 2019, Technical Committee of Eddy current testing, p. 39..
[102] Statistics - Vocabulary and symbols - Part 1: General statistical terms and terms used in probability. ISO 3534. 2006, Applications of statistical methods, p. 105..
[103] Sharps injury protection - Requirements and test methods - Part 2: Reusable sharps containers. ISO 23907. 2019, Technical Committee of devices for administration of medicinal products and catheters, p. 17..
[104] Petroleum and natural gas industry - Pipeline transportation systems - Pipeline integrity management specification . ISO 19345. 2019, Technical Committee of Pipeline transportation systems, p. 108..
[105] Vacuum technology - Turbomolecular pumps - Measurement of rapid shutdown torque. 27892, ISO. 2010, Technical Committee of Vacuum technology, p. 16..
[106] —.ISO 27892. 2010, Technical Committee of Vacuum technology, p. 16..
[107] Photography — Archiving Systems — Vocabulary. ISO 19262. 2015, Technical Committee of Photography, p. 44..
[108] Buildings and civil engineering works — Procedures for setting out, measurement and surveying — Vocabulary. ISO 7078. 2020, Technical Committee of Terminology and harmonization of languages, p. 34..
[109] Bazovsky, Igor. Reliability theory and practice. s.l. : Courier Corporation, 2004..
[110] Water quality — Estimation of measurement uncertainty based on validation and quality control data. ISO 11352. 2021, Technical Committee of Physical, chemical and biochemical methods, p. 26..
[111] Petroleum industry — Terminology — Part 1: Raw materials and products. ISO 1998. 1998, Technical Committee of Petroleum and related products, fuels and lubricants from natural or synthetic sources..
[112] Surry, D. W..
[113] Andrea Del Groso, Konrad Bergmeister, Daniele Inaudi, Ulrich Santa. IABSE Symposium Report: Monitoring of bridges and concrete structures with Fibre optic sensors in Europe. s.l. : International Association for Bridge and Structural Engineering, 2001..
[114] Calibration and testing of distributed fiber optic sensors for detection of high energy radiation . Mathew kautzman, Brian Jenkins. 2018, Journal of Directed Energy ..
[115] Integrity monitoring of old steel bridge using fiber optic distributed sensors based on Brillouin scattering. SMARTEC SA, Branko Glisic, Daniele Posenato, Daniele Inaudi. s.l. : The International Society for Optical Engineering, 2007. Proceedings of SPIE..
[116] SMARTEC SA Branko Glisic, Daniele Inaudi. Sensing tape for easy integration of optical fiber sensors in composite structures. Manno : SMARTEC SA, 2007..
[117] Instytut Maszyn Przepływowych im. Roberta Szewalskiego PAN. Specyfikacja Istotnych warunków zamówienia (SIWZ). Dostawa czujników światłowodowych i przetworników . Gdańsk , Polska  : s.n., March 2011..
[118] System monitorowania mostów kompozytowych z wykorzystaniem światłowodowych czujników odkształceń. Tomasz Siwowski, Rafał Sieńko, Łukasz Bednarski. 2017. PROJECT COM-BRIDGE, DEMONSTRATOR+. pp. 50-53..
[119] Annan. 2023..
[120] Annan, P. GPR . .
[121] —. GPR Principles, Procedures & Applications. Mississauga, ON, Canada : Sensors and Software Inc., 2003. p. 278..
[122] Solla, M., Lorenzo, H. and Pérez-Gracia, V. Ground Penetrating Radar: Fundamentals, Methodologies and Applications in Structures and Infrastructures. [book auth.] B. Riveiro and M. Solla. Non-Destructive Techniques for the Evaluation of Structures and Infrastructures. EH Leiden, The Netherlands : CRC Press/Balkema, 2016..
[123] Pajewski, L., Fontul, S. and Solla, M. Ground-penetrating radar for the evaluation and monitoring of transport infrastructures. Innovation in Near-Surface Geophysics. Instrumentation, Application, and Data Processing Methods. Amsterdam, The Netherlands : Elsevier, 2019..