Remote monitoring glossary

For newcomers to the world of remote monitoring, it can be difficult to know where to begin. This glossary will introduce you to some key terms you may encounter in the world of satellites and remote sensing, so you can confidently navigate options for remote property monitoring.

UPSTREAM TECH
Jul 10, 2020


First off, what is remote sensing? Fundamentally, it is a way to scan the earth from above. Remote sensing often refers to satellite-based data collection, but it can be any type of data that is not collected by in-person monitoring (e.g. drones, aerial flights, LiDAR). This geospatial data, or information associated with a specific location, can be used to remotely monitor conservation properties. Let’s dive into some specifics.

First, we’ll cover some basic terminology

Spatial resolution – This is defined by the size of the “pixel” from airborne or satellite imagery. A pixel represents the smallest, non-divisible portion of a digital image – for example, 3-meter resolution imagery means each pixel covers a 3 by 3 meter square of the earth. Smaller pixels capture a clearer picture and can identify smaller features. This can vary from 3 centimeters captured by drones, 30 centimeters to 5 meters by commercial satellites, or 10 meters to 1 kilometer from public domain satellite sources. There are different classifications of low, moderate, and high spatial resolutions. Some satellite imagery companies refer to high-resolution as finer than 1 meter, while others refer to high-resolution as finer than 2 meters. Some academic groups refer to imagery finer than 1 meter as very high resolution. The level of resolution necessary depends on the applications for your work – the graphics below illustrate how images with varying spatial resolution appear on the ground:
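To make the resolution trade-off concrete, here is a small Python sketch (no real imagery involved, just the arithmetic) estimating how many pixels cover a 100-acre property at the resolutions mentioned above:

```python
# Sketch: how many pixels cover a 100-acre property at different resolutions.
# Assumes square pixels; 1 acre is about 4046.86 square meters.
ACRE_M2 = 4046.86

def pixels_for_area(acres, resolution_m):
    """Approximate pixel count for a property at a given spatial resolution."""
    area_m2 = acres * ACRE_M2
    pixel_m2 = resolution_m ** 2  # each pixel covers resolution x resolution meters
    return area_m2 / pixel_m2

for res in [30, 10, 3, 0.5]:
    print(f"{res} m resolution: ~{pixels_for_area(100, res):,.0f} pixels per 100 acres")
```

At 30 meters, 100 acres is only about 450 pixels, while at 0.5 meters it is over 1.6 million, which is why finer imagery can resolve individual buildings.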

Images covering 6, 100, 400 acres at 30, 10, 3, 0.5 meter resolutions

Using the visuals above as an example, if you needed to monitor specific changes to a building, imagery with a spatial resolution between 0.5 and 3 meters might be ideal. Unless you already knew a building was present before reviewing the imagery, the 10-meter imagery isn’t clear enough to evaluate the area in detail. So why not always go for higher resolution? The answer is that high-resolution data is generally captured less frequently, and can be more costly. Figuring out the right data to use is about understanding the trade-offs in cost and frequency of availability, and how those fit your particular needs.

Images of a property captured at 0.3m, 0.5m, 1m, 10m spatial resolution

Temporal resolution – The time it takes for a satellite to orbit and revisit a specific area, which determines how often imagery is captured. For example, this could be daily for commercial satellites, or weekly to monthly for public domain sources.

Spectral resolution – The number and characteristics of wavelengths captured by a satellite, or the range of the electromagnetic spectrum that a sensor can capture. (More on how we can combine these bands to evaluate different characteristics later!)

Basemap – Background imagery that is stitched together from an array of sources, including aerial photography and satellites. This is meant to provide geographic context for your property and orient you to the map location. However, basemaps are aggregate images that may not display the most up-to-date view of the landscape, so they are not ideal for remote monitoring or analysis: you can’t determine when the changes you see occurred.

Next, let’s explore how imagery is captured

Remote data can be captured in a variety of different ways:

  • Airborne sensors on airplanes – The most familiar source is the National Agriculture Imagery Program (NAIP) imagery administered by the United States Department of Agriculture (USDA).
  • Drones – You can learn more about drones in our blog.
  • Satellites – You can learn more about satellites in our blog.
  • Light Detection and Ranging (LiDAR) – A common remote sensing method that uses pulsed laser light beams to measure elevation and provide 3D representations of the ground surface.

And once captured, this data can be interpreted or analyzed in different ways to provide useful insights:

  • Georeferenced Image – An image where each pixel is aligned with a real-world GPS coordinate. This is important when looking at the specific location of elements in an image. Note that with aerial flights, someone might simply look out of the plane or snap images from above, and those photos may or may not be georeferenced. This matters if you want consistent reference points year over year and the ability to compare images captured over time.
  • Orthorectification/Orthoimagery – Orthorectification corrects an image so that the scale is uniform across the terrain, producing an accurate representation of the earth’s surface. Most commercial airborne and satellite imagery, including Google Earth, is orthorectified in order to provide an accurate top-down view aligned to a map grid. This is important for tasks such as measuring distance.
  • Mosaic – This is when multiple images, captured at different times, are combined together to build a composite that fills in gaps of missing data. Sensor failures or clouds may prevent data from being captured for all parts of a property, and mosaics can be useful in covering large geographic areas such as for basemaps.
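Georeferencing in practice stores a transform from pixel indices to map coordinates. Here is a minimal sketch of that idea; the origin coordinates and 3-meter pixel size are made-up illustrative values, not from any real dataset:

```python
# Sketch: mapping a pixel index to a map coordinate, as a georeferenced
# raster's transform does. Origin and pixel size below are made-up values.
def pixel_to_map(row, col, origin_x, origin_y, pixel_size):
    """Return the map coordinate of a pixel's top-left corner.

    Assumes north-up imagery: x grows eastward with columns,
    y shrinks southward with rows.
    """
    x = origin_x + col * pixel_size
    y = origin_y - row * pixel_size
    return x, y

# Pixel (row=100, col=200) in a 3 m raster anchored at (500000, 4650000),
# e.g. UTM coordinates in meters
x, y = pixel_to_map(100, 200, 500000, 4650000, 3)
print(x, y)  # 500600 4649700
```

Real geospatial libraries generalize this with a full affine transform that can also encode rotation, but the core lookup is the same.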

Satellite data comes from both public and commercial satellites:

  • Public domain satellites capture information with a high temporal frequency (often weekly or monthly) to enable consistent monitoring, though the trade-off is a lower spatial resolution (usually 10-30 meters). This data is made publicly available by the organizations that operate the satellites, such as NASA and the European Space Agency.
  • Examples of public domain satellites include MODIS, Landsat, and Sentinel.
  • Commercial satellites are owned by private companies that sell access to higher spatial resolution data than what public sources can provide. This includes companies such as Maxar (formerly DigitalGlobe), Airbus, and Planet, all of whom work with Upstream Tech.

Here’s an overview of Upstream Tech’s satellite sources:

Images of public and commercial satellite sources used by Upstream Tech

Archive imagery – Any imagery that is captured by a satellite (public or commercial) and made available for others to use is called archive imagery. Times of capture can vary widely, from two days ago to two decades ago, and there is not necessarily a way to predict when imagery will be captured for any given geography. Public archive images are available in a variety of ways, including Google Earth. Commercial archive images are available through purchase or subscription.

Tasking – Depending on the type of satellite, it may only capture data when it is instructed to do so. That is called “tasking” a satellite. For example, commercial satellites only gather imagery when they are “tasked” to do so, and then make that data available for others to view and purchase through the archive. Airbus Pleiades and SPOT, Maxar WorldView, and Planet SkySat are examples of on-demand imaging satellites.

Continuous Imagery Capture – Imagery captured continuously as the satellite orbits the Earth, resulting in consistent, worldwide patterns of image capture. MODIS, Landsat, Sentinel, and PlanetScope are examples of continuous capture systems, each following a designated schedule that can range from daily to weekly or monthly.

Finally, let’s evaluate the types of information captured

For a refresher on the electromagnetic spectrum, see image below where the rectangle demonstrates the types of wavelengths captured by satellites:

Electromagnetic spectrum diagram from NASA

Different kinds of data can be captured by different satellite bands to glean information about the ground conditions. The type of data gathered depends on the band of the electromagnetic spectrum:

  • Visible Light Bands – Red, Green, and Blue bands are utilized individually and collectively. When combined, the images look like what the human eye would perceive from above, which makes them very straightforward to interpret.
  • Near-Infrared (NIR) – “Near” because it’s close to visible red light on the spectrum. Healthy, photosynthesizing vegetation reflects NIR strongly, so this band is often used to detect differences in vegetation species and phenology.
  • Shortwave Infrared (SWIR) – Although light in the shortwave infrared wavelengths is invisible to the human eye, this reflected light is helpful in measuring leaf moisture and snow.
  • Thermal Infrared (TIR) – Measures temperature, which can be helpful to quantify irrigation and evapotranspiration.
  • Microwave – This wavelength includes radar, an active sensor that is able to penetrate clouds. Radar can be used to characterize surface texture, which helps with tillage, land subsidence, tree height, soil moisture, and more.

Information gathered from these bands can then be visualized in a variety of ways. Here are some examples:

Truecolor image

Truecolor – Displays ground conditions in a natural color palette, similar to what humans observe.
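A truecolor view is built by stacking the Red, Green, and Blue bands into one image. A minimal numpy sketch, using tiny made-up reflectance arrays rather than real satellite bands:

```python
import numpy as np

# Sketch: combining Red, Green, and Blue bands into a truecolor composite.
# The 1x2 reflectance arrays below are toy values, not real imagery.
red   = np.array([[0.30, 0.10]])
green = np.array([[0.25, 0.30]])
blue  = np.array([[0.20, 0.10]])

# Stack along a new last axis: shape becomes (rows, cols, 3), the layout
# most image-display libraries expect for an RGB picture.
rgb = np.stack([red, green, blue], axis=-1)
print(rgb.shape)  # (1, 2, 3)
```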

NDVI image

Normalized Difference Vegetation Index (NDVI) – Satellite imagery is processed using this index to quantify photosynthetic activity, which can be used to assess changes in plant health. The NDVI value ranges from -1 to 1, with higher values indicating greener, denser vegetation. Many satellites (and drones!) capture this data, but the European Space Agency’s Sentinel satellites are commonly used to evaluate this information on a weekly basis.
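Concretely, NDVI is computed per pixel as (NIR - Red) / (NIR + Red). A minimal numpy sketch with toy reflectance values standing in for real satellite bands:

```python
import numpy as np

# Sketch: NDVI = (NIR - Red) / (NIR + Red), computed per pixel.
# The 2x2 arrays are toy reflectances, not real satellite data.
nir = np.array([[0.60, 0.55],
                [0.30, 0.05]])
red = np.array([[0.10, 0.12],
                [0.20, 0.05]])

ndvi = (nir - red) / (nir + red)
print(np.round(ndvi, 2))  # high values where NIR far exceeds Red (dense vegetation)
```

Pixels where near-infrared reflectance far exceeds red (healthy vegetation) score high; bare soil or water scores near or below zero.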

NDWI and NDSI image

Normalized Difference Water Index (NDWI) – Satellite imagery is processed using this index to quantify the presence of surface water. The NDWI value ranges from -1 to 1, where higher values indicate open surface water (shown in blue), middle values indicate shallow water or moist soil, and low values indicate the absence of surface water (shown in white). As with vegetation, the Sentinel satellites are a great source for monitoring surface water changes.

Normalized Difference Snow Index (NDSI) – Satellite imagery is processed using this index to quantify the presence of snow cover. The NDSI value ranges from -1 to 1, where higher values indicate the presence of snow and lower values correspond to a lack of snow.
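NDWI and NDSI follow the same normalized-difference pattern as NDVI, just with different band pairs (one common NDWI formulation uses Green and NIR; NDSI uses Green and SWIR). A sketch with illustrative reflectance values:

```python
import numpy as np

# Sketch: the shared normalized-difference pattern behind NDWI and NDSI.
# Band values below are illustrative reflectances for two toy pixels:
# index 0 is meant to look like water/snow, index 1 like dry bare ground.
def normalized_difference(band_a, band_b):
    return (band_a - band_b) / (band_a + band_b)

green = np.array([0.30, 0.10])
nir   = np.array([0.05, 0.30])
swir  = np.array([0.02, 0.25])

ndwi = normalized_difference(green, nir)   # high over open water
ndsi = normalized_difference(green, swir)  # high over snow
print(np.round(ndwi, 2), np.round(ndsi, 2))
```

The first toy pixel scores high on both indices, while the second (dry ground) comes out negative, which is the contrast these indices exploit.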

We hope this glossary has helped you to feel more comfortable about embarking on your remote monitoring journey! Curious to learn more? Check out our blog posts on drones, satellites, and stewardship monitoring options.

If you have any additional questions, please feel free to contact us at lens@upstream.tech.