Microwave Remote Sensing, Theory and Application

Introduction

Microwave sensing encompasses both active and passive forms of remote sensing. As described in Chapter 2, the microwave portion of the spectrum covers the range from approximately 1 cm to 1 m in wavelength. Because of their long wavelengths, compared to the visible and infrared, microwaves have special properties that are important for remote sensing. Longer wavelength microwave radiation can penetrate through cloud cover, haze, dust, and all but the heaviest rainfall, as the longer wavelengths are not susceptible to the atmospheric scattering which affects shorter optical wavelengths. This property allows detection of microwave energy under almost all weather and environmental conditions, so that data can be collected at any time.

Passive microwave sensing is similar in concept to thermal remote sensing. All objects emit microwave energy of some magnitude, but the amounts are generally very small. A passive microwave sensor detects the naturally emitted microwave energy within its field of view. This emitted energy is related to the temperature and moisture properties of the emitting object or surface. Passive microwave sensors are typically radiometers or scanners and operate in much the same manner as systems discussed previously except that an antenna is used to detect and record the microwave energy.

Illustration of microwave energy recorded by a passive sensor

The microwave energy recorded by a passive sensor can be emitted by the atmosphere (1), reflected from the surface (2), emitted from the surface (3), or transmitted from the subsurface (4). Because the wavelengths are so long, the energy available is quite small compared to optical wavelengths. Thus, the fields of view must be large to detect enough energy to record a signal. Most passive microwave sensors are therefore characterized by low spatial resolution.

Applications of passive microwave remote sensing include meteorology, hydrology, and oceanography. By looking "at", or "through" the atmosphere, depending on the wavelength, meteorologists can use passive microwaves to measure atmospheric profiles and to determine water and ozone content in the atmosphere. Hydrologists use passive microwaves to measure soil moisture since microwave emission is influenced by moisture content. Oceanographic applications include mapping sea ice, currents, and surface winds as well as detection of pollutants, such as oil slicks.

Active microwave sensors provide their own source of microwave radiation to illuminate the target. Active microwave sensors are generally divided into two distinct categories: imaging and non-imaging. The most common form of imaging active microwave sensor is RADAR. RADAR is an acronym for RAdio Detection And Ranging, which essentially characterizes the function and operation of a radar sensor. The sensor transmits a microwave (radio) signal towards the target and detects the backscattered portion of the signal. The strength of the backscattered signal is measured to discriminate between different targets, and the time delay between the transmitted and reflected signals determines the distance (or range) to the target.

Non-imaging microwave sensors include altimeters and scatterometers. In most cases these are profiling devices which take measurements in one linear dimension, as opposed to the two-dimensional representation of imaging sensors. Radar altimeters transmit short microwave pulses and measure the round trip time delay to targets to determine their distance from the sensor. Generally altimeters look straight down at nadir below the platform and thus measure height or elevation (if the altitude of the platform is accurately known). Radar altimetry is used on aircraft for altitude determination and on aircraft and satellites for topographic mapping and sea surface height estimation. Scatterometers are also generally non-imaging sensors and are used to make precise quantitative measurements of the amount of energy backscattered from targets. The amount of energy backscattered is dependent on the surface properties (roughness) and the angle at which the microwave energy strikes the target. Scatterometry measurements over ocean surfaces can be used to estimate wind speeds based on the sea surface roughness. Ground-based scatterometers are used extensively to accurately measure the backscatter from various targets in order to characterize different materials and surface types. This is analogous to the concept of spectral reflectance curves in the optical spectrum.

For the remainder of this chapter we focus solely on imaging radars. As with passive microwave sensing, a major advantage of radar is the capability of the radiation to penetrate through cloud cover and most weather conditions. Because radar is an active sensor, it can also be used to image the surface at any time, day or night. These are the two primary advantages of radar: all-weather and day or night imaging. It is also important to understand that, because of the fundamentally different way in which an active radar operates compared to the passive sensors we described in Chapter 2, a radar image is quite different from and has special properties unlike images acquired in the visible and infrared portions of the spectrum. Because of these differences, radar and optical data can be complementary to one another as they offer different perspectives of the Earth's surface providing different information content. We will examine some of these fundamental properties and differences in more detail in the following sections.

Before we delve into the peculiarities of radar, let's first look briefly at the origins and history of imaging radar, with particular emphasis on the Canadian experience in radar remote sensing. The first demonstration of the transmission of radio microwaves and reflection from various objects was achieved by Hertz in 1886. Shortly after the turn of the century, the first rudimentary radar was developed for ship detection. In the 1920s and 1930s, experimental ground-based pulsed radars were developed for detecting objects at a distance. The first imaging radars used during World War II had rotating sweep displays which were used for detection and positioning of aircraft and ships. After World War II, side-looking airborne radar (SLAR) was developed for military terrain reconnaissance and surveillance, where a strip of the ground parallel to and offset to the side of the aircraft was imaged during flight. In the 1950s, advances in SLAR led to the development of higher resolution synthetic aperture radar (SAR) for military purposes. In the 1960s these radars were declassified and began to be used for civilian mapping applications. Since that time, the development of airborne and spaceborne radar systems for mapping and monitoring applications has flourished.

Canada initially became involved in radar remote sensing in the mid-1970s. It was recognized that radar might be particularly well-suited for surveillance of our vast northern expanse, which is often cloud-covered and shrouded in darkness during the Arctic winter, as well as for monitoring and mapping our natural resources. Canada's SURSAT (Surveillance Satellite) project, from 1977 to 1979, led to our participation in the (U.S.) SEASAT radar satellite, the first operational civilian radar satellite. The Convair-580 airborne radar program, carried out by the Canada Centre for Remote Sensing following the SURSAT program, in conjunction with radar research programs of other agencies such as NASA and the European Space Agency (ESA), led to the conclusion that spaceborne remote sensing was feasible. In 1987, the Radar Data Development Program (RDDP) was initiated by the Canadian government with the objective of "operationalizing the use of radar data by Canadians". Over the 1980s and early 1990s, several research and commercial airborne radar systems collected vast amounts of imagery throughout the world, demonstrating the utility of radar data for a variety of applications. With the launch of ESA's ERS-1 in 1991, spaceborne radar research intensified, followed by the launches of Japan's J-ERS satellite in 1992, ERS-2 in 1995, and Canada's advanced RADARSAT satellite, also in 1995.

 

Radar Basics

As noted in the previous section, a radar is essentially a ranging or distance measuring device. It consists fundamentally of a transmitter, a receiver, an antenna, and an electronics system to process and record the data. The transmitter generates successive short bursts (or pulses) of microwave energy (A) at regular intervals which are focused by the antenna into a beam (B). The radar beam illuminates the surface obliquely at a right angle to the motion of the platform. The antenna receives a portion of the transmitted energy reflected (or backscattered) from various objects within the illuminated beam (C). By measuring the time delay between the transmission of a pulse and the reception of the backscattered "echo" from different targets, their distance from the radar, and thus their location, can be determined. As the sensor platform moves forward, recording and processing of the backscattered signals builds up a two-dimensional image of the surface.
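The ranging principle described above reduces to a one-line calculation. This sketch (with an invented delay value, used only for illustration) converts a two-way echo delay into slant range:

```python
# Minimal sketch of radar ranging: slant range from the two-way pulse delay.
# The delay value below is invented for illustration.

C = 299_792_458.0  # speed of light, m/s

def slant_range_m(round_trip_delay_s: float) -> float:
    """The pulse travels out and back, so range = c * t / 2."""
    return C * round_trip_delay_s / 2.0

# An echo arriving 66.7 microseconds after transmission came from ~10 km away:
print(round(slant_range_m(66.7e-6) / 1000, 1))  # 10.0 (km)
```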

Microwave region of the spectrum

While we have characterized electromagnetic radiation in the visible and infrared portions of the spectrum primarily by wavelength, the microwave portion of the spectrum is often referenced according to both wavelength and frequency. The microwave region of the spectrum is quite large relative to the visible and infrared, and there are several wavelength ranges or bands in common use which were given code letters during World War II, and these remain to this day.

  • Ka, K, and Ku bands: very short wavelengths used in early airborne radar systems but uncommon today.
  • X-band: used extensively on airborne systems for military reconnaissance and terrain mapping.
  • C-band: common on many airborne research systems (CCRS Convair-580 and NASA AirSAR) and spaceborne systems (including ERS-1 and 2 and RADARSAT).
  • S-band: used on board the Russian ALMAZ satellite.
  • L-band: used onboard American SEASAT and Japanese JERS-1 satellites and NASA airborne system.
  • P-band: longest radar wavelengths, used on NASA experimental airborne research system.
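Because microwave bands are quoted interchangeably by wavelength and frequency, converting between the two (λ = c/f) is routine. The centre frequencies below are typical published values for two systems mentioned above, assumed here for illustration:

```python
# Converting a radar band's centre frequency to wavelength: lambda = c / f.
# The frequencies below are typical published values (assumed for illustration):
# RADARSAT (C-band) ~5.3 GHz, JERS-1 (L-band) ~1.275 GHz.

C = 299_792_458.0  # speed of light, m/s

def wavelength_cm(freq_ghz: float) -> float:
    return C / (freq_ghz * 1e9) * 100.0

print(round(wavelength_cm(5.3), 1))    # 5.7  (C-band, ~5.7 cm)
print(round(wavelength_cm(1.275), 1))  # 23.5 (L-band, ~23.5 cm)
```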



Two radar images of the same agricultural fields

Here are two radar images of the same agricultural fields, each image having been collected using a different radar band. The one on the top was acquired by a C-band radar and the one below was acquired by an L-band radar. You can clearly see that there are significant differences between the way the various fields and crops appear in each of the two images. This is due to the different ways in which the radar energy interacts with the fields and crops depending on the radar wavelength. We will learn more about this in later sections.

Polarization

When discussing microwave energy, the polarization of the radiation is also important. Polarization refers to the orientation of the electric field (recall the definition of electromagnetic radiation from Chapter 1). Most radars are designed to transmit microwave radiation that is either horizontally polarized (H) or vertically polarized (V). Similarly, the antenna receives either the horizontally or vertically polarized backscattered energy, and some radars can receive both. Thus, there can be four combinations of transmit and receive polarizations, as follows:

  • HH - for horizontal transmit and horizontal receive,
  • VV - for vertical transmit and vertical receive,
  • HV - for horizontal transmit and vertical receive, and
  • VH - for vertical transmit and horizontal receive.

The first two polarization combinations are referred to as like-polarized because the transmit and receive polarizations are the same. The last two combinations are referred to as cross-polarized because the transmit and receive polarizations are opposite of one another. These C-band images of agricultural fields demonstrate the variations in radar response due to changes in polarization. The bottom two images are like-polarized (HH and VV, respectively), and the upper right image is cross-polarized (HV). The upper left image is the result of displaying each of the three different polarizations together, one through each of the primary colours (red, green, and blue). Similar to variations in wavelength, depending on the transmit and receive polarizations, the radiation will interact with and be backscattered differently from the surface. Both wavelength and polarization affect how a radar "sees" the surface. Therefore, radar imagery collected using different polarization and wavelength combinations may provide different and complementary information about the targets on the surface.


C-band images

Did you know?

"....Just what do those numbers mean?!"

Typical output products (e.g. RADARSAT imagery) have used 8-bit or 16-bit data formats (digital numbers) for data storage. In order to recover the original, physically meaningful backscatter values (σ° "sigma nought", β° "beta nought") from calibrated radar products, it is necessary to reverse the final steps in the SAR processing chain. For RADARSAT imagery, this includes squaring the digital values and applying a lookup table (which can have range-dependent values). Thus, as you can see, the relationships among the digital numbers in the imagery are not that simple!
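As a loose sketch of the decoding described above (the gain value is invented; real RADARSAT products carry their own range-dependent lookup tables), recovering a backscatter value in decibels might look like:

```python
import math

# Hypothetical sketch only: a real RADARSAT product supplies its own
# range-dependent lookup table; the gain used below is invented.

def dn_to_sigma0_db(dn: float, gain: float) -> float:
    """Square the digital number, apply the calibration gain, convert to dB."""
    return 10.0 * math.log10(dn * dn / gain)

print(dn_to_sigma0_db(100, 1e6))  # ~ -20 dB
```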

 

Viewing Geometry and Spatial Resolution

Imaging geometry of a radar system

The imaging geometry of a radar system is different from the framing and scanning systems commonly employed for optical remote sensing described in Chapter 2. Similar to optical systems, the platform travels forward in the flight direction (A) with the nadir (B) directly beneath the platform. The microwave beam is transmitted obliquely at right angles to the direction of flight illuminating a swath (C) which is offset from nadir. Range (D) refers to the across-track dimension perpendicular to the flight direction, while azimuth (E) refers to the along-track dimension parallel to the flight direction. This side-looking viewing geometry is typical of imaging radar systems (airborne or spaceborne).

Near range


The portion of the image swath closest to the nadir track of the radar platform is called the near range (A) while the portion of the swath farthest from the nadir is called the far range (B).

Incidence angle


The incidence angle is the angle between the radar beam and a line perpendicular to the ground surface (A), and it increases moving across the swath from near to far range. The look angle (B) is the angle at which the radar "looks" at the surface. In the near range, the viewing geometry may be referred to as being steep, relative to the far range, where the viewing geometry is shallow. At all ranges the radar antenna measures the radial line of sight distance between the radar and each target on the surface. This is the slant range distance (C). The ground range distance (D) is the true horizontal distance along the ground corresponding to each point measured in slant range.

Range or across-track resolution

Unlike optical systems, a radar's spatial resolution is a function of the specific properties of the microwave radiation and of geometrical effects. If a Real Aperture Radar (RAR) is used for image formation (as in Side-Looking Airborne Radar), a single transmit pulse and the backscattered signal are used to form the image. In this case, the resolution is dependent on the effective length of the pulse in the slant range direction and on the width of the illumination in the azimuth direction. The range or across-track resolution is dependent on the length of the pulse (P). Two distinct targets on the surface will be resolved in the range dimension if their separation is greater than half the pulse length. For example, targets 1 and 2 will not be separable while targets 3 and 4 will. Slant range resolution remains constant, independent of range. However, when projected into ground range coordinates, the resolution in ground range will be dependent on the incidence angle. Thus, for a fixed slant range resolution, the ground range resolution cell will shrink (the resolution becomes finer) with increasing range.
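The half-pulse-length rule and its projection into ground range can be sketched as follows; the pulse duration and incidence angles are assumed values chosen only for illustration:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def slant_range_resolution(pulse_s: float) -> float:
    """Two targets are resolved if separated by more than half the pulse length."""
    return C * pulse_s / 2.0

def ground_range_resolution(pulse_s: float, incidence_deg: float) -> float:
    """Slant-range resolution projected onto the ground (finer at larger angles)."""
    return slant_range_resolution(pulse_s) / math.sin(math.radians(incidence_deg))

tau = 0.1e-6  # a 0.1 microsecond pulse (assumed for illustration)
print(round(slant_range_resolution(tau), 1))       # 15.0 m in slant range
print(round(ground_range_resolution(tau, 30), 1))  # 30.0 m at 30 deg (near range)
print(round(ground_range_resolution(tau, 60), 1))  # 17.3 m at 60 deg (far range)
```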

Azimuth or along-track resolution

The azimuth or along-track resolution is determined by the angular width of the radiated microwave beam and the slant range distance. This beamwidth (A) is a measure of the width of the illumination pattern. As the radar illumination propagates to increasing distance from the sensor, the azimuth resolution increases (becomes coarser). In this illustration, targets 1 and 2 in the near range would be separable, but targets 3 and 4 at further range would not. The radar beamwidth is inversely proportional to the antenna length (also referred to as the aperture) which means that a longer antenna (or aperture) will produce a narrower beam and finer resolution.
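The proportionality above can be made concrete. The wavelength, antenna length, and ranges below are assumed values, chosen only to show how the azimuth cell of a real aperture system widens with distance:

```python
# Real aperture azimuth resolution: beamwidth ~ wavelength / antenna length,
# so the azimuth cell is (slant range * wavelength / antenna length).
# All numbers below are assumed for illustration.

def real_aperture_azimuth_resolution(wavelength_m: float,
                                     antenna_m: float,
                                     slant_range_m: float) -> float:
    return slant_range_m * wavelength_m / antenna_m

# C-band (5.7 cm) from a 1.5 m airborne antenna:
print(round(real_aperture_azimuth_resolution(0.057, 1.5, 10_000)))  # 380 m at 10 km
print(round(real_aperture_azimuth_resolution(0.057, 1.5, 20_000)))  # 760 m at 20 km
```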

Finer range resolution can be achieved by using a shorter pulse length, which can be done within certain engineering design restrictions. Finer azimuth resolution can be achieved by increasing the antenna length. However, the actual length of the antenna is limited by what can be carried on an airborne or spaceborne platform. For airborne radars, antennas are usually limited to one to two metres; for satellites they can be 10 to 15 metres in length. To overcome this size limitation, the forward motion of the platform and special recording and processing of the backscattered echoes are used to simulate a very long antenna and thus increase azimuth resolution.

Increase azimuth resolution

This figure illustrates how this is achieved. As a target (A) first enters the radar beam (1), the backscattered echoes from each transmitted pulse begin to be recorded. As the platform continues to move forward, all echoes from the target for each pulse are recorded during the entire time that the target is within the beam. The point at which the target leaves the view of the radar beam (2) some time later determines the length of the simulated or synthesized antenna (B). Targets at far range, where the beam is widest, will be illuminated for a longer period of time than objects at near range. The expanding beamwidth, combined with the increased time a target is within the beam as ground range increases, balances out, such that the resolution remains constant across the entire swath. This method of achieving uniform, fine azimuth resolution across the entire imaging swath is called synthetic aperture radar, or SAR. Most airborne and spaceborne radars employ this type of radar.
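A standard textbook result, not derived in the text (so take it as an assumed idealization), is that the achievable SAR azimuth resolution is about half the real antenna length, independent of range and wavelength:

```python
# Idealized SAR azimuth resolution: roughly half the real antenna length,
# independent of range — a textbook limit, assumed here, not from the text.

def sar_azimuth_resolution(antenna_m: float) -> float:
    return antenna_m / 2.0

print(sar_azimuth_resolution(10.0))  # 5.0 m for a 10 m spaceborne antenna
print(sar_azimuth_resolution(1.5))   # 0.75 m for a 1.5 m airborne antenna
```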

 

Radar Image Distortions

Slant-range scale distortion

As with all remote sensing systems, the viewing geometry of a radar results in certain geometric distortions on the resultant imagery. However, there are key differences for radar imagery which are due to the side-looking viewing geometry, and the fact that the radar is fundamentally a distance measuring device (i.e. measuring range). Slant-range scale distortion occurs because the radar is measuring the distance to features in slant-range rather than the true horizontal distance along the ground. This results in a varying image scale, moving from near to far range. Although targets A1 and B1 are the same size on the ground, their apparent dimensions in slant range (A2 and B2) are different. This causes targets in the near range to appear compressed relative to the far range. Using trigonometry, ground-range distance can be calculated from the slant-range distance and platform altitude to convert to the proper ground-range format.
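A minimal flat-Earth sketch of the trigonometry mentioned above: the platform altitude and the measured slant range form a right triangle with the ground range. Real processors also account for Earth curvature and terrain, which this ignores:

```python
import math

# Flat-Earth slant-to-ground range conversion (a simplifying assumption):
# ground range is the horizontal side of the right triangle formed by the
# platform altitude and the measured slant range.

def ground_range(slant_m: float, altitude_m: float) -> float:
    return math.sqrt(slant_m**2 - altitude_m**2)

# A 10 km slant range from a platform at 6 km altitude:
print(round(ground_range(10_000.0, 6_000.0)))  # 8000 m
```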

This conversion comparison shows a radar image in slant-range display (top) where the fields and the road in the near range on the left side of the image are compressed, and the same image converted to ground-range display (bottom) with the features in their proper geometric shape.

Similar to the distortions encountered when using cameras and scanners, radar images are also subject to geometric distortions due to relief displacement. As with scanner imagery, this displacement is one-dimensional and occurs perpendicular to the flight path. However, the displacement is reversed, with targets being displaced towards, instead of away from, the sensor. Radar foreshortening and layover are two consequences which result from relief displacement.

When the radar beam reaches the base of a tall feature tilted towards the radar (e.g. a mountain) before it reaches the top, foreshortening will occur. Again, because the radar measures distance in slant-range, the slope (A to B) will appear compressed and the length of the slope will be represented incorrectly (A' to B'). The severity of foreshortening will vary depending on the angle of the hillside or mountain slope in relation to the incidence angle of the radar beam. Maximum foreshortening occurs when the radar beam is perpendicular to the slope such that the slope, the base, and the top are imaged simultaneously (C to D). The length of the slope will be reduced to an effective length of zero in slant range (C'D'). The figure below shows a radar image of steep mountainous terrain with severe foreshortening effects. The foreshortened slopes appear as bright features on the image.

Radar image of steep mountainous terrain

Layover

Layover occurs when the radar beam reaches the top of a tall feature (B) before it reaches the base (A). The return signal from the top of the feature will be received before the signal from the bottom. As a result, the top of the feature is displaced towards the radar from its true position on the ground, and "lays over" the base of the feature (B' to A'). Layover effects on a radar image look very similar to effects due to foreshortening. As with foreshortening, layover is most severe for small incidence angles, at the near range of a swath, and in mountainous terrain.

Layover effects

Both foreshortening and layover result in radar shadow. Radar shadow occurs when the radar beam is not able to illuminate the ground surface. Shadows occur in the down range dimension (i.e. towards the far range), behind vertical features or slopes with steep sides. Since the radar beam does not illuminate the surface, shadowed regions will appear dark on an image as no energy is available to be backscattered. As incidence angle increases from near to far range, so will shadow effects as the radar beam looks more and more obliquely at the surface. This image illustrates radar shadow effects on the right side of the hillsides which are being illuminated from the left.

Red surfaces are completely in shadow. Black areas in the image are shadowed and contain no information.

Radar shadow effects



Did you know?

"...look to the left, look to the right, stand up, sit down..."

RADARSAT image of Jordan

...although a radar's side-looking geometry can result in several image effects such as foreshortening, layover, and shadow, this geometry is exactly what makes radar so useful for terrain analysis. These effects, if not too severe, actually enhance the visual appearance of relief and terrain structure, making radar imagery excellent for applications such as topographic mapping and identifying geologic structure.

 

Target Interaction and Image Appearance

The brightness of features in a radar image is dependent on the portion of the transmitted energy that is returned back to the radar from targets on the surface. The magnitude or intensity of this backscattered energy is dependent on how the radar energy interacts with the surface, which is a function of several variables or parameters. These parameters include the particular characteristics of the radar system (frequency, polarization, viewing geometry, etc.) as well as the characteristics of the surface (landcover type, topography, relief, etc.). Because many of these characteristics are interrelated, it is impossible to separate out each of their individual contributions to the appearance of features in a radar image. Changes in the various parameters may affect the response of other parameters, and together these determine the amount of backscatter. Thus, the brightness of features in an image is usually a combination of several of these variables. However, for the purposes of our discussion, we can group these characteristics into three areas which fundamentally control radar energy/target interactions. They are:

  • Surface roughness of the target
  • Radar viewing and surface geometry relationship
  • Moisture content and electrical properties of the target

Surface roughness

The surface roughness of a feature controls how the microwave energy interacts with that surface or target and is generally the dominant factor in determining the tones seen on a radar image. Surface roughness refers to the average height variations in the surface cover from a plane surface, and is measured on the order of centimetres. Whether a surface appears rough or smooth to a radar depends on the wavelength and incidence angle.

Smooth surface

Simply put, a surface is considered "smooth" if the height variations are much smaller than the radar wavelength. When the surface height variations begin to approach the size of the wavelength, the surface will appear "rough". Thus, a given surface will appear rougher as the wavelength becomes shorter and smoother as the wavelength becomes longer. A smooth surface (A) causes specular reflection of the incident energy (generally away from the sensor) and thus only a small amount of energy is returned to the radar. This results in smooth surfaces appearing as darker toned areas on an image. A rough surface (B) will scatter the energy approximately equally in all directions (i.e. diffusely) and a significant portion of the energy will be backscattered to the radar. Thus, rough surfaces will appear lighter in tone on an image. Incidence angle, in combination with wavelength, also plays a role in the apparent roughness of a surface. For a given surface and wavelength, the surface will appear smoother as the incidence angle increases. Thus, as we move farther across the swath, from near to far range, less energy will be returned to the sensor and the image will become increasingly darker in tone.
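A common quantitative rule of thumb for this smooth/rough distinction is the Rayleigh criterion, which is not stated explicitly in the text but captures both dependencies described above (wavelength and incidence angle):

```python
import math

# Rayleigh criterion (a common rule of thumb, assumed here, not from the text):
# a surface appears smooth to the radar when its height variation h satisfies
# h < lambda / (8 * cos(theta)), with theta the incidence angle.

def appears_smooth(h_cm: float, wavelength_cm: float, incidence_deg: float) -> bool:
    return h_cm < wavelength_cm / (8.0 * math.cos(math.radians(incidence_deg)))

# A 1 cm surface variation: rough to C-band (5.7 cm), smooth to L-band (23.5 cm).
print(appears_smooth(1.0, 5.7, 30))   # False -> appears rough
print(appears_smooth(1.0, 23.5, 30))  # True  -> appears smooth
```

Note that increasing the incidence angle shrinks cos(θ) and so raises the smoothness threshold, matching the statement above that a given surface appears smoother at larger incidence angles.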

Local incidence angle

We have already discussed incidence or look angle in relation to viewing geometry and how changes in this angle affect the signal returned to the radar. However, in relation to surface geometry, and its effect on target interaction and image appearance, the local incidence angle is a more appropriate and relevant concept. The local incidence angle is the angle between the radar beam and a line perpendicular to the slope at the point of incidence (A). Thus, the local incidence angle takes into account the local slope of the terrain in relation to the radar beam. With flat terrain, the local incidence angle is the same as the look angle (B) of the radar. For terrain with any type of relief, this is not the case. Generally, slopes facing towards the radar will have small local incidence angles, causing relatively strong backscattering to the sensor, which results in a bright-toned appearance in an image.

As the concept of local incidence angle demonstrates, the relationship between viewing geometry and the geometry of the surface features plays an important role in how the radar energy interacts with targets and their corresponding brightness on an image. Variations in viewing geometry will accentuate and enhance topography and relief in different ways, such that varying degrees of foreshortening, layover, and shadow (section 3.4) may occur depending on surface slope, orientation, and shape.

Look direction or aspect angle

The look direction or aspect angle of the radar describes the orientation of the transmitted radar beam relative to the direction or alignment of linear features on the surface. The look direction can significantly influence the appearance of features on a radar image, particularly when ground features are organized in a linear structure (such as agricultural crops or mountain ranges). If the look direction is close to perpendicular to the orientation of the feature (A), then a large portion of the incident energy will be reflected back to the sensor and the feature will appear as a brighter tone. If the look direction is more oblique in relation to the feature orientation (B), then less energy will be returned to the radar and the feature will appear darker in tone. Look direction is important for enhancing the contrast between features in an image. It is particularly important to have the proper look direction in mountainous regions in order to minimize effects such as layover and shadowing. By acquiring imagery from different look directions, it may be possible to enhance identification of features with different orientations relative to the radar.

Corner reflection

Features which have two (or more) surfaces (usually smooth) at right angles to one another, may cause corner reflection to occur if the 'corner' faces the general direction of the radar antenna. The orientation of the surfaces at right angles causes most of the radar energy to be reflected directly back to the antenna due to the double bounce (or more) reflection. Corner reflectors with complex angular shapes are common in urban environments (e.g. buildings and streets, bridges, other man-made structures). Naturally occurring corner reflectors may include severely folded rock and cliff faces or upright vegetation standing in water. In all cases, corner reflectors show up as very bright targets in an image, such as the buildings and other man-made structures in this radar image of a city.

Radar image of a city

The presence (or absence) of moisture affects the electrical properties of an object or medium. Changes in the electrical properties influence the absorption, transmission, and reflection of microwave energy. Thus, the moisture content will influence how targets and surfaces reflect energy from a radar and how they will appear on an image. Generally, reflectivity (and image brightness) increases with increased moisture content. For example, surfaces such as soil and vegetation cover will appear brighter when they are wet than when they are dry.

When a target is moist or wet, scattering from the topmost portion (surface scattering) is the dominant process taking place. The type of reflection (ranging from specular to diffuse) and the magnitude will depend on how rough the material appears to the radar. If the target is very dry and the surface appears smooth to the radar, the radar energy may be able to penetrate below the surface, whether that surface is discontinuous (e.g. forest canopy with leaves and branches), or a homogeneous surface (e.g. soil, sand, or ice). For a given surface, longer wavelengths are able to penetrate further than shorter wavelengths.

Volume scattering

If the radar energy does manage to penetrate through the topmost surface, then volume scattering may occur. Volume scattering is the scattering of radar energy within a volume or medium, and usually consists of multiple bounces and reflections from different components within the volume. For example, in a forest, scattering may come from the leaf canopy at the tops of the trees, the leaves and branches further below, and the tree trunks and soil at the ground level. Volume scattering may serve to decrease or increase image brightness, depending on how much of the energy is scattered out of the volume and back to the radar.



Did you know?

"...rivers in the Sahara desert?...you're crazy!..."

... that an L-band radar (23.5 cm wavelength) imaging from the orbiting space shuttle was able to discover ancient river channels beneath the Sahara Desert in Northern Africa? Because of the long wavelength and the extreme dryness of the sand, the radar was able to penetrate several metres below the desert surface to reveal old river beds from ancient times when this area was not so dry.

 

Radar Image Properties

speckle

All radar images appear with some degree of what we call radar speckle. Speckle appears as a grainy "salt and pepper" texture in an image. This is caused by random constructive and destructive interference from the multiple scattering returns that will occur within each resolution cell. As an example, a homogeneous target, such as a large grass-covered field, without the effects of speckle would generally result in light-toned pixel values on an image (A). However, reflections from the individual blades of grass within each resolution cell result in some image pixels being brighter and some being darker than the average tone (B), such that the field appears speckled.

Grass covered field
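The statistics of this effect can be sketched with a simple random-phasor simulation (a hypothetical model for illustration, not part of the source): each resolution cell sums the returns from many scatterers with random phases, so even a perfectly uniform field produces widely varying pixel intensities.

```python
import numpy as np

rng = np.random.default_rng(0)

def speckled_cell(mean_intensity, n_scatterers=100):
    # One resolution cell: many elementary scatterers (e.g. blades of
    # grass) with random phases interfere constructively/destructively.
    amp = np.sqrt(mean_intensity / n_scatterers)
    phases = rng.uniform(0, 2 * np.pi, n_scatterers)
    field = np.sum(amp * np.exp(1j * phases))
    return np.abs(field) ** 2

# A homogeneous grass field: every cell has the same true reflectivity,
# yet the measured intensities vary widely (the "salt and pepper" look).
pixels = np.array([speckled_cell(1.0) for _ in range(10000)])
print(pixels.mean(), pixels.std())
```

For fully developed speckle the standard deviation of the intensity is as large as the mean itself, which is why speckle is so visually disruptive.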

Speckle is essentially a form of noise which degrades the quality of an image and may make interpretation (visual or digital) more difficult. Thus, it is generally desirable to reduce speckle prior to interpretation and analysis. Speckle reduction can be achieved in two ways:

  • multi-look processing, or
  • spatial filtering.

Multi-look processing

Multi-look processing refers to the division of the radar beam (A) into several (in this example, five) narrower sub-beams (1 to 5). Each sub-beam provides an independent "look" at the illuminated scene, as the name suggests. Each of these "looks" will also be subject to speckle, but by summing and averaging them together to form the final output image, the amount of speckle will be reduced.
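The statistical benefit of multi-looking can be illustrated with a toy simulation (hypothetical numbers, for illustration only): averaging N independent looks reduces the speckle standard deviation by roughly a factor of the square root of N.

```python
import numpy as np

rng = np.random.default_rng(1)

# Fully developed speckle: each look's intensity is exponentially
# distributed around the true reflectivity (here 1.0).
n_pixels, n_looks = 10000, 5
looks = rng.exponential(1.0, size=(n_looks, n_pixels))

single_look = looks[0]            # one "look": heavy speckle
multi_look = looks.mean(axis=0)   # sum and average the five "looks"

print(single_look.std())  # about 1.0
print(multi_look.std())   # about 1/sqrt(5), i.e. roughly 0.45
```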

Small window

While multi-looking is usually done during data acquisition, speckle reduction by spatial filtering is performed on the output image in a digital (i.e. computer) image analysis environment. Speckle reduction filtering consists of moving a small window of a few pixels in dimension (e.g. 3x3 or 5x5) over each pixel in the image, applying a mathematical calculation using the pixel values under that window (e.g. calculating the average), and replacing the central pixel with the new value. The window is moved along in both the row and column dimensions one pixel at a time, until the entire image has been covered. By calculating the average of a small window around each pixel, a smoothing effect is achieved and the visual appearance of the speckle is reduced.
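The moving-window procedure described above can be sketched as a small averaging filter (a minimal illustration; operational speckle filters are often more elaborate):

```python
import numpy as np

def average_filter(image, size=3):
    """Slide a size x size window over every pixel and replace the
    central pixel with the mean of the values under the window."""
    pad = size // 2
    padded = np.pad(image, pad, mode="edge")  # repeat edge pixels at borders
    out = np.empty(image.shape, dtype=float)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            out[r, c] = padded[r:r + size, c:c + size].mean()
    return out
```

Replacing the `.mean()` call with `np.median(...)` gives a median filter, another common choice for speckle reduction.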

Speckle reduction using an averaging filter

Speckle reduction using an averaging filter

This graphic shows a radar image before (top) and after (bottom) speckle reduction using an averaging filter. The median (or middle) value of all the pixels underneath the moving window is also often used to reduce speckle. Other more complex filtering calculations can be performed to reduce speckle while minimizing the amount of smoothing taking place.

Both multi-look processing and spatial filtering reduce speckle at the expense of resolution, since they both essentially smooth the image. Therefore, the amount of speckle reduction desired must be balanced with the particular application the image is being used for, and the amount of detail required. If fine detail and high resolution is required then little or no multi-looking/spatial filtering should be done. If broad-scale interpretation and mapping is the application, then speckle reduction techniques may be more appropriate and acceptable.

Another property peculiar to radar images is slant-range distortion, which was discussed in some detail in section 3.4. Features in the near-range are compressed relative to features in the far range due to the slant-range scale variability. For most applications, it is desirable to have the radar image presented in a format which corrects for this distortion, to enable true distance measurements between features. This requires the slant-range image to be converted to 'ground range' display. This can be done by the radar processor prior to creating an image or after data acquisition by applying a transformation to the slant range image. In most cases, this conversion will only be an estimate of the geometry of the ground features due to the complications introduced by variations in terrain relief and topography.
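For flat terrain, the slant-range to ground-range conversion is simple trigonometry (a flat-earth sketch; as noted above, real conversions must also account for terrain relief):

```python
import math

def ground_range(slant_range_m, altitude_m):
    # Flat-earth geometry: platform altitude, ground range and slant
    # range form a right triangle.
    return math.sqrt(slant_range_m**2 - altitude_m**2)

# A target at 15 km slant range, seen from 9 km altitude, lies
# 12 km away on the ground:
print(ground_range(15_000, 9_000))  # 12000.0
```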

A radar antenna transmits more power in the mid-range portion of the illuminated swath than at the near and far ranges. This effect is known as antenna pattern and results in stronger returns from the center portion of the swath than at the edges. Combined with this antenna pattern effect is the fact that the energy returned to the radar decreases dramatically as the range distance increases. Thus, for a given surface, the strength of the returned signal becomes smaller and smaller moving farther across the swath. These effects combine to produce an image which varies in intensity (tone) in the range direction across the image. A process known as antenna pattern correction may be applied to produce a uniform average brightness across the imaged swath, to better facilitate visual interpretation.
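A crude version of such a correction can be sketched by flattening the average brightness trend in the range direction (a simplified empirical approach for illustration; operational processors use the measured antenna gain pattern and range-dependent power loss):

```python
import numpy as np

def antenna_pattern_correction(image):
    """Divide each range column by its average brightness, then rescale
    so the overall mean brightness is preserved (range is assumed to
    run across the columns of the array)."""
    column_means = image.mean(axis=0, keepdims=True)
    return image / column_means * image.mean()
```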


The range of brightness levels a remote sensing system can differentiate is related to radiometric resolution (section 2.5) and is referred to as the dynamic range. While optical sensors, such as those carried by satellites such as Landsat and SPOT, typically produce 256 intensity levels, radar systems can differentiate up to around 100,000 intensity levels! Since the human eye can only discriminate about 40 intensity levels at one time, this is too much information for visual interpretation. Even a typical computer would have difficulty dealing with this range of information. Therefore, most radars record and process the original data as 16 bits (65,536 levels of intensity), which are then further scaled down to 8 bits (256 levels) for visual interpretation and/or digital computer analysis.
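The final 16-bit to 8-bit scaling is often a simple linear stretch along these lines (a minimal min-max stretch for illustration; actual products may use other scaling laws):

```python
import numpy as np

def scale_to_8bit(data16):
    """Linearly rescale 16-bit intensities to the 0-255 range."""
    data16 = np.asarray(data16, dtype=np.float64)
    lo, hi = data16.min(), data16.max()
    scaled = (data16 - lo) / (hi - lo) * 255.0
    return np.round(scaled).astype(np.uint8)

# The minimum maps to 0, the maximum to 255, mid-range values in between.
print(scale_to_8bit([0, 32768, 65535]))
```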

Calibration is a process which ensures that the radar system and the signals that it measures are as consistent and as accurate as possible. Prior to analysis, most radar images will require relative calibration. Relative calibration corrects for known variations in radar antenna and systems response and ensures that uniform, repeatable measurements can be made over time. This allows relative comparisons between the response of features within a single image, and between separate images to be made with confidence. However, if we wish to make accurate quantitative measurements representing the actual energy or power returned from various features or targets for comparative purposes, then absolute calibration is necessary.

Absolute calibration, a much more involved process than relative calibration, attempts to relate the magnitude of the recorded signal strength to the actual amount of energy backscattered from each resolution cell. To achieve this, detailed measurements of the radar system properties are required as well as quantitative measurements of the scattering properties of specific targets. The latter are often obtained using ground-based scatterometers, as described in section 3.1. Also, devices called transponders may be placed on the ground prior to data acquisition to calibrate an image. These devices receive the incoming radar signal, amplify it, and transmit a return signal of known strength back to the radar. By knowing the actual strength of this return signal in the image, the responses from other features can be referenced to it.

 

Advanced Radar Applications

In addition to standard acquisition and use of radar data, there are three specific applications worth mentioning.

Stereo radar image pair

The first is stereo radar which is similar in concept to stereo mapping using aerial photography (described in section 2.7). Stereo radar image pairs are acquired covering the same area, but with different look/incidence angles (A), or opposite look directions (B). Unlike aerial photos where the displacement is radially outward from the nadir point directly below the camera, radar images show displacement only in the range direction. Stereo pairs taken from opposite look directions (i.e. one looking north and the other south) may show significant contrast and may be difficult to interpret visually or digitally. In mountainous terrain, this will be even more pronounced as shadowing on opposite sides of features will eliminate the stereo effect. Same side stereo imaging (A) has been used operationally for years to assist in interpretation for forestry and geology and also to generate topographic maps. The estimation of distance measurements and terrain height for topographic mapping from stereo radar data is called radargrammetry, and is analogous to photogrammetry carried out for similar purposes with aerial photographs.

Electromagnetic waves

Radargrammetry is one method of estimating terrain height using radar. Another, more advanced method is called interferometry. Interferometry relies on being able to measure a property of electromagnetic waves called phase. Suppose we have two waves with the exact same wavelength and frequency traveling along in space, but the starting point of one is offset slightly from the other.

Interferometric systems

The offset between matching points on these two waves (A) is called the phase difference. Interferometric systems use two antennas, separated in the range dimension by a small distance, both recording the returns from each resolution cell. The two antennas can be on the same platform (as with some airborne SARs), or the data can be acquired from two different passes with the same sensor, as has been done with both airborne and satellite radars.

Interferogram

By measuring the exact phase difference between the two returns (A), the path length difference can be calculated to an accuracy on the order of the wavelength (i.e. centimetres). Knowing the position of the antennas with respect to the Earth's surface, the position of the resolution cell, including its elevation, can be determined. The phase difference between adjacent resolution cells is illustrated in this interferogram, where colours represent the variations in height. The information contained in an interferogram can be used to derive topographic information and produce three-dimensional imagery of terrain height.
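The phase-to-distance step can be written down directly (a simplified single-path sketch; repeat-pass systems must also account for the two-way travel of the signal, which doubles the sensitivity):

```python
import math

def path_difference(phase_diff_rad, wavelength_m):
    # Each full cycle (2*pi radians) of phase difference corresponds
    # to one wavelength of path-length difference.
    return phase_diff_rad / (2 * math.pi) * wavelength_m

# C-band example (5.66 cm): a phase difference of pi radians
# corresponds to half a wavelength of path difference.
print(path_difference(math.pi, 0.0566))  # 0.0283 (metres)
```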

Three-dimensional imagery

The concept of radar polarimetry was already alluded to in our discussion of radar fundamentals in section 3.2. As its name implies, polarimetry involves discriminating between the polarizations that a radar system is able to transmit and receive. Most radars transmit microwave radiation in either horizontal (H) or vertical (V) polarization, and similarly, receive the backscattered signal at only one of these polarizations. Multi-polarization radars are able to transmit either H or V polarization and receive both the like- and cross-polarized returns (e.g. HH and HV or VV and VH, where the first letter stands for the polarization transmitted and the second letter the polarization received). Polarimetric radars are able to transmit and receive both horizontal and vertical polarizations. Thus, they are able to receive and process all four combinations of these polarizations: HH, HV, VH, and VV. Each of these "polarization channels" has varying sensitivity to different surface characteristics and properties. Thus, the availability of multi-polarization data helps to improve the identification of, and the discrimination between, features. In addition to recording the magnitude (i.e. the strength) of the returned signal for each polarization, most polarimetric radars are also able to record the phase information of the returned signals. This can be used to further characterize the polarimetric "signature" of different surface features.

Did you know?

"...we've picked up an unidentified moving object on the radar, sir..."

... besides being able to determine terrain height using interferometry, it is also possible to measure the velocity of targets moving towards or away from the radar sensor, using only one pass over the target. This is done by recording the returns from two antennas mounted on the platform, separated by a short distance in the along-track or flight direction. The phase differences between the returns at each antenna are used to derive the speed of motion of targets in the illuminated scene. Potential applications include determination of sea-ice drift, ocean currents, and ocean wave parameters.

 

Radar Polarimetry

Introduction to Polarization

When discussing microwave energy propagation and scattering, the polarization of the radiation is an important property. For a plane electromagnetic (EM) wave, polarization refers to the locus of the electric field vector in the plane perpendicular to the direction of propagation. While the length of the vector represents the amplitude of the wave, and the rotation rate of the vector represents the frequency of the wave, polarization refers to the orientation and shape of the pattern traced by the tip of the vector.

The waveform of the electric field strength (voltage) of an EM wave can be predictable (the wave is polarized) or random (the wave is unpolarized), or a combination of both. In the latter case, the degree of polarization describes the ratio of polarized power to total power of the wave. An example of a fully polarized wave would be a monochromatic sine wave, with a single, constant frequency and stable amplitude.

Examples of horizontal (black) and vertical (red) polarizations of a plane electromagnetic wave

Examples of horizontal (black) and vertical (red) polarizations of a plane electromagnetic wave

Many radars are designed to transmit microwave radiation that is either horizontally polarized (H) or vertically polarized (V). A transmitted wave of either polarization can generate a backscattered wave with a variety of polarizations. It is the analysis of these transmit and receive polarization combinations that constitutes the science of radar polarimetry.

Any polarization on either transmission or reception can be synthesized by using H and V components with a well-defined relationship between them. For this reason, systems that transmit and receive both of these linear polarizations are commonly used. With these radars, there can be four combinations of transmit and receive polarizations:

  • HH - for horizontal transmit and horizontal receive
  • VV - for vertical transmit and vertical receive
  • HV - for horizontal transmit and vertical receive, and
  • VH - for vertical transmit and horizontal receive.

The first two polarization combinations are referred to as "like-polarized" because the transmit and receive polarizations are the same. The last two combinations are referred to as "cross-polarized" because the transmit and receive polarizations are orthogonal to one another.

Radar systems can have one, two or all four of these transmit/receive polarization combinations. Examples include the following types of radar systems:

single polarized - HH or VV (or possibly HV or VH)
dual polarized - HH and HV, VV and VH, or HH and VV
alternating polarization - HH and HV, alternating with VV and VH
polarimetric - HH, VV, HV, and VH

Note that "quadrature polarization" and "fully polarimetric" can be used as synonyms for "polarimetric". The relative phase between channels is measured in a polarimetric radar, and is a very important component of the measurement. In the other radar types, relative phase may or may not be measured. The alternating polarization mode has been introduced on ENVISAT - relative phase is measured but the important HH-VV phase is not meaningful because of the time lapse between the measurements.

These C-band images of agricultural fields demonstrate the dependence of the radar response on polarization. The top two images are like-polarized (HH on left, VV on right), and the lower left image is cross-polarized (HV). The lower right image is the result of displaying these three images as a colour composite (in this case, HH - red, VV - green, and HV - blue).

Both wavelength and polarization affect how a radar system "sees" the elements in the scene. Therefore, radar imagery collected using different polarization and wavelength combinations may provide different and complementary information. Furthermore, when three polarizations are combined in a colour composite, the information is presented in a way that allows an image interpreter to infer more about the surface characteristics.

Illustration of how different polarizations (HH, VV, HV & colour composite) bring out different features in an agricultural scene

Polarimetric Information

The primary description of how a radar target or surface feature scatters EM energy is given by the scattering matrix. From the scattering matrix, other forms of polarimetric information can be derived, such as synthesized images and polarization signatures.

Polarization Synthesis

A polarimetric radar can be used to determine the target response or scattering matrix using two orthogonal polarizations, typically linear H and linear V on each of transmit and receive. If a scattering matrix is known, the response of the target to any combination of incident and received polarizations can be computed. This is referred to as polarization synthesis, and illustrates the power and flexibility of a fully polarimetric radar.
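Polarization synthesis can be sketched in a few lines (an illustrative Jones-vector formulation; sign and conjugation conventions vary between texts, so treat this as a sketch rather than a definitive implementation):

```python
import numpy as np

def synthesized_power(S, transmit, receive):
    """Backscattered power for arbitrary transmit and receive
    polarization states (unit Jones vectors in the H, V basis), given
    the 2x2 complex scattering matrix S = [[S_hh, S_hv], [S_vh, S_vv]]."""
    voltage = np.conj(receive) @ (S @ transmit)
    return abs(voltage) ** 2

H = np.array([1, 0], dtype=complex)
V = np.array([0, 1], dtype=complex)

# Odd-bounce target (sphere / trihedral): identity scattering matrix
S_sphere = np.array([[1, 0], [0, 1]], dtype=complex)
print(synthesized_power(S_sphere, H, H))  # 1.0 (strong like-polarized return)
print(synthesized_power(S_sphere, H, V))  # 0.0 (no cross-polarized return)
```

Once the scattering matrix is measured, any transmit/receive combination can be evaluated this way without re-imaging the scene, which is the flexibility the text refers to.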

Through polarization synthesis, an image can be created to improve the detectability of selected features. An example is the detection of ships in ocean images. To find the best transmit-receive polarization combination to use, the polarization signature of a typical ship and that of the ocean is calculated for a number of polarizations. Then the ratio of the ship to ocean backscatter is computed for each polarization. The transmit-receive polarization combination that maximises the ratio of backscatter strength is then used to improve the detectability of ships. This procedure is called "polarimetric contrast enhancement" or the use of a "polarimetric matched filter".

Polarization Signatures

Because the incident and scattered waves can take on so many different polarizations, and the scattering matrix consists of four complex numbers, it is helpful to simplify the interpretation of the scattering behaviour using three-dimensional plots. The "polarization signature" of the target provides a convenient way of visualising a target's scattering properties. The signatures are also called "polarization response plots".

An incident electromagnetic wave can be selected to have an electric field with ellipticity between -45º and +45º, and an orientation between 0 and 180º. These variables are used as the x- and y-axes of a 3-D plot portraying the polarization signature. For each of these possible incident polarizations, the strength of the backscatter can be computed for the same polarization on transmit and receive (the co-polarized signature) and for orthogonal polarizations on transmit and receive (the cross-polarized signature). The strength is displayed on the z-axis of the signatures.
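A point on such a signature can be computed as follows (an illustrative sketch using one common parameterization of the polarization ellipse and the backscatter-alignment convention; conventions differ between references):

```python
import numpy as np

def jones(orientation_deg, ellipticity_deg):
    """Unit polarization vector (H, V basis) for a polarization ellipse
    with the given orientation and ellipticity angles."""
    psi, chi = np.radians(orientation_deg), np.radians(ellipticity_deg)
    return np.array([
        np.cos(psi) * np.cos(chi) - 1j * np.sin(psi) * np.sin(chi),
        np.sin(psi) * np.cos(chi) + 1j * np.cos(psi) * np.sin(chi),
    ])

def copol_power(S, orientation_deg, ellipticity_deg):
    # Co-polarized response: the same antenna state is used on
    # transmit and receive (no conjugation in this convention).
    p = jones(orientation_deg, ellipticity_deg)
    return abs(p @ (S @ p)) ** 2

S_sphere = np.eye(2, dtype=complex)  # large conducting sphere
print(copol_power(S_sphere, 0, 0))   # 1.0 for any linear polarization
print(copol_power(S_sphere, 0, 45))  # ~0 at circular polarization
```

Sweeping orientation over 0 to 180° and ellipticity over -45° to +45° produces the z-values of the co-polarized signature plot described above.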

Co-polarized signature Cross-polarized signature
Co-polarized signature Cross-polarized signature

Polarization signatures of a large conducting sphere.
P = Power, O = Orientation (degrees), E = Ellipticity (degrees)

This figure shows the polarization signatures of the simplest of all targets - a large conducting sphere or a trihedral corner reflector. The wave is backscattered with the same polarization, except for a change of sign of the ellipticity (or, in the case of linear polarization, a change of the phase angle between Eh and Ev of 180°). The sign changes once for every reflection - the sphere represents a single reflection, and the trihedral gives three reflections, so each behaves as an "odd-bounce" reflector.

For more complicated targets, the polarization signature takes on different shapes. Two interesting signatures come from a dihedral corner reflector and Bragg scattering from the sea surface. In the case of the dihedral reflector, the co-pol signature has a double peak, characteristic of "even-bounce" reflectors. In the case of Bragg scattering, the response is similar to the single-bounce sphere, except that the backscatter of the vertical polarization is higher than that of the horizontal polarization.

Data Calibration

One critical requirement of polarimetric radar systems is the need for calibration. This is because much of the information lies in the ratios of amplitudes and the differences in phase angle between the four transmit-receive polarization combinations. If the calibration is not sufficiently accurate, the scattering mechanisms will be misinterpreted and the advantages of using polarization will not be realised.

Calibration is achieved by a combination of radar system design and data analysis. Imagine the response to a trihedral corner reflector. Its ideal response is only obtained if the four channels of the radar system all have the same gain, system-dependent phase differences between channels are absent, and there is no energy leakage from one channel to another.

In terms of the radar system design, the channel gains and phases should be as carefully matched as possible. In the case of the phase balance, this means that the signal path lengths should be effectively the same in all channels. Calibration signals are often built into the design to help verify these channel balances.

In terms of data analysis, channel balances, cross-talk and noise effects can be measured and corrected by analysing the received data. In addition to analysing the response of internal calibration signals, the signals from known targets such as corner reflectors, active transponders, and uniform clutter can be used to calibrate some of the parameters.

Polarimetric Applications

Synthetic Aperture Radar polarimetry has been limited to a number of experimental airborne SAR systems and the SIR-C (shuttle) mission. With these data, researchers have studied a number of applications, and have shown that the interpretation of a number of features in a scene is facilitated when the radar is operated in polarimetric mode. The launch of RADARSAT-2 will make polarimetric data available on an operational basis, and uses of such data will become more routine and more sophisticated.

Some applications in which polarimetric SAR has already proved useful include:

  • Agriculture: for crop type identification, crop condition monitoring, soil moisture measurement, and soil tillage and crop residue identification;
  • Forestry: for clearcuts and linear features mapping, biomass estimation, species identification and fire scar mapping;
  • Geology: for geological mapping;
  • Hydrology: for monitoring wetlands and snow cover;
  • Oceanography: for sea ice identification, coastal windfield measurement, and wave slope measurement;
  • Shipping: for ship detection and classification;
  • Coastal Zone: for shoreline detection, substrate mapping, slick detection and general vegetation mapping.
Did you know?

... that many other polarizations can be transmitted (or received) if a radar system can transmit or receive the H and V channels simultaneously? For example, if a radar system transmits an H and a V signal simultaneously, and the V signal is 90° out of phase with respect to the H signal, the resulting transmitted wave will have circular polarization.
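This can be verified numerically (a minimal sketch of the field-vector trace):

```python
import numpy as np

# One period of the wave: equal-amplitude H and V components, with the
# V component 90 degrees out of phase with the H component.
t = np.linspace(0, 2 * np.pi, 1000, endpoint=False)
Eh = np.cos(t)
Ev = np.cos(t - np.pi / 2)   # the 90-degree phase shift

# The tip of the total electric field vector traces a circle of
# radius 1: the wave is circularly polarized.
radius = np.sqrt(Eh**2 + Ev**2)
print(radius.min(), radius.max())
```

With zero phase shift the same two components would instead trace a straight line, i.e. a linear polarization oriented at 45°.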

 

Airborne versus Spaceborne Radars

Like other remote sensing systems, an imaging radar sensor may be carried on either an airborne or spaceborne platform. Depending on the use of the prospective imagery, there are trade-offs between the two types of platforms. Regardless of the platform used, a significant advantage of using a Synthetic Aperture Radar (SAR) is that the spatial resolution is independent of platform altitude. Thus, fine resolution can be achieved from both airborne and spaceborne platforms.
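The altitude independence follows from the classical result that a fully focused SAR's theoretical best azimuth resolution is half the physical antenna length, regardless of range (example antenna length is illustrative):

```python
def sar_azimuth_resolution(antenna_length_m):
    # Theoretical best azimuth resolution of a focused SAR:
    # half the along-track antenna length, independent of range.
    return antenna_length_m / 2

# A 10 m antenna gives 5 m azimuth resolution whether it is flown
# at aircraft altitude or at several hundred kilometres:
print(sar_azimuth_resolution(10.0))  # 5.0
```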

Airborne radar

Although spatial resolution is independent of altitude, viewing geometry and swath coverage can be greatly affected by altitude variations. At aircraft operating altitudes, an airborne radar must image over a wide range of incidence angles, perhaps as much as 60 or 70 degrees, in order to achieve relatively wide swaths (let's say 50 to 70 km). As we have learned in the preceding sections, incidence angle (or look angle) has a significant effect on the backscatter from surface features and on their appearance on an image. Image characteristics such as foreshortening, layover, and shadowing will be subject to wide variations, across a large incidence angle range. Spaceborne radars are able to avoid some of these imaging geometry problems since they operate at altitudes up to one hundred times higher than airborne radars. At altitudes of several hundred kilometres, spaceborne radars can image comparable swath widths, but over a much narrower range of incidence angles, typically ranging from five to 15 degrees. This provides for more uniform illumination and reduces undesirable imaging variations across the swath due to viewing geometry.
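The contrast in incidence-angle spread can be checked with simple flat-earth geometry (illustrative altitudes and ranges, ignoring Earth curvature):

```python
import math

def incidence_angle_deg(ground_range_km, altitude_km):
    # Flat-earth sketch: angle from vertical to the line of sight.
    return math.degrees(math.atan2(ground_range_km, altitude_km))

# Airborne radar at 9 km altitude imaging a 50 km swath (5-55 km out):
print(incidence_angle_deg(5, 9))      # near edge: ~29 degrees
print(incidence_angle_deg(55, 9))     # far edge: ~81 degrees -> huge spread

# Spaceborne radar at 800 km imaging a 100 km swath (100-200 km out):
print(incidence_angle_deg(100, 800))  # near edge: ~7 degrees
print(incidence_angle_deg(200, 800))  # far edge: ~14 degrees -> nearly uniform
```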

Spaceborne radars

Although airborne radar systems may be more susceptible to imaging geometry problems, they are flexible in their capability to collect data from different look angles and look directions. By optimizing the geometry for the particular terrain being imaged, or by acquiring imagery from more than one look direction, some of these effects may be reduced. Additionally, an airborne radar is able to collect data anywhere and at any time (as long as weather and flying conditions are acceptable!). A spaceborne radar does not have this degree of flexibility, as its viewing geometry and data acquisition schedule are controlled by the pattern of its orbit. However, satellite radars do have the advantage of being able to collect imagery more quickly over a larger area than an airborne radar, and provide consistent viewing geometry. The frequency of coverage may not be as often as that possible with an airborne platform, but depending on the orbit parameters, the viewing geometry flexibility, and the geographic area of interest, a spaceborne radar may have a revisit period as short as one day.

As with any aircraft, an airborne radar will be susceptible to variations in velocity and other motions of the aircraft as well as to environmental (weather) conditions. In order to avoid image artifacts or geometric positioning errors due to random variations in the motion of the aircraft, the radar system must use sophisticated navigation/positioning equipment and advanced image processing to compensate for these variations. Generally, this will be able to correct for all but the most severe variations in motion, such as significant air turbulence. Spaceborne radars are not affected by motion of this type. Indeed, the geometry of their orbits is usually very stable and their positions can be accurately calculated. However, geometric correction of imagery from spaceborne platforms must take into account other factors, such as the rotation and curvature of the Earth, to achieve proper geometric positioning of features on the surface.

 

Airborne and Spaceborne Radar Systems

In order to more clearly illustrate the differences between airborne and spaceborne radars, we will briefly outline a few of the representative systems of each type, starting with airborne systems.

Convair-580 C/X SAR

The Convair-580 C/X SAR system developed and operated by the Canada Centre for Remote Sensing was a workhorse for experimental research into advanced SAR applications in Canada and around the world, particularly in preparation for satellite-borne SARs. The system was transferred to Environment Canada in 1996 for use in oil spill research and other environmental applications. This system operates at two radar bands, C- (5.66 cm) and X- (3.24 cm). Cross-polarization data can be recorded simultaneously for both the C- and X-band channels, and the C-band system can be operated as a fully polarimetric radar. Imagery can be acquired at three different imaging geometries (nadir, narrow and wide swath modes) over a wide range of incidence angles (five degrees to almost 90 degrees). In addition to being a fully calibratable system for quantitative measurements, the system has a second antenna mounted on the aircraft fuselage to allow the C-band system to be operated as an interferometric radar.

Sea Ice and Terrain Assessment (STAR)

The Sea Ice and Terrain Assessment (STAR) systems operated by Intera Technologies Limited of Calgary, Alberta, Canada, (later Intermap Technologies) were among the first SAR systems used commercially around the world. Both STAR-1 and STAR-2 operate at X-band (3.2 cm) with HH polarization in two different resolution modes. The swath coverage varies from 19 to 50 km, and the resolution from 5 to 18 m. They were primarily designed for monitoring sea ice (one of the key applications for radar, in Canada) and for terrain analysis. Radar's all-weather, day or night imaging capabilities are well-suited to monitoring ice in Canada's northern and coastal waters. STAR-1 was also the first SAR system to use on-board data processing and to offer real-time downlinking of data to surface stations.

AirSAR

The United States National Aeronautics and Space Administration (NASA) has been at the forefront of multi-frequency, multi-polarization synthetic aperture radar research for many years. The Jet Propulsion Laboratory (JPL) in California has operated various advanced systems on contract for NASA. The AirSAR system is a C-, L-, and P-band advanced polarimetric SAR which can collect data for each of these bands at all possible combinations of horizontal and vertical transmit and receive polarizations (i.e. HH, HV, VH, and VV). Data from the AirSAR system can be fully calibrated to allow extraction of quantitative measurements of radar backscatter. Spatial resolution of the AirSAR system is on the order of 12 metres in both range and azimuth. Incidence angle ranges from zero degrees at nadir to about 70 degrees at the far range. This capability to collect multi-frequency, multi-polarization data over such a diverse range of incidence angles allows a wide variety of specialized research experiments to be carried out.

SEASAT

With the advances and success of airborne imaging radar, satellite radars were the next logical step to complement the optical satellite sensors in operation. SEASAT, launched in 1978, was the first civilian remote sensing satellite to carry a spaceborne SAR sensor. The SAR operated at L-band (23.5 cm) with HH polarization. The viewing geometry was fixed between nine and 15 degrees with a swath width of 100 km and a spatial resolution of 25 metres. This steep viewing geometry was designed primarily for observations of ocean and sea ice, but a great deal of imagery was also collected over land areas. However, the small incidence angles amplified foreshortening and layover effects over terrain with high relief, limiting its utility in these areas. Although the satellite was only operational for three months, it demonstrated the wealth of information (and the large volumes of data!) possible from a spaceborne radar.

ERS-1

With the success of the short-lived SEASAT mission, and impetus provided from positive results with several airborne SARs, the European Space Agency (ESA) launched ERS-1 in July of 1991. ERS-1 carried on-board a radar altimeter, an infrared radiometer and microwave sounder, and a C-band (5.66 cm) active microwave instrument. This is a flexible instrument which can be operated as a scatterometer to measure reflectivity of the ocean surface, as well as ocean surface wind speed and direction. It can also operate as a synthetic aperture radar, collecting imagery over a 100 km swath over an incidence angle range of 20 to 26 degrees, at a resolution of approximately 30 metres. Polarization is vertical transmit and vertical receive (VV) which, combined with the fairly steep viewing angles, makes ERS-1 particularly sensitive to surface roughness. The revisit period (or repeat cycle) of ERS-1 can be varied by adjusting the orbit, and has ranged from three to 168 days, depending on the mode of operation. Generally, the repeat cycle is about 35 days. A second satellite, ERS-2, was launched in April of 1995 and carries the same active microwave sensor as ERS-1. Designed primarily for ocean monitoring applications and research, ERS-1 provided the worldwide remote sensing community with the first wide-spread access to spaceborne SAR data. Imagery from both satellites has been used in a wide range of applications, over both ocean and land environments. Like SEASAT, the steep viewing angles limit their utility for some land applications due to geometry effects.

JERS-1

The National Space Development Agency of Japan (NASDA) launched the JERS-1 satellite in February of 1992. In addition to carrying two optical sensors, JERS-1 has an L-band (23.5 cm) SAR operating at HH polarization. The swath width is approximately 75 km and the spatial resolution is approximately 18 metres in both range and azimuth. The imaging geometry of JERS-1 is slightly shallower than that of either SEASAT or the ERS satellites, with an incidence angle of 35 degrees at the middle of the swath. Thus, JERS-1 images are slightly less susceptible to geometry and terrain effects. The longer L-band wavelength of JERS-1 allows some penetration of the radar energy through vegetation and other surface types.
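
The band designations quoted so far (L-band at 23.5 cm, C-band at 5.66 cm) can be converted to the frequencies at which these radars transmit via f = c / λ. A quick check, using the wavelengths given in the text and the standard conversion:

```python
C = 299_792_458  # speed of light, m/s

# wavelengths in metres, as quoted in the text
bands = {
    "L-band (SEASAT, JERS-1)": 0.235,
    "C-band (ERS-1/2)": 0.0566,
}

for name, wavelength in bands.items():
    freq_ghz = C / wavelength / 1e9
    print(f"{name}: {wavelength * 100:.2f} cm -> {freq_ghz:.2f} GHz")
```

So the L-band radars operate near 1.3 GHz and the C-band instruments near 5.3 GHz, consistent with the conventional microwave band letter designations.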

RADARSAT-1

Spaceborne SAR remote sensing took a giant leap forward with the launch of Canada's RADARSAT satellite on Nov. 4, 1995. The RADARSAT project, led by the Canadian Space Agency (CSA), was built on the development of remote sensing technologies and applications work carried out by the Canada Centre for Remote Sensing (CCRS) since the 1970s. RADARSAT carries an advanced C-band (5.6 cm), HH-polarized SAR with a steerable radar beam allowing various imaging options over a 500 km range. Imaging swaths can be varied from 35 to 500 km in width, with resolutions from 10 to 100 metres. Viewing geometry is also flexible, with incidence angles ranging from less than 20 degrees to more than 50 degrees. Although the satellite's orbit repeat cycle is 24 days, the flexibility of the steerable radar beam gives RADARSAT the ability to image regions much more frequently and to address specific geographic requests for data acquisition. RADARSAT's orbit is optimized for frequent coverage of mid-latitude to polar regions; the satellite can provide daily images of the entire Arctic region and view any part of Canada within three days. Even at equatorial latitudes, complete coverage can be obtained within six days using the widest swath of 500 km.
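
The six-day equatorial coverage figure can be sanity-checked with simple arithmetic: a satellite in a near-polar orbit completes roughly 14 orbits per day, so adjacent ascending ground tracks at the equator are about 40,075 / 14 ≈ 2,860 km apart, and six days of 500 km swaths are just enough to tile the equator. A rough sketch (the 14-orbits-per-day figure is a typical value assumed here, not quoted from the text, and track spacing is treated as uniform):

```python
import math

EQUATOR_KM = 40_075    # Earth's equatorial circumference
ORBITS_PER_DAY = 14    # typical for a near-polar, ~800 km orbit (assumption)

def days_for_equatorial_coverage(swath_km: float) -> int:
    """Days of passes needed for the swaths to tile the equator."""
    covered_per_day = ORBITS_PER_DAY * swath_km
    return math.ceil(EQUATOR_KM / covered_per_day)

print(days_for_equatorial_coverage(500))  # widest (ScanSAR) swath -> 6 days
```

Narrower beam modes would take proportionally longer, which is why the 24-day repeat cycle and the steerable beam matter for revisiting a specific site at high resolution.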

Imaging options over a 500 km range

Source: http://www.ccrs.nrcan.gc.ca/