In conventional imaging radar the measurement is a scalar proportional to the received backscattered power at a particular combination of linear transmit and receive polarizations (HH, HV, VH, or VV). In polarimetry the basic measurement is a 2x2 complex scattering matrix, yielding an eight-dimensional measurement space. For reciprocal targets, where HV = VH, this space is compressed to five dimensions: three amplitudes (|HH|, |HV|, and |VV|) and two phase measurements (co-pol: HH-VV; cross-pol: HH-HV). It is the phase measurements that are distinctive of the technology; they have allowed the development of techniques to synthesize polarization measurements for any point on the Poincaré sphere. This increases the measurement space of polarimetry far beyond the fivefold increase implied by the scattering matrix.
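Polarization synthesis can be illustrated with a short sketch. The function names and the (orientation, ellipticity) parameterization below are illustrative, but the underlying relation is the standard one: the received complex voltage is the transmit Jones vector passed through the scattering matrix and projected onto the receive Jones vector, so any point on the Poincaré sphere can be synthesized from one measured matrix.

```python
import numpy as np

def jones_vector(psi, chi):
    """Unit Jones vector for orientation angle psi and ellipticity chi (radians)."""
    return np.array([
        np.cos(psi) * np.cos(chi) - 1j * np.sin(psi) * np.sin(chi),
        np.sin(psi) * np.cos(chi) + 1j * np.cos(psi) * np.sin(chi),
    ])

def synthesized_power(S, psi_t, chi_t, psi_r, chi_r):
    """Backscattered power for arbitrary transmit/receive polarizations
    synthesized from the measured 2x2 complex scattering matrix S."""
    et = jones_vector(psi_t, chi_t)   # transmit polarization
    er = jones_vector(psi_r, chi_r)   # receive polarization
    v = er @ S @ et                   # received complex voltage
    return np.abs(v) ** 2

# Example: an odd-bounce (trihedral-like) scatterer, HH = VV, HV = 0
S = np.array([[1.0, 0.0], [0.0, 1.0]], dtype=complex)
p_co = synthesized_power(S, 0.0, 0.0, 0.0, 0.0)          # HH channel
p_cross = synthesized_power(S, 0.0, 0.0, np.pi / 2, 0.0)  # VH channel
```

Sweeping (psi, chi) over the sphere for both transmit and receive yields the familiar polarization response surfaces.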
Two other quantities important in polarimetry are the covariance matrix and the Mueller matrix; these are the basis for most analyses. Research activity related to radar polarimetry has increased significantly over the past few years. Several books have been published on the subject, and parts of the technology have begun to mature (i.e. achieve consensus). This review divides the field into two broad categories: system analysis and data analysis.
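As a minimal sketch of the first of these quantities, the covariance matrix for a reciprocal target is the outer-product average of the lexicographic target vector k = [HH, sqrt(2)*HV, VV]; the function name below is illustrative.

```python
import numpy as np

def covariance_matrix(hh, hv, vv):
    """3x3 polarimetric covariance matrix <k k^H> averaged over a sample of
    pixels, using the lexicographic target vector k = [HH, sqrt(2)*HV, VV]
    for a reciprocal target (HV = VH)."""
    k = np.stack([hh, np.sqrt(2) * hv, vv])   # shape (3, N)
    return (k @ k.conj().T) / k.shape[1]

# Example with synthetic complex samples
rng = np.random.default_rng(0)
hh = rng.normal(size=100) + 1j * rng.normal(size=100)
hv = 0.1 * (rng.normal(size=100) + 1j * rng.normal(size=100))
vv = rng.normal(size=100) + 1j * rng.normal(size=100)
C = covariance_matrix(hh, hv, vv)
```

By construction C is Hermitian with real, non-negative diagonal powers; the Mueller matrix carries equivalent second-order information in a real-valued form.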
In addition to the three-frequency airborne system operated by JPL, NASA operates an orbiting polarimetric SAR from the space shuttle. Plans also exist to include a polarimeter in the EOS system of satellites. Several other countries operate airborne polarimetric SARs, including Canada, Denmark, Germany, and the Netherlands. The data rate for polarimetric SARs is significant, and the various systems differ in their design. These designs will continue to evolve to accommodate advances in antenna design, internal calibration, real-time processing strategies, and data storage technologies.
Calibration of polarimetric image data is relatively mature and the relevant theory has been developed. The basic system model consists of four equations with ten unknowns: the four elements of the scattering matrix and six radar system parameters. The system unknowns consist of crosstalk (polarization impurity) and channel-imbalance terms, which can vary significantly across the image, particularly in the range dimension. The required additional information comes in the form of (i) man-made targets with known scattering properties; (ii) natural distributed targets with assumed scattering properties; and (iii) simplifications to the system model (i.e. assumptions about the system parameters). Practical solutions to calibration often involve all three types of information. The cost and inconvenience of using man-made targets has led to intensive research on internal calibration involving only distributed targets and system assumptions. However, the necessary assumptions, particularly as they relate to phase calibration, may be difficult to meet. Nevertheless, internal calibration remains an important goal which will drive research in this area.
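One common way to write the system model is as a pair of 2x2 distortion matrices acting on the scattering matrix; the sketch below is a forward model only (noise omitted), and the symbol names (d1..d4 for crosstalk, f1, f2 for imbalance) are illustrative labels for the six system unknowns, not a specific author's notation.

```python
import numpy as np

def distorted_measurement(S, delta, f, A=1.0):
    """Forward model M = A * R @ S @ T for one pixel (additive noise omitted).
    delta = (d1, d2, d3, d4) are crosstalk terms and f = (f1, f2) are the
    receive/transmit channel imbalances -- six system unknowns in all,
    plus the four complex elements of S."""
    d1, d2, d3, d4 = delta
    f1, f2 = f
    R = np.array([[1.0, d1], [d2, f1]], dtype=complex)  # receive distortion
    T = np.array([[1.0, d3], [d4, f2]], dtype=complex)  # transmit distortion
    return A * (R @ S @ T)

# With an ideal system (no crosstalk, unit imbalance) M reduces to S:
S = np.array([[1.0, 0.2j], [0.2j, -0.5]], dtype=complex)
M = distorted_measurement(S, (0, 0, 0, 0), (1.0, 1.0))
```

Calibration is the inverse problem: estimating R, T (and A) from reference targets or distributed-target assumptions, then solving for S.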
Many important analytical results have been derived, including the joint and marginal probability density functions of the amplitude and phase measurements, the associated moments, and the variance of parameter estimates. Results for both single-look and multilook data have been derived. Most results to date are based on the assumption that the elements of the scattering matrix are jointly Gaussian. This is equivalent to the assumption that the scene is homogeneous. Recent work has generalized these results to include heterogeneous scenes, leading to the K-distribution for intensities. However, the underlying assumptions are often introduced as strictly statistical vehicles, and the physical reasoning is still quite tenuous. The polarimetric response of terrain depends on several parameters, such as frequency, incidence angle, and the dielectric and geometric properties of the scene. Statistical models have not yet been tested against a useful range of these parameters.
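The route from heterogeneous scenes to the K-distribution can be sketched in a few lines: if scene texture is gamma-distributed and modulates unit-mean exponential (single-look) speckle, the product is K-distributed intensity. The function name and parameterization below are illustrative.

```python
import numpy as np

def k_distributed_intensity(n, order, mean=1.0, rng=None):
    """Sample K-distributed intensities as the product of a gamma-distributed
    texture term (scene heterogeneity) and unit-mean exponential speckle."""
    rng = rng or np.random.default_rng()
    texture = rng.gamma(shape=order, scale=mean / order, size=n)  # scene variability
    speckle = rng.exponential(scale=1.0, size=n)                  # single-look speckle
    return texture * speckle

samples = k_distributed_intensity(100_000, order=4.0, rng=np.random.default_rng(1))
```

As the order parameter grows, the texture term tends to a constant and the model collapses back to the homogeneous (exponential-intensity) Gaussian case.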
(b) Image Processing
The two most important image processing operations performed on SAR imagery are speckle filtering and image classification/segmentation. The presence of speckle in SAR image data makes it difficult to estimate the spatial statistics of the underlying scene backscatter; it also makes it difficult to perform classification of the image data. Hence, speckle filtering invariably precedes these operations.
Speckle is an expression of subpixel-scale scattering, and all speckle filtering is built on a model of this random process. The prevailing model, called multiplicative noise or the product model, assumes that the scene backscatter varies very slowly in comparison with the scale of the resolution cell. While this assumption appears to hold for certain cases, it is not consistent with current statistical theories of SAR image data. Several speckle filtering algorithms have been developed for polarimetric data, built on the prior work on conventional data.
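The product model can be made concrete with a minimal Lee-style filter: observed intensity is I = sigma * n, with n unit-mean multiplicative noise of variance 1/looks, and the filter blends the local mean with the observation according to how much of the local variance exceeds what speckle alone would explain. This 1-D sketch (names illustrative) is a simplification of the classical filter, not a faithful reimplementation.

```python
import numpy as np

def boxcar(x, win):
    """Local mean over a sliding window (zero-padded at the edges)."""
    return np.convolve(x, np.ones(win) / win, mode='same')

def lee_filter(intensity, win=7, looks=1):
    """Minimal Lee-style speckle filter under the product model I = sigma * n,
    where n has unit mean and variance 1/looks."""
    mean = boxcar(intensity, win)
    var = boxcar(intensity ** 2, win) - mean ** 2
    noise_var = 1.0 / looks
    # Fraction of local variance attributable to the scene rather than speckle
    w = np.clip((var - noise_var * mean ** 2) / np.maximum(var, 1e-12), 0.0, 1.0)
    return mean + w * (intensity - mean)

# Example: constant backscatter corrupted by single-look exponential speckle
rng = np.random.default_rng(0)
noisy = 5.0 * rng.exponential(1.0, size=2000)
filtered = lee_filter(noisy, win=7, looks=1)
```

In homogeneous regions the weight goes to zero and the filter averages fully; near strong scene structure the weight rises and detail is preserved.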
Recent results have shown that the multiplicative noise model holds only for the four elements of the complex scattering matrix, and it is these complex scalars which must be filtered before any derived polarimetric measures are computed. Other algorithms have been developed that use an optimal linear combination of the scattering elements, an improvement over earlier algorithms based on the span of the scattering matrix.
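Filtering in the complex domain before deriving polarimetric measures can be illustrated with the co-pol phase difference: rather than averaging noisy per-pixel phases, one averages the complex product HH * conj(VV) and takes the angle of the result. The function name and the 1-D boxcar are illustrative simplifications.

```python
import numpy as np

def copol_phase_difference(hh, vv, win):
    """Estimate the co-pol (HH-VV) phase difference by averaging the complex
    cross product HH * conj(VV) over a window, i.e. filtering the complex
    data before deriving the polarimetric measure."""
    prod = hh * np.conj(vv)
    avg = np.convolve(prod, np.ones(win) / win, mode='valid')
    return np.angle(avg)
```

Averaging phases directly would wrap and bias the estimate; averaging the complex product does not.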
Classification and segmentation of polarimetric image data have seen significant activity in recent years. Classification algorithms are usually closely tied to advances in statistical modeling. Preference appears to be for unsupervised techniques due to a lack of surface measurements with which to train supervised algorithms. Several algorithms use neural networks rather than strictly statistical models. Current classification algorithms operate on a pixel-by-pixel basis, yielding a very noisy class map of the scene. Some investigators use a post-processing step to reduce the noise in the final map. In contrast, segmentation algorithms, of which there are few, rely on structural information rather than points in a polarimetric feature space to group pixels together. Future algorithms will likely combine structural and radiometric information to classify images. Operational applications will likely demand a trained algorithm to segment/classify an image. Training will likely be an ongoing procedure, where the algorithm's ability to correctly classify imagery is refined over time as it is presented with more examples of known classes.
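The pixel-by-pixel, unsupervised approach can be sketched with a toy k-means classifier operating in a polarimetric feature space (e.g. the three log intensities per pixel). This is a deliberately simple stand-in, not any specific published algorithm; names and parameters are illustrative.

```python
import numpy as np

def unsupervised_classify(features, k=3, iters=20, rng=None):
    """Toy unsupervised pixel-by-pixel classifier: k-means on a per-pixel
    polarimetric feature vector (last axis), e.g. log |HH|^2, |HV|^2, |VV|^2.
    Returns a class label per pixel -- typically a noisy map that is then
    cleaned by a post-processing (e.g. majority-filter) step."""
    rng = rng or np.random.default_rng(0)
    x = features.reshape(-1, features.shape[-1]).astype(float)
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((x[:, None] - centers) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean(0)
    return labels.reshape(features.shape[:-1])
```

A Wishart-distance classifier on the averaged covariance matrix is the statistically grounded analogue; the k-means sketch only shows the overall structure of such pipelines.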
(c) Scene Understanding
The overarching goal of remote sensing is the development of a relationship between image measurements and geophysically useful information such as forest biomass, snow-water equivalent, soil moisture, sea ice thickness, etc. Polarimetry is no exception, with the relevant measurements being linearly polarized backscattering coefficients, polarization phase difference, the 2D polarization response functions, and a host of derived metrics and statistics. Relating these measurements to geophysical parameters remains the most important, the most difficult, and the least investigated area of polarimetry.
To date the greatest effort in this area has been directed towards forests. Several studies have demonstrated the potential of linearly polarized backscatter coefficients for discriminating forest types and estimating useful parameters such as tree height and biomass. However, this capability is a strong function of frequency, forest type, and forest density. It seems likely that successful algorithms will require some prior information, possibly through multi-sensor integration, to achieve operationally useful results. The utility of polarimetric measurements, such as phase differences and the synthesized polarization response functions, for estimating geophysical parameters has yet to be demonstrated. Nevertheless, these measurements have been extremely useful in understanding the interaction between polarized microwave radiation and natural terrestrial scenes.