SAR INTERFEROMETRY: PRINCIPLES, PROCESSING, AND PERSPECTIVES

Olaf Hellwich
Chair for Photogrammetry and Remote Sensing
Technische Universität München, D-80290 Munich, Germany
Ph.: +49-89-289-22677, Fax: +49-89-2809573
E-mail: [email protected]
URL: http://www.photo.verm.tu-muenchen.de

KEY WORDS: SAR Interferometry, Atmospheric Effects, SRTM, Differential Interferometry, ScanSAR, Polarimetric Interferometry.

ABSTRACT

The principles of SAR interferometry and interferogram generation, in particular phase unwrapping and geocoding, are described. Current issues of SAR interferometry are discussed: atmospheric effects on spaceborne repeat-pass interferometry limiting the accuracy and reliability of Digital Elevation Models (DEM), the Shuttle Radar Topography Mission (SRTM) resulting in a globally homogeneous DEM, differential interferometry monitoring surface displacements in the mm to cm range with spaceborne sensors, ScanSAR interferometry allowing continuous monitoring of regions threatened by natural hazards, and polarimetric interferometry determining vegetation parameters such as height and generating more accurate DEM than standard interferometry.

1 INTRODUCTION

SAR derives a two-dimensional representation of the three-dimensional object space, the image. For each pixel there are two geometric measurements representing the transformation from 3-D to 2-D: the range $r$ of the object point $p$ from the sensor $s$ and the Doppler frequency $f$ under which the object point is observed. In addition to these, each pixel contains, as a third geometric quantity, the phase $\varphi_1$, which is combined with the phase $\varphi_2$ of another SAR image in order to recover the third dimension of the object space. The computation of phase differences $\Phi = \varphi_1 - \varphi_2$ is conducted according to the interferometer principle, which is why this operation and the following computation of object space coordinates for the pixels of a SAR image are subsumed under the term SAR interferometry.

Either the two SAR images are taken from two independent passes of a sensor over the object, resulting in repeat-pass interferometry, or the images are taken by two antennas mounted on a common platform, which is called single-pass interferometry. Examples of repeat-pass interferometry are interferograms derived from ERS-1 satellite data and ERS tandem interferometry using images taken by ERS-1 and ERS-2. Examples of single-pass interferometry are airborne systems consisting of two SAR antennas mounted on the body of an airplane and the planned Shuttle Radar Topography Mission (SRTM), where the second SAR antenna is attached to a 60 m long telescope mast which is unfolded as soon as the shuttle reaches its final orbit. For spaceborne repeat-pass interferometry the length of the baseline is in the range of hundreds of meters, and the distance between object and sensor is about 830 km in the case of ERS-SAR. For airborne single-pass interferometry the baseline length is several meters, whereas the range is, e.g., several kilometers.

Figure 1 illustrates the determination of the object space coordinates $p^T = (p_x\ p_y\ p_z)$ of a point by SAR interferometry. The measurements determining an object point, i.e. range, Doppler frequency and two phase values, are taken when the point is located in the centers of the radar beams. Given the position $s_1$ of the first sensor, the range $r_1$ determines $p$ in the viewing direction, also called range direction. The relation between $r_1$ and the unknown object space coordinates is expressed by the range equation

$$r_1 = |p - s_1|. \qquad (1)$$

In flight direction, usually called azimuth direction, the point position is determined by measurement of the Doppler frequency $f$ when the object point is located in the center of the radar beam or, expressed geometrically more vividly, by measurement of the angle $\gamma$ between flight and viewing direction. This is expressed by the Doppler equation

$$f \approx \frac{2\,|v_1| \cos\gamma}{\lambda} = \frac{2\, v_1 \cdot (p - s_1)}{\lambda\, r_1} \qquad (2)$$

where $v_1$ is the velocity vector of the first sensor. The approximation means that the hyperboloid on which a point $p$ is located when it is observed under Doppler frequency $f$ is replaced by a cone. It is particularly harmless as the SAR systems regularly approximate Doppler frequency $f = 0$, where hyperboloid as well as cone degenerate to a plane. More details on this topic can be found in (Kuchling, 1979, p. 151f.). In the direction orthogonal to viewing and flight direction, $p$ is positioned by the interferometric phase, i.e. the difference of the phases of the first and the second SAR image, which is a function of the difference of the ranges from the object point to the sensors. The interferometric phase equation is

$$\Phi = \frac{4\pi}{\lambda} \Delta r + \Phi_c = \frac{4\pi}{\lambda} \left( |p - s_1| - |p - s_2| \right) + \Phi_c \qquad (3)$$

where $\Phi$ is the unwrapped interferometric phase, $\lambda$ the radar wavelength, $\Delta r$ is the difference of the two ranges from the sensor positions to the object point, $\Phi_c$ is the phase integration constant solving the phase ambiguity remaining after phase unwrapping, and $s_2$ is the position of the second SAR sensor.
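To make the three observation equations concrete, the following Python sketch evaluates Equations (1) to (3) for a single object point. It is a minimal illustration only; all numerical values (sensor positions, velocity, wavelength) are invented for demonstration and do not describe a real configuration.

```python
import numpy as np

# Minimal forward model of Equations (1)-(3); all values are illustrative only.
lam = 0.0566                              # wavelength [m], roughly C-band
s1  = np.array([0.0, 0.0, 800000.0])      # position of the first sensor [m]
s2  = s1 + np.array([50.0, 0.0, 80.0])    # position of the second sensor (short baseline)
v1  = np.array([0.0, 7500.0, 0.0])        # velocity vector of the first sensor [m/s]
p   = np.array([300000.0, 100.0, 500.0])  # object point [m]

r1  = np.linalg.norm(p - s1)                               # range equation (1)
f   = 2.0 * np.dot(v1, p - s1) / (lam * r1)                # Doppler equation (2)
phi = 4.0 * np.pi / lam * (r1 - np.linalg.norm(p - s2))    # phase equation (3), with phi_c = 0

print(f"r1 = {r1:.1f} m, f = {f:.2f} Hz, phi = {phi:.1f} rad")
```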

Figure 1: Geometric principle of SAR interferometry. The sketch shows the sensor positions $s_1$ and $s_2$, the velocity vector $v_1$, the object point $p$, the vector $p - s_1$, the range $r_1$, the range difference $\Delta r$, the angle $\gamma$, and the coordinate axes $X$, $Y$, $Z$.

The general geometric task of SAR interferometry is the determination of the object point coordinates using the above equations. As the baseline in SAR interferometry is extremely short in comparison with the distance of the sensors from the object, the relative position in viewing direction of the second sensor with respect to the first one, i.e. the parallel baseline component, has to be known with a standard deviation in the 1 mm range. Presently, this requirement cannot be fulfilled by orbit determination methods. This is why at least one control point is needed to geocode the interferogram. It is also used to determine the phase integration constant $\Phi_c$.

2 INTERFEROGRAM GENERATION

The steps necessary for processing and geocoding of spaceborne repeat-pass interferograms are listed in Figure 2. Input to the procedure are two complex-valued SAR images containing intensities and phases as well as orbit data describing the positions and flight directions of the sensor.

Figure 2: Interferometric processing. The flowchart leads the two complex-valued SAR images and the orbit data through common-band filtering, image co-registration, resampling of image 2, interferogram generation, phase unwrapping, and geocoding, yielding a DEM and coherence data.

As demonstrated by Figure 3, the SAR sensor transforms object spectra to data spectra. The data frequency resulting from a certain object frequency depends on the incidence angle and direction of the SAR signal. As the two SAR images used to generate the interferogram are taken with (slightly) different incidence angles, an object frequency belongs to different frequencies in the data spectra. As the SAR sensor images only those object frequencies which are located in the sensor-specific frequency band, both data spectra contain partially different object information that does not overlap in the object spectra. By common-band filtering those non-overlapping frequency bands are filtered out from both SAR images. This limits the information used for interferogram generation to the frequency bands containing the same object information (Gatelli et al., 1994, Geudtner, 1995).

After common-band filtering, the images are co-registered with sub-pixel accuracy, e.g. with an accuracy of 0.1 pixels. As the imaging geometries of both images are very similar in comparison with stereo photogrammetry, fulfilling this requirement is usually easy. Then the second SAR image is resampled in the

Figure 3: Frequency shift caused by varying incidence angles.

geometry of the first one using a suitable interpolation method such as Shannon or bi-cubic polynomial interpolation. When both SAR images resemble the same geometry and contain completely overlapping object spectra, the interferogram is computed according to

$$I = V_1 \cdot V_2^{*} = |V_1| \cdot |V_2| \cdot e^{j\Phi}$$

where $\Phi = \varphi_1 - \varphi_2$ is the interferometric phase, $V_1$ is the complex signal (amplitude and phase) of image 1 (master), $V_2$ is the complex signal of image 2 (slave), and the $*$ indicates the conjugate of a complex variable. A final interferogram is computed by subtracting a simulated interferogram, e.g. the interferogram of a tangential plane to the geoid, the so-called flat earth, from the measured interferogram.
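A minimal sketch of this step is given below: it forms the complex interferogram from two co-registered complex images and removes a simulated flat-earth interferogram by complex-conjugate multiplication. The synthetic images and the phase ramp standing in for the flat-earth contribution are assumptions made only for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
rows, cols = 256, 256

# Synthetic stand-ins for two co-registered complex SAR images.
V1 = rng.standard_normal((rows, cols)) + 1j * rng.standard_normal((rows, cols))
ramp = -0.05 * np.arange(cols) * np.ones((rows, 1))   # phase expected over a flat reference surface
V2 = V1 * np.exp(-1j * ramp)                          # image 2 differs by exactly that phase

# Complex interferogram I = V1 * conj(V2); its phase is phi1 - phi2.
I = V1 * np.conj(V2)

# Subtract the simulated flat-earth interferogram by multiplying with its conjugate.
I_flat  = np.exp(1j * ramp)
I_final = I * np.conj(I_flat)

print(np.abs(np.angle(I_final)).max())   # ~0: only the residual (here: no) topographic phase remains
```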

The interferogram contains phase information useful for the computation of the object space coordinates $p$ only if the scatterers on the surface of the earth have changed neither their backscattering behaviour nor their positions relative to each other. Whether this requirement for successful interferometric processing has been fulfilled can be checked using the coherence, which is a measure of the correspondence of both SAR images. It is estimated by window-based computation of the magnitude $|\gamma|$ of the complex cross-correlation coefficient of the SAR images

$$\gamma = \frac{\sum_{n=1}^{N} V_1^{(n)} \cdot V_2^{*(n)}}{\sqrt{\sum_{n=1}^{N} |V_1^{(n)}|^2 \cdot \sum_{n=1}^{N} |V_2^{(n)}|^2}}$$

where $N$ is the number of pixels in the window. SAR coherence contains valuable information in addition to the intensity of the single SAR images, which can for instance be used for object extraction (Hellwich et al., 1998).
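The window-based estimate above can be sketched as follows. The window size and the use of a simple moving-average filter are assumptions of this illustration; since the common factor $N$ cancels in the ratio, window means can be used instead of window sums.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def _boxmean(x, size):
    # Moving-window mean, applied to real and imaginary parts separately.
    return uniform_filter(x.real, size) + 1j * uniform_filter(x.imag, size)

def coherence(V1, V2, size=5):
    """Window-based magnitude of the complex cross-correlation coefficient."""
    num = _boxmean(V1 * np.conj(V2), size)
    den = np.sqrt(_boxmean(np.abs(V1) ** 2, size).real *
                  _boxmean(np.abs(V2) ** 2, size).real)
    return np.abs(num) / np.maximum(den, 1e-12)
```

For the synthetic image pair of the previous sketch, coherence(V1, V2) is close to 1 everywhere, as expected when the underlying scatterers are identical.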

The phase information of the complex interferogram, i.e. the interferometric phase, is ambiguous as it is limited to the interval $[0, 2\pi)$, so that range differences cannot be inferred directly. The ambiguities can be solved by integrating the phase differences $\Delta\Phi = \Phi_{i+1} - \Phi_i$ between neighbouring pixels $i$, $i+1$. This integration is called phase unwrapping. It is possible without any problems when the differences of the ranges between neighbouring object points, corresponding to the pixels, and the sensor positions do not exceed half a wavelength. Then the magnitudes of the true phase differences between the object points are smaller than $\pi$. This is the case when the surface of the earth is sufficiently smooth and the sampling of the surface by the interferogram is sufficiently dense.
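In one dimension this integration is exactly what numpy's unwrap routine performs. The following toy example, with an invented smooth phase signal, illustrates that unwrapping succeeds as long as the true differences stay below $\pi$.

```python
import numpy as np

true_phase = np.linspace(0.0, 12.0, 200)      # smooth "true" phase [rad], differences << pi
wrapped    = np.mod(true_phase, 2.0 * np.pi)  # observed interferometric phase in [0, 2*pi)

unwrapped = np.unwrap(wrapped)                # integrates differences wrapped into (-pi, pi]
print(np.allclose(unwrapped, true_phase))     # True for this well-sampled example
```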

As shown in Figure 4 these requirements are not necessarily fulfilled: for instance, in a foreshortening area neighbouring pixels may have phase differences larger than $\pi$. If the shown configuration occurs in only one row of the image, then phase measurements as shown in the example of Figure 5 are typical. The phase differences between neighbouring pixels are computed from the interferometric phases according to

$$\Delta\Phi = W(\Phi_{i+1} - \Phi_i)$$

where $W$ is the so-called wrapping operator, which transforms a phase difference into the interval $(-\pi, \pi]$ by adding or subtracting an integer multiple of $2\pi$. To investigate the integrability of the phase differences, they are added over every four directly neighbouring pixels on a closed square loop. Sometimes the sum is not $0$, but $2\pi$ or $-2\pi$. Depending on the sign of the sum, those locations are called positive or negative residues. Residues either have topographical causes such as foreshortening areas (as used in the above example) or layover areas, or they are related to noise such as sensor noise or lack of coherence of the SAR image pair.
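The residue test can be written compactly as below. This is a schematic implementation for illustration, following the wrapping operator $W$ and the 2x2 loop sums described above; it is not the phase unwrapping code of any particular processor.

```python
import numpy as np

def wrap(x):
    """Wrapping operator W: map values into the interval (-pi, pi]."""
    return -((-x + np.pi) % (2.0 * np.pi) - np.pi)

def residues(phi):
    """Sum wrapped phase differences around every elementary 2x2 pixel loop.

    Returns +1 for positive residues, -1 for negative residues, 0 elsewhere."""
    d_row = wrap(phi[1:, :] - phi[:-1, :])   # differences between vertically adjacent pixels
    d_col = wrap(phi[:, 1:] - phi[:, :-1])   # differences between horizontally adjacent pixels
    loop = d_col[:-1, :] + d_row[:, 1:] - d_col[1:, :] - d_row[:, :-1]
    return np.rint(loop / (2.0 * np.pi)).astype(int)
```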

Whenever residues occur the phase differences are not directly integrable, and special measures have to be taken. A common way to treat residues in the phase unwrapping process is to connect an equal number of positive and negative residues by lines which are called cuts, and to introduce the rule that cuts must not be crossed when computing unwrapped interferometric phases from wrapped phase differences. Such a group of residues connected by cuts is considered discharged. When it is circumvented during the integration of phase differences, the residues no longer pose a numerical problem. Especially when there are many residues, the decision about the location of cuts is a difficult task. Usually, algorithms either try to find cuts having a minimum overall length, or other features of

Figure 4: Foreshortening area causing phase unwrapping problems, as the phase difference between the neighbouring pixels $p_i$ and $p_{i+1}$ is slightly larger than $\pi$.

Figure 5: Residues causing phase unwrapping problems. The unit of the $\Phi$ and $\Delta\Phi$ values is "number of wavelengths". The positive residue is marked by a plus sign, the negative residue by a minus sign.

the interferogram, such as layover, shadow or incoherent areas, are preferred as positions for cuts. During the past ten years phase unwrapping has been one of the major research areas in SAR interferometry, and several phase unwrapping methods have been developed (Goldstein et al., 1988, Ghiglia and Romero, 1994, Schwäbisch, 1995, Flynn, 1996, Bamler et al., 1998, Costantini, 1998, Hellwich, 1998, Davidson and Bamler, 1999, Zebker and Chen, 1999). Despite remarkable progress, especially in understanding the problem of phase unwrapping, the development has not reached a final state yet.

After phase unwrapping the interferogram is geocoded, i.e. it is transformed into object space. Though a variety of approaches to interferogram geocoding have been taken (Small et al., 1993, Small et al., 1995, Schwäbisch, 1995, Schwäbisch, 1997, Madsen et al., 1993, Crosetto, 1998), the computation of object space coordinates is similar in principle and based on Equations (1) to (3). (Hellwich and Ebner, 1998) propose a geocoding method using least squares adjustment with a Gauss-Markov model. The observation equations are the range equation (1), the Doppler equation (2), and the phase equation (3), complemented by equations for the first and the second sensor position as well as the velocity at the first sensor position at the times of the acquisitions of the individual pixels. Furthermore, control point equations are introduced, and bias and drift equations are used to model the sensor flight paths. Generally, it would be possible to use two range and two Doppler equations for each pixel. Yet in the proposed approach, for reasons of computational efficiency, only one range and one Doppler equation enter the adjustment. In this way it is possible to limit the adjustment to control points only. The results of the adjustment are the phase constant $\Phi_c$ and bias and drift parameters refining the interferometric baseline for the complete flight path. Then the individual pixels are geocoded using the refined baselines in a Newton-Raphson approach which needs only a very limited amount of computational resources.
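The per-pixel geocoding step can be sketched as a small Newton-Raphson iteration on Equations (1) to (3). The function below is an illustrative, simplified version with a numerical Jacobian, no refined-baseline bookkeeping, and an initial guess supplied by the caller; it is not the adjustment of (Hellwich and Ebner, 1998).

```python
import numpy as np

def geocode_pixel(r1, f, phi, s1, s2, v1, lam, p0, phi_c=0.0, iters=15):
    """Solve Equations (1)-(3) for the object point p by Newton-Raphson iteration.

    p0 is a rough initial position, e.g. from a reference surface; all other
    quantities are assumed to be consistent observations and orbit data."""
    def residual(p):
        d1, d2 = p - s1, p - s2
        n1, n2 = np.linalg.norm(d1), np.linalg.norm(d2)
        return np.array([
            n1 - r1,                                        # range equation (1)
            2.0 * np.dot(v1, d1) / (lam * n1) - f,          # Doppler equation (2)
            4.0 * np.pi / lam * (n1 - n2) + phi_c - phi,    # phase equation (3)
        ])

    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        F = residual(p)
        J = np.empty((3, 3))
        for k in range(3):                                  # forward-difference Jacobian, 1 m step
            dp = np.zeros(3)
            dp[k] = 1.0
            J[:, k] = (residual(p + dp) - F) / 1.0
        p = p - np.linalg.solve(J, F)
    return p
```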

The least squares adjustment approach makes it possible to investigate the accuracy of the computed object space coordinates. For instance, it can be shown how errors in the range, Doppler and interferometric phase observations are propagated. Figure 6 shows the variance-covariance matrix of a resulting object point in the form of an error ellipsoid under the assumption that all observations except range, Doppler frequency and interferometric phase are free of errors. In viewing direction the error ellipsoid is determined by the range error, in flight direction by the Doppler frequency error, and in the direction orthogonal to range and flight direction by the interferometric phase error. Furthermore, it is interesting to note that the introduction of control points into interferometric processing does not improve the absolute accuracy of the flight paths, but only the baseline in viewing direction, i.e. the relative accuracy of one coordinate of the flight path with respect to the other flight path.

Once the pixels are geocoded, they are transformed from, e.g., a geocentric Cartesian coordinate system into a map coordinate system. In this coordinate system the data are resampled to a regular grid, i.e. a raster-based DEM.

Further detailed descriptions of SAR interferometry are given by (Zebker and Goldstein, 1986, Gabriel and Goldstein, 1988, Leberl, 1990, Li and Goldstein, 1990, Prati and Rocca, 1990, Hartl, 1991, Lin et al., 1992, Rodriguez and Martin, 1992, Hartl and Thiel, 1993, Hartl and Xia, 1993, Madsen et al., 1993, Massonnet and Rabaute, 1993, Prati et al., 1994, Zebker et al., 1994, Gens and Genderen, 1996, Bamler and Hartl, 1998).

Figure 6: Accuracy of point positioning as a function of the accuracies of range, Doppler frequency and interferometric phase. Elevation and base graphs are based on ERS-SAR parameters, a baseline length $B = 100$ m, and standard deviations $\sigma_r = 7$ m, $\sigma_f = 2$ Hz, $\sigma_\Phi = 18°$. The ellipsoid is plotted to scale, with axes in meters; the y-axis is parallel to the flight path, the z-axis normal to the surface of the earth.

3 CURRENT ISSUES

In this section, currently important developments in research and application of SAR interferometry are briefly discussed.

3.1 Differential Interferometry

The goal of differential interferometry is the measurement of displacements of parts of the earth's surface between two passes of a SAR sensor, i.e. differential interferometry is a special form of repeat-pass interferometry. Generally formulated, the unwrapped phase of a repeat-pass interferogram is a function of topography, surface displacement, atmospheric effects and measurement errors (cf. (Wegmüller and Strozzi, 1998)):

$$\Phi_{unw} = \Phi_{top} + \Phi_{dis} + \Phi_{atm} + \Phi_{err} \qquad (4)$$

The topographic term $\Phi_{top}$ depends on surface topography and baseline geometry. The displacement term $\Phi_{dis}$ shows the displacement of the surface of the earth which does not result in any incoherence between the acquisition times of the SAR images. This is the case when all scatterers of several pixels take part in the same movement. The atmospheric term $\Phi_{atm}$ contains the phase differences due to changes of the permittivity of the atmosphere (cf. Section 3.2). The error term $\Phi_{err}$ accounts for sensor noise, sensor position and baseline modelling errors, and effects of incoherence between the SAR scenes, e.g. random displacements of the scatterers.

Under the assumption that $\Phi_{atm}$ and $\Phi_{err}$ are negligible (cf. Section 3.2), the first general task of differential interferometry is to separate $\Phi_{top}$ from $\Phi_{dis}$. This is done with the help of a DEM from which $\Phi_{top}$ is computed using the flight path parameters, i.e. a simulation-like operation. Then $\Phi_{dis}$ results from a subtraction of $\Phi_{top}$ from $\Phi_{unw}$. The DEM can be generated with the help of SAR interferometry as well. In this case it is advantageous when displacements do not occur between the acquisition times of the SAR images used to derive the DEM. Concerning baseline geometry, it is advantageous to use large baselines to infer surface topography, and zero baselines to measure surface displacements.
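In code, this separation and the conversion of the displacement phase into a line-of-sight displacement reduce to a few lines, assuming the atmospheric and error terms of Equation (4) are indeed negligible. The file names below are hypothetical placeholders for an unwrapped interferogram and a topographic phase simulated from a DEM.

```python
import numpy as np

lam = 0.0566                                   # wavelength [m], C-band assumed for illustration

phi_unw = np.load("phi_unwrapped.npy")         # unwrapped repeat-pass phase (hypothetical file)
phi_top = np.load("phi_topo_simulated.npy")    # phase simulated from a DEM and the flight paths

phi_dis = phi_unw - phi_top                    # displacement term of Equation (4)
d_los   = lam / (4.0 * np.pi) * phi_dis        # displacement along the line of sight [m]
```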

The second general task of differential interferometry is to compute surface displacements from $\Phi_{dis}$. This is illustrated by Figure 7. Here it is assumed that the surface of the earth is completely flat and horizontal, i.e. that $\Phi_{top}$ behaves very systematically. If the interferogram containing the displacement term $\Phi_{dis}$ were processed using a standard processing scheme (cf. Section 2), $\Phi_{dis}$ would be misinterpreted as $\Phi_{top}$, and a non-flat, false terrain surface would be derived. That is, if in Figure 7 surface patch A moved to location B between the acquisition times of the SAR sensor, its apparent position would be C, which according to the SAR imaging geometry is a location many times further away from A than B.

Figure 7: Differential SAR interferometry. The sketch shows the original terrain with the surface patch locations A, B, C and D, and the apparent terrain derived when the displacement phase is interpreted as topography.

If the direction of surface displacement is not known, it cannot be determined unambiguously by a single differential interferometric measurement without any further assumptions. If, as in Figure 7, the displacement were assumed to happen in viewing direction, the new location of surface patch A would be D. More reasonable assumptions would probably be an exclu-

Figure 11: Glacier flow measured by differential SAR interferometry. a) Bagley ice field SAR image; b) fringe image caused by ice motion. © D. R. Fatland (http://www.slm.wau.nl/whh/rs/SAR-faq/fatland2.html).

Figure 12: Effect of a cold front. © R. Hanssen (Hanssen et al., 1999). The figure combines an interferogram annotated with slant delay [mm] and a weather radar image of the IJssel Lake area annotated with rain rate [mm/hr] and weather station readings; the panels are marked A and B, with a 10 km scale bar and north arrow.

Page 7: SAR INTERFEROMETRY: PRINCIPLES, PROCESSING, AND ... › cd80 › de0513a980b... · 2 is the position of the sec-ond SAR sensor. r1 γ ∆r Z Y X 2 1 v 1 p-s p s 1 s Figure 1: Geometric

sively vertical displacement, as in mining areas, or a displacement along the surface of the imaged object, e.g. in the case of a glacier sliding downhill.

Figure 11 shows an example of glacier flow measurement on the Bagley ice field in Southern Alaska using differential interferometry (Fatland and Lingle, 1998). Figure 11 a) is the SAR intensity image. The wide homogeneous area in the center of the image is the Bagley ice field; the narrower homogeneous area in the upper part of the image is Jeffries glacier. Figure 11 b) contains the differential interferogram after subtraction of the topographic contribution to the phase. The homogeneously red areas are stable mountains and ice field areas. It can easily be seen that Jeffries glacier is moving fast in comparison with the surface of the Bagley ice field. By counting interferometric fringes starting at a stable zone, the ice velocity can be estimated. Here, one fringe is equivalent to an ice velocity of approximately 10 m per year.

3.2 Atmospheric Effects on Spaceborne Repeat-Pass Interferometry

After the methods of SAR interferometry had been developed and problems such as phase unwrapping had been solved to a satisfactory degree, the products of SAR interferometry, in particular DEM, were validated. Only then did attention turn to atmospheric effects as a source of errors for DEM derived by spaceborne repeat-pass SAR interferometry. It was shown that temporal and spatial variations of the refractive index of the atmosphere result in considerable variations of the interferometric phase. In a recent study, (Hanssen, 1998) showed that these influences account for variations of 0.3 to 2.3 phase cycles over a significant part of the interferogram. These values correspond very well with the findings of others. This means that for an ERS interferogram with a perpendicular baseline of 100 m, height errors of up to roughly 100 m to 200 m can be caused by atmospheric effects. According to (Hanssen, 1998) the driving mechanisms of this phenomenon are localized changes in the refractive index, mainly due to varying humidity in the troposphere.
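These numbers can be checked with the common height-of-ambiguity rule of thumb for repeat-pass interferometry, $h_a = \lambda R \sin\theta / (2 B_\perp)$, which is not derived in this paper; the incidence angle used below is an assumed typical ERS value.

```python
import numpy as np

lam    = 0.0566              # ERS wavelength [m]
R      = 830e3               # sensor-object distance [m], value quoted in the text
theta  = np.deg2rad(23.0)    # assumed ERS incidence angle
B_perp = 100.0               # perpendicular baseline [m]

h_a = lam * R * np.sin(theta) / (2.0 * B_perp)   # height per interferometric fringe [m]
for cycles in (0.3, 2.3):
    print(f"{cycles} phase cycles correspond to roughly {cycles * h_a:.0f} m of height error")
```

With these assumptions one fringe corresponds to roughly 90 m of height, so 0.3 to 2.3 cycles translate into height errors from a few tens of meters up to about 200 m, consistent with the figures quoted above.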

It is often recommended to overcome atmospheric effects in spaceborne repeat-pass interferometry by averaging the results of several interferograms. As some of the atmospheric events, e.g. thunder- or rainstorms, have very distinctive effects which cannot be averaged out easily, this does not provide a panacea. Rather, it seems more reasonable to select only those SAR images for DEM generation which were taken in periods without events resulting in severe local atmospheric disturbances. In order to be able to do this, meteorological data such as weather charts, weather radar data, and synoptic observations have to be evaluated before deciding about the use of particular SAR images.

These suggestions show that spaceborne repeat-pass interferometry only results in reliable DEM when atmospheric conditions are considered. On the other hand, if accurate DEM are available, the atmospheric effects can be regarded as a signal to be investigated (Hanssen, 1998, Hanssen et al., 1999). In this sense, spaceborne SAR interferometry provides data to study the atmospheric water and water vapor distribution. Never before has such data been available at comparably high resolution. An important aspect of future research in this area is the separation of topography, displacements and atmospheric effects from each other (cf. Section 3.1).

Figure 12 A) shows the effect of a cold front on a SAR interferogram (Hanssen et al., 1999). The cold front is visible as the linear structure between the arrows. Figure 12 B) is the weather radar image taken with a time difference of only 4 minutes to the acquisition time of one of the SAR scenes. The weather radar stations are indicated by the yellow circles. A half wind barb shown at the location of a weather station corresponds to a wind velocity of 2.5 m/s and a full barb to 5 m/s. Temperature, percentage of relative humidity, and pressure are plotted beside the stations. This example shows the high degree of correlation between atmospheric conditions and SAR interferograms. Note that the atmospheric signal in the interferogram refers to the differences of the atmosphere between the acquisition dates of the SAR scenes. In the example shown, the atmosphere at the second scene's acquisition was free of disturbances.

3.3 Spaceborne Single-Pass Interferometry

By the end of the year 1999 the Shuttle Radar Topography Mission (SRTM) will take place. For the first time in history, single-pass interferometry will be applied from a spaceborne platform. The goal of the mission is the generation of a digital elevation model homogeneously covering the land masses of the earth between 60° North and 58° South. SAR data will be acquired by both the SIR-C and X-SAR sensors. The second antenna of each sensor system will be mounted on a 60 m long telescope mast which is unfolded as soon as the shuttle reaches its final orbit. A major technical problem, besides the unfolding of the longest structure ever applied in space, is the calibration of the signals received by the second antennas with respect to the first ones, as the signals are sent along the mast through cables which are subject to large temperature differences. While the SIR-C sensor will completely cover the overflown land surface, the X-SAR sen-

sensor                     SIR-C           X-SAR
size of inboard antenna    12 m x 80 cm    12 m x 40 cm
size of outboard antenna   8 m x 80 cm     6 m x 40 cm
wavelength                 5.6 cm          3.1 cm
frequency                  5.3 GHz         9.6 GHz
horizontal resolution      30 m            30 m
relative height accuracy   10 m            6 m
swath width                225 km          50 km
mapped land surface        100%            40%

Table 1: Technical parameters of SIR-C and X-SAR (DLR, 1999).

sor will achieve a limited coverage of approximately 40% of the land surface during the eleven-day mission. Table 1 summarizes the technical parameters of both interferometric SAR sensors; Figure 8 shows an artist's view of the space shuttle mission (DLR, 1999). SRTM gets its practical importance from the fact that it will for the first time provide a global DEM with homogeneous quality. The DEM will foster macro- and mesoscale modelling of earth-system processes, e.g. in hydrology (Ludwig et al., 1999).

Figure 8: Artist's view of SRTM. © (DLR, 1999).

3.4 ScanSAR Interferometry

The ScanSAR modes of the Radarsat SAR and the Envisat ASAR instruments allow the surface of the earth to be imaged with a swath width of up to about 500 km and 400 km, respectively. This increase of coverage in comparison with the standard imaging modes is achieved at the expense of azimuth resolution. Figure 9 illustrates the ScanSAR principle. The ScanSAR swath is subdivided into several sub-swaths. After imaging a sub-swath with a certain number of pulses,

a so-called burst covering a small part of the synthetic aperture, the elevation angle of the radar beam is rapidly steered to a different sub-swath. After all sub-swaths have been treated, the process is repeated starting with the first sub-swath, sending another burst of pulses which still belongs to the same synthetic aperture as the first one. While the ScanSAR principle was developed approximately two decades ago (Tomiyasu, 1981), interferometry with ScanSAR data was proposed and tested only recently (Monti Guarnieri and Prati, 1996, Bamler et al., 1999, Monti Guarnieri and Rocca, 1999).

Figure 9: ScanSAR principle (flight path, bursts, and sub-swaths).

Interferometric processing of ScanSAR data is complicated compared with standard interferometric processing for two reasons. Firstly, the bursts used to image the sub-swaths, i.e. the azimuth scanning patterns, have to be synchronized for the two passes. This means that "the same point on ground must be imaged from the same along-track positions both times" (Bamler et al., 1999). Non-overlapping parts of the bursts have to be discarded before interferometric processing, or filtered out similarly to the filtering of common bands in standard interferometric processing.

Secondly, the critical baseline length is shorter for ScanSAR interferometry than for standard interferometry due to the reduced range bandwidth of ScanSAR. For instance, for the ASAR wide swath mode the critical baseline length is 400 m (Monti Guarnieri et al., 1998).

Neither problem exists for interferometry using one ScanSAR image and one standard image because

in this case the pulses of the standard image corresponding with the pulses of the ScanSAR bursts can be explicitly selected for processing (Bamler et al., 1999), and "zero-baseline steering" (Monti Guarnieri et al., 1998, Monti Guarnieri and Rocca, 1999) can be used to relax the baseline requirements to standard conditions.

The advantage of ScanSAR interferometry is that the same area can be observed more frequently using data acquired from different orbits, which optimizes the performance of continuous monitoring systems (Monti Guarnieri et al., 1998). In this case, it has to be taken into account that interferometry is only possible with data acquired from repeated orbits, i.e. from the same track. This means that in preparation for observing dynamic events on the surface of the earth, e.g. earthquakes, interferometric master images have to be acquired from all possible tracks, independent of the track which will finally be used to acquire the interferometric slave image, e.g. immediately after an earthquake.

3.5 Polarimetric Interferometry

Fully polarimetric SAR data can be evaluated interferometrically in the same way as standard SAR data. For this purpose, (Cloude and Papathanassiou, 1998) define a coherence matrix

$$\Gamma = \begin{pmatrix} V_1 \\ V_2 \end{pmatrix} \begin{pmatrix} V_1^{*T} & V_2^{*T} \end{pmatrix} = \begin{pmatrix} V_1 V_1^{*T} & V_1 V_2^{*T} \\ V_2 V_1^{*T} & V_2 V_2^{*T} \end{pmatrix} \qquad (5)$$

where

$$V = \begin{pmatrix} V_{hh} \\ V_{hv} \\ V_{vv} \end{pmatrix}$$

is the polarimetric scattering vector referring to a linear polarization basis, assuming that $V_{hv} = V_{vh}$ (Kim et al., 1999). The $T$ indicates the transpose of a matrix and the $*$ the complex conjugate. The off-diagonal terms of the coherence matrix provide the interferograms resulting from all combinations of polarization responses. The resulting variety of coherence data can be used, e.g., for land use classification purposes, as the interferometric combinations of different polarizations are influenced by different physical phenomena.
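A schematic construction of this matrix from two single-look scattering vectors is shown below; in practice the expectation is estimated by spatial averaging, which is omitted here, and the variable names and numbers are chosen for this illustration only.

```python
import numpy as np

def coherence_matrix(V1, V2):
    """Outer-product form of Equation (5) for one pixel (no spatial averaging)."""
    V = np.concatenate([V1, V2])         # stacked 6-element scattering vector
    return np.outer(V, np.conj(V))       # V V^{*T}

# Invented single-look scattering vectors (Vhh, Vhv, Vvv) for the two acquisitions.
V1 = np.array([1.0 + 0.2j, 0.1 - 0.1j, 0.8 + 0.3j])
V2 = np.array([0.9 + 0.1j, 0.1 + 0.0j, 0.7 + 0.4j])

Gamma = coherence_matrix(V1, V2)
interferograms = Gamma[:3, 3:]           # off-diagonal block V1 V2^{*T}: interferometric
                                         # combinations of all polarization responses
```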

(Cloude and Papathanassiou, 1998) provide a method to transform the coherence matrix based on a linear polarization basis into a coherence matrix with an elliptical polarization basis maximizing the coherence of one of the interferograms. Applying this method, (Lopez-Sanchez et al., 1999) show with a laboratory experiment that the interferometric height measurement accuracy can be increased significantly when the interferogram with maximum coherence is used instead of the standard interferogram.

As interferograms derived from polarimetric data are related to different scattering mechanisms, e.g. on the ground and in the canopy, different layers of scatterers in vegetation can be extracted and vegetation height can be estimated, as visualized in Figure 10 (Cloude and Papathanassiou, 1998, Papathanassiou and Cloude, 1998). First practical results are shown by (Lopez-Sanchez et al., 1999) in the laboratory experiment mentioned above and by (Kim et al., 1999) for real data. (Treuhaft and Cloude, 1999) show that heights of oriented-vegetation volumes can be estimated using single-baseline polarimetric interferometry. They point out that at least two baselines are needed for randomly oriented volumes.

Figure 10: Perspective view of height differences between effective scattering centers of interferograms with different polarization bases. © K. Papathanassiou (Cloude and Papathanassiou, 1998).

4 OUTLOOK

During the last decade SAR interferometry has undergone a tremendous development from a newly suggested research topic to an elaborate technique applied operationally in practice. As the section on current issues has shown, this vivid development continues, suggesting that SAR interferometry will remain one of the most rewarding research areas in remote sensing. A visionary future goal would for instance be the development of a spaceborne high-resolution

polarimetric interferometric SAR with multibaseline and multifrequency capabilities which would allow for continuous (i.e. any-time) monitoring of the earth's surface with a high discriminative power for automatic object extraction. A prototype of such a system could possibly be implemented in the framework of the international space station.

ACKNOWLEDGMENTS

The author thanks D. R. Fatland, R. Hanssen, K. Papathanassiou, and DLR for providing illustrations of differential, atmospheric, polarimetric, and SRTM interferometry, respectively.

REFERENCES

Bamler, R., Adam, N., Davidson, G. W. and Just, D., 1998. Noise-Induced Slope Distortion in 2-D Phase Unwrapping by Linear Estimators with Application to SAR Interferometry. IEEE Transactions on Geoscience and Remote Sensing 36(3), pp. 913–921.

Bamler, R. and Hartl, P., 1998. Synthetic Aperture Radar Interferometry. Inverse Problems 14(4), pp. R1–R54.

Bamler, R., Geudtner, D., Schättler, B., Vachon, P., Steinbrecher, U., Holzner, J., Mittermayer, J., Breit, H. and Moreira, A., 1999. RADARSAT ScanSAR Interferometry. In: International Geoscience and Remote Sensing Symposium 99, Hamburg, IEEE.

Cloude, S. R. and Papathanassiou, K. P., 1998. Polarimetric SAR Interferometry. IEEE Transactions on Geoscience and Remote Sensing 36(5), pp. 1551–1565.

Costantini, M., 1998. A Novel Phase Unwrapping Method Based on Network Programming. IEEE Transactions on Geoscience and Remote Sensing 36(3), pp. 813–821.

Crosetto, M., 1998. Interferometric SAR for DEM Generation: Validation of an Integrated Procedure Based on Multisource Data. Doctorate thesis, Politecnico di Milano, Geodetic and Surveying Sciences, Milano.

Davidson, G. W. and Bamler, R., 1999. Multiresolution Phase Unwrapping for SAR Interferometry. IEEE Transactions on Geoscience and Remote Sensing 37(1), pp. 163–174.

DLR, 1999. X-SAR – Die Erde im Visier. Daten für die dreidimensionale Weltkarte. Deutsches Zentrum für Luft- und Raumfahrt e.V., Presse- und Öffentlichkeitsarbeit, Köln.

Fatland, D. R. and Lingle, C. S., 1998. Analysis of the 1993-95 Bering Glacier (Alaska) Surge Using Differential SAR Interferometry. Journal of Glaciology 44(148), pp. 532–546.

Flynn, T. J., 1996. Consistent 2-D Phase Unwrapping Guided by a Quality Map. In: International Geoscience and Remote Sensing Symposium 96, Lincoln, IEEE, pp. 2057–2059.

Gabriel, A. K. and Goldstein, R. M., 1988. Crossed Orbit Interferometry: Theory and Experimental Results from SIR-B. Int. J. Remote Sensing 9(5), pp. 857–872.

Gatelli, F., Monti Guarnieri, A., Parizzi, Pasquali, Prati and Rocca, 1994. The Wavenumber Shift in SAR Interferometry. IEEE Transactions on Geoscience and Remote Sensing 32(4), pp. 855–865.

Gens, R. and Genderen, J. L. v., 1996. SAR Interferometry - Issues, Techniques, Applications. International Journal of Remote Sensing 17(10), pp. 1803–1835.

Geudtner, D., 1995. Die interferometrische Verarbeitung von SAR-Daten des ERS-1. Dissertation, Universität Stuttgart. DLR Forschungsbericht 95-28.

Ghiglia, D. C. and Romero, L. A., 1994. Robust Two-Dimensional Weighted and Unweighted Phase Unwrapping that Uses Fast Transforms and Iterative Methods. Journal of the Optical Society of America (A) 11(1), pp. 107–117.

Goldstein, R. M., Zebker, H. A. and Werner, C. L., 1988. Satellite Radar Interferometry: Two-Dimensional Phase Unwrapping. Radio Science 23(4), pp. 713–720.

Hanssen, R., 1998. Atmospheric Heterogeneities in ERS Tandem SAR Interferometry. DEOS Report 98.1, Technische Universiteit Delft, Faculteit der Geodesie, Thijsseweg 11, Postbus 5030, 2600 GA Delft, Netherlands.

Hanssen, R. F., Weckwerth, T. M., Zebker, H. A. and Klees, R., 1999. High-Resolution Water Vapor Mapping from Interferometric Radar Measurements. Science 283, pp. 1297–1299.

Hartl, P., 1991. Application of Interferometric SAR-Data of the ERS-1-Mission for High Resolution Topographic Terrain Mapping. Geo-Informations-Systeme 4(2), pp. 8–14.

Hartl, P. and Thiel, K.-H., 1993. Bestimmung von topographischen Feinstrukturen mit interferometrischem ERS-1-SAR. Zeitschrift für Photogrammetrie und Fernerkundung 61(3), pp. 108–114.

Hartl, P. and Xia, Y., 1993. Besonderheiten der Datenverarbeitung bei der SAR-Interferometrie. Zeitschrift für Photogrammetrie und Fernerkundung 61(6), pp. 214–222.

Hellwich, O., 1998. SAR Phase Unwrapping Using Adaptive Recursive Smoothing. In: International Archives of Photogrammetry and Remote Sensing, Vol. (32) 3/1, pp. 492–500.

Hellwich, O. and Ebner, H., 1998. A New Approach to SAR Interferogram Geocoding. In: International Archives of Photogrammetry and Remote Sensing, Vol. (32) 1, pp. 108–112.

Hellwich, O., Laptev, I. and Mayer, H., 1998. Automated Pipeline Extraction from Interferometric SAR Data of the ERS Tandem Mission. In: International Archives of Photogrammetry and Remote Sensing, Vol. (32) 7, pp. 532–537.

Kim, Y., van Zyl, J. and Chu, A., 1999. Preliminary Results of Single Pass Polarimetric SAR Interferometry. In: International Geoscience and Remote Sensing Symposium 99, Hamburg, IEEE.

Kuchling, H., 1979. Physik. Formeln und Gesetze. Buch- und Zeit-Verlagsgesellschaft, Köln.

Leberl, F. W., 1990. Radargrammetric Image Processing. Artech House, Norwood, MA.

Li, F. K. and Goldstein, M., 1990. Studies of Multibaseline Spaceborne Interferometric Synthetic Aperture Radars. IEEE Transactions on Geoscience and Remote Sensing 28(1), pp. 88–97.

Lin, Q., Vesecky, J. F. and Zebker, H. A., 1992. Registration of Interferometric SAR Images. In: IGARSS 1992, Proc., pp. 1579–1581.

Lopez-Sanchez, J. M., Sagues, L., Fortunes, J., Fabregas, X., Broquetas, A., Sieber, A. J. and Cloude, S. R., 1999. Laboratory Experiments of Polarimetric Radar Interferometry: DEM Generation and Vegetation Height Estimation. In: International Geoscience and Remote Sensing Symposium 99, Hamburg, IEEE.

Ludwig, R., Hellwich, O., Strunz, G., Roth, A. and Eder, K., 1999. Applications of Digital Elevation Models from SAR Interferometry for Hydrologic Modelling. Submitted to Photogrammetrie, Fernerkundung, Geoinformation.

Madsen, S. N., Zebker, H. A. and Martin, J., 1993. Topographic Mapping Using Radar Interferometry: Processing Techniques. IEEE Transactions on Geoscience and Remote Sensing 31(1), pp. 246–256.

Massonnet, D. and Rabaute, T., 1993. Radar Interferometry: Limits and Potential. IEEE Transactions on Geoscience and Remote Sensing 31(2), pp. 455–464.

Monti Guarnieri, A. and Prati, C., 1996. ScanSAR Focusing and Interferometry. IEEE Transactions on Geoscience and Remote Sensing 34(4), pp. 1029–1038.

Monti Guarnieri, A. and Rocca, F., 1999. Combination of Low- and High-Resolution SAR Images for Differential Interferometry. IEEE Transactions on Geoscience and Remote Sensing 37(4), pp. 2035–2049.

Monti Guarnieri, A., Prati, C., Rocca, F. and Desnos, Y.-L., 1998. Wide Baseline Interferometry with Very Low Resolution SAR Systems. In: EUSAR '98 European Conference on Synthetic Aperture Radar, VDE-Verlag GmbH, Berlin, Offenbach, pp. 361–364.

Papathanassiou, K. P. and Cloude, S. R., 1998. Phase Decomposition in Polarimetric SAR Interferometry. In: International Geoscience and Remote Sensing Symposium 98, Seattle, IEEE.

Prati, C. and Rocca, F., 1990. Limits to the Resolution of Elevation Maps from Stereo SAR Images. Int. J. Remote Sensing 11(12), pp. 2215–2235.

Prati, C., Rocca, F. and Monti Guarnieri, A., 1994. Topographic Capabilities of SAR Exemplified with ERS-1. Geo-Informations-Systeme 7(1), pp. 17–23.

Rodriguez, E. and Martin, J. M., 1992. Theory and Design of Interferometric Synthetic Aperture Radars. IEE Proceedings-F 139(2), pp. 147–159.

Schwäbisch, M., 1995. Die SAR-Interferometrie zur Erzeugung digitaler Geländemodelle. Dissertation, Universität Stuttgart. DLR Forschungsbericht 95-25.

Schwäbisch, M., 1997. SAR-Interferometrie – Technik, Anwendungen, Perspektiven. Zeitschrift für Photogrammetrie und Fernerkundung 65(1), pp. 22–29.

Small, D., Werner, C. and Nuesch, D., 1993. Baseline Modelling for ERS-1 SAR Interferometry. In: International Geoscience and Remote Sensing Symposium 93, Tokyo, IEEE, pp. 1204–1206.

Small, D., Werner, C. and Nuesch, D., 1995. Geocoding and Validation of ERS-1 InSAR-Derived Digital Elevation Models. EARSeL Advances in Remote Sensing 4(2-X), pp. 26–39, I, II.

Tomiyasu, K., 1981. Conceptual Performance of a Satellite Borne Wide Swath Synthetic Aperture Radar. IEEE Transactions on Geoscience and Remote Sensing GE-19, pp. 108–116.

Treuhaft, R. N. and Cloude, S. R., 1999. The Structure of Oriented Vegetation from Polarimetric Interferometry. IEEE Transactions on Geoscience and Remote Sensing 37(5), pp. 2620–2624.

Wegmüller, U. and Strozzi, T., 1998. Characterization of Differential Interferometry Approaches. In: EUSAR '98 European Conference on Synthetic Aperture Radar, VDE-Verlag GmbH, Berlin, Offenbach, pp. 237–240.

Zebker, H. A. and Chen, C., 1999. Advances in Interferometric Phase Unwrapping: Network Flow Algorithms. In: International Geoscience and Remote Sensing Symposium 99, Hamburg, IEEE.

Zebker, H. A. and Goldstein, R. M., 1986. Topographic Mapping from Interferometric Synthetic Aperture Radar Observations. Journal of Geophysical Research 91(B5), pp. 4993–4999.

Zebker, H., Werner, C., Rosen, P. and Hensley, S., 1994. Accuracy of Topographic Maps Derived from ERS-1 Interferometric Radar. IEEE Transactions on Geoscience and Remote Sensing 32(4), pp. 823–836.

