Ross W. Leamer
Daniel A. Weber
Craig L. Wiegand*
USDA Agricultural Research Service
Weslaco, TX 78596

Pattern Recognition of Soils and Crops from Space†

Minimum-distance-to-mean and maximum-likelihood-ratio algorithms are both useful. The more variables used, the better the recognition results.
ABSTRACT: Some commonly used computer analysis techniques to get land use information from digitized aerial photographs are compared. Density readings of color-IR film and of multispectral black-and-white films are used to make the comparisons. All methods increased in accuracy as more densities were included in the data set. The highest accuracy was obtained if density values from both color film and from black-and-white films were combined into a single data set. Using arbitrary units from density measuring equipment gave as good results as converting to standard optical density units or to analytical density values.

* Soil Scientist; Mathematics teacher, Donna Independent School District, formerly Physical Science Technician; Research Leader; respectively.

† Contribution from Soil & Water Research, Southern Region, Agr. Res. Service, USDA, Weslaco, Texas. This study was supported in part by NASA under Contract No. R-09-038-002.

INTRODUCTION

THIS REPORT discusses the relative effectiveness of some of the commonly used computer analysis techniques to extract land use (crop identification) information from digitized aerial photographs. Comparisons between minimum distance to the mean (MDM) and maximum likelihood ratio (MLR) algorithms show that either can successfully recognize land-use patterns. The number and combination of densities chosen to represent the land-use categories affects the relative ranking of the two algorithms. Final classification accuracy was not affected by the density units in the base data. Arbitrary digital counts from a density measuring system were as good as standardized optical density units so long as the two were linearly related. Conversion of densities of color film to analytical densities degraded the classification accuracy more often than it improved the results from integral densities. Combining optical density values from black-and-white films and from color film into a single data set resulted in the highest correct identification (84.9 per cent for the MDM and 100 per cent for the MLR technique).
Aerial photographs have been a standard tool of geologists, geographers, foresters, engineers, etc., for many years. The utility of such images has been well established. Remote sensing in general, and the space satellites in particular, have greatly increased the interest in digital analysis and interpretation of aerial photographs.
Latham2 suggested in 1959 that electronic devices such as scanning densitometers could be used for the quantitative measurement and analysis of geographic features and
PHOTOGRAMMETRIC ENGINEERING & REMOTE SENSING, 1975
land use in photographs in a manner that would "permit the use of mechanical and/or electronic sorting and computing equipment for organizing and statistically evaluating the data." Rosenfeld5, using the methodology introduced by Latham, conducted research into the possibilities of using a flying-spot scanner in identifying terrain types recorded on black-and-white film.
Because of the tremendous amount of photography and other imagery being generated, only a fraction of the images are being fully interpreted. The computer is a vital component of most of the analysis and interpretation systems currently being used. More efficient and effective use of computers will allow more of the information in remote sensing imagery to be extracted.
This report discusses the relative effectiveness of some commonly used computer analysis techniques to extract land use (crop identification) information from digitized aerial photographs.
PROCEDURE
Multispectral terrain photographs of the Imperial Valley of California obtained by the Apollo-9 astronauts (experiment SO-65) served as the basic data set for this study. These data were used because they represent data generated by a low-resolution system, and thus they should be a rigid test for classification techniques. Extensive ground truth of the area collected by NASA and the University of Michigan has been reported by Spansail et al.6 The data set includes optical counts and integral and analytical film optical density readings from color-IR film (multiemulsion) and from multispectral black-and-white (multibase) films exposed simultaneously. The films were exposed in four 70-mm Hasselblad cameras with 80-mm focal-length lenses mounted to view the same area and connected so that the shutters were tripped simultaneously. Three of the cameras contained black-and-white films and had filters to obtain exposures to the green, the red, and the reflective infrared portions of the spectrum. The fourth camera contained color-infrared film which has three dye layers which are sensitive to approximately the same wavelength bands as those passing the filters on the black-and-white cameras.
Optical densities of specific areas having known ground truth were measured on the films by a Joyce, Loebl microdensitometer.*
* Trade names and company names are included for the benefit of the reader and do not imply an endorsement or preferential treatment by the U.S. Department of Agriculture of the product listed.
Optical density to white light was measured on the black-and-white films, and density to white, red, green, and blue light was measured on the color film. Computer pattern recognition of known crop and soil features using differences between optical densities read from these films has been reported by Wiegand et al.7 Their classification and discriminations were based on the minimum distance to the mean (MDM) pattern recognition technique. In this experiment, the same data were used to test the effectiveness of other forms of the data and other pattern-recognition techniques.
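The MDM rule itself is compact enough to sketch. The fragment below is an illustrative Python reconstruction of the idea, not the authors' IBM 1800 program; the category names, mean vectors, and sample densities are hypothetical.

```python
import numpy as np

def mdm_classify(samples, means):
    """Minimum distance to the mean: assign each sample (a vector of
    film densities) to the category whose mean vector is nearest in
    Euclidean distance."""
    labels = list(means)
    centers = np.array([means[k] for k in labels], dtype=float)
    out = []
    for x in np.atleast_2d(np.asarray(samples, dtype=float)):
        d = np.linalg.norm(centers - x, axis=1)  # distance to each category mean
        out.append(labels[int(np.argmin(d))])
    return out

# Hypothetical two-density category means and two unknown fields
means = {"alfalfa": [0.8, 0.3], "bare soil": [0.4, 0.6]}
print(mdm_classify([[0.75, 0.35], [0.45, 0.55]], means))  # ['alfalfa', 'bare soil']
```

In the study, each "sample" would be the mean of several density readings for a field, and the category means would be the crop and soil standards computed from the ground-truth fields.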
Each system designed to measure optical density of photographic film uses a different optical system and usually a different device to measure light passing through film. Some systems produce readings directly in optical density values, whereas others produce outputs that must be converted to optical density units. If measurements from one machine are to be compared with readings from another machine, readings from both machines must be converted to a common unit of measurement. Optical density units are used to compare the variables being studied in this report. Equation 1 converts the output from the Joyce, Loebl microdensitometer to optical density values:

OD = (OC - Base) (0.0082) (0.71)        (1)

where OD is the optical (integral) density, OC is the optical count (machine units), and Base is 108.5 for white light, 106.1 for red light, 107.6 for green light, and 109.5 for blue light.
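In code, Equation 1 is a single rescaling per light source; the base counts are those given above. A minimal sketch (the function name is illustrative):

```python
# Base counts from the text, one per measuring light (Equation 1)
BASE = {"white": 108.5, "red": 106.1, "green": 107.6, "blue": 109.5}

def optical_density(oc, light):
    """Equation 1: OD = (OC - Base) * 0.0082 * 0.71."""
    return (oc - BASE[light]) * 0.0082 * 0.71

# A reading 100 counts above the white-light base
print(round(optical_density(208.5, "white"), 4))  # 0.5822
```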
Another objective of this experiment was to compare integral densities with analytical optical densities for identifying crop and soil conditions. Theoretically, analytical density quantifies the density of each dye layer independently of the other dye layers in a color film. Although each layer is primarily sensitive to energy of a certain wavelength band, each layer has an effect on the other layers, and all three layers affect each density measurement. Thus optical counts, or integral densities, measure the response to colored light passing through the three layers of the film; analytical densities express the independent contribution of each layer to the film density. Conversion from integral density to analytical density is designed to eliminate the overlap of response of the film layers. Two sets of equations were used to convert integral density of the color film to analytical density, one proposed by Kodak1, the other by EG&G3.
Kodak conversion of integral to analytical density:

R' = R(1.368) - G(0.321) + B(0.022) - 0.054
G' = -R(0.120) + G(1.238) - B(0.147) + 0.006
B' = -R(0.039) - G(0.213) + B(1.154) + 0.016;

EG&G conversion of integral to analytical density:

R' = R(1.0188) - G(0.0258) - B(0.0101)
G' = -R(0.217) + G(1.1026) - B(0.0705)
B' = -R(0.0326) - G(0.2107) + B(1.1356)

where R', G', and B' are the analytical densities of the film to red, green, and blue light, respectively, and R, G, and B are the respective integral densities for red, green, and blue light calculated in this experiment by Equation 1.
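Each conversion is an affine transform of the integral-density vector (R, G, B), so it can be applied as a 3×3 matrix product plus an offset vector. A sketch using the Kodak coefficients above (the EG&G set would substitute its own matrix and a zero offset):

```python
import numpy as np

# Kodak integral-to-analytical coefficients, row order R', G', B'
KODAK_M = np.array([[ 1.368, -0.321,  0.022],
                    [-0.120,  1.238, -0.147],
                    [-0.039, -0.213,  1.154]])
KODAK_C = np.array([-0.054, 0.006, 0.016])

def kodak_analytical(rgb):
    """Convert integral densities (R, G, B) to analytical (R', G', B')."""
    return KODAK_M @ np.asarray(rgb, dtype=float) + KODAK_C
```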
Optical counts were used for all discrimination tests reported here except for the tests comparing integral and analytical densities and those comparing optical counts with integral densities. Where integral densities were used, Equation 1 was employed to convert optical counts to integral densities.
The tests covered in this report can be grouped into four major categories:

• The MDM algorithm was compared with the maximum likelihood ratio (MLR) algorithm using both actual optical count data and the optical count differences. The two algorithms were also compared where the data were converted to the principal axis factor scores (PAFS) for the MLR algorithm.
• Optical counts (original machine units) were compared with integral optical densities in all algorithms.
• The effectiveness of analytical densities of colored film was compared with integral densities in discriminating among surface features.
• The effectiveness of combining density values from all films into a single data set was evaluated.
Information representing five crop and soil categories from 53 fields for which ground truth was known was the data set for this study. To eliminate any possible effect of field size in the discrimination programs, each field was represented by an average of several density readings. Consequently each density reading for each field is a mean for the field. Details on site selection and film density measurements are available in Wiegand et al.7
Computer programs were developed and modified to perform either the MDM or MLR pattern recognition algorithm within the capabilities of an IBM 1800 computer system. Each program calculated crop and soil standards from the ground truth, classified each sample into one of the five categories, and gave an output vector showing the classification of each field along with a recognition matrix which identified correct and incorrect classifications and gave a per cent recognition figure. The programs gave recognition results using all possible combinations of densities and density differences for a field. PAFS cannot be calculated for single or pairwise comparisons because conversion to
TABLE 1. PER CENT CORRECT IDENTIFICATION BY MDM AND MLR ALGORITHMS OF 53 FIELDS BY OPTICAL COUNTS (OC) AND OPTICAL COUNT DIFFERENCES (OCD) FROM BOTH COLOR AND BLACK-AND-WHITE FILMS FROM SO-65 EXPERIMENT.

                                  Single-Level Comparisons
Comparison                              MDM     MLR
                                (Per cent correct identification)
Color film
  Red minus blue (R-B)                  62.2    60.4
  Red minus green (R-G)                 52.6    58.5
  Green counts (G)                      50.9    56.6
  White minus red (W-R)                 49.0    60.4
  Blue counts (B)                       49.0    50.9
  Red counts (R)                        49.0    49.0
  White counts (W)                      45.3    45.3
  White minus blue (W-B)                34.0    39.6
  Green minus blue (G-B)                30.2    41.5
  White minus green (W-G)               34.0    34.0
Black-and-white films
  Red band (R)                          66.0    71.7
  Green band (G)                        64.2    45.3
  IR minus red (IR-R)                   62.3    66.0
  Green minus IR (G-IR)                 60.4    41.5
  IR band (IR)                          39.6    49.0
  Green minus red (G-R)                 37.7    26.4
PAFS requires a minimum of three original densities to generate the two scores required to use PAFS in the MLR algorithm.

RESULTS

MDM VS MLR COMPARISONS

The MDM algorithm was tested against the MLR algorithm using both optical counts and optical count differences as the original variables. Tests were run on individual comparisons between optical counts obtained from white light through the three black-and-white films; these were compared with the three possible differences of these optical counts. Similar tests were made of the three optical counts from the color film and the three differences in optical counts. Table 1 gives the summary of the per cent correct identification of each single-level comparison. For the color film, MLR yields the higher recognition in six out of ten trials, whereas for the six black-and-white film comparisons, each of the MDM and MLR methods is better than the other in three instances.

TABLE 2. PER CENT CORRECT IDENTIFICATION BY MDM AND MLR ALGORITHMS OF 53 FIELDS BY PAIRWISE COMPARISONS OF OPTICAL COUNTS (OC) AND OPTICAL COUNT DIFFERENCES (OCD) FROM BOTH COLOR AND BLACK-AND-WHITE FILMS FROM SO-65 EXPERIMENT.

                          Pairwise Comparisons
Comparison                      MDM     MLR
                        (Per cent correct identification)
Color film
  R, B                          73.6    62.3
  R, G                          71.7    71.7
  R-B, G-B                      67.9    71.7
  R-G, G-B                      66.0    71.7
  R-G, R-B                      66.0    71.7
  W-R, R-G                      67.9    69.8
  W-R, R-B                      67.9    66.0
  W-G, R-G                      66.0    69.8
  W, R                          64.2    67.9
  W-R, W-G                      62.3    69.8
  W-R, R-B                      60.4    66.0
  W-B, R-G                      60.4    64.2
  W-R, W-B                      58.5    66.0
  W-B, R-B                      56.6    66.0
  G, B                          52.8    69.8
  W-R, G-B                      52.8    66.0
  W, G                          52.8    56.6
  W, B                          49.0    54.7
  W-G, G-B                      45.3    58.5
  W-B, G-B                      43.4    58.5
  W-G, W-B                      41.5    58.5
Black-and-white films
  G, R                          73.6    71.7
  IR, R                         67.9    73.6
  G-IR, IR-R                    67.9    67.9
  G-R, IR-R                     67.9    67.9
  G-IR, G-R                     60.4    67.9
Table 2 summarizes the results of trials made with all possible pairwise combinations of optical counts and optical count differences, both from the color film and the three black-and-white films.
Table 3 gives per cent correct identification for all three-level combinations of optical counts and optical count differences except combinations of the three differences that were linearly dependent on the other two: for instance, in the set of differences W-R, W-G, and R-G, W-R = (W-G) - (R-G). The reason for excluding such sets is that the MLR algorithm recognizes the dependence of one of the variables on the others and either does not calculate PAFS or, in trials without PAFS generation, creates covariance matrices whose determinants are less than, or very close to, zero, thereby rendering any discrimination results useless.
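The dependence is easy to verify numerically: for any readings whatsoever, the covariance matrix of the dependent set W-R, W-G, R-G is singular, which is exactly the condition that spoils the likelihood computation. A small check with synthetic densities (the values are arbitrary, not from the study):

```python
import numpy as np

rng = np.random.default_rng(0)
W, R, G = rng.random(20), rng.random(20), rng.random(20)  # synthetic density readings
X = np.stack([W - R, W - G, R - G])  # W-R = (W-G) - (R-G): rows are dependent
cov = np.cov(X)
print(abs(np.linalg.det(cov)) < 1e-12)  # True: the covariance matrix is singular
```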
TABLE 3. PER CENT CORRECT IDENTIFICATION BY MDM AND MLR (WITH AND WITHOUT PAFS) ALGORITHMS OF 53 FIELDS BY THREE-LEVEL, FOUR-LEVEL, AND SEVEN-LEVEL COMPARISONS OF OPTICAL COUNTS AND OPTICAL COUNT DIFFERENCES FROM BOTH COLOR AND BLACK-AND-WHITE FILMS FROM SO-65 EXPERIMENT.

Comparison                      MDM     MLR w/o PAFS    MLR w/ PAFS
                        (Per cent correct identification)
Three-level comparisons
Color film
  W, R, G                       75.5    71.7            71.7
  R, G, B                       77.4    69.8            66.0
  W, R, B                       73.6    67.9            60.4
  W-R, R-G, G-B                 69.8    64.2            62.3
  W-G, R-G, R-B                 69.8    64.2            62.3
  W-G, R-G, G-B                 69.8    64.2            62.3
  W-R, R-G, R-B                 67.9    64.2            62.3
  W-G, W-B, R-B                 67.9    64.2            62.3
  W-B, R-G, R-B                 67.9    64.2            62.3
  W-B, R-B, G-B                 66.0    64.2            62.3
  W-R, W-G, W-B                 66.0    64.2            62.3
  W-R, W-G, R-B                 66.0    64.2            62.3
  W-B, R-G, G-B                 64.2    64.2            62.3
  W-G, W-B, R-G                 64.2    64.2            62.3
  W-R, R-B, G-B                 64.2    64.2            62.3
  W-R, W-B, R-G                 62.3    64.2            62.3
  W-R, W-B, G-B                 60.4    64.2            62.3
  W-B, R-B, G-B                 60.2    64.2            62.3
  W-R, W-G, G-B                 58.5    64.2            62.3
  W, G, B                       50.9    64.2            60.4
Black-and-white films
  G, IR, R                      73.6    75.5            75.5
Four-level comparisons
Color film
  W, R, G, B                    73.6    73.6            71.7
Seven-level comparisons
All films
  W, R, G, B, G, R, IR          84.9    100.0           81.1

The three-level comparisons of Table 3 show that for color film, the optical counts gave better identification than optical count differences for both the MDM and MLR techniques, and that principal axis factor score pre-processing tended to decrease correct identifications using the MLR.
The optical counts for the green, red, and infrared-sensitive black-and-white films and the integral densities of the color film to red, white, green, and blue light yield equal identifications (73.6 to 75.5 per cent) regardless of the recognition algorithm used.
Considering the results of the 89 trials comparing MDM vs. MLR algorithms, the MLR method showed a slight advantage in being more accurate in 51.1 per cent of the trials. However, in the 23 instances where three or more densities were used, the MDM method proved superior 64.0 per cent of the time. In the 64 cases where only one or two densities were used, the MLR method gave better results 68.8 per cent of the time. If three or more density values were used, identifications were better with both MLR and MDM than if one or two variables were used.
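The MLR rule, as commonly implemented for this kind of work, models each category as a multivariate normal distribution and assigns a sample to the category with the largest likelihood. The sketch below is an illustrative reconstruction under that assumption, not the authors' program; equal prior probabilities are assumed, so the likelihood ratio reduces to comparing per-category log-likelihoods, and the category statistics shown are hypothetical.

```python
import numpy as np

def mlr_classify(x, stats):
    """Assign x to the category whose Gaussian model (mean, covariance)
    gives the largest log-likelihood; constant terms common to all
    categories are dropped."""
    best, best_ll = None, -np.inf
    for name, (mean, cov) in stats.items():
        d = np.asarray(x, dtype=float) - mean
        ll = -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))
        if ll > best_ll:
            best, best_ll = name, ll
    return best

# Hypothetical two-density category statistics
stats = {"barley":     (np.array([0.5, 0.5]), 0.01 * np.eye(2)),
         "salt flats": (np.array([0.9, 0.1]), 0.01 * np.eye(2))}
print(mlr_classify([0.52, 0.48], stats))  # prints "barley"
```

Unlike MDM, this rule weights each density by the category's covariance, which is why the two algorithms can rank the same variable sets differently.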
COMPARISONS INCLUDING PAFS

For the MDM vs. MLR (PAFS) comparison, 23 tests were run using optical counts taken from color film with white, red, green, and blue light. Table 3 includes the results obtained using all four readings from the color film and using all seven densities for each field from both the color film and the three black-and-white films. The MDM algorithm using the seven densities correctly identified 45 of the 53 fields for a correct identification percentage of 84.9. Generation of PAFS in the MLR algorithm with seven densities resulted in correctly identifying 43 of the 53 fields for a correct identification percentage of 81.1. However, where the MLR algorithm was used without conversion to PAFS, all 53 fields were identified correctly. The identifications obtained using all seven film densities for each field indicated that the information content of black-and-white and color-infrared films complement each other.
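Principal axis factor scores are closely related to principal components: the original densities are projected onto the leading axes of their covariance matrix, so seven correlated readings can be compressed to two scores. The sketch below assumes PAFS is formed this way (the authors' exact factor procedure is not given in the text), and the input data are synthetic.

```python
import numpy as np

def principal_axis_scores(X, n_scores=2):
    """Project the rows of X (one field per row, one density per column)
    onto the n_scores largest-variance principal axes."""
    Xc = X - X.mean(axis=0)                      # center each density
    evals, evecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
    order = np.argsort(evals)[::-1][:n_scores]   # largest eigenvalues first
    return Xc @ evecs[:, order]

# 53 fields x 7 synthetic densities reduce to 53 x 2 scores
scores = principal_axis_scores(np.random.default_rng(1).random((53, 7)))
print(scores.shape)  # (53, 2)
```

Because the projection keeps only the two largest-variance axes, some discriminating information in the remaining axes is discarded, which is consistent with PAFS pre-processing lowering the seven-density MLR result from 100 to 81.1 per cent.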
TABLE 4. CLASSIFICATION RESULTS OBTAINED USING THE MDM PATTERN RECOGNITION ALGORITHM ON COLOR-FILM DATA, COMPARING IDENTIFICATION FROM OPTICAL COUNTS WITH IDENTIFICATION FROM INTEGRAL OPTICAL DENSITIES. BOTH TESTS USED THE PAIRWISE DIFFERENCES OF THE DENSITIES OF COLOR FILM TO WHITE, RED, GREEN, AND BLUE LIGHT FOR THE CATEGORIES ALFALFA (1), BARLEY (2), SUGARBEETS (3), BARE SOIL (4), AND SALT FLATS (5).

                              Recognition vector
Category        Integral optical densities      Optical counts
Alfalfa         1 1 1 5 1 1 1 2 1 4             1 1 1 5 1 1 1 2 1 4
Barley          3 2 1 2 2 2 2 2 2 2 2 1 1       3 2 1 2 2 2 2 2 2 2 2 1 1
Sugarbeet       3 3 2 3 2 2 3 3 3 3             3 3 2 3 2 2 3 3 3 3
Bare soil       4 4 1 4 4 1 1 4 1 4 4           4 4 1 4 4 1 1 4 1 4 4
Salt flats      5 1 5 5 5 4 4 5 5               5 5 5 5 5 4 4 5 5
OPTICAL COUNT VS OPTICAL DENSITY COMPARISON
An experiment was run to see if the conversion of optical count data from the Joyce, Loebl microdensitometer to optical density values affected the results obtained by the various identification algorithms. The conversion involves three constants: base count, wedge factor, and step count (Equation 1). Eight trials were run to compare identification from optical counts with identification using integral optical densities in the MDM algorithm. In each of the trials, the identification from optical count data was identical to that from the corresponding optical density data, both on fields correctly identified and on a point-by-point basis. Thus, it was concluded that the conversion to optical density is unnecessary where data from only one machine are used.
Table 4 shows the output recognition vector obtained by Wiegand et al.7 for color film compared with the corresponding output vector obtained in this study. Both used all six possible pairwise differences of the four readings from each field. The only difference between the data for the two tests is that one used optical densities and the other used optical counts. The results are identical except for one point. This is presumed to be due to the fact that different computers were used for the two trials and a round-off error occurred.
Wiegand et al. used optical density differences and obtained an overall classification accuracy of 68 per cent from color film and 72 per cent from black-and-white films (Table 5). In this study, classification accuracy was increased to a maximum of 77 per cent using optical counts from the color film with the MDM algorithm. Optical counts with the MLR algorithm increased the per cent correct classification from black-and-white films to 75 from 72 per cent.
TABLE 5. NUMBER AND PER CENT CORRECT IDENTIFICATION BY CROP AND SOIL CATEGORIES FOR BOTH COLOR AND BLACK-AND-WHITE FILMS FROM SO-65 EXPERIMENT (after Wiegand et al., 1971).

                        Ektachrome IR film          Black-and-white films
              No. of    Correctly identified        Correctly identified
Crop          fields    Number    Per cent          Number    Per cent
Alfalfa         10        7         70                4         40
Barley          13        9         69                8         62
Sugarbeet       10        7         70                9         90
Bare soil       11        7         64                9         82
Salt flats       9        6         67                8         89
Total           53       36                          38
Overall per cent                    68                          72

INTEGRAL DENSITY VS ANALYTICAL DENSITY COMPARISONS

Both the Kodak and the EG&G conversions from integral density to analytical density were tested against integral densities using all noninterdependent one-, two-, and three-level combinations of both these densities and their pairwise differences in the MLR algorithm. The outcome of the 26 trials was very nearly the same regardless of the conversion used (Table 6). All results were scored on the basis of one point for the conversion having the greater per cent of fields identified correctly, with ties scored as one-half of a win and one-half of a loss. On this basis, integral densities came out ahead in 17 trials, behind in seven, and tied in two, for a winning percentage of 70.8. Here, too, the best identification was obtained with the larger number of density variables.
CONCLUSIONS
This study showed that both the MDM and MLR algorithms were useful for pattern recognition. Each algorithm had advantages under certain circumstances. The MDM algorithm was slightly more accurate where three or more variables were used, but the MLR algorithm proved to be better where less than three variables were used. The MDM algorithm takes considerably fewer formula steps and consequently less computer time,4 so its use should be considered seriously in any pattern recognition work where three or more variables are available and computer time is a factor.
This study also pointed out that the more variables used in the classification algorithm, the better the recognition results. Where density values from both multispectral black-and-white films and color film are available, all values should be used. Together they contained enough information to raise previously reported overall recognition accuracy of 68 per cent (color-IR film) and 72 per cent (black-and-white films), involving four and three independent optical density differences, to 84.9 per cent (using all seven optical counts in the MDM algorithm) and to 100 per cent (using all seven optical counts in the MLR algorithm).
It made no difference to the final classification accuracy what density units were used as the base data. Arbitrary digital counts from a single density measuring system were as good as standardized optical density units so long as the two were linearly related.

Even conversion of densities measured on color film to analytical densities does not improve classification accuracy. Conversion to analytical densities in this study degraded the classification results more often than it improved the results from integral densities.
TABLE 6. PER CENT CORRECT IDENTIFICATION BY MDM AND MLR ALGORITHMS OF 53 FIELDS BY OPTICAL COUNTS, INTEGRAL OPTICAL DENSITIES, AND ANALYTICAL OPTICAL DENSITIES OF COLOR FILM FROM SO-65 EXPERIMENT.

                    MDM                         MLR
                    Optical     Optical     Integral    Analytical Density
Comparison          Counts      Counts      Density     EG&G      Kodak
                        (Per cent correct identification)
R, G, B             77.4        69.8        69.8        73.6      71.7
R, B                73.6        62.3        62.3        71.7      67.9
R, G                71.7        71.7        71.7        67.9      67.9
R-B, G-B            67.9        71.7        71.7        62.3      69.8
R-G, G-B            66.0        71.7        71.7        66.0      64.2
R-G, R-B            66.0        71.7        71.7        67.9      69.8
R-B                 62.3        60.4        60.4        53.5      62.3
G, B                52.8        69.8        69.8        54.7      50.9
R-G                 52.6        58.5        58.5        60.4      50.9
G                   50.9        56.6        56.6        49.0      47.2
B                   49.0        50.9        50.9        50.9      50.9
R                   49.0        49.0        49.0        39.6      50.9
G-B                 30.2        41.5        41.5        28.3      37.7

REFERENCES

1. Fritz, Norman L., private communication with John D. Tallant, May 4, 1972.
2. Latham, James P., Possible Application of Electronic Scanning and Computer Devices to the Analysis of Geographic Phenomena, Report No. 1, NR 387-023 (Washington Office of Naval Research), August 1959.
3. Linnerud, Harold J., private communication with Craig L. Wiegand, May 4, 1972.
4. Richardson, A. J., Torline, R. J., Weber, D. A., Leamer, R. W., and Wiegand, C. L., Computer Discrimination Procedures Using Film Optical Densities, SWC Research Report 422 (USDA, Weslaco, Texas), December 1970.
5. Rosenfeld, Azriel, Automatic Recognition of Basic Terrain Types from Aerial Photographs, Photogrammetric Engineering, 27: 115-132 (January 1962).
6. Spansail, N., et al., Imperial Valley Ground Truth for Apollo 9 Overflight of March 1969, Univ. Mich. Inst. Sci. Technol. Rept. 2264-7-X, 1969.
7. Wiegand, C. L., Leamer, R. W., Weber, D. A., and Gerbermann, A. H., Multibase and Multiemulsion Space Photos for Crops and Soils, Photogrammetric Engineering, 37: 147-156 (February 1971).
A.S.P. ANNOUNCES ORTHOPHOTO WORKSHOP III
ORTHOPHOTO WORKSHOP III, latest in the ASP series of symposia on the state-of-the-art in orthophotography, is scheduled for June 4-6, 1975.

Sponsored by the American Society of Photogrammetry, this year's event will be held at the El Tropicano Motor Hotel, San Antonio, Texas. The Society's Texas-Louisiana Region will host the workshop.

Several new orthophoto devices have come onto the market since the last workshop, and many projects are underway or now complete in which orthophotography plays a major role.

Richard T. Church, Workshop Chairman, indicates that workshop objectives are to 1) identify the state-of-the-art, 2) provide a forum for users of orthophoto equipment, and 3) supply the buyer or potential buyer of orthophotos a clear understanding of the fundamentals and advantages of orthophotography and its many uses.

The third workshop is to consist of six (6) technical sessions (two each day), during which the invited technical papers will be discussed informally, along with a limited number of unsolicited papers. According to Dr. Robert T. Turpin and Dr. Robert Baker, Program Co-Chairmen, all accepted papers will be published in a bound volume.
No formal call for papers is planned. Anyone wishing to prepare an unsolicited paper should submit the following information:

1. The paper's title
2. Author's name, address, and telephone number
3. An abstract of approximately 200 words

This information should be mailed to:

Dr. Robert T. Turpin
Civil Engineering Department
Texas A&M University
College Station, Texas 77843
The Co-Chairmen have indicated that invited papers will include the fundamentals, history, recent technical progress, user procedures, and project descriptions of orthophotographic endeavours.
An exhibit area including both commercial and noncommercial exhibits will be open throughout the show. Manufacturers will be exhibiting new orthophoto equipment, and recent projects will be featured in the noncommercial area.
Several other national organizations are cooperating in presenting the workshop. Members will receive more detailed information on this important technical meeting at a later date.
ISP Congress Newsletter
To ensure that your name is on the mailing list for the 1976 Congress Newsletter of the International Society of Photogrammetry, write to Mrs. A. Savolainen, Institute of Photogrammetry, Technical University of Helsinki, Otaniemi, Finland.