
Superimposed fringe projection for three-dimensional shape acquisition by image analysis

Marco Sasso,* Gianluca Chiappini, Giacomo Palmieri, and Dario Amodio
Dipartimento di Meccanica, Università Politecnica delle Marche, Via Brecce Bianche, 60131 Ancona, Italy

*Corresponding author: [email protected]

Received 8 December 2008; revised 12 February 2009; accepted 3 March 2009; posted 8 April 2009 (Doc. ID 104770); published 22 April 2009

The aim of this work is the development of an image analysis technique for 3D shape acquisition based on luminous fringe projection. In more detail, the method relies on the simultaneous use of several projectors, which is desirable whenever the surface under inspection has a complex geometry, with undercuts or shadow areas. In these cases the usual fringe projection technique needs to perform several acquisitions, each time moving the projector or using several projectors alternately. Besides the procedure of fringe projection and phase calculation, an unwrap algorithm has been developed in order to obtain the continuous phase maps needed in subsequent calculations for shape extraction. With the technique of simultaneous projections, oriented so as to cover the whole surface, it is possible to increase the speed of the acquisition process and avoid the postprocessing problems related to the matching of different point clouds. © 2009 Optical Society of America

OCIS codes: 100.2650, 100.5088, 120.0120, 120.2830.

1. Introduction

The measurement of 3D surface geometries is an ongoing requirement for many activities and applications. Among the first industries interested in this field were automotive manufacturers, especially for the inspection of exterior car body sheets, where waviness or dents frequently occur during the forming or handling phases; but the 3D shape of an object can be of special interest in many other manufacturing processes, including prototyping, crafts, or low-volume production, even handmade and aesthetic products, or digital archives for artistic heritage, to mention just a few. Great enhancements have been achieved since the time when only a few points of an object could be accurately measured by contacting coordinate measuring machines; modern systems are generally based on optical methods and digital image analysis. Nowadays even commercial systems are available that can perform full-field acquisition of 3D surfaces with relatively high precision and very high spatial resolution in a few seconds. This is not only the basis for what we today call reverse engineering, but also a very useful tool in production inspection, design, and quality control.

Among the many methods for measuring the 3D coordinates of dense point clouds [1,2], such as shadow and projected moiré [3–5], Fourier transform profilometry [6–8], modulation measurement profilometry [9], spatial phase detection [10], laser triangulation [11,12], and color-coded fringe projection [13], it is safe to state that fringe projection combined with temporal phase shifting [14–21] is probably the most established and widely diffused technique. This full-field method uses at least one camera and one programmable projector. The fringe pattern projected onto the surface under inspection is shifted into several positions while the camera grabs pictures of the lit surface. In this way each point is invested by an oscillating luminous intensity; by extracting the phase of the corresponding digital signal, it is possible to obtain a phase map of the surface. This phase map intrinsically contains more information than the single fringe image used, for example, in the Fourier transform profilometry technique, as the phase value in each pixel is computed considering several gray level values. So the

0003-6935/09/132410-11$15.00/0 © 2009 Optical Society of America

2410 APPLIED OPTICS / Vol. 48, No. 13 / 1 May 2009


phase maps obtained with temporal phase shifting are more accurate and reliable; furthermore, if this method is combined with the gray code, by also varying the spatial frequency of the projected fringes, the absolute phase measurement can be carried out, avoiding any phase unwrapping difficulty, which in contrast represents a weak point of other methods. The drawback of the phase shifting technique is

that it requires the system to grab several pictures; the number of images needed rapidly increases if one desires not only to increase the measurement reliability (which could mean acquiring more pictures than the minimum necessary in both phase and spatial frequency shifting), but also to perform a whole-body or a 360° acquisition. In the latter case, different interesting and successful solutions have been proposed: in [19], e.g., automatically recognized markers are used to merge overlapped patches into the final point cloud by means of photogrammetry; in [20] several approaches are proposed, based mainly on the rotation of the object or of the optical devices. When neither the object nor the sensors can be

moved, or when this operation is considered to introduce excessive error into the final measurement, the only chance is to realize a static measurement cell with several cameras and projectors, which have to cover the object's entire surface. With such a method more sensors are required, but they stay in fixed positions; in addition, the object does not need to be moved, e.g., by means of a rotational stage, which is often undesirable, especially when dealing with large objects. If all the optical devices are calibrated within the same coordinate system, the patches acquired by each camera are already expressed in this frame, and no fitting and/or merging of the single views must be performed to achieve the whole-body assessment [20]. On the other hand, this approach still presents the unresolved issue of time consumption during the image acquisition phase. In fact, if one uses several projectors to cover the whole surface of a certain object, there will unavoidably be some regions of overlapped projection, where the fringes produced by one projector are superimposed on the fringes of another source. This is an undesired situation that forces one to turn off the projectors alternately and use them one by one.

The present work suggests what is, to our knowledge, a novel method to overcome this difficulty, based on a postprocessing algorithm that is able to discriminate the phase values of up to three superimposed fringe signals. This makes it possible to acquire the phase maps, and so the 3D shape of an object, by means of three cameras and three LCD projectors simultaneously activated with overlapped projection areas, with a significant reduction of the total acquisition time. Furthermore, we have focused on the problem of phase unwrapping, required to obtain phase maps without discontinuities or ambiguities. Several techniques have been proposed in the literature [22–28]; Gilbert and Blatt conceived an algorithm inspired by multispectral phase unwrapping [29] that has been found to be well suited for the present application. This technique consists in the acquisition of phase maps with different periods, whose values are such that their least common multiple is greater than the maximum dimension of the projection. This paper is organized as follows: after a brief revision of the standard phase shifting technique, the acquisition procedure we developed is described in detail in Subsection 2.B. Then the separation algorithm is presented, which produces several phase distribution maps, one for each camera–projector pair. Section 4 is dedicated to describing the proposed unwrapping procedure. Finally, an example of shape acquisition is reported; a cylindrical object has been digitized into a dense point cloud with the proposed methodology, and the loss of accuracy is briefly assessed in comparison with the classical fringe projection method.

2. Fringe-Projection Technique

A. Phase Shift Technique with One Projection

Before describing the multiprojection approach, it is useful to briefly recall the original phase shift technique with one projection. A personal computer generates a fringe pattern in a flexible way: fringes may be horizontal or vertical, with an adjustable period, with a sinusoidal distribution of gray level (sinusoidal fringes) or with a step distribution (rectangular fringes). A common LCD projector reproduces the pattern on the surface of the object. The result is shown in Fig. 1(a): fringes are distorted because of the morphology of the surface, and cameras (at least two, as required by the stereoscopic calculation, or only one if the projector has also been 3D calibrated) grab images of the distorted pattern. This operation is repeated n times, and each time the fringes are shifted by an nth part of the period. In this way each pixel is invested with a signal of variable light intensity during the acquisition. By a discrete Fourier transform (DFT) of this signal, it is possible to calculate the phase for each pixel of coordinates u and v, obtaining the so-called phase distribution map φ(u, v). An example of a phase map is given in Fig. 1(b), where the dark and the bright gray levels correspond to low and high phase values, respectively. This map contains values gradually varying from 0 to 2π, but it also presents abrupt discontinuities with the same spatial period as the projected fringes. Hence, the next operation to be carried out is an unwrap procedure able to remove any discontinuity and ambiguity from the phase maps [Fig. 1(c)]. One of the classic techniques is the gray code [26], where a series of rectangular black and white fringes, with a doubling period in each frame, are projected onto the surface. Thus each pixel can be associated with a univocal binary sequence, and the fringe order n of the pixel can be easily computed by simply decoding this sequence into a decimal number. When, for each pixel of coordinates (u, v), the number n of the fringe to



which it belongs is known, the operation below is sufficient to obtain the unwrapped phase:

\varphi_{UNW}(u,v) = \varphi(u,v) + 2\pi n \qquad (1)
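The per-pixel phase extraction and the fringe-order offset of Eq. (1) can be sketched on synthetic data; the signal model and all parameter values below (offset, modulation, true phase, fringe order) are assumptions made for the illustration, not values from the paper:

```python
import numpy as np

# Synthetic phase-shift signal for one pixel: n frames, fringes shifted
# by 1/n of the period each frame (assumed values, for illustration only).
n, A, B, phi = 8, 0.5, 0.4, 1.1          # offset, modulation, true phase (rad)
k = np.arange(n)
I = A + B * np.cos(phi + 2 * np.pi * k / n)   # grabbed intensities

# Wrapped phase from the first harmonic of the DFT of the pixel signal.
F1 = np.sum(I * np.exp(-1j * 2 * np.pi * k / n))
phi_wrapped = np.angle(F1)               # value in (-pi, pi]

# Eq. (1): add the fringe order (here n_fringe, e.g. decoded from the
# gray-code sequence) to obtain the unwrapped phase.
n_fringe = 3                             # assumed fringe order for the example
phi_unwrapped = phi_wrapped + 2 * np.pi * n_fringe
```

With a full number of shift periods, the first DFT harmonic reduces to (nB/2)·exp(jφ), so `np.angle` returns the wrapped phase exactly for noise-free data.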

When two cameras are utilized, their unwrapped phase maps are useful to determine the stereoscopic pairs, that is, to understand which pixels in the two frames correspond to the same point of the physical surface. For example, if phase maps obtained with both horizontal and vertical fringes are available for different cameras, a point of the test object surface will be associated with two particular values of horizontal and vertical phase, and these values are the same for both phase maps. Hence 3D coordinates may be calculated by a stereoscopic algorithm based on the 3D calibration of the cameras. Several camera calibration methods are available in the literature; among them, the Zhang [30] and the Tsai [31] algorithms are well suited for automatic inspections. An interesting generalization of the stereoscopic technique is the direct triangulation between the camera and the projector; this is possible if the projector is 3D calibrated like the camera, using analogous algorithms. This method presents the advantage of requiring only one camera; on the other hand, problems arise not only in the complex calibration procedure of the LCD projector, but also, and especially, in the fact that projectors often exhibit quite large image distortion that needs to be accurately corrected. Another direct technique to calculate 3D coordinates, maybe the most effective, is that used in [14]. It considers the phase value of a point to depend directly on its height Z above a reference plane. So, if two phase maps are acquired for two reference planes, say Z = 0 and Z = Z̄, the following simple expression can be used to compute the Z level at each point:

Z = \frac{\varphi - \varphi_0}{\varphi_{\bar{Z}} - \varphi_0}\,\bar{Z} \qquad (2)

Of course, if calibration planes at intermediate Z levels are available, a regression or, better, a multilinear interpolation law can be adopted with improved accuracy. Once the Z coordinate of a point is known, it is also easy to calculate the X and Y coordinates if the camera has been previously calibrated in 3D. This technique has the great advantage of requiring only one camera, with no need for a complete 3D calibration of the projector with its intrinsic and extrinsic parameter estimation. The drawback is that the acquisition of the reference and Z = Z̄ planes is critical and requires good precision.

With such a simple phase shift application that uses only one projector, significant problems arise when test surfaces are complex and present shadowy areas; moreover, the acquisition of the entire object is inherently impossible if the object has a relevant 3D extension, as in Fig. 1(a), where the projector and the camera cannot cover the entire object.

B. Multiprojection Technique

With the aim of speeding up the acquisition phase while avoiding alternately moving or switching off the light sources, the natural idea was to implement a matrix of fixed devices (three cameras and three projectors) performing multiple, simultaneous fringe projections. Using several projectors concurrently, as shown in the layout of Fig. 2,

Fig. 1. (a) Fringe pattern on the surface. (b) Phase map after phase shifting. (c) Phase map after unwrapping.

Fig. 2. Acquisition system layout.



originates a composite pattern resulting from the superimposition of the single regular patterns. Different regions can be distinguished (Fig. 3), depending on the number of projectors that light up each area and on the period and orientation of the fringes, as summarized in Table 1. The acquisition procedure is divided into three steps. In the first step projector A executes a sequence of n shifted projections; the period of the fringes is P0, and they are shifted as described for the technique with one projector. At the same time projectors B and C execute the same sequence with fringes of period P1 and P2, respectively. The values of P0, P1, P2 are such that their least common multiple (l.c.m.) is greater than the maximum dimension of the projection. For example, if the resolution is 1024 × 768, the condition becomes

\mathrm{l.c.m.}(P_0, P_1, P_2) > 1024 \qquad (3)
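Condition (3) is easy to check numerically; for the periods used later in the paper (5, 11, and 21 pixels), a minimal check using `math.lcm` (Python 3.9+):

```python
import math

P0, P1, P2 = 5, 11, 21        # fringe periods in pixels (values from the paper)
lcm = math.lcm(P0, P1, P2)    # least common multiple
assert lcm > 1024             # Eq. (3): the l.c.m. must exceed the projection width
print(lcm)                    # 1155
```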

Of course, no two fringe periods or shifts can be the same; otherwise their superposition would result in a signal of identical period, making the phases of the component signals indistinguishable. Moreover, satisfaction of relation (3) guarantees the possibility of performing an absolute phase unwrapping.

In the second step the projectors rotate the period of their fringes; for example, projector A generates fringes of period P1, projector B fringes of period P2, and projector C fringes of period P0. The sequence of image projections is the same as in step 1. In the last step the projectors again rotate the periods and execute the same sequence. The number n of shifts for each step is equal to the maximum period among P0, P1, and P2, which by convention is P2. So the total number of projections during the acquisition is 3P2. In the end each projector will have executed three phase shifts with three different periods of fringes, and this operation is simultaneous for A, B, and C. The only difference between the projectors is the order of rotation of the fringe periods. In Fig. 4 three frames, one for each step, are shown; the values of the periods are P0 = 5, P1 = 11, and P2 = 21; so n = 21, and the total number of projections is 63. For the sake of intelligibility, considering the aforementioned periods P0 = 5, P1 = 11, P2 = 21 to be expressed in pixels, each n-frame sequence is obtained with a 1 × 1 pixel shift of the projected fringes: during the first 21-frame sequence, only projector C made a number of shifts exactly equal to its period, so that its fringes came back to their initial position, while the other projectors repeated their period more than once; in particular, projector A repeated its fringe period more than four times, and projector B its period almost two times.
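The three-step period rotation described above can be summarized in a short sketch (the projector and step labels follow the paper; the dictionary layout is our own convention for the example):

```python
P = [5, 11, 21]                          # P0, P1, P2 in pixels
projectors = ["A", "B", "C"]
n = max(P)                               # 21 one-pixel shifts per step

# At each step, projector j uses the next period in the cycle.
schedule = []
for step in range(3):
    schedule.append({projectors[j]: P[(j + step) % 3] for j in range(3)})

# schedule[0] == {'A': 5, 'B': 11, 'C': 21}; total projections = 3 * n = 63
```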

The use of multiple sources of light provides large coverage of the surface, also reaching vertical surfaces of different orientations. Moreover, the simultaneity of the process makes the acquisition fast. Nevertheless, further elaboration is necessary to separate the contribution of each projector from the superimposition of the three signals. In fact, the signal in each pixel is the composition of one, two, or three single periodic signals, depending on how many projectors invest that pixel. Figure 5(a) shows the signal acquired by a camera pixel belonging to an area invested by only one projector; the signals of a pixel with two and three superimposed projections are illustrated in Figs. 5(b) and 5(c), respectively. Notice that the illumination intensity of the projectors and the diaphragm aperture of the camera optics are constant during acquisition and need to be adjusted only at setup in order to utilize the whole 8 bit depth of the camera's analog–digital converter. This means that the signal is close to saturation only where all three projections overlap. The next section describes in detail the proposed algorithm, based on DFTs, for signal separation.

3. Phase Extraction Procedure

The starting point of this procedure is the sequence of frames grabbed during acquisition, which can also be seen as a set of arrays, one for each pixel, containing samples of the light signal of that pixel. In more detail, each array is made of 3P2 samples of a signal resulting from the superimposition of up to three

Fig. 3. Superimposition of fringes.

Table 1. Bullets Indicate which Projector Lights Up Each Subarea

Subarea   Projector A   Projector B   Projector C
1         •
2                       •
3                                     •
4         •             •
5         •                           •
6                       •             •
7         •             •             •



single signals. To extract the phase values of the single signals from the total signal, a direct application of a DFT is not sufficient. Indeed, in the areas of superimposition it is necessary to investigate how signals with different frequencies influence the DFT of the total signal. Let us consider one of the three acquisition steps; it must be calculated how a signal of period Pi (Pi < P2) made of P2 samples (so the signal contains more than one period) scatters its content in the DFT spectrum of the superimposed P2-sampled signal. If we indicate with x_k the discrete total signal received by a pixel, where k is the index of the sample, we can say that it is the sum of three single discrete signals corresponding to the contributions of the ith projector; this means that

x_k = \sum_{i=0}^{2} x_{ik}, \qquad k = 0, 1, \ldots, P_2 - 1 \qquad (4)

The DFT of the total signal has the form of Eq. (5), where f_q is a complex number relative to the contribution of the qth harmonic of the signal and contains information about the amplitude and phase of that harmonic:

f_q = \frac{1}{P_2} \sum_{k=0}^{P_2-1} x_k \exp\!\left(-j \frac{2\pi}{P_2} k q\right), \qquad q = 0, 1, \ldots, P_2 - 1 \qquad (5)

Considering the signals x_{ik} separately, each can be expressed as a sinusoidal discrete signal, where A_i is the amplitude and φ_i is the phase value:

x_{ik} = A_i \sin\!\left(\frac{2\pi}{P_i} k + \varphi_i\right) = A_i \left[\sin\!\left(\frac{2\pi}{P_i} k\right) \cos\varphi_i + \cos\!\left(\frac{2\pi}{P_i} k\right) \sin\varphi_i\right] \qquad (6)

From Eq. (6), and imposing

a_i = A_i \cos\varphi_i, \qquad b_i = A_i \sin\varphi_i \qquad (7)

we obtain

x_{ik} = a_i \sin\!\left(\frac{2\pi}{P_i} k\right) + b_i \cos\!\left(\frac{2\pi}{P_i} k\right) \qquad (8)

x_k = \sum_{i=0}^{2} \left[a_i \sin\!\left(\frac{2\pi}{P_i} k\right) + b_i \cos\!\left(\frac{2\pi}{P_i} k\right)\right] \qquad (9)

Fig. 4. Different patterns during step 1 (left), step 2 (middle), step 3 (right).

Fig. 5. Signal of light intensity in a pixel belonging to (a) a zone with one projection, (b) a zone with superimposition of two projections, (c) a zone with superimposition of three projections.



Coefficients a_i and b_i can be seen as the real and imaginary parts of the first harmonic in the DFT of x_{ik} considered separately and made of P_i samples. Combining Eqs. (9) and (5), we obtain

f_q = \frac{1}{P_2} \sum_{i=0}^{2} \left[ a_i \sum_{k=0}^{P_2-1} \sin\!\left(\frac{2\pi}{P_i} k\right) \exp\!\left(-j \frac{2\pi}{P_2} k q\right) + b_i \sum_{k=0}^{P_2-1} \cos\!\left(\frac{2\pi}{P_i} k\right) \exp\!\left(-j \frac{2\pi}{P_2} k q\right) \right] \qquad (10)

Equation (10) shows that the DFT of the cumulative signal is a function of six unknown parameters a_i, b_i, i = 0, 1, 2. Furthermore, it is possible to observe that the terms multiplying the a_i and b_i coefficients represent the DFTs of unitary sinusoids or cosinusoids of period P_i made of P_2 samples. Defining them as

S_q^i = \frac{1}{P_2} \sum_{k=0}^{P_2-1} \sin\!\left(\frac{2\pi}{P_i} k\right) \exp\!\left(-j \frac{2\pi}{P_2} k q\right),
C_q^i = \frac{1}{P_2} \sum_{k=0}^{P_2-1} \cos\!\left(\frac{2\pi}{P_i} k\right) \exp\!\left(-j \frac{2\pi}{P_2} k q\right), \qquad q = 0, 1, \ldots, P_2 - 1 \qquad (11)

it is possible to write the equation

f_q = \sum_{i=0}^{2} \left(a_i S_q^i + b_i C_q^i\right) \qquad (12)

While the coefficients a_i, b_i are real, the coefficients S_q^i, C_q^i are complex. Writing Eq. (12) in expanded form, we obtain a system of P_2 equations in six unknowns:

\begin{bmatrix} f_0 \\ f_1 \\ \vdots \\ f_{P_2-1} \end{bmatrix} = \begin{bmatrix} S_0^0 & C_0^0 & S_0^1 & C_0^1 & S_0^2 & C_0^2 \\ S_1^0 & C_1^0 & S_1^1 & C_1^1 & S_1^2 & C_1^2 \\ \vdots & & & & & \vdots \\ S_{P_2-1}^0 & C_{P_2-1}^0 & S_{P_2-1}^1 & C_{P_2-1}^1 & S_{P_2-1}^2 & C_{P_2-1}^2 \end{bmatrix} \cdot \begin{bmatrix} a_0 \\ b_0 \\ a_1 \\ b_1 \\ a_2 \\ b_2 \end{bmatrix} \qquad (13)

\mathbf{f} = [D] \cdot \mathbf{c} \qquad (14)

In Eq. (14), f represents the array of the DFT of the total acquired signal, [D] is the coefficient matrix composed of the terms S_q^i and C_q^i, and c is the array of unknown parameters. Now it is easy to invert the system and to calculate the phases φ_i and the amplitudes A_i of the single signals by using a linear least-squares minimization:

\mathbf{c} = \left([D]^T [D]\right)^{-1} [D]^T \cdot \mathbf{f} \qquad (15)

A_i = \sqrt{a_i^2 + b_i^2} \qquad (16)

\varphi_i = \tan^{-1}\!\left(b_i / a_i\right) \qquad (17)

At the end of this procedure three separate phase maps, one for each projector, are available for each step of the acquisition. Observe that not only Eq. (17) but also Eq. (16) is very important, because it computes the amplitudes of the three main sinusoidal signals and, by use of proper thresholds, permits us to discriminate whether a certain pixel refers to a single, double, or triple overlapped projection.

These simple operations are repeated for all threeacquisition steps, leading to nine separate phasemaps, one for each projector step, or equivalently,each projector–period combination. An example ofthe phase distribution maps computed with the pro-posed algorithm is presented in Fig. 6.
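The separation of Eqs. (13)–(17) can be illustrated on synthetic data for one pixel. The sketch below sets up the linear system directly in the sample domain rather than in the DFT domain of Eq. (13); the two formulations are algebraically equivalent for the least-squares solution, since the DFT is an invertible linear map. The amplitudes and phases are invented for the example, and all three projectors are assumed active at the pixel:

```python
import numpy as np

P = [5, 11, 21]                      # fringe periods P0, P1, P2 (pixels)
n = P[2]                             # P2 = 21 samples per acquisition step
k = np.arange(n)

# Synthetic superimposed signal of one pixel lit by all three projectors.
A_true   = np.array([1.0, 0.8, 0.6])       # assumed amplitudes
phi_true = np.array([0.5, 1.2, 2.0])       # assumed phases (rad)
x = sum(A_true[i] * np.sin(2 * np.pi * k / P[i] + phi_true[i]) for i in range(3))

# Design matrix with columns sin(2*pi*k/Pi), cos(2*pi*k/Pi), matching the
# unknown ordering [a0, b0, a1, b1, a2, b2] of Eq. (13).
D = np.column_stack([f(2 * np.pi * k / P[i])
                     for i in range(3) for f in (np.sin, np.cos)])

# Least-squares solution, Eq. (15); then amplitudes and phases, Eqs. (16)-(17).
c, *_ = np.linalg.lstsq(D, x, rcond=None)
a, b = c[0::2], c[1::2]
A_hat = np.hypot(a, b)                     # Eq. (16)
phi_hat = np.arctan2(b, a)                 # Eq. (17), atan2 resolves the quadrant
```

For noise-free synthetic data the recovery is exact, which mirrors the paper's remark that the solution of system (13) is exact on simulated signals; with real images, noise perturbs c, and the recovered amplitudes feed the thresholding that classifies single, double, or triple overlap.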

4. Phase Unwrapping Algorithm

As can be seen in Fig. 6, the phase maps now have a stepped trend, with values gradually varying from 0 to 2π inside each fringe and abrupt discontinuities between two adjacent fringes. In order to remove the discontinuities and create a univocal correspondence between CCD and LCD pixels, or in general to apply one of the aforesaid shape extraction methods from Subsection 2.A, an unwrapping procedure has to be carried out. In the present application, the classical gray code technique explained above cannot be implemented, because binary sequence decoding becomes infeasible in the case of superimposed projection. In such a situation the unwrapping problem can be solved with a technique similar to multispectral phase unwrapping, which is likewise based on the use of several phase maps with different periods. Each pixel is associated with a sequence of three phase values, and if the periods P0, P1, P2 are such that their least common multiple is greater than the



maximum projection dimension, the sequence of phase values becomes univocal and identifies the pixel. Figure 7 gives a simplified idea of how the method works: for example, pixels A and B have the same value of φ2, and one could not say to which P2 fringe they belong; but, on the other hand, pixels A and B have different values of φ0 and φ1, which lets us decide that A belongs to the first P2 fringe and B to the second.

Considering a pixel and its generic phase value φ_i, the relative unwrapped value φ_i^u depends on the absolute number n_i of the fringe to which the pixel belongs:

\varphi_i^u = \varphi_i + 2\pi n_i, \qquad i = 0, 1, 2 \qquad (18)

At the same time another relation exists betweenunwrapped phase values of different periods:

\varphi_0^u = \varphi_1^u \frac{P_1}{P_0}, \qquad \varphi_1^u = \varphi_2^u \frac{P_2}{P_1}, \qquad \varphi_0^u = \varphi_2^u \frac{P_2}{P_0} \qquad (19)

Combining Eqs. (18) and (19), it is possible to writethe system

\varphi_0 + 2\pi n_0 = (\varphi_1 + 2\pi n_1)\,\frac{P_1}{P_0}, \qquad (20a)

\varphi_0 + 2\pi n_0 = (\varphi_2 + 2\pi n_2)\,\frac{P_2}{P_0}, \qquad (20b)

\varphi_1 + 2\pi n_1 = (\varphi_2 + 2\pi n_2)\,\frac{P_2}{P_1}, \qquad (20c)

where n_0, n_1, n_2 are the unknowns.

This system is not directly solvable with linear algebra, because the values of n_i obviously must be integers; in addition, the system is ill-conditioned; indeed, the equations would be linearly dependent in the ideal case of zero error in the phase measurement. Notice that having linearly

Fig. 6. Phase maps of each projector for each acquisition step.

Fig. 7. Principle of the multispectral phase unwrap.



dependent equations means that there are infinitely many solutions, but this does not represent a contradiction, because the solutions repeat regularly with a period equal to the least common multiple of P0, P1, and P2.

To overcome this mathematical difficulty, two approaches are possible. The first is a brute-force recursive procedure, consisting in rearranging the first two equations of system (20),

n_1 = \left[(\varphi_0 + 2\pi n_0)\,\frac{P_0}{P_1} - \varphi_1\right] \frac{1}{2\pi}, \qquad n_2 = \left[(\varphi_0 + 2\pi n_0)\,\frac{P_0}{P_2} - \varphi_2\right] \frac{1}{2\pi} \qquad (21)

and imposing iteratively an integer value of n_0, from 0 to its maximum admissible value, i.e., 204. For each iteration, the n_1 and n_2 values are computed by Eqs. (21) and stored; at the end of all iterations, the correct unwrapping sequence n_0, n_1, n_2 is the one that obtained the n_1 and n_2 closest to integer values. Of course, these operations have to be performed for each image pixel.

The second, more efficient method takes advantage of the algebraic theory specific to equations with integer coefficients, where the unknowns have to be integers too. Again rearranging Eqs. (20) and multiplying the first two equations and the third by P0 and P1, respectively, the following matrix form is obtained:

\begin{bmatrix} P_0 & -P_1 & 0 \\ P_0 & 0 & -P_2 \\ 0 & P_1 & -P_2 \end{bmatrix} \begin{Bmatrix} n_0 \\ n_1 \\ n_2 \end{Bmatrix} = \frac{1}{2\pi} \begin{Bmatrix} \varphi_1 P_1 - \varphi_0 P_0 \\ \varphi_2 P_2 - \varphi_0 P_0 \\ \varphi_2 P_2 - \varphi_1 P_1 \end{Bmatrix} \qquad (22)

This represents a linear system of Diophantine equations, which admits (infinitely many) solutions only if the known terms on the right-hand side of the equations are integers and multiples of the greatest common divisor of the equation coefficients. Considering that the P_i values adopted here are mutually prime, the second condition is automatically satisfied, and the imposed restriction constrains the right-hand terms in Eq. (22) to be integers. Naturally, this is true only in the ideal case or with synthetic data; when dealing with real images, the noise makes the right-hand terms of Eq. (22) not perfectly integer, so they have to be rounded to the nearest integer. Denoting the known terms of Eq. (22) by [p, q, r]^T and considering the P0, P1, and P2 values to be 5, 11, and 21, respectively, it is possible to demonstrate that the possible sets of solutions, for the three equations considered separately, are given by

n_0 = -2p + h_1 P_1, \qquad n_1 = -p + h_1 P_0;

n_0 = -4q + h_2 P_2, \qquad n_2 = -q + h_2 P_0;

n_1 = 2r + h_3 P_2, \qquad n_2 = r + h_3 P_1. \qquad (23)

Here h_1, h_2, and h_3 are any positive, zero, or negative integers. To solve the unwrapping ambiguity, it is sufficient to find the unique n_0 value that is present in both n_0 sequences of Eqs. (23) in the range 0–204, which is the maximum number of 5-pixel periods included in a 1024-pixel-wide projection. The final unwrapped phase maps for each projector, computed with this technique, are shown in Fig. 8. Naturally, the same results are achieved by using the first method, but with a longer computational time.
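The brute-force variant based on Eq. (21) is compact enough to sketch for a single pixel; the true fringe-axis position t below is invented purely to generate a mutually consistent triple of wrapped phases:

```python
import numpy as np

P0, P1, P2 = 5, 11, 21
t = 437                                  # assumed true pixel position (for the example)
# Wrapped phases that the three period maps would give at this pixel.
phi = [2 * np.pi * (t % Pi) / Pi for Pi in (P0, P1, P2)]

best = None
for n0 in range(205):                    # 0..204, admissible fringe orders for P0 = 5
    # Eq. (21): candidate fringe orders for the other two periods.
    n1 = ((phi[0] + 2 * np.pi * n0) * P0 / P1 - phi[1]) / (2 * np.pi)
    n2 = ((phi[0] + 2 * np.pi * n0) * P0 / P2 - phi[2]) / (2 * np.pi)
    err = abs(n1 - round(n1)) + abs(n2 - round(n2))
    if best is None or err < best[0]:
        best = (err, n0, round(n1), round(n2))

_, n0, n1, n2 = best                     # -> (87, 39, 20), i.e. (t//5, t//11, t//21)
```

Because the solutions of system (20) repeat only with period l.c.m.(5, 11, 21) = 1155 pixels, the winning n_0 is unique within the 0–204 search range, exactly as the text argues.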

5. Results

A. Error Estimation

This subsection gives an estimation of the error in the phase maps obtained with the superimposed fringe projection acquisition procedure. In practice, we compare a phase map obtained with the new technique against one obtained with the classical phase shift technique using only one projector and one camera, considering the latter as the reference. Wrapped phase maps of period equal to 21 have been calculated with both techniques, making an acquisition over a 500 mm × 500 mm planar surface, and the relative error is calculated as the pixel-by-pixel difference of these maps. A plot of phase discrepancies against occurrences is shown in Fig. 9.

This distribution, which can be read as a probability density function, gives a mean value

Fig. 8. Phase map after unwrapping of each projector.



of −0.32 × 10⁻⁴ rad and a standard deviation of 0.0146 rad. The symmetry of the error distribution around the origin suggests that no systematic errors influenced the results; furthermore, the very low standard deviation indicates good agreement between the two techniques, and the further error induced by the projection of overlapped fringes is quite small. It is worth pointing out that the solution of system (13) is exact for synthetic or simulated data, so what actually introduces inaccuracy in the computed phase values is the noise, which is unavoidably higher in the superimposed projection. To give an idea of the final error, it must be considered that in the adopted setup the average calibration factor K = (φ_Z̄ − φ0)/Z̄, according to Eq. (2), was 0.5880 rad/mm, meaning that the difference between the measurements performed with the two methods amounts to only 0.0146/0.5880 = 0.025 mm of standard deviation.

B. Example of 3D Shape Acquisition

In this section we present an example of 3D shape acquisition by the proposed method. The object under examination is a cylinder. This geometrical shape is very simple, but its 360° profile makes it impossible to acquire from one point of view only; so, the classical phase shift technique with one projector would require several acquisition steps to cover the entire vertical surface. With the new technique, using three projectors and three cameras disposed at regular intervals of 120° (Fig. 2), all of the surfaces, both the vertical ones and the horizontal one on top, can be acquired simultaneously. In Fig. 10, the fringe pattern generated by the projectors over the entire cylinder is shown. In this application the periods of the fringes are P0 = 5, P1 = 11, and P2 = 21 pixels, so that their least common multiple is 1155, larger than the maximum dimension of the projection (1024 pixels). In this way, the 3P2 = 63 acquired frames are enough to carry out not only the phase calculation but also the phase unwrapping procedure for all three cameras; for comparison, consider that a single camera–projector pair, combined with a reliable gray code unwrapping sequence, would require up to 36 frames (e.g., 4 for the phase shift and 32 binary), but the operation has to be repeated for each camera. The procedure of phase extraction results in the phase map series shown in Fig. 11, where the maps have been successively postprocessed with the unwrapping algorithm described above. Then, three unwrapped phase maps for each projector are obtained, each one corresponding to a different period of fringes. The overall procedure, including phase extraction and unwrapping of half-megapixel images, takes approximately 6 s per camera on a 2 GHz dual-core laptop computer. To proceed with the elaboration, only three maps of the same fringe period would be sufficient. Nevertheless, it may be useful to utilize all available maps, in order to average out as much as possible the random errors of the acquired signal. To this purpose, all the maps have been converted by Eq. (19) to the common fringe period of 5 pixels,

Fig. 9. Plot of phase difference between the classical and the proposed technique against occurrences (total number of pixels is 1280 × 1024).

Fig. 10. Fringe pattern on the surface of a cylinder.

Fig. 11. Unwrapped phase maps of a cylinder.

2418 APPLIED OPTICS / Vol. 48, No. 13 / 1 May 2009


and their mean value was considered for further elaboration. In the earlier calibration step, phase maps were also acquired for a reference plane at Z = 0 and for a parallel plane at a fixed distance ΔZ, so that the Z coordinate of each pixel can be calculated by Eq. (2). Figure 12 shows the point clouds acquired by each camera; Figs. 12a, 12b, and 12c refer to cameras 1, 2, and 3, respectively. Obviously, the estimation of the X and Y coordinates of the points required a prior 3D calibration of the cameras in a common reference frame, here performed according to the Zhang technique [30]. The global point cloud is shown in Fig. 12d. It is possible to note the areas where two or three point clouds overlap; this happens in narrow vertical bands of the lateral surface and over the whole top of the cylinder.
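The period rescaling, averaging, and two-reference-plane height computation described above can be sketched as follows. Eqs. (19) and (2) are not reproduced in this excerpt, so the two helper functions below assume their standard forms (an unwrapped phase encodes lateral position proportionally to its period, and Z is obtained by linear interpolation between the phases of the two calibration planes); both are assumptions, not the authors' exact formulas.

```python
import numpy as np
from math import lcm

# Fringe periods used in the experiment (pixels); their least common
# multiple (1155) exceeds the projector width (1024), so the multi-period
# unwrapping is unambiguous over the whole frame.
P = [5, 11, 21]
assert lcm(*P) == 1155

def to_common_period(phi_unwrapped, period, common_period=5):
    """Rescale an unwrapped phase map to a common fringe period.

    An unwrapped phase phi encodes the lateral position
    x = phi * period / (2*pi); expressing the same position with the
    common period multiplies the phase by period / common_period
    (assumed form of Eq. (19) in the text).
    """
    return phi_unwrapped * (period / common_period)

def height_from_phase(phi, phi_ref0, phi_refZ, delta_z):
    """Two-reference-plane height estimate (assumed form of Eq. (2)):
    linear interpolation between the phase of the plane at Z = 0 and
    that of a parallel plane at Z = delta_z."""
    return delta_z * (phi - phi_ref0) / (phi_refZ - phi_ref0)

# Average the three rescaled maps to reduce random noise, as in the text
# (placeholder arrays stand in for the measured phase maps).
rng = np.random.default_rng(0)
maps = [rng.standard_normal((4, 4)) for _ in P]
phi_mean = np.mean([to_common_period(m, p) for m, p in zip(maps, P)],
                   axis=0)
```

The averaging step is the only reason to rescale all nine maps to a common period: once they share a period, pixelwise noise in the individual maps partially cancels in the mean.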

6. Conclusions and Future Developments

This work proposes a novel optical methodology for 3D object shape acquisition based on structured light. The well-known fringe projection and phase-shifting techniques have been extended and generalized to applications where a 360° acquisition is required but neither moving the optical sensors nor alternately switching the multiple light sources on and off is desirable. Indeed, a data postprocessing algorithm has been implemented that allows several cameras to acquire the object surfaces that are simultaneously illuminated by up to three fringe projections.

In fact, the adopted acquisition and data postprocessing procedure made it possible not only to discriminate, in each camera image, the regions with one, two, or three superimposed fringe signals, but also to extract the principal phase values to be successively used in a specific unwrapping algorithm. Experimental data revealed that the loss of accuracy is not significant compared with a single projector–camera acquisition. A possible development that would make the proposed method even more appealing for practical application consists in further compressing the acquisition phase, e.g., by changing the period of the fringes as soon as a projector completes its current period and avoiding repeated projections with the same period until the projector with the maximum period concludes its job. This would reduce the number of acquired frames from 3P2 to P0 + P1 + P2; the proposed approach is probably extendible to such a situation, but care must be taken to avoid a possible loss of accuracy.

References
1. F. Chen, G. M. Brown, and M. Song, "Overview of three-dimensional shape measurement using optical methods," Opt. Eng. 39, 10–22 (2000).
2. F. Blais, "Review of 20 years of range sensor development," J. Electron. Imaging 13, 231–240 (2004).
3. K. Patorski, Handbook of the Moiré Fringe Technique (Elsevier, 1993).
4. C. A. Sciammarella, L. Lamberti, and F. M. Sciammarella, "High accuracy contouring with projection moiré," Opt. Eng. 44, 093605 (2005).
5. J. A. N. Buytaert and J. J. J. Dirckx, "Moiré profilometry using liquid crystals for projection and demodulation," Opt. Express 16, 179–193 (2008).
6. M. Takeda, H. Ina, and S. Kobayashi, "Fourier-transform method of fringe-pattern analysis for computer-based topography and interferometry," J. Opt. Soc. Am. 72, 156–160 (1982).
7. M. Takeda and K. Mutoh, "Fourier transform profilometry for the automatic measurement of 3-D object shapes," Appl. Opt. 22, 3977–3982 (1983).
8. S. Xianyu and C. Wenjing, "Fourier transform profilometry: a review," Opt. Lasers Eng. 35, 263–284 (2001).
9. L. Su, X. Su, W. Li, and L. Xiang, "Application of modulation measurement profilometry to objects with surface holes," Appl. Opt. 38, 1153–1158 (1999).
10. S. Toyooka and Y. Iwaasa, "Automatic profilometry of 3-D diffuse objects by spatial phase detection," Appl. Opt. 25, 1630–1633 (1986).
11. X. Cheng, X. Su, and L. Guo, "Automated measurement method for 360° profilometry of 3-D diffuse objects," Appl. Opt. 30, 1274–1278 (1991).
12. A. Asundi and Z. Wensen, "Unified calibration technique and its applications in optical triangular profilometry," Appl. Opt. 38, 3556–3561 (1999).
13. G. Häusler and D. Ritter, "Parallel three-dimensional sensing by color-coded triangulation," Appl. Opt. 32, 7164–7169 (1993).
14. Y. Y. Hung, L. Lin, H. M. Shang, and B. G. Park, "Practical three-dimensional computer vision techniques for full-field surface measurement," Opt. Eng. 39, 143–149 (2000).
15. W. S. Zhou and X. X. Su, "A direct mapping algorithm for phase-measuring profilometry," J. Mod. Opt. 41, 89–93 (1994).
16. J. L. Li, X. Su, and H. J. Su, "Removal of carrier frequency in phase-shifting techniques," Opt. Lasers Eng. 30, 107–115 (1998).
17. X. Liang and X. Su, "Computer simulation of a 3-D sensing system with structured illumination," Opt. Lasers Eng. 27, 379–393 (1997).
18. H. Zhang, M. J. Lalor, and D. R. Burton, "Spatiotemporal phase unwrapping for the measurement of discontinuous objects in dynamic fringe-projection phase-shifting profilometry," Appl. Opt. 38, 3534–3541 (1999).
19. C. Reich, R. Ritter, and J. Thesing, "3-D shape measurement of complex objects by combining photogrammetry and fringe projection," Opt. Eng. 39, 224–231 (2000).
20. W. Schreiber and G. Notni, "Theory and arrangements of self-calibrating whole-body three-dimensional measurement systems using fringe projection technique," Opt. Eng. 39, 143–149 (2000).

Fig. 12. (Color online) Point clouds of the surface of a cylinder.


21. M. J. Tsai and C. C. Hung, "Development of a high precision surface metrology system using structured light projection," Measurement 38, 236–247 (2005).
22. T. R. Judge and P. J. Bryanston-Cross, "A review of phase unwrapping techniques in fringe analysis," Opt. Lasers Eng. 21, 199–239 (1994).
23. H. Zhao, W. Chen, and Y. Tan, "Phase-unwrapping algorithm for the measurement of three-dimensional object shapes," Appl. Opt. 33, 4497–4500 (1994).
24. H. O. Saldner and J. M. Huntley, "Temporal phase unwrapping: application to surface profiling of discontinuous objects," Appl. Opt. 36, 2770–2775 (1997).
25. J. M. Huntley and H. O. Saldner, "Shape measurement by temporal phase unwrapping: comparison of unwrapping algorithms," Meas. Sci. Technol. 8, 986–992 (1997).
26. G. Sansoni, M. Carocci, and R. Rodella, "Three-dimensional vision based on a combination of gray-code and phase-shift light projection: analysis and compensation of the systematic errors," Appl. Opt. 38, 6565–6573 (1999).
27. C. Bräuer-Burchardt, C. Munkelt, M. Heinze, P. Kühmstedt, and G. Notni, "Phase unwrapping in fringe projection systems using epipolar geometry," in Advanced Concepts for Intelligent Vision Systems (Springer, 2008), pp. 422–432.
28. J. Tian, X. Peng, and X. Zhao, "A generalized temporal phase unwrapping algorithm for three-dimensional profilometry," Opt. Lasers Eng. 46, 336–342 (2008).
29. B. S. Gilbert and J. H. Blatt, "Enhanced three-dimensional reconstruction of surfaces using multicolor gratings," Opt. Eng. 39, 52–60 (2000).
30. Z. Zhang, "A flexible new technique for camera calibration," Technical Report MSR-TR-98-71 (Microsoft Research, 1998).
31. R. Y. Tsai, "A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses," IEEE J. Rob. Autom. 3, 323–344 (1987).
