
Multispectral imaging and image processing

Julie Klein

Institute of Imaging and Computer Vision, RWTH Aachen University, D-52056 Aachen, Germany

ABSTRACT

The color accuracy of conventional RGB cameras is not sufficient for many color-critical applications. One of these applications, namely the measurement of color defects in yarns, is why Prof. Til Aach and the Institute of Image Processing and Computer Vision (RWTH Aachen University, Germany) started off with multispectral imaging. The first acquisition device was a camera using a monochrome sensor and seven bandpass color filters positioned sequentially in front of it. The camera allowed sampling the visible wavelength range more accurately and reconstructing the spectra for each acquired image position.

An overview will be given of several optical and imaging aspects of the multispectral camera that have been investigated. For instance, optical aberrations caused by the filters and the camera lens deteriorate the quality of captured multispectral images. The different aberrations were analyzed thoroughly and compensated, based on models for the optical elements and the imaging chain, by utilizing image processing. With this compensation, geometrical distortions disappear and sharpness is enhanced, without reducing the color accuracy of multispectral images.

Strong foundations in multispectral imaging were laid and a fruitful cooperation was initiated with Prof. Bernhard Hill. Current research topics like stereo multispectral imaging and goniometric multispectral measurements that are further explored with his expertise will also be presented in this work.

Keywords: Multispectral imaging, multispectral camera, filter wheel camera, filter aberrations, chromatic aberrations, stereo multispectral imaging.

1. INTRODUCTION

Conventional RGB cameras cannot be utilized for applications where an accurate color acquisition is required, e.g., for the measurement of textile colors, prints or car paints. This is due to their three broadband sensitivity functions, which are not a linear combination of the sensitivity functions of the CIE observer and thus do not fulfill the Luther rule1. So, no accurate acquisition of color is possible with RGB cameras. In contrast, multispectral cameras sample the spectral stimuli more finely with several color channels. For a practical implementation, a monochrome sensor can be utilized with a liquid-crystal tunable filter or with optical bandpass filters2–5, or an RGB sensor can be utilized with color filters6.
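
As an aside, how strongly a given set of camera sensitivities violates the Luther rule can be quantified numerically: one fits the sensitivities as a linear combination of the CIE color matching functions and inspects the residual. The following is only a minimal illustrative sketch (variable names and sampling are ours, not from the paper), assuming both sets of curves are sampled at the same wavelengths.

```python
import numpy as np

def luther_residual(camera_sens, cie_cmf):
    """Relative residual of fitting camera sensitivities as a linear
    combination of the CIE color matching functions.

    camera_sens: (N, C) array, C camera channels sampled at N wavelengths.
    cie_cmf:     (N, 3) array, CIE color matching functions at the same wavelengths.
    A residual close to 0 means the Luther rule is (almost) fulfilled.
    """
    # Least-squares fit: camera_sens ≈ cie_cmf @ M for some 3xC matrix M.
    M = np.linalg.lstsq(cie_cmf, camera_sens, rcond=None)[0]
    residual = np.linalg.norm(camera_sens - cie_cmf @ M)
    return residual / np.linalg.norm(camera_sens)
```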

The first multispectral camera utilized at the Institute is composed of a monochrome camera and a filter wheel with 7 bandpass color filters positioned between the sensor and the system objective lens, see Fig. (1a). For a complete multispectral image, 7 images are thus acquired sequentially. The color filters have central wavelengths from 400 nm to 700 nm in 50 nm steps and bandwidths of ca. 40 nm. The camera sensitivity obtained with these optical filters is shown in Fig. (1b). When the sensitivity is known, e.g., by measuring it with a monochromator7, the remission spectra of the scene can be reconstructed using a Wiener inverse, for instance. This means that a scene can be acquired with a high spatial resolution and reconstructed spectrally with a high accuracy. Recently, a new multispectral camera with 19 bandpass filters working similarly has joined the multispectral laboratory, with filter central wavelengths from 400 nm to 760 nm in 20 nm steps8.
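
To make the reconstruction step concrete, the sketch below builds a Wiener-type linear estimator that maps the camera values of one pixel back to a sampled remission spectrum. It is a generic illustration under simple assumptions (a first-order Markov correlation model as spectral prior, white sensor noise); the calibration and priors actually used at the Institute may differ.

```python
import numpy as np

def wiener_spectral_estimator(S, noise_var=1e-4, rho=0.98):
    """Build a linear estimator W so that  reflectance ≈ W @ camera_values.

    S: (C, N) camera sensitivity matrix (C channels, N wavelength samples).
    A first-order Markov model with correlation rho serves as spectral prior.
    """
    N = S.shape[1]
    # Prior covariance of smooth reflectance spectra: K[i, j] = rho**|i - j|.
    idx = np.arange(N)
    K = rho ** np.abs(idx[:, None] - idx[None, :])
    Kn = noise_var * np.eye(S.shape[0])          # sensor noise covariance
    return K @ S.T @ np.linalg.inv(S @ K @ S.T + Kn)

# Usage: r_hat = wiener_spectral_estimator(S) @ c  for camera values c (length C).
```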

Further author information: Send correspondence to Julie Klein, E-mail: [email protected], Telephone: +49 (0)241 80-27866

Figure 1: (a) Our multispectral camera with filter wheel and objective lens and (b) sensitivity of its 7 color channels.

Apart from the complete images with high color accuracy provided by such filter wheel multispectral cameras, their major drawback is the optical aberrations appearing because the optical filters are in the ray path and modify it. This means that for each color filter, the focus plane is shifted and the image points are displaced. When the color channels, i.e., the images acquired with each color filter, are merged together in order to obtain a multispectral image, different problems appear. The images are not exactly aligned and do not present the same sharpness, resulting in color fringes and blurry edges. In the past years, these aberrations as well as applications of multispectral imaging have been analyzed by Prof. Aach and his team.

In this work, we summarize our main achievements concerning multispectral imaging and image processing; further interesting literature can be found in the cited previous work. We first present the work concerning the aberrations that appear in images from multispectral cameras and how they are optimally modeled and compensated. These results reduce the transversal as well as the longitudinal aberrations. Other issues concerning multispectral imaging have been studied but are not discussed here, for instance multispectral high dynamic range imaging9,10, multispectral acquisition with flash light sources11 or multispectral imaging with a particular color filter array12. The last section before the conclusions refers to research topics that continue to be developed at the Institute with Prof. Hill13: stereo multispectral imaging and goniometric multispectral measurements.

2. TRANSVERSAL ABERRATIONS

In multispectral cameras featuring a filter wheel, optical aberrations cannot be avoided. They are caused by the color filters, which have slightly different thicknesses, refraction indices and tilt angles, and by the objective lens. In the following, we present our results concerning the transversal components of these aberrations, i.e., the components along the sensor plane, and their modeling and compensation.

2.1 Filter aberrations

The filters placed in the filter wheel of the multispectral camera cause aberrations by deviating the rays passing through them14–16. An optical model has been developed for these aberrations; it is based on a pinhole camera model for the objective lens and on a plane parallel plate for the color filter. A given object point would be imaged at the position P if no color filter were used. With the bandpass filter in the ray path, however, the image point is distorted to the position P′. As stated by Brauers et al.14,15, the displacement between the distorted point P′ and the undistorted image point P is the sum of a position-dependent displacement depending on the image position P and a global displacement for the whole image.
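
For intuition, the basic mechanism can be written down from Snell's law: a tilted plane parallel plate of thickness t and refractive index n shifts a ray laterally while leaving its direction unchanged. The sketch below computes this classical shift; it only illustrates the effect and is not the complete camera model of Brauers et al.14,15.

```python
import numpy as np

def lateral_shift(theta, t, n):
    """Lateral displacement of a ray traversing a plane parallel plate.

    theta: incidence angle in radians, t: plate thickness, n: refractive index.
    Returns the shift perpendicular to the ray direction (same unit as t).
    """
    theta_r = np.arcsin(np.sin(theta) / n)       # refraction angle (Snell's law)
    return t * np.sin(theta - theta_r) / np.cos(theta_r)

# Example: a 1 mm glass filter (n = 1.5) tilted by 5 degrees shifts the ray by
# lateral_shift(np.radians(5.0), 1.0, 1.5)  ->  approx. 0.029 mm
```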

The transversal filter aberrations appearing with an optical filter f1 relative to another filter f2 follow a similar model, since the filters have different tilt angles, thicknesses or refraction indices. The relative aberration between the image position P1 obtained with filter f1 and the image position P2 obtained with filter f2 is the sum of two terms: a displacement depending on P1 and a global shift. During a multispectral acquisition, a reference channel is thus chosen, for instance the channel using the filter with central wavelength 550 nm in the middle of the visible wavelength range. The aberrations of the other channels utilizing the other bandpass filters are then calculated relative to this reference channel. In the compensated multispectral image, all the color channels are corrected so that their image points match the image points of the reference channel.

Figure 2: Transversal aberrations for channel 500 nm relative to channel 550 nm. The shifts measured separately for each block of the image are marked with red vectors and the aberrations obtained by interpolating the model over the whole image with black vectors. From [14].

For the measurement of filter aberrations, the image is first subdivided into blocks. The displacement of each block is then calculated relative to the reference channel. Based on the block displacements, the parameters of the model for transversal aberrations are then estimated for the whole image plane. A RANSAC algorithm is utilized for this step, thus ensuring that blocks where the calculated displacements are erroneous are not taken into account for the global aberration model. The matching algorithm for the block displacements uses mutual information as a similarity measure. More details can be found in previous work14–16.
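
The fitting step can be pictured with the following sketch: measured block displacements are fitted to a global displacement model inside a simple RANSAC loop that discards blocks with erroneous matches. For brevity, the model is written here as an affine displacement field (a position-dependent linear term plus a global shift); thresholds and variable names are ours, and the actual model and matcher are those of the cited work14–16.

```python
import numpy as np

def fit_displacement_model(positions, shifts, iters=500, thresh=0.3, seed=0):
    """RANSAC fit of a displacement field d(p) = A @ p + b to block shifts.

    positions: (M, 2) block centers, shifts: (M, 2) measured displacements.
    Returns (A, b); blocks whose shifts disagree with the model are ignored.
    """
    rng = np.random.default_rng(seed)
    X = np.hstack([positions, np.ones((len(positions), 1))])  # rows [x, y, 1]
    best = np.linalg.lstsq(X, shifts, rcond=None)[0]          # fallback: fit all blocks
    best_inliers = 0
    for _ in range(iters):
        sample = rng.choice(len(X), size=3, replace=False)    # minimal subset
        P = np.linalg.lstsq(X[sample], shifts[sample], rcond=None)[0]
        err = np.linalg.norm(X @ P - shifts, axis=1)          # residual per block
        inliers = err < thresh
        if inliers.sum() > max(best_inliers, 2):
            # Refit the model on all inliers of this hypothesis.
            best = np.linalg.lstsq(X[inliers], shifts[inliers], rcond=None)[0]
            best_inliers = inliers.sum()
    return best[:2].T, best[2]                                # A (2x2), b (2,)
```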

Aberrations calculated with this algorithm are shown in Fig. (2). The displacements calculated for each block separately are marked with red vectors, and the fit of the optical model over the whole image following these data is marked with black vectors. The lengths of the vectors are written on the isolines, in pixels. After compensation of multispectral images using this modeling and measurement algorithm, no color fringes remain at the edges of objects, see Fig. (5a)-(b).

Additionally, an algorithm has been developed that enables both filter aberrations and lens distortions to be measured and compensated16. The aberrations are corrected simultaneously using a specific target, in such a way that color fringes vanish and the images are geometrically corrected. With the previously explained algorithm, on the contrary, the aberrations can be measured for any type of scene, as long as enough texture is available.

2.2 Chromatic aberrations

Another important source of transversal aberrations is the system objective lens, which is responsible for chromatic aberrations. We first measured these aberrations accurately by illuminating a checkerboard pattern with different bandpass light sources: we utilized a broadband light source and placed the 7 bandpass color filters in front of it one after another. The camera was a common monochrome camera, and the corners of the checkerboard pattern were detected in the different images. Their displacements corresponded to the chromatic aberrations, since no element other than the light source was modified in the acquisition system17. The light source with the bandpass color filter of central wavelength 700 nm was taken as a reference, and the distortions for the other light sources are shown in Fig. (3) as a vector field.

Several existing models for approximating chromatic aberrations were compared and, as a new feature, the wavelength dependency of their parameters was highlighted. This led to a new position- and wavelength-dependent model for transversal chromatic aberrations18.

Figure 3: (a) Chromatic aberrations for six wavelengths from 418 to 650 nm measured with respect to the wavelength 700 nm. The region in the black box is enlarged in (b). From [18].

After compensation of chromatic aberrations using this model, the mean error of the corner positions measured between the reference and the compensated image was 0.056 pixels and the maximum error 0.185 pixels. Even the compensation of aberrations in a wavelength range that was missing during the calibration step was possible.
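
One way to picture such a wavelength-dependent model is sketched below: a simple first-order radial term describes the displacement of a channel relative to the reference, and its coefficient is interpolated between the calibration wavelengths, which is what allows channels outside the calibrated set to be corrected. This is only an illustrative simplification, not the full position- and wavelength-dependent model of [18].

```python
import numpy as np

def predicted_shift(points, center, k1):
    """Displacement of image points relative to the reference channel for a
    first-order radial model: delta = k1 * r^2 * (p - center)."""
    d = points - center
    r2 = np.sum(d ** 2, axis=1, keepdims=True)
    return k1 * r2 * d

def k1_at_wavelength(calib_wavelengths, calib_k1, wavelength):
    """Interpolate the radial coefficient between calibration wavelengths so
    that channels missing from the calibration can still be corrected."""
    return np.interp(wavelength, calib_wavelengths, calib_k1)
```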

2.3 Alternative position of filters

The transversal aberrations for a filter wheel camera where the filters are positioned between the sensor and the objective lens (setup in Fig. (4a)) have been explained previously. Here, we additionally measured and modeled the aberrations appearing when the filter wheel is in front of the objective lens (setup in Fig. (4b)) in order to compare both types of filter wheel multispectral cameras.

Figure 4: Two possible configurations for filter wheel multispectral cameras: the color filters can either be between the sensor and the objective lens (a) or in front of the objective lens (b). From [19].

If the filter wheel is positioned in front of the objective lens, the rays originating from the acquired scene are distorted first by the filter aberrations and then by the chromatic aberrations of the objective lens. The complete model for the aberrations gave results similar to those for aberrations appearing when the filter wheel is placed between sensor and objective lens19. Simulations as well as real acquisitions corroborated the developed model: the mean pixel error between measured aberrations and aberrations calculated according to the model was only 0.0070 pixels for the simulations and 0.2035 pixels for the real acquisitions.

Parts of multispectral images acquired with the two setups presented in Fig. (4) are shown in Fig. (5). Similar errors can be seen in the form of color fringes in the original multispectral images (Fig. (5a) and (c)). But after compensation using the two models developed, no remaining error can be seen (Fig. (5b) and (d)).

3. LONGITUDINAL ABERRATIONS

The aberrations appearing in multispectral cameras featuring a filter wheel also show longitudinal components, i.e., along the optical axis of the camera, since the focal plane is shifted with the introduction of a color filter in the ray path.

Figure 5: Selected regions of a multispectral acquisition presenting transversal filter aberrations (a) for filters between sensor and objective lens and (c) for filters in front of the objective lens. These aberrations are then well corrected to obtain images (b) and (d), respectively. The compensated images do not exhibit any remaining aberration. Each region represents an area of 61 × 61 pixels. From [19].

The solution elaborated here consists in measuring the point spread function (PSF) of the optical system and deconvolving the original image using this PSF. This solution turned out to be better than simply utilizing parameterized PSFs such as Gaussian functions, since the shape of the PSF does not follow any straightforward parametric form20.

Figure 6: A PSF calculated for an image block (left) and the position of the corresponding block in the image that contains 10 × 16 blocks (right). Adapted from [21].

The PSF was estimated using a noise pattern acquired for calibration. A pattern with white Gaussian noise has a distinctive feature: its representation in the frequency domain is homogeneous, covering all frequency components. For this reason, the optical transfer function (OTF), i.e., the Fourier transform of the PSF, can be calculated by a straightforward division of the Fourier representation of the image containing longitudinal aberrations by the Fourier representation of the reference noise image21. Once the PSF is calculated for all image blocks, several post-processing steps are applied to remove the noise.
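
The division can be sketched in a few lines; a small regularization constant avoids amplifying frequencies where the reference noise spectrum happens to be weak. Block-wise processing and the post-processing steps mentioned above are omitted, and the variable names are ours.

```python
import numpy as np

def estimate_psf(blurred, reference, eps=1e-3):
    """Estimate the PSF of one image block from a white-noise target.

    blurred:   acquired block containing the longitudinal aberrations.
    reference: sharp synthetic prototype of the same noise block.
    The OTF is the ratio of the two spectra; the PSF is its inverse transform.
    """
    B = np.fft.fft2(blurred)
    R = np.fft.fft2(reference)
    otf = B * np.conj(R) / (np.abs(R) ** 2 + eps)   # regularized division B / R
    psf = np.real(np.fft.ifft2(otf))
    return np.fft.fftshift(psf)                     # center the peak for display
```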

Simulations of longitudinal aberrations with state-of-the-art simulation software showed that the PSF depends on the position in the image plane20. For this reason, the PSF was calculated for small blocks of the image. In each block, the function is slightly different, as can be seen in Fig. (6). The image is divided into 10 × 16 blocks and the values of the PSFs are coded with colors, with low values in white and high values in black. For instance, the PSFs in the upper left part of the image have their peaks in the upper left direction, and the PSFs in the lower right part of the image have their peaks in the lower right direction. The results of this algorithm compensating longitudinal aberrations are shown in Fig. (7). The sharp noise pattern (lower part) is used together with the acquisition (left part) to calculate the PSF, and the compensated, sharp acquisition is calculated with a deconvolution (upper part).

Figure 7: Longitudinal aberrations and their correction. From [21].

The compensation of longitudinal aberrations can be performed using the synthetic prototype of the noise pattern, as described previously, in order to know the absolute PSF of each acquisition20. Another possibility is to calculate the relative PSF, taking a given color channel of the multispectral image as a reference21, as we did for the transversal aberrations. This enables the longitudinal aberrations of all the color channels to be brought to the level of sharpness of the reference channel.
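
Given an estimated PSF, whether absolute or relative to the reference channel, the sharpening itself amounts to a deconvolution. A Wiener deconvolution, shown below purely as an illustrative sketch, is one standard choice; the noise-to-signal parameter controls how strongly noise amplification is suppressed.

```python
import numpy as np

def wiener_deconvolve(image, psf, nsr=1e-2):
    """Deconvolve a blurred channel with a known PSF using a Wiener filter.

    psf must have the same shape as the image, with its peak at index (0, 0)
    (apply np.fft.ifftshift to a centered PSF first). nsr is the assumed
    noise-to-signal power ratio.
    """
    H = np.fft.fft2(psf)
    G = np.fft.fft2(image)
    F = np.conj(H) / (np.abs(H) ** 2 + nsr) * G    # Wiener estimate in frequency domain
    return np.real(np.fft.ifft2(F))
```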

Apart from this complete characterization of the aberration PSF using a special noise pattern, other attempts to compensate for longitudinal aberrations without any calibration acquisition were tested22. Such algorithms for unsupervised correction of aberrations could simplify the correction.

4. MULTISPECTRAL IMAGING WITH MULTIPLE ACQUISITIONS

With the exhaustive characterization of aberrations appearing in filter wheel multispectral cameras presented previously, the multispectral acquisition has been simplified. Indeed, all the aberrations can be compensated with image processing after the acquisition, and the optical elements of the system do not have to be modified between the acquisitions of each color channel. This simplified imaging makes it possible to work with multiple multispectral cameras. In the following section, we report our work with a stereo multispectral system, which gives information about depth and colors of a scene, and with a goniometric multispectral system, which gives the possibility to image an object from multiple acquisition angles and with multiple illumination angles.

4.1 Stereo multispectral imaging

A stereo system can be made of two multispectral cameras like the filter wheel multispectral camera with 7 bandpass channels presented in the previous sections. This system enables the acquisition of both accurate color information and accurate depth information about a scene, as explained by Klein and Aach23. Another type of multispectral camera that has gained interest in the past few years is a stereo multispectral camera made of two RGB cameras in front of which additional color filters are positioned. Its main advantage is that it provides multispectral and depth information in only one shot. One of these experimental setups is shown in Fig. (8). Two RGB cameras are utilized, with an angle of ca. 10° between them. For the color filters limiting the sensitivity of each camera, one can use either broadband color filters or dichroic filters. In the example in Fig. (8), dichroic filters are used: they cut each color band R, G and B in half. This leads to a 6-channel multispectral camera whose channels are spread over two cameras.

Figure 8: Experimental setup of the 6-channel stereo multispectral camera. The sensitivities obtained for the two camera + filter combinations are plotted on the left. From [24].

Color accuracy remains an important aim for stereo multispectral imaging. Since the spectral stimuli reaching the left and the right camera are always slightly different, we measured the spectra of a set of regions in a 3D scene from the left and the right positions. We were thus able to compare the spectral information that can be reconstructed from these data by the 7-channel camera and by the 6-channel camera23. The results showed that 6-channel stereo cameras have a color accuracy suited for practical applications, but more limited than that of 7-channel multispectral cameras.

In stereo imaging, depth information can be obtained by calculating the disparity of image points between the left and the right images when the intrinsic camera parameters are known. Classical disparity algorithms for monochrome or RGB cameras are based on the search for similar gray values or color information in both images. In the presented setup, the two cameras do not acquire the same spectral information because of the filters placed in front of them. The classical algorithms thus cannot be used for the disparity search with the 6-channel multispectral camera. Instead, we utilize a block matching algorithm with mutual information as similarity measure24, since it can handle the contrast inversions appearing in multispectral images or, more generally, in multimodal images.
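
To illustrate the idea, the sketch below computes a histogram-based mutual information score between two blocks and performs a brute-force disparity search along the rectified epipolar line. It is a simplified stand-in for the actual algorithm24, and all names and parameters are ours.

```python
import numpy as np

def mutual_information(a, b, bins=32):
    """Mutual information between two equally sized image blocks."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p = hist / hist.sum()
    px = p.sum(axis=1, keepdims=True)          # marginal of block a
    py = p.sum(axis=0, keepdims=True)          # marginal of block b
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz]))

def block_disparity(left, right, y, x, size=21, max_d=64):
    """Disparity of the block centered at (y, x) in the left image, searched
    along the same row of the right image (rectified stereo setup)."""
    h = size // 2
    ref = left[y - h:y + h + 1, x - h:x + h + 1]
    scores = []
    for d in range(max_d + 1):
        x0 = x - d - h
        if x0 < 0:                             # candidate block outside image
            scores.append(-np.inf)
            continue
        cand = right[y - h:y + h + 1, x0:x0 + size]
        scores.append(mutual_information(ref, cand))
    return int(np.argmax(scores))              # disparity with maximal MI
```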

From the 6-channel stereo camera, a classical RGB stereo system can easily be obtained by removing the two color filters. This allows a comparison of disparity search algorithms for both stereo systems. The disparity results obtained with a classical algorithm for the RGB stereo system and with our algorithm for the multispectral stereo system were similar24.

4.2 Goniometric multispectral measurements

The second application of multispectral imaging is the goniometric measurement of materials. By acquiring a material from different viewing angles and with different illumination angles, one can characterize it completely with its bidirectional reflectance distribution function (BRDF). Once this function is known, the reflectance of the material can be calculated for any geometry. Using an imaging device instead of a spectrometer, for instance, even allows the measurement of a bidirectional texture function (BTF), where the spatial dependence of the material characteristics is also taken into account.

Here, we utilize a goniometric setup with a multispectral camera as measuring device, as shown in Fig. (9). The camera and the object can be rotated around the same axis and the light source remains fixed. The acquisition angle αa, measured between the normal to the object surface and the camera axis, is modified by rotating the camera. The illumination angle αi, measured between the normal to the object surface and the light source, is modified by rotating the object itself. The multispectral camera is a filter wheel camera with 19 channels whose central wavelengths are spread from 400 nm to 760 nm in steps of 20 nm and whose bandwidths are ca. 10 nm. Our first results on goniometric multispectral imaging again require the analysis of transversal25 and longitudinal8 aberrations and of the influence of real ray angles on the measurement25.

Figure 9: Measuring setup for goniometric acquisitions of objects. The light source is fixed and the object and camera can be rotated; the acquisition angle αa and the illumination angle αi can thus be modified. This setup enables a measurement of materials where the object normal, the camera and the light source lie in one plane. From [8].

The transversal aberrations are basically the same for all acquisition angles but differ for each color channel. This means that the images from the different acquisition angles can be used either separately or together to compute the parameters of the model for the aberrations25. Both methods give similar results and lead to an error smaller than 0.187 pixels. The longitudinal aberrations in goniometric imaging have multiple causes. First, the color filters in the filter wheel are responsible for longitudinal aberrations, as explained in Sec. 3. Then, the objective lens causes chromatic aberrations that also have longitudinal components. Finally, several steps of geometric rectification are necessary for the goniometric images, and the interpolations utilized during these rectifications lead to blur8. These aberrations are measured and some causes analyzed separately; the correction then yields goniometric multispectral images in which all color channels for all acquisition positions have an enhanced sharpness.

5. CONCLUSION

With Prof. Til Aach, the first works in the research area of multispectral imaging were started at the Institute of Image Processing and Computer Vision. The optical aberrations appearing in multispectral cameras featuring a filter wheel have been thoroughly analyzed and modeled, leading to compensated multispectral images that no longer contain geometrical distortions or blur and that still present a high color accuracy. The latest progress at the Institute concerning stereo multispectral imaging and the goniometric measurement of materials with a multispectral camera was also presented. The first technique gives access to accurate depth and spectral information, the second to a complete characterization of materials.

ACKNOWLEDGMENTS

Many thanks to Dr. Johannes Brauers and Prof. Bernhard Hill for valuable comments and discussions about this paper.

REFERENCES

[1] Luther, R., "Aus dem Gebiet der Farbreizmetrik," Zeitschrift für technische Physik 8, 540–558 (1927).

[2] Hill, B. and Vorhagen, F. W., "Multispectral image pick-up system," (1991). U.S. Pat. 5,319,472, German Patent P 41 19 489.6.

[3] Burns, P. D. and Berns, R. S., "Analysis multispectral image capture," in [Proc. IS&T/SID 4th Color Imaging Conference (CIC)], 4, 19–22 (November 1996).

[4] Helling, S., Seidel, E., and Biehlig, W., "Algorithms for spectral color stimulus reconstruction with a seven-channel multispectral camera," in [Proc. IS&T's 2nd European Conference on Colour in Graphics, Imaging, and Vision (CGIV)], 2, 254–258 (April 2004).

[5] Mansouri, A., Marzani, F. S., Hardeberg, J. Y., and Gouton, P., "Optical calibration of a multispectral imaging system based on interference filters," SPIE Optical Engineering 44, 027004.1–027004.12 (February 2005).

[6] Hashimoto, M. and Kishimoto, J., "Two-shot type 6-band still image capturing system using commercial digital camera and custom filter," in [Proc. IS&T's 4th European Conference on Colour in Graphics, Imaging, and Vision (CGIV)], 538–541 (June 2008).

[7] Klein, J., Brauers, J., and Aach, T., "Methods for spectral characterization of multispectral cameras," in [IS&T/SPIE Electronic Imaging: Digital Photography VII], 78760B-1–78760B-11, SPIE-IST Vol. 7876, San Francisco, CA, USA (January 23–27 2011).

[8] Klein, J., "Correction of longitudinal aberrations in goniometric measurement with a multispectral camera," in [19. Workshop Farbbildverarbeitung], 31–41 (September 26–27 2013).

[9] Brauers, J., Schulte, N., Bell, A. A., and Aach, T., "Multispectral high dynamic range imaging," in [IS&T/SPIE Electronic Imaging], 6807, 680704-1–680704-12 (January 2008).

[10] Brauers, J., Schulte, N., Bell, A. A., and Aach, T., "Color accuracy and noise analysis in multispectral HDR imaging," in [14. Workshop Farbbildverarbeitung], 33–42 (October 2008).

[11] Brauers, J., Helling, S., and Aach, T., "Multispectral image acquisition with flash light sources," Journal of Imaging Science and Technology 53(3), 031103-1–031103-10 (2009).

[12] Brauers, J. and Aach, T., "A color filter array based multispectral camera," in [12. Workshop Farbbildverarbeitung], 55–64, Zentrum für Bild- und Signalverarbeitung e.V., Gustav-Kirchhoff-Straße 5, D-98693 Ilmenau (October 2006).

[13] Hill, B., "High quality color image reproduction: The multispectral solution," in [9th International Symposium on Color Science and Applications MCS-07], 1–7 (2007).

[14] Brauers, J., Schulte, N., and Aach, T., "Multispectral filter-wheel cameras: Geometric distortion model and compensation algorithms," IEEE Transactions on Image Processing 17, 2368–2380 (December 2008).

[15] Brauers, J., Schulte, N., and Aach, T., "Modeling and compensation of geometric distortions of multispectral cameras with optical bandpass filter wheels," in [15th European Signal Processing Conference], 1902–1906 (September 2007).

[16] Brauers, J. and Aach, T., "Geometric calibration of lens and filter distortions for multispectral filter-wheel cameras," IEEE Transactions on Image Processing 20, 496–505 (February 2011).

[17] Klein, J., Brauers, J., and Aach, T., "Spatial and spectral analysis and modeling of transversal chromatic aberrations and their compensation," in [Proc. IS&T's 5th European Conference on Colour in Graphics, Imaging, and Vision (CGIV)], 516–522 (June 14–17 2010).

[18] Klein, J., Brauers, J., and Aach, T., "Spatio-spectral modeling and compensation of transversal chromatic aberrations in multispectral imaging," Journal of Imaging Science and Technology 55(6), 060502 (2011).

[19] Klein, J. and Aach, T., "Multispectral filter wheel cameras: modeling aberrations with filters in front of lens," in [IS&T/SPIE Electronic Imaging: Digital Photography VIII], 82990R-1–82990R-9, SPIE-IST Vol. 8299, San Francisco, CA, USA (January 22–26 2012).

[20] Brauers, J. and Aach, T., "Longitudinal aberrations caused by optical filters and their compensation in multispectral imaging," in [IEEE International Conference on Image Processing (ICIP)], 525–528 (CD-ROM), IEEE, San Diego, CA, USA (October 2008).

[21] Brauers, J. and Aach, T., "Direct PSF estimation using a random noise target," in [IS&T/SPIE Electronic Imaging: Digital Photography VI], Allebach, J. and Süsstrunk, S., eds., 75370B-1–75370B-10, SPIE-IST Vol. 7537, San Jose, USA (January 17–21 2010).

[22] Klein, J., "Unsupervised correction of longitudinal aberrations for multispectral imaging using a multiresolution approach," in [IS&T/SPIE Electronic Imaging: Color Imaging XVIII], 8652, 8652-28 (February 2013).

[23] Klein, J. and Aach, T., "Spectral and colorimetric constancy and accuracy of multispectral stereo systems," in [Proc. IS&T's 6th European Conference on Colour in Graphics, Imaging, and Vision (CGIV)], 239–246 (May 6–9 2012).

[24] Klein, J. and Hill, B., "Multispectral stereo acquisition using two RGB cameras and color filters: color and disparity accuracy," in [18. Workshop Farbbildverarbeitung], 89–96 (September 27–28 2012).

[25] Klein, J. and Schmucking, G., "Analysis of aberrations and pixel information in goniometric multispectral imaging," in [IS&T/SPIE Electronic Imaging: Measuring, Modeling, and Reproducing Material Appearance], (to appear) (February 2014).

