The Next-Generation Digital Camera

Dick Merrill

Easy-to-use cameras that offer pictures of unprecedented quality at very low cost are being created thanks to new applications of optics, electronics, software and nanotechnology. This is an exciting time for the field of electronic imaging: the business opportunities are enticing; the challenges for engineers significant. The author describes some of the ways in which new technologies may be applied to digital photography in the very near future.

Although about 100 billion photographs are taken each year worldwide, only a fraction are shot with digital cameras. The growth potential for digital photography is enormous, yet much work needs to be done before digital photography can become an acceptable alternative for the majority of camera users.

Here are some of the problems that will have to be overcome before digital cameras can become true mass-market contenders.

Cost/quality

A $100 point-and-shoot film camera captures images that have higher resolution and more accurate color than a digital camera of the same cost. Digital cameras need to surpass this price/quality benchmark.

Ease of use

Digital photography, as currently practiced, requires more technical skill than film photography, both in capturing images and in obtaining satisfactory prints or electronic files. Some level of computer literacy is usually required. Image capture can also require closer attention, and electronic sensors often have less exposure latitude than film cameras.

Confusion

Because digital photography is still somewhat of a novelty, many customers are struggling to understand the various issues it entails. The proliferation of new types of cameras and the sometimes misleading jargon used by manufacturers do not help reassure the prospective digital camera buyer.



Obsolescence

Historically, the electronics industry has evolved at a much faster pace than the photography industry. Customers who are accustomed to owning cameras for years or even decades are understandably reluctant to purchase equipment that will be outperformed, and possibly not even supported, in a few months' time.

In this article I will describe some trends in digital camera system development that will allow for the production of less expensive cameras that are easier to use and capable of producing better images. On the horizon are a number of digital camera products with exciting new characteristics.

Spectral detector

The image sensor is the most appropriate starting point for this discussion because sensor capabilities have numerous implications for the rest of the camera system. The ideal camera image sensor detects all three primary colors in every spatial location in the image plane, as shown in Fig. 1(a). It has several fundamental advantages compared to the lateral color filter detector [Fig. 1(b)]:

• quantum efficiency (the ratio of electrons generated to incident photons) is three times higher. This is because no photons are absorbed by color filters. For example, because a red color filter passes red light but absorbs green and blue light, the information from green and blue photons never reaches the detector and is wasted. With a vertical color filter, almost all of the photons falling on the array are converted into useful electrical signals (a rough utilization comparison is sketched after this list);

• color artifacts are suppressed to a significant degree because all three colors are measured in all pixel locations, a factor which eliminates color aliasing in repeating patterns and false colors on high-contrast edges in the image. Unlike with lateral color filter imagers, there is also no need for complex signal processing to suppress artifacts;

• image sharpness is improved because it is not necessary to blur the image to suppress color artifacts. Elimination of the blur filter from the optical path between lens and sensor can result in lower-cost cameras and higher-quality images. Figures 2(a) and 2(b) illustrate the trade-off that must be made between color artifacts and image blurring in the case of lateral color filter images;

• the characteristics of the vertical color filter spectral detector are very repeatable because they are based on the fundamental properties of semiconductors rather than on the chemical composition and physical shape of the polymer materials used for lateral color filter arrays. This can lower manufacturing costs by reducing the test time required to calibrate color filter characteristics;

• color accuracy is improved because the spectral response of the three channels is very broad. Figure 3 shows a simple example of how broad color filter characteristics can reduce color ambiguity;

• semiconductor color filters are more stable than polymer color filters because they will not fade with time or with exposure to temperature and ultraviolet (UV) light;

• the steps in the vertical color filter manufacturing process are less expensive than those used to produce lateral color filters;

• the manufacture of vertical color filters does not require new equipment or materials that may be unfamiliar to complementary metal oxide semiconductor (CMOS) foundries.
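
To make the quantum efficiency advantage concrete, here is a minimal back-of-the-envelope sketch, in Python, comparing the fraction of incident photons an idealized Bayer mosaic and an idealized stacked vertical color filter sensor turn into signal. The one-third filter transmission and the 90% conversion figure are illustrative assumptions, not measured Foveon specifications.

```python
# Rough photon-utilization comparison: Bayer mosaic vs. stacked (vertical
# color filter) sensor. All numbers are illustrative assumptions.

def bayer_utilization():
    # A 2x2 Bayer tile holds one red, two green and one blue photosite;
    # each color filter is assumed to pass roughly one third of the
    # visible spectrum and absorb the rest.
    photosites = {"R": 1, "G": 2, "B": 1}
    passed_fraction = 1.0 / 3.0
    total = sum(photosites.values())
    return sum(n * passed_fraction for n in photosites.values()) / total

def stacked_utilization(conversion_efficiency=0.9):
    # A vertical color filter stack converts nearly all incident photons
    # somewhere in its three layers; 90% is a placeholder value.
    return conversion_efficiency

print(f"Bayer mosaic:   ~{bayer_utilization():.0%} of incident photons used")
print(f"Stacked sensor: ~{stacked_utilization():.0%} of incident photons used")
```

The roughly three-to-one ratio is consistent with the factor-of-three quantum efficiency claim above.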

Readout architecture

Today, the charge-coupled device (CCD), the incumbent array readout technology for image sensors, is being challenged by CMOS readout sensors. With CCD readout, the integrated photocurrent is transported as a charge packet to the edge of the array for conversion to a voltage signal. With CMOS readout circuits, on the other hand, the charge-to-voltage conversion takes place inside the pixel and the signal is passed out of the array as a voltage. Some advantages of CMOS readout compared to CCD readout follow.

First, CMOS sensors require each row access line to be clocked only once as the array is read out; with CCD arrays, row access lines must be clocked as many times as there are rows. This results in on-chip power dissipation that is orders of magnitude larger for CCD sensors than for CMOS sensors.

Next, CMOS sensors can address a single pixel or any arbitrary group of pixels easily, whereas CCDs must read out the entire array to get the information from a specific group of pixels. This factor enables CMOS sensors to be used in a class of applications for which CCD sensors are impractical.
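
The difference in addressability can be sketched with plain array indexing. The example below is purely illustrative and involves no real device driver: the "CCD-style" function has to touch the whole frame before the window of interest can be extracted, while the "CMOS-style" function addresses the window directly.

```python
import numpy as np

frame = np.random.randint(0, 4096, size=(1024, 1280))  # simulated 12-bit frame

def ccd_style_window(frame, r0, r1, c0, c1):
    # A CCD must shift the entire array out before the window can be used.
    full_readout = frame.copy()          # stands in for the full-array transfer
    return full_readout[r0:r1, c0:c1]

def cmos_style_window(frame, r0, r1, c0, c1):
    # A CMOS array can address just the rows and columns of interest.
    return frame[r0:r1, c0:c1]

roi = cmos_style_window(frame, 500, 532, 600, 632)   # e.g., a 32 x 32 focus window
print(roi.shape)                                      # (32, 32)
```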

Unlike CCD arrays, many types of CMOS pixels can be accessed via nondestructive readout, or in other words without disturbing the photoelectric signal being integrated, an advantage that makes CMOS pixels suitable for a number of new applications.

CMOS process technology allows for easy integration of imager array support functions, such as analog-to-digital converters (ADCs) and digital signal processors (DSPs), on the same piece of silicon as the array itself. This is important in applications such as cell phone cameras, where power, size and weight are critical.

Finally, annual investment in CMOS process technology is about $50 billion worldwide. Much of this goes into research to reduce cost and improve performance to the benefit of all users of CMOS technology. Such corollary benefits are not generated by the limited number of foundries making CCD products.

Many CMOS technology innovations becoming mainstream have applications in the area of optoelectronic devices. Some examples are silicon-on-insulator (SOI) wafers; microelectromechanical systems (MEMS); integration of non-silicon semiconductors, such as III-V compounds; and die-stacking techniques.


Figure 1. (a) The vertical color filter spectral detector samples all three primary colors in every spatial location, in a way similar to color film: the Foveon X3 image sensor features three separate layers of photodetectors embedded in silicon; because silicon absorbs different colors of light at different depths, each layer captures a different color, and the stacked layers yield full-color pixels that record red, green and blue light at every location. (b) In the lateral color filter detector, color filters are applied to a single layer of photodetectors in a tiled mosaic pattern; each filter lets only one band of light (red, green or blue) pass through to its pixel, so the mosaic captures only about 25% of the red and blue light and just 50% of the green.

Figure 2. (a) Color interpretation errors are possible with lateral color filters because luminance information and chrominance information are mixed. (b) An optical low-pass (blur) filter can suppress color interpretation errors in lateral color filter detectors, but at the cost of blurring the image.


Sensor noise

One of the advantages of digital image capture is that it entails very little of the film grain-pattern noise that typifies image capture by means of conventional film camera technology. There are, however, noise sources that users of digital cameras should be aware of. The need to address the issue of sensor noise creates opportunities for future improvements in electronic image sensors.

First, there is temporal noise, which changes from frame to frame. Photon shot noise, the most fundamental form of temporal noise, can only be reduced by increasing the total number of photons captured in each sample. Another kind of temporal noise is electrical noise, such as leakage or transistor switching noise, which is associated with getting the image signal out of the array. CCDs are often characterized by good performance specifications as far as this type of noise is concerned since they benefit from decades of optimization. In any case, the CCD enjoys no fundamental advantage in this respect compared to the CMOS readout sensor.
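
The square-root behavior of photon shot noise can be seen in a few lines of simulation. This sketch simply draws Poisson-distributed photon counts; the photon levels are arbitrary examples, not data from any particular sensor.

```python
import numpy as np

# Photon arrivals are Poisson distributed: for a mean of N photons the
# standard deviation is sqrt(N), so the signal-to-noise ratio is sqrt(N).
rng = np.random.default_rng(0)

for mean_photons in (100, 1_000, 10_000):
    samples = rng.poisson(mean_photons, size=100_000)
    snr = samples.mean() / samples.std()
    print(f"{mean_photons:>6} photons/pixel -> SNR ~ {snr:6.1f} "
          f"(sqrt(N) = {mean_photons ** 0.5:6.1f})")
```

Only capturing more photons per sample improves this ratio, which is why shot noise sets a floor that readout improvements cannot remove.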

In contrast to temporal noise, fixed-pattern noise stays the same from frame to frame. For example, optical fixed-pattern noise can result from local pixel-to-pixel photosensitivity variations, which can be reduced with the more sophisticated lithography equipment used to make smaller feature size CMOS wafers. Use of the latest generation scanner-lithography equipment allows for the elimination of visible boundaries in large-field sensors resulting from stitching reticle fields together, another form of fixed-pattern noise.

Noise determines the lower end of image sensor dynamic range; the high end is determined by the total number of photons that can be detected by the sensor without compromising the measurement of light intensity. Although there are numerous methods for extending the high end of sensor dynamic range, few have been implemented in digital cameras because they are impractical for use with CCD technology. Image plane analog signal processing techniques can be applied to make CMOS sensor response nonlinear by adjusting sensitivity or exposure time in the pixel. This can improve one of the most disturbing photographic effects: the case in which the image captured has less dynamic range than the image perceived by the photographer. Compression at the high end of the sensor's dynamic range will also increase exposure latitude by making the image less susceptible to highlight washout, an artifact often apparent in today's digital camera photographs.
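
One way to picture the kind of highlight compression described above is a "knee" response: linear up to a chosen level, then a smooth roll-off toward full scale instead of a hard clip. The sketch below is a conceptual model only; the knee position and roll-off shape are arbitrary choices, not a description of any particular in-pixel circuit.

```python
import numpy as np

def knee_compress(signal, full_scale=1.0, knee=0.7):
    """Linear response below the knee, smooth roll-off above it."""
    signal = np.clip(signal, 0.0, None)
    below = np.minimum(signal, knee)
    above = np.maximum(signal - knee, 0.0)
    headroom = full_scale - knee
    # Everything above the knee asymptotically approaches full scale
    # rather than clipping, preserving some highlight detail.
    return below + headroom * (1.0 - np.exp(-above / headroom))

exposure = np.array([0.5, 0.9, 1.5, 3.0])   # linear values, 1.0 = clipping point
print(knee_compress(exposure))               # all outputs stay below 1.0
```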

Camera hardware

Many digital cameras are simply film cameras in which the film manipulation mechanism has been replaced by an electronic sensor. In the camera of the future, to reduce costs many of the expensive optical and mechanical components will be replaced by electronics and software (see Fig. 4).

For example, the pentaprism and flip-mirror viewfinder assembly can be replaced by an electronic display. Instead of the liquid crystal displays with a resolution of about 100,000 pixels now in use for video and still cameras, a microdisplay (~1,000,000 pixels) with proper viewing optics would allow for much higher resolution with less power.

Today's cameras often have separate optical paths for focus, exposure and viewfinding (see bottom of camera, Fig. 4). This extra hardware can be eliminated if the sensor itself is used to determine focus and exposure. This change reduces cost and increases focus and exposure accuracy because the same component that samples focus and exposure is also capturing the final image. A CMOS sensor is better suited to this type of image-plane focus and exposure solution because subsampling of the array is easier and consumes less power than is the case with a CCD sensor.
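
A simple illustration of image-plane focus and exposure metering is a contrast metric computed on a small window read from the sensor: the lens position that maximizes local contrast is approximately the position of best focus, and the window mean indicates exposure. This is a generic sketch of the idea, not the algorithm of any specific camera; read_window and lens_positions in the usage comment are hypothetical placeholders.

```python
import numpy as np

def focus_score(window):
    """Sum of squared gradients: larger when the window is sharper."""
    gy, gx = np.gradient(window.astype(float))
    return float(np.sum(gx * gx + gy * gy))

def exposure_error(window, target_mean=0.45):
    """Distance of the window mean from a mid-gray target (0..1 scale)."""
    return float(abs(window.mean() - target_mean))

# Usage sketch (hypothetical helpers): sweep the lens and keep the position
# with the highest focus score, then adjust exposure time until
# exposure_error() is small.
# best_position = max(lens_positions, key=lambda p: focus_score(read_window(p)))
```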

The viewfinder function can be combined with the focus and exposure function (see Fig. 5). A pointer is superimposed on the image in the viewfinder display. This pointer, which shows a measured value for local sharpness and exposure, can be moved around under manual control so that focus and exposure can be determined in the part of the image of most interest: in the example shown, the subject's face.

The shutter is another complex optomechanical component that can be eliminated from cameras (Fig. 4). Mechanical shutter speeds can be as fast as 1/8000th of a second or more, which requires a complex, interconnected set of moving panels. Mechanical shutters create vibrations during exposure that blur the image and can cause motion artifacts.


Single-use digital cameras

The largest number of film cameras sold today are the so-called "disposables" that are recycled by one-hour-photo kiosks after the customer's prints have been developed.

It is likely that some digital cameras will evolve to serve a niche in this market composed of casual users who do not want to expend the effort needed to learn how to obtain high-quality prints from a digital camera. In exchange for a rental fee, the infrequent camera user will get access to a state-of-the-art digital camera. The service provider will verify image quality and delete poor-quality images before giving the customer his or her prints and CD. Today's film service agencies are expected to step into this market segment since it will provide them with a way to survive as the world transitions from film to digital.

Figure 3. (a) Color filters with minimal overlap can make it hard to determine colors. (b) Broad color filter characteristics provide more information about location in color space. (Both panels plot filter response versus wavelength from 400 nm to 700 nm.)

With CMOS sensors, it is possible to build into the pixel a global electronic shutter that has no motion artifacts and is more reliable than a mechanical shutter. Figure 6 shows a photograph taken with a focal plane electronic shutter at 1/4000th of a second.

A mechanical shutter does have some advantages over an electronic shutter in terms of camera function, especially the ability to physically block all light from falling on the sensor when closed. Recent progress in MEMS technology may one day make it possible to integrate a low-cost mechanical shutter into the image sensor die. This would allow for combination of the advantages of semiconductor mass production with the capabilities of the mechanical shutter.

The lens

Improvements in optics enabled by new technology can also reduce camera cost and improve image quality significantly.

Pixel size

The spot size of a camera lens, determined by the diffraction limit and lens aberrations, is usually in the range of 4–5 μm. Although at very large or small f-numbers the spot size is larger, the minimum useful pixel size is approximately 4–5 μm. Making a pixel smaller than the lens spot size will not improve the resolution of the final image, although it may help sell cameras by allowing a larger "megapixel" number to be advertised. Capturing as much information as possible within a given spot size can also help, and this is one of the benefits of the vertical color filter pixel.
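
The 4–5 μm figure follows directly from the Airy disk diameter d ≈ 2.44 λ N for the diffraction-limited spot, where λ is the wavelength and N the f-number. The short calculation below evaluates this for green light at a few common apertures; it ignores lens aberrations, which only make the real spot larger.

```python
# Diffraction-limited spot size: Airy disk diameter to the first dark ring.
WAVELENGTH_UM = 0.55   # green light, in micrometers

for f_number in (2.0, 2.8, 4.0, 8.0):
    spot_um = 2.44 * WAVELENGTH_UM * f_number
    print(f"f/{f_number:<3} -> Airy disk diameter ~ {spot_um:4.1f} um")
```

At f/2.8 to f/4 the spot is roughly 3.8–5.4 μm, which is why pixels much smaller than that add megapixels without adding resolution.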

Field size

The 24 mm x 36 mm format borrowed from cinema cameras first appeared in a Leica still camera in 1924. It has since become the most popular photographic medium. Unfortunately, maintaining good image sharpness across such a large field requires bulky and expensive lenses. Since making a silicon image sensor of that size is also very expensive, it is likely that standard format sizes for digital photography smaller than 35 mm will evolve.

Lateral color shift

One of the most noticeable lens problems in color photographs is lateral chromatic aberration, which usually appears as a color fringe on the borders of high-contrast features near the edge of the field. It is difficult and expensive to improve chromatic aberration by optical means, especially for complex designs such as zoom lenses. Happily, most lateral color shift can be corrected by means of software, especially if full-measured color sensors are used so that the color in each pixel location can be detected precisely.
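
A minimal sketch of software correction for lateral color shift: the red and blue channels are resampled with a slightly different magnification about the image center so that they register with the green channel. The scale factors here are arbitrary illustrative values; in a real camera they would be calibrated per lens and focal length, and the correction would generally vary with radius rather than being a single uniform factor.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def rescale_about_center(channel, magnification):
    """Resample one color channel with a uniform magnification about the center."""
    h, w = channel.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    src_y = cy + (yy - cy) / magnification   # where each output pixel comes from
    src_x = cx + (xx - cx) / magnification
    return map_coordinates(channel.astype(float), [src_y, src_x], order=1)

def correct_lateral_ca(rgb, red_scale=1.002, blue_scale=0.998):
    """Shrink or stretch red and blue so they match the green channel's magnification."""
    out = rgb.astype(float).copy()
    out[..., 0] = rescale_about_center(rgb[..., 0], 1.0 / red_scale)
    out[..., 2] = rescale_about_center(rgb[..., 2], 1.0 / blue_scale)
    return out
```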

Zoom factors

Electronic zoom capability is much cheaper and less bulky than optical zoom. However, image quality suffers because fewer pixels are available for the long focal length equivalent image. Cheaper pixels enabled by CMOS technology will allow arrays with very large pixel counts to become economical; this, in turn, will make large electronic zoom factors possible without compromising image quality. Flexible CMOS readout also simplifies the kind of array subsampling necessary for the use of electronic zoom technology.
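
Electronic zoom itself is just a center crop, optionally followed by interpolation back to the output size, which is why the native pixel count determines how far it can go before quality suffers. A minimal sketch with illustrative array sizes:

```python
import numpy as np

def electronic_zoom(frame, factor):
    """Return the center crop corresponding to an electronic zoom factor."""
    h, w = frame.shape[:2]
    ch, cw = int(h / factor), int(w / factor)
    r0, c0 = (h - ch) // 2, (w - cw) // 2
    return frame[r0:r0 + ch, c0:c0 + cw]

full = np.zeros((2400, 3600))           # ~8.6-Mpixel array (illustrative)
crop = electronic_zoom(full, 2)
print(crop.shape)                        # (1200, 1800): ~2.2 Mpixels remain at 2x
```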

Depth of field

The traditional method for increasing depth of field optically is to reduce the lens aperture size, which increases diffraction blurring and reduces the amount of light, increasing image noise. New techniques are being proposed to address the depth of field issue in non-traditional ways. For example, specialized optical filters can be designed to work in conjunction with digital filters to produce images that have a very large effective depth of field [1]. A method such as this for increasing depth of field electronically rather than optically would allow for sharper images at lower light levels.

Many photographs, for example, those of people posing with scenery in the distant background, have a distinct number of focal planes.


Figure 4. Replacement of optomechanical components with electronics and software: software correction of chromatic aberration enabled by the vertical color filter; a high pixel count substituting for the zoom lens; a microdisplay replacing the optical viewfinder; an electronic shutter replacing the mirror and mechanical shutter; focal plane autoexposure replacing the exposure control module; and focal plane autofocus replacing the autofocus module.


For this type of image, by changing the lens focus between exposures it is possible to implement a multiple exposure to capture both focal planes separately. Many lens systems today have focus motors with millisecond response times, so that handheld operation is possible. Software can merge the two images together by using sharpness criteria to determine which image data to choose in different parts of the image.
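
A minimal sketch of that merging step for two grayscale exposures: at each pixel, keep the data from whichever frame is locally sharper, judged by the magnitude of a smoothed Laplacian. This is one simple choice of sharpness criterion, not the specific method the article has in mind; a practical implementation would also align the frames and blend smoothly across the decision boundary.

```python
import numpy as np
from scipy.ndimage import laplace, uniform_filter

def merge_by_sharpness(img_near, img_far, window=15):
    """Per-pixel selection of the locally sharper of two focus exposures."""
    sharp_near = uniform_filter(np.abs(laplace(img_near.astype(float))), window)
    sharp_far = uniform_filter(np.abs(laplace(img_far.astype(float))), window)
    return np.where(sharp_near >= sharp_far, img_near, img_far)
```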

Filters

Required and optional filters will likely remain important in the future. Here are some examples of how technological progress can improve optical filter requirements.

A spectral shaping filter is usually required to ensure good color accuracy with the electronic image sensors used in digital cameras. This type of filter is often expensive because of the precision required for infrared (IR) and UV cutoff, as well as optical defect specification. Advanced CMOS wafer process capabilities, such as precision film thickness deposition and mechanical planarization, allow this type of filter to be integrated onto the surface of the sensor. As well as reducing cost, this approach eliminates two surfaces in the optical path that may have light scattering defects and specular reflections.

Another filter issue is dust particles falling on the surface of the sensor or the protection glass above it. Depending on the distance from the image plane, the f-number and the size of the dust particle, the particle may be visible in the image. This constitutes a particularly annoying practical problem for digital photographers, especially those who use cameras with interchangeable lens systems in which the image sensor is exposed to the air when the lens is changed. It is to be hoped that some day a camera manufacturer will develop a system for cleaning the sensor surface before every shot, perhaps the way 35-mm film is cleaned by brushes as it is pulled from the canister.

Photographers often use specialized filters to control the look of a photograph. One of the most common types of filters is the gradient filter, which is used to increase the dynamic range of the photograph by attenuating the optical signal in the bright part of the image. To achieve the same effect, CMOS analog circuits in the pixel can be used to attenuate the optical signal electrically, with more flexibility than can be achieved with an external optical filter.
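
The electronic counterpart of a graduated ("gradient") filter can be sketched as a smooth per-row gain applied to the exposure, attenuating a bright sky at the top of the frame. The gain profile below is an arbitrary illustrative choice; an in-pixel implementation could make it adaptive rather than fixed.

```python
import numpy as np

def graduated_attenuation(frame, top_gain=0.4, bottom_gain=1.0):
    """Apply a smooth top-to-bottom gain ramp, like a graduated ND filter."""
    h = frame.shape[0]
    gains = np.linspace(top_gain, bottom_gain, h)[:, np.newaxis]
    return frame * gains
```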

One trend that is certain to result in new filter manufacturing techniques and capabilities is nanotechnology. In particular, the fact that features in advanced CMOS wafer processes are now smaller than optical wavelengths allows for unprecedented methods of manipulating light based on diffraction and interference. These capabilities will certainly find application in the future in imagers featuring lower cost filters with interesting new properties.

Power

Anyone who has ever used a digital camera knows that power consumption is a major practical issue. Not only does the digital photographer need to carry around enough batteries to get a day's worth of pictures, but at the end of the day the batteries must be recharged or new ones must be purchased. Since the additional time and effort represent a barrier to market entry for digital products, improvements in power management are needed. Power consumption can be reduced in many ways:

• lower cost, in-camera memory can reduce the need for in-camera computations to perform image compression;

• use of a microdisplay viewfinder, instead of a large LCD, can reduce the power absorbed by what is usually the largest battery drain;

• compared to CCDs, CMOS sensors can reduce power consumption by reducing the amount of power required to get the data out of the array;


Figure 5. Viewfinder with electronic focus/exposure loupe.


• if in-camera image reduction computation is required, full measured color pixels can save computation energy because they eliminate the need for complex color interpolation calculations;

• motors and mechanical components, such as mechanical shutters, can be replaced with electronic alternatives.

On the supply side, battery technology has improved slowly over the past few decades compared to the rate of improvements in power consumption. One solution to the power supply problem lies in the use of hydrocarbon fuels, which have much higher energy density than batteries. A flask of hydrocarbon fuel could power a camera (and support a laptop computer) during a two-week trek in the wilderness, far from any convenient source of electrical energy. Although fuel cells are often touted as a way to convert hydrocarbon energy to electrical energy, low cost fuel cells seem to always be just out of reach. Another possibility is the microengine, working models of which have recently been demonstrated [2]. Leveraging rapidly evolving manufacturing techniques in the field of MEMS, an engine the size of a small coin can generate 10 W of output power, enough to run a camera and laptop or quickly recharge a set of batteries.

The camera system

A number of trends that are evident in today's electronics industry may be exploited in cameras of the future. First, memory cost will continue to fall and memory density will continue to rise, a trend that will help digital cameras by leading to reductions in price and size. Increased in-camera memory will also reduce or eliminate in-camera processing for image compression; this means that power requirements will be reduced and the quality of the final image will be improved. Next, most digital cameras today use wires or card transfer to move images from the camera to the support computer. This requires manipulating pieces of hardware, a process that creates connector reliability problems, takes time and can be frustrating to users. The camera of the future will most likely have a wireless interface: the only action required of the user will be to place the camera close to a computer so that the software can automatically download all the images to storage media in a computer network.

It is not uncommon for a modern digital single lens reflex camera to have 20 or more control buttons. Clearly, simplification is needed! Controls that are easy to use and remember, without sacrificing functionality, would have a lot of value for photographers. Voice control is one possibility. Since the number of required control words and users would be limited, computer voice recognition systems would work well.

Many households today have both a video camera and a still camera. The camera of the future will be flexible enough to store as much video as a state-of-the-art video camera, yet capable of capturing excellent still images. CMOS imagers can expedite the merger of video and still images because the readout process is inherently simpler and more flexible (see Fig. 7).

Designing an image sensor that is optimal for all applications is a tremendous challenge. With film cameras it's an easy matter to buy film specialized for different uses, such as black and white, fine grain or high sensitivity. With digital cameras, on the other hand, the customer is forced to buy an entirely new camera to obtain a sensor that is optimized to meet a specific need. Figure 8 shows an example of an interchangeable sensor that can be swapped in and out of a single camera body. Such sensors would sell for much less than the camera, allowing the digital photographer to save money while enjoying increased photo options.

In the future, progress in digital signal processing will have an impact on image quality and image file size. Image processing in support of astronomy is a good example [3]. The cost of obtaining astronomical images is very high, and the images obtained are limited by optics and available signal to such an extent that a large investment in signal processing resources is justified to improve the amount of information that can be extracted from the available data. Because data processing cost decreases at a fairly predictable rate with time (following Moore's law of semiconductor device scaling), at some point these sophisticated data processing algorithms will begin to be used in consumer digital cameras.

In the early 20th century, there was no standard photographic film format. Eventually, the situation for both users and manufacturers was simplified when the 35 mm format became the industry standard. It is to be hoped that some day this kind of standardization and simplification will occur for digital photography as well. The proliferation of support tools, color management systems, compression standards and other software components represents a significant barrier that must be overcome before film photographers will be willing to become digital camera users.


Figure 6. Solid-state global electronic shutter image at 1/4000th of a second.


Output media

In the case of traditional film photography, developing and printing are carried out by a service company and the negatives and prints are brought home to be viewed, mailed and stored. To obtain pictures of acceptable quality by means of digital photography, the customer must expend more effort. Here are some examples of the areas in which progress will one day reduce the burden on the digital photographer.

Color management

Today there is no standard in general use to guarantee that the color captured by the camera is correctly carried through all system interfaces to ensure the best possible color rendition in the output device. As digital photography evolves, a properly constructed system for color management will become more important. Good color reproduction is a high priority for most camera users.

Archiving

Digital image data will not deteriorate over time, as is the case for an analog medium such as film. However, the practical lifetime for digital media is the lifetime of the data format, which is currently no more than a few years. For example, compact disks (600 Mbytes) are being replaced by DVD (6 Gbytes), and DVD will be replaced by whatever format follows it in turn. This creates a problem for the photographer who wants to archive images. When the old media becomes obsolete, the photographer will have to copy his data from the old format to media in the new format, a very time-consuming task. I believe the only solution to this dilemma is a centralized database, perhaps maintained by Internet-provider-owned companies. Images could be dumped into the database in batch mode and organized so that recall would be easy. With disk capacity now retailing for $1/Gbyte, the service fee need only be a small portion of the monthly connection fees.

Display

Luminous displays have more dynamic range than printed images, so that images look brighter and more dramatic. Large, wall-mounted flat panel display systems are migrating into the cost range in which they could become standard household appliances, replacing the TV display and the computer display and also serving as a display system for digital photographs. When there is nothing to watch on TV, you could call up images from your last vacation without having to set up a slide projector or find the correct set of slides in the closet!

Communication bandwidth

Once high bandwidth communication is available to a large number of users, electronic imaging use and applications will grow very rapidly. The large size of image files and the computation they require will use up some of the increase in computational resources and available bandwidth of future computer networks.

Conclusion

We are in the midst of a transition between film photography and electronic photography. A wealth of applicable new technology, combined with the enormous market potential for digital imaging, will ensure rapid progress.

In this article I have attempted to provide some examples of interesting new possibilities that may be in the future of photography. As a technologist, I look forward to working on the problems to be solved. As a photographer, I look forward to taking pictures with ever higher levels of image quality.

Dick Merrill ([email protected]) works on image sensor design at Foveon, Inc. in Santa Clara, California.


Figure 8. Modular sensor for a digital camera.

Figure 7. Vertical color filter detectors can simplify multiresolution camera design by making it easier to combine data at the pixel level.

(Figure 7 panels: the X3 image sensor offers variable resolution, variable ISO and variable readout. Full-resolution 1 x 1 pixel readout serves still photography with more pixels, while 2 x 2 and 4 x 4 pixel grouping provides faster readout at higher ISO for video. A new class of camera captures high resolution stills and digital-quality full-motion video.)


References

1. E. Dowski, Jr. and G. Johnson, CDM Optics Inc. Courtesy of oemagazine, Jan. 2002, p. 42.

2. C. Livermore, MIT Microsystems Technology Laboratory, The Industrial Physicist, Dec. 2001/Jan. 2002.

3. J. Starck, SPIE Electronic Imaging Newsletter, Vol. 12, No. 1, Dec. 2001.

