
Optical system performance visualization

Marie Côté [a], Robert J. Pagano [a], Michael A. Stevenson [a,b]

[a] Breault Research Organization, 6400 East Grant Road, Tucson, AZ 85715, USA
[b] Dept. of Optical Sciences, University of Arizona, Tucson, AZ

Copyright 1999, Society of Photo-Optical Instrumentation Engineers (SPIE). This paper will be published in the proceedings from the July 1999 SPIE Annual Conference held in Denver, Colorado, and is made available as a preprint with permission of SPIE. Single print or electronic copies for personal use only are allowed. Systematic or multiple reproduction, or distribution to multiple locations through an electronic listserver or other electronic means, or duplication of any material in this paper for a fee or for commercial purposes is prohibited. By choosing to view or print this document, you agree to all the provisions of the copyright law protecting it.

ABSTRACT

Advances in computer technology have dramatically increased raytrace speeds in optical engineering software. Increases in raytrace speed have, in turn, led to new methods for evaluating optical system performance. Designers traditionally evaluate imaging system performance with spot diagrams, MTF plots, ray aberration plots, and distortion plots. These tools are invaluable for two reasons: 1) they provide the information experienced designers need to make design decisions, and 2) they require only a coarse sampling of rays. However, these tools are an indirect representation of imaging system performance. The designer must “wait and see” how the lens performs in situ. With today’s computers and optical engineering software, it is now possible to evaluate imaging system performance visually as well as numerically – prior to lens fabrication. This paper will discuss the benefits of visual characterization for various practical optical systems. Distortion, diffraction, imagery with three-dimensional (3D) objects, and other optical phenomena will be evaluated.

Keywords: imaging, optical design, simulation, raytrace, software, machine vision

1. INTRODUCTION

Virtual optical system prototyping began in the 1950s and 1960s.1-5 Commercially available optical engineering software programs were introduced during the 1970s and enhanced continually through the present. During this 40-year period, the integrated circuit industry has doubled the transistor density on a manufactured die every 18 months (dubbed “Moore’s Law”).6 Integrated circuit development has, in turn, led to astounding increases in computational speed. As a result, it is now practical to evaluate optical system performance not just in terms of standard lens design metrics (spot diagrams, ray aberration plots, and modulation transfer functions [MTF]) but also in terms of virtual photographs. Whether the optical system supplies an image to the human eye and brain or to a CCD and computer, today’s optical engineering tools can help to answer the question, “Is it good enough?” well before costly fabrication, assembly, and testing. In a previous paper the authors demonstrated how imagery is degraded by ghost images, scatter, and camera defocus.7 In this paper we discuss distortion and keystone, diffraction, the limits of three-color simulations, the impact of photon noise in the simulation, and three-dimensional scene simulations.

2. TWO-DIMENSIONAL SCENE SIMULATIONS

Optical and illumination systems are traditionally evaluated using collections of rays representing point sources, uniform surface emitters, or simple volume emitters such as ellipses, cones, and boxes. To evaluate imaging system performance, it is convenient to represent the scene as a two-dimensional (2D) source and to trace rays from that source, through the optical system, to the image surface. A computer-readable scene is needed for such a simulation. Since bitmapped images abound, a tool was developed to convert color bitmaps into 2D raysets or sources. The bitmap is first separated into its red, green and blue (RGB) constituents, and then each of these three images is converted into 2D raysets by assigning a number of unit flux rays to each bitmapped pixel. The number of rays for a given pixel is determined using a Monte Carlo method based upon the irradiance of that bitmap pixel.7 Figure 1 illustrates how a single color bitmap is represented as a set of source rays and re-imaged through a lens system to the image plane. All raytrace simulations presented in this paper were performed using the Advanced Systems Analysis Program (ASAP™) from Breault Research Organization (BRO), Inc.
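
The bitmap-to-rayset conversion is easy to prototype independently of any particular raytrace package. Below is a minimal sketch, assuming one color channel of the bitmap is held in a NumPy array; the function name, ray budget, and jitter scheme are illustrative and are not the ASAP tool itself.

```python
import numpy as np

def bitmap_to_rayset(channel, total_rays=1_000_000, rng=None):
    """Monte Carlo conversion of one color channel (2D array of pixel
    irradiance values) into unit-flux ray start positions.

    Pixels are sampled with probability proportional to their irradiance,
    so bright regions receive proportionally more rays."""
    rng = np.random.default_rng() if rng is None else rng
    h, w = channel.shape
    prob = channel.astype(float).ravel()
    prob /= prob.sum()                        # normalize to a sampling PDF
    # Draw a pixel index for each ray according to the irradiance PDF.
    idx = rng.choice(h * w, size=total_rays, p=prob)
    rows, cols = np.unravel_index(idx, (h, w))
    # Jitter each ray uniformly within its pixel so rays are not co-located.
    x = cols + rng.random(total_rays)
    y = rows + rng.random(total_rays)
    return np.column_stack([x, y])            # one (x, y) start point per unit-flux ray

# Example: split an RGB bitmap into three 2D raysets, one per color channel.
rgb = np.random.randint(0, 256, size=(480, 640, 3))   # stand-in for a loaded bitmap
raysets = {c: bitmap_to_rayset(rgb[..., i]) for i, c in enumerate("RGB")}
```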

In this section we discuss how distortion and keystone, diffraction, noise due to photon statistics, and color impact virtual photographs. All simulations in Section 2 are based on 2D scenes.

2.1 Distortion and keystone

For optical systems that provide an image to the human eye and brain, specifying an acceptable limit for lens distortion is often a subjective process dependent on the nature of the scene. For example, lens distortion is far more apparent in a photograph of a building than it is in a photograph of a field of wild flowers. The tools described above are ideal for demonstrating the impact of distortion on various scenes during the design process. To illustrate, we take a virtual photograph of Notre-Dame as viewed from the tourist’s perspective using a well corrected, wide-angle lens. Figures 2A and 2B show MTF and distortion plots for the ±55-degree field-of-view lens, tilted to view Notre-Dame from its base. The lens is designed to have a maximum of 25% distortion. However, the ZEMAX™ plot shows 100% distortion after accounting for the change in magnification with object distance. Figure 3 shows the original bitmap of Notre-Dame as photographed from a distance. Note that this example is somewhat compromised since the original photograph already has some distortion and keystone; however, it will serve to demonstrate the process.

As described above, the bitmap scene is converted into RGB rectangular raysets, each with 10 million rays. The raysets are scaled to the approximate dimensions of Notre-Dame (estimated at roughly 150 m tall). The wide-angle lens is modeled and positioned approximately 25 m from the base of the Notre-Dame rayset, and tilted (approximately 45°) so that Notre-Dame is reasonably well framed in the camera field. Rays are then traced through the lens to the image plane. Figure 4 shows the resulting virtual photograph. While distortion bends the scene somewhat, keystone resulting from our extreme viewing angle is the dominant effect. The virtual photograph includes the effects of the geometric relationship between the scene, the lens, and the focal surface, as well as all aberrations in the as-designed lens.
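
The framing claim can be checked with simple geometry. A minimal sketch, assuming the camera pupil sits near ground level 25 m from the base, the facade is roughly 150 m tall, and the optical axis is tilted about 45° above horizontal (the values quoted or estimated above):

```python
import math

height_m = 150.0      # estimated facade height (from the text)
distance_m = 25.0     # camera distance from the base
tilt_deg = 45.0       # camera tilt above horizontal
half_fov_deg = 55.0   # lens half field of view

# Elevation angles (above horizontal) to the base and the top of the facade
angle_base = 0.0
angle_top = math.degrees(math.atan2(height_m, distance_m))   # ~80.5 deg

# Field angles measured from the tilted optical axis
field_base = angle_base - tilt_deg    # ~ -45 deg (below axis)
field_top = angle_top - tilt_deg      # ~ +35.5 deg (above axis)

framed = max(abs(field_base), abs(field_top)) <= half_fov_deg
print(f"top at {field_top:+.1f} deg, base at {field_base:+.1f} deg off axis; "
      f"within the ±{half_fov_deg:.0f} deg field? {framed}")
```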

Now that we have created a virtual photograph with the lens, let’s discuss lens performance in terms of a more traditional lens performance metric: MTF. This is certainly mixing apples and oranges a bit, since the virtual photograph was taken with an incoherent raytrace, ignoring the impact of diffraction. However, looking at the lens performance in terms of MTF will allow us to illustrate a potential drawback of traditional lens performance metrics.

In this simulation the image was captured with 640 x 480 pixels. Given the extreme angle at which this picture is taken, the object distance varies between 25 and 152 m. This translates into a magnification change of 6:1. To evaluate the performance of a lens, we typically choose the Nyquist frequency of (2 × pixel size)⁻¹ at which to evaluate the MTF. For this simulation the pixel size is 6 µm and the Nyquist frequency is 85 lines/mm. Note that the MTF curve is computed based upon the lens’ image space f-number. The image space f-number varies only slightly with field. To accurately represent MTF performance for this lens and this application, it may be more appropriate to plot the object space MTF. The object space MTF would capture the significant variation in resolution as a function of field. With the virtual photograph, the performance variation associated with changes in 1) object distance, and 2) distortion as a function of field are a natural part of the simulation. If we had accounted for diffraction, we would also have simulated the diffraction-induced resolution changes as a function of field.
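
The connection between the image-space Nyquist frequency and object-space resolution can be sketched with thin-lens magnification. The focal length below is purely illustrative (it is not given in the text); the pixel size and object distances are those quoted above:

```python
pixel_mm = 0.006                 # 6 um CCD pixel, from the text
focal_length_mm = 20.0           # assumed focal length, for illustration only

nyquist_image = 1.0 / (2 * pixel_mm)    # image-space Nyquist frequency, cycles/mm

magnification = {}
for obj_dist_m in (25.0, 152.0):        # near and far object distances from the text
    s_mm = obj_dist_m * 1000.0
    m = focal_length_mm / (s_mm - focal_length_mm)   # thin-lens magnification magnitude
    magnification[obj_dist_m] = m
    # The finest object-plane detail the sensor can register at this distance:
    print(f"{obj_dist_m:5.0f} m: |m| = {m:.5f}, "
          f"object-space Nyquist = {nyquist_image * m:.3f} cycles/mm")

ratio = magnification[25.0] / magnification[152.0]
print(f"near-to-far magnification ratio ~ {ratio:.1f}:1")   # ~6:1, as quoted in the text
```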


Figure 1. Scene represented as a set of rays and re-imaged through an optical system

Figure 2A. MTF plots for the wide-angle lens Figure 2B. Distortion plot for the wide-angle lens

Figure 3. The original bitmap of Notre-Dame Figure 4. Virtual photograph from ASAP raytrace


2.2 Diffraction

In high f-number systems, such as endoscopes, diffraction has a significant impact on imaging performance. In addition, wide-angle lens systems may have large variations in f-number as a function of field. This large variation in f-number will result in field-dependent diffraction. For systems with little or no variation in aberrations or diffraction as a function of field, it is possible to compute virtual performance by calculating a single point spread function (PSF) and convolving it with the scene. This is likely sufficient for most applications. However, the authors have modified the simulation technique described above so as to handle more general systems. The revised simulation technique works equally well for on-axis, circularly symmetric systems and for systems with non-circular aperture stops, sizeable diffraction variations with field, or no rotational symmetry.
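
For the shift-invariant case mentioned first (negligible variation of aberrations and diffraction with field), the virtual photograph can be approximated by convolving the scene once with a single PSF. A minimal sketch of that approximation, assuming SciPy is available; the Gaussian psf below is only an illustrative stand-in for a computed point spread function:

```python
import numpy as np
from scipy.signal import fftconvolve

def blur_with_single_psf(scene, psf):
    """Shift-invariant approximation: convolve the whole scene with one PSF."""
    psf = psf / psf.sum()                        # conserve total flux
    return fftconvolve(scene, psf, mode="same")

# Illustrative stand-ins: a random scene and a small Gaussian PSF
scene = np.random.random((480, 640))
y, x = np.mgrid[-7:8, -7:8]
psf = np.exp(-(x**2 + y**2) / (2 * 2.0**2))      # sigma = 2 pixels, for illustration only

blurred = blur_with_single_psf(scene, psf)
```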

Recall that the original simulation technique created a surface emitter using a Monte Carlo method to distribute unit flux rays across the bitmap such that ray density is statistically related to the bitmap irradiance at a particular location. The benefit of this technique is that high irradiance regions of the bitmap receive larger numbers of rays, increasing the statistical accuracy for these source locations, while the less critical, low-irradiance regions receive fewer rays. This application of the Monte Carlo method results in an efficient use of rays. However, the raytrace code used to perform these computations, ASAP, uses grids of rays and a Gaussian beam decomposition methodology to simulate coherent phenomena.8-9 Each grid of rays simulates a single coherent point source. The field from that point source is propagated through the optical system, accumulating wavefront deformation and diffracting from apertures until it strikes the designated image surface. The field distribution on the image surface is then computed. To create a virtual photograph that includes diffraction, a grid of rays is traced from each bitmap pixel location. The flux contained in each grid of rays is weighted according to the relative irradiance for that particular location on the bitmap. Each grid of rays representing a bitmap pixel is simulated as a coherent source that fills the lens system pupil and diffracts within the system to create an aberrated PSF at the image surface. Grids representing neighboring pixels are simulated as incoherent with respect to one another. To ensure that the fields from neighboring pixels do not add coherently, an iterative loop is set up to trace grids of rays from each and every bitmap pixel and then to incoherently add the resulting fields at the image plane. This technique, while general and accurate, is fairly computationally intensive. The number of rays used and raytrace computation times for each simulation are provided in the summary section of this paper.
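
Stripped of the raytrace itself, the bookkeeping of this technique reduces to an incoherent accumulation of coherently computed point images, one per bitmap pixel. A minimal sketch, with propagate_grid standing in for the coherent Gaussian-beam-decomposition trace that ASAP actually performs; the function and its arguments are illustrative assumptions:

```python
import numpy as np

def virtual_photo_with_diffraction(bitmap, propagate_grid, image_shape, grid_size=13):
    """Incoherently sum the coherent point images of every bitmap pixel.

    bitmap         : 2D array of pixel irradiance for one color channel
    propagate_grid : callable(row, col, grid_size) -> complex field sampled on
                     the image plane (stands in for the coherent raytrace)
    """
    image = np.zeros(image_shape)
    total = bitmap.sum()
    for row, col in np.ndindex(bitmap.shape):
        weight = bitmap[row, col] / total             # flux weight for this source pixel
        if weight == 0.0:
            continue
        field = propagate_grid(row, col, grid_size)   # coherent field from one pixel's grid
        # Add irradiances, not amplitudes: neighboring pixels are mutually incoherent.
        image += weight * np.abs(field) ** 2
    return image
```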

The parrot picture bitmap is used to demonstrate how diffraction impacts imaging. For reference, Figures 5A and 5B show MTF curves and spot diagrams for the CCD camera lens modeled in this simulation. In Figures 5A and 5B the lens aperture stop is set to F/2.8. Figures 6A and 6B show MTF curves and spot diagrams for the same lens when operating at F/22. Note that this lens is aberration limited at F/2.8 and diffraction limited at F/22. For each of three colors (RGB), grids of rays (13 x 13) were traced from each pixel in the 105 x 105 pixel bitmap. A one-to-one correspondence between bitmap pixels and CCD (or image plane) pixels is maintained. The pixel size at the image plane is 6 µm. Figure 7 shows the virtual photograph of the parrots with the lens operating at F/2.8, while Figure 8 displays imaging performance at F/22. The effects are subtle. Only an enlarged section of the 105 x 105-pixel image is shown so that individual pixels are visible. The results demonstrate visually how the photograph will look, and confirm the conclusion drawn from the MTF curves and spot diagrams: despite the fact that F/2.8 performance is compromised by aberrations, diffraction at F/22 causes the most significant image degradation.


Figure 5A. Camera MTF curves with F/2.8 aperture stop Figure 5B. Camera spot diagrams with F/2.8 aperture stop

Figure 6A. Camera MTF curves with F/22 aperture stop Figure 6B. Camera spot diagrams with F/22 aperture stop

Figure 7. Virtual photograph with F/2.8 aperture stop Figure 8. Virtual photograph with F/22 aperture stop


2.3 Photon noise in the simulation process

A new raytrace technique for including diffraction in the virtual photograph was described in Section 2.2. When this new raytrace technique was employed without including diffraction, the resulting virtual photographs showed less statistical noise. To describe how this occurs, let’s first look at raytrace statistical noise in a Monte Carlo raytrace simulation. Raytrace statistical noise is akin to photon noise. With the initial bitmap-to-surface-emitter tool described above, rays are distributed using a Monte Carlo method such that ray density increases with increasing bitmap irradiance. Fewer rays represent low irradiance regions in the bitmap. The rays are traced through the system and captured in a finite number of bins or pixels at the image surface. Low flux regions of the image receive fewer rays, but not exactly in proportion to the original bitmap flux, due to the finite number of rays used to simulate the source and the finite number of sampling bins used in the image plane. Hence, a large number of rays is required to reduce the finite sampling error of the calculation. As an estimate of the error based entirely on ray statistics, the noise-to-signal ratio is traceable to a Bernoulli trial sequence.10 The signal-to-noise ratio is statistically defined as the ratio of the distribution mean to the square root of the distribution variance within an energy counting bucket. Therefore, for a Bernoulli trial sequence the noise-to-signal ratio is:

\mathrm{Noise}/\mathrm{Signal} = \sigma / \langle m \rangle = \sqrt{Np(1-p)} / (Np) = \sqrt{(1-p)/n}     (1)

where,

\sigma = square root of the variance = \sqrt{Np(1-p)}
\langle m \rangle = mean value = Np
N = total number of rays in the ray trace
p = probability of a ray getting to the detector
n = number of rays at the detector

In the case of a small probability of a ray getting to the detector, that is, p << 1, the statistical error reduces to the mean and variance of a Poisson distribution. The Poisson distribution is simply the limiting case of the Bernoulli trial sequence under this condition. The noise-to-signal ratio becomes:

\mathrm{Noise}/\mathrm{Signal} = 1/\sqrt{n}     (2)
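
As a quick numerical check on equation (2), the sketch below uses the ray counts quoted in the next paragraph, under the assumption that the three 5-million-ray RGB raysets are pooled over the 480 x 640 bitmap pixels (an interpretation on our part; the text gives the per-color count):

```python
import math

rays_per_color = 5_000_000          # from the parrot Monte Carlo example below
num_colors = 3                      # R, G, B raysets pooled together (assumption)
pixels = 480 * 640

rays_per_pixel = rays_per_color * num_colors / pixels     # ~49 rays per pixel on average
noise_to_signal = 1.0 / math.sqrt(rays_per_pixel)         # equation (2), Poisson limit
print(f"~{rays_per_pixel:.0f} rays/pixel -> noise/signal ~ {noise_to_signal:.0%}")   # ~14%
```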

To obtain a crude estimate of this error, let’s use the actual number of rays used in the Monte Carlo parrot simulation above. Five million rays were used for each color in the original 480 x 640-pixel bitmap. Even if these rays were evenly distributed across all pixels, the error in the average pixel would be 14%. In practice, some regions would have far less error and some regions far greater. This is the case with the Monte Carlo ray distribution method described at the beginning of this paper. By using the new technique described for including diffraction, an n x m grid of ray grids is used to simulate the source. Each grid, representing a single bitmap pixel, is assigned a flux based directly on the irradiance of that bitmap pixel. Therefore, the process of generating the source rayset from the bitmap is noise free. If an image were produced from the source rayset prior to tracing, it would be identical to the original bitmap. This, along with two additional fortuitous factors, makes the new raytrace technique less noisy:
1. A one-to-one correspondence between the number of grids of rays used for the source and the number of pixels used in the focal plane was chosen, and
2. The geometrical aberration in the lens is such that a large percentage of the rays from a given bitmap pixel strike the corresponding image pixel.

Figure 9A shows a virtual photograph of the parrots using the Monte Carlo method for distributing rays across the source plane, while Figure 9B shows a virtual photograph of the same scene using the incoherent grid of grids. This new technique may be advantageous under conditions similar to those described above. However, it is still a less efficient application of rays, and a more time-consuming technique, as compared with the Monte Carlo source generation method.


2.4 Color

It was brought to the authors’ attention that this virtual photography process characterizes the optical system at only three distinct wavelengths: RGB. A system well corrected at RGB but compromised at other wavelengths would be misrepresented. The current technique is limited to RGB in two ways:
1. Only RGB raysets are created from the original color bitmap, and
2. Only RGB wavelength rays have been traced through the lens systems.

To graphically demonstrate this limitation, we have modeled an Ebert spectrograph in ASAP (see Figure 10). The spectrometer entrance slit and focal surface are symmetrically displaced along a plane containing the center of curvature of the spherical mirror. A diffraction grating is placed at the center of curvature of the same spherical mirror and tilted 8.53°. The grating is operated in order m = 2, and the grating spacing is 6.73 µm. The entrance slit illumination is modeled as a spatially uniform, Lambertian angular distribution of rays. Three raytraces are performed: 450, 550, and 650 nm. For each of the three wavelength cases, the image of the entrance slit is shifted with respect to the others as a result of diffraction at the grating. After each of the three wavelength raytraces, a separate file is generated containing the focal surface irradiance distribution for that particular wavelength. These three RGB irradiance distribution files are combined and converted back into a single color bitmap file. This process results not in a spectrum but rather in three distinct color images of the entrance slit (see Figure 10).
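
The wavelength-dependent shift of the slit image follows from the plane-grating equation. A minimal sketch, using the m = 2 order and 6.73 µm spacing quoted above and assuming, purely for illustration, that the incidence angle equals the stated 8.53° grating tilt:

```python
import math

d_um = 6.73                     # grating spacing, from the text
m = 2                           # diffraction order, from the text
theta_i = math.radians(8.53)    # assumed incidence angle = stated grating tilt

for wavelength_nm in (450, 550, 650):
    lam_um = wavelength_nm / 1000.0
    # Plane-grating equation: sin(theta_i) + sin(theta_d) = m * lambda / d
    sin_theta_d = m * lam_um / d_um - math.sin(theta_i)
    theta_d = math.degrees(math.asin(sin_theta_d))
    print(f"{wavelength_nm} nm -> diffracted angle {theta_d:+.2f} deg")
    # Each wavelength leaves the grating at a different angle, so each
    # monochromatic slit image lands at a different focal-surface position.
```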

To remove this limitation we modified our computation technique yet again. First we needed a method to convert between RGB values and wavelength. After a brief and disappointing web search we developed our own crude conversion. Figure 11 illustrates the linear changes in RGB values across the visible spectrum. Next we traced rays for a large number of discrete wavelengths across the visible spectrum rather than simply RGB. In the demonstration case presented here, we chose to trace rays in 1 nm increments from 400 to 700 nm. Each separate raytrace computation results in an irradiance distribution at the focal surface of the spectrometer. We end up with 300 irradiance distribution files. Since we are tracing discrete wavelengths, each trace results in a new entrance slit image at the focal surface. As we step across the visible spectrum, the entrance slit image moves up the focal surface. To merge the 300 discrete wavelength files into RGB representations of the spectrum, we use the following approach:
1. Use the first wavelength file to create its RGB file representations by multiplying all pixels within the file by the RGB values corresponding to that particular wavelength (see Figure 11).
2. Proceed to the next wavelength and multiply the irradiance distribution file by the appropriate RGB values to create three new RGB files.
3. Add the initial R files together, the G files together, and the B files together.
4. Repeat the process through all remaining wavelengths.
The result yields RGB file representations of the spectrum as shown in Figure 12. Combining these three files into a bitmap, we get the desired spectrum (see Figure 13).
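
The four merging steps above amount to a weighted accumulation of the per-wavelength irradiance maps into running R, G, and B images. A minimal sketch, assuming each monochromatic focal-plane irradiance distribution is available as a 2D array, with a crude piecewise-linear wavelength-to-RGB ramp in the spirit of Figure 11 (the breakpoints are illustrative, not the authors' values):

```python
import numpy as np

def wavelength_to_rgb(nm):
    """Crude piecewise-linear wavelength (nm) -> (R, G, B) weights over 400-700 nm.
    Breakpoints are illustrative only."""
    r = np.interp(nm, [400, 500, 580, 700], [0.3, 0.0, 1.0, 1.0])
    g = np.interp(nm, [400, 490, 550, 610, 700], [0.0, 1.0, 1.0, 0.0, 0.0])
    b = np.interp(nm, [400, 480, 530, 700], [1.0, 1.0, 0.0, 0.0])
    return r, g, b

def merge_spectral_files(irradiance_by_wavelength):
    """Accumulate {wavelength_nm: 2D irradiance array} into R, G, B images
    (steps 1-4 of the procedure above)."""
    first = next(iter(irradiance_by_wavelength.values()))
    R = np.zeros_like(first, dtype=float)
    G = np.zeros_like(first, dtype=float)
    B = np.zeros_like(first, dtype=float)
    for nm, irr in irradiance_by_wavelength.items():
        r, g, b = wavelength_to_rgb(nm)   # RGB weights for this wavelength
        R += r * irr                      # weight the monochromatic image and
        G += g * irr                      # accumulate into the running color sums
        B += b * irr
    return R, G, B                        # combine into one bitmap to display the spectrum

# Example with synthetic data: monochromatic focal-plane images, 1 nm apart.
fake = {nm: np.random.random((64, 64)) for nm in range(400, 701)}
R, G, B = merge_spectral_files(fake)
```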


Figure 9A. Virtual photograph using the Monte Carlo method for distributing rays across the source
Figure 9B. Virtual photograph using an incoherent grid of grids as a source

Figure 10. Ebert spectrograph with 3 color raytrace Figure 11. Conversion – wavelength to RGB

Figure 12. RGB irradiance distributions for spectrum Figure 13. Ebert with multi-spectral raytrace


One final note of interest may address the question, “Why would you go to all of this trouble to simulate spectral content?” Just as aberrations degrade the scenes that imaging systems strive to capture, they also degrade spectrometer performance, affecting not only the scene content but also the spectral resolution. The Ebert spectrograph as portrayed in Figure 13 is an ideal example of this. The Ebert is well corrected for coma and distortion as a result of symmetry within the system with respect to the stop located at the grating.11 However, uncorrected astigmatism results in curvature of the entrance slit image at the spectrometer focal plane. This slit image curvature is clearly visible in the spectrum displayed in Figure 13. One could model a scene incident on the entrance slit of this simple spectrometer and assess, using the virtual photograph technique just described, not only the spatial resolution loss, but also the spectral resolution loss due to aberrations.

3. THREE-DIMENSIONAL SCENE SIMULATIONS

All simulations described above have used 2D sources to represent scenes. In a previous paper,7 the authors demonstrated how defocus impacts imaging performance using just such a 2D source. This is not really a valid way to portray depth of focus. With a 2D simulation the scene has no depth, and hence defocus blur occurs uniformly across the entire scene. In practice, photographs of 3D scenes demonstrate the limited depth of focus of the camera. Defocus blur gradually degrades portions of the scene that are outside of the acceptable depth of focus of the camera. To simulate these effects we must first generate 3D scenes. To construct complex 3D scenes we use a 3D graphical design program called Rhinoceros®, from Robert McNeel & Associates. Rhinoceros offers features for deriving unconventional objects, including surface bending, blending, and patching, as well as uniform and non-uniform scaling in any combination of dimensions. Once the geometry is constructed, it is imported into ASAP and optical properties are assigned to each surface. In the 2D simulations described above, the bitmap becomes the source. In the 3D simulations described below, each 3D scene is illuminated with a simulated light source and rays are scattered toward the entrance pupil of the imaging system.12

3.1 Virtual photograph of a virtual car

To demonstrate, a 3D representation of a car was developed in Rhinoceros (see Figure 14). Rays from a collimated source are scattered off the car toward the entrance pupil of the camera lens and ultimately strike the camera focal plane, an absorbing surface with 480 x 640 pixels (see Figure 15). The model car is shrunk to the size of a toy car so that the lens operates at finite conjugates. Each surface within the car model is assigned RGB reflectance properties as well as scatter properties, as listed in Table 1. Figure 16 shows a series of virtual photographs of the car as it is moved through the camera’s best focus.

Note that the virtual photographs of the car appear flat or cartoon-like compared with the real cars, or photographs of cars, that we observe daily. This is the result of the simple Lambertian scatter properties applied to the car surfaces. Realistic scatter property modeling is essential to photometrically accurate representations of illuminated scenes. The importance of realistic scatter property modeling will be reiterated in the context of machine vision simulations in the discussion to follow.

Table 1. Reflectance property assignments for the car model. All surfaces are modeled as Lambertian scatterers.

Car Surfaces        Color   Red    Green   Blue
Trim and interior   White   1.00   1.00    1.00
Body                Blue    0.00   0.00    1.00
Background          Plum    0.55   0.40    0.55
Tires               Black   0.00   0.00    0.00
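
In this framework, the Table 1 entries act as per-channel flux weights applied to Lambertian-scattered rays. A minimal sketch of that assignment, assuming ideal cosine-weighted scattering about the local surface normal; the function and sampling scheme are illustrative, not the ASAP property model:

```python
import numpy as np

# RGB reflectances from Table 1 (all surfaces Lambertian)
SURFACES = {
    "trim_interior": (1.00, 1.00, 1.00),   # white
    "body":          (0.00, 0.00, 1.00),   # blue
    "background":    (0.55, 0.40, 0.55),   # plum
    "tires":         (0.00, 0.00, 0.00),   # black
}

def lambertian_scatter(surface, n_rays=1000, rng=None):
    """Scatter n_rays from a surface: cosine-weighted directions about the
    surface normal (local z axis), per-color flux scaled by the Table 1 reflectance."""
    rng = np.random.default_rng() if rng is None else rng
    u1, u2 = rng.random(n_rays), rng.random(n_rays)
    theta = np.arcsin(np.sqrt(u1))          # cosine-weighted polar angle
    phi = 2 * np.pi * u2
    dirs = np.column_stack([np.sin(theta) * np.cos(phi),
                            np.sin(theta) * np.sin(phi),
                            np.cos(theta)])
    flux_rgb = np.array(SURFACES[surface]) / n_rays   # split unit flux over the rays, per color
    return dirs, flux_rgb

dirs, flux = lambertian_scatter("body")
```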

3.2 Machine vision application

Thus far, with the exception of the Ebert spectrograph, the examples we have chosen are geared toward photography. In these applications, the ultimate end user is the human brain, and quality is subjective. In machine vision applications such as inspection systems, personal identification devices, electronic parts placement and inspection machines, and remote sensing target identification and tracking instruments, the end user is a computer algorithm and quality is not subjective: the computer either recognizes the scene or it does not. In these applications, computer programmers often wait until the vision system is fabricated and captured images are available before programming recognition algorithms. With the techniques described above, it is now possible to generate photometrically accurate virtual photographs of the scene during the vision system design process, so that computer programmers and optical engineers can design the best system for the application.


Figure 14. Car model as constructed in Rhinoceros

Figure 15. ASAP simulation demonstrating the 3D virtual photography process

Figure 16. Series of virtual photographs of the car model as it moves through focus

To illustrate this, consider the task of identifying, inspecting, and locating electronic parts with a machine vision system. Figures 17A and 17B show two electronic components, a 32-pin chip and a 37-ball chip, as modeled in Rhinoceros. As with the car simulation above, these 3D models were imported into ASAP, scatter properties were assigned to each surface, and rays were scattered from the parts toward the entrance pupil of the imaging lens. For these simulations, the source was a small disk emitter, somewhat like an LED illuminator. The emitter was located 20 mm from the electronic part center and at an angle of 45° with respect to the surface normal of the component body. The resulting virtual photographs are displayed in Figures 18A and 18B. It is important to point out that Figures 18A and 18B have been altered to saturate the specular reflected hot spots from the metal pins and ball solder contacts. This simplified demonstration identifies two key issues for the machine vision system inspecting these components:
1. The specular reflections from the metal portions of the electronic chips will be far more intense than the scatter from the body of the components, and will dwarf the signal unless the specularities are allowed to saturate, and
2. The specular reflections are not centered on the metal components, producing an asymmetric set of hot spots, which could be troublesome to a computer algorithm tasked to identify and locate the parts.

Figure 17A and 17B. Rhinoceros models of the 32-pin and 37-ball electronic parts

Figure 18A and 18B. Virtual photographs generated in ASAP of the 32-pin and 37-ball electronic parts

As with any computer simulation, these virtual photographs are only as good as the data used to construct them. Physical dimensions, surface reflectance and scatter properties, illumination system properties, and lens system parameters must all be accurate representations of the real system. Now that computer speed is sufficiently fast, the more accurate the model, the more realistic the simulation. As a final demonstration of this point, Rhinoceros provides a rendering tool to display modeled components in the presence of light sources. Refer once again to Figures 17A and 17B. These rendered images have many of the same attributes as the ASAP raytrace simulations. However, the results are in no way photometrically accurate. Reflectance and scatter properties are simplified mathematical models, and the lighting options are limited to simple point sources. Rhinoceros, or any of many other 3D rendering tools, may be useful for showing qualitative effects but is not appropriate for accurate photometric simulations.


4. SUMMARY

Increases in computer speed have made it possible to generate virtual photographs to display the performance of optical systems prior to fabrication. Table 2 summarizes the number of rays and raytrace durations for each virtual photograph contained in this paper. Though some of the runs were performed on a 200 MHz system while others were performed on a 500 MHz system, all results were scaled to represent raytrace times for the 500 MHz platform.

Table 2. Number of rays and raytrace durations for generating the virtual photographs in this paper

Virtual Photograph                           Number of Rays           Raytrace Time (hours)
Notre-Dame (Fig. 4)                          33,000,000               10
Parrots with diffraction at F/2.8 (Fig. 7)   5,600,000                3
Parrots with diffraction at F/22 (Fig. 8)    5,600,000                3
Parrots Monte Carlo (Fig. 9A)                2,000,000                0.7
Parrots grid-grid (Fig. 9B)                  22,000,000               6
Ebert multi-spectral (Fig. 12)               6,000,000                1.5
Car vs. focus (Fig. 15)                      24,000,000 x 6 images    36
32 pin component (Fig. 18A)                  50,000,000               11
37 ball component (Fig. 18B)                 50,000,000               11

Several methods for generating these images have been described. Examples of how these tools may be useful to designers and project teams have been presented. The simulation techniques presented, as well as the applications suggested herein, are intended simply as a starting point in the discussion of how these tools may benefit product development. Fortunately, these simulations will become even easier to perform since, according to Moore,6 computer speed will continue to increase dramatically over the next two decades.

ACKNOWLEDGEMENTS

The authors wish to thank Joe Shiefman for his development of the diffractive simulation technique described in Section 2.2.

REFERENCES

1. D. C. Sinclair, “Optical Design Software,” Handbook of Optics, Volume I, McGraw-Hill, USA (1995).
2. D. P. Feder, “Automatic Optical Design,” Appl. Opt. 2:1209-1226 (1963).
3. D. S. Grey, “Aberration Theories for Semiautomatic Lens Design by Electronic Computers,” J. Opt. Soc. Am. 53:672-680 (1963).
4. G. H. Spencer, “A Flexible Automatic Lens Correction Procedure,” Appl. Opt. 2:1257-1264 (1963).
5. C. G. Wynne and P. Wormell, “Lens Design by Computer,” Appl. Opt. 2:1223-1238 (1963).
6. G. E. Moore, Chairman Emeritus of Intel Corporation, statements from 1965 and 1999.
7. M. Côté and J. Tesar, “Optical System Image Irradiance Simulations,” SPIE International Optical Design Conference (June 1999).
8. A. Greynolds, “Vector formalization of the ray-equivalent method for general Gaussian beam propagation,” Proceedings of SPIE: Current Developments in Optical Engineering and Diffractive Phenomena 679:129-133 (1986).
9. G. L. Peterson, “Using Gaussian Beam Decomposition,” Photonics Tech Briefs (1999).
10. J. Schweyen and K. Garcia, “Geometrical Optical Modeling Considerations for LCD Projector Display Systems,” SPIE Vol. 3013:126-140 (1997).
11. W. G. Fastie, “Ebert Spectrometer Reflections,” Physics Today (January 1991).
12. M. Stevenson and M. Côté, “Modeling Optical Vision Systems with Innovative Software,” Vision Systems Design Magazine, PennWell (1999).

* Correspondence: Email: [email protected]; WWW: http://breault.com; Telephone: 520 721 0500; Fax: 520 721 9630.