
Anisoplanatic imaging through turbulent media: image recovery by local information fusion from a set of short-exposure images

Mikhail A. Vorontsov and Gary W. Carhart

U.S. Army Research Laboratory, Computational and Information Sciences Directorate, Adelphi, Maryland 20783

Received June 15, 2000; revised manuscript received October 18, 2000; accepted December 4, 2000

A wide-field-of-view white-light imaging experiment with artificially generated turbulence layers located between the extended object and the imaging system is described. Relocation of the turbulence sources along the imaging path allowed the creation of controllable anisoplanatic effects. We demonstrate that the recently proposed synthetic imaging technique [J. Opt. Soc. Am. A 16, 1623 (1999)] may result in substantial improvement in image quality for highly anisoplanatic conditions. It is shown that for multisource objects located at different distances the processing of turbulence-degraded short-exposure images may lead to a synthetic image that has an image quality superior to that of the undistorted image obtained in the absence of turbulence (turbulence-induced image quality enhancement). © 2001 Optical Society of America

OCIS codes: 100.0100, 200.0200, 010.1080.

1. INTRODUCTION

The desired goal for imaging through turbulence can be presented by a simple formula: diffraction-limited image quality obtained within a wide field of view and observed during a reasonably long (continuous) registration time. This goal is difficult to achieve in a single imaging system. If we sacrifice the wide-field-of-view requirement, we may use conventional adaptive optics developed for astronomical observations of bright natural stars1 or consider the more sophisticated guide-star adaptive optics approach.2 Indeed, these adaptive optics techniques have demonstrated close-to-diffraction-limited imaging system performance during a continuous observation time, but only for small isoplanatic angles (5–10 μrad for typical astronomical objects).1,3

The field of view of an adaptive imaging system is restricted by the effects of anisoplanatism.4–6 Under anisoplanatic imaging conditions typical of wide-field-of-view systems, angular separations between different points of the viewed object may result in uncorrelated (or weakly correlated) phase distortions caused by wave propagation through different regions of the turbulent medium. These uncorrelated phase distortions pose a difficult problem for adaptive correction. The multiple-guide-star technique as well as multiconjugate and scanning conventional adaptive optics may help to extend the field of view to some degree, but at the expense of significantly increasing both system complexity and cost.7,8 For a number of imaging scenarios, we may not need continuous-time image quality improvement and would be satisfied to have just a single wide-field-of-view good-quality image obtained during some reasonable observation and computational time. This simplification of the problem statement has resulted in a number of image postprocessing techniques aimed at overcoming the effects of turbulence through computer processing of either a single long-exposure image or a large number of short-exposure images.3 Most of these techniques are based on the assumption (typical for astronomical applications, where turbulent regions are located close to the imaging telescope) that the point-spread function (PSF) of the imaging system including turbulence is position independent, which is the condition for weak anisoplanatism. These methods typically require extensive computer calculations that involve two sets of short-exposure images obtained simultaneously, such as images of the object and a set of defocused images for phase diversity methods,9,10 an additional set of reference source images in speckle imaging techniques,11 and both object images and wave-front data in the deconvolution from a wave-front sensing algorithm.12

The assumption of weak anisoplanatism is not practical if we consider ground-to-ground or underwater imaging of extended or multiple-point-source objects with turbulence regions located anywhere between an imaging system and an imaging scene. This truly anisoplanatic imaging scenario is a challenge for both analytical and numerical analysis. Among imaging techniques that are more or less "comfortable" with anisoplanatic conditions are adaptive systems based on direct image quality performance (sharpness metric) optimization,13,14 an image processing technique based on local shift removal,15,16 the "lucky-frame" selection technique,17,18 and the correction of high-order space-variant aberrations with phase-diverse speckles.19 The term "comfortable" does not always mean efficient. It is rather an indication that these techniques do not completely fail under anisoplanatic imaging conditions and can even produce some useful image quality improvement.

Adaptive optics based on the image quality optimization technique (model-free optimization) does not require either a guide star or wave-front measurements and in this sense is independent of imaging conditions. However, with the use of adaptive wave-front aberration correction, image quality can be improved only within the isoplanatic angle unless several wave-front aberration-correcting elements are used. The best that can be achieved under anisoplanatic imaging conditions with model-free adaptive optics is local image quality improvement, leading to the appearance of an unblurred window located somewhere within a wide-field-of-view image scene. Note that this local-area image quality improvement may result in a more severe degradation of the remainder of the image scene.20

For the image restoration technique based on local shift removal, a different idea is used. Local shifts in short-exposure images are calculated based on the local correlation with an image prototype.15,16 The obtained shift maps are then used to compute a summed image having corrected local shifts. The summed image is still blurred, and additional iterative deblurring image processing is required. This method requires extremely extensive computations and aims to remove only image degradation associated with local wave-front tilt components.

The frame selection (lucky-frame) technique uses an image quality metric to select a subset of the best-quality images from a short-exposure video data stream. The output (processed) image is formed by averaging only the image frames belonging to this subset.3,17,18 The problem with this technique is the relatively low probability of the appearance of a good-quality image. For isoplanatic imaging conditions the probability P_fr of registering a good-quality image frame depends on the ratio of the imaging system diameter D to the phase distortion spatial correlation radius r_0 (Fried radius for atmospheric turbulence).21 The probability P_fr decreases exponentially as D/r_0 increases.17 One would expect the probability of getting a lucky frame under anisoplanatic imaging conditions to be even smaller. In this case a single lucky frame can be obtained only if phase distortions are simultaneously small for all waves originating from spatially separated object points. An important feature of anisoplanatic imaging is that image quality degradation is spatially nonuniform, so that a single short-exposure image may contain local regions having relatively good image quality (lucky regions).6

The probability P_loc of having a lucky region within a single short-exposure frame should be higher than the corresponding probability of getting a lucky frame under isoplanatic conditions (P_loc > P_fr). Indeed, the probability of having a lucky region corresponding to a specific isoplanatic angle is essentially the same as the probability of having a lucky frame P_fr under isoplanatic conditions. Nevertheless, an imaging scene may contain many (assume N) isoplanatic angles. This means that the probability P_loc of getting a lucky region independent of its location is higher than P_fr (at least N times higher).
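As a rough numerical illustration of this argument (not a calculation from the paper), the sketch below compares P_fr with the chance of finding at least one lucky region among N isoplanatic patches, under the simplifying assumption that the patches distort independently; the values of P_fr and N are arbitrary placeholders.

```python
# Illustrative sketch only: patch independence is an assumption, and the
# numerical values below are placeholders, not measurements from this paper.
P_fr = 0.01   # probability that an entire frame is lucky (isoplanatic case)
N = 50        # assumed number of isoplanatic patches across the field of view

# Probability that at least one of the N patches is lucky in a given frame.
P_loc = 1.0 - (1.0 - P_fr) ** N

print(f"P_fr  = {P_fr:.4f}")
print(f"P_loc = {P_loc:.4f}  (about {P_loc / P_fr:.0f} times higher)")
# For small P_fr this grows nearly linearly with N, so even a modest set of
# short-exposure frames is likely to contain usable lucky regions.
```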

The probability P_loc is also dependent on the location(s) of the turbulent layer(s). A simple geometrical consideration shows that for multisource (extended) objects, relocation of the turbulence layer from the telescope pupil toward the object plane may result in an increase in the probability of having lucky image regions, which is due to an increase in the characteristic phase distortion correlation radius r_0 (Ref. 3). From a qualitative consideration of all of these factors, we can expect that the probability of obtaining lucky regions under anisoplanatic imaging conditions should be noticeably higher than the probability of having a single lucky frame taken under isoplanatic imaging conditions as estimated by Fried.17,22 This means that a relatively small set of short-exposure anisoplanatic images may contain a few good-image-quality regions. The problem becomes how to identify and extract these lucky regions and how to combine (fuse) the obtained information into a single frame.

In Ref. 23 we described an image processing technique based on a nonlinear evolution partial differential equation with anisotropic gain (synthetic imaging technique). This technique allows simultaneous (in parallel) extraction and fusion of lucky regions from a short-exposure video data stream obtained under anisoplanatic imaging conditions. In Ref. 24 this synthetic imaging technique was applied to the processing of atmospheric turbulence-degraded images.

Here we continue analysis of the synthetic imaging technique through an experimental study of its efficiency under diverse anisoplanatic imaging conditions. We describe results of a laboratory-based imaging experiment with artificially generated turbulence layers located between the extended object being viewed and the imaging system. Relocation of the turbulence sources along the imaging path allowed the creation of controllable anisoplanatic effects in a white-light imaging system.

We demonstrate that the synthetic imaging technique may result in substantial improvement in image quality for anisoplanatic conditions. For multisource objects located at different distances, we obtained an interesting result: the processing of turbulence-degraded short-exposure images may lead to a synthetic image that has an image quality superior to that of the undistorted image obtained in the absence of turbulence, i.e., the turbulence-induced image quality enhancement effect. The possibility of obtaining image fragments with turbulence-induced image quality enhancement (superlucky regions) has been discussed and demonstrated experimentally in Ref. 25. The synthetic imaging technique allows for automatic identification, extraction, and use of these superlucky image regions if they are present in the examined set of short-exposure video data.

2. IMAGING SYSTEM WITH LABORATORY-GENERATED TURBULENCE LAYERS

A. Controllable Anisoplanatic Effects: Modeling Issues
In the experiments described below, we were not trying to model dynamical phase distortion effects similar to those that atmospheric turbulence induces in actual imaging systems. There are several reasons for such disregard of the realities of an actual atmospheric imaging experiment and the lack of appropriate respect for the traditionally used statistical models of atmosphere-induced phase distortions. First, the most severe impact of anisoplanatic effects on imaging system performance occurs for low-height ground-to-ground and underwater imaging applications, where the statistical properties of the turbulence-induced phase distortions are often not well defined and are strongly dependent on a number of environmental factors. Second, the goal of these experiments is to analyze the efficiency of the synthetic imaging technique under controllable and well-defined anisoplanatic imaging conditions, which is extremely difficult to achieve in an actual outdoor or underwater experiment. In our previous experiments under actual atmospheric conditions, we demonstrated that processing of short-exposure images with the synthetic imaging technique may result in image quality improvement for the specific imaging conditions under which the experiment was performed.24 The question raised by these experiments is how efficient this technique could be in different imaging scenarios. Unfortunately, in modeling atmospheric turbulence-induced phase distortions under laboratory conditions, there is no easy way to enjoy both "right" (or at least well-established) statistics for the phase distortions and the ability to create and examine diverse anisoplanatic imaging conditions.

B. Experimental Setup
The scheme of the experimental setup shown in Fig. 1a consists of an imaged scene that includes several extended objects and an imaging system placed at a distance L = 14 m from the front object of the imaged scene. To create localized turbulent regions along the propagation path, we used three identical 1500-W baseboard electrical heaters. Each heater consists of a 1-m-long heating element oriented orthogonal to the direction of wave propagation and located 14.0 cm below the system's optical axis. A metal grating with cell size 4 mm was placed on top of the heating element to break the airflow into smaller streams and create refractive-index fluctuations small in comparison with the imaging telescope diameter D = 90 mm. Positions of the imaging telescope, the metal grating, and the heating element are shown in Fig. 1a (bottom left corner). Two heaters were attached together, and the third heater was used as a separate turbulence source. To distinguish between these two turbulence sources, we refer to the set of two attached heaters as turbulence generator 1 and the individual heater as turbulence generator 2. Turbulence generator 1 was located at certain positions (stations). A total of nine stations with a 1.2-m separation between them were used (see Fig. 1a). Turbulence generator 2 (if used) was always placed at the first station, located 3 m from the imaging telescope. A picture of the three-dimensional imaged scene is shown in Fig. 1b. The imaged scene consists of several objects placed at different distances from the telescope: closest to the telescope is Peter the Great (P), next is Taras Bulba (B), then a truck viewed from behind (T1), a truck seen from the front (T2), and a garage (G). The imaging telescope was focused at the front plane of the truck T2, with the truck lights used as a reference. An undistorted image of the scene taken in the absence of turbulence is shown in Fig. 1c. To create an image that has spatially nonuniform quality even in the absence of turbulence, we chose the distance between the front and back objects to be larger than the imaging system depth of field. This resulted in noticeable defocus of the front model images (P and B), as seen in Fig. 1c.

Fig. 1. (a) White-light imaging system setup, (b) imaged scene geometry, (c) image in the absence of turbulence.

The imaging system consists of a Maksutov–Cassegrain optical telescope (Celestron C90) with a focal length of 1 m. Attached to it was a digital (8-bit) fast-framing DALSA camera with 256 × 256-pixel resolution. The imaging system's view angle was near 7.2 mrad. In the experiments described here, we always used a set of 1000 short-exposure images taken at a rate of 100 frames per second. An EPIX frame grabber with a personal computer (PC) was used for camera control and data acquisition. Approximately 10 s were required to register and save a set of 1000 frames.

C. Turbulence Characterization
To characterize imaging system performance in the presence of the laboratory-generated turbulence, we used the simple reference two-dimensional object shown in Fig. 2a. This reference object contains a rectangular array of 8 × 8 small (0.8-mm) white spots (point sources) on a black background separated by distances of 10 mm. The reference object was located in the plane of the truck T1 and filled the entire imaging system field of view (8 cm). A single short-exposure image of the reference object, such as shown in Fig. 2b, represents an array of point-spread functions (short-exposure PSF's) corresponding to different view angles. The presence of the turbulent regions located along the propagation path resulted in random degradation and motion of the point-source images, as seen in Fig. 2b. Under isoplanatic conditions the imaging system PSF is independent of the point-source location, since degradation and motion of the point-source images are mutually correlated. The presence of anisoplanatic effects manifests itself as a decrease in the correlation distance, leading to a position-dependent imaging system PSF. Mutual correlation of the point-source image distortions is rather sensitive to the location of the turbulent regions, which allows one to control anisoplanatic effects by moving the turbulence sources along the propagation path.

In the experiments turbulence generator 1 was sequentially placed at each of the nine stations, and 1000 short-exposure images of the reference object were recorded at each location. For every short-exposure image and each point-source image, we calculated the following characteristics: coordinates of the point-source image centroids, x_j and y_j (j = 1, ..., 64); centroid displacements with respect to the undistorted positions, δx_j = x_j − ⟨x_j⟩ and δy_j = y_j − ⟨y_j⟩; absolute values of the centroid displacements, δr_j = [(δx_j)² + (δy_j)²]^{1/2}; and point-source image widths w_j = [(w_j^x)² + (w_j^y)²]^{1/2}. Here and below, ⟨ ⟩ denotes averaging over 1000 frame values. The centroid coordinates x_j and y_j and the widths w_j^x and w_j^y were calculated by using the following formulas:

x_j = E_j^{-1} \int_{V_j} x \, I_j(r) \, d^2r, \qquad y_j = E_j^{-1} \int_{V_j} y \, I_j(r) \, d^2r,

(w_j^x)^2 = E_j^{-1} \int_{V_j} (x - \langle x_j \rangle)^2 I_j(r) \, d^2r,

(w_j^y)^2 = E_j^{-1} \int_{V_j} (y - \langle y_j \rangle)^2 I_j(r) \, d^2r,

E_j = \int_{V_j} I_j(r) \, d^2r,   (1)

where r = {x, y} is a radius vector in the image plane and I_j(r) is the intensity distribution for the jth point-source image measured inside the square area V_j = S/64 (S is the visible area of the reference object). The centers of the square areas V_j coincide with the centers of the undistorted point-source images.
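A minimal NumPy sketch of the centroid and width estimates of Eq. (1) for a single point-source sub-image is given below; the function name, the pixel-pitch argument, and the assumption that the sub-image V_j has already been cropped out of the frame are illustrative choices, not the authors' code.

```python
import numpy as np

def centroid_and_width(I_j, mean_xy=None, dx=1.0):
    """Centroid (x_j, y_j), energy E_j, and widths (w_j^x, w_j^y) of one
    point-source sub-image I_j [Eq. (1)]. `mean_xy` is the frame-averaged
    centroid (<x_j>, <y_j>); if None, the instantaneous centroid is used.
    `dx` is the pixel pitch. Names and cropping are illustrative only."""
    ny, nx = I_j.shape
    x = (np.arange(nx) * dx)[None, :]          # x coordinate of each pixel
    y = (np.arange(ny) * dx)[:, None]          # y coordinate of each pixel

    E_j = I_j.sum() * dx ** 2                  # E_j = ∫ I_j d^2r
    x_j = (x * I_j).sum() * dx ** 2 / E_j      # centroid x_j
    y_j = (y * I_j).sum() * dx ** 2 / E_j      # centroid y_j

    x0, y0 = (x_j, y_j) if mean_xy is None else mean_xy
    w_x = np.sqrt(((x - x0) ** 2 * I_j).sum() * dx ** 2 / E_j)
    w_y = np.sqrt(((y - y0) ** 2 * I_j).sum() * dx ** 2 / E_j)
    return x_j, y_j, E_j, w_x, w_y
```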

The values δx_j, δy_j, δr_j, and w_j were used to obtain the following approximations for the normalized covariance coefficients of the mutual point-source image displacements and widths:

B_{ij}^{\delta x} = \langle \delta x_j \delta x_i \rangle / (\sigma_j^{\delta x} \sigma_i^{\delta x}), \qquad B_{ij}^{\delta y} = \langle \delta y_j \delta y_i \rangle / (\sigma_j^{\delta y} \sigma_i^{\delta y}),

B_{ij}^{\delta r} = \langle \delta r_j \delta r_i \rangle / (\sigma_j^{\delta r} \sigma_i^{\delta r}), \qquad B_{ij}^{w} = \langle w_j w_i \rangle / (\sigma_j^{w} \sigma_i^{w}),   (2)

where σ_j^{δx}, σ_j^{δy}, σ_j^{δr}, and σ_j^{w} are standard deviations corresponding to δx_j, δy_j, δr_j, and w_j. The covariance coefficients (2) were calculated for point-source images located along the middle horizontal (x) and vertical (y) lines of the reference object, as shown in Fig. 2a. The index j in Eq. (2) was fixed and corresponded to the left margin point-source image for the x line and the top margin point-source image for the y line. In this geometry the index i corresponds to the distance (Δx or Δy) between the margin point source and the ith image in the x or the y direction as measured in the image plane: Δx = mi (cm) and Δy = mi (cm), where m = F/(L − F) is the imaging system magnification factor. For L = 14 m and F = 1.0 m, we have m = 0.077. The corresponding covariance coefficients B^{δx}(Δx) ≡ B_{ij}^{δx}, B^{δy}(Δy) ≡ B_{ij}^{δy}, and B^{w}(Δx) ≡ B_{ij}^{w} are shown in Fig. 3. The distance l (normalized by L) specifies the position of the station as measured from the plane of the telescope: l = 0.21 corresponds to the first station, and l = 0.91 corresponds to the last station. As seen from Figs. 3a and 3b, the generated turbulence is highly anisotropic. The covariance coefficients B^{δx}(Δx) for point-source image centroid motion in the x direction (Fig. 3a) are smaller and decrease more drastically as the turbulence generator is moved toward the object than do the corresponding covariance coefficients B^{δy}(Δy) characterizing correlation of the centroid motions in the y direction, shown in Fig. 3b. This result is expected, as the metal grating placed on top of the heating elements broke the airflow into small streams in only the x direction. The characteristic size of the airflow streams in the y direction was governed only by thermal convection. For l > 0.64 the covariance functions B^{δx}(Δx) and B^{w}(Δx) vanish at Δx ≈ 0.5X, where X is the visible size of the reference object. This indicates the presence of noticeable anisoplanatic effects.
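The coefficients in Eq. (2) can be estimated from the recorded per-frame sequences as ordinary normalized covariances; the short sketch below assumes mean-removed (Pearson-type) normalization, which for the centroid displacements δx_j and δy_j coincides with Eq. (2) because their means are zero by construction. Array names are assumptions.

```python
import numpy as np

def normalized_covariance(a_j, a_i):
    """B_ij = <δa_j δa_i> / (σ_j σ_i) for two per-frame sequences (length 1000
    here). Means are removed before averaging; for the centroid displacements
    δx_j and δy_j they are already zero by construction."""
    da_j = a_j - a_j.mean()
    da_i = a_i - a_i.mean()
    return float((da_j * da_i).mean() / (da_j.std() * da_i.std()))

# Usage sketch: x_disp[k, i] holds δx_i in frame k for the sources on the
# middle horizontal line, with i = 0 the left-margin reference source:
# B_dx = [normalized_covariance(x_disp[:, 0], x_disp[:, i]) for i in range(8)]
```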

The spatial correlation properties of the point-source images appeared different when the second turbulence generator was placed at the position closest to the imaging telescope. The corresponding covariance functions B^{δx}(Δx) for this case are shown in Fig. 3d. In contrast with the case of a single turbulent layer (Fig. 3a), for l > 0.64 the covariance function B^{δx}(Δx) has two slopes: one in the vicinity of small point-source image separations Δx and the second in the vicinity of large Δx. This indicates the presence of two characteristic correlation scales. For small separation distances (Δx < 0.25X), correlation properties are primarily determined by the turbulence layer closest to the imaged scene. The presence of long-range correlations in Fig. 3d for Δx > 0.6X can be explained by the influence of the turbulent layer located closest to the imaging telescope. This turbulent layer results in the appearance of highly correlated (essentially identical) phase distortion components for all point-source images.

Fig. 2. Short-exposure images of the reference object (a) without and (b) with the turbulence region (a single turbulence generator at the position z = 0.64L). Point-source images used in the analysis are shown inside the dashed rectangular boxes in (a).

Fig. 3. Mutual covariance coefficients for point-source images shown in Fig. 2a with turbulence layers created by using (a)–(c) a single and (d) two turbulence generators located at different distances l: (a) and (d), covariance coefficients for point-source image centroid motion in the x direction and (b) in the y direction; (c) covariance coefficient for point-source image widths. The spatial scale Δx/m corresponds to distances between point-source images in the object plane measured in centimeters.

Moving the turbulent region toward the object resulted in a monotonic decrease of the averaged point-source image width w = ⟨w_j⟩, as demonstrated in Fig. 4a (the index j corresponds to the left margin point-source image in Fig. 2a). The point-source image width in Fig. 4a is normalized by the corresponding point-source image width w_0 without turbulence. The decrease of the image width is less pronounced when there are two turbulent layers. The standard deviation for point-source image centroid position fluctuations (jitter) normalized by w_0 is shown in Fig. 4b.

The magnitude of turbulence-induced point-source image distortions can also be characterized by the parameter β = ⟨σ_j^{δr}⟩/⟨w_j⟩ (jitter relative to blur width). The parameter β is shown in Fig. 4c as a function of the turbulence layer position l. When there is a single turbulent layer, the parameter β reaches a maximum for l ≈ 0.7. The existence of this maximum can be explained based on the following qualitative consideration. A turbulent layer in front of the telescope results in strongly pronounced point-source image blurring (⟨w_j⟩ increase). This image blurring effect decreases when the turbulent layer is moved toward the object, while the amplitude of the random image displacements increases (0.2 < l < 0.4). In the limiting case when the turbulent layer is placed in the object plane, the influence of turbulence vanishes. Thus there is a plane, indicated by the maximum of the parameter β, where the turbulent layer has its maximum impact on the point-source image degradation. Having an additional turbulent layer located close to the telescope plane caused the maximum to vanish (compare the curves in Fig. 4).

Fig. 4. (a) Normalized averaged point-source image width, (b) normalized standard deviation for point-source image centroid fluctuations, and (c) jitter parameter β, all versus normalized distance l: imaging system with single turbulence generator [curves 1 (solid curves)] and with two turbulence generators [curves 2 (dashed curves)].

3. IMAGE QUALITY ANALYSIS IN THE PRESENCE OF ANISOPLANATIC EFFECTS

A. Image Quality Metric
In this section we analyze the influence of turbulence region location on image quality of the extended three-dimensional object shown in Fig. 1b. As an image quality metric (sharpness function), we consider the metric J based on the optoelectronic edge detection scheme described in Ref. 23. The advantage of this image quality metric is that it can be measured in real time by using a coherent optical processor consisting of a high-resolution phase spatial light modulator and photoarray (very-large-scale integration imager) coupled through coherent wave diffraction over a short distance L_d (Ref. 26).

The mathematical model for this optoelectronic image quality metric system includes an equation describing free-space propagation of the phase-modulated wave with complex amplitude A(r, z = 0) = A_0 exp[imI(r)] (phase image) over the distance z = L_d:

-2ik \frac{\partial A(r, z)}{\partial z} = \nabla^2 A(r, z),   (3)

where I(r) is the intensity distribution of the short-exposure image, k = 2π/λ is the wave number, and m is the phase modulation amplitude. Diffraction results in a nonuniform intensity distribution I_d(r) = |A(r, L_d)|² having enhanced edges.23

The following expression was used as an image quality metric:

J = \int [I_d(r) - I_0]^2 \, d^2r.   (4)

As shown in Ref. 26, for short propagation distances the following approximation is valid: I_d(r) − I_0 ≈ ν∇²I(r), where ∇² is the Laplacian operator and ν is a proportionality coefficient. Hence for short propagation distances L_d, instead of Eq. (4), we have J ≈ ∫[∇²I(r)]² d²r. This approximation for the image quality metric coincides with the sharpness function introduced in Ref. 13. The major advantage of the optical edge-image detection system is that it allows estimation of the image quality metric without direct calculation of the input image spatial derivatives.
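For digital frames, the short-distance approximation J ≈ ∫[∇²I(r)]² d²r can be evaluated directly with a discrete Laplacian; the sketch below is a numerical stand-in for the optically measured metric of Eq. (4), not a model of the optoelectronic processor itself.

```python
import numpy as np

def sharpness_metric(I):
    """Discrete estimate of J ≈ ∫ [∇²I(r)]² d²r for a 2-D image array I,
    using a five-point Laplacian (periodic borders via np.roll are adequate
    for this sketch)."""
    I = I.astype(float)
    lap = (np.roll(I, 1, 0) + np.roll(I, -1, 0) +
           np.roll(I, 1, 1) + np.roll(I, -1, 1) - 4.0 * I)
    return float((lap ** 2).sum())
```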

B. Image Quality and Turbulence Region Location
Calculations of the image quality metric (4) were performed for each of 1000 recorded short-exposure images and for each location of the turbulence generator. The parameter m characterizing the depth of phase modulation in Eq. (3) was chosen to achieve a dynamic range equal to 2π rad. The diffraction distance L_d used in calculations was fixed: L_d = 1.2kb², where b is the camera pixel size. The selected values for L_d and m provided a good-quality edge image I_d(r) computed from the undistorted image frame I_0(r). For analysis of image quality metric temporal behavior, we used the correlation function B_J(iΔt) = ⟨[J(nΔt) − ⟨J⟩][J(nΔt + iΔt) − ⟨J⟩]⟩, where ⟨ ⟩ denotes averaging over a number of short-exposure frames (n = 1, ..., 1000) and Δt corresponds to the time interval between sequential frames (Δt = 10 ms).
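The correlation function B_J(iΔt) is a plain temporal autocovariance of the recorded metric sequence; a brief sketch follows, with the frame interval Δt = 10 ms taken from the text and everything else illustrative.

```python
import numpy as np

def metric_autocovariance(J, max_lag):
    """B_J(i·Δt) = <[J(n) - <J>][J(n + i) - <J>]> estimated from the metric
    sequence J (one value per frame), for lags i = 0 .. max_lag."""
    dJ = np.asarray(J, dtype=float) - np.mean(J)
    n = len(dJ)
    return np.array([(dJ[: n - i] * dJ[i:]).mean() for i in range(max_lag + 1)])

# The lag at which B_J / B_J(0) falls to 1/e gives the correlation time;
# with Δt = 10 ms the text reports τ_cor ≈ 50 ms.
```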

Examples of the image quality metric temporal dynamics (frame-to-frame image quality evolution curves) corresponding to different locations of turbulence generator 1 are shown in Fig. 5a. A typical normalized correlation function B_J(i) calculated for a single turbulence generator is presented in Fig. 5b. The characteristic correlation time of phase distortion change, τ_cor, corresponding to a 1/e falloff in the correlation function value, was near 50 ms, independent of the turbulence generator location. Repositioning the turbulence generator resulted in changes to both the image quality metric average value ⟨J⟩ and the metric fluctuation amplitude. For each location l of the turbulence generator, we calculated the averaged (long-exposure) image I_av(r) = ⟨I(r)⟩ and selected the worst [I_w(r)] and best [I_b(r)] images among the 1000 short-exposure frames. Selection of the worst and best frames was based on image quality metric analysis. Image quality metric values corresponding to the average (J_av), best (J_b), and worst (J_w) frames normalized by the image quality metric for the undistorted image (J_0) are presented in Fig. 6a for a single turbulence region (turbulence generator 1) and in Fig. 6b for two turbulence layers. As seen in Fig. 6a, moving the turbulence generator toward the imaged scene in the region 0.21 < l < 0.81 caused the averaged-frame image quality metric and the difference in image quality between the best and worst images, ΔJ = J_b − J_w, to increase from ΔJ = 0.13 for l = 0.21 to ΔJ = 0.23 for l = 0.64 and then ΔJ = 0.18 for l = 0.81. When the turbulence generator moved close to the object plane (l > 0.81), the quality of the averaged frame increased, while the metric difference ΔJ decreased, approaching J_av/J_0 = 1 and ΔJ = 0 for l = 1. When turbulence generator 2 was placed in the position closest to the imaging telescope (l = 0.21) and turbulence generator 1 was sequentially moved toward the imaged scene (Fig. 6b), the behavior of the image quality metric was quite different. Image quality of the averaged frame, J_av, was primarily determined by the turbulent layer located closest to the imaging telescope, with only a slight increase noted when the second turbulence generator was placed near the object (curve 2 in Fig. 6b).

Fig. 5. Image quality metric temporal dynamics: (a) image quality fluctuations for sets of 1000 short-exposure images taken at a rate of 100 frames per second for different locations of turbulence generator 1 and (b) normalized image quality temporal correlation function B_J(iΔt) for Δt = 10 ms and l = 0.81. The image quality J is normalized by the image quality of the undistorted image, J_0.

The level of image quality metric fluctuations is an important characteristic of an anisoplanatic imaging system, as it impacts the efficiency of the synthetic imaging technique. The normalized standard deviations σ_J/J_0 for image quality fluctuations in the case of a single and two turbulence regions are shown in Fig. 6c. With a single turbulence layer, the standard deviation achieved its maximum when the turbulence generator was located at the distance l = 0.73. Similar behavior was also observed for point-source images, as discussed in Subsection 2.C (Fig. 4c). For two spatially separated turbulence layers, the standard deviation of the image quality fluctuations σ_J monotonically increased as the turbulent region moved toward the imaged scene (Fig. 6c). Examples of the best and worst frames corresponding to l = 0.21 (turbulence generator 1), l = 0.47 (turbulence generator 1), and l = 0.73 (two turbulence generators) are shown in Fig. 7. In all cases the difference in image quality between the best and worst frames is quite noticeable.

Fig. 6. Influence of the turbulence layer location(s) on image quality metric: normalized image quality metric corresponding to the best J_b (curve 1), frame-averaged J_av (curve 2), and worst J_w (curve 3) frames for (a) a single and (b) two turbulence generators at different locations; (c) normalized image quality metric standard deviation for a single (curve 1) and two (curve 2) turbulence generators.

Fig. 7. Image quality variation within a set of 1000 frames taken at different turbulence generator location(s). The left column corresponds to the best and the right column to the worst frames: (a), (b) l = 0.21 (single turbulence generator); (c), (d) l = 0.47 (single turbulence generator); (e), (f) l = 0.73 (two turbulence generators).

4. SHORT-EXPOSURE VIDEO DATA PROCESSING WITH A SYNTHETIC IMAGING TECHNIQUE

A. Local-Area Image Quality Analysis: Image Quality Map
Visual analysis of short-exposure images taken in the presence of dynamical phase distortions shows that a number of frames have quite strong spatial nonuniformity in their image quality. This effect was especially noticeable under anisoplanatic conditions when the turbulence generator was moved from the telescope pupil toward the object plane (0.4 < l < 0.9), i.e., under conditions of weakly correlated point-source images (see Fig. 3).

To identify good-quality regions belonging to different short-exposure image frames, we consider the local-area image quality metric (image quality map) derived through convolution of the spatially modulated component of the edge-image intensity distribution I_d(r) in Eq. (4) with a Gaussian kernel G(r, a) (Ref. 23):

J(r) = \int [I_d(r') - I_0]^2 G(r' - r, a) \, d^2r' \approx \int [\nabla^2 I(r')]^2 G(r' - r, a) \, d^2r',   (5)

where G(r, a) = exp[−(x² + y²)/a²] and a is the kernel radius. The image quality map characterizes the contribution of high spatial spectral components (edges) in an image frame local area of radius a with center point r. For a kernel size 2a large in comparison with the image size, the function G(r, a) approaches 1, and hence J(r) coincides with the image quality metric J [Eq. (4)]. To increase the sensitivity of the image quality map in selecting "lucky" regions, the parameter a should match the characteristic correlation radius of image quality nonuniformity r_c, a parameter that is in fact difficult to define in practice. For r_c we used the characteristic spatial scale for point-source image mutual correlation analyzed in Subsection 2.C. In the calculations described below, the selected kernel size 2a was approximately equal to the minimum correlation distance observed in the experiments with point-source images (see Fig. 3): 2a = 0.1X, where X is the image size. At the image edges the function I_d(r) was smoothed by using a super-Gaussian windowing function W(r) = exp[−(x⁸ + y⁸)/X⁸]. Calculation of the convolution integral (5) was performed by using a spectral representation of the functions [I_d(r) − I_0]W(r) and G(r, a) in Eq. (5).
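On a digital frame the map of Eq. (5) can be approximated by an FFT-based convolution of the squared Laplacian image with the Gaussian kernel; the sketch below illustrates that computation (circular boundaries, no super-Gaussian window) and is not a reproduction of the authors' optoelectronic implementation.

```python
import numpy as np

def quality_map(I, a):
    """Local image quality map J(r) of Eq. (5), approximated as the circular
    convolution of [∇²I]² with the Gaussian kernel exp[-(x²+y²)/a²]
    (a in pixels), computed through FFTs."""
    I = I.astype(float)
    lap = (np.roll(I, 1, 0) + np.roll(I, -1, 0) +
           np.roll(I, 1, 1) + np.roll(I, -1, 1) - 4.0 * I)
    edge_sq = lap ** 2

    ny, nx = I.shape
    y = np.fft.fftfreq(ny) * ny            # signed pixel offsets from 0
    x = np.fft.fftfreq(nx) * nx
    G = np.exp(-(x[None, :] ** 2 + y[:, None] ** 2) / a ** 2)

    return np.real(np.fft.ifft2(np.fft.fft2(edge_sq) * np.fft.fft2(G)))
```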

B. Evolution Equation with Anisotropic Gain
The image quality map makes it possible to locate short-exposure image regions where the quality of the image is either preserved or only slightly degraded. The synthetic imaging technique operates with a large set of short-exposure images I(r, t_n) taken at sequential times t_n = nΔt (n = 0, 1, ..., N − 1). It aims to incorporate (fuse) these lucky regions belonging to different image frames and create an output visual data stream I_s(r, t_n) having improved image quality.23 The following form of the evolution equation with anisotropic gain was used for processing short-exposure images obtained in the experiment:

\tau \frac{\partial I_s(r, t)}{\partial t} = -K \delta(r, t) [I_s(r, t) - I(r, t_n)], \qquad t_n \le t < t_{n+1},   (6)

where τ and K are coefficients. The anisotropic gain function is given by

\delta(r, t) = \begin{cases} J(r, t_n) - J_s(r, t) & \text{for } J(r, t_n) > J_s(r, t) \\ 0 & \text{otherwise.} \end{cases}   (7)

The first short-exposure image frame I(r, t_0) from the set of N input images I(r, t_n) was used as an initial condition for the synthetic image evolution for the first time interval 0 ≤ t < t_1 [I_s(r, t = 0) = I(r, t_0)].

The anisotropic gain function δ(r, t) in Eq. (7) causes synthetic image evolution to be dependent on the difference between the image quality maps for the processed [J_s(r, t)] and input [J(r, t_n)] images. This prevents synthetic image change in regions where input image quality map values are less than the corresponding values for the processed image. The evolution equation with anisotropic gain (6) in its continuous-time form offers the potential for parallel analog implementation as a very-large-scale integration image processing system.23

C. Synthetic Image Processing: Experimental Results
To process experimental video data using a digital computer (600-MHz Dell PC), we applied the following discrete-time version of Eq. (6):

I_{n+1}^{s}(r) = I_{n}^{s}(r) - K \delta_n(r) [I_{n}^{s}(r) - I_n(r)],

\delta_n(r) = \begin{cases} J_n(r) - J_{n}^{s}(r) & \text{for } J_n(r) > J_{n}^{s}(r) \\ 0 & \text{otherwise.} \end{cases}   (8)

The synthetic image I_n^s(r) was updated for each input short-exposure image I_n(r), n = 0, 1, ..., N − 1 (N = 1000). The synthetic image quality maps J_n^s(r) were calculated at each step of the iteration process (8). To compare image quality of the short-exposure [I_n(r)] and synthetic [I_n^s(r)] images at each step of the iterative process (8), we also calculated the image quality metrics J_n and J_n^s defined according to Eq. (4). The dependence of J_n^s and J_n on the iteration (frame) number n (evolution curves) was calculated for different locations of the turbulence generators by using iterative procedure (8).
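A direct NumPy transcription of iteration (8) is sketched below, reusing the quality_map function from the Eq. (5) sketch; the gain K and kernel radius are placeholder values, and in practice the quality maps would need to be scaled so that K·δ_n stays below unity, a normalization the text does not spell out.

```python
import numpy as np

def synthetic_image(frames, K=1.0, a=12.0):
    """Fuse a stack of short-exposure frames (shape [N, ny, nx]) with the
    discrete anisotropic-gain iteration of Eq. (8). Uses quality_map() from
    the Eq. (5) sketch; K and a are illustrative placeholder values."""
    I_s = frames[0].astype(float)          # initial condition: first frame
    J_s = quality_map(I_s, a)              # quality map of the synthetic image
    for I_n in frames[1:]:
        I_n = I_n.astype(float)
        J_n = quality_map(I_n, a)
        gain = np.where(J_n > J_s, J_n - J_s, 0.0)   # anisotropic gain δ_n(r)
        I_s = I_s - K * gain * (I_s - I_n)           # Eq. (8) update
        J_s = quality_map(I_s, a)          # refresh the synthetic quality map
    return I_s
```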


Figure 8 shows examples of the synthetic image evolution curves normalized by the averaged image quality metric ⟨J_n⟩. Figure 8 (curve 1) also gives an example of the image quality dynamics J_n for the case when both turbulence generators were placed close to the imaging telescope: l = 0.3 for turbulence generator 1 and l = 0.21 for turbulence generator 2 (fire safety concerns prevented us from colocating the heating elements in the same plane). The synthetic image quality dynamics J_n^s corresponding to J_n (curve 1) are shown in curve 2 of Fig. 8. As seen by comparing these two curves, the maximum gain in synthetic image quality was obtained from only a few lucky frames (n = 333, n = 336, and n = 337). When anisoplanatic effects are increased (single turbulence layer placed at l > 0.5), growth in synthetic image quality occurs not from any single frame but rather from small metric increases that are due to the contributions of many short-exposure frames (compare curves 2 and 4 in Fig. 8). Under anisoplanatic imaging conditions the improvement in synthetic image quality has a kneelike behavior characterized by a rapid increase in the metric value J_n^s within a few frames (≈100), followed by a relatively slow accumulation of the image quality improvement thereafter. This observation supports the statement in Section 1 that under anisoplanatic imaging conditions the probability of obtaining lucky regions is significantly higher than the probability of getting a single lucky frame under isoplanatic imaging conditions.

The normalized image quality metric values J_n^s achieved after processing of n = 1000 short-exposure images (J_s = J_{1000}^s) are shown in Fig. 9 for different turbulence generator locations. In all cases the obtained synthetic image quality metric values increased by a factor of 1.45–2.35 in comparison with the corresponding averaged image quality metric values ⟨J_n⟩. The largest gain in image quality metric (J_s = 2.35⟨J_n⟩) occurred for the case of a single turbulence generator located at l = 0.73. This corresponds to the location in Fig. 6c for the maximum standard deviation of the image quality metric σ_J and also coincides with the maximum point in Fig. 4c for the parameter β = ⟨σ_j^{δr}⟩/⟨w_j⟩ characterizing the standard deviation of point-source image centroid position fluctuations normalized by the averaged image width. From this it would appear that the synthetic image processing technique works best for highly anisoplanatic conditions characterized by significant levels of image quality fluctuations. The gain in image quality metric value achieved through the use of the synthetic image processing technique was smaller when two turbulence regions were used, and the image quality metric value increased monotonically when turbulence generator 1 was moved toward the imaged scene (see curve 2 in Fig. 9a).

Fig. 8. Processing of video data corresponding to different location(s) l of the turbulence generator(s) by using iterative procedure (8) (curves 2–5) and the image quality frame-to-frame evolution curve for two turbulence generators (curve 1). Each set of images used in the calculations of curves 2–5 is independent.

Examples of both long-exposure (frame-averaged) and synthetic images corresponding to the conditions indicated by points A–C in Fig. 9a are presented in Fig. 10. The image quality improvement achieved by using the synthetic imaging technique is clearly visible when comparing the frame-averaged images shown in the left column with the synthetic images in the right column.

D. Turbulence-Induced Image Quality Enhancement
Compare the synthetic image quality metric J_s with the quality metric corresponding to the undistorted image, J_0 (dashed horizontal line in Fig. 9b). The result is intriguing: for the single turbulence generator located at distances l > 0.5, the synthetic image quality metric exceeds the undistorted image quality metric (J_s > J_0).

Does this mean we may be better off imaging objects in the presence of turbulence rather than without turbulence? The general answer to this question is, perhaps not, but the experimental results presented here (see Figs. 9–11) show that for certain imaging conditions we may obtain a turbulence-induced image quality enhancement by recording and processing a large set of short-exposure images. Note that the term image quality enhancement is defined here through our definition of the image quality metric [Eq. (4)].

Fig. 9. Synthetic image quality metric J_s corresponding to n = 1000 for different turbulence generator locations normalized by (a) averaged image quality metric values ⟨J_n⟩ and (b) the quality metric of the undistorted image, J_0: synthetic image quality metrics, curves 1 and 2 (solid curves); averaged metrics, curves 3 and 4 (dashed curves), corresponding to a single turbulence generator (curves 1 and 3) and to two turbulence generators (curves 2 and 4). The dashed horizontal line corresponds to the quality metric J_0 for the undistorted image.

Fig. 10. Long-exposure (left column) and synthetic (right column) images corresponding to conditions indicated by points A, B, and C in Fig. 9a: (a), (b) l = 0.73 (single turbulence generator); (c), (d) l = 0.91 (two turbulence generators); (e), (f) l = 0.21 (single turbulence generator).

Fig. 11. Image enhancement obtained by using the synthetic imaging technique. Examples of (a) undistorted and (b) synthetic images for l = 0.73 and a single turbulence generator are given. Synthetic image (b) was obtained by using undistorted image (a) as an initial condition in iterative procedure (8).

The synthetic imaging technique is well suited for identifying and incorporating image segments having a better-than-undistorted image quality (superlucky image regions). To illustrate the idea of building a synthetic image with turbulence-induced enhanced quality, we carried out the following experiment. We recorded the undistorted image frame shown in Fig. 11a and used this image as the initial condition in iterative procedure (8). A set of 1000 short-exposure images taken with the turbulence generator placed at the distance l = 0.73 was used as input data. The obtained synthetic image is shown in Fig. 11b. The quality metric for this synthetic image is J_s = 1.5 J_0.

There are several issues that should be considered before we would suggest using turbulence to improve diffraction-limited imaging system performance. First, the observed image quality enhancement is mostly related to an increase of the imaging system's depth of focus (turbulence-induced depth-of-focus enhancement). The primary contribution to image quality improvement in Fig. 11b (when compared with the undistorted image in Fig. 11a) comes from the sharper images of Peter the Great and Taras Bulba standing at the front of the scene. Random dithering of the telescope imaging plane caused by turbulence may result in the random appearance of short-exposure images with clearly seen objects located beyond the undistorted imaging system's depth of field. The synthetic imaging technique takes advantage of this random image plane dithering by selecting and incorporating sharply seen image regions belonging to diverse image planes.

Second, synthetic image processing of short-exposure video data may introduce unwanted effects such as geometrical distortions and line doubling. Lucky image regions may not appear exactly in the place where they are actually located. Random shifts of lucky regions are the source of both the geometrical distortions present in the synthetic image (e.g., the slightly curved roof line in Fig. 11b) and the line doubling (see the doubled outline of Peter the Great's hat in Fig. 11b). This line doubling appears when lucky regions corresponding to the same image area and having approximately the same image quality are mutually shifted. Line doubling may contribute to an image quality metric increase but certainly cannot be considered an actual image quality improvement. Thus increasing the image quality metric beyond J_0 does not always mean obtaining an image with resolution beyond the diffraction limit. Still, our visual perception is rather tolerant of both of these side effects (if they are relatively small) and very sensitive to image sharpness. Looking at the images in Fig. 11, we actually feel (on a perceptive level) that the synthetic image is better than the undistorted image despite the presence of both the geometrical distortions and the line doubling in Fig. 11b. Note that there is a ghost image of the arm of Peter the Great even in the undistorted image in Fig. 11a. These types of ghost images are typically present in images of objects placed at distances larger than the imaging system depth of field.

The synthetic imaging technique presented here can be further improved. To eliminate line doubling, one can use post-processing techniques applied to short-exposure images, for example, an image restoration algorithm based on local shift removal.15,16 One more option is to incorporate, on line, additional information about local wave-front tilts obtained from a wave-front sensor. A further increase in the imaging system's depth of field can potentially be achieved by combining the synthetic imaging technique with controllable dithering of the telescope imaging plane.

ACKNOWLEDGMENTS

The authors thank M. Lin of Springbrook High School, Silver Spring, Montgomery County, Maryland, who worked on this project during an Army Research Laboratory summer job assignment, and J. C. Ricklin for technical and editorial comments. This work was performed at the Army Research Laboratory's Intelligent Optics Laboratory.

Address correspondence to Mikhail A. Vorontsov, U.S. Army Research Laboratory, Intelligent Optics Laboratory, AMSRL-CI-C, 2800 Powder Mill Road, Adelphi, Maryland 20783, or by phone, 301-394-0214; fax, 301-394-0225; or e-mail, [email protected].

REFERENCES AND NOTES

1. F. Roddier, ed., Adaptive Optics in Astronomy (Cambridge U. Press, Cambridge, UK, 1999), pp. 91–130.
2. R. Q. Fugate, "Laser beacon adaptive optics," Opt. Photon. News, pp. 14–19 (June 1994).
3. M. C. Roggemann and B. M. Welsh, Imaging through Turbulence (CRC Press, Boca Raton, Fla., 1996).
4. D. L. Fried, "Anisoplanatism in adaptive optics," J. Opt. Soc. Am. 72, 52–61 (1982).
5. B. M. Welsh and C. S. Gardner, "Effects of turbulence-induced anisoplanatism on the imaging performance of adaptive astronomical telescopes using laser guide stars," J. Opt. Soc. Am. A 8, 69–80 (1991).
6. M. I. Chernotskii, "Anisoplanatic short-exposure imaging in turbulence," J. Opt. Soc. Am. A 10, 492–501 (1993).
7. D. V. Murphy, C. A. Primmerman, B. G. Zollars, and H. T. Barclay, "Experimental demonstration of atmospheric compensation using multiple synthetic beacons," Opt. Lett. 16, 1797–1799 (1991).
8. P. L. Wizinowich, ed., Adaptive Optical Systems Technology, Proc. SPIE 4007 (2000).
9. R. G. Paxman and J. R. Fienup, "Optical misalignment sensing and image reconstruction using phase diversity," J. Opt. Soc. Am. A 5, 914–923 (1988).
10. R. G. Paxman, B. J. Thelen, and J. H. Seldin, "Phase-diversity correction of space-variant turbulence-induced blur," Opt. Lett. 19, 1231–1233 (1994).
11. A. Labeyrie, "Attainment of diffraction limited resolution in large telescopes by Fourier analyzing speckle patterns in star images," Astron. Astrophys. 6, 85 (1970).
12. J. Primot, G. Rousset, and J. C. Fortanella, "Deconvolution from wave-front sensing: a new technique for compensating turbulence-degraded images," J. Opt. Soc. Am. A 7, 1589–1608 (1990).
13. R. A. Muller and A. Buffington, "Real-time correction of atmospherically degraded telescope images through image sharpening," J. Opt. Soc. Am. 64, 1200–1210 (1974).
14. M. A. Vorontsov, G. W. Carhart, D. V. Pruidze, J. C. Ricklin, and D. G. Voelz, "Adaptive imaging system for phase-distorted extended source/multiple distance objects," Appl. Opt. 36, 3319–3328 (1997); M. A. Vorontsov, G. W. Carhart, M. Cohen, and G. Cauwenberghs, "Adaptive optics based on analog parallel stochastic optimization: analysis and experimental demonstration," J. Opt. Soc. Am. A 17, 1440–1453 (2000).
15. D. Fraser, G. Thorpe, and A. J. Lambert, "Atmospheric turbulence visualization with wide-area motion-blur restoration," J. Opt. Soc. Am. A 16, 1751–1758 (1999).
16. D. Fraser and A. J. Lambert, "Wide area image restoration using a new iterative registration method," in Image Reconstruction From Incomplete Data, M. A. Fiddy and R. P. Millane, eds., Proc. SPIE 4123, 64–72 (2000).
17. D. L. Fried, "Probability of getting a lucky short-exposure image through turbulence," J. Opt. Soc. Am. 68, 1651–1658 (1978).
18. M. C. Roggemann, C. A. Stoudt, and B. M. Welsh, "Image spectrum signal-to-noise ratio improvement by statistical frame selection for adaptive optics imaging through atmospheric turbulence," Opt. Eng. 33, 3254–3264 (1994).
19. B. J. Thelen, D. A. Carrara, and R. G. Paxman, "Fine-resolution imagery of extended objects observed through volume turbulence using phase diverse speckle," in Propagation and Imaging through the Atmosphere III, M. C. Roggemann and L. R. Bissonnette, eds., Proc. SPIE 3763, 102–111 (1999).
20. V. I. Shmalhausen and N. A. Yaitskova, "Correction error in extended objects imaging through turbulent atmosphere," Opt. Atmos. Ocean 9, 1462–1470 (1996).
21. D. L. Fried, "Statistics of a geometric representation of wavefront distortion," J. Opt. Soc. Am. 55, 1427–1435 (1965).
22. Quantitative estimation of the probability P_loc is an interesting problem; however, this estimation is beyond the scope of this paper.
23. M. A. Vorontsov, "Parallel image processing based on an evolution equation with anisotropic gain: integrated optoelectronic architectures," J. Opt. Soc. Am. A 16, 1623–1637 (1999).
24. G. W. Carhart and M. A. Vorontsov, "Synthetic imaging: non-adaptive anisoplanatic image correction in atmospheric turbulence," Opt. Lett. 23, 745–747 (1998).
25. M. I. Charnotskii, V. A. Myakinin, and V. U. Zavorotnyy, "Observation of superresolution in nonisoplanatic imaging through turbulence," J. Opt. Soc. Am. A 7, 1345–1350 (1990).
26. M. A. Vorontsov, "Information processing with nonlinear optical two-dimensional feedback systems," J. Eur. Opt. Soc. Quantum Semiclassic. Opt. B 1, 1–10 (1999).