
Page 1:

Vision and Revision: Wavefront Sensing from the Image Domain

Benjamin Pope (@fringetracker)

Page 2:

Our Group

Peter Tuthill

Ben Pope (me!)

Frantz Martinache

Nick Cvetojevic

Anthony Cheetham

Niranjan Thatte

Page 3:

James Webb Space Telescope

› The James Webb Space Telescope is NASA's next-generation major space observatory

› A 6.5 m diameter segmented primary mirror will make it the largest civilian space telescope

› Left: a full-scale mockup of the telescope in Munich

Page 4:

Phasing the JWST

› A major issue with JWST is the use of a segmented mirror in space

› There is currently no robust, cheap approach to ensuring the mirror segments are aligned to the required ~ nm accuracy

› This is where we come in!

› Mirror segments have actuators in tip, tilt, piston and curvature

› We can use these to correct the phasing

Page 5:

Fourier Amplitudes and Phases

[Figure: images of Albert Michelson and Pablo Picasso decomposed into their Fourier (visibility) amplitudes and phases.]

Page 6:

Interferometry: Phase

› The phase delay between two receivers relates to the position of the astrophysical object

Δφ = d / λ
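
As a rough worked example (illustrative numbers of my own, not from the talk): two receivers separated by a baseline B observing a source at angle θ from the phase centre see a geometric delay d = B sin θ, i.e. d/λ wavelengths of path difference, or a phase of 2π d/λ radians.

```python
import numpy as np

# Illustrative numbers only (not from the talk).
B = 10.0        # baseline between the two receivers [m]
theta = 1e-6    # angular offset of the source from the phase centre [rad]
lam = 1.6e-6    # observing wavelength [m]

d = B * np.sin(theta)        # geometric delay between the receivers [m]
cycles = d / lam             # the delay expressed in wavelengths
dphi = 2 * np.pi * cycles    # the same delay expressed as a phase [rad]
print(f"d = {d:.2e} m = {cycles:.2f} wavelengths -> {dphi:.1f} rad")
```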

Page 7:

Interferometry: Phase

› The phase delay between two receivers relates to the position of the astrophysical object

› If the atmosphere disrupts the wavefront, this information is corrupted by errors!

Δφ = ???

Page 8:

Closure Phase

› True phases are encoded on baselines, but the errors are on the wavefront and hit each detector separately

› If we add up the measured phases around a loop, the errors cancel out:

- If each measured baseline phase is the true phase plus the difference of the two aperture errors, e.g. Δφ₁ = φ₁ + ε₂ − ε₁, Δφ₂ = φ₂ + ε₃ − ε₂, Δφ₃ = φ₃ + ε₁ − ε₃, then the closure phase Δφ₁ + Δφ₂ + Δφ₃ = φ₁ + φ₂ + φ₃ is immune to error: the ε terms cancel!

[Figure: a closure triangle of three apertures, with measured baseline phases Δφ₁, Δφ₂, Δφ₃.]
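
A minimal numerical check of this cancellation (my own sketch, not from the talk): give each of three apertures a random wavefront error and verify that the sum of the measured baseline phases around the loop equals the sum of the true phases.

```python
import numpy as np

rng = np.random.default_rng(0)

# 'True' object phases on the three baselines 1-2, 2-3, 3-1.
true_phases = rng.uniform(-0.5, 0.5, 3)

# Independent wavefront errors at the three apertures.
eps = rng.uniform(-1.0, 1.0, 3)

# Measured phase on each baseline = true phase + difference of aperture errors.
measured = np.array([
    true_phases[0] + eps[1] - eps[0],   # baseline 1-2
    true_phases[1] + eps[2] - eps[1],   # baseline 2-3
    true_phases[2] + eps[0] - eps[2],   # baseline 3-1
])

# Around the loop every error appears once with + and once with -, so it cancels.
print(np.isclose(measured.sum(), true_phases.sum()))   # True
```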

Page 9:

Redundant Baselines

› Multiple pairs of points sample each baseline.

› A random walk in phasor space corrupts the image

› Closure phase no longer works!

[Figure: duplicate baselines (two aperture pairs sampling the same (u, v) point) and the resulting random walk of the summed phasor in the complex (ℜ𝔢, ℑ𝔪) plane.]
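
To see numerically why closure phase fails here (a sketch with assumed numbers, not from the talk): when R aperture pairs share the same (u, v) coordinate, the measured visibility is a sum of R phasors, each carrying a different pair of pupil errors, so the resulting phase is no longer 'true phase plus a difference of two aperture errors' and cannot be cancelled around a loop.

```python
import numpy as np

rng = np.random.default_rng(1)

true_phase = 0.3   # true object phase on this (u, v) point [rad]
R = 5              # redundancy: number of aperture pairs sharing the baseline

# Each copy of the baseline sees a different pair of pupil-plane errors,
# so the measured visibility is a sum of differently rotated unit phasors.
eps = rng.normal(0.0, 1.0, size=(R, 2))
phasors = np.exp(1j * (true_phase + eps[:, 1] - eps[:, 0]))

print(f"true phase:     {true_phase:.2f} rad")
print(f"measured phase: {np.angle(phasors.sum()):.2f} rad")
```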

Page 10:

Non-Redundant Masking

› If we put a mask on the telescope aperture and let through only one copy of each baseline, we overcome this problem

› Non-redundant masking gives us good amplitude measurements – and also lets us recover some phase information

Aperture masks used on NIRC2 (image courtesy of Peter Tuthill).

Page 11:

Kernel Phase Interferometry

› Kernel phase is a method of image deconvolution for high resolution astronomy

› Works on Nyquist-sampled, space-based or well-corrected (extreme) AO images

› A software-only idea similar to the aperture masking approach

› Delivers contrast ~ hundreds within λ/D

› Can also be used as a wavefront sensor (Martinache sensor)

› Only five papers so far!

› A solution looking for a problem!

Page 12:

Linear Phase Transfer

› Martinache (2010) considered a generalisation of closure phase when phase errors are small enough to be linear

› Write the measured Fourier phases as Φ, the 'true' baseline phases as Φ₀, and the phase errors in the pupil plane as φ

› In this case, the phases on each baseline will be

Φ = Φ₀ + R⁻¹ · A · φ

where A is a transfer matrix relating pupil-plane and image-plane phases and R is a diagonal matrix holding the redundancy of each baseline.

Page 13:

Kernel Phase Interferometry

› Use the singular value decomposition: express R⁻¹ · A = U · Σ · Vᵀ

› Find an operator K such that K · R⁻¹ · A = 0 (its rows span the left null space)

› This kernel operator maps measured phases to kernel phases which are immune to small errors: K · Φ = K · Φ₀

› Can recover a large fraction of phase information

› Even for a redundant aperture, we still have robust observables – with no need of a mask!
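
A minimal numpy sketch of this construction (my own illustration: the matrix below is a random stand-in for R⁻¹ · A, not a real pupil model). The rows of K are the left-singular vectors with zero singular value, so K annihilates anything lying in the column space of the transfer matrix:

```python
import numpy as np

rng = np.random.default_rng(2)

# Random stand-in for the transfer operator R^-1 . A
# (shape: n_baselines x n_pupil_phases).
M = rng.normal(size=(20, 8))

# SVD: M = U . diag(s) . Vt.  Columns of U beyond rank(M) span the left
# null space; stacking them (transposed) gives the kernel operator K.
U, s, Vt = np.linalg.svd(M, full_matrices=True)
rank = int(np.sum(s > 1e-10 * s.max()))
K = U[:, rank:].T

print(np.allclose(K @ M, 0.0))          # True: K . (R^-1 A) = 0

# Kernel phases are immune to small pupil-plane errors:
phi0 = rng.normal(size=20)              # 'true' baseline phases
errors = rng.normal(size=8)             # small pupil-plane phase errors
phi = phi0 + M @ errors                 # measured phases
print(np.allclose(K @ phi, K @ phi0))   # True: the errors are projected out
```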

Page 14:

Phase Reconstruction

› What about the other way – get phases from their autocorrelation?

› The singular value decomposition also lets us create a pseudoinverse, (R⁻¹ · A)⁺ = V · Σ⁺ · Uᵀ, which undoes the transfer matrix on the sensed modes

› This allows for an approximate reconstruction of the pupil phases: φ ≈ (R⁻¹ · A)⁺ · Φ (for an unresolved source, where Φ₀ ≈ 0)

› This maps modes in the Fourier plane onto modes in the pupil, i.e. inverting the autocorrelation

› Reconstruct wavefronts using only an image!
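
A companion sketch (again with a random stand-in for R⁻¹ · A, and assuming an unresolved source so that Φ₀ ≈ 0): the pseudoinverse maps measured Fourier phases back to an estimate of the pupil-plane phases, up to whatever modes the pupil cannot sense.

```python
import numpy as np

rng = np.random.default_rng(3)

# Random stand-in for the transfer operator R^-1 . A.
M = rng.normal(size=(20, 8))
M_pinv = np.linalg.pinv(M)      # pseudoinverse, built from the SVD of M

# Simulated point-source measurement: Fourier phases due to small pupil errors.
pupil_errors = 0.05 * rng.normal(size=8)
phi = M @ pupil_errors

# Approximate wavefront reconstruction from the image-domain phases alone.
pupil_estimate = M_pinv @ phi
print(np.allclose(pupil_estimate, pupil_errors))   # True for this full-rank stand-in
```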

Page 15:

The Asymmetric Pupil

› The Fourier plane (FT of the image) is the autocorrelation of the pupil

- Not sensitive to all symmetries! A symmetric pupil can sense only even modes

› A Fourier wavefront sensor needs to have pupil asymmetry to sense asymmetric modes

- This could be achieved with thick spider(s), e.g. right.

- Alternatively, could mask out subapertures
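
Returning to the first bullet, here is a quick numerical check (my own sketch with a toy circular pupil, not from the talk): the Fourier transform of the image of a point source, i.e. the optical transfer function, equals the autocorrelation of the pupil.

```python
import numpy as np
from scipy.signal import fftconvolve

# Toy binary pupil: a filled circular aperture on a zero-padded grid.
n = 128
y, x = np.indices((n, n)) - n // 2
pupil = (x**2 + y**2 < 20**2).astype(float)

# Monochromatic, unaberrated PSF: squared modulus of the FT of the pupil.
psf = np.abs(np.fft.fft2(pupil))**2

# FT of the PSF (the optical transfer function), recentred with fftshift...
otf = np.fft.fftshift(np.real(np.fft.ifft2(psf)))

# ...should equal the autocorrelation of the pupil.
autocorr = fftconvolve(pupil, pupil[::-1, ::-1], mode='same')

print(np.allclose(otf, autocorr, atol=1e-6))   # True (up to floating-point noise)
```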

Page 16:

Martinache Wavefront Sensor Simulations

› Uses row phases (complement of kernel space)

› Wavefront sensor is your science camera!

› Initial Strehl > 35% required – can iterate to as high as 97% in simulations

› Sensitivity near-optimal for a given flux and wavefront error

Page 17:

Experimental Setup I

› The microelectromechanical (MEMS) segmented mirror from Dragonfly was used as a JWST analogue

› This has piston, tip and tilt on each segment with ~ few nm precision

› Phasing JWST is hard – no apparatus in space!

› Ideal test case for the new wavefront sensor

Page 18:

Experimental Setup II

› Laser light introduced from a single-mode fibre

› Passed to the MEMS array and through the mask

› Imaged onto a Xenics IR camera

Page 19:

First Step - FICSM

› The first step in phasing a segmented mirror like this is the FICSM approach – 'Fizeau interferometric cophasing of segmented mirrors' (Cheetham et al.)

Page 20:

How do you Solve a Problem like a Mirror?

› We want a ‘tweeter’ on top of coarse phasing with FICSM

› Martinache (2013) theory says you need an asymmetric pupil

› We could do this with bars – or single segments

› That does work – but you still have planes of mirror symmetry for which the sensor is weak

› If you take out the same pattern you included with FICSM, you have no symmetries

Page 21:

Pupil Modes

› Below: examples of normal modes of the discrete pupil model used in our experiment

› These play the role of the Zernike basis for expanding optical aberrations on a circular pupil
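
For concreteness, one plausible way to extract such modes in the SVD framework sketched earlier is to take the right-singular vectors of the transfer matrix; this is my assumption for illustration, not necessarily the exact basis used in the experiment.

```python
import numpy as np

rng = np.random.default_rng(2)
M = rng.normal(size=(20, 8))    # random stand-in for R^-1 . A, as before

U, s, Vt = np.linalg.svd(M, full_matrices=False)
rank = int(np.sum(s > 1e-10 * s.max()))

# Rows of Vt are orthonormal pupil-plane modes; the first `rank` of them are
# the modes the sensor can measure, playing the role of a Zernike-like basis.
sensed_modes = Vt[:rank]
print(sensed_modes.shape)       # (rank, n_pupil_phases)
```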

Page 22:

Degrading the Wavefront

› Wavelength: 1600 nm

› MEMS settings disturbed by random tips, tilts and pistons of ~ nm amplitude

› Right: Wavefront reconstruction

› Can we fix this?

Page 23:

Wavefront Restoration

› Yes we can!

› Right: reconstructed wavefront

› Don’t let the colourbar scare you – there are a couple of bad points

› Actual RMS piston error reduced to < 10 nm on all segments

› Strehl > 99%!

Page 24:

Quick and Dirty

› While we found that the completely asymmetric pattern of tilted segments worked best, can we get away with moving fewer segments?

- Better if moving mirrors is risky, expensive or slow

› Yes we can! A single asymmetry (particularly in the outer ring) was found to work (top)

- Has some issues with sensing modes symmetric with respect to the line of reflection symmetry

› A wedge asymmetry (3 in outer ring, one in inner) also works (bottom)

Page 25:

How well can we Measure a Single Segment?

› Pistoned a single segment (segment 3) in 20 nm steps from −300 to +300 nm

› Reconstruction is very accurate… within limits

› You lose it when linearity fails

› Phases are also correlated – sensing modes globally is a bad way to reconstruct phases on a single segment

› When phasing the whole mirror in closed loop, this correlated error beats down to zero as you make the whole thing flat

Page 26:

Restoring a PSF

› Left: animation of a stretched PSF as our algorithm converges

› This is with a scalene triangle of segments removed

Page 27:

Restoring a PSF

› Also works with a wedge removed

Page 28:

Restoring a PSF

› Or in fact a single segment!

Page 29:

Oxford-SWIFT at Palomar

› The Hale 200-inch telescope has the PALM-3000 extreme AO system – the highest-order AO currently available for experiments

› Oxford has the Short Wavelength Integral Field specTrograph (SWIFT) on P3K – a high spatial and spectral resolution IFU from 650-1000 nm

› SWIFT has historically had problems with non-common-path error – it never achieves even its internal diffraction limit

- Hard to correct with conventional methods, because of the optical layout

› Ideal test case for our wavefront sensing method

Page 30:

HODM Mask

› Right: the mask we placed over the high order DM to get the pupil asymmetry

› This was probably overkill, but we wanted to make sure it worked the first time!

Page 31:

Results

› Left, top: PSF at 940 nm; left, bottom: PSF at 970 nm

› You can see two, maybe three Airy rings

› This would not ordinarily be especially impressive – but SWIFT has a very complicated internal optical layout and has never seen Airy rings before!

› This was achieved with only the LODM

› HODM correction should get it to the diffraction limit!

Page 32:

The Phase Map

› Right: Final phase map subtracted from LODM offsets

› Could perhaps have improved further – but we had to stop to do science observations!

Page 33:

Next Directions

› Now that we've demonstrated this in the lab and on Palomar…

- JWST needs phasing! But can we maybe get a 'spider sense' from using the spiders as the asymmetric mask?

- HARMONI, the first-light IFS for the E-ELT, is predicted to suffer from very bad non-common-path error – an ideal case!

- Project 1640 – another Palomar IFS for exoplanet studies

Page 34:

Thanks!

Thank you for listening!