
19th Coherent Laser Radar Conference (CLRC 2018), June 18–21

Coherent Lidar for 3D-imaging through obscurants

Aude Martin (a), Jérôme Bourderionnet (a), Luc Leviander (a),

John F. Parsons (b), Mark Silver (b), Patrick Feneyrou (a)

(a) Thales Research & Technology France, 1 Avenue Augustin Fresnel, 91767 Palaiseau Cedex, France

(b) Thales Ltd, 1 Linthouse Road, Glasgow, G51 4BZ, UK.

Email: [email protected]

Abstract: In the context of unmanned vehicles, eye-safe LiDARs able to measure range and speed simultaneously in degraded visual environments are required. The ability of frequency modulated continuous wave (FMCW) coherent LiDAR at 1.55 µm to differentiate hard targets from diffuse ones such as clouds is used here to demonstrate 3D-imaging through obscurants. Detection experiments on a moving target at up to 50 m of range using a compact FMCW LiDAR based on silicon photonics are also highlighted. The photonic integrated circuit, without any moving parts, allows emission and detection in 8 different collimated directions spread over the desired angular range.

Keywords: Coherent Laser Radar, Frequency Modulated Continuous Wave Lidar, Photonic Integrated Circuit.

1. Introduction

Multidimensional imaging systems either use an array of detectors, such as gated viewing cameras [1] or single-photon arrays [2], or use a single-pixel technology, such as laser ranging with beam scanning [3] or a single-pixel camera [4]. In this work, we used FMCW LiDAR technology with a scanning head for imaging through obscurants, and a non-mechanical scanning system with an on-chip LiDAR to detect a hard target.

The main idea of FMCW lidar is to obtain spatial resolution using frequency modulation in a coherent detection scheme. Both the local oscillator and the emitted signal are frequency modulated. The backscattered signal is time-delayed by the propagation from the lidar to the target and frequency shifted by the Doppler effect. Hence, the interference between the local oscillator and the backscattered light shows frequency plateaus, the average of which corresponds to the Doppler shift (speed measurement) and the difference of which corresponds to the range multiplied by the frequency modulation slope.
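For reference, with a symmetric up/down modulation of slope S, round-trip delay τ and Doppler shift f_D, the two plateau frequencies and the resulting range and speed can be written as follows (one common sign convention, given here for illustration rather than taken from the paper):

$$ f_{\uparrow} = f_D - S\tau, \qquad f_{\downarrow} = f_D + S\tau, \qquad \tau = \frac{2R}{c}, \qquad f_D = \frac{2v}{\lambda} $$

$$ f_D = \frac{f_{\uparrow} + f_{\downarrow}}{2}, \qquad R = \frac{c\,(f_{\downarrow} - f_{\uparrow})}{4S}, \qquad v = \frac{\lambda f_D}{2} $$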

As illustrated in Figure 1, this type of detection also enables an FMCW LiDAR to distinguish a (relatively weak) static target signal from a much larger signal backscattered by a cloud. Since the cloud has a significant thickness, the power spectral density of the light it backscatters is spread over a large number of frequency bins. In contrast, the signature of a hard target consists of two very sharp peaks. By selecting only narrow peaks, the system can reduce false alarms and, hence, increase the signal processing speed. The signal processing used in the following experiments [5] also enables detection of multiple targets within the collimated beam.
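The sketch below illustrates this narrow-peak selection on a measured power spectral density; it is not the authors' actual processing chain, and the width and SNR thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def hard_target_peaks(psd, max_width_bins=3, min_snr_db=2.0):
    """Keep only spectrally narrow peaks (hard-target signatures).

    A cloud return is spread over many frequency bins and is rejected by the
    width criterion; the thresholds here are illustrative, not from the paper.
    """
    noise_floor = np.median(psd)
    min_height = noise_floor * 10 ** (min_snr_db / 10)
    peaks, props = find_peaks(psd, height=min_height, width=(None, max_width_bins))
    return peaks, props["peak_heights"]
```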

Figure 1. Experimental power spectral density of a target behind a cloud



2. Differentiating objects based on their speed

In these 3D-detection experiments, a RIO ORION laser with a linewidth of 1 kHz was modulated with slopes of 5 and 6 MHz/µs at a repetition rate of 3.8 kHz. In a single-pixel configuration, with the same LiDAR and a different optical head, targets up to 10 km away were detected [6] with a mean optical power of 200 mW. A 2-axis piezo tip/tilt mirror (PI), with a maximum angular range of 10 mrad, was used to scan the beam and create 2-D images. The signal processing is performed in two steps:

- Windowing and Fourier transforms are computed in real time in the FPGA using an FFT (Fast Fourier Transform) algorithm. The FFTs are averaged in power (incoherent integration) and the result is transferred to the laptop.

- Peak detection and the computation of range and speed from the detected frequencies are performed by a program coded in C (see the sketch below).
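A minimal Python sketch of this two-step chain (the real system runs step 1 in an FPGA and step 2 in C; parameter names and values here are illustrative assumptions):

```python
import numpy as np

def averaged_spectrum(samples, n_fft=1024, n_avg=16):
    """Step 1: windowed FFTs computed block by block, then averaged in power
    (incoherent integration), as done in the FPGA."""
    window = np.hanning(n_fft)
    blocks = samples[: n_fft * n_avg].reshape(n_avg, n_fft)
    return np.mean(np.abs(np.fft.rfft(blocks * window, axis=1)) ** 2, axis=0)

def range_and_speed(f_up, f_down, slope_hz_per_s, wavelength=1.55e-6, c=3.0e8):
    """Step 2: convert the two plateau frequencies (up/down slopes) detected
    in the averaged spectra into range and radial speed."""
    f_doppler = 0.5 * (f_up + f_down)                  # average -> Doppler shift
    tau = abs(f_down - f_up) / (2.0 * slope_hz_per_s)  # round-trip delay
    return 0.5 * c * tau, 0.5 * wavelength * f_doppler  # range [m], speed [m/s]
```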

In Figure 2, a scene was mapped using the system on a clear day with 2 mW of output power and an integration time of 1 ms. The red square in Figure 2a corresponds to the field of view of the optical head, with a 10 mrad angle. A tilted sign, vegetation and a building are present at ranges between 70 and 120 m. The colour-coded detected ranges are plotted along the (x,y) coordinates in Figure 2b. These data, with a focus on the 67 to 68 m ranges, are then plotted in a 2D plot where the tilted sign and the mast are visible with a resolution of a few tens of centimetres, which matches the expected resolution of ΔR = 37 cm. Figure 2d shows the speed of each pixel. Green areas correspond to non-moving objects such as the sign and the building, while the vegetation, moving at up to 5 cm/s, can be easily identified. The ability to simultaneously measure the range and speed of an object helps identify different objects in a complex scene.
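As a consistency check (not a figure from the paper), the FMCW range resolution is set by the optical frequency excursion B of the chirp, so the quoted 37 cm implies an excursion of roughly 0.4 GHz:

$$ \Delta R = \frac{c}{2B} \;\Rightarrow\; B = \frac{c}{2\,\Delta R} \approx \frac{3\times10^{8}\ \mathrm{m/s}}{2 \times 0.37\ \mathrm{m}} \approx 0.41\ \mathrm{GHz} $$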

Figure 2. a) Picture of the scene, with the field of view highlighted in red; b) 3D plot of the range of the pixels as a function of the (x,y) coordinates, with range colour coded; 2D plots of the range, with a focus on the 67 to 68 m range (c), and of the speed (d), as a function of the angle.

3. Large dynamic range of the LiDAR in the presence of smoke obscurants

Range and speed measurements of a hard target in the presence of obscurants were carried out at the DSTL Battery Hill facility (Porton Down, UK). Artificial fog made of a mixture of oil and water was used to fill a 10 m tent that obscured the view between the LiDAR and a hard target (a mannequin, as shown in the inset


of Figure 3b). A picture of the scene (Figure 3a) shows the optical head and the open door of the 10 m long tent filled with artificial fog, which obscures the mannequin. Figure 3b presents the detected ranges between 72 and 74.5 m as a function of the 2D scan angle. Note that the fence behind the mannequin can still be observed. Figure 3c presents the SNR for each detected pixel. Acquisition starts at the top right of the image and the scanning pattern describes a series of switchbacks. Smoke generation in the tent starts at the beginning of the acquisition, and the target (the head of the mannequin) is clearly detected with 20 to 30 dB of SNR. The neck and shoulders of the mannequin are not detected, as the smoke builds up to a level that fully blocks the return signal. The smoke generation is then stopped and, as the cloud dissipates, the signal from the hard target reappears once the signal-to-noise ratio rises above 2 dB. In this configuration, the FMCW lidar can detect targets and evaluate their range and speed with an SNR ranging from 2 dB up to over 30 dB without recalibrating any system parameters.

During the smoke generation, the transmission at 1.55 µm was measured using a corner cube and a transmissometer in order to obtain an independent measure of the smoke attenuation through which the FMCW lidar could detect the target. We conclude that the lidar was able to detect a target through smoke down to 0.04% transmission (double pass), which corresponds to 3.9 attenuation lengths (single pass).
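The quoted attenuation-length figure follows from the double-pass transmission; a minimal check, assuming the same smoke attenuation on both passes:

```python
import math

T_double = 0.04 / 100              # 0.04 % double-pass transmission (measured)
T_single = math.sqrt(T_double)     # assume identical attenuation on both passes
att_lengths = -math.log(T_single)  # single-pass optical depth in attenuation lengths
print(f"{att_lengths:.1f} attenuation lengths (single pass)")  # ~3.9
```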

Figure 3. a) Picture of the scene with the optical head (front), the tent filled with smoke and the mannequin (not visible but indicated by an arrow); 2D plots of the range (b) and the SNR (c) as a function of the angle.

4. Integrating an FMCW Lidar on-chip

In order to develop 3D-mapping functionalities for unmanned vehicles, compact eye-safe LiDARs able to measure range and speed simultaneously, with few or no moving parts, are required. Photonic integrated circuits (PICs) are considered a viable option, as they allow monolithic integration of both electronic and optical devices on the same chip and offer good thermal and mechanical stability. In addition, silicon PIC technology is becoming mature thanks to foundries that supply parallel manufacturing, paving the way for mass production. As high peak powers must be handled with care in PICs, pulsed LiDARs are challenging to implement on-chip. Moreover, since the transparency window of silicon imposes the use of wavelengths above 1100 nm, the implementation of an FMCW LiDAR architecture at 1.55 µm on-chip is particularly straightforward [7].

Apart from the DFB laser and the output circulators, the optical part of the LiDAR is fully implemented on a 3×3 mm silicon chip, as shown in Figure 4a. It consists of 8 emission channels and 8 collection channels, addressed using phase modulators inserted in Mach-Zehnder interferometers, and a waveform calibration channel. In Figure 4a, the switch networks for the reception ports (SN1) and the emission ports (SN2) are coloured green and red, respectively. Emission channels are successively addressed and connected via optical fibre circulators (FC) to the corresponding on-chip balanced photodiodes (BPD) and the output collimator. Thus we avoid the need for mechanical scanning, as the PIC is addressed to select the channel and collimator of the desired output beam. The waveform calibration is achieved with the on-chip delay


line interferometer (DLI). The silicon chip is then bonded to a printed circuit board to read out the balanced photodiode signals.
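As an illustration of non-mechanical channel selection, the sketch below assumes a binary tree of 1×2 Mach-Zehnder switches routing light to one of 8 collimators; the actual on-chip switch-network topology and drive scheme are not detailed here, so this is purely a hypothetical model.

```python
import math

def mzi_tree_phases(channel: int, depth: int = 3):
    """Phase settings (0 or pi) of the switch at each stage of a hypothetical
    binary Mach-Zehnder switch tree routing to one of 2**depth outputs.
    Stage k is driven by bit k of the channel index (MSB first)."""
    if not 0 <= channel < 2 ** depth:
        raise ValueError("channel out of range")
    bits = [(channel >> (depth - 1 - k)) & 1 for k in range(depth)]
    return [b * math.pi for b in bits]

# Example: steer the emission to channel 5 of 8 -> phases [pi, 0, pi]
print(mzi_tree_phases(5))
```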

Figure 4. a) Architecture of the FMCW LiDAR system (the rectangle delimits the chip perimeter); range (b) and speed (c) of a moving target detected by the PIC LiDAR with less than 5 mW of output power.

We have demonstrated up to 30 mW of output power for 200 mW of input power and homogeneous routing across the 8 channels. Figures 4b and 4c present the detection of a moving target at up to 50 m of range with less than 5 mW of output power.

5. Conclusion

In this article, two different scanning systems were demonstrated: first, a 2-axis scanning head was combined with an FMCW lidar, allowing 3D measurements; second, switch networks implemented within a PIC LiDAR steered an optical beam non-mechanically. In the first case, clear-air measurements demonstrated a resolution limited by the frequency modulation of the laser, and measurements in dense fog highlighted the large detection dynamic range of the LiDAR. This allows imaging of complex scenes with multiple targets of different albedos and at (very) different ranges. The system was experimentally shown to detect a target through 3.9 attenuation lengths (single pass) at 70 m. In the second case, the FMCW architecture was integrated on-chip to demonstrate range and speed measurements up to 50 m with less than 5 mW of output power. This last system was not tested for 3D measurements, but the scanning method potentially increases the speed and ruggedness of the system.

6. References and Acknowledgements

We acknowledge the funding of DSTL under DSTL1000114607 "Seeing Through the Clouds". Funding is also acknowledged from the European Union Seventh Framework Programme for research, technological development and demonstration under grant agreement No 318178 (PLAT4M). We thank the IMEC foundry for the fabrication of the chips and Tyndall University for the packaging.

[1] Busck, J., and Heiselberg, H., "Gated viewing and high-accuracy three-dimensional laser radar", Applied Optics, 43(24), 4705-4710 (2004).

[2] Pawlikowska, A., Halimi, A., Lamb, R., and Buller, G., "Single-photon three-dimensional imaging at up to 10 kilometers range", Optics Express, 25, 11919-11931 (2017).


[3] Aflatouni, F., Abiri, B., Rekhi, A., and Hajimiri, A., "Nanophotonic coherent imager", Optics Express, 23, 5117-5125 (2015).

[4] Hardy, N., and Shapiro, J., "Computational ghost imaging versus imaging laser radar for 3D imaging", Physical Review A, 87, 023820 (2013).

[5] Feneyrou, P., Leviandier, L., Minet, J., Pillet, G., Martin, A., Dolfi, D., ... & Midavaine, T., "Frequency-modulated multifunction lidar for anemometry, range finding, and velocimetry - 1. Theory and signal processing", Applied Optics, 56(35), 9663-9675 (2017).

[6] Feneyrou, P., Leviandier, L., Minet, J., Pillet, G., Martin, A., Dolfi, D., ... & Midavaine, T., "Frequency-modulated multifunction lidar for anemometry, range finding, and velocimetry - 2. Experimental results", Applied Optics, 56(35), 9676-9685 (2017).

[7] Poulton, C. V., Yaacobi, A., Cole, D., Byrd, M., Raval, M., Vermeulen, D., & Watts, M., "Coherent solid-state LIDAR with silicon photonic optical phased arrays", Optics Letters, 42(20), 4091-4094 (2017).