


On Field Radiometric Calibration for Multispectral Cameras

Raghav Khanna, Inkyu Sa, Juan Nieto and Roland Siegwart

Abstract— Perception systems for outdoor robotics have to deal with varying environmental conditions. Variations in illumination in particular are currently the biggest challenge for vision-based perception. In this paper we present an approach for radiometric characterisation of multispectral cameras. To enable spatio-temporal mapping we also present a procedure for in-situ illumination estimation, resulting in radiometric calibration of the collected images. In contrast to current approaches, we present a purely data-driven, parameter-free approach, based on maximum likelihood estimation, which can be performed entirely on the field, without requiring specialised laboratory equipment. Our routine requires three simple datasets which are easily acquired using most modern multispectral cameras. We evaluate the framework with a cost-effective snapshot multispectral camera. The results show that our method enables the creation of quantitatively accurate relative reflectance images with challenging on-field calibration datasets under a variety of ambient conditions.

I. INTRODUCTION

Using multispectral cameras onboard UAVs for remote and non-invasive sensing has garnered significant interest in recent times [1], [2]. The combination of both technologies has provided new opportunities for the use of mobile robotics for environmental monitoring applications, including precision agriculture, plant phenotyping, mining, and forestry [3], [4], [5].

Considerable effort has been directed towards obtaining accurate, dense 3D reconstructions, and commercial tools [7] have become available providing automated pipelines to produce orthomosaics, digital elevation models and coloured 3D point clouds. Multispectral indices are an important indicator of crop health and have been widely used for providing accurate and reliable insights for farm management. Indices such as the Normalised Difference Vegetation Index (NDVI) and Sum of Green Reflectances (SGR) have been shown to be reliable indicators of nitrogen demand and leaf chlorophyll [8], [9], [10], [11]. However, in order to use these techniques in everyday farm management, there is a need to build quantitative spatio-temporal-spectral representations of the environment. To achieve this, images must be corrected to account for atmospheric, solar and topographic conditions. In hyperspectral imaging, radiometric calibration is the process that converts a raw digital number provided by the camera

The research leading to these results has received funding from the European Union's Horizon 2020 research and innovation programme under grant agreement No 644227 and from the Swiss State Secretariat for Education, Research and Innovation (SERI) under contract number 15.0029. Raghav Khanna is a PhD student, Inkyu Sa is a postdoctoral scholar, Juan Nieto is Deputy Director and Roland Siegwart is full professor at the Autonomous Systems Lab at ETH Zürich (email: {raghav.khanna, inkyu.sa}@mavt.ethz.ch, [email protected], [email protected]).

[Fig. 1: (a) photograph; (b) plot of Correction Factor (1.5–2.5) versus Wavelength band (600–900 nm), with curves for Sunny, no shadow; Sunny, shadow; Cloudy, no shadow; Cloudy, shadow; (c)–(f) image panels.]

Fig. 1: (a) depicts a DJI Matrice 100 UAV flying over a field with a Ximea MQ022HG-IM-SM5X5-NIR Snapshot Multispectral Camera. (b) shows the spectral correction factors estimated using our method over different field conditions. (c) shows an uncorrected, raw image of a scene containing plants taken using our multispectral camera for a wavelength band centered at 803 nm. (d) shows the calibrated image computed from (c) using our method, compensating for sensor response, lens vignetting and spectral variation of incident illumination. (e) and (f) show binary NDVI images calculated using the raw and corrected datacubes and thresholded using Otsu's method [6]. It can be seen that the multispectral data cubes corrected using our method form the basis for a much better plant segmentation.

into a measure representative of scene reflectance. Calibration methods may be classified into two broad categories: (i) theoretical methods, which produce absolute calibration by incorporating complex atmospheric models (e.g. MODTRAN), and (ii) empirical methods, usually referred to as normalisation techniques, which are the most commonly used in practice [12], [13]. This paper elaborates a complete and practical normalisation-based calibration routine suitable for quickly and conveniently generating reflectance estimates with multispectral cameras. We aim to combine the aerial


survey capabilities of small autonomous UAVs (Figure 1a) with the vegetation properties provided by multispectral cameras to survey farms from the air. As an example, figures 1e and 1f show the improved performance of fully automated plant segmentation based on NDVI values obtained after calibration using our approach, as compared to raw camera data. In light of the above mentioned use cases, the main contributions of this work are:

• In contrast to earlier approaches [14], [12], [13], we base our complete pipeline on one standard reflector, eliminating the need for specialised laboratory equipment such as tunable lasers, integrating spheres, flat field images and uniform illumination, which are cumbersome to obtain in practice, especially if the camera onboard a UAV or ground robot needs to be recalibrated on the field due to focus and/or aperture fine tuning.

• We use a data-driven, parameter-free representation of the camera and lens properties, allowing for maximum flexibility and calibration with minimal prior knowledge or assumptions about the camera/lens setup.

• In contrast to most existing approaches, our calibration method does not assume uniform incident illumination for the normalisation image/dataset, making it extremely convenient to acquire calibration datasets, even when in the field.

II. BACKGROUND AND RELATED WORK

Radiometric camera calibration for remote sensing of natural environments has a longstanding history [15]. A variety of techniques exist to characterise the sensor dark current, response function, and optical attenuation. Radiometric calibration techniques typically follow a step-by-step procedure: first estimating the sensor dark current with images taken in the absence of any incident light [14], followed by measuring lens vignetting effects using flat field images taken under uniform illumination in the laboratory using an integrating sphere [13], then compensating for incident illumination and/or sensor quantum efficiency using images of a standard gray reflector [12]. Most current methods, however, require specialised laboratory equipment, or rely on the assumption of uniform incident illumination during acquisition of the calibration dataset, and restrict the camera measurements to lie within the linear response range of the sensor.

The recent advent of small and cost-effective snapshot multispectral cameras with reasonable resolutions in both the spatial and spectral domains [16], such as the one depicted in Figure 2, makes them suitable to be put onboard UAVs and other robotic platforms, enabling quick and accurate spatial as well as spectral mapping of the environment. On mobile platforms, it is often necessary to vary the camera and lens configuration in situ, in order to adapt to the field environment and obtain the most informative images. We therefore propose a calibration pipeline which can be performed entirely on the field, requiring only a standard reflector with a marker on it.

We focus on obtaining parameter-free descriptions of the camera and lens properties, in order to make the pipeline appealing

Fig. 2: Ximea Snapshot "Hyperspectral" Camera MQ022HG-IM-SM5X5-NIR with Pentax C61215TH 12 mm lens

to non-expert users with minimal prior knowledge of the camera and lens setup. The work in [17] proposed a method to recover a $2^D$-valued look-up table for the inverse response of the camera, where $D$ is the bit depth of the camera image. In [18], a method for obtaining the maximum likelihood estimate of such a table for the inverse sensor response is provided. This method assumes that at least two images of a general scene, taken at different exposure times, are provided. [18] also proposes a method for estimating the optical attenuation caused by lens vignetting for monochrome cameras using a set of images of a marker on a white wall, which we extend in this work to multispectral cameras.

Incident illumination compensation is typically done either by parametric models [19], [20], [21], or by normalising the images with one or more images of a standard reflector taken under the same conditions [13]. Normalisation-based methods are typically accurate; however, they require a carefully taken, uniformly illuminated image of the standard reflector. Such an image is difficult to obtain using a camera mounted on a mobile platform, due to the position of the sun or shadows from nearby objects or parts of the platform itself. We propose a novel technique for illumination compensation based on decomposing it into brightness and spectral components, enabling normalisation based on images which do not need to be uniformly illuminated.

III. METHOD

Our aim in radiometric calibration is to convert a raw camera image into a relative reflectance image of the target scene which is correct up to scale. A raw image provided by the Ximea MQ022HG-IM-SM5X5-NIR camera (Figure 2) is depicted in Figure 3. Such a raw image can be easily converted to a multispectral datacube using information about the location and peaks of the filter pattern, which is provided by the manufacturer, as shown in Figure 4.
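As an illustration of this demultiplexing step, the following sketch converts a raw 5×5 mosaic frame into a per-band datacube. It is a minimal example, not the manufacturer's tooling: the `BAND_CENTRES` layout is transcribed from Figure 3, and a real pipeline would read it from the sensor's calibration file.

```python
import numpy as np

# Hypothetical 5x5 mosaic layout: BAND_CENTRES[r][c] is the centre
# wavelength (nm) of the filter at mosaic offset (r, c), transcribed
# from Fig. 3; a real pipeline reads this from the manufacturer's
# calibration file.
BAND_CENTRES = [
    [615, 623, 608, 790, 686],
    [816, 828, 803, 791, 700],
    [765, 778, 752, 739, 714],
    [653, 662, 645, 636, 678],
    [867, 864, 857, 845, 670],
]

def raw_to_datacube(raw: np.ndarray) -> dict:
    """Demultiplex a raw mosaic frame into a {wavelength: image} datacube.

    Each band image has 1/5 of the raw resolution per dimension, since
    every 5x5 tile holds exactly one sample of each band.
    """
    h, w = raw.shape
    h, w = h - h % 5, w - w % 5  # crop to a whole number of 5x5 tiles
    cube = {}
    for r in range(5):
        for c in range(5):
            cube[BAND_CENTRES[r][c]] = raw[r:h:5, c:w:5]
    return cube
```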

We define the following camera- and image-dependent quantities:

• $I_\lambda(x)$ is the measured $D$-bit intensity value for pixel $x$ over the wavelength band centered at $\lambda$.

• $R_\lambda(x)$ is the desired reflectance image.


[Fig. 3 image: a 5×5 grid of per-band images with centre wavelengths from 608 nm to 867 nm.]

Fig. 3: Visualization of a raw data cube of a sugar beet field from a snapshot multi-spectral camera on board a UAV.

Fig. 4: Spectral response of the sensor along with the Fabry-Pérot interferometric spectral filters, as provided by the manufacturer.

• $t$ is the time for which the sensor was exposed to the scene during image capture.

• $C_\lambda$ is the camera response function, which maps the energy received at a pixel to a $D$-bit digital number.

• $U_\lambda$ is the inverse sensor response function, i.e., $U_\lambda = C_\lambda^{-1}$.

• $V_\lambda(x)$ is the optical attenuation function, depending on the pixel and wavelength, commonly caused by lens vignetting.

In order to limit the number of parameters to be estimated and make the calibration problem computationally and practically tractable, we make the following assumptions:

• The scene is assumed to be Lambertian across all wavelengths of interest.

• The incident illumination is assumed constant during the calibration period.

• The dataset for illumination estimation (III-D) is expected to be collected in-situ, i.e., under the same illumination conditions for which the scene reflectance is to be estimated.

A. Camera Model

We model the image formation process from a physical point of view as follows. Let $J_\lambda(x)$ be the incident illumination at the world point $p(x)$ observed by the camera at pixel $x$. The radiance incident on the camera, $L_\lambda(x)$, is given by:

$$L_\lambda(x) = J_\lambda(x)\, R_\lambda(x). \quad (1)$$

The power reaching the sensor is diminished as a result of optical attenuation at each pixel, and is given by

$$E_\lambda(x) = V_\lambda(x)\, L_\lambda(x). \quad (2)$$

The observed pixel intensity, then, as a result of the camera response function, is:

$$I_\lambda(x) = C_\lambda\!\left( \int_0^t E_\lambda(x)\, dt \right), \quad (3)$$

which may be rewritten as:

$$U_\lambda(I_\lambda(x)) = \int_0^t E_\lambda(x)\, dt. \quad (4)$$

B. Response Function Estimation

Following [18], the camera response function for each wavelength band is estimated from two or more images of a static scene taken while varying the exposure time. In order to estimate the inverse response function $U_\lambda$ for every intensity value, a wide range of intensities should appear in the dataset. We treat each wavelength band as an individual camera for which the response curve is to be estimated. Since, for a static scene, the optical attenuation and radiance at every pixel do not change with time, the power reaching the sensor remains the same for each pixel $x$ within the dataset. For a set of images $i$ of a static scene with corresponding exposures $t_i$, equation 4 under a Gaussian white noise assumption on $U_\lambda(I_\lambda(x))$ leads to the following maximum likelihood energy $M$:

$$M(U, E) = \sum_i \sum_x \big( U(I_i(x)) - t_i\, E(x) \big)^2, \quad (5)$$

which can be minimised with respect to $U$ and $E$ by setting the partial derivatives to zero, resulting in the following equations:

$$U(k)^* = \operatorname*{arg\,min}_{U(k)} M(U, E) = \frac{\sum_{\Omega_k} t_i\, E(x)}{|\Omega_k|} \quad (6)$$

$$E(x)^* = \operatorname*{arg\,min}_{E(x)} M(U, E) = \frac{\sum_i t_i\, U(I_i(x))}{\sum_i t_i^2} \quad (7)$$

where $\Omega_k$ is the set of pixels, over all images, with $I_\lambda(x) = k$. The above set of coupled equations can be solved simply by starting with a guess for either $U$ or $E$ and iterating alternately towards the solution until the error falls below a specified threshold.
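A minimal sketch of this alternating scheme for a single band is given below, assuming co-registered frames of a static scene and a fixed iteration count in place of an error threshold; the final rescaling of $U$ is one arbitrary way of fixing the scale ambiguity.

```python
import numpy as np

def estimate_inverse_response(images, exposures, bit_depth=8, n_iters=50):
    """Alternating minimisation of Eq. (5) for one wavelength band.

    `images` is a list of co-registered frames of a static scene,
    `exposures` the matching exposure times. Returns the inverse
    response table U (one entry per intensity level) and the per-pixel
    irradiance map E, each only up to scale.
    """
    I = np.stack(images).astype(np.int64)        # shape: (n_images, H, W)
    t = np.asarray(exposures, dtype=np.float64)  # shape: (n_images,)
    n_levels = 2 ** bit_depth

    U = np.arange(n_levels, dtype=np.float64)    # initial guess: identity
    for _ in range(n_iters):
        # Eq. (7): E(x) = sum_i t_i U(I_i(x)) / sum_i t_i^2
        E = np.tensordot(t, U[I], axes=1) / np.sum(t ** 2)
        # Eq. (6): U(k) = mean of t_i E(x) over all pixels with I_i(x) = k
        tE = t[:, None, None] * E[None, :, :]
        sums = np.bincount(I.ravel(), weights=tE.ravel(), minlength=n_levels)
        counts = np.bincount(I.ravel(), minlength=n_levels)
        U = np.where(counts > 0, sums / np.maximum(counts, 1), U)
        U *= (n_levels - 1) / max(U[-1], 1e-12)  # fix the scale ambiguity
    return U, E
```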


C. Optical Attenuation Compensation

For the compensation of optical attenuation we also follow the ideas presented in [18], and estimate it in a parameter-free form. This is done by creating an image where each pixel corresponds to the factor by which the camera optics attenuate the incoming light intensity for each wavelength. Here, we take images of a scene containing a standard gray reflector with an ArUco marker [22] from multiple viewpoints. The camera pose with respect to the marker is then estimated in each image using the corresponding OpenCV [23] package and the geometric intrinsic camera properties, estimated using [24], resulting in a mapping $\pi_i : P \mapsto \Omega$ from a 3D world point $p$ on the reflector plane to image pixel $x$. A planar region of specified size around the marker (the surface of the Lambertian gray reflector) is discretised into 1000×1000 points $p \in P$. It is assumed that the radiance emanating from each point is constant and independent of viewpoint for all images in the dataset. However, this method allows the radiance to vary from point to point, and hence does not require uniform incident illumination, which is extremely useful in practice on the field.
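The sketch below illustrates one way to build the mapping $\pi_i$ with OpenCV's ArUco module. The dictionary choice, marker and board sizes are illustrative assumptions rather than values from the paper, the intrinsics `K, dist` are presumed known from geometric calibration, and the exact aruco API varies between OpenCV versions (this uses the pre-4.7 contrib interface).

```python
import cv2
import numpy as np

def reflector_to_pixel_map(gray, K, dist, marker_len=0.10,
                           board_size=0.5, n=1000):
    """Build the mapping pi_i: reflector-plane point -> image pixel.

    `gray` is one calibration image, `K`/`dist` the intrinsics from
    geometric calibration; marker_len and board_size (metres) and the
    dictionary below are illustrative values, not from the paper.
    """
    dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is None:
        return None  # no marker visible in this view
    rvecs, tvecs = cv2.aruco.estimatePoseSingleMarkers(
        corners, marker_len, K, dist)[:2]
    # Discretise a board_size x board_size region around the marker
    # (z = 0 in the marker frame) into n x n plane points p.
    u = np.linspace(-board_size / 2, board_size / 2, n)
    gx, gy = np.meshgrid(u, u)
    pts3d = np.stack([gx, gy, np.zeros_like(gx)], axis=-1).reshape(-1, 3)
    # Project every plane point into the image: this is pi_i.
    pix, _ = cv2.projectPoints(pts3d, rvecs[0], tvecs[0], K, dist)
    return pix.reshape(n, n, 2)
```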

Using equations 2 and 4, the maximum likelihood energy to be minimised, assuming Gaussian white noise on $U(I_i(\pi_i(p)))$, is given by:

$$M(L, V) = \sum_{i,\, p \in P} \big( t_i\, V(\pi_i(p))\, L(p) - U(I_i(\pi_i(p))) \big)^2, \quad (8)$$

which leads to the following solution for the optical attenuation at each pixel for each wavelength:

$$V^*(x) = \frac{\sum_i t_i\, L(p)\, U(I_i(\pi_i(p)))}{\sum_i \big( t_i\, L(p) \big)^2} \quad (9)$$

and the incident radiance:

$$L^*(p) = \frac{\sum_i t_i\, V(\pi_i(p))\, U(I_i(\pi_i(p)))}{\sum_i \big( t_i\, V(\pi_i(p)) \big)^2} \quad (10)$$

The above equations are solved alternately, as in section III-B, for the optical attenuation $V$ at each pixel $x$, along with the scene radiance $L(p)$ for every $p \in P$ (with $x = \pi_i(p)$). Since $V$ is only observable up to a scalar, it is normalised such that $\max(V) = 1$ for each wavelength.
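A sketch of the corresponding alternating updates for one band is shown below; the input names (`U_I`, `pix_idx`) are illustrative containers for the quantities $U(I_i(\pi_i(p)))$ and $\pi_i(p)$, not part of any published implementation.

```python
import numpy as np

def estimate_vignetting(U_I, t, pix_idx, img_shape, n_iters=50):
    """Alternating solution of Eqs. (9)-(10), a sketch for one band.

    U_I[i, j] holds U(I_i(pi_i(p_j))) for image i and plane point j;
    t[i] are exposure times; pix_idx[i, j] is the flattened pixel index
    pi_i(p_j). Returns the max-normalised attenuation map V and the
    point radiances L.
    """
    n_imgs, n_pts = U_I.shape
    n_pix = img_shape[0] * img_shape[1]
    V = np.ones(n_pix)
    L = np.ones(n_pts)
    for _ in range(n_iters):
        # Eq. (10): update the radiance of each plane point p.
        Vp = V[pix_idx]                                   # V(pi_i(p))
        L = np.sum(t[:, None] * Vp * U_I, axis=0) / \
            np.maximum(np.sum((t[:, None] * Vp) ** 2, axis=0), 1e-12)
        # Eq. (9): accumulate numerator/denominator per pixel x.
        num = np.zeros(n_pix)
        den = np.zeros(n_pix)
        tL = t[:, None] * L[None, :]
        np.add.at(num, pix_idx.ravel(), (tL * U_I).ravel())
        np.add.at(den, pix_idx.ravel(), (tL ** 2).ravel())
        V = np.where(den > 0, num / np.maximum(den, 1e-12), V)
        V /= max(V.max(), 1e-12)   # resolve the scalar ambiguity: max(V)=1
    return V.reshape(img_shape), L
```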

D. Illumination Estimation

We now present a novel method to simultaneously compensate for the spectral variation in filter sensitivity across wavelength (Figure 4) and the incident illumination, using one or more images of a Lambertian gray reflector taken under non-uniform incident illumination. We first propose that the incident illumination $J_\lambda(x)$ may be decomposed into a brightness component $B(x)$ and a "color" or spectral component $S(\lambda)$. We assume that the relative spectral intensities of the source (or the "color" of the illumination) do not change over the surface of the reflector; only its brightness does. Combining these two assumptions, we can write the following equation for an image of the gray reflector:

$$J_\lambda(x) = B(x)\, S(\lambda). \quad (11)$$

With this decomposition, using equations 11 and 1, and assuming Gaussian white noise perturbing $L_\lambda(x)$, we obtain a maximum likelihood energy dependent on both components, given one or more images of a standard reflector with known reflectance $R_s$:

$$M(S, B) = \sum_{x,\,\lambda} \big( L_\lambda(x) - S(\lambda)\, B(x)\, R_s \big)^2 \quad (12)$$

$$S^*(\lambda) = \operatorname*{arg\,min}_{S(\lambda)} M(S, B) = \frac{\sum_x L_\lambda(x)\, B(x)\, R_s}{|\Omega|} \quad (13)$$

and,

$$B^*(x) = \operatorname*{arg\,min}_{B(x)} M(S, B) = \frac{\sum_\lambda L_\lambda(x)\, S(\lambda)\, R_s}{\sum_\lambda 1} \quad (14)$$

These are again solved alternately, starting with a guess for either component, such as the mean spectral radiance $B(x) = \frac{\sum_\lambda L_\lambda(x)}{n_\lambda}$, similarly to sections III-B and III-C, to obtain the spectral correction factor $S^*(\lambda)$ for each wavelength and the brightness component $B^*(x)$ of the reflector image.
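The decomposition can be sketched as below for a reflector datacube, under the stated assumptions; the gauge-fixing normalisation of $B$ is an illustrative choice, since the factors are only determined up to scale.

```python
import numpy as np

def decompose_illumination(L_cube, R_s=0.5, n_iters=50):
    """Alternating solution of Eqs. (13)-(14), a sketch.

    `L_cube` is the radiance datacube of the gray reflector with shape
    (n_bands, H, W); R_s is the reflector's known (assumed flat)
    reflectance. Returns the spectral factor S per band and the
    brightness map B, each up to the usual scale ambiguity.
    """
    n_bands = L_cube.shape[0]
    L = L_cube.reshape(n_bands, -1)          # (n_bands, n_pixels)
    B = L.mean(axis=0)                       # initial guess: mean radiance
    for _ in range(n_iters):
        # Eq. (13): spectral component from the pixel-wise average.
        S = (L * B[None, :] * R_s).mean(axis=1)
        # Eq. (14): brightness component from the band-wise average.
        B = (L * S[:, None] * R_s).mean(axis=0)
        B /= max(B.max(), 1e-12)             # fix the gauge: max(B) = 1
    return S, B.reshape(L_cube.shape[1:])
```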

E. Relative Reflectance Image Estimation

Given the above estimates of the camera response function, optical attenuation and illumination, we invert equation 3 to obtain the reflectance image from the measured camera intensities:

$$R_\lambda(x) = \frac{U_\lambda(I_\lambda(x))}{t\, V_\lambda(x)\, S(\lambda)\, B(x)} \quad (15)$$

We note that the brightness component is known only for the calibration (reflector) images and not for the target scene. Hence, $R_\lambda(x)$ is also observable only up to a scalar factor, and has to be normalised with respect to a chosen wavelength in order to compare the absolute reflectance of the objects observed in the target scene.
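Putting the pieces together, a per-band correction following equation 15 might look like the sketch below; since the target scene's brightness component is unobserved, a scalar stand-in `B_cal` from the calibration image is used here, so the output is meaningful only up to scale.

```python
import numpy as np

def relative_reflectance(I, t, U, V, S_lam, B_cal=1.0):
    """Eq. (15) for one band: a sketch, correct only up to scale.

    `I` is the raw integer band image, `t` its exposure time, `U` the
    inverse response table, `V` the attenuation map and `S_lam` the
    spectral factor for this band. `B_cal` is a scalar brightness
    stand-in carried over from the calibration image, since the target
    scene's own brightness component is unobserved.
    """
    R = U[I] / (t * V * S_lam * B_cal + 1e-12)
    return R / max(R.max(), 1e-12)  # normalise to a relative scale
```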

It is important to note that the illumination estimation step of the calibration must be performed again whenever there is a significant change in the spectral distribution of the incident illumination, since the estimated spectral correction factors are valid only for the illumination under which the reflector image was taken. An example is a mobile robot moving between naturally and artificially illuminated regions.

IV. RESULTS AND DISCUSSION

A. Spectral sensor response function(s)

Figure 5 shows the estimated inverse response function of the camera plotted against the measured pixel intensity. It is interesting to note that although we allowed the camera response function to vary with wavelength, the resulting estimates for all wavelengths were equal to each other within experimental error bounds. This is consistent with the fact that the response function is a property of the underlying


[Fig. 5 plot: estimated inverse camera response, inverse response function U(I) versus pixel intensity I, both spanning 0–250.]

Fig. 5: Sample subset of images taken at different exposures at the 803 nm wavelength (top) and the resulting estimated sensor inverse response function. Since no hardware gamma correction is applied, the resulting response is approximately linear; however, linearity is in no way imposed, merely estimated.

CMOS/CCD sensor and readout electronics, and does not depend on the spectral filters placed on the sensor. The response for the given camera is estimated to be approximately linear, which is consistent with the typical response of a CMOS sensor when no additional hardware gamma correction is applied, as is the case for the sensor used.

B. Spectral Optical attenuation coefficients

Figure 6 shows the estimated spectral optical attenuation coefficients for the camera-lens system shown in Figure 2. The radially symmetric lens vignetting, typical of common lenses, is recovered from the data. Also significant is the variation of the optical attenuation factors with wavelength. The optical attenuation factors depend only on the camera and lens configuration, and hence only need to be re-estimated when the configuration changes. Such a scenario is not uncommon in outdoor mobile robotics applications, where the aperture and focal length may have to be adjusted depending on the environment. Using this method for optical attenuation estimation allows a recalibration dataset to be acquired on the field when such an adjustment is made, since it uses only the same gray reflector that is required for illumination compensation, with a paper tag placed on it.

C. Illumination Estimation

The estimated spectral correction factors for a wide range of illumination conditions are shown in Figure 1b. Little variation is observed in the spectral correction factors between clear and cloudy days. This is consistent with past studies [26], which found that the intensity-normalised spectral distributions of Global Horizontal Irradiance (GHI) on clear and cloudy days are similar. Since we are primarily

[Fig. 6 image: a 5×5 grid of per-band attenuation maps (608–867 nm) with a colour scale from 0 to 1.]

Fig. 6: Subset of images of an ArUco marker [25] placed on a standard gray reflector board, taken from different viewpoints at 803 nm (top), and the estimated optical attenuation coefficients for each wavelength of our multi-spectral camera, computed using the complete set of images.

interested in using the camera in a nadir configuration on board a UAV, we expect this result to extend to those datasets as well. Hence, the assumption that the spectral distribution of the incident illumination remains the same for the images used for calibration and for the actual target scenes is applicable in practice for a wide range of ambient conditions.

Figure 7 shows a qualitative comparison between our approach and a baseline approach in which the corrected data cubes for the target scene are calculated by normalisation, i.e., dividing the target image for each wavelength by the standard reflector image:

$$R_\lambda(x)_{\mathrm{target,norm}} = \frac{I_\lambda(x)_{\mathrm{target}}}{I_\lambda(x)_{\mathrm{reflector}}} \quad (16)$$

Such techniques work well in simultaneously compensating for vignetting and the incident illumination spectrum when the reflector image is taken under uniform incident illumination in the laboratory; however, as seen in Figure 7d, they lead to artefacts for calibration images taken under field conditions. In contrast, our method gracefully handles non-uniform illumination even in the reflector image (Figure 7b) by first separating out the brightness component (Figures 7e and 7f) before estimating the correction factors for each wavelength, resulting in reflectance images without artefacts in both cases (Figures 7g and 7h).

D. Reflectance Recovery

As mentioned in section III-D, our method allows one to obtain the reflectance image up to a scalar factor. This is sufficient if one is primarily interested in obtaining multispectral indices, which is the case for applications such as precision agriculture. However, in order to compare absolute


[Fig. 7 panels (a)–(h); panels (e) and (f) share a colour scale from 40 to 100.]

Fig. 7: Comparison of corrected data cubes generated using our approach and normalisation methods. (a) and (b) show images of a standard gray reflector at 803 nm taken using the camera. (c) shows the result of applying a standard normalisation technique [13], [12] to a target image. Reflectance artefacts caused by non-uniform incident illumination in the reflector image are apparent in (d). (e) and (f) show the result of our brightness estimation step, which enables factoring out the brightness component from the calibration image and hence gives repeatable results when applied to the target images (g) and (h).

reflectance values for the target scene to the literature, it is necessary to further correct it by normalising with respect to a region of known reflectance in the scene. To give an idea of the quantitative variance of reflectance estimates for data cubes corrected using our method, Figure 8a compares the spectral distribution of reflectance values for 5 different gray panels on a GretagMacbeth ColorChecker chart taken under clear and cloudy skies. We normalise the corrected spectral response by the estimated response for the white panel from the same corrected image. One can see that

[Fig. 8: (a) plot of white-panel-normalised reflectance (0–1) versus wavelength (600–900 nm) for a cloudy day and a sunny day; image panels (b)–(g).]

Fig. 8: (a) Estimated reflectance for 5 different gray panels of a colour checker chart, computed after calibrating images using our method on cloudy and sunny days. The reflectances have been normalised by the reflectance of the white colour panel (from the same corrected image) to obtain absolute values. (b),(e) gray reflector images; (c),(f) uncorrected raw camera images of a colour checker for the 803 nm band; and (d),(g) corrected images using our method on sunny (top) and cloudy (bottom) days. As shown above, our method enables quantitatively accurate relative spectral reflectance recovery with calibration images taken under practical field conditions across a wide range of illumination conditions.

although the incident illumination brightness varies between the two sets of images, and is also different for each reflector image used for calibration, the reflectance estimates are close to the documented values under both conditions.

V. CONCLUSION

This work proposes a complete pipeline for radiometrically calibrating multispectral cameras. Especially suited for cameras onboard mobile robotic platforms, the pipeline provides a method to correct for non-linear sensor response functions, lens vignetting effects, filter spectral sensitivities and incident illumination using only a standard gray reflectance target, hence allowing the entire radiometric calibration process to take place while the robot is on the field. The calibration routine consists of three major steps. First, the (non-linear) sensor response function is determined using an exposure sweep of a static scene. Second, the light attenuation due to the camera's optics is estimated, for each wavelength, by moving the camera around while observing


a marker placed on the reflectance target. Third, the convolution of the camera's spectral response and the current incident illumination is estimated using one or more images of a standard gray reflector, via a novel method which separately determines the brightness and spectral components of the incident illumination. By using a parameter-free description of the camera properties, our routine enables non-expert users with little prior knowledge of the camera characteristics to quickly and conveniently get started with capturing accurate reflectance data using present and upcoming snapshot multispectral cameras. We show that quantitatively accurate reflectance estimates can be obtained using the approach across a range of incident illumination conditions, laying the foundation for accurate long-term spatio-temporal-spectral mapping using mobile robots equipped with such cameras.

VI. ACKNOWLEDGEMENTS

The authors would like to thank Dr. Frank Liebisch and Dr. Johannes Pfeifer from the Professorship of Crop Science, ETH Zürich, for insightful discussions and dataset collection.

REFERENCES

[1] C. Zhang and J. M. Kovacs, "The application of small unmanned aerial systems for precision agriculture: a review," Precision Agriculture, vol. 13, no. 6, pp. 693–712, 2012.

[2] A. A. Gitelson, "Wide dynamic range vegetation index for remote quantification of biophysical characteristics of vegetation," Journal of Plant Physiology, vol. 161, no. 2, pp. 165–173, 2004.

[3] D. Haboudane, J. R. Miller, N. Tremblay, P. J. Zarco-Tejada, and L. Dextraze, "Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture," Remote Sensing of Environment, vol. 81, no. 2, pp. 416–426, 2002.

[4] A. Walter, B. Studer, and R. Kölliker, "Advanced phenotyping offers opportunities for improved breeding of forage and turf species," Annals of Botany, p. mcs026, 2012.

[5] A. Walter, F. Liebisch, and A. Hund, "Plant phenotyping: from bean weighing to image analysis," Plant Methods, vol. 11, no. 1, p. 14, 2015.

[6] N. Otsu, "A threshold selection method from gray-level histograms," Automatica, vol. 11, no. 285-296, pp. 23–27, 1975.

[7] Point clouds generated using Pix4Dmapper by Pix4D. [Online]. Available: http://www.pix4d.com/

[8] E. Barnes, T. Clarke, S. Richards, P. Colaizzi, J. Haberland, M. Kostrzewski, P. Waller, C. Choi, E. Riley, T. Thompson et al., "Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data," in Proceedings of the 5th International Conference on Precision Agriculture, Bloomington, MN, 2000, pp. 16–19.

[9] D. Haboudane, J. R. Miller, E. Pattey, P. J. Zarco-Tejada, and I. B. Strachan, "Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture," Remote Sensing of Environment, vol. 90, no. 3, pp. 337–352, 2004.

[10] J. Pfeifer, R. Khanna, D. Constantin, M. Popović, E. Galceran, N. Kirchgessner, A. Walter, R. Siegwart, and F. Liebisch, "Towards automatic UAV data interpretation for precision farming."

[11] D. Constantin, M. Rehak, Y. Akhtman, and F. Liebisch, "Detection of crop properties by means of hyperspectral remote sensing from a micro UAV," in Bornimer Agrartechnische Berichte, vol. 88, no. EPFL-CONF-218662. Leibniz-Institut für Agrartechnik Potsdam-Bornim e.V., 2015, pp. 129–137.

[12] S. Del Pozo, P. Rodríguez-Gonzálvez, D. Hernández-López, and B. Felipe-García, "Vicarious radiometric calibration of a multispectral camera on board an unmanned aerial system," Remote Sensing, vol. 6, no. 3, pp. 1918–1937, 2014.

[13] H. Aasen, A. Burkart, A. Bolten, and G. Bareth, "Generating 3D hyperspectral information with lightweight UAV snapshot cameras for vegetation monitoring: From camera calibration to quality assurance," ISPRS Journal of Photogrammetry and Remote Sensing, vol. 108, pp. 245–259, 2015.

[14] D. Olsen, C. Dou, X. Zhang, L. Hu, H. Kim, and E. Hildum, "Radiometric calibration for AgCam," Remote Sensing, vol. 2, no. 2, pp. 464–477, 2010.

[15] G. E. Healey and R. Kondepudy, "Radiometric CCD camera calibration and noise estimation," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 16, no. 3, pp. 267–276, 1994.

[16] B. Geelen, C. Blanch, P. Gonzalez, N. Tack, and A. Lambrechts, "A tiny VIS-NIR snapshot multispectral camera," in SPIE OPTO. International Society for Optics and Photonics, 2015, pp. 937414–937414.

[17] P. E. Debevec and J. Malik, "Recovering high dynamic range radiance maps from photographs," in ACM SIGGRAPH 2008 Classes. ACM, 2008, p. 31.

[18] J. Engel, V. Usenko, and D. Cremers, "A photometrically calibrated benchmark for monocular visual odometry," in arXiv:1607.02555, July 2016.

[19] Y. Kim and J. Reid, "Modeling and calibration of a multi-spectral imaging sensor for in-field crop nitrogen assessment," Applied Engineering in Agriculture, vol. 22, no. 6, pp. 935–941, 2006.

[20] P. Corke, R. Paul, W. Churchill, and P. Newman, "Dealing with shadows: Capturing intrinsic scene appearance for image-based outdoor localisation," in 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. IEEE, 2013, pp. 2085–2092.

[21] R. Ramakrishnan, J. Nieto, and S. Scheding, "Verification of sky models for image calibration," in Proceedings of the IEEE International Conference on Computer Vision Workshops, 2013, pp. 907–914.

[22] R. Muñoz-Salinas, "ArUco: a minimal library for augmented reality applications based on OpenCV," Universidad de Córdoba, 2012.

[23] G. Bradski, "The OpenCV library," Dr. Dobb's Journal of Software Tools.

[24] P. Furgale, J. Rehder, and R. Siegwart, "Unified temporal and spatial calibration for multi-sensor systems," in Intelligent Robots and Systems (IROS), 2013 IEEE/RSJ International Conference on. IEEE, 2013, pp. 1280–1286.

[25] S. Garrido-Jurado, R. Muñoz-Salinas, F. J. Madrid-Cuevas, and M. J. Marín-Jiménez, "Automatic generation and detection of highly reliable fiducial markers under occlusion," Pattern Recognition, vol. 47, no. 6, pp. 2280–2292, 2014.

[26] G. Blackburn and F. Vignola, "Spectral distributions of diffuse and global irradiance for clear and cloudy periods," in World Renewable Energy Forum, Denver, CO, 2012.