
PhaseSpace Optical Navigation Development & Multi-UAV Closed-

Loop Control Demonstration

Texas A&M University
Telecon with Boeing
December 7, 2009

PhaseSpace Optical Navigation Adaptation (PSONA) Team

Faculty Mentors: Ms. Magda Lagoudas, Dr. John L. Junkins, Dr. Raktim Bhattacharya

Overview

• PSONA Semester Goals
• PhaseSpace calibration (Albert Soto & Zach Sunberg)
• Outfitting quad-rotors with PhaseSpace/Vicon (Erin Mastenbrook)
• Quad-rotor custom electronics board design (Winnie Lung & Kate Stuckman)
• A look at the next semester

Semester Goals
Characterizing and Calibrating PhaseSpace

• Characterize the PhaseSpace system
• Develop a method of calibration
• Evaluate the accuracy and reliability of PhaseSpace by comparing to Indoor GPS
• Compare all results to the capabilities of Vicon

Semester Goals
UAV Demonstration

• Demonstrate the capability of utilizing PhaseSpace as a vision-based localization solution for multi-UAV coordination
• Equip two DraganFlyer quad-rotors and the LASR Lab to enable multi-vehicle autonomous flight
  – Next semester: develop user interfaces and control algorithms for autonomous flight

PhaseSpace Calibration


• Error in the PhaseSpace system arises primarily from optical noise and from misalignments in the camera’s internal geometry (biases).

• By determining the true geometry of the camera, bias error may be compensated for, resulting in a more accurate “best guess” of the beacon location.

• Noise characterization then describes how reliable the “best guess” measurement is.
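The bias/noise split above can be illustrated with a short synthetic example in Python/NumPy (the project code itself is in Matlab, and all numbers here are invented for illustration): compensating a known bias removes the systematic error in the "best guess," while the residual scatter quantifies how reliable each individual measurement is.

```python
# Illustrative sketch only: synthetic 1-D "beacon position" measurements
# with a fixed bias (camera geometry error) plus random optical noise.
import numpy as np

rng = np.random.default_rng(0)
true_pos, bias, noise_std = 10.0, 0.8, 0.1
meas = true_pos + bias + rng.normal(0.0, noise_std, size=1000)

raw_error = abs(meas.mean() - true_pos)        # dominated by the bias
calibrated = meas - bias                       # bias compensated for
cal_error = abs(calibrated.mean() - true_pos)  # only noise remains
reliability = calibrated.std()                 # noise characterization
```

After calibration the mean error collapses toward zero, and the standard deviation of the calibrated measurements is the "how reliable is this best guess" number the noise characterization provides.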

Approach


• Develop a model that describes a camera’s output as a function of its intrinsic parameters and beacon position.

• Construct an algorithm to determine more accurate (or less biased) camera parameters.

• Using these improved parameters, run several tests to gather data reflecting the correlation of noise with beacon position in each dimension.
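The parameter-estimation step above follows the Gauss-Newton differential-correction pattern (GLSDC). The sketch below, in Python/NumPy rather than the team's Matlab, recovers two parameters of a toy measurement model; the real code estimates the camera's intrinsic parameters against a much richer model.

```python
# Hedged sketch of a GLSDC-style estimator on a toy model y = a*sin(t) + b.
# The model, data, and tolerances are illustrative, not the camera model.
import numpy as np

def simulate(x, t):
    """Measurement model: output as a function of parameters x and input t."""
    a, b = x
    return a * np.sin(t) + b

def jacobian(x, t, eps=1e-6):
    """Numerical Jacobian H = d(simulate)/dx, one column per parameter."""
    H = np.zeros((len(t), len(x)))
    for j in range(len(x)):
        dx = np.zeros_like(x)
        dx[j] = eps
        H[:, j] = (simulate(x + dx, t) - simulate(x - dx, t)) / (2 * eps)
    return H

def glsdc(y, t, x0, tol=1e-10, max_iter=50):
    """Iterate x += (H^T H)^-1 H^T (y - y_hat) until the update converges."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dy = y - simulate(x, t)
        H = jacobian(x, t)
        dx = np.linalg.solve(H.T @ H, H.T @ dy)
        x = x + dx
        if np.linalg.norm(dx) < tol:
            break
    return x

t = np.linspace(0, 2 * np.pi, 20)
true_x = np.array([1.5, -0.3])
est = glsdc(simulate(true_x, t), t, x0=[1.0, 0.0])
```

Starting from the "ideal" parameter guess, the loop linearizes the model, solves the normal equations for a correction, and repeats; with noiseless data the estimate converges to the true parameters.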

Mathematical Model

Extrinsic parameters:
• u – beacon location
• u0 – mounting point
• Camera’s Euler angles

Intrinsic parameters:
• pz – pinhole depth
• sz – sensor center depth
• α – angle of sensor’s axis from horizontal within the ideal plane
• ß – angle between sensor and ideal plane
• px' – distance from mount to pinhole along sensor’s axis
• sx' – distance from mount to sensor center along sensor’s axis

Mathematical Model

The following set of four equations describes an output pixel in terms of the basic intrinsic and extrinsic parameters. b0, f, and δ are intermediate values; R is the rotation matrix built from the camera’s Euler angles; ξ is the pixel value returned by the camera. The first equation rotates the beacon position into the camera frame:

b0 = R (u − u0)
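A hedged Python/NumPy sketch of this first step, b0 = R (u − u0). The z-y-x Euler sequence is an assumption (the slide only says R contains the camera's Euler angles), and the remaining three projection equations are not reproduced here.

```python
# Illustrative only: rotate a beacon position into the camera frame.
# The Euler convention (z-y-x) is an assumed choice, not the team's spec.
import numpy as np

def euler_zyx_matrix(yaw, pitch, roll):
    """Rotation matrix from z-y-x Euler angles (radians)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def camera_frame(u, u0, angles):
    """b0 = R (u - u0): beacon position relative to the mount, in camera axes."""
    R = euler_zyx_matrix(*angles)
    return R @ (np.asarray(u) - np.asarray(u0))

b0 = camera_frame(u=[1.0, 2.0, 3.0], u0=[0.5, 0.0, 1.0], angles=(0.1, -0.2, 0.3))
```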

Calibration Simulation

The ideal intrinsic parameter values shown below were passed into the GLSDC algorithm as initial guesses. Biased “true” values were used to generate output for eight beacon locations.

Ideal Camera    Sensor 1    Sensor 2
α (deg.)        -45         45
ß (deg.)        0           0
sx' (in.)       1.77        1.77
sz (in.)        0.100       0.100
px' (in.)       1.77        1.77
pz (in.)        1.81        1.81

True Camera     Sensor 1    Sensor 2
α (deg.)        -47.5       42
ß (deg.)        2.0         -1.5
sx' (in.)       1.81        1.74
sz (in.)        0.105       0.095
px' (in.)       1.76        1.80
pz (in.)        1.80        1.84

Calibration Simulation

Shown below is the % error of the ideal and GLSDC-derived cameras vs. the true camera. For this trial, noiseless measurements from the hypothetical true camera were used, and the parameters were recovered accurately to several decimal places.

Error (ideal)   Sensor 1    Sensor 2
α               5.263%      7.143%
ß               100.0%      100.0%
sx'             2.210%      1.724%
sz              4.762%      5.263%
px'             0.568%      1.667%
pz              0.556%      1.630%

Error (GLSDC)   Average     Max
α               0.000%      0.000%
ß               0.000%      0.000%
sx'             0.000%      0.000%
sz              0.000%      0.000%
px'             0.000%      0.000%
pz              0.000%      0.000%

Calibration Simulation

Next, the noise was set to a value slightly greater than the maximum observed in testing. The average and maximum % errors from a set of 5 trials are shown below. In all cases, the GLSDC code produced more accurate parameters, often by a large margin; sensor depth (sz) was estimated less accurately than the other parameters.

Error (ideal)   Sensor 1    Sensor 2
α               5.263%      7.143%
ß               100.0%      100.0%
sx'             2.210%      1.724%
sz              4.762%      5.263%
px'             0.568%      1.667%
pz              0.556%      1.630%

Error (GLSDC)   Average     Max
α               0.003%      0.010%
ß               0.343%      0.443%
sx'             0.043%      0.101%
sz              2.896%      4.740%
px'             0.045%      0.100%
pz              0.160%      0.273%

Calibration Testing

• Two cameras were tested using three different arrangements of 6 beacons in known locations
• GLSDC was run on all three data sets simultaneously
• Results were similar for both cameras

Ideal Camera    Sensor 1    Sensor 2
α (deg.)        -45         45
ß (deg.)        0           0
sx' (in.)       0.707       -0.707
sz (in.)        0.100       0.100
px' (in.)       0.707       -0.707
pz (in.)        1.81        1.81

True Camera     Sensor 1    Sensor 2
α (deg.)        -44.08      45.49
ß (deg.)        2.13        -0.75
sx' (in.)       0.462       -0.566
sz (in.)        -2.370      -2.424
px' (in.)       0.461       -0.566
pz (in.)        -0.512      -0.560

Distortion Testing

• Characterized the lens distortion by taking measurements across the FOV of each imager.
• For each imager, the camera was tilted +45° or −45° to isolate one sensor (setting one imager vertical and the other horizontal).
• The test camera was leveled in the plane of the test by ensuring that the vertical sensor returned equivalent outputs for a beacon at the two FOV endpoints (the horizontal imager is then parallel to the optical table, assuming the imagers are orthogonal).

[Figure: Distortion Test Results. Difference from model (pixels, ±50 scale) vs. model pixels (0 to 4000).]


Calibration Software


• The calibration software will take in data previously gathered by the cameras.

• The cameras will gather data by viewing the calibration rig (a circle of beacons at many positions and orientations).

• The software will calibrate the positions and orientations of all of the cameras along with their intrinsic parameters. (It will also estimate the rig’s pose in each frame, though that output is not needed here.)

Calibration Software Initialization

y – measurements
x – calibration parameters

Main Function:
    Read in measurements
    Load typical intrinsic parameters
    Assume initial rig state is zero
    Estimate camera states (from initial rig state)
    Estimate rig states (from camera states)
    While difference in y > tolerance:
        GLSDC iteration (measurements, calibration parameters)  [next slide]

GLSDC Iteration

y – measurements
x – calibration parameters

While difference in y > tolerance:
    New y = Simulate System ( x )
    Difference in y = y − New y
    H = Generate H ( x , constraints )
    W = Generate Weight Matrix
    Difference in x = ( Hᵀ W H )⁻¹ Hᵀ W · Difference in y
    x = x + Difference in x
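One weighted iteration of this loop can be sketched in Python/NumPy. The toy system below is linear, so a single step recovers the parameters exactly; Simulate System and Generate H stand in for the real camera model and its sensitivity matrix, and the weight matrix here is just an illustrative choice.

```python
# Hedged sketch of one weighted GLSDC update:
# dx = (H^T W H)^-1 H^T W dy, then x = x + dx.
import numpy as np

def glsdc_step(x, y, simulate, generate_h, W):
    """One iteration: linearize, solve the weighted normal equations, update x."""
    dy = y - simulate(x)                                 # Difference in y
    H = generate_h(x)                                    # sensitivity matrix
    dx = np.linalg.solve(H.T @ W @ H, H.T @ W @ dy)      # Difference in x
    return x + dx, dy

# Toy linear system y = H x, so one step recovers x exactly.
H_fixed = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
true_x = np.array([2.0, -1.0])
y = H_fixed @ true_x
W = np.diag([1.0, 1.0, 0.25])          # down-weight a less trusted measurement
x_est, _ = glsdc_step(np.zeros(2), y, lambda x: H_fixed @ x,
                      lambda x: H_fixed, W)
```

In the real code W would come from the noise characterization (inverse measurement covariance), so noisier measurements pull less on the parameter update.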

Coding Progress


Created Matlab classes for the measurements and calibration parameters. These will handle access to all of the parameters and allow them to be used as vectors.

Example: x is an object of class CalParams.
    x.Vec acts as a vector for use in GLSDC.
    x.IntParams(2,1).F accesses the intrinsic parameter F of sensor 1 in camera 2.

I have never worked with classes in Matlab, so the development has taken longer than initially expected, but we hope that a strong framework will pay off in the future.
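A Python analogue of the CalParams idea may make the design clearer: structured storage with per-camera, per-sensor access, plus a flat vector view for GLSDC. The field names and layout here are illustrative, not the team's actual Matlab class.

```python
# Hypothetical Python counterpart of the Matlab CalParams class.
import numpy as np

class CalParams:
    def __init__(self, intrinsics):
        # intrinsics[cam][sensor] is a dict of named parameters, e.g. {"F": ...}
        self.intrinsics = intrinsics

    @property
    def Vec(self):
        """Flatten every parameter into one vector, in a fixed order."""
        vals = []
        for cam in self.intrinsics:
            for sensor in cam:
                vals.extend(sensor[k] for k in sorted(sensor))
        return np.array(vals)

    def IntParams(self, camera, sensor):
        """1-based access, mirroring x.IntParams(2,1) in the Matlab example."""
        return self.intrinsics[camera - 1][sensor - 1]

# Two cameras, two sensors each, one illustrative parameter "F" per sensor.
x = CalParams([[{"F": 1.81}, {"F": 1.80}], [{"F": 1.76}, {"F": 1.84}]])
```

The payoff of the wrapper is that GLSDC only ever sees the flat vector, while the rest of the code can address parameters by camera and sensor.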

Previous Semester Accomplishments

• Familiarized with Matlab and GLSDC
• Researched and purchased computer hardware for quad-rotor offboard computing
• Wrote software for finding an initial estimate of a camera’s position and orientation

Tasks for Next Semester

• Finish writing the calibration software
• Help write software for determining positions using the data gathered from calibration
• Help write control software for the quad-rotors

Quad-Rotor Structure

[Figure: top view of the quad-rotor frame showing the rotors, PhaseSpace beacons, Vicon reflectors, basic stabilization kit rods, and upper layer rods. R = 131.6 mm, LP = 8.3 mm, LV = 14.3 mm.]

Each marker sits on an arm at angle Φi, at radius R from the center and offset by L to one side (L = LP for a PhaseSpace beacon, L = LV for a Vicon reflector):

X = R cos Φi ∓ L sin Φi
Y = R sin Φi ± L cos Φi

Use the top sign if the beacon is to the clockwise side.
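The placement equations can be checked with a small Python sketch. The sign convention and the reading of L as the lateral offset are taken from the slide text, so treat this as a sketch rather than the team's actual geometry code.

```python
# Hedged sketch: body-frame (X, Y) of a marker on a quad-rotor arm.
import math

R_MM, LP_MM, LV_MM = 131.6, 8.3, 14.3   # dimensions from the slide (mm)

def marker_xy(phi_i, L, clockwise_side):
    """X = R cos(phi) -/+ L sin(phi), Y = R sin(phi) +/- L cos(phi).

    Top sign (minus in X, plus in Y) when the marker is on the clockwise side.
    """
    sign = -1.0 if clockwise_side else 1.0
    x = R_MM * math.cos(phi_i) + sign * L * math.sin(phi_i)
    y = R_MM * math.sin(phi_i) - sign * L * math.cos(phi_i)
    return x, y

# A Vicon reflector on the arm at phi = 0, clockwise side:
x0, y0 = marker_xy(0.0, LV_MM, clockwise_side=True)
```

At Φi = 0 this places the marker at (R, +LV), i.e. straight out the arm and offset sideways, which is a quick sanity check of the sign convention.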

Semester Summary

• Designed a T joint, corner piece, center piece, and mounts for the PhaseSpace beacons and Vicon reflectors in SolidWorks, and had the parts created on a rapid prototyping machine.
• Outfitted 2 quad-rotors with parts, beacons, reflectors, and electrical wiring.
• Created files for Vicon and PhaseSpace that allow them to recognize the individual quad-rotors.
• Will outfit 3 more quad-rotors next semester.

Quad-Rotor Video


Schematic


Current Board Layout


• Completed schematic
    – Learned circuit design basics
    – Learned EAGLE
    – Learned how to interpret datasheets
• Finalized parts list
• Began board layout

Semester Summary

• Before end of semester
    – Finish board layout
    – Order board and parts
• Early next semester
    – Assemble board
    – Test board
    – Make necessary changes

Future Plans

• Assemble calibration rig and perform comparative testing between Vicon and PhaseSpace using iGPS as truth
• Complete PhaseSpace calibration code
• Assemble & debug custom electronics board
• Interface quad-rotor board with off-board computing & Vicon/PhaseSpace feedback
• Outfit three additional quad-rotors (total of 5)
• Develop models & control laws for autonomous flight

Next Semester


Questions? Comments?

