
Improved Tau-Guidance and Vision-aided Navigation for Robust Autonomous Landing of UAVs

Amedeo Rodi Vetrella, Inkyu Sa, Marija Popović, Raghav Khanna, Juan Nieto, Giancarmine Fasano, Domenico Accardo and Roland Siegwart

Abstract In many unmanned aerial vehicle (UAV) applications, flexible trajectory generation algorithms are required to enable high levels of autonomy for critical mission phases, such as take-off, area coverage, and landing. In this paper, we present a guidance approach which uses the improved intrinsic tau guidance theory to create 4-D trajectories for the desired time-to-contact (TTC) with a landing platform tracked by a visual sensor. This allows us to perform maneuvers with tunable trajectory profiles, while catering for static or non-static starting and terminating motion states. We validate our method using a rotary-wing UAV to land on a static platform. Results from simulations and outdoor experiments show that our approach achieves smooth, accurate landings with easily adjustable parameters.

1 Introduction

Rotary-wing unmanned aerial vehicles (UAVs) play a significant role in providing services and enhancing safety thanks to their flexible and low-cost surveillance, monitoring, and risk assessment capabilities. In many emerging applications, including environmental monitoring [3], industrial inspection [4], and emergency response [13], high levels of system autonomy are vital to guarantee safe and reliable operation, especially when UAVs act as sentinels monitoring large areas, taking off from their nest and returning to it to recharge before performing another mission.

A typical mission can be divided into the following phases: (1) take-off, (2) mid-course, (3) area scanning [14], (4) visual tracking, and (5) landing. Often, the final phase is the most critical as it involves performing delicate maneuvers; e.g., landing on a station for re-charging [2] or on a ground carrier for transportation [8]. These procedures are subject to constraints on time and space, and must be robust to changes in environmental conditions, such as visibility and wind disturbances [6]. To achieve smooth landings, precise sensing and accurate control techniques are therefore required.

A. R. Vetrella (corresponding author), G. Fasano, D. Accardo
University of Naples “Federico II”
e-mail: [email protected]

I. Sa, M. Popović, R. Khanna, J. Nieto, R. Siegwart
Autonomous Systems Lab, ETH Zürich


In this paper, we address these problems by integrating, within the end-to-end software system developed at ETH, a trajectory generation algorithm based on a visual tracking system and a bio-inspired guidance method for autonomously landing on a specified target. Our motivation is to increase the reliability and versatility of UAV landings in the example scenarios above. We use the improved intrinsic tau guidance theory [10, 17] to generate 4-D trajectories for a desired time-to-contact (TTC) based on the relative pose estimate of the UAV with respect to the target. This approach enables us to perform a maneuver with arbitrary initial and final motion states, and to tailor its trajectory profile for different types of rotary- or fixed-wing tasks, such as landing, in-flight obstacle avoidance, and object picking. The advantage of this approach is thus a more “user-oriented” trajectory generation method, where fundamental parameters can be tuned on the basis of the mission requirements. With respect to the trajectory, the user may be interested in assigning predefined mission requirements, such as maintaining the course within the boundary of two intersecting planes (for example, to fly within a natural or man-made canyon), controlling the total mission time or its energy consumption, or specifying the initial and final landing angles.

We validate our strategy using a rotary-wing UAV equipped with a Global Positioning System (GPS), an Inertial Measurement Unit (IMU), a Visual-Inertial (VI-) sensor, and a downward-looking camera for detecting a static target. However, as our method does not rely on position-based sensing, it can be easily adapted to different platforms and environments. The main contributions of this work are:

1. A bio-inspired landing method which:

• generates easily tunable 4-D trajectories to the target,
• provides guidance starting or terminating with static or non-static motion states,
• is applicable with different sensor configurations.

2. The validation of our method in simulation and its demonstration in an outdoor environment.

The paper is structured as follows: Section 2 begins by outlining the state of the art in autonomous landing methods. The general landing problem is formulated in Section 3, and Section 4 describes our bio-inspired approach. We present our experimental set-up and results in Sections 5 and 6 before concluding in Section 7.

2 Related Work

Significant work has been done recently on autonomous landing methods for UAVs in various environments. As discussed by Gautam et al. [6], landing navigation frameworks mainly use a GPS and an IMU, with small, light-weight visual sensors often integrated for improved accuracy, especially in outdoor applications.


Landing strategies can be categorized based on the guidance techniques used to create trajectories to the landing position. We consider (i) position-based [16, 18] and (ii) biologically-inspired [9, 11, 15] techniques. Traditional position-based guidance approaches, such as the pursuit [18] and proportional [16] laws, leverage Line-of-Sight (LOS) to navigate the UAV towards the target. While these methods provide precise tracking, they are limited to position sensors only, as positions and/or velocities must be accurately computed [8]. Moreover, they may require complex control algorithms and lack controllability of the UAV trajectory pattern and profile, thus restricting their applicability.

Bio-inspired guidance paradigms overcome these limitations by using visual information such as the TTC [11] or optical flow [15] to generate 4-D trajectories [9, 15], giving the possibility of controlling both the trajectory and the time. In this category, the general tau theory [10, 11] has been popularly postulated to describe goal-directed movements, e.g., collision avoidance [17], docking, and landing [9]. Kendoul [9] recently introduced a tau-based UAV autopilot and implemented it to perform high-accuracy maneuvers. Similarly, we apply the tau principles to generate a 4-D trajectory during landing. However, by using elements of improved intrinsic tau guidance [17], our method is not restricted to static initial and final states and is thus also suitable for landing on moving platforms.

3 Problem Definition

As mentioned in Section 1, there are various UAV missions in which large area coverage is required. For safe and robust autonomous operation, it is necessary to generate successive, consecutive trajectories that maintain position and/or velocity continuity at the boundary waypoint between the nth and the (n+1)th trajectory. Updating the trajectory is also required during the successive steps of following or landing on a moving object, taking into account the error, including noise, introduced by the tracking algorithm. Consequently, the main problem is to develop and apply a fast and flexible algorithm that gives the UAV the capability to autonomously perform the various phases mentioned above using a limited number of intuitive parameters, without the need for complex computations.
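The boundary-continuity requirement above can be sketched as a simple state check. This is a minimal illustration only; the 6-element state layout and the tolerance value are assumptions, not the paper's implementation:

```python
import numpy as np

# Sketch of the waypoint continuity requirement: the state {position, velocity}
# at the end of the n-th trajectory segment must match the start of the
# (n+1)-th segment within a tolerance. State layout and tolerance are
# illustrative assumptions.
def is_continuous(end_state, start_state, tol=1e-3):
    return bool(np.allclose(end_state, start_state, atol=tol))

seg_n_end = np.array([1.0, 2.0, 0.5, 0.1, 0.0, -0.2])   # x, y, z, vx, vy, vz
seg_n1_start = seg_n_end.copy()                          # continuous hand-off
```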

The general goal-directed trajectory generation problem is defined as follows. We describe a 4-D UAV trajectory by the state S(t) = {x(t), y(t), z(t), ẋ(t), ẏ(t), ż(t)}. The aim is to guide the UAV from arbitrary initial states S(t0) to goal states S(t0 + T) in the execution time T. The UAV dynamics are:

ẋ = u_x,   ẏ = u_y,   ż = u_z,   (1)


where u = {u_x, u_y, u_z} are the velocity components along each coordinate axis, used as commands for the trajectory tracker, which is discussed for our application in Section 5.
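As a minimal illustration of Eq. (1), the tracker-level kinematics can be integrated forward from velocity commands. The step size and command values below are arbitrary examples, not values from the paper:

```python
import numpy as np

# Simple forward-Euler integration of the kinematic model in Eq. (1):
# the position integrates the commanded velocity u = {ux, uy, uz}.
def step(pos, u, dt):
    return np.asarray(pos, dtype=float) + np.asarray(u, dtype=float) * dt

pos = np.zeros(3)
for _ in range(100):
    pos = step(pos, u=[0.0, 0.0, -0.1], dt=0.1)  # descend at 0.1 m/s for 10 s
# pos is now approximately [0, 0, -1]
```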

Although the algorithm is applicable to any trajectory, this paper focuses on the most complex mission phase, involving landing on a stationary or moving target. The trajectory is updated based on visual information provided by a camera during the approach phase. The specific requirements are to:

1. reach the target point in a specified time T,
2. arrive and stop at the target point with zero velocity at contact, such that S(t) = {x(t), y(t), z(t), 0, 0, 0} at t = t0 + T,
3. reach the target point from a specific approach direction, as discussed in Section 4.1.

4 Approach

This section overviews the proposed bio-inspired guidance strategy for autonomously landing on a platform, where the perching trajectory is generated assuming that the target can be tracked with a camera. In brief, we use the intrinsic tau guidance strategy to generate tunable 4-D trajectories for smooth landings with static or non-static starting and terminating states. First, we present our method of trajectory parametrization as the basis of our approach. We then summarize the basic principles of general tau theory before detailing our specific guidance technique.

4.1 Trajectory Parametrization

With reference to Fig. 1 in the East-North-Up (ENU) frame, the landing maneuver usually requires arriving at a destination with a final trajectory state that depends on the target morphology and dynamics. The main parameters are:

• the distance gap d(t), closing to zero,
• the relative speed (zero in the case of landing on a static target),
• the approaching angle α(t) of the trajectory with respect to the normal to the EN plane,
• the approaching angle β(t) of the trajectory with respect to the yaw angle between the UAV and target body frames.

d(t) is the instantaneous distance along the LOS between the UAV and the target, given by the difference between the UAV position p_u(t) and the target position p_t(t). The relative speed is given by ṗ_u(t) − ṗ_t(t), which must be zero at touch-down in order to avoid collisions. The approaching angle α(t) is chosen on the basis of the landing approach, and consequently depends on the UAV platform and application: e.g., along the tangent to the surface for a fixed-wing UAV, almost vertical for a rotary-wing UAV, or inclined at a certain angle to optimize entering an opening. The approaching angle β(t) ensures that the body frames of the UAV and the target respect a required orientation at touch-down, e.g., when recharging requires docking between the two.

To introduce these parameters in mission design, we use the intrinsic tau guidance approach presented in Section 4.3.

Fig. 1: Problem definition.

4.2 General Tau Theory

Our guidance method for landing is based on the general tau theory [10], which uses the tau function to represent the TTC of a goal-directed movement. We define the tau variable of a motion gap χ as:

τ_χ = χ(t)/χ̇(t),                 if |χ̇(t)| ≥ χ̇_min,
τ_χ = sgn(χ(t)/χ̇(t)) · τ_max,   if |χ̇(t)| < χ̇_min,   (2)

where χ(t) can be the gap of any state in S(t).
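A direct transcription of Eq. (2) is sketched below; the guard values for χ̇_min and τ_max are illustrative saturation constants, not values from the paper:

```python
# Tau variable of a motion gap chi (Eq. 2): the ratio chi/chi_dot, saturated
# when the gap-closing rate chi_dot is too small to divide by safely.
# chi_dot_min and tau_max are illustrative saturation constants.
def tau(chi, chi_dot, chi_dot_min=1e-3, tau_max=1e3):
    if abs(chi_dot) >= chi_dot_min:
        return chi / chi_dot
    sign = (chi * chi_dot > 0) - (chi * chi_dot < 0)   # sgn of chi/chi_dot
    return sign * tau_max
```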

4.3 Improved Intrinsic Tau Guidance Strategy

Specifically, our landing method uses the improved intrinsic tau guidance strategy [17] to allow for goal-directed movements and, in particular, to start the trajectory from an initial flight condition. We formulate the intrinsic guidance gap of a movement G_v(t) as:

G_v(t) = −0.5 a t² + V_G t + G_0,
Ġ_v(t) = −a t + V_G,
G̈_v(t) = −a,   (3)

where a is the acceleration, V_G is an initial velocity, and G_0 specifies an initial intrinsic gap.

Considering the movement along a generic x-axis from time 0 to T as an example, the position and velocity gaps can be expressed as Δx = x_T − x and Δẋ = ẋ_T − ẋ, respectively, where x_T and ẋ_T denote the goal states at time T. We apply the tau coupling strategy [9, 17] for synchronous gap-closing to obtain the movement states:

x(t) = x_T + ẋ_T (t − T) − (χ_x0 / G_0^(1/k_x)) G_v^(1/k_x),

ẋ(t) = ẋ_T − (χ_x0 / (k_x G_0^(1/k_x))) Ġ_v G_v^(1/k_x − 1),

ẍ(t) = − (χ_x0 / (k_x G_0^(1/k_x))) G_v^(1/k_x − 2) [ ((1 − k_x)/k_x) Ġ_v² + G_v G̈_v ],   (4)

where k_x is a gain parameter controlling gap convergence along the x-axis, as discussed below.

From the definition of G_v, Eq. 4, and Eq. 3, it can be shown that:

G_0 = χ_x0 g T² / (2 (χ_x0 + k_x Δẋ_0 T)),

V_G = k_x Δẋ_0 g T² / (2 (χ_x0 + k_x Δẋ_0 T)).   (5)

G_0 and V_G can be viewed as bonding actual and intrinsic movements due to gravitational effects. If k_x ∈ (0, 0.5), then (x, ẋ, ẍ) → (x_T, ẋ_T, 0) and the position and velocity can be steadily guided to the target values, as required. Hence, varying the elements of k = {k_x, k_y, k_z} within this range allows for modifying the trajectory profile along each axis.
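Eqs. (3)-(5) can be exercised numerically along one axis. The sketch below is illustrative only: it assumes the acceleration a in Eq. (3) equals the gravitational acceleration g appearing in Eq. (5), and uses arbitrary boundary values rather than the authors' implementation:

```python
import numpy as np

def tau_trajectory(x0, v0, xT, vT, T, k, g=9.81):
    """Sketch of improved intrinsic tau guidance along one axis (Eqs. 3-5).

    Returns time, position, and velocity samples over [0, T]. Setting the
    acceleration in Eq. (3) to g is an assumption consistent with Eq. (5).
    """
    chi0 = xT - vT * T - x0                # initial gap implied by Eq. (4) at t = 0
    dv0 = vT - v0                          # initial velocity gap
    denom = 2.0 * (chi0 + k * dv0 * T)
    G0 = chi0 * g * T**2 / denom           # Eq. (5)
    VG = k * dv0 * g * T**2 / denom        # Eq. (5)
    t = np.linspace(0.0, T, 200)
    Gv = np.clip(-0.5 * g * t**2 + VG * t + G0, 0.0, None)  # Eq. (3), guarded at t = T
    dGv = -g * t + VG
    x = xT + vT * (t - T) - (chi0 / G0**(1 / k)) * Gv**(1 / k)       # Eq. (4)
    v = vT - (chi0 / (k * G0**(1 / k))) * dGv * Gv**(1 / k - 1)      # Eq. (4)
    return t, x, v

# Example: start at x = 0 moving at 0.3 m/s, arrive at x = 1 m at rest after 5 s.
t, x, v = tau_trajectory(x0=0.0, v0=0.3, xT=1.0, vT=0.0, T=5.0, k=0.3)
```

With k ∈ (0, 0.5), the sampled position and velocity converge to (x_T, ẋ_T) at t = T, matching the convergence condition stated above.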

Using the tau coupling strategy described by (2) and (3), we can calculate the main gap along the LOS, d(t), and the two angular gaps in terms of time as follows:

d(t) = (d(0) / G_0^(1/k_d)) G_v^(1/k_d),   (6)

α(t) = (α(0) / d(0)^(1/k_α)) d(t)^(1/k_α),   (7)

β(t) = (β(0) / d(0)^(1/k_β)) d(t)^(1/k_β).   (8)

Combining these equations yields the position vector of the UAV:

p(t) = p(T )+

−d(t)sinα(t)cosβ (t)−d(t)sinα(t)sinβ (t)

d(t)cosα(t)

(9)

5 Experimental Set-up

This section details the system architecture and set-up used to test the developed algorithms in outdoor experiments (Section 6.2). The experiments are conducted on an empty 20 × 20 m farmland plot in clear weather conditions (Fig. 2a), where we demonstrate our landing strategy running the algorithms in real-time on an AscTec Firefly (Fig. 2b). Our UAV is equipped with an autopilot providing low- and high-level control, an on-board computer (AscTec Mastermind), and the following sensors:

• 100 Hz IMU,
• 10 Hz Piksi Real Time Kinematic (RTK) differential GPS,
• 20 Hz VI-sensor,
• 50 Hz downward-looking Point Grey Chameleon 2.0 camera.

The landing platform target (Fig. 2c) is an A3 (297 × 420 mm) arrangement of variable-dimension tags, obtained from the ar_track_alvar library. The nested tag layout on the platform allows for detecting and estimating the camera-to-target relative pose from different altitudes. In this paper, we use a static platform for proof of concept and leave the study of moving targets to future work.

The logical architecture of the algorithms running on the UAV is described in Fig. 3, which depicts the different interacting modules. In particular, the input data (Fig. 3, left) include: GPS measurements, images from the downward-looking camera and the VI-sensor, and accelerometer and gyroscope measurements from the VI-sensor and the UAV IMU.

For navigation, the UAV pose is estimated by integrating VI-sensor and GPS data within the RObust Visual Inertial Odometry (ROVIO) framework [1]. This is then integrated with IMU data in the Multi-Sensor Fusion (MSF) framework [12] to obtain a refined state estimate (Fig. 3). Guidance is based on the improved tau guidance strategy (red block in Fig. 3), which uses the actual UAV state and the camera-to-target relative pose to generate reference trajectories that are then tracked by a non-linear Model Predictive Controller (MPC) [7].


Fig. 2: (a) shows our experimental set-up on the field, with the GPS RTK base station visible on the right. (b) depicts the AscTec Firefly positioned on the landing platform. (c) exemplifies the tag arrangement on the platform as seen by the on-board camera.

Fig. 3: System diagram for our outdoor experiments. A camera is used to track the landing platform pose, and localization is provided by the MSF framework output. The guidance unit passes reference trajectories to the MPC. Note that our RTK GPS requires a ground base station (Fig. 2a) to receive satellite signals and transmit position corrections to the UAV.

6 Results

This section demonstrates the usage of our improved tau guidance strategy to land smoothly on a target platform. We first validate our method in simulation before presenting results from outdoor experiments.

6.1 Simulations

The performance of our proposed landing method is validated preliminarily in simulation. To achieve this, we use a simulation environment developed in the RotorS [5] framework, which realistically replicates the AscTec Firefly flight dynamics and its on-board sensors. For the IMU, the orders of magnitude of the biases and random errors are set to be consistent with the Micro Electro-Mechanical Systems (MEMS) sensors used on the AscTec Firefly (Table 1). Furthermore, the GPS position uncertainty in the ENU frame is modeled as a constant bias plus Gaussian white noise to account for the correlation of GPS errors in time. For vision sensing, only the downward-looking camera was simulated, with a LOS uncertainty modeled as Gaussian white noise with a standard deviation of 0.05°. This camera is used to detect the landing target on the ground and to obtain its relative pose with respect to the UAV. Consequently, navigation is achieved here without simulating ROVIO, but only by integrating simulated position and attitude measurements within the MSF framework.

Sensors        | Bias instability | Random walk
Gyroscopes     | 14.5°/hr         | 0.66°/√hr
Accelerometers | 0.11 m/s/√hr     | 0.25 mg

Table 1: Assumptions on the simulated bias and random errors.

The simulation was run in order to validate and integrate each component of the software framework before the real platform experiments (Section 6.2). Since we focus on rotary-wing UAVs, our interest is also to land on the platform with a desired orientation α(t) and zero velocity. Hence, simulations have been performed to evaluate the behavior of the trajectory shapes for varying coefficients k, with an emphasis on vertical landing. Fig. 4a shows a comparison of different landing trajectories where the maneuver starts from a height of 1.5 m and east and north components of 0.8 m each. This figure shows how the 3-D shape of each trajectory is determined by coupling the coefficients k. In addition, Fig. 4b shows the velocity and acceleration profiles during the almost vertical landing trajectory (blue curve in Fig. 4a), demonstrating that zero velocity is reached at the end of the landing maneuver.
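The GPS error model described above (a constant bias plus Gaussian white noise in the ENU frame) can be sketched as follows; the bias and noise magnitudes are illustrative assumptions, not the values used in the paper:

```python
import numpy as np

# Sketch of the simulated GPS position error in the ENU frame: a constant
# bias drawn once per run (capturing time-correlated error) plus Gaussian
# white noise per measurement. Magnitudes are illustrative assumptions.
rng = np.random.default_rng(0)
bias_enu = rng.normal(0.0, 0.5, size=3)        # constant per-run bias (m)

def gps_measurement(true_pos_enu, sigma=0.02):
    noise = rng.normal(0.0, sigma, size=3)     # white noise (m)
    return np.asarray(true_pos_enu, dtype=float) + bias_enu + noise
```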

Fig. 4: (a) depicts the landing trajectories with different values for the elements of k (k = {0.15, 0.15, 0.45}, {0.45, 0.45, 0.45}, and {0.45, 0.45, 0.15}). Changing these parameters allows for generating different trajectory profiles, including LOS (black), almost vertical (blue), and almost horizontal (red). (b) shows the velocity and acceleration norm profiles for the almost vertical (blue) landing curve in (a).


6.2 Outdoor Experiments

We conducted outdoor experiments using the set-up described in Section 5. In each experiment, the UAV was commanded to:

• take-off and perform its mission,
• fly back to the landing area,
• fly a search path to detect the target, and
• safely land on the target point in a specified time.

Figs. 5 and 6 illustrate the phases of the autonomous landing, where the initial phase is target detection (at an altitude of about 2 m in Fig. 5c), followed by a refinement phase to reduce the noise. Once the target is detected and tracked, the UAV starts the approach phase, during which the camera continues to track the target to update the relative pose. If the difference between successive relative positions is larger than a certain tolerance, the trajectory is updated accordingly and passed to the non-linear MPC.
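The update rule described above, regenerating the reference trajectory only when successive camera-based relative-position estimates differ by more than a tolerance, can be sketched as follows; the 5 cm tolerance and the function names are illustrative assumptions:

```python
import numpy as np

# Regenerate the reference trajectory only when successive relative-position
# estimates from the tag tracker differ by more than a tolerance; smaller
# differences are treated as estimation noise. The tolerance is illustrative.
def needs_update(prev_rel_pos, new_rel_pos, tol=0.05):
    delta = np.asarray(new_rel_pos, dtype=float) - np.asarray(prev_rel_pos, dtype=float)
    return bool(np.linalg.norm(delta) > tol)
```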

These phases and results are clearly visible in Fig. 5, where the UAV position components in ENU are shown. In particular, by comparing the reference trajectory generated by the improved tau guidance strategy with the UAV position output by the MSF framework, it is evident that the controller follows the reference trajectory along the entire maneuver up to touch-down. Fig. 7 confirms that the final landing position remains within a maximum dispersion of a few centimeters with respect to the center of the target. It is worth mentioning that if the target is lost for a certain time interval during the descent maneuver, the UAV is commanded to climb to increase the camera footprint. The UAV returns to the landing path only after the target is detected and tracked again. In the figures, this effect is manifested where the target was lost at a height of about 20 cm, mainly due to the UAV shadow causing occlusions that impeded robust tracking. As a consequence, the UAV started to climb and successfully performed a smooth landing after target re-detection.

Fig. 6 compares the UAV position and velocity along the three coordinate axes. The plots illustrate the velocity profile of the landing maneuver, designed to reach the softest possible touch-down while achieving the precision necessary to land very close to the target center.

7 Conclusions and Future Work

We present a guidance approach using the improved intrinsic tau guidance law for autonomous UAV landing on a static landing platform. The guidance theory generates smooth and computationally efficient 4-D trajectories that are suitable for both fixed- and rotary-wing UAV platforms. A variety of trajectory designs can be easily achieved by varying the guidance coefficients. We have demonstrated its reliability through a realistic simulator and several field experiments.


Fig. 5: Difference between the reference trajectory and the MSF estimate during the final phases of landing: (a) E, (b) N, and (c) U position components vs. time.

Fig. 6: Velocity profile during the landing maneuver, with the position profile as a reference: (a) E, (b) N, and (c) U components vs. time.


Fig. 7: UAV landing position with respect to the target position in the EN plane, for three trajectories and the landing platform position.

Future work will consider ways of developing a reliable target tracker based on the current visual target detector, GPS, and IMU measurements to estimate the target pose with respect to the UAV. This will allow robust tracking of a target in challenging conditions, such as dynamic motion, and even when the target is out of sight of the camera. The development of an accurate and reliable target tracker will also enable landing on a moving platform. Finally, we are interested in applying the proposed guidance law to a fixed-wing landing mission.

Acknowledgement

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 644227 and from the Swiss State Secretariat for Education, Research and Innovation (SERI) under contract number 15.0029. We would like to thank Marco Tranzatto and Michael Pantic for their valuable insights and platform support, and the ETH Crop Science Group for providing the testing facilities.

References

[1] Bloesch M, Omari S, Hutter M, Siegwart R (2015) Robust Visual Inertial Odometry Using a Direct EKF-Based Approach. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, Hamburg, pp 298–304

[2] Cocchioni F, Mancini A, Longhi S (2014) Autonomous Navigation, Landing and Recharge of a Quadrotor using Artificial Vision. In: International Conference on Unmanned Aircraft Systems, pp 418–429

[3] Dunbabin M, Marques L (2012) Robots for environmental monitoring: Significant advancements and applications. IEEE Robotics and Automation Magazine 19(1):24–39

[4] Ezequiel CAF, Cua M, Libatique NC, Tangonan GL, Alampay R, Labuguen RT, Favila CM, Honrado JLE, Canos V, Devaney C, Loreto AB, Bacusmo J, Palma B (2014) UAV Aerial Imaging Applications for Post-Disaster Assessment, Environmental Management and Infrastructure Development. In: International Conference on Unmanned Aircraft Systems, pp 274–283

[5] Furrer F, Burri M, Achtelik M, Siegwart R (2016) RotorS—A Modular Gazebo MAV Simulator Framework. Springer International Publishing, Volume 1, pp 595–625

[6] Gautam A, Sujit PB, Saripalli S (2014) A Survey of Autonomous Landing Techniques for UAVs. In: International Conference on Unmanned Aircraft Systems, pp 1210–1218

[7] Kamel M, Stastny T, Alexis K, Siegwart R (????) Model predictive control for trajectory tracking of unmanned aerial vehicles using robot operating system. Robot Operating System, The Complete Reference 2

[8] Kendoul F (2012) Survey of Advances in Guidance, Navigation, and Control of Unmanned Rotorcraft Systems. Journal of Field Robotics 29(2):315–378

[9] Kendoul F (2013) Four-dimensional guidance and control of movement using time-to-contact: Application to automated docking and landing of unmanned rotorcraft systems. The International Journal of Robotics Research 33(2):237–267

[10] Lee DN (1976) A theory of visual control of braking based on information about time-to-collision. Perception 5(4):437–459

[11] Lee DN, Davies MNO, Green PR, Van der Weel FR (1993) Visual control of velocity of approach by pigeons when landing. Journal of Experimental Biology 180:85–104

[12] Lynen S, Achtelik MW, Weiss S, Chli M, Siegwart R (2013) A Robust and Modular Multi-Sensor Fusion Approach Applied to MAV Navigation. In: IEEE/RSJ International Conference on Intelligent Robots and Systems, IEEE, Tokyo, pp 3923–3929

[13] Nex F, Remondino F (2014) UAV for 3D mapping applications: A review. Applied Geomatics 6(1):1–15

[14] Popovic M, Hitz G, Nieto J, Sa I, Siegwart R, Galceran E (2016) Online informative path planning for active classification using UAVs. arXiv preprint arXiv:1609.08446

[15] Strydom R, Denuelle A, Srinivasan MV (2016) Bio-Inspired Principles Applied to the Guidance, Navigation and Control of UAS. Aerospace 3(21)

[16] Yamasaki T, Sakaida H, Enomoto K, Takano H (2007) Robust Trajectory-Tracking Method for UAV Guidance. In: International Conference on Control, Automation and Systems, IEEE, Seoul, pp 1404–1409

[17] Yang Z, Fang Z, Li P (2016) Decentralized 4D Trajectory Generation for UAVs Based on Improved Intrinsic Tau Guidance Strategy. International Journal of Advanced Robotic Systems 13(3):1–13

[18] Yoon S, Kim Y, Kim S (2008) Pursuit Guidance Law and Adaptive Backstepping Controller Design for Vision-Based Net-Recovery UAV. In: AIAA Guidance, Navigation and Control Conference and Exhibit, AIAA, Honolulu, HI, pp 1–33