



Annotation-Based Assistance System for Unmanned Helicopter with Wearable Augmented Reality Environment

Masanao KOEDA, Yoshio MATSUMOTO, Tsukasa OGASAWARA
Nara Institute of Science and Technology
8916-5 Takayama, Ikoma, Nara, 630-0192 Japan

Abstract

In this paper, we introduce an annotation-based assistance system for an unmanned helicopter with a wearable augmented reality environment. In this system, an operator controls the helicopter remotely while watching an annotated view from the helicopter through a head mounted display (HMD), with a laptop PC carried in a backpack. Annotations assist the operation by indicating the status of the helicopter and the names of nearby buildings. The position and the attitude of the helicopter are measured by a GPS receiver and a gyroscope, and sent to the operator's PC via a wireless LAN.

1. Introduction

Unmanned helicopters are currently used for various purposes, such as crop-dusting and remote sensing. However, it is difficult for an operator to control an unmanned helicopter remotely. One reason is that the operator cannot perceive its attitude when he/she is far away from the helicopter. Another reason is that the relationship between the helicopter's coordinate system and the operator's changes drastically depending on the attitude of the helicopter. To address these problems, several studies have been made on autonomous helicopters [1, 2]. Autonomous helicopters need pre-determined landmarks or flight paths in order to fly, so they are not suitable for flight tasks where the situation changes every minute, such as disaster relief. Additionally, many on-board sensors and computers are needed for control. Since the payload of a helicopter is severely limited, autonomous helicopters tend to be large, heavy, and expensive.

We previously proposed an immersive teleoperation system for unmanned helicopters using an omnidirectional camera (Figure 1) [3]. In this system, the operator controls the helicopter remotely by viewing the surroundings of the helicopter through an HMD. The advantage of this system is that only a camera and a transmitter need to be installed on the helicopter. Therefore it is possible to use a compact helicopter with a small payload, and to keep the system lightweight and cheap. Additionally, it becomes easier to control the unmanned helicopter because the coordinate system between the helicopter and the operator does not change even when the attitude of the helicopter changes.

Figure 1. Operation of Unmanned Helicopter and Captured Images: (a) Helicopter, (b) Operator, (c) Omnidirectional Image, (d) Perspective Image

Furthermore, the operator can retain control even when the helicopter is out of the operator's sight, as long as the video image can reach the operator and the helicopter can receive the control signal. However, in this system, control becomes impossible when the image transmission fails or visibility is poor. In addition, the resolution of a perspective image generated from the omnidirectional image is low, so the operator has trouble seeing distant objects. To solve these problems, we are developing an annotation-based assistance system for an unmanned helicopter.
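For illustration, a minimal sketch of how a perspective view can be generated from an omnidirectional image for a given gaze direction is shown below. It assumes an equirectangular panorama and a pinhole output model; the camera in our system is mirror-based, so the actual unwarping geometry differs, and the function and parameter names here are illustrative only.

import numpy as np

def perspective_view(pano, pan, tilt, fov_deg=60.0, out_w=640, out_h=480):
    """pano: H x W x 3 equirectangular image; pan/tilt: gaze direction [rad]."""
    h, w = pano.shape[:2]
    f = (out_w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)  # focal length [px]

    # Pixel grid of the output image, centered on the principal point
    xs, ys = np.meshgrid(np.arange(out_w) - out_w / 2.0,
                         np.arange(out_h) - out_h / 2.0)
    # Ray directions in the camera frame (x right, y down, z forward)
    rays = np.stack([xs, ys, np.full_like(xs, f)], axis=-1)
    rays /= np.linalg.norm(rays, axis=-1, keepdims=True)

    # Rotate the rays by the gaze direction: tilt about x, then pan about y
    Rx = np.array([[1, 0, 0],
                   [0, np.cos(tilt), -np.sin(tilt)],
                   [0, np.sin(tilt),  np.cos(tilt)]])
    Ry = np.array([[ np.cos(pan), 0, np.sin(pan)],
                   [ 0,           1, 0          ],
                   [-np.sin(pan), 0, np.cos(pan)]])
    d = rays @ (Ry @ Rx).T

    # Spherical coordinates of each ray, then nearest-neighbour sampling
    lon = np.arctan2(d[..., 0], d[..., 2])              # -pi .. pi
    lat = np.arcsin(np.clip(d[..., 1], -1.0, 1.0))      # -pi/2 .. pi/2
    u = ((lon / (2 * np.pi) + 0.5) * (w - 1)).astype(int)
    v = ((lat / np.pi + 0.5) * (h - 1)).astype(int)
    return pano[v, u]

# e.g. view = perspective_view(pano, pan=head_pan, tilt=head_tilt)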

2. Annotation-Based Assistance System for Unmanned Helicopter

Figures 2 and 3 show the configuration and the overview of this system. On the helicopter, an omnidirectional camera, a GPS receiver, a gyroscope, and a PC with a wireless LAN are mounted at the bottom. Position/attitude data and an omnidirectional image are sent to the operator during the flight via the wireless LAN.
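The paper states only that the position/attitude data travel over the wireless LAN; the following sketch of the on-board sender assumes UDP and an ad-hoc packet layout, neither of which is specified, and the address and field order are purely illustrative.

import socket
import struct
import time

OPERATOR_ADDR = ("192.168.0.10", 5005)   # hypothetical address of the operator's PC

def send_state(sock, lat, lon, alt, roll, pitch, yaw):
    # Pack a timestamp, the GPS position and the gyroscope attitude as 7 doubles
    packet = struct.pack("!7d", time.time(), lat, lon, alt, roll, pitch, yaw)
    sock.sendto(packet, OPERATOR_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# e.g. send_state(sock, 34.728, 135.737, 85.0, 0.01, -0.02, 1.57)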


Figure 2. System Configuration (helicopter: omnidirectional camera, DV converter, GPS with USB-serial converter, gyroscope, and note PC with wireless LAN, connected via IEEE1394, NTSC, USB, and RS-232C; data relay station: wireless LAN access point with external antenna; operator: HMD, head tracker, and note PC with wireless LAN)

Figure 3. System Overview (omnidirectional camera, gyroscope, GPS, DV converter, note PC, and wireless LAN mounted on the helicopter)

On the ground, a perspective image is generated from the received omnidirectional image and displayed on the HMD that the operator wears. The displayed image changes depending on the head direction, which is measured by the gyroscope attached to the HMD. The operator has an annotation database which consists of the names and the positions of neighboring buildings. Using this database and the current position and attitude of the helicopter, annotations are overlaid on the perspective image which the operator is observing. The direction of the nose of the helicopter, the ground speed, a map, and the operator's head attitude are also displayed on the image. The operator holds a controller with both hands and controls the helicopter while wearing a backpack which contains a laptop PC.
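A minimal sketch of this overlay step is given below. It assumes a flat-earth east/north approximation, a pinhole projection, and a simple composition of the helicopter heading with the HMD pan/tilt; the dictionary keys, field of view, and image size are illustrative assumptions, not the exact formulation used in the system.

import numpy as np

EARTH_R = 6378137.0   # equatorial radius [m]

def gps_to_enu(lat, lon, ref_lat, ref_lon):
    """Approximate east/north offsets [m] of (lat, lon) from a reference point."""
    east = np.radians(lon - ref_lon) * EARTH_R * np.cos(np.radians(ref_lat))
    north = np.radians(lat - ref_lat) * EARTH_R
    return east, north

def annotation_pixel(building, heli, head, img_w=640, img_h=480, fov_deg=60.0):
    """Return the (u, v) pixel for building["name"], or None if it is behind the view."""
    e, n = gps_to_enu(building["lat"], building["lon"], heli["lat"], heli["lon"])
    up = building.get("alt", 0.0) - heli["alt"]

    # Total viewing direction: helicopter heading plus the HMD pan/tilt [rad]
    yaw = heli["yaw"] + head["pan"]
    tilt = head["tilt"]

    # Rotate the local offset into the view frame (x right, y down, z forward)
    right = np.cos(yaw) * e - np.sin(yaw) * n
    fwd = np.sin(yaw) * e + np.cos(yaw) * n
    z_cam = np.cos(tilt) * fwd + np.sin(tilt) * up
    y_cam = np.sin(tilt) * fwd - np.cos(tilt) * up   # image y axis points down

    if z_cam <= 0.0:                                 # building is behind the view
        return None
    f = (img_w / 2.0) / np.tan(np.radians(fov_deg) / 2.0)
    u = int(img_w / 2.0 + f * right / z_cam)
    v = int(img_h / 2.0 + f * y_cam / z_cam)
    return u, v                                      # skip drawing if outside the image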

3. Experiment

We have carried out an experiment using the system around our campus. In order to cover a wide area safely, the experiment was conducted with the helicopter mounted on a car. Figure 4 shows the route and the GPS data, and Figure 5 shows the annotation overlay images. The position is measured correctly, and the annotations are overlaid on the actual buildings accurately.
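For reference, a sketch of converting the raw GPS output into the decimal-degree positions plotted in Figure 4 is shown below, assuming the receiver emits standard NMEA GGA sentences (the paper does not state the actual output format, and the example sentence is illustrative).

def parse_gga(sentence):
    """Extract decimal-degree (latitude, longitude) from a $GPGGA sentence."""
    fields = sentence.split(",")
    if not fields[0].endswith("GGA") or fields[2] == "":
        return None

    def to_deg(value, hemisphere):
        d, m = divmod(float(value), 100.0)          # ddmm.mmmm -> degrees, minutes
        deg = d + m / 60.0
        return -deg if hemisphere in ("S", "W") else deg

    return to_deg(fields[2], fields[3]), to_deg(fields[4], fields[5])

# e.g. parse_gga("$GPGGA,123519,3443.70,N,13544.18,E,1,08,0.9,85.0,M,,,")
# -> approximately (34.728, 135.736)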

4. Conclusion

We have proposed an annotation-based assistance system for an unmanned helicopter with a wearable augmented reality environment, and showed that our system can display annotations robustly over a wide outdoor area.

Acknowledgement

This research is partly supported by the Core Research for Evolutional Science and Technology (CREST) Program "Advanced Media Technology for Everyday Living" of the Japan Science and Technology Agency (JST).

Figure 4. Route and GPS Data (latitude/longitude plot of the measured route, with points A-H marked; axis ticks roughly 34.43.62-34.43.76 (latitude) and 135.44.10-135.44.25 (longitude))

Figure 5. Annotation Overlay Images (panels A-H)

References

[1] Ryan Miller, Omead Amidi, and Mark Delouis, "Arctic Test Flights of the CMU Autonomous Helicopter", Proceedings of the Association for Unmanned Vehicle Systems International, 1999.

[2] Kale Harbick, James F. Montgomery, and Gaurav S. Sukhatme, "Planar Spline Trajectory Following for an Autonomous Helicopter", Journal of Advanced Computational Intelligence and Intelligent Informatics, Vol. 8, No. 3, pp. 237-242, 2004.

[3] Masanao Koeda, Yoshio Matsumoto, and Tsukasa Ogasawara, "Development of an Immersive Teleoperating System for Unmanned Helicopter", IAPR Workshop on Machine Vision Applications, pp. 220-223, 2002.