


Humanitarian Technology

2015 Humanitarian Robotics and Automation Technology Challenge

By Raj Madhavan, Lino Marques, Edson Prestes, Renan Maffei, Vitor Jorge, Baptiste Gil, Sedat Dogru, Gonçalo Cabrita, Renata Neuland, and Prithviraj Dasgupta

Organized by the IEEE Robotics and Automation Society (RAS) Special Interest Group on Humanitarian Technology (SIGHT), the Humanitarian Robotics and Automation Technology Challenge (HRATC) provides a unique opportunity for the robotics and automation (RA) community from around the world to collaborate using its skills and education to benefit humanity. The RAS SIGHT's mission is the application of RA technologies to promote humanitarian causes in collaboration with global communities and organizations [1].

Started in 2014, the HRATC brings together researchers, students, and roboticists from academia and industry toward realizing a cost-effective, reliable, and sustainable solution to the age-old problem of detecting and classifying the locations of land mines scattered throughout the globe, sad remnants of war and conflict. Countless people, including children, have been maimed and killed as a result of stepping on land mines buried too close to inhabited areas [2]. The challenge occurs in three phases: 1) simulation, 2) testing, and 3) the finals. Teams are progressively eliminated after each phase, and the remaining teams move on to the next phase, culminating in the challenge (finals) phase. The teams do not need to purchase or build a robot instrumented with sensors or develop any of the accompanying software, and every team can participate remotely in each of the phases. The main goals of the challenge are to

● develop free and open-source software for the reliable and robust detection and classification of land mines and their subsequent clearance

● inspire, encourage, and educate researchers and students on the benefits of deploying RA technologies for the benefit of humanity

● provide a platform for exchanging ideas on addressing pressing needs across the globe via RA technologies.

For more details on the HRATC'14 phases and accompanying frameworks, the reader is directed to [3].

The HRATC'15 framework runs on a Linux/Robot Operating System (ROS) environment and is responsible for connecting the team code to the robot. The framework also offers simulation scenarios, visualization tools, and scoring metrics. Figure 1 shows the software architecture. This framework has the same core as the HRATC'14 framework; however, the evaluation software (the HRATC 2015 Judge) was moved from Python to C++ to improve performance, and the visualization system was modified to use RVIZ, making it consistent with the ROS standard interfaces and easier to use for ROS users.

In the simulation phase, as shown in Figure 1(a), sensor data, such as camera images and laser range-finder readings, are simulated by Gazebo through the Husky modules, while the metal-detector information is simulated by a custom module based on previously collected data. In the testing phase, as shown in Figure 1(b), the Husky robot provides the sensor data, including that for metal detection.
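Because the framework mediates between the team code and either Gazebo or the real Husky, a team's entry is, in essence, a set of ordinary ROS nodes. The following is a minimal sketch of what such a team-side node might look like; the topic names ("/scan", "/cmd_vel"), the node name, and the behavior are illustrative assumptions and not the actual HRATC'15 entry-template interfaces.

```python
#!/usr/bin/env python
# Minimal team-side node sketch. The topic names "/scan" and "/cmd_vel" are
# assumptions for illustration, not the real HRATC'15 entry-template topics.
import rospy
from geometry_msgs.msg import Twist
from sensor_msgs.msg import LaserScan

closest_range = float("inf")


def scan_callback(scan):
    # Track the nearest valid laser return as a stand-in for real obstacle
    # handling; the callback is identical whether the data come from Gazebo
    # (simulation phase) or from the Husky (testing phase).
    global closest_range
    valid = [r for r in scan.ranges if scan.range_min < r < scan.range_max]
    closest_range = min(valid) if valid else float("inf")


def main():
    rospy.init_node("hratc_team_node")
    rospy.Subscriber("/scan", LaserScan, scan_callback)
    cmd_pub = rospy.Publisher("/cmd_vel", Twist, queue_size=1)
    rate = rospy.Rate(10)
    while not rospy.is_shutdown():
        cmd = Twist()
        # Creep forward while the path looks clear; stop otherwise.
        cmd.linear.x = 0.2 if closest_range > 0.8 else 0.0
        cmd_pub.publish(cmd)
        rate.sleep()


if __name__ == "__main__":
    main()
```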

Figure 2 shows the HRATC framework visualization in RVIZ, with the metrics used by the HRATC judge to compute each team's score. As in the first edition of the HRATC, the score computed by the judge comprises three components: 1) mine-detection effectiveness, 2) coverage area, and 3) execution time.

Based on our experience from 2014, we decided to penalize teams that "explode" the robot, eliminating those that pass over a mine more than once. We also penalized teams that were too conservative and inactive. Thus, the HRATC'15 scoring metric was slightly different from that of the first edition of the challenge. This enabled us to assess each team's performance and, at the same time, penalize inactive teams. In the real world, poor performance or inactivity would imply lost assets, substantial costs, and, possibly, lives.
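The article names the three score components and the new penalties but not the judge's exact formula, so the sketch below is purely illustrative: the weights, the inactivity penalty factor, and the function name are assumptions, not the rules implemented in the HRATC 2015 Judge.

```python
# Purely illustrative scoring sketch; the real HRATC'15 judge (written in C++)
# uses its own formulas and weights, which are not reproduced in this article.
def illustrative_score(mines_detected, mines_total, covered_cells, total_cells,
                       elapsed_s, time_limit_s, mine_hits, was_inactive):
    detection = mines_detected / float(mines_total)               # mine-detection effectiveness
    coverage = covered_cells / float(total_cells)                 # coverage area
    time_score = max(0.0, 1.0 - elapsed_s / float(time_limit_s))  # execution time

    score = 0.5 * detection + 0.3 * coverage + 0.2 * time_score   # example weights
    if mine_hits > 1:      # passing over a mine more than once eliminates the team
        return 0.0
    if was_inactive:       # overly conservative/inactive behavior is penalized
        score *= 0.5       # example penalty factor
    return score
```

For example, a team that finds two of three mines, covers 80% of the field in half the allotted time, and never touches a mine would score 0.5·0.67 + 0.3·0.8 + 0.2·0.5 ≈ 0.67 under these assumed weights.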

In the 2015 challenge, Clearpath's Husky robot, shown in Figure 3, was upgraded with a new two-degree-of-freedom arm with compliance in both motion axes, so, if the arm hits the ground or an obstacle, there will be no major damage to the system. The end-effector position is measured by means of absolute encoders attached to the arm's links. The sensor-supporting bridge was also changed so that the arm could have a very large sweeping range, making it possible to place the arm above the robot's body for compact transportation [4].

Digital Object Identifier 10.1109/MRA.2015.2452199
Date of publication: 11 September 2015


HRATC'15 introduced some changes to the environment, mimicking a more realistic demining operation. The teams had to start from outside the field, and they had to deal with some bush-like obstacles that were added to the test field. The teams had the chance to perceive the environment using either a stereo camera system or a tilted scanning laser range finder, which provides a three-dimensional point-cloud representation of the environment. Given its simplicity and reliability, all the teams chose to use only the laser to detect the obstacles and to extract the ground profile.
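As a rough illustration of ground-profile extraction, one simple approach is to estimate the ground height from the tilted laser's point cloud and split the points by a height tolerance. This is an assumed sketch of one possible method, not the approach the teams actually implemented; the function name and tolerance are illustrative.

```python
# Minimal sketch of separating ground points from obstacle points in the
# tilted laser's point cloud. A height threshold is only one possible approach
# and is not necessarily what the HRATC'15 teams used.
import numpy as np


def split_ground_and_obstacles(points_xyz, ground_tolerance=0.05):
    """points_xyz: (N, 3) array of points expressed in the robot's base frame."""
    z = points_xyz[:, 2]
    ground_height = np.percentile(z, 10)            # robust estimate of the ground level
    is_ground = np.abs(z - ground_height) < ground_tolerance
    return points_xyz[is_ground], points_xyz[~is_ground]
```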

Similar to HRATC'14, the field was still defined by four corners; however, these did not necessarily define a rectangle. For the next edition, we plan to provide an aerial image of the minefield and a list of coordinates defining an arbitrary convex polygon. This year, we used three surrogate mines buried in the ground and some metal debris (e.g., cans and metal bars) placed on the field as mock mines. The "real" mines contained only a small metal part, making detection as difficult as detecting a land mine with low metal content. The teams had the chance to use the robot three times before the finals. The results of these tests (videos and ROS bag files of the testing runs) were provided to the teams along with constructive feedback on how to better detect the mines and navigate on the field. The teams used all the opportunities to improve their source code, and all showed significant improvements during the course of the challenge.
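Because the corners need not form a rectangle, and an arbitrary convex polygon is planned for 2016, an entry has to keep its waypoints inside the field boundary. A minimal convex point-in-polygon test might look as follows; the counterclockwise corner ordering and the function name are assumptions for illustration.

```python
# Illustrative check that a candidate waypoint lies inside the minefield,
# treating the corner coordinates as a convex polygon listed counterclockwise
# (an assumption; the challenge only guarantees that corner coordinates are given).
def inside_convex_field(corners, point):
    x, y = point
    n = len(corners)
    for i in range(n):
        x1, y1 = corners[i]
        x2, y2 = corners[(i + 1) % n]
        # For a counterclockwise polygon, the cross product must be
        # non-negative for every edge if the point is inside.
        if (x2 - x1) * (y - y1) - (y2 - y1) * (x - x1) < 0.0:
            return False
    return True
```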

In the finals, the teams were allowed two runs on different days, thereby providing them with a chance to modify their source code. The best run of each team was taken to arrive at the final rankings. While deceptively straightforward, the challenge is much more than merely moving the robot platform, detecting and classifying mines, and moving again. The teams must deal with several inherent layers of complexity, including subtasks such as appropriate minefield mapping, obstacle and land-mine avoidance, and proper arm control, in addition to a robust mine-detection algorithm.
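For the coverage subtask, a back-and-forth (boustrophedon) sweep over the field's bounding box is one simple baseline. The sketch below is an assumed illustration, not the planner used by any finalist team; a real entry would additionally have to respect the field boundary, avoid obstacles, and steer clear of detected mines.

```python
# Minimal boustrophedon (lawnmower) waypoint sketch over the field's bounding
# box; the lane spacing would in practice be tied to the metal detector's
# sweep width, which is assumed here.
def lawnmower_waypoints(corners, lane_spacing=0.4):
    xs = [c[0] for c in corners]
    ys = [c[1] for c in corners]
    x_min, x_max = min(xs), max(xs)
    y_min, y_max = min(ys), max(ys)
    waypoints, y, forward = [], y_min, True
    while y <= y_max:
        row = [(x_min, y), (x_max, y)]
        waypoints.extend(row if forward else row[::-1])
        y += lane_spacing
        forward = not forward
    return waypoints
```

Waypoints falling outside an irregular field could then be filtered with a point-in-polygon test like the one sketched above.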

[Figure 1 diagram: in both the (a) simulation and (b) testing configurations, the hratc2015_framework connects the HRATC 2015 Entry Template and the teams' ROS nodes to the HRATC 2015 Judge and the RVIZ visualization within the ROS ecosystem; sensor data come from the Gazebo simulator and a metal-detector simulator in (a) and from the fsr_husky_robot in (b).]

Figure 1. The software architecture for (a) the simulation and (b) the testing phases.

Figure 2. The HRATC’15 framework visualization using RVIZ.


HRATC'15 started with 15 teams in the simulation phase, which lasted for 12 weeks. Based on their performance, eight teams progressed to the six-week testing phase. This, in turn, was followed by five teams qualifying for the finals, colocated with the Robot Challenge at the 2015 International Conference on Robotics and Automation (ICRA) in Seattle, Washington, 26–27 May. The finals took place remotely, similar to the testing phase, but the results were streamed via a live YouTube channel. The National University of Singapore's Team NUS was declared the overall winner, and Team ORION from the University of Texas at Arlington was the runner-up (Figure 4). In addition to certificates for the finalists, the top finishers received US$1,000 and US$500, respectively.

For 2016, we are developing a new robot that will carry a ground-penetrating radar array. We are introducing this second robot for the next challenge, giving the teams an opportunity to implement multiagent mine-scanning and sensor-fusion techniques. Another aspect we consider important for next year is to encourage the teams to use vision, which is an indispensable tool in field robotics. Integrating vision will also stimulate the participation of larger teams with various backgrounds, further improving teamwork. To reinforce this, we will provide a degraded global positioning system signal, so the teams will have to rely on visual odometry and other sensing means for accurate localization in the field.

The call for the 2016 challenge will be published in October 2015, with the deadline for applications in November 2015. You can peruse information related to this year's challenge, including rules and frequently asked questions, at http://www.isr.uc.pt/HRATC2015. A summary video is available at http://www.isr.uc.pt/HRATC2015/Lookback.html. We look forward to your participation in HRATC'16!

Acknowledgments
Special thanks are due to IEEE SIGHT for its sponsorship of the prizes. This challenge was partially supported by the European Union Seventh Framework Program TIRAMISU project (http://www.fp7-tiramisu.eu) under grant 284747 and by Clearpath's PartnerBot Program under grant PB12-024. Brazil's Conselho Nacional de Desenvolvimento Científico e Tecnológico program is acknowledged for its partial financial support.

References
[1] R. Madhavan, "RAS-SIGHT formed [society news]," IEEE Robot. Automat. Mag., vol. 20, no. 1, p. 115, 2013.
[2] UN Mine Action Service (UNMAS). [Online]. Available: http://www.mineaction.org/
[3] R. Madhavan, L. Marques, E. Prestes, P. Dasgupta, D. Portugal, B. Gouveia, V. Jorge, R. Maffei, G. Franco, and J. Garcia, "2014 humanitarian robotics and automation technology challenge," IEEE Robot. Automat. Mag., vol. 21, no. 3, pp. 10–16, 2014.
[4] G. Cabrita, R. Madhavan, and L. Marques, "A framework for remote field robotics competitions," in Proc. IEEE Int. Conf. Autonomous Robot Systems and Competitions, Apr. 2015, pp. 192–197.

Figure 3. The FSR Husky robot employed during the HRATC'15 challenge.

Figure 4. The HRATC'15 overall winner, Team NUS: Haoyu Bai (left), Team NUS member, and Sonia Chernova, ICRA'15 Robot Challenge cochair.

