Autonomous Vision-based Underwater Robot Competition

ZHENG Xingwen, WANG Wei, XIE Guangming

State Key Laboratory for Turbulence and Complex Systems, College of Engineering, Peking University, Beijing 100871, P. R. China
E-mail: [email protected]; [email protected]; [email protected]

Abstract: Aiming to promote underwater robotics and automation technology, this paper introduces an autonomous vision-based underwater robot competition. The competition is carried out in the forms of racing, chasing, and water polo games played by intelligent biomimetic robotic fish. It involves soccer-robot technology, hydrodynamic analysis, underwater communication, image processing, anti-interference technology, and so on. The competition is characterized by bionics and intelligent cooperation. It provides a standard platform for testing and verifying the collaboration technology of underwater robots.

Key Words: water polo game; ball-heading algorithm; autonomous vision; boxfish; underwater localization; autonomous image processing

1 Introduction

Robot competition plays a significant role in promoting robot technology and its related disciplines. RoboCup (the RoboCup World Championship and Conference [1]) is the highest-ranked robot soccer competition in the world. Its most significant aim is to put cutting-edge research on information automation, especially Multi-Agent Systems, into practice. At the same time, participants from all over the world can exchange new ideas and new developments, and thus promote fundamental research more effectively. All kinds of new principles and new technologies can receive comparatively objective appraisal through the competition. Robot soccer places high demands on a robot's visual capability: the better the vision hardware performs, the faster the robot can move and recognize targets, and only then can it win the competition. At the same time, the development of vision technology is greatly promoted. Robot competition enables researchers to apply a variety of technologies in search of better solutions, and simultaneously drives progress in different scientific areas.

Compared with robot competitions on land, the underwater robot competition is arguably more challenging because of the complexity and uncertainty of the underwater environment. Along with the exploration of marine resources, underwater operating tasks have become more frequent. These tasks are so complex and challenging that they need the cooperation of multiple robots, so research on underwater multi-robot cooperation has become essential and urgent. Underwater robot competition covers a wide range of areas relating to mechatronics, robotics, multi-sensor information fusion, intelligent control, computer vision, computer graphics, artificial intelligence, and so on. With the help of underwater robot competition, we can combine the latest research achievements on multi-robot cooperation with practice, and thus explore how multiple robots cooperate closely with one another in an unpredictable dynamic environment. On this basis, we can accelerate the commercialization and industrialization of multi-robot cooperative technology.

This work is supported by the National Natural Science Foundation of China (NNSF) under Grant 00000000.

The underwater robot competition is carried out in the form of racing, chasing, and water polo games played by intelligent biomimetic robotic fish. The autonomous vision-based underwater robot competition adopts a distributed autonomous control method. Each robotic fish is equipped with a camera and other sensors, and has the ability to perceive its workspace, locate itself, and make motion strategies autonomously. The autonomous vision-based underwater robot competition includes the water polo game, underwater rescue, the technical challenge, and so on. The most exciting event is the water polo game, in which robotic fish from two teams compete against each other; the team that scores more goals wins the game. The water polo game involves soccer-robot technology, underwater localization, underwater communication, image processing, ball-heading technology, and so on. The robotic fish used in the autonomous vision-based competition are autonomous boxfish-like robots developed by the Intelligent Control Laboratory of Peking University. Fig. 1 shows the boxfish-like robot in the water polo game and several different versions of the robotic fish [2–4].

Based on the above discussion and our previous work on the boxfish-like robot, this paper aims to give a systematic description of the autonomous vision-based underwater robot competition. The rest of this paper is organized as follows: Section II introduces the robotic fish, including its mechanical design and electrical system design, and in particular its typical swimming modes. Section III introduces the software architecture of the competition, including its basic operation principle and human-machine interaction. Section IV introduces the competition items and several key technologies in the competition, including autonomous image processing, underwater localization, and path planning and obstacle avoidance based on the artificial potential field; it also introduces some typical ball-tracking and ball-heading algorithms. Section V concludes this paper with an outline of future work.

2 Hardware System of the Competition

2.1 The Robotic Fish and the Boxfish It Imitated

The boxfish-like robot takes an aquatic animal, the boxfish, as its bionic object.

Fig. 2 shows the boxfish in nature and the prototype of the boxfish-like robot [5, 6]. The boxfish is named after its box-shaped external appearance.


Fig. 1: (a) Autonomous vision-based boxfish-like robot in the water polo game. (b) Version I of the boxfish-like robot. (c) Version II of the boxfish-like robot. (d) Version III of the boxfish-like robot.

Unlike other kinds of fish, the boxfish's body is enclosed in a prismatic shell that is not streamlined. There are chiseled spine structures along the edges of the shell, and the outer surfaces between the spines are concave or convex to different extents. This special outer structure gives the boxfish excellent maneuverability [7].

2.1.1 Mechanical Design

In order to achieve flexibility in design and complete autonomy of three-dimensional motion, we have attempted to integrate the mechanical structure, functional characteristics, and multiple sensors of the robotic fish.

Fig. 2: Boxfish and the prototype of the boxfish-like robot. (a) Front view of the boxfish. (b) Front view of the robotic fish. (c) Side view of the boxfish. (d) Side view of the robotic fish. (e) Isometric view of the boxfish. (f) Isometric view of the robotic fish.

Fig. 3: Mechanical structure of the robotic fish. (a) Configuration of the robotic fish. (b) Three parts of the robotic fish. (c) The upper part. (d) The lower part.

Compared with the biological entity, the robotic fish should have as many effective functions as possible while keeping its mechanical structure compact.

From the perspective of robotics, we study the propulsion mechanism, mechanism design, motion control, and system integration of the robotic fish. By studying the control mechanism and information processing of biological propulsion systems, we designed and manufactured a robotic fish that can move flexibly and with high propulsion efficiency. In addition, the robotic fish can swim in three-dimensional space and exhibits biological sensing capability and behavioral characteristics.

Fig. 3 shows the mechanical configuration of the boxfish-like robot [8]. It consists of a main module, a pair of pectoral fins, and one caudal fin. Using these fins, the robotic fish can implement various swimming modes. The main module of the robotic fish is designed as a sealed shell: the rechargeable battery, sensors, and motors are enclosed in the shell. The shell is made up of two parts; the upper part is made of ABS plastic and the lower part is made of aluminum. Static and dynamic seals are used to ensure the sealing of the robotic fish. In particular, an O-ring and silica gel are used for the seal between the upper and lower shells, while seal rings and grease seal the rotating shafts. The fins are designed based on the shape of real fins. The density of the robotic fish has been calibrated to be close to the density of water so that it is able to float at the water surface. The robotic fish implements forward movement and turning by controlling the swinging caudal fin. By controlling the pectoral fins on both sides of the body, the robot implements up-down locomotion, turning, forward movement, and backward movement.


2.1.2 Electric System Design

The electrical system is important to the robotic fish: perception of the external environment, onboard processing, and control of the actuators all rely on its effective integration. The electrical system consists of a power supply unit, a sensing unit, a control unit, a monitoring and control platform, a processor, and so on.

The electrical system is built around a Raspberry Pi, which serves as the core processor. The Raspberry Pi is a card-sized single-board computer running Linux, so the robotic fish can operate autonomously through the Linux environment.

The sensing unit consists of different kinds of sensors, including an IMU, a pressure sensor, an infrared sensor, a camera, temperature and humidity sensors, etc. The IMU consists of a gyroscope, an accelerometer, and an electronic compass; it measures the acceleration and azimuth of the robotic fish so that we can monitor its yaw, pitch, and roll motions. The pressure sensor measures the external pressure. The infrared sensor provides ranging. The temperature and humidity sensors monitor internal temperature and humidity in case of water leakage. The camera is placed at the center of the front of the robot body and is used to capture the surrounding environment, so as to avoid obstacles and localize the robotic fish.

Fig. 4 shows the structure of the electrical system. The sensing unit acquires a variety of sensor data and sends the data to the processor through the IIC transport protocol. The processor performs a comprehensive analysis of the surrounding environment and the movement state of the robotic fish according to the sensor data. After processing the sensor data, the processor sends motion commands to the control unit. The control unit receives the motion commands from the processor and then drives the steering engines. The control unit is based on a CPG control model that can generate several kinds of PWM waves with a certain phase delay between them. By feeding these PWM waves into the steering engines that drive the pectoral fins and the caudal fin, we can produce the robotic fish's various motion modes [9].
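To make the CPG-based control concrete, the following Python sketch shows one plausible way to generate phase-shifted oscillation commands for the caudal fin and the two pectoral fins and map them to servo pulse widths. The function names, the sinusoidal form, and all numeric values are illustrative assumptions for exposition; they are not the controller used on the actual robot.

```python
import math

def cpg_fin_commands(t, freq=1.0, amp=30.0, offset=0.0,
                     phase_caudal=math.pi / 2, phase_left=0.0,
                     phase_right=math.pi):
    """Illustrative CPG-style oscillator: returns target angles (degrees)
    for the caudal fin and the two pectoral fins at time t (seconds).
    The phase arguments reproduce the 'certain delay in the phase'
    between the PWM waves described in the text."""
    caudal = offset + amp * math.sin(2 * math.pi * freq * t + phase_caudal)
    left   = offset + amp * math.sin(2 * math.pi * freq * t + phase_left)
    right  = offset + amp * math.sin(2 * math.pi * freq * t + phase_right)
    return caudal, left, right

def angle_to_pwm_us(angle_deg, center_us=1500, us_per_deg=10):
    """Map a fin angle to a hypothetical servo pulse width in microseconds."""
    return center_us + us_per_deg * angle_deg
```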

In terms of functionality, the robot can realize autonomouslocalization, navigation, communication, decision-making,and so on. In addition, it is easy to carry out function expan-sion and secondary development.

2.1.3 Typical Swimming Modes

In nature, fish show adroit and versatile movements in a variety of forms. According to differences in acceleration, cruising speed, and maneuverability, Webb classified the optimal designs of fish into two categories: the BCF (Body and/or Caudal Fin) propulsion mode and the MPF (Median and/or Paired Fin) propulsion mode [10]. Because acceleration, cruising speed, and maneuverability are mutually exclusive, no single fish is prominent in all of these abilities; rather, each kind of fish excels in one ability or another. In order to adapt to its ecological environment, the structure and function of a fish are a compromise among these three abilities. The BCF propulsion mode can provide high speed and acceleration, while the MPF propulsion mode can provide high maneuverability.

Fig. 4: The electrical system of the robotic fish. (The sensing unit, comprising the camera, pressure sensor, IMU, infrared sensor, and temperature and humidity sensors, together with the power supply unit, connects to the Raspberry Pi core processor over IIC; the core processor drives the control unit and the steering engines for the left and right pectoral fins and the caudal fin via PWM, and links to the monitoring and control platform.)

In essence, the BCF and MPF propulsion modes cannot be completely separated; they should be considered as a continuum.

On this basis, we designed and implemented several typical swimming modes of the robotic fish, as shown in Fig. 5. Fig. 5(a) shows forward swimming in the MPF propulsion mode: the robotic fish uses only its swinging pectoral fins to swim forward or backward. In the BCF propulsion mode, the robotic fish swims forward using the swinging caudal fin, with the pectoral fins held parallel to the horizontal plane to keep balance. Fig. 5(b) shows the combined use of the two propulsion modes in forward swimming. Fig. 5(c) shows backward swimming in the MPF propulsion mode. Fig. 5(d) shows turning during forward swimming: because of the caudal fin's offset, the robotic fish can change its direction and produce forward propulsion simultaneously. Fig. 5(e) shows turning in the MPF propulsion mode: one swinging pectoral fin produces forward propulsion while the other produces backward propulsion, and this directional difference subjects the robotic fish to a bending moment that turns it. Fig. 5(f) shows the diving movement of the robotic fish: by changing the angle of attack α, the pectoral fins produce different lift forces, and once the robotic fish reaches a sufficient speed it can control its attitude and perform up-down locomotion with the help of the lift force. Fig. 5(g) shows the rolling locomotion of the robotic fish: the pectoral fins are both perpendicular to the main body but swing in different directions, which rolls the robotic fish. Fig. 5(h) shows the braking mode: stopping the pectoral fins in the vertical position produces a large resistance that brakes the robotic fish.
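As a rough illustration of how the swimming modes of Fig. 5 could be encoded for an oscillator like the sketch above, the table below maps a few modes to a hypothetical set of fin parameters (frequency in Hz, amplitude and offset in degrees, phase in radians). Every number here is a placeholder, not a calibrated value from the actual robot.

```python
import math

# Illustrative gait table: each entry gives per-fin oscillation parameters.
GAITS = {
    "forward_mpf": {"caudal": None,
                    "left":  dict(freq=1.5, amp=25, offset=0,   phase=0.0),
                    "right": dict(freq=1.5, amp=25, offset=0,   phase=0.0)},
    "forward_bcf": {"caudal": dict(freq=2.0, amp=30, offset=0,  phase=0.0),
                    "left":  dict(freq=0.0, amp=0,  offset=0,   phase=0.0),  # held level
                    "right": dict(freq=0.0, amp=0,  offset=0,   phase=0.0)},
    "turn_mpf":    {"caudal": None,
                    "left":  dict(freq=1.5, amp=25, offset=0,   phase=0.0),
                    "right": dict(freq=1.5, amp=25, offset=0,   phase=math.pi)},  # opposed fins
    "dive":        {"caudal": dict(freq=2.0, amp=30, offset=0,  phase=0.0),
                    "left":  dict(freq=0.0, amp=0,  offset=-15, phase=0.0),  # angle of attack
                    "right": dict(freq=0.0, amp=0,  offset=-15, phase=0.0)},
    "brake":       {"caudal": None,
                    "left":  dict(freq=0.0, amp=0,  offset=90,  phase=0.0),  # fins vertical
                    "right": dict(freq=0.0, amp=0,  offset=90,  phase=0.0)},
}
```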

3 Software Architecture of the Competition

3.1 Basic Operation Principle

By establishing an operating system platform

based on Linux, it is convenient to add functional modules and meet higher-level control requirements. As mentioned above, the robotic fish is equipped with an IMU containing a gyroscope, an accelerometer, an electronic compass, and other sensors, so the robotic fish can compute its own positional information in real time. With the help of these intelligent sensors, the capability of autonomous sensing is improved.


Fig. 5: Several typical swimming modes of the robotic fish. (a) Forward swimming [MPF mode]. (b) Forward swimming [BCF+MPF mode]. (c) Backward swimming. (d) Turning on the move. (e) Turning action with MPF mode. (f) Up-down locomotion. (g) Roll locomotion. (h) Braking.

In addition, the robotic fish can communicate over stable and reliable Wi-Fi, transferring images to the monitoring and control platform in real time. This greatly expands the application scenarios of the robotic fish and makes communication among multiple robotic fish more convenient, which in turn facilitates multi-robot cooperative tasks.

3.2 Human Machine Interaction

In order to observe and control the robotic

fish, we designed the monitoring and control platform shown in Fig. 6. The platform implements parameter display, parameter adjustment, speed measurement, and motion control of the robotic fish.

Fig. 6: The computer monitoring and control platform of the robotic fish. (Modified and redrawn)

The platform is mainly composed of the five parts shown in Fig. 6. Area I is the image-display area, Area II the status-display area, Area III the sensor-data display area, Area IV the network-setting area, and Area V the parameter-control area.

Image-display area: as mentioned previously, the robotic fish is equipped with a camera for image acquisition. The camera collects image information in the water, which is transmitted to the upper monitoring platform through Wi-Fi and displayed in the image-display area.

Status-display area: this area displays the robotic fish's parameters and swimming state in real time.

Sensor-data display area: this area is mainly used to display the data collected by the sensors installed on the robotic fish. These sensors include the infrared sensor, the pressure sensor, and the IMU.

Network-setting area: this area is used to set the IP address of the upper platform, which must match the IP address of the robotic fish; only then can data communication be carried out between the platform and the robotic fish. In addition, there are some buttons used to debug the motion of the robotic fish.

Parameter-control area: the main functions of the robotic fish are concentrated in this area. The button group is used to control the robotic fish's swimming frequency, the fins' offset and swing amplitude, the phase difference between the caudal fin and the pectoral fins, the phase difference between the pectoral fins, and so on.

3.3 Experimental Platform

In order to detect the position and velocity information of the

robotic fish in the water, we have designed an open experimental platform with good scalability, a friendly graphical interface, modular software design, and real-time image processing. It also records and replays experimental results. The experimental platform consists of the robotic fish subsystem, the communication subsystem, and the computer monitoring and control platform introduced previously. The communication subsystem, which was also introduced previously, is based on Wi-Fi.


Fig. 7: The workflow of the experimental platform (the robotic fish subsystem, the communication subsystem, and the computer monitoring and control platform).

The experimental platform runs through the coordination of the

three subsystems, as shown in Fig. 7. Take speed measurement as an example. Firstly, the communication subsystem gives the control command to the robotic fish so that it reaches the corresponding swimming state. Secondly, the pressure sensors installed on the robotic fish measure the pressure variation of the moving water and send the sensor data to the computer monitoring and control platform. Finally, the computer platform calculates the speed of the robotic fish by analyzing the sensor data. We can also measure the speed without the pressure sensors by using a camera-based speed-measuring platform: the vision system captures underwater images and sends them to the computer platform for real-time image processing, from which the speed of the robotic fish is obtained.
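For the camera-based speed measurement described above, a minimal sketch of the final computation step might look as follows, assuming the image processing stage already yields the fish's position in each frame and that the camera has been calibrated to a known metres-per-pixel scale (both are assumptions, not details given in the paper).

```python
import math

def estimate_speed(positions, frame_interval_s, metres_per_pixel):
    """Estimate the robotic fish's speed from a sequence of image-plane
    positions (x, y) in pixels, assuming a calibrated camera view.
    positions: list of (x, y) tuples, one per processed frame.
    Returns the average speed in m/s over the sequence."""
    if len(positions) < 2:
        return 0.0
    total_px = 0.0
    for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
        total_px += math.hypot(x1 - x0, y1 - y0)
    total_m = total_px * metres_per_pixel
    total_t = frame_interval_s * (len(positions) - 1)
    return total_m / total_t
```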

4 Several Key Technologies in the Competition

4.1 Competition Items

The autonomous vision-based underwater robot competition

involves many key technologies, including the ball-heading algorithm, autonomous image processing, autonomous underwater localization, and so on. Specifically, in the water polo game and the technical challenge, the robotic fish first has to identify the colored water polo, then chase and head it. In this process, image processing and autonomous localization play important roles in determining the distance and angle between the water polo and the robotic fish. Besides, the quality of the ball-heading algorithm determines the scoring accuracy.

Fig. 8(a) shows the water polo game between the robotic fish. The two robotic fish start at the sides of the pool. Once the whistle blows, they start to head the water polo, and the one that scores more goals wins the game. Fig. 8(b) shows the technical challenge. The robotic fish starts at the center line beside the pool wall with its head pointing at the water polo, which is located at the penalty mark. The aim of the robotic fish is to head the water polo into its own goal. Fig. 8(c) shows the underwater rescue. Five colored cylindrical pillars are placed in the pool: pillars 1 and 3 are painted green, pillars 2 and 4 are painted red, and pillar 5 is painted yellow. On top of each pillar there is a transparent box with heavy loads; the whole box represents a trapped fish to be saved. The initial positions of the cylindrical pillars are shown in Fig. 8(c). The robotic fish starts from the center of the pool wall on the right side and then collides with the pillars one by one. The boxes that fall into the water represent rescued fish.

As previously stated, the robotic fish's attitude can be controlled and it has fine maneuverability.

Fig. 8: Autonomous vision-based underwater robot competition. (a) Water polo game between the robotic fish. (b) Technical challenge. (c) Underwater rescue.

Fig. 9: Competition in three-dimensional space.

Therefore, we may design a new competition item in which the robotic fish needs to swim in three-dimensional space. In the process of heading the water polo, the robotic fish is required to avoid obstacles and then head the polo into the goal. The route of the robotic fish is shown in Fig. 9.

4.2 Autonomous Image Processing

In order to obtain accurate information about the

environment and implement assigned tasks better, the camera and computer with which the robotic fish is equipped form the robot's vision system. The vision system is used to identify, track, and measure targets. On that basis, we can perform image processing.


Machine vision is used to convert the images captured by the

camera into control commands for the robot. It consists of three processes: image capturing, image processing, and output display or control. With the help of its vision sensors, the robotic fish converts the captured scene into an image signal and transmits it to the image processing system, which separates out the available feature information so that we obtain the key information we need [11].

OpenCV is a function library for image processing and computer vision. Compared with other image libraries, OpenCV is freely available. To realize frequently used image processing and computer vision algorithms, developers can invoke the related processing functions in the library according to their own needs.

While ensuring processing efficiency, we port OpenCV to the embedded development platform in order to shorten the development cycle and improve development efficiency. For the robotic fish, we install OpenCV on the Raspberry Pi. Under OpenCV, the camera can directly obtain images in the IplImage format, which we can then process directly. The images captured by the camera are processed directly on the ARM11 processor. The upper computer is needed only for real-time observation of the water environment or for controlling the robotic fish through instructions; in other words, the PC plays only an auxiliary role. The robotic fish's real decision-making and control core is the embedded ARM11 processor. Fig. 10 shows the vision system of the robotic fish.

Poor image formation quality is a major problem in underwater machine vision. Due to insufficient light, uneven illumination, refraction, reflection, and color interference during the propagation of light under water, the water introduces sinking, scattering, and convolution effects. Compared with land images, underwater images have low contrast, poor uniformity, a low signal-to-noise ratio, and a serious gray cast. Eliminating external noise interference and identifying the markers accurately is therefore the premise of accurate positioning [12].

In the competition, we adopt a color-based image processing algorithm. Fig. 11 shows the flow diagram of the underwater image processing algorithm for the robotic fish [13]. Compared with traditional image processing algorithms, this algorithm uses an ACE filtering algorithm for initial processing of the acquired images and then converts the images to the HSV color space. In the HSV color space we carry out further processing such as threshold segmentation, lump detection, reflection elimination, landmark matching, and distance and angle calculation. We thereby acquire the distance and angle from the robotic fish to the markers (landmarks). Fig. 12 shows the schematic diagram of the distance and angle measurement. Detailed information about the distance and angle can be found in our previous work [13].
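As a minimal sketch of the threshold-segmentation and lump-detection stages of this pipeline, the following OpenCV code (OpenCV 4.x API) isolates a colored target in HSV space and returns the centroid of the largest blob. The HSV bounds are placeholder values for an orange ball and would need calibration to the pool lighting; the reflection-elimination, landmark-matching, and distance/angle steps of Fig. 11 are omitted here.

```python
import cv2
import numpy as np

def detect_colored_marker(bgr_frame, hsv_lo=(5, 120, 120), hsv_hi=(25, 255, 255)):
    """Threshold segmentation and lump (blob) detection in HSV space.
    The HSV bounds are placeholders and require calibration in practice."""
    hsv = cv2.cvtColor(bgr_frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(hsv_lo), np.array(hsv_hi))
    # Morphological opening suppresses reflection speckle before the blob search.
    kernel = np.ones((5, 5), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(largest)
    return (x + w // 2, y + h // 2, w, h)   # centroid and size of the detected lump
```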

The ACE model can control the image's color contrast, maximize the image dynamic range, and sharpen image borders, but it is not suitable for real-time applications because of its high computational cost.

Fig. 10: Vision system of the robotic fish (a USB camera for image capturing, the embedded ARM1176JZF-S microprocessor for image processing, and the upper monitor for monitoring).

Fig. 11: Flow diagram of the underwater image processing algorithm for the robotic fish (input image, accelerated ACE model, HSV color space transformation, threshold segmentation, lump detection, reflection elimination, landmark matching, then, if matching succeeds, distance and angle calculation and Kalman filtering to output the distance and angle information).

In order to overcome this problem, the AACE (Accelerated Automatic Color Equalization) model is applied to decrease the computational time. The main structure of the AACE model is shown in Fig. 13 [8]. It includes random sampling, chromatic adjustment, dynamic tone reproduction scaling, SVD-based mapping, and so on. AACE is discussed in detail in our previous work [8].

4.3 Underwater Localization

An autonomous robot is able to work with the abilities of

self-organization, self-planning, and self-adaptation in complex environments. Navigation technology is the core of autonomous-robot research; moreover, it is the key technology for achieving the robot's autonomy.

Fig. 12: Schematic diagram of the distance and angle measurement.


Fig. 13: Main structure of the AACE model (random sampling selects a subset of the input image, followed by chromatic/spatial adjustment, dynamic tone reproduction scaling, and SVD-based mapping from the final output of the selected subset to the final output of the whole input image).

The research purpose of navigation is to make the robot move in a meaningful way and complete particular tasks without human intervention. Place a mobile robot in an unknown, complex, and dynamic environment; after a period of exploring the environment, the robot should be able to reach any specified position while minimizing a cost function. To complete a navigation mission, we have to solve the problems of motion control, path planning, and localization. Among these, localization is the most basic link in robot navigation and the first problem to solve before completing navigation tasks. Real-time and precise localization is key to improving the performance of the robot. The robot determines its pose relative to the environment from data acquired by its onboard sensors [14], [15], [16]. For the robotic fish, we choose the Monte Carlo algorithm to achieve localization, considering the complexity of the environment.

Disregarding the up-down locomotion of the robotic fish, we restrict its motion underwater: we adjust the position of the robotic fish so that the camera is just under the horizontal plane and the dorsal surface of the robotic fish lies in the horizontal plane, and we ensure that the motion of the robotic fish is smooth. As shown in Fig. 14, we establish the coordinate system of the competition area and place colored markers at the corresponding positions in the water. By recognizing and processing these markers, we prepare for localization [17]. We need to determine the position and direction of the robotic fish, taking the installation position of the camera as the position coordinates of the robotic fish. In the implementation of the Monte Carlo algorithm, we choose X and Y to represent the two-dimensional horizontal-plane coordinates of the robotic fish, and the angle between the heading direction and the X axis is denoted θ. X, Y, and θ represent the positional information of the robotic fish [18].

We also need to determine the odometer model of the robotic fish, which provides information about its incremental motion and velocity. We treat the robotic fish as a wheelbarrow-like (nonholonomic) vehicle: it cannot move transversely, but it can swim forward, turn, and swim backward. Unlike wheeled mobile robots on land, the robotic fish swims with its fins, so we cannot use a photoelectric sensor to measure its speed and the distance it has traveled. Considering these constraints, the robotic fish is equipped with an IMU that can sense its movement.

Fig. 14: Coordinate system for the robotic fish and the particles.

From the IMU we acquire the triaxial angles and accelerations in the IMU frame; the triaxial angles are the pitch, roll, and yaw angles. The IMU defines a coordinate system different from the coordinate system of the competition area, so when establishing the odometer model of the robotic fish we need to transform coordinates in the mathematical derivation. With the angle and acceleration information provided by the IMU, we update the data in the odometer model, and from these data we calculate the distance traveled and the speed of the robotic fish.
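A simplified sketch of one step of such an odometer update is shown below: the body-frame acceleration reported by the IMU is rotated into the competition-area frame using the yaw angle and then integrated. This planar model ignores pitch and roll and uses hypothetical state names; it illustrates the coordinate transformation rather than reproducing the authors' exact model.

```python
import math

def odometry_update(state, accel_body, yaw_world, dt):
    """One dead-reckoning step of an assumed planar odometer model.
    state: dict with x, y (m) and vx, vy (m/s) in the competition-area frame.
    accel_body: (ax, ay) acceleration in the IMU/body frame (m/s^2).
    yaw_world: heading of the fish relative to the X axis (rad), from the IMU.
    dt: time step (s)."""
    ax_b, ay_b = accel_body
    c, s = math.cos(yaw_world), math.sin(yaw_world)
    # Rotate body-frame acceleration into the competition-area coordinate frame.
    ax_w = c * ax_b - s * ay_b
    ay_w = s * ax_b + c * ay_b
    state["vx"] += ax_w * dt
    state["vy"] += ay_w * dt
    state["x"] += state["vx"] * dt
    state["y"] += state["vy"] * dt
    return state
```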

Fig. 15 shows the procedure of the localization based on the Monte Carlo algorithm [8]. It includes underwater image processing, distance and angle calculation, perception update, resampling, Kalman-based localization estimation, and so on. In Monte Carlo localization, we use a probability distribution to determine the position of the robotic fish: the position is represented as a probability distribution, which provides an estimate of the position of the robotic fish [19]. Detailed information about the Monte Carlo localization can be found in our previous work [8].
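The sketch below outlines one Monte Carlo localization cycle in the spirit of Fig. 15: a motion update from the odometer increment, a perception update that weights each particle by how well its predicted landmark distance and angle match the camera measurement, and resampling. The noise parameters, the Gaussian weighting, and the `expected_obs_fn` callback are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def mcl_step(particles, odom_delta, landmark_obs, expected_obs_fn, noise=0.1):
    """One simplified Monte Carlo localization cycle.
    particles: list of [x, y, theta] hypotheses.
    odom_delta: (dx, dy, dtheta) increment from the odometer model.
    landmark_obs: measured (distance, angle) to a recognized landmark.
    expected_obs_fn(p): (distance, angle) that particle p would observe."""
    dx, dy, dth = odom_delta
    weights = []
    for p in particles:
        # Motion update with a little diffusion noise.
        p[0] += dx + random.gauss(0, noise)
        p[1] += dy + random.gauss(0, noise)
        p[2] += dth + random.gauss(0, noise * 0.1)
        # Perception update: weight by agreement with the camera measurement.
        ed, ea = expected_obs_fn(p)
        err = math.hypot(ed - landmark_obs[0], ea - landmark_obs[1])
        weights.append(math.exp(-err * err / (2 * 0.5 ** 2)))
    # Resampling proportional to weight (uniform if all weights vanish).
    total = sum(weights)
    probs = [w / total for w in weights] if total > 0 else [1.0 / len(particles)] * len(particles)
    resampled = random.choices(particles, weights=probs, k=len(particles))
    return [list(p) for p in resampled]
```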

4.4 Path Planning and Obstacle Avoidance Based on Artificial Potential Field

The artificial potential field has been widely applied to real-time obstacle avoidance and motion planning. Its basic idea is as follows. Regard the robot in the motion space as a particle subject to an artificial potential field force. The artificial potential field reflects the structure of the motion space, from which the robot obtains the distribution of obstacles and targets. The potential field contains an attractive pole and repulsive poles. Obstacles are repulsive poles, so the repulsion field function is related to the obstacles, and the closer the robot is to an obstacle, the larger the repulsion should be. Conversely, the target is the attractive pole, and the attraction field function is related to the position of the target: the farther the robot is from the target, the larger the attraction should be. The attractive and repulsive forces together generate a potential field in which obstacles have high potential and free space has low potential, so the robot avoids the obstacles and moves toward the target point.
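For reference, a minimal implementation of the attractive/repulsive force combination described above could look as follows. The quadratic attractive potential and the repulsive potential with influence radius d0 follow the usual artificial potential field formulation rather than any equation given in this paper; gains and names are illustrative.

```python
import math

def apf_force(robot, target, obstacles, k_att=1.0, k_rep=10.0, d0=1.0):
    """Classical artificial potential field: attractive force toward the target
    plus repulsive forces from obstacles within the influence distance d0.
    robot, target: (x, y) tuples; obstacles: list of (x, y) tuples."""
    # Gradient of a quadratic attractive potential pulls toward the target.
    fx = k_att * (target[0] - robot[0])
    fy = k_att * (target[1] - robot[1])
    for ox, oy in obstacles:
        dx, dy = robot[0] - ox, robot[1] - oy
        d = math.hypot(dx, dy)
        if 0 < d < d0:
            # Repulsion grows sharply as the robot approaches the obstacle.
            mag = k_rep * (1.0 / d - 1.0 / d0) / (d * d)
            fx += mag * dx / d
            fy += mag * dy / d
    return fx, fy
```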


Fig. 15: Flowchart of the localization based on the Monte Carlo algorithm (initialization, underwater image processing, then, if a landmark is in sight, distance and angle calculation, perception update, and resampling; otherwise IMU calculation and motion update; finally Kalman-based localization estimation outputting [x y]).

Based on the artificial potential field, we have designed two basic behaviors: swimming to the goal and avoiding obstacles. A behavior here represents a purposeful action or sequence of actions. For the autonomous robotic fish, a behavior can be regarded as a gait sequence that can be acquired by means of bionics.

The behavior of swimming to the target is controlled by a typical point-to-point algorithm: the robotic fish continuously adjusts its direction and speed according to real-time visual feedback, so that it swims from an initial position to the designated target location [11], [20]. The behavior of avoiding obstacles in the artificial potential field can be described as follows [2]:

$$\theta =
\begin{cases}
\varphi & (L < M_{min}) \cap (R < M_{min}) \\
\theta_0 & L \le R \\
-\theta_0 & L > R
\end{cases} \qquad (1)$$

$$v =
\begin{cases}
\varphi & (L < M_{min}) \cap (R < M_{min}) \\
0 & (L > M_{max}) \cap (R > M_{max}) \\
v_{max} & \text{otherwise}
\end{cases} \qquad (2)$$

where $\theta$ and $v$ are the outputs of the obstacle avoidance behavior: $\theta$ determines the direction of the robotic fish and $v$ determines the swimming speed. $\varphi$ is the output of the swim-to-target behavior when no obstacle is present. $\theta_0$ is the direction angle for avoiding the nearest obstacle, and $v_{max}$ is the maximum speed. $L$ and $R$ are determined by the distances to the obstacles on the left and right sides. $M_{min}$ represents the situation in which obstacles can be ignored, and $M_{max}$ the situation in which an obstacle is very close to the robotic fish. $L$, $R$, $M_{min}$, and $M_{max}$

are defined as follows [2]:

$$L = \frac{1}{\sqrt{d_{camera}^2 + \Delta X_l^2} + m_0} \qquad (3)$$

$$R = \frac{1}{\sqrt{d_{camera}^2 + \Delta X_r^2} + m_0} \qquad (4)$$

$$M_{min} = \frac{1}{d_{range} + m_0} \qquad (5)$$

$$M_{max} = \frac{1}{v\,\Delta t + m_0} \qquad (6)$$

where $d_{camera}$ is the distance between the robotic fish and the obstacle, acquired by real-time estimation from the camera. $\Delta X_l$ is the actual lateral distance estimated using the similar-triangle rule from the obstacle's error signal along the X axis; the X-Y coordinate system here is defined in the image acquired by the camera. $\Delta X_l$ corresponds to an obstacle on the left side (left deviation), and $\Delta X_r$ is defined similarly for the right side (right deviation). $d_{range}$ is the maximum distance the camera can estimate, and $m_0$ is a gain value.
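Under the interpretation that $\varphi$ is the (direction, speed) pair output by the swim-to-target behavior, Eqs. (1)-(6) translate almost directly into code; the sketch below is such a transcription, with parameter names mirroring the symbols above.

```python
import math

def avoid_obstacles(d_camera, dx_left, dx_right, d_range, dt, v_current,
                    phi, theta0, v_max, m0=1.0):
    """Transcription of Eqs. (1)-(6). phi is interpreted as the (theta, v)
    pair produced by the swim-to-target behavior. Returns (theta, v)."""
    L = 1.0 / (math.hypot(d_camera, dx_left) + m0)    # Eq. (3)
    R = 1.0 / (math.hypot(d_camera, dx_right) + m0)   # Eq. (4)
    m_min = 1.0 / (d_range + m0)                      # Eq. (5): obstacle far enough to ignore
    m_max = 1.0 / (v_current * dt + m0)               # Eq. (6): obstacle dangerously close

    if L < m_min and R < m_min:
        # No relevant obstacle: defer to the swim-to-target behavior.
        return phi
    theta = theta0 if L <= R else -theta0             # Eq. (1)
    v = 0.0 if (L > m_max and R > m_max) else v_max   # Eq. (2)
    return theta, v
```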

4.5 Ball-tracking

In the water polo game, we choose a PD controller to realize

tracking of the water polo. For the robotic fish, the error signal is defined as the offset between the center of mass of the target in the captured image and the center of the robotic fish's visual field. The error signals along the X and Y axes of the image are used to control the pitch and yaw locomotion, and are selected as the inputs of the PD controller. Based on the error signals, the control law of the PD controller is as follows [2]:

$$\mu_i = K_p\,\bar{\epsilon}_i + K_d\,\frac{\partial \bar{\epsilon}_i}{\partial t} \qquad (7)$$

After discretization, the formula above can be written as [2]:

$$\mu_i = K_p\,\bar{\epsilon}_i + \frac{K_d}{T}\left(\bar{\epsilon}_i - \bar{\epsilon}_{i-1}\right) \qquad (8)$$

where $K_p$ is the proportional gain, $K_d$ is the differential gain, $T$ is the sampling period, and $\bar{\epsilon}_i$ is the average error signal, defined as [2]:

$$\bar{\epsilon}_i = \epsilon_i - \gamma\,\epsilon_{i-1} \qquad (9)$$

where $\epsilon_i$ is the raw error signal and $\gamma$ is an error constant.

In order to track the water polo, the location of the water

polo should be kept at the center of the robotic fish's field of view. The structure of the PD controller is shown in Fig. 16 [2]. The yaw and pitch locomotion adjust the errors in the X-axis and Y-axis directions, respectively [21].
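A direct transcription of Eqs. (8) and (9) into a small discrete PD controller class is sketched below; one instance would run per image axis. The gain values in the usage comment are placeholders, not tuned parameters from the paper.

```python
class BallTrackingPD:
    """Discrete PD controller of Eq. (8) with the error averaging of Eq. (9)."""

    def __init__(self, kp, kd, period_s, gamma=0.5):
        self.kp, self.kd, self.T, self.gamma = kp, kd, period_s, gamma
        self.prev_raw = 0.0
        self.prev_avg = 0.0

    def update(self, error_px):
        """error_px: pixel offset between the ball centroid and the image centre."""
        avg = error_px - self.gamma * self.prev_raw                    # Eq. (9)
        u = self.kp * avg + self.kd * (avg - self.prev_avg) / self.T   # Eq. (8)
        self.prev_raw, self.prev_avg = error_px, avg
        return u

# Hypothetical usage: yaw command from the horizontal pixel error.
# yaw_pd = BallTrackingPD(kp=0.8, kd=0.2, period_s=0.1)
# yaw_cmd = yaw_pd.update(ball_x - frame_width / 2)
```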

4.6 Ball-heading Algorithm

In the water polo game, the robotic fish that scores more

goals wins the game, so the ball-heading algorithm is critical. We introduce several typical ball-heading algorithms below. Considering the objective factors that influence heading, a novel ball-heading algorithm based on the movements of the robotic fish is also introduced in this paper. Fig. 17 shows the scene of heading the water polo.


Fig. 16: The structure of PD controller.

4.6.1 Basic Ball-heading Algorithm

As shown in Fig. 18, the basic idea of the basic ball-heading algorithm is as follows: first, make the robotic fish swim to the best shot point G; second, adjust the direction of the robotic fish; third, make the robotic fish swim to the water polo and head it. The whole process includes accelerating and then decelerating toward G, changing direction, and a final acceleration toward the water polo. In practice, the basic ball-heading algorithm has several typical problems. Even when the swing frequency of the fins is reduced to zero, the robot still drifts along its original direction because of the low resistance of the water, so the robotic fish cannot reach G accurately. Moreover, because of hardware limitations, a large-angle turn cannot be completed in a single action; the water waves generated by multiple turning adjustments disturb the environment further and increase the difficulty of control.

4.6.2 Cylindrical Plunge Ball-heading Algorithm

As shown in Fig. 19, the basic idea of the cylindrical plunge ball-heading algorithm is as follows: first, the robotic fish swims to the cylindrical plunge; second, it swims along the circle to the best ball-heading point G; finally, it heads the ball at a certain speed.

The best ball-heading point G(xG, yG) can be defined asfollows [22]:

$$\beta = \arctan\!\left(\frac{y_C - y_B}{x_C - x_B}\right) \qquad (10)$$

$$x_G = x_B - \rho\cos\beta \qquad (11)$$

$$y_G = y_B - \rho\sin\beta \qquad (12)$$

where $x_G$ and $y_G$ are the coordinates of the best ball-heading point G, $x_C$ and $y_C$ are the coordinates of the center of the goal, and $x_B$ and $y_B$ are the coordinates of the water polo. $\rho$ is a constant larger than the radius of the water polo and can be set according to the physical demand.
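Eqs. (10)-(12) translate directly into the helper below. Note that `atan2` is used instead of the plain arctangent of the ratio so that the quadrant of β is resolved automatically; that substitution is ours, not part of the original formulation.

```python
import math

def best_heading_point(ball_xy, goal_xy, rho):
    """Best ball-heading point G from Eqs. (10)-(12).
    ball_xy: (x_B, y_B) coordinates of the water polo.
    goal_xy: (x_C, y_C) coordinates of the center of the goal.
    rho: standoff distance, chosen larger than the ball radius."""
    x_b, y_b = ball_xy
    x_c, y_c = goal_xy
    beta = math.atan2(y_c - y_b, x_c - x_b)   # Eq. (10), quadrant-safe
    x_g = x_b - rho * math.cos(beta)          # Eq. (11)
    y_g = y_b - rho * math.sin(beta)          # Eq. (12)
    return x_g, y_g
```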

The cylindrical plunge is the path line between the robotic fish and the best ball-heading point G. Draw a line l2 perpendicular to l1 through the point G, and choose a point O on l2 whose distance to G is r. Draw the circle centered at O with radius r; this circle is tangent to l1 at G, and it is the cylindrical plunge we need. The radius r is the best turning radius, at which the robotic fish achieves its fastest speed, and the center O should lie on the same side as the robotic fish. Connect the points F and G and draw the perpendicular bisector l3 of the segment FG. If the cylindrical plunge and l3 intersect at two points, we choose the point with the smaller x-coordinate as a trajectory point.

Fig. 17: The scene of heading the water polo. (a) Beginning of the game. (b) Tackling the water polo. (c) Scoring a goal.

If they have only one intersection point, we choose that point as a trajectory point. If they do not intersect, we choose the point G as a temporary intersection point. The collection of these intersection points forms the attacking path of the robotic fish.

With the cylindrical plunge ball-heading algorithm, the robotic fish can swim smoothly to the best ball-heading point along the specified trajectory, and its direction adjustment can be accurate, so the angle and distance errors between the robotic fish and the best ball-heading point G can be eliminated. However, the attacking trajectory point of the cylindrical plunge changes constantly because of the disturbance of the water, so the robotic fish often spends a long time adjusting its direction and the best moment for heading the water polo may be missed. Owing to the long attacking trajectory, the water polo may also be tackled by the opponent before the robotic fish reaches it.

Fig. 18: Basic Ball-Heading Algorithm.


Table 1: The arbitration rules and the actions executed (rows: reledir_fish2ball; columns: reledir_ball2goal)

reledir_fish2ball   Angle 1            Angle 2            Angle 3            Angle 4
Angle 1             Large Right Turn   Large Left Turn    Small Right Turn   Move Forward
Angle 2             Large Right Turn   Large Left Turn    Move Forward       Small Right Turn
Angle 3             Small Right Turn   Large Right Turn   Large Left Turn    Move Forward
Angle 4             Large Left Turn    Small Left Turn    Move Forward       Large Right Turn

Fig. 19: Cylindrical Plunge Ball-Heading Algorithm.

4.6.3 Behavior-based Ball-heading Algorithm

The basic ball-heading algorithm and the cylindrical plunge ball-heading algorithm are both frequently used. Their reasonable path planning improves the efficiency of movement to the target point, so the robotic fish can fulfill the task of heading the water polo. But their performance in practice is barely satisfactory.

The complexity and uncertainty of the underwater environment introduce a large amount of interference, which greatly reduces the efficiency and accuracy of controlling the robotic fish. Its irregularly shaped head also reduces the success rate of heading the water polo.

In actual games, the robotic fish rarely scores with a single ball-heading action or a long-distance raid; in most cases, it relies on repeated adjustments. It is convenient for the robotic fish to head the water polo by reducing its speed as it approaches, but this state is difficult to maintain in a fierce competition environment.

Based on these observations, we consider the following idea: let the robotic fish first approach the water polo. Within a certain distance of the water polo, we arbitrate the action of the robotic fish according to the geometric relationship among the robotic fish, the water polo, and the opponent's goal. The robotic fish can not only head the water polo but also flick it with its caudal fin; in a way, the robotic fish is prone to stroke the water polo with its caudal fin and body. Even when the robotic fish cannot reach the water polo, the water waves it produces help the ball drift toward the opponent's goal. Guided by this idea, an algorithm based on the behavior of the robotic fish has been put forward. The algorithm structure is shown in Fig. 20.

Fig. 20: Structure of the behavior-based ball-heading algorithm (the camera feeds an image processing and identification unit; an arbiter then selects one of the actions Large Right Turn, Small Right Turn, Large Left Turn, Small Left Turn, or Move Forward for execution).

We choose reledir_fish2ball and reledir_ball2goal as the arbitration criteria. They can be defined as [22]:

reledir_fish2ball = dir_fish2ball − dir_fish

reledir_ball2goal = dir_ball2goal − dir_fish

where dir_fish is the direction of the robotic fish, dir_fish2ball is the direction from the center of the robotic fish to the water polo, and dir_ball2goal is the direction from the water polo to the opponent's goal. We divide reledir_fish2ball and reledir_ball2goal into four intervals named Angle 1, Angle 2, Angle 3, and Angle 4: Angle 1 ∈ (0°, 90°), Angle 2 ∈ (−90°, 0°), Angle 3 ∈ (−180°, −90°), Angle 4 ∈ (90°, 180°).

The arbitration rules are divided into the cases shown in Table 1. Large Right Turn, Large Left Turn, Small Right Turn, and Small Left Turn correspond to different turning directions and speeds.
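Table 1 and the interval definitions above can be transcribed directly into a lookup table; the sketch below does so, with a hypothetical angle-wrapping helper and an arbitrary assignment of the open interval boundaries, which the paper leaves unspecified.

```python
# Table 1, keyed by (reledir_fish2ball interval, reledir_ball2goal interval).
ARBITRATION = {
    ("Angle1", "Angle1"): "Large Right Turn", ("Angle1", "Angle2"): "Large Left Turn",
    ("Angle1", "Angle3"): "Small Right Turn", ("Angle1", "Angle4"): "Move Forward",
    ("Angle2", "Angle1"): "Large Right Turn", ("Angle2", "Angle2"): "Large Left Turn",
    ("Angle2", "Angle3"): "Move Forward",     ("Angle2", "Angle4"): "Small Right Turn",
    ("Angle3", "Angle1"): "Small Right Turn", ("Angle3", "Angle2"): "Large Right Turn",
    ("Angle3", "Angle3"): "Large Left Turn",  ("Angle3", "Angle4"): "Move Forward",
    ("Angle4", "Angle1"): "Large Left Turn",  ("Angle4", "Angle2"): "Small Left Turn",
    ("Angle4", "Angle3"): "Move Forward",     ("Angle4", "Angle4"): "Large Right Turn",
}

def classify(angle_deg):
    """Map a relative direction in degrees to its interval name (Angle 1-4)."""
    if 0 < angle_deg <= 90:
        return "Angle1"
    if -90 < angle_deg <= 0:
        return "Angle2"
    if -180 < angle_deg <= -90:
        return "Angle3"
    return "Angle4"   # remaining range, roughly (90, 180]

def arbitrate(dir_fish, dir_fish2ball, dir_ball2goal):
    """Pick an action from Table 1 given the three absolute directions (degrees)."""
    def wrap(a):
        # Normalize an angle difference into [-180, 180).
        return (a + 180.0) % 360.0 - 180.0
    reledir_fish2ball = wrap(dir_fish2ball - dir_fish)
    reledir_ball2goal = wrap(dir_ball2goal - dir_fish)
    return ARBITRATION[(classify(reledir_fish2ball), classify(reledir_ball2goal))]
```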

5 Conclusion and Future Work

This paper introduces an autonomous vision-based underwater robot competition. The competition provides a standard platform for examining and verifying underwater robot technologies, including underwater image processing, underwater localization, path planning, obstacle avoidance, ball-heading, and so on.

The goal of the underwater robot competition, especially the water polo game, is to grow into a standard international robot event. Moreover, researchers all over the world can cooperate and exchange ideas through this competition, and together we can promote underwater robotics and automation technology.

Nowadays, the underwater robot competition has gained unprecedented participation and attention in China. In 2015, the International Underwater Robot Alliance set up a special performance during IROS (the International Conference on Intelligent Robots and Systems) held in Hamburg, Germany.


During the conference, an international invitational underwater robot competition was held. The underwater robot competition has a great international impact and has made significant contributions to cooperation and development in robotics research. We believe that the International Underwater Robot Competition will become a standard international robot competition in the future.

References
[1] Official website of RoboCup, http://www.robocup.org.
[2] W. Zhao, Control and Coordination of Multiple Biomimetic Underwater Robots (Doctoral thesis), Beijing: Peking University, 2009.
[3] W. Wang and G. Xie, CPG-based locomotion controller design for a boxfish-like robot, International Journal of Advanced Robotic Systems, 11(1): 147-169, 2014.
[4] 2013 China robot competition and RoboCup open tournament was rounded off yesterday, http://edu.anhuinews.com/system/2013/10/21/006162639.shtml
[5] P. Kodati and X. Deng, Towards the body shape design of a hydrodynamically stable robotic boxfish, 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems, 2006: 5412-5417.
[6] I. K. Bartol, G. Morteza, P. W. Webb, W. Daniel and M. S. Gordon, Body-induced vortical flows: a common mechanism for self-corrective trimming control in boxfishes, Journal of Experimental Biology, 208(2): 34-36, 2005.
[7] J. A. Walker, Does a rigid body limit maneuverability?, Journal of Experimental Biology, 203(22): 3391-3396, 2000.
[8] W. Wang and G. Xie, Online high-precision probabilistic localization of robotic fish using visual and inertial cues, IEEE Transactions on Industrial Electronics, 62(2): 1113-1124, 2015.
[9] W. Wang, J. Guo, Z. Wang and G. Xie, Neural controller for swimming modes and gait transition on an ostraciiform fish robot, IEEE/ASME International Conference on Advanced Intelligent Mechatronics (AIM), 2013: 1564-1569.
[10] P. W. Webb, Form and function in fish swimming, Scientific American, 251(1): 72-82, 1984.
[11] Y. Hu, W. Zhao and L. Wang, Vision-based target tracking and collision avoidance for two autonomous robotic fish, IEEE Transactions on Industrial Electronics, 56(5): 1401-1410, 2009.
[12] S. Bazeille, I. Quidu, L. Jaulin and J. P. Malkasse, Automatic underwater image pre-processing, in Proceedings of the Caracterisation du Milieu Marin (CMM '06), 2006.
[13] W. Wang and G. Xie, An adaptive and online underwater image processing algorithm implemented on miniature biomimetic robotic fish, World Congress, 19(1): 7598-7603, 2014.
[14] H. P. Tan, R. Diamant, W. K. G. Seah and M. Waldmeyer, A survey of techniques and challenges in underwater localization, Ocean Engineering, 38(14): 1663-1676, 2011.
[15] M. Carreras, P. Ridao, R. Garcia and T. Nicosevici, Vision-based localization of an underwater robot in a structured environment, IEEE International Conference on Robotics and Automation, 2003: 971-976.
[16] B. Browning and M. Veloso, Real-time, adaptive color-based robot vision, IEEE/RSJ International Conference on Intelligent Robots and Systems, 2005: 3871-3876.
[17] P. Zhang, E. E. Milios and J. Gu, Underwater robot localization using artificial visual landmarks, IEEE International Conference on Robotics and Biomimetics, 2004: 705-710.
[18] J. Zhang, W. Wang, G. Xie and H. Shi, Camera-IMU-based underwater localization, 33rd Chinese Control Conference (CCC), 2014: 8589-8594.
[19] T. G. Kim, N. Y. Ko, S. W. Noh and Y. P. Lee, Localization of an underwater robot using Monte Carlo localization algorithm, The Journal of the Korea Institute of Electronic Communication Sciences, 6(2): 288-295, 2011.
[20] Y. Hu, W. Zhao, G. Xie and L. Wang, Development and target following of vision-based autonomous robotic fish, Robotica, 27(7): 1075-1089, 2009.
[21] W. Zhao, Y. Hu, G. Xie, L. Wang and Y. Jia, Development of vision-based autonomous robotic fish and its application in water-polo-attacking task, The 2008 American Control Conference, 2008: 568-573.
[22] J. Tao, F. Kong and G. Xie, Behavior-based motion planning of biomimetic robot-fish, Ordnance Industry Automation, 29(11): 70-73, 2010.
