
Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems

October 9 - 15, 2006, Beijing, China

A Visual Tele-operation System for the Humanoid Robot BHR-02

Lei ZHANG, Qiang HUANG, Yuepin LU, Tao XIAO, Jiapeng YANG, and Muhammadusman KEERIO
Department of Mechatronic Engineering, Beijing Institute of Technology

5 Nandajie, Zhongguancun, Haidian, Beijing, China, 100081
[email protected], [email protected], [email protected], [email protected], [email protected], [email protected]

Abstract - This paper presents a method of reconstructing a virtual scene for humanoid robot teleoperation, to overcome the problem of poor visual image feedback. A virtual robot interface is built to render the data of the real robot. It has the same DOF set and the same size scale as the real robot, and can render multiple kinds of real-time feedback data from the robot. In the data-fusion module, an algorithm is adopted to determine the position and attitude of the robot body. Experiments confirm the effectiveness of the virtual scene.

Index Terms - Humanoid robot, Teleoperation, Data-fusion, Real-time monitor.

I. INTRODUCTION

A humanoid robot, with its human-like shape (two legs, two arms, one head), adapts easily to human working environments. Since a humanoid robot has a mechanism and motions similar to those of the human operator, its shape is among the best for remotely controlled robots, and it is expected to operate in hazardous and emergency environments.

To improve the efficiency of humanoid robot teleoperation, it is desirable to provide the operator with a complete scene of the robot and its worksite. One approach is to feed back real video images [1]: cameras are mounted on the remote robot, and some status data are added to the image display in multiple formats, as numbers or characters. This type of interface is easy to develop. But in environments where the cameras cannot capture clear images, for example one full of smoke, the interface cannot give the operator enough knowledge of the robot and its worksite to complete the task. The operator must also be well trained to use the interface skillfully [2]-[3].

Another approach is to use VR technology [4]. Monferrer et al. [5] developed two interfaces using the virtual reality desktop as a data visualization enhancement tool, one focusing on information and the other on controllability. Belousov et al. [6] developed a virtual control environment for robot teleoperation via the Internet, comprising a Java3D-based real-time virtual representation of the robot and worksite. Qingping Lin et al. [7] provided a virtual telepresence interface that gives the operator better visualization of teleoperation information, i.e. a continuously available 3D graphics display of the robot's location and its surrounding environment.

However, only a few researchers have studied humanoid robot teleoperation. Some teleoperation systems displayed the real images captured by the cameras on the robot [8][9][10]; these are unsuitable for smoky or misty environments. Others focused on building virtual models of the robot and rendering their configuration [11][12]. In the teleoperation system for the humanoid robot HRP-1S [11], a robot state display system provided a 3D simulation of the current configuration of the robot by rendering all the robot joint angle data. Satoshi Kagami et al. [12] built a virtual puppet interface for the humanoid robot H6 that could render the internal state of the robot. But neither system rendered the robot's external data, such as its location and attitude relative to the worksite.

This paper explores a method of reconstructing the virtual scene of a humanoid robot by combining multiple kinds of real humanoid robot state data with virtual models. The method is to use 3D animation software to build the model of the robot; the virtual scene fuses the multiple kinds of real feedback data from the robot and renders the result onto the virtual model. The paper is organized as follows. Section 2 describes the developed teleoperation system for the humanoid robot BHR-02. Section 3 presents the method of reconstructing the virtual interface. Section 4 describes the implementation of the virtual interface. The experiment and the conclusion are given in sections 5 and 6 respectively.

II. OVERVIEW OF THE TELEOPERATION SYSTEM

The humanoid robot "BHR-02" (Fig. 1) consists of a head,two arms and legs, and has total 32 DOF (Degrees ofFreedom). The height is 160cm, the weight is 63kg.

BHR-02 has stereo cameras, a stereo microphone, and speakers in the head, torque/force sensors at the wrists and ankles, and acceleration and gyro sensors at the trunk. Two computers are built into the robot body: one for motion control, the other for information processing (such as image processing and object characteristic identification) and for transferring data with the remote


Fig. 1 The humanoid robot BHR-02

cockpit. The two computers are connected by a shared mass memory called memolink, a high-speed communication device that requires no handshaking while exchanging data. When BHR-02 receives instructions from the remote cockpit via wireless LAN, acts independently according to its vision, or perceives other kinds of external information (for example, tracking a moving object with its head), the first step is that the information-processing computer processes this information and writes the results into memolink. The second step is that the motion control computer reads the data from memolink, then calculates and generates the motion trajectory values used to control the corresponding DC motors. The control system of BHR-02 is a real-time position control system based on the RT-Linux operating system. BHR-02 can now walk, squat, wheel, and perform shadowboxing, among other motions. Using BHR-02 as an experimental platform, we have also done research on bipedal dynamic walking, 3D vision, motion planning, teleoperation, and other subprojects [13]-[14].

To control the humanoid robot BHR-02, we have developed a remote cockpit [15]-[16] whose system architecture is a client/server mode based on an 802.11g wireless LAN. Data transfer between the remote cockpit and BHR-02 is bilateral. In one direction, the operator manipulates the multiple input devices (keyboard, mouse, master arm and hand); these actions are transformed into instructions by the computer and ultimately transferred to BHR-02 via the wireless LAN. In the other direction, the operator receives feedback information, such as data from BHR-02's body sensors and video from the robot vision system.

A humanoid has many moving parts, such as legs, arms, head, and hands. To control a humanoid robot with only one input device, the operator must repeatedly switch control methods to control different parts of the robot with the same device, which is very tiring over long working periods. In our teleoperation system we therefore use multiple input devices to control the robot, including the keyboard, mouse, and master arm. When controlling different parts of the robot, the operator can conveniently select different operational modes to input the instructions.

Fig. 2 shows the working map of the remote system. With its help, the operator can command BHR-02 to walk to a target position and to work with its upper limbs.

It is important for the operator to watch the overall scene of the workspace. There are four kinds of feedback in the current system: body sensor data of the robot, feedback from the robot vision system, the real scene of the overall workspace, and the virtual scene monitoring system.

Body sensor data are detected by the sensors mounted on the robot: joint angle data, force data, and orientation data. All of these are characteristic data of the robot status. They are sent back to the platform and displayed as text on the interface. It is difficult for the operator to judge the robot's motion state from these data as text alone, without images of the robot.

Fig. 2 Working sketch map of the remote cockpit

The robot vision image feedback and the real scene of the overall worksite are real image feedback, displayed as video on the workstation. This feedback has the drawback mentioned above: for example, it is not usable in an environment full of smoke.

The virtual scene monitoring system is a special system based on the motion capture system. It can render the virtual scene, but it can only use data from the motion capture system; other data, for example the internal sensor data of the robot body, cannot be rendered in the interface. Furthermore, when this system works, more than 30 markers must be attached to the robot body, which is inconvenient.


Fig. 3 Structure of the virtual scene for BHR-02

To monitor the robot, we build a virtual scene from virtual models that can render the multiple kinds of feedback data from the robot as 3D images and express the real state of the robot.

III. RECONSTRUCTION OF THE ROBOT

The basic functions that we want the virtual scene to offer the operator are real-time monitoring of the robot and convenient changes of viewpoint and zoom scale.

Real-time monitoring is the most important function of the virtual scene. By rendering the real-time feedback from the robot, the virtual scene expresses the real situation of the robot and part of its environment instead of the video picture from a real camera. Because the viewpoint and zoom can be changed easily, the operator can obtain detailed information about the robot.

As the humanoid robot has more than 30 DOF, to render it correctly we need the exact state of the robot, which is sent back from the robot. In the current system the following data are rendered:

A. Robot Status Data from the Body Sensors

While the robot executes an order, the sensors detect the joint angle values in every control cycle. Through the teleoperation platform, the joint angle data are sent back to the operator in real time.

B. Robot Position and Attitude Data

From the motion capture system we obtain the robot motion data, expressed as the coordinates of markers attached to the body. In previous work we used the motion data to render the whole robot scene, but this had the drawback of requiring more than 30 markers to be captured, which demands a high-grade test environment. In fact, from the motion data of three or more markers attached to the rigid body, the position and attitude of the whole robot body can be calculated. The algorithm is described in the next section.

The main structure of the virtual scene is shown in Fig. 3. The real-time body sensor data and the motion data are transferred to the teleoperation platform. The real-time data-fusion module processes these feedbacks into integrated data that the 3D interface can render; the virtual model finally renders them as an animation. When fusing the feedback data, the strategy is that the robot body sensor data are rendered directly, while the motion data of the markers are first used to calculate the position and attitude of the robot body and then rendered.

IV. IMPLEMENTATION OF THE VIRTUAL SCENE

Maya [17] was chosen for this work. Maya is 3D animation software widely used in animation production. It provides modeling and rendering functions: after the surface and the skeleton of a character are built, it can render any motion of the virtual character. It also provides an interface for adding data-processing and other functions.

The whole 3D computer graphics user interface for rendering the robot can be divided into four parts.

A. Virtual Models of the Robot

The humanoid robot BHR-02 has 32 DOF, so we build a virtual skeleton model that matches the robot, with a DOF between adjacent skeleton segments. As shown in Fig. 4, the DOF setup is the same as that of the real robot. In Maya the value of each DOF can be changed, so the model can display the relative movement of two adjacent skeleton segments. There is a root skeleton that accepts the position and attitude data of the body itself.

After building the skeleton system, the surface of the robot is built with the Maya tools and attached to the skeleton, completing the whole robot model (Fig. 5).
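As a rough illustration of how such a joint hierarchy can be set up with Maya's Python API, the following snippet builds a toy chain whose root joint takes position and attitude directly; all joint names and coordinates here are invented for the example and are not the BHR-02 model itself.

```python
# Toy sketch of a Maya skeleton hierarchy (Python API); the joint
# names and coordinates are illustrative, not the real BHR-02 model.
import maya.cmds as cmds

cmds.select(clear=True)
root = cmds.joint(name="body_root", position=(0, 100, 0))  # root skeleton
hip = cmds.joint(name="hip", position=(0, 90, 0))          # child of root
knee = cmds.joint(name="knee", position=(0, 50, 0))
ankle = cmds.joint(name="ankle", position=(0, 10, 0))

# The root accepts the body position and attitude data directly...
cmds.setAttr(root + ".translate", 5.0, 100.0, 0.0, type="double3")
cmds.setAttr(root + ".rotate", 0.0, 30.0, 0.0, type="double3")
# ...while every other DOF is driven by one feedback joint angle.
cmds.setAttr(knee + ".rotateX", 25.0)
```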


B. Data Matching Module

After the robot model is built, attributes must be added to accept the feedback values. To render the robot motion, the model must be driven by the motion data of the robot. Maya allows users to develop their own data-processing plug-ins. We built a plug-in for Maya that obtains the motion data from a data file that is continuously updated by the feedback module of the teleoperation platform. In the plug-in, the motion data are assigned to the attributes of the model joints.
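The plug-in itself is not listed in the paper; as a minimal sketch of the data-matching idea in Maya's Python API, assuming a hypothetical whitespace-separated feedback file and illustrative joint attribute names:

```python
# Minimal sketch of the data-matching step (the file path, joint
# names, and file layout are assumptions; the actual BHR-02 plug-in
# is not shown in the paper).
import maya.cmds as cmds

FEEDBACK_FILE = "C:/teleop/joint_feedback.txt"   # hypothetical path
JOINT_ATTRS = ["hip.rotateY", "knee.rotateX", "ankle.rotateX"]

def apply_latest_frame():
    """Read the newest line of joint angles and drive the model."""
    with open(FEEDBACK_FILE) as f:
        lines = [ln for ln in f.read().splitlines() if ln.strip()]
    if not lines:
        return
    angles = [float(v) for v in lines[-1].split()]
    for attr, angle in zip(JOINT_ATTRS, angles):
        cmds.setAttr(attr, angle)   # update one DOF of the skeleton

# Re-evaluate whenever Maya is idle, approximating real-time updates.
cmds.scriptJob(idleEvent=apply_latest_frame)
```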

Fig. 4 Kinematics and skeleton model of BHR-02

Fig. 5 The complete virtual model

C. Data-Fusion Module

Two kinds of data can be used to render the state of the robot.

1) Real-time joint angle data. There are more than 30 DOF in the robot. When the robot executes an order, the body sensor data are fed back to the platform and can be used directly in the virtual scene.

2) Real-time position and attitude data of the robot body. The motion capture system provides the coordinate data of the markers on the robot body. As the body of the robot is rigid, the position and attitude of the robot body can be calculated from the coordinates of three markers attached to it. The algorithm is described as follows.

Let P1, P2, P3 be the markers attached to the robot body. The position vector of P_i at time t = t_1 is denoted by a_i, and at time t = t_2 by p_i.

We define the two centroids \bar{a} and \bar{p}:

\bar{a} = \frac{1}{3}\sum_{i=1}^{3} a_i \qquad (1)

\bar{p} = \frac{1}{3}\sum_{i=1}^{3} p_i \qquad (2)

As the robot body is rigid, the motion from the original pose to the final pose can be described as the sum of a translation of the centroid, characterized by the translation vector r, and a rotation about it, characterized by the rotation matrix

R = \begin{bmatrix} r_{11} & r_{12} & r_{13} \\ r_{21} & r_{22} & r_{23} \\ r_{31} & r_{32} & r_{33} \end{bmatrix} \qquad (3)

The rotation matrix R satisfies:

p_i = \bar{a} + r + R(a_i - \bar{a}) \qquad (4)

Ref. [18] proved that the vector r and the matrix R can be calculated as follows. First, define the three matrices

A = \frac{1}{3}\sum_{i=1}^{3} (a_i - \bar{a})(a_i - \bar{a})^T \qquad (5)

P = \frac{1}{3}\sum_{i=1}^{3} (p_i - \bar{p})(p_i - \bar{p})^T \qquad (6)

G = \frac{1}{3}\sum_{i=1}^{3} (p_i - \bar{p})(a_i - \bar{a})^T \qquad (7)

Here the superscript T denotes transposition. Second, two intermediate variables \beta_1 and \beta_2 are calculated from

\beta_1^2 = g_1 + 2\beta_2, \qquad \beta_2^2 = g_2 + 2\beta_1 g_3 \qquad (8)

g_1 = \operatorname{tr}(G^T G), \qquad g_2 = \operatorname{tr}\big((G^T G)^a\big), \qquad g_3 = \det(G) \qquad (9)

Here the superscript a denotes the adjoint of a matrix, and tr denotes the trace.


The rotation matrix R of the rigid body can then be calculated as

R = (G^a + \beta_1 G)\,C^{-1} \qquad (10)

where the matrix C is

C = G^T G + \beta_2 I \qquad (11)

From the rotation matrix R, the rotation angles about the three axes can be calculated as

\beta = \operatorname{Atan2}\!\left(-r_{31},\, \sqrt{r_{11}^2 + r_{21}^2}\right), \qquad \alpha = \operatorname{Atan2}(r_{21}, r_{11}), \qquad \gamma = \operatorname{Atan2}(r_{32}, r_{33}) \qquad (12)

The angles \alpha, \beta, \gamma are the rotation angles of the robot body around the three axes, and the position of the robot body can then be calculated from (4).
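As a concrete rendering of equations (1)-(12), the following NumPy sketch computes R, r, and the rotation angles from marker coordinates. The coupled equations (8) are solved here by a simple fixed-point iteration (Ref. [18] solves them iteratively; this particular scheme, like the function names, is an assumption for illustration), and the adjoint is implemented as the cofactor matrix.

```python
# Sketch of the marker-based pose computation of eqs. (1)-(12).
# Function names and the fixed-point solver are illustrative choices.
import numpy as np

def cof(M):
    """Cofactor matrix (superscript 'a' above): row i is the cross
    product of the other two rows of the 3x3 matrix M."""
    return np.stack([np.cross(M[1], M[2]),
                     np.cross(M[2], M[0]),
                     np.cross(M[0], M[1])])

def body_pose(a, p, iters=20):
    """Rotation R and translation r from (n, 3) marker positions a
    (time t1) and p (time t2), n >= 3, markers not collinear."""
    a_bar, p_bar = a.mean(axis=0), p.mean(axis=0)      # eqs. (1), (2)
    da, dp = a - a_bar, p - p_bar
    G = dp.T @ da / len(a)                             # eq. (7)

    GtG = G.T @ G
    g1 = np.trace(GtG)                                 # eq. (9)
    g2 = np.trace(cof(GtG))
    g3 = np.linalg.det(G)

    b1, b2 = np.sqrt(g1), 0.0                          # eq. (8), solved
    for _ in range(iters):                             # by fixed point
        b2 = np.sqrt(max(g2 + 2.0 * b1 * g3, 0.0))
        b1 = np.sqrt(max(g1 + 2.0 * b2, 0.0))

    C = GtG + b2 * np.eye(3)                           # eq. (11)
    R = (cof(G) + b1 * G) @ np.linalg.inv(C)           # eq. (10)
    r = p_bar - a_bar     # averaging eq. (4) gives p_bar = a_bar + r
    return R, r

def rotation_angles(R):
    """Rotation angles of eq. (12)."""
    beta = np.arctan2(-R[2, 0], np.hypot(R[0, 0], R[1, 0]))
    alpha = np.arctan2(R[1, 0], R[0, 0])
    gamma = np.arctan2(R[2, 1], R[2, 2])
    return alpha, beta, gamma
```

With exactly three markers, g_3 = 0 and the iteration settles immediately (\beta_2 = \sqrt{g_2}); with more markers and noisy data it refines \beta_1 and \beta_2 jointly.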

D. Data Communication Module

As shown in Fig. 3, there are three communication modules, all developed on the TCP/IP protocol.

Robot status data are sent from the robot to the teleoperation platform. The robot control computer provides the real-time data of every joint in every control cycle. The data are sent to the communication computer on the robot, which transmits them to the platform. The client unit runs on the platform, and the server unit runs on the robot communication system.

Real-time coordinates of the markers are sent from the motion capture system to the teleoperation platform. While the robot is working, the motion capture system provides more than 60 frames of marker position data per second.

Fused data are transferred from the data-fusion module to the virtual interface. When the fusion of the multiple data is finished, the data to be rendered in the virtual scene are sent to the graphics workstation. Here the client is the workstation and the server is the platform.
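For illustration, a stripped-down sketch of one such TCP link (robot joint data to the platform) is shown below; the port number, frame layout, and function names are assumptions, not the system's actual protocol.

```python
# Sketch of one TCP/IP link (robot joint data -> platform); the port
# and the packed-doubles frame format are illustrative assumptions.
import socket
import struct

PORT = 9050                        # hypothetical port
N_JOINTS = 32                      # BHR-02 DOF count
FMT = "%dd" % N_JOINTS             # one frame = 32 doubles
FRAME = struct.calcsize(FMT)

def serve_joint_angles(get_angles):
    """Server side (on the robot): send one frame per control cycle."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind(("", PORT))
    srv.listen(1)
    conn, _ = srv.accept()
    while True:
        conn.sendall(struct.pack(FMT, *get_angles()))

def read_joint_angles(conn):
    """Client side (on the platform): receive one full frame."""
    buf = b""
    while len(buf) < FRAME:
        chunk = conn.recv(FRAME - len(buf))
        if not chunk:
            raise ConnectionError("link closed")
        buf += chunk
    return struct.unpack(FMT, buf)
```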

V. EXPERIMENT

We examined the performance of the developed teleoperation system to see whether the interface could display the virtual robot well.

Fig. 6(a) shows a real picture of the moving robot under the control of the remote operator. Figs. 6(b), 6(c), and 6(d) show the virtual robot from the front, side, and back viewpoints respectively. The pictures indicate that the operator can change the viewpoint and the scaling easily; it can be seen that the robot model in Fig. 6(c) is smaller than in Figs. 6(b) and 6(d).

(a) Real robot motion

(b) Front viewpoint

(c) Side viewpoint


(d) Back viewpoint

Fig. 6 Virtual robot interface in the teleoperation experiment

VI. CONCLUSION

To remotely control the humanoid robot BHR-02, we designed a visual tele-operation system to display the status of the robot. The system has the following characteristics:

(1) It renders multiple kinds of robot data as a 3D virtual scene. The data include not only the motion data from the motion capture system, but also the real-time status data fed back from the robot.

(2) Multiple kinds of data can be fused. In particular, the position and attitude of the robot body are calculated from its real-time three-dimensional motion data.

(3) The operator can change the viewpoint and the zoom scale freely, and can therefore observe the details of the models.

(4) A teleoperation experiment using the BHR-02 humanoid robot is provided.

ACKNOWLEDGMENT

This work was partially supported by the National High Technology Research and Development Program and the National Natural Science Foundation of China.

REFERENCES

[1] P. Fiorini, A. Bejczy, and P. Schenker, "Integrated Interface for Advanced Teleoperation", IEEE Control Systems, Vol. 13, No. 5, Oct. 1993, pp. 15-19.

[2] V. Rigaud, E. Coste-Maniere, M. J. Aldon, P. Probet, J. Amat, A. Casals, et al., "Union: Underwater Intelligent Operation and Navigator", IEEE Robotics & Automation Magazine, Vol. 5, No. 1, Mar. 1998, pp. 25-35.

[3] K. Stanney, R. Mourant, and R. Kennedy, "Human Factors Issues in Virtual Environments: A Review of the Literature", Presence: Teleoperators & Virtual Environments, MIT, Vol. 7, No. 4, Aug. 1998, pp. 327-351.

[4] G. Burdea, "Invited Review: The Synergy between Virtual Reality and Robotics", IEEE Transactions on Robotics and Automation, Vol. 15, No. 3, June 1999, pp. 400-410.

[5] A. Monferrer and D. Bonyuet, "Cooperative Robot Teleoperation through Virtual Reality Interfaces", Proceedings of the Sixth International Conference on Information Visualisation (IV'02), 2002.

[6] I. R. Belousov, R. Chellali, and G. J. Clapworthy, "Virtual Reality Tools for Internet Robotics", Proceedings of the 2001 IEEE International Conference on Robotics & Automation, Seoul, May 21-26, 2001, pp. 1878-1883.

[7] Q. Lin and C. Kuo, "Virtual Tele-Operation of Underwater Robots", Proceedings of the 1997 IEEE International Conference on Robotics and Automation, Albuquerque, April 1997, pp. 1022-1027.

[8] H. Hasunuma, M. Kobayashi, H. Moriyama, T. Itoko, Y. Yanagihara, T. Ueno, K. Ohya, and K. Yokoi, "A Tele-operated Humanoid Robot Drives a Lift Truck", Proceedings of the 2002 IEEE International Conference on Robotics & Automation, May 2002, pp. 2246-2252.

[9] H. Hasunuma and K. Nakashima, "The Tele-operation of the Humanoid Robot: Workspace Extension of the Arm with Step Motion", Proceedings of the 5th IEEE-RAS International Conference on Humanoid Robots, 2005, pp. 245-252.

[10] S. Chang, J. Kim, I. Kim, J. H. Borm, C. Lee, and J. O. Park, "KIST Teleoperation System for Humanoid Robot", Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1198-1203.

[11] N. E. Sian, K. Yokoi, S. Kajita, F. Kanehiro, and K. Tanie, "Whole Body Teleoperation of a Humanoid Robot: Development of a Simple Master Device using Joysticks", Proceedings of the 2002 IEEE/RSJ International Conference on Intelligent Robots and Systems, EPFL, October 2002, pp. 2569-2574.

[12] S. Kagami, J. J. Kuffner Jr., K. Nishiwaki, T. Sugihara, T. Michikata, T. Aoyama, M. Inaba, and H. Inoue, "Design and Implementation of Remotely Operation Interface for Humanoid Robot", Proceedings of the 2001 IEEE International Conference on Robotics & Automation, May 21-26, 2001, pp. 401-406.

[13] Q. Huang, K. Li, and T. Wang, "Control and Mechanical Design of Humanoid Robot BHR-01", Proceedings of the Third IARP International Workshop on Humanoid and Human Friendly Robotics, Dec. 2002, pp. 10-13.

[14] Q. Huang, K. Yokoi, S. Kajita, K. Kaneko, H. Arai, N. Koyachi, and K. Tanie, "Planning Walking Patterns for a Biped Robot", IEEE Transactions on Robotics and Automation, Vol. 17, No. 3, June 2001, pp. 280-289.

[15] L. Zhang, Q. Huang, Q. Liu, T. Liu, Y. Lu, and D. Li, "A Teleoperation System for a Humanoid Robot with Multiple Information Feedback and Operational Modes", Proceedings of the 2005 IEEE International Conference on Robotics and Biomimetics, June 29 - July 3, 2005, pp. 290-294.

[16] Q. Liu, Q. Huang, W. Zhang, X. Wang, and C. Wu, "Manipulation of a Humanoid Robot by Teleoperation", Proceedings of the 5th World Congress on Intelligent Control and Automation, June 15-19, 2004, pp. 4894-4898.

[17] D. A. D. Gould, Complete Maya Programming: An Extensive Guide to MEL and the C++ API, San Francisco, CA: Morgan Kaufmann, 2003.

[18] F. E. Veldpaus, H. J. Woltring, and L. J. M. G. Dortmans, "A Least-squares Algorithm for the Equiform Transformation from Spatial Marker Co-ordinates", Journal of Biomechanics, Vol. 21, No. 1, 1988, pp. 45-55.
