Exploiting Ungrounded Tactile Haptic Displays for Mobile Robotic Teleoperation
B. Horan, Z. Najdovski and S. Nahavandi
Abstract—Teleoperated mobile robots offer potential use in a variety of real-world applications including hazardous materials handling, Urban Search and Rescue, and explosive ordnance handling and disposal. Recent research discusses the use of Haptic technology to increase task immersion and teleoperator performance. This work investigates the utility of low-cost, ungrounded tactile haptic interfaces in mobile robotic teleoperation. Achieving the desired implementation using only tactile sensation presents distinct challenges. Innovative Haptic control methodologies providing the teleoperator with intuitive motion control and task-relevant haptic augmentation are presented within this paper.
I. INTRODUCTION
TELEOPERATED mobile robotics provides a valuable solution for a variety of tasks such as hazardous materials handling [1], Urban Search and Rescue [2] and explosive ordnance disposal [3,4]. At the other end of the spectrum, fully autonomous robots provide feasible solutions for structured, repetitive tasks. The teleoperated approach to mobile robotics introduces many desirable attributes to the control of a remote robotic system. Haptic technology provides the capability to interact with the teleoperator's tactual modality with the aim of improving operator immersion and task performance. The integration of Haptic technology in the teleoperation of a mobile robot is discussed in [5-15].
This paper, however, investigates ungrounded tactile Haptic displays for achieving Haptic teleoperative control of a mobile robot. Ungrounded tactile-Haptic devices are unable to display actual forces to the operator, and therefore present distinct challenges to the human-robotic interaction. These devices can only provide tactile sensation; however, they represent a simple, cost-efficient technology, and as such an investigation of their capabilities in mobile robotic teleoperation is warranted. Whilst the absence of force rendering is a disadvantage, these devices do in fact provide a
Manuscript received January 31, 2008.
B. Horan is with the Intelligent Systems Research Lab, Deakin University, Australia (email: [email protected]).
Z. Najdovski is with the Intelligent Systems Research Lab, Deakin University, Australia (email: [email protected]).
S. Nahavandi is with the Intelligent Systems Research Lab, Deakin University, Australia (email: [email protected]).
theoretically unlimited workspace, unlike a traditional grounded Haptic interface.
The iFeel™ mouse from Logitech [16] is a low-cost, off-the-shelf tactile haptic interface and sets the focus of this work. The capabilities of this type of Haptic interface are investigated and suitable control methodologies introduced. Ultimately, simulation results demonstrate the applicability of this type of device in the Haptic teleoperative control of a mobile robot.
II. TACTILE-HAPTIC INTERFACES
The tactile-Haptic interface discussed in this work utilises vibration as its basis of operation. As this type of device is ungrounded, it is incapable of exerting forces on the teleoperator; however, it is advantageous in that it is not subject to the same workspace restrictions as grounded devices. In order to achieve the adequate human-robotic interaction required for effective Haptic teleoperative control, this work investigates the capabilities of such devices and presents suitable methodologies attempting to overcome their limitations. The iFeel™ mouse by Logitech is investigated as representative of tactile-Haptic interfaces. Its capabilities are determined in order to develop an appropriate Haptic control methodology for mobile robotic teleoperation.
Firstly, it is identified that ungrounded Haptic displays are not subject to the same space constraints as traditional Haptic devices. The workspace of the iFeel™ haptic mouse is theoretically unbounded (except for its tether) given a suitable supporting planar surface. Secondly, in order to achieve a suitable control methodology, the Haptic rendering capabilities of the device need to be investigated. As aforementioned, this particular device relies on vibration, which provides the basis to render tactile sensations without the need to exert forces on the operator as in [8]. With the aim of achieving the desired teleoperative control, the relevant tactile effects can be classified as spatial and temporal effects, as discussed below.
A. Spatial Effects
The spatial effects able to be rendered by this device correspond to the x, y displacement of the device across the planar operating surface. The texture and grid effects were identified as relevant to this work; they are created by displaying vibration in response to spatial variance, as shown in Figure 1.
Authorized licensed use limited to: DEAKIN UNIVERSITY LIBRARY. Downloaded on December 8, 2009 at 17:19 from IEEE Xplore. Restrictions apply.
Figure 1. Rendering spatial effects (texture and grid) with the tactile-Haptic device [17]
III. TELEOPERATION CONTROL ARCHITECTURE
The human-in-the-loop approach to the control of a remote robotic system provides the capability to overcome several limitations by introducing several desirable human attributes. Such attributes include adaptability to a diverse range of situations, a relatively high level of intelligence, advanced sensory capabilities, as well as human judgment and intuition. Many critical real-world applications, such as Urban Search and Rescue and other rescue missions, require such attributes in order to achieve successful task execution. Therefore, having identified the suitability of teleoperated robotic systems to a range of applications, this work presents the tactile-Haptic approach to the teleoperation of all-terrain mobile robots.
A literature review identifies two fundamentally conflicting approaches to the teleoperation of a remote robotic system. In the first approach, the human operator controls the remote system in a shared autonomy scenario where the robot has the capability to influence its own actions. This scenario may arise in situations where the robot's autonomy may be considered equal to or more valuable than that of the teleoperator, and is depicted in Figure 3. Here the teleoperator does not have absolute control of the robotic system, which will inevitably result in conflict between the intent of the operator and that of the robot.
Identification of the above-discussed effects forms the basis for development of the following Haptic control methodology.
where α is the vibration, β is the frequency of the vibration and t is time.
B. Temporal Effects
The temporal effects achievable by this device correspond to a relationship between magnitude and time.
where ε_distance represents the actual distance between rendered vibrations and φ is the direction of movement.
It is acknowledged that other spatial effects may be achievable; however, for the purposes of this work the above two are considered.
Figure 2. Rendering temporal effects [17]
The periodic and pulse effects were identified as relevant to this work and are created by varying the frequency of vibration as a function of time, as demonstrated in Figure 2.
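The two temporal profiles of (3) and (4) can be sketched numerically. The gating bounds τ and ρ for the pulse come from a partially garbled source equation, so their use (and all constant values) here should be read as illustrative assumptions:

```python
import math

def periodic(t, amplitude=1.0, beta=2.0):
    """Periodic effect: alpha = A*sin(2*pi*beta*t), per (3)."""
    return amplitude * math.sin(2.0 * math.pi * beta * t)

def pulse(t, amplitude=1.0, beta=2.0, tau=0.25, rho=0.25):
    """Pulse effect: the periodic vibration gated off during (tau, tau+rho),
    following the partially garbled equation (4); tau and rho are
    illustrative assumptions."""
    if tau < t < tau + rho:
        return 0.0
    return periodic(t, amplitude, beta)

# Sample both effects over half a second of rendering time:
print([round(periodic(k * 0.125), 3) for k in range(5)])
print([round(pulse(k * 0.125), 3) for k in range(5)])
```

The pulse is simply the periodic effect with a silent window, which matches the zero branch recoverable from the source equation.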
Given Figure 1, the distance between consecutive rendered vibrations for the texture effect is given by

ε_distance = (x_{n+1} − x_n) / cos(φ)    (1)

and for the grid

ε_distance = tan(φ) · (x_{n+1} − x_n) / (y_{n+1} − y_n)    (2)
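A minimal helper for the texture spacing of (1); the cos(φ) form is a reconstruction of a garbled source equation, so treat the exact formula, the function name and the sample values as assumptions:

```python
import math

def texture_spacing(x_next, x_prev, phi):
    """Distance between consecutive rendered vibrations for the texture
    effect: eps_distance = (x_{n+1} - x_n) / cos(phi), where phi is the
    direction of movement across the planar surface."""
    return (x_next - x_prev) / math.cos(phi)

# Crossing vertical texture lines 5 mm apart while moving at 60 degrees
# to the x-axis stretches the spacing along the path (to 10 mm here):
print(round(texture_spacing(10.0, 5.0, math.radians(60)), 6))
```

Moving parallel to the lines (φ → 90°) makes the spacing grow without bound, consistent with the operator never crossing a texture line.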
Given Figure 2, these vibrations are a function of time. The periodic vibration is given by

α = A·sin(2·π·β·t)    (3)

and the pulse vibration by

α = A·sin(2·π·β·t),  t ≤ τ or t ≥ τ + ρ
α = 0,  τ < t < τ + ρ    (4)
Figure 3. Shared autonomy control scenario [8]
The conflicting approach to teleoperation, supported by this research, is that of absolute human control. In this approach, the human operator's capabilities are acknowledged as superior and, as such, all actions are dictated to the robotic system. In this scenario the robot's intelligence may be used to augment the teleoperator's control process [8]. Indication of the robot's intent can then provide suggestions or cues to
the operator rather than directly intervening in the control process. This ensures that the teleoperator remains in absolute control, while still utilising the capabilities of the robot's computational intelligence. This scenario eliminates conflict of control as the teleoperator has total control of all of the robot's actions. This situation is depicted in Figure 4.
As the teleoperator moves the device from P1 to P2 in Figure 5, the corresponding linear and angular velocities are commanded to the robot. The speed of the operator's motion determines the magnitude of the commanded velocities, as per (5) and (6). It is important to note that in this particular approach the commanded velocities are applied open loop; the robot may take some time to achieve the desired velocities, and the teleoperator needs to compensate accordingly.
The absolute human control approach is supported by the control methodologies presented below. This is achieved by decoupling the motion control and application-specific Haptic augmentation components of the presented control methodologies. The teleoperator's motion control is not directly affected by any Haptic augmentation; the teleoperator can determine whether or not to react haptically to the robot's display of information, as in Figure 4. The following sections explain the two components of the tactile-Haptic approach to the teleoperation of a mobile robot.
Figure 4. Absolute human control [8]
Figure 5. Haptic Motion Control
Given Figure 5, the mapping between movement of the Haptic device and the linear velocity is given by

v = λ1·(δ_{t+1} − δ_t) / dt    (5)

and the angular velocity by

ω = λ2·(θ_{t+1} − θ_t) / dt    (6)
where dt is chosen suitably, λ1 and λ2 are constants of proportionality, and δ and θ denote the linear and angular displacement of the device.
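The sampled-difference mapping of (5) and (6) might be sketched as follows; the function and parameter names, and the default constants, are illustrative assumptions rather than the paper's implementation:

```python
def commanded_velocities(pos_prev, pos_next, dt, lam1=1.0, lam2=1.0):
    """Map sampled device motion to rover commands:
    v = lam1*(d1 - d0)/dt per (5), w = lam2*(th1 - th0)/dt per (6).
    pos_* are (linear displacement, heading) samples of the haptic device."""
    d0, th0 = pos_prev
    d1, th1 = pos_next
    v = lam1 * (d1 - d0) / dt    # linear velocity command, eq. (5)
    w = lam2 * (th1 - th0) / dt  # angular velocity command, eq. (6)
    return v, w

# Device moved 4 mm forward and rotated 0.02 rad over a 10 ms sample:
print(commanded_velocities((0.0, 0.0), (0.004, 0.02), 0.01))
# Holding the device still commands zero velocity, as described in the text:
print(commanded_velocities((0.1, 0.5), (0.1, 0.5), 0.01))
```

The second call illustrates the stated advantage of continuous-motion control: stopping the device immediately yields a zero motion command.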
Given the motion control methodology presented above, typical Haptic device movements and corresponding rover velocities are presented in Figure 6.
IV. MOTION CONTROL
The motion control component of the Haptic control methodology is responsible for providing a mechanism for the teleoperator to issue motion commands to the robot. In order to achieve the desired approach to motion control of the mobile robot, the theoretically unlimited workspace of the device is utilised. Rather than mapping the displacement of the Haptic device to corresponding rover velocities as in [9], the presented approach utilises continuous teleoperator motion in order to command the motion of the mobile robot.
The distinct advantage of this approach is that the teleoperator is always aware of the velocities being commanded to the robot, given that they are providing continuous motion. The teleoperator can also easily provide a zero motion command by stopping their motion of the Haptic device. The ungrounded nature of this type of device allows this to be achieved. There are, of course, some limitations: for a long continuous motion the teleoperator may need to reset the position of the device. In practice, lifting the device and replacing it in a logical position achieves this. While this may prove a limitation, the approach does allow the operator to intuitively infer the velocities being commanded to the robot at any particular time.
The mapping between the motion of the tactile-Haptic device and the motion of the mobile robot is depicted by Figure 5.
where k3 scales to a suitable frequency and is chosen according to the appropriate magnitude of the tactile effect. It is also necessary to limit the maximum allowable range so that the HGF does not affect the robot from every possible position in space. A suitable ρ_max can be chosen empirically.
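One possible reading of the (9) scaling with the range limit described above, as a sketch: the cut-off interpretation of the limit, the parameter name `rho_cutoff`, and both constants are assumptions.

```python
def render_frequency(rho, k3=5.0, rho_cutoff=0.05):
    """Tactile frequency beta = k3 * rho, per (9). Field strengths below
    rho_cutoff are treated as out of range, so the HGF does not affect the
    robot from every position in space; this cut-off reading of the paper's
    range limit, and both constant values, are assumptions."""
    return k3 * rho if rho >= rho_cutoff else 0.0

print(render_frequency(1.0))   # inside the field: scaled frequency
print(render_frequency(0.01))  # beyond the effective range: no vibration
```

The cut-off keeps distant objects of interest silent, so only nearby obstacles or goals produce a perceptible change in vibration.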
Figure 7. HGF in Obstacle Detection
In order to inform the teleoperator of the HGF and the corresponding object of interest (in this case an obstacle), a periodic tactile effect is utilised. Varying the frequency of the periodic effect can provide the teleoperator with an indication of the existence of the obstacle. As the mobile robot enters the Haptic Gravitational Field, as depicted by Figure 7, the frequency of the periodic effect is increased by a factor of ρ according to (7).
Figure 6. Haptic device - robot motions
In addition to monitoring the movement of the tactile-Haptic device, a grid-texture is used to provide the teleoperator with an intuitive spatial indication of the speed and direction in which they are manipulating the Haptic device. The magnitude of the texture-grid vibration is chosen to be far less than that of the temporal Haptic augmentation so that the teleoperator can easily differentiate between the two vibratory effects.
As mentioned above, when the teleoperator's displacement of the haptic device exceeds the provided planar surface, the position of the haptic device needs to be reset in order to provide adequate maneuverability.
V. APPLICATION-SPECIFIC HAPTIC AUGMENTATION
This section presents the application-specific Haptic augmentation methodology designed to assist the teleoperator in a particular scenario. This methodology presents distinct challenges because the device is unable to provide forces to the operator to specify an appropriate indication [7,9] or corrective action.
A. Haptic gravitational field
The Haptic Gravitational Field (HGF) is related to the Artificial Potential Force Field methodology [18] and provides a method of conveying Haptic information pertaining to surrounding obstacles or a desired goal location. The utility of the HGF with respect to the tactile-Haptic control methodology is discussed in the following sections.
Given the current location of the robot (xr, yr) and a location of interest (xi, yi), the HGF is governed by (7).
As such, the tactile rendering is given by

α = A·sin(2·π·β·t)    (8)

β = k3·ρ    (9)
Figure 6. HGF in guidance to a goal
B. Haptic indication of goal location
In order to provide the teleoperator with a method of haptically determining the distance to a desired goal location, the Haptic Gravitational Field is again utilised.
where ρ is the strength of the HGF at any logical position.
In reality, the location of the robot with respect to obstacles can be determined using ultrasonic or optical range-finding methods, and the absolute locations of the robot and goal can be determined using either Global Positioning System (GPS) or Differential GPS (DGPS) methods. The use of the HGF in Haptic obstacle detection and guidance to a goal location is discussed below.
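The governing field equation (7) did not survive extraction, so the sketch below assumes a common inverse-square potential purely for illustration; the function name, field form and coordinates are all hypothetical stand-ins:

```python
import math

def hgf_strength(robot, interest):
    """Field strength rho at the robot's position (xr, yr) due to a
    location of interest (xi, yi). The inverse-square form is an assumed
    stand-in for the paper's equation (7), which is garbled in the source."""
    xr, yr = robot
    xi, yi = interest
    d2 = (xi - xr) ** 2 + (yi - yr) ** 2
    return 1.0 / d2 if d2 > 0 else float("inf")

# Field strength grows as the robot approaches the location of interest:
far = hgf_strength((0.0, 0.0), (10.0, 0.0))
near = hgf_strength((9.0, 0.0), (10.0, 0.0))
print(far, near)
```

Any monotonically increasing field of this kind would drive the (9) frequency scaling in the same way: closer objects of interest yield higher-frequency vibration.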
C. Haptic obstacle detection
In order to provide the teleoperator with a Haptic indication of surrounding obstacles, the HGF is utilised. A temporal tactile effect provides the teleoperator with a tactile indication of the HGF and the corresponding obstacle or obstacles.
Figure 7. Simulation with obstacles
Figure 8. Corresponding β

In the simulated trajectory depicted by Figures 9 and 10, the robot is again navigated from the Start to the Goal location. The location of the Goal and the corresponding inverse HGF are shown. As the robot enters the inverse HGF, the associated period is shown directly below.
VI. SIMULATION RESULTS
In order to demonstrate the ability of the presented approach to provide a tactile-Haptic indication of both the presence of obstacles and the goal location, the following simulated results are presented. Figures 7 and 8 demonstrate the HGF providing vibration according to surrounding obstacles, and Figures 9 and 10 in relation to a desired goal location. The computed β is presented below each simulated trajectory, as in (9) and (10). In the first simulated trajectory, presented by Figures 7 and 8, the robot is navigated from the Start to the Stop location. The locations of the obstacles and the corresponding HGFs are shown. As the robot enters the corresponding HGFs, the associated change in vibratory frequency is depicted below.
The inverse of the HGF proves valuable in providing an indication of the direction of a goal location. As demonstrated by Figure 6, the period of the periodic vibratory effect (8) decreases as the robot moves closer to the goal position.
The inverse of the HGF is given by (10), where k4 is a constant of proportionality, chosen appropriately.
The Haptic Gravitational Field (HGF) provides a valuable mechanism for augmenting the teleoperator's control task utilising a tactile-Haptic interface. In this particular Urban Search and Rescue scenario, we have considered the utility of the HGF in obstacle detection and guidance to a desired goal location. In order to fully utilise this approach, it becomes the responsibility of the teleoperator to attempt to achieve a minimal frequency of vibration (β) of the vibratory effect, in order to avoid obstacles and traverse closer to the goal location.
[3] S. Kang, C. Cho, J. Lee, D. Ryu, C. Park, K.-C. Shin and M. Kim, "ROBHAZ-DT2: Design and integration of passive double tracked mobile manipulator system for explosive ordnance disposal," Proc. of IEEE Int. Conf. on Intelligent Robots and Systems (IROS), Las Vegas, October 2003.
[4] A. Kron, G. Schmidt, B. Petzold, M. F. Zäh, P. Hinterseer, E. Steinbach, "Disposal of explosive ordnances by use of a bimanual haptic telepresence system," Proc. of IEEE Int. Conf. on Robotics and Automation, 2004, pp.
"haptic feedback," Proc. of IEEE Int. Workshop on Haptic Virtual Environments and Their Applications, 2002, pp. 67-72.
[8] B. Horan, D. Creighton, S. Nahavandi, M. Jamshidi, "Bilateral Haptic Teleoperation of an articulated track mobile robot," Proc. of IEEE Int. Conf. on Systems of Systems Engineering, 2007.
[9] S. Lee, G. S. Sukhatme, G. J. Kim, C. M. Park, "Haptic control of a mobile robot: A user study," Proc. of IEEE Int. Conf. on Intelligent Robots and Systems, 2002, pp. 2867-2874.
[10] J. B. Park, J. H. Lee, B. H. Lee, "Rollover-free navigation for a mobile agent in an unstructured environment," IEEE Trans. Systems, Man and Cybernetics, Vol. 36, No. 4, 2006, pp. 835-848.
[11] B. Horan, S. Nahavandi, D. Creighton, E. Tunstel, "Fuzzy Haptic Augmentation for Telerobotic stair-climbing," Proc. of IEEE Int. Conf. on Systems, Man and Cybernetics, 2007.
[12] H. Azarnoush, B. Horan, P. Sridhar, A. Madni, M. Jamshidi, "Towards optimization of a real-world robotic sensor system-of-systems," World Automation Congress, 2006.
[13] F. Sahin, P. Sridhar, B. Horan, V. Raghavan, M. Jamshidi, "System of Systems Approach to Threat Detection and Integration of Heterogeneous Independently Operable Systems," Proc. of IEEE Int. Conf. on Systems, Man and Cybernetics, 2007.
[14] M. Fielding, J. Mullins, B. Horan, S. Nahavandi, "OzBot™ - A Teleoperated Robotic Platform for Search and Rescue Operations," IEEE Int. Workshop on Safety, Security and Rescue Robotics, 2007.
[15] J. Mullins, B. Horan, M. Fielding, S. Nahavandi, "A Haptically Enabled Low-Cost Reconnaissance Platform for Law Enforcement," IEEE Int. Workshop on Safety, Security and Rescue Robotics, 2007.
[16] Logitech iFeel Mouse, Logitech, www.logitech.com, 2005.
[17] Immersion Development Kit, Immersion Corporation, www.immersion.com, 2007.
[18] K. Liang, L. Zhiye, D. Chen, X. Chen, "Improved Artificial Potential Field for Unknown Narrow Environments," IEEE Int. Conf. on Robotics and Biomimetics, 2004, pp. 688-692.
Figure 10. Corresponding β
Figure 9. Simulation with goal
VII. CONCLUSION
This paper presents a novel approach to the use of ungrounded haptic devices in the teleoperative control of a mobile robotic platform. Preliminary simulation results demonstrate how the approach can provide the teleoperator with the required information.
VIII. ACKNOWLEDGEMENT
This work was supported by the Intelligent Systems Research Laboratory at Deakin University, Australia.
IX. REFERENCES
[1] D. J. Cox, "Mock-up of hazardous material handling tasks using a dual arm robotic system," World Automation Congress, Orlando, June 2002.
[2] J. Casper and R. R. Murphy, "Human-robot interactions during the robot-assisted urban search and rescue response at the World Trade Center," IEEE Trans. on Systems, Man, and Cybernetics, Vol. 33, 2003.