FIDO‐DIDO – a wheeled mobile robot for research and education purposes
As a requirement to the graduate course
Applied Control and Sensor Fusion
André Carvalho Bittencourt, September 2010
PREFACE This document was prepared as a requirement for the graduate course ‘Applied Control and Sensor Fusion’. The main objective of the project was to set up the Pioneer robot, Fido‐Dido, to study residual generation for fault detection of complex sensor systems, such as odometry, scan matching, SLAM, etc., which are all based on the processing of raw measurements.
A list of primary objectives follows
1. Make the robot system operational.
This task was fully accomplished, and the robot is updated with the latest software and drivers. There is also a backup of the robot's hard disk on the file server.
2. Integrate odometric pose estimates.
Done.
3. Integrate the Sick laser range finder.
Done. The laser is now accessible through an RS232 cable.
4. Build software tools for data logging.
Done. The class ArFastSickLogger.cpp allows laser data logging at the maximum sampling rate.
A list of secondary objectives was also defined, including the development of scan matching, SLAM and trajectory generation algorithms. All these functionalities are readily available through the software and libraries installed on the robot.
Beyond these primary objectives, this document mostly describes the platform: its hardware, software, setup and usage, with code examples. The main intent of the document is to serve as a tutorial for future users of the platform.
INTRODUCTION Fido is an autonomous wheeled mobile robot manufactured by MobileRobots Inc. The platform model, P2‐DX, is suitable for research and education purposes. It is a differential drive robot with a caster wheel for balance and very simple kinematics. Fido comes with many sensors that enable it to interact with its surroundings and accomplish more complex tasks. Here is a list of the currently installed sensors/actuators:
(Figure: hardware overview, indicating the caster wheel, on‐board PC, MicroController (MC), actuated wheels, sonars and Sick laser.)
• 2 DC motors (to drive the wheels)
• 800‐tick wheel encoders (e.g. for odometry)
• 8 forward‐facing ultrasonic array sonars
• 1 Sick laser range finder, LMS200 (used for localization/mapping)
• Rechargeable battery set
• On‐board MicroController (MC): performs integration with sensors and actuators
• On‐board PC, VSBC8: performs higher level tasks, e.g. your application1
Besides the installed sensors, optional items can be mounted on the robot for more complex tasks, such as a video camera, GPS, gyros, an IMU, etc.
Calibrate the tires evenly at 23psi before running.
An important feature of the platform is the available software support: C++ libraries make development easy and accessible. All these features make the platform suitable for research and education; with Fido, it is really easy to test your newest algorithms, and the platform is especially suited for research in sensor fusion, navigation and trajectory planning.
Throughout this document, example programs are presented for users to get acquainted with the platform and start their own applications.
COMMUNICATION Sonar and actuator signals cannot be accessed directly by your application2; only the on‐board MicroController (MC) has access to them. Instead, the MC mediates the communication between your application and the sensors/actuators.
The communication is based on a client‐server architecture. The MC acts as a server and the different applications are clients that request information or actions. The server side runs the ActivMedia Robotics Operating System (AROS) and on the client side (where your applications are built), the
1 Your application can run on other hardware as well, as explained in the next section.
2 In fact, this might be possible, but requires reengineering of the on‐board microcontroller software.
ActivMedia Robotics Interface for Application (ARIA) C++ software suite can be used to easily build your applications.
All the communication flows through a serial port from the MC (server) to the clients. Different options are possible to run a client, trading off ease of use against versatility:

1) Serial cable and off‐board PC
2) Autonomously with on‐board PC
3) Serial cable and on‐board (piggyback) laptop
4) Wireless Ethernet serial bridge and off‐board PC
Figure 1 ‐ Communication link options between robot (server) and applications (clients).
To date, the preferred communication link is option 3 (serial cable and on‐board piggyback laptop). The serial port connection to the MC is positioned on the user control panel as shown in Fig. 2.
Figure 2 ‐ User control panel and serial‐usb adapter.
The communication through the robot serial port guarantees access to the wheels, sonars and sound devices installed on the robot. Extra sensors, such as the Sick laser, should be connected directly to the client.
Use the serial-usb adapter3 to connect the robot and the Sick sensor to your laptop. Check Appendix C to find out which COM ports you are using.
3 This adaptor has 2 RS232 inputs and 1 usb output that can be used to connect the laser and robot. The adaptor driver can be found in http://www.control.isy.liu.se/~andrecb/fidodido/drivers/usb2serial/.
The Sick sensor is also connected through a serial port from the sensor to your client (e.g. a piggyback laptop). Note that if your client has no or few serial ports, you can use serial‐usb adapters (see Fig. 2) to gain access to the robot and sensors. Appendix C shows how to check which COM ports Windows chose when using the serial‐usb adapter.
Example 1: Your first robot application (jog.cpp)
In this program, you will run your first robot program to jog the robot around with your laptop's keypad. Press 'space' to stop the robot. Press 'Esc' to stop and exit the application at any time.

Before running:
1. Turn the robot on by switching the power button to 'on' in the user control panel.
2. Connect the robot to the laptop using the serial-usb adapter.

Function call:
>> jog -robotPort <comX>

#include "Aria.h"

int main(int argc, char** argv)
{
  Aria::init();
  ArArgumentParser argParser(&argc, argv);
  ArSimpleConnector con(&argParser);
  // the serial connection (robot)
  ArSerialConnection serConn;
  // tcp connection (sim)
  ArTcpConnection tcpConn;
  ArRobot robot;
  // create key handler
  ArKeyHandler keyHandler;
  // let the global aria stuff know about it
  Aria::setKeyHandler(&keyHandler);
  // toss it on the robot
  robot.attachKeyHandler(&keyHandler);
  // parse parameters to the robot
  argParser.loadDefaultArguments();
  if (!con.parseArgs() || !argParser.checkHelpAndWarnUnparsed(1))
  {
    con.logOptions();
    keyHandler.restore();
    exit(1);
  }
  /* - connect to the robot. */
  if (!con.connectRobot(&robot))
  {
    ArLog::log(ArLog::Terse, "jog: Could not connect to the robot.");
    Aria::shutdown();
    keyHandler.restore();
    return 1;
  }
  /* - the action group for the jogging actions: */
  ArActionGroup driver(&robot);
  // the keydrive action (drive from keyboard)
  driver.addAction(new ArActionKeydrive, 45);
  driver.activateExclusive();
  robot.runAsync(true);
  // enable the motors
  robot.enableMotors();
  // just hang out and wait for the end
  robot.waitForRunExit();
  // now exit
  Aria::shutdown();
  return 0;
}
Range sensors The Sick laser and the sonars are range sensors. Range sensors provide an estimate of the distance from the robot to objects in its surroundings. Such sensors are useful for different purposes, e.g. collision avoidance, navigation, mapping, landmark recognition, etc.
Sonars The ultrasonic sonar array positioned at the robot's front provides 180º of sensing. Here is a short table summarizing its characteristics:

Min range: 10 cm
Max range: 4 m
Sampling rate: 25 Hz (nominal)
Features: poor accuracy; adjustable gain (low for noisy environments, high for quiet environments)
Figure 3 ‐ Ultrasonic sonar array.
Laser range finder Laser range finders are much more accurate range sensors than sonars. The sensing principle is based on the time of flight of a laser beam in an emission‐reflection‐reception cycle.
Through the use of a rotating mirror, several beams are emitted in a 180º angle, with an angular resolution of 0.5º or 1º. The table below presents some typical values for this sensor.
Figure 4 ‐ Sick LMS200.
Since the sensing principle is based on laser reflection, non‐reflective objects (e.g. glass) will not be visible to the sensor. Use the sonar to complement the laser when navigating in an environment with glass doors.
Objects with low reflectivity (e.g. glass doors) cannot be sensed by a Sick sensor.
Some objects are more reflective than others; the Sick sensor can also be used to detect objects with different reflectivity. This might be useful for various purposes; one that is attractive for localization is the use of landmarks. With special reflective tapes installed in strategic places in an environment, the robot can easily localize itself. The table below presents some materials and their reflectivity levels.
Sick LMS200 typical values:

Min range: 10 m
Max range: 32 m
Sampling rate: ~4 Hz (whole scanning angle)
Resolution / typical accuracy: 10 mm / ±15 mm

Material      Reflectivity (%)
Cardboard     10 to 20
Wood          40
PVC           50
Paper         80
Steel         120 to 200
Reflectors    >2000

Example 2: Safe navigation (safeJog.cpp)
In this program, jog.cpp is changed to achieve safe navigation using the range sensors.

Function call:
>> safeJog -robotPort <comX> -laserPort <comY>

... // initialize Aria and parse arguments

// create sonar and Sick laser sensor objects
ArSonarDevice sonar;
ArSick sick;

... // load parser arguments

// add range devices
robot.addRangeDevice(&sonar);
robot.addRangeDevice(&sick);

... // parse params, connect to the robot

/* - the action group for the safe jogging actions: */
ArActionGroup safeDriver(&robot);
// limiter for close obstacles
safeDriver.addAction(new ArActionLimiterForwards("speed limiter near", 1000, 3000, 800), 95);
// limiter for far away obstacles
safeDriver.addAction(new ArActionLimiterForwards("speed limiter far", 300, 1100, 1800), 95);
// limiter so we don't bump things backwards
//safeDriver.addAction(new ArActionLimiterBackwards, 85);
// the keydrive action (drive from keyboard)
safeDriver.addAction(new ArActionKeydrive, 45);
safeDriver.activateExclusive();

... // enable motors and wait for exit command
LOCALIZATION AND MAPPING Localization is a basic capability a mobile robot needs in order to accomplish its tasks. There are several methods to perform localization, e.g. odometry, scan matching, SLAM, etc.
The robot manufacturers already provide software solutions for map‐based localization. The Advanced Robotics Navigation and Localization, ARNL, is a software package to perform localization (as well as navigation). The localization from ARNL fuses odometry data and a map to find the most likely position of the robot in the map. The localization method is proprietary but the C++ libraries and DLLs are available.
A robot pose is defined as the robot position and orientation in a coordinate frame. For a wheeled mobile robot navigating on a smooth floor, the pose is defined by the triple [x y θ]^T as shown in Fig. 5.
Figure 5 ‐ A differential drive mobile robot in a reference frame.
Consider the directional speeds (forward and rotational) of the robot at time instant t, [u_f(t) u_ω(t)]^T. Given the robot pose at time t−1, it is possible to predict its pose at time t using the simple robot dynamical model:
$$
\begin{bmatrix} x(t) \\ y(t) \\ \theta(t) \end{bmatrix}
=
\begin{bmatrix} x(t-1) \\ y(t-1) \\ \theta(t-1) \end{bmatrix}
+
\begin{bmatrix} T\cos(\theta(t-1)) & 0 \\ T\sin(\theta(t-1)) & 0 \\ 0 & T \end{bmatrix}
\begin{bmatrix} u_f(t) \\ u_\omega(t) \end{bmatrix}
$$
where T is the sampling interval. This means that, given an initial pose and the directional speeds, one can estimate the robot pose at any time by integration; this is the principle of dead‐reckoning. Let us take a closer look at odometry.
Odometry Since the only signals required to perform dead‐reckoning localization are the directional speeds, there must be a way to estimate or measure them. On Fido‐dido it is possible to measure the wheel speeds; however, the wheel speeds do not directly give the directional speeds, so a kinematic model is needed.
Under several simplifications, e.g. perfect alignment of wheels, the kinematics of a differential drive robot can be written as
$$
\begin{bmatrix} u_f(t) \\ u_\omega(t) \end{bmatrix}
=
\begin{bmatrix} \frac{r_R}{2} & \frac{r_L}{2} \\ \frac{r_R}{b} & -\frac{r_L}{b} \end{bmatrix}
\begin{bmatrix} \omega_R(t) \\ \omega_L(t) \end{bmatrix}
$$
where r_i and ω_i are respectively the wheel radius and the wheel angular speed at side i (right or left), and b is the distance between the wheels along their common axis. Once the wheel radii and b are known, it is possible to use the model to obtain an estimate of the directional speeds and then, using the dynamical model, an estimate of the robot pose.
A shortcoming of odometry, as of any dead‐reckoning localization, is that any noise is integrated through time; the pose estimate is therefore only reliable over a short navigation range. Other methods are needed to complement the pose estimates.
Odometry is already implemented in the basic software suite ARIA.
Example 3: Using odometry to navigate in a square (square.cpp)
The program safeJog.cpp is extended to drive the robot in a 1x1 m square using odometry.

Function call:
>> square -robotPort <comX> -laserPort <comY>

... // define variables, parse arguments, connect robot and laser

printf("Starting moving\n");
robot.lock();
// enable the motors, disable amigobot sounds
robot.comInt(ArCommands::ENABLE, 1);
// Goto action at lower priority
ArActionGoto gotoPoseAction("goto");
robot.addAction(&gotoPoseAction, 50);
// max time to reach a goal position
const int maxDur = 8000;
ArTime start; // timer
start.setToNow(); // start timer
robot.unlock();
// start moving to the first corner;
// wait until the goal is reached or the timer expires
gotoPoseAction.setGoal(ArPose(squareSide, 0));
while (!gotoPoseAction.haveAchievedGoal() && start.mSecSince() < maxDur);
start.setToNow(); // restart the timer for each goal
gotoPoseAction.setGoal(ArPose(squareSide, squareSide));
while (!gotoPoseAction.haveAchievedGoal() && start.mSecSince() < maxDur);
start.setToNow();
gotoPoseAction.setGoal(ArPose(0, squareSide));
while (!gotoPoseAction.haveAchievedGoal() && start.mSecSince() < maxDur);
start.setToNow();
// back to the initial position
gotoPoseAction.setGoal(ArPose(0, 0));
while (!gotoPoseAction.haveAchievedGoal() && start.mSecSince() < maxDur);
printf("Stopped\n");

... // disconnect and finish
Building a map The localization algorithm used in the ARNL library uses laser information combined with odometry and an environment map to find the most likely robot position within the map. That means you need to first build a map in order to use the ARNL libraries.
Mapper3 is a tool provided by MobileRobots Inc. that can be used to build maps. “Mapper3 converts laser scan logs to maps when the laser scan file is loaded. A laser scan file is a log of robot positions determined by possibly error‐prone dead‐reckoning, plus laser scan readings taken at each position. The file typically ends with the extension ".2d". When the scan file is loaded it does (offline, non‐realtime) simultaneous localization and mapping (SLAM) as it simulates the robot's path (using a Kalman filter on robot position during localization), and matching the laser scans ("scan‐matching") to create a cohesive map, then does some clean‐up on the data.”4
The “.2d” file is generated by logging laser scan data while jogging the robot around the environment. That means we need to learn how to log data from the sensors.
4 Text extracted from http://robots.mobilerobots.com/wiki/Mapper3_Map_Creation.
Logging data The ARIA suite already provides classes for logging data; ArSickLogger.cpp is one such example. The class is programmed to register a laser scan anytime the robot moves, which means no new data is logged if the robot is simply standing still. In the meanwhile, useful information might be thrown away. To be able to log data constantly, the class was modified into ArFastSickLogger.cpp. The function calls and methods work the same as in ArSickLogger.cpp, with the difference that data will be logged as fast as possible (see Appendix B for instructions on how to use this class and Appendix A for the correct linker configuration in Visual Studio).
The jogAndLog.exe application can be used to log data to a “.2d” file to be used in Mapper3. The file folab.2d presents an example of data logged from the Fo‐lab room.
Example 4: Logging data to build a map (jogAndLog.cpp)
In this program, the class ArFastSickLogger.cpp (see Appendix B for instructions on this class) is used to log environment data to build a map. Jog the robot around the environment while it collects the data.

Function call:
>> jogAndLog <logFileName.2d> -robotPort <comX> -laserPort <comY>

#include "ArFastSickLogger.h"

... // define variables, parse arguments

// set the log filename (default is scan.2d)
std::string filename = "scan.2d";
if (argc > 1)
  filename = argv[1];
printf("Logging to file %s\n", filename.c_str());

... // connect robot and laser

/* set up the laser */
sick.chooseReflectorBits("1ref");
simpleConnector.setupLaser(&sick);
// This must be created after the robot is connected so that it'll
// get the right laser pos
ArFastSickLogger logger(&robot, &sick, 0, 0, filename.c_str(), false, NULL, NULL, true, true);
// now that we're connected to the robot, run the laser
logger.takeOldReadings(true);
logger.takeNewReadings(false);
sick.runAsync();

... // enable motors and wait for exit command
Using Mapper3 to build a map Now that you have logged data from the environment, you can use Mapper3 to build a map. Simply open Mapper3 and load the “.2d” file. The map creation task might take some time; just wait and check the result.
In Mapper3 you can also define a “home” for your robot and goal positions, which can be used later to navigate the map you created. When you are ready, save the map to a “.map” file.
The file folab.map presents a map example built from the data found in folab.2d.
Figure 6 ‐ Map of the Fo‐lab created using Mapper3.
Using MobileEyes to navigate Now that you have a map of your environment, you can use it together with sensor data to navigate around the environment. The easiest way is to use MobileEyes to do that.
To do so, you need to run “.MobileRobots\ARNL\bin\arnlServer.exe” to make your robot accessible to MobileEyes. Call arnlServer as
>> arnlServer -robotPort <comX> -laserPort <comY>
After arnlServer is running, open MobileEyes using “localhost” in the ‘robot server’ field. If your robot still has no map, assign a map of the environment. Go to ‘Tools‐>Robot Configuration’ and set the path for the map.
You should now be able to see the robot in the environment.
Figure 7 ‐ Robot in MobileEyes.
You can now use the ‘send robot’ tool to define a position to send the robot to, drive the robot using a keypad and many other options.
SUPPORT Fido is produced by MobileRobots Inc. The company has been active since 1995 but was recently sold to Adept Technologies (June 2010), the biggest US‐based industrial robot manufacturer. No critical changes are expected; if anything, more development/support is expected after the transition. Some relevant aspects are listed here.
Software The software provided by MobileRobots Inc. currently supports the following operating systems: Windows and Debian 3.1/4.0. In the near future, Debian 5 should also be supported and Debian 4.0 support will be discontinued.
To get access to software/firmware upgrades, visit http://robots.mobilerobots.com. Some content is under restricted access; here is a login to a RESEARCH level account:
Username: linkopings Password: 7#3Hy%T+
Hardware Fido currently runs on an old Pentium III/Celeron. By July 2010, the manufacturer had announced plans to start supporting Core 2 Duo (C2D) based PCs. However, there is still no information on whether a hardware update would be possible for our Fido; let's hope so.
RESOURCES
Useful links http://www.robots.mobilerobots.com http://www.control.isy.liu.se/~andrecb/fidodido.html
Usb2serial driver http://www.control.isy.liu.se/~andrecb/fidodido/drivers/usb2serial/
DLL files http://www.control.isy.liu.se/~andrecb/fidodido/dll/ http://www.control.isy.liu.se/~andrecb/fidodido/dll/dll.zip
APPENDIX A – CONFIGURING VISUAL C++ 2008 To build applications for the robot, several libraries are needed, e.g. ARIA, ARNL, etc. To use them, it is important to configure your compiler properly so that everything is accessible to your application.
The easiest way to do so is to add certain command line options to your compiler. If you have installed the libraries in the default directory, ‘C:\Program Files\MobileRobots’, the following procedure should work.
1. Create a subfolder in ‘C:\Program Files\MobileRobots’ where you want your project to be, for example:
Create the folder ‘C:\Program Files\MobileRobots\myproject’
2. Start Visual Studio
a. Create a new empty project (ctrl+alt+n) using ‘C:\Program Files\MobileRobots’ as the location
b. Press ‘ok’
c. Add a .cpp source file (this is only done so that C++ building options are accessible)
d. Go to ‘Project properties’ (Alt+N)
e. On ‘Configuration’ choose ‘All configurations’
f. In the ‘C/C++’ tab go to ‘Command Line’ and add the following ‘Additional options’
/I "../ARNL/include" /I "../ARNL/include/Aria" /I "../ARNL/include/ArNetworking" /I "../Aria/include" /I "../Aria/ArNetworking/include" /I "./include" /D "WIN32" /D "_DEBUG" /D "_WINDOWS" /D "_USRDLL" /D "ARIADLL_EXPORTS" /D "_VC80_UPGRADE=0x0710" /D "_MBCS" /FD /EHsc /GS‐ /W3 /nologo /c /TP
g. In the ‘Linker’ tab go to ‘Command Line’ and add the following ‘Additional options’
/NOLOGO /LIBPATH:"../ARNL/lib" /DYNAMICBASE:NO /MACHINE:X86 AriaDebugVC9.lib ArNetworkingDebugVC9.lib SonArnlDebugVC9.lib BaseArnlDebugVC9.lib /FORCE:MULTIPLE
h. Press OK
Your compiler should now be properly configured and you can start building your application.
Some “.dll” files are needed in order for you to run your application. They can be found at
http://www.control.isy.liu.se/~andrecb/fidodido/dll/
The file “dll.zip” contains all the “.dll” files. Unzip it to the same folder where your application (“.exe”) is going to run.
APPENDIX B – USING ArFastSickLogger.cpp The standard logging class ArSickLogger.cpp can be used to log laser and odometry data. A shortcoming of this class is that it is configured to log data only when the robot moves; if the robot is standing still, no data is collected. In many situations, however, it is desirable to log data continuously. To overcome this, the class ArSickLogger.cpp was changed, resulting in the class ArFastSickLogger.cpp. It works very similarly to ArSickLogger.cpp; the only difference is that data is logged continuously.
To use this class, simply add the files ArFastSickLogger.cpp and ArFastSickLogger.h to your project, as seen in the figure below.
Because these files override methods used in ArSickLogger.cpp, the following warning messages will be displayed when you compile applications using this class:
Compiling... ArFastSickLogger.cpp Linking... AriaDebugVC9.lib(AriaDebugVC9.DLL) : warning LNK4006: "public: virtual int __thiscall ArRangeDeviceThreaded::unlockDevice(void)" (?unlockDevice@ArRangeDeviceThreaded@@UAEHXZ) already defined in ArFastSickLogger.obj; second definition ignored AriaDebugVC9.lib(AriaDebugVC9.DLL) : warning LNK4006: "public: virtual int __thiscall ArRangeDeviceThreaded::lockDevice(void)" (?lockDevice@ArRangeDeviceThreaded@@UAEHXZ) already defined in ArFastSickLogger.obj; second definition ignored AriaDebugVC9.lib(AriaDebugVC9.DLL) : warning LNK4006: "public: virtual void __thiscall ArRangeDeviceThreaded::runAsync(void)" (?runAsync@ArRangeDeviceThreaded@@UAEXXZ) already defined in ArFastSickLogger.obj; second definition ignored Creating library C:\Program Files\MobileRobots\myproject\Debug\myproject.lib and object C:\Program Files\MobileRobots\myproject\Debug\myproject.exp C:\Program Files\MobileRobots\myproject\Debug\myproject.exe : warning LNK4088: image being generated due to /FORCE option; image may not run Embedding manifest...
The messages can simply be ignored and your application should work correctly.
APPENDIX C – CHECKING THE COM PORTS The newest computers do not have any RS232 port directly available. Since both the robot and the laser range finder communicate through such a serial channel, an alternative is to use serial‐usb adapters.
In order for your operating system to recognize the adapter, you will need first to install a driver. The driver for the adapter displayed in Figure 2 is available at
http://www.control.isy.liu.se/~andrecb/fidodido/drivers/usb2serial/
Be sure to install the driver before trying to connect the robot to your computer.
If the driver installation succeeded, Windows should detect the adapter when you connect it to your computer and associate a ‘virtual’ COM port to each of the RS232 inputs. This process is apparently not deterministic, and different COM ports might be associated to the inputs every time you connect it.
On Windows XP, to find out which COM ports Windows associated to the inputs, go to
‘Control Panel/System/Hardware/Device Manager’
Click on ‘Ports (COM & LPT)’ to see the list of available COM ports; the ones associated with the adapter should be listed there.