
OTC 22989

Space Robotics Technologies for Deep Well Operations

Hari Nayar, Khaled Ali, Andrew Aubrey, Tara Estlin, Jeffery Hall, Issa Nesnas, Aaron Parness, Dean Wiberg, Jet Propulsion Laboratory, California Institute of Technology

Copyright 2012, Offshore Technology Conference. This paper was prepared for presentation at the Offshore Technology Conference held in Houston, Texas, USA, 30 April–3 May 2012. This paper was selected for presentation by an OTC program committee following review of information contained in an abstract submitted by the author(s). Contents of the paper have not been reviewed by the Offshore Technology Conference and are subject to correction by the author(s). The material does not necessarily reflect any position of the Offshore Technology Conference, its officers, or members. Electronic reproduction, distribution, or storage of any part of this paper without the written consent of the Offshore Technology Conference is prohibited. Permission to reproduce in print is restricted to an abstract of not more than 300 words; illustrations may not be copied. The abstract must contain conspicuous acknowledgment of OTC copyright.

Abstract

The Jet Propulsion Laboratory (JPL) routinely operates robotic spacecraft millions of miles from the Earth. Current JPL missions include roving on Mars and observing the Sun, Earth, Saturn, comets, asteroids and deep space. Advances in high-fidelity modeling, simulation and visualization, high-precision sensors, instruments, harsh-environment electronics, autonomous operations and on-board intelligence have enabled these challenging missions. Robotics capabilities developed at JPL have been applied to the Mars Pathfinder, Mars Exploration Rover, Deep Space 1, EO-1 and other space missions, and used on many JPL research projects conducted for NASA, the US Department of Defense and private industry. Although developed for space applications, these technologies are also highly relevant to problems in terrestrial oil and gas exploration and production. Benefits from the deployment of these technologies include greater precision, increased reliability, reduced uncertainty, increased productivity, reduced cost, and accelerated development schedules and task completion times. In this paper, we describe some of these technologies and suggest how they might benefit the oil and gas industry.

Introduction

JPL is a United States federally funded research and development center managed by the California Institute of Technology under a contract with NASA. Founded more than 50 years ago, JPL has been responsible for many pioneering space achievements, including building and controlling Explorer 1, the first US satellite; the Ranger and Surveyor missions to land on the moon; Mariner 2, the first spacecraft to Venus; the Viking missions to land on Mars; Voyager 1 and 2, the spacecraft that toured Jupiter, Saturn, Uranus and Neptune; the Mars Pathfinder and Mars Exploration Rovers that drove on Mars; and many others to observe the planets, the sun, comets, and asteroids. In addition to its planetary exploration mission, JPL is actively involved in Earth science. Instruments and Earth-orbiting satellites developed at JPL study the geology, hydrology, ecology, oceanography, gravity and climate of the Earth. JPL also conducts missions focused beyond our solar system. Observations of deep space have yielded information about galaxy, star and planetary system formation, produced maps of our Milky Way galaxy and the universe, and found planets around other stars. To enable these missions, JPL continues to develop technologies in telecommunications, navigation, intelligent automation, imaging and image analysis, robotics, science instruments and micro- and nano-systems. In telecommunications, for example, JPL is able to collect data from the Voyager spacecraft beaming a 23-watt signal from the edge of our solar system, more than 14 billion kilometers away.

There are a number of similarities between space applications and applications in the oil and gas industry (Oxnevad, 2010). In both cases, systems are deployed at remote locations with limited access for intervention, maintenance or repair. The operational environments are often hostile, with harsh temperatures and pressures and corrosive materials. Reliability of systems is extremely important in space and in the oil and gas industry. Operations in the respective environments carry high risk, and the capabilities for replicating environmental conditions for testing and failure mitigation are limited. Failures can be catastrophic, and the cost, financially and in public perception, can erode support for programs. Because of these similarities, there is much that the space and energy sectors can learn from each other.

As wells in remote and hazardous environments increase in number, the technologies to explore, drill, produce, and handle accidents associated with them must evolve to address the increased complexity and difficulty. Dependence on robotics technologies and more sophisticated instruments is increasing in this arena. Enhancing existing systems with greater intelligence, autonomy and reliability should help relieve human operators of the tedious and time-consuming aspects of operations. More precise data from new instruments, improved models of environmental conditions, more capable actuators and greater sensory perception of these remote environments can extend operational capabilities, improve safety and reliability, reduce environmental impact and reduce the cost of operations.

In this paper, a few robotics technologies are presented to illustrate the potential adaptation of space innovations to the oil and gas industry. The technology areas selected are: a) science instruments; b) modeling, simulation and visualization; c) software architectures; d) mobility and manipulation; e) sensing and perception; and f) on-board automation. These technologies and their application in space are described in the next section. Potential applications in the oil and gas industry are suggested for each technology area. We conclude with a summary of the possible benefits of using capabilities developed for space in the oil and gas industry and describe our efforts to further this objective.

Space Robotics Technologies

Science and Instruments

JPL's mission statement includes a focus on space exploration for the benefit of humankind through the development of robotic space missions to explore our own and neighboring planetary systems. This mission statement encompasses a major focus on the instrumentation required as a component of planetary science missions to answer major mission science questions. JPL designs instruments from a top-down science requirements approach, which assures instrument design and performance tailored to provide the necessary data products. Our institutional approach is evidenced by the science traceability matrix, in which the performance of each instrument is mapped to high-level mission science goals.
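The traceability idea above can be made concrete with a toy data structure. The rows, goals and numbers below are invented for illustration; they are not taken from any JPL mission.

```python
# Hypothetical miniature of a science traceability matrix: each row maps
# a high-level science goal down to a measurement and an instrument
# performance requirement. All rows and numbers are illustrative.
TRACEABILITY = [
    {"goal": "Determine soil habitability",
     "measurement": "pH of aqueous soil extract",
     "instrument": "Wet Chemistry Laboratory",
     "requirement": ("pH accuracy", 0.3)},
    {"goal": "Detect trace hydrocarbons",
     "measurement": "Methane mixing ratio",
     "instrument": "Mass spectrometer",
     "requirement": ("detection limit, ppb", 10.0)},
]

def verify_coverage(matrix, goals):
    """Return any mission science goals not traced to an instrument."""
    covered = {row["goal"] for row in matrix}
    return [g for g in goals if g not in covered]
```

A check like `verify_coverage` is the machine-readable version of the review step the matrix supports: every stated science goal must trace to at least one measurement and instrument.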

JPL draws on decades of instrument and sensor development; these instruments have been successfully deployed to planetary surfaces as in situ instruments and to the far reaches of our solar system. The technical requirements for in situ instruments often push the limits of engineering capabilities for operation in these harsh environments. Thus, the technological solutions often redefine the state of the art, which helps assure efficient operation in the operating environment. For instance, future missions to Venus will require new developments in high-temperature electronics, currently in development at JPL, to enable sustained science measurements during surface operations. At the other end of the spectrum, extremely cold environmental operating requirements have driven technological breakthroughs that enable prolonged operations on planetary surfaces such as Mars. Recently, as part of the Phoenix mission, the Microscopy, Electrochemistry, and Conductivity Analyzer (MECA) instrument carried out aqueous measurements of soil extracts during in situ surface operations (see Figure 1). Similar low-temperature requirements will apply to future missions to icy bodies such as Europa, a Jovian moon, and Enceladus, a Saturnian moon.

JPL's design of science instruments and sensors for unique planetary surface environments enables translation of this engineering heritage to the capabilities necessary for extreme terrestrial environments. Such stringent environmental requirements include wide ranges of operating temperature, pressure, humidity and corrosivity. Furthermore, advancements have been made in power system technologies and advanced electronic and mechanical packaging for sustained operation in high-vibration environments. Just as important to the robust design and engineering of planetary instruments is the minimization of mass and power for every flight instrument. Thus the small size and low power demand necessary for deployment on Autonomous Underwater Vehicles (AUVs) are identical to the engineering goals for planetary instruments.

Requirements for persistent operation in marine environments are reflected above and include operation at low temperatures (e.g., 4°C ocean-bottom water), high temperatures (hydrothermal vent environments), high pressures, and corrosive conditions (e.g., seawater). Marine operation scenarios in deep or shallow water are most relevant to currently funded work examining the exploration of saline-water oceans on Europa and Enceladus and hydrocarbon lakes on Titan. Thus JPL is uniquely poised to contribute to instrument system development for terrestrial marine environments by leveraging current and past efforts to address these types of problems. In taking a system-level approach to designing an instrument system for deepwater marine deployment, it is critical to leverage decades of work accomplished in the research and development communities and by small-business instrument developers. In preliminary studies, JPL has adopted many of the commercially available high-pressure-rated marine instruments for traditional needs such as sensing of pressure, salinity and temperature and for underwater navigation. JPL has therefore focused on developing advanced sensors and instruments for marine environments to address the more difficult measurement requirements, which can also address needs of the oil and gas industry. JPL has begun to adapt existing technologies developed for in situ planetary science to aqueous environments such as the deep sea. JPL is currently pursuing projects for in-house development of marine sensors including: (1) mass spectrometers for hydrocarbon detection and attribution, (2) laser-induced fluorescence molecular detection strategies for seawater characterization in microscale and macroscale form factors, and (3) a hydrothermal vent microbial biosampler, among others.

Figure 1. Phoenix mission Microscopy, Electrochemistry and Conductivity Analyzer (MECA) instrument with the Wet Chemistry Laboratory (WCL) beakers displayed (Hecht, 2009).

These new technologies are being developed in a form factor consistent with accommodation as a payload aboard an AUV platform; the small mass and size requirements for in situ instruments are thus reflected in these underwater instrument developments. These developments run in parallel with previous underwater technology efforts at JPL, including the hydrothermal vent biosampler, HVB (see Figure 2), and a UV fluorescence/Raman chemical sensing system (Hug, 2004).

Modeling, Simulation and Visualization

At JPL, mission design and planning, system and component design and development, and operations make significant use of modeling, simulation and visualization. Modeling and simulation are important elements of system development for the following reasons:

• Missions span a large range of distances and time scales that cannot be physically replicated.
• Gravitational fields, temperatures and pressures, caustic atmospheres and other conditions are difficult or costly to physically duplicate.
• Simulations can help evaluate new concepts or operational scenarios without the cost of building or deploying them.
• Monte-Carlo and parametric analyses through simulations can be used to optimize systems and plans.
• Simulations can be used to share results among distributed teams.
• Simulations can be used to validate and verify analytical and physical analog studies.
• Simulations help visualize and predict the performance of systems and operations.
• Visualization from real-time telemetry during operations can provide intuitive displays of data.

These reasons for the use of modeling, simulation and visualization apply to the oil and gas industry as well.

The capability for high-fidelity modeling, simulation and visualization of complex dynamic systems depends on a number of component software systems developed at JPL. An important element of this capability is the overall DARTS/Dshell framework for software development and execution (Jain, 1991). At its core is DARTS (Figure 3), a fast, efficient physics engine that uses the spatial-algebra recursive algorithm developed at JPL (Rodriguez, 1991) to solve the dynamics of flexible, multi-body, tree-topology systems, propagating the dynamics state based on the system configuration and external inputs (Biesiadecki, 1997). A software interface to the physics engine provides a means to easily configure and specify the models to be simulated. The interface allows the user to enter models for dynamic system components intuitively. Parametric models of spacecraft sub-systems are assembled hierarchically to build complex dynamic systems. Signals into and out of the respective models define their interfaces and constrain the assembly to a coherent dynamic system. This framework has been used to model and simulate the dynamics of systems in a variety of regimes.
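The hierarchical, signal-based assembly described above can be sketched in miniature. The sketch below is illustrative only: the component names, signal wiring and simple Euler integrator are our own simplifications, not the DARTS/Dshell API.

```python
class Component:
    """A dynamic sub-model with named input and output signals."""
    def __init__(self, name):
        self.name = name
        self.inputs = {}    # input signal name -> (source component, its output)
        self.outputs = {}   # output signal name -> latest value

    def connect(self, in_sig, src, out_sig):
        self.inputs[in_sig] = (src, out_sig)

    def read(self, in_sig):
        src, out_sig = self.inputs[in_sig]
        return src.outputs[out_sig]

    def step(self, dt):
        raise NotImplementedError

class ConstantSignal(Component):
    """A fixed command source (stands in for a controller model)."""
    def __init__(self, name, value):
        super().__init__(name)
        self.outputs["value"] = value
    def step(self, dt):
        pass

class Thruster(Component):
    """Maps a 0..1 throttle command to an output force."""
    def __init__(self, name, max_force):
        super().__init__(name)
        self.max_force = max_force
        self.outputs["force"] = 0.0
    def step(self, dt):
        cmd = self.read("command")
        self.outputs["force"] = self.max_force * min(max(cmd, 0.0), 1.0)

class RigidBody(Component):
    """One-dimensional point mass integrated with explicit Euler."""
    def __init__(self, name, mass):
        super().__init__(name)
        self.mass = mass
        self.outputs["velocity"] = 0.0
        self.outputs["position"] = 0.0
    def step(self, dt):
        v = self.outputs["velocity"] + (self.read("force") / self.mass) * dt
        self.outputs["velocity"] = v
        self.outputs["position"] += v * dt

class Simulator:
    """Steps components in order (signal sources before consumers)."""
    def __init__(self, components):
        self.components = components
    def run(self, dt, steps):
        for _ in range(steps):
            for c in self.components:
                c.step(dt)
```

The key idea mirrored from the text is that each model exposes only named signals, so sub-systems can be swapped or assembled hierarchically without changing their neighbors.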

An early application of this technology was modeling the flexible dynamics of the Cassini spacecraft during its cruise to Saturn (Jain, 1992). Models implemented for the Cassini simulation included a flexible central body with articulated rigid-body appendages, and dynamic models of thrusters, reaction wheels, and sensors. The simulator was successfully used in a hardware-in-the-loop simulation mode to exercise the guidance and control system and evaluate the interfaces to the spacecraft. The dynamic simulation of Cassini in the space environment enabled end-to-end testing of the spacecraft guidance and control, telemetry, and sensor data processing sub-systems.

Figure 3. Framework for the DARTS/Dshell simulation software.

Figure 5. Visualization of an entry, descent and landing sequence.

Figure 2. Photographs of the Hydrothermal Vent Biosampler (HVB) ready for deployment as a standalone instrument package (A), and sampling at a submarine hydrothermal system during operations on the HyperDolphin submersible (B) (Stam, 2010).

The dynamics modeling and simulation framework has also been used to simulate vehicles driving on the surfaces of Mars (Yen, 1999; Jain, 2003) and the moon (Nayar, 2010) (Figure 4). Important components of surface-traverse simulations are high-resolution terrain models obtained from satellite imagery, accurate location of the terrain models on planet surfaces, and accurate motion of the planet and its moons with respect to the sun. In addition, ground contact mechanics and soil models are used for wheel-soil contact. Power system dynamics have been implemented that include energy consumption models, energy storage in batteries and energy generation from solar panels. With these elements, it is possible to simulate the energy budget during a traverse with relatively high fidelity. In addition to incorporating the energy expended to traverse the undulating terrain, the simulator models the energy generated during the traverse as the solar panels change their pose with respect to the sun, as well as the effects of obscuration of the solar panels by the terrain when the sun is low on the horizon. Surface simulations have been used to quantify vehicle performance, plan traverses and scope power system requirements for Mars and lunar missions. Investigations into terrain-interaction simulation using granular-media modeling are being conducted to provide more realistic modeling of these complex phenomena (Mukherjee, 2011; Balaram, 2011).
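The energy-budget bookkeeping described above can be illustrated with a deliberately simplified calculation. Everything here is a toy stand-in: the slope penalty, panel model, and numbers (roughly Mars-like solar flux, 100 W drive power, 5 cm/s) are illustrative, not mission values.

```python
import math

def traverse_energy_budget(segments, panel_area=1.0, panel_eff=0.25,
                           solar_flux=590.0, drive_power=100.0, speed=0.05):
    """Toy net energy balance (joules) for a rover traverse.

    segments: list of (distance_m, slope_deg, sun_elevation_deg) tuples.
    Positive result means the batteries gained energy over the traverse.
    """
    net = 0.0
    for dist, slope, sun_elev in segments:
        t = dist / speed                                   # drive time, s
        # Crude slope penalty: climbing multiplies the drive power.
        p_drive = drive_power * (1.0 + math.tan(math.radians(max(slope, 0.0))))
        # Generation falls with sun elevation; none when sun is below horizon
        # (a stand-in for the terrain-obscuration effect in the text).
        incidence = max(math.sin(math.radians(sun_elev)), 0.0)
        p_gen = solar_flux * panel_area * panel_eff * incidence
        net += (p_gen - p_drive) * t
    return net
```

Running segments with a low sun elevation immediately shows the effect discussed in the text: generation collapses while drive cost is unchanged, so low-sun traverse legs drain the batteries.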

Simulations of the entry, descent and landing (EDL) of spacecraft on Mars (Balaram, 2002) (Figure 5) illustrate the use of the simulator in supersonic atmospheric applications. Dynamic models of parachutes, ballutes and drag devices, winds, and control surfaces in supersonic and subsonic atmospheric environments were implemented for these simulations. EDL is a critical phase of surface missions, where landing in a safe region ensures the health of the spacecraft for subsequent operations. Through parametric analyses and Monte-Carlo simulations, stochastic models of the landing ellipses under varying atmospheric conditions were identified for the Mars Exploration Rover, Mars Phoenix and Mars Science Laboratory missions.
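The Monte-Carlo dispersion analysis mentioned above can be sketched with a trivial drift model. The wind statistics, density scaling and descent model below are invented placeholders; a real EDL study propagates full vehicle dynamics through an atmosphere model.

```python
import random
import statistics

def landing_dispersion(n_trials=2000, seed=42):
    """Toy Monte-Carlo dispersion study: perturb winds and atmospheric
    density, propagate a trivial drift model, and report 3-sigma
    semi-axes of the touchdown scatter (a stand-in for a landing ellipse).
    """
    rng = random.Random(seed)
    downrange, crossrange = [], []
    for _ in range(n_trials):
        wind_dr = rng.gauss(0.0, 10.0)      # m/s downrange wind
        wind_cr = rng.gauss(0.0, 10.0)      # m/s crossrange wind
        density = rng.gauss(1.0, 0.05)      # density scale factor
        descent_time = 120.0 / density      # thinner air -> longer drift
        downrange.append(wind_dr * descent_time)
        crossrange.append(wind_cr * descent_time)
    return (3.0 * statistics.stdev(downrange),
            3.0 * statistics.stdev(crossrange))
```

The point of the sketch is the workflow, not the numbers: sample the uncertain environment, propagate each sample, then characterize the resulting scatter statistically to bound where the vehicle may land.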

Balloons, or aerobots, have been proposed for future missions to Venus, Mars, Jupiter and Titan; balloons were first used for planetary exploration on the two Venus Vega missions in 1985. Simulations have been used at JPL (Elfes, 2008) (Figure 6) for aerobots on Titan and Venus, where the dense atmosphere allows compact balloons to be deployed. The aerobot model incorporates aerodynamics, mass properties, buoyancy, kinematics, dynamics, and control surfaces. It also incorporates simulated sensor signals based on sensor and environment models and a range of terrain models. The aerobot control algorithm was optimized using hardware-in-the-loop simulation, in which the aerobot model is driven by the flight control software using signals fed back from the simulated sensors. This dynamics simulation capability was also used recently to model a spacecraft approaching and anchoring on an asteroid, supporting the design, planning and operations of missions to near-Earth bodies.

Modeling and simulation of dynamic phenomena in the oil and gas industry could help in understanding and predicting their behavior and in developing techniques for influencing or controlling them. Flow of material through permeable media; drilling, excavating and other types of terrain interaction; and operations on the terrestrial surface, underwater and on the seabed are potential areas where the technologies we have described could be applied. Some examples of the benefits from modeling, simulation and visualization include:

• Better environmental-impact modeling of hydraulic fracturing or terrestrial and subsurface spills.
• Improved ROV/AUV control through the use of model-predictive control and drag and tether models.
• More precise directional drilling.
• More precise control systems for Floating Production Storage and Offloading (FPSO) vessels and other surface platforms.

These benefits, and potentially many others, could be derived from the application of modeling and simulation technology in the oil and gas industry.
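As one illustration of the ROV/AUV control idea listed above, the following is a minimal receding-horizon (model-predictive-style) controller for a one-dimensional vehicle with quadratic drag. The dynamics, thrust levels, masses and cost weights are all invented for illustration and are far simpler than any real ROV model.

```python
def mpc_step(x, v, target, drag=0.8, mass=50.0, dt=1.0, horizon=5,
             thrusts=(-100.0, -50.0, 0.0, 50.0, 100.0)):
    """Pick the candidate thrust whose constant application over a short
    horizon best moves the vehicle toward `target`, using a quadratic
    drag model to predict the response."""
    def rollout(u):
        xs, vs, cost = x, v, 0.0
        for _ in range(horizon):
            a = (u - drag * vs * abs(vs)) / mass   # quadratic drag
            vs += a * dt
            xs += vs * dt
            cost += (xs - target) ** 2 + 1e-4 * u * u
        return cost
    return min(thrusts, key=rollout)

def drive_to(target, steps=200):
    """Closed-loop run: replan every step (receding horizon)."""
    x, v = 0.0, 0.0
    for _ in range(steps):
        u = mpc_step(x, v, target)
        a = (u - 0.8 * v * abs(v)) / 50.0
        v += a
        x += v
    return x
```

Because the predicted positions over the horizon enter the cost, the controller brakes before reaching the target rather than after overshooting it, which is the practical benefit of using a drag model inside the control loop.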

Software Architectures for Autonomous Systems

The development of software that operates remote robotic assets in harsh environments necessitates a disciplined approach. This becomes particularly important as the software grows in functionality and complexity. Ensuring the reliability of the software becomes paramount to prevent harm to people, the environment, and the robotic assets themselves.

Figure 6. Visualization from an aerobot simulation.

Figure 4. Visualization of simulated vehicles and terrain on the moon and on Mars.

From a software-centric view, a robotic system has three main elements: the ground software system, with which the operator interfaces to control and command the asset; the on-board software system, which embodies the intelligence; and the target platform (Figure 7). The target platform can be either a physical system or a simulation of the physical system and its environment. As mentioned in the previous section, modeling and simulation play a key role in the development, verification and maturation of the software. Therefore, the interoperability of the on-board software between the physical and simulated worlds becomes a key element in the development of reliable software. Furthermore, the ability to interface to the simulation at various levels of fidelity facilitates efficient testing of the low- and high-level capabilities of the on-board software.

An important consideration for the ground and on-board systems is their software architecture. Principled approaches to the development of software systems are important for reasons that include:

• Interoperability: In addition to supporting both physical and simulated platforms, interoperability of the software across a number of heterogeneous physical platforms increases software reusability and hence reliability. Moreover, it enables the interoperability of sensors, actuators, and electronics from different vendors with minimal impact to the software system.
• Flexibility: Having high-level command and control as well as finely granular control over each component is particularly important for assets that have to operate reliably without the luxury of direct access to the asset or the environment. Once launched, JPL's robotic assets become accessible and modifiable only through software. The flexibility of the software enables JPL to deal with anomalies and to work around hardware failures, resulting in graceful degradation of performance.
• Technology Integration: A properly architected software system enables the integration, maturation and validation of new capabilities and technologies. It also enables comparison of the performance of competing technologies.
• Scalability: Software architecture helps manage complexity while supporting functional enhancements.
• Maintainability: The ability to modify, upgrade and extend current capabilities is important for handling unforeseen software and hardware problems.
• Cost: Reducing the cost of software development is particularly important when supporting a fleet of heterogeneous robotic assets.
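The interoperability point above, running the same on-board logic against hardware or a simulator, is commonly achieved with an abstract device interface. The sketch below is a generic illustration of that pattern; the class and method names are our own, not any JPL API.

```python
from abc import ABC, abstractmethod

class RangeSensor(ABC):
    """Abstract device interface: on-board code depends only on this,
    so identical software runs against hardware or a simulation."""
    @abstractmethod
    def read_range(self) -> float:
        """Distance to the nearest obstacle, in meters."""

class SimulatedRangeSensor(RangeSensor):
    """Back end for testing against a simulated world."""
    def __init__(self, world):
        self.world = world
    def read_range(self):
        return self.world.distance_to_obstacle()

class SerialRangeSensor(RangeSensor):
    """Back end for the physical device (driver body omitted here)."""
    def __init__(self, port):
        self.port = port
    def read_range(self):
        raise NotImplementedError("device driver goes here")

def safe_to_advance(sensor: RangeSensor, margin=2.0):
    # Decision logic is identical for simulated and physical platforms.
    return sensor.read_range() > margin
```

Swapping `SimulatedRangeSensor` for `SerialRangeSensor` changes nothing in `safe_to_advance`, which is what lets low- and high-level capabilities be verified in simulation before they ever touch the asset.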

Time delays and constraints on communication windows define the control and operations paradigm for remote assets. Control strategies ranging from tele-operation, to time-delayed teleoperation, to supervisory control, to autonomous control are all part of JPL's suite of on-board software capabilities.

To support remote control and commanding of more sophisticated robotic platforms, JPL has been researching and developing software architectures over the past two decades. It has also sought to develop guidelines for enhancing software reliability (Holzmann, 2006). Among the chief challenges facing such efforts are the heterogeneity and complexity of the hardware/software systems and the flexibility and scalability of the architecture. A prominent theme that emerged from these architectures is the consistent and flexible handling of states and objects across multiple domains (Volpe, 1996; Rasmussen, 2006; Nesnas, 2006). Support for different programming paradigms enabled the incorporation of advances from the artificial intelligence, controls and robotics communities. Figure 8 shows an example from the CLARAty architecture, which supports a declarative representation in its decision layer to provide flexibility for activity planning, and an object-model decomposition in its functional layer to support interoperability (Nesnas, 2006). The use of domain modeling and abstract interfaces helps handle hardware and software heterogeneity. Data-flow representations using asynchronous events improve the runtime reliability of the software (Trebi-Ollennu, 2012) and have been used in several missions. The systematic modeling of articulated mechanisms and the management of coordinate frames help reduce overall complexity and improve consistency (Diaz-Calderon, 2006; Nayar, 2007).

Figure 7. The three elements of a robotic system: (1) the ground software, (2) the on-board software, and (3) the target platform.

Figure 8. The CLARAty robotic architecture, which supports declarative and procedural programming with the ability to interoperate hardware and software components.

    In addition to these software centric themes, JPLs has been developing, maturing and fielding a number of autonomous capabilities as part of on-board software. The challenging environment in which these technologies were fielded helped

    advance and mature these capabilities. For planetary operations, the challenges included traversing rock-strewn outdoor

    terrains, operating with limited power and computational resources, and localizing in GPS denied environments. The

    autonomous capabilities included control of complex mechanisms in rough terrain, pose estimation for multi-limbed and

    multi-wheeled robotic platforms, outdoor perception, map building with large uncertainties, traversability analysis, motion

    planning, and high-level activity planning to name a few. For example, the autonomous navigation capability of a robotic

    platform employs stereo vision to generate a three-dimensional view of the environment, localization and data-fusion

    algorithms to build maps, statistical analysis to assess the map's traversability for a given mechanism, and motion planning to generate trajectories that avoid obstacles and reach the specified target. This autonomous navigation capability has

    been matured, validated and fielded on a number of rover prototypes and has been integrated and used on JPL's Mars Exploration Rovers. The Spirit and Opportunity rovers have driven autonomously for several kilometers on the Red Planet

    (Biesiadecki, 2007). JPL has also extended this capability to three-dimensional environments for underwater navigation for

    Navy applications (Huntsberger, 2011). Such an extension would be applicable to underwater operations as well as aerial

    operations for the oil and gas industry.
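The navigation pipeline described above can be made concrete with a deliberately simplified sketch. The code below is an illustrative example only, not JPL flight software: it converts a stereo disparity map to depth with the standard pinhole-stereo relation, derives a traversability grid from terrain slope as a stand-in for the statistical analysis mentioned above, and plans an obstacle-avoiding path with A*. All function names and thresholds are invented for illustration.

```python
import heapq
import itertools
import numpy as np

def disparity_to_depth(disparity, focal_px, baseline_m):
    """Pinhole-stereo relation: depth = f * B / d."""
    with np.errstate(divide="ignore"):
        depth = focal_px * baseline_m / disparity
    depth[disparity <= 0] = np.inf  # unmatched pixels: unknown range
    return depth

def traversability(height_map, max_slope=0.5):
    """Terrain assessment reduced to its simplest form: a cell is
    traversable if the local slope stays below a threshold."""
    gy, gx = np.gradient(height_map)
    return np.hypot(gx, gy) <= max_slope  # True = safe to drive

def plan_path(free, start, goal):
    """4-connected A* over the traversability grid; returns a list of
    (row, col) cells from start to goal, or None if no path exists."""
    h = lambda c: abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    tie = itertools.count()  # breaks ties in the heap deterministically
    frontier = [(h(start), next(tie), 0, start, None)]
    came_from, visited = {}, set()
    while frontier:
        _, _, g, cur, parent = heapq.heappop(frontier)
        if cur in visited:
            continue
        visited.add(cur)
        came_from[cur] = parent
        if cur == goal:  # rebuild the path by walking parents backwards
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nxt in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nxt[0] < free.shape[0] and 0 <= nxt[1] < free.shape[1]
                    and free[nxt] and nxt not in visited):
                heapq.heappush(frontier, (g + 1 + h(nxt), next(tie), g + 1, nxt, cur))
    return None
```

A fielded system would of course replace each stage with far more capable components (dense stereo correlation, probabilistic cost maps, kinodynamic planners), but the data flow from images to depth to traversability to trajectory is the same.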

    Another related capability is the autonomous traverse and placement of a tool or instrument on a remotely designated target.

    Such a capability would enable an operator to select a target from 10-20 meters away and have a robotic platform

    autonomously navigate to and deploy a tool on the designated target to conduct an operation (Fleder, 2011). This capability

    could be applicable to a number of oil and gas scenarios for underwater operation. The software combines the

    aforementioned autonomous navigation with the visual target tracking capability to simultaneously navigate and track targets

    of interest. Once the robot reaches the vicinity of the target, it positions itself to approach the target in such a way as to

    maximize the manipulability of its robotic arm. JPL has demonstrated the ability to acquire a measurement to within 2-3 cm of the originally selected target from a distance of 10-20 m.

    In addition to precise manipulation for instrument placement, JPL has also developed algorithms that would enable more

    sophisticated manipulation. These include perception algorithms for object recognition and handling, motion planning

    algorithms for redundant manipulators to avoid obstacles in the manipulator's workspace, and force sensing and control algorithms for interacting with complex objects. These algorithms were deployed on an anthropomorphic arm with a multi-

    fingered hand to demonstrate grasping of clear objects, inserting a key and opening a door, picking up and operating a drill

    (Hebert, 2011). The key feature of these algorithms is their tolerance of the large errors commonly encountered

    in the real world due to sensing error or compliance in mechanical systems, errors that have to date been difficult for robots to

    accommodate. These developments were part of the first phase of the DARPA-funded ARM-S program and are expected to

    be improved and extended for dual-arm operations in the next phase. Such capabilities promote safe operations and enhance the cooperation between robots and humans.

    The combination of scalable software architecture and advanced robotic capabilities can enable meaningful and novel

    applications in harsh environments for the oil and gas industry similar to those encountered in space applications.

    Mobility and Manipulation

    A paradigm shift was made in NASA's planetary exploration missions when JPL built and operated the Sojourner rover as part of the Mars Pathfinder mission in 1997 (Sojourner, 1997). Sojourner was a six-wheeled mobile robot that carried

    scientific instruments and cameras to explore the rocks and regolith on the surface of Mars. Mobility provided a

    dramatic increase in the amount and variety of data that could be collected by a single mission in comparison to previous

    lander-only planetary missions like NASA's Viking landers and the Soviet Union's Venera missions. Landers are limited to sampling a finite region about their location, and the ability to land precisely is not yet advanced enough to target specific

    sites of interest. Additionally, sites of interest may not be amenable to landing a spacecraft, necessitating landing in a safe

    zone and using a rover like Sojourner to travel to the desired location.


    Another step forward was made when JPL developed and flew

    the Mars Exploration Rovers with the capability to not only roam

    the surface of Mars, but also to interact with the environment

    using a robotic arm (Lindemann, 2005). These platforms have

    survived more than 6 years in the harsh Martian environment

    driving to scientific targets and using the instruments on the arm

    to touch and manipulate the surfaces of these targets. The

    Opportunity Rover is still active on the surface of Mars. The

    Spirit Rover was lost in 2010, exceeding the prime mission

    duration of 90 Martian days by more than 2500%. The Mars

    Science Laboratory (currently in transit to Mars) has an even

    more capable arm that will drill into rocks (Jandura, 2010). The

    combination of mobility and manipulation allows a robot to

    accomplish tasks in an unknown, dynamic environment that are

    impossible without both capabilities. JPL has advanced planetary

    robotics to a point where the robot can move to and interact with

    specific targets of scientific interest as an explorer, rather than

    just an observer.

    While the Mars rovers are hallmarks of JPL's recent past, a large portfolio of research and development projects at the

    cutting edge of technology points toward the future. These

    projects span a broad range of focus including multi-modal

    mobility, extreme terrain access, sample acquisition and handling, assembly and repair operations, and technologies for both

    tele-operation and autonomous execution of tasks.

    The All-Terrain Hex-Limbed Extra-Terrestrial Explorer (ATHLETE) rover, see Figure 9, is one such development project

    aimed at providing a highly capable mobile platform to traverse long stretches of rough terrain efficiently, as well as interact

    with cargo and astronaut habitats (Wilcox, 2007; Wilcox, 2009). ATHLETE uses a unique architecture that combines the

    advantages of both walking and driving. The robot has six dexterous

    limbs with seven degrees of freedom each that allow it to operate in a

    walking mode when moving across rough terrain. At the end of each of

    these limbs is a small wheel that can be driven to allow ATHLETE to

    move very efficiently across flat or modestly undulating terrain.

    ATHLETE can also use its limbs as robotic manipulators. A tool

    interface on the inside of each of the wheels allows ATHLETE to use

    tools like drills, grippers, and anchors to interact with its environment.

    The reliability of the system is also well documented; the ATHLETE

    rover performed a traverse of more than 40 km across varied terrain in

    the Arizona desert. While originally designed for operation on the

    Moon, the ATHLETE architecture could enable a very capable robot

    operating on the sea floor across either sandy or rocky terrain with the

    ability to perform repair or exploration tasks using a specified set of

    tools.

    The LEMUR robot was also built to use a variety of tools, (Figure 10)

    (Kennedy, 2001). Originally designed to perform precision assembly

    operations in space, LEMUR is able to walk on feet that can be designed

    for specific environments including rock, metal, or complex environments like the International Space Station. At the wrist

    joint of each of LEMUR's limbs is a tool interface. When performing assembly tasks, LEMUR rotates this wrist so that the tool faces the workspace. LEMUR successfully demonstrated assembly tasks like screwing and unscrewing bolts, drilling,

    and visual inspection with a suite of cameras. LEMUR has also been tested with specialty feet that allow it to climb vertical

    or even inverted surfaces (Parness, 2011; Kennedy, 2006). This groundwork could allow the robot to be deployed on the

    beam structures of large rigs to inspect for corrosion or other flaws, or to perform simple maintenance tasks in

    difficult-to-access locations.

    Figure 9 The ATHLETE rover is a highly capable multi-

    modal mobile robot. By combining the rough surface

    performance of legged robots with the efficiency of wheeled

    driving, ATHLETE is able to move quickly and efficiently

    across a broad range of terrain. The limbs can also function

    as manipulators with a variety of tools to do maintenance

    tasks.

    Figure 10 The LEMUR robot is designed for

    in-space assembly. It uses a variety of tools on the

    wrist joint to perform common tasks like screwing

    and unscrewing bolts, drilling, and taking

    pictures/video.


    The Axel robot has been specifically designed to access extreme

    terrain that is inaccessible to current rover architectures (Nesnas,

    2012). Axel is a minimalist robot that uses a tether to rappel down the

    sides of steep slopes or sheer cliff faces, and houses a suite of

    instruments within each of its wheel hubs that it can point and deploy

    independently from the motion of the wheels, see Figure 11. The

    tether not only provides mechanical support for safe operations in

    high-cost or high-risk situations, but also provides a power and

    communication link that allows the robot to operate without a line of

    sight to the sun for solar power and without a heavy load of batteries.

    Axel has been fielded in two configurations, as a single daughter

    platform hosted on a larger rover or lander (fixed asset) or as part of

    the dual Axel (DuAxel) configuration that provides an independent

    all terrain mobility platform. The dual Axel, which consists of two

    Axels docked to either side of a central module, provides untethered

    mobility up and down moderately sloped terrain and an anchoring

    mechanism for Axel's extreme terrain excursions. In this configuration, either Axel can be used for such excursions, providing

    redundancy and modularity of the overall system. The Axel rovers

    can be repurposed to provide access to a large variety of terrain types

    in the oil and gas industry that are either dangerous, inaccessible, or

    costly to reach with human workers.

    Work is also underway for a potential future Mars mission that

    would acquire rock core samples and cache them for return to Earth.

    The sample handling technology has been matured so that a mobile

    rover can safely core into rock with a rotary percussive drill and

    acquire rock cores. Once the drilling is complete, the drill bit is

    inserted into a mechanism that removes and caches the rock core and

    loads a fresh drill bit into the drill for continued operation. A full end-

    to-end sequence has been demonstrated in the field using a high-

    fidelity prototype of the potential future space system; see Figure 12

    (Backes, 2011).

    While current robotics systems development focuses on

    autonomous systems, pioneering developments in teleoperated

    systems technology were made at JPL in the 1970s, 1980s and

    1990s. Force feedback, force and position scaling, predictive

    displays, operator displays, real-time visualization, shared and

    supervisory control, task scripting and many other technologies

    were demonstrated at JPL for remotely controlled robotics systems.

    Finally, JPL has also prototyped in-pipe robots for the natural gas

    industry (Wilcox, 2004). One such platform, shown in Figure 13, uses

    a segmented approach with an inchworm method of locomotion. A

    three degree-of-freedom joint between each of the segments allows

    the robot to navigate around tees and elbows in the pipe. To stay in

    place, the robot used inflatable airbags that pushed against the

    interior of the pipe, anchoring it so that it could move in either

    direction even when there was flow in the pipeline. A turbine was

    also prototyped that harvested energy from this flow to power the

    robot.

    The expertise that JPL has developed in mobility and manipulation for robots could benefit a large number of applications in

    the oil and gas industry. These include:

    - Inspection and repair inside of pipelines

    - Maintenance and inspection on the sea floor

    - Improved dexterous manipulation of current robotic assets

    - Inspection of rig structures above and below the waterline by climbing or rappelling robots

    Figure 12 The prototype sample handling, encapsulation

    and containerization unit caches rock cores acquired by

    the robot's rotary percussive coring drill.

    Figure 11 The Axel rover descending a steep cliff

    face (top). Axel undocks from the DuAxel and

    rappels down the cliff using a tether (bottom).


    - Autonomous task execution

    Sensing and Perception

    JPL has developed and adapted several sensing and perception technologies for space and robotics applications. Many of

    these technologies are also relevant to the energy industry. Some good examples include visual odometry, visual target

    tracking, simultaneous mapping and localization, and techniques for force, position, and proximity sensing.

    Visual odometry (Maimone, 2007; Howard, 2008) is a technique for measuring the motion of a vehicle from multiple pairs of

    images taken by the vehicle's own visual sensors. It uses the disparity between features in images from two horizontally displaced cameras to build a three-dimensional model of the environment. Vehicle motion is determined from changes in the

    location of features from one pair of images to another. Visual odometry is useful in applications where precise knowledge

    of motion is important. It is also useful when no other source of position knowledge, such as GPS, is available, as in

    planetary applications. Visual odometry has been used on the Mars Exploration Rovers (see Figure 14) and on research

    rovers to accurately determine vehicle motion without the need for GPS. It can be used in the energy industry to precisely

    measure the motion of moving platforms, such as ROVs, AUVs, or even instrumented drill bits, which also would not have

    access to GPS data. While JPL primarily uses cameras to produce the model of the environment, other sensors more suited to

    an underwater environment could be used.
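The core geometric step of this technique, recovering camera motion from 3-D feature points matched between two frames, can be sketched as a least-squares rigid alignment. The SVD-based solution below is one standard textbook choice and is not taken from the MER flight software; it is an illustrative example.

```python
import numpy as np

def estimate_motion(pts_prev, pts_curr):
    """Estimate the rigid transform (R, t) that maps 3-D feature points
    observed in the previous frame onto the same points in the current
    frame: pts_curr ~ R @ pts_prev + t. Uses the SVD-based least-squares
    alignment that underlies feature-based visual odometry."""
    c_prev = pts_prev.mean(axis=0)   # centroids of each point cloud
    c_curr = pts_curr.mean(axis=0)
    H = (pts_prev - c_prev).T @ (pts_curr - c_curr)  # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # correct for a possible reflection so that det(R) = +1
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_curr - R @ c_prev
    return R, t
```

In a full visual odometry system this solve runs inside an outlier-rejection loop (e.g., RANSAC) over the feature matches, and successive transforms are composed to track total vehicle motion.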

    Visual Target Tracking (Kim, 2009) is a form of visual servoing of a vehicle as it approaches or maneuvers around a target.

    It involves creating a template image of a feature in an image, matching the feature from image to image as the vehicle

    moves, and guiding the path selection of the vehicle toward or around the target. Visual Target Tracking can work in

    conjunction with visual odometry. This technique has been used on the Mars Exploration Rovers (see Figure 15) to

    autonomously approach science targets on rocks, reducing lost time due to communication delays. Visual Target Tracking

    could be used by an AUV to guide its approach to, or path around, a subsurface valve or other oil rig or tree hardware.
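A minimal version of the template-matching step behind this technique can be written as a normalized cross-correlation search. The exhaustive loop below is illustrative only; fielded trackers use image pyramids and warped templates for speed and robustness.

```python
import numpy as np

def track_template(image, template):
    """Locate a target template in a new image by normalized
    cross-correlation (NCC). Returns the (row, col) of the best-matching
    window and its NCC score (1.0 = perfect match)."""
    th, tw = template.shape
    t = template - template.mean()          # zero-mean template
    tnorm = np.sqrt((t ** 2).sum())
    best, best_pos = -np.inf, (0, 0)
    for r in range(image.shape[0] - th + 1):
        for c in range(image.shape[1] - tw + 1):
            w = image[r:r + th, c:c + tw]
            wz = w - w.mean()               # zero-mean candidate window
            denom = np.sqrt((wz ** 2).sum()) * tnorm
            if denom == 0:
                continue                    # flat window: no correlation
            score = (wz * t).sum() / denom
            if score > best:
                best, best_pos = score, (r, c)
    return best_pos, best
```

Re-running this match on each new frame, and re-extracting the template as the target's appearance changes, is the essence of tracking a feature while the vehicle moves.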

    Simultaneous Localization and Mapping (SLAM) involves both building a map of the environment and locating a vehicle in

    that map as it moves. It uses the known motion of the vehicle and its perception of the environment, as determined by

    computer vision, wheel odometry, and other sensors, to incrementally generate the map and localize the vehicle within it.


    Figure 14 Visual Odometry used by the Mars Exploration Rover Opportunity to track position change while driving on Mars. (a)

    Initial left and right image pair. (b) Subsequent left and right image pair. (c) Tracked feature differences between the image

    pairs.

    Figure 13 An in-pipe robot built by JPL for natural gas pipelines. The prototype uses airbags to push

    against the inside of the pipe. The robot moves with an inchworm motion and uses a three degree of

    freedom joint between segments to navigate elbows and tees. A propeller harvests energy for the robot

    from flow in the pipe.


    JPL has used this technology to track the motion of military vehicles driving in unstructured terrain (Marks, 2009) and

    spacecraft during descent and landing (Johnson, 2007). This technology could be used in the energy industry to aid in the

    navigation of and exploration by underwater vehicles in uncharted or poorly mapped environments.
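The estimation loop at the heart of SLAM can be illustrated with a toy one-dimensional extended Kalman filter that jointly updates a robot position and a single landmark position from odometry and a relative range measurement. This is a didactic sketch, not any fielded JPL estimator, and the noise parameters are arbitrary.

```python
import numpy as np

def ekf_slam_step(mu, P, u, z, Q=0.1, R=0.05):
    """One predict/update cycle of a 1-D EKF-SLAM filter.
    mu = [robot, landmark] state estimate, P = 2x2 covariance,
    u = odometry step, z = measured relative range (landmark - robot),
    Q = odometry noise variance, R = measurement noise variance."""
    # Predict: only the robot moves; its uncertainty grows by Q.
    mu = mu + np.array([u, 0.0])
    P = P + np.diag([Q, 0.0])
    # Update: the measurement z = landmark - robot corrects both states.
    H = np.array([[-1.0, 1.0]])
    S = H @ P @ H.T + R            # innovation covariance (1x1)
    K = P @ H.T / S                # Kalman gain
    innov = z - (mu[1] - mu[0])
    mu = mu + (K * innov).ravel()
    P = (np.eye(2) - K @ H) @ P
    return mu, P
```

Real SLAM systems carry hundreds of landmarks (or a dense map) in the state, but the structure is the same: motion inflates uncertainty, and each observation of a mapped feature shrinks it for the vehicle and the map simultaneously.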

    Various techniques (Helmick, 2006; Hayati, 1993; Das 2001) have been developed, tested, and compared at JPL for

    measuring the internal system state and interaction forces with the environment for manipulator arms. These include force,

    position, and proximity sensing and estimation. They range from flexibility modeling to six-axis force/torque sensor use.

    Techniques for estimating and measuring contact and interaction forces with the environment are necessary to allow precision

    interactions without damage to either the manipulator or the environment. JPL has used these techniques to do pose

    measurement of manipulator arms, and to measure and provide feedback on proximity and contact with the environment

    during manipulation tasks, such as remote inspection (see Figure 16a) and robot-assisted microsurgery (see Figure 16b).

    These techniques would be useful for measuring the state of any articulated system and providing feedback to human

    operators, especially for automated or telemanipulation contact tasks, such as opening or closing valves.
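As a schematic of how force feedback protects both the manipulator and the environment during such contact tasks, the guarded-move loop below advances a mechanism in small increments until a measured force threshold is crossed. This is a generic pattern, not a JPL interface; the hooks `step_fn` and `read_force_fn` are placeholders for a real motion command and force/torque sensor read.

```python
def guarded_move(step_fn, read_force_fn, force_limit_n=5.0, max_steps=200):
    """Guarded move: advance one increment at a time until the measured
    contact force reaches a threshold, then stop. step_fn() commands one
    motion increment; read_force_fn() returns the current force magnitude
    in newtons. Returns the number of increments taken before contact."""
    for step in range(max_steps):
        if read_force_fn() >= force_limit_n:
            return step        # contact detected: stop before damage
        step_fn()
    return max_steps           # no contact within the allowed travel
```

The same structure generalizes to compliant (force-controlled) motion by replacing the hard stop with a controller that regulates the contact force to a setpoint.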

    On-board Intelligence

    JPL technologies for onboard intelligence and high-level autonomy are playing an important role in current JPL missions and

    robotic applications. Examples of such technologies include automated planning and scheduling systems, which provide

    capabilities for spacecraft command sequencing and resource management, and onboard data analysis systems, which

    provide capabilities for analyzing gathered data and identifying high-priority data features that warrant further investigation.

    Figure 15 Visual Target tracking was used by the Mars Exploration Rover Opportunity to approach a

    rock on Mars. The overlaid box shows the target window as it is tracked.

    Figure 16 a) Robot Assisted Microsurgery. b) Remote Surface Inspection System: a scaled mockup of a space

    station truss is inspected by a robot arm.



    These systems have been used on a number of deep space

    and Earth-orbiting missions as well as in Earth surface

    applications, such as Autonomous Underwater Vehicles

    (AUVs) performing wide-area surveys in the Earth's oceans (Figure 17).

    The Continuous Activity Planning, Scheduling, Execution

    and Re-Planning (CASPER) System (Chien, 2000) has been

    applied to support onboard and ground-based command

    sequence generation and re-planning for a variety of

    missions and robotic applications. Based on an input set of

    goals and the vehicle's current state, CASPER generates a sequence of activities that satisfies the goals while obeying

    relevant resource, state and temporal constraints. For

    instance, plans may have strict limits on power usage and

    activities may be required to occur during certain time

    windows. Plans are produced using an iterative repair

    algorithm that classifies plan conflicts and resolves them

    individually by performing one or more plan modifications.

    Conflicts occur when a plan constraint has been violated,

    where the constraint may be temporal or may involve a resource, state or activity parameter. CASPER also monitors current

    vehicle state and the execution status of plan activities. As information is acquired, CASPER updates future-plan projections.

    Based on this new data, new conflicts and/or opportunities may arise, requiring the planner to re-plan in order to

    accommodate the unexpected events.
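The iterative-repair idea can be illustrated with a deliberately small sketch of our own, not CASPER code: activities draw power over fixed intervals, a conflict is any time point where the summed draw exceeds the power limit, and each repair iteration delays one conflicting activity. The conflict-resolution heuristic here (shift the latest-starting activity past the earliest-finishing one) assumes conflicts arise from overlaps; real planners choose among many repair methods.

```python
def conflicts(plan, power_limit):
    """Return the time points at which total power draw exceeds the limit."""
    times = sorted({a["start"] for a in plan} |
                   {a["start"] + a["dur"] for a in plan})
    bad = []
    for t in times:
        load = sum(a["power"] for a in plan
                   if a["start"] <= t < a["start"] + a["dur"])
        if load > power_limit:
            bad.append(t)
    return bad

def iterative_repair(plan, power_limit, max_iters=100):
    """Repeatedly classify the earliest conflict and repair it by delaying
    the latest-starting activity involved, until no conflicts remain."""
    for _ in range(max_iters):
        bad = conflicts(plan, power_limit)
        if not bad:
            return plan                 # conflict-free plan
        t = bad[0]
        active = [a for a in plan if a["start"] <= t < a["start"] + a["dur"]]
        mover = max(active, key=lambda a: a["start"])
        others = [a for a in active if a is not mover]
        if not others:
            break                       # a lone activity exceeds the limit
        # shift the mover to just after the earliest-finishing other activity
        mover["start"] = min(a["start"] + a["dur"] for a in others)
    return plan
```

CASPER's plans additionally carry states, temporal constraints, and goal priorities, and its repair step is interleaved with execution monitoring, but the classify-then-modify cycle is the same.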

    The CASPER system is currently operating onboard the Earth Observing One (EO-1) spacecraft and has performed over

    28,000 autonomous activities since its upload in 2003. This system has also operated multiple in-situ hardware platforms

    including multiple sea surface and underwater vehicles (Huntsberger, 2010), three Mars rover platforms (Estlin, 2007), and

    an aerobot platform being developed for potential future missions to locations such as Saturn's moon Titan (Gaines, 2008). For these applications, CASPER handles a range of plan generation and re-planning scenarios, including sequencing daily

    commands to handle movement, data collection, and engineering activities. The system can also dynamically adjust vehicle

    commands in response to both fortuitous events and faults. The CASPER planning technology could be used in the energy

    industry to automatically generate and modify commands for AUVs or other robotic equipment, such as deep sea drilling

    rigs. CASPER could both reduce operations costs, by lowering the need for manual input and supervision, and enable

    rapid response and re-planning in a range of scenarios.

    In the area of onboard data analysis, JPL has developed a number of systems for NASA missions. The Autonomous

    Exploration for Gathering Increased Science (AEGIS) system (Estlin, 2012) enables intelligent targeting onboard the Mars

    Exploration Rover (MER) Mission. AEGIS operates on the MER Opportunity rover and analyzes wide-angle images for

    potential science targets, such as rocks with interesting features. If such a target is identified, additional measurements are

    automatically gathered with the MER high-resolution, multi-filter color panoramic cameras. An example is shown in Figure

    This approach enables data to be autonomously collected on interesting terrain features without requiring communication

    to rover operators on Earth. This software was recently awarded the 2011 NASA Software of the Year Award for its

    performance on this mission. The MER Mission also uses onboard software to identify dynamic atmospheric events

    (including dust-devils and clouds) in rover images (Castano, 2008). When events are found, the image is marked as high

    priority for downlink. This enables a surface rover to search for such dynamic events frequently, but only downlink data that

    actually contains the feature or event of interest.
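The underlying pattern, scoring data onboard and downlinking only what clears an interest threshold, can be sketched as follows. Image variance is used here purely as a stand-in for the feature detectors in systems like AEGIS; the function and threshold are invented for illustration.

```python
import numpy as np

def prioritize_for_downlink(images, interest_threshold=50.0):
    """Rank images for downlink by a simple interest score (here, pixel
    variance as a crude proxy for scene contrast and structure). Returns
    the indices of images that clear the threshold, most interesting
    first; the rest would be discarded or downlinked at low priority."""
    scores = [float(np.var(img)) for img in images]
    ranked = sorted(range(len(images)), key=lambda i: scores[i], reverse=True)
    return [i for i in ranked if scores[i] >= interest_threshold]
```

The payoff is the same as on MER: a vehicle can observe far more often than its communication link can support, because only the observations that contain something interesting consume downlink bandwidth.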

    Figure 17 The CASPER planning system has been deployed with on

    Slocum glider submersibles to perform adaptive area surveys.


    Another example of onboard data analysis is the detection software used on the EO-1 spacecraft (Chien, 2010), which works in

    conjunction with the CASPER planner (mentioned above). This software analyzes EO-1 images to extract static features and

    detect changes relative to previous observations. It has been used with EO-1 Hyperion hyperspectral instrument data to

    automatically identify regions of interest including land, ice, snow, water, and thermally hot areas. An example of flood

    detection imagery is shown in Figure 19. Repeat imagery using these algorithms can detect regions of change (such as

    flooding, ice melt, and lava flows). Using these algorithms onboard enables retargeting and search, e.g., retargeting the

    instrument on a subsequent orbit cycle to identify and capture the full extent of a flood. Onboard data analysis algorithms are

    currently being applied to a range of instrument data and could be used by the energy industry to perform online analysis of

    sensor data onboard AUVs and other robotic vehicles. Analysis algorithms could quickly identify when key chemical

    signatures are present, such as hydrocarbon anomalies, and help direct an AUV to areas of high interest.
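In its simplest form, the repeat-pass change detection described above reduces to thresholding the difference between two co-registered observations. The sketch below is far simpler than the Hyperion spectral classifiers and is offered only to show the shape of the computation.

```python
import numpy as np

def detect_change(before, after, threshold=10.0):
    """Flag pixels whose value changed by more than a threshold between two
    co-registered observations of the same scene. Returns the boolean
    change map and the fraction of the scene that changed, which can drive
    a decision to retarget the instrument on a subsequent pass."""
    diff = np.abs(after.astype(float) - before.astype(float))
    mask = diff > threshold
    return mask, float(mask.mean())
```

An onboard system would apply this per spectral band (or to derived products such as a water classification), then feed the changed-area statistic to the planner to decide whether a follow-up observation is warranted.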

    Conclusion

    The adaptation of these and other space technologies to the oil and gas industry could yield a number of benefits. Improved

    models of deployed systems and the environment would provide greater understanding of the underlying processes and

    enable more efficient operations. Better perception and algorithms for providing situational awareness of remote systems

    should lead to more informed decisions. Routine and well-defined procedures could be delegated to automated processes.

    Complex resource management under dynamic conditions could be handled more effectively by planning and execution

    systems.

    Figure 19 Flood detection time series imagery from the Earth Observing One (EO-1) satellite. Imagery shows

    Australia's Diamantina River, with the visual-spectrum image at left and the flood detection map at right.

    Figure 18 Results from an AEGIS data collection session onboard the MER Opportunity rover. On the left is a

    selected rock target, captured in three color filters using the high-quality MER Panoramic Cameras. On the right is the

    original target selection in a wide-angle image taken by the MER Navigation Camera.


    The adaptation of space robotics technology to problems in the oil and gas industry could:

    - increase automation and intelligence
    - extend deepwater/seabed operations
    - enable in-situ environmental monitoring and remote sensing
    - enable miniaturization
    - improve perception and mapping
    - enable more reliable and capable remote operations
    - reduce the cost of operations.

    The authors have begun developing collaborations with companies in the oil and gas industry to apply JPL's robotics technologies to drilling and production operations to address these problems.

    Acknowledgement

    This work was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the

    National Aeronautics and Space Administration.

    References

    P. Backes, P. Younse, M. DiCicco, N. Hudson, C. Collins, A. Allwood, R. Paolini, C. Male, J. Ma, A. Steele, and P. Conrad,

    2011. Experimental Results of Rover-Based Coring, IEEE 2011 Aerospace Conference, Big Sky, MT.

    J. Balaram, R. Austin, P. Banerjee, T. Bentley, D. Henriquez, B. Martin, E. McMahon, and G. Sohl, 2002. DSENDS - A

    High-Fidelity Dynamics and Spacecraft Simulator for Entry, Descent and Surface Landing, IEEE 2002 Aerospace Conf.,

    Big Sky, Montana, March 9-16, 2002.

    J. Balaram, J. Cameron, A. Jain, H. Kline, C. Lim, H. Mazhar, S. Myint, H. Nayar, R. Patton, M. Pomerantz, M. Quadrelli, P.

    Shakkotai, and K.Tso, 2011. Physics-Based Simulator for NEO Exploration Analysis & Simulation Control, AIAA

    SpaceOps 2011 Conference and Exposition, September 2011.

    J. Biesiadecki, D. Henriquez and A. Jain, 1997. A Reusable, Real-Time Spacecraft Dynamics Simulator, in 6th Digital

    Avionics Systems Conference, (Irvine, CA), Oct 1997.

    J. J. Biesiadecki, P. C. Leger, and M. W. Maimone, 2007. Tradeoffs Between Directed and Autonomous Driving on the Mars

    Exploration Rovers, International Journal of Robotics Research, Vol 26, No 1, January, 2007, 91-104.

    A. Castaño, A. Fukunaga, J. Biesiadecki, L. Neakrase, P. Whelly, R. Greeley, M. Lemmon, R. Castaño and S. Chien, 2008.

    Automatic detection of dust devils and clouds on Mars, Machine Vision and Applications 19:467-482, 2008.

    S. Chien, R. Knight, A. Stechert, R. Sherwood, and G. Rabideau, 2000. Using Iterative Repair to Improve

    Responsiveness of Planning and Scheduling, Proceedings of the 5th International Conference on Artificial Intelligence Planning and Scheduling, Breckenridge, CO, 2000.

    S. Chien, R. Sherwood, D. Tran, B. Cichy, G. Rabideau, R. Castaño, A. Davies, D. Mandl, S. Frye, B. Trout,

    S. Shulman, and D. Boyer, 2010. "Autonomy Flight Software to Improve Science Return on Earth Observing One,"

    Journal of Aerospace Computing, Information, and Communication, April 2010.

    H. Das, T. Ohm, C. Boswell, R. Steele, and G. Rodriguez, 2001. Robot-Assisted Microsurgery Development at JPL, book chapter in Information Technologies in Medicine, Volume II, Ed. Metin Akay, Andy Marsh, John Wiley & Sons, Inc.,

    2001.

    A. Diaz-Calderon, I.A. Nesnas, W.S. Kim, and H. Nayar, 2006. "Towards a Unified Representation of Mechanisms for

    Robotic Control Software," International Journal of Advanced Robotic Systems, Vol. 3, No. 1, pp. 061-066, 2006.

    A. Elfes, J.L. Hall, E.A. Kulczyki, D.S. Clouse, A.C. Morfopoulos, J.F. Montgomery, J.M. Cameron, A. Ansar, and R.J.

    Machuzak, 2008. Autonomy Architecture for Aerobot Exploration of Saturnian Moon Titan, IEEAC 08, IEEE Aerospace Conference 2008, Big Sky, MT, March 1-8, 2008.

    T. Estlin, D. Gaines, C. Chouinard, R. Castano, B. Bornstein, M. Judd, and R.C. Anderson, 2007. Increased Mars Rover Autonomy using AI Planning, Scheduling and Execution, Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2007), Rome Italy, April 2007.

    T. Estlin, B. Bornstein, D. Gaines, R. C. Anderson, D. Thompson, M. Burl, R. Castano, and M. Judd, 2012. AEGIS Automated Targeting for the MER Opportunity Rover, To appear in ACM Transactions on Intelligent Systems and Technology, 2012.

    M. Fleder, I.A. Nesnas, M. Pivtoraiko, A. Kelly, and R. Volpe, 2011. Autonomous Rover Traverse and Precise Arm Placement on Remotely Designated Targets, International Conference on Robotics and Automation, Shanghai, China, May, 2011

    D. Gaines, T. Estlin, S. Schaffer, R. Castano and A. Elfes, 2008. Robust and Opportunistic Autonomous Science for a Potential Titan Aerobot, Proceedings of the IEEE International Conference on Robotics and Automation (ICRA 2010), Anchorage, Alaska, May 2010

    S. Hayati, J. Balaram, H. Seraji, W.S. Kim, and K.S. Tso, 1993. "Remote surface inspection system," Robotics and

    Autonomous Systems, 11(1), pp. 45-49, 1993.


    M.H. Hecht, S.P. Kounaves, R.C. Quinn, S.J. West, S.M.M. Young, D.W. Ming, D.C. Catling, B.C. Clark, W.V. Boynton, J.

    Hoffman, L.P. DeFlores, K. Gospodinova, J. Kapit, and P.H. Smith, 2009. Detection of Perchlorate and the Soluble Chemistry of Martian Soil at the Phoenix Lander Site, Science 325, 64-67.

    D. Helmick, A. Okon, and M. DiCicco, 2006. "A Comparison of Force Sensing Techniques for Planetary

    Manipulation," IEEE Aerospace Conference, March, 2006.

    P. Hebert, N. Hudson, J. Ma, and J. Burdick, 2011. Fusion of Stereo Vision, Force-Torque, and Joint Sensors for Estimation

    of In-Hand Object Location, IEEE 2011 International Conference on Robotics and Automation, Shanghai, China.

    G.J. Holzmann, 2006. The Power of Ten: Rules for Developing Safety Critical Code, IEEE Computer, June 2006.

    A. Howard, 2008. "Real-Time Stereo Visual Odometry for Autonomous Ground Vehicles," Proc. of the International

    Conference on Robots and Systems (IROS), September 2008.

    W.F. Hug, R.D. Reid, P. Conrad, A.L. Lane, and R. Bhartia, 2004. Portable Integrated UV Resonance Fluorescence and Raman Chemical Sensor for in situ, Autonomous, Detection, 31st Annual Conference of the Federation of Analytical Chemistry and Spectroscopy Societies, Portland, OR, October 2004.

    T. Huntsberger and G. Woodward, 2011. Intelligent Autonomy for Unmanned Surface and Underwater Vehicles, Proceedings of the MTS/IEEE OCEANS11 Conference, Kona, Hawaii, August 2011.

    A. Jain, 1991. Unified Formulation of Dynamics for Serial Rigid Multibody Systems, Journal of Guidance, Control and

    Dynamics, vol. 14, pp. 531-542, May-June 1991.

    A. Jain and G. Man, 1992. Real-Time Simulation of the Cassini Spacecraft Using DARTS: Functional Capabilities and the Spatial Algebra Algorithm, 5th Annual Conference on Aerospace Computational Control, August 1992.

    A. Jain, J. Guineau, C. Lim, W. Lincoln, M. Pomerantz, G. Sohl, and R. Steele, 2003. ROAMS: Planetary Surface Rover Simulation Environment, International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS 2003), May 2003.

    L. Jandura, K. Burke, B. Kennedy, J. Melko, A. Okon, and D. Sunshine, 2010. An Overview of the Mars Science Laboratory Sample Acquisition, Sample Processing and Handling Subsystem, Proceedings of the 12th International Conference on Engineering, Science, Construction and Operations in Challenging Environments, 2010.

    A. Johnson, A. Ansar, L. Matthies, N. Trawny, A. I. Mourikis, and S. I. Roumeliotis, 2007. "A General Approach to Terrain Relative Navigation for Planetary Landing," AIAA Infotech at Aerospace Conference, Rohnert Park, CA, May 2007.

    B. Kennedy, H. Agazarian, Y. Cheng, M. Garrett, G. Hickey, T. Huntsberger, L. Magnone, C. Mahoney, A. Meyer, and J. Knight, 2001. LEMUR: Legged Excursion Mechanical Utility Rover, Autonomous Robots, 11:201-205, 2001.

    B. Kennedy, A. Okon, M. Badescu, Y. Bar-Cohen, Z. Chang, B. E. Dabiri, M. Garrett, L. Magnone, and S. Sherrit, 2006. LEMUR IIb: A Robotic System for Steep Terrain Access, Industrial Robot: An International Journal, vol. 33, pp. 265-269, 2006.

    W. S. Kim, I. A. Nesnas, M. Bajracharya, R. Madison, A. I. Ansar, R. D. Steele, J. J. Biesiadecki, and K. S. Ali, 2009. "Targeted Driving Using Visual Tracking on Mars: From Research to Flight," Journal of Field Robotics, 26(3), March 2009, 243-263.

    R. Lindemann and C. Voorhees, 2005. Mars Exploration Rover Mobility Assembly Design, Test and Performance, IEEE 2005 Conference on Systems, Man, and Cybernetics.

    M. Maimone, Y. Cheng, and L. Matthies, 2007. Two Years of Visual Odometry on the Mars Exploration Rovers, Journal of Field Robotics, 24(3), March 2007, 169-186.

    T. K. Marks, A. Howard, M. Bajracharya, G. W. Cottrell, and L. H. Matthies, 2009. "Gamma-SLAM: Visual SLAM in Unstructured Environments Using Variance Grid Maps," Journal of Field Robotics, 26(1), 2009.

    R. Mukherjee and M. Pomerantz, 2011. Initial Implementation of the Mobility Mechanics Modeling Tool Kit (M3Tk-Lite), ASME 2011 International Design Engineering Technical Conferences (IDETC) and Computers and Information in Engineering Conference (CIE), Washington, DC, August 28-31, 2011.

    H.D. Nayar and I.A. Nesnas, 2007. "Measures and Procedures: Lessons Learned from the CLARAty Development at NASA/JPL," International Conference on Intelligent Robots and Systems (IROS), Workshop on Measures and Procedures for the Evaluation of Robot Architectures and Middleware, San Diego, CA, October 2007.

    H.D. Nayar and I.A. Nesnas, 2007. "Re-usable Kinematic Models and Algorithms for Manipulators and Vehicles," International Conference on Intelligent Robots and Systems (IROS), San Diego, CA, October 2007.

    H. Nayar, J. Balaram, J. Cameron, M. DiCicco, T. Howard, A. Jain, Y. Kuwata, C. Lim, R. Mukherjee, S. Myint, A. Malkovic, M. Pomerantz, and S. Wall, 2010. Surface Operations Analysis for Lunar Missions, AIAA SPACE 2010 Conference and Exposition, September 2010.

    I.A. Nesnas, R. Simmons, D. Gaines, C. Kunz, A. Diaz-Calderon, T. Estlin, R. Madison, J. Guineau, M. McHenry, I. Shu, and D. Apfelbaum, 2006. "CLARAty: Challenges and Steps Toward Reusable Robotic Software," International Journal of Advanced Robotic Systems, Vol. 3, No. 1, pp. 023-030, 2006.

    I. A. Nesnas, J. Matthews, P. Abad-Manterola, J. W. Burdick, J. Edlund, J. Morrison, R. Peters, M. Tanner, R. Miyake, and B. Solish, 2012. Axel and DuAxel Rovers for the Sustainable Exploration of Extreme Terrains, to appear in the Journal of Field Robotics, 2012.

    K. I. Oxnevad, 2010. Technology Collaboration out of this World? Chronicle, PetroNews, Offshore Northern Seas, 2010 issue, August 2010.

    A. Parness, 2011. Anchoring Foot Mechanisms for Sampling and Mobility in Microgravity, IEEE 2011 International Conference on Robotics and Automation, Shanghai, China, 2011.

    R. Rasmussen, M. Ingham, and D. Dvorak, 2005. Achieving Control and Interoperability Through Unified Model-Based Engineering and Software Engineering, AIAA Infotech at the Aerospace Conference. Arlington, VA. September 2005.

    G. Rodriguez, K. Kreutz-Delgado, and A. Jain, 1991. A Spatial Operator Algebra for Manipulator Modeling and Control, The International Journal of Robotics Research, vol. 10, pp. 371-381, August 1991.

    Sojourner Rover Team, 1997. Characterization of the Martian Surface Deposits by the Mars Pathfinder Rover, Sojourner, Science, 278:1765-1767, 1997.

    C. Stam, M. Cooper, A. Behar, and K. Venkateswaran, 2010. A Hydrothermal Vent Biosampler for Filtration and Concentration of Hydrothermal Vent Fluids, 2010 Ridge 2000 Community Meeting, Portland, OR, October 2010.

    A. Trebi-Ollennu, K. S. Ali, A. L. Rankin, K. S. Tso, C. Assad, J. B. Matthews, R. G. Deen, D. A. Alexander, R. Toda, H. Manohara, M. Wolf, J. R. Wright, J. Yen, F. Hartman, R. G. Bonitz, and A. R. Sirota, 2012. Lunar Surface Operation Testbed, to appear in Proc. of the 2012 IEEE Aerospace Conference, Big Sky, MT, March 3-10, 2012.

    R. Volpe, J. Balaram, T. Ohm, and R. Ivlev, 1996. The Rocky 7 Mars Rover Prototype, Proc. of the International Conference on Intelligent Robots and Systems, Osaka, Japan, 1996.

    B. Wilcox, 2004. Gas Pipe Explorer Robot, US Patent Number 6,697,710 (2004).

    B. H. Wilcox, T. Litwin, J. Biesiadecki, J. Matthews, M. Heverly, J. Morrison, J. Townsend, N. Ahmad, A. Sirota, and B. Cooper, 2007. ATHLETE: A Cargo Handling and Manipulation Robot for the Moon, Journal of Field Robotics, 24:421-434, 2007.

    B. Wilcox, 2009. ATHLETE: A Cargo and Habitat Transporter for the Moon, IEEE 2009 Aerospace Conference, Big Sky, MT.

    J. Yen and A. Jain, 1999. ROAMS: Rover Analysis Modeling and Simulation Software, International Symposium on Artificial Intelligence, Robotics and Automation in Space (i-SAIRAS'99), Noordwijk, Netherlands.
