


Wireless Avionics and Human Interfaces for Inflatable Spacecraft

Richard Alena, Steven R. Ellis, Jim Hieronymus, Dougal Maclise
NASA Ames Research Center
Moffett Field, CA 94035
650-604-0262

[email protected]

Abstract—Revolutionary capabilities for robust control of inflatable Lunar and Martian transit vehicles and planetary habitats can be developed using advanced wireless network technology and modular avionics coupled with facile human to system interfaces. Fully wireless modular avionics would eliminate any cabling associated with power and data transmission, allowing easy deployment of flexible control systems and human interfaces. Furthermore, wearable human interface systems hosting virtual reality interaction methods can provide significant improvement in human situational awareness and control of dynamic space systems. The crew can interact with intelligent software agents providing human-like interaction using speech. These advanced information management systems would incorporate intelligent software agents to assist the crew in performing vehicle and mission operations.

Advances in robust wireless data communications and wireless power transmission are the key technologies that enable this new spacecraft architecture. This paper will cover the proposed architecture for wireless spacecraft avionics, including innovative human interaction techniques with spacecraft systems. The team believes these two aspects are intimately related and that mobile virtual human interfaces can solve many problems associated with operating spacecraft based on inflatable structures. Conventional architectures allocate much space to a cockpit from which the spacecraft is piloted and monitored. For the transit to Mars, which in most scenarios takes approximately 6 months, the cockpit becomes a major consumer of available space while being used only briefly during the journey for Earth departure and planetary approach. Wireless control of the spacecraft would allow the piloting and monitoring function to be carried out from any location within the crew space. Identifying key technology developments required to support this architecture will involve evaluating current and next-generation wireless networks, computational modules and wireless power transmission for avionics. Complementary methods for virtual human interfaces will be evaluated, with the rapid development of this technology enabling significant advances to be realized in the next decade.

For piloting and monitoring functions it will be necessary to have virtual reality for looking outside the transit ship. Because of the radiation shielding required beyond the atmosphere and magnetosphere, there will be no clear view to the outside of the spacecraft. The visual capability will have to be provided by remote presence technology, which ties the virtual reality visual system to cameras on the outside of the craft. Spoken commanding and interaction with ship systems is also an important part of this concept. As ship systems become more complex, it will become impossible for the crew to remember exact details of all aspects of the operating environment. In this case the ability to ask questions about spacecraft status and have the agents issue audible spoken advisories and warnings becomes an important part of safely operating a very complex system. The shipboard agent systems can become, in effect, a much larger crew for the mission. With round-trip communication times approaching 40 minutes for the Mars transit, all of the mission operations knowledge will have to be built into the spacecraft, lessening dependence upon Earth-based support. Operational authority has to be given to the astronauts, augmented by virtual mission support provided by the knowledge agents on the spacecraft. A distributed wireless avionics system would provide the computational capacity for sophisticated support systems that inform the correct crewmember of a problem needing their attention and assist them in performing a complex task.

1 U.S. Government work not protected by U.S. copyright.
2 IEEEAC Paper #1210, Version 1.3, Updated November 21, 2007

TABLE OF CONTENTS

1. INTRODUCTION
2. INFLATABLE SPACECRAFT DESIGN
3. MODULAR AVIONICS ARCHITECTURE
4. HUMAN-SYSTEM INTERFACES
5. INTELLIGENT AGENTS
6. TECHNOLOGY DEVELOPMENT
7. CONCLUSIONS
REFERENCES
BIOGRAPHY


1. INTRODUCTION

Inflatable spacecraft have advantages over conventional aluminum alloy spacecraft in launch mass and volume, but pose unique challenges for spacecraft system design. Inflatable structures can be launched in a collapsed state and then inflated upon deployment, providing a significant increase in usable volume. Moreover, these structures can be remarkably strong and resistant to micrometeorite damage if made from ultra-strong fabrics such as Kevlar. NASA has performed numerous design studies of such structures, notably the TransHab in 2000, identifying various challenges in design and solutions to problems such as deployment and system integration. This paper presents innovative design approaches to key challenges for inflatable structures, such as flexible modular avionics, intelligent sensors and virtual human interfaces, while defining innovative operations concepts for crew interactions with the vehicle and space environment during flight.

Commercial space ventures launched the Bigelow Genesis I Space Station “Hotel” in June 2006 as a scaled-down flight demonstration [1]. It returned pictures of itself in orbit, as shown in Figure 1. The level of interest in building and utilizing inflatable spacecraft can be readily seen from these investments and flight tests, but technical challenges remain. For example, the structure must be compactly stowed for launch, including the critical subsystems, which generally are not flexible in nature. All spacecraft need substantial wiring and plumbing for electrical power, air supply, cooling, and command and control, and providing this for a spacecraft that changes size is a major design challenge. Additionally, sensor and control signals and power connections must feed through from the exterior to the interior, passing through several pressure interfaces while maintaining integrity during stowage, deployment and inflation.

This paper focuses on four key design approaches specific to inflatable spacecraft structures, but applicable to future spacecraft in general. The first is to incorporate modular avionics nodes into the structure, using wireless power and data transmission to create the integrated system. An important challenge is to remove significant wiring from the spacecraft by using inductive power transmission and wireless networks for command and telemetry signals. These methods also do not require penetration of the pressure interfaces. This has the added advantage of significantly lowering overall spacecraft mass, in addition to dramatically lowering the wiring complexity and its inherent impact on reliability. Methods of utilizing redundant wireless power and data transmission systems are outlined to ensure that these systems do not compromise overall vehicle safety.

Inflatable structures also require innovative methods of monitoring the integrity of the pressure shells and other components unique to this type of spacecraft. This is also true of the normal sensing required for environmental control and other vital functions. Intelligent modular sensors for pressure shell monitoring are introduced and methods for integration into spacecraft health management systems are outlined. Intelligent sensors also nicely complement the proposed modular avionics architecture, thereby addressing the second design challenge.

Miniaturization and corresponding power reduction in electronic sensors and effectors now make possible a radical redesign of crewed spacecraft. These new designs promise significant reductions in power requirements and waste heat. Because the new sensors and effectors can communicate over wireless LANs and pull most of their power by induction from electromagnetic fields, a spacecraft designed around such hardware can use a widely distributed architecture, eliminating much of the connecting wire used for communication and power distribution. Indeed, the technologies for these functions can be embedded almost arbitrarily throughout the spacecraft.

The third design challenge is the difficulty of creating windows and other means for monitoring the outside of the spacecraft. Since an airlock is required for human ingress and egress, the cost and reliability impact of this interior-exterior interface must be borne, but the same argument does not hold for windows. Windows are a problem for conventional spacecraft as well, incurring significant cost and having an unfavorable reliability impact. Additionally, the need for substantial radiation shielding creates further issues with windows and other openings. Finally, each of these approaches penetrates pressure barriers, compromising reliability. Fortunately, modern computer technology has provided virtual means for such necessary functions that can potentially perform even better than conventional methods for providing situational awareness. Since this basic capability is needed for situational awareness, such virtual methods can be extended to command and control of the spacecraft. This paper will outline innovative methods for wearable virtual-presence cockpit concepts capable of replacing conventional methods of spacecraft control and monitoring.

Innovative human to spacecraft system interfaces (HSI) require output devices such as head-mounted displays and speakers for synthesized speech, and input devices such as cameras for pictures and microphones for speech. Information fusion, in the form of overlays of digital images with composite video and still images, can present complex situations in a format better suited for human understanding. These methods are collectively known as “augmented reality”. A more difficult challenge is the development of algorithms to interpret crew actions and data input as commands.
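The information fusion described above can be illustrated, at its simplest, as per-pixel alpha compositing of rendered symbology onto a camera frame. The sketch below is not from the paper; a flight system would do this on graphics hardware, and the grayscale pixel values are purely illustrative.

```python
# Minimal sketch of "augmented reality" information fusion: compositing a
# synthetic overlay (e.g. telemetry symbology) onto a camera frame with
# per-pixel alpha blending. Values are illustrative grayscale intensities.

def blend(camera, overlay, alpha):
    """Per-pixel blend: result = alpha*overlay + (1-alpha)*camera.
    camera, overlay: 2-D lists of grayscale values; alpha: 2-D list in [0,1]
    (alpha 0 keeps the camera pixel, alpha 1 shows only the overlay)."""
    return [
        [a * o + (1.0 - a) * c for c, o, a in zip(crow, orow, arow)]
        for crow, orow, arow in zip(camera, overlay, alpha)
    ]

camera  = [[100, 100], [100, 100]]   # raw exterior-camera pixels
overlay = [[255, 255], [255, 255]]   # rendered symbology
alpha   = [[0.0, 0.5], [1.0, 0.5]]   # symbology opacity mask
fused = blend(camera, overlay, alpha)  # [[100.0, 177.5], [255.0, 177.5]]
```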


Figure 1. Bigelow Genesis I

Providing accurate situational awareness to the crew, including spacecraft health and operational status, can be extended to support mission operations such as Extra-Vehicular Activity (EVA) and robotic teleoperations. For example, would it be effective for a crewmember in the spacecraft to have the ability to “see” what an astronaut is seeing and doing during a spacecraft or planetary EVA? Operations concepts for such virtual HSI methods will be developed and presented, resulting in more flexibility and effectiveness for the crew aboard the transit vehicle or living in the habitat.

Devices alone do not solve the HSI problem. Better methods for interacting with critical systems are needed that significantly increase crew effectiveness and autonomy. Light-speed delays and the impossibility of providing physical assistance to the remote crew make reliance upon an Earth-based Mission Control Center (MCC) for dealing with time-critical system failures or mission contingencies a mode of the past. Intelligent software agents can be developed for crew-system interaction and for management of complex procedures, specifications and other tasks that need to be accurately completed in a limited amount of time, using the available resources. These concepts will be introduced in the context of virtual HSI devices and speech-based interaction methods.

It is anticipated that virtual HSI methods providing augmented reality are quite complementary to the modular avionics methods advocated for vehicle systems control and health management. Intelligent agents extend these capabilities by providing a “virtual person” for more natural human interaction. These methods require a facile, high-performance redundant distributed computing system with a low-latency network for data communication. In effect, the HSI devices become mobile nodes of the modular avionics system. The avionics system provides significant computational power for implementing virtual interfaces and intelligent agents. High levels of peer-to-peer redundancy and very high performance computing and communications capability, enabled by significant reductions in the power, mass and volume of each avionics module, can enable a remarkably flexible yet robust spacecraft command and control system, while providing the excess capability needed for crew and payload operations support.

2. INFLATABLE SPACECRAFT DESIGN

Inflatable structures for spacecraft have been proposed and studied for many years. They provide significant advantages for space station habitats, transit vehicles and planetary habitats, particularly in terms of increasing usable volume for crew. Since they can be launched in a deflated configuration, with concurrent savings in mass, they provide a compact solution for delivering such spacecraft into Earth orbit. Once in place, they can be inflated to provide a large habitable volume for crew transit or habitation. They can provide interesting structural advantages: the main support structure is isolated from the gas containment volume, while the gas pressure acts as a structural stiffener with excellent shock absorption capability. Other advantages are the potential for better thermal insulation, micrometeorite protection and even radiation shielding, particularly when exotic multilayer materials are used.

Once at the intended site, they can be inflated, providing a significant increase in volume over rigid designs. The flexible fabrics used for inflatable spacecraft can also provide superior micrometeorite protection if constructed from materials such as Kevlar, which has greater impact resistance than aluminum alloy. Of course, these advantages come at the cost of increased complexity in the design of spacecraft subsystems, which now must function in a variety of configurations: deflated, partially inflated and fully inflated. The flexible fabric of the pressure vessel also needs increased health monitoring for leaks and weak spots.

Design challenges for inflatable spacecraft include providing the strength and load-bearing capacity required for propulsion; providing the signal, power and fluid interfaces between the inner pressurized volume and the outer environment; providing windows and other ports; mounting equipment to the inner and outer surfaces of the containment volume; designing the multiple flexible fabric bladders; and mitigating the hazards of the outer space environment. Another challenge is to design subsystems that can function in the three major stages and configurations: deflated, inflating and fully deployed. Given the critical nature of gas pressure for holding the structure inflated, leak detection and mitigation is a primary concern.


Certain functions will be needed during all three stages: basic spacecraft health must be monitored during transport, prior to deployment; deployment must be carefully managed to avoid damaging the layered fabric; and finally, leaks and other problems with the pressure interfaces must be quickly detected and sealed before the spacecraft collapses. All these design challenges can benefit from innovative avionics architectures and human-system interfaces.

Figure 2. NASA TransHab

Integrated Vehicle Health Management (IVHM) of inflatable spacecraft has been studied and high-level requirements defined. Structural health and gas containment are critical. This is no different from conventional spacecraft, except for the dynamic nature of the inflatable structure, which makes monitoring much more difficult. For example, in a rigid structure any leak will be detected as a decrease in cabin pressure. Inflatable structures, by contrast, may not exhibit this pressure loss, since the volume shrinks in direct proportion to the loss of gas, initially maintaining a constant pressure. Leakage repair and prevention of leak propagation are other challenges, mostly in the realm of material science. Since the space environment is harsh, the containment material must be able to withstand the effects of atomic oxygen, radiation and sunlight, as well as large temperature differences. An intriguing solution is to use the gas containment layers as a storage location for water and/or liquid hydrogen or oxygen, which could help with radiation shielding, given their ability to absorb neutrons and other radiation species. If water is used, thermal control of the water would yield excellent thermal control for the interior volume.
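The leak-detection subtlety noted above, that an inflatable can hold constant pressure while its volume shrinks, suggests monitoring enclosed gas quantity rather than pressure alone. A minimal illustrative sketch (not from the paper; the sensor values and threshold are invented) estimates gas quantity from the ideal gas law, n = PV/RT, so a falling n flags a leak even when the pressure reading is steady:

```python
# Illustrative leak detection for an inflatable hull: estimate enclosed gas
# quantity n = PV / (RT) from pressure, volume and temperature sensors.
# A falling n indicates a leak even at constant cabin pressure.

R = 8.314  # J/(mol*K), universal gas constant

def moles_of_gas(pressure_pa: float, volume_m3: float, temp_k: float) -> float:
    """Estimate enclosed gas quantity via the ideal gas law."""
    return pressure_pa * volume_m3 / (R * temp_k)

def leak_detected(samples, threshold_mol_per_hr=1.0):
    """samples: list of (pressure_pa, volume_m3, temp_k, time_hr) tuples,
    oldest first. True if gas quantity falls faster than the threshold."""
    first, last = samples[0], samples[-1]
    n0 = moles_of_gas(*first[:3])
    n1 = moles_of_gas(*last[:3])
    dt = last[3] - first[3]
    return (n0 - n1) / dt > threshold_mol_per_hr

# Constant 101 kPa, but volume shrinking from 300 to 299 m^3 over 2 hours:
readings = [(101325.0, 300.0, 293.0, 0.0), (101325.0, 299.0, 293.0, 2.0)]
```

In practice the volume estimate would itself come from the distributed strain and geometry sensors described later in this paper.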

A good reference design is presented in Figure 2. In 2000, NASA studied the TransHab concept as a habitat for the International Space Station [2]. It provided a central rigid core for subsystems, airlocks, transport corridors and attachments. The outer shell inflated, increasing usable volume by 100% when deployed. A multi-layered fabric consisting of Kevlar, Inconel and other materials was prototyped, and methods for measuring the integrity of the fabric were developed. For propulsion or life support functions, the subsystems would be located in the central core of the spacecraft, which accommodates thrust forces and provides a rigid structure for pumps, plumbing, etc. An intriguing idea is to store consumable liquids within the fabric of the inflatable structure, eliminating storage tanks. While not providing the full range of advantages associated with inflatables, particularly increased volume, the design was capable of being qualified for human space missions.

3. MODULAR AVIONICS ARCHITECTURE

A popular avionics architecture for critical flight control aboard current commercial aircraft and proposed spacecraft is Integrated Modular Avionics (IMA), consisting of multiple powerful computer nodes interconnected using a high-performance, time-deterministic network. The software architecture conforms to ARINC 653, which defines independent partitions for running application programs such as Guidance, Navigation and Control (GNC), Systems Health Monitoring (SHM) and other spacecraft functions concurrently [3]. A key feature is the capability of running independent application programs in separate partitions, isolated in both memory space and time-critical resource utilization. Another advantage is that the network provides many virtual paths for effective data transfer, enabling new modes of interaction between avionics modules.

In IMA, diverse hardware computing nodes use the network to maintain connections to many sensors and effectors simultaneously, allowing any computing module access to either sensor data or to control interfaces for subsystems distributed throughout the vehicle. Together with ARINC 653, this hardware capability allows any computing node to run a subsystem independent of the current location of the executing application program. Highly redundant and flexible software architectures can be constructed in this manner, ranging from traditional triple-redundant control schemes to advanced peer-to-peer constructs. Some of these schemes have yet to be proven, but will provide certain advantages if the design challenges can be met. The physical characteristics of inflatable structures lend themselves well to diverse peer-to-peer constructs with highly dispersed avionics nodes.
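The temporal side of the ARINC 653 partitioning described above can be pictured as a fixed cyclic schedule: each partition owns a fixed window inside a repeating major frame, so a misbehaving application cannot starve the others. The following is a conceptual sketch only, not the real ARINC 653 APEX API; the frame length, window sizes and "payload" partition are invented for illustration (the GNC and SHM names come from the text).

```python
# Conceptual sketch of ARINC 653-style time partitioning: each partition
# gets a fixed window inside a repeating major frame.

MAJOR_FRAME_MS = 100
# (partition name, window length in ms); GNC and SHM per the paper's examples
SCHEDULE = [("GNC", 40), ("SHM", 30), ("payload", 30)]

def partition_at(t_ms: int) -> str:
    """Return which partition owns the processor at time t_ms."""
    offset = t_ms % MAJOR_FRAME_MS
    for name, length in SCHEDULE:
        if offset < length:
            return name
        offset -= length
    raise RuntimeError("schedule does not fill the major frame")

# The same cyclic schedule repeats every major frame:
# partition_at(10) -> "GNC"; partition_at(50) -> "SHM"; partition_at(150) -> "SHM"
```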

However, this same dynamic characteristic of inflatable spacecraft makes it very difficult to run cabling for signal and power transfer throughout the vehicle. One solution is to not use any surface of the containment volume for subsystem mounting, but this approach limits the degree of volumetric advantage obtained by requiring a much larger core. In many ways, the very large core of the TransHab is one reason this design does not provide much volumetric advantage. If equipment could be mounted on the containment surface, then a much larger spacecraft could be constructed around a smaller core. Additionally, external resources could be mounted to the outside of the containment volume, providing additional capability: camera viewpoints, antennas, solar panels and other important assets arrayed around the vehicle, providing coverage in all directions. All these schemes require small, compact modules providing these functions, with interconnections capable of stretching or contracting with the spacecraft during deployment. Reconfiguration of the volume for different mission phases or purposes might also be possible using this approach.

The key to such flexibility is to eliminate any physical media associated with signal and power transmission, particularly for avionics. Fortunately, wireless techniques are now available that have the potential to meet the needs of advanced inflatable spacecraft. The basic requirements for reliable power and data transmission can be met using redundant networks and power systems, all wireless, perhaps using diverse techniques to guard against common-mode failure. Radio and optical transmission are two diverse data transfer methods, and inductive coupling combined with solar or microwave power transmission would be two potential power supply methods. A battery in each avionics module would guard against temporary disruptions in power transmission. Logical and physical redundancy methods can be used to guard against temporary disruptions in network operation. Another key challenge in network design is to ensure the timely delivery of sensor data and commands to the critical avionics computers.

In the Modular Avionics Architecture (MAA) described in this paper, many computational modules form a highly redundant mesh network of computational resources for acquiring sensor data, calculating flight control and subsystem control loops, and commanding the actuator assemblies. This provides the ability to define the configuration of these avionics modules dynamically, either as highly redundant controllers or as multiple computer systems each providing a dedicated function, with a huge aggregate computational capability. Coupled with ARINC 653, critical control and operations support functions can also run concurrently in any avionics module, significantly increasing capability for hosting complex software. The best solution is probably a hybrid of these two approaches: a set of modules provides critical command and control, while the remainder is used for payload and mission support functions. As critical avionics nodes fail, the support nodes switch function, backfilling the architecture so that a certain number of processors are always used for critical control. Therefore, this type of system will see a gradual decrease in non-critical support functions prior to any loss of critical function. When all avionics nodes are operational, there is a large amount of excess capacity for other functions, such as supporting virtual interfaces and intelligent agents. Certain modules may act as cold spares, coming on-line only after numerous failures of the primary modules.
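The graceful-degradation policy described above, keeping a fixed number of modules on critical control and backfilling from support duty as failures occur, can be sketched in a few lines. This is an illustrative sketch only; the quorum size and node names are invented:

```python
# Sketch of backfilling: the first CRITICAL_QUORUM healthy nodes always run
# critical control; the rest run payload/mission support. As critical nodes
# fail, support nodes are promoted, so non-critical functions degrade first.

CRITICAL_QUORUM = 3  # illustrative minimum complement for critical control

def assign_roles(healthy_nodes):
    """healthy_nodes: ordered list of currently healthy module IDs.
    Returns {node: role} with a full critical quorum whenever possible."""
    roles = {}
    for i, node in enumerate(healthy_nodes):
        roles[node] = "critical" if i < CRITICAL_QUORUM else "support"
    return roles

assign_roles(["C1", "C2", "C3", "C4", "C5"])  # C1-C3 critical, C4-C5 support
assign_roles(["C2", "C4", "C5"])  # after C1 and C3 fail, all three go critical
```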

A block diagram of the Modular Avionics Architecture (MAA) suitable for inflatable spacecraft is shown in Figure 3. This is a notional diagram, intended to show how such a system might be constructed and how it might be managed. For simplicity, only three computational clusters are shown, C1, C2 and C3, representing the minimum complement for critical control. An actual implementation would have dozens of these clusters, each with similar characteristics. Each cluster has a complement of sensors located in reasonable proximity to their primary computational module, a wireless data link for the sensors, and a mesh wireless network connecting each cluster to the others and to the subsystems being controlled.

Referring to the diagram, sensors belong to a cluster where, utilizing their own “sensor” wireless data links, they connect to the avionics module nearest their physical location. In this example, the sensors are chosen to represent a system for monitoring inflatable structural health. Accelerometers (A-2) measure vibrations and movement of the fabric, strain gauges (S-2, S2-2) measure forces on the fabric, while temperature (T-2) and pressure (P-2) sensors measure properties of the fabric and environment in the vicinity of the cluster. By properly distributing these clusters throughout the inner surface of the containment volume, the entire structure can be monitored. Each cluster can be labeled as a vehicle monitoring zone, to specify its location and coverage. If the sensors are inertial measurement units, star trackers, GPS units and other navigational sensors, then the clusters can be used for flight control. If they are pressure, temperature, oxygen concentration and humidity sensors, then the clusters can be used for environmental control. Many combinations are possible, yielding a wide range of potential design solutions.
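The zone-and-cluster arrangement just described can be sketched as a simple registry: each zone's avionics module holds the latest readings from its local wireless sensors and checks them against limits. The sensor IDs follow Figure 3; the readings and limit values are invented for illustration:

```python
# Sketch of zone-based sensor clustering for structural health monitoring.
# Sensor IDs (A-2, S-2, ...) follow the paper's Figure 3; limits are invented.

zones = {
    "zone-2": {
        "A-2":  {"type": "accel",  "value": 0.02,  "limit": 0.5},     # g
        "S-2":  {"type": "strain", "value": 310.0, "limit": 1200.0},  # microstrain
        "S2-2": {"type": "strain", "value": 290.0, "limit": 1200.0},
        "T-2":  {"type": "temp",   "value": 21.5,  "limit": 45.0},    # deg C
        "P-2":  {"type": "press",  "value": 101.2, "limit": 103.0},   # kPa
    },
}

def zone_alarms(zone_id):
    """Return IDs of sensors in a zone whose readings exceed their limits."""
    return [sid for sid, s in zones[zone_id].items() if s["value"] > s["limit"]]
```

A real system would of course carry timestamps, calibration data and link-quality status per sensor; the point here is that each cluster is a self-describing monitoring zone.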


Certain avionics modules would be mounted on the exterior of the containment volume, outfitted with high-definition cameras for obtaining visual images and with interfaces for controlling solar power panels and radio systems mounted to the exterior of the spacecraft. They could be powered by the same inductive method that powers the interior modules and would be nodes on the mesh wireless network used for critical control.

A separate wireless “mesh” network connects the first avionics computing node (C1) to the other computing nodes (C2, C3). The mesh network creates a method for speedy propagation of all network data to all computing nodes and allows any of the avionics nodes to act as a communication repeater for any other node. The use of a mesh network provides routing capability from any node to any other node, even as nodes fail. Using this method, highly reliable interconnects can be created, even using unreliable means such as radio for the physical layer. The use of two diverse data transmission methods (i.e., radio and optical) improves reliability and prevents common-mode failures caused by electromagnetic interference or blockage of the optical path.
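The routing-around-failures property of the mesh can be demonstrated with a breadth-first path search over the link graph. The four-node topology below is illustrative only (real mesh protocols also weigh link quality, not just hop count):

```python
# Sketch of mesh resilience: any node can reach any other through repeaters,
# and routing survives node failures. Topology is illustrative.

from collections import deque

def route(links, src, dst, failed=frozenset()):
    """Breadth-first search for a shortest hop path from src to dst,
    skipping failed nodes. links: dict node -> list of neighbors.
    Returns a path list, or None if dst is unreachable."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for nxt in links.get(node, ()):
            if nxt not in seen and nxt not in failed:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

# C1 reaches C4 only through the repeaters C2 or C3:
mesh = {"C1": ["C2", "C3"], "C2": ["C1", "C4"],
        "C3": ["C1", "C4"], "C4": ["C2", "C3"]}
# route(mesh, "C1", "C4")                 -> ['C1', 'C2', 'C4']
# route(mesh, "C1", "C4", failed={"C2"})  -> ['C1', 'C3', 'C4']
```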

The computing takes place in multiple computing nodes that duplicate each other's functions. Theoretically, with the same inputs, running the same algorithms, each node should produce the same results. The actuator control modules perform the majority voting scheme, rejecting any avionics node's output that fails to conform to the others. In the event of failures, the sheer number of sensors and computing nodes ensures majority control, leading to a very robust system. The entire system is time synchronous, with the computing nodes providing their control outputs to the actuator modules within the same time frame. The network must be capable of delivering all data and message traffic in the system within the time window required for time-critical flight control functions, generally on the order of 10 milliseconds. A distributed, rather than central, timebase is required for fault tolerance, and there are many such schemes.

Excess computational capacity is used to host a variety of crew and mission support applications, in particular embedded virtual/augmented reality HSI and intelligent agents for high-level interaction. Certain computational nodes can be human-system interfaces, such as mobile displays with joysticks, virtual reality helmets and other unconventional devices. In microgravity, sitting down at a keyboard and display and using a mouse is not a valid concept. Astronauts generally Velcro the laptop to their knees and use their own body as the counterweight for pressing keys and using trackballs, pointing pads or sticks. So, body-worn computing is already used aboard spacecraft.

Sensors for Inflatable Structure Health Management

Intelligent sensors are another method for creating massively redundant spacecraft control systems, providing significant increases in monitoring capability and fault tolerance. The "intelligence" in this case is the ability to correlate multiple sensing elements and determine the correct average value, the ability to monitor internal sensing and measuring circuits, and the ability to communicate on the wireless sensor network.
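The element-correlation idea can be illustrated with a simple median-screened average over a module's redundant sensing elements; the deviation threshold is an assumed parameter, not a value from the paper:

```python
def fused_reading(elements, max_dev=0.5):
    """Correlate readings from a module's redundant sensing elements:
    discard any element that deviates too far from the median, then
    report the mean of the surviving elements. `max_dev` is an assumed
    acceptance threshold in engineering units."""
    ordered = sorted(elements)
    mid = len(ordered) // 2
    median = (ordered[mid] if len(ordered) % 2
              else (ordered[mid - 1] + ordered[mid]) / 2)
    good = [x for x in elements if abs(x - median) <= max_dev]
    return sum(good) / len(good)
```

A stuck or drifted element (25.0 among readings near 20.1 below) is screened out before averaging, which is the kind of internal cross-check the module can report over the sensor network.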

Figure 3. Modular Avionics Architecture (diagram: computing nodes C1–C3 with distributed timebase control, voting actuator controllers on channels A, B and C with manual emergency control, core vehicle subsystem sensors, and sensor clusters A/S/S2/T/P grouped into vehicle zones 1–3)



A key design challenge is to provide adequate sensing of the inflatable structure by monitoring strain forces on the fabric and detecting leaks. This would include environmental parameters such as temperature, oxygen and CO2 content and other key variables. One approach would be to embed such sensors into the fabric of the inflatable, in such a manner as to support the accurate measurement of the desired parameters. As in the case of the avionics nodes, these sensors would rely on wireless power and data transmission to function within the integrated system. Again, since the volume, mass, financial cost and interconnect cost of each sensor is minimized using this approach, one can afford to use many sensors. Highly redundant sensing systems allow many failures before reliability of the entire integrated system is compromised.

The diagram of Figure 4 shows how a strain sensor could be embedded into the spacecraft fabric, including the inductive power coupling and the associated data link to the avionics module. Channels for carrying high-frequency current for powering the sensors and avionics modules would be created by conductive traces embedded in the fabric layers. These conductive traces could be associated with reinforcement materials to provide an integrated solution. Note that this solution requires no penetration of the critical pressure interfaces of the containment fabric.

The use of wireless data transmission eliminates any need for additional conductive traces, simplifying the design. Each sensor would need to interface to a wireless network dedicated to sensor modules, necessitating processing power co-located with the sensor to format the sensor data to the data link network protocol. If the sensor data stream were to be TCP/IP compliant, this would require significant processing power to implement due to the complexity of the Internet protocol. However, low-level peripheral interconnects such as Bluetooth or Zigbee networks could be used that only define the protocols to the data transport level, relying on low-level error handling. These protocols and physical implementations were designed with sensor networks as targets, simplifying the design of sensor networks, with much of the network protocol implemented in dedicated chips and low-power microcontrollers.
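As an illustration of how little framing such a low-level link needs compared with a full TCP/IP stack, a hypothetical 10-byte sensor frame might be packed as follows; the field layout is invented for this sketch:

```python
import struct

# Hypothetical compact frame for a low-level sensor link: a one-byte
# sensor ID, a one-byte measurement type, a 32-bit millisecond
# timestamp and a 32-bit float reading, packed little-endian.
FRAME = struct.Struct("<BBIf")

def encode(sensor_id, kind, t_ms, value):
    """Pack one sensor reading into a 10-byte frame."""
    return FRAME.pack(sensor_id, kind, t_ms, value)

def decode(frame):
    """Unpack a frame back into (sensor_id, kind, t_ms, value)."""
    return FRAME.unpack(frame)
```

Ten bytes per reading is well within the payload of a single Zigbee or Bluetooth packet, which is why the transport-level protocols mentioned above suffice without IP overhead.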

Data processing power available at the sensor location also allows implementation of intelligent sensor networks compliant with IEEE 1451.1, which defines smart sensor modules capable of facile networking in large-scale systems. [4] The companion standard, IEEE 1451.2, describes Transducer Electronic Data Sheets for standardized interchange of sensor data and characteristics, including calibration data. [5] The increased processing power can support sensors with multiple sensing elements, which allows cross-checking to be performed at the sensor module level, allowing more precise and reliable sensor design. Sensor failures are the most probable failure in aerospace systems, so improvement in this area provides significant advantages for vehicle reliability. [6]
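A TEDS-style record could look something like the sketch below, where the module carries its own identity and linear calibration so the network can interpret raw counts without external configuration; the field names are illustrative and do not follow the actual IEEE 1451.2 layout:

```python
class TransducerSheet:
    """Sketch of a TEDS-like record in the spirit of IEEE 1451.2.
    Field names and the linear-calibration model are illustrative,
    not the standard's binary layout."""
    def __init__(self, model, serial, units, gain, offset):
        self.model, self.serial, self.units = model, serial, units
        self.gain, self.offset = gain, offset

    def calibrated(self, raw_counts):
        # Apply the module's own stored calibration to a raw reading.
        return self.gain * raw_counts + self.offset

# Hypothetical strain gauge module announcing itself on the network.
strain = TransducerSheet("SG-100", "0042", "microstrain",
                         gain=0.25, offset=-12.0)
```

Because the calibration travels with the module, a replacement sensor is usable the moment it joins the wireless network, which supports the plug-and-play redundancy argued for above.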

There are many new types of sensors that can be used for monitoring the health of the inflatable structure itself. These range from microwave detectors for fabric integrity measurements, which detect impedance changes in the fabric indicative of changes in the multilayer dielectric constant, to acoustic sensors that listen for leaks. Acoustic sensors mounted to the surface of the inflatable could also pick up micrometeorite impacts, warning of potential damage. Fiber optics embedded in the fabric, with Bragg diffraction gratings, can be used to monitor strain and temperature over a large area. Pressure sensors and regulators could be used to ensure accurate inflation and monitor the amount of make-up gas required to maintain inflation. These methods would detect very small leaks, but not their location. Interface areas between fabric-to-structure joints and other stress areas would need additional monitoring.

Fault Tolerance

The architecture can provide robust fault tolerance. In case of sensor failure, the architecture incorporates numerous redundant sensors for each critical function, always preserving multiple ways of performing the critical measurements. When all sensors are functioning, much redundant data is available for confirming sensor readings and calibration, covering sensor malfunctions or calibration drift. When fewer sensors are available, the algorithms can select the ones needed for closing control loops. Many sensors would need to fail in order to result in complete loss of function. This scheme requires many sensors and algorithms capable of determining consistency within sensor sets.

Figure 4. Embedded Intelligent Sensor (diagram: strain sensor embedded in the multi-layer inflatable fabric, drawing 400 kHz main sensor power through an inductive coupling and communicating with the avionics node over the sensor data link)

Similarly, highly redundant wireless power and data interconnects provide reliability consistent with current wired systems. The key to implementing such highly parallel redundant systems is to reduce the cost of each sensor and computing package (volume, mass, power consumption and funding) to a minimum. Moreover, since each module is of the same design, significant effort must be applied to improving module reliability to prevent common-mode failures capable of affecting many modules simultaneously. Modern computing and sensor technology can allow such dramatic reductions in cost and improvement in reliability.

Applying dozens of computational modules to each critical control loop can provide a considerable excess of computing function in such a system; therefore, loss of a computational module is simply a loss of redundancy. How many failures can be tolerated is determined by the excess capacity provided. Transient errors are covered by the voting and coordination schemes. Since much of the voting and coordination can be implemented in software and the hardware functions are relatively unconstrained, complex and novel methods for coherency can be employed. Many of these methods have yet to be developed and proven. Another key is the use of time stamping on all data and message traffic within the network, which minimizes the possibility of Byzantine fault modes in such highly parallel redundant systems. Data and messages that are not time synchronous with the current control cycle are rejected by the actuator controllers.

Redundant architectures need to cover failures of the actuators. Actuators can be control surfaces, thrusters, heaters, coolers, pumps, power switches or any other devices needed for spacecraft flight control, life support and vehicle operations. Each actuator controller is a separate computing node, accepting control inputs from multiple computing modules and using a majority voting algorithm to accept only correct inputs. To prevent single points of failure, these actuator modules would require redundant computing, data communications and actuator output functions. Again, the concept of double-fault tolerance would be applied to any flight-critical or crew-critical function. An important consideration is voting between redundant actuator modules. If a redundant actuator suffers an internal transient error, this is covered by requiring the three actuator modules to vote on each other's control outputs, ensuring that at least two of the three redundant channels agree on the next actuator state. This provides at least the same level of fault protection as current avionics designs.
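The 2-of-3 cross-vote between actuator channels might be sketched as follows, with channel states as opaque values and a `None` result meaning "hold the last safe state" (an assumed fail-safe policy, not one the paper specifies):

```python
def channel_output(own, peer_a, peer_b):
    """Cross-vote among three redundant actuator channels: a channel
    drives its output only when at least two of the three computed
    states agree, masking a transient error in any single channel.
    Returns the agreed state, or None to hold the last safe state."""
    states = (own, peer_a, peer_b)
    for candidate in states:
        if sum(candidate == s for s in states) >= 2:
            return candidate
    return None
```

A transient error in any one channel is outvoted by the other two; only a (much rarer) three-way disagreement forces the hold behavior.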

Loss of the cockpit in an aircraft is critical, since there is only one of them. By incorporating diverse redundant HSI devices for flight control, this single point of failure is eliminated. Note how well the MAA supports roving HSI devices that support computational capability on the mesh WLAN. The MAA network routes the user input and output to the correct avionics module hosting the software being used. In case of HSI device failure, the crewmember uses an alternative or spare device.

Finally, incorporating simple backup flight control and critical control functions in the actuator modules, independent from the primary avionics, can provide disparate redundancy. Manual control interfaces for the crew can also be provided. However, this backup system would only possess rudimentary sensors and functions, sufficient to allow operation of critical subsystems for the duration of the primary control system outage. This is a low-impact method for providing disparate redundancy to cover common-mode failures in the massively parallel primary avionics system.

4. HUMAN-SYSTEM INTERFACES

Virtual Human System Interface (HSI) devices can provide crew mobility, freedom from position constraints (common in earth gravity) or physical restraints (common in microgravity) through the use of body-worn components, producing information fusion from a variety of spacecraft sources. Significant computational power is needed, particularly as user interaction modes become more complex through the use of intelligent virtual agents. Improvements in display technology are needed to allow high-resolution camera video and still images to be combined with computer status information. Methods of tracking crewmember position and orientation are needed to create the virtual fields, which change as the crewmember moves their head around. All this has to be provided in a comfortable manner, capable of being used for long periods without fatigue.

Operator input also poses challenges for virtual cockpits, and new methods of user input are needed, based on gestures, motion or speech. Since the virtual cockpit may not have a specific dedicated location within the spacecraft, mobile methods are preferable. However, for critical operations such as docking, maneuvering and teleoperations, specialized devices such as trackballs or joysticks may be necessary. Devices that work using gestures, like the Nintendo Wii computer game console, are potentially useful for this purpose. For interaction using speech, a headset would have to be worn. Finally, feedback to the crewmember, in the form of physical forces proportional to some desired parameter, can provide a total immersion experience, leading to more accurate operation.

Speech and synthesized voice interaction is highly effective for general interaction with computing systems, and research has significantly improved the capability for critical command and control. These systems use high-end phonetic speech recognition coupled with dialog managers designed for specific tasks. This combination significantly increases recognition accuracy and speed, providing pragmatic capability. For example, the crewmembers aboard the International Space Station (ISS) have tested voice recognition for use in critical procedure execution for checkout of space suits. [7]
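The task-specific dialog-manager idea can be illustrated with a toy state machine whose states each admit only a small command grammar; restricting the active vocabulary this way is what lets the recognizer achieve high accuracy. The states and phrases below are invented for illustration:

```python
# Toy dialog manager: each task state admits only a small command
# grammar, mapping recognized phrases to the next state.
GRAMMAR = {
    "idle":     {"begin suit checkout": "checkout"},
    "checkout": {"next step": "checkout",
                 "repeat step": "checkout",
                 "end checkout": "idle"},
}

def handle(state, utterance):
    """Return (next_state, accepted): advance if the utterance is in
    the active grammar, otherwise stay put and flag it out-of-grammar."""
    nxt = GRAMMAR[state].get(utterance)
    return (nxt, True) if nxt else (state, False)
```

An out-of-grammar phrase is rejected rather than misrecognized as a command, a useful property when the commands are safety-critical.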

Virtual Cockpits

Current spacecraft and aircraft have cockpits where all operator input and output is concentrated. Visual input from real images viewed through windows is used to fly the vehicle, with control inputs from sticks and hand-controllers directly affecting control surfaces. Additional flight systems information or autopilot headings can be shown in heads-up displays (HUD) or head-mounted displays (HMD), allowing the pilot to see both the view outside the cockpit and the ancillary information presented without changing his head position. Advances in fly-by-wire and other methods allow this direct connection to be bypassed in favor of computational methods that can provide significant advantages for control stability and robustness. Indeed, as in the case of the Supersonic Transport, cockpit windows were a real problem, and virtual methods of presenting external visual information to the pilot have been investigated. In this approach, cameras are used to present a virtual picture to the pilot, using either HUD or HMD technology.

Widely disseminated HSI hardware can match similarly dispersed MAA software architectures, altering the need for spacecraft cockpits where information from all the vehicle's systems converges and from which it is controlled. HSI devices can be used at any location within the spacecraft, connecting to the nearest computing node using the wireless network and functioning well as roaming mobile devices, enabling innovative methods of HSI interaction for future spacecraft. Crewed spacecraft, in fact, can increasingly be designed in the manner of remotely controlled vehicles, in which the cockpit function does not need to be physically localized in one place in the vehicle and may not even be located within the vehicle at all.

The only time that cockpit functions are required is during transit orbit insertion, trajectory corrections and orbital insertion (assuming that the surface landings will be made in smaller craft). During this time the crew would don virtual reality (VR) headsets with microphones and earphones, and take out flight controls hardware, which might look like computer game controllers, for the pilot and commander. All of the rest of the status checks and burn equipment readiness checks can be made by voice and by checking the virtual reality displays. Images from outside the spacecraft would be presented to the crewmembers, perhaps synchronized with the visual field, providing the capability for viewing the outside environment in all directions.

Crew location and tracking of body position are needed to take advantage of these developments. Miniaturized low-power sensors and effectors may be mounted at a variety of places on the crew themselves, allowing precise tracking of their position, body postures and biometric status, i.e., heart and metabolic rate. These parameters may be wirelessly transmitted to the relevant spacecraft systems, allowing more accurate tracking of crew location and supporting maintenance of physical and cognitive workload within acceptable margins. Moreover, since these smaller head- or body-mounted displays can move around with the user, they open the possibility of the virtual cockpit being totally mobile and accessible anywhere within the pressurized spacecraft volume. Indeed, if the wireless network is appropriately configured, the cockpit could even be accessed from outside the vehicle during EVA (HAL 9000, watch out!).

Augmented Reality

For ordinary vehicle and mission operations, the crew could use alternative displays and input methods, not requiring VR headsets. For example, the spacecraft could be equipped with beam-forming microphone arrays and 3D spatial audio speakers, capable of monitoring spoken commands and providing directional audio output directly to the crewmember. This will allow the intelligent agent system to communicate with any crewmember and receive voice commands from any place in the spacecraft, including the crew quarters. Mobile hand-held displays or projected displays could provide a similar capability for visual information.

Each crewmember would have an area of responsibility and would be presented the appropriate displays and hand controllers and have the appropriate voice commands available. The software would implement a flexible presentation interface coupled to the workflow of the present operation, with crew being able to override the default presentation. At any point in time the system would be capable of answering questions, displaying diagrams and procedures and showing schedules and timelines.


Functions for operating spacecraft systems and monitoring the progress of the mission can be performed through augmented reality displays and voice commands. In augmented reality, information systems help crewmembers perform their duties by presenting relevant procedures in the correct time sequence along with relevant technical information, drawings and pictures. In effect, this type of assistant knows about the task being performed, using correlations to mission procedures and documentation to present the appropriate information to the crewmember at the right time, or upon request.

The augmented reality system will have at least the following capabilities:

(1) Controlling avionics and information displays.

(2) Reminding the crew of tasks to be performed.

(3) Alerting the crew to off-nominal and emergency conditions in the spacecraft.

(4) Helping run diagnostic tests and conduct recovery activities in the case of failures.

(5) Planning and scheduling crew work activities and communications.

(6) Navigation and mission progress reporting.

(7) Procedure execution assistance.

(8) Robot control and monitoring.

(9) Medical monitoring and diagnosis.

(10) Communication facilitation with Earth.

The spoken dialogue system will be the human interface to a series of intelligent agents which will help run the spacecraft both autonomously and under human control. These intelligent agents are software processes running in the computational modules, acting as high-level user interfaces to the control software. Because transit and habitat spacecraft will be more complicated than previous spacecraft, they will have to be largely autonomous for a small crew to operate and monitor, particularly for long-duration missions. Since speech is a very natural communication modality for humans, it will be the best interface to the spacecraft systems, particularly when augmented by haptic pointing and gesture input devices and HMDs. Haptic devices such as miniature finger-mounted buzzers can also be introduced to provide positive tactile feedback for operation of virtual switches and knobs.

The key to coherent function is to virtually connect the HSI components to the software processes needing human input and monitoring. In the case of complex spacecraft, this usually encompasses multiple concurrent processes, with separate virtual displays for output. Training would be embedded in the system in the form of refresher videos for instruction. Overlays of information during repair operations can help guide the crewmember and provide procedures directly through displays or audio output, replacing paper methods. Most procedures have been migrated to electronic displays for ISS already. The picture in Figure 5 is a photo mockup showing flight engineer Vladimir N. Dezhurov using a virtual reality HMD aboard the International Space Station. Note the presence of copious amounts of paper procedures posted throughout the spacecraft, and the large amount of wiring, floating in the air, waiting to ensnare hapless crewmembers.

Figure 5. Augmented Reality Concept Mockup

Since radio communication with the ground will be subject to long delays (about a 40-minute light-speed round-trip transit time), all the information necessary for problem solving has to be on board the spacecraft. Advanced search techniques will allow the crew to find the relevant documentation for any problem or subsystem in the spacecraft. The crewmember will be able to request information using spoken commands and flexible dialogue. The intelligent agent systems would be designed to try to diagnose problems and provide possible remedies, because the spacecraft systems may be too complicated for crewmembers to understand in detail. Of course there will always be unforeseen problems requiring human reasoning and diagnosis, which can be improved by the proper presentation of relevant information.

Virtual User Interface Design

The dramatic reorganization of crewed spacecraft made possible by the new computing and communication hardware and user-interface technologies requires appropriate design. Specific analyses need to be conducted to validate the savings and document the effect of the panoply of new risks and issues raised by adoption of virtual interfaces based on body-worn sensors and displays. It is important to note that similar claims have been made about the underlying technology for decades, first when such body-referenced computer displays were conceived in the 1960s and later when the aerospace applications were more specifically detailed in the mid-1980s. At that time the body position tracking systems, visual display systems, interactive 3D computer rendering, and kinematic and dynamic computer simulations were not even approaching performance levels that would be practical. [8] These technical barriers have largely been broken, and the principal remaining issues concern concepts of operations and system engineering. [9]

The virtualization of the user interface for cockpits and control centers raises a number of significant design issues, because the very nature of such interfaces removes many of the sensory cues associated with conventional user-computer interaction, like the sound and feel of fingers hitting a keyboard and the kinesthetic position cues associated with mouse use. These ancillary cues are important for establishing control contexts and for keeping users aware of "where" they are in their task or software operation. Fortunately, the same display techniques that interfere with crew awareness can be used to enhance it by introducing naturalistic sensory feedback for inputs to virtual controls. For example, a switch closure on a visually presented virtual control panel can be confirmed by an auditory click as well as a visual change in color. If necessary, miniaturized tactile displays built into clothing or gloves can also be used to add haptic feedback.
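Such multimodal confirmation might be wired up as in the sketch below, where toggling a virtual switch emits visual, auditory and haptic events to whatever renderers are attached; the event names and shapes are assumptions for illustration:

```python
class VirtualSwitch:
    """Sketch of multimodal confirmation for a virtual control: a
    toggle fires visual, auditory and haptic feedback events rather
    than silently changing state. Event tuples are illustrative."""
    def __init__(self, name, emit):
        self.name, self.emit, self.closed = name, emit, False

    def toggle(self):
        self.closed = not self.closed
        color = "green" if self.closed else "gray"
        self.emit(("visual", self.name, color))   # color change on panel
        self.emit(("audio", self.name, "click"))  # confirming click
        self.emit(("haptic", self.name, "pulse")) # glove/tactile pulse
        return self.closed

events = []
sw = VirtualSwitch("cabin-fan", events.append)
```

Decoupling the switch from its renderers via an `emit` callback lets the same control drive an HMD overlay, spatial audio, or a tactile glove interchangeably.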

The disseminated user-interface design should be guided by some of the rules for such systems proposed by Don Norman: [10]

(1) Be predictable.

(2) Keep things as simple as possible.

(3) Provide good conceptual models.

(4) Provide rich, complex, and natural signals.

(5) Make the output understandable; provide reasons for actions.

(6) Provide continual awareness and reassurance of intended operation without annoyance.

(7) Exploit natural mappings of user input to environmental effects.

The first two rules incorporate a realization that unnecessary complexity in an interface is definitely a defect. Therefore, the system's expectation of user expertise needs to be indicated to the user to avoid affecting predictability. The remaining admonitions all relate to providing good conceptual cognitive models for system navigation. These support the predictability of the system, since one of the features of a "good" conceptual model is that it leads to accurate expectations of system behavior. These accurate expectations are supported by the provision of rich, complex, natural feedback, recommendation four. This kind of feedback supports the understandability of the interface, recommendation five. The last recommendation, to use natural mapping, refers to metaphorical relationships between patterns of user inputs and system responses and other analogous relationships in the real world. When we wish to modify something on our right, we typically move our hands towards our right. The guidelines outlined above must be made more concrete for specific designs, but their consideration will generally improve virtual user interfaces.

5. INTELLIGENT AGENTS

A complementary concept to mobile virtual HSI for spacecraft command and control is the use of "intelligent agents" to act as avatars for human interaction, actors for autonomous operations and experts for knowledge management. The large amount of computing provided by MAA aboard a spacecraft can be comparable to an enterprise IT system. On a long-duration mission to the moon or Mars, this capability may significantly improve the abilities of the limited crew complement to perform vehicle and mission operations activities and can lead the way to more autonomous, capable and reliable human spacecraft.


Figure 6. Spoken Dialogue Interaction

NASA Ames Research Center has developed prototypes for intelligent agents applied to mixed human-robotic teams and to planetary habitats. [11] These software systems were supported by facile mobile information systems, and simulations of planetary surface surveys were conducted in analog environments. [12] The intelligent agents, implemented in the Belief-Desire-Intent language Brahms, responded to questions like "Where am I?", "What is the current activity?", "What is the voltage from the generator?" and other queries of information available in the distributed IT system. The agents would also initiate commands to the robots: "Robot, take a picture of me", "Robot, go to location 1, GPS coordinates …" etc. Note how useful these types of human-to-agent interactions could be aboard a spacecraft and how they mimic natural interactions between people. Figure 6 shows a picture of an astronaut performing payload operations using interactive spoken dialogue with a human on Earth. However, aboard a future transit vehicle, the other party may well be an intelligent agent, particularly when light-speed delays prevent real-time communication with Earth.

Intelligent agents will be, in effect, the extra crew members needed to run and maintain the spacecraft. A crew of 3-5 persons may not be able to operate a complicated spacecraft without the support of virtual crew members capable of working 24/7. Many functions can be handled or augmented by these virtual assistants. Spacecraft monitoring, maintenance and repair operations are obvious candidates given the amount of work required on these tasks. Activities requiring specialized knowledge, such as medical treatment, could be facilitated by giving the human crewmember help in executing procedures for diagnosis and treatment. The following are brief summaries of selected agent functions proposed for particular activities.

Repair Agent

Intelligent agents can monitor subsystems and follow trends in operating parameters which might signal impending problems, and schedule maintenance before the system fails or loses significant function or performance. The following subsystems will need to be monitored continuously: 1) life support, 2) navigation, 3) propulsion, 4) communication, 5) consumables, 6) power, 7) radiation shielding, and 8) structural integrity. Most repairs have complicated diagnostic and replacement procedures to be followed, along with needed diagrams and technical information. The spoken dialogue system can read out the procedure steps one by one and provide the necessary diagrams and pictures. All of this material needs to be on the spacecraft, so that no interaction with the ground is necessary to proceed with the repair. A set of intelligent repair agents can significantly reduce the time to diagnose and complete a repair while improving the quality of the repair.
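Trend-following for predictive maintenance can be as simple as a least-squares line extrapolated to an alarm limit, as in this sketch; the flat-trend policy of returning `None` is an assumption:

```python
def cycles_to_limit(history, limit):
    """Fit a straight-line trend through recent parameter samples
    (ordinary least squares) and estimate how many more sample
    intervals remain before the parameter crosses its alarm limit.
    Returns 0 if already over the limit, None if the trend is flat
    or moving away from the limit."""
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history))
             / sum((x - mean_x) ** 2 for x in xs))
    latest = history[-1]
    if latest >= limit:
        return 0
    if slope <= 0:
        return None
    return (limit - latest) / slope
```

A repair agent could run this over, say, pump-current telemetry each cycle and schedule maintenance once the estimate drops below a planning horizon.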

Medical Agent

The crew is a vital part of the spacecraft, and their health is as important to the success of the mission as the health of the vehicle. The health of the crew will be monitored continuously, both during exercise and normal activities. This monitoring will have to be unobtrusive because of the long duration of the mission, and this requirement is nicely met using body-worn sensors. Sensor-bearing vests or shirts have been suggested as ways of measuring heart rate, body temperature, EKG and arterial oxygen. These could be integrated with the sensors used to monitor location and body position for virtual HSI. These sensors provide information wirelessly to the spacecraft medical intelligent agent. The agents would keep track of all of the crew physiological data and analyze it for signs of medical problems in each crew member before they became serious. Often trends in medical data are very valuable for finding conditions that may become serious in the future.

During medical emergencies like injuries or breathing difficulties, the medical agent would interact with the caregiving crew member to assist in diagnosis and treatment. Much work has been done in the past to develop diagnostic decision trees which allow the system and caregiver to make a rapid diagnosis. Since the medical agent would already have many of the vital signs, it could quickly eliminate diagnosis subtrees that do not match the measured symptoms. The agent could also communicate with Earth-based flight surgeons for consultation, saving time during serious medical conditions.
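The subtree-elimination idea might look like the following sketch; the conditions and thresholds are invented for illustration and are not medical guidance:

```python
# Toy diagnostic table: each candidate condition lists vital-sign
# tests it must satisfy. Names and thresholds are invented.
CONDITIONS = {
    "dehydration": [("heart_rate", ">", 100), ("temp_c", "<", 38.0)],
    "infection":   [("temp_c", ">", 38.0)],
    "hypoxia":     [("spo2", "<", 90)],
}

def plausible(vitals):
    """Prune candidate diagnoses that contradict the measured vitals,
    so the caregiver only walks the surviving subtrees."""
    def holds(test):
        key, op, threshold = test
        value = vitals.get(key)
        if value is None:           # unmeasured: cannot rule out
            return True
        return value > threshold if op == ">" else value < threshold
    return [name for name, tests in CONDITIONS.items()
            if all(holds(t) for t in tests)]
```

Note the conservative rule for unmeasured vitals: a missing reading never eliminates a candidate, which matches the agent's role of narrowing, not replacing, the caregiver's judgment.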

Crew Personal Agent

Each crew member would have an intelligent agent to keep track of schedules, exercise, entertainment and correspondence with the earth. On the long trip to Mars it will be necessary to keep up a rigorous exercise program. Since the whole crew will be dependent on a limited number of exercise machines, it will be necessary to schedule these machines among the whole crew. The agent system will dynamically allocate exercise times and update the schedule in case of breakdowns and other delays. Scheduling and coordination can be optimized and the results of this optimization sent to the Crew Personal Agent for execution, by guiding the crewmember through scheduled activities. This same concept can be applied to any constrained resources.

Communication with the ground will be delayed by up to 40 minutes, so a system which keeps track of crew questions and answers received from Mission Control will be very important. When the answer to a question arrives 2 days after the initial query, it will be necessary to pair the query and answer automatically to save crew time and jog their memory. These communications will be in speech and text/picture form and will have to be indexed and tracked.
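Pairing delayed answers with their queries is straightforward if each exchange carries a correlation ID, as in this sketch; the IDs, timestamps and record shapes are assumptions:

```python
class QueryLog:
    """Pair crew questions with answers that arrive from Mission
    Control minutes or days later, using a correlation ID so the
    whole exchange can be replayed together."""
    def __init__(self):
        self.open, self.resolved = {}, []

    def ask(self, qid, text, t):
        # Record an outbound question with its send time.
        self.open[qid] = (text, t)

    def answer(self, qid, text, t):
        # Match the inbound answer to its question and log the delay.
        question, asked_at = self.open.pop(qid)
        self.resolved.append({"id": qid, "question": question,
                              "answer": text, "delay": t - asked_at})

log = QueryLog()
log.ask("q1", "Can we reschedule exercise session 3?", t=0)
log.answer("q1", "Approved; reschedule within 24 h.", t=2400)
```

The same index works regardless of medium; speech and text/picture exchanges just carry the same correlation ID in their metadata.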

6. TECHNOLOGY DEVELOPMENT

Implementing the MAA and associated HSI concepts will require investments in improving computer technology, both hardware and software. Fortunately, very little needs to be invented, since most components have either commercial or developmental predecessors.

Significant development of radiation-hardened avionics computing nodes would be needed. Power consumption would have to be dramatically reduced. Intelligent sensor modules would need to be developed, all while minimizing installed cost. Sensor modules would need to be very small, incorporating multiple sensing elements, a processor and a network interface. The MAA relies on high-performance computing and networks, which can function at many times the rate needed for control. This large excess of performance is used for implementing the voting and coherency algorithms needed for effective coordination. Fortunately, modern semiconductor technology provides the means to implement all of these features.
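The voting step that this excess performance pays for can be illustrated with a simple plurality vote: redundant nodes each compute a result, and the voted output masks any single faulty node. The quorum policy below is an assumption; real fault-tolerant systems use more elaborate agreement protocols.

```python
# Sketch of majority voting among redundant computing nodes. A plurality
# vote with a quorum threshold is assumed here for illustration.

from collections import Counter

def vote(results, quorum):
    """Return the plurality result if it reaches quorum, else None."""
    if not results:
        return None
    value, votes = Counter(results).most_common(1)[0]
    return value if votes >= quorum else None

# Three redundant nodes, one faulty: the vote masks the fault.
print(vote([42, 42, 17], quorum=2))   # → 42
```

With N nodes and a quorum of ⌈(N+1)/2⌉, up to ⌊(N-1)/2⌋ faulty nodes can be masked, which is why the architecture trades raw performance for replicated computation.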

In addition to further development of low-power, radiation-tolerant electronics, significant advances in avionics packaging would be required. For example, such avionics modules could not be liquid or conduction cooled; rather, thermal control would have to be passive. Given that no wired interfaces penetrate from the exterior, completely sealed packages would be possible, which can increase reliability and interchangeability.

Network technology has progressed significantly with the adoption of AFDX time-deterministic Ethernet for critical aerospace control applications [13]. This is being used in the latest Airbus and Boeing passenger aircraft. Time-triggered Ethernet is a similar alternative. Each method relies on issuing packets on a timed basis, with sufficient gaps between packets to allow other time-critical packets to be interleaved, which produces bounded latency [14]. Wireless network technology has similarly progressed, but to date, wireless time-deterministic networks have not been developed. Mesh routing software would be needed. Today, several products are available that perform mesh routing at either the MAC (ZigBee) or TCP/IP level (Tropos). Mobile IP routing has been flown aboard satellites in flight tests.
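The bounded-latency property of time-triggered transmission can be shown with a toy slot schedule: each virtual link owns fixed slots in a repeating cycle, with idle gaps left for interleaving, so no packet ever waits longer than one cycle. The slot table, link names, and timing values are assumptions for illustration, not AFDX parameters.

```python
# Illustrative sketch of time-triggered packet release with bounded latency.
# Slot sizes and link names below are invented for illustration.

CYCLE_MS = 10   # one full schedule cycle, in 1 ms slots

# Static slot table: slot index -> owning virtual link
# (None = gap left free for interleaving other time-critical traffic).
SLOT_TABLE = {0: "nav", 1: None, 2: "thermal", 3: None, 4: "nav",
              5: None, 6: "power", 7: None, 8: "thermal", 9: None}

def next_transmit_time(link, now_ms):
    """Earliest time >= now_ms at which `link` may transmit."""
    for offset in range(CYCLE_MS):
        t = now_ms + offset
        if SLOT_TABLE[t % CYCLE_MS] == link:
            return t
    raise ValueError(f"link {link!r} has no slot")

def worst_case_wait(link):
    """Bounded latency: a packet never waits more than one full cycle."""
    return max(next_transmit_time(link, t) - t for t in range(CYCLE_MS))

print(worst_case_wait("power"))   # → 9 ms: one cycle minus the link's own slot
```

The key point is that the bound is a static property of the slot table, computable offline, which is what makes such networks certifiable for critical control.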

The biggest hurdle for implementation, and the biggest risk, is the development of the system management algorithms and software architectures needed for massively redundant peer-to-peer avionics systems. Although much of the critical control for MAA can be adapted from IMA, the larger number of possible redundancy solutions means that better means of maintaining coherency and failover would be needed. Model-based system management and diagnostics could be useful in this area. The system is reconfigured according to multiple constraints, synthesized from the current system state. Planning algorithms could be used to determine the optimal process for reconfiguring such complex redundant systems.
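A minimal sketch of constraint-driven reconfiguration: after a node failure, a new redundancy set for a function is chosen from the surviving nodes, subject to simple constraints (enough members, enough spare CPU). The node attributes and constraints are invented for illustration; a real system would search a much larger constraint space with a planner.

```python
# Hypothetical sketch of model-based reconfiguration after a node failure.
# Node attributes and the constraint set are assumptions for illustration.

from itertools import combinations

NODES = {
    "n1": {"healthy": True,  "spare_cpu": 0.6},
    "n2": {"healthy": False, "spare_cpu": 0.7},   # failed node
    "n3": {"healthy": True,  "spare_cpu": 0.3},
    "n4": {"healthy": True,  "spare_cpu": 0.5},
}

def reconfigure(nodes, redundancy=3, min_cpu=0.2):
    """Pick a redundancy set meeting all constraints, maximizing spare CPU."""
    healthy = [n for n, a in nodes.items() if a["healthy"]]
    feasible = [
        c for c in combinations(healthy, redundancy)
        if all(nodes[n]["spare_cpu"] >= min_cpu for n in c)
    ]
    if not feasible:
        return None   # no valid configuration: degrade or shed load
    return max(feasible, key=lambda c: sum(nodes[n]["spare_cpu"] for n in c))

print(reconfigure(NODES))   # → ('n1', 'n3', 'n4')
```

The exhaustive enumeration here is only tractable for toy sizes; the "larger number of possible redundancy solutions" noted above is exactly why planning algorithms rather than brute force would be needed.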

Power transmission using inductive coupling is simply a matter of producing the correct magnetic interfaces. Semiconductor technology providing the high-frequency driving current is used in every switching power supply, so it is not an issue for technology development. However, providing an alternative power source may be more difficult, in that radiative schemes for power transmission may be difficult to apply inside a spacecraft occupied by humans. Triple-redundant power distribution schemes are possible, where each module couples with more than one distribution circuit. Again, low-power electronics enables battery backup devices to be small and lowers the performance needed from wireless power transmission methods.
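A back-of-envelope calculation shows why high-frequency drive makes loosely coupled inductive power practical: the open-circuit voltage induced in the secondary coil is V = ωMI, with mutual inductance M = k√(L₁L₂). All component values below are assumptions chosen only to illustrate the scaling.

```python
# Back-of-envelope sketch of inductive power coupling. Coil values,
# coupling coefficient and drive frequency are illustrative assumptions.

import math

def induced_voltage(f_hz, k, L1, L2, i_peak):
    """Peak open-circuit secondary voltage for a sinusoidal primary current."""
    omega = 2 * math.pi * f_hz
    M = k * math.sqrt(L1 * L2)   # mutual inductance, henries
    return omega * M * i_peak

# e.g. 100 kHz drive, loose coupling (k = 0.3), 10 uH coils, 1 A peak
v = induced_voltage(100e3, 0.3, 10e-6, 10e-6, 1.0)
print(f"{v:.2f} V")   # ~1.88 V
```

Since V scales linearly with frequency, the switching-supply-class drive electronics mentioned above are exactly what compensates for the low coupling coefficient across an air gap or fabric layer.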

There are a number of building-block technologies needed for the virtual disseminated interfaces we are proposing. The various sensory display devices that physically present the information to the users are a real challenge. These are devices such as miniature flat-panel displays and audio interfaces for head-mounted systems. Recent advances in head-mounted display technology, benefiting from new OLED displays and redundant optical tracking systems, make possible the development of virtual cockpits based on body- and head-mounted displays that can present cockpit virtual images laid out within a user's 4π steradian field of regard. Such displays can replace the much heavier, power-hungry flat-panel displays of conventional control consoles with bright, high-resolution, full-color, lightweight displays that draw only on the order of a few watts.

Other types of sensors, such as miniature medical sensors, currently exist and can be adapted to this use. Array microphones can focus upon the speech of a crew member in a given known location. The corresponding audio output function can be implemented using surround-sound techniques, commonly used for home theater. Haptic sensors for interpreting motion and gestures are now available, but the feedback elements are lagging. Other types of user input devices are emerging and may be useful.
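The focusing behavior of an array microphone rests on the classic delay-and-sum idea: each microphone's signal is shifted by its extra propagation delay from the talker's known location, then the shifted signals are averaged, reinforcing sound from that direction. The geometry, sample rate, and integer-delay simplification below are assumptions for illustration.

```python
# Sketch of delay-and-sum beamforming for an array microphone focused on a
# talker at a known location. Geometry and sample rate are assumptions.

import math

SPEED_OF_SOUND = 343.0   # m/s
FS = 16000               # samples per second

def delays_samples(mic_positions, source):
    """Integer sample delays aligning each mic to the farthest one."""
    dists = [math.dist(m, source) for m in mic_positions]
    far = max(dists)
    return [round((far - d) / SPEED_OF_SOUND * FS) for d in dists]

def delay_and_sum(signals, delays):
    """Average the delayed signals (zero-padded at the start)."""
    n = max(len(s) + d for s, d in zip(signals, delays))
    out = [0.0] * n
    for s, d in zip(signals, delays):
        for i, x in enumerate(s):
            out[i + d] += x / len(signals)
    return out

# Two mics 0.5 m apart, talker 10 m away on-axis with the first mic.
print(delays_samples([(0, 0), (0.5, 0)], (10, 0)))   # → [0, 23]
```

Sound arriving from the focused location adds coherently after the shifts, while sound from elsewhere adds with mismatched phases and is attenuated, which is how the array isolates one crew member's speech.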

The main challenge for HSI is not the device development, but rather the information and sensor fusion required for providing a coherent, yet dynamic and flexible user interface. The interface and display-rendering software that translates all the vehicle sensor and state readings into human sensory information and then encodes human voice and action responses into commands the spacecraft can use

is particularly challenging. It is noteworthy that these translations may well be haptic and audible as well as visual.

A final building block for the disseminated user interface is the development of advanced packaging techniques for assembly of the HSI components into sufficiently small packages that may be unobtrusively introduced into crew quarters and crew clothing, so that their use requires little or no special setup before operation. Power sources and lightweight batteries would be needed as well, compatible with the spacecraft environment. Changing your shirt may also change your batteries for the wearable sensors.

Intelligent agents are a focus for many of the activities of the Intelligent Systems Division at NASA Ames. Several demonstrations of early prototypes have been performed, so this area is considered one for development, rather than just research.

7. CONCLUSIONS

Inflatable spacecraft represent a revolutionary way of creating viable living and operations volume for humans engaged in exploration missions, providing increased usable volume with reduced launch mass and volume. They can potentially increase resistance to micrometeorite and space radiation hazards over conventional aluminum structures, while being more flexible for reconfiguration from transit to habitat modes. The challenges of maintaining avionics operation during the dynamic change from stowed to deployed state during inflation, of providing feedthroughs between the inner and outer surfaces of the pressure vessel, and of monitoring the health of the inflatable fabric itself can be met by applying innovations in the areas of avionics architecture, intelligent sensors, human-system interfaces and spacecraft information interaction, currently in the research and development phase. These innovations leverage each other's capabilities to provide new concepts for the operation of long-duration human spacecraft.

MAA is based on a massively redundant peer-to-peer construct, providing much higher fault tolerance and higher computational performance. Intelligent sensors have the potential for dramatically reducing sensor failures and inaccuracies, again by employing many redundant elements with internal cross-checking of function and accuracy. MAA directly supports roaming mobile HSI, and its excess computational power can be used for hosting intelligent agent technology for information management and access. The use of wireless networks and wireless power distribution directly supports inflatable spacecraft by eliminating inflexible cables and feedthroughs across the pressure vessel.

An innovative operations concept for transit vehicle piloting, operation and maintenance was outlined, incorporating these technologies and providing virtual cockpits for vehicle operation, augmented reality for complex support

activities and unobtrusive interfaces for common activities. Design guidelines for constructing such HSI systems were presented, consistent with current human factors practice. The use of intelligent agents nicely complements advanced HSI, allowing natural interaction between crew and spacecraft systems and payloads, effectively increasing crew size.

Technology for implementing proof-of-concept prototypes already exists. Significant improvements in performance, power consumption, size, mass and cost are required for all elements, but are anticipated to be achieved using modern semiconductor advances. Packaging and hardening for the space environment will remain serious challenges.

Critical advances in software architecture and algorithms will be needed to integrate these widely dispersed assets into a coherent system, capable of maintaining function despite failures and changing mission conditions. In many ways, the system architecture described is a good host for developing and testing many advanced software concepts. Agent-based design, in general, may help coordinate many of the loosely coupled modules and functions identified.

Successful implementation of this architecture and its complementary features can lead to more robust inflatable spacecraft, increasing the performance and capabilities of the spacecraft and its crew during the conduct of exploration missions to the Moon and Mars.

REFERENCES

[1] http://www.bigelowaerospace.com/out_there/

[2] http://spaceflight.nasa.gov/history/station/transhab

[3] "Avionics Application Software Standard Interface," ARINC 653 Standard, 2006.

[4] IEEE Std 1451.1-1999, IEEE Standard for a Smart Transducer Interface for Sensors and Actuators – Network Capable Application Processor Information Model, Instrumentation and Measurement Society, TC-9, Institute of Electrical and Electronics Engineers, New York, NY 10016-5997, SH94767, April 18, 2000.

[5] IEEE Std 1451.2-1997, IEEE Standard for a Smart Transducer Interface for Sensors and Actuators – Transducer to Microprocessor Communication Protocols and Transducer Electronic Data Sheet (TEDS) Formats, Instrumentation and Measurement Society, TC-9, Institute of Electrical and Electronics Engineers, New York, NY 10016-5997, SH94566, September 25, 1998.

[6] Fernando Figueroa, Randy Holland, John Schmalzel, Dan Duncavage, Rick Alena, Alan Crocker, "ISHM Implementation for Constellation Systems," AIAA 2006-4410, 42nd AIAA/ASME/SAE/ASEE Joint Propulsion Conference & Exhibit, July 9-12, 2006, Sacramento Convention Center, Sacramento, CA.

[7] Hieronymus, J. and Dowding, J., "Clarissa Spoken Dialogue System," Acta Astronautica, 2006.

[8] Fisher, S. S., McGreevy, M., Humphries, J. and Robinett, W. (1986). Virtual Environment Display System. ACM 1986 Workshop on 3D Interactive Graphics, Chapel Hill, North Carolina, 23-24 October 1986. ACM. pp. 77-87.

[9] Brooks, F. P. Jr. (1999) What's real about virtual reality? IEEE Computer Graphics and Applications, 19, 16-27.

[10] Norman, Donald A. (2007) The Design of Future Things, Basic Books, New York.

[11] Clancey, W. J., Sierhuis, M., Alena, R., Crawford, S., Dowding, J., Graham, J., Kaskiris, C., Tyree, K. S., and van Hoof, R. (2004) Mobile Agents: A Distributed Voice-Commanded Sensory and Robotic System for Surface EVA Assistance, in R. B. Malla and A. Maji (eds), Engineering, Construction, and Operations in Challenging Environments: Earth and Space 2004, Houston: ASCE. pp. 85-92.

[12] Alena, R., Ossenfort, J., Lee, C., Walker, E., Stone, T., "Design of Hybrid Mobile Communication Networks for Planetary Exploration," IEEE Aerospace Conference 2004, Big Sky, MT.

[13] Richard L. Alena, John P. Ossenfort IV, Kenneth I. Laws, Andre Goforth, "Communications for Integrated Modular Avionics," IEEE Aerospace Conference 2007, Big Sky, MT.

[14] "Aircraft Data Network," ARINC 664 Standard Document, 2002.

BIOGRAPHY

Richard L. Alena is a Computer Engineer in the Intelligent Systems Division at NASA Ames. Mr. Alena is currently working on the Mission Operations System for the LCROSS Lunar mission and on avionics architecture for a lander vehicle for human lunar missions. He was the co-lead for the Advanced Diagnostic

Systems for International Space Station (ISS) Project, developing model-based diagnostic tools for space operations. He was the chief architect of a flight experiment conducted aboard Shuttle and Mir using laptop computers, personal digital assistants and servers in a wireless network for the ISS. He was also the technical lead for the Databus Analysis Tool for International Space Station on-orbit diagnosis. He was group lead for Intelligent Mobile Technologies, developing planetary exploration systems for field simulations. Mr. Alena holds an M.S. in Electrical Engineering and Computer Science from the University of California, Berkeley. He won the NASA Silver Snoopy Award in 2002, a NASA Group Achievement Award in 1998 for his work on the ISS Phase 1 Program Team and a Space Flight Awareness Award in 1997.

Stephen R. Ellis headed the Advanced Displays and Spatial Perception Laboratory at the NASA Ames Research Center between September 1989 and March 2006 and is currently a member of this group. He received a Ph.D. (1974) from McGill University in Psychology after receiving an A.B. in Behavioral Science from U.C. Berkeley. He has had

postdoctoral fellowships in Physiological Optics at Brown University and at U.C. Berkeley. He has published on the topic of presentation and user interaction with spatial information in 170 journal publications and formal reports and has been at the forefront of the introduction of perspective and 3D displays into aerospace user interfaces. In particular, he has worked recently on kinesthetic techniques to improve cursor and manipulator control under difficult display control coordinate mappings. He has served on the editorial boards of Presence and Human Factors and

has edited a book, Pictorial Communication in Virtual and Real Environments (Taylor and Francis, London, 2nd Ed. 1993), concerning the geometric and dynamic aspects of human interfaces to systems using spatial data. He was awarded a University Medal from Kyushu Sangyo University in Japan in 1992.

Dougal Maclise is currently the group lead for the ISHM Technology Maturation Group within the Intelligent Systems Division at Ames Research Center. He has a Bachelor's degree in Mechanical Engineering and a Master's degree in Biomedical Engineering. Over the fifteen years that he

has worked at NASA, he has managed a wide variety of projects, such as animal physiology experiment payloads for the Shuttle, the consolidation of seven resource tracking databases, a real-time imaging payload for the Solar-Powered Pathfinder UAV and the development of the Advanced Diagnostics and Prognostics Testbed. For the last five years he has focused on the systems engineering and application aspects of Integrated Systems Health Management while working on the Second Generation Reusable Launch Vehicle, Orbital Space Plane and Next Generation Launch Technology projects. He is currently the Information Systems lead for the Lunar Lander Project.

James Hieronymus is a USRA Senior Scientist in the Collaborative and Assistive Systems (CAS) Technical Area within the Intelligent Systems Division at NASA Ames. He was technical lead for the NASA team which developed the Clarissa spoken dialogue system that talks astronauts

through executing complex procedures, which has been installed for testing on the International Space Station. He worked with John Dowding on the EVITA (EVA Intelligent Talking Assistants) spoken dialogue system for lunar space suits. The EVITA system forms a front end for the Mobile Agents system, which keeps track of all of the assets on a lunar exploration mission, assists in scheduling, navigation and sample gathering, and commands robots. His Ph.D. is from Cornell University, his master's degree from UC San Diego, and his B.S. from the University of Michigan. He was a postdoc at the Stanford AI Lab working on automatic speech recognition. He helped start the speech laboratory at NIST, was a linguistics professor at the University of Edinburgh in the Centre for Speech Technology Research, a Member of Technical Staff in the Linguistics and Spoken Dialogue Systems Departments at Bell Labs, Murray Hill, and a Visiting Professor at the University of Gothenburg. He joined NASA Ames in late 2000.
