
Mixed Reality in the Loop - Design Process for Interactive Mechatronical Systems

Jörg Stöcklein∗

University of Paderborn

Christian Geiger†

Duesseldorf University of Applied Sciences

Volker Paelke‡

Leibniz University Hannover

ABSTRACT

Mixed reality techniques have high potential to support the development of complex systems that operate in a real world environment, especially mechatronic systems. In our paper we present the Mixed Reality in the Loop design process that enables a seamless progression from an initial virtual prototype to the final system along the mixed reality continuum.

Index Terms: D.2.1 [Requirements/Specifications]: Elicitation methods—Prototyping; H.5.1 [Multimedia Information Systems]: Artificial, augmented, and virtual realities—Mixed Reality Environment

1 INTRODUCTION AND MOTIVATION

Mechatronic systems pose major challenges during development: a significant portion of the system's functionality is no longer represented by obvious hardware but consists of sensors, processors and actuators whose operation is controlled by software. While this is desirable to achieve high flexibility in operation, it also results in debugging problems, as large parts of the system state are invisible and no longer accessible by traditional methods. To support effective development and debugging of such systems it is necessary to develop user interfaces that provide an overview of the complete system state and enable detailed analysis of ongoing operations. Mixed-reality user interfaces have high potential to support this task by providing seamless means of visualization and interaction for both the mechanical and software parts of the system.

In the use phase similar challenges arise: during operation of the system, users need adequate interfaces to supervise and control it. Again the use of mixed-reality user interfaces has high potential to provide intuitive means of visualization and interaction that cover both the hardware and the software aspects of the system.

The mixed reality in the loop design process addresses these problems by iteratively developing a mixed-reality user interface in close coordination with the mechatronic system. The key to integrating mixed reality into the development is that elements of the initial virtual prototype are successively replaced by the real hardware and software components. This progression results in various combinations of virtual and real elements, starting with predominantly virtual elements that are relatively easy and quick to develop and replacing them with progressively more refined and realistic components during the process.

2 RELATED WORK

Our 'mixed reality in the loop' design process builds on existing concepts from iterative development, virtual prototyping [7][5] and hardware in the loop simulations [8]. The implementation is closely related to existing work in user interface development in general, and to the development of mixed-reality user interfaces in particular.

∗e-mail: [email protected]
†e-mail: [email protected]
‡e-mail: [email protected]

Regarding system structure, the most prominent development model for graphical user interface software is the MVC (model-view-controller) paradigm, originally defined for the Smalltalk-80 programming language [1]. Ishii extended the MVC model to describe tangible user interfaces [4]. He added a tangible/real representation to the digital view that serves as a direct control mechanism. Real objects act as physical representations of interface elements and allow the use of real and virtual elements in tangible user interfaces. In our approach we use a similar extension of the MVC model that incorporates the external (real world) environment as an additional component. A widely used concept to capture and describe the combination of real and virtual elements is the mixed-reality continuum introduced by Milgram [6]. Milgram defines a continuum of arbitrary combinations between real and virtual elements that ranges from purely real environments at one end to purely virtual ones at the other, with combinations like augmented reality and augmented virtuality in between. In our approach we use this continuum as an additional dimension for the MVC model.

3 ITERATIVE DESIGN AND THE MVCE MODEL PATTERN

The goal of many mixed-reality applications is the development of better man-machine interfaces. Therefore, an iterative design process is useful. However, most design processes assume that the underlying technology is well defined and stable, a condition that is often violated in the development of MR systems, which use emerging technologies that are still in early experimental stages. Changes in the underlying technologies can be problematic if a design process does not anticipate such volatility and therefore provides no means to handle it in a systematic and structured way. Most of the time, mixed reality applications are designed in an unstructured way using a "trial-and-error" prototyping approach. This leads to an overhead in programming, since parts of the application cannot be reused and must be reimplemented for each prototype. Against this background we have developed a structured iterative design process which can be used for developing mixed reality applications. The central part of this design process is the structuring into (M)odel, (V)iew, (C)ontroller and (E)nvironment components.

3.1 The Design Process

One central feature of mixed-reality user interfaces is the integration with a real environment. The application requires information about objects and spaces whose geometry and behavior are not under the control of the designer but must be acquired from the real environment. Real objects can be subject to real-world manipulation (e.g. in a maintenance task) or external forces. Therefore, it must be possible to track state changes in the environment. In practice the real world model of a mixed reality application often consists of a combination of static information (e.g. geometry of the environment that is assumed to be fixed) and dynamic information (e.g. position and orientation information for the user and central objects) that is acquired by sensors at runtime. While sensor information could be handled as controller events in the MVC model, this can lead to complex and obscure models. We have therefore introduced an additional environment (E) component that captures the real world model of the application (see Figure 1). A perfect real world model would contain all information about the real environment at the time of query. In practice both the amount of information required by the application and the amount actually accessible through sensors is limited. Both the model (M) and the view (V) can query the environment (E). This makes it possible to capture spatial association (e.g. the common augmented reality scenario in which a view object is fixed to a physical location or object) as well as control relations (e.g. objects in the real environment influenced by the application).

Figure 1: MVCE model. (Diagram of the Model (M), View (V), Controller (C) and Environment (E) components with the interactions user action, changeData(), change notification, changeDisplay(), queryData() and setControl(); direct and indirect associations are distinguished.)
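
To make the interactions in Figure 1 concrete, the following minimal Python sketch shows one possible reading of the MVCE pattern. It is an illustration only: apart from queryData(), changeData(), changeDisplay() and setControl(), the class and attribute names are assumptions and are not taken from the authors' implementation.

```python
# Minimal MVCE sketch (illustrative; not the paper's implementation).

class Environment:
    """E: real-world model = static data plus sensor-acquired dynamic data."""
    def __init__(self, static_geometry, sensors):
        self.static_geometry = static_geometry  # assumed fixed (e.g. room layout)
        self.sensors = sensors                  # callables returning current readings

    def queryData(self, key):
        # Dynamic information (e.g. tracked poses) takes precedence over static data.
        if key in self.sensors:
            return self.sensors[key]()
        return self.static_geometry.get(key)

    def setControl(self, actuator, value):
        # Control relation: the application influences objects in the real environment.
        print(f"actuating {actuator} -> {value}")


class Model:
    """M: application state; may query E and notifies views of changes."""
    def __init__(self, environment):
        self.environment = environment
        self.state = {}
        self.observers = []

    def changeData(self, key, value):
        self.state[key] = value
        for view in self.observers:    # change notification to the view(s)
            view.changeDisplay()

    def queryData(self, key):
        return self.state.get(key, self.environment.queryData(key))


class View:
    """V: visualization; queries M and E (e.g. to anchor graphics to a tracked pose)."""
    def __init__(self, model, environment):
        self.model, self.environment = model, environment
        model.observers.append(self)

    def changeDisplay(self):
        pose = self.environment.queryData("user_pose")
        print("render", self.model.state, "at", pose)


class Controller:
    """C: maps user actions onto changeData() calls."""
    def __init__(self, model):
        self.model = model

    def user_action(self, key, value):
        self.model.changeData(key, value)


# Usage: a tracked pose as dynamic environment data, a fixed room as static data.
env = Environment({"room": "lab"}, {"user_pose": lambda: (0.0, 1.7, 2.0)})
model = Model(env)
view = View(model, env)
Controller(model).user_action("uav_target", (1.0, 2.0, 0.5))
env.setControl("rotor_speed", 0.4)
```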

Using the MVCE structure, components can be refined independently. The current development state of a prototype can be characterized by the amount of complexity/realism of each component, as visualized in Figure 2a), where the center indicates the most abstract representation and movement along the MVCE axes represents increasing refinement/realism of the corresponding components. One key benefit is the possibility to develop a user interface along the mixed-reality continuum, starting with a virtual environment in which the environment (E) is represented by a model. Testing mixed reality interfaces in a virtual environment makes it possible to focus on interaction mechanisms and can provide controlled conditions for tests, while avoiding limitations of mixed reality technologies (e.g. tracking systems) that are often present in early development stages. Refinement of the E component ranges from more refined models to real-time data acquisition in the real environment. Arbitrary combinations of components are possible; e.g. it is sometimes useful to combine refined MVC components with a simple E model for tests or demonstrations in later development stages.
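
As a small illustration of characterizing a prototype by its position on the MVCE axes (the numeric levels and helper function below are assumptions, not part of the paper), the development state could be recorded as one refinement level per component:

```python
# Illustrative only: record how far each MVCE component has been refined.
# Level 0 = most abstract/virtual; higher levels = more realism/complexity.
prototype_state = {"M": 1, "V": 2, "C": 0, "E": 0}

def refine(state, axis):
    """One iteration step: refinement along a single MVCE axis (cf. Figure 2b)."""
    state = dict(state)
    state[axis] += 1
    return state

# e.g. replace the simulated environment by sensor data from the real one
next_prototype = refine(prototype_state, "E")
print(next_prototype)   # {'M': 1, 'V': 2, 'C': 0, 'E': 1}
```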

Figure 2b) illustrates our design process. We define each MVCE component as an 'actor'. An actor can be anything from a model or a visual representation to a controller. An actor owns a freely configurable set of input and output ports, which are used to receive information or to publish its own information.

Development proceeds as follows: at the beginning of the development process an initial set of actors is identified. For each actor in this set the inputs and outputs are defined. If the development targets a complex mechatronic system, each relevant system component (either hardware or software) is initially represented by an actor. Additional actors represent the elements of the user interface to the system or visualizations of system state. This initial set of actors can later be extended. It should include all actors needed for a first prototype that provides a rough approximation of the intended system. The first prototype is then composed from these actors, connecting the information flow between the ports as required by the application. For the technical implementation we use our mixed reality environment described in [2].
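
The following sketch illustrates the actor/port idea in Python; the Actor class, the connect() helper and the actor names are hypothetical stand-ins for illustration and are not the mixed reality environment of [2]:

```python
# Illustrative actor/port sketch; not the framework from [2].

class Actor:
    """An MVCE component with freely configurable input and output ports."""
    def __init__(self, name, inputs=(), outputs=()):
        self.name = name
        self.inputs = {p: None for p in inputs}   # last value received per input port
        self.outputs = {p: [] for p in outputs}   # subscribers per output port

    def publish(self, port, value):
        for actor, in_port in self.outputs[port]:
            actor.inputs[in_port] = value

def connect(src, out_port, dst, in_port):
    """Data-flow connection between an output port and an input port."""
    src.outputs[out_port].append((dst, in_port))

# Initial set of actors for a rough first prototype of a UAV-like system.
flight_model = Actor("flight_model", inputs=["thrust"], outputs=["pose"])
hud_view     = Actor("hud_view",     inputs=["pose"])
joystick     = Actor("joystick",     outputs=["thrust"])

connect(joystick, "thrust", flight_model, "thrust")
connect(flight_model, "pose", hud_view, "pose")

joystick.publish("thrust", 0.6)
flight_model.publish("pose", (0.0, 0.0, 1.2))
print(hud_view.inputs)   # {'pose': (0.0, 0.0, 1.2)}
```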

Typically, the first prototype consists of an environment in which all actors are implemented purely in software. The resulting system would therefore be a purely virtual environment. The key benefit is that elements in a virtual environment are faster and cheaper to develop. Depending on the development goals and priorities, actors are selected for refinement.

Figure 2: a) MVCE diagram (axes for the Model (M), View (V), Controller (C) and Environment (E) components, with realism/complexity increasing outward from the center). b) Design process (an initialisation phase followed by iteration phases in which a refinement phase refines along one MVCE axis and actors are exchanged, added, split or removed, turning prototype i-1 into prototype i).

Refinement means that either the behavior or the visual representation of an actor is updated (e.g. the 3D model), or an actor is replaced by another version of the same actor (e.g. game physics vs. Matlab/Simulink). If the actor is concerned with simulating real-world elements, typical refinements would be the replacement of an approximative simulation with a more realistic one, or the replacement of a simulation component with its real-world counterpart. This approach makes it possible to move from purely virtual environments to hardware in the loop (or, more precisely, reality in the loop) systems in a structured way. If an actor is concerned with the implementation of interaction or visualization techniques, the replacement could either be a complete exchange of the component (e.g. to compare alternative approaches to system control) or a stepwise refinement, in which a user interface is refined according to user feedback, as in established iterative user-centered design processes. As the development progresses from an initial basic prototype to more complex systems it can also be necessary to adjust the number of actors, e.g. by splitting the functionality of an actor into two or more actors, or by adding or removing actors. In all cases the data flow connections between the actors must be checked and corrected accordingly.
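
As an illustration of the exchange step (all class and function names below are hypothetical; the paper does not prescribe an API), two versions of the same actor keep identical ports, so the data-flow connections of the previous prototype remain valid when one version is swapped for another:

```python
# Hypothetical sketch of exchanging an actor against a more refined version.
# All versions keep the same ports, so existing connections stay valid.

class SimplePhysicsActor:
    """Early prototype: rough game-physics approximation of the UAV dynamics."""
    inputs, outputs = ("thrust",), ("pose",)
    def step(self, thrust):
        return (0.0, 0.0, thrust * 2.0)   # crude kinematics

class SimulinkPhysicsActor:
    """Refined version: would delegate to a detailed Matlab/Simulink simulation."""
    inputs, outputs = ("thrust",), ("pose",)
    def step(self, thrust):
        raise NotImplementedError("coupling to the external simulation not shown here")

class RealUavActor:
    """'Reality in the loop': port values would come from the actual hardware."""
    inputs, outputs = ("thrust",), ("pose",)
    def step(self, thrust):
        raise NotImplementedError("hardware and tracking interfaces not shown here")

def build_prototype(physics_actor_cls):
    physics = physics_actor_cls()
    # The ports are unchanged, so the surrounding data-flow connections can be reused.
    assert physics.inputs == ("thrust",) and physics.outputs == ("pose",)
    return physics

# Iterations refine along one axis at a time: virtual -> refined simulation -> hardware.
prototype_1 = build_prototype(SimplePhysicsActor)
# prototype_2 = build_prototype(SimulinkPhysicsActor)
# prototype_3 = build_prototype(RealUavActor)
print(prototype_1.step(0.5))   # (0.0, 0.0, 1.0)
```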

4 CONCLUSION

The process described in this paper was used by an interdisciplinary development team to develop a (semi-)autonomous micro UAV (unmanned aerial vehicle) along with a corresponding MR control and supervision interface [3]. Four distinct system options were developed and evaluated, each consisting of several prototype incarnations. Due to the lack of a parallel development using other processes (which is infeasible due to resource constraints) we cannot pinpoint the benefit of the process, but the development engineers involved commented favorably on the freedom of development options afforded by the MVCE process and saw it as a valuable way to structure the development of such systems.

REFERENCES

[1] S. Burbeck. Applications Programming in Smalltalk-80: How to Use Model-View-Controller (MVC). math.rsu.ru, January 1992.

[2] C. Geiger, P. Pogscheba, J. Stöcklein, H. Haehnel, and M. Berntssen. Modellbasierter Entwurf von Mixed Reality-Interaktionstechniken für ein Indoor-Zeppelin [Model-based design of mixed reality interaction techniques for an indoor zeppelin]. Augmented & Virtual Reality in der Produktentstehung, 2009.

[3] C. Geiger, J. Stöcklein, J. Berssenbrügge, and V. Paelke. Mixed reality design of control strategies. ASME 2009 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference, 2009.

[4] H. Ishii. The tangible user interface and its evolution. Communications of the ACM, 51(6):32–36, January 2008.

[5] B. Krassi. Dynamic Virtual Prototyping for Control Engineering. VDM Verlag, October 2008.

[6] P. Milgram and F. Kishino. A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, E77-D(12), January 1994.

[7] J. Rix, S. Haas, and J. Teixeira. Virtual Prototyping: Virtual Environments and the Product Design Process. Chapman & Hall, 1995.

[8] M. Schlager. Hardware-in-the-Loop Simulation: A Scalable, Component-based, Time-triggered Hardware-in-the-Loop Simulation Framework. VDM Verlag Dr. Müller, April 2008.
