AR community meeting, Seoul, Korea, October 6, 2015

A Standard for Augmented Reality Learning Experience Models (AR-LEM)
Fridolin Wild 1), Christine Perey 2), Paul Lefrere 3)
1) The Open University, UK; 2) Perey Research and Consulting, CH; 3) CCA, UK


TRANSCRIPT

Page 1: AR community meeting, Seoul, Korea, October 6, 2015

A Standard for Augmented Reality Learning Experience Models (AR-LEM)

Fridolin Wild 1), Christine Perey 2), Paul Lefrere 3)

1) The Open University, UK; 2) Perey Research and Consulting, CH; 3) CCA, UK

Page 2

The traditional route to knowledge.

Photo: Simon Q (flickr)

The Codrington Library, All Souls College, Oxford University

Page 3

Its practical application.

[Figure: workplace knowledge areas: materials (e.g. yarn 76/2 710); machines (machine parts, materials); occupational health and safety; set-up parameters]

Page 4

Experience.


Page 5

Mend the dissociative gap.

Photo: Marco Leo (flickr)

Page 6

Embedding knowledge into experience


Page 7

Creating Interoperability for AR learning experiences


Page 8

The Cost of Integration

Studies show:

– 30% of the time in software development projects is spent on interface design and implementation (Schwinn & Winter, 2005)

– 35% to 60% of the IT budget is spent on development and maintenance of interfaces (Ruh et al., 2001)

– Rising heterogeneity and integration demand (Klesse et al., 2005)

Page 9

Status Quo in Learning Technology

Plethora of (standard) software: C4LPT lists over 2,000 tools

Existing learning object / activity standards lack reality support

Multi-device orchestration (think wearables!)

=> enterprises and institutions face interoperability problems

Page 10

Interoperability

is a property that emerges when distinctive information systems (subsystems) cooperatively exchange data in such a way that they facilitate the successful accomplishment of an overarching task.

Wild & Sobernig (2005)

Page 11

Dissociating Interoperability

(modified from Kosanke, 2005)

Page 12

ARLEM conceptual model


Page 13

World Knowledge


Activity Knowledge

http://bit.ly/arlem-input

Page 14


The Activity Model

– “find the spray gun nozzle size 13”

– Messaging in the real-time presence channel and tracking to xAPI

– onEnter/onExit chaining of actions and other activations/deactivations

– Styling (cascading) of viewports and UI elements

– Constraint modeling: specify validation conditions and model workflow branching (e.g. smart player; e.g. search widget)

http://bit.ly/arlem-input
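The activity model above mentions tracking user actions to xAPI. The sketch below shows, in Python, the kind of minimal xAPI statement an ARLEM player might emit when a learner completes an action step. The activity URI scheme, actor address, and helper name are illustrative assumptions, not part of the ARLEM draft; only the statement shape (actor/verb/object) and the ADL "completed" verb URI follow the xAPI convention.

```python
import json

def make_xapi_statement(actor_email, action_id):
    """Build a minimal xAPI statement for completing an ARLEM action step.

    The activity URI scheme (example.org) is illustrative; the verb URI is
    the standard ADL 'completed' verb.
    """
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": "http://adlnet.gov/expapi/verbs/completed",
                 "display": {"en-US": "completed"}},
        "object": {"objectType": "Activity",
                   "id": f"http://example.org/arlem/action/{action_id}"},
    }

# Statement for a learner finishing the 'start' step of the assembly activity:
stmt = make_xapi_statement("worker@example.org", "start")
print(json.dumps(stmt, indent=2))
```

A player would send such statements to a Learning Record Store; here they are only constructed and serialised.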

Page 15


The Workplace Model

– The ‘tangibles’: specific persons, places, things

– The ‘configurables’: devices (styling), apps + widgets

– The ‘triggers’: markers trigger overlays; overlays trigger human action

– Overlay ‘primitives’: enable re-use of e.g. graphical overlays

http://bit.ly/arlem-input

Page 16

Action steps

<action id="start" viewport="actions" type="actions"></action>

Page 17

Instructions for action

<instruction><![CDATA[ <h1>Assembly of a simple cabinet</h1> <p>Point to the cabinet to start…</p>]]></instruction>

Page 18

Defining flow: Entry, Exit, Trigger

<enter removeSelf="false"></enter>
<exit>
  <activate type="actions" viewport="actions" id="step2"/>
  <deactivate type="actions" viewport="actions" id="start"/>
</exit>
<triggers>
  <trigger type="click" viewport="actions" id="start"/>
</triggers>

Nothing (for now)

On exit: launch step2

On exit: remove dialogue box ‘start’

This action step shall be exited by ‘clicking’ on the dialogue box
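The onExit activate/deactivate chaining described above can be sketched as a small state machine. This is not the ARLEM player itself; the class and method names are illustrative assumptions.

```python
class ActivityPlayer:
    """Toy sketch of enter/exit chaining: a set of active action steps,
    updated by running a step's exit block when its trigger fires."""

    def __init__(self, actions, start):
        # actions: id -> {"exit_activate": [...], "exit_deactivate": [...]}
        self.actions = actions
        self.active = {start}

    def click(self, action_id):
        """Simulate the 'click' trigger: exit the step, chaining activations."""
        spec = self.actions[action_id]
        for nxt in spec.get("exit_activate", []):
            self.active.add(nxt)       # e.g. launch step2
        for gone in spec.get("exit_deactivate", []):
            self.active.discard(gone)  # e.g. remove dialogue box 'start'

# The flow from the XML above: exiting 'start' activates 'step2',
# deactivates 'start'.
player = ActivityPlayer(
    {"start": {"exit_activate": ["step2"], "exit_deactivate": ["start"]}},
    start="start",
)
player.click("start")
print(player.active)   # → {'step2'}
```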

Page 19

Sample script

<activity id="assembly" name="Assembly of cabinet" language="english" workplace="http://crunch.kmi.open.ac.uk/people/~jmartin/data/workplace-AIDIMA.xml" start="start">

  <action id="start" viewport="actions" type="actions">
    <enter removeSelf="false"></enter>
    <exit>
      <activate type="actions" viewport="actions" id="step2"/>
      <deactivate type="actions" viewport="actions" id="start"/>
    </exit>
    <triggers>
      <trigger type="click" viewport="actions" id="start"/>
    </triggers>
    <instruction><![CDATA[<h1>Assembly of a simple cabinet</h1><p>Point to the cabinet to start ... </p>]]></instruction>
  </action>

  <action id="step2" viewport="actions" type="actions">
    <enter></enter>
    <exit removeSelf="true"></exit>
    <triggers>
      <trigger type="click" viewport="actions" id="step1"/>
    </triggers>
    <instruction><![CDATA[<h1>step2</h1><p>do this and that.</p>]]></instruction>
  </action>

</activity>
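Because the script is plain XML, it is machine-readable with any standard parser. The sketch below parses a trimmed, quote-normalised version of the sample script with Python's standard library and extracts the action identifiers; the trimming (dropping the workplace URL and instructions) is only to keep the example short.

```python
import xml.etree.ElementTree as ET

# Trimmed version of the sample activity script, with quoting normalised
# so it is well-formed XML.
SCRIPT = """\
<activity id="assembly" name="Assembly of cabinet" start="start">
  <action id="start" viewport="actions" type="actions">
    <exit>
      <activate type="actions" viewport="actions" id="step2"/>
      <deactivate type="actions" viewport="actions" id="start"/>
    </exit>
    <triggers>
      <trigger type="click" viewport="actions" id="start"/>
    </triggers>
  </action>
  <action id="step2" viewport="actions" type="actions"/>
</activity>
"""

root = ET.fromstring(SCRIPT)
action_ids = [a.get("id") for a in root.findall("action")]
print(root.get("start"), action_ids)   # → start ['start', 'step2']
```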

Page 20

Working with ‘tangibles’

Utilise computer vision engine to detect things/places/people (=tangibles)

Define tangibles in the workplace model

Then activate (or deactivate) what shall be visible and relevant in each action step

Page 21

In the workplace model

We open the workplace model and define a new thing (under resources/tangibles/things):

<thing id="board1" name="Cabinet" urn="/tellme/object/cabinet1" detectable="001">
  <pois>
    <poi id="leftside" x-offset="-0.5" y-offset="0" z-offset="0.1"/>
    <poi id="default" x-offset="0" y-offset="0" z-offset="0"/>
  </pois>
</thing>

The id is what we will reference

The detectable specifies which marker (or sensor state) will be bound to the thing

Poi = point of interest: specify locations relative to the centre of the marker (x=y=z=0: centre)
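Resolving a POI against its marker can be sketched as a simple offset from the marker centre. This sketch ignores marker rotation, which a real tracking engine would apply; the function name is an illustrative assumption.

```python
def poi_position(marker_centre, poi_offset):
    """Resolve a POI's position by offsetting from the marker centre.

    Offsets are taken in the marker's own axes with no rotation applied,
    a simplification of what a real computer vision engine computes.
    """
    return tuple(c + o for c, o in zip(marker_centre, poi_offset))

# The 'leftside' POI from the workplace model above (x=-0.5, y=0, z=0.1),
# for a marker whose centre was tracked at (1.0, 2.0, 0.0):
print(poi_position((1.0, 2.0, 0.0), (-0.5, 0.0, 0.1)))   # → (0.5, 2.0, 0.1)
```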

Page 22

Triggers and tangibles

If you add a tangible trigger (for ‘stareGaze navigation’), a target icon will be overlaid, rotating in yellow and turning green when the stare duration (3 secs) has been reached

<trigger type="detect" id="board1" duration="3"/>
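The stare-dwell behaviour described above can be sketched as a per-frame timer. The 3-second threshold matches the trigger's duration attribute; the class and method names (and the icon feedback, omitted here) are illustrative assumptions about the player.

```python
class GazeDwell:
    """Track how long a tangible has been gazed at; fire after `duration` s.

    Sketch of the stareGaze trigger: the player would show a yellow rotating
    icon while dwelling and turn it green when this returns True.
    """

    def __init__(self, duration=3.0):
        self.duration = duration
        self.start = None   # timestamp when the gaze began, or None

    def update(self, gazing, now):
        """Call once per frame; returns True once dwell time is reached."""
        if not gazing:
            self.start = None   # gaze broken: reset the timer
            return False
        if self.start is None:
            self.start = now
        return now - self.start >= self.duration

dwell = GazeDwell(duration=3.0)
assert not dwell.update(True, 0.0)   # just started staring
assert not dwell.update(True, 2.9)   # not long enough yet
print(dwell.update(True, 3.0))       # → True
```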

Page 23

Markers and pre-trained markers

Marker must be defined in the workplace model

Possible to provide pre-trained markers (and their PDF file to print): named, e.g., 001 to 050

Markers shall be specified via their id in the computer vision engine (under resources/triggers/detectables):

<detectable id="001" sensor="engine" type="marker"/>

<detectable id="myid" sensor="engine" type="image_target" url="myurl.org/marker.zip"/>

Page 24

Activates and deactivates

Now we have defined a thing called ‘board1’ and we have tied it to the marker 001

We can start referring to it now from the activity script: we can, e.g., activate pictogram overlays for the verbs of handling and motion

<activate tangible="board1" predicate="point" poi="leftside" option="down"/>

Page 25

<activity id="assembly" name="Assembly of cabinet" language="english" workplace="http://crunch.kmi.open.ac.uk/people/~jmartin/data/workplace-AIDIMA.xml" start="start">

  <action id="start" viewport="actions" type="actions">
    <enter removeSelf="false">
      <activate tangible="board1" predicate="point" poi="leftside" option="down"/>
      <activate tangible="board1" predicate="addlabel" poi="default" option="touchme"/>
    </enter>
    <exit>
      <deactivate tangible="board1" predicate="point" poi="leftside"/>
      <deactivate tangible="board1" predicate="addlabel" poi="default"/>
      <activate type="actions" viewport="actions" id="step2"/>
      <deactivate type="actions" viewport="actions" id="start"/>
    </exit>
    <triggers>
      <trigger type="click" viewport="actions" id="start"/>
    </triggers>
    <instruction><![CDATA[<h1>Assembly of a simple cabinet</h1><p>Point to the cabinet to start ... </p>]]></instruction>
  </action>

  <action id="step2" viewport="actions" type="actions">
    <enter></enter>
    <exit removeSelf="true"></exit>
    <triggers>
      <trigger type="click" viewport="actions" id="step1"/>
    </triggers>
    <instruction><![CDATA[<h1>step2</h1><p>do this and that.</p>]]></instruction>
  </action>

</activity>

Display an arrow pointing downwards on the point of interest ‘leftside’

Display a label ‘touchme’ at the centre of the marker

Remove both visual overlays when this action step is exited

Page 26

Non-normed overlays

<activate tangible="board1" predicate="add3dmodel" poi="leftside" option="augmentation"/>

<augmentations>
  <augmentation id="cube" scale="1" y_angle="180.0" url="http://myurl.org/cube.unity3d"/>
</augmentations>

<activate tangible="board1" predicate="addvideo" option="http://myurl.org/myvideo.mp4"/>

<activate tangible="board1" predicate="addimage" option="http://myurl.org/myvideo.png"/>

Page 27

Normed overlays – verb primitives

All verbs need the ‘id’ of the tangible, some of them have ‘POIs’ that they need as input, and a few have ‘options’:

'point': poi + options = up, upperleft, left, lowerleft, down, lowerright, right, upperright
'assemble', 'disassemble', 'close', 'cut': poi
'drill': poi
'inspect': poi
'lift', 'lower', 'lubricate'
'measure': poi
'open', 'pack', 'paint', 'plug'
'rotate-cw', 'rotate-ccw': poi
'screw': poi
'unfasten': poi
'unpack', 'unplug'
'unscrew': poi
'forbid', 'allow', 'pick', 'place'
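A player could check these input requirements before activating an overlay. The sketch below does exactly that; the grouping of verbs is reconstructed from the slide and should be treated as indicative, and the function name is an illustrative assumption.

```python
# Verb primitives that need a POI as input (reconstructed from the slide).
POI_VERBS = {"point", "assemble", "disassemble", "close", "cut", "drill",
             "inspect", "measure", "rotate-cw", "rotate-ccw", "screw",
             "unfasten", "unscrew"}
# Verb primitives that additionally take an option (e.g. arrow direction).
OPTION_VERBS = {"point"}

def validate_activate(predicate, poi=None, option=None):
    """Check that an <activate> supplies the inputs its verb primitive needs.

    All verbs implicitly need the tangible's id, which the <activate>
    element carries; only poi/option requirements are checked here.
    """
    if predicate in POI_VERBS and poi is None:
        return False
    if predicate in OPTION_VERBS and option is None:
        return False
    return True

print(validate_activate("point", poi="leftside", option="down"))  # → True
print(validate_activate("drill"))                                 # → False
```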

Page 28


Viitaniemi et al. (2014): Deliverable D4.2, TELLME consortium

Page 29

Warning signs

Add an enter activation:

<activate tangible="board1" poi="leftside" warning="p030"/>

Page 31

Next virtual meeting

See http://arlem.kmi.open.ac.uk

October 12, 2015, 8:00 PDT / 11:00 EDT / 16:00 BST / 17:00 CEST / 24:00 KST


Page 32

The END