
Page 1: IEEE augmented reality learning experience model (ARLEM)

A Standard for Augmented Reality Learning Experience Models (AR-LEM)

Fridolin Wild1), Christine Perey2)

1) The Open University, UK 2) Perey Research and Consulting, CH

Page 2: IEEE augmented reality learning experience model (ARLEM)

Call for Potentially Essential Patents

If anyone in this meeting is personally aware of the holder of any patent claims that are potentially essential to implementation of the proposed standard(s) under consideration by this group and that are not already the subject of an Accepted Letter of Assurance:

• Either speak up now, or
• Provide the chair of this group with the identity of the holder(s) of any and all such claims as soon as possible, or
• Cause an LOA to be submitted

Page 3: IEEE augmented reality learning experience model (ARLEM)

Agenda

• Welcome by Avron Barr (LTSC)

• Welcome by the Chairs of P1589

• Purpose and goals of ARLEM

• Baseline spec

• Use Cases

3

Page 4: IEEE augmented reality learning experience model (ARLEM)

Welcome Message from our Sponsor

Avron Barr, IEEE Learning Technology Standards Committee

4

Page 5: IEEE augmented reality learning experience model (ARLEM)

IEEE Standards Working Groups

• Identified market problem and solution
– Who will be selling what to whom? What products will be “certified”?
– Solving an identified or anticipated market problem: fractured marketplace, excessive product integration costs, product incompatibility, vendor lock-in
– Vendors and their customers should see the need for standards, as evidenced by their participation and sponsorship.
– The specification may be only a part of the solution. Stewardship might also involve promotion, conformance testing, best practices guides, maintenance, and continued evolution of the spec.

• Participation and governance in the IEEE
– All IEEE LTSC proceedings are open to observers
– The working group gets to decide about membership (individual vs. entity), fees, voting, and its governance framework generally, within IEEE guidelines.
– Shared IP: http://open-stand.org

5

Page 6: IEEE augmented reality learning experience model (ARLEM)

Learning Technology Standards Committee: Current Projects

Study groups (pre-standard)
– Actionable Data Book. Exploring the future of educational publication – textbooks that compute.
– Project-based Learning Opportunities. Exploring the possibility of describing internships and other on-the-job learning opportunities and building a brokerage system to match prospects with jobs.
– Competencies. Defining a universal language for describing competency frameworks, which will allow these frameworks to be compatible and interoperable across communities of practice.

Standards working groups
– Resource Aggregation Models for Learning Education and Training. Developing ontology-based solutions for semantic interoperability across the various e-learning content packaging schemes.
– Augmented Reality. Developing a standard model for defining AR-based learning experiences.

6

Page 7: IEEE augmented reality learning experience model (ARLEM)

You can’t make a standard!

Pre-standards Activities: Principles, Requirements, Early Specs, Prototypes

Standardization: Compromises, Champions, Prototypes

Early Adoption: Publication, First Products, PR

Rude Awakening: User feedback, Revisions

Real Adoption: Stabilization, Test Suites, Products, Conformance, Compliance

Only the market can make a standard

Robby Robson, 2005

Page 8: IEEE augmented reality learning experience model (ARLEM)

Welcome Message

Christine Perey, Perey Consulting

8

Page 9: IEEE augmented reality learning experience model (ARLEM)


Augmented Reality in 2015 (June 3, 2015)

Physical World – Digital Assets – Specific AR Use Cases

AR Experience Authoring: 100,000+ developers with access, skills and an expressed desire to author AR experiences

AR Experience Delivery: nearly 1B people with at least one AR-ready device (sensors and output/display support)

Page 10: IEEE augmented reality learning experience model (ARLEM)


Many Companies Are Producing AR Products and Services, but . . .

Proprietary Technology Silos

Page 11: IEEE augmented reality learning experience model (ARLEM)


Augmented Reality Developers and their Experiences

– The top <10% of developers are responsible for >50% of AR experiences
– The next 10% are responsible for 20%
– 80% of developers have only a few AR experiences

Page 12: IEEE augmented reality learning experience model (ARLEM)


Mobile AR-Enabled vs. Users

1B Smartphones with all necessary sensors and graphics acceleration hardware

Only <10% are users of mobile AR


Page 13: IEEE augmented reality learning experience model (ARLEM)


What’s Missing?

Page 14: IEEE augmented reality learning experience model (ARLEM)


What is Open and Interoperable AR?

A complete end-to-end system in which modular components can be supplied by multiple vendors and still deliver the same workflow and experience quality.

A set of shared values… a “school of thought” about Augmented Reality.

Page 15: IEEE augmented reality learning experience model (ARLEM)


From silos to open systems


Page 16: IEEE augmented reality learning experience model (ARLEM)


Open and Interoperable AR

Any Digital Assets – Any AR Use Cases

AR Experience Authoring – Tools and Workflows: millions of developers with access, skills and an expressed desire to author AR experiences

AR experiences on any form factor, using any standards-compliant software client: billions of people with at least one AR-ready device (sensors and output/display support)

Page 17: IEEE augmented reality learning experience model (ARLEM)


Open and Interoperable AR

– Permits consistent and flexible content and technology integration and management
– Interoperability simplifies the developer’s AR experience: authoring, publishing, integration
– Interoperability increases the user’s: discovery, sharing, consuming

Page 18: IEEE augmented reality learning experience model (ARLEM)


Where Would You Rather Start?


Existing Standards and Modern Tools

Or Raw Materials and Primitive Tools

Page 19: IEEE augmented reality learning experience model (ARLEM)


AR Community

– Grassroots community of people since 2009
– Seeks open and interoperable AR content and experiences
– Brings together standards development organizations and developers
– Operates a Web portal and seven archived mailing lists
– Conducts virtual and in-person meetings

Page 20: IEEE augmented reality learning experience model (ARLEM)


Notable Achievements to Date

Initiatives
– Cross-SDO (OGC, Khronos Group, ISO, Web3D) collaboration to address 3D compression and transmission
– Development of AR Browser interoperability

Resources
– Tables of relevant standards and status of active SDOs
– Calendar of meetings and events
– Glossary of AR terminology
– Mixed and Augmented Reality Reference Model (ISO)

Page 21: IEEE augmented reality learning experience model (ARLEM)


Relevant Industry Groups and Standards Organizations

National Standards Organizations

Page 22: IEEE augmented reality learning experience model (ARLEM)


Most Active Standards Groups

– Mixed and Augmented Reality Reference Model (MAR RM)
– AR Application Format (ARAF)
– WebGL, glTF, OpenVX, OpenKCam, StreamInput
– 3D Medical Display, Streaming Media Quality, Streaming to Mobile, xAPI, Simulation and Virtual Reality, ARLEM
– ARML 2.0, IndoorML, OWS Context, GeoPackage, Moving Features, Points of Interest

Page 23: IEEE augmented reality learning experience model (ARLEM)

ARLEM Interoperability

23

Page 24: IEEE augmented reality learning experience model (ARLEM)

The Cost of Integration

Studies show:
– 30% of the time in software development projects is spent on interface design and implementation (Schwinn & Winter, 2005)
– 35% to 60% of the IT budget is spent on development and maintenance of interfaces (Ruh et al., 2001)
– Rising heterogeneity and integration demand (Klesse et al., 2005)

Page 25: IEEE augmented reality learning experience model (ARLEM)

Status Quo in Learning Technology

– Plethora of (standard) software: C4LPT lists over 2,000 tools
– Existing learning object / activity standards lack reality support
– Multi-device orchestration (think wearables!)

=> Enterprises and institutions face interoperability problems

Page 26: IEEE augmented reality learning experience model (ARLEM)

Interoperability

“…this means that independently developed software components can exchange information so that they can be used together.” (Duval, 2004)

“…is the ability to transfer and use information in a uniform and efficient manner across multiple organisations and information technology systems.” (Noie, 2003)

“…is a property that emerges, when distinctive information systems (subsystems) cooperatively exchange data in such a way that they facilitate the successful accomplishment of an overarching task.” (Wild et al., 2007) http://ceur-ws.org/Vol-309/paper01.pdf

Page 27: IEEE augmented reality learning experience model (ARLEM)

Dissociating Interoperability

(modified from Kosanke, 2005)

Page 28: IEEE augmented reality learning experience model (ARLEM)

ARLEM

Mending the dissociative gap

28

Page 29: IEEE augmented reality learning experience model (ARLEM)

The traditional route to knowledge.

Photo: Simon Q (flickr)

The Codrington Library, All Souls College, Oxford University

Page 30: IEEE augmented reality learning experience model (ARLEM)

Its practical application: materials (e.g. yarn 76/2 710), machines (machine parts, materials), occupational health and safety, set-up parameters.

Page 31: IEEE augmented reality learning experience model (ARLEM)

Experience.

31

Page 32: IEEE augmented reality learning experience model (ARLEM)

Mend the dissociative gap.

Photo: Marco Leo (flickr) 32

Page 33: IEEE augmented reality learning experience model (ARLEM)

Embedding knowledge into experience

33

Page 34: IEEE augmented reality learning experience model (ARLEM)

From the AR Community Meeting @ MIT Media Lab, Cambridge/MA, 24-25 March 2015

Use Cases

34

Page 35: IEEE augmented reality learning experience model (ARLEM)

Use Cases (1)

‘Wet rehearsal’. Simulation in the real context: rehearsal at the actual workplace with the actual objects, so that newcomers can train before they do the real thing. Recording Standard Operating Procedures (for an audit) is one such example.

Assessment. Experience recording helps collect evidence of task performance ‘by the book’ (and can be replayed to others). Such recordings, captured at the push of a button or at hot-spots, can later be brought up again to support career development: the evidence helps assess where training is required and proves whether staff are able to do the job within the required specs. Example: service technicians.

Quality Inspection. Assessment comes in many disguises, quality inspection for high-precision jobs being one of them. In manufacturing, for example, product assurance is key.

Experience recording. Active AR in authoring mode is used in a ‘show and tell’ way to extract key steps from existing documentation. User-generated content can be used to convert existing technical documentation into augmented documentation.

http://bit.ly/arlemusecases

Page 36: IEEE augmented reality learning experience model (ARLEM)

Use Cases (2)

Health & Wellness Learning. Imaging, wearable sensors, and biometrics enable the enlightened patient to better control their wellbeing, using direct biofeedback to understand and modify their own behavior. For example, visualizing x-ray or MRI data in situ on the body, using an interactive ‘mirror’, helps people understand conditions better. Physical therapy for rehabilitation, patient self-help, a Yoga trainer, etc. all work on the same principle: understand better what is happening inside of you and use it to your advantage.

Maintenance. Not only mechanics are able to do repairs and maintenance operations. Many products today are not repaired but disposed of when faults occur, as the cost of professional labour (and travel of engineers or postage) is often higher than producing a new unit. Changing the motor on a washing machine, replacing a chain, gearbox, or brakes on a bike, changing the electronic window levers on a car, supporting installation of a complex wire harness: the number of AR-supported DIY opportunities is nearly endless.

36

http://bit.ly/arlemusecases

Page 37: IEEE augmented reality learning experience model (ARLEM)

Use Cases (3)

Remote Tutoring. Not only professionals, but also home users with a certain level of manual dexterity would benefit a lot from live tutoring and guidance, receiving remote support in situ and on the job. Stuck with changing the motor in your washing machine? Call the service agency on the smart glasses to receive live hands-on guidance.

Resumé Service. Human Resources would love to visualize the experience of candidates, enhancing the resumé. Checking worker compliance is one example (compliance assessment).

Tangible Learning Objects. Using 3D printing and Internet-of-Things hardware, we can breathe new life into objects, using their tangible features as interfaces to software functionality and logic. A relay box simulator is an example of this.

Work Shadowing. For complex tasks, it is often best to learn from the best and see things ‘through the eyes of the master’. AR may well be the game-changer, as passive-mode AR provides cost-efficient ways to watch a master in action – at scale.

37

http://bit.ly/arlemusecases

Page 38: IEEE augmented reality learning experience model (ARLEM)

Activity Modeling Language
Workplace Modeling Language

The TELLME Spec

38

Page 39: IEEE augmented reality learning experience model (ARLEM)

World Knowledge and Activity Knowledge

39

http://bit.ly/arlem-input

Page 40: IEEE augmented reality learning experience model (ARLEM)

40

The Activity Model

– Example instruction: “find the spray gun nozzle size 13”
– Messaging in the real-time presence channel and tracking to xAPI
– onEnter/onExit chaining of actions and other activations/deactivations
– Styling (cascading) of viewports and UI elements
– Constraint modeling: specify validation conditions and model workflow branching
– e.g. smart player; e.g. search widget

A minimal activity script illustrating these elements is sketched below.

http://bit.ly/arlem-input
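
To make these callouts concrete, here is a minimal sketch of an activity script using the element and attribute names from the tutorial later in this deck; the workplace filename and the step content are made up for illustration:

<activity id="example" name="Example activity" language="english"
          workplace="workplace-example.xml" start="start">
  <action id="start" viewport="actions" type="actions">
    <enter>
      <!-- onEnter: activations that run when this action step is entered -->
    </enter>
    <exit>
      <!-- onExit chaining: activate the next step, deactivate this one -->
      <activate type="actions" viewport="actions" id="step2"/>
      <deactivate type="actions" viewport="actions" id="start"/>
    </exit>
    <triggers>
      <!-- a click on the dialogue box moves the workflow on -->
      <trigger type="click" viewport="actions" id="start"/>
    </triggers>
    <instruction><![CDATA[<h1>Example</h1><p>Do the first step.</p>]]></instruction>
  </action>
  <!-- a second action with id="step2" would follow here -->
</activity>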

Page 41: IEEE augmented reality learning experience model (ARLEM)

41

The Workplace Model

– The ‘tangibles’: specific persons, places, things
– The ‘configurables’: devices (styling), apps + widgets
– The ‘triggers’: markers trigger overlays; overlays trigger human action
– Overlay ‘primitives’: enable re-use of e.g. graphical overlays

A minimal workplace model illustrating these elements is sketched below.

http://bit.ly/arlem-input
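
Along the same lines, a minimal sketch of a workplace model: the nesting of things under resources/tangibles/things and of detectables under resources/triggers/detectables follows the tutorial below, while the <workplace> root element and the ids are assumptions for illustration:

<workplace id="workplace-example">
  <resources>
    <tangibles>
      <things>
        <!-- a tangible 'thing', bound to detectable 001 -->
        <thing id="board1" name="Cabinet" detectable="001">
          <pois>
            <!-- point of interest at the centre of the marker -->
            <poi id="default" x-offset="0" y-offset="0" z-offset="0"/>
          </pois>
        </thing>
      </things>
    </tangibles>
    <triggers>
      <detectables>
        <!-- marker 001, recognised by the computer vision engine -->
        <detectable id="001" sensor="engine" type="marker"/>
      </detectables>
    </triggers>
  </resources>
</workplace>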

Page 42: IEEE augmented reality learning experience model (ARLEM)

Create a new activity

Create a new XML file; best name it something like ‘activity-myname.xml’.

Add the activity element: <activity></activity>

Add the following attributes to the <activity …> element:
– id="myshortname" (no spaces): this serves the indexing so that we can find activities later
– name="Assembly of cabinet": a human-readable description of the activity
– language="english"
– workplace="http://crunch.kmi.open.ac.uk/people/~jmartin/data/workplace-AIDIMA.xml": link to the workplace model – you can use one workplace for all activities, or different workplaces for different activities
– start="start": id of the action to start with

Page 43: IEEE augmented reality learning experience model (ARLEM)

This is what the file looks like now

<activity id="assembly" name="Assembly of cabinet" language="english"
          workplace="http://crunch.kmi.open.ac.uk/people/~jmartin/data/workplace-AIDIMA.xml"
          start="start">

</activity>

Page 44: IEEE augmented reality learning experience model (ARLEM)

Add your first action step (‘start’)

We just defined that the action step to begin with has the id ‘start’, so we create this action step:

Add the following in between the opening and closing <activity> tags:

<action id="start" viewport="actions" type="actions"></action>

Page 45: IEEE augmented reality learning experience model (ARLEM)

The dialogue box of the action

We want a human-readable instruction to be visible (not just an action step that displays overlays, 3D models, or videos), so we add the following in between the opening and closing <action> tags:

<instruction><![CDATA[<h1>Assembly of a simple cabinet</h1><p>Point to the cabinet to start…</p>]]></instruction>

Page 46: IEEE augmented reality learning experience model (ARLEM)

Entry, Exit, Trigger

To define the flow of the actions, we have to define what ‘triggers’ the state change

Moreover, we want to define what shall happen when the action is launched (‘entered’) and when the trigger moves to the next action (or whatever the ‘exit’ statements define).

Page 47: IEEE augmented reality learning experience model (ARLEM)

Example enter/exit/trigger

<enter removeSelf="false"></enter>
<exit>
  <activate type="actions" viewport="actions" id="step2"/>
  <deactivate type="actions" viewport="actions" id="start"/>
</exit>
<triggers>
  <trigger type="click" viewport="actions" id="start"/>
</triggers>

On enter: nothing (for now). On exit: launch step2 and remove the dialogue box ‘start’. This action step shall be exited by ‘clicking’ on the dialogue box.

Page 48: IEEE augmented reality learning experience model (ARLEM)

Your script now:

<activity id="assembly" name="Assembly of cabinet" language="english"
          workplace="http://crunch.kmi.open.ac.uk/people/~jmartin/data/workplace-AIDIMA.xml"
          start="start">

  <action id="start" viewport="actions" type="actions">
    <enter removeSelf="false"></enter>
    <exit>
      <activate type="actions" viewport="actions" id="step2"/>
      <deactivate type="actions" viewport="actions" id="start"/>
    </exit>
    <triggers>
      <trigger type="click" viewport="actions" id="start"/>
    </triggers>
    <instruction><![CDATA[<h1>Assembly of a simple cabinet</h1><p>Point to the cabinet to start ... </p>]]></instruction>
  </action>

  <action id="step2" viewport="actions" type="actions">
    <enter></enter>
    <exit removeSelf="true"></exit>
    <triggers>
      <trigger type="click" viewport="actions" id="step2"/>
    </triggers>
    <instruction><![CDATA[<h1>step2</h1><p>do this and that.</p>]]></instruction>
  </action>

</activity>

Page 49: IEEE augmented reality learning experience model (ARLEM)

Working with ‘tangibles’

– Utilise the computer vision engine to detect things/places/people (= tangibles)
– Define tangibles in the workplace model
– Then activate (or deactivate) what shall be visible and relevant in each action step

Page 50: IEEE augmented reality learning experience model (ARLEM)

In the workplace model

We open the workplace model and define a new thing (under resources/tangibles/things):

<thing id="board1" name="Cabinet" urn="/tellme/object/cabinet1" detectable="001">
  <pois>
    <poi id="leftside" x-offset="-0.5" y-offset="0" z-offset="0.1"/>
    <poi id="default" x-offset="0" y-offset="0" z-offset="0"/>
  </pois>
</thing>

The id is what we will reference. The detectable specifies which marker (or sensor state) will be bound to the thing. Poi = point of interest: specify locations relative to the centre of the marker (x = y = z = 0: centre).

Page 51: IEEE augmented reality learning experience model (ARLEM)

Markers and pre-trained markers

Markers must be defined in the workplace model

It shall be possible to provide pretrained markers (and their PDF file to print): these markers shall be named 001 to 050

Markers shall be specified via their id in the computer vision engine (under resources/triggers/detectables):

<detectable id="001" sensor="engine" type="marker"/>

Page 52: IEEE augmented reality learning experience model (ARLEM)

Activates and deactivates

Now we have defined a thing called ‘board1’ and we have tied it to the marker 001

We can start referring to it now from the activity script: we can, e.g., activate pictogram overlays for the verbs of handling and motion

<activate tangible="board1" predicate="point" poi="leftside" option="down"/>

Page 53: IEEE augmented reality learning experience model (ARLEM)

Your script:

<activity id="assembly" name="Assembly of cabinet" language="english"
          workplace="http://crunch.kmi.open.ac.uk/people/~jmartin/data/workplace-AIDIMA.xml"
          start="start">

  <action id="start" viewport="actions" type="actions">
    <enter removeSelf="false">
      <activate tangible="board1" predicate="point" poi="leftside" option="down"/>
      <activate tangible="board1" predicate="addlabel" poi="default" option="touchme"/>
    </enter>
    <exit>
      <deactivate tangible="board1" predicate="point" poi="leftside"/>
      <deactivate tangible="board1" predicate="addlabel" poi="default"/>
      <activate type="actions" viewport="actions" id="step2"/>
      <deactivate type="actions" viewport="actions" id="start"/>
    </exit>
    <triggers>
      <trigger type="click" viewport="actions" id="start"/>
    </triggers>
    <instruction><![CDATA[<h1>Assembly of a simple cabinet</h1><p>Point to the cabinet to start ... </p>]]></instruction>
  </action>

  <action id="step2" viewport="actions" type="actions">
    <enter></enter>
    <exit removeSelf="true"></exit>
    <triggers>
      <trigger type="click" viewport="actions" id="step2"/>
    </triggers>
    <instruction><![CDATA[<h1>step2</h1><p>do this and that.</p>]]></instruction>
  </action>

</activity>

These enter activations display an arrow pointing downwards on the point of interest ‘leftside’ and a label ‘touchme’ at the centre of the marker; both visual overlays are removed when this action step is exited.

Page 54: IEEE augmented reality learning experience model (ARLEM)

3D overlays, image overlays, videos

Besides the verb primitives and the label, there shall be ‘generics’ that can be used to embed video, images, or animations:

<activate tangible="board1" predicate="addanimation" poi="leftside" option="1"/>

Animations shall either be embedded in the app or be downloaded from the web (URL).

Animations can have ‘states’, addressed via the ‘option’ attribute (option=0: invisible; option=1: animation step 1, option=2: animation step 2, …)

<activate tangible="board1" predicate="addvideo" poi="leftside" option="http://myurl.org/myvideo.mp4"/>

<activate tangible="board1" predicate="addimage" poi="leftside" option="http://myurl.org/myimage.png"/>

Page 55: IEEE augmented reality learning experience model (ARLEM)

Verb Primitives

All verbs need the ‘id’ of the tangible, some of them need a ‘poi’ as input, and a few have ‘options’ (a usage sketch follows the list):

– 'point': poi + options = up, upperleft, left, lowerleft, down, lowerright, right, upperright
– 'assemble', 'disassemble'
– 'close'
– 'cut': poi
– 'drill': poi
– 'inspect': poi
– 'lift'
– 'lower'
– 'lubricate'
– 'measure': poi
– 'open'
– 'pack'
– 'paint'
– 'plug'
– 'rotate-cw', 'rotate-ccw': poi
– 'screw': poi
– 'unfasten': poi
– 'unpack'
– 'unplug'
– 'unscrew': poi
– 'forbid'
– 'allow'
– 'pick'
– 'place'
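
A sketch of how such verb primitives could be used inside an action step, assuming they plug into the same activate/deactivate mechanism shown earlier; the poi ‘hole1’ is illustrative:

<enter>
  <!-- arrow pointing downwards at the poi 'leftside' -->
  <activate tangible="board1" predicate="point" poi="leftside" option="down"/>
  <!-- drill pictogram at the (illustrative) poi 'hole1' -->
  <activate tangible="board1" predicate="drill" poi="hole1"/>
</enter>
<exit>
  <deactivate tangible="board1" predicate="point" poi="leftside"/>
  <deactivate tangible="board1" predicate="drill" poi="hole1"/>
</exit>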

Page 56: IEEE augmented reality learning experience model (ARLEM)

56

Viitaniemi et al. (2014): Deliverable D4.2, TELLME consortium

Page 57: IEEE augmented reality learning experience model (ARLEM)

Triggers and tangibles

If you add a tangible trigger (for ‘stareGaze’ navigation), a target icon will be overlaid, rotating in yellow and turning green when the stare duration (3 secs) has been reached.

<trigger type="detect" id="board1" duration="3"/>

Page 58: IEEE augmented reality learning experience model (ARLEM)

Warning signs

Add an enter activation:

<activate tangible="board1" poi="leftside" warning="p030"/>

Page 59: IEEE augmented reality learning experience model (ARLEM)

Open Problems

59

Page 60: IEEE augmented reality learning experience model (ARLEM)

Open Problems

Real-time messaging (multiuser, multi-device)

Revision needed: xAPI auto-logging

Query language for constraint validation

Performance analytics

Validator service

LEM aggregator (‘Open LEM’)

Page 61: IEEE augmented reality learning experience model (ARLEM)

Your Reference Implementation

Design Competition at the 10th ECTEL conference 2015: Envisioning Wearable Enhanced Learning
– 500-word abstract (approx. 2 pages)
– and design samples (e.g. mock-ups, videos, prototypes)

Wearable Enhanced Learning (WELL) is emerging as a transformational step in the transition from the desktop age through the mobile age to the age of wearable, ubiquitous computing.

ECTEL (http://www.ec-tel.eu/) will take place in
– Toledo (Spain)
– 15-18 September 2015

http://bit.ly/sigwellcompetition

Page 62: IEEE augmented reality learning experience model (ARLEM)

The END