
Page 1: Interfacing with Virtual Worlds

Interfacing with Virtual Worlds: An Introduction to MPEG-V

Christian Timmerer

Klagenfurt University (UNIKLU), Faculty of Technical Sciences (TEWI), Department of Information Technology (ITEC), Multimedia Communication (MMC)

http://research.timmerer.com http://blog.timmerer.com mailto:[email protected]

Authors: Christian Timmerer, Jean Gelissen, Markus Waltl, and Hermann Hellwagner
Slides available at http://www.slideshare.net/christian.timmerer

Page 2: Interfacing with Virtual Worlds

Outline
• Introduction
• Part 1: System Architecture
• Overview of MPEG-V Parts 2 and 4
• Part 3: Sensory Information
  – Concept
  – Sensory Effect Description Language
  – Sensory Effect Vocabulary + Usage Examples (cf. paper)
• Conclusions
• (Demo Video)


Page 3: Interfacing with Virtual Worlds

Introduction
• Multi-user online virtual worlds (NVE, MMOG) have reached mainstream popularity
  – e.g., World of Warcraft, Second Life, Lineage
• Boost the real-world economy by connecting the virtual and the real world?
  – Not only gaming
  – Entertainment, education, training, getting information, social interaction, work, virtual tourism, etc.
• For fast adoption of virtual worlds we need a better understanding of their internal economics, rules, and regulations


Page 4: Interfacing with Virtual Worlds

Introduction (cont’d)

• Finally, interoperability achieved through standardization

• MPEG-V (ISO/IEC 23005) :== system architecture + associated information representations

• Interoperability between virtual worlds
  – E.g., digital content providers of virtual worlds, (serious) gaming, simulation, DVD
• … and between virtual and real worlds
  – E.g., sensors, actuators, vision and rendering, robotics (e.g., for rehabilitation), (support for) independent living, social and welfare systems, banking, insurance, travel, real estate, rights management


Page 5: Interfacing with Virtual Worlds

MPEG-V System Architecture


[Architecture diagram: "Media context and control" across the four parts of the standard, i.e., Pt. 1: Architecture; Pt. 2: Control Information; Pt. 3: Sensory Information; Pt. 4: Avatar Information]

Page 6: Interfacing with Virtual Worlds

Part 2: Control Information


Sensory Device Capabilities as ext. of dia:TerminalCapability
• unit, max/minIntensity, numOfLevels, delay, position
• light (color, flash), heating, cooling, wind, vibration
• scent, fog, water sprayer, color correction
• kinesthetic, tactile

User Sensory Preferences as ext. of dia:UserCharacteristics
• adaptability, max/minIntensity
• light (color, flash), heating, cooling, wind, vibration
• scent, fog, water sprayer, color correction
• kinesthetic, tactile

Both are fundamental input to any control device (aka adaptation engine); a minimal sketch of such an engine follows below.
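The sketch is not part of MPEG-V; all class and field names are hypothetical simplifications of the Part 2 descriptions (the normative schemas extend dia:TerminalCapability and dia:UserCharacteristics from MPEG-21 DIA). It shows the kind of decision an adaptation engine makes from these two inputs: clipping a requested effect intensity to device and user limits.

```python
from dataclasses import dataclass

# Hypothetical, simplified mirrors of the Part 2 descriptions.

@dataclass
class DeviceCapability:        # e.g., a fan rendering the wind effect
    unit: str                  # physical unit of the intensity scale
    min_intensity: float
    max_intensity: float
    num_of_levels: int         # discrete levels the device supports

@dataclass
class UserSensoryPreference:   # per-effect user preference
    adaptability: bool         # may the engine adapt/render this effect at all?
    max_intensity: float       # upper bound the user tolerates

def adapt_intensity(requested: float,
                    cap: DeviceCapability,
                    pref: UserSensoryPreference) -> float:
    """Clip a requested effect intensity to device and user limits,
    then snap it to the nearest discrete device level."""
    if not pref.adaptability:
        return 0.0  # user switched this effect off
    bounded = max(cap.min_intensity,
                  min(requested, cap.max_intensity, pref.max_intensity))
    step = (cap.max_intensity - cap.min_intensity) / (cap.num_of_levels - 1)
    return cap.min_intensity + round((bounded - cap.min_intensity) / step) * step

fan = DeviceCapability(unit="m/s", min_intensity=0.0,
                       max_intensity=10.0, num_of_levels=5)
user = UserSensoryPreference(adaptability=True, max_intensity=6.0)
print(adapt_intensity(7.5, fan, user))  # 5.0: clipped to 6.0, snapped to a level
```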

Page 7: Interfacing with Virtual Worlds

Part 4: Avatar Characteristics
• Appearance
  – Contains the high-level description of the appearance and may refer to media containing the exact geometry and texture
• Haptic Properties
  – Contains the high-level description of the haptic properties
• Animation
  – Contains the description of a set of animation sequences that the avatar is able to perform and may refer to several media containing the exact animation (geometric transformation) parameters
• Communication Skills
  – Contains a set of descriptors providing information on the different modalities with which an avatar is able to communicate
• Personality
  – Contains a set of descriptors defining the personality of the avatar
• Control
  – Contains a set of descriptors defining possible placeholders for sensors on the body skeleton and face feature points
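Purely as an illustration of how these six categories fit together, here is a much-simplified, hypothetical data model; it is not the normative Part 4 schema, and every field name below is an assumption.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical mirror of the six Part 4 categories; the normative
# XML schema in ISO/IEC 23005-4 is considerably richer.

@dataclass
class Avatar:
    appearance_media: List[str] = field(default_factory=list)      # geometry/texture URIs
    haptic_properties: Dict[str, float] = field(default_factory=dict)
    animations: List[str] = field(default_factory=list)            # animation resource URIs
    communication_skills: List[str] = field(default_factory=list)  # supported modalities
    personality: Dict[str, float] = field(default_factory=dict)    # trait scores
    control_points: List[str] = field(default_factory=list)        # sensor placeholders

avatar = Avatar(
    appearance_media=["http://example.com/avatar.mesh"],
    communication_skills=["text", "voice", "gesture"],
    control_points=["skeleton:left_wrist", "face:jaw"],
)
print(avatar.communication_skills)  # ['text', 'voice', 'gesture']
```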


Page 8: Interfacing with Virtual Worlds

Part 3: Sensory Information
• Universal Multimedia Access (UMA)
  – Anywhere, anytime, any device + technically feasible
  – Main focus on devices and network connectivity issues
• Universal Multimedia Experience (UME)
  – Takes the user into account
• Multimedia adaptation and quality models/metrics
  – Single modality (i.e., audio, image, or video only) or a simple combination of two modalities (i.e., audio and video)
• Triple user characterization model
  – Sensorial, e.g., sharpness, brightness
  – Perceptual, e.g., what/where is the content
  – Emotional, e.g., feeling, sensation
• Ambient Intelligence
  – Additional light effects are highly appreciated for both audio and visual content
  – Calls for a scientific framework to capture, measure, quantify, judge, and explain the user experience


F. Pereira, “A triple user characterization model for video adaptation and quality of experience evaluation,” Proc. of the 7th Workshop on Multimedia Signal Processing, Shanghai, China, October 2005, pp. 1–4.

B. de Ruyter, E. Aarts, “Ambient intelligence: visualizing the future,” Proceedings of the Working Conference on Advanced Visual Interfaces, New York, NY, USA, 2004, pp. 203–208.

E. Aarts, B. de Ruyter, “New research perspectives on Ambient Intelligence,” Journal of Ambient Intelligence and Smart Environments, IOS Press, vol. 1, no. 1, 2009, pp. 5–14.

Page 9: Interfacing with Virtual Worlds

Concept of MPEG-V Sensory Information
• Consumption of multimedia content may stimulate senses beyond vision and audition
  – Olfaction, mechanoreception, equilibrioception, thermoception, …
• Annotation with metadata providing so-called sensory effects that steer appropriate devices capable of rendering these effects


… giving her/him the sensation of being part of the particular media

➪ worthwhile, informative user experience

Page 10: Interfacing with Virtual Worlds

Sensory Effect Description Language (SEDL)

• XML Schema-based language for describing sensory effects
  – Basic building blocks to describe, e.g., light, wind, fog, vibration, scent
  – MPEG-V Part 3, Sensory Information
  – Adopts MPEG-21 DIA tools for adding time information (synchronization)
• Actual effects are not part of SEDL but defined within the Sensory Effect Vocabulary (SEV)
  – Extensibility: additional effects can be added easily w/o affecting SEDL
  – Flexibility: each application domain may define its own sensory effects
• Description conforming to SEDL :== Sensory Effect Metadata (SEM)
  – May be associated with any kind of multimedia content (e.g., movies, music, Web sites, games)
  – Steers sensory devices like fans, vibration chairs, lamps, etc. via an appropriate mediation device

➪ Enhance the experience of the user ➪ worthwhile, informative user experience


Page 11: Interfacing with Virtual Worlds

Sensory Effect Description Language (cont’d)


SEM              ::= [DescriptionMetadata] (Declarations|GroupOfEffects|Effect|ReferenceEffect)+

Declarations     ::= (GroupOfEffects|Effect|Parameter)+

GroupOfEffects   ::= timestamp EffectDefinition EffectDefinition (EffectDefinition)*

Effect           ::= timestamp EffectDefinition

EffectDefinition ::= [activate] [duration] [fade] [alt] [priority] [intensity] [position] [adaptability]
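Read as follows: a SEM document carries optional description metadata plus one or more declarations, groups of effects, single effects, or effect references; a GroupOfEffects bundles two or more effect definitions under a common timestamp. A minimal structural sketch in Python, with simplified, hypothetical types (the normative XML Schema is richer):

```python
from dataclasses import dataclass, field
from typing import List, Optional

# Hypothetical mirror of the SEDL grammar above; field names follow
# the EffectDefinition production.

@dataclass
class EffectDefinition:
    effect_type: str                   # e.g., "sev:WindType" (from the SEV)
    intensity: Optional[float] = None
    duration: Optional[int] = None
    fade: Optional[int] = None
    position: Optional[str] = None

@dataclass
class GroupOfEffects:
    timestamp: int                     # e.g., an si:pts value
    effects: List[EffectDefinition] = field(default_factory=list)

    def __post_init__(self):
        # The grammar requires at least two EffectDefinitions per group.
        assert len(self.effects) >= 2, "GroupOfEffects needs >= 2 effects"

@dataclass
class SEM:
    description_metadata: Optional[str] = None
    elements: List[GroupOfEffects] = field(default_factory=list)

group = GroupOfEffects(
    timestamp=3240000,
    effects=[EffectDefinition("sev:WindType", intensity=0.0769),
             EffectDefinition("sev:VibrationType", intensity=0.56)],
)
sem = SEM(elements=[group])
```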

Page 12: Interfacing with Virtual Worlds

Example


<sedl:GroupOfEffects si:pts="3240000" duration="100" fade="15" position="urn:mpeg:mpeg-v:01-SI-PositionCS-NS:center:*:front">

<sedl:Effect xsi:type="sev:WindType" intensity="0.0769"/>

<sedl:Effect xsi:type="sev:VibrationType" intensity="0.56"/>

<sedl:Effect xsi:type="sev:LightType" intensity="0.0000077"/>

</sedl:GroupOfEffects>

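To show how a mediation device might consume such a fragment, here is a minimal parsing sketch in Python. The namespace URIs are placeholders (the normative ones are defined in ISO/IEC 23005), and the 90 kHz timescale for si:pts is an assumption for illustration.

```python
import xml.etree.ElementTree as ET

# Placeholder namespace URIs: assumptions for this sketch, not the
# normative URIs defined in ISO/IEC 23005.
SEDL = "urn:example:sedl"
SI = "urn:example:si"
SEV = "urn:example:sev"
XSI = "http://www.w3.org/2001/XMLSchema-instance"

FRAGMENT = f"""
<sedl:GroupOfEffects xmlns:sedl="{SEDL}" xmlns:si="{SI}"
    xmlns:sev="{SEV}" xmlns:xsi="{XSI}"
    si:pts="3240000" duration="100" fade="15"
    position="urn:mpeg:mpeg-v:01-SI-PositionCS-NS:center:*:front">
  <sedl:Effect xsi:type="sev:WindType" intensity="0.0769"/>
  <sedl:Effect xsi:type="sev:VibrationType" intensity="0.56"/>
  <sedl:Effect xsi:type="sev:LightType" intensity="0.0000077"/>
</sedl:GroupOfEffects>
"""

group = ET.fromstring(FRAGMENT)
pts = int(group.attrib[f"{{{SI}}}pts"])
# Assuming a 90 kHz timescale: 3240000 / 90000 = 36 s into the media.
print(f"activate at {pts / 90000:.1f} s")

for effect in group.findall(f"{{{SEDL}}}Effect"):
    kind = effect.attrib[f"{{{XSI}}}type"]         # which SEV effect to render
    intensity = float(effect.attrib["intensity"])  # normalized intensity
    # A real mediation device would map (kind, intensity) to an actuator
    # command, e.g., a fan speed for sev:WindType.
    print(f"  {kind}: intensity = {intensity}")
```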

Page 13: Interfacing with Virtual Worlds

Conclusions
• MPEG-V: Media Context and Control
  – Information exchange between virtual worlds
  – Information exchange between virtual and real worlds
  – Currently comprises four parts (more to come, e.g., reference software, conformance)
• MPEG-V Part 3: Sensory Information
  – Annotation with metadata providing so-called sensory effects that steer appropriate devices capable of rendering these effects ➪ enhanced, worthwhile, and informative user experience, giving the user the sensation of being part of the actual media
• Future work
  – Standardization: currently at CD level & going to FCD in October 2009
  – Research & Development:
    • Optimized and efficient delivery framework for MPEG-V enabled content
    • New Quality of Service/Experience metrics
    • Mechanisms for (semi-)automatic generation of MPEG-V metadata
    • End-to-end reference implementation of MPEG-V


Page 14: Interfacing with Virtual Worlds

References
• M. Waltl, C. Timmerer, and H. Hellwagner, “A Test-Bed for Quality of Multimedia Experience Evaluation of Sensory Effects,” Proceedings of the First International Workshop on Quality of Multimedia Experience (QoMEX 2009), San Diego, USA, July 29–31, 2009.

• C. Timmerer, J. Gelissen, M. Waltl, and H. Hellwagner, “Interfacing with Virtual Worlds”, accepted for publication in the Proceedings of the 2009 NEM Summit, Saint-Malo, France, September 28-30, 2009.

• C. Timmerer, “MPEG-V: Media Context and Control”, 89th ISO/IEC JTC 1/SC 29/WG 11 (MPEG) Meeting, London, UK, June 2009. https://www-itec.uni-klu.ac.at/mmc/blog/2009/07/08/mpeg-v-media-context-and-control/

• MPEG-V: http://www.chiariglione.org/mpeg/working_documents.htm#MPEG-V

• MPEG-V reflector: http://lists.uni-klu.ac.at/mailman/listinfo/metaverse


Page 15: Interfacing with Virtual Worlds


Demo & Video

Page 16: Interfacing with Virtual Worlds

Thank you for your attention

... questions, comments, etc. are welcome …

Ass.-Prof. Dipl.-Ing. Dr. Christian Timmerer
Klagenfurt University, Department of Information Technology (ITEC)
Universitätsstrasse 65-67, A-9020 Klagenfurt, Austria
[email protected]
http://research.timmerer.com/
Tel: +43/463/2700 3621, Fax: +43/463/2700 3699

© Copyright: Christian Timmerer