

Evaluation of learners' attitude toward learning in ARIES augmented reality environments

Rafał Wojciechowski*, Wojciech Cellary
Department of Information Technology, Faculty of Informatics and Electronic Economy, Poznan University of Economics, Mansfelda 4, 60-854 Poznan, Poland

Article info

    Article history:

Received 12 January 2012; received in revised form 21 October 2012; accepted 6 February 2013

    Keywords:

Interactive learning environments; Evaluation of CAL systems; Multimedia/hypermedia systems; Authoring tools and methods; Augmented reality

Abstract

The ARIES system for creating and presenting 3D image-based augmented reality learning environments is presented. To evaluate the attitude of learners toward learning in ARIES augmented reality environments, a questionnaire was designed based on the Technology Acceptance Model (TAM) enhanced with perceived enjoyment and interface style constructs. For the empirical study, a scenario of an experimental chemistry lesson was developed. The study involved students of the second grade of lower secondary school. As follows from this study, perceived usefulness and enjoyment had a comparable effect on the attitude toward using augmented reality environments. However, perceived enjoyment played a dominant role in determining the actual intention to use them. The interface style based on physical markers had a significant impact on perceived ease of use. Interface style and perceived ease of use had a weak influence on perceived enjoyment. In contrast, these two constructs had a significantly stronger influence on perceived usefulness.

© 2013 Elsevier Ltd. All rights reserved.

    1. Introduction

The two most important social and economic processes occurring nowadays are the emergence of an electronic knowledge-based economy and the transformation toward a global information society. Therefore, creativity and innovation become more and more prominent determinants of competitiveness in the labor market in the 21st century. This is a major challenge for education and teaching that needs to be addressed in the near future (Cellary, 2002). The aforementioned challenges require significant improvement of teaching methods, which will transform the role of learners from passive recipients of information to active participants in knowledge acquisition (Walczak, Wojciechowski, & Cellary, 2006).

In response to this need, interest in teaching based on the constructivist learning theory has been growing since the 1990s (Wilson, 1996; Jonassen, 1999; Marshall, 1996). There is a wide variety of perspectives on what the term constructivism means (Piaget, 1973; Vygotsky, 1978; Bruner, 1996). In this paper, constructivist learning is understood as an active process of constructing knowledge by the learner, in contrast to passively acquiring information (Duffy & Cunningham, 1996).

According to the constructivist approach, a teacher is a facilitator of learning rather than a transmitter of knowledge (Chaille & Britain, 2002). There are a number of possible pedagogic activities implementing the constructivist principles, such as experimentation, conducting discussions, performing projects, etc. All these activities encourage learners to be active and to make their own discoveries, inferences, and conclusions. Deployment of constructivist principles in a classroom requires usage of interactive and dynamic learning environments, where the learners are able to modify appropriate elements, test ideas, and perform experiments (Roussou, 2004).

Learning based on performing experiments and further reflection on their results is the basis of the learning-by-doing paradigm (Schank, Berman, & Macpherson, 1999). This paradigm implies that the best and most natural way of learning how to do something is trying to do it. A learning strategy that implements the learning-by-doing approach is experiential learning (Kolb, 1984; Beard & Wilson, 2006). This strategy greatly increases understanding and retention of the learned material in comparison to methods that solely involve listening, reading, or even viewing, as learners are usually intrinsically motivated to learn when they are actively engaged in the learning process (Yang, 2012).

* Corresponding author. E-mail addresses: [email protected] (R. Wojciechowski), [email protected] (W. Cellary).


    http://dx.doi.org/10.1016/j.compedu.2013.02.014

Computers & Education 68 (2013) 570–585


A key determinant of the effectiveness of experiential learning is interactivity (Roussou, 2004). As far as learning content is concerned, interactivity is defined as "the extent to which users can participate in modifying the form and content of a mediated environment in real time" (Steuer, 1992, p. 14). In traditional learning, the highest level of interactivity can be achieved in teaching labs, where students are able to conduct experiments putting theoretical concepts into practice. However, there are serious limitations associated with the experiments performed in teaching labs, since they may require much space, expensive equipment, appropriate safety measures, and trained staff. These restrictions make large-scale dissemination of experiential learning in educational institutions practically impossible or economically unjustifiable (Jara, Candelas, Puente, & Torres, 2011). Without breaking down those barriers, experiential learning will remain of theoretical rather than practical significance.

In this paper, we consider the application of image-based augmented reality (AR), which is an extension of virtual reality (VR), to create learning environments enabling experiential learning. Virtual reality is defined as "a high-end user-computer interface that involves real time simulation and interactions through multiple sensorial channels. These sensorial modalities are visual, auditory, tactile, smell, and taste" (Burdea & Coiffet, 2003, p. 3). In this paper, we focus on two essential aspects of VR, namely three-dimensional (3D) visualization and interactivity. Such a VR interface is called a virtual environment (VE), which is a 3D digital model of a real, abstract, or imagined environment. Virtual environments potentially offer a much broader range of forms of interactivity than real environments. Users are able to freely navigate in a virtual environment, observe the environment from different perspectives, and interact with selected virtual objects. Virtual environments can be used to implement virtual laboratories in which users are able to perform experiments (Dalgarno, Bishop, Adlong, & Bedgood, 2009; Jeong, Park, Kim, Oh, & Yoo, 2011). However, to create a virtual environment offering a high level of credibility, it is required to create a 3D model of an entire real environment, which is both time-consuming and expensive. The current state of VR technology causes the separation of humans from the real world, requires the use of expensive equipment to display and manipulate virtual objects, and also offers an indirect, non-intuitive user interface.

In comparison to virtual reality, which is aimed at immersing a user in a synthetic environment, augmented reality supplements the user's perception of the real world by the addition of computer-generated content registered to real-world locations (Azuma, 1997).

Augmented reality combines virtual reality with video processing and computer vision technologies (Parker, 1997; Davies, 2005). AR technology enables merging virtual objects with the view of real objects, resulting in augmented reality environments. In augmented reality environments, both virtual and real objects can co-exist and interact in real time (Milgram & Kishino, 1994).

The creation of AR environments requires the design of virtual representations of only a relatively small part of these environments. A significant part of AR environments consists of real objects, for which it is not necessary to create detailed 3D models, while offering the highest possible level of realism. In AR environments, users are able to interact with virtual objects in a direct and natural way by manipulating real objects without the need for sophisticated and expensive input devices (Wojciechowski, Walczak, White, & Cellary, 2004). Also, in contrast to virtual environments, in which users communicate in a mediated way via avatars, AR environments afford users direct face-to-face contact with each other.

AR environments offer a better opportunity for learning-by-doing through physical movements in rich sensory spatial contexts (Dunleavy, Dede, & Mitchell, 2009). Therefore, users have an opportunity to perform experiments on virtual objects by hands-on experiences in their real environments. This feature of AR supports situated learning, which means that learning should take place in the context in which it is going to be applied (Lave & Wenger, 1991). AR allows students to seamlessly combine learning environments with the real world in which they live and apply the knowledge and skills learned. AR environments with possible direct face-to-face interaction between learners foster the creation of communities of practice focused on the goal of gaining knowledge related to the presented content, since the learners are able to easily share gained information and experiences with the group (Lave & Wenger, 1991).

The main advantages of AR applications in the education domain are learner activity, cost, and safety. AR environments allow learning content to be presented in meaningful and concrete ways, including training of practical skills. Learners may play active roles in a wide range of learning activities within interactive educational scenarios developed in accordance with the learning-by-doing paradigm. The experience gained by learners during the learning process within an AR environment can be the basis for reflection and further group discussion in a classroom. The main aspects of learning afforded by AR environments are: spatial ability, practical skills, conceptual understanding, and inquiry-based activities (Cheng & Tsai, 2012).

Application of AR environments for teaching also brings cost reduction, because expensive real resources, such as laboratory equipment and supplies, are replaced with their virtual counterparts. Another significant advantage of AR environments is safety, since unskilled learners may explore potentially dangerous situations without any risk of harm to themselves or damage to expensive equipment.

There are a number of possible applications of AR environments in education (Walczak et al., 2006). They can be used for teaching about objects and phenomena impossible to see with the naked eye (e.g., molecular movements), simulation of potentially dangerous situations (e.g., chemical reactions), and visualization of abstract concepts (e.g., magnetic fields). In addition, the level of complexity of the presented phenomena can be reduced to allow learners to more easily gain knowledge about the underlying concepts. AR environments may be used in a wide spectrum of domains from natural sciences (e.g., chemistry, physics, biology, astronomy), through computer and information sciences, mathematics, and engineering (e.g., mechanical, electrical, biomedical), to humanities (e.g., history, linguistics, anthropology).

This paper is organized as follows. In Section 2, basic concepts related to augmented reality environments are introduced, as well as an overview of applications of AR in education. In Section 3, an overview of the ARIES system is presented. In Section 4, the TAM-based research model and an application scenario of the ARIES system are described. This section also contains a description of the research study. In Section 5, the results of the system evaluation are presented. Finally, Section 6 concludes the paper.

    2. Augmented reality in education

    2.1. Categories of augmented reality environments

Augmented reality is a broad concept, which applies to technologies that "combine the real and the virtual in any location-specific way, where both real and virtual information play significant roles" (Klopfer, 2008, p. 92). In general, AR systems are divided into location-based and image-based systems (Cheng & Tsai, 2012).

Location-based AR systems use data about the position of mobile devices, determined by the Global Positioning System (GPS) or WiFi-based positioning systems. Location-based AR systems enable users to move around with mobile devices in the real environment. Users can observe computer-generated information on the screens of mobile devices, while the information depends on the current location of the users in the environment.

In contrast to location-based AR, image-based AR is focused on image recognition techniques used to determine the position of physical objects in the real environment for appropriate placement of the virtual content related to these objects. Image-based AR systems are divided into marker-based and marker-less tracking. Marker-based AR requires the placement of artificial markers in the real environment to determine the position of physical objects in the environment. Marker-less AR does not require artificial markers placed in the real environment; instead, it is based on tracking natural features of physical objects present in the environment.

In this paper, we focus on image-based AR using marker-based tracking of physical objects. An image-based augmented reality environment consists of a real environment and a virtual scene, which is presented in the context of the real environment. The real environment contains real objects, which are automatically tracked using image processing and computer vision techniques. The virtual scene consists of virtual objects and virtual representations of real objects present in the real environment. The virtual objects and the virtual representations of real objects are overlaid on captured views of the real environment, giving users the impression that the virtual content actually exists in the real environment. The virtual objects can be displayed anywhere in the context of the real environment, whereas the virtual representations of real objects are displayed aligned with the corresponding real objects. In this way, users viewing the augmented reality environment get the illusion that virtual objects and virtual representations of real objects are an integral part of the real environment.

Image-based augmented reality environments can be presented to end users via different display devices, which are categorized into four types: head-mounted displays (HMDs), desktop monitors, large-screen projection systems, and handheld displays (Drascic & Milgram, 1996). To superimpose virtual graphics on a real-world view, an AR system requires the user to wear an HMD, optionally combined with a device that can measure the position and orientation of the user's head. There are two approaches to the generation of augmented views on HMDs: optical see-through and video see-through systems (Azuma, 1997). The optical see-through approach allows a user to look through the display at the real world. An optical see-through display is based on optical image combiners, e.g., half-silvered mirrors, placed in front of the user's eyes and used to mix the virtual with the real images. The video see-through approach requires one or two video cameras capturing views of the real world. The cameras are attached in front of a closed-view HMD. The cameras are used to capture real-world images, which are augmented with virtual content and displayed on the HMD worn by the user. These systems do not allow a user to look directly at the real world.

Image-based AR systems can also be implemented in monitor-based configurations. Instead of using a see-through HMD, a monitor-based system is composed of one or two video cameras and a monitor. Optionally, if the displayed images are stereoscopic, the user has to wear a pair of stereo glasses. The cameras capture an environment, whereas the monitor displays the captured images overlaid with virtual content. In monitor-based configurations, users can observe augmented reality environments displayed on a monitor screen. Alternatively, a monitor can be replaced with a projection system for presentation to a larger audience.

Handheld displays are usually embedded in mobile devices, such as smartphones and tablets. All currently available AR systems based on mobile devices are video see-through, where real-world views are captured by cameras built into the mobile devices. The two main advantages of displays embedded in mobile devices are portability and ubiquity. The disadvantages of handheld displays are their small size and image distortion caused by the built-in cameras of mobile devices.

    2.2. Related works

The application of AR technology in education is still at a very early stage. The reason is that this technology is often perceived by teachers as too expensive, complicated, and time-consuming (Champion, 2006). In recent years, there have been only a few attempts to apply AR technology to teaching.

In location-based AR systems, the presentation of information does not have to contain 3D virtual content registered in relation to real environments. In these systems, students usually work in groups to solve a problem. Each of them plays a different role, e.g., a chemist, a doctor, an environmentalist, or another domain expert. Students taking on different roles have to resolve a variety of tasks, which are pieces of a larger puzzle. In Alien Contact! students have to investigate a mysterious alien crash (Dunleavy et al., 2009). In Mad City Mystery students must explain the death of a virtual character (Squire & Jan, 2007), whereas in Environmental Detectives students play the role of environmental engineers investigating a toxic spill within a local watershed (Klopfer & Squire, 2008).

In image-based AR, students can observe a real environment augmented with 3D virtual content registered in relation to real objects. The existing image-based AR learning systems, such as Construct3D, Augmented Chemistry, Mixed Reality Classroom, and AR-Dehaes, have been developed to support a relatively narrow range of potential teaching subjects. For instance, Construct3D is a simple 3D construction tool in an immersive AR environment for educational purposes. The application domain of Construct3D is geometry education (Kaufmann, Schmalstieg, & Wagner, 2000). The Augmented Chemistry system is an application designed to assist in teaching abstract organic chemistry concepts such as molecular forms, the octet rule, and bonding (Fjeld, Juchli, & Voegtli, 2003). The Mixed Reality Classroom is an AR educational system developed for primary schools in Singapore (Liu, Cheok, Mei-Ling, & Theng, 2007). The system is composed of two thematic modules on Solar System and Plant. AR-Dehaes is an augmented book designed to visualize 3D virtual objects in order to help engineering students develop spatial skills (Martín-Gutiérrez et al., 2010).

The existing image-based AR solutions are developed for a specific pre-defined domain to teach only a narrow range of topics. In these systems, the role of a teacher is limited solely to instruction during the learning experience. Teachers are not able to update and adjust the existing learning content to their needs, changes in curricula, and different levels of learners. They also cannot easily create new learning content on their own. As a consequence, the potential application of such systems within real curricula is very restricted. As far as experiential learning is concerned, only the Augmented Chemistry system offers the possibility of experimentation to some extent.

In most existing solutions, the process of creating advanced interactive AR environments requires the involvement of highly qualified IT professionals, who are experts in the design and implementation of interactive 3D content. Also, any changes in the content often cannot be made without the assistance of the programmers. On the one hand, programmers do not have sufficient domain and pedagogical knowledge required to build a complete AR environment. On the other hand, teachers do not have appropriate technical knowledge to create and modify the learning content on their own. As a result, teachers are condemned to use ready-made content only.

Reusability and adaptability are two of the most important requirements for effective creation of learning materials (Boyle, 2003). Reusability allows teachers to create learning content that can be used in different learning contexts without much additional effort. Thus, teachers do not have to create the content from scratch; they can build it by reusing some of the existing materials. To this end, the learning materials should be appropriately modularized to enable easy sharing among teachers. The materials should be treated as regular digital products that are produced and distributed (Landowska & Kaczmarek, 2005). Adaptability makes it possible to adjust learning materials to individual and situational needs. It should be possible to tailor the materials to the age, learning styles, abilities, and performance characteristics of learners.

    3. System design

In this section, a system for building AR learning environments, called ARIES, is presented. ARIES is an e-learning system which enables domain experts, i.e., teachers, to actively participate in the authoring process of interactive educational scenarios. The ARIES system has been built as an implementation of the Augmented Reality Environment Modeling (AREM) approach (Wojciechowski, 2012). The AREM approach enables teachers to design and create learning scenes for augmented reality environments.

    3.1. AREM approach

In AREM, scenes and objects, both virtual and real, are modeled as instances of classes based on the object-oriented paradigm. The process of creating learning content is called learning content preparation, while the process of using the learning content is called learning content use.

Learning content preparation begins with the design of the content form, i.e., scene classes and object classes describing visual and behavioral aspects of the learning content. Definition of classes is performed by designers who have the programming skills required for designing 3D graphics and writing some high-level scripting code in XML. Next, the actual learning content in the form of learning scenes and objects is created based on the classes. The scenes and objects are defined through a simple graphical user interface by domain experts who have no programming skills but possess the domain-specific knowledge necessary to produce high-quality learning content.

When the learning content is ready, it can be used for learning in a classroom. To this end, learning content setup is performed by an instructor just before or during a lesson. The instructor selects an appropriate scene and sets it up for use in the classroom environment in which the lesson is going to take place. Then, a new AR environment is created based on the selected scene. The setup of AR environments can be performed by people without programming skills but with the competence to guide the instruction during the lesson. After the content setup is completed, the AR environment is ready for use and the learning process can begin. During the learning process, learners can interact with the learning content using real objects present in the AR environment.

    3.1.1. AR-Classes and AR-Objects

The AREM approach is based on two main concepts, AR-Class and AR-Object, following the object-oriented paradigm (Wojciechowski, 2012). However, the conventional object-oriented approach is not sufficient to model interactive AR environments, so the conventional concepts of class and object have been appropriately extended. AR-Objects are representations of virtual objects, real objects, and scenes composed of both kinds of objects. AR-Classes are created in the content design stage by content designers, whereas AR-Objects are created during the content creation stage by content creators.

AR-Classes implement the basic class features originating from the object-oriented paradigm, such as attributes, operations, and inheritance. In the context of AR environments, these features have been extended with 3D geometry, interactive behavior, media objects, constraints on the attributes, and aggregation relationships with other classes. An AR-Class represents a group of AR-Objects that share similar characteristics, such as geometry, media objects, behavior, relationships to other AR-Objects, and semantics. There are three kinds of relationships among AR-Classes: specialization, composition, and containment. Specialization defines a hierarchical structure of AR-Classes, where one class is a specialization of another class. Composition and containment are kinds of aggregation, where one class is part of another class.

Attributes are used for describing the visual, behavioral, and semantic characteristics of AR-Objects. The attributes of AR-Classes are therefore used for parameterization of their geometry and behavior. Since different AR-Objects instantiated from an AR-Class may have different attribute values, the presentation of different AR-Objects may differ in visual and behavioral aspects.

In the context of education, AR-Classes are used for modeling learning concepts. The AR-Objects of an AR-Class represent specific instances of the learning concept. The instances can be presented in an AR environment. Learning concepts encompass all the concepts necessary to create an AR environment. They are divided into domain concepts and presentation concepts. Domain concepts are directly related to the domain-specific knowledge, whereas presentation concepts are related to the presentation of that knowledge. For example, in the chemistry domain, domain-specific concepts are: liquid, solid, acid, base, etc., while presentation concepts correspond to the glassware and equipment necessary to set up and conduct chemical experiments, such as a pipette, measuring cylinder, test tube, beaker, Bunsen burner, or thermometer.
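To make the AR-Class and AR-Object concepts more concrete, the following Python sketch mimics the structure described above. The class names, attributes, and helper types are hypothetical illustrations only; the actual ARIES definitions are expressed in an XML-based language, as noted below.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Optional

@dataclass
class ARClass:
    """A class of AR content: shared geometry, default attributes, activities, components."""
    name: str
    parent: Optional["ARClass"] = None                              # specialization (inheritance)
    attributes: Dict[str, object] = field(default_factory=dict)    # default attribute values
    geometry: Optional[str] = None                                  # reference to a parameterizable 3D model
    activities: List[str] = field(default_factory=list)            # named behaviors
    components: List["ARClass"] = field(default_factory=list)      # composition/containment

@dataclass
class ARObject:
    """An instance of an AR-Class with concrete attribute values."""
    ar_class: ARClass
    attributes: Dict[str, object] = field(default_factory=dict)

    def attribute(self, key: str) -> object:
        # The instance value overrides the class default, mirroring parameterization.
        return self.attributes.get(key, self.ar_class.attributes.get(key))

# A domain concept (Liquid) and a presentation concept (Beaker), as in the chemistry example.
liquid = ARClass("Liquid", attributes={"color": "colorless", "opacity": 0.3})
beaker = ARClass("Beaker", attributes={"diameter_cm": 8.0, "height_cm": 12.0},
                 geometry="beaker.x3d", components=[liquid])

# Two AR-Objects instantiated from the same AR-Class differ only in attribute values.
small_beaker = ARObject(beaker, {"diameter_cm": 5.0, "height_cm": 9.0})
print(small_beaker.attribute("height_cm"))    # 9.0 (instance value)
print(small_beaker.attribute("diameter_cm"))  # 5.0
```

In this sketch, changing an attribute value on an AR-Object (e.g., the beaker diameter) is all that is needed to obtain a visually different instance of the same learning concept.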

For specifying AR-Classes and AR-Objects, including all their constituent elements, a new high-level, XML-based language, called the Augmented Reality Scenario Modeling Language, has been developed.

    3.1.2. Geometry

Each AR-Class may contain geometry, which is a 3D digital model specifying how the AR-Objects instantiated from this AR-Class are visually presented in AR environments. Geometry may be encoded in any language enabling modeling of virtual environments, for example X3D (Web3D Consortium, 2011), extended with parameterization features. A notable example of such a language is X-VRML (Walczak & Cellary, 2003).

Geometry of an AR-Class can be directly parameterized with the attributes of this AR-Class and indirectly with the attributes of its component AR-Classes. The geometry may be customized in AR-Objects by setting different attribute values during learning content preparation. The attribute values can dynamically change during learning content use as a result of the behavior of the AR-Objects present in an AR environment. Hence, the visualization of AR-Objects can also dynamically change at runtime. The possible changes of the geometry depend on the parameterization capabilities offered by this geometry. Consider the AR-Class Beaker that has properties specifying the diameter and height of its geometry. Different AR-Objects created based on the Beaker AR-Class may represent beakers of different diameters and heights.

Geometry of a composite AR-Class may embed the geometries of its component AR-Classes and may depend on the attributes of these AR-Classes. Consider the AR-Class Measuring cylinder with liquid composed of two component AR-Classes: Measuring cylinder and Liquid. The Measuring cylinder AR-Class contains geometry that can be displayed in an AR environment. The Liquid AR-Class has empty geometry, so it cannot be directly visualized, because the liquid's shape is confined to the container it fills. Therefore, the geometry of Measuring cylinder with liquid describes the visualization of the associated liquid in the context of the geometry included from the Measuring cylinder AR-Class. The geometry of the liquid can be parameterized by the following attributes: the diameter of Measuring cylinder, the quantity of liquid in Measuring cylinder with liquid, and the color and opacity of Liquid. In this example, the geometry of the Measuring cylinder with liquid AR-Class is directly parameterized by attributes of this AR-Class and indirectly by attributes of its component AR-Classes.

Parameterization of geometry provides a flexible mechanism for building highly dynamic 3D graphics content. In particular, the visualization of the objects can be dynamically changed as a result of user actions performed in an AR environment. For instance, when a user pours different liquids between different containers, the visualization of the liquids filling the containers is dynamically adjusted according to the properties of the liquids.
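As a rough illustration of how such parameterization might work, the sketch below derives the visible liquid level from the cylinder diameter and the liquid quantity. The function and attribute names are hypothetical; in ARIES this logic would live inside the parameterized geometry model (e.g., X-VRML), not in Python.

```python
import math

def liquid_level_cm(diameter_cm: float, quantity_ml: float) -> float:
    """Height of the liquid column in a cylindrical container (1 ml = 1 cm^3)."""
    radius_cm = diameter_cm / 2.0
    base_area_cm2 = math.pi * radius_cm ** 2
    return quantity_ml / base_area_cm2

def render_cylinder_with_liquid(diameter_cm: float, quantity_ml: float,
                                color: str, opacity: float) -> dict:
    """Return the parameters a renderer would need to draw the liquid geometry."""
    return {
        "liquid_height_cm": round(liquid_level_cm(diameter_cm, quantity_ml), 2),
        "color": color,
        "opacity": opacity,
    }

# Pouring liquid into the cylinder at runtime simply changes the quantity attribute,
# and the visualization is re-derived from the new attribute values.
print(render_cylinder_with_liquid(diameter_cm=3.0, quantity_ml=50.0,
                                  color="pink", opacity=0.5))
```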

    3.1.3. Behavior

Behavior in AR-Classes is defined by two kinds of operations: methods and activities. Methods are sequences of commands that access and process data in an immediate way. In contrast, activities access and process data continuously over a period of time.

Activities describe the behavior of AR-Objects in time, in particular their reactions to events occurring in an AR environment. Each activity denotes some distinctive behavior of the AR-Objects instantiated from an AR-Class. Each AR-Object can contain a number of different activities. Activities can be activated and deactivated at runtime. When an activity is activated for an AR-Object, an activity instance is created. Execution of activity instances depends on user interaction and the behavior of other AR-Objects. An activity instance is executed until it is explicitly deactivated or it is completed. For one AR-Object, a number of instances of different activities can be executed at the same time. In particular, a number of instances of one activity can run simultaneously.

Each activity defines an interaction context for instances of an AR-Class. The interaction context of an activity defined for an AR-Class specifies the classes of objects that can interact with the AR-Objects instantiated from that AR-Class. For instance, consider an AR-Class Pipette representing pipettes used for transferring liquids between containers. The Pipette class should contain specifications of two activities called DrawingFrom and DrippingTo, respectively. The DrawingFrom activity enables a pipette to draw liquid from a container, whereas the DrippingTo activity enables a pipette to drip liquid into a container. The interaction context of these activities contains the Container with liquid class. A number of instances of these activities can be executed simultaneously for different AR-Objects instantiated from the Container with liquid class. As a result, for each instance of the Pipette class it is possible to indicate which containers can be used to draw liquid from, and which ones can be used to drip liquid into. Those associations can be dynamically changed at runtime.
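A minimal sketch of the activity mechanism follows. The class and method names are hypothetical stand-ins (the real ARIES activities are declared in its XML-based language), but the sketch shows how an interaction context restricts which objects an activity instance may touch.

```python
class Activity:
    """A named behavior of an AR-Class, restricted to objects of its interaction context."""
    def __init__(self, name, interaction_context):
        self.name = name
        self.interaction_context = interaction_context  # tuple of allowed classes

    def start(self, owner, target):
        if not isinstance(target, self.interaction_context):
            raise TypeError(f"{self.name}: {type(target).__name__} is outside the interaction context")
        return ActivityInstance(self, owner, target)

class ActivityInstance:
    """A running instance; several instances of the same activity may run at once."""
    def __init__(self, activity, owner, target):
        self.activity, self.owner, self.target = activity, owner, target
        self.active = True

    def deactivate(self):
        self.active = False

class ContainerWithLiquid:
    def __init__(self, name):
        self.name = name

class Pipette:
    # Both activities may only interact with containers holding liquid.
    drawing_from = Activity("DrawingFrom", (ContainerWithLiquid,))
    dripping_to = Activity("DrippingTo", (ContainerWithLiquid,))

beaker = ContainerWithLiquid("beaker with HCl")
cylinder = ContainerWithLiquid("measuring cylinder with NaOH")
pipette = Pipette()

draw = Pipette.drawing_from.start(pipette, beaker)    # allowed: beaker is in the interaction context
drip = Pipette.dripping_to.start(pipette, cylinder)   # allowed as well; both instances run concurrently
```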

    3.1.4. Media objects

An AR-Class may contain media objects, such as images, videos, and audio clips. AR-Classes may define attributes whose values are allowed to be media objects. Each such attribute can be associated with a default media object contained in the AR-Class. In the AR-Objects instantiated from the AR-Class, media object attributes can be set to media objects different from their default values.

Media objects contained in an AR-Class can be referenced in the specification of the geometry and behavior of the AR-Class. In the geometry specification, images and videos can be used as textures. Also, audio clips can be embedded in a geometry model, if this is supported by the modeling language used for the geometry specification. The media objects can also be used as an aid in instruction during learning scenarios presented in AR environments. The media objects can provide supplementary information on the learning content presented. Images and videos can be displayed as 2D overlays on top of the view of an AR environment, while audio clips can be played in the background. Audio clips can contain sound effects, background music, or voice instructions.

    3.2. ARIES system

    3.2.1. System architecture

The overall architecture of the ARIES system is presented in Fig. 1. The central role is played by the learning content repository component, which stores the AR-Classes and AR-Objects used for building AR environments. AR-Classes and AR-Objects are created in the repository in the learning content preparation phase. They are created and managed with the AR-Class Manager and the AR-Object Manager, respectively. AR-Class Manager and AR-Object Manager are web applications accessible over the Internet. Thus, these tools can be used without the need to install any additional software apart from a web browser. The AR-Class and AR-Object managers cooperate with external authoring tools for the creation and modification of geometry and media objects. The media object authoring tools encompass a variety of graphics, audio, and video editors.

In the learning content preparation phase, the constituent elements of AR-Classes are defined, i.e., attributes, geometry, behavior, media objects, and relationships with other AR-Classes. AR-Classes are administered by content designers using AR-Class Manager. Creation of different elements of AR-Classes requires different skills; thus, the creation of AR-Classes can be an iterative process in which new constituent elements are added incrementally and existing elements are modified by different designers. Using AR-Class Manager, content designers can navigate the hierarchy of AR-Classes, and also create, modify, and delete the AR-Class definitions.

The user interface of AR-Class Manager hosted in a web browser window is presented in Fig. 2. On the left side of the window, there is a tree representing the AR-Class inheritance hierarchy. The root of this hierarchy is the Object class, which has three subclasses: Real Object, Virtual Object, and Scene. All the classes representing real objects, virtual objects, and scenes are descendants of the Real Object, Virtual Object, and Scene classes, respectively. The real object classes are denoted by the R icon, the virtual object classes by the V icon, and the scene classes by the S icon. Abstract classes are marked with icons in gray, whereas concrete classes are marked in green. On the right side of the window, there are a number of tabs allowing content designers to edit the constituent elements of AR-Classes. In particular, using the details tab, a user can associate 3D geometry with an AR-Class.

Geometry and media objects associated with AR-Classes are created and edited using external authoring tools, and then imported by AR-Class Manager into the learning content repository. To create 3D geometry, different tools and methods can be used depending on its complexity and on whether the geometry represents a concrete or an abstract concept. Geometry of abstract concepts can be modeled with a 3D modeling package such as 3ds Max (3ds Max, 2012). In contrast, geometry of real objects can be modeled with photogrammetry techniques. Photogrammetry enables automatic generation of textured 3D models of real objects from photographic images of the objects (Luhmann, Robson, Kyle, & Harley, 2006). Parameterization of geometry can be performed using a 3D modeling package extended with additional plug-ins enabling the parameterization. Using AR-Object Manager, domain experts can easily create, modify, and delete AR-Objects. When a domain expert creates a new AR-Object, he/she sets the values of the attributes defined in the corresponding AR-Class.

Fig. 1. Architecture of the ARIES system.

Fig. 2. AR-Class Manager: hierarchy of AR-Classes for a chemistry lesson.

The user interface of AR-Object Manager hosted in a web browser window is presented in Fig. 3. Similarly to AR-Class Manager, AR-Object Manager contains the tree representing the AR-Class inheritance hierarchy. In the central part of the window, there is a list of AR-Objects being instances of the AR-Class selected in the hierarchy. In the example, there are three instances of the Cylindrical container with liquid AR-Class in the list. Each of the AR-Objects is defined with different attribute values. On the right side of the window, there are two tabs with control elements enabling content creators to specify the attribute values for the currently selected AR-Object. By setting the attribute values, content creators are able to affect the geometry and behavior of the AR-Objects being defined. The attributes of AR-Objects are divided into the tabs depending on whether they can be set during the creation or the setup stage. Furthermore, the values of the setup attributes can be changed in the content setup stage before a lesson takes place.

In the learning content use phase, AR-Objects are retrieved from the repository and loaded into the presentation module, which is responsible for visualization of AR environments based on the AR-Objects retrieved from the learning content repository.

    3.2.2. Presentation module

The presentation module is used for building AR environments. To this end, the module uses the AR installation, which comprises a video camera, a display device, and a set of real objects in the form of square cardboard markers located in a real environment.

The presentation module consists of two components: Web Browser and AR Browser. Web Browser offers a web-based user interface for browsing AR-Classes and AR-Objects representing scenes. An instructor can browse the scene AR-Objects created in the content creation stage and select the scene that should be used for building his/her AR environment. Next, the instructor can customize the visualization and behavioral properties of the AR environment. To this end, the instructor may set the values of the attributes defined in the setup modification interface of the scene AR-Class. These attributes allow instructors to customize learning scenarios to their needs immediately before the learning stage.

The instructor can initiate a learning scenario, and then the presentation module switches to AR Browser, which generates an AR environment based on the AR-Objects contained in the scene. AR Browser operates in full-screen mode and enables learners to see the AR environment in which they can interact with the learning content. The AR Browser displayed on a monitor is shown in Fig. 4.

Fig. 3. AR-Object Manager: creating AR-Objects representing different virtual cylindrical containers for liquids.

AR Browser combines virtual objects and representations of real objects with live video images captured by a video camera. Visualization of an AR environment is performed in a loop. In each cycle of the loop, a video frame is grabbed and analyzed to find and identify particular real objects present in the real environment. Then, the positions and orientations of the real objects relative to the camera are calculated. Finally, the virtual content is rendered and superimposed on the captured image. In particular, the rendered content is transformed according to the locations of the recognized real objects.

In the AR Browser, tracking of real objects is performed using the ARToolKit library (Kato, Billinghurst, Poupyrev, Imamoto, & Tachibana, 2000). ARToolKit is capable of tracking special square-shaped markers placed in a real environment. The ARToolKit library uses computer vision techniques to calculate the position and orientation of a camera relative to the markers in real time. The markers have the form of black squares with a white inner area containing a non-symmetrical pattern. The markers have to be attached to the real objects that should be tracked in an AR environment. The real objects have the form of square cardboard pieces with markers printed on their surfaces. Learners can freely manipulate the real objects and in this way interact with the virtual content presented in the AR environment, as shown in Fig. 4.
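The per-frame cycle described above can be summarized with the following sketch. The helper names (grab_frame, detect_markers, marker_pose, render_virtual_content) and the surrounding objects are placeholders standing in for ARToolKit and the ARIES renderer, not their actual APIs, so this is only an outline of the loop under those assumptions.

```python
def run_ar_browser(video_source, scene, tracker, renderer, display):
    """One visualization cycle per captured frame (sketch only, hypothetical helpers)."""
    while display.is_open():
        frame = video_source.grab_frame()                          # 1. grab a video frame
        markers = tracker.detect_markers(frame)                    # 2. find and identify square markers
        poses = {m.id: tracker.marker_pose(m) for m in markers}    # 3. position/orientation relative to camera
        augmented = frame.copy()
        for ar_object in scene.tracked_objects():                  # 4. render content registered to markers
            pose = poses.get(ar_object.marker_id)
            if pose is not None:
                renderer.render_virtual_content(augmented, ar_object, pose)
        for ar_object in scene.free_objects():                     # virtual objects not tied to any marker
            renderer.render_virtual_content(augmented, ar_object, pose=None)
        display.show(augmented)                                    # 5. present the augmented view
```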

A complete view of an AR environment can be displayed by the presentation module on a head-mounted display, desktop monitor, or projection system. Using an HMD would be appealing for learners who would like to interact with the learning content while looking directly at the real environment instead of at a computer display. However, using HMDs in a real classroom is rather difficult for organizational and financial reasons. Thus, it is recommended to use large-screen displays, which enable easier access and allow a number of learners to collaborate in an AR environment at the same time. In the evaluation of the system, we used 22-inch LCD monitors. Monitors of this size were satisfactory, because each AR installation was used by at most two users at a time.

    4. Methods

    4.1. Research model

The aim of the experiment was to evaluate the learners' attitude toward experiential learning in AR environments. In the experiment we adopted the Technology Acceptance Model (TAM), which explains the determinants that encourage system use (Davis, 1989; Davis, Bagozzi, & Warshaw, 1989). The TAM model is widely used in technology acceptance studies (Teo, 2009; Sun & Cheng, 2009). The basic TAM model is shown in Fig. 5.

In the TAM model, acceptance of a system is represented by intention to use, which is determined by the user's attitude toward using the system and perceived usefulness. Attitude toward using a system is determined by the user's perceptions of the usefulness and ease of use of the system. According to TAM, perceived usefulness is determined by perceived ease of use. In addition, perceived usefulness and perceived ease of use can be affected by various external variables. These variables describe user characteristics, system features, and the setting in which the system is used.

Fig. 4. AR Browser component: students performing a chemical experiment in an AR environment.

    Fig. 5. Technology Acceptance Model (TAM).

Perceived usefulness is defined as "the degree to which a person believes that using a particular system would enhance his or her job performance" (Davis, 1989, p. 320). In the learning context, the user believes that a system would yield positive benefits for learning. Perceived ease of use refers to "the degree to which a person believes that using a particular system would be free of effort" (Davis, 1989, p. 320). Perceived usefulness and perceived ease of use have been extensively investigated in a number of studies, which proved that they are important factors positively influencing computer acceptance (Yuen & Ma, 2002; Liaw & Huang, 2003; Lin & Wu, 2004). However, some studies criticized the original TAM model for the omission of intrinsic factors that influence computer acceptance (Moon & Kim, 2001; Chung & Tan, 2004). Furthermore, prior studies showed that perceived enjoyment has a significant positive influence on attitude toward using and thus should be included in the TAM model (Chung & Tan, 2004; Wu, Chen, & Lin, 2007; Teo & Noyes, 2011).

The original TAM model takes into consideration only extrinsic motivation in the form of perceived usefulness. Extrinsic motivation is considered to be instrumental in achieving objectives that are distinct from the activity itself. In contrast, intrinsic motivation is related to the process of performing the activity per se. Thus, perceived usefulness is a form of extrinsic motivation, whereas perceived enjoyment is considered intrinsic motivation (Davis, Bagozzi, & Warshaw, 1992; Teo, Lim, & Lai, 1999).

Davis et al. proposed a revised TAM model including perceived enjoyment as an intrinsic motivational factor. Perceived enjoyment is defined as "the extent to which the activity of using the computer is perceived to be enjoyable in its own right, apart from any performance consequences that may be anticipated" (Davis et al., 1992, p. 1113).

For the evaluation of the acceptance of AR environments by learners, we adopted the TAM model enhanced with perceived enjoyment proposed by Davis et al. (1992). The research model for examining the impact of extrinsic and intrinsic factors on the use of the ARIES system by learners is presented in Fig. 6. According to the model, perceived usefulness and perceived enjoyment directly influence attitude toward using and intention to use the system. Furthermore, perceived ease of use may directly affect both extrinsic and intrinsic motivation, and the attitude toward using.

The AR interface based on real objects enabling direct manipulation of virtual objects particularly distinguishes AR environments from other learning environments built with traditional computer systems. Therefore, in this work we explored the influence of the AR interface on the main constructs directly determining the attitude toward using, i.e., perceived usefulness, perceived enjoyment, and perceived ease of use. To this end, in the research model we included one external variable, interface style (IS), which may have a significant influence on the determinants of attitude toward using. A significant impact of interface style on the attitude of users toward using a system was shown in previous studies (Hasan & Ahmed, 2007; Sun & Cheng, 2009).

    The following research hypotheses were formulated on the basis of the research model:

H1. Perceived usefulness (PU) will positively affect attitude toward using (ATU).
H2. Perceived usefulness (PU) will positively affect intention to use (ITU).
H3. Perceived enjoyment (PE) will positively affect attitude toward using (ATU).
H4. Perceived enjoyment (PE) will positively affect intention to use (ITU).
H5. Perceived ease of use (PEU) will positively affect perceived usefulness (PU).
H6. Perceived ease of use (PEU) will positively affect perceived enjoyment (PE).
H7. Perceived ease of use (PEU) will positively affect attitude toward using (ATU).
H8. Attitude toward using (ATU) will positively affect intention to use (ITU).
H9. Interface style (IS) will positively affect perceived usefulness (PU).
H10. Interface style (IS) will positively affect perceived enjoyment (PE).
H11. Interface style (IS) will positively affect perceived ease of use (PEU).
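Restated compactly, the hypothesized positive paths can be written as the following relations; this is only a notational summary of H1-H11, not additional structure beyond the research model:

```latex
\begin{align*}
\mathrm{PEU} &= f(\mathrm{IS}) && \text{(H11)}\\
\mathrm{PU}  &= f(\mathrm{PEU},\ \mathrm{IS}) && \text{(H5, H9)}\\
\mathrm{PE}  &= f(\mathrm{PEU},\ \mathrm{IS}) && \text{(H6, H10)}\\
\mathrm{ATU} &= f(\mathrm{PU},\ \mathrm{PE},\ \mathrm{PEU}) && \text{(H1, H3, H7)}\\
\mathrm{ITU} &= f(\mathrm{PU},\ \mathrm{PE},\ \mathrm{ATU}) && \text{(H2, H4, H8)}
\end{align*}
```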

    4.2. Application scenario

The AREM approach can be applied to teaching in different domains such as chemistry, physics, geography, biology, and cultural heritage (Wojciechowski, Walczak, & Cellary, 2005; Walczak & Wojciechowski, 2005). To illustrate the concept of AR environments we have chosen the chemistry domain for several reasons. First, chemistry is fundamentally an experimental science, in which experimentation and observation are essential for understanding many chemical concepts. Second, chemistry is a particularly challenging area as far as the interactive visualization of chemical experiments is concerned, due to the high dynamism of the visual and behavioral aspects of chemical reactions.

Fig. 6. Research model based on TAM.

Consider an example interactive scenario that shows learners the reaction of hydrochloric acid (HCl) and sodium hydroxide (NaOH). The scenario allows learners to gain knowledge of the acidic and basic nature of the substances. In an acid-base reaction, an acid and a base react to form a salt and water. In the example scenario, the reaction of hydrochloric acid (HCl) and sodium hydroxide (NaOH) produces table salt (NaCl) and water.
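For reference, the balanced neutralization equation underlying the scenario is:

```latex
\mathrm{HCl} + \mathrm{NaOH} \rightarrow \mathrm{NaCl} + \mathrm{H_2O}
```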

The experiment requires the following laboratory equipment: laboratory beakers, a measuring cylinder, a pipette, a porcelain evaporating dish, a Bunsen burner, and a pair of tongs. In the experiment the following chemical supplies are used: HCl solution in water, NaOH solution in water, phenolphthalein solution in ethanol, and distilled water. One of the beakers is filled with the HCl solution, the other is filled with the phenolphthalein solution. The measuring cylinder contains the NaOH solution. To conduct the experiment, a learner must perform the following steps:

1. Fill the pipette with the phenolphthalein solution from the beaker and drip it into the measuring cylinder with the NaOH solution. The solution in the cylinder changes color from colorless to pink.

2. Rinse the pipette with distilled water. If a learner does not do so, he or she should get an appropriate message to remind him/her of the need to rinse the pipette before drawing another liquid.

3. Fill the pipette with the HCl solution from the beaker and drip it into the measuring cylinder with the NaOH solution. The mixture in the cylinder changes color from pink to colorless. Continue dripping until the neutralization process is complete.

4. Take the measuring cylinder and pour the mixture into the evaporating dish.

5. Place the evaporating dish over the laboratory burner. Heat the evaporating dish over the burner flame until the solution evaporates to dryness. The evaporation process produces steam, leaving NaCl crystals in the dish.

In traditional teaching, such an experiment is carried out by a teacher, who demonstrates and explains the phenomena occurring during the experiment. The participation of students in carrying out such an experiment is limited to observing the phenomena taking place and asking questions of the teacher. However, this way of learning is not very engaging for students, who are not allowed to carry out chemical experiments in person due to safety measures, limited resources in laboratories, and the limited time available for a given group of students. In addition, teachers cannot perform potentially dangerous experiments, which could cause an explosion, fire, or the emission of hazardous substances.

    4.2.1. Learning content preparation

The AR-Classes and AR-Objects created for the acid-base reaction scenario are presented in Fig. 7. The upper part of the diagram shows a fragment of the AR-Class inheritance hierarchy defined for the scenario. Below the AR-Classes, a selection of AR-Objects used in the scenario is presented.

Fig. 7. AR-Classes and AR-Objects defined for the application scenario.

AR-Classes have a single-line border, whereas AR-Objects are framed by a double line. The is-instance-of relationships between AR-Objects and AR-Classes are represented by a dashed line with a solid arrowhead that points to an AR-Class. The composition and containment relationships are denoted by a solid line with a solid or an empty diamond, respectively.

The acid-base reaction scenario is implemented as the HCl-NaOH reaction object, which is an instance of the Acid-base reaction class. The HCl-NaOH reaction object represents a scene describing an AR environment. The Acid-base reaction class is connected by the containment and composition relationships with AR-Classes representing both virtual and real objects. The HCl-NaOH reaction object is connected by the containment and composition relationships with the appropriate AR-Objects conforming to the AR-Classes specified for the relationships defined in the Acid-base reaction class.

    4.2.2. Learning content use

When a lesson in a classroom is going to take place, an instructor selects the HCl-NaOH reaction object from the learning content repository and sets the learning scenario presentation parameters. Next, the scenario is started and the required AR-Objects are retrieved from the learning content repository and loaded into the presentation module, which builds a complete AR environment.

When the acid-base reaction scenario is initiated, a learner is provided via the presentation module with the AR environment containing both real and virtual objects. There are two beakers with the phenolphthalein and HCl solutions standing on virtual caption boxes with appropriate labels, the measuring cylinder with the NaOH solution, the pipette, and two beakers for rinsing the pipette with distilled water standing on the green virtual caption box. The presented laboratory glassware is represented as virtual objects. The cylinder and the pipette are attached to real objects having the form of square cardboard pieces.

At the beginning of the scenario a learner should drip some quantity of the phenolphthalein solution into the NaOH solution. Manipulating the appropriate marker, the learner draws the phenolphthalein solution into the pipette from the beaker. To this end, he/she has to move the pipette close enough to the beaker with the solution. While the tapered pipette end is located close enough over the beaker, the pipette fills with the solution. The filling stops when the learner moves the pipette away from the beaker. Next, the learner drips the phenolphthalein solution into the NaOH solution by placing the pipette end over the measuring cylinder. While the dripping takes place, the solution in the cylinder changes color from colorless to pink, because the NaOH solution has a basic pH.

The next step of the chemical experiment is transferring the HCl solution from the beaker to the measuring cylinder. Before that, the learner has to rinse the pipette with distilled water for safety reasons. If he/she forgets to do so, a message appears on the screen, which reminds him/her that the pipette has to be rinsed every time before it is used for transferring different liquids. After rinsing the pipette, the learner transfers some quantity of the HCl solution with the pipette from the beaker to the measuring cylinder containing the NaOH solution mixed with the pH indicator. While the learner drips the HCl solution, the mixture in the cylinder changes color from pink to colorless, because the acid neutralizes the basic pH of the NaOH solution. When the mixture in the cylinder has a neutral pH, the learner pours the mixture from the cylinder into the evaporating dish. Next, the learner grabs the evaporating dish with the tongs and places the dish above the burner flame in order to heat the mixture.

When the mixture is heated sufficiently, the water evaporates and condenses into mist. At the same time, sodium chloride (NaCl) appears at the bottom of the evaporating dish. The learner can control the heating process by manipulating the tongs with the marker. The scenario finishes when all the water evaporates from the mixture, leaving dry NaCl at the bottom of the evaporating dish.

    4.3. Design of empirical study

The empirical study consisted of using the ARIES system to carry out a chemical experiment in an AR environment according to the application scenario presenting the reaction between hydrochloric acid and sodium hydroxide. The study involved 42 participants of the second grade of lower secondary school, aged 14 to 16 years. The chemistry curriculum in the second grade of lower secondary school in Poland includes topics such as acids, bases, salts, and pH. Therefore, the application scenario used in the study concerned the topic of the reaction between acids and bases, which is consistent with the curriculum of the students.

    In order to perform the study, we set up six AR installations for carrying out chemical experiments in AR environments. Each AR installation was composed of a desktop PC with a monitor, a webcam, and a set of square cardboard markers. One of the AR installations is shown in Fig. 8. In the installation, a webcam is placed on top of the monitor. It captures the area in which a set of square cardboard markers is placed. Students sit in front of the monitor and can freely manipulate the markers. The image captured by the webcam and displayed on the screen is flipped horizontally. This allows the students to see on the monitor their mirror image augmented with virtual objects. In this way, students get the illusion that virtual objects exist in their environment. Furthermore, the students have the opportunity to directly interact with the virtual objects using the real markers in a natural and intuitive way, as shown in Figs. 9 and 10.
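    The mirror effect itself requires only that each captured frame is flipped horizontally before it is augmented and displayed. The following Python/OpenCV loop is a minimal sketch of this capture-and-mirror step, given purely for illustration; ARIES is not based on this code, and marker detection and rendering of the virtual objects are omitted.

```python
import cv2

# Minimal capture-and-mirror loop; marker detection and rendering of the
# virtual objects would run between capture and display.
cap = cv2.VideoCapture(0)           # webcam placed on top of the monitor
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        mirrored = cv2.flip(frame, 1)            # horizontal flip = mirror view
        cv2.imshow("AR view", mirrored)
        if cv2.waitKey(1) & 0xFF == ord("q"):    # press 'q' to quit
            break
finally:
    cap.release()
    cv2.destroyAllWindows()
```

    With the mirrored view, a student's hand on the screen moves in the same direction as the real hand, which is what makes manipulating the markers feel natural and intuitive.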

    Students performed the chemical experiments in groups of two. They collaborated on performing the experiments, exchanged remarks about the presented learning content, and gave each other instructions on how to use the system. After a few minutes of working with the system, the students gained relevant experience in interacting with the learning content using the interface based on the cardboard markers. The students were free to manipulate the markers and could focus on carrying out the chemical experiments in the AR environment, as shown in Fig. 10.

    Each student could carry out the experiment at his/her own pace, tailored to his/her personal preferences. Students were able to freely manipulate the real markers, but the experiment scenario restricted possible interactions between virtual objects to those that were essential for the proper execution of the experiment. While performing the experiment, the students were following the guidance displayed on the screen. The guidance comprised instructions and explanations of the chemical and physical phenomena, and thus the teacher's involvement was kept to a minimum. There were no technical problems during the study, so the students could focus on the merits of the experiment.

    All of the 42 participants completed the experiment successfully. After the completion of the experiment, the participants were asked to fill out an anonymous questionnaire with statements about working with the ARIES system and their attitude toward using such a system in the learning process in the future. We developed a questionnaire to measure each of the constructs comprising the research model presented in Fig. 6. The participants were asked to provide demographic information and respond to 18 statements grouped into six groups representing the constructs of the research model. The questionnaire statements were adapted from previous studies dealing with the TAM model, with changes in expressions in order to adjust them to the context of AR environments. Each statement in the questionnaire was measured on a five-point Likert scale ranging from 1 (strongly disagree) to 5 (strongly agree), with the exception of one reversed item for attitude toward using, which was measured on a five-point Likert scale ranging from 1 (strongly agree) to 5 (strongly disagree).

    5. Results

    5.1. Descriptive statistics

    The statements of the questionnaire and the descriptive statistics for each statement are presented in Table 1. All mean values are within the range of 3.93 to 4.62. The standard deviations range from 0.623 to 1.310.

    To measure the internal consistency of the statements, Cronbach's alpha was calculated for the statements belonging to each construct specified in the research model. For the internal reliability of statements concerning the same construct to be considered satisfactory, Cronbach's alpha should be greater than 0.7. The obtained Cronbach's alpha values for each construct except ATU are at a satisfactory level, as shown in Table 2. In the case of ATU, the value is slightly lower, which may indicate minor differences between the statements formulated regarding attitude toward using. This discrepancy could be influenced by the fact that one of the three statements was a reversed item phrased in the opposite semantic direction from the other statements. Negative statements used together with positive statements can decrease the degree of internal consistency, because the negative items may not be considered the exact opposite of the positive ones (Barnette, 2000).
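    For reference, Cronbach's alpha for a construct with k items is alpha = k/(k-1) * (1 - sum of item variances / variance of the summed scores). The snippet below is a generic Python illustration of this computation, using made-up responses rather than the study data.

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) matrix of Likert scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1).sum()
    total_variance = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Made-up responses of five students to the three PEU items (not the study data).
peu_items = [[4, 5, 4], [5, 5, 4], [3, 4, 3], [4, 4, 5], [5, 5, 5]]
print(round(cronbach_alpha(peu_items), 2))  # 0.77 for this toy data
```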

    5.2. System evaluation

    To verify hypotheses H5, H6, H9, H10, and H11, we examined the relationships between pairs of the appropriate constructs defined in the research model using regression analysis. The results of the regression analysis are presented in Table 3.

    Fig. 9. Mirrored image augmented with virtual objects: direct interaction of students with virtual objects.

    Fig. 8. AR installation: a desktop PC with a monitor, a webcam, and a set of physical markers.


    The p value was below the assumed significance level of 0.05 for all of the calculated regressions. Thus, for each of the hypotheses we rejected the null hypothesis of no dependence.

    Perceived usefulness depended to a similar extent on perceived ease of use (R² = 0.491) and interface style (R² = 0.478). Based on the regression values, the hypotheses H5 and H9 were supported. Perceived enjoyment depended to a relatively small extent on both perceived ease of use (R² = 0.346) and interface style (R² = 0.368). However, the regression values were sufficient to accept the hypotheses H6 and H10. Interface style had a significant impact on perceived ease of use (R² = 0.596), so the hypothesis H11 was supported.
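    Each of these values corresponds to a simple linear regression between two construct scores. The snippet below shows how such a pairwise R² and p value can be obtained in Python with SciPy; the construct scores listed are invented for illustration and are not the study data.

```python
import numpy as np
from scipy import stats

# Invented construct scores (mean of each construct's three items) for eight
# respondents; the study itself used the questionnaires of the 42 students.
interface_style = np.array([3.7, 4.3, 4.0, 4.7, 3.3, 4.3, 5.0, 3.0])
perceived_ease  = np.array([4.0, 4.3, 4.3, 5.0, 3.7, 4.7, 5.0, 3.3])

# Simple regression of PEU on IS, the kind of pairwise test used for H11.
result = stats.linregress(interface_style, perceived_ease)
print(f"R^2 = {result.rvalue ** 2:.3f}, p = {result.pvalue:.4f}")
```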

    In order to thoroughly investigate the factors that affect attitude toward using and intention to use, we used stepwise multiple regression analysis. The results of the analysis are presented in Table 4.

    As a result of the stepwise multiple regression analysis, we received a regression model of attitude toward using based on perceived usefulness and perceived enjoyment (R² = 0.827). The results of the regression analysis supported the hypotheses H1 and H3. The stepwise multiple regression algorithm excluded perceived ease of use due to a p value higher than the significance level (p = 0.243). This meant that the H7 hypothesis was not supported. Based on the results, perceived usefulness and perceived enjoyment had a strong impact on attitude toward using the system.

    Based on the stepwise multiple regression analysis, intention to use depended on attitude toward using and perceived enjoyment (R² = 0.737). The results of the regression analysis supported the hypotheses H4 and H8. The stepwise multiple regression algorithm excluded perceived usefulness because of its high p value (p = 0.953). This meant that the H2 hypothesis was not supported. Based on the regression model, perceived enjoyment and attitude toward using had a strong positive effect on intention to use the system.
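    The exclusion of predictors whose p values exceed the significance level is the defining feature of stepwise selection. The following Python sketch implements a basic forward-selection variant with statsmodels on synthetic data; it illustrates the principle only, since the exact stepwise procedure and software used in the study are not specified here.

```python
import numpy as np
import statsmodels.api as sm

def forward_stepwise(y, candidates, alpha=0.05):
    """Forward selection: repeatedly add the predictor with the smallest
    p-value, stopping when no remaining predictor reaches the alpha level."""
    selected, remaining = [], dict(candidates)
    while remaining:
        pvals = {}
        for name, x in remaining.items():
            cols = [candidates[s] for s in selected] + [x]
            X = sm.add_constant(np.column_stack(cols))
            pvals[name] = sm.OLS(y, X).fit().pvalues[-1]
        best = min(pvals, key=pvals.get)
        if pvals[best] >= alpha:
            break
        selected.append(best)
        del remaining[best]
    return selected

# Synthetic construct scores for 42 respondents (illustration only).
rng = np.random.default_rng(0)
pu, pe, peu = (rng.normal(4.2, 0.5, 42) for _ in range(3))
atu = 0.4 * pu + 0.5 * pe + rng.normal(0, 0.2, 42)
print(forward_stepwise(atu, {"PU": pu, "PE": pe, "PEU": peu}))
```

    With this synthetic data the procedure typically retains PU and PE and drops PEU, mirroring the pattern reported above for attitude toward using.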

    Fig. 10. Students carrying out a chemical experiment by manipulating real objects.

    Table 1
    The questionnaire statements and the means and standard deviations of the answers.

    Questionnaire statements                                                                M     S.D.
    Interface style (IS)
      Moving virtual objects using the cardboard cards is easy.                             3.93  0.921
      Operation of a computer with the cards is a good idea.                                4.17  0.935
      I could easily control the course of the chemical experiment using the cards.         4.10  0.958
    Perceived usefulness (PU)
      The use of such a system improves learning in the classroom.                          4.14  1.095
      Using the system during lessons would facilitate understanding of certain concepts.   4.19  1.087
      I believe that the system is helpful when learning.                                   4.31  0.869
    Perceived ease of use (PEU)
      I think the system is easy to use.                                                    4.29  0.708
      Learning to use the system is not a problem.                                          4.50  0.707
      Operation with the system is clear and understandable.                                4.33  0.846
    Perceived enjoyment (PE)
      I think the system allows learning by playing.                                        4.40  0.767
      I enjoyed using the system.                                                           4.43  0.859
      Learning with such a system is entertainment.                                         4.29  0.944
    Attitude toward using (ATU)
      The use of such a system makes learning more interesting.                             4.55  0.803
      Learning through the system was boring (reversed item).                               4.21  0.782
      I believe that using such a system in the classroom is a good idea.                   4.12  1.310
    Intention to use (ITU)
      I would like to use the system in the future if I had the opportunity.                4.29  0.835
      Using such a system would allow me to perform chemical experiments on my own.         4.62  0.623
      I would like to use the system to learn chemistry and other subjects.                 4.40  1.037


    6. Conclusions

    Following the empirical study, we found that perceived usefulness and perceived enjoyment had a similar effect on attitude toward using image-based AR environments. With regard to the intention to use AR environments, perceived enjoyment was a much more significant factor than perceived usefulness. Therefore, the use of AR environments during lessons could provide extra motivation to learn for young students. Before performing the empirical study, we wondered whether the interface style based on physical markers would act as a disincentive to use the system. Based on the analysis, it turned out that although interface style had a strong influence on perceived ease of use, these two factors had little effect on perceived enjoyment. Despite the fact that at the beginning the user interface based on physical markers required a little practice, it was not a factor limiting perceived enjoyment. Furthermore, perceived enjoyment was the factor having the essential influence on the willingness of students to use the system in the learning process.

    An alternative interpretation of the study results is that the positive attitude of learners toward the AR technology was expressed due to its novelty. It can be assumed that the positive attitude of students toward learning in AR environments will fade with time, as the learners get used to the technology. To maintain the learners' interest in AR technology in the longer term, continuous provision of engaging learning content is of crucial importance. Thus, successful dissemination of the AR technology in education on a large scale will greatly depend on the availability and quality of learning content for AR environments. Therefore, research on the application of AR in education should focus primarily on the development of new methods for the creation of interactive 3D content for AR learning environments. The proposed ARIES system fits in with this trend and enables teachers to develop new learning content by creating AR-Classes and AR-Objects. The opportunity for teachers to create new content themselves fosters continuous development of high-quality learning content, since they have the substantive and pedagogical knowledge required to prepare such content in accordance with the curriculum and pedagogy.

    Image-based AR environments seamlessly combine interactive 3D learning content with real environments containing physical objects.

    The learners can interact with the content in a direct and intuitive way by manipulating physical objects, and thus have an opportunity to perform different experiments in person. The active participation of learners in hands-on activities has a particularly positive effect on perceived enjoyment, resulting in their increased motivation for learning.

    Another important advantage of image-based AR environments is the freedom of experimentation, which could be impossible to achieve in the real world due to cost and safety reasons. In the presented application scenario, AR environments are used to implement experiential learning for performing chemical experiments. In these environments students are able to carry out experiments in person using virtual counterparts of real laboratory equipment and chemicals. The replacement of the actual laboratory resources with their virtual counterparts enables educational institutions to achieve significant financial savings. Once designed and developed, virtual objects can be reused by a number of students for performing various experiments. One installation for learning in image-based AR environments can be used for a broad spectrum of chemical experiments without having to make changes to the physical configuration of this installation. The AR installation takes up much less space than a typical workbench for chemical experiments and does not require any special chemistry laboratory infrastructure.

    It is possible to set up a number of AR installations in a classroom in order to enable a number of students, individually or in small groups, to perform chemical experiments in parallel. Setting up a number of AR installations in a classroom enables each student to perform an experiment independently at his/her individual pace. Also, students can perform different experiments in parallel, if decided by a teacher.

    Table 3
    The regression analysis.

    Dependent variable            Independent variable            R²      p
    Perceived usefulness (PU)     Perceived ease of use (PEU)     0.491   <0.05
    Perceived usefulness (PU)     Interface style (IS)            0.478   <0.05
    Perceived enjoyment (PE)      Perceived ease of use (PEU)     0.346   <0.05
    Perceived enjoyment (PE)      Interface style (IS)            0.368   <0.05
    Perceived ease of use (PEU)   Interface style (IS)            0.596   <0.05


    Experimenting in person encourages students to test various what-if scenarios that are possible for a given chemical experiment. In particular, students can perform potentially dangerous tasks without compromising their health and safety.

    AR has great potential for educational applications because it supports situated learning (Johnson, Smith, Willis, Levine, & Haywood, 2011). To ensure successful situated learning, AR environments should provide a reliable representation of reality to allow students to gain knowledge applicable in the real world. The realistic learning contexts offered by AR environments considerably facilitate the transfer of the abilities learned to the real world, in comparison to learning out of the real context.

    However, not all learning experiences are equally educative (Dewey, 1938). Some experiences can be mis-educative unless they are followed by the opportunity to reflect on what happened. Learners should be able to draw generalizations from their experiences and understand how to use these generalizations in future experiences. Thus, teachers have to create learning environments in which learners are actively engaged in meaningful tasks and carefully guided in reflection on their experiences. During the preparation of learning content for AR environments, special attention must be paid to the representation of potentially dangerous activities. The ease of exploring the consequences of such actions and the sense of security offered by AR environments must not foster the students' illusory belief that performing the same actions in the real world would not result in hazardous consequences. Therefore, AR learning environments should include appropriate guidance for the operations performed by students, and particularly dangerous activities should always be accompanied by relevant warnings about safety, or even prohibited.

    We conclude that learning in image-based AR environments can be particularly attractive and evocative for younger generations, who may perceive it more as edutainment than pure learning. This is the right time to introduce AR to teaching on a large scale. In recent years, the wide availability of 3D computer games and movies based on 3D computer graphics has led to widespread familiarity with 3D technologies. The young generations accustomed to 3D games and movies demand similar experiences in education. In this context, the AR technology may offer great help to educational institutions in increasing the attractiveness of teaching, thereby providing better motivation for students to learn.

    The study on the attitude of learners toward AR learning environments is only the first step in the dissemination of the AR technology in education. Further research should focus on whether students actually acquire knowledge and to what extent their knowledge of the concepts and processes presented in AR environments is increased. Next, a comparative experimental study should be carried out to determine whether students taught with the use of AR achieve significantly better results and a higher level of self-efficacy compared to a control group taught using traditional methods.

    References

    3ds Max. (2012). Autodesk 3ds Max. Accessed 19.10.12. http://www.autodesk.com/3dsmax.
    Azuma, R. T. (1997). A survey of augmented reality. Presence: Teleoperators and Virtual Environments, 6(4), 355–385.
    Barnette, J. J. (2000). Effects of stem and Likert response option reversals on survey internal consistency: If you feel the need, there is a better alternative to using those negatively worded stems. Educational and Psychological Measurement, 60, 361–370.
    Beard, C., & Wilson, J. P. (2006). Experiential learning: A best practice handbook for educators and trainers (2nd ed.). London: Kogan Page Limited.
    Boyle, T. (2003). Design principles for authoring dynamic, reusable learning objects. Australian Journal of Educational Technology, 19(1), 46–58.
    Bruner, J. S. (1996). The culture of education. Cambridge, MA: Harvard University Press.
    Burdea, G. C., & Coiffet, P. (2003). Virtual reality technology. NJ: John Wiley & Sons.
    Cellary, W. (2002). Social changes. In W. Cellary (Ed.), Poland and the global information society: Logging on (pp. 29–33). Warsaw: UNDP. Accessed 19.10.12. http://hdr.undp.org/es/informes/nacional/europacei/poland/poland_2001_en.pdf.
    Chaille, C., & Britain, L. (2002). The young child as scientist: A constructivist approach to early childhood science education (3rd ed.). Boston: Allyn & Bacon.
    Champion, E. (2006). Enhancing learning through 3D virtual environments. In E. K. Sorensen, & D. Ó Murchú (Eds.), Enhancing learning through technology (pp. 103–124). London: Idea Group Inc., Information Science Publishing.
    Cheng, K.-H., & Tsai, C.-C. (2012). Affordances of augmented reality in science learning: suggestions for future research. Journal of Science Education and Technology. http://dx.doi.org/10.1007/s10956-012-9405-9.
    Chung, J., & Tan, F. B. (2004). Antecedents of perceived playfulness: an exploratory study on user acceptance of general information-searching websites. Information and Management, 41(7), 869–881.
    Dalgarno, B., Bishop, A. G., Adlong, W., & Bedgood, D. R. (2009). Effectiveness of a virtual laboratory as a preparatory resource for distance education chemistry students. Computers & Education, 53(3), 853–865.
    Davies, E. R. (2005). Machine vision: Theory, algorithms, practicalities (3rd ed.). San Francisco, CA: Morgan Kaufmann Publishers.
    Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319–340.
    Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1989). User acceptance of computer technology: a comparison of two theoretical models. Management Science, 35(8), 982–1003.
    Davis, F. D., Bagozzi, R. P., & Warshaw, P. R. (1992). Extrinsic and intrinsic motivation to use computers in the workplace. Journal of Applied Social Psychology, 22(14), 1111–1132.
    Dewey, J. (1938). Experience & education. New York, NY: Simon & Schuster (republished in 1997).
    Drascic, D., & Milgram, P. (1996). Perceptual issues in augmented reality. In M. T. Bolas (Ed.), SPIE Volume 2653: Stereoscopic displays and virtual reality systems III (pp. 123–134). San Jose, CA: SPIE.
    Duffy, T. M., & Cunningham, D. J. (1996). Constructivism: implications for the design and delivery of instruction. In D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (pp. 170–198). NY: Simon & Schuster.
    Dunleavy, M., Dede, C., & Mitchell, R. (2009). Affordances and limitations of immersive participatory augmented reality simulations for teaching and learning. Journal of Science Education and Technology, 18(1), 7–22.
    Fjeld, M., Juchli, P., & Voegtli, B. M. (2003). Chemistry education: a tangible interaction approach. In M. Rauterberg (Ed.), Proceedings of human–computer interaction INTERACT'03 (pp. 287–294). Amsterdam: IOS Press.
    Hasan, B., & Ahmed, M. U. (2007). Effects of interface style on user perceptions and behavioral intention to use computer systems. Computers in Human Behavior, 23, 3025–3037.
    Jara, C. A., Candelas, F. A., Puente, S. T., & Torres, F. (2011). Hands-on experiences of undergraduate students in automatics and robotics using a virtual and remote laboratory. Computers & Education, 57(4), 2451–2461.
    Jeong, J.-S., Park, C., Kim, M., Oh, W.-K., & Yoo, K.-H. (2011). Development of a 3D virtual laboratory with motion sensor for physics education. In T.-H. Kim (Ed.), Proceedings of ubiquitous computing and multimedia applications: Second international conference (UCMA 2011), part I (pp. 253–262). Berlin Heidelberg: Springer-Verlag.
    Johnson, L., Smith, R., Willis, H., Levine, A., & Haywood, K. (2011). The 2011 horizon report. TX: The New Media Consortium.
    Jonassen, D. H. (1999). Designing constructivist learning environments. In C. M. Reigeluth (Ed.), Instructional design theories and models: Their current state of the art (pp. 215–239). Mahwah, NJ: Lawrence Erlbaum Associates.
    Kato, H., Billinghurst, M., Popyrev, I., Imamoto, K., & Tachibana, K. (2000). Virtual object manipulation on a table-top AR environment. In Proceedings of international symposium on augmented reality ISAR 2000 (pp. 111–119).
    Kaufmann, H., Schmalstieg, D., & Wagner, M. (2000). Construct3D: a virtual reality application for mathematics and geometry education. Education and Information Technologies, 5(4), 263–276.
    Klopfer, E. (2008). Augmented learning. Cambridge, MA: MIT Press.


    Klopfer, E., & Squire, K. (2008). Environmental detectives: the development of an augmented reality platform for environmental simulations. Educational Technology Research and Development, 56(2), 203–228.
    Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. NJ: Prentice Hall.
    Landowska, A., & Kaczmarek, J. (2005). Educational resources as digital products. In K. Baukneht (Ed.), Proceedings of e-commerce and web technologies: Sixth international conference EC-Web 2005 Copenhagen (pp. 228–237). Berlin Heidelberg: Springer-Verlag.
    Lave, J., & Wenger, E. (1991). Situated learning: Legitimate peripheral participation. NY: Cambridge University Press.
    Liaw, S.-S., & Huang, H.-M. (2003). An investigation of user attitudes toward search engines as an information retrieval tool. Computers in Human Behavior, 19(6), 751–765.
    Lin, F.-H., & Wu, J.-H. (2004). An empirical study of end-user computing acceptance factors in small and medium enterprises in Taiwan: analyzed by structural equation modeling. Journal of Computer Information Systems, 44(3), 98–108.
    Liu, W., Cheok, A. D., Mei-Ling, C. L., & Theng, Y.-L. (2007). Mixed reality classroom: learning from entertainment. In K. K. W. Wong (Ed.), Proceedings of the second international conference on digital interactive media in entertainment and arts (DIMEA'07) (pp. 65–72). NY: ACM Press.
    Luhmann, T., Robson, S., Kyle, S., & Harley, I. (2006). Close range photogrammetry: Principles, techniques and applications. NJ: John Wiley & Sons, Inc.
    Marshall, H. H. (1996). Implications of differentiating and understanding constructivist approaches. Educational Psychologist, 31(3/4), 235–240.
    Martín-Gutiérrez, J., Saorín, J. L., Contero, M., Alcañiz, M., Pérez-López, D. C., & Ortega, M. (2010). Design and validation of an augmented book for spatial abilities development in engineering students. Computers & Graphics, 34(1), 77–91.
    Milgram, P., & Kishino, F. (1994). A taxonomy of mixed reality visual displays. IEICE Transactions on Information and Systems, E77-D(12), 1321–1329.
    Moon, J.-W., & Kim, Y.-G. (2001). Extending the TAM for the World Wide Web context. Information and Management, 38(4), 217–230.
    Parker, J. R. (1997). Algorithms for image processing and computer vision. Indianapolis, IN: Wiley Publishing, Inc.
    Piaget, J. (1973). To understand is to invent: The future of education. NY: Grossman Publishers.
    Roussou, M. (2004). Learning by doing and learning through play: an exploration of interactivity in virtual environments for children. ACM Computers in Entertainment, 2(1), 1–23. NY: ACM Press.
    Schank, R. C., Berman, T. R., & Macperson, K. A. (1999). Learning by doing. In C. M. Reigeluth (Ed.), Instructional design theories and models: A new paradigm of instructional theory, Vol. II (pp. 161–181). Mahwah, NJ: Lawrence Erlbaum Associates.
    Squire, K. D., & Jan, M. (2007). Mad City Mystery: developing scientific argumentation skills with a place-based augmented reality game on handheld computers. Journal of Science Education and Technology, 16(1), 5–29.
    Steuer, J. (1992). Defining virtual reality: dimensions determining telepresence. Journal of Communication, 42(4), 73–93.
    Sun, H.-M., & Cheng, W.-L. (2009). The input-interface of Webcam applied in 3D virtual reality systems. Computers & Education, 53(4), 1231–1240.
    Teo, T. (2009). Modelling technology acceptance in education: a study of pre-service teachers. Computers & Education, 52(1), 302–312.
    Teo, T. S. H., Lim, V. K. G., & Lai, R. Y. C. (1999). Intrinsic and extrinsic motivation in Internet usage. OMEGA: International Journal of Management Science, 27(1), 25–37.
    Teo, T., & Noyes, J. (2011). An assessment of the influence of perceived enjoyment and attitude on the intention to use technology among pre-service teachers: a structural equation modeling approach. Computers & Education, 57(2), 1645–1653.
    Vygotsky, L. S. (1978). Mind in society: The development of higher psychological processes. Cambridge, MA: Harvard University Press.
    Walczak, K., & Cellary, W. (2003). X-VRML for advanced virtual reality applications. IEEE Computer, 36(3), 89–92.
    Walczak, K., & Wojciechowski, R. (2005). Dynamic creation of interactive mixed reality presentations. In Y. Chrysanthou, & R. Darken (Eds.), Proceedings of ACM symposium on virtual reality software and technology (VRST 2005) (pp. 167–176). NY: ACM Press.
    Walczak, K., Wojciechowski, R., & Cellary, W. (2006). Dynamic interactive VR network services for education. In Proceedings of ACM symposium on virtual reality software and technology (VRST 2006) (pp. 277–286). NY: ACM Press.
    Web3D Consortium. (2011). X3D specification website. Accessed 19.10.12. http://www.web3d.org/x3d/specifications/.
    Wilson, B. G. (1996). Constructivist learning environments: Case studies in instructional design. Englewood Cliffs, NJ: Educational Technology Publications.
    Wojciechowski, R. (2012). Modeling interactive augmented reality environments. In W. Cellary, & K. Walczak (Eds.), Interactive 3D multimedia content: Models for creation, management, search and presentation (pp. 137–170). London: Springer.
    Wojciechowski, R., Walczak, K., & Cellary, W. (2005). Mixed reality for interactive learning of cultural heritage. In S. Richir, & B. Taravel (Eds.), Proceedings of first international VR-learning seminar, in conjunction with the 7th international conference on virtual reality, VRIC Laval Virtual 2005 (pp. 95–99).
    Wojciechowski, R., Walczak, K., White, M., & Cellary, W. (2004). Building virtual and augmented reality museum exhibitions. In Proceedings of 9th international conference on 3D web technology (Web3D 2004) (pp. 135–144). NY: ACM Press.
    Wu, J. H., Chen, Y. C., & Lin, L. M. (2007). Empirical evaluation of the revised end user computing acceptance model. Computers in Human Behavior, 23(1), 162–174.
    Yang, Y.-T. C. (2012). Building virtual cities, inspiring intelligent citizens: digital games for developing students' problem solving and learning motivation. Computers & Education, 59(2), 365–377.
    Yuen, A., & Ma, W. (2002). Gender differences in teacher computer acceptance. Journal of Technology and Teacher Education, 10(3), 365–382.
