

Publisher: Routledge. Informa Ltd, registered in England and Wales, Registered Number 1072954. Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK.

Educational Media International. Publication details, including instructions for authors and subscription information: http://www.informaworld.com/smpp/title~content=t713698864

Building a tool to help teachers analyse learners' interactions in a networked learning environment. O. Petropoulou (a), I. Altanis (a), S. Retalis (a), C. A. Nicolaou (b), C. Kannas (b), M. Vasiliadou (c) and Ireneos Pattis (c). (a) University of Piraeus, Piraeus, Greece; (b) Noesis Chemoinformatics, Nicosia, Cyprus; (c) INNOVADE LI Ltd, Nicosia, Cyprus

    Online publication date: 05 November 2010

To cite this Article: Petropoulou, O., Altanis, I., Retalis, S., Nicolaou, C. A., Kannas, C., Vasiliadou, M. and Pattis, Ireneos (2010) 'Building a tool to help teachers analyse learners' interactions in a networked learning environment', Educational Media International, 47: 3, 231–246

To link to this Article: DOI: 10.1080/09523987.2010.518815

URL: http://dx.doi.org/10.1080/09523987.2010.518815

    Full terms and conditions of use: http://www.informaworld.com/terms-and-conditions-of-access.pdf

This article may be used for research, teaching and private study purposes. Any substantial or systematic reproduction, re-distribution, re-selling, loan or sub-licensing, systematic supply or distribution in any form to anyone is expressly forbidden.

The publisher does not give any warranty express or implied or make any representation that the contents will be complete or accurate or up to date. The accuracy of any instructions, formulae and drug doses should be independently verified with primary sources. The publisher shall not be liable for any loss, actions, claims, proceedings, demand or costs or damages whatsoever or howsoever caused arising directly or indirectly in connection with or arising out of the use of this material.


    Educational Media International

Vol. 47, No. 3, September 2010, 231–246

ISSN 0952-3987 print / ISSN 1469-5790 online. © 2010 International Council for Educational Media. DOI: 10.1080/09523987.2010.518815. http://www.informaworld.com

Building a tool to help teachers analyse learners' interactions in a networked learning environment

O. Petropoulou (a), I. Altanis (a), S. Retalis (a)*, C.A. Nicolaou (b), C. Kannas (b), M. Vasiliadou (c) and Ireneos Pattis (c)

(a) University of Piraeus, Piraeus, Greece; (b) Noesis Chemoinformatics, Nicosia, Cyprus; (c) INNOVADE LI Ltd, Nicosia, Cyprus


(Received 12 March 2010; final version received 12 August 2010)

Educators participating in networked learning communities have very little support from integrated tools for evaluating students' learning activity flow and examining learners' online behaviour. There is a need for non-intrusive ways to monitor learners' progress in order to better follow their learning process and appraise the effectiveness of an online course. This paper presents a conceptual framework and an innovative tool, called LMSAnalytics, that allows teachers and evaluators to easily track learners' online behaviour, make judgments about learners' activity flow and gain better insight into the knowledge constructed and the skills acquired in a networked learning environment.

Building a tool to help teachers analyse learner interactions in a networked learning environment

Educators who participate in networked learning communities have very little support from integrated programs for evaluating students' learning activities and their online behaviour. It is necessary to develop non-intrusive and automated ways of monitoring learners' progress, so that their learning progress and the effectiveness of the online course can be followed more closely. This paper presents a conceptual framework and an innovative tool, LMS-Analytics, which allows teachers and evaluators to track learners' online behaviour and observe their activity flow, and thereby gain better insight into the knowledge and skills acquired in a networked learning environment.

Building an instrument to help teachers analyse interactions between learners in a networked learning environment

Educators who take part in the activities of networked learning communities have little help from integrated instruments for evaluating the flow of students' learning activities and examining learners' online behaviour. There is a real need for automated and non-invasive means of following learners' progress in order to better follow their learning process and evaluate the effectiveness of the online course. This article presents a conceptual framework and an innovative tool called LMSAnalytics that allows teachers and evaluators to easily track learners' online behaviour, to make judgments about these learners' activity flow and to gain a better view of the knowledge constructed and the skills acquired in a networked learning environment.

*Corresponding author: [email protected]

Building a tool to help teachers analyse interactions among students within a networked learning environment

Educators who participate in networked learning communities have little help owing to the lack of integrated tools that allow them to evaluate the flow of students' learning activities and examine their online behaviour. What is needed are automated and non-invasive systems for checking students' progress, monitoring their learning processes and evaluating the effectiveness of the online course. This article presents a conceptual framework and an innovative tool called LMS Analytics that offers teachers and evaluators the possibility of easily monitoring students' online behaviour, evaluating their activity flows and gaining a clearer view of the knowledge constructed and the competences acquired within a networked learning environment.

Keywords: learner interaction analysis; evaluation of learning process; networked learning environment

Evaluating learners' activities in a networked learning environment

According to modern pedagogical theories, learning occurs not only as a result of learners' direct participation in learning tasks, but also through legitimate peripheral participation in communities (Goodyear, 2005), in which implicit and explicit knowledge is acquired from the community (Brown & Duguid, 2000). In the era of networked learning, the network component plays an important role in the formation of learning communities, since it promotes and facilitates collaborative and cooperative connections: between one learner and other learners; between learners and tutors; and between learners and learning resources, so that learners and tutors can extend and develop their understanding and capabilities in ways that are important to them, and over which they have significant control (Steeples, Jones, & Goodyear, 2002).

Networked learning environments (NLEs) provide socially situated learner support through the active processes of dialogue, collaboration and shared knowledge construction that drive learning in social settings. Creating NLEs leads to an array of benefits, such as:

- opportunities for participants to share their knowledge and expertise;
- opportunities for participants to discuss, plan, reflect on and explore learning issues;
- increased inspiration, innovation and motivation amongst participants;
- increased social contact between individuals from differing backgrounds;
- a reduction in feelings of isolation (both geographically and emotionally);
- increased access to shared resources.

In an NLE, where individual and collective actions take place, educators face great difficulties in evaluating the broad spectrum of interactions among the interacting participants, such as learner–learner, learner–teacher, learner–content and learner–technology (Vrasidas, 2002). It becomes difficult and time consuming for educators to thoroughly capture, track and assess the various interactive learning activities performed by all learners. There is a need to design specific tools, based on well-grounded conceptual frameworks, for analysing the grid of all these interactions, since evaluation is a multifaceted and complex process for educators (Dimitracopoulou et al., 2006b; Marcos, Martinez, & Dimitriadis, 2005).

Our paper discusses the theoretical framework, and findings from pilot implementations, of an interaction analysis tool called LMSAnalytics. LMSAnalytics identifies specific indicators that can help educators analyse and evaluate the multiple dimensions of interaction that develop in an asynchronous NLE. The tool was built for the automatic analysis and visualization of data collected during the networked collaborative learning process. LMSAnalytics is expected to be useful for educators who need to assess the performance of both individuals and groups in an NLE.

One of the innovative features of LMSAnalytics is that it inter-exchanges data with the Moodle learning management system (LMS), the most popular open source application of this type. Thus, it can be used by teachers to analyse data about the asynchronous networked learning interactions that occur within Moodle. The analysed data can also be exported in appropriate formats so that it can be used as input to Microsoft Excel, the SPSS statistical package, or other tools such as the NetDraw software for further social network analysis and WEKA for advanced data mining. Another innovative feature of the LMSAnalytics tool is that it guides the teacher in performing specific analyses of the collected data according to the teaching strategy followed in his/her course. For example, if a teacher designs the learning activities using the Think–Pair–Share (TPS) learning strategy (Palloff & Pratt, 1999), LMSAnalytics will automatically create the most appropriate statistical tables and diagrams based on specific indicators that fit this strategy. In this way, a teacher can save time and resources and be guided on how to collect quantitative and qualitative data about the participation, interactions and collaboration among learners. These data can be fed into a rubric for assessing learners' performance.
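As a rough illustration of this strategy-guided behaviour, the sketch below maps a teaching strategy name to a set of indicator reports that might fit it. The mapping, the function name and the indicator codes assigned to Jigsaw are assumptions made for the example (the TPS codes follow the indicator list given later in the paper); the actual LMSAnalytics implementation is a PHP application and is not reproduced here.

```python
# Illustrative sketch only, not LMSAnalytics code: propose the indicator
# reports that fit the teaching strategy a course follows.

STRATEGY_REPORTS = {
    # Think-Pair-Share: codes follow the indicator list given later in the paper.
    "TPS": ["A3", "B1", "B2", "B5", "D1", "E3", "F3"],
    # Hypothetical entry; which indicators fit Jigsaw is our assumption.
    "Jigsaw": ["B1", "D1", "E3"],
}

def proposed_reports(strategy):
    """Return the indicator reports suggested for a given teaching strategy."""
    # Fall back to a basic participation count when the strategy is unknown.
    return STRATEGY_REPORTS.get(strategy, ["E3"])

print(proposed_reports("TPS"))  # ['A3', 'B1', 'B2', 'B5', 'D1', 'E3', 'F3']
```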

In this paper, we first discuss the current state of the art in networked learning analytics. We will also discuss the interaction analysis conceptual framework on which the LMSAnalytics tool is based. Then we will present LMSAnalytics via an example of its application in a learning scenario of collaborative problem solving according to the TPS strategy. Using this example, 28 teachers were asked to play the role of the evaluator using the LMSAnalytics tool and to evaluate its usability. The results from the evaluation case study will be presented. Finally, concluding remarks and future research plans will be outlined.

    Towards networked learning analytics

Networked learning is much more ambitious than previous approaches to using technology in education (Goodyear, 2005). The added value of networked technology is that it enables the enrichment of the learning paradigm in order to:

- support open, flexible and learner-centred patterns of study;
- provide new ways for learners to work collaboratively;


- facilitate various forms of interaction: learner–learner, learner–content, learner–instructor;
- promote authentic learning and the acquisition of higher-order thinking skills, problem-solving abilities, and the like.

The evaluation process, which is based on the analysis of students' interaction and online behaviour, is a difficult task in an NLE. Actually, it is an open research issue, which has attracted the attention of many research and development groups (Mazza & Dimitrova, 2005; Padilha, Almeida, & Alves, 2004; Ryberg & Larsen, 2006; TELL, 2005; Vrasidas & McIsaac, 2000). Bates and Hardy (2004) emphasized that information from student monitoring could be used as valuable input for evaluating the effectiveness and quality of e-learning materials and the instructional model.

The driving force for our research is the fact that the evaluators and teachers of a networked learning course have very little support from integrated tools to evaluate learners' activities and identify learners' online behaviour and interactions (Mazza & Botturi, 2007; Mazza, 2009). As a consequence, they are in need of non-intrusive and automated ways to monitor learners' progress in order to better understand their learning process and appraise the online course's effectiveness. In other words, they need educational data mining tools or networked learning analytics tools (Baker & Yacef, 2009).

Input for designing these services can benefit from research in the area of so-called web analytics tools (Bradley, 2007). These software tools were originally developed for the purpose of web site traffic analysis. Web analytics processes a variety of data and sources (mainly the web server log file and historical data of visits to the web server) in order to evaluate web site performance and popularity, visitors' behaviour and interaction patterns at both an individual and an aggregate level (Pierrakos, Paliouras, Papatheodorou, & Spyropoulos, 2003). Processing of the available information is performed using statistical analysis and data mining methods in order to extract knowledge in the form of patterns, associations and correlations from the raw data. However, Baker (in press) mentions that educational data mining methods and tools differ from web analytics methods, since issues such as student knowledge level and context play important roles in the analysis of educational data. Thus, it is widely believed to be appropriate to develop custom tools that employ both appropriate pre-processing of the learning management system's educational data and encoding into meaningful descriptors, and selected data mining methods targeting the discovery of educationally meaningful knowledge.

Networked learning analytics is a specific aspect of the educational data mining area, which can be defined as the process of studying and analysing in depth the students' behaviour within an NLE and extracting meaningful interaction patterns, i.e., patterns that concern learner–content interactions, learner–instructor interactions and learner–learner interactions, as well as identifying trends in students' online behaviours (Petropoulou, Lazakidou, Retalis, & Vrasidas, 2007). In order to analyse and interpret the data about students' interactions in an NLE, several research groups have developed coding schemes and analysis techniques that categorize interactions according to models of knowledge construction and skills acquisition, in an effort to perform interaction analysis more efficiently. Gunawardena, Lowe and Anderson (1997) developed a model and coding scheme for online interaction among peers with five phases of knowledge construction:


(1) sharing/comparing of information;
(2) discovery and exploration of dissonance or inconsistency among ideas, concepts, or statements;
(3) negotiation of meaning/co-construction of knowledge;
(4) testing and modification of proposed synthesis or co-construction; and
(5) agreement statement(s)/applications of newly constructed meaning.

Vrasidas (2002) developed a working typology of intentions driving interaction in online and blended learning environments. Examples of intentions include collaboration, discussion, evaluation, gaining status, providing support, sharing information and socializing. Shute and Glaser (1990) propose a technique that enables the evaluator to derive global learner differences on the basis of learner interaction measures. Their approach can be summarized as involving: (1) counting frequencies of actions, (2) categorizing them into meaningful units, and (3) making comparisons across groups. Gaßner, Jansen, Harrer, Herrmann, and Hoppe (2003) illustrate how log files can be captured, codified and analysed to provide statistics of activity patterns such as cooperation, turn taking, etc. in a synchronous CSCL environment. The MatchMaker TNG tool offers a framework for analysing activities that occur via a synchronous shared-desktop collaborative learning system. For example, when a student adds a node to a graph and another student connects this node to another one with an edge, this might indicate collaboration between these two students. An analysis method like the one that appears in Mühlenbrock (2004) can help evaluators identify collaborative activity patterns. Another type of interaction analysis is related to the calculation of the number of messages read, postings to a discussion board, file uploads, annotations to the uploaded files, etc. For example, the DIAS tool (Bratitsis & Dimitracopoulou, 2005) offers 64 indicators, which can be used for the examination of quantitative data about the interactions in an asynchronous collaborative NLE. Apart from quantitative measurements, analysis of participants' postings (content analysis) should subsequently be performed to reveal many of the behaviours associated with collaborative learning situations (Curtis & Lawson, 2001).
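To make this counting-and-categorizing style of quantitative analysis concrete, the sketch below tallies logged forum actions per learner, folds them into broader categories and compares the categories across groups, roughly following the three steps attributed to Shute and Glaser above. The event format, the names and the category scheme are toy assumptions, not data or code from any of the tools cited.

```python
from collections import Counter, defaultdict

# Toy log: (learner, group, action) tuples; the action vocabulary is hypothetical.
events = [
    ("maria", "g1", "post"), ("maria", "g1", "reply"), ("maria", "g1", "read"),
    ("nikos", "g1", "read"), ("nikos", "g1", "read"),
    ("elena", "g2", "post"), ("elena", "g2", "upload"), ("petros", "g2", "reply"),
]

# Step 1: count action frequencies per learner.
per_learner = defaultdict(Counter)
for learner, group, action in events:
    per_learner[learner][action] += 1

# Step 2: categorize actions into broader units (active contribution vs. passive access).
CATEGORY = {"post": "contribution", "reply": "contribution",
            "upload": "contribution", "read": "access"}

# Step 3: compare the categorized counts across groups.
per_group = defaultdict(Counter)
for learner, group, action in events:
    per_group[group][CATEGORY[action]] += 1

print(dict(per_learner))
print(dict(per_group))  # e.g. contribution/access totals for g1 vs. g2
```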

Nowadays, LMSs are being widely used by educational and training organizations. However, very few interaction analysis techniques and tools have been developed for them. LMSs allow authorized users (teachers and administrators) to view some data about the online actions performed by students. These data are stored in the tables of the LMS's database. However, these data are very poor and cannot easily be combined with interaction analysis techniques, as happens in the field of computer-supported collaborative learning (CSCL) environments. For example, in synchronous CSCL environments, interaction analysis can be performed using action-based approaches/frameworks such as the activity recognition approach (Barros & Verdejo, 1999; Mühlenbrock, 2004) and the OCAF framework (Avouris, Dimitracopoulou, & Komis, 2003), to name a few. These approaches/frameworks have been supported by specialized analysis tools like CoLAT that collect the actions of users in a collaborative learning environment and show different indexes of collaboration (Koutri, Avouris, & Daskalaki, 2004). The main advantage of these tools is that they are efficient in providing feedback to their users (learners, teachers or evaluators). These tools are very valuable, and their philosophy needs to be adopted by LMSs, since teachers need rich information that comes out of interaction analysis using sophisticated techniques combined with conceptual modelling approaches similar to those that have started to appear in the area of CSCL. An overview of interaction analysis techniques and tools utilized in CSCL can be found on the web site of the Kaleidoscope project (http://www.noe-kaleidoscope.org). However, these techniques are mainly focused on the use of CSCL tools for knowledge building and they do not holistically embrace the whole interaction process within an NLE.

Our thinking about networked learning analytics concerns a more holistic view of learners' interactions within an NLE, focusing not only on social and personal interactions among learners and tutors but also on the flow of learning activities that concern learners' interaction with online learning resources (Moore, 1989). With the proposed approach, we try to give good insight into the three design components of an NLE identified by Goodyear (2005): (1) the tasks set for students, which influence their actual learning activity; (2) the (social) organizational forms put in place for them, out of which they develop more or less convivial learning relationships; and (3) the digital resources, tools, artefacts, etc. that we make available to students, which are used by them to customise or fit out their individual learnplaces.

Thus, the purpose of our project was to develop and test the LMSAnalytics tool, which is based on a specific conceptual framework for discovering information concerning learners' navigational behaviour and extracting meaningful patterns that can be used for assessment purposes using data-driven analytical methods. Our overall aim is to make teachers better able to understand what learners are doing, alone or in groups, and to classify learners according to their flow of activities in order to facilitate the evaluation of the lesson taught by educators.

A conceptual framework for assessing learners' behaviour in an NLE

According to Dillenbourg (1999), the key to understanding collaborative learning is to gain an understanding of the wealth of interactions among the individuals. This is why various indicators and specific tools that can analyse the grid of all these interactions have recently been proposed. Interaction analysis indicators deal with: (1) the process of the activity (individual, group or community), (2) the interaction product, (3) the quality of collaboration, and (4) the formed social context (Dimitracopoulou et al., 2006a; Vrasidas & Glass, 2002). The associated interaction analysis tools either inform learners about their learning progress (for self-regulation purposes) or help instructors/researchers evaluate and assess the collaborative learning process and products.

Various techniques have appeared in the literature for evaluating the collaborative learning process and products. Several publications with overviews of such techniques can be found in the literature (e.g., Daradoumis, Martínez, & Xhafa, 2006; Dimitracopoulou et al., 2006b; Vrasidas & Glass, 2002). The evaluation of collaborative learning has to be performed at least at two levels, separating the process (or group functioning) from the product (or task performance) of collaboration (Collazos, Guerrero, Pino, & Ochoa, 2003; Daradoumis, Xhafa, & Marques, 2003; Häkkinen, Järvelä, & Mäkitalo, 2003; MacDonald, 2003).

Based on this trend, we have developed a multi-faceted framework to study learners' behaviour in an NLE by making use of descriptive statistics, social network analysis (SNA), and content and context analysis (through coding teaching and learning activities) as a way to find out 'what they are talking about' and 'why they are talking as they do'. These methods are being used to triangulate and contextualize our findings (De Laat, Lally, Lipponen, & Simons, 2006).

Our proposed conceptual framework consists of two axes that measure student performance in an NLE: (1) the quality of learning products and (2) the quality of the collaboration, related to the volume and quality of interactions in an NLE. The proposed framework tries to analyse interaction holistically, thus covering the four types of interaction: learner–content, instructor–learner, learner–technology and learner–learner (Vrasidas & Glass, 2002).

The first axis concerns all deliverables of individual or group action (e.g., learners' assignments). Both quantitative and qualitative indicators about the quality of learning products should be accounted for, such as:

- grading of learners' ongoing and final learning products (e.g., final reports, tests, exercises, quizzes);
- individual and group overall performance in specific tasks (e.g., a group's average score);
- number of steps performed in a multi-step exercise (e.g., number of correct, wrong, or incomplete steps);
- ratio of correct to incorrect steps per session, correlated with task difficulty.

The second axis refers to the necessity of specifying the effects of particular categories of interactions in an NLE (Dillenbourg, 1999) on the accomplishment of learning products. These interactions refer to the grid of interactions developed between peers, learner–tutor and learner–content. Thus, we not only measure what students deliver in an NLE but also how they produced their deliverables. We propose that this entire spectrum of interactions should be captured and analysed accordingly. For example, for the learner–learner interactions, we propose evaluation of the following elements:

- number and nature of contributions (e.g., questions, additions, replies, social remarks) to the task (per learner);
- learner behaviour compared with that of other group members;
- direction of information flow (different kinds of communication among participants);
- total number of follow-up postings.

Concerning the learner–content interactions, the following indicators could be measured:

- amount of time a learner spends with the system (per session);
- percentage of available material read;
- percentage of available exercises tackled;
- amount of time spent per concept/skill/method/competency;
- sequential learning paths per session (e.g., theory, example, exercise).

Consequently, a set of available indicators has been identified which can be used or reused by teachers in order to analyse students' interactions in an NLE, or by appropriately designed software tools to discover useful knowledge. These indicators have been integrated into the LMSAnalytics tool, which offers information visualization of the data gathered from the Moodle LMS for each indicator. In addition, special care has been taken to implement the interoperation of LMSAnalytics with a set of applications that include social network analysis and data mining.
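As a minimal sketch of how two of the learner–content indicators listed above (time spent per session and percentage of available material read) could be computed, consider the toy access log below. The row format, field names and resource list are assumptions for illustration; they do not reflect LMSAnalytics' internal schema.

```python
from datetime import datetime

# Hypothetical access-log rows: (learner, resource, timestamp) for one session.
log = [
    ("maria", "theory_1",   datetime(2010, 3, 1, 10, 0)),
    ("maria", "example_1",  datetime(2010, 3, 1, 10, 12)),
    ("maria", "exercise_1", datetime(2010, 3, 1, 10, 40)),
]
AVAILABLE_MATERIAL = {"theory_1", "example_1", "exercise_1", "theory_2", "exercise_2"}

def session_minutes(rows):
    """Time between the first and last access of the session, in minutes."""
    times = [t for _, _, t in rows]
    return (max(times) - min(times)).total_seconds() / 60

def material_read_pct(rows):
    """Share of the available material opened at least once."""
    seen = {resource for _, resource, _ in rows}
    return 100 * len(seen & AVAILABLE_MATERIAL) / len(AVAILABLE_MATERIAL)

print(session_minutes(log))    # 40.0 minutes
print(material_read_pct(log))  # 60.0 percent
```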


    Data mining in LMSAnalytics

Data mining enables the use of interpretative methods on the data of interest, aiming to provide insights into the data, and of predictive methods, which construct models for estimating future performance based on past experience (Han & Kamber, 2006). The inclusion of data mining technology in LMSAnalytics takes place primarily through the interoperation of the tool with the WEKA data mining application. Users of LMSAnalytics have the option to export a selection of the indicators calculated by the tool in a format appropriate as input to WEKA. Following that, WEKA can be used to import the dataset of interest and process it using any of the data mining techniques available in it. The results of the analysis can then be evaluated through the WEKA interface or through export to text files and manual inspection. The current implementation of LMSAnalytics has placed emphasis on the preparation and export of datasets suitable for two types of data mining analysis available in WEKA: clustering and classification.
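To illustrate what such an export might look like, the sketch below writes a small table of per-student indicators to WEKA's plain-text ARFF format. The indicator names, the values and the output file name are made up for the example; the attribute set actually exported by LMSAnalytics is not reproduced here.

```python
# Write a per-student indicator table as an ARFF file that WEKA can import.
# Indicator names and values are illustrative only.
students = [
    {"name": "maria", "posts": 12, "replies": 7, "time_online_h": 5.5, "grade": "high"},
    {"name": "nikos", "posts": 3,  "replies": 1, "time_online_h": 1.2, "grade": "low"},
]

with open("indicators.arff", "w") as f:
    f.write("@relation student_indicators\n\n")
    f.write("@attribute posts numeric\n")
    f.write("@attribute replies numeric\n")
    f.write("@attribute time_online_h numeric\n")
    f.write("@attribute grade {high,low}\n\n")
    f.write("@data\n")
    for s in students:
        f.write(f"{s['posts']},{s['replies']},{s['time_online_h']},{s['grade']}\n")
```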

Clustering refers to a set of techniques that aim to identify natural groups in a dataset based on the similarities of the patterns the dataset contains. The method takes as input a set of patterns (in our case a set of students), each represented by a set of descriptors (in our case a set of indicators), typically arranged in tabular form. The method produces a set of groups, each of which contains patterns with similarities at the descriptor level. Data analysts, and users in general, can use the results of clustering to understand the nature of the data and the overall relations among patterns. WEKA offers several clustering methods. LMSAnalytics has been tested successfully with the simple K-Means method (Han & Kamber, 2006).
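The paper relies on WEKA's simple K-Means for this step; as an illustration of the same idea outside WEKA, the sketch below clusters a handful of made-up per-student indicator vectors with scikit-learn. The input shape (one indicator vector per student) and the output (a group label per student) are the point; the numbers are toy values, not study data.

```python
from sklearn.cluster import KMeans

# One row per student: [posts, replies, hours online] -- illustrative indicator vectors.
X = [
    [12, 7, 5.5],
    [10, 6, 4.8],
    [ 3, 1, 1.2],
    [ 2, 0, 0.9],
]

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)  # e.g. [1 1 0 0]: two "natural groups" of students
```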

Classification methods aim at discovering features of the examined patterns that discriminate between distinct classes of patterns. A second goal of classification is to formalize rules useful for predicting the class of a pattern based on its description. In the LMSAnalytics case, this amounts to discovering indicators/descriptors of students' online behaviour that are crucial in discriminating between, for example, students with high grades and those with low grades, and to using those descriptors to build predictive models for the likely performance of a student based on her online behaviour. LMSAnalytics has been tested using decision tree classification methods from WEKA. Decision trees, like all classification methods, need to be supplied with a set of patterns represented by descriptors, as well as a special descriptor defining the class of each pattern (e.g., grade). Decision trees are then used to divide the patterns of the entire dataset into exactly two groups according to whether the patterns have a particular 'best' descriptor in common.

The best descriptor is the descriptor that results in the highest possible ratio of patterns in the same class between those patterns containing the descriptor and those not containing it. The method continues iteratively with respect to each subdivided group, dividing each group into two groups based on the next best descriptor selected from the group of descriptors. The result of this process is a tree structure in which terminal nodes contain a majority of patterns in one of the classes. Tracing the lineage that defines each terminal node can reveal descriptors that may be related to an increased or decreased likelihood of the presence of a specific class of patterns. Moreover, new patterns can be filtered through the tree structure generated by a decision tree, and a prediction for the class of each filtered pattern can be made by simple examination of the characteristics of the tree nodes in which it is placed. In an educational setting, teachers could utilize classification methods to develop models for specific courses. These models could reveal indicators that contribute to increased or decreased performance and thus support educators in redesigning their lessons to accommodate this finding. Similarly, the models could be used during the course period to predict the likely performance of students based on their online behaviour. Teachers could use this information to assess overall class progress and take appropriate actions to support students predicted to have low performance.
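The workflow just described can be illustrated with scikit-learn in place of WEKA: fit a decision tree on indicator vectors labelled with a grade class, inspect which indicators the tree splits on, and predict the class of a new student. The indicator values and class labels below are toy data, not results from the study.

```python
from sklearn.tree import DecisionTreeClassifier, export_text

# Indicator vectors per student (illustrative) and their grade class.
X = [[12, 7, 5.5], [10, 6, 4.8], [3, 1, 1.2], [2, 0, 0.9]]
y = ["high", "high", "low", "low"]
feature_names = ["posts", "replies", "hours_online"]

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)

# Which descriptors does the tree split on? (Roughly the "best descriptor" idea above.)
print(export_text(tree, feature_names=feature_names))

# Predict the likely class of a new student from her online behaviour indicators.
print(tree.predict([[4, 2, 1.5]]))  # e.g. ['low']
```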

    The LMSAnalytics tool

LMSAnalytics is an interaction analysis tool for the automated collection, analysis and visualization of data concerning the behaviour of participants in an asynchronous NLE based on the Moodle LMS. The tool has been developed on the basis of the aforementioned measurable analysis indicators. Its basic operations are portrayed in Figure 1.

Figure 1. Use case diagram of LMSAnalytics.

LMSAnalytics is an open source tool developed using PHP and MySQL. It interoperates with the Moodle LMS. Moodle stores data about the students' interactions in specific tables of its database server. LMSAnalytics connects to that database and retrieves the necessary data from those tables (a teacher may request data about a specific course and/or a specific period). The data retrieved from Moodle is stored in a relational database for analysis. The results of the analysis are shown in graphs and/or tables. Data can also be exported in a suitably coded form, so that it can be processed further with the help of more specialized tools such as the NetDraw software for social network analysis, SPSS for more in-depth statistical analysis, or WEKA for additional data mining. Figure 2 contains a high-level architectural diagram of the LMSAnalytics tool. This is the only stand-alone interaction analysis tool that inter-exchanges data with the Moodle LMS. The only broadly similar tool is the GISMO graphical interactive student monitoring tool (Mazza & Botturi, 2007), which also interoperates with Moodle but acts as a Moodle block rather than as a separate application.


Figure 2. A high-level architectural diagram of the LMSAnalytics tool.
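As a rough sketch of the retrieval step (in Python rather than the tool's PHP), the query below pulls log rows for one course and time window from a Moodle database. The 'mdl_log' table and its columns follow the Moodle 1.x schema as we understand it, and the connection details, course id and timestamps are placeholders; treat all of them as assumptions rather than LMSAnalytics internals.

```python
import mysql.connector  # assumes the MySQL Connector/Python package is installed

# Placeholder credentials; a real deployment would read these from configuration.
conn = mysql.connector.connect(host="localhost", user="reader",
                               password="secret", database="moodle")
cur = conn.cursor()

# Log rows for one course within a teacher-chosen period (Moodle stores UNIX timestamps).
cur.execute(
    "SELECT userid, module, action, time FROM mdl_log "
    "WHERE course = %s AND time BETWEEN %s AND %s",
    (7, 1267401600, 1270080000),  # course id and period are illustrative values
)
rows = cur.fetchall()
cur.close()
conn.close()
print(len(rows), "log rows retrieved for analysis")
```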

Moreover, if the teacher has structured the learning tasks in the NLE according to a pedagogical strategy such as Jigsaw, TPS, etc. (Palloff & Pratt, 1999), LMSAnalytics can propose to him/her a series of diagrams and tables that show the results of an analysis of the indicators that best fit the strategy followed. An example of this feature is shown in the next section.

Example of LMSAnalytics utilization – the case of the TPS strategy

According to the TPS strategy, the teacher gives the students a problem/question. As shown in Figure 3, each student first has to reflect upon the given problem and submit his/her answers to a forum (Think phase). Of course, questions and remarks about the problem can be exchanged among students via the forum during the problem-solving process. Students often share resources that could help their peers find the solution to the problem. Having thought about the problem and reported their solutions (first deliverable), the students form groups (Pair phase). During this phase, the members of each group exchange their deliverables, give explanations and negotiate their thoughts in order to jointly create a new deliverable, which is an elaborated version of the problem solution. Finally, all the deliverables are shared (Share phase) so that the learners can peer-review them and ask for clarifications, explanations and so on. Normally, the Share phase ends with an electronic vote on the given solutions. The TPS strategy encourages students' active participation, collaboration, investigation of a given problem from various angles, critical thinking and the group attainment of knowledge.

Figure 3. Graphical representation of the Think–Pair–Share (TPS) strategy.

LMSAnalytics can help the teacher perform the interaction analysis for this specific collaborative strategy. More specifically, the tool can produce reports on indicators such as:

- A3 – Actors' degree centrality (SNA) (a small computation sketch follows this list);
- B1 – Work Amount (quantification of the amount of work, message dimension per user);
- B2 – Argumentation (measure of the initiative work that has been done in the team – message annotation);
- B5 – Collaboration (interaction-based message characterization);
- D1 – Average Number of Contributions (participation percentage per team in a certain course and team argumentation in a certain time period);
- E3 – Participation Count (number of messages a user has posted in a certain course and period);
- F3 – Number of Messages per Participant (number of messages a user has posted per forum in a certain course and period).
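For the A3 indicator mentioned above, the sketch below shows how degree centrality could be computed from 'who replied to whom' pairs using the networkx library. The reply pairs and the choice of an undirected graph are illustrative assumptions, not the tool's actual SNA routine.

```python
import networkx as nx

# Hypothetical "who replied to whom" pairs extracted from the TPS forums.
replies = [("maria", "nikos"), ("nikos", "maria"), ("elena", "maria"),
           ("petros", "elena"), ("maria", "petros")]

G = nx.Graph()                  # undirected graph of forum participants
G.add_edges_from(replies)

# A3: degree centrality of each actor in the interaction network.
print(nx.degree_centrality(G))  # e.g. maria has the highest centrality here
```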

Moreover, the results of the analysis of these indicators can be shown per phase. For example:

- Think phase: the tool produces statistical tables and bar charts for the B1 indicators (case a). The teacher can see at a glance the total number of messages sent per learner and the total time he/she spent on this activity (Figure 4).
- Pair phase: the tool produces statistical tables and diagrams for indicators B2 and D1, which concern the degree and quality of students' participation in the same group, the type and quality of collaboration and communication among groups, as well as the total time that students spent solving a given problem. For example, LMSAnalytics identifies the most active student of this phase and the number/type of messages that this student sent (Figure 5).
- Share phase: the tool produces graphical representations for indicator B1, as in the Think phase, per student and for the forum of the Share phase.

Figure 4. Total number of messages per learner.

Figure 5. Semantic annotation of messages per learner.

    LMSAnalytics evaluation

The usability of the LMSAnalytics tool has been evaluated very positively. More specifically, 28 teachers from different schools who have a strong interest in the use of networked technologies in their schools were asked to use the LMSAnalytics tool and answer a questionnaire. The teachers were given a learning scenario in which six students learn about nuclear power using the Moodle LMS and decide whether they are in favour of or against it. At each phase the teacher gave the students a set of online resources and posed a set of questions that the students had to answer, individually at first (Think phase) and then in groups of two (Pair phase). During the last phase (Share phase), the students voted on the usefulness or not of nuclear power. During the Think phase, students used a common asynchronous web forum for exchanging resources (mostly links) and discussing the given set of questions. During the Pair phase, each group used an asynchronous web forum in order to share their deliverables and discuss their answers so as to reach a consensus, which was portrayed in a joint report. Thus, each teacher had to analyse the quality of the given deliverables as well as the data from the discussions that occurred during this learning scenario. Each teacher was given an assessment rubric which they had to complete in order to grade each student. A rubric is an authentic assessment tool, which acts as a scoring guide that seeks to evaluate a student's performance based on the sum of a full range of criteria/indicators, thus giving a final composite numerical score.

Each teacher had to use LMSAnalytics in order to check the reports per indicator and fill in the assessment rubric. It is important to note that the final grades that the teachers gave to the various students did not differ much (maximum variance 0.6). After having used the LMSAnalytics tool, the teachers expressed their opinions about it using a typical usability evaluation questionnaire. The teachers highly appreciated the tool (over 80%), which was considered highly efficient and effective. They also liked the following aspects of the tool:

- content: the visualization and the reporting of information produced by the tool;
- structure: the organization of the content and functions;
- appearance: the aesthetics of the graphical user interface;
- learnability: the ease of learning to use the tool as well as of using it.

    Concluding remarks

A teacher who organizes collaborative learning tasks for students needs frameworks and tools that enable him/her to evaluate their behaviour quickly and accurately, as well as to offer timely scaffolding when needed. The LMSAnalytics tool, which has been presented in this paper, tries to address this need. It also goes one step further by guiding the teacher in analysing and visualizing data on the students' behaviour. An innovative aspect of the tool is that it interoperates with the Moodle LMS. Thus, although it contains fewer indicators than the DIAS tool, it is very valuable, since it effectively and efficiently aids the teacher in assessing the students' learning performance. The LMSAnalytics tool is comparable to the GISMO tool with respect to the features offered (though not the technical details).

We plan to extend the functionality of LMSAnalytics by building a web service that makes it interoperable with the WEKA data mining tool. In this way, the tool could:

- exploit sequential patterns in the learning activity flow by drawing the exact paths followed by each student, individually or in groups (a toy path-extraction sketch follows this list);
- show deviations of individual students from the typical learning activity flow performed by their peers;
- perform path analysis through the creation of more complex queries that reveal interesting correlations and association rules among students' learning paths.
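As a toy illustration of the first planned extension, the sketch below reconstructs the exact activity path followed by each student from a time-ordered event stream. The event format is assumed for the example; real sequential-pattern mining over such paths would require considerably more than this.

```python
from collections import defaultdict

# Toy event stream: (timestamp, student, activity), already sorted by time.
events = [
    (1, "maria", "theory"), (2, "maria", "example"), (3, "nikos", "theory"),
    (4, "maria", "exercise"), (5, "nikos", "exercise"),
]

# Draw the exact activity path followed by each student.
paths = defaultdict(list)
for _, student, activity in events:
    paths[student].append(activity)

print(dict(paths))  # {'maria': ['theory', 'example', 'exercise'], 'nikos': ['theory', 'exercise']}
```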


    Acknowledgements

This work has been partially supported by the eLAT project: eLearning Analytics Tool: Analyzing Student Behavior in Learning Management Systems, supported by the Cyprus Research Promotion Foundation and partially funded by the European Structural Funds and the Republic of Cyprus.

    References

Avouris, N.M., Dimitracopoulou, A., & Komis, V. (2003). On analysis of collaborative problem solving: An object-oriented approach. Computers in Human Behavior, 19(2), 147–167.

Baker, R.S.J.d. (in press). Data mining for education. In B. McGaw, P. Peterson, & E. Baker (Eds.), International encyclopedia of education (3rd ed.). Oxford: Elsevier. Retrieved January 2010 from http://www.cs.cmu.edu/rsbaker/Encyclopedia%20Chapter%20Draft%20v10%20-fw.pdf

Baker, R.S., & Yacef, K. (2009). The state of educational data mining in 2009: A review and future visions. Journal of Educational Data Mining (JEDM), 1, 3–17.

Barros, B., & Verdejo, M.F. (1999). An approach to analyse collaboration when shared structured workspaces are used for carrying out group learning processes. In S.P. Lajoie & M. Vivet (Eds.), Artificial intelligence in education: Open learning environments (pp. 449–456). Amsterdam: IOS Press.

Bates, S.P., & Hardy, J. (2004). An evaluation of an e-learning strategy: Watching the e-learners learn. In D.S. Preston & T.H. Nguyen (Eds.), Virtuality and education: A reader (pp. 77–81). Oxford: Inter-Disciplinary Press.

Bradley, N. (2007). Marketing research: Tools and techniques. Oxford: Oxford University Press.

Bratitsis, T., & Dimitrakopoulou, A. (2005). Data recording and usage interaction analysis in asynchronous discussions: The DIAS system. Proceedings of the 12th International Conference on Artificial Intelligence in Education (AIED), Workshop on Usage Analysis in Learning Systems, Amsterdam.

Brown, J.S., & Duguid, P. (2000). The social life of information. Boston, MA: Harvard Business School Press.

Collazos, C., Guerrero, L., Pino, J., & Ochoa, S. (2003). Collaborative scenarios to promote positive interdependence among group members. In J. Favela & D. Decouchant (Eds.), Proceedings of the Ninth International Workshop on Groupware (CRIWG 2003), Grenoble-Autrans, France, LNCS 2806 (pp. 247–260). Berlin: Springer.

Curtis, D.D., & Lawson, M.L. (2001). Exploring collaborative online learning. Journal of Asynchronous Learning Networks, 5(1), 21–34.

Daradoumis, T., Xhafa, F., & Marques, J.M. (2003). Exploring interaction behaviour and performance of online collaborative learning teams. In J. Favela & D. Decouchant (Eds.), Proceedings of the Ninth International Workshop on Groupware (CRIWG 2003), Grenoble-Autrans, France, LNCS 2806 (pp. 126–134). Berlin: Springer.

Daradoumis, T., Martínez, A., & Xhafa, F. (2006). A layered framework for evaluating on-line collaborative learning interactions. International Journal of Man-Machine Studies, 64(7), 622–635.

De Laat, M.F., Lally, V., Lipponen, L., & Simons, P.R.J. (2006). Analysing student engagement with learning and tutoring activities in networked learning communities: A multi-method approach. International Journal of Web-based Communities, 2(4).

Dillenbourg, P. (1999). What do you mean by collaborative learning? In P. Dillenbourg (Ed.), Collaborative learning: Cognitive and computational approaches (pp. 1–20). Advances in Learning and Instruction series. Oxford: Pergamon Elsevier.

Dimitrakopoulou, A., Petrou, A., Martinez, A., Marcos, J., Kollias, V., Jermann, P., Harrer, A., Dimitriadis, Y., & Bollen, L. (2006a). State of the art of interaction analysis for Metacognitive Support & Diagnosis. D31.1.1 deliverable of the EU Sixth Framework programme priority 2, Information society technology, Network of Excellence Kaleidoscope project (contract NoE IST-507838).

Dimitracopoulou, A., Vosniadou, S., Gregoriadou, M., Avouris, N., Kollias, V., Gogoulou, L., Fessakis, G., & Bratitsis, Th. (2006b). The field of computer-based interaction analysis for the support of participants' regulation in social technology based learning environments: State of the art and perspectives. In D. Psillos & V. Dagdidelis (Eds.), 5th Hellenic Congress with International Participation: Information and Communication Technologies in Education (pp. 997–1000). HICTE, Thessaloniki, October 2006.

Gaßner, K., Jansen, M., Harrer, A., Herrmann, K., & Hoppe, U. (2003). Analysis methods for collaborative models and activities. In U. Hoppe (Ed.), Computer support for collaborative learning: Designing for change in networked learning environments, CSCL 2003 Congress, 14–18 June 2003, Bergen, Norway.

Goodyear, P. (2005). Educational design and networked learning: Patterns, pattern languages and design practice. Australasian Journal of Educational Technology, 21(1), 82–101.

Gunawardena, C.N., Lowe, C.A., & Anderson, T. (1997). Analysis of a global online debate and the development of an interaction analysis model for examining social construction of knowledge in computer conferencing. Journal of Educational Computing Research, 17(4), 397–431.

Häkkinen, P., Järvelä, S., & Mäkitalo, K. (2003). Sharing perspectives in virtual interaction: Review of methods of analysis. In B. Wasson, S. Ludvigsen, & U. Hoppe (Eds.), Designing for Change in Networked Learning, Proceedings of the International Conference on Computer Support for Collaborative Learning (pp. 395–404). Dordrecht: Kluwer.

Han, J., & Kamber, M. (2006). Data mining: Concepts and techniques (2nd ed.). San Diego, CA: Morgan Kaufmann.

Koutri, M., Avouris, N., & Daskalaki, S. (2004). A survey on web usage mining techniques for web-based adaptive hypermedia systems. In S.Y. Chen & G.D. Magoulas (Eds.), Adaptable and adaptive hypermedia systems. Hershey, PA: Idea Publishing Inc.

MacDonald, J. (2003). Assessing online collaborative learning: Process and product. International Journal of Computers and Education, 40, 377–391.

Marcos, A., Martinez, A., & Dimitriadis, Y. (2005). Towards adaptable interaction analysis in CSCL. Proceedings of the 12th International Conference on Artificial Intelligence in Education, Amsterdam.

Mazza, R. (2009). Introduction to information visualization. London: Springer-Verlag.

Mazza, R., & Botturi, L. (2007). Monitoring an online course with the GISMO tool: A case study. Journal of Interactive Learning Research, 18(2), 251–265.

Mazza, R., & Dimitrova, V. (2005). Generation of graphical representations of student tracking data in course management systems. In IV '05: Proceedings of the Ninth International Conference on Information Visualisation (IV'05) (pp. 253–258). Washington, DC.

Moore, G. (1989). Three types of interaction. The American Journal of Distance Education, 3(2), 1–6.

Mühlenbrock, M. (2004). Shared workspaces: Analyzing user activity and group interaction. In H.U. Hoppe, M. Ikeda, H. Ogata, & F. Hesse (Eds.), New technologies for collaborative learning. Computer-Supported Collaborative Learning Series. Dordrecht: Kluwer.

Padilha, T.P.P., Almeida, L.M., & Alves, J.B.M. (2004). Mining techniques for models of collaborative learning. In J. Mostow & P. Tedesco (Eds.), Designing Computational Models of Collaborative Learning Interaction, Workshop at ITS 2004 (pp. 89–94). Maceió, Brazil.

Palloff, R., & Pratt, K. (1999). Building learning communities in cyberspace: Effective strategies for the online classroom. San Francisco, CA: Jossey-Bass.

Petropoulou, O., Lazakidou, G., Retalis, S., & Vrasidas, C. (2007). Analysing interaction behaviour in network supported collaborative learning environments: A holistic approach. International Journal of Knowledge and Learning, 3(4/5), 450–464.

Pierrakos, D., Paliouras, G., Papatheodorou, C., & Spyropoulos, C.D. (2003). Web usage mining as a tool for personalization: A survey. User Modelling and User-Adapted Interaction, 13(4), 311–372.

Ryberg, T., & Larsen, M. (2006). Networked identities: Understanding different types of social organisation and movements between strong and weak ties in networked environments. In Proceedings of the Fifth International Conference on Networked Learning, 10–12 April 2006, Lancaster.

Shute, V.J., & Glaser, R. (1990). A large-scale evaluation of an intelligent discovery world: Smithtown. Interactive Learning Environments, 1, 51–77.

Steeples, C., Jones, C., & Goodyear, P. (2002). Beyond e-learning: A future for networked learning. In C. Steeples & C. Jones (Eds.), Networked learning: Perspectives and issues (pp. 323–342). London: Springer.


TELL. (2005). Introducing a framework for the evaluation of network supported collaborative learning. TELL Project, Deliverable of WorkPackage 1. Retrieved January 2010 from the TELL project website, http://cosy.ted.unipi.gr/tell/

Vrasidas, C. (2002). A working typology of intentions driving face-to-face and online interaction in a graduate teacher education course. Journal of Technology and Teacher Education, 10(2), 273–296.

Vrasidas, C., & McIsaac, M. (2000). Principles of pedagogy and evaluation of Web-based learning. Educational Media International, 37(2), 105–111.

Vrasidas, C., & Glass, C.V. (Eds.) (2002). Distance education and distributed learning. Charlotte, NC: Information Age Publishing.