
Design method for the evaluation and quality assurance of an online learning environment.

Marianne Checkley, ESAI 2015

Background and Context

• Personalised online learning programme for young early school leavers aged 13-16 years.

• 52 students
• Blended (centre-based) and at-home learners
• 10 online teachers

• Maximising multiple data streams in examining the online educational experience

• Creating a narrative approach

Presentation Focus

Instruments developed to evaluate, benchmark, and provide quality-assurance processes for online learning environments are primarily used at Third Level (Oncu & Cakir, 2011).

Research from iNACOL designed to measure quality in K-12 online and blended learning environments divides strategies into outcomes and processes (Patrick et al., 2012).

Outcome measures include proficiency, individual student growth, graduation rate, college and career readiness, and closing the achievement gap.

For quality assurance of processes, the following headings are assessed against comprehensive criteria: Content, Instructional Design, Student Assessment, Technology, Course Evaluation and Support.

Developing an evaluation framework for the educational experience on iScoil:

• Research suggests incorporating the data streams available from a virtual learning environment (Kennedy & Soifer, 2013) with a student-centred, qualitative approach to participation from all stakeholders, in order to consider outcomes in a contextualised way (Niehaus, 2014).

• Deepwell (2007) recognises the capacity of Stake’s countenance approach (Stake, 1967) to organise large amounts of diverse data, making it well suited to the analysis of online learning programmes.

The approach comprises three interdependent phases: Antecedent, Transaction and Outcome.

Antecedent (conditions existing prior to instruction that may relate to outcomes):
• What is the profile of iScoil students?
• What is the theory underpinning curriculum design?

Transaction (the process of instruction):
• How do the learners engage with the online platform?
• What is the learning experience?
• What is the teaching experience?
• How does student profile impact engagement?
• How is the curriculum delivered?

Outcomes (the effects of the programme):
• What certification is achieved?
• What are the progression routes?
• How does student profile impact outcomes?
• Are there unintended outcomes?
• Is there potential for improvement?

Table 1: Research Questions within a countenance framework

Figure 1: Community of Inquiry Model (Swan et al, 2009)

(Venn diagram: the educational experience at the intersection of Cognitive Presence, Teaching Presence and Social Presence, with the overlaps labelled Supporting Discourse, Selecting Content and Setting Climate.)

Features for Data Collection

Figure 2: Cognitive Apprenticeship Model (Collins, Brown & Newman, 1989)

Antecedent (conditions existing prior to instruction that may relate to outcomes):
• Student profile: gender, reason for referral, location of learning
• Design of curriculum

Transaction (the process of instruction):
• Learner engagement patterns on the VLE
• The learning experience
• The teaching experience
• Learner-Teacher-Content interaction
• Learner-Content interaction

Outcomes (the effects of the programme):
• Certification achieved according to student profile
• Progression routes according to student profile

Table 2: Features for Data Collection within a countenance framework

Research Question

• How do the students engage with the online platform?

• What is the learning experience?

• What is the teaching experience?

• How is the curriculum delivered?

• How does student profile impact on engagement and learning?

Feature

• Learner engagement patterns on VLE

• The teaching experience

• The learning experience

• Learner/Teacher/Content Interaction

• Learner/Content Interaction

Method

• VLE Log Analysis
• Focus Group
• Semi-Structured Interviews
• Authoring Tool: Open GLM
• Outcomes Matrix

How the framework works in practice: the Transaction Phase

Figure 3: Corresponding Research Questions, Features and Methods across a countenance approach for the Transaction Phase

Data Collection …

• Document Analysis
• Learning Designer
• VLE Log Analysis (a minimal sketch follows this list)
• Open GLM
• Focus Groups
• Semi-Structured Interviews
• Outcomes Matrix
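To make the VLE log-analysis step concrete, here is a minimal sketch. It assumes a hypothetical CSV export of VLE activity logs with columns student_id, timestamp (ISO 8601) and activity_type, which is not iScoil's or any particular VLE's actual log format, and it summarises each learner's engagement as event counts and active days:

```python
# Minimal sketch of the VLE log-analysis step (illustrative only).
# Assumes a hypothetical CSV export with columns:
# student_id, timestamp (ISO 8601), activity_type.
import csv
from collections import Counter, defaultdict
from datetime import datetime


def engagement_patterns(log_path):
    """Summarise each learner's activity counts and active days from a VLE log export."""
    counts = defaultdict(Counter)   # student_id -> activity_type -> count
    active_days = defaultdict(set)  # student_id -> dates with at least one event
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            student = row["student_id"]
            counts[student][row["activity_type"]] += 1
            active_days[student].add(datetime.fromisoformat(row["timestamp"]).date())
    return {
        student: {
            "total_events": sum(c.values()),
            "active_days": len(active_days[student]),
            "by_activity": dict(c),
        }
        for student, c in counts.items()
    }


if __name__ == "__main__":
    for student, summary in engagement_patterns("vle_log_export.csv").items():
        print(student, summary)
```

Summaries of this kind can then be read alongside the qualitative data when profiling learner engagement patterns.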

Reflective use of Authoring Tools

A relevant example is a study in which multiple design and authoring tools were used to explore a single lesson (Prieto et al., 2013). The study ultimately highlighted the complexity of the learning design process, as different tools and processes suited different teachers depending on their pedagogical aims or their institutional and technological contexts.

A number of computational tools that guide the thinking of education practitioners in pedagogical design have emerged from learning design research over the last ten years (Masterman, Walker & Bower, 2013).

While these design and authoring tools were developed to facilitate the development, adaptation and sharing of teachers’ pedagogical ideas, they are also, according to Laurillard (2012), useful as tools for reflection on practice.

Figure 4: Authoring Tool, The Learning Designer (Laurillard, 2011)

Antecedent: Student Profile

(Chart: Gender and Reason for Referral. Number of students by reason for referral (Anxiety, Behavioural, School Refusal, Disaffection, Illness), grouped by gender.)

(Chart: Gender, Location and Reason for Referral. Number of students by reason for referral, grouped by Blended Male, Blended Female, At-Home Female and At-Home Male.)

Antecedent: Curriculum Design

(Chart: proportions of learning activity types in the unit design. Read/Watch/Listen (RWL) 4%, Produce 18%, Practice 42%, Inquire 36%.)

Results from reflective use of the Learning Designer ground the design in constructivist learning theory, with an emphasis on an active model of learning.

However, the absence of discussion-based activities limits the opportunity for interaction and the social construction of knowledge.

Figure 5: Results of Learning Designer analysis of Communications Unit 2 Writing
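As a minimal sketch of how proportions like those in Figure 5 can be tallied, the following assumes a hypothetical list of (activity type, minutes) pairs rather than the Learning Designer's actual export format:

```python
# Illustrative tally of learning-activity proportions (hypothetical data,
# not the Communications Unit 2 design or the Learning Designer export format).
from collections import Counter

activities = [
    ("Practice", 30), ("Inquire", 25), ("Produce", 15),
    ("Practice", 20), ("Read/Watch/Listen", 5),
]


def activity_proportions(items):
    """Percentage share of total learning time for each activity type."""
    minutes = Counter()
    for activity_type, mins in items:
        minutes[activity_type] += mins
    total = sum(minutes.values())
    return {a: round(100 * m / total) for a, m in minutes.items()}


print(activity_proportions(activities))
```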

Transaction Phase

Results of thematic analysis of a variety of data sources:
• Feedback Loops
• Positive Affirmation
• Personalised
• Teaching Teams
• Constructivism By Proxy
• Modular Assessment
• Relevant Content

Open GLM (University of Vienna)

Outcomes

(Chart: Certification. Number of modules achieved (1 to 6) by location of learning: At-Home (N=19) and Blended (N=12).)

(Chart: Progression. Progression routes (Unknown, Second Level School, VEC Second Level School, Youthreach, Solas, Community Training Centres) by location of learning: Blended and At-Home.)
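To illustrate the outcomes-matrix step behind summaries such as the certification chart above, here is a minimal sketch; the per-student records and category values are hypothetical, not iScoil's actual data:

```python
# Illustrative outcomes matrix: certification cross-tabulated by location of
# learning. Records are hypothetical (location_of_learning, modules_certified).
from collections import Counter

records = [
    ("At-Home", 3), ("At-Home", 5), ("Blended", 2),
    ("Blended", 4), ("At-Home", 6), ("Blended", 3),
]


def outcomes_matrix(rows):
    """Count students by (location, number of modules certified)."""
    return Counter(rows)


matrix = outcomes_matrix(records)
locations = sorted({loc for loc, _ in records})
modules = sorted({m for _, m in records})

# Print a simple location x modules table.
print("modules:", *modules)
for loc in locations:
    print(loc, *(matrix[(loc, m)] for m in modules))
```

The same cross-tabulation can be extended to gender and reason for referral when asking how student profile impacts outcomes.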

• Synchronous Social Presence
• Project Based Learning
• Student Online Collaboration
• A Model for Prevention

Conclusions

Sampling across the interdependent phases followed a parallel, simultaneous design (Creswell & Plano Clark, 2011), which permitted triangulation of data and ultimately strengthened the research process.

The research design supported a study in which data were collected, analysed and interpreted in a way that provided insight into an online learning journey from induction to exit, while also identifying areas for improvement.

References

Deepwell, F. (2007). Embedding Quality in e-Learning Implementation through Evaluation. Educational Technology & Society, 10(2), 34-43.

Kennedy, S. & Soifer, D. (2013). Why blended learning can’t stand still: A commitment to constant innovation is needed to realise the potential of individualised learning. Lexington Institute.

Laurillard, D. & Ljubojevic, D. (2011). Evaluating learning designs through the formal representation of pedagogical patterns. In J. W. Kohls & C. Kohls (Eds.), Investigations of E-Learning Patterns: Context Factors, Problems and Solutions (pp. 86-105). Hershey, PA: IGI Global.

Masterman, L. (2013). The Challenge of Teachers’ Design Practice. In H. Beetham & R. Sharpe (Eds.), Rethinking Pedagogy for a Digital Age (pp. 64-77). New York: Routledge.

Oncu, S., & Cakir, H. (2011). Research in online learning environments: Priorities and methodologies. Computers & Education, 57(1), 1098-1108. doi:10.1016/j.compedu.2010.12.009

Patrick, S., Edwards, D., Wicks, M. & Watson, J. (2012). Measuring Quality from Inputs to Outcomes: Creating Student Learning Performance Metrics and Quality Assurance for Online Schools. Vienna (VA): International Association for K-12 Online Learning.

Prieto, L., Dimitriadis, Y., Craft, B., Derntl, M., Katsamani, M., Laurillard, D., Masterman, E., Retalis, S. & Villasclaras, E. (2013). Learning design Rashomon II: Exploring one lesson through multiple tools. Research in Learning Technology, 21.

Stake, R.E. (1967). The countenance of educational evaluation. Teachers College Record, 68 (7), 523-540.

Swan, K., Garrison, D.R., and Richardson, J.C. (2009). A constructivist approach to online learning: The Community of Inquiry framework. Information Technology in Higher Education: Progressive Learning Frameworks. Hershey, PA.