Monitoring and Evaluating in the PHEA Educational Technology Initiative Patrick Spaven


TRANSCRIPT

Page 1: Monitoring and Evaluating in the PHEA Educational Technology Initiative Patrick Spaven

Monitoring and Evaluating in the PHEA Educational Technology Initiative

Patrick Spaven

Page 2

The Project Cycle

Carry out the projects

Monitor

Summative

evaluation

Plan projects – and plan to monitor and

evaluate them

Outside Stakeholders

ET Strategy

Formative

evaluation

Page 3

Levels

1. Your projects

2. Your programme

3. The whole Initiative

Page 4

Monitoring and evaluating your projects

Page 5

Monitoring

The regular or continuous capture of data about a project, usually in a consistent way.

Monitoring data can be expressed in numbers or concise narrative.

Page 6

Evaluation

The all-round assessment of the performance of a project or programme

Uses data from monitoring plus those captured during the evaluation itself, e.g. qualitative interviews with key stakeholders

Page 7

Uses of monitoring

- Provides up-to-date feedback on the performance of the project. Are we on track with inputs (use of resources)? Activities? Outputs? Short-term outcomes?
- Contributes data to evaluations.

Page 8

Uses of evaluation

- Improvement: Are we doing things right?
- Wider learning: What difference did we make? Did we do the right things?
- Accountability: Did we do what stakeholders expected?
- Advocacy: Look what we can do!

Page 9

M&E

In human development, where results are often unpredictable, monitoring and evaluation are tending to converge.

Monitoring should look beyond planned results.

Evaluation should be a regular, timely process.

Page 10

Planning your M&E – the important questions

Why? For whom? What? How? When? Where?

Page 11

Planning your M&E – why, for whom and what

Be clear why you are doing it – what use you and others will make of it.

Who are the main stakeholders? What do they need to know?

In this light, and bearing in mind the resources available, decide what you should M&E and on what scale.

Don’t plan to M&E anything that isn’t important to know about!

Page 12

WHAT to M&E

Outcomes

Outputs

Activities

Inputs

Page 13

WHAT to M&E

Outcomes = the changes the intervention helped to bring about (better educated students, more empowered academic staff) – both planned and unplanned

Outputs = the immediate, planned, results of the intervention (technicians and academic staff trained, courses re-engineered and launched on the LMS)

Page 14

WHAT to M&E

Activities = the things you did to ensure you delivered the outputs (identify and contract trainers; research the software options and procure)

Inputs = the resources you used in your activities

Page 15

Kirkpatrick’s 4 levels for evaluating training

- Reactive: how they felt about the training
- Cognitive/affective: new knowledge, skills, ideas, attitudes
- Behavioural: doing new things, doing things differently
- Organisational/multiplier: effect on the organisation; diffusing benefits to others

Page 16

Planning M&E - indicators

Decide whether it would be helpful to develop indicators for these elements (inputs, activities, outputs, outcomes).

[The ETS LogFrame requires indicators for outputs]

Indicators are pre-defined, precise pointers that help you assess performance against the background of the planned inputs, activities, outputs and outcomes.

Page 17

Planning M&E - indicators

Examples:

- Proportion of activities in the project plan completed on time
- Proportion of trainees who report that the training either met or exceeded their expectations
- Numbers of students using the new LMS applications
- Date when ETS was formally adopted by the University XYZ Board
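As a sketch, indicators like these can be computed directly from simple monitoring records. All field names and figures below are hypothetical, invented for illustration only:

```python
# Hypothetical monitoring records; names and values are illustrative only.
activities = [
    {"name": "Contract trainers", "on_time": True},
    {"name": "Procure LMS software", "on_time": False},
    {"name": "Run staff training", "on_time": True},
]
trainee_feedback = ["met", "exceeded", "below", "met"]  # survey responses

# Indicator: proportion of planned activities completed on time
on_time_rate = sum(a["on_time"] for a in activities) / len(activities)

# Indicator: proportion of trainees whose expectations were met or exceeded
satisfied = sum(r in ("met", "exceeded") for r in trainee_feedback) / len(trainee_feedback)

print(f"Activities on time: {on_time_rate:.0%}")  # 67%
print(f"Trainees satisfied: {satisfied:.0%}")     # 75%
```

Keeping the raw records, rather than only the computed percentages, means the same data can later feed a formative or summative evaluation.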

Page 18

Planning M&E – indicators/targets

Indicators can be neutral as in the previous slide.

Or they can be expressed as targets if you are confident the targets are appropriate.

- 2000 students using the new LMS applications within 3 months of their launch
- ETS formally adopted by the XYZ Board by October 2010
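Once an indicator carries a target, monitoring can flag progress against it mechanically. A minimal sketch, assuming a hypothetical monitoring figure at the 3-month mark:

```python
# Illustrative target check: 2000 students on the new LMS within 3 months of launch.
target_students = 2000
observed_students = 1450  # hypothetical figure captured by monitoring

met = observed_students >= target_students
shortfall = max(0, target_students - observed_students)
print(f"Target met: {met}; shortfall: {shortfall} students")
```

A check like this only tells you whether the target was hit; the review questions elsewhere in the deck are still needed to understand why.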

Page 19

Planning M&E - indicators

Be careful not to let indicators narrow your perspective.

Outcomes in particular are often difficult to predict in advance. Look for unplanned effects of your interventions.

Page 20

A few words about baselines and attribution

You have a qualitative picture of where your institution stands in ET4TL on the verge of ETI Part B.

But if you want to assess with precision the change that your intervention has promoted in a specific area – e.g. a change in attitudes or usage of ICT among a particular group - you may need to capture data on the baseline.

This research must obviously be built into your project plan and implemented before the project gets moving.

Page 21

A few more words about baselines and attribution

Trying to measure some sorts of change and attribute them to your intervention can be difficult. The before and after groups may not be comparable (e.g. different student cohorts), and change may be influenced by extraneous factors.

Page 22

Planning M&E - data

Work out HOW you are going to capture the data on inputs, activities, outputs and outcomes – including baseline data.

Bear in mind the cost, time and access issues in capturing the data. Don’t try to do too much. Use samples if appropriate.

Work out how you are going to process and analyse the data.
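For instance, where surveying an entire student population would cost too much, a simple random sample keeps data capture manageable. A minimal sketch, assuming a hypothetical student list and an arbitrary sample size:

```python
import random

random.seed(1)  # fixed seed purely so the illustration is reproducible
students = [f"student_{i}" for i in range(2000)]  # hypothetical population
sample = random.sample(students, 100)  # survey only 100 of the 2000, without replacement

print(len(sample))  # 100
```

Any sample size here is a trade-off between cost and precision; the point is only that sampling should be decided at the planning stage, not improvised later.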

Page 23

Project Logical Framework – optional except for outputs

Level | Descriptors | Performance indicators | Sources of verification | Risks/assumptions
Long-term Outcomes | Should relate to ETS LogFrame | Should relate to ETS LogFrame | Should relate to ETS LogFrame | Should relate to ETS LogFrame
Outputs | Key output relates to ETS LogFrame | Key indicators relate to ETS LogFrame | Key sources relate to ETS LogFrame | Key risks/assumptions relate to ETS LogFrame
Activities | | | |
Inputs | | | |

Page 24

Questions to answer in a summative evaluation

Effectiveness (have we fulfilled the project objectives?)

Efficiency (have the resources – including time – been used optimally?)

Page 25

Questions to answer with a summative evaluation

Impact (what difference have we made - intentionally or unintentionally?)

Sustainability (are the positive changes likely to last?)

Relevance of project (were we doing the right things?)

Page 26

The Project Cycle

Carry out the projects

Monitor

Summative

evaluation

Plan projects – and plan to monitor and

evaluate them

Outside Stakeholders

ET Strategy

Formative

evaluation

Page 27

Reporting on your projects

The MOA requires six-monthly reporting on projects:

- Progress towards results and indicators
- A summary of problems and challenges experienced
- Revised activity schedules for each project
- A budget variance report.

Page 28

Reviewing projects

You will want to meet more frequently than that to review your projects. You will want to keep a note of what you conclude to feed into the six-monthly MOA reporting.

Your regular review of projects will inform a six-monthly programme-wide self-assessment.

Page 29

Reviewing your projects

We suggest you ask these standard questions, among others, at your project review:

- Have the activities and products been completed according to plan, in terms of both timing and quality? If not, why not? What have you done to address completion and quality challenges?
- What benefits is the project producing, for people and the institution as a whole?
- Are there any negative effects? If so, what are they? What is being done to mitigate them?

Page 30

Monitoring and evaluating your programme

Page 31

The self-assessment process

- Meeting of the ETI core team plus project leaders every 6 months, lasting 4-6 hours
- Report in full, with a 1-2 page summary for the MOA requirement (the MOA also requires six-monthly reporting on projects, as we have seen).

Page 32

Self-assessment at programme level

How is the overall ETI progressing?

- What are the main changes/outcomes that have taken place for people and the institution as a whole, both positive and negative, as a result of the ETI?
- Are there any outcomes/changes that you were expecting by now, but which have not taken place? If so, why do you think they haven't taken place?
- How useful has your ET Strategy been in this period? What specifically has it helped with?

Page 33

Self-assessment at programme level

How is the overall ETI progressing?

- What aspects of your team's work have been most constructive and productive?
- Are there aspects of your team's work that have not worked well? If so, what are the probable reasons?
- In what ways has the wider institution supported progress in the ETI?
- Are there aspects of the wider institution that have hindered progress?

Page 34

Self-assessment at programme level

How is the overall ETI progressing?

- What has been helpful to you in the work of the SAIDE-CET team and the external evaluator?
- What has not been helpful?
- What other support could they have given you that would have been helpful?

Page 35

ETS Logical Framework

Vision

Level | Descriptors | Performance indicators | Means of verification | Assumptions/risks
Long-term Outcomes | Project outcomes should relate to them | | |
ET Strategy Outputs | Key outputs from project plans | Defined in project plans | Defined in project plans | Defined in project plans