HEP Quality Audit Briefing on ADRI, Scope & Evidence
© 2007 Martin Carroll, Salim Razvi & Tess Goodliffe, Oman Accreditation Council


TRANSCRIPT

Page 1:

HEP Quality Audit: Briefing on ADRI, Scope & Evidence

© 2007 Martin Carroll, Salim Razvi & Tess Goodliffe

Oman Accreditation Council

Page 2:

Table of Contents

Part A: ADRI (9.00am – 9.30am)

Part B: Scope of Audit (9.30am – 10.00am)

Part C: Methods of Analysis (10.00am – 10.30am)

Part D: Questions and Discussion (10.30am – 11.00am)

Page 3:

Part A: ADRI

Page 4:

[Diagram: the ADRI cycle – Approach, Deployment, Results, Improvement]

Page 5:

Approach: how an HEP proposes to achieve its purpose

• Operational Plans – detailing what should be done by when, by whom, to what standard and with what resources.
• Manuals – detailing how processes should be implemented.
• Professional development and training aligned to HEP operational needs.
• Alignment of resource allocation to plans.

Page 6:

[Diagram: the ADRI cycle – Approach, Deployment, Results, Improvement]

Page 7:

Deployment Dimensions

• Also known as ‘implementation’ or ‘process’.
• Looks at how an HEP is implementing its approach.
• In other words, do the plans and bylaws happen in reality?
• This is best tested through interviews; tap into people’s ‘lived experiences’.
• Also includes consideration of input factors such as the quality of resources.

Page 8:

[Diagram: the ADRI cycle – Approach, Deployment, Results, Improvement]

Page 9:

Results Dimensions

• Quality cannot be assured by only focusing on the goals, plans, inputs and processes.
• There must be an emphasis on what is actually achieved – the results!
• Every goal must have a reported result.
• Every result should link back to a goal.

Page 10:

Results Dimensions

• It is essential that a causal relationship can be shown between the approach, the deployment and the eventual result – otherwise the result may be just chance.
• If you know that A + B + C = 19 you still do not know what B is (i.e. whether each step in the process is adding value or not).

7 + (-6) + 18 = 19
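To spell out the arithmetic (this illustration is added here, not part of the original slide): many different combinations of A, B and C give the same total, so knowing the total alone tells you nothing about the contribution of any single step.

\[
7 + (-6) + 18 = 19, \qquad 1 + 1 + 17 = 19, \qquad 19 + 5 + (-5) = 19
\]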

Page 11:

[Diagram: the ADRI cycle – Approach, Deployment, Results, Improvement]

Page 12:

Improvement Dimensions

• This dimension looks at what an HEP knows about itself in order to get better and better.
• Goals should be continually set higher.
• Processes should get more efficient and more effective over time.
• Results should indicate increasing success.
• This requires a comprehensive system of review – not just consideration of results.

Page 13:

[Diagram: the ADRI cycle – Approach, Deployment, Results, Improvement]

Page 14:

Scenario: Student Evaluation of Teaching

“We have had our standardised voluntary student evaluation of teaching (SET) programme in place for 4 years. The surveys use quantitative and qualitative questions. Teachers run the surveys in the last week of each semester. It was reviewed last year by an international teaching evaluation expert, who commended it as “best practice”. Mean results rate consistently high (above 4.2). SET results are required for promotion applications. Our staff development programmes build on areas of weakness identified through aggregated results.”

Page 15:

Some Responses: Student Evaluation of Teaching

A: Need to see the guiding bylaw and the design of SET (not all SETs are designed well).

D: ROSQA 3.3.3 requires that the system be “comprehensive”. Theirs is voluntary. How many actually use it? How do students feel about teachers running the survey?

R: Need to see the review report to find out what “best practice” means. “High” is also a relative term. High compared with what? Benchmarks?

I: No evidence of improving SET itself, but some evidence of using SET to improve teaching through professional development. Any evidence that this has led to improvements in SET results?

Page 16:

Some Responses: Student Evaluation of Teaching

[Chart: SET Average Results – Overall Satisfaction, 2002–2005, with the vertical axis labelled from Very Low (1) up to Very High.]

Appears to be fine. And if we knew that the trend was improving, then it would look even better.

Page 17:

Some Responses: Student Evaluation of Teaching

[Chart: the same SET Average Results – Overall Satisfaction, 2002–2005, now shown against a seven-point scale on which 7 is Very High, 4 is Average and 1 is Very Low.]

But we haven’t been told the scale. Perhaps it is a 7pt Likert-type scale! This would change our interpretation of the result.

And what if the trend goes the other way? Again, this would impact on our interpretation.

Clearly, we needed more information to make an informed judgement.
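As a small illustration of why the scale matters (a sketch added here, not part of the original briefing; the function name and numbers are purely illustrative): a reported mean of 4.2 sits near the top of a 1–5 scale but only just above the midpoint of a 1–7 scale.

```python
def scale_position(mean, low=1, high=5):
    """Express a Likert-scale mean as a fraction of the distance from the
    bottom (0.0) to the top (1.0) of its scale, so results reported on
    different scales can be compared like for like."""
    return (mean - low) / (high - low)

# The same reported mean of 4.2 reads very differently on the two scales:
print(scale_position(4.2, high=5))  # 0.80  -> near the top of a 5-point scale
print(scale_position(4.2, high=7))  # ~0.53 -> barely above the midpoint of a 7-point scale
```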

Page 18:

Possible Conclusion: Student Evaluation of Teaching

This may be an example where the quality management system is very good, but poorly presented.

Much depends on the credibility and report of the international expert.

As it stands, there is inadequate evidence to suggest that SET is effective in meeting clearly stated teaching improvement goals.

Page 19:

[Diagram: the ADRI cycle]

• Approach – mission, planning, designing processes and training staff
• Deployment – implementing plans and monitoring processes
• Results – analysing measures of outcomes against goals
• Improvement – reviewing and improving Approach, Deployment and Results

Page 20:

Part B: Scope of Quality Audit

Page 21:

What does Quality Audit Consider?

• Everything done in the name of the HEP or its subsidiaries could be subject to consideration in a Quality Audit.

• The following topics are provided by way of guidance (they supersede the 10 standards in ROSQA). They do NOT constitute a comprehensive checklist, and are negotiable.

• They are NOT standards – they suggest topics, but do not prescribe levels of performance against each topic. Each topic ought to be analysed using ADRI and robust evidence.

Page 22:

What does Quality Audit Consider?

1. Governance & Management

2. Student Learning by Coursework Programs

3. Student Learning by Research Programs

4. Staff Research & Consultancy

5. Industry & Community Engagement

6. Academic Support Services

7. Students & Student Support Services

8. Staff & Staff Support Services

9. General Support Services & Facilities

Page 23:

Part C: Methods of Analysis

Page 24:

Self Study vs External Review

Similar methods of analysis, with the following differences:

Self Study                                       | External Review
Internal mandate                                 | External mandate
Story creation                                   | Story verification
Commitment of people, resources & time: ongoing  | Commitment of people, resources & time: more bounded
All issues                                       | Sample issues
Deep knowledge of the HEP                        | Little or no pre-existing knowledge of the HEP

Page 25:

Methods of Analysis Issues

• Sampling
• Types of Evidence & Data Analysis
• Gaining a Comprehensive Picture
• Gaining Confidence in the Evidence
• Interview Methods
• Reaching Conclusions

Page 26:

Sampling (1/2)

• An HEP reviews all its major activities.
• An External Audit Panel uses sampling due to limited:
  – number of reviewers
  – amount of time
  – access to the HEP
• Two things are sampled: issues and evidence.
• The Audit Panel makes the sampling decisions.

Page 27:

Sampling (2/2)

• Sampled Issues:
  – All main headings (the 9 Scope topics) will be covered, but not every topic/issue under those headings
  – The Panel will seek to achieve an overall balanced account of the HEP
• Sampled Evidence:
  – People to interview
  – Administrative & academic departments to investigate
  – Programs and courses to consider
  – Documents to consider

Page 28:

Types of Evidence

• Conclusions in an Audit Report are based on consideration of evidence.

• There are different types of evidence, each with its own methods of presentation & collection.

• A notable distinction is between Quantitative and Qualitative evidence.

• All types have something useful to offer in reaching a comprehensive conclusion.

• Therefore, Quality Audit is a mixed method exercise.

Page 29:

Qualitative Evidence

• Examples include:
  – Case Studies
  – Folk Lore (shared legends)
  – Lived Experiences
• Methods of Collection include:
  – Interviews
  – Surveys (e.g. using open-ended questions)
  – Observations

Page 30:

Quantitative Evidence

• Examples include:
  – Enrolment statistics
  – Financial results
  – Group opinions
  – Student results
• Methods of Collection include:
  – Surveys (e.g. using Likert-type scales)
  – Documented reports
  – Records

Page 31:

Gaining a Comprehensive Picture

• Saturation – a method used to explore an issue until no new information about it comes to light.
• Triangulation – a method for strengthening the analysis using a combination of:
  – multiple original sources of data (e.g. students, staff, other stakeholders)
  – multiple methods of data collection (e.g. surveys, interviews, literature)
  – different types of data
• Process Mapping – a method for depicting the steps in a process and their relationships.

Page 32:

Gaining Confidence in the Evidence

• In reaching a conclusion, Panel Members must have confidence that the evidence is not only comprehensive, but also valid, reliable and honest!
• Examples of methods for gaining confidence:
  – Chatham House Rule
  – Discouraging ‘rehearsing’ by interviewees
  – Random interviews
  – Non-attributable surveys (e.g. student evaluations of teaching)
  – Independence (of survey analysis, program reviews, etc.)

Page 33:

“Wet Paint” Syndrome

• HEPs will usually seek to address a range of problems before the Audit Visit occurs.

• HEPs should acknowledge the newness of any such improvements (rather than presenting the issue as being of long standing).

• HEPs should not be embarrassed that an improvement is new.

• OAC supports new improvement efforts – it is evidence of a quality management system in practice.

Page 34:

But there are limits!!!

Page 35:

Interview Session Techniques

Interviewers will use a range of questioning techniques in order to:
• Explore how the documented evidence compares to the ‘lived experience’ of staff and students.
• Seek the authoritative voice on a topic.
• Seek validation from non-authoritative sources.
• Determine how widely known an issue/policy etc. may be.
• Saturate a topic.
• Triangulate a topic.

Page 36:

Reaching Conclusions

• Conclusions should be based on a comprehensive analysis considering different types of evidence.

• They represent the judgment of the team involved – not individual members.

• Each chapter in a Portfolio should conclude with Areas of Strength and Opportunities for Improvement.

• Conclusions in the Audit Report are presented in the form of Commendations, Affirmations and Recommendations.

Page 37:

Helpful Review Questions

• What type of evidence is provided?
• Is this (a) appropriate and (b) comprehensive in relation to the objective?
• What are the (a) advantages & (b) limitations of this type of evidence in relation to the objective?
• How would you check on the accuracy / validity / reliability of this evidence? (How does this differ between internal and external reviews?)
• What further kinds of evidence would be required? (Triangulation)