TRANSCRIPT
Deakin University Ms Heather Sainsbury
University of Tasmania Dr Sara Booth
University of Wollongong Ms Anne Melano | Ms Lynn Woodley
BENCHMARKING OF ASSESSMENT
Articulating and Comparing Standards through Benchmarking
Dr Sara Booth
University of Tasmania
CONTESTED SPACE: STANDARDS AND BENCHMARKING
Standards mean uniformity - one size fits all - national curriculum
5 sets of sector standards (DEEWR & TEQSA): Provider (Registration and Category), Qualification (AQF), Teaching and Learning, Information, Research
Sets of academic standards – a contested space including professional (e.g. teaching standards); quality assurance; minimum threshold (what is achieved); aspirational and student achievement standards (Carmichael, 2010)
TEQSA’s discussion paper on Teaching and Learning Standards (July 2011): learning standards / teaching standards / role of TEQSA / role of universities
The definition of benchmarking varies across the sector
Implicit standards in universities are self-monitoring and self-regulating
Explicit standards mean diversity, substance, accountability and transparency
They are a basis for comparison and collaboration
Universities need to become more explicit in comparing standards. To do this:
- Make explicit the definition of standards used
- Make explicit the definition of benchmarking used
Argument 1: Explicit. Argument 2: Implicit to Explicit.
Jackson and Lund (2000, cited in Stella & Woodhouse, 2007, p.14) define benchmarking as
‘first and foremost, a learning process structured so as to enable those engaging in the process to compare their services/activities/products in order to identify their comparative strengths and weaknesses as a basis for self improvement and/or self regulation’.
Agreed points of comparison – Deakin, UOW, UTAS
• Three Cycle 1 AUQA Audits specified more benchmarking
• Comparable institutions – age, structure, regional presence, disciplines
• Benchmarking awareness and confidence at similar level
BENCHMARKING AS A PROCESS FOR IMPROVEMENT THROUGH COMPARISON OF STANDARDS
1. Early Implementation
Universities need to develop and implement a benchmarking framework, processes and partnerships as part of the Quality System
2. Further Refinement and Alignment
Universities have begun to implement benchmarking processes and partnerships, but further refinement and alignment with other university processes is required
UOW, Deakin and UTAS – we are currently here!
3. Full Embedding
Universities have established benchmarking frameworks, processes and partnerships across the sector and make extensive use of external reference points and benchmarking
Key features
• university-wide approach
• aligned to strategic priorities, data strategy, data warehouse and risk framework
• applied at unit and course level
• mechanisms for selecting appropriate institutions
• benchmarking reference groups
(Booth, 2011)
UNIVERSITIES ARE AT DIFFERENT STAGES OF DEVELOPMENT TOWARDS BENCHMARKING
Ms Heather Sainsbury
Deakin University
Planning
• Establishing the benchmarking partnership
• Agreement on area and scope
• Planning for success
Implementation
• Communicating with faculties
• Streamlining the process
• Putting it together
ASSESSMENT BENCHMARKING – CASE STUDY OF A SUCCESSFUL PARTNERSHIP
Success factors
Shared understanding of benchmarking goals
High level of trust
Willingness to share information and discuss successes and failures
THE BENCHMARKING PARTNERSHIP
Success factors: Similar enough to offer transferable strategies
Similarities
• All unaligned
• Compatible missions, values and goals
• Multi-campus structures
• Regional presence
• Comparable discipline areas
• Similar experience of AUQA audit cycles
Differences
• Size
• Student profiles
• Offshore presence
• Off campus delivery
THE BENCHMARKING PARTNERSHIP
Success factors: Comparable commitment
THE BENCHMARKING PARTNERSHIP
Success factors: Sustained commitment
THE BENCHMARKING PARTNERSHIP
Success factors: The more partners there are, the harder it gets
Communication and flexibility are the keys to success
THE BENCHMARKING PARTNERSHIP
What to benchmark?
• Catalyst for assessment project – 2009 AUQF in Alice Springs
• Paper by Linda Davies (Griffith Uni) on ALTC Teaching Quality Indicators Project – external reference point
• Shared commitment to review assessment practice in the lead-up to our respective AUQA audits in 2011
• Potential to deliver significant benefits to all three universities
• Support from relevant Executive and other leaders critical
AGREEMENT ON AREA AND SCOPE
Agreement on scope
• Careful scoping through a collaborative process involving senior academic and quality leaders from each university
⁻ Time period
⁻ Coverage – undergraduate but excluding Honours
⁻ Focus on standards – assessment design not covered
Agreement on data to be shared
• Make sure that you are talking about the same thing – different terminology a potential barrier
• Take the time to get it right…
AGREEMENT ON AREA AND SCOPE
Agreement on scope: Keep sight of the main objective
AGREEMENT ON AREA AND SCOPE
Agreement on methodology
• Derived from existing successful methodology – ACODE Benchmarking Framework (2007)
⁻ Self-review by each partner
⁻ Peer review
⁻ Action plans (shared)
• Adapted indicators and measures developed through the TQIP project
• Tested against literature on good practice, expert reviewers and academic leaders at each university
• Agreement reached on:
⁻ Performance indicators
⁻ Good practice statements
⁻ Performance measures
⁻ Trigger questions
PLANNING FOR SUCCESS
Agreement on performance indicators and measures
PI #1: Assessment purposes, processes and expected standards of performance are clearly communicated and supported by timely advice and feedback to students
Good Practice Statement: Students receive clear and timely information on the aims and details of assessment tasks; marking and grading practices; expected standards of achievement; and requirements for academic integrity. They are provided with timely feedback on their performance and supported in making improvements.
Performance measures:
1.1 Expectations are clearly communicated
1.2 Advice and feedback are provided
Trigger questions under each measure
PLANNING FOR SUCCESS
Agreement on self-review templates
PLANNING FOR SUCCESS
Performance measure: State measure as agreed, with trigger questions to focus self-review
Rating: 4-level scale – 1 Yes; 2 Yes, but; 3 No, but; 4 No
Rationale: Dot points identifying practices that support the rating
Evidence: Including references to policies, documents, web references, data sources (including student feedback)
Agreement on timelines
• Build in flexibility for partners to move at slightly different speeds at different times, while still all meeting critical common dates:
⁻ Finalising templates
⁻ Completion of self-reviews and sharing of self-review reports
⁻ Peer review workshops
⁻ Contributions to shared reports
• Accommodate internal deadlines of partners wherever possible (key committee dates, AUQA deadlines)
PLANNING FOR SUCCESS
Ms Anne Melano
University of Wollongong
Communicate with faculties
• Prepare a communication plan
• Consider the culture – e.g. UOW is very consultative, with very engaged faculty T&L chairs
• Hold a high-level briefing – establishes importance, brings faculty leaders together
• Hold informal one-on-one meetings – answer questions and address concerns
• Don’t rush – do invite comments on documents and processes – builds ownership
• Send out updates as the project progresses
• Thank/acknowledge along the way
IMPLEMENTATION
Provide support
• Appoint a project coordinator
• Encourage faculties to identify a person to support the faculty leader
• Offer funding or admin assistance if possible
• Provide a clear guide to the process
• Provide data packs
• Offer draft emails, information sheets etc. that faculties can send to staff
• Attend faculty self-reviews – helpful as questions of interpretation do arise
IMPLEMENTATION
Streamline the process
• Faculties are time-poor – risk of backlash if time contributed is not rewarded by benefits
• Clear, realistic timeline and expectations
• ONE self-review meeting in each faculty – if you put together the right people, most questions can be answered
• ONE template to work through – all questions clearly set out
• Simple rating scale
• As much as possible of the template completed in that meeting
• A rating on each measure MUST be agreed by the group, otherwise there is no clear result
• A similarly streamlined process for institutional reviews and for the peer review across the three universities
IMPLEMENTATION
But it does need rigour…
Question design based on:
⁻ Griffith ALTC project, additional work by Boud, advice from Joughin, testing in a faculty
Evidence:
⁻ has to be provided to support each rationale/rating
⁻ collecting this is a major effort by faculty leaders and their admin assistants
⁻ survey conducted at UTAS – valuable and can be done centrally
⁻ all evidence checked centrally
IMPLEMENTATION
Sharing
• At each level, encourage the conversations – these can be just as important as the project outcomes. Good practice sharing, questioning and problem solving occur naturally – let them
• Faculties aren’t mediaeval castles – encourage interaction
• UOW – each faculty leader sat in on another’s self-review
• Deakin – four Associate Deans (T&L), very collegial
• Avoid the ‘black hole of benchmarking’ – reward evidence-gathering by selecting and disseminating good practice
IMPLEMENTATION
Putting it together – the institutional self-review
• Faculty reports combined into an institutional report
• All leaders brought together
• Agreement on institutional rating, good practice and gaps/issues
• Discussion of each measure with top issues agreed – these form the basis of an action plan for the future
IMPLEMENTATION
Putting it together – the three-university peer review
• Face-to-face if possible
• Selection of leaders brought together
• Icebreakers, time to mingle
• Template provided to work through – each institution’s results and ratings on each measure
• Review of institutional ratings
• Discussion of good practice and gaps/issues
• Expect surprises! You may be doing better than you think… OR your ‘best practice’ may be just ‘ho-hum, that’s what everyone is doing’!
IMPLEMENTATION
Ms Lynn Woodley
University of Wollongong
• Using and sharpening the tools: what works and what doesn’t
⁻ The broad indicators of the Griffith TQIP project (Davies, 2009)
⁻ The ACODE Benchmarking Framework
⁻ Templates – the Pollard Rating Index: "No but yeah but no but yeah but no but..."
• Killing two birds: making the most of the project
• Benchmarking logistics: checking the steps and the flight plan
• Escaping the black hole – the action plan
• Becoming a toolmaker
KEY OUTCOMES: THE PROCESS
Collegial partnerships
Institutional: self-review activity; cross-faculty bonds
Cross-university: co-ordinators, executive and academic staff
A mutual learning process for all involved
KEY OUTCOMES: THE PROCESS
Assessment standards at work:
The academic standards trinity: Learning Outcomes, Assessment, Graduate Qualities
An “academic” exercise in definition or a “real world” definition – how do academics set, monitor and review standards?
Uniformity vs Quality and Good Practice
KEY OUTCOMES: THE TOPIC
Assessment – Good Practice and Quality Improvement: insights and ideas from the practices of others
Good practice and areas for improvement for each faculty and each university
What we do well, for example:
• Deakin – Online Unit Guide
• UTAS – Criterion-referenced assessment (CRA) supported by faculty champions
• UOW – educative focus of the Academic Integrity Policy
What we needed to do better:
• Connecting learning outcomes, Graduate Attributes/Qualities and Assessment (the crux of academic standards)
• Staff development (incl. sessional staff)
• Marking practices for group work
• Use of best practice models
• Benchmarking at the course/program level (Oliver, 2009)
KEY OUTCOMES
Did we achieve the Project Aims?
1. Compare processes within faculties, across each university and across the three universities.
2. Compare the effectiveness of Academic Boards/Senates in performing their role in policy and standards, across the three universities.
3. Identify good practice and areas where improvements can be made for the benefit of students and staff at each university.
4. Develop and share knowledge and experience between the three benchmarking partners about the process of benchmarking.
Your rating? "No but yeah but no but yeah but no but..."
KEY OUTCOMES