
LARGS ACADEMY

SQA

INTERNAL VERIFICATION POLICY

NOVEMBER 2009

Christine Collins DHT

SQA INTERNAL VERIFICATION

All SQA centres are responsible for the internal verification of their assessments. This means that each centre should have an internal verification system: a set of quality checks operated throughout the centre. Every member of staff responsible for the assessment of candidates and/or the internal verification of candidate material should comply with the procedures.

The internal verification system ensures that the centre’s assessments of internally assessed qualifications are valid and reliable, and that the centre:
a) ensures that the chosen assessment instruments and assessment guidelines are valid and applied consistently by all assessors for the same qualification across all candidates
b) demonstrates that arrangements are effective for the safe storage of internal assessment materials
c) ensures that access to assessment materials is effectively managed
d) ensures that the final assessment decisions made by assessors are accurate, reliable and recorded
e) takes steps to minimise the risk of plagiarism
f) ensures that assessment evidence is the candidate’s own work
g) monitors the effectiveness of the assessment and internal verification system and implements any necessary changes
h) implements any changes made necessary by changes to SQA requirements

The internal verification process:

Step by step, and in terms of who does what, the process of assessment and internal verification may look like this:
1 The assessors decide how they are going to assess the candidates. For example, if candidates are required to demonstrate competence in a practical skill, then a practical exercise would be designed.
2 The internal verifier confirms that the assessments are valid and that the assessment specification and marking schedule are appropriate. This could be done by discussion at internal verification meetings and recorded in minutes.
3 Assessment is carried out using materials which the internal verifier has checked.
4 The assessors evaluate the candidate’s evidence to ensure that it meets the requirements of the qualification. Agreed marking schedules should be used.
5 The internal verifier confirms that assessors are marking consistently, applying the standards defined for the Unit. He or she usually does this by sampling the work of the assessor.
6 Assessment records, materials and evidence are retained in line with SQA requirements.

SQA documents relating to Internal Verification/Moderation, Assessment and Appeals have been downloaded onto the I drive, but the condensed points in Appendices 2, 3, 4 and 5 should be adhered to so that students have the best chance of success:

References on I drive

• Estimates, Absentees and Assessment Appeals (February 2009)
• Guide to Internal Moderation for SQA Centres (2001)
• Standard Grade Advice on generation of evidence for assessable Elements assessed by Question Papers
• Unit assessment and Course assessment: General principles
• SQA’s Quality Framework: a guide for centres (March 2006)
• Quality Assurance Principles, Elements and Criteria (2008)
• Introduction to Assessment Arrangements (2008)

LARGS ACADEMY

INTERNAL VERIFICATION PROCEDURES

There must be evidence of internal verification to ensure that estimates are accurate, as part of the external verification process, and as evidence for appeals. It is vital that internal verification procedures are consistent across the whole school, and the following procedure should be adopted.

1) The PT/subject leader should instigate an internal verification programme for the department.

2) Assessment tools and prelim papers should be scrutinised to ensure they meet SQA standards and are reliable, valid and secure.
• Prelim papers produced from SQA past papers, or from commercially produced papers that are not of the current year, should be made up from a minimum of three papers. The sources of questions should be mixed so that consecutive questions are not from the same paper. The source of each individual question should be identified and recorded (see Appendix 3).
• To be valid, the prelim paper must replicate the SQA exam paper.
• Any NAB that is not a secure SQA NAB needs prior verification by the SQA before use. (Generally NABs are not suitable evidence for appeals. The verifier should check the subject-specific documentation.)

3) Pupil work should be sampled regularly, using a valid sample of 5 pupils per class for groups of 33 and 3 pupils for classes of 20 or fewer.
• The verified pieces of work should be signed by both assessor and verifier.

4) The generic school checklists and forms provided in Appendix 1 should be used and retained as evidence for external verification and appeals.

5) A generic Verification Meeting Form comprising three sections (before, during and after delivery) should be completed and included with external verification and appeal evidence.

6) All evidence should be retained securely until the October following the pupil leaving school.
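The sampling rule in step 3 can be sketched as a small helper. This is an illustrative sketch, not part of the policy: the policy only gives figures for classes of 33 (5 pupils) and of 20 or fewer (3 pupils), so treating every class above 20 pupils as requiring a sample of 5 is an assumption.

```python
import random

def verification_sample(pupils, seed=None):
    """Return the pupils whose work should be internally verified.

    Implements the stated rule: 3 pupils for classes of 20 or fewer,
    5 pupils otherwise (the policy states 5 for groups of 33; applying
    that figure to every larger class is an assumption).
    """
    size = 3 if len(pupils) <= 20 else 5
    size = min(size, len(pupils))  # guard against very small classes
    rng = random.Random(seed)      # seed allows the draw to be reproduced
    return rng.sample(pupils, size)
```

Passing a seed lets the verifier record and reproduce the draw, which suits the requirement to retain evidence of the sampling.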

The responsibilities within the centre are as follows:

SQA Co-ordinator/Centre Contact
• develops and implements improvements to quality assurance systems
• liaises with SQA
• arranges for the training of internal verifiers and assessors
• co-ordinates the operation of the internal verification system
• arranges for the induction of candidates
• co-ordinates external verification activity on behalf of the centre
• co-ordinates appeals

Internal Verifier/PT or nominated representative
• operates systems to standardise assessment and ensure that the work of all assessors is sampled over a defined period
• monitors consistency of assessment records
• supports assessors by offering guidance and advice, particularly in the case of new or inexperienced assessors, whose work should be sampled more often
• prepares a plan for internal verification
• decides on the methodology/mechanisms to be used
• samples assessment materials using school policy
• liaises with external verifiers and the SQA co-ordinator
• co-ordinates meetings of assessors
• produces records of internal verification for external verification and as evidence for appeals
• keeps records of his or her activities

Assessor/Teacher
• contributes to the design and review of assessment materials
• plans the assessment process with the PT and candidate
• assesses evidence against the SQA standards and makes judgements
• completes the assessment records
• liaises with other assessors and the internal verifier
• participates in internal and external verification

Records of the internal verifier’s activities will include:

Internal verification schedule
• a list of the Units for which the assessors have responsibility
• a list of the assessors with whom the internal verifier liaises

Records relating to the design of assessment
• record of comments made about the assessments
• the source of each individual question used

Records relating to the delivery of assessment
• notes of meetings with assessors
• confirmation that assessment complies with the Unit standards

Records relating to the review of the assessment
• feedback from the external verifier, assessors and candidates
• changes made to the assessments in the light of feedback

Possible methodology/mechanisms for internal verification

The PT/subject leader is responsible for working with his or her department to ensure that a variety of methodologies is on hand so that verification can take place smoothly. In terms of the actual mechanisms used for internal verification, there are many options. These may include:
• checklists: particularly useful when relying on observation or conducting interviews or role plays
• model solutions and suggested answers: for use when any assessment has been carried out
• discussions about assessment: it may be necessary to discuss levels of performance for particular candidates
• cross-assessing/block marking: it is sometimes useful for assessors to agree standards by marking each other’s work
• bank of material: assessments which have already been agreed

APPENDICES

APPENDIX 1
a) Internal Verification Checklist
b) Internal Verification Sample Form
c) Internal Verification Feedback Form
d) Internal Verification Meeting Form

APPENDIX 2
Estimates

APPENDIX 3
Prelims

APPENDIX 4
Using evidence generated by NABs

APPENDIX 5
Principles of Assessment

LARGS ACADEMY

CHECKLIST FOR CONDUCTING INTERNAL VERIFICATION OF NQ COURSES, STANDARD GRADE COURSES AND PRELIMS

DEPARTMENT
COURSE NAME
UNIT NO
UNIT TITLE
LEVEL: Standard Grade / Intermediate 1 / Intermediate 2 / Higher / Advanced Higher
TYPE OF ASSESSMENT: NAB / Prelim
ASSESSOR’S NAME
INTERNAL VERIFIER’S NAME

CHECKLIST (√) / DATE
Do you have the following:
• Unit Specification
• Instrument of Assessment
• Marking Scheme
• Cut-Off Scores
• Source of Each Question Identified
• Class List
• Candidate Evidence
If any of the above items are not ticked (√), give the reason why.

When the process is complete, have you sent a copy of the Internal Verification Feedback Form to relevant staff? Yes / No

Have you confirmed arrangements for follow-up action and agreed dates with staff? Yes / No

LARGS ACADEMY
INTERNAL VERIFICATION SAMPLE FORM

DEPARTMENT
COURSE NAME
UNIT NO
UNIT TITLE
LEVEL: Standard Grade / Intermediate 1 / Intermediate 2 / Higher / Advanced Higher
TYPE OF ASSESSMENT: NAB / Prelim
ASSESSOR’S NAME
INTERNAL VERIFIER’S NAME

SAMPLE CANDIDATES (Candidate Name / Mark/Grade / Pass/Fail)
1

2

3

4

5

6

7

8

9

10

11

12

SAMPLE CANDIDATES (Candidate Name / Mark/Grade / Pass/Fail)
13

14

15

16

17

18

19

20

21

22

23

24

LARGS ACADEMY
INTERNAL VERIFICATION FEEDBACK FORM

DEPARTMENT
COURSE NAME
UNIT NO
UNIT TITLE
LEVEL: Standard Grade / Intermediate 1 / Intermediate 2 / Higher / Advanced Higher
TYPE OF ASSESSMENT: NAB / Prelim
INTERNAL VERIFIER’S NAME

CHECKLIST ITEM / VERIFICATION OUTCOME / ACTION REQUIRED AND AGREED DATE / ACTION COMPLETED (√)

Current Unit Specification being used: Yes / No
Valid Instrument of Assessment/Marking Scheme being used: Yes / No
Appropriate Sample Supplied: Yes / No
Candidate Evidence (consistent marking which meets the requirements of the assessment): Pass / On Hold

Internal Verifier Signature ………………………………………………………………………………………….
Member of Staff Signature ………………………………………………………………………………………….
Date …………………………………………………………………………………………………….

LARGS ACADEMY

INTERNAL VERIFICATION MEETING

DEPARTMENT

COURSE

DATE

PRESENT

BEFORE DELIVERY
1 MATTERS ARISING
2 CHANGES TO UNIT SPEC/ARRANGEMENTS DOCUMENTS (changes required; staff to make the changes; target date for changes)
3 CHANGES TO INSTRUMENTS OF ASSESSMENT (changes required; staff to make the changes; target date for changes)

DURING DELIVERY
4 REVIEW OF COURSE DELIVERY (candidate progress/achievement; methodology used; action required and by whom)

AFTER DELIVERY
5 INTERNAL VERIFICATION FEEDBACK (candidate evidence accepted/on hold)

6 AOB
7 DATE OF NEXT MEETING

APPENDIX 2
Estimates

Estimates are based on candidates’ demonstrated attainment against the assessment requirements of the National Courses. Since Estimates relate to achievement of a Course, the assessment instruments used to generate evidence for Estimates should allow candidates opportunities to demonstrate attainment against the requirements for the Course, taking into account the different components of the Course and their relative weightings.

Assessment instruments that assess candidates’ abilities against the Outcomes of National Units (eg NABs) may help with making decisions about Estimates. However, NABs cannot, on their own, generate sufficient evidence of how a candidate can perform against the assessment criteria (eg retention, integration) for a Course. There is more advice on the use of NABs in generating evidence for Estimates in the subject-specific guidance for each National Course.

Purpose of Estimates

Each year, centres submit Estimates for every National Course candidate. These Estimates are made by subject specialists in centres, and are based on each candidate’s demonstrated attainment in assessments conducted in the centre. Estimates are not compulsory. However, candidates may be disadvantaged if you do not provide Estimates, as this would exclude them from Absentee consideration and the Appeals process.

Estimates influence a range of decisions that can impact on a candidate’s grade for a National Course. For example:
♦ They play a role in helping us make decisions about the pass mark for a Course, because the Estimates we receive each year indicate the grades that centres expect the population of candidates to attain.
♦ They help Examiners to prioritise candidates’ scripts for reconsideration during the finalisation stage of the awarding procedure, where marks attained are close to grade boundaries (ie cut-off scores).
♦ They enable centres to submit a claim for absentee consideration where, for a valid reason, a candidate is unable to attend an examination.
♦ They enable centres to submit Appeals for candidates who do not achieve their estimated grade at C and above.
♦ They enable us to compare centres’ judgements with candidate achievement, within and across centres.

Evidence for Estimates: how Course components are assessed

The evidence used for making Estimates should be based on a candidate’s demonstrated attainment against the Course Grade Descriptions, in all aspects of the Course assessment (ie the Course components). Course components are assessed in different ways. Before assembling the evidence for Estimates, you need to have a clear understanding of the structure and assessment requirements of the Course and the components that contribute to the Course award, using weightings where appropriate.

Evidence for Estimates: models of Course assessment

Course assessment can take the form of a single component, eg an externally assessed Question Paper; or it can be a combination of more than one component, eg an externally assessed Question Paper and Folio.

Understanding what is required

Appendix 1 of this guide provides extracts from Course Grade Descriptions, and Appendix 2 lists the bands, grades and associated marks used to assess National Courses. This is all valuable information which we recommend you refer to when preparing your Estimates. As you prepare your Estimates you may find it helpful to ask yourself the following questions:

♦ Do your Estimates reflect attainment in the components of the Course assessment in proportion to their relative weight in the overall Course award? For example:

Question-paper-based Courses
• If there are any, have you included the practical components of the Course (eg oral test, listening test, performance, folio)?
• Have you given due weight in your Estimates to any coursework that is part of the Course assessment (eg folio, project, assignment)?

Project-based Courses
• Do your Estimates reflect attainment in all parts of the project (planning, development, evaluation), in proportion to their relative weight in the external assessment of the project?

♦ Do the Unit assessments (eg NABs) have the potential to generate any evidence of the Course requirements? Will Unit assessment evidence match the requirements outlined in the Course Grade Descriptions?
♦ Do your Estimates reflect later as well as earlier attainment in the Course? This is particularly important for subjects with an emphasis on skill development rather than subject content.
♦ Does the evidence on which Estimates are based cover only part of the Course? It is important to take account of this, since the level of demand could be less than that of the Course assessment.
♦ Is the evidence on which Estimates are based generated by two assessments separated by a period of time? Again, the level of demand could be less than that of the Course assessment.
♦ Have all the assessments that contribute to the Estimates undergone internal verification? This will ensure consistency across all assessors for that Course in your centre.

The most common form of evidence used for Question-Paper-based Courses for the purposes of Estimates is generated by Prelims, where these reflect the nature of SQA’s Question Paper component of the Course assessment.
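The weighting questions above amount to a weighted-average calculation over the Course components. A minimal sketch follows, assuming component marks are expressed as percentages and weightings as fractions; the function name and dictionary layout are illustrative, not SQA’s.

```python
def estimate_mark(component_marks, weights):
    """Combine component percentages into a weighted Course estimate.

    component_marks: {component name: percentage achieved}
    weights: {component name: weighting as a fraction of the Course award}
    Both dicts must cover the same components (illustrative layout).
    """
    if set(component_marks) != set(weights):
        raise ValueError("every assessed component needs a weighting")
    total_weight = sum(weights.values())
    # Weighted average, normalised in case the weightings do not sum to 1
    return sum(component_marks[c] * weights[c] for c in weights) / total_weight
```

For a hypothetical Course weighted 75% Question Paper and 25% Folio, a candidate scoring 60% and 80% respectively would carry a weighted estimate of 65%.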

APPENDIX 3
Prelims

Prelims should, and generally do, replicate Course assessments (ie Question Papers) in standard, format, duration and security. They should also replicate Course assessments in the marking standards applied, and are normally taken under the same time constraints and supervised conditions as the Course assessment. We do not require you to carry out prelims, though some integrated assessment, conducted internally and related to the Course Grade Descriptions, is recommended for most Courses.

Prelims can serve a variety of purposes for both centres and candidates. These include:
• producing diagnostic information that helps to inform teaching and learning
• producing predictive information that helps in decisions on Estimates
• practice for candidates in an externally-assessed Question Paper, reflecting the type, length and complexity of tasks; the length of the test; the time available; and the conditions for assessment of the Course
• producing evidence for an Appeal

A properly constructed Prelim can also be used to generate evidence of Unit attainment. The most effective Prelims replicate the format of the Course assessment and the way candidate performance is measured there. The Marking Instructions used, and their application, should correspond as closely as possible to the exemplars provided by SQA.

The way prelim question papers are produced can vary. They tend to fall into three main categories: centre-produced papers; papers generated co-operatively by a group of centres; and commercially-produced question papers. All of these have the potential to generate robust evidence for Appeals. In all cases, you are responsible for the validity, reliability and security of the evidence that you submit to SQA, so you must exercise care in how these question papers are selected and used.

Use of SQA Question Papers

Centres generating their own prelims sometimes draw heavily on past SQA papers for their questions. Please note, though, that a past SQA paper in its entirety will not be accepted as evidence to support an Appeal. SQA papers are available publicly, complete with Marking Instructions, so candidates have full access to them. SQA specimen papers are not acceptable in their entirety unless we have indicated otherwise for a specific Course. However, it is possible to use a judicious selection of questions drawn from a range of past papers, and preferably adapted, to make up a Prelim paper. In such instances, using at least three past papers is recommended. Sets of questions should not be lifted en bloc from past papers.

The difficulty of creating new questions and question papers from scratch is appreciated. However, doing so can strengthen the security of Question Papers: the authenticity of the evidence is less likely to be compromised by candidates having prior sight of the questions or tasks than when using question papers in the public domain.

Use of commercially-produced Question Papers

Many centres make use of commercially-produced Question Papers to estimate candidates’ expected performance in the Course assessment and to generate evidence to support Appeals. A well-designed commercially-produced Question Paper can provide valid and reliable evidence for Estimates, and full or partial evidence to support an Appeal. The use of these question papers is both convenient for centres and acceptable to SQA, provided that our guidance on validity, reliability and security is adhered to.

Producers of commercial question papers make considerable efforts to meet the Course requirements for validity and reliability. However, SQA does not prior-moderate commercially-produced Question Papers, and there is no guarantee that they meet all Course requirements. You are responsible for the validity and reliability of the assessment evidence submitted to support Appeals, so you should evaluate these papers in the same way as you would locally-produced papers.

Only the current year’s commercial papers will be accepted in their entirety for Appeals. Because these papers and their associated Marking Instructions find their way into the public domain, giving candidates opportunities for improper access to them, past papers cannot be accepted in their entirety to support Appeals. (As noted above, this applies to SQA papers too.) Previous years’ commercially-produced Question Papers can be used in the same way as past SQA papers: you can use a judicious selection of questions from a range of past commercially-produced papers, preferably adapted, to make up a prelim paper. If you do this, we recommend using at least three past papers. Sets of questions should not be lifted en bloc from past papers, apart from the exceptions detailed in the subject-specific information in Part 2 of this guide. Ensure consecutive questions do not come from the same paper. All assessments conducted internally should be subjected to a moderation process to ensure consistency of application and marking standards.

Prelims: common failings

While Prelim papers generally do replicate the standard, format, duration and security of the Course assessment, there are occasions when they do not, and candidates can be disadvantaged as a result. Common failings are:
♦ Prelims that are statistically invalid: they do not carry enough marks to test the candidate reliably. Prelims should have total mark allocations and Element mark allocations that reflect those of the Course assessment.

♦ Insufficiently comprehensive prelims, where the Course Grade Descriptions are not fully sampled. This is not uncommon, because prelim examinations often take place before a Course has been completed. In such cases, the prelim evidence should be supplemented by evidence of additional skills developed later in the Course. For many National Courses where Appeals are for a grade C, NAB materials may provide the basis of useful supplementary material.
♦ Inconsistent marking within a centre. It is essential that a consistent standard be applied to the marking of scripts. Before marking begins, everyone involved should discuss and confirm the Marking Instructions and their application, and there should be some checking of the standards of marking across the marking team.
♦ Unmarked prelim scripts. Some centres submit unmarked prelim scripts, making it impossible for the Examiner to gauge the standards that have been applied by the centre. Unmarked scripts do not constitute valid evidence, even where they are accompanied by Marking Instructions.
♦ No indication of the cut-off scores applied by the centre. Cut-off scores are essential information to enable Examiners to gauge the standards being applied by the centre. Evidence not supported by the cut-off scores is invalid.
♦ Unacceptably low cut-off scores. For National Courses, cut-off scores should be set at approximately 70% for grade A and 50% for grade C, with grade B falling midway. Note, however, that these cut-off scores may be lowered slightly if the paper turns out to be more demanding than intended, or raised if it is less demanding. This adjustment of cut-off scores takes place in SQA annually for each Course, after all Course assessments have been completed, and the data is published on our website in the Principal Assessors’ reports.
♦ Composite papers in which all candidates, irrespective of ability, are tested on a single paper. Some centres have used this approach at Standard Grade rather than using three discrete papers, one for each level. The prelim examination should mirror the Course assessment in terms of structure and demand, so a single-paper approach is only acceptable where it also operates in SQA examinations.
♦ A level of demand less than that of the Course assessment. This can happen in some subjects when a prelim covering all Units is split into its constituent papers or parts, and the papers or parts are separated by a period of time rather than being taken on one occasion. The level of demand can be increased in a number of ways, such as raising the cut-off scores or increasing the level of challenge of the prelim.

The timing of a prelim needs to be considered carefully. To optimise diagnosis, assessment should be earlier rather than later in the Course. On the other hand, for the purposes of Estimates and Appeals, assessment should be as late as possible so that there is maximum coverage of content and/or growth of competence. Where the Estimate is based on an assessment covering only part of the Course, the level of demand will be less than in the Course assessment.
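The cut-off guidance above (approximately 70% for grade A, 50% for grade C, with B midway) is simple arithmetic, and can be sketched as follows. This is illustrative only: the function and its defaults are assumptions, and real cut-offs may be adjusted where a paper proves more or less demanding than intended.

```python
def grade(mark, total, a_cut=0.70, c_cut=0.50):
    """Grade a prelim mark using the guideline cut-off scores:
    roughly 70% for grade A, 50% for grade C, with B falling midway.
    The cut-offs are parameters because centres may adjust them
    slightly for a more or less demanding paper.
    """
    b_cut = (a_cut + c_cut) / 2  # "grade B falling midway"
    pct = mark / total
    if pct >= a_cut:
        return "A"
    if pct >= b_cut:
        return "B"
    if pct >= c_cut:
        return "C"
    return "No award"
```

With the default cut-offs, a mark of 60 out of 100 falls exactly on the midway B boundary.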

You should ensure that assessment evidence is produced under supervised conditions, similar to those set for the Course assessment. The reliability of the evidence will be compromised if there are doubts about its authenticity, eg if the prelim is used by different groups of candidates within a centre on different occasions. The tasks set should be unseen by the candidates, and should be administered under supervised conditions in accordance with the conditions for the Course assessment. Assessment instruments and Marking Instructions should be stored securely to prevent candidates gaining access to them outwith formal assessment arrangements.
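The prelim-construction rules in this appendix (questions drawn from at least three past papers, no consecutive questions from the same source, and the source of each question recorded) can be sketched as a round-robin over the source papers. The function is an illustrative sketch, not an SQA tool.

```python
def build_prelim(sources):
    """Interleave questions from several past papers so that no two
    consecutive questions share a source, recording the source of each.

    sources: {paper name: [question, ...]} with at least three papers,
    per the policy. Returns a list of (source, question) pairs.
    """
    if len(sources) < 3:
        raise ValueError("policy requires questions from at least three papers")
    pools = {name: list(qs) for name, qs in sources.items()}
    prelim, last = [], None
    while any(pools.values()):
        # Pick the non-empty source with the most questions left,
        # excluding the source used for the previous question.
        candidates = [n for n, qs in pools.items() if qs and n != last]
        if not candidates:
            raise ValueError("cannot avoid consecutive questions from the same paper")
        name = max(candidates, key=lambda n: len(pools[n]))
        prelim.append((name, pools[name].pop(0)))
        last = name
    return prelim
```

Because each entry keeps its source name, the output doubles as the record of where each question came from, which the checklist in Appendix 1 asks for.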

APPENDIX 4
Using evidence generated by NABs

The primary purpose of NAB material is to assess candidates’ attainment of the learning outcomes of National Units. This is different from assessing candidates’ attainment in components that contribute to a Course award, and measuring the combined attainment against the Course Grade Descriptions. The difference normally lies in a requirement for candidates to retain, integrate and apply knowledge and skills acquired for a Course. The additionality involved in integration, retention and application of the knowledge and skills gained is what makes a Course greater than the sum of its parts: the nature of performance, even at grade C in a Course, implies a greater level of attainment than is required by the Units that form part of a Course. Please refer to the subject-specific guidance in Part 2 for details on the extent to which NABs can be used to generate evidence for Estimates and Appeals.

APPENDIX 5

Principles of Assessment

SQA defines assessment as measuring evidence of candidates’ attainment of knowledge and skills against qualification standards. There are two modes of assessment: internal and external. Internal assessment is where centres apply assessment instruments and make assessment decisions about candidate evidence. Centres may also devise the assessments, but this does not apply equally across all SQA qualifications. External assessment is where the awarding body takes on these duties and centres administer assessment activities on its behalf. In common with all awarding bodies, we strive to ensure that assessment of our qualifications is valid, reliable and practicable. We also aim to make it flexible and cost-effective.

Validity

Each assessment should be designed so that it provides candidates with the opportunity to produce evidence to show they have the knowledge and skills needed to meet the requirements of the qualification. An assessment is valid when it:
♦ is appropriate to purpose (eg a practical assessment should be used to assess practical skills)
♦ allows the production of evidence of candidates’ performance which can be measured against the standards defined in the qualification
♦ allows candidates to produce sufficient evidence of all the skills and knowledge required to satisfy the standards in the qualification
♦ facilitates the making of reliable assessment decisions by all assessors for all candidates
♦ is accessible to all candidates who are potentially able to achieve it

Reliability

To be reliable, assessment decisions on candidates’ performance must be consistent between all assessors and for all candidates undertaking the same assessment task. In any assessment system, procedures have to be put in place to ensure this. Assessment decisions are reliable when they are based on evidence that is:
♦ generated by valid assessments
♦ generated under consistently-applied conditions of assessment (eg open-book, supervised or invigilated)
♦ the authenticated work of the candidates being assessed
and when they are:
♦ taken on the basis of clearly-defined performance and/or grade-related criteria
♦ consistent across the range of assessors applying the assessment in different situations and contexts, and with different candidates
♦ consistent over time

Practicability

For assessments to be practicable (ie capable of being carried out both efficiently and cost-effectively) there have to be adequate resources and time. Your assessment system should have the flexibility to meet the needs of all candidates. Examples of issues associated with practicability are:
♦ in the context of oral assessments or interviews, balancing the need for assessment reliability against considerations of staff and candidate time and potential stress
♦ in the context of assessing practical skills, bearing in mind any resource implications