
College of Nursing

January 2011

Best Practices for Writing Objective Test Items

January 7, 2011, Academic Effectiveness and Assessment

Writing Objective Test Items

Presenter: Dr. James Coraggio, Director, Academic Effectiveness and Assessment
Contributor: Alisha Vitale, Collegewide Testing Coordinator


Writing Objective Test Items

Former Life…
Director of Test Development, SMT
Director of Measurement and Test Development, Pearson
Taught EDF 4430 Measurement for Teachers, USF


Purpose

This presentation will address the importance of establishing a test purpose and developing test specifications.

This presentation will explain how to create effective multiple choice test questions.

The presentation will provide item-writing guidelines as well as best practices to prevent students from just guessing the correct answers.


Agenda

Purpose of a Test
Prior to Item Writing
Advantages of Objective Tests
Types of Objective Tests
Writing Multiple Choice Items
The Test-wise Student
Test Instructions
Test Validity


Purpose of a Test

To clearly delineate between those who know the content and those who do not.

To determine whether the student knows the content, not whether the student is a good test-taker. Likewise, confusing and tricky questions should be avoided to prevent incorrect responses from students who know (and understand) the material.


Prior to Writing Items

Establish the test purpose
Conduct the role delineation study/job analysis
Create the test specifications


Establish the Test Purpose

Initial Questions
How will the test scores be used?
Will the test be designed for minimum competency or content mastery?
Will the test be low-stakes, moderate-stakes, or high-stakes (consequences for examinees)?
Will the test address multiple levels of thinking (higher order, lower order, or both)?
Will there be time constraints?


Establish the Test Purpose

Responses to those initial questions have implications for the overall length of the test, the average difficulty of the items, the conditions under which the test will be administered, and the type of score information to be provided.

Take the time to establish a singular purpose that is clear and focused so that goals and priorities will be effectively met.


Conduct the Job Analysis

The primary purpose of a role delineation study or job analysis is to provide a strong linkage between competencies necessary for successful performance on the job and the content on the test.

This work has already been conducted for the National Council Licensure Examination for Registered Nurses (NCLEX-RN®). [See Report of Findings from the 2008 RN Practice Analysis: Linking the NCLEX-RN® Examination to Practice, NCSBN, 2009]


Create Test Specifications

Test specifications are essentially the ‘blueprint’ used to create the test.

Test specifications operationalize the competencies that are being assessed.

The NCLEX-RN® Examination has established test specifications. [See 2010 NCLEX-RN® Detailed Test Plan, April 2010, Item Writer/Item Reviewer/Nurse Educator Version]

Create Test Specifications

[Example test specification tables shown on three slides; not reproduced in this transcript]

Create Test Specifications

Test specifications:
Support the validity of the examination
Provide standardized content across administrations
Allow for subscores that can provide diagnostic feedback to students and administrators
Inform the student (and the item writers) of the required content
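Operationally, a test blueprint can be kept as a simple mapping from content area to item count and checked against the intended test length. A minimal sketch of that idea; the per-area counts and the 50-item length below are invented for illustration, not taken from any published test plan:

```python
# Hypothetical blueprint: content area -> number of items allocated to it.
blueprint = {
    "Safe and Effective Care Environment": 20,
    "Health Promotion and Maintenance": 10,
    "Physiological Integrity": 15,
    "Psychosocial Integrity": 5,
}

TEST_LENGTH = 50  # intended total number of items (assumed)

def check_blueprint(blueprint, test_length):
    """Verify the per-area counts sum to the intended test length and
    return each area's percentage weight (useful for subscore reporting)."""
    total = sum(blueprint.values())
    if total != test_length:
        raise ValueError(f"Blueprint allocates {total} items; expected {test_length}")
    return {area: count / test_length for area, count in blueprint.items()}

weights = check_blueprint(blueprint, TEST_LENGTH)
print(weights["Physiological Integrity"])  # 0.3
```

A check like this keeps the item pool honest against the specifications before any items are written.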


Item Development

After developing the test specifications, item development can begin.

The focus of the remainder of the presentation will be on creating ‘appropriate’ objective items.


Objective Tests

Measure several types (and levels) of learning
Cover wide content in a short period of time
Offer variations for flexibility
Easy to administer, score, and analyze
Scored more reliably and quickly

What type of learning cannot be measured?
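“Easy to administer, score, and analyze” is concrete for selected-response items: scoring reduces to comparing each response against a fixed key. A minimal sketch; the key and responses are invented for illustration:

```python
# Hypothetical 5-item answer key; a real key comes from the test author.
KEY = ["A", "C", "B", "D", "A"]

def score(responses, key=KEY):
    """Number-correct score: one point per response that matches the key."""
    return sum(1 for given, correct in zip(responses, key) if given == correct)

print(score(["A", "C", "D", "D", "A"]))  # 4 (item 3 answered D instead of B)
```

This objectivity in scoring is exactly what essay-type items give up, which is why they trade speed and reliability for richer evidence of student thinking.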


Types of Objective Tests

Written-response
Completion (fill-in-the-blank)
Short answer

Selected-response
Alternative response (two options)
Matching
Keyed (like matching)
Multiple choice


Written-response

Single questions/statements or clusters (stimuli)

Advantages
Measure several types of learning
Minimizes guessing
Points out student misconceptions

Disadvantages
Time to score
Misspelling and writing clarity
Incomplete answers
More than one possible correct response (novel answers)
Subjectivity in grading


Completion

A word that describes a person, place or thing is a ________.

1. Remove only ‘key’ words
2. Blanks at end of statement
3. Avoid multiple correct answers
4. Eliminate clues
5. Paraphrase statements
6. Use answer sheets to simplify scoring


Short Answer

Briefly describe the term proper noun. ____________________________

Terminology – Stimulus and Response
1. Provide an appropriate blank (word(s) or sentence).
2. Specify the units (inches, dollars).
3. Ensure directions for clusters of items are appropriate for all items.


Selected-response

Select from provided responses

Advantages
Measure several types of learning
Measures ability to make fine distinctions
Administered quickly
Cover wide range of material
Reliably scored
Multiple scoring options (hand, computer, scanner)

Disadvantages
Allows guessing
Distractors can be difficult to create
Student misconceptions not revealed
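The “allows guessing” disadvantage is why some objective tests apply a formula score: on k-option items a blind guesser averages one right answer per k−1 wrong ones, so the classic correction is rights − wrongs/(k−1). A sketch of that standard formula (it is offered here as background, not as a practice this presentation prescribes):

```python
def formula_score(num_right, num_wrong, k):
    """Classic correction for guessing on k-option items: R - W/(k-1).
    Omitted items are neither rewarded nor penalized."""
    return num_right - num_wrong / (k - 1)

# A student with 30 right and 12 wrong on 4-option items (invented data):
print(formula_score(30, 12, 4))  # 26.0
```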


Alternative Response

T F 1. A noun is a person, place, or thing.
T F 2. An adverb describes a noun.

1. Explain judgments to be made
2. Ensure answer choices match
3. Explain how to answer
4. Only one idea to be judged
5. Positive wording
6. Avoid trickiness, clues, qualifiers


Matching Item

Column A                                  Column B
__ Person, place, or thing.               a. Adjective
__ Describes a person, place, or thing.   b. Noun

Terminology – premises and responses
1. Clear instructions
2. Homogeneous premises
3. Homogeneous responses (brief and ordered)
4. Avoid one-to-one matching


Keyed Response

Responses
a. A noun
b. A pronoun
c. An adjective
d. An adverb

___ Person, place, or thing.
___ Describes a person, place, or thing.

Like matching items, but with more response options


MC Item Format

What is the part of speech that is used to name a person, place, or thing?

A) A noun*
B) A pronoun
C) An adjective
D) An adverb


MC Item Terminology

Stem: Sets the stage for the item; a question or incomplete thought; should contain all the information needed to select the correct response.

Options: The possible responses, consisting of one and only one correct answer.

Key: The correct response.

Distractor: A wrong response; plausible but not correct, and attractive to an under-prepared student.


Competency

Items should test for the appropriate or adequate level of knowledge, skill, or ability (KSA) for the students.

Assessing lower-division students on graduate-level material is an ‘unfair’ expectation.

The competent student should do well on an assessment; items should not be written only for the top students in the class.


Clarity

Clear, precise items and instructions
Correct grammar, punctuation, spelling
Address one single issue
Avoid extraneous material (teaching)
One correct or clearly best answer
Legible copies of the exam


Bias

Tests should be free from bias:
No stereotyping
No gender bias
No racial bias
No cultural bias
No religious bias
No political bias


Level of Difficulty

Ideally, items should be aimed at a middle level of difficulty. This cannot always be achieved when the subject matter is based on specific expectations (i.e., a workforce area).


Level of Difficulty

To make an M/C item more difficult, make the stem more specific or narrow and the options more similar.

To make an M/C item less difficult, make the stem more general and the options more varied.
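In item-analysis terms, difficulty is usually expressed as the p-value: the proportion of examinees who answer the item correctly, with “middle difficulty” landing roughly in the .5 to .7 range. A minimal sketch; the 0/1 response data are invented:

```python
def difficulty(item_scores):
    """Item difficulty (p-value): the proportion of examinees who answered
    the item correctly. item_scores is a list of 0/1 values, one per examinee."""
    return sum(item_scores) / len(item_scores)

# Eight examinees' right/wrong (1/0) results on one item (invented data):
print(difficulty([1, 1, 0, 1, 0, 1, 1, 0]))  # 0.625
```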


Trivial and Trick Questions

Avoid trivia and tricks.
Avoid humorous or ludicrous responses.
Items should be straightforward: they should cleanly delineate those who know the material from those who do not.
Make sure every item has value and is contributing to the final score.
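One common check that an item is “contributing to the final score” is its discrimination: whether examinees who score high overall tend to get the item right. A simple upper/lower-group discrimination index, sketched here with invented data:

```python
def discrimination(item_scores, total_scores):
    """Upper-lower discrimination index: p(upper half) - p(lower half).
    Positive values mean stronger examinees get the item right more often;
    values near zero (or negative) flag items worth reviewing."""
    ranked = [item for _, item in sorted(zip(total_scores, item_scores))]
    half = len(ranked) // 2
    lower, upper = ranked[:half], ranked[-half:]
    return sum(upper) / half - sum(lower) / half

# Six examinees: 0/1 scores on one item, and their total test scores (invented):
item = [1, 0, 1, 1, 0, 1]
totals = [40, 22, 35, 38, 25, 30]
print(discrimination(item, totals))
```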


Test Taking Guidelines

When you don’t know the answer
As with all exams, attempt the questions that are easiest for you first. Come back and do the hard ones later. Unless you will lose marks for an incorrect response, never leave a question blank. Make a calculated guess if you are sure you don’t know the answer. Here are some tips to help you guess ‘intelligently’.

Use a process of elimination
Try to narrow your choice as much as possible: which of the options is most likely to be incorrect? Ask: are the options in the right range? Is the measurement unit correct? Does it sound reasonable?


http://www.services.unimelb.edu.au/asu/download/Study-Multiple-ChoiceExams-Flyer.pdf


Test Taking Guidelines

Look for grammatical inconsistencies
In extension-type questions, a choice is nearly always wrong if the question and the answer do not combine to make a grammatically correct sentence. Also look for repetition of key words from the question in the responses. If words are repeated, the option is worth considering. e.g.:

The apparent distance hypothesis explains…
b) The distance between the two parallel lines appears…



Test Taking Guidelines

Be wary of options containing definitive words and generalizations
Because they can’t tolerate exceptions, options containing words like ‘always’, ‘only’, ‘never’, and ‘must’ tend to be incorrect more often. Similarly, options containing strong generalizations tend to be incorrect more often.

Favor look-alike options
If two of the alternatives are similar, give them your consideration. e.g.:
A. tourism consultants
B. tourists
C. tourism promoters
D. fairy penguins



Test Taking Guidelines

Favor numbers in the mid-range
If you have no idea what the real answer is, avoid extremes.

Favor more inclusive options
If in doubt, select the option that encompasses the others. e.g.:
A. an adaptive system
B. a closed system
C. an open system
D. a controlled and responsive system
E. an open and adaptive system

Please note: None of these strategies is foolproof, and they do not apply equally to the different types of multiple choice questions, but they are worth considering when you would otherwise leave a blank.



Test-wise Students

Are familiar with item formats
Use informed and educated guessing
Avoid common mistakes
Have testing experience
Use time effectively
Apply various strategies to solve different problem types


Test-wise Students

Vary your keys: defeat ‘Always pick option C.’
Avoid ‘all of the above’ and ‘none of the above.’
Avoid extraneous information: it may assist in answering another item.
Avoid item ‘bad pairs’ or ‘enemies.’
Avoid clueing with the same word in the stem and the key.


Test-wise Students

Make options similar in length, grammar, and sentence structure; an option that differs stands out. Avoid ‘clues.’
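The “vary your keys” advice can be checked mechanically: tally how often each option letter is the key and flag any letter that dominates. A quick sketch; the answer key and the 0.15 tolerance are invented for illustration:

```python
from collections import Counter

def key_balance(key, tolerance=0.15):
    """Report each option letter's share of the key and flag over-used letters.
    Flags letters whose share exceeds an even split by more than `tolerance`."""
    counts = Counter(key)
    n = len(key)
    even = 1 / len(counts)  # even share across the letters that appear
    shares = {letter: count / n for letter, count in counts.items()}
    flagged = [letter for letter, share in shares.items() if share > even + tolerance]
    return shares, flagged

# A hypothetical 10-item key where 'C' appears too often:
shares, flagged = key_balance(list("CACCBCDCAC"))
print(flagged)  # ['C']
```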


Item Format Considerations

Put the needed information in the stem
Avoid negatively stated stems and qualifiers
Highlight qualifiers if used
Avoid irrelevant symbols (“&”) and jargon
Use a standard, set number of options (prefer only four)
Ideally, tie each item to a reference (and rationale)


Test Directions

Highlight Directions

1. State the skill measured.
2. Describe any resource materials required.
3. Describe how students are to respond.
4. Describe any special conditions.
5. State time limits, if any.


Ensure Test Validity

Congruence between items and course objectives
Congruence between items and student characteristics
Clarity of items
Accuracy of the measures
Item formatting criteria
Feasibility: time, resources


Questions

College of Nursing

January 2011

Best Practices for Writing Objective Test Items