Transcript
Page 1: Performance Assessment OSI Workshop June 25 – 27, 2003 Yerevan, Armenia

Performance Assessment

OSI Workshop, June 25 – 27, 2003, Yerevan, Armenia

Ara Tekian, PhD, MHPE, University of Illinois at Chicago

Page 2:

Presentation Outline

• Characteristics, types, strengths, and limitations

• Five factors to consider when making a performance assessment

• Checklists and rating scales

• Portfolios

• Video presentation & exercise (workshop)

Page 3:

Performance Assessment (PA)

• PA can be distinguished from the traditional paper-and-pencil test by a number of characteristics:
– Greater realism of tasks
– Greater complexity of tasks
– Greater time needed for assessment
– Greater use of judgment in scoring

Linn & Gronlund, 1995

Page 4:

Performance tasks appear in many forms:

• Solving realistic problems
• Oral or psychomotor skills without a product
• Writing or psychomotor skills with a product

(Various types of performance may be restricted or extended)

Page 5:

Restricted Performance

• Highly structured and limited in scope

• Examples:
– Write a one-page report
– Give a one-minute speech
– Construct a graph from a given set of data
– Demonstrate how to set up lab equipment

Page 6:

Extended Performance

• Less structured and broad in scope

• Examples:
– Design and conduct an experiment on a selected topic, present and defend the findings
– Take the history of a patient, perform a physical examination, and diagnose and write a management plan

Page 7:

Performance Assessment - Strengths

• Can evaluate complex learning outcomes and skills

• Provides a more natural, direct, and complete evaluation of some types of reasoning, oral, and physical skills

• Provides greater motivation for students by clarifying goals and making learning more meaningful

• Encourages the application of learning to “real life” situations

Page 8:

Performance Assessment - Limitations

• Requires considerable time and effort to use

• Judging and scoring performance is subjective, burdensome, and typically has low reliability

• Evaluation must frequently be done individually, rather than in groups

Page 9:

Need for Performance Assessment

• The shift from norm-referenced measurement to criterion-referenced measurement

• The need to focus on more complex learning outcomes (reasoning and thinking skills), to use more comprehensive student projects based on “real life” problems, and to engage students in the activities and in constructing meaning from them

Page 10:

Main factors to consider when making PA

1. Specifying the performance outcome
2. Selecting the focus of the assessment (procedure, product, or both)
3. Selecting an appropriate degree of realism
4. Selecting the performance situation
5. Selecting the method of observing, recording, and scoring

Page 11:

1. Specifying the performance outcome

• E.g., a research project might include intended learning outcomes such as:
– Selects an appropriate research task
– Designs and conducts an experiment
– States valid conclusions
– Writes a critique of the procedure and findings

Page 12:

Typical Action Verbs for restricted performance outcomes

• Identify, locate, select, describe

• Construct, design, draw, prepare

• Demonstrate, measure, perform, set up

Page 13:

2. Selecting the focus of the assessment

• Performance assessment can focus on the procedure, the product, or some combination of the two. The nature of the performance frequently dictates where the emphasis should be placed.

Page 14:

Assessing the Procedure

• Focus the PA on the procedure when:
– There is no product, or product evaluation is infeasible (e.g., unavailable or too costly)
– The procedure is orderly and directly observable
– Correct procedure is crucial to later success
– Analysis of procedural steps can aid in improving a product

Page 15:

Assessing the Product

• PA should be focused on the product when:
– Different procedures can result in an equally good product
– The procedure is not available for observation
– The procedural steps have been mastered
– The product has qualities that can be clearly identified and judged

Page 16:

3. Selecting an appropriate degree of realism

• Although we cannot expect to duplicate the natural situation in which the learning will later be used, we can strive for performance assessments that approximate “real world” conditions.

Page 17:

4. Selecting the Performance Situation

• PA can be classified by the type of situation or setting used:
– Paper-and-pencil performance
– Identification test
– Structured performance test
– Simulated performance
– Work sample
– Extended research project

Page 18:

5. Selecting the Method of Observing, Recording, and Scoring

• Whether judging procedures, products, or some combination of the two, some type of guided observation and a method of recording and scoring the results are needed:
– Systematic observation and anecdotal records
– Checklists
– Rating scales

Page 19:

Systematic observation and anecdotal records

• Observations are frequently guided by checklists or rating scales.

• An anecdotal record is a brief description of some significant event. It typically includes the observed behavior, the setting in which it occurred, and a separate interpretation of the event.

Page 20:

Anecdotal Records

• Are likely to be most useful when:
– They focus on meaningful incidents
– They are recorded soon after the incident
– They contain enough information to be understandable later
– The observed incident and its interpretation are kept separate

Page 21:

Checklists (Advantages)

• A list of performance items for which raters indicate the presence or absence of the behaviors they have observed

• Two purposes:
– Descriptors direct the attention of the rater
– Boxes provide a means of recording performance judgments
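As an illustration (not from the slides), a checklist of this kind can be represented as a list of observable items with a present/absent mark per item; the item texts below are hypothetical:

```python
# Hypothetical checklist for a structured performance test:
# each item is an observable behavior the rater marks present or absent.
CHECKLIST = [
    "Washes hands before the procedure",
    "Explains the procedure to the patient",
    "Sets up equipment correctly",
    "Performs the steps in the correct order",
]

def score_checklist(observed: dict) -> float:
    """Return the fraction of checklist items marked present."""
    done = sum(1 for item in CHECKLIST if observed.get(item, False))
    return done / len(CHECKLIST)

# One rater's all-or-none judgments for one student
ratings = {
    "Washes hands before the procedure": True,
    "Explains the procedure to the patient": True,
    "Sets up equipment correctly": False,
    "Performs the steps in the correct order": True,
}
print(score_checklist(ratings))  # 0.75
```

Note how each item is a single attribute with a yes/no box, matching the all-or-none character of checklists described on the next slide.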

Page 22:

Checklists - Focus

• Focus on the procedure when there is NO product

• Focus on product when:– You have a choice– The procedure is not available for observation

Page 23:

Checklists - Limitations

• Checklists are for all-or-none decisions

• They create a conflict for respondents when the item contains more than one performance attribute

• Can force a judgment when there is no basis for the judgment

Page 24:

Rating Scales (Advantages)

• Contain a list of attributes with a range of responses, e.g., Very Good, Good, … Poor; Strongly Agree, Agree, Disagree, …

• Rating scales direct raters’ attention to certain performance dimensions and provide a way to record judgments

• Cover areas that are not covered well by other methods (e.g., MCQs):
– Flexibility of use
– Low cost
– Unobtrusiveness

Page 25:

Rating Scales - Limitations

• Often filled out retrospectively

• Raters often lack agreement; obtain ratings from many raters for each student

• Associated with certain types of errors:
– Leniency
– Range restriction
– Halo effect
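The lack of agreement among raters can be quantified. As an illustrative sketch (not part of the slides), Cohen's kappa measures how much two raters agree beyond what chance alone would produce; the rater data below are hypothetical:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: agreement between two raters corrected for chance.
    1.0 = perfect agreement; 0.0 = agreement no better than chance."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of exact agreement
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement by chance, from each rater's marginal frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two raters score the same 8 students on a 5-point rating scale
a = [5, 4, 4, 3, 5, 2, 4, 3]
b = [5, 4, 3, 3, 5, 2, 4, 2]
print(round(cohens_kappa(a, b), 2))  # 0.67
```

Values well below 1.0, as here, are the low-reliability problem the slide warns about, and the reason for collecting multiple ratings per student.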

Page 26:

My Advice

• Use checklists rather than rating scales when you have a choice

• Use checklists and rating scales immediately after observations

• Take care in describing performance attributes
• Use tailored descriptive phrases to anchor the points on the scale
• Include a category like “Not able to rate”

Page 27:

More Advice

• Provide room for comments to encourage constructive feedback

• Don’t ask raters to provide a grade
• Use 5 – 7 rating points on the scale
• Get multiple ratings of the same student (7 or more)
• Use a scoring guide or product scale when rating holistically
• Supplement checklists and rating scales

Page 28:

Portfolios

• A useful method of collecting, organizing, and evaluating samples of students’ work.

• Advantages:
– Learning progress over time
– Comparing work to past work
– Self-assessment skills
– Clear communication of learning progress

Page 29:

Effective Use of Portfolios

• Deciding what to include

• Deciding on criteria and standards

• Collecting the work samples

• Maintaining and using the portfolios

• Getting started

Page 30:

In conclusion

• PAs can provide useful information concerning student achievement, but they are subject to all the errors of observation and judgment, such as personal bias, generosity error, and halo effect.

• If PAs are to provide valid information, special care must be taken to improve the objectivity, reliability, and meaningfulness of the results.
