
Page 1: CBME and Assessment

CBME and Assessment

Page 2: CBME and Assessment

Competency-Based Medical Education is an outcomes-based approach to the design, implementation, assessment, and evaluation of a medical education program, using an organizing framework of competencies.

The International CBME Collaborators, 2009

Page 3: CBME and Assessment

Traditional versus CBME: Start with System Needs


Frenk J. Health professionals for a new century: transforming education to strengthen health systems in an interdependent world. Lancet. 2010

Page 4: CBME and Assessment

The Transition to Competency

Structure/Process: fixed length, variable outcome
• Knowledge acquisition
• Single subjective measure
• Norm-referenced evaluation
• Evaluation setting removed
• Emphasis on summative

Competency-Based Education: variable length, defined outcome
• Knowledge application
• Multiple objective measures
• Criterion-referenced evaluation
• Evaluation setting: direct observation
• Emphasis on formative

Carraccio et al., 2002

Page 5: CBME and Assessment

Miller’s Assessment Pyramid

• Knows: MCQ exam
• Knows how: extended matching / CRQ
• Shows how: standardized patients
• Does: faculty observation, audits, surveys; impact on patient

Page 6: CBME and Assessment

Training and Safe Patient Care

Trainee performance* × Appropriate level of supervision** must equal safe, effective, patient-centered care.

* A function of level of competence in context
** A function of attending competence in context

Page 7: CBME and Assessment

Educational Program: Structure/Process versus Competency-Based

Variable | Structure/Process | Competency-based
Driving force: curriculum | Content (knowledge acquisition) | Outcome (knowledge application)
Driving force: process | Teacher | Learner
Path of learning | Hierarchical (teacher→student) | Non-hierarchical (teacher↔student)
Responsibility for content | Teacher | Student and teacher
Goal of educational encounter | Knowledge acquisition | Knowledge application
Typical assessment tool | Single subjective measure | Multiple objective measures
Assessment tool | Proxy | Authentic (mimics real tasks of profession)
Setting for evaluation | Removed (gestalt) | Direct observation
Evaluation | Norm-referenced | Criterion-referenced
Timing of assessment | Emphasis on summative | Emphasis on formative
Program completion | Fixed time | Variable time

Carraccio et al., 2002

Page 8: CBME and Assessment

Assessment “Building Blocks”

• Choice of the right outcomes, tied to an effective curriculum (step 1!)
• The right combination of assessment methods and tools: mini-CEX, DOPS, chart-stimulated recall (CSR), medical record audit
• Effective application of the methods and tools
• Effective processes to produce good judgments

Page 9: CBME and Assessment

Measurement Tools: Criteria

Cees van der Vleuten’s utility index:

Utility = (V × R × A × E × C) / Context*

where:
V = validity
R = reliability
A = acceptability
E = educational impact
C = cost effectiveness

*Context = ∑ Microsystems
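Because the components are multiplied, weakness in any single component pulls the overall utility toward zero. Here is a minimal Python sketch of that behavior, following the formula as stated above; the 0-to-1 component scores are illustrative assumptions, since the index is a conceptual model rather than a formal scoring system:

```python
# Sketch of the utility index as given on this slide.
# Component values are hypothetical ratings on an assumed 0-1 scale.
def utility(validity, reliability, acceptability,
            educational_impact, cost_effectiveness, context=1.0):
    """Utility = (V * R * A * E * C) / Context.

    Because the terms are multiplied, a near-zero score on any one
    component drives overall utility toward zero, no matter how strong
    the other components are.
    """
    return (validity * reliability * acceptability *
            educational_impact * cost_effectiveness) / context

# A tool that is strong everywhere except acceptability still scores low.
print(utility(0.9, 0.8, 0.1, 0.9, 0.9))  # ~0.06
print(utility(0.7, 0.7, 0.7, 0.7, 0.7))  # ~0.17
```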

Page 10: CBME and Assessment

Criteria for “Good” Assessment¹

– Validity or coherence
– Reproducibility or consistency
– Equivalence
– Feasibility
– Educational effect
– Catalytic effect
  • This is the “new” addition; it relates to feedback that “drives future learning forward.”
– Acceptability

¹Ottawa Conference Working Group, 2010

Page 11: CBME and Assessment

Measurement Model

Donabedian Model (adapted)

• Structure: the way a training program is set up and the conditions under which the program is administered
  – Organization, people, equipment, and technology
• Process: the activities that result from the training program
• Outcomes: the changes (desired or undesired) in individuals or institutions that can be attributed to the training program

Page 12: CBME and Assessment

Assessment During Training: Components

Structured Portfolio
• ITE (formative only)
• Monthly evaluations
• Mini-CEX
• Medical record audit / QI project
• Clinical question log
• Multisource feedback
• Trainee contributions (personal portfolio)
  – Research project

Trainee
• Review portfolio
• Reflect on contents
• Contribute to portfolio

Advisor

Program Leaders
• Review portfolio periodically and systematically
• Develop an early warning system
• Encourage reflection and self-assessment

Clinical Competency Committee
• Periodic review (professional growth opportunities for all)
• Early warning systems

Program Summative Assessment Process

Licensing and Certification
• Licensure and certification in Qatar
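The flow above, in which many low-stakes data points from different tools are collected in one portfolio and then summarized for committee review, can be thought of as a simple aggregation pipeline. The following is a hypothetical Python sketch of that idea only; the class names, fields, and ratings are invented for illustration and do not represent an actual residency-management tool:

```python
# Hypothetical sketch: aggregate assessment data points by competency so a
# clinical competency committee can look across tools, not at single scores.
from dataclasses import dataclass, field
from collections import defaultdict

@dataclass
class AssessmentDataPoint:
    tool: str          # e.g. "mini-CEX", "ITE", "multisource feedback"
    competency: str    # e.g. "patient care", "medical knowledge"
    rating: float      # illustrative 1-5 scale
    narrative: str = ""

@dataclass
class Portfolio:
    trainee: str
    data_points: list = field(default_factory=list)

    def add(self, point: AssessmentDataPoint) -> None:
        self.data_points.append(point)

    def summary_by_competency(self) -> dict:
        """Group data points by competency and keep narratives alongside
        the numeric summary, since committees weigh both."""
        grouped = defaultdict(list)
        for p in self.data_points:
            grouped[p.competency].append(p)
        return {
            comp: {
                "n": len(points),
                "mean_rating": sum(p.rating for p in points) / len(points),
                "narratives": [p.narrative for p in points if p.narrative],
            }
            for comp, points in grouped.items()
        }

# Usage sketch
pf = Portfolio("Trainee A")
pf.add(AssessmentDataPoint("mini-CEX", "patient care", 4.0, "good rapport"))
pf.add(AssessmentDataPoint("medical record audit", "practice-based learning", 3.5))
pf.add(AssessmentDataPoint("ITE", "medical knowledge", 3.0))
print(pf.summary_by_competency())
```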

Page 13: CBME and Assessment

Model for Programmatic Assessment (with permission from C.P.M. van der Vleuten)

[Figure: a timeline of assessment activities, training activities, and supporting activities, with periodic committee review. Legend: learning task; learning artifact; single assessment data point; single certification data point for mastery tasks; learner reflection and planning; social interaction around reflection (supervision); learning task that is also an assessment task.]

Page 14: CBME and Assessment

Assessment Subsystem

An assessment subsystem is a group of people who work together on a regular basis to perform evaluation and provide feedback to a population of trainees over a defined period of time

This system has a structure to carry out evaluation processes that produce an outcome

The assessment subsystem must ultimately produce a valid entrustment judgment

Page 15: CBME and Assessment

Assessment Subsystem

This group shares:
– Educational goals and outcomes
– Linked assessment and evaluation processes
– Information about trainee performance
– A desire to produce a trainee truly competent (at a minimum) to enter practice or fellowship at the end of training

Page 16: CBME and Assessment

Assessment Subsystem

The subsystem must:
– Involve the trainees in the evaluation structure and processes
– Provide both formative and summative evaluation to the trainees
– Be embedded within, not outside, the overall educational system (assessment is not an “add-on”)
– Provide a summative judgment for the profession and the public

• Effective evaluation = professionalism

Page 17: CBME and Assessment

Subsystem Components

• Effective leadership
• Clear communication of goals
  – To both trainees and faculty
• Evaluation of competencies is multi-faceted
• Data and transparency
  – Involvement of trainees
  – Self-directed assessment and reflection by trainees
  – Trainees must have access to their “file”

Page 18: CBME and Assessment

Subsystem Components

• “Competency” committees
  – Need the wisdom and perspectives of the group
• Continuous quality improvement
  – The evaluation program must provide data as part of the CQI cycle of the program and institution
  – Faculty development
• Supportive institutional culture

Page 19: CBME and Assessment

Structured Portfolio: Multi-faceted Evaluation

[Figure: portfolio components mapped to the competencies they assess.
Tools: medical record audit and QI project; multisource feedback (directed per protocol, twice/year); mini-CEX (10/year); ITE (1/year); faculty evaluations; EBM/question log.
Competencies: patient care; medical knowledge; practice-based learning and improvement; systems-based practice; interpersonal skills and communication; professionalism.
Legend: ■ trainee-directed ■ direct observation]

Page 20: CBME and Assessment

Assessment During Training: Components

Structured Portfolio
• ITE (formative only)
• Monthly evaluations
• Mini-CEX
• Medical record audit / QI project
• Clinical question log
• Multisource feedback
• Trainee contributions (personal portfolio)
  – Research project

Trainee
• Review portfolio
• Reflect on contents
• Contribute to portfolio

Advisor

Program Leaders
• Review portfolio periodically and systematically
• Develop an early warning system
• Encourage reflection and self-assessment

Clinical Competency Committee
• Periodic review (professional growth opportunities for all)
• Early warning systems

Program Summative Assessment Process

Licensing and Certification
• USMLE
• American Board of Medical Specialties

Page 21: CBME and Assessment

Performance Data

A training program cannot reach its full potential without robust and ongoing performance data:
– Aggregation of individual trainee performance
– Performance measurement of the quality and safety of the clinical care provided by the training institution and the program

Page 22: CBME and Assessment

Competency Committees

Page 23: CBME and Assessment

Assessment During Training: Components

Structured Portfolio
• ITE (formative only)
• Monthly evaluations
• Mini-CEX
• Medical record audit / QI project
• Clinical question log
• Multisource feedback
• Trainee contributions (personal portfolio)
  – Research project

Trainee
• Review portfolio
• Reflect on contents
• Contribute to portfolio

Advisor

Program Leaders
• Review portfolio periodically and systematically
• Develop an early warning system
• Encourage reflection and self-assessment

Clinical Competency Committee
• Periodic review (professional growth opportunities for all)
• Early warning systems

Program Summative Assessment Process

Licensing and Certification
• USMLE
• American Board of Medical Specialties

Page 24: CBME and Assessment

Model for Programmatic Assessment (with permission from C.P.M. van der Vleuten)

[Figure: a timeline of assessment activities, training activities, and supporting activities, with periodic committee review. Legend: learning task; learning artifact; single assessment data point; single certification data point for mastery tasks; learner reflection and planning; social interaction around reflection (supervision); learning task that is also an assessment task.]

Page 25: CBME and Assessment

Committees and Information

Evaluation (“competency”) committees can be invaluable:
• Develop group goals
• “Real-time” faculty development
• Key for dealing with difficult trainees
• Key “receptor site” for frameworks/milestones
• Synthesis and integration of multiple assessments

Page 26: CBME and Assessment

“Wisdom of the Crowd”

Hemmer (2001): group conversations were more likely to uncover deficiencies in professionalism among students.

Schwind, Acad. Med. (2004):
• 18% of resident deficiencies requiring active remediation only became apparent through group discussion.
• Average discussion: 5 minutes per resident (range 1 to 30 minutes)

Page 27: CBME and Assessment

“Wisdom of the Crowd”

Williams, Teach. Learn. Med. (2005):
• No evidence that individuals in groups dominate discussions.
• No evidence of ganging up or piling on.

Thomas (2011): group assessment improved inter-rater reliability and reduced range restriction in multiple domains in an internal medicine residency.

Page 28: CBME and Assessment

Narratives and Judgments

Pangaro (1999): matching students to a “synthetic” descriptive framework (RIME) was reliable and valid across multiple clerkships.

Regehr (2007): matching students to a standardized set of holistic, realistic vignettes improved discrimination of student performance.

Regehr (2012): faculty-created narrative “profiles” (16 in all) were found to produce consistent rankings of excellent, competent, and problematic performance.

Page 29: CBME and Assessment

The “System”

[Figure: within the institution and program, assessments (direct observations, audit and performance data, multisource feedback, simulation, in-training exam) feed judgment and synthesis by a committee of residents, faculty, program directors, and others, with milestones and EPAs as the guiding framework and blueprint. The figure marks aggregation at the program level, feeding accreditation (ACGME/RRC, NAS milestones) and certification (ABIM, ABIM Fastrak), with no further aggregation outside the program.]

Page 30: CBME and Assessment

Questions