
Page 1: NBME U Lesson Catalog

• Self-guided, interactive lessons

• Evidence-based best practices

• Consistent and high-quality student assessments

The evidence is out there.

NBME U Lesson Catalog

YOUR PARTNER IN EVIDENCE-BASED ASSESSMENT

NBME℠

www.my.nbme.org

Page 2: NBME U Lesson Catalog

Reaching new horizons in student assessments.

Table of Contents

03 NBME U Overview

05 Assessment Principles
   How to Create a Good Score
   Purposes and Types of Assessments: An Overview
06 Score Reporting
   Test Blueprinting I: Selecting an Assessment Method
07 Test Blueprinting II: Creating a Test Blueprint
   Test Score Reliability Overview
08 Validity and Threats to Validity

09 Method: Multiple-Choice Questions (MCQs)
   Incorporating Graphics and Multimedia into MCQs
   Item Analysis and Key Validation
10 MCQ Flaws and How to Avoid Them
   Purposes, Types and Educational Uses of MCQ Examinations
11 Setting Pass/Fail Standards
   Strategies for Organizing Question Writing and Review
12 Structuring Multiple-Choice Questions
   Writing MCQs to Assess Application of Basic Science Knowledge
13 Writing MCQs to Assess Application of Clinical Science Knowledge

14 Method: Objective Structured Clinical Examinations (OSCEs)
   An Introduction to the Use of Generalizability Theory in OSCEs
   Building a Clinical Skills Center
15 Developing Rating Scales and Checklists for OSCEs
   Provision of Feedback by Standardized Patients
16 Quality Assurance of Standardized Patients
   Reality Check: Promoting Realism in Standardized Patients
17 Training Physicians to Rate OSCE Patient Notes
   Working with Standardized Patients for Assessment

18 Method: Workplace-Based Assessment
   Conducting the Feedback Session
   Educational Frameworks for Assessing Competence
19 Workplace Assessment: Encounter-Based Methods

20 Professionalism
   Introduction to the Construct of Professionalism and Its Assessment

21 Meet the Authors

Page 3: NBME U Lesson Catalog

National Board of Medical Examiners

Five actions to create more confident student assessments

REGISTER for NBME U

ENROLL in NBME U lessons

ASSESS your skills

LEARN new skills

APPLY your new skills

engage, develop and improve

Page 4: NBME U Lesson Catalog


NBME U Lesson Catalog NBME U is a new, evidence-based, digital resource to help educators more accurately, confidently and consistently assess their students. With NBME U, you can learn from the experts on student assessments.

As a series of short, self-guided, interactive online lessons, NBME U helps educators create and deliver consistent, valid, reliable and high-quality assessments of their students. Learn anywhere, anytime and on any device.

By enrolling in NBME U online, healthcare professionals can learn more about:

• Assessment principles
• Scoring principles
• And other topics related to:
  - Objective structured clinical examinations (OSCEs)
  - Multiple-choice questions (MCQs)
  - Workplace-based assessments
  - Professionalism

This activity has been planned and implemented in accordance with the accreditation requirements and policies of the Accreditation Council for Continuing Medical Education (ACCME) through the joint providership of the Federation of State Medical Boards and the National Board of Medical Examiners. The Federation of State Medical Boards is accredited by the ACCME to provide continuing medical education for physicians.

The Federation of State Medical Boards designates this enduring material for a maximum of 0.25 AMA PRA Category 1 Credit(s)™. Physicians should claim only the credit commensurate with the extent of their participation in the activity.

© 2016 National Board of Medical Examiners® All Rights Reserved.

Page 5: NBME U Lesson Catalog


NBME U leverages 100+ years of expertise in creating evidence-based student assessments for health professionals. Content for each lesson was created by a team of nationally recognized experts on assessment.

In this catalog you’ll find:
• A list of every lesson offered by NBME U,
• A brief description of each lesson’s objectives, and
• A biography of each lesson’s author(s).

Each of the 28 NBME U lessons can be completed on any device in 15-20 minutes. In addition, each successfully completed course earns 0.25 AMA PRA Category 1 Credits™ for physicians or a certificate of participation for other healthcare professionals.

The evidence is here, at NBME U. Just select the lessons of most interest to you, and begin strengthening your assessment skills today.

Enroll today at www.my.nbme.org.

Page 6: NBME U Lesson Catalog


How to Create a Good Score

Lesson Objectives
1. Identify the characteristics of a good score.
2. Identify the necessary steps to create a good score.

Authors
Marc Gessaroli, PhD

Suggested NBME U Companion Courses
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity

Purposes and Types of Assessments: An Overview

Lesson Objectives
1. Explain the common purposes of testing and the different types of decisions that can be made from test results.
2. Define the terminology commonly used to describe different types of tests in medical education.
3. Identify the steps for developing sound assessments.

Authors
Marc Gessaroli, PhD (Editor)
Mark Raymond, PhD

Suggested NBME U Companion Courses
• How to Create a Good Score
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity

Assessment Principles

Page 7: NBME U Lesson Catalog


Score Reporting

Lesson Objectives
1. Apply an evidence-driven framework to design and evaluate a score report.
2. Identify the directionality, or ordering, of test design and assembly based on an evidence-driven framework (purpose → claims → evidence → content).
3. Define the terms Claim, Evidence, and Score.
4. Write unambiguous claims that can be supported by evidence.

Authors
Amanda Clauser, MSEd, EdD
Marc Gessaroli, PhD (Editor)
Howard Wainer, PhD

Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity

Test Blueprinting I: Selecting an Assessment Method

Lesson Objectives
1. Recognize the role of learning objectives in deciding what knowledge and skills to assess.
2. Classify learning objectives according to skill domain (cognitive, affective and psychomotor) and level of learning (recognition/recall or application/critical thinking).
3. Consider a vast range of assessment methods and select a method that is optimal for the skill to be assessed.

Authors
Marc Gessaroli, PhD (Editor)
Joseph Grande, MD, PhD
Mark Raymond, PhD
Kathleen Short, MALS, MS

Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview
• Validity and Threats to Validity

Page 8: NBME U Lesson Catalog


Test Blueprinting II: Creating a Test Blueprint

Lesson Objectives
1. Explain the benefits of using a test blueprint when developing assessments.
2. List the common frameworks used for blueprinting.
3. Describe how to create a blueprint that includes two topic dimensions.

Authors
Gail Furman, MSN, CHSE, PhD
Marc Gessaroli, PhD (Editor)
Joseph Grande, MD, PhD
Mark Raymond, PhD
Kathleen Short, MALS, MS

Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Score Reliability Overview
• Validity and Threats to Validity

Test Score Reliability Overview

Lesson Objectives
1. Explain what reliability is and why it’s important.
2. Identify which measures of reliability to use with different types of assessments.
3. Identify and interpret the different measures of reliability for different purposes of assessments.
4. Identify the factors that can affect the reliability of a score.

Authors
Marc Gessaroli, PhD

Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Validity and Threats to Validity
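One widely used internal-consistency measure of the kind this lesson covers is Cronbach's coefficient alpha. The sketch below is our own illustration, not material from the lesson; the function name and sample data are hypothetical:

```python
from statistics import pvariance

def cronbach_alpha(scores):
    """Cronbach's alpha for a table of item scores.

    scores: one row per examinee, one column per item.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)
    """
    k = len(scores[0])                                   # number of items
    item_vars = sum(pvariance(col) for col in zip(*scores))
    total_var = pvariance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Two items that always agree -> a perfectly reliable total score
print(cronbach_alpha([[1, 1], [0, 0], [1, 1], [0, 0]]))  # 1.0
```

Lower agreement among items drives alpha toward zero, which is one way the factors named in objective 4 show up in practice.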


Page 9: NBME U Lesson Catalog


Validity and Threats to Validity

Lesson Objectives
1. Describe what validity means in the context of educational tests.
2. Explain what inferences are and how they relate to validity evidence.
3. Identify different threats to the validity of test scores.
4. List approaches to mitigate threats to validity.

Authors
Richard Feinberg, PhD
Marc Gessaroli, PhD (Editor)
Kimberly Swygert, PhD

Suggested NBME U Companion Courses
• How to Create a Good Score
• Purposes and Types of Assessments: An Overview
• Score Reporting
• Test Blueprinting I: Selecting an Assessment Method
• Test Blueprinting II: Creating a Test Blueprint
• Test Score Reliability Overview

Page 10: NBME U Lesson Catalog


Incorporating Graphics and Multimedia into MCQs

Lesson Objectives
1. Determine when it makes sense to use media in your exam.
2. Determine which content areas are conducive to media.
3. Explain what a media blueprint shows.

Authors
Kathy Angelucci, MS
Kieran Hussie
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
Mark Raymond, PhD

Suggested NBME U Companion Courses
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge

Item Analysis and Key Validation

Lesson Objectives
1. Review multiple-choice question item analysis data to identify items that require review by a content expert.
2. Identify the key components of a key validation review.

Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Kimberly Swygert, PhD

Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge
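Classical item analysis of the kind this lesson describes typically reports each item's difficulty (proportion correct) and discrimination (the item's correlation with the rest of the test). A minimal sketch, using hypothetical 0/1 response data rather than anything from the lesson:

```python
from statistics import mean, pstdev

def item_analysis(responses):
    """Difficulty and discrimination for each item.

    responses: one row per examinee, one column per item, scored 0/1.
    Returns (difficulty, point-biserial vs. rest-of-test score) per item.
    """
    totals = [sum(row) for row in responses]
    results = []
    for j in range(len(responses[0])):
        item = [row[j] for row in responses]
        p = mean(item)                                # proportion correct
        rest = [t - x for t, x in zip(totals, item)]  # total minus this item
        sx, sy = pstdev(item), pstdev(rest)
        if sx == 0 or sy == 0:
            r = 0.0                                   # no variance: undefined
        else:
            mx, my = mean(item), mean(rest)
            cov = mean((a - mx) * (b - my) for a, b in zip(item, rest))
            r = cov / (sx * sy)
        results.append((p, r))
    return results
```

Items with very low discrimination, or with difficulty near 0 or 1, are the ones flagged for content-expert review and key validation.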

Method: Multiple-Choice Questions (MCQs)

Page 11: NBME U Lesson Catalog


MCQ Flaws and How to Avoid Them

Lesson Objectives
1. Recognize and avoid common technical flaws related to testwiseness and irrelevant difficulty.

Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Kimberly Swygert, PhD

Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge

Purposes, Types and Educational Uses of MCQ Examinations

Lesson Objectives
1. Describe what information is best captured by a multiple-choice question (MCQ).
2. List four types of MCQ examinations.
3. Explain the different uses for each type of test.
4. Describe the inferences that can be supported based on the type of test.

Authors
Richard Feinberg, PhD
Dan Jurich, PhD
Carol Morrison, PhD (Editor)

Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge

Page 12: NBME U Lesson Catalog


Setting Pass/Fail Standards

Lesson Objectives
1. Describe important considerations when setting pass/fail standards for multiple-choice tests.
2. Outline the process for four standard setting methods.

Authors
Carol Morrison, PhD (Editor)
David Swanson, PhD

Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge

Strategies for Organizing Question Writing and Review

Lesson Objectives
1. Recognize the strengths and weaknesses of different strategies for organizing question writing and review for tests.

Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD

Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge


Page 13: NBME U Lesson Catalog


Structuring Multiple-Choice Questions

Lesson Objectives
1. Describe the three components of a well-structured single-best-answer multiple-choice question.
2. Evaluate whether questions follow the “rules” for well-structured single-best-answer multiple-choice questions.
3. Describe three types of single-best-answer questions.

Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD
Kimberly Swygert, PhD

Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Writing MCQs to Assess Application of Basic Science Knowledge
• Writing MCQs to Assess Application of Clinical Science Knowledge

Writing MCQs to Assess Application of Basic Science Knowledge

Lesson Objectives
1. Write clinical and experimental vignettes, lead-ins, and option sets that test examinees’ application of basic science knowledge.
2. Use a standard vignette structure to write consistently structured multiple-choice question stems with focused lead-ins that pose clearly defined tasks for examinees.

Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD

Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Clinical Science Knowledge

Page 14: NBME U Lesson Catalog


Writing MCQs to Assess Application of Clinical Science Knowledge

Lesson Objectives
1. Write vignettes, lead-ins and option sets that test examinees’ application of clinical science knowledge.
2. Use worksheets and templates to write consistently structured multiple-choice question stems with focused lead-ins that pose clearly defined clinical tasks for examinees.
3. Write homogeneous option sets with distractors that are appropriate for the examinees’ stage of training.

Authors
Kathleen Holtzman
Carol Morrison, PhD (Editor)
Miguel Paniagua, MD, FACP
David Swanson, PhD

Suggested NBME U Companion Courses
• Incorporating Graphics and Multimedia into MCQs
• Item Analysis and Key Validation
• MCQ Flaws and How to Avoid Them
• Purposes, Types and Educational Uses of MCQ Examinations
• Setting Pass/Fail Standards
• Strategies for Organizing Question Writing and Review
• Structuring Multiple-Choice Questions
• Writing MCQs to Assess Application of Basic Science Knowledge


Page 15: NBME U Lesson Catalog


An Introduction to the Use of Generalizability Theory in OSCEs

Lesson Objectives
1. Identify basic concepts related to Generalizability Theory.
2. Explain how reliability is conceptualized in the Generalizability Theory framework.
3. Calculate two of the reliability coefficients used in the Generalizability Theory framework.
4. Understand how the number of raters and the number of cases affect the reliability of an Objective Structured Clinical Examination (OSCE).

Authors
Gail Furman, MSN, CHSE, PhD (Editor)
Kimberly Swygert, PhD

Suggested NBME U Companion Courses
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment
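For a fully crossed persons × raters design, one common generalizability coefficient (the relative G coefficient) can be estimated from two-way ANOVA mean squares. The sketch below is an illustration under that single-facet assumption, not code from the lesson; the function and data are hypothetical:

```python
from statistics import mean

def g_coefficient(ratings, n_raters=None):
    """Relative G coefficient for a crossed persons x raters design.

    ratings: one row per person, one column per rater.
    n_raters: raters assumed in the decision study (default: as observed).
    """
    n_p, n_r = len(ratings), len(ratings[0])
    n_raters = n_raters or n_r
    grand = mean(x for row in ratings for x in row)
    pm = [mean(row) for row in ratings]            # person means
    rm = [mean(col) for col in zip(*ratings)]      # rater means
    ms_p = n_r * sum((m - grand) ** 2 for m in pm) / (n_p - 1)
    ms_res = sum((ratings[i][j] - pm[i] - rm[j] + grand) ** 2
                 for i in range(n_p)
                 for j in range(n_r)) / ((n_p - 1) * (n_r - 1))
    var_p = max((ms_p - ms_res) / n_r, 0.0)        # person (true-score) variance
    return var_p / (var_p + ms_res / n_raters)     # relative error shrinks with n'

# Raters in perfect agreement -> G = 1.0
print(g_coefficient([[1, 1], [2, 2], [3, 3]]))
```

Raising `n_raters` in the decision study shrinks the error term, which is how a D study projects the effect of adding raters or cases on OSCE reliability.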

Building a Clinical Skills Center

Lesson Objectives
1. List the features that need to be considered when designing a clinical skills center for teaching and assessment in a new or refurbished space.
2. Discuss the features required in your center with builders and architects.
3. Realize the need for future planning for upkeep and improvements.

Authors
Nancy Ambrose
Gail Furman, MSN, CHSE, PhD (Editor)
Gayle Gliva-McConvey, PhD
Jessica McBride

Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment

Method: Objective Structured Clinical Examinations (OSCEs)

Page 16: NBME U Lesson Catalog


Developing Rating Scales and Checklists for OSCEs

Lesson Objectives
1. Describe the differences between rating scales and checklists and identify scenarios in which each might be used.
2. Explain the implications of scale choice on rater biases and other factors that impact scoring in an Objective Structured Clinical Examination (OSCE).

Authors
Amanda Clauser, MSEd, EdD
Gail Furman, MSN, CHSE, PhD (Editor)
Kimberly Swygert, PhD

Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment

Provision of Feedback by Standardized Patients

Lesson Objectives
1. Describe the types of feedback that can be given by standardized patients.
2. Develop training materials for standardized patients to enable them to give effective feedback.
3. Develop a feedback quality assurance tool for monitoring standardized patient–student encounters.

Authors
Carol Pfeiffer, PhD
Gail Furman, MSN, CHSE, PhD (Editor)

Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment

Page 17: NBME U Lesson Catalog


Quality Assurance of Standardized Patients

Lesson Objectives
1. Explain the importance of implementing a quality assurance approach for standardized patients during OSCE exams.
2. Develop case-specific quality assurance monitoring checklists.

Authors
David Disbrow, MACM
Gail Furman, MSN, CHSE, PhD

Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment

Reality Check: Promoting Realism in Standardized Patients

Lesson Objectives
1. Identify the vocabulary used to promote realism in standardized patient performance.
2. Recognize behaviors that standardized patients can employ to promote realism.

Authors
David Disbrow, MACM
Gail Furman, MSN, CHSE, PhD (Editor)

Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Training Physicians to Rate OSCE Patient Notes
• Working with Standardized Patients for Assessment

Page 18: NBME U Lesson Catalog


Training Physicians to Rate OSCE Patient Notes

Lesson Objectives
1. Describe three tools that help physicians rate OSCE post-encounter tasks (PETs).
2. List the steps needed to train physicians to effectively rate patient notes.
3. Identify potential rater biases.

Authors
Jessica Salt, MD
Ellen Turner, MD
Gail Furman, MSN, CHSE, PhD (Editor)

Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Working with Standardized Patients for Assessment

Working with Standardized Patients for Assessment

Lesson Objectives
1. Name areas where standardized/simulated patients can be integrated into the curriculum for assessment.
2. Identify advantages of using standardized/simulated patients.
3. Describe a framework for understanding learner skill levels.
4. List the resources needed to develop a standardized/simulated patient program at your institution.

Authors
Gail Furman, MSN, CHSE, PhD
Henry Pohl, MD

Suggested NBME U Companion Courses
• An Introduction to the Use of Generalizability Theory in OSCEs
• Building a Clinical Skills Center
• Developing Rating Scales and Checklists for OSCEs
• Provision of Feedback by Standardized Patients
• Quality Assurance of Standardized Patients
• Reality Check: Promoting Realism in Standardized Patients
• Training Physicians to Rate OSCE Patient Notes


Page 19: NBME U Lesson Catalog


Conducting the Feedback Session

Lesson Objectives
1. Identify the characteristics of effective feedback.
2. Describe the feedback process.
3. Critique a feedback session based on the characteristics of effective feedback.

Authors
M. Brownell Anderson, MEd (Editor)
Colleen Canavan, MS
Peter Katsufrakis, MD
Margaret Richmond, MS

Suggested NBME U Companion Courses
• Educational Frameworks for Assessing Competence
• Workplace Assessment: Encounter-Based Methods

Educational Frameworks for Assessing Competence

Lesson Objectives
1. Explain the purpose of frameworks in instructional design and assessment.
2. Describe the relationship between frameworks and different definitions of competence.
3. Describe current examples of frameworks.

Authors
M. Brownell Anderson, MEd
Lou Pangaro, MD

Suggested NBME U Companion Courses
• Conducting the Feedback Session
• Workplace Assessment: Encounter-Based Methods

Method: Workplace-Based Assessment

Page 20: NBME U Lesson Catalog


Workplace Assessment: Encounter-Based Methods

Lesson Objectives
1. Describe the three most popular encounter-based methods of workplace assessment.
2. Explain the factors that contribute to the quality of workplace assessment.
3. List different types of feedback and their use in the context of workplace assessment.

Authors
John Norcini, PhD
Mark Raymond, PhD (Editor)

Suggested NBME U Companion Courses
• Conducting the Feedback Session
• Educational Frameworks for Assessing Competence

Page 21: NBME U Lesson Catalog


Introduction to the Construct of Professionalism and Its Assessment

Lesson Objectives
1. Articulate your own definition of professionalism.
2. Access resources for assessing professionalism.
3. Explain how you will integrate the assessment of one aspect of professionalism in your institution.

Authors
M. Brownell Anderson, MEd

Professionalism

Page 22: NBME U Lesson Catalog


Nancy Ambrose, MBA

Lesson
Building a Clinical Skills Center

Nancy Ambrose was Assistant Director of Center Operations for the Educational Commission for Foreign Medical Graduates (ECFMG) and oversaw the evaluation centers where the United States Medical Licensing Examination (USMLE) Step 2 Clinical Skills assessment was administered.

Ms. Ambrose earned her MBA from the Fox School of Management at Temple University, Philadelphia, PA.

M. Brownell Anderson, MEd
Vice President, International Programs
National Board of Medical Examiners

Lessons
Conducting the Feedback Session
Educational Frameworks for Assessing Competence
Introduction to the Construct of Professionalism and Its Assessment

M. Brownell “Brownie” Anderson is Vice President of International Programs with NBME, where she works to create a culture of assessment by aligning outcomes to student assessments with medical schools around the world. She is currently helping medical schools and organizations in Brazil, China, and Kazakhstan to build a stronger culture of assessment.

A frequent author, educator and speaker around the world, Ms. Anderson is the editor of Really Good Stuff, a biannual collection of medical education innovation reports. She was a member of the Arabian Gulf University faculty, Manama, Bahrain, for several years and has worked with the Foundation for Advancement of International Medical Education and Research.

Ms. Anderson received her degrees from Washington University in St. Louis, St. Louis, MO, and the University of Illinois at Chicago and was employed at the Southern Illinois University School of Medicine, Springfield, IL, from 1978 to 1983.

Kathy Angelucci, MS
Managing Editor, International Programs
American Board of Medical Specialties

Lesson
Incorporating Graphics and Multimedia into MCQs

Kathy Angelucci is currently the Managing Editor, International Programs, at the American Board of Medical Specialties (ABMS), where she oversees the examination development process for the Singapore program, ensuring that all examinations meet quality standards and that participating specialty item banks contain high-quality test questions.

Prior to joining ABMS, Ms. Angelucci worked at NBME for more than 20 years in several test development roles, most recently as Manager of Developmental Projects and System Enhancements. She has in-depth expertise and technical skills in examination development and publication, media development and acquisition, and item banking and web-based testing. 

Ms. Angelucci received a BA in English from LaSalle University and an MS in Technical and Science Communication from Drexel University, Philadelphia, PA.

Meet the Authors

Page 23: NBME U Lesson Catalog


Colleen Canavan, MS

Lesson Conducting the Feedback Session

Colleen Canavan contributed to the research and development of multiple observational assessment programs during her tenure at NBME. As part of this work, she delivered training on professionalism assessment and feedback to physicians across the US. 

Ms. Canavan holds a BA in Sociology from Vassar College, Poughkeepsie, NY, and an MS in Library and Information Science from Drexel University, Philadelphia, PA. 

Amanda Clauser, MSEd, EdD
Psychometrician
National Board of Medical Examiners

Lessons
Score Reporting
Developing Rating Scales and Checklists for Objective Structured Clinical Examinations (OSCEs)

Amanda Clauser is a Psychometrician with NBME, where she specializes in test design, equating, score report development and other operational testing issues.

Her research interests include evidence-centered design, applied generalizability theory and performance assessment.

She earned an MSEd from the University of Pennsylvania, Philadelphia, and an EdD in Educational Measurement and Psychometrics from the University of Massachusetts.

David Disbrow, MACM
Center for Innovation Officer
National Board of Medical Examiners

Lessons
Quality Assurance of Standardized Patients
Reality Check: Promoting Realism in Standardized Patients

David Disbrow runs NBME’s Center for Innovation, which serves as an incubator for disruptive ideas and concepts that may extend the boundaries of NBME strategic principle areas.

He started his career with the ECFMG in 1998, and he helped to train standardized patient trainers and patients at the Clinical Skills Evaluation Collaboration’s six exam centers across the country. He joined NBME in 2012 to help create the Educational Design and Development Program with Gail Furman.

Mr. Disbrow holds an MACM degree from the Keck School of Medicine, University of Southern California, Los Angeles.

Page 24: NBME U Lesson Catalog


Marc Gessaroli, PhD
Principal Measurement Scientist
National Board of Medical Examiners

Lessons
How to Create a Good Score
Purposes and Types of Assessments: An Overview
Score Reporting
Test Blueprinting I: Selecting an Assessment Method
Test Blueprinting II: Creating a Test Blueprint
Test Score Reliability Overview
Validity and Threats to Validity

Marc Gessaroli is Principal Measurement Scientist at NBME, where his research focuses on the use of multidimensional models to address psychometric issues in testing.

Dr. Gessaroli received a PhD in Educational Measurement and Applied Statistics at the Ontario Institute for Studies in Education at the University of Toronto. For 10 years, he was a faculty member at the University of Ottawa, where he taught graduate courses in educational measurement, applied statistics and psychometrics before joining NBME.

Gayle Gliva-McConvey, PhD
Director, Center for Simulation & Immersive Learning
Eastern Virginia Medical School

Lesson
Building a Clinical Skills Center

Gayle Gliva-McConvey is Director of the Center for Simulation & Immersive Learning and the Professional Skills Teaching and Assessment Center at Eastern Virginia Medical School, Norfolk, VA.

Richard Feinberg, PhD
Senior Psychometrician
National Board of Medical Examiners

Lessons
Validity and Threats to Validity
Purposes, Types and Educational Uses of MCQ Examinations

Richard Feinberg is a Senior Psychometrician with NBME, where he leads and oversees the data analysis and score reporting activities for large-scale, high-stakes licensure and credentialing examinations. He is also an Assistant Professor at the Philadelphia College of Osteopathic Medicine, Philadelphia, PA, where he teaches a course on Research Methods and Statistics.

His research interests include psychometric applications in the fields of educational and psychological testing.

He earned a PhD in Research Methodology and Evaluation from the University of Delaware, Newark, DE.

Meet the Authors


National Board of Medical Examiners

Gail Furman, MSN, CHSE, PhD
Director of Educational Design and Development
Clinical Skills Evaluation Collaboration
National Board of Medical Examiners

Lessons
An Introduction to the Use of Generalizability Theory in OSCEs
Building a Clinical Skills Center
Developing Rating Scales and Checklists for OSCEs
Provision of Feedback by Standardized Patients
Quality Assurance of Standardized Patients
Reality Check: Promoting Realism in Standardized Patients
Training Physicians to Rate OSCE Patient Notes
Working with Standardized Patients for Assessment

Gail Furman is Director of Educational Design and Development for USMLE’s Step 2 Clinical Skills examination, which uses standardized patients to evaluate the clinical skills of those seeking medical licensure in the US.

Dr. Furman has more than 20 years of experience in medical education as a professor and director of a clinical skills center, working with standardized patients and designing Objective Structured Clinical Examinations (OSCEs).

She earned a PhD from Saint Louis University, St. Louis, MO.

Joseph Grande, MD, PhD
Course Director, Pathology and Cell Biology
Mayo Medical School

Lessons
Test Blueprinting I: Selecting an Assessment Method
Test Blueprinting II: Creating a Test Blueprint

Joseph Grande has served as director for the Pathology and Cell Biology course taught to first-year students at Mayo Medical School, Rochester, MN, for more than 25 years. His previous roles include Associate Dean for Academic Affairs at Mayo Medical School, 2007-2013, and Chair of the Step 1 Committee for NBME, 2008-2011.

Dr. Grande currently serves on the Executive Board of NBME. He is a reviewer for many education-related journals and is currently on the editorial board of Biochemistry and Molecular Biology Education. He also chairs a study section for the National Institutes of Health that reviews educational conference grant applications.


Kathy Holtzman
Director of Assessment and International Operations
American Board of Medical Specialties

Lessons
Item Analysis and Key Validation
MCQ Flaws and How to Avoid Them
Strategies for Organizing Question Writing and Review
Structuring Multiple-Choice Questions
Writing MCQs to Assess Application of Basic Science Knowledge
Writing MCQs to Assess Application of Clinical Science Knowledge

Kathy Holtzman is Director of Assessment and International Operations for the ABMS, where she provides leadership and project management for international programs and examinations.

Prior to joining ABMS, Ms. Holtzman worked for 35 years at NBME, most recently as Assistant Vice President in the Assessment Programs unit. She has extensive experience with assessment of medical decision-making skills with multiple-choice tests and simulation formats; methods for development and review of test material; design and introduction of computer-based and web-based tests; and development of new assessment formats, some utilizing multimedia. In addition, she has conducted item-writing workshops at medical schools, specialty boards and professional conferences nationally and internationally.

Ms. Holtzman holds a bachelor’s degree from Tennessee Technological University, Cookeville.

Kieran Hussie
Manager of Multimedia Services
National Board of Medical Examiners

Lesson
Incorporating Graphics and Multimedia into MCQs

Kieran Hussie is the Manager of Multimedia Services for Test Development with NBME, where he has managed the development, acquisition, processing and standardization of multimedia formats and applications for use in NBME-produced assessments.

After working for the Warner Brothers Network television station in Philadelphia, he joined the testing industry with Assessments Systems and Promissor, which became a part of Pearson.

Mr. Hussie earned a BA in Communications from Temple University, Philadelphia, PA.

Daniel Jurich, PhD
Psychometrician
National Board of Medical Examiners

Lesson
Purposes, Types and Educational Uses of MCQ Examinations

Daniel Jurich is a Psychometrician with NBME, where he manages the psychometric activities for various licensure and in-training examinations.

His primary research interests include improving the diagnostic utility of assessments to aid in tailoring remediation, and data forensic techniques for examining test security.

He received a PhD in Assessment and Measurement from James Madison University, Harrisonburg, VA.


Peter J. Katsufrakis, MD, MBA
Family Physician
President and CEO
National Board of Medical Examiners

Lesson
Conducting the Feedback Session

Peter Katsufrakis is a board-certified family physician and Senior Vice President for Assessment Programs with NBME. His responsibilities at NBME include oversight of the Medical School Services and Medical Education and Health Profession Services programs, International Programs, the Post-Licensure Assessment Service, and the United States Medical Licensing Examination program.

He is a past Associate Dean for Student Affairs at the Keck School of Medicine, University of Southern California.

Dr. Katsufrakis received a BA from the University of California, Berkeley, an MBA from the University of Southern California, Los Angeles, and an MD from the University of California, San Diego.

He served his internship and residency in family practice at Santa Monica Hospital, and is a Diplomate of the American Board of Family Medicine.

Jessica McBride
Operations Director
Clinical Skills Evaluation Collaboration
Educational Commission for Foreign Medical Graduates

Lesson
Building a Clinical Skills Center

Jessica McBride is an Operations Director with NBME and has more than 10 years of experience enhancing operational capabilities, building and leading top-performing teams and resolving ongoing issues.

Ms. McBride is responsible for exam session scheduling at all sites. She oversees facility enhancement and development and maintenance issues (including AV, software and equipment), and acts as project manager for special projects.

She holds a BA in psychology, is a certified PMP and is completing CPBPM certification at Villanova University, Villanova, PA.


Carol Morrison, PhD
Manager, Psychometrics
National Board of Medical Examiners

Lessons
Incorporating Graphics and Multimedia into MCQs
Item Analysis and Key Validation
MCQ Flaws and How to Avoid Them
Purposes, Types and Educational Uses of MCQ Examinations
Setting Pass/Fail Standards
Strategies for Organizing Question Writing and Review
Structuring Multiple-Choice Questions
Writing MCQs to Assess Application of Basic Science Knowledge
Writing MCQs to Assess Application of Clinical Science Knowledge

Carol Morrison is Manager of Psychometrics with NBME. Dr. Morrison supervises the scoring, equating, standard setting and other psychometric analyses for Step 1, Step 2 CK and Step 3 of the USMLE, as well as other certification, in-training and self-assessment examinations.

Dr. Morrison is an active member of several professional organizations, including the American Educational Research Association and the National Council on Measurement in Education.

She earned a PhD in Educational Psychology (Quantitative Methods) from the University of Texas.

John J. Norcini, PhD
President and CEO
Foundation for Advancement of International Medical Education and Research

Lesson
Workplace Assessment: Encounter-Based Methods

John J. Norcini has been the President and CEO of the Foundation for Advancement of International Medical Education and Research (FAIMER®) since its inception in 2000. Before that, he held a number of senior positions at the American Board of Internal Medicine.

His principal academic interest is in assessment. He has published extensively, has lectured and taught in many countries and is on the editorial boards of several peer-reviewed journals in health professions education.

He is an honorary Fellow of the Royal College of General Practitioners and the Academy of Medical Educators, and has received numerous awards, including the Karolinska Institutet Prize for Research in Medical Education, Stockholm, Sweden.


Louis N. Pangaro, MD, MACP
Professor and Chair
Department of Medicine
Uniformed Services University of the Health Sciences

Lesson
Educational Frameworks for Assessing Competence

Louis Pangaro is a graduate of Georgetown University Medical School with subsequent training in endocrinology and metabolism. As a recognized expert in quantitative and descriptive evaluation of students’ progress, Dr. Pangaro created “standardized examinees” to calibrate the validity of the prototype clinical skills examination of the USMLE. He also created a “synthetic” framework, or RIME scheme (reporter-interpreter-manager-educator), for defining expectations of students and residents.

He co-directs the annual Systems Approach to Assessment in Health Professions Education program at the Harvard Macy Institute, Boston, MA.

Miguel A. Paniagua, MD, FACP
Medical Advisor for Test Development Services
National Board of Medical Examiners

Lessons
Incorporating Graphics and Multimedia into MCQs
Item Analysis and Key Validation
MCQ Flaws and How to Avoid Them
Strategies for Organizing Question Writing and Review
Structuring Multiple-Choice Questions
Writing MCQs to Assess Application of Basic Science Knowledge
Writing MCQs to Assess Application of Clinical Science Knowledge

Miguel A. Paniagua currently serves as Medical Advisor for Test Development Services at NBME. His work at NBME includes the development of assessments of procedural skills, communication skills, interprofessional teamwork, and other innovations in computer-based examinations.

He has served on multiple item writing and reviewing committees at NBME over the past 10 years, and has served as a representative member of the National Board from 2011 to 2014 and on the Executive Board from 2013 to 2014.

Dr. Paniagua practices consultative hospice and palliative medicine at the Hospital of the University of Pennsylvania and holds adjunct appointments to the faculties of the Saint Louis University School of Medicine, St. Louis, MO, and the Perelman School of Medicine at the University of Pennsylvania, Philadelphia. Dr. Paniagua received an undergraduate degree from Saint Louis University and an MD at the University of Illinois College of Medicine, Chicago. Dr. Paniagua completed his internal medicine residency and fellowship in gerontology and geriatric medicine at the University of Washington, Seattle.


Carol Pfeiffer, PhD
Professor Emeritus
University of Connecticut School of Medicine

Lesson
Provision of Feedback by Standardized Patients

Carol Pfeiffer is Professor Emeritus at the University of Connecticut School of Medicine, Farmington, CT, where she had a 35-year career as a medical educator. She was a founding member of the Association of Standardized Patient Educators and received awards as an outstanding educator from both that organization and the University of Connecticut. She was a member of the Prototype Committee in the initial phases of the development of NBME’s Step 2 CS.

She holds a PhD in Sociology from Washington University in St. Louis, St. Louis, MO, and her focus in medical education has been on communication skills. 

Mark Raymond, PhD
Research Director and Principal Assessment Scientist
National Board of Medical Examiners

Lessons
Purposes and Types of Assessments: An Overview
Test Blueprinting I: Selecting an Assessment Method
Test Blueprinting II: Creating a Test Blueprint
Incorporating Graphics and Multimedia into MCQs
Workplace Assessment: Encounter-Based Methods

Mark Raymond is a Research Director and Principal Assessment Scientist with NBME, where he conducts and coordinates research on assessment and promotes scholarly interactions with external organizations.

Over the past 25 years, Dr. Raymond has consulted with licensing agencies, professional associations and universities on test development and psychometrics. His scholarly interests include job analysis and test blueprinting, generalizability theory and performance-based assessments. He serves on the editorial boards of several journals in health care and testing, and recently relinquished his role as Associate Editor of Applied Measurement in Education.

He is also the co-editor of The Handbook of Test Development, published by Routledge in 2015.

He earned a PhD in Educational Psychology from Pennsylvania State University, State College.


Margaret Richmond, MSc
Process Expert
National Board of Medical Examiners

Lesson
Conducting the Feedback Session

Margaret Richmond is a Process Expert with the Strategic Planning and Institutional Effectiveness team at NBME. In this role, she manages and facilitates cross-functional project teams to implement and improve internal processes, focusing on new product development and marketing processes to enhance the effectiveness of the NBME.

She previously worked at the Center for Innovation to diversify the products and services offered by the NBME, including the Assessment of Professional Behaviors (APB) program. Ms. Richmond earned an MSc in Information Science from the University of Michigan, Ann Arbor, MI.

Jessica Salt, MD, MBE
Medical Director and Patient Note Director
Clinical Skills Evaluation Collaboration
Educational Commission for Foreign Medical Graduates

Lesson
Training Physicians to Rate OSCE Patient Notes

Jessica Salt is the Medical Director and Director of the Patient Note Program for the Clinical Skills Evaluation Collaboration (CSEC), where she is responsible for recruitment, training and QA of Patient Note Raters for the USMLE Step 2 CS Exam.

Prior to joining CSEC in 2012, Dr. Salt was Associate Program Director for the Internal Medicine Residency and Director of the Internal Medicine Clerkship at Jefferson University Hospital in Philadelphia. She is a board-certified internist and currently practices as a General Hospitalist at Lankenau Medical Center in Wynnewood, PA.

Dr. Salt received an MBE in Bioethics from the University of Pennsylvania, Philadelphia, and an MD from Virginia Commonwealth University, Richmond. She completed her internal medicine residency at Brown University, Providence, RI.


Kathleen Short, MALS, MS
Program Officer
National Board of Medical Examiners

Lessons
Test Blueprinting I: Selecting an Assessment Method
Test Blueprinting II: Creating a Test Blueprint

Kathleen Short is a Program Officer with NBME. With a wide range of assessment experience, she leads a team that works on the USMLE, board certifications, self-assessments and educational tools, including NBME’s Customized Assessment Service. Her research interests include investigations into innovative assessments, including those using multimedia simulations in conjunction with multiple-choice questions.

Ms. Short earned her MALS from the University of Pennsylvania, where she focused in Ethnographic Research, and her MS in Statistics, Measurement, Assessment and Research Technology from the University of Pennsylvania’s Graduate School of Education, Philadelphia.

David B. Swanson, PhD
Vice President
American Board of Medical Specialties

Lessons
Item Analysis and Key Validation
MCQ Flaws and How to Avoid Them
Setting Pass/Fail Standards
Strategies for Organizing Question Writing and Review
Structuring Multiple-Choice Questions
Writing MCQs to Assess Application of Basic Science Knowledge
Writing MCQs to Assess Application of Clinical Science Knowledge

David B. Swanson is currently a Vice President at the ABMS, where he works with committees to manage the certification and recertification processes of ABMS Member Boards.

He also worked with NBME, where he wrote, spoke at and hosted seminars around the world on various topics related to student assessment, and co-authored Constructing Written Test Questions for the Basic and Clinical Sciences with Dr. Susan Case. He earned his PhD in Psychology from the University of Minnesota. In 2011, he received the Richard Farrow Gold Medal for Outstanding Contributions to Medical Education from the UK Association for the Study of Medical Education. He also holds an honorary professorial appointment in the University of Melbourne Medical School, Victoria, Australia.


Kimberly Swygert, PhD
Director, Research and Development in Test Development
National Board of Medical Examiners

Lessons
Validity and Threats to Validity
Item Analysis and Key Validation
MCQ Flaws and How to Avoid Them
Structuring Multiple-Choice Questions

Kimberly Swygert is Director of Research and Development in Test Development at NBME. Her work on performance assessments, examinee timing and pacing, examinee repeater behavior and score reporting has been presented at conferences and published in journals such as Academic Medicine, Advances in Health Sciences Education, the Journal of General Internal Medicine and the Journal of Educational Measurement.

In addition to her work at NBME, Dr. Swygert has taught graduate courses in biostatistics and psychometrics at Drexel University, Philadelphia, PA, and the Uniformed Services University of the Health Sciences, Bethesda, MD, where she is an external advisory board member.

She earned a PhD in Quantitative Psychology from the University of North Carolina, Chapel Hill.

Ellen Turner, MD
Assistant Director, Patient Note Program
Clinical Skills Evaluation Collaboration
Educational Commission for Foreign Medical Graduates

Lesson
Training Physicians to Rate OSCE Patient Notes

Ellen Turner serves as Assistant Director of the Patient Note Program at CSEC in Philadelphia, PA.

Dr. Turner is a board-certified infectious diseases physician with more than 10 years of clinical and teaching experience. She is an adjunct faculty member at Drexel University College of Medicine, Philadelphia, PA.

After receiving her medical degree from Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, Dr. Turner completed an internal medicine residency and infectious diseases fellowship at Temple University Hospital in Philadelphia. In addition to her experience as a clinical educator, she was also a Patient Note Rater at the ECFMG prior to joining the CSEC staff.


Howard Wainer, PhD
Distinguished Research Scientist
National Board of Medical Examiners

Lesson
Score Reporting

Howard Wainer is a Distinguished Research Scientist with NBME, where he acts as senior statistical/psychometric advisor and writes books.

As a research scientist and former professor, he has published more than 400 articles and book chapters. His 20th book, Medical Illuminations: Using Evidence, Visualization & Statistical Thinking to Improve Healthcare, was published by Oxford University Press in 2014 and was a finalist for the Royal Society Winton Book Prize. His latest book, Truth or Truthiness: Distinguishing Fact from Fiction by Learning to Think like a Data Scientist, was published by Cambridge University Press in 2015.

He is a Fellow of the American Statistical Association and the American Educational Research Association and has received several awards for his work in the industry.

Dr. Wainer earned a PhD in Psychometrics from Princeton University, Princeton, NJ.


Reaching new horizons in student assessments.

“You are most likely a faculty member because of your content and your expertise, not because of your educational skills. We have always assumed that people can teach if they understand the content. And I think that has changed because there is a greater recognition that education requires some skills unto themselves and not just the content area.”
Larry Gruppen, PhD
Professor, Master of Health Professions Education
University of Michigan


YOUR PARTNER IN EVIDENCE-BASED ASSESSMENT

NBME℠