
The Journal of Emergency Medicine, Vol. 43, No. 4, pp. 720–727, 2012
Copyright © 2012 Elsevier Inc. Printed in the USA. All rights reserved.
0736-4679/$ - see front matter

doi:10.1016/j.jemermed.2011.05.069

RECEIVED: 27 October 2010; FINAL SUBMISSION RECEIVED: 5 January 2011; ACCEPTED: 28 May 2011

Education

REPORTER-INTERPRETER-MANAGER-EDUCATOR (RIME) DESCRIPTIVE RATINGS AS AN EVALUATION TOOL IN AN EMERGENCY MEDICINE CLERKSHIP

Douglas S. Ander, MD, Joshua Wallenstein, MD, Jerome L. Abramson, MD, PhD, Lorie Click, MPH, and Philip Shayne, MD

Department of Emergency Medicine, Emory University School of Medicine, Atlanta, Georgia

Reprint Address: Douglas S. Ander, MD, Department of Emergency Medicine, Emory University School of Medicine, 49 Jesse Hill Jr. Dr., Atlanta, GA 30303

Abstract—Background: Emergency Medicine (EM) clerkships traditionally assess students using numerical ratings of clinical performance. The descriptive ratings of the Reporter, Interpreter, Manager, and Educator (RIME) method have been shown to be valuable in other specialties. Objectives: We hypothesized that the RIME descriptive ratings would correlate with clinical performance and examination scores in an EM clerkship, indicating that the RIME ratings are a valid measure of performance. Methods: This was a prospective cohort study of an evaluation instrument for 4th-year medical students completing an EM rotation. This study received exempt Institutional Review Board status. EM faculty and residents completed shift evaluation forms including both numerical and RIME ratings. Students completed a final examination. Mean scores for RIME and clinical evaluations were calculated. Linear regression models were used to determine whether RIME ratings predicted clinical evaluation scores or final examination scores. Results: Four hundred thirty-nine students who completed the EM clerkship were enrolled in the study. After excluding items with missing data, there were 2086 evaluation forms (based on 289 students) available for analysis. There was a clear positive relationship between RIME category and clinical evaluation score (r² = 0.40, p < 0.01). RIME ratings correlated most strongly with patient management skills and least strongly with humanistic qualities. A very weak correlation was seen between RIME and the final examination. Conclusion: We found a positive association between RIME and clinical evaluation scores, suggesting that RIME is a valid clinical evaluation instrument. RIME descriptive ratings can be incorporated into EM evaluation instruments and provide useful data related to patient management skills. © 2012 Elsevier Inc.

Keywords—undergraduate medical education; evaluation

INTRODUCTION

Students on emergency medicine (EM) clerkships are evaluated using a variety of evaluation methods (1). Although clerkship directors use a variety of other evaluation instruments to assess clinical competencies, global assessments of live clinical performance by faculty and residents remain the predominant component of evaluations in most EM clerkships. EM course directors face unique challenges compared to other disciplines. In most other specialty clerkships, there is an ongoing relationship between the student and teacher/evaluator ranging from weeks to months or longer. Students in EM may work with multiple faculty members, and a single faculty member may not work with a student for more than a single shift. In our institution, during a 4-week rotation students may work with 10 different faculty members and see as few as 4–6 patients per shift. Although this arrangement has some advantages for the student and clerkship director, it poses unique challenges to the learner and the teacher/evaluator, primarily the delivery of reliable performance assessment and constructive feedback based on clinical interactions. Despite their ease of use, clinical evaluation scores lack interobserver reliability and suffer from limited discrimination between various evaluation domains (2–4). Additionally, clinical scores may not give the student descriptive anchors that adequately describe weaknesses.

First described in 1999, the Reporter-Interpreter-Manager-Educator (RIME) terminology provides a framework to assess a student's performance in the clinical setting (5,6). The RIME approach to assessment was developed for Internal Medicine to be used during formal student evaluation sessions, providing the teachers with a simple framework to categorize student progress that the student would accept as valid feedback (5,6). The reliability of the RIME has been established across geographically distant sites, and when used in a system with evaluation sessions, it has been found to correlate with a student's performance on the National Board shelf examination and detect deficiencies in professionalism. It has also been shown to forecast low scores for performance as an intern when used in a comprehensive system including evaluation sessions and committee review (7–9). One study comparing the RIME system to standard clerkship evaluation forms demonstrated that RIME could better detect and describe changes in a student's performance over time (10). Another study demonstrated that it helps students understand their performance during feedback sessions (11).

Although the advantages of RIME over traditional clinical evaluation have been shown, there are also barriers to using it in EM. RIME was originally designed to provide a standard vocabulary for describing the progress of trainees and to be used in formal evaluation sessions. The published work on the reliability and validity of RIME relates to its use in evaluation sessions. However, many clerkship directors in Internal Medicine utilize RIME without formal evaluation sessions with students (12).

The RIME terminology has been shown to enhance the evaluation and feedback process, but has never been used in the setting of an EM clerkship. The objective of our study was to determine the extent to which RIME descriptive ratings in the EM setting correlate with clinical evaluation scores and performance on a multiple-choice final examination. We hypothesized that the RIME descriptive rating would correlate with these other evaluation methods, suggesting that the RIME terminology is a valid method of assessment in an EM clerkship.

METHODS

Study Design

This was a prospective cohort study of an evaluation instrument for 4th-year medical students completing a rotation in EM. As an addition to the standard evaluation process, this study was considered exempt by the Institutional Review Board.

Study Setting and Population

This study was conducted at a university medical school with a required 4th-year EM clerkship. The clerkship utilizes five distinct training sites: a county hospital, a university medical center, a community hospital, and two pediatric hospitals. All the evaluators were faculty or senior residents in the Department of Emergency Medicine. The subjects were 4th-year medical students enrolled in our clerkship between 2005 and 2007, including both students from our institution and visiting students from other medical schools.

Evaluation Instrument

Development of the study evaluation form was based on a review of evaluation forms used in other programs and of the literature, and the form underwent final review and approval by the Undergraduate Education Committee within our department. The evaluation instrument contained several components. Part 1 used the RIME descriptive rating, which was based on individual shift interactions between the evaluator and the student. In addition, our definition of an "Educator" was somewhat different from what was described in the original papers. We used the RIME descriptive rating during individual shift encounters with students, during which they may see 4–6 patients with the evaluator (Figure 1).

Part 2 of the form is a traditional global assessment of live performance using detailed clinical evaluation of 15 EM competencies grouped within the framework of the six Accreditation Council on Graduate Medical Education core competencies (Figure 1) (13).

Study Protocol

EM faculty and residents were trained on the proper use of the form and its components at the beginning of the academic year. They received a lecture describing the theory behind the evaluation instrument, how to use the form, and case examples. During orientation to the clerkship, students were asked to document their field of interest and their school affiliation. The students received instruction on the distribution of the form to the faculty and residents. Forms were completed using either a paper form or a Web page. All the data were compiled in an internally developed database. At the end of the rotation, the students completed a 71-question multiple-choice examination covering material from our didactic series, which was scored from 0 to 100% correct. This test has been developed internally by the Clerkship Director, EM faculty, and the EM Education Committee.

Student: ____________________  Date: ____________________  Faculty: _______________________

Written comments are required regarding overall performance of the student.

Strengths:

Needing improvement:

Specific examples:

Plan for improvement:

Please check each step the student has consistently reached: Reporter / Interpreter / Manager / Educator

Reporter level student: Performs acceptably in some areas of evaluation but clearly needs improvement in others.

Interpreter level student: Performs acceptably in most areas of evaluation. Obtains and reports basic information accurately; beginning to interpret; some attempt to actively manage patient care; solid personal/professional qualities.

Manager level student: Clearly well above average in most areas of evaluation. Proceeds consistently to interpreting data; solid ability to actively manage patient care.

Educator level student: Outstanding ratings in most major areas of evaluation. Open to new knowledge and skilled in identifying questions that can't be answered from textbooks. Is able to consistently manage patient care. This student performs at a level far superior to his level of training.

Are there any concerns of professionalism with this student? Yes* / No (*If yes, MUST comment with specifics.)

For each area of evaluation, please check the appropriate level of ability. Qualities should be cumulative as rating increases. Indicate the level at which the student is consistent. (Each item also offers an "If Not Observed, Check Here" option; anchors below are listed from lowest [0] to highest [4].)

PATIENT CARE

History Taking
0 - Unable to elicit important information or nonverbal cues. Often fails to identify major problem.
1 - Incomplete or unfocused.
2 - Adequate history. Focused on the major problem. Accurate.
3 - History is complete & accurate. Details were appropriate to the setting.
4 - History is comprehensive. Accurate & focused on key pertinent problems. Identifies subtle problem areas.

Physical Exam
0 - Unreliable physical examination.
1 - Incomplete exam. Missed major findings.
2 - Minor gaps in technical skill. Major findings were identified.
3 - Technically sound & thorough exam. Organized, focused, and relevant.
4 - Thorough, detailed exam, yet focused to primary complaint. Uses pertinent ancillary techniques.

Problem Solving/Management Plans
0 - Fails to formulate an adequate plan. Poor judgment in selection or use of diagnostics & therapeutics.
1 - Limited differential diagnostic ability. Formulates inappropriate diagnostics and therapeutics.
2 - Identified major problems. Able to formulate a basic plan including selection of diagnostics & therapeutics.
3 - Identified major & minor problems. Develops a complete & efficient plan for diagnostics & therapeutics.
4 - Developed an extensive problem list. Plan is thorough and precise. Identifies alternative plans.

Patient Management Skills
0 - Fails to monitor patient responses to treatment and make adjustments after the initial workup. Unable to manage multiple patients.
1 - Does not always monitor patient response to treatment or make indicated adjustments after initial workup. Fair ability to manage multiple patients.
2 - Monitors response to treatment and adjusts as indicated after initial workup. Average ability to manage multiple patients.
3 - Above-average ability to monitor response to treatment and make adjustments to treatment plan. Can manage multiple patients efficiently.
4 - Closely monitors patients' responses to treatment after initial workup; makes astute adjustments as needed; excellent ability to manage multiple patients.

MEDICAL KNOWLEDGE

Knowledge Base
0 - Cannot recall basic science & clinical information. Demonstrates poor ability to clinically apply knowledge base.
1 - Marginal understanding of basic and clinical sciences as they relate to their patients.
2 - Has basic knowledge base, and shows the ability for some clinical application.
3 - Above-average knowledge. Able to consistently relate it to clinical material.
4 - Outstanding fund of knowledge & understanding of disease mechanisms with excellent ability to apply to clinical situations.

PRACTICE-BASED LEARNING

Use of Medical Literature
0 - Fails to consider use of the medical literature.
1 - Uses inappropriate sources when attempting to use the medical literature.
2 - With prompting, is able to support decisions using appropriate medical literature.
3 - Actively seeks out medical literature that supports decision making.
4 - Superb use of medical literature and is able to teach what they have learned.

INTERPERSONAL AND COMMUNICATION SKILLS

Humanistic Qualities
0 - Often insensitive to patient's feelings, needs, & wishes. Lack of empathy & compassion.
1 - Occasionally insensitive to patient's feelings. Inattentive to patient needs.
2 - Sometimes has difficulty establishing rapport or communicating with patients.
3 - Relates well to most patients & family members. Shows empathy & compassion.
4 - Outstanding in putting patients &/or family members at ease & appropriately communicating with them. Relates well to difficult patients.

Works as Part of a Health Care Team
0 - Disrespectful, rude, and insensitive to other members of the health care team.
1 - Occasionally fails to act collegially with other members of the health care team.
2 - Communicates well; respectful of and cooperative with other members of the health care team.
3 - Strong communication skills and professional demeanor with other members of the health care team.
4 - Mature and collegial. Communicates expertly with other members of the health care team.

Written Note
0 - Medical record is poor, inadequate, or inaccurate.
1 - Medical record has occasional voids.
2 - Medical record is usually accurate, including the medical decision portion and progress notes.
3 - Medical record is always accurate and well organized.
4 - Medical record is excellent. Always accurate, well organized, and appropriate for level of care.

Presentation Skills
0 - Presentations are disorganized & incomplete with major omissions.
1 - Includes irrelevant facts and ramblings, but with no major omissions.
2 - Organized and provides the basic information, but may be verbose or have "holes". Dependent on written prompters.
3 - Organized and complete presentation. Attempts to chronicle key events in patient's illness. Minimal use of written prompters.
4 - Complete, concise, orderly, & polished. Clear delineation of primary problems, excellent characterization, accurate chronology of key events.

PROFESSIONALISM

Work Ethic
0 - Late. Whereabouts are often unknown. Level of commitment questionable.
1 - Punctual. Appears peripheral to team activities & patient care.
2 - Punctual. Can be relied upon to fulfill all required responsibilities of patient care.
3 - Attempts to seek new responsibilities.
4 - Exceptionally conscientious. Assumes high levels of responsibility.

Sensitivity
0 - Often seen as insensitive to the needs of the patient and unresponsive to diverse populations.
1 - Unable to provide the required sensitivity or responsiveness to the needs of the patient and diverse populations.
2 - Appropriate sensitivity and responsiveness to the needs of the patient and diverse populations.
3 - Above-average sensitivity and responsiveness to the needs of the patient and diverse populations.
4 - Exceptional sensitivity and responsiveness to the needs of the patient and diverse populations.

SYSTEMS-BASED PRACTICE

Cost-Effective Health Care
0 - No understanding of cost issues when developing management plans.
1 - Limited understanding of cost issues when developing management plans.
2 - Modifies plans to consider costs when prompted by faculty or residents.
3 - Above-average ability to consider cost issues when developing management plans.
4 - Exceptional ability to address cost issues when developing management plans.

Procedural Skills
0 - Poor knowledge of and/or ability in technique. Insensitive to patient needs.
1 - Awkward; reluctant to try even basic procedures.
2 - Occasional difficulty with knowledge and/or ability in technique.
3 - Appropriate knowledge and/or ability in procedural techniques.
4 - Outstanding level of knowledge and ability in procedural techniques.

Health Promotion
0 - Unaware of community resources and makes no effort to learn.
1 - Disregards community resources when treating patients.
2 - Uses appropriate community resources when pointed out by faculty or residents.
3 - Uses appropriate community resources independently.
4 - Fully and consistently uses available community resources and proactively looks for appropriate community resources.

Evaluator Name: _______________________  Evaluator Signature: _______________________

© Douglas S. Ander, MD; Emory University School of Medicine

Figure 1. Clinical evaluation form.

Table 1. Descriptive Statistics for Emergency Medicine Clerkship Evaluations (n = 2086 Evaluations on 289 Students)

Year of evaluation, n (%)
  2005: 1069 (51.2)
  2006: 529 (25.4)
  2007: 488 (23.4)
Calendar month of evaluation, n (%)
  January–April: 795 (38.1)
  May–August: 219 (10.5)
  September–December: 1072 (51.4)
Evaluator, n (%)
  Resident: 538 (25.8)
  Faculty, Department of Emergency Medicine: 1209 (58.0)
  Pediatric faculty, Department of Emergency Medicine: 339 (16.2)
Student's school, n (%)
  Emory: 1403 (67.3)
  Other: 683 (32.7)
RIME classification, n (%)
  Reporter: 86 (4.1)
  Interpreter: 502 (24.1)
  Manager: 1173 (56.2)
  Educator: 325 (15.6)
Clinical evaluation score, mean (± SD): 28.3 ± 5.1
Examination score, mean (± SD): 83.5 ± 6.7

RIME = Reporter-Interpreter-Manager-Educator method.


Data Analysis

Categorical variables, including year of evaluation, month of evaluation, type of evaluator, medical school, and RIME classification, were summarized as percentages. Continuous variables, the clinical evaluation score and the examination score, were summarized as the mean (± SD). For each of the 15 competencies, a student was graded on a scale of 0–4 points, with 0 representing the lowest level of competence and 4 representing the highest. Descriptive anchors were added to each level of each competency to improve the reliability of this portion of the evaluation form. Points from 9 of the 15 items were then summed to arrive at an overall clinical evaluation score, with possible scores ranging from 0 (lowest) to 36 (highest). We excluded procedural skills, health promotion, use of medical literature, medical note, sensitivity, and cost-effective health care due to a predominance of evaluators documenting that they were not observed. Missing data for these items ranged from 53% to 80%.
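To make the scoring concrete, the composite score can be sketched in a few lines. This is a minimal illustration only: the paper does not publish its dataset layout, so the DataFrame structure and column names below are hypothetical.

```python
# Minimal sketch of the composite clinical evaluation score described above.
# Hypothetical layout: one row per shift evaluation, one column per competency,
# each rated 0-4 (0 = lowest level of competence, 4 = highest).
import pandas as pd

# The nine retained items; the six items with predominantly "not observed"
# responses are excluded, per the Data Analysis section.
RETAINED_ITEMS = [
    "history_taking", "physical_exam", "problem_solving", "patient_management",
    "knowledge_base", "humanistic_qualities", "team_work",
    "presentation_skills", "work_ethic",
]

def clinical_evaluation_score(evals: pd.DataFrame) -> pd.Series:
    """Sum the nine 0-4 item ratings into a 0-36 composite score.

    Rows missing any retained item are dropped, mirroring the paper's
    complete-case analysis.
    """
    complete = evals.dropna(subset=RETAINED_ITEMS)
    return complete[RETAINED_ITEMS].sum(axis=1)
```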

The next step in the analysis was to assess the extent to which RIME scores were associated with the other methods of evaluating students, namely the clinical evaluation score and the final examination score. The associations were assessed by running linear regression models in which the RIME categories were the independent variables (entered as a series of dummy variables to represent each category) and the clinical evaluation score or final examination score was the dependent variable (entered as a continuous variable). From the models, we calculated the mean level of each outcome variable according to each RIME level. Additionally, as a measure of the association between RIME categories and the outcome variables, r² values were derived from each model. These r² values represent the amount of variation in each outcome that was explained by RIME categories. We ran unadjusted models, as well as models adjusted for potential predictors of the outcome, such as year of evaluation, evaluator status (faculty, resident, pediatrics), and school of the student. The data set included multiple evaluations per student; thus, the unit of analysis in the models was the evaluation. To correct for the fact that evaluations were correlated within the same student, p-values in our models were calculated with robust standard errors. All analyses were based on subjects with complete data on all variables of interest. All data were analyzed with Stata Version 10 (Stata Corporation, College Station, TX).
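A sketch of the unadjusted model is shown below, with Python's statsmodels standing in for the paper's Stata analysis. The column names (clinical_score, rime, student_id) are hypothetical, and the data frame is assumed to be complete-case, as in the paper.

```python
# Sketch of the unadjusted model: clinical evaluation score regressed on
# RIME category dummies, with standard errors clustered by student.
import statsmodels.formula.api as smf

def fit_rime_model(df):
    # C(rime) expands the four RIME categories into dummy variables.
    # df is assumed complete-case so the cluster groups align row-for-row.
    model = smf.ols("clinical_score ~ C(rime)", data=df)
    # Cluster-robust standard errors correct for multiple evaluations
    # per student (the unit of analysis is the evaluation).
    return model.fit(cov_type="cluster", cov_kwds={"groups": df["student_id"]})

# result = fit_rime_model(df)
# result.rsquared gives the share of variance explained by RIME categories,
# analogous to the r-squared values quoted in the Results and Table 2.
# An adjusted model would add terms such as
# "clinical_score ~ C(rime) + C(year) + C(evaluator_type) + C(school)".
```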

RESULTS

Four hundred thirty-nine students who completed the EM clerkship were enrolled in the study. There were 4834 shift evaluation forms submitted for these students. After excluding all items and covariates with missing data (portions of the 15 competencies), there were 2086 evaluation forms for 289 students available for analysis. There was no apparent difference between the included and excluded shift evaluation forms by either type of evaluator or final student grade. The percentage of evaluations done by the three types of evaluators was very similar for included and excluded evaluations: 26% of included evaluations were done by residents, 58% by faculty, and 16% by pediatric faculty, compared to 23%, 56%, and 21%, respectively, for excluded evaluations. The final grade distribution was also similar: for both included and excluded evaluations, 22% of students had "A," 74% had "B," and 4% had "C" or lower. Based on this comparison, the included and excluded evaluations appear similar with respect to type of evaluator and grade distribution.

Descriptive statistics are presented in Table 1. Evaluations occurred during the years 2005–2007, and the most common time of year for an evaluation to take place was September through December. The majority of evaluators were EM (non-pediatric) faculty and residents, and two-thirds of the subjects were students from our institution. The most common RIME descriptor used to classify the subjects was "manager" (Table 1). Average clinical evaluation scores and final examination scores were 28.3 ± 5.1 and 83.5 ± 6.7, respectively, which were towards the maximum potential scores for these evaluation scales.

Table 2. Ability of RIME Categories to Explain Variance in Individual Items on Clinical Evaluation Form

Individual Item from Clinical Evaluation Form    r² Due to RIME Categories
Patient management                               .33
Problem management                               .33
Knowledge base                                   .30
History-taking                                   .28
Presentation skills                              .27
Physical examination                             .26
Work ethic                                       .21
Works as part of a team                          .20
Humanistic qualities                             .15

Note: p-values for all r² values are p < 0.001.
RIME = Reporter-Interpreter-Manager-Educator method.

There was a clear positive relationship between RIME category and clinical evaluation score (r² = 0.40, p < 0.01). This result was essentially unchanged upon adjustment for month of evaluation, evaluator, and school of student.

RIME descriptive ratings correlated most strongly with patient management skills and least strongly with humanistic qualities (Table 2). Adjustment for other factors did not change these associations. There was only a weak association between RIME descriptive rating and final examination score.

DISCUSSION

RIME is increasingly being used by clerkship directors as part of the feedback and evaluation process. Published literature currently exists for its use in internal medicine and obstetrics and gynecology, but there is no published literature studying its use in EM (14,15). The RIME methodology has proven advantageous over traditional numerical evaluations, and its use in EM clerkship evaluations could benefit both students and educators.

Our results show that RIME and traditional clinical evaluation scores are positively related, though the moderate r-squared between these two measures suggests that they may be measuring different concepts. RIME descriptive ratings associated most strongly with "technical" competencies (patient and problem management, knowledge base, history-taking, and presentation skills) and less strongly, yet still reasonably, with "professional" competencies (work ethic, humanistic qualities, and team-working qualities). Based on these findings, we believe that RIME descriptive ratings are a valid evaluation modality in EM and may provide other information that complements traditional clinical evaluations. We have shown that when a RIME descriptive rating question is added to an end-of-shift evaluation, almost all faculty members complete both portions of the evaluation, suggesting that the addition of RIME will not detract from other portions of the evaluation. EM course directors can gain useful predictive data through the addition of RIME descriptive ratings. Based on our study methodology, we are not able to state that RIME descriptive ratings are more valid than other forms of clinical evaluation in EM. Future work could compare various evaluation tools to future performance in residency.

Limitations

Our methodology did not include a system for ongoing rater training and calibration. This may have impacted the way the raters used the RIME classification, although it is unusual for any clerkship or residency director to fully train and assess the ability of their faculty to use evaluation measures. The evaluation form included a description of each RIME classification as a guide, providing continuous education each time the form was completed. Faculty and residents are aware that this is not a grade but only an evaluation of the student's performance at one point in time. Grading takes into account several objective and subjective components, which is the reason we did not compare final grades to the RIME classification. Similarly, the reliability of the final examination is not known. The form design, with both measurements on the same form, may have led to a bias. Because RIME is different from the typical methods of evaluation during an emergency department shift, we believe that despite being on the same form, the RIME measurement was not affected by the other scores.

We compared the RIME score to the overall clinical score, but other measures may exist that are more reliable or valid assessments of clinical performance, such as observed patient assessments, objective structured clinical examinations, simulation, or future clinical performance. Given the lack of a gold standard, we compared RIME to the classic numerical rating of clinical performance, arguably the most common form of evaluation used in clinical clerkships. We are confident that this was a reasonable approach because our evaluation form has a sophisticated rating grid with a total of 15 categories, each with five descriptive anchors to assist the evaluator.

CONCLUSION

This is the first attempt by an EM clerkship to use a new evaluation instrument that has proven successful in other specialties. We found a positive association between RIME descriptive ratings and the numerical clinical evaluation scores, indicating that RIME is a valid evaluation instrument in an EM clerkship, complementing the classical evaluation form. The modest association suggests that RIME measures student characteristics not captured on our clinical evaluation form.

REFERENCES

1. Bandiera GW, Morrison LJ, Regehr G. Predictive validity of the global assessment form used in a final-year undergraduate rotation in emergency medicine. Acad Emerg Med 2002;9:889–95.

2. Noel GL, Herbers JE, Caplow MP, Cooper GS, Pangaro LN, Harvey J. How well do internal medicine faculty members evaluate the clinical skills of residents? Ann Intern Med 1992;117:757–65.

3. Ryan JG, Madel FS, Sama A, Ward ME. Reliability of faculty clinical evaluations of non-emergency medicine residents during emergency department rotations. Acad Emerg Med 2008;3:1124–30.

4. LaMantia J, Rennie W, Risucci DA, et al. Interobserver variability among faculty in evaluations of residents' clinical skills. Acad Emerg Med 2008;6:38–44.

5. Pangaro L. A new vocabulary and other innovations for improving descriptive in-training evaluations. Acad Med 1999;74:1203–7.

6. Hemmer PA, Pangaro L. Using formal evaluation sessions for case-based faculty development during clinical clerkships. Acad Med 2000;75:1216–21.

7. Griffith CH 3rd, Wilson JF. The association of student examination performance with faculty and resident ratings using a modified RIME process. J Gen Intern Med 2008;23:1020–3.

8. Hemmer PA, Hawkins R, Jackson JL, Pangaro L. Assessing how well three evaluation methods detect deficiencies in medical students' professionalism in two settings of an internal medicine clerkship. Acad Med 2000;75:167–73.

9. Durning SJ, Pangaro LN, Lawrence LL, Waechter D, McManigle J, Jackson JL. The feasibility, reliability, and validity of a program director's (supervisor's) evaluation form for medical school graduates. Acad Med 2005;80:964–8.

10. Hemmer PA, Pangaro L. Can a descriptive evaluation system detect student growth during a clerkship? Using descriptive evaluation to detect student growth. Proceedings from the Annual 2000 Meeting of the Clerkship Directors of Internal Medicine. Teach Learn Med 2001;13:199–205.

11. DeWitt DE, Carline D, Paauw DS, Pangaro L. A pilot study of a "RIME" framework-based tool for giving feedback in a multi-specialty longitudinal clerkship. Med Educ 2008;42:1205–9.

12. Hemmer PA, Papp KK, Mechaber AJ, Durning SJ. Evaluation, grading, and use of the RIME vocabulary on internal medicine clerkships: results of a national survey and comparison to other clinical clerkships. Teach Learn Med 2008;20:118–26.

13. Accreditation Council on Graduate Medical Education (ACGME). Common program requirements: general competencies. Available at: http://www.acgme.org/outcome/comp/GeneralCompetenciesStandards21307.pdf. Accessed April 21, 2010.

14. Battistone MJ, Pendeleton B, Milne C, et al. Global descriptive evaluations are more responsive than global numeric ratings in detecting students' progress during the inpatient portion of an internal medicine clerkship. Acad Med 2001;76:S105–7.

15. Ogburn T, Espey E. The R-I-M-E method for evaluation of medical students on an obstetrics and gynecology clerkship. Am J Obstet Gynecol 2003;189:666–9.


ARTICLE SUMMARY

1. Why is this topic important?

This is the first attempt by an emergency medicine (EM) clerkship to use a new evaluation instrument that has proven successful in other specialties.

2. What does this study attempt to show?

We hypothesized that the Reporter-Interpreter-Manager-Educator (RIME) descriptive ratings would correlate with clinical performance and examination scores in an EM clerkship, indicating that the RIME ratings are a valid measure of performance.

3. What are the key findings?

We noted a positive association between RIME and clinical evaluation scores, suggesting that RIME is a valid clinical evaluation instrument. RIME descriptive ratings can be incorporated into EM evaluation instruments and provide useful data related to patient management skills.