NCA Residency Session 7, March 8, 2017
TRANSCRIPT
AGENDA: Learning Collaborative Session 7, March 8, 3:00-4:30 pm (EST)
• Welcome and Review Moodle/Assignments
• Questions on NP Finances
• Residency Program Policies and Procedures
• Curriculum: Evaluation of the Learner
  – How and Why to Assess Your Residents
  – Assessment Tools and Process
• Action Period Items
  – Begin working on policies and procedures
  – Continue curriculum development
  – Progress Checklist and MONTHLY REPORTS!
Monthly Reports
due EVERY MONTH!
• 1) Is the program set up as a separate cost center? If so, what is costed directly to the Residency Program cost center?
  – Are the revenues from the residents credited to this cost center?
  – Are the salaries of the mentors and continuity clinics charged to the cost center?
• 2) Is the program considered fully in scope, or was this separately added to your scope? How has this been handled with respect to UDS, the FFR, and the SAC 330 budget?
• 3) Is the work of the residents at the off-site specialty rotations covered by FTCA or gap policies?
Creating policy and procedures for your program
• Policies vs. procedures
• What policies do you need to create for the residency?
• What procedures may you need to adapt to fit the program? (e.g., PTO)
• What policies does your organization already have?
Policies and Procedures
• Residency-specific policies
  1. Ramp-up policy
  2. Precepting policy
  3. Patient panel transfer policy
• Accreditation policies
  – Check accreditation standards for what policies are required
Policies and Procedures
• Start putting pen to paper to develop policies and procedures
• Create program manuals
  – Program staff
  – Residents
• Important to have these established for training new staff
Curriculum Development
Assessment of Resident Performance
[Driver diagram: MISSION leads to program goals/objectives and learner outcomes/competencies, which drive the CURRICULUM (domains/subdomains, clinical topics/KSAs, didactic schedule, space/equipment, policies, patients, preceptors, faculty). Drivers include leadership/board/finances, marketing/recruitment, and accreditation. ASSESS LEARNER and REMEDIATION OF LEARNER loops produce GRADUATES WHO FULFILL YOUR MISSION.]
Learning Objectives
Knowledge:
– Understand the purpose of assessment
– Know the characteristics of good assessment
– Understand how assessment builds trainee and programmatic performance
Attitude:
– Appreciate the importance of good assessment
– Embrace the challenge
Skills:
– To be gained by independent/group work building on information provided in the presentation
Overview of the Session
• Defining terms: the difference between evaluation and assessment
• How assessment/evaluation fits into the bigger picture of curriculum and program development
  – Integrated throughout the program
  – Creates explicit expectations for the trainee
  – Building blocks for program evaluation
  – Engine for trainee and program improvement
• Characteristics of effective assessment and evaluation
• Examples of techniques/methods
• Discussion
Definitions
• Assessment
  The process of measuring learning (describing, collecting, recording, and scoring information), generally focusing on observable KSAs.
  The gathering of information about learner performance that is relevant to stated competencies/outcomes.
  The goal of assessment is performance improvement, as opposed to simply being judged: it provides information for changes/interventions that improve learner performance. Formative.
• Evaluation
  The process of making judgments: comparing assessment data against established criteria, evidence, or standards to determine the extent to which learner competencies/outcomes and program goals have been met.
  Provides information for changes/interventions that improve learner/program performance. Summative.
Definitions, cont'd
• Program goals: General and 'fuzzy', they are aspirational: an overall outline of what the program will accomplish.
• Program objectives: Measurable and specific. They introduce the curricular domains of the program, e.g., Patient-Centered Care, Professionalism, Clinical Practice. Within the domains are subdomains that contain specific learner outcomes.
• Learner outcomes: Measurable benchmarks, the intended results of the curriculum. They describe what the learner will actually do, often using Bloom's taxonomy of action verbs. Summative (final) data describing learner performance are compared to the benchmarks; this is your indicator of achieving outcomes, and your evidence that your residents are learning and doing what you said they would learn and do.
The Relationship between Assessment and Evaluation
Formative Assessment for Learner Feedback
Summative Evaluation for Improvement
Summative Evaluation for Programmatic Improvement
APA Guidelines, Domain E: Resident–Supervisor Relations
At least semiannually, written feedback regarding meeting performance requirements:
(a) Initial written evaluation provided early enough for self-correction;
(b) Second written evaluation early enough to provide time for continued correction or development;
(c) Discussion and signing of the evaluation by resident and supervisor;
(d) Timely written notification of problems, opportunity to discuss them, and guidance regarding remediation; and
(e) Substantive written feedback on the extent to which corrective actions are or are not successful.
NNPRFTC Standard 3: Evaluation
Evaluation components
• Institutional performance
• Programmatic performance
• Trainee performance
• Instructor and staff performance

• Assessment based on the program's core elements, competencies, and curriculum components
• Assess the performance of each trainee through periodic and objective assessment (formative and summative)
• Include identification of any deficiencies or performance concerns
• Process for trainee performance concerns, including an improvement plan with measurable goals
Models of Learner Assessment
Learner assessment is anchored in the learning theory or model used to create the curriculum; measure the important milestones specified by the learning theory in the context of the curriculum.

– Malcolm Knowles, Andragogy ("adult learning"):
  • Involve learners in the planning and evaluation of their instruction.
  • Experience (including mistakes) provides the basis for the learning activities.
  • Adult learning is problem-centered rather than content-oriented. (Kearsley, 2010)
– James Englander et al. (2013): 8 clinical competencies
– Dreyfus/Benner:
  • Novice to expert
  • Assessment tailored to each level of proficiency
Dreyfus/Benner
Types of Assessment/Evaluation
• Formative: formal and informal; ongoing, periodic
• Summative: formal; "final"
• Personal, peer, expert
• Surveys, simulations, criterion/standard-referenced observation
• Journals
• 360°
• Portfolio
• Project
Characteristics of Effective Assessment
Reliable (replicable)
• Multiple observers
• Reproducible observations/outcomes
Valid (meaningful)
• Useful indicator of performance, competency
• Relevant to professional practice
Measurable/observable
• Verifiable
Impact – Feedback Loop

[The driver diagram is shown again: learner assessment and remediation feed back into the curriculum and program, creating the impact–feedback loop that produces graduates who fulfill your mission.]
Examples: APA Accreditation
• Competency/domain: Professionalism
• Learner outcome: Demonstrates in behavior and comportment the professional values and attitudes of the discipline of psychology
• Subdomains: Professional Values and Attitudes, Cultural Diversity, Ethics, Reflective Practice/Self-Assessment
• Measurable outcome for subdomains: CHCI, using Dreyfus novice to expert
Example: APA Accreditation with CHCI Outcomes
• Subdomain: Professional Values and Attitudes
• Components of subdomain: Integrity, Accountability, Concern for the welfare of others
• Outcome for Integrity: Monitors and independently resolves situations that challenge professional values and integrity
• Outcome for Accountability: Independently accepts personal responsibility across settings and contexts
CHCI Rating Scale for Postdoc Psychologists
0) No interaction
1) Novice: entry-level skills, knowledge, attitudes
2) Advanced Beginner: developing skills, knowledge, and attitudes
3) Competent: developed skills, knowledge, and attitudes
4) Proficient: advanced skills, knowledge, and attitudes
5) Expert: authority for skills, knowledge, and attitudes
Example: NNPRFTC Accreditation
• Competency/domain: Patient Care / Knowledge for Practice
• Learner outcome: Provide effective, evidence-based, patient-centered care for the treatment of health problems and the promotion of health
• Subdomains: diagnostic tests, history & physical, prescribing, plan of care
• CHCI's model for assessment measurement: Dreyfus/Benner novice to expert
NNPRFTC Accreditation with CHCI Outcomes
• Subdomain: History & physical
• Outcome for history & physical: Perform a comprehensive history and physical exam
• Outcome for diagnostic tests: Order appropriate screening and diagnostic tests
• Outcome for prescribing: Order appropriate medications
CHCI NP Residency Rating Scale
0 N/A: Not applicable, not observed, or not performed
1 Novice: Observes task only; entry-level skills, knowledge, attitudes
2 Advanced Beginner: Needs direct supervision; developing skills, knowledge, attitudes
3 Competent: Needs supervision periodically; developed skills, knowledge, attitudes
4 Proficient: Able to perform without supervision; advanced skills, knowledge, attitudes
5 Expert: Able to supervise others; authority for skills, knowledge, attitudes
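For programs tracking these ratings electronically, the scale above can be encoded as a simple lookup table. This is an illustrative sketch only; the `CHCI_SCALE` name and `describe_rating` helper are hypothetical, not part of any CHCI tool:

```python
# Hypothetical encoding of the CHCI NP Residency rating scale (0-5)
# for use in a scoring spreadsheet or small reporting script.
CHCI_SCALE = {
    0: ("N/A", "Not applicable, not observed, or not performed"),
    1: ("Novice", "Observes task only; entry-level skills, knowledge, attitudes"),
    2: ("Advanced Beginner", "Needs direct supervision; developing skills, knowledge, attitudes"),
    3: ("Competent", "Needs supervision periodically; developed skills, knowledge, attitudes"),
    4: ("Proficient", "Able to perform without supervision; advanced skills, knowledge, attitudes"),
    5: ("Expert", "Able to supervise others; authority for skills, knowledge, attitudes"),
}

def describe_rating(score: int) -> str:
    """Return 'Label: description' for a CHCI rating; raises KeyError for invalid scores."""
    label, description = CHCI_SCALE[score]
    return f"{label}: {description}"
```

A reviewer could then render a resident's item ratings as readable labels rather than bare numbers when assembling the preceptor team's final assessment.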
CHCI Assessment Protocol
• Residents are assessed in 8 competency domain areas (based on NNPRFTC accreditation curriculum standards)
• Residents complete a self-assessment at baseline, 6 months, and 12 months
• Preceptors complete assessments at 6 and 12 months
• The preceptor team develops one final assessment for each resident
Creating Your Assessment Process
• Anchor it in the curriculum and program objectives
• What is the evidence/documentation?
• What methods do you want to use?
• Use reliable and valid techniques
• When are you going to collect data?
• Conduct systematic formative (ongoing) and summative (final) data collection
• Create a feedback loop: remediation and using the information
• Measure the impact
Resources
• Pell Institute: a user-friendly toolbox that steps through every point in the evaluation process: designing a plan, data collection and analysis, dissemination and communication, and program improvement.
• CDC: an evaluation workbook for obesity programs; its concepts and detailed work products can be readily adapted to NP postgraduate programs.
• The Community Tool Box (Work Group for Community Health at the University of Kansas): an incredibly complete and understandable resource providing theoretical overviews, practical suggestions, a tool box, checklists, and an extensive bibliography.
Resources, cont'd
• Another wonderful resource, Designing Your Program Evaluation Plans, provides a self-study approach to evaluation for nonprofit organizations and is easily adapted to training programs. There are checklists and suggested activities, as well as recommended readings.
• http://edglossary.org/assessment/
• NNPRFTC website – blogs: http://www.nppostgradtraining.com/Education-Knowledge/Blog/ArtMID/593/ArticleID/2026/Accreditation-Standard-3-Evaluation
Creating an Evaluation Process
Kathryn Rugen, PhD, FNP-BC, FAAN, FAANP
VETERANS HEALTH ADMINISTRATION
Objectives
• Explain the development of the NP Residency competency tool
• Describe the validation of the NP Residency competency tool
NP Competency Tool
• Demonstrate program effectiveness
• Standardization across 5 sites
• Document competence in 7 domains
• Prepare for site accreditation
Development
– AACN/CCNE Masters and DNP Essentials
– AACN/NONPF Adult-Gerontology Nurse Practitioner Core Competencies
– NCQA PCMH Standards
– Core Competencies for Interprofessional Collaborative Practice (IPEC)
– ACGME competencies
– VA top outpatient diagnoses
– COE education core domains
– Entrustable Professional Activities
Content Validity
• Iterative process
  – VA NP experts at each site and an MD education consultant
  – A post-graduate NP trainee reviewed the tool and offered suggestions
  – Solicited input from experienced and new NPs throughout VA Primary Care
Domains
• Clinical competency in planning and managing care
• Leadership
• Interprofessional team collaboration
• Patient-centered care
• Shared decision making
• Sustain relationships
• Quality improvement and population management
Methods
• NP resident and mentor complete the competency tool at 1, 6, and 12 months (69 items total)
• Rate on a 0-5 scale:
  – 0 = not observed or not performed
  – 1 = observes task only
  – 2 = needs direct supervision
  – 3 = needs supervision periodically
  – 4 = able to perform without supervision
  – 5 = able to supervise others (aspirational!)
• NP resident responds to open-ended questions
Analysis
• Evaluation questions:
  – Identify the items and domains where NP residents are strongest and weakest
  – Determine how NP residents progress over time
  – Determine agreement between trainee and mentor ratings
• Descriptive statistics to evaluate the distributional characteristics of each item and domain, and the impact of time on trainee and mentor ratings
• T-tests and general linear models to assess the relationship between NP resident and mentor ratings over time
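As a rough illustration of the analysis steps above (descriptive statistics plus a t-test comparing mentor and trainee ratings), here is a sketch using invented data, not the VA ratings; it assumes NumPy and SciPy are available:

```python
# Sketch: descriptive statistics and a paired t-test on domain-mean ratings.
# The ratings below are randomly generated for illustration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_residents = 30
# Trainee self-ratings and mentor ratings for the same residents, 0-5 scale
trainee_means = rng.normal(3.4, 0.5, n_residents).clip(0, 5)
mentor_means = (trainee_means + rng.normal(0.2, 0.3, n_residents)).clip(0, 5)

# Descriptive statistics for each rater group (n, mean, SD, range)
for label, x in [("Trainee", trainee_means), ("Mentor", mentor_means)]:
    print(f"{label}: n={len(x)} mean={x.mean():.2f} "
          f"SD={x.std(ddof=1):.2f} range={x.min():.2f}-{x.max():.2f}")

# Paired t-test: do mentor and trainee ratings differ for the same residents?
t_stat, p_value = stats.ttest_rel(mentor_means, trainee_means)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

A general linear model (e.g., via `statsmodels`) would extend this to rating trajectories across the 1-, 6-, and 12-month time points.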
Clinical Competency

Subscale: Clinical Competency in Planning/Managing Care (n, mean, SD, range)

Trainee ratings:
  1 month: n=37, mean 2.75, SD .56, range 1.71-3.85
  6 months: n=34, mean 3.41, SD .46, range 2.28-4.25
  12 months: n=35, mean 3.75, SD 1.43, range 0-5.00
  p < .0001
Faculty ratings:
  1 month: n=37, mean 2.94, SD .60, range 1.86-4.59
  6 months: n=34, mean 3.68, SD .49, range 2.89-5.00
  12 months: n=36, mean 4.42, SD .50, range 3.50-5.00
  p < .0001
Clinical Competency

[Figure: mean mentor and trainee ratings at 1, 6, and 12 months (0-5 scale) for the 28 clinical competency items: 1.1 Comprehensive H&P; 1.2 Differential Dx; 1.3 Screening/Dx Tests; 1.4 Appropriate Consults; 1.5 Medications; 1.6 Med Review/Reconciliation; 1.7 Case Presentation; 1.8 Hypertension; 1.9 Obesity; 1.10 Diabetes Mellitus; 1.11 Depression; 1.12 Ischemic Heart Disease; 1.13 Gastroesophageal Reflux; 1.14 PTSD; 1.15 Enlarged Prostate; 1.16 COPD; 1.17 Anemia; 1.18 Chronic Renal Failure; 1.19 Heart Failure; 1.20 Asthma; 1.21 Peripheral Arterial Disease; 1.22 Osteoarthritis; 1.23 Substance Abuse; 1.24 Military Sexual Trauma; 1.25 Suicidality; 1.26 TBI; 1.27 Hepatitis C; 1.28 Evidence-Based Guidelines.]
Leadership Competency

Subscale: Leadership (n, mean, SD, range)

Trainee ratings:
  1 month: n=37, mean 1.45, SD 1.35, range 0-4.85
  6 months: n=34, mean 2.41, SD 1.58, range 0-5.00
  12 months: n=35, mean 3.13, SD 1.56, range 0-5.00
  p < .0001
Faculty ratings:
  1 month: n=28, mean 2.64, SD 1.23, range 1.00-4.33
  6 months: n=29, mean 3.63, SD .67, range 2.00-5.00
  12 months: n=36, mean 4.44, SD .55, range 3.20-5.00
  p < .0001
Leadership Competency

[Figure: mean mentor and trainee ratings at 1, 6, and 12 months (0-5 scale) for the 7 leadership items: 2.1 Lead PACT team huddle; 2.2 Lead case conference; 2.3 Lead team meeting using conflict mgmt/resolution; 2.4 Lead group education activities for pts/families, PACT team, peers; 2.5 Lead PACT team quality improvement project; 2.6 Lead shared/group medical appts; 2.7 Apply leadership strategies to support collaborative practice/team effectiveness.]
Item Analysis: Clinical Competence
• At 1 month, 24 of 28 items were rated between 2 and 3 by the NP residents (2 = needs direct supervision; 3 = needs supervision periodically); only four items were rated higher than 3.
• The four items rated higher than 3 were "perform comprehensive history and physical exam" (3.48), "perform medication reconciliation" (3.54), "management of hypertension" (3.13), and "management of obesity" (3.35).
• At the 12-month time point, all items were rated higher than 3, and seven of 28 items were rated higher than 4 (able to perform without supervision) by the NP residents.
• The seven items rated 4 or higher were "perform comprehensive history and physical exam" (4.17), "order appropriate consults" (4.11), "perform medication reconciliation" (4.14), "management of hypertension" (4.08), "management of obesity" (4.11), "management of gastroesophageal reflux" (4.02), and "management of osteoarthritis" (4.00).
• At the 12-month time point, the mentors' ratings were all above 4 (4 = able to perform without supervision) except for two items: "management of military sexual trauma" (3.58) and "management of traumatic brain injury" (3.66).
Psychometric Analysis
• Internal consistency: the degree to which the items are measuring the same attribute
• Cronbach's (coefficient) alpha ranges from .00 to 1.0; the higher the value, the higher the internal consistency
• Internal consistency was calculated by NP resident and mentor for each domain and each time point; α = 0.82-0.96
• Triangulating quantitative data, qualitative data, and the end-of-program evaluation further enhances content validity
• Factor analysis will be used for construct validation; it identifies clusters of related variables
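For readers who want to reproduce the internal-consistency calculation, Cronbach's alpha can be computed directly from its standard formula. The ratings below are invented for illustration, not the VA data:

```python
# Cronbach's alpha for one competency domain.
# alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """scores: rows = residents, columns = items in one domain."""
    k = scores.shape[1]                          # number of items in the domain
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of residents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Example: 6 residents rated on 4 related items using the 0-5 scale
ratings = np.array([
    [3, 3, 4, 3],
    [2, 2, 2, 3],
    [4, 4, 5, 4],
    [3, 4, 3, 3],
    [5, 4, 4, 5],
    [2, 3, 2, 2],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")  # higher => more internal consistency
```

With real data, this calculation would be repeated per domain, per rater (resident vs. mentor), and per time point, as described above.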
Please complete the survey after the session!