Page 1

Leading with Results: How Questioning Validity Fosters Proactive and Engaged Program Assessment

2014 Assessment Institute in Indianapolis

October 21st, 2014

Stephen R. Wallace
Anne-Marie Kuchinski
Mary Elaine Koren
Joseph Kutter
Tawanda Gipson

Page 2

Leading with Results

Session Outcomes

1. Describe how to use a results-driven activity to lead a program through a deeper analysis of student data, with the goal of fostering more engaged and proactive assessment.

2. Describe a process for determining the predictive validity of an assessment.

3. Interpret a correlation matrix by identifying assessments that are measuring something in common (or not in common).

Page 3

Leading with Results

Driving Question

Is the costly, high-stakes, pre-graduation HESI exam a predictor of success on the post-graduation NCLEX-RN exam (which graduates must pass for licensure)?

Page 4

Leading with Results

The Context

• Northern Illinois University
• Nursing, B.S. degree program
  – Must pass NCLEX for accreditation and program review
  – Student Learning Outcomes
  – Mix of program assessment methods

Page 5

Leading with Results

The Context

Student Learning Outcomes

1. Evaluate safe, quality, patient-centered, evidence-based nursing care to individuals, families, and communities.
2. Evaluate critical thinking/clinical reasoning when providing nursing care.
3. Implement quality improvement related to patient care.
4. Establish collaborative relationships with members of the interdisciplinary team.
5. Incorporate information management principles, techniques, and systems when providing nursing care.
6. Provide a leadership role in a variety of healthcare settings for the purpose of providing and improving patient care.
7. Defend use of professional, ethical, and legal principles while implementing the roles of the registered nurse as provider, designer/manager/coordinator of care, and member of the profession.

Assessment Methods (outcomes assessed)

• NCLEX-RN (1-7)
• HESI Exam (1-7)
• Classroom Assessments (1-7)
• Portfolio (1, 2)
• Student Survey (1-7)
• Faculty Survey (1-7)
• Baccalaureate Supplemental Alumni Questionnaire (1-7)
• Employer Feedback (1-7)

Page 6

Leading with Results

Predictive Validity

• Definition
  – The degree to which scores on an assessment predict future performance on another measure
• Process
  – Administer Assessment A, wait a period of time, administer Assessment B, then calculate
    1. Degree of relationship between assessments
    - or -
    2. Accuracy of A predicting B

(A brief computational sketch of both options follows.)
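A minimal sketch of the two options in Python, using hypothetical scores and a hypothetical cut score; the names, data, and the 850 threshold below are illustrative, not the program's actual values.

```python
# Hypothetical sketch of the two calculations: relationship and accuracy.
from scipy import stats

assessment_a = [912, 845, 780, 990, 860, 700, 930, 815]  # earlier assessment scores
passed_b     = [1,   1,   0,   1,   1,   0,   1,   1]    # later exam result (1 = pass)

# 1. Degree of relationship: point-biserial r, since Assessment B is pass/fail
r, p = stats.pointbiserialr(passed_b, assessment_a)

# 2. Accuracy of A predicting B at a candidate cut score
cut = 850
predicted = [1 if score >= cut else 0 for score in assessment_a]
accuracy = sum(pred == actual for pred, actual in zip(predicted, passed_b)) / len(passed_b)

print(f"r = {r:.2f} (p = {p:.3f}); accuracy at cut {cut} = {accuracy:.0%}")
```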

Page 7

Leading with Results

Answering the Question

• Replication
  – Langford & Young (2013), Predicting NCLEX-RN Success With the HESI Exit Exam: Eighth Validity Study
• NCLEX-RN is P/F → Predict accuracy
• Used highest HESI Exit Exam score and whether student passed NCLEX-RN

Page 8

Leading with Results

Answering the Question

• Predictive Validity Results

Predictive Accuracy of HESI (Langford & Young, 2013)

HESI Scoring Category    n       Pass NCLEX-RN (n)    Pass NCLEX-RN (%)
900 and higher           1,560   1,520                97.4
850-899                  721     666                  92.4

Note. N = 3,758. Accuracy in predicting NCLEX-RN success regardless of whether the student was required to take up to three versions of the HESI before achieving a score in either scoring category. NCLEX-RN scores are only reported as pass/fail.

Predictive Accuracy of HESI (Nursing, B.S. program)

HESI Scoring Category    n       Pass NCLEX-RN (n)    Pass NCLEX-RN (%)
900 and higher           260     257                  98.8
850-899                  122     116                  95.1
800-849                  92      81                   88.0
700-799                  130     96                   73.8
699 and below            39      11                   28.2

Note. N = 643. Accuracy in predicting NCLEX-RN success based on the student's highest score, regardless of which of three versions of the HESI the student took before achieving a score in any scoring category. NCLEX-RN scores are only reported as pass/fail.

• HESI accurately predicts NCLEX-RN (a sketch of the banded accuracy calculation follows)
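For reference, a minimal sketch of how percent passing by HESI scoring category can be tallied from per-student records; the records and column names below are hypothetical.

```python
# Hypothetical sketch: percent passing NCLEX-RN within each HESI scoring band.
import pandas as pd

df = pd.DataFrame({                      # illustrative per-student records
    "hesi":   [965, 910, 875, 830, 760, 720, 680, 905],
    "passed": [1,   1,   1,   1,   1,   0,   0,   1],
})

bands = pd.cut(df["hesi"],
               bins=[0, 699, 799, 849, 899, 1500],
               labels=["699 and below", "700-799", "800-849",
                       "850-899", "900 and higher"])

summary = df.groupby(bands, observed=False)["passed"].agg(["count", "sum"])
summary["percent_passing"] = (100 * summary["sum"] / summary["count"]).round(1)
print(summary)
```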

Page 9

Leading with Results

Deepening the Discussion

• Through Guided Questioning
  – Guided Discovery approach
  – Scaffold strategic questions
  – Extend Wait-Time I and II
• Examples
  – Did you answer the original question?
  – What are the program improvement implications?
  – Do you see additional patterns in the data? What trends do you see?

Page 10

Leading with Results

Deepening the Discussion

(The Predictive Accuracy of HESI table for the Nursing, B.S. program is repeated from the previous slide: 98.8% passing NCLEX-RN at 900 and higher, 95.1% at 850-899, 88.0% at 800-849, 73.8% at 700-799, and 28.2% at 699 and below.)

Is there a cut score for remediation?

Page 11

Leading with Results

Deepening the Discussion

Combined Pass Rate and Potential Cut Scores

Percent Passing NCLEX-RN (Nursing, B.S. program)

HESI Scoring Category    n     Pass NCLEX-RN (n, %)    Fail NCLEX-RN (n, %)    Combined Pass Rate
963 and higher           130   130   100.0             0    0.0                100.0
905-962                  112   110    98.2             2    1.8                 99.0
859-904                  118   113    95.8             5    4.2                 98.0
830-858                  62    57     91.9             5    8.1                 97.0
806-829                  44    38     86.4             6    13.6                96.0
777-805                  47    40     85.1             7    14.9                95.0
755-776                  31    24     77.4             7    22.6                94.0
744-754                  19    12     63.2             7    36.8                93.0
743 and below            80    37     46.3             43   53.8                92.0 and below
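A minimal sketch of how the combined pass rate can be recomputed for any candidate cut score; the per-student records and the list of candidate cuts below are hypothetical.

```python
# Hypothetical sketch: combined NCLEX-RN pass rate for students at or above
# each candidate HESI cut score (students below the cut would be flagged for remediation).
import pandas as pd

df = pd.DataFrame({                      # illustrative per-student records
    "hesi":   [970, 940, 905, 880, 860, 845, 820, 790, 760, 748, 730, 690],
    "passed": [1,   1,   1,   1,   1,   0,   1,   1,   0,   1,   0,   0],
})

for cut in [963, 905, 859, 830, 806, 777, 755, 744]:   # candidate cut scores
    eligible = df[df["hesi"] >= cut]
    rate = 100 * eligible["passed"].mean()
    print(f"Cut {cut}: n = {len(eligible)}, combined pass rate = {rate:.1f}%")
```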

Page 12

Leading with Results

Deepening the Discussion

Discussion leads to new discoveries
  – University Writing Project
    • AAC&U VALUE Rubrics
    • Written communication and critical thinking
  – VSA (Voluntary System of Accountability)
    • CLA (Collegiate Learning Assessment)
    • Written communication and analytical reasoning
  – HESI
    • Critical thinking

Page 13

Leading with Results

Deepening the Discussion

• What could we do with this new information?
• What questions could be answered?

Think – Pair – Share for a moment

Page 14

Leading with Results

New Questions Arise

• Are the assessments measuring the same things?

• Can the Nursing program capitalize on the authentic assessments it is already using?

• Answer the questions through
  – A correlational study
  – Similar to a predictive validity study

Page 15

Leading with Results

Correlation Coefficient, r

• Definition
  – Degree to which scores on Assessment A predict scores on Assessment B
• Range
  – Between -1.00 and +1.00
• Sign
  – Indicates direction: + (↑↑ and ↓↓) vs. - (↑↓ and ↓↑)
• Absolute value
  – Indicates strength (stronger as it moves away from 0.0)

(A small numeric illustration follows.)
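A brief illustration of sign and strength, using hypothetical scores.

```python
# Illustrative only: r is near +1 when B rises with A, near -1 when B falls as A rises.
import numpy as np

a      = [70, 75, 80, 85, 90]
b_up   = [65, 72, 78, 84, 91]   # moves with a
b_down = [91, 84, 78, 72, 65]   # moves against a

print(round(np.corrcoef(a, b_up)[0, 1], 2))    # strong positive, close to +1.00
print(round(np.corrcoef(a, b_down)[0, 1], 2))  # strong negative, close to -1.00
```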

Page 16

Leading with Results

Correlation Matrix

            VALUE-CT  VALUE-WC  CLA-AR  CLA-WE  CLA-WM  HESI-CT  ACT    GPA    NCLEX
VALUE-CT      –
VALUE-WC     .88**      –
CLA-AR       .24       .18        –
CLA-WE       .24       .20       .85**    –
CLA-WM       .40**     .38**     .74**   .72**    –
HESI-CT      .48**     .55**     .11     .18     .29       –
ACT          .29*      .24       .27     .24     .34*     .43**    –
GPA          .37**     .33*      .19     .30*    .22      .49**   .38**    –
NCLEX        .21       .31*      .15     .15     .33*     .68**   .45**   .25     –

Note. ** p < 0.01, * p < 0.05.
Abbreviations: VALUE-CT = VALUE Rubric – Critical Thinking; VALUE-WC = VALUE Rubric – Written Communication; CLA-AR = CLA – Analytical Reasoning; CLA-WE = CLA – Written Expression; CLA-WM = CLA – Writing Mechanics; HESI-CT = HESI – Critical Thinking.
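A minimal sketch of how a matrix like this can be produced, assuming the per-student scores have been merged into one table; the file name and column names are illustrative, and NCLEX-RN is assumed to be coded 0/1.

```python
# Hypothetical sketch: pairwise Pearson correlations plus p-values for flagging.
import pandas as pd
from scipy import stats

cols = ["value_ct", "value_wc", "cla_ar", "cla_we", "cla_wm",
        "hesi_ct", "act", "gpa", "nclex"]            # nclex coded 0/1 (fail/pass)
df = pd.read_csv("per_student_scores.csv")           # hypothetical merged file

r_matrix = df[cols].corr(method="pearson")           # the correlation matrix

def p_value(a, b):
    """p-value for the a-b correlation, used to flag * (p < .05) and ** (p < .01)."""
    if a == b:
        return 0.0                                   # diagonal: r = 1 by definition
    pair = df[[a, b]].dropna()
    r, p = stats.pearsonr(pair[a], pair[b])
    return p

p_matrix = pd.DataFrame([[p_value(a, b) for b in cols] for a in cols],
                        index=cols, columns=cols)
print(r_matrix.round(2))
```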

Page 17

Leading with Results

Correlation Matrix (Shading)

(The correlation matrix from the previous slide is repeated here with shading added to the cells discussed on the following slides. Note. ** p < 0.01, * p < 0.05.)

Page 18

Leading with Results

Correlation Matrix (Same Constructs)

(The correlation matrix is repeated with the cells for assessments intended to measure the same construct highlighted: for critical thinking, VALUE-CT × HESI-CT = .48**, VALUE-CT × CLA-AR = .24, CLA-AR × HESI-CT = .11; for written communication, CLA-WE × CLA-WM = .72**, VALUE-WC × CLA-WM = .38**, VALUE-WC × CLA-WE = .20. Note. ** p < 0.01, * p < 0.05.)

Page 19

Leading with Results

Correlation Matrix (Different Constructs)

(The correlation matrix is repeated with the cells for assessments intended to measure different constructs highlighted, e.g., VALUE-CT × VALUE-WC = .88**, CLA-AR × CLA-WE = .85**, CLA-AR × CLA-WM = .74**, VALUE-WC × HESI-CT = .55**, VALUE-CT × CLA-WM = .40**, CLA-WM × HESI-CT = .29, VALUE-CT × CLA-WE = .24, VALUE-WC × CLA-AR = .18, CLA-WE × HESI-CT = .18. Note. ** p < 0.01, * p < 0.05.)

Page 20

Leading with Results

Correlation Matrix (Outcomes)

(The correlation matrix is repeated a final time, with highlighting that draws attention to the correlations with the program outcome measures, such as GPA and NCLEX-RN. Note. ** p < 0.01, * p < 0.05.)

Page 21

Leading with Results

Correlation Conclusions

• Written Communication and Critical Thinking appear to be related

• The CLA appears to relate more to itself than to other important program outcomes

• The VALUE Rubric assessments correlate with important program outcomes

• The VALUE Rubrics appear more valid than the CLA at the program and university level

Page 22

Leading with Results

Program Assessment

• What should the Nursing program do?
• And the university?
• Why?

Think – Pair – Share for a moment

Page 23

Leading with Results

Program Assessment

Integrating the VALUE Rubrics into program assessment requires alignment among the

Outcomes

Portfolio Rubric

VALUE Rubric

Page 24

Leading with Results

Program Assessment

• Are they aligned?
• Crosswalk the
  – Program Outcomes (p. 1)
  – Adapted VALUE Rubric (p. 9)
  – Portfolio Rubric (p. 10)
• Is anything missing in the Outcomes?

(A small crosswalk sketch follows the diagram below.)

Outcomes

Portfolio Rubric

VALUE Rubric
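One way to think about the crosswalk is as a coverage check: map each rubric criterion to the program outcomes it addresses, then look for outcomes nothing covers. The mappings below are placeholders, not the program's actual crosswalk.

```python
# Hypothetical sketch of a crosswalk coverage check (mappings are placeholders).
program_outcomes = set(range(1, 8))                  # outcomes 1-7 from the earlier table

value_rubric = {                                     # criterion -> outcomes addressed
    "critical thinking":     {2},
    "written communication": set(),                  # not currently tied to an outcome
}
portfolio_rubric = {
    "evidence-based care": {1},
    "collaboration":       {4},
}

covered = set().union(*value_rubric.values(), *portfolio_rubric.values())
print("Outcomes not addressed by either rubric:", sorted(program_outcomes - covered))
```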

Page 25

Leading with Results

Program Assessment

• Written Communication is NOT one of the student learning outcomes

• Written Communication IS emphasized throughout the curriculum

• Steps the Nursing program is taking

Page 26

Leading with Results

Program Assessment

Program assessment is effective and efficient when assessments and corresponding rubrics

– Are aligned with
  • Course-level objectives
  • Program student learning outcomes
  • Broader university outcomes

– Are used for multiple purposes
  • Formatively
  • Summatively
  • Internally
  • Externally

Page 27

Leading with Results

Lessons Learned

• How have you used assessment results to foster more engaged and proactive assessment on your campus? Have you
  – Put results in the hands of users?
  – Asked probing questions?
  – Guided improvement efforts?

Page 28

Leading with Results

Workshop Outcomes

1. Describe how to use a results-driven activity to lead a program through a deeper analysis of student data, with the goal of fostering more engaged and proactive assessment.

2. Describe a process for determining the predictive validity of an assessment.

3. Interpret a correlation matrix by identifying assessments that are measuring something in common (or not in common).

Page 29

Leading with Results

Contact

Stephen R. Wallace
Associate Director, Office of Assessment Services
Northern Illinois University, DeKalb, IL
[email protected]

Anne-Marie Kuchinski
Undergraduate NCLEX-RN Coordinator, Nursing Program
Northern Illinois University, DeKalb, IL
[email protected]

Mary Elaine Koren
Associate Professor and Area Coordinator, Nursing Program
Northern Illinois University, DeKalb, IL
[email protected]

Page 30

Leading with Results

S. R. Wallace © 2014