TRANSCRIPT
Assessment in General Education: A Case Study in Scientific and
Quantitative Reasoning
B.J. Miller & Donna L. Sundre
Center for Assessment and Research Studies
James Madison University
http://www.jmu.edu/assessment/
The First Habit
#1 Be proactive
A Proactive Assessment Model
• Provides for program improvement
• Fulfills accountability requirements
Preview
• JMU assessment model: background, five stages
• Assessment in Science & Math: methods, results, discussion
James Madison University
• Public
• Harrisonburg, VA
• Enrollment: 15,000 undergraduate, 700 graduate
• Center for Assessment and Research Studies (CARS)
General Education at JMU

Cluster (Credits)
1: Skills for the 21st Century (9): Critical Thinking, Written and Oral Communication, Information Literacy
2: Arts and Humanities (9): Culture, Philosophy, Fine Arts and Literature
3: The Natural World (10): Problem-solving Skills in Science and Mathematics
4: Social and Cultural Processes (7): Global and American History, Government, Economics, Anthropology
5: Individuals in the Human Community (6): Wellness, Psychology, and Sociology
Example from Cluster Three

Mathematics (choose one):
MATH 103. The Nature of Mathematics
MATH 107. Fundamentals of Mathematics I
MATH 205. Introductory Calculus I
MATH 220. Elementary Statistics
MATH 231. Calculus with Functions I
MATH 235. Calculus I

Science (choose one):
GSCI 101. Physics, Chemistry, & the Human Experience
GSCI 102. Environment Earth
GSCI 103. Discovering Life
GSCI 104. Scientific Perspectives
Stages of the Assessment Process
1. Establishing Objectives
2. Selecting/Designing Instruments
3. Collecting Data
4. Analyzing Data
5. Using Results
(a continuous cycle)
Definition of Assessment
“…the systematic basis for making inferences about student learning and development” (Erwin, 1991, p. 28).
Establishing Objectives
• Expected/intended student outcomes
• Good objectives are student-oriented, observable, measurable, and reasonable
• Drive the assessment process
Examples from Cluster Three
Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena.
Discriminate between association and causation, and identify the types of evidence used to establish causation.
Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses.
Selecting / Designing Instruments
• Selecting an instrument: look to other universities and to resources such as the MMY and TIP
• Considerations: Does it match my objectives? How much will it cost? How do I administer it? Are the scores reliable and valid?
Selecting / Designing Instruments
• Designing an instrument: table of specifications, item pool, pilot test, item analysis, item revision, pilot test again, reliability and validity
Learning Objective | # of Items | % of Items
... | |
Use graphical, symbolic, and numerical methods to analyze, organize, and interpret natural phenomena. | 18 | 21%
Discriminate between association and causation, and identify the types of evidence used to establish causation. | 9 | 11%
Formulate hypotheses, identify relevant variables, and design experiments to test hypotheses. | 12 | 14%
... | |
Total test | 85 items |
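The pilot-test and item-analysis steps in the design sequence can be sketched in code. This is a hypothetical illustration with invented response data, not the actual Cluster Three analysis: item difficulty is the proportion of examinees answering correctly, and discrimination is the item-total (point-biserial) correlation. Items with extreme difficulty or low discrimination are candidates for revision.

```python
# Classical item analysis for a pilot test: a minimal sketch with
# invented data, not the actual Cluster Three item pool.
# Each row is one examinee's scored responses (1 = correct, 0 = incorrect).
from statistics import mean, stdev

responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
]

n_items = len(responses[0])
totals = [sum(person) for person in responses]

def difficulty(item):
    """Proportion of examinees answering the item correctly (p-value)."""
    return mean(person[item] for person in responses)

def discrimination(item):
    """Point-biserial correlation between the item and the total score."""
    scores = [person[item] for person in responses]
    mx, my = mean(scores), mean(totals)
    n = len(scores)
    cov = sum((x - mx) * (y - my) for x, y in zip(scores, totals)) / (n - 1)
    return cov / (stdev(scores) * stdev(totals))

for i in range(n_items):
    print(f"Item {i + 1}: difficulty = {difficulty(i):.2f}, "
          f"discrimination = {discrimination(i):.2f}")
```

In practice the total score would exclude the item being analyzed (a corrected item-total correlation), but the uncorrected version shown here keeps the sketch short.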
Collecting Data
• Who?
• When?
• How?
• Why?
Assessment Days at JMU
• All students
• August and February
• Paper-and-pencil and computer-based administration
Assessment Day Data Collection Plan
Repeated measures: students in each cohort are tested twice on the same instrument, once as entering freshmen and again in the second semester of the sophomore year.

• Cohort 1: Fall 2002 and Spring 2004
• Cohort 2: Fall 2003 and Spring 2005
• Cohort 3: Fall 2004 and Spring 2006
Analyzing Data
• Research questions: Are there group differences? Do scores change over time? Are there expected relationships? Are students meeting expectations?
Maintaining Data
• Organize and archive
• Trends emerge
• Informs future assessment
Using Results
• Accountability
Allocating resources
Enhancing reputation
Compliance
Using Results
• Program improvement
Curriculum
Teaching
Sequence of course offerings
Using Results
• Benefits Faculty Students Institution
Assessment in Cluster Three
Measures
• NAW-5
  Created by Cluster Three faculty
  50-item multiple-choice test
  Covers 12 of 17 learning objectives
  Reliability = .75
Measures
• SOS
  Measures examinee motivation
  10-item rating scale
  Two dimensions: Effort and Importance
Participants
• August 2001: 746 freshmen
• February 2003: 316 sophomores
Research Questions and Data Analysis
• Do scores change over time? Paired-samples t-test
• Does number of courses impact scores? ANOVA
• Does motivation impact scores? Multiple regression
Results: Question 1
• Do NAW-5 scores change over time? Mean difference = 2.6, t(315) = 3.96, p < .001, d = .23
[Figure: NAW-5 pre-test (Fall '01) and post-test (Spring '03) mean scores with 95% CIs; y-axis from 50 to 65]
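The paired-samples t-test and its effect size can be computed by hand. The pre/post scores below are invented for illustration, not the actual NAW-5 data (where the mean difference was 2.6 with t(315) = 3.96 and d = .23).

```python
# Paired-samples t-test and Cohen's d for change, sketched with
# invented pre/post scores (not the study's data).
from math import sqrt
from statistics import mean, stdev

pre  = [52, 55, 60, 48, 57, 63, 50, 58]
post = [55, 57, 61, 50, 62, 64, 49, 61]

diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
t = mean(diffs) / (stdev(diffs) / sqrt(n))   # t statistic with n - 1 df
d = mean(diffs) / stdev(diffs)               # Cohen's d for paired data

print(f"mean diff = {mean(diffs):.2f}, t({n - 1}) = {t:.2f}, d = {d:.2f}")
```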
Results: Question 2
• Does the number of courses completed impact NAW-5 scores? F(4, 315) = 1.145, p = .335
[Figure: NAW-5 mean score with 95% CI by number of Cluster Three courses completed (None, One, Two, Three, Four or more); y-axis from 40 to 65]
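The one-way ANOVA behind the course-count comparison reduces to a ratio of between-group to within-group variance. The groups and scores below are invented to show the arithmetic; the study's actual result was F(4, 315) = 1.145, p = .335.

```python
# One-way ANOVA F statistic computed by hand: a hedged sketch
# mirroring the "number of courses completed" comparison with
# invented data, not the study's scores.
from statistics import mean

groups = {
    "None":  [54, 58, 51, 60],
    "One":   [55, 57, 59, 53],
    "Two":   [56, 60, 52, 58],
    "Three": [57, 55, 61, 54],
}

all_scores = [s for g in groups.values() for s in g]
grand = mean(all_scores)
n_total = len(all_scores)
k = len(groups)

# Partition total variability into between- and within-group sums of squares.
ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups.values())
ss_within = sum((s - mean(g)) ** 2 for g in groups.values() for s in g)

df_between, df_within = k - 1, n_total - k
F = (ss_between / df_between) / (ss_within / df_within)
print(f"F({df_between}, {df_within}) = {F:.3f}")
```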
Results: Question 3
• Does examinee motivation have an impact on NAW-5 scores?
  Pre-test scores explain 22% of the variance
  Adding motivation scores significantly improved predictive utility: R² change = .11, F_change(2, 312) = 25.70, p < .001
  Squared semi-partial correlations: Pre-test = .24, Effort = .07, Importance = .01
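The test for the R² change in this hierarchical regression follows a standard formula. Plugging in the reported values (R² = .22 for pre-test alone, ΔR² = .11, n = 316, three predictors in the full model) approximately reproduces the reported F_change(2, 312) = 25.70; the small gap reflects rounding in the reported R² values.

```python
# F test for R-squared change in hierarchical regression: does adding
# the two motivation predictors (Effort, Importance) improve prediction
# over the pre-test alone? Inputs are the study's reported (rounded) values.
def f_change(r2_full, r2_reduced, n, p_full, q):
    """q = number of predictors added; p_full = predictors in full model."""
    df2 = n - p_full - 1
    return ((r2_full - r2_reduced) / q) / ((1 - r2_full) / df2), df2

F, df2 = f_change(r2_full=0.33, r2_reduced=0.22, n=316, p_full=3, q=2)
print(f"F_change(2, {df2}) = {F:.2f}")
```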
Discussion
• Scores increased from freshman to sophomore year
• Number of courses had no impact
• More effort, higher NAW-5 scores
• Using results Program improvement Accountability
Using Results for Program Improvement
• Cluster Three objectives
• NAW-8
• Better alignment of curriculum with learning objectives
• Test with full and balanced coverage, better reliability
Using Results for Accountability
• Scientific and Quantitative Reasoning
• Assessment of core competency
• Marketable instrument
Conclusion
• Proactive assessment works for JMU
  1. Establish objectives
  2. Design instruments
  3. Collect data
  4. Analyze data
  5. Use results
• Assessment in Cluster Three
Final Thoughts
• Stimulating
• Meaningful
• Challenging
• FUN!
If you’re not having fun,
you’re doing it wrong!