Technology-Mediated Assessment Jack McGourty, Columbia University John Merrill, Ohio State University Mary Besterfield-Sacre & Larry Shuman, University of Pittsburgh Gateway Engineering Education Coalition


TRANSCRIPT

Page 1: Technology-Mediated Assessment

Technology-Mediated Assessment

Jack McGourty, Columbia University
John Merrill, Ohio State University

Mary Besterfield-Sacre & Larry Shuman, University of Pittsburgh

Gateway Engineering Education Coalition

Page 2: Technology-Mediated Assessment

Technology-Mediated Assessment

Introduction
Your Expectations
Applications
  Drexel and Columbia’s Course Evaluation
  Ohio State’s Activities
  Team Evaluator
Your Experiences
Enablers and Barriers (Break-out Groups)
Conclusions

Page 3: Technology-Mediated Assessment

Introduction

Reasons for On-Line Assessment
Common Applications
Design and Development
Things to Think About

Page 4: Technology-Mediated Assessment

Reasons for On-Line Assessment

Customized development
Targeted communication
Ease of distribution/no boundaries
Automatic data collection and analyses
Real-time response monitoring
Timely feedback

Page 5: Technology-Mediated Assessment

Common Applications

Attitude surveys
Multisource assessment and feedback
Course evaluations
Portfolios
Technology-mediated interviews
Tests

Page 6: Technology-Mediated Assessment

Design and Development

Item/question development
Adaptive testing/expert systems
Multimedia tutorials
Dialogue boxes
Reporting wizards

Page 7: Technology-Mediated Assessment

Things to Think About

Confidentiality/Privacy
Response rates
Reliability/Validity
Ease of use (administrators, end users)
System growth (can it easily be upgraded? adding modules)
System flexibility (survey/test construction)
Data flexibility (item databases, reporting wizards, data storage)
Platforms (specific vs. combination)
Reporting (various levels, dissemination mechanisms, real time vs. delayed)

Page 8: Technology-Mediated Assessment

Technology in Education

Dr. John Merrill
The Ohio State University

Introduction To Engineering Program

Technology Enabled Assessment

The Wave of The Future

Page 9: Technology-Mediated Assessment

Objectives

Explanation of web-based assessment tools
Uses of assessment tools
Virtual run-through of student actions
Lessons learned
Q&A

Page 10: Technology-Mediated Assessment

Web-Based Assessment Tools

Course Sorcerer (through WebCT): online journal entries, course evaluations
Team Evaluator: peer evaluations

Page 11: Technology-Mediated Assessment

WebCT

WebCT is a commercial web-based tool used for course management.
IE Program uses/capabilities: electronic grade book, chat rooms, bulletin boards, calendars.
Provides links to: course material, Course Sorcerer, team evaluations (Team Evaluator).

Page 12: Technology-Mediated Assessment

Course Sorcerer

A simple, web-based evaluation tool created by Scott Cantor at University Technology Services.
Technical specifications:
Written in ColdFusion
Runs on Windows NT with a Netscape Enterprise Web Server
Uses an MS SQL Server database with 15 tables
Server machine: PII-450 with 512 MB of RAM
Accesses Sybase running on Solaris 2.6 as a warehouse for roster data
Used for journal entries and course evaluations

Page 13: Technology-Mediated Assessment

Team Evaluator (Peer Evaluation)

Used by team members to provide confidential assessment.
System requirements:
Operating system: Windows 2000 with ActivePerl, or UNIX with Perl 5.004 or higher
Perl modules: CGI, DBI (plus SQL drivers), POSIX
SQL server: MySQL 3.23 or higher
Web server: IIS (Windows) or Apache 1.3 (UNIX)
CPU: Pentium II 400 or better recommended
Memory: 128 MB or higher recommended
Disk space: 100 MB for adequate database space
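The requirements above describe a small database-backed CGI tool. As a rough sketch of the kind of confidential peer-evaluation store such a system needs (illustrated here in Python with SQLite rather than the Perl/MySQL stack the slide lists; the table layout and function names are hypothetical, not Team Evaluator's actual schema):

```python
import sqlite3

# Illustrative schema: one row per rating a team member gives a peer.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE evaluation (
    team TEXT, rater TEXT, ratee TEXT, criterion TEXT, score INTEGER)""")

def submit(team, rater, ratee, criterion, score):
    conn.execute("INSERT INTO evaluation VALUES (?, ?, ?, ?, ?)",
                 (team, rater, ratee, criterion, score))

def peer_summary(team, ratee):
    # Ratees see only the averaged score, never which rater gave what;
    # that aggregation is what keeps the assessment confidential.
    row = conn.execute(
        "SELECT AVG(score) FROM evaluation WHERE team = ? AND ratee = ?",
        (team, ratee)).fetchone()
    return round(row[0], 2)

submit("A", "alice", "bob", "participation", 4)
submit("A", "carol", "bob", "participation", 5)
print(peer_summary("A", "bob"))  # 4.5
```

The design point, consistent with the slide's "confidential assessment", is that individual ratings are stored but only aggregates are ever reported back to the team.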

Page 14: Technology-Mediated Assessment

Journal Entries

Students complete journal entries online every two weeks.
Submissions are anonymous.
All entries are read and summarized by a staff member and shared with the instructional team.
Instructional team members share the summaries with their classes.

Page 15: Technology-Mediated Assessment

Course Evaluations

Students in 181 & 182 complete online course evaluations at the end of each quarter.
Questions are designed to evaluate courses based on items a-k of Criterion 3, Program Outcomes & Assessment, in the ABET Engineering Criteria 2000.

Page 16: Technology-Mediated Assessment

Short-Term Uses: Journal Entries & Course Evaluations

Address immediate student concerns/questions about class, labs, or projects.
Inquire about student problems with specific topics and labs.
Discover general information from students regarding interests, influences, and attitudes.

Page 17: Technology-Mediated Assessment

Example: Addressing Immediate Student Concerns

“How are the figures supposed to be done? Strictly isometric or just drawn so you can see everything? What pieces need to be labeled?”

“What are we doing in labs 6 & 7? I know it says in the syllabus that we are incorporating the sorting mechanism, but is that going to take two weeks?”

Page 18: Technology-Mediated Assessment

Long-Term Uses: Journal Entries & Course Evaluations

Improve program content
Improve course materials
Modify teaching styles
Evaluate courses based on ABET criteria

Page 19: Technology-Mediated Assessment

Example: Improving Course Content

“Positive: I...
- Gained knowledge about circuits in general
- Learned how to read schematics
- Learned how to use breadboards
- Further developed team working skills
Negative:
- The circuits did not work the first time.
- Time ran short for both labs, but we did finish each circuit.”

Page 20: Technology-Mediated Assessment

How It Works

Start: WebCT site: http://courses2.telr.ohio-state.edu

Page 21: Technology-Mediated Assessment

Completion Tracking: Engineering 182

[Bar chart: journal completion rate per week by section; y-axis from 50.0% to 100.0%. Underlying data:]

Section     Entry #1  Entry #2  Entry #3  Entry #4  Entry #5  All Entries Avg.
Dickinson   87.2%     76.2%     73.8%     78.0%     75.5%     78.1%
Hastings    92.7%     85.5%     80.1%     78.4%     73.0%     81.9%
Chubb       93.1%     86.1%     79.2%     80.6%     80.6%     83.9%
Herrera     93.0%     71.7%     81.8%     74.6%     74.6%     79.1%
Gustafson   71.9%     65.6%     70.3%     68.8%     68.8%     69.1%
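The "All Entries Avg." column is just the mean of the five per-entry completion rates; a quick check of the table's figures in Python:

```python
# Per-entry journal completion rates (percent) from the Engineering 182 table.
rates = {
    "Dickinson": [87.2, 76.2, 73.8, 78.0, 75.5],
    "Hastings":  [92.7, 85.5, 80.1, 78.4, 73.0],
    "Chubb":     [93.1, 86.1, 79.2, 80.6, 80.6],
    "Herrera":   [93.0, 71.7, 81.8, 74.6, 74.6],
    "Gustafson": [71.9, 65.6, 70.3, 68.8, 68.8],
}

# Recompute each section's average to one decimal place.
averages = {name: round(sum(r) / len(r), 1) for name, r in rates.items()}
print(averages["Gustafson"])  # 69.1
```

The recomputed values match the reported column (78.1, 81.9, 83.9, 79.1, 69.1), so the table is internally consistent.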

Page 22: Technology-Mediated Assessment

Lessons Learned: Journal Entries & Course Evaluations

Students are more likely to complete if given credit.

Students are extremely responsive to the anonymity of the online survey.

Students respond positively when asked for suggestions/solutions to problems in the class.

Page 23: Technology-Mediated Assessment

Web Enhanced Course Evaluation at Columbia University

Jack McGourty
Columbia University

Page 24: Technology-Mediated Assessment

Overview

A little history
How does course assessment fit into the “big picture”?
Why use web technology?
How is it being done?
Does it work?

Page 25: Technology-Mediated Assessment

History

Columbia’s Fu Foundation School of Engineering and Applied Science began using the web for course assessment about four years ago, starting with a student-administered web site for results.
Designed and developed a state-of-the-art system using student teams.
Now building on the current infrastructure to include on-line tutorials and increased flexibility for administration.

Page 26: Technology-Mediated Assessment

Student Web Site

Search by course or faculty
Current and past results
No comments

Page 27: Technology-Mediated Assessment

The Big Picture

Why are we assessing courses and programs?
Continuous improvement of the education process: what are we doing right, and what can we do better?
Integral part of our ABET EC2000 compliance: develop a process, collect and evaluate data, close the loop, document/archive results.
Course evaluation is one of several outcome assessment measures, along with senior exit surveys, enrolled student surveys, and alumni surveys.

Page 28: Technology-Mediated Assessment

How WCES Fits In

[Timeline: SEAS assessment processes, pre-1997 through 2001]

Initiate course evaluation process
Conduct first alumni survey (all alumni)
Conduct second alumni survey (1989 & 1994 grads)
Benchmarking senior surveys (Class of 2000)
Start academic review cycle
Create web-based course evaluation process
Senior surveys (Class of 2001); alumni (1996)
Initiate freshman pre-attitude survey

Page 29: Technology-Mediated Assessment

Using Technology

Pro:
Students have the time to consider their responses
Timely feedback
Responses are easily analyzed, archived, and distributed
Less paper
Lower cost/efficient administration

Con:
You lose the “captive audience”
You can’t guarantee a diversity of opinions (motivated/non-motivated; like course/dislike course)
Not necessarily less effort

Page 30: Technology-Mediated Assessment

Course Assessment Details

10 core items: course quality, instructor quality
Relevant ABET EC2000 items, pre-selected by the faculty member
Customized questions for specific course objectives

Page 31: Technology-Mediated Assessment

Selecting EC2000 Questions

Page 32: Technology-Mediated Assessment

Monitoring Faculty Usage

One of our culture-change metrics is the percentage of faculty who are capitalizing on the system by adding custom and EC2000 questions. Currently around 15%.

Page 33: Technology-Mediated Assessment

Course Evaluation Results

Web page access:
Current term’s assessment: limited time window, limited access, secure site
Previous terms’ results: open access to numerical results, not comments

Email results:
Individual faculty
Aggregate data to department chairs

Page 34: Technology-Mediated Assessment

Reporting

Page 35: Technology-Mediated Assessment

Promoting Responses

Student-driven results website
Multiple targeted emails to students and faculty from the Dean
Announcements in classes
Posters all over the school
Random prize drawing

Page 36: Technology-Mediated Assessment

Closing the Loop

Page 37: Technology-Mediated Assessment

Does it Work?

Student response rates have steadily increased over the past two years, from 72% to 85%.
More detail in students’ written comments in course assessments.
Data is available that we have never had before.
Faculty use of the ABET EC2000 and customized question features is increasing but still limited (15%).

Page 38: Technology-Mediated Assessment

Cross Institutional Assessment with a Customized Web-Based Survey System

Mary Besterfield-Sacre & Larry Shuman University of Pittsburgh

This work is sponsored by two grants: one from the Engineering Information Foundation (EiF 98-01, Perception versus Performance: The Effects of Gender and Ethnicity Across Engineering Programs) and one from the National Science Foundation (Action Agenda, EEC-9872498, Engineering Education: Assessment Methodologies and Curricula Innovations).

Page 39: Technology-Mediated Assessment

Why a Web-Based Survey System for Assessment?

Need for a mechanism to routinely elicit student self-assessments and evaluations, and to facilitate both tracking and benchmarking.
Most engineering schools lack sufficient resources to conduct requisite program assessments: expertise, time, funds.
Triangulation of multiple measures.

Page 40: Technology-Mediated Assessment

Pitt On-line Student Survey System (Pitt-OS3)

Allows multiple engineering schools to conduct routine program evaluations using EC 2000-related web-based survey instruments.
Assess and track students at appropriate points in their academic careers via questionnaires.
Survey students throughout their undergraduate career: freshman (pre and post), sophomore, junior, senior, alumni.
Freshman orientation expanded to include math placement examinations, a mathematics inventory, and self-assessment.

Page 41: Technology-Mediated Assessment

[Diagram: Student-Focused Model, linking EC outcomes to knowledge-based competence in an application area (synthesizing multiple areas, taking on complexity, accepting ambiguity, welcoming the environment), confidence (developing comfort), attitudes and valuing, and preparation (opportunity and application, work experience).]

Student-Focused Model

Page 42: Technology-Mediated Assessment

[Diagram: System-Focused Model. WHO: the student. CORE PROCESSES (WHAT/HOW): curriculum, culture, in-class instruction, learning through experience. ENABLERS & ENHANCERS: School of Engineering services, engineering management, advising/counseling, university services. OUTCOMES: student growth in knowledge, skills, and attitudes.]

System-Focused Model

Page 43: Technology-Mediated Assessment

Pitt OS3

Conduct routine program evaluation via surveys through the web: data collection and report generation (under development).

Web versus paper surveys:
Pros: administration ease; minimized obtrusiveness; data is “cleaner”
Cons: lower response rates than paper-and-pencil surveys; user/technical issues

Page 44: Technology-Mediated Assessment

Pitt OS3

System Components

[Diagram: the On-Line Student Survey System (OS3) connects over the Internet to a global administrator maintaining the system, local administrators each controlling a survey, and the students taking surveys "A" and "B".]

Page 45: Technology-Mediated Assessment

Pitt OS3

Local Administrator

An individual at the school where the surveys are being conducted, responsible for administering the surveys through a web interface.
Controls the appearance of the survey: selects school colors, uploads the school emblem/logo.
Selects survey beginning and ending dates.
Composes initial and reminder email letter(s) to students.
Cuts and pastes user login names and email addresses.
Manages surveys in progress.
Extends surveys beyond their original dates.

Page 46: Technology-Mediated Assessment

Pitt OS3

Local Administrator (interface screenshots)


Page 50: Technology-Mediated Assessment

Pitt OS3

Student

Java applet running in a web browser.
One question per screen minimizes scroll-bar confusion.
Once the student submits the questionnaire, the results are compressed and sent to the OS3 server.
Results are stored and the student’s password is invalidated.
A confirmation screen thanks the student for taking the survey.
Can accommodate users who do not have email accounts.
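The submit step above (compress the responses, store them server-side, invalidate the password so the survey cannot be retaken) can be sketched roughly as follows. This is an illustrative Python sketch, not the actual applet/server code; the data structures and names are assumptions:

```python
import json
import secrets
import zlib

passwords = {"Mary": "Mary715"}  # issued one-time credentials (as in the invite email)
stored = {}                       # server-side results, keyed by username

def submit(username, password, answers):
    if passwords.get(username) != password:
        raise PermissionError("invalid or already-used credentials")
    # Compress the answers before "sending" them, as the slide describes.
    payload = zlib.compress(json.dumps(answers).encode())
    stored[username] = payload
    # Invalidate the password: overwrite it with a random value so a
    # second submission with the old credentials is rejected.
    passwords[username] = secrets.token_hex(16)
    return "Thank you for taking the survey."

print(submit("Mary", "Mary715", {"q1": 4, "q2": "agree"}))
# A second attempt with the same password now raises PermissionError.
```

The invalidation step is what enforces one response per student without needing to track anything beyond the issued credential.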

Page 51: Technology-Mediated Assessment

Pitt OS3

Sample Student Email

Subject: Freshman Engineering Attitudes Pre-Survey
To: [email protected]

Hello and Welcome to the Colorado School of Mines!

You are invited to participate in a research study designed to study students' attitudes about engineering, mathematics, and science. This information will help CSM to design more effective courses and programs to enhance your undergraduate education.

The survey is called the Freshman Engineering Attitudes Pre-Survey. If you decide to participate, you will be asked to complete this survey twice: once at the beginning of the semester and again at the end of the academic year. The questionnaire, which takes less than 15 minutes to complete, can be taken any time at your leisure; however, the pre-survey will only be available until 2000-09-22.

Please remember that there are no right or wrong answers, so be honest with your responses. Your responses will remain confidential. If you have questions about this study, please contact Dr. Barbara Olds [ext. 3991 or [email protected]] or Dr. Ron Miller [ext. 3892 or [email protected]].

Your decision to participate in this study is voluntary and there is no penalty if you decide not to participate.

For your convenience, the University of Pittsburgh has made it possible to take the survey online:
Web location: http://136.142.87.142/os3/SurveyClient.html?=4

Your username is: Mary
Your password is: Mary715

If you experience technical problems taking the survey, please contact Dr. Ray Hoare via email at [email protected].

Your participation in this project is important to us. Once you have completed the survey, please stop by the McBride Honors Program office to pick up a small token of our appreciation. Thank you for your help with this important project.

Barbara M. Olds, Professor of Liberal Arts & International Studies
Ronald L. Miller, Professor of Chemical Engineering

Page 52: Technology-Mediated Assessment

Pitt OS3

Student Welcome

Page 53: Technology-Mediated Assessment

Pitt OS3

Student Instructions

Page 54: Technology-Mediated Assessment

Pitt OS3 Questionnaire

Page 55: Technology-Mediated Assessment

Pitt OS3

How it Works

Every day, OS3 summarizes all active surveys for each school.
Summary reports give the number of students who have and have not taken the survey.
Specific students can also be viewed from the local administrator’s account.
Upon completion of the survey dates, email addresses are stripped from the system; only login names remain with the results.
The only time the OS3 system has student email addresses is when the local administrator is receiving daily updates about their active surveys.
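The privacy step described above, stripping email addresses once the survey window closes so that only login names remain with the results, might look like this minimal sketch. The record layout is an assumption for illustration, not the real OS3 schema:

```python
from datetime import date

# Illustrative survey records: login, email, and stored responses.
roster = [
    {"login": "jdoe1", "email": "[email protected]", "responses": {"q1": 3}},
    {"login": "asmith", "email": "[email protected]", "responses": {"q1": 5}},
]

def close_survey(records, end_date, today):
    # Once the survey's last day has passed, drop the email field from
    # every record; only the login name stays linked to the results.
    if today > end_date:
        for rec in records:
            rec.pop("email", None)
    return records

close_survey(roster, date(2001, 8, 20), date(2001, 8, 21))
print(all("email" not in rec for rec in roster))  # True
```

Keeping the login name while discarding the address preserves the ability to track non-respondents during the survey while limiting what identifying data persists afterward.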

Page 56: Technology-Mediated Assessment

Pitt OS3

Sample Daily Report

Date: Mon, 18 Jun 2001 13:10:21 -0500 (EST)
Date-warning: Date header was inserted by pitt.edu
From: [email protected]
Subject: Math Inventory Daily Update
To: [email protected]

The Math Inventory survey for University of Pittsburgh Freshman was started on 2001-05-18. The last day for the survey is 2001-08-20.

227 have taken the survey.
3 have not yet taken the survey.

The survey system is online at http://166.153.77.154/os3/Student.html?=99,local. You can check the status of individual students as well as change other options such as the color scheme through your local administrator account:

Username: local
Password: xxx1234

Page 57: Technology-Mediated Assessment

Pitt OS3

Evaluation of the System

Piloted at five schools: multiple surveys concurrently at each school, and multiple schools at one time.
Response rates vary (30-70% on average).
Example: University of Pittsburgh, April 2001. One initial email with two reminder emails over 2.5 weeks.
Responses: freshmen 70%, sophomores 48%, juniors 44%.
Varied by department.
Some usernames had “+”.

Page 58: Technology-Mediated Assessment

Pitt OS3

System Trace of One School: Freshman Post-Survey

Survey available for two weeks with one reminder message.
57% overall response rate.
Increased server ‘traffic’ 2 to 24 hours after each email.
Design concerns:
63% of students had to log in more than once; the multiple logins were due to case-sensitive passwords.
14% never finished: browser problems, or they didn’t want to finish.
10% gave up: just didn’t complete the login.
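One way to address the case-sensitivity concern above, given that these one-time survey passwords are mailed tokens rather than long-lived security credentials, would be to compare both username and password case-insensitively. This is a hypothetical fix sketched for illustration, not what OS3 actually did:

```python
# Credentials stored lowercased; a hypothetical login check that folds
# case so "Mary"/"MARY715" still matches the issued "mary"/"mary715".
issued = {"mary": "mary715"}

def check_login(username, password):
    # Fold case on both sides of the comparison. Acceptable here only
    # because the password is a low-value, single-use survey token.
    return issued.get(username.lower()) == password.lower()

print(check_login("Mary", "MARY715"))  # True
print(check_login("Mary", "wrong"))    # False
```

For real credentials, case-insensitive passwords would weaken security; the trade-off only makes sense for disposable tokens where failed logins cost response rate.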

Page 59: Technology-Mediated Assessment

Pitt OS3

Issues to Consider

Consent for human subjects: discuss with your institution’s Institutional Review Board (surveys are often exempt).
Java applets are not supported by very old browsers; HTML as an alternative.
Firewalls established by other organizations.