The Development of a Comprehensive Assessment Plan: One Campus’ Experience
Bruce White, ISECON 2007
Feedback / accountability
1. “For society to work […] we must be accountable for what we do and what we say.” – Betty Dowdell
2. “No person can succeed unless he or she is held accountable.” – Grant Wiggins
3. “Feedback is the breakfast of champions.” – Ken Blanchard
4. “You need a culture of assessment, not a climate.” – Gloria Rogers
Overview
- Are we teaching what we say we are?
- Are students learning?
- How can we be more effective in our instruction?
So …
- What do we want students to learn?
- Why do we want them to learn it?
- How can we help them to learn it?
- How do we know what they have learned?
Furthermore … stakeholders want to see if we are accomplishing our goals of education.
Possible stakeholders:
- Students
- Parents
- Employers
- Board of Regents / state agencies
- Accrediting groups (AACSB / ABET / etc.)
- Faculty
- Alumni
Our campus program
The Information Systems Management program at Quinnipiac University in Hamden, Connecticut began its journey toward a comprehensive assessment program in 2003.
Prior ‘assessment’ was informal:
- So, ISM faculty: how do you think we are doing?
- So, Advisory Board: what advice do you have for us?
- So, IS education community: what should we teach (e.g., the IS2002 model curriculum)?
- So, employers: what do our students need to know (or know better)?
- Etc.
Our desired outcomes:
- Analysis and design of information systems which meet enterprise needs.
- Use of and experience with multiple design methodologies.
- Experience in the use of multiple programming languages.
- Development of hardware, software and networking skills.
- Understanding of data management.
- Understanding of the role of IS in organizations.
Possible assessment methods
Direct assessment methods:
- Simulations
- Behavioral observations
- Performance appraisals
- Locally developed exams
- External examiner
- Portfolios / e-portfolios
- Oral exams
- Standardized exams
(Source: Gloria Rogers, ABET Community Matters, 8-06)
Indirect assessment methods:
- Exit and other interviews
- Archival data
- Focus groups
- Written or electronic surveys / questionnaires
- Senior exit surveys
- Alumni surveys
- Employer surveys
Other factors:
- IS model curriculum
- Advisory board
- Alumni surveys
Foundation of Our Assessment Program
We got interested in the CCER IS Assessment test early:
- It is a direct assessment test based on the IS2002 model curriculum.
- It has been thoroughly tested and analyzed.
- It has been shown to be valid and reliable.
- Test scores are reported in 37 different areas relevant to our learning outcomes.
- The test questions are written at higher levels of Bloom’s taxonomy, with scenarios.
More on our assessment process
- We also use a senior exit survey (an indirect measure).
- An advisory board gives input.
- Informal controls:
  - Campus decisions (such as number of credits allowed, changes in general education)
  - Model curriculum changes
  - Employer input
  - Conferences / technologies
Specific Learning Skills

Skill Set 3.0 Strategic Org. Systems Develop.          2004  2005  2006  2007  Avg
3.1 Organizational Systems Development
3.1.1 Strategic Utilization of Information Technology    37    40    46    39  40.5
3.1.2 IS Planning                                        37    31    47    14  32.3
3.1.3 IT and Org. Systems                                34    33    50    29  36.5
3.1.4 Information Systems Analysis & Design              47    42    53    44  46.5
3.1.5 Decision Making                                    23    22    22    17  21
3.1.6 Systems Concepts, Use of IT, Customer Service      43    35    45    38  40.3
3.1.7 Systems Theory and Quality Concepts                43    38    43    20  36
3.2 Project Management 2004 2005 2006 2007 Avg.
3.2.1 Team Leading, Project Goal Setting 49 42 61 36 47
3.2.2 Monitor and Direct Resources and Activities 35 41 50 50 44
3.2.3 Coordinate Life Cycle Scheduling and Planning 55 64 74 45 59.5
3.2.4 Apply concepts of continuous improvement 47 44 37 42 42.5
3.2.5 Project Schedule and Tracking 45 37 57 43 45.5
Overall Analysis

Area                                 2004  2005  2006  2007  Avg
Hardware and Software                39.6  38.7  48.0  31.1  39.35
Modern Programming Language          38.1  37.3  42.2  32.4  37.5
Data Management                      41.2  40.9  53.9  41.4  44.35
Networking and Telecommunications    39.4  35.1  53.6  46.3  43.6
Analysis and Design                  43.1  43.0  53.6  40.9  45.15
Role of IS in Organizations          50.1  44.7  58.7  45.5  49.75
Number taking test                     27    24    11     9
Senior Exit Survey

Learning Objective                                2006 scores  2007 scores
Systems Analysis (including project management)           4.4          4.5
Alternative Design Methodologies                          3.7          3.5
Programming Languages                                     4.0          3.0
Hardware and Software                                     4.2          4.5
Networking                                                3.9          3.5
Data Management                                           4.2          4.2
IS in Organizations                                       4.2          4.4
Ethics in IS / IT                                         4.2          3.8
Global aspects of IS / IT                                 3.6          3.8
Next step – setting metrics
So … we have a solid direct measurement.
And … we have a good indirect measure.
Now what? We are working on setting metrics, especially for our direct measurement, the CCER IS Assessment test.
From ABET: “Every student does not have to achieve the
desired outcomes, but targets must be defined.”
Setting expectations
The faculty have considered the goals of the program and feel our program emphasizes systems analysis, the role of IS in organizations, and data management.
Area                  Expectation
Systems Analysis      80% will score 50 or higher
Programming           60% will score 45 or higher
Data Mgmt             70% will score 50 or higher
Role of IS in Org.    80% will score 50 or higher
Networking            70% will score 45 or higher
Hardware / Software   70% will score 45 or higher
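The check behind each expectation above is simple arithmetic: what fraction of the cohort scored at or above the target? A minimal sketch in Python, using the thresholds from the table but with made-up per-student scores (the program's actual CCER results are not per-student in this presentation):

```python
# Sketch: check whether a cohort meets the program's score expectations.
# Thresholds come from the expectations table; the student scores below
# are hypothetical, not actual CCER data.

# area -> (threshold score, required fraction of students at or above it)
expectations = {
    "Systems Analysis": (50, 0.80),
    "Programming": (45, 0.60),
    "Data Mgmt": (50, 0.70),
    "Role of IS in Org.": (50, 0.80),
    "Networking": (45, 0.70),
    "Hardware / Software": (45, 0.70),
}

def meets_expectation(scores, threshold, required_fraction):
    """True if the required fraction of students scored at or above threshold."""
    at_or_above = sum(1 for s in scores if s >= threshold)
    return at_or_above / len(scores) >= required_fraction

# Hypothetical cohort of ten students in one area:
cohort = {"Systems Analysis": [55, 62, 48, 51, 70, 44, 58, 53, 49, 61]}
for area, scores in cohort.items():
    threshold, fraction = expectations[area]
    status = "met" if meets_expectation(scores, threshold, fraction) else "not met"
    print(area, status)  # → Systems Analysis not met (7/10 = 70% < 80%)
```

The same check works for any of the six areas once per-student scores are available; only the dictionary entries change.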
Now what?
Next year, as students take the CCER IS Assessment test and as we get feedback from our senior survey, we will analyze the data to see whether our outcomes have been reached.
If they haven’t, we analyze why not – was it poor instruction? Poor students? Poor textbook? Overly optimistic expectations?
Whether they have or haven’t, we change in an effort to ‘constantly and continually improve’ our program!