
VI: Impact Report of the Quality Enhancement Plan

1. QEP: “Reconceptualizing UAB’s Undergraduate Core Curriculum”

Reconceptualizing the undergraduate core curriculum is the focus of UAB’s QEP, which is intended to ensure that UAB students have a solid foundation for academic success, professional achievement, and personal fulfillment. The Shared Vision of a UAB Graduate is of a graduate who demonstrates good communication skills, depth and breadth of knowledge, effective problem solving, and active citizenship. As a first step toward achieving this Vision, UAB focuses on enhancing writing, quantitative literacy (QL), and ethics and civic responsibility (ECR) throughout the undergraduate curriculum, thereby making the core curriculum more integral to all academic programs. Strategies to implement the QEP include enforcement of an orderly progression of academic coursework; a First Year Experience; mid-curricular courses that specifically include enhanced instruction, practice, and assessment of the targeted competencies; a required capstone; and a continuous cycle of assessment, intervention, and improvement for academic programs. (1)

2. Initial Outcomes of QEP with Intended Ultimate Goals

STUDENT LEARNING OUTCOMES with ultimate goals

A. Improved basic writing and math skills => students are better prepared for advanced courses

B. Improved competency in general writing, QL, and ECR => students will perform better in academic courses, internships, community service, and jobs

C. Improved competency in discipline-specific writing, QL, and ECR => students are better prepared for responsible citizenship and post-graduation transitions

INFRASTRUCTURE COMPONENTS with intended goals

A. Enforced early enrollment in composition and core math courses => minimize the number of students who have not fulfilled basic composition and math core requirements by, respectively, 30 and 60 hours earned

B. Faculty leadership in QEP implementation and assessment => promote faculty ownership of QEP-initiated curriculum changes and assessment

C. QEP grants, faculty workshops, the online University Writing Web, and the University Writing Center => support faculty in integrating enhanced instruction, practice, and assessment of the targeted competencies into new and existing courses

D. WEAVE Online => heighten program accountability for collecting data that demonstrate student improvement in writing, QL, and ECR and/or that identify curriculum changes needed to improve student learning, as part of a continuous cycle of assessment, analysis, and improvement

3. Significant Changes

Our QEP predicted that by Fall 2009, 50% of freshmen would elect to be enrolled in highly structured learning communities that included a 25-person Freshman Seminar, freshman composition, and a math or science core course. In Fall 2006, UAB offered eight freshman learning communities and adopted EBI’s First-Year Initiative survey (FYI) as the assessment instrument. Using these data to identify areas needing improvement produced significant gains in many of the 15 FYI Factor scores the following year. For example, the mean for 2006 students reporting on Satisfaction with College was 5.08 (below the 5.51 mean for all participating institutions), whereas the mean for 2007 students was 5.64 (above the 5.57 mean for all participating institutions). There was similar improvement between the 4.53 mean of 2006 UAB students for Overall Course Effectiveness (below the 4.78 mean for all participating institutions) and the 5.11 mean for 2007 UAB students (above the 4.73 mean for all participating institutions).

Despite major publicity campaigns, student enrollment remained disappointing. Faculty offered twelve learning communities in Fall 2007, but only eight enrolled the minimum required ten students, and enrollment was well below seating capacity in most cases. Challenges included a registration system that did not allow block registration, more entering students with AP credit in English, limited elective hours in some programs, and a perceived financial and time disadvantage in taking a freshman seminar that fulfilled no requirement. To increase the number of students benefiting from the key components integrated into learning communities while accommodating different programmatic restrictions, in Fall 2008 UAB expanded first year experience (FYE) options to include learning communities, a critical thinking course, and stand-alone FYE courses developed by a school or department.

FYI scores for the expanded 2008 FYE options dropped significantly in almost every factor. A two-hour FYE faculty development workshop that reviews best practices for the required FYE modules was developed, and FYI data improved. In August 2010, the workshop was expanded to 5 ¼ hours and included concurrent sessions targeted at different audiences (learning community vs. stand-alone FYE instructors, new vs. experienced instructors) and experienced FYE faculty as presenters. Between 2010 and 2011, FYI scores for the preceding Fall FYEs improved significantly in ten factors and improved slightly in four other factors. (2) Based on feedback received from evaluations, the May 2011 FYE Faculty Workshop will be 7 ½ hours long and include a student panel.

In Fall 2009, taking and passing an FYE course within the first 24 credit hours at UAB became a university-wide freshman requirement. Despite great differences in format, maximum enrollments, and instructor rank and numbers, all FYE courses must include a common core of FYE learning outcomes and objectives that promote student retention and success, as well as three mandatory assignments. (3)

School of Engineering FYEs illustrate both the evolution of FYE possibilities and the use of data for improvement. In 2006, the Engineering-sponsored learning communities, composed of four courses plus a lab and recitation section, received middle-to-low FYI scores. Students felt such scheduling was too much like high school and failed to promote the college experience of meeting a wide range of new people. Engineering experimented with different combinations and numbers of linked courses over the next two years. Currently, its learning communities are stretched over two semesters: in the fall, students take two linked courses, one of which covers all FYE transition topics before introducing the range of engineering majors; and in the spring, a third course reinforces some transition topics within the context of information about the engineering process, engineering as a profession, and careers in engineering. With such annual review and restructuring, the Engineering learning communities now consistently earn among the highest FYI scores on multiple factors.

4. QEP’s Impact

STUDENT LEARNING:

A. Basic writing skills: The English department restructured its Freshman Composition sequence, instituted more faculty development, implemented changes to help retain adjunct faculty, and adopted more oversight of syllabi and textbooks. In 2007-08 and 2008-09 the department developed a systematic approach to measuring changes in student writing between the start of EH 101 and the conclusion of EH 102. Because the measurement design controlled for multiple sources of variability and bias, it was possible to show statistically significant improvements in writing ability. The magnitude of the change was .58 out of 6 points the first year and .34 the second year. Overall, scores were higher in 2008-09 than in 2007-08. When thesis development proved to be the only area in which students failed to improve, it received added emphasis in EH 102. (4) In 2009-10 English faculty decided to change the assessment design by scoring only one untimed essay from each student at the end of EH 102, using a single rubric but letting assignments vary by course section. This created a situation in which improvement in learning can be measured only by comparing one year to the next under conditions that introduce many uncontrolled variables. With advice from Planning and Analysis, the English department will ensure that each prompt is accompanied by the exact same instructions on how to respond, asks for the same outcome, is written with the same logic, and is of approximately the same length.

Basic math skills: The Department of Mathematics opened a new Mathematics Learning Laboratory, revised curricula, hired instructors dedicated to pre-calculus math courses, and adopted procedures that closely monitor student progress. By Spring 2008, all sections of all pre-calculus math courses had been converted to a restructured format using flexible hours, computer-based instruction, and the Mathematics Learning Laboratory, with some classroom contact and supplemental instruction as needed. As the first course so converted, MA 102 offers the most data on the pre- and post-restructured formats. Success rates averaged 43% for the three falls before restructuring; the success rate increased to 62% in Fall 2006 and 2007, to 75% in Fall 2008, and to 78% in Fall 2009. Similarly, the Fall 2009 success rates for MA 098, 105, 106, and 107 show significant improvement (28%, 32%, 28%, and 40% higher) over success rates in Fall 2005 before restructuring. (5) Besides attendance, pass, and withdrawal data, Math uses student surveys and exam analyses to identify effective changes as well as components that need improvement, resulting in modifications to lesson plans, course format (including the amount of instructor contact), course materials, and even grade structure. For example, instructors observed that MA 098 students who had earned enough credits to pass the P/NP course were not putting sufficient effort into the last few course topics. While not affecting passing rates, this caused a fall-off in overall student learning. Math changed MA 098 to an A/B/C course.

B. Targeted QEP competencies

Writing and Quantitative Literacy: Since 2004, a primary instrument for institutional assessment of writing and quantitative literacy has been the ETS Academic Profile (renamed the Measure of Academic Proficiency and Progress, and then the Proficiency Profile (PP)). Initially the short version was taken by a sample of seniors and freshmen. UAB solicited freshmen to take the two-hour standard form beginning in Summer 2007, and a cohort of seniors started taking that form in Spring 2008. Scores and proficiency classifications for the two versions are equivalent, so trends from 2004 through 2009 for both freshmen and seniors are observable.

The best indicators of student performance are the proficiency classifications because these can be linked back to specific learning outcomes provided by ETS. There are three proficiency classifications (Not proficient, Marginal, and Proficient) reported in three competencies (Reading/Critical Thinking, Writing, and Mathematics), each with three levels. PP data accumulated to this point allow us to assess value added between the freshman and senior years. By creating matched samples of freshmen and seniors using propensity scores, it has been possible to show consistently that there is significant improvement between freshman and senior total scores and proficiency levels.
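To make the idea of propensity-score matching concrete, the sketch below shows one common way such matched freshman/senior samples can be built. It is illustrative only and does not reproduce UAB's actual matching procedure; the covariates (high school GPA and ACT composite) and all data values are hypothetical.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def matched_senior_freshman_samples(df, covariates, group_col="is_senior"):
    # Estimate each student's propensity (probability of being a senior) from the
    # background covariates, then pair every senior with the freshman whose
    # propensity score is closest (nearest-neighbor matching with replacement).
    X, y = df[covariates].to_numpy(), df[group_col].to_numpy()
    propensity = LogisticRegression(max_iter=1000).fit(X, y).predict_proba(X)[:, 1]
    df = df.assign(propensity=propensity)
    seniors = df[df[group_col] == 1]
    freshmen = df[df[group_col] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(freshmen[["propensity"]])
    _, idx = nn.kneighbors(seniors[["propensity"]])
    return seniors, freshmen.iloc[idx.ravel()]

# Synthetic demonstration data; hs_gpa, act_composite, and pp_total are invented.
rng = np.random.default_rng(0)
n = 400
students = pd.DataFrame({
    "hs_gpa": rng.normal(3.2, 0.4, n),
    "act_composite": rng.normal(24, 3, n),
    "is_senior": rng.integers(0, 2, n),
    "pp_total": rng.normal(440, 20, n),
})
seniors, matched_freshmen = matched_senior_freshman_samples(students, ["hs_gpa", "act_composite"])
print("Mean PP total, seniors vs. matched freshmen:",
      seniors["pp_total"].mean(), matched_freshmen["pp_total"].mean())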

Chart: Percentages of students designated as Proficient at the three levels in the three competencies, by competency and year, 2004-2009 cohorts (vertical axis 0% to 80%).


However, data for these past five years cannot be used to assess fully the impact of QEP initiatives on the student learning addressed by the test, because only about 50% of our seniors begin their academic careers at UAB. To secure longitudinal data on improvement for native students and to increase student participation, in Fall 2007 UAB began to recruit freshmen into the Curriculum Assessment Support Team (CAST), a cohort of students who agree to take the PP as freshmen, rising juniors, and seniors. Members of each year’s CAST cohort accumulate credit hours at different rates. Of the original cohort that started in 2007, only 30 have taken the test three times. This has provided the first opportunity to examine longitudinal data and validate previous results that showed a clear difference between freshman and senior cohorts who took the test in the same academic year. The results of a repeated measures analysis show statistically significant increases in the Total Score and in each of the four subscores. (6) For example, students show a significant gain in math between the freshman and junior years. Since the vehicles for delivering these competencies have been developed sequentially (FYE first, QEP designated courses second, and capstones last), we expect that the change between the junior and senior years for future CAST cohorts will reflect a similar increase. This should be accompanied by an increase in the number of students designated as Proficient at Math Level 3 as well. There are now three CAST cohorts in the pipeline, with each subsequent cohort being exposed to more curricular changes that emphasize the targeted competencies.

Since the PP assesses only some of our QL learning outcomes and does not include a writing sample, a committee began meeting in Spring 2009 to identify or develop additional instruments that could be administered in alternate years. The Fall 2010 400-member CAST freshman cohort wrote an essay in response to the same ETS Criterion Online Writing Evaluation prompt and took the Core Competencies Assessment Test (CCAT), an in-house 46-item test on ECR competencies and those QL competencies not tested by the PP. On the written essay, UAB freshmen generally performed best on the Mechanics section, which includes punctuation, and Style, which involves more aesthetic writing choices such as varied sentence length and active vs. passive voice. They generally performed worst in Usage, a category that includes word choice and proper article use, and Grammar, especially with respect to run-ons and fragments. The English department will continue to emphasize these areas in freshman composition. Comparative analyses of essays written by these same students as rising juniors and seniors will provide direct assessment of the effectiveness of our efforts to enhance writing. Data analyses of ECR and QL test items indicate the need to rework items that were either too simplistic or too ambiguous, but we expect that ongoing administrations of improved test items will provide comparable insight into the effectiveness of UAB’s emphasis on these competencies.

Ethics and Civic Responsibility: This broad competency has proven the most difficult to assess. Although there can be quantifiable knowledge change about relevant components like the Academic Honor Code and current events, the more significant institutional goal is a behavioral change. Initial exposure to an institutional emphasis on ethics and civic responsibility (ECR) occurs through a freshman discussion book.
All freshmen are required to read a common Discussion Book and, the day before Fall term begins, attend a presentation by the author or a relevant individual and participate in a small group discussion facilitated by trained faculty and staff. Besides building community, the discussions introduce the concept of difficult dialogues and the QEP competencies relevant to the book: 2005, The Spirit Catches You and You Fall Down; 2006, The Kite Runner; 2007, All Over but the Shoutin’; 2008, Field Notes from a Catastrophe: Man, Nature, and Climate Change; 2009, Mountains Beyond Mountains; 2010, Outcasts United; and 2011, Thinking in Pictures. A monthly dialogue series, campus and off-campus events, publications, and other activities have been used to extend discussion of relevant issues throughout the year. Freshman attendance at the author presentation and small group discussions has increased from 79.3% in 2005 to 96.7% in 2010. The chart below identifies the percentage of respondents to a student survey who felt the book had contributed “quite a bit” or “very much” to the ECR competencies related to their understanding of social, medical, or ethical issues; understanding of people of other racial, ethnic, and cultural backgrounds; awareness of the impact that global events have on their lives; and the likelihood of their engaging in difficult dialogues in class or with friends. Increasing percentages of those who feel comfortable with dialogue reflect successive revisions of the small group discussion template to foster practice of this desired classroom behavior and interaction.

Chart: Percentage of survey respondents, by year (2005-2010), rating the Discussion Book’s contribution as “quite a bit” or “very much” for ethical issues, diversity, global events, and dialogues (vertical axis 0 to 80).

One question directly related to the ECR emphasis of the particular book is added each year: 74% of respondents said Mountains Beyond Mountains contributed “quite a bit” or “very much” to their being motivated to help others who are less fortunate than themselves, and 59% of respondents said Outcasts United contributed “quite a bit” or “very much” to their thinking about American identity and America’s role in the world.

The most common standardized instrument for measuring achievement in ethical competence is the Defining Issues Test (DIT-2), which we began administering to a small sample of entering freshmen in Summer 2007 and graduating seniors in Spring 2008. For the 2007, 2008, and 2009 freshman groups, DIT-2 scores matched the Freshman Norms group very closely, with the predominant ethical schema being Maintaining Norms; a person operating from this schema bases his or her decisions primarily on laws, conventions, and the social order. DIT-2 scores for the 2008, 2009, and 2010 senior groups were basically the same as those of the freshmen and thus significantly different from the National Senior Norms group on two of the three moral reasoning schemas. While there was a statistically significant effect on the N2 Index, a trend in the right direction, results are preliminary and more research is necessary before we can attribute changes to any systematic developmental differences between freshmen and seniors. (7) Interestingly, our philosophy faculty members, who teach ethical reasoning, question the validity of the DIT in measuring ethical reasoning. We will be soliciting their help in either finding a new instrument or a new method for measuring ethical competence. This is an exciting area of new research for us, especially since we have a Center for the Study of Ethics and Values in Science.

Writing, QL, and ECR courses: Designated writing, QL, and ECR courses are the major curricular vehicle for improving instruction in and assessment of QEP competencies. Every program is required to identify where its majors will take at least two writing, two QL, and two ECR designated courses between the FYE course and their capstone. The Writing, QL, and ECR Committees have so far approved 82 courses for writing designation, 87 for QL, and 77 for ECR. (8)

Programs have reported that developing QEP designated courses and a capstone has caused them to review their curricula, resulting in a more coherent and cohesive undergraduate experience that makes the intentional instructional development in their program plans more obvious to students. In other cases, the process of mapping course assignments and syllabi to determine where best to integrate writing, QL, or ECR into the curriculum has also provided an opportunity for self-assessment across courses, leading departments to delete material that overlapped across courses and to increase the depth and breadth of content taught in other courses.

During 2009-10, the Writing, QL, and ECR Committees each conducted a pilot feasibility study on using student artifacts from writing, QL, or ECR designated courses to assess QEP learning outcomes at the institutional level. Each committee collected and reviewed approximately 100 examples of randomly selected, ungraded student work from across the disciplines. Besides procedural challenges, the committees identified difficulties in assessing work across disciplines because of differences in disciplinary conventions, course level, definitions, and expectations; there were fewer challenges with assessing QL than with writing and ECR. Although this process is in its infancy, the ultimate goal is to identify common characteristics of mastery and sources of difficulty in the core competencies that cut across disciplines, and to use this information to direct faculty attention and university resources toward areas of particular need for enhancement. This faculty-based, course-embedded assessment pilot is an important stage in developing a bottom-up culture of assessment across campus. The pilot project has led the committees to focus this year on clarifying the learning outcomes for each competency and revising the rubric used in reviewing applications for writing, QL, or ECR course designation, since course designations must be renewed every three years.

C. Discipline-specific QEP competencies: The Schools of Nursing, Business, Engineering, and Education integrate the assessment of QEP competencies into the school-wide assessment each conducts for its professional accrediting agency. Nursing seniors take the Comprehensive Assessment Technologies Institute examination, receiving a score and percentile ranking on eight subscales that are the same as the subscales on the NCLEX-RN licensure examination; percentile rankings on the major subscales provide norm-referenced data for use in quality improvement. Each semester the School of Business assesses the writing competencies of all students in the core writing course BUS 350 using a common rubric. Assessment results for Fall 2009 showed that 89% of students met or exceeded the school’s accreditation learning objectives as measured by this rubric; the area in which students score lowest is then targeted for emphasis in BUS 350. Similarly, engineering programs directly and indirectly assess student learning in QL-related areas as part of ABET accreditation. When data showed a need to improve students’ ability to apply mathematical concepts to complex analyses, EGR 265 Mathematical Tools for Engineering Problem Solving was developed to replace MA 227 Calculus III and MA 252 Ordinary Differential Equations in all five curricula. Likewise, the Mechanical Engineering and Materials Engineering programs improved instruction regarding the impact of engineering solutions, one aspect of ECR, by requiring MSE 401 Materials Processing, which emphasizes safety, ethical responsibilities in manufacturing, cultural and ethical issues of off-shore supply chains, and environmental issues.

Some individual programs, like Theatre, also have discipline-specific accrediting agencies that impose guidelines for assessment; others, like Nuclear Medicine Technology, rely upon student performance on national board/certification exams. Foreign Languages requires graduating seniors to take the national standardized Web Computer Adaptive Placement Exam (WebCAPE), which assesses language competency in terms of manipulating grammatical structures, vocabulary, etc. Because students have consistently met or exceeded achievement targets overall, the Department voted to raise the average target WebCAPE score from 525 to 550 in Fall 2009. Programs like Criminal Justice rely upon an ETS Major Field Test in the discipline to assess both disciplinary knowledge and QEP competencies like professional ethics. After the Physics Major Field Test began to report results in only the areas of Introductory and Advanced Physics, the Physics department implemented course-embedded assessment and developed a new model of introductory-level training in scientific writing for Physics majors. The model is being implemented in the Modern Physics I-II Laboratories: over two semesters, students receive introductory-level instruction on how to write documents reporting the results of Physics experiments in six writing exercises of increasing complexity, providing background for the professional-level writing to be developed in the capstone course.

Other programs administer locally developed instruments to assess disciplinary knowledge and QEP competencies. For example, Psychology seniors take tests developed in-house. After 2009 and 2010 seniors performed below expectations in statistical analysis, a faculty committee was formed and is meeting regularly to consider where statistics training can be enhanced in the Psychology BS curriculum. The outcomes quiz is also being evaluated to ensure it validly assesses student knowledge. The Department of Philosophy collects papers written by majors who have completed between two and eight philosophy courses and by seniors. A committee of three assesses the papers on argument presentation, argument analysis, overall coherence, and writing mechanics. In Spring 2010 the average increase in proficiency was 3% below the target goal of 25% improvement, with argument analysis as the lowest-scoring component. Each fall the assessment committee presents results to the whole department for discussion of action plans to address deficiencies.
Programs that do not use standardized tests can assess QEP competencies by analyzing student artifacts in capstone courses, a graduation requirement for all students graduating in 2013 or later.

Approved capstone courses must include discipline-specific aspects of writing, QL, and ECR. Student artifacts drawn from capstones will provide material for both programmatic and institutional assessment.

INFRASTRUCTURE:

A. Enforced early enrollment: In Spring 2010, 96.2% of degree-seeking undergraduate students with 30+ hours earned had fulfilled their core freshman composition requirements, and 94% of degree-seeking undergraduate students with 60+ hours earned had fulfilled their core math requirement. These numbers exclude students in the University Honors Program, who fulfill core curriculum requirements through specialized interdisciplinary seminars.

B. Faculty Leadership: The Core Curriculum Steering Committee, 53% of whose members are faculty, oversees the QEP, awards QEP grants, and approves courses for capstone designation. Faculty chair the Writing, QL, and ECR Committees, which are constituted of 100%, 63%, and 77% faculty, respectively; the Director of Core Curriculum Enhancement is an ex officio member of all three. These committees identified university-wide learning outcomes for the three competencies, developed rubrics for QEP designated courses, review applications for QEP designation, and conducted a pilot assessment based on student work collected from QEP designated courses. (9) An electronic quarterly QEP Newsletter highlights QEP activities and personalities, shares assessment data, and invites service on QEP committees. (10)

C. Faculty Resources: About $209,000 in QEP grants has been awarded to faculty to enhance instruction, practice, and assessment of writing, QL, and/or ECR in mid-curricular courses, or to departments to develop a capstone course or integrate discipline-specific writing, QL, and ECR into an existing capstone. (11) This amount includes a Core Commitments grant ($25,000) from the Association of American Colleges & Universities to enhance ECR in mid-curricular and capstone courses. Additional resources have been provided by two Difficult Dialogues grants ($100,000 and $60,000) from the Ford Foundation, the first to help develop our pilot freshman learning communities and train faculty in facilitating difficult dialogues, and the second to promote ECR by integrating difficult dialogue pedagogy into sophomore core classes and by supporting co-curricular events that used student-produced ethnographic films to promote dialogue about community and diversity issues. To support curricular development, the Office of Core Curriculum Enhancement sponsors (1) a monthly FYE Brown Bag series for those teaching or interested in developing an FYE; (2) May and August FYE Faculty Workshops on best practices; and (3) a monthly Conversation on Capstones where one department shares best practices and challenges in developing and/or enhancing a capstone. The Office has also sponsored five campus visits by Duke University’s Director of Writing Across the Curriculum and two campus visits by the Director of the Rutland Center for Ethics, Clemson University, to run workshops for faculty on integrating writing and ethics assignments, respectively, into courses. The Office has promoted QL through a QL Awareness Week, guest speakers and faculty panels on integrating QL into courses, and a campus Discussion Book (Field Notes from a Catastrophe) and events highlighting QL.

Since Fall 2008, all students in EH 091, EH 101, and EH 102 have been required to purchase a customized textbook that includes a code good for five years’ access to an online University Writing Web maintained by Pearson. Faculty in other disciplines are encouraged to include this textbook on their syllabi so all students can benefit from the resources at this site: a composition textbook, technical support, tutorials on avoiding plagiarism, and discipline-specific writing materials from UAB faculty and units. Since January 2010, a University Writing Center has been open part-time, providing individual writing-in-the-disciplines tutoring as well as writing workshops for students and faculty.

D. WEAVE: Since Fall 2007, all academic and administrative units have been expected to enter annual information on mission, goals, achievement targets, measures, findings, and action plans into WEAVE Online, the University’s learning outcomes database. The Office of Planning and Analysis runs multiple workshops on writing learning outcomes, identifying satisfactory measures, etc. WEAVE Online allows programs to link what they are doing to the learning outcomes for the QEP.

5. Concluding Statement: UAB undertook an ambitious QEP designed to strengthen foundational skills and competencies perceived as essential for student success by changing campus attitudes and creating a significant shift in campus climate. By developing a required First Year Experience that provides an initial introduction to writing, QL, and ECR expectations; mid-curricular courses that highlight writing, QL, or ECR learning outcomes; and capstones in each major with discipline-specific aspects of these targeted competencies, these core curriculum/general education competencies have become the shared responsibility of all faculty. This campus-wide focus on the importance of writing, QL, and ECR for all students regardless of major is beginning to be reflected in assessment data that demonstrate achievement levels and value added in these competencies.

(1) Executive Summary, QEP document, pp. 1-2
(2) 5-Year Longitudinal Comparison of FYI Factor Data
(3) FYE Learning Outcomes
(4) Freshman Composition Assessment Reports, 2006-07, 2007-08, 2008-09, and 2009-10
(5) Pre-calculus Math Cumulative Progress Report, Fall 2005-Spring 2010
(6) CAST 2007 Cohort Report
(7) Defining Issues Reports for 2008, 2009, and 2010
(8) Writing Designated Courses; QL Designated Courses; ECR Designated Courses
(9) QEP Membership Lists
(10) Archive of QEP Newsletters
(11) Financial History of QEP Grants and List of QEP Grant Awards


First Year Experience Courses

First Year Experience (FYE) courses are the gateway to undergraduate education at UAB. FYE courses improve student retention by helping to bridge the gap between high school experiences and university expectations, and they enhance successful progress towards graduation by establishing the foundations for academic achievement and holistic development. All UAB freshmen should share a common foundation for learning, whatever their majors or professional goals. After successful completion of an FYE course, students should be able to:

1. Assume responsibility for their own educational progress by
   a. Employing basic academic survival skills
   b. Planning their curriculum intentionally
   c. Being able to articulate the purpose and value of the core curriculum
   d. Knowing about campus policies and resources
   e. Exercising personal and academic integrity
   f. Maintaining a healthy lifestyle
   g. Managing financial resources effectively

2. Demonstrate social integration and engagement by
   a. Establishing community-building bonds with peers, faculty, and students
   b. Participating actively in campus life
   c. Knowing the Shared Vision for a UAB graduate
   d. Drawing connections between classroom experiences and the expanding communities of which they are a part

To help students acquire these core learning outcomes (individual FYE courses may have additional learning outcomes), all FYE courses must include coverage of the following topics (items shown in parentheses indicate a required activity or assignment related to the topic):

1. Structure & mission of UAB
2. Faculty expectations and student responsibilities
3. Academic policies, including the Academic Honor Code (online Academic Integrity module)
4. Academic advising & career planning (Advising Assignment)
5. Academic survival skills (e.g., regular attendance, understanding the syllabus, reading and thinking critically, note-taking, test-taking, learning styles)
6. Time management
7. Financial management
8. Maintaining a healthy lifestyle (e.g., stress management, nutrition, recreation, drug & alcohol awareness, personal safety)
9. University library resources (online tutorial)
10. Campus involvement opportunities (e.g., social activities, clubs and organizations, cultural events) that promote learning outside the classroom (DragonQuest required for FLCs)

** FYE courses at UAB currently include U101, freshman learning communities (FLC), and school-specific FYE courses. Mandated FYE topics can be spread across linked courses in an FLC. NOTE: During the last week of class, all FYE courses are required to administer the First Year Initiative (FYI) survey for assessment. The FYI can be taken in class or online.

Archive of QEP Newsletters (web page, http://main.uab.edu/Sites/DOE/QEP/76715/): issues of May 2010, August 2010, November 2010, and February 2011.

QEP Newsletter: Degrees of Excellence, Volume 2, Issue 1, February 2011

Report from the 2010 SACS-COC Annual Meeting

The Fifth-Year Interim Report is a “mini” report only in the sense that fewer standards are addressed in the compliance audit component of the report. With other graduate institutions in the 2015 cohort, UAB will submit its Fifth-Year Interim Report, which includes the QEP Impact Report, in mid-March. Dr. Glenna Brown, Associate Provost for Planning and Analysis, is UAB’s liaison to SACS.

At the 2010 SACS-COC Annual Meeting in Louisville, KY, announcements about institutions that most recently submitted their Fifth-Year Interim Reports and underwent a review process were dismal. All 39 institutions in this review cohort were community colleges or 4-year undergraduate institutions which will undergo reaccreditation in 2015. Upon initial review in the fall, 36 of 39 institutions were asked to submit additional monitoring reports. Most problems arose from incomplete or unacceptable information provided by the institution to validate its compliance with each Core Requirement, Comprehensive Standard, and Federal Requirement. The QEP Impact Reports of 13 of these 39 institutions were found unacceptable. Because this was only the second cohort required to submit a Fifth-Year Interim Report, many institutions mistakenly assumed that reporting and/or reviewing standards could be less rigorous than those applied during the formal reaccreditation process every ten years.

Andrew Keitt: Alabama Professor of the Year

Last spring, Jean Ann Linney, then Interim Dean of the new College of Arts and Sciences, received a call for nominations for U.S. Professor of the Year. This is the only national award that recognizes undergraduate teaching and mentoring. She immediately thought of history’s “resident department expert in pedagogy,” Andrew Keitt.

As her letter of support notes, “Since joining the UAB faculty a decade ago Andrew has been devoted to improvements in teaching while also working on his own research. He devotes himself to innovative methods and is never content to simply continue what he has been doing in the classroom. He is also eager to share ideas with his colleagues.”

Andrew has especially championed the Reacting to the Past pedagogy initially developed at Barnard. Students role-play characters involved in historical debates on such key issues as the French Revolution, Darwin’s theory of evolution, and the Civil Rights Movement.

Yasameen Ebrahimi credits this interactive, immersion pedagogy for his decision to become a history major: “The Reacting to the Past games require students to work individually and work in teams which prepares them for other classes and for real-world experiences. I love writing papers for Reacting to the Past games because you write them from your character's point of view. It is easy to get caught up in the debates of the game and this helps students understand historical issues from different perspectives. You become really familiar with specific issues of specific times in history. This allows students to become very knowledgeable and then they learn to place their specific knowledge in the larger picture.”

Although Harrison Chase Childs already had a love of history when he came to UAB, he is full of superlatives when describing the excitement and learning generated by Andrew Keitt’s own obvious passion for history and sincere, personal interest in each student’s academic progress. At the time Harrison first took a Reacting course, he admits, “I wasn’t sure if I even wanted to remain in college and finish my education. However, after” (Cont’d on p. 3)

Inside this issue:
Page 2: The Core Competencies Assessment Test; 2010-11 Discussion Book Dialogues; ETS Criterion Writing Sample
Page 3: Alabama Professor of the Year (cont’d); Advisors: Essential Contributors to Student Success; QEP Timeline
Page 4: Putting UAB’s Best Foot Forward for Freshmen; A Faculty Profile: Donna Slovensky; 2011-2012 UAB Discussion Book Announced
Page 5: Who’s Who in Core Curriculum Enhancement at UAB

Quality Enhancement Plan Initiatives, by Competency

Competency                      | Discussion Book and Campus Conversations | Freshmen Learning Communities (FLC)        | Mid-curricular Enhancement | Capstone Courses
Writing                         | Yes                                      | Yes                                        | Yes                        | Yes
Quantitative Literacy           | Depends on book choice                   | Depends on linked courses and theme of FLC | Yes                        | Yes
Ethics and Civic Responsibility | Yes                                      | Yes                                        | Yes                        | Yes

“I can honestly say that it was his mentoring and passion for teaching that made me the student that I am today....he does it not because it is his job but because he truly wishes to see his students succeed in all that they do.”
Harrison Chase Childs, undergraduate history and anthropology major


The Core Competencies Assessment Test: Assessing QL and ECR

SACS requires institutions not only to enhance programs but also to measure their impact so they can be improved. The QL and ECR Committees, in conjunction with the Offices of Core Curriculum Enhancement and of Planning and Analysis, are developing a Core Competencies Assessment Test (CCAT) to measure QEP-related learning outcomes in those areas of QL and ECR that are not assessed by the Proficiency Profile or the other assessment instruments described in earlier QEP Newsletters. The QL and ECR sections of the CCAT were developed in parallel, and each is described briefly below.

UAB has been assessing its students’ math skills with standardized tests since 2004. Such tests assess basic quantitative skills but not the broader range of QL competencies that UAB is seeking to enhance, which include application of quantitative methods to real-world reasoning, problem-solving, and communication. The CCAT therefore includes quantitative questions with real-life implications. In a typical question, students are asked to use percentages, probabilities, frequencies, tables, or graphs to draw conclusions involving topics such as retail discounts, injury and mortality risk, medical decision-making, international relations, and population growth. A few questions require students to compute basic descriptive statistics or convert between different measurement units. Other questions involve no computation at all but assess concepts related to systems of measurement, research design, and communication of quantitative information.

The initial version of the CCAT includes 25 QL items and was administered in August to freshmen who have agreed to help UAB assess student learning and who constitute the 2010 CAST (Curriculum Assessment Support Team) cohort. Incoming freshmen found the test challenging! They had particular difficulty drawing conclusions from data on the implications of the inaccuracy of a medical test, determining the best graph to present a distribution of scores, and abstracting frequencies from a table to compute complementary probabilities. On the other hand, incoming students had a satisfactory understanding of the purpose of placebo conditions in medical research. On average, students answered about half of the questions correctly, indicating much room for improvement as they progress towards a degree.
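As a purely illustrative worked example (the numbers here are invented and are not drawn from an actual CCAT item), abstracting a complementary probability from a frequency table might look like this: if a table reports that 35 of 500 people in a sample were injured, the probability that a randomly chosen person from that sample was injured is 35/500 = 0.07, so the complementary probability that the person was not injured is 1 - 0.07 = 0.93.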

Like QL, ECR involves knowledge and practice in several domains, including knowledge of current events and ethical reasoning and decision-making. Current events questions on the ECR section of the CCAT concern recent legislation, legal decisions, and ethical issues that have been covered in the national press, and related matters of constitutional law. About half of the CCAT ECR items focus on ethical knowledge that is especially relevant to academics, and to UAB students in particular. Students are asked about appropriate citation of sources in academic work, and their knowledge and understanding of the UAB Honor Code is probed.

Overall, the average incoming freshman was able to answer about 60% of the ethical knowledge questions correctly. Students did best on questions about UAB’s Honor Code, while performance on contemporary issues, ethical argument, and citation questions left more room for improvement.

Since civic engagement is another aspect of ECR, the CCAT includes four survey items dealing with students’ history of voting and their involvement in community service and social justice issues. Almost 90% of incoming freshmen had participated in some kind of community service within the past 12 months. Being an undergraduate at UAB will provide all freshmen with multiple opportunities for this and other forms of civic engagement.

Students who took the CCAT last August as incoming freshmen have agreed to take the test again as rising juniors and seniors. In the intervening time, they will have experienced an enhanced freshman year, designated mid-curriculum writing, QL, and ECR courses, and senior capstones. Senior testing on the CCAT will not wait, however, but is already underway for this year’s graduating class. The current cohort of seniors will have been minimally exposed to the QEP, and so their CCAT scores will serve as a baseline for future comparisons. Both cross-sectional and longitudinal analyses will be used to examine the QEP’s impact, which is expected to increase over time as more of its components come on-line. Improvements in scores will help assess the “value added” by a QEP-enhanced UAB education.

Because the CCAT assesses QL and ECR across majors and programs, it is necessarily general. Complete assessment will continue to require assessment of discipline-specific QL and ECR within and by individual programs. Meanwhile, the ECR and QL committees, together with QEP leadership, will monitor results with the goals of improving the validity of the test, ensuring that test content remains sufficiently stable to permit comparisons over time, and feeding back test results to faculty engaged in teaching and curriculum development.

The ETS Criterion Writing Sample

Although the Proficiency Profile taken by a sample of UAB freshmen, rising juniors, and seniors provides some insight into writing skills, it does not require students to do any actual writing. To provide additional insight into the writing skills of students, the 403 freshmen who form the 2010 CAST cohort each wrote an essay in response to a single prompt provided by the ETS Criterion writing evaluation service. ETS’s analysis of these essays assessed the students’ English language skills as opposed to higher-order skills such as organizing an argument and marshaling evidence. Criterion categories for evaluation include Style, Mechanics, Usage, and Grammar.

The results point to four main kinds of errors. 75.4% of students made mistakes with articles. The good news is that this mistake is more associated with timed writing, as is one of the other major errors, unnecessary repetition of words (57.8% of students). A writing process that allows time for proofreading and revision should give students a chance to correct these kinds of errors. On the other hand, two of the other major errors involve sentence boundaries and are thus more fundamental: 61.5% of student essays included run-on sentences and 59.6% had sentence fragments.

Of the four categories, UAB freshmen generally performed best on the Mechanics section, which includes punctuation, and Style, which involves more aesthetic writing choices such as varied sentence length and active vs. passive voice. They generally performed worst in Usage, a category that includes word choice and proper article use, and Grammar, where the problems cited above with run-ons and fragments were evident.

Upon reviewing the Criterion analysis of the essays written by entering freshmen, Peter Bellis, chair of the Department of English, commented, “Our first-year program has specific components to address both sentence boundary issues and matters of usage. It's good to have data that confirm our faculty's sense of student needs, however, and we will continue to concentrate our efforts in these areas.”

Besides helping to identify or confirm areas of focus for the freshman composition program, Criterion data will provide a baseline for assessing student progress in writing when the same cohort of students writes a second essay and a third essay as rising juniors and seniors. Such longitudinal data could effectively demonstrate the value added by UAB’s commitment to writing designated courses and capstones with a writing component.

2010-11 Discussion Book Dialogues

Learn more about issues relevant to this year’s book, Outcasts United, from UAB students who have had firsthand experience with being “outcasts united” in the pursuit of cross-cultural understanding. Hear about different lessons learned and skills developed through living and studying overseas. This special program takes place in the HUC Alumni Auditorium, 11:30 am to 12:30 pm, on Thursday, February 17.

There are two additional Discussion Book Dialogues this term:

Mar 24: Paul Harbin, former UAB women’s soccer coach, “Coaching for Life” (Note: this is the 4th Thursday)
Apr 21: Scotty Colson, Office of Economic Development, Mayor’s Office, “Birmingham’s Sister City and Other International Programs”

The March and April events will take place in Heritage Hall, room 549, 11:30 am to 12:30 pm. Feel free to bring your lunch. Beverages and snacks will be provided. Free and open to the public.


seeing Dr. Keitt’s enthusiasm in his

work, I was reminded why I wanted to

teach history. I wanted to bring rele-

vance back to the history profession

and make it real to people again. And

that is what he does in his classes. He

makes the students truly understand

what it is that they’re learning by mak-

ing them a part of it.”

Such enthusiastic student praise would

not surprise Holly Radford, student

counselor, who often hears students

discuss favorite courses and teachers.

In a discussion about effective FYE

courses, she noted, “The two that stand

out the most, based on the students I

have worked with over the last three

years, are Rita Treutel for UNIV 101 and

Dr. Andrew Keitt for LCS 108, Politics

and Virtue in Western Civilization. The

students comment on UNIV 101 with

Rita Treutel saying the class is more

challenging and demanding than ex-

pected but they learn a lot and respect

her as an instructor because of her

direct, realistic, and supportive ap-

proach to teaching. The students that

have taken Dr. Keitt’s LSC 108 learning

community enjoy the different teaching

style he brings to a traditional history

course. The class is not solely lecture

based, but instead a course where

students reenact the past by represent-

ing key individuals in history. The

students also enjoy Dr. Keitt’s class

because it is paired with his HY

102 course; therefore they are studying

the same materials as well as learning

with the same students.”

Given such universal accolades, Andrew

was probably the only one at UAB

surprised by the official announcement

that the Carnegie Foundation for the

Advancement of Teaching had named

him the 2010 Alabama Professor of the

Year. When contacted by reporters, he

gratefully acknowledged Jean Ann

Linney, Catherine Danielou, and others

who nominated him for the honor.

Judging criteria for the Professor of the

Year awards are so high that awards are

not made in every state each year.

Awardees must not only excel in teach-

ing but also be shown to have positively

influenced the lives and careers of

students. 2009 Professor of the Year

awards were made in only 38 states, the

District of Columbia, and Guam.

Congratulations, Andrew!!!

Advisors: Essential Contributors to Student Success

Elizabeth Turnbull, Kevin Jerrolds, and

Nancy Walburn serve on the FYE Coor-

dinating Committee, the oversight

group that helps ensure that FYE

courses are both instructive and enjoy-

able as they promote the successful

transition of freshmen into the culture

of the university.

A key component of all FYEs is the

required Advising Assignment. Nancy

Walburn explains that this requires

freshmen “to identify their educational

aspirations and specific strategies for

attaining them. The assignment spans

most of a student’s first semester and

incorporates critical thinking skills,

communication skills, work ethic and

organizational skills, all of which have

been identified as important outcomes

of an education at UAB. Furthermore, it

provides a foundation for ongoing

reflection, discussion and feedback. This

dynamic interaction between student

and advisor is a unique resource that

supports our students throughout their

university experience in the develop-

ment of the skills and attainment of the

goals set forth in our QEP.”

Kathryn Klyce is especially pleased that

the ECR goals of civic responsibility,

ethical decision-making, knowledge of

current events, and respect for diversity

are fostered through the broad range of

service learning opportunities available

to freshmen through seniors. “From the

local to the international level [service

learning fosters] a profound apprecia-

tion of some of the social, economic, and

health disparities that exist in the world

today that simply could not be fully

ascertained in a classroom setting.“ In

service learning courses, students have

reflected on what it has meant to them

to learn about the disparities between

second graders in a Birmingham city

school and those in a Hoover school or

the devastating and often tragic dispari-

ties caused by poverty and lack of

education in South Africa.

In Fall 2010, Kathryn joined fellow

advisors who have partnered with CAS

faculty to teach stand alone FYE

courses. Two of these experienced

advisors, Jamie Grimes and Ovuke’

Emonina, were invited to share their

best practices, challenges, and advice

with new instructors at an extended

FYE Faculty Workshop last August.

Of course, the role of advisors extends

well beyond the freshman year.

While Nancy Walburn agrees that “The

QEP sets forth certain expectations for

every undergraduate that informs the

work of our advisors,” she emphasizes

that the work of the advisor is only just

beginning with freshmen. “Utilizing the

structure of an advising curriculum that

follows a model timeline from freshman

to senior year, advisors work to ensure

each student has multiple opportunities

to meet these expectations successfully

as part of the educational planning

process.“ Whichever hat an advisor is

wearing, the goal is student success and

progress towards degree completion.

Although too many students still seek

their advisors only when it is time to

register, that is changing. Besides help-

ing students understand and navigate

institutional processes and curricular

requirements, many advisors are un-

dertaking additional roles to increase

student satisfaction and success in the

key transitional freshman year.

Some advisors volunteer to facilitate

small group discussions the day before

fall term begins. They use that year’s

discussion book to introduce key insti-

tutional principles about diversity and

civic responsibility and to generate

dialogue that starts building the social

relationships between students that are

so crucial to student satisfaction and

retention. After facilitated small group

discussions for several years, Nate

Wade was appointed by Dr. Garrison to

serve on the Discussion Book Commit-

tee that reviews book nominations and

forwards a list of finalists to the Presi-

dent.

Holly Radford often resolves students’ confusion about the FYE requirement and FYE options. “Many students do not fully grasp the benefit of a first year experience course or learning community when they are asked to register for one at orientation,” she says, but they later “comment on the value and overall enhancement of their college experience because of the first year experience course or learning community. They express the sense of community they feel and the friends they make with the students they see regularly in their classes. They also comment on the faculty members and their above and beyond commitment to making the class enjoyable.”

"Freshman Learning

Communities have played

an important role in my

development as a

teacher by allowing me

to experiment with

pedagogies like

Reacting to the Past

that don't always fit

easily into the

traditional

curriculum.”

Andrew Keitt, Ph.D. Associate

Professor, Department of

History, and 2010 Alabama

Professor of the Year

Page 3, Volume 2, Issue 1

QEP TIMELINE

2004: First administration of Proficiency Profile
2005: SACS approves QEP; First Discussion Book
2006: First learning communities
2007: First QEP grants; QL Awareness Week; First writing across the curriculum workshops with consultant
2008: First stand-alone FYEs; First ethics across the curriculum workshops with consultant
2009: University Writing Web; First QEP designated courses
2010: University Writing Center; Capstones all identified; First administration of internally-developed test of QEP competencies
Spr 2011: Fifth-Year Interim Report due

Putting UAB’s Best Foot Forward for Freshmen

During their first five years, freshman

learning communities have benefited

from the passion and expertise of many

dedicated and gifted faculty, including

eleven faculty who have been awarded

the President’s Excellence in Teaching

Award: David Basilico (English), Alison

Chapman (English), Catherine Danielou

(FLL), Colin Davis (History), John Ehiri

(Public Health), Allen Johnston

(Business), Andrew Keitt (History),

John Mayer (Mathematics), Pam

Paustian (Health Professions), Gunter

Stolz (Mathematics), and Nikos Zahariadis (International Relations).

This roster of outstanding teachers

should not be surprising. Learning

communities provide exciting opportu-

nities for innovative courses, cross-

disciplinary explorations, and experi-

mental pedagogy. This is one reason

why structured learning communities

and freshman seminars have been

identified as high impact practices by

the Association of American Colleges

and Universities and in national studies

based on data from the National Survey

of Student Engagement.

UAB freshmen benefit from a First Year

Experience with a double whammy: an FYE course and activities centered

around a common Discussion Book.

UAB’s sincere commitment to enhanc-

ing the freshman year experience has

recently received unprecedented na-

tional recognition.

In 2010 and 2011, respectively, the National Resource Center for the First-Year Experience and Students in Transition recognized Nancy Walburn, Director of General Studies, and Marilyn Kurata, Director of Core Curriculum Enhancement, as Outstanding First-Year Student Advocates, with announcements in the Chronicle of Higher Education and at its annual meetings. Sponsored by

Cengage Learning, ten of these national

awards are given each year, but only

two are given to representatives of

institutions with 15,000 or more stu-

dents.

The University of Alabama at Huntsville

and the University of Georgia are two of

the latest research institutions that are

implementing a required First Year

Experience for all freshmen.

Assessment results support the effec-

tiveness of FYE courses in achieving

FYE goals as measured by the nationally

standardized First Year Initiative (FYI)

survey, which identifies students’

perception of their achievement of 13

course outcomes, satisfaction with the

university, and overall sense of belong-

ing and acceptance. Data for Fall 2010

FYE students showed significant im-

provement over Fall 2009 data in 10 of

these areas and slight improvement in 4

additional areas.


As always, groups of freshmen will meet

with an assigned faculty or staff facilita-

tor for a 90-minute small group discus-

sion following Dr. Grandin’s talk. Since

2005, these discussion groups have

provided freshmen with a personalized

introduction to UAB and the concept of

difficult dialogues, while simultaneously

providing faculty and staff with a stimu-

lating, cross-campus experience.

Members of the Discussion Book Com-

mittee generally are selected from those

After reviewing the recommendations

of the UAB Discussion Book Committee,

President Garrison has selected the

2011 UAB Discussion Book: Thinking

in Pictures: My Life with Autism by Dr.

Temple Grandin, professor of animal

science, award-winning author, and

internationally acclaimed spokesperson

for autism.

The author will come to campus on

Monday, August 15, the day before Fall term begins, to speak to all freshmen.

who have served as small group facilita-

tors for a number of years.

If you can be a facilitator on August 15,

3:00-4:30 pm, send your name, email

address, and campus mailing address to

Juanita Sizemore ([email protected]),

who will acknowledge receipt. You will

have the opportunity to attend one of

three scheduled facilitator training

sessions in August.

Everyone is encouraged to integrate

2011-2012 UAB Discussion Book Announced

A Faculty Profile: Donna Slovensky

was one of the first two educators in the

nation to be designated a Fellow of the

American Health Information Manage-

ment Association (AHIMA).

Midge Ray, Associate Professor, praises

Donna for being “very student oriented”

and for “bringing the QEP to the fore-

front of our school by encouraging

faculty to integrate more critical think-

ing and writing into the curriculum.”

Donna agrees that “The QEP overall has

created a better integration between the

core curriculum and our undergraduate

majors and allowed us to focus on

advanced skills development.” Reflect-

ing her research interests in outcomes

assessment and innovative teaching

methodologies, she states, “Articulating

the QEP elements of critical thinking,

writing, quantitative literacy, and civic

responsibility as they are applied in our

majors required us to examine and

perhaps re-think our teaching ap-

proaches and evaluation prac-

tices. Again, our focus has been teach-

ing students the knowledge and skills

required for employment in their health

profession discipline. The QEP initiative

has encouraged us to look at progres-

sion of skills development in these areas

and build on previous learning.”

As chair of the Academic Standards

Subcommittee of the Athletics Advisory

Committee, Donna has had a strong

impact on the academic success of

UAB’s basketball team and other ath-

letes. She is gratified that “recruits have

commented that other schools they

visited did not have as strong a focus on

Donna Slovensky, Ph.D., RHIA, FAHIMA,

is Associate Dean in the School of Health

Professions, a Professor in the Depart-

ment of Health Services Administration,

and a scholar in the Lister Hill Center

for Health Policy. She holds secondary

appointments in the Department of

Management in the School of Business,

the UAB Graduate School, and the

School of Medicine Center for Outcomes

and Effectiveness Research and Educa-

tion.

Having received her master’s and doc-

toral degrees from UAB, Donna is an

outstanding representative of UAB in

more than one way. She has been a

consultant to many health care organi-

zations including inpatient and ambula-

tory facilities, home health programs,

and physician practices. In 2001, she

Page 4, Volume 2, Issue 1

Donna Slovensky

into their 2011-12 courses the selected

Discussion Book and/or supporting activi-

ties like the monthly Discussion Book

Dialogues series or College Night at the

Birmingham Museum of Art (dates to be

announced).

Everyone is also invited to send nomina-

tions for future Discussion Books to

Marilyn Kurata ([email protected]).

Include title, author, book genre, and ra-

tionale for nominating the book as required

reading for all freshmen.

Additionally, UAB freshmen consistently report significantly higher satisfaction with the university in comparison to the aggregate FYI reports from all participating institutions.

The First Year Experience at UAB is an

evolving initiative. Begun as highly

structured learning communities taken

by a few freshmen, FYE courses today

include options that vary in format,

number of credit hours, emphasis or

focus, instructor status, and maximum

enrollment. More changes are inevita-

ble, but the overall goal of providing

freshmen with the foundational skills,

institutional knowledge, behavioral

patterns, and sense of community that

promote student satisfaction and aca-

demic success will remain the same.

Thomas DiLorenzo, Dean of the College

of Arts and Sciences, has expressed his

commitment to showcasing UAB’s top

faculty in the first year classroom.

Freshmen who are inspired to excel-

lence are likely to become the graduate

students and researchers of tomorrow.


Learn More. http://main.uab.edu/Sites/DOE/

Get Involved. Contact.

Dr. Marilyn Kurata, Director HUC 460G

701 20th Street South Birmingham, AL 35294-1150

Phone: (205) 996-6420 Fax: (205) 996-7399

E-mail: [email protected]

CORE CURRICULUM STEERING COMMITTEE Marilyn Kurata, Chair Peter Bellis * Theodore Benditt Serge Bokobza* Joe Burns* Alison Chapman * Stella Cocoris * Edwin Cook * Robert Corley David Corliss Colin Davis * Dana Hettich Harold Kincaid Chris Kyle Andrew Marsch * John Mayer * Bradley Newcomer Doug Rigney Philip Way * #

Who’s Who in Core Curriculum Enhancement at UAB

WRITING COMMITTEE Alison Chapman, Chair Tracey Baker * David Basilico Peter Bellis * Theodore Benditt Scott Brande Anne Cusic Karen Dahle * Fouad Fouad Nichole Griffith * Kyle Grimes Sarah Helms Maria Hopkins * Minabere Ibelema * Peggy Jolly * Andrew Keitt Karen Kennedy Judith King Maxie Kohler Randy Kornegay * Marilyn Kurata * # James Martin Kathleen Martin Bruce McComiskey * Tennant McWilliams Stephen Miller * Mubenga Nkashama* Douglas Oliver * Tonya Perry Midge Ray * Linda Reed Anthony Roberson Cynthia Ryan Rosalia Scripa * Lisa Sharlach Anthony Skjellum * Deborah Tanju Rita Treutel * Jacqueline Wood

FYE COORDINATING COMMITTEE Marilyn Kurata, Chair Pamela Autrey Scott Brande Kathleen Brown Shanna Campbell Kristin J. Chapleau Catherine Danielou* Colin Davis Joy Deupree Zoe Dwyer * Matt Fifolt * Michael Froning Harry Hamilton Linda Harris * Kevin Jerrolds * Michael LeBeau Danez Marrable* Juanita McMath Suzanne Scott-Trammell * Sandra Sims* Donna Slovensky * Jessica Smith* Angela Stowe Laura Talbott-Forbes Peter Tofani * Elizabeth Turnbull* Nancy Walburn * William York

UAB DISCUSSION BOOK COMMITTEE Marilyn Kurata, Chair Thomas Alexander Carolyn Braswell * Denise Bruns * Kristin J. Chapleau * David Chaplin * Janelle Chiasera * William Cockerham Robert Corley Catherine Danielou * Allan Dobbins Michael Froning Ted Gemberling * Wesley Granger * Jeff Graveline * Pat Greenup * Linda Gunter * Harry Hamilton * Patricia Higginbottom William Hutchings Daniel Jackson * Josephine Jackson-Banks J. Michael Kilby Sheri Spaine Long Heather Martin Warren Martin James McClintock Max Michael Bradley Newcomer Rosie O'Beirne * Kristin Olson * Groesbeck Parham Richard Sims * Greer Stanton * Laura Talbott-Forbes Rita Treutel * Diane Tucker * Rodney Tucker Dale Turnbough Janice Vincent Nate Wade * Patty Wang

QUANTITATIVE LITERACY COMMITTEE Edwin Cook, Chair Gypsy Abbott Jonathan Amsbary Scott Arnold * Theodore Benditt Norman Bolus * Theodore Bos Holly Brasher * Renato Corbetta * David Corliss Youngshook Han Marilyn Kurata * # Melinda Lalor * John Mayer * Teena McGuinness Stephanie Rauterkus * Don Ross Lisa Sharlach Melanie Shores Scott Snyder * Kui Zhang *

ETHICS & CIVIC RESPONSIBILITY COMMITTEE Colin Davis, Chair Thomas Alexander * Audra Buck Ellen Buckner Robert Corley * Sarah Culver Wendy Gunther-Canada * Norma-May Isakow * Robert Jefferson * Susan Key * Harold Kincaid Marilyn Kurata * # Mark LaGory Melinda Lalor Lyn Lewis Craig McClure * David Morrow * Bradley Newcomer Jennan Phillips * Jane Roy* Deborah Voltz Charles Watkins


* Current members # Ex Officio

Newsletter Editor: Marilyn Kurata Contributor: Ed Cook


What is Assessment and Why Should You Care?

Assessment “is the systematic collection, review, and use of information about educational programs undertaken for the purpose of improving student learning and development” (Palomba & Banta 1999). As this definition emphasizes, higher education accountability has moved from what institutions provide (credentialed faculty, library resources, activities, etc.) to what students learn.

How can student learning be assessed accurately and definitively, especially when (1) learning goals are neither course nor discipline specific, (2) all assessment instruments have limitations, and (3) multiple factors contribute to and affect data results?

Significant challenges to assessing student learning exist, but they do not invalidate the need to assess. The focus on learning shared by accrediting agencies, the federal government, legislative bodies, and the public will only increase, especially during times of fiscal crisis. With fewer dollars available, everyone wants assurance of quality. Assessment helps demonstrate that students are graduating with the competencies and knowledge that their degree implies.

To assess the effectiveness of our QEP on student outcomes, a range of direct and indirect assessments are used, including but not limited to the ETS Proficiency Profile, ETS Writing Sample, EBI FYI Assessment, the Defining Issues Test, a Discussion Book Survey, NSSE, IDEA, an in-house test with (continued on page 2)

Chemical Ethics and Civic Responsibility

In Spring 2008, Craig McClure, Ph.D., and

Aaron Lucius, Ph.D., successfully applied

for a QEP grant to develop a new course

entitled CH 320 Chemistry in Culture and

Ethics.

A year later, this case-based course pro-

vided the first structured curricular oppor-

tunity for UAB students to explore ethical

issues inherent in the intersection of

chemical innovations, emerging technolo-

gies, and public policy.

This fall McClure and Lucius published

“Implementing and Evaluating a Chemistry

Course in Chemical Ethics and Civic Re-

sponsibility” in the Journal of Chemical

Education (87, 1171-1175), describing the

course and assessment results. The article

begins by acknowledging that “Arguments

against teaching ethics in a science curricu-

lum frequently center on the belief that a

moral understanding of right and wrong is

established by the time an individual

enters higher education so little value can

be gained by offering additional education

about ethics. However, courses such as the

one described in this paper may help

students understand the ethical dimen-

sions of emerging technologies and how

students’ moral stance may be applied

within the framework of their major

course of study” (1171).

The impact of the course was evaluated

using the Science Education for New Civic

Engagements and Responsibilities—

Student Assessment of Learning Gains

(SENCER-SALG). Developed as part of the

NSF-sponsored SENCER project, this vali-

dated survey instrument includes both

items measured on a five-point Likert-type

scale and free-response questions.

McClure and Lucius were able to conclude,

“The results shown here indicate that a

course involving ethics in the context of

research and chemical innovations has a

positive impact on the understanding of

students in the application of chemistry,

and that an understanding of ethical di-

mensions of scientific innovations is im-

portant for these students to develop as

citizens who can participate in an effective

public discourse” (1171).

Another benefit of the course for students

was the opportunity to practice and en-

hance research and writing skills.

QEP UAB
November 2010, Volume 1, Issue 3
Degrees of Excellence

Inside this issue:

Page 2: The First-Year Initiative Assessment; 2010-11 Discussion Book Dialogues; What is Assessment and Why Should You Care? (cont’d)
Page 3: ECR, Cultural Literacy, and Literature in Translation; Student Voices from FL 220; QEP Timeline
Page 4: Using the NSSE to Assess ECR; A Faculty Profile: Mark LaGory; Medical Technology’s Capstone as a Culminating Experience
Page 5: Who’s Who in Core Curriculum Enhancement at UAB

QEP Grants

Quality Enhancement Plan Initiative

Competency | Discussion Book and Campus Conversations | Freshmen Learning Communities (FLC) | Mid-curricular Enhancement | Capstone Courses
Writing | Yes | Yes | Yes | Yes
Quantitative Literacy | Depends on book choice | Depends on linked courses and theme of FLC | Yes | Yes
Ethics and Civic Responsibility | Yes | Yes | Yes | Yes

“Teaching this course was valuable for me as an instructor, as well as for the students enrolled in the course. We were able to talk about chemistry in the classroom in a way that we seldom do, as a discipline with ethical and social implications. I think the students enjoyed that there were no right or wrong answers, but thinking about broader issues which impact them in their careers and lives was valuable to their experience in the course.”

Dr. Craig McClure, Associate Professor, Department of Chemistry

The First-Year Initiative Assessment

The First-Year Initiative (FYI) Assessment was

developed by the Policy Center on the First Year

of College in conjunction with Educational Bench-

marking (EBI). Based on the work of John Gard-

ner, foremost authority on the first-year experi-

ence, the FYI has been given to all students en-

rolled in freshman learning communities since

they were first offered as part of the QEP.

Beginning in fall 2008, the FYI has been adminis-

tered to freshmen in all First Year Experience

(FYE) courses, including learning communities,

U101, and stand-alone FYE courses developed by

schools and departments.

The FYI consists of 70 standardized items to

which students respond on a Likert scale of 1 (not

at all) to 7 (significantly) with 8 indicating not

applicable. In the data analysis, the 70 items are

grouped into 15 factors.
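To make the scoring mechanics concrete, the following is a minimal, illustrative sketch (in Python) of how per-factor means could be computed from 1-7 responses while treating the "8 = not applicable" code as missing; the factor names and item-to-factor groupings shown are invented for illustration and are not the actual FYI factor structure.

    # Illustrative sketch only: compute FYI-style factor means from Likert responses.
    # Ratings run 1 (not at all) to 7 (significantly); 8 means "not applicable"
    # and is excluded from the averages. The item groupings below are hypothetical.
    def factor_means(responses, factor_items):
        means = {}
        for factor, items in factor_items.items():
            valid = [responses[i] for i in items if responses.get(i) not in (None, 8)]
            means[factor] = sum(valid) / len(valid) if valid else None
        return means

    example_factors = {
        "Managing Time and Priorities": [12, 13, 14],   # hypothetical item numbers
        "Out-of-Class Engagement": [27, 28],
    }
    example_responses = {12: 6, 13: 5, 14: 8, 27: 7, 28: 4}   # one student's ratings
    print(factor_means(example_responses, example_factors))
    # {'Managing Time and Priorities': 5.5, 'Out-of-Class Engagement': 5.5}

The actual factor definitions and benchmarking come with the survey instrument itself; the sketch only makes explicit that not-applicable responses are dropped rather than averaged in.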

The graph to the right shows the factors in which

there has been the most obvious consistent im-

provement across almost all FYE classes based on

all mean scores. 2007 baseline data are student

data from the first set of learning communities

offered in Fall 2006. 2008 data are data from Fall

2007 courses, etc.

FYI data on other factors like Course Improved

Connections with Peers, Usefulness of Course

Readings, and Course Improved Knowledge of

Wellness have differed dramatically not only by

FYE type but also by individual section.

FYI results are shared so they can be used to

identify best practices and areas to target for

improvement. Instructors receive data results for

their individual classes. Deans and associate

deans receive FYI data for all FYE classes taught

by instructors in their unit.

What is Assessment and Why Should You Care? (cont’d from p. 1)

This issue of the QEP newsletter fo-

cuses on assessment. Articles highlight

a range of assessment instruments and

practices, especially those that provide

approaches to evaluating ECR, the most

difficult and elusive of all QEP outcomes

to measure.

In this issue, you will hear both faculty

and student voices persuasively articu-

late some of the reasons why UAB has

made and should make a commitment

to promoting ECR as an essential com-

ponent of its self-identified educational

mission “to be a research university and

academic health center that discovers,

teaches and applies knowledge for the

intellectual, cultural, social and eco-

nomic benefit of Birmingham, the state

and beyond.”

quantitative literacy (QL) and ethics

and civic responsibility (ECR) items,

and numbers related to enrollment,

retention, participation in co-curricular

activities, and graduation.

Data collected by individual programs

is also used to provide valuable supple-

mental evidence of student learning.

Additionally, national awards, honors,

and grants related to QEP goals consti-

tute external recognition of what is

being accomplished.

None of this data in itself is sufficient to

definitively assess how much our stu-

dents have gained in writing, QL, and

ECR. Everything together, however,

provides substantive evidence of

achievement, as well as benchmarks for

identifying best practices and future

goals.

Page 2, Degrees of Excellence

2010-11 Discussion Book Dialogues

Learn more about issues relevant to this year’s book Outcasts United by

attending one or more of the monthly Discussion Book Dialogues, which

take place 11:30 am -12:30 pm, in Heritage Hall, room 549, except on

February 17 (TBA).

Beverages and snacks are provided. Free and open to the public.

2010
Nov 18: Jessica Dallow, Ph.D., Associate Professor, Department of Art and Art History, “Contemporary Artists and Exile”
Dec 16: Josh Carter, Director, UAB Study Away, “An Interactive Simulation of Cross-Cultural Communication”

2011
Jan 20: Emily Hanna, Ph.D., Curator of the Arts of Africa and the Americas, Birmingham Museum of Art, “Unity and Diversity: African Art and the Creation of Community”
Feb 17: UAB Study Away Student Panel, “Lessons Learned by UAB Students Abroad”
Mar 24: UAB Soccer Representatives TBA (Note: this is the 4th Thursday)
Apr 21: Scotty Colson, Office of Economic Development, Mayor’s Office, “Birmingham’s Sister City and Other International Programs”

FYI factors graphed (most consistent improvement): Satisfaction with College/University; Course Improved Knowledge of Academic Services; Course Improved Knowledge of Campus Policies; Course Improved Managing Time and Priorities; Course Increased Out-of-Class Engagement.

ECR, Cultural Literacy, and Literature in Translation

Sheri Spaine Long, Ph.D., Professor of

Spanish and former chair of the Depart-

ment of Foreign Languages and Litera-

tures, highlights the sidebar quote in

her syllabus for FLL 220 Foreign Litera-

tures in English Translation.

“Considering many readers (not just

students) are sometimes not aware that

they are reading translations, it is key to

have students reflect on the difficulty

and the importance of literary transla-

tion in the history of ideas and the

transmission of culture in general. It is

also important to consider bias and

ethics in the act of translation,” Long

says.

The recipient of numerous awards and

honors, most recently the 2010 Flor-

ence Steiner Award for Leadership in

Foreign Language Education from the

American Council on the Teaching of

Foreign Languages, Long is a nationally

recognized advocate for the increasing

importance of languages and global

literature in fostering cross-cultural

understanding.

Two main challenges she frequently

encounters are xenophobic attitudes (“I

am not interested in Japanese literature

because I don’t know anything about

Japan”) and inherent frustration based

on the intuition that a reader of trans-

lated text is missing something that

would be apparent to a native speaker.

In class, she engages students in frank

discussions about the impossibility of

learning all languages and cultures,

simultaneously stressing how reading

and discussing readings from France,

China, Iran, Japan, Equatorial Guinea,

and other countries prepares them to

be “more sensitive and tolerant global

citizens.”

Long points out, “It is important to

recognize our own limitations and use

translation as a gateway to other cul-

tures. It is essential not to isolate our

students or us and to embrace the

concept of ‘words without borders.’ I

(personally) want my students to leave

behind their hesitation to read foreign

literature and engage all things interna-

tional.” She adds, “given the degree of

interconnectedness in our world, what

is the alternative?”

Excerpts from student responses on an exam for FL 220 are reprinted below.

Student Voices from FL 220

Essay prompt: Consider ethics and civic responsibility in its local, national and global context. Has the reading and studying of foreign literature made you a more responsible global citizen with regard to social injustice and our mutual responsibility? Explain. Use examples from the Pardo Bazán stories, Zamora Loboch’s story, and the Chinese short stories.

Excerpt from Student #1: If pilgrimage is defined as travel for transformation, this semester’s study of foreign literature has been for me a literary pilgrimage. Too often, I identify with the narrator of “Hometown” who “had nothing to look out on but the square patch of sky that was visible above the high walls of a family courtyard” (Hometown 7). Perhaps the basis of poor global citizenship is not ethnocentric bigotry but the ignorance and indifference that come from too much time spent staring at a tiny sliver of sky. In that case foreign literature is a way to tear down the walls, to put a human face on injustices that our modern sensibilities have conveniently confined to long, long ago in a land far away. At the same time, world literature is a reminder of the commonalities of the human journey, the global scope of the issues we encounter every day.

At some moments, my literary pilgrimage brought me into contact with issues that were new and “foreign”; at others, I was surprised to find familiar themes located in a new time and space. Global literature, with its references to Don Quijote, Luis Buñuel, Rene Magritte, Botticelli and even Bill Clinton’s underwear, reminds us that the human experience transcends national borders. I have never seen Taiwan, but it was easy to relate to Chu T’ien-hsin’s playful satire of modern urban anonymity in “Man of La Mancha”; I have never visited Hong Kong, but it was easy to find myself among the citizens of the floating city who, in their struggle for stability and prosperity, “became victims of the bottomless pit of material desire.” (Floating City 189) My world doesn’t look much like 19th century Spain, but a woman who rejects a traditional lifestyle in favor of independence would still be “gossip for a vexed society” (Torn Lace 64), and there are still relics of Cristóbal’s feminine ideal of a woman who is “docile” and “self-effacing” with “an ever-smiling countenance and never-ending agreeableness” (The Cigarette Stub 127-128).

Excerpt from Student #2: To live in a metropolitan city such as Atlanta, New York, or even Birmingham, one has to consider other people and the surrounding environment. With practically everyone being from another place, it is almost socially detrimental to be absent minded of other people. The United States is made up of so many different cultures that it is only correct that we do not neglect one another. It is virtually impossible to be ethical and civically responsible without considering the importance of other cultures.

. . . . I believe that these stories have opened my eyes to other cultures that aren’t necessarily better, just different from my own. The literature studied in this class has made me more culturally sensitive and aware. It has also taught me that not only is it okay to be different and strange, but there may be some common ground that exists in the seemingly separate worlds. Reading those varied stories and books have taught me to keep my mind open, never stereotype or judge, and to always do research.

Excerpt from Student #3: In the wake of a historical presidential election, civic responsibility has become a hot topic. In the US, civic responsibility is often defined locally or nationally. However, with ever-expanding mass communications, increased foreign trade and travel, and national security issues, defining civic responsibility with respect to global concerns is more important than ever. Social injustice is a particularly important global concern because the injustices pervade all aspects of life—personal relationships, work, and government. . . . Foreign literature’s unique ability to reveal social injustices and to inspire through empathy is reflected in the short stories “Spring Silkworms,” “Hometown,” “Poor Bea,” “Torn Lace,” and “The Cigarette Stub.” . . . “Spring Silkworms” is told from a different cultural perspective but the underling social injustices reflected occur globally. I can empathize because the current economy in the US has forced many companies to increase outsourcing. Also, although hard work can gain you some mobility in the US, one’s future is often limited by initial socioeconomic status.

". . . to compare the two versions of a given work and define the relationship between them involves taking into account . . . the differences between two language systems, two literary traditions, two critical traditions, and two cultures" Fitch, Brian T. "The Relationship Between Compagnie and Company: One Work, Two Texts, Two Fictive Universes." In Beckett Translating / Translating Beckett. Edited by Alan Warren Friedman et al. University Park and London: The Pennsylvania State University Press, 1987, 26-35.

Page 3, Volume 1, Issue 3

QEP TIMELINE

2004: First administration of Proficiency Profile
2005: SACS approves QEP; First Discussion Book
2006: First learning communities
2007: First QEP grants; QL Awareness Week; First writing across the curriculum workshops with consultant
2008: First stand-alone FYEs; First ethics across the curriculum workshops with consultant
2009: University Writing Web; First QEP designated courses
2010: University Writing Center; Capstones all identified; First administration of internally-developed test of QEP competencies
Spr 2011: Fifth-Year Interim Report due

Using the NSSE to Assess ECR

As part of the assessment for the re-

cently concluded Difficult Dialogues

grant from the Ford Foundation, Chris-

topher Reaves, Ph.D., analyzed both

UAB and national data from the 2006

and 2009 National Survey of Student

Engagement (NSSE), the two most

recent years when UAB administered

this nationally recognized assessment

instrument to freshmen and seniors.

Overall the data indicate that UAB has

been very productive in making strides

in its pluralism goals when compared to

universities that are similar in geogra-

phy (southern and urban) and similar in

their research- oriented focus (just as

UAB, one sub-group of comparison

universities were classified by the

Carnegie Foundation as having “Very

High Research Activity”).

Also, analysis of the data from 2006 to 2009 generally shows an increase over that three-year period in UAB students’ (first-year and senior) belief that diversity is an important and valued component of UAB life and academic experience.

Specifically, seven statements related to diversity and pluralism were analyzed from the NSSE survey.

For example, students identified how

often diverse perspectives (different

races, religions, genders, political be-

liefs, etc.) were included in class discus-

sions or writing assignments; or how

often they examined the strengths and

weaknesses of their own views on a

topic or issue; or how often they had

serious conversations with students of a

different race or ethnicity than their

own.

In 41 of the 42 comparisons made, UAB

students reported higher percentages of

pluralistic beliefs about their university

than the students at counterpart univer-

sities. The single exception was 61% of

2009 UAB seniors reporting that di-

verse perspectives are “often” or “very

often” a part of class discussions or

writing assignments compared to 63%

for all students at urban institutions.

In comparing the pluralistic opinions of

first-year and senior students at UAB

from 2006 to 2009, the data demon-

strates in 13 of 14 comparisons that

UAB students increasingly believe their

university encourages, promotes, or

enables the goals of pluralism. Al-

though there was a 3 percentage point

increase for seniors over those 3 years

in how often they examined the strengths and weaknesses of their own views on a topic or issue, there was a 4

percentage point decrease for first year

students.

More typical are the following sample

data. There was an increase from 2006

to 2009 of 2 percentage points for first-

year students and 5 percentage points

for seniors in how often diverse per-

spectives were included in class discus-

sions or writing assignments. Similarly,

students rewrite responses to case

review questions from a previous

clinical correlation class and use data

previously collected in a clinical chemis-

try class for advanced data analysis,

interpretation, and presentation. Stu-

dents identify personal behaviors

related to discipline-specific ethical

issues from previous affective evalua-

tions and develop an improvement plan.

Literally drawing upon work generated

and lessons learned in earlier course-

Linda Jeff, MA, MT (ASCP), Associate

Professor of Clinical and Diagnostic

Sciences, recently presented on how

MT 495 Clinical Practice has been

revised and enhanced to provide a

culminating capstone experience for

medical technology majors.

Attendees were impressed by the

integration of students’ work and affec-

tive evaluations from previous course-

work and clinical practice into the

capstone curriculum. For example,

work, the MT 495 capstone imposes

coherence and cohesiveness on the

undergraduate curriculum. The result

is that the intentional instructional

development in the program plan for

medical technology majors is made

more visible and obvious to students in

the program.

Janelle Chiasera, Ph.D., Acting Chair,

Department of Clinical and Diagnostic

Sciences, worked with Jeff on trans-

forming MT 495 into a more compre-

Medical Technology’s Capstone as a Culminating Experience

A Faculty Profile: Mark LaGory

that demonstrates the health costs of

being poor and living in the inner city.

A prolific scholar and researcher in the

areas of Urban Sociology, Homeless-

ness, Mental Health, and Aging, he has

published over 50 articles in refereed

journals.

A founding member of UAB’s Ethics and

Civic Responsibility Committee, Mark is

a persuasive advocate for the univer-

sity’s role in educating students in

personal and social responsibility. He

believes strongly that progress on

major social issues challenging our

country and globe depends on an edu-

cated citizenry passionate about their

civic responsibility: "One of the major

assets any society has is its social capi-

tal, its network of citizens engaged in

community life and the common good.

Social capital not only allows communi-

ties to get by but to get ahead. This asset

is a major source of societal wealth, and

colleges and universities can help the

next generation of decision makers to

understand its value for the American

future."

Among the multiple grants, honors, and

awards he has received since coming to

UAB in 1980, Mark may be most proud

of the Odessa Woolfolk University

Award for Outstanding Community

Service. After his retirement at the end

of this calendar year, Mark plans on

continuing his lifelong commitment to

serve the community, especially the

homeless, the disadvantaged, and

others most at risk through his work as

a staff member at St. Luke's Episcopal

Church.

Mark LaGory wears many hats. Profes-

sor and Chair of the Department of

Sociology and Social Work, he has

secondary appointments as a Professor

of Urban Affairs and Research Scientist

in the Center for Aging. A leader in

faculty governance at UAB, he has

served twice as President of the Faculty

Senate and was the first elected Chair of

the Comprehensive Faculty Senate.

Mark is also an ordained deacon in the

Episcopal Church and a member of the

Board of Directors for the Old Firehouse

Shelter and Alabama Appleseed Center

for Law and Justice.

Mark received his B.A., M.A., and Ph.D.

from the University of Cincinnati. He is

the co-author or editor of six books,

including Unhealthy Cities: Poverty,

Race, and Place in America, a 2010 book

Page 4, Volume 1, Issue 3

Mark LaGory

hensive capstone experience and devel-

oped the rubric used for portfolio evalua-

tion. Chiasera says with satisfaction, “At the

end of the day this is really about how we

can better educate our students. QEP has

helped us think about this in much greater

detail and has provided us with an oppor-

tunity to assure our students are making

connections.”

If you are interested in sharing your pro-

gram’s capstone, contact Marilyn Kurata at

[email protected].

there was an increase of 9 percentage

points for first-year students and 8

percentage points for seniors in how

often they had serious conversations

with students of a different race or

ethnicity than their own.

Although the QEP’s emphasis on ECR,

which includes pluralism as one of four

primary outcomes, is only one of sev-

eral possible contributors to the posi-

tive difference in responses between

the schools and among UAB students

from 2006 to 2009, the data clearly

demonstrates that our student body

believes UAB embodies the goals of

pluralism at a much greater rate than

comparable universities.

Most relevant to the QEP, the NSSE data

from 2006 and 2009 also demonstrates

that a higher percentage of students

believe UAB promotes pluralism than it

did in its recent past.


Learn More. http://main.uab.edu/Sites/DOE/

Get Involved. Contact.

Dr. Marilyn Kurata, Director 320B Administration Building

701 20th Street South Birmingham, AL 35294 Phone: (205) 996-6420

Fax: (205) 996-7399 E-mail: [email protected]

CORE CURRICULUM STEERING COMMITTEE Marilyn Kurata, Chair Peter Bellis * Theodore Benditt Serge Bokobza* Joe Burns* Alison Chapman * Stella Cocoris * Edwin Cook * Robert Corley David Corliss Colin Davis * Dana Hettich Harold Kincaid Chris Kyle Andrew Marsch * John Mayer * Bradley Newcomer Doug Rigney Philip Way * #

Who’s Who in Core Curriculum Enhancement at UAB

WRITING COMMITTEE Alison Chapman, Chair Tracey Baker * David Basilico Peter Bellis * Theodore Benditt Scott Brande Anne Cusic Karen Dahle * Fouad Fouad Nichole Griffith * Kyle Grimes Sarah Helms Maria Hopkins * Minabere Ibelema * Peggy Jolly * Andrew Keitt Karen Kennedy Judith King Maxie Kohler Randy Kornegay * Marilyn Kurata * # James Martin Kathleen Martin Bruce McComiskey * Tennant McWilliams Stephen Miller * Mubenga Nkashama * Douglas Oliver * Tonya Perry Midge Ray * Linda Reed Anthony Roberson Cynthia Ryan Rosalia Scripa * Lisa Sharlach Anthony Skjellum * Deborah Tanju Rita Treutel * Jacqueline Wood

FYE COORDINATING COMMITTEE Marilyn Kurata, Chair Pamela Autrey Scott Brande Kathleen Brown Shanna Campbell Kristin J. Chapleau Catherine Danielou* Colin Davis Joy Deupree Zoe Dwyer * Matt Fifolt * Michael Froning Harry Hamilton Linda Harris * Kevin Jerrolds * Michael LeBeau Danez Marrable* Juanita McMath Suzanne Scott-Trammell * Sandra Sims* Donna Slovensky * Jessica Smith* Angela Stowe Laura Talbott-Forbes Peter Tofani * Nancy Walburn * William York

UAB DISCUSSION BOOK COMMITTEE Marilyn Kurata, Chair Thomas Alexander Carolyn Braswell * Denise Bruns * Kristin J. Chapleau * David Chaplin * Janelle Chiasera * William Cockerham Robert Corley Catherine Danielou * Allan Dobbins Michael Froning Ted Gemberling * Wesley Granger * Jeff Graveline * Pat Greenup * Linda Gunter * Harry Hamilton * Patricia Higginbottom William Hutchings Daniel Jackson * Josephine Jackson-Banks J. Michael Kilby Sheri Spaine Long Heather Martin Warren Martin James McClintock Max Michael Bradley Newcomer Rosie O'Beirne * Kristin Olson * Groesbeck Parham Richard Sims * Greer Stanton * Laura Talbott-Forbes Rita Treutel * Diane Tucker * Rodney Tucker Dale Turnbough Janice Vincent Nate Wade * Patty Wang

QUANTITATIVE LITERACY COMMITTEE Edwin Cook, Chair Gypsy Abbott Jonathan Amsbary Scott Arnold * Theodore Benditt Norman Bolus * Theodore Bos Holly Brasher * Renato Corbetta * David Corliss Youngshook Han Marilyn Kurata * # Melinda Lalor * John Mayer * Teena McGuinness Stephanie Rauterkus * Don Ross Lisa Sharlach Melanie Shores Scott Snyder * Kui Zhang *

ETHICS & CIVIC RESPONSIBILITY COMMITTEE Colin Davis, Chair Thomas Alexander * Audra Buck Ellen Buckner Robert Corley * Sarah Culver Wendy Gunther-Canada * Norma-May Isakow * Robert Jefferson * Susan Key * Harold Kincaid * Marilyn Kurata * # Mark LaGory * Melinda Lalor Lyn Lewis Craig McClure * David Morrow * Bradley Newcomer Jennan Phillips * Deborah Voltz * Charles Watkins


* Current members # Ex Officio

Newsletter Editor: Marilyn Kurata Contributor: Chris Reaves


What is the QEP and Why Should You Care?

In less than a year, UAB will be sending a required Fifth-Year Interim Report to SACS, our accrediting agency. A major part of this report will be the QEP Impact Report, detailing how much and how successfully the university has achieved the Quality Enhancement Plan it proposed as part of its last and very successful reaccreditation in 2005.

UAB's QEP is a re-visioning of the undergraduate curriculum. It promises to improve student competencies in writing, quantitative literacy (QL) and ethics and civic responsibility (ECR) by introducing these competencies in the freshman year, reinforcing them in subsequent years, and enhancing the discipline-specific aspects of all three competencies in the senior year.

To provide the necessary infrastructure for this ambitious promise, UAB has implemented processes that enforce early registration in required core courses in English composition and pre-calculus math, as well as prerequisite checking for all courses. To foster success, an Early Warning System and Advising Curriculum have also been implemented.

The bookends to UAB's academic education for undergraduates are a First Year Experience, which consists of a required First Year Experience (FYE) course and common discussion book, and a required capstone in the major.

Ever since the QEP was developed by a 17-member, faculty-dominated committee and endorsed by the Faculty Senate, over a hundred faculty and staff have served on one or more of the QEP committees involved in its on-going implementation. (See p. 5 for membership lists.)

Faculty have worked to integrate more writing, QL, and/or ECR into course curricula, and Student Affairs has worked tirelessly to support UAB's emphasis on ECR with relevant co-curricular activities. Improving students' writing, QL, and ECR depends upon the efforts of everyone across disciplines and across campus. Together we are indeed working towards degrees of excellence!

Increased Success for Pre-Calculus Math Students

The Department of Mathematics has hired

instructors dedicated to pre-calculus math

classes, revised curricula, and integrated

technology via a Mathematics Learning

Laboratory to provide individualized

instruction and closer monitoring of stu-

dent progress. The results have been a

spectacular success.

The success rate of MA 102 averaged 43%

for the three falls before restructuring; the

success rate increased to 62% in fall 2006

and 2007 and to 75% in fall 2008, a 32 percentage point improvement since restructuring.
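For readers who want to check that arithmetic, here is a small illustrative calculation (in Python) using the figures above; it simply distinguishes the gain in percentage points from the relative gain and is not part of the Department of Mathematics' own analysis.

    # Illustrative arithmetic only, using the MA 102 rates reported above.
    before, after = 0.43, 0.75                       # pre-restructuring average vs. fall 2008
    point_gain = (after - before) * 100              # gain in percentage points
    relative_gain = (after - before) / before * 100  # gain relative to the old rate
    print(round(point_gain))                         # 32 (percentage points)
    print(round(relative_gain))                      # 74 (percent, relative)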

Other pre-calculus math courses subse-

quently underwent restructuring as well.

The Fall 2008 success rates for MA 098,

105, 106, and 107 were respectively 21%,

33%, 32%, and 40% higher than the suc-

cess rates in Fall 2005 before math restruc-

turing.

Such improvements in student success are

all the more noteworthy since the last

sections of pre-calculus math were not

converted until spring 2008.

The restructured courses incorporate

flexible hours, computer-based instruction

and testing in the Mathematics Learning

Laboratory with some classroom contact

and supplemental instruction as needed.

Besides attendance, pass, and withdrawal

data, the Department of Mathematics used

student surveys and exam analyses to

identify successful changes as well as

pedagogical components that needed

improvement. For example, Math analyzed

specific items on exams to understand

problems students were having. Results

were then used to modify lesson plans and,

where possible, course materials.

Kudos to the Department of Mathematics!

QEP UAB
May 2010, Volume 1, Issue 1
Degrees of Excellence

Inside this issue:

Page 2: The Discussion Book Initiative; Learning Communities and the First-Year Experience; Improving Writing: Freshman Composition; The Writing Web; The Writing Center
Page 3: Improving QL: Update; Improving ECR: Update; National grants help promote ECR; QEP Timeline
Page 4: Assessment: Overview of the ETS Proficiency Profile; A Faculty Profile: Anthony Skjellum; Why Capstone Courses are Critical
Page 5: Current and past members of QEP committees

QEP Grants

The deadline for submitting QEP grant

proposals for round 8 is 08/01/2010.

QEP Grants support the development

of enhanced instruction in writing, QL,

and ECR in mid-curricular courses or

support the development or enhance-

ment of capstones. Mid-curricular

grant applications may be submitted

by any FT faculty, whereas capstone

applications must be submitted by a

chair or the departmental administra-

tor responsible for oversight of the

undergraduate degree program.

Preference will be given to programs

that develop sustainable courses and

instructional models that can be used

as templates by other programs.

Quality Enhancement Plan Initiative

Competency | Discussion Book and Campus Conversations | Freshmen Learning Communities (FLC) | Mid-curricular Enhancement | Capstone Courses
Writing | Yes | Yes | Yes | Yes
Quantitative Literacy | Depends on book choice | Depends on linked courses and theme of FLC | Yes | Yes
Ethics and Civic Responsibility | Yes | Yes | Yes | Yes

http://www.uab.edu/images/degexc/QEP/pdf/QEP_CFP_10-6.pdf

The Discussion Book Initiative

Since 2005, entering freshmen have

been assigned a common book to read

before coming to campus and then

discuss the day before fall term begins.

After being welcomed by President

Garrison, they hear a presentation by

the author or another relevant person-

ality. All students then go to assigned

classrooms to participate in small group

discussions led by the President, Pro-

vost, faculty, and staff.

Besides building community and per-

sonalizing UAB to new freshmen, Dis-

cussion Book events introduce the

concept of difficult dialogues and the

value of relevant QEP competencies like

quantitative literacy, social responsibil-

ity, or valuing diversity.

Discussion book themes have been

reinforced or enlarged upon in the past

by optional classroom adoption, movies,

a monthly dialogue series, a collection

of essays by faculty, staff, and students,

on-line publication of student work, and

events at the Birmingham Museum of

Art, the McWane Science Center, and

the Birmingham Zoo.

Anyone can nominate a book by filling

out the form at http://main.uab.edu/

Sites/DOE/ECR/discussionbook/5611/

Nominations are reviewed by a campus-

wide UAB Discussion Book Committee.

Members are appointed by President

Garrison and generally have served as

small group facilitators in preceding

years. The Committee recommends 3-4

books to President Garrison, who makes

the final selection.

UAB has adopted the following Discus-

sion Books: 2005 The Spirit Catches You

and You Fall Down; 2006 The Kite Run-

ner; 2007 All Over but the Shoutin’; 2008

Field Notes from a Catastrophe: Man,

Nature, and Climate Change; and 2009

Mountains Beyond Mountains: The Quest

of Dr. Paul Farmer, a Man Who Would

Cure the World.

Students gave the Fall 2009 speaker, Dr.

David Walton, a standing ovation.

The 2010-11 UAB Discussion Book is

Outcasts United: An American Town, a

Refugee Team, and One Woman’s Quest

to Make a Difference by Birmingham-

born, New York Times reporter Warren

St. John.

Next issue: How the Discussion Book helps promote learning outcomes

EH 102 at the end of the spring semes-

ter demonstrate with a high degree of

certainty that students’ writing skills

improved over the intervening year.

The magnitude of the change was 0.58

out of 6 points the first year and 0.34

the second year; however, overall, the

scores were higher in 2008-09 than

they were in 2007-08.

Assessment identified thesis develop-

ment as the only area where students

failed to improve, so this is now empha-

sized in EH 102.

Next issue: Improving writing in mid-

curricular courses

In Spring 2009, 95.6% of degree-

seeking undergraduate students who

had earned 30+ hours had fulfilled core

freshman composition requirements .

The English department has restruc-

tured its Freshman Composition se-

quence, instituted more faculty devel-

opment, worked to increase retention of

adjunct faculty, and collaborated with

the Office of Planning & Analysis to

refine assessment procedures/rubrics.

For 2007-08 and 2008-09, analyses of

the assessment of student writing

samples taken from EH 101 at the

beginning of the fall semester and from

T h e D i s c u s s i o n B o o k I n i t i a t i v e

I m p r o v i n g W r i t i n g : F r e s h m a n C o m p o s i t i o n

L e a r n i n g C o m m u n i t i e s a n d t h e F i r s t Y e a r E x p e r i e n c e
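The pre/post writing comparison above is reported only in summary form. As a purely illustrative sketch (not the English department's or the Office of Planning & Analysis's actual procedure), a matched-pairs comparison of EH 101 and EH 102 rubric scores on the 6-point scale could be computed along the following lines; the scores shown are invented.

    # Illustrative only: paired pre/post comparison of writing rubric scores (0-6 scale).
    # Assumes each student has a matched EH 101 (fall) and EH 102 (spring) score.
    # The numbers below are invented, not UAB assessment data.
    from scipy import stats

    eh101_fall = [3.1, 2.8, 3.5, 3.0, 2.6, 3.3, 2.9, 3.2]    # hypothetical fall scores
    eh102_spring = [3.7, 3.2, 4.1, 3.5, 3.3, 3.9, 3.4, 3.8]  # hypothetical spring scores

    gains = [post - pre for pre, post in zip(eh101_fall, eh102_spring)]
    mean_gain = sum(gains) / len(gains)            # average improvement on the 6-point rubric

    t_stat, p_value = stats.ttest_rel(eh102_spring, eh101_fall)
    print(f"mean gain = {mean_gain:.2f} points, t = {t_stat:.2f}, p = {p_value:.4f}")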

Learning Communities and the First Year Experience

Empirical studies and program evaluations at multiple institutions document the effectiveness of structured learning communities in increasing student engagement, thereby promoting student satisfaction and learning.

Freshman learning communities (FLC) provide an opportunity for teachers to incorporate innovative pedagogy and interdisciplinary collaborations, in acknowledgement of which participating faculty receive a supplemental stipend. Sample FLC themes are reflected in such titles as The Green Revolution, Exploring Birmingham, What is a Good Life?, Health without Borders, From Reformation to Revolution, Lost!, and Impacting Community through Service Learning.

In Fall 2009, 486 freshmen enrolled in FLCs. Three of their instructors were among those who were honored with the 2010 President's Award for Excellence in Teaching: Alison Chapman (English), Allen Johnston (Management, Information Systems, and Quantitative Methods), and Andrew Keitt (History).

Since Fall 2008, freshmen have had the option of enrolling in stand-alone FYE courses developed by individual undergraduate schools. In addition, the Departments of Theatre, Music, Art and Art History, and Communication Studies have offered FYEs for students with special disciplinary interests.

All freshmen, except those in the University Honors Program, entering in Fall 2009 and after must take and pass (with a C or better) a first year experience (FYE) course in their first 24 credit hours at UAB. FYE courses include learning communities, U101 (The University Experience), and school-specific FYE courses. All FYE courses must include a common core of ten FYE topics fundamental to the success and retention of freshmen.

Learning communities can range from the simple block scheduling of a common cohort of students in two or more courses to a fully integrated instructional program in which designated cohorts of students take team-taught classes together.

Next issue: How FYEs promote learning outcomes

"If the [Discussion] book even changed one person's perspective on what is going on with our energy and environmental issues, then I call that a positive influence."
Umair Khan, leader of the UAB Green Initiative, Kaleidoscope, Vol. 41, Issue 31


If you haven't visited the Mervyn H. Sterne Library lately, you are in for a thoroughly pleasant surprise. Pass by Starbucks and the hordes of students (and faculty) studying in comfortable armchairs or using the computer banks, and at the back of the first floor you will find the University Writing Center, which opened in January. It offers tutorials, workshops, and personalized help.

The Writing Web

The Writing Web is an online resource that has three main components: (1) online writing resources and Ask-a-Tutor; (2) MyCompLab with instructional modules and writing exercises and the UAB University Writing Web Handbook; and (3) Writing by Disciplines, which groups locally generated writing materials by school, department, and, if relevant, course.

The Writing Web was developed by the Writing Committee in conjunction with Pearson Publishing as a resource to help students, faculty, and departments enhance writing across the curriculum.

Since Fall 2007, all freshmen who take freshman composition at UAB use a customized textbook that gives them four years' access to UAB's University Writing Web. Any student may purchase the textbook and access to the Writing Web. Students benefit because they can find writing resources they need in a single location.

Any faculty member can get a free access code by contacting Rita Treutel (English). Faculty can send Treutel writing guidelines, rubrics, assignments, and sample student work for posting on the Writing Web, to which they can then direct their students.

Currently, the Writing by Disciplines section is unevenly populated. Some units are taking full advantage of this resource to support the enhancement of writing in their programs. Some departments have posted no materials.

Contact Rita Treutel ([email protected]) for more information about how you can use the Writing Web to help improve your students' writing.


Improving Quantitative Literacy: Update

In 2008, the UAB Reporter featured a series on "Why I serve on the QL Committee." Scott Arnold (Philosophy), Lisa Sharlach (Government), Norman Bolus (Nuclear Medicine), and Holly Brasher (Government) described why quantitative literacy needs improvement. Arnold reported, "The fact that these students [in Contemporary Moral Issues class] could not distinguish between the number of murders and the murder rate indicated to me that students have not been taught basic quantitative reasoning."

Ed Cook, chair of the QL Committee, states, "Numbers are everywhere, and our graduates need strong quantitative skills to navigate their personal lives and excel in their disciplines. Moreover, claims based on numbers and scientific 'facts' bombard us from nearly every direction and affect every part of our lives. Teaching students to sort intelligently through such claims is among the most important things that we can do. Enhancing QL among our graduates is also in our self interest, as they will vote in our elections, lead our economy, and generally shape the society we live in. I'm pleased that so many UAB faculty have become part of the QL effort."

Strengthening QL begins with heightening campus awareness of QL. In 2007 QL Awareness Week featured keynote speaker John Allen Paulos and presentations by Bolus, David Corliss (Planning & Analysis), John Mayer (Mathematics), John Moore (Foreign Languages & Literatures), and a teacher in the Greater Birmingham Math Partnership. Last month, Dilhani Uswatte, winner of the 2009 Milken National Educator Award, spoke on campus on "Promoting Positive Attitudes towards Math and Improving QL in Education."

A crucial second step towards improving student skills in QL is to integrate QL across the curriculum. The QL Committee developed a rubric for identifying whether a course promotes QL sufficiently to be designated a QL course. Such a course clearly identifies QL goals in the catalog description, the syllabus, assignments, and assessments. Thus far, the QL Committee has approved 81 courses for QL designation with more applications awaiting review.

Next issue: Finding QL in Unlikely Places

National Grants Help Promote ECR at UAB

The Ford Foundation has awarded UAB two Difficult Dialogue Initiative (DDI) grants ($100,000 and $60,000) to support the development of curricular and co-curricular programming that fosters information exchange and respectful dialogue about controversial issues.

The 2006-08 DDI grant supported the development and teaching of two learning communities, one on confronting Birmingham's past and present and one on exploring ethical issues in medicine. The grant also supported a series of faculty development workshops on fostering respectful dialogue in the classroom, and two public events featuring national speakers on how race still impacts life in America. DDI team members were Robert Corley, Harold Kincaid, and Marilyn Kurata.

The 2008-10 DDI grant promotes ECR by integrating difficult dialogue pedagogy into sociology and anthropology courses and by supporting co-curricular events that use student-produced ethnographic films to promote difficult dialogues about contemporary social and community issues. DDI team members are Thomas Alexander, Michele Forman, Marilyn Kurata, Mark LaGory, Rosie O'Beirne, and Christopher Reaves.

In 2007, the Association of American Colleges & Universities named UAB to its Leadership Consortium on Educating Students for Personal and Social Responsibility. A 2007-09 Core Commitments grant ($25,000 plus 60% of cost share) was used for QEP grants to faculty to enhance ECR in courses. Remaining cost share monies supported faculty/student forums on promoting academic integrity. Core Commitments team members were Thomas Alexander, David Corliss, Norma-May Isakow, Marilyn Kurata, Midge Ray, Doug Rigney, and Philip Way.

Improving Ethics and Civic Responsibility: Update

Daniel Wueste, Director of the Rutland Institute for Ethics at Clemson University and President of the Society for Ethics Across the Curriculum, visited UAB in Fall 2008 and Spring 2010 to conduct workshops with school faculty. Deborah Tanju (Accounting & Finance) enthusiastically reported on his most recent visit by saying that, "The Ethics Across the Curriculum Workshop that I attended yesterday and today was the most educational, interesting, helpful, and useful academic training that I have received in my 30+ years as a faculty member in the UA System."

Workshops and grants are two ways that UAB supports faculty integrating more specific instruction and assignments in ethics and/or civic responsibility (ECR) into new or existing courses.

The ECR Committee developed a rubric and guidelines for identifying whether a course promotes ECR sufficiently to be designated an ECR course. Such a course identifies ECR goals in the syllabus, catalog description, assignments, and assessments. Thus far, the ECR Committee has approved 73 courses for ECR designation with more applications awaiting review.

UAB's commitment to promote ECR has been supported by the Office for Service Learning, which defines service learning as a pedagogical model "that intentionally integrates enhanced academic learning, purposeful civic learning, and relevant and meaningful service with the community."

Norma-May Isakow, Director, says that 20 courses have received service learning designation, including Dollars and Sense, a freshman learning community developed by Stephanie Rauterkus (Finance & Accounting). Her students work with Junior Achievement, a non-profit organization, to teach elementary school students basic lessons in financial literacy, including using credit cards wisely and balancing a checkbook.

The Office for Service Learning sponsors monthly service learning workshops on best practices.

Next issue: Promoting Academic Integrity

"Good training in quantitative skills pays dividends throughout life."
Holly Brasher, Ph.D., Department of Government, UAB Reporter, Vol. 32, No. 19


QEP TIMELINE

2004: First administration of Proficiency Profile
2005: SACS approves QEP; first Discussion Book
2006: First learning communities
2007: First QEP grants; QL Awareness Week; first writing across the curriculum workshops with consultant
2008: First stand-alone FYEs; first ethics across the curriculum workshops with consultant
2009: University Writing Web; first QEP designated courses
2010: University Writing Center; capstones all identified; first administration of internally developed test of QEP competencies
Spring 2011: Fifth-Year Interim Report due


Assessment: Overview of the ETS Proficiency Profile

SACS Comprehensive Standard 3.5.1 states that "The institution identifies college-level competencies within the general education core and provides evidence that graduates have attained those competencies." To provide evidence that UAB has been and continues to be in compliance with this standard, UAB has administered ETS's test of general education, the Proficiency Profile, to groups of volunteer seniors since the spring of 2004.

The test has been through two name changes since 2004. It was first called the Academic Profile and then the Measure of Academic Proficiency and Progress. It has also been administered in a 40-minute abbreviated version and a two-hour standard version. Despite the name changes and the use of both versions of the test, the resulting scores can be considered equivalent over time because of the use of statistical equating techniques.

To test for "value-added" education we also began testing groups of freshmen during orientation in the summer of 2004 and have continued to do so every year since. To acquire detailed longitudinal data, UAB asked a group of freshmen to commit to retaking the standard form of the test as rising juniors and again as seniors. Several rising juniors have already retaken the test.

ETS reports both scaled scores and proficiency classifications for Reading, Critical Thinking, Writing, and Mathematics. The questions are framed within the contexts of the humanities, social sciences, and natural sciences, and these scale scores are reported as well; since this is a skills test, proficiency classifications are not reported in the context areas.

The scaled scores are norm-referenced, meaning that they are calculated based on the distribution of scores among all test-takers. Norm-referenced scores are useful for determining the relative position of an individual or a group of individuals with respect to each other. These can be translated into the familiar percentile scores.

The proficiency classifications are criterion-referenced, meaning that students are classified relative to predetermined cutscores rather than each other. These cutscores are established by a group of experts based on the expectations of what minimally competent students should be able to do at particular levels.
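The difference between the two kinds of scores can be illustrated with a short sketch. The comparison group and the cutscores below are invented for illustration only; they are not ETS norms or proficiency cutscores.

    # Illustrative sketch of norm-referenced vs. criterion-referenced scoring.
    # The norm group and cutscores are invented, not ETS values.
    from bisect import bisect_right

    NORM_GROUP = [108, 111, 113, 114, 116, 117, 119, 121, 123, 126]     # hypothetical comparison scores
    CUTSCORES = [("level 1", 110), ("level 2", 115), ("level 3", 120)]  # hypothetical cutscores

    def percentile(score, comparison=NORM_GROUP):
        """Norm-referenced: percent of the comparison group scoring at or below `score`."""
        ranked = sorted(comparison)
        return 100.0 * bisect_right(ranked, score) / len(ranked)

    def proficiency_level(score, cutscores=CUTSCORES):
        """Criterion-referenced: highest predefined cutscore the student meets."""
        met = [name for name, cut in cutscores if score >= cut]
        return met[-1] if met else "below level 1"

    print(percentile(118))         # relative standing: 60.0 (60th percentile of this group)
    print(proficiency_level(118))  # absolute standing: "level 2"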

While the scaled scores are useful for comparing freshmen to seniors or UAB students to students from comparison institutions, the proficiency classifications are useful for determining what students should be able to do to demonstrate learning. ETS publishes the proficiencies being tested so that institutions can see what learning outcomes are or are not being addressed in the general education curriculum.

Subsequent issues of this newsletter will discuss in more detail how UAB freshmen and seniors compare to each other and to students who have taken the Proficiency Profile at similar institutions. Suffice it to say at this point that seniors do significantly better than freshmen, and both UAB freshmen and seniors do better than students at similar institutions on both the scaled scores and the proficiency levels.

"The ETS Proficiency Profile is a test of general education competencies that has been administered to thousands of students at hundreds of institutions across the country. Learn more on the ETS web site."

Why Capstone Courses are Critical

The title of a collection of essays, The Senior Year Experience: Facilitating Integration, Reflection, Closure, and Transition, by John Gardner and Gretchen Van der Veer, indicates why so many universities and some professional accrediting agencies already require senior capstones.

In its 1998 report "Reinventing Undergraduate Education: A Blueprint for America's Research Universities," the Boyer Commission urged universities to reinvigorate the last year through a culminating experience. The Association of American Colleges & Universities has identified capstones as one of eight educational practices particularly effective in promoting key liberal education outcomes through sequential curricular design (Purposeful Pathways 2006). "Doing Less Work, Collecting Better Data: Using Capstone Courses to Assess Learning" by Catherine White Berheide (peerReview 2007) identifies other reasons for the popularity of capstones.

ABET is the accrediting agency for college programs in applied science, computing, engineering, and technology. At the most recent monthly Conversation on Capstones faculty development workshop, Andrew Sullivan (Civil, Construction, & Environmental Engineering) discussed how his department fulfills ABET's requirement that capstones promote both hard skills in the discipline and "soft" ECR skills like ethical responsibility, communication skills, and awareness of contemporary issues.

All students graduating in 2013 or later must successfully complete the capstone course or experience required by their major program or school to graduate. So far, 43 programs have successfully developed designated capstone courses.

Next issue: Different disciplines, different capstones

A Faculty Profile: Anthony Skjellum

[Photo: Dr. Anthony Skjellum]

Anthony Skjellum came to UAB in 2003 after 10 years as a Professor at Mississippi State University. Tony, as he prefers to be called, is the third chair of the Department of Computer and Information Sciences. Given its 42+ year history, the department actually predates the modern UAB.

What many folks don't know is the impressive interdisciplinary breadth Tony brought with him. He holds three degrees from Caltech: B.S. in Physics, M.S. in Chemical Engineering, and Ph.D. in Chemical Engineering & Computer Science. Before joining Mississippi State, he held a research position in Computer Science at the Lawrence Livermore National Laboratory, working in supercomputing, a field that remains his focus to date.

Tony's collaborative leadership style includes working with the first two chairs of CIS, Anthony Barnard and Warren Jones, who remain active emeritus participants in the department. Collegiality and productivity are hallmarks of the Department, which currently has nine professors and five staff members.

"With nearly 400 undergraduate majors enrolled by Fall 2010, QEP has proven both a timely addition to the framework of curriculum enhancement, and a rallying point for CIS Faculty. I'm certain that enhancements in undergraduate quality, both through ABET accreditation and subsequent QEP enhancements, have decidedly enhanced our reputation among prospective students as well as key learning outcomes," observes Tony.

He enjoys watching Star Trek reruns with his son and daughter when not composing cryptic e-mails on his ever-handy Blackberry.


Learn More: http://main.uab.edu/Sites/DOE/

Get Involved. Contact: Dr. Marilyn Kurata, Director, 320B Administration Building, 701 20th Street South, Birmingham, AL 35294. Phone: (205) 996-6420. Fax: (205) 996-7399. E-mail: [email protected]

Who's Who in Core Curriculum Enhancement at UAB

CORE CURRICULUM STEERING COMMITTEE Marilyn Kurata, Chair Peter Bellis * Theodore Benditt Alison Chapman * Stella Cocoris * Edwin Cook * Robert Corley David Corliss * Colin Davis * Dana Hettich * Harold Kincaid * Chris Kyle * Andrew Marsch * John Mayer * Bradley Newcomer Doug Rigney Philip Way * #

WRITING COMMITTEE Alison Chapman, Chair Tracey Baker * David Basilico Peter Bellis * Theodore Benditt Scott Brande Anne Cusic Fouad Fouad Kyle Grimes Sarah Helms Maria Hopkins * Minabere Ibelema * Peggy Jolly * Andrew Keitt Karen Kennedy Judith King Maxie Kohler * Randy Kornegay * Marilyn Kurata * # James Martin * Kathleen Martin Bruce McComiskey * Tennant McWilliams * Tonya Perry * Midge Ray * Linda Reed Anthony Roberson * Cynthia Ryan Rosalia Scripa * Lisa Sharlach Anthony Skjellum * Deborah Tanju Rita Treutel * Jacqueline Wood

FYE COORDINATING COMMITTEE Marilyn Kurata, Chair Pamela Autrey Scott Brande* Kathleen Brown Shanna Campbell Kristin J. Chapleau* Colin Davis* Joy Deupree Zoe Dwyer * Matt Fifolt * Michael Froning Harry Hamilton Linda Harris * Kevin Jerrolds * Michael LeBeau Juanita McMath Donna Slovensky Angela Stowe * Laura Talbott-Forbes* Peter Tofani * Nancy Walburn * William York *

UAB DISCUSSION BOOK COMMITTEE Marilyn Kurata, Chair Thomas Alexander Carolyn Braswell * Denise Bruns * William Cockerham Robert Corley Catherine Danielou * Allan Dobbins Michael Froning Wesley Granger * Linda Gunter * Harry Hamilton * Patricia Higginbottom * William Hutchings * Josephine Jackson-Banks J. Michael Kilby Sheri Spaine Long Heather Martin * Warren Martin James McClintock * Max Michael Bradley Newcomer Rosie O'Beirne * Kristin Olson * Groesbeck Parham Richard Sims * Greer Stanton * Laura Talbott-Forbes Rita Treutel * Diane Tucker * Rodney Tucker Dale Turnbough Janice Vincent Nate Wade * Patty Wang * Bettye Wilson

QUANTITATIVE LITERACY COMMITTEE Edwin Cook, Chair Gypsy Abbott Jonathan Amsbary Scott Arnold * Theodore Benditt Norman Bolus * Theodore Bos Holly Brasher * Renato Corbetta * David Corliss * Youngshook Han Marilyn Kurata * # Melinda Lalor * John Mayer * Teena McGuinness * Don Ross * Lisa Sharlach Melanie Shores Scott Snyder * Kui Zhang *

ETHICS & CIVIC RESPONSIBILITY COMMITTEE Harold Kincaid, Co-Chair Colin Davis, Co-Chair Thomas Alexander * Audra Buck Ellen Buckner Robert Corley * Sarah Culver Wendy Gunther-Canada * Norma-May Isakow * Susan Key * Marilyn Kurata * # Mark LaGory * Melinda Lalor * Lyn Lewis Bradley Newcomer Deborah Voltz * Charles Watkins*


* Current members # Ex Officio

Newsletter Editor: Marilyn Kurata Contributor: David Corliss


Degrees of Excellence (QEP UAB), Volume 1, Issue 2, August 2010

What is the University Writing Web and Why Should You Care?

As part of its commitment to enhance freshman composition, the Department of English requires every student who takes a freshman composition course at UAB to purchase The University Writing Web textbook. This handbook includes an access code for 5 years of access to the online UAB University Writing Web (UWW).

The UWW is an online resource that has three main components: (1) online writing resources and Ask-a-Tutor; (2) MyCompLab with instructional modules and writing exercises and the UAB University Writing Web Handbook; and (3) Writing by Disciplines, which groups locally generated writing materials by school, department, and, if relevant, course.

Faculty can have students compose their written work in MyCompLab and then use different options to provide audio or written comments when responding to student assignments. Faculty can post writing assignments, guidelines, sample papers, grading rubrics, and readings for specific courses.

Students can find specific or general help for writing problems at the UWW. There is an excellent tutorial on avoiding plagiarism, and APA and MLA content has been updated to reflect the most recent guidelines.

Want to learn more about how the UWW can help you help your students improve their writing skills? Faculty can get a free access code by contacting Nichole Griffith, Interim Director of the University Writing Center, at [email protected]. To arrange a Pearson workshop to learn about the UWW resources available to you and your students, contact Nichole or Jency Sharp, Pearson representative, at [email protected]

Students who encounter problems accessing the UWW can contact Rita Treutel for help at [email protected].

IMPORTANT: No student should ever buy a second-hand University Writing Web handbook. Students who did not take freshman composition at UAB must purchase a new handbook to get access to the UWW. Used books will not have viable access codes.

2010 Discussion Book — Preparing Students, Building Community

The day before fall term began, 68 enthusiastic faculty and staff facilitated small discussion groups after the Class of 2014 had attended a presentation by author Warren St. John on Outcasts United, UAB's sixth campus discussion book.

"I really do think this is one of the most effective and best institutional initiatives I have seen in my 12+ years at UAB," says Brad Newcomer (Nuclear Medicine Technology Program & ELSP). "This whole effort has been a wonderful addition to the fabric of UAB undergraduate education, especially as it pertains to the freshman's initial exposure to UAB and college-level discourse."

Sheri Spaine Long (FLL) is typical of the volunteer facilitators who come from multiple units across campus. Obviously having communicated her enthusiasm ("I love facilitating") to her small group, she reports that her students enjoyed a lively and substantive discussion.

Lois Christensen (Education) wrote, "We had the BEST group. They shared and spoke out about all of the topics. Because Ms. Ebtesam Rababah attended our session and gave some background about Jordan to fill in gaps about Luma, they wanted more geographic information. Ms. Rababah is a doctoral student in the SOE. It made the text and experience so much more relevant." Not surprisingly, "Many of the students thanked us profusely. . . . This was a terrific choice."

Richard Berliner (Real Estate) encouraged discussion among his students with homemade cookies and his firsthand report on Clarkston, home of the Fugees. A slideshow of his visit can be seen at http://main.uab.edu/Sites/DOE/ECR/

Inside this issue:
How the Discussion Book Promotes Learning Outcomes (p. 2)
Finding QL in Unlikely Places (p. 2)
2010-11 Discussion Book Dialogues (p. 2)
The University Writing Center (p. 2)
How FYEs Promote Learning Outcomes (p. 3)
Mervyn H. Sterne – An Essential Partner in the QEP (p. 3)
Promoting Academic Integrity (p. 3)
QEP Timeline (p. 3)
Pilot Project on Assessing QEP Competencies in Student Work (p. 4)
A Faculty Profile: Lou Anne Worthington (p. 4)
Different Disciplines, Different Capstones (p. 4)
Who's Who in Core Curriculum Enhancement at UAB (p. 5)

QEP Grants

"I'm honored to serve as a new student Discussion Group facilitator and a member of the Discussion Book Committee. I attended a small liberal arts university, and I will always remember our week-long orientation which included many opportunities for small-group discussion and socializing with other freshmen. As a facilitator of the New Student Discussion Groups I can provide new UAB students with a similar experience on a smaller scale and make the transition to university life a little easier for them."
Heather Martin, Associate Librarian, Sterne Library

Quality Enhancement Plan Initiative

Competency: Discussion Book and Campus Conversations | Freshmen Learning Communities (FLC) | Mid-curricular Enhancement | Capstone Courses
Writing: Yes | Yes | Yes | Yes
Quantitative Literacy: Depends on book choice | Depends on linked courses and theme of FLC | Yes | Yes
Ethics and Civic Responsibility: Yes | Yes | Yes | Yes


How the Discussion Book Promotes Learning Outcomes

This chart shows the percentage of freshmen responding to a student survey who felt reading the annual discussion book, hearing a relevant presentation, and participating in a small group discussion contributed "quite a bit" or "very much" to their understanding of social, medical, or ethical issues; understanding of people of other racial, ethnic, and cultural backgrounds; awareness of the impact that global events have on their lives; and the likelihood of their engaging in difficult dialogues in class or with friends. The increasing percentages of respondents who feel comfortable with dialogue reflect successive revisions of the small group discussion template to include practice of this desired collegiate classroom interaction. The particular book has a direct impact on how significantly it contributes to other desired outcomes.

[Bar chart, 0-80%: responses for ethical issues, diversity, global events, and dialogues, by year, 2005-2009.]

2005: The Spirit Catches You and You Fall Down
2006: The Kite Runner
2007: All Over But the Shoutin'
2008: Field Notes from a Catastrophe
2009: Mountains Beyond Mountains
2010: survey data is still being collected

The University Writing Center

Since January 2010, students and faculty have a new writing resource at the back of the renovated first floor of Sterne Library. The University Writing Center offers work stations with desktop computers for individual tutoring, three conference rooms with large-screen television monitors that can accommodate up to six people, and a 40-seat classroom.

Interim Director Nichole Griffith, Ph.D., assistant professor of English, oversees experienced adjunct instructors and trained students, who offer one-on-one help on both an appointment and walk-in basis for 20 hours a week. Specializing in writing in the disciplines, the Center was operating at 92% of tutoring capacity by the end of spring term.

Additionally, the Center responds to individual faculty requests. For example, the Center has offered workshops in Reading and Studying Skills for Sandra Davis's nursing students, developed six online writing modules for Mary Warren's and Laura Vogtle's Occupational Therapy courses, and provided an AP Literature Review for Tom Struzick's sociology students and Charlene Bender's nursing students. Contact Nichole Griffith (996-7178) for help.

Finding QL in Unlikely Places

It is not surprising that the majority of the 80+ courses that have been approved for Quantitative Literacy (QL) designation are offered by departments that were part of the former School of Natural Sciences and Mathematics and School of Social and Behavioral Sciences. However, a review of the titles of QL designated courses shows that QL components are introduced or reinforced across a much broader range of disciplines, including courses on Financial Management in Healthcare Organizations, Electrical Networks, Accounting Principles I, Communication Research Methods, and Measurement and Evaluation in Early Childhood Education.

Varying levels of QL instruction, practice, and assessment are also found in such disparate courses as Introductory Spanish I, African Identity and Personality, Practical Reasoning, Introduction to Symbolic Logic, Directed Study in Respiratory Care II, Introductory French I, Radiation Protection and Biology, Technical Writing, and Informatics and Research for Nursing Practice for RNs. All students benefit from QL skills!


2010-11 Discussion Book Dialogues

Learn more about issues relevant to this year's book Outcasts United by attending one or more of the monthly Discussion Book Dialogues, which take place 11:30 am-12:30 pm in Heritage Hall, room 549, except on December 16 (see below for location). Beverages and snacks are provided. Free and open to the public.

2010
Sept 16: Samantha Kelly, Curator of Education, Birmingham Museum of Art, "The Power of the Creative Act: How Museums Transform and Unite Community"
Oct 21: Kristi Menear, Ph.D., Associate Professor, Department of Human Studies, "Outcomes of Physical Activity in All Children"
Nov 18: Jessica Dallow, Ph.D., Associate Professor, Department of Art and Art History, "Contemporary Artists and Exile"
Dec 16: Josh Carter, Director, UAB Study Away, "An Interactive Simulation of Cross-Cultural Communication"

2011
Jan 20: Emily Hanna, Ph.D., Curator of the Arts of Africa and the Americas, Birmingham Museum of Art, "Unity and Diversity: African Art and the Creation of Community"
Feb 17: UAB Study Away Student Panel, "Lessons Learned by UAB Students Abroad"
Mar 24: UAB Soccer Representatives, TBA (note: this is the 4th Thursday)
Apr 21: Scotty Colson, Office of Economic Development, Mayor's Office, "Birmingham's Sister City and Other International Programs"


October 15 is the deadline to Nominate a Discussion Book for 2012


How FYEs Promote Learning Outcomes

In "What Student Engagement Data Tell Us about College Readiness," George Kuh reports that both the National Survey of Student Engagement and the Association of American Colleges and Universities' Liberal Education and America's Promise (LEAP) project identify freshman learning communities and freshman seminars as high impact practices that channel student effort more productively.

First Year Experience (FYE) courses are effective because they help close the gap between collegiate responsibilities and student expectations based on high school experience. According to the High School Survey of Student Engagement, 47% of seniors study 3 or fewer hours per week, yet 66% of these seniors report receiving mostly A's and B's. In high school, good faith effort is rewarded, and extra credit work is often available to bolster final course grades. College freshmen need to learn early that results count in college. If they do not complete regular assignments in a timely, satisfactory way, they will fail.

A Student Success Work Group at UAB used such data to generate what has become the common core for all FYE courses: coverage of the Structure & Mission of UAB, Faculty Expectations & Student Responsibilities, Academic Policies & Procedures, Academic Survival Skills, Advising & Career Planning, Time Management, Financial Management, Maintaining a Healthy Lifestyle, Library Research Resources, and Campus Involvement Opportunities.

In early August, 34 FYE faculty attended a 5-hour workshop to share their expertise and questions on how to teach this core material most effectively. After presenting on "What's the Difference between High School & College?" Rita Treutel (English) joined Randy Blythe (English) in a presentation on "Making the Most of Your FLC." Ovuke' Emonina (Biology), Zoe Dwyer (Engineering), and John Moore (FLL) facilitated two sessions on "FYE Instructors Share Best Practices." Jamie Grimes (Chemistry) and Adam Vines (English) shared practical tips on "Integrating FYE Topics into the Curriculum" to make a greater impact. Delores Carlito (Sterne) and Kerri Barnstuble (Undergraduate Retention Initiatives) led a discussion on "Strategies to Promote Academic Integrity & Discourage Plagiarism."

Mervyn H. Sterne – An Essential Partner in the QEP

If you haven't visited Sterne Library recently, do so. Although still undergoing renovation in stages, the first floor has become one of the liveliest places on campus now that Starbucks, comfortable chairs, and more user-friendly settings have joined the ever helpful professional staff in welcoming students and faculty to stay a while.

What may not be as obvious is the extent to which Sterne librarians have contributed to the QEP. Because education takes place outside the classroom (in labs, internships, libraries, clinics, field work, and co-curricular activities) as much as it does inside the classroom, enhancing learning outcomes is a campus-wide endeavor with multiple working partners.

Last year, Dana Hettich (General Reference) was one of two representatives appointed by the Faculty Senate to serve on the Core Curriculum Steering Committee, which oversees implementation of the QEP. Sterne professionals have played a particularly key role in supporting the Discussion Book Initiative and First Year Experience courses.

Heather Martin (Reference-A&H) just concluded her fourth year on the Discussion Book Committee, which reviews all nominations and recommends to President Garrison 3-4 titles for the following year's discussion book. Hettich, Martin, and Jeff Graveline (Reference-Business & Government Documents) have taken leadership roles in compiling and posting a list of materials and readings to support each year's discussion book.

A member of the FYE Coordinating Committee, Linda Harris (Head of Reference Services) coordinates the library liaisons assigned to each FYE course: Graveline, Martin, Hettich, Jennifer Long (Reference-NSM), Craig Beard (Reference-Engineering), Brooke Becker (Reference-SBS), Imelda Vetter (Reference-Education), and Delores Carlito (Instruction & Outreach).

Carlito is a regular presenter at the annual FYE faculty workshops. She has developed materials that tie library resources to FYE learning outcomes and that suggest ways to incorporate research into FYE classes. Most recently, Carlito collaborated on an online Academic Integrity tutorial that goes live in September (see below).

Promoting Academic Integrity

On August 1, The New York Times published an article, "Plagiarism Lines Blur for Students in Digital Age," describing real-life instances. Students assume that unattributed articles on the Web can be cited without acknowledgement or that anything from Wikipedia counts as "common knowledge."

According to the 2008 Campus Computing Survey, 54.7% of responding institutions utilize antiplagiarism software to detect deliberate or inadvertent plagiarism. Growing use of Turnitin.com and anecdotal evidence indicate that UAB faculty share these national concerns.

Faculty may be encouraged to know that a recent study demonstrates the greater effectiveness of a Web-based online tutorial over threats in deterring plagiarism, especially for students who entered college with lower test scores. Online tutorial options include "Avoiding Plagiarism" in MyCompLab through the University Writing Web, the CBB Plagiarism Resource Site, and UMUC's VAIL.

After researching these and other options, Delores Carlito (Sterne), Kerri Barnstuble (Undergraduate Retention Initiatives), and Nichole Griffith (University Writing Center) collaboratively developed an online academic integrity tutorial for FYE courses specifically geared for UAB students. When the tutorial goes live in September, the eReporter and emails will publicize how to access this newest tool to promote academic integrity.

"Although there is only a three months' difference between high school seniors and college freshmen, we think of them as coming to UAB prepared with knowledge – how to research, how to evaluate, how to think critically and independently – that they may not have. I remember being a freshman at UAB and I am amazed at how I survived that first semester, as clueless as I was. I work with the FYE courses so that I can introduce students to resources that they will be able to use their entire career at UAB. It's important that the freshmen feel comfortable with and welcome at UAB."
Delores Carlito, Reference Librarian for Instruction


QEP TIMELINE

2004: First administration of Proficiency Profile
2005: SACS approves QEP; first Discussion Book
2006: First learning communities
2007: First QEP grants; QL Awareness Week; first writing across the curriculum workshops with consultant
2008: First stand-alone FYEs; first ethics across the curriculum workshops with consultant
2009: University Writing Web; first QEP designated courses
2010: University Writing Center; capstones all identified; first administration of internally developed test of QEP competencies
Spring 2011: Fifth-Year Interim Report due


Pilot Project on Assessing QEP Competencies in Student Work

The Writing, QL, and ECR Committees each recently concluded a pilot/feasibility study on the collection and evaluation of student work from QEP designated courses as an effective means to assess QEP learning outcomes at the institutional level. A brief description of the methods, results, and findings of the study by the QL Committee (p. 5) follows.

Instructors of ten of the 28 QL-designated courses (43 distinct sections) that were offered during Fall 2009 were asked to select an assignment or exam, indicate the broad QL competencies that it assessed, and provide ten randomly-selected, ungraded student responses for the committee to review.

The pilot study included a course from each of the then existing eight schools with undergraduate programs. One extra course was included from NS&M and SBS because these schools offered the most QL courses. Course instructors, QL Committee members, and Judy Baker in the Office of Planning and Analysis worked together to ensure that identifying information was removed from each student response before it was reviewed.

Each set of student work products was evaluated by three committee members, with the primary reviewer being from the same school as the course from which the materials had been submitted. Primary reviewers led the discussion of each set of materials, and each reviewer presented his or her ratings and qualitative reactions to the QL Committee. These presentations then served as a basis for discussion.

One goal of the pilot study was to develop a method for evaluating student work products for evidence and quality of QL. Another goal was to evaluate the reliability of the ratings obtained. Assessing QEP learning outcomes at the institutional level was not a goal because the samples of student work were limited and most courses supplying student work were not at the senior level.

The pilot study was successful in identifying some best practice procedures to promote reliable and valid assessment, for example, specifying that assignments be complete and self-contained and clarifying which QL competencies in a course are being assessed in a specific assignment. The pilot study also identified the need for periodic review and possible revision of QL competencies to clarify fine distinctions or ambiguities and to ensure that all desired QL competencies are covered.

The QL, Writing, and ECR Committees encountered different challenges and came to different conclusions on the basis of their individual pilot/feasibility studies. Each committee identified issues that will need to be addressed so that student work from across the disciplines can be used as a basis for institutional assessment. Each study also underscored the value of examining student work as a means of feeding back to the committees the ways in which core competencies are assessed by instructors and expressed by students.

Although this process is in its infancy, the ultimate goal is to identify common characteristics of mastery and sources of difficulty in the core competencies that cut across disciplines, and to use this information to direct faculty and university attention and resources to enhance instruction in these competencies in the future. The Committees and the QEP leadership thank all faculty who participated in this project.

Different Disciplines, Different Capstones

Since Fall 2007, programs across campus have shared best practices and challenges at monthly Conversation on Capstones meetings. One developing, piloted, or established capstone is highlighted at each of these informal lunch meetings.

Gregg Janowski inaugurated this workshop series with a presentation on the senior design courses that serve as a capstone experience for Mechanical Engineering and Materials Science & Engineering majors. Since then, multiple faculty have shared guidelines, manuals, rubrics, red flags, assignments, and models on capstones that range in format/focus from internships, clinical practice, and field work to experimental research, group projects, and theses.

As one attendee remarked, "Why reinvent the wheel? It's great to be able to see what has worked and what can be adapted for my own program and students."

Faculty can view handouts or PowerPoints from all past presentations at the Conversation on Capstones site.

On Sept 9, 12:00-1:00 pm, Suzanne Scott-Trammell, Director of Career Services, will kick off the 2010-11 Conversation on Capstones series with a presentation on how Career Services can provide resources and information that could be usefully integrated into capstones regardless of the discipline. Please RSVP to Juanita Sizemore at [email protected] by noon, Tuesday, September 7, since lunch will be provided in Sterne Library Room 182.

The following additional fall Conversation on Capstones meetings will also take place 12:00-1:00 pm:
Linda Jeff on SHP's Medical Technology Capstone (Oct 20, room TBA)
Vanessa Vega on SOE's Student Teaching Internship Capstone (Nov 17, room TBA)

If you would like to share your program's capstone, contact Marilyn Kurata at [email protected].

A Faculty Profile: Lou Anne Worthington

[Photo: Lou Anne Worthington]

Lou Anne Worthington has worn multiple hats since joining the UAB faculty in 1996, including program coordinator, chair (of the Department of Leadership, Special Education and Foundations), and, since January 2009, Associate Dean for Programs in the School of Education, College of Arts and Sciences.

Typically, Lou Anne credits her colleagues for the School of Education earning straight A's on the Alabama State Department of Education's latest report card on teacher preparation programs: "I think it reflects the greatness of our teachers, students and everyone in the SOE and all the hard work they put into being successful" (Kaleidoscope, 7/26/2010).

However, her colleagues recognize and appreciate the steady administrative support she provided during a time of structural change and leadership transition. Last spring, Kristi Menear (Associate Professor, Dept of Human Studies) pointed out, "Given all of the above and beyond the call of duty work Lou Anne is doing during our period of time without an SOE dean or interim dean, I would feel very guilty if I had to say no to a request from her!"

A two-time recipient of the President's Award for Excellence in Teaching in the School of Education, Lou Anne has also been awarded several state awards for her commitment to the field of special education. She has served as President of the Alabama Federation Council for Exceptional Children and has also served on numerous state and national committees and boards.

In recent years, she has created and implemented a comprehensive in-service education program in the areas of inclusion and collaborative teaching. Lou Anne has been involved in program building and accreditation efforts for over two decades, and she serves as the UAB liaison to the Alabama State Department of Education.

She earned her bachelor's and master's degrees from Auburn University and a Ph.D. from the University of Alabama. Her hobbies include landscaping, reading, and collecting antiques.


Learn More: http://main.uab.edu/Sites/DOE/

Get Involved. Contact: Dr. Marilyn Kurata, Director, 320B Administration Building, 701 20th Street South, Birmingham, AL 35294. Phone: (205) 996-6420. Fax: (205) 996-7399. E-mail: [email protected]

Who's Who in Core Curriculum Enhancement at UAB

CORE CURRICULUM STEERING COMMITTEE Marilyn Kurata, Chair Peter Bellis * Theodore Benditt Serge Bokobza* Joe Burns* Alison Chapman * Stella Cocoris * Edwin Cook * Robert Corley David Corliss * Colin Davis * Dana Hettich Harold Kincaid Chris Kyle Andrew Marsch * John Mayer * Bradley Newcomer Doug Rigney Philip Way * #

WRITING COMMITTEE Alison Chapman, Chair Tracey Baker * David Basilico Peter Bellis * Theodore Benditt Scott Brande Anne Cusic Fouad Fouad Nichole Griffith Kyle Grimes Sarah Helms Maria Hopkins * Minabere Ibelema * Peggy Jolly * Andrew Keitt Karen Kennedy Judith King Maxie Kohler * Randy Kornegay * Marilyn Kurata * # James Martin * Kathleen Martin Bruce McComiskey * Tennant McWilliams * Tonya Perry Midge Ray * Linda Reed Anthony Roberson * Cynthia Ryan Rosalia Scripa * Lisa Sharlach Anthony Skjellum * Deborah Tanju Rita Treutel * Jacqueline Wood

FYE COORDINATING COMMITTEE Marilyn Kurata, Chair Pamela Autrey Scott Brande Kathleen Brown Shanna Campbell Kristin J. Chapleau Catherine Danielou* Colin Davis Joy Deupree Zoe Dwyer * Matt Fifolt * Michael Froning Harry Hamilton Linda Harris * Kevin Jerrolds * Michael LeBeau Juanita McMath Suzanne Scott-Trammell * Sandra Sims* Donna Slovensky * Jessica Smith * Angela Stowe Laura Talbott-Forbes Peter Tofani * Nancy Walburn * William York

UAB DISCUSSION BOOK COMMITTEE Marilyn Kurata, Chair Thomas Alexander Carolyn Braswell * Denise Bruns * William Cockerham Robert Corley Catherine Danielou * Allan Dobbins Michael Froning Wesley Granger * Jeff Graveline * Linda Gunter * Harry Hamilton * Patricia Higginbottom William Hutchings Daniel Jackson * Josephine Jackson-Banks J. Michael Kilby Sheri Spaine Long Heather Martin Warren Martin James McClintock Max Michael Bradley Newcomer Rosie O'Beirne * Kristin Olson * Groesbeck Parham Richard Sims * Greer Stanton * Robyn Stiff * Laura Talbott-Forbes Rita Treutel * Diane Tucker * Rodney Tucker Dale Turnbough Janice Vincent Nate Wade * Patty Wang Bettye Wilson

QUANTITATIVE LITERACY COMMITTEE Edwin Cook, Chair Gypsy Abbott Jonathan Amsbary Scott Arnold * Theodore Benditt Norman Bolus * Theodore Bos Holly Brasher * Renato Corbetta * David Corliss * Youngshook Han Marilyn Kurata * # Melinda Lalor * John Mayer * Teena McGuinness Don Ross Lisa Sharlach Melanie Shores Scott Snyder * Kui Zhang *

ETHICS & CIVIC RESPONSIBILITY COMMITTEE Colin Davis, Chair Thomas Alexander * Audra Buck Ellen Buckner Robert Corley * Sarah Culver Wendy Gunther-Canada * Norma-May Isakow * Robert Jefferson * Susan Key * Harold Kincaid * Marilyn Kurata * # Mark LaGory * Melinda Lalor Lyn Lewis Craig McClure * David Morrow * Bradley Newcomer Jennan Phillips * Deborah Voltz * Charles Watkins


* Current members # Ex Officio

Newsletter Editor: Marilyn Kurata Contributor: Ed Cook


Success Rates in Precalculus Courses

Definitions:
Success in MA 098 means attaining a grade of P; in the other courses, a grade of A, B, or C.
Black numbers identify pass rates for courses taught in traditional format.
Teal numbers identify pass rates for courses with some sections taught in traditional format and some sections taught in restructured format.
Blue numbers identify pass rates for courses taught only in restructured format.
Red numbers identify pass rates for courses taught in ALEKS format.

All sections (including QL sections)
Term      MA 098   MA 102   MA 105   MA 106   MA 107   MA 110
Fall 05     40%      34%      50%      39%      38%      69%
Fall 06     54%      53%      61%      71%      71%      63%
Fall 07     56%      62%      73%      66%      71%      72%
Fall 08     61%      75%      83%      71%      78%      89%
Fall 09     68%      78%      82%      67%      78%      85%
Spr 06      36%      44%      60%      73%      31%      58%
Spr 07      50%      51%      53%      59%      72%      55%
Spr 08      52%      66%      78%      70%      88%      87%
Spr 09      63%      71%      80%      73%      73%      89%
Spr 10      65%      76%      76%      63%      68%      88%
Sum 06      37%      69%      55%      63%      65%      56%
Sum 07      73%      63%      62%      77%      NA       81%
Sum 08      72%      92%      90%      78%      NA       93%
Sum 09      75%      78%      86%      91%      81%      88%

Distance-Learning sections (QL sections only)
Term      MA 098   MA 102   MA 105   MA 106   MA 107
Fall 09     49%      62%      52%      59%      59%
Spr 09      56%      73%      60%      52%      70%
Spr 10      61%      77%      66%      38%      72%
Sum 09      57%      78%      81%      85%      81%

Comments:
Prerequisite and placement requirements for MA 107 were changed effective Fall 06.


A report from the Office of Planning and Analysis, University of Alabama at Birmingham, 934-2226

Title: Repeated Measures Analysis of CAST Members Performance on the ETS Proficiency Profile
Prepared by: David Corliss, Ph.D., Director, Special Assessment Projects
Prepared for: Core Curriculum Steering Committee; Outcomes Assessment Committee
Copied to: Dr. Glenna Brown
Date: September 2010
Confidential: No

Summary: The CAST program started in the Fall of 2007 with the idea that students who participated would take the ETS Proficiency Profile as freshmen, rising juniors, and seniors. Of the cohort that started in 2007, 30 have taken the test three times. This is the first opportunity to examine longitudinal data and validate previous results that have shown a clear difference between freshmen and senior cohorts who took the test in the same academic year. The results of a repeated measures analysis show statistically significant increases in the Total Score and each of the four subscores. Furthermore, the magnitudes of the differences are larger than have been observed using propensity score matching designed to equate the freshmen and senior cohorts on the basis of HSGPA and ACT Composite scores. Concomitant with these changes in the scaled scores, this CAST cohort also demonstrated a trend to higher levels of proficiency in the skills measured. The students who managed to take the test three times within a short period of time in college are obviously highly motivated. Their average HSGPA and ACT Composite scores are higher than have been observed in previous cohorts. As more students complete the testing cycle, it is expected that these two indicators will decrease and become more representative of the entire cohort.
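The propensity score matching mentioned above is not detailed in this report. For readers unfamiliar with the idea, the sketch below shows one generic way such matching could be done: fit a logistic model predicting cohort membership from HSGPA and ACT Composite, then pair each senior with the freshman whose propensity score is closest. This is an illustration with made-up data, not the Office of Planning and Analysis procedure.

    # Generic propensity-score-matching sketch (illustration only, not the OPA method):
    # equate freshman and senior cohorts on HSGPA and ACT Composite before comparing scores.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    def match_cohorts(covariates, is_senior):
        """covariates: (n, 2) array of [HSGPA, ACT Composite]; is_senior: 0/1 labels.
        Returns (senior_index, matched_freshman_index) pairs."""
        model = LogisticRegression().fit(covariates, is_senior)
        propensity = model.predict_proba(covariates)[:, 1].reshape(-1, 1)

        seniors = np.where(is_senior == 1)[0]
        freshmen = np.where(is_senior == 0)[0]
        nn = NearestNeighbors(n_neighbors=1).fit(propensity[freshmen])
        _, nearest = nn.kneighbors(propensity[seniors])
        return list(zip(seniors, freshmen[nearest.ravel()]))

    # Made-up covariates: columns are HSGPA and ACT Composite.
    X = np.array([[3.2, 22], [3.8, 27], [3.5, 24], [3.9, 29], [3.4, 23], [3.7, 26]])
    y = np.array([0, 0, 0, 1, 1, 1])   # 0 = freshman, 1 = senior
    print(match_cohorts(X, y))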

    


Methods

Data are now available for 30 students who were members of the Fall 2007 CAST cohort. The scaled scores were analyzed using a simple repeated measures ANOVA. The proficiency levels were analyzed by looking at changes in the proficiency levels over the three tests.
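The report does not state which software was used. As one hedged illustration, a one-within-factor repeated measures ANOVA of this design (30 students, three test occasions) can be run in Python with statsmodels as sketched below; the scores are simulated around the cohort means reported in Table 2, not the actual CAST data.

    # Illustrative repeated measures ANOVA: one within-subject factor (test occasion).
    # Scores are simulated around the reported cohort means; not the actual CAST data.
    import numpy as np
    import pandas as pd
    from statsmodels.stats.anova import AnovaRM

    rng = np.random.default_rng(0)
    rows = []
    for student in range(1, 31):                       # 30 CAST students
        ability = rng.normal(0, 8)                     # stable between-student differences
        for occasion, mean in [("FR", 452), ("JR", 459), ("SR", 465)]:
            rows.append({"student": student,
                         "occasion": occasion,
                         "total": mean + ability + rng.normal(0, 5)})

    data = pd.DataFrame(rows)
    result = AnovaRM(data, depvar="total", subject="student", within=["occasion"]).fit()
    print(result)   # F test on (2, 58) degrees of freedom for 30 students x 3 occasions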

Results

Figures 1-5 show the mean scaled Total, Reading, Critical Thinking, Writing, and Mathematics scores. The scale for the Total score ranges from 400 to 500 while the scale of each of the subscores ranges from 100 to 130. Table 1 shows that all the effects are significant. Freshmen are significantly different from seniors in all cases.

Table 1. Repeated measures ANOVA results.

Scaled Score         F(2, 58)    p
Total                15.4        <0.001
Reading              4.3         0.0195
Critical Thinking    11.8        <0.001
Writing              3.3         0.044
Mathematics          16.0        <0.001

It is possible to get a sense of the magnitude of the scaled score differences by examining the percentile equivalents. Table 2 shows the mean scaled scores, the freshman-equivalent percentiles, the senior-equivalent percentiles, and the difference between the freshman and senior percentiles. These percentile scores come from the comparative data provided by ETS for large samples of students. In this case the equivalent percentiles are taken from the tables for freshmen and seniors at institutions that ETS still refers to by the old Carnegie Classifications as Research I and II.¹

Table 2. Mean scaled scores and their class-equivalent percentile scores. See text for interpretation of these data.

Scaled Score        Class   Mean Scaled Score   Freshman Equivalent Percentile   Senior Equivalent Percentile   SR-FR
Total               FR      452                 65                               57
                    JR      459                 74                               68
                    SR      465                 81                               76                             11
Reading             FR      121                 59                               48
                    JR      122                 66                               55
                    SR      123                 69                               59                              0
Critical Thinking   FR      114                 67                               59
                    JR      116                 75                               70
                    SR      119                 81                               76                              9
Writing             FR      117                 62                               58
                    JR      118                 68                               61
                    SR      119                 81                               79                             17
Mathematics         FR      115                 52                               48
                    JR      118                 69                               65
                    SR      119                 75                               73                             21

¹ Previous reports on the Proficiency Profile have discussed the cautions that must be exercised when using these comparative data.


To understand what this table means, consider the Total score. The average of these 30 students as freshmen was 452, which corresponds to the 65th percentile of the freshman distribution. The average of these 30 students as seniors was 465. A freshman scoring at that level would fall in the 81st percentile of freshmen; as seniors, however, these students fall in the 76th percentile of the senior distribution.

The differences between the freshman and senior percentiles, each taken from its respective distribution, are shown in the last column. All the gains are positive except for Reading, which does not change at all. Writing and Mathematics show the largest gains. Given the emphasis on these two areas since this class entered in 2007, these are encouraging results when considered from a value-added perspective.

Figure 1. Mean Total scaled score. Error bars are the 95% confidence limits.

 Figure 2. Mean Reading scaled score. Error bars are the 95% confidence limits. 


Figure 3. Mean Critical Thinking scaled score. Error bars are the 95% confidence limits.

Figure 4. Mean Writing scaled score. Error bars are the 95% confidence limits.

Figure 5. Mean Mathematics scaled score. Error bars are the 95% confidence limits.

Since this is the first time that it has been possible to analyze the Proficiency Profile using longitudinal data, it is of interest to compare these results with those of previous analyses. Table 3 shows the results from the July 2009 report. Part A shows the numbers when no 


matching is done. Part B shows what happens when freshmen are matched to seniors using the propensity score matching technique. Part C shows the results for the 30 CAST members. The main conclusion is that the repeated measures analysis of the CAST cohort confirms that seniors are, indeed, scoring higher than freshmen. 
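The propensity score matching referenced in Part B can be illustrated with the following minimal sketch; it is not the analysts' actual procedure, and the file and column names (is_senior, hsgpa, act_comp) are hypothetical.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical file with one row per student and columns:
# is_senior (1 = senior, 0 = freshman), hsgpa, act_comp, plus the MAPP scores.
df = pd.read_csv("proficiency_profile_cohorts.csv")

# Estimate each student's propensity of being a senior from HSGPA and ACT.
model = LogisticRegression().fit(df[["hsgpa", "act_comp"]], df["is_senior"])
df["pscore"] = model.predict_proba(df[["hsgpa", "act_comp"]])[:, 1]

seniors = df[df["is_senior"] == 1]
freshmen = df[df["is_senior"] == 0]

# Greedy 1:1 nearest-neighbor matching without replacement.
matched_ids = []
available = freshmen.copy()
for _, senior in seniors.iterrows():
    nearest = (available["pscore"] - senior["pscore"]).abs().idxmin()
    matched_ids.append(nearest)
    available = available.drop(nearest)

matched_freshmen = freshmen.loc[matched_ids]
# The matched freshmen can then be compared with the seniors, as in Part B.
```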

Table 3. Parts A and B of this table were taken from the July 2009 report on the Proficiency Profile. Part C shows the CAST Repeated Measures Results. 

A. UNMATCHED
                         Mean - Seniors   Mean - Freshmen   Difference   t-value   p        N - Seniors   N - Freshmen
HS GPA                   3.52             3.66              -0.14        -1.40     0.163    41            201
ACT Composite            23.7             24.6              -0.9         -1.41     0.158    41            201
MAPP Total Score         453.0            445.9              7.1          2.32     0.021*   41            201
MAPP Reading             119.9            118.2              1.7          1.58     0.116    41            201
MAPP Writing             116.2            115.9              0.3          0.30     0.762    41            201
MAPP Critical Thinking   114.1            112.3              1.8          1.80     0.073    41            201
MAPP Mathematics         116.7            113.6              3.1          3.08     0.002*   41            201

B. MATCHED
                         Mean - Seniors   Mean - Freshmen   Difference   t-value   p        N - Seniors   N - Freshmen
HS GPA                   3.52             3.56              -0.04        -0.303    0.763    41            41
ACT Composite            23.7             23.4               0.3          0.306    0.760    41            41
MAPP Total Score         453.0            443.5              9.5          2.22     0.029*   41            41
MAPP Reading             119.9            117.7              2.2          1.47     0.145    41            41
MAPP Writing             116.2            115.2              1.0          1.12     0.264    41            41
MAPP Critical Thinking   114.1            112.3              1.8          1.24     0.219    41            41
MAPP Mathematics         116.7            112.2              4.5          3.20     0.002*   41            41

C. CAST REPEATED MEASURES
                         Mean - Seniors   Mean - Freshmen   Difference   Dependent t-value   p          N
HS GPA                   3.79             3.79              --                                           30
ACT Composite            26.2             26.2              --                                           30
MAPP Total Score         465.3            451.8             13.5         6.4                  <0.001*    30
MAPP Reading             123.4            120.5              2.9         3.3                   0.003*    30
MAPP Writing             119.0            117.2              1.8         3.4                   0.002*    30
MAPP Critical Thinking   118.7            114.2              4.5         4.5                  <0.001*    30
MAPP Mathematics         119.1            114.7              4.4         7.5                  <0.001*    30

The first thing to note is that the HSGPA and ACT Composite scores for the CAST members are higher than those observed in the independent freshman and senior cohorts. This likely reflects the fact that these students attained senior status in three years, which suggests that they are highly motivated and clearly in the upper part of the academic ability distribution.

It is probably for this reason that the differences in the scaled scores in Part C exceed those in Part B. Notably, even though these students are above average, the "value-added" does not appear to be compressed by any ceiling effect. What will be interesting to observe is how these differences change as more of the 2007 CAST cohort take the test as seniors. One would expect the mean HSGPA and ACT scores to decrease; the question is how this will affect the scaled score differences.


All the scaled scores reported above are norm-referenced scores, meaning that they measure how students do relative to each other. ETS also reports criterion-referenced performance in terms of three proficiency levels. A student is considered Not Proficient (N) if he or she does not answer correctly a predetermined set of items that demonstrate mastery of a set of skills. If mastery is demonstrated, the student is deemed Proficient (P). A designation of Marginal (M) indicates that there is insufficient evidence on which to classify a student as Not Proficient or Proficient.

The skills that are assessed include two levels of Reading, one of Critical Thinking, three of Writing, and three of Math. The Critical Thinking level takes the place of the third level of Reading. The columns in Table 4 show the nine skill levels. 

To assess the changes in students’ skill levels over time, the levels achieved over the three tests were concatenated into a string such as MNN. This string means that the student was Marginal the first time he or she took the test and Not Proficient the next two times. All the resulting strings are shown in the first column in Table 4. 

Table 4. Changes in students’ Proficiency Levels over time in the nine skill levels. See text for details on interpretation. 

Columns: Proficiency Sequence; Proficiency Score; Value Added Score; per-skill counts for Reading 1, Reading 2, Critical Thinking, Writing 1, Writing 2, Writing 3, Math 1, Math 2, Math 3 (cells with no instances are omitted); Number of Instances.

MNN | 4 | -1 | 1 2 | 3
MMN | 5 | -1 | 1 1 | 2
PNM | 6 | -1 | 1 | 1
PMM | 7 | -1 | 1 | 1
PPM | 8 | -1 | 2 1 1 | 4
Subtotal | | | 2 3 1 0 1 3 0 1 0 | 11
NNN | 3 | 0 | 1 10 5 2 10 | 28
NMN | 4 | 0 | 2 2 1 4 | 9
MNM | 5 | 0 | 1 2 1 | 4
NPN | 5 | 0 | 1 1 | 2
MMM | 6 | 0 | 1 1 7 7 1 1 1 | 19
MPM | 7 | 0 | 2 1 1 2 1 1 | 8
PNP | 7 | 0 | 1 1 1 1 1 | 5
PMP | 8 | 0 | 1 2 1 2 2 2 | 10
PPP | 9 | 0 | 21 9 2 23 7 2 16 10 3 | 93
Subtotal | | | 24 15 16 26 19 20 20 17 21 | 178
NNM | 4 | 1 | 2 2 3 1 3 2 | 13
NMM | 5 | 1 | 1 2 1 2 1 2 | 9
MNP | 6 | 1 | 1 1 1 1 2 1 | 7
NPM | 6 | 1 | 3 | 3
MMP | 7 | 1 | 1 2 2 1 2 2 2 | 12
MPP | 8 | 1 | 3 4 2 2 3 1 3 2 3 | 23
NNP | 5 | 2 | 1 1 | 2
NMP | 6 | 2 | 2 2 1 1 2 | 8
NPP | 7 | 2 | 1 1 2 | 4
Subtotal | | | 4 12 13 4 10 7 10 12 9 | 81

To bring some order to how students changed over time, two numerical scores were created. The first, the Proficiency Score, is simply the sum of arbitrarily assigned points where 1 = N, 2 = M, and 3 = P; thus, MNN = 4.


These are shown in the second column of Table 4. The other score is the Value Added Score: the proficiency level on the senior test minus that on the freshman test. Thus, the Value Added Score for MNN = 1 - 2 = -1. These values are shown in the third column of Table 4. The table is sorted on the Proficiency Score within the Value Added Score; Table 5 shows the same data sorted by Value Added Score within Proficiency Score.
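A minimal sketch of the two scores, assuming each student's levels for a skill are stored as a three-character string (freshman, junior, senior test, in that order):

```python
# Proficiency levels for one skill, concatenated across the three tests,
# e.g. "MNN" = Marginal as a freshman, Not Proficient as a junior and senior.
LEVEL_POINTS = {"N": 1, "M": 2, "P": 3}

def proficiency_score(sequence: str) -> int:
    """Sum of the assigned points (1 = N, 2 = M, 3 = P) across the three tests."""
    return sum(LEVEL_POINTS[level] for level in sequence)

def value_added_score(sequence: str) -> int:
    """Senior-test level minus freshman-test level."""
    return LEVEL_POINTS[sequence[-1]] - LEVEL_POINTS[sequence[0]]

assert proficiency_score("MNN") == 4
assert value_added_score("MNN") == -1
assert proficiency_score("PPP") == 9 and value_added_score("PPP") == 0
```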

Table 5. Data in Table 4 sorted by Value Added Score within Proficiency Score. 

Columns: Proficiency Sequence; Proficiency Score; Value Added Score; per-skill counts for Reading 1, Reading 2, Critical Thinking, Writing 1, Writing 2, Writing 3, Math 1, Math 2, Math 3 (cells with no instances are omitted); Number of Instances.

NNN | 3 | 0 | 1 10 5 2 10 | 28
MNN | 4 | -1 | 1 2 | 3
NMN | 4 | 0 | 2 2 1 4 | 9
NNM | 4 | 1 | 2 2 3 1 3 2 | 13
MMN | 5 | -1 | 1 1 | 2
MNM | 5 | 0 | 1 2 1 | 4
NPN | 5 | 0 | 1 1 | 2
NMM | 5 | 1 | 1 2 1 2 1 2 | 9
NNP | 5 | 2 | 1 1 | 2
PNM | 6 | -1 | 1 | 1
MMM | 6 | 0 | 1 1 7 7 1 1 1 | 19
MNP | 6 | 1 | 1 1 1 1 2 1 | 7
NPM | 6 | 1 | 3 | 3
NMP | 6 | 2 | 2 2 1 1 2 | 8
PMM | 7 | -1 | 1 | 1
MPM | 7 | 0 | 2 1 1 2 1 1 | 8
PNP | 7 | 0 | 1 1 1 1 1 | 5
MMP | 7 | 1 | 1 2 2 1 2 2 2 | 12
NPP | 7 | 2 | 1 1 2 | 4
PPM | 8 | -1 | 2 1 1 | 4
PMP | 8 | 0 | 1 2 1 2 2 2 | 10
MPP | 8 | 1 | 3 4 2 2 3 1 3 2 3 | 23
PPP | 9 | 0 | 21 9 2 23 7 2 16 10 3 | 93

Some interesting observations from Table 4 include the following: 

• There were only 11 instances where students were less proficient in their senior year than in their freshman year. 

• There were 178 instances where there was no change from freshmen to senior year, 93 of which were Proficient for all three tests and could not have increased. 

o Sixty of those 93 instances were at skill level 1. 

o Twenty‐six of those 93 instances were at Math 1 and Math 2 combined. 

• There were 28 instances of NNN, 20 of which are accounted for by Critical Thinking and Math 3. 

o Only four students showed the NNN pattern in both Critical Thinking and Math 3. 


• There were 81 instances where there was an improvement from the freshman to the senior test, 14 of which were jumps from Not Proficient to Proficient. 

o Thirty-five of the 81 improved from Marginal to Proficient. 

Table 5 shows the same data from the perspective of the Proficiency Score. The mean Proficiency Score, weighted by the Number of Instances, is 6.8, and the median is 7, which falls at the MMP level given the way the table is sorted. The weighted mean of the Value Added Score is 0.31. Both of these indicate an overall trend toward higher levels of proficiency. It should be noted, however, that within each of the three skills the number of students with high Proficiency Scores at level 3 is small.
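As a cross-check, the weighted Value Added figure can be recomputed directly from the Number of Instances column of Table 4; the short sketch below is illustrative only and is not part of the original report.

```python
# Per-sequence Number of Instances, taken from the last column of Table 4.
LEVEL_POINTS = {"N": 1, "M": 2, "P": 3}
COUNTS = {
    "MNN": 3, "MMN": 2, "PNM": 1, "PMM": 1, "PPM": 4,
    "NNN": 28, "NMN": 9, "MNM": 4, "NPN": 2, "MMM": 19, "MPM": 8,
    "PNP": 5, "PMP": 10, "PPP": 93,
    "NNM": 13, "NMM": 9, "MNP": 7, "NPM": 3, "MMP": 12, "MPP": 23,
    "NNP": 2, "NMP": 8, "NPP": 4,
}

n = sum(COUNTS.values())  # 270 instances = 30 students x 9 skill levels
value_added = {s: LEVEL_POINTS[s[-1]] - LEVEL_POINTS[s[0]] for s in COUNTS}

mean_value_added = sum(value_added[s] * c for s, c in COUNTS.items()) / n
print(round(mean_value_added, 2))  # 0.31, matching the weighted mean reported
```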

The analysis of the criterion-referenced proficiency classifications is less clear-cut but nevertheless supports the conclusion that, overall, students are becoming more proficient in the skills measured by the test. The disappointing result is that there does not appear to be a significant shift of a large number of students into level 3 of the three skill categories. From an assessment perspective, these results are the most useful ones because there are specific learning outcomes related to these skills that can be addressed with curriculum changes.

Discussion
The analysis of longitudinal data presented here validates the previous findings that seniors score better than freshmen and that there is some "value added" by the curriculum. As more CAST students complete the three tests, the exact magnitude of the differences will become clearer as the sample becomes more representative of the class.

One of the encouraging things to note about the three-test process is that it is perhaps now possible to begin to see the effects over time of initiatives like the mid-curricular and capstone emphases on writing and quantitative literacy. This is most evident in the math scores shown in Figure 5. The post hoc comparison of the means shows that the significant gains in math are made between the freshman and junior years. What one would like to see, and expect, as the QEP-designated courses and the capstone courses take effect is that the change between the junior and senior years increases. This should be accompanied by an increase in the number of students designated as Proficient at Math Level 3 as well. There are now three CAST cohorts in the pipeline, with each subsequent cohort being exposed to more curricular changes that emphasize writing and quantitative literacy. If the data change as one would predict, this will truly be a groundbreaking initiative.


Approved Writing Courses

http://main.uab.edu/Sites/DOE/59260/[3/23/2011 11:14:49 AM]


College of Arts and Sciences (Excluding the School of Education) AAS 290 Writing in African American Studies ANTH 481 Voyage in Anthropology ARH 204 Renaissance to Modern Art ARH 480 Art Criticism and Theory* BY 123 Introductory Biology I * BY 124 Introductory Biology II BY 409 Principles of Human Physiology BY 442 Experimental Phycology CM 494 Communication Research Methods CS 201 Introduction to Object-Oriented Programming CS 302 Object-Oriented Design CS 499 Senior Capstone EH 203 Writing in Birmingham* EH 216 Introduction to Literature* EH 217 World Literature I* EH 218 World Literature II* EH 221 British and Irish Literature I* EH 222 British and Irish Literature II* EH 223 American Literature I* EH 224 American Literature II EH 301 Reading, Writing, and Research for the English Major EH 303 Advanced Composition EH 304 Editing in Professional Contexts EH 401 Tutoring Writing EH 402 Writing in Popular Periodicals EH 403 Business Writing EH 404 Technical Writing EH 433 Academic Writing EH 456 Visual Rhetoric EH 457 Writing and Medicine EH 459 Discourse Analysis


The following courses have been approved by the Writing Committee to be designated as Writing courses:

* Satisfies Core Requirements


* ENV 109 Laboratory in Environmental Science * FR 220 Intermediate French Composition HY 300 Historian's Craft ITS 471 Political Power and Propaganda in Film JS 300 Methods of Social Research JS 410 Criminal Justice Ethics JS 497 Internship and Capstone for Criminal Justice Practitioners JS 498 Distance Internship and Capstone in Criminal Justice JS 499 Criminal Justice Internship and Capstone* MA 126 Calculus II* MA 252 Introduction to Differential Equations MA 361 Mathematical Modeling MA 461 Modeling with PDE MA 486 Mathematical Statistics MC 494 Communication Research Methods MU 472 Music History and Literature 1750-present PH 351 Modern Physics I PH 351L Modern Physics I Laboratory PH 352 Modern Physics II PH 352L Modern Physics II Laboratory PHL 330 Libertarianism as a Political Philosophy PHL 341 History of Philosophy: Descartes to Hume PHL 375 Philosophy of Mind PHL 405 Epistemology * PHS 101 Physical Science PSC 104 Introduction to Political Theory PSC 330 The American Judicial Process PSC 411 Introduction to Research Methods PSC 471 Political Power and Propaganda in Film PY 315 Psychological Research Methods PY 490 Psychology Capstone SPA 300 Advanced Grammar and Composition SOC 400 Research Methods SOC 407 Development of Sociological Theory SW 200 Professional Writing for Human Service Professionals * THR 100 Introduction to Theatre THR 215 Playwriting THR 482 Theatre History: 1700 to Realism THR 483 Theatre History: Realism and Non-Realism
School of Business
BUS 350 Business Communication
School of Education
EDU 200 Education as a Profession EDU 210 Writing and Speaking Skills for Education Professionals HE 432 Administration of Health and Fitness Programs
School of Engineering
BME 423 Living Systems Analysis BME 498 Senior Design I BME 499 Senior Design II CE 221 Mechanics of Solids Laboratory MSE 310 Materials Engineering Laboratory II MSE 413 Composite Materials MSE 498 Senior Design I MSE 499 Senior Design II
School of Health Professions
AHS 460 Research Methods HIM 410 Interpretation of Clinical Information MT 460 Clinical Correlations


NMT 400 Introduction to Clinical Nuclear Medicine Technology RST 325 Directed Study in Respiratory Care I RST 432 Directed Study in Respiratory Care III
School of Nursing
NUR 474Q Role Transition for Professional Nursing Practice NUR 445 Nursing of the Child and Adolescent


Approved QL Courses

http://main.uab.edu/Sites/DOE/QL/59277/[3/23/2011 11:32:53 AM]


College of Arts and Sciences (Excluding the School of Education) AAS 320 African Identity and Personality ANTH 285 Mapping Our World ANTH 481 Voyage in Anthropology ARS 101 Introduction to Two-Dimensional Design* AST 101 Astronomy of the Universe* AST 102 Astronomy of Stellar Systems* AST 111 Astronomy of the Universe Laboratory * AST 112 Astronomy of Stellar Systems Laboratory * BY 102 Topics in Contemporary Biology Lab* BY 123 Introductory Biology I * BY 124 Introductory Biology II BY 245 Fundamentals of Scientific Investigation BY 409 Principles of Human Physiology BY 442 Experimental Phycology BY 467 Population Ecology* CH 105 Introductory Chemistry I* CH 106 Introductory Chemistry I Laboratory* CH 107 Introductory Chemistry II * CH 108 Introductory Chemistry II Laboratory CH 114 General Chemistry I Laboratory (Honors) * CH 115 General Chemistry I * CH 116 General Chemistry I Laboratory* CH 117 General Chemistry II* CH 118 General Chemistry II Laboratory CH 119 General Chemistry II Laboratory (Honors) CM 494 Communication Research Methods EH 404 Technical Writing * FR 101 Introductory French I HY 285 Mapping Our World JS 120 Descriptive Statistics * MA 105 Pre-Calculus Algebra


The following courses have been approved by the Quantitative Literacy Committee to be designated as QL courses:

* Satisfies Core Requirements


* MA 106 Pre-Calculus Trigonometry* MA 107 Pre-Calculus Algebra/Trigonometry * MA 109 Survey of Calculus* MA 110 Finite Mathematics* MA 125 Calculus I* MA 126 Calculus II MA 180 Introduction to Statistics * MA 252 Introduction to Differential Equations MA 361 Mathematical Modeling MA 418 Statistics for Teachers MA 440 Advanced Calculus I MA 441 Advanced Calculus MA 461 Modeling with PDE MA 486 Mathematical Statistics MC 494 Communication Research Methods MU 115 Computer Music I * PH 201 College Physics I* PH 221 General Physics I PH 351 Modern Physics I PH 351L Modern Physics I Laboratory PH 352 Modern Physics II PH 352L Modern Physics II Laboratory* PHL 120 Practical Reasoning PHL 220 Introduction to Symbolic Logic PHL 321 Cooperation and Competition PHL 490 Neuroeconomics* PHS 101 Physical Science* PSC 103 Introduction to International Relations PSC 403 International Relations Seminar PSC 411 Introduction to Research Methods PSC 412 Introduction to Statistical Analysis PSC 461 International Political Economy PY 214 Elementary Statistical Methods and Design PY 217 Laboratory for Elementary Statistics PY 253 Brain, Mind, and Behavior PY 315 Psychological Research Methods PY 490 Psychology Capstone* SPA 101 Introductory Spanish I SOC 400 Research Methods SOC 410 Introduction to Social Statistics SW 320 Introduction to Research Methods SW 321 Statistics for Social Work Research
School of Business
AC 200 Accounting Principles I EC 330 Cooperation and Competition EC 490 Neuroeconomics FN 310 Fundamentals of Financial Management QM 214 Quantitative Analysis I
School of Education
EPR 410 Measurement and Evaluation in Early Childhood/Elementary Education EPR 411 Introduction to Measurement and Evaluation in Education HE 431 Planning & Evaluating Effective Health Education & Promotion Programs HE 432 Administration of Health & Fitness Programs PE 305 Motor Development PE 400 Physiology of Exercise
School of Engineering
BME 498 Senior Design I CE 210 Statics CE 344 Engineering Analysis


EE 316 Electrical Networks
School of Health Professions
AHS 360 Statistics for Healthcare Managers AHS 416 Financial Management in Healthcare Organizations HIM 425 Introduction to Epidemiology and Applied Statistics in Health Care Organizations MT 455 Research Principles NMT 441 Radiation Protection and Biology RST 415 Directed Study in Respiratory Care II
School of Nursing
NUR 374Q Informatics and Research for Nursing Practice NUR 381Q Informatics and Research for Nursing Practice for RN's


Approved ECR Courses

http://main.uab.edu/Sites/DOE/ECR/59275/[3/23/2011 11:33:33 AM]


College of Arts and Sciences (Excluding the School of Education)* AAS 200 Introduction to African American Studies AAS 290 Writing in African American Studies* ANTH 101 Introductory Cultural Anthropology ANTH 481 Voyage in Anthropology ARH 468 Race and Representation ARH 471 Topics in Asian Cinema ARH 478 Buddhist Arts of Asia ARS 450 Advanced Graphic Design CM 494 Communication Research Methods CS 499 Senior Capstone EH 203 Writing in Birmingham EH 301 Reading, Writing, and Research for the English Major EH 365 African-American Literature I: 1746-1954 EH 366 African-American Literature II: 1954-present EH 403 Business Writing EH 457 Writing and Medicine * ENV 108 Human Population and the Earth's Environment* ENV 109 Laboratory in Environmental Science * FLL 120 Foreign Culture* FLL 220 Foreign Literatures in English Translation* HY 101 Western Civilization to 1648* HY 102 Western Civilization since 1648 * HY 105 World History 1600 to Present * HY 120 The United States to 1877* HY 121 The United States since 1877 HY 258 Britain and the Third World ITS 471 Political Power and Propaganda in Film JS 410 Criminal Justice Ethics JS 497 Internship and Capstone for Criminal Justice Practitioners JS 498 Distance Internship and Capstone in Criminal Justice JS 499 Criminal Justice Internship and Capstone


The following courses have been approved by the Ethics and Civic Responsibility Committee to be designated as ECR courses:

* Satisfies Core Requirements


MC 101 Survey of Mass Communication MC 494 Communication Research Methods PH 351 Modern Physics I PH 351L Modern Physics I Laboratory PH 352 Modern Physics II PH 352L Modern Physics II Laboratory* PHL 115 Contemporary Moral Issues PHL 116 Bioethics PHL 135 The Rule of Law PHL 230 Social and Political Philosophy PHL 335 Philosophy of Law* PSC 101 Introduction to American Government * PSC 103 Introduction to International Relations PSC 104 Introduction to Political Theory PSC 320 Political Participation PSC 340 American Political Thought PSC 471 Political Power and Propaganda in Film* PY 212 Developmental Psychology PY 218 Abnormal Psychology PY 315 Psychological Research Methods PY 397 Community-Based Practicum in Psychology PY 490 Psychology Capstone* SOC 100 Introduction to Sociology * SOC 200 Social Change* SOC 245 Contemporary Social Problems SOC 400 Research Methods SW 222 Social Work Values Lab SW 322 Practice of Social Work I THR 482 Theatre History: 1700 to Realism
School of Business
LS 246 Legal Environment of Business MG 302 Management Processes and Behavior
School of Education
ECY 300 Survey of Special Education EDF 362 Foundations of Education I: Social, Historical & Philosophical HE 343 Theories and Determinants of Health Behavior
School of Engineering
BME 420 Implant-Tissue Interaction BME 498 Senior Design I CE 236 Environmental Engineering CE 497 Construction Engineering Management EE 485 Engineering Operations MSE 401 Materials Processing
School of Health Professions
AHS 415 Ethics for Health Care Professionals HIM 415 Introduction to Health Information Management HIM 460 Coding and Classification Systems MT 405 Laboratory Management NMT 400 Introduction to Clinical Nuclear Medicine Technology RST 311 Principles of Patient Assessment RST 422 Long Term and Preventive Care RST 427 Review of Critical Care Concepts
School of Nursing
NUR 365 Maternal-Newborn and Women's Health Nursing NUR 374Q Informatics and Research for Nursing Practice NUR 395 Community and Public Health Nursing NUR 397Q Community and Public Health Nursing for RN's


NUR 474Q Role Transition for Professional Nursing Practice


Learn More. http://main.uab.edu/Sites/DOE/

Get Involved. Contact.

Dr. Marilyn Kurata, Director 320B Administration Building

701 20th Street South Birmingham, AL 35294 Phone: (205) 996-6420

Fax: (205) 996-7399 E-mail: [email protected]

CORE CURRICULUM STEERING COMMITTEE Marilyn Kurata, Chair Peter Bellis * Theodore Benditt Serge Bokobza* Joe Burns* Alison Chapman * Stella Cocoris * Edwin Cook * Robert Corley David Corliss Colin Davis * Dana Hettich Harold Kincaid Chris Kyle Andrew Marsch * John Mayer * Bradley Newcomer Doug Rigney Philip Way * #

Who's Who in Core Curriculum Enhancement at UAB

WRITING COMMITTEE Alison Chapman, Chair Tracey Baker * David Basilico Peter Bellis * Theodore Benditt Scott Brande Anne Cusic Karen Dahle * Fouad Fouad Nichole Griffith * Kyle Grimes Sarah Helms Maria Hopkins * Minabere Ibelema * Peggy Jolly * Andrew Keitt Karen Kennedy Judith King Maxie Kohler Randy Kornegay * Marilyn Kurata * # James Martin Kathleen Martin Bruce McComiskey * Tennant McWilliams Stephen Miller * Mubenga Nkashama * Douglas Oliver * Tonya Perry Midge Ray * Linda Reed Anthony Roberson Cynthia Ryan Rosalia Scripa * Lisa Sharlach Anthony Skjellum * Deborah Tanju Rita Treutel * Jacqueline Wood

FYE COORDINATING COMMITTEE Marilyn Kurata, Chair Pamela Autrey Scott Brande Kathleen Brown Shanna Campbell Kristin J. Chapleau Catherine Danielou* Colin Davis Joy Deupree Zoe Dwyer * Matt Fifolt * Michael Froning Harry Hamilton Linda Harris * Kevin Jerrolds * Michael LeBeau Danez Marrable* Juanita McMath Suzanne Scott-Trammell * Sandra Sims* Donna Slovensky * Jessica Smith* Angela Stowe Laura Talbott-Forbes Peter Tofani * Nancy Walburn * William York

UAB DISCUSSION BOOK COMMITTEE Marilyn Kurata, Chair Thomas Alexander Carolyn Braswell * Denise Bruns * Kristin J. Chapleau * David Chaplin * Janelle Chiasera * William Cockerham Robert Corley Catherine Danielou * Allan Dobbins Michael Froning Ted Gemberling * Wesley Granger * Jeff Graveline * Pat Greenup * Linda Gunter * Harry Hamilton * Patricia Higginbottom William Hutchings Daniel Jackson * Josephine Jackson-Banks J. Michael Kilby Sheri Spaine Long Heather Martin Warren Martin James McClintock Max Michael Bradley Newcomer Rosie O'Beirne * Kristin Olson * Groesbeck Parham Richard Sims * Greer Stanton * Laura Talbott-Forbes Rita Treutel * Diane Tucker * Rodney Tucker Dale Turnbough Janice Vincent Nate Wade * Patty Wang

QUANTITATIVE LITERACY COMMITTEE Edwin Cook, Chair Gypsy Abbott Jonathan Amsbary Scott Arnold * Theodore Benditt Norman Bolus * Theodore Bos Holly Brasher * Renato Corbetta * David Corliss Youngshook Han Marilyn Kurata * # Melinda Lalor * John Mayer * Teena McGuinness Stephanie Rauterkus * Don Ross Lisa Sharlach Melanie Shores Scott Snyder * Kui Zhang *

ETHICS & CIVIC RESPONSIBILITY COMMITTEE Colin Davis, Chair Thomas Alexander * Audra Buck Ellen Buckner Robert Corley * Sarah Culver Wendy Gunther-Canada * Norma-May Isakow * Robert Jefferson * Susan Key * Harold Kincaid * Marilyn Kurata * # Mark LaGory * Melinda Lalor Lyn Lewis Craig McClure * David Morrow * Bradley Newcomer Jennan Phillips * Deborah Voltz * Charles Watkins


* Current members # Ex Officio

Newsletter Editor: Marilyn Kurata Contributor: Chris Reaves


CSAR, University of Alabama at Birmingham
Analysis of Factors: Five-Year Longitudinal Comparison

Factor 15: Overall Course Effectiveness

Columns for each factor: 2011 N, Mean, SD; 2010 N, Mean, SD; Difference (2011 minus 2010) with its significance level; the three earlier years' means, each followed in parentheses by its difference from 2011 and an arrow where that difference exceeds ±0.1; NA.

Factor 14: Course Included Engaging Pedagogy | 1,105  4.87  1.44 | 1,130  4.66  1.52 | 0.21 ••• | 4.49 (0.36 ↑)  5.06 (-0.19 ↓)  4.74 (0.13 ↑) | NA
Factor 9: Course Improved Managing Time and Priorities | 1,106  4.98  1.47 | 1,131  4.71  1.60 | 0.27 ••• | 4.57 (0.41 ↑)  4.50 (0.48 ↑)  3.94 (1.04 ↑) | NA
Factor 7: Course Improved Knowledge of Campus Policies | 1,106  5.26  1.54 | 1,128  4.87  1.63 | 0.39 ••• | 4.60 (0.66 ↑)  4.44 (0.82 ↑)  4.55 (0.71 ↑) | NA
Factor 5: Course Improved Connections with Peers | 1,105  4.90  1.67 | 1,129  4.63  1.75 | 0.27 ••• | 4.61 (0.29 ↑)  5.47 (-0.57 ↓)  5.52 (-0.62 ↓) | NA
Factor 6: Course Increased Out-of-Class Engagement | 1,094  4.05  1.86 | 1,115  3.76  1.85 | 0.29 ••• | 3.67 (0.38 ↑)  4.23 (-0.18 ↓)  3.11 (0.94 ↑) | NA
(unlabeled row) | 1,107  4.64  1.39 | 1,134  4.40  1.42 | 0.24 ••• | 4.23 (0.41 ↑)  4.34 (0.30 ↑)  4.37 (0.27 ↑) | NA
Factor 2: Course Improved Academic and Cognitive Skills | 1,102  4.25  1.62 | 1,124  4.13  1.60 | 0.12 ND | 3.90 (0.35 ↑)  4.37 (-0.12 ↓)  4.17 (0.08) | NA
Factor 3: Course Improved Critical Thinking | 1,100  4.75  1.66 | 1,122  4.62  1.72 | 0.13 ND | 4.46 (0.29 ↑)  4.79 (-0.04)  4.79 (-0.04) | NA
Factor 4: Course Improved Connections with Faculty | 1,104  4.79  1.49 | 1,130  4.58  1.60 | 0.21 ••• | 4.55 (0.24 ↑)  4.64 (0.15 ↑)  4.46 (0.33 ↑) | NA
Factor 6: Course Improved Knowledge of Academic Services | 1,106  5.46  1.40 | 1,126  5.22  1.47 | 0.24 ••• | 4.64 (0.62 ↑)  4.92 (0.54 ↑)  4.62 (0.84 ↑) | NA
Factor 10: Course Improved Knowledge of Wellness | 1,104  4.07  1.77 | 1,126  3.63  1.71 | 0.44 ••• | (0.49 ↑)  3.75 (0.32 ↑)  3.40 (0.67 ↑) | NA
Factor 11: Sense of Belonging and Acceptance | 1,102  5.58  1.37 | 1,125  5.46  1.44 | 0.10 ND | 5.54 (0.04)  5.55 (0.03)  5.25 (0.33 ↑) | NA
Factor 13: Satisfaction with College/University | 1,106  5.74  1.27 | 1,129  5.78  1.25 | -0.04 ND | 5.71 (0.03)  5.64 (0.10 ↑)  5.06 (0.66 ↑) | NA
Factor 1: Course Improved Study Strategies

Stat level (level of statistical significance): ••• denotes p ≤ .001; •• denotes p ≤ .01; • denotes p ≤ .05; ND denotes no statistical difference between means.
Arrow designations (2011 vs. 2010): an up marker indicates 2011 has a statistically higher mean than 2010; a down marker indicates 2011 has a statistically lower mean than 2010.
Difference = difference between means. Arrow designations: ↓ denotes a difference ≤ -0.1; ↑ denotes a difference > 0.1.
NA: Not applicable. The institution did not participate in the study that year or the factor is new.
NOTE: A t-test is performed between 2011 and 2010 to determine if the differences in means are statistically significant. All other comparisons are not statistically tested.

Copyright EBI 2011: May Not Be Reproduced Without Permission. EBI First-Year Initiative Survey, page 19.
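The 2011-versus-2010 comparison noted above can be approximated from the summary statistics alone; the sketch below uses Factor 14's figures from the table and assumes a Welch t-test, since the exact test variant used by EBI is not stated.

```python
from scipy.stats import ttest_ind_from_stats

# Factor 14 (Course Included Engaging Pedagogy), values from the table above:
# 2011: mean 4.87, SD 1.44, N 1,105;  2010: mean 4.66, SD 1.52, N 1,130.
t_stat, p_value = ttest_ind_from_stats(
    mean1=4.87, std1=1.44, nobs1=1105,
    mean2=4.66, std2=1.52, nobs2=1130,
    equal_var=False,  # Welch's test; the variant used by EBI is an assumption
)
print(t_stat, p_value)  # the 0.21 difference is significant at p < .001
```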


History of grant applications and awards

http://main.uab.edu/Sites/DOE/QEP/80805/[3/23/2011 11:30:36 AM]


Calls for Proposals   Number of Applications   Total Amount Requested   Number of Applications Funded   Amount Awarded
Round 1               4                        $41,771.59               3                               $22,657.59
Round 2               6                        $44,974.80               4                               $29,544.80
Round 3               4                        $14,262.00               4                               $14,262.00
Round 4               12                       $127,693.43              10                              $43,368.00
Round 5               5                        $39,947.60               3                               $14,261.00
Round 6               5                        $35,876.00               4                               $24,870.00
Round 7               4                        $33,008.06               4                               $19,203.80
Round 8               6                        $46,992.99               3                               $19,204.00
Round 9               8                        $29,883.00               5                               $21,257.00


List of grant recipients and awards

http://main.uab.edu/Sites/DOE/QEP/80806/[3/23/2011 12:02:55 PM]


Erik Angner, Philosophy/ CAS, enhancing QL in PHL 322 Philosophical Issues in Behavioral Economics / EC 320 Behavioral Economics
David Basilico, English/ CAS, developing new version of EH 496 Capstone Seminar
Holly Brasher, Government/ CAS, enhancing Writing & ECR in PSC 320 Political Participation
Holly Brasher, Government/ CAS, enhancing Writing & QL in PSC 411 Introduction to Research Methods
Loretta Cormier and Sharyn Jones, History & Anthropology/ CAS, enhancing Writing, QL, & ECR in ANTH 481 Voyage in Anthropology: Launching your Professional Future
Retta Evans and Laura Talbott-Forbes, Human Studies/ Education/ CAS, enhancing Writing, QL, and ECR in HE 342 Introduction to Health Education; HE 343 Theories and Determinants of Health Behavior; HE 431 Planning, Implementing, and Evaluating Health Promotion Programs; and HE 432 Administration of Health and Fitness Programs
Robert Fischer, Biology/ CAS, enhancing ECR in BY 409 Human Physiology, BY 429 Evolution, BY 398 & BY 498 Research/Honors Research
Michele Forman, Urban Affairs/ CAS, enhancing ECR in DSC 390 Liberty and the Pursuit of Happiness: Representing American Identity on Film
Michele Forman, History/ CAS, and Michael Sloane, University Honors Program, service-learning partnership project with the Arrington Middle School
Charnetta Gadling-Cole, Sociology & Social Work/ CAS, integrating the BEST program into SW 205 Geriatric Services and Social Work
Elizabeth Gardner, Criminal Justice/ CAS, enhancing QL in JS 250 Criminalistics: An Overview
Kyle Grimes, English/ CAS, enhancing QEP competencies in EH 492 English Now! Capstone
Maria Hopkins, Psychology/ CAS, developing and piloting a blended version of PY 490 Psychology Capstone
Shannon Houser, Health Services Admin./ SHP, enhancing QL in HIM 425 Introduction to Epidemiology and Applied Statistics in Health Care Organizations
Norma-May Isakow, Service Learning/ APUP, enhancing ECR in all Service Learning courses
Andrew Keitt, History/ CAS, enhancing Writing, creating a pilot assessment community for the Department of History and Anthropology
Karen Kennedy, Deans Office/ BUS, and Melinda Lalor, Deans Office/ ENGR, enhancing ECR in undergraduate courses in both schools
Sue Kim, English/ CAS, enhancing Writing & ECR in EH 3XX (Service Learning class)


Sue Kim, English/ CAS, developing a new version of EH 496 Capstone Seminar
Jason Kirby, Civil, Construction, & Environmental Engineering/ Engineering, enhancing Writing & ECR in Labs CE 220, CE 222, CE 236, & CE 499 Senior Design
Mark LaGory, Sociology/ CAS, enhancing ECR through Intergroup Dialogues
Melinda Lalor, Deans Office/ Engineering, enhancing Writing in multiple courses and labs
Craig McClure and Aaron Lucius, Chemistry/ CAS, enhancing Writing & ECR in CH 320 Chemistry in Culture & Ethics
Lance Nail, Finance, Economics & Quantitative Methods/ BUS, enhancing QEP competencies in FN 495 Institutions and Investments Capstone
Andreas and Stephanie Rauterkus, Accounting & Finance/ BUS, enhancing QL in FN 350 Equity Portfolio Mgmt. & FN 351 Bond Portfolio Mgmt.
Robert Robinson, Government/ CAS, enhancing Writing & QL in PSC 330 The American Judicial Process
Robert Robinson, Government/ CAS, enhancing Writing & ECR in PSC 431 American Constitutional Law II
David Schwebel, Psychology/ CAS, enhancing QEP competencies in PY 450 Psychology Capstone
Rose Scripa, Materials Science & Engineering/ ENGR, enhancing Writing in Engineering courses
David Shealy, Physics/ CAS, enhancing QEP competencies in PH 499 Physics Capstone
Melanie Shores, Human Studies/ Education/ CAS, enhancing Writing, QL, and ECR in EPR 410/EPR 510 Introduction to Measurement & Evaluation in Education; EPR 411/EPR 511 Measurement and Evaluation in Secondary Schools
Donna Slovensky, Academic Affairs/ SHP, enhancing ECR in HRP 102 Experiencing the Health Professions
Chris Walker, History & Anthropology/ CAS, enhancing QEP competencies in SW 490 Practicum in Social Work and SW 494 Field Practicum Seminar Capstone
Stephen Yoder, Accounting & Information Systems/ BUS, enhancing ECR in LS 246 Legal Environment of Business
Stephen Yoder, Marketing, Industrial Distribution & Economics/ BUS, enhancing ECR in MBA 612/AC 612 Corporate Governance, MBA 634/BUS 450 Strategic Management, AC 413 Internal Auditing, and JS 440 White Collar & Corporate Crime
Nikolaos Zahariadis, Government/ CAS, enhancing Writing & ECR in PSC 395 Political Power and Propaganda in Film
Nikolaos Zahariadis, Government/ CAS, enhancing QEP competencies in ITS 470/PSC 402 Seminar in International Studies Capstone
Nikolaos Zahariadis, Government/ CAS, enhancing Writing & ECR in PSC 363 Nationalism in World Politics
Nikolaos Zahariadis, Government/ CAS, enhancing Writing and ECR in PSC 395 Special Topics: Food, Religion, and Violence in the Mediterranean
Lamia Zayzafoon, Foreign Languages/ CAS, enhancing Writing, QL, & ECR in ARA 399 Advanced Arabic I


Quality Enhancement Plan -- University of Alabama at Birmingham

EXECUTIVE SUMMARY

Reconceptualizing the undergraduate core curriculum is the focus for UAB’s QEP, which ensures that UAB students will have a solid foundation for academic success, professional achievement, and personal fulfillment. The strategies for implementing this plan include the development of A Shared Vision for a UAB Graduate, selection of initial competencies for enhancement, development of targeted interventions, and recommendations for an infrastructure to support a coherent undergraduate experience for students and a continuous cycle of assessment, intervention, and improvement for academic programs.

The Quality Enhancement Plan (QEP) begins with a Shared Vision for a UAB Graduate, regardless of major, as an individual who uses communication skills effectively, possesses breadth and depth of knowledge, is experienced at problem-solving, and is prepared for responsible citizenship in the community, nation, and world. To achieve this Shared Vision for a UAB Graduate, the QEP identifies three initial targets for enhancement. It prioritizes

1. Writing because writing is fundamental to competent functioning across the undergraduate curriculum and in life beyond graduation.

2. Quantitative literacy because quantitative literacy enables one to solve quantitative problems in coursework and to make wise decisions about public matters that increasingly are couched in technical terms.

3. Ethics and civic responsibility because a university education should develop the ability of individuals to make informed and ethical decisions, to accept responsibility for one’s choices, and to participate as part of multiple larger social units.

The QEP incorporates the following implementation strategies to improve student learning in writing, quantitative literacy, and ethics and civic responsibility:

• The enforcement of an orderly progression of academic coursework through consistent advising, automatic early course registration in freshman composition, and automatic prerequisite checking when students register for courses.

• A restructuring of freshman composition, including the adoption of standardized learning objectives, course guidelines, and grading rubrics for English Composition 101 and English Composition 102. Mastery of basic grammar will become a fundamental course objective for EH 101 and be reinforced in EH 102.

• A restructuring of basic math instruction, integrated with a new Mathematics Learning Laboratory and emphasizing the incorporation of quantitative literacy learning objectives in MA 105 (Pre-Calculus Algebra) and MA 110 (Finite Mathematics), through which a majority of entering freshmen must pass prior to graduation.

• A significant expansion of UAB learning communities for regularly admitted students. For each learning community, the same 25 students would be block registered for a Freshman Seminar; EH 101 or EH 102; a social science, physical science, or math course


from Area III or Area IV of the required Alabama General Studies Curriculum, and an optional fourth course. Together, these courses will introduce and reinforce the learning objectives in writing, quantitative literacy, and ethics and civic responsibility.

• Heightened program accountability whereby each department must define those discipline-specific aspects of writing, quantitative literacy, and ethics and civic responsibility which are relevant to its majors and identify the courses in which these aspects are introduced, taught, or reinforced.

• The development of an online Writing Web to facilitate a more coherent approach to teaching and evaluating discipline-specific types of undergraduate writing by providing (1) a detailed description of the most common elements of each writing genre; (2) a set of sample papers; and (3) a generic evaluation rubric that can be adapted for use by instructors and students.

• Shared responsibility for achieving graduation-level competencies among general education (core) courses, courses in the major, other influential components of the university experience such as academic advisers and Student Affairs, and the student.

• A required senior capstone course or experience comparable to those already in place in the professional schools that will draw upon students’ previous years of learning and provide meaningful closure to their educational experience at UAB.

• The selection of a yearly UAB Discussion Book as the basis for a series of activities, initiated by the university President, that will unite the UAB community in a shared learning experience that promotes open discussion and civic involvement.

• A new grant program to provide the necessary time and instructional resources for faculty to develop learning community curriculum, to enhance instructional methodologies to improve student engagement, and to transform courses to improve student learning of writing, quantitative literacy, and ethics and civic responsibility.

• Assessment instruments and strategies to measure individual and institutional improvement in student learning of writing, quantitative literacy, and ethics and civic responsibility. These data will provide the basis for modifications in implementation strategies to improve student learning, part of a continuous cycle of assessment, analysis, and improvement.

The Shared Vision for a UAB Graduate reflects high expectations. The phenomenal growth of UAB into a major research university provided the impetus for the Committee to develop an ambitious QEP that helps UAB reach the first Goal in its Strategic Plan: “We will achieve a highly effective undergraduate educational experience to give students the best possible preparation for productive and meaningful careers and lives that benefit society.”


A report from the Office of Planning and Analysis
University of Alabama at Birmingham, 934-2226

Title: Analysis of an assessment of writing samples from EH 101 and 102
Prepared by: David Corliss, Ph.D., Director, Special Assessment Projects
Prepared for: Dr. Peggy Jolly, Dr. Ted Benditt, Dr. Peter Bellis, Dr. Marilyn Kurata
Copied to: Dr. Glenna Brown, Dr. Philip Way
Date: June 21, 2007
Confidential: No

Summary: This report contains the results of analyses of the assessment of student writing samples from EH 101 and 102. These samples were scored by eight faculty on five objectives. There are significant effects on the resulting scores of course, reader, objective, course by reader, and reader by objective. The main effects include:

1. There is a difference of roughly 0.5 point between the two courses on the 6-point scale used by each reader.

2. There is a difference of roughly 1.0 point between the lowest- and highest-scoring readers.

3. The scores on the higher-order objectives (Thesis, Response, and Evidence) are lower than those for Sentence Structure and Vocabulary and for Grammar and Usage.

There were many potential sources of bias introduced by the methodology that could, at least in part, have caused some of these significant effects. Three methods for eliminating these biases in future assessments are proposed.
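The factorial analysis summarized above (effects of course, reader, and objective, plus the two interaction terms) could be reproduced along the following lines; this is an illustrative sketch, not the original analysis code, and the file and column names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

scores = pd.read_csv("eh101_eh102_writing_scores.csv")  # hypothetical file
# score: 1-6 rating; course, reader, objective: categorical identifiers.

model = smf.ols(
    "score ~ C(course) + C(reader) + C(objective)"
    " + C(course):C(reader) + C(reader):C(objective)",
    data=scores,
).fit()
print(anova_lm(model, typ=2))  # F and p for each main effect and interaction
```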


Descriptive statistics
A total of thirty-two EH 101 and 114 EH 102 writing samples were scored on five objectives by eight readers, who rendered a total of 1,460 scores. There were twenty-three different pairings of readers. Not all reader pairings were equally represented, nor were they evenly distributed across the objectives within a pair. The number of judgments by the pairs ranged from one to forty-one.

Characteristics of readers' scores
Figure 1 shows the mean scores by reader, sorted in descending order by the mean of the scores for the two courses. The range of the means of both courses is from 3.3 to 4.2. An analysis of variance shows that there is a significant effect of reader, with Reader 104 being significantly different from all the others except 710. Reader 710, in turn, is significantly different from all the others except 614 and 217.


Figure 1. Mean scores by reader and course.

Figure 2 shows the distributions of the differences between readers when they score the same objective on the same paper. The percentages within each course add to 100%. This enables the two distributions to be better compared than would otherwise be possible when viewed in terms of absolute numbers. The mode for both courses is 1. The percentages of differences greater than or equal to 2 were 18% and 23% for EH 101 and EH 102, respectively.

Figure 3 shows the same data as Figure 2 broken out by objective. The modal values occur for a difference of 1 for the Response and Evidence objectives for both courses. Differences greater than or equal to 2 occurred most frequently for the Thesis objective for both courses. A surprisingly large number of these differences occurred for SS & V in EH 101.


Figure 2. Distributions of the differences between readers when scoring the same objective on the same paper.


Figure 3. Distributions of the differences between readers when scoring the same objective broken out by objective.
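The reader-difference distributions in Figures 2 and 3, and the rule for sending a pairing to a third reader, can be sketched as follows; the code is illustrative only, and the file and column names are hypothetical.

```python
import pandas as pd

ratings = pd.read_csv("writing_sample_ratings.csv")  # hypothetical file
# Expected columns: course, paper_id, objective, score_r1, score_r2.

ratings["difference"] = (ratings["score_r1"] - ratings["score_r2"]).abs()

# Percentage distribution of differences within each course (Figures 2-3).
distribution = (
    ratings.groupby("course")["difference"]
    .value_counts(normalize=True)
    .mul(100)
    .round(1)
)
print(distribution)

# Pairings flagged for a third reader: a difference of 2 or more.
needs_third_reader = ratings[ratings["difference"] >= 2]
print(len(needs_third_reader))
```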

A third reader scored objectives when the difference between the primary readers was greater than or equal to 2. Figure 4 shows how this worked. The bottom row of numbers on the x-axis is the Reader 3 score. The other two scores are those of Readers 1 and 2.

Consider an example of how to read Figure 4. The largest percentage of score combinations (highlighted with the blue oval) occurs for EH 101 when Reader 1 scored an objective a 4, Reader 2 scored it a 2, and Reader 3 scored it a 3. This is a case where Reader 3 essentially represents the average of the two readers’ scores. Between the two


courses about 28% of the judgments of Reader 3 split the difference between Readers 1 and 2. Reader 3 matched the lower score about 5% of the time and matched the higher score about 28% of the time. Reader 3 was lower than the lower score or higher than the higher score only about 1.5% of the time.

Figure 4. Frequency of scores when a third reader was required to resolve a difference of 2 or greater between Reader 1 and Reader 2. On the x-axis, the top row gives Reader 2's score, the middle row Reader 1's score, and the bottom row Reader 3's score. The blue oval highlights the example described in the text.

What the scores tell us about writing
The ultimate questions to be answered by this exercise of scoring all these writing samples are where students are having the most difficulty at a certain phase of their education and whether they improve over time. Before attempting to answer these questions from these data, it is important to consider the potential measurement biases that make it difficult to draw definitive conclusions from this round of assessment:

• First, and probably the most important, is the fact that the readers were not blinded to whether the writing sample came from EH 101 or EH 102.

• Second, within each course the sections and students were clearly identified.

• Third, the topics were different for EH 101 and 102 and there is no way to tell how much influence that had on the student output.

• Fourth, as pointed out in the first section of this report, the distribution of readers across courses and objectives was not uniform. This resulted in a predominance of scores coming from a few reader combinations for many of the objectives.

• Fifth, when aggregated across courses and objectives, the scores of individual readers are significantly different. This issue, in combination with the previous one, results in a statistically significant interaction effect between reader and objective and between reader and course.

Even though these biases may have been introduced, it is nevertheless useful to make some predictions that can be tested for face validity, if not for statistical significance. There are two predictions of particular interest given the questions above. The first is that students should do consistently better across all the objectives in EH 102 than they

Page 62: VI: Impact Report of the Quality Enhancement Plan · A. Improved basic writing and math skills => students are better prepared for advanced courses B. Improved competency in general

5

do in EH 101. In other words, they should have developed their overall competency level as result of taking these courses.

The second is that students should do better on the more concrete objectives (i.e., Sentence Structure and Vocabulary, Grammar and Usage) than they do on the higher order objectives (i.e., Thesis, Response to Topic, Evidence Supporting Topic). This prediction is based on the assumption that they enter college better at the basic competencies than they are at the higher-order thinking competencies. The next two figures bear on these two predictions.

Figure 5 shows the distributions of scores by objective and course. The scores used in this figure and the next are the sum of the two reader scores. Using the Reader 3 scores does not alter these results noticeably. The percentages within each course add to 100% across all the scores and objectives. This normalization enables us to compare the two courses more easily.

(Chart omitted: frequency of each summed score, 1 to 12, by objective and course; the y-axis runs from 0% to 10%.)
Figure 5. Distributions of scores by objective and course. The percentages represented by all the bars for each course add to 100%.

The first thing to notice about this figure is that the distribution of scores for the Thesis objective is generally more spread out than the distributions for the other objectives. This produces lower frequencies (shorter bars) in the Thesis distribution. Moving from left to right across the graph, the distributions shift toward higher scores, get narrower, and hence get taller. This is consistent with the second prediction above. It is also supported by the analysis of variance, which shows that the mean score on the Thesis objective is significantly different from all the other objectives except Evidence. Evidence is, in turn, statistically different from Grammar and Usage.

Figure 6 is a transformation of Figure 5 that makes it easier to see the differences between the two courses. To generate this figure, the percentage (frequency) of each score was added to the sum of the percentages of all the previous scores, starting with the lowest; a sketch of this computation follows the list below. These cumulative sums were normalized to 100% within each objective. To interpret these curves, consider the following characteristics of such curves:

• If the distributions were identical in position and shape then the lines would lie on top of each other.

• If the distributions were identical in shape but shifted on the score axis from each other then the shapes of the lines would be identical but displaced from each other. A shift of a cumulative distribution to the left represents a shift to a lower mean score.


• If the curves are separated in only part of the score range and overlap in another, then the shapes of the underlying distributions are different.
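The cumulative curves in Figure 6 can be generated with a few lines of code. The sketch below assumes each observation is a summed two-reader score stored per course and objective; the sample values are made up purely to show the computation.

# Sketch of building cumulative percentage curves like Figure 6.
# `scores` maps (course, objective) to a list of summed two-reader scores.
# The data below are made up solely to show the computation.
import numpy as np

scores = {
    ("EH 101", "1-Thesis"): [4, 5, 5, 6, 7, 8, 9],
    ("EH 102", "1-Thesis"): [6, 7, 7, 8, 9, 10, 11],
}

def cumulative_percentages(values, score_range=range(2, 13)):
    values = np.asarray(values)
    counts = np.array([(values == s).sum() for s in score_range])
    pct = 100.0 * counts / counts.sum()   # normalize to 100% within the objective
    return np.cumsum(pct)                 # running total from low to high scores

for key, vals in scores.items():
    print(key, np.round(cumulative_percentages(vals), 1))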

There are some interesting things to learn from Figure 6. First, there are some obvious shifts of the EH 102 curves to the right of the EH 101 curves. This is consistent with the first prediction above. The only surprising exception to this is that the curves for the Response objective are essentially on top of each other.

Second, there is generally greater separation between the curves in the lower score range than in the upper. This is due to higher frequencies of students in the lower score ranges in EH 101 than in EH 102. Again, the exception is for the Response objective. One possible explanation for this is that the students at the low ends of the distribution in EH 101 dropped out. Another is that the gains made by the students at the lower end of the EH 101 distributions were greater than those of the students at the high end.

Third, the curves for EH 102 are smoother and more S-shaped, indicating that they more closely resemble a normal distribution. This may be due to the fact that there were many more cases in EH 102 than in EH 101.

(Chart omitted: cumulative percentage, 0% to 100%, of each summed score, 1 to 12, by objective and course.)
Figure 6. Cumulative percentages by objective and course. The percentages at each score are successively added from low to high within each objective. The vertical red arrows extending up from 6 for thesis and G & U indicate the percentages of students who achieved that score or less in each course.

The way in which the data are plotted in Figure 6 is particularly useful for determining the percentages of students that fall below a particular cut score. Suppose, for example, that a score of 6 is taken as the dividing line between being not proficient and proficient. A vertical line drawn up from 6 intersects the cumulative curves at points where the percentages can be read off the vertical axis. There are two examples shown as red arrows in the figure. The percentages of not-proficient students for the Thesis objective are 70% and 38% for EH 101 and EH 102, respectively. The corresponding percentages for Grammar and Usage are 24% and 7%.
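Reading a proficiency percentage off such a curve amounts to evaluating the cumulative distribution at the cut score. A minimal sketch, with illustrative scores and a cut score of 6 as in the example above:

# Sketch of reading a "not proficient" percentage off a cumulative curve,
# using a cut score of 6 as in the text. Inputs are illustrative only.
import numpy as np

def percent_at_or_below(values, cut_score):
    values = np.asarray(values)
    return 100.0 * (values <= cut_score).mean()

eh101_thesis = [3, 4, 5, 5, 6, 7, 8, 9, 10]    # hypothetical summed scores
eh102_thesis = [5, 6, 7, 8, 8, 9, 10, 11, 12]
for label, vals in [("EH 101", eh101_thesis), ("EH 102", eh102_thesis)]:
    print(label, f"{percent_at_or_below(vals, 6):.0f}% at or below 6")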

When aggregated across the objectives and readers, there is about a 0.5-point difference between courses on the 6-point scale used by each reader. Though this difference is statistically significant, given the potential biases discussed above it has to be interpreted with caution. These results should be taken as examples of how the data can be analyzed in future assessments.


The relationship among scores on the objectives
While readers are attempting to score only one objective at a time, it is possible that they are being influenced by some other aspect of the writing sample. For example, sentence structure, vocabulary, grammar, and usage may bias a reader to respond more negatively to reasonable presentations of evidence, response, or thesis. Conversely, if a thesis statement is not present or is poorly constructed, then it is unlikely that the student has written a good response or marshaled evidence to support a thesis. The presence of such interactions can be tested for by looking at the correlations among the scores on the objectives.

Table 1 shows the results. For EH 101 only two pairs are significantly correlated: Evidence with Sentence Structure and Vocabulary, and Sentence Structure and Vocabulary with Grammar and Usage. For EH 102 the results are much more dramatic. All pairs are significantly correlated except for Thesis with Grammar and Usage. It should be noted that the magnitudes of some of the correlation coefficients in the EH 101 table are the same as those in the EH 102 table, but the former are not statistically significant. This is likely an effect of sample size.

There are two alternative, though not mutually exclusive, interpretations of these results. As mentioned above, it could be that readers are influenced by all parts of the sample when judging only one. Alternatively, while there may be differences within a student on each of the objectives, the array of scores increases or decreases together. That is, students who are low on one competency may be low on other competencies and vice versa.

Table 1. Correlations among the objective scores within courses. Statistically significant (p < .05) correlation coefficients are indicated in bold.

EH 101 (n = 32)   Thesis   Response   Evidence   SS&V
Response          0.00
Evidence          0.16     0.26
SS&V              0.17     0.35       0.42
G&U               0.13     0.23       0.23       0.45

EH 102 (n = 114)  Thesis   Response   Evidence   SS&V
Response          0.41
Evidence          0.35     0.44
SS&V              0.35     0.35       0.39
G&U               0.13     0.31       0.22       0.40

Another way to look at these data is to compute the average score for each student and then determine the correlations of the scores on each of the objectives with the average score. Table 2 shows the results. All the correlation coefficients are statistically significant. The thing to note about these data is that the correlation coefficients increase from Thesis to G & U for EH 101 and decrease for EH 102. This means that the higher order objectives (Thesis, Response, and Evidence) are stronger determinants of the mean score for EH 102 than for EH 101. This is consistent with the above suggestion that all the components hang together better in the EH 102 samples.

Table 2. Correlations of the scores on each objective with the average score across all objectives by course. All correlation coefficients are statistically significant (p < .05).

            EH 101   EH 102
Thesis      0.55     0.72
Response    0.52     0.77
Evidence    0.64     0.68
SS&V        0.71     0.67
G&U         0.69     0.54
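Both tables can be reproduced with standard correlation routines. The sketch below assumes the scores are held in a pandas DataFrame with one row per student and one column per objective; the column names and the randomly generated data are illustrative only.

# Sketch of the correlation analyses behind Tables 1 and 2, assuming the
# scores sit in a DataFrame with one row per student and one column per
# objective. Column names and data are illustrative, not the real scores.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
df = pd.DataFrame(
    rng.integers(2, 13, size=(30, 5)),
    columns=["Thesis", "Response", "Evidence", "SS&V", "G&U"],
)

pairwise = df.corr()                  # Table 1 style: objective-by-objective correlations
mean_score = df.mean(axis=1)          # per-student average across objectives
with_mean = df.corrwith(mean_score)   # Table 2 style: each objective vs. the average

print(pairwise.round(2))
print(with_mean.round(2))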

Eliminating potential biases in future writing assessments
To be able to answer the questions of interest it is necessary to eliminate the potential biases mentioned above. The experimental design required has the following features:


1. The readers should not be able to identify which course, section, or student the writing sample comes from. This requires that all identifying information be removed and that the answer sheets be uniformly designed. Each sample should be coded so that the scores can be linked back to the course.

2. Students should be required to respond to the same topic in the two writing samples to control for potential effects of difficulty or familiarity with a topic. The best way to do this is to have two topics. Half the students would respond to one topic in EH 101 and the other topic in EH 102. Those students who responded to one topic in EH 101 would respond to the other in EH 102.

3. The sample size needs to be adequate to test for the effects of topic and course. Based on the results reported here, a minimum of fifty writing samples per course is required. This number increases by a factor of two if the effect of topic is to be tested.

4. There are multiple ways to distribute individual readers and reader pairs across the samples. The primary issue is how to best factor out inter-reader differences and distribute pairings across samples adequately. Is it better to have readers score the same two objectives or should each reader score all objectives? Should the same readers be paired as shown in the table or should readers be paired with multiple other readers? Table 3 shows one possibility. The rules here are straightforward:

a. Each reader scores two objectives for two courses and two different topics. Reader 1, for example, scores Thesis and G & U on Topic 1 in EH 101 and Topic 2 in EH 102.

b. The same readers are always paired on whatever objectives they score.

c. Although not perfect, reader pairs are scoring both the higher-order thinking objectives and the more fundamental competencies. Since there are three of the former and two of the latter, Readers 5 and 6 score Response and Evidence, two of the higher-order objectives.

Table 3. Experimental design A for future writing assessments.

Course   Student Group   Writing Topic   Objective    Readers 1–10 (X = assigned pair)
EH 101   A               1               1-Thesis     X X
EH 101   A               1               2-Response   X X
EH 101   A               1               3-Evidence   X X
EH 101   A               1               4-SS & V     X X
EH 101   A               1               5-G & U      X X
EH 101   B               2               1-Thesis     X X
EH 101   B               2               2-Response   X X
EH 101   B               2               3-Evidence   X X
EH 101   B               2               4-SS & V     X X
EH 101   B               2               5-G & U      X X
EH 102   A               2               1-Thesis     X X
EH 102   A               2               2-Response   X X
EH 102   A               2               3-Evidence   X X
EH 102   A               2               4-SS & V     X X
EH 102   A               2               5-G & U      X X
EH 102   B               1               1-Thesis     X X
EH 102   B               1               2-Response   X X
EH 102   B               1               3-Evidence   X X
EH 102   B               1               4-SS & V     X X
EH 102   B               1               5-G & U      X X

5. Table 4 shows an alternative method that is only marginally more complex than the previous one. Each reader still scores the same number of samples but hits four unique objectives. There are twenty unique pairings of readers and each reader is paired with four other readers. The number of topics remains the same.


Table 4. Experimental design B for future writing assessments.

Course   Student Group   Writing Topic   Objective    Readers 1–10 (X = assigned pair)
EH 101   A               1               1-Thesis     X X
EH 101   A               1               2-Response   X X
EH 101   A               1               3-Evidence   X X
EH 101   A               1               4-SS & V     X X
EH 101   A               1               5-G & U      X X
EH 101   B               2               1-Thesis     X X
EH 101   B               2               2-Response   X X
EH 101   B               2               3-Evidence   X X
EH 101   B               2               4-SS & V     X X
EH 101   B               2               5-G & U      X X
EH 102   A               2               1-Thesis     X X
EH 102   A               2               2-Response   X X
EH 102   A               2               3-Evidence   X X
EH 102   A               2               4-SS & V     X X
EH 102   A               2               5-G & U      X X
EH 102   B               1               1-Thesis     X X
EH 102   B               1               2-Response   X X
EH 102   B               1               3-Evidence   X X
EH 102   B               1               4-SS & V     X X
EH 102   B               1               5-G & U      X X

6. It is possible to develop a design in which all readers score all objectives and pairings are distributed evenly. The idea is to form two panels of five readers each. These panels should be balanced by rank, experience, interest, and whatever other characteristics are appropriate. Table 5 shows a partial view of how this would be laid out. Each Panel 1 reader is paired with each Panel 2 reader and both score all five objectives on a writing sample. All readers and reader pairs score both topics for both courses over twenty writing samples. This is probably the best design, and the numbers can be increased easily to gain statistical power and the ability to test for topic and reader effects. A sketch of how this layout can be generated follows Table 5.

Table 5. Experimental design C for future writing assessments. The entire table includes 100 rows so only a small sample is shown here.

Course   Student Group   Writing Topic   Panel 1 Reader   Panel 2 Reader   Sample #   1-Thesis   2-Response   3-Evidence   4-SS & V   5-G & U
EH 101   A               1               1                1                  1        X          X            X            X          X
EH 101   A               1               1                2                  2        X          X            X            X          X
EH 101   A               1               1                3                  3        X          X            X            X          X
EH 101   A               1               1                4                  4        X          X            X            X          X
EH 101   A               1               1                5                  5        X          X            X            X          X
EH 101   A               1               2                1                  6        X          X            X            X          X
EH 101   A               1               2                2                  7        X          X            X            X          X
EH 101   A               1               2                3                  8        X          X            X            X          X
EH 101   A               1               2                4                  9        X          X            X            X          X
EH 101   A               1               2                5                 10        X          X            X            X          X
EH 101   A               1               3                1                 11        X          X            X            X          X
…        …               …               …                …                  …
EH 102   A               2               5                1                 96        X          X            X            X          X
EH 102   A               2               5                2                 97        X          X            X            X          X
EH 102   A               2               5                3                 98        X          X            X            X          X
EH 102   A               2               5                4                 99        X          X            X            X          X
EH 102   A               2               5                5                100        X          X            X            X          X
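The full 100-row layout of design C can be enumerated programmatically. The sketch below is one way to do so under the assumptions stated in the text (two panels of five readers, every cross-panel pairing, all five objectives scored on every sample, and each student group writing one topic in EH 101 and the other in EH 102); the ordering of the rows is an assumption rather than a property taken from Table 5.

# Sketch of generating the design C layout. Row ordering and the pairing of
# student groups with topics per course are assumptions for illustration.
from itertools import product

objectives = ["1-Thesis", "2-Response", "3-Evidence", "4-SS & V", "5-G & U"]
blocks = [  # (course, student group, writing topic)
    ("EH 101", "A", 1), ("EH 101", "B", 2),
    ("EH 102", "B", 1), ("EH 102", "A", 2),
]

rows = []
sample = 0
for (course, group, topic), p1, p2 in product(blocks, range(1, 6), range(1, 6)):
    sample += 1
    rows.append((course, group, topic, p1, p2, sample, objectives))

print(len(rows))                     # 100 rows, matching the full Table 5
print(rows[0][:6], rows[-1][:6])     # first and last sample descriptors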


A report from the

Office of Planning and Analysis

University of Alabama at Birmingham 934-2226

Title: Analysis of an assessment of writing samples from EH 101 and 102 for the 2007-08 academic year

Prepared by: David Corliss, Ph.D., Director, Special Assessment Projects

Prepared for: Dr. Peggy Jolly, Dr. Peter Bellis, Dr. Marilyn Kurata

Copied to: Dr. Glenna Brown, Dr. Philip Way

Date: August 5, 2008

Confidential: No

Summary: This report contains the results of analyses of the assessment of


Introduction
The analysis of the previous year’s data showed that writing improved from EH 101 to EH 102 by about 0.5 points on a 6-point scale. While this difference was statistically significant, the analysis also revealed many potential sources of bias that could have inflated the observed difference. Several changes were made in the experimental design to eliminate as many of these sources as possible. The primary ones related to sampling are described in the following section.

Methods

Answer book design
Answer booklets were prepared that enabled the blinded scoring of timed student writing samples gathered at the beginning of the fall semester 2007 and at the end of the spring semester 2008. The first page identified the course and term and included blanks where students could enter their personally identifiable information. Once grades were assigned, this page was removed from the papers that were collected for assessment.

The header of each page included the topic number, letter codes for the two readers who were to read the papers, and a random-number-based code that identified the course and term. With the first page removed readers were thus not able to identify the course from which the paper came. The codes were deciphered in the spreadsheet that generated them.
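A simple way to implement this kind of blinding is to generate a random code for each paper and keep the course/term key in a separate file. The sketch below is illustrative only; the code format and the CSV key file are assumptions, not the actual spreadsheet used.

# Sketch of generating blinded paper codes and a separate decoding key,
# in the spirit of the random-number-based codes described above.
# The 6-digit format and the CSV key file are assumptions for illustration.
import csv
import secrets

def make_key(papers, key_path="code_key.csv"):
    """papers: list of (course, term) tuples, one per collected paper."""
    key = {}
    for course, term in papers:
        code = f"{secrets.randbelow(10**6):06d}"   # random code printed on the header
        while code in key:                          # avoid collisions
            code = f"{secrets.randbelow(10**6):06d}"
        key[code] = (course, term)
    with open(key_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["code", "course", "term"])
        for code, (course, term) in key.items():
            writer.writerow([code, course, term])
    return key

key = make_key([("EH 101", "Fall 2007"), ("EH 102", "Spring 2008")])
print(key)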

The following instructions were given to the students:

General Directions

For this writing exercise you will be asked to construct an argument based on a contention made by the author of the UAB Discussion Book for this year. A short passage taken directly from the book introduces the author’s views. Your essay should reflect your own views and should be supported with specific details.

Plan your essay carefully before you begin to write. You can jot down notes on the back of this page to guide yourself. You do not need to fill the 6 pages provided, but please do not exceed them.

Proofread your work carefully before turning it in.

The writing prompts were based on the UAB Discussion Book:

Topic 1:

The passage below from All Over but the Shoutin’ (Rick Bragg, Vintage, 1997) describes the author’s observation of how Birmingham has changed over the years from a historically steel producing, racist area to a Yuppie town.

Read the passage carefully and construct an argument that supports or refutes Bragg’s contention that the previous and current views of Birmingham have little in common.

“By the time I got to Birmingham, its great story was already frozen in stone. Kelly Ingram Park is a place of statues now, quiet, peaceful, unless you are one of those people to whom history screams….It is a yuppie town now. At lunchtime, 20th Street is a parade of black wingtips and sensible pumps. The sky has not been darkened by the steel mills for a long time. A world-class medical school, not the furnaces, defines this green and pretty city. The very name Birmingham will always be shorthand for the worst of the civil rights movement, I suppose, but when I was there, in the last part of the 1980s, the city had abandoned even the memory of men like Theophilus Eugene ‘Bull’ Connor, as I wrote then, ‘like a gun left behind at the scene of the crime’” (157-8).

Topic 2.

The passage below from All Over but the Shoutin’ (Rick Bragg, Vintage, 1997) describes the author’s observations on the effects of poverty.


Read the passage carefully and construct an argument that supports or refutes Bragg’s contention that poverty profoundly influences those who live with it daily.

“There is a notion, a badly mistaken one among comfortable people, that you do not miss what you never had. I have written that line myself, which is shameful to me now. I, of all people, should know better, should know that being poor does not make you blind to the riches around you; that living in other folks’ houses for a lifetime does not mean a person does not dream of a house of his or her own, even if it is just a little one. My mother ached for a house, for a patch of ground, for something. When I was a young man and we would take drives through town, she stared at the homes of others with a longing so strong you could feel it. She stared and she hoped and she dreamed until she finally just got too tired of wanting” (24-25).

Sampling
The original design was set up to collect an equal number of papers from each course and each topic. Each paper was to be read by two readers and each reader was to be paired with five other readers over all the papers. There were fewer papers read than called for in the original design. There were also some imbalances in the numbers of papers per course, per topic, and per reader (Table 1). While not optimal, these discrepancies did not appear to have affected the results.

Table 1. Numbers of papers read by Course, Topic, and Reader.

Course        Topic   A    B    C    D    E    F    G    H    I    J    Total
EH 101        1       8    7    9    11   14   8    12   11   9    9    98
              2       16   14   11   13   17   14   15   15   13   14   142
101 Total             24   21   20   24   31   22   27   26   22   23   240
EH 102        1       14   12   4    6    8    9    10   8    6    11   88
              2       12   11   10   13   14   13   13   10   10   12   118
102 Total             26   23   14   19   22   22   23   18   16   23   206
Grand Total           50   44   34   43   53   44   50   44   38   46   446

Table 2 shows how the 10 readers were paired. There were 216 individual papers read by 25 different pairs of readers and 14 papers that were read by only one reader.

Results

Analysis of variance
The data were analyzed using multivariate analysis of variance (MANOVA) where all the objectives were treated collectively as dependent variables. The model tested the effects of course, topic, and reader as well as all the interactions among these variables.
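For reference, an analysis of this form can be run with the MANOVA class in statsmodels. The sketch below uses balanced, randomly generated scores solely so that the example runs; the real analysis used the actual objective scores, readers, and topics listed in Table 1, and SS & V and G & U are renamed SSV and GU here only to keep the formula parseable.

# Sketch of a MANOVA with Course, Topic, and Reader and their interactions.
# The data are random and balanced purely so the example executes.
from itertools import product
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(1)
rows = [
    {"Course": c, "Topic": t, "Reader": r}
    for c, t, r, _ in product(["EH 101", "EH 102"], ["1", "2"], list("ABCDEFGHIJ"), range(10))
]
df = pd.DataFrame(rows)
for obj in ["Thesis", "Response", "Evidence", "SSV", "GU"]:
    df[obj] = rng.integers(1, 7, size=len(df))   # placeholder 1-6 scores

model = MANOVA.from_formula(
    "Thesis + Response + Evidence + SSV + GU ~ Course * Topic * Reader", data=df
)
print(model.mv_test())   # reports Wilks' lambda, F, and p for each effect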

Table 3 shows that there are significant effects for Course and Reader. In spite of the fact that there are significant differences among the readers, there are no significant interaction effects between Course and Reader, Topic and Reader, or Course and Topic. This means, for example, that there was no systematic bias whereby one or more readers scored EH 101 higher and EH 102 lower while other readers did the opposite.

Figure 1 shows the effect of course by itself. This graph and the statistics in Table 3 clearly indicate a significant effect of Course on all the dependent variables collectively. Follow-up tests for each dependent variable separately indicate that Course significantly affects all the objectives when each is considered alone.

Table 2. Reader pairings.

Pair   Papers    Pair   Papers    Pair   Papers    Pair   Papers    Pair   Papers
AF     10        BF     10        CF     6         DF     8         EF     10
AG     11        BG     8         CG     8         DG     8         EG     13
AH     10        BH     10        CH     7         DH     8         EH     9
AI     8         BI     7         CI     7         DI     8         EI     7
AJ     10        BJ     8         CJ     6         DJ     10        EJ     9

Reader totals: A 49, B 43, C 34, D 42, E 48, F 44, G 48, H 44, I 37, J 43


Table 3. Analysis of variance results. The critical number is the p-value in the right-hand column.

Effect                      Wilks Lambda   F       Effect df   Error df   p
Course                      0.88           10.41   5           401.00     0.000*
Topic                       0.99           0.65    5           401.00     0.664
Reader                      0.58           5.15    45          1796.87    0.000*
Course by Topic             0.99           1.00    5           401.00     0.414
Course by Reader            0.90           0.98    45          1796.87    0.518
Topic by Reader             0.88           1.15    45          1796.87    0.233
Course by Topic by Reader   0.93           0.68    45          1796.87    0.949

(Chart omitted: mean score, 1.0 to 6.0, on each objective for EH 101 and EH 102. Wilks lambda = 0.88, F(5, 401) = 10.41, p < .001; vertical bars denote 0.95 confidence intervals.)
Figure 1. Effect of course on the objectives.

The largest difference between EH 101 and EH 102 is for the Response objective at 0.62 points. The smallest is for SS & V at 0.50 points.

Figure 2 shows that the effect of Topic is not significant. Even though the effect is not significant, it is interesting to note the consistency with which the scores on each objective are slightly lower on Topic 2 than on Topic 1. The largest difference is -0.19 for Evidence and the smallest is -0.09 for G&U.

Figure 3 shows that the effect of Reader is significant. Post hoc tests reveal that Reader G is significantly different from six of the other readers and Reader E is significantly different from five of the other readers (G not being one of them).


(Chart omitted: mean score, 1.0 to 6.0, on each objective for Topics 1 and 2. Wilks lambda = 0.99, F(5, 421) = 1.07, p = .376; vertical bars denote 0.95 confidence intervals.)
Figure 2. The effect of Topic on the objectives.

(Chart omitted: mean score, 1.0 to 6.0, on each objective for Readers A through J. Wilks lambda = 0.58, F(45, 1796.9) = 5.15, p < 0.001; vertical bars denote 0.95 confidence intervals.)
Figure 3. The effect of Reader on the objectives.

Figure 4 shows the differences in the overall mean of the objective scores among pairs of readers. The effect is significant. It is, however, due primarily to the difference between pairs BG and EI. No other combination of pairs is significantly different. There is no significant interaction of reader pairs with Course. Hence, there is no indication of bias when readers are taken in pairs.


(Chart omitted: overall mean score, 1.0 to 6.0, for each of the 25 reader pairs. Current effect: F(24, 381) = 1.89, p = .007; vertical bars denote 0.95 confidence intervals.)
Figure 4. Effect of Reader pairs on the mean of the objectives. (The mean rather than all the objective scores was used in this display to make it easier to read.)

The relationship among scores on the objectives
Even though each writing sample is scored on five objectives, it is unlikely that they are independent. It is reasonable to expect, for example, that Thesis, Response, and Evidence would be dependent on each other. If a thesis is not well stated, one could hypothesize that it might be difficult to justify scoring the evidence in support of that thesis very high. Table 4 supports this particular hypothesis, but it is not the strongest relationship. In fact, Thesis is most highly correlated with Response, which, in turn, is most highly correlated with Evidence.

The tightest relationship among the objectives is between G&U and SS&V, however. This is probably because they are less abstract than the other three objectives. Note that the correlations between these two objectives and the other three are the lowest in the table.

Table 4. Correlations among the objectives by course. All these correlations are statistically significant.

EH 101      Thesis   Response   Evidence   SS&V
Response    0.80
Evidence    0.65     0.83
SS&V        0.57     0.63       0.66
G&U         0.47     0.51       0.54       0.84

EH 102      Thesis   Response   Evidence   SS&V
Response    0.80
Evidence    0.71     0.85
SS&V        0.63     0.65       0.72
G&U         0.56     0.57       0.61       0.87

Another thing to note in Table 4 is that the correlations are generally higher for EH 102 than they are for EH 101. This can be interpreted to mean that the better-written essays probably hang together better.

There were 109 readings where the reader scored all the objectives the same. Reader E accounted for 28% of these scoring patterns while Readers C and H accounted for another 29%. Removing these obviously correlated scores from the analysis reduced the correlation coefficients, as might be expected, but they were all still significant and exhibited the same patterns as those in Table 4.

Table 5 shows the correlations between the individual objectives and the overall mean of the objectives by course and topic. Though differences among them are small, EH 102 again seems to hang together better than EH 101. The only correlation that seems to be out of line with the others is the one between the mean and G&U for Topic 2 in EH 101.

Table 5. Correlations between the mean of the objectives and the individual objectives by course and topic. All these correlations are statistically significant.

            EH 101               EH 102
            Topic 1   Topic 2    Topic 1   Topic 2
Thesis      0.81      0.84       0.83      0.87
Response    0.87      0.90       0.90      0.89
Evidence    0.85      0.88       0.89      0.91
SS&V        0.86      0.87       0.90      0.87
G&U         0.81      0.77       0.83      0.82

It is interesting to compare the results in Table 4 and Table 5 above with the corresponding tables in last year’s report (Tables 1 and 2 on page 7). There were only two significant correlation coefficients for EH 101 and those were much lower than those above. Part of the reason for this may be that there was a smaller sample size last year. It may also be due to the overall design in which the readers were blind to the course from which the sample came. It could also be due to better training. Whatever the reason, the observed correlation coefficients represent an overall improvement in the process.

What the scores tell us about students’ writing proficiency
Figure 5 shows the cumulative percent of students for each objective by course. Unlike the results obtained with the previous year’s sample, each of the objectives shows a clear-cut difference by course. It is possible to determine from this graph what percentage of students might be considered proficient at a particular score. Take, for example, a score of 4 on Thesis. At 4 the cumulative percentages of students are 80% for EH 101 and 61% for EH 102. If students above this score are considered proficient, then the corresponding percentages of proficient students would be 20% and 39%, respectively. Table 6 shows the calculations for all the objectives at scores of 3, 4, and 5.

(Chart omitted: cumulative percent of students, 0% to 100%, versus score, 1 to 6, for each objective by course.)
Figure 5. Cumulative percent of students versus the score for each objective by Course.


Table 6 can be used as a guide to where the gains between courses are the greatest and where they are the least. The gains across the three scores for Thesis and Response are essentially equal. Evidence is in the middle, while SS & G and G & U are essentially equal at the low end when all three scores are taken together.

Table 6. Percentages of “proficient” students by course at scores of 3, 4, and 5 on each objective.

Objective   Cutoff Score   EH 101   EH 102   Gain
Thesis      3              52.9%    74.8%    21.8%
            4              20.0%    38.8%    18.8%
            5              2.5%     12.6%    10.1%
Response    3              49.6%    71.8%    22.3%
            4              15.0%    33.0%    18.0%
            5              0.4%     11.2%    10.7%
Evidence    3              44.6%    65.0%    20.5%
            4              15.0%    29.1%    14.1%
            5              0.4%     9.7%     9.3%
SS & G      3              53.8%    72.8%    19.1%
            4              12.5%    27.7%    15.2%
            5              2.1%     6.3%     4.2%
G & U       3              54.2%    71.8%    17.7%
            4              13.3%    31.6%    18.2%
            5              2.1%     6.3%     4.2%

Discussion
The principal objective of the experimental design used to analyze EH 101 and EH 102 writing samples was to eliminate as many sources of bias as possible and determine whether students’ writing skills improved from the beginning of EH 101 to the end of EH 102. While significant differences among readers and reader pairs were not eliminated, these differences were not systematic and therefore did not introduce any bias into the interpretation of the most important result, the difference seen between the two courses. There were no interaction effects among Course, Topic, Reader, or Reader Pairs either. It is therefore possible to conclude with a high degree of certainty that the observed differences in the mean score (0.6 points) and in each of the objective scores are real effects of Course.

Next steps
Given that there are significant differences in writing performance, a next step is to use the other data presented in this report to probe more deeply into a number of issues. For example,

1. Is it possible to eliminate significant reader differences, both at the individual and pair levels?

2. How can the percentages of “proficient” students be improved overall and, in particular, in SS & G and G & U?

3. Although the effect of Topic was not significant, students in both courses scored consistently lower on Topic 2 on all the objectives. What hypotheses might one form about why this might be the case, and how might these affect the formulation of writing prompts in the future?


A report from the


Office of Planning and Analysis

University of Alabama at Birmingham 934-2226

Title: Analysis of an assessment of writing samples from EH 101 and 102 for the 2008-09 academic year

Prepared by: David Corliss, Ph.D.

Director, Special Assessment Projects

Prepared for: Dr. Peggy Jolly, Dr. Peter Bellis, Dr. Marilyn Kurata

Copied to: Dr. Glenna Brown, Dr. Philip Way

Date: June, 2009

Confidential: No

Summary: This report contains the results of analyses of the assessment of student writing samples taken from EH 101 at the beginning of the fall semester 2008 and from EH 102 at the end of the spring semester 2009. It is possible to conclude with a high degree of certainty that students’ writing skills improved over the intervening year. The magnitude of the change was 0.34 point out of 6 points for the mean of the five objective scores. This is smaller than the observed difference of 0.58 points the previous year. Overall, the scores were higher in 2008-09 than they were in 2007-08. There were statistically significant effects of readers, reader pairs, and the topics, but there were no interaction effects among Course, Topic, Reader, or Reader Pairs. That is, there was no evidence that reader differences introduced any significant biases. Further investigation of the prompts suggests that the readability of the prompt may have a significant effect on the scores. It is not, however, in the direction that one would expect—students scored higher on what were deemed to be more difficult prompts as measured by several readability indexes.


Introduction

This report presents the analyses of the timed writing samples taken at the beginning of EH 101 (pre-test) and the end of EH 102 (post-test).

Methods

The methods used this year were identical to those used last year with two exceptions. First, the number of papers to have been sampled was reduced from 500 to 400. This reduced the reading load from 50 to 40 per reader. Second, each reader was paired with every other reader to further reduce the influence of readers who may score significantly higher or lower than the average of readers.
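Pairing every reader with every other reader is a simple round-robin construction. A minimal sketch, with illustrative paper identifiers and counts:

# Sketch of pairing every reader with every other reader (45 pairs for 10
# readers) and cycling papers through the pairs. Paper counts are illustrative.
from itertools import combinations, cycle

readers = list("ABCDEFGHIJ")
pairs = list(combinations(readers, 2))               # 45 unordered reader pairs
papers = [f"paper_{i:03d}" for i in range(1, 201)]   # illustrative paper IDs

# Each paper is read by both members of its assigned pair.
assignment = {paper: pair for paper, pair in zip(papers, cycle(pairs))}
print(len(pairs), "pairs;", assignment["paper_001"], assignment["paper_046"])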

Answer book design
Answer booklets were prepared that enabled the blinded scoring of timed student writing samples gathered at the beginning of the fall semester 2008 and at the end of the spring semester 2009. The first page identified the course and term and included blanks where students could enter their personally identifiable information. Once grades were assigned, this page was removed from the papers that were collected for assessment.

The header of each page included the topic number, letter codes for the two readers who were to read the papers, and a random-number-based code that identified the course and term. With the first page removed readers were thus not able to identify the course from which the paper came. The codes were deciphered in the spreadsheet that generated them.

The following instructions were given to the students:

General Directions

For this writing exercise you will be asked to construct an argument based on a contention made by the author of the UAB Discussion Book for this year. A short passage taken directly from the book introduces the author’s views. Your essay should reflect your own views and should be supported with specific details.

Plan your essay carefully before you begin to write. You can jot down notes on the back of this page to guide yourself. You do not need to fill the 6 pages provided, but please do not exceed them.

Proofread your work carefully before turning it in.

The writing prompts were based on the UAB Discussion Book:

Topic 1:

The passage below is from Field Notes from a Catastrophe: Man, Nature, and Climate Change by Elizabeth Kolbert (New York: Bloomsbury, 2006). Read the passage carefully and, in a well-developed essay, present an argument that either refutes or supports Kolbert's assertion that we can think our way out of the problems of global warming.

“People are always imagining new ways to live, and then figuring out ways to remake the world to suit what they’ve imagined. This capacity has allowed us, collectively, to overcome any number of threats in the past, some imposed by nature and some by ourselves. It could be argued, taking this long view, that global warming will turn out to be just one more test in a sequence that already stretches from plague and pestilence to the prospect of nuclear annihilation. If, at this moment, the bind that we’re in seems insoluble, once we’ve thought long and hard enough about it we’ll find--or perhaps float--our way clear.” (p. 187).

Topic 2.

The passage below is from Field Notes from a Catastrophe: Man, Nature, and Climate Change by Elizabeth Kolbert (New York: Bloomsbury, 2006). Read the passage carefully and, in a well-developed essay, present an argument that either refutes or supports Kolbert's assertion that we are choosing to knowingly destroy ourselves.


“As the effects of global warming become more and more difficult to ignore, will we react by finally fashioning a global response? Or will we retreat into even narrower and more destructive forms of self-interest? It may seem impossible to imagine that a technologically advanced society could choose, in essence, to destroy itself, but that is what we are now in the process of doing.” (pp. 188-189)

Sampling

The original design was set up to collect an equal number of papers from each course and each topic. Each paper was to be read by two readers and each reader was to be paired with all other readers over all the papers for a total of 400 readings. There were fewer papers read than called for in the original design. There were also some imbalances in the numbers of papers per course, per topic, and per reader (Table 1). Only one paper was read by one reader. While not optimal, these discrepancies did not appear to have affected the results.

Table 1. Numbers of papers read by Course, Topic, and Reader.

Course Topic A B C D E F G H I J Total

EH 101 1 7 9 9 9 10 10 7 10 9 9 89

2 8 8 8 10 10 10 9 8 8 10 89

101 Total 15 17 17 19 20 20 16 18 17 19 178

EH 102 1 8 8 8 8 9 9 8 6 10 10 84

2 7 9 6 6 9 6 7 8 8 8 74

102 Total 15 17 14 14 18 15 15 14 18 18 158

Grand Total 30 34 31 33 38 35 31 32 35 37 336

Results

Analysis of variance
The data were analyzed using multivariate analysis of variance (MANOVA) where all the objectives were treated collectively as dependent variables. The model tested the effects of course, topic, and reader as well as all the interactions among these variables.

Table 2 shows that there are significant effects for Course, Topic, and Reader. The writing was better at the end of EH 102 than at the beginning of EH 101. Students scored higher on Topic 2 than on Topic 1 independent of which course the sample was taken from. The Reader effect is due primarily to two readers (Figure 3). Post hoc analysis using the mean score alone as the dependent variable indicates that Reader H is significantly different from all the other readers with a mean score of 4.7 and Reader G is significantly different from five of the other readers with a mean score of 3.5. In spite of the fact that there are significant differences among the readers, there are no significant interaction effects for Course by Reader, Topic by Reader, or Course by Topic by Reader. In other words, these Reader effects are not systematic in any way.

Table 2. Analysis of variance results. The critical number is the p-value in the right-hand column

Effect                      Wilks Lambda   F     Effect df   Error df   p

Course 0.900 5.4 6 291.0 0.000*

Topic 0.934 3.5 6 291.0 0.003*

Reader 0.366 6.0 54 1488.4 0.000*

Course by Topic 0.990 0.5 6 291.0 0.803

Course by Reader 0.812 1.2 54 1488.4 0.212

Topic by Reader 0.814 1.1 54 1488.4 0.234

Course by Topic by Reader 0.872 0.8 54 1488.4 0.909

Figure 1 shows the effect of course by itself. This graph and the statistics in Table 2 clearly indicate a significant effect of Course on all the dependent variables collectively. Follow-up tests for each dependent variable separately indicate that Course significantly affects all the objectives when each is considered alone. The difference between EH 101 and EH 102 is 0.4 points for all objectives except SS&V, which is 0.3 points, and Outside Sources, which is 0.2 points.


(Chart omitted: mean score, 1.0 to 6.0, on each of the six objectives for EH 101 and EH 102. Wilks lambda = 0.900, F(6, 291) = 5.4, p < 0.001; vertical bars denote 0.95 confidence intervals.)
Figure 1. Effect of course on the objectives.

Figure 2 shows the effect of Topic. The largest difference is in the Response objective at 0.28 while the smallest is in the Thesis objective at 0.08.

(Chart omitted: mean score, 1.0 to 6.0, on each of the six objectives for Topics 1 and 2. Wilks lambda = .933, F(6, 291) = 3.4, p = .003; vertical bars denote 0.95 confidence intervals.)
Figure 2. The effect of Topic on the objectives.


(Chart omitted: mean score, 1.0 to 6.0, on each of the six objectives for Readers A through J. Wilks lambda = .366, F(54, 1488.4) = 6.01, p < 0.001; vertical bars denote 0.95 confidence intervals.)
Figure 3. The effect of Reader on the objectives.

The relationship among scores on the objectives
Even though each writing sample is scored on six objectives, it is unlikely that they are independent. It is reasonable to expect, for example, that Thesis, Response, and Evidence would be dependent on each other. If a thesis is not well stated, one could hypothesize that it might be difficult to justify scoring the evidence in support of that thesis very high. Table 3 supports this particular hypothesis, but it is not the strongest relationship. In fact, Thesis is most highly correlated with Response, which, in turn, is most highly correlated with Evidence.

Table 3. Correlations among the objectives by course. All these correlations are statistically significant.

EH 101      Thesis   Response   Evidence   SS&V
Response    0.75
Evidence    0.66     0.72
SS&V        0.41     0.53       0.50
G&U         0.47     0.54       0.51       0.74

EH 102      Thesis   Response   Evidence   SS&V
Response    0.78
Evidence    0.66     0.78
SS&V        0.63     0.69       0.70
G&U         0.59     0.59       0.55       0.68

Another thing to note in Table 3 is that the correlations are generally higher for EH 102 than they are for EH 101. This can be interpreted to mean that the better-written essays are probably more coherent.

Table 4 shows the correlations between the individual objectives and the overall mean of the objectives by course and topic. Though differences among them are small, EH 102 again seems more coherent than EH 101. It is interesting that SS&V and G&U correlations are distinctly lower in EH 101 than they are in EH 102. The only correlation that seems to be out of line with the others is the one between the mean and G&U for Topic 2 in EH 102. Note that these two areas were targeted for improvement on the basis of last year’s analysis.


Table 4. Correlations between the mean of the objectives and the individual objectives by course and topic. All these correlations are statistically significant.

            EH 101               EH 102
            Topic 1   Topic 2    Topic 1   Topic 2
Thesis      0.81      0.79       0.86      0.85
Response    0.87      0.83       0.88      0.88
Evidence    0.83      0.85       0.85      0.85
SS&V        0.76      0.75       0.83      0.85
G&U         0.79      0.76       0.81      0.75

What the scores tell us about students’ writing proficiency
Figure 4 shows the cumulative percent of students for each objective by course. In all cases there is a difference between the two courses, but the differences are noticeably smaller this year than they were last year. There are a number of possible explanations for this, many of which can interact. These will be discussed in the next section.

It is possible to determine from Figure 4 what percentage of students might be considered proficient at a particular score. Take, for example, a score of 4 on Thesis. At a score of 4 the cumulative percentages of students are 74% for EH 101 and 63% for EH 102. If students above this score are considered proficient, then the corresponding percentages of proficient students would be 26% and 36%, respectively.

(Chart omitted: cumulative percent of students, 0% to 100%, versus score, 1 to 6, for each objective by course.)
Figure 4. Cumulative percent of students versus the score for each objective by Course.

Table 5 shows the calculations for all the objectives at scores of 3, 4, and 5. Table 5 can be used as a guide to where the gains between courses are the greatest and where they are the least. The aggregate gains for Thesis, Response, and Evidence are essentially equal. There are smaller gains for SS&G and G&U.

Year-over-year comparisons
As noted above, the differences in the scores on the objectives are smaller in 2008-09 than they were in 2007-08. This shows up clearly in Table 5 as well, where the gains are smaller. Note, however, that the percentages of students who could be considered “proficient” at different score levels are almost uniformly higher in 2008-09. Figure 5 shows that the scores are higher and the slopes of the lines are generally less steep for the 2008-09 scores. Does this necessarily mean that students did not improve as much in 2008-09 as they did in the previous year? Not necessarily.


Table 5. Percentages of “proficient” students by course at scores of 3, 4, and 5 on each objective. See Year-over-year comparisons below for discussion of 2007-08 data.

2007-08 2008-09

Cutoff Score EH 101 EH 102 Gain EH 101 EH 102 Gain

Thesis 3 52.9% 74.8% 21.8% 62.9% 81.0% 18.1%

4 20.0% 38.8% 18.8% 25.8% 36.7% 10.9%

5 2.5% 12.6% 10.1% 0.0% 6.3% 6.3%

Response 3 49.6% 71.8% 22.3% 65.2% 82.9% 17.7%

4 15.0% 33.0% 18.0% 29.8% 43.7% 13.9%

5 0.4% 11.2% 10.7% 0.6% 6.3% 5.8%

Evidence 3 44.6% 65.0% 20.5% 50.6% 69.0% 18.4%

4 15.0% 29.1% 14.1% 18.5% 34.8% 16.3%

5 0.4% 9.7% 9.3% 1.1% 3.8% 2.7%

SS & G 3 53.8% 72.8% 19.1% 79.2% 89.2% 10.0%

4 12.5% 27.7% 15.2% 26.4% 39.2% 12.8%

5 2.1% 6.3% 4.2% 0.6% 3.8% 3.2%

G & U 3 54.2% 71.8% 17.7% 75.8% 88.6% 12.8%

4 13.3% 31.6% 18.2% 22.5% 38.0% 15.5%

5 2.1% 6.3% 4.2% 0.0% 2.5% 2.5%

One possible explanation for the smaller gain is purely statistical. There are actually very few scores of 6 on any of the objectives; they comprise at most 3% of the scores. For practical purposes the ceiling for scoring can be considered to be 5 instead of 6, which would lower the highest scores awarded and hence the overall mean. Thus, if the EH 101 scores are higher for some reason, and there is range compression for EH 102, this would account for a smaller difference between the two courses.
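The range-compression argument can be illustrated with a small simulation: when an effective ceiling of 5 clips the top of the scale, a cohort that starts higher shows a smaller measured gain even if the underlying improvement is identical. The numbers below are made up for illustration.

# Small numeric sketch of the ceiling/range-compression argument.
# Scores and parameters are invented solely to illustrate the mechanism.
import numpy as np

rng = np.random.default_rng(2)

def mean_gain(start_mean, true_gain, ceiling=5, n=1000):
    pre = np.clip(rng.normal(start_mean, 0.8, n), 1, ceiling)
    post = np.clip(pre + true_gain, 1, ceiling)   # gains near the ceiling are clipped
    return post.mean() - pre.mean()

print(f"low-starting cohort:  {mean_gain(3.2, 0.6):.2f}")   # close to the true 0.6 gain
print(f"high-starting cohort: {mean_gain(4.0, 0.6):.2f}")   # compressed by the ceiling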

While the ceiling effect can explain a smaller difference, it does not explain why the scores are higher in the first place. One possible explanation is the difficulty of the prompt. There are several indicators of difficulty or, more appropriately, readability. One can compute the Flesch Reading Ease index directly in MS Word and on a variety of web sites. This index is based on numbers of sentences, words, and syllables per word. It is also possible to compute a Fog index and a Lexile score, both of which attempt to incorporate factors beyond simple counts, like frequency of word use in ordinary discourse.
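For reference, the Flesch Reading Ease index can also be computed directly from its published formula. The sketch below uses a rough vowel-group heuristic for syllables, so its output will differ somewhat from MS Word's; the sample text is an excerpt from the Topic 1 passage.

# Sketch of computing the Flesch Reading Ease index for a prompt.
# The syllable counter is a crude vowel-group heuristic, so the result is
# only approximate; the formula itself is the standard published one.
import re

def count_syllables(word: str) -> int:
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * (len(words) / sentences) - 84.6 * (syllables / len(words))

prompt = ("People are always imagining new ways to live, and then figuring out "
          "ways to remake the world to suit what they've imagined.")
print(round(flesch_reading_ease(prompt), 1))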

(Chart omitted: mean score on each objective, roughly 3.0 to 5.0, for EH 101 and EH 102 in 2007-08 and 2008-09.)
Figure 5. A comparison of scores between 2007-08 and 2008-09.


The interesting, and most curious, result of computing these indexes for the four prompts used over the last two years is that they all show an inverse relation between readability and the mean score. That is, the more “difficult” the prompt was to read, the higher the students scored. What is even more remarkable is that there is an almost perfect correlation between the scores and both the number of sentences and the number of words in the prompt. The correlations between the scores and the Fog index and the Lexile score are much lower at 0.75 and 0.63, respectively. A higher number means a more difficult prompt for these two indexes.

The strength and consistency of the relation of these indexes to the scores suggest that there is something fundamental going on, but precisely what will have to be explored further in the light of other possibilities. For example, were the readers more lenient in their scoring? Were the EH 101 students actually academically stronger to begin with, and did they improve even more over the two courses? Were the statements of the assertions provided by the faculty more or less challenging? Were the assertions at odds with how someone else may have interpreted the quoted text? All these things may contribute something to the outcome that we cannot parse out at this stage.

Discussion

The following three questions were raised in the report of last year’s results.

1. Is it possible to eliminate significant reader differences, both at the individual and pair levels?

a. Not yet.

2. How can the percentages of “proficient” students be improved overall and, in particular, in SS & G and G & U?

a. The percentages of proficient students appear to have increased, but the data on readability suggest that the improvement may have been a function of the prompts themselves.

3. Although the effect of Topic was not significant, students in both courses scored consistently lower on Topic 2 on all the objectives. What hypotheses might one form about why this might be the case, and how might these affect the formulation of writing prompts in the future?

a. The readability data suggest some very strong relations between simple characteristics of the text and the responses.

Recommendation

The results presented here suggest that a closer examination of the effect a writing prompt has on student responses is warranted. It is hard to believe that such simple measures as the number of sentences and words are so strongly correlated with how the writing was scored. It is even harder to accept that the more difficult the prompt, the better the scores. Perhaps a linguistic analysis would shed some light on this. In any case, it suggests that prompts for future writing assignments should be scrutinized very closely.


A report from the 

Office of Planning and Analysis

University of Alabama at Birmingham 934‐2226 

Title:  Analysis of an assessment of writing samples from EH 102 for the Spring semester of 2010 

Prepared by:  David Corliss, Ph.D. 

Director, Special Assessment Projects 

Prepared for:  Dr. Peggy Jolly, Dr. Peter Bellis, Dr. Marilyn Kurata 

Copied to:  Dr. Glenna Brown, Dr. Philip Way 

Date:  August 23, 2010 

Confidential:  No 

Summary:  While there are indications of relative strengths and weaknesses in certain aspects of students’ writing abilities at the end of EH 102, there were multiple, uncontrolled sources of variability and possible biases in the measurement design that cloud the interpretation of the data in this single year snapshot.  If action plans to improve student learning are designed on the basis of these findings, it will not be possible to use these findings as a baseline to determine whether these actions were effective or not because of the potential shift in the makeup of the future writing assignments. It is recommended that the measurement design be reworked to standardize on the writing assignments, the student preparation, and how the students are judged against the rubric. One possible approach is to use the reflective essay that was originally proposed by the faculty. 

  


Introduction
In 2007‐08 and 2008‐09 the faculty in EH 101 and EH 102 participated in a systematic approach to measuring changes in student writing ability from the beginning of EH 101 in the fall to the end of EH 102 in the spring. Because of the measurement design it was possible to show statistically significant improvements in writing ability. Previous reports describe the methods and results in detail. Suffice it to say here that the hallmark of the method used was that it controlled for as many sources of variability and bias as possible.

The process used in 2009‐10 involved scoring one untimed essay from each student at the end of EH 102 using the rubric in Appendix I. The assignments varied by faculty member. This creates a situation where measures of improvement in learning can only be made by comparing one year to the next under conditions that introduce many uncontrolled variables.

Scoring Methods
Here are the instructions on how the scoring was to be organized:

1. Select 3 randomly selected papers from each of the 40 sections, along with a copy of the prompt.
2. Affix the appropriate identifying label to each paper.
3. Create a single document that has all 40 prompts (retyped) and numbered in section order. Make 10 copies of this for the 10 readers.
4. Assign your slowest readers to letters E and I. Otherwise it does not matter.
5. To start the process, create 10 stacks of 12 papers each organized by reader 1. You might want to create a sign for each stack that has the reader letter on it. Put the document containing the prompts on top. It does not matter what the order of the papers is within a stack as long as the reader associates the paper with the right prompt and picks the right section and paper column on the scoring sheet to enter the scores. Things will get shuffled after the first reading anyway.

6. When reader 1 finishes a paper, he or she should cross out the reader 1 letter on the label and put the paper in a separate stack for reader 2. 

7. Each reader should complete the initial 12 papers before reading those that have been completed by reader 1. 

8. When reader 2 finishes a paper, he or she should cross out the reader 2 letter and put the paper in the completed stack. 

9. A reader is done when all the columns on the score sheet are filled in. 

The readers were organized in a very specific way to distribute the workload evenly and get as much mix and match as possible given 10 readers, 40 sections, and 3 papers per section (not exactly a recipe for evenly distributing things): 

1. Every reader is paired with every other reader at least once.
2. Every reader is first on 12 papers and second on 12 papers.
3. Every reader therefore reads 24 papers.
4. Every paper is read twice.
5. One reader reads all three papers from a section, paired with three different readers.
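For illustration only, the following sketch constructs one assignment scheme that satisfies the five constraints above and then verifies them. It is a hypothetical construction, not the procedure actually used; in particular, it yields 2 to 3 papers per reader pair rather than the 2 to 6 of the original design.

```python
from collections import Counter
from itertools import combinations

readers = list("ABCDEFGHIJ")           # 10 readers
n_sections = 40                        # 3 papers per section

# assignments[(section, paper)] = (first_reader, second_reader)
assignments = {}
for s in range(n_sections):
    p = s % 10                         # first reader for every paper in this section
    block = s // 10                    # 0..3; the last block reuses offsets 1-3
    offsets = range(3 * block + 1, 3 * block + 4) if block < 3 else range(1, 4)
    for paper, d in enumerate(offsets):
        q = (p + d) % 10               # second reader, a different one for each paper
        assignments[(s, paper)] = (readers[p], readers[q])

firsts = Counter(a for a, _ in assignments.values())
seconds = Counter(b for _, b in assignments.values())
pairs = Counter(frozenset(ab) for ab in assignments.values())

assert all(firsts[r] == 12 and seconds[r] == 12 for r in readers)      # constraints 2 and 3
assert all(frozenset(c) in pairs for c in combinations(readers, 2))    # constraint 1
print(sorted(pairs.values()))          # papers per reader pair (2 or 3 in this construction)
```

Every paper in the dictionary has exactly two readers (constraint 4), and the first reader of a section is paired with a different second reader on each of its three papers (constraint 5).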

Page 85: VI: Impact Report of the Quality Enhancement Plan · A. Improved basic writing and math skills => students are better prepared for advanced courses B. Improved competency in general

The readers went through a training session using papers selected to cover the quality range. The intent was to calibrate each reader’s standards against the descriptions in the rubric and come to some common understanding of how to interpret the scores.

Results

The number of papers collected was 108 from 40 sections. Readers scored an average of 22 papers each, with a high of 24 and a low of 19. The number of papers scored by reader pairs ranged from 2 to 8; the original design called for a range of 2 to 6. Eighty-seven papers were read twice.

Figure 1 shows that Reader D was 1.2 points higher and Reader I was 1.7 points lower than the grand mean, making the range 2.9 points. The scores for Reader I were significantly different from all the others.

 Figure 1. Mean scores by reader for the three item groups and the overall average. (CTRW: Critical Thinking, Reading, Writing; RK: Rhetorical Knowledge; KC: Knowledge of Conventions) 

Figure 2 shows the mean scores by section with Reader I excluded. There were six sections where the difference between the section mean and the grand mean of 5.4 was greater than or equal to 1. Section 24 was 1.3 points higher and Section 13 was 1.1 points lower, making the range in this case 2.4.

There were 87 papers that were read by two readers. The mean difference in the average score between the two was 1.1 with a standard deviation of 0.8. The median and mode were 0.8 and 0.5, respectively. The differences ranged from 0.0 to 3.7. There was no systematic relationship between the mean scores of the two readers and the differences. 
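For reference, the between-reader agreement statistics quoted above can be computed with a few lines of code. This is a minimal sketch assuming the two average scores for each double-read paper are available as parallel lists; the values shown are fabricated, not the actual scores.

```python
import numpy as np

def agreement_summary(first_scores, second_scores):
    """Summarize the absolute difference between the two readers of each paper."""
    diffs = np.abs(np.asarray(first_scores, float) - np.asarray(second_scores, float))
    return {"mean": diffs.mean(), "sd": diffs.std(ddof=1),
            "median": float(np.median(diffs)), "min": diffs.min(), "max": diffs.max()}

# Example with made-up average scores for three double-read papers:
print(agreement_summary([5.2, 6.0, 4.1], [4.9, 4.8, 5.5]))
```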

Figure 3 shows the variation by reader pair, with Reader I included to make sure that all pairs were represented. In this case there were six pairs for which the difference between their mean and the grand mean of 5.3 was greater than or equal to 1.0. The range was a surprisingly large 3.7 (-1.9 to 1.8).

 Figure 2. Mean scores by section for the three item groups and the overall average. (CTRW: Critical Thinking, Reading, Writing; RK: Rhetorical Knowledge; KC: Knowledge of Conventions) 

 Figure 3. Mean scores by reader pair for the three item groups and the overall average. (CTRW: Critical Thinking, Reading, Writing; RK: Rhetorical Knowledge; KC: Knowledge of Conventions) 

 



Figure 4 shows that performance on rubric item RK-1 (stating purpose of essay) was rated the highest while item KC-3 (uses MLA conventions) was rated the lowest. Table 1 indicates that RK-1 was significantly different from six of the other items while KC-3 was significantly different from five other items. The p-values in this table were not corrected for multiple comparisons and are to be used as indicators of differences that merit attention.

 Figure 4. Box and whisker plot of the mean score for each of the rubric items. (CTRW: Critical Thinking, Reading, Writing; RK: Rhetorical Knowledge; KC: Knowledge of Conventions) 

Table 1. Rubric item pairs that are significantly different. (Note: no correction of p‐values was made for multiple comparisons) 

Item Pair          Mean 1  Mean 2  t-value  df   p       SD 1  SD 2  N
RK-1 vs. CTRW-1    5.67    5.21    2.9      430  0.0046  1.7   1.6   216
RK-1 vs. CTRW-2    5.67    5.21    2.8      430  0.0046  1.7   1.6   216
RK-1 vs. RK-3      5.67    5.26    2.6      430  0.0098  1.7   1.5   216
RK-1 vs. KC-1      5.67    5.35    2.0      430  0.0485  1.7   1.6   216
RK-1 vs. KC-2      5.67    5.27    2.5      430  0.0114  1.7   1.5   216
RK-1 vs. KC-3      5.67    5.03    3.9      430  0.0001  1.7   1.7   216
KC-3 vs. CTRW-3    5.03    5.48    -2.9     430  0.0039  1.7   1.6   216
KC-3 vs. RK-2      5.03    5.42    -2.6     430  0.0096  1.7   1.5   216
KC-3 vs. RK-4      5.03    5.42    -2.6     430  0.0105  1.7   1.5   216
KC-3 vs. KC-1      5.03    5.35    -2.0     430  0.0429  1.7   1.6   216
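The comparisons in Table 1 are pairwise t-tests on item-level scores. The sketch below shows one way such comparisons could be run, and adds a simple Bonferroni adjustment to address the multiple-comparison caveat noted above. The function and its inputs are hypothetical; this is not the code used to produce the table.

```python
from itertools import combinations
from scipy import stats

def pairwise_item_tests(item_scores, alpha=0.05):
    """Two-sample t-tests for every pair of rubric items.

    item_scores maps an item name (e.g. 'RK-1') to a list of all individual
    ratings for that item. Returns one row per pair with t, p, and a flag
    for significance after a Bonferroni correction."""
    pairs = list(combinations(item_scores, 2))
    corrected_alpha = alpha / len(pairs)             # simple Bonferroni adjustment
    rows = []
    for a, b in pairs:
        t, p = stats.ttest_ind(item_scores[a], item_scores[b])
        rows.append((f"{a} vs. {b}", round(t, 2), round(p, 4), p < corrected_alpha))
    return rows

# Illustrative call with fabricated ratings (the real data have 216 ratings per item):
print(pairwise_item_tests({"RK-1": [6, 5, 7, 5], "KC-1": [5, 6, 5, 5], "KC-3": [4, 5, 5, 3]}))
```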

 

[Figure 4 plot: mean score by rubric item, shown as Mean, Mean±SE, and Mean±1.96*SE, with Reader I excluded.]


Discussion

Under ideal measurement conditions the findings presented in Figure 4 and Table 1 would indicate where students are strongest and weakest, even though the statistically significant effect sizes are small. The mean score for RK-1 is the highest, which means that students are generally stating the purpose of the essay clearly enough. All the items that differ significantly from RK-1 suggest that students are not developing the essay to the same proficiency level, nor are they demonstrating knowledge of conventions at as high a level. The lowest-ranking item is KC-3, knowledge of MLA conventions.

The previous paragraph started by suggesting that the conditions under which these scores were obtained were not ideal. The first issue is that not all students faced the same challenge. The instructions for the writing assignment varied dramatically from one instructor to another. For example, one stated that, “…you will evaluate a print advertisement of your choice.” Another said, “For this paper, you will argue the value or lack of value of a particular thing.” Yet another asked students to “write an argumentative essay ‘in which you reveal an argument that people would not otherwise recognize.’”  

These topics present quite different challenges. Such differences will inevitably contribute to the variability in scores by section shown in Figure 2. Previous analyses of the timed EH 101 and EH 102 assignments that used only two very similar prompts based on the discussion book suggested that there is a significant effect of the prompt itself. The current measurement design makes it impossible to test for any differential effects of the assignment, particularly with only three essays per section. 

The variability introduced by the multiple different assignments was likely further compounded by the fact that the readers did not have the assignments for many of the papers when they did the scoring. This means that some readers were reading some essays without complete knowledge of what the students were supposed to be doing. One could argue that there is some value in not knowing what the student was supposed to be conveying, but without a level playing field of assignments, not everyone is really being measured against the same ruler.  

Consider the consequences of the absence of standardization by thinking about the difference between RK‐1 (stating the purpose of the essay) and CTRW‐1 (summarizing the ideas of others). On a relative basis, students were less proficient on CTRW‐1 than on RK‐1. This would lead one to suppose that concentrating on that over the next year might improve students’ ability to summarize the ideas of others. The problem is that there were differential requirements for citing outside sources. Of the available assignments, seven required a minimum of three, two a minimum of four, one at least six, and two did not specify; one required viewing a film. Readers who scored these essays without knowledge of the assignment may have been looking for more sources than were actually required and scored the essays on this characteristic lower than they might have otherwise. 

Now suppose that the mix of requirements is different next year and the scoring is done with all the assignments known to the readers. Further suppose that there is improvement in CTRW-1. Could it be attributed to increased emphasis on summarizing the results of others? The answer is that it is impossible to tell, since neither the current measures nor the future measures are standardized. The only way to be able to attribute changes in student performance to changes in pedagogy is to use measurement methods that remain as constant as possible over time. Simply shifting the writing prompt is enough to shift scores without changing pedagogy.

Recommendation

Since the purpose of this kind of assessment, which aggregates student work products from across faculty and sections, is to identify aspects of writing where students are relatively weak and to work to improve student performance in those areas, it is essential that the measurement methodology be standardized. The first requirement is to create writing assignments that are equally challenging to all students. To do otherwise is like asking students to clear hurdles that vary in height from one lane to the next. Standardized writing assignments lead naturally to the second requirement, namely, to create a training regimen that enables students of equivalent ability to clear equivalent hurdles. The third requirement is that judges of student performance use equivalently calibrated measurement scales. If one judge expects a really high level of performance and another is willing to give students the benefit of the doubt, there is no way to identify the real winners because of the variability.

Preliminary discussions with faculty focused on the idea of using a reflective essay based on the students’ e‐portfolios as the work product that would be evaluated. That would be one strategy that gets away from the short, timed essays that were used in the past two years. Providing uniform instructions on what is expected from a reflective essay and backing those with appropriate instruction would be steps in the right direction toward standardization. 

 

 


APPENDIX I. Scoring Rubric

Each item is scored NA or on a 0-8 scale. The descriptors below complete the stems “The student who is off target…,” “The student at Level 1…,” “The student at Level 2…,” and “The student at Level 3….”

CTRW-1
  Off target: does not summarize the ideas of others.
  Level 1: summarizes the ideas of others.
  Level 2: summarizes the ideas of others and offers analytical comments about those ideas.
  Level 3: summarizes the ideas of others, offers analytical comments about those ideas, and expresses original thoughts in response.

CTRW-2
  Off target: incorporates no outside sources.
  Level 1: incorporates outside sources.
  Level 2: incorporates outside sources and demonstrates an ability to synthesize divergent thoughts of others.
  Level 3: incorporates outside sources and demonstrates an ability to synthesize divergent thoughts of others as a means of developing original thoughts.

CTRW-3
  Off target: does not develop ideas in any consistent way.
  Level 1: uses generalities to develop ideas.
  Level 2: uses generalities to develop ideas and then moves into more concrete examples.
  Level 3: uses generalities to develop ideas, then moves into more concrete examples, and uses specific language to develop those examples.

RK-1
  Off target: does not state in any clear way the purpose of the essay.
  Level 1: vaguely hints at the purpose of the essay.
  Level 2: clearly states the purpose of the essay.
  Level 3: clearly states and reinforces the purpose of the essay.

RK-2
  Off target: seems to have no identifiable audience in mind.
  Level 1: indirectly and inconsistently addresses an audience.
  Level 2: addresses an audience throughout most of the essay.
  Level 3: identifies with an audience through specific references to its own experience in relationship to the writer’s experience.

RK-3
  Off target: uses no format that would fit any rhetorical situation.
  Level 1: uses identifiable formats for writing, even if the formats do not exactly fit the rhetorical situation.
  Level 2: uses formats and genres that respond effectively to the constraints of a rhetorical situation.
  Level 3: tailors a unique structure to the specific constraints of a rhetorical situation.

RK-4
  Off target: does not use voice and tone appropriate to the constraints of the rhetorical situation.
  Level 1: inconsistently uses voice and tone that are appropriate to the constraints of a rhetorical situation.
  Level 2: consistently uses voice and tone that are appropriate to the constraints of a rhetorical situation.
  Level 3: consistently and effectively uses voice and tone that are appropriate to the constraints of a rhetorical situation.

KC-1
  Off target: uses syntax and a vocabulary clearly below college level.
  Level 1: produces minimal syntactical variety and uses a limited vocabulary.
  Level 2: produces some syntactical variety and a vocabulary acceptable for academic discourse.
  Level 3: produces a range of syntactical variety and a controlled, sophisticated vocabulary.

KC-2
  Off target: demonstrates no knowledge of SEAE conventions.
  Level 1: demonstrates adequate but inconsistent control of SEAE conventions of grammar and usage.
  Level 2: demonstrates adequate control of SEAE conventions of grammar and usage.
  Level 3: demonstrates consistent control of SEAE conventions of grammar and usage.

KC-3
  Off target: demonstrates no knowledge of MLA format.
  Level 1: demonstrates inconsistent control of MLA format in parenthetical and works cited documentation.
  Level 2: demonstrates adequate control of MLA format in parenthetical and works cited documentation.
  Level 3: demonstrates consistent control of MLA format in parenthetical and works cited documentation.

CTRW: Critical Thinking, Reading, Writing; RK: Rhetorical Knowledge; KC: Knowledge of Conventions


A report from the Office of Planning and Analysis
University of Alabama at Birmingham 934-2226

Title: An Analysis of the Defining Issues Test (DIT-2) for Entering Freshmen, Summer 2007
Prepared by: David Corliss, Ph.D., Director, Special Assessment Projects
Prepared for: Dr. Marilyn Kurata, Dr. Philip Way
Copied to: Dr. Glenna Brown, Dr. Dan Osborn
Date: October 2007
Confidential: No

Summary: The DIT-2 was administered to a total of 94 randomly selected freshmen during summer 2007 orientations. Four students were dropped from the scoring because of flaws in their responses. The UAB Freshmen scores matched a Freshmen Norms group (N = 2,096) very closely. Their profile indicated that the predominant ethical schema was Maintaining Norms. A person operating from this schema is basing his or her decisions primarily on laws, conventions, and the social order. A comparison to a Senior Norms group suggests that we should expect our seniors to operate according to what is called the Post Conventional schema. In this case ethical decisions take into account the possibility that laws and social conventions may be biased and limiting. They “therefore appeal to the moral purposes and ideals that undergird social law and order rather than the laws themselves (Rest et al. 1999a, 38–43).” To ultimately measure the effectiveness of initiatives related to the Ethics and Civic Responsibility component of the QEP and the Core Commitments Project, it is essential as a first step to test graduating seniors in the spring to determine whether they also match the profile of a national group. A regular cycle of testing then needs to be continued to establish any longitudinal changes as the culture of the institution changes.


Introduction

As part of the Ethics and Civic Responsibility (ECR) component of the QEP and the AAC&U’s Core Commitments Project, it is necessary to assess students’ ethical development. To begin this process the Defining Issues Test (DIT), an instrument developed at the University of Minnesota, was administered to 94 entering freshmen during the last orientation session in August 2007. The intention is to administer the same instrument to a sample of graduating seniors in the spring semester of 2008 and follow up with regular administrations to freshmen and seniors on an annual basis.

The ideal outcome of this assessment program will be to see specific changes in scores (see below) between the freshmen and senior years. We hypothesize that there will be at least three influences on the changes in scores over time. The first is the natural maturation of students during the college years. The second is the exposure to coursework that influences their ethical development. The third is the increased emphasis on the development of ethical, personal, and social responsibility brought about by cultural changes that naturally follow from the QEP and Core Commitments initiatives. 

Thus, we should immediately see a difference between freshmen and seniors because of the first two factors. Over time, we should see an increasing difference between freshmen and seniors as the culture of the institution changes to emphasize ethics and civic responsibility. 

The data presented here thus provide the first glimpse into where our new freshmen stood as they entered UAB in 2007. The data provided by the developers of the DIT give us norms for freshmen and target values for seniors. The latter essentially become our scorecard criteria in this area.

The DIT

While there is a great deal of background theory and practice on which the DIT is based, it is beyond the scope of this report to discuss the development process that brought the instrument to its present state. The most expedient way of introducing the DIT is to quote directly from the developers’ description and provide an example.

Description

“The DIT is a device for activating moral schemas (to the extent that a person has developed them) and for assessing them in terms of importance judgments. The DIT has dilemmas and standard items; the subject's task is to rate and rank the items in terms of their moral importance. As the subject encounters an item that both makes sense and also taps into the subject's preferred schema, that item is rated and ranked as highly important. Alternatively, when the subject encounters an item that either doesn't make sense or seems simplistic and unconvincing, the item receives a low rating and is passed over for the next item. The items of the DIT balance "bottom up" processing (stating just enough of a line of argument to activate a schema) with "top down" processing (not a full line of argument so that the subject has to "fill in" the meaning from schema already in the subject's head). In the DIT we are interested in knowing which schemas the subject brings to the task (are already in the subject's head). Presumably those are the schemas that structure and guide the subject's thinking in decision-making beyond the test situation.”1

Example

The following is an example of one of the dilemmas that is used in the DIT. It is called “Heinz and the Drug.”

1 http://www.centerforthestudyofethicaldevelopment.net/Instruments,%20Services,%20and%20Materials.htm


In Europe a woman was near death from a special kind of cancer. There was one drug that doctors thought might save her. It was a form of radium that a druggist in the same town had recently discovered. The drug was expensive to make, but the druggist was charging ten times what the drug cost to make. He paid $200 for the radium and charged $2,000 for a small dose of the drug. The sick woman's husband, Heinz, went to everyone he knew to borrow the money, but he could only get together about $1,000, which is half of what it cost. He told the druggist that his wife was dying, and asked him to sell it cheaper or let him pay later. But the druggist said, "No, I discovered the drug and I'm going to make money on it." So Heinz got desperate and began to think about breaking into the man's store to steal the drug for his wife. Should Heinz steal the drug? 

Please rate the following statements in terms of their importance in making a decision about what to do in the dilemma. (1=Great importance, 2=Much importance, 3=Some Importance, 4=Little importance, 5=No importance)  

1. Whether a community's laws are going to be upheld.
2. Isn't it only natural for a loving husband to care so much for his wife that he'd steal?
3. Is Heinz willing to risk getting shot as a burglar or going to jail for the chance that stealing the drug might help?
4. Whether Heinz is a professional wrestler, or had considerable influence with professional wrestlers.
5. Whether Heinz is stealing for himself or doing this solely to help someone else.
6. Whether the druggist's rights to his invention have to be respected.
7. Whether the essence of living is more encompassing than the termination of dying, socially and individually.
8. What values are going to be the basis for governing how people act towards each other.
9. Whether the druggist is going to be allowed to hide behind a worthless law which only protects the rich anyhow.
10. Whether the law in the case is getting in the way of the most basic claim of any member of society.
11. Whether the druggist deserves to be robbed for being so greedy and cruel.
12. Would stealing in such a case bring about more total good for the whole society or not.

Now please rank the top four most important statements. Put the number of the statement in the blank:

____ Most important item
____ Second most important item
____ Third most important item
____ Fourth most important item

The Scores

There are four scores reported. Three of them are direct measures of what have come to be called moral reasoning schemas. They are:

• Personal Interest. A person who ranks the importance of each of the statements following a dilemma from the perspective of this schema is, in theory, basing his or her decisions primarily on self interest. 

• Maintaining Norms: A person operating from this schema is basing his or her decisions primarily on laws, conventions, and the social order. 

• Post Conventional: Ethical decisions made from the perspective of this schema take into account the possibility that laws and social conventions may be biased and limiting. They “therefore appeal to the moral purposes and ideals that undergird social law and order rather than the laws themselves (Rest et al. 1999a, 38–43).”2 

2 Rizzo AM, Swisher LL. 2004 Comparing the Stewart–Sprinthall Management Survey and the Defining Issues Test-2 as Measures of Moral Reasoning in Public Administration. Journal of Public Administration Research and Theory, Vol. 14, 335-343.


These three scores represent a progression from what might be considered the very shallow, almost childish, perspective of self-interest to one where deeper, more philosophical ideas are considered in any ethical reasoning situation. The three scores are actually the percentages of responses that the respondent chooses that reflect each of the three schemas. Thus one would expect that more mature groups would choose more Post Conventional arguments at the expense of the other two.

The fourth score is called the N2 Score. It is derived from combinations of the others and, according to the Guide, is more sensitive to longitudinal changes. It is very closely related to the Post Conventional score. 

Results

Figure 1 shows the reciprocal relationships among the schema scores. As the Personal Interest score increases, both of the others decrease. Likewise, as the Maintain Norms score increases, the Post Conventional score decreases.

Table 1 shows the results for the sample of UAB freshmen and compares their scores with those reported in the Guide for DIT‐2 for a large sample of freshmen and seniors. The ideal pattern is the highest score on the Post Conventional schema and the lowest on the Personal Interest schema. The scores for the Senior Norms show just such a pattern. 

The UAB Freshmen and the Freshmen Norms groups show the highest score on the Maintain Norms schema and the lowest on the Personal Interest schema. This is consistent with the notion that their ethical development is moving in the right direction but is not yet dominated by the Post Conventional schema. 

 

The DIT developers invented a type indicator to capture the degree to which a schema is dominant in an individual and whether that individual is “consolidated” or “transitional.”

Table 1. DIT-2 Schema scores for a sample of UAB entering freshmen, August 2007. Four of the 94 students who participated were dropped because of flaws in their responses. Asterisks indicate statistically significant differences when these numbers are treated as continuous variables rather than proportions. None of the differences are significant when treated as proportions. See text for description of scores.

                    UAB Freshmen 2007   Freshmen Norms   UAB -       Senior Norms    UAB -
                    (N=90)              (N=2,096)        Freshmen    (N=2,441)       Seniors
                    Mean     SD         Mean     SD      Difference  Mean     SD     Difference
Personal Interest   28.2     12.4       28.5     12.3    -0.3        24.8     12.53  3.4*
Maintain Norms      37.2     14.4       33.6     13.0    3.6*        32.4     14.01  4.8*
Post Conventional   30.2     13.9       32.3     13.9    -2.1        37.84    15.44  -7.6*
N2 Score            30.5     13.8       31.0     14.4    -0.5        36.85    15.53  -6.4*
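The starred differences in Table 1 can be approximated directly from the summary statistics. The following is a minimal sketch using a Welch two-sample t-test on the Maintain Norms row; the report does not state exactly which test was used, so this is illustrative rather than a reproduction.

```python
from scipy.stats import ttest_ind_from_stats

# Maintain Norms: UAB freshmen 2007 vs. the published freshmen norms (values from Table 1).
t, p = ttest_ind_from_stats(mean1=37.2, std1=14.4, nobs1=90,
                            mean2=33.6, std2=13.0, nobs2=2096,
                            equal_var=False)        # Welch's t-test
print(f"t = {t:.2f}, p = {p:.4f}")                  # a small p is consistent with the asterisk
```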


 Figure 1. Relationships among the scores on the three schemas.


As the words imply, a person who is consolidated shows a clear preference for a particular schema, while one who is transitional shows no clear evidence of preferring one schema over another, though one may still predominate. They grouped individuals into types based on a combination of which schema predominated and the degree to which the others were represented. The following table shows their classification scheme:

Type   Predominant Schema   Secondary Schema     Transitional/Consolidated
1      Personal Interests   --                   Consolidated
2      Personal Interests   --                   Transitional
3      Maintaining Norms    Personal Interests   Transitional
4      Maintaining Norms    --                   Consolidated
5      Maintaining Norms    Post Conventional    Transitional
6      Post Conventional    --                   Transitional
7      Post Conventional    --                   Consolidated

 Figure 2 shows the distribution of UAB Freshmen by Type Indicator. Note that there is only one person in the Type 1 category. This is good—one would hope that college freshmen view ethical issues in broader terms than complete personal interest. 

The mode of this distribution is Type 3, the median is Type 4, and the mean is 4.2. These values indicate that the Maintaining Norms schema is predominant in this group. In spite of the fact that there are 11 students (12%) in the Type 7 category, the difference between the UAB Freshmen and the Freshmen Norms for the Maintaining Norms schema is statistically significant (See Table 1). 

Conclusions

In spite of the small differences between UAB Freshmen and the Freshmen Norms group, the profile of UAB Freshmen on the three schemas essentially matches the Freshmen Norms group. To match the scores of the Senior Norms group, it is clear that the Personal Interest and Maintaining Norms scores have to decrease by at least 3.4 and 4.8 points, respectively, and the Post Conventional score has to increase by 7.6 points. This is not as daunting a task as it may appear since the scores are compensatory as described above. It will be interesting to see whether our current seniors match the Senior Norms as well as the freshmen match the Freshmen Norms.


Figure 2. Distribution of UAB freshmen by Type Indicator.


A report from the Office of Planning and Analysis
University of Alabama at Birmingham 934-2226

Title: An Analysis of the Defining Issues Test (DIT-2) for Seniors, Spring 2008, and Entering Freshmen, Summer 2008
Prepared by: David Corliss, Ph.D., Director, Special Assessment Projects
Prepared for: Dr. Marilyn Kurata, Dr. Harold Kincaid, Dr. Brad Newcomer
Copied to: Dr. Philip Way, Dr. Glenna Brown, Dr. Dan Osborn
Date: May 2009
Confidential: No

Summary: The DIT-2 was administered to a total of 86 randomly selected freshmen during summer 2008 orientations. Five students were dropped from the scoring because of flaws in their responses. The UAB Freshmen scores matched a Freshmen Norms group (N = 2,096) very closely. Their profile indicated that the predominant ethical schema was Maintaining Norms. A person operating from this schema is basing his or her decisions primarily on laws, conventions, and the social order. The DIT-2 was also administered to a total of 64 self-selected seniors. Two of these students were dropped because of flaws in their responses. The seniors’ scores were basically the same as those of the freshmen and significantly different from the National Senior Norms group on two of the three moral reasoning schemas: they were not as high as expected on the Post Conventional moral reasoning schema. This raised two questions. First, can we effect a change by more explicit teaching of moral reasoning? Second, why are our students different in the first place, since they were probably exposed to maturation processes and curricula that were not much different from those of the students in the population to which they are being compared?


Introduction

The 2008-09 year was the first year in which we administered the DIT-2 to a self-selected group of seniors in the spring and a randomly selected group of freshmen in the summer. The results for the seniors were not as predicted based on the normative data in the DIT manual and raise a number of questions that need to be addressed.

The Scores

There are four scores reported. Three of them are direct measures of what have come to be called moral reasoning schemas. They are:

• Personal Interest. A person who ranks the importance of each of the statements following a dilemma from the perspective of this schema is, in theory, basing his or her decisions primarily on self interest. 

• Maintaining Norms: A person operating from this schema is basing his or her decisions primarily on laws, conventions, and the social order. 

• Post Conventional: Ethical decisions made from the perspective of this schema take into account the possibility that laws and social conventions may be biased and limiting. They “therefore appeal to the moral purposes and ideals that undergird social law and order rather than the laws themselves (Rest et al. 1999a, 38–43).”1 

These three scores represent a progression from what might be considered the very shallow, almost childish, perspective of self-interest to one where deeper, more philosophical ideas are considered in any ethical reasoning situation. The three scores are actually the percentages of responses that the respondent chooses that reflect each of the three schemas. Thus, one would expect that more mature groups would choose more Post Conventional arguments at the expense of the other two.

The fourth score is called the N2 Score. It is derived from combinations of the others and, according to the Guide, is more sensitive to longitudinal changes. It is very closely related to the Post Conventional score. 

Results

Table 1 shows the results for the sample of UAB freshmen and compares their scores with those reported in the Guide for DIT-2 for a large sample of freshmen. The ideal pattern is the highest score on the “Post Conventional” schema and the lowest on the “Personal Interest” schema.

The UAB Freshmen and the Freshmen Norms groups show the highest score on the “Maintain Norms” schema and the lowest on the “Personal Interest” schema. This is consistent with the notion that their ethical development is moving in the right direction but is not yet dominated by the “Post Conventional” schema. 

The UAB Freshmen in this year’s sample did better than the national norms in that they scored lower on the Personal Interest schema and higher on the Maintain Norms schema. Their scores on these two schemas were also better than those of the previous year’s freshmen.

1 Rizzo AM, Swisher LL. 2004 Comparing the Stewart–Sprinthall Management Survey and the Defining Issues Test-2 as Measures of Moral Reasoning in Public Administration. Journal of Public Administration Research and Theory, Vol. 14, 335-343.


 

Table 2 shows how UAB Seniors fare against the National Senior Norms. They score higher in the Personal Interest and Maintain Norms schemas and lower in the Post Conventional and on the N2 score. This suggests that the ethical reasoning skills of these students have not completely developed. Table 3 shows that they are essentially no different from the 2008 entering freshmen.

 

The DIT developers invented a type indicator to capture the degree to which a schema is dominant in an individual and whether that individual is “consolidated” or “transitional.” As the words imply, a person who is consolidated shows a clear preference for a particular schema, while one who is transitional shows no clear evidence of preferring one schema to another, though one may still predominate. They grouped individuals into types based on a combination of which schema predominated and the degree to which the others were represented. The following table shows their classification scheme: 

Type   Predominant Schema   Secondary Schema     Transitional/Consolidated
1      Personal Interests   --                   Consolidated
2      Personal Interests   --                   Transitional
3      Maintaining Norms    Personal Interests   Transitional
4      Maintaining Norms    --                   Consolidated
5      Maintaining Norms    Post Conventional    Transitional
6      Post Conventional    --                   Transitional
7      Post Conventional    --                   Consolidated

Table 1. DIT-2 Schema scores for a sample of UAB entering freshmen, August 2008. Five of the 86 students who participated were dropped because of flaws in their responses. Asterisks indicate statistically significant differences when these numbers are treated as continuous variables rather than proportions. See text for description of scores.

                    UAB Freshmen 2008   National Freshmen Norms   UAB Freshmen - National
                    (N=81)              (N=2,096)                 Freshmen Norms
                    Mean     SD         Mean     SD                Difference
Personal Interest   25.5     11.0       28.5     12.3              -3.0*
Maintain Norms      38.0     13.0       33.6     13.0              4.4*
Post Conventional   32.3     13.5       32.3     13.9              0.0
N2 Score            31.9     14.0       31.0     14.4              0.9

Table 2. DIT-2 Schema scores for a sample of UAB graduating seniors in 2008. Two of the 64 students who participated were dropped because of flaws in their responses.

                    UAB Seniors 2008    National Senior Norms     UAB Seniors - National
                    (N=62)              (N=2,441)                 Senior Norms
                    Mean     SD         Mean     SD                Difference
Personal Interest   26.5     13.3       24.8     12.53             1.7
Maintain Norms      38.0     13.7       32.4     14.01             5.6*
Post Conventional   31.3     11.4       37.8     15.44             -6.5*
N2 Score            31.2     11.1       36.8     15.53             -5.6*

Table 3. DIT-2 Schema scores comparing UAB Seniors with UAB Freshmen.

                    UAB Freshmen 2008   UAB Seniors   UAB Seniors -
                    (N=81)              (N=62)        UAB Freshmen
                    Mean     SD         Mean     SD   Difference
Personal Interest   25.5     11.0       26.5     13.3  1.0
Maintain Norms      38.0     13.0       38.0     13.7  0.0
Post Conventional   32.3     13.5       31.3     11.4  -1.0
N2 Score            31.9     14.0       31.2     11.1  -0.7


Figure 1 shows the distributions of UAB Freshmen and Seniors by Type Indicator. The mode of the 2007 Freshmen distribution was Type 3, the median was Type 4, and the mean was 4.2. For the 2008 Freshmen the modes are Types 3 and 4, the median is 4, and the mean is 4.5. The corresponding numbers for the seniors are 6, 5, and 4.6, respectively.

Discussion

Last year’s report indicated the degree of change that should have been observed between the freshmen and senior samples. This was based on a comparison of the freshmen scores to the senior national norms provided in the user guide. It is clear from the data that the seniors were significantly different from the national norms on two of the three schema scores. While the magnitudes of the differences in the schema scores are quite large (Table 2), the Type Indicator data in Figure 1 nevertheless suggest movement in the right direction. Seniors are more likely than freshmen to be in the transitional state between the Maintaining Norms and Post Conventional schemas.

The national comparison data provided in the report show a clear shift toward the higher Type Indicators as education level increases. The data on which these norms are based come from a broad cross-section of institutions. It is probably safe to assume that very few of them dealt explicitly with developing students’ moral reasoning schemas. If this is true, then the data suggest that the observed differences between freshmen and seniors may be due in part to normal maturation and in part to an unplanned byproduct of the educational process.

This reasoning raises two questions. The first, and more important, is, can we effect a shift by explicitly emphasizing moral reasoning schemas in our designated courses? 

The first question raises several issues. Since the DIT is based on a very specific model of moral reasoning and how it can be measured, should our pedagogy incorporate that model? Faculty may raise objections to this idea because it is nothing more than “teaching to the test.” Furthermore, there are some studies of the DIT that indicate that it can be “gamed” if students are explicitly taught the schemas and how they work. If the test can be “gamed” in a way that moves scores more toward the Post Conventional schema, does that necessarily mean that students will actually make judgments that way in the real world? In other words, will anything substantive be gained by addressing what the DIT measures? Perhaps, if students actually think twice before taking an action, it will.

The second question is, why are our students different if they have, in fact, been exposed to essentially the same curricula as other students and mature at the same rate? It is unfortunate that there is not enough information in the data set to even get a hint at an answer. But this difference does present the opportunity to determine whether an explicit intervention can make a difference.

[Figure 1 plot: percent of students at each Type Indicator (1-7), shown separately for Freshmen 2007, Freshmen 2008, and Seniors 2008.]

Figure 1. Distribution of UAB Freshmen and UAB Seniors by Type Indicator.


A report from the Office of Planning and Analysis
University of Alabama at Birmingham 934-2226

Title: An Analysis of the Defining Issues Test (DIT-2) for Seniors, Spring 2008 and 2009, and Entering Freshmen, Summer 2008 and 2009
Prepared by: David Corliss, Ph.D., Director, Special Assessment Projects
Prepared for: Dr. Marilyn Kurata, Dr. Colin Davis, Dr. Harold Kincaid
Copied to: Dr. Philip Way, Dr. Glenna Brown
Date: September 2010
Confidential: No

Summary: The DIT-2 was administered to 86 freshmen during summer 2008 orientations, 64 seniors in the spring of 2009, 77 freshmen during summer 2009 orientations, and 54 seniors during the spring of 2010. All students were volunteers. The effect of cohort on four moral judgment scores was tested using MANOVA. While there was a statistically significant effect of cohort on the N2 Index, it cannot be attributed to any systematic developmental differences between freshmen and seniors. The overall results are similar to those described in the May 2009 report: both freshmen and seniors show some significant differences from national norms, but the seniors who take this test are not significantly different from the freshmen, even though there is a small trend in the right direction. There are no consistent longitudinal trends.
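A cohort effect of the kind described in the summary can be tested with a one-way MANOVA on the four scores. The sketch below uses statsmodels on simulated data; the column names, cohort labels, and values are placeholders for illustration, not the actual data file or analysis code.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
cohort = np.repeat(["Freshmen 2008", "Seniors 2009", "Freshmen 2009", "Seniors 2010"], 60)
df = pd.DataFrame({
    "cohort": cohort,
    # Simulated scores roughly on the DIT-2 scale; real values would come from the score file.
    "personal": rng.normal(27, 12, cohort.size),
    "maintain": rng.normal(37, 13, cohort.size),
    "postconv": rng.normal(32, 13, cohort.size),
    "n2":       rng.normal(29, 13, cohort.size),
})

# Do the four scores jointly differ by cohort?
fit = MANOVA.from_formula("personal + maintain + postconv + n2 ~ cohort", data=df)
print(fit.mv_test())        # Wilks' lambda, Pillai's trace, etc. for the cohort effect
```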


Introduction

The Defining Issues Test (DIT) is designed to assess the development of moral reasoning, not one’s ethics or morality per se. Entering freshmen and graduating seniors have been volunteering to take the DIT since the summer of 2007. Given the timing, we now have data on four entering freshman cohorts and two senior cohorts. The data analyzed for this report include freshmen and seniors from the 2008-09 and 2009-10 academic years. The primary question to be answered is whether we can see any difference between the developmental stages of freshmen and seniors.

The DIT Scores

There are many scores reported for the DIT, of which four are analyzed here. Three of them are direct measures of what have come to be called moral reasoning schemas. They are:

• Personal Interest. A person who ranks the importance of each of the statements following a dilemma from the perspective of this schema is, in theory, basing his or her decisions primarily on self interest. 

• Maintaining Norms: A person operating from this schema is basing his or her decisions primarily on laws, conventions, and the social order. 

• Post Conventional: Ethical decisions made from the perspective of this schema take into account the possibility that laws and social conventions may be biased and limiting. They “therefore appeal to the moral purposes and ideals that undergird social law and order rather than the laws themselves (Rest et al. 1999a, 38–43).”1 

These three scores represent a progression from what might be considered the shallow, almost childish, perspective of self-interest to one where deeper, more philosophical ideas are considered in any ethical reasoning situation. The three scores are actually the percentages of responses that the respondent chooses that reflect each of the three schemas. Thus, one would expect that more mature groups would choose more Post Conventional arguments at the expense of the other two.

The fourth score is called the N2 Index. It is derived from combinations of the others and, according to the Guide, is more sensitive to longitudinal changes. It is very closely correlated with the Post Conventional score. 

Results

Table 1 shows the results for the combined sample of UAB freshmen and compares their scores with those reported in the Guide for DIT-2 for a large sample of freshmen. The ideal pattern is to have the highest score on the “Post Conventional” schema and the lowest on the “Personal Interest” schema.

The UAB Freshmen and the Freshmen Norms groups show the highest score on the “Maintain Norms” schema and the lowest on the “Personal Interest” schema. This is consistent with the notion that their ethical development is moving in the right direction but is not yet dominated by the “Post Conventional” schema. UAB freshmen are significantly higher on the “Maintain Norms” schema, but not at the expense of a significant difference on the “Post Conventional” schema or the N2 Index.

 

1 Rizzo AM, Swisher LL. 2004 Comparing the Stewart–Sprinthall Management Survey and the Defining Issues Test-2 as Measures of Moral Reasoning in Public Administration. Journal of Public Administration Research and Theory, Vol. 14, 335-343.


Table 2 shows how UAB Seniors fare against the National Senior Norms. They score higher on the “Personal Interest” schema, but not significantly so. They are significantly higher on the “Maintain Norms” schema and significantly lower on the “Post Conventional” schema and on the “N2 Index.” This suggests that the ethical reasoning skills of these students have not developed as expected. Table 3 shows that there is essentially no difference between the seniors and the entering freshmen.

The DIT developers invented a type indicator to capture the degree to which a schema is dominant in an individual and whether that individual is “consolidated” or “transitional.” As the words imply, a person who is consolidated shows a clear preference for a particular schema, while one who is transitional shows no clear evidence of preferring one schema to another, though one may still predominate. They grouped individuals into types based on a combination of which schema predominated and the degree to which the others were represented. This table shows their classification scheme. 

Type   Predominant Schema   Secondary Schema     Transitional/Consolidated
1      Personal Interests   --                   Consolidated
2      Personal Interests   --                   Transitional
3      Maintaining Norms    Personal Interests   Transitional
4      Maintaining Norms    --                   Consolidated
5      Maintaining Norms    Post Conventional    Transitional
6      Post Conventional    --                   Transitional
7      Post Conventional    --                   Consolidated

Table 1. DIT-2 Schema scores for a combined sample of UAB freshmen entering in 2008 and 2009 compared to national norms. Asterisks indicate statistically significant differences when these numbers are treated as continuous variables rather than proportions. See text for description of scores.

                    UAB Freshmen   National Freshmen Norms   UAB Freshmen - National
                    (N=157)        (N=2,096)                 Freshmen Norms
                    Mean     SD    Mean     SD                Difference
Personal Interest   26.7     11.4  28.5     12.3              -1.8
Maintain Norms      36.7     12.7  33.6     13.0              4.0*
Post Conventional   31.7     13.1  32.3     13.9              -1.1
N2 Index            26.9     13.8  31.0     14.4              -1.0

Table 2. DIT-2 Schema scores for a combined sample of UAB graduating seniors in 2009 and 2010.

                    UAB Seniors    National Senior Norms     UAB Seniors - National
                    (N=118)        (N=2,441)                 Senior Norms
                    Mean     SD    Mean     SD                Difference
Personal Interest   26.3     12.1  24.8     12.5              1.5
Maintain Norms      36.3     13.7  32.4     14.0              3.9*
Post Conventional   32.9     11.9  37.8     15.4              -4.9*
N2 Index            28.0     12.0  36.8     15.5              -8.8*

Table 3. DIT-2 Schema scores comparing UAB Seniors with UAB Freshmen.

                    UAB Freshmen, 2008 and 2009   UAB Seniors   UAB Seniors -
                    (N=157)                       (N=118)       UAB Freshmen
                    Mean     SD                   Mean     SD   Difference
Personal Interest   26.7     11.4                 26.3     12.1  -0.4
Maintain Norms      36.7     12.7                 36.3     13.7  -0.4
Post Conventional   31.7     13.1                 32.9     11.9  1.2
N2 Index            26.9     13.8                 28.0     12.0  1.1


 

 

Figure 1 shows the distributions of UAB freshmen and seniors by Type. Note that the seniors predominate at Types 5‐7 and that freshmen and seniors are about equal at Type 4, a consolidated stage. Freshmen clearly predominate at Type 3, but seniors outnumber freshmen at Type 2 where, in the ideal, freshmen should predominate. This is a transitional stage. 

 

The degree to which seniors are shifted to the higher Types can best be seen in Figure 2. This figure shows the cumulative distributions of freshmen and seniors. The ideal curve for the seniors should be shifted to the right of that of the freshmen for all types and the more to the right the better. The Type Indicators where these curves cross the 50% mark represent the medians. These are about 3.65 for freshmen and 4.2 for seniors. Thus, as one would hope, seniors do tend to be a little more consolidated, but still not where they should be. 
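The interpolated medians quoted here (values such as 3.65 rather than whole numbers) can be obtained by reading where the cumulative distribution crosses 50%. The following is a minimal sketch, assuming only a tally of students at each Type is available; the tallies shown are invented for illustration and linear interpolation is just one common convention.

```python
import numpy as np

def interpolated_median(counts):
    """Median Type Indicator from a tally of students at Types 1-7, found by
    linear interpolation of the cumulative proportion."""
    counts = np.asarray(counts, dtype=float)
    cum = np.cumsum(counts) / counts.sum()            # cumulative proportion at each Type
    return float(np.interp(0.5, cum, np.arange(1, len(counts) + 1)))

# Invented tallies shaped roughly like the freshman and senior distributions:
print(interpolated_median([2, 10, 45, 40, 25, 20, 15]))   # freshmen-like
print(interpolated_median([1, 8, 20, 30, 25, 20, 14]))    # seniors-like
```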

Discussion

The results on this instrument are an interesting contrast to those found on the ETS Proficiency Profile.

Figure 1. Distributions of UAB Freshmen and UAB Seniors by Type Indicator.

Figure 2. Cumulative distributions of UAB Freshmen and UAB Seniors by Type Indicator.


No matter how freshmen and seniors are compared on the Profile, senior cohorts have always scored higher than entering freshmen. That is, whether the comparisons are made without controlling for HSGPA and ACT Composite scores, whether they are made by matching freshmen to seniors using propensity scores, or whether they are made using repeated measures, seniors always score higher. 

The comparisons reported here for the DIT were made with no attempt to control for HSGPA or ACT Reading scores to see if they create a separation between freshmen and seniors. There is some indication in the user manual that reading ability may be an important covariate. There are also some variables included in the test that might be useful for matching. These include things like where a person falls on political and religious dimensions. 
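As a sketch of the matching analysis suggested here, the function below performs nearest-neighbor matching on a propensity score estimated from HSGPA and ACT Reading. The column names ('senior', 'hsgpa', 'act_reading') and the overall approach are assumptions for illustration; they are not drawn from the office's actual files or code.

```python
from sklearn.linear_model import LogisticRegression

def match_freshmen_to_seniors(df):
    """Nearest-neighbor propensity-score matching of freshmen to seniors.

    df has one row per student with columns 'senior' (1/0), 'hsgpa', and
    'act_reading' (hypothetical names). Assumes at least as many freshmen as
    seniors. Returns (senior_index, freshman_index) pairs."""
    covariates = df[["hsgpa", "act_reading"]]
    model = LogisticRegression().fit(covariates, df["senior"])
    scored = df.assign(pscore=model.predict_proba(covariates)[:, 1])
    seniors = scored[scored["senior"] == 1]
    freshmen = scored[scored["senior"] == 0].copy()
    pairs = []
    for idx, row in seniors.iterrows():
        j = (freshmen["pscore"] - row["pscore"]).abs().idxmin()   # closest remaining freshman
        pairs.append((idx, j))
        freshmen = freshmen.drop(j)                               # match without replacement
    return pairs
```

Matched pairs like these would let the DIT comparison mirror the Proficiency Profile comparisons mentioned above; whether matching actually changes the freshman-senior contrast on the DIT would still have to be tested.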

These analyses need to be done, and the scores for the individuals taking the DIT need to be tested for relationships to other characteristics of students. While there is a large body of literature on this test, opportunities for deeper exploration of these data are abundant. Indeed, further exploration is essential if the ethical dimension targeted by the QEP’s ECR efforts is to be enhanced in our students. The data suggest that there is room for quality improvement.