

    Using portfolios to assess the writing of ESL students: a powerful alternative?

    Bailin Song*, Bonne August

    Kingsborough Community College, City University of New York, 2001 Oriental Blvd., Brooklyn, NY 11235, USA

    Abstract

    This article describes a quantitative study that compared the performance of two groups of advanced ESL students in ENG 22, a second semester composition course. Both groups had been enrolled in ENG C2, a compensatory version of Freshman English for students with scores one level below passing on the CUNY Writing Assessment Test (WAT). At the end of ENG C2, one group was assessed on the basis of portfolios, as well as the CUNY WAT; the other was assessed using the WAT. Comparable percentages of students in both groups passed the WAT at the end of C2. However, students from the portfolio group with passing portfolios were permitted to advance to ENG 22 regardless of their performance on the WAT, while students in the non-portfolio group moved ahead only if they had passed the WAT. (The WAT remained a graduation requirement for all students.) The study found that students were twice as likely to pass into ENG 22 from ENG C2 when they were evaluated by portfolio than when they were required to pass the WAT. Nevertheless, at the end of ENG 22, the pass rate and grade distribution for the two groups were nearly identical. Because portfolio assessment was able to identify more than twice the number of ESL students who proved successful in the next English course, however, it seems a more appropriate assessment alternative for the ESL population. © 2002 Elsevier Science Inc. All rights reserved.

    Keywords: Portfolio assessment; ESL students; CUNY WAT

    Introduction

    Portfolio assessment of writing, which incorporates several diverse writing samples produced at different times, has often seemed ideally suited to programs

    Journal of Second Language Writing 11 (2002) 49-72

    * Corresponding author. Tel.: 1-718-368-5849; fax: 1-718-368-4786. E-mail address: [email protected] (B. Song).

    1060-3743/02/$ - see front matter © 2002 Elsevier Science Inc. All rights reserved. PII: S1060-3743(02)00053-X


    that use a curriculum influenced by the writing process. Portfolios can accommodate and even support extensive revision, can be used to examine progress over time, and can encourage students to take responsibility for their own writing. Furthermore, assessment criteria seem less arbitrary for portfolios than they might when applied to a single, impromptu piece.

    The literature is rich in discussions of the important issues raised by portfolio assessment (Belanoff & Dickson, 1991; Black, Daiker, Sommers, & Stygall, 1994; Camp, 1993; Elbow & Belanoff, 1997; Hamp-Lyons & Condon, 2000; Huot & Williamson, 1997; Kearns, 1993; Yancey, 1999) and in accounts of the development of portfolio assessment programs (Belanoff & Elbow, 1986; Courts & McInerney, 1993; Gill, 1993; Hamp-Lyons & Condon, 1993, 2000; Markstein, Withrow, Brookes, & Price, 1992). As yet, however, this literature has not been much augmented by quantitative research. Based on their screening of articles on portfolio assessment in the literature over a 10-year period, Herman and Winters (1994) find that a very small number of them (


    samples as adjuncts to indirect methods of assessment, but cautioned those who used them to "be cognizant of the numerous variables that condition the interpretation of their results." That caution was well-advised, for the benefits of portfolio assessment appear to lie not in reducing the number of variables but in extending the number and range of pieces available for assessment and in its link to process writing (Hamp-Lyons, 1994).

    Hamp-Lyons (1991) advocates the use of portfolios for ESL students. Portfolios are thought to be especially suitable for non-native English-speaking students because "portfolios provide a broader measure of what students can do, and because they replace the timed writing context, which has long been claimed to be particularly discriminatory against non-native writers" (Hamp-Lyons & Condon, 2000, p. 61). According to Hamp-Lyons and Condon, after portfolios were introduced into the exit assessment, more ESL students tested out of the University of Michigan's Writing Practicum on the first try than had been the case when the exit assessment used a timed essay. Ruetten (1994), citing research that shows that ESL students find holistically scored competency exams particularly difficult, describes her own research involving the second course of a composition sequence, which required students to pass a proficiency exam. In Ruetten's study, native English speakers and non-native speakers achieved a comparable pass rate when a prototype of the portfolio, an appeals folder containing several representative pieces of writing, was evaluated. Given the results, Ruetten concludes that some kind of portfolio assessment is particularly useful in evaluating ESL writers (p. 94). Research at Borough of Manhattan Community College of the City University of New York (Jones, 1992) shows that ESL students assessed by portfolio achieved results in the next course that were better than or comparable to those achieved by native English speakers who were assessed on the CUNY Writing Assessment Test (WAT), a holistically graded timed impromptu essay.

    Challenges of portfolio assessment

    While portfolio assessment promises potential benefits for curriculum and assessment, it also faces challenges. Brown and Hudson (1998) sum up from the literature five disadvantages of using portfolio assessment: the issues of design decision, logistics, interpretation, reliability, and validity. Of great concern are the assessment's time-consuming nature and the issues of reliability and validity.

    Portfolio assessment programs, without a doubt, make substantial demands on

    instructors. While planning portfolio tasks and lessons, coaching students on drafts, and helping them compile portfolios can be comfortably folded into a process-oriented course, the actual evaluation of portfolios is inevitably labor intensive, requiring a significant amount of time from instructors. Furthermore, reliability and validity concerns remain unresolved. For example, how can we assure psychometric reliability, such as scoring consistency and rater agreement? Through


    "negotiation" among raters or through rater training directed by anchor papers and scoring guides (Yancey, 1999)? How can scoring fairness be achieved? Through the articulation of difference and negotiation so that community standards are established (Elbow & Belanoff, 1997)? How do we provide "equitable assessment settings" (Herman & Winters, 1994, p. 52) and make sure that all students have equal access to resources (Brown & Hudson, 1998)? More importantly, how do we ascertain that portfolios adequately exemplify students' writing abilities so that the decisions we make about students are accurate?

    In the debate over validity and reliability, there are conflicting perspectives on the role of psychometric standards and standardization. Some consider portfolio assessment a viable alternative precisely because it resists standardization (Huot & Williamson, 1997; Moss, 1994). They believe that reliability and validity in the narrow psychometric sense are undesirable factors in evaluation. On the other hand, others (Herman, Gearhart, & Baker, 1993; LeMahieu, Gitomer, & Eresh, 1995) find that psychometric integrity is attainable for portfolio assessment. Hamp-Lyons and Condon (2000) believe that both reliability and validity are necessary and must be established "if portfolio-based assessments are to grow and to replace less satisfactory ones" (p. 136), since only these types of data can convince bureaucrats. In 1994 they found that the University of Michigan's full-scale entry-level and exit portfolio assessment achieved levels of reliability and validity equal to or better than direct tests of writing based on timed writings scored holistically. Williams (1998, 2000) strongly argues for and demonstrates a need for standardization and strict adherence to standard protocols. He reasons that standardized procedures are necessary in establishing performance standards. Without standards for implementation and outcomes, portfolio assessment will become whimsical, capricious, and unfair because "it increases the subjectivity teachers bring to evaluation" (2000, p. 136). This unreliability will threaten any benefits portfolio assessment brings and make it lose its appeal, because portfolio assessment was, indeed, "developed with the goal of making the evaluation of classroom writing more objective, more fair, and more realistic" (2000, p. 147).

    The purpose of the study

    In 1995 the Department of English at Kingsborough Community College, a large urban campus that is part of the City University of New York, adopted portfolio assessment as the standard procedure for students in intermediate and advanced ESL. At the same time, portfolios were also adopted for developmental English writing courses, which ESL students took after completing advanced ESL. The decision to implement portfolio assessment throughout the ESL and developmental sequences followed 2 years of experimenting with portfolio assessment on a limited, voluntary basis. A major factor in the decision was the desire to implement a form of assessment that supported our curriculum. We


    also reasoned that portfolio assessment would be a more valid measure for exit from the developmental writing sequence, a relatively high stakes decision that had previously depended upon the CUNY WAT.

    The CUNY WAT (see Appendix A for a sample test and criteria for evaluation) was given to all entering freshmen on all CUNY campuses prior to the introduction of the American College Testing Program (ACT) writing and reading exams in the fall of 2000. The WAT is a timed impromptu essay in which students must argue for or against a proposition of general interest (e.g., "States should mandate the use of automobile safety belts" or "Students who come late to class should not be penalized"), supporting their arguments from their experience, their reading, and their observations of others. Essays were evaluated by two readers on a scale of 1-6, with a total score of 8 (4 + 4) considered passing. All WAT readers were certified by the University's Office of Academic Affairs. For certification, readers participated in an all-day training session, at the end of which they were given a test. They had to meet the University's standards in order to be permitted to grade WAT tests.

    At Kingsborough Community College, entering students who passed the WAT were admitted into the standard freshman composition class, ENG 12. Those who scored below 6 were placed into developmental courses or ESL, as appropriate. Students who scored a 6 or 7 were placed in ENG C2. The ENG C2 course was designed as a variant of the standard four-credit freshman English course; however, because the students had not yet passed the CUNY WAT and were thereby "remedial," the course was augmented with an additional non-credit instructional hour. Under the old, i.e., pre-portfolio, system, if students passed both the course and the WAT, they received credit for the first semester of Freshman English and advanced to ENG 22, the second semester English course. If they failed the WAT, regardless of their performance in the course, they had to repeat the course.

    Like students who had been placed in ENG 12, successful C2 students then advanced to the second semester Freshman English course, ENG 22. Thus, the developmental sequence and the composition sequence intersected at ENG C2. Moreover, this course, which included both native English speakers and ESL students, presented a particular challenge to instructors because the high stakes, university-mandated exit testing took place at its conclusion. C2 students pressed for direct test preparation and resisted attention to the broader development of academic writing, even though the course was intended to have the same objectives as ENG 12.

    When the portfolio was introduced experimentally, both instructors and students commented that the new system seemed to them to be more suitable and fairer than a single test. The portfolio was a performance-based appraisal that evaluated students' progress and accomplishment within the learning environment. Although the portfolio also contained an in-class final writing exam, the exam was not a test of writing speed, because it allowed an adequate amount of time for writing; nor was it culturally biased, as students had an opportunity to become familiar with the issues addressed in the prompts.


    Regardless of the department's satisfaction with the portfolio, however, the WAT could not be waived: it was a university-wide, as well as a college, requirement. At best, the use of portfolios could allow students who had done passing work in the course to receive credit for ENG C2 and continue into ENG 22 if their portfolios were deemed passing, thus deferring for a semester the requirement that the students pass the WAT. This provided time for an additional writing course before students had to retake and pass the high stakes test.

    As the department's use of portfolios expanded to include all of the developmental writing courses, it became increasingly important to have data that demonstrated whether, in addition to instructor and student satisfaction, the use of portfolios to pass out of the developmental program was effective in determining readiness for the next course. Thus, we decided to conduct a study. We were interested in finding out how portfolio assessment had served our ESL students and whether or not it demonstrated validity for the purpose of making placement decisions about these students.

    The questions the study attempted to answer are as follows:

    1. Did pass rates for the CUNY WAT and for ENG C2 differ significantly between portfolio sections and non-portfolio sections?

    2. When they took ENG 22, how did the performance of ESL students from portfolio sections of ENG C2 compare with those who had passed the ENG C2 sections that used the WAT as the method of assessment?

    3. In particular, how successful in ENG 22 were those portfolio students who had passed portfolios but had not achieved a passing score on the WAT?

    Using portfolios to assess the writing of ESL students at Kingsborough

    The basic writing/composition sequence at Kingsborough integrates intensive

    reading and writing in thematically organized courses. The development of writing through a process of revision is central to the sequence. For this reason, portfolio assessment seemed a particularly appropriate way to evaluate student progress. As members of the department had moved from a pedagogy that emphasized the teaching of grammatical correctness and rhetorical modes to one that stressed process, revision, and text-based writing assignments, they were eager to replace the timed impromptu essay format with a method of assessment that supported the new curriculum. While in the long run students might be expected to apply process techniques in a timed essay, this did not seem to be a reasonable expectation for students who had completed only 12 weeks (the length of the Kingsborough semester) of basic writing, nor did one sample of student writing seem to be a sufficient basis for a high-stakes decision.

    A second reason for seeking a different form of assessment was the increasing number of immigrant students. These ESL students, many of whom were well educated in their first languages, responded well to the curriculum, but found great


    difficulty in passing the WAT. Unfamiliarity with topics, inability to construct convincing arguments for or against culturally specific issues, and anxiety about being tested impromptu in a second language may account for the difficulty. However, students frequently cited having inadequate time as the main factor affecting their performance on the WAT. This is probably because ESL students simply need more time to implement all aspects of the writing process: thinking and planning, writing (and translating), and reviewing and editing. In a study comparing first- and second-language writing, Raimes (1987) finds that ESL students tend to edit and correct their work more than native English speakers. This suggests that the opportunity to review and edit is of crucial importance for ESL students. When pressured by time on the WAT, ESL students usually lacked time for even a quick review of their essays, thus producing essays with more errors and language problems than the evaluation criteria allowed.

    As the ESL population increased, average pass rates in the upper level basic writing course fell from over 50% to just above 40%. The holistically scored WAT became a barrier to ESL students, thus confirming Ruetten's (1994) research on holistically scored evaluations. ESL students were held back by the WAT even though they had demonstrated that they could complete course assignments, including in-class writing assignments, satisfactorily or even excellently. This situation generated a large number of students who were repeating the course, sometimes several times, and a consequent retention problem of growing proportions, as students found themselves unable to proceed.

    The Kingsborough model of portfolio assessment

    In order for the portfolio assessment of writing to receive official sanction and be considered a trustworthy and credible evaluation measure, it is necessary for portfolio programs to establish acceptable performance standards and have in place standardized procedures and guidelines for conducting the assessments. We believe it is very important to implement standardized procedures and outcome standards strictly if we want to achieve and maintain high levels of reliability and consistency.

    The Kingsborough portfolio, designed by a group of instructors, both full-time and part-time, contains a cover letter, two revised essays with several drafts, and a departmental writing exam.

    The cover letter provides a venue for the student to present him/herself as a writer to the portfolio reader. It encourages accounts of struggles encountered and progress made during the semester, as well as discussions of how he/she views the writing process. The two revised essays are chosen by the students from a collection of five to six revised essays as those best representing their writing ability. A typical 12-week basic writing course at Kingsborough requires of students five to six revised essays, at least half of which must be reading-based, and four to five in-class essays, as well as informal writing. Instructors develop


    their own essay topics, incorporating the course readings as much as possible (see Appendix B for a sample assignment). Each of the revised essays in the portfolio must be accompanied by at least two other drafts with the instructor's comments. Students who do not have all the drafts or fulfill the requirements of the class are not allowed to submit a portfolio. We decided to require two of the student's revised essays, one of which must be reading-based, in order to provide the reader with an adequate but not overwhelming sample of each student's work.

    The departmental writing exam, the in-class writing piece included in all portfolios, is 2 hours long, more than twice the length of time allowed for the WAT. The exam is based on a long reading passage distributed in advance. The students do not discuss or write about it in class ahead of time, but are encouraged to clarify their understanding by using a dictionary or discussing it with each other outside of class. On the exam, students are asked to spend about 30 minutes answering short-response questions on the reading passage and about 90 minutes writing an essay on one of the two given topics about issues drawn from the passage (see Appendix C).

    Evaluation criteria for portfolios include: finding and organizing ideas, using the revision process, and editing and presenting work (see Appendix D). The specific checklist items included in the criteria were determined by the department based on discussions with instructors and a survey of the literature, as well as reference to the Evaluation Scale for the WAT.

    Portfolios are cross-read by pairs of instructors. To maintain reliability, instructors of portfolio sections are trained intensively twice a semester, once at mid-term and once at the end, just before the actual reading. In training sessions, instructors examine and discuss "anchor" essays. Without exception, all instructors teaching developmental writing must attend these training sessions. Students' portfolios are graded pass/fail by the readers. Space is provided on the evaluation sheets for readers to make additional comments on the weaknesses and strengths of the portfolios they read. Discussions and communications between the pair of readers are discouraged, to make sure that students' portfolios are evaluated strictly according to the pre-established standards, rather than through their instructor's negotiations with the reader. Instead, an appeals process is available to the student's instructor should the instructor and the portfolio reader disagree.

    Thanks to the intensive training and strict adherence to rating guidelines and standards, the level of inter-rater reliability of our portfolio assessment over the years has remained high. For example, in the spring semester of 2001, portfolio readings for the highest level of our developmental English sequence achieved an inter-rater reliability of 0.82. This result is above the 0.80 benchmark for holistic ratings of timed essays.

    As the instructors have become more accustomed to reading portfolios and the process has become more systematized, the time required has decreased. We have developed strategies for responding succinctly, especially to students with passing portfolios. An experienced reader can evaluate a portfolio in about 12-15 minutes.
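    The article reports the 0.82 figure without naming the agreement statistic behind it. As a minimal illustration only (not the department's procedure, and using invented pass/fail ratings rather than Kingsborough data), inter-rater agreement for paired pass/fail portfolio readings could be computed as simple percent agreement or as Cohen's kappa:

        # Hypothetical example: two readers' pass/fail decisions on ten portfolios.
        # Neither the data nor the choice of statistic comes from the article.

        def percent_agreement(r1, r2):
            """Proportion of portfolios on which the two readers agree."""
            return sum(a == b for a, b in zip(r1, r2)) / len(r1)

        def cohens_kappa(r1, r2):
            """Agreement corrected for chance, for two raters and two categories."""
            n = len(r1)
            p_obs = percent_agreement(r1, r2)
            # Chance agreement is estimated from each reader's marginal pass rate.
            p1 = sum(x == "pass" for x in r1) / n
            p2 = sum(x == "pass" for x in r2) / n
            p_chance = p1 * p2 + (1 - p1) * (1 - p2)
            return (p_obs - p_chance) / (1 - p_chance)

        reader_1 = ["pass", "pass", "fail", "pass", "pass", "fail", "pass", "pass", "fail", "pass"]
        reader_2 = ["pass", "pass", "fail", "pass", "fail", "fail", "pass", "pass", "pass", "pass"]

        print(percent_agreement(reader_1, reader_2))  # 0.8
        print(cohens_kappa(reader_1, reader_2))       # about 0.47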


    ESL population and data collection procedures

    Kingsborough Community College has a large ESL population. Each semester, about 500 students are enrolled in ESL classes taught by ESL faculty, while over 300 other second language students take upper level developmental English courses, many of which are either designated for ESL students or taught by English faculty with experience in working with ESL students. Students come from as many as 60 countries and regions and speak more than 40 languages or dialects. However, the predominant language spoken by our ESL students is Russian (44%), followed by French and Creole (15%), Chinese (12%), Spanish (10%), Arabic (4%), and Polish (3.6%), according to the fall 2000 freshman enrollment record.¹ Most of the ESL students are recent immigrants and have completed their high school education in their native countries; a small number of them have studied 1 or 2 years at college or hold a bachelor's degree in their first language. Although most attend Kingsborough to pursue academic degrees, some take classes for certificates or to prepare for license exams so that they can enter a profession quickly.

    All nine sections of ENG C2 (designated for ESL students) that first adopted portfolio assessment were included in the study as the portfolio group, and a matching number of ENG C2 sections (also for ESL students) that used the CUNY WAT as an exit criterion were included as the non-portfolio group. Although the portfolio was used as an exit criterion, students in the portfolio group were also given the WAT at the end of the semester, as required by the college. Students had to have completed all the course work in order to submit a portfolio or be allowed to take the WAT. One hundred and three students from the portfolio group and 107 from the non-portfolio group (50% of the total enrollment of each group) were randomly selected to be included in the study.

    Students in both groups, as required for entry into ENG C2, had received scores of 6 or 7 on the CUNY WAT. Those who had scored lower or higher were placed into different levels. Students chose sections solely based on scheduling concerns, without any knowledge of which sections were participating in the portfolio program or which professors were teaching the sections (Kingsborough does not list professors' names in the course catalogue). It is, therefore, reasonably safe to conclude that students were distributed randomly and all sections began with students of comparable writing ability.

    Our analysis of grades in ENG 22 operated from the pool of students who passed ENG C2, either through the portfolio or the WAT. We analyzed only those students who took ENG 22 in the semester immediately following the one in which they completed ENG C2. Therefore, we cannot report on those students who did not pass ENG C2.

    ¹ Although specific numbers of students speaking these languages vary from semester to semester, Russian has long been the predominant language spoken by our ESL students. For this study, students of Russian language background made up more than 50% of the subjects.


    In addition, some students (in both groups) who passed ENG C2 did not take ENG 22 at Kingsborough during the next semester because of transfers, drops, and other reasons; therefore, only students whose ENG 22 grades were available, specifically 64 of 80 students in the portfolio group and 36 of 41 in the non-portfolio group, were included in further data analysis. Of the 16 portfolio students who did not take ENG 22, 5 received an A in C2, 8 received a B, and 3 received a C, averaging 3.13 on the standard numerical scale for grades. All five non-portfolio students who did not take ENG 22 received a B, or 3.0, in C2. The average grades are very similar, and thus do not suggest that the sample of students taking ENG 22 was skewed by the absence of those who passed ENG C2 but did not take ENG 22.
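    As a quick check on that average: using the grade equivalences applied throughout the study (A = 4, B = 3, C = 2), the 16 students' mean C2 grade is (5 × 4 + 8 × 3 + 3 × 2) / 16 = 50 / 16 ≈ 3.13.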

    Analysis of data

    According to Huck and Cormier (1996), a chi-square test or a z-test can be used to contrast two unrelated groups on a dichotomous dependent variable when data take the form of proportions, percentages, or frequencies. These two tests are mathematically equivalent and will always lead to the same decision regarding a null hypothesis. Thus, we used the z-test for proportions to contrast the two groups' CUNY WAT and ENG C2 pass rates (for Question 1).

    In comparing the students' performance in ENG 22 (Question 2), the distributions of the letter grades for the two groups were analyzed using χ². As the two groups' mean ENG 22 grades appeared very similar, we wanted to know whether the students in the two groups were distributed in the same fashion across the five letter grades.

    To compare the performance in ENG 22 of the portfolio students who had not passed the CUNY WAT with that of the non-portfolio students who had passed it (Question 3), the more commonly used t-test was used to see whether their numerical grade means were significantly different.
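    To make the z-test for proportions concrete, the following is a minimal sketch in Python (our illustration, not the authors' computation; the published z values of 0.46 and 6.43 evidently reflect slightly different rounding and standard-error choices, and the article reports one-tailed P values, so the figures below match only approximately):

        from math import sqrt
        from statistics import NormalDist

        def two_proportion_z(x1, n1, x2, n2):
            """z-test for the difference between two independent proportions,
            using a pooled estimate of the common proportion."""
            p1, p2 = x1 / n1, x2 / n2
            p_pool = (x1 + x2) / (n1 + n2)
            se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
            z = (p1 - p2) / se
            p_two_sided = 2 * (1 - NormalDist().cdf(abs(z)))
            return z, p_two_sided

        # Counts reported in Table 1.
        print(two_proportion_z(33, 103, 37, 107))  # WAT pass rates: |z| < 1, not significant
        print(two_proportion_z(80, 103, 41, 107))  # ENG C2 pass rates: |z| > 5, highly significant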

    Results

    Question 1: The CUNY WAT and ENG C2 pass rates

    The pass rates of the CUNY WAT and ENG C2 for the two groups are reported in Table 1. The two groups' CUNY WAT pass rates do not appear significantly different. Of the 103 students in the portfolio group, 33, or 32%, passed the CUNY WAT, as compared to 37, or 35%, of the 107 students in the non-portfolio group. However, 80, or 78%, of the 103 students in the portfolio group passed the portfolio assessment and thus the ENG C2 course; by contrast, only 41, or 38%, of the 107 students in the non-portfolio group passed the course as determined mainly by their scores on the CUNY WAT. These ENG C2 pass rates look


    significantly different. The z-tests for proportions confirmed that there was no significant difference between the two groups' CUNY WAT pass rates, but the two groups differed significantly in passing the ENG C2 course.²

    While ESL students in the portfolio group had a CUNY WAT pass rate similar to that of the ESL students in the non-portfolio group, students in the former group were twice as likely as students in the latter group to pass ENG C2 when they were evaluated by portfolio. These results indicated that on the timed impromptu test, the WAT, the performance of the two groups of students was similar. The students in the portfolio sections, however, were able to demonstrate their readiness for the next level through a range of pieces: representations of their semester's work, produced under less time pressure and with access to dictionaries and other reference material, and a writing exam completed in 2 hours rather than 50 minutes, on topics they had read about. Their revised pieces of writing often reflected the skills of experienced writers, showing extensive changes in ideas, content, support, text organization and structure, and expression. The non-portfolio students, though, did not have this chance to demonstrate their writing ability.

    Table 1
    Comparison of CUNY WAT and ENG C2 pass rates

    Passing     Portfolio group (n = 103)    Non-portfolio group (n = 107)    z       P
    CUNY WAT    33 (32%)                     37 (35%)                         0.46    .3228
    ENG C2      80 (78%)                     41 (38%)                         6.43    0

    ² To be on the safe side, we also performed chi-square tests on these data. The tests yielded the same results as did the z-tests: no significant difference was found between the two groups' CUNY WAT pass rates, but their ENG C2 pass rates were significantly different. The tests confirm Huck and Cormier's statement that the chi-square test and the z-test are mathematically equivalent and will always lead to the same decision regarding the null hypothesis when analyzing data in the form of proportions, percentages, or frequencies.

    Question 2: Students' performance in ENG 22

    Since the ultimate test of evaluation criteria is usually their ability to predict future success, we followed the two groups of students into their next English course, ENG 22.

    As shown in Table 2, the two groups' means of equated grades do not appear significantly different. A chi-square test confirmed this. No significant difference was found, χ²(4, n = 100) = 2.15, P = .71, in the distributions of the five letter grades for the portfolio group and the WAT group. (The non-portfolio group is referred to as the WAT group here since all members had passed the WAT before they took ENG 22.) In other words, the students in the portfolio group had the same proportionate breakdown of grades as did the students in the WAT group. In both groups, the great majority of students (over 85%) received a letter grade of C or better; that is, portfolio students had the same success rate as WAT students in ENG 22. This result further suggests that portfolio assessment for ESL students in ENG C2 was as good a predictor of their success in ENG 22 as was the required writing proficiency test, the CUNY WAT.

    Table 2
    Distribution of frequencies and percentages by grades and group

    Grade (equated)    Portfolio, n (%)    WAT, n (%)
    A (4)              12 (19)             4 (11)
    B (3)              25 (39)             19 (53)
    C (2)              18 (28)             9 (25)
    D (1)              3 (5)               1 (3)
    F (0)              6 (9)               3 (8)
    Total              64 (100)            36 (100)
    Mean (S.D.)        2.53 (1.14)         2.56 (1.03)
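    As a rough check on the reported chi-square statistic, the cell counts in Table 2 can be fed to SciPy's standard Pearson test of independence (a sketch of ours; the article does not say what software the authors used):

        from scipy.stats import chi2_contingency

        # Letter-grade counts (A, B, C, D, F) for each group, from Table 2.
        portfolio = [12, 25, 18, 3, 6]
        wat = [4, 19, 9, 1, 3]

        chi2, p, df, expected = chi2_contingency([portfolio, wat])
        print(f"chi2({df}) = {chi2:.2f}, p = {p:.2f}")
        # Prints chi2(4) = 2.15, p = 0.71, in line with the result reported above.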

    Question 3: The performance in ENG 22 of the portfolio students who passed ENG C2 but failed the WAT

    Using the ENG 22 data, we particularly examined the sub-group of portfolio students who advanced to ENG 22 on the merit of their portfolios without a passing score on the WAT. Of the 64 students who took ENG 22, 38 did not have a passing score on the WAT.

    Table 3 shows the distribution of the ENG 22 letter grades and the equated grade mean of this sub-group of portfolio students as compared to those of the WAT group. The means of the two groups were not significantly different (t = 1.03, df = 72, P = .31). The great majority (84%) of the portfolio students received a letter grade of C or better. The success rate in ENG 22 of the portfolio sub-group was comparable to that of the WAT group (89%), despite the fact that none of the portfolio students had passed the WAT before they enrolled in ENG 22. This finding is particularly encouraging because these students would have been denied the chance even to enroll in the course, not to mention reach their full writing potential, if the WAT had been used as the criterion for exit from ENG C2.

    We also followed up on these students' WAT records, since passing the test remained a graduation requirement. Twenty-nine, or 76%, of them passed the test within a year after they completed ENG C2. This suggests that additional writing


    instruction and practice, even though not directed toward the test, enabled them to meet that requirement within the mandated time frame.

    Table 3
    Distribution of frequencies and percentages by grades and group

    Grade (equated)    Portfolio (sub-group), n (%)    WAT, n (%)
    A (4)              4 (11)                          4 (11)
    B (3)              15 (39)                         19 (53)
    C (2)              13 (34)                         9 (25)
    D (1)              0 (0)                           1 (3)
    F (0)              6 (16)                          3 (8)
    Total              38 (100)                        36 (100)
    Mean (S.D.)        2.29 (1.18)                     2.56 (1.03)
    t = 1.03, P = .31
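    The t statistic for this comparison can likewise be approximated from the summary statistics in Table 3 (again our sketch, using SciPy's two-sample t-test from summary data; small discrepancies from the published t = 1.03, P = .31 reflect rounding in the reported means and standard deviations):

        from scipy.stats import ttest_ind_from_stats

        # Equated-grade summaries from Table 3: portfolio sub-group vs. WAT group.
        result = ttest_ind_from_stats(mean1=2.29, std1=1.18, nobs1=38,
                                      mean2=2.56, std2=1.03, nobs2=36)
        print(result)  # statistic near -1.0, two-sided p near .3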

    Discussion

    The results of the study indicate that ESL students, whether enrolled in portfolio or non-portfolio sections, generally have difficulty passing a timed impromptu test, as evidenced by the low WAT pass rates for the two groups of students (32% and 35%, respectively). These results support the findings of early studies (Johns, 1991; Ruetten, 1994; Thompson, 1990) which suggest that holistically scored, timed writing proficiency exams "are particularly (and perhaps unfairly) difficult for ESL writers" (Ruetten, 1994, p. 91). These exams particularly handicap ESL students because they not only test them on unfamiliar genres and tasks, but also require them to meet standards of excellence in grammatical and mechanical accuracy that they cannot reach on a first draft in 50 minutes (Hamp-Lyons & Condon, 2000).

    However, it is encouraging to note that ESL students are twice as likely to pass their first semester English course when they are evaluated by portfolio as when they are evaluated by the timed WAT. This finding supports that of Ruetten (1994) that portfolio assessment is indeed very useful for ESL students. Although there is a concern that students who have passed the course through the portfolio without passing the required writing competency exam may not be able to function in the subsequent English course, such concern seems to be groundless according to the findings. The study found no evidence that allowing ESL students to pass an English writing course through a carefully structured and monitored portfolio assessment reduced their chances of successfully completing the requirements in

    the subsequent English course. Indeed, in their second semester of English, the sub-group of students (those who passed the first-level English course on the merit of their portfolios without a passing score on the writing proficiency exam) performed as well as students who passed the writing proficiency exam.

    Any assessment method is expected to demonstrate validity. The Kingsborough portfolio assessment model seems to have established criterion-related validity in predicting students' future performance. As measured by the second semester English course, the great majority of students (over 85%) who were permitted to advance to the course with passing portfolios succeeded in that course. They had the same course pass rate as students who had passed the CUNY WAT, demonstrating that portfolio assessment is as valid as the CUNY WAT in predicting students' success in a subsequent English course. However, because portfolio assessment was able to identify more than twice the number (80 out of 103 versus 41 out of 107) of our ESL students who actually proved successful in the next English course, it seems a more appropriate assessment alternative for our ESL population.

    It is possible, of course, that other alternatives, e.g., allowing extra time for the WAT, might have proven equally or even more effective than the portfolio in placing students in ENG 22; however, that would not have provided the curricular support we were seeking for this course, since the primary purpose of ENG C2 was to serve as a variant of Freshman English, an introduction to academic writing. Moreover, college and university restrictions precluded other alternatives. Although we continue to use portfolios for course exit, the Trustees require the use of a nationally normed standardized instrument (the ACT tests) for exit from the developmental sequence and entrance into Freshman English and will not entertain proposals for alternative methods of assessment.

    Limitations of the study and suggestions for future research

    The findings of this study are based on empirical data from real classrooms rather than data from pure research settings. As such, it is difficult to control the variables affecting performance. Therefore, the study cannot identify the variables that contributed to portfolio students' success in both the first and the subsequent English courses. Successful performance might be attributable to the changed pedagogy that focuses on higher-order critical thinking skills, or perhaps to the extra time students are forced to invest in the writing process. Another possible factor could be students' higher motivation as they see the opportunity to pass on to the next level of the course sequence rather than repeating the same course. It is possible that all these factors have a confounding impact. Future studies are necessary in order to find answers to the question of contributing variables. Follow-up case studies on individual portfolio students would also be helpful.

    Another limitation of the study also stems from its empirical nature. Although our portfolio assessment has good scoring reliability, the correlation between the


    grades assigned by ENG 22 instructors and students' achievement in writing cannot be assessed. Therefore, using ENG 22 grades as a success measure can be problematic, for example, when classroom teachers grade on a curve, i.e., assign a spread of grades (from A to F) no matter how good a class is relative to other classes. Nevertheless, we can assume quite safely that the course grade is a reliable, useful measure for this study because our ENG 22 instructors all have substantial experience in teaching this course, and they do not use curving when assigning grades. In addition, students enrolling in the ENG 22 sections were randomly, and therefore presumably normally, distributed over more than 30 sections. Hence, the overall comparability of the grade distributions for the two groups seems indicative of comparable performance.

    Because quantitative data are so limited in portfolio assessment studies, studies of this nature should be replicated at other institutions where the ESL population is different and other types of writing competency exams are given. In addition, studies with different research designs should be conducted to further examine the effects of portfolio assessment. For example, to measure the instructional aspects of the portfolio approach, a more appropriate writing test, perhaps one similar to our departmental writing exam, which allows more time and gives topics on issues that are familiar to students, should be given to both portfolio and non-portfolio students at the end of the course. This study is unable to determine the instructional effects of the portfolio approach because only the timed, impromptu WAT was given to both portfolio and non-portfolio students, and both groups had similar pass rates on that test.

    Conclusion

    This study adds to the evidence noted by Ruetten and others that the holistically graded, timed impromptu essay exam appears to discriminate against competent ESL writers and in fact places an unnecessary obstacle in their path. Large numbers of Kingsborough students who might have continued successfully to the next course were blocked from progressing because they were unable to pass the holistically scored WAT. Further, the study demonstrates that, when carefully conducted with clear evaluation standards, portfolio assessment can be relied upon as a basis for making judgments about the writing proficiency of ESL students.

    Acknowledgments

    The authors are grateful to the three anonymous reviewers whose questions and comments have been so helpful to the development of this article. We owe special thanks to our colleagues in the Department of English at Kingsborough Community College, CUNY, who developed the portfolio system and the curriculum it supports.


    Appendix A. Sample CUNY Writing Assessment Test and grading criteria

    The CUNY Writing Skills Assessment Test
    City University of New York, 1983

    I. Sample test

    Directions

    You will have 50 minutes to plan and write the essay assigned below. You may wish to use your 50 minutes in the following way: 10 minutes planning what you are going to write; 30 minutes writing; 10 minutes reading and correcting what you have written.

    You should express your thoughts clearly and organize your ideas so that they will make sense to a reader. Correct grammar and sentence structure are important.

    Write your essay on the lined pages of your booklet. You may use the inside of the front cover of the booklet for preliminary notes.

    You must write your essay on one of the following assignments. Read each one carefully and then choose either A or B.

    A. It always strikes me as a terrible shame to see young people spending so much of their time staring at television. If we could unplug all the TV sets in America, our children would grow up to be healthier, better educated, and more independent human beings.

    Do you agree or disagree? Explain and illustrate your answer from your own experience, your observations of others, or your reading.

    B. Older people bring to their work a lifetime of knowledge and experience. They should not be forced to retire, even if keeping them on the job cuts down on the opportunities for young people to find work.

    Do you agree or disagree? Explain and illustrate your answer from your own experience, your observations of others, or your reading.

    II. Evaluation scale for the writing skills assessment test

    6: The essay provides a well-organized response to the topic and maintains a central focus. The ideas are expressed in appropriate language. A sense of pattern of development is present from beginning to end. The writer supports assertions with explanation or illustration, and the vocabulary is well suited to the context. Sentences reflect a command of syntax within the ordinary range of standard written English. Grammar, punctuation, and spelling are almost always correct.

    5: The essay provides an organized response to the topic. The ideas are expressed in clear language most of the time. The writer develops ideas and generally signals


    relationships within and between paragraphs. The writer uses vocabulary that is appropriate for the essay topic and avoids oversimplifications or distortions. Sentences generally are correct grammatically, although some errors may be present when sentence structure is particularly complex. With few exceptions, grammar, punctuation, and spelling are correct.

    4: The essay shows a basic understanding of the demands of essay organization, although there might be occasional digression. The development of ideas is sometimes incomplete or rudimentary, but a basic logical structure can be discerned. Vocabulary generally is appropriate for the essay topic but at times is oversimplified. Sentences reflect a sufficient command of standard written English to ensure reasonable clarity of expression. Common forms of agreement and grammatical inflection are usually, although not always, correct. The writer generally demonstrates through punctuation an understanding of the boundaries of the sentence. The writer spells common words, except perhaps so-called "demons," with a reasonable degree of accuracy.

    3: The essay provides a response to the topic but generally has no overall pattern of organization. Ideas are often repeated or undeveloped, though occasionally a paragraph within the essay does have some structure. The writer uses informal language occasionally and records conversational speech when appropriate written prose is needed. Vocabulary often is limited. The writer generally does not signal relationships within and between paragraphs. Syntax is often rudimentary and lacking in variety. The essay has recurrent grammatical problems, or, because of an extremely narrow range of syntactical choices, only occasional grammatical problems appear. The writer does not demonstrate a firm understanding of the boundaries of the sentence. The writer occasionally misspells common words of the language.

    2: The essay begins with a response to the topic but does not develop that response. Ideas are repeated frequently, or are presented randomly, or both. The writer uses informal language frequently and does little more than record conversational speech. Words are often misused, and vocabulary is limited. Syntax is often tangled and is not sufficiently stable to ensure reasonable clarity of expression. Errors in grammar, punctuation, and spelling occur often.

    1: The essay suffers from general incoherence and has no discernible pattern of organization. It displays a high frequency of error in the regular features of standard written English. Lapses in punctuation, spelling, and grammar often frustrate the reader. Or, the essay is so brief that any reasonably accurate judgment of the writer's competence is impossible.


    Appendix B. Sample ENG C2 writing assignment

    ENG C2
    Professor August

    In-class Essay 2

    In The Broken Cord, Michael Dorris makes many references to his heritage as an American Indian. He describes and examines both negative and positive aspects of being a Native American. Discuss the importance of this heritage to Dorris, referring to specific incidents in the book to illustrate and support your points. In relation to Dorris, consider your own ethnic heritage. What is its importance to you? Does it play a similar role for you as it does for Dorris, or is it quite different?

    This is a more complicated essay than we've done so far. In order to find material and make certain that the structure of your essay is logical and clear, be sure to take time for step 1 below.

    Please follow these steps as you write your essay:

    1. Use about 10 minutes for invention, thinking about and planning your essay. You might brainstorm, free write, outline, or just jot down ideas.

    2. Draft the essay. Be sure to double space (skip lines). Remember to use paragraphs to indicate steps in your thinking or new aspects of your topic. Try to have an engaging introduction and a good conclusion.

    3. Leave at least 10 minutes to edit and proofread carefully. Look for places that are not clear. Also look for words you may have left out, as well as for spelling, capitalization, correct punctuation of quotations, and sentence boundaries.

    Appendix C

    Kingsborough Community College
    City University of New York
    Department of English

    Departmental writing examination

    The questions below refer to the essay "The Tyranny of the Majority" by Lani Guinier, which was distributed in advance by the instructor. Students are encouraged to use their own marked-up copies. Extra copies are available in the English Department (C309) for students who have not brought their own.


    Use about 30 minutes to answer Parts I and II. Write your answers on the attached paper. Be sure to label each answer with the question number.

    Part I: Answer both questions in Part I.

    1. Summarize, using your own words, paragraph 25.
    2. Paraphrase paragraph 14.

    Part II: Choose one of the following questions. Answer in a paragraph.

    1. How does Nikolas's solution to the Sesame Street Magazine exercise correspond to Guinier's views on "majority rule"?

    2. Explain the irony in Guinier's use of the word "tyranny" to describe a basic principle of the American form of government (see paragraph 12).

    3. Use your own words to explain what Guinier means by a "Madisonian Majority" (paragraph 16). Explain why she gives it this name.

    Part III. Essay: Use about one and one-half hours of the exam time to write your essay.

    Choose one of the topics below. Write a logically organized, well-developed, and carefully proofread essay on the topic. In your essay, refer to Guinier's essay and quote from it. Write the essay on the paper provided.

    1. In her essay, "The Tyranny of the Majority," Lani Guinier questions the equity of a winner-take-all approach to democracy in a multicultural society. Write an essay in which you continue this discussion. In your essay:

    - Using your own words, explain the difference Guinier finds between majority rule in a homogeneous society and a heterogeneous society (see, e.g., paragraph 14).

    - Guinier uses the example of an incident that occurred at Brother Rice High School in Chicago. From your own experience or reading, choose an example of majority rule in a heterogeneous society or community. Describe the experience. Then discuss specifically and in detail how it illustrates the problem posed by Guinier or a possible solution to that problem.

    2. Guinier makes extensive use of her young son's idea of "fairness." Is she stretching her son's vision of the world too far by assuming that what is right and works for children is also right and works for adults, or is she accurate in her application of Nikolas's moral view to the idea of democracy? Write an essay in which you examine these ideas and extend the discussion. In your essay:

    - Summarize, using your own words, what Guinier presents as her son's idea of fairness.

    - From your experience or your reading, choose an example of a conflict or tension between majority and minority interests. Describe this conflict or tension. Then discuss specifically how Nikolas's moral view might apply. Does it offer a possible solution or not? Explain your thinking.

    Appendix D. Sample portfolio evaluation form


    References

    Belanoff, P., & Dickson, M. (Eds.). (1991). Portfolios: Process and product. Portsmouth, NH: Boynton/Cook Heinemann.
    Belanoff, P., & Elbow, P. (1986). Using portfolios to increase collaboration and community in a writing program. WPA: Writing Program Administrator, 9, 27-39.
    Black, L., Daiker, D., Sommers, J., & Stygall, G. (Eds.). (1994). New directions in portfolio assessment: Reflective practice, critical theory, and large-scale scoring. Portsmouth, NH: Boynton/Cook Heinemann.
    Brown, J. D., & Hudson, T. (1998). The alternatives in language assessment. TESOL Quarterly, 32, 653-675.
    Camp, R. (1993). The place of portfolios in our changing view of writing assessment. In R. Bennett & W. C. Ward (Eds.), Construction versus choice in cognitive measurement: Issues in constructed response, performance testing, and portfolio assessment (pp. 183-212). Hillsdale, NJ: Erlbaum.
    Carlson, S., & Bridgeman, B. (1986). Testing ESL student writers. In K. Greenberg, H. Wiener, & R. Donovan (Eds.), Writing assessment: Issues and strategies (pp. 126-152). New York and London: Longman.
    Courts, P. L., & McInerney, K. H. (1993). Assessment in higher education: Politics, pedagogy, and portfolios. Westport, CT: Praeger.
    Elbow, P., & Belanoff, P. (1997). Reflections on an explosion: Portfolios in the 90's and beyond. In K. Yancey & I. Weiser (Eds.), Situating portfolios: Four perspectives (pp. 21-33). Logan, UT: Utah State University Press.
    Gill, K. (1993). Process and portfolios in writing instruction. Urbana, IL: National Council of Teachers of English.
    Hamp-Lyons, L. (Ed.). (1991). Assessing second language writing in academic contexts. Norwood, NJ: Ablex.
    Hamp-Lyons, L. (1994). Interweaving assessment and instruction in college ESL writing classes. College ESL, 4, 43-55.
    Hamp-Lyons, L., & Condon, W. (1993). Questioning assumptions about portfolio-based assessment. College Composition and Communication, 44, 176-190.
    Hamp-Lyons, L., & Condon, W. (2000). Assessing the portfolio: Principles for practice, theory, and research. Cresskill, NJ: Hampton.
    Herman, J. L., Gearhart, M., & Baker, E. (1993). Assessing writing portfolios: Issues in the validity and meaning of scores. Educational Assessment, 1, 201-224.
    Herman, J. L., & Winters, L. (1994). Portfolio research: A slim collection. Educational Leadership, 52, 48-55.
    Huck, S. W., & Cormier, W. H. (1996). Reading statistics and research. New York, NY: HarperCollins.
    Huot, B. (1992, October). Portfolios, fad or revolution? Paper presented at New Directions in Portfolio Assessment, the Fourth Miami Conference on the Teaching of Writing, Miami, OH.
    Huot, B., & Williamson, M. (1997). Rethinking portfolios for evaluating writing: Issues of assessment and power. In K. Yancey & I. Weiser (Eds.), Situating portfolios: Four perspectives (pp. 43-56). Logan, UT: Utah State University Press.
    Hutchings, P. (1990). Learning over time: Portfolio assessment. AAHE Bulletin (April), 6-8.
    Johns, A. M. (1991). Interpreting an English competency examination. Written Communication, 8, 379-401.
    Jones, J. W. (1992). Evaluation of the English as a Second Language portfolio assessment project at Borough of Manhattan Community College. A practicum report presented to Nova University.
    Kearns, E. (1993). On the running board of the portfolio bandwagon. WPA: Writing Program Administrator, 16, 50-58.
    LeMahieu, P., Gitomer, D. H., & Eresh, J. T. (1995). Portfolios in large-scale assessment: Difficult but not impossible. Educational Measurement: Issues and Practice, 14, 11-16, 25-28.
    Markstein, L., Withrow, J., Brookes, G., & Price, S. (1992, March). A portfolio assessment experiment for college ESL students. Paper presented at the 1992 TESOL Convention, Vancouver.
    Moss, P. (1994). Validity in high stakes writing assessment: Problems and possibilities. Assessing Writing, 1, 109-128.
    Raimes, A. (1987). Language proficiency, writing ability, and composing strategies: A study of ESL college student writers. Language Learning, 37, 439-467.
    Ruetten, M. K. (1994). Evaluating ESL students' performance on proficiency exams. Journal of Second Language Writing, 3, 85-96.
    Thompson, R. M. (1990). Writing proficiency tests and remediation: Some cultural differences. TESOL Quarterly, 24, 99-102.
    White, E. (1992). Portfolios as an assessment concept. Paper presented at New Directions in Portfolio Assessment, the Fourth Miami Conference on the Teaching of Writing, Miami, OH.
    Williams, J. D. (1998). Preparing to teach writing. Mahwah, NJ: Lawrence Erlbaum.
    Williams, J. D. (2000). Identity and reliability in portfolio assessment. In B. Sunstein & J. Lovell (Eds.), The portfolio standard (pp. 135-148). Portsmouth, NH: Heinemann.
    Yancey, K. B. (1999). Looking back as we look forward: Historicizing writing assessment. College Composition and Communication, 50, 483-503.