Introduction to Assessment



  • Introduction to Assessment
    - ESL Materials and Testing, Week 8

  • What is assessment?
    - Not the same as testing!
    - An ongoing process to ensure that the course/class objectives and goals are met. A process, not a product.
    - A test is a form of assessment (Brown, 2004, p. 5).

  • Informal and Formal Assessment
    - Informal assessment can take a number of forms: unplanned comments, verbal feedback to students, observing students perform a task or work in small groups, and so on.
    - Formal assessments are exercises or procedures that are systematic and give students and teachers an appraisal of students' achievement, such as tests.

  • Traditional Assessment
    - Multiple-choice
    - True-false
    - Matching
    - Norm-referenced and criterion-referenced tests

  • Norm-referenced and Criterion-referenced Tests
    - Norm-referenced tests: standardized tests (College Board, TOEFL, GRE); they place test-takers on a mathematical continuum in rank order.
    - Criterion-referenced tests: give test-takers feedback on specific objectives (criteria); they test the objectives of a course, which gives them instructional value (the contrast is sketched below).
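
    To make the contrast concrete, here is a minimal sketch in Python (not part of the lecture; the scores and the 70-point mastery cutoff are invented for illustration). The same raw scores are read two ways: as a rank within the group (norm-referenced) and as pass/fail against a fixed objective cutoff (criterion-referenced).

        def percentile_rank(score, all_scores):
            """Norm-referenced view: where does a score fall relative to the group?"""
            below = sum(1 for s in all_scores if s < score)
            return 100 * below / len(all_scores)

        def mastery(score, cutoff=70):
            """Criterion-referenced view: did the test-taker meet the objective's cutoff?"""
            return score >= cutoff

        scores = [55, 62, 68, 71, 74, 80, 88, 93]   # hypothetical class results
        for s in scores:
            print(f"score {s}: percentile rank {percentile_rank(s, scores):.0f}, mastery {mastery(s)}")

    A score of 68, for example, out-ranks several classmates but still fails the criterion; the two interpretations answer different questions.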

  • Authentic Assessment
    - Authentic assessment reflects student learning, achievement, motivation, and attitudes on instructionally relevant classroom activities (O'Malley & Valdez, 1996).
    - Examples: performance assessment, portfolios, self-assessment.

  • Purposes for Assessment
    - Diagnose students' strengths and needs
    - Provide feedback on student learning
    - Provide a basis for instructional placement
    - Inform and guide instruction
    - Communicate learning expectations
    - Motivate and focus students' attention and effort
    - Provide practice applying knowledge and skills

  • Purposes (continued)
    - Provide a basis for evaluation for the purpose of:
      - Grading
      - Promotion/graduation
      - Program admission/selection
      - Accountability
    - Gauge program effectiveness

  • Assessment Instruments

  • Discussion
    - How would you document a student's performance during a discussion?
    - Which types of assessment noted in the chart could be considered authentic assessment?

  • Principles of Language Assessment
    - Practicality
    - Reliability
    - Validity
    - Authenticity
    - Washback

  • Practicality
    - An effective test is practical:
      - Is not excessively expensive
      - Stays within appropriate time constraints
      - Is relatively easy to administer
      - Has a scoring/evaluation procedure that is specific and time-efficient

  • Reliability
    - A reliable test is consistent and dependable. If you give the same test to the same students on two different occasions, it should yield similar results.
    - Student-related reliability
    - Rater reliability
    - Test administration reliability
    - Test reliability

  • Student-related Reliability
    - The most common issues in student-related reliability are temporary illness, fatigue, a "bad day", anxiety, and other physical and psychological factors, which may make an observed score deviate from the true score.

  • Rater Reliability
    - Human error, subjectivity, and bias may enter into the scoring process.
    - Inter-rater unreliability occurs when two or more scorers yield inconsistent scores on the same test, possibly because of lack of attention to scoring criteria, inexperience, inattention, or even preconceived bias toward particular "good" and "bad" students (a simple agreement check is sketched below).
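
    As a minimal illustration (not from the slides), one simple way to check rater consistency is the percentage of exact agreement between two raters' scores; the two score lists below are invented for the example.

        rater_a = [4, 3, 5, 2, 4, 3, 5, 1]   # hypothetical scores on an eight-essay set
        rater_b = [4, 2, 5, 2, 3, 3, 5, 2]

        agreements = sum(1 for a, b in zip(rater_a, rater_b) if a == b)
        percent_agreement = 100 * agreements / len(rater_a)

        # Low agreement signals a rater-reliability problem (unclear criteria, inexperience, bias).
        print(f"Exact agreement: {percent_agreement:.0f}%")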

  • Test Administration Reliability
    - Test administration reliability deals with the conditions in which the test is administered:
      - Street noise outside the building
      - Bad equipment
      - Room temperature
      - The condition of chairs and tables
      - Photocopying variation

  • Test Reliability
    - The test is too long
    - Poorly written or ambiguous test items

  • Validity
    - A test is valid if it actually assesses the objectives and what has been taught.
    - Content validity
    - Criterion validity (tests the objectives)
    - Construct validity
    - Consequential validity
    - Face validity

  • Content Validity
    - A test has content validity if the teacher can clearly define the achievement that he or she is measuring.
    - A test of tennis competency that asks someone to run a 100-yard dash lacks content validity.
    - If a teacher uses the communicative approach to teach speaking and then uses the audiolingual method to design the test items, the test is going to lack content validity.

  • Criterion-related Validity
    - The extent to which the objectives of the test have been measured or assessed. For instance, if you are assessing reading skills such as scanning and skimming, how are the exercises designed to test these objectives?
    - In other words, the test is valid if the objectives taught are the objectives tested and the items are actually testing these objectives.

  • Construct Validity
    - A construct is an explanation or theory that attempts to explain observed phenomena.
    - If you are testing vocabulary and the lexical objective is to use the lexical items for communication, having students write definitions on the test will not match the construct of communicative language use.

  • Consequential Validity
    - Accuracy in measuring the intended criteria
    - Its impact on the preparation of test-takers
    - Its effect on the learner
    - Social consequences of a test's interpretation (e.g., the exit exam for pre-basic students at El Colegio, the College Board)

  • Face Validity
    - Face validity refers to the degree to which a test looks right and appears to measure the knowledge or ability it claims to measure:
      - A well-constructed, expected format with familiar tasks
      - A test that is clearly doable within the allotted time limit
      - Directions that are crystal clear
      - Tasks that relate to the course (content validity)
      - A difficulty level that presents a reasonable challenge

  • Authenticity
    - The language in the test is as natural as possible
    - Items are contextualized rather than isolated
    - Topics are relevant and meaningful for learners
    - Some thematic organization of items is provided
    - Tasks represent, or closely approximate, real-world tasks

  • Washback
    - Washback refers to the effects tests have on instruction, in terms of how students prepare for the test.
    - Cram courses and "teaching to the test" are examples of such washback.
    - In some cases the student may learn while working on a test or assessment.
    - Washback can be positive or negative.

  • Alternative Assessment Options
    - Self- and peer-assessments:
      - Oral production: student self-checklist, peer checklist, offering and receiving a holistic rating of an oral presentation
      - Listening comprehension: listening to TV or radio broadcasts and checking comprehension with a partner
      - Writing: revising work on your own, peer-editing
      - Reading: reading textbook passages followed by self-check comprehension questions, self-assessment of reading habits
    - (Brown, 2001, p. 416)

  • Authentic Assessment
    - Performance assessment: any form of assessment in which the student constructs a response orally or in writing. It requires the learner to accomplish a complex and significant task, while bringing to bear prior knowledge, recent learning, and relevant skills to solve realistic or authentic problems (O'Malley & Valdez, 1996; Herman et al., 1992).

  • Examples of Authentic Assessment
    - Portfolio assessment
    - Student self-assessment
    - Peer assessment
    - Student-teacher conferences
    - Oral interviews
    - Writing samples
    - Projects or exhibitions
    - Experiments or demonstrations

  • Characteristics of Performance Assessment
    - Constructed response
    - Higher-order thinking
    - Authenticity
    - Integrative
    - Process and product
    - Depth versus breadth

  • Journals
    - Specify to students the purpose of the journal
    - Give clear directions to students on how to get started (prompts, for instance, "I was very happy when…")
    - Give guidelines on the length of each entry
    - Be clear yourself on the principal purpose of the journal
    - Help students to process your feedback, and show them how to respond to your responses

  • Conferences
    - Commonly used when teaching writing
    - One-on-one interaction between teacher and student
    - Conferences are formative assessment, as opposed to offering a final grade or a summative assessment. In other words, they are meant to provide guidance and feedback.

  • Portfolios
    - Commonly used with the communicative language teaching (CLT) approach
    - A portfolio is a collection of student work that demonstrates to students and others the effort, progress, and achievement in a given area. You can have a reading portfolio or a writing portfolio, for instance.
    - You can also have a reflective or assessment portfolio, as opposed to collecting every piece of evidence for each objective achieved in the course.

  • Portfolio Guidelines
    - Specify the purpose of the portfolio
    - Give clear directions to students on how to get started
    - Give guidelines on acceptable materials or artifacts
    - Collect portfolios on pre-announced dates and return them promptly
    - Help students to process your feedback
    - Establish a rubric to evaluate the portfolio and discuss it with your students

  • Cooperative Test Construction
    - Cooperative test construction involves the students' contribution to the design of test items. It is based on the concept of collaborative and cooperative learning, in which students are involved in the process (Brown, 2001, p. 420).