Measuring teachers' use of formative assessment: A learning progressions approach

Brent Duckor (SJSU), Diana Wilmot (PAUSD), Bill Conrad & Jimmy Scherrer (SCCOE), Amy Dray (UCB)


Page 1: Measuring teachers' use of formative assessment: 

A learning progressions approach

Measuring teachers' use of formative assessment:

Brent Duckor (SJSU), Diana Wilmot (PAUSD)

Bill Conrad & Jimmy Scherrer (SCCOE), Amy Dray (UCB)

Page 2: Measuring teachers' use of formative assessment: 

Why formative assessment?

Black and Wiliam (1998) report that studies of formative assessment (FA) show effect sizes on standardized tests between 0.4 and 0.7, larger than those of most educational interventions and roughly equivalent to eight months of additional instruction;

Further, FA is particularly effective for low achieving students, narrowing the gap between them and high achievers, while raising overall achievement;

Enhancing FA would be an extremely cost-effective way to improve teaching practice;

Unfortunately, they also find that FA is relatively rare in the classroom, and that most teachers lack effective FA skills.

Page 3: Measuring teachers' use of formative assessment: 

Measuring effective FA practice: Toward a cycle of psychometric inquiry

Define constructs in multi-dimensional space.

Design items and observation protocols.

Iterate the scoring strategy in alignment with construct maps.

Apply a measurement model to validate teacher- and item-level claims and to warrant inferences about effective practice.

Page 4: Measuring teachers' use of formative assessment: 
Page 5: Measuring teachers' use of formative assessment: 

Teachers who practice assessment for learning:

know, understand, and can articulate, in advance of teaching, the achievement targets their students are to hit;

inform their students about those learning goals, in terms that students understand;

translate classroom assessment results into frequent descriptive feedback for students, providing them with specific insights on how to improve;

continuously adjust instruction based on the results of classroom assessments.

Ask an expert

Page 6: Measuring teachers' use of formative assessment: 

Research suggests

FA 1.0

Page 7: Measuring teachers' use of formative assessment: 

Good formative feedback

The type of feedback (Butler, 1998; Butler & Neuman, 1995; Hattie & Timperley, 2007; Kluger & DeNisi, 1996);

the level of specificity and task relatedness (Tunstall & Gipps, 1996; Ames, 1992; Dweck, 1986); and

the "next steps" required of students (Butler & Neuman, 1995) can all influence the effectiveness of formative assessment on classroom learning.

FA 2.0

Page 8: Measuring teachers' use of formative assessment: 

The Zone of study

Page 9: Measuring teachers' use of formative assessment: 

[Diagram showing the cycle of Construct Map, Items Design, Outcome Space, and Measurement Model, linked by Reliability and Validity evidence]

Page 10: Measuring teachers' use of formative assessment: 

Phase 1: Construct Mapping

Page 11: Measuring teachers' use of formative assessment: 

Knowledge of formative assessment

Page 12: Measuring teachers' use of formative assessment: 

Mapping a route … FA 3.0

Novice teaching

Expert teaching

Page 13: Measuring teachers' use of formative assessment: 

Skills allocation at the "Soliciting responses" level
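To make the idea of a construct map concrete, here is a minimal sketch of one represented as an ordered set of levels running from novice to expert teaching. Apart from "Soliciting responses," which the deck names, the level labels and descriptions below are hypothetical placeholders, not the project's instrument.

```python
# A minimal sketch (not the project's instrument) of a construct map for
# "knowledge of formative assessment" as an ordered set of levels from novice
# to expert practice. Level names other than "Soliciting responses" are
# hypothetical placeholders.
from dataclasses import dataclass

@dataclass(frozen=True)
class ConstructLevel:
    score: int          # ordinal position on the map (higher = more expert)
    label: str          # short name for the level
    description: str    # what teacher practice looks like at this level

KFA_CONSTRUCT_MAP = [
    ConstructLevel(0, "Pre-formative", "No intentional elicitation of student thinking."),      # hypothetical
    ConstructLevel(1, "Soliciting responses", "Poses questions and collects student answers."),
    ConstructLevel(2, "Noticing thinking", "Attends to the substance of student responses."),    # hypothetical
    ConstructLevel(3, "Responsive moves", "Follows up and connects responses to learning goals."),  # hypothetical
]

def level_for_score(score: int) -> ConstructLevel:
    """Look up the construct-map level that a scored observation maps to."""
    for level in KFA_CONSTRUCT_MAP:
        if level.score == score:
            return level
    raise ValueError(f"No construct-map level defined for score {score}")
```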

Page 14: Measuring teachers' use of formative assessment: 

[Diagram showing the cycle of Construct Map, Items Design, Outcome Space, and Measurement Model, linked by Reliability and Validity evidence]

Page 15: Measuring teachers' use of formative assessment: 

Phase 2: Items Design

Align items (tasks, prompts, scenarios) with the levels on the construct map.

Build items that map onto a refined set of formative assessment practices.

Consider various item types and delivery platforms.

Review an example of a simulated scenario with a focus on turns of talk.

Consider content and construct validity, as well as inter-rater reliability in construction and use of items.

Page 16: Measuring teachers' use of formative assessment: 

Consider the context: The Lesson Cycle

Initial framing of a question within a high-level task.

Study the enactment of formative assessment within a science classroom engaged in high-level tasks in ecology.

Page 17: Measuring teachers' use of formative assessment: 

The evolution of a Task

The Enacted Task

Stein, Remillard, & Smith (2007)

Page 18: Measuring teachers' use of formative assessment: 

Delivery platforms for data collection

Traditional "paper and pencil" questions

Classroom-based "authentic" tasks

Lesson planning/enactment/reflections

Lesson study of best formative assessment practices

Innovative adaptive virtual scenarios (AVS)

Video episodes

Web-based virtual platforms

Page 19: Measuring teachers' use of formative assessment: 

Adaptive virtual scenarios

Present a range of novice-through-expert formative assessment practice in video format.

Pause videos throughout the teacher's enactment to measure the teacher's level of sophistication.

Page 20: Measuring teachers' use of formative assessment: 

Potential items within simulated scenarios

Pause the video and ask the teacher:
How would you characterize this initial move?
What do you notice about the teacher's questioning strategy?
How might you compare one questioning strategy with another?
What would you do next?

Replay the video and show the teacher's follow-up moves.

Pause and ask the teacher what they think:
How would you negotiate this student response?
What kind of question would you pose next?
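As one way to picture such an item, here is a minimal sketch, with hypothetical names and file paths, of how a simulated-scenario item with pause points and prompts might be stored. It reuses the prompts above and does not describe any particular delivery platform.

```python
# A minimal sketch, using hypothetical identifiers, of one adaptive virtual
# scenario (AVS) item: a classroom video plus the prompts posed to the teacher
# at each pause point. Illustrative only; not an existing platform's format.
from dataclasses import dataclass, field

@dataclass
class PausePoint:
    timestamp_s: float      # where in the video to pause
    prompts: list[str]      # open-ended questions posed to the teacher

@dataclass
class AVSItem:
    item_id: str
    video_uri: str          # hypothetical location of the classroom video
    pause_points: list[PausePoint] = field(default_factory=list)

item = AVSItem(
    item_id="ecology_hl_task_01",
    video_uri="videos/ecology_discussion.mp4",
    pause_points=[
        PausePoint(95.0, [
            "How would you characterize this initial move?",
            "What do you notice about the teacher's questioning strategy?",
            "What would you do next?",
        ]),
        PausePoint(210.0, [
            "How would you negotiate this student response?",
            "What kind of question would you pose next?",
        ]),
    ],
)
```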

Page 21: Measuring teachers' use of formative assessment: 

Sample of teacher responses

The initial move was: "literal", "more open ended".

The teacher's response to students was: "too directed towards the right answer", "thoughtfully provoking misconceptions", "open enough to provoke deeper conceptual knowledge across the classroom".

"I would have asked a question like, …[a question that uses student thinking as a basis]."

Page 22: Measuring teachers' use of formative assessment: 

Administration of tools

Administer tools with the widest range of teachers possible:
Pre-service
Induction years
Veteran, 5-9 years
Veteran, 10+ years

Use adaptive technology to capture each teacher's zone of proximal development in formative assessment practice.

Examine the relationship (if any) between scores on multiple tools.

Page 23: Measuring teachers' use of formative assessment: 

[Diagram showing the cycle of Construct Map, Items Design, Outcome Space, and Measurement Model, linked by Reliability and Validity evidence]

Page 24: Measuring teachers' use of formative assessment: 

Phase 3: Defining the outcome space

Link each item response back to the levels on the construct map.

Use scoring guides to capture the granularity of formative assessment practices.

Consider various types of scoring guides, e.g., rubrics, observation protocols, and coding schemes.

Provide exemplars of practice to assist in scoring protocols.

Consider content and construct validity, as well as inter-rater reliability in construction and use of scoring guides.

Page 25: Measuring teachers' use of formative assessment: 

Problem of practice: Formative assessment

Teachers have difficulty scaffolding student thinking and reasoning through discourse

As a result, the cognitive demand of a task often declines during implementation (e.g., TIMSS, QUASAR)

Page 26: Measuring teachers' use of formative assessment: 

A tool for measuring teachers' use of formative assessment

1. Initiate participation in classroom discussions

2. Respond to students' contributions during a discussion

Scherrer & Stein (In Press)

Page 27: Measuring teachers' use of formative assessment: 

An Example of How to Apply the Codes in Context

1. Teacher: What can you tell me about this shape?

2. Juan: It has 4 right angles.

3. Teacher: What else can you tell me?

4. Kayla: It is a rectangle.

5. Teacher: Okay, a rectangle. Why do you think it is a rectangle?

6. Kayla: It has 4 sides.

7. Teacher: Are all shapes that have 4 sides rectangles?

8. Yasmin: It could also be a quadrilateral.

9. Teacher: Wait. I am asking if all shapes with 4 sides are rectangles.

Codes for the teacher's turns:
Turn 1: Launch
Turn 3: Collect
Turn 5: Repeat, Uptake
Turn 7: Uptake-Literal
Turn 9: Terminal, Reinitiate

Page 28: Measuring teachers' use of formative assessment: 

Scoring the Codes

1. Teacher: What can you tell me about this shape?

2. Juan: It is a quadrilateral.

3. Teacher: What else can you tell me?

4. Kayla: It is a rectangle.

5. Teacher: Okay, what else?

6. Yasmin: It has four right angles.

7. Teacher: Okay, what about this shape over here? What can you tell me about this one?

Codes and scores for the teacher's turns:
Turn 1: Launch (+1)
Turn 3: Collect (+0)
Turn 5: Collect (+0)
Turn 7: Launch (+1)

In this example, the teacher did not "do" anything with the student responses.

Page 29: Measuring teachers' use of formative assessment: 

Scoring the Codes

1. Teacher: What can you tell me about this shape?

2. Juan: It is a quadrilateral.

3. Teacher: What else can you tell me?

4. Kayla: It is a rectangle.

5. Teacher: Okay, Juan said this shape is a quadrilateral, and Kayla said it is a rectangle. What is similar about quadrilaterals and rectangles?

Codes and scores for the teacher's turns:
Turn 1: Launch (+1)
Turn 3: Collect (+1)
Turn 5: Connect (+2)

In this example, the teacher asked an open-ended question, gathered an additional response to that question, and then connected the two responses.
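For readers who prefer to see the bookkeeping spelled out, here is a minimal sketch of how coded turns and rater-assigned points could be represented in code, using the two examples above. The episode totals are simple arithmetic added here for illustration and are not shown in the deck itself.

```python
# A minimal sketch, with hypothetical names, of representing coded teacher
# turns and the points a rater assigned to them. Codes and point values come
# from the two worked examples on the previous slides.
from dataclasses import dataclass

@dataclass(frozen=True)
class CodedTurn:
    turn: int      # teacher turn number in the transcript
    code: str      # e.g. "Launch", "Collect", "Connect", "Uptake", "Literal"
    points: int    # credit the rater assigned to this move

def episode_score(turns: list[CodedTurn]) -> int:
    """Total formative-assessment credit for one coded episode."""
    return sum(t.points for t in turns)

# Example 1: the teacher collects responses but does nothing with them.
episode_1 = [CodedTurn(1, "Launch", 1), CodedTurn(3, "Collect", 0),
             CodedTurn(5, "Collect", 0), CodedTurn(7, "Launch", 1)]

# Example 2: the teacher collects responses and then connects them.
episode_2 = [CodedTurn(1, "Launch", 1), CodedTurn(3, "Collect", 1),
             CodedTurn(5, "Connect", 2)]

print(episode_score(episode_1))  # 2
print(episode_score(episode_2))  # 4
```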

Page 30: Measuring teachers' use of formative assessment: 

Connecting codes to response levels on the construct map

Scoring designation   Code sequence (response level)
L4                    Launch, Collect, Connect, Uptake
L3                    Launch, Collect, Uptake
L2                    Launch, Collect, Literal
L1                    Launch, Literal, Literal
L0                    Literal, Literal, Literal
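Here is a minimal sketch of how the table above could be applied when scoring coded episodes. The exact-match lookup is an illustrative simplification, not the authors' full scoring guide, which may handle other sequences differently.

```python
# A minimal sketch of mapping an observed code sequence to a response level
# (L0-L4) on the construct map, following the table above.
LEVELS = {
    ("Launch", "Collect", "Connect", "Uptake"): "L4",
    ("Launch", "Collect", "Uptake"): "L3",
    ("Launch", "Collect", "Literal"): "L2",
    ("Launch", "Literal", "Literal"): "L1",
    ("Literal", "Literal", "Literal"): "L0",
}

def response_level(codes: list[str]) -> str:
    """Return the construct-map level for a coded sequence of teacher moves."""
    try:
        return LEVELS[tuple(codes)]
    except KeyError:
        raise ValueError(f"Sequence not in the scoring table: {codes}") from None

print(response_level(["Launch", "Collect", "Uptake"]))  # L3
```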

Page 31: Measuring teachers' use of formative assessment: 

[Diagram showing the cycle of Construct Map, Items Design, Outcome Space, and Measurement Model, linked by Reliability and Validity evidence]

Page 32: Measuring teachers' use of formative assessment: 

Phase 4: Applying measurement models

Cross-reference qualitative construct maps with technically calibrated Wright maps using IRT.

Employ person and item statistics to check model fit.

Consider various types of measurement models, including facets models.

Provide individual and group level data on progress.

Consider “internal structure” validity evidence for construct maps, in addition to checks on reliability of tools.
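To make the measurement-model step concrete, here is a minimal sketch, assuming simulated dichotomous teacher-by-item data, of calibrating a Rasch model with joint maximum likelihood and printing a crude text Wright map. It is illustrative only; it is not the project's analysis pipeline, and production work would use established IRT software and proper model-fit checks rather than this hand-rolled estimator.

```python
# A minimal sketch: simulate teacher-by-item responses under a Rasch model,
# recover person and item parameters by joint maximum likelihood, and print a
# crude text "Wright map" placing teachers and items on one logit scale.
import numpy as np

rng = np.random.default_rng(0)

n_teachers, n_items = 200, 8
true_theta = rng.normal(0, 1, n_teachers)      # teacher proficiency (logits)
true_beta = np.linspace(-1.5, 1.5, n_items)    # item difficulty (logits)

# Simulate 0/1 responses under the Rasch model P(x=1) = logistic(theta - beta).
p = 1 / (1 + np.exp(-(true_theta[:, None] - true_beta[None, :])))
X = (rng.random((n_teachers, n_items)) < p).astype(float)

# Joint maximum likelihood: alternate Newton updates for persons and items.
theta = np.zeros(n_teachers)
beta = np.zeros(n_items)
for _ in range(50):
    P = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
    W = P * (1 - P)
    theta += (X - P).sum(axis=1) / W.sum(axis=1)   # person update
    theta = np.clip(theta, -5, 5)                  # keep perfect/zero scores finite
    P = 1 / (1 + np.exp(-(theta[:, None] - beta[None, :])))
    W = P * (1 - P)
    beta -= (X - P).sum(axis=0) / W.sum(axis=0)    # item update
    beta -= beta.mean()                            # anchor mean item difficulty at 0

# Crude Wright map: person distribution (left) and item difficulties (right).
for lo in np.arange(-3.0, 3.0, 0.5):
    n_persons = int(((theta >= lo) & (theta < lo + 0.5)).sum())
    bar = "#" * (n_persons // 2)                   # one '#' per two teachers
    placed = " ".join(f"i{j + 1}" for j in range(n_items) if lo <= beta[j] < lo + 0.5)
    print(f"{lo:+5.1f} | {bar:<22} | {placed}")
```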

Page 33: Measuring teachers' use of formative assessment: 

Berkeley Evaluation and Assessment Research (BEAR) Center

Item response theory can model a “learning progression” within a particular domain. For example:

KSC: Knowledge of science content

KST: Knowledge of student thinking

KFA: Knowledge of formative assessment
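As a small illustration of how such a multidimensional progression could be set up, the sketch below builds a Q-matrix assigning hypothetical items to the three dimensions named above (KSC, KST, KFA). The item labels and assignments are invented for illustration, not taken from the deck.

```python
# A minimal sketch of a between-item multidimensional design: each item is
# assigned to exactly one of the three knowledge dimensions, so a
# multidimensional IRT model can report a proficiency estimate per dimension.
# Item labels and assignments below are hypothetical.
import numpy as np

DIMENSIONS = ["KSC", "KST", "KFA"]

item_dimension = {
    "item_01": "KSC", "item_02": "KSC",
    "item_03": "KST", "item_04": "KST",
    "item_05": "KFA", "item_06": "KFA",
}

# Q-matrix: rows are items, columns are dimensions; a 1 marks the dimension an
# item loads on. This is the design input a multidimensional model would use.
items = sorted(item_dimension)
Q = np.zeros((len(items), len(DIMENSIONS)), dtype=int)
for i, item in enumerate(items):
    Q[i, DIMENSIONS.index(item_dimension[item])] = 1

print(Q)
```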

Page 34: Measuring teachers' use of formative assessment: 

Berkeley Evaluation and Assessment Research (BEAR) Center

Page 35: Measuring teachers' use of formative assessment: 
Page 36: Measuring teachers' use of formative assessment: 
Page 37: Measuring teachers' use of formative assessment: 
Page 38: Measuring teachers' use of formative assessment: 

Case study: Developing an Integrated Assessment System (DIAS) for teacher education

Pamela Moss, University of Michigan
Mark Wilson, University of California, Berkeley

The goal is to develop an assessment system that:
focuses on teaching practice grounded in professional and disciplinary knowledge as it develops over time;
addresses the multiple purposes of a broad array of stakeholders working in different contexts; and
creates the foundation for programmatic coherence and professional development across time and institutional contexts.

Page 39: Measuring teachers' use of formative assessment: 

Case study: DIAS

The research team identified the ways in which student teachers can learn to use formative and summative assessment to guide their students’ learning.

Developed a construct map that outlined a progression of learning in “Assessment.”

Described the different aspects of "Assessment," such as:
identifying the mathematical target to be assessed;
understanding the purposes of the assessment;
designing appropriate and feasible tasks (such as end-of-class checks); and
developing accurate inferences about individual student and whole-class learning.

Page 40: Measuring teachers' use of formative assessment: 

Case study: DIAS

The research team collected data from student teachers enrolled in the elementary mathematics teacher education program at the University of Michigan.

Designed scoring guides based on the construct map.

Coded videotapes from over 100 student teachers as they conducted lessons in the classroom.

Coded associated collected data, such as lesson plans and reflections, since these documents contain information about what the student teachers hope to learn from the assessment(s) and what they infer about the students in their classroom.

Is using item response methods to determine which aspects of assessment practice are easier or more difficult for the student teachers, thereby informing the teacher education program.

Page 41: Measuring teachers' use of formative assessment: 

Synopsis

Incredible partnership

Filling an important educational research space

Identified the assessment space

Focus on the content

Emphasis on student thinking

Contributions to instrumentation and methodology

Marrying qualitative and quantitative data within an IRT framework

Next steps: pilot study

Page 42: Measuring teachers' use of formative assessment: 

Contact Information

Bill Conrad & Jimmy Scherrer
Santa Clara County Office of Education
[email protected] (Office), 510-761-2007 (Cell)
[email protected]

Brent Duckor, Ph.D.
Assistant Professor, College of Education
San Jose State University
[email protected]

Diana Wilmot, Ph.D.
Coordinator, Research & Evaluation
Palo Alto Unified School District
[email protected]

Amy Dray, Ph.D.
UC Berkeley Graduate School of Education
[email protected]

Page 43: Measuring teachers' use of formative assessment: 

A learning progressions approach

Measuring teachers' use of formative assessment:

Brent Duckor (SJSU), Diana Wilmot (PAUSD)

Bill Conrad & Jimmy Scherrer (SCCOE), Amy Dray (UCB)