
California Educational Research Association Annual Conference
November 19, 2009
San Francisco, CA

Margaret Heritage

California Assessment Literacy Initiative for Improved Student Learning

Session Objectives

1) Provide background to the California Assessment Literacy Initiative (CALI)

2) Present goals and framework for CALI

3) Present content ideas and delivery system design ideas

4) Solicit feedback on the above

CALI Steering Committee

• Gina Koency (LACOE)

• Paula Carroll (SJCOE)

• Kathryn Edwards (LACOE)

• Karen Greer (LACOE)

• Margaret Heritage (CRESST/UCLA)

• Mary Tribbey (BCOE)

The Question…

“Cheshire …,” Alice began rather timidly, “would you tell me please, which way I ought to go from here?”

“That all depends a good deal on where you want to get to,” said the Cat.

Lewis Carroll

The Answer…

“Cheshire …,” Alice began rather timidly, “would you tell me please, which way I ought to go from here?”

“That all depends on what your data tell you,” said the Cat.

Lewis Carroll

“When you have a robust data system and professional development system working hand in hand, there is total alignment for the kids.”

Chris Steinhauser, Long Beach Unified School District Superintendent

Background

Increased Focus on Data Use

“The collection, analysis and use of educational data are central to the improvement of student outcomes envisioned by No Child Left Behind (NCLB)”

(U.S. Department of Education, 2009, p. vii).

Data Use

Data use leads to improvements in student learning

(Snipes, Doolittle, & Herlihy, 2002; Williams, Kirst, Haertel, et al., 2005; Armstrong & Anthes, 2001; Cawelti & Protheroe, 2001; LaRocque, 2007; Symonds, 2004; Togneri & Anderson, 2003).

Increased Focus on Data Use

• Increased attention paid to data use

(Heritage & Yeagley, 2005; Ikemoto & Marsh, 2007; Mandinach & Honey, 2008; Wayman & Stringfield, 2006).

BUT

• Expansion of data gathering without a corresponding jump in data use

(Data Quality Campaign, 2006; Hamilton et al., 2009; Rothman, 2008).

Lack of Data/Assessment Literacy

• Data literacy “…presumes an accumulating facility with the interpretation of data, not to mention a familiarity with data sources and creativity in assembling relevant data quickly and efficiently”

(Knapp, Swinnerton, Copland, & Monpas-Huber, 2006, p. 13).

• Administrators and teachers lack skills to use data effectively

(Datnow, Park, & Wohlstetter, 2007; Heritage, Lee, Chen, & LaTorre, 2005; Stiggins, 2002).

What is Assessment Literacy? (HO)

Assessment literate educators:

• Design/evaluate assessments

• Identify learning targets

• Provide differentiated instruction & assessments

• Monitor student progress

• Organize, analyze, interpret, and use data

• Evaluate reliability/validity of assessments

• Engage students in assessment process & provide feedback

Survey

• Administer survey of assessment literacy skills and assessment data use

Project Goals (HO)

1) Build county office capacity

2) Create and disseminate Web-based assessment literacy professional development modules

3) Conduct impact studies

4) Provide guidance to teachers focused on using existing state resources

5) Provide teachers with clear guidance on how to effectively utilize assessment results

Discussion

• Do the attributes of assessment literacy seem right to you?

• What would you change?

• Do the project goals seem right to you?

• What would you change?

Collaborative Model

CALI Professional Development Program: Framework

Data Use Processes and Skills

Research calls for:

• A systematic process facilitating data use

(Armstrong & Anthes, 2001; Boudett, City, & Murnane, 2005; Heritage & Chen, 2005)

• Strong data-literate leadership at all levels of the system

(Mason, 2002; Herman & Gribbons, 2001)

• A data-driven culture

(Datnow, Park, & Wohlstetter, 2007; Lachat & Smith, 2005)

Assessment System

• Coherent: built on a well-structured conceptual base

• Comprehensive: provides a variety of evidence

• Continuous: provides indications of student growth over time

(NRC, 2001, p. 259)

Different levels of granularity for different purposes

Annual State Tests

Interpretation

What students have learned / have they met the standard?

Differences among groups

Strengths/weaknesses in individuals’ and groups’ learning

Strengths/weaknesses in curriculum/instruction/programs

Are our improvement strategies working?

Annual State Tests

Action

Adjustments to curriculum, instruction, programs

Monitoring/accountability

Reporting

Inform professional development needs

Quarterly/Monthly Assessments

Interpretation

Progress monitoring for individuals: have students reached the benchmark?

Differences among groups

Strengths/weaknesses in individuals’ and groups’ learning

Strengths/weaknesses in curriculum/instruction

Quarterly/Monthly Assessments

Action

Adjustments to curriculum, instruction

Progress monitoring / accountability

Reporting

Minute-by-minute/Daily/Weekly: Interpretation

Identify the gap between current status and the desired learning goals

Identify individual misconceptions/difficulties

Identify missing building blocks

Minute-by-minute/Daily/Weekly: Action

What to do to move learning forward

Adjustments to ongoing instruction/learning

Feedback

Assessments in the System

Assessment Knowledge

The PD modules will address specific knowledge about the assessments:

a) range of assessment task types (constructed response, selected response, performance, instructional task)

b) purpose of each of the assessments and their potential use

c) assessment quality: validity (including alignment), reliability, freedom from bias, and usability

d) the need for multiple measures to inform decisions

Assessment Use

Assessment Skills

The PD modules will address specific skills for assessment use:

a) establishing learning goals accessible to all students

b) interpretation skills, including statistical knowledge

c) matching learning opportunities to learners’ needs

d) feedback to learners

Example

Guidelines for Writing Data Statements and Summaries

Each statement should:

• Communicate a single idea about student achievement

• Present the facts objectively rather than state evaluative or explanatory comments

• Be a short, clear sentence or phrase in everyday language that is easy to understand

• Be an independent statement, that is, its meaning should not be dependent on other statements

• Represent the data accurately by including relevant numerical data when needed for evidence

• Finally, review all of the data statements and identify the most important ideas that convey the story about achievement

(Adapted from Van Houten, L., Miyasaka, J., Agullard, K., & Zimmerman, J. (2006). Developing an Effective School Plan: An Activity-Based Guide to Understanding Your School and Improving Student Outcomes. Oakland, CA: WestEd.)

Writing Data Statements and Summaries: Activity

Write a paragraph of statements summarizing the major and important findings. The statements can be in a slightly more narrative style, but still tightly based on data. Important numerical results should be included to support the points made. Avoid including personal judgments and opinions. If you find you are describing why the results occurred, or using the word “because” in your summary, you have moved to interpretation and are no longer summarizing!

(Adapted from Van Houten et al., 2006)

Student Involvement in Assessment

• Self-assessment (metacognition and self-regulation)

• Peer-assessment

• Use of feedback

Discussion

• What topics should the professional development program include to support educators’ effective use of data?

CALI Professional Development Program

Online Professional Development

Meta-analyses comparing distance education classes with traditional classes indicate that online delivery is effective only with deliberate course design, specifically the incorporation of interactivity

(Bernard et al., 2004; Bernard et al., 2009).

Modules: Design Elements

• Completed in 3-5 hours

• Designed to support collaboration

• Include assignments tied to practice

• Include pretest, checks for understanding, and posttest

• Include videos of related practice activities

• Include videos of experts speaking

• Include links to related resources for additional information

• Include interactive tools that enable users to explore the impact of certain decisions

Virtual Professional Learning Communities

• Professional Learning Communities (PLCs) are: “…structures for continuous learning and use of knowledge in the course of conducting the work of teaching”

(Hord & Sommers, 2008, cited in Mundry & Stiles, 2009, p. 9).

• “Teachers in every case are learning and working with their peers to situate their learning in real practice”

(Mundry & Stiles, 2009, p. 9).

Electronic Networking and Professional Development

• Review of research finds the following benefits to incorporating electronic networking into professional development:

1) Reducing teacher isolation & supporting sharing

2) Encouraging reflection on practice

3) Improving teaching practice

4) Encouraging professional learning communities

(Barnett, 2002)

Electronic Networking and Professional Development (cont.)

• Studies of professional development for secondary science teachers incorporating electronic networking (i.e., blogging, discussion forums, and online communities) found evidence of learning through interactions with others, increased levels of reflective teaching, and increased value placed on learning by participants

(Luehmann & Tinelli, 2008; Makinster, Barab, Harwood, & Andersen, 2006)

Discussion

• What else should we be thinking about with regard to the online delivery system?

mheritag@ucla.edu
