


This work is supported with NSF-NSDL funding (DUE-0633124).

Refining MARGINS Mini-Lessons Using Classroom Observations
Ellen Iverson, Cathryn A. Manduca, John McDaris
Science Education Resource Center, Carleton College

http://serc.carleton.edu/margins/index.html

Field Testing Teaching Materials

One of the challenges we face in developing teaching materials or activities from research findings is testing the materials to determine that they work as intended. Traditionally, this is done by an individual faculty member using materials in their own class, noticing what worked and what didn't, and improving them the following year. However, as we move to a community process of creating and sharing teaching materials, a community-based process for testing materials is appropriate.

The MARGINS project has piloted such a community-based process for testing teaching materials and activities developed as part of its mini-lesson project. Building on prior work developing mechanisms for community review of teaching resources (e.g., Kastens and Butler, 2001; Hancock and Manduca, 2005; Mayhew and Hall, 2007), the MARGINS evaluation team developed a structured classroom observation protocol.

The goals of field testing are to:

a) gather structured, consistent feedback for the lesson authors based on classroom use;
b) guide reviewers of these lessons to reflect on research-based educational practice as a framework for their comments;
c) collect information on the data and observations that the reviewer used to underpin their review;
d) determine which mini-lessons are ready to be made widely available on the website.

The Field-Test Questionnaire

The field-test review is designed to provide feedback in four key areas:

• How was the activity used? Activities can be more or less successful in different environments or for different purposes. Faculty also use activities to support a range of teaching strategies. This section collects information on how and why the activity was used, and how well the outcome matched the faculty member's goals for students.

• Did the activity lead to the desired learning? This question lies at the heart of the activities' success in supporting learning. The questionnaire probes both whether the students met the learning goals and how the respondent assessed this critical element. The latter is important in understanding the strength of the field test.

• How effective was the activity in the classroom? Did the activity successfully engage students' interest, support them effectively in learning, and allow them to confirm their new understanding? This section of the questionnaire probes how successful the activity was in areas identified by educational research as underpinning learning.

• What do faculty need to successfully use this activity? This part of the field test gathers information on the materials supporting use of the activity by others: what additional information would the tester have liked to make implementation easier or more successful?

Results

Thirteen field tests have been completed to date, showing that these mini-lessons:

• give students hands-on experience with scientific data;
• help students make connections between geologic phenomena and data.

Activity authors found the suggestions for improvements in design, adaptations for other audiences, suggestions for clearer presentation, and tips for using the materials helpful, and the first mini-lessons have now been revised using this feedback.

Web-deliverable Laboratory/Classroom Exercises: 36 mini-lessons are available on the web.

References

Kastens, K. & Holzman, N. (2006). The Digital Library for Earth System Education provides individualized reports for teachers on the effectiveness of educational resources in their own classrooms. D-Lib Magazine, 12(1).

Kastens, K. A. & Butler, J. C. (2001). How to identify the "best" resources for the reviewed collection of the Digital Library for Earth System Education. Computers and the Geosciences, 27(3), 375-378.

Hancock, G. & Manduca, C. A. (2005). Developing quantitative skills activities for geoscience students. EOS, 86(39), p. 355.

Mayhew, M. A. & Hall, M. K. (2007). Field test of a peer review system for digital geoscience education resources. Eos Trans. AGU, 88(52), Fall Meet. Suppl., Abstract ED51B-0408.