
Evaluation of CEEMS: The Cincinnati Engineering Enhanced Mathematics and Science

ANNUAL REPORT 2015-2016

DISCOVERY CENTER for EVALUATION, RESEARCH, AND PROFESSIONAL LEARNING
formerly Ohio's Evaluation & Assessment Center

MIAMI UNIVERSITY
OXFORD, OH


Please cite as follows:

Woodruff, S. B., Dixon, M., & Li, Y. (2016). Evaluation of CEEMS: The Cincinnati Engineering Enhanced Mathematics and Science: Annual report 2015-2016. Oxford, OH: Miami University, Discovery Center for Evaluation, Research, and Professional Learning.

Distributed by:

© Discovery Center for Evaluation, Research, and Professional Learning

Miami University, Oxford, OH

408 McGuffey Hall

210 E. Spring St.

Oxford, Ohio 45056

[email protected]

(513) 529-1686 phone

(513) 529-2110 fax


Table of Contents

Table of Tables
Table of Figures
Introduction
Evaluation Methods and Findings
  Principal Focus Group and Embedded Activity
    Participants
    Instruments and Data Collection
      Principal Focus Group
      Focus Group Embedded Activity
    Data Analysis
    Findings
      Principal Focus Group
      Embedded Activity
    Summary and Discussion
  Classroom Observations
    Instrument
    Data Collection
    Data Analysis
    Findings
      Analysis Question 1: What were the primary characteristics of lessons in terms of purpose, focus, design, and implementation?
      Analysis Question 2: What were the primary characteristics of lesson content, lesson features, and classroom culture?
      Analysis Question 3: What were the primary characteristics of the overall lesson?
  Communication Logs
    Theoretical Framework
    Field Procedures
      Data Source
      Case Identification
    Data Analysis
      Case Study Questions
    Preliminary Findings
      Case #1
      Case #2
      Case #3
      Case #4
      Level-3 Questions
Summary and Recommendations
  Summary
  Recommendations
    Next Steps
References
Appendices


Table of Tables

Table 1. Evaluation Question, Instruments/Measures, and Data Collection/Analysis
Table 2. Content of Principal Focus Group Activity Sheet and Numbered Responses
Table 3. Principal Focus Group Codes and Sub-Codes
Table 4. Lesson Focus by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 5. Item Ratings for Lesson Design by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 6. Synthesis Ratings for Lesson Design by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 7. Item Ratings for Lesson Implementation by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 8. Synthesis Ratings for Lesson Implementation by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 9. Item Ratings for Lesson Content by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 10. Synthesis Ratings for Mathematics/Science Content by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 11. Lesson Feature by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 12. Item Ratings for Classroom Culture by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 13. Synthesis Ratings for Classroom Culture by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 14. Observer Ratings for Likely Impact of Instruction on Student Learning by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016
Table 15. Connections between Theory and Case Study Propositions
Table 16. Concepts and Definitions for Deductive Analysis, Communication Logs, 2014-2015
Table 17. Case Study Questions and Findings, Case #1, Communication Logs, 2014-2015
Table 18. Case Study Questions and Answers, Case #2, Communication Logs, 2014-2015
Table 19. Case Study Questions and Answers, Case #3, Communication Logs, 2014-2015
Table 20. Case Study Questions and Answers, Case #4, Communication Logs, 2014-2015


Table of Figures

Figure 1. Mathematics/science education issues addressed by CEEMS elements, Principal Focus Group Activity Sheet.
Figure 2. Mathematics/science education issues addressed by at least one CEEMS element, by principal, Principal Focus Group Activity Sheet.
Figure 3. Communication Log categorization scheme.


Introduction

The Discovery Center for Evaluation, Research, and Professional Learning (formerly Ohio's Evaluation & Assessment Center for Mathematics and Science Education) collaborates with the Evaluation Services Center (ESC) at the University of Cincinnati and the Cincinnati Engineering Enhanced Mathematics and Science (CEEMS) project research team to conduct an evaluation of the influence of CEEMS participation on teachers' instructional practices. The CEEMS project is funded through a Mathematics and Science Partnership (MSP) grant from the National Science Foundation (NSF). For this project, the Discovery Center team consists of Dr. Sarah B. Woodruff, Principal Investigator for the evaluation; Yue Li, Senior Research Associate and Project Team Leader; and Maressa Dixson, Research Associate. The purpose of this mixed-methods study is to evaluate the influence of CEEMS participation on teachers' confidence and competence in incorporating engineering principles into their science instruction. Specifically, the evaluation study asks the following questions:

1. In what ways did teachers’ instructional practices change in the course of their participation in CEEMS?

2. In what ways did CEEMS Resource Team support for teachers change in the course of their participation in CEEMS?

Between Fall 2015 and Summer 2017, the evaluation study will address Goal 3 of the revised CEEMS project goals (i.e., to develop mathematics and science teacher knowledge of challenge-based learning, engineering, and the engineering design process as instructional strategies through explicit training and classroom implementation support). The Discovery Center evaluation team analyzed quantitative and qualitative data from administrator focus groups, classroom observations, and Resource Team communication logs to inform the evaluation questions. Table 1 summarizes the evaluation questions, corresponding instruments, and data collection and analysis methods.

Table 1. Evaluation Question, Instruments/Measures, and Data Collection/Analysis

EQ 1: In what ways did teachers' instructional practices change in the course of their participation in CEEMS?
  Instruments/Measures: Administrator Focus Group and Focus Group Activity; Classroom Observations; Communication Logs
  Data Collection/Analysis: Thematic analysis of observation and focus group data; change in classroom observation ratings; qualitative content analysis of observations, communication logs, and the focus group activity

EQ 2: In what ways did CEEMS Resource Team support for teachers change in the course of their participation in CEEMS?
  Instruments/Measures: Communication Logs
  Data Collection/Analysis: Descriptive statistics (e.g., frequencies) of pertinent items; case studies of teachers with richly detailed logs

During 2015-2016, the Discovery Center evaluation team conducted one administrator focus group and analyzed data from that focus group, from 2015-2016 classroom observations, and from 2014-2015 Resource Team communication logs. Detailed evaluation methods and findings are reported herein.


Evaluation Methods and Findings

Principal Focus Group and Embedded Activity

The Discovery Center evaluation team used findings from the principal focus group and its embedded activity, in combination with other qualitative data, to inform the first evaluation question (i.e., in what ways did teachers' instructional practices change in the course of their participation in CEEMS?). In particular, the evaluation focused on understanding administrators' perceptions of how well CEEMS helped meet schools' science and mathematics education needs.

Participants

Principals at CEEMS schools with more than one CEEMS teacher and more than 1 year of CEEMS participation were invited to participate in a focus group about the influence of the CEEMS program on their schools. Of the 8 principals who agreed to attend, 4 attended. These principals hailed from 2 middle schools (Grades 5-8 and Grades 6-8) and 2 high schools (Grades 7-12 and Grades 9-12).

Instruments and Data Collection

Principal Focus Group

During the focus group, an evaluator from the Discovery Center asked questions about principals' experiences with, and perceptions of, the CEEMS program, particularly regarding the extent to which the program helped them meet their schools' science and mathematics education needs. The focus group included open-ended response questions (captured by audio recording and then transcribed) and a pencil-and-paper activity that asked participants to match CEEMS elements to the science/mathematics education issue that each element helped to address. The Discovery Center analyzed both oral and written data, and the results are provided in this summary. The focus group occurred on November 13, 2015, and lasted approximately 1 hour.

Focus Group Embedded Activity

As part of the principal focus group, the Discovery Center evaluator asked participants to complete an activity in which they matched 9 elements of the CEEMS program with 10 science and mathematics education issues that middle and high schools frequently face. Prior to the focus group, the evaluator determined the 9 CEEMS elements and 9 of the common issues; the evaluator added issue 10 as a result of discussions during the first part of the focus group. The evaluator drew upon the CEEMS proposal, annual evaluation reports, and discussions with CEEMS personnel to identify 9 elements of the CEEMS program that were both prominent features and features principals were likely to observe. Similarly, the evaluator drew upon STEM education literature and prior knowledge to identify 9 issues secondary schools reportedly faced in terms of science and/or mathematics education. Another focus group question, prior to the activity, asked participants to discuss some of the issues that their particular schools faced in terms of mathematics and science education; participant responses included many issues identified in the focus group activity as well as the one issue added by the evaluator. The lists of elements and issues were intended to be illustrative rather than exhaustive.

For the activity, each participant received a response sheet, and a numbered list of issues was displayed on a white board in view of all participants. The response sheet included a section for participants to indicate their school type (high school, middle school, or other) and contained a table that listed each CEEMS element, with a blank section in each row for participants to indicate which issue(s) related to that element. The bottom of the response sheet allowed participants to make additional comments. The CEEMS elements and science/mathematics education issues are listed in Table 2. The Principal Focus Group Protocol and Embedded Activity Response Sheet can be found in Appendix A.

Table 2. Content of Principal Focus Group Activity Sheet and Numbered Responses

CEEMS Elements

Science Action Planning
Engineering Challenge-Based Instruction
Teacher-led Professional development
In-class support from graduate student fellows
STEM career exposure
CEEMS Conference
Resource team communication and support
Summer Institute for Teachers (SIT)
Engineering design-based instruction

Issues in Science and Mathematics Education

1. Student science/mathematics prior achievement/background
2. Student interest in STEM learning
3. Perceptions of achievement levels required for science/mathematics learning
4. Student perceptions of relevance of science/mathematics content
5. Science/mathematics teacher qualifications
6. Science/mathematics teacher dispositions
7. Science/mathematics teacher availability/retention
8. Demographically relevant achievement gaps
9. Resources necessary for 21st century science/mathematics instruction
10. New ideas for ways to teach content


Data Analysis

The evaluator used NVivo (version 11) to analyze the transcribed audio recordings. Transcription transferred the content of the audio recording to written text so that focus group data could be saved without personal identifiers. The transcript did not include extraneous vocalizations, pauses, or other contextual information that was not directly pertinent to what the speaker said (Lapadat, 2000). The evaluator removed all personal, school, and district names during transcription to protect participant confidentiality. The evaluator then uploaded the transcript into the NVivo database and deleted all copies of the audio recordings to ensure the evaluation team did not retain personally identifiable information.

The evaluator developed 9 codes deductively, based on the conceptual framework outlined in the original CEEMS proposal to the NSF and two prior annual program evaluation reports (2013 and 2014). The evaluator entered these codes and their definitions into the NVivo database and used the codes to label focus group text that reflected each code. The evaluator also created a code named "inductive" to capture text that did not reflect 1 of the 9 deductive codes. In the second round of coding, the evaluator created 6 sub-codes to summarize the content of text collected under the code "inductive." Table 3 lists and defines all codes and sub-codes.

Table 3. Principal Focus Group Codes and Sub-Codes

Deductive Codes

Trans-Disciplinary Curriculum: Curriculum that includes information, concepts, terms, examples, etc. from more than one academic discipline.

Inquiry-Based Learning: Instructional practices that encourage, support, and/or require scientific inquiry processes. Scientific inquiry processes include hypothesizing, designing a study, observing and making predictions, collecting and analyzing data, and interpreting results from an inquiry study.

Authentic Learning: Curriculum and/or instruction that utilizes real-world examples, connects content to students' everyday lives, and/or can be applied to novel real-world situations.

Engineering Design Process: Curriculum and instruction that situates the learning of STEM concepts within the engineering design process.

Project/Challenge/Problem-Based Learning: Curriculum and instruction that situates the learning of STEM concepts within the context of a challenge (problem, project) that must be solved.

Career Exploration: Discussion, research, or consideration of STEM careers in the context of science and/or mathematics learning. Interaction with individuals who have direct professional knowledge of STEM careers.

Collaborative Learning Environment: A learning environment that supports, encourages, and/or requires cooperation among students and/or between students and the teacher.

Professional Learning Community: Cooperation among/between teachers to prepare for and/or conduct lessons.

Active Learning: Learning that is engaging to students and/or requires active student participation.

Inductive Codes

Resources: Discussions of the resources necessary to complete CEEMS lessons, including time, human, and material resources. Discussion of resources necessary to continue providing the program.

Teacher Retention: Discussion of the effect of the CEEMS program on teacher retention. Discussions of issues related to teacher retention, such as qualifications and marketability.

Program Availability: Discussions of the time CEEMS teachers must commit to complete the program. Discussions of making the program available to a larger group of teachers.

Conceptual Understanding: Discussions of students' conceptual understanding, preparedness, or background.

Recruitment: Discussions of teacher recruitment for the CEEMS program.

Engineering Pipeline: Discussions of the engineering pipeline, including engineering courses and other opportunities for students to engage with engineering.

Evaluators further analyzed text collected under codes and sub-codes that included at least 4 independent references, as the remaining codes contained fewer references than there were focus group participants. In the third round of coding, the evaluator identified themes that emerged from the codes containing at least 4 references. Themes are declarative statements about the topic captured by the code and include only text that reflects the statement (as opposed to text that simply relates to the code). The evaluator identified all themes that included at least 5 independent references or references from at least 3 individuals (75% of the sample).

To analyze the embedded activity sheet data, the evaluator first entered all data into an Excel spreadsheet and then analyzed the data using quantitative content analysis. Because of the small sample size (n = 4), content analysis procedures consisted of frequency counts. The goals of this analysis were to determine 1) which issues participants identified most frequently as being addressed by each CEEMS element, 2) which issues each participant identified as being addressed by CEEMS elements overall, and 3) which issues all participants identified most frequently overall.
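To make the three frequency-count goals concrete, the minimal Python sketch below tallies issue selections from a participant-by-element table. The data structure and values are hypothetical placeholders for illustration only; the principals' actual responses are summarized in Figures 1 and 2.

```python
# Minimal sketch of the frequency-count content analysis described above.
# The `selections` data are hypothetical, not the actual activity-sheet responses.
from collections import Counter

# participant -> {CEEMS element -> set of issue numbers the participant matched}
selections = {
    "Participant 1": {"Science Action Planning": {1, 6}, "STEM career exposure": {2, 4}},
    "Participant 2": {"Science Action Planning": {1}, "STEM career exposure": {4}},
    "Participant 3": {"Science Action Planning": {6}, "STEM career exposure": {4}},
    "Participant 4": {"Science Action Planning": {1, 6}, "STEM career exposure": {2}},
}

# Goal 1: which issues were identified most frequently for each CEEMS element?
per_element = {}
for answers in selections.values():
    for element, issues in answers.items():
        per_element.setdefault(element, Counter()).update(issues)
for element, counts in per_element.items():
    print(element, "->", counts.most_common())

# Goal 2: which issues did each participant identify at least once, across elements?
for participant, answers in selections.items():
    covered = sorted(set().union(*answers.values()))
    print(participant, "->", covered)

# Goal 3: which issues were identified most frequently overall?
overall = sum(per_element.values(), Counter())
print("Overall ->", overall.most_common())
```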


Findings

Principal Focus Group

Findings from the principal focus group data are summarized into four themes.

Theme 1: CEEMS facilitates teacher collaboration. The strongest theme that emerged from this focus group was the observation that CEEMS has facilitated meaningful collaboration among science teachers and between science teachers and teachers of other disciplines. Every participant described ways the CEEMS program has resulted in productive and meaningful opportunities for teacher collaboration. These collaborative opportunities were diverse; they ranged from informal teaching tips, to teacher-led professional development in regular professional learning community (PLC) meetings, to co-teaching and peer coaching, to the restructuring of all science lessons according to a CEEMS model. One middle school principal explained, "because we do have [name] as our science consultant--they are now working on all of their unit plans developing as a CEEMS-like ... structure." For this school, purposeful work with a CEEMS resource teacher enabled the CEEMS model to spread across the curriculum, even to non-CEEMS teachers and lessons. A principal of a large high school said, "for a while we had two CEEMS teachers next door to each other. So that created almost a powerful synergy of these four-five science teachers always collaborating, working together." The close proximity of their classrooms and their common CEEMS experiences enabled these teachers to leverage their combined knowledge in demonstrable ways.

Theme 2: CEEMS illuminates real-world connections among concepts. Focus group participants described CEEMS as a means of making direct connections between curriculum content and real-world applications of science and mathematics concepts. One high school principal lauded the CEEMS program for using engineering design principles to connect curriculum to its real-world application. He explained, "And I always wanna [sic] try to talk to them, to the kids and teachers, '[to] make this stuff real world.' Because, if you think about it . . . if we go across from Ohio to Kentucky on a bridge that can drop at any time, we need kids to be able to design a new bridge." This principal made the real-world applicability of the content a priority for students and teachers, and this prioritization was based on practical concerns for local implications.

Theme 3: Human resources facilitate career exploration. Focus group participants praised the CEEMS program for its incorporation of graduate student fellows and retired engineers as human resources who provided invaluable services, particularly the opportunity for students to explore STEM careers. One principal explained:

“Another piece that I wanted to add, that I thought has been very powerful and helped to add to it, I think, has been the grad--either the students that are coming in or the engineers that are either still active or have retired to come in and be a part of some of those lessons and a resource. That, to me for us, has been huge, because . . . we're kinda [sic] out in . . . the outskirts and it's very tough to get those resources or people to come out. So when you've got an actual . . . engineer or a student who's really studying to be this and helping those students and making them realize, yeah, you guys can do this too. That's been very powerful for us.”

This principal's school found it difficult to solicit community volunteers, largely because of its remote location relative to other area schools. The CEEMS program facilitates direct and meaningful interaction between schools and individuals who have experience with STEM careers. For these participants' schools, STEM graduate students and professionals have offered unique opportunities for students to envision themselves in STEM professions.

Theme 4: The CEEMS credential allows teachers greater mobility. This theme is important for the future sustainability of the CEEMS program. Principals indicated that, to some extent, their schools can become victims of the success of the CEEMS program because CEEMS teachers are more highly qualified after they complete the program. The CEEMS credential makes teachers marketable, and they are able to take that credential to nearly any school or district in which they would like to teach. Teacher mobility differed for each principal's school: 1 high school lost 3 CEEMS teachers, 1 middle school lost 1 CEEMS teacher, and the remaining schools had retained their CEEMS teachers thus far. One high school principal had not experienced a loss of CEEMS teachers but still was concerned that:

“. . . there's really no way that I can lock that teacher into my building, except for the two years that they're going through the program. But after that, if you start losing them, it's all for naught.”

Regardless of whether the school had lost CEEMS teachers, all participants acknowledged that 1) the end of the CEEMS program threatens its many benefits, particularly because 2) credentialed individuals can choose to leave at any time, and 3) there are currently no formal mechanisms to sustain the CEEMS program without individual CEEMS teachers.

Embedded Activity

As shown in Figure 1, for all CEEMS elements the most frequently identified science/mathematics education issue was identified by at least 3 participants, and in some cases by all 4 participants. For each CEEMS element, participants most frequently identified the following issues:

Science Action Planning: 3 participants identified 1. Student science/mathematics prior achievement/background, and 3 identified 6. Science/mathematics teacher dispositions.

Engineering Challenge-Based Instruction: 3 participants identified 1. Student science/mathematics prior achievement/background, 3 identified 10. New ideas for ways to teach content, and 3 identified 6. Science/mathematics teacher dispositions.

Teacher-led Professional development: all participants identified 9. Resources necessary for 21st century science/mathematics instruction, and 3 participants additionally identified 10. New ideas for ways to teach content.

In-class support from graduate student fellows: 3 participants identified 6. Science/mathematics teacher dispositions.

STEM career exposure: 3 participants identified 4. Student perceptions of relevance of science/mathematics content.

CEEMS Conference: 3 participants identified 9. Resources necessary for 21st century science/mathematics instruction, and 3 other participants identified 10. New ideas for ways to teach content.

Resource team communication and support: all participants identified 9. Resources necessary for 21st century science/mathematics instruction.

Summer Institute for Teachers (SIT): all participants identified 6. Science/mathematics teacher dispositions, while 3 participants also identified 7. Science/mathematics teacher availability/retention, and 3 other participants identified 9. Resources necessary for 21st century science/mathematics instruction.

Engineering design-based instruction: all participants identified 2. Student interest in STEM learning, while 3 participants identified 4. Student perceptions of relevance of science/mathematics content, and 3 participants identified 10. New ideas for ways to teach content.


[Figure 1, which charts the pairings between the 9 CEEMS elements and the 10 science/mathematics education issues (both listed in Table 2), is not reproduced here; the most frequent pairings are described in the preceding text.]

Figure 1. Mathematics/science education issues addressed by CEEMS elements, Principal Focus Group Activity Sheet.

As shown in Figure 2, all participants identified at least 9 of 10 issues as being addressed by at least one of the CEEMS elements included in the activity. Specifically, Participant 1, a middle school principal, chose all 10 issues at some point. Participant 2, a high school principal, indicated that all issues except 10. New ideas for ways to teach content were addressed by at least one element of the CEEMS program. Participant 3 indicated that all issues except 3. Perceptions of achievement levels required for science/mathematics learning were addressed by at least one element of the CEEMS program. Participant 4 indicated that all issues except 7. Science/mathematics teacher availability/retention were addressed by at least one element of the CEEMS program.

[Figure 2, which charts, for each of the 4 focus group participants, the issues addressed by at least one CEEMS element, is not reproduced here; the results are described in the preceding text.]

Figure 2. Mathematics/science education issues addressed by at least one CEEMS element, by principal, Principal Focus Group Activity Sheet.


The maximum number of times participants could select any individual issue as being addressed by the CEEMS program was 36 (4 participants, each of whom could match an issue to each of the 9 CEEMS elements: 4 x 9 = 36). Across all participants, the most frequently identified issues were 9. Resources necessary for 21st century science/mathematics instruction (count = 22), 10. New ideas for ways to teach content (count = 20), and 6. Science/mathematics teacher dispositions (count = 18). The least frequently identified issues were 7. Science/mathematics teacher availability/retention (count = 7) and 8. Demographically relevant achievement gaps (count = 10).

Summary and Discussion

The overwhelming consensus among principals was that the CEEMS program has provided many benefits to science and mathematics education at their schools. Participants perceived most of the central elements of the CEEMS program as working as intended by CEEMS program developers. From these principals' perspective, the strengths of the CEEMS program lie in its preparation of teachers as collaborative leaders, its connection of science and mathematics concepts to their real-world engineering applications, its incorporation of industry professionals as resources and models for what is possible for students who pursue STEM disciplines, and its direct provision of the resources necessary for 21st century science and mathematics education. Because the program is so dynamic, intense, and effective, CEEMS teachers are highly qualified and, therefore, in demand. The difficulties, then, lie in the need for resources that will sustain the program beyond the initial grant period.

Classroom Observations

Instrument

To understand change in instructional practices, classroom observation data were collected through hand-written field notes and completion of the Inside the Classroom Observation and Analytic Protocol (Horizon Research Inc., 2000). This mixed-methods protocol included sections that allowed observers to record basic information about the classroom—such as the subject and number of students—and then provide both quantitative ratings and narrative descriptions of important elements of the lesson. These elements included lesson purpose, focus, design, implementation, and content; classroom culture; time usage; likely impact on mathematics/science learning; and specific lesson features. Finally, graduate fellows rated the lesson overall and provided a narrative description of what occurred during the observation. A copy of the observation instrument can be found in Appendix B.

Data Collection

In October 2015, Discovery Center staff trained three CEEMS engineering fellows to conduct classroom observations using the Inside the Classroom Observation and Analytic Protocol. During the training session, the evaluator first reviewed the protocol in detail with the CEEMS fellows; the fellows then practiced rating video-taped lessons and discussed low-inference observations.


After completing the training, fellows completed 58 observation forms from 17 teachers' classrooms (8 high school teachers and 9 middle school teachers) during the 2015-2016 school year. Fellows collected a minimum of one and a maximum of six observations per teacher. Fellows observed non-CEEMS, CEEMS Introductory, and CEEMS Challenge lessons, although the number of lessons observed by type varied for each teacher. Non-CEEMS lessons were either regular lessons from non-CEEMS classrooms or non-CEEMS lessons in a CEEMS classroom. In CEEMS Introductory lessons, the teacher introduced a challenge students would work on for several lessons. In CEEMS Challenge lessons, the teacher engaged students in one or more stages of CEEMS challenge implementation.

Data Analysis

To analyze classroom observation data, the evaluation team developed nine qualitative codes deductively, based on the conceptual framework outlined in the original CEEMS proposal to the NSF and two prior annual program evaluation reports (2013 and 2014). The evaluation team entered information from classroom observation forms into an Excel spreadsheet for analysis. Evaluators applied the nine deductive codes to qualitative data related to the purpose, design, and implementation of the lesson, as well as to the overall narrative description. Evaluators then developed codes inductively to apply to data that described processes not captured in the original nine deductive codes. This process resulted in nine codes that reflected traditional instructional practices. The code book with both inductive and deductive codes can be found in Appendix C.

In the next analysis step, the evaluation team separated classroom observation forms by unit type (i.e., non-CEEMS, CEEMS Introductory, and CEEMS Challenge lessons) to understand change in instructional practices based on the type of lesson. Of the 58 forms analyzed, 18 described non-CEEMS lessons, 14 described CEEMS Introductory lessons, and 26 described CEEMS Challenge lessons. Evaluators identified themes across all forms, within each unit type, to analyze teacher change based on lesson type.

Kruskal-Wallis tests and post hoc pairwise comparisons using Mann-Whitney U tests with Bonferroni correction were conducted to examine differences in lesson design, implementation, mathematics/science content, classroom culture, and likely impact on mathematics/science learning across lesson types.
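As an illustration of the statistical procedure named above, the sketch below runs a Kruskal-Wallis omnibus test across the three lesson types, followed by Bonferroni-corrected pairwise Mann-Whitney U tests, using SciPy. The ratings shown are hypothetical placeholders; the actual observation ratings appear only in aggregate in the tables that follow.

```python
# Minimal sketch of the omnibus and post hoc tests described above,
# using hypothetical 1-5 ratings (not the project's actual data).
from itertools import combinations
from scipy.stats import kruskal, mannwhitneyu

ratings = {  # hypothetical ratings grouped by lesson type
    "non-CEEMS":    [3, 4, 3, 4, 4, 3, 5, 4],
    "Introductory": [4, 5, 4, 5, 4, 5, 5, 4],
    "Challenge":    [4, 4, 5, 4, 3, 5, 4, 4],
}

# Omnibus test: do rating distributions differ across the three lesson types?
h_stat, p_omnibus = kruskal(*ratings.values())
print(f"Kruskal-Wallis: H = {h_stat:.2f}, p = {p_omnibus:.3f}")

# Post hoc pairwise Mann-Whitney U tests with Bonferroni correction:
# multiply each raw p value by the number of comparisons, capped at 1.0.
pairs = list(combinations(ratings, 2))
for a, b in pairs:
    u_stat, p_raw = mannwhitneyu(ratings[a], ratings[b], alternative="two-sided")
    p_adj = min(p_raw * len(pairs), 1.0)
    print(f"{a} vs. {b}: U = {u_stat:.1f}, adjusted p = {p_adj:.3f}")
```

Because the correction multiplies each raw p value by the number of comparisons, a significant omnibus result can coexist with no significant pairwise differences, which is the pattern reported for several items below.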

Findings

Analyses by lesson type were conducted to understand primary characteristics of the lessons in the aggregate and similarities and differences in lessons across types. In general, non-CEEMS lessons focused on conceptual learning, seat work, and other traditional instructional practices; CEEMS Introductory lessons featured inquiry-based instructional practices; and CEEMS Challenge lessons featured project-, challenge-, and design-based instructional practices that required student collaboration in groups as primary lesson features. Lessons of all types included traditional; inquiry-based; and project-, challenge-, or design-based instructional practices, although not every lesson integrated CEEMS and non-CEEMS elements. Lessons were rated highly across the board; most item and synthesis ratings were 3, 4, or 5 (with 1 being lowest and 5 being highest). Additionally, teachers focused any single lesson on one of four modes of mathematics/science learning, and they used all three types of lessons to do so. Finally, CEEMS Introductory lessons were rated more highly than were CEEMS Challenge and non-CEEMS lessons in terms of likely positive impact on student learning.

To organize quantitative and qualitative findings in comprehensible ways for this analysis, evaluators developed three analysis questions. These analysis questions serve as sub-headings for the remainder of this section on findings.

Analysis Question 1: What were the primary characteristics of lessons in terms of purpose, focus, design, and implementation?

The purposes of the observed lessons were categorized as:

Non-CEEMS: Conceptual Learning & Assessment
  o Introduced or taught specific mathematics/science concepts or formulas (Conceptual Learning), or
  o Reviewed concepts learned or tested previously, most times to prepare for a future test (Assessment)

CEEMS Introductory: Project-, Challenge-, Problem-Based Learning
  o Introduced either the entire CBL unit or the "big idea" to be used to define the challenge

CEEMS Challenge: Engineering Design Process
  o Studied concepts of design; sketched, developed, or planned designs based on prior study and modeling; or built or tested the properties of a physical object designed by students

As shown in Table 4, the majority of observed non-CEEMS and CEEMS Challenge lessons focused mainly on mathematics/science concepts, while a larger proportion of CEEMS Introductory lessons (43%) focused on both algorithms/facts/vocabulary and mathematics/science concepts.

Table 4. Lesson Focus by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Lesson Focus | Non-CEEMS | Introductory | Challenge | Total
Almost entirely working on mathematics/science concepts | 8 (44%) | 3 (21%) | 10 (38%) | 21 (36%)
Mostly working on mathematics/science concepts, but working on some algorithms/facts/vocabulary | 4 (22%) | 4 (29%) | 7 (27%) | 15 (26%)
About equally working on algorithms/facts/vocabulary and working on mathematics/science concepts | 4 (22%) | 6 (43%) | 4 (15%) | 14 (24%)
Mostly working on the development of algorithms/facts/vocabulary, but working on some mathematics/science concepts | 2 (11%) | 1 (7%) | 4 (15%) | 7 (12%)
Not specified | 0 (0%) | 0 (0%) | 1 (4%) | 1 (2%)
Total | 18 (100%) | 14 (100%) | 25 (100%) | 58 (100%)

Note. Percentages over 50% were marked bold in this table.

As shown in Table 5, the design for each lesson type was categorized as:

Non-CEEMS: Seat Work
  o When implemented for the entire class period: completion of an assignment or worksheet or review of prior concepts, individually
  o When implemented for part of the class period: embedded exercises reviewed by the entire class

CEEMS Introductory: Inquiry-Based Learning
  o Students developed fundamental elements of the CEEMS challenge, such as the essential questions, guiding questions, or the parameters of the challenge, while the teacher played a facilitating role to guide students to testable essential questions.
  o Observation, investigation, or experimentation in support of a larger project.

CEEMS Challenge: Engineering Design Process
  o Studied scientific or mathematical concepts directly related to the engineering design being built, sketched or crafted designs, built physical objects, or tested physical objects designed by students. All design lessons included prior engagement with mathematical/scientific concepts directly related to the challenge and prior decision-making, such as sketching and group discussion.

Post hoc pairwise comparisons indicated that the instructional strategies and activities used in the CEEMS Introductory lessons reflected attention to students' experience, preparedness, prior knowledge, and/or learning styles to a significantly greater extent than did the CEEMS Challenge lessons, and that the design of the CEEMS Introductory lessons encouraged a collaborative approach to learning among the students to a significantly greater extent than did non-CEEMS and CEEMS Challenge lessons. In addition, based on Kruskal-Wallis test results, statistically significant differences were found across the three lesson types in the incorporation of tasks, roles, and interactions consistent with investigative mathematics/science and in the reflection of careful planning and organization in the design. However, no significant pairwise comparison differences were found between any pair of lesson types.


Table 5. Item Ratings for Lesson Design by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Columns: Rating 1 and 2 (%) | Rating 3 (%) | Rating 4 and 5 (%) | Don't Know and N/A (%)

1. The design of the lesson incorporated tasks, roles, and interactions consistent with investigative mathematics/science. (p = .005)
   Non-CEEMS (n = 18): 0 (0%) | 3 (17%) | 8 (44%) | 7 (39%)
   Introductory (n = 14): 0 (0%) | 0 (0%) | 14 (100%) | 0 (0%)
   Challenge (n = 26): 0 (0%) | 0 (0%) | 25 (96%) | 1 (4%)

2. The design of the lesson reflected careful planning and organization. (p = .036)
   Non-CEEMS (n = 18): 0 (0%) | 4 (22%) | 14 (78%) | 0 (0%)
   Introductory (n = 14): 0 (0%) | 0 (0%) | 14 (100%) | 0 (0%)
   Challenge (n = 26): 0 (0%) | 1 (4%) | 25 (96%) | 0 (0%)

3. The instructional strategies and activities used in this lesson reflected attention to students' experience, preparedness, prior knowledge, and/or learning styles. (p = .011 b)
   Non-CEEMS (n = 18): 2 (11%) | 1 (6%) | 12 (67%) | 3 (17%)
   Introductory (n = 14): 0 (0%) | 0 (0%) | 12 (86%) | 2 (14%)
   Challenge (n = 26): 0 (0%) | 8 (31%) | 14 (54%) | 4 (15%)

4. The resources available in this lesson contributed to accomplishing the purposes of the instruction. (p = .312)
   Non-CEEMS (n = 18): 0 (0%) | 2 (11%) | 16 (89%) | 0 (0%)
   Introductory (n = 14): 0 (0%) | 0 (0%) | 14 (100%) | 0 (0%)
   Challenge (n = 26): 0 (0%) | 2 (8%) | 24 (92%) | 0 (0%)

5. The instructional strategies and activities reflected attention to issues of access, equity, and diversity for students (e.g., cooperative learning, language-appropriate strategies/materials). (p = .515)
   Non-CEEMS (n = 18): 1 (6%) | 3 (17%) | 12 (67%) | 2 (11%)
   Introductory (n = 14): 0 (0%) | 0 (0%) | 12 (86%) | 2 (14%)
   Challenge (n = 26): 0 (0%) | 0 (0%) | 19 (73%) | 7 (27%)

6. The design of the lesson encouraged a collaborative approach to learning among the students. (p = .001 ab)
   Non-CEEMS (n = 18): 0 (0%) | 4 (22%) | 14 (78%) | 0 (0%)
   Introductory (n = 14): 0 (0%) | 0 (0%) | 14 (100%) | 0 (0%)
   Challenge (n = 26): 0 (0%) | 1 (4%) | 24 (92%) | 1 (4%)

7. Adequate time and structure were provided for "sense-making." (p = .298)
   Non-CEEMS (n = 18): 0 (0%) | 5 (28%) | 13 (72%) | 0 (0%)
   Introductory (n = 14): 0 (0%) | 1 (7%) | 13 (93%) | 0 (0%)
   Challenge (n = 26): 0 (0%) | 1 (4%) | 23 (88%) | 2 (8%)

8. Adequate time and structure were provided for wrap-up. (p = .114)
   Non-CEEMS (n = 18): 0 (0%) | 6 (33%) | 12 (67%) | 0 (0%)
   Introductory (n = 14): 0 (0%) | 0 (0%) | 13 (93%) | 1 (7%)
   Challenge (n = 25): 0 (0%) | 3 (12%) | 22 (88%) | 0 (0%)

Note. Rating 1 = "not at all" and Rating 5 = "to a great extent." Percentages over 50% were marked bold in this table. p values were calculated based on Kruskal-Wallis tests. a Pairwise comparison based on Mann-Whitney U test with Bonferroni correction indicated that non-CEEMS lesson scores are significantly different from Introductory lesson scores. b Pairwise comparison based on Mann-Whitney U test with Bonferroni correction indicated that Introductory lesson scores are significantly different from Challenge lesson scores.

Synthesis ratings indicated that all Introductory lessons, 92% of Challenge lessons, and 72% of non-CEEMS lessons were highly reflective of best design practices (Table 6). A statistically significant difference was found for the synthesis lesson design scores across the three lesson types based on Kruskal-Wallis test results. However, no significant pairwise comparison differences were found between any pair of lesson types.

Table 6. Synthesis Ratings for Lesson Design by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Lesson Type (n) | Rating 1 and 2 (%) | Rating 3 (%) | Rating 4 and 5 (%) | p
Non-CEEMS (18) | 0 (0%) | 5 (28%) | 13 (72%) | .035
Introductory (14) | 0 (0%) | 0 (0%) | 14 (100%) |
Challenge (25) | 0 (0%) | 2 (8%) | 23 (92%) |

Note. Rating 1 = "not at all reflective of best practices" and Rating 5 = "extremely reflective of best practices." Percentages over 50% were marked bold in this table. p values were calculated based on Kruskal-Wallis tests.


As shown in Table 7, the implementation of each lesson type was categorized as:

Non-CEEMS—Direct Instruction
o Concept introduced, clarified, or reviewed through lecture, direct teacher questioning, whole-class exercises, or technology-enabled quizzing.

CEEMS Introductory—Inquiry-Based Learning
o Facilitated by the teacher, students developed essential questions, the challenge, or elements of the challenge.
o Set up by the teacher, students made systematic observations, collected data or information, analyzed or interpreted data, or conducted experiments.

CEEMS Challenge—Student Collaboration
o Students worked in groups (most lessons).
o Teacher solicited student input or assistance during whole-class discussion.

Statistically significant differences were found for items regarding lesson implementation (i.e., “instructional strategies were consistent with investigative mathematics/science,” “teacher appeared confident in his/her ability to teach mathematics/science,” “teacher’s classroom management style/strategies enhanced quality of lesson,” and “pace of the lesson was appropriate for developmental levels/needs of students and purposes of lesson”) across the three lesson types based on Kruskal-Wallis test results. However, no significant pair-wise differences were found between any pair of lesson types.

Table 7. Item Ratings for Lesson Implementation by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Implementation Lesson Type n Rating 1 and 2 (%) Rating 3 (%) Rating 4 and 5 (%) Don't Know and N/A (%) p *

1. The instructional strategies were consistent with investigative mathematics/science.

Non-CEEMS 18 1 (6%) 2 (11%) 5 (28%) 10 (56%) .038

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 0 (0%) 22 (85%) 4 (15%)

2. The teacher appeared confident in his/her ability to teach mathematics/science.

Non-CEEMS 18 0 (0%) 0 (0%) 15 (83%) 3 (17%) .036

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 0 (0%) 26 (100%) 0 (0%)



3. The teacher’s classroom management style/strategies enhanced the quality of the lesson.

Non-CEEMS 18 2 (11%) 6 (33%) 10 (56%) 0 (0%) .026

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 0 (0%) 3 (12%) 23 (88%) 0 (0%)

4. The pace of the lesson was appropriate for the developmental levels/needs of the students and the purposes of the lesson.

Non-CEEMS 18 0 (0%) 5 (28%) 13 (72%) 0 (0%) .028

Introductory 14 0 (0%) 0 (0%) 13 (93%) 1 (7%)

Challenge 26 0 (0%) 2 (8%) 24 (92%) 0 (0%)

5. The teacher was able to “read” the students’ level of understanding and adjusted instruction accordingly.

Non-CEEMS 18 0 (0%) 3 (17%) 8 (44%) 7 (39%) .097

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 4 (15%) 22 (85%) 0 (0%)

6. The teacher’s questioning strategies were likely to enhance the development of student conceptual understanding/problem solving (e.g., emphasized higher order questions, appropriately used “wait time,” identified prior conceptions and misconceptions).

Non-CEEMS 16 0 (0%) 5 (31%) 11 (69%) 0 (0%) .053

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 1 (4%) 24 (92%) 1 (4%)

Note. Rating 1=“not at all” and Rating 5=“to a great extent.” Percentages over 50% were marked bold in this table. * p values were calculated based on Kruskal-Wallis tests.
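The four reporting buckets used across these tables (Ratings 1-2, Rating 3, Ratings 4-5, and Don't Know/N/A) collapse the protocol's 1-5 scale plus its 6 = “Don't Know” and 7 = “N/A” codes (see Appendix B). A small sketch of that collapsing step, using a hypothetical rating list:

```python
from collections import Counter

def bucket_percentages(ratings):
    """Collapse raw Inside the Classroom item ratings into the four
    reported buckets: 1-2, 3, 4-5, and Don't Know / N/A (coded 6 and 7)."""
    buckets = Counter()
    for r in ratings:
        if r in (1, 2):
            buckets["Rating 1 and 2"] += 1
        elif r == 3:
            buckets["Rating 3"] += 1
        elif r in (4, 5):
            buckets["Rating 4 and 5"] += 1
        else:  # 6 = "Don't Know", 7 = "N/A"
            buckets["Don't Know and N/A"] += 1
    n = len(ratings)
    return {k: f"{v} ({v / n:.0%})" for k, v in buckets.items()}

# Hypothetical ratings for one item from 14 observed lessons.
print(bucket_percentages([4, 5, 4, 3, 4, 5, 4, 4, 6, 4, 5, 3, 4, 4]))
```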

Synthesis Ratings indicated that all Introductory, 92% of Challenge, and 69% of non-CEEMS lessons were highly reflective of best implementation practices (Table 8). Synthesis rating scores for implementation were not significantly different across the three lesson types based on Kruskal-Wallis test results.


Table 8. Synthesis Ratings for Lesson Implementation by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Lesson Type n Rating 1 and 2 (%) Rating 3 (%) Rating 4 and 5 (%) p *

Non-CEEMS 16 0 (0%) 5 (31%) 11 (69%) .076

Introductory 14 0 (0%) 0 (0%) 14 (100%)

Challenge 26 0 (0%) 2 (8%) 24 (92%)

Note. Rating 1=“not at all reflective of best practices” and Rating 5=“extremely reflective of best practices.” Percentages over 50% were marked bold in this table. * p values were calculated based on Kruskal-Wallis tests.

Analysis Question 2: What were the primary characteristics of lesson content, lesson features, and classroom culture?

As shown in Table 9, the mathematics/science content of CEEMS Introductory and Challenge lessons was significant, worthwhile, and appropriate for the developmental levels of the students in the class. During these lessons, students were intellectually engaged with important ideas relevant to the focus of the lesson to a significantly greater extent than were students in non-CEEMS lessons. Teachers were knowledgeable about mathematics/science concepts. In addition, compared to non-CEEMS lessons, both CEEMS Introductory and Challenge lessons made significantly more connections to other areas of mathematics/science, to other disciplines, and/or to real-world contexts. CEEMS Introductory and Challenge lessons also were rated significantly higher in the degree of “sense-making” of mathematics/science content than were non-CEEMS lessons.

Table 9. Item Ratings for Lesson Content by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Mathematics/Science Content Lesson Type n Rating 1 and 2 (%) Rating 3 (%) Rating 4 and 5 (%) Don't Know and N/A (%) p *

1. The mathematics/science content was significant and worthwhile.

Non-CEEMS 18 0 (0%) 2 (11%) 13 (72%) 3 (17%) .065

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 4 (15%) 19 (73%) 3 (12%)

2. The mathematics/science content was appropriate for the developmental levels of the students in this class.

Non-CEEMS 18 0 (0%) 5 (28%) 11 (61%) 2 (11%) .054

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 1 (4%) 22 (85%) 3 (12%)



3. Teacher-provided content information was accurate.

Non-CEEMS 18 0 (0%) 1 (6%) 13 (72%) 4 (22%) .149

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 0 (0%) 0 (0%) 22 (85%) 4 (15%)

4. Students were intellectually engaged with important ideas relevant to the focus of the lesson.

Non-CEEMS 18 2 (11%) 6 (33%) 8 (44%) 2 (11%) .004 a

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 4 (15%) 21 (81%) 1 (4%)

5. The teacher displayed an understanding of mathematics/science concepts (e.g., in his/her dialogue with students).

Non-CEEMS 18 0 (0%) 1 (6%) 14 (78%) 3 (17%) .023

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 0 (0%) 26 (100%) 0 (0%)

6. Mathematics/science was portrayed as a dynamic body of knowledge continually enriched by conjecture, investigation, analysis, and/or proof/justification.

Non-CEEMS 18 3 (17%) 2 (11%) 7 (39%) 6 (33%) .013

Introductory 14 0 (0%) 0 (0%) 13 (93%) 1 (7%)

Challenge 26 0 (0%) 1 (4%) 23 (88%) 2 (8%)

7. Elements of mathematical/science abstraction (e.g., symbolic representations, theory building) were included when it was important to do so.

Non-CEEMS 18 0 (0%) 5 (28%) 6 (33%) 7 (39%) .209

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 0 (0%) 24 (92%) 2 (8%)

8. Appropriate connections were made to other areas of mathematics/science, to other disciplines, and/or to real-world contexts.

Non-CEEMS 18 4 (22%) 5 (28%) 3 (17%) 6 (33%) < .001 ac

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 0 (0%) 26 (100%) 0 (0%)



9. The degree of “sense-making” of mathematics/science content within this lesson was appropriate for the developmental levels/needs of the students and the purposes of the lesson.

Non-CEEMS 18 0 (0%) 8 (44%) 7 (39%) 3 (17%) < .001 ac

Introductory 14 0 (0%) 0 (0%) 13 (93%) 1 (7%)

Challenge 25 0 (0%) 1 (4%) 22 (88%) 2 (8%)

Note. Rating 1=“not at all” and Rating 5=“to a great extent.” Percentages over 50% were marked bold in this table. * p values were calculated based on Kruskal-Wallis tests. a Pair-wise comparison based on Mann-Whitney U tests with Bonferroni correction indicated that non-CEEMS lesson scores are significantly different from Introductory lesson scores. c Pair-wise comparison based on Mann-Whitney U tests with Bonferroni correction indicated that non-CEEMS lesson scores are significantly different from Challenge lesson scores.

Synthesis Ratings indicated that all Introductory and Challenge lessons, as well as 44% of non-CEEMS lessons, were highly reflective of best practices for mathematics/science content (Table 10). Both CEEMS Introductory and Challenge lessons had significantly higher synthesis ratings in mathematics/science content than did non-CEEMS lessons.

Table 10. Synthesis Ratings for Mathematics/Science Content by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Lesson Type n Rating 1 and 2 (%) Rating 3 (%) Rating 4 and 5 (%) p *

Non-CEEMS 16 0 (0%) 9 (56%) 7 (44%) < .001 ac

Introductory 14 0 (0%) 0 (0%) 14 (100%)

Challenge 25 0 (0%) 0 (0%) 25 (100%)

Note. Rating 1=“not at all reflective of best practices” and Rating 5=“extremely reflective of best practices.” Percentages over 50% were marked bold in this table. * p values were calculated based on Kruskal-Wallis tests. a Pair-wise comparison based on Mann-Whitney U tests with Bonferroni correction indicated that non-CEEMS lesson scores are significantly different from Introductory lesson scores. c Pair-wise comparison based on Mann-Whitney U tests with Bonferroni correction indicated that non-CEEMS lesson scores are significantly different from Challenge lesson scores.

As shown in Table 11, non-CEEMS lessons featured traditional, lecture-type instruction, note-taking, and use of technology such as computers and calculators; CEEMS Introductory lessons featured both traditional, lecture-type and reform investigation-type instruction, note-taking, and use of technology, as well as other audio-visual resources; and CEEMS Challenge lessons


also featured both traditional, lecture-type and reform investigation-type instruction, note-taking, and use of audio-visual resources.

Table 11. Lesson Features by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Lesson Features Non-CEEMS Introductory Challenge Total

a. High quality “traditional” instruction, e.g., lecture 9 (50%) 8 (57%) 14 (54%) 31 (53%)

b. High quality “reform” instruction, e.g., investigation 3 (17%) 9 (64%) 17 (65%) 29 (50%)

c. Teacher/students using manipulatives 1 (6%) 1 (7%) 2 (8%) 4 (7%)

d. Teacher/students using calculators/computers 11 (61%) 9 (64%) 11 (42%) 31 (53%)

e. Teacher/students using other scientific equipment 1 (6%) 6 (43%) 7 (27%) 14 (24%)

f. Teacher/students using other audio-visual resources 8 (44%) 11 (79%) 15 (58%) 34 (59%)

g. Students playing a game 1 (6%) 0 (0%) 1 (4%) 2 (3%)

h. Students completing labnotes/journals/worksheets or answering textbook questions/exercises 14 (78%) 9 (64%) 19 (73%) 42 (72%)

i. Review/practice to prepare students for an externally mandated test 0 (0%) 0 (0%) 0 (0%) 0 (0%)

j. More than incidental reference/connection to other disciplines 1 (6%) 4 (29%) 5 (19%) 10 (17%)

Note. Percentages over 50% were marked bold in this Table.

Compared to non-CEEMS lessons, CEEMS Introductory lessons reflected significantly more collegial working relationships among students and demonstrated more intellectual rigor, constructive criticism, and challenging of ideas (Table 12). Statistically significant differences also were found in ratings for “interactions reflected collaborative working relationships between teacher and students” and for “climate of the lesson encouraged students to generate ideas, questions, conjectures, and/or propositions” across the three lesson types based on Kruskal-Wallis test results. However, no significant pair-wise differences were found between any pair of lesson types.

Page 31: A R 2015-2016 Ext… · perceptions of how well CEEMS helped meet schools’ science and mathematics education needs. Participants Principals at CEEMS schools with more than one CEEMS

Discovery Center for Evaluation, Research, and Professional Learning

Evaluation of CEEMS, Annual report 2015-2016, Page 24

Table 12. Item Ratings for Classroom Culture by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Classroom Culture Lesson Type n Rating 1 and 2 (%) Rating 3 (%) Rating 4 and 5 (%) Don't Know and N/A (%) p *

1. Active participation of all was encouraged and valued.

Non-CEEMS 18 2 (11%) 1 (6%) 14 (78%) 1 (6%) .378

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 4 (15%) 22 (85%) 0 (0%)

2. There was a climate of respect for students’ ideas, questions, and contributions.

Non-CEEMS 18 2 (11%) 5 (28%) 10 (56%) 1 (6%) .288

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 1 (4%) 2 (8%) 23 (88%) 0 (0%)

3. Interactions reflected collegial working relationships among students (e.g., students worked together, talked with each other about the lesson).

Non-CEEMS 18 2 (11%) 5 (28%) 11 (61%) 0 (0%) .013 a

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 4 (15%) 22 (85%) 0 (0%)

4. Interactions reflected collaborative working relationships between teacher and students.

Non-CEEMS 18 4 (22%) 1 (6%) 13 (72%) 0 (0%) .033

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 0 (0%) 5 (19%) 21 (81%) 0 (0%)

5. The climate of the lesson encouraged students to generate ideas, questions, conjectures, and/or propositions.

Non-CEEMS 18 0 (0%) 6 (33%) 12 (67%) 0 (0%) .042

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 3 (12%) 23 (88%) 0 (0%)

6. Intellectual rigor, constructive criticism, and the challenging of ideas were evident.

Non-CEEMS 18 4 (22%) 6 (33%) 7 (39%) 1 (6%) .001 a

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 0 (0%) 6 (23%) 18 (69%) 2 (8%)

Note. Rating 1=“not at all” and Rating 5=“to a great extent.” Percentages over 50% were marked bold in this table. * p values were calculated based on Kruskal-Wallis tests. a Pair-wise comparison based on Mann-Whitney U tests with Bonferroni correction indicated that non-CEEMS lesson scores are significantly different from Introductory lesson scores.

Synthesis Ratings indicated that the classroom culture of all Introductory, 75% of Challenge, and 67% of non-CEEMS lessons facilitated student learning (Table 13). Synthesis rating scores for classroom culture were not significantly different across the three lesson types based on Kruskal-Wallis test results.


Table 13. Synthesis Ratings for Classroom Culture by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

Lesson Type n Rating 1 and 2 (%) Rating 3 (%) Rating 4 and 5 (%) p *

Non-CEEMS 18 2 (11%) 4 (22%) 12 (67%) .064

Introductory 14 0 (0%) 0 (0%) 14 (100%)

Challenge 24 0 (0%) 6 (25%) 18 (75%)

Note. Rating 1=“classroom culture interfered with student learning” and Rating 5=“classroom culture facilitated student learning.” Percentages over 50% were marked bold in this table. * p values were calculated based on Kruskal-Wallis tests.

Analysis Question 3: What were the primary characteristics of the overall lesson?

Non-CEEMS Lessons: Direct Instruction. The majority of non-CEEMS lessons (83%) included direct instruction as a primary characteristic of the lesson. In these lessons, teachers used traditional teaching strategies, such as lecture and direct questioning with the whole class. Most lessons also included CEEMS-consistent instructional strategies alongside the traditional ones, such as student collaboration in groups or whole-class instruction that engaged all students. Nevertheless, the primary characteristic of these lessons was the use of traditional instructional practices. The timing of non-CEEMS lessons (i.e., before or after CEEMS lessons were conducted by the same teacher) did not appear to be associated with the degree of traditional instructional practices used in the lesson (Table 14).

CEEMS Introductory Lessons: Inquiry-Based Learning. Facilitated by the teacher, students developed essential questions, defined the purpose or parameters of the challenge, or determined challenge procedures (5 lessons). In other inquiry-based lessons, students made systematic observations, collected data or information, analyzed or interpreted data, or conducted experiments facilitated by the teacher (3 lessons). These activities occurred in whole- or small-group settings and were conducted as part of a larger, multiple-lesson challenge or project. Although the degree to which teachers controlled aspects of the inquiry process varied, the majority of these lessons were characterized by some type of independent inquiry intended to support learning related to a challenge (Table 14).

CEEMS Challenge Lessons: Challenge-Based Learning and Engineering Design Process. The majority of CEEMS Challenge lessons were characterized by design or organization of, implementation of, reflection about, or presentation of a specific challenge. CEEMS Challenge lessons varied in content, as teachers adapted challenge-based learning and the engineering design process to their subject areas and grade levels. As was the case with all lesson types, CEEMS Challenge lessons mixed traditional and CEEMS-consistent instructional strategies. These lessons often occurred across several class periods, occurred in student groups, and required the engagement of all students (Table 14).

Compared to non-CEEMS lessons, CEEMS Introductory lessons were significantly more likely to impact students’ understanding of important mathematics/science concepts; students’ ability to apply or generalize skills and concepts to other areas of mathematics/science, other


disciplines, and/or real-life situations; students’ self-confidence in doing mathematics/science; and students’ interest in and/or appreciation for the discipline.

Table 14. Observer Ratings for Likely Impact of Instruction on Student Learning by Lesson Type, Inside the Classroom Observation and Analytic Protocol, 2015-2016

E1. Likely Impact of Instruction on Students’ Understanding of Mathematics/Science

Lesson Type n Rating 1 and 2 (%) Rating 3 (%) Rating 4 and 5 (%) Don't Know and N/A (%) p *

a. Students’ understanding of mathematics/science as a dynamic body of knowledge generated and enriched by investigation.

Non-CEEMS 18 0 (0%) 6 (33%) 6 (33%) 6 (33%) .072

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 0 (0%) 6 (23%) 19 (73%) 1 (4%)

b. Students’ understanding of important mathematics/science concepts.

Non-CEEMS 18 2 (11%) 6 (33%) 10 (56%) 0 (0%) .059

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 0 (0%) 8 (31%) 18 (69%) 0 (0%)

c. Students’ capacity to carry out their own inquiries.

Non-CEEMS 18 2 (11%) 4 (22%) 4 (22%) 8 (44%) .008 a

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 0 (0%) 7 (27%) 18 (69%) 1 (4%)

d. Students’ ability to apply or generalize skills and concepts to other areas of mathematics/science, other disciplines, and/or real-life situations.

Non-CEEMS 18 2 (11%) 3 (17%) 4 (22%) 9 (50%) .004 a

Introductory 14 0 (0%) 0 (0%) 14 (100%) 0 (0%)

Challenge 26 0 (0%) 7 (27%) 18 (69%) 1 (4%)

e. Students’ self-confidence in doing mathematics/science.

Non-CEEMS 18 2 (11%) 6 (33%) 10 (56%) 0 (0%) .022 a

Introductory 14 0 (0%) 1 (7%) 13 (93%) 0 (0%)

Challenge 26 0 (0%) 6 (23%) 19 (73%) 1 (4%)

f. Students’ interest in and/or appreciation for the discipline.

Non-CEEMS 18 2 (11%) 9 (50%) 7 (39%) 0 (0%) .007 a

Introductory 14 0 (0%) 1 (7%) 12 (86%) 1 (7%)

Challenge 26 0 (0%) 10 (38%) 15 (58%) 1 (4%)

Note. Rating 1=“negative effect,” Rating 3=“mixed/neutral effect,” and Rating 5=“positive effect.” Percentages over 50% were marked bold in this table. * p values were calculated based on Kruskal-Wallis tests. a Pair-wise comparison based on Mann-Whitney U tests with Bonferroni correction indicated that non-CEEMS lesson scores are significantly different from Introductory lesson scores.


Communication Logs

To inform evaluation question 2 (i.e., in what ways did CEEMS Resource Team support for teachers change in the course of their participation in CEEMS), the Discovery Center designed a multiple-case study that used communication logs between resource team members and teachers as the data source.

Theoretical Framework

In the CEEMS project, resource team members’ support of teachers and instruction was intended to “Develop mathematics and science teacher knowledge of challenge-based learning, engineering, and the engineering design process as instructional strategies” (CEEMS External Evaluation Annual Report Year 3). This ongoing support was based on the expectation that regular communication between teachers and resource team members would facilitate teachers’ abilities to incorporate challenge-based learning, engineering concepts, and the engineering design process into their regular classroom activities and improve their instructional practice. In addition, resource team members were expected to observe and assist in the revision of lesson plans that teachers shared on a public website. In these ways, resource team members provided classroom-embedded, individualized support for CEEMS mathematics and science teachers.

The structure of classroom-embedded, individualized resource team member support reflected the Guskey Model of Teacher Change (Guskey, 1986, 2002), which the evaluation team used to develop the theoretical framework for this multiple-case study. The Guskey Model represents the process of teacher change in attitudes and beliefs as a result of the implementation of new instructional practices and their subsequent effects on student learning. Traditional models of teacher change assumed change in attitudes and beliefs preceded change in practice (Guskey, 2002). Evaluators believed the Guskey Model of Teacher Change was more aligned with the CEEMS structure and its goal for resource team support than were traditional models of teacher change. More specifically, Guskey (2002) identified three primary implications of his model for professional development:

1. Change is gradual and difficult;
2. Teachers require regular feedback on student learning progress; and
3. Teachers require follow-up support and continued pressure to persist when change is difficult.

The theoretical framework that guided this multiple-case study was developed based on these three proposed implications. Each implication led to one or more propositions investigated within each case. Table 15 shows the connections between the theoretical implications of the Guskey Model and this multiple-case study’s propositions.


Table 15. Connections between Theory and Case Study Propositions

Theoretical implications of the Guskey Model, with their corresponding case study propositions:

Change is gradual and difficult:
- Early in the academic year, multiple communication logs will cover similar content/issues. Later in the academic year, individual communication logs will cover different, complex, unique, and/or multiple content items/issues.
- Early in the academic year, communication logs will cover fundamental issues and basic content. Later in the academic year, communication logs will cover more complex, specialized, unique, or multi-faceted content/issues.

Teachers require regular feedback on student learning progress:
- Communications that offer feedback (as opposed to procedural communications) will include attention to student learning progress/outcomes.

Teachers require follow-up support and continued pressure to persist when change is difficult:
- Communications will include attention to the outcomes of previous communications/interactions.

These propositions represent an ideal type that evaluators compared to each case. The goal of this multiple-case study was analytic generalization based on a literal replication design (Yin, 2009). That is, analysis across multiple cases allowed the theoretical framework to be tested multiple times. In addition to literal replication, Yin (2009) has suggested that case study researchers design multiple-case studies to allow for theoretical replication—the inclusion of cases with at least one element that is different from literal replication cases—to test the theory’s robustness. However, theoretical replication was not a part of this evaluation study’s design because the data did not include relevant variables related to individual teachers or resource team members that could be used to measure variation in the sample. The evaluation aim, then, was to compare the empirical evidence of the ways resource team support changed with the Guskey Model and offer evaluative judgments in consideration of the CEEMS project goal for resource team member support.

Field Procedures

Data Source

Data analyzed for this multiple-case study were collected by the UC ESC Team and included communication logs between resource team members and the teacher(s) they supported during the 2014-2015 academic year. These data included the date, resource team member(s) and teacher(s) involved, communication media, and a narrative description of the nature of the communication.


For each case, evaluators analyzed the nature of the communication as a whole, as well as the content of each communication, over time.

Case Identification

After reviewing the content of 276 communication logs created by resource team members during the 2014-2015 academic year, evaluators identified 20 communication logs with richly detailed information for further analysis. Selected logs were created by four resource team members (Team members 1, 5, 7, and 14). One communication log (resource Team member 5) was discarded from this initial sample because there were no additional communications between the resource team member and teacher that could be used to examine change over time. Based on the nature of the data, evaluators defined a case as a resource team member-teacher pair. In the second iteration of case identification, all communication logs were read again to identify logs that, while not richly detailed, still contained information about specific feedback the resource team member gave to the teacher. During this second iteration, evaluators identified logs that included information about the type and/or nature of resource team member support, even when the information was not richly detailed. These two initial readings of the content of communication logs revealed that the vast majority of logs fell into one of four categories, depending on the length (summary vs. description) and nature (resource team members’ actions/recommendations vs. observations) of the log. This categorization scheme is represented in Figure 3.

Quadrant I: Descriptions of resource team members’ actions/recommendations
Quadrant II: Descriptions of resource team members’ observations
Quadrant III: Summaries of resource team members’ actions/recommendations
Quadrant IV: Summaries of resource team members’ observations
(Rows: description vs. summary of the communication; columns: resource team members’ actions vs. resource team members’ observations.)

Figure 3. Communication log categorization scheme.

Evaluators followed the categorization scheme detailed in Figure 3 to code each communication log, based on the nature of the communication recorded in the log. We identified all communication logs categorized according to the characteristics in Quadrant I, identified each case within this sample of logs, and then collected all communications for each case. This process resulted in eight cases.


We analyzed the first four cases and report the preliminary results here; we will analyze the next four cases as the first replication sample. The University of Cincinnati’s Evaluation Services Center collected and is in the process of cleaning four years of communication log data, which we will include in this multiple-case study as the evaluation progresses. We aim to use multiple years of communication log data to analyze replication samples until we reach data saturation, the stage at which no new findings emerge despite the collection of new data. We will compare findings from initial and replication case analyses to draw conclusions and recommendations about the nature of change in resource team member support for teachers.
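As a rough sketch of this triage step (the log records, fields, and example values below are hypothetical; in the study, the quadrant judgments came from evaluators reading each log), the Quadrant I filter could look like:

```python
from dataclasses import dataclass

@dataclass
class CommunicationLog:
    case_id: str          # resource team member-teacher pair
    month: str
    is_description: bool  # True = description, False = summary
    about_actions: bool   # True = actions/recommendations, False = observations

def quadrant(log: CommunicationLog) -> str:
    """Map a log onto the Figure 3 scheme: rows are description vs. summary,
    columns are actions/recommendations vs. observations."""
    if log.is_description:
        return "I" if log.about_actions else "II"
    return "III" if log.about_actions else "IV"

# Hypothetical logs; in the study these judgments came from reading each log.
logs = [
    CommunicationLog("RTM1-TeacherA", "Nov", True, True),
    CommunicationLog("RTM1-TeacherA", "Jan", False, True),
    CommunicationLog("RTM5-TeacherB", "Oct", True, False),
]

# Keep Quadrant I logs, then group the retained logs by case.
quadrant_one = [log for log in logs if quadrant(log) == "I"]
cases = {log.case_id for log in quadrant_one}
print(f"{len(quadrant_one)} Quadrant I log(s) across {len(cases)} case(s)")
```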

Data Analysis

Evaluators followed a deductive qualitative approach to analyze case study data. The analysis goal was to discover within- and cross-case themes related to the deductive concepts. The same process will be followed to discover within- and cross-case themes for the replication cases. Because the analyses reported here do not include replication samples, findings should be considered preliminary. Concepts used to analyze data deductively are defined in Table 16.

Table 16. Concepts and Definitions for Deductive Analysis, Communication Logs, 2014-2015

Each entry lists the concept, its operational definition, and its sub-codes.

1. Trans-Disciplinary Curriculum. Support related to the inclusion of information, concepts, terms, examples, etc. from more than one academic discipline into curriculum/instruction. Sub-codes: Information; Concepts; Terms; Examples; Other.

2. Inquiry-Based Learning. Support related to instructional practices that encourage scientific inquiry processes, including questioning; hypothesis development; study/project design; observation/prediction; and data collection, analysis, and interpretation. Sub-codes: Question Development; Hypothesis Development; Study Design; Observation/Predictions; Data Collection/Analysis; Data Interpretation.

3. Authentic Learning. Support for the use of real-world examples/concerns, the connection of content to students’ everyday lives, and/or the application of concepts to novel/real-world situations. Sub-codes: Real-World Examples; Everyday Connection; Novel Situations; Real-World Concepts; Real-World Processes.


4. Engineering Design Process. Support for curriculum/instruction related to the engineering design process. Sub-codes: Study of Design; Design Implementation; Re-Design/Reflection.

5. Challenge/Problem-Based Learning/Project. Support for curriculum/instruction related to the development/implementation of a challenge that must be solved. Sub-codes: Challenge Introduction; Challenge Implementation; Challenge Presentation/Summary; Challenge Reflection/Revision.

6. Career Exploration. Support for discussion, research, or consideration of STEM careers in the context of science and/or mathematics learning. Sub-codes: Career Discussion/Lecture; Career Exploration (student-led); Professional Guest.

7. Collaborative Learning Environment. Support related to cooperation among students and/or between students and the teacher. Sub-codes: Student Collaboration; Student-Teacher Collaboration.

8. Professional Learning Community. Discussions related to cooperation among/between teachers to prepare for and/or conduct lessons. Sub-codes: Integrated Curriculum/Instruction; Instructional Support.

9. Active Learning. Support related to student engagement, hands-on activities, teacher questioning/feedback, whole-class discussion, or other forms of active engagement with the curriculum. Sub-codes: Hands-On; Engagement.

10. Assessment. Discussions related to assessment. Sub-codes: Review; Assessment Rubrics.

11. Revision. Discussions about revision, including the need (or lack thereof) for revisions and the ways revision of materials (e.g., lesson plans) impacted instruction/outcomes. Sub-codes: Lesson Plan/Curriculum; Procedures; Instruction.

12. Technology. Discussions about the ways video resources were related to the observation. Sub-codes: Website/Video Resources; Video Uploading.

13. External Influences. Discussions about the ways external influences—those out of the control of the classroom teacher—affected the lesson/unit/instruction. Sub-codes: Schedule; Policy; Student Body; [other? (specify)].


Global Codes. Codes applied to indicate the nature of the entire communication:

14. Feedback Communication. A communication that includes information about support/feedback provided to the teacher.

15. Procedural Communication. A communication that only includes procedural information, such as due dates and scheduling.

16. Descriptive Communication. A communication that only includes a description of the classroom observation.

The concepts defined in Table 16 represent the codes evaluators applied to communication log data. Each code contains at least two sub-codes, which reflect the code’s operational definition and facilitated analysis. The three global codes indicate the nature of the entire communication.
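As an illustration of how a deductive codebook of this shape could be represented and applied (the keyword matching below is a deliberate simplification; the actual coding was a human judgment, and the example log text is invented):

```python
# A subset of the Table 16 codebook: code -> sub-codes.
CODEBOOK = {
    "Inquiry-Based Learning": [
        "Question Development", "Hypothesis Development", "Study Design",
        "Observation/Predictions", "Data Collection/Analysis", "Data Interpretation",
    ],
    "Engineering Design Process": [
        "Study of Design", "Design Implementation", "Re-Design/Reflection",
    ],
    "Collaborative Learning Environment": [
        "Student Collaboration", "Student-Teacher Collaboration",
    ],
}

# Simplified keyword triggers per sub-code; human coders applied richer judgment.
KEYWORDS = {
    "Data Collection/Analysis": ["collected data", "analyzed data"],
    "Re-Design/Reflection": ["re-design", "redesigned", "reflection"],
    "Student Collaboration": ["worked in groups", "group work"],
}

def code_log(text: str) -> dict:
    """Return {code: [matched sub-codes]} for one communication log entry."""
    text = text.lower()
    hits = {}
    for code, sub_codes in CODEBOOK.items():
        matched = [s for s in sub_codes
                   if any(kw in text for kw in KEYWORDS.get(s, []))]
        if matched:
            hits[code] = matched
    return hits

# Invented log text, for illustration only.
print(code_log("Students worked in groups, collected data, and began a re-design."))
```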

Case Study Questions

Case study questions in this section include level-2 questions, asked of each case, and level-3 questions, asked “of the pattern of findings across multiple cases” (Yin, 2009, p. 87).

Level-2 Questions

1. In what ways were communications from August to December different from communications from January to May?

2. In what ways were communications different from one another based on communication type?

3. In what ways did later communications relate to earlier communications?

4. What themes emerged in this case?

Level-3 Questions

1. Across cases, what pattern(s) emerged in the extent to which cases reflected case study propositions?

2. Across cases, within what domains did resource team members focus their support?

Preliminary Findings

Case #1

This case included four communications spaced relatively regularly throughout the academic year (October, November, January, and April). These were feedback communications (November and January) and descriptive communications (October and April). Table 17 provides answers to Level-2 case study questions for this case.


Table 17. Case Study Questions and Findings, Case #1, Communication Logs, 2014-2015

Case Study Question Findings

1. In what ways were communications from August to December different from communications from January to May?

The November communication was different from all others because it was an end-of-unit summary that included specific feedback on what went well and what could be improved.

The feedback communication in January was more of a report that feedback occurred than a description of the feedback itself. The content and brevity of this communication suggested that the resource team member and teacher discussed, in person, specific improvements to lesson implementation and the availability of materials.

2. In what ways were communications different based on communication type?

The November end-of-unit communication provided five independent points of feedback to note the ways lesson design and implementation enhanced student learning. This communication included three points of feedback about the ways instructional changes could improve student learning in the future and one point of feedback about the influence of the school’s schedule on the lesson.

Descriptive communications provided descriptions of the major features and activities of the lesson and minimal evaluative feedback. The few times evaluative feedback was provided in these communications, the comment concerned the influence of instructional decisions on student learning.

3. In what ways did later communications relate to earlier communications?

Later communications did not relate to earlier communications.

4. What themes emerged in this case?

Three of the four communications (October, November, and April) highlighted the way at least one element of the challenge or inquiry process influenced student learning.

Case #2

This case included 11 communications spaced regularly throughout the academic year: September (1), October (3), November (1), January (3), March (1), and April (2). These were feedback communications (3) and descriptive communications (8). Findings for case study questions are found in Table 18.


Table 18. Case Study Questions and Answers, Case #2, Communication Logs, 2014-2015

Case Study Question Answer

1. In what ways were communications from August to December different from communications from January to May?

The first four communications (September-October) described lesson implementation with no additional comments or opinions. The fifth communication, in November, described a 1½-hour unit review between two resource team members and a teacher. The review generated what the resource team member described as “excellent discussion about challenge-based learning” (Case #2 communication log, 11/1/2014) and lesson improvements.

The feedback communication in March summarized an end-of-unit communication.

January communications were more diverse in content than any other group of communications. One January communication embedded comments about the influence of lesson implementation on student learning within a predominantly descriptive entry, and another simply noted that an observation and in-person feedback discussion had occurred.

2. In what ways were communications different based on communication type?

The March end-of-unit communication provided 12 independent points of feedback to note the ways lesson design and implementation enhanced student learning. This communication included five points of feedback about the ways changes to instruction could improve student learning in the future. Comments and suggestions were specific and almost always linked student response (e.g., engagement, collaboration) to individual instructional strategies.

Descriptive communications provided descriptions of the major features and activities of the lesson and only two evaluative comments. These comments connected productive student collaboration in groups to strong group leadership.

3. In what ways did later communications relate to earlier communications?

Later communications did not relate to earlier communications.

4. What themes emerged in this case?

The primary lesson elements this resource team member described were about challenge- and inquiry-based learning. Feedback mirrored the challenge-based process of planning, implementation, reflection, and revision.


Case #3

This case included eight communications from September to October. These were feedback communications (4) and procedural communications (4). Findings for case study questions are found in Table 19.

Table 19. Case Study Questions and Answers, Case #3, Communication Logs, 2014-2015

Case Study Question Answer

1. In what ways were communications from August to December different from communications from January to May?

This question cannot be answered because the communications ended in October.

2. In what ways were communications different based on communication type?

The two evaluative comments contained in feedback communications were specific and connected lesson implementation to student learning.

3. In what ways did later communications relate to earlier communications?

Later communications did not relate to earlier communications.

4. What themes emerged in this case?

No theme emerged in this case.


Case #4

This case included five communications spaced regularly throughout the academic year (October, November, December, March, and April). These were feedback communications (3) and descriptive communications (2). Findings for case study questions are provided in Table 20.

Table 20. Case Study Questions and Answers, Case #4, Communication Logs, 2014-2015

Case Study Question Answer

1. In what ways were communications from August to December different from communications from January to May?

One communication (December) indicated that the resource team member provided specific feedback on a unit, but those comments were not available in the log. All remaining communications were similar.

2. In what ways were communications different based on communication type?

Except for December, all communications described lesson observations in detail. These lessons used challenge-based learning, inquiry-based learning, and the engineering design process as the primary modes of learning. Feedback was minimal for all communications.

3. In what ways did later communications relate to earlier communications?

Later communications did not relate to earlier communications.

4. What themes emerged in this case?

The primary lesson elements this resource team member described were about inquiry-based learning and the engineering design process. Re-design and re-test were elements of three observations.


Level-3 Questions

1. Across cases, what pattern(s) emerged in the extent to which cases reflected case study propositions?

2. Across cases, within what domains did resource team members focus their support?

These findings are preliminary and are not supported by replication cases; therefore, evaluators cannot answer cross-case questions with confidence at this stage. At this preliminary stage of analysis, communication logs appear to link specific instructional strategies with student learning when resource team members comment on lesson implementation. It also appears that descriptions of observations are more frequent than specific feedback on lesson elements. One pattern consistent across cases is that later communications did not relate to earlier communications, contrary to the relationship proposed by the theoretical framework. Even this apparent consistency should be noted with caution, however, as one case included several communications but none after October.


Summary and Recommendations

Summary

A summary of findings from the Year 5 evaluation, followed by recommendations for continued project progress, is provided by evaluation question.

Evaluation Question 1: In what ways did teachers’ instructional practices change in the course of their participation in CEEMS?

Principals reported that the CEEMS program has provided many benefits to science and mathematics education and helped address various teaching and learning issues in their schools. Principals perceived that most of the central elements of the CEEMS program were implemented as intended by CEEMS program developers. From these principals’ perspectives, the strength of the CEEMS program lies in its preparation of teachers as collaborative leaders, its connection of science and math concepts to real-world engineering applications, its incorporation of industry professionals as resources and models for what is possible for students who pursue STEM disciplines, and its direct provision of the resources necessary for 21st century science and mathematics education. Because the program is so dynamic, intense, and effective, CEEMS teachers are highly qualified and, therefore, in demand. The challenge of CEEMS participation, then, lies in the need for resources that will sustain the program beyond the initial grant period.

Classroom observations suggested that first-year CEEMS participant teachers implemented the CEEMS program as intended. In particular, the pattern of lesson elements by lesson type is consistent with the purpose of the lessons and suggests teachers are implementing the CEEMS program as intended. Variation in instructional practices by lesson type suggests that one impact of CEEMS participation was an increase in the diversity of instructional strategies teachers applied in their science and mathematics classrooms. The rating of CEEMS Introductory lessons as most consistent with best practices in mathematics and science, followed by CEEMS Challenge lessons and then by non-CEEMS lessons, suggests that participation in the CEEMS program improved the quality of teachers’ instruction, compared to their routine practices, and increased the likelihood of a lesson’s positive effect on student learning. Although non-CEEMS lessons mixed CEEMS-consistent and traditional instructional practices at times, these classroom observations suggest that teachers have not yet internalized CEEMS instructional practices to the point that they have replaced traditional practices. Instead, it appears that, in the first year of CEEMS instruction, most teachers differentiated between CEEMS and non-CEEMS lessons and reserved most inquiry-based, challenge-based, and engineering design-based instruction for CEEMS lessons.

Evaluation Question 2: In what ways did CEEMS Resource Team support for teachers change in the course of their participation in CEEMS?


Preliminary findings based on analysis of 2014-2015 communication log data suggested that, in some cases, teachers and resource team members discussed elements of the challenge-based, inquiry-based, and/or engineering design process of planning, implementation, reflection, and revision during the school year; and that resource team member support for teachers changed in response to the communication type, with end-of-unit communications being the most detailed.

Recommendations

Classroom observation data suggested that teachers appeared to reserve most CEEMS-based instructional practices for CEEMS lessons, even when they conducted all types of lessons (non-CEEMS, CEEMS Introductory, and CEEMS Challenge) to focus on different types of mathematics and science learning. These findings suggest that teachers already use CEEMS instructional practices for a variety of instructional purposes. Although teachers may compartmentalize CEEMS and non-CEEMS units because they developed CEEMS units through an iterative and collaborative process during their CEEMS participation, the project team may share these evaluation results with teachers to demonstrate 1) that CEEMS instructional practices are versatile enough to support a variety of learning objectives, and 2) that embedding CEEMS instructional practices—particularly inquiry-based practices—may improve the quality of instruction even in non-CEEMS lessons.

If resource team members continue to keep communication logs in the future, the project could improve the consistency of information across logs by providing guidelines or examples of what logs should entail. This would ensure that critical data are available to the evaluation. It appears that communication between teachers and resource team members occurs in person most frequently, and that most communication logs are used to record that communication happened, rather than to describe the nature of the communication. For the project, research, and evaluation teams to understand the nature of resource team member-teacher collaboration more completely, communication logs must record more detail about how that communication takes place.

Next Steps

Because communication logs contained detailed but short entries, evaluators will need to conduct several case replications and use all four years of communication log data to be confident that findings and judgments are based on sound evidence. Additionally, the evaluation team will revise training protocols for fellows to enhance the credibility and dependability of data collection. As fellows collect classroom observation data next year, the evaluation team will be more vigilant in ensuring all teachers have observations of all three lesson types, to gauge the degree to which variation by lesson type depends on variation in teachers’ instructional styles.


References

Guskey, T. R. (1986). Staff development and the process of teacher change. Educational Researcher, 15(5), 5-12.

Guskey, T. R. (2002). Professional development and teacher change. Teachers and Teaching, 8(3), 381-391. doi:10.1080/135406002100000512

Horizon Research, Inc. (2000). Inside the classroom observation and analytic protocol. Retrieved June 22, 2016, from http://www.horizon-research.com/horizonresearchwp/wp-content/uploads/2013/04/cop1.pdf

Lapadat, J. C. (2000). Problematizing transcription: Purpose, paradigm and quality. International Journal of Social Research Methodology, 3(3), 203-219.

Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand Oaks, CA: Sage.


Appendices

Appendix A. Principal Focus Group Protocol and Embedded Activity Response Sheet

Appendix B. Inside the Classroom Observation and Analytic Protocol

Appendix C. Classroom Observation Code Book


Appendix A. Principal Focus Group Protocol and Embedded Activity Response Sheet

Circle your current placement: High School – Middle School – Other: ______________

Features of CEEMS program Issues schools face in math & science

Science action planning

Engineering challenge-based instruction

Teacher-led professional development

In-class support from graduate student fellows

STEM career exposure

CEEMS conference

Resource team communication & support

Summer Institute for Teachers (SIT)

Engineering design-based instruction

Additional Comments:


Issues schools face in math & science

1. Students’ Science/Mathematics Literacy/Background
2. Student interest in STEM learning
3. Perceptions of intelligence needed for science/math learning
4. Perceptions of relevance of science/math content
5. Science/math teacher qualifications
6. Science/math teacher dispositions
7. Science/math teacher availability/commitment
8. Demographically relevant achievement gaps
9. Resources necessary for 21st century science/math instruction
10. New ideas for ways to teach content.*

*This item was added by focus group participants.

Appendix B. Inside the Classroom Observation and Analytic Protocol

Inside the Classroom: Observation and Analytic Protocol
Horizon Research, Inc., 11/30/00

Observation Date: Time: Start: End:

School: District:

Teacher:

PART ONE: THE LESSON

Section A. Basic Descriptive Information

1. Teacher Gender: Male Female

Teacher Ethnicity: American Indian or Alaskan Native / Asian / Hispanic or Latino / Black or African-American / Native Hawaiian or Other Pacific Islander / White

2. Subject Observed: Mathematics Science

3. Grade Level(s):

4. Course Title (if applicable)

Class Period (if applicable)

5. Students: Number of Males Number of Females

6. Did you collect copies of instructional materials to be sent to HRI?

☐ Yes  ☐ No, explain:


Section B. Purpose of the Lesson
In this section, you are asked to indicate how lesson time was spent and to provide the teacher's stated purpose for the lesson.

1. According to the teacher, the purpose of this lesson was:

2. Based on time spent, the focus of this lesson is best described as: (Check one.)

○ Almost entirely working on the development of algorithms/facts/vocabulary

○ Mostly working on the development of algorithms/facts/vocabulary, but working on some mathematics/science concepts

○ About equally working on algorithms/facts/vocabulary and working on mathematics/science concepts

○ Mostly working on mathematics/science concepts, but working on some algorithms/facts/vocabulary

○ Almost entirely working on mathematics/science concepts

Section C. Lesson Ratings
In this part of the form, you are asked to rate each of a number of key indicators in four different categories, from 1 (not at all) to 5 (to a great extent). You may list any additional indicators you consider important in capturing the essence of this lesson and rate these as well. Use your "Ratings of Key Indicators" to inform your "Synthesis Ratings". It is important to indicate in "Supporting Evidence for Synthesis Ratings" what factors were most influential in determining your synthesis ratings and to give specific examples and/or quotes to illustrate those factors.

Note that any one lesson is not likely to provide evidence for every single indicator; use 6, "Don't know" when there is not enough evidence for you to make a judgment. Use 7, "N/A" (Not Applicable) when you consider the indicator inappropriate given the purpose and context of the lesson. This section also includes ratings of the likely impact of instruction and a capsule rating of the quality of the lesson.
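To make the rating scheme concrete, the following minimal sketch (Python; illustrative only, not part of the Horizon protocol or the CEEMS evaluation tooling) shows how a single indicator rating could be checked against these rules, including the footnoted expectation that starred indicators receive a substantive 1-5 rating.

# Illustrative sketch only; not part of the Horizon protocol.
VALID_SCORES = {1, 2, 3, 4, 5}      # 1 = "not at all" ... 5 = "to a great extent"
DONT_KNOW, NOT_APPLICABLE = 6, 7    # non-substantive codes

def check_rating(score: int, starred: bool = False) -> str:
    """Classify one key-indicator rating under the Section C rules."""
    if score in VALID_SCORES:
        return "ok"
    if score in (DONT_KNOW, NOT_APPLICABLE):
        # Starred indicators are expected to be rated 1-5 for nearly all
        # lessons; a 6 or 7 calls for an explanation in the supporting evidence.
        return "needs explanation" if starred else "ok"
    raise ValueError(f"rating must be 1-7, got {score}")

# Example: check_rating(6, starred=True) returns "needs explanation".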

Evaluation of CEEMS, Annual report 2015-2016 46

Page 55: A R 2015-2016 Ext… · perceptions of how well CEEMS helped meet schools’ science and mathematics education needs. Participants Principals at CEEMS schools with more than one CEEMS

Horizon Research, Inc. Inside the Classroom: Observation and Analytic Protocol – Page 3 11/30/00

I. Design

A. Ratings of Key Indicators
(Rating scale: 1 = Not at all, 5 = To a great extent; 6 = Don't know; 7 = N/A)

1. The design of the lesson incorporated tasks, roles, and interactions consistent with investigative mathematics/science. 1 2 3 4 5 6 7

2. The design of the lesson reflected careful planning and organization. 1 2 3 4 5 6* 7*

3. The instructional strategies and activities used in this lesson reflected attention to students' experience, preparedness, prior knowledge, and/or learning styles. 1 2 3 4 5 6 7

4. The resources available in this lesson contributed to accomplishing the purposes of the instruction. 1 2 3 4 5 6 7

5. The instructional strategies and activities reflected attention to issues of access, equity, and diversity for students (e.g., cooperative learning, language-appropriate strategies/materials). 1 2 3 4 5 6* 7*

6. The design of the lesson encouraged a collaborative approach to learning among the students. 1 2 3 4 5 6 7

7. Adequate time and structure were provided for "sense-making." 1 2 3 4 5 6* 7*

8. Adequate time and structure were provided for wrap-up. 1 2 3 4 5 6 7

9. _______________________________________________ 1 2 3 4 5

* We anticipate that these indicators should be rated 1-5 for nearly all lessons. If you rated any of these indicators 6 or 7, please provide an explanation in your supporting evidence below.

B. Synthesis Rating

1 2 3 4 5
(1 = Design of the lesson not at all reflective of best practice in mathematics/science education; 5 = Design of the lesson extremely reflective of best practice in mathematics/science education)

C. Supporting Evidence for Synthesis Rating
Provide a brief description of the nature and quality of this component of the lesson, the rationale for your synthesis rating, and the evidence to support that rating.



II. Implementation

A. Ratings of Key Indicators
(Rating scale: 1 = Not at all, 5 = To a great extent; 6 = Don't know; 7 = N/A)

1. The instructional strategies were consistent with investigative mathematics/science. 1 2 3 4 5 6 7

2. The teacher appeared confident in his/her ability to teach mathematics/science. 1 2 3 4 5 6 7

3. The teacher's classroom management style/strategies enhanced the quality of the lesson. 1 2 3 4 5 6* 7*

4. The pace of the lesson was appropriate for the developmental levels/needs of the students and the purposes of the lesson. 1 2 3 4 5 6* 7*

5. The teacher was able to "read" the students' level of understanding and adjusted instruction accordingly. 1 2 3 4 5 6 7

→ 6. The teacher's questioning strategies were likely to enhance the development of student conceptual understanding/problem solving (e.g., emphasized higher order questions, appropriately used "wait time," identified prior conceptions and misconceptions). 1 2 3 4 5 6 7

7. __________________________________________________ 1 2 3 4 5

* We anticipate that these indicators should be rated 1-5 for nearly all lessons. If you rated any of these indicators 6 or 7, please provide an explanation in your supporting evidence below.

B. Synthesis Rating

1 2 3 4 5
(1 = Implementation of the lesson not at all reflective of best practice in mathematics/science education; 5 = Implementation of the lesson extremely reflective of best practice in mathematics/science education)

C. Supporting Evidence for Synthesis Rating
Provide a brief description of the nature and quality of this component of the lesson, the rationale for your synthesis rating, and the evidence to support that rating. (If available, be sure to include examples/quotes to illustrate ratings of teacher questioning (A6).)



III. Mathematics/Science Content

A. Ratings of Key Indicators
(Rating scale: 1 = Not at all, 5 = To a great extent; 6 = Don't know; 7 = N/A)

→ 1. The mathematics/science content was significant and worthwhile. 1 2 3 4 5 6* 7*

→ 2. The mathematics/science content was appropriate for the developmental levels of the students in this class. 1 2 3 4 5 6* 7*

→ 3. Teacher-provided content information was accurate. 1 2 3 4 5 6 7

→ 4. Students were intellectually engaged with important ideas relevant to the focus of the lesson. 1 2 3 4 5 6* 7*

5. The teacher displayed an understanding of mathematics/science concepts (e.g., in his/her dialogue with students). 1 2 3 4 5 6 7

6. Mathematics/science was portrayed as a dynamic body of knowledge continually enriched by conjecture, investigation, analysis, and/or proof/justification. 1 2 3 4 5 6 7

7. Elements of mathematical/science abstraction (e.g., symbolic representations, theory building) were included when it was important to do so. 1 2 3 4 5 6 7

8. Appropriate connections were made to other areas of mathematics/science, to other disciplines, and/or to real-world contexts. 1 2 3 4 5 6 7

→ 9. The degree of "sense-making" of mathematics/science content within this lesson was appropriate for the developmental levels/needs of the students and the purposes of the lesson. 1 2 3 4 5 6* 7*

10. _______________________________________________ 1 2 3 4 5

* We anticipate that these indicators should be rated 1-5 for nearly all lessons. If you rated any of these indicators 6 or 7, please provide an explanation in your supporting evidence below.

B. Synthesis Rating

1 2 3 4 5
(1 = Mathematics/science content of lesson not at all reflective of current standards for mathematics/science education; 5 = Mathematics/science content of lesson extremely reflective of current standards for mathematics/science education)

C. Supporting Evidence for Synthesis Rating
Provide a brief description of the nature and quality of this component of the lesson, the rationale for your synthesis rating, and the evidence to support that rating. (If available, be sure to include examples/quotes to illustrate ratings of quality of content (A1, A2, A3), intellectual engagement (A4), and nature of "sense-making" (A9).)



IV. Classroom Culture

A. Ratings of Key Indicators
(Rating scale: 1 = Not at all, 5 = To a great extent; 6 = Don't know; 7 = N/A)

→ 1. Active participation of all was encouraged and valued. 1 2 3 4 5 6* 7*

→ 2. There was a climate of respect for students' ideas, questions, and contributions. 1 2 3 4 5 6* 7*

3. Interactions reflected collegial working relationships among students (e.g., students worked together, talked with each other about the lesson). 1 2 3 4 5 6 7

4. Interactions reflected collaborative working relationships between teacher and students. 1 2 3 4 5 6* 7*

5. The climate of the lesson encouraged students to generate ideas, questions, conjectures, and/or propositions. 1 2 3 4 5 6 7

→ 6. Intellectual rigor, constructive criticism, and the challenging of ideas were evident. 1 2 3 4 5 6* 7*

7. _______________________________________________ 1 2 3 4 5

* We anticipate that these indicators should be rated 1-5 for nearly all lessons. If you rated any of these indicators 6 or 7, please provide an explanation in your supporting evidence below.

B. Synthesis Rating

1 2 3 4 5
(1 = Classroom culture interfered with student learning; 5 = Classroom culture facilitated the learning of all students)

C. Supporting Evidence for Synthesis Rating
Provide a brief description of the nature and quality of this component of the lesson, the rationale for your synthesis rating, and the evidence to support that rating. (If available, be sure to include examples/quotes to illustrate ratings of active participation (A1), climate of respect (A2), and intellectual rigor (A6). While direct evidence that reflects particular sensitivity or insensitivity toward student diversity is not often observed, we would like you to document any examples you do see.)



Section D. Lesson Arrangements and Activities

In question 1 of this section, please divide the total duration of the lesson into instructional and non-instructional time. In question 2, make your estimates based only on the instructional time of the lesson.

1. Approximately how many minutes during the lesson were spent:

a. On instructional activities? ________ minutes

b. On housekeeping unrelated to the lesson/interruptions/other non-instructional activities? ________ minutes

Describe:

c. Check here if the lesson included a major interruption (e.g., fire drill, assembly, shortened class period): ☐

2. Considering only the instructional time of the lesson (listed in 1a above), approximately what percent of this time was spent in each of the following arrangements?

a. Whole class _______ %

b. Pairs/small groups _______ %

c. Individuals _______ %

100 %
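As a bookkeeping aid, the sketch below (Python; a hypothetical helper, not part of the protocol) checks the Section D rule that the three arrangement percentages are taken over instructional time only and must total 100.

def check_arrangements(whole_class: float, pairs_small_groups: float,
                       individuals: float) -> float:
    """Validate Section D, question 2: percentages of instructional time
    spent in each arrangement must total 100."""
    total = whole_class + pairs_small_groups + individuals
    if abs(total - 100.0) > 1e-6:
        raise ValueError(f"arrangement percentages must total 100, got {total}")
    return total

# Example: check_arrangements(40, 45, 15) passes; check_arrangements(50, 45, 15) raises an error.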


Section E. Overall Ratings of the Lesson

1. Likely Impact of Instruction on Students’ Understanding of Mathematics/Science

While the impact of a single lesson may well be limited in scope, it is important to judge whether the lesson is likely to help move students in the desired direction. For this series of ratings, consider all available information (i.e., your previous ratings of design, implementation, content, and classroom culture, and the interview with the teacher) as you assess the likely impact of this lesson. Elaborate on ratings with comments in the space provided.

Select the response that best describes your overall assessment of the likely effect of this lesson in each of the following areas.
(Scale: Negative effect – Mixed or neutral effect – Positive effect; Don't know; N/A)

a. Students’ understanding of mathematics/science as a dynamicbody of knowledge generated and enriched by investigation. ¡ ¡ ¡ ¡ ¡ ¡ ¡

b. Students’ understanding of important mathematics/scienceconcepts. ¡ ¡ ¡ ¡ ¡ ¡ ¡

c. Students’ capacity to carry out their own inquiries. ¡ ¡ ¡ ¡ ¡ ¡ ¡

d. Students’ ability to apply or generalize skills and concepts toother areas of mathematics/science, other disciplines, and/orreal-life situations. ¡ ¡ ¡ ¡ ¡ ¡ ¡

e. Students’ self-confidence in doing mathematics/science. ¡ ¡ ¡ ¡ ¡ ¡ ¡

f. Students’ interest in and/or appreciation for the discipline. ¡ ¡ ¡ ¡ ¡ ¡ ¡

Comments:



2. Capsule Rating of the Quality of the Lesson

In this final rating of the lesson, consider all available information about the lesson, its context and the teacher's purpose, and your own judgment of the relative importance of the ratings you have made. Select the capsule description that best characterizes the lesson you observed. Keep in mind that this rating is not intended to be an average of all the previous ratings, but should encapsulate your overall assessment of the quality and likely impact of the lesson.

○ Level 1: Ineffective Instruction
There is little or no evidence of student thinking or engagement with important ideas of mathematics/science. Instruction is highly unlikely to enhance students' understanding of the discipline or to develop their capacity to successfully "do" mathematics/science. Lesson was characterized by either (select one below):

    ○ Passive "Learning"
    Instruction is pedantic and uninspiring. Students are passive recipients of information from the teacher or textbook; material is presented in a way that is inaccessible to many of the students.

    ○ Activity for Activity's Sake
    Students are involved in hands-on activities or other individual or group work, but it appears to be activity for activity's sake. Lesson lacks a clear sense of purpose and/or a clear link to conceptual development.

○ Level 2: Elements of Effective Instruction
Instruction contains some elements of effective practice, but there are serious problems in the design, implementation, content, and/or appropriateness for many students in the class. For example, the content may lack importance and/or appropriateness; instruction may not successfully address the difficulties that many students are experiencing, etc. Overall, the lesson is very limited in its likelihood to enhance students' understanding of the discipline or to develop their capacity to successfully "do" mathematics/science.

○ Level 3: Beginning Stages of Effective Instruction (Select one below.)

    ○ Low 3    ○ Solid 3    ○ High 3

Instruction is purposeful and characterized by quite a few elements of effective practice. Students are, at times, engaged in meaningful work, but there are weaknesses, ranging from substantial to fairly minor, in the design, implementation, or content of instruction. For example, the teacher may short-circuit a planned exploration by telling students what they "should have found"; instruction may not adequately address the needs of a number of students; or the classroom culture may limit the accessibility or effectiveness of the lesson. Overall, the lesson is somewhat limited in its likelihood to enhance students' understanding of the discipline or to develop their capacity to successfully "do" mathematics/science.

○ Level 4: Accomplished, Effective Instruction
Instruction is purposeful and engaging for most students. Students actively participate in meaningful work (e.g., investigations, teacher presentations, discussions with each other or the teacher, reading). The lesson is well-designed and the teacher implements it well, but adaptation of content or pedagogy in response to student needs and interests is limited. Instruction is quite likely to enhance most students' understanding of the discipline and to develop their capacity to successfully "do" mathematics/science.

○ Level 5: Exemplary Instruction
Instruction is purposeful and all students are highly engaged most or all of the time in meaningful work (e.g., investigation, teacher presentations, discussions with each other or the teacher, reading). The lesson is well-designed and artfully implemented, with flexibility and responsiveness to students' needs and interests. Instruction is highly likely to enhance most students' understanding of the discipline and to develop their capacity to successfully "do" mathematics/science.
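For analysts recording capsule ratings electronically, one possible encoding (a hypothetical sketch in Python; the protocol itself prescribes only the paper form above) is a simple level lookup with the Level 3 low/solid/high qualifier:

from typing import Optional

# Hypothetical encoding of the capsule scale described above.
CAPSULE_LEVELS = {
    1: "Ineffective Instruction",
    2: "Elements of Effective Instruction",
    3: "Beginning Stages of Effective Instruction",  # qualified Low/Solid/High
    4: "Accomplished, Effective Instruction",
    5: "Exemplary Instruction",
}

def record_capsule(level: int, qualifier: Optional[str] = None) -> str:
    """Format a capsule rating for a data file."""
    name = CAPSULE_LEVELS[level]
    return f"{qualifier} {level}: {name}" if qualifier else f"Level {level}: {name}"

# Example: record_capsule(3, "Solid") -> "Solid 3: Beginning Stages of Effective Instruction"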


Section F. Descriptive Rationale

1. Narrative

In 1–2 pages, describe what happened in this lesson, including enough rich detail that readers have a sense of having been there. Include:

• Where this lesson fit in with the overall unit;
• The focus of this lesson (e.g., the extent to which it was review/practice versus addressing new material; the extent to which it addressed algorithms/vocabulary versus mathematics/science concepts);
• Instructional materials used, if any;
• A synopsis of the structure/flow of the lesson;
• Nature and quality of lesson activities, including lecture, class discussion, problem-solving/investigation, seatwork;
• Roles of the teacher and students in the intellectual work of the lesson (e.g., providing problems or questions, proposing conjectures or hypotheses; developing/applying strategies or procedures; and drawing, challenging, or verifying conclusions);
• Roles of any other adults in the classroom, e.g., teacher's aide; and
• The reasoning behind your capsule rating, highlighting the likely impact on students' understanding of science/mathematics.

This description should stand on its own. Do not be concerned if you repeat information you have already provided elsewhere, e.g., in your supporting evidence for your synthesis ratings (e.g., implementation).


2. Lesson Features

Indicate which of the following features were included in this lesson, however briefly. Then, if NOT already described in the descriptive rationale, provide a brief description of the applicable features in this lesson. (Check all that apply.)

☐ a. High quality "traditional" instruction, e.g., lecture
☐ b. High quality "reform" instruction, e.g., investigation
☐ c. Teacher/students using manipulatives
☐ d. Teacher/students using calculators/computers
☐ e. Teacher/students using other scientific equipment
☐ f. Teacher/students using other audio-visual resources
☐ g. Students playing a game
☐ h. Students completing lab notes/journals/worksheets or answering textbook questions/exercises
☐ i. Review/practice to prepare students for an externally mandated test
☐ j. More than incidental reference/connection to other disciplines


PART TWO: INFLUENCES ON THE SELECTION OF TOPICS/INSTRUCTIONAL MATERIALS/PEDAGOGY USED IN PLANNING THIS LESSON

Section A. Areas of Influence
Lessons are designed and selected for a variety of reasons, some of which are under the control of the teacher and some of which are not. In Part Two of the protocol, researchers should draw upon the teacher interview in considering how each of a number of factors influenced the selection of topics/instructional materials/pedagogy in planning for this lesson.

1. Policy and Support Infrastructure

a. Curriculum and Assessment Policies

i. When talking about why s/he chose the mathematics/science topics/concepts/skills included in this lesson, the teacher spontaneously mentioned (Check all that apply):

☐ They are included in the curriculum/textbook/test; s/he is expected/required to teach them

☐ They have always been taught in this grade/course

☐ They are important for kids to learn

☐ The students need knowledge of/exposure to these topics/concepts/skills for future units in this class/course

☐ The students need knowledge of/exposure to these topics/concepts/skills for future classes/courses

In the interview, the teacher was explicitly asked about state and district curriculum and assessments. Please summarize the information the teacher provided about each of the following, including quotes when appropriate, being sure to note particular influences on the selection of topics, instructional materials, and/or pedagogy for this lesson. Then rate the extent of influence of each.

ii. State and district curriculum standards/frameworks
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent  ○ Not Applicable


iii. State and district science or mathematics tests/accountability systems/rewards and sanctions
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent  ○ Not Applicable

iv. Textbook/program designated for this class
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent  ○ Not Applicable

b. Support Infrastructure

In the interview, the teacher was asked about the professional development opportunities provided or encouraged by the district, as well as the influences of the principal, parents/community, school board, and other teachers in the school. Please summarize the information the teacher provided about each of the following, including quotes when appropriate, being sure to note particular influences on the selection of topics, instructional materials, and/or pedagogy for this lesson. Then rate the extent of influence of each.

i. Teacher professional development that is provided or encouraged by the district
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent  ○ Not Applicable


ii. Principal
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent

iii. Parents/community
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent

iv. School board/district administration
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent

v. Teacher collegiality (within the school/district)
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent


c. Other Elements of the Policy and Support Infrastructure

In the interview, the teacher may have mentioned other aspects of the policy environment and support infrastructure. For each of the following that were mentioned, please summarize the information the teacher provided, including quotes when appropriate, being sure to note particular influences on the selection of topics, instructional materials, and pedagogy for this lesson. Then, rate the extent of the influence of each.

i. National standards documents  ☐ Not mentioned
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent

ii. School/district tracking/course assignment policies, including multi-age grouping and/or students remaining with the same teacher for multiple years  ☐ Not mentioned
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent

iii. State and/or district tests of subjects other than the one observed  ☐ Not mentioned
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent


iv. School/district scheduling policies, including class length/block scheduling  ☐ Not mentioned
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent

v. Teacher evaluation system  ☐ Not mentioned
Describe:

Rate the extent to which this aspect influenced the selection of topics/instructional materials/pedagogy for this lesson. ○ Not at all  ○ Somewhat  ○ To a great extent


2. The Physical Environment

We are defining the physical environment as including:

• Size and "feel" of the room, including what's on the walls;
• State of repair of classroom facilities;
• Appropriateness and flexibility of furniture;
• Availability of running water, electrical outlets, storage space; and
• Availability of equipment and supplies (including calculators and computers).

a. Describe the physical environment of this classroom.

b. Did the physical environment constrain the design and/or implementation of this lesson? (Circle one.)

Yes    No    Don't know

If yes, explain:


3. Instructional Materials

a. Which best describes the source of the instructional materials upon which this lesson was based? (Check one.)
○ Materials designated for this class/course, from a commercially published textbook/program
○ Materials designated for this class/course, developed by district, school, or other non-commercial source
○ Materials selected or adapted by the teacher, from a commercially published textbook/program
○ Materials selected or adapted by the teacher, from a non-commercial source
○ Materials developed by the teacher

b. Describe the textbook/program/instructional materials, including publisher, title, date, and pages if applicable. If the teacher made modifications to the instructional materials for this lesson, describe the modifications, why the teacher made these modifications, and the impact of the modifications on the quality of the lesson design.


4. Student Characteristics

a. Number of students:

i. Total in class: ____________
ii. For whom English is not their first language: _________
iii. With learning disabilities: ___________
iv. With other special needs: __________

b. Describe the ability level of students in this class compared to the student population in the school. (Check one.)
○ Represent the lower range of ability levels
○ Represent the middle range of ability levels
○ Represent the higher range of ability levels
○ Represent a broad range of ability levels

c. Teachers may consciously or unconsciously base their decisions on their perceptions of the characteristics of a particular group of students. Describe how the characteristics of the students in this class may have influenced the selection of topics/instructional materials/pedagogy for this lesson.

In this category, we include such factors as:

• Cognitive abilities
• Learning styles
• Prior knowledge
• Prior school experience
• Fluency with English
• Student attitudes towards science and mathematics
• Perceptions of utility of content
• Goals and aspirations
• Facility with class routines
• Student absenteeism/mobility
• Influence of parents
• Influence of peer culture


5. The Teacher

a. Number of years teacher has taught prior to this school year: ___________

b. In most situations, teachers have considerable latitude in making instructional decisions, and their decisions are often influenced by such factors as the teacher's:

• Knowledge of/attitudes toward/beliefs about the subject matter;
• Knowledge of/attitudes toward/beliefs about students as learners in general;
• Knowledge of/attitudes toward/beliefs about pedagogy;
• Pedagogical content knowledge/expertise; and
• Choices about professional development, conferences, networks.

Describe how the teacher's background knowledge, skills, and attitudes may have affected the selection of topics/instructional materials/pedagogy for this lesson.

c. If you think this lesson was very different from what is typical of this teacher's instruction in the class, check here ☐ and explain the likely differences and the evidence you have for them.


Section B. Why This Lesson?
In the previous section you considered separately how each of a number of factors (curriculum and assessment policies, supportive infrastructure, physical environment, instructional materials, student characteristics, teacher) may have influenced the selection of topics/instructional materials/pedagogy for this lesson. In this section, we would like you to consider how these various influences interacted, and highlight those which were most salient in determining why this lesson was taught and how it was designed. (Do not consider how well the design actually matched the students' needs, how well it was implemented, or your own judgement of the teacher's knowledge and skills. Rather, try to put yourself in the teacher's head: what s/he was thinking when planning this lesson. It would be appropriate to say "The teacher perceived himself as highly knowledgeable about…" or "The teacher indicated that the students already understood…" even if you have reason to believe that the teacher's perceptions are inaccurate.)


PART THREE: PUTTING IT ALL TOGETHER

We plan to use the data collected in this study to illustrate the status of mathematics and science education in the United States; to talk about the factors that affect the nature, substance, and quality of teaching practice in science and mathematics; and to understand how broadly and deeply "reform" has penetrated into science and mathematics classrooms. We will use narrative accounts (stories and vignettes) as devices to illustrate the nature of, quality of, and factors affecting science and mathematics lessons.

You have now had the opportunity to observe a lesson and also to find out what the teacher was thinking when s/he designed it. In this section, we ask you to "put it all together," highlighting "the story" of this lesson and providing a tag line that together communicate to us the narrative account that you would write about this lesson. We also ask you to assess the overall quality of the lesson, provide any additional information you would like to share about this lesson, and let us know if you think this lesson would make an interesting vignette.

1. The Story of this Lesson
Summarize why this lesson was taught, why it looked the way it did, and how well it worked.

2. Tag Line
Write a phrase or brief sentence that captures the essence of the story of this lesson.

3. Overall assessment of the quality of the lesson in layperson's terms:

______ Bad    ______ Fair    ______ Good    ______ Very Good

4. Additional Information
Use this space to write anything else you would like to say about this lesson, e.g., to suggest specific issues that may or may not be central to the story of this lesson, but illustrate a dilemma or issue particularly well.

5. Recommendation
Check here if you would recommend that this lesson be considered for a vignette. ☐


Appendix C. Classroom Observation Code Book

CEEMS Classroom Observations Code Book

Trans-Disciplinary Curriculum (TDC–1.0)
Definition: Curriculum that includes information, concepts, terms, examples, etc. from more than one academic discipline.
Sub-codes: Information; Concepts; Terms; Examples; Other

Inquiry-Based Learning (IBL–2.0)
Definition: Instructional practices that encourage, support, and/or require scientific inquiry processes. Scientific inquiry processes include question development, hypothesis development, designing a study, observing and making predictions, collecting and analyzing data, and interpreting results from an inquiry study.
Sub-codes: Question Development; Hypothesis Development; Study Design; Observation/Predictions; Data Collection/Analysis; Data Interpretation

Authentic Learning (AUL–3.0)
Definition: Curriculum and/or instruction that utilizes real-world examples, connects content to students' everyday lives, and/or can be applied to novel real-world situations. Curriculum and/or instruction that utilizes real-world concepts or processes, such as project management or budgeting for materials.
Sub-codes: Real-World Examples; Everyday Connection; Novel Situations; Real-World Concepts; Real-World Processes

Engineering Design Process (EDP–4.0)
Definition: Curriculum and instruction that situates the learning of STEM concepts within the engineering design process.
Sub-codes: Study of Design; Design Implementation; Re-Design/Reflection

Project/Challenge/Problem-Based Learning (PBL–5.0)
Definition: Curriculum and instruction that situates the learning of STEM concepts within the context of a challenge (problem, project) that must be solved.
Sub-codes: Challenge Introduction; Challenge Implementation; Challenge Presentation/Summary; Challenge Reflection/Revision

Career Exploration (CAE–6.0)
Definition: Discussion, research, or consideration of STEM careers in the context of science and/or math learning. Interaction with individuals who have direct professional knowledge of STEM careers.
Sub-codes: Career Discussion/Lecture; Career Exploration (student-led); Professional Guest

Collaborative Learning Environment (CLE–7.0)
Definition: A learning environment that supports, encourages, and/or requires cooperation among students and/or between students and the teacher.
Sub-codes: Student Collaboration; Student-Teacher Collaboration

Professional Learning Community (PLC–8.0)
Definition: Cooperation among/between teachers to prepare for and/or conduct lessons.
Sub-codes: Integrated Curriculum/Instruction; Instructional Support

Active Learning (ACL–9.0)
Definition: Learning that is engaging to students and/or requires active student participation, including teacher questioning/feedback. Active Learning includes hands-on experiences.
Sub-codes: Hands-On Engagement

Assessment (incl. review)
Definition: Discussion of giving or preparing for an assessment, assessing other students, or self-assessment.

Assignment of Homework
Definition: Discussions of the assignment of homework.

Class Disruption
Definition: Discussions of interruption to normal classroom activities.

Classroom Management
Definition: Discussions of actions or inaction, on the part of the teacher, to control, redirect, address, or stop disruptive or off-task student behavior. Discussion of the orderliness and quietness, or lack thereof, of the classroom environment.

Competition
Definition: Discussions of classroom competition among students (as opposed to collaboration).

Conceptual Learning
Definition: Discussions of activities intended to foster or result in learning about concepts, vocabulary, etc.

Differentiated Instruction
Definition: Discussions of the teacher's efforts to address individual students' learning needs, questions, or concerns.

Direct Instruction
Definition: Discussions of direct instruction activities, including lecture, teacher demonstration, teacher fact-finding (i.e., questioning students in a way that requires a specific answer), specific directions given by the teacher, and teacher clarification of misconceptions through direct answers to student questions.
Sub-codes: Misconceptions; Teacher Directions; Teacher Questioning

Technology
Definition: Discussion of technology use as part of the lesson.

Seat Work
Definition: Discussion of worksheets, seat-based assignments, and individualized learning unconnected to a challenge.
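For reference, the following minimal sketch (Python; the dictionary, function, and excerpt are hypothetical illustrations, not the Discovery Center's actual analysis tooling) shows how the code/sub-code structure above could be represented when coding observation transcripts.

# Hypothetical representation of part of the code book above
# (code keys use ASCII hyphens for convenience).
CODE_BOOK = {
    "TDC-1.0": ("Trans-Disciplinary Curriculum",
                ["Information", "Concepts", "Terms", "Examples", "Other"]),
    "EDP-4.0": ("Engineering Design Process",
                ["Study of Design", "Design Implementation", "Re-Design/Reflection"]),
    "CLE-7.0": ("Collaborative Learning Environment",
                ["Student Collaboration", "Student-Teacher Collaboration"]),
}

def tag_excerpt(excerpt: str, code: str, sub_code: str) -> dict:
    """Attach a code/sub-code pair to an observation excerpt, rejecting
    sub-codes that are not defined under the given code."""
    name, sub_codes = CODE_BOOK[code]
    if sub_code not in sub_codes:
        raise ValueError(f"{sub_code!r} is not a sub-code of {name}")
    return {"excerpt": excerpt, "code": code, "sub_code": sub_code}

# Hypothetical usage:
# tag_excerpt("Students revised their prototype after testing.",
#             "EDP-4.0", "Re-Design/Reflection")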