
  • Community Learning Center Schools, Inc. (CLCS)

    Facilitator Effectiveness Measurement System (EMS)

    Introduction:

    Community Learning Center Schools, Inc. (CLCS) is committed to developing effective educators. CLCS also believes that all facilitators should be held accountable for the objectives and outcomes they can influence. All CLCS facilitators are expected to actively participate in a process of continuous improvement and reflection by being open to feedback and by using the resources and tools that CLCS provides. The Board subscribes to the development and implementation of a comprehensive model of evaluation for facilitators, which will ensure facilitators' professional growth and contribute to improved performance.

    Purposes of the Facilitator EMS

    The primary purpose of the Community Learning Center Schools Facilitator Effectiveness Measurement System is the improvement and maintenance of quality professional performance, as well as promotion of the CLCS, Nea, and ACLC missions and goals. Evaluation is both a means and an end. As a means, it is a process of communication, personal support, feedback, adjustment, and growth for both the individual and the organization. As an end, it represents the basis for documenting evidence for retention as well as support for improvement and promotion.

    Design of the Facilitator EMS

    The CLCS evaluation committee, in cooperation with Facilitators, shall have the responsibility for maintaining an effective and efficient evaluation system. In doing so, the Evaluation Committee emphasizes that evaluations should be conducted in a professional and cooperative manner. The CLCS Board also recognizes that periodic review and evaluation of the system will occur.

  • Facilitator EMS Implementation

    The Facilitator EMS will assess the facilitator's current level of performance in specific areas:

    1) Classroom observation and evaluation using the attached rubrics (55%), integrating the evaluation of progress made since the last review (as applicable) and re-establishing goals for subsequent evaluations;

    2) Learner achievement data (30%), including state standardized tests (when available), formative assessment data (learner achievement data on curriculum-embedded interim benchmark assessments), and evidence that the facilitator uses data to tailor instruction to meet the needs of individual learners; and

    3) Surveys administered by the Lead Facilitator, working with the Program Evaluation Committee, to parents (5%), learners (5%), and facilitator peers (5%).

    Values obtained in all of these areas will generate a Summary Score that serves as a measure of a facilitator's overall effectiveness.

    Summary Score Effectiveness Rating Chart

    1 = Ineffective (0-50%)
    2 = Inconsistently Effective (51-60%)
    3 = Generally Effective (61-70%)
    4 = Consistently Effective (71-89%)
    5 = Exceptional/Master Facilitator (91-100%)
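    Read as arithmetic, the Summary Score is a weighted combination of the component percentages (observation 55%, learner achievement 30%, and each of the three surveys 5%), which is then placed into one of the bands above. The Python sketch below illustrates that calculation under stated assumptions: the function names, the score dictionary layout, and the treatment of each component as a 0-100 percentage are illustrative choices, not part of the EMS document.

    # Illustrative sketch only. The weights come from the Facilitator EMS
    # description above; component scores are assumed to be 0-100 percentages.
    WEIGHTS = {
        "classroom_observation": 0.55,
        "learner_achievement": 0.30,
        "parent_survey": 0.05,
        "learner_survey": 0.05,
        "peer_survey": 0.05,
    }

    # Effectiveness bands from the Summary Score chart above.
    RATING_BANDS = [
        (0, 50, "1 = Ineffective"),
        (51, 60, "2 = Inconsistently Effective"),
        (61, 70, "3 = Generally Effective"),
        (71, 89, "4 = Consistently Effective"),
        (91, 100, "5 = Exceptional/Master Facilitator"),
    ]

    def summary_score(components: dict[str, float]) -> float:
        """Weighted Summary Score from component percentages (0-100)."""
        return sum(weight * components[name] for name, weight in WEIGHTS.items())

    def effectiveness_rating(score: float) -> str:
        """Look up the effectiveness band a Summary Score falls into."""
        for low, high, label in RATING_BANDS:
            if low <= score <= high:
                return label
        return "Unrated (score falls between the published bands)"

    # Example: strong observations, solid achievement data, positive surveys.
    scores = {
        "classroom_observation": 85,
        "learner_achievement": 78,
        "parent_survey": 90,
        "learner_survey": 88,
        "peer_survey": 92,
    }
    score = round(summary_score(scores), 2)
    print(score)                        # 83.65
    print(effectiveness_rating(score))  # 4 = Consistently Effective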

  • Classroom Observation and Evaluation Schedule

    Timeframe: Facilitator/Lead Facilitator activities and deliverables

    Throughout the year: Facilitator collects evidence of student learning and professional activities; Lead Facilitator conducts informal observations of professional practice.

    August: Lead Facilitator sends each Facilitator an evaluation schedule; Lead Facilitator and Facilitator review the evaluation schedule.

    September: Lead Facilitator and Facilitator review artifacts list; Facilitator conducts self-assessment and sets goals for the school year (tool below).

    October: Lead Facilitator conducts informal observation(s).

    November and December: Lead Facilitator conducts first formal observation; Lead Facilitator and Facilitator hold a post-conference for reflection about the first formal observation (tool below).

    January and February: Lead Facilitator and Facilitator review artifacts list; Facilitator conducts self-assessment.

    March, April and May: Lead Facilitator conducts second formal observation; Lead Facilitator and Facilitator hold a post-conference for reflection about the second formal observation; Lead Facilitator completes the Summative Evaluation and holds a conference with the Facilitator.

    May and June: If appropriate, formulate growth goals for the following year.

    Learner Growth Assessments Schedule

    Timeframe and deliverable:

    September: Baseline achievement levels in core content areas assessed
    December: Midway achievement levels in core content areas assessed
    April: Concluding achievement levels in core content areas assessed

    Survey Administration Schedule

    Survey and administration window:

    Learner Survey: Digital survey administered between April 1-30
    Peer Survey: Digital survey administered between April 15-30
    Family Survey: Digital survey administered between April 1-30

  • Classroom Observation and Evaluation Tools

    The tools for facilitator classroom evaluation are utilized for the entire academic year and represent a running record of all formal and informal observations. In addition, these tools include rubrics that emphasize our focus on the use of research-based instructional strategies, classroom technology integration, and building learners' background knowledge. The rubrics are organized around six domains: Planning and Preparation for Learning, Classroom Management, Delivery of Instruction, Monitoring/Assessment and Follow-up, Family and Community Outreach, and Professional Responsibilities.

    Formal Evaluations

    Planned, extended observation visits for which the lesson plan and expected learning outcomes are submitted to the observer the day before the scheduled observation.

    Informal Evaluations

    Short observation visits that do not require prior submission of lesson plans or scheduling confirmation. May be as short as 5 minutes, but length may vary.

    Classroom Observation Rating System: The rubrics use a four-level rating scale:

    4 - Highly Effective
    3 - Effective
    2 - Improvement Necessary
    1 - Does Not Meet Standards

    The Effective level describes solid, expected professional performance; facilitators should feel good about scoring at this level. The Highly Effective level is reserved for truly outstanding teaching that meets very demanding criteria; there will be relatively few ratings at this level. Improvement Necessary indicates that performance has real deficiencies; no facilitator should be content to remain at this level (although some novices might begin here). Persistent performance at the Does Not Meet Standards level is clearly unacceptable and should lead to dismissal if it is not improved immediately.

    If a Facilitator's performance average falls below Effective in any domain, the Facilitator will be placed on an Improvement Support Plan (ISP). An ISP is required if a Facilitator receives a 1 in any area of an evaluation domain, or a 1 or 2 as the average rating for a domain. Although placement on an Improvement Support Plan generally occurs at evaluation intervals, a Facilitator can be placed on an Improvement Support Plan at any point during the year or the evaluation cycle for any reasonable and just cause.
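    To make the ISP trigger concrete, the sketch below applies the two conditions stated above: a rating of 1 in any area of a domain, or a domain average of 2 or below (i.e., below Effective). The nested-dictionary layout of ratings and the function name are assumptions made for illustration, not part of the EMS document.

    # Illustrative sketch of the ISP trigger rules described above.
    # Ratings use the four-level scale (4, 3, 2, 1); the domain -> area -> rating
    # layout is an assumed representation, not a prescribed format.
    def needs_improvement_support_plan(ratings: dict[str, dict[str, int]]) -> bool:
        """Return True if either ISP trigger applies:
        - any area within any domain is rated 1, or
        - any domain's average rating is 2 or below (below Effective).
        """
        for areas in ratings.values():
            if any(rating == 1 for rating in areas.values()):
                return True
            if sum(areas.values()) / len(areas) <= 2:
                return True
        return False

    # Example: a domain average at or below 2 triggers an ISP even with no 1s.
    example = {
        "Classroom Management": {"expectations": 2, "routines": 2, "relationships": 2},
        "Delivery of Instruction": {"engagement": 3, "clarity": 4},
    }
    print(needs_improvement_support_plan(example))  # True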

    Improvement Support Plan

    If a Facilitator is to be placed on an Improvement Support Plan, the Lead Facilitator will prepare and send the Facilitator a memorandum outlining:

    1) the areas of concern that need to be addressed,

    2) any applicable instructions for the facilitator,

    3) any applicable resources that are available,

    4) an overview of timelines and target dates.

    The Lead Facilitator will set up a conference to review the Improvement Support Plan with the Facilitator. Copies of the Improvement Support Plan will be forwarded to the Executive Director.

    Learner Growth Assessment Tools:

    Standardized assessments will be used to identify markers for learner growth and to establish baseline and subsequent achievement levels. These assessments include, but are not limited to, California STAR and CST tests (if available), internal Benchmarks, Developmental Reading Assessments (DRA), and Measures of Academic Progress (MAP).

    Survey Tools

  • The Lead Facilitator will work with the Program Evaluation Committee to administer surveys to learners, the facilitator team, and families. The surveys will include the following:

    Learner Survey

    LEARNERS WILL RATE THEIR FACILITATOR IN THE FOLLOWING AREAS. MY FACILITATOR:

    GIVES CLEAR DIRECTIONS
    MAKES LESSONS INTERESTING
    HELPS LEARNERS LIKE LEARNING
    DOESN'T WASTE LEARNER TIME
    KEEPS SEMINAR WEBSITE UP TO DATE WEEKLY
    KEEPS GRADES UP TO DATE WEEKLY
    DIRECTS LEARNER BEHAVIORS IN A WAY THAT HELPS LEARNING
    TREATS LEARNERS WITH RESPECT

    Facilitator Peer Survey

    FACILITATORS WILL RATE THEIR PEERS IN THE FOLLOWING AREAS. PEERS REPORT THAT THE FACILITATOR:

    POSITIVELY CONTRIBUTES TO SCHOOL CULTURE
    POSITIVELY RESPONDS TO FEEDBACK
    LISTENS TO THE VIEWS OF OTHERS
    TREATS LEARNERS AND FAMILIES WITH RESPECT
    IS RESPONSIVE TO COMMUNICATION
    USES DATA TO INFORM INSTRUCTION
    PARTICIPATES POSITIVELY IN PROBLEM SOLVING

    Family Survey

    FAMILIES WILL RATE THE FACILITATORS IN THE FOLLOWING AREAS. FAMILIES REPORT THAT THE FACILITATOR:

    SETS HIGH ACADEMIC GOALS
    PREPARES LEARNERS FOR THE NEXT ACADEMIC LEVEL AND SUPPORTS GOAL ATTAINMENT
    CREATES ENGAGING LEARNING EXPERIENCES
    CREATES ACCESS TO RESOURCES
    UPDATES SEMINAR WEBPAGE WEEKLY
    UPDATES LEARNER GRADES WEEKLY
    TREATS LEARNER/FAMILY WITH RESPECT

  • CLASSROOM EVALUATION RATING DOCUMENTS

    FACILITATOR NAME:

    SCHOOL:

    These domains use a four-level rating scale with the following labels: