Standards Balloting and Commenting System Tutorial
TRANSCRIPT
THIS PRESENTATION WILL BE POSTED AFTER THE WORKSHOP
Barb Nutter, NERC Manager of Standards Information
Wendy Muller, NERC Standards Development Administrator
Standards and Compliance Spring Workshop
April 2, 2014
Project 2014-01 – Standards Applicability for Dispersed Generation Resources
Tony Jankowski, Standard Drafting Team Chair
Standards and Compliance Spring Workshop
April 2, 2014
RELIABILITY | ACCOUNTABILITY
• Standard Authorization Request (SAR) posted for formal comment on November 20, 2013
• FERC approved the Bulk Electric System (BES) Phase 2 Definition on March 20, 2014
  - The order recognizes this project
• Objective of the revised SAR
  - Address all Generator Owner (GO)/Generator Operator (GOP) Standards and some other functions that rely on GO/GOP data
  - Currently enforceable and pending regulatory approval
  - Coordination among active Standard Drafting Teams (SDTs) and Committee activity
Project Overview
• Concurrent Activities and Milestones
  - Risk-Based Registration
  - Regional BES definition compliance guidance
  - BES definition: July 1, 2014 effective date with compliance by July 1, 2016
  - May NERC Board of Trustees (BOT) meeting
• SDT developing a white paper
Project Overview (Cont’d)
• BES Definition Phase 2
  - Intent is not to modify the content of requirements
  - Focus on applicability to dispersed generation resources
• Applicability Options
  - As stated in the BES Definition
  - At point of aggregation ≥ 75 MVA
  - At connection to the “Grid” (net at the point of interconnection)
• Necessary for reliability
• Technical justification for departure from the “status quo”
• Standards grouping/buckets by timeframe or functionality and priority
Concepts Applied to the Standards
• White paper to be posted on April 4, 2014
  - Direction of the SDT
  - Technical consideration of the unique characteristics of dispersed generation
  - Prioritization of Standards to address
  - Present options for potential modifications to the Standards
    o Focus on applicability
    o Develop technical analyses related to possible recommendations
  - Solicit feedback from industry
White Paper
Example of Applicability Modification (PRC-005)
• April 4, 2014: SDT White Paper posted for industry comment
• April 21, 2014: Industry Webinar
• July 2014: Initial Standards modifications posted for comment and ballot
• November/December 2014: Anticipated project completion date
• February 2015: NERC BOT action
Project Timeline
PER-005-2 – Operations Personnel Training:
What is a Systematic Approach to Training?
Patti Metro, NRECA | Chair of PER-005-2 Standard Drafting Team
Lauri Jones, PG&E | Vice Chair of PER-005-2 Standard Drafting Team
NERC Standards and Compliance Workshop
April 2, 2014
What is New or Updated in PER-005-2?
• Updated and NEW NERC Glossary Terms
  - System Operator: An individual at a Control Center of a Balancing Authority, Transmission Operator, or Reliability Coordinator, who operates or directs the operation of the Bulk Electric System in Real-time.
  - Operations Support Personnel: Individuals who perform current day or next day outage coordination or assessments, or who determine SOLs, IROLs, or operating nomograms, in direct support of Real-time operations of the Bulk Electric System.
What is New or Updated in PER-005-2?
• New Applicable Entities
  - Transmission Owner that has personnel, excluding field switching personnel, who can act independently to operate or direct the operation of the Transmission Owner’s Bulk Electric System transmission Facilities in Real-time.
  - Generator Operator that has dispatch personnel at a centrally located dispatch center who receive direction from the Generator Operator’s Reliability Coordinator, Balancing Authority, Transmission Operator, or Transmission Owner, and may develop specific dispatch instructions for plant operators under their control. These personnel do not include plant operators located at a generator plant site or personnel at a centrally located dispatch center who relay dispatch instructions without making any modifications.
What is New or Updated in PER-005-2?
• Requirement R1 – Minor modifications for clarity
• Requirement R2 – Same as R1, but applicable to Transmission Owner (TO)
• Requirement R3 – Added TO
• Requirement R4 – Added TO
  - 32 hours removed, as emergency operations training will be inherent in developing an entity's reliability-related task list associated with the applicable personnel included in R1 and R2
• Requirement R5 – New requirement for Operations Support Personnel
• Requirement R6 – New requirement for Generator Operator (GO)
Flexibility in PER-005-2
• Provides flexibility for an entity to determine its BES company-specific Real-time reliability-related tasks.
• Requires that an entity document its methodology for determining those tasks, which will place parameters around what tasks an entity includes.
• Phrase “according to its program” provides an entity the flexibility to develop and deliver training in a timely manner.
Systematic Approach to Develop and Implement Training versus Systematic Approach to Training
• Addresses industry concerns that the phrase “systematic approach to training” meant that there was a single methodology to be used.
• Re-worded to clarify that there are different types of training programs that can be used.
• Flexibility is provided to entities so they can use the type of methodology that works best for the entity.
Transmission Owner in Requirement R4
• Reflects the varying registrations and responsibilities of local control centers.
  - Agreements may require mitigation and/or response from the local control center operator; therefore, including TOs in this requirement is appropriate.
• Not applicable if TO does not have “(1) operational authority or control over Facilities with established Interconnection Reliability Operating Limits (IROLs), or (2) established protection systems or operating guides to mitigate IROL violations”.
Overview of a Systematic Approach to Training
The Systematic Approach to Training (SAT)
DEFINED as: An approach that provides a logical progression from the identification of the tasks required to perform a job to the implementation and evaluation of training.
Systematic Approach to Training (SAT)
Focuses on what a worker must perform
Effectively captures this data regardless of a worker’s duties
SAT Philosophy
Advantages of Using SAT
• Logical process for training development
• Permits effective management control
  - Performance can be measured and corrected
• Trains on skills and knowledge determined through a systematic analysis of job requirements.
• Effectiveness is monitored systematically.
• Results are used to improve program design and implementation.
• Cost effective
  - Only the elements for proper job performance are included.
SAT
• Method that provides a total approach for the establishment of performance-based training programs.
• Consists of five general phases: Analysis, Design, Development, Implementation, Evaluation
ADDIE
Resource: The ADDIE Model, Mike Molenda, Indiana University
The most important, the most neglected, the most difficult stage of SAT
• The learning objectives, which describe the desired performance after training, are derived from the analysis.
• The end result is a task list for a particular job with the associated knowledge, skills, and attitudes (KSAs) needed to perform that job competently.
ANALYSIS PHASE
Analysis determines the training requirements of the jobs to be performed:
• Determines if training is an appropriate solution
• Identifies the skills, knowledge, and attitudes to be included in training
• Three major types of analysis are used to gather information:
  - Needs analysis
    o Identifies potential or existing training needs and non-training solutions by examining gaps between performance requirements and existing or expected performance
  - Job analysis
    o Analyzes a specific job by determining the tasks required for successful job performance
  - Task analysis
    o Breaks down a task into its manageable steps to determine the knowledge and skills needed to perform the task
ANALYSIS PHASE
DIF SURVEY
Resultant list analyzed for:• Difficulty• Importance• Frequency
If the task is difficult, very important, and done infrequently, it COULD indicate training is needed
DIF SURVEY RATING FACTORS
FREQUENCY
This factor refers to how often the task is performed by the individual.
The following scale is used to rate task FREQUENCY of performance:
0 = Never
1 = Rarely (about once a year)
2 = Seldom (about three or four times a year)
3 = Occasionally (about once a month)
4 = Often (about once a week)
5 = Very often (daily)
DIF SURVEY RATING FACTORS
IMPORTANCE
The following scale is used to rate task IMPORTANCE. For each job task, complete the following sentence:
“The consequences of performing this task improperly are …”
1 = Negligible
2 = Low. May result in additional or repetitive efforts, but no equipment damage or delays should result.
3 = Average. May result in delay.
4 = High. Would result in significant equipment damage, forced shutdowns, or reduction in equipment availability.
5 = Extremely high. Would result in employee injury, equipment damage, or potential danger to public health and safety.
DIF SURVEY RATING FACTORS
DIFFICULTY
Difficulty refers to the mental and physical effort required by a worker to achieve proficiency in task performance. The following scale is used to rate task DIFFICULTY:
1 = Very easy to learn how to perform proficiently
2 = Somewhat easy to learn how to perform proficiently
3 = Moderately difficult to learn how to perform proficiently
4 = Very difficult to learn how to perform proficiently
5 = Extremely difficult to learn how to perform proficiently
The decision tree is intended to be used as a guide, not as a strict rule/procedure. One of the following training decisions may be made for each task on the task list:
• Train – provide the appropriate level of formal training
• No Train – no formal training (the task can be learned on the job and the consequences of improper job performance are minimal)
• Train/Retrain – provide a combination of initial formal training plus periodic retraining of the task
DECISION TREE
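The DIF ratings and training decisions above can be combined in a simple rule of thumb. This is an illustrative sketch, not the official decision tree; the thresholds are assumptions chosen to match the guidance that a difficult, important, infrequently performed task points toward training.

```python
def training_decision(difficulty, importance, frequency):
    """Illustrative DIF-based training decision (not the official tree).

    Uses the DIF survey scales: difficulty 1-5, importance 1-5,
    frequency 0-5. Thresholds here are assumed for illustration.
    """
    if difficulty <= 2 and importance <= 2:
        # Easy, low-consequence tasks can be learned on the job.
        return "No Train"
    if frequency <= 2:
        # Difficult or important tasks performed infrequently tend to
        # decay: initial formal training plus periodic retraining.
        return "Train/Retrain"
    # Otherwise, provide the appropriate level of formal training.
    return "Train"

# A difficult, very important, infrequently performed task:
print(training_decision(difficulty=5, importance=4, frequency=1))  # Train/Retrain
```

In practice, the trained/not-trained outcome for each task comes from walking the actual decision tree; the function above only mirrors its general shape.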
TASK ANALYSIS
Isolates the Conditions, Behaviors, and Standards of task performance:
• The Conditions under which the task will be performed
• The Behavior to be observed
• The Standard that shows how you know the worker performed the task correctly
DESIGN PHASE
• In this phase, LEARNING OBJECTIVES are defined:
  - Learning objectives are clear and concise statements of the intended learning outcomes of a training event.
    o They are specific, measurable, realistic, observable, and can be understood by everyone.
  - Identify performance assessments.
  - Identify methods to observe and measure performance of the individual.
  - Identify training settings.
DESIGN PHASE
Measurable: State, List, Define; Describe, Explain; Interpret, Predict; Contrast, Differentiate; Compose, Revise; Evaluate, Justify
Un-measurable: Believe, Know, Understand, Rationalize
DESIGN PHASE
• No minimum/maximum number of objectives for training development
• No set number of objectives per hour of class
• Based strictly on the main, measurable points of the task
• Keep thinking: What do I want to see them do?
• Sequence from simple to complex
• After writing objectives and sequencing them, it is time to write the assessment items
DEVELOPMENT PHASE
The purpose of the development phase is to produce the materials required for the implementation of the training programs.
• Identify effective training methods.
• Develop training activities.
• Be aware that learners like to “listen-do”.
• Create training materials and guides.
NOT everyone learns the same!
The most material can be covered in the least time with lecture… but also with the least retention!
PowerPoint does not correct retention issues.
Utilize discussion and interaction in order for the student to retain the content.
DEVELOPMENT PHASE
IMPLEMENTATION PHASE
The purpose of the implementation phase is to deliver the training program in an efficient and effective manner:
• Select and train instructors.
• Deliver training.
• Evaluate the learners after training is delivered.
• Collect course feedback.
• Create training records.
EVALUATION PHASE
• The purpose of the evaluation phase is to determine and document the degree to which training has achieved its stated objectives, i.e., to evaluate the adequacy, appropriateness, effectiveness, and efficiency of training.
• Levels of evaluation: reaction level, learning level, transfer evaluation, and impact evaluation
• FEEDBACK to other phases
Reaction – Feedback sheets
Learning – Training assessments, evaluation
Behavior – Post-training evaluation
Results – Program Return on Investment (ROI)
KIRKPATRICK’S MODEL
THANK YOU
Lauri Jones
Pacific Gas and Electric Company
Transmission System Operations – Training
Sr. [email protected]
Patti Metro
National Rural Electric Cooperative Association
Manager, Transmission & Reliability Standards
PH 703.907.5817
CELL [email protected]
Regulatory Process and NERC Enforcement Dates
Bill Edwards, NERC Legal Counsel
Standards and Compliance Spring Workshop
April 2, 2014
• Overview of the Federal Energy Regulatory Commission (FERC)
• FERC Approval Process for Reliability Standards
• FERC Docketing
• Regulatory Approval Date of FERC Action
• NERC Enforcement Dates
• Questions
Introduction
• Office of Electric Reliability
  - Oversees the development and review of mandatory reliability and security standards.
  - Ensures compliance with the approved mandatory standards by the users, owners, and operators of the bulk power system.
• Office of the General Counsel
  - Provides legal services to the Commission, including review and approval of NERC Reliability Standards.
FERC Overview
• Office of Energy Infrastructure Security
  - Provides leadership, expertise, and assistance to the Commission to identify, communicate, and seek comprehensive solutions to potential risks to FERC-jurisdictional facilities from cyber attacks and such physical threats as electromagnetic pulses.
• Office of Enforcement
  - Reviews NERC Notices of Penalty and is responsible for oversight of NERC’s enforcement program.
FERC Overview
• Office of Electric Market Regulation
  - Reviews and approves NERC and Regional Entity budgets
  - Responsible for reviewing and ruling on changes to the NERC Rules of Procedure
  - Delegation agreements
• Office of Policy and Innovation
  - Issues, coordinates, and develops proposed policy reforms to address emerging issues affecting wholesale and interstate energy markets, including such areas as climate change, the integration of renewable resources, and the deployment of demand response and distributed resources, smart grid, and other advanced technologies.
FERC Overview
• Five-member Commission
  - Chairman
  - Four Commissioners
• Offices of the Commission report to the Chairman
• Review and vote on matters before the Commission
• Supporting assistants engaged in all matters before the Commission
• “As a member of the Commission’s staff, the views I express are my own, and not necessarily those of the Chairman or of any individual Commissioner.”
FERC Overview
• All NERC filings and orders can be found here: http://www.nerc.com/FilingsOrders/us/Pages/NERCFilings2014.aspx
NERC Filings and Orders Page
• FERC Electric Reliability webpage http://www.ferc.gov/industries/electric/indus-act/reliability.asp
Electric Reliability Page
Regulatory Approval Date
Final Action (process flow):
• Petition submitted to FERC
• RD or RM docket number assigned (proceeding type: adjudication vs. rulemaking)
• RD path:
  - Notice of Filing issued
  - 30-day public comment
  - Intervention required
  - Delegated Letter Order issued by the Office of Electric Reliability (no adverse comments filed), or
  - Formal or Letter Order issued by the full Commission (adverse comments filed, or by its own election)
• RM path:
  - Notice of Proposed Rulemaking issued
  - Public comments requested (~60 days)
  - Final Rule issued by the full Commission
Reliability Standard Approval
• Docket types for reliability matters
  - RD – Electric Reliability Standards
  - RM – Rulemaking Proceedings
  - RR – ERO Rules and Organizational Filings
  - RC – Compliance and Enforcement of Reliability Standards
  - NP – Notice of Penalty
FERC Docketing
• The administrative process includes three main types of proceedings:
  - Rulemaking (RM)
  - Investigation
  - Adjudication (RD)

“The Commission may approve, by rule or order, a proposed reliability standard or modification to a reliability standard if it determines that the standard is just, reasonable, not unduly discriminatory or preferential, and in the public interest.” 16 U.S.C. § 824o(d)(2)
Types of Proceedings
• Public comment period set by a Notice of Filing in the Federal Register
• Must intervene to be a party to a proceeding and to appeal
• Historically used for:
  - Regional Reliability Standards
  - “Non-controversial” Reliability Standards
• Decisions issued by:
  - Order by the voting Commission if adverse comments are filed
  - Delegated Letter Order issued by the Director of the Office of Electric Reliability if adverse comments are not filed
• No NOPR or Final Rule in RD dockets
RD Dockets
• Notice and Comment Rulemaking involves three steps:
  - Notice of Proposed Rulemaking (NOPR)
  - Comment period
    o Typically 60 days from publication of the NOPR in the Federal Register
  - Final Rule
    o Rehearing or Clarification
    o Order on Rehearing or Clarification
RM Dockets
• How will I know which docket number is assigned and when comment periods occur?
  - Go to the Federal Register (www.federalregister.gov) and set up a “My FR” account
  - Customize a search alert for “North American Electric Reliability Corporation”
Federal Register Notification
FERC Action
• FERC may approve or remand a Reliability Standard
  - “The Commission shall remand to the Electric Reliability Organization … a proposed reliability standard … that the Commission disapproves in whole or in part.” 16 U.S.C. § 824o(d)(4)
• FERC may also direct modifications to a Reliability Standard
  - “The Commission … may order the Electric Reliability Organization to submit to the Commission a proposed reliability standard or a modification to a reliability standard that addresses a specific matter…” 16 U.S.C. § 824o(d)(5)
• Calculating the enforcement date for a Reliability Standard:
  1. Determine the effective date of the FERC action in the RM or RD docket, also referred to in NERC documents as the date of “applicable regulatory approval.”
  2. Find the “Effective Date” section of the Reliability Standard and compute the future enforcement date using the date of “applicable regulatory approval” as the starting point.
  3. Cross-reference the implementation plan to determine whether specific Requirements follow a different timeline for demonstrating compliance.
NERC Enforcement Dates
• RM Dockets
  - FERC Final Rules do not become effective immediately upon issuance
  - Small Business Regulatory Enforcement Fairness Act
    o Prior to taking effect, Final Rules must be sent for review to:
      – Congress
      – Government Accountability Office
    o “Major rules” must allow at least 60 days after the date of publication in the Federal Register for Congressional review.
    o The effective date will be listed in the Federal Register version of the Final Rule
      – In computing these dates, the day after publication is counted as the first day
      – When a date falls on a weekend or holiday, the next Federal business day is used
• RD Dockets
  - Effective on issuance unless otherwise stated in the order
FERC Order Effective Dates
• Once the effective date of the FERC action has been determined…
• Locate the “Effective Date” section of the Reliability Standard and calculate the enforcement date using its time references
  - Using the example below, assume the effective date of FERC action is February 12, 2014
  - Count 12 months forward to February 12, 2015
  - The first day of the first calendar quarter after February 12, 2015 is April 1, 2015
NERC Enforcement Dates
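The two date steps in the example above can be sketched in code. This is an illustration of the common "count forward N months, then take the first day of the first calendar quarter" pattern; the actual offset always comes from the standard's own Effective Date section and implementation plan.

```python
from datetime import date

def add_months(d, months):
    # Calendar-month arithmetic; the dates used here fall on day <= 28,
    # so month-end overflow is not handled in this sketch.
    years, month0 = divmod(d.month - 1 + months, 12)
    return date(d.year + years, month0 + 1, d.day)

def first_day_of_next_quarter(d):
    # Calendar quarters begin Jan 1, Apr 1, Jul 1, and Oct 1.
    q_month = ((d.month - 1) // 3 + 1) * 3 + 1
    return date(d.year + 1, 1, 1) if q_month > 12 else date(d.year, q_month, 1)

def enforcement_date(regulatory_approval, months=12):
    # Count forward the months stated in the standard's "Effective Date"
    # section, then take the first day of the first calendar quarter
    # after that date.
    return first_day_of_next_quarter(add_months(regulatory_approval, months))

# Slide example: FERC action effective Feb. 12, 2014
print(enforcement_date(date(2014, 2, 12)))  # 2015-04-01
```

The same two helpers reproduce the PRC-005-2 result later in this transcript: a February 24, 2014 regulatory effective date also yields April 1, 2015.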
• Select US Enforcements Dates from the left navigation
  - Note: This page defaults to the Mandatory Standards Subject to Enforcement status
US Enforcements Dates Page
US Enforcements Dates Page
• Select the status to see the dates pertinent to the standard
  - Note: Clicking the enforcement date will enable viewing of the FERC Order for that standard
• The Detail link shows the different enforcement/inactive dates by requirement when applicable in a standard
US Enforcements Dates Page
• The Notes column provides detail pertinent to the standard, e.g., PRC-005-2 (the notes for this standard denote the need to refer to the implementation plan for specific compliance dates)
US Enforcements Dates Page
PRC-005-2 Implementation
Bill Edwards, NERC Legal Counsel
Standards and Compliance Spring Workshop
April 2, 2014
• Introduction
• A brief history of PRC-005
• PRC-005-1b vs. PRC-005-2
• Transition from PRC-005-1b to PRC-005-2
• PRC-005-2
  - Discuss applicability of PRC-005-2
  - Evidence and retention periods
  - Summary of requirements
• Future standards development
PRC-005
• PRC-005-2 – Protection System Maintenance
  - Replaces four legacy standards:
    o PRC-005-1b – Transmission and Generation Protection System Maintenance and Testing
    o PRC-008-0 – Implementation and Documentation of Underfrequency Load Shedding Equipment Maintenance Program
    o PRC-011-0 – Undervoltage Load Shedding System Maintenance and Testing
    o PRC-017-0 – Special Protection System Maintenance and Testing
• Standard Drafting Team (SDT) had wide representation
  - All eight regions
  - Transmission, generation, and distribution expertise, and a wide variety of entity sizes
  - Subject matter expert consultant organizations
  - Testing companies
  - Much observer participation
  - Regular FERC participation
PRC-005-2 – Introduction
• PRC-005-1: Approved by FERC in Order No. 693
  - Directed changes to establish maximum allowable intervals
  - Suggested merging PRC-005, PRC-008, PRC-011, and PRC-017
• Two interpretations approved since Order No. 693
• Project 2007-17 (PRC-005-2) timeline
• SDT included a comprehensive Supplementary Reference and FAQ document
PRC-005 – A Brief History
• PRC-005-3
  - Adds Automatic Reclosing
    o Applicability limited to Automatic Reclosing that could directly affect BES stability
    o Filed with FERC on February 14, 2014
• PRC-005-4
  - Adds sudden pressure relays
  - Also addresses identified issues from PRC-005-3 development
  - First meeting of the SDT on March 31 – April 4, 2014
PRC-005 Developments
PRC-005-2 approved by FERC in Order No. 793
Background
Action | Date
Order No. 793 issued by FERC | Dec. 19, 2013
Published in the Federal Register (FR) | Dec. 24, 2013
Effective date of Order No. 793 | Feb. 24, 2014 (60 days from publication in the FR)
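The February 24, 2014 effective date can be reproduced from the Federal Register counting rules described earlier: the day after publication counts as day 1, and a date landing on a weekend rolls to the next Federal business day. A minimal sketch (Federal holidays omitted):

```python
from datetime import date, timedelta

def major_rule_effective_date(publication, review_days=60):
    # The day after publication in the Federal Register counts as day 1,
    # so day N is simply publication + N days.
    d = publication + timedelta(days=review_days)
    # Roll weekend dates to the next Federal business day
    # (Federal holidays are omitted in this simplified sketch).
    while d.weekday() >= 5:  # 5 = Saturday, 6 = Sunday
        d += timedelta(days=1)
    return d

# Order No. 793: published Dec. 24, 2013; day 60 is Sat. Feb. 22, 2014,
# which rolls to Mon. Feb. 24, 2014.
print(major_rule_effective_date(date(2013, 12, 24)))  # 2014-02-24
```

The authoritative effective date is always the one listed in the Federal Register version of the Final Rule; this calculation only shows where the number comes from.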
• PRC-005-2 Implementation Plan timeframes key off of “applicable regulatory approval”
• As a result, the enforcement date was initially set to February 24, 2014
• The enforcement date has since been corrected to reflect the date entities must demonstrate compliance with a requirement
• Enforcement date in the U.S.: April 1, 2015
Enforcement Date
• While entities are transitioning to the requirements of PRC-005-2, each entity must be prepared to identify:
  - All of its applicable Protection System components
  - Whether each component was last maintained according to PRC-005-2 or under PRC-005-1b, PRC-008-0, PRC-011-0, or PRC-017-0
• The implementation plan uses a phased approach to accommodate spreading the maintenance and testing out over a reasonable period of time
• Legacy standards will be retired following full implementation of PRC-005-2
Implementation Plan Overview
• Entities must maintain documentation to demonstrate compliance with PRC-005-1b, PRC-008-0, PRC-011-0, and PRC-017-0 until they meet the requirements of PRC-005-2
• Each entity will maintain each of its Protection System components according to the maintenance program already in place for the legacy standards or according to the program for PRC-005-2, but not both
• Once an entity has designated PRC-005-2 as its maintenance program for specific Protection System components, it cannot revert to the original program for those components
Implementation Plan Overview (Cont’d)
• R1: Establish a Protection System Maintenance Program (PSMP)
• R2: Performance-based maintenance intervals follow the procedure established in the PRC-005 Attachment
• R5: Demonstrate efforts to correct identified Unresolved Maintenance Issues
• Date calculation:
  - Regulatory effective date: February 24, 2014
  - First quarter following regulatory approval: April 1, 2014
  - Add 12 months: April 1, 2015
Compliance Dates: R1, R2, R5
PRC-005-2 R3 and R4 Implementation Timelines
Max. Maintenance Interval | % Compliant | By
Less than 1 year | 100% | Oct. 1, 2015 (1D/1Q 18 mo. following regulatory approval)
1–2 calendar years | 100% | Apr. 1, 2017 (1D/1Q 36 mo. following regulatory approval)
3 calendar years | 30% | Apr. 1, 2016 (1D/1Q 24 mo. following regulatory approval)¹
3 calendar years | 60% | Apr. 1, 2017 (1D/1Q 36 mo. following regulatory approval)
3 calendar years | 100% | Apr. 1, 2018 (1D/1Q 48 mo. following regulatory approval)
6 calendar years | 30% | Apr. 1, 2017 (1D/1Q 36 mo. following regulatory approval)²
6 calendar years | 60% | Apr. 1, 2019 (1D/1Q 60 mo. following regulatory approval)
6 calendar years | 100% | Apr. 1, 2021 (1D/1Q 84 mo. following regulatory approval)
12 calendar years | 30% | Apr. 1, 2019 (1D/1Q 60 mo. following regulatory approval)
12 calendar years | 60% | Apr. 1, 2023 (1D/1Q 108 mo. following regulatory approval)
12 calendar years | 100% | Apr. 1, 2027 (1D/1Q 156 mo. following regulatory approval)

¹ Or, for generating plants with scheduled outage intervals exceeding two years, at the conclusion of the first succeeding maintenance outage.
² Or, for generating plants with scheduled outage intervals exceeding three years, at the conclusion of the first succeeding maintenance outage.
• Implementation plan for PRC-005-3 builds on the implementation plan of PRC-005-2
• Implementation timing for Protection System Components will always key off the dates in this presentation
• The language to facilitate this is included in the PRC-005-3 implementation plan
• The date to key on for the additional Automatic Reclosing relays will be the regulatory approval date of PRC-005-3
• The implementation plan for PRC-005-4 will follow suit
• NERC will conduct additional outreach at the time of regulatory approval
Future Implementation Plans
Standards Development Coordination
Erika Chanzes, NERC Standards Developer
Standards and Compliance Spring Workshop
April 2, 2014
• Standard development inputs
• Reasons to coordinate
• Example: Project 2008-02 Undervoltage Load Shedding (UVLS)
• Coordination moving forward
Standards Development Coordination
• Currently, many standards are open for revision or new standards are being developed to address:
  - Outstanding FERC directives
  - Five-year reviews
  - New and emerging issues or critical FERC directives
• Active standards projects also need to consider:
  - Outstanding FERC directives
  - Results-based format
  - Paragraph 81 criteria
  - Independent Expert Review Panel (IERP) recommendations
  - Input from technical committees and reports
Standard Development Inputs
• During development, drafting teams may identify issues that result in recommendations to address other standards. These can include:
  - Impacts of newly defined or revised NERC Glossary terms
  - Elements that are more appropriately addressed in other standards
  - Redundancy with requirements in other standards
• If an affected standard is also currently under revision, the projects must coordinate to try to avoid:
  - Two projects working on the same standard at the same time
  - Complicated implementation plans seeking to address timing alignments of retirements and effective dates
Reasons to Coordinate
Example: Project 2008-02 UVLS
Considerations | Planned Coordination
Coordination Moving Forward
• How projects coordinate is determined on a case-by-case basis, and the approach depends on weighing the risks and benefits
• Coordination plans that lock projects together from a timing standpoint will continue to evolve as the development and ballot results unfold
• Standard Developers and drafting teams will include coordination plans with other projects in their communications and outreach with industry to ensure transparency and input from stakeholders
Project 2013-03 – Geomagnetic Disturbance Mitigation
Frank Koza, PJM Interconnection
Standards and Compliance Spring Workshop
April 2, 2014
Geomagnetically Induced Currents (GICs) can cause: increased reactive power consumption, transformer heating, and protection and control misoperation.
GMD Concern for the Power System
Standard Drafting Team
Name Registered Entity
Frank Koza (Chair) PJM Interconnection
Dr. Randy Horton (Vice Chair) Southern Company
Donald Atkinson Georgia Transmission Corporation
Dr. Emanuel Bernabeu Dominion Resource Services, Inc
Kenneth Fleischer NextEra Energy
Dr. Luis Marti Hydro One Networks
Dr. Antti Pulkkinen NASA Goddard Space Flight Center
Dr. Qun Qiu American Electric Power
EOP-010-1 Requirements
• R1 – Each Reliability Coordinator (RC) required to develop, coordinate, maintain, and implement, as necessary, a GMD Operating Plan
• R2 – Each RC is responsible for disseminating forecast and current space weather information
• R3 – Each Transmission Operator (TOP) required to develop, maintain, and implement an Operating Procedure or Operating Process to mitigate the effects of GMD events
Requirements are not prescriptive to allow the entity to tailor Operating Procedures based on entity-specific factors like geography, geology, and system topology
EOP-010-1 Applicability
• RCs
• TOPs with a Transmission Operator Area that includes a power transformer with a high-side wye-grounded winding with terminal voltage greater than 200 kV
Does not apply to:• Balancing Authorities (BAs)• Generator Operators (GOPs)
Actions of BAs and GOPs are either covered under other requirements or would require detailed studies as described in the White Paper Supporting Functional Entity Applicability.
200 kV Threshold Rationale

• For lines less than 200 kV, impedance is higher, lines are generally shorter, and lower-voltage lines provide minimal contribution to GIC; hence, such lines are ignored in analysis. (Example calculation included in the white paper)
• If 230 kV lines were ignored, significant GIC would be mistakenly excluded from the analysis, which could result in inaccurate var consumption calculations. (Example calculation included in the white paper)

The White Paper Supporting Network Applicability includes the rationale, example calculations, and a reference list.
Resources for Operating Procedure Preparation
• Operating Procedure Template – TOP• Operating Procedure Template – GOP• GIC Application Guide
TPL-007-1
• Requires a planning assessment of the system for its ability to withstand a Benchmark GMD Event without causing a wide-area blackout, voltage collapse, or damage to transformers
• Applicability: Planning Coordinators (PCs), Transmission Planners (TPs), Transmission Owners (TOs), Generator Owners (GOs)
 Need system models: DC (GIC calculation) and AC (power flow)
 Transformer information: internal winding resistance
 Substation grounding information
• Studies that may be necessary to perform a GMD assessment:
 Transformer GIC impact (reactive power and thermal)
 Power flow system studies
 Impact of harmonics on reactive power compensation devices
GMD Benchmark Event: Electric Field
Epeak = Ebenchmark × α × β (in V/km)

where:
Epeak = peak geoelectric field magnitude at the system location
Ebenchmark = benchmark geoelectric field magnitude at the reference location (60° N geomagnetic latitude, resistive ground model)
α = scaling factor for geomagnetic latitude
β = scaling factor for the regional earth conductivity model
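As a quick sanity check, the scaling relation above can be sketched in code. This is only a minimal illustration; the function name and default value are mine, not from the standard:

```python
# Sketch of the benchmark geoelectric field scaling: Epeak = Ebenchmark * alpha * beta.
# E_BENCHMARK = 8 V/km is the conservative reference value at 60 deg N geomagnetic
# latitude cited in this presentation; alpha and beta come from the latitude and
# earth-conductivity scaling tables.

E_BENCHMARK = 8.0  # V/km, reference geoelectric field at 60 deg N

def peak_geoelectric_field(alpha: float, beta: float,
                           e_benchmark: float = E_BENCHMARK) -> float:
    """Return the site-specific peak geoelectric field in V/km."""
    return e_benchmark * alpha * beta

# Fredericksburg-vicinity example from a later slide:
# alpha = 0.18 (~45 deg N geomagnetic), beta = 0.81 (earth model CP-1)
print(round(peak_geoelectric_field(0.18, 0.81), 2))  # 1.17
```

The result matches the 1.17 V/km worked example given later for the Fredericksburg vicinity.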
Geoelectric Field Benchmark Magnitude
Geoelectric field magnitude (Ebenchmark):
Many years of magnetometer data are extrapolated to arrive at a statistical probability of a ~1-in-100-year event at the reference location.
3-8 V/km range at 60⁰ N; 8 V/km was chosen to be conservative.
Assessing Local vs. Wide Area Impacts
• Peak geoelectric fields in localized regions may be 2-3 times larger than at neighboring locations
• 8 V/km is an average geoelectric field magnitude to support a wide-area analysis
• Previous studies have focused on local impacts; hence, those studies used fields of ~20 V/km
Geomagnetic Latitude Scaling
• Sample α scaling factors for geomagnetic latitudes:
 1.0 at 60⁰ N – Juneau; Winnipeg; Churchill Falls, NL
 0.3 at 50⁰ N – New York; St. Louis; Salt Lake City
 0.1 at 40⁰ N – Jacksonville; New Orleans; Tucson
• Factors are illustrative and subject to change, but are calculated from actual magnetometer data
Geomagnetic Latitude Chart
An application for converting geographic latitude to geomagnetic latitude is available from the NOAA website
Earth Conductivity Scaling
Earth conductivity model factor (β):
 0.81 – Atlantic Coastal (CP-1) (analysis ongoing)
 0.30 – Columbia Plateau (CO-1)
Based on data from U.S. Geological Survey (USGS) and Natural Resources Canada
Historical Perspective on Benchmark
1989 Hydro-Québec Storm (estimated 1-in-100-year event)
Let’s be clear: There are significant error bars involved in estimating rare events
Example Calculation
Benchmark geoelectric field for the Fredericksburg vicinity:
Epeak = 8 × α × β
α = 0.18 @ 45⁰ N
β = 0.81 (Model CP-1)
Epeak = 1.17 V/km (estimated 1-in-75 to 1-in-100 year event)
Fredericksburg, VA Magnetometer Station readings converted to e-field
Assessment Process Overview
Assemble model and equipment data
Create DC model of the system
Calculate GICs for each transformer
Use GICs to calculate reactive losses
Run AC power flow w/ reactive losses included
Identify limit violations and system issues
Investigate mitigation options
“Standard” TPL Planning
“New” Planning Steps
 GIC calculation is now available in most power system analysis software
Conduct thermal assessment of transformers
Corrective Action Plan
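The flow above, from GIC calculation through reactive losses to the AC power flow, can be sketched roughly in code. This is only an illustrative sketch under a common simplifying assumption (a linear per-transformer loss factor k, in Mvar per amp of effective GIC); the transformer names and k values are made up for the example, and real assessments use transformer-specific models:

```python
# Rough sketch of the GMD assessment pipeline described above. After a DC model
# yields an effective GIC per transformer, the GIC-driven reactive power loss is
# often approximated as Q = k * |GIC|; the k values here are placeholders.

transformers = {
    # name: (effective GIC in amps from the DC model, assumed k in Mvar per amp)
    "T1_500kV": (45.0, 1.2),
    "T2_230kV": (12.0, 0.9),
}

def reactive_losses(units: dict) -> dict:
    """Approximate each transformer's GIC-driven var loss as Q = k * |GIC|."""
    return {name: k * abs(gic) for name, (gic, k) in units.items()}

losses = reactive_losses(transformers)
# These Mvar losses would then be injected into the AC power flow case, after
# which limit violations are identified and mitigation options investigated.
print(losses)
```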
Thermal Assessment of Transformers
• Thermal limits: IEEE C57.91 (Guide for Loading Mineral-Oil-Immersed Transformers) provides temperature limits
• Transformer manufacturer capability curves
• Thermal response simulation
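The thermal response simulation bullet can be illustrated with a toy first-order model: the hot-spot temperature rise approaches a final value exponentially, so a short GMD peak heats a transformer far less than sustained GIC. All numbers here (final rise, time constant) are illustrative placeholders, not values from IEEE C57.91 or any manufacturer capability curve:

```python
# Toy first-order thermal response of a transformer hot spot to a step of GIC
# heating: rise(t) = dT_final * (1 - exp(-t / tau)). Values are illustrative only.
import math

def hotspot_rise(t_min: float, dt_final: float = 80.0, tau_min: float = 20.0) -> float:
    """Hot-spot temperature rise (deg C) t_min minutes after heating begins."""
    return dt_final * (1.0 - math.exp(-t_min / tau_min))

# A 5-minute GIC peak produces far less heating than a sustained event, which
# is why event duration matters when comparing against thermal limits.
for t in (5, 20, 60):
    print(t, round(hotspot_rise(t), 1))
```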
Assessment Results Example
• GICs will vary based on a number of factors: geology, geography, topology, proximity to large bodies of water, etc.
Mitigation strategies:
• Operating procedures
• Blocking devices
• Selective outages
• Protection upgrades
• Equipment replacement
• New equipment specifications
Challenges
• Lack of commercially available software tools and validated models for transformers and harmonics analysis
 Transformer heating has been a “hot” topic (pun intended), and we won’t have a definitive “answer” to this issue for some time (read: years)
 We will attempt to address these issues at a high level in the standards
• Assessment criteria (how do you know if you have an issue?)
 Less stringent acceptance criteria than “standard” planning – this is about preventing a cascade and blackout
 What are the necessary criteria in analysis that will prevent a cascade and blackout? [Hint: It is not Table 1 of TPL-001-4]
Revisions to TOP and IRO Reliability Standards
Laura Hussey, NERC Director of Standards Development
Standards and Compliance Spring Workshop
April 2, 2014
Agenda
• History
• Federal Energy Regulatory Commission (FERC) Notice of Proposed Rulemaking (NOPR)
• Project 2014-03 Revisions to TOP and IRO Standards
 Technical Conferences
 Next Steps
• How Can You Help?
History: Two Big Projects
• Project 2007-03 Real-time Operations
 Goal was to revise TOP-001 through TOP-008 to streamline them, make them clearer and more results-based, and align responsibilities with the functional model
 Resulted in three TOP standards, approved by the ballot pool:
o TOP-001-2 Transmission Operations
o TOP-002-3 Operations Planning
o TOP-003-2 Operational Reliability Data
• Project 2006-06 Reliability Coordination
 Goal was to clarify and streamline requirements applicable to the Reliability Coordinator, including two COM and six IRO standards
 Resulted in two COM and four IRO standards approved by the ballot pool, plus conforming changes to PRC-001-1
History Part 2: Context
• September 2011: SW Blackout Joint NERC-FERC Investigation
 Results of investigation presented to NERC Board of Trustees (Board) in May 2012
• NERC Board approval of Project 2007-03 Real-time Operations standards, May 2012
 Board asked NERC staff to review recommendations of the SW Blackout report against the standards before filing them
 Staff reported to the Board in November 2012
• April 2013: NERC filed two petitions, one for approval of the TOP standards and one for approval of the IRO standards
History Part 3: NOPR and Response
• November 2013: FERC issues NOPR, proposing to remand TOP and IRO standards
• December 2013: NERC files motion requesting that FERC defer action to allow NERC and industry time to review issues in the NOPR and revise the standards
 Motion proposes to hold technical conferences and file completed revisions by January 31, 2015
• January 2014: FERC grants NERC’s motion
Project 2014-03
• Project 2014-03 Revisions to TOP and IRO Reliability Standards
 Standard Authorization Request (SAR) posted for comment through March 24, 2014
 Drafting Team appointed February 2014
• Two technical conferences held in early March
 Comment period ended March 24, 2014
• Initial drafting team meeting April 8-10, 2014; second meeting two weeks later
Project Schedule – Next steps
• May: Post final SAR and draft standards with supporting documentation for 45 days
• August: Second posting for 45 days
• October: Final posting and ballot
• November 2014: Board adoption
• File by January 31, 2015
SDT Roster
• Chair – Dave Souder, PJM
• Vice Chair – Andrew Pankratz, FP&L
• David Bueche, CenterPoint Energy
• Jim Case, Entergy
• Allen Klassen, Westar Energy
• Bruce Larsen, WE Energies
• Jason Marshall, ACES Power Marketing
• Bert Peters, Arizona Public Service Co.
• Robert Rhodes, SPP
• Eric Senkowicz, FRCC
• Kevin Sherd, MISO
Operating Concepts
• Decision-making Authority (paragraphs 84 & 87)
• Analysis of System Operating Limits (SOLs) (paragraphs 42 & 52)
• Mitigation Plans (paragraph 54)
• Operating to the Most Severe Single Contingency (paragraph 70)
• Unknown Operating States (paragraph 75)
Decision-Making Authority
• Submittal
 TOP-001-2, R11: Each TOP shall act or direct others to act, to mitigate both the magnitude and duration of exceeding an IROL within the IROL’s Tv, or of an SOL identified in Requirement R8.
• NOPR (paragraph 87)
 NERC’s proposal with respect to mitigating IROLs appears to give both the transmission operator and reliability coordinator authority to act. Therefore, we seek clarification and technical explanation whether the RC or the TOP has primary responsibility for IROLs.
Analysis of System Operating Limits
• Submittal
 TOP-001-2, R8: Each TOP shall inform its RC of each SOL which, while not an IROL, has been identified by the TOP as supporting reliability internal to its TOP Area based on its assessment of its Operational Planning Analysis.
• NOPR (paragraph 42)
 Without a requirement to analyze and operate within all SOLs in the proposed standards, and by limiting non-IROL SOLs to only those identified by the TOP internal to its area, system reliability is reduced and negative consequences can occur outside of the TOP’s internal area.
Mitigation Plans
• Submittal
 TOP-001-2, R7: Each TOP shall not operate outside any identified IROL for a continuous duration exceeding its associated IROL Tv.
 TOP-001-2, R9: Each TOP shall not operate outside any SOL identified in Requirement R8 for a continuous duration that would cause a violation of the Facility Rating or Stability criteria upon which it is based.
• NOPR (paragraph 54)
 The TOP should have operational or mitigation plans for all Bulk-Power System IROLs and SOLs that can be implemented within 30 minutes or less to return the system to a secure state.
Operating to the Most Severe Single Contingency
• Submittal
 Replaced by TOP-001-2, R7 & R9
 R7: Each TOP shall not operate outside any identified IROL for a continuous duration exceeding its associated IROL Tv.
 R9: Each TOP shall not operate outside any SOL identified in Requirement R8 for a continuous duration that would cause a violation of the Facility Rating or Stability criteria upon which it is based.
• NOPR (paragraph 70)
 NERC proposes to delete TOP-004-2, Requirement R2, which provides that each TOP “shall operate so that instability, uncontrolled separation, or cascading outages will not occur as a result of the most severe single contingency.”
Unknown Operating States
• Submittal
 Requirement deleted
 The SDT viewed ‘unknown operating states’ as referring to a lack of studies of all possible conditions and, in today’s environment, did not believe such a condition would exist.
• NOPR (paragraph 75)
 With regard to mitigation of unknown operating states, while NERC asserts that “unknown states” cannot exist, a transmission provider could have valid operating limits for all facilities but lack situational awareness when valid limits are exceeded. … The Commission seeks comment and technical explanation from NERC and other interested entities on the proposed retirement.
Tools and Analysis
• Time Horizons (paragraph 55)
• System Models, Monitoring, and Tools (Transmission Operator – paragraph 60) (Reliability Coordinator – paragraph 95)
• Cause of SOL Violations (paragraph 73)
• Real-time Contingency Analysis (RTCA) (paragraph 74)
• External Networks and sub-100 kV Facilities and Contingencies (paragraph 67)
Time Horizons
• Submittal
 TOP-001-2, R8: Each TOP shall inform its RC of each SOL which, while not an IROL, has been identified by the TOP as supporting reliability internal to its TOP Area based on its assessment of its Operational Planning Analysis. [Time Horizon: Operations Planning]
• NOPR (paragraph 55)
 Requirement R8 should pertain to all IROLs and all SOLs for all operating time horizons.
System Models, Monitoring, and Tools
• Submittal
 None – SDT believed certification covered this topic.
• NOPR (paragraph 60)
 Monitoring and analysis capabilities are essential in establishing and maintaining situational awareness. NERC indicates that these functions are assured through the certification process. We are not convinced that NERC’s certification process is a suitable substitute for a mandatory Reliability Standard. … Certification is a one-time process that may not adequately assure continual operational responsibility would occur.
Cause of SOL Violations
• Submittal
 Requirement deleted, as Real-time is not the time to investigate or do root-cause analysis – it is the time to ‘fix’ the problem. Causes can be determined later and off-line.
• NOPR (paragraph 73)
 Proposal deletes the requirement for determining the cause of SOL violations in all time-frames, including real-time.
Real-time Contingency Analysis
• Submittal
 None – deferred to Project 2009-02
• NOPR (paragraph 74)
 Should all TOPs be required to run a real-time contingency analysis (RTCA) frequently, since the lack of such analysis can impair situational awareness substantially?
External Networks and sub-100 kV Facilities and Contingencies
• Submittal
 TOP-002-3, R1: Each TOP shall have an Operational Planning Analysis that represents projected System conditions that will allow it to assess whether the planned operations for the next day within its TOP Area will exceed any of its Facility Ratings or Stability Limits during anticipated normal and Contingency event conditions.
• NOPR (paragraph 67)
 Does ‘projected System conditions’ include external networks or sub-100 kV facilities?
Coordination and Communication
• Reliability Directive (paragraph 64)
• Corrective Action (paragraph 78)
• Notification of Emergencies (paragraph 80)
• Outage Coordination (paragraph 89)
• Secure Network (paragraph 92)
Reliability Directive
• Submittal
 Definition: Reliability Directive – A communication initiated by an RC, TOP, or BA where action by the recipient is necessary to address an Emergency or Adverse Reliability Impacts.
• NOPR (paragraph 64)
 TOP now uses “reliability directive,” which does not appear to be limited to a specific set of circumstances. … The proposed definition of “Reliability Directive” appears to require compliance with directives only in emergencies, not normal or pre-emergency times. … We believe that directives from an RC or TOP should be mandatory at all times, and not just during emergencies (unless safety is violated, etc.).
Corrective Action
• Submittal
 Requirement deleted
 The SDT believes that the proposed TOP-001-2 covers this situation for operations and that the proposed TOP-002-3 covers it for operations planning. The proposed standards do not limit the circumstances for which corrective actions need to be taken or what situation caused the problem. When exceedances occur, the TOP must take the prescribed actions.
• NOPR (paragraph 78)
 The Commission seeks comment and technical explanation on how the current PRC-001-1 R2 requirement for corrective action (i.e., return a system to a stable state) is addressed in its proposal.
Notification of Emergencies
• Submittal
 TOP-001-2, R3: Each TOP shall inform its RC and TOPs that are known or expected to be affected by each actual and anticipated Emergency based on its assessment of its Operational Planning Analysis. [Time Horizon: Operations Planning]
 TOP-001-2, R5: Each TOP shall inform its RC and other TOPs of its operations known or expected to result in an Adverse Reliability Impact on those respective Transmission Operator Areas unless conditions do not permit such communications. [Time Horizon: Same-day Operations, Real-Time Operations]
• NOPR (paragraphs 80-82)
 We believe that, consistent with the currently-effective TOP Reliability Standards, the notification requirement of proposed TOP-001-2 should apply to all emergencies, including real-time and same-day emergencies.
Outage Coordination
• Submittal
 IRO-008-1, R3: When an RC determines that the results of an Operational Planning Analysis or Real-Time Assessment indicate the need for specific operational actions to prevent or mitigate an instance of exceeding an IROL, the RC shall share its results with those entities that are expected to take those actions.
 IRO-010-1a, R3: Each BA, GO, GOP, IA, LSE, RC, TOP, and TO shall provide data and information, as specified, to the RC(s) with which it has a reliability relationship.
• NOPR (paragraph 89)
 The Commission does not see the specified requirements as dictating outage coordination.
Secure Network
• Submittal
 Requirement deleted
• NOPR (paragraph 92)
 Is there a need for a specific requirement that the data exchange between the RC, TOP, and BA be accomplished “via a secure network”?
Critical Infrastructure Protection Standards Version 5 Revisions
THIS PRESENTATION WILL BE POSTED AFTER THE WORKSHOP
Ryan Stewart, NERC Standards Developer
Standards and Compliance Spring Workshop
April 2, 2014
Reliability Standard Audit Worksheet Development
Jerry Hedrick, NERC Director of Regional Oversight for Compliance
Valerie Agnew, NERC Director of Standards Development
Standards and Compliance Spring Workshop
April 2, 2014
• Purpose of Reliability Standard Audit Worksheets (RSAWs)
• Current RSAW development
• Future RSAW content
• Future supporting activities
Agenda
Purpose for Parallel RSAW and Standards Development
• Align Standards and Compliance/Enforcement
• Provide transparency to industry
• Standards Process Improvement Group (SPIG): Align requirements with RSAWs
 Compliance staff develop RSAWs during standards development and post RSAWs for comment
Purpose of RSAW
• A document used by compliance auditors that:
 Establishes testing methodology to guide auditor judgment
 Creates a principled approach to testing compliance with standards
 Captures and supports the work performed to establish a reasonable assurance of compliance
• A document that industry can rely on for:
 Supporting self-assessments and testing
 Transparent and predictable methodologies for compliance testing
 Facilitating dialogue related to compliance testing
• A document used by standards developers to:
 Validate that the standard’s language can establish clear testing methodologies
 Demonstrate how a standard will be tested prior to becoming enforceable
Current RSAW and Standards Timelines
45-day formal comment and ballot period
Additional comment and ballot, if necessary
Final ballot
Standards Development (4-6 months)
RSAW developed after regulatory approval
• Board approval
• Regulatory approval
• Implementation period
• Effective Date
Revised RSAW and Standards Timelines
45-day formal comment and ballot period
Additional comment and ballot, if necessary
Final ballot
• Board approval
• Regulatory approval
• Implementation period
• Effective Date
Standards Development (4-6 months)
Compliance coordinates with SDT
RSAW developed
RSAW posted 15 days after standard
RSAW revised, if necessary, based on comments
RSAW posted
Draft standard to compliance for RSAW development
Recent Standards and RSAW Postings
• New standards development projects posting draft RSAWs:
 On standards web page as supporting materials
 RSAWs are not balloted
 Provide early review of RSAWs associated with standards
2014 Enhancements
• NERC staff will coordinate input between standards drafting teams and Compliance
• Regional Entity staff involved in drafting RSAWs
• RSAWs posted during formal comment and ballot periods:
 Posted approximately 15 days from start of comment period
 RSAWs posted to the compliance web page
 Not balloted
 Stakeholders may provide comments
Benefits
• Prior to the approval of a standard:
 A collaborative approach to standards development
 Validates the ability of a standard to be tested for compliance
 Establishes compliance testing methodologies
 Develops a collaborative approach to assuring understanding
 Provides industry time to perform self-assessment in advance of a standard becoming effective
• Following a standard becoming effective:
 Facilitates dialogue between auditors and Registered Entities
 Establishes predictable and repeatable testing methodologies
 Aids in the application of auditor judgment
 Provides clarity of data and information to both Enforcement and Standards
Future RSAW Development
• Align with principles of a risk-based approach to compliance monitoring
• Improve RSAW content:
 Reflect testing procedures
 Refrain from “check the box” compliance assessment
 Drive process improvement dialogue as appropriate
• Provide auditors flexibility to:
 Use professional judgment
 Consider risk
• Provide Enforcement and Standards with:
 Consistent feedback
 Process improvement feedback
Provide Assurance to Industry
• Visibility of auditor training:
 Auditors Manual
 Tools and Processes
 RSAWs
 In-depth training for future standards
• Consistent messages from Regional Entities
• Compliance Monitoring Experiences
Auditor Training
Auditor Knowledge and Experience:
 Manual & Handbook
 RSAWs
 Training
 Tools & Processes
Standard-Specific Test Plans & Methodology
Compliance Design: 1. Templates; 2. Requirements
Reinforce: 1. Activities; 2. Methodology; 3. Approach
Compliance Activities: 1. When; 2. How; 3. What
Implementation
• For existing projects, NERC will develop parallel RSAWs:
 Enhancements for projects that begin in 2014
• Auditor training
• Standard-specific auditor training
NERC Cost Effective Analysis Process (CEAP)
Laura Hussey, NERC Director of Standards Development
Guy V. Zito, NPCC, Assist. Vice President – Standards, CEAP Team Lead
NERC 2014 Standards and Compliance Spring Workshop
April 2, 2014
• Background: FERC – “. . . reliability does not come without cost . . .”
 Concerns expressed by:
o NERC Board of Trustees
o Industry trade groups
o Entities and stakeholders
o State and provincial governmental authorities
NERC CEAP
• Background (continued):
 Developed in response to a Standards Process Input Group (SPIG) recommendation
 Developed from the Northeast Power Coordinating Council (NPCC) CEAP
 NERC Standards Committee (SC) approved October 2012
 NERC currently conducting a “pilot” on two projects using a CEAP team of SC and SC Process Subcommittee members and observers:
o Project 2010-13.2 – Phase 2 of Relay Loadability: Generation (PRC-025-1) (Cost Effectiveness Analysis [CEA] Phase 2)
o Project 2007-11 – Disturbance Monitoring (PRC-002-2) (CEA Phase 2)
• CEAP objectives:
 Introduce a cost benefit “type” analysis into the standards decision-making at or before the Standard Authorization Request (SAR) stage of a project
 Introduce a cost effectiveness analysis into the standards development process once the standard is sufficiently developed (requirement by requirement)
 Provide meaningful cost implementation information to the industry and regulators
 Afford industry the opportunity to propose alternate requirements that achieve the reliability objective of a draft standard more efficiently or at lower cost
 Identify, to the extent practical, the benefits of a standard
• Concept: CEAP has two phases (sets of questions)
o Cost Impact Analysis (CIA) – initially determines a limited type of CIA used to derive feasibility, order-of-magnitude costs, and potential reliability benefit, and to examine probabilities of occurrence
o Cost Effectiveness Analysis (CEA) – identifies costs associated with implementation, maintenance, ongoing compliance and reporting, and resources (EFTE)
• CEAP basics:
 Does not result in delays to a standard’s development
 Does not result in variances to the existing standards development process
 Provides potential for a “cost benefit gateway” to a standard’s development
 Does not reveal market-sensitive or Critical Energy Infrastructure Information (CEII)
 The SC provides oversight and owns and manages the process at this time
 The CEAP team conducts the analysis – Reliability Assessment and Performance Analysis (RAPA)
 Outreach through announcements, notifications, and education
• CEAP determinations:
 CIA – A report and recommendation from NERC staff on whether to proceed with development of a standard at or before the SAR stage (i.e., the SC authorizes the SAR or the Reliability Issues Steering Committee [RISC] initiates SC action)
 CEA – A report and recommendation from NERC staff to the SC to either continue or suggest the standard be revised by the Standard Drafting Team (SDT) to address CEA findings
• Next steps:
 Complete the current pilot of CEAP and provide the CEAP report to the Disturbance Monitoring SDT and the SC
 Report results of the pilot and lessons learned to the SC
 Prepare revisions to the CEAP process to address lessons learned during the pilot and to enhance derivation of benefits and use of metrics
 Develop criteria for choosing CEAP projects for 2014, as well as potential alternate approaches for accomplishing cost effectiveness
 Report results to the SC and seek approval to implement
 Execute the 2014 CEAP plan
 Develop strategy for 2015
• Opportunities for stakeholders:
 Participate in forums and trade groups to share information and coordinate concerns
 Respond to CEAP surveys and supply all the requested information to give the team sufficient data points
 Review the published CEAP reports as ballot positions are being developed, to provide certainty and information to justify your ballot
 Ensure the right people in your organization are receiving the CEAP announcements
 Keep a diligent watch for opportunities to comment on future generations of the process itself
“We can’t solve problems by using the same kind of thinking that created them”
- Albert Einstein, 1879–1955