

California Department of Fish and Wildlife and California State Lands Commission

Oil Spill Prevention, Response, and Preparedness Program

Report No. 21-3600-019 December 2020

Team Members

Cheryl L. McCormick, CPA, Chief
Rebecca G. McAllister, CPA, Assistant Chief
Chikako Takagi-Galamba, CGPM, Manager
Cindie Lor, Supervisor
An Truong, Lead
Austin Lange
Mathew Rios
Toni Silva

Final reports are available on our website at www.dof.ca.gov.

You can contact our office at:

California Department of Finance Office of State Audits and Evaluations

915 L Street, 6th Floor
Sacramento, CA 95814

(916) 322-2985

December 31, 2020

Honorable Gavin Newsom, Governor

State Capitol, Suite 1173

Sacramento, CA 95814

Final Report—California Oil Spill Prevention, Response, and Preparedness Program

Performance Audit

The California Department of Finance, Office of State Audits and Evaluations, has completed its audit of the California Department of Fish and Wildlife’s (CDFW) and California State Lands Commission’s (Commission) Oil Spill Prevention, Response, and Preparedness Program.

The enclosed report is for your information and use. CDFW’s and the Commission’s response to the report findings and our evaluation of the response are incorporated into this final report. A detailed Corrective Action Plan addressing the findings and recommendations is due from CDFW and the Commission within 60 days from receipt of this letter, and every six months thereafter, until all planned actions have been implemented. This report will be placed on our website.

If you have any questions regarding this report, please contact Chikako Takagi-Galamba, Manager, at (916) 322-2985.

Sincerely,

Cheryl L. McCormick, CPA

Chief, Office of State Audits and Evaluations

cc: Keely Martin Bosler, Director, California Department of Finance

Charlton H. Bonham, Director, California Department of Fish and Wildlife

Thomas M. Cullen Jr., Administrator, Office of Spill Prevention and Response, California Department of Fish and Wildlife

Jennifer Lucchesi, Executive Officer, California State Lands Commission

Original signed by:

Transmitted via e-mail

December 31, 2020

Erika Contreras
Secretary of the Senate
State Capitol, Suite 3044
Sacramento, CA 95814

Cara Jenkins
Legislative Counsel
State Capitol, Suite 3021
Sacramento, CA 95814

Sue Parker
Chief Clerk of the Assembly
State Capitol, Suite 3196
Sacramento, CA 95814

Final Report—California Oil Spill Prevention, Response, and Preparedness Program Performance Audit

The California Department of Finance, Office of State Audits and Evaluations, has completed its audit of the California Department of Fish and Wildlife’s (CDFW) and California State Lands Commission’s (Commission) Oil Spill Prevention, Response, and Preparedness Program.

The enclosed report is for your information and use. CDFW’s and the Commission’s response to the report findings and our evaluation of the response are incorporated into this final report. This report will be placed on our website.

A detailed Corrective Action Plan (CAP) addressing the findings and recommendations is due from CDFW and the Commission within 60 days from receipt of this letter. The CAP should include milestones and target dates to correct all deficiencies. The CAP should be sent to: [email protected]. After the initial CAP is submitted, it should be updated every six months thereafter, until all planned actions have been implemented. The appropriate individual or mailbox CDFW and the Commission have designated will receive reminders when the updates are due to Finance.

If you have any questions regarding this report, please contact Chikako Takagi-Galamba, Manager, at (916) 322-2985.

Sincerely,

Cheryl L. McCormick, CPA Chief, Office of State Audits and Evaluations

cc: On the following page

Original signed by:

cc: Charlton H. Bonham, Director, California Department of Fish and Wildlife
Thomas M. Cullen Jr., Administrator, Office of Spill Prevention and Response, California Department of Fish and Wildlife
Julie Yamamoto, Assistant Deputy Administrator, Office of Spill Prevention and Response, California Department of Fish and Wildlife
Steve Hampton, Assistant Deputy Administrator, Office of Spill Prevention and Response, California Department of Fish and Wildlife
Amir Sharifi, Branch Chief, Financial and Administrative Services, Office of Spill Prevention and Response, California Department of Fish and Wildlife
Jennifer Lucchesi, Executive Officer, California State Lands Commission
Colin Connor, Assistant Executive Officer, California State Lands Commission

TABLE OF CONTENTS

EXECUTIVE SUMMARY ..................................................................................................................... 1

BACKGROUND, SCOPE, AND METHODOLOGY .......................................................................... 2

BACKGROUND ............................................................................................................................. 2

SCOPE ........................................................................................................................................... 7

METHODOLOGY .......................................................................................................................... 7

RESULTS ............................................................................................................................................. 9

CONCLUSION ............................................................................................................................... 9

PROGRAMMATIC EFFECTIVENESS ............................................................................................ 11

California Department of Fish and Wildlife Office of Spill Prevention and Response...... 11

DRILLS AND EXERCISES ........................................................................................................... 11

Finding 1: Opportunities Exist for Improvement in OSPR’s Drills and Exercises ............... 15

OIL SPILL RESPONSE ................................................................................................................ 19

Finding 2: Improve Methods to Identify Spill Causes and to Determine Spill Response Type ........................................................................................................................ 23

California State Lands Commission ....................................................................................... 26

OIL TRANSFER MONITORING .................................................................................................. 26

Finding 3: Ensure the Priority Monitoring System is Accurate and Supported to Demonstrate High Priority Transfers are Properly Monitored ............................................ 29

SAFETY AUDITS ......................................................................................................................... 32

Finding 4: Evaluate Safety Audit Process to Improve its Efficiency and Timely Meet Program Goals and Requirements ............................................................................ 34

PROGRAM FINANCIAL BASIS .................................................................................................... 40

Fund Balances ........................................................................................................................ 40

Program Fund Revenue and Expenditure Activities ......................................................... 42

Finding 5: The Commission Should Ensure Cost Allocation to Fund 0320 is Equitable and Accurate ....................................................................................................... 44

APPENDIX A: Significant Internal Control Components ........................................................... 46

APPENDIX B: Detailed Methodologies ....................................................................................... 47

APPENDIX C: List of Acronyms and Abbreviations ................................................................... 52

RESPONSE ....................................................................................................................................... 53

EVALUATION OF RESPONSE .......................................................................................................... 62


EXECUTIVE SUMMARY

In accordance with Government Code section 8670.42, the California Department of Finance, Office of State Audits and Evaluations, conducted a performance audit of the Oil Spill Prevention, Response, and Preparedness Program (Program) for the period July 1, 2016 through June 30, 2020. Our audit objective was to assess the programmatic effectiveness and financial basis of the Program and identify measures to improve Program efficiency and effectiveness.

In 1990, the California Legislature enacted the Lempert-Keene-Seastrand Oil Spill Prevention and Response Act (Act) to address all aspects of marine oil spill prevention and response in California. Acting as the Administrator, the California Department of Fish and Wildlife, Office of Spill Prevention and Response (OSPR), implements the provisions of the Act through the Program. The California State Lands Commission (Commission) has certain authority to implement Program prevention measures over marine oil terminals.

Our audit focused on OSPR’s processes and procedures for drills and exercises and oil spill response, the Commission’s processes and procedures for oil transfer monitoring and safety audits, and the financial basis of the Oil Spill Prevention and Administration Fund (Fund 0320) and the Oil Spill Trust Fund (Fund 0321). Our audit did not include a technical review of these Program activities for compliance with the Government Code, the California Code of Regulations, and other applicable regulations.

Based on the procedures performed, we determined that during 2016-17 through 2019-20, OSPR conducted 672 drills and exercises related to 1,433 approved oil spill contingency plans within the three-year regulatory requirement, and reduced the number of drills and exercises required each year by performing drills and exercises simultaneously. OSPR also responded to the 5,900 reported spill incidents as required by Program regulations. Further, the Commission demonstrated continuing coverage of oil transfer monitoring at its marine oil terminals by monitoring 34 percent of the 27,546 total oil transfers. The Commission also met its five-year safety audit cycle strategic goal for the six active oil and gas drilling and production facilities.

Opportunities to improve the efficiency and effectiveness of the Program relating to OSPR’s and the Commission’s operational practices, information systems, and workload planning were identified. For instance, OSPR’s and the Commission’s data collected in key databases are not complete or accurate and do not allow for appropriate reporting or workload planning.

Program revenues were collected and expended primarily on Program prevention and preparedness activities performed by OSPR and the Commission. Fund 0320 expenditures have exceeded revenues resulting in a declining fund balance. To assist the fund in remaining solvent, Fund 0321 will provide a $6.5 million loan to Fund 0320 in 2020-21. OSPR continuously recovered costs incurred in responding to spill incidents from responsible parties, and deposited reimbursements into Fund 0321, as required. As of June 30, 2020, Fund 0321 is sufficiently funded with an approximate $50 million fund balance.


BACKGROUND, SCOPE, AND METHODOLOGY

BACKGROUND

Oil Spill Prevention and Response Program

In 1990, the California Legislature enacted the Lempert-Keene-Seastrand Oil Spill Prevention and Response Act (Act). The Act, codified in Government Code (GC) sections 8670.1 to 8670.73, included all aspects of marine oil spill prevention and response in California and established an Administrator with broad powers to implement the Act’s provisions. The Act also assigned the California State Lands Commission (Commission) certain authority over marine oil terminals. In 1991, the Office of Spill Prevention and Response (OSPR), headed by the Governor-appointed Administrator, was created within the California Department of Fish and Wildlife (CDFW) to implement the Act through the Oil Spill Prevention, Response, and Preparedness Program (Program).1 The Program has since been expanded as follows:

• Senate Bill (SB) 861 (Chapter 35, Statutes of 2014) expanded the Program to include all state surface waters at risk of oil spills from any inland source, including pipelines, production facilities, and shipments of oil transported by railroads. SB 861 is commonly referred to as the inland facilities expansion.

• Assembly Bill (AB) 1197 (Chapter 584, Statutes of 2017) required OSPR to establish criteria for certifying a spill management team (SMT) based on the SMT’s capacity to respond to spills and manage spills effectively, review applications for SMT certification, and certify SMTs, as specified; required oil spill contingency plans to identify at least one SMT certified by OSPR; and required OSPR to adopt regulations to implement these provisions as appropriate.

• AB 936 (Chapter 770, Statutes of 2019) required OSPR to revise criteria for oil spill response organizations (OSRO) determined by OSPR to be capable of addressing non-floating oil spills; required OSPR to consider technologies for addressing non-floating oil spills; and required submittal of an amended California oil spill contingency plan addressing both marine and inland oil spills every three years.

Office of Spill Prevention and Response

OSPR has public trustee and custodial responsibilities for protecting, managing, and restoring the State’s fish, wildlife, and plants. It is one of the few state agencies in the nation that has both major pollution response authority and public trustee authority for wildlife and habitat. OSPR’s mission is to provide the best achievable protection of the State's natural resources by preventing, preparing for, and responding to spills of oil and other harmful materials, and through restoring and enhancing affected resources.2

1 Excerpts from https://wildlife.ca.gov/OSPR/About/History.
2 Excerpts from http://www.wildlife.ca.gov/OSPR/About.


OSPR’s Prevention, Preparedness, Environmental Response, and Enforcement Branches implement the Program through its prevention, preparedness, response, and restoration activities. Table 1 describes OSPR’s main Program activities.

Table 1: OSPR’s Program Key Activities

• Prevention (Monitor and Inspect): Monitors over-water oil transfers, conducts vessel risk assessments and boardings of high-risk vessels, analyzes marine vessel casualties, reviews and verifies industry oil spill contingency plans, and inspects inland facilities for spill containment measures.

• Preparedness (Drills and Exercises): Conducts drills and exercises to assess plan holder response, issues ratings to approve OSROs, and reviews the financial capability of plan holders.

• Response (Response): Responds to imminent or actual petroleum oil spill events throughout the State using field response teams consisting of Wildlife Officers, Oil Spill Prevention Specialists, and Environmental Scientists.

• Restoration (Assessments): Conducts natural resource damage assessments for oil spills affecting surface waters of the State, in order to recover compensation for those damages and losses.

OSPR’s Drills and Exercises and Response Program activities, referenced in this report, are described below:

Drills and Exercises

Preventing oil spills is the best strategy for avoiding potential damages to the environment and human health.3 However, once an oil spill occurs, responding quickly and in a well-organized manner is the best approach to contain and control the spill. As described in Figure 1, each marine vessel, marine facility, and inland facility owner/operator (plan holder) must maintain an OSPR approved contingency plan (C-Plan). The C-Plan is an oil spill response and removal plan that delineates the actions to be implemented in the event of a spill emergency. Approved C-Plans are valid for five years. After five years, the C-Plan must be re-submitted to OSPR for approval.

3 United States Environmental Protection Agency, Understanding Oil Spills and Oil Spill Response, https://www.epa.gov/emergency-response/understanding-oil-spills-and-oil-spill-response.

Figure 1: Contingency Plan

Each marine vessel, marine facility, and inland facility owner/operator (plan holder) conducting business in California must have an OSPR approved contingency plan unless exempt due to geographical location or production factor.

A contingency plan identifies the actions the plan holder will implement in the event of an oil spill and the specific equipment and personnel to be used.

Source: Government Code section 8670.28


California Code of Regulations (CCR) title 14, sections 820.01 and 820.02, require each C-Plan holder to conduct drills and exercises (D&E) designed to carry out either portions of the C-Plan or the entire plan. D&E promote the response readiness of plan holders by evaluating the sufficiency of C-Plans and providing response teams with the opportunity to practice skills. Equipment and systems are also tested during D&E. The C-Plan D&E objectives must be successfully completed by the plan holders over a three-year period. Feedback and recommendations for improvements to the C-Plan, equipment, and/or systems are provided by OSPR. Failure to conduct the required D&E may result in administrative civil penalties or legal enforcement actions.
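The three-year completion window lends itself to straightforward monitoring. The sketch below is purely illustrative (the objective names and data layout are assumptions, not OSPR's actual tracking system); it flags C-Plan objectives that have not been successfully completed within a trailing three-year period:

```python
from datetime import date, timedelta

def overdue_objectives(completions, as_of, window_years=3):
    """Return D&E objectives not completed within the trailing window
    (hypothetical helper; not OSPR's actual system)."""
    cutoff = as_of - timedelta(days=365 * window_years)
    return [obj for obj, done in completions.items()
            if done is None or done < cutoff]

# Made-up example data for illustration only.
status = {
    "tabletop exercise": date(2019, 5, 1),
    "equipment deployment": date(2016, 3, 15),  # outside the window
    "notification drill": None,                 # never completed
}
print(overdue_objectives(status, as_of=date(2020, 6, 30)))
# → ['equipment deployment', 'notification drill']
```

A real system would also track the semi-annual and annual frequencies noted in the findings, but the rolling-window check above captures the core three-year requirement.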

Response

In accordance with GC section 8670.7, OSPR has the primary authority to direct prevention, removal, abatement, response, containment, and clean-up efforts with regard to all aspects of any oil spill in surface waters of California (excluding groundwater), in accordance with any applicable facility or vessel contingency plans, and the California oil spill contingency plan. Part of this responsibility requires OSPR, in coordination with other entities, to be present at the location of any oil spill of more than 100,000 gallons into California waterways, and to ensure that appropriately trained personnel are on-site to respond, contain, and clean-up the spill, conduct assessments of the environmental impact, and determine the cause and amount of discharge.

California State Lands Commission

Established in 1938, the Commission provides California with effective stewardship of the lands, waterways, and resources entrusted to its care through preservation, restoration, enhancement, responsible economic development, and promotion of public access. The Commission issues leases for offshore oil production facilities within three nautical miles of the coast, including oil-producing islands and offshore platforms. The Commission also regulates every marine oil terminal in California. Both of these functions form the Commission’s Program, which is designed to provide the best achievable protection of public health, safety, and the environment, and to prevent an oil spill in state waters.4 While the Commission does not respond to oil spills, it conducts prevention activities through its Marine Environmental Protection Division and Mineral Resources Management Division, as described in Table 2.

4 Information obtained from the Commission website, http://www.slc.ca.gov/about.


Table 2: Commission’s Program Key Activities

• Marine Environmental Protection Division (Oil Transfer Monitoring): Inspects marine oil terminal transfer operations and activities daily at 34 sites along the coast, and enforces regulatory requirements, including observing and assessing oil transfers to and from tankers and barges at marine oil terminals, conducting annual spot inspections, and monitoring marine oil terminal pipelines.

• Mineral Resources Management Division (Safety Audits): Conducts safety audits of offshore oil and gas production facilities to ensure compliance with regulatory requirements.

• Mineral Resources Management Division (Enforcement): Enforces regulatory requirements for drilling, abandoning, and production of oil and gas wells on state leases, and inspects the operation and maintenance of 34 offshore oil and gas pipelines.

• Both Divisions (Operations Manual Reviews): Conducts operations manual reviews of marine oil terminals and offshore oil and gas production facilities to ensure manuals meet regulatory requirements.

• Both Divisions (Inspections): Conducts monthly inspections of offshore oil production equipment to ensure the integrity of platform safety systems to respond to emergencies or spills.

The Commission’s Oil Transfer Monitoring and Safety Audits Program activities, referenced in this report, are described below:

Oil Transfer Monitoring

In accordance with CCR, title 2, section 2320(a), the Commission continuously monitors transfer operations at its 34 Marine Oil Terminals (MOT) to ensure compliance with various regulatory requirements for oil transfer operations. MOT operators are required to notify the Commission at least 4 hours, but not more than 24 hours, before initiating an oil transfer. The Commission uses a marine terminal priority monitoring rating system to ensure the most critical and significant oil transfers are monitored. Commission staff monitor selected transfer events, such as hook-up, start pump, steady rate, topping off/stripping, and disconnect, to ensure compliance with regulations.

Safety Audits

The Commission conducts comprehensive safety audits of offshore oil and gas production facilities on a five-year cycle to provide a thorough evaluation of facility design, condition, procedures, and personnel. The safety audits are designed to ensure compliance with CCR, title 2, sections 2129 through 2142 and 2170 through 2175, which address requirements for oil and gas production, drill and production pollution control, and operational manual and emergency planning.
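As a simple illustration of the five-year cycle, the next audit due date can be derived from the prior audit date. This is a hypothetical sketch of the scheduling arithmetic only; the Commission's actual audit planning involves more than a single anniversary date:

```python
from datetime import date

def next_audit_due(last_audit, cycle_years=5):
    """Date by which the next comprehensive safety audit is due
    (illustrative; actual Commission scheduling rules may differ)."""
    try:
        return last_audit.replace(year=last_audit.year + cycle_years)
    except ValueError:  # Feb 29 anniversary falling in a non-leap year
        return last_audit.replace(year=last_audit.year + cycle_years, day=28)

print(next_audit_due(date(2015, 8, 12)))  # → 2020-08-12
```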


Program Funding

Program activities are supported by the following funds:

• Fund 0207 – Fish and Wildlife Pollution Account

• Fund 0320 – Oil Spill Prevention and Administration Fund

• Fund 0321 – Oil Spill Trust Fund

• Fund 0322 – Environmental Enhancement Fund

The primary sources of funding for Program activities are Fund 0320 and Fund 0321.

Oil Spill Prevention and Administration Fund-Fund 0320

Fund 0320 is used for OSPR’s and the Commission’s prevention and preparedness activities. Revenues are primarily collected from regulatory fees on barrels of crude oil or petroleum products transferred, and from non-tank vessel fees on biennial applications for Certificates of Financial Responsibility (COFR), which ensure a vessel or facility has adequate financial resources to pay clean-up and damage costs resulting from a spill.5

The COFR fee amount is based on the non-tank vessel’s capacity; larger carrying capacities result in larger fees. Current biennial COFR fees range from $650 to $3,250.

In accordance with GC section 8670.40, the California Department of Tax and Fee Administration is responsible for collecting oil barrel fees. The fee amount is determined by OSPR and is currently 6.5 cents per barrel of crude oil that passes over, across, under, or through the state’s lands or waterways, collected from operators of pipelines, marine oil terminals, and oil refineries. SB 861 expanded OSPR’s Program activities to inland activities, and the 6.5 cents per barrel fee was expanded to include crude oil received at California refineries.
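The per-barrel fee arithmetic is simple. The example below uses the 6.5 cents rate stated above with a made-up barrel count, not actual Program data:

```python
FEE_PER_BARREL = 0.065  # dollars; 6.5 cents per barrel, per GC section 8670.40

def barrel_fee(barrels):
    """Fee owed on a given number of barrels of crude oil (illustrative)."""
    return barrels * FEE_PER_BARREL

# A hypothetical 10 million barrels would generate $650,000 in fees.
print(f"${barrel_fee(10_000_000):,.2f}")  # → $650,000.00
```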

Fees collected are deposited into Fund 0320 for use on oil spill prevention and preparedness, such as implementing oil spill prevention programs, researching prevention and control technology, carrying out studies that may lead to improved oil spill prevention and response, and financing environmental and economic studies relating to the effects of an oil spill. The remaining revenues provide ongoing funding for the Oiled Wildlife Care Network and correct historical deficits.6

Fund 0320 cannot be expended on oil spill response activities, per GC section 8670.40 (e).

Oil Spill Trust Fund-Fund 0321

Fund 0321 is used by OSPR for oil spill response activities and receives reimbursements through cost recovery from parties deemed responsible for the spill. OSPR is required to recover costs incurred in responding to spill incidents from responsible parties and deposit reimbursements into the fund, as required by GC sections 8670.47 and 8670.53. Cost recovery methods include 1) legal actions, 2) direct billing to the responsible parties, and 3) filing a claim with the Federal Oil Spill Liability Trust Fund for federal reimbursement to partially or fully recover spill costs, if the responsible party cannot be determined or is unable to pay. The majority of spill costs are recovered through responsible party reimbursements.

5 Information obtained from http://www.dof.ca.gov/budget/manual_state_funds/Find_a_Fund/documents/0320.pdf.
6 Information obtained from https://wildlife.ca.gov/OSPR/About.
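The three recovery routes above can be sketched as a simple decision. This is a hypothetical illustration only; OSPR's actual case handling involves legal and factual judgment this logic does not capture:

```python
def recovery_route(party_identified, able_to_pay, disputes_liability=False):
    """Pick a cost recovery route per the methods described above
    (hypothetical sketch, not OSPR policy)."""
    if not party_identified or not able_to_pay:
        # Federal reimbursement when no viable responsible party exists.
        return "claim with the Federal Oil Spill Liability Trust Fund"
    if disputes_liability:
        return "legal action"
    return "direct billing to the responsible party"

print(recovery_route(party_identified=True, able_to_pay=True))
# → direct billing to the responsible party
```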

SCOPE

In accordance with GC section 8670.42, the California Department of Finance, Office of State Audits and Evaluations, conducted a performance audit of OSPR’s and the Commission’s Program for the period July 1, 2016 through June 30, 2020.

Our audit objective was to assess the programmatic effectiveness and financial basis of the Program and identify measures to improve Program efficiency and effectiveness. To accomplish our objective, we determined:

• The Program’s service delivery levels and if the Program met regulatory requirements.

• If operational processes and utilization of resources address Program needs.

• If Program revenues were utilized for key program activities and fund balances are adequate to support program expenditures.

Our audit focused on Program activities considered significant to the Program’s objective of “best achievable protection of the environment and the public health,” which include OSPR’s preparedness activities for drills and exercises and response activities for oil spills, and the Commission’s prevention activities for oil transfer monitoring and safety audits. Our audit did not include a technical review of these Program activities for compliance with the GC, Program regulations, or other applicable regulations.

In performing our audit, we considered internal controls significant to the audit objectives. See Appendix A for a list of significant internal control components and underlying principles.

METHODOLOGY

In planning the audit, we gained an understanding of areas significant to the Program. We identified Program requirements by reviewing applicable statutes and regulations, and policies and procedures provided by OSPR and the Commission, and resources available on their websites. We reviewed prior audit reports and interviewed personnel to gain an understanding of OSPR’s and the Commission’s operations and information technology systems.

We conducted a risk assessment, including evaluating whether Program key internal controls significant to our audit objectives were properly designed, implemented, and operating effectively. Internal controls evaluated focused on:

• OSPR’s established processes and procedures, and information systems used for conducting drills and exercises and responding to oil spills;


• The Commission’s established processes and procedures, and information systems used for monitoring of oil transfers and conducting safety audits;

• OSPR’s and the Commission’s established processes and procedures, and information systems used to track and account for Program expenditures and revenue, and oil spill cost recovery.

Our assessment included conducting interviews with OSPR and Commission personnel, reviewing policies and procedures, and observing key processes related to review and approval activities. Deficiencies in internal controls identified during our audit and determined to be significant within the context of our audit objectives are included in the results section of this report.

Additionally, we assessed the reliability of reports generated from OSPR’s Incident Tracking Database and Readiness Database; and the Commission’s Oil Spill Prevention Database (OSPD), Safety Audit History Log, and Action Item Matrix. Specifically, we reviewed existing information and gained an understanding of relevant controls by observing key processes related to system operations and review and approval protocols, and traced a selection of data to source documentation to test for accuracy and completeness. Except as noted below, we determined that the data were sufficiently reliable to address the audit objectives.

During testing of the Commission’s oil transfer reports generated from OSPD, we determined that vessel priority rating numbers maintained in OSPD were not accurate. Therefore, we determined the data related to the vessel priority ratings are not sufficiently reliable. Significant data deficiencies identified during our audit are included in the Results section of this report.

Based on the results of our planning, we developed specific methods for gathering evidence to address the audit objectives. Our methods are detailed in the Table of Methodologies in Appendix B.

Except as discussed in the following paragraph, we conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe that the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Finance, CDFW, and the Commission are part of the State of California’s Executive Branch. As required by various statutes within the California Government Code, Finance performs certain management and accounting functions. Under generally accepted government auditing standards, performance of these activities creates an organizational impairment with respect to independence. However, Finance has developed and implemented sufficient safeguards to mitigate the organizational impairment so reliance can be placed on the work performed.


RESULTS

CONCLUSION

During 2016-17 through 2019-20, OSPR conducted 672 D&E related to 1,433 approved C-Plans within the three-year regulatory deadlines; however, instances where plan holders did not complete the required D&E within the semi-annual and annual time frames were noted. Additionally, OSPR condensed its D&E into single events for the same operator, and a majority of the D&E conducted received a pass rating, resulting in an overall decrease in its D&E workload. Opportunities exist for OSPR to proactively monitor plan holders to timely identify non-compliance with D&E requirements, and improve the efficiency and effectiveness of D&E operations and workload planning.

Additionally, OSPR responded to 98 percent of the 5,900 reported spill incidents as required by Program regulations. The remaining 2 percent of the reported spill incidents did not warrant OSPR’s response because those spills did not involve state waters or the spill volume was less than 100 gallons. However, the causes of the oil spills were not always determined and opportunities exist for OSPR to improve the quality of the collected spill data by ensuring spill causes are consistently determined and establishing guidance for determining the types of spill responses to ensure consistency in its practices.
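The triage criteria described above, involvement of state waters and a 100-gallon volume threshold, can be expressed as a simple check. This is an illustrative sketch only, not OSPR's actual incident screening logic:

```python
def warrants_response(involves_state_waters, gallons):
    """Whether a reported spill warrants an OSPR response, per the
    criteria described above (illustrative sketch only)."""
    return involves_state_waters and gallons >= 100

print(warrants_response(True, 250))    # → True
print(warrants_response(False, 5000))  # → False
print(warrants_response(True, 40))     # → False
```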

The Commission demonstrated continuing coverage of oil transfers by monitoring 9,494 of 27,546 oil transfers (34 percent), totaling approximately 1.3 billion gallons, during 2016-17 through 2019-20. However, the Commission could not demonstrate it properly monitored all critical or high priority oil transfers due to a lack of supporting data.

Additionally, the Commission completed safety audits for six oil and gas drilling and production facilities within each facility’s five-year cycle, in accordance with the Commission’s Strategic Plan goals. Opportunities exist for the Commission to improve its operational efficiency and effectiveness by evaluating the safety audit process and monitoring audit deficiency action items to ensure corrective actions are implemented within the established time frames.

Program barrel and non-tank vessel fees were collected, deposited into Fund 0320, and used primarily to fund prevention and preparedness activities performed by OSPR and the Commission. However, beginning in 2017-18, expenditures have exceeded revenues, resulting in a declining fund balance. To assist the fund in remaining solvent, Fund 0321 will provide a $6.5 million loan to Fund 0320. As OSPR seeks to remedy its future projections of a declining fund balance, it should continue to monitor revenues, and OSPR and the Commission should seek to ensure effective use of revenues collected by implementing practices to improve Program operations. Further, OSPR and the Commission should also ensure Program expenditures are accurately charged to the Fund.

OSPR continuously recovered costs incurred in responding to spill incidents from responsible parties, and deposited reimbursements into Fund 0321, as required. As of June 30, 2020, Fund 0321 is sufficiently funded.


FINDINGS AND RECOMMENDATIONS

An effective program ensures goals and objectives are achieved; management is responsible for ensuring program activities are effectively designed, implemented, and achieving strategic goals and other intended results. Program efficiency is demonstrated through the use of inputs and other resources to achieve program results; management is responsible for achieving the optimal relationship between the outputs of services and the resources used to produce them, in terms of quantity and process time. An efficient and effective program depends on three factors: human resources, processes, and technology, as noted in Figure 2.

Figure 2: Triad of an Efficient and Effective Program

Source: Hernan Murdock, Operational Auditing

The findings and recommendations detail OSPR's and the Commission's programmatic and fiscal administration of the Program, and include identified non-compliance and opportunities to improve the efficiency and effectiveness of the Program's operational practices, information systems, and workload planning.

Ineffective and inefficient Program operations may adversely impact OSPR’s and the Commission’s ability to timely assess Program performance, resolve emerging issues, and monitor compliance with regulatory responsibilities and requirements.


PROGRAMMATIC EFFECTIVENESS

California Department of Fish and Wildlife Office of Spill Prevention and Response

OSPR’s D&E Unit, within its Preparedness Branch, is responsible for promoting the response readiness of vessel and facility C-Plan holders, and engaging in the design and conduct of individual D&E.

Additionally, OSPR staff from its Environmental Response, Enforcement, and Prevention Branches coordinate oil spill response and tracking efforts through Field Response Teams (FRT). During spill incidents, OSPR FRT serves as the state’s on-scene coordinator to ensure that spills are remediated to minimize impacts to wildlife and wildlife habitat. FRT activities include obtaining information regarding the cause and source of the spill, collecting evidence, coordinating with allied agencies, and preparing spill memorandums or comprehensive reports used in legal actions.

DRILLS AND EXERCISES

Process Overview

OSPR assesses the performance of plan holders by conducting D&E to validate all or part of a C-Plan. Plan holders are required to complete a specific set of deliverables and objectives based on their plan type and tier risk level, which is established by CCR, title 14, sections 820.01 and 820.02. Inland plan holders have three tier levels based on their estimated worst case spill volume. Marine plan holders have two tier levels based on their facility or vessel type, as noted in Table 3.

Table 3: Plan Holders Tier System

Tier   Inland                  Marine

1      1,000 barrels or more   Tank Vessels, Non-Tank Vessels, and Marine Facilities

2      500 to 999 barrels      Small Marine Fueling Facilities, Mobile Transfer Units, and Vessels Carrying Oil as Secondary Cargo

3      Up to 499 barrels       Not applicable

Source: CCR, title 14, sections 820.01 and 820.02.

A plan holder must conduct and pass all of its required D&E objectives over a three-year period. The three-year cycle begins when the C-Plan is approved by OSPR (i.e., the status date). D&E may include, but are not limited to, an annual SMT table top exercise (TTX) or a semi-annual equipment deployment drill (SED). An SMT TTX evaluates the SMT's knowledge of the C-Plan; it is performed in a tabletop setting, simulating an actual oil spill to accomplish specified objectives noted in the C-Plan. An SED must be conducted within the first six months of the calendar year to test the deployment of facility-owned equipment, including the immediate containment strategies identified in the C-Plan. If the plan holder does not pass the SED, a second SED must be performed in the second half of the calendar year.
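The cycle and timing rules described above can be modeled as simple date checks. The sketch below is illustrative only; the function names and the assumption that one TTX is owed per calendar year of the cycle are ours, not OSPR's actual tooling.

```python
from datetime import date

def ttx_due_years(status_date: date) -> list[int]:
    """An annual TTX is owed in each calendar year of the three-year
    cycle that begins on the C-Plan status (approval) date.
    Illustrative model only, not OSPR's documented method."""
    return [status_date.year + n for n in range(3)]

def sed_window(year: int) -> tuple[date, date]:
    """An SED must occur in the first six months of the calendar year;
    a failed SED triggers a second SED in the second half."""
    return date(year, 1, 1), date(year, 6, 30)

# Example: a C-Plan approved 9/28/2017 would owe a TTX in 2017, 2018, and 2019.
print(ttx_due_years(date(2017, 9, 28)))  # -> [2017, 2018, 2019]
```

Under this model, a drill conducted after June 30 falls outside the SED window, which is the test applied in Finding 1A.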


Program regulations require plan holders to submit a notification to the Administrator to schedule a D&E with OSPR, as noted in Figure 3. Plan holders utilize OSPR’s online calendar to schedule D&Es and are required to notify OSPR of their request at least 30 and 60 days in advance for SEDs and TTXs, respectively. Only D&E scheduled on OSPR’s calendar will be eligible to receive credit for successfully completed deliverables and objectives.

Once a D&E is scheduled, OSPR’s assigned Drill Coordinator is responsible for participating in the Drill Design Meeting. This entails coordination with the plan holder regarding the D&E plan and design, scope of the D&E and objectives to test, and ensuring the plan holder tests the required objectives in accordance with its C-Plan type (marine vs. inland) and tier risk level. For example, the event scenario may be a large spill incident, and will include spill location, amount spilled, and product spilled. Each initial response objective met by the SMT participants will be documented by the Drill Coordinator. The SMT must make every effort to manage the simulated incident and to demonstrate it has the ability to meet each objective.

All assigned OSPR staff and involved parties (e.g., SMT) must be present to conduct the D&E. D&Es will generally be conducted in eight hours or less, but some may last several days. OSPR’s iAuditor software application is used to record observations and document D&E results through various forms and checklists, and photographic evidence, as needed. The documents are used to prepare an assessment report (Drill Report) within 20 business days of the D&E's conclusion.

Within 60 days of completing the D&E, plan holders are required to submit a Request for Drill/Exercise Credit form to OSPR, including supporting documentation for each objective tested. OSPR examines the plan holder's form and supporting documents, compares them to OSPR's drill assessment report, and issues a Credit Approval Letter for the objectives successfully completed. Additionally, several facilities or vessels can be owned by a single operator (i.e., plan holder), with a C-Plan for each facility or vessel. If the plan holder has multiple C-Plans in the same geographic region, it may combine the D&E for these C-Plans into a single event, provided each C-Plan is tested and incorporated into the D&E.

In 2008, OSPR developed a violations review process to identify plan holders that did not comply with D&E semi-annual or annual requirements in the prior year. Plan holders identified as non-compliant are issued a violation letter. However, as noted in Finding 1A, the violations review process has not been formalized with policies and procedures, nor consistently implemented, since 2008. As a result, OSPR has not effectively monitored plan holder compliance with regulatory requirements.

Figure 3: D&E Process Overview

Plan holder schedules D&E and gives proper notice to OSPR → Conduct Drill Design Meeting → Conduct D&E → Drill Coordinator compiles Drill Report → Plan holder submits Request for Drill/Exercise Credit → OSPR reviews Request for Drill/Exercise Credit → OSPR issues Credit Approval Letter

Source: Information obtained from OSPR


Readiness Database

OSPR created a Microsoft Access database (Readiness Database) to track C-Plans, plan holders, and D&E information, such as the C-Plan number and status date, plan holder name, D&E completed, objectives completed and not completed, D&E results, and D&E participants.

A report, generated from the Readiness Database at the beginning of the calendar year, identifies plan holders who did not conduct a D&E in the prior year. Another report identifies plan holders who did not timely notify OSPR of a scheduled D&E or who delayed submitting their Request for Drill/Exercise Credit form. According to OSPR, although the reports are generated from the Readiness Database, the report information may not be accurate. Therefore, OSPR staff must validate the reports by comparing the information to each plan holder record in the Readiness Database.

The Readiness Database is also used by OSPR staff to prepare for a D&E. After a plan holder schedules a D&E and submits its listing of planned objectives to test, the Drill Coordinator reviews the plan holder's information in the Readiness Database to verify the selected objectives are consistent with OSPR's records. According to OSPR staff, they are unable to generate a report identifying objectives that have been included in a previous D&E and objectives that remain outstanding. As a result, the Drill Coordinator must manually review each plan holder's record to determine whether all required objectives were previously met and to suggest revisions to the planned objectives. After completion of the D&E, the Drill Coordinator is responsible for compiling the results and supporting documentation, and recording the results in the Readiness Database. Limitations of the Readiness Database impact OSPR's ability to effectively prioritize its resources to meet its workload requirements, as noted in Finding 1B.
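The report the Drill Coordinator must currently assemble by hand, objectives completed versus outstanding for a plan holder, amounts to a set difference over the database records. A minimal sketch follows; the function name and sample objective labels are hypothetical, not the actual Access schema or regulatory objective list.

```python
def outstanding_objectives(required: set[str],
                           completed_events: list[set[str]]) -> set[str]:
    """Objectives still owed in the cycle = required objectives minus
    the union of objectives credited across all prior D&E events."""
    credited = set().union(*completed_events) if completed_events else set()
    return required - credited

# Hypothetical example for one plan holder over two completed events.
required = {"notifications", "staff mobilization",
            "equipment deployment", "documentation"}
done = [{"notifications"}, {"staff mobilization", "notifications"}]
print(sorted(outstanding_objectives(required, done)))
# -> ['documentation', 'equipment deployment']
```

A report of this form, generated per plan holder, is essentially the feature the D&E Unit describes as under development.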

Drills and Exercises Activity

OSPR conducted 672 D&E on 1,433 approved C-Plans during the period July 1, 2016 through June 30, 2020. See Figure 4 for detail on the C-Plan types and Figure 5 for detail on the D&E types conducted.

Source: OSPR Readiness Database


During 2016-17 through 2019-20, approximately 92 percent of the D&E performed resulted in a pass rating, while 1.5 percent resulted in a fail or partial fail rating. The remaining 6.5 percent of D&E were cancelled or rescheduled. We also reviewed the pass and fail rates for TTX, SED, and combined TTX/SED events, as noted in Figure 6.

Figure 6: Pass–Fail Rates for D&E Conducted 2016-17 through 2019-20

Source: OSPR Readiness Database

As part of our review, we selected 15 plan holders and verified they met all D&E objectives, including the annual TTX and semi-annual SED, within the facility’s respective three-year cycle, as required.

In accordance with Program regulations, OSPR D&E activities operate on a calendar year basis. A comparison of D&E completed during calendar years 2017 through 2019 indicates a declining trend for each D&E type, as noted in Figure 7. The approximately 15 percent overall decrease in D&E between 2017 and 2019 is attributable to multiple variables, including plan holders' ability to condense multiple C-Plan D&E into a single event and the majority of D&E receiving a pass rating, which meant additional D&E were not required to maintain plan holder compliance.

Figure 7: D&E Activity by Calendar Year

Source: OSPR Readiness Database


Drills and Exercises Conclusion

Based on our review of D&E data, except as noted in Finding 1A, OSPR conducted 672 D&E on 1,433 approved C-Plans, including conducting TTX and SED events with plan holders by the regulatory semi-annual, annual, and three-year cycle deadlines. Additionally, we observed the number of D&E events has declined each year, due in part to OSPR condensing D&E for multiple C-Plans into single events. Further, a majority of the D&E conducted received a pass rating, demonstrating plan holders' readiness to respond to an oil spill; thus, additional D&E events were not necessary to maintain plan holder compliance.

Although regulations require C-Plan holders to be responsible for meeting D&E requirements, opportunities exist for OSPR to proactively monitor plan holders to timely identify non-compliance with D&E requirements, and improve the efficiency and effectiveness of D&E operations and workload planning, as noted in Finding 1B.

Finding 1: Opportunities Exist for Improvement in OSPR’s Drills and Exercises

Our review of D&E data, established processes, and systems used for D&E identified instances of TTX and SED non-compliance with requirements, and inefficient program operations that hinder OSPR’s ability to effectively monitor D&E, assess operational needs, and plan its workload.

A. Formalize Violations Review Process to Ensure All Plan Holders Conduct Annual Table Top Exercises and Semi-Annual Equipment Deployment Drills

OSPR’s violations review process, developed in 2008, has not been formalized with policies and procedures, and has been inconsistently implemented. As a result, OSPR has not timely identified instances of non-compliance with D&E requirements, relating to TTX and SED activities.

Beginning in late 2019, OSPR re-implemented its violations review process to identify non-compliance instances such as untimely notifications of a scheduled D&E or requests for credit, or plan holders that did not complete the required D&E in the prior year. As a result, OSPR identified several instances of plan holder non-compliance with regulations. Specifically, 8 of 107 C-Plan holders, during calendar years 2017 and 2018, did not conduct the required annual TTX, as identified in Table 4. These plan holders had C-Plan approvals (status dates) from as early as July 2016 through April 2018, and were required to conduct at least one annual TTX.


Table 4: Annual Table Top Exercise Non-Compliance

Plan Number   C-Plan Status   C-Plan Status Date   Violation Year   Violation Letter Date   Elapsed Time in Sending Violation Letter (Months)
I1-30-6181    Approved        9/28/2017            2017             1/31/2020               25
I1-56-6128    Approved        7/5/2017             2017             1/31/2020               25
I5-15-5004    Approved        7/5/2016             2017             1/31/2020               25
I1-42-6148    Approved        8/3/2017             2017             10/23/2020              34
T2-48-2077    Approved        3/22/2018            2018             1/31/2020               13
F2-20-3238    Approved        4/19/2018            2018             1/31/2020               13
I1-42-6096    Approved        10/12/2017           2018             1/31/2020               13
RS-60-5033    Approved        7/25/2016            2018             9/17/2020               33

Additionally, three plan holders, as identified in Table 5, did not conduct SEDs within the first six months of the calendar year, as required. Specifically, 2 of 71 SEDs in 2016 and 1 of 47 SEDs in 2018 were conducted, on average, 5 months late.

Table 5: Semi-Annual Equipment Deployment Non-Compliance

Plan Number   SED Drill Date   SED Due Date   Months Late in Scheduling Drill
T2-20-3630    12/7/2016        6/30/2016      6
F6-37-0183    11/21/2016       6/30/2016      5
S2-48-0032    11/26/2018       6/30/2018      5
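The elapsed-months figures in Tables 4 and 5 are consistent with a plain calendar-month difference. The helper below is our inference of that arithmetic, not OSPR's documented method.

```python
from datetime import date

def months_between(earlier: date, later: date) -> int:
    """Whole calendar months from one date's month to another's,
    ignoring day-of-month (our assumed convention)."""
    return (later.year - earlier.year) * 12 + (later.month - earlier.month)

# Table 5: SED due 6/30/2016, conducted 12/7/2016 -> 6 months late.
print(months_between(date(2016, 6, 30), date(2016, 12, 7)))   # -> 6
print(months_between(date(2016, 6, 30), date(2016, 11, 21)))  # -> 5
# Table 4: violation year ending 12/2017, letter sent 1/2020 -> 25 months.
print(months_between(date(2017, 12, 31), date(2020, 1, 31)))  # -> 25
```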

CCR, title 14, section 820.01(a)(1)(A), requires facilities to conduct at least one TTX annually, and sections 820.01(a)(1), 820.01(a)(3), and 820.02 require certain plan holders to conduct SEDs in the first six months of the calendar year. In accordance with GC section 8670.5, as the Program Administrator, OSPR is responsible for ensuring the Act is implemented as intended.

Without documented policies and procedures, and consistent implementation of the violations review process, OSPR is not able to timely identify plan holder non-compliance with the regulations and issue violation letters to initiate corrective actions.

B. Strengthen Drills and Exercises Program Operations

Readiness Database Limitations

The Readiness Database lacks the reporting capabilities needed to effectively support D&E Unit operations. OSPR cannot easily generate accurate and reliable reports for planning and decision-making purposes.

For example, the database currently cannot generate a report that identifies the D&E objectives a plan holder has completed and those that remain outstanding, or the plan holders that are required to complete an SED. As a result, OSPR cannot efficiently track and monitor D&E compliance, or plan for D&E activities. As described in the Readiness Database section, D&E Unit staff must manually review each individual plan holder record in the database to identify completed and outstanding objectives when planning for a plan holder's scheduled D&E. Depending on the C-Plan type and tier risk level, the number of objectives to be tested and manually reviewed in the database can be as many as 29.

D&E Unit staff are in the process of developing a report function that will produce a report that identifies completed and outstanding objectives for each plan holder. However, completion of this report feature has been delayed due to staff reassignment to perform COVID-19 contact tracing.

Additionally, reports generated from the database were not always accurate and required D&E Unit staff to manually compare report information to individual database records. Database permission levels currently allow staff from multiple units and branches to enter and edit data related to their Program activities, resulting in variances in the consistency and accuracy of the data. According to D&E Unit staff, a primary cause of inaccuracies in the generated reports is erroneous or incomplete data input, and the unit has been working to improve the data entry process. For example, certain data entries previously made by Prevention Branch staff have recently been reassigned to the Preparedness Branch to improve the consistency of data entered. The D&E Unit anticipates its ongoing efforts to improve data quality will result in more reliable data and reduce the resources used to validate the data's accuracy.

The lack of reporting functions within the database and the inconsistent quality of the data impact OSPR's ability to efficiently perform its D&E activities and effectively prioritize its resources to meet its operational needs.

Inefficient Workload Planning

The D&E Unit does not proactively track and monitor plan holders’ D&E compliance and does not have an established process to identify, track, and ensure plan holders schedule the required D&E timely. As a result, OSPR is not able to consistently conduct D&E throughout the calendar year, and may not be aware of the nature and extent of workload increases during the quarters that coincide with regulatory deadlines for TTXs and SEDs, for example. Therefore, plan holders are at risk of not being able to meet the Program requirements as OSPR may not have sufficient staff available to perform the necessary workload.

Based on our analysis of D&E data provided by OSPR for 2016-17 through 2019-20, D&E activity was highest during the second and fourth quarters of the calendar year, as shown in Figure 8. A comparison of events each year showed a consistent trend in which the number of events doubled in the second quarter, when SEDs were due, and in the fourth quarter, when TTXs were due. Further, because TTXs were the majority of D&E conducted, the fourth quarter had the most events each year.


Figure 8: D&E Events for 2016-17 through 2019-20

According to D&E Unit staff, some plan holders delay the scheduling of their D&E in case they are able to receive credit through alternative means, such as actual response to a spill event. This is viewed as a strategy by plan holders to save resources associated with conducting a required D&E.

Program regulations require plan holders to ensure required D&Es are scheduled and to monitor which objectives are successfully completed. Although D&E Unit staff may communicate with plan holders to inform them of outstanding D&E as a courtesy, Program regulations currently do not authorize OSPR to enforce the requirement for plan holders to timely schedule D&Es.

However, as previously noted in Finding 1A, OSPR has an opportunity to consistently implement its violations review process. Doing so would enable OSPR to proactively monitor plan holders that have not scheduled the required D&E and anticipate the timing of corrective actions based upon its review. Additionally, OSPR’s D&E webpage identifies the D&E Unit’s goals as “to enforce regulatory requirements, improve OSPR and stakeholder preparedness through drill and exercise attendance, constantly improve the drill evaluation process, and maintain up-to-date calendars”.

As the lead government agency responsible for enforcing the Act and Program requirements, OSPR has the authority to develop and implement practices that promote efficient operations. With an established practice to consistently monitor plan holders through its violations review process, and consistently conduct D&E throughout the calendar year to avoid second and fourth quarter workload increases, the D&E Unit can plan designated Program activities to ensure the efficient use of its resources.


Recommendations

A. Develop policies and procedures to address requirements for the violations review process and ensure the process is consistently implemented.

B. Track completion of TTXs and SEDs to assist in identifying possible non-compliance with D&E requirements and send timely notification to corresponding plan holders to enable scheduling of D&E prior to the deadline.

C. Continue to explore improvements to the Readiness Database's reporting features and ensure accurate and consistent data is maintained to enable the generation of reliable reports that will assist in workload planning and the tracking and monitoring of plan holders' D&E compliance, such as D&Es and objectives completed and outstanding. Consider providing guidance for recording D&E information in the Readiness Database.

D. Establish a process to identify required D&Es and objectives outstanding for plan holders at the beginning of the calendar year, and periodically throughout the year to assist in prioritizing and planning D&E annual workload. Timely notify plan holders of the required D&E and outstanding objectives to facilitate the implementation of facility corrective actions.

E. Consider establishing a process to conduct D&E consistently throughout the calendar year. This may include proactive coordination with plan holders in developing annual D&E workload plans at the beginning of each calendar year to have better control over the D&E scheduling.

OIL SPILL RESPONSE

Process Overview

OSPR maintains a 24-hour Communications Network (Network), utilizing the OSPR Spill Desk during business hours and the Department of Parks and Recreation Dispatch Centers during evenings, weekends, and holidays. The Network receives and evaluates spill reports from the California Office of Emergency Services (CalOES), and is responsible for notifying OSPR Wildlife Officers and other FRT members of all oil spills that impact, or threaten to impact, marine and inland waters of California. In the event of a major incident, the Network also assists with communications between the OSPR Operations Center and response personnel.7

FRTs are OSPR staff consisting of a Wildlife Officer, an Environmental Scientist, and an Oil Spill Prevention Specialist, on call 24 hours per day. OSPR has FRTs associated with each of its three regional offices: Fairfield (Northern), Bakersfield (Central), and Los Alamitos (Southern). In the event of a spill, all three on-call members of the appropriate FRT are notified by the Network upon receipt of the initial spill report, and respond as necessary by reviewing the CalOES spill report data and speaking with any on-site personnel and other parties involved. See Figure 9 for an overview of the spill notification and response process.

7 Information obtained from https://wildlife.ca.gov/OSPR/Enforcement.


Figure 9: Spill Notification and Response Process

Source: Information provided by OSPR

During the initial receipt of a reported incident, the responding FRT member will consider various factors, such as the size of the spill, the spill location (i.e., whether it is close to state waters), and the source of the spill (e.g., pipeline, vehicle, vessel), and, based on professional judgment, will decide whether a physical or telephone response is warranted, as noted in Figure 10. If a physical response is warranted, the responding FRT member will notify the other team members or contact the FRT member nearest the spill to begin on-site spill response procedures. However, the decision on how to respond to a spill is based on FRT staff's professional judgment and experience, and there currently are no documented policies or guidance, as noted in Finding 2B.
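Because the physical-versus-telephone decision currently rests on professional judgment alone, documented guidance could take the form of a simple triage rule over the factors named above (spill size, proximity to state waters, spill source). The thresholds and function below are purely hypothetical illustrations of what such a rule might look like; they are not OSPR policy.

```python
def response_type(gallons: float,
                  near_state_waters: bool,
                  high_risk_source: bool) -> str:
    """Hypothetical triage rule: dispatch a physical response for
    larger spills, spills threatening state waters, or high-risk
    sources (e.g., pipelines, vessels); otherwise handle by phone.
    The 100-gallon threshold is an assumption for illustration."""
    if gallons >= 100 or near_state_waters or high_risk_source:
        return "physical"
    return "telephone"

print(response_type(5, near_state_waters=False, high_risk_source=False))
# -> telephone
print(response_type(500, near_state_waters=False, high_risk_source=False))
# -> physical
```

Codifying even a rough rule of this kind would make response decisions auditable and consistent across FRT members, which is the gap Finding 2B identifies.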

When responding to a spill, the Environmental Scientist will document details of the spill by determining the spill size, location, severity, and environmental impact. The Oil Spill Prevention Specialist will investigate and speak with the Wildlife Officer and responsible party to determine the source and cause of the spill, and quantify the spill amount. Figure 11 provides an overview of the spill response data gathering. However, OSPR may not always be able to determine the spill cause. Where the oil spill cause cannot be determined, OSPR notes it as “unknown”. As noted in Finding 2A, we observed a majority of the spill causes were recorded as “unknown” in the Incident Tracking Database (ITD), which hinders OSPR’s ability to identify oil spill risk factors and develop appropriate prevention strategies.

Figure 10: Spill Response Type

Physical Response may require the full FRT team or a FRT member to report to the location of the spill.

Telephone Response is when a physical response is not warranted and the assigned FRT member can resolve and collect information on the spill incident over the phone.

Source: Information provided by OSPR

Figure 11: Spill Response Data Gathering

Source: Information provided by OSPR (* Oil Spill Prevention Specialist)


The FRT will work with all related parties (landowners, the responsible party, local and federal agencies, etc.) to reach a consensus on Clean-up End Points (CEPs), which establish the extent of the oil clean-up efforts. In certain instances, not all spilled oil may be designated for clean-up, if additional removal could cause further harm to the environment. Throughout the clean-up process, the Environmental Scientist will conduct regular site visits and a final inspection to ensure the CEPs were met. The action taken in response to the spill is entered into the ITD for tracking. Additionally, for larger spills, the results may be used to prepare a Spill Response Memo or Supplemental Environmental Incident Report, which contains a detailed account of the spill used when a claim is filed to recover costs incurred in responding to the incident.

Incident Tracking Database

The ITD is a Microsoft Access database used to track each spill incident reported by CalOES. OSPR Spill Desk staff enter initial spill details from the CalOES report into the ITD, including the reporting and responsible parties, assigned FRT members, initial spill volume, time of spill, and other pertinent information. Subsequently, the FRT Wildlife Officer, Environmental Scientist, and Oil Spill Prevention Specialist will update any information previously entered into the ITD and add details such as the final spill volume, cause, and source.

Oil Spill Response Activity

A total of 834,990 gallons of oil was released from 5,900 reported spill incidents during 2016-17 through 2019-20, as summarized in Table 6.

Table 6: Number of Spill Incidents and Volume by Fiscal Year

Fiscal Year   Incidents   Volume (Gallons)
2016-17       1,574       79,919
2017-18       1,314       259,512
2018-19       1,589       255,262
2019-20       1,423       240,297
Total         5,900       834,990
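The fiscal-year figures in Table 6 can be cross-checked by summation:

```python
# Incidents and gallons released by fiscal year, per Table 6.
incidents = {"2016-17": 1574, "2017-18": 1314, "2018-19": 1589, "2019-20": 1423}
volume_gal = {"2016-17": 79919, "2017-18": 259512, "2018-19": 255262, "2019-20": 240297}

print(sum(incidents.values()))   # -> 5900 (total reported incidents)
print(sum(volume_gal.values()))  # -> 834990 (total gallons released)
```

Both sums match the report's stated totals of 5,900 incidents and 834,990 gallons.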

Figure 12 displays spill incidents and volume from 2016-17 through 2019-20, for marine and inland spills. Spill volume and incidents increased by 104 percent and 24 percent, respectively, compared to our prior audit period of 2012-13 through 2015-16,8 largely due to the inclusion of inland activities in 2014. However, the number of reported spill incidents remained relatively consistent during 2016-17 through 2019-20.

8 A copy of Finance’s prior Program performance audit dated December 29, 2016 can be found at:

https://esd.dof.ca.gov/reports/report.html. A total of 411,005 gallons of oil was released from the 4,748 reported incidents during fiscal years 2012-13 through 2015-16.


Figure 12: Spill Incident vs. Volume

Source: OSPR Incident Tracking Database

As shown in Figure 13, of the 5,900 reported incidents, over 90 percent (approximately 5,500 incidents) involved less than 100 gallons spilled, which are categorized as small spills. Large spills, totaling more than 10,000 gallons, accounted for less than 1 percent (10 incidents) of the 5,900 incidents reported; all 10 were inland spills.

Figure 13: Spill Incidents by Volume (in Gallons)

Source: OSPR Incident Tracking Database

Based on our review of OSPR’s incident response type, physical or telephone, OSPR responded to 98 percent of the 5,900 reported spill incidents, in which 24 percent (1,436) were physical responses and 74 percent (4,336) were telephone responses. The remaining 2 percent of incidents (128) did not involve California waters and/or the spill volume was less than 100 gallons; and therefore, the incidents did not warrant a response by OSPR. See Figure 14 for incident response types for each year.


Figure 14: Comparison of Incident Response Type

Source: OSPR Incident Tracking Database

Oil Spill Response Conclusion

Since the Program’s expansion to include inland activities in 2014, the spill volume and number of spill incidents has increased by 104 percent and 24 percent, respectively. However, the total reported spills during 2016-17 through 2019-20 remained relatively consistent and did not show a pattern of increase or reduction. Of the 5,900 reported spill incidents, OSPR responded to all incidents it was required to respond in accordance with the Program regulations, representing 98 percent of the 5,900 reported spill incidents. The remaining 2 percent of reported spills did not warrant OSPR’s response according to the regulations. The majority of incidents responded resulted from spills of 100 gallons or less (small spills) and OSPR responded to those incidents either physically or by telephone. However, the causes of the spills were not always determined and input into the ITD by OSPR. We identified opportunities for OSPR to improve the quality of the spill data collected by ensuring spill causes are consistently recorded and establishing guidance for determining the type of spill response required (physical or telephone) to ensure consistency, as noted in Finding 2.

Finding 2: Improve Methods to Identify Spill Causes and to Determine Spill Response Type

Our review of OSPR's spill response processes and systems found that the majority of spill causes were not identified in the ITD, and that OSPR did not have established policies and procedures for determining the type of spill response required. Data and process inconsistencies reduce the quality of information collected, and hinder the effectiveness of OSPR management decisions and improvement of Program performance.

A. Ensure Spill Cause is Identified or Subsequently Updated

During the period July 1, 2016 through June 30, 2019, we identified 3,070 of 5,900 (52 percent) reported spill incidents to which OSPR responded that did not have a spill cause identified. Further, there were 30 instances where spill causes were not identified for incidents with a spill volume over 1,000 gallons (large spills); OSPR responded to 29 of these 30 incidents either physically or by telephone. Figure 15 provides detail on the spill causes collected for reported spill incidents.


Figure 15: Common Spill Causes from 2016-17 to 2019-20 (Number of Spills)

As demonstrated in Figure 15, the top three spill causes were: 1) Unknown, 2) Human Error, and 3) Equipment Failure. The top three causes remained consistent from 2016-17 through 2019-20 and collectively represent approximately 90 percent of all spill causes. Further, based on our review of spill causes by volume, we noted that of the 5,533 small spills reported (100 gallons or less), 2,944 (53 percent) had causes identified as unknown. Although the number of unknown spill causes decreased with larger spills, approximately 94 percent of reported incidents fall into the small spill category. Therefore, for OSPR to improve its prevention activities, it is critical to identify spill causes for small spills as well as large spills.

According to OSPR, FRT staff continually update the ITD as new information is determined. OSPR is responsible for overseeing response and clean-up activities for oil spills in California's marine and inland waters. However, if a spill is outside OSPR's jurisdiction and is transferred to another agency before the incident is closed, the cause determination is not automatically shared with OSPR, and OSPR does not consistently follow up with the subsequent agency to obtain that information. OSPR coordinates with other agencies only if the outside agency requests OSPR's assistance. If OSPR was not requested to assist in the spill response, information regarding the cause or source may not be recorded in the ITD.

While OSPR’s goal is to assess and determine every spill cause and source, OSPR stated it is often unfeasible or impractical for the following reasons:

• Many spills are reported on the water, with a wide variety of sources, such as sheens (i.e., the iridescent appearance of oil on the surface of the water) and natural seepage from the seabed or shoreline.

• Places such as harbors or ports may have hundreds of vessels at any given time, making it difficult to identify a single spill source.

• Tidal currents may shift oil around the body of water.

• For smaller spills, by the time responders arrive at the site, the product may have evaporated and dissipated.

• Spills from natural seepage from the seabed or shoreline are outside OSPR's jurisdiction and assigned to other agencies.

Further, OSPR does not have an established process to determine and record a probable cause for spills that cannot be easily or definitively determined.

GC section 8670.7(e) states that the Administrator (OSPR), with assistance from other state agencies and the federal on-scene coordinator, shall determine the cause and amount of the discharge for spills. Further, OSPR's 2017-18 and 2018-19 Strategic Plan, Prevention Goals A and B, includes examining oil spill risk across multiple factors by evaluating the database of oil spill incidents and identifying root causes of spills, and allocating resources and strategies in relation to oil spill risk factors.

With approximately half the spill causes designated as unknown, the spill data is incomplete and cannot be fully utilized for Program planning and improvement purposes. Gathering complete and accurate spill data is critical for analysis and improvement of prevention strategies for the Program. Further, consistent identification of spill causes will improve the Prevention Branch’s analysis of spill trends and risks to ensure prevention activities are effectively aligned with identified high risk areas.

B. Establish Guidance For Determining Spill Response Type

OSPR does not have established policies and procedures to ensure response staff consistently determine the type of spill response required for reported spill incidents. Further, the spill factors or information considered in determining OSPR's response type were not documented. As a result, OSPR could not demonstrate the consistency or effectiveness of its spill response determination process. Based on our review of spill report data, we noted that OSPR responded physically to 1,240 small spills and by telephone to 20 large spills.

According to OSPR response staff, each spill is assessed on a case-by-case basis because each spill is unique, requiring consideration of reported spill information (e.g., spill volume, location); factors other than spill volume, however, may carry more weight. Additionally, response determinations vary depending on the individual staff member's professional judgment and experience, as well as the collective judgment and experience of the FRT. Further, response staff may rely on third-party observations and CalOES' initial spill reports to determine whether a spill response is warranted. However, without documented policies and procedures, there is no framework or consistent foundation to ensure staff make appropriate and consistent determinations regarding the type of spill response required.

Recommendations

A. Consider revising the spill analysis process to include procedures to determine probable spill causes, and record them in the ITD, for spills not easily or definitively determinable. Determine whether additional spill cause types, including a "probable" designation, should be developed.


B. Coordinate with agencies responsible for spill incidents outside of OSPR's jurisdiction to collect spill cause and source data, and input the data into the ITD, as applicable.

C. Develop formalized policies and procedures or guidance to assist staff in making physical or telephone spill response determinations. Ensure the policies and procedures include the roles and responsibilities of the FRT members.

California State Lands Commission

The Commission's Marine Environmental Protection Division (MEPD) staff oversee the marine oil terminal (MOT) monitoring and inspection program, with responsibility divided between its Northern California Field Office (NCFO) in Hercules and Southern California Field Office (SCFO) in Long Beach. Millions of barrels of oil are transferred over water, through pipelines between ship and shore, at California's 34 MOTs along the coast. MEPD staff monitor MOT transfer operations daily and enforce regulatory requirements.

The Commission’s Mineral Resources Management Division (MRMD) staff conduct comprehensive safety audits of offshore oil and gas production facilities on a five-year cycle. The audits assess the design and condition of each platform and shore-side facility, and whether the operators of the platforms are producing safely.

OIL TRANSFER MONITORING

Process Overview

MEPD uses a marine terminal priority monitoring system (PMS) to ensure the most critical and significant oil transfers are monitored. PMS ranks each vessel operating in marine waters based on the number of prior MEPD-monitored transfers, and ranks each MOT based on various risk factors. MOT operators send daily notifications of scheduled oil transfers to the Commission's monitoring inbox. Each day, an MEPD staff member assigned to monitor the inbox compiles a list of all oil transfers occurring and may contact the terminals to verify the information. Using the PMS, an oil transfer priority rating is determined and used to prioritize the monitoring workload. As shown in Figure 16, the oil transfer priority rating is based on a risk matrix, depicted in Figure 17, that considers the individual MOT and vessel priority ratings. An oil transfer priority rating of 1 is considered the highest priority for monitoring.

Figure 16: Oil Transfer Priority Rating Factors

Source: Information provided by the Commission


Figure 17: Monitoring Priority Risk Matrix for Oil Transfers

Source: Information provided by the Commission

• MOT Priority Rating (1 – 5): Factors considered include the number of transfers each year, number of spills, equipment problems, new transfer operations, or any other identified issue that impacts terminal safety. MOTs that have conducted fewer than 12 transfers in the past 12 months are assigned a Priority 1 rating.

• Vessel Priority Rating (1 – 7): The factor considered is the total number of oil transfers monitored within the last three years. MEPD utilizes the Oil Spill Prevention Database (OSPD) to automatically calculate the vessel priority rating each quarter based on transfer monitoring data recorded within the last three years. A report (Crystal Report) listing each vessel's priority rating is generated from OSPD. Although the Commission's Marine Terminal Monitoring Priority System Memorandum, Procedures No. 12201.2 (Memorandum No. 12201.2), indicates ratings of 1 through 7, the Monitoring Priority Risk Matrix does not include priority ratings 6 and 7, as shown in Figure 17 and described in Finding 3A.

• Oil Transfer Priority Rating (1 – 7): The MOT and vessel ratings are compared to the risk matrix (Figure 17) to determine the transfer monitoring priority rating. However, any vessel that has not been inspected in the last 12 months will be assigned a Priority 1 rating.
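The rating logic described above can be sketched as a table lookup with overrides. Because Figure 17's matrix values are not reproduced in this text, the matrix below is a hypothetical placeholder; only the override rules come from the report:

```python
# Sketch of the PMS rating logic described above. The matrix values
# are HYPOTHETICAL placeholders -- Figure 17's actual contents are not
# reproduced in the report text; only the override rules come from it.
HYPOTHETICAL_MATRIX = {
    (mot, vessel): min(mot, vessel)   # placeholder rule, not the real matrix
    for mot in range(1, 6)            # MOT priority ratings 1-5
    for vessel in range(1, 6)         # matrix covers vessel ratings 1-5 only
}

def mot_priority(base_rating, transfers_past_12_months):
    # MOTs with fewer than 12 transfers in the past year get Priority 1.
    return 1 if transfers_past_12_months < 12 else base_rating

def transfer_priority(mot_rating, vessel_rating, vessel_inspected_past_year):
    # A vessel not inspected in the last 12 months forces Priority 1.
    if not vessel_inspected_past_year:
        return 1
    if vessel_rating > 5:
        # Ratings 6 and 7 fall outside the matrix (see Finding 3A).
        raise ValueError("vessel rating outside matrix parameters")
    return HYPOTHETICAL_MATRIX[(mot_rating, vessel_rating)]
```

For example, `transfer_priority(mot_priority(3, 20), 4, True)` returns 3 under the placeholder rule, while a vessel rating of 6 raises an error, mirroring the matrix gap noted in Finding 3A.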

Once all the oil transfer priority ratings are determined, the daily transfer list is shared with MEPD staff to select which transfers to monitor, starting with the highest priority transfers. Certain transfer events, including hook-up, start pump, steady rate, topping off/stripping, and disconnect, will also be selected. Staff will monitor selected events to ensure compliance with oil transfer requirements and will complete a data collection sheet, monitoring checklist, and an inspection report. Oil transfer inspection data are recorded in OSPD.

Oil Spill Prevention Database

OSPD is used to track all MOTs and vessels conducting oil transfers, track regulatory compliance, monitor prevention activities, and manage daily operational tasks. Initial oil transfer information received from MOTs is entered into OSPD to record each MOT and vessel transfer activity, such as the arrival date and time, monitoring date and time, type and quantity of oil transferred, and estimated finish date and time. A daily quality assurance review is conducted at the end of the day by an assigned Marine Safety Specialist to verify the inspection information entered by MEPD staff. Further, a monthly quality assurance review is performed to reconcile the oil transfer information with monthly transfer reports from the MOTs, identifying the final quantity of oil transferred and any other updates to the transfer data initially provided.

As previously mentioned, the Crystal Report, which is generated from OSPD on a quarterly basis, automatically calculates the vessel priority rating used in the oil transfer priority rating. However, the vessel priority rating is not subsequently updated in OSPD, and Crystal Reports are not retained. As a result, the Commission is not able to demonstrate it properly monitored all critical or high risk transfers, as described in Finding 3B.

Oil Transfer Monitoring Activity

During 2016-17 through 2019-20, the Commission monitored 9,494 of 27,546 (34 percent) total transfers. Based on our review of oil transfer data, although oil transfer activity at MOTs has not changed significantly over the years reviewed, the number of oil transfers monitored declined from 3,267 (49 percent) in 2016-17 to 1,718 (24 percent) in 2019-20. See Figure 18 for detail. According to the Commission, its staff are responsible for inspection activities for both the Oil Spill Prevention Program and the Marine Invasive Species Program, as mandated by the Legislature, and conduct those activities based on daily priorities and staff availability. These dual responsibilities contributed to the reduction in oil transfer monitoring activities.

Figure 18: Comparison of Oil Transfers vs. Monitored

Source: Commission’s OSPD

As shown in Table 7, the Commission observed approximately 1.3 billion gallons (47 percent) of total oil transferred as a result of its monitoring activities. Both NCFO and SCFO provided consistent coverage by region.

Table 7: Total Quantity of Oil Monitored (in Gallons)

Location  Quantity Monitored  Quantity Not Monitored  Total Quantity Transferred  Percent Monitored
NCFO      596,845,074         713,551,411             1,310,396,485               46%
SCFO      750,480,891         794,309,342             1,544,790,233               49%
Total     1,347,325,965       1,507,860,753           2,855,186,718               47%

Source: Commission's OSPD
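Table 7's row totals and percentages can be checked arithmetically (a sketch using only the figures in the table):

```python
# Consistency check of Table 7's quantities (gallons) and percentages;
# all figures come from the report's table.
rows = {
    "NCFO": (596_845_074, 713_551_411),   # (monitored, not monitored)
    "SCFO": (750_480_891, 794_309_342),
}
totals = {name: m + n for name, (m, n) in rows.items()}
pcts = {name: round(m / totals[name] * 100) for name, (m, _) in rows.items()}

grand_monitored = sum(m for m, _ in rows.values())
grand_total = sum(totals.values())
grand_pct = round(grand_monitored / grand_total * 100)
print(totals, pcts, grand_pct)
```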


Oil Transfer Monitoring Conclusion

The Commission monitored 9,494 transfers, representing 34 percent of total oil transfers, and 1.3 billion gallons, representing 47 percent of total oil transferred, during 2016-17 through 2019-20. Overall, the oil transfer and monitoring data reviewed demonstrated the Commission monitored MOT oil transfers on a continuing basis as required by Program regulations. Additionally, based on the trends observed during our data analysis, monitoring activities were generally focused on MOTs and vessels with the highest transfer counts. However, because OSPD is not updated to reflect the most current vessel priority rating, and because supporting documentation is lacking, the Commission could not demonstrate it properly monitored all critical or high risk oil transfers, as noted in Finding 3. Additionally, due to a lack of retained records supporting priority ratings, the Commission could not demonstrate the effectiveness of its workload planning.

Finding 3: Ensure the Priority Monitoring System is Accurate and Supported to Demonstrate High Priority Transfers are Properly Monitored

The vessel priority rating in OSPD is not updated; therefore, MEPD staff do not rely on it to determine the priority rating for oil transfers. Further, the accuracy of the transfer priority ratings recorded for selected oil transfers could not be verified because supporting documentation was not available. As a result, the Commission could not demonstrate it properly monitored all critical or high risk transfers in accordance with its policies.

A. Ensure Vessel Priority Rating in the Oil Spill Prevention Database and Crystal Report is Accurate

During our data reliability testing of the oil transfer report data generated from OSPD, we determined that the vessel priority ratings were not accurate. Specifically, we selected 24 oil transfers during 2018-19 and 2019-20 and recalculated the transfer priority rating using the vessel and terminal priority ratings listed in the oil transfers report. Of the 24 transfers selected, 16 priority ratings did not agree with our recalculated ratings using the priority risk matrix (Figure 17). The majority of the incorrect priority ratings were for oil transfers with a vessel priority rating of 1, which should have resulted in an oil transfer priority rating of 1. However, the oil transfer priority ratings recorded in OSPD for the 16 oil transfers ranged from 2 to 6. Due to the inconsistencies identified between our recalculated ratings and the OSPD priority ratings, we did not use or rely on the OSPD priority ratings for vessels, MOTs, or oil transfers.

Additionally, based on our discussion with the MEPD staff who generate the Crystal Report, the report captures all vessel monitoring data from January 2008 to the report generation date. As a result, the vessel priority ratings identified in the Crystal Reports for our audit period may have included approximately 8 (2008 to 2016) to 12 (2008 to 2020) years of vessel oil transfer data, instead of the last 3 years used to determine the oil transfer priority ratings. Therefore, the vessel priority rating may incorrectly identify a vessel as lower risk than if the rating were based on the last three years.

According to MEPD staff, the vessel priority rating is entered into OSPD when the vessel enters California for the first time and is not subsequently updated. Therefore, MEPD staff rely on the quarterly Crystal Report to provide the most recent vessel priority rating for consideration when determining the oil transfer priority ratings for the upcoming quarter.

According to the Commission's Memorandum No. 12201.2, tank vessel and tank barge priority rating numbers are derived from a Crystal Report generated every quarter. The report examines the last three years of data for any tank vessel or tank barge within OSPD and assigns each tank vessel or tank barge a priority rating of 1 to 7 based on the number of transfers monitored within the last three years.
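The difference between the Memorandum's three-year look-back and the all-history behavior described in Finding 3A can be illustrated with a short sketch (the vessel history and function names below are hypothetical):

```python
# Sketch of the three-year look-back the Memorandum requires for the
# vessel priority rating, versus counting all history since 2008 (the
# behavior the audit found). The vessel history below is hypothetical.
from datetime import date, timedelta

def monitored_count(monitoring_dates, as_of, window_years=3):
    """Count monitored transfers within the last `window_years` years."""
    cutoff = as_of - timedelta(days=365 * window_years)
    return sum(1 for d in monitoring_dates if cutoff <= d <= as_of)

# A vessel monitored once per year since 2008 (hypothetical history):
history = [date(year, 6, 1) for year in range(2008, 2021)]
in_window = monitored_count(history, date(2020, 12, 31))  # last three years
all_time = len(history)                                   # everything since 2008
print(in_window, all_time)
```

Counting all history (13 monitored transfers) instead of the three-year window (3) inflates a vessel's monitored-transfer count and can push its priority rating toward lower risk, as described in the finding.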

Inaccurate oil transfer data and reporting hinder the Commission's ability to make appropriate Program decisions and to ensure accurate identification and sufficient monitoring of the most critical or high risk oil transfers.

B. Retain Documentation and Clarification to Support Effectiveness of Oil Transfer Priority Monitoring System

MEPD does not retain documentation to support its determination of priority ratings assigned to oil transfers. Therefore, the effectiveness of MEPD’s PMS and workload planning to properly monitor critical and high priority oil transfers could not be determined. To test the PMS, we selected 12 oil transfers during 2019 to confirm the transfer priority ratings by recalculating the vessel priority rating using the vessel’s monitoring data from the last three years. We did not recalculate the MOT priority ratings for the selected 12 oil transfers because MEPD does not maintain data or documentation to support the factors considered to assign the individual priority ratings.

Based upon our recalculations, 8 vessel ratings and 7 transfer priority ratings used by MEPD did not agree with our recalculated ratings, as shown in Table 8. Further, for 2 of the transfer ratings identified in Table 8, Vessel D and Vessel E, the recalculated vessel ratings of 7 and 6, respectively, were outside the priority matrix parameters shown in Figure 17, which limits vessel ratings to 5. Therefore, the transfer priority rating could not be recalculated.

Table 8: Inconsistent Vessel Ratings

Vessel Name  Arrival Date  Vessel Rating per the Commission  Recalculated Vessel Rating  Transfer Rating per the Commission  Recalculated Transfer Rating
Vessel A     2/5/2019      1                                 3                           5                                   3
Vessel B     1/18/2019     2                                 1                           2                                   1
Vessel C*    2/15/2019     4                                 3                           3                                   3
Vessel D     3/17/2019     1                                 7                           6                                   Unknown
Vessel E*    2/11/2019     7                                 6                           5                                   Unknown
Vessel F*    2/26/2019     2                                 1                           2                                   1
Vessel G     2/20/2019     1                                 2                           2                                   2
Vessel H*    2/22/2019     6                                 5                           6                                   6
Vessel I*    1/15/2019     1                                 1                           5                                   1
Vessel J*    2/3/2019      1                                 1                           5                                   1

* = Updated vessel rating provided by SCFO. Grayed areas in the original table indicate no variance between the Commission's rating and our recalculated rating.


MEPD’s NCFO and SCFO rely on the quarterly Crystal Reports to determine the transfer priority ratings for each oil transfer that will occur on a given day. The transfer priority rating can be adjusted and changed based on current information about either the vessel or MOT, such as violations, spills, historical monitoring, and/or facility construction.

According to MEPD staff, because the quarterly Crystal Report only presents vessel priority ratings based on monitoring activity to date, Crystal Reports are not retained once a new quarterly report is received. Additionally, because the report provides an assessment of each vessel at a point in time, ratings may change during the subsequent three months of the quarter depending on the vessel's monthly oil transfer activities. Further, MEPD staff also consider other current risk factors (e.g., current MOT or vessel inspection issues/violations) and adjust the priority rating based on professional judgment. However, MEPD staff do not document the rationale for adjustments made to priority ratings in the "Remarks" section of the oil transfer data entered into OSPD.

Because MEPD does not retain the quarterly Crystal Reports or document the rationale for adjustments made to priority ratings in the "Remarks" section of the oil transfer data entered into OSPD, we could not verify whether the vessel and oil transfer ratings MEPD actually used at the time of transfer were consistent with our recalculated ratings.

CCR, title 2, section 2320(a)(3), requires the Commission to monitor transfer operations at all terminals on a continuing basis. Further, the Memorandum states the purpose of the monitoring priority risk matrix is to enable a rational and systematic determination of the probable risk posed by vessels conducting oil transfers at MOTs in California. Additionally, all New to California, Priority 1, and Priority 2 vessels not monitored must have an explanatory notation in the OSPD "Remarks" section.

Without documentation of factors or conditions considered when adjustments are made to priority ratings for vessels, MOTs, and transfers, the risk of review inconsistencies increases and the Commission may not be able to demonstrate a clear basis for its selection of oil transfers to monitor. Further, lack of supporting documentation prevents the Commission from demonstrating it adequately evaluated oil transfer risks to ensure critical and high risk oil transfers were prioritized for monitoring.

Recommendations

A. Develop a process to update the vessel priority rating in OSPD. Consider the feasibility of updating vessel priority ratings in OSPD using an auto-calculation method similar to that used to generate Crystal Reports.

B. Ensure the vessel rating generated in the quarterly Crystal Report is based on the last three years of each vessel's oil transfer monitoring data, as required by the Memorandum.

C. Update the priority risk matrix to include all available vessel priority ratings (i.e., ratings 6 and 7), and update associated documents as needed to resolve inconsistencies.

D. Ensure sufficient documentation is retained to support the determination of the transfer priority rating, such as the quarterly Crystal Reports, and document adjustments to priority ratings in the OSPD "Remarks" section to maintain a sufficient audit trail. The audit trail should facilitate tracing priority rating determinations to the source files and documents used.

SAFETY AUDITS

Process Overview

MRMD conducts comprehensive safety audits of oil and gas drilling and production facilities. MRMD also analyzes the technical design of safety systems and verifies the alarms and controls are installed and operate as intended. The safety audits performed include the following areas:9

• Equipment maintenance, and corrosion prevention and inspection programs to evaluate the fitness of pressure vessels, tanks, and piping.

• Training and qualification programs at platforms and oil facilities to assure competent operation.

• Safety assessment of management systems to assess organizational safety culture, the human factor element, and the maturity level of safety programs.

• Operating manual and C-Plan reviews to evaluate the adequacy of facilities procedures for normal operation, upset conditions, and response to spill incidents.

In addition, a third-party contractor analyzes and inspects the design, maintenance, and condition of electrical distribution systems.

To implement its statutory responsibilities, the Commission’s goal is to complete a safety audit of each active10 facility once every five years. Upon completion of the safety audit, an audit report is issued to the facility and posted on the Commission’s website. An action item matrix is prepared internally to track identified audit deficiencies and the completion of corrective actions. Facilities must address deficiencies (action items), which are assigned a priority level, by specified deadlines. Priority levels are based on the risk of potential for injury, oil spill, other adverse environmental impacts, or property damage. According to MRMD staff, priority 1 consists of high risk items that are required to be corrected within 30 days of identification. However, most priority 1 action items are corrected by the facility by the time the report is issued. Priority 2 and 3 action items must be corrected within 120 and 180 days of identification, respectively, and are generally corrected after the report is issued. Upon verification that all action items have been implemented, MRMD sends an audit completion letter to the facility. See Figure 19 for an overview of the audit process.

9 Excerpts from https://www.slc.ca.gov/oil-spill-prevention/. 10 Active facility represents a facility which is currently operational.


Figure 19: Safety Audit Process Overview

Source: Information provided by the Commission

MRMD utilizes the Safety Audit History Log to track its workload and monitor the status of its safety audits. The log is an Excel spreadsheet containing current and historical data on the facilities audited and various milestone dates (e.g., initial letter, fieldwork completion, report issuance, action item log completion). However, we noted the Safety Audit History Log contains outdated and missing key information, hindering MRMD's ability to effectively plan and track its workload to ensure ongoing regulatory oversight, as described in Finding 4A.

Safety Audits Activity

During 2016-17 through 2019-20, MRMD completed six facility safety audits within each facility's five-year audit cycle, as shown in Table 9. However, audit reports were not always provided timely to the audited facilities, as described in Finding 4A. Additionally, opportunities exist for the Commission to evaluate its practices and increase the efficiency of its audit process by incorporating a risk-based approach that focuses audit resources on high risk areas, as noted in Finding 4A.

Table 9: Safety Audits Completed

Facility             Initiation Letter Issued  Audit Report Issued  Completion Letter Issued
Platform Emmy        2013                      March 2016           January 2017
Fort Apache Onshore  2014                      May 2016             May 2017
Platform Esther      2014                      October 2016         May 2017
Platform Eva         2014                      December 2016        January 2018
Montalvo             2016                      September 2017       March 2018
THUMS CRC            2017                      June 2019            July 2020

Source: Commission's Safety Audit History Log

Based on our analysis of the Safety Audit History Log, MRMD spent a significant amount of time completing the six safety audits. On average, a safety audit required 1,242 days (approximately 3.4 years) to complete, ranging from 821 to 1,461 days (approximately 2.2 to 4.0 years). See Figure 20 for the total number of days for each audit phase. MRMD's audit process does not include individual safety audit budgets or goals per audit phase, restricting its ability to monitor and evaluate its audit practices for efficiencies, as described in Finding 4A. Additionally, MRMD has experienced several staff vacancies, impacting its ability to timely complete safety audits. At MRMD's current audit completion pace, multiple audit cycles may begin to overlap, straining resources and creating the risk that future safety audits may not be completed within the upcoming five-year cycle, as detailed in Finding 4A.

Figure 20: Audit Timeline by Major Phases (in Days)

Avg: Average number of days
R: Range of days
*: "0" days indicates the audit was closed immediately after action items were corrected.
Source: Finance's analysis of the Commission's Safety Audit History Log
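The duration analysis above can be reproduced from milestone dates with a short calculation; the date pairs below are hypothetical examples, not actual Safety Audit History Log entries:

```python
# Sketch: deriving audit durations and the average from milestone
# dates, as in the analysis above. The date pairs are HYPOTHETICAL,
# not the actual Safety Audit History Log entries.
from datetime import date
from statistics import mean

# (initiation letter date, completion letter date) -- illustrative only
audits = [
    (date(2014, 3, 1), date(2017, 5, 1)),
    (date(2014, 6, 1), date(2018, 1, 15)),
    (date(2016, 2, 1), date(2018, 3, 30)),
]
durations = [(end - start).days for start, end in audits]
avg_days = mean(durations)
print(f"average {avg_days:.0f} days (~{avg_days / 365:.1f} years)")
```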

Further, facilities did not correct audit deficiencies within the established time frames for the respective action item priority level. Untimely completion of deficiency action items increases the safety risk of the facilities, as noted in Finding 4B.

Safety Audit Conclusion

During 2016-17 through 2019-20, MRMD completed all six active facility safety audits within each facility's five-year cycle. However, the audited facilities did not address all action items within the respective action item priority level time frames. MRMD has experienced several staff vacancies, which negatively impacted its ability to timely complete its required workload. Opportunities exist for the Commission to evaluate its safety audit process and improve its efficiency to help ensure Program goals and requirements are timely met. See Finding 4 for details.

Finding 4: Evaluate Safety Audit Process to Improve its Efficiency and Timely Meet Program Goals and Requirements

While MRMD completed the required number of safety audits during 2016-17 through 2019-20, opportunities exist for it to evaluate its safety audit process to improve its monitoring practices and effectively plan its workload. Additionally, MRMD should strengthen its practices to ensure audit deficiency action items are corrected by facilities within the established priority level time frames.

A. Improve Efficiency of Safety Audits

Ineffective Tracking and Workload Planning

The Safety Audit History Log, a tool used by MRMD staff to track the status of current and past safety audits, contains outdated and missing key information for workload tracking and planning purposes. Specifically:


• Facility information did not clearly identify the status of the facility, i.e., active or inactive. Specifically, of the 11 facilities listed, only 6 were active facilities that required a safety audit. Clearly identifying the facility status and separating inactive facilities from active facilities will assist in facilitating audit planning and decision-making as current and relevant information will be readily displayed in the log.

• The log does not identify when each active facility’s next five-year audit cycle is to begin. The inclusion of this date on the log will assist in the planning of current and future audit workload to meet safety audit priorities.

• Three audits required to begin in 2019 were not listed on the log. Accurate identification of required workload and deadlines will assist in identifying the resources needed to meet safety audit goals and objectives.
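The missing next-cycle dates could be derived mechanically from each facility's scheduled audit year; the sketch below uses hypothetical log fields and an invented inactive facility, not actual log contents:

```python
# Sketch: deriving each active facility's next five-year audit cycle
# year from its scheduled audit year. The log fields and the inactive
# facility below are hypothetical illustrations, not actual log data.
AUDIT_CYCLE_YEARS = 5

facilities = [
    {"name": "Platform Emmy", "active": True, "audit_year": 2018},
    {"name": "Inactive Example", "active": False, "audit_year": 2013},
]

next_cycle = {
    f["name"]: f["audit_year"] + AUDIT_CYCLE_YEARS
    for f in facilities
    if f["active"]  # inactive facilities do not require a safety audit
}
print(next_cycle)  # Platform Emmy's next cycle: 2023, matching Table 10
```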

According to MRMD staff, the existing tracking system is a legacy tool developed by prior MRMD staff. While this tool is still utilized to track audits, there are no formalized procedures providing guidance for its purpose or use, and it is not regularly updated or maintained.

Additionally, aside from the Safety Audit History Log, no additional tools are used to plan and evaluate audit progress such as audit budgets, schedules, or specific milestone dates for completion of each phase of the audit. MRMD believes focusing on its primary goal of completing the safety audits within the five-year requirement is sufficient for tracking and monitoring its workload.

The Commission provides regulatory oversight through its established Safety Audit program to ensure oil and gas production facilities comply with CCR title 2, sections 2129 through 2142, and 2170 through 2175, which address requirements for oil and gas production, drill and production pollution control, and operational manual and emergency planning. Continued use of a tool that is outdated or unreliable hinders MRMD’s ability to effectively plan and track its workload to ensure ongoing regulatory oversight. The implementation of additional metrics such as audit budget and audit phase milestone dates will facilitate MRMD’s ability to evaluate its audit practices, and identify and implement process efficiencies.

Delays in Initiation of Safety Audits May Cause Overlapping Audit Timelines and Strain on Resources

Although the Commission met its five-year audit cycle strategic goal during 2016-17 through 2019-20, at its current audit completion pace, it may not complete future safety audits within the upcoming five-year cycle. Based on the Safety Audit History Log, five facility audits scheduled to begin in 2018 and 2019 were initiated late or had not been initiated as of October 2020, as shown in Table 10. Further, because the COVID-19 pandemic postponed on-site visits, MRMD has extended the completion of audit fieldwork for the Platform Emmy and HB Onshore safety audits.


Table 10: Safety Audits Required to Start During 2018 and 2019

Facility              Audit Year   Date Initiated   Status          Next Cycle Audit Year

Platform Emmy         2018         May 2019         In Progress     2023
HB Onshore            2018         May 2019         In Progress     2023
Platform Esther       2019         Not Initiated    Not Initiated   2024
Platform Eva          2019         Not Initiated    Not Initiated   2024
Fort Apache Onshore   2019         Not Initiated    Not Initiated   2024
THUMS CRC             2022         Not Applicable   Not Applicable  2027

Additionally, based on the design of MRMD’s audit cycle timelines, there may not be sufficient time between scheduled audits to absorb delays. Specifically, delays may cause instances in the audit cycle where multiple audits will overlap, resulting in impractical timelines that may require a quick turn-around for completion of fieldwork and report issuance, and strain MRMD resources.

For example, as of October 2020, six facilities are required to have a safety audit completed: (1) Platform Emmy, (2) Huntington Beach Onshore (HB Onshore), (3) Platform Esther, (4) Platform Eva, (5) Fort Apache Onshore, and (6) California Resources Corporation (CRC) Long Beach Unit, often referred to as THUMS CRC. MRMD grouped these facilities into the following three audit engagement cycles, to be completed concurrently over the five-year cycle:

1. Platform Emmy and HB Onshore

2. Fort Apache Onshore, Platform Esther, and Platform Eva

3. THUMS CRC

Based upon MRMD’s five-year cycle, facility groups 1 and 2’s cycles may overlap during 2019 through 2021, resulting in MRMD staff conducting five safety audits concurrently. Further, during 2022, facility groups 2 and 3 may overlap, requiring MRMD to continue stretching its resources to finalize facility group 2’s three audits while initiating the audit for facility group 3, as noted in Table 11.
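The overlap risk described above can be checked mechanically. The following is an illustrative sketch only, not a tool used by MRMD; the start and end years assigned to each cycle are assumptions inferred from the report's narrative (cycles 1 and 2 overlap during 2019 through 2021, and cycles 2 and 3 overlap during 2022).

```python
# Illustrative sketch (hypothetical data): detecting overlapping audit-cycle
# windows. The year spans below are assumptions based on the narrative above.
from itertools import combinations

cycles = {
    "Cycle 1 (Emmy, HB Onshore)": (2018, 2021),
    "Cycle 2 (Fort Apache, Esther, Eva)": (2019, 2022),
    "Cycle 3 (THUMS CRC)": (2022, 2024),
}

def overlaps(a, b):
    """Two inclusive year ranges overlap if neither ends before the other starts."""
    return a[0] <= b[1] and b[0] <= a[1]

# Compare every pair of cycles and flag the overlapping ones.
for (name_a, span_a), (name_b, span_b) in combinations(cycles.items(), 2):
    if overlaps(span_a, span_b):
        print(f"{name_a} overlaps {name_b}")
```

Under these assumed spans, the check flags cycle 1 against cycle 2 and cycle 2 against cycle 3, mirroring the overlaps the audit describes.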

Table 11: Five-Year Audit Timeline

[Timeline chart spanning 2018 through 2026 not reproduced; facilities by audit cycle:]

Audit Cycle 1: Platform Emmy (11), HB Onshore (11)
Audit Cycle 2: Fort Apache Onshore, Platform Esther, Platform Eva
Audit Cycle 3: THUMS CRC

11 Audit currently in progress.


According to MRMD staff, safety audits conducted in the past typically consisted of three to four staff and one supervisor. However, due to staff turnover, MRMD currently has only two staff available to complete current and upcoming safety audits, with MRMD’s Chief of Compliance performing additional duties as acting supervisor. As of 2020-21, MRMD has one vacant Senior Process Safety Engineer (PSE) position and three vacant Associate PSE positions.

Further, the current audit program does not incorporate a risk assessment process to assist in identifying high risk areas, such as reviewing recent facility inspection records and associated violations, to more efficiently focus audit resources and reduce the time needed to complete the safety audits.

The Commission’s Strategic Plan, strategy 1.5 states that it aims to sustain a five-year cycle at all offshore and onshore marine oil production facilities, and to develop a systematic approach to its safety audits by relying on both a quantitative model and qualitative performance and risk-related data.

Staffing vacancies and audit delays increase the risk that the Commission may not complete upcoming safety audits within the required five-year cycle. With the implementation of a risk-based approach to its audits, the Commission will be better equipped to identify audit process efficiencies and effectively utilize available resources to meet its operational needs.

Untimely Issuance of Safety Audit Reports

MRMD staff spent a significant amount of time to complete its safety audits and issue reports. Specifically, for the six audits completed during 2016-17 through 2019-20, an average of 28 months elapsed from the initial letter date to the audit report date, with audit report issuance ranging from 15 to 34 months, as shown in Table 12. Further, an average of 12 months elapsed from the completion of fieldwork to the issuance of the audit report.

Table 12: Safety Audit Report Issuance Metrics

Facility              Audit Initiation   Audit Completion   Audit Duration (Months)

Platform Emmy         June 2013          March 2016         33
Fort Apache Onshore   February 2014      May 2016           27
Platform Esther       February 2014      October 2016       32
Platform Eva          February 2014      December 2016      34
Montalvo Onshore      June 2016          September 2017     15
THUMS CRC             February 2017      June 2019          28

We noted MRMD’s practice of grouping several facilities together and initiating the audits concurrently may not be effective in ensuring timely issuance of the audit reports. Based on our understanding, although an audit may have been initiated through the issuance of the initial letter, audit planning may not have begun immediately. Further, according to MRMD staff, delays in issuing prior audit reports were due to a complete revision of the audit report structure, the audit report review process, and staff vacancies. However, although reports were not issued timely through the end of 2016, the most recent two safety audit reports demonstrated a significant reduction in time from the end of audit fieldwork to report issuance, as displayed in Figure 21. Additionally, MRMD staff noted that the new audit report structure assisted in reducing the report issuance processing time.

Figure 21: Months to Audit Report Issuance after Fieldwork

While MRMD made efforts to improve the reporting phase of the safety audits, without a process to continuously re-evaluate its program and efficiency of audit phases, MRMD may not be able to address significant delays within its process to ensure timely completion of its safety audits.

B. Ensure Timely Completion of Action Items to Address Safety Audit Deficiencies

Of the six facility audits performed, completion of corrective actions to address all deficiency action items ranged from 181 to 366 days. See Table 13 for detail.

Table 13: Completion of Action Items

Facility (Report Issuance Date)     Total Number of All Action Items   Number of Days to Complete

HB Onshore (May 2015)               173                                366
Platform Emmy (March 2016)          66                                 306
Fort Apache Onshore (May 2016)      103                                365
Platform Esther (October 2016)      62                                 212
Platform Eva (December 2016)        96                                 365
Montalvo Onshore (September 2017)   236                                181
THUMS CRC (June 2019)               603                                366

The total number of all action items to be completed ranged from 62 to 603 items, which were all priority 2 and 3 risk levels. Although facilities addressed, on average, approximately 90 percent of action items on time, five of the six facilities were late in resolving the remaining action items. For example, THUMS CRC completed 79 percent of action items by the due date, but approximately 20 percent were completed late. Of the 603 action items, the majority of delayed corrective actions were for 111 priority 3 action items that were completed 30 to 90 days after the due date.


The Commission’s internal policy requires all action items to be addressed within established time frames depending on the priority level: 30 days for priority 1, 120 days for priority 2, and 180 days for priority 3.

Untimely completion of corrective actions to address safety audit deficiencies increases the safety risk of the facilities and reduces the Commission’s ability to demonstrate it maintains effective Program prevention measures.
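The policy time frames above lend themselves to a mechanical timeliness check. The following is an illustrative sketch, not the Commission's actual tooling; the function name and data shape are assumptions, while the priority-to-days mapping comes from the internal policy cited above.

```python
# Illustrative sketch (hypothetical function and data names): checking whether
# an action item was resolved within the policy window for its priority level.
# Days allowed per the Commission's internal policy cited in the report.
POLICY_DAYS = {1: 30, 2: 120, 3: 180}

def is_timely(priority: int, days_to_complete: int) -> bool:
    """Return True if the action item was completed within its policy window."""
    return days_to_complete <= POLICY_DAYS[priority]

# A priority 3 item completed in 212 days exceeds its 180-day window.
print(is_timely(3, 212))  # False
# A priority 2 item completed in 100 days is within its 120-day window.
print(is_timely(2, 100))  # True
```

Run against an action-item list like Table 13's underlying data, a check of this kind would surface the late priority 3 items the audit identified.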

Recommendations

A. Assess the Safety Audit History Log to determine if it contains all required information for audit planning and decision-making purposes. Update the log to identify which facilities are active and inactive, add a field to track each active facility’s next audit initiation year, and ensure the log is complete.

B. Develop audit budgets, schedules, or specific milestone dates for completion of each phase of the audit to plan and manage the workload and to ensure timely audit completion. Additionally, develop a multi-year audit schedule to estimate timelines for each facility.

C. Continue to re-evaluate the Safety Audit program and efficiency of audit phases. Consider implementing a risk-based auditing approach, such as completion of a risk assessment for facilities to narrow the scope of the safety audit and focus on high risk areas. Additionally, consider how results of recent facility inspections could be utilized to identify risk areas.

D. Consider assessing the current set of audit cycle facility groups to determine if the groupings should be further separated to prevent overlapping of multiple audit timelines and provide a more manageable workload.

E. Continue to monitor and follow up with facilities regarding outstanding action items to ensure deficiencies are timely corrected. Document actions taken.


PROGRAM FINANCIAL BASIS

Per barrel fees and non-tank vessel fees are the primary revenues for Fund 0320. These fees are collected and deposited in Fund 0320, and are used to fund prevention and preparedness activities of OSPR and the Commission, as required by GC section 8670.40 (e). Fund 0320 also incurred expenditures related to services provided by the California Department of Tax and Fee Administration, Office of Environmental Health Hazard Assessment, and University of California (collectively referred to as Other Departments), and assistance provided through grants to local government entities.

OSPR continuously recovers costs incurred in responding to spill incidents from responsible parties and deposits reimbursements into Fund 0321, as required by GC sections 8670.47 and 8670.53.

Fund Balances

Fund 0320

The fund balance for Fund 0320 has been steadily decreasing since 2016-17, as displayed in Figure 22. During our audit period, 2016-17 through 2019-20, Fund 0320 revenues ranged from a low of $48.5 million in 2019-20 to a high of $53.2 million in 2017-18, while expenditures ranged from a low of $49.0 million in 2016-17 to a high of $57.7 million estimated for 2019-20.

Figure 22: Fund 0320 Revenues, Expenditures, and Fund Balances (in Thousands)

Source: OSPR Fund Condition Statements
* The Commission has not completed 2019-20 year-end closing for Fund 0320, and the 2019-20 figures include estimated expenditures for the Commission’s operations.

As shown in Figure 22, beginning in 2017-18, expenditures have exceeded revenues, requiring the fund balance to supplement the shortage. As a result, the 2016-17 fund balance of $34.3 million is estimated to decline to $18.2 million in 2019-20. In accordance with GC section 8670.40 (b)(7), OSPR must provide a reasonable reserve for contingencies when determining the per barrel oil fees. To ensure that the fee is appropriate, OSPR is required to annually project revenues and expenditures over three fiscal years, including the current year (i.e., three-year projection). Based on OSPR’s current three-year projection, Fund 0320 will have an estimated negative ending fund balance of approximately $2.5 million by June 2022. To assist with the solvency of Fund 0320, the Budget Act of 2020 (Item 3600-012-0321) provided Fund 0320 a $6.5 million loan from Fund 0321, as discussed below. According to OSPR, revenues have decreased due to the declining use of oil across California, including the effects of the COVID-19 economic shutdown during 2020. OSPR projects revenues to continue to decline, which could lead to insufficient funding for OSPR’s and the Commission’s Program activities. OSPR is currently evaluating potential solutions, such as a future increase to the per barrel fee.
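The arithmetic behind a projection of this kind is a simple roll-forward of the fund balance. The sketch below is illustrative only, not OSPR's projection model: the 2018-19 and 2019-20 figures are drawn from this report (in thousands), while the 2017-18 expenditure figure is a back-solved placeholder chosen so the 2019-20 ending balance matches the report's $18.2 million estimate.

```python
# Illustrative sketch of a fund-balance roll-forward (values in thousands).
# 2018-19 and 2019-20 figures come from the report; the 2017-18 expenditure
# is an assumed placeholder consistent with the reported ending balances.
def project_balances(opening_balance, years):
    """Given (year, revenue, expenditure) tuples, return (year, ending balance)."""
    balance = opening_balance
    results = []
    for year, revenue, expenditure in years:
        balance += revenue - expenditure
        results.append((year, balance))
    return results

# Start from the reported 2016-17 ending balance of $34.3 million.
projection = project_balances(34_300, [
    ("2017-18", 53_200, 56_400),   # expenditure is a back-solved placeholder
    ("2018-19", 51_800, 55_500),   # revenue and expenditure per the report
    ("2019-20", 48_500, 57_700),   # revenue and estimated expenditure per the report
])
for year, balance in projection:
    print(year, balance)
```

Under these inputs the roll-forward lands on an 18,200 (thousand) ending balance for 2019-20, consistent with the decline the report describes.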

Fund 0321

The fund balance for Fund 0321 was low during 2016-17 through 2017-18, due to a $40 million loan to the General Fund, approved in the Budget Act of 2010 (Item 3600-011-321). Loan repayments were completed in 2019-20, with $5 million and $35 million being paid in 2016-17 and 2019-20, respectively.

As shown in Figure 23, Fund 0321 revenues ranged from a low of approximately $1.7 million in 2017-18 to a high of approximately $36.9 million in 2019-20. The 2019-20 revenue consisted of the $35 million loan repayment and $1.9 million in revenue. Expenditures ranged from a low of $1 million estimated for 2019-20 to a high of $2.5 million in 2016-17. With a June 30, 2020 fund balance of approximately $50 million, Fund 0321 is sufficiently funded. Because the Budget Act of 2020 (Items 3600-011-0321 and 3600-012-0321) approved loans totaling $36.5 million ($30 million to the General Fund and $6.5 million to Fund 0320), the fund balance should be monitored to ensure the fund maintains an appropriate fund balance for future years.

Figure 23: Fund 0321 Revenues, Expenditures, and Fund Balances (in Thousands)

Source: OSPR Fund Condition Statement ** Revenue included $5 million loan repayment *** Revenue included $35 million loan repayment


Program Fund Revenue and Expenditure Activities

Based on the 2018-19 revenue and expenditure data, our review of Program activities by Fund is described below.12

Fund 0320

Figure 24 provides detail on Fund 0320’s 2018-19 revenues and expenditures. Barrel fees collected comprised the majority of revenues, totaling $46.1 million, with the remaining $5.7 million collected from non-tank vessel fees. Expenditures totaled approximately $56 million: OSPR accounted for $35.2 million and the Commission for $14 million, with Other Departments and local government entities accounting for the remaining $6.3 million.

As previously noted, Program expenditures exceeded revenues by approximately $3 million, with the fund balance supplementing the funding shortage. Therefore, as discussed in Finding 5, OSPR and the Commission should ensure Program expenditures are supported and accurately charged to Fund 0320.

Fund 0321

Based on our review, oil spill costs are being recovered from the responsible parties. The amounts recovered are being deposited into Fund 0321 and expended on responses to oil spills, as required. As presented in Table 14, costs recovered in responding to oil spills were the majority of revenues, totaling approximately $1.8 million. Of the total expenditures of approximately $1.5 million, OSPR’s response to oil spills accounted for 99 percent.

Table 14: Fund 0321 Revenues and Expenditures for 2018-19 (in Thousands)

Revenue                       Expenditure
Cost Recoveries    $1,471     Response by OSPR   $1,436
Other Revenues        281     Administration         14
Total              $1,752     Total              $1,450

Source: OSPR Fund Condition Statement

Combined Program Expenditure Activities for Funds 0320 and 0321

Figure 25 illustrates combined Program expenditures by key activities for 2018-19 in which the Program incurred approximately $57 million of expenditures for Funds 0320 and 0321.

12 For our fund activities review, we selected 2018-19 since the impact of COVID-19 may have resulted in abnormal Program operations for OSPR and the Commission during 2019-20.


Combined Program expenditures were predominantly used for OSPR’s and the Commission’s prevention activities ($17.7 million or 31 percent of expenditures) and preparedness activities ($16.9 million or 30 percent of expenditures), as demonstrated in Figure 25. OSPR’s administrative support costs were the third significant cost category, totaling $11.1 million or 19 percent of expenditures. Approximately $1.4 million (3 percent of expenditures) was used in responding to oil spills by OSPR. However, as noted in Finding 5, the Commission’s allocation of organization-wide indirect costs was not supported by a Cost Allocation Plan (CAP), and may have resulted in indirect costs not being equitably charged to the Program and overstating Program expenditures.

Figure 25: Funds 0320 and 0321 Program Expenditure by Activity For 2018-19

Source: OSPR Fund Condition Statement

Financial Basis Conclusion

Based on our review of OSPR’s and the Commission’s fiscal records and Fund Condition Statements, per barrel fees and non-tank vessel fees are collected, deposited in Fund 0320, and primarily used to fund prevention and preparedness activities conducted by OSPR and the Commission, as required. Beginning in 2017-18, expenditures have exceeded revenues for Fund 0320, resulting in a declining fund balance. To assist the fund in remaining solvent, Fund 0321 will provide a $6.5 million loan to Fund 0320. As OSPR seeks to remedy the projected decline in the Fund 0320 fund balance, it should continue to closely monitor revenues. Both OSPR and the Commission should also ensure the effective use of revenues collected by implementing measures to improve Program operations, as noted previously in Findings 1 through 4. Further, OSPR and the Commission should ensure Program expenditures are supported and accurately charged to Fund 0320, as discussed in Finding 5.

OSPR continuously recovered costs incurred in responding to spill incidents from responsible parties, and deposited reimbursements into Fund 0321, as required. Fund 0321 is sufficiently funded with a June 30, 2020 fund balance of approximately $50 million; however, any future loans to other funds should be monitored to ensure an appropriate future fund balance level.


Finding 5: The Commission Should Ensure Cost Allocation to Fund 0320 is Equitable and Accurate

Our review of the use of Program revenues found that the Commission has not developed a CAP for 2018-19 and 2019-20 for allocating its indirect administrative costs. Additionally, Program personnel services costs allocated to Fund 0320 may not reflect its actual Program workload. Without a CAP and accurately accounting for Program personnel services costs, Program funding may be reduced at a quicker rate than anticipated.

A. Lack of Indirect Cost Allocation Plan

Although the Commission provided a CAP for 2016-17 and 2017-18, CAPs for 2018-19 and 2019-20 were not available. According to the Commission, due to the transition from the California State Accounting and Reporting System to the Financial Information System for California (FI$Cal), it has not completed the respective CAPs. The 2016-17 and 2017-18 CAPs indicated indirect administrative costs, which consist of shared organization-wide operating expenses such as rent, copiers, and general supplies, were allocated based on the number of hours charged by staff to the Program. However, the CAP did not include the specific formula for how the staff hours were used to allocate the costs.

For 2018-19 and 2019-20, the Commission allocated $549,321 and $720,051 in indirect administrative costs, respectively, based on staff hours charged to the Program. However, as noted in Finding 5B, inconsistencies were identified during personnel testing. Without CAPs for 2018-19 and 2019-20 available for review, the equitableness of indirect administrative costs allocated to the Commission’s expenditures under Fund 0320 cannot be determined.

B. Personnel Services Costs Allocation May Not Reflect Actual Program Workload

Based on our testing of 29 Commission staff timesheets for December 2019 and January 2020, we identified 9 MEPD staff timesheets where the percentage of total hours recorded for Program activity was not consistent with the allocation rate used to allocate total personnel costs to Fund 0320.13 For example, one staff member’s January 2020 timesheet indicated 14 percent of total hours spent on Program tasks; however, an allocation rate of 68 percent was used to allocate the staff member’s cost to Fund 0320. For the 9 timesheets, a total of $85,411 in personnel costs was allocated to Fund 0320, of which $44,060 was not supported by timesheets, as noted in Table 15. We also identified that 2 of the 9 timesheets were for one staff member who was on extended leave for both months, which is not an allowable expense per the Program regulations. As a result, $15,918 in associated costs was allocated to Fund 0320. Because the timesheets did not support time spent on Program related activities, equitable allocation of costs was not supported.

Table 15: Allocated Personnel Costs Variance

Total personnel costs                     $ 85,411
Total costs supported by timesheets (a)     23,491
Total costs allocated to Fund 0320 (b)      67,551
Variance (b-a)                            $ 44,060

13 Costs were included under FI$Cal cost category 510, Salary and Wages.


Beginning 2019-20, the Commission began using a six-month average allocation rate to charge monthly staff costs to Fund 0320. The rate was based on each Program staff’s time reported in the Tempo timekeeping system, in which a rate is determined by comparing the staff’s total hours charged to Program activities to total work hours per month for the prior six months, and then an average six-month rate is calculated for use in the upcoming six months. The Commission acknowledged that there were issues with the way Program staff costs were allocated to Fund 0320 and stated it would be difficult to match the costs recorded under Fund 0320 to the hours recorded in Tempo. Without a “true-up” process to adjust allocated costs periodically or at year end to account for actual personnel costs incurred, personnel costs charged to Fund 0320 may not accurately reflect the actual Program workload. Inaccurate expenditure information may prevent Commission management from making sound decisions regarding Program operations.
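The six-month average rate and the recommended true-up reduce to simple arithmetic. The sketch below is illustrative only, with hypothetical figures and function names; it is not the Commission's Tempo-based process, though the rate calculation follows the description above, and the true-up example mirrors the finding's 14 percent versus 68 percent case.

```python
# Illustrative sketch (hypothetical data and names): the six-month average
# allocation rate described above, plus the kind of periodic "true-up"
# adjustment the finding recommends.
def six_month_avg_rate(monthly_hours):
    """monthly_hours: (program_hours, total_hours) pairs for the prior 6 months.
    Each month's rate is Program hours over total hours; return their average."""
    rates = [program / total for program, total in monthly_hours]
    return sum(rates) / len(rates)

def true_up(allocated_cost, salary, actual_rate):
    """Adjustment needed so the allocated cost matches actual Program workload.
    Negative means Fund 0320 was over-charged for the period."""
    return round(salary * actual_rate - allocated_cost, 2)

# Prior six months of (Program hours, total work hours) -- hypothetical figures.
history = [(120, 160), (100, 160), (140, 160), (80, 160), (60, 160), (110, 160)]
rate = six_month_avg_rate(history)   # applied to the upcoming six months

# A month like the finding's example: 14 percent of hours on Program tasks,
# but a 68 percent rate applied to a hypothetical $10,000 monthly salary.
adjustment = true_up(allocated_cost=6_800, salary=10_000, actual_rate=0.14)
print(f"{rate:.4f}", adjustment)
```

In this example the true-up is negative, meaning the charge to Fund 0320 would be reduced at adjustment time, which is exactly the correction a periodic true-up process would capture.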

Absent a CAP and a periodic “true-up” process to accurately account for actual personnel costs, Program funding may be reduced at a quicker rate than anticipated, limiting resources available to fund designated Program activities.

The State Administrative Manual (SAM) section 9202 states that the CAP methodology should demonstrate a reasonable and equitable distribution. Specifically, the cost allocation process must produce program cost data on a timely basis, the methodology selected must be applied consistently throughout the accounting period, information provided shall be as accurate as possible, and program costs must be fully auditable (i.e., working papers or system documentation must be retained showing program cost identification, accumulation, and distribution methods).

GC section 8670.40 (e), identifies costs that are allowed to be charged to Fund 0320, which includes costs for implementing the Program. Personnel costs charged to Fund 0320 should be based on actual staff time spent on Program related tasks. Additionally, SAM section 9205 identifies methods for entities to track employee time for allocation purposes if they are not spending 100 percent of their time on a program.

Recommendations

A. Complete CAPs for 2018-19 and 2019-20 in accordance with SAM requirements.

B. Assess total indirect administrative costs charged for 2016-17 through 2019-20 to ensure total costs allocated to Fund 0320 were accurate and equitable, and determine whether an adjustment is necessary.

C. Assess total Program personnel costs charged to Fund 0320 for 2018-19 and 2019-20 to ensure total costs charged are accurate, and determine whether an adjustment is necessary. Also, implement an adjustment process to periodically true up allocated costs if the Commission continues to use the six-month average allocation rate.


APPENDIX A

We considered the following internal control components and underlying principles significant to the audit objectives:

Internal Control Component Internal Control Principle

Control Environment

• Management demonstrates commitment to recruit, develop, and retain competent individuals.

Risk Assessment

• Management identifies, analyzes, and responds to risks related to achieving the defined objectives.

Control Activities

• Management designs control activities to achieve objectives and respond to risks.

• Management designs the entity's information system and related control activities to achieve objectives.

• Management implements control activities through policies.

Information and Communication

• Management uses quality information to achieve the entity's objectives.

• Management internally communicates necessary quality information to achieve the entity's objectives.

• Management externally communicates necessary quality information to achieve the entity's objectives.


APPENDIX B

Table of Methodologies

Objective - To assess the programmatic effectiveness and financial basis of the Oil Spill Prevention, Response, and Preparedness Program and identify measures to improve Program efficiency and effectiveness. To Assess Programmatic Effectiveness

Sub-Objective Methods

A. Determine the Program’s service delivery levels and if the Program met regulatory requirements.

OSPR Drills and Exercises

• Identified relevant statutory and regulatory requirements for drills and exercises.

• Interviewed OSPR staff to obtain an understanding of the processes and procedures established for the tracking, reviewing, and approving of drills and exercises.

• Obtained reports of completed drills and exercises for the period July 1, 2016 through June 30, 2020.

• Analyzed the drills and exercises data by completing data and trend analysis to determine if plan holders conducted drills and exercises as required, the number of passed and failed drills and exercises, the number of re-tests, and to identify any trends, achievements, irregularities, and instances of non-compliance.

Oil Spill Response

• Identified relevant statutory and regulatory requirements for oil spill response.

• Interviewed OSPR staff to obtain an understanding of the processes and procedures established for the tracking and reviewing of oil spill response.

• Reviewed oil spill memorandums and reports, forms, and incident tracking screenshots to gain an understanding of established processes, procedures, and requirements.

• Obtained a report of oil spills for the period July 1, 2016 through June 30, 2020.

• Analyzed the oil spill data by completing a data and trend analysis to determine the total number of spills reported and OSPR response type, and to identify oil spill sources, causes, and the quantity of oil spilled, and to identify trends, achievements, irregularities, and instances of non-compliance.


• Evaluated whether the spill response and data gathering processes will provide consistent information for decision-making.

Commission Oil Transfer Monitoring

• Identified relevant statutory and regulatory requirements for oil transfer monitoring.

• Interviewed Commission staff to obtain an understanding of the processes and procedures established for the tracking and monitoring of oil transfers.

• Reviewed inspection reports, data collection sheets, transfer scheduler screenshots, templates, terminal transfer notification and monthly report, and risk priority matrix document to gain an understanding of established processes, procedures, and requirements.

• Obtained reports of oil transfers and monitoring activity for the period July 1, 2016 through June 30, 2020.

• Analyzed the oil transfer data by completing data and trend analysis to determine if the Commission provided continuous monitoring of all terminals and vessels, to identify the percentage of oil transfers monitored, the quantity of oil transfers monitored, and monitoring trends based on terminal location, and to identify any trends, achievements, irregularities, and instances of non-compliance.

Safety Audits

• Identified relevant statutory and regulatory requirements for safety audits.

• Interviewed Commission staff to obtain an understanding of the processes and procedures established for the tracking and completion of safety audits.

• Reviewed safety audit reports, Safety Audit History Log, action item matrix, and audit templates to gain an understanding of established processes, procedures, and requirements.

• Obtained the Safety Audit History Log and action item matrices for safety audit reports issued during the period July 1, 2016 through June 30, 2020.


• Analyzed the safety audit data by completing data and trend analysis to determine if the required audits were completed within the five-year cycle, the average time to complete an audit, to verify completion of corrective actions for audit deficiencies, to identify Commission staffing levels, and to identify trends, achievements, irregularities, and instances of non-compliance.

B. Determine if operational processes and utilization of resources address program needs.

OSPR

Drills and Exercises

• Interviewed OSPR staff to gain an understanding of the Drills and Exercises Unit’s methodology for the planning, scheduling, and tracking of workload.

• Selected 15 oil spill plans to verify completion of all drill objectives within the required 3-year cycle, that drill results were supported with an approval letter, and that equipment was properly deployed as necessary.

• Evaluated whether the Drills and Exercises Unit’s workload planning methodology will enable effective planning to ensure completion of all drills and exercises required annually. Considered and applied results from the trend and data analysis completed for audit sub-objective A.

• Evaluated OSPR’s Readiness Database tracking and reporting capabilities to determine if it adequately supports the Drills and Exercises Unit’s activities.

Commission

Oil Transfer Monitoring

• Interviewed Commission staff to gain an understanding of the oil transfer risk analysis process used to prioritize oil transfer monitoring workload and field offices’ methodology for the planning and scheduling of oil transfers to monitor.

• Selected 12 oil transfers to determine if the terminal and vessel priority ratings assigned by Commission staff were accurate and in accordance with the Marine Terminal Monitoring Priority System Memorandum, and to determine if the oil transfer priority rating was correctly determined in accordance with the monitoring priority risk matrix.

• Evaluated the Commission’s workload planning methodology for oil transfer monitoring to determine if it will enable effective planning of oil transfer monitoring to ensure high priority or high risk transfers are properly monitored. Considered and applied results from the trend and data analysis completed for audit sub-objective A.

Safety Audits

• Interviewed Commission staff to obtain an understanding of the electrical audit contractor’s role within the safety audit process.

• Analyzed historical data in the Safety Audit History Log to determine if the electrical audit portion of the audit caused delays in the completion of the Commission’s safety audits.

• Determined if the Commission would complete the required safety audits within the current five-year audit cycle using efficiency metrics calculated from the Safety Audit History Log data.

• Evaluated the Commission’s methodology for tracking and planning its safety audits workload and determined if it will enable effective planning to meet required workload. Considered and applied results from the trend and data analysis completed for audit sub-objective A.


Objective - To assess the programmatic effectiveness and financial basis of the Oil Spill Prevention, Response, and Preparedness Program and identify measures to improve Program efficiency and effectiveness.

To Assess Financial Basis

Sub-Objective Methods

C. Determine if Program revenues were utilized for key program activities and fund balances are adequate to support program expenditures.

• Identified relevant statutory and regulatory requirements for Funds 0320 and 0321.

• Interviewed OSPR and the Commission staff to obtain an understanding of established processes and procedures to track, review, and approve expenditures and revenues.

• Reviewed cost recovery invoices and supporting documents, COFR applications, certificates, and payments, and OSPR and the Commission staff timesheets to gain an understanding of established processes, procedures, and requirements.

• Obtained for the period July 1, 2016 through June 30, 2020:

o OSPR and Commission FI$Cal Transaction Logs
o OSPR Fund Condition Statements
o OSPR Incidents By Billing Date Report
o Commission Tempo Timekeeping Report

• Analyzed expenditure, revenue, and fund balance trends by reviewing OSPR’s Fund Condition Statements for Funds 0320 and 0321.

• Selected 34 OSPR and 29 Commission staff timesheets to verify staff charged Program activity to the correct Program codes and costs were accurately recorded as expenditures for Funds 0320 and 0321.

• Selected 20 COFR fees collected by OSPR to verify fees were properly reviewed and recorded for Fund 0320.

• Selected 8 oil spill cost recovery invoices to verify costs were properly supported and payment was received and recorded for Fund 0321.

• Selected 32 barrel fees collected to verify fees were properly collected and recorded for Fund 0320.


APPENDIX C

List of Acronyms and Abbreviations

AB: Assembly Bill
Act: Lempert-Keene-Seastrand Oil Spill Prevention and Response Act
CalOES: California Office of Emergency Services
CCR: California Code of Regulations
CDFW: California Department of Fish and Wildlife
CEP: Clean-up End Point
COFR: Certificates of Financial Responsibility
Commission: State Lands Commission
COVID-19: Coronavirus
C-Plan: Contingency Plan
CRC LBU or THUMS: California Resources Corporation Long Beach Unit
D&E: Drills and Exercises
FI$Cal: Financial Information System for California
Finance: California Department of Finance, Office of State Audits and Evaluations
FRT: Field Response Team
GC: Government Code
HB Onshore: Huntington Long Beach Onshore
ITD: Incident Tracking Database
MEPD: Marine Environmental Protection Division
MRMD: Mineral Resources Management Division
MOT: Marine Oil Terminals
Network: 24 Hour Communications Network
OSPD: Oil Spill Prevention Database
OSPR: Office of Spill Prevention and Response
OSPS: Oil Spill Prevention Specialist
OSRO: Oil Spill Response Organization
PMS: Priority Monitoring System
Program: Oil Spill Prevention, Response, and Preparedness Program
SB: Senate Bill
SED: Semi-Annual Equipment Deployment
SMT: Spill Management Team
TTX: Table Top Exercise


RESPONSE

State of California – Natural Resources Agency
GAVIN NEWSOM, Governor

DEPARTMENT OF FISH AND WILDLIFE
CHARLTON H. BONHAM, Director

Director’s Office

P.O. Box 944209

Sacramento, CA 94244-2090

www.wildlife.ca.gov

Conserving California’s Wildlife Since 1870

December 30, 2020

Cheryl L. McCormick, CPA
Chief, Office of State Audits and Evaluations
Department of Finance
915 L Street
Sacramento, CA 95814-3706

SUBJECT: OIL SPILL PREVENTION, RESPONSE, AND PREPAREDNESS PROGRAM PERFORMANCE AUDIT

Dear Ms. McCormick:

Thank you for the opportunity to respond to the California Department of Finance’s December 2020 performance audit of the financial basis and programmatic effectiveness of the State’s oil spill prevention, response, and preparedness program (Audit), which involved the review of the Department of Fish and Wildlife’s Office of Spill Prevention and Response (OSPR).

We appreciate the efforts of your audit team and acknowledge their professionalism and courteous interaction with our staff. While the audit and request for response are addressed to me as Administrator of OSPR, the attached responses to the audit findings were developed in coordination with my senior staff members.

If you have any questions, please contact Amir Sharifi at (916) 698-0889 or [email protected].

Sincerely,

Original signed by:

Thomas M. Cullen
Administrator, Office of Spill Prevention and Response
California Department of Fish and Wildlife

Attachment

cc: Charlton H. Bonham, Director, California Department of Fish and Wildlife

Julie Yamamoto, Assistant Deputy Administrator, Office of Spill Prevention and Response, California Department of Fish and Wildlife

Amir Sharifi, Branch Chief, Office of Spill Prevention and Response, California Department of Fish and Wildlife

Jennifer Lucchesi, Executive Officer, California State Lands Commission

Colin Connor, Assistant Executive Officer, California State Lands Commission

ATTACHMENT

California Department of Fish and Wildlife Office of Spill Prevention and Response

Responses to California Department of Finance Audit Report on the California Oil Spill Prevention, Response, and Preparedness Program

December 2020

RESPONSES TO AUDIT REPORT

As provided for in the California Department of Finance (DOF) California Oil Spill Prevention, Response, and Preparedness Program Performance Audit of December 2020, OSPR submits the following responses to specific findings and recommendations. OSPR recognizes the importance and value of periodic, independent examinations and is eager to improve its oil spill prevention, preparedness, and response programs based on this audit. We appreciate the professionalism and thoroughness of the audit team in examining and assessing our complex programs.

SPECIFIC AUDIT RECOMMENDATIONS AND OSPR RESPONSES

Finding 1: Opportunities Exist for Improvement in OSPR’s Drills and Exercises

Response:

Recommendation A: The D&E Unit has established policies and procedures to address violations and has been implementing them since their creation. To decrease the workload of vetting multiple database reports and increase the efficiency of this process, OSPR will evaluate changes to the Readiness database, and to its processes for populating the database, to improve the database’s utility.

Recommendation B: D&E Unit staff currently track completion of D&Es, identify compliance, and provide courtesy notices of regulatory requirements to plan holders. To decrease workload and increase the efficiency of this process, OSPR will evaluate changes to the Readiness database and the processes for populating the database.

Recommendation C: OSPR agrees that the Readiness database can be improved or potentially replaced with a new product in order to generate accurate reports more easily and efficiently.

Recommendation D: OSPR currently implements a process to identify required D&Es and their mandatory objectives. OSPR will again provide a detailed description of this process to DOF. Staffing calendars are currently checked periodically to verify plan holders schedule required D&Es, and Coordinators already send multiple courtesy notices reminding plan holders of regulatory requirements. Checking for mandatory objectives periodically would only add workload, since objectives do not change during a given calendar year unless an objective is failed, which is immediately acted upon by D&E Unit staff using the timeframes in the D&E regulations.

Recommendation E: OSPR does not find this recommendation to be legally feasible without enacting new regulations. Coordinators have no regulatory authority to force plan holders to schedule their D&E consistently throughout the year. D&E Unit staff can only encourage plan holders to schedule D&Es pursuant to current D&E regulations and issue violations when they fail to comply. Current workload limits outlined in D&E regulations cannot be implemented due to lack of staffing, so Coordinators make accommodations by increasing workload, often doubling the number of TTXs in a week compared to the regulatory limit.

Finding 2: Improve Methods to Identify Spill Causes and to Determine Spill Response Type

Our review of OSPR’s spill response processes and systems utilized found that the majority of spill causes were not identified in the ITD, and OSPR did not have established policies and procedures for determining the type of spill response required. Data and process inconsistencies reduce the quality of information collected and hinder the effectiveness of OSPR management decisions and improvement of Program performance.

Recommendations

A. Consider revising the spill analysis process to include procedures to determine and record in the ITD probable spill causes for spills not easily or definitively determinable. Determine if additional spill cause types, including the addition of a “probable” designation, should be developed.

B. Coordinate with agencies responsible for spill incidents outside of OSPR’s jurisdiction to collect spill cause and source data, and input into the ITD, as applicable.

C. Develop formalized policies and procedures or guidance to assist staff in making physical or telephone spill response determinations. Ensure the policies and procedures include the roles and responsibilities of the FRT members.

Response:

Recommendation A: Spills that are reported with a probable cause identified could have that reported information entered into the Incident Tracking Database (ITD) initially, with appropriate updates input following response and completion of the spill analysis process. OSPR will determine needed changes to the ITD and ITD protocols that will provide for probable cause data entry for future spills.
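As a minimal sketch of how a “probable” cause designation could be carried in an incident record and later replaced by a confirmed cause (the field names and cause values below are hypothetical illustrations, not the actual ITD schema):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical incident record; field names and cause values are
# illustrative only and do not reflect the actual ITD schema.
@dataclass
class SpillIncident:
    incident_id: str
    cause: Optional[str] = None      # reported probable cause, if any
    cause_is_probable: bool = True   # False once confirmed by spill analysis

    def confirm_cause(self, cause: str) -> None:
        """Record the confirmed cause after the spill analysis process."""
        self.cause = cause
        self.cause_is_probable = False

# Initial entry made with a reported probable cause
incident = SpillIncident("2020-001", cause="equipment failure")

# Updated following response and completion of the spill analysis
incident.confirm_cause("corroded pipeline fitting")
```

Tracking the probable/confirmed distinction explicitly would let reports distinguish spills with a definitive cause from those recorded only with a reported probable cause.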

Recommendation B: OSPR does coordinate closely with many other state, federal and local agencies with overlapping as well as distinct jurisdictions in establishment of Unified Command for a response. These agencies include U.S. EPA, U.S. Coast Guard, NOAA, California Highway Patrol, State Fire Marshal, State Water Resources Control Board, and local agencies such as the Certified Unified Program Agencies (CUPAs) tasked with hazardous material spill response. OSPR strives to obtain as well as share information regarding spill causation from and with other agencies (and will continue to do so). Such information is sometimes not legally obtainable due to confidentiality of various agency investigations.

Recommendation C: OSPR has established, formalized FRT procedures in place that include the roles and responsibilities of each FRT member. FRT staff train within their disciplines on procedures associated with their specific tasking. For example, FRT Environmental Scientists are trained in how to use tools to identify Resources at Risk.

Because every spill is different, and the spill variables are numerous and sometimes contradictory, formalized policies and procedures to determine appropriate response type would be difficult to establish and implement at the FRT level. For example, volume is a consideration in making response determinations, but if a large volume spill is to soil not associated with a waterway, or is contained, then a response may not be warranted. Conversely, a report of a light sheen in a salt marsh with endangered species would likely warrant a response. These spill variables (e.g., volume, product, location, environmental sensitivity, tides and currents, time to arrive on site, daylight hours, weather, concurrent hazards and emergencies (i.e., human rescue in progress, fire, flooding), public concern/interest) need to be considered collectively, using appropriate weighting and judgement, rather than individually as might occur using a decision tree or other similar tool. Each on-call FRT is responsible for discussing spill notifications from the perspective of each discipline and making determinations on whether and what level a response is warranted and appropriate.

The FRT does not use screening tools to limit its responses; rather, it relies on the collective knowledge and perspective of the three disciplines to decide which spills warrant a physical response. A screening tool would likely divert limited staff and resources to some spills unnecessarily, while causing other spills, especially smaller volume spills, to not receive a physical response when one was in fact warranted. OSPR is typically addressing more than a single spill report at any given time, and it is critical that FRT members be permitted to use best professional judgment in determining where resources are most effectively directed.


EVALUATION OF RESPONSE

OSPR’s and the Commission’s responses to the draft audit report have been reviewed and incorporated into the final report. Attachments to the responses were removed for brevity. We acknowledge OSPR’s and the Commission’s willingness to implement our recommendations. In evaluating each response, we note the Commission generally agreed with Findings 3, 4, and 5. After evaluating and analyzing OSPR’s response, we provide the following comments related to Findings 1 and 2, where OSPR disagrees or partially disagrees with the reported findings:

Finding 1: Opportunities Exist for Improvement in OSPR’s Drills and Exercises

OSPR disagrees with the recommendation to establish a process to identify required D&Es and objectives outstanding for plan holders at the beginning of the calendar year, and periodically throughout the year, to assist in prioritizing and planning annual D&E workload. Specifically, OSPR contends implementation of this recommendation would increase its workload, and that its current processes identify required D&Es and objectives, verify plan holders schedule required D&Es, and send multiple courtesy notices reminding plan holders of regulatory requirements. However, no additional documentation on the current processes and procedures was provided as indicated. Further, the implementation of a continuous process to identify remaining plan holders that have yet to schedule required D&Es would enable OSPR to effectively assess anticipated remaining workload and plan the use of its Program resources accordingly. Therefore, the finding and recommendations will remain unchanged.

OSPR also disagrees with the recommendation to consider establishing a process to conduct D&E consistently throughout the calendar year. Specifically, OSPR finds the recommendation to not be legally feasible without enacting new regulations. OSPR is the primary lead governing agency responsible for implementing and enforcing Program requirements. Therefore, if OSPR foresees a need for legislative change, then OSPR should consider the alternatives available to enable it to improve the efficiency and effectiveness of its D&E program operations. Consequently, we continue to recommend that OSPR proactively coordinate with plan holders in planning its annual workload to have better control over D&E scheduling. Therefore, the finding and recommendations will remain unchanged.

Finding 2: Improve Methods to Identify Spill Causes and to Determine Spill Response Type

OSPR partially disagrees with the recommendation to coordinate with agencies responsible for spill incidents outside of OSPR’s jurisdiction to collect spill cause and source data, and input into the ITD, as applicable. Specifically, OSPR states it strives to obtain and share information regarding spill causation from and with other state, federal, and local agencies with overlapping, as well as distinct, jurisdictions in the establishment of the Unified Command for a response. It further claims the actual cause of a spill is sometimes not obtainable because of legal matters. However, as noted in the finding, the GC requires OSPR, with assistance from other state agencies and the federal on-scene coordinator, to determine the cause of spills. Therefore, if OSPR is unable to obtain information about the actual cause of a spill, we continue to recommend OSPR document the known probable cause for the spill and continue coordinating with other agencies, as applicable. Therefore, the finding and recommendations will remain unchanged.

OSPR also disagrees with the recommendation to develop formalized policies and procedures or guidance to assist staff in making physical or telephone spill response determinations and to ensure the policies and procedures include the roles and responsibilities of the FRT members. Specifically, OSPR claims it has established formalized FRT procedures in place, which include roles and responsibilities of each FRT member, and relies upon best professional judgment of the members. However, documentation of this process was not provided for our review. Additionally, OSPR stated that a screening tool would likely lead to diverting limited staff and resources to some spills when it was unnecessary and cause some spills to not receive a physical response when it was warranted. However, documentation of OSPR’s response policies and procedures, including various factors to be considered, FRT roles and responsibilities, and sample case studies of previous spill responses, will enable OSPR to communicate expectations and provide examples of variables to be considered by FRT members when determining the type of spill response. This guidance could assist the FRT members in consistently making the best professional judgment regarding the type of response warranted. Therefore, the finding and recommendations will remain unchanged.