
Page 1: Storm water benchmarking

PLANNING WORKSHOP II (Infrastructure Studies)

SUBMITTED TO: Mr. Praman, Dept. of Urban and Regional Planning

SUBMITTED BY: 13011BA010, 13011BA020, 13011BA027

Page 2: Storm water benchmarking

BENCHMARKING

Benchmarking is a process of:

• Measuring the ULB's/Water Board's performance and practices in key areas,

• Comparing them with best practice,

• Subsequently translating this best practice into use,

• Leading to superior performance (performance improvement).

It is a process through which practices are analysed to provide a standard measurement ('benchmark') of effective performance within an organisation.

Page 3: Storm water benchmarking

Types of Benchmarking

Generally, there are two approaches to benchmarking: metric and process.

• Metric benchmarking is a quantitative comparative assessment using standard performance indicators that enables utilities to track internal performance over time, compare this performance against that of similar utilities, and establish target levels of performance.

• Process benchmarking involves first identifying specific work procedures to be improved through a step-by-step 'process mapping', and then locating external examples of excellence for standard setting and possible emulation.

Page 4: Storm water benchmarking

Service Level Benchmarking (SLB) has been developed and released by the MoUD. It seeks to:

• identify a minimum set of standard performance parameters for the water and sanitation sector that are commonly understood and used by all stakeholders across the country;

• define a common minimum framework for monitoring and reporting on these indicators; and

• set out guidelines on how to operationalise this framework in a phased manner.

Page 5: Storm water benchmarking

PRINCIPLES UNDERLYING THE BENCHMARKING PROCESS

The benchmarking process model is famously referred to as the “Deming cycle”; it includes a minimum of four phases: Plan, Do, Check, Act.

Deming’s Benchmarking Cycle

Page 6: Storm water benchmarking

BENCHMARKING: AN APPROACH

Identify Problem or Area for Review

Map the Process

Identify Partners & Data Sources

Collect Data

Identify Good or Best Practice

Change Existing Practice

Page 7: Storm water benchmarking

SLB Framework

Indicators: List of service level indicators for each sector.

Rationale: Why the indicator is important for the service.

Definition: Objective and mathematical definition.

Data Requirements: Data that is required to calculate the indicator.

Reliability of Measurement: What reliability of data should be targeted.

Frequency of Measurement: How often the indicator should be measured.

Level of Detail: The level of geographical detail to which the data should be available.

Target for the Indicator: The performance level that should be targeted.

(A minimal data-structure sketch of such an indicator record follows below.)
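To make the framework concrete, here is a minimal, purely illustrative sketch (in Python) of how a single SLB indicator record could be represented. The field names mirror the framework elements above, and the sample values are paraphrased from the storm water drainage coverage indicator discussed later in this deck; this is not part of the SLB handbook, only an aid to reading the framework.

```python
# Illustrative only: one possible in-memory representation of an SLB
# indicator record. Field names follow the framework elements above.
from dataclasses import dataclass


@dataclass
class SLBIndicator:
    name: str                     # indicator from the sector's indicator list
    rationale: str                # why the indicator matters for the service
    definition: str               # objective / mathematical definition
    data_requirements: list[str]  # data needed to calculate the indicator
    reliability_target: str       # reliability grade to aim for (A preferred)
    measurement_frequency: str    # how often the indicator is measured
    level_of_detail: str          # smallest geography at which data is kept
    target: str                   # performance level to be achieved


# Sample record, paraphrasing the coverage indicator described later.
coverage = SLBIndicator(
    name="Coverage of storm water drainage network",
    rationale="Estimates how much of the road network the drains serve",
    definition="(drain length / road length) * 100, in percent",
    data_requirements=["Total road length (km)", "Total drain length (km)"],
    reliability_target="A",
    measurement_frequency="Annually",
    level_of_detail="Ward",
    target="100%",
)
```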

Page 8: Storm water benchmarking

STORM WATER DRAINAGE

• Extent of the network and effectiveness of the network are emphasized to assess storm water drainage systems performance.

• As this service does not yield any direct revenues, financial sustainability is not considered.

INDICATORS:

• Coverage of storm water drainage network

• Incidence of water logging/flooding

Page 9: Storm water benchmarking

SLB FOR STORM WATER DRAINAGE

Sl. No | Proposed Indicator | Benchmark

1 | Coverage of storm water drainage network | 100%

2 | Incidence of water logging/flooding | 0

Page 10: Storm water benchmarking

1. COVERAGE OF STORM WATER DRAINAGE NETWORK

Indicator: Coverage of storm water drainage network
Unit: %
Definition: Coverage is defined in terms of the road length covered by the storm water drainage network.

Data Requirements

Data required for calculating the indicator | Unit | Remarks

(a) Total length of road network in the ULB | km | Only consider roads with a carriageway more than 3.5 m wide.

(b) Total length of primary, secondary and tertiary drains | km | Only consider drains that are trained, made of pucca construction and are covered.

Coverage of storm water drainage network | % | Coverage = (b/a) × 100
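As a worked illustration of the coverage formula above, the sketch below applies Coverage = (b/a) × 100 to invented road and drain lengths; the numbers are hypothetical and not taken from any city's records.

```python
# Hypothetical illustration of the SLB coverage calculation:
# coverage (%) = (length of eligible drains / length of eligible roads) * 100.
# Per the data requirements above, only roads with a carriageway wider than
# 3.5 m and only trained, pucca, covered drains would be counted.

def drainage_coverage(road_length_km: float, drain_length_km: float) -> float:
    """Return storm water drainage network coverage as a percentage."""
    if road_length_km <= 0:
        raise ValueError("total road length must be positive")
    return (drain_length_km / road_length_km) * 100

# Invented example: 220 km of eligible roads, 68 km of eligible drains.
print(round(drainage_coverage(220.0, 68.0), 1))  # prints 30.9 (percent)
```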

Page 11: Storm water benchmarking

RATIONALE FOR THE INDICATOR

This indicator provides an estimation of the extent of coverage of the storm water drainage network in the city. This value should be 100 percent.

Reliability of Measurement

Reliability Scale | Description of Method

Lowest level of reliability (D) | Not applicable.

Intermediate level (C) | Estimated from city road maps, not updated in the past five years.

Intermediate level (B) | Estimated from city road maps (that are detailed and to scale), which have been updated in the past five years.

Highest/preferred level of reliability (A) | Actual ground level surveys are carried out to measure drain and road length. Surveys are carried out to verify that drains are of pucca construction and covered.

Minimum frequency of measurement of performance indicator: Annually

Smallest geographical jurisdiction for measurement of performance indicator: Ward level

Page 12: Storm water benchmarking

2. INCIDENCE OF WATER LOGGING/FLOODING

Performance Indicator: Incidence of water logging/flooding
Unit: Number per year
Definition: The aggregate number of water logging/flooding incidents reported across the city in a year.

Data Requirements

Data required for calculating the indicator | Unit | Remarks

(a) Identification of flood prone points within the ULB limits; the points may be named A1, A2, A3, ..., An | Number | Flood prone points within the city should be identified as locations that experience water logging at key road intersections, or along a road length of 50 m or more, or in a locality affecting 50 households or more.

(b) Number of occasions of flooding/water logging in a year | Number per year | An occasion or incident of flooding/water logging should be counted if it affects transportation and normal life; typically, stagnant water for more than four hours at a depth of more than six inches.

The aggregate number of instances or occasions of water logging/flooding reported across the city in a year | Number per year | Aggregate incidence = (b at A1) + (b at A2) + ... + (b at An)
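To illustrate the aggregation rule above, the short sketch below sums yearly water logging occasions over a set of flood prone points A1..An; the point names and counts are hypothetical.

```python
# Hypothetical illustration of the aggregate incidence calculation:
# aggregate = sum over flood prone points A1..An of the occasions of
# flooding/water logging recorded at each point in the year.

# Invented yearly counts per flood prone point (an occasion = stagnant water
# deeper than six inches for more than four hours, affecting normal life).
occasions_per_point = {"A1": 4, "A2": 0, "A3": 7, "A4": 2}

aggregate_incidence = sum(occasions_per_point.values())
print(aggregate_incidence)  # prints 13 occasions of water logging per year
```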

Page 13: Storm water benchmarking

Rationale for the Indicator

This indicator provides a picture of the extent to which water logging and flooding are reported in the ULB within a year, which have impacted a significant number of persons as well as normal life and mobility. This indicator provides an assessment of the impact or outcome of storm water drainage systems. The benchmark value for this indicator should be zero.

Reliability of Measurement

Reliability Scale | Description of Method

Lowest level of reliability (D) | Not applicable.

Intermediate level (C) | Not applicable.

Intermediate level (B) | Based on reports/complaints filed by citizens.

Highest/preferred level of reliability (A) | Flood prone points should be first identified based on reports/complaints filed by citizens, or by direct observations, and reported into a central control room. Monitoring stations (in charge of specific jurisdictions) should regularly monitor instances of flooding in the respective wards/zones, as mentioned above. Data should be captured by time, date, location and extent of flooding.

Minimum frequency of measurement of performance indicator: Annually

Smallest geographical jurisdiction for measurement of performance: Ward level

Page 14: Storm water benchmarking

SUGGESTED FREQUENCY AND JURISDICTION OF REPORTING

1. Coverage of storm water drainage network
Frequency of measurement by ULB/Utility: Annually
Frequency of reporting within ULB/Utility: Annually
Frequency of reporting to State/Central Govt.: Annually
Jurisdiction for measurement by ULB/Utility: Ward
Jurisdiction for reporting within ULB/Utility: Ward
Jurisdiction for reporting to State/Central Govt.: ULB

2. Incidence of water logging/flooding
Frequency of measurement by ULB/Utility: Quarterly
Frequency of reporting within ULB/Utility: Quarterly
Frequency of reporting to State/Central Govt.: Annually
Jurisdiction for measurement by ULB/Utility: Ward
Jurisdiction for reporting within ULB/Utility: Ward
Jurisdiction for reporting to State/Central Govt.: ULB

Page 15: Storm water benchmarking

Indicator: Coverage
Parameter: Drain length, pucca or covered (road width of 3.5 m)
Sources of data:
1. Existing drainage map (updated).
2. Data captured by infrastructure DPRs (water, sewerage, drainage, road, etc.).
3. Physical measurement (using a bike).
Remarks:
• Only pucca, covered drains should be considered.
• A road with drains on both sides should be counted once.

Indicator: Water logging
Parameter: No. of incidents in a year
Sources of data:
1. Complaint records.
2. Other related records.
3. Physical verification during monsoon.
4. Experience of the local people.
5. Disaggregated level (supervisors).
Remarks:
• Along roads (50 m length or more) and in localities (affecting 50 HH or more).

Indicator: Incidence of storm water mixing in the sewerage
Parameter: Capacity of storm water drains.
Sources of data:
1. Disaggregated level (supervisors).
2. Complaint records.

Page 16: Storm water benchmarking

Summary of SLB Indicators – Storm Water Drainage

Source: MoUD SLB performance data.

Page 17: Storm water benchmarking

Average of Storm Water Drainage in Pilot Cities

Indicator | Benchmark | Pilot Cities Average

1. Drainage network coverage | 100% | 31.3

2. Incidence of water logging | 0 | 66.7

Page 18: Storm water benchmarking

STORM WATER DRAINAGE (case study of Bangalore)

Observations/Comments

• Exact length of tertiary drains needs to be assessed.
• Lack of regular updating of information on drains, water stagnation points, etc.
• No centralized monitoring system is in place.
• Ward level road history is to be maintained and linked to GIS.

Performance Indicator | Benchmark | Status | Reliability

Coverage | 100% | 5 | C

Incidence of water logging | 0 (number) | 135 (number) | B

Page 19: Storm water benchmarking

SWD: Areas Identified for Improvement

Indicator: Coverage
Reasons for low reliability: No comprehensive records on drains.
ISIP:
• Ward level road history register to be updated
• Developing a GIS based road database
• Maintenance of records

Indicator: Incidence of water logging
Reasons for low reliability: No records are being updated on the occurrence of flooding.
ISIP:
• Identification of flood prone areas
• Integration of traffic data and GIS based data
• Updating of records on the occurrence of flooding
• Participatory reporting of flooding incidence
• Establishment of a rain gauge recording system & integration of rainfall data

Page 20: Storm water benchmarking

Key issues and possible solutions

Key Issue: Choice of indicators and definitions
Likely Difficulties: Difficulties in arriving at a universally accepted set of indicators.
Possible Solutions:
• Choose the number and type of indicators carefully, based on relevance and usefulness to a broad majority of utilities, ease of understanding and measurability, their likelihood of being monitored, and so on.
• Customize global indicators to suit the local context while, at the same time, retaining the flexibility to allow international comparisons.
• Communicate indicators and their definitions to utilities clearly.

Key Issue: Data collection
Likely Difficulties: Availability and reliability of data can be limited.
Possible Solutions:
• Communicate indicator definitions, interpretations and their calculation to utilities clearly.
• Devise methods to arrive at broad indicators within the existing data constraints.
• Include robust quality assurance mechanisms to grade the reliability and accuracy of data.
• Improve accounting practices and put in place incentives for utilities to collect and report accurate data.

Page 21: Storm water benchmarking

Thank you