
Measurement and different uses of information

Benedict Wauters


Measurement for systems thinkers-1

• Systems measures (permanent measures):

• One-stop capability: how much of the work can be handled at the first point of contact?

• Single-piece flow: finish a job before starting another

• End-to-end time to do the work, from the customer's point of view. Variation is normal: in a call centre, the time needed to deal with a call in a satisfactory way for the client, at the level of an individual worker, depends on the nature of the call, the client's mood, the design of procedures, the worker's knowledge, the availability of information, etc.

This means that one day the same person may handle 75 calls and another day 125, yet it IS the same person every day

If the causes of variation outside the natural limits are within the team's control, the team should act on them; if they are beyond it, it is the manager's job to act on the system. Hence there is NO NEED FOR TARGETS!

• Accuracy and value from customer perspective (ask the customer)
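The "natural limits" of variation mentioned above can be estimated with a simple individuals (XmR) control chart. A minimal sketch in Python, using invented daily call counts (the 75 and 125 figures echo the example above; 2.66 is the conventional XmR chart constant):

```python
# Natural process limits for daily call counts via an XmR
# (individuals and moving-range) chart. Data is illustrative.
daily_calls = [98, 110, 75, 125, 104, 91, 118, 83, 107, 96]

mean = sum(daily_calls) / len(daily_calls)

# Average moving range between consecutive days
moving_ranges = [abs(b - a) for a, b in zip(daily_calls, daily_calls[1:])]
avg_mr = sum(moving_ranges) / len(moving_ranges)

# Conventional XmR limits: mean +/- 2.66 * average moving range
upper = mean + 2.66 * avg_mr
lower = mean - 2.66 * avg_mr

for day, calls in enumerate(daily_calls, start=1):
    signal = "special cause" if (calls > upper or calls < lower) else "common cause"
    print(f"day {day}: {calls} calls -> {signal}")
print(f"natural limits: {lower:.1f} to {upper:.1f}")
```

With this data every point, including the 75-call and 125-call days, falls within the limits: common-cause variation only, so there is nothing for the worker to answer for.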

Measurement for systems thinkers-2


SPC

The assumption that the worker is responsible for variation leads to shame for the worker, followed by cheating and more shame

There is no point in acting when values fall within the SPC limits. To improve, you need to work on the system.

Target reporting

Tells you very little: on one side sits success, on the other failure (you hit the target or you don't)* [chart: Y% bad vs X% good]

Trend data

Does not allow you to separate noise from interesting events [chart: values over time against an average line]

Andy Neely et al. How to avoid the problems of target setting, 2010, PMO symposium

Measurement for systems thinkers-3

Temporary measures (ad hoc, for redesign)

• Type and frequency of demand: once the redesign is done, workers will pick up changes, and if these are larger than normal, they will know these measures need to be taken again

• Type and frequency of "dirt" in input, e.g. incomplete forms, insufficient information,… (also from other workers in the system)

• Type and frequency of waste in flow, e.g. handovers, lost time, errors, rework…

• If predictable (recurrent), this means it is a system condition and can be changed; if unpredictable, then it is perhaps best to do nothing (some things just go wrong from time to time)

• Demand volume (how much work comes in) and capacity (how many people I have, how long they take*): measures for planning (rostering, recruitment) and stock-taking (monitoring the plan), NOT for operations management (not for monitoring workers)

*Why aren’t we all working for learning organisations? E-organisations and people May 2010. vol 17, n2, Seddon et al


• “…ensure that measures used for planning and budgeting purposes are not confused with measures used for improvement and development”.*

Andy Neely et al. How to avoid the problems of target setting, 2010, PMO symposium

*Source: Andy Neely is widely recognised as one of the world's leading authorities on organisational performance measurement and management. He has authored over 100 books and articles, including "Measuring Business Performance", published by the Economist, and "The Performance Prism", published by the Financial Times. He has won numerous awards for his research and chairs the Performance Measurement Association, an international network for those interested in performance measurement and management.

Use of performance information

1. To budget/plan

2. To control

3. To promote

4. To learn

5. To evaluate

6. To motivate

7. …ultimately, to improve


Bob Simons is the Charles M. Williams Professor of Business Administration at Harvard Business School. Over the last 30 years, Simons has taught accounting, management control, and strategy implementation courses in both the Harvard MBA and Executive Education Programs.

Robert D. Behn, Lecturer in Public Policy, focuses his research, teaching, and thinking on the leadership challenge of improving the performance of public agencies. He is the faculty chair of the School's executive program, Driving Government Performance: Leadership Strategies that Produce Results, and conducts custom-designed executive education programs for government jurisdictions and public agencies.


Budgeting/planning -1

• From budget/planning to improvement? Ensure readiness!

We need to estimate (predictable) demand AND we need to estimate our required capacity in terms of people and resources (in the diverse parts of an organisation that need to coordinate) to meet this demand

Then add reserve capacity for the uncertain demand

How? Based on how we have been doing before, given system conditions (stock-taking AFTER the fact)

Otherwise, we will NOT have the required capacities in place and will fail to satisfactorily meet demand

Budgeting is therefore an exercise in ensuring the organisation is as ready as possible, as performance will naturally be lower for unprepared organisations than for prepared ones

However, this clearly has nothing to do with setting targets
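As a rough illustration of this readiness logic, here is a hypothetical back-of-the-envelope capacity calculation in Python. The demand figure, throughput rate and 15% reserve are invented for the sketch; in practice the throughput figure would come from stock-taking of past performance, given system conditions:

```python
import math

# Hypothetical figures, for illustration only
weekly_demand = 2000          # predictable items of work per week
items_per_person_week = 90    # historical throughput, given current system conditions
reserve_fraction = 0.15       # reserve capacity for uncertain demand

base_staff = weekly_demand / items_per_person_week    # capacity for predictable demand
required_staff = base_staff * (1 + reserve_fraction)  # add reserve for the uncertain part

print(f"base staff needed: {base_staff:.1f}")
print(f"with reserve:      {required_staff:.1f}")
print(f"people to roster:  {math.ceil(required_staff)}")
```

The point of the reserve term is readiness, not a target: it budgets for the demand we cannot predict rather than committing anyone to a number.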


Budgeting / planning -2

At the macro level in government?
• usually, political priorities determine envelopes at the macro level
• performance budgeting created the expectation that overall government performance can be improved by reducing/increasing budgets in case of poor/good performance in terms of "output"
• but how would government know if past output for the money is sufficient or could be improved?

better to know whether service provision relative to its purpose (in the eyes of the customer) is under control and whether there is waste

money is not always the answer, e.g. lack of leadership in rethinking the system comes first

• only evaluation can determine which services deliver more/less outcomes and why
• government at the macro level should engage its public service providers on this!

R. Behn, Public Administration Review, 2003; M. De Jong et al., OECD Journal on Budgeting, 2013

Budgeting/planning -3

PLANS
• Demand plan: planned in-/decrease of predictable demand
• Capacity requirements + free capacity maintained (in % used of available people and facilities)
• Factoring in expected process improvements

BUDGET
• Direct operational, maintenance AND improvement costs = OPEX
• Capital expenditure (replacement/expansion of existing facilities for current business) = CAPEX
• Indirect operational cost = OPEX
• Strategic initiatives = STRATEX (can be both expenses or capitalisation)

Operational plans (with a one-year time horizon) specify how much demand is expected to be addressed by which parts of the entity.


Learning and evaluation-1

• From learning and evaluation to improving?

What is (not) working AND why*

To get a better understanding in general (enlightenment) for those involved (new insights, new ideas, changed perceptions)*

Not so easy:
• we observe only what we measure, which may not matter or may show random correlations
• different people understand the same data differently and draw different lessons from it
• measurement systems (incl. as used in evaluation) reflect what decision-makers expect to see
• however, real learning is often triggered by the unexpected. How can you design a measurement system to detect what you don't expect?

R. Behn, Public Administration Review, 2003; *"programme learning" as in M. De Jong et al., OECD Journal on Budgeting, 2013


Learning and evaluation-2

• to detect the unexpected, formal measurement systems must:

cast a wide net (a variety of measurements covering the entire system end-to-end, internal and external)

avoid excessive aggregation

• formal systems are not enough. There also needs to be informal "measurement":

this is the idea behind "management/data collection by walking around", where every story that people tell managers is the ultimate in disaggregation

• deviance is not always as obvious as a clear failure:

to detect deviances and understand whether they are worth further investigation, service providers must understand the internal and external environment (be sensitised)

furthermore, to learn from failure, there cannot be an environment that focuses primarily on assigning blame, as people will then try to hide the deviant data

• after a significant deviance is detected, a learning strategy has to be deployed that probes for causes and implications:

this will require expert knowledge and other sources of information beyond the measurement system itself

measurement itself is more likely to suggest topics for investigation than to directly impart key operational lessons

R. Behn, Public Administration Review, 2003; M. De Jong et al., OECD Journal on Budgeting, 2013


Learning and evaluation-3

Evaluation is a specific way to "measure":

• It can be used to answer the question of how well we are doing ('narrow' accountability)

Typically compares with others, or with the past

Danger of simplistic comparisons with others (ignoring context), unclear objectives and/or an inadequate "programmatic" structure (multiple constraints, inadequate resources, unreasonable timetables)

Requires looking at outcomes (relative to input), taking account of context (what is due to factors other than our action)

• It can also relate to a "needs" assessment (what is happening/changing at citizen level) irrespective of any particular public action (key for relevance)

• As with any other measurement, in evaluation "deviance" from what we expect is only a starting point to probe further so we can learn and improve

R. Behn, Public Administration Review, 2003; M. De Jong et al., OECD Journal on Budgeting, 2013


Learning and evaluation-4

Targets offer little scope for detecting interesting deviations (that help us probe further):

• Hit them / Did not hit them

"experience shows that most operational performance indicators deliver little, if any value in terms of actionable insights unless they are charted… in time-series format"*

• The SPC chart was one example: useful for continuously improving current operations

• …but there are others, like Sensemaker: more useful when we go into development mode for wider systemic change (learn from doing, experimentation)

*Andy Neely et al. How to avoid the problems of target setting, 2010, PMO symposium


Control-1

• Control to improve? Establish standards, in principle on the process; however, there are some considerations:

• sometimes it is not easy/possible to observe (complicated) processes (e.g. a manager) and we would rather measure output

• or monitoring the process, however tightly, does not mean the output will necessarily be satisfactory (e.g. we can observe a researcher reading, writing, discussing etc., but whether/how this leads to a good piece of research in the end is not so certain), so we want to measure the output anyway

• or it costs a great deal less to monitor the output rather than the process

R. Behn, Public Administration Review, 2003

R. Simons, Performance measurement and control systems, ch. 4, 2000


Control-2

• if one wants to constrain experimentation/innovation in the process as much as possible by tightly controlling the process (e.g. where safety is an issue, or where tinkering in only one part of the process can have serious consequences for downstream parts, etc.), one cannot wait for the output to exercise control

• when neither process nor output can easily be controlled, we need to resort to "input control", usually in the form of people, through the recruitment process (e.g. highly capable individuals of high integrity)

but...*

• "standards" are derived from production environments, where they are needed for interoperability/integration

e.g. tolerances for how much the front-window opening of a car can deviate from the norm before fitting the glass becomes a problem!

• in most services, the "standard" is continuously set by the users, with variation to be absorbed by staff in a co-creating process

if not, it may be best to automate with ICT (e.g. license plates)

*John Seddon


Promote-1

• To promote: convince others (politicians, journalists, citizens, …) that we are doing a good job

• Performance info can be used to "legitimise"*: rationalise, justify, validate courses of (past/present/future) action and decisions (incl. on budgets)

• It can be used to "reassure"* that government is doing what it is supposed to with taxpayers' money (also called "transparency")

• It can be used to "show compliance"* with regulations regarding performance management

• It can, through the above, unlock extra resources (e.g. earned autonomy, extra budgets, attracting dedicated people, …) and hence improvement

This requires easily understood measures that the particular stakeholders care about

• e.g. for a DMV (department for motor vehicles), it is more important to show how long you have to wait to get a check (compared to waiting, perhaps, for other services) than to show accident rates due to malfunctioning cars

R. Behn, Public Administration Review, 2003; *M. De Jong et al., OECD Journal on Budgeting, 2013

Promote-2


Using performance information for “promotion”?


Motivate

• To motivate, the conventional (but unfortunately not entirely correct) wisdom is as follows:

Set stretch goals: targets
• on outputs, to stimulate improving internal processes: under direct influence, allows quick feedback
• after outputs are covered, also include outcome targets to stimulate working across boundaries

however: long time lags, many influences

Next, celebrate achievement (usually not failure)
• Leaders lead the celebration and ensure renewed focus

R. Behn, Public Administration Review, 2003; M. De Jong et al., OECD Journal on Budgeting, 2013


This is where the use for budgeting/ planning happens

Here we use it for control, evaluation, learning, promotion

Trouble in measurement paradise?


Purpose | Bias in terms of use

Budgeting/planning | None: we want the information to be as accurate as possible, to be as well-prepared as possible.

Learning | None: we want the information to be as accurate as possible in order to learn.

Motivation | Managers want to "stretch" their expectations for staff, which leads staff to understate current performance. This detracts from accuracy.

Control | We want to know if standards are being met. This conflicts with the desire to motivate people to "stretch", as well as with the desire to have an accurate picture of real (not within-tolerance) performance levels.

Evaluation | People have an interest in meeting targets, if they are salient, in any way possible (including undesired ways). This detracts from the "accuracy" requirement. On the other hand, many factors beyond staff's control contribute to or detract from observed performance and should be adjusted for in order to evaluate staff accurately.

Promotion | The organisation has an interest in setting low expectations that can surely be met, to maintain credibility. This again conflicts with accuracy.

R. Simons, Performance measurement and control systems, ch. 4, 2000

What about motivation?


The trouble with targets and motivation


SMART
• specific: target a specific area for improvement
• measurable: quantify, or at least suggest an indicator of progress
• assignable: specify who will do it
• realistic: state what results can realistically be achieved, given available resources
• time-related: specify when the results can be achieved

Source: Doran, 1981

The trouble with targets and motivation


What does the scientific evidence tell us?

The trouble with targets and motivation


Benedict Wauters

SMART or not: are simple management recipes useful to improve performance in a complex world? A critical reflection based on the experience of the Flemish ESF Agency.

Paper submitted for the conference on “Prestaties van organisaties in de publieke sector: van wegen naar gewicht verliezen” / “Performance of public sector organisations: from weighing to

losing weight.”

Politicologenetmaal, Gent, 30 and 31 May 2013

Further reading

The trouble with targets and motivation

1. Goals setting matter for performance

2. What we measure gets done, including by cheating, gaming etc.

3. A picture says more than 1000 words (or some numbers)

4. Goals motivate, but what really counts in a complex world is the type of motivation


1. Goals matter for performance

• Challenging, quantified goals lead to more focus, energy, persistence, smarter work and hence… performance!

• …but there also needs to be commitment to the goal:

because one believes one can achieve it (self-efficacy): helped by role models, training, … and by leaders who communicate trust and help people reflect about HOW to get things done

because one believes it is important: helped by a broader vision and public commitment to it from a supportive leader, participation in goal setting, and explaining the rationale of the goal


1. Goals matter for performance

• …and positive feedback is crucial:

Explain what people did right, which again strengthens self-efficacy…

…in combination with knowing what the obstacles are and how to deal with them

• ….but if the limits of what one is able (capacities) and allowed (environment, e.g. work overload) to do are reached, performance drops again

"The assignment of ambitious goals without any guidance on ways to attain them often lead to stress, pressures on personal time, burnout, and in some instances unethical behavior. It is both foolish and immoral for organizations to assign stretch goals, and then fail to give employees the means to succeed..." Seijts et al (2005)


Don’t set challenging goals if you do not know how to reach them

1. Goals matter for performance


1. Goals matter for performance

• “A quota (output target) is a fortress against improvement of quality and productivity. I have yet to see a quota that includes any trace of a system by which to help anyone to do a better job. A quota is totally incompatible with never-ending improvement. There are better ways…“

E. Deming

William Edwards Deming was an American statistician, professor, author, lecturer and consultant. He is perhaps best known for the "Plan-Do-Check-Act" cycle popularly named after him. In Japan, from 1950 onwards, he taught top management how to improve design (and thus service), product quality, testing, and sales (the last through global markets) through various methods, including the application of statistical methods. Deming made a significant contribution to Japan's later reputation for innovative, high-quality products and its economic power. He is regarded as having had more impact on Japanese manufacturing and business than any other individual not of Japanese heritage.


1. Goals matter for performance

• Two (very important) caveats:

the theory operates only at the individual task level:
• it is not possible to determine, at an aggregate level, the constraints from the environment nor the capacities of all people for all tasks

in many (probably most) cases, learning goals rather than performance goals are required!
• e.g. not "ensure x% more participants with a job after 6 months", but "search for and try 5 strategies to…"
• these are also challenging and quantitative!


Target setting at the aggregate level…

1. Goals matter for performance


In many cases however, HOW to achieve a goal is all but clear and you need to set a learning goal

1. Goals matter for performance


1. Goals matter for performance

• Learning goals: when?

"tasks for which minimal prior learning or performance routines exist, or tasks where strategies that were once effective suddenly cease to be so, relocate the purpose of or benefit of goal setting from one of primarily motivation to that of knowledge acquisition, environmental scanning, and seeking feed-back"

"Meta-cognition is particularly necessary in environments with minimal structure or guidance"

• Why? Because with a performance goal, "Yes, I can" (self-efficacy) quickly becomes, in the above context, "No, I can't", which leads to putting in less effort and taking less risk (needed here!) than with a learning goal

• The issue then becomes how you can determine in practice (rather than in a controlled experiment) whether a learning goal is required. Perhaps most of the time?


1. Goals matter for performance

• Some more caveats / remarks:

A higher-level, abstract goal helps individuals to set coherent targets for themselves (not conflicting with the targets set by others for themselves); and if a leader publicly commits to it, it also helps staff commitment to targets that contribute to it

Goals should always be formulated as a challenge, never a threat

If the level of performance is already high, there is less scope for improvement; high performers should be left alone in setting their targets

2. What gets measured, gets done…

“There is some evidence that targets and such “carrots and sticks” work, particularly if the desired outcome is focussed and measurable, as in the case of hospital waiting times…

…The two assumptions underlying such governance structures don’t hold for public service delivery, however: measurement error is an inherent problem, as is the resultant potential for undesired as well as desired responses, and the evidence bears this out.”

UK ESRC, 2010


2. What gets measured, gets done…

4. Understanding motivation

• intrinsic motivation is "the natural tendency manifest from birth to seek out challenges, novelty and opportunities to learn"

• …but this natural tendency is easily frustrated if there is no supportive environment for: autonomy, mastery, relatedness (Maslow, anyone?)


E. Deci is professor of Psychology and Gowen Professor in the Social Sciences at the University of Rochester, and director of its human motivation program. He is well known in psychology for his theories of intrinsic and extrinsic motivation and basic psychological needs. With Richard Ryan, he is the co-founder of self-determination theory (SDT), an influential contemporary motivational theory. Self-determination theory is a macro theory of human motivation that differentiates between autonomous and controlled forms of motivation; the theory has been applied to predict behavior and inform behavior change in many contexts including: education, health care, work organizations, parenting, and sport (as well as many others).

4. Understanding motivation


[Motivational continuum, from controlled to autonomous:]
• Don't do it, or do it on automatic pilot / zombie mode
• Do it for external punishment/reward
• Do it for glory, pride, avoiding shame and fear
• Do it because you recognise it as important
• Do it because it fits with who you are
• Do it because you find it enjoyable as such

4. Understanding motivation


[Same motivational continuum as above.]

A student studies statistics for a reward from parents, or to avoid feeling ashamed…

…or a student studies statistics because he/she sees the importance of it for the intrinsically valued activity of empirical psychology and because it is congruent with who he/she is.

4. Understanding motivation


[Same motivational continuum as above.]

"autonomous motivation is associated with more effective performance on relatively complex tasks, whereas there is either no difference or a short-term advantage for controlled motivation when mundane tasks are involved."

4. Understanding motivation

• “Taken together, studies in organizations have provided support for the propositions that autonomy supportive (rather than controlling) work environments and managerial methods promote basic need satisfaction, intrinsic motivation, and full internalization of extrinsic motivation, and that these in turn lead to persistence, effective performance, job satisfaction, positive work attitudes, organizational commitment, and psychological well-being” Gagné et al (2005)


4. Understanding motivation

• Crucial, therefore, is the environment:

Autonomy-supportive: empathic, allows self-initiative and choice, gives a rationale if choices are limited, no excessive pressure, positive feedback, support for the growth of competences, a good relational base…

Controlling: very salient rewards, deadlines, controlling language, naming and shaming, etc.


4. Understanding motivation

• What about punishment/reward…?

Generally bad for intrinsic motivation, because it is experienced as controlling

No effect for routine, dull tasks, because there is no intrinsic motivation to destroy; still, it is better to give a broader purpose, to promote internalisation

Negative for complex tasks


4. Understanding motivation

Unexpected (ex post) rewards are good for intrinsic motivation. Why? …

• …only in an autonomy-supportive context and if perceived as just (no competitiveness), because they convey information on competence…

• …but less so than positive feedback, which is a verbal (in principle unexpected) reward in the same context

• negative feedback leads to amotivation

• Monitoring and evaluation for PMOs?

If you measure and use information in the way just discussed for your own service provision towards delivery partners…

…would it not make sense to support delivery partners in doing the same towards their users?
