
Developing Talent: Measurement Makes the Difference
Dragos Iliescu, PhD

The context • After-Crisis Economy

Less money to go around:

HR is among the first to be impacted

A complete freeze on HR spending in some industries (~35% of businesses)

Severe to mild budget cuts in other industries (an average reduction of 13% in spending per employee)

Fragmented market:

It's a Buyer's market now (it used to be a Seller's market)

Consultancies have closed

HR departments have been re-organized

Qualified professionals are on the market

They tend to offer the same services they offered before, just in smaller (oftentimes one-man) companies

But the brand behind the person is not the same any more.

Development services in this context • Direct impact of context on development services:

Training & development budgets are reduced

Specific new demands arise towards service providers

• Pressure on talent professionals

Demand for a targeted focus on specific talent groups

i.e. development is a luxury, and is for "Talents", not for everybody

Competition with other HR professionals

e.g. recruiting & selection, climate & culture surveys & interventions

• Talent professionals have a growing need to demonstrate value for their programs

How can this be done?

Only through measurement

Development Services: From Activity-Focused to Results-Focused • Activity-focused services:

Training and development is assumed to work

Nobody questions the impact of such services

It's a "fad": it's the in thing to do - there's a training & development budget, so let's do something "nice" with it (it needs to be spent in the end, doesn't it?)

There are no specific measurable objectives: the activity in itself is "checked"

Key attitude: "I need to do some trainings"

• Results-focused services:

Nothing is assumed to work

Client has to be convinced that development (e.g. training) is the right approach: it's a sales pitch every single time

The effectiveness of the intervention is questioned and proof has to be provided

Key attitude: "I need to solve some problems" (e.g. increase engagement, maximize retention, develop key talent, build succession partnerships etc.)

Activity-Focused vs. Results-Focused (Phillips & Edwards, 2011)

Activity-Focused                                                      | Results-Focused
No business need for the program                                      | Program linked to a specific business need
No assessment of performance issues                                   | Assessment of performance effectiveness
No specific measurable objectives                                     | Specific objectives for both individual behavior and business impact
No effort to prepare program participants for results                 | Results expectations communicated to participants
No effort to prepare the work environment to support the application  | Environment prepared to support application
No efforts to build partnerships with key managers                    | Partnerships established with key managers
No measurement of results                                             | Measurement of results and ROI analysis
Planning and reporting is input-focused                               | Planning and reporting is outcome-focused

So: What's to be done? • Talent development professionals encounter pressures towards Results-focused interventions

• From a practical point of view measurement becomes pervasive

Regardless of the actual process involved, which gets more and more complicated

Both at the 'selling' and at the 'follow-up' stage

In order to sell one will have to have strong efficacy statements

In order to show impact one will have to have a financial/ROI approach

• Data for such measurements & analyses

Comes from one specific audience

Consumers of development programs: those who are developed; they provide data about satisfaction with the program, learning (increments in skills, knowledge or attitudes), or changes in behavior

Is delivered to one specific audience

Clients of development programs: those who approve, fund, and support the program

Different Levels of Measurement

Level of Measurement                             | Description
1. Reaction to & satisfaction with program       | Measures the reaction of participants to the program / the satisfaction of other stakeholders with the process
2. Learning                                      | Measures knowledge, skill, ability or attitude changes due to the program directly
3. Application & implementation of acquisitions  | Measures changes in behavior on the job
4. Business impact                               | Measures business impact changes related to the program
5. Return on investment (ROI)                    | Compares the monetary value of the impact with the cost of the program

Measurement process (Phillips, 2003)

Level 0: Preliminary input data 1. Number of employees participating in the program

2. Number of hours of learning activity per employee

3. Enrollment statistics (demographics, participation rates, completion rates etc.)

4. Investment: total cost, cost per employee, direct cost per participant, cost as percent of payroll
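To make these Level 0 investment indicators concrete, here is a minimal sketch in Python; the figures are invented for illustration only and are not from the presentation.

```python
# Level 0: preliminary input data -- illustrative figures only (assumptions, not real data)
program_cost_total = 45_000.0   # total cost of the development program
participants = 30               # number of employees participating
employees_total = 300           # headcount covered by the payroll figure
annual_payroll = 6_000_000.0    # total annual payroll

cost_per_participant = program_cost_total / participants
cost_per_employee = program_cost_total / employees_total
cost_as_percent_of_payroll = 100 * program_cost_total / annual_payroll

print(f"Direct cost per participant: {cost_per_participant:,.2f}")
print(f"Cost per employee:           {cost_per_employee:,.2f}")
print(f"Cost as % of payroll:        {cost_as_percent_of_payroll:.2f}%")
```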

Level 1: Reaction to & satisfaction with program • Most popular measurement

• Done by most trainers/development professionals

• Usually under the name of "participant feedback"

• However, measurement should not be approached in an informal, qualitative way (it is prone to bias from those who had an extremely satisfying or extremely disappointing experience)

Level 1: Reaction to & satisfaction with program 1. Relevance to the job

2. Usefulness of the program

3. Amount of new information

4. Likely to recommend to others

5. Importance of information

6. Intention to use knowledge/skills

7. Effectiveness of trainer/facilitator

8. Effectiveness of delivery method

9. ...
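As a hedge against the informal, anecdote-driven feedback warned about above, a minimal sketch of how reaction items like these could be rated on a simple 1-5 scale and aggregated; the item names and ratings below are invented for illustration.

```python
from statistics import mean

# Illustrative Level 1 data: each participant rates a few reaction items on a 1-5 scale
# (item wording follows the list above; the ratings themselves are invented).
responses = [
    {"relevance": 4, "usefulness": 5, "recommend": 4},
    {"relevance": 3, "usefulness": 4, "recommend": 3},
    {"relevance": 5, "usefulness": 5, "recommend": 5},
    {"relevance": 2, "usefulness": 3, "recommend": 2},
]

for item in responses[0].keys():
    ratings = [r[item] for r in responses]
    # Report the mean and the full distribution, so one enthusiastic or
    # disappointed participant cannot dominate the picture.
    print(f"{item:12s} mean={mean(ratings):.2f} ratings={sorted(ratings)}")
```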

Level 2: Learning • Needs a measurement of the extent to which participants acquire new knowledge, skills, competencies etc.

• This is a greater challenge, and is usually done through a follow-up process.

• Methods of measurement:

Formal/objective:

best practice is for it to be provided independently (by someone other than the service provider)

tests (especially knowledge tests),

job simulations (e.g. A&DC),

work samples

...

Informal/subjective:

self-assessment,

subjective assessment by service provider,

team assessment

...

Level 2: Learning • Depths of measurement:

Understanding of knowledge/skill

Ability to use knowledge/skill

Confidence in the use of knowledge/skill

• Best practice approach:

Pre & post-assessment

Usually reports a percentage improvement (not to be confused with the percentage of impact!)
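A minimal sketch of the pre & post best-practice approach, with invented knowledge-test scores, showing how the reported percentage improvement is computed and why it says nothing by itself about business impact.

```python
from statistics import mean

# Illustrative pre- and post-program knowledge test scores (0-100), invented for the example.
pre_scores  = [52, 61, 47, 70, 58]
post_scores = [68, 75, 60, 82, 71]

pre_mean, post_mean = mean(pre_scores), mean(post_scores)

# Percentage improvement in learning, relative to the pre-program baseline.
# This is a Level 2 figure only: it does NOT estimate the percentage of
# business impact attributable to the program (that is a Level 4 question).
pct_improvement = 100 * (post_mean - pre_mean) / pre_mean
print(f"Mean pre: {pre_mean:.1f}, mean post: {post_mean:.1f}, improvement: {pct_improvement:.1f}%")
```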

Level 3: Application & implementation of acquisitions • Measurement of the change in on-the-job behavior

• Methods of measurement:

Formal/objective:

job simulations (e.g. A&DC)

work samples

multi-rater feedback

Informal/subjective:

self-assessment,

subjective assessment by service provider,

team assessment

• Depth of measurement:

Importance of new competency in work, as applied on the job

Frequency of use of new competency on the job

Effectiveness of new competency on the job

Level 3: Application & implementation of acquisitions • Supplementary indicators:

Percent of 'action plans' completed (the extent to which participants follow through on their assignments from the program)

Barriers & enablers to the application of the new competency: context, management support/rejection, peer support/rejection

Level 4: Business impact • The measurement of business indicators:

Hard: income/productivity

Soft: satisfaction (e.g. customer satisfaction, provider satisfaction)

• Methods of measurement:

Informal/subjective:

Questions like:

Was this program a good investment for the organization?

How much of an increase in productivity do you expect from your 'new you'?

Participant estimate of program impact

Supervisor estimate of program impact

Management estimate of program impact

Expert opinion

Previous studies

Level 4: Business impact • Methods of measurement (contd.):

Formal/objective:

Using control & experimental group

Trend line analysis of performance data

Estimating/controlling for impact of other factors (usually Bayesian inference)
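For the control-and-experimental-group approach, a minimal sketch (with invented monthly sales figures) of how the program's contribution can be isolated by comparing the change in the trained group against the change in a comparable untrained group; this assumes the two groups are otherwise comparable.

```python
from statistics import mean

# Invented monthly sales per employee, before and after the program period.
trained_before, trained_after = [100, 96, 104, 102], [115, 112, 118, 117]
control_before, control_after = [ 99, 101,  97, 103], [104, 103, 106, 105]

# Change observed in each group.
trained_change = mean(trained_after) - mean(trained_before)
control_change = mean(control_after) - mean(control_before)

# The control group's change estimates what would have happened anyway
# (seasonality, market trend); the difference is attributed to the program.
program_effect = trained_change - control_change
print(f"Trained change: {trained_change:+.1f}, control change: {control_change:+.1f}, "
      f"estimated program impact: {program_effect:+.1f} per employee per month")
```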

Level 5: Return on investment • ROI: Monetary benefits vs. Cost of program

• Methods for converting data to monetary value:

Formal/objective:

Converting employee time: the employee is more qualified and works faster, so time (and money) is saved

Converting the cost of quality: the employee is more qualified and works better, so products/services improve ...

Use historical costs

Informal/subjective:

Use internal/external experts

Use data from external databases

Use participants' estimates

Use supervisor/management estimates

Use talent staff estimates
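Once impact data have been converted to money by one of the methods above, the Level 5 calculation itself is simple. A minimal sketch with invented figures, using the standard ROI convention of net monetary benefits over fully loaded program costs:

```python
# Invented, fully loaded figures for the example.
monetary_benefits = 180_000.0   # business impact converted to money (e.g. time saved, quality gains)
program_costs     = 120_000.0   # design, delivery, materials, participants' time, evaluation

benefit_cost_ratio = monetary_benefits / program_costs
roi_percent = 100 * (monetary_benefits - program_costs) / program_costs

print(f"BCR: {benefit_cost_ratio:.2f}")   # benefits returned per unit of cost
print(f"ROI: {roi_percent:.0f}%")         # net benefit as a percentage of cost
```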

Level 5: Return on investment • Intangible outcomes

their conversion is not always desired

I would recommend their conversion to monetary values, though

• Examples:

brand

employee satisfaction

customer satisfaction

employee engagement

Now let us focus on Levels 2 & 3 • What is to be done in order to measure the extent to which participants acquire new knowledge, skills, competencies etc., and apply them in their jobs, day by day?

• Well: measure before and after

What does measurement after give us?

What does measurement before give us?

Data sources in assessment

Performance vs. Potential: The 9-Box Model

Let us talk about "Performance" and "Competence"

“A competency [...] is the repertoire of capabilities, activities, processes and responses available that enable a range of work demands to be met more effectively by some people than by others.”

Bartram and Kurz (2002)

An exercise, but first: let's meet the Universal Competency Framework

The Universal Competency Framework

Great 8 (factor level)

20 Dimensions (competency level)

112 Components (behaviour level)

The Great 8 factors:

1. Leading & Deciding
2. Supporting & Cooperating
3. Interacting & Presenting
4. Analysing & Interpreting
5. Creating & Conceptualising
6. Organising & Executing
7. Adapting & Coping
8. Enterprising & Performing

The 20 dimensions, grouped under their parent factor:

1. Leading & Deciding: Deciding & Initiating Action; Leading & Supervising
2. Supporting & Cooperating: Working with People; Adhering to Principles & Values
3. Interacting & Presenting: Relating & Networking; Persuading & Influencing; Presenting & Communicating
4. Analysing & Interpreting: Writing & Reporting; Applying Expertise & Technology; Analysing
5. Creating & Conceptualising: Learning & Researching; Creating & Innovating; Formulating Concepts & Strategies
6. Organising & Executing: Planning & Organising; Delivering & Meeting Expectations; Following Instructions & Procedures
7. Adapting & Coping: Adapting & Responding to Change; Coping with Pressure
8. Enterprising & Performing: Achieving Goals & Objectives; Entrepreneurial & Commercial Thinking

SHL 'Great 8': Mapping to the 'Big 5' (and related constructs)

The diagram links each Great 8 factor to the construct that underpins it:

Leading & Deciding - Need for Control
Supporting & Cooperating - Agreeableness
Interacting & Presenting - Extroversion
Analysing & Interpreting - Cognitive Ability
Creating & Conceptualising - Openness to Experience
Organising & Executing - Conscientiousness
Adapting & Coping - Emotional Stability
Enterprising & Performing - Need for Achievement

Potential measures target the constructs; behavioural measures (behavioural interviews, 360, situational interviews, simulations) target the competencies.

Let's do the exercise!

How to assess these competencies?

Results (lag measures):
1. Performance metrics
2. Track record

Behaviours ('now' measures):
1. Behavioural observation (AC)
2. Behavioural interview (CBI)
3. Behavioural questionnaire (360)

Potential (lead measures):
1. Motives & Values ('MQ')
2. Personality traits ('OPQ')
3. Cognitive ('ability testing')
4. Simulations ('MAP')

The diagram also notes political factors and organisation strategy as contextual influences on the assessment.

Personality Assessment • Occupational Personality Questionnaire, either in a "competency" framework or in a "personality trait" framework

OPQ as "competency"

OPQ as "personality"

Thank you
Prof. Dr. Dragos Iliescu
Department of Psychology, University of Bucharest
Sos. Panduri Nr. 90, 050657 Bucharest, Romania
Tel: +031-425.34.45
Email: [email protected]