
Actionable Metrics for Continuous Improvement:

Balanced Scorecard Survey Tools

Alexis Naiknimbalkar, CSU Chancellor’s Office

Angela Song, UC San Diego

Key topics for today

1. Introductions

2. CSU-UCSD Collaboration

3. Quick view of the Balanced Scorecard

4. The UC San Diego Surveys and Analytics Program

5. Overview of the Surveys

6. How the Survey results are used to drive improvements

7. The CSU – UC Collaboration Journey begins!

8. Next steps

The CSU-UC collaboration

• Background

• The collaboration

• UC San Diego’s surveys

• Using these standard questions enables benchmarking and sharing of best practices between universities and departments

First… a quick history of UCSD’s Balanced Scorecard

What is the Balanced Scorecard?

History:

• Developed in 1992 by Robert S. Kaplan and David P. Norton

• More than half of major companies in the US, Europe and Asia are using balanced scorecard approaches

• Harvard Business Review rates the Balanced Scorecard as one of the most influential business ideas of the past 75 years.

Four Perspectives:

1) Financial/Stakeholder

2) Internal Processes

3) Customer

4) Innovation and Growth


Purpose:

• A framework for performance measurement, communication, and management tied to strategy

• Provides a roadmap on where we should focus energies, priorities, and resources

• A holistic view of the organization

The Survey Program’s Evolution at UC San Diego

1993 – 16 Business Affairs units begin using the balanced scorecard approach

1994 – Customer Satisfaction Survey

1995 – Student Satisfaction Survey

1997 – Staff Climate Survey

2003 – Surveys go online

2004 – Surveys expand to all campus

2007 – First study on diversity and staff satisfaction

2011 – Scalable advanced analytics and reports

2012 – UC San Diego’s 1st strategic plan

2014 – Surveys are fully scalable

Program scope over time:

1994-2004: UC San Diego – VC Business Affairs

2005-2015: UC San Diego – All VC Areas

2016-beyond: UC San Diego, UC Irvine (OIT), UC Riverside (BAS), CSU Chancellor’s Office (BAF)

Milestones along the way:

UC San Diego inducted into the Balanced Scorecard Hall of Fame

Key Performance Indicators identified and benchmarked

Mission and vision aligned with strategy

Customer service, department outreach, and efficient internal processes are an expectation and norm

Surveys provide actionable data for continuous improvement initiatives and in support of the campus strategic plan

Best practices and benchmarking opportunities

LET’S TALK ABOUT SURVEYS FOR A MOMENT

Surveys are fun!

5 most bizarre survey finds (from oddee.com)

❶ 23% thought “MP3” was a Star Wars robot (Vouchercloud.net)

❷ 51% of surveyed Americans think stormy weather “affects” cloud computing (Wakefield Research, 2012)

❸ 1 in 4 Americans thinks the Sun goes around the Earth (NSF, 2012)

❹ The average American thinks they’re smarter than the average American (YouGov, 2014)

❺ Most Americans (75%) don’t trust survey results (Kantar data investment management poll, 2013)

Surveys give us a way to understand people’s attitudes, feelings, and behaviors

Do I think this is a good place to work? Will I do my best to contribute, or will I just coast? How will I talk about it to others inside and outside the university? Do I feel valued? Do I feel I am making a difference? Am I feeling engaged and inspired, or bored and just clocking my time? Does X department even care if they are helping or getting in the way of my work? ……

Four main reasons to survey customers

1. Identify and fix problems

2. Assess the performance of services

3. Improve processes

4. Understand needs for a better overall experience

Uncover Answers

Evoke Discussion

Data-based Decisions

Compare Results

Q. Why do (administrative support areas) survey in a university setting?

A. We should know if we are helping to support the mission of the university.

How would a leader know if he/she is meeting these needs?

IT’S ALL ABOUT THE DATA

Getting good data from people: psychometrics, not metrics

Psychometrics – getting good data from people (because we can’t read their minds!)

Psychometrics is a field of study concerned with the theory and technique of psychological measurement.

Construct and validate assessment instruments (questionnaires, tests, personality tests)

Statistical measurement theory

Psychometric research involves two major tasks:

1) Create instruments that are valid

2) Develop procedures to measure
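As a concrete illustration of the first task (creating instruments that measure consistently), here is a minimal sketch of one common psychometric check, Cronbach’s alpha for internal-consistency reliability. This is not the presenters’ code; the question names and responses below are invented.

```python
# Minimal sketch (hypothetical data): Cronbach's alpha as an internal-consistency check.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    k = items.shape[1]                          # number of items in the scale
    item_vars = items.var(axis=0, ddof=1)       # variance of each individual item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Toy example: four 1-5 rating items answered by six respondents.
demo = pd.DataFrame({
    "q1": [4, 5, 3, 4, 2, 5],
    "q2": [4, 4, 3, 5, 2, 5],
    "q3": [3, 5, 2, 4, 3, 4],
    "q4": [4, 5, 3, 4, 2, 5],
})
print(round(cronbach_alpha(demo), 2))  # values near 1 indicate consistent items
```

By convention, alpha values around 0.7 or higher are usually read as acceptable internal consistency for a multi-item scale.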

Methodology – Design Customer Satisfaction Survey

Standard 8 rating questions for all services, with up to 5 customized questions per service area:

1. Overall satisfaction

2. Understands my needs

3. Accessible

4. Responsive

5. Resolves issues

6. Knowledgeable/professional/courteous, etc.

7. Effective use of Blink (info sharing website)

8. Moving in positive direction

Stop, save, and finish later

Confidential responses

All staff and faculty invited

Message: Help us help you fulfill the mission of the university!

Methodology – Design Staff@Work Survey

Tested for internal reliability, conducted Factor Analysis

53 questions measure 4 dimensions:

1. UC San Diego overall

2. Department effectiveness (diversity, mission)

3. Supervisor effectiveness

4. Employee effectiveness

Equity, diversity, and inclusion questions for comparisons

Message:

• Are you a satisfied UCSD employee?

• Would you recommend working here to others?

Regression analysis to predict what drives satisfaction (a sketch follows below)

Participation rate: 100% for some areas, 56% overall

“Yes, it really is Anonymous”
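A rough sketch of the “regression analysis to predict what drives satisfaction” idea: regress an overall-satisfaction item on other survey items and rank the standardized coefficients. This is a hypothetical example rather than the UC San Diego analysis; the column names and toy data are invented.

```python
# Hypothetical sketch: which items "drive" overall satisfaction, via OLS regression.
import pandas as pd
import statsmodels.api as sm

# Toy data: 1-5 ratings from eight respondents (invented for illustration).
df = pd.DataFrame({
    "overall_satisfaction":      [5, 4, 3, 4, 2, 5, 3, 4],
    "dept_effectiveness":        [5, 4, 3, 4, 2, 5, 2, 4],
    "supervisor_effectiveness":  [4, 4, 3, 3, 2, 5, 3, 4],
    "employee_effectiveness":    [5, 3, 4, 4, 3, 5, 3, 3],
})

# Standardize so the coefficients are comparable across items.
z = (df - df.mean()) / df.std(ddof=0)
X = sm.add_constant(z.drop(columns="overall_satisfaction"))
model = sm.OLS(z["overall_satisfaction"], X).fit()

# Larger standardized coefficients suggest stronger drivers of satisfaction.
print(model.params.drop("const").sort_values(ascending=False))
```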

Annual Campus Survey Overview

Staff@Work (i.e., employee engagement)

• 18 years of data

• A campus-wide anonymous survey of staff, with invitations sent by the Chancellor

• Eight VC areas - 450 units participate

• 824 Verbatim Comments

• “Who made a difference in creating a positive work environment?”

• Net Promoter Score (NPS)

• 56% responded campus-wide; participation within departments reaches up to 100%

Student Customer Satisfaction

• 20 years of data

• Departments voluntarily opt in to be rated

• 44 student service units/programs currently evaluated

• 20,351 Verbatim Comments

• One “burning question”

• Net Promoter Score (NPS)

• 4,884 undergraduate and graduate students participated

Faculty and Staff Customer Satisfaction

• 21 years of data

• Departments voluntarily opt in to be rated

• 80 service units/programs currently evaluated

• 3,773 Verbatim comments and suggestions for improvement

• One “burning question”

• Special Recognition for Customer Service

• Net Promoter Score (NPS) (see the calculation sketch below)

• 36% responded with varying participation per rated unit
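All three surveys above report a Net Promoter Score (NPS). Here is a minimal sketch of the standard NPS calculation, assuming 0-10 “would you recommend” ratings; the example responses are invented.

```python
# Minimal sketch: standard Net Promoter Score from 0-10 "would you recommend" ratings.
# Promoters rate 9-10, detractors rate 0-6; NPS is the percentage-point difference.
from typing import Iterable

def net_promoter_score(ratings: Iterable[int]) -> float:
    ratings = list(ratings)
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

# Example with a small batch of hypothetical responses.
print(net_promoter_score([10, 9, 8, 7, 6, 9, 10, 3]))  # -> 25.0
```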

Congratulations! Here are your survey results. Good luck!

Analytics? Sure! Here you go.

Not cool.

IT’S ALL ABOUT making meaning of THE DATA

Customer Satisfaction Survey Reports: Who, What, Where, When, Why

Trend analysis shows “When”: how scores change over time

Heat maps make it easy to identify “Where” we should dig deeper

Descriptive statistics provide the basic “What”

Correlational analysis identifies drivers of satisfaction and starts the conversation about “Why” the scores vary

Drill-downs show “Who” needs attention: which departments to focus outreach on, or to study for best practices

The Customer Satisfaction Survey heat map quickly identifies strengths and opportunities
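A hedged sketch of how a heat map like this could be built from survey data, using a unit-by-item grid of mean ratings. The unit names, item names, and scores below are invented; this is not the actual UC San Diego reporting code.

```python
# Hypothetical sketch: heat map of mean rating per service unit per survey item.
import pandas as pd
import matplotlib.pyplot as plt

responses = pd.DataFrame({
    "unit":  ["Payroll", "Payroll", "Parking", "Parking", "IT Help", "IT Help"],
    "item":  ["Responsive", "Resolves issues"] * 3,
    "score": [4.2, 3.8, 2.9, 3.1, 4.5, 4.4],   # mean ratings (invented)
})

# Pivot into a unit x item grid.
heat = responses.pivot(index="unit", columns="item", values="score")

fig, ax = plt.subplots()
im = ax.imshow(heat.values, cmap="RdYlGn", vmin=1, vmax=5)  # red = low, green = high
ax.set_xticks(range(len(heat.columns)))
ax.set_xticklabels(heat.columns, rotation=45, ha="right")
ax.set_yticks(range(len(heat.index)))
ax.set_yticklabels(heat.index)
fig.colorbar(im, label="Mean rating (1-5)")
plt.tight_layout()
plt.show()
```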

Staff@Work: A picture can say a thousand words…

Arrows indicate positive or negative movement, and statistical analysis informs you of significant trends

Descriptive statistics

Impact analysis: these are the items where people are saying, “I am not as happy about these things and they are also very important drivers of my satisfaction”

Interpreting the impact analysis report

Correlation means relationship. The closer to 1.0, the stronger the relationship between satisfaction and the survey item

Low rating, high correlation (Primary Opportunities): “I am not as happy about these things AND they are also things that are important to me in impacting my level of satisfaction.” Focus on these things to address!

Low rating, low correlation: “I am not as happy about these things BUT they are not things that are as important to me in impacting my level of satisfaction.” Keep an eye on these if they move into Primary Opportunities.

High rating, high correlation: this quadrant shows the survey items that are rated high (above the mean score of 3.73) and also found to be highly related to satisfaction (above the mean correlation of .42). “I am happy about these things AND they are important to me in impacting my satisfaction.” Keep it up! Don’t change anything; all is well.

High rating, low correlation: “I am happy about these things BUT they are not things that are as important to me in impacting my level of satisfaction.” Keep it up, but no need to spend too much effort here.
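A small sketch of the impact-analysis logic described above: each survey item is placed by its mean rating and its correlation with overall satisfaction, and the means of those two values serve as the quadrant cut points (the 3.73 and .42 quoted above are one survey’s values). Item names and data are invented; this is not the presenters’ code.

```python
# Hypothetical sketch: classify survey items into impact-analysis quadrants.
import pandas as pd

ratings = pd.DataFrame({
    "overall_satisfaction": [5, 4, 3, 4, 2, 5, 3, 4],
    "communication":        [4, 4, 2, 3, 2, 5, 3, 3],
    "career_growth":        [5, 3, 3, 4, 1, 4, 2, 3],
    "workload":             [4, 4, 4, 4, 3, 5, 4, 4],
})

items = ratings.drop(columns="overall_satisfaction")
means = items.mean()                                    # mean rating per item
corrs = items.corrwith(ratings["overall_satisfaction"])  # correlation with satisfaction

def quadrant(item: str) -> str:
    high_score = means[item] >= means.mean()   # cut point: mean of the item means
    high_impact = corrs[item] >= corrs.mean()  # cut point: mean of the correlations
    if high_score and high_impact:
        return "Keep it up: all is well"
    if high_score:
        return "Keep it up, but low impact"
    if high_impact:
        return "Primary opportunity: focus here"
    return "Watch in case it becomes a primary opportunity"

for name in items.columns:
    print(f"{name}: mean={means[name]:.2f}, r={corrs[name]:.2f}, {quadrant(name)}")
```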

Now what?

IT’S ALL ABOUT what you do with THE DATA

The Survey Accountability Loop

Step I: Deploy survey and obtain feedback (ratings and comments) (Oct)

Step II: Identify themes and opportunities (Dec)

Step III: Opportunities to Action – develop and implement action plans and set goals (share with team and Sr. Leadership) (Jan – Feb)

Step IV: Follow up and assess performance and impact of action plans (Feb – May)

Step V: Did changes result in goal attainment? Communicate impact and share results with Senior Leadership (June)

Step VI: Realign with strategic goals and/or course correct (Jul – Sept)

IDENTIFICATION OF NEEDS AND PROGRAM PRIORITIES

RESPONSE PLANNING, GOAL SETTING

PROJECT AND PROGRAM IMPLEMENTATION

EVALUATION AND RECOGNITION FOR ACTIONS TAKEN

REDEFINE SURVEY QUESTIONS

Staff@Work Case Study: Health Sciences Development: We Heard You

Risk Scenario: Based on FY13 Staff@Work survey results, Health Sciences Development identified 12 opportunities for improvement.

Data-Driven Action Taken: The Sr. Leadership Team developed and executed comprehensive strategies to address each opportunity under the “We Heard You” campaign.

Outcome: FY14 survey results revealed significant score improvements AND the largest fundraising year for Health Sciences in the history of UC San Diego!

Case Study: We Heard You – Areas of Opportunity

Case Study: We Heard You – Data-Driven Action

The Senior Leadership Team worked together to develop comprehensive strategies that addressed each of the identified areas.

These strategies were branded under the theme “We Heard You.”

Over the past year, the Senior Leadership Team executed each strategy while periodically reminding the entire department that “We Heard You.”

Case Study: We Heard You - Outcomes


Quote from the Sr. Executive Director of Health Sciences Development:

“While we know our work is not complete, the Staff@Work survey has provided a roadmap to help guide our entire team toward improved results, not the least of which is accomplishing the largest fundraising year for Health Sciences in the history of UC San Diego!”

IT’S ALL ABOUT using THE DATA to make a positive impact

Examples of actions taken as a result of staff and customer survey data:

Creation of a Professional Development and Training Program in Business and Financial Services, which has resulted in career advancement and salary increases averaging 21% for participants, 8 graduate-level degrees, and 20 professional certifications

Dining enhanced their menu choices to include healthier and vegan options

Housing improved lounge and shared living spaces in response to student feedback

Facilities Management instituted a client response system to more quickly address customer requests

Transportation offered specific commuting alternatives per the feedback received

Campus Shuttle brought back a shuttle route after hearing the feedback from customers

Equipment Management created a new inventory process to alleviate the burden on departments, resulting in the successful inventory of approximately 80 campus buildings and 6,000 pieces of equipment with minimal intrusion into research or operational processes

Procurement created a Department Outreach program to address the specific needs of targeted customers

The Career Services Center updated its Port Triton system to make the search feature more user-friendly for students seeking quality internships

The BFS STRIVE Leadership Development Program was created and implemented to support development of high-potential employees in the department through mentorship. The program increased diversity, spurred career growth, encouraged professional development and fostered mentorships for the participants. It is now identified as a University “best practice” in succession planning

For more examples, quotes from leaders, and impact, visit http://blink.ucsd.edu/sponsor/OSI/opa/index.html.

IT’S ALL ABOUT THE DATA, so you can make meaningful interpretations of what is important to people based on their attitudes and feelings, and take actions to make a positive impact on the mission of your university

Back to our collaboration with CSU…

The timeline for launching CSU Chancellor’s Office Business and Finance Staff@Work Survey

Project steps:

• Sponsor meeting (CSU and UCSD)

• Project approval, internal communications rollout (CSU)

• Working session and official kickoff (CSU and UCSD)

• Develop hierarchy, email lists, marketing (CSU)

• Build customized survey questions and application, testing (UCSD)

• Deploy survey (UCSD)

• Analyze results (UCSD)

• UCSD delivers results to CSU

Approximate pre-launch time: 8-10 weeks (4/15/15 – 6/15/15); individual step durations on the timeline ranged from about 4 hours to 8 weeks

Key dates:

• 8/25: Survey announcement

• 8/31: Survey open

• 10/2: Survey closed

• 10/15/15: Initial results

• 11/15/15: Final reports

CSU Chancellor’s Office plan

• Deploy the Customer Satisfaction Survey to customers in the Chancellor’s Office and 23 CSU campuses

• Identify metrics for the other two perspectives of the BSC:

• Financial/Stakeholder

• Internal Business Processes

• Now have benchmarking capability with UCSD on the same survey questions year over year, enabling collaboration and partnering on best practices

Appendix: Example screenshots of the Staff@Work Survey and Faculty and Staff Customer Satisfaction Survey

Questions?

Contacts:

Alexis Naiknimbalkar, [email protected] (to learn more about how CSU has implemented the surveys at the Chancellor’s Office!)

Angela Song, [email protected] (to learn more about the surveys and how to bring them to your campus!)

Some screenshots of the Staff@Work Survey Instrument

English or Spanish

Welcome Page

Know what you are rating & where you are in the survey

Your department name pops up if you hover over it

Anonymous comments

Highlight a colleague or leader that’s made a difference!

Conduct and behavioral questions for VC Equity, Diversity, Inclusion initiatives

Choose to “opt out” from further reminders

Thank you gift for participation!

Some screenshots of the Faculty and Staff Survey Instrument

Key words for search!

Searchable by VC area, service category, or alphabetically

Navigation

Save and finish later

Special recognition for a department

This is the coupon that everyone receives for a gift item at the Bookstore. Each coupon is barcoded and unique

You can click out of the survey to our webpage to see recent Actions Taken and other information

Once you submit your survey, you cannot change responses. You are automatically returned to this page. You can also come back here to print your coupon.

This is the promotional banner that will be posted in three different locations on campus

Email to our VC contacts and departments in preparation for the launch…

The communication “kit” we provided departments to help encourage response participation

Note: reminders are only sent to people who have not submitted their survey