Continual Service Innovation: Using Surveys and Metrics to Improve Service
Session 203
Eddie Vidal – University of Miami

Upload: eddie-vidal

Posted on 01-Nov-2014

Category: Technology

DESCRIPTION

The term innovation can be defined as something original and new. Has the way we deliver customer service changed in the last 50 years? Delivering service with a smile and a positive attitude hasn’t changed, but what about the data used to improve service? Many organizations have numerous reports and data but lack the innovation to make simple changes that affect results. Survey results are a great tool, but what are you doing with them? Are you using the results to create an action plan? What metrics do you use to help build your case for positive change? In this session, Eddie will share useful methods to improve the quality of your service delivery on a continual basis. Key takeaways include: survey questions, actions based on results, metrics, and templates.

TRANSCRIPT

Page 1: 2014 HDI Conference: Session 203: Continual Service Innovation Using Surveys and Metrics to Improve Service Delivery

Continual Service Innovation: Using Surveys and Metrics to Improve Service

Session 203 Eddie Vidal – University of Miami

Presenter
Presentation Notes
Session 203: Continual Service Innovation: Using Surveys and Metrics to Improve Service Delivery. Wednesday, April 02 at 11:30 AM. Has the way we deliver customer service changed over the past fifty years? What about the data used to improve the way we deliver that service? Most organizations collect data on service delivery, but many are unable to innovate, to take action and make simple changes that can have a dramatic effect. In this session, Eddie Vidal will share some useful methods—surveys, metrics, action plans based on real data—for achieving continual service innovation. (Intermediate)
Page 2

Eddie Vidal
• HDI & Fusion Track Chair
• HDI & Fusion Conference Speaker
• HDI Strategic Advisory Board
• President Emeritus of South Florida HDI Local Chapter
• Published in SupportWorld Magazine & HDI Connect
• HDI Support Center Manager Certified
• ITIL V3 Foundation & OSA Certified
• itSMF monthly podcast producer

Manager, UMIT Service Desk [email protected]

[email protected] 305-439-9240 @eddievidal

http://www.linkedin.com/in/eddievidal

Presenter
Presentation Notes
Eddie Vidal, Manager, UMIT Service Desk, University of Miami

Eddie Vidal has over twenty years’ experience in information technology, where he focuses primarily on service delivery and support for IT infrastructures. In his current position as the manager of enterprise support services for the information technology department at the University of Miami, Eddie supports over 35,000 faculty, staff, and students. In addition to higher education, Eddie’s experience includes the hospitality and travel industries.

Eddie currently serves as the president of the HDI South Florida local chapter and a member of the HDI Desktop Support Advisory Board (DSAB). He has spoken at local, regional, and national events and has been published in HDI's SupportWorld magazine.
Page 3

@eddievidal

• Strategies for improving service
• UM Approach – Keeping it simple
• Useful information – Why it’s important to use metrics
• Templates

Takeaways

3

Page 4

@eddievidal

How do you measure?

4

Page 5

@eddievidal

• Surveys to improve Service Delivery
• Call Monitoring to improve Customer Service
• Incident Tracking to improve processes, documentation, and knowledge base

What do you measure now?

5

Employee Surveys

Page 6

@eddievidal

• What is not defined cannot be controlled.
• What is not controlled cannot be measured.
• What is not measured cannot be improved.

Define | Measure | Improve

6

Presenter
Presentation Notes
What do you measure now? What do you do about it? Is it reactive? When you first start obtaining the data, you should be acting on the results and taking corrective action. As you become more mature, you should be anticipating your results, having already created a plan, and staying one step ahead of the results. You are doing something about it, right? If not, what are you waiting for?
Page 7

@eddievidal

• Introduction of something new, original, or important
• A new idea, method, or device
• Better solutions
• Doing things differently

Innovation

7

Page 8

@eddievidal

Where do you start?

CSI Model

8

Page 9

@eddievidal

• Survey Says!

Survey Results

9

Page 10

@eddievidal

Surveys at the U

10

Page 11

@eddievidal

Lencioni Model for Team Effectiveness

Inattention to Results

Avoidance of Accountability

Lack of Commitment

Fear of Conflict

Absence of Trust

11

Presenter
Presentation Notes
Let’s review the Lencioni model introduced in the module. This is a good model for team effectiveness and reveals how you can lead your team to more proactive behaviors depending on where they are in that model. Turn to page 18 of your participant guide to review the Lencioni model. Another way to understand this model is to take the opposite approach, a positive one. Members of truly cohesive teams possess the following characteristics:
• They trust one another.
• They engage in unfiltered conflict around ideas.
• They commit to decisions and plans of action.
• They hold one another accountable for delivering against those plans.
• They focus on the achievement of collective results.
Page 12

Page 13

First Attempt
• Kept it simple
• 5 questions
• Phone calls by students
• 105 surveys

Surveys at the U

13

Presenter
Presentation Notes
In Telecom, a student would call users and ask them five basic questions; this lasted about a month to a month and a half. Why did we stop? Because nothing was being done with the results. We noticed the feedback affected desktop support, and when we reached out to them it didn't go far. The root cause was that there wasn't a collaborative relationship between the two groups. For a CSI model to work in an environment, you need to work together. Silos won't cut it.
Page 14

• Different department & role
• Expanded reach
• Created SharePoint Incident and Request form
• 238 surveys
• 3 or below, take action
• Positive comments shared on monthly basis

Surveys at the U

14

Presenter
Presentation Notes
The second area worked well: User Support Services. We conducted surveys for six months. We created a ticketing system in SharePoint until the ITSM tool was implemented; prior to this, there were areas not tracking incidents/requests. This area consisted of a student application and tier 2 help desk, internal desktop support, media, and user account management. We received close to 300 surveys; one was sent for every incident or request that was closed, and our response rate was x%. We did have an override field/button and noticed it was used at times when it really shouldn't have been. Those were opportunities to do some coaching and communicate the process, and the value behind it, once again.

The process was to review on a weekly basis and reach out to the users whose surveys scored 3 or below. This was a manageable number, and I was able to stay on top of it. It was a more controlled environment, and the teams involved were all from the same area. On a monthly basis we sent the positive comments to the team and copied senior management in this group, not all of IT. We created bar charts and shared results with the team. In addition, we compared our numbers in our vertical market, higher education, with the results from the HDI support center salary and survey guide. We were able to show our customer feedback was higher than the HDI results. At the end of the year I wrote a short recap of our total results and how we compared, and also created some tables and charts to display the comparisons.
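The weekly and monthly review loop described in these notes can be sketched in a few lines. This is a minimal illustration, not the actual SharePoint process; the ticket IDs, field names, and sample data are all invented.

```python
# Flag surveys scoring 3 or below (1-5 scale) for customer follow-up,
# and collect positive comments to share with the team monthly.
# All data below is illustrative.
surveys = [
    {"ticket": "INC-1001", "score": 5, "comment": "Fast and friendly"},
    {"ticket": "INC-1002", "score": 2, "comment": "Took too long"},
    {"ticket": "INC-1003", "score": 4, "comment": ""},
]

# Weekly: reach out to every customer who scored us 3 or below.
follow_up = [s["ticket"] for s in surveys if s["score"] <= 3]

# Monthly: share positive comments with the team and senior management.
kudos = [s["comment"] for s in surveys if s["score"] >= 4 and s["comment"]]

print(follow_up)  # ['INC-1002']
print(kudos)      # ['Fast and friendly']
```

The point of keeping the filter this simple is the one the notes make: a short weekly list of low scores is a manageable number that one person can stay on top of.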
Page 15

@eddievidal

Customer Surveys

1. Overall quality of IT Support Center Staff?
2. IT Support Staff handling my problem was knowledgeable?
3. IT Support Staff handling my problem was courteous and professional?
4. Incident was resolved to my complete satisfaction?
5. Resolution of your incident completed in a timely manner?

15

Page 16

Survey Results

16

Page 17

Survey Results

Overall Survey Results – Start Date 6/4/12; scores after 238 surveys (as of 1/18/13)
Customer satisfaction rated from 1 (Very Dissatisfied) to 5 (Very Satisfied)

Question                                               1    2    3    4    5     Total Satisfaction (4/5)   Mean
The courtesy of the representative                     1%   0%   3%   4%   91%   96%                        4.79
The technical skills/knowledge of the representative   1%   0%   3%   6%   89%   95%                        4.81
The timeliness of the service provided                 1%   3%   3%   5%   87%   92%                        4.73
The quality of the service provided                    1%   2%   3%   4%   90%   95%                        4.82
The overall service experience                         2%   1%   2%   8%   88%   95%                        4.85

17
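The two summary columns in the table above can be reproduced from raw responses: "Total Satisfaction" is the combined share of 4s and 5s, and "Mean" is the plain average. A minimal sketch, using ten invented ratings rather than the actual survey data:

```python
# Ten sample 1-5 ratings (illustrative, not real survey responses).
responses = [5, 5, 4, 5, 3, 5, 4, 5, 5, 1]

# "Total Satisfaction": share of responses rated 4 (Satisfied) or 5 (Very Satisfied).
total_satisfaction = sum(1 for r in responses if r >= 4) / len(responses)

# "Mean": plain average of the ratings.
mean_score = sum(responses) / len(responses)

print(f"{total_satisfaction:.0%}")  # 80%
print(f"{mean_score:.2f}")          # 4.20
```

Note that the two figures can move independently: a few 1s drag the mean down sharply while barely denting the 4/5 percentage, which is why the table reports both.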

Page 18

@eddievidal

Surveys Comparisons

18

Page 19

Surveys Comparisons

Overall Survey Results (Total Satisfaction)       USS     HDI    Education   Healthcare
The courtesy of the analyst                       97.5%   97%    96%         97%
The technical skills/knowledge of the analyst     96.5%   95%    94%         96%
The timeliness of the service provided            95%     92%    92%         94%
The quality of the service provided               96.6%   94%    93%         95%
The overall service experience                    96%     93%    92%         94%

Overall Mean Scores                               USS     HDI    Education   Healthcare
The courtesy of the analyst                       4.87    4.84   4.82        4.87
The technical skills/knowledge of the analyst     4.83    4.78   4.74        4.82
The timeliness of the service provided            4.75    4.69   4.68        4.74
The quality of the service provided               4.83    4.76   4.72        4.80
The overall service experience                    4.80    4.72   4.69        4.77

19
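The benchmarking step behind this slide amounts to subtracting the HDI column from the USS column. A sketch using the "Total Satisfaction" percentages from the table above; the short dictionary keys are ad hoc labels for the five survey questions, not names from the report:

```python
# USS vs. HDI "Total Satisfaction" percentages from the comparison table.
uss = {"courtesy": 97.5, "skills": 96.5, "timeliness": 95.0, "quality": 96.6, "overall": 96.0}
hdi = {"courtesy": 97.0, "skills": 95.0, "timeliness": 92.0, "quality": 94.0, "overall": 93.0}

# Positive delta = USS exceeds the benchmark on that question.
deltas = {q: round(uss[q] - hdi[q], 1) for q in uss}

print(deltas)
print(all(d > 0 for d in deltas.values()))  # True: USS beats the benchmark on every question
```

Framing the comparison as per-question deltas (rather than a single overall number) makes it easy to see where the margin is thinnest, here courtesy, and where the team is furthest ahead.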

Page 20

Write a blog, write something!

Celebrate Your Success

University of Miami User Support Services scores higher customer satisfaction ratings than the 2012 HDI customer satisfaction benchmarking study report.

20

Presenter
Presentation Notes
User Support Services scores higher customer satisfaction ratings than benchmarking report

September 5, 2012. Eddie Vidal, Manager, Enterprise Support Services.

University of Miami User Support Services scores higher customer satisfaction ratings than the 2012 HDI customer satisfaction benchmarking study report. In June 2012, the University of Miami User Support Services (USS) team developed a customer survey process to obtain customer feedback. The survey asks the customer to rate the quality of service, timeliness, technical skills, courtesy, and overall satisfaction and experience on a per-incident basis. An email with a link to a five-question survey is sent to customers after an incident/ticket is closed. USS created an incident tracking system using SharePoint forms in January 2012 and has closed approximately 8,500 incidents. The survey results shown in the table below indicate USS customers are very satisfied with their service.

HDI is the leading professional association and certification body for technical service and support professionals. HDI published a customer satisfaction benchmarking study report in September 2012. The results are based on over 350,000 survey responses and 510 support centers representing 215 companies. As the results show, USS customers are very satisfied with support, achieving higher results than HDI's benchmarking study. Comparing USS to other educational organizations, USS exceeded the results published in the HDI report.

The customer survey process was developed following IT service management best practices and to obtain customers' opinions of the services and support provided by USS. With the results of the survey, USS began a continual service improvement program. The results were used to determine weaknesses and find ways to improve and correct them. This was accomplished by reviewing the survey results submitted and contacting users who rated USS service with a score of 3 or below. The rating scale is 1 for very dissatisfied and 5 for very satisfied. By contacting customers, USS learned many things and found ways to improve the delivery of our services. The data has shown that the recent effort of implementing the customer survey process has delivered the desired result of improving the delivery of IT services to the university community.
Page 21

• New department
• All of UMIT
• New ITSM tool
• 1 out of 10
• 10% response rate
• Started July 12, 2013
• 1,975 surveys received as of March 24th

Surveys at the U

21

Presenter
Presentation Notes
455 Medical Technical Service Desk
115 UChart Clinical Service Desk
215 Gables Service Desk
785 total surveys out of 1,975
Page 22

@eddievidal

Survey Results

22

Page 23

@eddievidal

• Dramatic Effect = Partnership
• Start inside – UMIT Partnership
• Move outward – Business Partnership
  – Contacting the customer
  – Listen
  – Make a change based on feedback
  – Advise customer of the change made
  – Correct broken processes

Simple Changes

23

Page 24

@eddievidal

Survey Results – Service Desk Comparison

24

Presenter
Presentation Notes
Scores are based on rating of 5 only
Page 25

@eddievidal

• Propose operational meeting to review
• Review results
  – Beginning to current
  – Last 30 days
  – Last week
  – Focus on 3 or below
  – Strive for 5 and celebrate!
• Departmental breakdown = break down silos?
• Benchmark – Implement – Improve

Next Steps

Define – Measure - Improve

25

Page 26

@eddievidal

• Friendly competitions between departments
• Gamification
• Share results with customers

Next Steps

26

Page 27

@eddievidal

• Recognize rock stars
• Learn from departments doing well
• Coaching opportunities
• Align with the business

Benefits & Value

27

Page 28

@eddievidal

• Use surveys for rewards, recognition, and annual performance reviews
• Meet with business partners once a week
• Share surveys with outside vendors/partners

Other Organizations

28

Page 29

@eddievidal

HDI surveys

29

What does HDI do with their surveys?

www.HDIConference.com/Eval

Page 30

@eddievidal

Other Organizations

30

Courtesy of Gina Montague – Infinite Campus

Page 31

@eddievidal

• What do we do with the feedback? We carefully review all survey results as soon as they arrive. Survey results are shared with staff and are used for coaching and kudos.
• We want you to know how we are doing, too. Every month we will update this site with our latest customer satisfaction score.

Infinite Campus

31

Courtesy of Gina Montague – Infinite Campus

Page 32

@eddievidal

Other Organizations

32

Dashboard

Page 33

Align Results

Job Knowledge

Quality of Work

Productivity

Customer Service

33

Presenter
Presentation Notes
Jenny: After reviewing many performance evaluation forms from various companies in the support industry, we found some common areas that most organizations are currently using to evaluate their staff. We will focus on these four today: job knowledge, quality of work, productivity, and customer service. While supervisors are putting thought into these reviews, the information shared with staff is mostly subjective. We want to show you how to take that to the next level. Note – these are the most common, but the methodology can be applied to other performance areas already identified on your performance reviews.
Page 34

@eddievidal

Quality

• Ticket Accuracy Review
  – Has the customer been contacted within 24 hours?
  – Are the diary entries user friendly?
  – Has the customer been kept in the loop?
  – Was customer sign-off obtained?

34

Presenter
Presentation Notes
Eddie: Time, how long do you think this takes? It was manually completed by students. At FIU it was a 48-hour turnaround. The first report our team received was at 65%, and the boss was not happy. I had to create a plan for how I was going to improve this. It took us three months to bring the score up to 90+%. We started each meeting, similar to The Five Dysfunctions of a Team, with our goals: Has the customer been contacted within 24 hours? Are diary entries user friendly (would the customer understand the ticket if they were to read it)? Was the customer kept in the loop? Was customer sign-off obtained (3 attempts in 10 business days)?

Kept customer in the loop? How is this determined? Keep this in mind: if the customer needs to call the Support Center for the status of their service request, do they really know what the status is? Would that phone call be required? 1) Are the diary entries updated to state when the next expected entry or work is to occur with the user? 2) Is the customer informed when the next visit or work on their service request is to be performed? 3) If you have set an appointment with the customer in advance, has it been entered in the service request? 4) If the customer was not reachable when contacting them, is there an entry stating the means of communication, for example, via email, voicemail, or in person? 5) Is there a statement that the user was informed by leaving a service receipt form, email, or voicemail? 6) If you answered YES to all of the questions, then it's a yes for keeping the customer in the loop. If the answer is no to any one of the questions, then the Customer Service Objective has not been met.
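The ticket accuracy review described in these notes reduces to grading each closed ticket against the four yes/no criteria from the slide and reporting the percentage of checks passed. A hedged sketch, with the criterion names and sample tickets invented for illustration:

```python
# The four yes/no checks from the Ticket Accuracy Review slide
# (names are ad hoc labels, not fields from an actual tool).
CRITERIA = ("contacted_24h", "clear_diary", "kept_in_loop", "sign_off")

# Two sample reviewed tickets (invented data).
tickets = [
    {"contacted_24h": True, "clear_diary": True, "kept_in_loop": True, "sign_off": False},
    {"contacted_24h": True, "clear_diary": False, "kept_in_loop": True, "sign_off": True},
]

# Team score = checks passed / total checks performed.
passed = sum(t[c] for t in tickets for c in CRITERIA)
score = passed / (len(tickets) * len(CRITERIA))

print(f"{score:.0%}")  # 75% -- below the 90+% the team eventually reached
```

Scoring per check rather than per ticket means one failed criterion doesn't zero out an otherwise well-handled ticket, which matches a first report of 65% improving toward 90+% through targeted coaching.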
Page 35

@eddievidal

Ticket Evaluation Template

35

Page 36

@eddievidal

Call Monitoring

• Greeting the customer • Key points during the call • Ending the call • Behavioral Questions

36

Page 37

@eddievidal

Call Monitoring Score

37

Page 38

@eddievidal

Taking it to another level

• Use an incident for same call • Follow the trail from beginning to end • To post or not to post? • Create competition

38

Page 39

@eddievidal

Where do you start?

CSI Model

39

Page 40

@eddievidal

• Strategies for improving service
• UM Approach – Keeping it simple
• Useful information – Why it’s important to use metrics
• Templates

Takeaways

40

Page 41

@eddievidal

Thank you for attending

Contact Information Eddie Vidal 305-439-9240 [email protected] [email protected] http://www.linkedin.com/in/eddievidal

@eddievidal

Please Complete the Session Evaluation

41

www.HDIConference.com/Eval