Chief Performance Officer – Rob Hebeler, Q1 FY0708
DMS Quarterly Customer Satisfaction Survey
December 2007
A Few Words First …
• The journey of continual improvement is not one for uninspired leadership or poor management. It is difficult. It is hard to stay focused on getting better today, and every day moving forward.
• Our customers expect and deserve service and product excellence. We should do nothing less than strive to meet this level of expectation.
• The key to success rests in our ability and desire to build relationships with each other and with our customers.
Agenda
• Background
• Executive Overview
• Survey Design
• Survey Implementation
• Survey Findings
• Closing Remarks and Suggestions for the Future
Background
• In January 2007, Secretary South and team embarked on a journey of continual improvement. This journey was marked by the development of an innovative Customer Quality Assurance Framework that involved the entire organization.
• In June 2007, the department launched a customer satisfaction survey to benchmark product and service attributes, overall satisfaction and problem resolution. The survey was distributed to a random sample of the department’s 1.1 million customers. The resulting benchmarks established our starting line and launched our journey of “measuring what matters.” We established our continual improvement goal of a 2% increase in “Extremely Satisfied” for each quarter moving forward.
• In October 2007, the department launched its first customer satisfaction survey for the fiscal year. The Q1 FY0708 survey used an online format with a phone-based helpline.
• The results of the customer satisfaction surveys enable the department to:
– Gauge customer satisfaction over time.
– Gauge customer product and service performance over time.
– Identify key areas where DMS employees excel and where there is opportunity for improvement.
Customer Quality Assurance Framework:
• Our Customers – Who Have Needs and Expectations of Us
• Our Products and Services – What We Provide Our Customers and How We Treat Them
• Our Results – How Our Customers Keep Score
• Our Improvement Opportunities – What Needs to Be Better
• Continual Improvement – What Must Change
• Our Playbook – How We Get from Here to There
• Our People – Serving with C.L.A.S.S.
• Process – How We Do Things
• Culture – How We Behave
• Communications – How We Keep Everyone Informed
Department of Management Services Priorities
• Implement best practices to create a workplace of choice that fosters recruitment, development, recognition and reward
• Improve our contract management capabilities
• Create robust strategic plans for each major program and service area
• Focus on the processes and procedures of our core competencies to create a springboard for world-class performance
• Increase the brand awareness of our programs and services to our customers
• Develop Human Resource practices that encourage a resilient workforce with the ability and desire to serve in an emergency or natural disaster
• Measure what matters and continually improve the quality of services delivered to our customers
Q1 FY0708 Online Survey
• Survey Question Areas
– Division Interaction Selection
– Service
– Product
– Overall Satisfaction of Experience vs. Expectation
  • Why do you feel this way?
  • How can we improve?
– Problem Resolution
  • What types of problems?
The Moment of Truth
Source: Moments of Truth by Jan Carlzon
[Figure: “The Moment of Truth” curve plotting customer loyalty against level of customer satisfaction (Dissatisfied / Merely Satisfied / Delighted). Experiences worse than expected fall in the Zone of Pain, experiences that match expectations fall in the Zone of Mere Satisfaction, and experiences better than expected fall in the Zone of Delight.]
Survey Design
• Used a 5-Point Scale to determine the level of satisfaction, and overall experience vs. expectation
5 – Extremely Satisfied (Advocate)
4 – Satisfied (Loyal)
3 – Neither Satisfied nor Dissatisfied (Indifferent/Disappointed)
2 – Dissatisfied (Mad)
1 – Extremely Dissatisfied (Resentful)
A “Does Not Apply” option was also available.
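The “top box” and “top 2 box” figures used throughout the findings are simple tallies over this scale. A minimal sketch, with made-up ratings rather than survey data:

```python
# A minimal sketch of the "box score" tallies used throughout the findings:
# top box = share of 5s, top 2 box = share of 4s and 5s, bottom 3 box = share
# of 1-3. The ratings list is made up for illustration, not survey data.

def box_scores(ratings):
    """Return (top_box, top_2_box, bottom_3_box) as percentages."""
    n = len(ratings)
    top_box      = 100 * sum(r == 5 for r in ratings) / n
    top_2_box    = 100 * sum(r >= 4 for r in ratings) / n
    bottom_3_box = 100 * sum(r <= 3 for r in ratings) / n
    return top_box, top_2_box, bottom_3_box

ratings = [5, 5, 4, 3, 5, 2, 4, 5, 1, 5]    # ten illustrative responses
print(box_scores(ratings))                  # (50.0, 70.0, 30.0)
```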
• Service Attributes build relationships and trust between the service provider and the customer. It’s how we treat our customers.
– Friendly
– Listened Well
– Helpful
– Speed of Service
– Checked for my Satisfaction
Customer Experience = Service + Product
Customer Experience = Service + Product
• Product Attributes are those things, tangible or intangible, that the customer values (whether they are paying for them or not). It’s what we give our customers.
– Available or Delivered as Promised
– Level of Quality
– Easy to Understand or Use
– Allowed me to Accomplish What I Wanted to Do
– Priced Fairly
Survey Implementation
• For the Q1 FY0708 survey, we targeted the 1.1 million DMS customers. We mailed a postcard to a random sample of 40,000 to ensure a statistically sound response. Of this group, 30,000 were given an incentive to participate (the chance to win a $10 Florida State scratch-off lottery ticket).
• Participants had the option of completing the survey online or calling a toll-free helpline.
Survey Implementation: Response Rate
• 1,124 surveys were collected, for a response rate of about 3%. We achieved a 95% Confidence Level with a 3.8% Margin of Error.
• The response rate for the incentive group (Florida Lottery) was about 1.7 times that of the non-incentive group.
• Nearly one-third of those reporting that they had contact within the past 3 months were from the Retirement Division:
– 32% Retirement Division
– 24% Human Resources People
– 17% State Group Insurance
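As a rough check on the reported sampling figures, here is a sketch under the standard simple-random-sampling assumptions. The slide’s 3.8% margin of error may reflect a different effective sample or design adjustments; this sketch only applies the textbook formula at worst-case p = 0.5.

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Half-width of a 95% confidence interval for a proportion,
    assuming simple random sampling and worst-case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

n_mailed, n_returned = 40_000, 1_124
response_rate = 100 * n_returned / n_mailed   # ~2.8%, i.e. the reported ~3%
moe = 100 * margin_of_error(n_returned)       # ~2.9 points under these assumptions
print(round(response_rate, 1), round(moe, 1))
```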
The Q1 FY0708 Survey Says …
DMS Overall Customer Satisfaction, Q1 FY0708
Nearly half of DMS customers indicated that they were ‘extremely satisfied’ with the experience they have had with their DMS division.
Percentages are based on those who selected a division to rate. Those who said they had not had contact, or who left the item blank, were filtered out of these results. When filtered back in, the percentages change only minimally, by less than 1%.
Overall, how satisfied are you with how well your experience with <particular division> met your expectations?
5 – Extremely Satisfied: 46%
4: 36%
3: 9%
2: 6%
1 – Extremely Dissatisfied: 3%
TOP 2 BOX: 82%
DMS Overall Customer Satisfaction by Division, Q1 FY0708
The State Purchasing State Term Contracts division had the highest percentage of top box scores (extremely satisfied ratings), followed by Retirement Benefits. (Divisions with fewer than 10 responses were not calculated.)
DMS Overall Customer Satisfaction: Q1 FY0708 vs. June 2007 Benchmark
Top box, “Extremely Satisfied,” increased from 37% to 46%, a 9-percentage-point increase over the June 2007 Benchmark Survey. The quarterly continual improvement goal of a 2% shift in top box (38.9%) was surpassed, with a 14.2% shift in top box.
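The benchmark-to-Q1 change can be expressed several ways, and the slide mixes point changes with “shift” figures. A small sketch of the common definitions; which one the 14.2% figure uses is an assumption here (the change as a share of the non-top-box remainder comes closest):

```python
# Three ways to express the benchmark-to-Q1 top-box change (37% -> 46%).
# Which definition the slide's "14.2% shift" uses is an assumption: the
# change as a share of the non-top-box remainder comes closest (~14.3%).

benchmark, current = 37.0, 46.0

point_change = current - benchmark                           # 9.0 points
relative_change = 100 * point_change / benchmark             # ~24.3% of baseline
share_of_remainder = 100 * point_change / (100 - benchmark)  # ~14.3% of remainder

print(point_change, round(relative_change, 1), round(share_of_remainder, 1))
```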
Overall, how satisfied are you with how well your experience with <particular division> met your expectations?
Rating: June 2007 Benchmark / Q1 FY0708
5 – Extremely Satisfied: 37% / 46%
4: 41% / 36%
3: 9% / 9%
2: 7% / 6%
1 – Extremely Dissatisfied: 5% / 4%
June 2007 Benchmark Survey TOP BOX: 37%; Q1 FY0708 TOP BOX: 46%
DMS maintained relatively high top box ratings on all of its service attributes—particularly friendliness, ability to listen well and helpfulness.
DMS Customer Service Attributes, Q1 FY0708
Percentage of Top Box Scores and Bottom 3 Box Scores by Service Attribute:
– Friendly: Top Box 50%, Bottom 3 Box 15%
– Listened: Top Box 48%, Bottom 3 Box 17%
– Helpful: Top Box 48%, Bottom 3 Box 17%
– Speed of Service: Top Box 40%, Bottom 3 Box 23%
– Checked for Satisfaction: Top Box 42%, Bottom 3 Box 26%
Since the June 2007 Benchmark Survey, top box scores have risen in all service attributes. The largest increases were found in “Checked for Satisfaction” and “Friendly.”
DMS Customer Service Attributes: Q1 FY0708 vs. June 2007 Benchmark
Percentage of Top Box Scores by Service Attribute and Quarter:
– Friendly: 43% (June 2007) → 50% (Q1 FY0708)
– Listened: 41% → 48%
– Helpful: 43% → 48%
– Speed of Service: 35% → 40%
– Checked for Satisfaction: 33% → 42%
Statistically significant differences (p < .05) were observed for four of the five attributes.
Correlation Analysis of Customer Service Attributes to Overall Satisfaction
In general – we see strong correlations between all of the attributes and overall satisfaction. Strong positive relations = higher overall satisfaction ratings.
The levels of helpfulness and speed of service are most highly correlated with the level of overall satisfaction, whereas friendliness is the least correlated.
*Pearson Correlation indicates the strength and direction of a linear relation between two factors. The value ranges between +1 (a perfect positive relation) and -1 (a perfect negative relation). Values greater than .7 indicate an extremely close relation between the factors—that is, as one factor rises, the other factor rises in a very similar fashion.
Attribute: Pearson’s Correlation* (to overall satisfaction)
– Helpful: .80
– Speed of Service: .76
– Listened: .72
– Checked for Satisfaction at End of Call: .67
– Friendliness: .67
Quad Analysis of Customer Service Attributes
From a quad analysis of customer service attributes, we see that speed of service is the primary area where DMS can focus to improve overall satisfaction ratings.
In addition, better follow-up/wrap-up is also an area where increased performance will in turn help increase overall satisfaction.
[Figure: quad analysis plotting each service attribute’s impact on overall satisfaction (x-axis, 0.6–0.9) against its average performance score (y-axis, 3.8–4.4). Plotted attributes: Helpful, Speed of Service, Listened, Checked for Satisfaction, Friendly.]
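A minimal sketch of the quad-analysis logic. The impact values are the Pearson correlations from the table above; the average performance scores and the two cut points are hypothetical stand-ins, since the exact chart values are not recoverable from the slide:

```python
# Quad-analysis sketch: split attributes into four quadrants by impact
# (correlation with overall satisfaction) and average performance score.
# Performance values and cut points below are ASSUMED for illustration.

attributes = {
    # name: (impact, average_performance) -- performance values assumed
    "Helpful":                  (0.80, 4.2),
    "Speed of Service":         (0.76, 3.9),
    "Listened":                 (0.72, 4.3),
    "Checked for Satisfaction": (0.67, 3.8),
    "Friendly":                 (0.67, 4.4),
}

IMPACT_CUT, PERFORMANCE_CUT = 0.75, 4.1   # illustrative axis midpoints

def quadrant(impact, performance):
    """Classify one attribute into the four quadrants used on the slide."""
    if impact >= IMPACT_CUT:
        return "Maintain" if performance >= PERFORMANCE_CUT else "Primary Opportunity"
    return "Strength" if performance >= PERFORMANCE_CUT else "Secondary Opportunity"

for name, (impact, performance) in attributes.items():
    print(f"{name}: {quadrant(impact, performance)}")
```

With these assumed values, Speed of Service lands in the primary-opportunity quadrant (high impact, lower performance), matching the slide’s conclusion.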
Compared to bottom three box scores, DMS maintained very high top box ratings for nearly all of its product attributes—particularly Availability and Quality. The one exception was Price Value, where the percentage of top box scores was nearly equal to the percentage of bottom three box scores.
DMS Customer Product Attributes, Q1 FY0708
Percentage of Top Box Scores and Bottom 3 Box Scores by Product Attribute:
– Available: Top Box 49%, Bottom 3 Box 10%
– Quality: Top Box 49%, Bottom 3 Box 12%
– Easy to Understand: Top Box 48%, Bottom 3 Box 16%
– Allowed to Accomplish: Top Box 49%, Bottom 3 Box 14%
– Price Value: Top Box 37%, Bottom 3 Box 30%
Since the June 2007 Benchmark, top box scores for product attributes increased across the board. The largest increases were found in “Quality,” “Allowed to Accomplish” and “Easy to Understand.” The smallest increase was for “Price Value.”
DMS Customer Product Attributes: Q1 FY0708 vs. June 2007 Benchmark
Percentage of Top Box Scores by Product Attribute and Quarter:
– Available: 41% (June 2007) → 49% (Q1 FY0708)
– Quality: 39% → 49%
– Easy to Understand: 38% → 48%
– Allowed to Accomplish: 40% → 49%
– Price Value: 34% → 37%
Statistically significant differences (p < .05) were observed for four of the five attributes.
Correlation Analysis of Customer Product Attributes to Overall Satisfaction
Again, we see very strong correlations between all of the attributes and overall satisfaction. Strong positive relations = higher overall satisfaction ratings.
With the exception of Price Value, all product attributes correlated with overall satisfaction at or above .79.
Attribute: Pearson’s Correlation* (to overall satisfaction)
– Quality of Product: .83
– Allowed Customer to Accomplish Goal: .82
– Easy to Understand: .81
– Availability: .79
– Price Value: .66
Quad Analysis of Product Attributes
From a quad analysis of product attributes, we see that Price is a secondary opportunity and there is no clearly defined primary opportunity.
Since Price has the least impact on overall satisfaction and there is little to no difference between the other attributes, the best advice is to focus on the attributes that are the easiest and least costly to change.
[Figure: quad analysis plotting each product attribute’s impact on overall satisfaction (x-axis, 0.6–0.9) against its average performance score (y-axis, 3.7–4.5). Plotted attributes: Price, Allow, Easy, Quality, Available. Quadrant labels: Maintain, Strengths, Primary Opportunities, Secondary Opportunities.]
Incidence of Problems with DMS Services
Of the customers who had contact with DMS within the past 3 months, only 12% reported that they had experienced a problem. Of those, nearly half (45%) reported it to DMS. Importantly, when asked about their satisfaction with the resolution they received, nearly half (45%) stated they were very or extremely dissatisfied.
Ratings for Resolution after Experiencing a Problem and Reporting it to DMS
Satisfaction with Outcome: Top 2 Box 29% (rating 5: 16%); rating 3: 26%; Bottom 2 Box 45% (rating 2: 29%; rating 1: 16%)
Customer Suggestions for Improvement of DMS
The majority of customers (58%) suggested organizational changes to improve DMS (improvement of interdepartmental communications, etc.). Lagging far behind, but still ranked second, was Services (how customers are treated).
Relative Percentage of Improvement Comments by Type:
– Organization: 58.3%
– Services: 13.8%
– Technology: 11.0%
– Process: 8.6%
– Products: 8.1%
Categories for Improvement:
– Organizational: relations between departments, people, merging departments, leadership, etc.
– Products: tangible or non-tangible reference to something customers receive or expect to receive.
– Services: how customers are treated.
– Process: efficiencies in day-to-day operations.
– Technology: increased efficiencies and means of getting a product or service.
Problems that Customers have Experienced*
Password problems; information gathering problems; unable to find desired topics.
Not able to get all info from one place (person) had to call another number to get info.
Website is terrible.
2 months to make Drs appointment need to speed things up.
People First is not user friendly requiring frequent calls to obtain help.
getting hold of someone to talk to -- could be 3 days before a return call.
Open enrollment through People First was much more difficult than it should have been. Web-based training videos did not function until late in the process.
* Note: Unedited Customer Comments
Comments that Represent Suggestions for Improvement from Customers*:
Staff should be better informed; if they can’t answer a question they should know the correct office to forward the person to.
Keep web pages updated.
Call center needs more employees..
Easier searches on data portal
Have varied office hours so that you can conduct business after 5 p.m. and on Saturdays.
Provide printed materials stating necessary information.
Too many different agencies doing same things. Retirement and investment had to go back and forth too much.
More inter agency communications. It is sometimes hard to know what each agency/division is thinking/doing. We all have personnel with strengths/weaknesses. If we communicate better we can leverage/overcome this.
* Note: Unedited Customer Comments
What’s Next …
Rob Hebeler, Chief Performance Officer
State of Florida, Department of Management Services
(850) 487-9887 office, (850) 491-2095 cell
[email protected]
We Serve Those Who Serve Florida
DMS E-Mail Based Pulse Survey
Clicking on the survey link brings you to our web-based customer survey.
DMS Performance Scoreboard: Service, Product and Problem Resolution Performance
Measure: Benchmark June 2007 / Q3-07 PULSE / Q3-W2 Cum September 2007
– Satisfaction – Experience vs. Expectation: 60% / 62% / 62%
– Customer Service: 50% / 49% / 51%
– Friendly: 65% / 65% / 66%
– Listens Well: 55% / 57% / 55%
– Helpful: 45% / 47% / 48%
– Speed of Service: 50% / 52% / 53%
– Checks for Satisfaction: 35% / 34% / 35%
– Product Performance: 50% / 49% / 51%
– Available or Delivered as Promised: 65% / 65% / 66%
– Level of Quality: 55% / 57% / 55%
– Easy to Understand or Use: 45% / 47% / 47%
– Allowed me to Accomplish What I Wanted to Do: 50% / 52% / 49%
– Priced Fairly: 50% / 52% / 49%
– Problem Resolution Satisfaction: 35% / 34% / 37%
Key: Green = We’re Getting Better; Yellow = We are “Just Okay”; Red = We are Falling Behind the Customer’s Expectations
Top Accomplishments: Saved …; Implemented …; Recovered …; Established …; Designed …; Built …; Communicated …

Our Alignment & Focus
Our Motto: We Serve Those Who Serve Florida
Service Vision: Engaged Employees; Satisfied Customers
Service Mission: Providing Smarter, Better, Faster Services
Service Promise:
– Communicate Concerns Immediately
– Listen, Learn and Grow Together
– Act with Integrity and Honor
– Strive for Greatness
– Serve with a Servant’s Heart

Department of Management Services Priorities
■ Implement best practices to create a workplace of choice that fosters recruitment, development, recognition and reward
■ Improve our contract management capabilities
■ Create robust strategic plans for each major program and service area
■ Focus on the processes and procedures of our core competencies to create a springboard for world-class performance
■ Increase the brand awareness of our programs and services to our customers
■ Develop Human Resource practices that encourage a resilient workforce with the ability and desire to serve in an emergency or natural disaster
■ Measure what matters and continually improve the quality of services delivered to our customers
Customer Quarterly Survey Schedule
• Q2 FY0708 – Launch: Mid-January; Results: Mid-February
• Q3 FY0708 – Launch: Mid-April; Results: Mid-May
• Q4 FY0708 – Launch: July; Results: August
Closing Remarks
• Overall, DMS is doing a good job on product and service aspects. Indeed, based on the improvement in scores from the benchmark to Q1 FY0708, DMS is getting better!
• Overall Customer Satisfaction is up 9 percentage points over the June 2007 benchmark. Gains were evident in nearly every service and product attribute from the June 2007 Benchmark Survey to the Q1 FY0708 Survey.
• Customer service attributes of helpfulness and speed of service were found to be the most strongly correlated to overall satisfaction. Simply put, customers want to be helped and helped quickly.
• When it comes to products, quality and allowing customers to accomplish what they set out to do were the most highly correlated with overall satisfaction. With the exception of product price, all product attributes correlated with overall satisfaction at a value greater than or equal to .79.
• Only 12% of respondents reported having had a problem with DMS. However, when customer problems did arise, problem resolution appeared to be an issue. Nearly half of those who experienced a problem and reported it, said that they were dissatisfied or extremely dissatisfied with the resolution.
• Quad analysis identified speed of service as the primary service attribute where changes should be made. With regard to product attributes, no single attribute emerged as a primary candidate for change; therefore, DMS might focus on those attributes that are the easiest to change.
• In the open-ended comments about how DMS might improve, 58% of those who responded mentioned organizational changes or improvements (interdepartmental communication, knowledge, etc.).