Georgia Infrastructure Transformation (GAIT) 2010
Service Levels and Operational Metrics
July 2009
GAIT Initiative Milestones (April 2009 – June 2010)
What to expect 7/1 – 8/31:

July
• Asset refresh planning continues; pilots begin
• Server and Storage Consolidation design
• Draft Disaster Recovery Plans
• Point of Service Surveys (results)
• Initial MNS management reports available
• Continued DDM deployment

August
• Change Management training
• Initial MNS SLA performance reports available
• Online Service Request deployment via Maximo
• Server and Storage Consolidation planning
• True-up activity
• Server tools schedule development and pilot
• Service Catalog improvements

4/1 – IBM Commencement
• EUC Service Catalog implemented
• GETS Portal activated
• Personnel transitioned
• Initiated BAU Unless Strategy
• Solution Request Process introduced
• Baseline customer satisfaction survey
• Invoices available online

5/1 – AT&T Commencement
• Service Desk – Regional Operation Center support
• Network management moved to Raleigh
• Voice/Data Service Catalog implemented
Begin Stand-up of Steady State Processes
Process/Tool Maturation and Continuous Improvement
Chargeback Implementation
Begin Server & Storage Consolidation
Strategic Technology Plan
Agreement to Information Security Control Document
DOR Print Shop Move
MNS Asset Inventory Complete
Initial MNS SLA Performance Reports Available
6/1 – Consolidated Service Desk "go live"
• Point of Service Surveys (Service Desk)
• Service Catalog "go live" (ordering)
• EUC Services Transformation
• Asset inventory continued
• GETS Print Shop move
Agenda
• Introduction to Service Levels
  ▪ What are they?
  ▪ Why are they important?
• SLA Reporting and Contract Documents
• SLAs for Key Areas
  ▪ Service Desk
  ▪ End User Computing
• Consolidated Service Desk Update
  ▪ Metrics
• Local and Network Printing Strategy
• Appendix
  ▪ SLA Terminology
What are the three types of Service Levels?
Critical Measurements
• Essential to the welfare of the State of Georgia
• Financial Service Level Credits are due for a Service Level Default
• Represented by an Expected and a Minimum level

Key Measurements
• Services are meaningful to State of Georgia business
• Financial Service Level Credits are NOT due for failure to attain
• May be candidates for Critical Service Levels
• Same format (Expected/Minimum) as Critical Service Levels

Critical Deliverables (one-time)
• Usually one-time deliverables such as service management manuals, disaster recovery plans, and technology plans
SLA & metrics handling considerations – the broader picture
(Communication Management · Performance Management · Knowledge Management · Data Management · Oversight)

Sharing information with the world
• SLA reporting communication strategy & framework
• Report content, design, creation, approval, delivery, distribution sequence

Socializing SLAs in our environment
• SDO meetings
• Online
• Webinars / training
• Agency Head meetings
• Agency Advisory Council meetings

Managing behaviors
• SLA methodology: flexibility of SLAs, adding SLAs, continuous improvement
• Credits & earnback
• SP variable compensation

Managing the data (Oversight)
• Data validation
• Tracking
• Data access
• Data query
Three metrics types

Operational
• No minimum or expected values
• Leading – daily/weekly/monthly
• Some required by contract

Key SLAs
• Minimum and expected values
• No financial penalties

Critical SLAs
• Minimum and expected values
• Financial penalties
• Our focus today
Service Level Management and Reporting Details
Defined measurements and tools for Service Levels
Publish Service Level reports in the ESP Web Portal
Financial credits for non-performance or poor performance
Contract calls for continuous improvement annually
Improved visibility for all services throughout the enterprise with standardized reporting tools
Service Level Agreements
• Service Desk
• End User Computing
Service Desk SLAs
Service Desk Services

Designation | SLA | Time Measurement | Expected | Minimum
Critical* | First Call Problem Resolution | Resolved during first phone call*** | 80% | 75%
Critical* | Percentage of Problems Resolved | 48-hour resolution*** | 90% | 85%
Critical* | Calls Abandoned | Not applicable | No more than 6% | No more than 8%
Key** | Speed to Answer | 60 seconds | 90% | 85%

*Critical Measurement  **Key Measurement  ***Calls measured are those considered resolvable at the Service Desk
End User Computing SLAs
End User Computing Services

Designation | SLA | Time Measurement | Expected | Minimum
Break/Fix | Time to Respond | Within 30 minutes during business hours | 90% | 85%
Break/Fix | Tier 1 VIP | 4-hour resolution | 90% | 85%
Break/Fix | Tier 2 VIP | 8-hour resolution | 90% | 85%
Break/Fix | Tier 3/Campus (350 assets within a 15-mile radius*) | Next-business-day resolution | 90% | 85%
Break/Fix | Tier 4/Remote (all other locations*) | 3-business-day resolution | 90% | 85%
Service Request | IMACs | 5-business-day completion** | 95% | 90%

*Based upon RFP data; subject to change after completion of the wall-to-wall inventory and true-up
**Excludes new equipment orders, which may take 2 – 3 weeks
Consolidated Service Desk Status Update
Speed to Answer Comparison

[Chart: % of calls answered in less than 60 seconds (daily), June 1 – July 10. Target: 90%; minimum target: 85%. Annotation: process much more stable after improvement.]
Metric Attainment Strategy

Service Desk Resolved < 48 Hours (Min: 85%, Exp: 90%)
• Raise awareness
• Hourly/daily aged-ticket updates
• Dedicated queue monitors
Owner: Dennis Quinto / Alisa Bettis

Speed to Answer (Min: 85% < 60 seconds, Exp: 90% < 60 seconds)
• Focus on call-back and call-handle-time opportunities
• Deploy biometric password tool
• Utilize Maximo automation
• Proactively identify derailments due to environment: changes to infrastructure, seasonal volumes, etc.
• Staff automation resource
Owner: Dennis Quinto / Alisa Bettis

Abandon (Min: 8%, Exp: 6%)
• Analyze impact of Priority 1s and 2s
• Define abandon-rate threshold
• Frequent scheduling reviews
• Utilize optimal scheduling
Owner: Dennis Quinto / Alisa Bettis

Customer Satisfaction (Min: 75%, Exp: 80%)
• Deploy Right to Left strategy
• Quicker CKM turnaround time
• Formulate detection process
• Consistent performance management and feedback
Owner: Dennis Quinto / Alisa Bettis

First Call Resolution (Min: 75%, Exp: 80%)
• Deploy Right to Left strategy
• Quicker CKM turnaround
• Mine resolved incidents for knowledge opportunities
• Measure CKM effectiveness
Owner: Dennis Quinto / Alisa Bettis
Speed to Answer (90% answered in < 60 seconds)

Pre-process change (June 1 – June 19)
• Mean: 74.21% | Upper Process Limit: 107.36% | Lower Process Limit: 41.07%
• Attained 90% or greater on 1 of 15 days
• Unstable process
• Significant moving-range variance from the mean

Post-process change (June 22 – July 10)
• Mean: 93.66% | Upper Process Limit: 103.02% | Lower Process Limit: 84.29%
• Attained 90% or greater on 9 of 10 days
• Stable process
• Note proximity of data points to the mean (less variance)

[Control charts: daily Speed to Answer vs. mean and upper/lower process limits, June 1 – June 19 and June 22 – July 10.]
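The mean and upper/lower process limit figures quoted on these slides are consistent with an XmR (individuals and moving-range) control chart, where the limits sit at the mean plus or minus 2.66 times the average day-to-day moving range. A minimal sketch, assuming that chart type; the daily values below are made up for illustration and are not the actual Service Desk data:

```python
def xmr_limits(values):
    """Return (mean, upper process limit, lower process limit) for an
    XmR individuals chart over a series of daily measurements."""
    mean = sum(values) / len(values)
    # Average absolute day-to-day change (moving range of span 2).
    mr_bar = sum(abs(b - a) for a, b in zip(values, values[1:])) / (len(values) - 1)
    # 2.66 is the standard XmR constant (3 / d2, with d2 = 1.128 for n = 2).
    return mean, mean + 2.66 * mr_bar, mean - 2.66 * mr_bar

if __name__ == "__main__":
    # Illustrative daily "% answered in < 60 seconds" figures (not real data).
    daily_pct_answered_60s = [62.0, 81.5, 70.2, 90.1, 68.4, 77.9, 85.0]
    mean, upl, lpl = xmr_limits(daily_pct_answered_60s)
    print(f"Mean: {mean:.2f}%  UPL: {upl:.2f}%  LPL: {lpl:.2f}%")
```

A process is conventionally called stable when all daily points fall inside these limits, which is the sense of "stable" used on these slides.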
Abandon Rate Comparison

[Chart: daily abandoned-call rate (%), June 1 – July 10. Target: < 6%. Annotation: process stable after improvement; target achieved.]
Abandon Rate (target: 6%)

Pre-process change (June 1 – June 19)
• Mean: 7.77% | Upper Process Limit: 17.43% | Lower Process Limit: -1.90%
• No historical trending data available
• Unstable process
• Significant moving-range variance from the mean

Post-process change (June 22 – July 10)
• Mean: 2.30% | Upper Process Limit: 7.25% | Lower Process Limit: -2.64%
• Scheduling based on agency skilling volumes, arrival patterns, and trending analysis
• Stable process
• Less moving-range variance; points closer to the mean

[Control charts: daily abandon rate vs. mean and upper/lower process limits, June 1 – June 19 and June 22 – July 10.]
First Call Resolution

• Calculated based on Service Requests opened and closed at the IBM Service Desk while the user is on the phone
• Approximately 24,080 Service Requests opened at the IBM Service Desk (9,041 closed at the IBM Service Desk)
• Password resets are approximately 50% of Service Requests closed at the IBM Service Desk
• Other FCR categories: GETS Portal, GSMRT, Order Now, legacy how-to/inquiry (Shines, SSL, etc.)
• Right to Left strategy will drive an increase in FCR
• Voice Trust will drive biometric password automation
• NOTE: Data is an approximation, not an official month-end report

First Call Resolution (June 1, 2009 to July 6, 2009)
• Mean: 73.67% | Upper Process Limit: 91.06% | Lower Process Limit: 56.27%
• Expected SLA: 80% | Minimum SLA: 75%

[Control chart: daily FCR vs. mean and upper/lower process limits.]
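Per the footnote on the Service Desk SLA table, only calls considered resolvable at the Service Desk count toward FCR, which is why the FCR mean (73.67%) is far higher than closed-over-opened tickets (9,041 / 24,080) would suggest. A hypothetical sketch of that calculation; the field names are illustrative assumptions, not the actual Maximo schema:

```python
def first_call_resolution(tickets):
    """FCR% = tickets closed while the caller was on the phone,
    as a share of tickets considered resolvable at the Service Desk."""
    resolvable = [t for t in tickets if t["resolvable_at_desk"]]
    if not resolvable:
        return 0.0
    resolved_first_call = sum(1 for t in resolvable if t["closed_on_first_call"])
    return 100.0 * resolved_first_call / len(resolvable)

if __name__ == "__main__":
    sample = [
        {"resolvable_at_desk": True,  "closed_on_first_call": True},   # password reset
        {"resolvable_at_desk": True,  "closed_on_first_call": False},  # escalated
        {"resolvable_at_desk": False, "closed_on_first_call": False},  # field dispatch, excluded
        {"resolvable_at_desk": True,  "closed_on_first_call": True},
    ]
    print(f"FCR: {first_call_resolution(sample):.1f}%")  # prints "FCR: 66.7%"
```

The excluded field-dispatch ticket does not lower the score, mirroring the "resolvable at the Service Desk" qualifier in the SLA footnote.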
First Call Resolution Examples

Password Resets
• Resolution candidates: SharePoint, VPN-SSL FirePass, Windows XP, TSO, RACF (GOES), Shines, GroupWise, GETS, Maximo
• Resolver group referrals: Success, e-billing, GCIC, PeopleSoft, voicemail

OS Support
• Resolution candidates: usage issues with Windows XP, 2000, Vista, Novell
• Resolver group referrals: mainframe, Unix

Application/Platform Support
• Resolution candidates: web applications (ga.gov, teamgeorgia, PW reset, CDC SSL certificate verification, intranet connectivity/browser interface); telephony (Media Player MP3); connectivity (desktop verification, My Network Places, Device Manager, etc.); triage (GETS, Maximo, OrderNow)
• Resolver group referrals: web (extranet); telephony break/fix; connectivity (ISR, switch, router resets)
GTA Service Desk Volumes

• GTA Service Desk volumes show a consistent sawtooth pattern
• Volumes decrease as the week progresses
• Mondays are the busiest days of the week
• Fridays are the least busy days of the week
• Quarter/month-end activities drove week 1 go-live volumes
• 4th of July volumes (week 5) fell below the mean

GTA Service Desk Volumes (June 1, 2009 to July 10, 2009)
• Mean: 1,031.21 calls | Upper Process Limit: 1,347.96 calls | Lower Process Limit: 714.47 calls

[Control chart: daily calls answered vs. mean and upper/lower process limits.]
Local Printer Strategy
Local Printer Support
• The Enterprise strategy is to reduce local printers and increase network printers for cost efficiency as well as environmental focus
• VIPs or those with business need will keep local printers; others will be mapped to a network printer
• Local printer failures will not be repaired/replaced if there is an available network printer nearby
• Network printers are being installed throughout the Enterprise as part of the PC Refresh
Appendix
Service Level Terminology
• Service Level Agreement (SLA) – An agreement that specifies “how well” we want the service provider to perform services
• Critical Measurement – Those service levels that are measured and have financial liability to the service provider if they are not performed ($$ at risk)
• Key Measurements – Those service levels that are measured and do NOT have financial liability to the service provider if they are not performed ($$ not at risk)
• Critical Deliverable – A one-time service level that must be met
• Expected Service Level – The average of past performance that the Service Provider must consistently meet
• Minimum Service Level – The minimum service level that the Service Provider must always remain above