County of San Diego Health and Human Services Agency
Consulting Report on Knowledge Integration Project
Prepared By
Haley Exum Jeffrey Jahnke Andrea Loffert
David Wong
Date Submitted for Client Approval
December 5, 2013
Presented in partial fulfillment of the requirements for the Master of Business Administration degree, Graduate School of Business Administration,
San Diego State University
_____________________________          ________________________________
John Francis, Ph.D.                    Mujtaba Ahsan, Ph.D.
Page 1 of 129
SDSU MBA Consulting
Final Report Acknowledgment
I, Adrienne Perry of the County of San Diego Health and Human Services Agency, do hereby
acknowledge receipt of a copy of the final consulting report that was prepared by the student consulting
team from the San Diego State University College of Business Administration.
The team has met with me and discussed the findings of this report. While this report does
address the areas of management concern described in the engagement letter, and the team's findings
represent valid considerations of these areas that are of utility to our management, this
acknowledgment does not necessarily mean that I am in complete agreement with the
recommendations.
____________________________________   _________________________________
Adrienne Perry, Senior Project Manager   (Date)
Knowledge Integration Project

____________________________________   _________________________________
Haley Exum, M.B.A. Candidate             Jeffrey Jahnke, M.B.A. Candidate

____________________________________   _________________________________
Andrea Loffert, M.B.A. Candidate         David Wong, M.B.A. Candidate
Table of Contents
Executive Summary ................................................................................................................................. 5
Introduction ............................................................................................................................................ 8
Problem Statement ................................................................................................................................. 8
Background ............................................................................................................................................. 8
Project Objectives ................................................................................................................................... 9
Project Methodology ............................................................................................................................. 10
Objective 1: Metrics .............................................................................................................................. 13
Methodology ..................................................................................................................................... 13
Results ............................................................................................................................................... 14
Secondary Research ....................................................................................................................... 14
Interview Results ........................................................................................................................... 15
Survey Results ............................................................................................................................... 15
Recommendations ............................................................................................................................. 17
Recommended Metrics .................................................................................................................. 17
Dashboard Design .......................................................................................................................... 21
Objective #2: Return on Investment ...................................................................................................... 23
Recommendations ............................................................................................................................. 34
Objective #3: Change Management ....................................................................................................... 36
Methodology ................................................................................................................................. 36
Results ........................................................................................................................................... 37
Secondary Research Results ........................................................................................................... 38
Interview and Focus Group Results ................................................................................................ 39
Survey Results ............................................................................................................................... 43
Recommendations ............................................................................................................................. 45
Conclusions ........................................................................................................................................... 51
References ............................................................................................................................................ 53
Appendices............................................................................................................................................ 55
Appendix A: Letter of Engagement .................................................................................................... 55
Appendix B: Stakeholder Analysis ...................................................................................................... 66
Appendix C: SWOT Analysis ............................................................................................................... 67
Appendix D: Trends Analysis .............................................................................................................. 68
Appendix E: Live Well San Diego Top 10 Indicators Dashboard ........................................................... 69
Appendix F: HHSA Sample Dashboard ................................................................................................ 70
Appendix G: ROI Research Schedule .................................................................................................. 74
Appendix H: The ADKAR Model of Change Management ................................................................... 76
Appendix I: Leveraging Current Communication Tools ....................................................................... 78
Appendix J: Focus Group Attendance ................................................................................................. 80
Appendix K: Surveys .......................................................................................................................... 82
Program SME Survey ..................................................................................................................... 82
Proposed Front Line Survey (Unable to Perform) ........................................................................... 85
Data SME Survey............................................................................................................................ 87
Appendix L: Interviews & Focus Groups ............................................................................................. 89
County Executives .......................................................................................................................... 89
ADDs ............................................................................................................................................. 89
KIP Group ...................................................................................................................................... 92
Executive Team/Subset – Representative of AB109 and LIHP ......................................................... 92
Program Managers ........................................................................................................................ 93
Appendix M: Progress Reports ........................................................................................................... 94
Appendix N: Meeting Notes ............................................................................................................... 96
Appendix O: Survey Results ............................................................................................................. 115
Appendix P: Acronym Cheat Sheet Example ..................................................................................... 122
Appendix Q: Change Management Communications ....................................................................... 123
Appendix R: Change Management – Disruptive Communication Examples ...................................... 124
Appendix S: Lean Six Sigma evaluation of the AB109 Service Flow ................................................... 127
List of Tables and Figures

Figure 1 Front Dashboard View .............................................................................................................. 22
Figure 2 Current State of AB109 Service Flow ........................................................................................ 31
Figure 3 Future State of AB109 Service Flow .......................................................................................... 32
Figure 4 SDCHHSA (LIHP & AB109 Stakeholders) .................................................................................... 66
Figure 5 First Drilldown of Dashboard .................................................................................................... 70
Figure 6 Second Drilldown of Dashboard ............................................................................................... 71
Figure 7 Third Drilldown of Dashboard .................................................................................................. 72
Figure 8 Fourth Drilldown of Dashboard ................................................................................................ 73
Figure 9 Factors influencing Awareness of the need for change. Adapted from ADKAR (p. 45), by J. M. Hiatt, 2006, Loveland, CO: Prosci Research. Copyright 2006. Adapted with permission. .......................... 76
Figure 10 Factors influencing Desire for change. Adapted from ADKAR (p. 45), by J. M. Hiatt, 2006, Loveland, CO: Prosci Research. Copyright 2006. Adapted with permission. ........................................... 77
Figure 11 Recommended Use for Lync Application ................................................................................ 79
Figure 12 Sample Survey Screen ............................................................................................................ 82
Figure 13 Acronym Cheat Sheet Example ............................................................................................. 122
Figure 14 Educate about the Universal Dictionary – Acronyms and Ontology ....................................... 123

Table 1 Interviews Conducted ............................................................................................................... 12
Table 2 Abbreviated Research Methodology.......................................................................................... 24
Table 3 SWOT Analysis .......................................................................................................................... 67
Table 4 Live Well San Diego Top 10 Indicators Dashboard ...................................................................... 69
Table 5 Table of Primary Research Activities .......................................................................................... 74
Table 6 KIP Privacy/Program SME Affinity Exercise Oct 14, 2013 ............................................................ 80
Table 7 KIP Data Governance/Select ADDs Focus Group Oct 16, 2013 .................................................... 80
Table 8 KIP Executive Subset Semi-Structured Focus Group Oct XX, 2013 ............................................... 80
Table 9 Community Transition Center/MASU Site Visit/Focus Group Oct 23, 2013 ................................. 80
Table 10 "Stoplight" Report 11/1/13 ...................................................................................................... 94
Table 11 "Stoplight" Report 11/8/13 ...................................................................................................... 94
Table 12 "Stoplight" Report 11/15/13 .................................................................................................... 95
Table 13 Data SME Survey Results: Snapshot of Raw Data Collection ................................................... 121
Table 14 Present State ......................................................................................................................... 127
Table 15 Future State .......................................................................................................................... 128
Executive Summary
Our foremost goal was to provide recommendations regarding the development and launch of the
County of San Diego Health and Human Services Agency's (HHSA) initiative, the Knowledge Integration
Program (KIP). This goal was realized by completing three primary objectives:
1.) Researched and developed a set of useful metrics that can be realistically presented in a
dashboard format. Employees and departments at various levels throughout the HHSA will
utilize this tool, so the data measured and the dashboard's look and feel will vary by user.
2.) Established a contextual background for the future development of a KIP Return on Investment
(ROI) model. This information was supplemented by an analysis of the critical factors
determining an initiative’s direct and indirect cost savings, specific to programs serving the
AB109 and Low-Income Health Program (LIHP) populations.
3.) Researched various Change Management models, selected the approach most congruent with
the HHSA's vision, and recommended a preparatory communication strategy for KIP roll-out.
The team conducted significant background research on the health care and county-level agency
environment, as well as the current operations of the County of San Diego HHSA. We identified
appropriate research methods to uncover key metrics, develop an approach to identifying direct and
indirect cost savings, and refine a change model appropriate to KIP's Phase 1 implementation. Over
twelve weeks, we designed and refined survey and interview questions with the assistance of KIP
team members to identify the appropriate audience for each objective. We solicited candid and
insightful responses from over 100 HHSA personnel by visiting program facilities, conducting focus
groups and one-on-one interviews, and deploying online surveys.
The team analyzed copious notes and data to identify seven tactical measures directly related to improving
operational effectiveness and customer outcomes. These measures are: number of customers currently
served by the HHSA, number of applications received and processed, customers using more than one
service, success rate of referrals, successful case completion, occupancy, and system performance.
Many of the measurements suggested directly involve front line staff participation to realize KIP’s full
value, but also offer benefits to upper management and executive staff. Our team also created a
template of the HHSA's envisioned dashboard, which will be an integral deliverable in KIP's Phase 1
deployment. The dashboard identifies peripheral metrics that will become ancillary benefits of the data
integration amongst programs.
To expose key areas of KIP’s potential direct and indirect cost savings, such as reducing benefit
overpayments, redundancy in paperwork, and visits to sites per service, we spent hours with the
Community Transition Center staff to observe their current workflows. Our analysis was largely
enhanced by case studies of a few county systems recently initiating a similar integration, though with
distinctly different service offerings and populations. In our recommendations, we provide extensive
background on Lean Six Sigma tactics relevant to the HHSA’s initiative, as well as recommendations for
applying these techniques to current and future operations. In particular, focus groups and one-on-one
interviews offered great insight into areas of redundancy and duplication of efforts and services.
Finally, we assessed the climate for change across various levels in the organization using the ADKAR
method, which required an analysis of the internal level of awareness and desire for change. Our
research suggests that staff is largely aware of the need for change in the HHSA and innately desires the
improved service delivery. However, respondents largely desire this change and enhanced
communication from other programs, not necessarily their own, so it is evident that frustrations
amongst programs will be a significant challenge to address. Our change management recommendation
details a communications strategy identifying a support and leadership infrastructure, time schedule for
KIP unveiling, and appropriate channels of communication per audience.
As our research unfolded, our team developed a supplementary perspective and a corresponding
recommendation for the HHSA and KIP team. Our original charge, which we have completed, was to
recommend metrics for a dashboard used by HHSA executives in the preliminary stages, eventually
offering support for managers, supervisors, and end users--the front line. Our research and analysis
suggest an alternative
approach. While innovation and change of this scope undoubtedly requires strong, visible leadership,
KIP’s data-driven decision making benefits should not begin at the top. Rather, with the tools to inform
day-to-day operations, front line staff will have the ability to improve outcomes and drive the return on
investment of this initiative. Igniting the Knowledge Integration Project with an executive dashboard
would certainly offer benefits, namely, identifying who the HHSA is serving, how many unduplicated
customers are served, the HHSA’s capacity and potential for growth, and high or low performers
amongst HHSA programs. These are all very significant quick wins. However, this approach is an
extension of reactive decision-making. Our research suggests an upstream approach: Initiating
dashboard design for customer-facing operations, showcasing the information exchange system as a
“tool designed for them – the front line staff,” and empowering staff to make decisions, reduce
redundancy and overpayments, recognize successes, and improve outcomes.
Introduction
First, let us thank you for giving us the opportunity to work with the County of San Diego Health and
Human Services Agency (HHSA). We greatly appreciate your sponsorship of the San Diego State
University M.B.A. program. As seasoned MBA students with work experience and varying individual
strengths, we applied our knowledge and skills to offer recommendations supporting your various
objectives. We believe that the information provided in this report will be a valuable contribution to
your organization. The recommendations provided are based on exhaustive primary and secondary
research, consisting of industry best practices, management theory, and candid, insightful participation
from HHSA staff. In addition, two knowledgeable and experienced SDSU faculty advisors contributed to
the quality of this report. We are confident that the research results and subsequent recommendations
will assist in realizing HHSA’s strategic goals.
Problem Statement
Since the assimilation of multiple independent organizations into a single agency in the late 1990s, the
County of San Diego HHSA has offered a comprehensive array of health and social services to the
community. The primary goal of this integration was to streamline the delivery of the many health and
social services offered, to more efficiently and effectively serve the residents of the sixth largest county
in the United States. However, the observed result has been collaboration rather than true integration,
as the operational independence of these programs discourages continuity of customer/patient care.
Continued disjointedness amongst its many programs would dilute the Agency's mission to "make
people's lives healthier, safer, and self-sufficient by delivering essential services" in a cost-effective and
outcome-driven way. The County of San Diego HHSA acknowledges this need for change and is making
groundbreaking strides toward an innovative solution.
Background
We understand that the lack of an integrated data management system spanning across HHSA programs
has led to operational inefficiencies and threatened the organization’s continued effectiveness. Though
the HHSA produces and retains significant data about customers served, the data collected lacks
consistency across the enterprise. Consequently, this dynamic limits data interpretation and thus
usefulness. As it stands now, the HHSA cannot answer even the most basic questions about its customer
base, such as “How many unique individuals do we currently serve?” or “How many of our customers
use more than one of our services?” The HHSA has identified the need for an integrated, streamlined
service delivery system consistent with its core values: Integrity, Accountability, Innovation, Quality,
and Results. The Knowledge Integration Program, a central information exchange system, aims to set
enterprise-wide standards and common metrics and to improve the quality of the organization's service
delivery. At this stage, the County is evaluating RFP responses from potential information exchange
vendors for a three-phase implementation. Vendors' proposals will be assessed based on their
congruence with the Agency's envisioned goals, substantial responsibilities to stakeholders, and cost.
The project entrusted to the SDSU MBA student team is to identify an aggregate of key metrics currently
residing in independent data silos, in preparation for the information exchange roll-out. These integrated
metrics should attest to the community’s overall health and inform operational decision-making at a
tactical level. Ultimately, the metrics identified will aid in the design and creation of a customized
dashboard for the agency’s leadership team. As a base for our analysis, the research scope was
narrowed to programs serving two customer segments in the ten pilot data source systems—the Low-
Income Health Program (LIHP) population and the AB109 population.1 Peripheral goals included an analysis
of key areas influencing direct and indirect cost savings, as well as assessing the internal response to the
forthcoming changes.
Project Objectives
To achieve these goals, three primary objectives were met. In order of priority, these objectives were:
1. Capturing each program’s key metrics and customer profiles in support of the near-term system
integration, through both candid conversations with the HHSA staff and cross-functional
analysis. The planned deliverable was to identify comprehensive metrics that reflect and
support the community’s overall health at a tactical level. To effectively organize the responses
received, the team’s preliminary research identified data each department/program produces,
where it is stored, who manages data retention, and how/with whom the data is shared.
2. Establishing a contextual background for the future development of a KIP Return on Investment
model. This information was supplemented by an analysis of the critical factors determining an
initiative's direct and indirect ROI. The objective also included an assessment of service functions,
aiming to identify areas for potential reduction of replicated services and/or administrative
functions and a reduction in the number of interfaces.

1 Individuals convicted of non-serious, non-violent, non-sex offenses serve their sentences in county jails instead of state prison (CDCR, 2013).
3. Assessing the organization’s current climate as it relates to change management. Leveraging
the evaluation to develop high-level recommendations for staff assimilation into the
information exchange system with minimal pushback.
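For reference, the return on investment framing in Objective 2 follows the conventional ROI relationship. The decomposition below into direct and indirect savings is illustrative only, reflecting the cost areas discussed in this report (e.g., benefit overpayments and duplicated paperwork); it is not an established HHSA model:

```latex
% Conventional ROI definition (illustrative framing, not an HHSA-specific model):
\[
\mathrm{ROI}\;(\%) \;=\;
\frac{\bigl(S_{\text{direct}} + S_{\text{indirect}}\bigr) - C_{\text{total}}}{C_{\text{total}}}
\times 100
\]
% where S_direct   = direct cost savings (e.g., reduced benefit overpayments),
%       S_indirect = indirect savings (e.g., reduced duplicated paperwork and site visits),
%       C_total    = total cost of the KIP initiative over the evaluation period.
```

A future KIP ROI model would replace these placeholder terms with the financial data that was unavailable at this stage of implementation.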
Project Methodology
The team leveraged extensive historical research and HHSA reports to design our data collection
approach. Our overall project methodology entailed the following:
Studied collateral, news articles, and online sources related to the LIHP and AB109 programs
and Live Well San Diego initiatives, in an effort to understand the HHSA audience. Reviewed
secondary research/reports provided by the client, related to Data Governance, the AB109 Data
Match Survey, previous KIP introduction presentations, and research notes provided by third-
party consulting firms.
Reviewed the Request for Proposal (RFP) solicitation offered to information exchange vendors,
as well as the vendors’ preliminary responses to the Request for Information (RFI). This offered
insight into the HHSA’s wish list of KIP deliverables, the actual capabilities of the intended
information exchange, and proposed change management/training support.
Performed case study analysis to assess the internal and external climate of the HHSA, in order
to design relevant research questions and make appropriate recommendations. In preparation
for primary research with HHSA staff, a stakeholder analysis, SWOT analysis, and trends analysis
were conducted (see Appendix B-D).
Compared the County of San Diego HHSA case analysis with other jurisdictions with integrated
data exchanges, namely the New York City Health and Human Services Connect Project and the
Alameda County Social Services Agency.
Held preliminary client meetings with Carrie Hoff, Assistant Deputy Director, and Adrienne Perry,
Senior Project Manager, to refine the project scope and approach with various audiences (See
Appendix A for the Letter of Engagement). Determined that select groups have been introduced
to KIP initiatives and there is a significant need for establishing a change management
framework. During these meetings we determined the necessity to modify our planned ROI
methodology and project scope, due to the lack of access to key financial data at this stage in
KIP implementation.
Strategically chose the sequence of interview scheduling. To create an informed platform as a
basis for our recommendations, we first connected with many key leaders. These individuals
included:
o Executives and Upper Management, including Assistant Deputy Directors (ADDs) and
Program Subject Matter Experts (Program SMEs) – The dashboard will be viewed most
often by these users to help them quickly track outcomes and various successes of the
integrated data systems project.
o Data Custodians/Subject Matter Experts (Data SMEs) – Staff who use the data on a daily
basis.
o Front Line Staff – Representative of the AB109 and LIHP customer populations.
Throughout the first several weeks, we determined that the planned interview with the
HHSA’s IT consulting firm, Gartner Consulting, would offer little to no benefit for our
objectives within the limited time frame available. As the project scope was further
refined, our overarching goal shifted from identifying what data is stored and where (as
this research is already underway internally), to pinpointing tactical metrics vital to the
programs’ operational effectiveness and positive outcomes.
Revised survey, interview, and focus group questions frequently over the course of several
weeks to ensure responses would be relevant to our objectives and appropriate to the
audience. Designed the survey to begin with audience qualifiers, to identify the program they
most closely identify with and their tenure with the HHSA. Reviewed all questions with the
client and SDSU faculty advisors. Selected three to five questions to pose in one-on-one
interviews and focus groups (See Appendix J for Focus Group Attendance), as attempting to
cover too many topics in a limited time frame can perplex participants and/or result in
superficial information (Polonsky & Waller, 2011).
Assigned objective identifiers to all research questions (R for ROI, CM for Change Management,
and M for Metrics) to aid organization of responses once received (Appendix K-L).
Scheduled all one-on-ones, focus groups, and survey deployment according to the
recommendations listed in Figure 1. The group also visited various program offices, including
the North Central Family Resource Center collocated with a Public Health Center, the
Community Transition Center/MASU Unit, and the Housing and Community Development office.
Table 1 Interviews Conducted
Date         Interviewed                              Position(s)                                                      Type of Interview
10/14/2013   Program Subject Matter Experts           Program Subject Matter Experts                                   Affinity Exercise
10/16/2013   Assistant Deputy Directors               Assistant Deputy Directors                                       Affinity Exercise
10/16/2013   Todd Henderson                           Director, Housing and Community Development                      Individual
10/17/2013   Adrian Gonzalez                          Public Safety Group IT Manager                                   Individual
10/17/2013   Andy Pease                               Chief Financial Officer                                          Individual
10/17/2013   Dean Arabatzis                           Chief Operations Officer                                         Individual
10/17/2013   KIP Team                                 KIP Team Members                                                 Focus Group
10/17/2013   Mike Haas                                Chief Information Officer                                        Individual
10/21/2013   Select Executives                        Child Welfare, KIP team members, HCPA, BHS                       Focus Group
10/22/2013   Mack Jenkins                             Chief Probation Officer                                          Individual
10/23/2013   Community Transition Center Employees    Primarily Probation Officers, Marriage and Family                On site visit & Focus Group
                                                      Therapists, and an Administrative Assistant
10/24/2013   Family Resource Center Employees         Application and renewal supervisors, FRC manager                 On site visit & Focus Group
10/30/2013   Barbara Jimenez                          Deputy Director                                                  Individual
Conducted an affinity exercise with Program SMEs at the HHSA's downtown office. During the
exercise, a high level of participation was observed, but responses generally lacked specificity.
Going forward, a true focus group format was employed, in an effort to support free-flowing,
candid conversations. Additionally, we suggested that our client offer preliminary information
to focus group participants, as to our general objectives and the topics to be discussed. This
strategy notably improved the richness of the discussions and responses captured. Facilitated
focus groups with the HHSA Executive Team, Assistant Deputy Directors, KIP Team, Community
Transition Center - MASU Unit staff, and Family Resource and Public Health Center staff.
Formatted the Program SME and Data SME surveys for upload into Survey Monkey. Sent the surveys
electronically and collected responses within two weeks. Received an average of 16 responses per group.
Though the team designed and anticipated deploying a survey to a large number of front line
staff (Appendix K), the Human Resources approval process for this audience exceeded our
project’s time constraints.
Toured a Community Transition Center (CTC) and a Family Resource Center (FRC) to supplement
the focus groups conducted, which offered substantial insight into the day-to-day operations of
the individual programs. David Wong returned for additional analysis on several occasions,
primarily for ROI research.
Kept Carrie Hoff and Adrienne Perry apprised of our progress throughout the duration of the
project via weekly conference calls and “Stoplight” Reports (See Appendix M for “Stoplight”
Reports).
After collecting a substantial body of responses and insights, we began data analysis for the
following three objectives.
Objective 1: Metrics
Methodology
To identify what data is retained and shared, and where, how, and with whom, we performed a deep
dive into the relationships among the 10 data sources and the programs they support. We gathered
program background information such as core service offerings, stakeholders and data custodians,
current utilization of data management systems, patient mix, and program vision and goals. Our client
then provided a list of census-based metrics, such as education level, quality of life, life expectancy,
income, security, physical environment, built environment, vulnerable populations, and community
involvement, which are the result of extensive public health research and constant monitoring.
This provided a strong starting point for understanding the preliminary measures important to the HHSA
and its key stakeholders, and drove discussion of what integrated metrics were possible, which metrics
would be easily understood, and which would be most useful for the future integrated data source.
It was also critical to determine a baseline of organizational metrics currently tracked, to avoid
redundant research.
Our primary research entailed in-depth interviews with data custodians and executive leadership. We
asked probing questions such as:
If you had access to other programs’ data, what metrics would you measure? What would you
do differently from a process approach?
If you were able to design your own dashboard, what would it look like? What customer,
operations, and outcomes-focused metrics would be accessible? How often would the data be
refreshed?
Which of your programs will see the most significant operational changes
(processes/procedures) with the KIP rollout?
Is there a report that exists that describes your customer profile? If so, could you please send it
to us? Are there any cross-program reports that inform how many of your customers use other
services?
Do you gauge customer awareness of other programs? If so, how?
What key metrics are vital to your level of the organization? Are any of these metrics tied to
funding or associated with financial penalties?
What reports do you have, and what reports do you wish you had? Do you have reports
available to do your job effectively?
Our objective was to "bring focus" to the wide array of reports and metrics utilized by the many systems
and located in many different silos. Once we gathered a significant collection of responses, we used
analysis tools such as Microsoft Excel macros and Tableau to identify trends and correlations in the
data, and ultimately to identify key metrics across programs. These data helped the group understand
the HHSA's current operational-metric needs as expressed by the key individuals mentioned previously.
We discussed the validity, usefulness, and value-add of all metrics identified by HHSA staff, then
reviewed the metrics for applicability in present and future usage. Although these metrics are primarily
intended for upper management and executives to track progress, the HHSA will eventually share them
with a multitude of stakeholders. Finally, the group identified seven tactical measures with elasticity
for multiple audiences and the potential to link with established census-based metrics.
Results
Secondary Research
The HHSA currently reports many key indicators and metrics representing overall county health.
Please refer to Appendix E for the Live Well San Diego Top 10 Indicators. Although these metrics may
also be included in the dashboard built through KIP, the SDSU MBA team's goal was to capture
operational metrics with tactical measures from individual programs; the Live Well San Diego indicators
do not all reflect operational performance or program-level data. These metrics informed our team
of the HHSA's current research and reporting capabilities.
Interview Results
Throughout the interviews, respondents suggested key metrics needed to improve operations, as well as
opportunities to reassess the efficacy of some current reporting. The following list outlines the most
frequently expressed desirables from HHSA staff respondents:
Metrics accessed biweekly or weekly
Specific drill downs of every metric
o Click on the metric and drill down by region/department/supervisor/customer
Upper management would like to see the Return on Investment of KIP
o Due to the enormous financial investment involved, executives will be charged with
expressing the monetary and intangible value added by KIP
The number of customers served at any given time
The ability to track customers' adherence to treatment plans and resulting outcomes
Metrics involving parolees and probation, since the AB109 population is one of the target populations
Availability of services across all service locations in the county
Increases and decreases in metrics relative to past weeks, months, and years
Survey Results
The team distributed two surveys, each open for two weeks. One survey went to a variety of
program subject matter experts (SMEs), fifteen of whom provided insight into the current level
of coordinated care delivered across departments (Appendix O). Most respondents represented Child
Welfare Services or Public Health Services, and all were heavily tenured (5-25+ years within their
respective programs or county services). It is important to note that three respondents
completed the "About You" section of this survey (which identifies the program they represent and their
tenure with the County) but did not respond to any other survey questions. This is a key finding for
purposes of assessing the reliability of the data, as the perspectives of Behavioral Health Services (BHS)
and Housing & Community Development were not captured. It also has implications from a change
management perspective, discussed further on. The Program SMEs reported that they currently
track and measure staff workload to ensure timeliness and productivity standards are met, but noted
that determining productivity standards is time consuming and "entails an ever vacillating set of
variables and product/task definitions." When asked "Is there a report that exists that describes your
customer profile," responses were divided: approximately half replied that no such report currently
exists, while the others indicated that reports are available. They access client demographics by
pulling reports from Child Welfare Services/Centers for Medicare & Medicaid Services, as well as from
PHN's Referral Criteria document. When asked "Do you have access to any cross-program reports that
inform how many of your customers use other services," all responses were no.
The data subject matter expert survey collected insights from eight representatives of Aging &
Independence Services, three from Eligibility, three from BHS, and single responses from the Executive
Office, Information Technology, and Child Welfare Services. Across seventeen responses, the survey
surfaced concerns about maintaining quality case notes and logs, a lack of cross-departmental reports,
and insufficient resources to perform core job functions effectively. Of the seventeen respondents,
approximately 71% felt they had access to all the reports needed to complete their jobs efficiently.
The most relied-upon reports were internal logs, daily case assignments,
weekly security reports, and a weekly and monthly manager dashboard. The metrics most vital to their
operations were measures of recidivism, 10- and 30-day visit data for Aging and Independence
Services, pending applications/timely disposition, Alcohol & Drug treatment outcomes, call volume
reports (abandoned calls, wait time, calls answered), and reports of referrals taken from each program
(measured by Aging & Independence Services). When asked to design their own dashboard with
imagined metrics, respondents identified operational metrics such as the percent of cases by office that
are ENI or NIFFI or have financial abuse allegations, continuous wait-time summaries (call center, lobby,
clients in queue), and referral success rate tracking, with the ability to drill down to users. Other desired
metrics included monthly number of clients by region and funding source, weekly number of
unduplicated customers/cases by program and overall, and number of clients per region by primary and
secondary drug of choice. Regarding outcomes, the Data SMEs identified monthly work participation
rate and average time on aid as desired metrics, as well as percentage of customers successfully
completing treatment. Respondents also imagined the ability to see customer information summaries
daily, regarding their use of other programs (e.g., abuse of aid sources, mental health history).
Further responses captured are available in Snapshot of Raw Data Collection (Appendix O, Table 13).
Recommendations
Recommended Metrics
Fueled by KIP's integrated data and individual program needs, we recommend a set of seven starter
metrics to track critical operational indicators, as well as five key metrics to consider in the future, as
KIP's capabilities are fully realized. Finally, we offer a template of the final output, the internal
dashboard, to visualize the HHSA's future capabilities as a data-driven entity.
The metrics recommended are the result of resonating themes and key points identified in surveys and
interviews (please refer to Appendix F for primary research results). We believe the identified
metrics will support the change in service delivery and improve the customer and staff experience, but
they must be made available for tactical, front-line use to realize their full value. Many of the metrics
below would inform decision making proactively at the customer-facing level (e.g., choosing the best
treatment plan for a customer based on past referral successes and failures) rather than reactively at a
higher level (e.g., swings in applications prompting reorganization of personnel resources). Still, the
added benefit for long-range decision making (forecasting, capacity planning, etc.) is undeniable. This
speaks to a suggestion we received in the select executive focus group: organizing the dashboard into
short-term, mid-term, and long-term metrics and goals. The list of developed metrics and the reasoning
behind them follows.
Number of unduplicated customers currently served by the HHSA. In the majority (over 50%)
of interviews and surveys, respondents identified their inability to track the number of
customers currently served as a significant concern. Monitoring this number would create
awareness of the HHSA's capacity and allow for proactive forecasting. Over time, tracking the
growth or decline of total customers will allow business analysts to observe and respond to
demographic trends through data-driven decision making. From this metric, other key measures
can be derived, such as the proportions of male versus female customers, ethnicities served,
customer age groups, and, most importantly, customer behaviors by region. All of these benefits
can improve resource allocation for a more efficient business flow and identify pain points
needing attention to improve outcomes.
Number of applications received and processed. This metric will help the HHSA understand the
correlations between the customer eligibility levels and the applications coming in. Some
questions that may be answered after tracking and understanding this data include: Are we
serving more customers as we process and accept more applications, or are those applications
coming from our current customer base? Is the number of applications we receive comparable
to other counties of similar size and structure to the County of San Diego HHSA? What
implications does that comparison entail in the short-term, mid-term, and long-term? Are
application counts down due to a high rate of application denial, administrative backlog, or are
other factors influencing the change? Where are we seeing the best and worst timeliness of
application processing throughout our programs, and what factors or best practices are
affecting this distinction? All of these questions can be answered with real-time tracking of the
number of applications and basic business analytics. Understanding trends and correlations
from this data can inform proactive decision-making, such as responding to changing customer
needs with expansion of service areas or reorganization of personnel resources. At the
executive level, focus group respondents expressed the desire to see this metric and related
measures on a weekly basis, as "there's not enough time to look at this daily and gain
meaningful insights."
Customers using more than one service. The vast majority of respondents expressed the
importance of understanding how many county services their customers are utilizing, and which
services are commonly used in tandem. This will improve the HHSA’s understanding of their
current customers’ needs. One benefit from this measure includes the possibility of creating
“bundles” of service delivery for the most common services utilized. Another pronounced
benefit of this metric would be the ability to detect benefit cross-over, and thus make strides to
reduce overpayments.
Success rate of…
o Referrals. Tracking referrals was a prevalent topic throughout much of our primary
research, as it is a significant pain point marked by data-poor decision making and time
consumption. With KIP, referral tracking will be automatic. We believe the most
relevant way to track referrals is by the success of the referral, defined as a customer
acting upon it. For example, if a customer is referred by Public Health Services to Child
Welfare Services, success occurs when that customer 'checks in' for the child services
appointment. Tracking this metric will help the HHSA
understand the effectiveness of their referrals, and allow management to identify teams
and individuals that consistently support effective coordinated care. Ultimately, we
highly recommend creating a dashboard for front line use, so that individuals can have
visibility into their own successes. This provides staff intrinsic reward as well as a
comparison of their personal performance with others. Knowing that their actions are
visible will likely give front-line employees more incentive to comprehensively assess
their customers' needs. For CTC front-line staff, the opportunity to view an AB109
customer's history of successful referrals would inform case plan development and
allow them to make better referrals for rehabilitation.
o Cases. Case tracking was mentioned throughout our research. The success of a case
depends on its resolution and/or the completion and closing of the case treatment plan.
We understand that cases may carry different meanings across programs, and thus
results should be categorized with like ontologies or modified if case completion does
not fit a specific program’s needs or deliverables. These modifications could include
progress bars representing the percentage of case completed, sorted by start date. We
recommend tracking this metric due to its potential to reveal correlations between
successful cases and other metrics (such as customer or application counts), to identify
departments falling behind in successful case completion, and to recognize which
employees have higher success rates. This, in turn, will support improved resource
allocation and reward high performers.
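To make the referral success rate concrete, the calculation described above can be sketched as follows; the program names, referral records, and check-in flags are invented placeholders, not HHSA data:

```python
# Hypothetical sketch of the referral success rate described above.
# A referral counts as "successful" when the customer checks in for the
# appointment it produced; all records below are invented examples.
def referral_success_rate(referrals):
    """Share of referrals where the customer checked in, as a percentage."""
    if not referrals:
        return 0.0
    checked_in = sum(1 for r in referrals if r["checked_in"])
    return checked_in / len(referrals) * 100

referrals = [
    {"from": "Public Health Services", "to": "Child Welfare Services", "checked_in": True},
    {"from": "Public Health Services", "to": "Child Welfare Services", "checked_in": False},
    {"from": "Probation", "to": "Behavioral Health Services", "checked_in": True},
    {"from": "Probation", "to": "Behavioral Health Services", "checked_in": True},
]

print(f"{referral_success_rate(referrals):.0f}%")  # 3 of 4 checked in -> 75%
```

Grouping the same records by referring team or individual would yield the per-employee view the dashboard recommendation calls for.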
Occupancy rate of services can streamline the referral process. As mentioned in an executive
interview, leadership would like to see how many beds are available at any location at any time.
This metric will again create awareness of the capacity available in various programs, and
provide additional benefit to staff attempting to make a referral (under the assumption that the
dashboard reaches that level of employee). If the intended service has full occupancy, the staff
member can quickly search for other appropriate solutions. Understanding the occupancy rate
may also support compliance with federal regulations relating to 'over-stuffing' locations. This
metric would realize its full value if it were available at the customer-facing level, where it could
serve as a tactical tool for finding bed availability. The front-line focus groups offered candid
stories of their pain points in obtaining bed availability from Anasazi and SanWITS. This process
could be significantly improved with access to this real-time dashboard metric (see Appendix L
for CTC notes).
System performance will be the only metric that does not show a week-over-week percentage
change; instead, the percentage reflects how well each system is running. For example, if the
Aging and Independence Services system performance metric shows 88%, that implies that 88%
of that data silo is correctly syncing with the KIP integrated system. When the percentage falls
below a certain threshold, the dashboard will call attention to that program/data silo and
reference any technological issues the silo may be experiencing. This metric will support
perceptions of KIP's trustworthiness as a secure and accurate platform, which was a concern
among all respondent groups. Quickly visualizing how each data silo is performing will be an
important reference when pulling cross-functional reports and confirming data integrity.
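The threshold behavior described for the system performance metric can be sketched as follows; the silo names, sync percentages, and the 85% alert threshold are illustrative assumptions, not HHSA figures:

```python
# Hypothetical sketch of the system-performance threshold check.
# Silo names, sync rates, and the 85% alert threshold are illustrative.
ALERT_THRESHOLD = 85.0

def flag_silos(sync_rates, threshold=ALERT_THRESHOLD):
    """Return the data silos whose sync percentage falls below the threshold."""
    return sorted(name for name, pct in sync_rates.items() if pct < threshold)

sync_rates = {
    "Aging and Independence Services": 88.0,  # 88% of the silo syncing with KIP
    "Child Welfare Services": 72.5,
    "Behavioral Health Services": 91.3,
}

print(flag_silos(sync_rates))  # the silos the dashboard would highlight
```

A flagged silo would then be annotated on the dashboard with any known technological issues, as described above.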
Recommended Future Metrics
Some metrics are not currently measured or tracked but are important to consider for future use.
We recommend the following future metrics:
Program Performance will track adherence to program-specific goals, such as processing
applications within the applicable time frame (typically 30-45 days, depending on the benefit)
and successfully collecting and documenting customer information (such as Social Security
number and address). Once adequate time has been allowed for the KIP learning curve and an
appropriate baseline has been established, programs should be monitored and held accountable
for adherence to organizational and program-specific standards.
Contractor Performance. Like Program Performance, contractors will be measured on the length
of time their patients are in treatment, recidivism, number of customers served per specified
period, customer/patient completion rate, rate of customer enrollment in education, and other
Utilization Review measures. This metric will be leveraged by quality control specialists for
analysis of provider and contractor performance.
Productivity, or Return on Investment (ROI), is an important metric for demonstrating that KIP's
large financial outlay is a successful investment. Components of ROI's indirect and direct cost
savings, detailed in the following section, would likely be relevant stand-alone metrics that feed
into the measurement of Productivity.
Customer Satisfaction will rate customers' experience of and satisfaction with the services
offered. This can take the form of a mail, point-of-contact, or internet survey, and would be
most beneficial if customer satisfaction were also assessed pre-KIP, to provide a benchmark for
the future.
Employee Satisfaction would help the HHSA maintain awareness of employees' current likes and
dislikes about their positions, although care must be taken when collecting the data for this
metric. Correlating employee satisfaction with customer satisfaction would also help the HHSA
understand service quality as a function of job satisfaction.
Dashboard Design
The dashboard template below visualizes the final output of KIP's integrated data: the HHSA's internal
dashboard. The front of the dashboard shows a high-level overview of all seven services offered and
seven different metrics, and the design is intended for use by both executives and program supervisors.
The high-level initial view will be most beneficial to executives because, as found through primary
research, executives prefer to see very little data, directing their attention solely to metrics in need of
action. All metrics are initially shown as percentages referencing the week-over-week change. For
example, if the dashboard shows -5% for customers at Aging and Independence Services (AIS), 5%
fewer customers were served by AIS than in the prior week. Conditional formatting throughout the first
dashboard varies by row: every row has a range of colors per metric (expressed as a percentage,
number, etc.), from green to red. The shade of red in which a number appears signals the need for
action and creates awareness of the metric's variance; the metrics most in need of attention are those
with the largest negative week-over-week change. Another feature of the front dashboard is a "System
Performance" measure reflecting how well the new integrated data system is functioning. Visibility into
this measure is important for all levels of management in order to keep cross-departmental reports and
data at their optimum level.
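The week-over-week change and row-wise color banding described above can be sketched as follows; the color thresholds and customer counts are illustrative assumptions, not HHSA values:

```python
# Illustrative sketch of the week-over-week metric and its color band.
# The -10%/-3% thresholds and sample counts are hypothetical assumptions.
def week_over_week(current, prior):
    """Percent change versus the prior week, e.g. -5.0 means 5% fewer."""
    return round((current - prior) / prior * 100, 1)

def color_band(pct_change, red_below=-10.0, yellow_below=-3.0):
    """Map a change percentage to a dashboard color, green through red."""
    if pct_change < red_below:
        return "red"      # largest negative swings demand action
    if pct_change < yellow_below:
        return "yellow"   # worth watching
    return "green"        # within normal variance

change = week_over_week(current=1900, prior=2000)  # -5.0, i.e. 5% fewer customers
print(change, color_band(change))
```

In practice the thresholds would be tuned per metric and per row, matching the row-variable conditional formatting described above.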
Figure 1 Front Dashboard View
Program supervisors will benefit from the 'drill down' option of the dashboard. Each metric can be
double-clicked to pull up a further drill-down of that specific metric or data point. A drop-down menu on
the front dashboard (Figure 1 above) isolates one service to show only its relevant metrics. At this
service level, a historical trend line appears automatically when any metric is clicked once (Appendix F,
Figure 5). When a metric is double-clicked, the dashboard drills down to a further breakdown of that
metric (Appendix F, Figure 6). For example, double-clicking the customer metric pulls up customer count
by week and specific demographic indicators such as male count, female count, and a breakdown of
customer ethnicities. This 'click to drill' method is used throughout the dashboard to reach increasingly
specific information, ending with client information (Appendix F, Figure 8). The client view shows
general information such as first name, surname, email, address, and Social Security number, along with
specifics such as services accessed, dates accessed, length of time using each service, and specific
reasons for use. Given the granularity of the drill-down capabilities, front-line employees will benefit
strongly, gaining efficiency in understanding a customer's needs and past services used.
In sum, we have offered solutions for delivering useful metrics to an audience of executive leadership
and management. We also offer a recommendation peripheral to our objectives but deemed crucial:
keep front-line staff top of mind when establishing operational metrics, designing dashboards, and
seeking KIP cost savings opportunities, in order to fully capitalize on KIP's proactive benefits.
Objective 2: Return on Investment
Methodology
For the Return on Investment (ROI) modeling portion of our project, we established a contextual
background for the later development of an ROI model. The team was originally tasked with developing
an ROI model for potential use during the implementation of Phase 1 of the KIP program. The planned
deliverable was a working spreadsheet with built-in formulas allowing figures and costs to be entered,
ultimately calculating a working ROI percentage. Initial research on ROI development suggested using
the financial template offered by the Centers for Medicare & Medicaid Services (CMS) "Health Care
Innovation Award" to assess KIP's return. The premise of the template was to calculate KIP's positive
impact by determining the overall cost Per Beneficiary Per Year (PBPY).
However, during the course of primary and secondary data collection, it became apparent that a model
could not be achieved within the scope of this project nor at this stage of KIP development. The team
and sponsor met to revise the Letter of Engagement, our project contract. Based on the revised
objectives, the following three sub-objectives were completed to support our second project objective:
Analyzed current services and processes to identify opportunities for direct savings that can be
used in a future ROI model
Analyzed current services, processes, and customer output to identify opportunities for indirect
savings that can be used in a future ROI model
Examined services/processes for potential efficiency gains at service centers (via Six Sigma
methodologies, e.g., takt time and value stream mapping)
With the revisions to the Letter of Engagement, it was apparent that the analysis of ROI subsections
would be a significant challenge. In for-profit organizations, ROI and related metrics provide a snapshot
of profitability, adjusted for the size of the investment assets tied up in the enterprise; however, in the
case of the HHSA, funding is largely based on allocations. While the organization does not have an
infrastructure to measure the performance of factors needed to calculate a true ROI, it does track
metrics to indicate performance, as listed in the metrics section of this report. With this in mind, the
background information obtained on the sub-objectives will be available for guidance at a later point in
time, when the information exchange system is implemented and an accurate ROI can be obtained. The
alternative research process consisted of the following action items in order:
Gathered further secondary research on ROI, Six Sigma and benefits analysis. This was
accomplished through internet research, industry-specific educational seminars, electronic
database searches/articles, academic books, management journals and case studies related to
similar initiatives/models at the county-level, such as the study on Alameda County’s
implementation of IBM’s Integrated Reporting system. Articles, books and journals on Six Sigma
analysis were also used.
Gathered process information from two programs / services. Visits to various locations provided
detailed snapshots of current processes and areas to evaluate for inefficiencies and potential
waste.
Compiled the most relevant information related to direct and indirect costs into the final report,
for future use in 2013 financial reporting and future ROI calculations.
Recommended a template or approach to assess other process / services for the future ROI
model.
The table below provides an overview of the primary and secondary research activities supporting our
recommendations. Please refer to Appendix G, Table 5 for a detailed description by schedule and
audience:
Table 2 Abbreviated Research Methodology

Primary Research          Secondary Research
One-on-one interviews     Process documentation
Affinity exercise         Published literature pertaining to ROI and Six Sigma
Focus groups              HHSA financial/metric data
Site visits/tours         KIP-related presentations
Online surveys            Web information search for the AB109 population
To build the foundation for our ROI objective, we designed interview questions to surface the direct and
indirect costs of operations, such as:
Which metrics are tied to funding (sanctions, overpayments, revenue recovery)?
Of the HHSA Systems to be interfaced in KIP Part I, how many are competing for general funds?
Do you track the cost of maintaining each of the data source systems? If yes, which ones and
can you give us this data? Do you have any financial information on the AB109 population?
Do you have records of the historical cost to implement the current systems?
Where do you expect to see cost savings with the newly integrated system?
What key metrics are vital to your level of the organization? Are any of these metrics tied to
funding or associated with penalties?
Results
Due to the change in deliverables for this objective, we present significant findings from secondary
research and Lean Six Sigma background information. Our results include an analysis of the observed
AB109 population workflow and its implications for developing direct and indirect cost savings.
Secondary Research
The ROI definition and model rationale must be established before analyzing direct and indirect cost
savings. Investopedia defines ROI as "a performance measure used to evaluate the efficiency of an
investment or to compare the efficiency of a number of different investments. To calculate ROI, the
benefit (return) of an investment is divided by the cost of the investment; the result is expressed as a
percentage or a ratio."
Equation 1 ROI Formula: ROI = (Gain from Investment - Cost of Investment) / Cost of Investment
Paul Farris, author of Marketing Metrics: The Definitive Guide to Measuring Marketing Performance,
provides a slightly different definition stating that ROI is “the concept of an investment of some resource
yielding a benefit to the investor. A high ROI means the investment gains compare favorably to
investment cost. As a performance measure, ROI is used to evaluate the efficiency of an investment or
to compare the efficiency of a number of different investments. In purely economic terms, it is one way
of considering profits in relation to capital invested.” A web search for ROI populated hundreds of ROI
types, applications, and approaches, suggesting that there are numerous interpretations of ROI. The
relevant point is that to define ROI, we must first understand the scope and purpose of the project. For
our purposes, the scope of the ROI assessment is as follows:
1. Provide project prioritization and justification
2. Evaluate the KIP project post-implementation
It is important, however, to outline ROI's limitations. ROI is a metric designed for assessing profitability
or financial efficiency, and it may produce misleading results if other factors are not considered. ROI is
a single number, surrounded by uncertainties that can strip it of context; it accounts for cost savings,
but it does not account for project risk or system effectiveness. To provide more meaningful context for
business decisions, ROI figures should at least include their assumptions along with the other criteria of
the specific ROI model.
There are no universal rules for calculating ROI; the guideline is that ROI should include all costs and all
related benefits. Comparing ROI calculations produced in different projects by different teams,
consultants, or contractors is not recommended, as they are not direct comparisons (apples to oranges).
For an ROI to be considered meaningful, it should include a detailed description of all costs and benefits,
the assumptions made, how each intangible value was derived, and how these were used in the
calculations. Omitting these details would make the ROI subjective and prone to human error or bias.
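As a minimal sketch under stated assumptions, a future ROI calculation following the definition above might look like the following; every dollar figure is an invented placeholder, not HHSA data:

```python
# Hypothetical ROI sketch following the definition above.
# All dollar figures are invented placeholders, not HHSA data.
def roi(total_benefit, total_cost):
    """ROI = (benefit - cost) / cost, expressed as a percentage."""
    return (total_benefit - total_cost) / total_cost * 100

# Benefits would aggregate the direct and indirect savings categories
# identified in this section (overpayments avoided, staff time saved, etc.).
direct_savings = 250_000.0    # e.g. reduced benefit overpayments
indirect_savings = 100_000.0  # e.g. staff time saved on referrals
implementation_cost = 500_000.0

print(f"{roi(direct_savings + indirect_savings, implementation_cost):.1f}%")
```

A working model would also document the assumptions behind each figure and how any intangible values were derived, per the guideline above.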
Primary Research Data
In response to our interview questions, many interviewees expressed concerns or limitations regarding
ROI and potential cost savings. We observed the following recurring themes and comments:
Costs of ensuring the quality of data (accuracy)
Costs of resources such as training and personnel for project implementation
More referrals mean more services delivered – an increase in costs rather than a decrease
o Prevalence of sanctions for timeliness in programs operating under federally mandated
performance metrics creates resistance to increasing referrals across programs. “What
happens when our team (already serving 25k people per month) gets an increased workload
and thus an unintended consequence – sanctions? The two goals are conflicting.”
Having more information creates more incentive to use it, so the executive teams expect to see
increases in costly litigation due to availability of heavy documentation
Costs that are measured related to AB109s include health care, jail costs, court, probation
supervision, and treatment. Reports are quarterly, and allocations (i.e., which program gets what money) are determined annually.
o Anasazi captures AB109 costs, but its financials are unique because it collects reimbursement from county customers, unlike many other programs (Welfare, Cal Fresh, etc.). Anasazi has three payment structures and a billing cycle specific to its operations.
Direct Cost Savings Development
Through the analysis of the information provided by primary and secondary research, the following list
of direct cost savings was considered:
Overpayments – Reduced benefit overpayments. During the focus group discussion at the Family Resource Center (FRC), respondents suggested potential data inconsistency and/or error within the system for individuals receiving Cal Fresh and MediCal. Benefits can also sometimes be continued even when customers are deceased or incarcerated. The KIP implementation would improve the quality of data and customer information by correcting errors and inconsistencies within the benefits profile. Non-compliant customers would also be better tracked to ensure compliance with benefits requirements such as training, education, orientation, and other welfare-to-work activities. As a result, case workers can identify these situations faster and discontinue benefits to avoid overpayments.
Recurring services – In discussion with the focus group at the Community Transition Center (CTC), participants indicated that knowledge integration would help serve customers better. This can be attained with better-quality data on customer medical and service histories. Customers going through the benefits system a second time may currently be referred to the same program at the same institution. A real-time, accurate understanding of the customer’s history would enable the case worker to avoid redundant case plans and referrals. Services that were not effective before could be replaced with alternate options.
Improved Appeals Litigation – Having access to better data means that case workers can better document the rationale for discontinuing a customer’s benefits. Customers can appeal the discontinuation of benefits through the appeals process; better documentation, however, will enable the organization to track and stop overpayments made to customers.
Improved Productivity – Staff at both the CTC and FRC indicated that a significant portion of their work hours is spent transferring information from paper forms to an electronic format. Data integration would likely reduce this activity, reduce the occurrence of errors and overpayments, and reduce the staff hours spent rectifying previous errors. Overall, the HHSA will enjoy increased productivity and improved service workflow.
It is important to note that this is not an exhaustive list of direct cost savings that can be achieved.
Indirect Cost Savings Development
The second of the three sub-objectives is to identify opportunities for indirect cost savings, with a primary focus on phase 1 of the KIP program. The following list was developed with the ROI Virtualization model in mind. Indirect cost savings, or intangible benefits, are sometimes difficult or impossible to assign a monetary value to. The limitation of intangible benefits is their subjectivity: they are sometimes based on assessments or opinions rather than verifiable evidence or a financial bottom line. Through the analysis of the information provided by primary and secondary research, the following opportunities for indirect cost savings were identified:
Reduced visits to sites per service – Customers who use multiple services are currently required to go to multiple service centers to enroll in them. With the roll-out of KIP, these service centers would, in theory, be able to offer multiple services. A customer would no longer need to talk to several case workers; they would only need to see one.
Reduced forms and assessments performed at each service site – One of the recurring improvements suggested during our discussions is the reduction of paperwork, in reference to customers and the number of forms they must fill out in order to receive services. Because the data currently sit in silos, customers who receive multiple services must fill out a separate form or undergo a separate assessment at each visit. In some cases, contractors for mental health are required to perform a separate evaluation/assessment to ensure that a correct service plan is in place. Reducing these assessments would be beneficial from both an efficiency standpoint and the customer’s point of view.
CTC – Days in transition at the Lighthouse – This is based on a gain of efficiency through the KIP implementation; the whole process currently takes approximately seven days for individuals to move through the transition center at the Lighthouse facility. Through the process improvements made by KIP, a decrease in days in transition would be possible (see the Lean Six Sigma section below). It is understood that the system implementation by itself would not generate cost savings until there is a change in methodology or process.
CTC – Days between relapse – This is based on a gain of efficiency through the KIP implementation; it is possible to track individuals and the time between their relapses. Through the process improvements made by KIP, an increase in the days between relapses would be possible. With easier access to health background and information, case workers can identify services that would be more successful in assisting a customer. This would also assist in monitoring service effectiveness for the Community Resource Directory (CRD).
CTC – Days at rehabilitation – This is based on a gain of efficiency through the KIP implementation; this metric would also help in monitoring service effectiveness for the Community Resource Directory (CRD).
FRC – Days to review and approve a benefits application – This is based on a gain of efficiency through the KIP implementation; under California law, the program must resolve cases within a 30-day deadline. This metric can be used to monitor the ROI efficiency of the system.
FRC – Number of additional referrals the FRC makes to other services on the CRD – This metric is currently documented for several systems such as Cal Fresh, CalWORKs, and Medi-Cal. However, the system currently cannot indicate how many individuals use multiple services in the county.
FRC – Effectiveness of FRC case services – A review of customer complaints and surveys can indicate how the organization is viewed and how effectively the system (KIP in this case) performs.
Case number and capacity – Cases handled at the CTC and FRC in the previous year compared to the year since KIP implementation started. This would compare the number of cases served pre- and post-implementation.
It is important to note that while this is a general list of indirect cost savings that could be obtained, it is not exhaustive. Our research also offers a general approach to developing the ROI model with the direct and indirect areas of cost savings:
1. Agree on the ROI assumptions, cost savings, and calculation approach.
2. Develop a process for data collection.
3. Develop and maintain historical records of the organization’s costs and returns.
4. Develop and implement a process for systematic application of the ROI metric in the KIP process.
5. Ensure that the costs of setting up the calculation and data gathering are included in project costs – typically 3–5% of the project.
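Step 5 can be made concrete: the cost of setting up the ROI calculation and data gathering is itself a project cost. A brief sketch, with a hypothetical function name and figures (the 3–5% range is the only number taken from the guideline above):

```python
def project_cost_with_roi_setup(base_cost, setup_rate=0.04):
    """Add ROI calculation/data-gathering setup costs (typically 3-5%
    of the project) to the base project cost."""
    if not 0.03 <= setup_rate <= 0.05:
        raise ValueError("setup rate is expected to fall in the 3-5% range")
    return base_cost * (1 + setup_rate)

# Hypothetical $1,000,000 project with a 4% ROI-setup allowance:
total = project_cost_with_roi_setup(1_000_000, 0.04)
print(f"Total project cost: ${total:,.0f}")  # Total project cost: $1,040,000
```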
Lean Six Sigma Concept
Lean Six Sigma is a managerial concept that purges eight classifications of waste, or “DOWNTIME”: Defects, Overproduction, Waiting, Non-Utilized Talent, Transportation, Inventory, Motion, and Extra-Processing. The result is the delivery of goods and services at a rate of 3.4 defects per million opportunities (DPMO). The idea was first presented in the book Lean Six Sigma: Combining Six Sigma with Lean Speed by Michael George in 2002. Lean Six Sigma uses DMAIC phases similar to those of Six Sigma, combining Lean’s waste-elimination projects with Six Sigma’s focus on critical-to-quality characteristics. This overall approach is applicable to the HHSA organization as a whole.
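The 3.4 DPMO benchmark mentioned above is simple to compute from counts of defects and opportunities. The example numbers below are hypothetical and only illustrate the arithmetic:

```python
def dpmo(defects, units, opportunities_per_unit):
    """Defects per million opportunities."""
    total_opportunities = units * opportunities_per_unit
    return defects / total_opportunities * 1_000_000

# Hypothetical example: 12 data-entry errors found across 2,000 case
# files, each file offering 10 opportunities for error.
print(dpmo(12, 2_000, 10))  # 600.0, i.e. 600 DPMO
```

A process at Six Sigma performance would drive this figure down to 3.4 DPMO.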
Analysis of the AB109 Service Flow Process with Lean Six Sigma Tools
The team was tasked with taking a Lean Six Sigma approach to process optimization of service lines within the HHSA. We chose to look at the service flow for the AB109 population, as this was one of the initial systems to be integrated within the first phase of KIP implementation (see Appendix S for the complete Lean Six Sigma analysis of the AB109 Service Flow Process). By applying the tools below, the team identified a potential reduction from 78 work hours to 36.5 work hours. The following Six Sigma tools should be used to evaluate HHSA’s process optimization:
Value Stream Mapping – a lean manufacturing technique used to analyze and design the flow of materials and information required to bring a product or service to a consumer. This was applied to the AB109 Service Flow map (see Figure 2 below). The core approach consists of five steps:
1. Identify the target product, product family, or service (in this case, the AB109 Service Flow).
2. Draw while on the shop floor a current state value stream map, which shows the current
steps, delays, and information flows required to deliver the target product or service.
3. Assess the current state value stream map in terms of creating flow by eliminating waste.
4. Draw a future state value stream map.
5. Work toward the future state condition.
[Figure 2 shows the current-state flow: Start → Assessment (prior to release, Probation screens each person’s case to assess the level of supervision, establish assignment, find the status of EOP and CCCMS, notify the RN, and notify the BHST team) → Transportation (the person exits jail/prison and is transported to the CTC) → Probation Orientation → Criminogenic Needs Assessment (COMPAS) → BHST Screening (referral to the nurse case manager / MASU if needed) → within an MDT model a COMPAS case plan is developed, followed by immediate referral to services and transport out → End.]
Figure 2 Current State of AB109 Service Flow
Benchmarking – Defined as “the process of comparing one's business processes and
performance metrics to industry bests or best practices from other industries. Dimensions
typically measured are quality, time and cost. In the process of best practice benchmarking,
management identifies the best firms in their industry, or in another industry where similar
processes exist, and compares the results and processes of those studied (the "targets") to one's
own results and processes. In this way, they learn how well the targets perform and, more
importantly, the business processes that explain why these firms are successful.” A great deal of information can be collected by reviewing and understanding what other counties in the State of California are doing and by replicating best practices. Benchmarking can be used in several areas:
Process Benchmarking, Performance Benchmarking, Project Benchmarking, and on a certain
level, Strategic Benchmarking.
Current and Future State – This is done in conjunction with Value Stream Mapping. Continuing the Value Stream Mapping example, possible wastes are identified at several areas of the process in Figure 3 below. KIP could reduce the review time for individuals coming out of prison if their medical/historical records were easily accessible. Additionally, KIP may enable the case worker to see the current occupancy of referred services rather than waiting for a reply by phone.
[Figure 3 shows the future-state flow with lead times and improvement opportunities annotated. The steps mirror Figure 2 – Assessment (2–3 day lead time), Transportation to the CTC (2–3 day lead time), Probation Orientation and Criminogenic Needs Assessment (COMPAS) (1 day), BHST Screening (2–3 hours), and MDT case planning, referral, and transport out (2–3 hours) – for a total of 4–5 days. Annotated improvement opportunities: possible reduction of time spent on historical review, possible reduction of time spent on data entry, and possible reduction of time waiting for return calls from referral services. Supporting documents noted on the map include a checklist, questionnaires, and a questionnaire/decision-support document.]
Figure 3 Future State of AB109 Service Flow
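The potential gain cited earlier in this section (78 work hours reduced to 36.5) can be expressed as a percentage reduction, which is the kind of metric a current-state/future-state comparison yields. The function name is our own; the two work-hour figures come from the AB109 Service Flow analysis (Appendix S):

```python
def percent_reduction(current, future):
    """Percentage reduction from the current state to the future state."""
    return (current - future) / current * 100

# Work-hour figures from the AB109 Service Flow analysis:
current_hours = 78.0
future_hours = 36.5
print(f"{percent_reduction(current_hours, future_hours):.1f}% reduction")
# 53.2% reduction
```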
Lean Six Sigma – Implementation / Limitations / Required Structure
One cannot simply start implementation without understanding the concept of Lean Six Sigma. In addition, a model must be developed and agreed upon for the organization in question, which in this case is the HHSA. The American Society for Quality (ASQ) defines two tools for Six Sigma: DMAIC and DMADV. The Six Sigma tool most often applied within a simple performance-improvement model is known as Define-Measure-Analyze-Improve-Control, or DMAIC, which is used when a project’s goal can be accomplished by improving an existing product, process, or service. Given this, it makes sense to take a DMAIC approach for the HHSA. The outcomes of the Define phase can be stated as follows:
1. A clear statement of the intended improvement (project Charter)
2. A high-level map of the processes (SIPOC)
3. An understanding of the project’s link to corporate strategy (CTQ)
4. A list of what is important to the customer.
This is the crucial phase of defining the project. Development of the overall Lean Six Sigma model starts
here, and can be used to spearhead visionary improvements through KIP’s capability.
With this in mind, there are a number of limitations to Lean Six Sigma. The first is resources: to ensure that projects can be completed on time, adequate resources will be needed to implement the project charter. This is where prioritization becomes valuable for organizations that are resource-strapped. A number of very good ideas were mentioned in the meetings; however, some may be simple to implement while others are more complicated and would require additional resources to complete. Ensuring a full-time equivalent (FTE) for project management would be a good starting point for managing projects effectively.
The second limitation is cost. It is important that any ROI model include salary costs not only for the employees on the project but also for those implementing the change. Training is also a significant cost – Lean Six Sigma training can range from roughly $1,300 for basic courses to as much as $30,000 for advanced courses.
Six Sigma Return On Investment (SSROI)
A strategy recently explored in industry combines ROI and Six Sigma: the Six Sigma Return on Investment, or SSROI. SSROI is a practice for increasing ROI by applying Six Sigma best practices. Six Sigma for ROI addresses current limitations of Return on Investment (ROI) measurement, Six Sigma, and the Project Management Office (PMO). As noted earlier, both the ROI and Six Sigma models have limitations. Six Sigma for ROI creates PMO accountability by allowing project managers and stakeholders to allocate resources to projects appropriately. This is a critical element in successfully establishing the ROI of the KIP implementation. It is important to develop a process for determining which projects are the best candidates for Lean Six Sigma implementation, and critical to have someone with an in-depth understanding of Lean Six Sigma provide insight for the KIP project both as a whole and as smaller, more feasible projects. These projects should be prioritized based on agreed acceptance criteria. It can be argued that inefficient selection would lead to failure at the KIP level as a whole, as knowledge gained from successful projects
should be integrated. During the interview process, it was suggested that the resources available to allocate to the KIP program are limited; in this case, it is essential not only that the right resources be allocated but also that the right projects be selected.
Recommendations
First, we offer a list of related tools that are not appropriate for HHSA’s specific needs, and therefore
should not be explored for implementation:
Takt time – Takt time sets the pace of industrial manufacturing lines so that production cycle times can be matched to the customer demand rate. For example, in automobile manufacturing, cars are assembled on a line at a certain cycle time, ideally moving on to the next station within the takt time so as to neither over- nor under-produce. While this tool could be applied to the AB109 Service Flow, it would do little to reduce waste or gain efficiency.
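For reference, takt time is simply available working time divided by customer demand. The figures below are hypothetical and serve only to illustrate why the measure fits paced production lines rather than the AB109 flow:

```python
def takt_time(available_minutes, demand_units):
    """Takt time: available working time per unit of customer demand."""
    return available_minutes / demand_units

# Hypothetical manufacturing example: 450 minutes of line time per
# shift against demand for 90 cars -> one car every 5 minutes.
print(takt_time(450, 90))  # 5.0
```

Because AB109 service steps are not a paced line with uniform demand, a single takt figure would carry little meaning, which is why the tool is set aside here.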
Pull value – Rather than producing to an estimated sales forecast, a pull system enables a plant to manufacture products as the customer demands them. This usually results in positive organizational changes: cycle times shorten, finished inventory is reduced, customer ordering stabilizes, and pricing stabilizes. This is not applicable, since the HHSA is not a mass-production manufacturer.
Perfection – The customer is searching for a value-added product, and pursuing the first four principles of lean thinking allows a firm to move toward what lean practitioners call perfection. For the purposes of the HHSA, perfection is not the ultimate goal; simply stated, in the short term the HHSA would like to improve continuously.
ROI recommendations have been split into short-term and long-term recommendations. In the short term, we recommend that the HHSA:
1. Define the role Lean Six Sigma will play in the KIP implementation. A “model of practice” team has been established; however, how the team will function logistically has not been defined. In working with the HHSA, it is apparent that resources are limited and that project scope creep happens often within meetings and project priorities.
2. Develop a standard approach for Lean Six Sigma. Tools such as project charters will enable project teams (in this case, the model of practice team) to have a tangible goal for each
improvement project they own. It is recommended that the KIP improvement for each phase be broken down into smaller projects to create tangible and feasible goals for the project team. For example, the charter should cover an entire year and outline all smaller projects to be worked on that year. Resources are allocated in the charter, and an approximate timeline for completion is stated. The charter would be reviewed and approved cross-departmentally and would also include an executive sponsor.
3. Project Manager / The model of practice team should collaboratively address three objectives in
the short term while the KIP project is being implemented:
a. Create a one year charter of goals and objectives
b. Ensure goals are tangible and adequately planned from an FTE perspective
c. Develop process for data collection for ROI
For Long term recommendations, the team recommends that the HHSA:
1. Plan to develop metrics and collect data to meet ROI measurement requirements; this includes financial metrics/data collection and process metrics/data collection for the AB109 population
2. Identify improvement areas after data collection through the following:
a. Rate improvement areas based on risk / rewards
b. Develop a decision support system
c. Prioritize improvements based on factors / system
3. Develop baseline costs for 2013 for the AB109 population for future ROI model reference. This would include the cost per customer per service, current operating rates, the rate of customers per hour per social worker/parole officer, and benchmarks of what other counties such as Alameda, San Francisco, and Los Angeles have done.
a. ROI direct cost savings – overpayments, recurring services, improved appeals litigation, and increased productivity
b. ROI indirect cost savings – days in transition, days between relapses, days at rehabilitation, days to review and approve a benefits application, number of referrals to the CRD, effectiveness of FRC case services, and case numbers and capacity
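Baseline metrics of this kind reduce to simple ratios once the underlying data are collected. A sketch with entirely hypothetical 2013 figures (none of these numbers come from HHSA data; the variable names are our own):

```python
# Hypothetical 2013 baseline inputs for the AB109 population.
annual_program_cost = 4_800_000   # total annual program cost, dollars
customers_served = 1_200          # unique customers served in the year
staff_hours_worked = 40_000       # total case-worker hours in the year

# Two of the baseline ratios named above: cost per customer and
# customers served per staff hour.
cost_per_customer = annual_program_cost / customers_served
customers_per_hour = customers_served / staff_hours_worked

print(f"Cost per customer: ${cost_per_customer:,.0f}")  # $4,000
print(f"Customers per staff hour: {customers_per_hour:.3f}")
```

Capturing these ratios for 2013 gives the ROI model a fixed reference point against which post-implementation years can be compared.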
4. Develop and build the current team for the KIP program by training managers and analysts. The HHSA should also leverage current consultants (such as Gartner) and enlist their involvement in
Lean Six Sigma activities. The “model of practice” team, or cross-functional team for KIP implementation, must have front-line staff involved from the planning stages.
5. Develop contingencies around large influential factors. The team has identified a couple of significant events this coming year that will need to be taken into consideration. The roll-out of the Affordable Care Act (Obamacare) will affect the capacity of the Family Resource Centers, which currently support approximately 175,000 cases. With the implementation of the Affordable Care Act, the FRCs will potentially be servicing 400,000 cases. Given this significant change, it will be imperative to consider the external environment. This also applies to the CTC, which will be relocating to a newer facility within the next three months to accommodate higher capacity.
Objective #3: Change Management
Methodology
The final portion of our project, a Change Management assessment, was largely performed to support
the transition to a new service-delivery approach. As the primary basis for our methodology, we utilized
the ADKAR model of Awareness, Desire, Knowledge, Ability and Reinforcement, to assess the internal
climate amongst various groups of HHSA staff (Hiatt, 2006). This model, detailed in Appendix H, outlines
the natural linear relationship of each of these factors when an organization experiences change. The
ADKAR framework identifies internal and external determinants of implementing change, and asserts
that successful change cannot be realized without the five building blocks (Hiatt, 2006). The current stage of KIP development lends itself to study of the first two building blocks, stakeholder Awareness and Desire. Thus, our research scope centered on internal stakeholders of the LIHP and AB109 populations
(see Appendix B for Stakeholder Analysis) and our research was designed to assess staff awareness of
the need for change and the desire to support and participate in change. Our objective was to capture
responses from interviews, focus groups, and surveys to translate those results into an effective internal
communication strategy. This strategy was supported by the following two-fold approach, beginning
with secondary research and ending with primary data collection:
Gathered significant secondary research on Change Management best practices, as well as
ancillary IT resources available to promote internal communication and training and
development. This was accomplished through internet research, industry-specific educational
seminars, electronic database searches/articles, academic books and management journals.
Chose to apply ADKAR change management principles based on their relevance to the County of
S.D. HHSA. Considered various forms of training and troubleshooting support systems, but
ultimately narrowed the secondary research scope to tactical strategies addressing internal
Awareness and Desire (Appendix H).
As we learned from initial client meetings, current employees of the HHSA may be slow to adopt change due to a lack of comfort with new technology, strong ownership of current data, and perceptions of insufficient time to learn new processes. Fortunately, we were offered access to a captive audience
already convening to discuss these future process changes, as well as access to front-line operations
staff. The plan of action for our second approach was to design and distribute surveys, organize focus
groups and semi-structured exercises, and schedule one-on-one interviews, based on the respective
audience. Our objective was to obtain candid feedback on current operational efficiencies and
deficiencies, as well as insight into how the current team perceives HHSA changes and learns new
processes. We found that historical organizational change and past sponsorship follow-through are
critical success factors in any change management strategy. Thus, research to accurately gauge the
current internal climate is a crucial predecessor to implementation. The basis for recommending a
successful change management strategy evolved from the following actions:
Designed survey questions, focus groups, and interview questions to assess the staff’s
awareness of the need for change, awareness of KIP, and the overall desire for change.
Assigned objective identifiers to all research questions (R for ROI, CM for Change Management,
and M for Metrics) to aid organization of responses once received (Appendices K-L). The latter
qualifier was included due to secondary research suggesting tenure and resistance to change
often go hand-in-hand.
After gathering and sorting our results by project objective, we leveraged the survey, interview and
focus group results to identify pain points in the integration process. Finally, the team utilized research
results to recommend a change management strategy congruent with the KIP Phase 1 roll-out and the HHSA’s mission, vision, and values.
Results
Secondary Research Results
From our secondary research of change management best practices and specifically the ADKAR model,
we identified the components of Awareness that must be addressed in any large-scale organizational
initiative. Largely these components relate to the human need to understand “why” (Hiatt, 2006). The
ADKAR model asserts that the organization must build awareness of the following:
What is the nature of the change and how does the change align with the vision for the
organization?
Why is the change being made? What are the risks of not changing?
How will the change impact our organization or our community?
What’s in it for me?
Though these components can and should be addressed through effective communication, many factors
contribute to a person’s recognition of the need for change (see Appendix H). These factors must be weighed when setting a communication strategy. For instance, an audience’s acceptance of an awareness message depends on their view of the organization’s current state. If a person is extremely displeased with the current state, they are most likely aware of the need for change. Conversely, if they are content and comfortable with the status quo, they are unlikely to understand why the organization is sponsoring change or to muster the motivation to support it. To counter these predispositions and build awareness, the ADKAR model provides a tactical framework of activities. The
activities, explicated further in Appendix H, include targeted communications explaining the reasons for
change and risk of averting change, sponsorship at the appropriate levels throughout the organization,
preparing coaches, and offering information readily to all staff.
Desire, the second building block in a change management strategy, represents this motivation for
change. Feedback from HHSA respondents suggests that this building block will be the most crucial to
KIP success. The organization can develop new tools and processes, purchase technology and promote
a new culture within, but without employee buy-in and support, these initiatives are moot. The
challenge the HHSA leadership foresees is generating enough desire to set the change in motion,
meaning more referrals, coordinated care, and shared information. The information exchange system is
a tool, and thus success is contingent upon its usage. To manage the risk of resistance to change, an organization must identify the shared pain points that dampen staff desire. The objective is not to weed out the
minority of reluctant or indifferent staff, but rather to proactively counter these pain points with
appropriate positive communication. Leaders must consider the factors that intrinsically shape employees’ desire to actively participate in change, and leverage those insights to foster desire in their teams. These factors
and tactics are identified in Appendix H. This research shaped our dialogue and survey question design,
analysis, and recommendations for a “Best in Class” change management approach for the HHSA.
Other supporting research provided key insights and shaped our final HHSA recommendations. From the
KIP RFI review, we learned that each of the vendors proposing to spearhead the system integration has
a different approach to change management and/or training support, but the common proposition is to
train internal “Super Users” to lead the charge for staff learning and development. At this juncture in the
RFP review process, no further insight into vendor change management strategy could be obtained.
From case analysis research and development of a KIP/LIHP and AB109 population SWOT analysis,
Stakeholder Analysis, and Trends Analysis (Appendices B, C, and D, respectively), we learned that the
most strategically relevant threats were related to change management. We also took away benchmarks
from two similar county social service agencies, such as the footprint of their formal training strategy2
and benefits enjoyed from integration, such as reduction in benefits overpayments. The latter was a
direct result of increased access to information—case workers were able to identify non-compliant customers (i.e., those not participating in welfare-to-work activities such as orientation, training, etc.) and
discontinue benefits. This and data-driven improvements to appeals litigation combined for over $4.2
million in cost savings in the first year. All change management strategies explored were advocates of
celebrating these successes to drive Awareness, Desire, Knowledge, Ability and Reinforcement (ADKAR).
Interview and Focus Group Results
The team’s efforts collected insights from approximately one hundred HHSA representatives.
Overwhelmingly, interviewees and focus group participants identified that communication and effective
change management will be the critical hinge of KIP’s success. Several shared themes emerged:
Many HHSA leaders expect to see resistance to change, particularly from front line staff,
tenured staff, and independent contractors. Potential resistance will likely take the form of
limited participation in improved referral-making, unless explicitly required
They anticipate front line employees to believe the new system will be ‘extra work’
Timing the communication/raising awareness is key
2 Alameda County SSA: Formal training was initially provided over 5 days to 36 end users. Four members of the integration team received approximately 30 days of training (ROI Case Study: IBM SSIRS).
• Leadership must also be aware of how front-line staff and supervisors are perceiving the change and its sponsorship
• Strong need for a universal dictionary across programs, including definitions of variables in the future information exchange platform
• Expected employee perceptions:
o Will I get in trouble for saying/reporting inaccurate data?
o Will my performance be rated on how well I do with this new initiative? Research participants agreed there will be a need to measure this eventually, but adequate time for onboarding must be allowed
• The amount of time needed for training is significant and will impact organizational units unevenly
The responses pointed to a general consensus: strategic introduction of the new system is a necessity,
including the need to show the value added. Otherwise, an ominous threat looms that “dollars spent
will fizzle out as another legacy” (Executive focus group response, Appendix L). “We need analytics along
with integration to create value.” Like our secondary research advocated, most staff respondents
believed it is imperative to explain why the change is happening and sell the benefits of supporting the
initiative. Equally important is to ensure that all staff are aware of the benefits of the new system and that the
information is presented accurately and uniformly. The overarching benefits begin with the opportunity
to improve outcomes and actually know if you have provided effective service. “Right now no
mechanism exists to evaluate outcomes for those county customers that complete their treatment plan.
How do you know if you are being effective?” said another participant in our Executive focus group. The
current system offers little reward to many staff, in terms of seeing their successes to fruition. We
observed that almost all research participants embodied a customer-centric approach, with a strong
desire to improve the quality of their personal service delivery.
In several focus groups, team members identified the unique challenge that certain HHSA programs will
face due to their heavy reliance on contract services. Contractors and contractually connected programs
outside of the HHSA will have their own procedures and databases. Encouraging desire for change will
be a hefty challenge for this subgroup, as they may be less entrenched in the HHSA vision than staff on
payroll. Compounding the effect, many HHSA contractors are professionals with high levels of autonomy, namely medical providers. Perceived threats to their independent processes and decision-making will likely spark opposition. Thus, this population will require a unique approach to motivating change, particularly given the limited opportunity to enforce performance standards. This challenge will weigh most heavily on programs such as Behavioral Health Services (BHS) with 360+ contracts, Alcohol
and Drug Services (ADS) which also relies heavily on contracted service providers, and the Compliance
Office with responsibility to enforce adherence to contracts. Research participants identified
Contracting Officer's Technical Representatives (COTRs) as another cluster of individuals that directly
engage and oversee contracted services. Undoubtedly these subgroups will feel significant impact from
KIP’s implementation, and as our recommendations will later explain, should not be overlooked.
As mentioned above, establishing a collective dictionary was top-of-mind at every interview and focus
group. Respondents expressed desire for a more universal ontology, because the current organization
permits silos of semantics. In some cases the language is defined by the funder and thus cannot be
changed, but in many cases they find multiple acronyms exist that describe the same entry. Even the
definition of recidivism varies greatly across programs, as the Executive focus group pointed out, which puts the quality and comparability of the data in question. Lack of a standard identifier
consumes time, leaves significant room for error, and acts as a formidable barrier to business analytics.
Research participants unanimously agreed this threat must be confronted in order to support the
change.
From the front line staff focus groups, we received candid feedback that offered valuable insight into
operations and internal culture. At the Community Transition Center, we were instantly immersed in the
fast pace of their work day. Their workload is heavy and unpredictable (i.e. the state notifies the CTC
twenty-four to thirty-six hours in advance of a parolee arriving) but the camaraderie and sense of
teamwork was apparent. When describing the challenges they encounter such as lacking IT resources or
physical space, their tones were jovial and uncomplaining. Some of the key takeaways from the
seventeen person focus group were as follows (See Appendix N for complete notes):
Strong awareness of the need for change:
• Provided many examples of areas for improvement related to access to information. “If an intake coordinator is out of the office, it could take days or weeks to refer a patient.”
• Understood the need to improve the quality of treatment provided, and realized that exchanging information supports improved outcomes. “No data collection is a waste of time—every bit of information is important to planning for our customers’ care.”
• When asked, “Do you have the full complement of staff to do your jobs effectively?” the staff hesitated and Karna prompted, “It’s ok to speak honestly.” They acknowledged that four to five additional team members would improve the quality of their service delivery.
Anticipated challenges with KIP:
• Understanding the system in a timely way, so as not to interfere with workload. “We are BUSY. We don’t have time to waste.”
• Ontology: many acronyms and terms are very specific to their customer base, which will present a challenge system-wide. Karna Lau provided a CDCR acronym cheat sheet (See Appendix P).
Expected Benefits:
• Access to timely information would improve the customer experience and lighten an already heavy workload
• Data-driven decision making. There is a huge opportunity to improve program effectiveness because the CTC is the AB109 population’s first stop: the better the quality of the case plan recommended to the patient, the better the outcomes. More information on patient history means staff can make better quality referrals for rehabilitation. They can look at what treatment plans have and haven’t worked for the customer, and determine whether the customer should be referred back to where they came from (was it effective or not), so that an educated decision can be made on the appropriate case/treatment plan going forward.
• As probation officers, having access to SanWITS and Anasazi patient/customer information would enable them to do their jobs more quickly and more accurately.
Our observations suggest the CTC team is overwhelmingly supportive of a change initiative. Their morale was indicative of a strong desire to improve their individual and organizational effectiveness through KIP’s
successes. Still, it is important to note a potential limitation of these results, as we perceived that many
staff strongly desired other programs to change their current behaviors and level of communication. For
example, the CTC staff expressed that “100% of the information we need could come from SanWITS.”
Though not explicitly stated, there was apparent frustration with certain programs and the staff’s perception of those programs’ support of coordinated care. It will be critical to foster relationships amongst
programs that may have a history of resentment surrounding their current service delivery.
We also received noteworthy feedback related to other components of the ADKAR model. Many
participants spoke about the challenge of understanding new roles, permissions and responsibilities
associated with the change. Knowing how to implement the change is reflective of the “K” in the ADKAR
model, “Knowledge.” For example, the Assistant Deputy Directors raised concerns of staff “putting on
an investigator hat,” and abusing the access to information to perform detective work about customers.
Most participants also recognized that “Ability” will be a result of providing staff with enough time to
develop the required competencies, but also supporting the current staff with adequate resources.
Many felt that front-line staff would be deeply concerned with taking time away from their already
heavy workload. Though outside of the scope of this project, allocating funds to bring on additional
support personnel will require significant research and planning in the near future. Other factors
influencing ability did not seem to be strategically relevant, such as intellectual or physical capabilities.
Responses speaking to “Reinforcement”, the last component of the ADKAR model, were prevalent. The
executive leadership noted the system must be responsive, otherwise frustrations will proliferate and
buy-in will quickly suffer. Appropriate time to learn new processes must be allowed before measuring
and enforcing performance goals. During particularly candid discussions, perhaps the most notable feedback we received concerned the agency’s tendency toward top-down decision making without fully addressing the front line’s perceptions or feedback. Some staff frustrations have emerged with the way
past changes had been managed—primarily because they felt out of the loop. The research participants
reiterated a need to communicate with front-line staff sooner in the change process, and to ensure their
supervisors are supporting and delivering the appropriate messages.
Survey Results
Only the Program SME survey was deployed with questions purposefully designed to inform a change management assessment. The front-line staff survey we designed was unfortunately a missed opportunity to gain candid insights on a broader scale, as the process for Human Resources approval
exceeded our project timeline. From the Program SME survey, however, we gained affirmation of the
common themes we gathered in the preceding focus groups and received new pieces of insight as well.
Right away we noticed in the survey results (Appendix O) that 3 of the 15 respondents completed the
“About You” section of this survey (identifying their program and tenure with the HHSA) but did not
respond to any other survey questions. This is a key finding to note for assessing reliability of the data
(the HHSA must consider that BHS and Housing & Community Development’s viewpoints were not
captured) as well as from a change management perspective. The individuals that did not respond were
from Behavioral Health Services, Housing & Community Development, and CWS, and had 15-25 years of
HHSA tenure, >25 years tenure, and 5-15 years tenure respectively. Though extenuating circumstances
could have occurred that prohibited these associates from completing the survey, it could also
imply a disinterest in participating. These results would be in agreement with several journals we
reviewed in our initial research, which identified a positive correlation between tenure and resistance to
change.
Key results pulled from the survey include the following:
• On nearly every question, members of the same program (exclusively Child Welfare Services (CWS) and Eligibility) had opposite and/or conflicting responses. For example, when asked “How often does your program measure the time it takes, from start to finish, completing and/or performing business processes?” and prompted with four choices (Often, Occasionally, Rarely, Never), one Eligibility respondent replied “Often” while the other replied “Never.”
• Two of ten respondents described internal communication barriers as the most time-consuming processes for themselves and their teams. Specifically, pain points exist in “playing phone tag” with individuals and community partners and in attempting to locate language- and culturally-appropriate service providers. This response surfaced again in the following question, which asked, “Where do you (or does your team) experience the most redundancy in work?”
• Though the question, “Regarding your current IT environment, what applications do you feel are the most user-friendly?” was aimed at assessing levels of satisfaction with current systems (a factor in desire to change), we received feedback that Eligibility’s staff are “using One Note [Microsoft] in creative ways” and “exploring other instant messaging tools.” This response was encouraging for our planned communication strategy, discussed in the following section.
• A foil to the preceding question, question seven solicited feedback on the least user-friendly application. The lack of shared responses suggests that, like the existing silos of data, applications are often program-specific as well. The response that resonated most was the following: “And while not an application, the organization of the S: drive is in great need of revision. The pathways are not intuitive or logical and it takes a long time to finally get to the file one desires. Most of our tracking tools are stored on the s: drive, so that is why it is a significant concern.” This pain point has the potential to be resolved internally, and may be a beneficial housekeeping action item to address before KIP implementation.
• Probation and Public Assistance ranked as the top programs the survey respondents collaborate/communicate with to provide coordinated care.
With regard to KIP initiatives, the respondents were asked to gauge the level of awareness of the need for change within the HHSA. The most common responses were “My team is dissatisfied with current access to
customer information across programs and redundancy (i.e. “Our customers think we are already
integrated - and we should be,” “It’s about time someone listened to me,” etc.)” and “My team
somewhat perceives a need for this change but is primarily consumed with keeping up with operational
workload (i.e. “I wish that we did business differently but I don’t have time to learn new
processes/procedures,” etc.).” The program leaders were also asked what staff concerns they anticipate
with the proposed transformation of service delivery. The primary anticipated concern was an increased workload, closely followed by fears of new technology and lack of awareness about where to
get information. Following this question, an open-ended prompt yielded two unique responses. First, leaders anticipate general feelings of fear, not just of technology, because some programs have staff who are also benefit recipients. Maintaining the privacy of their personal case records is very important to them, and thus KIP may feel threatening to these individuals. Second, another program
leader suggested concern of not knowing the permissions of the customer—what are the front-line staff
permitted to disclose that’s on the customer’s record? These are areas that will require training and
development, to ensure that staff can perform at their pre-KIP levels.
Our research results suggest that staff are largely aware of the need for change in the HHSA. For the most part, we find that staff innately desire the impending change, but have valid concerns about workload
impact and fears of new processes and technology. They also desire improved communication and
information sharing from programs that have been historically challenging to collaborate with. The
following recommendations offer solutions to encourage and build staff desire for change.
Recommendations
Organizations that lay the groundwork for change management and follow-through with sufficient
support enjoy a significant competitive advantage. Some of the critical success factors that determine
the success of an organizational change include: a foundation of change readiness, effective
communication, visible sponsorship, adequate training and ongoing support, and addressing potential
resistance or misinformation. Motivating and monitoring these critical success factors is a significant,
constant challenge, particularly in a bureaucracy where decision processes can be cumbersome and offer
little incentive to recognize continuous change. The County HHSA has identified several organizational
changes necessary to facilitate this process, but we offer supplementary recommendations for
managing the internal change. We recommend the following action plan for building awareness,
creating desire, and planning for the Phase 1 KIP roll-out.
Sponsorship Coalition
First and foremost, it is critical to establish a sponsorship coalition specific to managing the change. We
recommend identifying a large network of supervisors, mid-level managers, executives, and KIP change
management leaders that can fulfill the time commitment this visibility will require. The network should
be broad enough to embody representatives from every program in the HHSA, including those not
included in the initial KIP roll-out. In particular, ensure that the sponsorship coalition includes
representatives that engage frequently with contractors, as this population’s support is critical. We
suggest creating an organizational chart to map the network of change leaders and designate sub-
groups with common program backgrounds. This should be a living, breathing document with flexibility
to add change leaders as they emerge or remove leaders that do not have adequate bandwidth to
support the coalition. In these instances, the sponsorship coalition should make a solid effort to find a
replacement from the relevant program. We also recommend that the KIP team consider adding
resources (promotion to the team, hiring on an additional employee, etc.) to enable your primary
change management leader to dedicate their work hours to this initiative.
Stage One: Initial Change Management Planning
As the KIP team is well aware, much preparation will be required before unveiling the change to all
stakeholders. Action items we recommend at this stage are as follows:
• Educate your sponsorship coalition to be subject matter experts on change management
approaches and KIP’s functionalities. This team will be responsible for relaying the message of
change to the front-line staff, so care must be taken to ensure that message is accurate. The team
should understand the KIP and HHSA vision, what KIP is and is not, and who it will affect in each
phase and in what capacity. They must anticipate misconceptions and prepare appropriate
rebuttals, as well as identify any potential pain points that may emerge in their program. If funding
permits, offering opportunities for Lean Six Sigma training would be to the team’s benefit.
• Establish a sponsorship coalition meeting schedule with the least possible impact on members’ busy schedules, but frequent enough to maintain momentum. Share all relevant KIP progress, challenges/hiccups, and successes with the sponsorship coalition.
• Survey the front-lines on a broad scale. Though we were unable to accomplish this within the time
constraints of this project, we believe that it is pivotal to collect their invaluable insights at this
stage. A comprehensive, accurate assessment of Awareness and Desire could not be obtained from
this research project’s sample size. The results should be analyzed to identify pain points across
regions, programs, and target customers served.
o Questions should not include references to KIP
o Suggest adding additional questions to the un-deployed front line survey (Appendix K): Do
you trust your current IT systems/applications in terms of data integrity/accuracy of data?
This question is in response to the recurring mention of potential employee
concerns of lost data integrity/distrust of the new information exchange.
• Determine the complement of staff that must be maintained to support KIP’s vision. Budget for additions to staff and hire appropriate support staff to compensate for time spent on cross-training, additional leadership meetings, reporting to prepare for the KIP roll-out, and changes to job roles. New
positions that will offer crucial KIP support include a Business Analytics associate and several Data
Custodians to maintain current, accurate information in the exchange (contact information,
addresses, etc.) and address unmatched data. Our research suggested that in any matched data
system there is a small percentage of data that must be sorted manually. These positions would be
responsible for managing the manual data sort.
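The manual-sort workflow the Data Custodian role would support can be illustrated with a short sketch. The code below is a hypothetical example, not a KIP specification: the thresholds, field names, and index layout are our assumptions. It shows the typical routing in a matched-data system: exact identifier matches link automatically, close-but-ambiguous matches go to a manual review queue, and everything else becomes a new record.

```python
from difflib import SequenceMatcher

def match_customer(record, master_index, auto_threshold=0.95, review_threshold=0.80):
    """Match an incoming record against a master customer index.

    Returns ("matched", id), ("review", candidates), or ("new", None).
    Field names and thresholds are illustrative assumptions.
    """
    # Deterministic pass: exact match on a stable identifier, if present.
    ssn = record.get("ssn")
    if ssn and ssn in master_index["by_ssn"]:
        return ("matched", master_index["by_ssn"][ssn])

    # Probabilistic pass: fuzzy-compare a name + date-of-birth key.
    key = f"{record['last_name'].lower()} {record['first_name'].lower()} {record['dob']}"
    best_id, best_score = None, 0.0
    for cust_id, master_key in master_index["keys"].items():
        score = SequenceMatcher(None, key, master_key).ratio()
        if score > best_score:
            best_id, best_score = cust_id, score

    if best_score >= auto_threshold:
        return ("matched", best_id)    # confident: link automatically
    if best_score >= review_threshold:
        return ("review", [best_id])   # ambiguous: route to a Data Custodian
    return ("new", None)               # no plausible match: create a new record
```

In practice the review queue would be the Data Custodians’ work list; tuning the two thresholds trades automation against the manual workload mentioned above.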
• General Housekeeping: Charge one to two individuals with responsibility for organizing and supporting documentation development.
o Compile all programs’ referral cheat sheets and coordinate with 2-1-1 to create a master reference guide. The guide should be searchable and include pertinent information such as contact names, a menu of services provided, and eventually notes on each program’s preferred communication channel (i.e. what’s the most effective way to communicate with our team?)
o In a similar fashion, create a universal dictionary that assembles acronyms and key
terminology from each of the programs. Resolve conflicts that exist in shared terms and
charge managers/supervisors with the responsibility to enact and enforce these changes at the program level. When the communication strategy is in effect, offer training sessions to promote the shared ontology (see Appendix Q for flyer example).
o Incorporate Standard Operating Procedures (SOP) development into
management/supervisors goals. Current, thorough definitions of job roles and common
tasks should be outlined pre-KIP, before processes and responsibilities change. SOPs should be individually owned by team members and tested/walked through by other staff for functionality. The suggested time frame for this action item is six to nine months.
o If deemed beneficial, organize the S: drive/SharePoint to prepare for increased use.
Feedback from surveys suggests that current organization is not intuitive or orderly.
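As a simple illustration of what the universal dictionary could look like as a data structure, consider the sketch below. It is hypothetical: the program names and the “Customer Tracking Code” definition are invented for illustration. Each program contributes its acronyms, conflicting definitions are surfaced for reconciliation, and the result is searchable, as the master reference guide described above suggests.

```python
from collections import defaultdict

class MasterDictionary:
    """Illustrative universal dictionary: collects each program's terms,
    flags acronyms that conflict across programs, and supports search."""

    def __init__(self):
        self.terms = defaultdict(dict)   # acronym -> {program: definition}

    def add(self, program, acronym, definition):
        self.terms[acronym.upper()][program] = definition

    def conflicts(self):
        # Acronyms defined differently by two or more programs must be
        # reconciled before the shared ontology can be enforced.
        return {a: defs for a, defs in self.terms.items()
                if len(set(defs.values())) > 1}

    def search(self, text):
        # Case-insensitive lookup across acronyms and their definitions.
        text = text.lower()
        return sorted(a for a, defs in self.terms.items()
                      if text in a.lower()
                      or any(text in d.lower() for d in defs.values()))
```

For example, if Probation registers “CTC” as Community Transition Center while another program uses the same acronym differently (a hypothetical conflict), `conflicts()` surfaces it for the managers charged with enforcing the shared terminology.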
• Encourage staff to leverage internal communication solutions in a greater capacity, particularly Microsoft Lync for instant messaging. Require a short web-based training on Microsoft Lync to raise awareness of its capabilities and benefits (an example of such a presentation is available in Appendix I). Promote connecting with associates from other programs for improved coordination of care. For example, staff at the CTC could send a message to a representative at SanWITS, requesting updated bed availability. Feedback from many programs informed us that “playing phone tag” is one of their most redundant and time-consuming processes, so we suggest this alternative mode of interdepartmental communication. Ensure that the Microsoft Lync presentation addresses any privacy standards that must be upheld.
• Make cross-training and inter-program engagement top priorities. Improving collaboration will offer
“Quick Wins” along the road to full-scale integration.
o Entrust all management and supervisory staff with establishing and implementing a schedule of cross-training with other departments, but offer flexible goals for completion. For most HHSA programs this will be challenging due to the heavy workload, so it should be positioned as a benefit rather than a burden. The cross-training could take many forms depending on the nature of the programs’ operations (social worker ride-alongs, high-level virtual meetings, in-person job shadowing).
• Drum up more excitement about the comprehensive array of services that HHSA programs provide.
Begin to routinely organize open houses and tours of program facilities to celebrate the service they
provide, and offer incentives to encourage staff attendance/participation.
o Begin with AB109 and LIHP
o Host every 3-4 weeks
o Provide snacks, tours, reference guides (“How to communicate with our team”)
Stage Two: Communication Strategy
At Stage Two, the objective is for all stakeholders to have access to the same, comprehensive
information. The focus and challenge at this stage is generating excitement and support from the front-
line staff. We strongly recommend framing the Knowledge Integration Project as a two-part initiative – a
change in the way HHSA programs collaborate and deliver services, and an information exchange system
designed as a tool for the front-line staff. This change will visibly empower them to make decisions,
reduce redundancy and overpayments, recognize successes, and improve outcomes.
This is what drives their motivation. The HHSA front-lines are craving a way to gain feedback on their
efforts to improve the well-being of individuals. We recommend presenting KIP as an opportunity rather
than a request to buy-in to executive sponsorship.
Still, this initiative demands sponsorship, but across all levels of the organization. Approaching the KIP
initiative in this manner will require constant accountability and follow through. Appropriate leadership
must be available to provide the visibility this change requires. The system must also be responsive, the
dashboards must not only be relevant and meaningful but simple to use, and adequate support systems
must be in place. With these support initiatives and a swell of desire, the HHSA will overcome what we foresee as the largest obstacle to KIP’s success: staff buy-in. We propose the following communication strategy for informing the HHSA of the Knowledge Integration Project.
Communication Action Plan:
• Provide all sponsors with scripts for communication with the front-line staff, including responses to common misconceptions and/or concerns. Ensure that the sponsorship coalition practices the KIP unveiling on a subset of those already aware of the initiative, and that a high level of rapport is conveyed in the presentation
• Prepare collateral for staff to review at the KIP unveiling meeting (i.e. a brochure with FAQs, KIP’s 5 functionalities, etc.) to be used for reference at a later date
• Approximately ten weeks prior to KIP’s full-scale roll-out, inform the entire team of the upcoming change. Host a celebration to unveil the Knowledge Integration Project, generate excitement, and inform. Unite as many staff and contractors as possible in the same location for the KIP unveiling, and organize video conferencing in advance to include all who cannot attend in person. All staff and contractors, including those outside of the initial pilot programs, should attend. Ensure the introduction session covers:
o What KIP can do, what it cannot do, and possible capabilities for the future
o Why the change is happening
o Sell the many benefits of this initiative
How it will improve access, quality, and outcomes for the customer
Opportunities
How it will improve their job effectiveness
How it will improve fiscal and operational health for the HHSA
o Identify internal strengths and how this initiative will be supported
o Address what processes will change, what will be measured, where to go for help, and
how to help others with the change process
o Establish credibility with the HHSA audience by acknowledging potential threats to its
success
Facets of KIP that are still unknown
Challenges foreseen
o Communicate long-term goals
o Timeframe
Follow-up
o One day after the event, email all attendees to thank them for their support and solicit
feedback and questions to the sponsorship coalition.
“If you could not attend, your supervisor will brief you.”
o Within two days - Ask supervisors to identify any staff not in attendance
Schedule overview sessions (1:1) between a member of the sponsorship
coalition and any absent parties within one week.
• Create “disruptive” communication to maintain momentum (see Appendix R for examples)
o With the constant influx of information, KIP’s communication strategy must be creative,
attention-grabbing and useful; otherwise it will likely be ignored. Staff are already “used
to ignoring alerts” and bogged down with reporting – “I have 15 reports on my desk
every day.”
• Add front-line staff to the sponsorship coalition based on demonstrated interest and leadership.
• Solicit feedback regularly: charge Supervisors and ADDs with a weekly report out to the KIP team (a ½-page assessment with concerns, ideas, etc.). Create a web-based forum for confidential feedback.
• Celebrate and share KIP’s successes regularly. Quick wins will be imperative to maintaining momentum, given the scope and lengthy timeframe of this project’s implementation.
• Begin to prepare the communication approach for future KIP phases, including the shift from training/onboarding to performance measurement accountability.
Conclusions
In sum, our research ultimately united each of our three KIP objectives (identifying key metrics for a dashboard, identifying direct and indirect cost-saving opportunities, and proposing an effective change management strategy) by revealing a common obstacle in bureaucratic organizations. Bureaucracy
often impedes timely adaptation, so a natural affinity exists to center decision-making at the top of the
organization (Managing Change in Organizations, p. 38). Though this may streamline decision processes, the unintended consequence can be high-level leadership’s detachment from the front line and from real-time data. Faster executive decisions would be data-poor, while the data-driven alternative carries significant costs of its own: expensive, time-consuming data collection from the front lines. More often than not, front-line staff feel
disempowered and unmotivated to support change – the “What’s in it for me?” effect. To counter these
common challenges, we strongly advise that KIP’s decision support structure treat front-line roles as critical determinants of outcomes and drivers of ROI. The dashboard should thus “allow for long-term
strategic decisions to be made at the corporate level, tactical decisions to be made at the point of
engagement, and operational decisions to be made by front-line employees” (Project Management
Institute, 2013). Our research offers the County of San Diego HHSA informed submissions of strategic,
tactical and operational measurements of success; identifies opportunities to streamline decision-making processes and enjoy cost savings; and finally, recommends the front line as the focal point in a
visionary change management strategy.
References
Applied Biosystems. (2003). The Financial Benefits of the MicroSEQ® Microbial Identification System. Foster City: Applied Biosystems.

Bogan, C., & English, M. (1994). Benchmarking for Best Practices. New York: McGraw-Hill.

Breyfogle III, F. (1999). Implementing Six Sigma: Smarter Solutions Using Statistical Methods. New York: John Wiley and Sons.

Breyfogle III, F. (2000). Managing Six Sigma. New York: John Wiley and Sons.

Brown, M. (1996). Keeping Score: Using the Right Metrics to Drive World Class Performance. Quality Resources.

Business & Legal Reports, Inc. (2006). The ROI of EHS: Practical Strategies to Demonstrate the Business Value of Environmental, Health, and Safety Functions. Old Saybrook: Business & Legal Reports, Inc.

City of New York Health & Human Services. (2013, November 5). Case Study: City of New York Health & Human Services - HHS Connect. Retrieved from The Computerworld Honors Program: http://cwhonors.org/viewCaseStudy2010.asp?NominationID=163&Username=cnyhhs

DeSantis, C. (2013). Business Model for Horizontal Integration of Health and Human Services. Washington D.C.: American Public Human Services Association.

Empirix. (2006). Monitoring Contact Center Technology: Examining the Substantial, Positive Financial Impact that Empirix OneSight for Contact Centers Can Have on Your Organization. Case Study Forum.

Farris, P. W. (2010). Marketing Metrics: The Definitive Guide to Measuring Marketing Performance. Upper Saddle River: Pearson Prentice Hall.

Friedman, T., & Smith, M. (2011). Measuring the Business Value of Data Quality. Stamford: Gartner Consulting.

Gee, G., Richardson, W., & Wortman, B. (2000). CQM Primer. Terre Haute: Quality Council of Indiana.

George, M. L. (2002). Lean Six Sigma. New York: McGraw-Hill.

Hiatt, J. M. (2006). ADKAR: A Model for Change in Business, Government and Our Community. Loveland: Prosci.

Hoerl, R. W. (2011). Six Sigma Black Belts: What Do They Need to Know? ASQ Journal of Quality Technology, 391-406.

Johnson, R., & Melicher, R. (1982). Financial Management (5th ed.). Boston: Allyn and Bacon, Inc.

Juran, J. (1999). Juran's Quality Handbook (5th ed.). New York: McGraw-Hill.

Juran, J., & Gryna, F. (1993). Quality Planning and Analysis (3rd ed.). New York: McGraw-Hill.

Kaplan, R., & Norton, D. (1996). The Balanced Scorecard. Harvard Business School Press.

Kerzner, H. (1995). Project Management: A Systems Approach to Planning, Scheduling, and Controlling (5th ed.). New York: Van Nostrand Reinhold.

LEAN Manufacturing. (2013, October 23). Retrieved from Wikipedia: http://en.wikipedia.org/wiki/Lean_manufacturing
Motorola. (2010). The Financial Benefits of Using LANPlanner and BroadbandPlanner. Schaumburg: Motorola White Paper.

O'Kane, B., & White, A. (2013). Establishing Milestones to Optimize MDM Time to Value. Stamford: Gartner Consulting.

Paulson, M. J. (2013). A New Spirit of Service. Healthcare Executive, 76-79.

Polonsky, M. J., & Waller, D. S. (2011). Designing and Managing a Research Project. Los Angeles: Sage.

Project Management Institute. (2013). Managing Change in Organizations: A Practice Guide. Newtown Square: Project Management Institute.

Quantum. (2008). Save Time and Money With Quantum's Integrated Archiving Solution. Case Study Forum.

Return on Investment of Streamlined Data Mining and Analysis: Pipeline Pilot Improves Research Efficiency, Productivity, and Costs, Delivering ROI of up to 7:1. (2008). Case Study Forum.

Richardson, W., Gee, G., & Wortman, B. (2004). CSSBB Primer. Terre Haute: Quality Council of Indiana.

SD County HHSA. (2013). KIP RFI Summary and Analysis. San Diego.

Six Sigma. (2013, October 22). Retrieved from Wikipedia: http://en.wikipedia.org/wiki/Six_Sigma

Six Sigma for ROI. (2013, November 5). Retrieved from Wikipedia: http://en.wikipedia.org/wiki/Six_Sigma_for_ROI

Smale, P. (2013, November 13). Overcoming Organizational Resistance: How Alcon drives pre-commitment to action. Retrieved from CEB Marketing Insights Leadership Council: https://www.mreb.executiveboard.com/Members/Events/Abstract.aspx?

TruScan. (2008). From the Loading Dock to Quality Stock: Three Scenarios for Reducing Incoming Inspection Costs. Case Study Forum.
Appendices
Appendix A: Letter of Engagement
October 11, 2013

Ms. Adrienne Perry
Senior Project Manager, Knowledge Integration Program
Health and Human Services Agency
1255 Imperial Avenue, Suite 743
San Diego, CA 92101

Dear Ms. Perry,

First and foremost, we would like to thank you and the San Diego County Health and Human Services Agency for your engagement with San Diego State University's MBA program. We welcome the challenge charged to us and look forward to creating value for both your organization and our community. After a thorough assessment of the SDCHHSA's mission, vision and values, we have detailed our understanding of the Agency's core business challenge and the scope of the project at hand. This letter elucidates our fundamental objectives, our methodology, and our strategic deliverables.
Understanding of the San Diego County HHSA Situation
Since the assimilation of multiple independent organizations into a single Agency in the late 1990s, the San Diego County HHSA has offered a comprehensive array of health and social services to the community. The primary goal of this integration was to streamline the delivery of the many health and social services offered, to more efficiently and effectively serve the residents of the sixth-largest county in the United States. However, the observed result has been collaboration rather than true integration, as the operational independence of these programs discourages continuity of customer/patient care. Continued disjointedness among these many programs would dilute the Agency's mission to "make people's lives healthier, safer, and self-sufficient by delivering essential services" in a cost-effective and outcome-driven way.

We understand that the lack of an integrated data management system spanning HHSA programs has led to operational inefficiencies and threatened the organization's continued effectiveness. Though the HHSA produces and retains significant data about the customers it serves, the data collected lacks consistency across the enterprise, which limits interpretation and thus usefulness. As it stands now, the HHSA cannot answer even the most basic questions about its client base, such as "How many unique individuals do we currently serve?" or "How many of our clients use more than one of our services?"

The SD County HHSA has identified the need for an integrated, streamlined service delivery system consistent with its core values: Integrity, Accountability, Innovation, Quality and Results. The Knowledge Integration Project, a central information exchange computer system, aims to set enterprise-wide standards and common metrics, and to improve the quality of the organization's service delivery. At this stage, the County has solicited proposals from potential information exchange vendors for a
three-phase implementation. Vendors' proposals will soon be evaluated on their congruence with the Agency's stated goals, the organization's responsibilities to stakeholders, and cost. The project assigned to the SDSU MBA student team is to identify a framework for aggregating the key metrics currently living in independent data silos, in preparation for the information exchange roll-out. These integrated metrics should attest to the community's overall health at a tactical level, and will aid in the design and creation of the ideal dashboard for the agency's leadership team. As a base for our analysis, the scope has been narrowed to ten pilot data source systems. Peripheral goals include developing a model to quantify the value/ROI of the Knowledge Integration Project, as well as assessing the internal response to the forthcoming changes.
San Diego County HHSA -- Project Objectives
To achieve these goals, three primary objectives will be met. In order of priority, these objectives are:
1.) Identifying data each department/program produces, where it is stored, who manages data retention, and how/with whom the data is shared. This information will serve as a base for cross-functional analysis, to ensure that each program’s key metrics and customer profiles will be captured by the newly integrated system. The planned deliverable is identifying comprehensive metrics that reflect the community’s overall health at a tactical level.
2.) Estimating the Phase 1-3 return on investment through an analysis of projected cost savings from reduced replication of services and/or administrative functions and decreasing the number of interfaces. Business analytics such as tracking workload inefficiencies will also be considered when forecasting the project’s ROI.
3.) Assessing the organization’s current climate as it relates to change management, and developing high-level recommendations for assimilating the HHSA team into the information exchange system with minimal pushback.
Methodology/Strategic Approach
Fortunately, there is extensive historical research that can be used as a resource when designing our data collection models, including reports discussing the most effective forms of research for specific environments and objectives (e.g., observation vs. surveys). We also have significant resources available that will aid in data collection. The HHSA can provide a range of metrics, from broad public health figures to data on workload times for applicable positions and their tasks. They also have extended access to an applied informatics fellow who is a subject matter expert on key county health metrics and the overall health of our community. He will also serve as a portal to information that may be inaccessible to our team due to privacy restrictions. This addresses the two most significant challenges we have identified: ensuring that we are compliant with all HIPAA regulations and other privacy laws when handling health and human services data, and ensuring that we have access to enough relevant data to produce accurate and relevant research/analytics. Access to financial information may also be a challenge, so we will address this early in order to pursue project goals such as the ROI analysis.
Objective #1:
To identify what data is retained, where, how, and with whom it is shared, we will dive deeply into the current data that the HHSA has on its clients through primary and secondary research. We have gathered, and continue to gather, significant key information about the ten data systems, such as identifying respective facilities for observation, their stakeholders and data custodians, current systems utilized at each of those programs, patient mix, financial health/role within the county budget, and key metrics and goals. It is critical to determine the organization-wide metrics that are currently tracked to avoid duplication of efforts. We plan to review the publicly available Knowledge Integration Project RFP to gain an enhanced sense of the agency's "must-have" list and overarching goals. We will design our research model based on the IT resources available and the best practices we have observed in our research and past experiences. Our primary research will entail in-depth interviews with data custodians and executive leadership. We will ask probing questions such as, "If you had access to other programs' data, what metrics would you measure? What would you do differently from a process approach?" or "If you had a dashboard, what metrics would you want to see?" Our objective is to "bring focus" to a wide array of data spread out over many systems and located in many different silos. Once we have identified the key metrics to measure, we will use statistical software such as XLMiner or R to identify trends and correlations within the data. Fortunately, we are able to leverage the SDSU MBA student team's strong background in data analytics, Six Sigma and organizational excellence to create singular and cross-departmental reports. These reports will aim to identify and measure the key quality standards defined, as well as drill down into duplication of services, redundancy in forms, etc.
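As a minimal sketch of the kind of cross-silo analysis described above: the snippet below answers the two basic client-base questions from the engagement letter. The program names, client IDs and rosters are invented for illustration only; a real analysis would run on de-identified IDs exported from each of the ten pilot data source systems.

```python
from collections import Counter

# Hypothetical rosters of de-identified client IDs, one set per program silo.
medical_clients = {"C001", "C002", "C003", "C004"}
housing_clients = {"C003", "C004", "C005"}
food_assistance_clients = {"C004", "C006"}

rosters = [medical_clients, housing_clients, food_assistance_clients]

# "How many unique individuals do we currently serve?"
unique_clients = set().union(*rosters)

# "How many of our clients use more than one of our services?"
counts = Counter(cid for roster in rosters for cid in roster)
multi_service = {cid for cid, n in counts.items() if n >= 2}

print(f"Unique clients served: {len(unique_clients)}")
print(f"Clients using 2+ services: {len(multi_service)} "
      f"({len(multi_service) / len(unique_clients):.0%})")
```

Once client IDs can be matched across silos, the same set arithmetic yields the cross-service overlap metric directly, without any statistical software.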
Our team will utilize Excel’s data cube as well as other applications for quantitative research. Throughout this process of primary research, many of our action points will be dependent on availability and scheduling with key members of HHSA. A preliminary process will consist of the following action items in order:
Review and discuss the current available material provided by our primary source of information and contact, Carrie Hoff, Assistant Deputy Director. These documents contain important information regarding the systems, data, and metrics currently in use. There are census-based metrics such as education level, quality of life, life expectancy, income, security, physical environment, built environment, vulnerable populations and community involvement. This is a great starting point for the group to understand preliminary information that is important to both the HHSA and its key stakeholders. Understanding these documents will drive discussion regarding what integrated metrics are possible, what metrics will be easily understood, and what metrics will be most useful for the future integrated data source. The top 5-10 key metrics suggested by our team will be starter metrics for the HHSA and its progression toward data-driven management.
Strategically choose the sequence of interview scheduling. In order to fully understand and make appropriate suggestions to the HHSA, we must first connect with many key leaders. These individuals include:
o Adrienne Perry – Carrie's Senior Project Manager, who is familiar and comfortable with the data.
o Gartner Consulting – HHSA’s IT consulting firm. They have first-hand knowledge of the data silos and data they contain. Our key contact:
Hannes Shydecker – Works day-to-day with project management and has insight into the current state of the system and organization
o Executives and Upper Management – These metrics will be viewed most often by these users to help them quickly understand the progress and success of the integrated data systems project. The top-priority executive is:
Nick Macchione – Director. He will be very valuable in helping us further understand the HHSA and the value these metrics will bring to the team.
o Data Custodians – Employees who use the data on a daily basis. These are the individuals that will be the main point of contact for any specific data analysis, if needed.
Directly contact and schedule interviews with key individuals. From past documents, meetings, and discussions, the group will have a primary set of questions to generate discussion with the individuals who either access or will access the data frequently. These key individuals include, but are not limited to, the HHSA Director, COO, CFO, HHSA GITM, Chief Probation Officer, PSG GITM, CIO, HCD Director, CSG GITM, Executive Team, Assistant Deputy Directors, Data Threading Group, Privacy Task Team, Data Management Task Team, Referral Task Team, and KIP Team. The methods for meeting with these key individuals and teams will include one-on-one interviews, focus groups, and surveys where deemed appropriate by Carrie Hoff. The group will also conduct several on-site interviews with the operations staff of several departments, including the Family Resource Center, Public Health Center, Community Transition Center, Housing and Community Development, and the MASU Unit.
Two to four group members will meet with each of the key individuals who are able to schedule a meeting. Although it would be beneficial for every group member to attend every interview, this is not realistic given the time constraints within which we need to complete all action items. We believe that having at least two members at every interview is enough for a high-level, direct discussion.
As a group, discuss all possible metrics and decide on the top 10-20. Some examples of possible metrics include:
o What percentage of adults use two or more services?
o What is the difference in wait time for clients from the last month to this month?
o What is the average number of days between client visits within one clinic? Two clinics?
o What percentage of clients return?
o Percentage change in length of time for new customer data entry.
o Percentage of referred clients, and a breakdown of the clients most likely to refer friends, family, etc.
o Employee turnover percentage
o And other operational metrics
We need to come up with hands-on operational metrics that would be linked with the already established census-based metrics.
Review metrics for accuracy in present and future usage. Although these metrics will initially be shown primarily to upper management and executives to help track progress, the HHSA is a public entity and will eventually share these metrics with the public. We must make sure that the metrics comply with all security regulations; they must be aggregated, integrated metrics rather than individual-level data.
Suggest to Carrie and her team the five to ten metrics that we believe will be the most useful and effective to post on the integrated data dashboard.
Set final five to ten starter metrics after verbal approval from the HHSA team. Identify the relationships between the individual programs, the independent data source systems, and the integrated metrics.
Objective #2:
For the Return on Investment modeling portion of our project, we will show the project's ROI throughout several phases of project completion, with flexibility as more departments are brought online. With this model we will look not only at quantitative measures, such as comparing past expenses to current expenses to identify cost savings, but we will also identify increased efficiencies due to improved business processes and improved overall health. This particular piece may prove to be the most difficult part of the project, as San Diego County is a leader in integrating computer systems within Health and Human Services, and will serve as a model for other counties nationwide.
We will review literature such as industry-specific current events and articles, management journals, and case studies related to similar initiatives/models at the county-level, such as the study on Alameda County’s implementation of IBM’s Integrated Reporting system.
Ultimately, we will use these models for guidance, but will create one that is unique to our project, as these preexisting models do not take into account the unique aspects of the San Diego County project. To create a reliable model, an accurate measure of the cost of the data system will have to be calculated, as well as a measure of the cost of the current interfaces. Though the RFP states that the solution vendor will propose a price inclusive of staff training and leadership development, the cost of staff time to learn new processes will also be considered. To calculate cost savings, an estimate of future time spent doing the same tasks with the new system will have to be generated. A preliminary process will consist of the following action items in order:
Gather further secondary research on ROI and benefits analysis. This will be accomplished through internet research, industry-specific educational seminars, electronic database searches/articles, academic books and management journals.
Evaluate best ROI Models across various industries and their relevance to the SD County HHSA. Determine if Gartner or other consulting firms have conducted financial analysis related to the system integration, and if so, consider incorporating their key findings in our proposed model. Identify tangible and intangible areas to consider in the development of a “Best in Class” ROI model.
Gather cost estimates from key financial analysts for countywide programs. Data will be taken from 2012 summaries relevant to calculating the proposed ROI categories. The rationale is to drill down to cost per beneficiary per service.
Compile the most relevant information collected into the final report for future use in 2013 financial reporting and future ROI calculations.
Recommend an ROI model that includes a template for financial ROI calculation and categories for consideration in the ROI model.
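To illustrate the kind of template we have in mind, the sketch below computes a phase-by-phase and program-level ROI from costs and annual savings. Every dollar figure and the five-year horizon are hypothetical placeholders, not HHSA or vendor estimates; the real template would substitute the cost and savings estimates gathered in the steps above.

```python
# Hypothetical multi-phase ROI template; all figures below are placeholders.
phases = {
    "Phase 1": {"cost": 500_000, "annual_savings": 150_000},
    "Phase 2": {"cost": 300_000, "annual_savings": 120_000},
    "Phase 3": {"cost": 200_000, "annual_savings": 110_000},
}
horizon_years = 5  # evaluation window; also an assumption

# ROI = (total benefit - cost) / cost, computed per phase
for name, p in phases.items():
    benefit = p["annual_savings"] * horizon_years
    roi = (benefit - p["cost"]) / p["cost"]
    print(f"{name}: ROI over {horizon_years} years = {roi:.0%}")

# Program-level ROI across all three phases
total_cost = sum(p["cost"] for p in phases.values())
total_benefit = horizon_years * sum(p["annual_savings"] for p in phases.values())
print(f"Program-level ROI = {(total_benefit - total_cost) / total_cost:.0%}")
```

Structuring the template per phase keeps the flexibility noted above: as more departments are brought online, each new phase is simply another entry with its own cost and savings estimates.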
Objective #3: The final portion of our project, related to Change Management, will largely be performed to smooth the transition from the old system to the new. To create a smooth transition, it will be important to emphasize training, client benefits, and most importantly, the employee benefits of using the new system. This effort will largely stress buy-in rather than changed policies. Our objective is to create an action plan outlining effective communication and training strategies for current team members. We will take a two-fold approach to this objective; the first part is described in the following action items:
Gather further secondary research on Change Management best practices, as well as ancillary IT resources available to promote internal communication and training and development. This will be accomplished through internet research, industry-specific educational seminars, electronic database searches/articles, academic books and management journals.
Evaluate best change management strategies and their relevance to the SD County HHSA. Consider various forms of training and development and troubleshooting support systems, and select initiatives that are most congruent with the agency’s strategic vision.
Gather cost estimates/quotes from key IT providers for solutions to complement the core Knowledge Integration Program, if those solutions are not provided by the selected vendor. Examples include applications such as Microsoft Lync for interdepartmental communication and internal knowledge sharing related to processes and procedures.
Compile the most relevant information collected into the final report, for future reference, discussion, or solutions purchasing.
Recommend a strategy for how to create buzz (and how not to), and how to prepare the team for success. Recommend methods to clearly communicate the privacy policies/standards that must be adhered to and enforced.
Current employees of HHSA may be slow to adopt change due to a lack of technological comfort, as well as a strong ownership of current data. Fortunately, we have access to a captive audience that is already convening to discuss these future process changes, as well as access to front-line operations staff. Our plan of action for our second approach is to distribute surveys and/or organize focus groups, depending on the audience. We hope to obtain candid feedback on current operational efficiencies and deficiencies, as well as insight into how the current team learns new processes. Understanding the current climate of the on-boarding and training process will provide direction when planning a change management strategy and our planned methodology is as follows:
Schedule an on-site visit to a Family Resource Center and coordinate introductions for our attendance. Gather email addresses for members of the Data Threading Group, Privacy Task Team, Data Management Task Team and Referral Task Team to distribute surveys.
Design survey questions and focus group interview questions to obtain the following information:
o What is the internal climate amongst the program silos—is it collaborative or is there a sense of competition amongst programs for limited funds?
o Which departments/programs currently communicate regularly and for what purpose?
o Number of staff members directly entering/touching customer data
o Current on-boarding procedures and training opportunities
o Current access/process for IT support
o Concerns for privacy/security of Information when sharing data across programs/departments
o Staff comfort level with current systems and perception of the teams’ ability to learn new systems
o Effectiveness or ineffectiveness of current IT systems—what applications are the most user-friendly, efficient, etc. and which are the least
o Thoughts on an inter-office communication system—perceived benefits, drawbacks
o Workload assessed with current staff – what areas are fully staffed and which are not
o What metrics are key to your level of the organization, and what metrics do your front-line teams measure/track? Are any of these metrics tied to funding or associated with penalties?
o How have past changes been managed? How much change is going on right now?
o Are you familiar with HHSA’s consulting group Gartner and have you had any communication with the consulting firm?
o What is the group’s background? Identify and take note of leaders within the group if applicable.
o From a program level, what vision/goals are most critical? What is the HHSA’s vision organization-wide?
o What is the perceived need for this change among employees and managers?
Send the survey electronically. Collect the surveys within one week and begin reviewing before on-site visits, in case new questions or topics of discussion are identified in the survey feedback.
Visit the Family Resource Center and conduct one-on-one interviews and a focus group. Promote free discussion after asking direct questions.
Create charts with data and analyze feedback received. Develop a risk assessment and identify where resistance might be expected.
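As a simple illustration of how this analysis step could flag likely resistance, the sketch below tallies mean comfort scores per team from Likert-scale survey responses. The team names, scores, and the risk threshold are invented placeholders, not actual HHSA survey data.

```python
# Hypothetical Likert-scale responses (1 = very uncomfortable with new
# systems, 5 = very comfortable); all values below are invented placeholders.
responses = {
    "Family Resource Center":      [2, 3, 2, 4, 1],
    "Public Health Center":        [4, 5, 3, 4],
    "Housing & Community Dev":     [3, 2, 2, 3, 2, 1],
}

RISK_THRESHOLD = 3.0  # a mean below this suggests extra training/support

for team, scores in responses.items():
    mean = sum(scores) / len(scores)
    flag = "HIGH RISK" if mean < RISK_THRESHOLD else "ok"
    print(f"{team}: mean comfort {mean:.2f} ({flag})")
```

Teams flagged as high risk would then be prioritized in the change management strategy, e.g. with earlier Super User involvement or additional hands-on training.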
Combining the knowledge gained from the literature review in the first approach, evaluate the best change management strategies and their relevance to the SD County HHSA. Provide recommendations for a change management strategy corresponding with the KIP roll-out.
Each of the vendors proposing to spearhead the system integration will have a different approach to training and support, but the eventual goal is to train internal “Super Users” to lead the charge for staff learning and development. The County HHSA has identified organizational changes necessary to facilitate this process, but we will offer supplementary recommendations for managing the internal change.
Project Milestones
The objectives stated above will be completed and delivered by December 5, 2013. We will utilize Microsoft Project to plan key project dates and milestones, as well as to delegate responsibilities. We plan to spend face-to-face time with our client at their various offices and facilities, and to hold regular conference calls weekly or as needed. We anticipate a close working relationship with our client due to the vast amount of data available, which will help ensure that we access only the relevant metrics.
[Project schedule Gantt chart]
Resource Key: H=Haley, J=Jeff, A=Andrea, D=David
Appendix B: Stakeholder Analysis
Key: Dark red circles represent the most strategically relevant stakeholders from the LIHP and AB109 perspective.

Figure 4 SDCHHSA (LIHP & AB109 Stakeholders)

Stakeholders surrounding the Knowledge Integration Project (LIHP & AB109 populations):

o SDCHHSA Customers
o Clinical/Non-Clinical Staff
o State Prisons & Local Jails
o Executive Team
o Local Health Systems
o KIP Team
o Community Members (poor, uninsured & vulnerable)
o Medicare and Medicaid
o HHSA Contractors
o Siloed HHSA Programs
o State and Federal Government
Appendix C: SWOT Analysis
Table 3: SWOT Analysis
Strengths Weaknesses
1. Dedicated and compassionate staff
2. Strong KIP leadership team
3. HHSA supports innovation and realizes the need for a customer-centric approach – HHSA is striving for a "no wrong door" approach
4. Executive leadership is united in their perception that a change management strategy is imperative to the success of KIP
5. Ongoing strategic planning
6. Metrics driven organization
7. Current data governance offers a high-level of security and privacy
1. Lack of communication between siloed programs results in redundancy of procedures/processes, customer frustrations and overworked staff.
2. Ineffective data management/information system – prohibits knowledge of recidivism and creates general inaccessibility of timely information
3. Tight revenue and a narrow payer mix due to uncompensated care and low remuneration from Medicaid, resulting in a lack of resources for investments in capital projects (IT upgrades, medical equipment, renovations)
4. Customers often experience long wait times to obtain appointment or to be seen by provider
5. Not flexible/able to adapt quickly due to bureaucratic oversight and lack of internal unity
6. Limited communication structure or streamlined administration across all programs creates discontinuity for the customer’s treatment/support plan
7. General inability to measure outcomes - Lack of shared analytics or tracked referrals
Opportunity Threats
1. Improve continuity of care for poor and uninsured through enhanced referral system – improved access and quality for the county customer
2. Improve efficiency and effectiveness through creation of administrative foundation and upgrade of hardware/software
3. Increased financial rewards (grants, ACA incentives) and/or invitations to participate in future pilot programs, due to first mover advantages
4. Partner with other community leaders to increase awareness and improve data-driven decision making
5. Increase ability to compete in the local healthcare market through technology acquisition
6. Measuring outcomes would likely offer positive reinforcement/morale boosts to staff
1. Lack of employee/contractor buy-in or adherence to culture-shift
2. Lack of business analytics to turn data into a usable tool
3. Surge in referrals may undermine the HHSA’s ability to provide adequate and timely service
4. Unstable government conditions – sequestration, local political scandals, legislative action, and other factors (economic, political, etc.) that decrease tax revenue and thus limit SDCHHSA's revenue streams – may result in inadequate support for KIP (i.e. needed additions to staffing levels, training investments, incentives)
5. Staff fear of new technology will present a barrier to change
6. Customer fears of sharing information will inhibit data gathering
7. Decreasing reimbursement - The HHSA already receives low reimbursement rates from Medicare and has high levels of uncompensated care.
8. Increasing costs of supplying health care; health care inflation
Appendix D: Trends Analysis
High Impact, Low Probability of Continuing
o None identified
Low Impact, Low Probability of Continuing
o None identified
Low Impact, High Probability of Continuing
o None identified
High Impact High Probability of Continuing
o Increasing mandates from the Federal and State level (e.g., electronic medical record use, health exchanges). Possibility/necessity for more regional or local entities to offer similar solutions.
o Decreasing reimbursement from payers and continued financial strain on providers/systems - County commission budget pressures
o Investment in cost-effective, state-of-the art technology becomes a core competency/critical success factor in health services delivery
o Growth in older adult population – currently approximately 20% of the population. Many have fixed incomes and use Medicaid, which is likely to result in increased provision of low-remuneration care/services
o Disproportionate Medicaid spending on aged, blind, and disabled (26% of recipients absorbing 65% of expenditures) continues to increase, largely due to the aging population and rise in chronic conditions
o More physicians entering into managed care agreements/aligning with larger health systems
o Collaboration between hospitals and other HCOs increases power of negotiation
o State government continues involvement in management of Medicaid and indigent patients
o Industry-wide focus on outpatient services and preventative care is growing
o Increased use of non-physician providers (lower cost, good quality, especially for primary care)
o Increasing adoption of mobile health applications
o Growth in HMOs as both the general population and Medicaid beneficiaries enroll in managed care
o More concentrated competitive environment as partnerships/joint ventures proliferate in the health care industry
Appendix E: Live Well San Diego Top 10 Indicators Dashboard
Table 4 Live Well San Diego Top 10 Indicators Dashboard
Appendix F: HHSA Sample Dashboard
Figure 5 First Drilldown of Dashboard
Figure 6 Second Drilldown of Dashboard
Customer Drilldown – Age and Independence Services

Date       | Total Count | Male Count | Female Count | Caucasian Count | Hispanic Count | Black Count | Asian & Pacific Islander | American Indian | Other
6/10/2013  | 81751 | 35349 | 46402 | 36379 | 23136 | 5641 | 13080 | 327 | 3188
6/17/2013  | 83457 | 43151 | 40306 | 37138 | 23618 | 5759 | 13353 | 334 | 3255
6/24/2013  | 77263 | 35267 | 41996 | 34382 | 21865 | 5331 | 12362 | 309 | 3013
7/1/2013   | 88710 | 39907 | 48803 | 39476 | 25105 | 6121 | 14194 | 355 | 3460
7/8/2013   | 94934 | 51401 | 43533 | 42246 | 26866 | 6550 | 15189 | 380 | 3702
7/15/2013  | 77892 | 28601 | 49291 | 34662 | 22043 | 5375 | 12463 | 312 | 3038
7/22/2013  | 94813 | 47003 | 47810 | 42192 | 26832 | 6542 | 15170 | 379 | 3698
7/29/2013  | 93508 | 48972 | 44536 | 41611 | 26463 | 6452 | 14961 | 374 | 3647
8/5/2013   | 85210 | 35490 | 49720 | 37918 | 24114 | 5879 | 13634 | 341 | 3323
8/12/2013  | 80952 | 31189 | 49763 | 36024 | 22909 | 5586 | 12952 | 324 | 3157
8/19/2013  | 99596 | 55378 | 44218 | 44320 | 28186 | 6872 | 15935 | 398 | 3884
8/26/2013  | 98906 | 57270 | 41636 | 44013 | 27990 | 6825 | 15825 | 396 | 3857
9/2/2013   | 85437 | 42535 | 42902 | 38019 | 24179 | 5895 | 13670 | 342 | 3332
9/9/2013   | 83717 | 34397 | 49320 | 37254 | 23692 | 5776 | 13395 | 335 | 3265
9/16/2013  | 95474 | 54817 | 40657 | 42486 | 27019 | 6588 | 15276 | 382 | 3723
9/23/2013  | 88012 | 44513 | 43499 | 39165 | 24907 | 6073 | 14082 | 352 | 3432
9/30/2013  | 85328 | 42548 | 42780 | 37971 | 24148 | 5888 | 13652 | 341 | 3328
10/7/2013  | 85369 | 37483 | 47886 | 37989 | 24159 | 5890 | 13659 | 341 | 3329
10/14/2013 | 77996 | 34504 | 43492 | 34708 | 22073 | 5382 | 12479 | 312 | 3042
10/21/2013 | 77240 | 32403 | 44837 | 34372 | 21859 | 5330 | 12358 | 309 | 3012
10/28/2013 | 76268 | 30433 | 45835 | 33939 | 21584 | 5262 | 12203 | 305 | 2974
11/4/2013  | 76681 | 29751 | 46930 | 34123 | 21701 | 5291 | 12269 | 307 | 2991
11/11/2013 | 73080 | 23103 | 49977 | 32521 | 20682 | 5043 | 11693 | 292 | 2850
11/18/2013 | 76179 | 33671 | 42508 | 33900 | 21559 | 5256 | 12189 | 305 | 2971
11/25/2013 | 63475 | 18162 | 45313 | 28246 | 17963 | 4380 | 10156 | 254 | 2476

Note: Double-click a metric (on the last slide) to pull up its drilldown; click a specific count to receive a further breakdown.
Figure 7 Third Drilldown of Dashboard
Total Count 49977
User ID First Name Last Name Date of Birth Address City Email Social Security Notes
517498 Jane Bob 7/30/1935 P.O. Box 907, 1089 Nullam Av. San Diego [email protected] 584-31-2436
562428 Erica Smith 3/8/1944 615 Etiam Rd. San Diego [email protected] 787-25-6188
272144 Sharon Burch 4/28/1941 Ap #719-7295 Donec St. San Diego [email protected] 814-99-7851
799036 Eliana Elliott 7/17/1943 Ap #108-1953 Justo Rd. San Diego [email protected] 228-51-2676
250913 Genevieve Woodward 7/27/1937 134-8072 Dolor Rd. San Diego [email protected] 869-34-5393
16444 Wynter Savage 3/4/1942 Ap #138-9811 Fringilla Ave San Diego [email protected] 222-59-3910
338622 Madonna Booker 3/7/1938 609-2568 Semper St. San Diego [email protected] 480-82-6933
181945 Rhona Glenn 10/13/1928 7087 Fusce Rd. San Diego [email protected] 235-15-9689
177425 Shelly Roach 8/13/1940 785-4566 Eget Avenue San Diego [email protected] 192-75-2262
457012 Mari Dean 8/3/1941 809-821 Enim Rd. San Diego [email protected] 608-54-2757
912929 Anjolie Cannon 5/30/1933 202 Maecenas Avenue San Diego [email protected] 608-15-8867
493186 Audra Patton 9/19/1927 6607 Per St. San Diego [email protected] 418-98-6575
816321 Halee Snider 10/11/1939 Ap #787-1273 Dolor Road San Diego [email protected] 362-22-3450
686622 Hilary Bates 5/28/1943 Ap #190-3419 Ac St. San Diego [email protected] 369-17-1291
514211 Hermione Russell 12/9/1936 P.O. Box 974, 8362 Aliquam Avenue San Diego [email protected] 649-83-1093
141510 Cassandra Gay 12/19/1942 717-2301 Magnis Ave San Diego [email protected] 144-82-3574
999837 Bethany Roberson 11/10/1932 151-4542 Lacus Rd. San Diego [email protected] 241-66-1124
657143 Darrel Warner 4/14/1940 Ap #386-8842 Pharetra St. San Diego [email protected] 436-22-2430
63674 Tamekah Juarez 6/2/1939 164-8665 Nulla. St. San Diego [email protected] 481-65-4781
358381 Courtney Cortez 8/26/1944 1052 Cras Avenue San Diego [email protected] 597-90-4580
319721 Kaitlin Chavez 9/3/1927 969-3491 Cras Street San Diego [email protected] 491-23-5437
508750 Alfreda Blair 12/3/1930 1596 Magna. Av. San Diego [email protected] 671-92-3719
212792 Dorothy Sanford 11/3/1941 778-305 Nisi. Rd. San Diego [email protected] 995-57-5028
457872 Jaden Bruce 5/4/1940 961-8797 Sit Av. San Diego [email protected] 522-33-5798
302011 MacKenzie Barnett 3/6/1940 5607 Ut, Street San Diego [email protected] 869-18-8319
Age and Independence Services – 11/20/13 Female Customer Drilldown. Double-click on a specific user to receive a further breakdown.
Figure 8 Fourth Drilldown of Dashboard
Service History

Services Used | Length of Time (in Months) | Start Date | End Date | Specifics
Aging and Independence Services | 22 | Feb-10 | Dec-11 | Senior Mental Health
Public Health Services | 1 | Jan-12 | Feb-12 | Flu Vaccine
Aging and Independence Services | 8 | Nov-12 | Jul-13 | Home Delivered Meals

Customer Information Card

User ID: 686622
First Name: Hilary
Last Name: Bates
Date of Birth: 5/28/1943
Address: Ap #190-3419 Ac St.
City: San Diego
Email: [email protected]
Social Security: 369-17-1292
User Since: 2/1/2010
Notes:
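The four drilldown levels shown above (aggregate weekly counts, demographic breakdowns, a customer list, and an individual service history) can be read as a simple nested data model. The sketch below illustrates that structure in Python; the class and field names (`Customer`, `ServiceRecord`, etc.) are our own illustration, not the schema of any actual HHSA system.

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class ServiceRecord:
    # Drilldown level 4: one row of a customer's service history
    program: str      # e.g. "Aging and Independence Services"
    months: int       # length of time in months
    specifics: str    # e.g. "Home Delivered Meals"

@dataclass
class Customer:
    # Drilldown level 3: one row of the customer list
    user_id: int
    gender: str       # "Male" / "Female"
    ethnicity: str    # "Caucasian", "Hispanic", ...
    history: list = field(default_factory=list)   # ServiceRecord items

def weekly_counts(customers):
    """Drilldown levels 1-2: aggregate counts by gender and ethnicity."""
    return {
        "Total": len(customers),
        "ByGender": Counter(c.gender for c in customers),
        "ByEthnicity": Counter(c.ethnicity for c in customers),
    }

def drill_to_customers(customers, gender):
    """Clicking a gender count returns the matching customer list."""
    return [c for c in customers if c.gender == gender]
```

Each click in the dashboard corresponds to filtering one level deeper: a count from `weekly_counts` leads to `drill_to_customers`, and a selected customer's `history` is the final service-history card.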
Appendix G: ROI Research Schedule
Table 5 Table of Primary Research activities
Number Description
1 Kick-off Meeting with Carrie Hoff (Sponsor) 9/5, 5:00 – 6:50PM
Attendees: Carrie Hoff, Jeff Jahnke, David Wong, Haley Exum, Andrea Loffert
Description: Initial meeting with stakeholders and sponsors – please see Appendix C for details regarding the project objectives and initial project discussions
Location: SDSU
2 Affinity Exercise with Program SME’s 10/14, 3:00 -4:30PM
Attendees: Adrienne Perry, Program SME’s, Jeff Jahnke, David Wong, Haley Exum
Description: Affinity Exercise to gather SME feedback on input to the KIP program, strengths, weaknesses, opportunities
Location: 1255 Imperial Ave Suite 750
3 One on One Interview with Todd Henderson 10/16, 8:30- 9:00AM
Attendees: Haley Exum, Andrea Loffert
Description: One on One Interview with Todd Henderson from Housing and Community Development
Location: 3989 Ruffin Road, San Diego 92123
4 Focus Group Meeting with Assistant Deputy Directors 10/16, 10:00 – 11:00AM
Attendees: Adrienne Perry, Carrie Hoff, Haley Exum, David Wong
Description: Focus Group Meeting to gather KIP information/initiatives from management
Location: 1255 Imperial Ave Suite 750
5 One on One Interview with Mikel Haas 10/17, 8:00 – 8:30AM
Attendees: Haley Exum, Jeff Jahnke
Description: One on One Interview with Mikel Haas
Location: 1600 Pacific Highway Suite 306F, 92101
6 One on One Interview with Adrian Gonzalez 10/17 12:30 – 1:00PM
Attendees: Haley Exum, Jeff Jahnke, David Wong
Description: One on One Interview with Adrian Gonzalez from the Department of Housing
Location: Phone Interview
7 Focus Group with Program SME’s 10/17, 1:00 -2:00PM
Attendees: Adrienne Perry, Program SME’s, David Wong, Haley Exum
Description: Affinity Exercise to gather SME feedback on input to the KIP program, strengths, weaknesses, opportunities
Location: 1255 Imperial Ave Suite 743
8 One on One Interview with Dean Arabatzis 10/17 3:00 – 3:30PM
Attendees: Adrienne Perry, Haley Exum, David Wong
Description: One on One Interview with Dean Arabatzis, Chief Operations Officer for HHSA
Location: CAC, 1600 Pacific Highway, Room 206, San Diego CA 92101
9 One on One Interview with Andy Pease 10/17 3:30 – 4:00PM
Attendees: Adrienne Perry, Haley Exum, David Wong
Description: One on One Interview with Andy Pease, Chief Financial Officer for HHSA
Location: CAC, 1600 Pacific Highway, Room 206, San Diego CA 92101
10 Focus Group with Select Executives 10/21, 3:00 -4:30PM
Attendees: Adrienne Perry, Program Exec and Managers, David Wong, Haley Exum
Description: Focus Group discussion to gather SME feedback on input to the KIP goals, expectations, and potential issues
Location: 1255 Imperial Ave Suite 743
11 One on One Interview with Mack Jenkins 10/22 11:00 – 11:30AM
Attendees: Andrea Loffert, Jeff Jahnke, Haley Exum
Description: One on One Interview with Mack Jenkins
Location: 9444 Balboa Avenue, Ste. 500, San Diego, CA 92123
12 One on One Interview with Kim Hatfield 10/23 11:00 – 11:30AM
Attendees: Andrea Loffert, Jeff Jahnke
Description: One on One Interview with Kim Hatfield (Kim was unavailable at the time; this interview was cancelled.)
Location: 1600 Pacific Highway, Room 201, San Diego CA 92101
13 Focus Group discussion with the CTC/MASU 10/23, 1:30 -3:30PM
Attendees: Adrienne Perry, David Wong, Haley Exum
Description: Focus Group discussion to gather front-line feedback on input to the KIP goals, expectations, and potential issues at the Community Transition Center (CTC)/Medi-Cal Access Support Unit (MASU). This meeting included the facility and process tour at the lighthouse in downtown.
Location: 552 14th St. San Diego, CA
14 One on One Interview with Barbara Jimenez 10/24 1:30 – 2:00PM
Attendees: Jeff Jahnke
Description: One on One Interview with Barbara Jimenez (Barbara was unavailable at the time; this interview was cancelled.)
Location: Phone Interview
15 Focus Group discussion at the PHC and FRC 10/24, 1:00 -3:00PM
Attendees: David Wong, Andrea Loffert, Jeff Jahnke
Description: Focus Group discussion to gather front-line feedback on input to the KIP goals, expectations, and potential issues at the Family Resource Center (FRC). This meeting included the facility tour of the FRC and Public Health Center (PHC).
Location: 5055 Ruffin Road, San Diego, CA 92123
Appendix H: The ADKAR Model of Change Management
Awareness of the need for change
Desire to support and participate in the change
Knowledge of how to change
Ability to implement required skills and behaviors
Reinforcement to sustain the change
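In Hiatt's model these five elements are assessed in sequence, and the first element on which an individual or group scores low is treated as the barrier point where change effort should focus. A minimal sketch of that assessment logic (the 1–5 scale and the threshold of 3 follow common ADKAR practice; the exact scoring scheme is an assumption here):

```python
# The five ADKAR elements, in the order they are assessed
ADKAR_ELEMENTS = ["Awareness", "Desire", "Knowledge", "Ability", "Reinforcement"]

def barrier_point(scores, threshold=3):
    """Return the first ADKAR element scored at or below the threshold
    on a 1-5 scale, or None when no barrier exists."""
    for element in ADKAR_ELEMENTS:
        if scores[element] <= threshold:
            return element
    return None
```

For example, a team that understands why the change is needed but does not want to participate would surface "Desire" as its barrier point.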
Figure 9 Factors influencing Awareness of the need for change. Adapted from ADKAR (p. 45), by J. M. Hiatt, 2006, Loveland, CO: Prosci Research. Copyright 2006. Adapted with permission.
Factors influencing Awareness of the need for change:
- A person’s view of the current state
- Circulation of misinformation or rumors
- How a person perceives problems
- The credibility of the awareness message sender
- Contestability of the reasons for change
Figure 10 Factors influencing Desire for change. Adapted from ADKAR (p. 45), by J. M. Hiatt, 2006, Loveland, CO: Prosci Research. Copyright 2006. Adapted with permission.
Building Awareness
- Develop effective and targeted communications to share the business reasons for change and the risk of not changing.
- Sponsor (lead) the change effectively at the right level in the organization; share why the change is needed and how the change aligns with the overall business direction and vision.
- Enable managers and supervisors to be effective coaches during the change process; prepare them to manage change and help them reinforce awareness messages with their employees.
- Provide employees with ready access to business information.

Creating Desire
- Enable business leaders to effectively sponsor the change; create a coalition of sponsorship at key levels in the organization.
- Equip managers and supervisors to be effective change leaders; enable them to manage resistance.
- Assess the risks associated with the change and design special tactics to address those risks.
- Align incentive and performance management systems to support the change.
Factors influencing Desire for change:
- The nature of the change and “What’s in it for me?”
- Organizational climate and history of change
- Our personal situation
- Intrinsic motivators
Appendix I: Leveraging Current Communication Tools
Microsoft Lync Presentation
Figure 11 Recommended Use for Lync Application
Appendix J: Focus Group Attendance
Table 6: KIP Privacy/Program SME Affinity Exercise Oct 14, 2013
Name | Dept./Region
Boyer, Allison | Eligibility
Colligan, Laura | BHS
Hernandez, Laura | Eligibility
Perry, Adrienne | KIP
Van Lingen, Leah | CWS
Verbun, Melinda | CWS
Borntrager, Robert | Compliance Office
Table 7: KIP Data Governance/Select ADD’s Focus Group Oct 16, 2013
Name | Dept./Region
Bell, Ida | Eligibility
Dyar, Deborah | Compliance Office
Gill, Harold | HCD
Lang, Tabitha | BHS
Lontoc, Elainerose | Probation
Mosey, Roseann | PSG IT
Perry, Adrienne | KIP
Sellers, Mark | AIS
Yaghmaee, Saman | AIS
Yanischeff, Nick | PHS
Table 8: KIP Executive Subset Semi-Structured Focus Group Oct XX, 2013

Name | Dept./Region
Fernandez, Mauricio | KIP Data Mgmt. & Privacy
Hoff, Carrie | KIP
Myers, Roseann | CWS
Bower, Susan | BHS – works w/ 350 contracts and two psychiatric hospitals
Shih, Peter | HCPA – Health Policy/LIHP
Mosey, Roseann | PSG IT
Perry, Adrienne | Probation
Brown-Mercadel, Marie | LIHP
Table 9: Community Transition Center/MASU Site Visit/Focus Group Oct. 23, 2013

Name | Roles/Tenure
Lau, Karna | Supervising Probation Officer and Project Manager, 17 yrs. experience w/ Probation/AB109 population
(name not recorded) | LCSW, BHS, <1 yr. experience
(name not recorded) | Probation Officer (PO), 1 yr.
(name not recorded) | Admin Asst. who performs screening, 2 yrs.
Shih, Peter | HCPA – Health Policy/LIHP
(name not recorded) | Sr. PO, 11 yrs.; working with AB109 population for 1 yr.
Perry, Adrienne | KIP
Nathan | Sr. PO, AB109, 1.5 yrs.; experience in the Prisoner Reentry Program
(name not recorded) | RN, contractor from United’s staffing contract
(name not recorded) | MASU LCSW, responsible for LIHP registration
(name not recorded) | PO, 7 yrs.
(name not recorded) | LCSW
(name not recorded) | MASU LCSW, 8 yrs. working w/ AB109 population, <6 months w/ the CTC
(name not recorded) | Psychiatric Hospital – started 14 yrs. ago in Cal Works; sees patients at the CTC and Hospital interchangeably (including the same patients at both facilities)
(name not recorded) | MASU Social Work Supervisor (interested in KIP leadership role), 1 yr.
(name not recorded) | LMFT, <6 months; contractor from Optum Health

NOTE: 2 Probation Officers, one BHS rep, and one manager were out of office (OOO).
Page 82 of 129
Appendix K: Surveys
* Survey questions were pre-coded by objective identifiers (CM, R, M). These were not visible to respondents.
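The pre-coding noted above is essentially a many-to-many tagging of questions to research objectives, so that responses can later be grouped and analyzed by objective. A minimal sketch of that mapping (the question IDs and code assignments below are illustrative, not the full survey mapping):

```python
# CM = change management, R = ROI/benefit, M = metrics
QUESTION_CODES = {
    "Q3": {"M", "R", "CM"},
    "Q4": {"R", "CM"},
    "Q8": {"M", "R"},
    "Q10": {"CM"},
}

def questions_for(objective):
    """Return the IDs of questions whose responses feed a given objective."""
    return sorted(q for q, codes in QUESTION_CODES.items() if objective in codes)
```

Grouping this way lets each research objective be scored from its own subset of responses without re-reading every question.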
Program SME Survey
Figure 12: Sample Survey Screen
Thank you for taking the time to complete this survey administered by a student team from San Diego State University’s MBA program. We are currently engaged in a research project for the San Diego County HHSA’s Knowledge Integration Program, and your feedback is very important to us. This survey is open now through Friday, October 25. It should take only about 15 minutes of your time. Your answers will be completely anonymous. Thank you again!
About You
1. What program or service do you represent:
a. Aging and Independence Services b. Behavioral Health Services c. Child Welfare Services d. Eligibility e. Executive Office f. Housing and Community Development g. Information Technology h. Probation i. Public Health Services j. Other
2. What is your tenure with the County:
a. < 5 years b. 5 – 15 years c. 15 – 25 years d. > 25 years
Staffing / Customer Profile
3. How often does your program measure the time it takes, from start to finish, to complete and/or perform business processes? (M) (R) (CM) a. Often. b. Occasionally. c. Rarely. d. Never.

Operations
4. What processes are the most time consuming for you and your team? (R) (CM) Open Text.
5. Where do you (or does your team) experience the most redundancy in work (specific forms, procedures, etc.)? (R) (CM) Open Text.
6. Regarding your current IT environment, what applications do you feel are the most user-friendly? (R) (CM) Open Text.
7. Regarding your current IT environment, what applications do you feel are the least user-friendly? (R) (CM) Open Text.
8. What programs do you collaborate/communicate with to provide coordinated care or service? (M) (R) Select all that apply.
Aging & Independence Services
Behavioral Health Services
Child Welfare Services
Housing Assistance
Public Assistance Programs (e.g. Cal Fresh, Cal WORKS, Medi-Cal, General Relief, LIHP)
Public Health Services
Probation (includes AB109)
Other(s) Open Text.
The Knowledge Integration Program (KIP)/Change Management
9. To the best of your knowledge, what is the current level of awareness, within your unit, of the person-centered/coordinated service model that KIP technology will support? (M) (CM) Check all that apply.
My team understands that the changing health services landscape, along with other changes in overall customer expectations and new technologies demands integrated service delivery methods (i.e. “We need to streamline our services in order to provide the level of service that our customers want and expect,” etc.).
My team is dissatisfied with current cross-program access to customer information and with redundancy (i.e. “Our customers think we are already integrated - and we should be,” “It’s about time someone listened to me,” etc.).
Though most of my team is aware of the need for change, some are content with the current situation (i.e. “What’s wrong with the way we do things now?”)
My team somewhat perceives a need for this change but is primarily consumed with keeping up with operational workload. (i.e. “I wish that we did business differently but I don’t have time to learn new processes/procedures,” etc.).
The majority of my team believes in the status quo (i.e. “We’ve done this as long as I’ve been here so why change now; if it’s not broke don’t fix it,” etc.)
None of the above. Please explain: Open Text. 10. When considering the proposed transformation of service delivery associated with KIP, what staff concerns
do you anticipate? (CM) Check all that apply.
Increasing work load (either temporarily, with learning curve, or permanently, with new job expectations – or both)
Inability to perform at their pre-KIP level
Lessened job security
Fears of new technology
Lessening of data quality/integrity, etc.
Misinterpretation of data gathered
General confusion, lack of awareness about where to get more information
I do not foresee any staff concerns with the KIP
Other. Please explain: Open Text.
11. How have past changes been managed and how has your team perceived these initiatives? (CM)
Very well; new initiatives have energized the staff due to high levels of sponsorship accountability, follow through, and honest communication.
Well; our team understands that change/innovation is part of the County’s vision.
Fair; some initiatives have been met with some minor discontent from the staff due to management’s implementation.
Poor; the team does not respond well to changes due to their perception of past change management.
12. What changes are happening presently (not related to KIP)? (CM) Open Text.

Reporting/Metrics
13. Is there an existing report that describes your customer profile? a. No. b. Yes. If yes, please list: Open Text.
14. Do you have any reports that include information from across programs? (R) (M) (CM) Check one.
No.
Yes.
If yes, what is the name of the report? Open Text.
14a. If yes, do you have access to any cross-program reports that indicate how many of your customers use other services? (M) (R) (CM)
No.
Yes.
If yes, what is (are) the name(s) of the report(s)? Open Text.

The SDSU MBA team kindly thanks you for your support and valuable contributions!
Proposed Front Line Survey (Unable to Perform)

Thank you for taking the time to complete this survey administered by a student team from San Diego State University’s MBA program. We are currently engaged in a research project for the San Diego County HHSA’s Knowledge Integration Program, and your feedback is very important to us. This survey is open now through Friday, November 8. It should take only about 10–15 minutes of your time. Your answers will be completely anonymous. Thank you again!

About You
1. What program or service do you represent:
a. Aging and Independence Services b. Behavioral Health Services c. Child Welfare Services d. Eligibility e. Executive Office f. Housing and Community Development g. Information Technology h. Probation i. Public Health Services j. Other
2. What is your tenure with the County:
a. < 5 years b. 5 – 15 years c. 15 – 25 years d. > 25 years
Operations
3. Where do you see the greatest redundancy in daily tasks (specific forms, procedures, etc.)? Open Text.
4. What daily tasks are the most time consuming for you? Open Text.
5. Regarding your current IT system, what applications do you feel are:
a. The MOST user-friendly, efficient, etc.? Open Text.
b. The LEAST user-friendly, efficient, etc.? Open Text.
6. Please describe any current on-boarding procedures and training opportunities at your program: Open Text.
7. What other programs do you collaborate with to provide coordinated care (e.g. through referrals, team decision making, or to perform routine case management)? Select all that apply.
Aging & Independence Services
Behavioral Health Services
Public Assistance Programs (Cal Fresh, Cal WORKS, Medi-Cal, General Relief, LIHP)
Child Welfare Services
Housing Assistance
Public Health Services
Other(s)
Probation (includes AB109)
Adult Juvenile and Institution Services
Other(s): Open Text.
8. Does your program track the outcomes of referrals to other programs? If so, which programs do you communicate with regarding outcomes? Select all that apply.
Aging & Independence Services
Behavioral Health Services
Public Assistance Programs (Cal Fresh, Cal WORKS, Medi-Cal, General Relief, LIHP)
Child Welfare Services
Housing Assistance
Public Health Services
Other(s)
Probation (includes AB109)
Adult Juvenile and Institution Services
Other(s): Open Text.
9. Compared to other programs, do you have any unique modes of service delivery? For example: is contracted work prevalent at your program, are more customers served at your program than at others, do you have access to unique software applications, etc.?
No.
Yes. Please list: Open Text.
10. In an average month, how often do you skip breaks because you have too much work to do?
a. Never b. Sometimes (a couple of breaks a week) c. Often (one break a day) d. Always (every break)
11. How often does your program measure the time it takes, from start to finish, to complete and/or perform business processes?
a. Often. b. Sometimes. c. Never.
Reporting/Metrics
12. Is there an existing report that describes your customer profile? a. No. b. Yes. If yes, please list: Open Text. c. I do not know.
13. Do you have any reports that include information from across programs? Check one.
No.
Yes.
If yes, what is the name of the report? Open Text.
I do not know.
13a. If yes, do you have access to any cross-program reports that indicate how many of your customers use other services?
No.
Yes.
If yes, what is (are) the name(s) of the report(s)? Open Text.
The SDSU MBA team kindly thanks you for your support and valuable contributions!
Data SME Survey

Thank you for taking the time to complete this survey administered by a student team from San Diego State University’s MBA program. We are currently engaged in a research project for the San Diego County HHSA’s Knowledge Integration Program, and your feedback is very important to us. This survey is open now through Friday, October 25. It should take only about 15–20 minutes of your time. Your answers will be completely anonymous. Thank you again!
About You 1. What program or service do you represent:
a. Aging and Independence Services b. Behavioral Health Services c. Child Welfare Services d. Eligibility e. Executive Office f. Housing and Community Development g. Information Technology h. Probation i. Public Health Services j. Other
2. What is your tenure with the County: a. < 5 years b. 5 – 15 years c. 15 – 25 years d. > 25 years
About the HHSA
General/Reporting
1. What key data does your department/program produce/store? Open Text.
2. What reports do you rely on most? Please identify the frequency with which these reports are accessed (i.e. daily, monthly, quarterly, etc.). Open Text.
3. Do you have any reports that span programs? Check one.
No.
Yes. Please describe: Open Text.
4. Do you have access to any cross-program reports that indicate how many of your customers use other services?
No.
Yes. If yes, what is (are) the name(s) of the report(s)? Open Text.
5. Do you have the reports you need to do your job effectively? Check one.
No. Please explain: Open Text.
Somewhat, but I could be more effective with additional resources/access to other program data. Please identify: Open Text.
Yes.
6. Compared to other programs, do you have any unique modes of service delivery? For example: is contracted work prevalent at your program, are more customers served at your program than at others, do you have access to unique software applications, etc.?
No.
Yes. Please list: Open Text.
Metrics
7. What metrics are vital to support your daily/weekly/monthly performance? Open Text.
8. Are any metrics tied to program administration such as funding, overpayments, revenue recovery, etc.? Please identify: Open Text.
9. Consider the following idea when answering the questions below: if you were able to design your own dashboard to measure anything from operational effectiveness to population health in real time, what would it look like?
a. What customer metrics would be accessible, and how often would these data/metrics be refreshed? For example, would you want access to these measures continuously, daily, monthly, annually, etc.? Open Text.
b. What operational metrics would be accessible, and how often would these data/metrics be refreshed? Open Text.
c. What outcome-driven metrics would be accessible, and how often would these data/metrics be refreshed? Open Text.
10. Do you currently have access to any or all of the data you imagine on your dashboard? If so, is it available in a timely fashion and in an easy-to-read/interpret format? Please explain: Open Text.
Appendix L: Interviews & Focus Groups
* Interview/ Focus group questions were pre-coded by objective identifiers (CM, R, M). These were not visible to respondents.
County Executives

One on One Interviews
1. What are your expectations of KIP?
2. If you were able to design your own dashboard, what would it look like? What customer, operations, and outcomes metrics would be accessible? How often would the data be refreshed? (M) (R)

One on One – CFO
1. Which metrics are tied to funding (sanctions, overpayments, revenue recovery)?
2. Of the HHSA systems to be interfaced in KIP Part I, how many are competing for general funds? (M) (R) (CM)
3. Do you track the cost of maintaining each of the data source systems? If yes, which ones, and can you give us this data? Do you have any financial information on the AB109 population? (R)
4. Do you have records of the historical cost to implement the current systems? (R)
5. Where do you expect to see cost savings with the newly integrated system? (R)
6. If you were able to design your own dashboard, what would it look like? What customer, operations, and outcomes metrics would be accessible? How often would the data be refreshed? (M) (R)

ADDs

Affinity Exercise
1. What are your expectations of KIP?* (M) (R) (CM)
2. What do you expect to be the greatest challenges of rolling out the KIP 5 functional capabilities?* (M) (R) (CM)
3. What do you expect to be the greatest benefits of rolling out the KIP 5 functional capabilities?* (M) (R) (CM)
4. What do you see as possible concerns/perceptions of the staff with the changing method of service delivery? Prompts if necessary: (CM)
a. Concern that they will need to be a jack of all trades and a master of none?
b. Perceptions of a heavier workload?
c. Data quality concerns?
d. Fears/concerns of reduced job security with more efficient systems?
e. Any vendor concerns about workload requirements to enable interoperability between their system and the new KIP vendor chosen?
5. Which of your programs will see the most significant operational changes (processes/procedures) with the KIP rollout? (M) (R) (CM) – Separate into message sender/audience so that we know how to treat responses
6. If you were able to design your own dashboard, what would it look like? What customer, operations, and outcomes metrics would be accessible? How often would the data be refreshed? (M) (R)
a. Do you currently have access to any or all of this data? In a timely fashion? In an easy-to-read/interpret format?
7. Is there an existing report that describes your customer profile? If so, could you please send it to us? Any cross-program reports that indicate how many of your customers use other services? (M) (R) (CM)
8. Do you gauge customer awareness of other programs? If so, how? (M) (R) (CM)
9. Do you have a full complement of staff to support operations? (CM)
Affinity Exercise/Focus Group Introductory Presentation (Completed for all focus groups. Not all shown here)
KIP Group

Focus Group
1. Will job roles be affected by this change? If so, how? (R) (CM)
2. What new positions must be created to manage new processes, and what is budgeted for potential staff increases? (R)
3. Do you expect any internal concerns about reduced job security with more efficient systems? Any vendor concerns about workload requirements to enable interoperability between their system and the new KIP vendor chosen? (CM)
4. What key metrics are vital to your level of the organization? Are any of these metrics tied to funding or associated with penalties? (M) (R)
5. What reports do you have, and what reports do you wish you had? Do you have the reports available to do your job effectively? Is there an existing report that describes your customer profile? Any cross-program reports that could indicate how many of your customers use other services? (M) (R) (CM)
6. How will the cost of the integrated system be allocated over time? (R)
7. To whom has KIP been communicated? What information has been communicated? Through what mode of communication? How frequently? (CM)
Executive Team/Subset – Representative of AB109 and LIHP

Focus Group
1. What are your expectations of KIP? (M) (R) (CM)
2. What do you expect to be the greatest challenges of rolling out the KIP 5 functional capabilities? (M) (R) (CM)
3. What do you expect to be the greatest benefits of rolling out the KIP 5 functional capabilities? (M) (R) (CM)
4. Do you expect to see cost savings or cost avoidance through KIP? (R)
5. If you were able to design your own dashboard, what would it look like? What customer, operations, and outcomes metrics would be accessible? How often would the data be refreshed? (M) (R) **
Program Managers
Focus Group
1. What do you expect to be the greatest challenges of rolling out the KIP 5 functional capabilities? (M) (R) (C)
2. What do you expect to be the greatest benefits of rolling out the KIP 5 functional capabilities? (M) (R) (C)
3. What metrics are vital to support your daily/weekly/monthly performance? (M)
a. Now, of these metrics, which do your front-line teams measure/track? Use another color marker to identify them by writing “F.L.” next to the metric. (M)
b. Identify any metrics that are tied to funding or associated with penalties. Label them with an “F” or a “P.” (M) (R)
4. Consider this idea when answering the following questions: if you were able to design your own dashboard to measure anything from operational effectiveness to population health in real time, what would it look like?
a. i. What customer metrics would be accessible? (M) (R)
ii. What operational metrics would be accessible? (M) (R) (C)
iii. What outcome-driven metrics would be accessible? (M) (R) (C)
b. Looking at the metrics you’ve just identified, describe how often these data/metrics would be refreshed. For example, would you want access to these measures continuously, daily, monthly, annually, etc.? (M) (R)
c. Do you currently have access to any or all of the data you imagine on your dashboard? In a timely fashion? In an easy-to-read/interpret format? (M)
Appendix M: Progress Reports
Table 10 “Stoplight” Report 11/1/13
Benefit/ROI
Compiled notes and data from meetings with various executives/frontline employees
David returned to CTC to gain further insight to processes—data re: his visit is being compiled
Revised frontline survey
Waiting for survey results
Compiling notes/data re: David’s additional visits
Awaiting approval for deploying frontline survey, targeting close date of Friday, Nov. 8
Beginning work on Beta Report
Metrics
Compiled notes and data from meetings with various executives/frontline employees
Waiting for survey results
Met with Barbara Jimenez to gain her insight with regards to KIP
Continuing data synthesis
Beginning work on Beta report
Awaiting approval for deploying frontline survey, targeting close date of Friday, Nov. 8
Change Management
Compiled notes and data from meetings with various executives/frontline employees
Waiting for survey results
Met with Barbara Jimenez to gain her insight with regards to KIP
Awaiting survey results to compile full gamut of responses
Awaiting approval for deploying frontline survey, targeting close date of Friday, Nov. 8
Table 11 “Stoplight” Report 11/8/13
Benefit/ROI
Received Survey Results for SMEs and Program leaders
Received notification regarding frontline survey timeline issues
Continuing to dig through data to form recommendations
Metrics
Received Survey Results for SMEs and Program leaders
Received notification regarding frontline survey timeline issues
Continuing to dig through data to form recommendations
Change Management
Received Survey Results for SMEs and Program leaders
Received notification regarding frontline survey timeline issues
Continuing to dig through data to form recommendations
Table 12 “Stoplight” Report 11/15/13
Benefit/ROI
Completing formation of recommendations
Beginning presentation
Awaiting signed LOE
Awaiting guest list for presentation
Metrics
Completing formation of recommendations
Beginning presentation
Awaiting signed LOE
Awaiting guest list for presentation
Change Management
Completing formation of recommendations
Beginning presentation
Awaiting signed LOE
Awaiting guest list for presentation
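The “stoplight” reports above group status items under each workstream. Generating that layout from raw status entries is simple to automate; the sketch below is illustrative only (the green/yellow/red buckets and the sample items are assumptions, not project data):

```python
# Minimal stoplight-report sketch: group status items per workstream.
# The three buckets (done / in progress / blocked) are an assumed mapping.
from collections import OrderedDict

STATUS_ORDER = ["green", "yellow", "red"]  # done / in progress / blocked

def stoplight_report(items):
    """items: iterable of (workstream, status, description) tuples."""
    report = OrderedDict()
    for workstream, status, desc in items:
        if status not in STATUS_ORDER:
            raise ValueError(f"unknown status: {status}")
        # Each workstream gets one bucket list per status color.
        report.setdefault(workstream, {s: [] for s in STATUS_ORDER})
        report[workstream][status].append(desc)
    return report

items = [
    ("Benefit/ROI", "green", "Compiled notes from executive meetings"),
    ("Benefit/ROI", "yellow", "Waiting for survey results"),
    ("Metrics", "red", "Awaiting approval to deploy frontline survey"),
]
report = stoplight_report(items)
```

Each workstream then prints as one block with its green, yellow, and red items, matching the three-section tables above.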
Appendix N: Meeting Notes
Assistant Deputy Directors (ADDs) Meeting – October 16, 2013
Expectations
Sharing Information systems
Info on other Cash benefits
Access o Mobile Access o Data Access o One stop portal for customers
Improving ability to enhance data statistics
Quality, reaching out and helping more people
Able to reach across systems
Greatest Challenges
Concerns around data Integrity o Data duplication o Timeliness- to enter into system as well as review
Security o Permissions across programs o Keeping info in a secure way
Resourcing aspects o Don’t have the same operating systems o Don’t have training or access to all systems
Shared lingo o Acronyms
Enough information to be useful?
Client Info o Some clients hesitant to give some info
Greatest Benefits
Improved safety for workers in office and field
Improvement in way programs or services are offered
Reduce wait times
More info available immediately
Better referrals process
Better Coordination of services
Decrease in fraud and abuse
Reduce cost to deliver service
Improve grant writing, knowing clients better may bring more opportunities
Increase restitution fees collected from offender
Clients will not have to keep repeating answers
Conceptions with Change Management/Service Delivery
Additional work o Learning about other programs
Correct equipment
User friendly system
Maintaining confidentiality of data
Adrienne explaining phase 1
Employees might turn on investigator hats
Acronym use
Resistance from some to learning a new system
Phase 1
Look-up, search, and query function
Referral management piece o Cold referral, no circling back
4 source systems in phase 1: o Off Med o Cal Win o PCMS (Probation) o SanWITS
“Not sure with this question- What kind of info is even available in all 4 of those systems?” –Mark Sellers
Probation o Training Probation officers on what and how o Clients have many health issues, will give clients access
AIS o May impact process for AIS
HCD o Being able to search housing records
PCMS o Being able to access other systems
All programs will be impacted; although only 4 are being integrated with phase one, all programs will have potential for accessibility
Dashboard
Collecting Social Security Numbers, waiving fees and check insurance information o Some staff won’t collect SS# o Stats based off individual employees as well as program wide
Correlations
Real time data o Have data, but don’t have resources to analyze and publicize it o Could help allocate staff resources o Staff numbers could correlate to data
Top 25 users of the most programs o People who are most common to all programs and who consume the most resources
Way to identify people who are in critical need, i.e., someone is about to become homeless; a way to step in and provide service before problems arise
Common clients o Can we link them up into one big successful program?
Immediate access to client record to see more demographic information on the person
Gap analysis (PROBATION) o Opportunities o Certain percentage of pop have mental health issues
Cost savings/improved benefits to clients o Measure how fast they’re getting services o Quality of life improvements after services
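The “top 25 users” idea above (clients common to the most programs) is straightforward to compute once enrollment records are linked on a shared client identifier. The sketch below is illustrative; the client IDs and program names are invented placeholders:

```python
# Sketch: rank clients by the number of distinct programs they use.
# Enrollment tuples (client_id, program) are invented examples.
from collections import defaultdict

def top_cross_program_clients(enrollments, n=25):
    """Return up to n (client_id, program_count) pairs, most programs first."""
    programs_by_client = defaultdict(set)
    for client_id, program in enrollments:
        programs_by_client[client_id].add(program)  # duplicates count once
    ranked = sorted(programs_by_client.items(),
                    key=lambda kv: (-len(kv[1]), kv[0]))
    return [(cid, len(progs)) for cid, progs in ranked[:n]]

enrollments = [
    ("C1", "CalWin"), ("C1", "SanWITS"), ("C1", "Probation"),
    ("C2", "CalWin"), ("C2", "CalWin"),
    ("C3", "SanWITS"),
]
print(top_cross_program_clients(enrollments, n=2))  # → [('C1', 3), ('C2', 1)]
```

The same per-client grouping also supports the “identify critical need” bullet: a client appearing in several high-acuity programs at once is a candidate for early intervention.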
Red flags to look for within the integrated system
Metrics already set for departments? o Current systems are already set up to measure program metrics…they are having a tough time thinking of what needs to be in the KIP dashboard
Cross reporting o HHSA & PSG (public safety group) for services to homeless
Unduplicated people
Cost spent on services
KIP Focus Group – October 17, 2013
Will job roles be affected by this change, and if so, how?
On a broad level, this will more so be a shift in the way they are thinking
Will be a learning curve to understand job roles (becoming a care coordinator across programs – can see team members hanging on to “I’m a nurse and this is what I do”)
Will need to complement current staff with individuals offering Data Management/Operational Research skill sets
Business Analytics position and Statistician position were both already created but will need additional data-focused team members to support the quality of data at all times
o Gave the example of the State of VA – created a Master Client Index – 10% could not be matched – need people to match these and the need is ongoing
o This will require technical skill but also time
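The matching problem in the Virginia Master Client Index example above (10% of records could not be matched automatically) is a classic record-linkage task: auto-accept high-confidence pairs, auto-reject clear mismatches, and route the gray zone to a human. The sketch below is an assumption-laden illustration, not the State of VA’s method; the field names and thresholds are invented:

```python
# Sketch of record linkage with a manual-review queue.
# Field names ("name", "dob") and thresholds are illustrative assumptions.
from difflib import SequenceMatcher

def name_similarity(a, b):
    """Crude 0-1 similarity between two normalized names."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def match_decision(rec_a, rec_b, auto=0.92, review=0.75):
    """Return 'match', 'review' (send to a human), or 'no-match'."""
    if rec_a["dob"] != rec_b["dob"]:
        return "no-match"          # hard blocker: dates of birth must agree
    score = name_similarity(rec_a["name"], rec_b["name"])
    if score >= auto:
        return "match"
    return "review" if score >= review else "no-match"

a = {"name": "Jon Smith", "dob": "1980-04-02"}
b = {"name": "John Smith", "dob": "1980-04-02"}
print(match_decision(a, b))  # → match
```

The ongoing staffing need mentioned in the notes corresponds to the “review” bucket: every pair that falls between the two thresholds requires human time.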
Data SMEs’ job roles will change the most drastically
At the admin level, their tasks and goals will change “No longer have to scramble to find figures for Nick, such as how many unduplicated customers do we serve?”
Challenges Anticipated:
Employee perceptions: o Will I get in trouble for saying/reporting inaccurate data? o Will my performance be rated on how well I do with this new initiative?
Will need to measure this eventually but there will need to be adequate time for onboarding
Contractors: “Probably freaking out right now.” BHS has 360+ contracts; ADS is heavy on contractors, as are all IT and medical services
COTR – subgroup that will need specific change management strategies in place to ensure that contracting officers are exhibiting leadership and support of KIP to their teams
KIP will raise a lot of questions and encourage dialogue, and with that may come some alarm/uncomfortable information (identifying poor performance for example). Since there is no access to historical metrics, it will take time to understand the performance level and to have a base line to make comparisons.
Metrics:
Ad hoc projects
Number of applications
Alerts for grants
Outcomes of referrals
“inform to improve”
Where do you expect to see cost savings?
Took 1 month to find # of unduplicated AB109 patients – right there is an opportunity for huge cost savings and access to real-time data (the system will eliminate the need for this research)
Avoidance of the Social Worker that has to drive across the county to access billing codes/patient data and input their data – this is common with many field workers
More mobility – more field teams will be able to get out into the community
To whom has KIP been communicated?
From the Executive Level down to the ADD level
Some stakeholders outside of the county – national and state govt. and the BEACON work group
# of staff to be exposed during KIP – 5,500 HHSA employees, 16,000 county contractors
Kicking off a Model of Practice team to lead the communication strategy
KIP is coming
Address the rumor mill
What modes of communication and/or training are currently employed?
Monthly required training (web based)
Yellow belt training has recently been an initiative for key employees
Train the trainer was the only formal training program, but they were unsure if this is still happening – it used to be for Supervisors to train direct users of the systems
When rolling out KIP it will be important not to skip the supervisor level when instilling the system knowledge
There has not been a vehicle for change management or improving performance
The county has a YouTube page (the county news center is responsible for creating the videos)
Program SME Affinity Exercise – October 14, 2013
Expectations:
Behavioral Health Services – provide Clear Framework for how all information systems will work together. Provide clear communications for rationale. Provide leadership at various levels of organization.
Improve Services and Community
Reach out and Help more people.
Integration fit for operations – identifies key contacts across the systems clients are involved with.
Enhanced reporting-seeing the population as a whole and not just by dept./ Program.
Improve ability to collect structured data.
Establish a one stop portal for customers.
Improve referrals – “No wrong door” concept.
Expectations of KIP – Access o 1. Multiple staff levels o 2. Ease of access o 3. Mobile access o 4. Sustainable info o 5. Mgmt. reporting access o 6. Sup reports different from mgmt. reports.
Expectations of KIP: What programs / (HHSA) Have Probationers received benefits? Were they successful after treatment?
Better Customer experience-not having to repeat same information to multiple depts./programs.
Work together to better serve and help customers by sharing information.
What other cash /benefits from HHSA are Probationers receiving?
Shared information between department/programs for common Clients.
Assure the Public Health Information Systems are supported by KIP.
Develop user guide materials for all PHS Programs.
Probation—Access HHSA referred programs for our offenders.
Probation—Increase ability for direct referrals to HHSA resources
Benefits
Common terminology-between programs/departments.
Sharing data between systems will be a challenge.
Duplicate profiles
Accuracy and timeliness of data – data integrity, confidentiality, employees’ accessibility to data (right to know, need to know)
Having information that is actually usable – need “enough” info in order to be able to guide the client. Integrity of data – effective management of data, protection of P.H.I.
Adequate resources, i.e. computer, broadband speed, security, network capacity – keeping pace with technology changes.
Permissions across programs accessing sensitive client data i.e. STD clients.
Confidentiality/privacy, especially between depts.
Hesitancy of clients to give personal information, i.e. many clients do not want to give their SSN
Customer’s privacy, visibility into other programs enrolled (should they or should they not be receiving certain benefits?)
Challenges-Probation — o 1. Data Dictionary o 2.Training staff on HHSA Programs o 3. Integrating Data into Probation case load management system (PCMS) o 4. Matching records accurately
Capability to access client information across programs and services.
Capability to exchange information in a secure way.
Reduced fraud
Efficiencies in service deliveries.
Improved service delivery to clients-better coordination of services.
May see some room for improvement/change in the way programs/services are delivered based on common clients.
Improved grant writing-knowing clients better may bring about more opportunities for funding.
Improved safety for workers in office and field.
Increase restitution fees collected from offender.
Providing better services in terms of referrals. Seeing the broader picture of a client’s profile (health, psych, criminal, employment etc.)
Reduce wait times
Provide better services
Better access to services, efficiency in processing benefits, reduced fraud and abuse, saving money through reduced cost to deliver service, better data gathering and reporting to better assess community needs.
Program might already know what a customer needs before they even arrive for a first appointment – clients will not have to keep repeating answers to questions.
Improved referral process will save customers from having to run around to different offices to find out what service they need.
Ability to gain access to more information and background on clients.
Improve assessment and quality of services to client.
Knowledge of resources currently being accessed – or those used in the past
Assist in developing a stronger client-centered approach
More info available immediately.
Community awareness
Community resource Directory-only go where referred to
Coordinated care
Accountability to address issues when info is received
Probation – knowing what programs they are frequenting / receiving benefits from
Make the workload more manageable
Speeding up process will improve quality
Consistency and only visit one place
Do what our customers think we should be doing
Ability to quickly know what other agencies are involved.
Quicker access to services by clients.
Quicker access to information
Referrals for appropriate services
Avoid duplication of effort
Streamlined services
Easy, immediate access to info.
Help families in crisis without barriers.
Efficiencies in service delivery.
Timely sharing of information
Help families in crisis w/out barriers
Challenges:
Different operating systems for staff to train on; duplication of info; multiple pieces of info take more time to review/understand.
Concerns about increased work load working with other programs
Concern about adding to their already full workloads.
Staff may appreciate access to new resources.
Learning about other services/departments will be challenging, but rewarding if there is enough info to orient staff about them.
Seen as “another county initiative” without adequate time and resources dedicated to making it a success.
Staff will need to learn more about other programs.
Having the right equipment and training to be able to operate efficiently – having good help-desk support. A system that is intuitive and user-friendly.
Should user be able to access the other data sources/systems?
Employees might put on their “investigator” hats.
Learning different acronyms/ Meaning of the data?
Maintaining the confidentiality of the data.
What would be the work load impact?
Poor data/inaccurate data
Timing of bringing program on board
Defining roles and responsibilities-what should you act on and not
Too many alerts-cause staff to stop reading
Ontology-translation-agreeing on common terms/processes
Non-necessary alerts
Communication-making people aware –all know same info.
Staff buy-in, training, training-staff resources and proper learning of process.
Accurate client match and fear of over-share of sensitive info.
Fear-customers can be fearful to share
Trust in the system
Timely sharing of info
Misc.
AIS-Phase I-May impact successes for our APS and IHSS functions, including Public Authority processes.
Being able to know the meaning of data from other systems with respect to PCMS?
AB109 impacts to additional data available to assist with caseload in PCMS
Training Probation officers on what / how to manage info from other sources (outside PCMS)
HCD – Housing/Section 8 rental assist – Being able to search and view Cal Win records (already have access), but access to case notes might be helpful.
Our offenders have a myriad of health/mental health issues and are low- to no-income. Access to these resources will assist us in referrals.
Stats for collecting SSNs – identify who excels at collecting; waiving fees; some staff are more willing to check insurance.
Public Health correlations in real time – have the data but not necessarily the resources to derive meaning.
Real-time-allocate resources to distribute staff.
Top 25 users “Frequent fliers”
Identify critical need-preventative measure.
Self-sufficiency clients – nice to know who you are working with in other programs and drill down to the individual.
Cost saving to client-how do you measure-speed improvements.
Probation-gap analysis-opportunities identified to improve process.
Alerts-homeless
Decision-making and planning for public health
PSG + HHSA – how many unduplicated?
What would your Ideal Dashboard look like?
Case by case-Associated department involved
Welfare with Domestics violence-pieces
Attendance review Board-Actionable customers-info to front line to intervene
Status of customer/current providers and histories
Pregnant teens-helps
Location/GPS to people of client
Applicants/SNAP applicants supplemental application
18 Kids-kids needing reach out
Actionable tactics from program SME
Alerts/notification-too late /inaccuracies.
Operational metrics
Number of kids that meet state / fed req. – quarterly report
Eligibility – Monthly customer feedback – regional office
Timely processing
Volume of application
Weekly dashboard case speed
Safe measures – child welfare services – monitor performance / case
A lot of “Just in case” data / metrics
Time on Aid – Reports on immediate need – Arco
Probation – Status of Offender
Timely processing of applications – Process to date CALWIN systems report out
Todd Henderson, Housing and Community Development Program Director Interview – October 16, 2013
EXPECTATIONS:
Diff department, not part of first part of KIP
More efficient use of data
Have over 11,000 households helped every month
Common Characteristics between clients and agencies o All are in need, multiple needs, all related, such as low income usually need dental, housing etc.
No cross referrals
OWN DATABASE (legacy system) , only exchange with HUD for tracking and funding purposes o Reason: confidentiality
Similar to HIPAA
Use DMV and post office records for addresses
Be able to more comprehensively serve clients o They only address housing needs, but most need more than just that
Referrals on Case by Case o Client has to ask, then they send them to diff non-profits o VERY manual process
Listed by health and physical repairs
Referring to them: o AIS o Legal Aid o Probation o Public Health o Other Public Housing Agencies
Oceanside National City
o People calling Directly Social Workers Homeless Shelters Churches
Real Benefit will be for current client…can help them see what other services through the county are available
No money from County, funded by FEDs only
Software o ELITE by Emphasys o Oracle
Individual metrics through their scanning system (Documentum) o Rent, Demographics, Re-cert (done every year), Income, household composition, exemptions
DASHBOARD:
FINANCIAL PIECE OF Section 8
Balancing of finances
EX: Spend $8 million on rent, only get $6 million from HUD
Day by day (Usually look at it twice a month)
How much spending on rent (average) – would like to see weekly
How much getting for rent
How many people in housing/can we pull off of the wait list (800+ applications long)
How much do they need to dip into reserves? (HAP)
Would you be interested in seeing how the manual process is improving on the dashboard?
YES…spending so much money, want to see improvements
Not so much time of referrals, but more of accuracy o Did they get what they need? o What happened with it?
DEG and how everyone is doing with their ‘piece of the pie’
Staff would input information
CHANGE MANAGEMENT
Tough to see if staff will be resistant because it is mostly IT thus far
Gave an example of the Documentum scanning initiative to go paperless – those most tenured were against it – he made the decision to put them on a change mgmt. team (proved very effective, with the exception of one employee who is no longer with the department)
Timing the communication/raising awareness is key – for instance, “tomorrow is sequestration day, we have really big items on our plate – tomorrow would not be a good day to promote KIP.”
A group of 6-12 in his department have begun prelim. KIP planning
Todd forecasts a positive reaction because it will be more of a tool for referrals o (AFTER everything is implemented) o He can see employees seeing it as ‘one more thing to do’ while it is being implemented o Focus on selling the positives and freely explain why – people want to know why and feel involved in the decision, or at least the implementation process
OTHER
Staffing issues o Balancing the amount received for given admin fees against payroll – amount of work and what they’re able to pay per role
o Measured by Kronos code, which shows what they have been working on this year… What they look at is whether they are adding value/what outcome they are giving
Internal measurements: DEG (Department Excellence Goals, first 4 or 5 are set by county), OIP (Operational Incentive Plan, for certain classifications similar to a ‘report card’)
Operational Plan, 2- and 5-year financial plans, goals and accomplishments for plan (online at Housing Community Development website)
Need a better relationship with 2-1-1
Every Monday they have a meeting to assess workload/bandwidth
Had an outside consultant assist with workload assessments a few years ago – used for two areas: o Succession planning (identifying leaders) o IT Competence
Adrian Gonzalez, Public Safety Group IT Manager Interview – October 17, 2013
Background: Role is to ensure that IT supports Public Safety’s (PS) mission. Manages the sharing of records/resource services between probation and HHSA
From Probation: Primarily pushing out data but same data may be pulled from SharePoint
Interface on one on one basis – for data sharing but not for analysis or research
No standardized approach to referrals
Between 18k-23k adults and 3-4k juveniles in the system at any given time
KIP Benefits: “Quite a number of benefits”
Standardize conventions – no universal ontology – creating a dictionary will improve organization effectiveness, timeliness of data and data accuracy
o For ex: acronyms representing schools and/or high school names – if entered slightly differently into the system, the records won’t match, which will affect measurements such as compliance with school attendance and who is receiving child support
o Criminal charges – come from the court – it’s critical that these populate accurately and quickly o Currently does not use any form of a cheat sheet – any inconsistencies or data anomalies will be researched/addressed manually by management on a case-by-case basis
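The acronym problem described above (slightly different spellings of the same school breaking record matches) is usually addressed with a shared lookup dictionary applied before matching. The sketch below is illustrative only; the school names and variant spellings are invented, not from the county’s data:

```python
# Sketch: canonicalize free-text school entries via a shared dictionary
# before record matching. All names and variants are invented examples.
CANONICAL = {
    "lincoln hs": "Lincoln High School",
    "lincoln high": "Lincoln High School",
    "lincoln high school": "Lincoln High School",
}

def canonical_school(raw):
    """Normalize a raw entry; return it unchanged if it is unknown."""
    # Lowercase, drop periods, collapse whitespace before the lookup.
    key = " ".join(raw.lower().replace(".", "").split())
    # Unknown entries pass through so they can be flagged for manual review.
    return CANONICAL.get(key, raw)

print(canonical_school("Lincoln  H.S."))  # → Lincoln High School
```

Maintaining this table is exactly the “data dictionary” work the notes call for: each new variant observed in the field gets a row, and every downstream match benefits.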
KIP will inform other programs how to share information with their program
Challenges:
Change Mgmt. – huge opportunity to improve workflow processes and practices; share best practices and take a lean six sigma approach to doing business
But, the reality is in the details o Getting line staff/Supervisors to actually take advantage of this tool even though it will first take greater effort on their part o Redefining the work – they may not have the time to use this tool. Need buy-in that this initiative is an investment with the potential to make the HHSA more efficient.
Implementation costs – the objective is to instill repeatable process improvement, but can we afford it with the current staff?
External dashboard:
Making info available to probationers and their families to track their own progress. A “self-help” portal with mobile applications and web access.
Push reminders when registered – appts., court dates, locations
Responsive web design
Andy Pease Interview – October 17, 2013
EXPECTATIONS:
Does not expect much change for him – doesn’t know what data will come
The 3rd phase is shared analytics – still unsure how that will affect his role/responsibility
Funding is still siloed after integration
Big picture, thinks KIP will help the HHSA avoid making weak referrals and, in the long term, offer preventative care and improve the county customer’s general well-being
Very long term – services are faster and more comprehensive, and with comprehensive services, costs go up, not down. Efficiency looks like less time to make referrals, but to counter, more referrals are happening.
DASHBOARD:
Right now they are performing quarterly time studies to determine how to balance the funds at each program (he has a level of autonomy when allocating what buckets/program to give funding to). Can’t do this real-time but more frequently than quarterly would be very helpful – perhaps bimonthly is a good schedule.
Trend data and outcome-driven data would improve grant writing o Success rate of the employment program o Are we reducing recidivism? o Adult Protective Services – how often are they coming back?
Foster care costs in particular, he would like to look at monthly, along with factors that lead to high foster care costs
o Educational attainment o Success rate of placements (avg. length of time staying w/ a family, number of families lived with during 1 month) o Are we getting kids out of foster care quicker, and fewer returning?
OTHER
Reports monthly and quarterly to the Fed Gov. – numbers related to caseload (workload/number of customers) for Medi-Cal, Cal Fresh, Cal Win and the allocations are decided upon annually. It is supposed to be relative to the estimated growth for the following year, but this is often not the case b/c the govt. can choose to slash budgets at any time
o The quarterly claim describes how much time is spent on a Cal Fresh case (etc.), # of eligible county residents, and a suggested cost allocation plan
Doesn’t know if any ADS or PH measures are tied to funding
Cal Win is the case mgmt. system for Cal Works
% of sales tax goes to MH, % goes to Alcohol and Drug
Wants to see: What has changed in your processes o Need a reunification strategy – What did it look like before? After? o Immediate gains/quick wins: Expand the definition of teams and encourage team decision-making w/ multiple programs involved.
Establishing a base line – a lot of wasted time looking for information
Would like to know, how much time/cost is associated with each of these customers – could be tracked more
Dean Arabatzis Interview – October 17, 2013
EXPECTATIONS:
Challenge is defining what metrics are actually implying o What does poor mean? (not just income – how many in their family, where do they live, what do they do?) – $30k a year means a lot more to a single person than to a person with 5 kids.
o What does healthy mean?
When does it become Big Brother to ask all of these questions?
Staffing capacity vs. allocation must be considered –political perspectives -if you have a productivity boost, that means less workers per unit (more customers at the same time), and you’ve lowered your allocations from the state/fed
DASHBOARD:
Where are our clusters of demographics?
Analyze behaviors to reduce fraud: What are Cal Fresh recipients actually buying? Are they buying healthy food or liquor/junk food?
Heuristic suggested: Planning for overpayments before they happen.
Mike Haas Interview – October 17, 2013
EXPECTATIONS:
Analytics
Not as much at the beneficiary level, more at the business level
Start with the big building blocks
Apply the project, and see which other departments
Peripheral involvement, more with the outsourcing partner
Will maintain systems more than be affected by them
DASHBOARD:
Wouldn’t have a dashboard; that is more for business units
Not sure what agencies need
CHANGE MANAGEMENT
Standards o New position of Data Manager, to help structure all data o Data users might help set a strategy as well as standards for a way to structure data so it is useful
OTHER
Responsibility for Master Contract as well as all IT
Business units manage business apps
His department is just the ‘core’ of those business apps
Some business apps not through HP, some are through a third party
Select Executive Focus Group – October 21, 2013
Expectations:
Ability to share info and data (though some fed regs prohibit referrals, we will be able to manage care for families in multiple programs – not all family members are necessarily all using the same programs and we can’t track that)
Integrity of data is crucial to create health policy
Will see increases in treatment costs b/c more people are seeking help
Continued development of the prevention model
Benefits
USING the data o Right now a lot of data is captured/reported but for what purpose? o “I have 15 reports on my desk daily” o Technology saturation – 80% of the reports have never been looked at
Program & Service Planning
Gap Analysis
Access to other county info – probation (outside from HHSA)
Will allow for a more holistic look at outcomes/performance
Assessing how well they are doing – right now no mechanism exists to evaluate outcomes for those county customers that complete their treatment plan. How do you know if you are being effective? Recidivism, what type of dosage, where are they coming back to (hospital, juvenile detention), are they actually completing treatment?
Blended, more accessible services – comprehensive move towards self-sufficiency within the HHSA program
Combining legal and social services is very beneficial
Outcomes will be improved – right now there is no stability – addicts (meth, alcohol, etc.) will they stick with it if you send them all over the place? Of course not.
Challenges:
What happens after the referral – are they going? Right now we don’t know and thus are not meeting the needs of our customers
Are we going to have the RIGHT data to make decisions
Contractors have their own procedures/databases – will be a major challenge to encourage buy-in
Will need staff support – will need to show value-add so the massive dollars spent do not fizzle out as another legacy system
o Will it be embraced if the organization says you will use this? Depends on if the employees embrace the Why aspect and have the desire to improve
System will have to be responsive
Must ensure we understand what clients and front line teams are perceiving
Do those front line staff making referrals have the right knowledge set to refer to the appropriate level of care
Must keep in mind that every tax dollar spent is getting the best ROI with very scarce resources o We want to reduce fraud (BENEFIT)
Unintended consequences o Enforcing data entry procedures can be a burden to the worker – i.e. CWS (has 700 Social Workers) has to enter data into a federal system; now they have to enter it into a county one too? Time consuming. o Who will manage and be accountable? Who will enforce that procedures are followed, and when will that happen?
o Obligation to use data – encourages lawsuits; need to mitigate risk much more than before when having multiple DBs
Ex: In LA, an LCSW checked only one of the two DBs available in the county system before making a case visit, in which she reported the patient healthy. The other DB described the individual’s mental health as unstable and suicidal. That person committed suicide soon after the LCSW left.
Other:
Need to avoid being data rich but knowledge poor
Seeing 105% capacity in probation and county jails – people are staying in jail for years when it used to be a 2-3 day max – medical costs for those in jail are extremely expensive
CWS has to report out federal indicators which are associated with funding or sanctions
Sanctions exist for timeliness – Marie asked what happens when our team (already serving 25k people per month) gets an increased workload and thus an unintended consequence: sanctions. The two goals are conflicting.
How are costs captured/measured related to the AB109 population: o Anasazi – financials are unique because they are collecting reimbursement from county customers, unlike many other programs (Welfare, Cal Fresh, etc.). Have 3 payment structures and a billing cycle specific to their operations. o Costs are measured related to AB109 health, jail costs, court, probation supervision, treatment – reports are quarterly and allocations are determined annually (which program gets what money annually)
Dashboard:
Something like a red-light/green-light system would be preferable for the majority of my reports – rather than sifting through pages to draw conclusions
Want to look at 10 things – not 82 things – and also sort into short-term objectives, mid-range, and long-term goals.
Desire to have a benchmark – compared to other jurisdictions, how do we know when we are doing things right?
Locate a bed – quickly – saves time and improves outcomes
Customer metrics: o Who of my customers are using other services? o Analyze the major funding sources for services o How much time does it take the customer to get into treatment (2 weeks to wait for admission to BHS is not uncommon) o What % of applications were processed in a timely manner (and drill down to the supervisor responsible for high or low performance). Would want to see this weekly – not enough time to look at this daily and also be meaningful
o Managing diseases – hypertension, diabetes, etc. o Hardest to manage and/or measure: social indicators (regionalized health and how does it apply
to the population I serve/the program serves) o Can we connect with other counties? Relevant because our customers migrate, other counties
could be doing well and we should copy what they are doing – no need to reinvent the wheel. Also, it would provide a benchmark or “report card” to let us know how we are doing versus other counties
o A broad dashboard will help us not to overreact – we hear horror stories and it throws off a huge amount of staff, but how many people are we talking about here? 1 in 10,000? We need to remember the times we are doing it well for the vast majority
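The “red light green light” reporting idea and timeliness metrics described above can, in principle, be reduced to a simple thresholding rule. The sketch below is purely illustrative – every metric name, value, and threshold in it is hypothetical, and it is not County data or an actual KIP design:

```python
# Illustrative sketch only: reducing a page of numbers to at-a-glance
# traffic-light statuses. Metric names and thresholds are hypothetical.

def status(value, green_at, red_at, higher_is_better=True):
    """Map a metric value to a traffic-light status."""
    if not higher_is_better:
        # Flip the sign so one comparison rule covers both directions.
        value, green_at, red_at = -value, -green_at, -red_at
    if value >= green_at:
        return "GREEN"
    if value <= red_at:
        return "RED"
    return "YELLOW"

# Hypothetical weekly metrics:
# (name, value, green threshold, red threshold, higher-is-better?)
metrics = [
    ("Applications processed on time (%)", 92.0, 95.0, 85.0, True),
    ("Avg. days to treatment admission",     9.0,  5.0, 14.0, False),
    ("Beds available",                       12,   10,    2,  True),
]

for name, value, green_at, red_at, hib in metrics:
    print(f"{status(value, green_at, red_at, hib):6}  {name}: {value}")
```

In practice the thresholds would come from program-defined targets and federal standards rather than the hard-coded guesses shown here.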
Page 110 of 129
o PH measures – allow for disaster prep (i.e. a flu epidemic) and prepare to reach vulnerable populations
o HEDIS measures are a vehicle to get there
o Need a placeholder for ad hoc reports – unique reports
o Need analytics along with integration to create value
o Demographics – FPL and where they live
o Metrics to measure prevention – would quantify some of the ROI benefit
o Need a dictionary including a definition of variables in the DB
 Some things are defined by the funder and cannot be changed
 What’s the universal definition of recidivism? Not simple, as it varies greatly across programs – thus, what is the quality and/or comparability of your data?
o Need time to learn the ontology (CM)
o Look at the sequence for service delivery – how many referred > admitted > engaged > improved – then use the data to see hot spots
o Current reports are typically received in Excel (prepopulated)
o Staff compliance is such a challenge because there is already so much on their plate. If you want un-mandated data, you have to instruct the staff to collect it – they won’t organically start collecting it (i.e., to identify the homeless, the FRCs could flag any PO Boxes to potentially give clues to other programs, but they must be asked/told to incorporate this process – what if the system could automatically sort?)
o What are your current benchmarks?
 CWS – operational incentives, DEG, overpayment (afraid of this cost)
 We pick our own targets, but how sophisticated are they really? With integration and business analytics, the hope is to use data-driven decision making to set those target goals
 Great examples of current systems that are striving toward integration:
 UC Berkeley – integrated CWS info at a very high level, with the ability to drill down by county
 DSCH – every 2 years creates a snap report – very extensive, uses state documents to analyze substance abuse, linked to the DMV and schools – BUT the data is overwhelmingly rich
o Contractors are measured on the length of time their patients are in treatment, recidivism, # of customers served, completion rate, and enrollment in education
 It would then be useful to drill down and see: A is doing better than B, but what is the root cause?
 Serving different populations?
 Working conditions/limitations

Community Transition Center (CTC)/MASU Focus Group and Tour – October 23, 2013
Background: The Community Transition Center is a center for all released parolees/probationers (the AB109 population) who were San Diego County residents prior to incarceration. Through a contract with Lighthouse, they handle logistics from the prison the probationer was released from to the CTC site in downtown San Diego. Most probationers are bussed from the state prisons by one of 26 drivers, but some are flown in from more distant prison locations in Northern California. The center typically has 1-3 days’ notice that a prisoner will be released
to their facility. Upon arrival, the CTC team first tests for alcohol or drug use, screens for health/mental health issues, and develops a case plan to assign the appropriate level of care for the individual – typically probation, Mental Health Services (MHS), or addiction treatment. If the individual tests positive for drug use, they are immediately admitted to the on-site detox unit. From there, the team refers the individual to transitional housing or a Residential Treatment Plan (RTP), which is primarily for those with alcohol or drug addictions. The center opened in January of this year as a pilot program and is primarily concerned with assessing the needs of recently released prisoners and providing/referring the appropriate level of care. The center is open 7 days a week, 8am-6pm. AKA the PRIC (Post Release Intake Center) or PRO (Post Release Offender unit).
How many of you have heard of KIP before today? 3 people raised their hands – Karna Lau, Nathan the Senior Probation Officer, and the MASU Social Work Supervisor.
Concerns/Challenges:
 Understanding the system in a timely way, so as not to interfere with workload. “We are BUSY. We don’t have time to waste.”
Data Quality – Will the permissions be accurate for their needs? Will the permissions be based on role or program? *Adrienne clarified that it will be role based.
 Ontology – many acronyms, many very specific to their customer base; system-wide, this will present a challenge. Karna to provide a CDCR acronym cheat sheet.
Community Resource Directory – all referrals come through here, have the directory as a cheat sheet
Benefit:
Availability of info. At this point, for instance, they can pull a customer’s record up and it will show that they had visited Anasazi but there are no case notes related to the visit.
 More info on patient history means they can make better referrals to rehabilitate patients – they can look at what treatment plans have/haven’t worked for the patient, whether they should be referred back to where they came from (was it effective or not), etc., so that an educated decision can be made on the appropriate case/treatment plan going forward. Data-driven decision making. Huge opportunity to improve program effectiveness because they are the AB109 population’s first stop – the better the quality of the case plan recommended to the patient, the better the outcomes
As probation officers, having access to SanWITS and Anasazi patient/customer info would enable them to do their jobs more quickly and more accurately
o Now they have no info from either of these programs – “Anything would be helpful”
o Why is this needed? People have trouble remembering their own history (many in their target population are seeking treatment in many places, and it is difficult to recollect all the details), and many people have mental health issues and do not report information truthfully or accurately. It is imperative to know what mental health issues this patient/customer is dealing with in order to provide the highest quality of care.
Even if they knew this information already, they would still ask the customer where they’ve been/what enrolled in, etc., in order to gauge their cognitive level and honesty
Knowing this info in advance would reduce the detective work currently needed at times and improve the accuracy of the information they are receiving
o Many customers claim to have completed a certain program, but the staff have no quick way to verify the validity of these statements. If they have a case where this is necessary, it must be researched on a person-by-person basis (phone calls, emails, waiting for responses from other programs)
o Would like to know what other programs the customer might already be enrolled in without knowing it (e.g., whether their welfare has reactivated after they’ve left the prison system), or which programs might be abused by the customer. Violators – staff don’t know who the repeat offenders are, so they regularly refer them to the same programs they were abusing
Currently do not get to see the end result of their treatment plan for these individuals – no record of those doing well or not. It would improve the team’s performance if they were able to see the outcomes of their efforts – learning that your customer is doing well would be rewarding, learning where the pain points are in the process would allow the CTC team to reevaluate treatment strategies
Without any hiccups, it takes between 2-4 hours to process a new AB109 customer
 However, if they don’t have access to room availability, the referral site does not answer its phone (a constant issue with the VA, SanWITS, and Anasazi), or the customer’s historical information is not available (e.g., needing info from Public Health to determine whether someone is HIV positive), the customer stays for 2-3 days in one of the 30 beds available on-site.
If they could have certain info on the front end—CWS and PH—it would be beneficial
 One referral at best takes 30 minutes, and anywhere they are referred, they are reassessed
o ASI assessment – Addiction Severity Index (a self-assessment)
o BHS assessment – done at the CTC, BHS, and the alcohol and drug program
“No data collection is a waste of time—every bit of information is important to planning for our customers’ care” “100% of the information we need could come from SanWITS”
 “Our customers get frustrated – they go to so many visits/check-ups, they don’t know who is who”
 The group then told the story of a homeless woman who was 9 months pregnant – no mental health issues, no alcohol or drug use/abuse – a victim of domestic violence with a criminal history on her record. They were unable to refer her to ANY service for several weeks; she stayed at the CTC housing unit upstairs.
To enroll customers in LIHP, the LIHP Social Worker completes their day at the CTC (after filling out forms with the customers), and returns to Kearny Mesa to input the data. The process takes 3 steps/data entry points.
Would be great to have an eligibility worker on-site – but “that would take an act of God”
 They offer group therapy/training classes to assist in the transition back to society
Time consumption:
Entering notes in the system – awaiting AB109 arrival
Client fills out paperwork
Screening – Alcohol and Drug, Behavioral Health
Entering clinical notes
Entering application notes
Reentering clinical notes for Anasazi/wherever they’ll be referred to
Using application notes to create Excel spreadsheets for analysis – report out on 35 outcome measures monthly
BHS rescreening
 Redundant, but with a population with this many mental health issues, rescreening is a necessary evil
Change Management:
 When CalWIN came online, they expected to get access to a lot of information, but the promises as they understood them did not hold true – profile sharing was so limited that it was not as effective as they had hoped.
 PO’s work four 10-hour shifts, and staff get overtime pay. When we asked the team whether they often skip breaks because they have too much work, they laughed and replied: we don’t have a formal break, we eat lunch as we go. But we also set our own schedules – “sometimes I take 15 smoke breaks….and sometimes I take…15 smoke breaks.” Really, though, no one ever has a set schedule for leaving the CTC, because the work does not allow for that.
 What IT/computer applications are the most user-friendly and which aren’t?
o Anasazi is slow, temperamental, and not user-friendly. But having access to any information, even at its currently limited level, is beneficial. “But, how will the new system work? If it’s not easy, then forget it”
o COMPASS is fine – no complaints – not great but not bad
 What language/disability services do you offer?
o Vietnamese, Japanese, French – once called 211 to translate for a deaf individual. Recently had a supposedly “blind” individual for whom they could not provide adequate support (“we could have been sued”); it was later discovered that this customer had been faking blindness for several years in the state prison he came from. The CTC had spent hours researching where to send this individual for specialized support.
 Do you have the full complement of staff to do your jobs effectively?
o The staff hesitated; Karna prompted, “It’s ok to speak honestly,” and they said they would benefit from 2 more BHS reps, 1-2 PO’s, 1 ADPS (Alcohol & Drug Specialist), and one Admin/Records Clerk.
 They asked:
o Will we be able to see everything available system-wide (and is that too much), or alternatively, will we have access to enough info to make a difference?
If they had a dashboard:
1) Who is getting released and when to expect them
2) Status updates for arriving customers (in transit, medical hold, other holds such as disorderly conduct prior to release from the prison)
3) Any medical appointments their customers have for the day
4) From Anasazi and PCMS (don’t know what that is): what the calendar looks like
a. Daily appointments and availability for scheduling new customers
5) Pending court hearings
6) Profile violators
7) BOLOs – Be On the Look Outs (Argus alerts and other automated regional alerts)
8) Number of customers with MSOs – mandatory supervision orders (not all come through) – and the ability to drill down to know which customers and/or data match when the customer profile comes up
9) Bed availability (overwhelming agreement with this suggestion) and availability of treatment appointments/programs – alerts when they are available, to avoid having to call to find out over and over again
a. Would like to know if there is a waiting list for:
i. MH – Anasazi
ii. AOD – SanWITS
b. “If an intake coordinator is out of the office, it could take days or weeks to refer a patient.” In that instance the customer would have to find a place to stay on their own, or, if the CTC had bed space (30 beds available upstairs), they could stay there
c. Current programs have no formal tracking system for bed availability (“a calendar with hash marks”), according to the team
Barbara Jimenez, Deputy Director Interview – October 30, 2013
Background: 6 major regions – 6 regional managers/deputies
 Barbara’s region: not the largest area by size, but the largest because of staff – 1,300 total
 Responsible for FRCs, PH clinics and services, community health promotion; county-wide coordinator for homeless services; two call centers (Eligibility and a new one opened Oct 1 due to the ACA); CWS
 24-25 years’ tenure – front-line delivery and operations
Expectations:
 How many people are they serving? This is something she is interested in finding out
To identify who we serve, how many do we really serve in each of our regions – to break it down even further is invaluable (is access limited, what services need to be offered, where do you need to move your team to, staffing levels)
What are those services we are providing – what’s the level of the service we are providing, what services should we offer
 How do you coordinate all of those services? They work closely with countless partners in the community (outside of HHSA)
 Want to ensure we are not duplicating services
 Unduplicated count
 Would like to know if they’re maximizing their dollars
Challenges:
Front line staff resistance
 Make sure to communicate what it is
Training, will she have to spend a ton of time getting trained and then training?
Set expectations on how the report will be used
Not a road block but the agency goes through constant change
 Front line staff will say “what’s in it for me” – how long will this be around, what’s the benefit to customers, what is going to be added to or come off my plate – starting communication early is key; most/almost all staff would not know about KIP
 How will training be rolled out? Will I have to go somewhere, will it be accessible remotely, will it be a week, 3 days, etc. – how long will it take?
 Communicate – talk about the benefits and why it’s happening
 Front line supervisors are most important for setting expectations – how will you report out? You can only drive performance with data if you can identify successes and needs for improvement
 Manual reports – homeless population – no clear-cut ability to run a clear, accurate report
 Has to go to multiple programs to run reports – first goes to the FRC to run a report and remove some non-applicable customers, then goes to Housing to run another report: how many folks do you have that match this criteria? What % in a district are homeless – a manual count
 Reports are run at the neighborhood level, community level, and board level (2 districts overlap – data often needs to be broken down into these categories to adhere to various stakeholders). Would be nice to sort into the applicable audience
Reports:
Has to go to various departments to get reports, and then have to go through data to figure out specifics manually
Metrics:
 Recent examples of metrics – eligibility
o LOOK AT THOSE FOR GOOD EXAMPLES
 Immediate need process
o With regulations
 Weekly
o # of applications
 Having the details of those would be good for the dashboard
 Both weekly rollups and monthly rollups
 How are we monitoring that
 Easy to read
 Make sure everyone is flexible, because dashboard metrics and data might shift throughout the stages of KIP
 Have a program/report analyst able to reach out to her if there are quick changes in any of the reports
Misc.:
 Great eligibility dashboards have evolved – take a look at those (CalWORKs, CalFresh, Medi-Cal, General Relief)
 The change brought a cultural shift and reporting out (individual reports are run at each program/service site – how many “Immediate Needs” applications were processed; where are we at with meeting federal standards for immediate needs?)
 Change management is essential to KIP’s success – setting expectations related to the data is key, ensuring things don’t just look good on paper

North Central Family Resource Center Focus Group – October 24, 2013
Benefits of KIP
If they can't provide the benefits that they offer here, they will provide the help to get the resources from somewhere else
Automatic referrals o Hard to get information from schools
 2-3 lead time on ordering client history; a system could really help with that if they were pulling the data from an in-house system like KIP (the system they pull from now is called EVE?)
Other offices that this one works with o Refugee o African alliance o Etc.
 Following up on referrals
o In some cases they are trying to promote self-sufficiency
o In other cases they would like to follow up
"Automation" help desk for new system feedback
Appendix O: Survey Results
Program SMEs Survey
Note: Text in bold indicates common responses, outliers, or particularly interesting/insightful answers.
About You
1. What program or service do you represent:
a. Aging and Independence Services
b. Behavioral Health Services X
c. Child Welfare Services XXXXX
d. Eligibility XX
e. Executive Office X
f. Housing and Community Development X
g. Information Technology
h. Probation
i. Public Health Services XXXX
j. Other KIP X
2. What is your tenure with the County:
a. < 5 years (0)
b. 5 – 15 years (9)
c. 15 – 25 years (4)
d. > 25 years (2)
*Note: 3 individuals completed the About You section of this survey (their program and tenure with the HHSA) but did not respond to any other survey questions. This is a key finding for purposes of assessing the reliability of the data (one must consider that BHS and Housing & Community Development’s perspectives were not captured) as well as from a change management perspective. The individuals that did not respond were from BHS, Housing & Community Development, and CWS, and had 15-25 years, >25 years, and 5-15 years of HHSA tenure, respectively.
Staffing / Customer Profile
3. How often does your program measure the time it takes, from start to finish, to complete and/or perform business processes? (M) (R) (CM)
a. Often. (5)
b. Occasionally. (3)
c. Rarely. (4)
d. Never. (1)
*Note: 2 Eligibility respondents chose “Often,” one chose “Never.”
Operations
4. What processes are the most time consuming for you and your team? (R) (CM) Open Text.
 Child Welfare:
o Rolling out programs and/or legislative mandates; getting approvals from the chain of command
o Trying to arrange language- and culturally-appropriate providers for clients; playing phone tag with individuals and community partners; court time
o a) Depends by service sub-division; the usual complaint is that the paperwork load takes away from time for actual face-to-face client assistance; b) locating appropriate service providers (esp. special language or cultural needs); c) court time (preparing for trial or sitting waiting for a hearing)
 Eligibility:
o Monitoring and tracking work to ensure timeliness and that productivity standards are met; determining productivity standards is time consuming and entails an ever-vacillating set of variables and product/task definitions.
o Processing of applications/FRC; Writing of Program Material/Program
 Executive Office:
o Investigations
 Public Health:
o Delays in data entry due to external website speed
o Charting of home visits (it is captured on paper, not electronically)
o Data entry into individual client record systems (i.e. PHIS, ETO, PHIX, SDIR) and an Access database developed by end users; paper charting for nursing
o Data quality assurance; interface integration; software testing
5. Where do you (or does your team) experience the most redundancy in work (specific forms, procedures, etc.)? (R) (CM) Open Text.
 Child Welfare:
o Multiple team members at the same meeting; different meetings with the same team members
o Trying to arrange language- and culturally-appropriate providers for clients; playing phone tag with individuals and community partners; court time
o a) Multiple arms needing to be routed paperwork that they do nothing with – no need to route to so many levels or entities; b) missed phone calls; phone tag
 Eligibility:
o MediCal lists – lots of duplicate lists from a variety of sources. Getting bottom-line numbers to project workload and staffing, such as the number of MC packets received, is a multistep process. It requires one to merge Excel spreadsheets, sort out duplicates, etc. to get the final result. The main redundancy is that the tracking tools of line staff and those of supervisors duplicate information. Neither is able to produce quick reports. The time staff take to fill out tracking tools is probably 2 hours a day cumulatively; supervisors spend 3-4 hours a day. 2 hours for a line staff equals a Renewal, and 3-4 hours for supervisors equals time they could spend reviewing cases for accuracy.
o Dealing with the varying regulations for the individual programs; collecting data from clients/FRC
 Executive Office:
o N/A
 Public Health:
o Data entry of STD cases
o Inputting into three databases to capture case management activities (Kronos, PHIS and PHIX)
o Duplicate/triplicate data entry into individual client record systems (i.e. PHIS, ETO, PHIX, and SDIR)
o Data quality assurance; interface integration; software testing
6. Regarding your current IT environment, what applications do you feel are the most user-friendly? (R) (CM) Open Text.
 Child Welfare:
o Some web-based programs that we use to support our job – CLEAR, SDM tools
o Only one database system
o We use only one application for our main case management; other departments may track things on simple Excel or Access databases
 Eligibility:
o Microsoft Office – SharePoint. Our staff are using OneNote in creative ways and exploring other instant messaging tools.
o Outlook
 Executive Office:
o Certiphi
 Public Health:
o Local Evaluation Online – CDPH Office of AIDS database for reporting HIV Counseling and Testing, and Prevention database
o PHIX Web
o PHIX Web, Office and Field, SDIR
o MS Office
7. Regarding your current IT environment, what applications do you feel are the least user-friendly? (R) (CM) Open Text.
 Child Welfare Services:
o CWS/CMS, REGIS
o One database system isn’t state of the art
o Our database is a state database designed in 1997. It can lack desirable functionality.
 Eligibility:
o CWEA – a well-known issue. And while not an application, the organization of the S: drive is in great need of revision. The pathways are not intuitive or logical, and it takes a long time to finally get to the file one desires. Most of our tracking tools are stored on the S: drive, which is why it is a significant concern.
o UNKNOWN
 Public Health:
o SharePoint, PHIS, Access databases; CalREDIE – California Reportable Disease Information Exchange; ETO – Efforts to Outcomes (NFP database)
8. What programs do you collaborate/communicate with to provide coordinated care or service? (M) (R) Select all that apply.
Aging & Independence Services: (3 of 15 total): KIP, Executive Office, Eligibility
Behavioral Health Services: (7 of 15) - CWS (3), KIP, Executive Office, PHS(2)
Child Welfare Services: KIP, Executive Office, Eligibility, PHS
Housing Assistance: CWS (2), KIP, Executive Office, Eligibility, PHS (2)
Public Assistance Programs (e.g. Cal Fresh, Cal WORKS, Medi-Cal, General Relief, LIHP): KIP, Executive Office, Eligibility, PHS, CWS
Public Health Services: CWS (2), KIP, Executive Office, Eligibility
Probation: (includes AB109) - CWS (3), KIP, Executive Office, Eligibility, PHS
 Other(s) Open Text: HHSA Regional Public Health Centers, DEH. Public Assistance and Probation both received 9 checkmarks from the audience surveyed.
The Knowledge Integration Program (KIP) / Change Management
9. To the best of your knowledge, what is the current level of awareness, within your unit, of the person-centered/coordinated service model that KIP technology will support? (M) (CM) Check all that apply.
My team understands that the changing health services landscape, along with other changes in overall customer expectations and new technologies demands integrated service delivery methods (i.e. “We need to streamline our services in order to provide the level of service that our customers want and expect,” etc.).
i. 3 responses – CWS, KIP, PHS
My team is dissatisfied with current access to customer information across programs and redundancy. (i.e. “Our customers think we are already integrated - and we should be,” “It’s about time someone listened to me,” etc.).
i. 4 responses – Eligibility and 3 PHS
Though most of my team is aware of the need for change, some are content with the current situation (i.e. “What’s wrong with the way we do things now?”)
i. 1 response CWS
My team somewhat perceives a need for this change but is primarily consumed with keeping up with operational workload. (i.e. “I wish that we did business differently but I don’t have time to learn new processes/procedures,” etc.).
i. 4 responses – CWS, Eligibility and 2 PHS
The majority of my team believes in the status quo (i.e. “We’ve done this as long as I’ve been here so why change now; if it’s not broke don’t fix it,” etc.)
i. 1 response Eligibility
 None of the above. Please explain: Open Text.
i. Eligibility: Some prefer to leave things as is, while others know they could deliver improved services given the right environment
ii. Eligibility: There is a great understanding of the changing landscape in Eligibility based on the ACA. There is great anticipation for QMatic and CERMS. However, our staff are not as aware of the KIP ideals.
10. When considering the proposed transformation of service delivery associated with KIP, what staff concerns do you anticipate? (CM) Check all that apply.
 Increasing workload (either temporarily, with the learning curve, or permanently, with new job expectations – or both) (6)
Inability to perform at their pre-KIP level (3)
Lessened job security (0)
Fears of new technology (5)
Lessening of data quality/integrity, etc. (3)
Misinterpretation of data gathered (5)
General confusion, lack of awareness about where to get more information (5)
I do not foresee any staff concerns with the KIP (0)
 Other. Please explain: Open Text.
i. Anxiety about new things
ii. Fear. We have staff who are also benefit recipients. Keeping their cases in a secured caseload is very important to them. They may be concerned that more personal information could be accessed under a knowledge-sharing model. ***
iii. Not knowing which information is accessible to the end user and/or client. ***
11. How have past changes been managed, and how has your team perceived these initiatives? (CM)
 Very well; new initiatives have energized the staff due to high levels of sponsorship accountability, follow-through, and honest communication. (1) Eligibility
 Well; our team understands that change/innovation is part of the County’s vision. (6)
 Fair; some initiatives have been met with minor discontent from the staff due to management’s implementation. (3)
 Poor; the team does not respond well to changes due to their perception of past change management. (1) Executive
12. Is there a report that describes your customer profile?
a. No. (4)
b. Yes. (5) If yes, please list: Open Text.
i. Not sure what this means
ii. Client demographics available in CWS/CMS
iii. Can pull client demographics from CWS/CMS; the CWS intranet has real-time data dashboards
iv. We can capture data on the number of cases, language of cases, and other case variables. (data unit)
v. Eligibility Operations has several reports that provide data, numbers, and measurables, but I don’t believe there is one all-encompassing report that gives us the profile of our customer in the various programs.
vi. PHN Referral Criteria doc, NFP ETO, PHIX Office, PHIS. These applications/programs manage our clients from either a client- or family-centric approach.
13. Do you have any reports that include information from across programs? (R) (M) (CM) Check one.
 No. (8)
 Yes. (2)
 If yes, what is the name of the report? Open Text.
i. We would like cross-program reports, but we usually have to get the other program’s information and roll it up into a report we are doing. CWS
ii. Can pull client demographics from CWS/CMS; the CWS intranet has real-time data dashboards
iii. Dual agency reports (juvenile probation & CWS)
iv. PHIS
14. If yes, do you have access to any cross-program reports that inform how many of your customers use other services? (M) (R) (CM)
 No. (12)
 Yes. (0)
 If yes, what is (are) the name(s) of the report(s)? Open Text.
a. N/A; our cross info (such as the FY-SIS education database) is mostly demographic info, not service info
b. The database contains demographic information regarding our clients, not services provided. We share health and education information of our dependent children with various community partners
c. PHIS does not allow this.
Table 13 Data SME Survey Results: Snapshot of Raw Data Collection
Survey columns: Program; Tenure; Reports by department; Cross-program reports?; Reports available for job efficiency? (yes/no and specifics); Most relied-on reports; Metrics vital to program; Metrics tied to program admin; Dashboard design (customer, operational, outcome, and other metrics); Access to data wanted for dashboard? (yes/no and specifics). Responses are listed per respondent in the order collected; blank cells are omitted.

1. Aging and Independence Services, 15 – 25 years: Court; No; No; n/a; Internal Logs; Cases meeting 30-day requirement by office/worker.

2. Aging and Independence Services, 5 – 15 years: Statistics; Daily case assignments; Percent of cases with financial abuse allegation.

3. Aging and Independence Services, 5 – 15 years: Monthly data reports to CMS; Recidivism data; Percent of cases by office that are ENI or NIFFI.

4. Aging and Independence Services, 5 – 15 years: IHSS; 30-day visit data; Similarities among programs; Percent of cases by office that are ENI or NIFFI.

5. Aging and Independence Services, 5 – 15 years: SART; No; No; client demographics, insurance, income, zip, referrals and services provided, readmission rates, services provided within COSD; MEDS; "The system we use to electronically store our data does not produce reports. Are other programs within COSD serving our clients? This would reduce duplication of efforts and coordinate our services."; All case notes; Number of new case assignments by office/worker.

6. Aging and Independence Services, > 25 years: Referrals.

7. Eligibility, 15 – 25 years: (no further responses).

8. Eligibility, > 25 years: (no further responses).

9. Eligibility, 15 – 25 years: Eligibility; No; Yes; Yes.

10. Executive Office, 5 – 15 years: Eligibility; No; "Other departments are reluctant to share data needed to complete projects, such as address-level data on clients"; Weekly and Monthly Manager Dashboard; Pending applications/reports, timely disposition; Application processing timeliness, work participation rate, and error rate; Weekly number of unduplicated clients/cases by program/overall; Pending applications/reports (daily), timely disposition (daily), call center wait times/callers in queue (continuous), lobby wait times/clients in queue (continuous); Monthly work participation rate and average time on aid; Clients with CWS/BHS/APS involvement (monthly); Yes; CIS downloads.

11. Information Technology, 5 – 15 years: (no further responses).

12. Behavioral Health Services, 15 – 25 years: ADS Databook; "We have collaborated with other groups for grants, including CWS and probation. We have also matched GR clients."; "I run Alcohol and Drug reports. It would be helpful to compare ADS meth clients with CWS outcomes for those same clients to see the effect of treatment on the outcomes of their children."; COR outcome data; Monthly number of clients by region and funding source; Monthly percentage of successful referrals; Monthly percent of clients successfully completing treatment; Daily client-specific information regarding use of other programs; No; Access to some, not all.

13. Behavioral Health Services, > 25 years: Meth Strikeforce report; No; Yes; Cerner's Etreby v7; Number of clients by region by primary and secondary drug of choice.

14. Behavioral Health Services, > 25 years: Operational; No; Yes; Weekly security reports.

15. Child Welfare Services, 5 – 15 years: Operational; No; Yes.

16. Aging and Independence Services, 5 – 15 years: APS; Total referrals via ALEX made using the web portal, count of referrals by program; "Some information is in our system that we would like to pull data from, but the query/report building is beyond the ability of our best"; 10-day visit data; Monthly number of new referrals by program; Cases activated in the required 10-day time frame; Daily; Findings on cases, relationship of SA to the client, and type of abuse; Yes.

17. Aging and Independence Services, 5 – 15 years: APS, CPS; Yes; Referrals; Call volume report (abandoned calls, wait time, calls answered), also reports of referrals taken for each program; Information & Assistance; Reports/referrals accessed daily for management of referrals, but also pull daily/monthly/annual reports; Open Text to have advanced search mechanism; Preferably immediately (live feed); Yes; Data available in real time.
Appendix P: Acronym Cheat Sheet Example
Figure 13 Acronym Cheat Sheet Example
Appendix Q: Change Management Communications
Figure 14 Educate about the Universal Dictionary – Acronyms and Ontology
Appendix R: Change Management – Disruptive Communication Examples
Pre-KIP unveiling presentation – generate buzz
Appendix S: Lean Six Sigma evaluation of the AB109 Service Flow
Table 14 Present State
Phase | Task | Days | Time | Current Tools | Lean Category | Six Sigma Category
Assessment | Establish assignment | 2 - 3 days | 8 hours | Checklist | Over processing | Variable
Assessment | Find status of EOP and CCCMS | | 8 hours | | |
Assessment | Notify RN | | 2 hours | | |
Assessment | Notify BHST team | | 2 hours | | |
Transportation | Person exits jail | 2 - 3 days | 8 hours | | |
Transportation | Transported to the CTC | | 8 hours | | |
Probation Orientation | Criminogenic Needs Assessment (COMPAS) | 1 day | 2 - 3 hours | Questionnaire | | Variable
BHAST Screening | Referred to the nurse case manager if needed | | 2 - 3 hours | Questionnaire, BHST Decision Support Tree | | Variable
MASU Data Entry | Client data entry | | 4 hours | | Over processing |
Referral | Within an MDT model a COMPAS case plan is developed | 1 day | 2 - 3 hours | Questionnaire | Waiting | Variable
Referral | Immediate referral to services; transported out | 4 - 5 days | 32 hours | | |
Total | | 10 days | 78 work hours | | |
Table 15 Future State
Phase | Task | Days | Time | Tools | Lean Improvement | Six Sigma Tool
Assessment | Establish assignment | 1 day | 2 hours | Checklist | Reduce over processing through integration of data in system | Suggested decision support tree
Assessment | Find status of EOP and CCCMS | | 2 hours | | |
Assessment | Notify RN | | 1 hour | | |
Assessment | Notify BHST team | | 1 hour | | |
Transportation | Person exits jail | 2 - 3 days | 8 hours | | |
Transportation | Transported to the CTC | | 8 hours | | |
Probation Orientation | Criminogenic Needs Assessment (COMPAS) | 1/2 day | 2 - 3 hours | Questionnaire | | Suggested decision support tree
BHAST Screening | Referred to the nurse case manager if needed | | 2 - 3 hours | Questionnaire, BHST Decision Support Tree | | Suggested decision support tree for MASU
MASU Data Entry | Client data entry | | 0.5 hour | | Automation of data entry |
Referral | Within an MDT model a COMPAS case plan is developed | 1 day | 2 - 3 hours | Questionnaire | Reduce over processing through integration of data in system | Suggested decision support tree
Referral | Immediate referral to services; transported out | 1 day | 8 hours | | |
Total | | 6 days | 36.5 work hours | | |
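The stated totals in Tables 14 and 15 can be reconciled from the per-task hours. The sketch below illustrates the arithmetic; the task labels are abbreviated for readability, and where the tables give a range (e.g. 2-3 hours) the lower bound is assumed, which is how the reported totals add up.

```python
# Cross-check of the work-hour totals in Tables 14 (present state) and
# 15 (future state) of the AB109 service flow evaluation.
# Hours are taken from the tables; ranged estimates use the lower bound.

present_hours = {
    "Establish assignment": 8,
    "Find status of EOP and CCCMS": 8,
    "Notify RN": 2,
    "Notify BHST team": 2,
    "Person exits jail": 8,
    "Transported to the CTC": 8,
    "COMPAS assessment": 2,
    "BHAST screening": 2,
    "Client data entry": 4,
    "COMPAS case plan (MDT)": 2,
    "Immediate referral to services": 32,
}

future_hours = {
    "Establish assignment": 2,
    "Find status of EOP and CCCMS": 2,
    "Notify RN": 1,
    "Notify BHST team": 1,
    "Person exits jail": 8,
    "Transported to the CTC": 8,
    "COMPAS assessment": 2,
    "BHAST screening": 2,
    "Client data entry": 0.5,
    "COMPAS case plan (MDT)": 2,
    "Immediate referral to services": 8,
}

present_total = sum(present_hours.values())  # 78, matching Table 14
future_total = sum(future_hours.values())    # 36.5, matching Table 15
savings = present_total - future_total       # 41.5 work hours saved

print(f"Present: {present_total} h, Future: {future_total} h, "
      f"Savings: {savings} h ({savings / present_total:.0%})")
```

Most of the projected saving comes from the single largest step, "Immediate referral to services" (32 hours down to 8), with the remainder from integrating data in the assessment phase and automating MASU data entry.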