Pilot Survey

 

Prepared by:

Josses Mugabi, Samalie Mutuwa

 

Customer Satisfaction

Survey Report

 August 2009


 

Contents

Executive summary
1. Introduction
   1.1 Background
   1.2 Objectives and report outline
2. Approach and methodology
   2.1 What constitutes satisfaction?
   2.2 Survey setting and sampling
   2.3 Survey questionnaire
   2.4 Main survey administration
   2.5 Data compilation and analysis
3. Results and discussion
   3.1 Performance matrix
   3.2 Customer satisfaction index
4. Conclusion
5. Annexes
   5.1 Survey questionnaire
   5.2 Sample screenshots of the data entry and analysis spreadsheets
   5.3 Area Performance Charts


Executive summary

 As part of NWSC’s continuous endeavour to serve its customers better, the R&D department was

asked to develop and test a methodology to facilitate regular customer satisfaction measurement,

and to identify areas where customers would like us to improve.

 A telephone survey methodology emerged as the most efficient and cost effective way of periodically

assessing customer satisfaction (CS). This methodology was tested in six Areas (Kampala,

Bushenyi, Entebbe, Kabale, Mbarara and Tororo), with a total sample size of 1,743 customers.

The objectives of the survey were four-fold. First, we sought to ascertain the importance customers attach to various attributes of our services. Second, we wanted to find out customers' perception of our performance (satisfaction) on those attributes. Third, we wanted to ascertain where the scope and priorities for improvement lie. Fourth, we wanted to demonstrate an approach to CS benchmarking that could be incorporated in the existing M&E framework for the new IDAMC III.

The results of the survey showed that on average customers attach high importance to all the

service attributes identified in previous surveys (i.e. reliability, pressure, water quality, timely and

accurate water bills, responsiveness in resolving complaints, responsiveness in effecting new

connections, customer care, convenience of bill payment process and office ambience).

However, customers’ level of satisfaction is moderate for most of the attributes, except office ambience, convenience of bill payment processes and customer care. Moreover, satisfaction levels for technical attributes (such as supply reliability, pressure and quality) are generally lower than for customer service-related attributes, implying that the scope for improvement lies in addressing the technical quality dimensions of our service.

The survey also demonstrated the sort of customer satisfaction benchmarking that could be

incorporated in the existing M&E framework for the new IDAMC III. This benchmarking is based on

an overall measure of satisfaction called the customer satisfaction index (CSI). CSI values were

calculated for all the areas/branches surveyed, with Mbarara emerging as the best performing Area

with a CSI value of 91 percent, and Bushenyi the lowest performing with a CSI value of 78 percent.

It should be noted however that CSI values, although useful for benchmarking purposes, are not

informative – i.e. they do not tell the Area manager what attributes of the service need to be

improved. For this reason, CSI calculations should always be complemented with an analysis of the

performance relative to customer priorities (the performance matrix) in order to highlight those attributes where managers need to pay more attention.

It is recommended that surveys like these become a regular feature of our M&E framework so that

we are able to understand and track changes in customer priorities. To do this however, we will

need to ensure that our customer databases are kept up-to-date and complete with customer

telephone contacts – something that we found wanting in all the Areas.


Before we begin to create tools to measure the level of satisfaction, it is important to develop a clear

understanding of what exactly the customer wants. We need to know what our customers expect

from the services we provide.

Customer expectations are the customer-defined attributes of our service which we must meet or

exceed to achieve customer satisfaction. Previous customer perception surveys carried out in

NWSC have highlighted a number of service attributes which our customers expect. These include:

•  supply reliability

•  sufficient supply pressure

•  good quality water

•  timely and accurate bills

•  responsiveness to general inquiries

•  responsiveness in resolving complaints

•  responsiveness in effecting new connections

•  customer care (valuing and treating them well)

•  convenience of bill payment process

•  regular information updates regarding services

•  good office ambience

It should be noted that we cannot create customer satisfaction just by meeting these customer

requirements fully because these have to be met in any case. However, falling short is certain to create dissatisfaction.

On the other hand, different customers will tend to rate the importance of these attributes differently.

Some may not care so much about office ambience, while others may attach high importance to how

quickly we resolve their complaints or the convenience of our bill payment process.

For performance measurement purposes therefore, we must first find out the importance customers

attach to each of the above attributes, and then assess their level of satisfaction on each. This way,

we are able to ascertain our performance relative to customer priorities, thus providing an easy way

to monitor improvements and to decide which attributes to concentrate on in order

to improve customer satisfaction.

The above constitutes the framework under which this survey was undertaken. The next section

describes the survey setting and sampling design adopted.

2.2 Survey setting and sampling

Initially, this survey was meant to cover all NWSC operational areas, but it was later scaled down to

only those Areas that managed to provide a full list of their customers (including contact telephone

numbers) in time.

The Areas which complied in time with our request for customer telephone numbers included:

Kampala, Bushenyi, Entebbe, Kabale, Mbarara and Tororo. The rest of the Areas did not submit

telephone numbers or provided them late.

The sample size was based on a 95 percent confidence interval and a ±10 percent margin of error.

In addition, a 50 percent response rate was assumed, implying that we had to target twice the required sample size in order to obtain the required number of completed questionnaires.

Customer telephone numbers were then randomly selected from the customer lists to obtain the

random sample. Table 2.1 shows the sample sizes for each of the six Areas. The sample size for

Kampala Water was drawn from only six branches (Branch 1, 2, 3, 4, 5 and 6), which we considered

representative of the entire customer base in Kampala.
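The sampling steps described above can be sketched as follows. The z-value, the use of Cochran's formula for a proportion, and the helper names are my assumptions; the report does not state the exact calculation used, and the Area sample sizes in Table 2.1 clearly reflect additional considerations beyond this generic formula.

```python
import math
import random

def required_sample(margin=0.10, z=1.96, p=0.5):
    # Cochran's formula for estimating a proportion:
    # n = z^2 * p * (1 - p) / e^2, with z = 1.96 for 95 percent confidence.
    return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

def draw_sample(customer_numbers, seed=None):
    # Double the required size, assuming a 50 percent response rate,
    # then select telephone numbers at random from the Area's list.
    target = 2 * required_sample()
    rng = random.Random(seed)
    return rng.sample(customer_numbers, min(target, len(customer_numbers)))
```

Under these assumptions, `required_sample()` gives 97 completed questionnaires per Area, so the calling target would be 194 customers.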


Table 2.1: Sample sizes

Area            Sample size*

Kampala 1,150

Entebbe 121

Mbarara 102

Bushenyi 105

Kabale 87

Tororo 178

TOTAL 1,743

2.3 Survey questionnaire

 A structured questionnaire was developed to measure both customer priorities – i.e. the level of

importance customers attach to the various service attributes mentioned above - and their level of

satisfaction with our performance on those attributes.

The questionnaire therefore had two parts: Part A contained 11 questions intended to find out

customers’ personal views on the importance they attach to various aspects of our water service,

while Part B consisted of 12 questions intended to find out customers’ level of satisfaction with our

services.

The questionnaire was designed to minimise time and effort on the customer's part, and to actively

encourage the customer to answer the questions. This was achieved by incorporating 'objective'

type questions where a customer had to 'rate' on a scale of 1 to 7, for both ‘importance’ and

satisfaction. However, space was also provided for the customer's own opinions. This enabled them

to state any shortcomings or suggestions that could be useful in improving our service delivery.

The questionnaire was also pre-tested in line with standard survey practice. The process of pre-

testing involved: (i) asking colleagues to review both the form and content of measures, and clarity

of instructions; and (ii) soliciting comments from the commercial and customer care division to

ensure that all service attributes captured in previous surveys were correctly represented.

Following the pre-test, a pilot survey was carried out with a small random sample of customers in

order to further test the suitability of the questionnaire and the procedures for data collection. The

pilot study was conducted in Branch 2 of Kampala Water. Both parts of the questionnaire were

subjected to internal consistency tests and found to be reliable. A copy of the final questionnaire

used in the survey is attached in Annex 5.1. The next section briefly describes the procedures

followed in administration of the questionnaire.
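The report does not name the internal consistency test used on the two parts of the questionnaire. Cronbach's alpha is the standard choice for multi-item rating scales like these, so a minimal sketch is given below; the function name and data layout are illustrative, not taken from the report.

```python
import statistics

def cronbach_alpha(items):
    # items: one list of scores per question, all covering the same
    # respondents, i.e. items[q][r] = respondent r's score on question q.
    k = len(items)
    sum_item_vars = sum(statistics.pvariance(q) for q in items)
    totals = [sum(scores) for scores in zip(*items)]  # per-respondent totals
    return (k / (k - 1)) * (1 - sum_item_vars / statistics.pvariance(totals))
```

By convention, alpha values above roughly 0.7 are read as acceptable reliability.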

2.4 Main survey administration

The questionnaire was administered by telephone by staff from different departments, who took

time off from their normal duties to call the sampled customers. Prior to questionnaire

administration, the selected staff members attended a two-hour briefing session during which the objectives of the survey, the questionnaire, method of administration and data entry procedures were

explained.

 A key consideration in survey practice is the response rate, that is, how many of the individuals

selected for the survey actually participated. Non-response bias is created when non-respondents’

would-be responses differ from the responses of those who participate in the study. The magnitude

of non-response bias depends on a study’s response rates. Moreover, in survey practice, overall

response rate is considered as an indicator of the representativeness of sample respondents.

Response rates of at least 50 percent, 60 percent and more than 70 percent are considered

adequate, good and very good, respectively.


Computation of response rate for this survey was based on only those customers we contacted. We

consider response rate as a measure of our success in persuading sampled customers to participate

in the survey, and so we do not count against ourselves those whom we could not even contact (i.e.

telephone numbers switched off or not on the network).

The initial total sample size for the entire survey was 1,742. Out of these, a total of 968 customers

could not be contacted by telephone due to various reasons such as wrong or non-existent

telephone numbers, switched off telephone numbers, and limited time given to the interviewers to

complete the survey. As a result, the net sample size was 774. The total number of usable questionnaires returned was 647. This resulted in an effective response rate of 84 percent.
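Using the figures above, the response-rate arithmetic works out as follows:

```python
initial_sample = 1742    # total telephone numbers drawn
uncontactable = 968      # wrong, non-existent or switched-off numbers
net_sample = initial_sample - uncontactable   # 774 customers reachable
usable_returns = 647     # completed, usable questionnaires
response_rate = 100 * usable_returns / net_sample
print(round(response_rate))  # 84
```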

The total cost of questionnaire administration was UGX 3,278,800 (including the cost of air-time for

the telephones, translation costs and a modest allowance for the survey team). If we had opted for

face-to-face administration, the total cost would have amounted to about UGX 3,692,800 (i.e.

allowance for interviewer, translation costs, and photocopying, transport and accommodation costs).

Therefore, it appears there is not much difference in the costs of administration for face-to-face and

telephone administration. Telephone administration is however preferred because of the quick

turnaround and high response rates as compared to face-to-face administration. The next section

explains how the data from the telephone survey was compiled and analysed.

2.5 Data compilation and analysis

Data entry and analysis spreadsheets were developed to enable interviewers to enter responses

directly as they talked to customers. Sample screenshots of the data entry and analysis

spreadsheets are shown in Annex 5.2.

To validate the data, random checks were performed on selected customers in the sample to

verify that the interviews had indeed been carried out. This involved calling selected customers and

asking them whether anyone from NWSC had called them regarding a customer satisfaction survey.

The analytical work was mainly aimed at deriving two measures from the data: (i)

performance matrix, i.e. our performance relative to customer’s priorities; and (ii) customer

satisfaction index (CSI), i.e. overall customer satisfaction.

The performance matrix was obtained by averaging the importance and satisfaction scores for each

parameter and plotting these on the same bar chart to highlight areas where there is scope for improvement. For descriptive purposes, scores above 6 were considered high, while scores

between 4 and 6 were considered moderate. Importance or satisfaction scores below 4 were

considered low.
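The descriptive bands just described amount to a simple classifier; the function name is mine, the thresholds are the report's.

```python
def describe(score):
    # Bands used in the report for average 1-7 importance/satisfaction scores.
    if score > 6:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"
```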

The CSI on the other hand represents overall satisfaction level and was calculated as follows:

•  compute the average importance score for each service attribute (I)

•  compute the average satisfaction score for each service attribute (S)

•  compute the average importance score across all service attributes (Iall)

•  calculate a weight (W) for each attribute by dividing its average importance score by the average across all attributes, i.e. W = I/Iall

•  calculate weighted satisfaction scores (i.e. satisfaction scores that take the importance ratings into account) = S*W

•  CSI = the average of S*W over all service attributes, expressed as a percentage.
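The steps above can be written out as a short function. The report does not state how the weighted average is converted to a percentage; dividing by the scale maximum of 7 is my assumption, chosen because it reproduces values in the reported 78 to 91 percent range.

```python
def csi(importance, satisfaction, scale_max=7):
    # importance[i], satisfaction[i]: average 1-7 scores for attribute i.
    i_all = sum(importance) / len(importance)                  # Iall
    weights = [i / i_all for i in importance]                  # W = I / Iall
    weighted = [s * w for s, w in zip(satisfaction, weights)]  # S*W
    return 100 * (sum(weighted) / len(weighted)) / scale_max
```

With equal importance scores everywhere, the weights are all 1 and the CSI reduces to the plain average satisfaction expressed as a percentage of 7.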

For descriptive purposes, CSI values above 85 percent were taken to represent high levels of overall

satisfaction, while those below 60 percent were taken to represent a low level of satisfaction. CSI

values between 60 and 85 percent represented a moderate level of satisfaction.


3. Results and discussion

3.1 Performance matrix

Figure 3.1 shows the global performance matrix emerging from the entire sample. Area

performance charts are provided in Annex 5.3. Based on the entire sample, we note that on

average, customers attach high importance (scores >6) to all the service attributes. However, their

level of satisfaction is moderate for most of the attributes, except office ambience, convenience of bill

payment processes and customer care.

Fig. 3.1: Global performance matrix

It can also be noted that satisfaction levels for technical attributes (such as supply reliability,

pressure and quality) are generally lower than for customer service-related attributes, implying that the

scope for improvement lies in addressing the technical quality dimensions of our service.

The high satisfaction level on customer service-related attributes (e.g. customer care, office

ambience, and convenience of bill payment processes) is not surprising given the corporation’s

sustained efforts over the years to improve customer service. Previous surveys of NWSC

customers [2] have also shown that customer service-related attributes are better predictors of

customer satisfaction than technical quality attributes. However, falling short on technical quality is

certain to create ‘raging’ fans instead of raving fans. Given our focus on creating raving fans, it is

important that we balance our efforts and start paying attention to the technical attributes of our

services as well.

3.2 Customer satisfaction index

Figure 3.2 shows CSI values for each of the Areas and KW branches surveyed. With the exception

of Kabale and Bushenyi, all the other Areas have CSI values of 85 percent and above, implying high

levels of overall satisfaction.

Mbarara Area has the highest CSI value (91 percent) while Bushenyi has the lowest (78 percent).

Both Bushenyi and Kabale perform below the sample average of 85 percent.

[2] Kayaga, S. (2002). The influence of customer perceptions of urban water services on bill payment behaviour. PhD thesis, Loughborough University, UK.


Fig 3.2: CSI values by Area

Fig 3.3: CSI values by KW Branch (based on five branches only) 

For KW, branches 1 and 5 have the highest level of overall satisfaction (89% and 87% respectively).

Branches 2, 3 and 4 perform below the KW average of 85 percent.

This analysis demonstrates the kind of customer satisfaction benchmarking that could be

incorporated in the existing M&E framework for the new IDAMC III. It was not possible to obtain CSI

benchmarking figures from African water utilities, because many of them do not carry out regular

customer satisfaction surveys. Even those which do carry out some sort of customer surveys do not

calculate CSI values.

4. Conclusion

This survey sought to ascertain three things: (i) the importance customers attach to various attributes of our services; (ii) customers' perception of our performance on those attributes; and (iii) the priorities for improvement.

The results showed that on average customers attach high importance to all the service attributes

identified in previous surveys (i.e. reliability, pressure, water quality, timely and accurate water bills,


responsiveness in resolving complaints, responsiveness in effecting new connections, customer

care, convenience of bill payment process and office ambience).

However, customers’ level of satisfaction is moderate for most of the attributes, except office

ambience, convenience of bill payment processes and customer care. Moreover, satisfaction levels

for technical attributes (such as supply reliability, pressure and quality) are generally lower than

for customer service-related attributes, implying that the scope for improvement lies in addressing the

technical quality dimensions of our service.

The survey also demonstrated the sort of customer satisfaction benchmarking that could be incorporated in the existing M&E framework for the new IDAMC III. This benchmarking is based on

an overall measure of satisfaction called the customer satisfaction index (CSI). CSI values were

calculated for all the areas/branches surveyed, with Mbarara emerging as the best performing Area

with a CSI value of 91 percent, and Bushenyi the lowest performing with a CSI value of 78 percent.

It should be noted however that CSI values, although useful for benchmarking purposes, are not

informative – i.e. they do not tell the Area manager what attributes of the service need to be

improved. For this reason, CSI calculations should always be complemented with an analysis of the

performance relative to customer priorities (the performance matrix) in order to highlight those

attributes where managers need to pay more attention.

It is recommended that surveys like these become a regular feature of our M&E framework so that

we are able to understand and track changes in customer priorities. To do this however, we will

need to ensure that our customer databases are regularly updated with customer telephone

contacts.


5. Annexes


5.1 Survey questionnaire

National Water and Sewerage Corporation

Customer Satisfaction Survey Questionnaire

(Telephone surveys)

Questionnaire S/No: ___________________ Area: _________________________

Customer Reference No. _________________ Branch: _________________________

To the interviewer: Please read the following statement to each customer before you ask the questions.

Hello, I’m calling from National Water and Sewerage Corporation. My name is

 ________________________. As part of our continuous endeavour to serve you better, NWSC

management would like to know how you feel about our services. We are therefore conducting a survey to

establish areas that you would like us to improve upon since you are the reason we exist. We randomly

selected your phone number from our customer database. The survey is voluntary and will take about 10

minutes. Your opinions are very important to us. Please be assured that your responses shall be treated

with utmost confidentiality. May I proceed?

To be completed by the interviewer:

The language being used for the interview is: ____________________________________

Survey date: _________________________

Customer tel. number used (if different from the one on the sample sheet): ________________

 ______________________________________________________________________


Section A: [Customer Priorities]

This first section consists of a set of 11 questions intended to find out your personal views on the importance

you attach to various aspects of our piped water service to your home/premises/institution. If you do not

have an opinion on a particular question or if you feel a particular question does not apply to you, please feel

free to let me know.

 A1. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to having a reliable and continuous

supply of tap water to your home/premises/institution?

 A2. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to receiving water of adequate pressure at

your home/premises/institution?

 A3.  On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to receiving good quality water at your

home/premises/institution?

 A4. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to receiving timely and accurate monthly

bills for the water you consume?

 A5. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to having your enquiries responded to

quickly?

 A6. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to having your complaints resolved

quickly?

(Response scale recorded for each question: 1  2  3  4  5  6  7  |  N  |  DK/NA)


  A7. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to having your request for a new

connection effected quickly?

 A8. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to being treated well as a valuable

customer when you interact with our staff?

 A9. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to having a convenient system of paying

your monthly water bills?

 A10. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to receiving regular information updates

regarding our services and plans?

 A11. On a scale of 1 to 7, where 1 represents “extremely unimportant” and 7 represents “extremely

important”, how would you rate the importance you attach to being attended to in a clean ambience

when you visit any of our offices?

 ___________________________________________________________________________________

Section B: [Customer Satisfaction]

This section consists of a set of 12 questions intended to find out your level of satisfaction with our services.

If you do not have an opinion on a particular question or if you feel a particular question does not apply to

you, please feel free to let me know.

B1. On a scale of 1 to 7, where 1 represents “extremely dissatisfied” and 7 represents “extremely satisfied”,

how would you rate your level of satisfaction with the reliability of water supply to your

home/premises/institution?

(Response scale recorded for each question: 1  2  3  4  5  6  7  |  N  |  DK/NA)


B2. On a scale of 1 to 7, where 1 represents “extremely dissatisfied” and 7 represents “extremely

satisfied”, how would you rate your level of satisfaction with the water pressure  at your

home/premises/institution?

B3.  On a scale of 1 to 7, where 1 represents “extremely dissatisfied” and 7 represents “extremely

satisfied”, how would you rate your level of satisfaction with the quality of water you receive at your

home/premises/institution?

B4. On a scale of 1 to 7, where 1 represents “extremely dissatisfied” and 7 represents “extremely

satisfied”, how would you rate your level of satisfaction with the accuracy of monthly bills for the

water you consume?

B5. On a scale of 1 to 7, where 1 represents “extremely dissatisfied” and 7 represents “extremely

satisfied”, how would you rate your level of satisfaction with the time our staff take to respond to 

your enquiries?

B6. On a scale of 1 to 7, where 1 represents “extremely dissatisfied” and 7 represents “extremely

satisfied”, how would you rate your level of satisfaction with the time our staff take to resolve your

complaints?

B7. On a scale of 1 to 7, where 1 represents “extremely dissatisfied” and 7 represents “extremely

satisfied”, how would you rate your level of satisfaction with the time our staff take to effect new

connection requests?

B8. On a scale of 1 to 7, where 1 represents “extremely dissatisfied” and 7 represents “extremely

satisfied”, how would you rate your level of satisfaction with regard to our customer care?

(Response scale recorded for each question: 1  2  3  4  5  6  7  |  N  |  DK/NA)


5.2 Sample screenshots of the data entry and analysis spreadsheets




5.3 Area Performance Charts

Fig 5.1: Kampala Water Performance Matrix

Fig 5.2: Bushenyi Area Performance Matrix


Fig 5.3: Entebbe Area Performance Matrix

Fig 5.4: Kabale Area Performance Matrix


Fig 5.5: Mbarara Area Performance Matrix

Fig 5.6: Tororo Area Performance Matrix

