Customer Satisfaction Survey Report
August 2009
Prepared by: Josses Mugabi and Samalie Mutuwa
Contents
Executive summary
1. Introduction
   1.1 Background
   1.2 Objectives and report outline
2. Approach and methodology
   2.1 What constitutes satisfaction?
   2.2 Survey setting and sampling
   2.3 Survey questionnaire
   2.4 Main survey administration
   2.5 Data compilation and analysis
3. Results and discussion
   3.1 Performance matrix
   3.2 Customer satisfaction index
4. Conclusion
5. Annexes
   5.1 Survey questionnaire
   5.2 Sample screenshots of the data entry and analysis spreadsheets
   5.3 Area Performance Charts
Executive summary
As part of NWSC's continuous endeavour to serve its customers better, the R&D department was asked to develop and test a methodology to facilitate regular customer satisfaction measurement and to identify areas where customers would like us to improve. A telephone survey methodology emerged as the most efficient and cost-effective way of periodically assessing customer satisfaction (CS). This methodology was tested in six Areas (Kampala, Bushenyi, Entebbe, Kabale, Mbarara and Tororo), with a total sample size of 1,743 customers.

The objectives of the survey were four-fold. First, we sought to ascertain the importance customers attach to various attributes of our services. Second, we wanted to find out customers' perception of our performance (satisfaction) on those attributes. Third, we wanted to ascertain where the scope and priorities for improvement lie. Fourth, we wanted to demonstrate an approach to CS benchmarking that could be incorporated in the existing M&E framework for the new IDAMC III.

The results of the survey showed that on average customers attach high importance to all the service attributes identified in previous surveys (i.e. reliability, pressure, water quality, timely and accurate water bills, responsiveness in resolving complaints, responsiveness in effecting new connections, customer care, convenience of the bill payment process and office ambience). However, customers' level of satisfaction is moderate for most of the attributes, except office ambience, convenience of bill payment processes and customer care. Moreover, satisfaction levels for technical attributes (such as supply reliability, pressure and quality) are generally lower than for customer service related attributes, implying that the scope for improvement lies in addressing the technical quality dimensions of our service.

The survey also demonstrated the sort of customer satisfaction benchmarking that could be incorporated in the existing M&E framework for the new IDAMC III. This benchmarking is based on an overall measure of satisfaction called the customer satisfaction index (CSI). CSI values were calculated for all the Areas/branches surveyed, with Mbarara emerging as the best performing Area with a CSI value of 91 percent, and Bushenyi the lowest performing with a CSI value of 78 percent. It should be noted, however, that CSI values, although useful for benchmarking purposes, are not on their own informative: they do not tell the Area manager which attributes of the service need to be improved. For this reason, CSI calculations should always be complemented with an analysis of performance relative to customer priorities (the performance matrix) in order to highlight those attributes where managers need to pay more attention.

It is recommended that surveys like these become a regular feature of our M&E framework so that we are able to understand and track changes in customer priorities. To do this, however, we will need to ensure that our customer databases are kept up to date and complete with customer telephone contacts, something that we found wanting in all the Areas.
1. Introduction

1.1 Background
Achieving customer satisfaction is usually the primary goal for many water service providers, and many managers would gladly profess to be striving to provide satisfaction to their customers. In NWSC, we have gone a step further and realised that merely satisfied customers just aren't good enough. Inspired by Ken Blanchard and Sheldon Bowles' best seller1, we have set out to create raving fans. However, we are still mindful of the fact that without satisfied customers, we cannot have raving fans.

There is obviously a strong link between customer satisfaction and loyalty, and that is why we must regularly measure and track changes in customer satisfaction. With a better understanding of customers' perceptions of our services, we can determine the actions required to meet customers' needs. Measuring customer satisfaction also helps to promote an increased focus on customer requirements and stimulates improvements in work practices and processes. It is also a requirement of the new ISO standard under which some of our Area operations are certified; under this standard we are required to identify the parameters that cause customer satisfaction or dissatisfaction and consciously measure them.
1.2 Objectives and report outline
The overall objective of this exercise was to develop and test a survey methodology to facilitate regular customer satisfaction (CS) measurement. The specific objectives were to:
- determine the importance customers attach to various attributes (i.e. technical and customer service aspects) of our services (customer priorities);
- determine customers' perception of our performance on these attributes (satisfaction);
- determine priorities for improvement; and
- demonstrate an approach to CS benchmarking that could be incorporated in the existing M&E framework for the new IDAMC III.
This report synthesises the results of the survey of 1,743 customers from six operational Areas and identifies those service parameters where customers would like us to improve. The report also demonstrates an approach to CS benchmarking based on an overall measure of satisfaction. The rest of the report is organised as follows. Section 2 provides a brief discussion of the approach and methodology adopted, including the conceptual position taken regarding what constitutes satisfaction, the survey setting and sampling procedures followed, questionnaire development and administration, and the data compilation and analysis process. Section 3 presents the pattern of results and briefly discusses their significance and implications, while Section 4 provides some concluding remarks and recommendations.
2. Approach and methodology

2.1 What constitutes satisfaction?
There is a lot of debate and confusion about what exactly customer satisfaction is and how to go about measuring it. Customer satisfaction is quite a complex issue, and it is not the purpose of this report to elaborate on the conceptual debates that exist in the literature. In our context, customer satisfaction is simply customers' perception that we (NWSC) have met or exceeded their expectations.
1 Blanchard, K. and Bowles, S. (1993). Raving Fans: A Revolutionary Approach to Customer Service. Harper Collins.
Before we begin to create tools to measure the level of satisfaction, it is important to develop a clear understanding of what exactly the customer wants. We need to know what our customers expect from the services we provide. Customer expectations are the customer-defined attributes of our service which we must meet or exceed to achieve customer satisfaction. Previous customer perception surveys carried out in NWSC have highlighted a number of service attributes which our customers expect. These include:
- supply reliability
- sufficient supply pressure
- good quality water
- timely and accurate bills
- responsiveness to general inquiries
- responsiveness in resolving complaints
- responsiveness in effecting new connections
- customer care (valuing and treating them well)
- convenience of the bill payment process
- regular information updates regarding services
- good office ambience
It should be noted that we cannot create customer satisfaction just by meeting these customer requirements fully, because these have to be met in any case; however, falling short is certain to create dissatisfaction. Different customers will also tend to rate the importance of these attributes differently. Some may not care so much about office ambience, while others may attach high importance to how quickly we resolve their complaints or the convenience of our bill payment process. For performance measurement purposes, therefore, we must first find out the importance customers attach to each of the above attributes, and then assess their level of satisfaction on each. This way, we are able to ascertain our performance relative to customer priorities, providing an easy way to monitor improvements and to decide which attributes to concentrate on in order to improve customer satisfaction.

The above constitutes the framework under which this survey was undertaken. The next section describes the survey setting and sampling design adopted.
2.2 Survey setting and sampling
Initially, this survey was meant to cover all NWSC operational Areas, but it was later scaled down to only those Areas that managed to provide a full list of their customers (including contact telephone numbers) in time. The Areas which complied in time with our request for customer telephone numbers were Kampala, Bushenyi, Entebbe, Kabale, Mbarara and Tororo. The rest of the Areas either did not submit telephone numbers or provided them late.

The sample size for each Area was based on a 95 percent confidence level and a 10 percent margin of error. In addition, a 50 percent response rate was assumed, implying that we had to target twice the required sample size in order to obtain the required number of completed questionnaires. Customer telephone numbers were then randomly selected from the customer lists to obtain the random sample. Table 2.1 shows the sample sizes for each of the six Areas; a sketch of the underlying calculation is given after the table. The sample size for Kampala Water was drawn from only six branches (Branches 1 to 6), which we considered representative of the entire customer base in Kampala.
Table 2.1: Sample sizes
Area        Sample size*
Kampala         1,150
Entebbe           121
Mbarara           102
Bushenyi          105
Kabale             87
Tororo            178
TOTAL           1,743
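The report does not spell out the formula behind these figures. The sketch below is one plausible reconstruction, assuming the standard Cochran sample size formula with a finite population correction, a 95 percent confidence level (z = 1.96), a 10 percent margin of error and maximum variance (p = 0.5), then doubling for the assumed 50 percent response rate. The helper function and the customer population figure are illustrative only.

```python
import math

def target_sample_size(num_customers, z=1.96, margin=0.10, p=0.5,
                       expected_response_rate=0.5):
    # Cochran formula for an infinite population
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    # Finite population correction for an Area with num_customers connections
    n = n0 / (1 + (n0 - 1) / num_customers)
    # Inflate for the assumed 50 percent response rate to get the number to contact
    return math.ceil(n / expected_response_rate)

# Illustrative only: an Area with about 5,000 active connections
print(target_sample_size(5000))   # 189 customers to contact
```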
2.3 Survey questionnaire
A structured questionnaire was developed to measure both customer priorities (i.e. the level of importance customers attach to the various service attributes mentioned above) and their level of satisfaction with our performance on those attributes. The questionnaire therefore had two parts: Part A contained 11 questions intended to find out customers' personal views on the importance they attach to various aspects of our water service, while Part B consisted of 12 questions intended to find out customers' level of satisfaction with our services.

The questionnaire was designed to minimise time and effort on the customer's part, and to actively encourage the customer to answer the questions. This was achieved by incorporating 'objective' type questions where the customer had to give a rating on a scale of 1 to 7, for both importance and satisfaction. However, space was also provided for the customer's own opinions, enabling them to state any shortcomings or suggestions that could be useful in improving our service delivery.

The questionnaire was also pre-tested in line with standard survey practice. The pre-testing involved: (i) asking colleagues to review both the form and content of the measures and the clarity of instructions; and (ii) soliciting comments from the commercial and customer care division to ensure that all service attributes captured in previous surveys were correctly represented. Following the pre-test, a pilot survey was carried out with a small random sample of customers in order to further test the suitability of the questionnaire and the procedures for data collection. The pilot study was conducted in Branch 2 of Kampala Water. Both parts of the questionnaire were subjected to internal consistency tests and found to be reliable. A copy of the final questionnaire used in the survey is attached in Annex 5.1. The next section briefly describes the procedures followed in administering the questionnaire.
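The report notes that both parts of the questionnaire passed internal consistency tests but does not name the statistic used. The sketch below assumes Cronbach's alpha, a conventional choice for multi-item rating scales; the function name and pilot responses are invented for illustration.

```python
import numpy as np

def cronbach_alpha(scores):
    """scores: rows = respondents, columns = questionnaire items rated 1-7."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                               # number of items
    item_vars = scores.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = scores.sum(axis=1).var(ddof=1)        # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Invented pilot responses: 5 respondents, 4 items on the 1-7 scale
pilot = [[6, 7, 6, 7], [5, 5, 6, 5], [7, 7, 7, 6], [4, 5, 4, 5], [6, 6, 5, 6]]
print(round(cronbach_alpha(pilot), 2))   # about 0.91; values above roughly 0.7 are usually taken as reliable
```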
2.4 Main survey administration
The questionnaire was administered by telephone by staff from different departments, who took time off from their normal duties to call the sampled customers. Prior to questionnaire administration, the selected staff members attended a two-hour briefing session during which the objectives of the survey, the questionnaire, the method of administration and the data entry procedures were explained.

A key consideration in survey practice is the response rate, that is, how many of the individuals selected for the survey actually participated. Non-response bias is created when non-respondents' would-be responses differ from the responses of those who participate in the study. The magnitude of non-response bias depends on a study's response rate. Moreover, in survey practice, the overall response rate is considered an indicator of the representativeness of sample respondents. Response rates of at least 50 percent, 60 percent and more than 70 percent are considered adequate, good and very good, respectively.
Computation of the response rate for this survey was based only on those customers we actually contacted. We consider the response rate a measure of our success in persuading sampled customers to participate in the survey, and so we do not count against ourselves those whom we could not even contact (i.e. telephone numbers switched off or not on the network). The initial total sample size for the entire survey was 1,742. Of these, a total of 968 customers could not be contacted by telephone for various reasons, such as wrong or non-existent telephone numbers, switched-off telephones, and the limited time given to the interviewers to complete the survey. As a result, the net sample size was 774. The total number of usable questionnaires returned was 647, giving an effective response rate of 84 percent.

The total cost of questionnaire administration was UGX 3,278,800 (including the cost of air-time for the telephones, translation costs and a modest allowance for the survey team). If we had opted for face-to-face administration, the total cost would have amounted to about UGX 3,692,800 (i.e. allowances for interviewers, translation costs, photocopying, transport and accommodation costs). There is therefore not much difference in administration cost between face-to-face and telephone administration. Telephone administration is nevertheless preferred because of its quick turnaround and high response rates compared to face-to-face administration. The next section explains how the data from the telephone survey were compiled and analysed.
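For clarity, the response rate arithmetic described above works out as follows (figures taken from the text; the variable names are ours):

```python
initial_sample  = 1742   # customers drawn from the lists
not_contactable = 968    # wrong or non-existent numbers, phones off, time constraints
net_sample = initial_sample - not_contactable   # 774 customers actually reached
usable_returns = 647                            # completed, usable questionnaires
print(f"{usable_returns / net_sample:.0%}")     # 84%, the effective response rate
```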
2.5 Data compilation and analysis
Data entry and analysis spreadsheets were developed to enable interviewers to enter responses directly as they talked to customers. Sample screenshots of the data entry and analysis spreadsheets are shown in Annex 5.2. In order to verify the data, random checks were performed on selected customers in the sample to confirm that the interviews had indeed been carried out. This involved calling selected customers and asking them whether anyone from NWSC had called them regarding a customer satisfaction survey.

The analytical work was mainly aimed at determining two measures from the data: (i) the performance matrix, i.e. our performance relative to customers' priorities; and (ii) the customer satisfaction index (CSI), i.e. overall customer satisfaction. The performance matrix was obtained by averaging the importance and satisfaction scores for each parameter and plotting these on the same bar chart to highlight areas where there is scope for improvement. For descriptive purposes, scores above 6 were considered high, scores between 4 and 6 moderate, and importance or satisfaction scores below 4 low.

The CSI, on the other hand, represents the overall satisfaction level and was calculated as follows (a worked sketch is given after this list):
- compute the average importance score for each service attribute (I);
- compute the average satisfaction score for each service attribute (S);
- compute the average importance score across all service attributes (Iall);
- calculate a weight (W) for each attribute by dividing its average importance score by the average for all attributes, i.e. W = I / Iall;
- calculate weighted satisfaction scores (i.e. satisfaction scores that take into account the importance ratings) = S * W;
- CSI = average of S * W for all service attributes, expressed as a percentage.
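A minimal sketch of this calculation is given below, assuming the average importance (I) and satisfaction (S) scores per attribute are already at hand. The report does not state how the weighted average is converted to a percentage; dividing by the 7-point scale maximum is assumed here, and the attribute figures shown are illustrative, not survey results.

```python
def customer_satisfaction_index(importance, satisfaction):
    """importance, satisfaction: dicts mapping attribute -> average score on the 1-7 scale."""
    i_all = sum(importance.values()) / len(importance)              # Iall
    weights = {a: importance[a] / i_all for a in importance}        # W = I / Iall
    weighted = [satisfaction[a] * weights[a] for a in importance]   # S * W
    mean_weighted = sum(weighted) / len(weighted)                   # average of S * W
    return 100 * mean_weighted / 7   # assumed: expressed as a percentage of the scale maximum

# Illustrative averages for three attributes (not actual survey results)
importance = {"reliability": 6.8, "water quality": 6.9, "office ambience": 6.1}
satisfaction = {"reliability": 5.2, "water quality": 5.5, "office ambience": 6.3}
print(round(customer_satisfaction_index(importance, satisfaction)))   # about 81
```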
For descriptive purposes, CSI values above 85 percent were taken to represent high levels of overall satisfaction, while those below 60 percent were taken to represent a low level of satisfaction. CSI values between 60 and 85 percent represented a moderate level of satisfaction.
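These descriptive bands translate into a simple classification, sketched below. The treatment of a CSI of exactly 85 percent is our assumption, following Section 3.2 where 85 percent and above is counted as high.

```python
def describe_csi(csi_percent):
    # Bands from Section 2.5; exactly 85 percent treated as high, per Section 3.2
    if csi_percent >= 85:
        return "high"
    elif csi_percent >= 60:
        return "moderate"
    return "low"

print(describe_csi(91), describe_csi(78))   # high moderate
```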
3. Results and discussion

3.1 Performance matrix
Figure 3.1 shows the global performance matrix emerging from the entire sample. Area performance charts are provided in Annex 5.3. Based on the entire sample, we note that on average, customers attach high importance (scores >6) to all the service attributes. However, their level of satisfaction is moderate for most of the attributes, except office ambience, convenience of bill payment processes and customer care.
Fig. 3.1: Global performance matrix
It can also be noted that satisfaction levels for technical attributes (such as supply reliability, pressure and quality) are generally lower than for customer service related attributes, implying that the scope for improvement lies in addressing the technical quality dimensions of our service. The high satisfaction levels on customer service related attributes (e.g. customer care, office ambience, and convenience of bill payment processes) are not surprising given the corporation's sustained efforts over the years to improve customer service. Previous surveys of NWSC customers2 have also shown that customer service related attributes are better predictors of customer satisfaction than technical quality attributes. However, falling short on technical quality is certain to create raging fans instead of raving fans. Given our focus on creating raving fans, it is important that we balance our efforts and start paying attention to the technical attributes of our services as well.
3.2 Customer satisfaction index
Figure 3.2 shows CSI values for each of the Areas and KW branches surveyed. With the exception of Kabale and Bushenyi, all the other Areas have CSI values of 85 percent and above, implying high levels of overall satisfaction. Mbarara Area has the highest CSI value (91 percent) while Bushenyi has the lowest (78 percent). Both Bushenyi and Kabale perform below the sample average of 85 percent.
2 Kayaga, S. (2002). The influence of customer perceptions of urban water services on bill payment behaviour. PhD thesis, Loughborough University, UK.
Fig 3.2: CSI values by Area
Fig 3.3: CSI values by KW Branch (based on five branches only)
For KW, Branches 1 and 5 have the highest levels of overall satisfaction (89 percent and 87 percent respectively), while Branches 2, 3 and 4 perform below the KW average of 85 percent.

This analysis demonstrates the kind of customer satisfaction benchmarking that could be incorporated in the existing M&E framework for the new IDAMC III. It was not possible to obtain CSI benchmarking figures from African water utilities, because many of them do not carry out regular customer satisfaction surveys, and even those which do carry out some sort of customer survey do not calculate CSI values.
4. Conclusion
This survey sought to achieve three objectives: to determine (i) the importance customers attach to various attributes of our services; (ii) customers' perception of our performance on those attributes; and (iii) priorities for improvement.

The results showed that on average customers attach high importance to all the service attributes identified in previous surveys (i.e. reliability, pressure, water quality, timely and accurate water bills, responsiveness in resolving complaints, responsiveness in effecting new connections, customer care, convenience of the bill payment process and office ambience). However, customers' level of satisfaction is moderate for most of the attributes, except office ambience, convenience of bill payment processes and customer care. Moreover, satisfaction levels for technical attributes (such as supply reliability, pressure and quality) are generally lower than for customer service related attributes, implying that the scope for improvement lies in addressing the technical quality dimensions of our service.

The survey also demonstrated the sort of customer satisfaction benchmarking that could be incorporated in the existing M&E framework for the new IDAMC III. This benchmarking is based on an overall measure of satisfaction called the customer satisfaction index (CSI). CSI values were calculated for all the Areas/branches surveyed, with Mbarara emerging as the best performing Area with a CSI value of 91 percent, and Bushenyi the lowest performing with a CSI value of 78 percent. It should be noted, however, that CSI values, although useful for benchmarking purposes, are not on their own informative: they do not tell the Area manager which attributes of the service need to be improved. For this reason, CSI calculations should always be complemented with an analysis of performance relative to customer priorities (the performance matrix) in order to highlight those attributes where managers need to pay more attention.

It is recommended that surveys like these become a regular feature of our M&E framework so that we are able to understand and track changes in customer priorities. To do this, however, we will need to ensure that our customer databases are regularly updated with customer telephone contacts.
5. Annexes
5.1 Survey questionnaire
National Water and Sewerage Corporation
Customer Satisfaction Survey Questionnaire (Telephone surveys)
Questionnaire S/No: ___________________
Area: _________________________
Customer Reference No. _________________ Branch: _________________________
To the interviewer: Please read the following statement to each customer before you ask the questions.
Hello, I'm calling from National Water and Sewerage Corporation. My name is ________________________. As part of our continuous endeavour to serve you better, NWSC management would like to know how you feel about our services. We are therefore conducting a survey to establish areas that you would like us to improve upon, since you are the reason we exist. We randomly selected your phone number from our customer database. The survey is voluntary and will take about 10 minutes. Your opinions are very important to us, and please be assured that your responses shall be treated with utmost confidentiality. May I proceed?
To be completed by the interviewer:
The language being used for the interview is: ____________________________________
Survey date: _________________________
Customer tel. number used (if different from the one on the sample sheet): ________________
______________________________________________________________________
Section A: [Customer Priorities] This first section consists of a set of 11 questions intended to find out your personal views on the importance you attach to various aspects of our piped water service to your home/premises/institution. If you do not have an opinion on a particular question or if you feel a particular question does not apply to you, please feel free to let me know.
A1. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to having a reliable and continuous supply of tap water to your home/premises/institution?
1 2 3 4 5 6 7
N DK/NA
A2. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to receiving water of adequate pressure at your home/premises/institution?
1 2 3 4 5 6 7
N DK/NA
A3. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to receiving good quality water at your home/premises/institution?
1 2 3 4 5 6 7
N DK/NA
A4. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to receiving timely and accurate monthly bills for the water you consume?
1 2 3 4 5 6 7
N DK/NA
A5. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to having your enquiries responded to quickly?
1 2 3 4 5 6 7
N DK/NA
A6. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to having your complaints resolved quickly?
1 2 3 4 5 6 7
N DK/NA
A7. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to having your request for a new connection effected quickly?
1 2 3 4 5 6 7
N DK/NA
A8. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to being treated well as a valuable customer when you interact with our staff?
1 2 3 4 5 6 7
N DK/NA
A9. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to having a convenient system of paying your monthly water bills?
1 2 3 4 5 6 7
N DK/NA
A10. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to receiving regular information updates regarding our services and plans?
1 2 3 4 5 6 7
N DK/NA
A11. On a scale of 1 to 7, where 1 represents extremely unimportant and 7 represents extremely important, how would you rate the importance you attach to being attended to in a clean ambience when you visit any of our offices?
1 2 3 4 5 6 7
N DK/NA
___________________________________________________________________________________ Section B: [Customer Satisfaction] This section consists of a set of 12 questions intended to find out your level of satisfaction with our services. If you do not have an opinion on a particular question or if you feel a particular question does not apply to you, please feel free to let me know.
B1. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with the reliability of water supply to your home/premises/institution?
1 2 3 4 5 6 7
N DK/NA
B2. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with the water pressure at your home/premises/institution?
1 2 3 4 5 6 7
N DK/NA
B3. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with the quality of water you receive at your home/premises/institution?
1 2 3 4 5 6 7
N DK/NA
B4. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with the accuracy of monthly bills for the water you consume?
1 2 3 4 5 6 7
N DK/NA
B5. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with the time our staff take to respond to your enquiries?
1 2 3 4 5 6 7
N DK/NA
B6. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with the time our staff take to resolve your complaints?
1 2 3 4 5 6 7
N DK/NA
B7. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with the time our staff take to effect new connection requests?
1 2 3 4 5 6 7
N DK/NA
B8. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with regard to our customer care?
1 2 3 4 5 6 7
N DK/NA
B9. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with regard to the convenience of our bill payment process?
1 2 3 4 5 6 7
N DK/NA
B10. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with management's effort in providing information updates regarding our services and plans?
1 2 3 4 5 6 7
N DK/NA
B11. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of satisfaction with the cleanliness and ambience at our offices?
1 2 3 4 5 6 7
N DK/NA
B12. On a scale of 1 to 7, where 1 represents extremely dissatisfied and 7 represents extremely satisfied, how would you rate your level of overall satisfaction with our services?
1 2 3 4 5 6 7
N DK/NA
To the enumerator: Please ask the respondent if he/she has any specific comments or suggestions that could be useful in improving service delivery.
......
Thank you for the time you have spent in answering the questions. We are grateful for your support and cooperation.
To the enumerator:
Time taken to complete the questionnaire
Any comments?
5.2 Sample screenshots of the data entry and analysis spreadsheets
5.3 Area Performance Charts
Fig 5.1: Kampala Water Performance Matrix
Fig 5.2: Bushenyi Area Performance Matrix
Fig 5.3: Entebbe Area Performance Matrix
Fig 5.4: Kabale Area Performance Matrix
Fig 5.5: Mbarara Area Performance Matrix
Fig 5.6: Tororo Area Performance Matrix