



How to measure customer satisfaction

How to measure customer satisfaction A tool to improve the experience of customers November 2007 1How to measure customer satisfaction A tool to improve the experience of customers November 2007 2Table of Contents 4.3 How should the information be collected 25 Introduction 4 4.4 How do I know I have got it right 27 1 Why should we measure customer 5 How can I get insight from the results 29 satisfaction and who should be involved 6 5.1 Where do I start 29 1.1 Why should we measure satisfaction 6 5.2 Who thinks what 29 1.2 Who should be involved 7 5.3 What is driving satisfaction and how 30 5.4 What can I compare my results with 31 2 What will the process involve 9 3 Where do I start 11 6 How do I communicate and action 3.1 How do I define my service 11 35 the results, and then what 3.2 Who are my customers 14 6.1 Who should I communicate the findings to 35 3.3 What do we know already 15 6.2 How do I communicate the findings 35 3.4 What else can I find out 17 6.3 How do I action the results 36 4 How do I measure satisfaction 19 6.4 And now what happens 36 4.1 What should I ask 19 4.2 Who should be interviewed 23 3Introduction This document is a customer satisfaction The toolkit is designed for ‘service owners’ The toolkit has been commissioned by measurement ‘toolkit’. It is designed to within government, and those in the the Customer Insight Forum (CIF), within help public service providers improve research, insight and policy communities the framework of Transformational the experiences of their customers by who are responsible for implementing Government, and follows on from the understanding how to undertake effective a programme to measure and monitor Primer in Customer Insight in Public 2 customer satisfaction measurement. customer satisfaction. It can be read Services. 
The CIF was first formed as The toolkit considers the process of alongside a sister publication, Promoting an informal network in 2006, following measurement from customer research and Customer Satisfaction: Guidance on publication of the Transformational analysis through to the implementation of a Improving the Customer Experience Government strategy, to promote service improvement strategy. in the Public Services, which has been best practice in the use of customer written to support the process of insight across Government. It now has Transformational Government and the a more formal and active role in the drive towards improved service delivery. implementation and governance of service With the introduction in CSR 07 of a transformation. The toolkit and guidance crossgovernment Service Transformation have both been developed and produced 1 Agreement , departments, agencies and by BMRB Social Research and Henley Centre local government need to show how they HeadlightVision. are improving customers’ experiences of their services. Together the Guidance and Toolkit set out how public service providers can begin to do this. 1 Service Transformation Agreement, October 2007: http://www.hmtreasury.gov.uk/media/B/9/pbrcsr07service.pdf 4 2 Customer Insight in Public Services A Primer, October 2006: http://www.cabinetoffice.gov.uk/upload/assets/www.cabinetoffice.gov.uk/publications/deliverycouncil/word/custinsightprimer061128.docREADER’S GUIDANCE: This toolkit is intended to be accessible to all those involved in conducting or interpreting customer satisfaction measurement. As such there may be some sections which cover areas that the reader is already familiar with. In particular, research professionals may find that the overviews of data collection and sampling approaches summarise rather than provide detail in these potentially complex areas. For more detailed information about methodological issues please see the Government Social 3 Research Unit’s Magenta book. 
3 The Magenta Book: Guidance notes for Policy Evaluation and Analysis: http://www.gsr.gov.uk/professionalguidance/magentabook/ 5Why should we measure customer 1 satisfaction and who should be involved The experience that customers have of Customer satisfaction measurement is a Customer satisfaction measurement allows an questionnairebased research approach. organisation to understand the issues, or key drivers, services can be explored in various ways. However, for quantitative measurement to be that cause satisfaction or dissatisfaction with a Qualitative research techniques can effective, it will generally need to be preceded service experience. When an organisation is able to be used to better understand a service by qualitative research to explore the key understand how satisfied its customers are, and why, through the customers’ eyes, and to features of a service from the perspective of it can focus its time and resources more effectively. explore in depth their experiences and the customer. Customer Journey Mapping and Customer satisfaction measurement may also enable expectations. Quantitative research can other techniques that do this are discussed in an organisation to understand the extent to which provide numerical measures of customer 4 detail in the CIF publication on this subject. satisfaction with a service is influenced by factors satisfaction and statistically representative outside of its control (such as the media) and to findings to assess the performance of 1.1 Why should we measure differentiate between what people say influences a service and provide information to how satisfied they are, and what is really driving their satisfaction drive improved service quality. This is satisfaction during a service experience. Customer referred to as Customer Satisfaction While good research can be used for performance satisfaction measurement can help an organisation management and/or to meet statutory requirements, Measurement and is our focus here. 
understand what it can and cannot control. the most successful customer measurement Customer satisfaction measurement involves the Most importantly, customer satisfaction measurement programmes are motivated by the desire to put collection of data that provides information about helps an organisation focus on its customers, and customer focus at the heart of an organisation. how satisfied or dissatisfied customers are with a should galvanise service owners, customerfacing Customerfocused organisations view customer service. As well as providing an organisation with staff, policy, strategy and research staff, as well as satisfaction measurement as a means rather than an ‘scores’, the data can be used to understand the senior management, around the aim of improving the end – as part of a cycle of continuous improvement reasons for the level of satisfaction that has been customer experience. in service delivery, and as part of the wider toolkit recorded. This information can be collected and of customer insight techniques. Many organisations analysed in many different ways. This toolkit explores regularly track their levels of customer satisfaction the basic processes and the relative benefits of to monitor performance over time and measure the different approaches. impact of service improvement activity. 6 4 http://www.cabinetoffice.gov.uk/publicservicereform/deliverycouncil/workplan.aspxto improve customer experience can lead to more From their day to day work, customerfacing 1.2 Who should be timely action on the findings of the research. This is staff will have ideas about how customers view Our research involved particularly important in local government where the the experience of a service and the reasons why found that there leadership tends to be more closely involved in service experiences are satisfactory or not. When preparing is much excellent The most important stakeholders in customer design and delivery. 
the way for customer research, it is important to customer satisfaction measurement are, of course, the service tap into this insight as it can guide the focus of the satisfaction ● Policy and Strategic staff should use the customers themselves. From an internal perspective, measurement work and provide valuable material for questionnaire findings to support strategic decision making. however, there are a number of professional groups already being development. Customerfacing staff are also critical whose involvement in the research will ultimately ● Research and Insight staff will need to undertaken across stakeholders when it comes to implementing the central and local determine whether or not it is effective. The customer analyse the data and share findings effectively. results of customer satisfaction measurement: it will government. measurement programme itself may be executed often be their job to deliver the changes which can ● Communications staff should be involved in However, only a by the research community within an organisation, bring improvements in the customer experience. Their communicating the research findings and resulting small proportion but for implementation to be effective it needs to be commitment and buyin is essential. actions to internal and external audiences, including of this work is ‘owned’ by the organisation: customers. being used to ● Senior management can make the difference drive service ● Operational management need to between good research that remains unused and transformation. understand how the findings can be applied to A basic rule of genuine service transformation. The involvement of their area of responsibility. 
Customer satisfaction thumb is – if you senior management not only signals that the work is measurement will give a sense – at a very tactical don’t have the viewed as strategically important, it also means that level of how customers feel about the service they full support and those who have the power to act on the findings are are providing and the performance of staff involved involvement more likely to do so. in delivery of the service. Service directors need to of key internal understand why they are obtaining these results and ● Political leaders are important to help agree stakeholders, how they can be used to drive forward improvements and articulate the policy commitments in terms your research in delivery. programme will of service improvement that can be undertaken not be effective. as a result of the findings. In particular, the early ● Customerfacing staff are incredibly valuable involvement of politicians in recognising the need in customer research programmes for many reasons. 7What will the process involve 2 First Time Start here... Find out What will the Define what you Explore process involve know Insight audit: Qualitatively: What/who Admin data Internal clients Service Complaints Customers Customers Previous surveys Staff Internal clients Other research Measure Take action customer experience Uncover your Communicate insights and plan Engage and Analyse and build: discuss with: Segmentation Internal clients Key drivers Staff (all levels) Further qual work Customers Build other info in Service Transformation Cycle 8What will the process involve 2 Measuring customer satisfaction is just one For further guidance on embedding customer insight in your organisation, see the Customer Insight Forum’s stage in a continuous programme of service paper Establishing an Effective Customer Insight transformation. For organisations new 5 Capability in the Public Sector. 
to this process, the first stages require a review of what the service provides, where The Government Communication Network’s Engage it sits in context with other related services programme also provides an excellent framework for the effective use of customer insight, taking well tried in customers’ minds, who its customers are principles of strategic communication and adapting and what information about the customer them for Government to develop communications experience is already available. 6 that shift attitudes and change behaviours. After this, qualitative research should be conducted with customers and staff to highlight key issues that the survey will need to capture. At this point decisions will need to be made about which customers should be interviewed and what methods should be used. Once the survey has been conducted the data will need to be interpreted to provide actionable insights for the organisation. Finally, the results will need to be communicated across the organisation in such a way that the findings are taken on board and action taken as a result. For many organisations this process will form a continuous cycle of improvement. 5 Establishing an Effective Customer Insight Capability in Public Sector Organisations, Cabinet Office, January 2007: http://www.cabinetoffice.gov.uk/upload/ 9 assets/www.cabinetoffice.gov.uk/publications/deliverycouncil/word/emergingprinciples.doc 6 Further information on the Engage programme is available at: http://www.cabinetoffice.gov.uk/governmentcommunication/engage.aspxWhere do I start 3 First Time Start here... 
Find out Define what you Explore know Insight audit: Qualitatively: What/who Admin data Internal clients Service Complaints Customers Customers Previous surveys Staff Internal clients Other research Measure Take action customer experience Uncover your Communicate insights and plan Engage and Analyse and build: discuss with: Segmentation Internal clients Key drivers Staff (all levels) Further qual work Customers Build other info in 10Where do I start 3 For organisations that are new to customer Broadly speaking there are four questions to address, ● Are my customers involved in simple or complex and we will go through each of these in turn in more interactions with my service satisfaction measurement, getting started detail: can seem a daunting task in itself. As a ● How do customers interact with my service first step, it involves understanding what ● How do I define my service ● Do customers define my service in the same way customer satisfaction measurement can do ● Who are my customers that I do for your organisation and making sure that ● What do I know already The answers to these questions can influence both your key internal stakeholders understand customer perceptions of the service and the way this as well as the research community. ● What else can I find out in which the customer satisfaction measurement After this, there are a series of (quite programme is designed, conducted and analysed. straightforward) issues to consider on It is therefore important to think through these the road to designing or commissioning a 3.1 How do I define my issues before designing or commissioning customer research programme. satisfaction measurement. service This section provides a check list of questions How do customers come to use my service to answer before starting to measure customer Defining the service that the organisation (or relevant The answers to this question may seem obvious, and satisfaction. 
All organisations, whether they part of it) provides will help inform everything else. will be for some types of service, but it is worth asking are already carrying out customer satisfaction The most fundamental things that an organisation as part of the process of defining your service. One measurement or are relatively new to it, should should consider are what its service is and what of the key issues to consider here is that of customer consider these points to ensure that they are spending vision it has for it. Among the more specific issues to choice – for example, is the service one that customers taxpayers’ money wisely. consider when measuring customer satisfaction are: opt in to (such as dental services or NHS Direct), one ● How do customers come to use my service that is universally provided (such as refuse collection) ● Does my service involve an outcome that is likely to or one that customers are required by law to use (such affect satisfaction as vehicle tax) Whichever of these applies, is your organisation the sole provider of the service or can ● Do customers pay for my service or is it ‘free at the the customer choose between different providers point of delivery’ 11Choice and competition are almost always present in However, even more complex services can be broken Does my service involve an outcome that is The HMRC private sector product and service provision, but often down conceptually into single interactions to better likely to affect satisfaction Customer not in public sector provision, and this is one of the understand and define the service. Before designing Outcome can influence satisfaction. For example, a Service key features that distinguishes the two in relation to a customer research programme, consider whether it service that is universally available (e.g. Child Benefit) Survey divides measuring customer perceptions. 
is better to conduct a single survey for all customers is likely to be perceived differently from a service or separate surveys for different customer groups. customers into where there is judgement on which customers are If service experiences are likely to differ radically 13 different eligible (e.g. Tax Credits, Incapacity Benefit). Similarly, Are my customers involved in simple or for different customer groups, a more customised groups the outcome of a civil or criminal court case will complex interactions with my service approach may yield a greater depth of understanding based on influence the satisfaction of the parties involved in at the analysis stage . Whether a service involves one or two simple ‘oneoff’ the case and the outcome of a planning application their service transactions or a complex set of ongoing interactions will affect the perceptions of the applicant and other interactions. between the customer and service provider will interested parties in the application process. For Do customers pay for my service or is it This allows have a strong bearing on how customer satisfaction many services, then, the effect of the outcome on ‘free at the point of delivery’ the survey to measurement should be conducted. An example of the customer’s perceptions of the service needs to provide more a service involving relatively simple transactions is the Customers may have different expectations of a be taken into account when measuring customer meaningful passport application service provided by the Identity service which they pay for directly and one which satisfaction. and Passport Service (IPS). Most customers contact is paid for through general taxation (but may be and actionable How do customers interact with my service the IPS to apply for or renew passports and can be perceived as ‘free’). 
This affects the way in which the results as interviewed in a relatively straightforward way about findings need to be viewed; it also impacts on the The way in which customers interact with a service information is their experience of the application process. HM type of research techniques that can be used. For varies (e.g. face to face in a local office, over the gained about Revenue and Customs (HMRC), on the other hand, example, if a cost can be assigned to different levels telephone or by submitting forms online or in the each group provides a more complex service (a variety of taxes, of service, then there are research techniques that ask post) and customers may each use a variety of on the specic fi duties and benefits with interactions of varying levels customers to ‘trade off’ cost against various service channels. The channels that are used will impact on experiences of complexity) and the way in which questions are elements, helping an organisation understand more decisions about which data collection methods to use, asked of the customer about the service they have about what their customers value. as well as on the levels of satisfaction with the service. they have. experienced needs to take this into account. If a service is largely provided online, for example, 12online data collection is a viable and even desirable think of the customer as the recipient of a range of . option. Likewise, if the majority of interactions take different services that it provides, while the customer Many public place in person or by telephone, then online data may think of the services as distinct and unrelated. In collection may not be viable, especially if a significant some cases the customer may not even know who services are minority of customers do not have access to the provides the service. accessible via internet. 
a variety of The best way to explore how customers define channels and It is important to recognise that customer needs the service is through qualitative interviews with it is important vary by channel. For example, customers making a customers, using techniques such as Customer payment online may place an emphasis on security Journey Mapping and to tailor questionnaire content for an and instant confirmation of the transaction, whereas and language accordingly. However, it should be organisation customer satisfaction with the same payment recognised that there may be instances where it is not to understand transaction over the telephone may be affected realistic to expect customers to be able to differentiate which by being kept in a queue or being asked to press between organisations or understand some of the customers are additional number options, rather than being able to complexities of how services are delivered even after using which speak to a call centre agent straight away. extensive questionnaire development. channels and to design their customer Do customers define my service in the same satisfaction way that I do research When assessing what needs to be measured it is programme important to understand whether the customer accordingly.. defines the service in the same way as the service provider. For example, the customer’s definition of the service may cross organisational boundaries and the contribution of different agencies or departments will need to be taken into account; the organisation may 13Research survey and to focus on getting feedback from those Customer segmentation 3.2 Who are my customers commissioned by who are better placed to comment on the service and Segmentation involves grouping customers based the Ministry of so inform service transformation. Having defined the service, you now need to work on who they are and how they interact with an Justice involved out who your customers are. 
In the public sector this The needs of different customers will also have an organisation’s services. Once customer segments have speaking to can be a matter of some debate. The view we take impact on the research design. Particular attention been established within an organisation they can the relatives of here is that a customer is somebody who is a direct should be given to customers who are vulnerable or then be used to better target operational resources. murder victims recipient of a service, as distinct from a citizen, who hard to reach. Services provided for these customers In addition to this they can also provide a common as part of an will typically be a stakeholder (for instance a taxpayer) may be so different from those for mainstream service framework and language for referring to customers evaluation of the who may not have a direct connection with the users that it is advisable to look at their experiences within an organisation. A segmentation is an Victims Advocate service at that time. For example, even though an separately. By doing so an organisation can focus approximation – a tool that can allow an organisation scheme. Given individual may not have any children of their own they measures on service aspects that are critical for these to use internal ‘short hand’ when talking about their the complexity still have an interest in having an education system groups even if the majority of customers never access customers. of the subject that is efficient and that provides the skilled workers them (for example translated materials or adapted and the issues Customers can be segmented in a multitude of ways. for the future. Their views matter, but it is customers services for customers with disabilities). arising from it At its simplest level, a segmentation may be based on rather than citizens that are the focus of this toolkit. the decision was If the experiences of these customers are particularly service usage. 
For example, Acas conducts customer Definitions of ‘customers’ for the purpose of customer made to conduct complex it may be worthwhile conducting exploratory feedback surveys on key service areas of their delivery satisfaction measurement may range from all residents the evaluation qualitative research rather than attempting to such as individual and collective conciliation; advisory in a local authority area to people who have had using qualitative interview them in a larger scale quantitative survey. services; and training events. More sophisticated recent involvement with a specific service. face to face segments can be derived from administrative data For further discussion on how customers should be interviews. or previous research. Some segmentations are based Some organisations may have customers with whom defined, see the Cabinet Office publication: Customer on demographic or attitudinal characteristics, or a 7 they have virtually no contact. For example, child Insight in Public Services: A Primer. . combination of both. Exploratory qualitative research benefit recipients may have no need for contact with can also be used to tease out how different customers HMRC following their initial application, whilst Tax use a service. Credit recipients have regular contact with HMRC each year. When customers have very little contact If an organisation has already identified customer with a service it may be best to exclude them from the segments, it is generally helpful if customer 7 http://www.cabinetoffice.gov.uk/upload/assets/www.cabinetoffice.gov.uk/publications/deliverycouncil/word/custinsightprimer061128.doc 14satisfaction measurement is compatible with these Key questions to be considered include: definitions. For further reading on segmentation see Customer Is my organisation currently monitoring a 8 insight in public services: A Primer or the Local customer satisfaction Government Association How to guide to 9 segmentation for Local Authorities. 
Is my organisation currently reporting a b high level of customer satisfaction 3.3 What do we know Are there specific issues with our service at already c the moment that we currently know about Most organisations have a range of information that can help identify the strengths and weaknesses of Where is my organisation in the journey of d the current service being provided, even if it only .impr oving customer satisfaction provides a partial picture. Taking time to understand the information already available before undertaking customer satisfaction measurement should ensure that unnecessary research is not commissioned and that any research that is carried out is fully informed and relevant. More information on the ‘insight audit’ approach can be found in chapter 3 of the Guidance. Taking time to understand what it already knows can help an organisation with the design of any research and in understanding the results . 8 http://www.cabinetoffice.gov.uk/upload/assets/www.cabinetoffice.gov.uk/publications/deliverycouncil/word/custinsightprimer061128.doc 15 9 http://www.lga.gov.uk/Documents/Customerinsightguidetosegmentation.pdfauthorities many surveys have been carried out in Sources of information include addition to the Best Value Performance Indicator ● Administrative data (BVPI) surveys. With improved communication and sharing of results this data could help inform other Administrative data can be a rich source of authorities or service providers of key issues for sub information for organisations. This can include groups of their customers. For further information on ● call volumes data e.g. waiting times, ‘hang this subject see the report for the Local Government ups’ and answered calls, Association (LGA), National Consumer Council (NCC) ● website statistics e.g. 
number of people and Improvement and Development Agency for local visiting website, pages viewed and return visits, government (IDeA) Customer Insight: developing customer satisfaction measures for local government ● Applications data e.g. benefit claims over a services. period of time. http://www.lga.gov.uk/Briefing. ● Customer feedback asplsection=59id=SXCD6AA78492E4ccat=1145 Customer feedback (which might include complaints, For further discussion of some of the sources see the suggestions and compliments) can be used to 10 Primer in Customer Insight in Public Services. identify current areas for improvements as well as to inform areas to be included in customer satisfaction measurement. ● Mystery shopping data Many organisations conduct mystery shopping in order to monitor the services being provided to its customers. This can be a valuable source of information and can inform areas to be covered in customer satisfaction measurement. ● Existing survey data Taking time to find out what survey data already exists is a valuable process that is often overlooked. Different research may be carried out in different parts of the organisation. For example, within local 16 10 Customer Insight in Public Services A Primer, October 2006: http://www.cabinetoffice.gov.uk/upload/assets/www.cabinetoffice.gov.uk/ publications/deliverycouncil/word/custinsightprimer061128.docmisinterpretation of findings that result from oneoff 3.4 What else can I find events. For example, a particular problem in getting out used to a new piece of software could coincide with a temporary drop in overall satisfaction levels, but not A ‘due diligence’ approach to customer satisfaction constitute a long term problem . measurement requires some preliminary qualitative research. 
A small investment in exploratory qualitative research will help define the key areas that seem to relate to customers' satisfaction or otherwise, so that the quantitative research focuses on the right questions. This exploratory work might include:

● Qualitative research with customers to help define the service and to segment customers if applicable – e.g. focus groups, depth interviews, observation etc.
● Qualitative research with key stakeholders (heads of policy, strategy, insight and channel directors) to help set the policy context and key objectives – e.g. interviews, meetings and consultation
● Qualitative research with customer-facing staff – e.g. interviews, consultation and even work-shadowing to enhance understanding of how the service works in practice. Staff will be able to highlight the areas that they see causing problems for customers on a regular basis. In addition, understanding the situation on the ground can provide context to the results and prevent misinterpretation.

Qualitative research provides a deeper understanding of the customer experience but cannot be used to measure performance.

As part of the development work for the 2006 International Pension Service Customer Survey, the researchers spent a day at the contact centre interviewing staff about the issues they faced and gaining a better understanding of the customer experience. This visit proved vital in both the development of the questionnaire and the analysis of the results.

For organisations that are new to customer satisfaction measurement, 'getting started' can seem like a daunting task in itself. As a first step, it involves understanding what customer satisfaction measurement can do for your organisation; it also involves making sure that your key internal stakeholders understand this, as well as the research community. After this, there are a series of (pretty straightforward) issues to consider on the road to designing, or even commissioning, a research programme. This section provides a checklist of questions to answer before you start. All organisations, whether they are already carrying out customer satisfaction measurement or relatively new to it, should consider these points at each cycle of research, to ensure that they are spending taxpayers' money wisely. Broadly speaking there are four questions to address, and we will go through each of these in turn in more detail:
● What do we know already?
● How do I define my service?
● Who are my customers?
● What else can we find out?

What do we know already?
Once you know how to define your service and your customers, it is worth thinking about what else you already know, to help target your customer satisfaction measurement and avoid 'reinventing the wheel'. Most organisations have a range of information that can help identify the strengths and weaknesses of the current service being provided, even if it only provides a partial picture. Taking time to understand the information currently available before undertaking customer satisfaction measurement will at best avoid commissioning unnecessary research, and will ensure that any research carried out is fully informed and relevant.

Key questions to be considered include:
a. Is my organisation currently monitoring customer satisfaction?
b. Is my organisation currently achieving a high level of customer satisfaction?
c. Are there specific issues with our service at the moment that we currently know about?
d. Where is my organisation in the journey of improving customer satisfaction?

Sources of information include:

Administrative data
Administrative data can be a rich source of information for organisations. This can include:
● call volumes data, e.g. waiting times, 'hang-ups' and answered calls
● website statistics, e.g. number of people visiting the website, pages viewed and return visits
● applications data, e.g. benefit claims over a period of time

Customer feedback
Customer feedback (this can include complaints, suggestions and compliments) can be used to identify current areas for improvement, as well as to inform areas to be included in customer satisfaction measurement.

Mystery shopping data
Many organisations conduct mystery shopping in order to monitor the services being provided to their customers. This can be a valuable source of information and can inform areas to be covered in customer satisfaction measurement.
4 How do I measure satisfaction

Once you have completed the preliminary stages described in Section 3, you can start to think about how satisfaction should be measured for your organisation. There are three major decisions to be made:
● What should I ask?
● Who should I interview?
● How should I collect the information?

4.1 What should I ask

The steps that you will already have taken to define your service and work out what you already know should have begun to shape your questionnaire. The next step is to think in more detail about the questions you should ask.

The quality of the research findings will depend on the quality of the questions that are asked. You need to invest up front to spend wisely later on.

What types of question should be included?
There are four broad types of question that make up the majority of most customer satisfaction questionnaires:
● Overall rating measures
● Service-specific questions
● Customer priorities
● Customer characteristics

Overall rating measures
Overall rating measures are questions where customers are asked to rate various aspects of the service (e.g. telephone call handling, the application process etc) and their experience of the service as a whole. These questions generally use a rating scale of one type or another to summarise the customer's perceptions or feelings about a service or aspects of it. While there has been much debate within the research community about which scales work best in customer satisfaction measurement, there is no universally 'accepted wisdom' in this area. The report for the Office of Public Services Reform, Measuring and Understanding Customer Satisfaction (11), provides a discussion of different approaches and includes example questionnaires based on these approaches.

Overall rating questions in public sector research typically use satisfaction or performance scales, such as those shown below.

Q. How satisfied are you with…?
● Very satisfied
● Fairly satisfied
● Neither satisfied nor dissatisfied
● Fairly dissatisfied
● Very dissatisfied

Q. How would you rate…?
● Excellent
● Very good
● Fairly good
● Poor

In the private sector, where choice and competition are more common, a 'likelihood to recommend' scale is often used, as shown below, and has been found to discriminate between customers more effectively than satisfaction or performance scales.

Q. How likely would you be to recommend…?
● Definitely would
● Probably would
● Probably wouldn't
● Definitely wouldn't

This measure would be appropriate for public sector services where the customer has a degree of choice in whether or not to use the service, for example in relation to advice services, schools, educational courses, dentists, etc. Variations on this theme could be used for other services, for example "would you tell other people that this service was easy or difficult to use?"

Service-specific measures
Overall rating measures provide a snapshot of how customers perceive the service as a whole and specific components of it, but do not explain why customers feel the way they do. While these measures might 'grab the headlines', what we have called service-specific measures are needed to fill in the gaps, and ultimately will be more valuable in providing the insights that can lead to service transformation.

These questions focus on the details of the customer's experience, such as how many calls were needed before an enquiry was resolved; were they seen promptly; did they understand what to do; how easy were the forms to complete; and so on. The actual measures that an individual service should focus on will vary, but can be identified in the exploratory stage of the research process. These service-specific questions can then be used as diagnostic measures to identify which elements of the service are responsible for problems from the customer's point of view.

Ensuring that the key service elements are included is a critical part of questionnaire development. The early steps taken to define the service should ensure that you have correctly identified what these are.

Customer priorities
Customer priorities can be identified in a number of ways as part of a survey. These include asking customers to rate the importance of service elements, to rank them in order of importance, or to 'trade them off' against each other. However, while each of these approaches can be used in customer satisfaction measurement, they all have shortcomings.

Using an importance scale for public services is problematic because people have a tendency to rate almost all service elements as being important. This means that the survey results may not differentiate between 'hygiene factors' that are expected as a basic standard (providing accurate information, being polite, responding in a timely fashion etc) and factors that genuinely drive satisfaction.

Ranking the importance of service elements (e.g. putting them in order from most to least important) is problematic because people can find it difficult to make meaningful comparisons between more than four or five service elements. Furthermore, a simple ranking exercise assumes that the distance between each pair of ranked items is equal, whereas in fact one or two service elements might be almost equally important and others cluster around the bottom of the list. There are techniques that can be used to overcome this problem, but ranking exercises remain problematic for the reasons outlined below.

Where a cost can be assigned to different levels of service, and customers can realistically be expected to prioritise components within a 'package' of options, there are research techniques that ask customers to 'trade off' cost against various service elements, helping an organisation understand more about what their customers value. 'Trade off' research (which often uses a statistical technique called conjoint analysis) is widely used in the private sector to design the optimal service offering for different customer groups. Some local authorities also use 'trade off' techniques to help them understand customer priorities for local services.

The fundamental problem with all of these techniques is that they assume that people are actually able to assess which aspects of a service are of most importance to them, whereas the impact of different elements on a customer's overall satisfaction level may be more subconscious. For this reason it is increasingly common in customer satisfaction measurement for customer priorities to be assessed indirectly at the analysis stage, rather than via direct questioning. This approach, using a technique known as key driver analysis, not only frees up questionnaire space, but is widely thought to provide a more accurate indication of the aspects of service delivery that truly drive customer satisfaction. Key driver analysis is discussed in more detail in Section 5.3.

Customer characteristics
Recording customers' characteristics provides important context for understanding their service experience. The types of questions that should be asked will vary depending on the services provided by the organisation, but will usually include basic demographics such as sex and age. Using a consistent set of personal characteristics will enable you to bring together information from different surveys within your organisation and across organisations, for example to find out how different services are meeting the needs of a particular age group. The Customer Insight Protocol developed by the LGA, NCC and IDeA (12) recommends a common approach and identifies date of birth, sex, ethnic group and postcode as essential information to capture in every survey.

If an organisation already has a customer segmentation in use, it is important that questions are included which can enable customer segments to be identified.

Research carried out in the UK with public sector organisations suggests that there are five themes that are likely to be relevant to all organisations (13):
● Delivery of the service (how problems were handled, reliability, outcome etc)
● Timeliness (waiting times, number of times contacted)
● Information (accuracy, enough information, kept informed)
● Professionalism (competent staff, fair treatment)
● Staff attitude (friendly, polite, sympathetic)

Are there any questions or topics I can borrow from elsewhere?
It is vital that the questionnaire should be tailored to your service and your needs. However, there is no need to fully reinvent the wheel with every survey. Experienced researchers will be familiar with other surveys and will have access to existing questions. Talking to contacts carrying out surveys in other public sector organisations can help save you work. Do not assume, though, that because someone else has used a question, it works. You still need to test it out in your survey (see Section 4.4) and make sure it is relevant to your organisation.

In Canada the Common Measurement Tool (CMT) provides a database of questions for organisations to use when designing surveys. Various batteries of questions are available for information about different contact channels. See chapter 4 of the Guidance for more information on 'common measurement'. In the UK the Local Government Association, in association with the National Consumer Council, has produced a data protocol setting out common approaches for customer profiling and satisfaction measurement. (14)

These can be used as a starting point for questionnaire development where they are relevant, but it is important not to lose sight of the need to tailor the questionnaire to the particular nature of your own service, and to ask questions that are specific enough to produce actionable results.

If you want to benchmark the results with previous customer satisfaction results, or with results from other organisations, questions should be used that are common to the surveys you want to compare with. In particular, it is important that rating scales (e.g. satisfaction) are consistent in both the text and the number of options for respondents to select. While there are techniques that attempt to compare surveys using different scales, these are generally unsatisfactory. The benefits and drawbacks of benchmarking data over time and between organisations are covered in more depth in Section 5.4 and in the Guidance.

How long should a questionnaire be?
Not as long as a piece of string, because the time and energy required from the respondent must be taken into account. The optimal questionnaire length will depend largely on the method of data collection and the complexity of the service. A rough guide to the maximum questionnaire length that should be used for the four main data collection methods is provided below.
● Online – 5 to 10 minutes
● Postal – 8 to 12 pages
● Telephone – 15 to 20 minutes
● Face to face – 30 minutes

When surveys are longer than this, it can be hard to convince customers to complete them and levels of response can be low. Also, the quality of information that customers give declines dramatically if questionnaires are too long: customers may give little thought to their answers towards the end of the survey, or simply not complete it. If customers perceive the service to be particularly important, the effect of questionnaire length is reduced and longer surveys are possible.

11 http://www.number10.gov.uk/files/pdf/MeasuringCustomerSatisfaction.PDF
12 http://www.lga.gov.uk/Briefing.asplsection=59id=SXCAAAA78492C2ccat=1145
13 The Drivers of Satisfaction with Public Services, OPSR 2004
14 http://www.ncc.org.uk/nccpdf/poldocs/NCC177pdcustomerinsight.pdf
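Answers on a consistent five-point satisfaction scale are easy to turn into headline figures of the kind discussed in Section 5.1. A minimal sketch in Python, with invented responses purely for illustration:

```python
from collections import Counter

SCALE = [
    "Very satisfied",
    "Fairly satisfied",
    "Neither satisfied nor dissatisfied",
    "Fairly dissatisfied",
    "Very dissatisfied",
]

def headline_figures(responses):
    """Percentage choosing each scale point, plus the combined
    'very or fairly satisfied' headline figure."""
    counts = Counter(responses)
    n = len(responses)
    pct = {option: 100 * counts[option] / n for option in SCALE}
    headline = pct["Very satisfied"] + pct["Fairly satisfied"]
    return pct, headline

# Invented responses for illustration only.
answers = (
    ["Very satisfied"] * 45
    + ["Fairly satisfied"] * 35
    + ["Neither satisfied nor dissatisfied"] * 10
    + ["Fairly dissatisfied"] * 7
    + ["Very dissatisfied"] * 3
)
pct, headline = headline_figures(answers)
print(f"{headline:.0f}% of customers were very or fairly satisfied")
# prints: 80% of customers were very or fairly satisfied
```

Keeping the scale text and number of options identical between surveys is what makes figures like this directly comparable over time and across organisations.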
4.2 Who should be interviewed

You should already have defined your customers, but one of the first decisions organisations have to make is whether to try to interview every customer or to interview a sample of customers. In most customer satisfaction measurement the decision is made to interview a sample, as the time and cost involved in interviewing all customers is too great. The exception is where the customer base is very small, in which case a 'census' approach is more feasible.

How can I find customers to interview?
There are a number of possible sources of customers for a survey, including:
● your organisation's customer database
● screening the general population or a sample of businesses
● recruiting/interviewing customers as they access services

Most customer surveys take the first approach, as it is the most targeted way of reaching customers. An organisation will need to establish what customer details it has recorded. In particular, whether address, telephone number and email address are stored will determine which data collection strategies can be used. If customer details are held on a database, that file will need to have been registered for research purposes under the Data Protection Act.

If you do not have a list of customers available, then screening the general population or a sample of businesses may be cost effective, provided that your customers make up a reasonably high proportion of the population.

A third option to consider is to interview customers as they are accessing services. This could take the form of an automated questionnaire at the end of a phone call, a pop-up survey after visiting a website, or an exit interview after accessing a service in person. Alternatively, an organisation may choose to record contact details as customers access services in order to interview them later. This will allow for longer questionnaires and may lead to higher rates of response, although it will add time and cost. One of the advantages of interviewing customers as they access services, or shortly afterwards, is that it overcomes problems of recall – the experience of the service is fresh in the customer's mind, so the feedback they give should very accurately reflect their actual experience.

How should I sample my customers?
Customers can be sampled for a face to face or telephone survey using either a probability design or a quota design.

With a probability sample design a set number of customers are selected, using an appropriate random sampling method, and an attempt is made to interview all of them. This can involve calling back on the same individual several times until a final outcome for that individual (be it successful or not) can be recorded. The proportion of customers who are successfully interviewed is known as the response rate, and maximising this is important to ensure the survey is representative. This approach also tends to take longer, as multiple attempts have to be made to contact customers and persuade them to take part. A probability survey with a very low response rate might be less reliable than a well designed quota survey.

With a quota sample design a larger number of customers are selected initially (usually 5 to 10 times the number of interviews required) and it is not intended that an interview should be attempted with all of them. Instead, fieldwork continues until a target number of interviews has been achieved. Various quotas are set on who is interviewed (e.g. by sex, age, type of customer etc) to ensure that the survey is representative of customers as a whole – if no quotas were set, then those groups least likely to take part could be under-represented and the results could be misleading. With a quota-based design there is no reliance on response rates. It is possible to set quotas to represent the natural profile of customers, or to over-represent minority groups to allow their views to be recorded.

In a postal or internet survey, there is no way to control who responds to the survey. These surveys either depend on a high response rate, or a judgement needs to be made about groups that are less likely to respond, with larger numbers of these groups included in the sample. This approach then mimics a quota sample approach.

The decision about which approach to take will depend largely on issues relating to service type, resources and how the results will be used. In addition, if a customer satisfaction survey has been run before, and the results will need to be compared, it is a good idea to use a similar approach. An overview of what should be considered for each approach is included below.

A probability design may be preferred where:
● the service is important to customers and response rates are likely to be high
● survey findings need to be particularly robust for external scrutiny

A quota design may be preferred where:
● the service is less important to customers and response rates are likely to be low
● results are required quickly
● resources are limited

The 2005 Jobcentre Plus Customer Survey used a quota sampling approach, as it was felt that response rates would be poor, and ensuring that enough customers were interviewed from certain key groups was a high priority.

How many customers should I interview?
Unless the number of customers is small enough to conduct a census, a decision needs to be made on how many people to interview. The questions to address include:
● How robust does your data need to be?
● What method of data collection are you using?
● What is the budget?
● What subgroups of customer are you interested in?

It is impossible to give a number of interviews that will be suitable for a survey without first answering these questions. For the most robust results it is always best to interview as many customers as the budget allows for the data collection method that you are using. As a rule of thumb, you should not analyse results for a particular group of customers based on fewer than 100 interviews – and even at this level, any changes observed over time or between subgroups will need to be relatively large (10-15 percentage points) for you to be confident that they represent real differences. As you increase the number of interviews the level of reliability increases, although the rate of improvement tails off considerably once you reach 1,000 interviews.
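The rules of thumb above (no fewer than 100 interviews per subgroup; differences of 10-15 points; diminishing returns beyond 1,000 interviews) are consistent with the standard margin-of-error arithmetic for a proportion. A sketch, assuming simple random sampling and a 95% confidence level; quota or clustered designs will in practice have wider intervals:

```python
import math

def margin_of_error(p, n, z=1.96):
    """Approximate 95% margin of error, in percentage points, for a
    proportion p estimated from n interviews (simple random sampling)."""
    return 100 * z * math.sqrt(p * (1 - p) / n)

def detectable_difference(p, n, z=1.96):
    """Rough size a difference between two independent subgroups of n
    interviews each must reach before it is unlikely to be sampling noise."""
    return 100 * z * math.sqrt(2 * p * (1 - p) / n)

# Worst case (p = 0.5), subgroups of 100 interviews:
print(round(margin_of_error(0.5, 100), 1))        # 9.8 points either way
print(round(detectable_difference(0.5, 100), 1))  # 13.9 points - the 10-15 point rule
print(round(margin_of_error(0.5, 1000), 1))       # 3.1 points at 1,000 interviews
```

Because the margin shrinks with the square root of the sample size, quadrupling the interviews only halves the margin, which is why reliability gains tail off well before very large samples.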
4.3 How should the information be collected

There are four main data collection methods that can be used to conduct a customer satisfaction survey:
● Face to face (in the customer's home)
● Telephone
● Postal
● Online

The choice of data collection method will depend on a number of key factors that are summarised and discussed below.

[Table: face to face, telephone, internet and postal methods compared on level of participation, length of questionnaire, length of fieldwork and cost]

Level of participation and avoiding bias
Certain types of customer are more likely to take part in a survey than others. For example, customers who are very dissatisfied (or very satisfied) may be more likely to respond to a survey than those in the middle. When this happens the survey findings can be misleading and, as a result, specific actions taken in response to the survey could actually make the overall customer experience worse. Decisions about the data collection method need to be taken to reduce any such bias in the data, for example by increasing the level of participation or by setting interviewing quotas to make sure the research accurately represents customer views.

In general, data collection methods that involve an interviewer, such as face to face and telephone interviewing, tend to have higher levels of participation. This is because the interviewer is able to persuade the customer to take part there and then, whereas a postal or online questionnaire can be more easily ignored. There are, however, some cases where postal and online surveys can achieve high levels of response, such as when the service is seen as particularly salient or important.

Response rates for the NHS adult in-patient survey, which is coordinated by the Healthcare Commission and uses a postal methodology, average around 60%. In contrast, only 20% of customers responded to a postal survey carried out by the DVLA.

Length and complexity of the questionnaire
When the questionnaire is particularly long or complex, the presence of an interviewer can encourage respondents to persevere. Respondents can easily abandon postal and online questionnaires if they feel the questionnaire is too long, although the salience or importance of the service will again have a bearing on whether or not people are prepared to complete long interviews.

In addition to the length of the questionnaire, the type of questions that will be asked can also have an impact on which data collection method should be used. For example, for obvious reasons, it is difficult to use visual prompts in a telephone survey, while postal questionnaires have to be kept very simple in their structure if people are to be expected to fill them in correctly.

Length of fieldwork
Different methods of data collection will tend to take different lengths of time. Generally speaking, telephone fieldwork can be turned around in the shortest period of time, while postal surveys tend to take the longest because reminders and replacement questionnaires have to be mailed out to respondents. However, there may be instances where postal or internet surveys are actually quicker than a telephone survey. This is because the fieldwork period needed for 10,000 postal or web questionnaires is the same as that required for 100 questionnaires, while with telephone or face to face surveys an increase in the number of interviews may result in a proportionate increase in the length of the fieldwork period.

Cost
Whilst there is always an optimal way to collect the data, this needs to be balanced against the budget available for the research. The most expensive data collection methods are face to face and telephone interviewing, because of the need to pay interviewer fees. Of these two methods, face to face interviewing is significantly more expensive than telephone. Online and postal questionnaires are the least expensive data collection methods, with online generally being the cheapest. One of the main benefits of using online data collection is that the marginal costs of increasing the sample size are negligible.

Practical issues
One of the most important considerations when choosing a data collection method is what would be appropriate for the service's customers. This can involve assessing both the resources that customers can access and the difficulties that they may have in responding in certain modes. Obviously, an online survey will only be appropriate if a significant majority of customers have access to the internet and can complete an online survey relatively easily. Whilst most customers will have access to a telephone, certain groups (particularly younger people with low incomes) are less likely to have a landline, so unless mobile phone numbers are available these customers will be excluded. Choice of method becomes even more important if part of the research focuses on channel preference.

The first Pension Service Customer Survey in 2003 was conducted face to face, as one of the key aims of the research was to understand customers' channel preference in the wake of the switch from Social Security Offices to large scale call centres. As such, it was important that the research was as inclusive as possible.

'Hard to reach' customers
Some customers may also have difficulties that mean that certain data collection methods are inappropriate. For example, customers who have difficulty reading, as a result of literacy problems or visual impairment, may struggle to complete postal and online questionnaires. The extent to which these considerations impact on the choice of data collection method will depend partly on the scale of the difficulties and partly on the extent to which they could impact on customers' use of an organisation's services. In practice, there will almost always be some specific issues in using most services for customers with reading or hearing difficulties.

4.4 How do I know I have got it right

Before a full scale survey is conducted, a test version (or pilot) should be run to ensure that the questions used can be understood by customers and that the overall structure of the questionnaire works. This can involve sitting down with individual customers and asking them to talk through the way they understand each question and reach their answers (cognitive piloting), or a full scale fieldwork pilot which tests out all of the research methods on a small scale (e.g. 10-20 interviews) with a researcher listening in to the interviews.
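The cost ordering described above can be made concrete with a toy model in which each method has a fixed set-up cost plus a marginal cost per completed interview. Every figure below is invented purely for illustration; real fieldwork prices vary widely by supplier, questionnaire length and sample design:

```python
# Hypothetical cost model: all figures invented for illustration only.
COSTS = {  # method: (fixed set-up cost in GBP, marginal cost per interview in GBP)
    "Face to face": (5000, 45.0),
    "Telephone":    (4000, 18.0),
    "Postal":       (3000, 6.0),
    "Online":       (3000, 0.5),  # near-negligible marginal cost
}

def total_cost(method, interviews):
    fixed, marginal = COSTS[method]
    return fixed + marginal * interviews

# Doubling the sample barely moves the online total, while interviewer-led
# methods scale almost linearly with the number of interviews.
for n in (1000, 2000):
    cheapest_first = sorted(COSTS, key=lambda m: total_cost(m, n))
    print(n, cheapest_first)
```

The nearly flat online curve is the "negligible marginal cost" point made above: once the survey is scripted, extra completed interviews cost almost nothing, whereas every extra face to face interview incurs interviewer fees.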
In their answers (cognitive piloting) or a full scale practice, there will almost always be some specific fieldwork pilot which tests out all of the research issues in using most services for customers with methods on a small scale (e.g. 1020 interviews) reading or hearing difficulties. with a researcher listening in to the interviews. The first Pension Service Customer Survey in 2003 was conducted face to face as one of the key aims of the research was to understand customers’ channel preference in the wake of the switch from Social Security Offices to large scale call centres. As such it was important that the research was as inclusive as possible. 27How can I get insight from the results 5 First Time Start here... Find out How can I get Define what you Explore insight from the know results Insight audit: Qualitatively: What/who Admin data Internal clients Service Complaints Customers Customers Previous surveys Staff Internal clients Other research Measure Take action customer experience Uncover your Communicate insights and plan Engage and Analyse and build: discuss with: Segmentation Internal clients Key drivers Staff (all levels) Further qual work Customers Build other info in 28How can I get insight from the results 5 Collecting the data in the right way and those of another enables organisations to start generate insights about different groups of customers. formulating a targeted plan of action to improve their asking the right questions are critical If you are thinking of carrying out cluster analysis of services. At a simple level, this analysis might be based steps along the way to successful your customer survey data, points to note are that: on a breakdown of the results by information about customer satisfaction measurement. 
● The survey needs to have interviewed at least customers such as their age, sex, service or channel But the research will only be valuable several hundred people – ideally no less than 600, and usage, etc, which has either been collected in the if it delivers insight that can be used as preferably 1000 or more. survey or is available on the customer database used a basis for service transformation. This to select the survey sample. ● The sample needs to be representative of the larger section of the toolkit outlines how to customer population. Some organisations use predefined customer use and build on the data you gathered segments to identify differences between customer ● The survey needs to contain plenty of demographic to ensure that it delivers this insight. In order to better groups, which can inform how service improvements and attitudinal information. should be tailored to meet the diverse needs of understand 5.1 Where do I start these groups. These segmentation models might be headline findings based on sociodemographic characteristics or more A good starting point is to look at the headline and what they sophisticated classification systems, such as Mosaic or findings of the research. At their most basic level actually imply Acorn, which go beyond basic sociodemographics headline findings show how customers answered organisations to classify people by their lifestyles, culture and each question. For example, “80 of customers were consumer behaviour, based on where they live. While very or fairly satisfied overall with the service they had can compare these techniques are more widely used in the private received”, “50 of customers had their call answered results with sector, they are gaining credence in local and central within 30 seconds”. targets that they government. may have set or The term ‘segmentation’ is also used to describe with results from 5.2 Who thinks what the statistical technique called ‘cluster analysis’. previous surveys. 
This approach is commonly used to inform Knowing that the views, experiences and satisfaction communications strategies, but can also be a useful levels of one subgroup of customers differ from tool in customer satisfaction measurement to 29Example output from modelling How to interpret a dissatisfaction “Bubble” Chart Dissatisfied with driver 45 Red = Large contributor 40 30 of the sample are Orange = Medium contributor 35 dissatisfied with this driver Green = Small contributor 30 25 The size of the bubble 20 captures the driver’s overall contribution to 15 Customers dissatisfied dissatisfaction with this driver are three 10 (i.e. 3 X 30) times as likely to be dissatisfied overall 5 0 0 1 2 3 4 5 6 Individual impact (e.g. 2 = twice as likely to be dissatisfied) © HCHLV 2006 3 Example output from modelling Key driver analysis produces a list of factors that 5.3 What is driving Drivers of Dissatisfaction with complaint handling influence satisfaction and an idea of how important dissatisfied with driver satisfaction and how each factor is. Results from key driver analysis can 80 be presented in very userfriendly formats to help 70 As discussed in Section 1, just measuring satisfaction communicate findings and drive action across the Event not completely 60 resolved is not sufficient to inform service transformation – it Number of times whole organisation. The charts below show how key 50 Were not told what company would contacted help desk was do to resolve issue unacceptable tells an organisation how it is doing, but not why it 40 driver analysis can be displayed visually. Not kept well informed Difficulty of registering 30 is performing as it is. 
In addition, it is important to Agent did not know how to deal with it The ‘bubble’ charts shown here are an example of 20 Company did not do what they said understand the influence of different factors on the Email did not explain how issue they would to resolve issue 10 would be resolved a really useful format for communicating research customer’s experience and how they interact with 0 findings to diverse audiences within an organisation. 0 1 2 3 4 5 each other. In order to produce actionable insights, it Individual impact (eg 2=Twice as likely to be dissatisfied) In this case, the charts illustrate the drivers of is also critical to explore these factors in more depth, © HCHLV 2006 dissatisfaction with a call centre. The size of the and to understand how they could be changed in bubble captures the driver’s overall contribution to order to improve customer service. dissatisfaction. This type of visual display is very useful Key driver analysis for communicating progress over time. In this case, Example output from modelling the organisation chose to actively focus on ‘shrinking’ How to interpret a dissatisfaction “Bubble” Chart Depending on the level of detail gained from a survey, a small number of the largest bubbles over a three the list of factors that are likely to contribute to Dissatisfied with driver month period, before moving on to others. The satisfaction can be quite long. However, it is possible 45 Red = Large contributor analysis was done on a monthly basis, and the charts 40 to identify which factors have the biggest impact and 30 of the sample are Orange = Medium contributor 35 dissatisfied with this driver shared with management and front line staff, so they use this information to target resources effectively. Green = Small contributor 30 were all able to understand the progress that was In quantitative customer satisfaction measurement 25 The size of the bubble being made. 
The key driver analysis described here is based on large-scale surveys analysed using statistical techniques. But it is often possible to gain an understanding of what the key drivers of satisfaction might be by using qualitative techniques and by speaking to front line staff and customers. Bear in mind that, while qualitative techniques will identify key areas, they will not provide measures that allow those areas to be assessed against each other.

Carrying out qualitative research after the survey can be an extremely valuable way to explore the key drivers of satisfaction further and to inform service improvement plans. For example, you may find that a major driver of dissatisfaction is the difficulty of filling in a form, but there is unlikely to be time in the survey to explore exactly what it is about the form that is difficult. Without this information, you cannot plan improvements. Carrying out follow-up qualitative interviews with customers who reported problems in the survey (or, for that matter, with those who were highly satisfied with the service) is an effective approach, because you can target the follow-up interviews according to the particular areas of satisfaction or dissatisfaction that you want to explore. This approach is used by HMRC, who follow up specific issues raised in their Customer Service Survey to provide more depth and inform change.

Building in other sources of insight

Analysing headline data and the experiences of different customers can provide useful information, and key driver analysis can identify the priorities for improvement. However, on its own this information will not necessarily be enough to inform service transformation. It is important to build in other sources of insight and not to treat the survey findings in isolation. Information that is already available, such as administrative data or information from customer-facing staff (see Section 3.4), can be used to provide useful context when interpreting the results.

5.4 What can I compare my results with

The main benefit of customer satisfaction measurement is to uncover issues that can improve customer service, rather than to produce indicators of performance. One question that often gets asked, however, is "x% of customers say they are satisfied with the service, but is this good or bad?" Benchmarking against other sources can help to answer this question. There are two possible ways to do this:

● Comparing over time with previous surveys about the same service
● Comparing with other surveys about other, similar services.

Benchmarking internally over time

'Benchmarking' over time can be useful to see how a service, or one aspect of a service, has changed. The research method and key questions should remain the same, to enable you to see whether changes that have been implemented have resulted in improvements in customer perceptions.
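When benchmarking internally over time, an apparent movement in the headline score can simply be sampling noise. One common check is a two-proportion z-test, sketched below as a hedged illustration: it assumes simple random samples of independent respondents (surveys using quotas, clustering or weighting need design effects, which this ignores), and all the figures are invented.

```python
# Hedged sketch: assumes simple random samples; invented figures.
import math

def two_wave_z_test(sat1, n1, sat2, n2):
    """Two-proportion z-test: is the change in the satisfied share
    between wave 1 (sat1 of n1) and wave 2 (sat2 of n2) larger than
    sampling noise? Returns (z, two-sided p-value)."""
    p1, p2 = sat1 / n1, sat2 / n2
    pooled = (sat1 + sat2) / (n1 + n2)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - normal CDF)
    return z, p_value

# Invented figures: 720 of 1,000 satisfied in wave 1; 760 of 1,000
# in wave 2. Is the four-point rise more than noise?
z, p = two_wave_z_test(720, 1000, 760, 1000)
print(round(z, 2), round(p, 3))  # z about 2.04, p about 0.04
```

At conventional thresholds (p below 0.05) the rise would be treated as a real change rather than noise; with samples of only a few hundred per wave the same four-point movement would not be.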
However, since the service will be continually under review and changes will be made, decisions sometimes need to be made to let go of old questions and old data, and to move on to measuring new, more relevant issues that reflect the current service.

The Identity and Passport Service are currently reviewing their customer survey in order to reflect changes to the passport application process, such as interviews for first-time applicants. In addition, they are ensuring that improvements to segmentation models can be captured in the customer survey.

This kind of tracking requires regular surveys, but it is important to find the right balance between collecting data so frequently that there is not enough time to action any change, and so infrequently that there are long periods when customer focus can be lost. The actual frequency will depend on the service in question and the length of time that it takes to implement change.

Benchmarking with other services

Benchmarking across services is only of value if the services are comparable. Different services can rarely be compared easily, because the nature of the service and the type of customers that use it will have a strong bearing on customer perceptions. In essence, there is always a risk of 'comparing apples with pears'.

There are cases where comparison is possible, particularly with local services. For example, all police forces provide similar services, and comparisons can usefully be made between forces. However, local services respond to local circumstances, and local demographics vary considerably. While there are analysis techniques available that can help control for these factors, the most useful comparisons can be made between areas which are demographically similar.

In order to compare performance usefully, the Police Performance Assessment Framework (PPAF) bands police forces together in "Most Similar" groups. These groupings are based on socio-demographic factors that have a strong link to crime, rather than on actual crime levels, which will to some degree be a product of police force performance.

A new 'Place Survey' is being introduced from 2008, which will replace the Best Value Performance Indicator Survey. It is likely to be undertaken more frequently than the previous survey, and will provide useful data about different local authority areas that will be comparable over time.

In short, transactional services are more likely to be usefully compared than highly complex services. See chapter 4 of the Guidance for more information.

6 How do I communicate and action the results, and then what

[Figure: process overview. Start here: find out what you know (insight audit: admin data, complaints, previous surveys, other research; qualitatively: internal clients, customers, staff) → define what/who (service, customers, internal clients) → measure customer experience → uncover your insights and plan (analyse and build: segmentation, key drivers, further qualitative work, building other information in) → communicate (engage and discuss with internal clients, staff at all levels, customers) → take action.]

Once the research has been conducted and the key themes identified, the next step is to communicate the findings in a concise and actionable manner.
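The "Most Similar" grouping idea can be sketched in a few lines: rather than benchmarking an area against the national average, compare it with its demographically nearest neighbours. The sketch below is a simplified, hypothetical illustration, not PPAF's actual method; the area names, profile variables and satisfaction scores are all invented.

```python
# Hedged sketch of 'most similar group' benchmarking; invented data.
import math

def most_similar_benchmark(target, areas, k=2):
    """Benchmark an area's satisfaction score against the average of
    its k nearest neighbours by demographic profile (Euclidean
    distance). Returns (peer names, gap versus the peer average)."""
    others = [a for a in areas if a["name"] != target["name"]]
    others.sort(key=lambda a: math.dist(a["profile"], target["profile"]))
    peers = others[:k]
    peer_avg = sum(a["satisfaction"] for a in peers) / k
    return [a["name"] for a in peers], target["satisfaction"] - peer_avg

# Invented data: profile = (share aged under 25, share unemployed).
areas = [
    {"name": "Area A", "profile": (0.30, 0.12), "satisfaction": 68},
    {"name": "Area B", "profile": (0.29, 0.11), "satisfaction": 72},
    {"name": "Area C", "profile": (0.31, 0.13), "satisfaction": 70},
    {"name": "Area D", "profile": (0.18, 0.04), "satisfaction": 81},
]
peers, gap = most_similar_benchmark(areas[0], areas)
print(sorted(peers), gap)  # ['Area B', 'Area C'] -3.0
```

Area A trails its demographic peers by three points; Area D, though the highest scorer, is excluded from the comparison because its profile is so different, which is exactly the 'apples with pears' risk the grouping is designed to avoid.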
6.1 Who should I communicate the findings to

Generally speaking, the findings should be communicated to as wide an audience as possible. This will certainly include the internal stakeholders identified in section 1.2, but will sometimes include customers and other external stakeholders as well. Ensuring there are no barriers to accessing research information is critical: the findings only have meaning and value if different stakeholders across the organisation are able to engage with and use them. Users need to be able to drill down to their own area of responsibility.

6.2 How do I communicate the findings

Wherever possible, the presentation of findings should be tailored to reflect the different needs and interests of different stakeholder groups. For senior audiences the focus should be on the key findings, whilst for customer-facing staff more detailed feedback around their areas of responsibility is likely to be appropriate. Senior audiences are likely to want to know how their organisation is performing in its customers' eyes, and what can be done to improve things; they will need the information necessary to set targets for transformation. Customer-facing staff will need feedback on the direct actions that they need to take to improve the customers' experience.

Every effort should be made to ensure that senior management feel they are hearing the customer's voice through the research findings. This can be done in a literal sense by playing back audio or video clips of customers that may have been recorded during the qualitative research stage. (NB Consent would need to have been gained from customers in order to do this.) To illustrate the survey findings, verbatim quotes from open questions can be used as an antidote to a succession of charts and graphs. When presented well, the findings of a customer research programme should feel like a "back to the floor" exercise for senior management.

For an organisation undertaking customer satisfaction research for the first time, a workshop bringing together diverse stakeholders from across the organisation can be invaluable at this stage. It allows the results of the research to be shared and the action plan to be jointly agreed, in terms of priorities for change in the short and medium term and the allocation of responsibilities for pushing through the necessary actions. Information-sharing tools, such as intranets, are also helpful in allowing survey findings to be communicated.

6.3 How do I action the results

Having employed the best possible research tools, and used a considered approach to analysing and interpreting the results, you should now have an idea of what the priorities are for changes to the service. Communication of the results up and down the organisation should have helped to develop these ideas into an action plan with which stakeholders are fully engaged. Research can also identify service areas that are less important for customers and which could be scaled back to save resources. This process should have put the foundations in place for change, so all that remains is to put your plans into action.

6.4 And now what happens

Once you have acted on the findings of the research, this completes the cycle of improving the customer experience. At the outset you reviewed fully what your service was, who your customers were, and what you already knew. You sought further information to help inform you about issues you needed to explore and to provide context. You then designed a customer satisfaction survey suited to your needs that provided you with robust data, and interpreted this data in a way that produced in-depth, actionable findings. Communicating these findings to the organisation helped to produce a workable plan of action that was then implemented. The next step is to go right back to the start, review where you are now, and start a new cycle of research to assess the success of your interventions. Welcome to public service transformation.