Enter the Zone of Tolerance
SERVQUAL gathers three data sets: minimum, desired, and perceived service quality. The first two combine to create a “zone of tolerance”: a band whose upper boundary is desired quality and whose lower boundary is minimum quality. Plotting the third data set, perceived quality, against the zone of tolerance lets IT suppliers and consumers visualize service quality at a glance. It also shows where to improve, and can serve as a continuous improvement model.
Consider Figure 2 below. It shows how SERVQUAL can report IT performance against the job IT needs to do, as measured by customer requirements.
Figure 2. Example SERVQUAL SLM Report
Figure 2 makes IT performance very easy to understand. Even non-technical business managers will grasp it immediately; there is no need to translate speeds, feeds, MIPS, and so on into human terms. The value of a measure by itself means nothing; what matters is the relationship of the measure to the boundaries.
SERVQUAL offers two metrics: the measure of service adequacy (MSA) and the measure of service superiority (MSS). MSA is perceived quality less minimum (adequate) quality; MSS is perceived quality less desired quality. These metrics are internal to the provider. MSS > 0 means the provider is over-servicing relative to requirements; MSA < 0 means the provider is under-servicing, as shown in Figure 3.
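As a concrete illustration, the two metrics can be computed directly from the three SERVQUAL data sets. This is a minimal sketch; the survey scale and the example scores are hypothetical, not from the figures in this article.

```python
# Sketch of the two SERVQUAL metrics. Example scores are hypothetical
# values on an assumed 1-9 survey scale.

def msa(perceived: float, minimum: float) -> float:
    """Measure of Service Adequacy: perceived quality less minimum quality."""
    return perceived - minimum

def mss(perceived: float, desired: float) -> float:
    """Measure of Service Superiority: perceived quality less desired quality."""
    return perceived - desired

# Example: a service dimension with minimum 5, desired 8, perceived 6.
print(msa(6, 5))  # 1  -> at or above the lower boundary: adequate
print(mss(6, 8))  # -2 -> below the upper boundary: not over-servicing
```

With MSA ≥ 0 and MSS ≤ 0, this example service sits inside the zone of tolerance.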
Figure 3. MSA and MSS Scores
This model drives IT operations from customer and business requirements. MSS and MSA values help the IT service provider prioritize work, balance resources, and select improvement targets: select for improvement those services not operating within the zone; initiate corrective action for services moving down and out of the zone; market services that move up in the zone; and stop investing when a service is balanced within the zone.
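The prioritization rules above can be sketched as a simple classifier. This is a hypothetical illustration; the function name and action labels are assumptions, not part of SERVQUAL itself, and it classifies a static position rather than movement over time.

```python
# Hypothetical sketch mapping MSA/MSS scores to the actions described above.

def recommend_action(msa: float, mss: float) -> str:
    """Classify a service's position relative to the zone of tolerance."""
    if msa < 0:
        # Perceived quality is below the minimum boundary: under-servicing.
        return "corrective action"
    if mss > 0:
        # Perceived quality is above the desired boundary: over-servicing.
        return "market the service"
    # Perceived quality sits inside the zone of tolerance.
    return "maintain (stop further investment)"

print(recommend_action(msa=-1.0, mss=-3.0))  # corrective action
print(recommend_action(msa=2.0, mss=0.5))    # market the service
print(recommend_action(msa=1.0, mss=-1.0))   # maintain (stop further investment)
```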
This model also provides a roadmap for improvement and remediation. The four SERVQUAL gaps identify where poor service quality (Gap 5) originates. They also show where you do not need to focus, an important and often overlooked benefit.
Using Figure 3 as an example, we probably need to focus on Assurance, defined as “knowledge and courtesy of employees and their ability to inspire trust and confidence,” and Reliability, the “ability to perform the promised service dependably and accurately.” Managers can quickly see they ought to reallocate and balance resources, perhaps moving funding or resources from “Tangibles” to “Assurance.”
Based on MSS and MSA values, examine Gaps 1 through 4. Closing some gaps requires training, others software support tools, and still others changes to process. The resolution depends on the gap, and the tools required depend on the resolution. The examples that follow are viewed from the perspective of the service supplier. Gaps 1 through 4 reflect where in the supplier organization poor quality arises; Gap 5 is the “Service Quality Gap,” measured as Q = P - E (perceived quality less expected quality, consistent with the MSA and MSS definitions above).
SERVQUAL not only provides a roadmap for measuring and improving service quality; it also offers a model for presenting service quality metrics and a means of quantifying and prioritizing supplier operations and improvement projects.
Figure 4. SERVQUAL Mapping to ITIL v3
Gap #1 – The “Market Information Gap”
Gap 1 occurs when there is a discontinuity between customer expectations and management’s understanding of those expectations. Causes include insufficient research into, or understanding of, customer needs; inadequate use of that research; lack of interaction between management and customers; and insufficient communication between staff and managers.
Resolutions include conducting research, having senior IT managers interact with customers, having senior managers occasionally perform customer-contact roles, encouraging upward communication from customer-contact employees, and so on. ITIL, Six Sigma, and other frameworks can help here, along with good old-fashioned attention to customers. ITIL v3 Service Strategy directly addresses this gap.
Gap #2 – The “Service Standards Gap”
Gap 2 arises within the provider organization when there is a mismatch between management’s perceptions of customers’ expectations and the service quality specifications used by provider staff. Causes of this gap include inadequate management commitment to service quality, absence of a formal process for setting service quality goals, inadequate standardization of tasks, and a perception that customer expectations cannot feasibly be met.
Gap 2 resolutions include using frameworks such as CMMI and ITIL to define processes, clarify roles, and document and measure service delivery goals and performance. In ITIL v3 terms, Service Strategy, the Service Level Package (SLP) it passes to Service Design, and the Service Design Package (SDP) that Service Design passes to Service Transition directly address this gap.
Gap #3 – The “Service Performance Gap”
This gap appears between service quality specifications and actual service delivery. Key factors include lack of teamwork, poor employee/job fit, poor technology/job fit, lack of perceived control by contact personnel, inappropriate evaluation and compensation systems, and role conflict and ambiguity among contact employees.
Ways to address Gap 3 include investing in employee training, supporting employees with appropriate technology and information systems, giving customer-contact employees sufficient flexibility to respond, reducing role conflict and ambiguity, and recognizing and rewarding employees who deliver superior service. ITIL v3 Service Transition, and the early life support it provides to Service Operation, directly address this gap.