Journal of Services Marketing

To cite this document: John B. Jensen, Robert E. Markland, (1996), "Improving the application of quality conformance tools in service firms", Journal of Services Marketing, Vol. 10 No. 1, pp. 35-55. Permanent link: http://dx.doi.org/10.1108/08876049610147838


Improving the application of quality conformance tools in service firms

John B. Jensen and Robert E. Markland

Executive summary and implications for the practitioner

(Supplied by Marketing Consultants for MCB University Press)

By asking whether it is better to eliminate the causes of errors or to improve the overall efficiency of a system, Jensen and Markland cast some light on the question of managing for service quality. As they note, quality is not an accidental event – good service is the result of a systematic approach to delivery and an environment or culture conducive to high standards. However, the reasons for service breakdowns are, as we all know, many and varied. Most services are complex and involve a range of tangible and intangible elements. As a result, improvements to service quality involve more than changing a few screws or making the packaging better. Most importantly, the role of those who deliver all or part of the service must be assessed, since good service is, in the end, determined by the performance of the individuals delivering that service, and any changes to systems impact on their daily work. Failing to acknowledge this fact may result in wonderful systems on paper but a regimented and uncooperative workforce. Elsewhere in this issue of JSM we see how different services have different emphases in the elements making up quality. The components of quality in a fast food restaurant are very different from those on a railway or in a bank. Here we try to assess how the tools available to measure service quality might adapt to identify those areas where action would improve the quality delivered. In simpler terms, do we stop making mistakes or does the system need changing? It may be that both the system itself and its administration present problems, and it seems sensible to observe that a badly structured or designed system will, through its very nature, contribute to errors by those delivering a service.
However, businesses should seek to make those changes that have the greatest positive effect on performance by identifying the problem areas and then setting priorities for action. Furthermore, companies must consider the question of costs in time and resources – some changes might have enormous benefits for service quality, but the time and cost involved mean that realizing the gains is some years away, and the company has to operate in the meantime. An approach that sets improvements going in a sustainable way (e.g. staff training and development) may be a better long-term bet than scrapping what you've got in place and starting again. By using SERVQUAL as the basis for their model, Jensen and Markland have the advantage of using perhaps the best known and most widely used service quality measurement tool. Despite criticisms of SERVQUAL – especially regarding the dependency of some of the variables involved and the questions used – it still provides a valuable tool for managers. The method they describe is, however, fairly involved and assumes a degree of statistical knowledge not usually held by service managers. However, since this is an initial exercise, we can hope for a more accessible model to emerge in due course, and the model presented does show how service quality measurement approaches are developing. While service systems are important, much of what constitutes good service is already known – subjective analysis is always a good start in improving any service operation. However, as our authors remind us, service quality is determined by the customer, not the business, and this makes it hard for a manager to assess service failings solely through his own subjective observation. The oft-heard comment that managers are too close to the problem really to see it holds much truth, and managers should never assume they understand better than their staff or their customers. If a customer thinks your service is bad, it is bad. Too often we fall back on the belief that half our customers are moronic idiots who expect to be treated like royalty and charged nothing. What more scientific measures of customer satisfaction show consistently is that people expect occasional mistakes, are tolerant of delay when they understand it, and are willing to pay for superior service. Businesses should take time and, if necessary, spend money to learn about quality measurement systems such as SERVQUAL, and should work on how to apply them in their organizations. Otherwise too many managers will address the wrong problem or only hit on the right problem by luck. The models may be daunting and often couched in incomprehensible terminology, but their use could assist many businesses in improving quality, retaining more customers and delivering improved profits.
This more scientific approach doesn't negate traditional means of assessing quality, such as mystery shoppers and counting complaints, but using measures like SERVQUAL supports qualitative analyses with quantitative information showing the key areas of service quality as identified by your customers. Finally, Jensen and Markland show how service quality measures can be used over time. Taking a snapshot of quality standards from time to time is useful, but a continuous measure of performance is far more valuable. From this, such issues as the effects of seasonality (are your staff grumpier in the winter?) and employee changes can be measured, and the effects of any system changes analysed as they occur. For many service businesses quality is often the only differentiator, and firms that neglect proper analysis run the risk of losing customers and even going out of business.

Simon Cooke


Simon Cooke, The Cooke Consultancy, UK © 1996

Introduction

A common response to the question of what distinguishes one service provider from another often revolves around the notion of quality. Outstanding service providers are dedicated to the satisfaction of their customers. Unlike manufacturing firms, which can appraise the quality of their product objectively by the degree to which output meets a technical specification, the service firm delivers excellent service quality only if the customer perceives and values it. This dedication to customer service leads service providers to measure customer satisfaction and use customer responses to guide service operations.

Ongoing conformance checks

The delivery of a customer-perceived high-quality service is not a chance event. Only through deliberate service delivery design and ongoing conformance checks can a service firm hope to be successful (King, 1987). Conformance checks evaluate standard operating measures to aid the manager trying to determine if the service system is performing as expected. Performance can then be improved by concentrating efforts on those areas found to be underperforming. A key component of service quality control is that customer feedback is required (King, 1987). There are two ways to improve any system or process (Deming, 1985, 1986). One may act on the system to improve overall functionality (or quality), or one may identify and eliminate special causes of system variation. In the domain of service providers, if one views service quality along a scale (e.g. 1 = poor service and 10 = superior service), one may increase the mean service quality score by improving the service delivery system and thus improve the level of service for all customers. Alternatively, one may identify and eliminate the special causes of unsatisfactory service and thereby improve the mean level of service by preventing the experiences that would have been rated very low. Given this context, an important managerial consideration is where to concentrate quality improvement efforts. This article intends to help a manager answer this question by providing a tool to monitor day-to-day service quality and thereby identify abnormal service experiences. The resulting investigation into abnormally pleased or displeased customers may highlight service delivery aspects that need to be addressed.

Powerful tools for service improvement

For managers interested in improving their service system through reengineering, modern manufacturing quality management techniques (such as value chains, flowcharting, diagnostic modeling, and fishbone diagrams) have been applied successfully to service-quality management (Ballantyne, 1990). These tools, properly used, are both powerful quality diagnostic instruments and helpful in service-system analysis. While they aid in the focus and development of system-improving programs, they are not conformance check mechanisms. This article develops and then appraises the usefulness of a tool called a "quality perception control chart". This control chart is suggested as a mechanism that effectively separates assignable from common causes of variation detected in perceptual data. Thus, this article provides a tool for the manager who must decide how to control aberrant service system behavior that creates extraordinarily low performance for some customers. This article is also written to help the manager who would like to examine the effectiveness of service reengineering efforts on customer satisfaction.

Service quality measurement

Manufacturing quality literature has been philosophically grounded in the manufacturing-based quality model. In this model, quality is defined as a lack of deviation from a defined specification. Thus, for most manufacturing firms performance standards are relatively easy to specify. Often, manufacturing quality improvement begins with the collection and analysis of readily available quality performance data.

Definitions of service quality

For service industries, a statement of quality philosophy is more difficult to delineate, since service firms generally provide utility, not objects. Several definitions of service quality have been suggested in the literature. Some authors have advanced the notion that service quality is meeting the needs and requirements of customers (Murdick et al., 1990; Smith, 1987). Building on a user-based definition, others have stated that service quality is how well the service delivered matches the customer's expectations (Creedon, 1988; Lewis, 1989). More drastic notions, such as "providing better service than the customer expects", have been advocated (Lewis, 1989). Some authors, unable or unwilling to define service quality, have provided models that demonstrate how perceptions of service quality are formed. Researchers have developed multistage models of the perception of service quality, suggesting that disconfirmation, expectations and actual performance affect customer satisfaction which, in turn, becomes an input to customers' perceptions of service quality (Bolton and Drew, 1991).

The evolution of SERVQUAL

Perhaps the most comprehensive and most referenced investigation into service quality was conducted by Parasuraman et al. (1985). They explained that the continuum of perceived service quality is formed by the multiplicative effects of customers' pre-purchase expectations, perceived process quality, and perceived output quality. To judge its ability to provide a quality service, the service firm must first understand how its service system impacts on customer expectations and satisfaction. Defining service quality as the gap between expectations of service and the perception of the service experience, they provided a list of the most important aspects of a quality service as seen by the service customer. A service provider scoring high marks in each of these categories was viewed as high quality.

Five service quality dimensions identified


While it may be relatively simple to identify service aspects that are valuable to the customer, it is not so easy to quantify a firm's ability to provide each customer with the required amount of courtesy, security and credibility. To date, the most significant research into assessing customers' perceptions of service quality was also conducted by Parasuraman et al. (1988), who developed the SERVQUAL multiple-item survey instrument. To measure a customer's appraisal of the excellence of an individual service experience, they developed a reliable 22-item survey instrument based on their previously defined categorization of service quality. Further research into the identification of latent service-quality constructs (using factor analysis) led to the identification of five service-quality dimensions. Figure 1 demonstrates how the original 22 SERVQUAL survey elements were incorporated into five quality dimensions. These dimensions, hypothesized to have emerged from the 22-item SERVQUAL instrument, were thought to be generalizable to a wide range of service industries (Parasuraman et al., 1988). More recently, the SERVQUAL scale has been reassessed and refined (Parasuraman et al., 1991).

[Figure 1. The original SERVQUAL components. Path diagram: the 22 SERVQUAL survey items (item variances ∂1–∂22) load onto five components – Tangibles (items 1–4), Reliability (items 5–8), Responsiveness (items 9–13), Assurance (items 14–17) and Empathy (items 18–22) – with component correlations φ1–φ10 between the dimensions.]

Critiques of SERVQUAL

SERVQUAL has become the quality measurement standard for service industries. Not only has it seen extensive use by practitioners (Babakus and Mangold, 1992; Crompton and Mackay, 1989; Cronin and Taylor, 1992; Fick and Ritchie, 1991; Reidenbach and Sandifer-Smallwood, 1990; Saleh and Ryan, 1991; Woodside et al., 1989) but it has also been the subject of several academic studies. Criticisms of SERVQUAL have centered on its underlying structure, multicollinearity problems, and the questions used in the instrument.

Criticism of the structure

Much criticism of the five-dimensional structure of service quality has appeared in the literature. Using the SERVQUAL instrument for three commonly purchased professional services and three nonprofessional services, researchers determined that the level of consumers’ expectations regarding the quality of professional services tends to depend heavily on the demographic characteristics of the individual (Webster, 1989). Another study identified six factors influencing employee perceptions of customer service on retail store sales. These factors were different from those found in SERVQUAL (Weitzel et al., 1989). A modified version of SERVQUAL, used in four service firms, identified that as many as nine distinct dimensions of service quality may exist (Carman, 1990). A study of the hospitality industry found that five dimensions of service quality did in fact exist, but that they were significantly different from the original five dimensions identified by the SERVQUAL authors (Saleh and Ryan, 1991). Other researchers have criticized the structure developed by the SERVQUAL authors, noting that the factor loadings they advocated accounted for less than 50 percent of the item variances in most cases (Babakus and Boller, 1992).


Problems in measuring complex perceptual constructs

The authors of SERVQUAL themselves noted problems inherent in measuring complex perceptual constructs (Parasuraman et al., 1985). They observed that:

• consumers' perceptions of service quality result from a comparison of their expectations before they receive service to their actual service experience;

• quality perceptions are derived from the service process as well as from the service outcome; and

• service quality is of two types, normal and exceptional.

Given this complexity in defining the underlying structure of SERVQUAL, its authors were unable to assume that the five service-quality dimensions were uncorrelated. To factor analyze the data adequately, the authors employed an oblique factor rotation. This rotation procedure assumes the existence of some level of shared variance between the dimensions. Other researchers have used modified versions of SERVQUAL and have employed orthogonal factor rotation as a transformation technique to reduce multicollinearity problems (Reidenbach and Sandifer-Smallwood, 1990). Critiques of SERVQUAL strongly suggest that the dimensionality of customer service forces any measure of service to be multivariate. Consequently, any collection of service measures will demonstrate considerable multicollinearity due to the strong effects of various customers' attitudes. This suggests the need for well-conceived statistical measurement tools for extracting meaningful information from such measures.

The construction of quality dimensions

The majority of criticism of SERVQUAL stems from the construction of quality dimensions. However, it is also important to note that a few studies, using difference scores as a measure of quality, have suggested conceptual and operational problems as well (Teas, 1993, 1994). At this point, the issue has not been settled. Further, the procedure we propose here is applicable to any instrument that gathers multivariate perceptual information. Since the research has not identified a widely agreed on best instrument, we propose that for the purposes of this article, SERVQUAL provides an adequate, widely recognized tool. Nevertheless, perceptions of service quality have a complex composition, their analysis is not straightforward, and it appears that quality constructs differ from industry to industry. Thus, we suggest that SERVQUAL provides only an initial instrument used to gather vital service quality information, and that additional analysis is required.

Parametric measurement of service quality

Little management guidance on service quality conformance measurement has appeared in the literature. For example, the SERVQUAL authors noted that SERVQUAL has a variety of potential applications, including:



• periodically tracking service quality trends;

• determining the relative importance of the five dimensions in influencing customers' overall quality perceptions;

• categorizing a firm's customers into several perceived-quality segments; and

• tracking the level of service provided by each unit of multi-unit companies (Parasuraman et al., 1988).

However, detailed methods to perform these activities were never outlined. We suggest that SERVQUAL data also may be used to monitor a service system's ability to provide desired quality to its customers.

Importance of developing a tracking procedure

While it appears obvious that tracking service quality is important, little has been done to develop a tracking procedure. In manufacturing, process variation is tracked by measuring a quality attribute's deviation from specification. In practice, the evaluation of single operational variables, over time, is often studied using quality control charts. Because of the multivariate construct complexity of perceptual service quality data, service firms attempting to use individual univariate control charts on individual quality measures will experience major difficulty. This difficulty revolves around the inability of the tool to control type I error. In simpler terms, one cannot identify which customers experienced aberrant service performance, because the probability that an individual customer signals out of control grows with the number of variables collected. For example, if the service provider collects three uncorrelated measures of service quality and plots each on a separate univariate control chart (limits typically set with 5 percent type I error), the overall service quality type I error would be between 14 and 15 percent. This problem is compounded with the addition of more variables (e.g. collecting five and nine variables yields approximately 23 and 37 percent of customers out of control, respectively). Furthermore, if the collected variables are correlated, the correlation pattern must be investigated before overall type I error may be assessed. The difficulty of appraising service quality using univariate control charts will be demonstrated in a subsequent section. This article shows that, given some latent structure of service quality, the SERVQUAL instrument, in modified form, can be used to determine which service experiences were within the expected service system boundaries and which were exceptional (positively or negatively). If an orthogonal latent structure can be identified, then we also can determine which service dimensions caused the exceptional service outcome.
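The error inflation described above follows directly from the independence assumption: with per-chart type I error α, the chance that at least one of k charts signals for a perfectly ordinary customer is 1 − (1 − α)^k. A quick check (standard probability arithmetic, not taken from the article's data):

```python
# Family-wise type I error across k independent univariate control
# charts, each run with per-chart false-alarm rate alpha.
def family_wise_error(k: int, alpha: float = 0.05) -> float:
    """P(at least one of k independent charts signals for an
    in-control customer) = 1 - (1 - alpha)^k."""
    return 1.0 - (1.0 - alpha) ** k

for k in (3, 5, 9):
    print(f"{k} charts: {family_wise_error(k):.1%} of in-control customers flagged")
```

With three charts the rate is about 14.3 percent, matching the "between 14 and 15 percent" figure in the text; nine charts push it to roughly 37 percent.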

A check tool must satisfy four main requirements

An approach to the service conformance check problem

To provide valuable management information, a service conformance check tool must satisfy four main requirements:

(1) Provide longitudinal measures of service quality. Administering periodic reviews of the service system only addresses one aspect of service quality. Management may learn a great deal about systemic problems by "taking the business's temperature" from time to time, but to improve the system realistically, the day-to-day operating characteristics of the system must be determined and controlled.

(2) Track multiple related variables. Customer satisfaction requires that several needs are satisfied simultaneously (Bolton and Drew, 1991; King, 1987; Parasuraman et al., 1988). Should any important customer need go unsatisfied, the customer's perception of the entire service experience may be diminished. It is vital that management strive to control all aspects of the service episode.

(3) Provide a measure of overall satisfaction. Although it may be difficult to break apart elements of the service experience, customers tend to form an overall perception of service quality. A metric capturing this perception provides a measure of overall service success.

(4) Focus attention on non-systematic service delivery problems. Instead of identifying normal service behavior that should be improved through alteration of the service delivery system, a check tool should identify aberrant system behavior that is both rare and especially damaging to the customer's perception of quality service.

Control chart framework

What is required is a method that assists in the coincident interpretation of associated variables. The method of principal components used in a control chart framework, as defined by Jackson (1980, 1981a, 1981b) and further refined by Schall and Chandra (1987), provides such a tool. This article proposes that it can function as an improved service conformance check tool, one particularly applicable to quality surveys done using a SERVQUAL-type instrument. In the next section, the underlying dimensionality of SERVQUAL responses to an actual customer survey will be explored to attempt to confirm the original quality dimensions hypothesized by the SERVQUAL authors. Next, a univariate control chart will be constructed for each dimension. Then, the T² control chart, here called a "quality perception control chart", will be constructed and contrasted against the univariate control charts. For those customers found out-of-control on the quality perception control chart, additional analyses will be made using raw data (customer responses), the SERVQUAL quality dimensions, and the principal components of the SERVQUAL dimensions. These analyses will attempt to detect why individual customers had extraordinarily satisfying or unacceptable service episodes. For interested readers, the Appendix presents a step-by-step guide to constructing the T² control chart.

Applying the SERVQUAL instrument

Factor analysis


An application

The Management Information Services (MIS) division of a large international university is explored. The MIS division is responsible for serving the diverse computing needs of over 3,500 students. First, the SERVQUAL instrument was administered to several hundred student-customers of this division. For each of the 22 SERVQUAL items the respondents were asked to provide a rating, using a seven-point Likert scale, of the quality of the "features received". Additionally, the respondents were asked to provide a rating, using a seven-point Likert scale, of the quality of the "features desired" for each item. From these two ratings, a "score" for each question was then constructed by subtracting the "features desired" rating from the "features received" rating. The higher the score, the more positive was the overall service experience. Approximately 150 completed instruments were returned.

Construction of service quality factors

Before utilizing the proposed five-factor relationship suggested by the SERVQUAL authors (Parasuraman et al., 1988, 1991), all the SERVQUAL responses were subjected to factor analysis in order to calculate the factor loadings necessary to create the original SERVQUAL dimensions. Confirmatory estimates of the correlations between certain SERVQUAL dimensions formed were very close to one. As displayed in Table I, analysis of the observed correlations between the hypothesized dimensions strongly indicated that "Responsiveness" and "Assurance" were not unique dimensions. Therefore, these dimensions were merged into one. The data were then resubjected to factor analysis using the four-dimension solution as a model. Figure 2 presents the relationships for the four-dimension solution.

Table I. Correlation matrix of original SERVQUAL dimensions

                  Tangibles  Reliability  Responsiveness  Assurance  Empathy
Tangibles           1.000
Reliability         0.601      1.000
Responsiveness      0.498      0.825        1.000
Assurance           0.606      0.891        0.974          1.000
Empathy             0.512      0.585        0.727          0.825      1.000

Individual factor loadings
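The merge decision behind Table I can be sketched in a few lines. The data below are simulated for illustration only (they are not the study's survey responses); the check itself – correlate two dimension scores and merge the dimensions when the correlation approaches one – mirrors the reasoning in the text:

```python
import random

# Simulated gap scores ("received" minus "desired") for two strongly
# related quality dimensions; invented data, not the study's responses.
random.seed(1)
base = [random.gauss(-5, 3) for _ in range(150)]
responsiveness = [b + random.gauss(0, 0.5) for b in base]
assurance = [b + random.gauss(0, 0.5) for b in base]

def pearson(x, y):
    """Plain Pearson correlation coefficient of two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

r = pearson(responsiveness, assurance)
print(f"r = {r:.3f}")
if r > 0.9:  # near-unity correlation, as with the 0.974 in Table I
    print("Dimensions are not distinct -> merge before further analysis")
```

In practice the same check would be run on every pair of dimension scores computed from the fitted factor loadings.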

Given these four modified SERVQUAL dimensions, actual values for these dimensions were calculated for each customer. These values (linear combinations of the individual SERVQUAL responses forwarded by customers) were computed using the individual factor loadings provided by the four-factor solution. The standardized factor loadings of the confirmed dimensions were:

Tangibles = 0.521(x1) + 0.803(x2) + 0.557(x3) + 0.810(x4)

Reliability = 0.693(x5) + 0.646(x6) + 0.704(x7) + 0.729(x8) + 0.580(x9)

Responsiveness/Assurance = 0.529(x10) + 0.636(x11) + 0.747(x12) + 0.729(x13) + 0.582(x14) + 0.448(x15) + 0.744(x16) + 0.471(x17)

Empathy = 0.725(x18) + 0.643(x19) + 0.731(x20) + 0.667(x21) + 0.700(x22)
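Given loadings like these, each customer's dimension scores are simple weighted sums of that customer's 22 item gap scores. A minimal sketch using the loadings quoted above (the example customer's responses are invented):

```python
# Factor loadings from the four-factor solution quoted in the text;
# keys are 1-based SERVQUAL item numbers.
LOADINGS = {
    "Tangibles": {1: 0.521, 2: 0.803, 3: 0.557, 4: 0.810},
    "Reliability": {5: 0.693, 6: 0.646, 7: 0.704, 8: 0.729, 9: 0.580},
    "Responsiveness/Assurance": {10: 0.529, 11: 0.636, 12: 0.747, 13: 0.729,
                                 14: 0.582, 15: 0.448, 16: 0.744, 17: 0.471},
    "Empathy": {18: 0.725, 19: 0.643, 20: 0.731, 21: 0.667, 22: 0.700},
}

def dimension_scores(gaps):
    """gaps: dict item-number -> ("received" - "desired") rating difference.
    Returns one weighted-sum score per confirmed dimension."""
    return {dim: sum(w * gaps[i] for i, w in items.items())
            for dim, items in LOADINGS.items()}

# One hypothetical customer: a gap of -1 on every item.
customer = {i: -1 for i in range(1, 23)}
print(dimension_scores(customer))
```

This collapses each customer's 22 responses into the four-variable data set the article analyzes next.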

[Figure 2. The SERVQUAL components confirmed from data set one. Path diagram: the 22 SERVQUAL survey items (item variances ∂1–∂22) load onto the four confirmed components – Tangibles (items 1–4), Reliability (items 5–8), Responsiveness and assurance (items 9–17) and Empathy (items 18–22) – with component correlations φ1–φ6 between the dimensions.]


Given the four-factor solution, the evaluation of service experiences continued on a much smaller (four-variable) data set. This data set was investigated further for aberrant service experiences.

Scanning customer responses

Univariate control charts

Once the underlying dimensions of the service data have been confirmed, the manager is left with the question of how to scan customer responses to identify which customers were unusually pleased or displeased with the service provided. As stated earlier, control charts have long been used to separate out abnormal observations by first establishing a normal range of data behavior. Each customer contributes only one point on each service quality dimension, and since there appears to be no good reason to gather customers arbitrarily into sample groups, neither a sample range nor a sample standard deviation can be calculated. Standard X-bar control limits, which require these calculations, cannot be employed. However, for a process that produces individual items (here: served customers) a "control chart for individuals" may be used (Duncan, 1986). This procedure allows calculation of two-sigma control limits for each service dimension. The data set was next evaluated to locate each customer that experienced an abnormal service incident. Fifteen customers (i.e. approximately 10 percent of the total customers) triggered out-of-control signals. They are presented in Table II along with the upper control limit (UCL) and lower control limit (LCL) calculated for each service quality dimension.
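A control chart for individuals estimates process spread from the moving range of successive observations (σ̂ = MR̄/d2, with d2 = 1.128 for ranges of size two). The sketch below applies the two-sigma limits described above to an invented run of single-dimension gap scores, not the study data:

```python
def individuals_limits(scores, n_sigma=2.0):
    """Two-sigma limits for a control chart for individuals.

    Process sigma is estimated from the average moving range of
    successive observations (d2 = 1.128 for ranges of size 2).
    Returns (LCL, UCL).
    """
    mean = sum(scores) / len(scores)
    moving_ranges = [abs(a - b) for a, b in zip(scores, scores[1:])]
    sigma_hat = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - n_sigma * sigma_hat, mean + n_sigma * sigma_hat

# Hypothetical gap scores on one dimension for a run of customers.
scores = [-6, -4, -7, -5, -6, -3, -8, -5, -20, -4, -6, -5]
lcl, ucl = individuals_limits(scores)
flagged = [s for s in scores if not lcl <= s <= ucl]
print(f"LCL={lcl:.2f}, UCL={ucl:.2f}, flagged={flagged}")
```

Run once per dimension, this reproduces the per-dimension scan whose drawbacks the next section demonstrates.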

Drawbacks of univariate control charts

As one evaluates Table II, it becomes obvious that univariate control charts used in this way have serious drawbacks. First, many out-of-control observations were recorded, perhaps too many for the manager to evaluate. Second, this procedure is good at identifying customers who reported being abnormally satisfied or dissatisfied with a particular dimension but is poor at identifying customers who were abnormally satisfied or dissatisfied with the entire service experience. To restrict the number of out-of-control

Table II. Out-of-control observations on univariate control charts (original data); ** denotes an out-of-control value

Obs no.   Tangibles        Reliability      Responsiveness   Empathy
          UCL:   1.936     UCL:  -2.945     UCL:  -1.327     UCL:   1.464
          LCL: -12.950     LCL: -18.155     LCL: -24.800     LCL: -18.463
9          -3.126           -8.549           -9.159          -19.346**
11        -14.475**        -11.829          -19.628          -13.325
16        -13.455**        -15.302          -17.725           -9.450
17        -13.455**        -12.922          -22.936          -11.587
19        -15.068**        -14.623          -25.936**        -18.061
34        -16.146**        -20.112**        -29.316**        -20.796**
63        -13.665**        -12.022          -16.988          -15.880
66         -0.521           -3.377           -0.792**         -2.068
81          3.769           -7.364            1.441**          2.906**
83         -1.520          -13.314          -25.595**        -17.330
97          2.170**         -2.623**         -2.720           -6.347
105         0.000            0.605**          0.840**         -6.922
138       -14.222**        -12.111           -6.237            2.844**
140       -14.475**        -17.398          -20.584          -15.094
142        -3.226            0.704**         -1.644           -8.299

THE JOURNAL OF SERVICES MARKETING VOL. 10 NO. 1 1996

To restrict the number of out-of-control occurrences, one may be tempted to loosen the control limits. However, this ignores the real difficulty with the univariate procedure: the SERVQUAL dimensions should not be regarded as independent of one another. What is required is a procedure that can evaluate all dimensions simultaneously.


The quality perception control chart

As demonstrated in Table I, the original SERVQUAL constructs exhibit a high level of intercorrelation. It is not surprising, therefore, that the correlation matrix calculated from the four-construct model also indicates a high degree of shared variance, as shown in Table III. Given this high level of intercorrelation, the confirmed four-construct model must be adjusted to isolate uncorrelated constructs applicable for multivariate analysis. The principal components transformation is used to compute the quality perception control chart. A principal axes transformation on the covariance matrix calculated from the raw data yielded the following four eigenvectors:

    U = | 0.65629   0.44044   0.60247  -0.11098 |
        | 0.80381   0.32574  -0.18731   0.46119 |
        | 0.93401   0.05064  -0.29284  -0.19825 |          (1)
        | 0.81248  -0.52502   0.23772   0.08787 |

The eigenvalues were next calculated as:

    λ1 = 2.609330, λ2 = 0.578307, λ3 = 0.540327, λ4 = 0.272036.          (2)

The eigenvectors were then scaled to unit variance and employed to transform the four-construct data matrix into a new matrix of four uncorrelated variables. This is presented graphically in Figure 3, which demonstrates the bottom-up transformation of SERVQUAL survey items into the hypothesized important dimensions of service quality.
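Using the eigenvectors and eigenvalues reported in equations (1) and (2), the scaling and transformation steps can be sketched as follows; the function name and the idea of passing in one customer's raw construct scores together with the sample means are illustrative assumptions, not the authors' implementation.

```python
# Scale the reported eigenvectors to unit variance and use them to transform
# one customer's four construct scores into four uncorrelated components.
from math import sqrt

# Eigenvectors U (columns) and eigenvalues from equations (1) and (2).
U = [
    [0.65629,  0.44044,  0.60247, -0.11098],
    [0.80381,  0.32574, -0.18731,  0.46119],
    [0.93401,  0.05064, -0.29284, -0.19825],
    [0.81248, -0.52502,  0.23772,  0.08787],
]
eigenvalues = [2.609330, 0.578307, 0.540327, 0.272036]

# Scale each eigenvector (column of U) to unit variance: w_j = u_j / sqrt(l_j).
W = [[U[i][j] / sqrt(eigenvalues[j]) for j in range(4)] for i in range(4)]

def principal_component_scores(x, x_bar):
    """Transform one customer's four construct scores into four uncorrelated,
    unit-variance principal component scores: y = W'(x - x_bar)."""
    d = [x[i] - x_bar[i] for i in range(4)]
    return [sum(W[i][j] * d[i] for i in range(4)) for j in range(4)]
```

A customer whose scores sit exactly at the sample means maps to a score of zero on every component, which is why the unit-variance components can be charted directly against standard normal limits.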

Next, these uncorrelated factors, suitable for control chart evaluation, are used to determine which customers were extraordinarily satisfied or dissatisfied with the service provided.

Table III. Correlation matrix of confirmed dimensions

                                Tangibles   Reliability   Responsiveness    Empathy
                                                          and assurance
Tangibles                       1.000
Reliability                     0.600       1.000
Responsiveness and assurance    0.547       0.865         1.000
Empathy                         0.512       0.586         0.772             1.000

Figure 3. Orthogonal dimensions extracted from the SERVQUAL components of data set one (path diagram: SERVQUAL survey items 1-22, each with an item variance ∂1-∂22, load via factor loadings on the four modified SERVQUAL components (tangibles, reliability, responsiveness and assurance, empathy), which in turn load via principal component loadings on principal components one to four)

This evaluation employs a two-step control chart procedure:

(1) The initial control chart compares individual t² values, computed for each observation, to a precalculated control limit. Values registering out-of-control indicate service quality responses significantly different from normal. The t² control chart has a single (upper) control limit that is related to the F distribution. While the t² distribution has been cataloged, its relationship to the F distribution can be exploited more easily (Jackson, 1980, 1981a). The upper control limit for the t² chart can be calculated using the following formula:

    t²(1-α, p, n) = [p(n - 1)/(n - p)] F(1-α; p, n - p)          (3)

where:
    p = number of variables;
    n = number of observations.

(2) Univariate control charts may then be constructed by plotting the individual principal component scores or the original SERVQUAL variables. In terms of the principal components, since each measure has been scaled to unit variance, control limits are then directly found from a table of standard normal variates. For example, if 95 percent limits are desired, the upper limit should be located at 1.96 and the lower limit at –1.96. Original SERVQUAL variables may be plotted by using the procedures discussed earlier. Once constructed, the use of these related control charts is straightforward:




• Locate each customer observation as a single point on the t² control chart.

• If the point is in control, then no further investigation is warranted.

• If the point is out-of-control, then the set of univariate control charts can be investigated to help locate the origin of the problem.
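The two-step screening above can be sketched as follows. The t² upper control limit shown is an assumed illustrative value, not the one computed in the study; the component scores for customer 1 are taken from Table IV, and the function names are invented for the sketch.

```python
# Sketch of the two-step screening: the t-squared value for a customer is the
# sum of squared unit-variance principal component scores (equation (A12));
# the control limit below is illustrative, not the one computed in the study.
T2_UCL = 10.0      # assumed upper control limit from equation (3)
Z_LIMIT = 1.96     # 95 percent limits on the unit-variance component charts

def t_squared(scores):
    """t-squared for one customer from unit-variance component scores."""
    return sum(y * y for y in scores)

def screen_customer(scores):
    """Step 1: check the t-squared chart; step 2: if out-of-control, report
    which principal component charts also signal (1-based component numbers)."""
    t2 = t_squared(scores)
    if t2 <= T2_UCL:
        return t2, []            # in control: no further investigation
    flagged = [j + 1 for j, y in enumerate(scores) if abs(y) > Z_LIMIT]
    return t2, flagged

# Customer 1 from Table IV: components one and two plot out-of-control.
t2, flagged = screen_customer([2.0584, 2.0252, -0.4348, -1.4910])
```

Consulting the t² chart first, and only then the component charts, is what keeps the overall alpha level under control.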

Since the t² control chart identifies abnormal service experiences and controls the alpha level in a predetermined manner, it should be consulted before the individual principal component charts are investigated. Although these procedures look formidable, especially if performed at the operational level, research indicates that highly accessible and increasingly powerful personal computers have made it possible to automate statistical process control procedures and to give technicians and managers the flexibility to spend less time on the mechanics of control charting (Wardell et al., 1992). The t² values for our sample data were calculated and plotted, as shown in Figure 4, for the customers surveyed.

Figure 4. T² control chart (plot of t² values (TSQU) against observation number; six observations had missing values; three observations hidden)

The four transformed uncorrelated variables (principal components) were also plotted for these customers, as shown in Figures 5, 6, 7 and 8.


Out-of-control observations

For ten customer-service experiences, the t² statistic plotted out-of-control. Thus, ten customers recorded significantly abnormal service experiences. For these individuals, the next step was to attempt to understand why their experiences were significantly different from those of the other customers served. When large numbers of observations are plotted along several dimensions, the reason for individual out-of-control situations may be difficult to diagnose.

Figure 5. Principal component one chart (plot of PRIN1 against observation number; six observations had missing values; three observations hidden)

The principal component charts may be of major value since, if these components can be interpreted, they provide strong diagnostic tools. They do so in two situations. First, if the principal component technique is not overly destructive of the original SERVQUAL dimensions (i.e. each variable loads high on one, and only one, principal component), then the principal components may serve as surrogate measures of the original SERVQUAL factors. Second, if the principal component procedure is highly destructive of the original structure, then one may wish to assign a new identity to as many of the principal components as possible. If the original components cannot be interpreted, component rotation, similar to factor rotation in factor analysis, may help in identifying the new uncorrelated latent structures. In cases where principal components analysis destroys the original SERVQUAL structure and the identity of the new latent constructs cannot be established, one may need to return to the raw data for help in finding the reasons for service aberrations.

Figure 6. Principal component two chart (plot of PRIN2 against observation number; six observations had missing values; two observations hidden)

As stated earlier, the univariate control chart limits may be helpful if this step becomes necessary.

Three patterns

By evaluating the principal component values of these ten customers, three patterns emerge. Customer 1 recorded exceptionally high values for principal components one and two. Customers 19 and 34 registered very low values for components one and three, while the remaining customers (52, 66, 77, 90, 97, 105 and 142) had high values for principal component one. A summary is provided in Table IV. If an individual phenomenon could be isolated for each of these principal components, then one could identify the reasons behind abnormal customer satisfaction directly from the principal components.

Figure 7. Principal component three chart (plot of PRIN3 against observation number; six observations had missing values; two observations hidden)

By evaluating the eigenvectors (sometimes called the principal component loadings), a relatively clear picture of principal component one emerges. Since all four SERVQUAL dimensions load high on eigenvector one, the first principal component will be abnormally high or low only if all measures of service are high or low. In other words, principal component one is a metric of overall satisfaction. It is not surprising that each customer with an out-of-control t² value also displays an out-of-control signal for principal component one.

Further analysis required

We were not able to attach a specific meaning to the remaining principal components. This is unfortunate, as customers 1, 19 and 34 had multiple principal components out-of-control. Since these principal components are not easily interpretable, further analysis will be conducted on the unrotated SERVQUAL dimensions used to produce the univariate control charts.

Figure 8. Principal component four chart (plot of PRIN4 against observation number; six observations had missing values; five observations hidden)

Table IV. Out-of-control observations on t² control chart (transformed data)

Obs no.    t² value   PC one     PC two     PC three   PC four
1          10.7508     2.0584     2.0252    -0.4348    -1.4910
19         10.5227    -2.4497    -0.1822    -2.0403    -0.5707
34         19.4686    -3.5153    -0.4418    -2.0267    -1.6758
52         13.3677     3.6158     0.4293     0.0226     0.3304
66         11.4365     3.3156     0.5275    -0.1477     0.3782
77         10.8856     3.2706     0.1958    -0.0751     0.3802
90         12.3586     3.4807     0.4379     0.0226     0.1341
97         15.0016     3.4955     1.6497    -0.2472    -0.0116
105        15.4271     3.3266     1.7346    -0.6787     0.9442
142        11.7104     2.5391     1.4906    -1.0892     1.3621

Out-of-control observations

Additionally, the final seven t² out-of-control observations (52, 66, 77, 90, 97, 105 and 142) are attributed to customers who experienced abnormally pleasing service. Since none of the principal components other than the first demonstrates an out-of-control signal, further analysis of these customers requires evaluation beyond the principal components. It should be clear that t² values fall out-of-control because the patterns of responses given by specific customers differ from the normal patterns observed. Once unusual experiences are identified, the univariate control charts may provide insight as to why. Observations 1, 52, 77 and 90 did not register an out-of-control signal on any of the univariate charts; reviewing the data for these four cases revealed very high (but not out-of-control) values for all four service quality dimensions. These customers are unusual in that they are satisfied with all dimensions of the service. Customers 66, 97, 105 and 142 fall into this same category, but each also experienced a positive out-of-control observation on at least one dimension. Only two customers signaled out-of-control low conditions. Customer 19 was extraordinarily dissatisfied with the tangibles dimension and the responsiveness/assurance dimension, and customer 34 scored out-of-control low on all four service quality dimensions. Management might gain further service delivery insights by investigating what made these service experiences unusually dissatisfying.
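As a sketch of this dimension-level follow-up, one can compare a customer's scores with the univariate limits reported in Table II; the function name and dictionary layout are illustrative assumptions, though the limits and customer 34's scores come from the table.

```python
# Compare one customer's scores to the univariate control limits of Table II
# and report the direction of every out-of-control dimension.
LIMITS = {  # dimension: (LCL, UCL), taken from Table II
    "tangibles":      (-12.950,  1.936),
    "reliability":    (-18.155, -2.945),
    "responsiveness": (-24.800, -1.327),
    "empathy":        (-18.463,  1.464),
}

def flag_dimensions(scores):
    """Return {dimension: 'low' or 'high'} for every score outside its limits."""
    flags = {}
    for dim, value in scores.items():
        lcl, ucl = LIMITS[dim]
        if value < lcl:
            flags[dim] = "low"
        elif value > ucl:
            flags[dim] = "high"
    return flags

# Customer 34 (Table II) plots out-of-control low on all four dimensions.
flags_34 = flag_dimensions({"tangibles": -16.146, "reliability": -20.112,
                            "responsiveness": -29.316, "empathy": -20.796})
```

The direction of each flag tells the manager whether to investigate a failure (low) or an unusually pleasing experience (high).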


Conclusions

The results of this study indicate that service managers attempting to use SERVQUAL-type instruments for measuring service quality conformance must do so with caution. SERVQUAL-type instruments inherently have a structure whose dimensions are correlated, which creates the need for well-conceived statistical measurement tools for extracting meaningful information. This article has proposed and illustrated a two-step control chart procedure for evaluating service operations using SERVQUAL-type instruments. Initially, a t² control chart is constructed using the (transformed) multiple responses to the survey instrument. Principal component control charts are then used to examine the individual dimensions of service quality further.


This procedure is shown to be much more accurate for controlling type I error than the procedure involving univariate control charts constructed for the original (correlated) data. A case evaluation was performed to assess how the proposed procedure performs on a real set of data, and very encouraging results were obtained. We anticipate further testing of the procedure using a greater volume of data obtained from various service quality measurement environments. We also believe that the procedure can be of value when used longitudinally, which suggests the need for further studies applying the procedure at varying time intervals.

References

Babakus, E. and Boller, G.W. (1992), "An empirical assessment of the SERVQUAL scale", Journal of Business Research, Vol. 24, pp. 253-68.
Babakus, E. and Mangold, G.W. (1992), "Adapting the SERVQUAL scale to hospital services: an empirical investigation", Health Services Research, Vol. 26 No. 6, February, pp. 767-86.
Ballantyne, D. (1990), "Coming to grips with service intangibles using quality management techniques", Marketing Intelligence & Planning, pp. 4-10.
Bolton, R.N. and Drew, J.H. (1991), "A multistage model of customers' assessments of service quality and value", Journal of Consumer Research, March, pp. 375-84.
Carman, J.M. (1990), "Consumer perceptions of service quality: an assessment of the SERVQUAL dimensions", Journal of Retailing, Vol. 66 No. 1, Spring, pp. 33-55.
Creedon, J. (1988), "Inside Met Life's growing strategy", Journal of Business Strategy, January-February, pp. 23-7.
Crompton, J.L. and Mackay, K.J. (1989), "Users' perceptions of the relative importance of service quality dimensions in selected public recreation programs", Leisure Sciences, Vol. 11, pp. 367-75.
Cronin, J.J. and Taylor, S.A. (1992), "Measuring service quality: a reexamination and extension", Journal of Marketing, Vol. 56, July, pp. 55-68.
Deming, W.E. (1985), "Transformation of the western style of management", Interfaces, Vol. 15 No. 3, May-June, pp. 6-11.
Deming, W.E. (1986), Out of the Crisis, Massachusetts Institute of Technology Center for Advanced Engineering, Cambridge, MA.
Duncan, A.J. (1986), Quality Control and Industrial Statistics, Irwin, Homewood, IL.
Fick, G.R. and Ritchie, J.R.B. (1991), "Measuring service quality in the travel and tourism industry", Journal of Travel Research, Fall, pp. 2-9.
Jackson, E.J. (1980), "Principal components and factor analysis: Part I - principal components", Journal of Quality Technology, October, pp. 201-13.
Jackson, E.J. (1981a), "Principal components and factor analysis: Part II - additional topics related to principal components", Journal of Quality Technology, January, pp. 46-58.
Jackson, E.J. (1981b), "Principal components and factor analysis: Part III - what is factor analysis?", Journal of Quality Technology, April, pp. 125-30.
King, C.A. (1987), "A framework for a service quality assurance system", Quality Progress, September, pp. 27-32.
Lewis, B. (1989), "Quality in the service sector: a review", International Journal of Bank Marketing, Vol. 7 No. 5, pp. 4-12.
Murdick, R.G., Render, B. and Russell, R.S. (1990), Service Operations Management, Allyn and Bacon, Needham Heights, MA.
Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1985), "A conceptual model of service quality and its implications for future research", Journal of Marketing, Vol. 49, Fall, pp. 41-50.
Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1988), "SERVQUAL: a multiple-item scale for measuring consumer perceptions of service quality", Journal of Retailing, Spring, pp. 12-37.



Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1991), "Refinement and reassessment of the SERVQUAL scale", Journal of Retailing, Vol. 67 No. 4, Winter, pp. 420-50.
Reidenbach, E.R. and Sandifer-Smallwood, B. (1990), "Exploring perceptions of hospital operations by a modified SERVQUAL approach", Journal of Health Care Marketing, Vol. 10 No. 4, pp. 47-55.
Saleh, F. and Ryan, C. (1991), "Analyzing service quality in the hospitality industry using the SERVQUAL model", The Service Industries Journal, Vol. 11 No. 3, pp. 324-43.
Schall, S. and Chandra, J. (1987), "Multivariate quality control using principal components", International Journal of Production Research, Vol. 25 No. 4, pp. 571-88.
Smith, S. (1987), "How to quantify quality", Management Today, October, pp. 86-8.
Teas, R.K. (1993), "Expectations, performance evaluation, and customers' perceptions of quality", Journal of Marketing, October, pp. 18-34.
Teas, R.K. (1994), "Expectations as a comparison standard in measuring service quality: an assessment of a reassessment", Journal of Marketing, January, pp. 132-9.
Wardell, D.G., Moskowitz, H. and Plante, R.D. (1992), "Control charts in the presence of data correlation", Management Science, August, pp. 1084-105.
Webster, C. (1989), "Can customers be segmented on the basis of their service quality expectations?", The Journal of Services Marketing, Vol. 3 No. 2, Spring, pp. 35-53.
Weitzel, W., Schwarzkopf, A.B. and Peach, E.B. (1989), "The influence of customer service on retail stores", Journal of Retailing, Vol. 65 No. 1, Spring, pp. 27-39.
Woodside, A.G., Frey, L.L. and Daly, R.T. (1989), "Linking service quality, customer satisfaction and behavioral intention", Journal of Health Care Marketing, Vol. 9 No. 4, December, pp. 5-17.
Appendix

Jackson (1980, 1981a, 1981b) outlined the following six steps for creating a univariate control chart for situations in which multiple (correlated) variables are being measured:

(1) Collect a sample of n customer quality responses for p correlated variables and compute the customer-to-customer covariance matrix and the vector of sample means:

 x11 x  21  . X=  .  .   x n1

X

.

.

.

x1 p 

x 22

.

.

.

x2 p

.

.

.

.

.

. .

xn2

.

.

  .   .  .   x np 

= Raw data matrix

 x1   x  2  . =  = Vector of variable sample means  .  .   x p 

 s11 s  21  . S=  .  .  s p1 where:

54

x12

s12

.

.

.

s1 p 

s22

.

.

.

s2 p

.

.

.

.

. s p2

. .

.

.

  .   .  .   s pp 

( A1)

(A2)

= Covariance matrix

( A 3)

p = number of variables; n = number of observations.


(2) Perform a principal axes transformation on the covariance matrix calculated from the raw data, yielding the eigenvectors and eigenvalues:

    U = [u1 : u2 : ... : up] = a matrix formed from the submatrices of coordinate axes of the new variables (eigenvectors)          (A4)

    L = U'SU = | λ1  0   ...  0  |
               | 0   λ2  ...  0  |
               | ...             |          = diagonal matrix whose diagonal elements are the eigenvalues of S          (A5)
               | 0   0   ...  λp |

(3) Rescale the eigenvectors to unit variance:

    W = U / √L          (A6)

where:

    W = [w1 : w2 : ... : wp] = a matrix formed from the submatrices of eigenvectors scaled to unit variance          (A7)

Note:
    w1 = u1 / √λ1 = the first eigenvector scaled to unit variance          (A8)

(4) Transform the raw data matrix into a matrix of observations of uncorrelated variables:

    Y = W'(X - x̄)          (A9)

where:

    Y = [y1 : y2 : ... : yp] = a matrix formed from the submatrices of transformed observations (principal components scaled to unit variance)          (A10)

Note:
    y1 = w1'(X - x̄) = the first principal component scaled to unit variance          (A11)

(5) Compute the univariate measure of conformance for each observational vector:

    tᵢ² = Σ (j = 1 to p) yⱼᵢ²          (A12)

(6) Plot the t² control chart. The upper control limit for the t² control chart can be calculated using the following formula:

    t²(1-α, p, n) = [p(n - 1)/(n - p)] F(1-α; p, n - p)          (A13)

where:
    p = number of variables;
    n = number of observations.
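A minimal end-to-end sketch of these six steps follows, assuming only two correlated variables so that the eigendecomposition of the symmetric 2 x 2 covariance matrix has a closed form; the data are invented for illustration and this is not the authors' implementation.

```python
# Pure-Python sketch of the six appendix steps for p = 2 correlated variables.
from math import sqrt, atan2, cos, sin

data = [  # n = 8 customer responses on p = 2 correlated quality variables
    [1.0, 1.2], [2.0, 2.1], [3.0, 2.7], [4.0, 4.4],
    [5.0, 4.6], [6.0, 6.3], [7.0, 6.8], [8.0, 8.1],
]
n, p = len(data), 2

# Step 1: sample means and covariance matrix S (A1-A3).
means = [sum(row[j] for row in data) / n for j in range(p)]
def cov(j, k):
    return sum((r[j] - means[j]) * (r[k] - means[k]) for r in data) / (n - 1)
S = [[cov(0, 0), cov(0, 1)], [cov(1, 0), cov(1, 1)]]

# Step 2: eigendecomposition of S (A4-A5); closed form for a symmetric 2 x 2
# matrix: a rotation by theta with tan(2*theta) = 2*s12 / (s11 - s22).
theta = 0.5 * atan2(2 * S[0][1], S[0][0] - S[1][1])
U = [[cos(theta), -sin(theta)], [sin(theta), cos(theta)]]  # columns = u1, u2
mid = (S[0][0] + S[1][1]) / 2
half = sqrt(((S[0][0] - S[1][1]) / 2) ** 2 + S[0][1] ** 2)
L = [mid + half, mid - half]  # eigenvalues, largest first

# Step 3: rescale eigenvectors to unit variance, W = U / sqrt(L) (A6-A8).
W = [[U[i][j] / sqrt(L[j]) for j in range(p)] for i in range(p)]

# Step 4: transform each observation, y = W'(x - x_bar) (A9-A11).
Y = [[sum(W[i][j] * (r[i] - means[i]) for i in range(p)) for j in range(p)]
     for r in data]

# Step 5: t-squared statistic per observation (A12). Step 6 would plot these
# against the limit of equation (A13).
t2 = [sum(y * y for y in row) for row in Y]
```

The transformed columns of Y come out uncorrelated with unit sample variance, which is exactly the property that lets step 6 chart the t² statistics against a single F-based limit.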



John B. Jensen is Assistant Professor of Management Science at the School of Business, University of Southern Maine, Portland, Maine, USA. Robert E. Markland is at the College of Business Administration, University of South Carolina, Columbia, USA.

