Class Note TQM
system, sorting out the few significant factors responsible for the most significant effect on the
system from the many insignificant ones. The principles of TQM implementation in a business
activity set the core values first and then the techniques and tools. The core values adopted in a
TQM company must not be based on the tools and techniques available in the company, but the
reverse. Likewise, the core values must not be based on the existing skills and ethics of the
company's employees; the reverse relationship is what makes a quality company effective and
efficient.
Out of the different techniques used under the TQM principle, Pareto analysis is combined with a
few other important and frequently used techniques such as FMEA (Failure Mode and Effects
Analysis) and DMAIC (Define, Measure, Analyse, Improve and Control).
TQM is based on the core values of an organisation interested in implementing Total Quality
Management. The different core values of a TQM company, along with the tools and techniques
generally employed in the system, are as follows.
Imagine that you've just stepped into a new role as head of department. Unsurprisingly, you've
inherited a whole host of problems that need your attention.
Ideally, you want to focus your attention on fixing the most important problems. But how do you
decide which problems you need to deal with first? And are some problems caused by the same
underlying issue?
Pareto Analysis is a simple technique for prioritizing possible changes by identifying the
problems that will be resolved by making these changes. By using this approach, you can
prioritize the individual changes that will most improve the situation.
Pareto Analysis uses the Pareto Principle, also known as the "80/20 Rule", which is the idea
that 20 percent of causes generate 80 percent of results. With this tool, we're trying to find the 20
percent of work that will generate 80 percent of the results that doing all of the work would
deliver.
Note:
The figures 80 and 20 are illustrative; the Pareto Principle describes the lack of symmetry that
often appears between work put in and results achieved. For example, 13 percent of work could
generate 87 percent of returns. Or 70 percent of problems could be resolved by dealing with 30
percent of the causes.
Step 3: Score Problems
Now you need to score each problem. The scoring method you use depends on the sort of
problem you're trying to solve.
For example, if you're trying to improve profits, you might score problems on the basis of how
much they are costing you. Alternatively, if you're trying to improve customer satisfaction, you
might score them on the basis of the number of complaints eliminated by solving the problem.
Step 4: Group Problems Together By Root Cause
Next, group the problems together by cause. For example, if three of your problems are caused
by lack of staff, put these in the same group.
Step 5: Add up the Scores for Each Group
You can now add up the scores for each cause group. The group with the top score is your
highest priority, and the group with the lowest score is your lowest priority.
Step 6: Take Action
Now you need to deal with the causes of your problems, dealing with your top-priority problem,
or group of problems, first.
Keep in mind that low scoring problems may not even be worth bothering with - solving these
problems may cost you more than the solutions are worth.
Note:
While this approach is great for identifying the most important root cause to deal with, it doesn't
take into account the cost of doing so. Where costs are significant, you'll need to use techniques
such as Cost/Benefit Analysis, and use IRRs and NPVs, to determine which changes you
should implement.
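The scoring, grouping, and ranking steps above can be sketched in a few lines of Python. The problem names, scores, and root-cause labels below are invented purely for illustration:

```python
# Pareto Analysis sketch: score problems, group by root cause, rank groups.
# All problems, scores, and causes here are hypothetical examples.
problems = [
    ("Phones not answered quickly", 15, "Too few staff"),
    ("Staff distracted by other tasks", 10, "Too few staff"),
    ("Engineers poorly organized", 4, "Poor organization"),
    ("Engineers arrive late on site", 2, "Poor organization"),
    ("No after-hours support", 1, "Lack of training"),
]

# Step 5: add up the scores within each cause group.
totals = {}
for _name, score, cause in problems:
    totals[cause] = totals.get(cause, 0) + score

# Step 6: sort groups by total score; the top group is the priority.
ranked = sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
print(ranked)
```

The highest-scoring group comes first in `ranked`, and the low-scoring tail groups at the end are the ones that may not be worth bothering with.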
Pareto Analysis is a simple technique for prioritizing problem-solving work so that the first piece
of work done resolves the greatest number of problems. It's based on the Pareto Principle (also
known as the 80/20 Rule): the idea that 80 percent of problems may be caused by as few as 20
percent of causes.
Pareto Analysis involves identifying and listing problems and their causes. Each problem is then
scored, and the problems are grouped together by their cause. The scores for each group are then
added up, and finally a solution to the cause of the group with the highest score is worked out.
Pareto Analysis not only shows you the most important problem to solve, it also gives you a
score showing how severe the problem is.
Analyse the following data table using the Pareto analysis method, pictorially presenting the
information derived from the data to identify the vital 20% of causes that need to be addressed
to bring about an 80% overall improvement.
Causes                   Frequency in %   Cumulative frequency in %
Technical failure              42                    42
Workforce problems             35                    77
Environmental factors          12                    89
Shortage of resources           8                    97
Government approval             3                   100
A graphical representation of the analysis provides a more powerful tool, which is easily
comprehensible as well, to identify the relative importance of all the enlisted causes, and come to
a final decision on the few key causes significantly affecting the project. Here's how
Pareto Analysis data can be pictorially presented to identify the vital 20% of causes that need
to be addressed to bring about an 80% overall improvement.
1. Make a list of all the probable causes in one column and, in the adjacent column, fill in the
frequency of occurrence, in percentage form, for each of the causes. Sort the data in descending
order of frequency of occurrence.
2. In the next adjacent column, calculate the cumulative frequency of occurrence, which by
default will appear in percentages.
3. Finally, make a bar graph from the data, plotting the frequency along the y-axis and the causes
along the x-axis. The cumulative frequency can be represented as a line. The resulting bar chart
will make clear which key causes are responsible for 80% of the problems related to the project.
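Steps 1 and 2 can be reproduced numerically with the worked data from the table above. A minimal Python sketch (the plotting in step 3 is omitted; only the cumulative column is computed):

```python
# Pareto table construction: sorted frequencies and their cumulative sums.
# Data taken from the worked example table above (values in %).
causes = ["Technical failure", "Workforce problems", "Environmental factors",
          "Shortage of resources", "Government approval"]
freq = [42, 35, 12, 8, 3]  # already sorted in descending order

# Step 2: build the cumulative frequency column.
cumulative = []
running = 0
for f in freq:
    running += f
    cumulative.append(running)

print(cumulative)  # the cumulative-frequency column of the table
```

The resulting list matches the third column of the table: 42, 77, 89, 97, 100.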
TQM aims to provide products and services with a level of quality that satisfies customers, at the
appropriate time and price.
There are many proposed tools and techniques to achieve the TQM promises. Generally, a
technique can be considered as a number of activities performed in a certain order to reach the
values (Hellsten & Klefsjö, 2000). Tools, on the other hand, sometimes have a statistical basis to
support decision making or to facilitate the analysis of data.
Most of the studies in TQM implementation focus on the concept of TQM. There are very few
studies in the literature that directly suggest an implementation roadmap of TQM tools and
techniques and usually they are not a complete roadmap. Therefore, a comprehensive roadmap
for TQM implementation is proposed that covers all the cited tools and techniques.
The management focus and commitment phase requires the use of data analysis tools (e.g. cause
& effect analysis, flow charts and Pareto analysis) to identify problem areas, quantify their
effects and prioritize the need for solution. During the intensive improvement phase the
introduction of more complex tools (e.g. statistical process control (SPC) and failure mode and
effects analysis (FMEA)) helps to facilitate company-wide improvement.
Bunney and Dale (1997) also categorized TQM tools and techniques in two different ways: first
into five categories according to their application, and second into seven categories according to
the function in which they can be used. Table 1 shows the TQM tools according to their application.
Table 1: Analysis of Application of Tools and Techniques (Bunney & Dale, 1997)
Table 2: Analysis of Tools and Techniques Used within each Function (Bunney & Dale,
1997)
In another study, TQM can be defined as a management system, which consists of three
interdependent units, namely core values, techniques and tools. The idea is that the core values
must be supported by techniques, such as process management, benchmarking, and customer
focused planning, or improvement teams, and tools, such as control charts, the quality house or
Ishikawa diagrams, in order to be part of a culture. They emphasized that this systematic
definition will facilitate for organizations the understanding and implementation of TQM.
Therefore, the implementation work should begin with the acceptance of the core values that
characterize the culture of the organization. The next step is to continuously choose techniques
that are suitable for supporting the selected values. Ultimately, suitable tools have to be identified
and used in an efficient way in order to support the chosen techniques.
Figure 1: TQM as a Management System Consisting of Values, Techniques and Tools (Hellsten &
Klefsjö, 2000)
According to Hellsten and Klefsjö (2000), the core values form the basis for the culture of the
organization. Another component is techniques, i.e. ways of working within the organization to reach the values. A
technique consists of a number of activities performed in a certain order. The important concept
here is that TQM really should be looked on as a system. The values are supported by techniques
and tools to form a whole. We have to start with the core values and ask: Which core values
should characterize our organization? When this is decided, we have to identify techniques that
are suitable for our organization to use and support our values. Finally, from that decision the
suitable tools have to be identified and used in an efficient way to support the techniques (see
Figure 2).
It is, of course, important to note that a particular technique can support different core values and
that the same tool can be useful within many techniques.
Another work identified 15 frequently used TQM tools and classified them into qualitative
TQM tools and quantitative TQM tools. Qualitative tools consist mainly of subjective inputs,
which often do not intend to measure something of a numerical nature. Quantitative tools, on the
other hand, involve either the extension of historical data or the analysis of objective data, which
usually avoid personal biases that sometimes contaminate qualitative tools. They categorized
TQM tools as below:
Qualitative tools:
Quantitative tools:
Additional Information:
Pareto analysis:
Pareto analysis is a formal technique useful where many possible courses of action are
competing for attention. In essence, the problem-solving process estimates the benefit delivered
by each action, then selects a number of the most effective actions that deliver a total benefit
reasonably close to the maximal possible one. However, it can be limited by its exclusion of
possibly important problems which may be small initially, but which grow with time. It should
be combined with other analytical tools such as failure mode and effects analysis and fault tree
analysis for example.
This technique helps to identify the top portion of causes that need to be addressed to resolve the
majority of problems. Once the predominant causes are identified, then tools like the Ishikawa
diagram or Fish-bone Analysis can be used to identify the root causes of the problems. While it is
common to refer to Pareto as the "80/20" rule, under the assumption that, in all situations, 20% of
causes determine 80% of problems, this ratio is merely a convenient rule of thumb and is not, nor
should it be considered, an immutable law of nature.
The application of the Pareto analysis in risk management allows management to focus on those
risks that have the most impact on the project.
Steps to identify the important causes using 80/20 rule
1. Form an explicit table listing the causes and their frequency as a percentage.
2. Arrange the rows in the decreasing order of importance of the causes (i.e., the most
important cause first)
3. Add a cumulative percentage column to the table
4. Plot with causes on x- and cumulative percentage on y-axis
5. Join the above points to form a curve
6. Plot (on the same graph) a bar graph with causes on x- and percent frequency on y-axis
7. Draw a line at 80% on the y-axis, parallel to the x-axis. Then drop a vertical line from the
point where it intersects the curve down to the x-axis. This point on the x-axis separates the
important causes (on the left) from the trivial causes (on the right)
8. Explicitly review the chart to ensure that causes for at least 80% of the problems are
captured.
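The cut-off logic of steps 7 and 8 can be sketched in Python, using the cumulative percentages from the worked example above: keep taking causes, in order of importance, until at least 80% of the problems are covered.

```python
# 80/20 cut-off: select the vital causes covering at least 80% of problems.
# Cumulative percentages taken from the worked example above.
causes = ["Technical failure", "Workforce problems", "Environmental factors",
          "Shortage of resources", "Government approval"]
cumulative = [42, 77, 89, 97, 100]

vital = []
for cause, cum in zip(causes, cumulative):
    vital.append(cause)
    if cum >= 80:      # step 8: stop once at least 80% is captured
        break

print(vital)  # the "important" causes to the left of the 80% line
```

Here the first three causes must be included, since the first two cover only 77%; together the vital three account for 89% of the problems.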
The Pareto principle (also known as the 80/20 rule, the law of the vital few, and the principle
of factor sparsity) states that, for many events, roughly 80% of the effects come from 20% of
the causes. Management consultant Joseph M. Juran suggested the principle and named it after
Italian economist Vilfredo Pareto, who, while at the University of Lausanne in 1896, published
his first paper "Cours d'économie politique." Essentially, Pareto showed that approximately 80%
of the land in Italy was owned by 20% of the population; Pareto developed the principle by
observing that 20% of the peapods in his garden contained 80% of the peas.
It is a common rule of thumb in business; e.g., "80% of your sales come from 20% of your
clients." Mathematically, the 80/20 rule is roughly followed by a power law distribution (also
known as a Pareto distribution) for a particular set of parameters, and many natural phenomena
have been shown empirically to exhibit such a distribution.
The Pareto principle is only tangentially related to Pareto efficiency. Pareto developed both
concepts in the context of the distribution of income and wealth among the population.
The distribution is claimed to appear in several different aspects relevant to entrepreneurs and
business managers. For example:
80% of a company's profits come from 20% of the time its staff spend
Other applications:
In the systems science discipline, Epstein and Axtell created an agent-based simulation model
called SugarScape, from a decentralized modeling approach, based on individual behaviour rules
defined for each agent in the economy. Wealth distribution and Pareto's 80/20 principle became
emergent in their results, which suggests the principle is a natural phenomenon.
The Pareto principle has many applications in quality control. It is the basis for the Pareto chart,
one of the key tools used in total quality control and six sigma. The Pareto principle serves as a
baseline for ABC-analysis and XYZ-analysis, widely used in logistics and procurement for the
purpose of optimizing stock of goods, as well as costs of keeping and replenishing that stock.
The Pareto principle was also mentioned in the book 24/8 - The Secret for being Mega-Effective
by Achieving More in Less Time by Amit Offir. Offir claims that if you want to function as a
one-stop shop, simply focus on the 20% of a project that is important, and that way you will save
a lot of time and energy.
In health care in the United States, 20% of patients have been found to use 80% of health care
resources. Several criminology studies have found 80% of crimes are committed by 20% of
criminals. This statistic is used to support both stop-and-frisk policies and broken windows
policing, as catching those criminals committing minor crimes will likely net many criminals
wanted for (or who would normally commit) larger ones.
In the financial services industry, this concept is known as profit risk, where 20% or fewer of a
company's customers are generating positive income, while 80% or more are costing the
company money.
Mathematical notes
The idea has rule of thumb application in many places, but it is commonly misused. For
example, it is a misuse to state a solution to a problem "fits the 80/20 rule" just because it fits
80% of the cases; it must also be that the solution requires only 20% of the resources that would
be needed to solve all cases. Additionally, it is a misuse of the 80/20 rule to interpret data with a
small number of categories or observations.
This is a special case of the wider phenomenon of Pareto distributions. If the Pareto index α,
which is one of the parameters characterizing a Pareto distribution, is chosen as α = log₄5 ≈ 1.16,
then one has 80% of effects coming from 20% of causes.
It follows that one also has 80% of that top 80% of effects coming from 20% of that top 20% of
causes, and so on. Eighty percent of 80% is 64%, and 20% of 20% is 4%, so this implies a "64-4"
law; applying the rule a third time implies a "51.2-0.8" law. Similarly for the bottom 80% of causes and bottom
20% of effects, the bottom 80% of the bottom 80% only cause 20% of the remaining 20%. This
is broadly in line with the world population/wealth table above, where the bottom 60% of the
people own 5.5% of the wealth.
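These repeated-application figures can be checked numerically. A standard property of the Pareto distribution is that the share of total effects produced by the top fraction p of causes is p^((α−1)/α); with α = log₄5 this reproduces the 80/20, 64-4, and 51.2-0.8 laws:

```python
import math

# Share of effects from the top fraction p of causes, for a Pareto
# distribution with index alpha: p ** ((alpha - 1) / alpha).
alpha = math.log(5, 4)            # the index that yields exactly 80/20
exponent = (alpha - 1) / alpha

top20 = 0.2 ** exponent           # top 20% of causes
top4 = 0.04 ** exponent           # top 20% of the top 20%  -> "64-4" law
top08 = 0.008 ** exponent         # applied a third time    -> "51.2-0.8" law

print(round(top20, 3), round(top4, 3), round(top08, 3))
```

Note that 0.04 = 0.2², so the 64-4 law is just the 80/20 law squared: 0.64 = 0.8².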
The 64-4 correlation also implies a 32% 'fair' area between the 4% and 64%, where the lower
80% of the top 20% (16%) and upper 20% of the bottom 80% (also 16%) relates to the
corresponding lower top and upper bottom of effects (32%). This is also broadly in line with the
world population table above, where the second 20% control 12% of the wealth, and the bottom
of the top 20% (presumably) control 16% of the wealth.
The term 80/20 is only shorthand for the general principle at work. In individual cases, the
distribution could just as well be, say, 80/10 or 80/30. There is no need for the two numbers to
add up to the number 100, as they are measures of different things (e.g., 'number of customers' vs
'amount spent'). However, each case in which they do not add up to 100% is equivalent to one in
which they do; for example, as noted above, the "64-4 law" (in which the two numbers do not
add up to 100%) is equivalent to the "80/20 law" (in which they do add up to 100%). Thus,
specifying two percentages independently does not lead to a broader class of distributions than
what one gets by specifying the larger one and letting the smaller one be its complement relative
to 100%. Thus, there is only one degree of freedom in the choice of that parameter.
Adding up to 100 leads to a nice symmetry. For example, if 80% of effects come from the top
20% of sources, then the remaining 20% of effects come from the lower 80% of sources. This is
called the "joint ratio", and can be used to measure the degree of imbalance: a joint ratio of 96:4
is very imbalanced, 80:20 is significantly imbalanced (Gini index: 60%), 70:30 is moderately
imbalanced (Gini index: 40%), and 55:45 is just slightly imbalanced.
The Pareto principle is an illustration of a "power law" relationship, which also occurs in
phenomena such as brush fires and earthquakes. Because it is self-similar over a wide range of
magnitudes, it produces outcomes completely different from Gaussian distribution phenomena.
This fact explains the frequent breakdowns of sophisticated financial instruments, which are
modeled on the assumption that a Gaussian relationship is appropriate to, for example, stock
price movements.
Equality measures
Gini coefficient and Hoover index
Using the "A : B" notation (for example, 0.8:0.2) and with A + B = 1, inequality measures like the
Gini index (G) and the Hoover index (H) can be computed. In this case both are the same.
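For the two-segment Lorenz curve implied by an A : B split with A + B = 1, both indices reduce to A − B. A small Python sketch (the trapezoid formula for the Lorenz-curve area is a derivation added here, not taken from the text):

```python
def gini_hoover(a):
    """Gini and Hoover indices for an A:B split with b = 1 - a:
    the bottom fraction a of the population holds share b of the total
    (e.g. a = 0.8 gives the 80:20 case)."""
    b = 1.0 - a
    # Area under the two-segment Lorenz curve (0,0) -> (a,b) -> (1,1):
    # a triangle up to (a,b), then a trapezoid from (a,b) to (1,1).
    area = a * b / 2 + (1 - a) * (b + 1) / 2
    g = 1 - 2 * area    # Gini index: twice the gap to the equality line's area
    h = a - b           # Hoover index: max vertical gap to the equality line
    return g, h

print(gini_hoover(0.8))   # the 80:20 case
```

This reproduces the figures quoted below: 80:20 gives G = H = 0.6 (60%), 70:30 gives 0.4 (40%), and 55:45 gives 0.1.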
Theil index
The Theil index is an entropy measure used to quantify inequalities. The measure is 0 for 50:50
distributions and reaches 1 at a Pareto distribution of 82:18. Higher inequalities yield Theil
indices above 1.
SPC---Statistical Process Control
FMEA---Failure Mode and Effect (Criticality) Analysis
QFD---Quality Function Deployment
Figure 1 illustrates this definition (The techniques and tools in the figure are just examples and
not a complete list).
Qualitative tools: brainstorming; selection grids; task lists.
Quantitative tools: run charts; histograms.
In another study, Ahmed and Hassan (2003) introduced a different method. They indicated that,
from the point of view of the presentation of data on a process, tools can be classified as graphical tools and
flow diagrams. In the graphical tools class there are histogram, stem-and-leaf diagrams, line
charts, bar charts, pie charts, run or time series charts, control charts, and Pareto diagrams, and in
the flow diagrams class there are flow diagrams, process flow charts, cause-and-effect diagrams,
and tree diagrams. Tools like check sheets, location plots, and data tables can be used to facilitate
data collection and summarization.
For analyzing quality management aspects, these basic quality control tools are powerful and
acceptable. Force-field analysis, nominal group technique, affinity diagram, interrelationship
diagram, tree diagram, matrix diagram, prioritization matrices, process decision program chart
(PDPC), and activity network diagram (PERT, CPM, arrow diagram, AoN) are some of the
relevant management tools associated with quality management that can be applied to generate
and treat soft data. Dale (2003) listed these tools under the seven management tools (M7), including
affinity diagrams, relation diagrams, systematic diagrams, matrix data analysis, PDPCs, and
arrow diagrams.
Rao et al. (1996) categorized brainstorming, affinity diagrams (or structured brainstorming),
process potential index Cp, process performance index Cpk, Taguchi's loss function, and design
of experiment (DoE) as advanced tools.
Ahmed and Hassan (2003) indicated that a systematic approach can produce very significant
benefits in the long run. Deming's plan-do-study/check-act (PDSA/PDCA) is an excellent
technique in monitoring and problem solving for continuous quality improvement where any
brilliant ideas of individuals can be accommodated. However, a good number of other tools and
techniques have to be brought in to apply it properly. In other words, it integrates a few essential
tools and techniques. In fact, no tool or technique should be used in isolation, without a strategic
disposition. Figure 3 depicts the systematic use of various tools in different
operational stages.
In another study in this area, Fazel & Salegna (1996) tried to group major TQM tools and
techniques into six major categories as determined by their primary area of implementation
focus:
1) customer-based;
2) management-based;
3) employee-based;
4) supplier-based;
5) process-based; and
6) product-based.
Table 3 shows this classification.
The above mentioned categories are described below (Salegna & Fazel, 1996):
1) Customer-based strategies should be the focal point of every TQM programme, around which
all other strategies are formulated. Customer satisfaction is only likely to be achieved and
maintained when the customer plays an active role in the organization's process of quality
improvement. Major techniques used to accomplish this are customer needs analysis, customer
surveys and quality function deployment.
2) Management-based strategies are also extremely important for the successful implementation
of TQM. TQM initiatives are not likely to succeed without strong leadership and support from
top management. The goals and the benefits of implementing TQM must clearly be
communicated by top management to the workforce. The alignment of the reward structure with
the goals of the organization is also vital to the organization's success in achieving these goals.
Table 3: TQM Tools and Techniques Categories (Salegna & Fazel, 1996)
Implementation Strategy
Leadership
Cross-training
Quality circles
Quality teams
Brainstorming
Nominal group technique
Supplier documentation
Supplier certification
Quality improvement process
Just-in-time
Lead time reduction
Benchmarking
Quality cost analysis
Quality audits
Quality assessment
Process documentation
ISO 9000
Work flow analysis
Design of experiments
Concurrent engineering
Product flow analysis
5) Process-based strategies focus on improving processes by reducing waste, defect rates, cycle time,
and providing feedback on the performance of the process. Benchmarking, SPC and JIT are some of
the most popular techniques employed by companies to achieve these goals.
6) Product-based strategies are directly focused on the quality of the product, its physical
characteristics and its manufacturability.
THE PROPOSED ROADMAP FOR TQM IMPLEMENTATION
According to Hammer & Goding (2001), the DMAIC methodology provides a structured framework for
solving business problems by assuring correct and effective process execution. This methodology has
five phases in which, in the case of Six Sigma, teams take total-employee-involvement approaches to
complete the cycle of process management and use self-diagnosis skills to fulfill the goals of each
phase. The business will naturally reach Six Sigma quality when all key processes within a
business are completed for each of these five phases (Byrne, 2003). Figure 4 shows the DMAIC
methodology.
However, in the case of TQM, there are many factors affecting the selection of TQM tools, and
they should be considered before any implementation plan. These factors are:
Some tools or techniques appear simpler than others in their development and interpretation. The
purpose of each of them is distinct and problem specific. Certainly, not all tools or techniques are
required in one firm. SPC tools are very basic and can be applied for both short and long term
goals. Some of the tools and techniques are commonly (even frequently) used, for example
Pareto chart, cause & effect diagram, histogram or quality control charts for quality performance
monitoring and improvement, and some others can be used less frequently (such as
Benchmarking, QFD). Some of the techniques, for example QFD, FMEA, and design
for manufacturability (DFM), are used in the design and development processes. Different control charts
and process capability indices are used in controlling manufacturing processes (Ahmed &
Hassan, 2003).
The fundamental slogan of TQM is "Do-it-right-the-first-time (DIRFT)". For this, quality is
required to be introduced at the design level. The tools that have direct linkage with the
introduction of new products are DoE, QFD, FMEA, and fault tree analysis (Spring, McQuater,
Swift, Dale, & Booker, 1998).
Regarding the functions and activities of a manufacturing firm, the following tools and techniques
are suitable for implementing TQM (Ahmed & Hassan, 2003):
a) in new product introduction - DoE, brainstorming, cause & effect diagram, QFD, fault tree
analysis;
b) in stage of production - process flow diagrams, Pareto chart, control chart;
c) in assessing the process or product - histogram, pie chart, scatter diagram, bar chart, etc; and
d) in every stage of data collection - capability indices, check sheet or check list, etc. Therefore,
the need for a systematic approach in the selection stage of tools and techniques, and then in the
implementation phases, calls for a more comprehensive methodology. In order to respond to this,
an extended model is proposed here based on the DMAIC methodology. This model is called the
TQM implementation roadmap, to show its developmental sequence toward TQM tools and techniques
implementation. Figure 5 shows the proposed roadmap.
The correct selection and use of tools and techniques is a vital component of any successful TQM
implementation plan. TQM tools and techniques can be divided into simple tools for solving a
specific problem and complex ones that cover all functions within the company. Before any
implementation, the availability of resources within the company, the usage and scope of each
tool and technique, and the product characteristics should be considered carefully. In order to
respond to the quest for a comprehensive methodology for implementing TQM tools and techniques,
a roadmap with six steps is proposed in this research. The developmental sequence of this roadmap
starts with steps that begin with process documentation and leads to the accomplishment of more
complex and modern quality tools and techniques that guide the organization to a high level of
quality. This roadmap can help all organizations intending to reach a high level of institutionalized
quality.
Of course, before being able to construct a quality management program, the company must first
organize a system of quality that will determine how they will judge their products. This quality
system will contain the characteristics to be judged in order to define the quality of the particular
product or service to be produced for the market.
The first characteristic of quality is the quality of the product's blueprint or plan. This is the
engineering that is involved with the scheme of the product or service: how good it is, as well
as how well it answers the needs and wants of the target market. This first step is the one that is
involved with planning the product.
The second characteristic in defining quality is the degree to which the finished product is seen
to have strictly adhered to the specifications that were set by the initial plans and designs.
Manufacturing processes do not always yield a product that has been constructed exactly according
to plan (in certain manufacturing situations, there are concerns that cause compromises to be made,
such as costs of materials, time of production, availability of resources, etc.), and that's why
determining whether a certain finished product measures up to the initial plan is an
important factor in determining quality.
The third characteristic for determining quality is the satisfaction based on good customer
service. After the product has been planned, designed, the blueprint and the process have been
laid out, next comes the actual construction of the product. After the manufacturing processes,
then the process of selling the product comes into play. Measuring performance on this third
process is very important because in this third phase, actual contact with the target customers is
put into play and this is where customer trust, satisfaction and loyalty may or may not be firmly
established. It is important to remember that no matter how good a product may be, if it is not
presented accordingly to the customers, and if the concerns of the customers afterwards are not
given the proper attention and appropriate action, then satisfaction, trust, and loyalty may suffer,
and the viability and profitability of the business will suffer with them.
In all these aspects of quality, customer satisfaction is the topmost focus. These three
characteristics of quality are of equal importance because if a certain level of incompetence is
present in any of these quality specifiers, the fate of the product and the business will surely
suffer the critical consequences.