
Future of quality management

In this file, you can find useful information about the future of quality management, such as
forms, tools, and strategies for the future of quality management. If you need more assistance
with the future of quality management, please leave your comment at the end of the file.
Other useful material for future of quality management:
qualitymanagement123.com/23-free-ebooks-for-quality-management
qualitymanagement123.com/185-free-quality-management-forms
qualitymanagement123.com/free-98-ISO-9001-templates-and-forms
qualitymanagement123.com/top-84-quality-management-KPIs
qualitymanagement123.com/top-18-quality-management-job-descriptions
qualitymanagement123.com/86-quality-management-interview-questions-and-answers

I. Contents of future of quality management


==================
The future approaches with the force of a level-five hurricane, making businesses reconfigure
their strategies, reorganize their structures, and reinvent their processes. Unless they adapt,
once-dominant businesses will become marginal players, and weaker ones will be swept into the
dustbin of history.
Recognizing the challenges ahead, businesses of all sizes in all industries will be making
far-reaching changes. Many of these will focus on integrating enterprise excellence, an area with
enormous potential. Look for a new member of senior operating teams to emerge at corporate
levels during the next two decades: the chief enterprise integration officer (CEIO).
The CEIO will contribute by helping the entire organization achieve the three Rs of business:
doing the right things right at the right time. Reporting directly to the chief executive officer, the
CEIO will help create a new kind of organization, one that's resilient enough to survive domestic
and international market turbulence, satisfy customers' increasingly critical demands, and
accelerate the pace of improvement in all its functions.
Following is a profile of the new business organization that will evolve during the next two
decades.
New emphasis on measurement
Dissatisfied with results produced by today's scorecards, businesses will adopt new types of
these measuring tools. In many organizations, scorecard metrics lead to counterproductive

results. In some cases, decisions based on irrelevant or incorrect data fail to justify or support
corporate strategies. In the future, management will adopt measurement and improvement
systems that will orchestrate day-to-day activities to produce a true alignment to business needs.
Essential for survival in the future will be a system that does more than just monitor overall
operations for management. It should also provide the entire workforce with information that can
be used consistently up and down the line to make sure everyone's performance directly supports
corporate strategies or becomes a target for corrective action. In other words, only what matters
will be measured, and every metric will have an owner who can justify its relevance to an
organization's bottom line.
The red-yellow-green scorecard, which has provided insufficient information and led to costly
wrong behavior, will become history.
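The idea that every metric has an accountable owner and a justified target can be sketched as data. This is a minimal, hypothetical Python representation; all names and numbers are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Metric:
    """A scorecard metric with an accountable owner (illustrative only)."""
    name: str
    owner: str      # the person who must justify the metric's relevance
    value: float
    target: float

    def needs_action(self) -> bool:
        # A metric that misses its target becomes a target for corrective action.
        return self.value < self.target

defect_free = Metric("defect-free rate", "line manager", value=0.97, target=0.98)
print(defect_free.needs_action())  # prints True: flagged for corrective action
```

A metric with no owner or no defensible target would simply not be tracked, which is the point of the "only what matters will be measured" rule.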
Roadmaps
The pressure of international competition to provide maximum, measurable, predictable, and
sustainable bottom-line results will increase for all levels of management. To achieve these, a
management system called integrated enterprise excellence (IEE) will integrate excellence
throughout an organization by including all lean and Six Sigma projects as well as providing a
roadmap to make this possible.
The enterprise roadmap will help organizations select more productive projects and provide a no-nonsense, integrated value-chain measurement and analysis system to help orchestrate day-to-day work activities and target overall process-improvement efforts.
This management system will be effective not only for viewing high-level operational and
corporate business metrics, but also for providing the tools to solve tough problems. For
example, it could help a company use design of experiments to solve an elusive quality problem
for a chain of fast-food restaurants, or help a company eliminate a series of interrelated facilities
with complicated interactions and improve process flows, thus improving the bottom line.
This system will provide methods to develop and coordinate a basic set of statistical and
nonstatistical tools. The process will enable management to meet growth goals, maximize cash
flow, nurture innovation, develop responsive supply-chain dynamics, respond accurately to
customer needs, improve employee performance, prevent firefighting, avoid surprises, and
predict financial results.
Selected checks and balances will stay in place regardless of management turnover, changes in
competitive conditions, or the economic climate. In taking lean Six Sigma and the balanced
scorecard from an individual project orientation to a new level throughout an enterprise, this
roadmap will identify flaws in all operational processes and determine whether a process itself is
flawed. New systems will be designed with preventive actions in place to avoid firefighting.
This new approach will use the familiar Six Sigma define-measure-analyze-improve-control
project steps to pursue a corporatewide, integrated management system up and down the value
chain.

Individual lean Six Sigma projects will be selected based on their contribution to the bottom line.
No longer will projects be selected in isolation from each other, with improvements being made
everywhere except along the organization's critical success paths. No longer will projects focus
on harvesting low-hanging fruit, only to see improvement efforts stall soon after.
All projects will truly integrate lean and Six Sigma methodologies. Scattered, individual efforts
to improve efficiency and quality, whether needed or not, will be avoided when they waste
corporate resources that are required elsewhere.
There will be no Six Sigma without lean, no lean without Six Sigma, and no individual projects
that don't support a company's corporate strategies and financial goals. This will be true
regardless of whether the environment is manufacturing or transactional in nature. All other
process improvement tools will be fully integrated.
Innovation
Innovative discipline will be routinely integrated with analytics in a balance appropriate for the
organizations particular culture and strategy. Every product and service developed will have an
identified market that's been researched, with the potential clearly understood and the required
marketing capabilities available.
All work will be data-driven and standardized to reduce variability and improve quality. Data
will replace hunch, instinct, and emotion. There will be zero-level ambiguity regarding internal
and external customer requirements. Material and information flows will be seamless, not
uncoordinated or scattered individually throughout an organization.
Statistical and visualization tools will be used with maximum effectiveness throughout the value
chain. Everyone will have the information needed to perform assigned tasks optimally because
software will allow even nontechnical employees to use data.
Critical operational processes will be inherently flaw-free when this discipline is put in place
along with a firm management discipline that's pursued faithfully and professionally.
Organizations will be able to more easily distinguish between special cause variability and the
far more prevalent common cause variability, and marshal resources to combat common cause
variability by moving from unproductive firefighting to fire prevention.
Leadership
New corporate management structures that include a CEIO will produce leaders who are
teachers, and teachers who are leaders and professionals.
Managers at all levels will have a full understanding of the Y = f (X) equation. They will no
longer believe processes improve simply by setting goals.
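Y = f(X) says a process output Y is driven by its input variables X; a goal by itself changes nothing. A toy Python model makes the point; the function and its coefficients are invented purely for illustration:

```python
def process_yield(temperature: float, pressure: float) -> float:
    """Toy transfer function: the output Y is a function of the inputs X.

    The coefficients are invented for illustration only.
    """
    return 50.0 + 0.3 * temperature + 0.1 * pressure

# Setting a yield goal changes nothing; adjusting the Xs does.
baseline = process_yield(temperature=100.0, pressure=50.0)  # 85.0
improved = process_yield(temperature=120.0, pressure=50.0)  # 91.0
```

Managers who understand this relationship improve Y by identifying and changing the influential Xs, not by declaring a higher target for Y.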
Business performance data will be presented in terms that everyone can understand. For
example, "The process is predictable with a 2-percent nonconformance rate that costs the
business $200,000 monthly."

Decision making will always be based on high-level metrics not bounded by calendar year or
quarters. Measurements will be made at two different levels. At the operational level, thorough
measures of operational components such as defective rates, lead times, on-time deliveries,
development times, and safety will be conducted. At the corporate level, there will be more
accurate and predictable measures of factors that affect finances, such as return on invested
capital, operating income, and net profit margin.
Accounting irregularities will be relics. There will be zero potential for improper movement of
resources from one entity to another. Employees will no longer avoid responsibility or use
metrics to hide productivity shortfalls rather than monitor them. In short, problems like those
reported in the press about Dell, Enron, and other companies will be a thing of the past.
CEO compensation won't be tied as much to the company's stock price and the CEO's efforts to
promote its value, but will be influenced more by performance toward true and effective governance.
In contrast with what many businesses experience today, gains from this integrated,
enterprisewide structure will be perpetual. The system will stay in place whether there are
changes in management, the environment, or the economy.
New corporate responsibilities
The CEIO will emerge from the ranks of today's Master Black Belts. Certification for Master
Black Belts will require a new curriculum, one already emerging at future-oriented training
providers. The CEIO will have an equal footing with the chief financial officer, chief information
officer, and corporate or operating officers responsible for marketing, manufacturing, and human
resources.
The CEIO will be the officer in charge of the overall enterprise measurement, analysis, and
change management system. This is because, more than any other company leader, the CEIO
will be best positioned to provide, stimulate, and leverage the agility that businesses will need to
accommodate the accelerating changes that will characterize business survival during the next
two decades.
Tomorrows business environment
Two decades from now, the business environment will bear little resemblance to what we see
today.
Much has been written about opportunities for marketing to the millions of consumers in
emerging markets. Attention should be paid as well to the huge market potential in our own
country: the rich older U.S. citizens whose population is growing rapidly and whose life
expectancy is increasing.
Outsourcing of manufacturing, design, and transactional processes will accelerate. But the CEIO
will also have to anticipate other upcoming challenges: restraints on business that may be placed
by foreign governments, policy changes by the U.S. government to soften outsourcing's
disruptions in our labor market, and the higher cost of shipping goods to the United States due to
the increasing cost of oil.

We'll see exciting new products and services mandated by rising customer expectations and
made possible by analytics-based innovation. There will be speedier cycle times, mass
customization, smaller inventories, and automatic orders and reorders. We can also expect new
ways to enter orders, seamless delivery, and a sharp reduction in time between customer
feedback and satisfactory problem resolution.
Indeed, a wide range of products will have built-in quality and reorder sensors. Even our
dreadfully inadequate air traffic and health care industries will improve. The design for Six
Sigma process will transition to a design of integrated enterprise excellence system. This system
will play a far stronger role in business than it does today. Cost control will become critical as
price transparency increases.
Integrating operations up and down the value chain will result in more challenging supplier
qualification requirements. More attention will be given to green manufacturing, workplace
safety, child labor, animal welfare, and product safety.
Scorecarding will become more sophisticated. Scorecards will allow more penetrating
drill-downs, empowering managers with new ways to improve processes, track the results,
continually fine-tune, and manage mentoring and training. There will be continuing improvements
to scorecards' desktop publishing tools. Using mouse-overs or right-click actions, managers will
learn the source of each metric, how it was calculated, and who owns it. Scorecard formats will
lead to less firefighting and more fire prevention.
Scorecards of the future will allow for easy chart annotation and tracking threaded discussions.
The system will be fully web-enabled for design and use without resorting to software plug-ins
that might be impeded by network firewalls. It will be unnecessary to initiate separate dashboard
projects for different users.
CEIO leadership
Until recently, success in organizational improvements belonged to those who could control and
continuously improve. Now, and increasingly during the next two decades, the future will
include leadership by the CEIO.
==================

II. Quality management tools

1. Check sheet

The check sheet is a form (document) used to collect data
in real time at the location where the data is generated.
The data it captures can be quantitative or qualitative.
When the information is quantitative, the check sheet is
sometimes called a tally sheet.
The defining characteristic of a check sheet is that data
are recorded by making marks ("checks") on it. A typical
check sheet is divided into regions, and marks made in
different regions have different significance. Data are
read by observing the location and number of marks on
the sheet.
Check sheets typically employ a heading that answers the
Five Ws:

Who filled out the check sheet
What was collected (what each check represents, an identifying batch or lot number)
Where the collection took place (facility, room, apparatus)
When the collection took place (hour, shift, day of the week)
Why the data were collected
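A tally-style check sheet can be sketched with Python's standard Counter. The defect names below are invented for illustration:

```python
from collections import Counter

# Each mark ("check") records one observed defect of a given type.
observations = ["scratch", "dent", "scratch", "misalignment", "scratch"]
check_sheet = Counter()
for defect in observations:
    check_sheet[defect] += 1

# Reading the sheet: the number of marks per region.
print(check_sheet["scratch"])  # prints 3
```

The regions of a paper check sheet correspond here to the Counter's keys, and the marks to the counts.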

2. Control chart
Control charts, also known as Shewhart charts
(after Walter A. Shewhart) or process-behavior
charts, in statistical process control are tools used
to determine if a manufacturing or business
process is in a state of statistical control.
If analysis of the control chart indicates that the
process is currently under control (i.e., is stable,
with variation only coming from sources common
to the process), then no corrections or changes to
process control parameters are needed or desired.

In addition, data from the process can be used to
predict the future performance of the process. If
the chart indicates that the monitored process is
not in control, analysis of the chart can help
determine the sources of variation, as this will
result in degraded process performance.[1] A
process that is stable but operating outside of
desired (specification) limits (e.g., scrap rates
may be in statistical control but above desired
limits) needs to be improved through a deliberate
effort to understand the causes of current
performance and fundamentally improve the
process.
The control chart is one of the seven basic tools of
quality control.[3] Typically, control charts are
used for time-series data, though they can also be
used for data that have logical comparability (i.e.,
comparing samples that were all taken at the same
time, or the performance of different individuals);
however, the type of chart used for this requires
consideration.
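A minimal sketch of computing three-sigma control limits for an individuals chart, using only Python's standard library. The measurements are invented, and real individuals charts usually estimate sigma from the average moving range rather than the sample standard deviation used here:

```python
import statistics

# Invented measurements of some process characteristic.
samples = [10.2, 9.8, 10.1, 10.3, 9.9, 10.0, 10.2, 9.7]

center = statistics.mean(samples)
sigma = statistics.stdev(samples)  # simplification; see note above

ucl = center + 3 * sigma  # upper control limit
lcl = center - 3 * sigma  # lower control limit

# Points beyond the limits signal special-cause variation.
out_of_control = [x for x in samples if x > ucl or x < lcl]
```

With this data every point falls inside the limits, so the process would be judged in statistical control and left alone.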

3. Pareto chart

A Pareto chart, named after Vilfredo Pareto, is a type
of chart that contains both bars and a line graph, where
individual values are represented in descending order
by bars, and the cumulative total is represented by the
line.
The left vertical axis is the frequency of occurrence,
but it can alternatively represent cost or another
important unit of measure. The right vertical axis is
the cumulative percentage of the total number of
occurrences, total cost, or total of the particular unit of
measure. Because the reasons are in decreasing order,
the cumulative function is a concave function. For
example, in a chart of reasons for late arrivals, it may
be sufficient to solve only the first three issues to
lower the amount of late arrivals by 78%.
The purpose of the Pareto chart is to highlight the
most important among a (typically large) set of
factors. In quality control, it often represents the most
common sources of defects, the highest occurring type
of defect, or the most frequent reasons for customer
complaints, and so on. Wilkinson (2006) devised an
algorithm for producing statistically based acceptance
limits (similar to confidence intervals) for each bar in
the Pareto chart.
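The descending bars and the cumulative line can be computed in a few lines of Python. The complaint categories and counts below are invented:

```python
# Invented complaint counts for a hypothetical product line.
causes = {"late delivery": 45, "wrong item": 25, "damaged goods": 20, "other": 10}

# Bars: sort causes in descending order of frequency.
ordered = sorted(causes.items(), key=lambda kv: kv[1], reverse=True)
total = sum(causes.values())

# Line: cumulative percentage of the total.
cumulative = []
running = 0
for name, count in ordered:
    running += count
    cumulative.append((name, 100.0 * running / total))
# The first bars cover most of the total: the "vital few".
```

Here the first two causes already account for 70% of complaints, which is the prioritization signal a Pareto chart exists to give.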

4. Scatter plot Method

A scatter plot, scatterplot, or scattergraph is a type of
mathematical diagram using Cartesian coordinates to
display values for two variables for a set of data.
The data is displayed as a collection of points, each
having the value of one variable determining the position
on the horizontal axis and the value of the other variable
determining the position on the vertical axis.[2] This kind
of plot is also called a scatter chart, scattergram, scatter
diagram,[3] or scatter graph.
A scatter plot is used when a variable exists that is under
the control of the experimenter. If a parameter exists that
is systematically incremented and/or decremented by the
other, it is called the control parameter or independent
variable and is customarily plotted along the horizontal
axis. The measured or dependent variable is customarily
plotted along the vertical axis. If no dependent variable
exists, either type of variable can be plotted on either axis
and a scatter plot will illustrate only the degree of
correlation (not causation) between two variables.
A scatter plot can suggest various kinds of correlations
between variables with a certain confidence interval. For
example, to plot weight against height, weight would be
on the x-axis and height on the y-axis. Correlations may be
positive (rising), negative (falling), or null (uncorrelated).
If the pattern of dots slopes from lower left to upper right,
it suggests a positive correlation between the variables
being studied. If the pattern of dots slopes from upper left
to lower right, it suggests a negative correlation. A line of
best fit (alternatively called 'trendline') can be drawn in
order to study the correlation between the variables. An
equation for the correlation between the variables can be
determined by established best-fit procedures. For a linear
correlation, the best-fit procedure is known as linear
regression and is guaranteed to generate a correct solution
in a finite time. No universal best-fit procedure is
guaranteed to generate a correct solution for arbitrary
relationships. A scatter plot is also very useful when we
wish to see how two comparable data sets agree with each
other. In this case, an identity line, i.e., a y = x line, or a
1:1 line, is often drawn as a reference. The more the two
data sets agree, the more the scatters tend to concentrate in
the vicinity of the identity line; if the two data sets are
numerically identical, the scatters fall on the identity line
exactly.
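As a sketch, the closed-form least-squares fit mentioned above can be computed directly. The (x, y) pairs are invented for illustration:

```python
# Invented paired observations.
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 4.0, 6.2, 7.9, 10.1]

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# slope = covariance(x, y) / variance(x); intercept from the means.
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
intercept = mean_y - slope * mean_x

# A positive slope corresponds to a rising (positive) correlation.
```

This is the linear-regression best-fit procedure the text describes: for a linear relationship it always yields the unique least-squares line in a fixed number of arithmetic steps.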

5. Ishikawa diagram
Ishikawa diagrams (also called fishbone diagrams,
herringbone diagrams, cause-and-effect diagrams, or
Fishikawa) are causal diagrams created by Kaoru
Ishikawa (1968) that show the causes of a specific event.
[1][2] Common uses of the Ishikawa diagram are product
design and quality defect prevention, to identify potential
factors causing an overall effect. Each cause or reason for
imperfection is a source of variation. Causes are usually
grouped into major categories to identify these sources of
variation. The categories typically include:
People: Anyone involved with the process
Methods: How the process is performed and the
specific requirements for doing it, such as policies,
procedures, rules, regulations and laws
Machines: Any equipment, computers, tools, etc.
required to accomplish the job
Materials: Raw materials, parts, pens, paper, etc.
used to produce the final product
Measurements: Data generated from the process
that are used to evaluate its quality
Environment: The conditions, such as location,
time, temperature, and culture in which the process
operates
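One way to sketch a fishbone as plain data is to group causes under the six categories above. Every entry here is invented for illustration:

```python
# A fishbone diagram as data: hypothetical causes for a hypothetical effect.
fishbone = {
    "effect": "late shipments",
    "causes": {
        "People": ["untrained packers"],
        "Methods": ["unclear picking procedure"],
        "Machines": ["label printer jams"],
        "Materials": ["cartons out of stock"],
        "Measurements": ["no lead-time tracking"],
        "Environment": ["seasonal demand spikes"],
    },
}

# Every listed cause is a candidate source of variation in the effect.
all_causes = [c for group in fishbone["causes"].values() for c in group]
```

The categories act as prompts during a brainstorming session; the flattened list is what then gets investigated, often ranked with a Pareto chart.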

6. Histogram method

A histogram is a graphical representation of the
distribution of data. It is an estimate of the probability
distribution of a continuous variable (quantitative
variable) and was first introduced by Karl Pearson.[1] To
construct a histogram, the first step is to "bin" the range of
values -- that is, divide the entire range of values into a
series of small intervals -- and then count how many
values fall into each interval. A rectangle is drawn with
height proportional to the count and width equal to the bin
size, so that the rectangles abut each other. A histogram may
also be normalized to display relative frequencies. It then
shows the proportion of cases that fall into each of several
categories, with the sum of the heights equaling 1. The
bins are usually specified as consecutive, non-overlapping
intervals of a variable. The bins (intervals) must be
adjacent, and usually of equal size.[2] The rectangles of a
histogram are drawn so that they touch each other to
indicate that the original variable is continuous.[3]
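The binning-and-counting step described above can be sketched in a few lines; the measurement values are invented:

```python
# Invented measurements to bin.
values = [1.2, 1.9, 2.4, 2.5, 3.1, 3.8, 4.0, 4.7]
low, high, n_bins = 1.0, 5.0, 4
width = (high - low) / n_bins  # equal-size bins over the full range

counts = [0] * n_bins
for v in values:
    # Index of the bin containing v; the top edge falls into the last bin.
    i = min(int((v - low) / width), n_bins - 1)
    counts[i] += 1

# Normalized form: relative frequencies that sum to 1.
relative = [c / len(values) for c in counts]
```

Each count becomes the height of one rectangle; dividing by the total number of values gives the normalized histogram whose heights sum to 1.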

III. Other topics related to Future of quality management (pdf download)
quality management systems
quality management courses
quality management tools
iso 9001 quality management system
quality management process
quality management system example
quality system management
quality management techniques
quality management standards
quality management policy
quality management strategy
quality management books
