
Impacts of Global Warming

In 1995, about two thousand of the world's leading scientists met under the banner of the Intergovernmental Panel on Climate Change (IPCC) and came to the conclusion that global warming was real, serious and accelerating. The panel predicted that the earth's average temperature could rise by a further 1.4 to 5.8 degrees Celsius within the next 100 years. Such a rise could cause serious changes in weather patterns that would endanger our lives, our agriculture, and our plants as well as our animals.

Global warming has the following effects on human society, agriculture, and other plants and animals.

I. Effects of Global Warming on Human Society

Any adverse change in weather patterns, or in the environment as a whole, is bound to affect the lives of human beings adversely. A warmer climate changes the patterns of rainfall and snowfall and increases the frequency and severity of droughts and floods. Warmer air expands and develops a greater capacity to hold moisture; this in turn affects resources such as water, forests and various ecological systems, and with them the conditions for power generation, infrastructure, tourism and healthy living.

Changes in temperature have already contributed to a number of disasters, such as cyclones, hurricanes and other storms, across the world. In a nutshell, if the problem of global warming is not properly averted, the changing climate will damage natural resources while also causing the spread of deadly diseases, displacement of human populations, hunger, economic losses and a shortage of human resources.

Global warming has many kinds of impacts on the earth, its systems and human society as a whole. It may cause frequent natural disasters such as cyclones, storms, hurricanes, floods and droughts, and may thus claim the lives of large sections of human society. It may also cause cloudbursts, avalanches, landslides and mudflows. A very large number of men, women and children are killed in these disasters, and thousands of people become homeless and migrate elsewhere to take shelter as refugees. Global warming is melting ice and glaciers, which is leading to a rise in sea level. As a result, the creeping oceans swallow low-lying islands, coastal areas, and people and their property. Such is the case of Satbhaya village in the Orissa state of India.

According to the Fourth Assessment Report (Brussels, April 2007) of the Intergovernmental Panel on Climate Change (IPCC), global-warming-induced climate change has the following adverse impacts on human health:

• increased malnutrition and consequent disorders, with implications for child growth and development;

• increased deaths, disease and injury due to heat waves, floods, storms, fires and droughts;

• an increased burden of diarrhoeal diseases;

• increased frequency of cardio-respiratory diseases due to higher concentrations of ground-level ozone related to climate change; and

• an altered spatial distribution of the vectors of some diseases.

Global warming has wide-ranging effects on many aspects of human life: it threatens economies, lives and traditions. In the USA, Chicago experienced one of the worst weather-related disasters in Illinois history when a heat wave caused 525 deaths during a five-day period in July 1995. A warmer climate can expand the geographic range of tropical mosquito-borne diseases such as malaria, dengue fever and yellow fever to higher altitudes, and can accelerate the maturation of certain disease-causing agents and their vectors. India was hit by a severe heat wave in 2003: temperatures reached as high as 50 degrees Celsius in May across the worst-hit areas, and over 1,200 people died. Just five years earlier, during April-June 1998, India had suffered its most disastrous heat wave, with an estimated 3,028 fatalities; temperatures rose to between 45 and 49.8 degrees Celsius in several Indian states. Human-caused global warming may already have doubled the chance of killer heat waves like the one that scorched Europe in July-August 2003, the hottest European summer in at least the last 5,000 years.

The rising sea level is causing losses of land, property and lives. It may also cause large-scale displacement of people, creating a further problem of rehabilitation. Rising temperatures due to global warming have melted the ice of a number of glaciers; the total surface of glaciers worldwide has decreased by 50 per cent since the end of the 19th century. According to one report, the snow cap that had covered Mount Kilimanjaro, the highest mountain in Africa, for some 11,000 years, i.e. since the last ice age, was close to disappearing by March 2005. The ice caps that gave the mountain its name of 'the Shining Mountain' are no longer shining. Fascinated by its shining appearance, the famous author Ernest Hemingway wrote his classic 'The Snows of Kilimanjaro'. But about 80 per cent of its snow has vanished since 1912.
Image 1: Mount Kilimanjaro (source: Wikipedia)

II. Effects of Global Warming on Agriculture

Our demand for more and more food is rising day by day due to the increasing human population. The supply of food comes mainly from agriculture, but the rising temperatures and changing climatic conditions caused by global warming are bound to damage it.

Global warming may cause droughts and outbreaks of insects, both of which are damaging to agriculture. Higher temperatures also accelerate the maturation of disease-causing agents and their vectors. All these conditions damage crops and cause crop failure.

Climatic factors such as temperature, wind, relative humidity and rainfall have direct effects on agriculture. Since global warming is changing global climates, adverse changes in agriculture and its production are bound to occur. Changing frequencies of natural disasters such as floods, cyclones, hurricanes, landslides and mudflows tend to wipe out crops, besides causing great losses of life and property. Hail storms, wind storms, and also fog and mist, cause serious damage to crops each year. In many parts of India, farmers hold demonstrations and block roads to demand compensation from the government whenever their crops are damaged for any of these reasons.

In a democratic country the government is formed by the people themselves, and climate change is a global tragedy for which responsibility falls on everyone's shoulders. Yet as long as the climate keeps changing, agriculture will keep being damaged and such protests are bound to follow, because everyone wants to live.

III. Effects of Global Warming on other Plants and Animals

Since all plants and animals depend for survival and development on a favourable climate, suitable habitats, food and appropriate breeding conditions, global warming and the climate change that follows it are sure to have disastrous effects on them. In the 1990s, an outbreak of the spruce bark beetle in south-central Alaska damaged 4 million acres of spruce forests. Animal populations in the Arctic region are declining fast due to rising temperatures across the region; the disproportionate warming there has had detrimental effects on many Arctic species, including the Arctic gull, the Arctic fox and the polar bear. The total area of Arctic sea ice has declined by 6 per cent over the last twenty years. Since 1979, damage to coral reefs and large-scale coral bleaching events have increased considerably.

Global warming has destructive effects on the world's biodiversity. Vast varieties of birds, reptiles, insects, bacteria, fungi, rodents and other organisms have already vanished due to these effects. The golden toads of the mountains of Costa Rica are believed to have gone extinct in recent years, and populations of salamanders and similar species are at the brink of extinction because their embryos do not develop properly at the rising temperatures caused by global warming.

Image 2: Golden Toad of Costa Rica (source: Wikipedia)

In Central and South America many mountain amphibians, including the golden toads, have vanished due to global warming. This worldwide calamity is compounded by habitat loss and alien invasive species, which make its impacts considerably worse. According to the WWF (Worldwide Fund for Nature), with a doubling of carbon dioxide, climate change could eventually destroy 35 per cent of the world's existing terrestrial habitats. Birds' habitats will be altered through changes in sea level, fire regimes, vegetation and land use. Due to a climate anomaly caused by global warming, the population of the emperor penguin declined by 50 per cent during the 1970s, owing to reduced adult survival during a prolonged spell of abnormally warm temperatures and reduced sea ice.

The population of the Siberian crane, a critically endangered species, has been reduced to a few thousand individuals; it demonstrates the vulnerability of a migratory wetland bird to climate change. Some populations of migratory birds have been declining sharply because of unfavourable variations in climatic conditions. The increasing uptake of carbon dioxide by ocean waters due to global warming and related processes is damaging molluscs, whose populations in marine waters have declined sharply.

Image 3: Siberian Crane (source: Wikipedia)

Global warming is damaging various ecosystems, such as mangrove swamps, coral reefs and coastal lagoons, for reasons including the falling pH of ocean water and the increasing deposition of acids. The migration time of spring butterflies in Britain is now earlier than it was 30 years ago. It has been observed that the behaviour of some bird species has changed due to climatic variations in the Indian state of Orissa: birds such as the black-headed oriole and the openbill stork have changed their times of migration, whereas birds such as the bronze-winged jacana and the Indian small skylark have changed their nesting behaviour. The change in climate due to the rise in global temperature is also driving species of wild animals to migrate towards the poles and to higher altitudes. Since those areas are already inhabited by animals of other species, the migrating species receive the status of 'refugee species'.
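The pH scale is logarithmic, so even a small fall in ocean pH implies a large relative increase in acidity. As a rough sketch (the pH values below are illustrative assumptions, not figures from this text):

```python
# pH is the negative base-10 logarithm of hydrogen-ion concentration,
# so a small drop in pH is a large multiplicative rise in acidity.
def acidity_increase(ph_before, ph_after):
    """Fractional increase in [H+] when pH falls from ph_before to ph_after."""
    return 10 ** (ph_before - ph_after) - 1

# Illustrative values only: a fall of 0.1 pH units
print(f"{acidity_increase(8.2, 8.1):.0%}")  # about a 26% rise in [H+]
```

This is why a seemingly small change in ocean pH can matter so much for shell-forming organisms such as molluscs and corals.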

The Intergovernmental Panel on Climate Change (IPCC), in 2001, made the following predictions about the financial losses due to climate change caused by global warming:

(i) The annual loss due to frequent tropical cyclones, rising sea levels causing loss of land, damage to fishing stocks, and losses to agriculture and water supplies could be more than US$300,000 million.

(ii) The water industry is estimated to face extra costs of US$47,000 million annually, at the global level, up to 2050.

(iii) Droughts, floods and fires are estimated to cause losses worth US$42,000 million to agriculture and forestry worldwide if carbon dioxide levels reach twice their pre-industrial concentration.

(iv) The cost of protecting homes, factories and power stations from floods due to rising sea levels and storm surges may reach US$1,000 million annually, at the global level.

(v) Losses to ecosystems including mangroves, swamps, coral reefs and coastal lagoons could amount to more than US$70,000 million by 2050.

Greenhouse Effect, Global Warming and Climate Change

The warming effect produced inside a greenhouse by its glass panels is called the greenhouse effect. A greenhouse is a specialized house constructed at horticulture stations for keeping plants that survive better in warm surroundings.

Greenhouse Effect: How does it occur?

The walls or panels of a greenhouse are made of glass, which allows short-wave solar radiation to pass in but does not allow the long-wave infrared heat energy inside the greenhouse to pass out. Some of the solar radiation absorbed inside the greenhouse is transformed into heat energy in the form of long-wave infrared radiation, which cannot escape through the glass panels. Thus the temperature inside the greenhouse rises above the temperature outside. The term 'greenhouse effect' was coined by J. Fourier in 1827.

Greenhouse Gases (GHGs)

Gases that help to cause the greenhouse effect are called greenhouse gases (GHGs). These gases either occur naturally or are produced on earth by human activities such as the burning of fossil fuels and biomass.

One of the most abundant naturally occurring greenhouse gases is water vapour. Other greenhouse gases include carbon dioxide, methane, nitrous oxide, trifluoromethyl sulphur pentafluoride and hydrochlorofluorocarbons. Trifluoromethyl sulphur pentafluoride is an industrially produced chemical compound; it is a potent heat-trapping compound that was discovered only around the year 2000. A substantial increase in the concentration of greenhouse gases in the atmosphere has occurred since the 1700s.

Importance of the Greenhouse Effect

Our earth is subject to a greenhouse effect, which is very important for creating a climate favourable to sustaining most forms of life on it. In this context, the greenhouse effect can be defined as follows:

The warming and insulation of the earth caused by heat-trapping gases that accumulate in the atmosphere after being emitted from the earth's surface is called the greenhouse effect.

The natural greenhouse effect is, in fact, a process of thermal blanketing of the earth that keeps its surface about 33 degrees Celsius warmer than it would otherwise be, helping to sustain life. Without the greenhouse effect, the earth's climate would become too cold for most life to survive, with temperatures falling far below the level essential for the existence of life. Hence the greenhouse effect is an important natural process, essential for the survival of life on this planet.
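The roughly 33-degree figure comes from a standard energy-balance estimate: without greenhouse gases the earth would radiate at an effective temperature near 255 K (about -18 degrees Celsius), while the observed mean surface temperature is near 288 K (about 15 degrees Celsius). A sketch of that textbook calculation, using standard constants that are not given in the text above:

```python
SOLAR_CONSTANT = 1361.0  # W/m^2 of sunlight at the top of the atmosphere
ALBEDO = 0.30            # fraction of sunlight reflected straight back to space
SIGMA = 5.670e-8         # Stefan-Boltzmann constant, W/m^2/K^4

# Sunlight absorbed per square metre, averaged over the whole sphere
absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4

# Effective radiating temperature with no greenhouse effect at all
t_no_greenhouse = (absorbed / SIGMA) ** 0.25  # about 255 K

T_SURFACE = 288.0  # observed global mean surface temperature, K
print(round(t_no_greenhouse), round(T_SURFACE - t_no_greenhouse))  # 255 33
```

The difference between the observed surface temperature and this no-greenhouse estimate is the natural greenhouse warming of about 33 degrees.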

Global Warming
Human activities that pollute are modifying the natural greenhouse effect. The advent of the Industrial Revolution in the 1700s boosted the burning of fossil fuels such as coal, oil and natural gas, which released large quantities of heat-absorbing gases into the atmosphere. The clearing of land for agriculture and urban settlement wiped out vegetation that had acted as an ecological sink for some of those gases, such as carbon dioxide.

The heat-absorbing gases accumulate in the atmosphere around the earth, extending up to about 100 km above its surface, and act like the glass panels of a greenhouse. They allow much of the short-wave solar radiation to reach the earth's surface but stop much of the long-wave infrared radiation from escaping as heat; they absorb this infrared radiation and re-radiate most of it back towards the surface. Thus the temperature of the atmosphere rises gradually, causing an unnatural heating effect called global warming. Global warming is the enhanced greenhouse effect due to the greater accumulation of GHGs in the atmosphere.

The warming of the earth's atmosphere attributed to a build-up of greenhouse gases in high concentrations is called global warming. The term refers to the observed increase in the average temperature of the earth's atmosphere and oceans in recent decades. Scientific records show that, of the last 100 years, the warmest period has been the last 50; the global mean temperature has increased by about 0.5 to 1 degree Celsius over the last 100 years.

Global Warming and Climate Change

The regular pattern of weather conditions at a particular place is called the climate of that place. This regular pattern is considerably disturbed by fluctuations in temperature, and such a disturbance in the pattern of weather conditions at a particular place may rightly be called climate change.

Global warming is not itself climate change, as its effects may not be uniformly negative; rather, it is the abnormal rise in global temperature that is causing changes in the global climate. Let us see how abnormal changes in temperature can lead to climate change.

Temperature plays a significant role in regulating the water cycle in the environment, so a rise in global temperature can change the pattern of the water cycle. Increased temperatures can also melt much of the world's ice, and the increased evaporation of water at high temperatures can alter the pattern of cloud formation and rainfall in different places. The physical features of the earth also play important roles in causing temperature variations, which in turn produce variations in air pressure; these variations cause disastrous conditions such as storms, cyclones, tornadoes and hurricanes.

International agencies studying climate change have projected that the globally averaged temperature will increase by 1.4 to 5.8 degrees Celsius over the period 1990 to 2100. The Intergovernmental Panel on Climate Change (IPCC) carries out important climate research and surveys on a periodic basis; hundreds of scientists from many different countries study and analyse meteorological changes and provide a collective picture of global warming and other changes in the climate system.

Impacts of Global Warming

Global warming has many kinds of impacts on the whole earth and its systems. Some of the major impacts are mentioned below.

• Global warming may cause frequent natural disasters such as cyclones, storms, hurricanes, floods and droughts. It may also cause cloudbursts, avalanches, landslides and mudflows.

• Global warming is melting ice and glaciers, which is leading to a rise in sea level. As a result, the creeping oceans swallow low-lying islands and coastal areas. The rising sea level is causing losses of land, property and lives; it may also cause large-scale displacement of people, creating a further problem of rehabilitation.

• It is damaging forests, agriculture and water supplies.

• It is damaging various ecosystems, such as mangrove swamps, coral reefs and coastal lagoons, for reasons including the falling pH of ocean water and the increasing deposition of acids.

• Some populations of migratory birds have been declining because of unfavourable variations in climatic conditions, while the migration time of spring butterflies in Britain is now earlier than it was 30 years ago. It has been observed that the behaviour of some bird species has changed due to climatic variations in some Indian states: birds such as the black-headed oriole and the openbill stork have changed their times of migration, whereas birds such as the bronze-winged jacana and the Indian small skylark have changed their nesting behaviour.

Image 1: Black-headed oriole

Image 2: Openbill stork

Image 3: Bronze-winged jacana
(courtesy: all images - Flickr)

• The change in climate due to the rise in global temperature is causing species of wild animals to migrate towards the poles and higher altitudes. Since those areas are already inhabited by animals of other species, the migrating species receive the status of 'refugee species'.

Problems in the urban environment

According to one assessment, about 600 million people inhabited the urban areas of the world in 1950, but urban areas are now recorded as housing about half of the world's population. At current rates of population growth, it has been assessed that the population of cities may double within the coming 28 years. A rise in urban population at current rates is feared to raise the consumption of resources in the same proportion; since resources are finite, crises in all walks of life are bound to follow, making lives more and more miserable. Some of the major problems of urban areas are associated with pollution of air, water and land; housing and congestion; land use; waste disposal; and common social facilities.
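The 28-year doubling figure implies a definite annual growth rate, via the standard exponential-growth relation (the 28 years is the text's figure; the formula is the usual one):

```python
import math

DOUBLING_YEARS = 28  # doubling time assessed in the text

# If population grows as P(t) = P0 * exp(r * t), then doubling in
# T years means exp(r * T) = 2, i.e. r = ln(2) / T.
annual_rate = math.log(2) / DOUBLING_YEARS
print(f"about {annual_rate:.1%} growth per year")  # about 2.5% per year
```

So a doubling every 28 years corresponds to urban populations growing at roughly 2.5 per cent per year, compounded.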

Problems of pollution of air, water and land

Since cities are heavily populated areas with a large number of workshops, industrial units and the like, they suffer severe pollution of air, water and land. The various gases and fumes that rise into the atmosphere from fuel-burning sources in municipal areas, the transport sector and industry cause severe air pollution. The principal air pollutants in urban areas comprise dust particles, particulates of various compounds, smoke, and gases such as carbon dioxide, nitrogen oxides, sulphur oxides and chlorofluorocarbons. SO2, NO2, H2S, CO2 and similar gases can form acids on combining with water vapour in sunlight; these acids are then deposited on dust particles in the atmosphere and on other surfaces, a process called acid deposition.
Acids in the atmosphere are formed by the combination of acid-forming gases (sulphur dioxide, hydrogen sulphide, nitrogen oxides, etc.) with water vapour in the presence of sunlight. When these acids come down to earth in rain, the process is known as acid rain.

The accumulation of heat-absorbing gases such as carbon dioxide (CO2) and methane (CH4) in the atmosphere causes the greenhouse effect. The layer of heat-absorbing gases allows the sun's incoming radiation to pass through but does not allow the heat of the earth's atmosphere to escape, causing an increase in the temperature of the earth; this condition is called the greenhouse effect. When the concentration of greenhouse gases (GHGs) rises to a much greater extent, it causes a considerable rise in the earth's temperature, and this condition is called global warming. A rise of about 26% in carbon dioxide has already been recorded over a period of 200 years. Global warming is changing global climates and melting the polar ice caps, which is contributing to the rise in sea level; carbon dioxide has been responsible for a gradual rise in the global temperature.
When pollutants such as chlorofluorocarbons (CFCs), aerosols, carbon tetrachloride and chlorine accumulate in the stratosphere, they react with the ozone of the ozone layer found there, which serves as a protective umbrella for the earth. The ozone layer is called the earth's protective umbrella because it does not allow the ultraviolet radiation in sunlight to cross it and reach the earth. Ultraviolet radiation, if allowed to reach the earth, can cause skin cancer, cataracts, hereditary diseases, loss of marine life and loss of terrestrial plants.

Chlorofluorocarbons are a family of synthetic chemicals that are compounds of the elements chlorine, fluorine and carbon. CFCs are stable, non-flammable, non-corrosive, relatively non-toxic chemicals that are easy and inexpensive to produce. During the 1970s, scientists linked CFCs to the destruction of the earth's ozone layer, and the manufacture of CFCs has since been banned in most countries. These compounds are typically released into the atmosphere from pressurised aerosol cans. In the stratosphere they react with the ozone of the ozone layer, thinning it and tending to cause a hole in the ozone belt. Third-world cities face a number of problems due to air pollution; it has been reported that about 60% of the people of Kolkata, India, suffer from pollution-related respiratory diseases.

Water pollution has emerged as a major issue in urban areas over the years; water pollution and water scarcity are major problems faced by many cities and towns of the world. Municipal wastes, hospital wastes, industrial wastes, and wastes from slaughterhouses and pathology centres are routinely discharged into water sources such as lakes, ponds and oceans. Surface runoff from mining areas joins water bodies and contaminates their waters with a number of pollutants; the surface runoff from the Bachra and North Karanpura areas of the Ranchi district of the Jharkhand state of India has been reported to contain cyanide and arsenic, which join the river Damodar. A number of closed water bodies in different areas of India are undergoing eutrophication. Water pollution in third-world countries has been reported to be worse than air pollution. India has 3,120 towns, and only about 210 of them had sewage treatment plants by the 1980s.
Varieties of solid and liquid wastes are routinely produced in huge quantities in urban areas, and money, energy and labour are required to dispose of them. Much of this waste is disposed of in rivers, streams and lakes as untreated sewage. Industries produce large amounts of toxic waste in the form of gases, solids and liquids alike. Wastes from municipal and industrial areas mix with water, where they cause serious water pollution: the Ganges, the holy river, is severely polluted by the dumping of sewage from 114 towns and cities, and industrial effluents joining our water bodies contaminate the water with heavy metals, which produce toxic effects.

Cities generate a great deal of solid waste, which is often dumped here and there on street corners and even in open fields, polluting the land. The decomposable matter in these wastes acts as a fertile breeding ground for various diseases. During the rains these wastes contaminate nearby ponds and pools, contributing to water pollution; during the summers, fumes and fine particles from the decomposing wastes are blown towards human habitations, causing different types of infectious diseases.

Problem of congestion

Urban areas are usually densely populated, and people migrating from villages to cities further increase the population burden. Urban areas therefore suffer serious congestion and traffic problems. All city dwellers need houses to live in, but houses and land are limited. Wetlands and agricultural fields in suburban areas are purchased at high prices by developers for building apartments, and most of the cultivable land is being used for constructing houses; thus urban areas are expanding on one hand while rents rise on the other. Labourers working in factories and other people doing small business occupy government land, pavements, parks, monuments and the like and start living there. Jhuggi-jhopri clusters are mushrooming around every city; these are called slums. Slums are very unhygienic places, as they lack proper civic amenities.

Many developing countries are clearing their forests to construct dwelling units, airports, bus stations and the like; thus the problem of housing is very acute in cities. The United Nations observed World Habitat Day on October 6, 2004, and asked the world community to solve the housing problem of slums in their countries. The quality of housing in urban India improved from 1981 to 1991: the share of pucca houses in urban areas increased from 64.7% to 73.1% over this period. The National Building Organisation has, however, estimated that there was a housing shortage of 75.7 lakh units during 1997.

The Government of India formulated a comprehensive Housing & Habitat Policy in August 1998, with the long-term goals of reducing homelessness, improving the conditions of the inadequately housed, and providing a minimum level of basic services and amenities to all.
The National Co-operative Housing Federation of India (NCHF), the national apex organization set up in 1969, spearheads the entire co-operative housing movement in India; it promotes, guides and co-ordinates the activities of housing co-operatives. Besides this, Hindustan Prefab Limited and the Housing and Urban Development Corporation Limited are among the organizations working to solve the housing problem in urban areas.

Problem of open space

The total surface area of the earth is about 51,000 million hectares, of which only 29.2% is land and the remaining 70.8% is under water. About 30% of the land surface is of little use, consisting of marshes, swamps, deserts and steep mountains. India covers an area of 3,287,263 sq km (as of 31 March 1982), extending from the snow-covered Himalayan heights to the tropical rain forests of the south.
With the expansion of cities, more and more land is going under the construction of buildings, flyovers, hospitals, railway stations, international aerodromes and so on, putting the cultivable green lands around the cities under severe stress. Many developing countries are clearing their forests to grow cash crops and rear cattle for meat production. Many cities in India are unplanned: they have neither proper playgrounds nor parks and green belts. These cities have become mere jungles of concrete, and rainwater cannot percolate into the ground to recharge the water table, so most cities face a water crisis during the greater part of the year. The water supply for such cities is maintained by bringing water through pipelines from a distant river, usually crossing some other state. This is worrying on two grounds: first, it often creates interstate disputes, and second, it puts stress on the resources of other people.

Problem of waste disposal

Urban areas face great problems of waste disposal. Cities generate large amounts of wastes of different types; solid wastes from municipal and industrial sources contain a large number of ingredients, some of which are toxic in nature.

The management of solid waste is one of the essential services to be provided to urban people by municipalities or corporations. It is an important and regular activity, and hence it should be planned and executed properly so as to maintain a clean environment.

The disposal of solid waste from cities is an expensive activity involving money and manpower. Hence the practices of recycling, reuse and composting should be adopted to reduce the load of solid waste in urban areas. Governments are trying to solve the urban solid-waste problem with environmental management policies that include waste management and urban planning, and by making Environmental Impact Assessment compulsory for large projects. Some of the major ingredients of solid waste are biodegradable matter, green coconut shells, paper, plastics, metals, glass and ceramics, coal, inert materials and others.

Problem of common social facilities

Human beings need facilities to make their lives comfortable on all fronts, and more people require more facilities. But facilities are limited, so there can be a great rush and competition to avail of them, placing them under considerable stress. Roads, parks, playgrounds, water supply, community wells, community halls, community hand pumps, schools and the like are some of the social facilities that people use regularly. But these social facilities are neither managed properly nor taken care of by the local public. Roads often remain occupied by demonstrators of political parties or by processions, and many times school buses are caught in traffic jams, causing agony to students, parents, principals and teachers.

Grounds where people go for morning walks are often occupied by circuses, exhibitions, fairs
and similar activities. The members of the management groups and the workers of these
activities often damage the entire landscape and leave without repairing it. Political parties
organize rallies and meetings on these grounds and leave behind banners, posters and
different types of garbage in the open. Even parks in urban areas are often occupied by
beggars and criminals. People without civic sense throw garbage into community wells,
ponds and lakes. Wherever we look, we see the abuse of social facilities by the careless
public. All these problems are caused by the explosion of population and the lack of a sense
of social responsibility among citizens.

Six Sigma

Six Sigma is a business management strategy originally developed by Motorola, USA in
1986. As of 2010, it is widely used in many sectors of industry, although its use is not
without controversy.

Six Sigma seeks to improve the quality of process outputs by identifying and removing the
causes of defects (errors) and minimizing variability in manufacturing and business
processes. It uses a set of quality management methods, including statistical methods, and
creates a special infrastructure of people within the organization ("Black Belts", "Green
Belts", etc.) who are experts in these methods. Each Six Sigma project carried out within an
organization follows a defined sequence of steps and has quantified financial targets (cost
reduction or profit increase).

The term Six Sigma originated from terminology associated with manufacturing, specifically
terms associated with statistical modelling of manufacturing processes. The maturity of a
manufacturing process can be described by a sigma rating indicating its yield, or the
percentage of defect-free products it creates. A six sigma process is one in which 99.99966%
of the products manufactured are statistically expected to be free of defects (3.4 defects per
million opportunities). Motorola set a goal of "six sigma" for all of its manufacturing
operations, and this goal became a byword for the management and engineering practices
used to achieve it.
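
The sigma-to-yield relationship described above can be checked numerically. This is a brief
sketch; the 1.5-sigma long-term shift and the one-sided tail it uses are the standard Six Sigma
conventions rather than something stated in this text:

```python
from math import erf, sqrt

def dpmo(sigma_level, shift=1.5):
    """Defects per million opportunities for a given sigma level.

    Applies the conventional 1.5-sigma long-term shift and counts only
    the one-sided upper tail, as in the usual Six Sigma tables.
    """
    z = sigma_level - shift
    tail = 0.5 * (1 - erf(z / sqrt(2)))  # P(X > z) for a standard normal
    return tail * 1_000_000

print(round(dpmo(6), 1))  # ~3.4 defects per million, i.e. 99.99966% yield
print(round(dpmo(3)))     # ~66807 defects per million at three sigma
```

At six sigma the shifted tail sits at z = 4.5, which is where the famous 3.4 DPMO figure
comes from.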

Lean manufacturing

Lean manufacturing or lean production, often simply, "Lean," is a production practice that
considers the expenditure of resources for any goal other than the creation of value for the
end customer to be wasteful, and thus a target for elimination. Working from the perspective
of the customer who consumes a product or service, "value" is defined as any action or
process that a customer would be willing to pay for. Basically, lean is centered on preserving
value with less work. Lean manufacturing is a management philosophy derived mostly from
the Toyota Production System (TPS) (hence the term Toyotism is also prevalent) and
identified as "Lean" only in the 1990s. It is renowned for its focus on reduction of the original
Toyota seven wastes to improve overall customer value, but there are varying perspectives on
how this is best achieved. The steady growth of Toyota, from a small company to the world's
largest automaker, has focused attention on how it has achieved this.

Lean manufacturing is a variation on the theme of efficiency based on optimizing flow; it is a
present-day instance of the recurring theme in human history toward increasing efficiency,
decreasing waste, and using empirical methods to decide what matters, rather than
uncritically accepting pre-existing ideas. As such, it is a chapter in the larger narrative that
also includes such ideas as the folk wisdom of thrift, time and motion study, Taylorism, the
Efficiency Movement, and Fordism. Lean manufacturing is often seen as a more refined
version of earlier efficiency efforts, building upon the work of earlier leaders such as Taylor
or Ford, and learning from their mistakes.

Single-Minute Exchange of Die

Single-Minute Exchange of Die (SMED) is one of the many lean production methods for
reducing waste in a manufacturing process. It provides a rapid and efficient way of
converting a manufacturing process from running the current product to running the next
product. This rapid changeover is key to reducing production lot sizes and thereby improving
flow (Mura).

The phrase "single minute" does not mean that all changeovers and startups should take only
one minute, but that they should take less than 10 minutes (in other words, "single-digit
minute"). Closely associated is a yet more difficult concept, One-Touch Exchange of Die,
(OTED), which says changeovers can and should take less than 100 seconds.

The concept arose in the late 1950s and early 1960s, when Shigeo Shingo was consulting to
a variety of companies, including Toyota, and was contemplating their inability to eliminate
bottlenecks at car body-moulding presses. The bottlenecks were caused by long tool
changeover times, which drove up production lot sizes. The economic lot size is calculated
from the ratio of actual production time to the changeover time, which is the time taken to
stop production of a product and start production of the same, or another, product. If
changeover takes a long time, the lost production due to changeovers drives up the cost of the
actual production itself. This can be seen from the table below, where the changeover and
processing time per unit are held constant whilst the lot size is changed. The operation time
is the unit processing time with the overhead of the changeover included. The ratio is the
percentage increase in effective operating time caused by the changeover.
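
The calculation behind such a table can be sketched as follows; the changeover time
(480 minutes) and processing time (1 minute per unit) are illustrative assumptions, not
figures from the text:

```python
# Assumed figures for illustration: 480 min changeover, 1 min/unit processing.
changeover_min = 480.0
process_min = 1.0

print(f"{'Lot size':>9} {'Op. time/unit (min)':>20} {'Ratio':>8}")
for lot in (100, 1_000, 10_000):
    # Operation time: unit processing time plus the changeover overhead
    # spread over the whole lot.
    op_time = process_min + changeover_min / lot
    # Ratio: percentage increase in effective operating time caused by
    # the changeover.
    ratio = (changeover_min / lot) / process_min * 100
    print(f"{lot:>9} {op_time:>20.3f} {ratio:>7.1f}%")
```

With these numbers, a lot of 100 units carries a 480% overhead per unit, while a lot of
10,000 units carries only 4.8%, which is exactly why long changeovers push lot sizes up.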

Just-in-time

Just-in-time (JIT) is an inventory strategy that strives to improve a business's return on
investment by reducing in-process inventory and associated carrying costs. The just-in-time
production method is also called the Toyota Production System. To meet JIT objectives, the
process relies on signals or Kanban between different points in the process, which tell
production when to make the next part. Kanban are usually 'tickets' but can be simple visual
signals, such as the presence or absence of a part on a shelf. Implemented correctly, JIT
focuses on continuous improvement and can improve a manufacturing organization's return
on investment, quality, and efficiency. To achieve continuous improvement key areas of
focus could be flow, employee involvement and quality.
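
The kanban signalling idea can be sketched in a few lines. This two-bin model and the class
name are purely illustrative assumptions, not taken from any particular implementation:

```python
# Minimal sketch of a kanban pull signal: consuming the last part from a
# bin emits a replenishment card, which tells upstream production to make
# the next lot. (Illustrative model; real kanban loops use multiple cards.)
class KanbanBin:
    def __init__(self, capacity):
        self.capacity = capacity
        self.parts = capacity
        self.cards = []            # outstanding replenishment signals

    def consume(self):
        if self.parts == 0:
            raise RuntimeError("stockout: downstream is starved")
        self.parts -= 1
        if self.parts == 0:        # the empty bin is the visual signal
            self.cards.append("replenish")
        return "part"

    def replenish(self):
        if self.cards:
            self.cards.pop()
            self.parts = self.capacity

bin_ = KanbanBin(capacity=2)
bin_.consume(); bin_.consume()
print(bin_.cards)   # ['replenish'] -- the signal to produce the next lot
bin_.replenish()
print(bin_.parts)   # 2
```

The point is that production is triggered by actual consumption downstream, not by a
forecast pushed from upstream.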

The quick notice that stock depletion requires personnel to order new stock is critical to the
inventory reduction at the center of JIT. This saves warehouse space and costs. However, the
complete mechanism for making this work is often misunderstood.

For instance, its effective application cannot be independent of other key components of a
lean manufacturing system or it can "...end up with the opposite of the desired result." In
recent years manufacturers have continued to try to hone forecasting methods, such as
applying a trailing 13-week average as a better predictor for JIT planning; however, some
research demonstrates that basing JIT on the presumption of stability is inherently flawed.
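
The trailing 13-week average mentioned above is straightforward to compute; the weekly
demand figures below are invented for illustration:

```python
# Trailing window average as a simple demand predictor for JIT planning.
def trailing_average(series, window=13):
    """Average of the most recent `window` observations."""
    if len(series) < window:
        raise ValueError("need at least one full window of history")
    return sum(series[-window:]) / window

# Invented weekly demand history (units per week).
weekly_demand = [120, 135, 128, 110, 140, 150, 125, 130,
                 145, 138, 122, 133, 141, 129]
print(round(trailing_average(weekly_demand), 1))  # -> 132.8
```

Note that any such average presumes a reasonably stable demand pattern, which is precisely
the assumption the cited research calls into question.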

Logistics

Logistics is the management of the flow of goods, information and other resources, including
in a repair cycle, between the point of origin and the point of consumption in order to meet
the requirements of customers. Logistics involves the integration of information,
transportation, inventory, warehousing, material handling, and packaging, and occasionally
security. Logistics is a channel of the supply chain which adds the value of time and place
utility. Today the complexity of production logistics can be modeled, analyzed, visualized
and optimized by plant simulation software.

Business logistics
Logistics as a business concept evolved in the 1950s due to the increasing complexity of
supplying businesses with materials and shipping out products in an increasingly globalized
supply chain, leading to a call for experts called supply chain logisticians. Business logistics
can be defined as "having the right item in the right quantity at the right time at the right
place for the right price in the right condition to the right customer", and is the science of
process and incorporates all industry sectors. The goal of logistics work is to manage the
fruition of project life cycles, supply chains and resultant efficiencies.

In business, logistics may have either internal focus (inbound logistics), or external focus
(outbound logistics) covering the flow and storage of materials from point of origin to point
of consumption (see supply chain management). The main functions of a qualified logistician
include inventory management, purchasing, transportation, warehousing, consultation and the
organizing and planning of these activities. Logisticians combine a professional knowledge of
each of these functions to coordinate resources in an organization. There are two
fundamentally different forms of logistics: one optimizes a steady flow of material through a
network of transport links and storage nodes; the other coordinates a sequence of resources to
carry out some project.

Production logistics

The term production logistics is used to describe logistic processes within an industry. The
purpose of production logistics is to ensure that each machine and workstation is being fed
with the right product in the right quantity and quality at the right time. The concern is not the
transportation itself, but to streamline and control the flow through value-adding processes
and eliminate non–value-adding ones. Production logistics can be applied to existing as well
as new plants. Manufacturing in an existing plant is a constantly changing process. Machines
are exchanged and new ones added, which gives the opportunity to improve the production
logistics system accordingly. Production logistics provides the means to achieve customer
response and capital efficiency.

Production logistics is becoming more important with decreasing batch sizes. In many
industries (e.g. mobile phones), a batch size of one is the short-term aim, allowing even a
single customer's demand to be fulfilled efficiently. Track and tracing, which is an essential
part of production logistics—due to product safety and product reliability issues—is also
gaining importance, especially in the automotive and medical industries.

Logistics Management Software


Logistics management is the part of supply chain management that plans, implements, and
controls the efficient, effective, forward and reverse flow and storage of goods, services, and
related information between the point of origin and the point of consumption in order to meet
customers' requirements.

Software is used for logistics automation, which helps the supply chain industry automate the
workflow as well as the management of the system. Very few generalized software products
are available in this domain, because there is no rule for generalizing the system or the
workflow, even though the practice is more or less the same. Most commercial companies
use one or another custom solution.

There are, however, various software solutions in use within the departments of logistics.
There are a few departments in logistics, namely: Conventional Department, Container
Department, Warehouse, Marine Engineering, Heavy Haulage, etc.

Software used in these departments:

 Conventional department: CVT software / CTMS software
 Container Trucking: CTMS software
 Warehouse: WMS / WCS

Improving Effectiveness of Logistics Management:

1. Logistical Network
2. Information
3. Transportation
4. Sound Inventory Management
5. Warehousing, Materials Handling & Packaging
Third-party logistics

A third-party logistics provider (abbreviated 3PL, or sometimes TPL) is a firm that
provides a one-stop-shop service to its customers for outsourced (or "third-party") logistics
services covering part, or all, of their supply chain management functions.

Third-party logistics providers typically specialize in integrated operation, warehousing
and transportation services that can be scaled and customized to customers' needs based on
market conditions and the demands and delivery service requirements for their products and
materials.

Definition
To put forward some standard definitions, we would adopt the definition of 3PL found in the
Council of Supply Chain Management Professionals’ glossary, which reads as follows:

"A firm [that] provides multiple logistics services for use by customers. Preferably, these
services are integrated, or "bundled" together, by the provider. Among the services 3PLs
provide are transportation, warehousing, cross-docking, inventory management, packaging,
and freight forwarding."

Types of 3PL
Third-party logistics providers

 freight forwarders
 courier companies
 other companies integrating & offering subcontracted logistics and transportation services

Hertz and Alfredsson (2003) describe four categories of 3PL providers:

 Standard 3PL provider: this is the most basic form of a 3PL provider. They would perform
activities such as, pick and pack, warehousing, and distribution (business) – the most basic
functions of logistics. For a majority of these firms, the 3PL function is not their main activity.
 Service developer: this type of 3PL provider will offer their customers advanced value-added
services such as: tracking and tracing, cross-docking, specific packaging, or providing a
unique security system. A solid IT foundation and a focus on economies of scale and scope
will enable this type of 3PL provider to perform these types of tasks.
 The customer adapter: this type of 3PL provider comes in at the request of the customer and
essentially takes over complete control of the company’s logistics activities. The 3PL provider
improves the logistics dramatically but does not develop a new service. The customer base for
this type of 3PL provider is typically quite small.
 The customer developer: this is the highest level that a 3PL provider can attain with respect
to its processes and activities. This occurs when the 3PL provider integrates itself with the
customer and takes over their entire logistics function. These providers will have few
customers, but will perform extensive and detailed tasks for them.
Non Asset-based Logistics Providers
Advancements in technology and the associated increases in supply chain visibility and inter-
company communications have given rise to a relatively new model for third-party logistics
operations – the “non-asset based logistics provider.” Non-asset based providers perform
functions such as consultation on packaging and transportation, freight quoting, financial
settlement, auditing, tracking, customer service and issue resolution. However, they don’t
employ any truck drivers or warehouse personnel, and they don’t own any physical freight
distribution assets of their own – no trucks, no storage trailers, no pallets, and no
warehousing. A non-asset based provider consists of a team of domain experts with
accumulated freight industry expertise and information technology assets. They fill a role
similar to freight agents or brokers, but maintain a significantly greater degree of “hands on”
involvement in the transportation of products.

To be useful, providers such as Choice Logistics, Diversified Transportation Services (DTS)
or CaseStack must show their customers a benefit in financial and operational terms by
leveraging exceptional expertise and ability in the areas of operations, negotiations, and
customer service in a way that complements their customers' preexisting physical assets.

On-Demand Transportation
On-Demand Transportation is a relatively new term coined by 3PL providers to describe their
brokerage, ad-hoc, and "flyer" service offerings.

On-Demand Transportation has become a mandatory capability for today's successful 3PL
providers in offering client specific solutions to supply chain needs.

These shipments do not usually move under the "lowest rate wins" scenario and can be very
profitable to the 3PL that wins the business. The costs quoted to customers for On-Demand
services are based on specific circumstances and availability and can differ greatly from
normal "published" rates.

On-Demand Transportation is a niche that continues to grow and evolve within the 3PL
industry.

Specific modes of transport which may be subject to the on-demand model include (but are
not limited to) the following:

 FTL, or Full Truck Load
 Hotshot (direct, exclusive courier)
 Next Flight Out, sometimes also referred to as Best Flight Out (commercial airline shipping)
 International Expedited

Terminology
In the "PL" terminology, it is important to differentiate the 3PL from the:

 1PL, which are the shipper or the consignee,
 2PL, which are actual carriers such as YRC Worldwide, UPS, FedEx,
 4PL, which are consulting firms such as CPCS, SCMO, BMT, Deloitte, and Accenture.

Fourth-party logistics

A Fourth-party logistics provider (abbreviated 4PL), lead logistics provider, or 4th Party
Logistics provider, is a consulting firm specialized in logistics, transportation, and supply
chain management. Typical fourth-party logistics providers are CPCS, SCMO, BMT,
Deloitte, Capgemini, 3t Europe and Accenture.

As the 4PL industry is still in its infancy and currently being created throughout the world
(Blue Ocean Strategy), its definition and function still leads to a lot of confusion, even for
professionals of the transportation industry.

History
The term 4PL is generally considered to have been introduced by Accenture, which
registered it as a trademark in 1996. Accenture described the 4PL as an "integrator that
assembles the resources, capabilities, and technology of its own organization and other
organizations to design, build and run comprehensive supply chain solutions".

The trademark was later abandoned, and the term has become a part of the public domain.

Definition
A fourth-party logistics provider is an independent, singularly accountable, non-asset based
integrator of a client's supply and demand chains.

Conflict of interest
To avoid any conflict of interest, it is important that this fourth-party logistics provider be
non-asset based, as far as logistics, transportation, and supply chain management assets are
concerned. 4PLs use 2PLs and/or 3PLs to supply services to customers, owning only
computer systems and intellectual capital.

Confusion
Nowadays advisors, consultants, software companies and even 3PLs lay claim to being a
4PL. This is because any company advising a customer on logistics, transportation, and
supply chain matters feels it may somehow claim to be a 4PL. This is effectively the case
only when the principle of neutrality is respected and any conflict of interest is avoided.

A fourth-party logistics provider must also offer services from a 360-degree view, focused
not on its ability to implement the recommendations it gives but on all the options available
in the market.
Principle of neutrality
Thus, an IT consulting firm specialized in WMS (Warehouse Management Systems) that
objectively considers all the various WMS products on the market is a 4PL. It obviously may
not represent any WMS brand or any software company; otherwise the concept of neutrality
is broken, leading to a conflict of interest.

Similarly, a non-asset based consulting firm specialized in logistics, transportation, and
supply chain management may claim it is a 4PL. This is effectively the case if it does not own
warehouses, logistics platforms, vans, trucks, ships, barges, planes, a freight forwarder, or a
courier company; otherwise it would lead to a conflict of interest.

It has sometimes been argued that a 4PL is the same thing as a "non-asset based 3PL". This is
not the case. Although probably 90% of the world's 3PLs are "non-asset based", they
nevertheless generate revenues and profits from their "non-asset based" activities. As such, a
3PL cannot be a 4PL at the same time, as this would lead to a conflict of interest: it would
then have a tendency to recommend its own "non-asset based" operation to customers as the
best possible option.

Examples of 4PL
The best examples of fourth-party logistics providers are "non-asset based" consulting firms
exclusively specialized in logistics, transportation, and supply chain management such as
SCMO, BMT Limited, MVA Consulting, TTR, Intermodality, CPCS, and 3t-Europe, which
offer complete ranges of services, from strategy to implementation.

Others are more generalist consulting firms such as the Big Four auditors (Deloitte,
PricewaterhouseCoopers, Ernst & Young, and KPMG), as well as Accenture, Capgemini,
Arup, Atkins (company), Mott MacDonald, Parsons Brinckerhoff, and AECOM.

Other firms such as McKinsey & Company, Bain & Company, A.T. Kearney, the Boston
Consulting Group, and Booz & Company, may also play the role of 4PL with a different
value proposition, though they are generally considered "pure strategy" firms.

Overlapping
The following often call themselves 4PLs, advisors, or consultants:

 freight forwarders, who tell their customers they will advise them on the best possible
solution (within the frame of their own operations),
 warehouse operators and logistics platform operators, who tell their customers they will
advise them on the best possible solution (within the frame of their own operations).

3PL vs. 4PL


A 4PL is a consultant, and cannot be an operator. This is to respect the principle of neutrality.
A 3PL is an operator, which specializes in integrated operation, warehousing and
transportation services. These services may be 100% outsourced, as in the case of "non-asset
based 3PL". It is then a pure 3PL. It may also own part of its operations, such as warehouses,
vans, or trucks. It then is both a 3PL and a 2PL, but is usually still called a 3PL. It can also
offer genuine supply chain consulting services outside of its usual range of services. It is then
both a 3PL and a 4PL, but is usually still called a 3PL.

It is important to differentiate 3PLs that actually deliver supply chain consulting services
outside of their usual range of integrated operations from 3PLs that use the term consulting
or 4PL abusively, as a marketing tool only. Some 3PLs currently go as far as giving the title
of consultant to their sales people, who are only selling their classical 3PL services. These
are clearly 3PLs only.

In other cases, 2PL logistics operators, or 3PLs with advanced logistics and information
technology capabilities, may call themselves 4PLs, or a mix of 3PL/4PL. Their capabilities in
logistics, WMS, and/or communication are so advanced that they effectively need to
customize their operations for each new customer, which requires a lot of intellectual capital,
similar to the 4PL. Nevertheless, their ownership of logistics assets contradicts the 4PL status
and leads to a conflict of interest for real consultancy. They may be called "advanced
logistics 2PL/3PL" or "total logistics 2PL/3PL".

Supply chain management

Supply chain management (SCM) is the management of a network of interconnected
businesses involved in the ultimate provision of product and service packages required by
end customers (Harland, 1996). Supply chain management spans all movement and storage of
raw materials, work-in-process inventory, and finished goods from point of origin to point of
consumption (supply chain).

Another definition is provided by the APICS Dictionary when it defines SCM as the "design,
planning, execution, control, and monitoring of supply chain activities with the objective of
creating net value, building a competitive infrastructure, leveraging worldwide logistics,
synchronizing supply with demand and measuring performance globally."

Definitions
More common and accepted definitions of supply chain management are:

 Supply chain management is the systemic, strategic coordination of the traditional
business functions and the tactics across these business functions within a particular
company and across businesses within the supply chain, for the purposes of
improving the long-term performance of the individual companies and the supply
chain as a whole (Mentzer et al., 2001).

 A customer-focused definition is given by Hines (2004:p76): "Supply chain strategies
require a total systems view of the linkages in the chain that work together efficiently
to create customer satisfaction at the end point of delivery to the consumer. As a
consequence costs must be lowered throughout the chain by driving out unnecessary
costs and focusing attention on adding value. Throughput efficiency must be
increased, bottlenecks removed and performance measurement must focus on total
systems efficiency and equitable reward distribution to those in the supply chain
adding value. The supply chain system must be responsive to customer requirements."

 Global Supply Chain Forum: supply chain management is the integration of key
business processes across the supply chain for the purpose of creating value for
customers and stakeholders (Lambert, 2008).

 According to the Council of Supply Chain Management Professionals (CSCMP),
supply chain management encompasses the planning and management of all activities
involved in sourcing, procurement, conversion, and logistics management. It also
includes the crucial components of coordination and collaboration with channel
partners, which can be suppliers, intermediaries, third-party service providers, and
customers. In essence, supply chain management integrates supply and demand
management within and across companies. More recently, the loosely coupled,
self-organizing network of businesses that cooperate to provide product and service
offerings has been called the Extended Enterprise.

A supply chain, as opposed to supply chain management, is a set of organizations directly
linked by one or more of the upstream and downstream flows of products, services, finances,
and information from a source to a customer. Managing a supply chain is 'supply chain
management' (Mentzer et al., 2001).

Supply chain management software includes tools or modules used to execute supply chain
transactions, manage supplier relationships and control associated business processes.

Supply chain event management (abbreviated as SCEM) is a consideration of all possible
events and factors that can disrupt a supply chain. With SCEM, possible scenarios can be
created and solutions devised.

Problems addressed by supply chain management


Supply chain management must address the following problems:

 Distribution Network Configuration: number, location and network missions of
suppliers, production facilities, distribution centers, warehouses, cross-docks and
customers.
 Distribution Strategy: questions of operating control (centralized, decentralized or
shared); delivery scheme, e.g., direct shipment, pool point shipping, cross docking,
DSD (direct store delivery), closed loop shipping; mode of transportation, e.g., motor
carrier, including truckload, LTL, parcel; railroad; intermodal transport, including
TOFC (trailer on flatcar) and COFC (container on flatcar); ocean freight; airfreight;
replenishment strategy (e.g., pull, push or hybrid); and transportation control (e.g.,
owner-operated, private carrier, common carrier, contract carrier, or 3PL).
 Trade-Offs in Logistical Activities: The above activities must be well coordinated in
order to achieve the lowest total logistics cost. Trade-offs may increase the total cost
if only one of the activities is optimized. For example, full truckload (FTL) rates are
more economical on a cost per pallet basis than less than truckload (LTL) shipments.
If, however, a full truckload of a product is ordered to reduce transportation costs,
there will be an increase in inventory holding costs which may increase total logistics
costs. It is therefore imperative to take a systems approach when planning logistical
activities. These trade-offs are key to developing the most efficient and effective
Logistics and SCM strategy.
 Information: Integration of processes through the supply chain to share valuable
information, including demand signals, forecasts, inventory, transportation, potential
collaboration, etc.
 Inventory Management: Quantity and location of inventory, including raw
materials, work-in-progress (WIP) and finished goods.
 Cash-Flow: Arranging the payment terms and methodologies for exchanging funds
across entities within the supply chain.

Supply chain execution means managing and coordinating the movement of materials,
information and funds across the supply chain. The flow is bi-directional.
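
The FTL-versus-LTL trade-off described in the list above can be made concrete with a toy
total-cost calculation. Every rate and quantity here is an assumption chosen for illustration
only; the point is that the decision rests on total logistics cost, not on the freight rate alone:

```python
# Toy trade-off: a full truckload (FTL) has a lower per-pallet freight
# rate than less-than-truckload (LTL), but larger orders raise average
# inventory and hence holding cost. All figures are assumed.
def total_cost(pallets_per_order, freight_per_pallet, annual_demand,
               holding_cost_per_pallet):
    trips = annual_demand / pallets_per_order
    freight = trips * pallets_per_order * freight_per_pallet
    avg_inventory = pallets_per_order / 2          # saw-tooth average
    holding = avg_inventory * holding_cost_per_pallet
    return freight + holding

annual = 1_200  # pallets per year (assumed)
ltl = total_cost(5,  freight_per_pallet=60, annual_demand=annual,
                 holding_cost_per_pallet=25)
ftl = total_cost(26, freight_per_pallet=35, annual_demand=annual,
                 holding_cost_per_pallet=25)
print(f"LTL total: {ltl:,.0f}  FTL total: {ftl:,.0f}")
```

With these particular numbers FTL wins despite the higher inventory, but steeper holding
costs or lower demand could reverse the outcome, which is why a systems approach is
needed.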

Activities/functions
Supply chain management is a cross-functional approach that includes managing the
movement of raw materials into an organization, certain aspects of the internal processing of
materials into finished goods, and the movement of finished goods out of the organization
toward the end consumer. As organizations strive to focus on core competencies and become
more flexible, they reduce their ownership of raw material sources and distribution channels.
These functions are increasingly being outsourced to other entities that can perform the
activities better or more cost effectively. The effect is to increase the number of organizations
involved in satisfying customer demand, while reducing management control of daily
logistics operations. Less control and more supply chain partners led to the creation of supply
chain management concepts. The purpose of supply chain management is to improve trust
and collaboration among supply chain partners, thus improving inventory visibility and the
velocity of inventory movement.

Several models have been proposed for understanding the activities required to manage
material movements across organizational and functional boundaries. SCOR is a supply chain
management model promoted by the Supply Chain Council. Another model is the SCM
Model proposed by the Global Supply Chain Forum (GSCF). Supply chain activities can be
grouped into strategic, tactical, and operational levels. The CSCMP has adopted the
American Productivity & Quality Center (APQC) Process Classification Framework℠, a
high-level, industry-neutral enterprise process model that allows organizations to see their
business processes from a cross-industry viewpoint.

Strategic level

 Strategic network optimization, including the number, location, and size of
warehousing, distribution centers, and facilities.
 Strategic partnerships with suppliers, distributors, and customers, creating
communication channels for critical information and operational improvements such
as cross docking, direct shipping, and third-party logistics.
 Product life cycle management, so that new and existing products can be optimally
integrated into the supply chain and capacity management activities.
 Information technology chain operations.
 Where-to-make and make-buy decisions.
 Aligning overall organizational strategy with supply strategy.
 These decisions are long-term and require resource commitment.

Tactical level

 Sourcing contracts and other purchasing decisions.
 Production decisions, including contracting, scheduling, and planning process
definition.
 Inventory decisions, including quantity, location, and quality of inventory.
 Transportation strategy, including frequency, routes, and contracting.
 Benchmarking of all operations against competitors and implementation of best
practices throughout the enterprise.
 Milestone payments.
 Focus on customer demand.

Operational level

 Daily production and distribution planning, including all nodes in the supply chain.
 Production scheduling for each manufacturing facility in the supply chain (minute by
minute).
 Demand planning and forecasting, coordinating the demand forecast of all customers
and sharing the forecast with all suppliers.
 Sourcing planning, including current inventory and forecast demand, in collaboration
with all suppliers.
 Inbound operations, including transportation from suppliers and receiving inventory.
 Production operations, including the consumption of materials and flow of finished
goods.
 Outbound operations, including all fulfillment activities, warehousing and
transportation to customers.
 Order promising, accounting for all constraints in the supply chain, including all
suppliers, manufacturing facilities, distribution centers, and other customers.
 Tracking transit damage cases from the production level through to the customer,
and arranging settlement of the company's losses through the insurance
company.
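Demand planning and forecasting, listed above, can be sketched with simple exponential smoothing (a minimal illustration; real demand-planning systems coordinate many more signals across customers and suppliers, and the demand history here is invented):

```python
# Minimal sketch of demand forecasting via simple exponential smoothing.
# The demand history and smoothing factor are illustrative, not from the text.

def exponential_smoothing(history, alpha=0.3):
    """Return the one-step-ahead forecast after smoothing the history."""
    forecast = history[0]            # seed with the first observation
    for demand in history[1:]:
        forecast = alpha * demand + (1 - alpha) * forecast
    return forecast

weekly_demand = [100, 120, 110, 130, 125]
print(round(exponential_smoothing(weekly_demand), 1))  # 117.3
```

Sharing such a forecast with all suppliers, as the operational level above calls for, is what lets sourcing and inbound operations plan against the same demand picture.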

Importance of supply chain management


Organizations increasingly find that they must rely on effective supply chains, or networks, to
compete in the global market and networked economy. In Peter Drucker's (1998) new
management paradigms, this concept of business relationships extends beyond traditional
enterprise boundaries and seeks to organize entire business processes throughout a value
chain of multiple companies.

During the past decades, globalization, outsourcing and information technology have enabled
many organizations, such as Dell and Hewlett-Packard, to successfully operate solid
collaborative supply networks in which each specialized business partner focuses on only a
few key strategic activities (Scott, 1993). This inter-organizational supply network can be
acknowledged as a new form of organization. However, with the complicated interactions
among the players, the network structure fits neither "market" nor "hierarchy" categories
(Powell, 1990). It is not clear what kind of performance impacts different supply network
structures could have on firms, and little is known about the coordination conditions and
trade-offs that may exist among the players. From a systems perspective, a complex network
structure can be decomposed into individual component firms (Zhang and Dilts, 2004).
Traditionally, companies in a supply network concentrate on the inputs and outputs of the
processes, with little concern for the internal management working of other individual
players. Therefore, the choice of an internal management control structure is known to impact
local firm performance (Mintzberg, 1979).

In the 21st century, changes in the business environment have contributed to the development
of supply chain networks. First, as an outcome of globalization and the proliferation of
multinational companies, joint ventures, strategic alliances and business partnerships,
significant success factors were identified, complementing the earlier "Just-In-Time", "Lean
Manufacturing" and "Agile Manufacturing" practices. Second, technological changes,
particularly the dramatic fall in information communication costs, which are a significant
component of transaction costs, have led to changes in coordination among the members of
the supply chain network (Coase, 1998).

Many researchers have recognized these kinds of supply network structures as a new
organization form, using terms such as "Keiretsu", "Extended Enterprise", "Virtual
Corporation", "Global Production Network", and "Next Generation Manufacturing System".
In general, such a structure can be defined as "a group of semi-independent organizations,
each with their capabilities, which collaborate in ever-changing constellations to serve one or
more markets in order to achieve some business goal specific to that collaboration"
(Akkermans, 2001).

The security management system for supply chains is described in ISO/IEC 28000 and
ISO/IEC 28001 and related standards published jointly by ISO and IEC.

Historical developments in supply chain management


Six major movements can be observed in the evolution of supply chain management studies:
Creation, Integration, and Globalization (Movahedi et al., 2009), Specialization Phases One
and Two, and SCM 2.0.

1. creation era

The term supply chain management was first coined by a U.S. industry consultant in the
early 1980s. However, the concept of a supply chain in management was of great importance
long before, in the early 20th century, especially with the creation of the assembly line. The
characteristics of this era of supply chain management include the need for large-scale
changes, re-engineering, downsizing driven by cost reduction programs, and widespread
attention to the Japanese practice of management.

2. integration era

This era of supply chain management studies was highlighted with the development of
Electronic Data Interchange (EDI) systems in the 1960s and developed through the 1990s by
the introduction of Enterprise Resource Planning (ERP) systems. This era has continued to
develop into the 21st century with the expansion of internet-based collaborative systems. This
era of supply chain evolution is characterized by both increasing value-adding and cost
reductions through integration.

In fact, a supply chain can be classified as a Stage 1, 2, or 3 network. In a Stage 1 supply
chain, the various systems such as making, storage, distribution, and material control are not
linked and are independent of each other. In a Stage 2 supply chain, these are integrated under
one plan, and ERP is enabled. A Stage 3 supply chain is one in which vertical integration with
the suppliers in the upstream direction and customers in the downstream direction is achieved.
Tesco is an example of this kind of supply chain.

3. globalization era

The third movement of supply chain management development, the globalization era, can be
characterized by the attention given to global systems of supplier relationships and the
expansion of supply chains over national boundaries and into other continents. Although the
use of global sources in the supply chain of organizations can be traced back several decades
(e.g., in the oil industry), it was not until the late 1980s that a considerable number of
organizations started to integrate global sources into their core business. This era is
characterized by the globalization of supply chain management in organizations with the goal
of increasing their competitive advantage, value-adding, and reducing costs through global
sourcing.

4. specialization era—phase one: outsourced manufacturing and distribution

In the 1990s, industries began to focus on “core competencies” and adopted a specialization
model. Companies abandoned vertical integration, sold off non-core operations, and
outsourced those functions to other companies. This changed management requirements by
extending the supply chain well beyond company walls and distributing management across
specialized supply chain partnerships.

This transition also refocused the fundamental perspectives of each respective organization.
OEMs became brand owners that needed deep visibility into their supply base. They had to
control the entire supply chain from above instead of from within. Contract manufacturers
had to manage bills of material with different part-numbering schemes from multiple OEMs
and support customer requests for work-in-process visibility and vendor-managed inventory
(VMI).

The specialization model creates manufacturing and distribution networks composed of
multiple, individual supply chains specific to products, suppliers, and customers who work
together to design, manufacture, distribute, market, sell, and service a product. The set of
partners may change according to a given market, region, or channel, resulting in a
proliferation of trading partner environments, each with its own unique characteristics and
demands.

5. specialization era—phase two: supply chain management as a service

Specialization within the supply chain began in the 1980s with the inception of transportation
brokerages, warehouse management, and non-asset-based carriers and has matured beyond
transportation and logistics into aspects of supply planning, collaboration, execution and
performance management.

At any given moment, market forces could demand changes from suppliers, logistics
providers, locations and customers, and from any number of these specialized participants as
components of supply chain networks. This variability has significant effects on the supply
chain infrastructure, from the foundation layers of establishing and managing the electronic
communication between the trading partners to more complex requirements including the
configuration of the processes and work flows that are essential to the management of the
network itself.

Supply chain specialization enables companies to improve their overall competencies in the
same way that outsourced manufacturing and distribution has done; it allows them to focus
on their core competencies and assemble networks of specific, best-in-class partners to
contribute to the overall value chain itself, thereby increasing overall performance and
efficiency. The ability to quickly obtain and deploy this domain-specific supply chain
expertise without developing and maintaining an entirely unique and complex competency in
house is the leading reason why supply chain specialization is gaining popularity.

Outsourced technology hosting for supply chain solutions debuted in the late 1990s and has
taken root primarily in transportation and collaboration categories. This has progressed from
the Application Service Provider (ASP) model from approximately 1998 through 2003, to the
On-Demand model from approximately 2003 to 2006, to the Software as a Service (SaaS)
model in focus today.

6. supply chain management 2.0 (SCM 2.0)

Building on globalization and specialization, the term SCM 2.0 has been coined to describe
both the changes within the supply chain itself as well as the evolution of the processes,
methods and tools that manage it in this new "era".

Web 2.0 is defined as a trend in the use of the World Wide Web that is meant to increase
creativity, information sharing, and collaboration among users. At its core, the common
attribute that Web 2.0 brings is to help navigate the vast amount of information available on
the Web in order to find what is being sought. It is the notion of a usable pathway. SCM 2.0
follows this notion into supply chain operations. It is the pathway to SCM results, a
combination of the processes, methodologies, tools and delivery options to guide companies
to their results quickly as the complexity and speed of the supply chain increase due to the
effects of global competition, rapid price fluctuations, surging oil prices, short product life
cycles, expanded specialization, near-/far- and off-shoring, and talent scarcity.

SCM 2.0 leverages proven solutions designed to rapidly deliver results with the agility to
quickly manage future change for continuous flexibility, value and success. This is delivered
through competency networks composed of best-of-breed supply chain domain expertise to
understand which elements, both operationally and organizationally, are the critical few that
deliver the results as well as through intimate understanding of how to manage these elements
to achieve desired results. Finally, the solutions are delivered in a variety of options, such as
no-touch via business process outsourcing, mid-touch via managed services and software as a
service (SaaS), or high touch in the traditional software deployment model.

Supply chain business process integration

Successful SCM requires a change from managing individual functions to integrating
activities into key supply chain processes. An example scenario: the purchasing department
places orders as requirements become known. The marketing department, responding to
customer demand, communicates with several distributors and retailers as it attempts to
determine ways to satisfy this demand. Information shared between supply chain partners can
only be fully leveraged through process integration.

Supply chain business process integration involves collaborative work between buyers and
suppliers, joint product development, common systems and shared information. According to
Lambert and Cooper (2000), operating an integrated supply chain requires a continuous
information flow. However, in many companies, management has reached the conclusion that
optimizing the product flows cannot be accomplished without implementing a process
approach to the business. The key supply chain processes stated by Lambert (2004) are:

 Customer relationship management
 Customer service management
 Demand management
 Order fulfillment
 Manufacturing flow management
 Supplier relationship management
 Product development and commercialization
 Returns management

Much has been written about demand management. Best-in-Class companies have similar
characteristics, which include the following: a) Internal and external collaboration b) Lead
time reduction initiatives c) Tighter feedback from customer and market demand d) Customer
level forecasting

One could suggest other key critical supply chain business processes that combine the
processes stated by Lambert, such as:

a. Customer service management
b. Procurement
c. Product development and commercialization
d. Manufacturing flow management/support
e. Physical distribution
f. Outsourcing/partnerships
g. Performance measurement

a) Customer service management process

Customer Relationship Management concerns the relationship between the organization and
its customers. Customer service is the source of customer information. It also provides the
customer with real-time information on scheduling and product availability through interfaces
with the company's production and distribution operations. Successful organizations use the
following steps to build customer relationships:
 determine mutually satisfying goals for organization and customers
 establish and maintain customer rapport
 produce positive feelings in the organization and the customers

b) Procurement process

Strategic plans are drawn up with suppliers to support the manufacturing flow management
process and the development of new products. In firms where operations extend globally,
sourcing should be managed on a global basis. The desired outcome is a win-win relationship
where both parties benefit, and a reduction in time required for the design cycle and product
development. Also, the purchasing function develops rapid communication systems, such as
electronic data interchange (EDI) and Internet linkage to convey possible requirements more
rapidly. Activities related to obtaining products and materials from outside suppliers involve
resource planning, supply sourcing, negotiation, order placement, inbound transportation,
storage, handling and quality assurance, many of which include the responsibility to
coordinate with suppliers on matters of scheduling, supply continuity, hedging, and research
into new sources or programs.

c) Product development and commercialization

Here, customers and suppliers must be integrated into the product development process in
order to reduce time to market. As product life cycles shorten, the appropriate products must
be developed and successfully launched with ever shorter time-schedules to remain
competitive. According to Lambert and Cooper (2000), managers of the product development
and commercialization process must:

1. coordinate with customer relationship management to identify customer-articulated
needs;
2. select materials and suppliers in conjunction with procurement, and
3. develop production technology in manufacturing flow to manufacture and integrate
into the best supply chain flow for the product/market combination.

d) Manufacturing flow management process

The manufacturing process produces and supplies products to the distribution channels based
on past forecasts. Manufacturing processes must be flexible to respond to market changes and
must accommodate mass customization. Orders are processed on a just-in-time
(JIT) basis in minimum lot sizes. Also, changes in the manufacturing flow process lead to
shorter cycle times, meaning improved responsiveness and efficiency in meeting customer
demand. Activities related to planning, scheduling, and supporting manufacturing operations
include work-in-process storage, handling, transportation, time phasing of components,
inventory at manufacturing sites, and maximum flexibility in coordinating geographic and
final-assembly postponement with physical distribution operations.

e) Physical distribution

This concerns movement of a finished product/service to customers. In physical distribution,
the customer is the final destination of a marketing channel, and the availability of the
product/service is a vital part of each channel participant's marketing effort. It is also through
the physical distribution process that the time and space of customer service become an
integral part of marketing, thus it links a marketing channel with its customers (e.g., links
manufacturers, wholesalers, retailers).

f) Outsourcing/partnerships

This is not just outsourcing the procurement of materials and components, but also
outsourcing of services that traditionally have been provided in-house. The logic of this trend
is that the company will increasingly focus on those activities in the value chain where it has
a distinctive advantage, and outsource everything else. This movement has been particularly
evident in logistics where the provision of transport, warehousing and inventory control is
increasingly subcontracted to specialists or logistics partners. Also, managing and controlling
this network of partners and suppliers requires a blend of both central and local involvement.
Hence, strategic decisions need to be taken centrally, with the monitoring and control of
supplier performance and day-to-day liaison with logistics partners being best managed at a
local level.

g) Performance measurement

Experts have found a strong relationship between the broadest arcs of supplier and customer
integration and market share and profitability. Taking advantage of supplier capabilities and emphasizing a
long-term supply chain perspective in customer relationships can both be correlated with firm
performance. As logistics competency becomes a more critical factor in creating and
maintaining competitive advantage, logistics measurement becomes increasingly important
because the difference between profitable and unprofitable operations becomes more narrow.
A.T. Kearney Consultants (1985) noted that firms engaging in comprehensive performance
measurement realized improvements in overall productivity. According to experts, internal
measures are generally collected and analyzed by the firm, including:

1. Cost
2. Customer Service
3. Productivity measures
4. Asset measurement, and
5. Quality.

External performance is examined through 1) customer perception measurement and 2) "best
practice" benchmarking.

h) Warehousing management: Warehousing management plays a valuable role in reducing a
company's costs and expenses. Well-planned storage and office facilities at the company
level reduce manpower costs and support on-time dispatch and delivery, adequate areas for
loading and unloading, space for a service station, an effective stock-management system,
and so on.

Components of supply chain management are as follows: 1. Standardization 2. Postponement
3. Customization

Theories of supply chain management


Currently there is a gap in the literature available on supply chain management studies: there
is no theoretical support for explaining the existence and the boundaries of supply chain
management. A few authors such as Halldorsson et al. (2003), Ketchen and Hult (2006), and
Lavassani et al. (2009) have tried to provide theoretical foundations for different areas
related to supply chains by employing organizational theories. These theories include:

 Resource-Based View (RBV)
 Transaction Cost Analysis (TCA)
 Knowledge-Based View (KBV)
 Strategic Choice Theory (SCT)
 Agency Theory (AT)
 Institutional theory (InT)
 Systems Theory (ST)
 Network Perspective (NP)
 Materials Logistics Management (MLM)
 Just-in-Time (JIT)
 Materials Requirements Planning (MRP)
 Theory of Constraints (TOC)
 Total Quality Management (TQM)
 Agile Manufacturing
 Time Based Competition (TBC)
 Quick Response Manufacturing (QRM)
 Customer Relationship Management (CRM)
 and many more

Supply chain centroids


In the study of supply chain management, the concept of centroids has become an important
economic consideration. A centroid is a place that has a high proportion of a country’s
population and a high proportion of its manufacturing, generally within 500 mi (805 km). In
the U.S., two major supply chain centroids have been defined, one near Dayton, Ohio and a
second near Riverside, California.

The centroid near Dayton is particularly important because it is closest to the population
center of the U.S. and Canada. Dayton is within 500 miles of 60% of the population and
manufacturing capacity of the U.S., as well as 60% of Canada's population. The region
includes the Interstate 70/75 interchange, one of the busiest in the nation, with 154,000
vehicles passing through per day; of those, between 30% and 35% are trucks hauling goods.
In addition, the I-75 corridor is home to the busiest north-south rail route east of the
Mississippi.
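The centroid idea, a weighted center of population and manufacturing, can be sketched as a weighted average of locations (a hypothetical illustration; the city coordinates and weights below are invented for the sketch, not census or manufacturing data):

```python
# Hypothetical sketch: locating a supply chain "centroid" as the weighted
# average of city coordinates, where each weight combines population and
# manufacturing shares. All data below is made up for illustration.

def weighted_centroid(cities):
    """Return the (lat, lon) pair weighted by each city's combined weight."""
    total = sum(w for _, _, w in cities)
    lat = sum(la * w for la, _, w in cities) / total
    lon = sum(lo * w for _, lo, w in cities) / total
    return lat, lon

cities = [
    # (latitude, longitude, weight = population share + manufacturing share)
    (41.9, -87.6, 0.30),   # e.g. a Chicago-like node
    (40.7, -74.0, 0.35),   # e.g. a New York-like node
    (33.7, -84.4, 0.20),   # e.g. an Atlanta-like node
    (39.1, -84.5, 0.15),   # e.g. a Cincinnati-like node
]
lat, lon = weighted_centroid(cities)
```

A real analysis would also weight by highway and rail capacity, which is why interchange and rail-corridor data figure in the Dayton discussion above.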

Tax efficient supply chain management


Tax-efficient supply chain management is a business model that considers the effect of tax
in the design and implementation of supply chain management. As a consequence of
globalization, cross-national businesses pay different tax rates in different
countries. Because of these differences, global players can legally calculate and
optimize their supply chains on the basis of tax efficiency. It is used as a method of gaining
more profit for companies that own global supply chains.
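The idea can be sketched numerically (a hypothetical illustration; all revenues, costs, and tax rates below are invented):

```python
# Hypothetical sketch: comparing the after-tax profit of two supply chain
# configurations under different corporate tax rates. All figures invented.

def after_tax_profit(revenue, cost, tax_rate):
    """Profit after applying a flat corporate tax rate to the margin."""
    return (revenue - cost) * (1 - tax_rate)

# Same product, two candidate manufacturing locations:
option_a = after_tax_profit(revenue=1_000_000, cost=700_000, tax_rate=0.30)
option_b = after_tax_profit(revenue=1_000_000, cost=720_000, tax_rate=0.15)

# Even with a higher operating cost, the lower-tax location can yield
# more after-tax profit, which is the lever tax-efficient SCM exploits.
print(option_a, option_b)
```

Real analyses must also account for transfer pricing rules, duties, and treaty effects; this sketch isolates only the tax-rate lever.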

Supply chain sustainability


Supply chain sustainability is a business issue affecting an organization’s supply chain or
logistics network and is frequently quantified by comparison with SECH ratings. SECH
ratings are defined as social, ethical, cultural and health footprints. Consumers have become
more aware of the environmental impact of their purchases and companies’ SECH ratings
and, along with non-governmental organizations (NGOs), are setting the agenda for
transitions to organically-grown foods, anti-sweatshop labor codes and locally-produced
goods that support independent and small businesses. Because supply chains frequently
account for over 75% of a company’s carbon footprint,[14] many organizations are exploring
how they can reduce this and thus improve their SECH rating.

For example, in July 2009 the U.S.-based Wal-Mart corporation announced its intention to
create a global sustainability index that would rate products according to the environmental
and social impact made while the products were manufactured and distributed. The
sustainability rating index is intended to create environmental accountability in Wal-Mart's
supply chain, and provide the motivation and infrastructure for other retail industry
companies to do the same.[15]

Components of supply chain management integration


The management components of SCM

The SCM components are the third element of the four-square circulation framework. The
level of integration and management of a business process link is a function of the number
and level, ranging from low to high, of components added to the link (Ellram and Cooper,
1990; Houlihan, 1985). Consequently, adding more management components or increasing
the level of each component can increase the level of integration of the business process link.
The literature on business process re-engineering, buyer-supplier relationships, and SCM
suggests various possible components that must receive managerial attention when managing
supply relationships. Lambert and Cooper (2000) identified the following components:

 Planning and control
 Work structure
 Organization structure
 Product flow facility structure
 Information flow facility structure
 Management methods
 Power and leadership structure
 Risk and reward structure
 Culture and attitude

However, a more careful examination of the existing literature leads to a more comprehensive
understanding of what should be the key critical supply chain components, the "branches" of
the previous identified supply chain business processes, that is, what kind of relationship the
components may have that are related to suppliers and customers. Bowersox and Closs state
that the emphasis on cooperation represents the synergism leading to the highest level of joint
achievement (Bowersox and Closs, 1996). A primary level channel participant is a business
that is willing to participate in the inventory ownership responsibility or assume other aspects
of financial risk, thus including primary level components (Bowersox and Closs, 1996). A
secondary level participant (specialized) is a business that participates in channel
relationships by performing essential services for primary participants, including secondary
level components, which support primary participants. Third level channel participants and
components that support the primary level channel participants and are the fundamental
branches of the secondary level components may also be included.

Consequently, Lambert and Cooper's framework of supply chain components does not lead to
any conclusion about which are the primary or secondary (specialized) level supply chain
components (see Bowersox and Closs, 1996, p. 93). That is: which supply chain components
should be viewed as primary or secondary, how these components should be structured in
order to achieve a more comprehensive supply chain structure, and how to examine the supply
chain as an integrative whole (see sections 2.1 and 3.1 above).

Reverse supply chain

Reverse logistics is the process of managing the return of goods.
Reverse logistics is also referred to as "Aftermarket Customer Services". In other words, any
time money is taken from a company's warranty reserve or service logistics budget one can
speak of a reverse logistics operation.

Supply chain systems and value


Supply chain systems configure value for those that organise the networks. Value is the
additional revenue over and above the costs of building the network. Co-creating value and
sharing the benefits appropriately to encourage effective participation is a key challenge for
any supply system. Tony Hines defines value as follows: “Ultimately it is the customer who
pays the price for service delivered that confirms value and not the producer who simply adds
cost until that point”.

Global supply chain management


Global supply chains pose challenges regarding both quantity and value:

Supply and value chain trends

 Globalization
 Increased cross border sourcing
 Collaboration for parts of value chain with low-cost providers
 Shared service centers for logistical and administrative functions
 Increasingly global operations, which require increasingly global coordination
and planning to achieve global optimums
 Complex problems increasingly involve midsized companies as well.

These trends have many benefits for manufacturers because they make possible larger lot
sizes, lower taxes, and better environments (culture, infrastructure, special tax zones,
sophisticated OEM) for their products. Meanwhile, on top of the problems recognized in
supply chain management, there will be many more challenges when the scope of supply
chains is global. This is because with a supply chain of a larger scope, the lead time is much
longer. Furthermore, there are more issues involved such as multi-currencies, different
policies and different laws. The consequent problems include: 1. different currencies and
valuations in different countries; 2. different tax laws (Tax Efficient Supply Chain
Management); 3. different trading protocols; and 4. lack of transparency of cost and profit.

Materials management
Materials management is the branch of logistics that deals with the tangible components of a
supply chain. Specifically, this covers the acquisition of spare parts and replacements, quality control
of purchasing and ordering such parts, and the standards involved in ordering, shipping, and
warehousing the said parts.

Goals

The goal of materials management is to provide an unbroken chain of components for
production to manufacture goods on time for the customer base. The materials department is
charged with releasing materials to a supply base, ensuring that the materials are delivered on
time to the company using the correct carrier. The materials function is generally measured by
accomplishing on time delivery to the customer, on time delivery from the supply base,
attaining a freight budget, inventory shrink management, and inventory accuracy. The
materials department is also charged with the responsibility of managing new launches.

In some companies materials management is also charged with the procurement of materials
by establishing and managing a supply base. In other companies the procurement and
management of the supply base is the responsibility of a separate purchasing department. The
purchasing department is then responsible for the purchased price variances from the supply
base.

In large companies with multitudes of customer changes to the final product over the course
of a year, there may be a separate logistics department that is responsible for all new
acquisition launches and customer changes. This logistics department ensures that the launch
materials are procured for production and then transfers the responsibility to the plant
materials management to manage.

Quality Assurance

A large component of materials management is ensuring that parts and materials used in the
supply chain meet minimum requirements by performing quality assurance (QA). While most
of the writing and discussion about materials management is on acquisition and standards,
much of the day to day work conducted in materials management deals with QA issues. Parts
and material are tested, both before purchase orders are placed and during use, to ensure there
are no short or long term issues that would disrupt the supply chain. This aspect of material
management is most important in heavily automated industries, since failure rates due to
faulty parts can slow or even stop production lines, throwing off timetables for production
goals.
Materials managers are rarely responsible for the direct management of quality issues
concerning the supply chain or the customer; a separate quality function generally deals with
these issues.

Standards

There are no standards for materials management that are practiced uniformly from company
to company. Most companies use ERP systems such as SAP, Oracle, BPCS, MAPICS, and
other systems to manage materials control. Small concerns that do not have or cannot afford
ERP systems use spreadsheet applications to manage materials.

Materials management is not a science, and the level of expertise applied varies with the
relevance and importance that company officials place upon controlling material flow. Some
companies place materials management at a level where there is a logistics director; other
companies manage it at the plant level by hiring an inventory manager or materials
manager; and still other companies make the plant supervisors responsible, accompanied by
planners.

Because there are no standards, there are only best practices widely used within any
particular business sector. For example, the generation of releases to the supply base comes
in many forms, from the lowest level of sending facsimiles and PDF files, through EDI
information exchange, to the ultimate practice of a supplier web-based site.

Materials Management Challenges


The major challenge that materials managers face is maintaining a consistent flow of
materials for production. Many factors inhibit the accuracy of inventory, resulting in
production shortages, premium freight, and frequent inventory adjustments. The major
issues that all materials managers face are incorrect bills of materials, inaccurate cycle
counts, unreported scrap, shipping errors, receiving errors, and production reporting errors.
Materials managers have striven to manage these issues in the manufacturing sectors since
the beginning of the industrial revolution. Although no known methods eliminate the
aforementioned inventory accuracy inhibitors, best methods are available to minimize their
impact on maintaining an uninterrupted flow of materials for production.

There are a number of practices in the manufacturing sector that are a cut above the rest in
managing inventory. Most third-party logistics companies have what it takes to manage
materials without inventory shrinkage and shortages, and anyone who wants to manage their
systems better should take a look at these practices. A set of practices that comes as close as
any to eliminating all of the aforementioned inventory accuracy issues can be found in the
book entitled "Materials Management, An Executive's Supply Chain Guide" by S. McDonald
(Wiley Publications, www.Wiley.com), available in major book stores.
One challenge for materials managers is to provide timely releases to the supply base. On the
scale of worst to best practices, sending releases via facsimile or PDF file is the worst
practice, and transmitting releases to a supplier-based web site is the best. The flaw in
transmitting releases via facsimile or email is that they can get lost or be entered incorrectly
into the supplier's system, resulting in a stock-out. The problem with transmitting EDI
releases is that not all suppliers have EDI systems capable of receiving the release
information. The best practice is to transmit the releases to a common supplier web site
where the suppliers can view the releases free of charge. A further advantage is that the
supplier is required to use the carrier listed in the web site, must transmit an ASN (advanced
shipping notification), and can review the cumulative balances of the order.

Improving circulation infrastructure

Effectiveness is increased and redundancy reduced when service points are clustered. An
effective materials management program can also
resolve “island” approaches to shipping, receiving, and vehicle movement. Solutions can
include creating a new central loading location, as well consolidating service areas and docks
from separate buildings into one. Developing better campus circulation infrastructure also
means re-evaluating truck delivery and service vehicle routes. Vehicle type, size, and
schedules are studied to make these more compatible with surrounding neighborhoods. This
will reduce truck traffic, creating a safer environment for pedestrians and a more attractive
environment for other uses.

Materials Management Week

Each year, an entire week is dedicated to celebrating resource and materials management
professionals for their outstanding contributions to healthcare and the overall success of the
supply chain. Sponsored by the Association for Healthcare Resource & Materials
Management (AHRMM), National Healthcare Resource & Materials Management Week
(MM Week) provides an opportunity to recognize the integral role materials management
professionals play in delivering high-quality patient care throughout the health care industry.
In 2010, Materials Management Week was October 4-10.

Benefits

An effective materials management plan builds from and enhances an institutional master
plan by filling in the gaps and producing an environmentally responsible and efficient
outcome. An institutional campus, office, or housing complex can expect a myriad of benefits
from an effective materials management plan. For starters, there are long-term cost savings,
as consolidating, reconfiguring, and better managing a campus’ core infrastructure reduces
annual operating costs. An institutional campus, office, or housing complex will also get the
highest and best use out of campus real estate.

An effective materials management plan also means a more holistic approach to managing
vehicle use and emissions, solid waste, hazardous waste, recycling, and utility services. As a
result, this means a “greener,” more sustainable environment and a manifestation of the many
demands today for institutions to become more environmentally friendly. In fact, thanks to
such environmental advantages, creative materials management plans may qualify for LEED
Innovation in Design credits.

And finally, an effective materials management plan can improve aesthetics. Removing
unsafe and unsightly conditions, placing core services out of sight, and creating a more
pedestrian-friendly environment will improve the visual and physical sense of place for those
who live and work there.

ABC analysis

ABC analysis is a business term used to define an inventory categorization technique often
used in materials management. It is also known as Selective Inventory Control.

ABC analysis provides a mechanism for identifying items that will have a significant impact
on overall inventory cost, while also providing a mechanism for identifying different
categories of stock that will require different management and controls.

When carrying out an ABC analysis, inventory items are valued (item cost multiplied by
quantity issued/consumed in period) with the results then ranked. The results are then
grouped typically into three bands. These bands are called ABC codes.

ABC codes
1. "A class" inventory will typically contain items that account for 80% of total value, or 20% of
total items.
2. "B class" inventory will have around 15% of total value, or 30% of total items.
3. "C class" inventory will account for the remaining 5%, or 50% of total items.

ABC Analysis is similar to the Pareto principle in that the "A class" group will typically
account for a large proportion of the overall value but a small percentage of the overall
volume of inventory.
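The banding rule above can be sketched in code. This is a minimal illustration, not a standard library routine: the item data are hypothetical, and the cumulative-share cutoffs (first 80% of value → A, next 15% → B, rest → C) follow the classification given above.

```python
def abc_classify(items):
    """items: list of (name, unit_cost, annual_qty) tuples."""
    # Value each item (cost x quantity issued) and rank by value, highest first.
    valued = sorted(((n, c * q) for n, c, q in items),
                    key=lambda pair: pair[1], reverse=True)
    total = sum(v for _, v in valued)
    bands, cumulative = {}, 0.0
    for name, value in valued:
        cumulative += value
        share = cumulative / total
        # First 80% of value -> A, next 15% -> B, remaining 5% -> C.
        bands[name] = "A" if share <= 0.80 else ("B" if share <= 0.95 else "C")
    return bands

# Hypothetical item data: (name, unit cost, annual quantity issued).
items = [("motor", 60.0, 10), ("bearing", 20.0, 10), ("belt", 10.0, 10),
         ("gasket", 5.0, 10), ("bolt", 0.5, 100)]
print(abc_classify(items))
```

With these figures the motor and bearing (80% of value) land in class A, the belt and gasket in B, and the high-volume but low-value bolt in C, mirroring the Pareto pattern described above.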

Another recommended breakdown of ABC classes:

1. "A" approximately 10% of items or 66.6% of value
2. "B" approximately 20% of items or 23.3% of value
3. "C" approximately 70% of items or 10.1% of value

Pareto principle

The Pareto principle (also known as the 80-20 rule, the law of the vital few, and the
principle of factor sparsity) states that, for many events, roughly 80% of the effects come
from 20% of the causes.

Business management thinker Joseph M. Juran suggested the principle and named it after
Italian economist Vilfredo Pareto, who observed in 1906 that 80% of the land in Italy was
owned by 20% of the population; he developed the principle by observing that 20% of the
pea pods in his garden contained 80% of the peas.

It is a common rule of thumb in business; e.g., "80% of your sales come from 20% of your
clients". Mathematically, where something is shared among a sufficiently large set of
participants, there must be a number k between 50 and 100 such that "k% is taken by
(100 − k)% of the participants". The number k may vary from 50 (in the case of equal
distribution, i.e. 100% of the population have equal shares) to nearly 100 (when a tiny
number of participants account for almost all of the resource). There is nothing special about
the number 80% mathematically, but many real systems have k somewhere around this region
of intermediate imbalance in distribution.
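As a small numeric illustration of this pattern, one can check what share of a total the top 20% of participants hold. The per-client sales figures below are hypothetical, chosen only to show a skewed distribution:

```python
# Hypothetical per-client sales figures (10 clients, total 1000), sorted descending.
sales = sorted([500, 320, 45, 30, 25, 20, 20, 15, 15, 10], reverse=True)

# Share of total sales held by the top 20% of clients (2 of 10).
top = sales[: len(sales) // 5]
share = sum(top) / sum(sales)
print(f"Top 20% of clients account for {share:.0%} of sales")
```

Here the top two clients hold 82% of sales, i.e. k ≈ 82 in the notation above.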

The Pareto principle is only tangentially related to Pareto efficiency, which was also
introduced by the same economist. Pareto developed both concepts in the context of the
distribution of income and wealth among the population.

Inventory management
Inventory management is primarily about specifying the size and placement of stocked goods.
Inventory management is required at different locations within a facility or within multiple
locations of a supply network to protect the regular and planned course of production against
the random disturbance of running out of materials or goods. The scope of inventory
management also concerns the fine lines between replenishment lead time, carrying costs of
inventory, asset management, inventory forecasting, inventory valuation, inventory visibility,
future inventory price forecasting, physical inventory, available physical space for inventory,
quality management, replenishment, returns and defective goods and demand forecasting.
Balancing these competing requirements leads to optimal inventory levels, which is an
ongoing process as the business needs shift and react to the wider environment.

Inventory management involves a retailer seeking to acquire and maintain a proper
merchandise assortment while ordering, shipping, handling, and related costs are kept in
check.

Inventory management also encompasses:

 Systems and processes that identify inventory requirements, set targets, provide
replenishment techniques, and report actual and projected inventory status.
 All functions related to the tracking and management of material, including the
monitoring of material moved into and out of stockroom locations and the reconciling of
inventory balances; it may also include ABC analysis, lot tracking, and cycle counting
support.
 Management of inventories, with the primary objective of determining and controlling
stock levels within the physical distribution function to balance the need for product
availability against the need to minimize stock holding and handling costs. See inventory
proportionality.

The reasons for keeping stock

There are three basic reasons for keeping an inventory:

1. Time - The time lags present in the supply chain, from supplier to user at every stage,
require that you maintain certain amounts of inventory to use during this "lead time."
2. Uncertainty - Inventories are maintained as buffers to meet uncertainties in demand,
supply and movements of goods.
3. Economies of scale - Ideal condition of "one unit at a time at a place where a user
needs it, when he needs it" principle tends to incur lots of costs in terms of logistics.
So bulk buying, movement and storing brings in economies of scale, thus inventory.
All these stock reasons can apply to any owner or product stage.

 Buffer stock is held at individual workstations against the possibility that an
upstream workstation may be delayed by a long setup or changeover. This stock is
then used while the changeover is happening, and can be eliminated by tools like
SMED.

These classifications apply along the whole Supply chain, not just within a facility or plant.

Where these stocks contain the same or similar items, it is a common work practice to hold
all of them mixed together before or after the sub-process to which they relate. This
'reduces' costs, but because the stocks are mixed together there is no visual reminder to
operators of the adjacent sub-processes, or to line management, of stock that is due to a
particular cause and should be a particular individual's responsibility, with inevitable
consequences. Some plants have centralized stock holding across sub-processes, which
makes the situation even more acute.

Special terms used in dealing with inventory

 Stock Keeping Unit (SKU) is a unique combination of all the components that are
assembled into the purchasable item. Therefore, any change in the packaging or
product is a new SKU. This level of detailed specification assists in managing
inventory.
 Stockout means running out of the inventory of an SKU.
 "New old stock" (sometimes abbreviated NOS) is a term used in business to refer to
merchandise being offered for sale that was manufactured long ago but that has never
been used. Such merchandise may not be produced anymore, and the new old stock
may represent the only market source of a particular item at the present time.

Typology

1. Buffer/safety stock
2. Cycle stock (Used in batch processes, it is the available inventory, excluding buffer
stock)
3. De-coupling (Buffer stock that is held by both the supplier and the user)
4. Anticipation stock (Building up extra stock for periods of increased demand - e.g. ice
cream for summer)
5. Pipeline stock (Goods still in transit or in the process of distribution - have left the
factory but not arrived at the customer yet)

Inventory examples

While accountants often discuss inventory in terms of goods for sale, organizations -
manufacturers, service-providers and not-for-profits - also have inventories (fixtures,
furniture, supplies, ...) that they do not intend to sell. Manufacturers', distributors', and
wholesalers' inventory tends to cluster in warehouses. Retailers' inventory may exist in a
warehouse or in a shop or store accessible to customers. Inventories not intended for sale to
customers or to clients may be held in any premises an organization uses. Stock ties up cash,
and if it is uncontrolled it becomes impossible to know the actual level of stocks and
therefore impossible to control them.

While the reasons for holding stock were covered earlier, most manufacturing organizations
usually divide their "goods for sale" inventory into:

 Raw materials - materials and components scheduled for use in making a product.
 Work in process, WIP - materials and components that have begun their
transformation to finished goods.
 Finished goods - goods ready for sale to customers.
 Goods for resale - returned goods that are salable.

Stock management

Stock management is the function of understanding the stock mix of a company and the
different demands on that stock. The demands are influenced by both external and internal
factors and are balanced by the creation of Purchase order requests to keep supplies at a
reasonable or prescribed level.

Retail supply chain


Stock management in the retail supply chain follows the following sequence:

1. Request for new stock from stores to head office
2. Head office issues purchase orders to the vendor
3. Vendor ships the goods
4. Warehouse receives the goods
5. Warehouse stocks and distributes to the stores
6. Stores receive the goods
7. Goods are sold to customers at the stores

The management of inventory in the supply chain involves managing the physical quantities
as well as the costing of the goods as they flow through the supply chain.

In managing the cost prices of the goods throughout the supply chain, several costing
methods are employed:

1. Retail method
2. Weighted Average Price method
3. FIFO (First In First Out) method
4. LIFO (Last In First Out) method
5. LPP (Last Purchase Price) method
6. BNM (Bottle neck method)

Weighted Average Price Method


The calculation can be done for different periods. If the calculation is done on a monthly
basis, it is referred to as the periodic method. In this method, the available stock is
calculated by:

ADD stock at the beginning of the period
ADD stock purchased during the period
AVERAGE total cost by total quantity to arrive at the average cost of goods for the period

This average cost price is applied to all movements and adjustments in that period.
The ending stock quantity is arrived at by applying all the changes in quantity to the
available balance. Multiplying the stock balance in quantity by the average cost gives the
stock cost at the end of the period.

Using the perpetual method, the calculation is done upon every purchase transaction.

Thus, the calculation is the same whether it is performed by period (periodic) or by
transaction (perpetual); the only difference is the 'periodicity', or scope, of the calculation.
Periodic is done monthly, while perpetual is done for the duration from one purchase until
the next.

In practice, daily averaging has been used to closely approximate the perpetual method. The
bottleneck method (BNM) depends on proper planning support.
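The periodic weighted-average calculation described above can be sketched as follows. The opening stock, purchases, and issue quantities are hypothetical figures chosen for illustration:

```python
# Hypothetical figures: opening stock plus two purchases in the period.
opening_qty, opening_unit_cost = 100, 4.00
purchases = [(200, 5.00), (100, 5.50)]   # (quantity, unit cost)
issued_qty = 250                          # units issued during the period

# Pool opening stock and purchases; total cost / total quantity gives the
# average cost of goods for the period.
total_qty = opening_qty + sum(q for q, _ in purchases)
total_cost = opening_qty * opening_unit_cost + sum(q * c for q, c in purchases)
avg_cost = total_cost / total_qty

# Apply the quantity changes to get the ending balance, then value it at
# the average cost.
ending_qty = total_qty - issued_qty
ending_value = ending_qty * avg_cost
print(f"average cost = {avg_cost:.4f}, ending stock value = {ending_value:.2f}")
```

The perpetual variant would simply re-run the same pooling at every purchase transaction instead of once per period.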

Software applications
The implementation of inventory management applications has become a valuable tool for
organizations looking to more efficiently manage stock. While the capabilities of applications
vary, most inventory management applications give organizations a structured method of
accounting for all incoming and outgoing inventory within their facilities. Organizations save
a significant amount in costs associated with manual inventory counts, administrative errors
and reductions in inventory stock-outs.

Business models
Just-in-time Inventory (JIT), Vendor Managed Inventory (VMI) and Customer Managed
Inventory (CMI) are a few of the popular models being employed by organizations looking to
have greater stock management control.

JIT is a model which attempts to replenish inventory for organizations when the inventory is
required. The model attempts to avoid excess inventory and its associated costs. As a result,
companies receive inventory only when the need for more stock is approaching.
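The JIT idea of ordering only as the need for stock approaches can be sketched with a simple reorder-point rule. This is an illustrative simplification, not a full JIT implementation, and the demand and lead-time figures are hypothetical:

```python
def needs_order(on_hand, daily_demand, lead_time_days, safety_stock=0):
    """Order when on-hand stock will not cover demand over the lead time."""
    reorder_point = daily_demand * lead_time_days + safety_stock
    return on_hand <= reorder_point

# Hypothetical item: 20 units/day demand, 5-day replenishment lead time.
print(needs_order(on_hand=150, daily_demand=20, lead_time_days=5))  # ample stock
print(needs_order(on_hand=90, daily_demand=20, lead_time_days=5))   # time to order
```

With a reorder point of 100 units, the first case holds off ordering while the second triggers replenishment, so stock arrives roughly as it is needed rather than piling up.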

VMI and CMI are two business models that adhere to the JIT inventory principles. VMI gives
the vendor in a vendor/customer relationship the ability to monitor, plan and control
inventory for their customers. Customers relinquish the order making responsibilities in
exchange for timely inventory replenishment that increases organizational efficiency.
CMI allows the customer to order and control their inventory from their vendors/suppliers.
Both VMI and CMI benefit the vendor as well as the customer. Vendors see a significant
increase in sales due to increased inventory turns and cost savings realized by their
customers, while customers realize similar benefits.

Cross-docking

Cross-docking is a practice in logistics of unloading materials from an incoming semi-trailer
truck or rail car and loading these materials directly into outbound trucks, trailers, or rail cars,
with little or no storage in between. This may be done to change type of conveyance, to sort
material intended for different destinations, or to combine material from different origins into
transport vehicles (or containers) with the same, or similar destination.

Cross-Dock operations were first pioneered in the US trucking industry in the 1930s, and
have been in continuous use in LTL (less than truckload) operations ever since. The US
Military began utilizing cross-dock operations in the 1950s. Wal-Mart began utilizing cross-
docking in the retail sector in the late 1980s.

In the LTL trucking industry, cross-docking is done by moving cargo from one transport
vehicle directly into another, with minimal or no warehousing. In retail practice, cross-
docking operations may utilize staging areas where inbound materials are sorted,
consolidated, and stored until the outbound shipment is complete and ready to ship.

Advantages of Retail Cross-Docking


 Streamlines the supply chain from point of origin to point of sale
 Reduces handling costs, operating costs, and the storage of inventory
 Products get to the distributor and consequently to the customer faster
 Reduces, or eliminates warehousing costs
 May increase available retail sales space.

Vendor Management System

A Vendor Management System (VMS) is an Internet-enabled, often Web-based, application
that acts as a mechanism for businesses to manage and procure staffing services – temporary
and, in some cases, permanent placement services – as well as outside contract or contingent
labour. Typical features of a VMS application include order distribution, consolidated billing,
and significant enhancements in reporting capability that outperform manual systems and
processes.

In the financial industry, due to recent regulations, vendor management implies consistent
risk classification and due diligence to ensure a risk assessment that eliminates undue
third-party risk exposure.

Definitions
The contingent workforce is a provisional group of workers who work for an organization on
a non-permanent basis, also known as freelancers, independent professionals, temporary
contract workers, independent contractors or consultants. VMS is a type of contingent
workforce management. There are several other terms associated with VMS which are all
relevant to the contingent workforce, or staffing industry.

A vendor is literally a person or organization that vends or sells contingent labor.
Specifically, a vendor can be an independent consultant, a consulting company, or a staffing
company (which can also be called a supplier – because it supplies the labor or expertise
rather than selling it directly)[2].

A VOP, or Vendor On Premise, is a vendor that sets up shop on the client's premises and is
concerned with filling the labor needs and requirements of the client.[3] The VOP does this
either by sourcing labor directly from itself or from other suppliers, who may be its
competitors. The VOP also manages and coordinates this labor for the client.

An MSP, or Managed Service Provider, manages vendors and measures their effectiveness in
recruiting according to the client's standards and requirements. MSPs generally do not recruit
directly, but try to find the best suppliers of vendors according to the client's requirements.
This, in essence, makes the MSP more neutral than a VOP in finding talent, because MSPs do
not themselves provide the labor.[4], [5]

VMS is a tool, specifically a software program, that distributes job requirements to staffing
companies, recruiters, consulting companies, and other vendors (i.e. Independent
consultants).[6] It facilitates the interview and hire process, as well as labor time collection
approval and payment.

A CMS, or Contractor management System, is a tool which interfaces with the Access
Control Systems of large refineries, plants, and manufacturing facilities and the ERP system
in order to capture the real-time hours/data between contractors and client. This type of
system will typically involve a collaborative effort between the contractor and facility owner
to simplify the timekeeping process and improve project cost visibility.

An EOR, or Employer of Record, is designed to facilitate all components of independent
contractor management, including classification, auditing, and compliance reviews.
Employers of Record help drive down the risk of co-employment and allow enterprises to
engage and manage independent contractors without the stress of government audits or tax
liabilities.

Statistical process control


Statistical process control (SPC) is the application of statistical methods to the monitoring and
control of a process to ensure that it operates at its full potential to produce conforming product.
Under SPC, a process behaves predictably to produce as much conforming product as possible with
the least possible waste. While SPC has been applied most frequently to controlling manufacturing
lines, it applies equally well to any process with a measurable output. Key tools in SPC are control
charts, a focus on continuous improvement and designed experiments.
Much of the power of SPC lies in the ability to examine a process and the sources of variation
in that process using tools that give weight to objective analysis over subjective opinions and
that allow the strength of each source to be determined numerically. Variations in the process
that may affect the quality of the end product or service can be detected and corrected, thus
reducing waste as well as the likelihood that problems will be passed on to the customer.
With its emphasis on early detection and prevention of problems, SPC has a distinct
advantage over other quality methods, such as inspection, that apply resources to detecting
and correcting problems after they have occurred.

In addition to reducing waste, SPC can lead to a reduction in the time required to produce the
product or service from end to end. This is partially due to a diminished likelihood that the
final product will have to be reworked, but it may also result from using SPC data to identify
bottlenecks, wait times, and other sources of delays within the process. Process cycle time
reductions coupled with improvements in yield have made SPC a valuable tool from both a
cost reduction and a customer satisfaction standpoint.

How to Use SPC


Statistical Process Control may be broadly broken down into three sets of activities:
understanding the process; understanding the causes of variation; and elimination of the
sources of special cause variation.

To understand a process, it is typically mapped out and monitored using control charts.
Control charts are used to identify variation that may be due to special
causes, and to free the user from concern over variation due to common causes. This is a
continuous, ongoing activity. When a process is stable and does not trigger any of the
detection rules for a control chart, a process capability analysis may also be performed to
predict the ability of the current process to produce conforming (i.e. within specification)
product in the future.

When excessive variation is identified by the control chart detection rules, or the process
capability is found lacking, additional effort is exerted to determine causes of that variance.
The tools used include Ishikawa diagrams, designed experiments and Pareto charts. Designed
experiments are critical to this phase of SPC, as they are the only means of objectively
quantifying the relative importance of the many potential causes of variation.

Once the causes of variation have been quantified, effort is spent in eliminating those causes
that are both statistically and practically significant (i.e. a cause that has only a small but
statistically significant effect may not be considered cost-effective to fix; however, a cause
that is not statistically significant can never be considered practically significant). Generally,
this includes development of standard work, error-proofing and training. Additional process
changes may be required to reduce variation or align the process with the desired target,
especially if there is a problem with process capability.
Control chart

Control charts, also known as Shewhart charts or process-behaviour charts, are tools used in
statistical process control to determine whether or not a manufacturing or business process is
in a state of statistical control.

Overview
If analysis of the control chart indicates that the process is currently under control (i.e. is
stable, with variation only coming from sources common to the process) then data from the
process can be used to predict the future performance of the process. If the chart indicates
that the process being monitored is not in control, analysis of the chart can help determine the
sources of variation, which can then be eliminated to bring the process back into control. A
control chart is a specific kind of run chart that allows significant change to be differentiated
from the natural variability of the process.

The control chart can be seen as part of an objective and disciplined approach that enables
correct decisions regarding control of the process, including whether or not to change process
control parameters. Process parameters should never be adjusted for a process that is in
control, as this will result in degraded process performance.[1]

The control chart is one of the seven basic tools of quality control.

Chart details
A control chart consists of:

 Points representing a statistic (e.g., a mean, range, proportion) of measurements of a quality
characteristic in samples taken from the process at different times [the data]
 The mean of this statistic using all the samples is calculated (e.g., the mean of the means,
mean of the ranges, mean of the proportions)
 A center line is drawn at the value of the mean of the statistic
 The standard error (e.g., standard deviation/sqrt(n) for the mean) of the statistic is also
calculated using all the samples
 Upper and lower control limits (sometimes called "natural process limits") that indicate the
threshold at which the process output is considered statistically 'unlikely' are drawn typically
at 3 standard errors from the center line
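The construction above can be sketched numerically. This is a simplified illustration with hypothetical data: the standard deviation here is pooled over all observations, whereas standard X-bar practice estimates it from within-subgroup ranges or standard deviations using control-chart constants:

```python
import statistics

# Hypothetical measurements: four observations per subgroup, four subgroups.
subgroups = [
    [10.2, 9.9, 10.1, 10.0],
    [10.0, 10.3, 9.8, 10.1],
    [9.9, 10.0, 10.2, 10.1],
    [10.1, 9.8, 10.0, 10.2],
]

means = [statistics.mean(g) for g in subgroups]
center = statistics.mean(means)           # center line: mean of the subgroup means

# Simplified sigma estimate: sample std dev pooled over all observations,
# converted to the standard error of a subgroup mean (std dev / sqrt(n)).
all_obs = [x for g in subgroups for x in g]
n = len(subgroups[0])
std_err = statistics.stdev(all_obs) / n ** 0.5

ucl = center + 3 * std_err                # upper control limit
lcl = center - 3 * std_err                # lower control limit
out_of_control = [i for i, m in enumerate(means) if not lcl <= m <= ucl]
print(f"CL={center:.3f} UCL={ucl:.3f} LCL={lcl:.3f} signals={out_of_control}")
```

With these data every subgroup mean falls inside the limits, so the chart would raise no special-cause signal.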

The chart may have other optional features, including:

 Upper and lower warning limits, drawn as separate lines, typically two standard errors above
and below the center line
 Division into zones, with the addition of rules governing frequencies of observations in each
zone
 Annotation with events of interest, as determined by the Quality Engineer in charge of the
process's quality
Chart usage

If the process is in control, all points will plot within the control limits. Any observations
outside the limits, or systematic patterns within, suggest the introduction of a new (and likely
unanticipated) source of variation, known as a special-cause variation. Since increased
variation means increased quality costs, a control chart "signaling" the presence of a special-
cause requires immediate investigation.

This makes the control limits very important decision aids. The control limits tell you about
process behavior and have no intrinsic relationship to any specification targets or engineering
tolerance. In practice, the process mean (and hence the center line) may not coincide with the
specified value (or target) of the quality characteristic because the process' design simply
can't deliver the process characteristic at the desired level.

Control charts omit specification limits or targets because of the tendency of those involved
with the process (e.g., machine operators) to focus on performing to specification when in
fact the least-cost course of action is to keep process variation as low as possible. Attempting
to make a process whose natural center is not the same as the target perform to target
specification increases process variability and increases costs significantly and is the cause of
much inefficiency in operations. Process capability studies do examine the relationship
between the natural process limits (the control limits) and specifications, however.

The purpose of control charts is to allow simple detection of events that are indicative of
actual process change. This simple decision can be difficult where the process characteristic
is continuously varying; the control chart provides statistically objective criteria of change.
When change is detected and considered good, its cause should be identified and possibly
made the new way of working; where the change is bad, its cause should be identified and
eliminated.

The purpose in adding warning limits or subdividing the control chart into zones is to provide
early notification if something is amiss. Instead of immediately launching a process
improvement effort to determine whether special causes are present, the Quality Engineer
may temporarily increase the rate at which samples are taken from the process output until it's
clear that the process is truly in control. Note that with three sigma limits, common-causes
result in signals less than once out of every forty points for skewed processes and less than
once out of every two hundred points for normally distributed processes.[5]
Choice of limits

Shewhart set 3-sigma (3-standard error) limits on the following basis.

 The coarse result of Chebyshev's inequality that, for any probability distribution, the
probability of an outcome greater than k standard deviations from the mean is at most 1/k².

 The finer result of the Vysochanskii-Petunin inequality, that for any unimodal probability
distribution, the probability of an outcome greater than k standard deviations from the
mean is at most 4/(9k²).
 The empirical investigation of sundry probability distributions reveals that at least 99% of
observations occurred within three standard deviations of the mean.
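The three bounds can be checked numerically at k = 3 standard deviations, the limit Shewhart chose. The arithmetic below follows the formulas just quoted; only the normal tail requires the error function:

```python
import math

k = 3  # number of standard deviations

chebyshev = 1 / k**2                   # bound for any distribution
vysochanskii_petunin = 4 / (9 * k**2)  # bound for any unimodal distribution
# Exact two-sided tail probability for a normal distribution, via erf.
normal_tail = 2 * (1 - 0.5 * (1 + math.erf(k / math.sqrt(2))))

print(f"Chebyshev:            <= {chebyshev:.4f}")
print(f"Vysochanskii-Petunin: <= {vysochanskii_petunin:.4f}")
print(f"Normal tail:             {normal_tail:.4f}")
```

The bounds tighten from roughly 11.1% (any distribution) to 4.9% (unimodal) to about 0.27% (normal), which is why 3-sigma limits produce so few false alarms in practice.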

Shewhart summarized the conclusions by saying:

... the fact that the criterion which we happen to use has a fine ancestry in highbrow
statistical theorems does not justify its use. Such justification must come from empirical
evidence that it works. As the practical engineer might say, the proof of the pudding is in the
eating.[citation needed]

Though he initially experimented with limits based on probability distributions, Shewhart
ultimately wrote:

Some of the earliest attempts to characterize a state of statistical control were inspired by the
belief that there existed a special form of frequency function f and it was early argued that
the normal law characterized such a state. When the normal law was found to be inadequate,
then generalized functional forms were tried. Today, however, all hopes of finding a unique
functional form f are blasted.[citation needed]

The control chart is intended as a heuristic. Deming insisted that it is not a hypothesis test and
is not motivated by the Neyman-Pearson lemma. He contended that the disjoint nature of
population and sampling frame in most industrial situations compromised the use of
conventional statistical techniques. Deming's intention was to seek insights into the cause
system of a process ...under a wide range of unknowable circumstances, future and past….
[citation needed]
He claimed that, under such conditions, 3-sigma limits provided ... a rational and
economic guide to minimum economic loss... from the two errors:[citation needed]

1. Ascribe a variation or a mistake to a special cause when in fact the cause belongs to the
system (common cause). (Also known as a Type I error)
2. Ascribe a variation or a mistake to the system (common causes) when in fact the cause was
special. (Also known as a Type II error)

Calculation of standard deviation

As for the calculation of control limits, the standard deviation (error) required is that of the
common-cause variation in the process. Hence, the usual estimator, in terms of sample
variance, is not used as this estimates the total squared-error loss from both common- and
special-causes of variation.
An alternative method is to use the relationship between the range of a sample and its
standard deviation derived by Leonard H. C. Tippett, an estimator which tends to be less
influenced by the extreme observations which typify special-causes.
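A minimal sketch of the range-based estimate, assuming subgroups of size 5 and the tabulated constant d2 ≈ 2.326; the data are made up for illustration.

```python
import statistics

# d2 is a standard control-chart constant (the expected range of a
# sample of n standard normal observations); 2.326 is the tabulated
# value for subgroups of size 5.
D2_N5 = 2.326

def sigma_from_ranges(subgroups):
    """Estimate common-cause sigma as R-bar / d2, the range-based
    estimator built on Tippett's result, for subgroups of size 5."""
    ranges = [max(g) - min(g) for g in subgroups]
    return statistics.mean(ranges) / D2_N5

subgroups = [
    [10.2, 9.9, 10.1, 10.0, 9.8],
    [10.0, 10.3, 9.9, 10.1, 10.0],
    [9.7, 10.0, 10.2, 9.9, 10.1],
]
print(round(sigma_from_ranges(subgroups), 3))  # ≈ 0.186
```

Because each range is computed within a subgroup, a special cause that shifts the mean between subgroups does not inflate the estimate the way the pooled sample variance would.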

Rules for detecting signals


The most common sets are:

 The Western Electric rules
 The Wheeler rules (equivalent to the Western Electric zone tests [6])
 The Nelson rules

There has been particular controversy as to how long a run of observations, all on the same
side of the centre line, should count as a signal, with 6, 7, 8 and 9 all being advocated by
various writers.

The most important principle for choosing a set of rules is that the choice be made before the
data is inspected. Choosing rules once the data have been seen tends to increase the Type I
error rate owing to testing effects suggested by the data.
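As a hedged illustration, two common signal rules (a point beyond 3-sigma, and a run of eight on one side of the centre line) might be coded as follows; the rule set and data are assumptions, not a complete Western Electric implementation.

```python
def rule_beyond_limits(points, centre, sigma):
    """Rule 1 in most rule sets: any point beyond 3 sigma."""
    return [i for i, x in enumerate(points)
            if abs(x - centre) > 3 * sigma]

def rule_run_of_eight(points, centre):
    """A run rule: eight consecutive points on one side of the
    centre line (writers advocate anywhere from 6 to 9)."""
    signals = []
    run, side = 0, 0
    for i, x in enumerate(points):
        s = 1 if x > centre else -1 if x < centre else 0
        run = run + 1 if s == side and s != 0 else (1 if s != 0 else 0)
        side = s
        if run >= 8:
            signals.append(i)
    return signals

data = [0.1, -0.2, 0.3, 3.5, 0.2, 0.1, 0.2, 0.3, 0.1, 0.4, 0.2, 0.3, 0.1]
print(rule_beyond_limits(data, centre=0.0, sigma=1.0))  # [3]
print(rule_run_of_eight(data, centre=0.0))              # [9, 10, 11, 12]
```

Each additional rule raises the chart's sensitivity but also its false-alarm rate, which is why the set must be fixed before the data are inspected.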

Alternative bases
In 1935, the British Standards Institution, under the influence of Egon Pearson and against
Shewhart's spirit, adopted control charts, replacing 3-sigma limits with limits based on
percentiles of the normal distribution. This approach is still represented by John
Oakland and others but has been widely deprecated by writers in the Shewhart-Deming
tradition.

Performance of control charts


When a point falls outside of the limits established for a given control chart, those responsible
for the underlying process are expected to determine whether a special cause has occurred. If
one has, it is appropriate to determine if the results with the special cause are better than or
worse than results from common causes alone. If worse, then that cause should be eliminated
if possible. If better, it may be appropriate to intentionally retain the special cause within the
system producing the results.[citation needed]

It is known that even when a process is in control (that is, no special causes are present in the
system), there is approximately a 0.27% probability of a point exceeding 3-sigma control
limits. Since the control limits are evaluated each time a point is added to the chart, it readily
follows that every control chart will eventually signal the possible presence of a special
cause, even though one may not have actually occurred. For a Shewhart control chart using
3-sigma limits, this false alarm occurs on average once every 1/0.0027 or 370.4 observations.
Therefore, the in-control average run length (or in-control ARL) of a Shewhart chart is
370.4.[citation needed]
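The arithmetic behind the 370.4 figure can be reproduced directly from the normal tail probability.

```python
import math

# Two-sided probability that an in-control (normally distributed)
# point falls outside the 3-sigma limits.
p = math.erfc(3 / math.sqrt(2))

# In-control average run length: the mean of a geometric
# distribution with success probability p.
arl = 1 / p

print(f"false-alarm probability per point: {p:.4f}")  # ~0.0027
print(f"in-control ARL: {arl:.1f}")                   # ~370.4
```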

Meanwhile, if a special cause does occur, it may not be of sufficient magnitude for the chart
to produce an immediate alarm condition. If a special cause occurs, one can describe that
cause by measuring the change in the mean and/or variance of the process in question. When
those changes are quantified, it is possible to determine the out-of-control ARL for the chart.
[citation needed]

It turns out that Shewhart charts are quite good at detecting large changes in the process mean
or variance, as their out-of-control ARLs are fairly short in these cases. However, for smaller
changes (such as a 1- or 2-sigma change in the mean), the Shewhart chart does not detect
these changes efficiently. Other types of control charts have been developed, such as the
EWMA chart and the CUSUM chart, which detect smaller changes more efficiently by
making use of information from observations collected prior to the most recent data point.
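A minimal sketch of the EWMA statistic; the smoothing constant λ = 0.2 and the data are illustrative assumptions.

```python
def ewma(points, target, lam=0.2):
    """Exponentially weighted moving average statistic: each value
    blends the newest observation with the accumulated history, so
    a small sustained shift builds up into a visible drift."""
    z = target
    out = []
    for x in points:
        z = lam * x + (1 - lam) * z
        out.append(z)
    return out

# A roughly 1-sigma shift after the fifth point: the EWMA drifts
# upward even though no single point breaches 3-sigma limits.
data = [0.1, -0.3, 0.2, 0.0, -0.1, 1.2, 0.9, 1.1, 1.0, 1.3]
print([round(z, 2) for z in ewma(data, target=0.0)])
```

This is the sense in which EWMA (and, analogously, CUSUM) uses information from earlier observations: the statistic carries memory, so evidence of a small shift accumulates across points instead of being judged one point at a time.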

Criticisms
Several authors have criticised the control chart on the grounds that it violates the likelihood
principle.[citation needed] However, the principle is itself controversial and supporters of control
charts further argue that, in general, it is impossible to specify a likelihood function for a
process not in statistical control, especially where knowledge about the cause system of the
process is weak.[citation needed]

Some authors have criticised the use of average run lengths (ARLs) for comparing control
chart performance, because that average usually follows a geometric distribution, which has
high variability and difficulties.[citation needed]

Types of charts

Chart | Process observation | Process observations relationships | Process observations type | Size of shift to detect

x̄ and R chart | Quality characteristic measurement within one subgroup | Independent | Variables | Large (≥ 1.5σ)
x̄ and s chart | Quality characteristic measurement within one subgroup | Independent | Variables | Large (≥ 1.5σ)
Shewhart individuals control chart (ImR chart or XmR chart) | Quality characteristic measurement for one observation | Independent | Variables† | Large (≥ 1.5σ)
Three-way chart | Quality characteristic measurement within one subgroup | Independent | Variables | Large (≥ 1.5σ)
p-chart | Fraction nonconforming within one subgroup | Independent | Attributes† | Large (≥ 1.5σ)
np-chart | Number nonconforming within one subgroup | Independent | Attributes† | Large (≥ 1.5σ)
c-chart | Number of nonconformances within one subgroup | Independent | Attributes† | Large (≥ 1.5σ)
u-chart | Nonconformances per unit within one subgroup | Independent | Attributes† | Large (≥ 1.5σ)
EWMA chart | Exponentially weighted moving average of quality characteristic measurement within one subgroup | Independent | Attributes or variables | Small (< 1.5σ)
CUSUM chart | Cumulative sum of quality characteristic measurement within one subgroup | Independent | Attributes or variables | Small (< 1.5σ)
Time series model | Quality characteristic measurement within one subgroup | Autocorrelated | Attributes or variables | N/A
Regression control chart | Quality characteristic measurement within one subgroup | Dependent of process control variables | Variables | Large (≥ 1.5σ)


Some practitioners also recommend the use of Individuals charts for attribute data,
particularly when the assumptions of either binomially-distributed data (p- and np-charts) or
Poisson-distributed data (u- and c-charts) are violated.[7] Two primary justifications are given
for this practice. First, normality is not necessary for statistical control, so the Individuals
chart may be used with non-normal data.[8] Second, attribute charts derive the measure of
dispersion directly from the mean proportion (by assuming a probability distribution), while
Individuals charts derive the measure of dispersion from the data, independent of the mean,
making Individuals charts more robust than attributes charts to violations of the assumptions
about the distribution of the underlying population.[9] It is sometimes noted that the
substitution of the Individuals chart works best for large counts, when the binomial and
Poisson distributions approximate a normal distribution, i.e., when the number of trials n >
1000 for p- and np-charts or λ > 500 for u- and c-charts.
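The contrast between the two measures of dispersion can be sketched as follows; the proportions and subgroup size are hypothetical, and d2 = 1.128 is the standard constant for moving ranges of size 2.

```python
import statistics

# Daily nonconforming proportions (hypothetical data), n = 200 per day.
props = [0.04, 0.06, 0.05, 0.03, 0.07, 0.05, 0.04, 0.06, 0.05, 0.05]
n = 200

# p-chart: dispersion comes from the binomial assumption.
p_bar = statistics.mean(props)
sigma_p = (p_bar * (1 - p_bar) / n) ** 0.5
p_limits = (p_bar - 3 * sigma_p, p_bar + 3 * sigma_p)

# XmR (Individuals) chart: dispersion comes from the data itself,
# via the mean moving range divided by the constant d2 = 1.128.
mr_bar = statistics.mean(abs(a - b) for a, b in zip(props, props[1:]))
sigma_x = mr_bar / 1.128
x_limits = (p_bar - 3 * sigma_x, p_bar + 3 * sigma_x)

print("p-chart limits:", tuple(round(v, 4) for v in p_limits))
print("XmR limits:   ", tuple(round(v, 4) for v in x_limits))
```

When the binomial assumption holds, the two sets of limits agree closely; when it is violated (for example, by overdispersion), they diverge, which is the practical motivation for the substitution described above.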

Critics of this approach argue that control charts should not be used when their underlying
assumptions are violated, such as when process data is neither normally distributed nor
binomially (or Poisson) distributed. Such processes are not in control and should be improved
before the application of control charts. Additionally, application of the charts in the presence
of such deviations increases the type I and type II error rates of the control charts, and may
make the chart of little practical use.

Central limit theorem

Main article: Central limit theorem


The theorem states that under certain (fairly common) conditions, the sum of a large number
of random variables will have an approximately normal distribution. For example, if (x1, …,
xn) is a sequence of iid random variables, each having mean μ and variance σ², then the
central limit theorem states that

√n (x̄n − μ) → N(0, σ²) in distribution, as n → ∞,

where x̄n denotes the sample mean. The theorem will hold even if the summands xi are not
iid, although some constraints on the degree of dependence and the growth rate of moments
still have to be imposed.
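A quick simulation illustrates the theorem: means of uniform draws, which are individually far from normal, concentrate around the theoretical mean and standard error (the sample sizes here are arbitrary choices).

```python
import random
import statistics

random.seed(0)

# Means of 40 uniform(0, 1) draws: individual draws are far from
# normal, but the means cluster around the theoretical values
# mu = 0.5 and sigma/sqrt(n) = sqrt(1/12) / sqrt(40).
n = 40
means = [statistics.mean(random.random() for _ in range(n))
         for _ in range(5000)]

print(round(statistics.mean(means), 3))   # close to 0.5
print(round(statistics.stdev(means), 3))  # close to sqrt((1/12)/40) ≈ 0.046
```

The same mechanism is what makes subgroup means on an x̄ chart approximately normal even when the underlying observations are not.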

The importance of the central limit theorem cannot be overemphasized. A great number of
test statistics, scores, and estimators encountered in practice contain sums of certain random
variables in them; even more estimators can be represented as sums of random variables
through the use of influence functions. All of these quantities are governed by the central
limit theorem and will have asymptotically normal distributions as a result.

Histogram

In statistics, a histogram is a graphical representation, showing a visual impression of the
distribution of data. It is an estimate of the probability distribution of a continuous variable
and was first introduced by Karl Pearson.[1] A histogram consists of tabular frequencies,
shown as adjacent rectangles, erected over discrete intervals (bins), with an area equal to the
frequency of the observations in the interval. The height of a rectangle is also equal to the
frequency density of the interval, i.e., the frequency divided by the width of the interval. The
total area of the histogram is equal to the number of data points. A histogram may also be
normalized to display relative frequencies. It then shows the proportion of cases that fall into
each of several categories, with the total area equaling 1. The categories are usually specified
as consecutive, non-overlapping intervals of a variable. The categories (intervals) must be
adjacent, and are often chosen to be of the same size.[2]

Histograms are used to plot density of data, and often for density estimation: estimating the
probability density function of the underlying variable. The total area of a histogram used for
probability density is always normalized to 1. If the lengths of the intervals on the x-axis are
all 1, then a histogram is identical to a relative frequency plot.
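A minimal sketch of density normalization, using made-up data and bins: heights are frequencies divided by (n × bin width), so the bar areas sum to 1.

```python
def histogram_density(data, bins):
    """Bin data into half-open (left, right) intervals and return bar
    heights as frequency densities, so that total bar area is 1."""
    counts = [0] * len(bins)
    for x in data:
        for i, (lo, hi) in enumerate(bins):
            if lo <= x < hi:
                counts[i] += 1
                break
    n = len(data)
    return [c / (n * (hi - lo)) for c, (lo, hi) in zip(counts, bins)]

data = [0.1, 0.2, 0.2, 0.4, 0.5, 0.5, 0.6, 0.8, 0.9, 0.95]
bins = [(0.0, 0.25), (0.25, 0.5), (0.5, 0.75), (0.75, 1.0)]
heights = histogram_density(data, bins)

# Total area: sum of height x width over all bins equals 1.
area = sum(h * (hi - lo) for h, (lo, hi) in zip(heights, bins))
print(heights, area)
```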

An alternative to the histogram is kernel density estimation, which uses a kernel to smooth
samples. This will construct a smooth probability density function, which will in general
more accurately reflect the underlying variable.

The histogram is one of the seven basic tools of quality control

Seven Basic Tools of Quality

The Seven Basic Tools of Quality is a designation given to a fixed set of graphical
techniques identified as being most helpful in troubleshooting issues related to quality.[1]
They are called basic because they are suitable for people with little formal training in
statistics and because they can be used to solve the vast majority of quality-related issues.[2]

The tools are:[3]

 The cause-and-effect or Ishikawa diagram


 The check sheet
 The control chart
 The histogram
 The Pareto chart
 The scatter diagram
 Stratification (alternatively, flow chart or run chart)

The designation arose in postwar Japan, inspired by the seven famous weapons of Benkei.[4]
At that time, companies that had set about training their workforces in statistical quality
control found that the complexity of the subject intimidated the vast majority of their workers
and scaled back training to focus primarily on simpler methods which suffice for most
quality-related issues anyway.[5]

The Seven Basic Tools stand in contrast with more advanced statistical methods such as
survey sampling, acceptance sampling, statistical hypothesis testing, design of experiments,
multivariate analysis, and various methods developed in the field of operations research

Seven Management and Planning Tools


The Seven Management and Planning Tools have their roots in Operations Research work done
after World War II and the Japanese Total Quality Control (TQC) research. In 1979 the book Seven
New Quality Tools for Managers and Staff was published, and in 1983 it was translated into English.

The seven tools include:

1. Affinity Diagram (KJ Method)


2. Interrelationship Digraph (ID)
3. Tree Diagram
4. Prioritization Matrix
5. Matrix Diagram
6. Process Decision Program Chart (PDPC)
7. Activity Network Diagram

Affinity Diagram
This tool takes large amounts of disorganized data and information and enables one to
organize it into groupings based on natural relationships. It was created in the 1960s by the
Japanese anthropologist Jiro Kawakita and is also known as the KJ diagram, after him. The
affinity diagram is a special kind of brainstorming tool.

Interrelationship Digraph

This tool displays all the interrelated cause-and-effect relationships and factors involved in a
complex problem and describes desired outcomes. The process of creating an
interrelationship digraph helps a group analyze the natural links between different aspects of
a complex situation.

Tree Diagram

This tool is used to break down broad categories into finer and finer levels of detail. It can
map the levels of detail of the tasks that are required to accomplish a goal. Developing the
tree diagram helps one move step by step from generalities to specifics.

Prioritization Matrix

This tool is used to prioritize items and describe them in terms of weighted criteria. It uses a
combination of tree and matrix diagramming techniques to do a pair-wise evaluation of items
and to narrow down options to the most desired or most effective.
Matrix Diagram

This tool shows the relationship between items. At each intersection, a relationship is either
absent or present. It then gives information about the relationship, such as its strength or the
roles played by various individuals or measurements. Differently shaped matrices (L, T, Y,
X, C, R, and roof-shaped) are possible, depending on how many groups must be
compared.

Process Decision Program Chart (PDPC)

A useful way of planning is to break down tasks into a hierarchy, using a Tree Diagram. The
PDPC extends the tree diagram a couple of levels to identify risks and countermeasures for
the bottom level tasks. Different shaped boxes are used to highlight risks and identify
possible countermeasures (often shown as 'clouds' to indicate their uncertain nature). The
PDPC is similar to the Failure Modes and Effects Analysis (FMEA) in that both identify
risks, consequences of failure, and contingency actions; the FMEA also rates relative risk
levels for each potential failure point.

Activity Network Diagram


This tool is used to plan the appropriate sequence or schedule for a set of tasks and related
subtasks. It is used when subtasks must occur in parallel. The diagram enables one to
determine the critical path (longest sequence of tasks). (See also PERT diagram.)
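A hedged sketch of the critical-path idea: with tasks already listed in topological order, the longest path falls out of one forward pass (the task names, durations, and dependencies are invented for illustration).

```python
# Longest-path (critical path) computation over a small task network:
# each task finishes at (latest predecessor finish) + (own duration).
durations = {"A": 3, "B": 2, "C": 4, "D": 1}
predecessors = {"A": [], "B": ["A"], "C": ["A"], "D": ["B", "C"]}

finish = {}
for task in ["A", "B", "C", "D"]:  # already topologically ordered
    start = max((finish[p] for p in predecessors[task]), default=0)
    finish[task] = start + durations[task]

print(max(finish.values()))  # project length: A -> C -> D = 3 + 4 + 1 = 8
```

The tasks B and C run in parallel after A; the critical path runs through C because it is the longer branch, so only delays on A, C, or D delay the project.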
