
Improving Some Basic Top Management Decisions


Eli Schragenheim

Introduction

Currently, most companies do not have the relevant information to answer a series of important
questions:

What products/services should the company sell?

What price is appropriate?

What level of capacity should the company maintain?

These are the basic decisions every organization must make, with or without appropriate relevant
information. These are not one-time questions; managers frequently need to re-evaluate the
answers whenever new deals with clients or suppliers are considered and whenever price changes are
suggested. The quality of these decisions is crucial to the organization – no matter what goal, vision,
mission, and leading strategy the owners and top executives have in mind. A natural question is: Are
most of the actual decisions good enough? Also, are the current tools used to make the above
decisions truly effective – leading the decision makers to the right choices?

While the above questions apply to every organization, this article focuses mainly on manufacturing and
most service-oriented organizations that possess two organizational characteristics to which this article
is relevant:

1. The lead time, the normal span of time from customer order to delivery, is relatively short
(e.g., less than eight weeks);
2. The key resources have recognized finite capacity.

Therefore, multi-project organizations, whose lead times could be many months, and distribution and
retail organizations, which have capacity problems (shelf space and liquidity) but do not directly measure
the capacity and the actual load, are not yet dealt with in this article.

Unless a new company is built from scratch, or the company is in a startup stage, an organization's
current situation already includes products and services sold to some market segments, and
various resources with specific capabilities and capacities already in place. Thus, the
actual decisions to be made are whether to make changes to the existing product-mix, capacity levels,
and pricing.

I claim that it is possible to find much better answers to the above questions and as a result
make much better decisions.

In other words, most organizations can greatly improve their decisions. For example, current company
capabilities could serve additional market segments and often also respond to other unsatisfied
needs of clients. Any such change to the product mix will involve changes in operations as well as in
marketing and sales, and also in the whole strategy of the company.

Common decision-making practice is based on traditional management accounting tools. The Theory of
Constraints (TOC) challenges the basic use of "cost-per-unit" to calculate the potential impact of sales
(and products) on net income. Many scholars have pointed out that the cost-per-unit figures are flawed
and often significantly distort the picture of bottom-line impact. However, TOC, through the extant
Throughput Accounting (TA) methodology together with the five focusing steps, does not yet offer a full
and robust methodology for arriving at truly good and reliable decisions.

The purpose of this paper is to expand the TOC methodology to provide the information required to
support product-mix and capacity utilization decisions. First, the paper argues the need for such decision
support, and then describes the basis for a much more dynamic methodology to constantly improve
product-mix and market-segment selection. Thus, this article presents the missing elements in both cost
accounting and TA methodology that are required to support product-mix and capacity utilization
decisions.

A key claim of this paper is the need to consider the full impact of uncertainty. In order to deal
successfully with uncertainty, the paper suggests translating the intuition of the relevant people into
numbers that express the assessment of uncertainty, and then presenting the full ramifications of those
assessments to the management team for the final decisions. Finally, this paper presents the complete
set of tools to support these key decisions.

The intended audience for this paper is people possessing a more-than-introductory knowledge of TOC,
particularly the five focusing steps and throughput, inventory/investment, and operating expense (T, I,
and OE), otherwise known as Throughput Accounting, as originally developed by Dr. Eli Goldratt in the
mid-1980s. A later, more detailed article is planned to present these new concepts without requiring
the reader's prior knowledge of TOC.

An example demonstrating the extent of the problem

Assume a manufacturing company that produces a mix of about one hundred stock keeping units (SKUs).
The business is considering an offer to supply a new and very large potential customer. The new
business represents monthly shipments of a mix of twelve SKUs. However, the prospective customer
demands a substantial 15 percent discount from the current price list. The potential customer is willing
to commit to a minimum monthly demand and will provide a four-month rolling forecast of the
quantities required for the twelve SKUs.

The key question is: Should the company accept the order?

In order to answer the key question, two supporting questions have to be answered:

1. What information is required to make a good decision?
2. How should the decision process be conducted, considering the impact of the uncertainty of
certain variables?

Cost accounting and Throughput Accounting recommend different data items to support the decision.
Many managers, especially CEOs, may simply prefer to use their rough intuition to arrive at a decision.

The Cost-per-Unit Approach

The invention of "cost-per-unit" in the early part of the 20th century provided a simple way for the
business world to determine whether the selling price of one unit of a product covers the expense of
producing and selling that unit. The underlying assumption is that directly calculating the real bottom-
line impact (now and possibly also in the future) of selling one unit of product P1 for a price of $N is
very complicated. Given the "cost-per-unit," one gets a single number representing the cost that went
into producing and selling the product unit, and thus all the above decisions can easily be made.

However, the big promise of an accurate "cost-per-unit" is unrealized. The way significant costs that are
not truly variable with every sale are allocated to a product distorts the real impact on the bottom line.
Professors of accounting have spoken out against the wide and automatic use of cost-per-unit for
key decisions, claiming that all methods of allocating the common costs of shared resources are
arbitrary and that the one chosen cannot be defended against competing methods.[1]

It took me years to understand the mistake those professors make. While they are absolutely right
about the distortions in applying the cost-per-unit to the above decisions, it is easy to ignore the need of
managers to make decisions and to be protected from unfair criticism after the fact. In other words,
managers need "a book" describing how decisions should be made, especially decisions that could cause
negative results.

Activity-Based Costing (ABC) came to offer a solution to the problem of allocation. The idea is to create
a better link between any usage of capacity and its costs. Analyzing the leading example above, ABC
could provide a clearer identification of the real usage of capacity and, from that, generate the cost of
such a decision, to be compared with the revenue. Underlying this direction is the assumption that:

Under ideal management, every usage of capacity should, at least, cover the associated costs.

I am not convinced that ABC truly succeeds in accurately registering, for every decision, the stream of
required capacity consumption. Eventually some assumptions about the usage of capacity are adopted
even though they are not real. For instance, how IT capacity is consumed per cost-driver cannot
be easily modeled in a useful way. However, this is not the main point. If every single usage of capacity
is expected to, at least, cover its associated costs, then we have to believe that:

It is possible to match capacity to demand!

[1] The Arthur L. Thomas monographs published by the American Accounting Association ("The Allocation
Problem in Financial Accounting Theory," Studies in Accounting Research No. 3, 1969, and "The Allocation
Problem, Part Two," Studies in Accounting Research No. 9, 1974) failed to gain acceptance only because
they were directed at financial accounting rather than the more appropriate managerial accounting.
While allocation in financial accounting is flawed, it is necessary to provide full information to outside
users. (See also Thomas's response (The Accounting Review, Vol. 53, No. 1, January 1978) to a critique of
the monographs by Leonard G. Eckel (The Accounting Review, Vol. 51, No. 4, October 1976).)

If the above assumption is not valid, then whenever capacity consumption is required it is possible that
spare capacity is left, and thus there are no associated costs due to that consumption. In the other case,
when there is no spare capacity left and yet there is potential demand that requires capacity
consumption, there is no simple way to know whether quick additional capacity can be obtained and
how much it would cost.

If we accept the above assumption, then the derivation of the cost of capacity per usage is straight-
forward. One only needs the cost of providing the resource for one year and its available capacity (for
instance, a machine that is available for 16 hours per day, six days a week) to derive the cost per hour or
minute. If, for some technical reason, one can use the machine, or the operator, for only 90% of the
time, then the cost-per-hour is calculated to absorb that unavoidable loss of capacity.
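
A minimal sketch of that arithmetic (the annual cost, availability, and usable fraction below are hypothetical figures, not data from the example):

```python
# Hypothetical figures: a machine costing $300,000 per year to own and staff,
# available 16 hours/day, 6 days/week, but usable only 90% of the time.
annual_cost = 300_000.0          # $/year, assumed for illustration
hours_per_year = 16 * 6 * 52     # nominal availability: 4,992 hours
usable_fraction = 0.90           # technical availability

usable_hours = hours_per_year * usable_fraction
cost_per_hour = annual_cost / usable_hours   # absorbs the unavoidable idle 10%
cost_per_minute = cost_per_hour / 60

print(f"Cost per hour:   ${cost_per_hour:.2f}")
print(f"Cost per minute: ${cost_per_minute:.2f}")
```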

When there is a certain amount of unused (idle) capacity, yet we accept the assumption that it is possible
to match capacity to demand, we have to face extra costs that are not directly covered. The
assumption implies that management has failed to match capacity to demand, so managers should know
the extent of their failure and take actions to fix the situation. ABC does yield the figures for the costs of
idle capacity.

As already mentioned, the assumption implies that when additional capacity is needed to materialize a
sale, it is possible to get the exact additional amount at the known cost figures for that particular
capacity.

Really?

Suppose that in order to supply the quantities for the leading example, a specific machine, call it M3,
requires 12.3% of its available capacity. Suppose M3 is already loaded to 92%. The question is: How do
we match capacity to the new demand? On paper we have idle capacity of 8%, but that is not enough.
The company needs 4.3% additional capacity on that machine in order to match its capacity to the new
demand. Is it possible to get that exact amount of machine capacity, and if so, would the cost be 4.3%
of the current cost of that machine?

Robert Kaplan and Robin Cooper, the creators of activity-based costing, write about matching capacity to
demand:

“They can then forecast both where new bottlenecks for resources will develop in the future as well as to
identify resources whose current and future supply will likely exceed the future demands for the
capabilities they provide. They can also iterate back and forth between decisions that affect the demands
for resources with decisions to increase or decrease the supply of resources. ABC provides a dynamic
theory of constraints, enabling managers to make better decisions today in light of their impact on future
resource constraints.” [Kaplan and Cooper, Cost and Effect, page 134]

The suggested process of iterating back and forth between shaping the demand and checking the
capacity profile is very insightful; the question is whether the ABC calculations provide good support for
that process. What emerges from the above verbalization is that Kaplan and Cooper introduce a small
change to the above assumption, which now reads:

It is possible, at least in the long term, to match capacity to demand

If this assumption were valid in reality, then it would make sense to require every usage of capacity to
return the relative costs of maintaining that capacity!

Alas, that assumption is clearly NOT valid in reality. There are three different, independent causes that
limit our ability to match capacity to demand!

Cause 1: The TOC insight that in order to perform well against the clients' requirements it is imperative
to maintain a substantial amount of protective capacity. In other words, balancing capacity with the
demand leads to such inferior performance to the market that the business would not be sustained.
TOC distinguishes between the capacity-constrained resource (CCR), for which a much closer match
between capacity and demand is beneficial, and the rest of the resources, which must have a significant
amount of protective capacity.

General comments on cause 1

1. There is no known formula for how much protective capacity is required. Buffer
management indicates whether the current state of capacity is safe.
2. Eli Schragenheim claims that even the CCR needs to have a certain level of protective
capacity to respond adequately to market requirements. This level of protective capacity is
lower than what all other resources must have.
3. When the market requirements become stricter, demanding faster response times, more
protective capacity is required.

Cause 2: Market demand changes faster than capacity levels can be changed!

The immediate ramification is that capacity should be maintained for the peak period, which means a
substantial amount of excess capacity automatically exists at off-peak times.

Cause 3: Capacity can be purchased ONLY in certain minimum quantities!

Actually, the claim is that capacity is NOT a continuous function! It is definitely a discrete one, requiring
different mathematics than the tools for continuous functions. Optimal results based on mathematical
techniques for continuous functions are grossly misleading.

The minimum quantities for purchasing capacity of a specific resource depend on the type of the
resource; for some resources these minimum quantities are VERY large. Hence, even if just 4.3% of the
monthly capacity of M3 is needed, it could be the case that the minimum quantity available for purchase
is 100% – in other words: buy a new machine!
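
A minimal sketch of that discreteness, reusing the M3 figures from the leading example; the purchase increment and the machine's annual cost are hypothetical assumptions:

```python
import math

# M3 from the leading example: 92% loaded, the new demand needs another 12.3%.
current_load = 0.92
new_demand_load = 0.123
idle = 1.0 - current_load                 # 8% idle on paper
shortfall = new_demand_load - idle        # 4.3% still missing

# Hypothetical purchase option: capacity comes only in whole machines,
# i.e., increments of 100% at an assumed $300,000 per year each.
min_increment = 1.0
cost_per_increment = 300_000.0

increments = math.ceil(shortfall / min_increment)
delta_oe = increments * cost_per_increment

print(f"Shortfall: {shortfall:.1%} of one machine's capacity")
print(f"Must buy {increments} machine(s) -> delta-OE = ${delta_oe:,.0f}")
# The cost of the 4.3% actually needed is nowhere near 4.3% of a machine's cost.
```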

In addition to capacity being discrete rather than continuous, market demand is also often discrete,
since many large clients insist on being supplied with a whole collection of different products, in
certain quantities. This means that matching the capacity to such market opportunities might require a
substantial amount of excess capacity.

Realizing that there is no way a company is capable of truly matching capacity to demand pulls the rug
out from under the various ways to calculate cost-per-unit. The usage of capacity for a particular demand
might be either "free," meaning use of otherwise idle capacity, or very expensive, as it compels the
purchase of much more capacity than is theoretically required.

What is still true is that total revenues from the whole product-mix sold should be greater than total
costs.

Coming back to the leading example, the cost calculation of the capacity required to deliver the monthly
quantities to that specific client does not represent the company's actual additional expense in
delivering the goods. It could be that using the existing spare capacity would not have any
impact on expenses, but it could also be that requiring more capacity than is available would cause
very high expenses. An alternative way to accommodate the new opportunity is to reduce sales of
current demand, causing a loss of current revenues that should be compared with the additional
business gained.

Throughput Accounting (TA)

TOC accounting is centered on T, I, and OE. I assume the reader is familiar with the basic definitions as
they appear in the books by Corbett and Smith and in the TOCICO Dictionary. Let's first analyze how TOC
would approach the leading example and then expand the discussion to the conceptual advantages and
the limitations of TOC in supporting the full range of decisions concerning product-mix, pricing, and
capacity.

The basic TOC approach to evaluating the leading example is first to calculate the net T to be generated
by a typical monthly delivery. As the client insists on a 15% reduction from the list price, there is a
possibility that the total T generated by the monthly shipment would be negative. This would happen
only when the ratio of T to the selling price is less than 15%. Such a low T/selling-price ratio is not
common in business. When it does happen, the T generated directly from the order is negative, so,
in that case, the offer should be rejected.
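
A minimal sketch of that first check, with hypothetical list prices, truly-variable costs, and monthly quantities for three of the twelve SKUs:

```python
# Hypothetical data per SKU: (list price, truly-variable cost, monthly quantity).
skus = {
    "P01": (100.0, 40.0, 500),
    "P02": (80.0, 70.0, 300),   # T/price is only 12.5%, below the 15% discount
    "P03": (120.0, 40.0, 200),
}
discount = 0.15

total_t = 0.0
for sku, (price, tvc, qty) in skus.items():
    t_per_unit = price * (1 - discount) - tvc   # T = discounted price - TVC
    total_t += t_per_unit * qty
    print(f"{sku}: T per unit = {t_per_unit:+.2f}")

print(f"Total monthly T of the offer: {total_t:+,.2f}")
# A positive total is only the first hurdle; the capacity check comes next.
```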

Once the calculations show a certain positive T for the monthly delivery, the next question is:

Does the company face a threat to its other sales because of the capacity invested in the new offer?

In other words, considering the additional load, is there a lack of capacity on even one resource? If no
shortage of capacity occurs, meaning it is possible to deliver the new offer on top of all the current sales,
then the impact on the bottom line is the additional annual T generated by the offer. In such a case, the
simple answer is to accept the new offer.

If there is a shortage of capacity, then other sales would be affected. The decision process has to look
for the sales that yield the lowest overall T, would free enough capacity, and can be cancelled without
causing problems.

The focus of TA is on Throughput. In a way this term is not new; the term "contribution" is an
equivalent concept. Marginal costing is an existing management accounting method that seems similar
to TA, but it contains basic flaws that have made it a target for harsh criticism. The emphasis of TOC on
Throughput as a prime global measurement, expressing the total contribution of sales rather than just
the T generated by one order in isolation, resolves the negative branches of marginal costing. The
advantage of viewing the total T versus the total OE is that it delivers a simple and effective message on
the state of the organization. Looking at T – OE = Profit, rather than the common Revenues minus
Expenses, focuses the view on the relevant areas, thus simplifying the calculation of the net impact on
the bottom line. T focuses on sales, net of the truly-variable costs – the costs that vary with every sale
and thus can be considered linear. OE, operating expenses, consists mainly of the costs of maintaining
capacity, from the buildings/offices through manpower, maintenance, and transport, as well as
management and marketing expenses. Hence, whenever we need to consider delta-T minus delta-OE,
we know where to look in a focused way rather than searching for anything that impacts Expenses.

The current practice of Throughput Accounting looks at only one resource – the known capacity-
constrained resource (CCR) – assuming that only the capacity of the CCR might cause loss of orders due
to lack of capacity. The mechanism for checking any new offer relative to existing ones is to compare the
T/CU (Throughput per unit of the constraint's capacity) of the new order to the products with the lowest
existing T/CU. Let's investigate the logic and the inherent limitations of that practice.

The problematic sensitivity of T/CU

The original logic behind comparing products using T/CU was fully based on the assumption that there is
one CCR that is fully loaded by existing sales. In such a case, selling one additional unit of product X
comes at the expense of selling something else. Only the capacity of the CCR is considered. Thus, if the
T/CU of product X is larger than the T/CU of product Y, then selling more X and less Y adds more T
without adding more OE, and the bottom line goes up.
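
A minimal sketch of that ranking logic, with hypothetical products and a single fully loaded CCR; the bullets below list the conditions under which this ranking misleads:

```python
# Hypothetical products: T per unit and minutes of CCR capacity per unit.
products = {
    "X": {"t_per_unit": 45.0, "ccr_minutes": 5.0},
    "Y": {"t_per_unit": 60.0, "ccr_minutes": 10.0},
    "Z": {"t_per_unit": 30.0, "ccr_minutes": 2.0},
}

# Rank by T/CU: throughput generated per minute of the constraint's capacity.
ranked = sorted(products.items(),
                key=lambda kv: kv[1]["t_per_unit"] / kv[1]["ccr_minutes"],
                reverse=True)

for name, p in ranked:
    tcu = p["t_per_unit"] / p["ccr_minutes"]
    print(f"{name}: T/CU = {tcu:.1f} per CCR minute")
# -> Z (15.0) beats X (9.0) beats Y (6.0): with one fully loaded CCR,
#    classic TA says favor Z over X over Y.
```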

What is wrong with the above logic?

 What if the CCR is not 100% loaded? Then there is no real trade-off between selling more of
Product X and selling more of Product Y! Why not sell more of both?
 When the organization is not capacity constrained, T/CU is not relevant at all. There are no
TOC tools to lead an analysis identifying the direction for increasing sales while considering the
fact that at some point in time a CCR will emerge.
 When there is a quick way to temporarily increase the CCR capacity at a certain cost, there is
no trade-off between different products. The only question is whether the additional T
(delta-T) is larger than the additional cost (delta-OE) of the capacity increase. T/CU is not
relevant in that case.
 When more than one CCR is active. (An active CCR means the company has certain potential
demand that is not served.) TOC never claimed that only one CCR exists! The last part of The
Haystack Syndrome is dedicated to handling several interacting constraints. TOC claims that you
should not allow more than one CCR to exist, but certainly having more than one is a valid
possibility. Alan Barnard claimed that when there are more than two interacting CCRs, T/CU is
wrong. In his 2006 presentation at the TOCICO annual conference he quoted Eli Goldratt (Feb
2006): "Based on Alan's revised P&Q it is clear that using 'Highest Contribution per Scarce
Resource' when more than one resource is overloaded is simply Wrong… to the extent that it
can cause bankruptcy…" (I even challenge the requirement that the two CCRs have to interact.
Assume an assembly where one leg uses one CCR and another leg uses a different CCR. The two
do not interact, but the end product passes through both!)
 The discrete nature of sales. Giving up a certain limited amount of existing sales usually ends up
losing much more. You might like to reduce product Y sales by just 3%, but eventually you might
lose 15%, because clients look for a certain quantity, and if you cannot supply the whole quantity
requested they look elsewhere.
 A key point against T/CU is that the quantity of the new offer is not considered; while its
T/CU could be very appealing, it is possible that the quantity will turn a non-CCR into a CCR and
even into a bottleneck.
 Based on the above arguments, any large decision, like entering a new market segment or
serving a large new client, cannot be supported by the existing techniques of TA.
 Linear programming (LP) could be considered a methodology that offers a solution to the
above arguments against T/CU. Alan Barnard, in the presentation cited above, states: "However,
considering that LP ignores inherent variation & dynamic inter-dependencies that exist within
Demand, Capacity and Supply Constraints, LP cannot provide a practical solution (e.g. impact of
batching and dependent setups)."
o To the above I would add that the discrete nature of both sales and capacity is hard to
model in LP. More on the relationships between LP and TOC can be found in Vicky
Mabin's papers.

Concentrating on only one resource, treating all other resources as irrelevant, and ignoring the quick
ways to increase capacity, such as overtime, extra shifts, and outsourcing, makes TA focus only on
very short-term decisions. This is how Kaplan and Cooper express their view regarding TOC (Cost and
Effect, page 134):

"TOC and ABC are not in conflict. In fact, they complement each other beautifully, with TOC providing
short-term optimization to maximize short-term profits (when operating in a constrained production
environment) and ABC providing the instrumentation for dynamic optimization of resource supply,
product design and mix, pricing, and supplier and customer relationships for long-term profitability…TOC
operates to enhance profitability within existing resources and constraints [emphasis added by the
author], and existing products and customer relations. ABC examines the economics of existing resource
supply, product design, pricing and mix, and customer profitability, and provides the map by which the
existing economics can be enhanced for better future."

I disagree with the claim that ABC techniques lead us to improving the overall long-term bottom line of
organizations. I am especially skeptical of the notion of 'optimization'. As already noted, excess
capacity is something every organization has to live with. Still, certain dynamic changes in the product-
mix, pricing, and capacity could significantly enhance both the short-term and the long-term bottom
line. The growth of an organization involves many options – which one should the organization choose?
I claim that enhancing the TA techniques and concepts could lead us there.

TA currently lacks the techniques and the related processes to evaluate changes in the product-mix that
consider the total T and the total OE. Thus, the ability to predict Profit = T - OE is currently not properly
supported, either by ABC or by TOC.

The ignored impact of uncertainty

While TOC as such treats uncertainty very seriously and effectively through the use of visual buffers, TA
ignores the issue. As long as TA is concerned only with small decisions for the very short term, the
impact of uncertainty might be relatively small.

When larger decisions, which could impact the organization in the medium and long term, are
evaluated, it is necessary to take the impact of uncertainty into account.

The most significant impact of uncertainty is on future demand. When decisions concerning market
opportunities are evaluated, there are two different uncertain impacts that need careful consideration:

a. The quantity of sales resulting from the opportunity.
b. The indirect impact on other sales. This indirect impact consists of two different effects:
 Impact on the sales, or the price, of other products due to the price and quantities sold
as part of the opportunity at hand. For instance, selling large quantities at a lower price
in a certain market segment might impact the reputation and price of sales in the regular
market segment.
 Reduced other sales due to lack of capacity.

Eventually the proposed solution, expanding the boundaries of TA, has to offer ways to estimate the
impact of uncertainty.

The Role of Intuition in the Required Information for Decision Support

When there is a lack of formal information, people use their intuition to complete the information
required for a decision. The point is that human beings process a lot of data in an unconscious way;
those data items are not part of any computerized system, yet they are, many times, relevant to the
decision at hand.

Human intuition has several limitations: it is far from being accurate, it is influenced by biases and
personal values, and it is slow to update itself when changes occur. But there is no alternative that
is both practical and able to cope with the timing of having to make a decision. The need to handle
uncertainty, or any other lack of concrete information, forces us to actively use the intuition of the key
relevant people. That use has to consider the limitations of intuition as part of the overall handling of
uncertainty.

The author claims that the intuition of good managers is an asset and thus has to play an active part
in the decision-making process of the organization! The real question is how to use that asset without
suffering its potential damage.

Decision Support in the TOC Way – DSTOC

The strategy/objective of the proposed methodology:

Top management has good support in evaluating the most appropriate product-mix, pricing, and
capacity decisions, bringing the organization close to its potential for the time frame
stretching from next month to one or two years ahead.

There are several necessary assumptions behind the above strategy:

 Currently, these decisions are far from satisfactory and even damaging – being
based either solely on intuition, which is heavily anchored in the existing comfort zone, or on the
current tools of management accounting, which are based on flawed paradigms.
 The current Throughput Accounting is limited to checking the capacity of only ONE resource,
which might ignore the lack of capacity of other resources.
 The potential bottom-line impact of superior decision support is very significant.

The tactic, or the main action list for achieving the strategy – the backbone of the proposed solution –
is:

Develop a managerial process and supporting tools using:

 'What-if' scenarios, where the intuition regarding potential sales and the internal dependencies
is translated into numbers used to calculate the resulting capacity profile of critical resources,
thus checking the viability of every scenario and its resulting bottom line.
 Every such 'what-if' inquiry can be calculated in a matter of seconds!
 Uncertainty is handled by generating a pessimistic and an optimistic scenario for every decision.

Some of the key parallel assumptions, explaining how the tactic would be able to achieve the
strategy, are:

 Existing computers are powerful enough to carry out throughput and capacity calculations over
all the items to be sold in a matter of seconds.
 It is possible to model the realistic options for quick, and not-so-quick, capacity increases,
considering the minimum quantity in which the capacity can be purchased and its cost, and to
come up with the required delta-OE.
 Every new opportunity in the market, or any change in capacity levels, can be checked for its
impact on the bottom line by adding it to the current sales and capacity, and calculating the
change in T and OE while making sure enough protective capacity is available.

o Actually, every such opportunity is "simulated," leading to additional optional decisions in
order to ensure the availability of acceptable capacity to support the new level of sales.
Then the new T and OE are calculated to yield the predicted bottom line.
o Only then is the decision to accept or reject the opportunity made.
 It is possible to guess, and fine-tune later, the required levels of protective capacity.
 There is no real need to consider the capacity of ALL resources. It is enough to consider only the
'critical resources,' the relatively few resources that might become CCRs when the product-mix
changes.
o The real problem in considering numerous resources is not a lack of computing
power, but the need to get good enough data representing the capacity required by
the various products and the realistic availability of that capacity.
 By being able to calculate the bottom-line impact of many potential opportunities and related
ideas, top management is able to decide upon a set of decisions that brings the future product
mix close enough to the most appropriate product mix for the time period.

Several key sufficiency assumptions reveal the size of the challenge:

 It is difficult to get managers to express their sincere intuition regarding what could, or
could not, be sold and at what price, since the quality of their intuition will be judged when
the actual results become known.
 There is a similar concern about the sincere intuition of Operations regarding the level of
protective capacity that enables delivery to the customers' satisfaction.
 The balance between formal data and intuitive assessments is delicate and subject to the
personal relationships between key players.
 Formal data is not immune to mistakes, and certainly the intuition of executives is subject to
uncertainty. Handling uncertainty is a critical part of the process.

Describing the ingredients of the new TOC Way to Support Top Basic Decisions

New key definitions:

Critical Resources:

Definition: A group of resources, each of which might become a CCR, and possibly a real bottleneck, for
a certain realistic product-mix.

Designating the critical resources is a managerial judgment. On one hand, we would like to include
more resources in order to calculate delta-OE in a suitable manner. On the other hand, the risk of losing
control of data quality forces us to focus on the few that truly matter.

No critical resource should be planned for 100% utilization. We assume TWO different lines that
signal maximum planned utilization and thereby provide protective capacity. The first line applies to
the weakest link and defines how much that one resource can be loaded without compromising the
standards of delivery and commitment. The other critical resources should have more protective
capacity, meaning a lower maximum-utilization line.

Products and T-generators:

The term 'product' is used to describe both the output of Operations and what is actually sold to
customers. The same 'product', the output of Operations, can be sold at different prices to segmented
markets and can also be part of what is sold to customers as a set or as a package. This duality creates
problems when evaluating different sales initiatives.

The following definitions separate the two different entities:

Definition: A Product is the output of Operations; any item that can be sold either by itself or as
part of a package.

The data items required for every product: product-id, the truly-variable costs of one unit, and the
capacity requirements from every critical resource, including setups and downtime.

Definition: A T-generator is what is actually sold to customers at a specific price. Every T-generator is
built from products, each product along with its quantity (for one unit of the T-generator).

The data items required for every T-generator: a list of every product-id included in the T-generator
along with its quantity, and a price tag. A T-generator might also carry additional truly-variable costs
(TVC) on top of what is included within the products, for instance, special packaging.
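
A minimal sketch of these two entities as data structures; the field names and figures are illustrative assumptions, not a prescribed schema:

```python
from dataclasses import dataclass, field

@dataclass
class Product:
    """Output of Operations: carries TVC and capacity needs per unit."""
    product_id: str
    tvc: float                                   # truly-variable cost per unit
    capacity_min: dict = field(default_factory=dict)
    # minutes required per unit on each critical resource, e.g. {"M3": 2.5}

@dataclass
class TGenerator:
    """What is actually sold, at a specific price, built from products."""
    name: str
    price: float
    items: dict = field(default_factory=dict)    # product_id -> quantity
    extra_tvc: float = 0.0                       # e.g., special packaging

    def throughput(self, products: dict) -> float:
        tvc = sum(products[p].tvc * q for p, q in self.items.items())
        return self.price - tvc - self.extra_tvc

# Illustrative use: the same product sold alone and inside a package.
p1 = Product("P1", tvc=40.0, capacity_min={"M3": 2.5})
single = TGenerator("P1-retail", price=100.0, items={"P1": 1})
bundle = TGenerator("P1-pack-of-3", price=250.0, items={"P1": 3}, extra_tvc=5.0)
products = {"P1": p1}
print(single.throughput(products), bundle.throughput(products))  # 60.0 125.0
```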

Capacity Buffers and other types of capacity increase

Definition: A Capacity Buffer is an option for a fast, temporary capacity increase at a certain cost. For
instance, a work center usually works two shifts a day, but when required a third shift can be
activated at an additional cost.

Capacity buffers are especially important for critical resources. The damage of truly running out of
capacity is too big not to have an option to add capacity in a hurry. The "buffer" part is there to protect
the organization from that damage.

It is the responsibility of general capacity management to provide and maintain capacity buffers.
Even when capacity buffers are not used, keeping the option open usually involves a certain cost;
for instance, having enough manpower to enable additional overtime or extra shifts when necessary.
Outsourcing requires frequent use of the option, otherwise the option might disappear.

The extra cost of maintaining capacity buffers has to be compared to the opportunity cost of not being
able to meet commitments, losing new opportunities, or having to give up demand.

Buffer management is definitely relevant for capacity buffers. The depth and length of red-zone
penetrations send a warning that market commitments might be compromised and that expediting
efforts might require additional capacity.

Other options for increasing capacity, such as adding machines, equipment, tools, and manpower, are
not part of the capacity buffer, as they cannot be executed in a hurry and, once purchased, remain for a
long time. Therefore, these capacity elevations are all planned moves, not actions taken during
execution. Actually, these are investments that should be evaluated over longer time periods. The
proposed solution can support these decisions when the whole horizon is appropriately long, such as
two years.

The initial references: the current Sales Profile and the current Capacity Profile

The whole process is built upon adding new ideas and opportunities to a reference set that depicts the
sales for the next relevant period, assuming no change in the current activities. The reference
encompasses the existing T-generators and their predicted sales quantities without new initiatives.
These comprise the Sales Profile Reference. The simple way to create the reference is to take the
actual sales of the last period and assume that sales for the next period will be the same, possibly
adjusted by a seasonality factor.

From the Sales Profile Reference two major aggregated information items are calculated:

1. The total throughput (T) to be generated.
2. The Capacity Profile Reference for every critical resource.

We assume that the basic OE level is known; hence we can calculate the net profit as T - OE.

The Capacity Profile Reference shows the percentage of the load relative to the available capacity
(without overtime, extra shifts, etc.), as calculated from the T-generators.
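
A minimal sketch of computing those two reference items from a sales profile; all figures are hypothetical, and a real implementation would pull this data from the organization's database:

```python
# Hypothetical inputs: T per unit and capacity minutes per unit of each
# T-generator on each critical resource.
t_generators = {
    "G1": {"t_per_unit": 60.0, "minutes": {"M3": 2.5, "Paint": 1.0}},
    "G2": {"t_per_unit": 35.0, "minutes": {"M3": 1.0, "Paint": 3.0}},
}
forecast_qty = {"G1": 4_000, "G2": 6_000}          # Sales Profile Reference
available_min = {"M3": 24_000, "Paint": 30_000}    # per period, no overtime
oe = 250_000.0                                     # basic OE level, assumed known

# Item 1: the total T to be generated.
total_t = sum(t_generators[g]["t_per_unit"] * q for g, q in forecast_qty.items())

# Item 2: the Capacity Profile Reference (load % per critical resource).
load_pct = {}
for res, avail in available_min.items():
    used = sum(t_generators[g]["minutes"].get(res, 0.0) * q
               for g, q in forecast_qty.items())
    load_pct[res] = 100.0 * used / avail

print(f"Reference T = {total_t:,.0f}, net profit = {total_t - oe:,.0f}")
print("Capacity Profile Reference:", {r: f"{p:.1f}%" for r, p in load_pct.items()})
```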

The Sales Profile Reference is just a forecast. The decisions we wish to draw from it are based on
aggregated sales and capacity calculations and are thus less dependent on sales fluctuations of specific
SKUs.

The resulting Capacity Profile might look different from the actual load on the resources in the specific
period. The point is that the actual load observed does not include capacity that was used last
period for deliveries in the period under consideration, but does include capacity for deliveries in the
next period. We assume that in most cases the two usages of capacity cancel each other out.

The managerial process for making the product-mix decisions

Decisions regarding potential changes in the product-mix, pricing, or capacity levels should not be made
by just one key area. At the very least, the three areas of Operations/Production, Marketing/Sales, and
Finance, possibly led by the CEO, should take the decision together.

This paper's key process is centered on a periodic meeting of top management. The core idea is to
simulate the current sales and their impact on capacity, and to consider any new idea by adding it to the
sales and re-calculating the total load on the critical resources and the total T generated. When the
protective capacity boundaries are violated, the management team has to decide whether to reduce
other sales, add capacity, or drop the idea.

The first step is a review of the Sales and Capacity Profile References, reflecting the current financial,
sales, and capacity states. From the result, the amount of potential opportunity can be roughly
evaluated.

Then the search for an overall superior state is initiated. Management still needs to realize that capacity
cannot fully match the demand; management has to allow enough protective capacity, including the
capacity buffers. Still, the balance of excess capacity against market opportunities can be improved. In
the search for a superior state, the discrete nature of both sales and capacity elevation has to be
carefully considered.

Every new idea to improve the profit has to include its predicted impact on other sales. For instance,
initiating a promotion requires a careful check of the total T to be generated over a period longer than
just the promotion itself, realizing that the promotion would definitely reduce the sales of the
same items at the original price for quite some time, and possibly also the sales of similar items. All
these predictions are the responsibility of Sales, but they have to be prepared a priori for the meeting
where the promotion, and other decisions, will be considered.

These inputs are based on the intuition of the responsible salespeople, translated into numbers. As
discussed later, in order to handle the uncertainty and the inaccurate nature of intuition, two different
sets of numbers should be checked: a pessimistic assessment and an optimistic one.

Once the opportunity is well defined, it is ADDED to the Sales Profile Reference, including the potential
changes defined within the opportunity. Then the new T level is calculated, along with the delta-T: the
difference between the new T and the reference T.

The next step shows the updated Capacity Profile – the new load versus the available capacity
of the critical resources. If no protective-capacity line has been penetrated, then no delta-OE is
required.

However, when one or more of the critical resources cross the maximum-utilization line, meaning the
protective capacity is penetrated, two generic options for action are available (a sketch of the whole
check appears after this list):

1. Reduce existing sales. The VP of Sales needs to state his/her best intuition about the realistic
options for reducing sales and by how much. If a certain SKU is under a make-to-availability
commitment to all clients, then there is no simple way to reduce its sales. It is possible to
increase the price, but then there is no accurate way to know by how much the sales will drop.
Any reduction in sales might hurt the sales of other SKUs. In all cases, Sales has to agree to that
step. Any reduction in sales will impact the new T level and decrease delta-T.
2. Increase the capacity of the relevant critical resources, mainly from the capacity buffer. This
step increases delta-OE, depending on the capacity units that are purchased. Purchasing
machines and/or hiring manpower would increase 'I' and should force checking the longer-term
ramifications.
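
A minimal sketch of that what-if check, assuming hypothetical loads, protective-capacity lines, and a third-shift capacity buffer:

```python
# Reference state (hypothetical): load % per critical resource, plus the
# maximum planned utilization line that protects delivery performance.
reference_load = {"M3": 92.0, "Paint": 70.0}      # % of available capacity
max_line = {"M3": 95.0, "Paint": 90.0}            # protective-capacity lines

# Opportunity (hypothetical): its delta-T and the extra load it adds.
delta_t = 18_000.0
added_load = {"M3": 12.3, "Paint": 4.0}

# Capacity buffer (hypothetical): a third shift on M3 adds 50% capacity-
# equivalent at a fixed cost per period.
buffer = {"M3": {"extra_pct": 50.0, "cost": 9_000.0}}

delta_oe = 0.0
verdict = "accept"
for res, ref in reference_load.items():
    new_load = ref + added_load.get(res, 0.0)
    if new_load <= max_line[res]:
        continue                      # protective capacity intact: no delta-OE
    if res in buffer:                 # option 2: buy from the capacity buffer
        new_load -= buffer[res]["extra_pct"]
        delta_oe += buffer[res]["cost"]
    if new_load > max_line[res]:
        verdict = "reduce other sales or reject"   # option 1 remains
    print(f"{res}: line crossed, load after buffer = {new_load:.1f}%")

print(f"delta-T - delta-OE = {delta_t - delta_oe:,.0f} -> {verdict}")
```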

Eventually the specific idea under consideration is either accepted or rejected. At the end of the
meeting, the decisions that have been accepted are combined for a new evaluation of the total delta-T
and delta-OE. Some decisions that were accepted individually may now be rejected due to lack of
capacity.

Every meeting should define the period of time into the future that it considers. Different decisions are
relevant for different time horizons. For instance, evaluating entry into a new export market is usually
not relevant to next month; it needs to be evaluated over a period of 6-12 months. In some special
cases, new opportunities could lead to considering the purchase of additional machines, and then the
full ramifications have to be analyzed for the long term.

The leading example concerns monthly deliveries to a particular client. The best way to evaluate such
a decision is to check two different time periods: a "regular" month and a month representing peak
sales. Checking the peak period aims at validating that even at peak, delta-T minus delta-OE is positive;
otherwise, the annual impact of the example has to be calculated.
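
A minimal sketch of that two-period check, with hypothetical figures for a regular and a peak month; the season lengths are assumptions:

```python
# Hypothetical evaluation of the new client's monthly deal in two periods.
# In the peak month spare capacity is gone, so extra shifts (delta-OE) are
# needed to serve the same order.
periods = {
    "regular month": {"delta_t": 18_000.0, "delta_oe": 0.0},
    "peak month":    {"delta_t": 18_000.0, "delta_oe": 14_000.0},
}

annual_net = 0.0
for name, p in periods.items():
    net = p["delta_t"] - p["delta_oe"]
    months = 2 if name == "peak month" else 10   # assumed season lengths
    annual_net += net * months
    print(f"{name}: delta-T - delta-OE = {net:+,.0f}")

# Even if the peak months are barely profitable, the annual view decides.
print(f"Approximate annual impact: {annual_net:+,.0f}")
```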

Handling Uncertainty

The current tradition within organizations ignores most of the "common and expected uncertainty."
This category of uncertainty refers to the expected fluctuations of sales, delays in projects, problems in
supply, quality problems, temporary lack of capacity, and other regular operational disruptions.
Instead of recognizing the absolute need for buffers, organizations ignore the fluctuations. Forecasts,
and all the plans derived from them, are treated as expressing what the organization should
achieve. From that paradigm, faulty measurements are drawn that drive the dysfunctional behavior
of almost everyone within the organization.

Forecasts provide partial information on what lies ahead. There is no valid way to manage any
organization without forecasts! TOC uses its buffering system to enable good performance in spite of
the expected forecast deviations. However, there is no way, even according to TOC, to manage
medium- and long-term capacity without forecasts. Even the buffer sizes are determined by the
forecasts, including the assessment of the forecasting error.

The general idea for addressing uncertainty is to use TWO forecasts: one that represents a reasonably
pessimistic forecast and another that is reasonably optimistic. This approach treats forecasts as a
range rather than a single number.

There are two different impacts of uncertainty on the proposed process, both concerning the
potential demand. In addition, there is the impact of uncertainty on the capacity levels.

The first important impact of uncertainty is on the Sales Profile Reference, which is based on a short-
term forecast of all sales. As was already noted, while we have to specify the forecasted sales of every
single T-generator, our main interest is the aggregated impact on the total T and on the load of each of
the critical resources. Aggregating the sales makes the impact of the uncertainty lower than the
impact on a single T-generator. Thus, in this case, the suggestion is NOT to use two different forecasts
for the Sales Profile Reference, but to base the reference on the average result of all forecasted sales,
assuming every forecast is targeted at the average/expected value.

However, when it comes to new opportunities, the impact of uncertainty is much higher. There are
two reasons why the impact of uncertainty cannot be ignored when evaluating an opportunity.
First, most new opportunities involve only a few T-generators, and thus the restraining impact of
aggregation is reduced. Second, new opportunities are based on a lower level of intuition, and thus
the level of uncertainty is more significant.

In order to illustrate the possible impact of uncertainty, let us look at two different examples of such
opportunities to be considered:

a. Announcing one month of promotion in which a number of products are sold at a discount of
15%. The problem is that the quantities to be sold, both during that month and during the two
months after the promotion, are critical to evaluating whether the promotion will increase the
bottom line or decrease it! It seems clear that the realistic pessimistic (yet reasonable) forecast
is significantly smaller than the reasonable optimistic forecast – the difference could easily be
twice the pessimistic forecast or even more. The practical impact on capacity could be
considerable. Suppose the optimistic forecast would force many extra shifts, causing high
delta-OE and even forcing reduced sales of other T-generators offered at full price! This situation
certainly calls for managerial judgment that takes all the potential ramifications into account.
b. Entering a new market, such as an export market. This move is obviously exposed to a great deal
of uncertainty. The pessimistic scenario might include realizing, after a while, that the sales do
not cover the delta-OE of such an operation, and bringing the move to a close. This scenario
points to the maximum loss the move might cause. On the other hand, if sales turn out close to
the optimistic scenario, the resulting bottom-line increase would be much larger. Note that
selling the same basic products in a foreign country might involve not just delta-I and delta-OE,
but also truly higher truly-variable costs (TVC) due to transport, customs, and commissions.
Certainly, for such a decision, one needs to calculate all the ramifications, based on both the
pessimistic and the optimistic forecasts, in order to know whether the entry into the export
market truly provides great potential and what the possible level of loss is if the potential does
not materialize in reality.

Another impact of uncertainty concerns the capacity calculations. We assume the quality of the data,
and of the assumptions regarding the impact of setups, downtime, and protective capacity, would go up
sharply after starting to use the process consistently. This would reduce the impact of uncertainty on
the capacity side of the process.

Therefore, the process should allow checking every opportunity based on two different forecasts. For
the time being, the main way to address uncertainty is by judging the economic impact of both
scenarios on the company.
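
A minimal sketch of that dual-forecast check, with hypothetical quantities loosely following the promotion example above; the capacity threshold and extra-shift cost are assumptions:

```python
# Hypothetical promotion: T per unit at the promotional price, and the two
# demand forecasts supplied by Sales as a range.
t_per_unit = 12.0
forecasts = {"pessimistic": 3_000, "optimistic": 8_000}

# Assumed capacity consequence: above 5,000 units, extra shifts are needed.
free_capacity_units = 5_000
extra_shift_cost = 20_000.0

for name, qty in forecasts.items():
    delta_t = t_per_unit * qty
    delta_oe = extra_shift_cost if qty > free_capacity_units else 0.0
    print(f"{name}: delta-T = {delta_t:,.0f}, delta-OE = {delta_oe:,.0f}, "
          f"net = {delta_t - delta_oe:+,.0f}")
# Management sees the full range (+36,000 to +76,000 here) instead of one
# number, and judges whether the downside is acceptable.
```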

The behavioral aspects of the proposed managerial process

Would Sales, Operations, and Finance be ready to collaborate on a holistic plan of what to sell
next period?

There is an inherent tension between those functions. However, the process proposed here is a clear
win-win-win, as the inherent UDEs (undesired effects) of each function get an answer. Sales gets a
formal agreement from Production to support the predicted quantities on time. Sales also sees the
picture of the load on Production and understands that when they demand overtime, its cost should be
covered by the delta-T of the additional sales. Production sees the pessimistic and optimistic predictions
of the new initiatives and better understands their capacity requirements. Finance sees clearly how
much delta-T is generated by the investment of delta-OE. Being confronted with a simulation of their
shared reality should, to my mind, overcome the resistance to cooperate.

Having Sales forecasts as a critical part of the simulations raises the question: are not most of
those forecasts too optimistic?

Today they are often over-optimistic, for two different causes. The first is the fear of
shortages leading to loss of sales. This fear is usually stronger than the fear of being left with too much
inventory, which, in most cases, does not "belong" to the salespeople. The second cause of over-
optimistic forecasts is the natural tendency of the typical salesperson to be optimistic, relying on his/her
ability to sell more.

This optimistic tendency is a managerial problem. Organizations cannot operate without assessing
the future – in other words, forecasting. The current practices do not push individuals to learn from
experience and improve their intuition, and by that improve their forecasting abilities. Asking
salespeople to state just ONE number as their forecast makes it hard to monitor the forecast and
improve it in the future. One number can never be right.

The proposed process, when it truly matters, asks Sales for both the pessimistic and the optimistic
forecasts. When the actual sales are known, it is possible to check whether they fall within the range.
When the actual sales fall outside the range, the salesperson gets feedback that can be useful in the
future. The real concern is not over-optimistic forecasts, but too large a difference between the
optimistic and the pessimistic. The solution for this negative branch (NBR) lies in the process itself: a
too-optimistic forecast causes a problem for Sales, because it could drive the load on one or more
critical resources beyond the limit of the protective capacity, triggering a search for options to reduce
sales.

The author does not claim that the behavioral problems of Sales and Operations would be totally
eliminated. The claim is that the situation would be significantly improved relative to the current
state.

Preparations for the periodical top management decision meeting

A top management meeting should be well prepared in order to focus the executives' attention on the
critical upcoming issues, with improved Sales plans being properly supported by Operations and leading
to a superior bottom line.

Certainly, Sales has to prepare its Sales Profile Reference, which includes possible changes relative to
the last period's sales report. The preparations for such a meeting might consider different time frames,
such as the following month, the next quarter, and the next year, and prepare the data for each.

Operations has to prepare carefully the capacity requirements of every active product on each of the
critical resources. Operations also has to model all the options for capacity increases, along with the
time delay to purchase them and the cost per minimum capacity increment.

All of these preparations make it possible to represent a realistic view of the current state – when no
new initiatives are taken!

What about new market opportunities and new ways of maintaining capacity to support a more volatile
market? The ability to quickly assess the economic worth of an opportunity should encourage Sales to
propose various directions for increasing sales – from new market segments to new product packages
together with their price-per-package – thus proposing new T-generators and taking advantage of the
capacity at hand. This will drive the organization to a new level of performance.

I expect that during the time between those management meetings, the responsible executives would
be busy looking for more and more opportunities. In the meeting itself, all the ramifications of every
opportunity should be analyzed; most of the opportunities should be prepared before the meeting.

Software supporting the process

The ability to support quick global calculations of T and the capacity profile certainly requires
software support. The basic data should be taken from the organization's database. The software itself
needs to provide a quick, simple, and friendly way to create new T-generators and new opportunities
that might include changes in the sales of regular T-generators.

The currently available software, called DSTOC (Decision Support in the TOC Way), was developed by
Vector Consulting Group, India, in collaboration with Eli Schragenheim.

The software is just a tool for making quick calculations that refer to the organization as a whole. All
decisions have to be taken by the management team.

Looking to the future: the Business Intelligence (BI) in the TOC Way

The process described in this article is the first step toward creating Business Intelligence in the TOC
Way. The above process fits manufacturing and service organizations where the lead time is relatively
short. However, when the lead time is much longer, as in a multi-project environment, we face two
difficulties:

1. We cannot assume that the capacity spent in the last period is about equal to the capacity to be
consumed this period for the next one. In the case of projects we would probably need to refer
only to very long time periods.
2. In the above process we did not deal with the impact of the exact timing of spending or receiving
money. In projects, the organization invests money, and also capacity used now, in order to
generate T in the future. There is an obvious need to consider the right way to evaluate
streams of cash. While the topic has been dealt with in Finance, and also by Goldratt himself, the
specific way of handling it for projects is beyond the boundaries of the current paper.

The next step should be to consider the generic case where delta-I > 0 is an integral part of the decision
at hand. Actually, any analysis concerning investments should consider both medium- and long-term
periods, for instance, one year and also five years. The longer the period considered by the
management team, the higher the impact of uncertainty. Therefore, for the more generic TOC way of
handling investments, dealing with both time-versus-money and uncertainty, we had better wait until
the former processes, for manufacturing/services and then for multi-projects, are well established and
lessons have been learned.

The other side of future business intelligence (BI) the TOC way is defining the current
performance indicators that point to what should be improved. Eventually we need to come up with
a way to value the potential of the main parts of the organization. In other words, how should we
measure the performance of a non-constraint? Not with the intent of laying off employees, but in order
to know how they would cope with further growth without becoming a critical resource that needs
constant monitoring.

References:

1. Corbett, Thomas. Throughput Accounting (representing the current knowledge of TA).
2. Kaplan, Robert S. and Cooper, Robin. Cost and Effect.
3. Barnard, Alan. "Challenging a Fundamental Law of Economics." Annual TOCICO conference
presentation, 2006.
4. Mabin, V. J. and Gibson, J. "Synergies from Spreadsheet LP with the Theory of Constraints – a
Case Study." The Journal of the Operational Research Society, Vol. 49, No. 9 (Sep. 1998),
pp. 918-927.
