Sales Planning and Control Using Absorbing Markov Chains

Author(s): William W. Thompson, Jr. and James U. McNeal


Source: Journal of Marketing Research, Vol. 4, No. 1 (Feb. 1967), pp. 62-66
Published by: American Marketing Association
Stable URL: http://www.jstor.org/stable/3150166

Sales Planning and Control Using Absorbing Markov Chains

WILLIAM W. THOMPSON, Jr., and JAMES U. McNEAL*
A stochastic model that generates data for sales planning and control is described.
An example is presented that shows how these data are used to plan short-run
sales activities and train effective salesmen. In the model, changes in customer
propensities to buy are treated as Markov processes. Finally, it is suggested that
the concepts developed here may be computerized and integrated into existing
systems for planning and control.

Much attention has been given to the problems of planning and controlling sales efforts [2]. In recent years, rising selling costs have given particular impetus to solving these problems [12]. A major obstacle to developing sales plans and control procedures is the lack of meaningful data on which to base them. It is well known, for example, that selling efforts are most productive when salesmen call on the most promising prospects; but the problem of providing information identifying the most promising buyers has always been difficult for sales managers.

This article proposes a model that will give useful data for planning and controlling selling efforts. Further, the article will show how data generated by the model can be used to produce (a) short-run sales strategies and (b) procedures that will help long-run development of salesmen's effectiveness.
Although Ehrenberg [5] has recently expressed some reservations, others have found Markov chains to be particularly useful in describing marketing phenomena [1, 3, 4, 6, 9, 11]. Cyert, Davidson, and Thompson [3], for instance, have used absorbing Markov chains for formulating and evaluating management policies on the aging of charge accounts in a department store. The model here applies Markov chains to determining economic decision rules for planning and controlling sales efforts. Shuchman [10] has used a similar analysis; however, his assumptions and results differ markedly from those presented here.


THE MODEL

The fundamental assumptions underlying the model are:

1. The salesman sells only one product, the price of which is constant for all customers.
2. Sales are made on an ad hoc basis. There is no continuing customer-salesman relationship.
3. The expected cost of calling on a customer is equal for all customers.
4. The expected time consumed in a sales call is equal for all customers.
5. Customers may be classified on the basis of their relative propensities to buy as exhibited during the most recent previous call.
The first four assumptions fit a number of selling situations. An encyclopedia salesman, for example, usually offers one product at one price. A customer rarely purchases more than once. Therefore, after a sale is made, the salesman is not likely to call again. The cost of a call should not vary significantly among customers. This is particularly true when the sales presentation is totally programmed or canned.

Assumption 5 simply recognizes that propensity to buy is a most logical base for classifying potential customers. Obviously, this assumption would not be sound for many convenience goods that are purchased through habit or impulse.
The system described here may be treated as a finite absorbing Markov process.1

* William W. Thompson, Jr., is professor of quantitative sciences, University of Houston, and James U. McNeal is associate professor of marketing, University of Georgia. The authors are grateful to the Research Foundation, Oklahoma State University, for supporting this work.

1 A Markov process is one in which only the immediate past is relevant for predicting the future. More specifically, the probability of entering a certain state depends only on the state presently occupied. A Markov chain is a stochastic process involving a succession of simple Markov processes. A finite Markov chain is an absorbing chain if it contains at least one absorbing state, and if it is possible to reach an absorbing state from every state in the chain in a finite number of steps. An absorbing state is one that, once entered, can never be left; see [8].

It includes two absorbing states, sale completed and sale lost, and n - 2 nonabsorbing or transient states. In general, any appropriate number of nonabsorbing states may be used. These are ordered so as to establish a range of discrete classes, based on perceived differences in customer propensities to buy.2 It is also usually convenient to define one nonabsorbing state for the initial classification of new customers. The set of n states is exhaustive and mutually exclusive.3 Thus, a customer assigned to one of the n states before a call must be assigned to one of them after the call.

The transition of customers from one state Si to another Sj occurs according to the transition probability pij. The set of all transition probabilities for the system of n states is given as the transition matrix:

(1)    P = | p11  p12  ...  p1j  ...  p1n |
           | p21  p22  ...  p2j  ...  p2n |
           |  .    .         .         .  |
           | pi1  pi2  ...  pij  ...  pin |
           |  .    .         .         .  |
           | pn1  pn2  ...  pnj  ...  pnn |

where for any point in time, the probability pij is defined for each possible transition. Symbolically,

(2)    Σ (j = 1 to n) pij = 1        (for i = 1, 2, ..., n).

Consider a case where n = 6. In such a system, there are two absorbing and four nonabsorbing states. Definitions of these states are provided in the figure. The term "interest," which is used in the descriptions of states in the figure, denotes propensity to buy as perceived by the salesman. Admittedly, a salesman's evaluation of propensity to buy is somewhat subjective. However, in many instances, ways can be developed to maintain adequate objectivity. Thompson [13], for example, has described a brief and practical worksheet by which a salesman may cross-classify customer cues as either favorable-verbal, unfavorable-verbal, favorable-nonverbal, or unfavorable-nonverbal. (An example of a favorable-verbal cue is a compliment about the salesman's product or company. If, on the other hand, a customer frequently looks away from the salesman, this action is interpreted as unfavorable-nonverbal.) An analysis of these observations for a particular call may be summarized into a composite interest score that may be related directly to one of the Markovian states.4

2 A fundamental difference between this approach and that of Shuchman [10] is in the basis of classification. Shuchman, like Cyert, Davidson, and Thompson [3], defines his set of states in terms of an aging process. In his formulation, a given customer progresses automatically from one state to another according to the number of sales calls made upon him. Shuchman implicitly assumes that customers of a given age (state) are homogeneous for propensities to buy. This assumption would appear to be unrealistic in that it avoids the basic problem of evaluating the complex differences which exist between customers. In effect, Shuchman uses the behavior patterns of salesmen as a measure of customer behavior.

3 The set is exhaustive in that it includes all possible states. It is mutually exclusive because a customer can be classified in one and only one state at any given point in time.
State   Type            Description
S1      Absorbing       Sale completed during most recent call
S2      Absorbing       Sale lost during most recent call
S3      Nonabsorbing    New customer, no history
S4      Nonabsorbing    Customer indicated low degree of interest during most recent call
S5      Nonabsorbing    Customer indicated medium degree of interest during most recent call
S6      Nonabsorbing    Customer indicated high degree of interest during most recent call

Assume that for a given salesman, and for a given point in time, probability values are determined for Equation (1) as follows:

(1a)   P = |  1     0     0     0     0     0   |
           |  0     1     0     0     0     0   |
           | .10   .30    0    .25   .20   .15  |
           | .05   .45    0    .20   .20   .10  |
           | .15   .10    0    .15   .25   .35  |
           | .20   .05    0    .15   .30   .30  |
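As a concrete check on the arithmetic that follows, a minimal sketch is given below. It is not part of the original article; it assumes Python with NumPy and the state ordering S1 through S6 of the figure, and it simply stores the matrix of (1a) and verifies the row-sum condition of Equation (2).

    import numpy as np

    # States in the order of the figure: S1, S2 absorbing; S3-S6 transient.
    STATES = ["S1 sale", "S2 lost", "S3 new", "S4 low", "S5 medium", "S6 high"]

    # Transition matrix (1a) for the hypothetical salesman at one point in time.
    P = np.array([
        [1.00, 0.00, 0.00, 0.00, 0.00, 0.00],   # S1: sale completed (absorbing)
        [0.00, 1.00, 0.00, 0.00, 0.00, 0.00],   # S2: sale lost (absorbing)
        [0.10, 0.30, 0.00, 0.25, 0.20, 0.15],   # S3: new customer
        [0.05, 0.45, 0.00, 0.20, 0.20, 0.10],   # S4: low interest
        [0.15, 0.10, 0.00, 0.15, 0.25, 0.35],   # S5: medium interest
        [0.20, 0.05, 0.00, 0.15, 0.30, 0.30],   # S6: high interest
    ])

    # Equation (2): each row must be a probability distribution over the n states.
    assert np.allclose(P.sum(axis=1), 1.0)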

In general,

(3)    pij = uij / Σ (k = 1 to n) uik ,

where uij is the number of times a transition from Si to Sj is expected to occur in m trials.
As a practical matter, transition probabilities are determined from history. It should be noted that a particular salesman's transition matrix may vary with time. Indeed, it would be expected that the probabilities of success (transition to state S1) will increase and the probabilities of failure (transition to state S2) will decrease as the salesman's abilities are developed by learning.
4 There may be some question about whether a salesman would give an honest estimate of a customer's propensity to buy because such an appraisal ultimately influences management's evaluation of him. The system is designed, however, to encourage an accurate appraisal. If, for example, a salesman typically overestimates propensity to buy, this eventually will be reflected in a low sales rate of very likely prospects. If he typically underestimates propensity to buy, a low effectiveness score for prospects shifting to high propensity states will result. Further, since the system directs the salesman to make future calls on prospects having high expected values, consistently placing good prospects in a low propensity state denies the salesman opportunities to call on his most valuable customers.

A transition matrix, such as (1a), describes the salesman's ability to perform at a point in time. It is this matrix, then, that should be used as a basis for planning and control during the time period under consideration. The transition matrix may be kept up to date by including the more recent experience and excluding the more noncurrent experience in calculating transition probabilities.
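Equation (3), together with the updating scheme just described, can be sketched as follows. This is an illustrative reconstruction rather than the authors' procedure; the function name transition_matrix, the window size, and the sample call records are hypothetical, and a fixed-length window is only one way of dropping noncurrent experience.

    import numpy as np
    from collections import deque

    def transition_matrix(transitions, n_states=6, window=500):
        """Estimate P from the most recent `window` observed transitions.

        `transitions` is a sequence of (from_state, to_state) index pairs, one per
        completed sales call; only the newest `window` pairs are retained, which
        drops the more noncurrent experience from the estimate.
        """
        recent = deque(transitions, maxlen=window)
        counts = np.zeros((n_states, n_states))      # the u_ij of Equation (3)
        counts[0, 0] = counts[1, 1] = 1.0             # absorbing states S1, S2 stay put
        for i, j in recent:
            counts[i, j] += 1
        totals = counts.sum(axis=1, keepdims=True)
        # p_ij = u_ij / sum_k u_ik; transient rows with no observations stay at zero.
        return np.divide(counts, totals, out=np.zeros_like(counts), where=totals > 0)

    # Made-up records: a new customer (S3, index 2) showed high interest (S6, index 5)
    # on the first call and bought (S1, index 0) on the second.
    print(transition_matrix([(2, 5), (5, 0)]))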
It is convenient to transform the matrix of (1a) into canonical form. In general, for a system of r absorbing states and s transient states, this form is

(4)    P = | I(r x r)   0(r x s) |
           | R(s x r)   Q(s x s) |

where I is an identity matrix and 0 is a zero matrix. For the example under consideration,

(4a)          S1    S2    S3    S4    S5    S6
        S1  |  1     0     0     0     0     0   |
        S2  |  0     1     0     0     0     0   |
   P =  S3  | .10   .30    0    .25   .20   .15  |
        S4  | .05   .45    0    .20   .20   .10  |
        S5  | .15   .10    0    .15   .25   .35  |
        S6  | .20   .05    0    .15   .30   .30  |
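The partition in (4) amounts to slicing the matrix into blocks. A brief sketch, again assuming NumPy and the state ordering of (4a), where the variable names R and Q mirror the blocks of the canonical form:

    import numpy as np

    # Transition matrix (1a)/(4a): absorbing states S1, S2 first, then S3-S6.
    P = np.array([[1.00, 0.00, 0.00, 0.00, 0.00, 0.00],
                  [0.00, 1.00, 0.00, 0.00, 0.00, 0.00],
                  [0.10, 0.30, 0.00, 0.25, 0.20, 0.15],
                  [0.05, 0.45, 0.00, 0.20, 0.20, 0.10],
                  [0.15, 0.10, 0.00, 0.15, 0.25, 0.35],
                  [0.20, 0.05, 0.00, 0.15, 0.30, 0.30]])

    r, s = 2, 4                 # r absorbing states, s transient states
    R = P[r:, :r]               # s x r block: transient -> absorbing
    Q = P[r:, r:]               # s x s block: transient -> transient
    print(R)                    # rows S3..S6; columns S1 (sale) and S2 (lost)
    print(Q)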

Kemeny and Snell [8] have suggested some interesting results that may be obtained from (4a).5 These will now be examined in light of the present example.

The fundamental matrix for an absorbing Markov chain is defined as

(5)    N = (I - Q)^-1.

From (4a),

(6)    I - Q = |  1    -.25  -.20  -.15 |
               |  0     .80  -.20  -.10 |
               |  0    -.15   .75  -.35 |
               |  0    -.15  -.30   .70 |

and

(5a)   N = (I - Q)^-1 = | 1   .555   .667   .626  |
                        | 0  1.455   .590   .504  |
                        | 0   .547  1.890  1.024  |
                        | 0   .547   .939  1.976  |

The elements nij of matrix N may be interpreted as the mean number of times a customer will be in a transient state Sj before being absorbed.6 If a customer is now classified in state S4, for instance, he will be in state S5 an average of .590 times before ultimately being absorbed. It is expected that a customer presently classified in S6 will be in that state 1.976 times before absorption.

The expected total number of calls that must be completed before a customer's absorption is

(7)    τ = Nξ,

where ξ is a column vector with all unity elements.7 For the example,

(7a)   τ = | 1   .555   .667   .626  |   | 1 |     | 2.848 |
           | 0  1.455   .590   .504  |   | 1 |     | 2.549 |
           | 0   .547  1.890  1.024  | x | 1 |  =  | 3.461 |
           | 0   .547   .939  1.976  |   | 1 |     | 3.462 |
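The fundamental matrix of (5) and the vector of (7) can be reproduced numerically in a few lines. The sketch below assumes NumPy and the Q block shown in (6); its printed values should agree, to rounding, with (5a) and (7a).

    import numpy as np

    # Q block of (4a): transitions among the transient states S3-S6.
    Q = np.array([[0.00, 0.25, 0.20, 0.15],
                  [0.00, 0.20, 0.20, 0.10],
                  [0.00, 0.15, 0.25, 0.35],
                  [0.00, 0.15, 0.30, 0.30]])

    N = np.linalg.inv(np.eye(4) - Q)   # fundamental matrix, Equation (5)
    tau = N @ np.ones(4)               # expected calls to absorption, Equation (7)

    print(np.round(N, 3))              # approximately the entries of (5a)
    print(np.round(tau, 3))            # approximately [2.848, 2.549, 3.461, 3.462]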

Thus, (7a) shows that a customer who is classified in S3 will require an additional 2.848 calls before a sale is made or lost. Similarly, a customer in state S6 requires an expected 3.462 calls before the customer-salesman relationship is ended.

Thus far, the model has not discriminated between the two absorbing states. Obviously, such a distinction is important. The matrix,

(8)    B = NR,

gives probabilities bij that a process presently in the nonabsorbing state Si will be absorbed in the absorbing state Sj. From (5a) and (4a),

(8a)   B = | 1   .555   .667   .626  |   | .10  .30 |     | .352  .648 |
           | 0  1.455   .590   .504  |   | .05  .45 |     | .261  .739 |
           | 0   .547  1.890  1.024  | x | .15  .10 |  =  | .515  .485 |
           | 0   .547   .939  1.976  |   | .20  .05 |     | .562  .438 |

Notice that in (8a),

(9)    Σ (j = 1 to 2) bij = 1,    for all i.

The matrix B shows that the probability of ultimately selling to a customer now classified in S4 is .261. The probability that the sale will be lost is .739. Similar interpretations may be made for the other transient states.

5 The model presented in this paper uses the general framework developed by Kemeny and Snell; see, especially, Chapter III.

6 Estimated variances for the elements nij are given by the s x s matrix

       N2 = N(2Ndg - I) - Nsq,

where Ndg is the matrix N with all off-diagonal entries set to zero, and Nsq is a matrix of the squared entries of N. Variances are not calculated for the example; see [8].

7 The vector τ gives the number of times, including the original position, the process is in a nonabsorbing state. This is equivalent to the number of calls required to complete absorption, including the final call that results in absorption. The variance of τ is given by τ2 = (2N - I)τ - τsq.
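Likewise, the absorption probabilities of (8) follow from N and the R block. The sketch below, under the same assumptions as before, should reproduce (8a) and the row-sum property (9).

    import numpy as np

    Q = np.array([[0.00, 0.25, 0.20, 0.15],
                  [0.00, 0.20, 0.20, 0.10],
                  [0.00, 0.15, 0.25, 0.35],
                  [0.00, 0.15, 0.30, 0.30]])
    R = np.array([[0.10, 0.30],
                  [0.05, 0.45],
                  [0.15, 0.10],
                  [0.20, 0.05]])       # columns: absorbed as a sale (S1) or a lost sale (S2)

    N = np.linalg.inv(np.eye(4) - Q)
    B = N @ R                          # absorption probabilities, Equation (8)

    print(np.round(B, 3))              # approx. [[.352 .648] [.261 .739] [.515 .485] [.562 .438]]
    assert np.allclose(B.sum(axis=1), 1.0)   # Equation (9)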
USE OF THE MODEL FOR SHORT-RUN PLANNING

The model developed in the last section has as a primary function the generation of data to aid in short-run planning of sales activities.8


In this section, a method that uses these data in determining a formal procedure for scheduling the time of sales personnel is given. In effect, a decision rule is formulated by which the customers of a given salesman may be rank ordered relative to their net expected values. By applying this rule, the salesman's calls may be scheduled to maximize the total expected net revenues within the constraint imposed by limited sales time.9

8 It should be observed that the Shuchman model [10] also incorporates a planning feature for sales management. As he notes, however, his formulation is based on assumptions that are quite restrictive. In view of these limitations, that model would not seem to be appropriate for planning at an operational level.

9 See [7] for a discussion of Markov processes with rewards.
If the cost of making a call is assumed constant for all calls, the expected total costs of moving customers from each of the non-absorbing states to absorbing states are given by the vector,

(10)   C = cτ,

where c is a constant and τ is the vector of (7a). Assume that the cost of a call is $15. Then,

(10a)  C = 15 x | 2.848 |     | 42.72 |
                | 2.549 |  =  | 38.24 |
                | 3.461 |     | 51.92 |
                | 3.462 |     | 51.93 |

It is expected, for instance, that $38.24 will be needed to absorb a customer who is classified in state S4.

The expected revenue associated with a given customer is the revenue that would be realized if a sale is made, times the probability that the sale will be made. Expected revenues for customers in each of the nonabsorbing states are given by

(11)   R = BU,

where B is the matrix of (8), and U is a column vector having revenue elements corresponding to the two absorbing states. If, for example, a sale provides a constant $190 in revenues, and a lost sale yields no revenue, then

       U = | 190 |
           |   0 |

and

(12)   R = | .352  .648 |   | 190 |     |  66.88 |
           | .261  .739 | x |   0 |  =  |  49.59 |
           | .515  .485 |               |  97.85 |
           | .562  .438 |               | 106.78 |

The value 97.85, for instance, represents the gross expected revenue associated with customers of state S5. Expected values of customers by states are given by the vector

(13)   V = R - C.

For the example,

(13a)  V = |  66.88 |   | 42.72 |     | 24.17 |
           |  49.59 | - | 38.24 |  =  | 11.35 |
           |  97.85 |   | 51.92 |     | 45.93 |
           | 106.78 |   | 51.93 |     | 54.85 |

This vector may be interpreted as follows: A customer now classified in S4 has an expected value, in net dollar return, of only $11.35. Similarly, a customer in S6 is worth $54.85. From (13a) it is evident that the priority ordering S6, S5, S3, S4 provides an economic rule for allocating the salesman's time. In effect, the salesman should give top priority to customers in state S6.10 Any remaining time should be allocated to customers in S5, etc. It is interesting to note, in the example, that new customers (state S3) are of more value than those in state S4. Thus, the structure provides a formal rule for automatically eliminating poor prospects and replacing them with new prospects.

10 A timing constraint may be superimposed on this framework. For example, it may be desirable to specify a minimum time interval between calls for customers of a given class. This would have the effect of placing those customers most recently contacted in an ineligible status.
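The cost, revenue, and value calculations of (10) through (13a), along with the priority ordering, can be collected in one short sketch. The $15 call cost and $190 sale revenue are the article's illustrative figures; the ranking step with np.argsort is an added convenience rather than something the article prescribes.

    import numpy as np

    # Carried over from the example: tau of (7a) and B of (8a), rows S3, S4, S5, S6.
    tau = np.array([2.848, 2.549, 3.461, 3.462])
    B = np.array([[0.352, 0.648],
                  [0.261, 0.739],
                  [0.515, 0.485],
                  [0.562, 0.438]])

    call_cost = 15.0                       # assumed constant cost per call
    U = np.array([190.0, 0.0])             # revenue if sold, revenue if lost

    C = call_cost * tau                    # Equation (10): expected total selling cost
    revenue = B @ U                        # Equation (11): expected gross revenue
    V = revenue - C                        # Equation (13): expected net value

    states = ["S3", "S4", "S5", "S6"]
    priority = [states[i] for i in np.argsort(-V)]
    print(np.round(C, 2), np.round(revenue, 2), np.round(V, 2))
    print(priority)                        # expected ordering: S6, S5, S3, S4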
USE OF THE MODEL FOR LONG-RUN CONTROL

In the last section, the salesman's manner of operating (as reflected by his transition matrix) was taken as given. It may be reasoned, then, that a specific economic rule for rank ordering customers, such as the one established in the previous analysis, represents an optimal short-run strategy only as long as the transition matrix corresponding to it remains unchanged.

This section offers some observations on how a particular salesman's transition matrix may be used as a device for helping develop his effectiveness. A critical analysis of the transition matrix P, along with its corresponding B and τ matrixes, should reveal the areas in which a salesman indicates weaknesses. By focusing attention on these areas, the salesman may be able to eliminate his difficulties (and thus upgrade his transition matrix). For example, the matrix B of (8a) shows that the probability of making a sale to a customer in S6 is only .562. Perhaps a modified sales presentation at this advanced stage would increase the effectiveness of the salesman's total performance.
To institute control over the process of development, it is desirable to establish some benchmarks. These may consist of a standard transition matrix and appropriate standard matrices that are devised from it, i.e., B, τ, N, etc. One logical possibility is to establish a control transition matrix P having mean entries pij based on the transition experiences of all salesmen. Corresponding N, τ, and B matrixes also can be formulated. The actual performance of each salesman may then be related to average or standard performance. It is possible that the properties of the variables being controlled are such that even highly formalized control systems based on statistical inference may be used.
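One hedged sketch of how such a standard might be computed: pool the observed transition counts of all salesmen into a control matrix and derive the corresponding B and τ for comparison. The helper names control_matrix and absorption_profile are hypothetical, and simple aggregation of counts is only one of several reasonable pooling rules.

    import numpy as np

    def control_matrix(count_matrices):
        """Pool each salesman's n x n transition counts (the u_ij of Equation (3))
        into a single standard transition matrix."""
        pooled = np.sum(count_matrices, axis=0).astype(float)
        totals = pooled.sum(axis=1, keepdims=True)
        return np.divide(pooled, totals, out=np.zeros_like(pooled), where=totals > 0)

    def absorption_profile(P, r=2):
        """Return (B, tau) for a transition matrix already in canonical form."""
        Q, R = P[r:, r:], P[r:, :r]
        N = np.linalg.inv(np.eye(Q.shape[0]) - Q)
        return N @ R, N @ np.ones(Q.shape[0])

    # A given salesman's B and tau can then be compared, state by state, with the
    # standard B and tau computed from control_matrix(...) to flag weak spots.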
SOME CONSIDERATION FOR IMPLEMENTATION
It will be observed that the type of analyses presented
in this article may be easily adapted to computer operations. In fact, the program described may be integrated
with little difficulty into an existing computerized framework for planning and controlling sales activities.
Initially, a transition matrix for each salesman is
stored in memory. Inputs to the system consist of
weekly or daily sales data, including those pertaining to
classification and reclassification of customers. Each
salesman's transition matrix is updated continually by
including the most recent input data in the calculation
of new moving average probability values. As output,
the salesman periodically gets a list of all current customers, including new prospects. These are rank ordered
relative to expected values. Finally, a performance evaluation based on the standard transition matrix may be included in the printout.
REFERENCES

1. Wroe Alderson and Paul E. Green, Planning and Problem Solving in Marketing, Homewood, Ill.: Richard D. Irwin, Inc., 1964, 180-91.
2. Richard D. Crisp, Sales Planning and Control, New York: McGraw-Hill Book Co., 1961.
3. R. M. Cyert, et al., "Estimation of the Allowance for Doubtful Accounts by Markov Chains," Management Science, 8 (April 1962), 287-303.
4. J. E. Draper and L. H. Nolin, "A Markov Chain Analysis of Brand Preference," Journal of Advertising Research, 4 (September 1964), 33-9.
5. A. S. C. Ehrenberg, "An Appraisal of Markov Brand-Switching Models," Journal of Marketing Research, 2 (November 1965), 347-62.
6. J. D. Herniter and J. F. Magee, "Customer Behavior as a Markov Process," Operations Research, 9 (January 1961), 105-22.
7. Ronald A. Howard, Dynamic Programming and Markov Processes, New York: John Wiley & Sons, Inc., and Cambridge: The Technology Press of The Massachusetts Institute of Technology, 1960.
8. John G. Kemeny and J. Laurie Snell, Finite Markov Chains, New York: D. Van Nostrand, Inc., 1960.
9. Richard B. Maffei, "Brand Preferences and Simple Markov Processes," Operations Research, 8 (March 1960), 210-8.
10. Abraham Shuchman, "The Planning and Control of Personal Selling Effort Directed at New Account Acquisition: A Markovian Analysis," in New Research in Marketing, Berkeley, Calif.: The Institute of Business and Economic Research, 1966, 45-56.
11. George P. H. Styan and Harry Smith, Jr., "Markov Chains Applied to Marketing," Journal of Marketing Research, 1 (February 1964), 50-5.
12. "The Climbing Cost of Industrial Calls," Sales Management, 92 (April 17, 1964), 87.
13. Joseph W. Thompson, Selling: A Behavioral Science Approach, New York: McGraw-Hill Book Co., 1966.
