
Review of Political Economy
ISSN: 0953-8259 (Print) 1465-3982 (Online) Journal homepage: http://www.tandfonline.com/loi/crpe20

To cite this article: Edward J. Nell & Karim Errouaki (2016) Modeling and Measuring Economic Reality: Reply to the Reviews, Review of Political Economy, 28:3, 448–463, DOI: 10.1080/09538259.2016.1154758

To link to this article: http://dx.doi.org/10.1080/09538259.2016.1154758

Published online: 08 Jul 2016.


REVIEW OF POLITICAL ECONOMY, 2016
VOL. 28, NO. 3, 448–463
http://dx.doi.org/10.1080/09538259.2016.1154758

Modeling and Measuring Economic Reality: Reply to the Reviews

Edward J. Nell^a and Karim Errouaki^b
^a New School for Social Research, New York, USA; ^b Foundation for the Culture of Peace, Autonomous University of Madrid, Madrid, Spain

ABSTRACT
To begin with, we’d like to express our appreciation to our three commentators for their thoughtful and helpful reviews. Like the founders of the subject, we believe, and our reviewers seem to agree, that structural econometrics has the potential to enable us to develop serious working models of how different economies actually operate, and also to tell us something about the changing patterns of their growth and transformation. But both we and our reviewers agree that there is a great deal wrong with the way econometrics is practiced today.

ARTICLE HISTORY
Received 2 October 2015; Accepted 30 October 2015

KEYWORDS
Conceptual analysis; DSGE; fieldwork; general equilibrium; Haavelmo distribution; induction; Keynesian uncertainty; lawlike; methodological triangle-circle; measuring economic reality; methodological institutionalism; modeling; probabilities; neoclassical theory; ownership and value relations; reliable and volatile relationships; scientific variables; statistical and substantive adequacy; structural econometrics; transformational growth

JEL CODES
B4; C1; C2; C3; C4; C5; C6; C7; C8; E1; E2; E3; E4

1. Textbooks and Probabilities


Both Spanos and Boland mention the misleading nature of textbooks, which encourage
students (and serious researchers!) to make invalid and thus meaningless calculations.
But it is not only textbooks; things are worse than that. One of us (EN) agreed to teach an introductory course, based on our book, which would warn students of misleading practices and invalid claims. This did not turn out well. After an introduction to statistics, the first task was to emphasize the importance of the error term and the assumptions about its distribution. For example, in the Classical Linear Regression (CLR) model, to estimate a coefficient, say the propensity to consume out of income, we start from the equation C = C* + bY + e, where C is aggregate household consumption out of income, C* is the autonomous component, b is the marginal propensity to consume out of aggregate income and e is the error term. We assume the relationship is linear (this is not an innocent move). First we shift the coordinates to eliminate C*, giving us c = by + e, or c/y = b + e/y. Then we consider the error term. The errors are supposed to be random, and so independent; they are equally likely to be positive or negative. Hence if repeated samples are drawn, or if the sample size is large enough, they should tend to ‘cancel out’; the mean should be, or should approach, zero. Hence c/y = b. We have an estimate of the propensity to consume! (See Spanos’s discussion of the CLR model in Section Two of his review essay.)
But what if the sample is reasonably large, of good quality, and the mean of the errors in the sample is nowhere near zero? It could be positive or negative. There is no reason now to suppose that the error term will go to zero as the sample size increases; nor is there any reason to suppose that repeated draws of the same size sample (if this were possible) would yield estimates that average out to b. Textbooks will say the estimate is ‘biased’. But in fact, b cannot be considered an estimate of the actual propensity of households to consume out of income, since b is either too large or too small, depending on the sign of the error term. However, the temptation is to ignore this, and to assume that the sample is drawn from a distribution that has desirable properties, such as a normal distribution and independence, so that the error term will vanish, giving us an accurate estimate of the actual value of b, in effect by assumption; or to adopt other assumptions about the distribution that make it possible to ‘adjust’ the estimating procedure to obtain a ‘useful’ estimate. But these calculations are not ‘useful’; the estimated parameter b does not measure the household’s propensity to consume out of income. (Note that the actual parameter—assumed to exist—is intrinsically unknowable because it cannot be observed in itself, but can only be estimated from data.) It is not that ‘the coefficient is biased’; it is that something is wrong with the calculation, and we don’t know what. It could be that there are errors in the statistics, but it could equally be that there are other variables that should be included in the calculation. Or the relationship might be non-linear. Calculations of this kind are meaningless unless there is evidence that the assumptions are valid. Nor are the assumptions really about the ‘error term’; they are assumptions about the ‘errors’ in the original equations—as shown in the Haavelmo Distribution. This was the point of the class, and the course proposed to apply the same analysis to the role of untested assumptions in Multiple Regression, Simultaneous Equations, Correlation and so on, and then to consider what kinds of evidence and testing could be found to accept or reject these assumptions.
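To make the point concrete, here is a minimal simulation sketch (ours, purely illustrative; none of the numbers come from the book or the course). When the error assumptions hold, the OLS slope recovers the assumed ‘true’ propensity; when the errors conceal an omitted influence correlated with income, the same formula runs just as smoothly and returns a number that does not measure the propensity to consume at all.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
y_income = rng.uniform(20, 100, n)  # aggregate income (illustrative units)
b_true = 0.8                        # assumed 'true' marginal propensity to consume

# Case 1: errors satisfy the CLR assumptions (mean zero, independent of income).
c_good = 10 + b_true * y_income + rng.normal(0, 2, n)

# Case 2: errors violate the assumptions: they contain a component that grows
# with income, e.g. an omitted variable correlated with income.
c_bad = 10 + b_true * y_income + 0.15 * y_income + rng.normal(0, 2, n)

def ols_slope(y, c):
    """OLS slope from regressing c on y with an intercept."""
    X = np.column_stack([np.ones_like(y), y])
    coef, *_ = np.linalg.lstsq(X, c, rcond=None)
    return coef[1]

print(ols_slope(y_income, c_good))  # ~0.80: assumptions hold, estimate is fine
print(ols_slope(y_income, c_bad))   # ~0.95: the calculation 'works' mechanically,
                                    # but b no longer measures the propensity
```

Nothing in the mechanical output distinguishes the two cases; only evidence about the validity of the assumptions can.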
But the proposed course was a complete washout. The students revolted; they did not want to study the derivations of the various expressions for different kinds of regressions, they wanted to run the formulae on their cell phones. They wanted to do econometrics, not study the deficiencies of standard econometric practice. They wanted to work with data sets and get answers to questions, not worry about obscure problems embedded in technical assumptions. Our colleagues largely agreed: the students want to get busy and work with data, and that’s all to the good. In the view of both students and colleagues, standard econometrics must be generally OK; after all, textbooks teach it and make money—it passes the market test.

2. Lawlike, Reliable and Volatile Relationships between Variables


But a lot of people understand that probabilistic assumptions are often made irresponsibly, even if they don’t realize how serious this is and continue to do it. Accepting this, we wanted to develop a further critique from a different angle in our book. This new critique starts from our distinction between Reliable and Volatile relationships, a point that none of the three reviewers discusses. Reliable relationships are grounded in technology and obligations, contracts and norms, not in calculations of optimizing and rational choice. Mainstream thinking for decades has sought to ground all economic relationships in optimizing behavior—‘microfoundations’—clearly implying that all relationships are on a par. They are all grounded in preferences, various constraints and rational optimizing; so it must seem that they are pretty reliable, since they depend on ‘economic rationality’, but also, at times, unreliable, since people are not rational all the time. To set up a model we assume agents are rational, so all theoretical relationships should be reliable, ceteris paribus. We reject this approach altogether, for reasons explained in Hollis and Nell (1975). Of course businesses may make use of optimizing models and other management tools when making their decisions, but it is the commitments, obligations and contracts that result from these decisions that bind the activities of economic agents, and define their relationships. The various tools, including rules of thumb, involved in making the decisions are not relevant here (except to note that economic decision-making is a complex social process, and is not determined uniquely by any simple model).
In contrast to Reliable relationships we propose that a number of the most important
economic relationships, especially financial ones, are inherently volatile and unreliable. It
is not that we don’t know what determines them, or why they develop as they do; in fact we
very often do know, though not until after the fact. These relationships simply have an
inherently free-floating and unpredictable aspect, even though very often they appear to
be stable and well-behaved—for a time. The Stock Market goes up regularly for months
or years. Inflation is tame and mild and normally stays at an even level. House prices
rise modestly in a regular fashion. It looks as though we can treat these variables the
same way we deal with employment, output, consumption and productivity, picking
out what affects them and establishing the relationships. But this is a mistake: we can
apply the same techniques, but we cannot have the same confidence in the results.
This distinction follows directly from our argument that there must exist ‘lawlike’
relationships in the economy, even though the economy must remain ‘open’. There
must be lawlike relationships between well-defined variables because such relationships
are necessary for the regular reproduction of the system. The economy exists, the social
order exists, and this existence has to be explained. Socio-economic existence doesn’t
just happen, it has to be brought about. Economic and social activity uses up energy
and material in the form of goods and services; these have to be reproduced, in a
regular manner (Nell 1998, Chapter 7; Sraffa 1960).
These questions do not normally get much attention from most economists.[1] No doubt these are philosophical issues in some respects—but the question is not what discipline an argument ‘belongs to’, but whether it is relevant and sound. ‘Do we need an answer here? Are we making a good argument, does it get us nearer to the truth?’ Even though our arguments concern epistemology and ontology, understanding the mechanisms that enable the economy to hang together and carry out its provisioning and reproduction is a conspicuously practical matter. The existence of lawlike relationships is, in fact, a matter for economics, because the continued existence of the socio-economic order depends on the functioning of a system that will regularly reproduce the goods, services and trained people who are used up or worn out or retired every year (Nell and Errouaki 2013, Chapters 4 and 5; see also Sraffa 1960). And to study how this happens it is appropriate to construct models.

[1] See Lowe (1965, 1976) for a discussion of these issues. Tony Lawson (1997) has examined the role of lawlike relationships in the social sciences, albeit from a perspective very different from ours. Yet he also is concerned with questions of social ontology.
But while our argument supports the claim that there must be lawlike relationships, it does not support the claim that all socio-economic activity is governed by stable, reliable lawlike relationships. If that were the case, it could be argued that all economic activity is deterministic, and that all data sets are ergodic. This would be very convenient for econometrics, but it is not the case. Indeed, we emphasize in the book that economic activity takes place in conditions of Keynesian uncertainty. Furthermore, some variables may not stay in stable relationships, and some apparently lawlike relationships may be subject to sudden changes. There is uncertainty here.


Indeed, looking carefully at our book, a critic might claim to see a gap in our argument:
we basically took it for granted that we all agree that the future is uncertain and that
aspects of it are unknowable; the decisions that will determine aspects of the future
have not been taken yet, and can’t be predicted. Innovations are coming, but we cannot
predict them in any detail. Conflict and competition are under way, and it is unclear
who will be the winners and losers. Agents face tests of strength, skill and wit; can they
meet the challenges? Will they succeed? Agents face undecidable problems, where no
one can say what will happen.

3. Can Laplace’s Demon Crowd out Uncertainty?


There is a long philosophical debate over Determinism: could Laplace’s Demon predict the price of oil 10 years or 10 months ahead, does he know now the prices of all stocks into the indefinite future, and so on? We have taken it for granted that the answer is no, that some aspects of the future are unknowable; regardless of what is the case for the physical world, universal Determinism for the social world is false. But apart from footnote 17 in Chapter 5 (pp. 143–144), we did not argue the position. Yet, if the econometric world is ergodic, as many modelers assume, the answer should be yes; Determinism would then be true, at least for the aspects being studied.
So let’s start by considering whether the Demon, as a scientific observer, can predict the answer, and determine its truth, when a third party—you or me—asks the famous Greek Liar, now a leading statistician, whether or not, in fact, he always lies. If he says Yes, he tells the truth—so he is lying. If he says No, he is lying, so apparently he does always lie, which means he is telling the truth … . This is a problem for the Demon, but not much of one for anyone else. Yet it points the way, because it arises from the mind reflecting on itself (cf. Nell and Errouaki 2013, p. 123).
Let’s take a step further: suppose we issue a derivative on a dedicated con artist; he has let us know that he will never keep a promise and we can count on it. We take his side and sell to counterparties. He is as good as his word, reneges on all deals and defrauds party after party, until one day an angry creditor shoots him. He has never kept a promise in his life, just as he said. We go to collect from the counterparties, and they refuse to pay up. They say: He did keep his word—to us! We reply, So he broke it—which is what he said he would always do … but then, they retort, ipso facto, he kept his promise to us … . So who wins the bet? What would the Demon predict, on what grounds?
The self-reflecting mind can thus generate paradoxes that might cause real trouble, but
much more importantly, the mind’s ability to stand back and reflect on what we are doing
is the source of innovation. But predicting novelty presents even more serious problems.
How can an innovation be predicted? If the innovation is described in full detail, then it
has been made, not predicted. But if it is not described in full detail, then it has not been
truly predicted either. To predict an innovation, we have to say what its impact will be,
what effects it will have, how it will be received. But to do this we must know how this
innovation is going to work, we have to anticipate what the problems with it will be,
how it will interact with related technologies and with the markets, and for all this we
need to know the details. So if the Demon predicts an innovation, he cannot tell
anyone about it!
The problem of novelty extends further. Consider interlocking decisions, e.g., to invest
and build new plants, to open a marketing drive, to move to a new location, to enter a new
market. Company A’s decision is contingent on that of B, whose decision depends on what
C does, where C, in turn, is waiting to see what A does, and so on. Once one of them takes
the plunge everything is clear, but until then we just have to wait. The crucial piece of data
doesn’t exist. It is a novelty. Likewise, a firm may be waiting to see whether an innovation
works, or a focus group approves a proposed new product.
Why can’t the Demon just predict which firm will take the plunge, whether the innovation works, the focus group approves or whatever? Again, to predict innovations and new developments is to produce them, bring them about; they cannot be predicted like the trajectory of a comet, or the result of mixing chemicals. When we are faced with something truly new, there is no past experience to draw on to develop ‘laws’.
In addition to interdependent decisions, problems can arise from interdependent
expectations. ‘Nobody goes to that bar any more; it’s too crowded.’ This is the famous
El Farol problem; should we go there or not? If it’s expected to be too crowded, no one
will go—and it will be empty! But then, since it’s empty, everyone will go, and it will be
crowded! What to do? If we can’t figure it out—and the mathematicians at the Santa Fe
Institute say we can’t—then what can the Demon tell us?
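The logic is easy to exhibit. Here is a minimal sketch (ours; the 100 agents and the 60-person comfort threshold are the conventional parameters of the puzzle) showing why no shared forecasting rule can settle down: if everyone predicts the same attendance and acts on it, the prediction defeats itself.

```python
# All 100 agents use the same naive forecast: next week's crowd will equal last
# week's. Each goes only if the forecast is below 60. Shared expectations defeat
# themselves, and attendance flips between everyone and no one.
N_AGENTS, THRESHOLD = 100, 60
attendance = 40                      # arbitrary starting value
history = []
for week in range(8):
    forecast = attendance            # everyone predicts last week's crowd
    attendance = N_AGENTS if forecast < THRESHOLD else 0
    history.append(attendance)
print(history)                       # [100, 0, 100, 0, ...]: no stable forecast
```

Heterogeneous, shifting predictors soften the flip-flopping, but, as the Santa Fe results suggest, they never deliver a forecast that is both shared and self-fulfilling.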
Winning and losing also pose a special problem for the Demon. It’s easy to pick a winner; we do it all the time, sometimes even getting it right. But when we pick the winners of the Super Bowl and the World Series, we are making bets, not scientific predictions. Consider: sinking the putt that wins the tournament on the 18th hole; hitting a home run in the bottom of the ninth. You can do it, but will you, on this occasion? Failure is always possible: you lose concentration, you get distracted, or your opponent on that day is just better. Success is never guaranteed in a competitive contest, whether in a game or on the market or in politics. If success were always guaranteed, always predictable, it wouldn’t be a contest.[2] We can often predict what agents will set out to do (e.g., on the basis of rational decision models) but it is another matter entirely to predict whether they will succeed.

[2] It could be argued that winning, losing, success and failure were predictable in principle, but no one actually knew how. But how could we know that all contests are predictable in principle, if we don’t know how this could be done?

And then there are moral issues. One way of predicting behavior is to determine what the agent or agents ought to do. What would the right course of action be, in these circumstances? The problem, a very common one, is that there may be contradictory ‘oughts’, pulling in different directions. ‘He ought to stay home and look after his widowed mother’; ‘He ought to take the scholarship and go to college; everyone will be better off in the long run’. We can often say on the basis of character that he will ‘do the right thing’, but in a conflict of moral imperatives, it may be impossible to say what is right.
Importantly, all this adds up to the conclusion that the socio-economic world is open, indeterminate, in a number of different areas and for a number of different reasons. In other words, parts of it, at least, are non-ergodic. Keynesian uncertainty is real, and cannot be reduced to measurable, and therefore predictable, numbers. Uncertainty cannot be repackaged as risk. The significant implication is that there will be aspects and patterns of behavior that may be unpredictable, yet may have to be included in models of the economy.

4. Variables, Laws and the Methodological Triangle-Circle (MTC)


None of our commentators mentioned our efforts to provide a precise definition of scientific variables and of the relationships between such variables (Nell and Errouaki 2013, pp. 85–96). A careful examination shows that a well-defined ‘variable’ implies logical constraints on the kinds of relationships it can enter into. Further, there are also implications and constraints on the way such variables and relationships are attached to their ‘bearers’. That is, functional relationships connect scientific variables, and are ascribed to appropriate bearers: consumption is a function of income, and this function describes the behavior of households. The quantity demanded of a good is a function of price, and this describes the behavior of an agent in the market. If the relationship implies extensive powers of discrimination, or comprehensive knowledge, the bearer must have been somehow provided with such powers. This discussion provides foundations for our analysis of ‘relevance’, that is, of how models refer to the world. It also provides a basis for criticizing neoclassical theory:
There are indeed lawlike relationships to be discovered, although, unlike the laws of natural
science, they are limited and bounded by history and geography. Moreover, they can change
as a result of changes in institutions and technology. (But the processes of such change can
themselves be explored.) And some relations must always be inherently volatile. (Nell and
Errouaki 2013, p. 401)

This provides a general way of looking at econometric models, and their position in history. Models can be developed; there are general laws, but they can change over time; there are also unreliable laws or apparent laws, which can shift and change at short notice.[3] Boumans may be closest to us on this, we think. Indeed, it seems that he basically agrees with us, although he expresses some reservations and favorably discusses a different approach.[4] But he seems to like the Methodological Triangle-Circle (MTC) that we use to summarize our framework.[5]

[3] We need to add that the variables and relationships must be economic. This is not a simple or innocent point. For almost three thousand years Ancient Egypt arranged for the regular ‘cross-delivery’ of goods and services, enabling reproduction and the distribution and regular use of a surplus to take place. But no money, as we understand it, was used, nor were there markets as we understand them (in contrast to Mesopotamia). Detailed accounts were kept; deliveries were made largely on the basis of tradition and religion; sometimes force was used, but generally it was not necessary. Our discussion of Ownership and Value relations (pp. 126–133) is designed to explore these issues.
Boumans describes economic laws as ‘inexact’, as do some other methodologists. In our opinion this is a mistake; ‘inexact’ with respect to what? How are we supposed to say what is ‘exact’? For the most part the laws are as ‘exact’ as they need to be for the system to work. Economic relationships change and develop under the pressure of the forces driving transformational growth. (Consider our example of the change from a partially stabilizing ‘price mechanism’ to the output multiplier, as the technology moved from 19th-century craft to 20th-century mass production.) By contrast, we try to give a precise definition of economic laws, showing how and why they are not like physical laws, and why and how they can change over time.
Leontief (1984) and Morgenstern (1963) made good points about the limitations and distortions of data that contain ‘evasive answers and deliberate lies’ (problems that fieldwork can help to overcome). Leontief (1984) noted that economics was remarkable for the ratio of the immense amount of empirical effort to the tiny yield of agreed-upon empirical knowledge, a ratio that is only worse today—not, in our opinion, because econometrics can’t be done, but because it is done improperly. Neither Leontief nor Morgenstern was sympathetic to econometrics (or for that matter to macroeconomics). Koopmans (1947) felt that empirical work had to be guided by theory, which in his view had to be neoclassical theory; he did not seem to appreciate macroeconomics at all, nor did he seem to recognize any alternatives to neoclassical thinking.
By contrast, we think econometrics, properly done, can be extremely valuable; but it
must stay within the limitations indicated on p. 401 and elsewhere throughout the
book. The program of the founders was sound enough—a good start, anyway—but they
modeled it too much on natural science, especially physics, and they did not appreciate
that special techniques (such as fieldwork) are necessary for a social science. In particular,
they did not understand that the ‘openness’ of social existence requires a new and different
approach to modeling.
Boumans suggests an approach, based on ideas of the physicist Heinrich Hertz, that he considers a possible alternative to ours (though close to it). We are not convinced. Our ‘Coherence’, or Conceptual Analysis, concerns drawing out the essentials of the concepts in a model—the variables and their bearers, and the relationships among them. This sets the stage for Relevance: the definition of a concept in the model must agree in essentials with the way that concept works in practice. Banks in the real world can fail, as we know all too well; firms do not know the future, households don’t either, and don’t spread out consumption in a rational plan over a lifetime, etc. The model’s concepts must refer to, must have the same essential properties as, the concepts embodied in the practices of the economy. (Discovering these is one of the functions of fieldwork.) Concepts like ‘income’, ‘capital’, ‘money’, ‘labor’ and ‘work’ must mean the same thing in the model as they do in practice. Of course, the model’s definitions can, and indeed must, abstract from the multifarious and irrelevant detail of actual behavior, but they cannot distort reality (no uncertainty) or posit idealized agents (instantaneous mobility). The model cannot endow the agents, the bearers of the variables, with powers and knowledge that they do not typically have, or in many cases could not have.

[4] We are not inclined to put much faith in ideas and analogies drawn from physics, for reasons made clear in our discussion of ‘Variables, Laws and Induction’ (Chapters 4 and 5). We argue that there are clear differences between the natural and the social sciences; both make use of models and both look for and find ‘lawlike relationships’, but the natural sciences can develop theories and models that are ‘closed’ and deterministic in ways economic and social sciences cannot be. The cycle the electron follows around the core of the atom does not depend on the electron’s expectations or knowledge; there are no such things in physics. But the business cycle does depend on the expectations of the firms and households experiencing it.

[5] At the end of his paper Boumans makes the very interesting observation that we can distinguish activities along the edges from what goes on in the interior of the Triangle.

5. Equilibrium?
Boland argues that equilibrium models are the wrong way to go, and we agree. But we go in a different direction in regard to the reasons and analysis. Basically, if the economy is undergoing Transformational Growth, which it always is, equilibrium analysis can at best be a short-run approximation. Boland’s critique of Dynamic Stochastic General Equilibrium (DSGE) models is surely sound, but in our view it does not go far enough.[6] We argue (pp. 158–159 et passim) that a model consists of a purely formal skeleton, with the conceptual and interpretive flesh and blood filling it out. The formal part is usually some mathematical algorithm or set of algorithms—maximizing, simultaneous equations, dynamic movements—while the conceptual part interprets the variables and relationships in the mathematical system as elements of the economy. The MTC diagram comes into play here. The concepts expressing the meaning of the variables and relations have to be consistent with those inherent in the practices or institutions being modeled; this is ‘relevance’ and must be checked by fieldwork. But the concepts and their relations also have to be consistent with the formal structure as well, and that formal structure must exist in what is being modeled, which must be checked by measurement.
Now consider a general equilibrium (GE) model: any model claiming to encompass the whole economy must be open. In particular, we argue that the financial side of a macro or GE model must incorporate true uncertainty: it cannot be closed and determinate, except temporarily. Here we run into a problem that may stem from the fact that economists have borrowed tools and approaches from the physical sciences. In the course of using a model to analyze a problem, the equations of the model may be solved, which means it will be temporarily treated as closed and determinate. Solving the model makes it easier for the observer, the builder of the model, to draw conclusions and develop the implications. But the need to solve the equations is a consequence of using the mathematical methods we typically draw on; the actual positions of markets and agents in the volatile sectors of these models are, in fact, fragile and temporary, regardless of how they are treated in solving the model. They are not equilibrium positions; the agents are in the process of continually rethinking their positions and making up their minds anew.[7] Equilibrium models, Boland says, ‘are incapable of dealing with real-world economies’. Well, yes and no. Neoclassical rational choice equilibrium models are usually worthless descriptively, for many reasons—but linear and non-linear programming models, and other management optimizing techniques (like critical path analysis), are highly useful for prescriptive purposes. Keynes-Kalecki aggregate demand analysis and Minsky-type financial fragility models can be solved for ‘equilibrium’, and these are certainly useful (of course the definition of equilibrium is different from that of the mainstream). But Boland is surely right to emphasize that the economy is always in motion, and seldom in any kind of balance, so equilibrium modeling is out of touch at worst, and potentially misleading even at best.

[6] In recent testimony before the US Congress, Robert M. Solow (2010) gave a prepared statement on ‘Building a Science of Economics for the Real World’. He argued that modern macroeconomics has not only failed at solving economic and financial problems, but is bound to fail. He observed that the DSGE model ‘populates its simplified economy … with exactly one single combination worker-owner-consumer-everything-else who plans ahead carefully and lives forever. One important consequence of this “representative agent” assumption is that there are no conflicts of interest, no incompatible expectations, no deceptions’ (p. 2). The model, Solow suggested, fails the ‘smell test—does this make sense?’

[7] Leontief (1971, pp. 3–4, cited by Boumans) also observes that the ‘system [is] in constant flux’. Yes, and the flux is among the most important things to study.
Taking this further, we think neoclassical theory is mostly unrealistic in ways that make it literally inapplicable; it does not, and in many cases logically cannot, refer to the world. Our argument is that its variables and relationships, and especially the bearers of its variables, are not defined in ways that would permit their continued existence. Notoriously, the assumptions made about the agents and their world are generally ‘unrealistic’, but that is not the point here. Rather, when assumptions are made, the necessary conditions presupposed by those assumptions must be added to the model. This is almost never done; think of the kind of information and communications system that would be necessary to provide households and businesses with even a plausible and limited degree of ‘perfect foresight’. But even to assume an educated and skilled labor force is to assume also the educational system necessary to support and maintain that labor force. Likewise for capital: economic activity uses up and wears out capital goods, but the endowment of capital must be kept intact, and that requires specialized production and exchange, as is clear in Sraffa and Leontief. In general, neoclassical models fail to underwrite their assumptions with the conditions necessary to maintain them, so that broadly, and in some cases precisely, these models are inconsistent with the conditions for regular reproduction. Such an approach can’t provide a scientific basis for analyzing an economy.
Structural models of reproduction, in our view, provide the basic foundations for macroeconomics, and macroeconomics sets the stage and determines the conditions in which all other economic activity takes place. There’s no space to argue this here, but the theme runs throughout our book (see also Hollis and Nell 1975).

6. Probability
Boland also contends that the use of probability in econometrics is very often misguided, because the events being studied are singular or unique. If events are not repeated, no frequencies can be developed, so there is no way to define probabilities. But he also notes that besides the frequentist concept, there is another notion of probability, namely that probability measures the weight of the evidence in favor of a proposition. This means all the evidence from all possible sources. So on this view, the probability of surviving a jump off the CN Tower need not be restricted to 0 or 1. It might make perfectly good sense to say that depending on the condition of the ground, the physical condition of the jumper, and whether the jump is feet-forward or head-first, there could be, say, a 25 per cent chance of survival; the number is perhaps more metaphorical than scientific, but it nevertheless gives us an idea of the magnitude of the chances. So while the absence of a coherent frequency certainly means we cannot assess statistical adequacy, it does not mean we cannot draw on the intuitive idea of probability. But it has to be used properly; it can’t ‘stand in’ for a frequency distribution, but it could be used to qualify a forecast. We argue in the book (pp. 116–118, 322–323, 347–348) that both concepts of probability are valid and that both are needed in the analysis of social and economic phenomena.

7. Statistical and Substantive Adequacy under Uncertainty


Spanos’s paper presents us with serious challenges, while at the same time offering strong support for our general positions. It begins by quoting a passage from us, in which we seem to be clearly wrong. The passage, taken as he quotes us, does appear to mix up statistical adequacy and substantive adequacy. What we meant was that statistical adequacy, a Measurement issue, does not make much of a contribution to ensuring substantive adequacy; yes, it is a necessary condition, and, yes, getting it right can help to understand the relationships (a Relevance issue), but the substantive questions have to be dealt with at the Conceptual level, not at the point of Measurement. Mis-specification testing is definitely useful, and helps to define more clearly the relationships between data and the elements of the equations (i.e., matters of Relevance). But getting specification right is chiefly a Conceptual Analysis matter, always bearing in mind that such analysis needs to be interactive with relevance and measurement, making use of fieldwork as well as statistical work.

That this is what we meant is shown on pp. 303–308 of our book, where we summarize Spanos’s approach in some detail, and where we talk carefully about statistical adequacy and what to do about it. (We discuss mis-specification testing, but we don’t take up his latest work, which his article here does.) Spanos claims that our ‘“methodological institutionalism” cannot be properly implemented without the notion of statistical adequacy’. In general, we agree, for the reasons he advances in his paper, many of which we covered in the pages cited above. But statistical adequacy is only a step on the way to substantive adequacy, which is what we all really want. We think there is a larger problem with econometric modeling, and what can be expected from it, and appeals to statistical adequacy will not help with this—for the very good reason that in some, and perhaps many, important cases full statistical adequacy may not be achievable. And why not? The joker in the pack is Keynesian uncertainty.

8. Reliable and Volatile Relationships in Models


Spanos does not take up our Reliable/Volatile distinction. Reliable functions and variables are grounded in the conditions of the present, while volatile functions and relations are determined by expectations of the future. But the distinction goes beyond that: reliable relationships are largely real, while volatile ones are monetary and financial. The short-run output-employment relation is reliable and real; so is the relationship between income and consumption. But the price level as a function of the exchange rate is monetary and volatile, and the price level as a function of the quantity of money is notoriously volatile. The speculative demand for money (or securities) as a function of the interest rate is volatile and financial. What this distinction does, then, is help to formulate the general claim that, in the short run, the real side of the economy is persistently ‘shocked’ by fluctuations emanating from the financial side. But these financial fluctuations are not themselves necessarily ‘caused’ in any systematic way by any systematic ‘forces’. The functions are inherently unstable, because they depend on our expectations of the future, the correctness of which cannot be determined. Hence an indefinitely large list of factors can influence these expectations in an indefinitely large number of ways.

In the simple Klein-Nell model, it is clearly indicated which parts of the model are reliable and which are volatile. Then, in solving it, we can show the reliable parts as functions of the volatile, where fieldwork, common sense, vernacular knowledge and conceptual analysis can lead to determining short-term hypotheses for the volatile relationships.[8] With this, reduced forms can be estimated, and we can see how the changes in the volatile variables affect the rest of the economy, providing guidance to those developing policies to control fluctuations.[9]

[8] A Klein-Nell approach, using the reliable/volatile distinction, provides an account of the way underlying economic structures evolve over time that is based in actual practice, as understood through conceptual analysis and fieldwork. In Klein-Nell, the theory assumes a structure of production, a Leontief-Sraffa technology, with a Cambridge approach to effective demand, and a modern monetary system. No agents are assumed to have impossible or unrealistic powers (perfect foresight, costless mobility); no unexplained institutions (i.e., Walrasian auctioneers) are postulated. So the theory is coherent and the demand side reasonably well worked out, with the supply side implicit, but available from other input-output studies. Applicability is guaranteed by fieldwork and by our knowledge of the vernacular. Measurement is provided by national income statistics. For further details see Errouaki (2014).

[9] No doubt a conventional economist might simply deny that the economy can be usefully divided into reliable and volatile elements, and would argue that all functions are somewhat volatile, and all are reasonably reliable, too. This flies in the face of data and history; it also runs contrary to common sense. Some relationships are strongly ‘tethered’; others are free-floating, and our expectations of them are subject to sometimes extreme uncertainty. Of course there may be many in between.

9. ‘Having’ a Probability Distribution


But if this distinction can be generally supported, if there is Keynesian uncertainty, as there must be, it calls for some rethinking of the idea of statistical adequacy. We certainly agree that determining statistical adequacy should be done whenever possible, when it makes sense.

But the Haavelmo Distribution assembles all the observations together; Reliable variables and Volatile, future-oriented variables are pooled. Reliable variables, however, may plausibly be supposed to have fairly well-defined distributions, depending on the sources of variations. But some important volatile variables—financial variables, inflation variables, for example—may not meet the conditions that allow us to say they have a certain probability distribution.
What does ‘have’ mean here? To say this variable has such-and-such a probability distribution requires that there be a well-defined process generating measurably different values of the same variable, and the set we have today may be considered a ‘draw’ from the larger set of all possible realized values of the variable. Usually there will be a target value, that is to say, a correct or true dependent value, that corresponds to the values of independent variables, and the process that generates the values tends to hit and miss the targets in a regular fashion; the results then may be normally distributed, or skewed one way or another, with or without fat tails, etc. But a variable that results from market activity may be well-defined even if there is no identifiable process that generates that market activity and there are no clear-cut true values (for example, an index of Stock Market prices). The actual values, say of the Stock Market, will be easily calculated, and a huge database assembled. But what is the process that generates, causes or directs the market activity? There are many answers, and no agreement. Nor is it clear that there is any one or dominant cause or process; some causes or processes appear to operate at one time, others at other times, changing unpredictably. Many may operate at the same time. The processes that generate values may be changing, and changing in irregular and unpredictable ways.
The problem is this: a long series of observations of a Volatile variable, in a relationship subject to Keynesian uncertainty, will very likely divide into stretches of time that show different patterns of variations of the variable—for example, when the market is generally optimistic compared to when it is generally pessimistic—but we may nevertheless want to use some of these data to derive a (temporary) estimate of the coefficient of the variable, either for specific past periods or looking forward. So we won’t be able to calculate statistical adequacy in any mechanical manner. Yet we would want to do so for the Reliable part of the model, and for Reliable relationships generally.
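The pooling problem is easy to see in a simulation (ours, purely illustrative; the regimes and coefficients are invented for the purpose). A ‘volatile’ series whose slope against some driver changes between an optimistic and a pessimistic stretch yields a pooled estimate that is true of neither stretch, while estimates computed stretch by stretch recover the temporary coefficients:

```python
import numpy as np

rng = np.random.default_rng(1)

def ols_slope(x, y):
    """OLS slope with an intercept."""
    X = np.column_stack([np.ones_like(x), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

x = rng.normal(0, 1, 400)                          # some driver variable
beta = np.where(np.arange(400) < 200, 1.5, -0.5)   # regime shift halfway through
y = beta * x + rng.normal(0, 0.3, 400)

print(ols_slope(x, y))              # pooled: ~0.5, true of neither regime
print(ols_slope(x[:200], y[:200]))  # optimistic stretch: ~1.5
print(ols_slope(x[200:], y[200:]))  # pessimistic stretch: ~-0.5
```

A Haavelmo Distribution formed over the whole sample would treat the 400 observations as draws from a single process; here there is no such process.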
10. Two Models


As an example, consider our discussion of the Marginal Efficiency Calculation and the associated Conventional Projection—Keynes’s idea of the ‘conventional projection’ of today’s economic position into the future (Nell and Errouaki 2013, pp. 424–429). Recall from the book that this consists of two equations and two unknowns: the present as a function of the expected future, and the expected future as a function of the present.
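Schematically (our notation here, not the book’s), writing $x_t$ for today’s economic position and $x^{e}_{t+1}$ for the conventional projection of the future:

$$x_t = f\big(x^{e}_{t+1}\big), \qquad x^{e}_{t+1} = g\big(x_t\big),$$

two equations in two unknowns, solved simultaneously: today’s valuations rest on a projected future that is itself read off today’s position.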
This discussion follows our development of Keynes’s idea of ‘animal spirits’ and an exposition of our idea of ‘tethered’ and ‘floating’ expectations. Business expectations about the operation of the technology and productive capacity currently in use are likely to be well grounded, tethered to contracts and obligations. Likewise business expectations about what will happen in today’s markets. By contrast, expectations about the future operation of technology and behavior of markets will necessarily be less well grounded, indeed may be ‘grounded’ in unwarranted optimism or pessimism, so that estimates of future earnings will contain at least some free-floating elements.
So were we to form a Haavelmo Distribution we would put together the current values
of capital, on the one hand, and the expected future values of capital, on the other. The
former are reliable, and it should be possible to find out something about the distribution.
But the latter are volatile, and given the varieties of factors that influence our views of the
future, there is not going to be any clearly defined generating process. Of course, given any
data set, we can examine its properties, but if it is volatile, then it is not a draw from a
larger set, which would be the set of all values generated by the process.
Objection! This is not a full-scale macromodel. It abstracts from far too much, leaves
out most of the economy. So let’s consider the Minsky-Nell model summarized on
pp. 453–461 of our book, and presented in detail in Chapter 13 of Nell (1998).
This is an aggregate business cycle model for a growing economy. It’s also fairly simple and based on plausible assumptions and reasonable behavior. Nothing is idealized, though the abstraction is considerable. The idea is that, given the real wage and the level of productivity, variations in aggregate demand generate corresponding variations in realized profits, but at the same time variations in the level of activity lead to variations in the amount of profits required to sustain activity (pay fixed costs and interest, meet the bill for salaries and bonuses, for legal and other fees, etc.). The level of activity, in turn, is driven by Investment spending, which depends on projected growth. So there are two relationships here: one shows the profits generated by each possible level of current Investment spending; the other, the level of profits required to support a level of Investment spending (and the activity that follows). Given an initial level of capital, we can write these as the current rate of realized profits and the current rate of projected growth.
The first of these, Investment generates Profits, is pretty straightforward, and uncertainty does not play a role here. Household behavior is treated as passive; households ‘spend what they get’, so revenue, and thus realized Profits, depends on the multiplier, applied to Investment and, of course, to any ‘autonomous’ spending by business. The second is trickier, and is where uncertainty comes in. In order to sustain a particular level of Investment, firms have to have sufficient Profits on hand to convince the banks to finance the Investment. On balance, the larger the level of Investment, the more wary the banks will be, and the larger the Profits will have to be. But this relationship is filled with uncertainty, and is also subject to the effects of widespread feelings of optimism or pessimism. Moreover, the required level of Profits reflects the level of fixed costs and various fees, something that can be expected to rise over time if the economy is booming, or fall if it is descending into a slump. But again this movement, while clear in the statistics in general, is nevertheless subject to great uncertainty in regard to detail.
Here’s how it works. At very low levels of activity (low Investment) the Profits generated will surely tend to exceed what is required, especially if the economy has been at the low level for some time, so that fixed costs, etc., are greatly reduced. As expectations improve, Investment and so activity will increase. Also, debt will pile up and fixed costs will rise; optimism will lead to the expansion of bonuses, greater use of consultants, etc., increasing the level of profits required. Once this rises above the level generated, Investment will have to be cut back, and the level of activity will start to decline, creating widespread uncertainty, leading to contraction.[10]
For those interested, the diagram is shown in Figure 1; footnote 10 explains how the shifts in these two functions trace out a cycle (Nell and Errouaki 2013, p. 459). The Investment Finance (IF) relation does most of the shifting, in response to changes in the degree and nature of uncertainty. (The slope of the rE will change if the real wage changes.)

[10] The model boils down to two reduced-form equations in two variables, the rate of growth g and the general rate of return r. The first reduced-form equation is the rE line rising from left to right, showing that as the rate of growth rises, the rate of return also rises (in a special case, along a 45-degree line). The intercept will be autonomous spending. Then the rising line shows that the spending from growth generates profits; as the growth rate rises, investment spending rises, increasing the current, realized rate of profit. So this equation gives us the ‘generated rate of return’. The other equation connecting these two variables, called IF for Investment Finance, shows the rate of return that will be necessary to support a certain level of the rate of growth—the ‘required rate of return’. On a diagram this also rises from left to right, but initially it has a low slope; then as the growth rate rises, risk rises, slowly at first, then rapidly, and the curve turns up sharply. The degree of this curvature will reflect uncertainty, being larger as uncertainty is higher. The intercept shows the level of fixed costs, and will rise in the boom and fall in the slump.
Adding two dynamic conditions, a simple demand-driven business cycle story can be told using these two reduced-form equations. The first of these conditions is that g rises when the generated rate of return is greater than the required, while the second is that the intercept of the IF is strongly procyclical, whereas that of the rE moves very little. Consider a middling level of g, such that the generated rate of return lies above the required. Then g will rise; but as g increases, the intercept will rise, and risk will also rise, so the curvature of the function will increase. Eventually it will tend to rise above the rE line, but once it becomes tangent, the system becomes unstable downwards (Nell 1998, pp. 681–682), precipitating a downswing. This in turn will lead to cost-cutting and downsizing, lowering the intercept of the IF, until it falls below the rE again—at which point a turnaround will occur and expansion can begin once more.
Figure 1. The Minsky-Nell model
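As a numerical check on the story in footnote 10, here is a minimal sketch (ours; the functional forms, parameter values, and the floor and ceiling on g are all illustrative assumptions, not the book’s). It implements the two reduced forms together with the two dynamic conditions: g adjusts toward whichever side of the IF schedule the generated rate of return falls on, and the IF intercept creeps up in the boom and down in the slump.

```python
# rE: generated rate of return; IF: required rate of return (see footnote 10).
a0, a1 = 0.5, 1.0          # rE line: r_generated = a0 + a1 * g
c = 0.04                   # curvature (risk term) of the IF schedule
alpha, gamma = 0.2, 0.15   # adjustment speeds of g and of the IF intercept
g_floor, g_cap = 0.0, 8.0  # assumed replacement floor and capacity ceiling on g
g, f, g_mid = 2.0, 1.0, 2.0

path = []
for t in range(400):
    generated = a0 + a1 * g              # profits generated by growth (rE)
    required = f + c * g ** 2            # profits required to finance it (IF)
    g += alpha * (generated - required)  # g rises when generated > required
    g = min(max(g, g_floor), g_cap)
    f += gamma * (g - g_mid)             # IF intercept is procyclical
    path.append(round(g, 2))
print(path[::20])  # g climbs, stalls as fixed costs and fees pile up, slumps,
                   # then recovers once cost-cutting lowers the intercept
```

The point is not the particular numbers but the mechanism: the turning points come from the interaction of a reliable rE with a shifting, uncertainty-laden IF.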

But the details are not at issue here, nor is plausibility or degree of oversimplification; instead the question is, how could such a model be confronted with data? Can the relationships be estimated? What kind of data could we draw on? The first function, showing growth generating profits, is a combination of expenditure relationships and multipliers; it is clearly reliable, but the second is another matter. This shows the profits required to support a level of Investment. Some parts of it are reliable, and even the treatment of risk has solid grounding (Nell 1998, pp. 662–666). But both the risks and the financial costs will depend on expectations of the future, and once again these can be highly volatile, so that determining a probability distribution (except for a short time, or under special conditions) brings up the question of whether there really is any such thing. Let’s ask further: are the costs of supporting a level of Investment reliably related to the level of activity, or do they vary with evanescent phenomena like the state of confidence or degree of optimism? If the latter seems likely, then the second function is subject to uncertainty.
So far we have not modeled the equity market, or addressed the debt/equity ratio; and more could be said about interest rates generally. A first step would be to combine the two models above, and the next would be to model interest rates and debt more fully. Finally, we might wish to bring in money explicitly, including both the price level and its rate of change (Nell 1998). This would allow us to deal with inflation and to distinguish the nominal and real rates of interest. But consider what this means: the first model is all about the future; today’s interest rates and value of debt depend in part on expectations, as do the price level and inflation. All these are future-oriented, and should be considered potentially volatile; fitting them into the second function will make it even more unreliable.
So what to do? The fact that the second function is unreliable is not the end of the story; it is a new beginning. We want to fit all the volatile elements, reflecting uncertainty, into one place, in this case an enhanced IF function. That way all the volatile elements will be incorporated into one function, while the other, the rE, will be Reliable. The model will then exhibit the effects on the stable parts of the economy caused by the changes in the volatile elements. The reliable parts can be estimated following statistically adequate and sensible econometric procedures, while the volatile relationships will have to be addressed in as many ways as possible, drawing on other subjects and on fieldwork, as well as whatever econometrics can be done when the relationships may sometimes be there and sometimes not, may be one thing at one time and another at a different time. Yet if conceptual analysis identifies a plausible causal link between the expected value of a future variable and something today—sales of a new product and investment spending today, sales of a future product and equity today—then it is worth trying to estimate that link, even if we may expect the link to change in unpredictable ways. By doing this we can see more precisely how uncertainty affects the volatile relationships, which in turn impact the rest of the economy. By tracing out these connections, we will, hopefully, make it possible to see where best to develop policies to control, direct and limit the volatility.

11. In Conclusion
The reviews were great, thought-provoking and critical but constructive. The common
theme seems to be that empirical model-building is the right way to go, but that it is
not being done properly. Spanos (and ourselves) say that Haavelmo started us off well,
but the profession has lost its way since; we need to regain statistical adequacy as far as
possible. Boland (and ourselves) would like to move away from equilibrium modeling,
towards greater realism and more attention to dynamics, while Boumans (and ourselves)
want to move towards greater interaction between fieldwork, measurement and modeling.
However, we feel that Keynesian uncertainty has not been properly taken into account by
econometricians, and we propose an approach based on the Reliable-Volatile distinction.
Nevertheless, there seems to be some real agreement here, unusual in this field, and this is
a good place at which to end.

Disclosure statement
No potential conflict of interest was reported by the authors.

References
Errouaki, K. 2014. ‘A Klein-Nell Alternative to the Failed Macroeconometrics of Lucas and the
Rational Expectations Hypothesis.’ The Foundation for Culture of Peace: Autonomous
University of Madrid [unpublished].
Hollis, M. and E. J. Nell. 1975. Rational Economic Man: A Philosophical Critique of Neoclassical
Economics. Cambridge: Cambridge University Press.
Koopmans, T. C. 1947. ‘Measurement without Theory.’ Review of Economics and Statistics, 29:
161‒172.
Lawson, T. 1997. Economics and Reality. London: Routledge.
Leontief, W. 1971. ‘Theoretical Assumptions and Nonobserved Facts.’ American Economic Review 61: 1–7.
Leontief, W. 1984. Essays in Economics. New Brunswick, NJ: Transaction Books.


Lowe, A. 1965. On Economic Knowledge: Toward a Science of Political Economics. New York:
Harper & Row.
Lowe, A. 1976. The Path of Economic Growth. New York: Cambridge University Press.
Morgenstern, O. 1963. On the Accuracy of Economic Observations. Princeton: Princeton University
Press.
Nell, E. J. 1998. The General Theory of Transformational Growth. Cambridge: Cambridge
University Press.
Nell, E. J., and K. Errouaki. 2013. Rational Econometric Man: Transforming Structural
Econometrics. Cheltenham: Edward Elgar.
Solow, R. M. 2010. ‘Building a Science of Economics for the Real World.’ Testimony before the US House of Representatives Committee on Science and Technology (Subcommittee on Investigations and Oversight) [http://www2.econ.iastate.edu/classes/econ502/tesfatsion/Solow.StateOfMacro.CongressionalTestimony.July2010.pdf].
Sraffa, P. 1960. Production of Commodities by Means of Commodities. Cambridge: Cambridge University Press.