Journal of Interdisciplinary Economics, May 2020. DOI: 10.1177/0260107920913663

Title: Quantifying Value with Effective Complexity


Authors: Maxwell Murialdo¹*, Arturo Cifuentes²

Keywords: Theory of Value, Valuation, Economic Policy, Information Theory, Complexity

JEL Codes: B41, B59, C63, D63

Affiliations:
¹ Lawrence Livermore National Laboratory, Livermore, CA, USA
² Clapes-UC, Santiago, Chile

*Correspondence to: maxmurialdo@gmail.com

 

Abstract:

We present a new economic theory of value based on complexity theory. For simplicity

we call this theory ‘complexalism’ (a portmanteau of ‘complexity’ and ‘capitalism’).

Complexalism is a framework that establishes valuations by quantifying the present and

future complexities of objects and their surroundings. This framework reparameterizes

questions of economic value into more objectively addressable subcomponents. First, we

motivate the importance of developing alternative frameworks for value. Next, we

discuss a novel three-dimensional framework to analyze value and the use of effective

complexity (EC) as a proxy metric of economic value. Finally, we propose explicit

methods for quantifying complexity and simulating valuations. The resulting valuations

may serve to benchmark prices and can be used in evaluating the market rules of

engagement.

 

Introduction

Every coherent economic system must be underpinned by a theory of value. A

theory of value axiomatically impacts what is or isn’t considered valuable, helps to

facilitate the exchange of disparate goods, shapes the formulation of public policies, and

affects the income distribution under that system. Historical theories of value include the

Labour Theory of Value, the Marginal Theory of Value, the Sraffian Theory of Value,

etc. (Dobb, 1975; Taylor, 2001).

In recent decades economic inequality has reached extreme levels in the United

States and a number of other advanced economies (Dabla-Norris et al. 2015), prompting

prominent thinkers to question the efficacy of current market systems, underlying rules of

engagement, and conventional theories of value (Stiglitz, 2015; Deaton, 2019; Yunus,

2017). In particular, the presumed correspondence between compensation and value

creation (under capitalism) has been called into question, with the implication being that

prices and value regularly diverge from one another (Graeber & Cerutti, 2018).

Some implementations of capitalism more closely resemble a system wherein

money is ransomed rather than earned. Here the word ‘ransomed’ is not intended to

insinuate violence or unethical behavior. Rather, the word ‘ransomed’ is used to imply

that money is only meted out when one’s value can be reasonably withheld until an

exchange is completed. This property is key. Value that cannot be systematically

withheld has no protections and is unlikely to receive just recompense. (Note that an

exchange may occur at a single point in time or spread out over an extended timeframe.)

Accordingly, a number of constructs have been devised and deployed to protect

specific forms of value and render them ‘ransomable’. These protections are both de jure

 

and de facto, arising from laws, regulations, corporate policies, and societal norms. Some

such protections are easy to observe (e.g. copyrights, patents, and property rights).

However, at the scale necessitated by the vast, entangled economy, these protections and

their supporting structures fan out into a ubiquitous web of rules, both visible and

invisible. We call these and other such rules the ‘market rules of engagement’. The

importance of these exogenous rules in determining prices has been highlighted by

Douglass North and other economists (North, 1991, 1992).

Designing and enforcing protections that render value appropriately ‘ransomable’

is a nonobvious process. However, it is a critical process. Corporations and individuals

alike live and die by the specifics of these protections. In some cases, the value produced

is well protected. In other cases, presumably only a fraction of the value produced can be

ransomed, if any at all.

At present, the protections that arise from laws, regulations, corporate policies,

and societal norms are molded by democracy, public opinion, special interests, and other

such forces. All of these forces are subject to biases (both conscious and unconscious)

and their intuitions and desires can fail to reflect a deeper reality (even when operating

under the best of intentions). Often these biases are rooted in historical precedents, which

may no longer align with contemporary reality. Consequently, market prices and value

are expected to diverge from one another habitually. This divergence is

counterproductive to the greater economic goal of maximizing value (which is distinct

from maximizing monetary wealth). At issue is not simply the question of whether the

correlations between market prices and inherent values are strong or weak; the problem is

far deeper. The problem is that at present we lack a suitable quantitative means to even

 

assess whether these correlations are strong or weak. We lack a robust alternative metric

of value to benchmark against.

Utilizing Frameworks

One goal in the pursuit of quantifying value is to design a fairer economic

system. At first glance ‘fairness’ represents a large and nebulous concept that is often ill-

constrained and ill-posed. However, serious attacks have been mounted to open up fronts

and make rigorous and meaningful progress on the question of fairness, most notably the

work of John Rawls and his ‘Veil of Ignorance’ (Rawls, 1971). In his theory, Rawls asks

us to imagine going back to a time before we were born, before we had any inkling of our

own unique circumstances, talents, susceptibilities, or preferences. Only from this

vantage point, behind a veil of ignorance, Rawls argues, could we reasonably come to a

consensus about fair rules of engagement for the world at large. This framework allows

us to move towards more objectively decidable resolutions on the seemingly subjective

question of ‘fairness’.

In general, a goal of science is to reparameterize seemingly subjective questions

into more objectively addressable subcomponents by introducing new frameworks (e.g.

‘Veil of Ignorance’). An insightful framework can break down seemingly subjective

phenomena into smaller, more objectively decidable questions that nonetheless shed light

on the larger phenomenon when logically reassembled. Here we define an ‘objective

question’ as one whose answer is perspective (or mind) independent and a ‘subjective

question’ as one whose answer is perspective (or mind) dependent. These two extremes

form opposite ends of a spectrum, with likely no firm dividing line, only gradations

 

between the two. As such, we posit that the best way to measure the objectivity of a

specific question is to measure the strength of the consensus among individuals

competently providing answers. For these purposes, competence depends on the scope of

information available to the individual and his or her ability to comprehend and reason

from that information. (In some cases this competence may only exist in theory.) As an

example, a question regarding the outcome of a specific coin flip is highly objective

because once all of the information regarding the outcome has been conveyed to and

understood by the observers, there exists a strong consensus on the outcome (regardless

of prior uncertainty while the coin was in the air). Even if the relevant information is

never collected or disseminated, the question is still highly objective due to the

hypothetical possibility that the information could have been collected and a strong

consensus achieved.

In order to illustrate how science makes use of insightful frameworks to break

down seemingly subjective phenomena into more objectively decidable sub-questions,

we consider the following hypothetical scenario. Some ancient peoples might have

believed that the coolness of an object to the touch was a purely subjective sensation.

Today, however, we have rigorous means of objectively quantifying the temperature and

thermal conductivity of an object in order to better address this question. First, the

sensory experience of warmth or coolness had to be mapped to concepts of temperature

and thermal conductivity (defined within frameworks of thermodynamics and heat

transfer). Further quantifying these concepts required, for instance, mapping units of

temperature to the expansion of mercury or mapping units of time to the motion of a

pendulum. These physical observations are more easily measured and agreed upon with a

 

simple visual inspection. In this manner, scientific frameworks have rendered questions

regarding the coolness of an object to the touch more objective. Science produces

frameworks that can reparameterize large and slippery questions into more objectively

agreed upon fragments (e.g. thermometer readings from mercury levels).

Following suit, we propose a framework to render the seemingly subjective

concept of economic value more objective by analyzing value through the lens of

complexity theory. By definition, economics studies the distribution, allocation,

consumption, and production of wealth. However, wealth takes many disparate forms,

which are difficult to reconcile and sum without deferring to market prices.

Unfortunately, market prices cannot measure a good’s inherent value. Market prices can

only claim to correlate with inherent value in the context of a number of untested

assumptions. One such assumption is a belief in the validity and efficacy of the market

rules of engagement. As previously discussed, faulty rules of engagement will cause a

divergence between market prices and inherent value. Accordingly, conventional

economics tools cannot be used to robustly evaluate the market rules of engagement, the

rules’ abilities to accurately assess value, or even the overall growth of value in a society.

Any attempt to evaluate the market rules of engagement by tracking only market prices is

inherently circular. Conventional economics tools can only produce naïve assessments of

the growth of a society’s economic value. For example, conventional tools may be able to

measure an increase in the quantity of a certain class of goods, but could not accurately

assess the quality or other important properties of those goods (Benedikt & Oden, 2011).

 

Reparameterizing Value

On examination we divide value along three dimensions: Progress, Qualia, and

Truth. The dimension of value that we label as ‘Progress’ pertains to the advancement of

the world. Heuristically it encompasses development, innovation, and production in

technology, infrastructure, art, ideas, culture, etc. Progress measures how each decade has

improved upon the last. Later we will argue that the Progress dimension of value best

represents economic value on the whole and can be quantified with effective complexity.

Others have hinted that value and complexity are correlated (Daly, 1973; Romer, 1996),

and Benedikt & Oden (2011) explicitly argued for a valorization of complexity. In this

work we make a distinct argument for the intrinsic (as opposed to instrumental) value of

complexity and propose a framework by which complexity-based valuations can be

hypothetically computed.

The dimension of value that we label as ‘Qualia’ pertains to the feelings and

experiences of conscious beings. In the philosophy of mind, ‘Qualia’ denotes the

subjective experiences and sensations that are unique to conscious beings. Unlike non-

conscious computers, which only operate on data in a mechanistic sense, humans

experience color, taste, sound, pleasure, pain, etc. These experiences undoubtedly impact

the subjective well-being of conscious individuals and cannot be dismissed as factors in

determining desires, demands, and certain forms of value.

Finally, the dimension of value that we label as ‘Truth’ explores any inherent

value in gaining knowledge and understanding of reality, as it is. For example, it may be

argued that gaining knowledge of a new fundamental particle in physics or of an

 

occurrence in ancient history has value in and of itself (and is therefore worth seeking),

independent of any potential impact on Progress or Qualia.

As a general model, these three dimensions of value are taken to be orthogonal to

one another and can be thought of as a useful set of basis vectors that span a three-

dimensional value space. Any specific action or object may have positive or negative

value along one dimension, independent of its value along another dimension. For

example, a horrible truth may have value by virtue of its accuracy in describing the world

(Truth) but contribute to pain along the Qualia dimension. Alternatively, a pleasurable

activity (Qualia) might contribute to the deterioration of Progress, while having no

impact along the Truth dimension. In general, intentioned actions are undertaken with the

fundamental motive of effecting change in at least one of these three dimensions. While

higher-level motives may be formulated (e.g. love, friendship, loyalty, honor, etc.), at

base these higher-level motives can usually be deconstructed into component vectors that

lie along one or more of the three fundamental dimensions (Progress, Qualia, Truth).

A specific action or object (in a given context) can have its value-point plotted on

a three-dimensional graph as shown in Figure 1. Furthermore, a specific individual’s

current location within a three-dimensional value space may be determined from past

actions and objects (vectors) that affect that individual. An individual’s current distance

from the origin in one of the dimensions (e.g. Qualia) may affect that individual’s ability

to move along a different dimension of value (e.g. Progress). For instance, a person who

has experienced a great deal of pain and depression (Qualia) might find it difficult to

produce works of Progress. Conversely, a person who has experienced too much bliss

(Qualia) might also be dissuaded from working towards Progress. Consequently, moving

 

from one value-point to another may be thought of as analogous to moving a particle in a

non-conservative, three-dimensional force field. The difficulty of moving from one point

to another point in the three-dimensional space is path dependent. The ability to traverse

one of the dimensions can be impacted by the starting location in the other two

dimensions, even while all three dimensions are in fact orthogonal to one another.  

Figure 1. The value-point of a hypothetical action plotted in three-dimensional value

space.
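
To make the bookkeeping concrete, the following is a minimal sketch (in Python, with invented numbers) of how a value-point and the vector sum of the actions affecting an individual might be represented. The ValuePoint class, its addition rule, and the example figures are illustrative assumptions, not part of the theory's formal apparatus.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ValuePoint:
    """A value vector with one component per dimension (components may be negative)."""
    progress: float
    qualia: float
    truth: float

    def __add__(self, other: "ValuePoint") -> "ValuePoint":
        # Illustrative assumption: value-points combine by simple component-wise addition.
        return ValuePoint(self.progress + other.progress,
                          self.qualia + other.qualia,
                          self.truth + other.truth)

# A 'horrible truth': accurate (Truth +) but painful (Qualia −), no direct Progress.
horrible_truth = ValuePoint(progress=0.0, qualia=-2.0, truth=3.0)
# A pleasurable but destructive activity: Qualia + but Progress −.
indulgence = ValuePoint(progress=-1.0, qualia=2.0, truth=0.0)

# An individual's location in value space as the sum of the vectors affecting them.
print(horrible_truth + indulgence)   # ValuePoint(progress=-1.0, qualia=0.0, truth=3.0)
```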

Some philosophers, such as Epicurus, favored Qualia as preferential and

dismissed the other dimensions of value (Wilson, 1951). However, this perspective does

not reflect the ideals of contemporary capitalistic societies. At a policy level, capitalistic

societies tend to favor progress and accomplishments, lauding these endeavors as most

economically valuable and to some extent rewarding them accordingly. The most

profitable companies tend to peddle in Progress, whether through production, improved

distribution, efficiency, or innovation in physical goods, technology, infrastructure,

culture, relationships, information, ideas, etc. (Murphy et al. 2019). Fundamentally, all of

 

these activities are engaged in the process of complexifying the world (whether directly

or indirectly).

Moreover, when actions are pleasurable but pose a hindrance or detriment to

Progress (herein measured by complexity), they are disincentivized with laws,

regulations, taxes, or social stigma. Examples of such activities may include using hard

drugs, smoking cigarettes, activities that pose a fire risk, vandalism, pollution, dropping

out of high school, laziness, etc. Conversely, actions that may be considered difficult,

painful, or lacking in pleasure are generally commended if their results yield Progress

(complexity). Examples of such activities may include maintaining physical fitness,

painful medical procedures, putting a man on the moon, home maintenance, volunteer

work, dedication to a craft, acts of heroism, work ethic, etc. In other cases, activities have

no impact on Progress (complexity) and are therefore neutral in terms of economic value.

Intuitively, capitalism would rate a society where people lounge in pleasure but do very

little, as inferior to a society where people endure pain and hard work but achieve great

accomplishments. Moreover, while Truth may contain its own inherent value, capitalistic

societies tend to favor truths that pay out by facilitating Progress (e.g. scientific truths

that lead to new inventions). For these reasons, economic value is most suitably measured

along the dimension of Progress.

Progress (or economic value) cannot be easily measured with simple physical

properties like mass, volume, composition, or substance. Rather, economic value arises

out of embodied information, typically in the form of a hierarchical arrangement of an

object’s fundamental parts. For example, a coherent novel is more valuable than an

equally large stack of papers that have been printed on with the same amount of ink in a

 

random gibberish pattern. Food is more valuable before it has been digested (a process

that breaks down a great deal of its molecular structure). A structured computer is more

valuable than its unstructured raw materials. The same could be said of life. While the

specific atoms in a living body are continually flushed out and replaced (as individual

cells die), the embodied information (from DNA down to electrons) preserves a living

being’s individuality and potential value (both intrinsic and instrumental). In fact, almost

any valuable object could be rendered valueless by burning it to ash in a high-temperature

furnace, thereby erasing its embodied information. Even pure metals like platinum can

have their embodied information and value destroyed by nuclear processes that rearrange

their fundamental particles.

The second law of thermodynamics ensures that on a grand scale organized

information is continually randomized and degraded. However, on a local scale

something beautiful is afoot. Humans are born creative and with the potential to be

productive. Great civilizations arise from human minds and hands. Interpersonal

relationships and networks flourish. Technology progresses. Culture abounds. These

measures of progress (and others) make life worth living and are, as a rule of thumb,

‘valuable’. Moreover, they all share a common denominator; they all embody (or

engender) abundant complexity in a universe that tends towards disorder.

Whereas entropy measures ignorance, information is the opposite of ignorance. In

a naïve sense we could choose to measure embodied information either in units of

entropy (Shannon Information) or by using Algorithmic Information Content (AIC) (also

called Kolmogorov complexity) (Chaitin, 1990; Kolmogorov, 1963). However,

meaningful information (that which contributes to economic value) is better quantified by

 

operating on the AIC to tease out the contributions attributable to regularities. The

information measure known as ‘effective complexity’ (EC) does precisely this (as will be

discussed later in detail) and therefore provides a powerful proxy metric for measuring

economic value.

While most of the universe marches towards a high-entropy state, living beings

and their creations not only embody enormous complexity but also continually give rise

to even greater complexity. Effective complexity is a characteristic largely engendered by

life and the inventions of living beings (Bar-Yam, 2002; Capra, 2005) (which both

harness and fight the increase of entropy on a local scale). The creation and preservation

of net EC is what humans strive for every day as they work to keep their bodies healthy

and living, maintain their cars and houses, build and nurture relationships, produce

writing, art, ideas, and culture. The creation and preservation of EC is built into a

human’s raison d’être and can be interpreted as a measure of what humans have and have

created of value.

However, often humans treasure things for their simplicity, not their complexity.

They seek simplified tax codes, tools that are simple but effective, and highly purified

silicon for use in transistors. Measuring the currently embodied EC of an object does not

capture its full value. Instead, we must consider both an object’s intrinsic value and its

instrumental value. The EC currently embodied by an object functions as a proxy for its

intrinsic value. However, the object may also serve as a raw material, tool, or other factor

in the production of still greater complexity in the future. This future complexity is the

object’s instrumental value and must be incorporated into any calculation of the object’s

overall value. For example, in a hypothetical chain of events, highly purified silicon is

 

used to produce transistors, which are assembled into computers, which produce

abundant future complexity through vast calculations. While purified silicon may appear

to have little value (due to its minimal embodied EC), the sum of its intrinsic and

instrumental value is quite large. Often an object’s future EC contributions will dwarf its

presently embodied EC. When humans seek things that are simple, it is with the ultimate

purpose of using this simplicity to facilitate even greater complexity in the future.

As a further spot check on the correlation between economic value and effective

complexity, consider both the value and EC of an individual human. Human life and

well-being are prized by general consensus. Simultaneously, the human brain, with its

nearly 100 billion neurons (Allen & Barres, 2009), each making up to 10,000

connections, is highly complex (perhaps the most complex object in our known solar

system). Furthermore, thinking individuals (with well-functioning brains) engender

enormous future EC through their creativity, decisions, actions, and secondary effects.

Therefore, under complexalism, actions that ultimately promote the survival and well-

being of humans have large positive value, while those that ultimately destroy human life

have large negative value.

Measuring Complexity

Complexity theory is a rich field, and a number of diverse complexity measures

have been proposed (Lloyd, 2001). However, the measure known as effective complexity

best formalizes our intuitive notions of complexity in a general format (Gell-Mann &

Lloyd, 2010). Other measures of complexity (e.g. self-dissimilarity (Wolpert &

Macready, 2002)) can typically be formulated as special cases of effective complexity.

 

Effective complexity makes use of Algorithmic Information Content, which measures the

length of a highly compressed description of a bit string. Formally, the AIC of a bit string

is the length of the shortest computer program that can print the bit string and then halt

(on a universal computer). As an informal example, the following bit string,

‘1010101010101010101010101010101010101010’, could be compressed and described

as ‘ “10” repeated 20 times’. Therefore, the AIC of this bit string is given by the length of

its compressed description, not the length of the full bit string. Conversely, a bit string

that is truly random cannot be compressed, and its shortest description is at least as long

as the bit string itself. For this reason, AIC is sometimes called ‘algorithmic randomness’

as it assigns higher values to random bit strings than to bit strings containing patterns,

which allow for compression.
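
Although the true AIC of a string is uncomputable, an upper-bound proxy can be obtained with an off-the-shelf compressor, in the spirit of the compression-based estimates cited later (Cilibrasi & Vitanyi, 2005). The following minimal Python sketch is illustrative only; the choice of zlib and the string lengths are assumptions, but it shows a patterned string compressing far more than a random one.

```python
import random
import zlib

def aic_proxy(bits: str) -> int:
    """Rough upper bound on AIC: byte length of the zlib-compressed string
    (the choice of compressor is an illustrative assumption)."""
    return len(zlib.compress(bits.encode("ascii"), level=9))

regular = "10" * 5000                                            # '1010...10', 10,000 symbols
random.seed(0)
incidental = "".join(random.choice("01") for _ in range(10000))  # 10,000 random symbols

print(aic_proxy(regular))     # small: the repeating pattern admits a short description
print(aic_proxy(incidental))  # much larger: a random string resists compression
```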

Therefore, AIC is not a viable measure of what we intuitively consider embodied

information. The AIC of a random bit string is large, whereas its embodied information is

small. For example, in literature a novel composed by randomly selecting words from a

dictionary is considered to contain very little meaningful information, despite its large

AIC.

Effective complexity, on the other hand, only measures the AIC of the

regularities in a bit string. In doing so, effective complexity closely approximates our

intuitive notion of embodied information. Effective complexity separates a bit string into

a subset containing its regular features and a subset containing its incidental (or random)

features. It then measures the AIC of just the regularities.

Regular features are distinguished from incidental features by using a procedure

from the literature that embeds the bit string in a set of similar bit strings and assigns a

 

probability to each member of the set (Gell-Mann & Lloyd, 1996). Each member of the

set must share the regularities of the original bit string but otherwise can vary in its

details. For instance, the bit string ‘1010101010101010101010101010101010101010’

could be typed out in various fonts wherein the symbols ‘1’ and ‘0’ are perceived as the

regularities, and the stylistic differences between each font are perceived as the incidental

features.

An assembled set of similar entities, where each member has an assigned

probability, can be considered an ensemble (wherein the regularities are constraints to be

held constant). This ensemble mirrors ensembles used in statistical mechanics (e.g. the

microcanonical ensemble wherein the total energy imposes a constraint, or the canonical

ensemble wherein the temperature imposes a constraint). By analogy, the regular features

correspond to macrostate features while the incidental features correspond to microstate

features. The effective complexity is given by the AIC of the ensemble. EC is thus

distinct from AIC in that it conceptually separates the bit string into two subsets, one

containing the regularities and the other containing the incidental features. EC only

measures the information content of the regularities in the bit string.

One notable criticism leveled at effective complexity is that it is not uniquely

determined for a given bit string (in the absence of externally imposed constraints)

(McAllister, 2003). The effective complexity of a bit string depends on subjective

decisions (constraints) such as on which computing machine the AIC is calculated, the

choice of an encoding scheme, and how regularities are defined. The first two subjective

decisions (choice of a computing machine and an encoding scheme) can be largely

 

dismissed as long as protocols are established and followed which standardize both

decisions (thereby minimizing any resulting subjectivity or ambiguity).

The third necessary and subjective decision, defining regularities (also denoted

‘coarse-graining’), has been addressed by Céspedes and Fuentes, who argue that effective

complexity can be uniquely determined (or at least reasonably bounded) as long as the

‘cognitive and practical interests’ concerning the bit string are determined a priori

(Céspedes & Fuentes, 2020). Under these conditions effective complexity is compatible

with scientific practices and can be applied robustly as an information measure. As a

hypothetical example, Céspedes and Fuentes imagine a bit string that records the earth’s

atmospheric temperature over time. This bit string may be considered on the basis of a

period of one day or on the basis of a period of 21,000 years, yielding different patterns

over different timescales and hence different potential regularities. If, however, the

various researchers using this bit string agree to use the same basis (presumably based on

a shared research agenda), then they will arrive at the same (or at least a similar) effective

complexity result. If there is consensus on the ‘cognitive and practical interests’ among

the parties (i.e. agreement on the coarse-graining practice for determining regularities),

then effective complexity can be uniquely determined.
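
As a minimal illustration of how the agreed coarse-graining basis fixes the EC estimate, the sketch below block-averages a synthetic temperature record over an agreed period and uses compressed length as a stand-in for the AIC of the retained regularities. The synthetic data, the averaging rule, and the use of a compressor are all assumptions made for the example.

```python
import math
import zlib

def ec_proxy(values, period):
    """Coarse-grain a time series by block-averaging over `period` samples
    (the agreed 'basis'), then use compressed length as a stand-in for the
    AIC of the retained regularities. Both choices are illustrative assumptions."""
    blocks = [values[i:i + period] for i in range(0, len(values), period)]
    regular = [round(sum(b) / len(b), 1) for b in blocks]   # keep the coarse pattern, drop fine detail
    payload = ",".join(f"{x:.1f}" for x in regular).encode("ascii")
    return len(zlib.compress(payload, level=9))

# Synthetic hourly 'temperature' record: a daily cycle plus small fluctuations.
hours = range(24 * 365)
temps = [15 + 10 * math.sin(2 * math.pi * (h % 24) / 24) + 0.01 * ((h * 37) % 7) for h in hours]

print(ec_proxy(temps, period=1))    # fine basis: hour-to-hour detail inflates the estimate
print(ec_proxy(temps, period=24))   # agreed daily basis: only the coarse regularities remain
```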

Under complexalism, effective complexity is used as a proxy metric for the

economic value of goods. In order for EC to serve as the basis for a rigorous economic

theory of value, the EC of the goods being evaluated must be uniquely determined (given

a specific context). As discussed, uniquely determining the EC of an object (or bit string)

requires that the parties involved come to a consensus on how to identify regularities (i.e.

coarse-graining). While this process inherently involves some subjectivity, we contend

 

that it can be resolved in a more objective fashion than other nebulous approaches to

assessing value.

Complexalism reparameterizes large and slippery questions of value into sub-

questions about the coarse-graining procedure. Each of these sub-questions, on its own,

can more easily garner a consensus resolution. Once reassembled, the agreed upon

answers to the sub-questions enable a robust quantification of value. The purpose of this

reparameterization is analogous to the purpose of rendering an object’s coolness to the

touch more objective by slicing it into more objectively addressable subcomponents (e.g.

the expansion of mercury and the swinging of a pendulum).

Finding the best coarse-graining procedures for complexalism remains an open

question left for future work, but we list a few conceivable approaches. In one approach

the regularities in a physical object might be constrained by a ‘minimum intentional

feature size’. For instance, the minimum intentional feature size could be defined as the

smallest unit of detail (voxel) that the creator would reproducibly incorporate into his or

her work under standard operating procedures. A 3D-printed part might have a minimum

intentional feature size of 1 mm. If asked to produce a dozen identical prints, each print

would be reproduced with an all-around precision of 1 mm. Smaller features would be

irreproducible, unintentional, and considered noise. Likewise, two pristine cars of the

same make, model, year, color, and accessories that have just come off of the same

assembly line are identical for all practical valuation purposes. Complexalism should not

concern itself with the precise arrangement of atoms and dislocations within the metal

itself (so long as they don’t affect other important materials properties), lest these atomic-

scale regularities drown out the regularities of interest. More generally, a diverse set of

 

‘minimum intentional feature parameters’ may be selected. For a 3D-printed part, these

may include the selected set of colors (quantified by wavelength), compositions,

mechanical properties, etc. For a highly designed object like a car, each aspect is

designed to within certain intentional specifications (e.g. manufacturing tolerances).

These specifications set error bars on the parameter precision necessary for each feature

to maintain its purpose. Accordingly, these specifications could be used to guide the

coarse-graining procedures.
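
The ‘minimum intentional feature size’ rule above can be sketched as a simple quantization step: coordinates measured from a part are snapped to the agreed 1 mm grid, so sub-millimetre noise is discarded as incidental before any complexity estimate is made. The point clouds, helper names, and the use of compression as an AIC stand-in are all hypothetical.

```python
import random
import zlib

FEATURE_SIZE_MM = 1.0   # assumed minimum intentional feature size

def quantize(points, voxel=FEATURE_SIZE_MM):
    """Snap each (x, y, z) coordinate in mm to the voxel grid; features smaller
    than the grid are treated as incidental noise and disappear."""
    snapped = {tuple(round(c / voxel) * voxel for c in p) for p in points}
    return sorted(snapped)

def description_length(points):
    """Compressed length of the coarse-grained geometry, as a rough AIC proxy."""
    text = ";".join(f"{x:.1f},{y:.1f},{z:.1f}" for x, y, z in points)
    return len(zlib.compress(text.encode("ascii"), level=9))

# Two hypothetical scans of the same 3D-printed part, differing only by sub-mm noise.
random.seed(1)
design = [(float(x), float(y), 0.0) for x in range(20) for y in range(10)]
scan_a = [(x + random.uniform(-0.2, 0.2), y + random.uniform(-0.2, 0.2), z) for x, y, z in design]
scan_b = [(x + random.uniform(-0.2, 0.2), y + random.uniform(-0.2, 0.2), z) for x, y, z in design]

# After coarse-graining at the intentional feature size the two scans coincide,
# so they receive the same description (and hence the same EC-based valuation).
print(quantize(scan_a) == quantize(scan_b))        # True
print(description_length(quantize(scan_a)))
```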

Alternatively, ‘minimum consequential feature parameters’ could be determined.

These feature parameters favor consequence over intent. For example, an object might be

overdesigned (in intent) at a feature scale that cannot be practically distinguished from a

less designed object. One could argue that if both objects are indistinguishable, they

should have the same valuation from a consequence perspective. The minimum

consequential feature parameter could accordingly be constrained by physical limitations

on a human’s ability to distinguish between colors, hardness, mechanical properties, etc.

Furthermore, if the causal relationship interrelating features at different hierarchical

levels is known (e.g. how microscale features impact bulk mechanical properties), a more

complete picture of the consequences resulting from each scale could be assessed and

accounted for. Deciding on a coarse-graining procedure might additionally be aided by

machine learning techniques or generative adversarial networks. As Gell-Mann and

Lloyd have pointed out, the coarse-graining procedure does not necessarily need to be

decided by humans or even living beings (Gell-Mann & Lloyd, 2010).

We contrast complexalism with the current state of affairs in assigning valuations.

At present, market prices are the default proxy for economic value. This approach is

 

riddled with fundamental weaknesses. Market prices are dictated (in part) by the

equilibrium between supply and demand. However, demand is readily manipulated and

often too nebulous to evaluate directly and rigorously (Becher & Feldman, 2016). While

demand operates on a market scale, it derives from individual desires (whether simple or

multifaceted). There is little reason to believe that following the desires of individuals

(even averaged en masse) is an efficient route for pursuing or assessing Truth or

Progress. For this reason, questions in philosophy, science, mathematics, medicine,

engineering, etc. cannot be viably resolved by a popular vote. Instead these questions

must be resolved with expertise, specialized training, and rigorous studies. While demand

may indirectly drive Progress (or at least motivate it), the connection between the two is

tenuous. Generally, policy interventions are additionally necessary (e.g. government

funding for research in the basic sciences).

If anything, desires may more aptly reflect the Qualia dimension of value. Qualia

are, after all, subjective internal experiences most readily accessed by the individual

experiencing them. However, here too, manipulations arise. In his theory of Mimetic

Desire, René Girard argues that almost all desires arise from the copying of others

(Livingston, 1992). While inner desires may appear highly personal and seem to elucidate

something deeper about one’s true self, they are often, in fact, foreign entities implanted

through advertising, media, and other sources.

Furthermore, market prices depend on market rules of engagement, which, as

discussed, cannot be robustly evaluated using conventional economics tools. In many

cases, market prices may result from a combination of demand and poorly designed rules

of engagement. For these reasons, market prices are unlikely to accurately reflect

 

Progress or economic value. Simultaneously, communism (as an opposing economic

system) is rife with fundamental weaknesses, not least of which is its dependence on the

highly subjective discernments and opinions of bureaucrats in planning the economy.

Complexalism, on the other hand, provides an alternative and potentially more

objectively decidable metric of value that circumvents many of the shortcomings inherent

to ‘demand’.

To be clear, complexalism does not take power out of the hands of the people.

Rather, it is meant to empower people in their value assessments, by providing a more

potent and palpable framework for valuations. This is accomplished by first

differentiating between three independent dimensions of value: Progress, Qualia, and

Truth. Next a systematic approach for quantifying Progress using effective complexity is

explored (as detailed below). By breaking down large and nebulous questions of value

into more addressable sub-questions, complexalism allows individuals to develop

consensuses at a more fundamental level, far removed from the biases and susceptibilities

inherent to desires.

Simulating Valuations

The process of quantitatively assessing value under complexalism is most easily

explained by starting with three idealized assumptions: 1) the universe is deterministic, 2)

we have access to unlimited computing power, and 3) we have access to all initial state

information. These assumptions can later be relaxed through approximations. Instead of

following Revenue and Cost (e.g. to decide levels of production), complexalism tracks

 

the tension between two new terms, ‘Total Complexity’ (TC) and ‘Complexity

Opportunity Cost’ (COC).

TC is a measure of the EC both embodied within (intrinsic) and engendered by

(instrumental) an object of interest. Here we use the term ‘object’ in a broad sense,

encompassing physical items, ideas, services, etc. First the object of interest (e.g. my

yellow bicycle) and its surroundings are encoded as information in a simulation. The EC

presently embodied in both the simulated object and its simulated surroundings is

quantified and this number is stored. Next, the simulation is run forward in time, with the

embodied EC (of both the object and surroundings) computed and stored at each

timestep. For example, my yellow bicycle is utilized to perform daily tasks, each of

which increases the net complexity of the surroundings. Any secondary effects are also

simulated, quantified, and captured in an expanding chain of causal events, each

contributing to the embodied EC of the simulated surroundings for that timestep. Overall,

the EC of the entire simulation is integrated with respect to time (stepwise, assuming

infinitesimal timesteps) from the present ($t_0$) up to a specified point in the future ($t_f$). This

integrated result is stored as ‘result A’.

Now the whole process is repeated in simulation B, with a single difference; in

simulation B, my yellow bicycle is absent (while the surroundings begin in the same

initial state). The simulation will obviously run a different course in the absence of my

yellow bicycle. Once again, the EC is integrated with respect to time up to the same

future point and stored as ‘result B’. The difference between ‘result A’ and ‘result B’ is

the difference in complexity between two worlds, one with the yellow bicycle and one

without it (Equation 1).

 

$$TC = \int_{t_0}^{t_f} C_{os}\,dt \;-\; \int_{t_0}^{t_f} C_{s}\,dt \qquad (1)$$

The variable $C_{os}$ represents the instantaneous EC of the object of interest and its

surroundings, while $C_s$ represents the instantaneous EC of just the surroundings. TC

measures the impact (i.e. complexalism benefit), of my yellow bicycle. However, impact

alone does not dictate value. Value is contextual. In complexalism contextuality is built

in by virtue of the simulations themselves being contextual. These simulations

incorporate not only the object of interest, but also its potentially affected surroundings.

This context can incorporate time, place, cultural norms, etc.
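
Under the idealized assumptions above, the TC of Equation 1 reduces to integrating the gap between two simulated EC trajectories. The sketch below assumes those trajectories have already been produced by some agreed simulator and EC estimator; the numbers are invented placeholders, not outputs of an actual simulation.

```python
from typing import Sequence

def total_complexity(ec_with_object: Sequence[float],
                     ec_without_object: Sequence[float],
                     dt: float) -> float:
    """Discrete form of Equation 1: TC = ∫ C_os dt − ∫ C_s dt, approximated
    with a simple Riemann sum over equal timesteps of length dt."""
    assert len(ec_with_object) == len(ec_without_object)
    return sum(a - b for a, b in zip(ec_with_object, ec_without_object)) * dt

# Hypothetical EC trajectories from simulation A (bicycle present) and B (bicycle absent).
c_os = [100.0, 102.0, 105.0, 109.0]
c_s = [100.0, 101.0, 102.5, 104.0]
print(total_complexity(c_os, c_s, dt=1.0))   # 8.5: the bicycle's TC over this window
```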

Consider a simple hypothetical example of a farmer, Jane, with four identical

buckets of water and four potential uses for water: hydrate herself, hydrate her animals,

wash her body, and water her garden. When deciding how to use the first bucket of water,

each activity has a distinct level of priority, as measured by the associated TC. Hydrating

herself in order to remain alive is a high priority as it has high TC. The staggering

interconnectivity of the human brain along with its creative potential is an enormous

source of complexity (both intrinsic and instrumental). Therefore, fostering human life,

well-being, and thinking are highly valuable activities under complexalism. Watering the

garden has much lower priority as it has lower TC. However, each potential use case has

diminishing marginal TC as more water is allotted to that potential use case. As the

farmer drinks more water, she becomes less thirsty, and the TC added with each

additional sip decreases. Under complexalism we assume that the farmer will act

reasonably and spread the available water over the potential use cases so as to maximize

its impact (TC).
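
The allocation logic just described can be sketched as a small greedy procedure: each bucket goes to whichever use currently offers the largest remaining marginal TC. The marginal-TC figures below are invented purely for illustration.

```python
# Hypothetical diminishing marginal TC (arbitrary complexity units) for each
# successive bucket allotted to a use; the figures are illustrative only.
MARGINAL_TC = {
    "hydrate Jane":    [100.0, 20.0, 5.0, 1.0],
    "hydrate animals": [60.0, 15.0, 4.0, 1.0],
    "wash body":       [30.0, 10.0, 3.0, 1.0],
    "water garden":    [25.0, 8.0, 2.0, 1.0],
}

def allocate(buckets: int):
    """Greedily assign each bucket to the use with the highest remaining marginal TC."""
    used = {use: 0 for use in MARGINAL_TC}
    plan, total = [], 0.0
    for _ in range(buckets):
        best = max(used, key=lambda u: MARGINAL_TC[u][used[u]])
        total += MARGINAL_TC[best][used[best]]
        plan.append(best)
        used[best] += 1
    return plan, total

plan, total = allocate(4)
print(plan)    # ['hydrate Jane', 'hydrate animals', 'wash body', 'water garden']
print(total)   # 215.0 = 100 + 60 + 30 + 25
```

With these particular figures the greedy allocation reproduces the one-bucket-per-use split described in the farmer example.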

 

In some cases, equivalent goods are not equally accessible. For instance, some

water may be readily available, while other water must first be pumped from a distant

well and carried back. These different sources of water entail different Complexity

Opportunity Costs (COC). COC is the TC that could have otherwise been added to the

system, had time and resources not been expended elsewhere (e.g. in acquiring the

replacement goods). For example, the water that is readily available has nearly zero COC,

whereas the water that must first be pumped from a distant well and carried back expends

worker energy and time, which could have otherwise been spent producing additional TC

in other tasks. In general, the marginal COC of obtaining an additional unit of a good

increases with consumption (at least at high levels of consumption). This is because the

most easily available goods are consumed first, and additional goods must be scrounged

for in increasingly taxing ways. Note that the COC can entail the raw materials,

manufacturing, labour, transportation, etc. associated with obtaining an equivalent

replacement unit (not necessarily the efforts that went into the original unit in question).

Whereas the marginal TC generally decreases with increased consumption, the

marginal COC generally increases with increased consumption. When the marginal TC

and marginal COC are calculated as functions of consumption (using simulations) and

plotted on the same graph (Figure 2), they intersect at a break-even point where marginal

TC equals marginal COC. This break-even point sets a boundary on the simulation that

limits further scrounging/production for additional units of that good. Obtaining

additional units (beyond the break-even point) would actually decrease overall

complexity and would constitute an unreasonable behavior.

 

Figure 2. Hypothetical marginal TC and marginal COC curves intersecting at a break-

even point.
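
The break-even logic in Figure 2 can be sketched numerically: with an invented diminishing marginal-TC curve and an invented increasing marginal-COC curve, the break-even quantity is simply the last unit whose marginal TC still exceeds its marginal COC.

```python
def marginal_tc(q: int) -> float:
    """Illustrative diminishing marginal TC for the q-th unit of a good."""
    return 100.0 / q

def marginal_coc(q: int) -> float:
    """Illustrative increasing marginal COC: later units must be scrounged
    for in increasingly taxing ways."""
    return 2.0 * q

def break_even(max_units: int = 1000) -> int:
    """Largest quantity for which obtaining one more unit still adds net complexity."""
    q = 0
    while q < max_units and marginal_tc(q + 1) > marginal_coc(q + 1):
        q += 1
    return q

q_star = break_even()
print(q_star)                                       # 7 units under these toy curves
print(marginal_tc(q_star), marginal_coc(q_star))    # marginal TC ≈ marginal COC near the crossing
```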

Establishing a meritocratic correspondence between one’s compensation (or

price) and one’s positive impact is a philosophical question with at least three distinct

possible approaches. Should compensation be proportional to A) the benefit the

individual produces, B) the hypothetical loss that would be incurred by the absence of the

individual, or C) the cost to replace the individual?

We contend that option B best represents a meritocracy. In order to highlight the

differences between option A and option B, we return to the simple example of the

farmer with four buckets of water (instead of individuals). Let us posit that in this

circumstance the farmer uses the first bucket of water (bucket #1) to hydrate herself,

bucket #2 to hydrate her animals, bucket #3 to wash her body, and bucket #4 to water her

garden, in order to maximize TC. Under option A, bucket #1 has tremendous TC or value

(by virtue of keeping the farmer alive), while bucket #4 has minimal TC or value (by

 

virtue of keeping her garden alive). However, in a practical sense (due to the

interchangeability of the identical buckets of water), if bucket #1 went missing (e.g.

stolen), the farmer would not die of thirst. Rather she would promote buckets 2-4 to serve

use cases 1-3 (in order of priority). Therefore, the value lost is the TC associated with

keeping the garden alive. This more pragmatic assessment of value is reflected in option

B.

Option C closely mirrors current market principles and shares in its weaknesses.

For example, the cost to replace a worker is typically assessed via market forces (supply

and demand establishing an equilibrium price). However, a market-derived price can only

be interrogated after the rules of engagement have been set for the market. The rules of

engagement (e.g. regulation on monopolies, monopsonies, unions, minimum wage,

contract law, etc.) impact the relative negotiating power of every party involved and

therefore impact the resulting price. For these reasons, option C is not suitable for meta-

level evaluations of the rules of engagement (a goal of complexalism). Alternatively, if

the ‘cost’ of option C is interpreted to mean the Complexity Opportunity Cost (COC),

then in general option C reduces to option B due to the equivalence between marginal

TC and marginal COC at the break-even point.

In complexalism simulations (for determining TC), when one unit of a good is

removed, other units of that same good, previously allocated to lower priority uses, are

shifted to fulfill the highest priority uses first. The overall effect is that removing a unit

from any use case always results in a loss of TC equivalent to the marginal TC of that

good at the break-even point. Therefore, under complexalism a good or activity’s value

 

(along with its price or level of compensation) is proportional to the marginal TC of that

object at the break-even point.
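
The option B valuation can be sketched by reusing the invented marginal-TC schedule from the bucket example earlier: the value of one interchangeable bucket is the TC lost when a single unit goes missing and the remaining units are re-allocated to the highest-priority uses, which equals the marginal TC of the last, lowest-priority use. All names and figures are illustrative only.

```python
MARGINAL_TC = {                     # same illustrative schedule as in the earlier sketch
    "hydrate Jane":    [100.0, 20.0, 5.0, 1.0],
    "hydrate animals": [60.0, 15.0, 4.0, 1.0],
    "wash body":       [30.0, 10.0, 3.0, 1.0],
    "water garden":    [25.0, 8.0, 2.0, 1.0],
}

def max_tc(buckets: int) -> float:
    """Greedy re-allocation: each available bucket serves the highest remaining marginal TC."""
    used = {use: 0 for use in MARGINAL_TC}
    total = 0.0
    for _ in range(buckets):
        best = max(used, key=lambda u: MARGINAL_TC[u][used[u]])
        total += MARGINAL_TC[best][used[best]]
        used[best] += 1
    return total

# Option B valuation of one bucket: the TC lost when a single (interchangeable) unit
# goes missing and the rest are promoted to the highest-priority uses.
value_of_one_bucket = max_tc(4) - max_tc(3)
print(value_of_one_bucket)   # 25.0: the marginal TC of the lowest-priority use (the garden)
```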

This described procedure gives complexalism-derived valuations that can in

theory be applied to anything or any profession. It is important to note that under

complexalism the calculated values of specific objects may or may not resemble current

market prices. In some cases the contrast between the two will be stark. These differences

should be taken as features, not bugs. The utility of any new framework derives from its

areas of disagreement with prior frameworks. Furthermore, complexalism is an economic

theory of value, not a moral theory of value. Other moral restrictions (e.g. deontological)

should be considered in addition to the valuations prescribed by complexalism.  

At present, the available computational resources and datasets are not large

enough to implement complexalism simulations on a broad scale. Nonetheless,

complexalism can be wielded as a potent theoretical framework. As an example by

analogy, Rawls’s ‘Veil of Ignorance’ framework cannot be physically put into practice but

has nonetheless served as a conceptual foundation and guiding light in important policy

decisions (Kukathas & Pettit, 1990). In a similar fashion, complexalism might help to

systematically contextualize economic policy decision-making.

Furthermore, a number of simplifying assumptions might be employed to render

complexalism more computationally tractable. First, in order to eliminate excess

unknowns about the future, complexalism could use recent historical data to make

retrodictions about the recent past (as opposed to predictions about the future). These

retrodictions (made at set time intervals) could establish valuations for specific goods and

occupations to compare against market-derived prices. These comparisons would

 

benchmark where prices and value have diverged and by how much. For instance, a

representative basket of goods and occupations could be simulated in order to establish

valuation anchor points for use in a course-correcting feedback loop (when steering the

economy). Furthermore, mean-field approximations might be used to provide generalized

(though context relevant) surroundings in the simulations, as opposed to highly

specialized surroundings with unique minutiae. Although Chaitin proved that Algorithmic

Information Content is uncomputable (Chaitin, 1990), a number of approaches

to bound and/or estimate AIC have been proposed (Cilibrasi & Vitanyi, 2005; Delahaye

& Zenil, 2012; Bloem et al. 2014; Soler-Toscano et al. 2014; Soler-Toscano & Zenil,

2017). Gell-Mann and Lloyd simply propose setting a finite maximal computation time in

order to produce a decidable AIC proxy when computing EC (Gell-Mann & Lloyd,

2010). Historically, a number of frameworks and algorithms have been proposed before

the computational power to widely implement them existed (e.g. Newton’s method,

machine learning (Nilsson, 1965)). Likewise, complexalism may benefit from recent and

future advances in exascale computing and quantum computing (Arute et al. 2019).

Conclusions

Complexalism provides an independent and potentially more objective metric for

assessing the value impact of objects and occupations when evaluating policy decisions.

Computed value-anchoring points may serve as targets to aim for when steering the

economy (by tuning its underlying rules of engagement). Monetary wealth or economic

growth measured by market prices cannot be used for these meta-level evaluations. Value

and market prices often diverge precisely because of poorly designed rules of

 

engagement. In the current absence of a more objective measure of value, the

mechanisms for determining prices and wages cannot be cross-examined. This sets up a

blind feedback loop that is more likely to conjure a simulacrum than reflect reality.

Complexalism offers an alternative and potentially more objective benchmark of value by

which to judge the efficacy of rules and their outcomes.

 

References:

Allen, N. J., & Barres, B. A. (2009). Glia—more than just brain glue. Nature, 457(7230),

675-677.

Arute, F., Arya, K., Babbush, R., Bacon, D., Bardin, J. C., Barends, R., et al. (2019).

Quantum supremacy using a programmable superconducting processor. Nature,

574(7779), 505-510.

Bar-Yam, Y. (2002). Complexity rising: From human beings to human civilization, a

complexity profile. Encyclopedia of Life Support Systems. Oxford: EOLSS Publishers.

Becher, S. I., & Feldman, Y. (2016). Manipulating, fast and slow: the law of non-verbal

market manipulations. Cardozo L. Rev., 38, 459-507.

Benedikt, M., & Oden, M. (2011). Better is better than more: complexity, economic

progress, and qualitative growth. The University Of Texas at Austin, Center for

Sustainable Development Working Paper Series–2011 (1), 1-77.

Bloem, P., Mota, F., de Rooij, S., Antunes, L., & Adriaans, P. (2014). A safe

approximation for Kolmogorov complexity. In International Conference on Algorithmic

Learning Theory (pp. 336-350). Cham: Springer.

Capra, F. (2005). Complexity and life. Theory, Culture & Society, 22(5), 33-44.

 
  31  

Céspedes, E., & Fuentes, M. (2020). Effective complexity: in which sense is it

informative? Journal for General Philosophy of Science, 1-16.

Chaitin, G. J. (1990). Information, randomness & incompleteness: papers on algorithmic

information theory (Vol. 8). Singapore: World Scientific.

Cilibrasi, R., & Vitányi, P. M. (2005). Clustering by compression. IEEE Transactions on

Information Theory, 51(4), 1523-1545.

Dabla-Norris, M. E., Kochhar, M. K., Suphaphiphat, M. N., Ricka, M. F., & Tsounta, E.

(2015). Causes and consequences of income inequality: a global perspective.

International Monetary Fund.

Daly, H. E. (1973). Toward a steady-state economy (Vol. 2). San Francisco: WH

Freeman.

Deaton, A. (2019). What’s wrong with contemporary capitalism? Project Syndicate.

Retrieved from https://www.project-syndicate.org/

Delahaye, J. P., & Zenil, H. (2012). Numerical evaluation of algorithmic complexity for

short strings: A glance into the innermost structure of randomness. Applied Mathematics

and Computation, 219(1), 63-77.

 

Dobb, M. (1975). Theories of value and distribution since Adam Smith. Cambridge:

Cambridge University Press.

Gell-Mann, M., & Lloyd, S. (1996). Information measures, effective complexity, and

total information. Complexity, 2(1), 44-52.

Gell-Mann, M., & Lloyd, S. (2010). Effective complexity. In Murray Gell-Mann:

Selected Papers (pp. 391-402).

Graeber, D., & Cerutti, A. (2018). Bullshit jobs. New York: Simon & Schuster.

Kolmogorov, A. N. (1963). On tables of random numbers. Sankhyā: The Indian Journal

of Statistics, Series A, 369-376.

Kukathas, C., & Pettit, P. (1990). Rawls: a theory of justice and its critics. Stanford:

Stanford University Press.

Livingston, P. N. (1992). Models of desire: René Girard and the psychology of mimesis.

Baltimore: Johns Hopkins University Press.

Lloyd, S. (2001). Measures of complexity: a nonexhaustive list. IEEE Control Systems

Magazine, 21(4), 7-8.

 

McAllister, J. W. (2003). Effective complexity as a measure of information content.

Philosophy of Science, 70(2), 302-307.

Murphy, A., Ponciano, J., Hansen, S. & Touryalai, H. (2019). Global 2000: the world’s

largest public companies. Forbes. Retrieved from www.forbes.com/global2000

Nilsson, N. J. (1965). Learning machines. New York: Mcgraw-Hill.

North, D. C. (1991). Institutions. Journal of Economic Perspectives, 5(1), 97-112.

North, D. C. (1992). Transaction costs, institutions, and economic performance (pp. 13-

15). San Francisco, CA: ICS Press.

Rawls, J. (1971). A theory of justice. Cambridge, Massachusetts: Belknap Press.

Romer, P. M. (1996). Why, indeed, in America? Theory, history, and the origins of

modern economic growth. The American Economic Review, 86(2), 202-206.

Soler-Toscano, F., Zenil, H., Delahaye, J. P., & Gauvrit, N. (2014). Calculating

Kolmogorov complexity from the output frequency distributions of small Turing

machines. PloS one, 9(5).

 

Soler-Toscano, F., & Zenil, H. (2017). A computable measure of algorithmic probability

by finite approximations with an application to integer sequences. Complexity, 2017, 1-

10.

Stiglitz, J. E. (2015). Rewriting the rules of the American economy: an agenda for growth

and shared prosperity. New York: WW Norton & Company.

Taylor, K. S. (2001). Human society and the global economy. SUNY-Oswego Department

of Economics. Retrieved from https://EconPapers.repec.org/RePEc:oet:tbooks:prin7

Wilson, C. (1951). Epicureanism: a very short introduction. Oxford: Oxford University

Press.

Wolpert, D. H., & Macready, W. G. (2002). Self-dissimilarity: an empirically

observable measure of complexity. In Unifying Themes in Complex Systems: Proceedings

of the First NECSI International Conference (pp. 626-643). Cambridge: Perseus.

Yunus, M. (2017). A world of three zeros: the new economics of zero poverty, zero

unemployment, and zero net carbon emissions. London: Hachette UK.

 

Acknowledgments: The authors declare no competing interests. The views and opinions

expressed in this paper are those of the authors and not necessarily the views and

opinions of the U.S. Department of Energy. The authors disclosed receipt of the

following financial support for the research, authorship, and/or publication of this article:

this work was performed under the auspices of the U.S. Department of Energy by

Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
