Complexalism: Quantifying Value With Effective Complexity
Maxwell Murialdo
Affiliations:
1. Lawrence Livermore National Laboratory, Livermore, CA, USA
2. Clapes-UC, Santiago, Chile
Abstract:
We present a new economic theory of value based on complexity theory. For simplicity, we
discuss a novel three-dimensional framework to analyze value and the use of effective
methods for quantifying complexity and simulating valuations. The resulting valuations
may serve to benchmark prices and can be used in evaluating the market rules of
engagement.
Introduction
A society's theory of value helps facilitate the exchange of disparate goods, shapes the formulation of public policies, and
affects the income distribution under that system. Historical theories of value include the
Labour Theory of Value, the Marginal Theory of Value, and the Sraffian Theory of Value, among others.
In recent decades economic inequality has reached extreme levels in the United
States and a number of other advanced economies (Dabla-Norris et al. 2015), prompting
prominent thinkers to question the efficacy of current market systems, underlying rules of
engagement, and conventional theories of value (Stiglitz, 2015; Deaton, 2019; Yunus,
2017). The link between prices and value creation (under capitalism) has been called into question, with the implication being that
prices and value regularly diverge from one another (Graeber & Cerutti, 2018).
Under the current market system, money is ransomed rather than earned. Here the word 'ransomed' is not intended to
insinuate violence or unethical behavior. Rather, the word ‘ransomed’ is used to imply
that money is only meted out when one's value can be reasonably withheld until an
exchange is made. Value that cannot be withheld has no protections and is unlikely to receive just recompense. (Note that an
exchange may occur at a single point in time or spread out over an extended timeframe.)
A variety of protections apply to specific forms of value and render them 'ransomable'. These protections are both de jure
and de facto, arising from laws, regulations, corporate policies, and societal norms. Some
such protections are easy to observe (e.g. copyrights, patents, and property rights).
However, at the scale necessitated by the vast, entangled economy, these protections and
their supporting structures fan out into a ubiquitous web of rules, both visible and
invisible. We call these and other such rules the 'market rules of engagement'. Workers and businesses
alike live and die by the specifics of these protections. In some cases, the value produced
is well protected. In other cases, presumably only a fraction of the value produced can be protected and ransomed.
At present, the protections that arise from laws, regulations, corporate policies,
and societal norms are molded by democracy, public opinion, special interests, and other
such forces. All of these forces are subject to biases (both conscious and unconscious)
and their intuitions and desires can fail to reflect a deeper reality (even when operating
under the best of intentions). Often these biases are rooted in historical precedents, which
may no longer align with contemporary reality. Consequently, market prices and value
may regularly diverge (and maximizing inherent value may diverge from maximizing monetary wealth). At issue is not simply the question of whether the
correlations between market prices and inherent values are strong or weak; the problem is
far deeper. The problem is that at present we lack a suitable quantitative means to even
assess whether these correlations are strong or weak. We lack a robust alternative metric of value against which market prices can be compared.
Utilizing Frameworks
One goal in the pursuit of quantifying value is to design a more fair economic
system. At first glance ‘fairness’ represents a large and nebulous concept that is often ill-
constrained and ill-posed. However, serious attacks have been mounted to open up fronts
and make rigorous and meaningful progress on the question of fairness, most notably the
work of John Rawls and his ‘Veil of Ignorance’ (Rawls, 1971). In his theory, Rawls asks
us to imagine going back to a time before we were born, before we had any inkling of our future stations in life. Only from this
vantage point, behind a veil of ignorance, Rawls argues, could we reasonably come to a
consensus about fair rules of engagement for the world at large. This framework allows for more rigorous and objective reasoning about the otherwise nebulous
question of 'fairness'.
Such frameworks reparameterize large and nebulous phenomena into smaller, more objectively decidable questions that nonetheless shed light
on the original issue. Here we define an 'objective question' as one whose answer is perspective (or mind) independent and a 'subjective
question’ as one whose answer is perspective (or mind) dependent. These two extremes
form opposite ends of a spectrum, with likely no firm dividing line, only gradations
between the two. As such, we posit that the best way to measure the objectivity of a
question is to gauge the degree of consensus attainable among individuals competently providing answers. For these purposes, competence depends on the scope of
information available to the individual and his or her ability to comprehend and reason
from that information. (In some cases this competence may only exist in theory.) As an
example, a question regarding the outcome of a specific coin flip is highly objective
because once all of the information regarding the outcome has been conveyed to and
understood by the observers, there exists a strong consensus on the outcome (regardless
of prior uncertainty while the coin was in the air). Even if the relevant information is
never collected or disseminated, the question is still highly objective due to the
hypothetical possibility that the information could have been collected and a strong
consensus achieved.
To illustrate how frameworks increase objectivity, we consider the following hypothetical scenario. Some ancient peoples might have
believed that the coolness of an object to the touch was a purely subjective sensation.
Today, however, we have rigorous means of objectively quantifying the temperature and
thermal conductivity of an object in order to better address this question. First, the
relevant physical concepts had to be defined (e.g. temperature and heat transfer). Further quantifying these concepts required, for instance, mapping units of
measurement onto readily observable physical phenomena, such as the height of a column of liquid or the swing of a pendulum. These physical observations are more easily measured and agreed upon with a
simple visual inspection. In this manner, scientific frameworks have rendered questions
regarding the coolness of an object to the touch more objective. Science produces
frameworks that can reparameterize large and slippery questions into more objectively
decidable subcomponents. In this work, we seek to render the concept of economic value more objective by analyzing value through the lens of
complexity. Conventional economics concerns itself with the exchange, consumption, and production of wealth. However, wealth takes many disparate forms,
which are difficult to reconcile and sum without deferring to market prices.
Unfortunately, market prices cannot measure a good’s inherent value. Market prices can
only claim to correlate with inherent value in the context of a number of untested
assumptions. One such assumption is a belief in the validity and efficacy of the market
rules of engagement themselves. As a result, conventional economics tools cannot be used to robustly evaluate the market rules of engagement, the
rules’ abilities to accurately assess value, or even the overall growth of value in a society.
Any attempt to evaluate the market rules of engagement by tracking only market prices is
inherently circular. Conventional economics tools can only produce naïve assessments of
the growth of a society’s economic value. For example, conventional tools may be able to
measure an increase in the quantity of a certain class of goods, but could not accurately
assess the quality or other important properties of those goods (Benedikt & Oden, 2011).
Reparameterizing Value
We reparameterize value along three dimensions: Progress, Qualia, and Truth. The dimension of value that we label as 'Progress' pertains to the advancement of
technology, infrastructure, art, ideas, culture, etc. Progress measures how each decade has
improved upon the last. Later we will argue that the Progress dimension of value best
represents economic value on the whole and can be quantified with effective complexity.
Others have hinted that value and complexity are correlated (Daly, 1973; Romer, 1996),
and Benedikt & Oden (2011) explicitly argued for a valorization of complexity. In this
work we make a distinct argument for the intrinsic (as opposed to instrumental) value of
complexity, a value that can be hypothetically computed.
The dimension of value that we label as ‘Qualia’ pertains to the feelings and
subjective experiences and sensations that are unique to conscious beings. Unlike non-conscious matter, conscious beings
experience color, taste, sound, pleasure, pain, etc. These experiences undoubtedly impact value.
Finally, the dimension of value that we label as ‘Truth’ explores any inherent
value in gaining knowledge and understanding of reality, as it is. For example, it may be
argued that accurate knowledge of an occurrence in ancient history has value in and of itself (and is therefore worth seeking),
regardless of any practical application. These three dimensions of value are largely independent of
one another and can be thought of as a useful set of basis vectors that span a three-
dimensional value space. Any specific action or object may have positive or negative
value along one dimension, independent of its value along another dimension. For
example, a horrible truth may have value by virtue of its accuracy in describing the world
(Truth) but contribute to pain along the Qualia dimension. Alternatively, a pleasurable
falsehood may have positive value along the Qualia dimension but a negative impact along the Truth dimension. In general, intentional actions are undertaken with the
fundamental motive of effecting change in at least one of these three dimensions. While
higher-level motives may be formulated (e.g. love, friendship, loyalty, honor, etc.), at
base these higher-level motives can usually be deconstructed into component vectors that
lie along one or more of the three fundamental dimensions (Progress, Qualia, Truth).
A specific action or object (in a given context) can have its value-point plotted in this three-dimensional space. Similarly, an individual's
current location within the three-dimensional value space may be determined from past
actions and objects (vectors) that affect that individual. An individual’s current distance
from the origin in one of the dimensions (e.g. Qualia) may affect that individual’s ability
to move along a different dimension of value (e.g. Progress). For instance, a person who
has experienced a great deal of pain and depression (Qualia) might find it difficult to
produce works of Progress. Conversely, a person who has experienced too much bliss
(Qualia) might also be dissuaded from working towards Progress. Consequently, moving
through the value space resembles moving through a
non-conservative, three-dimensional force field. The difficulty of moving from one point
to another point in the three-dimensional space is path dependent. The ability to traverse
one of the dimensions can be impacted by the starting location in the other two
dimensions, even while all three dimensions remain orthogonal to one another in the value space.
Some thinkers have emphasized a single dimension of value and largely dismissed the other dimensions of value (Wilson, 1951). However, this perspective does
not reflect the ideals of contemporary capitalistic societies. At a policy level, capitalistic
societies tend to favor progress and accomplishments, lauding these endeavors as most
economically valuable and to some extent rewarding them accordingly. The most
valuable companies in the world are largely engaged in producing and organizing culture, relationships, information, ideas, etc. (Murphy et al. 2019). Fundamentally, all of
these activities are engaged in the process of complexifying the world (whether directly
or indirectly).
Activities that diminish Progress (complexity) are generally discouraged through regulations, taxes, or social stigma. Examples of such activities may include using hard
drugs, smoking cigarettes, activities that pose a fire risk, vandalism, pollution, dropping
out of high school, laziness, etc. Conversely, actions that may be considered difficult,
painful, or lacking in pleasure are generally commended if their results yield Progress
(complexity). Examples include painful medical procedures, putting a man on the moon, home maintenance, volunteer
work, dedication to a craft, acts of heroism, work ethic, etc. In other cases, activities have
no impact on Progress (complexity) and are therefore neutral in terms of economic value.
Intuitively, capitalism would rate a society where people lounge in pleasure but do very
little, as inferior to a society where people endure pain and hard work but achieve great
accomplishments. Moreover, while Truth may contain its own inherent value, capitalistic
societies tend to favor truths that pay out by facilitating Progress (e.g. scientific truths
that lead to new inventions). For these reasons, economic value is most suitably measured along the Progress dimension.
Progress (or economic value) cannot be easily measured with simple physical
properties like mass, volume, composition, or substance. Rather, economic value arises
from the arrangement and organization of an object's fundamental parts. For example, a coherent novel is more valuable than an
equally large stack of papers that have been printed on with the same amount of ink in a
random gibberish pattern. Food is more valuable before it has been digested (a process
that breaks down a great deal of its molecular structure). A structured computer is more
valuable than its unstructured raw materials. The same could be said of life. While the
specific atoms in a living body are continually flushed out and replaced (as individual
cells die), the embodied information (from DNA down to electrons) preserves a living
being’s individuality and potential value (both intrinsic and instrumental). In fact, most
furnace, thereby erasing its embodied information. Even pure metals like platinum can
have their embodied information and value destroyed by nuclear processes that rearrange
their constituent particles. Despite this fragility, something beautiful is afoot. Humans are born creative and with the potential to be
productive. Great civilizations arise from human minds and hands. Interpersonal
relationships form and flourish. These measures of progress (and others) make life worth living and are, as a rule of thumb,
'valuable'. Moreover, they all share a common denominator; they all embody (or give rise to) structured, meaningful information. Capturing this embodied information quantitatively requires more than the raw Algorithmic Information Content (AIC) of an object; it requires
operating on the AIC to tease out the contributions attributable to regularities. The
information measure known as ‘effective complexity’ (EC) does precisely this (as will be
discussed later in detail) and therefore provides a powerful proxy metric for measuring
economic value.
While most of the universe marches towards a high-entropy state, living beings
and their creations not only embody enormous complexity but also continually give rise
to new complexity. Local pockets of growing effective complexity are sustained by life and the inventions of living beings (Bar-Yam, 2002; Capra, 2005) (which both
harness and fight the increase of entropy on a local scale). The creation and preservation
of net EC is what humans strive for every day as they work to keep their bodies healthy
and living, maintain their cars and houses, build and nurture relationships, produce
writing, art, ideas, and culture. The creation and preservation of EC is built into a
human’s raison d’être and can be interpreted as a measure of what humans have and have
created of value.
However, often humans treasure things for their simplicity, not their complexity.
They seek simplified tax codes, tools that are simple but effective, and highly purified
silicon for use in transistors. Measuring the currently embodied EC of an object does not
capture its full value. Instead, we must consider both an object’s intrinsic value and its
instrumental value. The EC currently embodied by an object functions as a proxy for its
intrinsic value. However, the object may also serve as a raw material, tool, or other factor
in the production of still greater complexity in the future. This future complexity is the
object’s instrumental value and must be incorporated into any calculation of the object’s
overall value. For example, in a hypothetical chain of events, highly purified silicon is
used to produce transistors, which are assembled into computers, which produce
abundant future complexity through vast calculations. While purified silicon may appear
to have little value (due to its minimal embodied EC), the sum of its intrinsic and
instrumental value is quite large. Often an object’s future EC contributions will dwarf its
presently embodied EC. When humans seek things that are simple, it is with the ultimate
purpose of using this simplicity to facilitate even greater complexity in the future.
As a further spot check on the correlation between economic value and effective
complexity, consider both the value and EC of an individual human. Human life and
well-being are prized by general consensus. Simultaneously, the human brain, with its
nearly 100 billion neurons (Allen & Barres, 2009), each making up to 10,000
connections, is highly complex (perhaps the most complex object in our known solar
system). Moreover, humans produce enormous future EC through their creativity, decisions, actions, and secondary effects.
Therefore, under complexalism, actions that ultimately promote the survival and well-
being of humans have large positive value, while those that ultimately destroy human life have large negative value.
Measuring Complexity
Numerous measures of complexity have been proposed (Lloyd, 2001). However, the measure known as effective complexity
best formalizes our intuitive notions of complexity in a general format (Gell-Mann & Lloyd, 1996).
Effective complexity makes use of Algorithmic Information Content, which measures the
length of a highly compressed description of a bit string. Formally, the AIC of a bit string
is the length of the shortest computer program that can print the bit string and then halt.
For example, a bit string consisting of the pattern '10' repeated 20 times can be compactly described as '"10" repeated 20 times'. Therefore, the AIC of this bit string is given by the length of
its compressed description, not the length of the full bit string. Conversely, a bit string
that is truly random cannot be compressed, and its shortest description is at least as long
as the bit string itself. For this reason, AIC is sometimes called ‘algorithmic randomness’
as it assigns higher values to random bit strings than to bit strings containing patterns,
which runs counter to our intuitive notion of embodied information. The AIC of a random bit string is large, whereas its embodied information is
small. For example, in literature a novel composed by randomly selecting words from a
dictionary is considered to contain very little meaningful information, despite its large
AIC.
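Although AIC itself is formally uncomputable, the length of a losslessly compressed encoding provides a computable upper-bound proxy. The following minimal Python sketch (an illustration added here, not part of the original analysis; the function name aic_proxy and the choice of zlib are assumptions) contrasts a patterned bit string with a random one:

```python
import random
import zlib

def aic_proxy(bit_string: str) -> int:
    """Crude upper-bound proxy for AIC: length in bytes of a zlib-compressed encoding."""
    return len(zlib.compress(bit_string.encode("ascii"), 9))

patterned = "10" * 5000                                            # "10" repeated 5,000 times
random_bits = "".join(random.choice("01") for _ in range(10_000))  # no exploitable pattern

print("patterned bit string:", aic_proxy(patterned))    # small: the regularity compresses away
print("random bit string   :", aic_proxy(random_bits))  # large: high 'algorithmic randomness'
```

With inputs of this length, the patterned string compresses to a tiny fraction of its size while the random string does not, mirroring the point that AIC tracks randomness rather than embodied information.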
Effective complexity, on the other hand, only measures the AIC of the
regularities in a bit string. In doing so, effective complexity closely approximates our
intuitive notion of embodied information. Effective complexity separates a bit string into
a subset containing its regular features and a subset containing its incidental (or random)
features. This separation can be formalized using a procedure from the literature that embeds the bit string in a set of similar bit strings and assigns a
probability to each member of the set (Gell-Mann & Lloyd, 1996). Each member of the
set must share the regularities of the original bit string but otherwise can vary in its
incidental features. For instance, the members of the set could be typed out in various fonts wherein the symbols '1' and '0' are perceived as the
regularities, and the stylistic differences between each font are perceived as the incidental
features.
The bit string and its variants form an ensemble (wherein the regularities are held constant). This ensemble mirrors ensembles used in statistical mechanics (e.g. the
microcanonical ensemble wherein the total energy imposes a constraint, or the canonical
ensemble wherein the temperature imposes a constraint). By analogy, the regular features
of the bit string impose the constraint on the ensemble, while the ensemble members vary in their incidental features. The effective complexity is given by the AIC of the ensemble. EC is thus
distinct from AIC in that it conceptually separates the bit string into two subsets, one
containing the regularities and the other containing the incidental features. EC only
counts the contribution of the regularities. However, EC cannot be uniquely determined for a given bit string (in the absence of externally imposed constraints)
without several subjective decisions (constraints), such as on which computing machine the AIC is calculated, the
choice of an encoding scheme, and how regularities are defined. The first two subjective
decisions can be dismissed as long as protocols are established and followed which standardize both the computing machine and the encoding scheme.
The third necessary and subjective decision, defining regularities (also denoted
‘coarse-graining’), has been addressed by Céspedes and Fuentes, who argue that effective
complexity can be uniquely determined (or at least reasonably bounded) as long as the
‘cognitive and practical interests’ concerning the bit string are determined a priori
(Céspedes & Fuentes, 2020). Under these conditions effective complexity is compatible
with consistent, reproducible determination. As a hypothetical example, Céspedes and Fuentes imagine a bit string that records the earth's
atmospheric temperature over time. This bit string may be considered on the basis of a
period of one day or on the basis of a period of 21,000 years, yielding different patterns
over different timescales and hence different potential regularities. If, however, the
various researchers using this bit string agree to use the same basis (presumably based on
a shared research agenda), then they will arrive at the same (or at least a similar) effective
complexity result. If there is consensus on the 'cognitive and practical interests' among
the parties (i.e. agreement on the coarse-graining practice for determining regularities),
then EC can be determined consistently and used to assess the economic value of goods. In order for EC to serve as the basis for a rigorous economic
theory of value, the EC of the goods being evaluated must be uniquely determined (given
a specific context). As discussed, uniquely determining the EC of an object (or bit string)
requires that the parties involved come to a consensus on how to identify regularities (i.e.
a consensus on the coarse-graining procedure). The advantage of this reparameterization is
that it can be resolved in a more objective fashion than other nebulous approaches to
assessing value.
The large and nebulous question of value is thereby broken into smaller sub-questions about the coarse-graining procedure. Each of these sub-questions, on its own,
can more easily garner a consensus resolution. Once reassembled, the agreed upon
answers to the sub-questions enable a robust quantification of value. The purpose of this
framework parallels the way scientific frameworks rendered the question of an object's coolness to the touch more objective by slicing it into more objectively addressable subcomponents (e.g.
temperature and thermal conductivity). Exactly how the coarse-graining should be standardized is a question left for future work, but we list a few conceivable approaches. In one approach,
regularities could be defined down to a 'minimum intentional feature size'. For instance, the minimum intentional feature size could be defined as the
smallest unit of detail (voxel) that the creator would reproducibly incorporate into his or
her work under standard operating procedures. A 3D-printed part might have a minimum
intentional feature size of 1 mm. If asked to produce a dozen identical prints, each print
would match the others down to this feature size; any finer details are irreproducible, unintentional, and considered noise. Likewise, two pristine cars of the
same make, model, year, color, and accessories that have just come off of the same
assembly line are identical for all practical valuation purposes. Complexalism should not
concern itself about the precise arrangement of atoms and dislocations within the metal
itself (so long as they don’t affect other important materials properties), lest these atomic-
scale regularities drown out the regularities of interest. More generally, a diverse set of
'minimum intentional feature parameters' may be selected. For a 3D-printed part, these
parameters might include dimensions, surface finish, mechanical properties, etc. For a highly designed object like a car, each aspect is
typically covered by engineering specifications and tolerances. These specifications set error bars on the parameter precision necessary for each feature
to maintain its purpose. Accordingly, these specifications could be used to guide the
coarse-graining procedures.
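As a hedged sketch of how such a coarse-graining step might feed into an EC-style estimate, the Python snippet below quantizes a hypothetical 1-D height profile of a printed part to a 1 mm minimum intentional feature size before compressing it; the data, feature size, and function names are illustrative assumptions rather than a prescribed procedure:

```python
import random
import zlib

def compressed_size(values) -> int:
    """Crude AIC proxy: zlib-compressed length of a comma-separated encoding of the values."""
    return len(zlib.compress(",".join(str(v) for v in values).encode("ascii"), 9))

def coarse_grain(profile_mm, feature_size_mm=1.0):
    """Quantize a height profile to the minimum intentional feature size,
    discarding sub-feature-size detail as incidental noise."""
    return [round(h / feature_size_mm) * feature_size_mm for h in profile_mm]

# Hypothetical designed profile (a repeating 1 mm step pattern) plus ~0.05 mm process noise.
design = [float(i % 5) for i in range(2000)]
as_printed = [h + random.uniform(-0.05, 0.05) for h in design]

aic_like = compressed_size(as_printed)               # counts process noise as well as design
ec_like = compressed_size(coarse_grain(as_printed))  # counts only the regular, intentional features

print("AIC-style proxy (raw profile)   :", aic_like)
print("EC-style proxy (coarse-grained) :", ec_like)
```

Because the sub-millimetre noise is discarded before compression, the coarse-grained proxy reflects only the intentional structure, in the spirit of the minimum-intentional-feature-size proposal above.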
These feature parameters favor consequence over intent. For example, an object might be
overdesigned (in intent) at a feature scale that cannot be practically distinguished from a
less designed object. One could argue that if both objects are indistinguishable, they
should have the same valuation from a consequence perspective. The minimum
intentional feature approach could also be refined: if the mapping between feature scales and consequence levels is known (e.g. how microscale features impact bulk mechanical properties), a more
complete picture of the consequences resulting from each scale could be assessed and incorporated into the coarse-graining. As Gell-Mann and
Lloyd have pointed out, the coarse-graining procedure does not necessarily need to be perfect in order to be useful.
At present, market prices are the default proxy for economic value. This approach is
riddled with fundamental weaknesses. Market prices are dictated (in part) by the
equilibrium between supply and demand. However, demand is readily manipulated and
often too nebulous to evaluate directly and rigorously (Becher & Feldman, 2016). While
demand operates on a market scale, it derives from individual desires (whether simple or
multifaceted). There is little reason to believe that following the desires of individuals
will reliably maximize Progress. Complex questions of science, engineering, etc. cannot be viably resolved by a popular vote. Instead these questions
must be resolved with expertise, specialized training, and rigorous studies. While demand
may indirectly drive Progress (or at least motivate it), the connection between the two is tenuous and indirect.
If anything, desires may more aptly reflect the Qualia dimension of value. Qualia
are, after all, subjective internal experiences most readily accessed by the individual
experiencing them. However, here too, manipulations arise. In his theory of Mimetic
Desire, René Girard argues that nearly all desires arise from the copying of others
(Livingston, 1992). While inner desires may appear highly personal and seem to elucidate
something deeper about one's true self, they are often, in fact, foreign entities implanted
by outside influences. In other cases, market prices may result from a combination of demand and poorly designed rules
of engagement. For these reasons, market prices are unlikely to accurately reflect
inherent value. In short, pricing by supply and demand (as a valuation
system) is rife with fundamental weaknesses, not least of which is its dependence on the
nebulous and manipulable concept of demand. Complexalism, by contrast, offers a more objectively decidable metric of value that circumvents many of the shortcomings inherent
to 'demand'.
To be clear, complexalism does not take power out of the hands of the people.
Rather, value is first reparameterized into the dimensions of Progress, Qualia, and Truth. Next a systematic approach for quantifying Progress using effective complexity is
explored (as detailed below). By breaking down large and nebulous questions of value
into smaller, more decidable sub-questions, people can reach consensuses at a more fundamental level, far removed from the biases and susceptibilities
inherent to desires.
Simulating Valuations
For the following thought experiments we make idealized assumptions: 1) reality can be simulated with arbitrary fidelity, 2) we have access to unlimited computing power, and 3) we have access to all initial state
information. Instead of following Revenue and Cost (e.g. to decide levels of production), complexalism tracks
the tension between two new terms, 'Total Complexity' (TC) and 'Complexity
Opportunity Cost' (COC). Roughly speaking, TC captures the complexity contributed, both presently embodied (intrinsic) and generated in the future
(instrumental), by an object of interest. Here we use the term 'object' in a broad sense,
encompassing physical items, ideas, services, etc. First the object of interest (e.g. my
yellow bicycle) and its surroundings are encoded as information in a simulation. The EC
presently embodied in both the simulated object and its simulated surroundings is
quantified and this number is stored. Next, the simulation is run forward in time, with the
embodied EC (of both the object and surroundings) computed and stored at each
timestep. For example, my yellow bicycle is utilized to perform daily tasks, each of
which increases the net complexity of the surroundings. Any secondary effects are also
contributing to the embodied EC of the simulated surroundings for that timestep. Overall,
the EC of the entire simulation is integrated with respect to time (stepwise, assuming
infinitesimal timesteps) from the present (t_o) up to a specified point in the future (t_f). This
integral is stored as 'result A'. Next, in a second simulation, simulation B, my yellow bicycle is absent (while the surroundings begin in the same
initial state). The simulation will obviously run a different course in the absence of my
yellow bicycle. Once again, the EC is integrated with respect to time up to the same
future point and stored as 'result B'. The difference between 'result A' and 'result B' is
the difference in complexity between two worlds, one with the yellow bicycle and one
without. This difference defines the Total Complexity (TC) of the bicycle:
TC = \int_{t_o}^{t_f} C_{os} \, dt - \int_{t_o}^{t_f} C_{s} \, dt        (1)
The variable C_os represents the instantaneous EC of the object of interest and its
surroundings (simulation A), while C_s represents the instantaneous EC of the surroundings without the object (simulation B). TC thus measures the impact (i.e. complexalism benefit) of my yellow bicycle. However, impact
alone does not dictate value. Value is contextual. In complexalism contextuality is built
in because the simulations incorporate not only the object of interest, but also its potentially affected surroundings.
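A minimal numerical sketch of Eq. (1) is given below, assuming the two simulations have already produced EC time series; the example values are hypothetical and the trapezoidal rule is one simple integration choice:

```python
def total_complexity(ec_with_object, ec_without_object, dt=1.0) -> float:
    """TC per Eq. (1): integral of C_os dt (simulation A) minus integral of C_s dt (simulation B),
    approximated here with a trapezoidal rule over evenly spaced timesteps."""
    def integrate(series):
        return sum((a + b) / 2.0 * dt for a, b in zip(series, series[1:]))
    return integrate(ec_with_object) - integrate(ec_without_object)

# Hypothetical EC trajectories (arbitrary units): the bicycle lets its owner accomplish
# more each timestep, so EC grows faster in simulation A than in simulation B.
ec_A = [100.0, 104.0, 109.0, 115.0, 122.0]   # world with the yellow bicycle
ec_B = [100.0, 102.0, 104.5, 107.0, 110.0]   # same initial state, bicycle absent

print("TC of the bicycle:", total_complexity(ec_A, ec_B))
```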
As a simple example, consider a farmer with four buckets of water and four potential uses for water: hydrate herself, hydrate her animals,
wash her body, and water her garden. When deciding how to use the first bucket of water,
each activity has a distinct level of priority, as measured by the associated TC. Hydrating
herself in order to remain alive is a high priority as it has high TC. The staggering
interconnectivity of the human brain along with its creative potential is an enormous
source of complexity (both intrinsic and instrumental). Therefore, fostering human life,
well-being, and thinking are highly valuable activities under complexalism. Watering the
garden has much lower priority as it has lower TC. However, each potential use case has
diminishing marginal TC as more water is allotted to that potential use case. As the
farmer drinks more water, she becomes less thirsty, and the TC added with each
additional sip decreases. Under complexalism we assume that the farmer will act
reasonably and spread the available water over the potential use cases so as to maximize the total TC.
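A minimal sketch of this allocation logic is shown below; the marginal TC numbers are hypothetical, chosen only so that the greedy allocation reproduces the ordering described above (farmer, animals, body, garden):

```python
# Hypothetical marginal TC (arbitrary units) of each successive bucket of water
# allotted to each use case; values diminish with each additional bucket.
MARGINAL_TC = {
    "hydrate farmer":  [100.0, 8.0, 2.0, 1.0],
    "hydrate animals": [40.0, 7.0, 2.0, 1.0],
    "wash body":       [20.0, 6.0, 2.0, 1.0],
    "water garden":    [10.0, 5.0, 2.0, 1.0],
}

def allocate(buckets: int):
    """Greedily assign each bucket to the use case with the highest remaining marginal TC."""
    counts = {use: 0 for use in MARGINAL_TC}
    total = 0.0
    for _ in range(buckets):
        use = max(counts, key=lambda u: MARGINAL_TC[u][counts[u]])
        total += MARGINAL_TC[use][counts[use]]
        counts[use] += 1
    return counts, total

counts, total = allocate(4)
print(counts, "total TC:", total)   # each use case receives one bucket with these numbers
```

With these hypothetical numbers each use case receives one bucket, and the last bucket allotted (the garden) carries the smallest marginal TC.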
In some cases, equivalent goods are not equally accessible. For instance, some
water may be readily available, while other water must first be pumped from a distant
well and carried back. These different sources of water entail different Complexity
Opportunity Costs (COC). COC is the TC that could have otherwise been added to the
system, had time and resources not been expended elsewhere (e.g. in acquiring the
replacement goods). For example, the water that is readily available has nearly zero COC,
whereas the water that must first be pumped from a distant well and carried back expends
worker energy and time, which could have otherwise been spent producing additional TC
in other tasks. In general, the marginal COC of obtaining an additional unit of a good
increases with consumption (at least at high levels of consumption). This is because the
most easily available goods are consumed first, and additional goods must be scrounged
for in increasingly taxing ways. Note that the COC can entail the raw materials,
labor, and time required to obtain a replacement unit (not necessarily the efforts that went into the original unit in question).
In short, marginal TC generally decreases, while marginal COC generally increases, with increased consumption. When the marginal TC
and marginal COC are calculated as functions of consumption (using simulations) and
plotted on the same graph (Figure 2), they intersect at a break-even point where marginal
TC equals marginal COC. This break-even point sets a boundary on the simulation: allocating
additional units (beyond the break-even point) would actually decrease overall
net complexity, so valuations are anchored at the break-even point.
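The break-even logic can be sketched as follows, with hypothetical marginal TC and marginal COC curves standing in for simulation outputs:

```python
def marginal_tc(units: int) -> float:
    """Hypothetical diminishing marginal TC of the next unit consumed (arbitrary units)."""
    return 100.0 / (1 + units)

def marginal_coc(units: int) -> float:
    """Hypothetical increasing marginal COC of obtaining the next unit (arbitrary units)."""
    return 5.0 + 3.0 * units

def break_even(max_units: int = 100) -> int:
    """Number of units worth allocating: stop once marginal TC no longer exceeds marginal COC."""
    units = 0
    while units < max_units and marginal_tc(units) > marginal_coc(units):
        units += 1
    return units

n = break_even()
print(f"Allocate {n} units; the next unit would add marginal TC {marginal_tc(n):.1f} "
      f"at marginal COC {marginal_coc(n):.1f}")
```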
How to relate one's compensation (or price) and one's positive impact is a philosophical question with at least three distinct
answers: A) the total value that an individual produces, B) the hypothetical loss that would be incurred by the absence of the
individual, or C) the cost of replacing the individual. To illustrate the differences between option A and option B, we return to the simple example of the
farmer with four buckets of water (instead of individuals). Let us posit that in this
circumstance the farmer uses the first bucket of water (bucket #1) to hydrate herself,
bucket #2 to hydrate her animals, bucket #3 to wash her body, and bucket #4 to water her
garden, in order to maximize TC. Under option A, bucket #1 has tremendous TC or value
(by virtue of keeping the farmer alive), while bucket #4 has minimal TC or value (by
virtue of keeping her garden alive). However, in a practical sense (due to the
interchangeability of the buckets), if bucket #1 were lost (or stolen), the farmer would not die of thirst. Rather she would promote buckets 2-4 to serve
use cases 1-3 (in order of priority). Therefore, the value lost is the TC associated with
keeping the garden alive. This more pragmatic assessment of value is reflected in option
B.
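Reusing the hypothetical marginal-TC table from the earlier sketch, the snippet below contrasts option A (the TC of the use a bucket currently serves) with option B (the drop in maximum achievable TC when one bucket is removed); the numbers are illustrative assumptions only:

```python
# Hypothetical diminishing marginal TC (arbitrary units) for each use of water.
MARGINAL_TC = {
    "hydrate farmer":  [100.0, 8.0, 2.0, 1.0],
    "hydrate animals": [40.0, 7.0, 2.0, 1.0],
    "wash body":       [20.0, 6.0, 2.0, 1.0],
    "water garden":    [10.0, 5.0, 2.0, 1.0],
}

def max_total_tc(buckets: int) -> float:
    """Maximum total TC achievable with the given number of buckets (greedy allocation)."""
    counts = {use: 0 for use in MARGINAL_TC}
    total = 0.0
    for _ in range(buckets):
        use = max(counts, key=lambda u: MARGINAL_TC[u][counts[u]])
        total += MARGINAL_TC[use][counts[use]]
        counts[use] += 1
    return total

option_a = MARGINAL_TC["hydrate farmer"][0]     # value of bucket #1 = TC of the use it serves
option_b = max_total_tc(4) - max_total_tc(3)    # value lost if any one bucket disappears

print("Option A valuation of bucket #1:", option_a)   # 100.0 (keeping the farmer alive)
print("Option B valuation of bucket #1:", option_b)   # 10.0 (the garden, the marginal use)
```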
Option C closely mirrors current market principles and shares their weaknesses.
For example, the cost to replace a worker is typically assessed via market forces (supply
and demand establishing an equilibrium price). However, a market-derived price can only
be interrogated after the rules of engagement have been set for the market. The rules of
engagement (e.g. labor laws, property rights, contract law, etc.) impact the relative negotiating power of every party involved and
therefore impact the resulting price. For these reasons, option C is not suitable for meta-level evaluations of the market rules of engagement. If, however,
the 'cost' of option C is interpreted to mean the Complexity Opportunity Cost (COC),
then in general option C reduces to option B due to the equivalence between marginal
TC and marginal COC at the break-even point. When a unit of a good is removed, other units of that same good, previously allocated to lower priority uses, are
shifted to fulfill the highest priority uses first. The overall effect is that removing a unit
from any use case always results in a loss of TC equivalent to the marginal TC of that
good at the break-even point. Therefore, under complexalism a good or activity’s value
(along with its price or level of compensation) is proportional to the marginal TC of that
good or activity at the break-even point. Under complexalism the calculated values of specific objects may or may not resemble current
market prices. In some cases the contrast between the two will be stark. These differences
should be taken as features, not bugs. The utility of any new framework derives from its
capacity to depart from the status quo. Note that complexalism is an economic theory of value, not a moral theory of value. Other moral restrictions (e.g. deontological constraints) may be applied alongside it.
At present, the available computational resources and datasets are not large
enough to implement complexalism at full scale. By analogy, Rawls' 'Veil of Ignorance' framework cannot be physically put into practice but
has nonetheless served as a conceptual foundation and guiding light in important policy
decisions (Kukathas & Pettit, 1990). In a similar fashion, complexalism might help to
guide economic policy even before it can be computed at scale. To sidestep the many unknowns about the future, complexalism could use recent historical data to make
retrodictions about the recent past (as opposed to predictions about the future). These
retrodictions (made at set time intervals) could establish valuations for specific goods and
services. These valuations could benchmark where prices and value have diverged and by how much. For instance, a
rolling series of retrodictions could supply valuation anchor points for use in a course-correcting feedback loop (when steering the
economy by tuning its rules of engagement). A practical challenge is that each object of interest is embedded in
specialized surroundings with unique minutiae. Although Chaitin proved that Algorithmic Information Content (AIC) is formally uncomputable, a number of approaches
to bound and/or estimate AIC have been proposed (Cilibrasi & Vitanyi, 2005; Delahaye
& Zenil, 2012; Bloem et al. 2014; Soler-Toscano et al. 2014; Soler-Toscano & Zenil,
2017). Gell-Mann and Lloyd simply propose setting a finite maximal computation time in
order to produce a decidable AIC proxy when computing EC (Gell-Mann & Lloyd,
2010). Historically, a number of frameworks and algorithms have been proposed before
the computational power to widely implement them existed (e.g. Newton’s method,
machine learning (Nilsson, 1965)). Likewise, complexalism may benefit from recent and
future advances in exascale computing and quantum computing (Arute et al. 2019).
Conclusions
We have presented complexalism as a framework for assessing the value impact of objects and occupations when evaluating policy decisions.
Computed value-anchoring points may serve as targets to aim for when steering the
economy (by tuning its underlying rules of engagement). Monetary wealth or economic
growth measured by market prices cannot be used for these meta-level evaluations. Value
and market prices often diverge precisely because of poorly designed rules of
engagement. Without an independent metric of value, the mechanisms for determining prices and wages cannot be cross-examined. This sets up a
blind feedback loop that is more likely to conjure a simulacrum than reflect reality.
References:
Allen, N. J., & Barres, B. A. (2009). Glia—more than just brain glue. Nature, 457(7230),
675-677.
Arute, F., Arya, K., Babbush, R., Bacon, D., Bardin, J. C., Barends, R., et al. (2019).
Quantum supremacy using a programmable superconducting processor. Nature, 574(7779), 505-510.
Becher, S. I., & Feldman, Y. (2016). Manipulating, fast and slow: the law of non-verbal market manipulations.
Benedikt, M., & Oden, M. (2011). Better is better than more: complexity, economic
progress, and qualitative growth. The University of Texas at Austin, Center for Sustainable Development.
Bloem, P., Mota, F., de Rooij, S., Antunes, L., & Adriaans, P. (2014). A safe approximation for Kolmogorov complexity. In Proceedings of the International Conference on Algorithmic Learning Theory.
Capra, F. (2005). Complexity and life. Theory, Culture & Society, 22(5), 33-44.
Dabla-Norris, M. E., Kochhar, M. K., Suphaphiphat, M. N., Ricka, M. F., & Tsounta, E.
(2015). Causes and consequences of income inequality: a global perspective. International Monetary Fund.
Daly, H. E. (1973). Toward a steady-state economy. San Francisco: W. H. Freeman.
Delahaye, J. P., & Zenil, H. (2012). Numerical evaluation of algorithmic complexity for
short strings: a glance into the innermost structure of randomness. Applied Mathematics and Computation, 219(1), 63-77.
Dobb, M. (1975). Theories of value and distribution since Adam Smith. Cambridge: Cambridge University Press.
Gell-Mann, M., & Lloyd, S. (1996). Information measures, effective complexity, and total information. Complexity, 2(1), 44-52.
Graeber, D., & Cerutti, A. (2018). Bullshit jobs. New York: Simon & Schuster.
Kukathas, C., & Pettit, P. (1990). Rawls: a theory of justice and its critics. Stanford: Stanford University Press.
Livingston, P. N. (1992). Models of desire: René Girard and the psychology of mimesis. Baltimore: Johns Hopkins University Press.
Murphy, A., Ponciano, J., Hansen, S. & Touryalai, H. (2019). Global 2000: the world's largest public companies. Forbes.
North, D. C. (1992). Transaction costs, institutions, and economic performance. San Francisco: ICS Press.
Romer, P. M. (1996). Why, indeed, in America? Theory, history, and the origins of modern economic growth. The American Economic Review, 86(2), 202-206.
Soler-Toscano, F., Zenil, H., Delahaye, J. P., & Gauvrit, N. (2014). Calculating Kolmogorov complexity from the output frequency distributions of small Turing machines. PLoS ONE, 9(5).
Stiglitz, J. E. (2015). Rewriting the rules of the American economy: an agenda for growth and shared prosperity. New York: W. W. Norton & Company.
Taylor, K. S. (2001). Human society and the global economy. SUNY-Oswego Department of Economics.
Yunus, M. (2017). A world of three zeros: the new economics of zero poverty, zero unemployment, and zero net carbon emissions. New York: PublicAffairs.
The views and opinions expressed in this paper are those of the authors and not necessarily the views and
opinions of the U.S. Department of Energy. The authors disclosed receipt of the
following financial support for the research, authorship, and/or publication of this article:
this work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory.