A Three-Layered Model for Consciousness States


Arushi Kak*, Abhinav Gautam, Subhash Kak
ABSTRACT

This paper presents a general three-level framework to represent different states of consciousness. Although
awareness, by itself, is an all-or-nothing phenomenon, the state of consciousness depends on the degree to which
preconscious and memory states are accessible to awareness. The relative ability to recruit these states and
integrate distributed neural activity has a corresponding effect on the state of attention which can, therefore, be
measured on a spectrum. Research in anesthesiology showing an uncoupling of perception from sensory inputs
supports the idea of the disembodied consciousness state. We propose a three-level hierarchical model to
explain these findings. The higher level nodes in this model are non-physical and they leave their trace in the
non-local binding of activity across regions that are far apart. The activity across the brain can be quantified
using an entropy-based metric and the non-physical nodes corresponding to consciousness states may be
governed by quantum dynamics.
Key Words: consciousness states, neural correlates of consciousness, hierarchical processing, perception
sensation decoupling, quantum neural processing
DOI Number: 10.14704/nq.2016.14.2.935
NeuroQuantology 2016; 2: 166-174

Corresponding author: Subhash Kak, School of Electrical and Computer Engineering, Oklahoma State University, Stillwater, OK 74078, USA
Addresses: Arushi Kak, MD, St. Elizabeth's Medical Center, 736 Cambridge St, Brighton, MA 02135; Abhinav Gautam, 610 NE 36th Street, #3408, Miami, FL 33137; Subhash Kak, School of Electrical and Computer Engineering, Oklahoma State University, Stillwater, OK 74078, USA. Phone: + 405-744-6096
e-mail: arushi.kak@gmail.com; abhinav.gautam@gmail.com; subhash.kak@gmail.com
Relevant conflicts of interest/financial disclosures: The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.
Received: 26 February 2016; Revised: 13 March 2016; Accepted: 15 March 2016

Introduction

The mapping of distributed brain processes to behavior and cognition must include the influence of social context and history, which also contribute to our subjective feel of the world. The neurophysiological component of this equation is provided by activity in the neural circuitry of the brain and in the functional modules that map to specific conscious states. The mapping is further complicated by the self-organization of the brain, since activity and behavior have implications for structure. This indicates that the idea of mind/brain identity based on a classical computational framework is incorrect, which is
consistent with the finding that there is no
unified neural correlate of consciousness (NCC)
(Zeki, 2003).
There exist many different neural
correlates for different kinds of consciousnesses,
which is supported by the fact that there are
specialized modules for particular perceptions.
Furthermore, conscious experience is selective:
some neural activity generates it whereas other
activity doesn't. If consciousness is equated to
attention, there appears to be greater overlap
between mechanisms of memory and awareness
than between those of attention and awareness
(Lamme, 2003; Zeki, 2003). When considered as
activity in specific areas of the brain,
consciousness appears to depend on the activity
of the thalamus and brain stem (Paus, 2000), or
in higher prefrontal and parietal association
areas (Marois and Ivanoff, 2005). The claustrum,
along the underside of the neocortex, has also
been identified as the region where widespread
coordination of the activity occurs, resulting in
the seamless quality of conscious experience; the
claustrum synchronizes the activity between both
the cortical hemispheres and between cortical
regions within the same hemisphere (Crick and
Koch, 2005).
In the case of the visual system,
information comes through the primary visual
cortex V1, which creates a well-defined map of
the spatial information in vision. After this,
information flows to the secondary areas V2, V3,
V4, and V5. Neurons in V1 and V2 respond
selectively to bars of specific orientations and
they are believed to support edge and corner
detection. In V2, neurons are tuned to basic
properties such as orientation, spatial frequency,
and color. In V3/V3A, they process global motion.
V4 neurons are like V2 but they are tuned to
features of intermediate complexity; faces are
recognized in the inferotemporal cortex; and the
area V5 plays a major role in the perception of
motion and in eye movements. If V5 is damaged
there is no perception of motion. If V1 is damaged
but V5 is intact, then signals in V5 are correlated
with the stimulus, but the subject has no
conscious awareness of that fact. Figure 1 shows these areas; the face and object recognition regions (blue) receive their input largely from V1 (yellow) and V2. In short, we find that sub-tasks
in the vision process are distributed over several
regions. (Zeki et al., 1991; Zeki, 2003)
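The functional specialization just described can be summarized, purely for illustration, as a mapping from visual areas to the attributes they are tuned to; the short sketch below (our own, not from the paper) makes the distributed nature of the sub-tasks explicit.

```python
# A minimal sketch (our own illustration, not from the paper) of the
# functional specialization described above: each visual area is mapped to
# the attributes it is chiefly tuned to, emphasizing that the sub-tasks of
# vision are distributed over several regions.
VISUAL_AREAS = {
    "V1": "retinotopic map of spatial information; edge and corner detection",
    "V2": "orientation, spatial frequency, and color",
    "V3/V3A": "global motion",
    "V4": "features of intermediate complexity",
    "V5": "motion perception and eye movements",
    "IT": "face and object recognition (inferotemporal cortex)",
}

def areas_supporting(task_keyword: str) -> list:
    """Return the areas whose functional description mentions the keyword."""
    return [area for area, role in VISUAL_AREAS.items()
            if task_keyword.lower() in role.lower()]

print(areas_supporting("motion"))  # ['V3/V3A', 'V5']
```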
If we consider language, the complex manner in which aphasias manifest suggests that its production and processing are a tangled
process. Certain components of the language
functioning process operate in a binary fashion.
These components include comprehension,
production, repetition, and various abstract
processes. Viewing each as a separate module is
not quite correct due to subtle interrelationships
mediating between these capabilities which all
come into operation in normal behavior. Counterintuitive disorders such as alexia without agraphia can only be explained by postulating decision nodes that do not have
physical neural correlates.
From the point of view of anesthesia, two
states of consciousness may be postulated for the
patient who is not fully anesthetized: (i) the patient seems to be cognizant, responding to commands, but with no postoperative recall or
memory of the events; and, (ii) the patient can
recall events postoperatively, but was not
necessarily conscious enough to respond to
commands. Any model of consciousness states that is developed must address these cases and be reconciled with the well-understood hierarchy of conscious, preconscious, and subliminal processing
(Dehaene et al., 2006).

Figure 1. Areas of the visual brain

The mechanisms associated with consciousness are often very dissimilar both spatially and temporally and may be associated with different perceptions. This is so, for example, for the specific cases of color and visual motion awareness. The disparate and multiple mechanisms of perception and comprehension suggest that consciousness has no single neural correlate (Zeki, 2003).
But if there is no single neural correlate,
how do distributed lower-level consciousnesses
relate to each other, and how does the integration
of the neural activity take place, as in the network
hub research of van den Heuvel and Sporns
(2013)? The general view is that the
communication between preconscious states
somehow maps to a single consciousness (Gilbert
and Sigman, 2007) although a conceptual
framework within which this communication
may be addressed is not defined.
In this paper we take up this question by
advancing a model of different hierarchically defined consciousnesses, which are mediated by
languages and metalanguages with varying
capacity to recruit lower level consciousness
states. It is further assumed that some of these
languages are quantum in character. The highest
node in this hierarchical model, which is located
in a non-physical space, represents the unitary
consciousness but it does not have a single neural
correlate, and, in this sense, it agrees with the
available evidence.

A Hierarchical Model of Consciousness


The idea of consciousness requires not only an
awareness of things but also the awareness that
one is aware. If awareness is some kind of a
measurement, it should have a reference. This, in
turn, poses two problems: Firstly, what is the
reference for awareness? Secondly, how does
consciousness choose between various possibilities?
The problem of the referent in awareness
is an old one. The Vedic sages of India solved it by
postulating a single universal, transcendental
consciousness. In this view, the individual's empirical consciousness is a projection and the
referent for it is the universal. The analogy of the
same sun reflecting in a million different pots of
water as little suns is provided to explain the
empirical consciousness of the individual. The
Māṇḍūkya Upaniṣad speaks of four basic states of
consciousness: waking, dreaming, deep-sleep,
and a fourth, which transcends these three. The
waking state is primarily concerned with the
embodied self and sensations. In the dreaming
self, not all memories associated with the
autobiographical self are available to the
conscious agent. In deep-sleep where there is no
dreaming, the conscious self is dormant; and in
the fourth state, the conscious self is in a
transcendent state.
In a somewhat similar way, Plato invokes
an object in a cave that cannot be seen directly
and whose shadows on the wall represent the
accessible phenomena. In this view, truth is an
abstraction and universals exist independent of
particulars.
In Western philosophy, René Descartes
proposed that consciousness resides within an
immaterial domain of res cogitans (the realm of thought), to be contrasted with the domain of material things, or res extensa (the realm of
extension). He assumed that the two realms
interact in the brain, but this dualist Cartesian
position is no longer taken seriously. Immanuel
Kant arrived at a resolution similar to that of the
Vedic tradition by arguing that empirical
consciousness must have a necessary reference
to a transcendental consciousness (a
consciousness that precedes all particular
experience). The universal or transcendental
position is generally unacceptable to mainstream
scientists who insist on reductionist models.
William James spoke of two kinds of
selves: the self as knower (the "I"), and the self as known (the "me"). Each person's self is partly subjective (as knower) and partly objective (as
known). The objective self may be described in its
three aspects: the material self, the social self,
and the spiritual self (James, 1890). Narrative self-reference stands in contrast to the immediate knowing "I", which supports the notion of momentary experience as an expression of selfhood.
James believed that, as knower, the self is
composed of different mental states. Thought has
no constant elements and every perception is
relative and contextualized. States of mind are
never repeated, and whereas objects might be
constant and discrete, thought is constantly
changing and mental states arise out of choices
that are made by the mind. James believed that
thought flows, and thus he could speak of a
stream of consciousness.
If one were to find the boundaries between the "me" and the "I" of consciousness, it becomes essential to find a minimal sense of self. It is easy to speak of the intuition that there is a basic or primitive something that is the true self, and much harder to provide evidence for such a belief. Conceptually, there must be something permanent, a bedrock, underlying
the stream of consciousness. It is known that
medial prefrontal cortex (mPFC) supports self-awareness by linking subjective experiences
across time and this function is of special
importance to human social cognition and
behavior. Additionally, the mPFC plays a role in
decision making, executive control, reward-guided learning, and recent and remote
memories. Cortical midline processes support
narrative self-reference that maintains continuity
of identity across time. It has been proposed that
the function of the mPFC is to learn associations
between context, locations, events, and
corresponding adaptive responses, particularly
emotional responses (Euston et al., 2012).
Therefore, neuropsychological case studies of
episodic memory or loss of memory can help to
define the neural bases of the narrative self.
From an information theory perspective, the quantification of the spatiotemporal distribution of the current evoked in the brain by a transient magnetic field can measure the level of consciousness. The idea behind this measure
is that when the brain is unconscious, the evoked
activity is either localized, or widespread and
uniform, as during slow wave sleep or epileptic
seizures (i.e., lack of differentiation). The
conscious state, on the other hand, corresponds
to differentiated but structured activity in the
brain (Llinas et al., 1999). This hypothesis
requires an appropriate measure to be associated
with the activity, and an entropy-based function is a natural choice. But computing such an entropy over a three-dimensional map of the brain will be problematic, given the difficulty of accessing this activity, although one may instead consider the entropy of suitable evoked one-dimensional signals.
To deal with the empirical pre-conscious or conscious awareness, we postulate
independent and distributed neural structures at
the lowest Level 1, which may include subliminal
and preconscious nodes. The nodes at the next higher Levels 2 and 3 do not possess specific neuronal correlates; they aggregate several of the lowest-level consciousnesses, and the unitary consciousness is defined at the highest Level 3. This is shown in Figure 2, where it may
be noted that the hierarchy has a tangled
structure and cross-dependence that goes
beyond what is shown in the sketch.
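The layered structure described above can be rendered, purely as an illustration, in the short sketch below (our own encoding; the node names and the simple accessibility flag are assumptions, and the real hierarchy is tangled in ways this tree does not capture).

```python
# A purely illustrative sketch of the three-layer model of Figure 2.
# Level 1 nodes stand for physical (subliminal/preconscious) modules; Level 2
# and Level 3 nodes are virtual aggregators with no specific neural correlate.
# The node names and the accessibility mechanism are our own assumptions.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    level: int                      # 1 = physical module, 2/3 = virtual node
    children: list = field(default_factory=list)
    accessible: bool = True         # can this node currently be recruited?

    def recruited_modules(self) -> list:
        """Level 1 modules reachable through accessible nodes only."""
        if not self.accessible:
            return []
        if self.level == 1:
            return [self.name]
        modules = []
        for child in self.children:
            modules.extend(child.recruited_modules())
        return modules

# A toy hierarchy: two Level 2 aggregators over four Level 1 modules.
color    = Node("color", 1)
motion   = Node("motion", 1)
language = Node("language comprehension", 1)
memory   = Node("episodic memory", 1)
vision   = Node("visual binding", 2, [color, motion])
verbal   = Node("verbal/memory binding", 2, [language, memory])
unitary  = Node("unitary consciousness", 3, [vision, verbal])

memory.accessible = False           # e.g., memory formation is blocked
print(unitary.recruited_modules())  # ['color', 'motion', 'language comprehension']
```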

Figure 2. A hierarchical model of autonomous agents

The nature of the conscious experience would depend on what other nodes are accessible to the node at Level 3. The hierarchical details of the centers of consciousness will vary across different individuals. If more of the processing is done in Levels 2 and 3, one would expect faster response, and this may explain why chimpanzees appear to have a superior working memory for numerals compared to humans (Matsuzawa, 2013).
The speed of attribute binding would depend on the complexity of the communications and the relationships. Therefore, it is natural to assume that the binding of the attributes in Level 1 will be the fastest, correspondingly slower in Level 2, and slower still in the cumulative aggregation in Level 3. Due to the interference between various levels and the tangled nature of the information flow, one has illusions of perception, as in the examples of Figure 3.

Figure 3. The checker shadow illusion by Edward H. Adelson (A and B have identical shades), and the rabbit and duck illusion from the 23 October 1892 issue of Fliegende Blätter.

Since the idea of a transcendental consciousness cannot be reconciled with the reductionist position, Zeki (2003) argues that Level 1 agents are ontologically more fundamental and there is no need to invoke a transcendent entity. Speaking of the visual system, he believes that the Level 1 cortical programs that construct visual attributes must precede any entrainment by experience. He therefore believes that Level 1 micro-consciousnesses precede the unified consciousness and are present at birth.

170
NeuroQuantology | June 2016 | Volume 14 | Issue 2 | Page 166-174 | doi: 10.14704/nq.2016.14.2.935
Kak et al., A three-layered model for consciousness states

Zeki (2003) does not consider the possibility that
the entrainment of the Level 1 consciousnesses
may have occurred within the species in earlier
generations. Our proposed model of Figure 2 is
different from Zekis in that nodes of Level 2 and
Level 3 do not have neural correlates, and,
therefore, they are located in a non-physical
plane. It is because of this non-physical nature
that these virtual nodes are able to synchronize
activity in distant modules.

Recruitment of Consciousness and Preconsciousness States

The mind must select from the pool of memories in the emergence of any specific consciousness state that the subject can communicate to others. This selection may not be made consciously, and it may be determined by the stream of previous consciousness states and the emotional state of the subject. It is to be expected that the executive control processes play an important role in the selection.
Furthermore, repeated selection of certain memories at the expense of others may affect the recall process, causing unwanted memories to be pushed back into the unconscious. Thus mechanisms can be recruited that prevent unwanted declarative memories from entering awareness, and this cognitive act may have enduring consequences for the rejected memories (Anderson and Green, 2001).
Executive control can influence the degree to which memories are being recruited to assist awareness. The performance of the executive control, even in the presence of awareness, will define the spectrum of awareness states. On the other hand, pathologies that prevent the recruitment of memories will degrade the states of consciousness, and thus the consciousness of an Alzheimer's disease patient is of a lesser quality than that of a healthy individual. One may also speak of a person being only partially awake or fully awake using the same criterion.
If one were to refer to mindfulness as the state in which executive control can recruit memories with the greatest ease, we face the question of how to define it. Vago and Silbersweig (2012) describe mindfulness in terms of systematic mental training that develops meta-awareness (self-awareness), an ability to effectively modulate one's behavior (self-regulation), and a positive relationship between self and the other that transcends self-focused needs and increases prosocial characteristics (self-transcendence). They call their approach self-awareness, self-regulation, and self-transcendence (S-ART). But there are consciousness states that are not associated with the subject's autobiographical self.

Entropy Metric for Consciousness States

Entropy is a commonly used mathematical
measure for information and represents the
degree of surprise associated with a given event
or state. Determining the entropy of a three-dimensional structure associated with the brain is computationally impractical owing to the difficulty of accessing the internal states of the brain: we would need to access not only the physical state of the neurons (if that were even possible) but also that of the virtual nodes.
The inner states are mapped to behavior, and this may be seen either in ambulatory behavior, which may be two-dimensional for certain organisms, or in the behavior associated with the accessed mental states, which may be seen as one-dimensional.
The commonly used information entropy
function for a one-dimensional system is:

S = -k_B \sum_i p_i \ln p_i

where p_i is the probability that the state i is occupied, k_B is the Boltzmann constant in joules per kelvin, and the amount of energy required to erase one bit of information is k_B T \ln 2
(Landauer, 1961). The energy requirements for
the physical processing of information by the
brain are important in their own right, but this
issue does not concern us here.
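As an illustration of how the entropy of a suitable one-dimensional evoked signal could be estimated, the sketch below (our own, with an assumed amplitude binning and entropy reported in nats, i.e., with k_B set to 1) histograms the signal into discrete states and applies the formula above; a highly localized response yields near-zero entropy, while differentiated activity yields a larger value.

```python
# A sketch (our own, with an assumed binning scheme) of estimating the entropy
# of a one-dimensional evoked signal: amplitudes are discretized into bins,
# the bin occupancies give the probabilities p_i, and S = -sum_i p_i ln p_i.
import numpy as np

def signal_entropy(signal: np.ndarray, n_bins: int = 32) -> float:
    """Shannon entropy (in nats) of the amplitude distribution of a 1-D signal."""
    counts, _ = np.histogram(signal, bins=n_bins)
    p = counts / counts.sum()
    p = p[p > 0]                      # drop empty bins (0 ln 0 = 0)
    return float(-np.sum(p * np.log(p)))

rng = np.random.default_rng(0)
differentiated = rng.normal(size=5000)   # broadband, differentiated activity
localized = np.zeros(5000)
localized[2500] = 1.0                    # a single localized deflection
print(signal_entropy(differentiated))    # a few nats
print(signal_entropy(localized))         # near zero: almost all mass in one bin
```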
Although in thermodynamic systems all
the states of the same energy are taken to have
the same probability, this is not the case in
information networks which are associated with
decisions at various hierarchical levels.
Experiments, as well as theoretical
considerations, show that such networks are
characterized by self-similar or recursive
behavior (Bressler, 1995).
A network consisting of n nodes, labeled 1, 2, ..., n, may be represented by a graph where
the connection between the nodes i and j is
shown by a link between the two. A network may
be valued for the function it performs for nodes outside of it, or for the function it serves for the nodes within. Unlike engineered networks, whose function (to nodes outside the network) is well-defined, the goodness of networks inside the brain cannot be quantified
because of the elusiveness of the inner
experience of well-being. Nevertheless, for the
entire network one speaks of some general
function such as value, utility, well-being, or
welfare, whose definition is driven by extraneous
considerations of theory.
In networks where value must be
associated with the nodes within, one speaks of
Pareto efficiency, or Pareto optimality, which is a
state of allocation of resources in which it is
impossible to make any one individual better off
(in terms of a suitable measure of wellbeing)
without making at least one individual worse off.
A state is Pareto efficient or optimal when no
further improvements can be made. The
allocation, normally in terms of resources, could
also be in terms of some property of the
connectivity where we ignore the cost of the
connections.
If one were considering connectivity as value, the total value of the network is proportional to n(n - 1), that is, roughly, n^2, since in a network with n nodes each node can make (n - 1) connections with other nodes. This ignores the fact that in an evolutionary network the connection capacity of the nodes must vary, as new nodes can only gradually get connected with others and some other nodes might leave the network. A careful analysis indicates that the potential value of a network of size n grows in proportion to n log n.
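The difference between the raw n(n - 1) connectivity count and the n log n estimate can be seen in a quick calculation (a sketch of the arithmetic only):

```python
# A quick numerical illustration of the two growth laws quoted above: the raw
# pairwise-connection count n(n - 1), roughly n^2, versus the more conservative
# n log n estimate of the potential value of a network of size n.
import math

for n in (10, 100, 1000, 10000):
    pairwise = n * (n - 1)
    nlogn = n * math.log(n)
    print(f"n = {n:>6}   n(n-1) = {pairwise:>12}   n log n = {nlogn:>12.1f}")
```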
Any theoretical definition of utility for a
network is at best arbitrary; therefore, the actual
connectivity can provide insight into value. Such connectivity is determined by cognitive or energy considerations, with one defining feature being self-similarity across different layers, which
is driven by the logic of local and hierarchical
interactions. The value of the network may be
examined from how close it is to self-similarity
across many different layers.
Hierarchical modular brain networks are
particularly suited to facilitate local neuronal
operations and the global integration of local
functions. Energy and connectivity considerations constrain functional networks, but it is essential for these networks to integrate the broader picture (Kak, 2012). The self-organization of brain networks leads to both
redundancy and self-similarity (Park and Friston,
2013).
Conversely, it has been proposed that consciousness is a fundamental property
possessed by physical systems having specific
causal properties (Tononi and Koch, 2015). This
approach correlates integrated information of a
certain type directly with consciousness without
explaining how the capacity for awareness
emerges and, therefore, this hypothesis doesn't
appear to have the potential of deepening the
understanding of consciousness. This view
appears to simply be a version of the strong-AI
thesis according to which awareness will emerge
in any computing system that is sufficiently
complex. Consciousness may have specific
physical correlates but the correlates are
projections which cannot explain the problem of
the necessary referent for understanding, or how
awareness emerges in a classical system.
Anesthesia, Sleep, and Coma
General anesthesia leads to unconsciousness,
amnesia, analgesia, and akinesia, while
maintaining stability of the autonomic,
cardiovascular, respiratory, and temperature
regulatory systems. With the deepening of the
level of general anesthesia, a progressive increase
in low-frequency, high-amplitude patterns on the
electroencephalogram (EEG) occurs. During
recovery from coma, the EEG transitions through
the patterns characterizing general anesthesia,
sleep, and finally the awake state. There is a
corresponding sequence of the return of regular
breathing, salivation, tearing, swallowing,
gagging, and grimacing, and finally the capacity to
respond to oral commands.
Brown et al. (2010, p. 2644) argue that
the quantitative neurobehavioral metrics used
to monitor recovery from coma could be used to
track the emergence from general anesthesia
from a functional state that can approximate
brain-stem death to states similar to a vegetative
state and, eventually, to a minimally conscious
state. Such a metric would be similar to the one
that has been proposed to determine the entropy
of the activity in the brain as a measure of level of
consciousness.
The idea of the uncoupling of perception
from sensory inputs seems to support the idea of
disembodied Level 3 consciousness. In a model
proposed by Pandit (2014), an explicit sensation-perception coupling is incorporated into a
macroscopic, functional model of consciousness.
According to Pandit:
The main hypothesis is that the minimum
requirement for satisfactory general
anesthesia is an uncoupling of perceptual
experience from sensory input. The two
model elements requiring elimination for
this minimally adequate state are at least:
(1) the perceptual experience and (2)
stimulus-evoked thoughts. Eliminating
these will (according to model) reduce
attention to the surgical process and so
the formation of explicit memories. With
use of adjuvant drugs, a considerable
proportion of the structures
underpinning consciousness can be
disrupted (Pandit, 2014, p. 202).
Pandit (2014) calls the drug-induced state in which perception and sensory inputs are uncoupled 'dysanaesthesia', and he describes it as a new state between the usual conscious and unconscious states. In such a state, the patient
may respond to commands and other stimuli but
remain unaffected otherwise by pain or surgery.
In his study, surgical patients under anesthesia
were placed in a state of paralysis covering the
entire body, with the exception of the forearm,
which they could use to respond to commands
and otherwise signal wakefulness or perception
of pain. A third of the patients moved their finger
when commanded, although they were under the
usual level of anesthesia prescribed for such
surgery.
The state where the patient is
immobilized and cannot talk, but is still aware,
may be dysanaesthesia. It may also be that the
patient is paralyzed with neuromuscular blocker
agents but has not been administered sufficient
anesthesia to decrease the level of consciousness.
If the identification of the dysanaesthesia state is
true, one can explain it within the framework of
our hierarchical model. When modules for
language comprehension are not accessible to the
consciousness node in Level 3, the subject is able
to comprehend the command to move their
finger if this node acts as a referent.
One could hypothesize that the modules
are accessible, but networks to form memories
are not, which is why there is no recollection of
moving the finger in this state. But since there is
awareness in the consciousness node of Level 3,
we are compelled to postulate that awareness,
without the accompaniment of memories, must
have independent ontological reality. This is
supported by neuroimaging results that show
some parts of the cortex are still functioning in
vegetative patients (Laureys, 2005). Primary
sensory cortices in these patients are activated by
an external painful stimulus, although these areas
are functionally disconnected from the higher
associative areas needed for awareness. Likewise,
patients surviving severe brain injury may regain
consciousness without recovering their ability to
understand and communicate (Rosanova et al.,
2012).
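Purely as a schematic reading of this argument (our own toy encoding, not the authors' formalism), the observed state of a patient can be treated as a function of which modules are accessible to the Level 3 node:

```python
# A toy encoding (our own shorthand, not the authors' formalism) of the
# argument above: the patient's apparent state is read off from which modules
# are currently accessible to the Level 3 consciousness node.
def observable_state(language_accessible: bool, memory_accessible: bool) -> str:
    """Classify the apparent state from module accessibility."""
    if language_accessible and memory_accessible:
        return "responsive, with postoperative recall"
    if language_accessible and not memory_accessible:
        return "responsive, but no recall (dysanaesthesia-like)"
    if not language_accessible and memory_accessible:
        return "unresponsive, but events may be recalled later"
    return "unresponsive, no recall (adequate general anesthesia)"

print(observable_state(language_accessible=True, memory_accessible=False))
```
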
Quantum Models
Certain broad characteristics of the evolution of self-organizing living structures may be
described by complex system theory, but analysis
of other interactions and activity requires
cognitive agents. The biological system functions
at different scales of physical organization and
there exist layers for which a classical representation is appropriate, and others that must be described using not only the electrical flow of information but also electromagnetic fields (Qiu et al., 2015), as well as quantum mechanics.
Although the emergence of these
cognitive agents (in Levels 1 and 2 of our model)
is generally viewed in the context of the
evolutionary history of the biological system that
is grounded in classical macroscopic processes,
we cannot ignore that underlying these
macroscopic processes are physical structures
that have a quantum basis at the deepest level of
description. Previously, the quantum ground was ignored due to the high level of noise in biological systems, which would make it hard for any quantum
state to maintain coherence, but there are many
situations, such as vision and light harvesting,
where quantum models appear appropriate (e.g.,
Pelli, 1985; Anna et al., 2013; Lambert et al.,
2013).
The question of noise in biology becomes
irrelevant when we consider virtual nodes,
whose existence has been assumed in the
proposed model as well as elsewhere (Kak,
1996). Indeed, we need to consider a variety of
objects, including quantum ones, in considering a
general biological system from the perspectives
of attention and high-level consciousness
(Gautam and Kak, 2013). Quantum formalism is
required to deal with fundamental
complementarity in Nature, and it is associated
with nonlocality, entanglement, and coherent
behavior.
Memories may be modeled as nodes. The
state of such a memory node depends on the
activity in all related nodes and, therefore,
memories can never be the same as they are
reconstructed each time. Since memories map
into distributed activity over many neurons, the
memory as a node is virtual and, therefore, it may
be modeled as a quantum object (Kak, 2013).
Memory nodes may interact with each other as
virtual quantum particles and fields that are not
affected by material channels within the brain.
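A toy sketch of this idea (our own, not a formalism from the paper): a memory node is represented as a normalized state vector over related nodes, and every recall re-prepares the state with a small perturbation, so no two reconstructions are identical.

```python
# A toy illustration (our own, not a formal model from the paper) of a memory
# node as a normalized state vector over related nodes: each recall perturbs
# and renormalizes the state, so the memory is reconstructed slightly
# differently every time.
import numpy as np

class MemoryNode:
    def __init__(self, n_related: int, seed: int = 1):
        self.rng = np.random.default_rng(seed)
        amplitudes = self.rng.normal(size=n_related)
        self.state = amplitudes / np.linalg.norm(amplitudes)  # normalized

    def recall(self, noise: float = 0.05) -> np.ndarray:
        """Return the current state, then re-prepare it with a perturbation."""
        readout = self.state.copy()
        self.state = self.state + noise * self.rng.normal(size=self.state.size)
        self.state /= np.linalg.norm(self.state)
        return readout

m = MemoryNode(n_related=8)
first, second = m.recall(), m.recall()
print(np.dot(first, second))   # close to, but not exactly, 1.0
```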
If the higher nodes in the model of Figure
2 follow quantum characteristics, that would be
an explanation for long-range correlations inside
the brain and in behavior. It has been proposed
that the cognitive system, through its evolution,
veils the nonlocal processes of the quantum
interactions amongst consciousness states by
creating a classical narrative for them (Kak, 2014).
Conclusions
The mechanisms associated with consciousness
are often very different both spatially and
temporally and may be associated with different
perceptions. This is so, for example, for the
specific cases of color and visual motion
awareness.
Since there exist multiple
mechanisms of perception, we are led to the view
that there is no single neural correlate of unified
consciousness.

Normal consciousness exists when the
brain is in a state of harmony and is able to
transmit signals throughout the brain. When this
signaling is hindered either by injury or by
anesthesia, the experience changes and in certain
states the subject is witnessing, but uncoupled
from the autobiographical self.
Although awareness by itself is an all-or-nothing phenomenon, the state of consciousness
depends on the degree to which preconscious
and memory states are accessible to awareness.
The uncoupling of perception from sensory
inputs supports the idea of a disembodied
consciousness state. A consciousness state that is
decoupled from specific memories of the
individual indicates that such states have an
ontological reality.
The degree to which memory nodes have
recruited other nodes will have a corresponding
effect on the quality of inner dialog and the
corresponding experience of reality. We
proposed a three-level hierarchical model to
explain these findings, and the entropy associated
with the consciousness states can, in principle, be
estimated by means of experiments based on the
nature of activity patterns.
The model of different hierarchically defined consciousness states that are mediated
by languages and metalanguages with varying
capacity or ability to recruit memories has
explanatory power. In theory, the computed
value of the entropy associated with the
consciousness states will be a scalar projection of
a three-dimensional reality, and it can serve a
purpose in identifying abnormal states. The
highest node in this hierarchical model, which is
located in a non-physical space, represents the
unitary consciousness, but it does not have a
single neural correlate, and, in this sense, it is
consistent with the available evidence.

References
Anderson MC and Green C. Suppressing unwanted memories
by executive control. Nature 2001; 410: 363-369.
Anna JM, Scholes GD and van Grondelle R. A little coherence in photosynthetic light harvesting. BioScience 2013; 64: 14-25.
Bressler SL. Large-scale cortical networks and cognition.
Brain Research Reviews 1995; 20: 288-304.
Brown EN, Lydic R and Schiff ND. General anesthesia, sleep,
and coma. N Engl J Med 2010; 363(27): 2638-2650.
Crick F and Koch C. What is the function of the claustrum?
Phil Trans Royal Soc B 2005; 360: 1271-1279.
Dehaene S, Changeux JP, Naccache L, Sackur J and Sergent C.
Conscious, preconscious, and subliminal processing: a
testable taxonomy. Trends in Cognitive Sciences 2006; 10:
204-211.
Euston DR, Gruber AJ and McNaughton BL. The role of medial
prefrontal cortex in memory and decision making.
Neuron 2012; 76: 1057-1070.
Gautam A and Kak S. Symbols, meaning, and origins of mind.
Biosemiotics 2013; 6: 301-309.
Gilbert CD and Sigman M. Brain states: top-down influences
in sensory processing. Neuron 2007; 54: 677-696.
James W. The Principles of Psychology. Henry Holt, New York,
1890 (Reprinted Bristol: Thoemmes Press, 1999).
Kak S. The three languages of the brain: quantum,
reorganizational, and associative. In Learning as Self-Organization, K. Pribram and J. King (editors). Lawrence
Erlbaum Associates, Mahwah, NJ, 185-219, 1996.
Kak S. Hidden order and the origin of complex structures. In
Origin(s) of Design in Nature. L. Swan, R. Gordon and J.
Seckbach, (editors). Springer, Dordrecht, 643-652, 2012.
Kak S. Biological memories and agents as quantum collectives.
NeuroQuantology 2013; 11: 391-398
Kak S. From the no-signaling theorem to veiled non-locality.
NeuroQuantology 2014; 12: 12-20.
Lambert N, Chen YN, Cheng YC, Li CM, Chen GY and Nori F.
Quantum biology. Nature Physics 2013; 9: 10-18.
Lamme VA. Why visual attention and awareness are different.
Trends Cogn Sci 2003; 7: 12-18.
Landauer R. Irreversibility and heat generation in the
computing process. IBM Journal of Research and
Development 1961; 5(3): 183-191.
Laureys S. The neural correlate of (un)awareness: lessons
from the vegetative state. Trends Cogn Sci 2005; 9: 556-559.
Llinas RR, Ribary U, Jeanmonod D, Kronberg E and Mitra PP.
Thalamocortical dysrhythmia: A neurological and neuropsychiatric syndrome characterized by magnetoencephalography. Proc Natl Acad Sci USA 1999; 96(26): 15222-15227.
Marois R and Ivanoff J. Capacity limits of information
processing in the brain. Trends Cogn Sci 2005; 9: 296-305.
Matsuzawa T. Evolution of the brain and social behavior in
chimpanzees. Current Opinion in Neurobiology 2013; 23: 443-449.
Pandit JJ. Acceptably aware during general anaesthesia:
'dysanaesthesia'--the uncoupling of perception from
sensory inputs. Conscious Cogn 2014; 27: 194-212.
Park HJ and Friston K. Structural and functional brain
networks: from connections to cognition. Science 2013;
342: 1238411.
Paus T. Functional anatomy of arousal and attention systems
in the human brain. Prog Brain Res 2000; 126: 65-77.
Pelli DG. Uncertainty explains many aspects of visual contrast detection and discrimination. J Opt Soc Am 1985; 2: 1508-32.
Qiu C, Shivacharan RS, Zhang M and Durand DM. Can Neural
Activity Propagate by Endogenous Electrical Field? Journal
of Neuroscience 2015; 35(48): 15800
Rosanova M, Gosseries O, Casarotto S, Boly M, Casali AG,
Bruno MA, Mariotti M, Boveroux P, Tononi G, Laureys S
and Massimini M. Recovery of cortical effective
connectivity and recovery of consciousness in vegetative
patients. Brain 2012; 135: 1308-1320.
Tononi G and Koch C. Consciousness: Here, there and
everywhere? Phil Trans R Soc B 2015; 370: 20140167.
Vago DR and Silbersweig DA. Self-awareness, self-regulation,
and self-transcendence (S-ART): a framework for
understanding the neurobiological mechanisms of
mindfulness. Front Hum Neurosci 2012; 6: 296.
van den Heuvel MP and Sporns O. Network hubs in the
human brain. Trends Cogn Sci 2013; 17: 683-696.
Zeki S. et al. A direct demonstration of functional
specialization in human visual cortex. The Journal of
Neuroscience 1991; 11: 641-649.
Zeki S. The disunity of consciousness. Trends Cogn Sci 2003; 7: 214-218.
