
Mental Architecture and Basic Psychological Processes: Week 2

David J. Lobina

FACULTY OF PHILOSOPHY
UNIVERSITY OF OXFORD
&
DEPARTMENT OF PSYCHOLOGY
UNIVERSITY ROVIRA I VIRGILI

Barcelona, 24 Feb 2016


Outline

1 Recap Week 1

2 Representational Theory of Mind

3 Computational Theory of Mind


Behaviourism

• Behaviourism —prediction of dispositions to behave


• Part Metaphysics: no irreducible mental entities
• Part Methodology: psychology should be scientific

• Stimulus-response (probabilistic) associations as the theoretical unit
• Conditioning theory of learning (classical and operant versions)
Problems for Behaviourism

• Dogmatic restrictions on the domain of psychology —the mental states mediating S-R pairs
• Novelty —linguistic behaviour
• Stimulus independence —of both language and cognition
tout court
Cognitive Psychology

• Need to postulate internal structure —in terms of both mental states and mental processes
• Inference of internal structure from observable behaviour

• Varieties of Functionalism:
• Functional analysis: decomposition of a cognitive
phenomenon into (atomic) parts
• Machine functionalism: mental states akin to machine
states of, e.g., a Turing Machine
• Metaphysical functionalism: mental states identified by the causal relations they enter into
Levels of Explanation

• Dennett (1969)
• Personal level accounts: intentional terminology (beliefs,
desires, etc.)
• Subpersonal level: physics, cognitive psychology

• Chomsky’s dichotomy
• Competence: what a given system is and does
• Performance: how the system is put to use
(Chomsky & Miller, 1963; Chomsky, 1963; Miller &
Chomsky, 1963)
Marr & Poggio (1976)

• Computational level —input-output function


• Algorithmic level —how the function is mechanically
implemented (a computational process)
• Level of mechanisms —memory, attention, control
operations the algorithm employs
• Implementational level —neural substrate
Marr & Poggio (1976)

• Computational level —input-output function
• Algorithmic level —how the function is mechanically implemented (a computational process)
• Level of mechanisms —memory, attention, control
operations the algorithm employs
• Implementational level —neural substrate
(Dennett, 1987; Newell, 1980; Pylyshyn, 1980, 1984)
Modifications and Developments

• Theory of the computation (ToC) à la Chomsky
• function in intension —a “mechanical procedure” mediating inputs and outputs
• rather than a function in extension —sets of input-output pairs (see the sketch after this slide)
• How ToC is implemented —levels of algorithm and of the
mechanism
• constitute the “functional/cognitive architecture” —basic
processing mechanisms
• Two possible explanations for cognitive patterns: either
properties of the architecture or knowledge-based (content
of mental states)
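
A minimal sketch of the intension/extension contrast, in Python (the function names and the example are mine, purely illustrative): two different procedures, i.e. two functions in intension, compute one and the same function in extension, here multiplication over the naturals.

```python
# Two distinct "functions in intension" (mechanical procedures)
# computing the same "function in extension" (set of
# input-output pairs): multiplication over the naturals.

def mult_repeated_addition(m: int, n: int) -> int:
    """Multiply by adding m to a running total, n times."""
    total = 0
    for _ in range(n):
        total += m
    return total

def mult_shift_and_add(m: int, n: int) -> int:
    """Multiply via binary shift-and-add (a different procedure)."""
    total = 0
    while n > 0:
        if n & 1:          # if the lowest bit of n is set...
            total += m     # ...add the current shifted m
        m <<= 1            # double m
        n >>= 1            # halve n
    return total

# Same extension: identical input-output pairs on every input.
assert all(mult_repeated_addition(a, b) == mult_shift_and_add(a, b)
           for a in range(20) for b in range(20))
```

The two procedures agree on every input-output pair, yet differ in the operations and the number of steps they take; it is at the algorithmic level, not the computational level, that they come apart.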
The explanandum of cognitive psychology

• Behaviour actually caused by inner happenings of some kind
• The nature of mental entities —Representational Theory of
Mind (RTM)
• The nature of the means according to which these entities
interrelate —Computational Theory of Mind (CTM)
RTM

• Common ground: any theory that postulates semantically evaluable (mental) objects
• Proposition/Content (Peacocke, 1986), objects of Prop.
Att. (beliefs, desires, etc.; PA)
• Belief/desire/fear/think/imagine that the cat is on the mat
• Belief/desire/etc. that P def. as mental states (MS)
• Propositions def. as mental representations (MRs)
• MSs have intentionality (are about/refer to things) in
terms of the semantic properties of the MRs
MRs

• Are structured —have a predicate-subject configuration (logical form)
• Fa, where F is a predicate and a a term (or object)
• Must be accurate, consistent, appropriate, and veridical
• Relationship between the represented system and the
representing system
Fodor’s RTM

• Mental states distinguished by content, and by their relations to MRs
• PAs as structural relations to MRs —as two-place relations
• PAs interact causally and do so in virtue of their content
• These interactions constitute mental processes, which eventuate in behaviour (they are efficacious); a toy rendering follows below
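
A toy rendering of the Fodorian picture, in Python (the class and field names are my own, chosen for exposition, not Fodor's): an MR is a structured object of the form Fa, and a PA is a two-place relation between an attitude type and such an MR.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MentalRepresentation:
    """A structured representation with logical form Fa:
    a predicate F applied to a term a."""
    predicate: str   # F, e.g. "is-on-the-mat"
    term: str        # a, e.g. "the-cat"

@dataclass(frozen=True)
class PropositionalAttitude:
    """A two-place relation: an attitude type borne to an MR."""
    attitude: str                  # e.g. "belief", "desire", "fear"
    content: MentalRepresentation  # the MR the attitude is a relation to

# The same MR can figure in different attitudes: the attitudes
# differ in the relation, not in the content.
cat_on_mat = MentalRepresentation("is-on-the-mat", "the-cat")
belief = PropositionalAttitude("belief", cat_on_mat)
fear = PropositionalAttitude("fear", cat_on_mat)
assert belief.content == fear.content and belief != fear
```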
What are these interactions?

The acts of the mind, wherein it exerts its power over simple
ideas, are chiefly these three: 1. Combining several simple ideas
into one compound one, and thus all complex ideas are made.
2. The second is bringing two ideas, whether simple or complex,
together, and setting them by one another so as to take a view
of them at once, without uniting them into one, by which it gets
all its ideas of relations. 3. The third is separating them from
all other ideas that accompany them in their real existence: this
is called abstraction, and thus all its general ideas are made.
John Locke. An Essay Concerning Human Understanding.
Book II, Chapter XII, 1690.
Admonitions

• Locke and other Empiricists posited an associationist process (a line that leads to behaviourism)
• Locke and co. focused on the representations of the senses; their Ideas were images
• Conceptual representations (no phenomenal features)
• Nonconceptual (sensations, etc.)

• Hobbes, Leibniz, Babbage —mental processes as a ‘calculation’ (ratiocination)
What, then?

In the twentieth century, computation was understood in intuitive terms as a ‘process whereby we proceed from initially given objects, called inputs, according to a fixed set of rules, called a program, procedure, or algorithm, through a series of steps and arrive at the end of these steps with a final result, called an output’ (Soare, 1996, p. 286)
The Most Amazing Fact of 1936

• Computable Functions, equivalently characterised by:
• Untyped lambda calculus (Church)
• General recursive functions (Church)
• Turing machine (Turing)
• Production systems (Post)
• ...

Different intensions, different purposes

• Church’s formalism as a hypothesis on the computable functions (sketched below)
• Turing outlined a mechanical process in motion
• Post’s system generates and enumerates sets
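
As a concrete point of contrast (a toy of mine, in Python): the Church-style route defines arithmetic as lambda terms over Church numerals, with no tape or machinery in motion.

```python
# Church numerals: the numeral n is the function that applies f n times.
zero = lambda f: lambda x: x
succ = lambda n: (lambda f: lambda x: f(n(f)(x)))

def to_int(n) -> int:
    """Decode a Church numeral by counting applications."""
    return n(lambda k: k + 1)(0)

three = succ(succ(succ(zero)))
assert to_int(three) == 3

# Addition as a lambda term: m applications of f after n applications.
add = lambda m: lambda n: (lambda f: lambda x: m(f)(n(f)(x)))
assert to_int(add(three)(succ(zero))) == 4
```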
Turing (1936)

• Machine can write/erase/scan symbols on a tape, which is divided into squares
• Prints binary-digit symbols (0, 1); behaviour determined by the symbol being scanned
• Sequences of symbols constitute the state of the system
• Turing (1950) identifies three parts of a digital computing device: a store, an executive unit, and a control unit (a minimal sketch follows)
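
A minimal Turing machine sketch in Python, assuming an illustrative transition table of my own (it increments a binary number); each entry has the shape of a Turing configuration: (state, scanned symbol) maps to (symbol to write, head shift, next state).

```python
from collections import defaultdict

# Transition table, e.g. ("carry", "1") -> ("0", -1, "carry"),
# analogous to a configuration (S1, Write, Shift, S2).
TABLE = {
    # Scan right to the end of the number.
    ("right", "0"): ("0", +1, "right"),
    ("right", "1"): ("1", +1, "right"),
    ("right", " "): (" ", -1, "carry"),
    # Add one, propagating the carry leftwards.
    ("carry", "1"): ("0", -1, "carry"),
    ("carry", "0"): ("1", 0, "halt"),
    ("carry", " "): ("1", 0, "halt"),
}

def run(tape_str: str, state: str = "right", pos: int = 0) -> str:
    """Execute TABLE on a tape of squares until the machine halts."""
    tape = defaultdict(lambda: " ", enumerate(tape_str))  # blank squares beyond input
    while state != "halt":
        write, shift, state = TABLE[(state, tape[pos])]
        tape[pos] = write   # write/erase the scanned square
        pos += shift        # move the head
    cells = [tape[i] for i in range(min(tape), max(tape) + 1)]
    return "".join(cells).strip()

assert run("1011") == "1100"   # 11 + 1 = 12
assert run("111") == "1000"    # 7 + 1 = 8
```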
Syntactic and Semantic Properties

• Syntax:
• A transition table lists and orders operations —e.g., a configuration such as (S1, Write, Shift to the Right, S2)
• The difference between 00110 and 00111 is physical and therefore “formal”; but, at the same time…
• Semantics:
• Look-up table establishes the interpretation of each symbol
(and groups of symbols)
• Specific sequences can stand for basic arithmetic operations (addition, subtraction, multiplication, etc.); see the sketch below
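
A sketch of the syntax/semantics split (the bit-string encoding is my own arbitrary choice): the formal operations distinguish 00110 from 00111 purely by shape, while a separate look-up table assigns each string its arithmetic interpretation.

```python
# Semantics: a look-up table (arbitrary, for illustration) assigns each
# bit string an arithmetic interpretation; the formal operations below
# never consult it.
INTERPRETATION = {
    "00110": lambda x, y: x + y,   # this string stands for addition
    "00111": lambda x, y: x - y,   # this one for subtraction
}

def formally_distinct(s: str, t: str) -> bool:
    """Syntax: a purely formal (shape-based) comparison, no meaning involved."""
    return s != t

assert formally_distinct("00110", "00111")   # differ in form...
assert INTERPRETATION["00110"](7, 5) == 12   # ...and in what they stand for
assert INTERPRETATION["00111"](7, 5) == 2
```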
Lessons for Cognitive Science

• Fodor: Mental states and processes are computational
• PAs are computational relations to MRs (propositions)
• Computations are symbolic (defined over MRs) and formal
(no access to semantic properties)
• Different content means a different form and therefore
different behaviour
• Gallistel & King (2009)
• Mental symbols are isomorphic to aspects of the environment
• Symbols must be distinguishable, constructible, and efficacious
• Distinction between a procedure and a process
Modifications

• Different lives of MRs:
• Semantic/Knowledge level: content
• Symbolic/Syntactic level: code or form

• Cognitive processes as sequences of symbolic expressions (different MRs for different domains),
• Namely, the occurrence, transformation, storage, and
retrieval of MRs
• Structurally, and in comparison to a TM:
• parallel instead of serial processing, nondeterministic,
memory must be addressable
• distinction between machine and program (rules of operations); see the sketch below
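
One way to render the machine/program distinction (my own toy, in Python): a single fixed interpreter, the "machine", executes whatever rule table, the "program", it is handed; swapping programs changes behaviour without touching the machine.

```python
# The "machine": a fixed interpreter over (state, input) -> (state, output) rules.
def machine(program: dict, inputs: list, state: str = "start") -> list:
    outputs = []
    for symbol in inputs:
        state, out = program[(state, symbol)]  # apply the program's rule
        outputs.append(out)
    return outputs

# Two "programs" (rules of operations) for the same machine.
echo = {("start", "a"): ("start", "a"), ("start", "b"): ("start", "b")}
swap = {("start", "a"): ("start", "b"), ("start", "b"): ("start", "a")}

assert machine(echo, ["a", "b"]) == ["a", "b"]
assert machine(swap, ["a", "b"]) == ["b", "a"]
```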
A Corollary, and an Example

• From CTM to RTM or from RTM to CTM?

• Shannon’s (1948) information-theoretic account of communication
• Information as binary choices —the amount of information needed to choose between two alternatives is one bit
• adapted to cognitive psychology (information-processing);
hence, information-bearing symbols
• Miller (1956) on absolute judgement, immediate memory,
and span of attention
• Pitch (2.5 bits), Loudness (2.3), Position of a dot in a
square (4.6); Immediate memory (7 items); Subitizing (6
digits at a glance)
• “grouping” stimuli increases capacity —sounds recoded into letters, letters into chunks, then into words, phrases, etc. (see the sketch below)
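
A back-of-the-envelope calculator for the figures above (the formulas are standard information theory; the specific capacities are Miller's): n equally likely alternatives carry log2(n) bits, so a capacity of k bits distinguishes about 2^k alternatives.

```python
import math

def bits(n_alternatives: int) -> float:
    """Information needed to choose among n equally likely alternatives."""
    return math.log2(n_alternatives)

def alternatives(capacity_bits: float) -> float:
    """How many alternatives a channel of the given capacity distinguishes."""
    return 2 ** capacity_bits

assert bits(2) == 1.0                   # two alternatives = one bit
print(round(alternatives(2.5), 1))      # pitch: ~5.7 distinguishable tones
print(round(alternatives(4.6), 1))      # dot in a square: ~24.3 positions

# Grouping/recoding: 7 slots of letters (~4.7 bits each) carry far
# more information than 7 slots of binary digits (1 bit each).
print(7 * bits(2))    # 7.0 bits
print(7 * bits(26))   # ~32.9 bits
```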
An Aside

• Linguistic representations
• Bundles of lexical features (phonological/phonetic,
syntactic, maybe semantic): e.g., velar; wh-feature, etc.
• Don’t refer to anything out there, are not veridical
• Should be treated as algebra (that is, as variables), as
inputs and outputs to specific derivations
• Linguistic computations
• Not modelled with a TM, but (initially) with Post’s production systems: g → h, a substitution rule (see the sketch below)
• Abstract characterisations of well-formedness conditions
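
A sketch of a Post-style production system in Python (the toy grammar is mine, not one from the lecture): rules of the form g → h license substituting h for g, and repeated application derives strings satisfying the well-formedness conditions the rules encode.

```python
import random

# Toy substitution rules of the form g -> h, generating simple
# Subject-Verb-Object strings.
RULES = {
    "S": ["NP VP"],
    "NP": ["the cat", "the mat"],
    "VP": ["V NP"],
    "V": ["is on"],
}

def derive(string: str = "S", rng: random.Random = random.Random(0)) -> str:
    """Apply substitution rules until no nonterminal remains."""
    while True:
        for g, expansions in RULES.items():
            if g in string.split():
                h = rng.choice(expansions)        # pick a rule g -> h
                string = string.replace(g, h, 1)  # substitute one occurrence
                break
        else:
            return string                          # no rule applies: done

print(derive())   # e.g. "the cat is on the mat"
```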
Bibliography I

Chomsky, N. (1963). Formal properties of grammars. In R. D. Luce, R. R. Bush & E. Galanter (Eds.), Handbook of mathematical psychology, vol. 2 (pp. 323-418). John Wiley and Sons, Inc.
Chomsky, N. & Miller, G. A. (1963). Introduction to the formal
analysis of natural languages. In R. D. Luce, R. R. Bush &
E. Galanter (Eds.), Handbook of mathematical psychology,
vol. 2 (p. 269-322). John Wiley and Sons, Inc.
Dennett, D. C. (1969). Content and consciousness. London,
England: Routledge.
Dennett, D. C. (1987). The intentional stance. Cambridge,
MA: The MIT Press.
Gallistel, C. R. & King, A. P. (2009). Memory and the
computational brain. Malden, MA: Wiley-Blackwell.
Bibliography II

Marr, D. & Poggio, T. (1976). From understanding


computation to understanding neural circuitry. MIT AI Lab,
memo 357, 1-22.
Miller, G. A. (1956). The magical number seven, plus or minus
two: some limits on our capacity for processing information.
Psychological Review, 63(2), 81-97.
Miller, G. A. & Chomsky, N. (1963). Finitary models of
language users. In R. D. Luce, R. R. Bush & E. Galanter
(Eds.), Handbook of mathematical psychology, vol. 2
(p. 419-492). John Wiley and sons, Inc.
Newell, A. (1980). The knowledge level. Artificial Intelligence,
18, 81-132.
Peacocke, C. (1986). Thoughts: An essay on content. Oxford,
England: Basil Blackwell Publisher Ltd.
Bibliography III

Pylyshyn, Z. W. (1980). Computation and cognition: issues in the foundations of cognitive science. The Behavioral and Brain Sciences, 3, 111-169.
Pylyshyn, Z. W. (1984). Computation and cognition.
Cambridge, MA: The MIT Press.
Soare, R. (1996). Computability and recursion. The Bulletin of
Symbolic Logic, 2(3), 284-321.
Turing, A. M. (1936). On computable numbers, with an application to the Entscheidungsproblem. In M. Davis (Ed.), The undecidable (pp. 115-153). Dover Publications, Inc.
Turing, A. M. (1950). Computing machinery and intelligence. Mind, 59, 433-460.
