
Introduction to Philosophical Thought Fall 2016

Philosophy 150 Dr. Melkonian

STUDY GUIDE
Unit I, The Mind-Body Problem
Unit II, Can Non-Humans Think?

The final exam will cover all material presented in classroom lectures and discussions, as
well as handouts through the eighth week, plus the following readings:

Gertler, “In Defense of Mind-Body Dualism” (naturalistic dualism)
Jackson, “The Qualia Problem” (dualism; epiphenomenalism)
Papineau, “The Case for Materialism” (mind-brain identity theory)
Churchland, “Functionalism and Eliminative Materialism”
Turing, “Computing Machinery and Intelligence”
Searle, “Minds, Brains, and Programs” (against “strong” functionalism)
Lycan, “Robots and Minds”

Introduction:

philia-sophia (“love of wisdom”)
Philosophy begins in wonder.

Philosophy is the attempt to see how things, in the broadest possible sense of the term,
hang together, in the broadest possible sense of the term.

Philosophy is concerned with a synoptic view. (Synoptic: having to do with a general view of the whole.)

An argument is a sequence of statements intended to establish some claim.


The claim the argument is intended to establish is the conclusion.
The statements intended to support the conclusion are the premises.

Metaphysical questions are questions about the nature of things as general as reality, life,
death, immortality, the soul, and personhood. Metaphysical categories that we have discussed
include matter, mind, substance, qualities, and ideas.

Unit I: The Mind-Body Problem:

mental events/physical events (examples?)


qualia (examples?)
privileged access
res cogitans/res extensa (review)
intentionality (“aboutness”; the capacity of
some things to represent other things)
etc.

What is Philosophy of Mind? Name two questions that Philosophy of Mind attempts to
answer.

The Mind-Body Problem: What is the mind and how does it differ from the body, if at all? How
does one account for the relation (if any!) between the mind and the body?

Positions in Philosophy of Mind:

dualism*
interactionism (Rene Descartes)
parallelism
occasionalism
epiphenomenalism (Frank Jackson)

materialism
behaviorism* (B.F. Skinner)
reductive materialism, or mind-brain identity theory* (David Papineau)
eliminative materialism* (Paul Churchland)
functionalism* (William Lycan)

Dualism is the view that there are two basic kinds of stuff, two substances: thinking
substance (or mind) and extended substance (or matter). René Descartes is the most famous
dualist philosopher. His picture of the relationship between the mind and the body: res
cogitans/res extensa. The mind has no extension, but it does have a location: the pineal gland. But
how do we account for mind-body interaction? There are other versions of dualism in philosophy of
mind, too, including interactionism, epiphenomenalism (Frank Jackson), parallelism (Leibniz),
and occasionalism (Malebranche).

Idealism (or immaterialism) in metaphysics is the view that there is only one substance,
mind. Ideas are what minds “have.” All sensible objects are ideas.

Materialism (or physicalism) in metaphysics is the view that everything is a material
object, state, event, or process and nothing else: “the view that sensations and other mental states
are entirely physical” (Gertler in Feinberg, p. 360a). There is no immaterial mind or soul.

Eliminative materialism is the view that a mature neuroscience probably will establish
that our commonsense psychological conceptions of the causes of human behavior and the nature
of cognitive activity (“folk psychology”) are so radically false that humans simply do not undergo
mental states such as beliefs, desires, and intentions. Folk psychology might turn out, in
retrospect, to resemble stories about Santa and the tooth fairy.

What is the distinction between a brain state and a mental state, according to: a.) a
dualist; b.) a materialist? Give examples of both.
What is an advantage of each of these positions?
What is a possible objection or drawback to each of these positions?

Brie Gertler (a dualist) defends a view she calls naturalistic dualism, arguing that, in
feeling pain, we know the essence of pain. (Gertler defends her self-description as a naturalist by
saying that, although science studies nothing supernatural, still nonphysical events could in
principle be objects of scientific study. Objection: But if we hold that naturalism can admit
nonphysical states into the natural order, don’t we thereby gut the term of substantive meaning?)
The view she argues against in her paper is mind-brain identity theory: she believes that
mental states are not identical to physical states: pain is not just stimulation of C-fibers. Thus,
there would appear to be (at least) two very distinct sorts of events or states: physical events or
states, and mental events or states.
“I will defend dualism by arguing that it is possible that you experience pain even if you
are in no physical state, that is, even if you have no body whatsoever.” (361b) Review Gertler’s
Disembodiment Argument (363a). “If our argument succeeds in showing that pain can be present
in the absence of any physical state, we will have established dualism, for we will have shown
that pain is not identical to anything physical, and thus that at least some mental states […] are
not physical.” Here is the argument, schematically:

1. I can “conceive of” (imagine) experiencing this very pain while possessing no
physical features; while being disembodied.

2*. If, using concepts that are sufficiently comprehensive, I can conceive of a particular
scenario occurring, then that scenario is possible.

3. So, it is possible that this very pain occurs in a disembodied being.

4. If this very pain were identical to some physical state, then it could not possibly occur
in a disembodied being.1

5. So, this very pain is not identical to any physical state.

6. Conclusion: Therefore, the identity thesis, which says that every mental state is
identical to some physical state, is false.

Possible objections: what about mind-body interaction? Isn’t dualism spooky?

Frank Jackson (a dualist): Here is his argument: “Nothing you could tell of a physical
sort captures the smell of a rose, for instance. Therefore, Physicalism is false.” Review
Jackson’s Knowledge Argument (373a).
The smell of a rose is an example of a quale (plural: qualia). Qualia are the qualitative
“feel” of experiences; the inexpressibly private “innerness” of mental states; the what-it-is-like
aspect of experience. According to Jackson, they are caused by physical states, notably brain
states, but they do not cause physical states or mental states; indeed, they do not do anything.
Qualia are what we have privileged access to. A quale is information of a sort, but it is not
physical information. Qualia are epiphenomena.
Who is Jackson’s figure Fred? What role does he play in Jackson’s Knowledge
Argument for qualia? What is Jackson’s tomato-sorting thought experiment, the one involving
red1 and red2?
Who is Jackson’s figure, Mary? What is she supposed to illustrate about qualia?
Can you give an example of a mental event seeming to cause a physical event? What
might Frank Jackson say about this?

Footnote 1: If A is identical to B, then it is impossible for (“inconceivable that”? “impossible to
imagine that”?) something to be an A without being a B. So (5) follows by modus tollens from (4) and (3).
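
To make the modus tollens step explicit (a schematic sketch of my own, not notation from the
handout), let $I$ = “this very pain is identical to some physical state” and $\Diamond D$ = “it is
possible that this very pain occurs in a disembodied being”:

\[
\begin{aligned}
(4)&\quad I \rightarrow \neg\Diamond D\\
(3)&\quad \Diamond D\\
(5)&\quad \therefore\ \neg I \qquad \text{(modus tollens)}
\end{aligned}
\]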

In response to a possible objection to epiphenomenalism, Jackson speculates that having
qualia might be a by-product of an evolved characteristic that is conducive to survival.

Behaviorism: Talk about emotions, sensations, beliefs and desires is a shorthand way of
talking about actual and potential patterns of behavior. (Ryle)

Environmental input → black box → behavioral output.

Behaviorists hold that statements containing expressions about mental states and events
(hopes, fears, dreams, imaginations, desires, volitions, and so on) have the same meaning as (and
are translatable into) some set of publicly verifiable, testable, statements describing behavioral
and bodily processes and dispositions. “I’m thirsty” means “I am disposed to get a beer from the
fridge.”

Operational definition (for example, an operational definition of “soluble”: Something is soluble
if, were it to be immersed in unsaturated water, it would dissolve. What is a problem with coming
up with an operational definition of an intentional state, such as wanting a Hawaiian vacation?
Hint: most mental states are multitracked dispositions.)
Behaviorists are commonly criticized for ignoring or denying qualia.

David Papineau is a reductive materialist, or a mind-brain identity theorist. According to
mind-brain identity theory, mental states are physical states: each type of mental state or process
is one and the very same thing as some type of physical state or process within the brain or
central nervous system.

Papineau vocabulary: mental property/physical property; the causal argument (for mind-
body identity/materialism); the Completeness of Physics assumption; over-determination
of physical effects of mental states; the “belts and braces” account of mental and physical
causation; causal danglers.

Here is Papineau’s Causal Argument for Materialism:

P1: Mental states and events have physical effects.
P2: All physical effects are caused by purely physical prior histories.
P3: The physical effects of mental events are not always overdetermined (caused
twice over).
C: Therefore, the mental events mentioned in P1 must be identical to some part of the
physical causes mentioned in P2.
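
One schematic way to display the structure (a sketch of my own, not Papineau’s notation): take any
mental event $m$ with a physical effect $e$.

\[
\begin{aligned}
\text{P1:}&\quad m \text{ causes } e\\
\text{P2:}&\quad \text{some purely physical cause } p \text{ suffices for } e \quad (\text{completeness of physics})\\
\text{P3:}&\quad e \text{ is not caused twice over by distinct sufficient causes}\\
\text{C:}&\quad m = p,\ \text{or } m \text{ is part of } p
\end{aligned}
\]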

Note: P2 above is known as the “completeness of physics” assumption. Can you rephrase it in
your own words?
Note that Papineau’s Causal Argument is not aimed against epiphenomenalism. Can you
explain why this is so?
What does Papineau mean by “overdetermined” in P3 above?
Explain intertheoretic reduction from categories of “Folk Psychology” to neuroscience.
Papineau, too, faces the missing qualia problem.

Paul Churchland is an eliminative materialist. “Eliminative materialism is the thesis that our
commonsense conception of psychological phenomena constitutes a radically false theory, a
theory so fundamentally defective that both the principles and the ontology of that theory will
eventually be displaced, rather than smoothly reduced, by completed neuroscience.”
He devotes much of his discussion to a critical examination of functionalism.
Functionalism in Philosophy of Mind was an attempt to address some serious objections
to Behaviorism, while retaining some insights of behaviorism. Functionalists see themselves as
combining the best features of physicalism and behaviorism, without the drawbacks of either.
According to functionalism (or token-identity theory), the defining feature of any type of
mental state is the set of causal relations it bears to: (1) environmental effects on the body; (2)
other types of mental states, and (3) bodily behavior.
What are Churchland’s objections to functionalism?

Churchland vocabulary: “folk psychology”; multiple realizability; functional isomorphism.

Churchland describes the functionalists’ emphasis: “What is important for mentality is not the
matter of which the creature is made, but the structure of the internal activities which that matter
sustains.” (Paul Churchland, in Feinberg, ed., 383b)
Multiple realizability, in the philosophy of mind, is the thesis that the same mental
property, state, or event can be implemented by different physical properties, states or events.
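
A programming analogy may help fix the idea (an illustration of mine, not from the readings; all
names in it are hypothetical): the same functional role is realized below by two physically
different systems, just as multiple realizability says one type of mental state can be implemented
by different types of physical states.

# Illustrative sketch only: one functional role, two different realizations.
class NeuralSystem:
    """Realizes the role in carbon, e.g., C-fiber firing."""
    def respond(self, tissue_damage: bool) -> str:
        return "avoidance behavior" if tissue_damage else "rest"

class SiliconController:
    """Realizes the same role in silicon, e.g., a program state."""
    def respond(self, tissue_damage: bool) -> str:
        return "avoidance behavior" if tissue_damage else "rest"

def plays_pain_role(system) -> bool:
    # Functionalism individuates the state by its causal profile
    # (inputs and outputs), not by what the system is made of.
    return system.respond(True) == "avoidance behavior"

print(plays_pain_role(NeuralSystem()))       # True
print(plays_pain_role(SiliconController()))  # True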

Unit II: Can Non-Humans Think?

Alan Turing wants to address the question, Can machines think? How are we to
determine the answer to this question? (“The original question, ‘Can machines think?’ I believe
to be too meaningless to deserve discussion.”)
Turing vocabulary: discrete state machines (such as, approximately, a digital computer);
the imitation game; the Turing test (a version of the imitation game); the learning
machine.

Describe the imitation game. Describe a Turing machine. What is it supposed to show?
Turing advocates a position that we today would call Strong Artificial Intelligence (SAI).
This is the view that appropriately programming a machine is sufficient for giving it intentional
states. According to SAI, computers could literally be said to understand and have other
cognitive states (desire; intentional states). “In strong AI, because the programmed computer has
cognitive states, the programs are not mere tools that enable us to test psychological explanations;
rather, the programs are themselves the explanations.”
Notice that SAI is a version of functionalism.
The Computational Model of the Mind (accepted by Functionalists):

environmental input → processor & program → behavioral output
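
A minimal sketch of that picture (my illustration, not from the readings; the states and strings
are hypothetical): environmental input updates internal program states, and behavior is produced
as a function of both, echoing the earlier “I’m thirsty” example.

# Toy model: (environmental input, internal states) -> (behavioral output, new states).
def step(internal_state: dict, environmental_input: str):
    new_state = dict(internal_state)
    if environmental_input == "dry mouth":
        new_state["thirsty"] = True          # input brings about an internal state
    if new_state.get("thirsty"):
        new_state["thirsty"] = False
        return "go get a drink", new_state   # internal state brings about behavior
    return "carry on", new_state

behavior, state = step({}, "dry mouth")
print(behavior)  # go get a drink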

So Turing replaces the question Can machines think? with the question: Are there
imaginable digital computers that would do well in the imitation game? He believes that a
properly programmed computer could think, could understand.

Turing addresses six objections to his view (pp. 299-303). Review these objections.

John Searle presents an argument against SAI. It is also an attack on functionalism. He rejects
two related views of SAI: (1) the view that the appropriately programmed computer is a mind,
capable of cognitive states (or mental states); (2) the view that programs are (or could be)
psychological explanations, that is, explanations of cognitive states.

Searle vocabulary: Strong AI; Weak AI; syntactical (grammatical) individuation; semantic
(intentional) individuation; non-reductive physicalism.

Describe Searle’s Chinese Room Argument. Why is this an argument against Strong AI?
Searle points out that one cannot say that understanding is taking place in the Chinese
Room: the person in the room cannot be said to understand the story in Chinese. (This becomes
clear when we compare the two routines, one in Chinese and English, and the other in only
English.) This is true for any such algorithms because they are syntactically individuated (on the
basis of “grammar”—rules for correlating symbols) but intentional states such as understanding
are semantically individuated—words have meanings based on what they “refer to,” what they
are “about.”
Searle subscribes to non-reductive physicalism, which holds that high-level features and
processes are not necessarily reducible to lower-level features and processes; mental states are
physical phenomena and results of material processes; nevertheless, they cannot be neatly
reduced to brain states.

William Lycan disagrees with Searle’s view of the computational model. He defends
functionalism. In “Robots and Minds,” he makes the case that digital computers, or machines
like digital computers that conform to the computational model are, at least potentially, capable
of intelligence.

Lycan vocabulary: the interaction problem; intelligence as a certain sort of flexibility;
contingency; information sensitivity; the computational picture of mentality; the computer model
of the mind; soft determinism.

Lycan tentatively defines intelligence as responsiveness to contingencies, a kind of flexibility.
(Contingencies are unforeseen events, whether mental events or non-mental events.) Lycan holds
that this responsiveness can be described as information-sensitivity. Computers, like minds, are
information-sensitive; they too register, store, manage, and use information.

What did Lycan mean when he said that computers have intelligence (in the sense of
information-sensitivity), but computers have no judgment?

Who (or what) is Lycan’s imaginary friend Harry?


Who (or what) is Lycan’s imaginary friend Henrietta? How is she used to support the
functionalist position on personal identity?

Etc.
