Wednesday, June 23, 2010

The Explanatory Inadequacy of the Identity-Theory of Mind

As I see it, there are two key features of consciousness that the identity theory of mind fails to explain adequately (the identity theory being the thesis that mental events, properties, and the like just are brain events, properties, and so on). Those two features are (1) intentionality and (2) qualia. Let's first address intentionality.

The word "intentionality" is a bit misleading in that it doesn't, strictly speaking, refer to intentions (although intentions are one kind of intentional state). "Intentionality" just means directedness or aboutness. So, for example, my thinking about a train chugging down the railroad is an intentional state, because it is about something; namely, a train chugging down the railroad. I used the word "directedness" as well because of what philosophers like to call "propositional attitudes". Propositional attitudes are mental attitudes we take towards particular propositions. So, for example, given the proposition "The Patriots will win the Super Bowl", I can believe that proposition, hope for its truth, despise it, and so forth. All of these states, or attitudes (namely, believing, hoping, and despising), are intentional in that they are world-directed, or have a certain "aboutness" to them.

So, how does this relate to the adequacy, or lack thereof, of the identity theory of mind? Well, it doesn't seem that any given brain state could be about anything. Brain states are just physical objects and/or events, but physical objects and events aren't about anything; they just are. How is it that a synapse firing in my brain ends up being about the Super Bowl, my aunt, how much stock I have in a company, or anything else? It just doesn't seem to be that sort of thing. So, as a simple syllogism, this could be expressed as follows (in the form Celarent):

(1) No physical events are about something.
(2) Brain events are physical events.
(3) :. No brain events are about something.
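
For the formally inclined, here is a minimal sketch of that inference in Lean 4. The names Event, Physical, Brain, and AboutSomething are placeholders I've picked purely for illustration; the point is only that the Celarent form is valid, so everything hangs on premise (1).

    -- Celarent, sketched in Lean 4. Physical, Brain, and AboutSomething are
    -- placeholder predicates over a generic type of events.
    variable {Event : Type}
    variable (Physical Brain AboutSomething : Event → Prop)

    example
        (p1 : ∀ e, Physical e → ¬ AboutSomething e)  -- (1) no physical event is about anything
        (p2 : ∀ e, Brain e → Physical e)             -- (2) brain events are physical events
        : ∀ e, Brain e → ¬ AboutSomething e :=       -- (3) so no brain event is about anything
      fun e hBrain => p1 e (p2 e hBrain)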

So, if intentionality is a feature of consciousness at all, it would seem to be an irreducible feature of it, and thus cannot be identified with brain states themselves.

Next, there are qualia. "Qualia" refers to the raw feel, or "what it is like", feature of consciousness. For example, when I look at and feel a rose, apart from all the physical properties of that rose, there is in addition a what-it-is-like quality to seeing and feeling it (which would be constituted by more specific qualia, such as the qualitative experience of redness, or the qualitative feel of softness). So, to cut to the quick, the argument would be something like this (by modus tollens):

(1) If something is entirely physical, then it is (in principle) describable in purely physical terms.
(2) Consciousness is not describable in purely physical terms.
(3) :. Consciousness is not entirely physical.
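
Again, a bare-form sketch in Lean 4, for those who like to see it. EntirelyPhysical and PhysicallyDescribable are stand-in propositions of my own choosing (read them as "consciousness is entirely physical" and "consciousness is describable in purely physical terms"); the inference itself is uncontroversial, so the weight falls on the premises.

    -- Modus tollens, sketched in Lean 4, with stand-in propositions.
    example (EntirelyPhysical PhysicallyDescribable : Prop)
        (p1 : EntirelyPhysical → PhysicallyDescribable)  -- (1)
        (p2 : ¬ PhysicallyDescribable)                   -- (2)
        : ¬ EntirelyPhysical :=                          -- (3)
      fun h => p2 (p1 h)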

Premise (2) refers to qualia, which most people would take to be an essential part of what we mean by consciousness (that is, our first-person qualitative experience of the world). Support for qualia being irreducibly mental entities can be given by way of various thought experiments, the most famous of which is the so-called "Mary" argument (sometimes also called the knowledge argument). The argument goes something like this: Say there is a scientist (call her "Mary") who knows everything there is to know about color vision. She knows everything about the brain states associated with it, what goes on in the eye to make it happen, the wavelengths of various colors, and so on. There's just one issue: she's been living in a black-and-white room her entire life. One day, she finally leaves her colorless room and sees red for the first time. At this point, it would appear that Mary has learned something new about the color red, namely, the qualitative feel of what it is like to see red. But then, it would appear that Mary didn't know everything about color vision before she actually saw the color. How can this be? The answer, quite simply, is that she knew every scientific or physical fact about color vision, but she lacked knowledge of a further fact about color vision, one that wasn't captured in any of the scientific descriptions she knew. Thus, the qualitative feature of consciousness cannot be identical to physical facts about the brain, light hitting the retina, and so on.

Many philosophers have contested this point in very creative ways. One line of response goes something like this: Although it's true that Mary had some new experience when she finally saw red, she didn't acquire knowledge of a new fact about the neurophysiology of seeing red. Rather, she came to know the same facts (all physical) that she already knew, just under a new description. And, all the while, such-and-such a neurological event had always been identical to the experience of red. And so the identity theory survives another day.

Or does it? The issue with this response is that a qualitative first-person experience isn't just a description; it's a phenomenon. The only reason one could give the new description of, say, pain as a qualitative experience rather than a C-fiber firing is that one has a first-person experience that isn't captured in descriptions of C-fiber firings. And so, I think, the objection doesn't work. Consciousness isn't just another way to describe a neural event; it's a unique and distinct phenomenon.

So, as I said, I don't think the identity theory of mind can account for these two features of consciousness: intentionality and qualia. There are a number of interesting arguments that have gone back and forth on this issue, particularly in regard to the aforementioned thought experiment. One argument in particular, Saul Kripke's argument utilizing rigid and nonrigid designators, is extremely subtle and interesting. Suffice it to say for now, however, that physicalism just won't do as an account of the mind (at least in any eliminative sense... supervenience physicalism may still be in the running).

4 comments:

  1. About aboutness: Could mental states be referencing not real things, but their counterparts in the brain? Like memories?

    (I don't know nothin about philosophy of mind.)

  2. Good question. Here's what I'd say:

    The main point about intentionality is not whether it's world-directed (external) or memory-directed (internal); the point is that it's directed at all. It doesn't seem any more plausible that a piece of matter could be about a memory or a thought than that it could be about a tree or a rock.

    Maybe I'm misunderstanding what you're trying to get across though. If so, could you clarify?

  3. Alright, your explanation makes some sense.

    I just like commenting. You should post more often. And debate Max!

  4. Haha I would love to! I don't know why he has such an aversion to my arguments here...you can still be a perfectly thorough physicalist and accept them (hence why I mentioned supervenience physicalism at the end).
