Hampshire College
Philosophy of Mind
What is identity theory? In this paper, I explain the theory and why it is flawed: namely, it fails to account for multiple realization.
Identity theory is the metaphysical doctrine that, while epistemically distinct, brain states and mental states are ontologically identical. By this I mean that mental states and brain states are the same thing: the only difference is in how one perceives, thinks about, and describes this thing. That brain states and mental states are identical is an empirical claim: it is not known a priori. An analogy that will perhaps make this whole idea clearer is the case of lightning and its identity with electrical discharge. It is known that lightning is electrical discharge; however, the way we perceive, think about, and describe lightning is completely different from the way we perceive, think about, and describe electrical discharge. Furthermore, we would not know that they are the same thing if it had not been experimentally confirmed (à la Benjamin Franklin and his kite). Similarly, evidence from cognitive neuroscience leads to the hypothesis of identity. Identity theory is a reductionist theory: it posits that the description of any higher-level phenomenon can be reduced to a description of lower-level phenomena without loss of information, i.e., mental states can be reduced to brain states, which, being physical states, are further reducible to the laws of physics.
Identity theory is, furthermore, a theory of type-identity: it claims that any particular mental state type (such as pain) is reducible to a particular brain state type (such as C-fibers firing)—indeed, that it must be that brain state type, just as lightning must be electrical discharge. This is distinguished from token-identity, which claims merely that any instance of some mental state is reducible to some brain state or other. Type-identities, though not without their difficulties, are much easier to define clearly than token-identities. Take, for example, the claim that “pain is C-fibers firing”. If something is pain, it is C-fibers firing, and if it is C-fibers firing, it is pain—plain and simple. Token identities are fuzzier: for example, what is it about this object in my hand that makes it a mug? If we want strict definitions, we must be able to say that those things, and only those things, that satisfy a certain set of criteria (has a handle, holds liquid, etc.) are tokens of a certain type. We must posit either functional types—i.e., all tokens of the type X have the particular combination of functions Y, and all things that have the particular combination of functions Y are X—or physical types—all tokens of the type X share a particular physical trait Y, and all things that share the physical trait Y are X.
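Schematically, and just as a sketch (the predicate names Functions_Y and Trait_Y are my own shorthand for "has the combination of functions Y" and "bears the physical trait Y", not standard notation), the two options are a pair of biconditionals:

```latex
% Functional types: x is an X iff x exhibits the function profile Y.
\forall x \, \bigl( X(x) \leftrightarrow \mathrm{Functions}_Y(x) \bigr)

% Physical types: x is an X iff x bears the physical trait Y.
\forall x \, \bigl( X(x) \leftrightarrow \mathrm{Trait}_Y(x) \bigr)
```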
The advantages of a reductionist theory like type-identity theory are apparent: it gives our experience a sort of explanatory, causal, and even ontological coherence and closure. When things are reduced, the method of explanation for any particular phenomenon is no longer different from the method of explanation for any other phenomenon: both can be explained in terms of the same underlying process. Causally, reductionism implies that there are not separate chains of causality for different phenomena, i.e., that there are not mental processes going on according to their own laws of causality separate from the physical processes and their laws of causality. And ontologically, reductionism reduces the number of posited kinds of entities (minds, heat, etc.) down to, potentially, one.
There is, however, a problem, which may be fatal for identity theory (or, specifically, type-identity theory), and, indeed, for reductionism in general: multiple realization, i.e., the fact that the same higher-level system can be implemented on entirely different underlying physical systems. For an example, consider an algorithm (a set of rules for how to perform some function). Let’s say the rule of this algorithm is to take an input number and add 2 to it. This algorithm can be run on multiple, almost incomparable physical systems: I can do it in my head, I can do it with pencil and paper, I can design a contraption out of Tinkertoys to do it, I can program my calculator to do it, it can be done on a Mac, and it can be done on a PC. In fact, there are potentially infinitely many ways that this one algorithm could be implemented, with nothing physically similar about any of them.
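To make the point concrete, here is a minimal sketch in Python (the choice of language is, of course, arbitrary, which is rather the point): two realizations of the same add-2 algorithm that share nothing at the level of their basic operations.

```python
# Two realizations of one and the same algorithm:
# "take an input number and add 2 to it."

def add_two_arithmetic(n: int) -> int:
    # Realization 1: direct arithmetic, as a calculator would do it.
    return n + 2

def add_two_tally(n: int) -> int:
    # Realization 2: represent the number as a row of tally marks,
    # lay down two more marks, and count the result -- a process with
    # nothing physically in common with the arithmetic above.
    marks = "|" * n
    marks = marks + "||"
    return len(marks)

# At the level of the function computed, the two are indistinguishable.
assert all(add_two_arithmetic(n) == add_two_tally(n) for n in range(100))
```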
Similarly, there seem to be multiple ways in which a mental event, for example seeing green, can be correlated with brain events—after all, the way that you see green might be quite different from the way I see green. In fact, in order for the description “seeing green” to be useful, it must describe these multiple ways: otherwise, the only thing that could be described as truly seeing green would be me, as everyone’s brain is different and I am the only person with exactly the physical brain state that I have when seeing green! The case of C-fibers being pain would seem to be easier: C-fibers are, after all, a particular type of neuron shared by us all. But if pain is C-fibers firing, does that mean that a petri dish full of C-fibers and nothing else would experience pain? If there is any conclusion we do not want our theory of mind to entail, it is this. The same holds for any mental event: if one attempts to identify it with some very specific thing (the activity of a type of cell, the presence of a chemical, etc.), one must claim that that thing, all by itself, is the mental event in question, a claim which leads to absurdity. Pain, instead, seems to be identified with various events interacting amongst different parts of the brain, the specific organization of which is, once again, going to differ between any two people. The problem becomes even more difficult when we consider the supposed mental lives of non-humans. Monkeys, dogs, reptiles, etc., have brains that are physically more or less different from ours. Identity theory would seem to suggest that these creatures cannot have the same sorts of mental events we do[1]. And these creatures are at least biologically similar to us: what about computers or aliens?
In fact, everything that has mental states would seem to have to have its own distinct types of mental states! Our ontologically simple theory has failed us: instead of pain, for example, we now have my pain, your pain, Sue’s pain, dog pain, alien pain, computer pain, etc. Instead of simplifying our ontology, identity theory requires us to posit a new type of mental state for each possible case. This is no good—we would like a theory of mentality that not only explains the nature of the mental states we happen to have, but that also has some degree of generalizability, allowing us to say something about the mental states of others in a way that relates such states to our own. Identity theory has very weak explanatory power in this regard—it would be as if we had a theory of biology that didn’t allow us to make comparisons between different species.
Is there a solution wherein we can avoid dualism, keep physicalism, and not run into these problems? I believe so, and I believe the answer lies in the sort of functional type-identity theory that I glossed over earlier. Let’s go back to the algorithm: I said that it was the same algorithm regardless of the physical implementation. But what makes this the case? It is that each implementation performs the same function. Similarly, a mental event is a sort of function, processing information in a certain way. While there may be difficulties with this theory, it seems to me one worth pursuing.
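Continuing the hypothetical Python sketch from earlier: under this view, membership in a functional type is fixed entirely by input-output behavior, so it can be tested without ever inspecting what a candidate is made of.

```python
# A functional type is defined by what a thing does, not what it is made of.
# Here the type "adds-two" is fixed by an input-output profile alone.

def realizes_adds_two(candidate, trials=range(100)) -> bool:
    # A candidate belongs to the functional type iff it behaves correctly
    # on the tested inputs; its internal construction is never examined.
    return all(candidate(n) == n + 2 for n in trials)

# Wildly dissimilar realizations...
implementations = [
    lambda n: n + 2,                # direct arithmetic
    lambda n: len("|" * n + "||"),  # counting tally marks
    lambda n: sum([n, 1, 1]),       # summing a list of increments
]

# ...all fall under one and the same functional type.
assert all(realizes_adds_two(f) for f in implementations)
```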
[1] I must qualify this statement, because identity theory has, occasionally, been used to argue the exact opposite. If we identify pain with C-fibers firing, and another creature has C-fibers, then that creature feels pain: we can therefore infer that animals such as, say, cows experience the same kind of pain we do, as the physiology is quite similar. This argument, unfortunately, does not hold under scrutiny, since, as noted earlier, pain is identified with the interaction of several parts of the brain, an organization that is going to be very different between us and other creatures.