type vs. token
type-type identity vs. token-token identity
functional state: a state that can be defined in terms of inputs, outputs, and relations to other functional states. Examples: upper-case mode and lower-case mode on a typewriter; Turing machine states. (For the typewriter, inputs are struck keys and outputs are imprintings on the paper. For the Turing machine, the input is the symbol on the square of the tape that the TM is currently over; the output is the new symbol that gets printed on the tape.)
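The typewriter example can be sketched as a toy state machine (a minimal sketch in Python; the key names like "shift-lock" are illustrative, not from the text). The point is that each state is identified purely by how it maps inputs (struck keys) to outputs (imprintings) and to a next state:

```python
# Each "functional state" is just a mapping from an input key to
# (next state, output imprint) -- nothing more is said about what
# physically realizes the state.

def upper_case(key):
    """Upper-case mode: imprint the capital; switch on shift-release."""
    if key == "shift-release":
        return lower_case, ""          # no imprint; change state
    return upper_case, key.upper()     # imprint capital; stay

def lower_case(key):
    """Lower-case mode: imprint the small letter; switch on shift-lock."""
    if key == "shift-lock":
        return upper_case, ""
    return lower_case, key.lower()

def type_string(keys, state=lower_case):
    """Run the machine over a sequence of struck keys."""
    output = []
    for key in keys:
        state, imprint = state(key)
        output.append(imprint)
    return "".join(output)

print(type_string(["a", "shift-lock", "b", "c", "shift-release", "d"]))
# -> "aBCd"
```

Nothing in the definition of `upper_case` mentions levers or circuits; the state is exhausted by its input-output-transition profile, which is the functionalist's point.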
functionalism: the view that mental states are functional states. (What are the inputs and outputs? Roughly: sensory inputs; motor outputs.)
Two varieties of functionalism:
(1) Functionalism (with a capital 'F': Block's term; he also calls this "a priori functionalism"). Nida-Rümelin calls this "conceptual functionalism." The idea here is that functionalism actually gives the meaning of ordinary-language terms for mental states.
(2) Psychofunctionalism (Block's term again; he also calls this "empirical functionalism"). On this view, functionalism gives a correct account of the nature of psychological states, but not necessarily of the meaning of ordinary language terms.
Block says that one main problem with behaviorism is "liberalism": it counts things as having mental states as long as they behave in certain ways. But if mental states are internal causes of behavior, then this is too liberal, attributing mental states to people or things that don't in fact have them. (Someone who convincingly acts as though he or she is in pain will, too liberally, be said to be in pain; similarly, a convincing enough robot would be said to have mental states even if it had no conscious experience at all.)
The source of behaviorism's liberalism seems to be its insistence that mental states are simply behaviors, not causes of behavior. Functionalism avoids this by acknowledging that mental states are internal causes of behavior.
The Identity Theory
Block describes the problem with the identity theory as "chauvinism." The idea is that, just as a male chauvinist regards women as less important than men simply because they are of a different gender, the identity theorist must deny that beings sufficiently different from humans (computers, Martians, maybe even octopuses) have mental states simply because they aren't human (or more precisely, because they don't have a human-like central nervous system). The identity theory thus runs the risk of denying mental states to beings that actually have them.
By allowing for multiple realizability of mental states, functionalism avoids this problem. Computers, Martians, etc. will be said to have mental states as long as they have states that are functionally equivalent to ours, regardless of whether they are "implemented" or "realized" in the same way. The physical state that realizes pain in a Martian might be quite different from the physical state that realizes pain in a human, just as the physical state that realizes "upper-case mode" in an old manual typewriter may be quite different from the physical state that realizes it in an electric typewriter.
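The multiple-realizability point can be sketched with the same typewriter analogy (illustrative Python; the class and attribute names are invented for the example). Two machines with physically quite different "realizers" of upper-case mode count as functionally equivalent because they share the same input-output profile and state transitions:

```python
class ManualTypewriter:
    """Realizes case mode as the mechanical position of the type basket."""
    def __init__(self):
        self.basket_raised = False        # physical realizer of upper-case mode
    def strike(self, key):
        if key == "shift-lock":
            self.basket_raised = True
            return ""
        if key == "shift-release":
            self.basket_raised = False
            return ""
        return key.upper() if self.basket_raised else key.lower()

class ElectricTypewriter:
    """Realizes case mode as a voltage on a solenoid line."""
    def __init__(self):
        self.solenoid_volts = 0.0         # a quite different physical realizer
    def strike(self, key):
        if key == "shift-lock":
            self.solenoid_volts = 5.0
            return ""
        if key == "shift-release":
            self.solenoid_volts = 0.0
            return ""
        return key.upper() if self.solenoid_volts > 2.5 else key.lower()

def run(machine, keys):
    return "".join(machine.strike(k) for k in keys)

keys = ["a", "shift-lock", "b", "shift-release", "c"]
# Different internals, identical functional profile:
assert run(ManualTypewriter(), keys) == run(ElectricTypewriter(), keys) == "aBc"
```

For the functionalist, what makes both machines "in upper-case mode" is the shared functional role, not the shared physical stuff, and likewise, on this view, for pain in humans and Martians.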
Most criticisms of functionalism have focused on conscious experiences ("raw feels" or "qualia").
The first objection, the "absent qualia" objection, is related to the "liberalism" objection to behaviorism. Block suggests that functionalism, while less liberal than behaviorism, is still committed to the presence of mental states in things that don't in fact have them.
Why? Because according to functionalism, anything functionally identical with me will share all my mental states. But, Block argues, there will inevitably be the possibility of things that have no mental states but are functionally identical with me.
His example: the "homunculi-headed robot." (In Block's paper this morphs into the Chinese Nation example; in the subsequent literature the example is often called "Blockhead" in Block's honor.) Suppose we have a machine table for the psychology of a particular person. Now hook up a robot so that for each square in the machine table, one person in China executes that instruction. The system robot-plus-people-in-China-plus-satellites-and-radios must be regarded by the functionalist as having conscious experiences, since it is functionally identical with the person we began with. Block thinks it's obvious that this system is not in fact conscious. Therefore functionalism must be false, since it has a false consequence.
The "inverted qualia" objection has long been a standard objection to functionalism. The idea is that two people could be functionally identical even though their experiences were inverted: red might look to one the way green looks to the other. If that's possible, then the experience of what red looks like can't be a functional state.
Nida-Rümelin takes this one step further, providing evidence from the biology of color perception to suggest that qualia inversion may actually occur; at least, it is predicted by mainstream views in color vision science.
(The details are pretty nifty. Sketchy notes: Three kinds of cones: R, G, B, each containing a different pigment.
Two neural channels relay information from the cones to the brain. The strength of the signal in the r-g channel is determined by r - g; the strength of the signal in the y-b channel is determined by g + r - b. How reddish or greenish something appears is determined by the r-g channel (larger positive values look redder; larger negative values look greener). Similarly, how yellowish or bluish something appears is determined by the y-b channel.
Two types of red-green color-blindness: R pigment in the G cones, or G pigment in the R cones. Either way, r - g is always 0, and so nothing appears reddish or greenish.
But now consider the person who has both causes of red-green color blindness at once! The R and G cone signals are exchanged, so r - g reverses sign while g + r - b is unchanged. Such a person would make normal color discriminations, yet on the mainstream view the red-green signal, and with it red and green experience, would be inverted.)
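The opponent-channel arithmetic above can be checked with a toy computation (the pigment-response numbers are invented for illustration; only the formulas r - g and g + r - b come from the notes):

```python
# Illustrative pigment responses to one fixed (reddish) stimulus.
RESPONSE = {"R-pigment": 0.9, "G-pigment": 0.2, "B-pigment": 0.1}

def channels(pigment_in_R_cone, pigment_in_G_cone, pigment_in_B_cone):
    """Compute the two opponent channels from whichever pigment
    each cone type happens to contain."""
    r = RESPONSE[pigment_in_R_cone]
    g = RESPONSE[pigment_in_G_cone]
    b = RESPONSE[pigment_in_B_cone]
    return {"red-green": r - g, "yellow-blue": g + r - b}

normal = channels("R-pigment", "G-pigment", "B-pigment")
# One pigment swap: r == g, so red-green is always 0 (color-blind).
blind = channels("G-pigment", "G-pigment", "B-pigment")
# Both swaps at once: r and g exchange, so red-green reverses sign
# while yellow-blue (symmetric in r and g) is unchanged.
double = channels("G-pigment", "R-pigment", "B-pigment")

print(normal)   # positive red-green: stimulus looks reddish
print(blind)    # red-green 0: nothing looks reddish or greenish
print(double)   # negated red-green, same yellow-blue: the inversion
```

Because g + r - b is symmetric in r and g, the double swap leaves the yellow-blue channel untouched, which is why the double-swap observer is behaviorally undetectable rather than simply color-blind.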