Searle surveys modern physicalistic responses to the mind-body problem and argues that they are all mistaken.
1. Logical behaviorism
Technical objections: (1) it is not clear what a "disposition" is; (2) circularity (analyzing one mental state in behavioral terms requires reference to other mental states); (3) it leaves out the causal relations between mental states and behavior.
Fundamental objection: "denies the existence of any inner mental states in addition to external behavior."
2. Type identity theories
Fundamental objection: a dilemma. If mind-brain identities are empirically discovered, then we must identify the items on the two sides of the identity statement by means of different properties. Are the items on the "mental" side identified by "subjective, mental, introspective" (p. 37) features, or not? If they are, then we have not after all "gotten rid of the mind." If they are not, then we are just ignoring the mental.
Technical objections: (1) no topic-neutral vocabulary. (Or rather, it's possible to describe mental phenomena in a "topic-neutral" way, but this doesn't prove anything about their nature.) (2) "neuronal chauvinism" (cf. species chauvinism) (3) Kripke: no contingent identity statements between rigid designators, so if mind-brain identity statements are true, they are necessarily true. But they seem clearly to be contingent (since e.g. we can imagine that they are false), therefore they are false. (Searle says this is basically a technical version of the common-sense objection.)
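Kripke's objection can be laid out schematically (my reconstruction; "C-fiber firing" is the stock example from this literature, not a quotation from Searle):

```latex
\begin{array}{lll}
1. & a = b \rightarrow \Box(a = b)
   & \text{necessity of identity, for rigid designators } a, b\\
2. & \Diamond(\mathrm{pain} \neq \text{C-fiber firing})
   & \text{the identity seems contingent}\\
3. & \neg\Box(\mathrm{pain} = \text{C-fiber firing})
   & \text{from 2}\\
4. & \mathrm{pain} \neq \text{C-fiber firing}
   & \text{from 1 and 3, by modus tollens}
\end{array}
```

Premise 2 is where the "technical version of the common-sense objection" lives: the apparent imaginability of pain without C-fiber firing is doing the work.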
3. Token-token identity theories
A sort of minimal version of materialism. However, it doesn't seem to say anything very interesting: the essence of a mental state type presumably has to do with what different instances of that type have in common, so we need some sort of story about mental state types, not just tokens. Functionalism offers one such explanatory story.
4. Black box functionalism
Two brain-state tokens are tokens of the same mental type iff they have the same causal relations to inputs, outputs, and other brain states. (One explication: Ramsey sentences.)
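The Ramsey-sentence idea, very schematically (my gloss, following Lewis's use of Ramsey sentences, with "pain" as a stand-in term; not Searle's notation): write out the psychological theory T with all its mental-state terms, then replace those terms with variables and existentially quantify, leaving input and output terms alone:

```latex
% Psychological theory with mental-state terms M_1, ..., M_n:
%   T(M_1, \dots, M_n)
% Ramsey sentence: replace each M_i with a bound variable:
\exists x_1 \cdots \exists x_n \, T(x_1, \dots, x_n)
% A token state s then counts as, say, a pain iff s occupies
% the causal role of the variable that replaced "pain" (here x_1):
s \text{ is in pain} \iff
  \exists x_1 \cdots \exists x_n \,
  [\, T(x_1, \dots, x_n) \wedge s \text{ has } x_1 \,]
```

This makes the functionalist point explicit: "pain" is defined entirely by its place in the network of causal relations, with no mental vocabulary left over.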
Fundamental objection: leaves out qualia (as shown by absent qualia, inverted qualia arguments).
Technical objection: Block's Chinese Nation argument.
5. Strong AI
Mind as program, brain as hardware. Basis of cognitive science.
Technical problems: the frame problem; the problem of formalizing nonmonotonic reasoning.
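Why nonmonotonic reasoning resists formalization: in classical logic, adding premises never removes conclusions, but commonsense inference routinely withdraws conclusions when new information arrives. A minimal sketch (my illustration, using the standard Tweety example; not from Searle's text):

```python
# Nonmonotonic reasoning in miniature: the default "birds fly" licenses a
# conclusion that must be retracted when we learn more -- so the set of
# conclusions does not grow monotonically with the set of facts.

def flies(facts: set) -> bool:
    """Default rule: a bird flies unless it is known to be a penguin."""
    return "bird" in facts and "penguin" not in facts

facts = {"bird"}
print(flies(facts))    # default conclusion: Tweety flies

facts.add("penguin")   # learn more about Tweety
print(flies(facts))    # earlier conclusion is withdrawn
```

Classical consequence has no mechanism for this withdrawal; capturing it formally (default logic, circumscription, etc.) is the open technical problem the notes allude to.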
Fundamental problem: leaves out consciousness and intentionality. (Searle's Chinese Room argument.)
6. Eliminative materialism
Mental states won't reduce to neurophysiological states. Therefore there are no mental states (as described by "folk psychology").
Compare: tennis rackets don't reduce; they "do not exactly, or even remotely, match the taxonomy of theoretical physics" (p. 47). Therefore there are no tennis rackets.
Is this a fair parallel? Churchland et al. don't merely argue that mental states don't correspond to neurophysiological types, but rather that there isn't anything that plays the roles folk psychology says that beliefs, desires, etc. play.
Fundamental objection: denies the existence of the mental.
7. Naturalizing content
Externalism about mental content: meanings as not in the head.
Two versions of externalism: (a) causal accounts: roughly, mental states refer to whatever typically causes them. So a particular perceptual state represents cats if it is normally caused by cats. (b) teleological accounts. Unfortunately, I can't give a short-phrase summary of teleological accounts! The rough idea is that for mental state type M to refer to objects of type O is for the function of M to involve O, where function is construed teleologically (i.e. in terms of the goal or purpose of the state), and teleology in turn is typically explained in terms of natural selection. Very, very crudely, we might say that a certain mental state represents danger if the reason we have that state is that in our past evolutionary history, organisms with the state were more likely to survive than organisms without it because it enabled them to avoid danger. (This is far too crude; see Karen Neander's entry in the Stanford Encyclopedia of Philosophy for a good introduction to the topic.)
Technical objection: disjunction problem. [Teleological views arose in part as an attempt to overcome the disjunction problem for causal theories.]
Symptom of the common-sense problem: meaning is normative; (other) physical stuff isn't.