David John Chalmers
Full name: David John Chalmers
Born: 20 April 1966
Main interests: Philosophy of mind
Notable ideas: Hard problem of consciousness; property dualism; neutral monism; extended mind; two-dimensional semantics
David John Chalmers (born 20 April 1966) is an Australian philosopher specializing in the area of philosophy of mind. He is Professor of Philosophy and Director of the Centre for Consciousness at the Australian National University.
Chalmers was born and raised in Australia, and since 2004 has been Professor of Philosophy, Director of the Centre for Consciousness, and an ARC Federation Fellow at the Australian National University. From an early age he excelled at mathematics, eventually completing his undergraduate education at the University of Adelaide with a bachelor's degree in mathematics and computer science. He then briefly studied at Lincoln College, Oxford, as a Rhodes Scholar before taking his PhD at Indiana University Bloomington under Douglas Hofstadter. He was a postdoctoral fellow in the Philosophy-Neuroscience-Psychology program directed by Andy Clark at Washington University in St. Louis from 1993 to 1995, and his first professorship was at UC Santa Cruz, from August 1995 to December 1998. Chalmers was subsequently appointed Professor of Philosophy (1999-2004) and, later, Director of the Center for Consciousness Studies (2002-2004) at the University of Arizona, sponsor of the Toward a Science of Consciousness conference, at which he made his influential debut in 1994.
Chalmers' book, The Conscious Mind (1996), is widely considered (by both advocates and opponents) to be an essential work on consciousness and its relation to the mind-body problem in philosophy of mind. In the book, Chalmers argues that all forms of physicalism (whether reductive or non-reductive) that have dominated modern philosophy and science fail to account for the existence (that is, presence in reality) of consciousness itself. He proposes an alternative dualistic view he calls naturalistic dualism (but which might also be characterized by more traditional formulations such as property dualism, neutral monism, or double-aspect theory). The book was described by The Sunday Times as "one of the best science books of the year".
Chalmers is best known for his formulation of the notion of a hard problem of consciousness, in both his book and the paper "Facing Up to the Problem of Consciousness" (originally published in the Journal of Consciousness Studies, 1995). He makes a distinction between "easy" problems of consciousness, such as explaining object discrimination or verbal reports, and the single hard problem, which may be stated as: why does the feeling which accompanies awareness of sensory information exist at all? The essential difference between the (cognitive) easy problems and the (phenomenal) hard problem is that the former are at least theoretically answerable via the standard strategy in philosophy of mind: functionalism. Chalmers argues for an "explanatory gap" from the objective to the subjective, and criticizes physical explanations of mental experience, making him a dualist in an era dominated by materialism.
In support of this, Chalmers is famous for his commitment to the logical (though, importantly, not physical) possibility of philosophical zombies, although he was not the first to propose the thought experiment. These zombies, unlike the zombies of popular fiction, are complete physical duplicates of human beings, lacking only qualitative experience. Chalmers argues that since such zombies are conceivable to us, they must therefore be logically possible; and since they are logically possible, qualia and sentience are not fully explained by physical properties alone. Instead, Chalmers argues that consciousness is a fundamental property ontologically independent of any known (or even possible) physical properties, and that there may be lawlike rules, which he terms "psychophysical laws", that determine which physical systems are associated with which types of qualia. However, he rejects Cartesian-style interactive dualism, in which the mind has the power to alter the behavior of the brain; he suggests instead that the physical world is "causally closed", so that physical events have only physical causes and human behavior, for example, can be explained entirely in terms of the functions of the physical brain. He further speculates that all information-bearing systems may be conscious, leading him to entertain the possibility of conscious thermostats and a qualified panpsychism he calls panprotopsychism. Though Chalmers maintains a formal agnosticism on the issue, even conceding the viability of panpsychism places him at odds with the majority of his contemporaries.
After the publication of Chalmers' landmark paper, more than twenty papers in response were published in the Journal of Consciousness Studies. These papers (by Daniel Dennett, Colin McGinn, Francisco Varela, Francis Crick, and Roger Penrose among others) were collected and published in the book Explaining Consciousness: The Hard Problem. John Searle fiercely critiqued Chalmers's views in The New York Review of Books.
Chalmers, with Andy Clark, wrote "The Extended Mind", an article about the boundaries of the mind.
On his website, David Chalmers has compiled a large bibliography of the philosophy of mind and related fields, with close to 18,000 annotated entries organized by topic.
Chalmers appears in the Matrix video documentary "The Roots of the Matrix", in which he presents a novel take on the traditionally skeptical "brain in a vat" hypothesis. He maintains that this hypothesis is not, contrary to common philosophical opinion, a skeptical hypothesis.
He serves on the editorial board of the journals Philo, Consciousness and Cognition, the Journal of Consciousness Studies, and Psyche.
A partial list of publications by Chalmers:
Panpsychism, in philosophy, is either the view that all parts of matter involve mind, or the more holistic view that the whole universe is an organism that possesses a mind (see pandeism and panentheism). It is thus a stronger and more ambitious view than animism or hylozoism, which hold only that all things are alive. This is not to say that panpsychists believe all matter to be alive or even conscious, but rather that the constituent parts of matter involve some form of mind and are sentient.
Panpsychism claims that everything is sentient and that there are either many separate minds, or one single mind that unites everything that is. The concept of the unconscious, made popular by the psychoanalysts, made possible a variant of panpsychism that denies consciousness to some entities while still asserting the ubiquity of mind.
Panexperientialism, as espoused by Alfred North Whitehead, is a less bold variation, which credits all entities with phenomenal consciousness but not with cognition, and therefore not necessarily with fully-fledged minds.
Panprotoexperientialism is a more cautious variation still, which credits all entities with non-physical properties that are precursors to phenomenal consciousness (or phenomenal consciousness in a latent, undeveloped form) but not with cognition itself, or with conscious awareness.
Panpsychism can be understood as a form of idealism, the metaphysical view that the fundamental constituents of reality are mental (a view on which matter is dependent on minds, or on which only mental qualities exist, a type of substance monism). However, whereas Berkeleyan idealism holds that matter exists only in the mind, panpsychism holds that all material entities have minds — and an existence of their own, independent of human observers.
Eliminative materialism, the view that there is no such thing as mind but only matter, is incompatible with panpsychism.
Materialism, the view that ultimately there is only matter, is compatible with panpsychism just in case the property of mindedness is always attributed to matter. However, most thinkers who take a materialist or physicalist approach to the mind-body problem attribute mind only to certain highly-organised beings, and attribute it in virtue of their structural, functional and behavioural attributes. This means that once a brain falls below a certain level of functioning (in death or perhaps deep coma) there would no longer be a mind associated with it. Panpsychists often reverse the physicalist's belief that the mental emerges from the mechanistic operation of matter. Instead, they say, mechanical behaviour is derived from the primitive mentality of atoms and molecules — as are sophisticated mentality and organic behaviour, the difference being attributed to the presence or absence of complex structure in a compound object. So long as this inverted emergence, the derivation of non-mental properties from mental ones, is in place, panpsychism is not a form of property dualism.
No form of panpsychism attributes full, human-style consciousness to the fundamental constituents of the universe; every version therefore needs a certain amount of emergence — that is, weak emergence, in which more sophisticated versions of basic properties emerge at a higher level. No version of panpsychism requires strong emergence, in which high-level properties have no low-level precursors or basis and instead emerge "from nothing". Indeed, avoidance of strong emergentism is one of the motivations for panpsychism. "Strong emergence, if it exists, can be used to reject the physicalist picture of the world as fundamentally incomplete. By contrast, weak emergence can be used to support the physicalist picture of the world". On this reading, "weak emergence" fits the physicalist picture of the world, whereas "strong emergence" would lend support to non-physicalist views such as psychism (including panpsychism). Furthermore, because "strong emergence" has a holistic outlook, it is particularly amenable to universalist, holistic panpsychism ("one single mind that unites everything that is", as in the "universal soul" of Neoplatonic metaphysics).
Hylopathism argues for a similarly universal attribution of sentience to matter. Few writers would advocate a hylopathic materialism, although the idea is not new; it has been formulated as "whatever underlies consciousness in a material sense, i.e., whatever it is about the brain that gives rise to consciousness, must necessarily be present to some degree in any other material thing". Similar ideas have been attributed to the philosopher David Chalmers. However, there are also varieties of monism that do not presuppose (as materialism and idealism do) that mind and matter are fundamentally separable. An example is neutral monism, first introduced by Spinoza and later propounded by William James. Panpsychism can be combined with this view.
Panexperientialism and panprotopsychism are related concepts. Alfred North Whitehead developed his philosophical system within the scientific worldview of his time, including Einstein's theory of relativity. His ideas were a significant development of panpsychism, also known (because of Whitehead's emphasis on experience) as panexperientialism, though the term itself was first applied to Whitehead's philosophy by David Ray Griffin many years later. Process philosophy suggests that the fundamental elements of the universe are occasions of experience, which can be collected into groups, creating something as complex as a human being. This experience is not consciousness; there is no mind-body duality under this system, as mind is seen as a very developed kind of experience. Whitehead was not a subjective idealist, and while his philosophy resembles the monads first proposed by Leibniz, Whitehead's occasions of experience are interrelated with every other occasion of experience that has ever occurred. He embraced panentheism, with God encompassing all occasions of experience yet transcending them. Whitehead believed that occasions of experience are the smallest elements in the universe, even smaller than subatomic particles.
A criticism is that it can be demonstrated that the only properties shared by all qualia are that they are not precisely describable, and thus are of indeterminate meaning within any philosophy which relies upon precise definition. This has been something of a blow to panpsychism in general, since some of the same problems seem to be present in panpsychism in that it tends to presuppose a definition for mentality without describing it in any real detail. (What separates mental and non-mental phenomena?)
The panpsychist answers both these challenges in the same way: we already know what qualia are through direct, introspective apprehension; and we likewise know what conscious mentality is by virtue of being conscious. For someone like Alfred North Whitehead, third-person description takes second place to the intimate connection between every entity and every other which is, he says, the very fabric of reality. To take a mere description as having primary reality is to commit the "fallacy of misplaced concreteness".
Another criticism is that we have a detailed understanding of how cognition — thought, memory, and so on — works in terms of the functioning and structure of the brain. If the matter the brain is made of already has cognitive abilities simply by virtue of being matter, then cognition is somehow being done twice over.
One response is to separate the phenomenal, non-cognitive aspects of consciousness — particularly qualia, the essence of the hard problem of consciousness — from cognition. Thus panpsychism is transformed into panexperientialism. However, this strategy of division generates problems of its own: what is going on, causally, in the head of someone who is thinking (cognitively, of course) about their qualia?
The view of the world as a macrocosm in relation to man, the microcosm, was a staple theme in Greek philosophy, and within it the world was naturally thought of in anthropomorphic terms. The view passed into the medieval period via Neoplatonism, and came to be shared by Leibniz, Schelling, Schopenhauer and many others.
The idea of the "animated atom" appeared in Russian cosmism in the early 20th century.
Josiah Royce (1855-1916), the leading American absolute idealist, held a panpsychist view, though he did not necessarily attribute mental properties to the smallest constituents of mentalistic "systems".
The panpsychist doctrine has recently been making something of a comeback in American philosophy of mind: Christian de Quincey and Leo Stubenberg, for example, have each recently defended it. In the philosophy of mind, panpsychism is one possible solution to the so-called hard problem of consciousness. The doctrine has also been applied in the field of environmental philosophy through the work of the Australian philosopher Freya Mathews.
Carl Jung, who is perhaps best known for his idea of the collective unconscious, wrote that "psyche and matter are contained in one and the same world, and moreover are in continuous contact with one another", and that it was probable that "psyche and matter are two different aspects of one and the same thing". (orig. source unknown, cited in Danah Zohar & Ian Marshall, SQ: Connecting with our Spiritual Intelligence, Bloomsbury, 2000, p. 81). This could be interpreted as panpsychism, apparently of the neutral monism variety.
Panpsychism and emergentism can be seen as alternative ways to bridge the more extreme positions of crude reductionism and crude holism. Panpsychism differs from emergentism in that, according to panpsychism, even the smallest physical particles have mental characteristics. Emergentism claims that though the particles are mindless, some systems formed by them, and by nothing but them, do possess mental attributes. The human brain is a case in point.
Gaia theory, which views the biosphere as a self-regulating system that maintains homeostasis with respect to many vital chemical and physical variables, is sometimes interpreted as panpsychism, because some think that any goal-directed behavior qualifies as mental. However, the goal-directed behavior of the biosphere, as explained by Gaia theory, is an emergent function of organised, living matter, not a quality of all matter. Thus Gaia theory is more properly associated with emergentism than panpsychism.
The label "naive" (as opposed to "philosophical") panpsychism is sometimes used to mean not a doctrine defended by any philosopher, but the tendency, found among children and in traditional societies, to think of even inanimate objects as sentient and/or intentional. This is the same as animism.
Panpsychism, as a view that the universe has "universal consciousness", is shared by some forms of religious thought: theosophy, pantheism, cosmotheism and panentheism.
Panpsychism also plays a part in Hindu, Buddhist and Shinto thought.
A philosophical zombie, p-zombie or p-zed is a hypothetical being that is indistinguishable from a normal human being except that it lacks conscious experience, qualia, or sentience. When a zombie is poked with a sharp object, for example, it does not feel any pain. While it behaves exactly as if it does feel pain (it may say "ouch" and recoil from the stimulus, or tell us that it is in intense pain), it does not actually have the experience of pain as a putative 'normal' person does.
The notion of a philosophical zombie is mainly a thought experiment used in arguments (often called zombie arguments) in the philosophy of mind, particularly arguments against forms of physicalism, such as materialism and behaviorism.
Philosophical zombies are widely used in thought experiments, though the detailed articulation of the concept is not always the same. There are, in effect, different types of p-zombies; what differs is how much exactly they have in common with normal human beings. P-zombies were introduced primarily to argue against specific types of physicalism, such as behaviorism. According to behaviorism, mental states exist solely in terms of behavior: belief, desire, thought, consciousness, and so on, are simply certain kinds of behavior or tendencies towards behaviors. One might invoke the notion of a p-zombie that is behaviorally indistinguishable from a normal human being, but that lacks conscious experiences. According to the behaviorist, such a being is not logically possible, since consciousness is defined in terms of behavior. So an appeal to the intuition that a p-zombie so described is possible furnishes an argument that behaviorism is false. Behaviorists tend to respond that a p-zombie is not possible, and so any theory on which one is possible is false.
One might distinguish between various types of zombies, as they are used in different thought experiments, as follows:
- A behavioral zombie, which is behaviorally indistinguishable from a human being.
- A neurological zombie, which has a human brain and is otherwise physiologically indistinguishable from a human being.
- A soulless zombie, which lacks a soul.
However, philosophical zombies are primarily discussed in the context of arguments against physicalism (or functionalism) in general. Thus, a p-zombie is typically understood as a being that is physically indistinguishable from a normal human being but that lacks conscious experience.
According to physicalism, the physical facts determine all other facts; since all the facts about a p-zombie are fixed by the physical facts, and these facts are the same for the p-zombie and for the normal conscious human from which it cannot be physically distinguished, physicalism must hold that p-zombies are not possible, or that p-zombies are the same as normal humans. Zombie arguments against physicalism therefore aim to show that zombies are, after all, possible.
Most arguments ultimately lend support to some form of dualism—the view that the world includes two kinds of substance (or perhaps two kinds of property): the mental and the physical.
The zombie argument against physicalism is, therefore, a version of a general modal argument against physicalism, such as that of Saul Kripke's in "Naming and Necessity" (1972). The notion of a p-zombie, as used to argue against physicalism, was notably advanced in the 1970s by Thomas Nagel (1970; 1974) and Robert Kirk (1974).
However, the zombie argument against physicalism in general was most famously developed in detail by David Chalmers in The Conscious Mind (1996). According to Chalmers, one can coherently conceive of an entire zombie world: a world physically indiscernible from our world, but entirely lacking conscious experience. In such a world, the counterpart of every being that is conscious in our world would be a p-zombie. The structure of Chalmers' version of the zombie argument can be outlined as follows:
1. If a situation is coherently conceivable, then it is logically (and, Chalmers argues, metaphysically) possible.
2. A zombie world, physically identical to our world but entirely lacking conscious experience, is coherently conceivable, and hence possible.
3. If a zombie world is possible, then the physical facts do not determine the facts about consciousness, and physicalism is false.
4. Therefore, physicalism is false.
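The core of the argument can be compressed into modal notation. (The letters P and Q echo Chalmers' own usage, with P the conjunction of all microphysical truths and Q an arbitrary phenomenal truth; the regimentation below is a standard reconstruction, not a quotation from Chalmers.)

```latex
% P: the conjunction of all microphysical truths about the world
% Q: an arbitrary phenomenal truth, e.g. "someone is conscious"
\begin{align*}
\text{Physicalism entails:}\quad & \Box\,(P \rightarrow Q) \\
\text{Zombie premise:}\quad & \Diamond\,(P \land \neg Q) \\
\text{Modal duality:}\quad & \Diamond\,(P \land \neg Q) \;\leftrightarrow\; \neg\,\Box\,(P \rightarrow Q) \\
\text{Hence, by modus tollens:}\quad & \text{physicalism is false.}
\end{align*}
```

On this rendering, the dispute over the argument is a dispute over the zombie premise: whether conceiving of P-and-not-Q really establishes its possibility in the relevant (metaphysical) sense.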
The argument is logically valid, in that if its premises are true, then the conclusion must be true. However, philosophers dispute that its premises are true. For example, concerning premise 2: Is such a zombie world really possible? Chalmers states that "it certainly seems that a coherent situation is described; I can discern no contradiction in the description." Since such a world is conceivable, Chalmers claims, it is possible; and if such a world is possible, then physicalism is false. Chalmers is arguing only for logical possibility, and he maintains that this is all that his argument requires. He states: "Zombies are probably not naturally possible: they probably cannot exist in our world, with its laws of nature."
This leads to the following questions: What is the relevant notion of possibility here? Is the scenario in premise 2 possible in the sense that is suggested in premise 1? Some philosophers maintain that the relevant kind of possibility is not so weak as logical possibility. They argue that, while a zombie world is logically possible (that is, there is no logical contradiction in any full description of the scenario), such a weak notion is not relevant in the analysis of a metaphysical thesis such as physicalism. Most philosophers agree that the relevant notion of possibility is some sort of metaphysical possibility. What the proponent of the zombie argument claims is that one can tell from the armchair, just by the power of reason, that such a zombie scenario is metaphysically possible. Chalmers states: "From the conceivability of zombies, proponents of the argument infer their metaphysical possibility." Chalmers claims that this inference from conceivability to metaphysical possibility is not generally legitimate, but it is legitimate for phenomenal concepts such as consciousness. Indeed, according to Chalmers, whatever is logically possible is also, in the sense relevant here, metaphysically possible.
A physicalist might respond to the zombie argument in several ways. Most responses deny premise 2 (of Chalmers' version above); that is, they deny that a zombie scenario is possible.
One response is to claim that the idea of qualia and related phenomenal notions of the mind are not coherent concepts, and the zombie scenario is therefore incoherent. Daniel Dennett and others take this line. They argue that while consciousness, subjective experiences, and so forth exist in some sense, they are not as the zombie argument proponent claims they are; pain, for example, is not something that can be stripped off a person's mental life without bringing about any behavioral or physiological differences. Dennett coined the term zimboes (philosophical zombies that have second-order beliefs) to argue that the idea of a philosophical zombie is incoherent. He states: "Philosophers ought to have dropped the zombie like a hot potato, but since they persist in their embrace, this gives me a golden opportunity to focus attention on the most seductive error in current thinking." In a related vein, Nigel Thomas argues that the zombie concept is inherently self-contradictory: because zombies, ex hypothesi, behave just like regular humans, they will claim to be conscious. Thomas argues that any construal of this claim (that is, whether it is taken to be true, false, or neither true nor false) inevitably entails either a contradiction or a manifest absurdity.
Another physicalist response is to provide an error theory to account for the intuition that zombies are possible. Philosophers such as Stephen Yablo (1998) have taken this line, arguing that notions of what counts as physical, and what counts as physically possible, change over time; so while conceptual analysis is reliable in some areas of philosophy, it is not reliable here. Yablo says he is "braced for the information that is going to make zombies inconceivable, even though I have no real idea what form the information is going to take."
The zombie argument is difficult to assess, because it brings to light fundamental disagreements that philosophers have about the method and scope of philosophy itself. It gets to the core of disagreements about the nature and abilities of conceptual analysis. Proponents of the zombie argument, such as Chalmers, think that conceptual analysis is a central part of (if not the only part of) philosophy and that it certainly can do a great deal of philosophical work. However, others, such as Dennett, Paul Churchland, W.V.O. Quine, and so on, have fundamentally different views from Chalmers about the nature and scope of philosophical analysis. For this reason, discussion of the zombie argument remains vigorous in philosophy.
Under physicalism, it has been claimed that one must either believe that anyone, including oneself, might be a zombie, or that no one can be a zombie; this follows from the assertion that one's own conviction about being (or not being) a zombie is a product of the physical world and is therefore no different from anyone else's. This argument has been expressed by Daniel Dennett, who argues that "Zimboes thinkZ they are conscious, thinkZ they have qualia, thinkZ they suffer pains - they are just 'wrong' (according to this lamentable tradition), in ways that neither they nor we could ever discover!". While it has been argued that zombies are metaphysically impossible under the assumption of physicalism, it has also been argued that zombies are not even conceivable under that assumption. The claim is that under physicalism, when a distinction is made in one's mind between a hypothetical zombie and oneself (assumed not to be a zombie), the concept of the hypothetical zombie, being itself a product of physical reality, can only be a subset of the concept of oneself, and will thereby also entail a deficit in observables (cognitive systems), contradicting the original definition of a zombie. Dennett makes this point when he argues that "when philosophers claim that zombies are conceivable, they invariably underestimate the task of conception (or imagination), and end up imagining something that violates their own definition".
A survey of professional philosophers and others on their philosophical views was carried out in November 2009. The survey was taken by 3226 respondents, including 1803 philosophy faculty members and/or PhDs and 829 philosophy graduate students. One of the questions asked whether philosophical zombies are conceivable and/or metaphysically possible. The result was the following:
In philosophy, the brain in a vat is an element used in a variety of thought experiments intended to draw out certain features of our ideas of knowledge, reality, truth, mind, and meaning. It is drawn from the idea, common to many science fiction stories, that a mad scientist might remove a person's brain from the body, suspend it in a vat of life-sustaining liquid, and connect its neurons by wires to a supercomputer which would provide it with electrical impulses identical to those the brain normally receives. According to such stories, the computer would then be simulating reality (including appropriate responses to the brain's own output) and the person with the "disembodied" brain would continue to have perfectly normal conscious experiences without these being related to objects or events in the real world.
The simplest use of brain-in-a-vat scenarios is as an argument for philosophical skepticism and solipsism. A simple version runs as follows: since the brain in a vat gives and receives exactly the same impulses as it would if it were in a skull, and since these are its only way of interacting with its environment, it is not possible to tell, from the perspective of that brain, whether it is in a skull or a vat. Yet in the former case most of the person's beliefs may be true (if he believes, say, that he is walking down the street, or eating ice cream); in the latter case they are false. Since, the argument says, you cannot know whether you are a brain in a vat, you cannot know whether most of your beliefs might be completely false. Since, in principle, it is impossible to rule out your being a brain in a vat, you cannot have good grounds for believing any of the things you believe; you certainly cannot know them.
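The skeptical reasoning above can be compressed into a closure-style schema. (The knowledge operator K and the closure principle are the standard devices of the epistemology literature, not notation taken from this article.)

```latex
% O: an ordinary belief, e.g. "I am walking down the street"
% V: "I am a brain in a vat"; note that O entails \neg V
\begin{align*}
\text{(1)}\quad & \neg K(\neg V) && \text{one cannot rule out being envatted} \\
\text{(2)}\quad & K(O) \rightarrow K(\neg V) && \text{knowledge is closed under known entailment} \\
\text{(3)}\quad & \therefore\ \neg K(O) && \text{by modus tollens from (1) and (2)}
\end{align*}
```

Responses to the skeptic must therefore reject (1), restrict the closure principle in (2), or, as Putnam does below, attack the coherence of the vat scenario itself.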
This argument is a contemporary version of the argument given by Descartes in Meditations on First Philosophy (which he eventually rejects) that he could not trust his perceptions on the grounds that an evil demon might, conceivably, be controlling his every experience. It is also more distantly related to Descartes' argument that he cannot trust his perceptions because he may be dreaming (Descartes' dream argument is preceded by Zhuangzi in "Chuang Chou dreamed he was a butterfly"). In this latter argument the worry about active deception is removed.
Such puzzles have been worked over in many variations by philosophers in recent decades. The American philosopher Hilary Putnam popularized the modern brain-in-a-vat formulation of Descartes's "evil demon", although it brings up such complications and objections as whether the mind is reducible to the workings of a brain. Some, including Barry Stroud, continue to insist that such puzzles constitute an unanswerable objection to any knowledge claims. Putnam himself argued against the special case of a brain born in a vat. In the first chapter of his 1981 book Reason, Truth and History, Putnam claims that the thought experiment is inconsistent on the grounds that a brain born in a vat could not have the sort of history and interaction with the world that would allow its thoughts or words to be about the vat that it is in.
In other words, if a brain in a vat stated "I am a brain in a vat", it would always be stating a falsehood. If the brain making this statement lives in the "real" world, then it is not a brain in a vat. On the other hand, if the brain making this statement is really just a brain in a vat, then by stating "I am a brain in a vat" what the brain is really stating is "I am what nerve stimuli have convinced me is a 'brain,' and I reside in an image that I have been convinced is called a 'vat'." That is, a brain in a vat would never be thinking about real brains or real vats, but rather about images sent into it that resemble real brains or real vats. This, of course, makes our definition of "real" even more muddled. This refutation of the vat theory is a consequence of Putnam's endorsement, at that time, of the causal theory of reference. Roughly, in this case: if you have never experienced the real world, then you cannot have thoughts about it, whether to deny or affirm them. Putnam contends that by "brain" and "vat" the brain in a vat must be referring not to things in the "outside" world but to elements of its own "virtual world"; and it is clearly not a brain in a vat in that sense. A further consequence is that the supposed brain in a vat cannot have any evidence for being a brain in a vat, because that would amount to saying "I have what nerve stimuli have convinced me is evidence of my being a brain in a vat" and "Nerve stimuli have convinced me of the fact that I am a brain in a vat".
Many writers, however, have found Putnam's proposed solution unsatisfying, as it appears, in this regard at least, to depend on a shaky theory of meaning: the claim that we cannot meaningfully talk or think about the "external" world because we cannot experience it sounds like a version of the outmoded verification principle. Consider the following quote: "How can the fact that, in the case of the brains in a vat, the language is connected by the program with sensory inputs which do not intrinsically or extrinsically represent trees (or anything external) possibly bring it about that the whole system of representations, the language in use, does refer to or represent trees or any thing external?" Putnam here argues from the lack of sensory inputs representing (real-world) trees to our inability to meaningfully think about trees. But it is not clear why the referents of our terms must be accessible to us in experience. One cannot, for example, have experience of other people's private states of consciousness; does this imply that one cannot meaningfully ascribe mental states to others?
Subsequent writers on the topic have been particularly interested in the problems it presents for content: that is, how, if at all, the brain's thoughts can be about a person or place with which it has never interacted and which perhaps does not exist.
The sociology of knowledge is the study of the relationship between human thought and the social context within which it arises, and of the effects that prevailing ideas have on societies. (See also: sociology of scientific knowledge.)
The term first came into widespread use in the 1920s, when a number of German-speaking sociologists, most notably Max Scheler and Karl Mannheim, wrote extensively on it. With the dominance of functionalism through the middle years of the 20th century, the sociology of knowledge tended to remain on the periphery of mainstream sociological thought. It was largely reinvented and applied much more closely to everyday life in the 1960s, particularly by Peter L. Berger and Thomas Luckmann in The Social Construction of Reality (1966), and is still central to methods dealing with the qualitative understanding of human society (compare socially constructed reality). The 'genealogical' and 'archaeological' studies of Michel Foucault are of considerable contemporary influence.
The German political philosophers Karl Marx (1818–1883) and Friedrich Engels (1820–1895) argued in Die Deutsche Ideologie (1846, The German Ideology) and elsewhere that people's ideologies, including their social and political beliefs and opinions, are rooted in their class interests, and more broadly in the social and economic circumstances in which they live: "It is men, who in developing their material intercourse, change, along with this their real existence, their thinking and the products of their thinking. Life is not determined by consciousness, but consciousness by life" (Marx-Engels Gesamtausgabe 1/5).

Under the influence of this doctrine, and of phenomenology, the Hungarian-born German sociologist Karl Mannheim (1893–1947) gave impetus to the growth of the sociology of knowledge with his Ideologie und Utopie (1929, translated and extended in 1936 as Ideology and Utopia), although the term had been introduced five years earlier by the co-founder of the movement, the German philosopher, phenomenologist and social theorist Max Scheler (1874–1928), in Versuche zu einer Soziologie des Wissens (1924, Attempts at a Sociology of Knowledge). Mannheim feared that this interpretation could be seen to claim that all knowledge and beliefs are the products of socio-political forces, a form of relativism that is self-defeating (if it is true, then it too is merely a product of socio-political forces and has no claim to truth and no persuasive force). Mannheim believed that relativism was a strange mixture of modern and ancient beliefs, in that it contained within itself a belief in an absolute truth, true for all times and places (the ancient view most often associated with Plato), yet condemned other truth claims because they could not achieve this level of objectivity (an idea gleaned from Marx). Mannheim sought to escape this problem with the idea of 'relationism'.
This is the idea that certain things are true only in certain times and places (a view influenced by pragmatism); however, this does not make them less true. Mannheim felt that a stratum of free-floating intellectuals (who he claimed were only loosely anchored to the class structure of society) could most perfectly realize this form of truth by creating a "dynamic synthesis" of the ideologies of other groups.
Phenomenological Sociology is the study of the formal structures of concrete social existence as made available in and through the analytical description of acts of intentional consciousness. The "object" of such an analysis is the meaningful lived world of everyday life: the "Lebenswelt", or Life-world (Husserl:1989). The task, like that of every other phenomenological investigation, is to describe the formal structures of this object of investigation in subjective terms, as an object-constituted-in-and-for-consciousness (Gurwitsch:1964). That which makes such a description different from the "naive" subjective descriptions of the man in the street, or those of the traditional, positivist social scientist, is the utilization of phenomenological methods.
The leading proponent of Phenomenological Sociology was Alfred Schutz (1899–1959). Schutz sought to provide a critical philosophical foundation for Max Weber's interpretive sociology through the use of phenomenological methods derived from the transcendental phenomenological investigations of Edmund Husserl (1859–1938). Husserl's work was directed at establishing the formal structures of intentional consciousness; Schutz's work was directed at establishing the formal structures of the Life-world (Schutz:1980). Husserl's work was conducted as a transcendental phenomenology of consciousness; Schutz's work was conducted as a mundane phenomenology of the Life-world (Natanson:1974). The difference in their research projects lies at the level of analysis, the objects taken as topics of study, and the type of phenomenological reduction that is employed for the purposes of analysis.
Ultimately, the two projects should be seen as complementary, with the structures of the latter dependent on the structures of the former. That is, valid phenomenological descriptions of the formal structures of the Life-world should be wholly consistent with the descriptions of the formal structures of intentional consciousness. It is from the latter that the former derives its validity and truth value (Sokolowski:2000).
The phenomenological tie-in with the sociology of knowledge stems from two key historical sources for Mannheim's analysis. First, Mannheim was dependent on insights derived from Husserl's phenomenological investigations, especially the theory of meaning found in Husserl's Logical Investigations of 1900/1901 (Husserl:2000), in the formulation of his central methodological work, "On The Interpretation of Weltanschauung" (Mannheim:1993: see fn41 & fn43); this essay forms the centerpiece of Mannheim's method of historical understanding and is central to his conception of the sociology of knowledge as a research program. Second, the concept of "Weltanschauung" employed by Mannheim has its origins in the hermeneutic philosophy of Wilhelm Dilthey, who relied on Husserl's theory of meaning (above) for his methodological specification of the interpretive act (Mannheim:1993: see fn38).
It is also noteworthy that Husserl's analysis of the formal structures of consciousness, and Schutz's analysis of the formal structures of the Life-world are specifically intended to establish the foundations, in consciousness, for the understanding and interpretation of a social world which is subject to cultural and historical change. The phenomenological position is that although the facticity of the social world may be culturally and historically relative, the formal structures of consciousness, and the processes by which we come to know and understand this facticity, are not. That is, the understanding of any actual social world is unavoidably dependent on understanding the structures and processes of consciousness that found, and constitute, any possible social world.
Alternately, if the facticity of the social world and the structures of consciousness prove to be culturally and historically relative, then we are at an impasse in regard to any meaningful scientific understanding of the social world: one which is not subjective (as opposed to being objective and grounded in nature [positivism], or intersubjective and grounded in the structures of consciousness [phenomenology]), and which is not relative to the cultural formations and idealizations of particular concrete individuals living in a particular socio-historical group.
A particularly important contemporary contribution to the sociology of knowledge is found in the work of Michel Foucault. Madness and Civilization (1961) postulated that conceptions of madness, and of what was considered "reason" or "knowledge", were themselves subject to major cultural bias, in this respect mirroring similar criticisms by Thomas Szasz, himself a psychiatrist and at the time the foremost critic of psychiatry. A point where Foucault and Szasz agreed was that sociological processes played the major role in defining "madness" as an "illness" and in prescribing "cures". In The Birth of the Clinic: An Archeology of Medical Perception (1963), Foucault extended his critique to institutional clinical medicine, arguing for the central conceptual metaphor of "the gaze", which had implications for medical education, prison design, and the carceral state as understood today. Concepts of criminal justice and its intersection with medicine were better developed in this work than in Szasz and others, who confined their critique to current psychiatric practice. The Order of Things (1966) and The Archeology of Knowledge (1969) introduced the abstract notions of mathesis and taxonomia to explain the subjective 'ordering' of the human sciences. These, he claimed, had transformed 17th- and 18th-century studies of "general grammar" into modern "linguistics", "natural history" into modern "biology", and "analysis of wealth" into modern "economics"; though not, claimed Foucault, without loss of meaning. In Foucault's view, the 19th century transformed what knowledge itself was.
Perhaps Foucault's best-known claim was that "Man did not exist" before the 18th century. Foucault regarded notions of humanity and of humanism as inventions of modernity. Accordingly, a cognitive bias had been introduced unwittingly into science by over-trusting the individual doctor's or scientist's ability to see and state things objectively. Foucault roots this argument in the rediscovery of Kant, though his thought is significantly influenced by Nietzsche, who declared the "death of God" in the 19th century, just as the anti-humanists would proclaim the "death of Man" in the 20th.
Knowledge ecology is a concept originating in knowledge management that aims at "bridging the gap between the static data repositories of knowledge management and the dynamic, adaptive behavior of natural systems", relying in particular on the concepts of interaction and emergence. Knowledge ecology, and the related concept of information ecology, have been elaborated by academics and practitioners such as Thomas H. Davenport, Bonnie Nardi, and Georges Pór.
|Thomas Samuel Kuhn|
|Full name||Thomas Samuel Kuhn|
|Born||July 18, 1922|
|Died||June 17, 1996 (aged 73)|
|Main interests||Philosophy of science|
|Notable ideas||Paradigm shift|
Thomas Samuel Kuhn (surname pronounced /ˈkuːn/; July 18, 1922 – June 17, 1996) was an American intellectual who wrote extensively on the history of science and developed several important notions in the sociology and philosophy of science.
Kuhn made several important contributions to our understanding of the progress of scientific knowledge.
Thomas Kuhn was born in Cincinnati, Ohio to Samuel L. Kuhn, an industrial engineer, and Minette Stroock Kuhn. He obtained his B.S. degree in physics from Harvard University in 1943, and M.S. and Ph.D. degrees in physics in 1946 and 1949, respectively. As he states in the first few pages of the preface to the second edition of The Structure of Scientific Revolutions, his three years of total academic freedom as a Harvard Junior Fellow were crucial in allowing him to switch from physics to the history (and philosophy) of science. He then taught a course in the history of science at Harvard from 1948 until 1956, at the suggestion of university president James Conant. After leaving Harvard, Kuhn taught at the University of California, Berkeley, in both the philosophy department and the history department, being named Professor of the History of Science in 1961. At Berkeley, he wrote and published (in 1962) his best-known and most influential work: The Structure of Scientific Revolutions. In 1964, he joined Princeton University as the M. Taylor Pyne Professor of Philosophy and History of Science. In 1979, he joined the Massachusetts Institute of Technology (MIT) as the Laurance S. Rockefeller Professor of Philosophy, remaining there until 1991. Kuhn interviewed and taped the Danish physicist Niels Bohr the day before Bohr's death; the recording contains Bohr's last words caught on tape. In 1994, Kuhn was diagnosed with cancer of the bronchial tubes, of which he died in 1996.
Thomas Kuhn was married twice, first to Kathryn Muhs (with whom he had three children) and later to Jehane Barton (Jehane R. Kuhn).
The Structure of Scientific Revolutions (SSR) was originally printed as an article in the International Encyclopedia of Unified Science, published by the logical positivists of the Vienna Circle. In the book, Kuhn argued that science does not progress via a linear accumulation of new knowledge, but undergoes periodic revolutions, also called "paradigm shifts" (although he did not coin the phrase), in which the nature of scientific inquiry within a particular field is abruptly transformed. On this account, science is broken up into three distinct stages. Prescience, which lacks a central paradigm, comes first. This is followed by "normal science", in which scientists attempt to enlarge the central paradigm by "puzzle-solving". Thus, the failure of a result to conform to the paradigm is seen not as refuting the paradigm, but as the mistake of the researcher, contra Popper's refutability criterion. As anomalous results build up, science reaches a crisis, at which point a new paradigm, which subsumes the old results along with the anomalous results into one framework, is accepted. This is termed revolutionary science.
In SSR, Kuhn also argues that rival paradigms are incommensurable—that is, it is not possible to understand one paradigm through the conceptual framework and terminology of another rival paradigm. For many critics, for example David Stove (Popper and After, 1982), this thesis seemed to entail that theory choice is fundamentally irrational: if rival theories cannot be directly compared, then one cannot make a rational choice as to which one is better. Whether Kuhn's views had such relativistic consequences is the subject of much debate; Kuhn himself denied the accusation of relativism in the third edition of SSR, and sought to clarify his views to avoid further misinterpretation. Freeman Dyson has quoted Kuhn as saying "I am not a Kuhnian!", referring to the relativism that some philosophers have developed based on his work.
The enormous impact of Kuhn's work can be measured in the changes it brought about in the vocabulary of the philosophy of science: besides "paradigm shift", Kuhn raised the word "paradigm" itself from a term used in certain forms of linguistics to its current broader meaning, coined the term "normal science" to refer to the relatively routine, day-to-day work of scientists working within a paradigm, and was largely responsible for the use of the term "scientific revolutions" in the plural, taking place at widely different periods of time and in different disciplines, as opposed to a single "Scientific Revolution" in the late Renaissance. The frequent use of the phrase "paradigm shift" has made scientists more aware of and in many cases more receptive to paradigm changes, so that Kuhn’s analysis of the evolution of scientific views has by itself influenced that evolution.
Kuhn's work has been extensively used in social science; for instance, in the post-positivist/positivist debate within International Relations. Kuhn is credited as a foundational force behind the post-Mertonian Sociology of Scientific Knowledge.
A defense Kuhn gives against the objection that his account of science from The Structure of Scientific Revolutions results in relativism can be found in an essay by Kuhn called "Objectivity, Value Judgment, and Theory Choice." In this essay, he reiterates five criteria from the penultimate chapter of SSR that determine (or, more properly, help determine) theory choice: accuracy, consistency, scope, simplicity, and fruitfulness.
He then goes on to show how, although these criteria admittedly determine theory choice, they are imprecise in practice and relative to individual scientists. According to Kuhn, "When scientists must choose between competing theories, two men fully committed to the same list of criteria for choice may nevertheless reach different conclusions." For this reason, the criteria still are not "objective" in the usual sense of the word, because individual scientists reach different conclusions with the same criteria, whether by valuing one criterion over another or even by adding additional criteria for selfish or other subjective reasons. Kuhn then goes on to say, "I am suggesting, of course, that the criteria of choice with which I began function not as rules, which determine choice, but as values, which influence it." Because Kuhn draws on the history of science in his account, his criteria or values for theory choice are often understood as descriptive norms (or, more properly, values) of theory choice for the scientific community, rather than as prescriptive rules in the usual sense of the word "criteria," although there are many varied interpretations of Kuhn's account of science.
Although they used different terminologies, both Kuhn and Michael Polanyi believed that scientists' subjective experiences made science a relativistic discipline. Polanyi lectured on this topic for decades before Kuhn published The Structure of Scientific Revolutions.
Supporters of Polanyi charged Kuhn with plagiarism, as it was known that Kuhn had attended several of Polanyi's lectures, and that the two men had debated endlessly over the epistemology of science before either had achieved fame. In response to these critics, Kuhn cited Polanyi in the second edition of The Structure of Scientific Revolutions, and the two agreed to set aside their differences in the hope of illuminating the dynamic nature of science. Despite this intellectual alliance, Polanyi's work was constantly interpreted by others within the framework of Kuhn's paradigm shifts, much to Polanyi's (and Kuhn's) dismay.
Kuhn was named a Guggenheim Fellow in 1954, and in 1982 was awarded the George Sarton Medal by the History of Science Society. He was also awarded numerous honorary doctorates.
|The Structure of Scientific Revolutions|
|Author||Thomas Samuel Kuhn|
|Subject(s)||History of science|
|Publisher||University of Chicago Press|
The Structure of Scientific Revolutions (1962), by Thomas Kuhn, is an analysis of the history of science. Its publication was a landmark event in the sociology of scientific knowledge, and popularized the terms paradigm and paradigm shift.
The work was first published as a monograph in the International Encyclopedia of Unified Science, then as a book by University of Chicago Press in 1962. (All page numbers below refer to the third edition of the text, published in 1996). In 1969, Kuhn added a postscript to the book in which he replied to critical responses to the first edition of the book.
Kuhn dated the genesis of his book to 1947, when he was a graduate student at Harvard University and had been asked to teach a science class for humanities undergraduates with a focus on historical case studies. Kuhn later commented that until then, "I'd never read an old document in science." He found Aristotle's Physics astonishingly unlike Isaac Newton's work in its concepts of matter and motion, and concluded that Aristotle's concepts were not "bad Newton," just different.
Kuhn's approach to the history and philosophy of science has been described as focusing on conceptual issues: what sorts of ideas were thinkable at a particular time? What sorts of intellectual options and strategies were available to people during a given period? What types of lexicons and terminology were known and employed during certain epochs? Stressing the importance of not attributing modern modes of thought to historical actors, Kuhn's book argues that the evolution of scientific theory does not emerge from the straightforward accumulation of facts, but rather from a set of changing intellectual circumstances and possibilities. Such an approach is largely commensurate with the general historical school of non-linear history.
Kuhn explains his ideas using examples taken from the history of science. For instance, at a particular stage in the history of chemistry, some chemists began to explore the idea of atomism. When many substances are heated they have a tendency to decompose into their constituent elements, and often (though not invariably) these elements can be observed to combine only in set proportions. At one time, a combination of water and alcohol was generally classified as a compound. Nowadays it is considered to be a solution, but there was no reason then to suspect that it was not a compound. Water and alcohol would not separate spontaneously, but they could be separated when heated. Water and alcohol can be combined in any proportion.
A chemist favoring atomic theory would have viewed all compounds whose elements combine in fixed proportions as exhibiting normal behavior, and all known exceptions to this pattern would be regarded as anomalies whose behavior would probably be explained at some time in the future. On the other hand, if a chemist believed that theories of the atomicity of matter were erroneous, then all compounds whose elements combined in fixed proportions would be regarded as anomalies whose behavior would probably be explained at some time in the future, and all those compounds whose elements are capable of combining in any ratio would be seen as exhibiting the normal behavior of compounds. Nowadays the consensus is that the atomists' view was correct. But if one were to restrict oneself to thinking about chemistry using only the knowledge available at the time, either point of view would be defensible.
What is arguably the most famous example of a revolution in scientific thought is the Copernican Revolution. In Ptolemy's school of thought, cycles and epicycles (with some additional concepts) were used for modeling the movements of the planets in a cosmos that had a stationary Earth at its center. As the accuracy of celestial observations increased, the complexity of the Ptolemaic cyclical and epicyclical mechanisms had to increase in step, in order to keep the calculated planetary positions close to the observed positions. Copernicus proposed a cosmology in which the Sun was at the center and the Earth was one of the planets revolving around it. For modeling the planetary motions, Copernicus used the tools he was familiar with, namely the cycles and epicycles of the Ptolemaic toolbox. But Copernicus's model needed more cycles and epicycles than existed in the then-current Ptolemaic model, and due to a lack of accuracy in calculations, his model did not appear to provide more accurate predictions than the Ptolemaic model. Copernicus's contemporaries rejected his cosmology, and Kuhn asserts that they were quite right to do so: Copernicus's cosmology lacked credibility.
Thomas Kuhn illustrates how a paradigm shift later became possible when Galileo Galilei introduced his new ideas concerning motion. Intuitively, when an object is set in motion, it soon comes to a halt. A well-made cart may travel a long distance before it stops, but unless something keeps pushing it, it will eventually stop moving. Aristotle had argued that this was presumably a fundamental property of nature: in order for the motion of an object to be sustained, it must continue to be pushed. Given the knowledge available at the time, this represented sensible, reasonable thinking.
Galileo put forward a bold alternative conjecture: suppose, he said, that we always observe objects coming to a halt simply because some friction is always occurring. Galileo had no equipment with which to objectively confirm his conjecture, but he suggested that without any friction to slow down an object in motion, its inherent tendency is to maintain its speed without the application of any additional force.
The Ptolemaic approach of using cycles and epicycles was becoming strained: there seemed to be no end to the mushrooming growth in complexity required to account for the observable phenomena. Johannes Kepler was the first person to abandon the tools of the Ptolemaic paradigm. He started to explore the possibility that the planet Mars might have an elliptical orbit rather than a circular one. Clearly, the angular velocity could not be constant, but it proved very difficult to find the formula describing the rate of change of the planet's angular velocity. After many years of calculations, Kepler arrived at what we now know as the law of equal areas.
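In modern notation (a later reconstruction; neither Kepler nor Kuhn states it this way), the law of equal areas says that the line joining the Sun to the planet sweeps out equal areas in equal times, which fixes how the angular velocity must vary:

```latex
% Areal velocity in polar coordinates (r, \theta) with the Sun at the origin:
\frac{dA}{dt} \;=\; \tfrac{1}{2}\, r^{2}\, \frac{d\theta}{dt} \;=\; \text{constant}
\qquad\Longrightarrow\qquad
\frac{d\theta}{dt} \;\propto\; \frac{1}{r^{2}}
```

This is why the angular velocity of a planet on an elliptical orbit cannot be constant: the planet moves fastest near perihelion, where r is smallest, which is exactly the non-uniformity Kepler struggled for years to describe.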
Galileo's conjecture was merely that — a conjecture. So was Kepler's cosmology. But each conjecture increased the credibility of the other, and together, they changed the prevailing perceptions of the scientific community. Later, Newton showed that Kepler's three laws could all be derived from a single theory of motion and planetary motion. Newton solidified and unified the paradigm shift that Galileo and Kepler had initiated.
One of the aims of science is to find models that will account for as many observations as possible within a coherent framework. Together, Galileo's rethinking of the nature of motion and Keplerian cosmology represented a coherent framework that was capable of rivaling the Aristotelian/Ptolemaic framework.
Once a paradigm shift has taken place, the textbooks are rewritten. Often the history of science too is rewritten, being presented as an inevitable process leading up to the current, established framework of thought. There is a prevalent belief that all hitherto-unexplained phenomena will in due course be accounted for in terms of this established framework. Kuhn states that scientists spend most (if not all) of their careers in a process of puzzle-solving. Their puzzle-solving is pursued with great tenacity, because the previous successes of the established paradigm tend to generate great confidence that the approach being taken guarantees that a solution to the puzzle exists, even though it may be very hard to find. Kuhn calls this process normal science.
As a paradigm is stretched to its limits, anomalies — failures of the current paradigm to take into account observed phenomena — accumulate. Their significance is judged by the practitioners of the discipline. Some anomalies may be dismissed as errors in observation, others as merely requiring small adjustments to the current paradigm that will be clarified in due course. Some anomalies resolve themselves spontaneously, having increased the available depth of insight along the way. But no matter how great or numerous the anomalies that persist, Kuhn observes, the practicing scientists will not lose faith in the established paradigm for as long as no credible alternative is available; to lose faith in the solubility of the problems would in effect mean ceasing to be a scientist.
In any community of scientists, Kuhn states, there are some individuals who are bolder than most. These scientists, judging that a crisis exists, embark on what Thomas Kuhn calls revolutionary science, exploring alternatives to long-held, obvious-seeming assumptions. Occasionally this generates a rival to the established framework of thought. The new candidate paradigm will appear to be accompanied by numerous anomalies, partly because it is still so new and incomplete. The majority of the scientific community will oppose any conceptual change, and, Kuhn emphasizes, so they should. In order to fulfill its potential, a scientific community needs to contain both individuals who are bold and individuals who are conservative. There are many examples in the history of science in which confidence in the established frame of thought was eventually vindicated. Whether the anomalies of a candidate for a new paradigm will be resolvable is almost impossible to predict. Those scientists who possess an exceptional ability to recognize a theory's potential will be the first whose preference is likely to shift in favour of the challenging paradigm. There typically follows a period in which there are adherents of both paradigms. In time, if the challenging paradigm is solidified and unified, it will replace the old paradigm, and a paradigm shift will have occurred.
Chronologically, Kuhn distinguishes between three phases. The first phase, which exists only once, is the pre-paradigm phase, in which there is no consensus on any particular theory, though the research being carried out can be considered scientific in nature. This phase is characterized by several incompatible and incomplete theories. If the actors in the pre-paradigm community eventually gravitate to one of these conceptual frameworks, and ultimately to a widespread consensus on the appropriate choice of methods, terminology, and the kinds of experiment that are likely to contribute to increased insight, then the second phase, normal science, begins, in which puzzles are solved within the context of the dominant paradigm. As long as there is general consensus within the discipline, normal science continues. Over time, progress in normal science may reveal anomalies: facts that are difficult to explain within the context of the existing paradigm. While usually these anomalies are resolved, in some cases they may accumulate to the point where normal science becomes difficult and where weaknesses in the old paradigm are revealed. Kuhn refers to this as a crisis. Crises are often resolved within the context of normal science. However, after significant efforts of normal science within a paradigm fail, science may enter the third phase, that of revolutionary science, in which the underlying assumptions of the field are reexamined and a new paradigm is established. After the new paradigm's dominance is established, scientists return to normal science, solving puzzles within the new paradigm. A science may go through these cycles repeatedly, though Kuhn notes that it is a good thing for science that such shifts do not occur often or easily.
According to Kuhn, the scientific paradigms preceding and succeeding a paradigm shift are so different that their theories are incommensurable — the new paradigm cannot be proven or disproven by the rules of the old paradigm, and vice versa. The paradigm shift does not merely involve the revision or transformation of an individual theory, it changes the way terminology is defined, how the scientists in that field view their subject, and, perhaps most significantly, what questions are regarded as valid, and what rules are used to determine the truth of a particular theory. The new theories were not, as the scientists had previously thought, just extensions of old theories, but were instead completely new world views. Such incommensurability exists not just before and after a paradigm shift, but in the periods in between conflicting paradigms. It is simply not possible, according to Kuhn, to construct an impartial language that can be used to perform a neutral comparison between conflicting paradigms, because the very terms used are integral to the respective paradigms, and therefore have different connotations in each paradigm. The advocates of mutually exclusive paradigms are in an invidious position: "Though each may hope to convert the other to his way of seeing science and its problems, neither may hope to prove his case. The competition between paradigms is not the sort of battle that can be resolved by proof." (SSR, p. 148). Scientists subscribing to different paradigms end up talking past one another.
Kuhn (SSR, section XII) states that the probabilistic tools used by verificationists are inherently inadequate for the task of deciding between conflicting theories, since they belong to the very paradigms they seek to compare. Similarly, observations that are intended to falsify a statement will fall under one of the paradigms they are supposed to help compare, and will therefore also be inadequate for the task. According to Kuhn, the concept of falsifiability is unhelpful for understanding why and how science has developed as it has. In the practice of science, scientists will only consider the possibility that a theory has been falsified if an alternative theory is available which they judge to be credible. If no such alternative is available, scientists will continue to adhere to the established conceptual framework. If a paradigm shift has occurred, the textbooks will be rewritten to state that the previous theory has been falsified.
The first edition of SSR ended with a chapter entitled "Progress through Revolutions", in which Kuhn spelled out his views on the nature of scientific progress. Since he considered problem solving to be a central element of science, Kuhn held that for a new candidate paradigm to be accepted by a scientific community, "First, the new candidate must seem to resolve some outstanding and generally recognized problem that can be met in no other way. Second, the new paradigm must promise to preserve a relatively large part of the concrete problem solving activity that has accrued to science through its predecessors." Overall, Kuhn maintained that the new paradigm must also solve more problems than its predecessor: the number of problems it newly solves must exceed the number of problems that the old paradigm solved but the new one no longer does.
In the second edition of SSR, Kuhn added a postscript in which he elaborated his ideas on the nature of scientific progress. He described a thought experiment involving an observer who has the opportunity to inspect an assortment of theories, each corresponding to a single stage in a succession of theories. What if the observer is presented with these theories without any explicit indication of their chronological order? Kuhn anticipates that it will be possible to reconstruct their chronology on the basis of the theories' scope and content, because the more recent a theory is, the better it will be as an instrument for solving the kinds of puzzle that scientists aim to solve. Kuhn remarked: "That is not a relativist's position, and it displays the sense in which I am a convinced believer in scientific progress."
In 1987, Kuhn's work was reported to be the twentieth-century book most frequently cited in the period 1976-83 in the Arts and the Humanities and the Times Literary Supplement labeled it one of "The Hundred Most Influential Books Since the Second World War." The book's basic concepts have been adopted and co-opted by a variety of fields and disciplines beyond those encompassing the history and philosophy of science.
SSR is viewed by postmodern and post-structuralist thinkers as having called into question the enterprise of science by demonstrating that scientific knowledge is dependent on the culture and historical circumstances of groups of scientists rather than on their adherence to a specific, definable method. In this regard, Kuhn is considered a precursor to the more radical thinking of Paul Feyerabend. Kuhn's work has also been regarded as blurring the demarcation between scientific and non-scientific enterprises, because it describes the mechanism of scientific progress without invoking any idealized scientific method that is capable of distinguishing science from non-science. In the years following the publication of The Structure of Scientific Revolutions, debate raged with adherents of Karl Popper's doctrine of falsificationism, such as Imre Lakatos.
On the one hand, logical positivists and many scientists have criticized Kuhn's "humanizing" of the scientific process for going too far, while the postmodernists, together with Feyerabend, have criticized Kuhn for not going far enough. SSR has also been embraced by those wishing to discredit or attack the authority of science, such as creationists and radical environmentalists, and it was also in tune with the national change in attitudes towards science which was occurring at the time of the book's publication (Rachel Carson's Silent Spring was published in the same year as SSR). Indeed, modern scholars have speculated whether Kuhn would have been more explicit about his intention not to create a tool that could be used for attempting to undermine science and the scientific process if he had been able to foresee these developments.
The changes that occur in politics, society and business are often expressed in Kuhnian terms, however poor their parallel with the practice of science may seem to scientists and historians of science. The terms "paradigm" and "paradigm shift" have become such notorious clichés and buzzwords that they are viewed in many circles as being effectively devoid of content. Misused and overused to the point of becoming meaningless, the terms are rarely employed in these contexts with any firm foundation in Kuhn's original definitions.
Kuhn's SSR was soon criticized by his colleagues in the history and philosophy of science. In 1965, a special symposium on Kuhn's SSR was held at an International Colloquium on the Philosophy of Science that took place at Bedford College, London, and was chaired by Sir Karl Popper. The symposium led to the publication of its presentations, together with other essays, most of them critical, in an influential volume that by 1999 had gone through 21 printings. Kuhn expressed the opinion that his critics' readings of his book were so inconsistent with his own understanding of it that he was "...tempted to posit the existence of two Thomas Kuhns," one the author of his book, the other the individual who had been criticized in the symposium by "Professors Popper, Feyerabend, Lakatos, Toulmin and Watkins."
In his 1972 work, Human Understanding, Stephen Toulmin argued that a more realistic picture of science than that presented in SSR would admit the fact that revisions in science take place much more frequently, and are much less dramatic than can be explained by the model of revolution/normal science. In Toulmin's view, such revisions occur quite often during periods of what Kuhn would call "normal science." In order for Kuhn to explain such revisions in terms of the non-paradigmatic puzzle solutions of normal science, he would need to delineate what is perhaps an implausibly sharp distinction between paradigmatic and non-paradigmatic science.
In a series of texts published in the early 1970s, C.R. Kordig asserted a position somewhere between that of Kuhn and the older philosophy of science. His criticism of the Kuhnian position was that the incommensurability thesis was too radical, and that this made it impossible to explain the confrontation of scientific theories which actually occurs. According to Kordig, it is in fact possible to admit the existence of revolutions and paradigm shifts in science while still recognizing that theories belonging to different paradigms can be compared and confronted on the plane of observation. Those who accept the incommensurability thesis do not do so because they admit the discontinuity of paradigms, but because they attribute a radical change in meanings to such shifts.
Kordig maintains that there is a common observational plane. For example, when Kepler and Tycho Brahe are trying to explain the relative variation of the distance of the sun from the horizon at sunrise, both see the same thing (the same configuration is focused on the retina of each individual). This is just one example of the fact that "rival scientific theories share some observations, and therefore some meanings." Kordig suggests that with this approach, he is not reintroducing the distinction between observations and theory in which the former is assigned a privileged and neutral status, but that it is possible to affirm more simply the fact that, even if no sharp distinction exists between theory and observations, this does not imply that there are no comprehensible differences at the two extremes of this polarity.
At a secondary level, for Kordig there is a common plane of inter-paradigmatic standards or shared norms which permit the effective confrontation of rival theories.
In 1973, Hartry Field published an article which also sharply criticized Kuhn's idea of incommensurability. In particular, he took issue with this passage from Kuhn:
Field takes this idea of incommensurability between the same terms in different theories one step further. Instead of attempting to identify a persistence of the reference of terms in different theories, Field's analysis emphasizes the indeterminacy of reference within individual theories. Field takes the example of the term "mass", and asks what exactly "mass" means in modern post-relativistic physics. He finds that there are at least two different definitions:
Projecting this distinction backwards in time onto Newtonian dynamics, we can formulate the following two hypotheses:
According to Field, it is impossible to decide which of these two affirmations is true. Prior to the theory of relativity, the term "mass" was referentially indeterminate. But this does not mean that the term "mass" then had a different meaning than it has now. The problem is not one of meaning but of reference. The reference of terms such as "mass" is only partially determined: we do not really know how Newton intended his use of the term to be applied. As a consequence, neither of the two terms fully denotes (refers). It follows that it is improper to maintain that a term has changed its reference during a scientific revolution; it is more appropriate to describe terms such as "mass" as having undergone a "denotational refinement."
The close connection between the interpretationalist hypothesis and a holistic conception of beliefs is at the root of the notion of the dependence of perception on theory, a central concept in SSR. Kuhn maintained that the perception of the world depends on how the percipient conceives the world: two scientists who witness the same phenomenon and are steeped in two radically different theories will see two different things. According to this view, it is our interpretation of the world which determines what we see.
Jerry Fodor attempts to establish that this theoretical paradigm is fallacious and misleading by demonstrating the impenetrability of perception to the background knowledge of subjects. The strongest case can be based on evidence from experimental cognitive psychology, namely the persistence of perceptual illusions. Knowing that the lines in the Müller-Lyer illusion are equal does not prevent one from continuing to see one line as being longer than the other. It is this impenetrability of the information elaborated by the mental modules which limits the scope of interpretationalism.
In epistemology, for example, the criticism of what Fodor calls the interpretationalist hypothesis accounts for the common-sense intuition (on which naïve physics is based) of the independence of reality from the conceptual categories of the experimenter. If the processes of elaboration of the mental modules are in fact independent of the background theories, then it is possible to maintain the realist view that two scientists who embrace two radically diverse theories see the world exactly in the same manner even if they interpret it differently. The point is that it is necessary to distinguish between observations and the perceptual fixation of beliefs. While it is beyond doubt that the second process involves the holistic relationship between beliefs, the first is largely independent of the background beliefs of individuals.
Other critics, such as Israel Scheffler, Hilary Putnam and Saul Kripke, have focused on the Fregean distinction between sense and reference in order to defend a position of scientific realism. Scheffler contends that Kuhn confuses the meanings of terms such as "mass" with their references. While their meanings may very well differ, their references (the objects or entities to which they correspond in the external world) remain fixed.
More recently, criticism from a different direction has been developed by Arun Bala in his study The Dialogue of Civilizations in the Birth of Modern Science (Palgrave Macmillan, 2006). He charges that The Structure of Scientific Revolutions is itself a profoundly Eurocentric work, although it is often perceived as opening the door to the multicultural turn in historical studies of science. Bala charges that Kuhn ignores the significant impact of Arabic and Chinese science when he writes:
Every civilization of which we have records has possessed a technology, an art, a religion, a political system, laws and so on. In many cases those facets of civilizations have been as developed as our own. But only the civilizations that descend from Hellenic Greece have possessed more than the most rudimentary science. The bulk of scientific knowledge is a product of Europe in the last four centuries. No other place and time has supported the very special communities from which scientific productivity comes.—Kuhn, 1962, pp. 167–168
Bala argues that it is precisely Kuhn’s postmodern epistemological paradigm which obstructs recognition of non-Western influences on modern science. Bala argues that this leads Kuhn to treat different cultural scientific traditions as separate intellectual universes isolated from each other. Instead, Bala argues, we would have a different multicultural picture of science by including the contributions from Arabic, Chinese, ancient Egyptian and Indian traditions of philosophy, mathematics, astronomy and physics that went into shaping the birth of modern science.
Modal realism is the view, notably propounded by David Lewis, that all possible worlds are as real as the actual world. It is based on the following tenets: possible worlds exist; possible worlds are not different in kind from the actual world; possible worlds are irreducible entities; the term actual in actual world is indexical.
The term goes back to Leibniz's theory of possible worlds, used to analyse necessity, possibility, and similar modal notions. In short: the actual world is regarded as merely one among an infinite set of logically possible worlds, some "nearer" to the actual world and some more remote. A proposition is necessary if it is true in all possible worlds, and possible if it is true in at least one.
At the heart of David Lewis's modal realism are six central doctrines about possible worlds:
Lewis backs modal realism for a variety of reasons. First, there seems to be no compelling reason not to accept it. Many abstract mathematical entities are held to exist simply because they are useful. For example, sets are useful abstract mathematical constructs that were only conceived in the 19th century. Sets are now considered to be objects in their own right, and while this is a philosophically unintuitive idea, its usefulness in understanding the workings of mathematics makes belief in it worthwhile. The same should go for possible worlds. Since these constructs have helped us make sense of key philosophical concepts in epistemology, metaphysics, philosophy of mind, etc., their existence should likewise be accepted on pragmatic grounds.
Second, Lewis believes that the entire concept of modality can be reduced to talk of possible worlds. For example, to say "x is possible" is to say that there exists a possible world in which x is the case; to say "x is necessary" is to say that x is the case in all possible worlds. Talk of possible worlds thus provides a kind of theoretical economy, requiring the smallest number of undefined primitives and axioms in our ontology.
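The reduction described above can be sketched in a few lines of code. The toy "worlds" below, modelled simply as sets of atomic propositions true at each world, are purely illustrative assumptions and not part of Lewis's own apparatus; the point is only that possibility becomes existential quantification over worlds, and necessity universal quantification:

```python
# Illustrative model: each possible world is represented as the set of
# atomic propositions that hold at it. (A hypothetical toy model, not
# Lewis's formalism.)
worlds = [
    {"p", "q"},   # world 1
    {"p"},        # world 2
    {"q", "r"},   # world 3
]

def possible(prop, worlds):
    """'prop is possible' = prop holds at some possible world."""
    return any(prop in w for w in worlds)

def necessary(prop, worlds):
    """'prop is necessary' = prop holds at every possible world."""
    return all(prop in w for w in worlds)

print(possible("r", worlds))   # True: "r" holds at world 3
print(necessary("p", worlds))  # False: "p" fails at world 3
print(possible("s", worlds))   # False: "s" holds at no world
```

Note how the modal vocabulary ("possible", "necessary") disappears entirely in favour of quantification over the stock of worlds, which is the economy of primitives the paragraph above describes.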
In philosophy possible worlds are usually regarded as real but abstract possibilities, or sometimes as a mere metaphor, abbreviation, or façon de parler for sets of counterfactual propositions.
Lewis himself not only claimed to take modal realism seriously (although he did regret his choice of the expression modal realism), he also insisted that his claims should be taken literally:
By what right do we call possible worlds and their inhabitants disreputable entities, unfit for philosophical services unless they can beg redemption from philosophy of language? I know of no accusation against possibles that cannot be made with equal justice against sets. Yet few philosophical consciences scruple at set theory. Sets and possibles alike make for a crowded ontology. Sets and possibles alike raise questions we have no way to answer. [...] I propose to be equally undisturbed by these equally mysterious mysteries.
How many [possible worlds] are there? In what respects do they vary, and what is common to them all? Do they obey a nontrivial law of identity of indiscernibles? Here I am at a disadvantage compared to someone who pretends as a figure of speech to believe in possible worlds, but really does not. If worlds were creatures of my imagination, I could imagine them to be any way I liked, and I could tell you all you wished to hear simply by carrying on my imaginative creation. But as I believe that there really are other worlds, I am entitled to confess that there is much about them that I do not know, and that I do not know how to find out.
While it may appear to be a simply extravagant account of modality, modal realism has proven to be historically quite resilient, and has so far resisted all attempts at definitive refutation. Lewis' own extended presentation of the theory (On the Plurality of Worlds, 1986) raises and then counters several lines of argument against it. That work is still the best introduction not only to the theory, but to its reception among philosophers. The many objections that continue to be published are typically variations on one or other of the lines that Lewis has already canvassed.
Here are some of the major categories of objection:
The theory does not accord with our deepest intuitions about reality. This is sometimes called "the incredulous stare", since it lacks argumentative content, and is merely an expression of the affront that the theory represents to "common sense" philosophical and pre-philosophical orthodoxy. Lewis is concerned to support the deliverances of common sense in general: "Common sense is a settled body of theory — unsystematic folk theory — which at any rate we do believe; and I presume that we are reasonable to believe it. (Most of it.)" (1986, p. 134). But most of it is not all of it (otherwise there would be no place for philosophy at all), and Lewis finds that reasonable argument and the weight of such considerations as theoretical efficiency compel us to accept modal realism. The alternatives, he argues at length, can themselves be shown to yield conclusions offensive to our modal intuitions.
Some object that modal realism postulates vastly too many entities, compared with other theories. It is therefore, they argue, vulnerable to Occam's razor, according to which we should prefer, all things being equal, those theories that postulate the smallest number of entities. Lewis's reply is that all things are not equal, and in particular competing accounts of possible worlds themselves postulate more classes of entities, since there must be not only one real "concrete" world (the actual world), but many worlds of a different class altogether ("abstract" in some way or other).
This is perhaps a variant of the previous category, but it relies on appeals to mathematical propriety rather than Occamist principles. Some argue that Lewis's principles of "worldmaking" (means by which we might establish the existence of further worlds by recombination of parts of worlds we already think exist) are too permissive. So permissive are they, in fact, that the total number of worlds must exceed what is mathematically coherent. Lewis allows that there are difficulties and subtleties to address on this front (1986, pp. 89–90). Daniel Nolan ("Recombination unbound", Philosophical Studies, 1996, vol. 84, pp. 239–262) mounts a sustained argument against certain forms of the objection; but variations on it continue to appear.
On the version of his theory that Lewis strongly favours, each world is distinct from every other world by being spatially and temporally isolated from it. Some have objected that a world in which spatio-temporally isolated universes ("island universes") coexist is therefore not possible, by Lewis's theory (see for example Bigelow, John, and Pargetter, Robert, "Beyond the blank stare", Theoria, 1987, Vol. 53, pp. 97–114). Lewis's awareness of this difficulty discomforted him; but he could have replied that other means of distinguishing worlds may be available, or alternatively that sometimes there will inevitably be further surprising and counterintuitive consequences — beyond what we had thought we would be committed to at the start of our investigation. But this fact in itself is hardly surprising.
A continuing theme in Lewis's replies to the critics of modal realism is the use of tu quoque argument: your account would fail in just the same way that you claim mine would. A major heuristic virtue of Lewis's theory is that it is sufficiently definite for objections to gain some foothold; but these objections, once clearly articulated, can then be turned equally against other theories of the ontology and epistemology of possible worlds.
Another criticism levelled against modal realism, specifically applied to the mathematical expression of it, Max Tegmark's Ultimate ensemble, is that it equates mathematical reality with physical reality:
Physical existence is something that we have some experience of. We probably can't define it but, like many things we have difficulty defining, we know it when we see it. Mathematical existence is a far weaker thing, but much easier to define. Mathematical existence just means logical self-consistency: this is all that is needed for a mathematical statement to be "true". (Barrow, 2002, pp. 279–80)
However, as Damon Woolsey points out, the rather intuitive notion of physical existence being defined by its comprising "real stuff" is the true metaphysical extravagance. The existence of "real stuff" constituting the building blocks of matter would be undetectable, because the question of whether any actual "physical stuff" exists or not could have no effect on the interactions of matter we observe, and thus there is no reason to suppose the existence of the "real stuff" that constitutes matter. Occam's razor actually favours modal realism over other explanations because it posits one less unobservable category of existence: namely, physical substance.