
William R. Catton, Jr.

From Wikipedia, the free encyclopedia


William R. Catton, Jr. (born January 15, 1926) is an American sociologist best known for his scholarly work in environmental sociology and human ecology. His intellectual approach is broad and interdisciplinary. Catton's repute extends beyond academic social science due primarily to his 1980 book, Overshoot: The Ecological Basis of Revolutionary Change. Catton has written three other books, including From Animistic to Naturalistic Sociology. In addition he has authored numerous scholarly articles, book chapters and book reviews. As of 2008 he was working on a new book, titled Bottleneck: Humanity's Impending Impasse. He is retired from academic life and lives in Lakewood, Washington, USA.




William Catton was born in Minneapolis, Minnesota on January 15, 1926. He served in the US Navy from 1943 to 1946. After his military service he enrolled at Oberlin College, where he met Nancy Lewis. The two were married in 1949 and remained married as of 2008. They have four sons, six grandchildren, and two great-grandchildren.

Catton graduated from Oberlin College with an A.B. degree in 1950, whereupon he entered the graduate program in sociology at the University of Washington. He earned his M.A. there in 1952 and his Ph.D. in 1954. He is now Professor Emeritus of Sociology at Washington State University. Catton served as President of the Pacific Sociological Association 1984-85 and as the first chair of the American Sociological Association Section on Environmental Sociology.[1]

Intellectual development

William Catton started his professional career as a mainstream sociologist, without a special focus on the environment. However, in the course of his early research he worked with John Hendee, a USFS forest ranger, and Frank Brockman, a National Parks naturalist who became Professor of Forestry at the University of Washington. Catton became sensitized to population issues by noting the congestion at campgrounds in the natural parks he visited in the northwest US and Canada. He was also influenced by the museum exhibits in the Visitor Centers in these parks.

From an early point in his career Catton was dissatisfied with the qualitative slant of sociology and wanted to put the discipline on a more quantitative and thus scientific footing. He felt that this orientation would help sociologists guide human societies to a better future. This "neopositivist" attitude was directed towards ecological issues after Catton resigned his position at the University of Washington in 1970 and moved to the University of Canterbury in Christchurch, New Zealand. He had become discouraged by the rapidly swelling student body at his old university, and by the adverse social effects of a sharp population increase in the Puget Sound region.

In New Zealand, Catton once again associated with foresters and became familiar with that country's national park system. His "Aha!" moment came at the Visitor Center in Westland National Park. An exhibit there represented the case of land newly made bare by a receding glacier. It showed the sequence of changes in vegetation at increasing distances from the retreating ice, representing earlier and earlier time periods. This example of ecological succession - the transformation of ecosystems over time - showed plant species altering their environment, thereby making it less suitable for them but more suitable for successor species. Another revelation came when he picked up the book Violence, Monkeys, and Man, which reinforced his personal view that higher population is associated with greater stress and violence. With these two experiences, Catton's broad aim to make sociology more scientific became the specific aim to make the discipline more cognizant of the biogeochemical processes associated with the environment. This paradigm shift led quickly to the writing project that produced Overshoot.[2]

Overshoot: The Ecological Basis of Revolutionary Change

Overshoot was started during Catton’s three years in New Zealand, and completed after he returned to the US in 1973 to become Professor of Sociology at Washington State University. It took considerable time in the late 1970s for him to find a reputable publisher who did not assume that the market for books on ecology was saturated, so Overshoot was not published until 1980. During this period Catton, in collaboration with fellow scholar Riley Dunlap, produced a series of influential articles on ecological issues. Although Overshoot has never been a major seller, it has remained in print continuously since 1980, and it has recently been translated into Russian and Spanish.[3]

The core message in Overshoot is that, "... our lifestyles, mores, institutions, patterns of interaction, values, and expectations are shaped by a cultural heritage that was formed in a time when carrying capacity exceeded the human load. A cultural heritage can outlast the conditions that produced it. That carrying capacity surplus is gone now, eroded both by population increase and immense technological enlargement of per capita resource appetites and environmental impacts. Human life is now being lived in an era of deepening carrying capacity deficit. All of the familiar aspects of human societal life are under compelling pressure to change in this new era when the load increasingly exceeds the carrying capacities of many local regions—and of a finite planet. Social disorganization, friction, demoralization, and conflict will escalate."[4]

Overshoot continues to be a source of conceptual insight and existential inspiration regarding the ecological basis of human societies, especially to those aware of the massive threat posed by peak oil, climate change and other ecological pressures Catton either identified or anticipated. Years ahead of its time in the clarity with which it formulated a fully ecological paradigm, the book supplies scientific analysis of what E.O. Wilson has called "The Bottleneck" of ecological pressures and threats resulting from human actions on the natural environment.

Intellectual contribution

William Catton came of age in sociology when the major debates in the field were about theoretical orientation (structural-functionalism or consensus theory versus Marxism or conflict theory), and methodology (quantitative versus qualitative). His inherent attraction to nature and understanding of how the earth’s ecosystems operate afforded him the insight that all human social systems, including the economy, operate within the parameters of the natural ecology.

Catton’s primary contribution is the articulation of an intellectual framework that synthesizes sociological and ecological theory. He has shown that the prevalent idea of human control over nature being a great achievement was in fact a reflection of the exploitation of natural resources that seemed limitless but were actually finite.

One of his critical observations is that, “Monumental social changes (and troubles) in the 21st century will be misunderstood (and thus worsened, I believe) insofar as people ... continue interpreting events according to a [pre-ecological] worldview that insufficiently recognizes human society’s ultimate dependence on its ecosystem context.”[5]




  1. ^ All biographical information from the curriculum vitae of William R. Catton, Jr.
  2. ^ All information on Catton's intellectual development from William Catton's paper, "A Retrospective View of My Development as an Environmental Sociologist".
  3. ^ William Catton, "A Retrospective View of My Development as an Environmental Sociologist".
  4. ^ Ibid., p. 8.
  5. ^ Ibid., p. 8.



Carrying capacity transgressed two ways
by William R. Catton, Jr.

Biology is, as Hardin (1986) has reminded us, rich with insights that indicate a need for a "massive restructuring of popular opinions." In particular, the supposition that Earth is a cornucopia for mankind needs serious modification. Unfortunately, appropriate opinion restructuring is impeded by an inherent antagonism: although ecologists recognize there are limits to ecosystem sustainability, politicians are professionally compelled to remain deaf to suggestions that growth of human activities and elevation of consumption cannot be perpetual. Ecologists' time horizons are based on evolution or succession; politicians' horizons are seldom more than two or four years away, because they get re-elected by encouraging electorates to expect them (at least in election years) to promote economic growth.

This article will offer suggestions for getting some fundamental ecological insights onto the public policy agenda. Specifically, I will try to go a step beyond Hardin and make the case for remarriage of sociology and biology as a means to this end. Although knowledge of both the ecological and sociological nature of the human species is politically necessary to forestall disaster, few national leaders yet recognize it. Carrying capacity needs to be understood as the maximum load an environment can permanently support (i.e., without reduction of its ability to support future generations), with load referring not just to the number of users of an environment but to the total demands they make upon it. For human societies, as for populations of other species, the relation of load to carrying capacity is crucial in shaping our future. Public comprehension of the concepts of carrying capacity and load is both vague and inadequate, and the need to correct these deficiencies is urgent.

Human ecology

For two reasons, Homo sapiens is a species especially likely to transgress an environment's sustainable carrying capacity. First, humans have an unusually long period of maturation. Therefore, sociologists commonly view learned culture (rather than biological instincts) as the explanatory mainspring in accounting for human behavior patterns. We must now also see that the modesty of an infant's demands upon an ecosystem obscures the immensity of the load each adult may later impose. Second, our cultural nature enables our wants vastly to exceed mere physiological appetites.

Ecology has long been described as the study of interrelationships among organisms and their environment. My task is to show what special turns in such study are required by the special nature of human organisms. When sociology shuns such biological concepts as carrying capacity (or distorts their meaning in embracing them), it ignores important determinants of human experience. But unless ecologists take the facts of human culture appropriately into account, they, just as truly, are being unrealistic. Various sociologists who have styled themselves human ecologists have purported to resolve the differences. Let us examine some of their efforts.

Aware of the central importance of the ecosystem concept, Duncan (1959, 1961) sought to adapt that concept for use in human ecology, taking into account two fundamental ways in which humans differ from other organisms in their ways of environmental interaction. Humans develop technology and humans organize into groups more elaborately and more variably than nonhuman populations do. So to Duncan the human ecological version of the ecosystem concept seemed to consist of what he called an ecological complex comprising four classes of interdependent variables--population, organization, environment, and technology (POET).

Hawley (1973), reacting somewhat sternly to a paper by Odum (1969), also insisted on the special nature of human involvement with ecosystems. In the second part of Figure 1, I have used Duncan's POET notation to represent the insistence by Hawley that for humans the relation between a population and its environment is always mediated by the organization and technology employed by that population. To Hawley this mediation not only seemed to mitigate the specificity of environment as a finite (local) territory, it appeared also to abrogate environmental limits to social progress such as were presupposed by the author of the famous 1798 essay on population pressure, Robert Malthus (and seemingly accepted by Odum).

This diagrammatic representation of Hawley's idea (using Duncan's notation), with O and T enclosed between two arcs, resembles a lens. Thus it seems to reflect Hawley's view that organization and technology could magnify an environment's carrying capacity. However, as the symmetry of the diagram reveals, it is equally plausible to imagine looking through the lens from the E side, in which case O and T would magnify P. Indeed, organization and technology have enlarged the resource appetites and environmental impacts of various human populations.

Next I applied the Duncan notation to the conception of human ecology put forth by Park, the Chicago sociologist credited with first using the term human ecology more than 60 years ago. Park (1936) differentiated human ecology from plant and animal ecology by pointing out the need "to reckon with the fact that in human society [there is a] cultural superstructure [that] imposes itself as an instrument of direction and control upon the biotic substructure." Pursuant to that difference, Park spoke of a social complex comprising three elements--population, artifact (technological culture), and custom and beliefs (non-material culture). In the third part of Figure 1, these ideas are represented with the POET notation, and as a result the social complex is seen as an entity; interaction occurs between it and its environment, not just between each of its component variables and the environment, or between P and E mediated by O and T.

The third model of ecological reality is, I submit, superior to either of the first two described. For ecosocial theory purposes, this representation of Park enables us to think of O and T as modifications of P (Winner 1986), and I propose therefore to construe Park's work as recognition of some new (ecosocial) taxa, which, without waiting for official acceptance of the idea by systematists, can be referred to as Homo colossus. This article will demonstrate the appropriateness of this designation.

Prosthetic polymorphism

With different organizations and technologies, one population of humans can be a very different sort of ecological entity than another human aggregate. Accordingly, let us invoke the biological concept of polymorphism. It has been defined somewhat simplistically by Topoff (1981) for a colony of social insects as "the existence of individuals that differ in both size and structure," and defined more generally, yet more precisely, by Ford (1955) as "the occurrence together in the same habitat of two or more distinct forms of a species in such proportions that the rarest of them cannot be maintained merely by recurrent mutation."

Leaving aside the issues of genetics and natural selection implicit in Ford's definition, let us consider division of labor, a classic topic in sociology that was recognized as early as 1893 to be an extension of the biological phenomenon of organic specialization (Durkheim 1933). Division of labor arises even in very simple human societies, based at least on age and sex differences. In modern societies it becomes much more elaborate (Catton 1985).

For the task of modeling ecosystem processes when humans are involved, we need to broaden the concept of polymorphism. I propose that the possible differentiation of functions is limited when it has to depend either on biological polymorphism within a single species or on genetically based differences between the various species cooperating in a biotic community. Human societies have transcended these limits, and, ecologically speaking, what is distinctive about our species is that we have substituted sociocultural differentiation and technology for biological polymorphism and interspecific differences.

This is the way biologists' and sociologists' views of the special nature of the human species ought to converge. Among the members of a human labor force, the polymorphism that makes possible a highly ramified division of functions is in the tools rather than in the hands that wield them. There is polymorphism in the socially instilled contents of the brains that control those hands and those tools, not in the biological structure of those brains.

Machines, tools, and other artifacts can be described as prosthetic organs--detachable extensions of the human body. The British Museum of Natural History in London has an eloquent exhibit on natural selection that includes a display comparing variations in an organ with variations among tools adapted for different tasks. It acquaints viewers with the ecological significance seen by Darwin in the assorted types of beaks on the several species of finches he observed exploiting different resources on the Galapagos Islands (British Museum of Natural History Staff 1981).

To advance the task of modeling ecosystem processes in which humans are involved, we should therefore broaden (in a sociological direction) our use of the term polymorphism. If we consider an industrial civilization's human labor force not just as a population of furless and bipedal mammals but instead as a population of social complexes (tool-users-modified-by their-respective-tools-and-organizational-roles), then it can be seen that our species is impressively polymorphic. Culture enables Homo colossus to be, in this sense, the world's most polymorphic species.

This important ecological implication of culture has new significance for reuniting sociology and biology. Substitution of human sociocultural polymorphism for the less diversified and much less flexible biological version has important consequences. Ecologists and sociologists should stress this fact to the public--who may then require such understanding among elected policy makers.

Sociocultural polymorphism had already been recognized (without the label) when Colinvaux (1973, p. 579) wrote, "Man alone can change his niche without speciating." I prefer to speak of quasispeciation, meaning the adaptation of various members of the one human species to different niches by cultural (i.e., technological and organizational) differentiation without recourse to genetic differentiation. It is by this means that humans have in the course of their evolution several times succeeded in usurping from other species portions of the planet's total life-supporting capacity. Each time, human numbers increased. Now it is essential to see that the latest episodes of quasispeciation can lead to resource scarcity and environmental degradation. Colinvaux (1973, p. 579) recognized that "The time is already on us when…the carrying capacity of our living space is not enough to provide a broadened niche for all men who now exist."


Freese (1985) provides a clear definition of a serial trap, further elucidating issues raised by the now famous description of the commons dilemma by Hardin (1968). A serial trap exists when resources required by a user population are replaced over time at a more or less constant rate; replacement rate is exceeded by use rate; resource depletion cumulatively affects further availability, so that relative scarcity intensifies exponentially; and as time passes, system degradation becomes less and less reversible. As Freese notes, serial traps clearly do occur in natural ecosystems not under human domination. Various species in various ecosystems have experienced the cycle of irruption and crash (e.g., birds, caribou; see Remmert 1980, Welty 1982). But he seeks to persuade sociologists that serial traps also occur in human-dominated ecosystems (Whittaker 1975), and that the dependence of industrial societies on nonrenewable resources must be seen as an example. (By definition, the replacement rate for nonrenewable resources is effectively constant, i.e., zero, and any nonzero rate of use must exceed it.) Modern societies have consistently mistaken rates of discovery for rates of replacement (Pratt 1952, Simon and Kahn 1984), entrapment being the result of the illusion that all is well if use rates are just not yet in excess of recent discovery rates.
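Freese's definition lends itself to a simple simulation. The following sketch uses purely hypothetical numbers (a stock of 1000 units, a constant replacement rate, and a use rate that starts above replacement and grows a few percent per period) to show how the shortfall compounds: each period's drawdown is larger than the last, so the scarcity intensifies over time as the definition describes.

```python
# Illustrative sketch of a "serial trap" in Freese's sense: a resource
# stock replaced at a constant rate but used at a growing rate, so that
# depletion accelerates. All parameter values here are hypothetical.
def simulate_trap(stock=1000.0, replacement=5.0, use=6.0, growth=0.03, steps=50):
    """Return the stock trajectory over time."""
    history = [stock]
    for _ in range(steps):
        stock = max(0.0, stock + replacement - use)  # net change this period
        use *= 1 + growth                            # demand keeps growing
        history.append(stock)
    return history

traj = simulate_trap()
# The per-period losses grow: the trap tightens the longer it runs.
```

Because the use rate compounds while replacement stays flat, the gap between them widens every period, which is exactly why retreat from the pattern becomes progressively harder.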

What political and economic decision makers and their constituents most need to learn from an ecosocial theory is the idea that cumulative effects of ecosystem use can make it progressively less feasible to retreat from an accustomed use pattern back to an earlier one after the newer pattern belatedly comes to be seen for the trap it is (Costanza 1987).

Quite recently, man-machine combinations enlarged the effective environment, but precariously so. Between 1930 and 1960, most draft animals on US farms were replaced by tractors. According to the Office of Technology Assessment (1985), this released some 20% of US cropland from raising feed for animals and made it available for growing crops for human consumption. Conventional wisdom accepts this as unmitigated progress. It is not seen as a trap. Ecologists may astutely ask, however, what is to happen after humans have expanded their numbers or their appetites in response to the 20% capacity increment? If the fossil fuels for tractors become depleted and too costly, some land may again need to be devoted to producing biomass fuel (either for the tractors or as feed for a new generation of draft animals).

The OTA (1985, p. 19) went on to say, "The increased mechanization of farming permitted the amount of land cultivated per farm worker to increase fivefold from 1930 to 1980." For purposes of ecological modeling, it is as if farm workers (as PTO complexes, not just as P) had been enlarged by a factor of five; each can do five times as much farming as could his less colossal grandfather. How was this enlargement accomplished?

According to the OTA (1985, p. 19) report, "The amount of capital...used per worker increased more than 15 times in this period" and furthermore there is now heavy reliance "on the nonfarm sector for machinery, fuel, fertilizer, and other chemicals." Clearly, then, the farm labor force is not just P; O and T can sensibly be viewed as extensions of it. But again, we need to recognize as a trap this conversion of Homo sapiens into Homo colossus.

Carrying capacity and von Liebig's law

The concept of carrying capacity, if correctly understood, can spotlight traps. For any use of any environment by any population, there is a volume and intensity of use that can be exceeded only by degrading that environment's future suitability for that use. Carrying capacity, the term for the maximal sustainable use level, can be exceeded--but only temporarily. Ecologically, Malthus's main error was supposing that it was not possible for a population to increase beyond the level of available sustenance. It can and does happen, but always the overshoot will be temporary.

The comparably tragic error of Malthus's latter-day critics has been to mistake serial traps for progress, i.e., to construe technological change that facilitates temporary evasion of carrying capacity limits as permanent elevation (or repeal) of those limits. When load comes to exceed carrying capacity, the overload inexorably causes environmental damage; then the reduced carrying capacity leads to load reduction (i.e., a crash).
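The overshoot-and-crash sequence just described can be sketched as a toy difference equation. The model and all its parameters are hypothetical choices for illustration, not Catton's own formulation: population grows logistically toward the current carrying capacity, and whenever the load exceeds that capacity, the overload permanently erodes it for later periods.

```python
# Toy sketch of overshoot degrading carrying capacity (hypothetical
# parameters): logistic growth, plus a rule that any overload reduces
# the capacity K available in subsequent periods.
def overshoot(p=10.0, k=100.0, r=2.0, damage=0.3, steps=40):
    pops, caps = [p], [k]
    for _ in range(steps):
        p = max(0.0, p + r * p * (1 - p / k))        # logistic growth step
        k = max(1.0, k - damage * max(0.0, p - k))   # overload erodes K
        pops.append(p)
        caps.append(k)
    return pops, caps

pops, caps = overshoot()
# The population temporarily rises above the original capacity, and each
# such excursion leaves the capacity lower than before.
```

The point of the sketch is the asymmetry: the load can exceed the limit, but only at the price of a lower limit afterward, which is the "crash" logic of the paragraph above.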

Ecologists have not made this situation clear enough. Too often they have embraced the logistic curve model for population growth and have construed the upper asymptote as the best representation of carrying capacity (e.g., Emlen 1984). The logistic curve does not rise above that upper limit, and the limit is represented by a constant in the mathematical formula. But carrying capacities are not constant; they can and do change.

Political and economic leaders, and social scientists, tend to exaggerate any recognition that carrying capacity is not constant into the supposition that it is infinite. The fact that carrying capacities can be difficult to measure cannot exempt populations from the consequences of exceeding their environments' power to sustain them. The human prospect would be brighter if somehow these points were to be central to the agenda for the next superpower summit meeting--but of course they won't even be mentioned.

Recently there has appeared both a biological and a sociological literature alleging an inescapably subjective element in so-called carrying capacity. It is said that carrying capacity ultimately depends on people's value judgments (McHale and McHale 1976, Shelby and Heberlein 1984, Wagar 1964). Some imply that, unless carrying capacity can be assigned a precise value, the concept has no significance. Politicians and industrialists grasp such straws all too eagerly.

The antidote to such thinking is provided by the relation between the carrying capacity concept and von Liebig's law of the minimum. Justus von Liebig (1842), an agricultural chemist, showed that it was the least abundantly available nutrient that limited the yield a farm could produce. It need not be difficult to see why this must be so (and why it can be generalized to so many phenomena), given that any organism is a complex chemical structure. The number of specimens of any organism that can be constructed from a given assortment of chemical components will be limited by the scarcest component. (The principle also can be illustrated with a nonbiological example. Imagine a collection of a dozen flashlight batteries, five battery cases, six reflector and lens assemblies, and two three-volt bulbs. The availability of only two bulbs will limit to just two the number of working two-cell flashlights that can be assembled, even though there will be other parts left.)
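The flashlight illustration can be written out directly: the number of complete units is the minimum, over all components, of how many units each component's supply can furnish. The part names and counts below are simply those of the example in the text.

```python
# Von Liebig's law of the minimum, via the flashlight example above:
# output is limited by the scarcest component, not the total of parts.
def max_assemblies(available, per_unit):
    """Units buildable = min over components of (available // required per unit)."""
    return min(available[part] // per_unit[part] for part in per_unit)

available = {"battery": 12, "case": 5, "reflector_lens": 6, "bulb": 2}
per_unit  = {"battery": 2,  "case": 1, "reflector_lens": 1, "bulb": 1}

print(max_assemblies(available, per_unit))  # -> 2; the two bulbs limit output
```

The twelve batteries could serve six flashlights and the cases five, but the two bulbs cap production at two, with every other part left in surplus.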

With this in mind, let us see why it is so misleading to imply or assert that the concept of carrying capacity is based on value judgment. In Figure 2a, I have represented carrying capacity by a circle and load by a square (drawn so that its area is equal to the area of the circle). Think of the circle as the cross section of a pipeline through which there is a constant flow of some limiting resource. Quantitatively, an environment's carrying capacity for a particular life form is set (according to von Liebig's law) by the continual rate of flow of the least abundantly available necessary resource. The load is clearly the product of two dimensions: the number of users of that limiting resource multiplied by the mean per capita rate of use. The point is this: a sustainable load is a load not exceeding the sustained rate of supply.

Clearly, the load may have different shapes and still be compatible with carrying capacity. Instead of the schematic square we may substitute a vertical rectangle (Figure 2b), representing an increase in the number of users and a commensurate reduction in mean per capita use of the limiting resource. As long as the area of the rectangle remains no larger than the area of the circle, we have a representation of a sustainable load. Alternatively, we could have a horizontal rectangle (Figure 2c), where the per capita use level has increased and the trade-off enabling the load to remain sustainable is a reduction of user numbers.
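The geometry of Figure 2 reduces to a single inequality: load (users times mean per capita use) must not exceed capacity. The numbers below are schematic stand-ins for the square and the two rectangles, chosen only so that the first three loads equal the capacity exactly.

```python
# Load vs. carrying capacity (schematic numbers): a load is sustainable
# only if users x per-capita use does not exceed the resource flow.
def sustainable(users, per_capita_use, capacity):
    return users * per_capita_use <= capacity

capacity = 1000.0
print(sustainable(100, 10.0, capacity))  # the "square": load 1000, sustainable
print(sustainable(200, 5.0, capacity))   # taller, narrower rectangle: still 1000
print(sustainable(50, 20.0, capacity))   # shorter, wider rectangle: still 1000
print(sustainable(200, 10.0, capacity))  # growth on one axis, no trade-off: overload
```

Only the last case fails, because user numbers doubled without the compensating drop in per capita use; whether that trade-off is desirable is the value question, but whether the product fits under the capacity is not.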

From these comparisons what is most vital to note is that the question of a load's sustainable magnitude is an objective ecological issue, not a value question. The question of which trade-off is preferable to its alternative is a value issue. Choosing whether to increase the user population at the cost of lowering its standard of living or to raise affluence at the cost of population reduction depends on a value judgment. But it is a serious mistake to suppose this denudes carrying capacity of any objective meaning.

The importance of avoiding such erroneous thinking becomes clear when we imagine increasing one dimension of load without the trade-off on the other dimension (Figure 3). If population growth continues so that a maximum sustainable load has an overload added to it, habitat damage takes a bite out of carrying capacity. Likewise, if per capita use rises beyond the level prevailing in an already maximal load, and there is no trade-off reduction of user numbers, the overload must again result in habitat damage and carrying capacity reduction.

The normative questions that have led some analysts to declare carrying capacity a useless concept have to do with questions of equity rather than of sustainability. Neither biologists nor sociologists should confuse the two. Political disagreements as to what constitutes equitable allocation of finite resources should not obscure the fact that nature exacts penalties when loads exceed carrying capacity, whether the excess comes on the vertical or the horizontal dimension.

Actually, the human load has been expanding on both dimensions. The number of humans on Earth has increased enormously since prehistoric times. Also there has been great technological progress over the millennia, especially in the last two centuries. We are not yet accustomed, however, to putting these two items of knowledge together and recognizing the two-dimensioned enlargement (or the enormity of that enlargement) of the human load, nor have we come to terms with its ecological implications.

Homo colossus

This two-way expansion of the human load is represented graphically in Figure 4. For many warm-blooded species, to maintain life with no gain or loss of body substance an animal needs an average daily food energy intake of 70 · W^(3/4) kcal, where W = body weight in kilograms (Kleiber 1947). Applying this formula to size estimates for various cetacean species (Gaskin 1982, Minasian et al. 1984) and to estimates of human exosomatic energy use plus food intake (Catton 1986) enables us to select particular dolphin or whale types to represent various stages in the evolution of Homo sapiens into Homo colossus. (It is cultural evolution, not biological evolution, we thus represent.)

The human load to be supported by the ecosystems of the world has not just grown from 3 × 10^6 individuals in 35,000 B.C. to 5 × 10^9 equivalent individuals today. Each of those three million hunter-gatherers was the energy-using counterpart of a common dolphin (Delphinus delphis), whereas each of today's 232 million Americans matches the energy use of a sperm whale (Physeter macrocephalus).
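The Kleiber relation from the text can be evaluated directly. The body weights below are rough illustrative assumptions (about 100 kg for a common dolphin, about 40 tonnes for a sperm whale), not figures taken from Catton's sources, so the printed values are only order-of-magnitude stand-ins for his comparison.

```python
# Kleiber's relation as given in the text: 70 * W^(3/4) kcal per day,
# with W in kilograms. Body weights here are rough assumptions.
def kleiber_kcal_per_day(weight_kg):
    return 70.0 * weight_kg ** 0.75

dolphin = kleiber_kcal_per_day(100)        # common dolphin, ~100 kg (assumed)
sperm_whale = kleiber_kcal_per_day(40000)  # sperm whale, ~40 tonnes (assumed)

print(round(dolphin))      # on the order of a hunter-gatherer's daily food energy
print(round(sperm_whale))  # roughly two orders of magnitude larger
```

The dolphin-scale figure lands near a human daily food intake, while the whale-scale figure is some ninety times larger, which is the sense in which an industrialized person's total (food plus exosomatic) energy use is whale-sized rather than dolphin-sized.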

Projecting that all humans can someday be as industrialized as Americans have become (Kahn et al. 1976) is equivalent to imagining a world populated by five billion sperm whales. It reflects woeful ignorance of the ecological consequences of cultural polymorphism. Such is the folly implicit in declaring that "the term carrying capacity has by now no useful meaning" (Simon and Kahn 1984, p. 45).

We urgently need widespread dissemination of the fact that carrying capacity is not infinitely expandable. The time has come when "tragic choices" must be acknowledged (Calabresi and Bobbitt 1978); in the world as it is and is going to be, human loads can grow on one axis only by shrinking on the other axis. Otherwise our legacy to posterity will be reduced carrying capacity and the human suffering that it will entail. Anyone wishing for a more humane and happier future should strive to spread ecological literacy. Those who aspire to leadership positions should be required to demonstrate that they understand the load and carrying capacity concepts.

William R. Catton, Jr., is a professor emeritus at the Department of Sociology, Washington State University, Pullman, WA 99164. Among his research interests are division of labor and the ecological basis of revolutionary change.

Malthus: More Relevant Than Ever

by William R. Catton, Jr.

August 1998

For the last two hundred years, Malthus' An Essay on the Principle of Population has served to define the terms of debate on human population growth and the Earth's capacity to provide subsistence. And, if human civilization lasts that long, William Catton's 1980 book Overshoot may well turn out to be the definitive statement on this issue for the next two centuries. In this Forum, Dr. Catton elucidates the contemporary relevance of Malthus by examining the concept of overshoot -- the ability of humans to expand their numbers temporarily at the expense of the natural world's long-term sustainability -- in the context of Charles Darwin's understanding of population competition.

Malthus, Darwin, and population competition

In 1798, Thomas Robert Malthus tried to inform people that a human population, like a population of any other species, had the potential to increase exponentially were it not limited by finite support from its resource base.  He warned us that growth of the number of human consumers and their demands will always threaten to outrun the growth of sustenance.  When Charles Darwin read Malthus, he recognized more fully than most other readers that the Malthusian principle applied to all species.  And Darwin saw how reproduction beyond replacement can foster a universal competitive relationship among a population's members, as well as how expansion by a population of one species may be at the expense of populations of other species. 

Others were not so perceptive. When I was in high school, the textbook used in my biology class listed "Over-production of individuals" first among "the chief factors assigned by Darwin to account for the development of new species from common ancestry through natural selection" (Moon and Man, 1933:457), but it did not cite Malthus or discuss his concerns about population pressure. That neglect was typical because, for a while, "it was argued widely that developments had disproved Malthus, that the problem was no longer man's propensity to reproduce more rapidly than his sustenance, but his unwillingness to reproduce adequately in an industrial and urban setting" (Taeuber, 1964:120).

Malthus in the age of exuberance

Most of us can remember learning in school to dismiss Malthus as "too pessimistic." Technological progress and the economic growth resulting therefrom, we learned to assume, can always provide the essential consumables (or substitutes) that have permitted exuberant population growth.  One of my college textbooks put it this way: "For conditions as they existed in 1798, Malthus was reasonably sound in his doctrines; but scientific and technological changes in the interval since his day have made Malthusian principles, in large part, an intellectual curiosity in our era" (Barnes, 1948:51). 

In graduate school one of my textbooks acknowledged that "Man's tendency to multiply up to the maximum carrying capacity of the land is superficially evident in many parts of the world" (Hawley, 1950:150-151). Its eminent author, who has been called the "dean" of American human ecologists, conceded the likelihood that most lands at most historic times "have been populated to capacity in view of the particular modes of life of their occupants" but insisted (pp. 160ff) that changes in such modes of life had made "the Malthusian interpretation of population problems decreasingly useful." The article about Malthus in the International Encyclopedia of the Social Sciences called his theory of population "a perfect example of metaphysics masquerading as science" (Blaug, 1968:551).

Reassessing Malthus inappropriately

When co-authoring an introductory sociology text, my colleagues and I began to dissent from these disparaging evaluations of Malthus, but for not quite the right reasons (Lundberg et al., 1968:682). "Despite his inadequate data," we said, Malthus "was nevertheless correct in arguing that the food supply fixes an upper limit beyond which the population cannot go at any given time." And we gave him credit for having taken into account "certain social and psychological factors, such as celibacy and moral restraint, which might keep population below that theoretical limit, and in doing this he focused attention," we supposed, "on factors which were frequently overlooked at the time."

Looking back, I now see both of those sentences of ours as inaccurate or misleading.  His essay did not fully succeed in directing most people's attention to all the relevant factors, i.e., those checks that would prevent a human population from expanding to its full potential.  Further, and more importantly, Malthus's confidence that no population could overshoot carrying capacity, but would only press miserably against the limit, precluded foreseeing the prodigality-based affluence we achieved by running up carrying capacity deficits that would be disastrous later on.

Overshooting Carrying Capacity

Drawing down resources from the future

Contrary to our partial endorsement: (1) food is not the only component of "sustenance" for modern human living; industrialized human societies rely on continuing flows of many other resources, and a cessation of supply of any essential commodity can be devastating. (2) By drawing down "savings accounts" (i.e., using resources faster than their rates of renewal), populations can (and do) temporarily exceed carrying capacity. When the stockpile runs out, the once-thriving population finds itself in dire straits.

Misunderstanding checks and balances

With respect to our second appraisal sentence, although Malthus meant to focus attention on factors that check population growth, the effort didn't always succeed.  Readers' attention seems to have persistently strayed back to the notion that Malthus believed populations would inevitably doom themselves to starvation by growing exponentially, so populations that burgeoned and prospered have been seen as supposed refutations of Malthus. 

What most of us just didn't see was that Malthus assumed a relatively short feedback loop, a natural assumption given his 18th-century perspective on technology. He was not mistaken in attributing exponential growth potential to all populations, nor was he mistaken in recognizing the unlikelihood that required resource supplies would grow apace. He did err in supposing population could never grow significantly beyond a key resource limit. Populations can, and often do, exceed carrying capacity, and come to grief only after a delay. Malthus was writing not only before there was a developed science of ecology but also before there were full-blown industrial societies making prodigal use of fossil energy and other nonrenewable resources.
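A minimal numerical sketch can make this delayed-grief dynamic concrete. All parameters below are arbitrary illustrative choices, not figures from Catton or Malthus: a population grows whenever its needs are met, first from a renewable flow, then by drawing down a finite stock; the shortfall bites only when the stock is gone.

```python
def overshoot_demo(renewable_flow=100.0, stock=1500.0, pop=10.0,
                   growth=0.04, years=120):
    """Toy model: each person needs 1 resource unit per year.  Demand beyond
    the renewable flow is met by drawing down a finite stock; once the stock
    is exhausted, unmet need forces the population down."""
    peak = 0.0
    for _ in range(years):
        need = pop
        deficit = max(0.0, need - renewable_flow)
        drawn = min(deficit, stock)        # "savings account" withdrawal
        stock -= drawn
        supplied = min(need, renewable_flow) + drawn
        if supplied >= need:
            pop *= 1 + growth              # growth continues while demand is met
        else:
            pop *= supplied / need         # shortfall cuts the population
        peak = max(peak, pop)
    return peak, pop

peak, final = overshoot_demo()
print(f"peak load: {peak:.0f}, final load: {final:.0f}")
```

With these toy numbers the population climbs well past the sustainable flow of 100 before crashing back toward it: the stock delays, but does not avert, the feedback. Catton's fuller point, that overshoot can also degrade the carrying capacity itself, is not modeled here.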

Delayed feedback from the environment

Unlike animals with much shorter maturation times and without technology, humans may be curbed by the ultimate adverse consequences of over-reproduction much less promptly than Malthus assumed. Two facts make the feedback loop dangerously longer for us than for most nonhuman species. First, humans have an unusually long period of maturation compared to other species. The lag between birth and the age of maximum resource consumption hardly mattered in 1798. Then as now, people's offspring made small resource demands as infants, and in 1798 their adult demands exceeded those of their infancy by a ratio not much greater than the adult-to-infant resource demand ratio for other animal species (which grow to maturity in only a year or so). Second, a mere eight human generations after Malthus, today's technology and our colossal reliance as adults on exosomatic energy sources (Cottrell, 1955; Catton, 1980; Price, 1995) have enormously magnified that ratio, putting it far out of adjustment with the ecosystem processes that supplied the modest demands of our ancestors.

So continuing to suppose the world can afford all the precious progeny we may produce leads now to serious problems.  Babies grow up.  In an industrial society, as adults they expect to live lifestyles that involve taking from the environment enormous per capita resource withdrawals and dumping into it vast amounts of life's toxic by-products. 

It was no fault of Malthus that in 1798 he did not foresee this magnification.  Even today, parents seldom if ever base their decisions about sexual activity on calculations of the lifetime resource demands and environmental impacts of each prospective child that may result.  Our affluence, technology, and extraordinary period of maturation combine to obscure and delay but do not avert negative feedback from the environment.

Criticism ignores human capacity for overshoot

Malthus was not wrong in the ways commonly supposed.  From his 18th century perspective he simply had no basis for seeing the human ability to "overshoot" carrying capacity.  It was inconceivable to Malthus that human societies could, by taking advantage of favorable conditions (new technology, abundant fossil fuels),  temporarily increase human numbers and appetites above the long-term capacity of environments to provide needed resources and services.  But it is inexcusable today not to recognize the way populations can sometimes overshoot sustainable carrying capacity and what happens to them after they have done it. 

Human economic growth and technology have only created the appearance that Malthus was wrong (in the way we used to learn in school). What our technological advances have actually done is allow human loads to grow precariously beyond the earth's long-term carrying capacity by drawing down the planet's stocks of key resources accumulated over 4 billion years of evolution.

Competition and Overshoot

Human population growth and inter-species competition

Nearly everyone (but not Darwin) ignored crucial parts of the Malthus message.  Darwin (1859:63) stands out for understanding Malthus correctly.  Just after those two famous sentences about geometric increase of population versus arithmetic increase of food, Malthus ([1798] 1976:20) had said, "Necessity, that imperious all pervading law of nature, restrains them [all species] within the prescribed bounds.  The race of plants and the race of animals shrink under this great restrictive law.  And the race of man cannot, by any efforts of reason, escape from it.  Among plants and animals its effects are waste of seed, sickness, and premature death.  Among mankind, misery and vice." 

In the third chapter of On the Origin of Species, Darwin (1859:60-79) spelled out how checks on the growth of any one species population are exerted by populations of other species associated with it in the web of life.  Because every population is part of what we have since learned to call an ecosystem, when a particular species is "fortunate" enough to expand its numbers phenomenally, catastrophic reduction of other species populations must result.  "We suck our sustenance from the rest of nature . . . reducing its bounty as ours grows" (Leakey and Lewin, 1995:233).  But the "prosperity" of an irrupting population is fatefully precarious, as its own future is imperiled by nature's disrupted balance.

Environmental feedback: mass extinction poses a major threat

We have trebled the human load upon this planet in my lifetime by using the planet unsustainably, and this has ushered in a new era of extinction. According to a recent survey, a majority of American biologists regard the mass extinction of plant and animal species now resulting from human domination of the earth as a grave threat to humans in the next century (Warrick, 1998). We live in a world losing biodiversity at an unprecedented rate (Koopowitz and Kaye, 1983; Wilson, 1992:215ff; Tuxill, 1998). It is high time to see that this consequence was implicit in the 1798 essay by Malthus.

Mankind is not only depleting essential mineral stocks.  We are also diminishing the plant and animal resources available to future human generations, and destroying biological buffers against the effects of global climate change (Suplee, 1998).  We are stealing from the human future.  Had the "moral restraint" of our parents and grandparents been enhanced by understanding Malthus as cogently as Darwin did, a less ominous future might have been their legacy to us (and ours to our descendants).



The Problem of Denial

by William R. Catton, Jr.
Professor Emeritus - Sociology Washington State University


Abundant evidence suggests industrial civilization must be "downsized" to curb damage to the ecosphere by the "technosphere." Trends behind this prospect include prodigious population growth, urbanization, cultural dependence upon ravenous use of fossil fuels and other nonrenewable resources, consequent air pollution, and global climate change. Despite prolonged Cold War distraction and entrenched faith that technology could always enlarge carrying capacity, these trends were well publicized. But there remain eminent writers who persist in denying that human carrying capacity (Earth's maximum sustainable human load) has now been or ever will be exceeded. Denials of ecological limits resemble anosognosia (inability of stroke patients to recognize their paralysis). Some denial literature resembles their confabulations (elaborately unreal stories concocted as rationalizations). Denial by opponents of human ecology seems to be a way of coping with an insufferable contradiction between past convictions and present circumstances, a defense against intolerable anomalous information.

The passionate drive in the 104th U.S. Congress "to kill many environmental protection laws ... in the name of less government" (Wager 1995, 3) stands in stark contrast to the ecological wisdom implicit in a two-centuries-old statement William Ophuls used as a tone-setting epigraph for his book on Ecology and the Politics of Scarcity (see Ophuls and Boyan 1992, vi). Humans, said Edmund Burke,

... are qualified for civil liberty in exact proportion to their disposition to put moral chains upon their own appetites.... Society cannot exist unless a controlling power upon will and appetite be placed somewhere, and the less of it there is within, the more there must be without. It is ordained in the eternal constitution of things, that men of intemperate minds cannot be free. Their passions forge their fetters.

That these aphoristic sentences imply an important ecological principle becomes sharply visible in light of some conceptualizations recently set forth by Barry Commoner (1990). People, he said, live in two worlds. We live not only in the natural world that evolved physically, chemically, and biologically over Earth's five billion years, but also in a world humans have made. He insists we need to understand how these two worlds (Commoner calls them "the ecosphere" and "the technosphere") interact, especially now that the technosphere has become so enormous and consequential, breaching the division between the two worlds. What we still euphemistically call "acts of God" are no longer uninfluenced by human societal activity. Collectively, we now significantly alter the natural processes of the ecosphere, "the thin global skin of air, water, and soil, and the plants and animals that live in it."

According to Commoner (1990, 15), "What we call the 'environmental crisis' — the array of critical unsolved problems ranging from local toxic dumps to the disruption of global climate" — results from a drastic mismatch between the ecosphere's "cyclical, conservative, and self-consistent processes" and the technosphere's "linear, innovative, but ecologically disharmonious processes."

While Commoner's statement carries real meaning to human ecologists, I am sure it is completely opaque to the person who happens to "represent" my Congressional district. It is probably meaningless to most of her House colleagues, and to most members of the Senate. It would probably find little resonance with most of the voters who put them in Congress. The aim of this paper is to try to shed some light on the apparent refusal of ostensibly educated individuals to realize the urgent need, as Commoner puts it, for ending the "suicidal war" between technosphere and ecosphere. Never have so many seemed so oblivious to so momentous a future-shaping condition.


Human ecologists could well be dismayed by the apparent preoccupation of society's decision makers with matters of less basic importance to our global prospects than the following facts:

  1. Human numbers on this planet are much greater today (and still growing) than they were just half a century ago (Demeny 1986, 29-33; Ehrlich and Ehrlich, 1990; Keyfitz, 1991).

  2. A greater fraction of the world's people today live in cities, and many cities are faced with problems of serious air pollution (Demeny, 1986, 55-58; Lowe, 1991).

  3. Industrialization has enabled and required mankind to use fossil fuels and other nonrenewable resources at prodigious rates, with little regard for the finiteness of the Earth's deposits of these substances (Young, 1992; Flavin and Lenssen, 1994, 29-49; Inkeles, 1994; Szell, 1994).

  4. The combustion products we have been putting into the atmosphere may be causing climate change (Tangley, 1988; Abrahamson, 1989; Rathjens, 1991; Revkin, 1992; Flavin and Lenssen, 1994, 50-70; Wigley, 1995).

  5. Other products of modern chemistry have been accumulating in the upper atmosphere and wreaking havoc with the protective ozone layer (Benedick, 1991; Litfin, 1994, 52-77).

Could mass media preoccupation with less crucially significant matters explain why there appear even now to be so many literate and educated people who remain unconcerned about these facts, or who deny their truth or at least their importance?


It is instructive to trace the entry of these topics into the print media to see just how long there has been information about them readily available to the reading public. For that purpose, it was a simple expedient to explore entries in the many volumes of the Readers' Guide to Periodical Literature, a standard resource in many libraries. Without parading details of this exploration, it suffices to say that all five facts have had appreciable public exposure in the print media.2 The treatments of the five topics attained clarity and explicitness at varying dates:

  1. exploding population was publicized from the 1950s;

  2. air pollution received explicit attention by the time of World War II;

  3. ravenous industrial dependence on exhaustible resources was explicitly depicted from 1973 onward;

  4. treatment of global warming via the greenhouse effect of CO2 and other gases in the atmosphere became fairly clear from the mid-1950s;

  5. depiction of the ozone layer got explicit coverage from 1985.

Thus, for a decade at minimum, and for several decades in some cases, these facts have been "available" to the general reader — and to politicians.


Until the end of the 1980s, public views of nearly everything were colored by the dangerous rivalry between two nuclear-armed superpowers, each defining the other ideologically as evil. When the Cold War ended at last, it became possible for a writer desiring to educate people to ecological facts of life to suggest that the visible failure of communism was accompanied by an as yet unrecognized failure of Western capitalism (Orr, 1992, ix). "Our failures," he said, "are still being concealed by bad bookkeeping (both fiscal and ecological), dishonest rhetoric, and wishful thinking." But he insisted "the day of reckoning" was coming soon, for "the world ... is not without limits" recognizable by an ecologically literate person.

Most people are not yet ecologically literate, even this many years after the well-publicized 1972 United Nations Conference on the Human Environment that met in Stockholm. So "the health of the planet" continued to deteriorate (Brown et al., 1991, 20-21). As many people as existed altogether in 1900 were added to the Earth's load between 1972 and 1991, while the world lost nearly 200 million hectares of trees, an area the size of the United States east of the Mississippi. Deserts have expanded by 120 million hectares, claiming more land than is planted to crops in China and Nigeria combined. The world's farmers lost about 480 billion tons of topsoil, roughly equal to that which covers the agricultural land of India and France. And thousands of plant and animal species with which we shared the planet in 1972 no longer exist. So it may have been a display of exaggerated faith in popular wisdom when a pair of demographers (Tsui and Bogue, 1978, 3) wrote that "No social problem, other than war, has attracted greater and more sustained public concern during the decades since World War II than the 'population explosion'."


Not only were there Cold War blinders restricting people's perceptions of the world for so many years; there was also a consummate faith that continuing technological innovations will enable Earth's human carrying capacity to be expanded "to almost any required size" (Ehrlich et al., 1971, 41). This exuberant worldview has been expressed in professional scientific journals, not just in more popular media. Examples have appeared in Scientific American (e.g. Hopper, 1976; Revelle, 1976), BioScience (e.g. Weinberg, 1973), and The Sciences (e.g. Ausubel, 1993), as well as in Science, official organ of the American Association for the Advancement of Science, which published the following assertion (Simon, 1980):

Incredible as it may seem at first, the term 'finite' is not only inappropriate but is downright misleading in the context of natural resources.... Even the total weight of the earth is not a theoretical limit to the amount of copper that might be available to earthlings in the future. Only the total weight of the universe — if that term has a useful meaning here — would be such a theoretical limit. In summary, because we find new lodes, invent better production methods, and discover new substitutes, the ultimate constraint upon our capacity to enjoy unlimited raw materials at acceptable prices is knowledge. And the source of knowledge is the human mind. Ultimately, then, the key constraint is human imagination and the exercise of human skills. Hence an increase of human beings constitutes an addition to the crucial stock of resources, along with causing additional consumption of resources (1435-1436).

The idea in that final sentence recurred in the context of some analysis of carrying capacity estimates; ex-President Bush was quoted insisting that every human possesses not just a consuming mouth but also productive hands (Cohen, 1995a). Here, and again in a subsequent book, Cohen (1995b) implied that "the historical record of faster-than-exponential population growth" might well continue to be "accompanied by an immense improvement in average well-being." While human population had increased fourfold between 1860 and 1991, human use of inanimate energy increased 93-fold in the same period.3 Human influence upon the planet had thus grown enormously faster than mere human biomass. Is this fact a basis for optimistic amazement, or should it be arousing deep anxiety?
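The arithmetic behind that contrast is worth making explicit: if population rose roughly 4-fold between 1860 and 1991 while inanimate energy use rose roughly 93-fold (the figures in the text above), then per-capita energy use rose by their ratio.

```python
population_growth = 4   # ~4-fold increase in human numbers, 1860-1991 (from the text)
energy_growth = 93      # ~93-fold increase in inanimate energy use over the same span

per_capita_growth = energy_growth / population_growth
print(f"Per-capita inanimate energy use rose roughly {per_capita_growth:.0f}-fold")
```

A roughly 23-fold rise in per-capita energy use is what underlies the claim that human influence on the planet grew enormously faster than mere human biomass.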

The point of the statement from George Bush (echoing Friedrich Engels) was to suggest that human carrying capacity may be not just burdened by more people but could actually be raised by them.4 But no such faith can eliminate mathematical limits (see Cohen, 1995b: Appendix 6). Anyway, carrying capacity is not simply "Earth's maximum supportable human population" (Cohen, 1995a, 342); the concept should denote Earth's maximum sustainable load. The concept of human carrying capacity is not qualified sufficiently by noting that the load depends on level of living as well as number of people. What the carrying capacity concept must spotlight is the issue of system durability; how long can an ecosystem support a given load? It is true that the load varies with level of living. It is no less essential to recognize the idea (which should be so simple) that overuse of an environment reduces its load-supporting capacity for future generations of users.

As Ehrlich et al. (1971, 41) suggested a generation ago, it already "seems clear that a population size smaller than that of 1970 will be necessary, if all human beings are to have a high material standard of living, and if a comfortable margin for error is to be maintained against ecocatastrophes." More recently a writer for Worldwatch Institute has stated flatly that "we have surpassed the planet's carrying capacity" (Postel, 1994, 4), saying what makes this evident is the extent of depletion and damage to natural capital. "The earth's environmental assets are now insufficient to sustain both our present patterns of economic activity and the life-support systems we depend on."5

As if to remove all doubt as to what is meant by asserting the human load already exceeds Earth's carrying capacity, the head of the Worldwatch Institute insists "Time is not on our side. The world has waited too long to stabilize population.... If we care about the future, we have no other choice but to launch a worldwide effort to stabilize our life-support systems — soils, fisheries, aquifers, and forests — and the climate system" (Brown, 1995, 141; cf. Catton, 1980). And as he pointed out, this would take a massive mobilization of financial and political resources comparable to organizing to fight World War II. Absent that, "we will leave our children a world without hope."6


For emphasis let me now personalize the changes that have led to these somber assessments. There are almost three times as many people now living on this planet as there were when, as a boy, I first asked an adult what was the world's total population. Within my lifetime, I have seen small towns grow to become cities, spreading over once open countryside. In the first third of my life, automobiles and highways, although already well-established as a means of family mobility, had not yet displaced trains and railroads as the major choice for long-distance travel. Nor had the airlines yet proliferated to the point of near extinction of rail passenger service. The changes in transportation I've experienced and witnessed in one lifetime, together with other industrial growth, have caused world extraction of crude oil to double each decade. This means that in each of the first five or six decades of my life we humans extracted and used as much petroleum as had been used in all previous time — including the immediately preceding decade. High school mathematics was more than enough to make it evident that this decennial doubling could not go on forever.
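The property invoked here, that each doubling period's extraction roughly matches all prior extraction combined, is a feature of any doubling sequence, and the high-school mathematics involved is easy to check (the unit and decade count below are arbitrary):

```python
# Decennial doubling: decade n extracts 2**n units (arbitrary units).
extraction = [2 ** n for n in range(8)]

for n in range(1, len(extraction)):
    prior_total = sum(extraction[:n])   # everything extracted before decade n
    # Geometric series: 2**0 + ... + 2**(n-1) == 2**n - 1, so each decade's
    # extraction exceeds the combined total of all earlier decades by one unit.
    assert extraction[n] == prior_total + 1
```

This is why sustained doubling is so deceptive: at every step, most of the cumulative drawdown has happened in the most recent period, no matter how long the sequence has already run.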

As a visible indication of climate change, I have personally become acquainted with glaciers in both the northern and southern hemispheres and have seen that they are now hundreds of meters shorter than when I first laid eyes on them. And although I have no direct personal experience of stratospheric ozone thinning, I am inclined to trust the scientific literature on that subject (e.g. Cicerone, 1987; Kerr, 1988; Rowland, 1989; Firor, 1990).


There are others who deny the whole idea that carrying capacity has now been, or ever will be, exceeded by the human load. Writing of a future "age of abundance," an economist at the Cato Institute in Washington, DC, has argued that just as no one worries that life after the third day must be lived without milk because a grocery store stocks only a three days' supply of it, we should not expect to run out of copper simply because copper mining companies calculate that they have only a certain number of years of reserves. When they use up those reserves, they will have a renewed incentive to locate new sources of supply (Moore, 1995, 116).

He insists, therefore, that the only reliable measure of "a resource's supply is the change in its market price." In support of that view, he cites Julian Simon's book, The Ultimate Resource, a title alluding to human brains and reflecting a faith that ever-increasing numbers of them on this planet will ensure an escalation of solutions to outrace any escalation of problems.

In a more recent book, Simon (1994, 65) has asserted that we already have in the world's libraries "the technology to feed, clothe, and supply energy to an ever-growing population for the next 7 billion years." After noting the relative recency of much of our technological knowledge, Simon adds, "Even if no new knowledge were ever invented after those advances, we would be able to go on increasing forever, improving our standard of living and our control over our environment."

While most human ecologists would regard this as quite preposterous and detached from reality, I have felt almost as stunned each time I have read the negating paraphrase by Julian Simon and Herman Kahn (1984, 1-2) of the summary of The Global 2000 Report to the President:

If present trends continue, the world in 2000 will be less crowded (though more populated), less polluted, more stable ecologically, and less vulnerable to resource supply disruptions than the world we live in now. Stresses involving population, resources, and environment will be less in the future than now... The world's people will be richer in most ways than they are today ... The outlook for food and other necessities of life will be better ... life for most people on earth will be less precarious economically than it is now.

The emphases and ellipses are by Simon and Kahn, obviously intended to make their glowing expectations contrast maximally and point-for-point with the Global 2000 summary they were paraphrasing.

Knowing these two men to be both intelligent and educated, I have wondered every time I looked at the quoted passage how they could so flagrantly deny what so many ecologists regard as the real state of the world.7 As long ago as 1956, a past president of the Ecological Society of America, who was then president of the American Association for the Advancement of Science, and would shortly thereafter serve as president of the American Society of Naturalists, wrote: "I have yet to meet a biologist who shares the optimistic unconcern about natural resources that is so prevalent among a considerable group of technologists and economists" (Sears, 1956, 22).

In addition to Simon and Kahn, other authors have exhibited similarly sublime denial. Two astonishing books by Wattenberg (1984; 1987) asserted that "the bad news" is just plain wrong, and that far from overpopulation, fertility rate declines in industrial nations should be seen as a dangerous "birth dearth." Just recently, Wattenberg (1995) reasserted his "birth dearth" concept with specific reference to the United States, arguing this country needs immigrants to offset the effects of past fertility declines on the present young adult (labor force) strata in our population pyramid.

A pair of economists, writing about "the doomsday myth" (Maurice and Smithson, 1984), claimed modern economic problems do not significantly differ from crises that have been occurring and getting solved over the past 10,000 years. Energy shortage problems in the 1970s are alleged to have been a "created crisis" (Sutton, 1979). And anyway some problems are better left unaddressed, according to a civil engineer, an economist, and an environmental engineer who argued that recycling, for example, is often too costly and has doubtful environmental value (Hendrickson et al., 1995).

Not surprisingly, an advertisement for the Mobil Corporation answers its own headline question, "Running out of oil?" by proclaiming, "Not in your lifetime nor your grandchildren's." The ad's aim, as it appeared in Newsweek for May 8, 1995 (15), was to forestall "Forcing the market to make the transition to alternative fuels prematurely" which, it said, "will harm the economy, consumers and taxpayers."

And, just possibly, Mobil Corporation?

In many instances, perhaps, denial may express a vested interest. But there are common occurrences of denial in other life contexts. In American Holocaust, Stannard (1992) presented what a catalog blurb for the book called "A devastating portrait of the death, disease, misery, and apocalyptic destruction experienced by American Indians during the centuries after 1492." The pre-European inhabitants of this continent did indeed experience those woes, yet most European-descended Americans today tend not to think about the Indians' plight. Further, it is possible to be so committed to reserving the term "holocaust" for another particular disaster that one objects to its use in the title of a book about American Indians. Katz (1994) has argued that "the Holocaust" is a singular event in human history and the only example of true genocide.

And yet, remarkably, there was frequent resort to denial by survivors of the Nazi-inflicted Holocaust of World War II (Salamon, 1994). It appears that fear of ostracism by a society uninterested in what had transpired in those death camps caused many survivors to avoid speaking about what they had experienced. Denial, said the paper reporting this, "is considered one of the least mature defense mechanisms. Yet, it is the one most often employed when stressors are the most overwhelming."


This may be a clue to the Simon-Kahn puzzle. What could be more overwhelming than clear realization of the full implications of finding the world truly in a condition of life-support system erosion by severe overload? How tempting to deny the breakdown is happening, or ever could happen.

Alcoholics are commonly seen as persons overwhelmed by experiences or circumstances beyond their ability to cope, and denial is common among them and their families (Crisman, 1991). Realizing this, it seemed plausible to look into writings by psychiatrists and associated treatment professionals in search of some principles that would help explain how Simon and Kahn could contend the Global 2000 Report was diametrically wrong.

After searching not very profitably through a number of papers in psychiatric and related journals,8 I happened to encounter in Discover magazine an unexpectedly suggestive article. It described research by a neuroscientist and physician at the University of California at San Diego. He studies a rather amazing form of denial. The researcher's name is Vilayanur Ramachandran, and the form of denial he has been studying is called anosognosia. "One of the best-known victims of the condition was Supreme Court Justice William O. Douglas, who suffered a right-hemisphere stroke in 1974 that paralyzed his left side and eventually forced his retirement. He initially dismissed the paralysis as a myth, and weeks later was still inviting reporters to go on hiking expeditions with him. When one visitor asked about his left leg, he claimed he had recently been kicking 40-yard field goals with it" (Shreeve, 1995).

As Dr. Ramachandran describes it, anosognosia is a condition in which the patient does not just ignore his or her paralysis, but actively denies it "in spite of... complete inability to move." To explain away the real condition, the patient often concocts "elaborate stories or chillingly unreal rationalizations." (Confabulations is the term for these stories.)

Simon's notion that the existing contents of libraries ensure perpetual growth and progress for 7 billion years certainly resembles a confabulation (see Berlyne, 1972; Mercer et al., 1977; and Shapiro et al., 1981). The resemblance seems all the more striking the more one reads about anosognosia and its effects.9 Ramachandran's explanation for anosognosia says that this form of denial is a way of coping with an insufferable contradiction that confronts stroke patients; their paralysis is an incompatible, identity-threatening anomaly that contradicts their prior experience of themselves and their milieu.

Ramachandran has begun trying to use this unusual "window into the brain" to understand new aspects of the functioning circuitry of that truly remarkable but vulnerable organ. He postulates what he calls an "anomaly detector" as a kind of decision-making center somewhere in the brain. He has not located it specifically, but feels it must be in a part of the brain that usually interacts with the part affected by the stroke his patients have suffered, i.e., the right hemisphere. Anosognosia involves a breakdown of anomaly detection, so the patient is truly unaware of his paralysis and consequent disabilities.

In a Dictionary of Medical Syndromes (Magalini et al., 1990, 54) there is this description of anosognosia:
Inability of the patient to recognize a body or functional defect.... Denies the existence of the condition and attempts to disprove it by going through psychic process that lets him convince himself that what is said by the physician is false.

Here, of course, there is no mention of an "anomaly detector" in the patient's brain. But the reference to a "process that lets him convince himself" of the falsity of what is said by the physician (ordinarily a trusted authority) seemed a close parallel to the case of Simon and Kahn having convinced themselves the things said not only in the Global 2000 Report but by other knowledgeable writers before and since are false.

It is not the intention of the present paper to impute neurological or psychological aberration to Simon, or Kahn, or Wattenberg, or anyone else writing denials of global ecological peril. If the insights derived from accounts of anosognosia are to shed real light on the Simon-Kahn type of denials, we have to suppose there are sociocultural, interpersonal processes of "anomaly detection," not just organic ones within the individual brain. We must also suppose these interpersonal processes are subject to deflection, and that the deflective influences are discoverable sociocultural forces.

Reading a description of anosognosia (and other related forms of denial or "neglect") coupled with suggestions for care of the patient by family members (Caplan et al., 1994, 214-221) caused me to recall a long-forgotten instance of denial in my own extended family. It did not have to do with stroke or paralysis, but it did indeed suggest "anomaly detection" may sometimes be a fallible interpersonal process, not just a neural function in the patient's own brain.

The example that came to mind was a memory of my mother sadly telling of a visit with her elderly parents when her father, my grandfather, a retired physician, was dying (in effect, of old age). He had had a fall which broke a hip. Bedridden, he developed gangrene in his feet. My mother forlornly described the way her mother had turned back the blankets to show my mother the condition of my grandfather's feet, while pathetically insisting my mother must confirm my grandmother's wishful perception of (nonexistent) signs of improvement. Denial by solicited agreement.10

There are indeed social psychological patterns that resemble anosognosia, and they are not simply manifestations of neurological impairment. This is clearly evident in the case of Woodrow Wilson, 28th president of the United States, who suffered a stroke on October 2, 1919, while winding up a nationwide speaking tour to raise public support for the proposed League of Nations. Despite paralysis of the left side of his body, he remained "under the illusion, persistently fostered by those around him, that he was on the way to recovery" (Hecksher, 1991, 632-633, emphasis added; see also Hoover, 1958; Walworth, 1965; Grayson, 1977). Both Mrs. Wilson and the president's doctor, Rear Admiral Grayson, are said to have feared Wilson would "fall back into his post stroke depression" if told the plain truth about his disability.

In the aftermath of World War I, Wilson's supreme mission was to ensure future world peace by establishing the League of Nations. Led by those close to him to believe his infirmity was abating, he meant to continue campaigning for the League's establishment. "Undated rough notes in longhand reveal that at some time during [the spring of 1920] Wilson drafted a document entitled '3rd Inaugural'" (Hecksher, 1991, 633). Third term candidacy was his imagined way of bringing to ultimate fruition the presumed public support for his League by overcoming the growing opposition in the Senate. Neither his wife nor his physician would tell him, during that spring, that it was an "utter impossibility" for him to run for a third presidential term.

This tragedy of Woodrow Wilson helps clarify what stroke patients with anosognosia and writers who deny global ecological peril have in common: a compulsion to overcome what social psychologists call cognitive dissonance (Festinger, 1957). According to the theory, cognitive dissonance can be strongly aversive. A person will attempt to reduce or eliminate such dissonance between two or more cognitions. The person will act to avoid events or stimuli that would increase it. The severity or intensity of cognitive dissonance depends on the importance to the individual (...and to his peers? his reference group?) of the cognitions involved. New cognitions that will add weight to one side of the aversive contradiction can reduce the intolerable dissonance, either by diminishing the contradiction or by reducing its perceived importance (Zajonc, 1968, 360-361).

Now let us imagine a scholar who happens to be deeply committed to what Murray (1972, 219) called "the American economic model." Our scholar then encounters "cognitions" like the following:

[This model, which] values increasing growth, waste (non-recycling of essential materials), and competition [clearly violates] ecological principles that have been established by observations: Competition results in elimination of competitors; continually growing populations eventually collapse; and ecosystems whose essential materials are not recycled cannot be sustained.

There is no reason to believe that economic systems based on principles that bring collapse to ecological systems are immune to a similar fate.

These statements are certainly dissonant with his prior convictions. It should be no surprise if this committed scholar tries to reduce dissonance by amassing additional "cognitions" supporting his adherence to "the American economic model."


By thinking of denial as a defense against intolerable anomalous information, we come back to the classic assertion by Paul Sears (1964, 11) that ecology "if taken seriously as an instrument for the long-run welfare of mankind, would ... endanger the assumptions and practices accepted by modern societies...." Ecology, he said, affords by its very nature a continuing critique of human operations within the ecosystem. He agreed with F. Fraser Darling that we humans are an integral part of the ecosystem, albeit the most dominant species. Without resorting to Barry Commoner's military metaphor, Sears was already expressing concern about that clash between technosphere and ecosphere later named, described, and deplored by Commoner as a "suicidal war."

Ecological understanding of nature's limits and man's place in nature contradicts deeply entrenched cultural expectations of endless material progress. This fact has been expressed repeatedly by assorted writers who came to it from various directions. In his important but too-seldom-read book on the relation between societal forms and the kinds of energy converters used, Cottrell (1955, 2) wrote forty years ago that clear understanding of that relation was "likely to raise questions about the reliability of certain propositions which are basic to the policies of both the Communist and the Free World. Some of the makers of these policies will be unwilling to accept its implications," especially if, as Garrett Hardin (1985, 469) contended three decades later, ecology "demands that our current political, social, economic, and moral order be stood on its head."

Simple inability to do that, or committed reluctance to consider how that might happen, seems widespread. Perhaps that is what motivates men like Simon and Kahn to scorn such views and information as were presented in the Global 2000 Report. It challenged beliefs and attitudes that were central to their very identity as humans made in the Western industrial mold. In the same way, and just as fundamentally, it must challenge the beliefs and attitudes crucial to the identities of members of the 104th Congress of the United States. Is it possible that, for them, "downsizing" government (to "balance the budget" by 2002 A.D.) has a "latent function" of diverting attention from humanity's involvement in that "suicidal war" on the ecosphere? If surviving that conflict requires downsizing industrial civilization, rather than just the federal government, how long can the world afford such diversion of those who purport to shape the course of history? When will evidence (or social pressure) suffice to emancipate them from habits of denial?



Tribute to Garrett Hardin

William R. Catton, Jr.

Garrett Hardin's wisdom and clearly expressed challenges to conventional thoughtways will live on. Had his only publication been the 1968 paper in Science magazine, "The Tragedy of the Commons," that alone would have been a monumental intellectual legacy. Just where in his numerous books and papers occurred the simple truth about every "shortage" of some resource being a "longage" of demand for it, I don't recall. But his aphoristic statement of that concept perfectly reflects his keen awareness that we see not just with eyes but with ideas. He was a great purveyor of vision-widening ideas.

Only twice did I have opportunities for face-to-face conversations with Garrett Hardin, but I will long cherish having heard him address an audience at Washington State University under the title "Eskimos and Ecologists Are Happy" - meaning that one need not be demoralized either by living in an extremely limiting environment or by studying a discipline that fosters maximum awareness of limits.

Shortly thereafter it was my privilege to publish a review of his 1993 book, Living Within Limits, in the journal Human Ecology Review. As usual with Hardin's books, it was filled with epigrammatic gems. One of my favorites: "With the coinage of 'sustainable development,' the defenders of the unsteady state have won a few more years' moratorium from the painful process of thinking." We who admired him must continue our efforts to end that moratorium.

William R. Catton, Jr., Professor Emeritus, Washington State University


Conflict theory

From Wikipedia, the free encyclopedia


Conflict theories are perspectives in social science which emphasize the social, political or material inequality of a social group, which critique the broad socio-political system, or which otherwise depart from structural functionalism and ideological conservatism. Conflict theories draw attention to power differentials, such as class conflict, and generally contrast with traditional or historically dominant ideologies. Conflict theory is most commonly associated with Marxism, but as a reaction to functionalism and the positivist method it may also be associated with critical theory, feminist theory, queer theory, postmodern theory, post-structural theory, postcolonial theory, and a variety of other perspectives.



Basic conflicts

Social identity theory (SIT) can restore some coherence to organizational identification, and it can suggest fruitful applications to organizational behavior. SIT offers a social-psychological perspective, developed principally by Henri Tajfel (1978, 1981; Tajfel & Turner, 1985) and John Turner (1975, 1982, 1984, and 1985). Following a review of the literature on SIT, the antecedents and consequences of social identification in organizations are discussed. This perspective is then applied to three domains of organizational behavior: socialization, role conflict, and intergroup relations.

Social Identity Theory

According to SIT, people tend to classify themselves and others into various social categories, such as organizational membership, religious affiliation, gender, and age cohort (Tajfel & Turner, 1985). As these examples suggest, people may be classified in various categories, and different individuals may utilize different categorization schemas. Categories are defined by prototypical characteristics abstracted from the members (Turner, 1985). Social classification serves two functions. First, it cognitively segments and orders the social environment, providing the individual with a systematic means of defining others.

Fisher argues that intergroup conflicts arise from objective differences of interest, coupled with antagonistic or controlling attitudes or behaviors. Incompatibilities, which can prompt conflict, include economic, power or value differences, or differences in needs-satisfaction. Often intergroup conflicts have a mixture of these elements.

These incompatibilities can then be exacerbated into destructive intergroup conflict by common perceptual and cognitive processes. The very act of group categorization tends to create some in-group favoritism. Conflict between groups encourages negative stereotyping of the opposing group. Cognitive biases lead individuals to attribute positive personal characteristics to fellow in-group members and excuse their negative behaviors. At the same time, such biases lead people to attribute negative characteristics to out-group members and explain away any positive behaviors.




Children enter school with a wide range of knowledge and physical, social, emotional, linguistic, and cognitive skills. Because wealthier families spend much more money on their children's school preparation than their poorer counterparts, children from lower income families, whose knowledge and skills are far behind those of their classmates from wealthier families, enter school at a disadvantage [1]. If these delayed children are unable to catch up, they face greater challenges throughout their school careers. Studies [1] have indicated that the racial gap in achievement scores of high school students is oftentimes already evident when children first begin school. With regard to preschool training, according to [1], children living in poverty are much less likely than non-poor children to be able to recognize the letters of the alphabet, count to higher numbers, write their name, or read. Furthermore, children's cognitive/literacy school readiness skills are higher among those with more educated mothers. Researchers [2] have also noted that children from higher class families are more likely to have a home environment that provides the intellectual skills they need to do well in school. Because of this early preparation, many researchers have found that middle- and upper-class children are already ahead of lower-class children in intellectual ability before the first year of school.

Once children get to school, poorer students, who are already less prepared for school than their non-poor counterparts, typically face additional hardships with regard to school quality. School quality varies by neighborhood socioeconomic status. Because schools are funded by local property taxes, there are large differences in per-pupil expenditures, such that expenditures in the richest 5% of schools are more than twice the expenditures in the poorest 5% [3]. In addition to these economic measures, there are large differences in many non-economic measures of school quality, such as school violence, the number of AP course offerings, and the extent of the school's library collections. School differences such as these influence the degree of education that students obtain and perpetuate continuing income and race differentials in the education system.

As confirmed by [4] through data collected from public schools, tracking is another means by which education structures inequality. A large percentage of U.S. public schools follow the practice of placing children in different tracks that prepare some students for college and others for vocational skills that do not lead to college [5]. Factors such as measured intellectual skills and class background influence track placement. Because cognitive skills and academic performance are influenced by class background and race, the effect is similar: tracking tends to separate children by class and race and limits opportunities for students to move from one academic track to another. Researchers [6] have confirmed this criticism of educational tracking by showing that children in the college-prep track improve in academic achievement over the years, while those in the lower track perform at lower levels. Others [7] propose that this differential achievement in school occurs because of the different expectations of administrators, teachers, and parents for students in the separate tracks. Evidence such as this, and results from studies comparing differences in tracked and non-tracked educational systems [8], confirms that schooling practices such as tracking increase educational inequality, reinforce class differences, and differentiate children in terms of family background.

With respect to conflict theory, employment requirements reflect the efforts of the bourgeoisie, or the upper class, to monopolize or dominate jobs by imposing their cultural standards on the selection process. [9] examined educational upgrading as a means of maintaining class boundaries, noting that when college degrees were much more limited and the middle class typically had only high school degrees, middle class occupations required a high school degree. However, as more middle class Americans obtained college degrees and more of the working class obtained high school degrees, middle class occupations were upgraded so that they required a college degree. [7] argued that education is a certification of class membership more than of technical skills: it certifies that people have learned to respect authority and to accept the values, ideals, and system of inequality in the occupational structure. Furthermore, [7] postulate that the educational system teaches people to subordinate themselves properly, reproducing the social relations of production by valuing the cultural capital of the upper class and devaluing the cultural capital of the lower class. In other words, schools train the wealthy to take up places at the top of the economy while conditioning the poor to accept their lowly status in the class structure.

Modes of conflict

In conflict theory there are different modes of conflict. One mode is warfare and revolution. Warfare and revolutions take place in phases due to rocky "coalitions among a variety of social classes." An example of warfare is the conflict currently going on in Burma, where the military and the population are fighting for control over the country's government. Another mode of conflict is the strike. Modern society has created a main social divide between workers and managers. When workers feel they have been treated unfairly, they go on strike to reclaim some measure of power. A further mode of conflict is domination. Different social classes tend to form different ideologies built around the promotion of their own class' welfare. Different groups will struggle in conflict over what they think is right, what the norms are, and their ideologies. Higher classes hold more abstract ideologies, while the ideas of subordinated classes reflect the wants of their own lives. The ideas of the ruling class are the ruling ideas: the class that is the ruling material force is also the ruling intellectual force.


The following are four primary assumptions of modern conflict theory:[10]

  1. Competition. Competition over scarce resources (money, leisure, sexual partners, and so on) is at the heart of all social relationships. Competition rather than consensus is characteristic of human relationships in all societies to which this theory is applicable (Marxian materialists assert that there is no inherently competitive human nature; rather, humans are influenced by their surroundings, resulting in a competitive propensity).
  2. Structural inequality. Inequalities in power and reward are built into all social structures. Individuals and groups that benefit from any particular structure strive to see it maintained.
  3. Revolution. Change occurs as a result of conflict between competing social classes rather than through adaptation. Change is often abrupt and revolutionary rather than evolutionary.
  4. War. Even war can be a unifier of the societies involved, though it may also end whole societies. In modern society, a source of conflict is power: politicians compete to enter into a system; they act in their self-interest, not for the welfare of the people.

See also

Structural functionalism

From Wikipedia, the free encyclopedia


Structural functionalism is a broad perspective in the social sciences which addresses social structure in terms of the function of its constituent elements, namely norms, customs, traditions and institutions. It studies society as a structure with interrelated parts. A common analogy, popularized by Herbert Spencer, regards these interrelated parts of society as "organs" that work toward the proper functioning of the "body" as a whole.[1] The perspective was implicit in the thought of the original sociological positivist, Auguste Comte, who stressed the need for cohesion after the social malaise of the French revolution. It was later presented in the work of Émile Durkheim, who developed a full theory of organic solidarity, again informed by positivism, or the quest for "social facts."

Although functionalism shares a theoretical affinity with the empirical method, theorists such as Bronisław Malinowski and Talcott Parsons are to some extent antipositivist.[2] Parsons, in fact, saw "structural functionalism" as descriptive of a particular stage in the methodological development of the social sciences rather than a specific school of thought.[3] While functionalism has an affinity with "grand theory" (for example, the systems theory of Niklas Luhmann), emphasis may be placed on small units of socialization, such as the family. It is also simplistic to equate the perspective directly with political conservatism.[4] Functionalism has even been associated with thinkers as seemingly remote from it as the post-structuralist philosopher Michel Foucault.[5] In the most basic terms, it simply emphasises "the effort to impute, as rigorously as possible, to each feature, custom, or practice, its effect on the functioning of a supposedly stable, cohesive system."[2]



Theoretical background

Structural functionalism has historical affinity with the application of the scientific method in social theory and research. Sociological positivism asserts that one can study the social world in the same ways as one studies the physical world, and that social laws are directly and objectively observable. Certain contemporary functionalists have, in contrast, rejected empirical methods. Nevertheless, structural functionalists are broadly united in the view, firstly, that rules and regulations (both informal norms and formal laws) are necessary to organise a society effectively and, secondly, that social institutions (both traditional and governmental) form the necessary constituent parts of the social structure.

Although Comte may be defined as a structural-functionalist, the perspective was developed primarily through the work of Émile Durkheim, who emphasised the central role that moral consensus plays in maintaining social order and creating an equilibrium or a normal state of society. Durkheim was concerned with the question of how certain societies maintain internal stability and survive over time. He proposed that such societies tend to be segmented, with equivalent parts held together by shared values, common symbols or, as his nephew Marcel Mauss held, systems of exchanges. In modern, complicated societies, members perform very different tasks, resulting in a strong interdependence. Based on the metaphor above of an organism in which many parts function together to sustain the whole, Durkheim argued that complicated societies are held together by organic solidarity.

These views were upheld by Radcliffe-Brown, who, following Auguste Comte, believed that society constitutes a separate "level" of reality, distinct from both biological and inorganic matter. Explanations of social phenomena had therefore to be constructed within this level, individuals being merely transient occupants of comparatively stable social roles.

Durkheim proposed that most stateless, "primitive" societies, lacking strong centralised institutions, are based on an association of corporate-descent groups. Structural functionalism also took on Malinowski's argument that the basic building block of society is the nuclear family, and that the clan is an outgrowth, not vice versa.

The central concern of structural functionalism is a continuation of the Durkheimian task of explaining the apparent stability and internal cohesion needed by societies to endure over time. Societies are seen as coherent, bounded and fundamentally relational constructs that function like organisms, with their various parts (or social institutions) working together in an unconscious, quasi-automatic fashion toward achieving an overall social equilibrium. All social and cultural phenomena are therefore seen as functional in the sense of working together, and are effectively deemed to have "lives" of their own. They are primarily analysed in terms of this function. The individual is significant not in and of himself but rather in terms of his status, his position in patterns of social relations, and the behaviours associated with his status. The social structure, then, is the network of statuses connected by associated roles.

In the 1970s, political scientists Gabriel Almond and Bingham Powell introduced a structural-functionalist approach to comparing political systems. They argued that, in order to understand a political system, it is necessary to understand not only its institutions (or structures) but also their respective functions. They also insisted that these institutions, to be properly understood, must be placed in a meaningful and dynamic historical context.

This idea stood in marked contrast to prevalent approaches in the field of comparative politics — the state-society theory and the dependency theory. These were the descendants of David Easton's system theory in international relations, a mechanistic view that saw all political systems as essentially the same, subject to the same laws of "stimulus and response" — or inputs and outputs — while paying little attention to unique characteristics. The structural-functional approach is based on the view that a political system is made up of several key components, including interest groups, political parties and branches of government.

In addition to structures, Almond and Powell showed that a political system consists of various functions, chief among them political socialisation, recruitment and communication: socialisation refers to the way in which societies pass along their values and beliefs to succeeding generations, and in political terms describes the process by which a society inculcates civic virtues, or the habits of effective citizenship; recruitment denotes the process by which a political system generates interest, engagement and participation from citizens; and communication refers to the way that a system promulgates its values and information.

Prominent theorists

Herbert Spencer

Herbert Spencer, a British philosopher famous for applying the theory of natural selection to society, was in many ways the first true sociological functionalist;[6] in fact, while Durkheim is widely considered the most important functionalist among positivist theorists, it is well known that much of his analysis was culled from reading Spencer's work, especially his Principles of Sociology (1874-96).

While most avoid the tedious task of reading Spencer's massive volumes (filled as they are with long passages explicating the organismic analogy, with reference to cells, simple organisms, animals, humans and society), there are some important insights that have quietly influenced many contemporary theorists, including Talcott Parsons in his early work The Structure of Social Action (1937). Cultural anthropology, too, uses functionalism consistently.

This evolutionary model, unlike most nineteenth-century evolutionary theories, is cyclical, beginning with the differentiation and increasing complication of an organic or "super-organic" (Spencer's term for a social system) body, followed by a fluctuating state of equilibrium and disequilibrium (or a state of adjustment and adaptation), and, finally, a stage of disintegration or dissolution. Following Thomas Malthus' population principles, Spencer concluded that society is constantly facing selection pressures (internal and external) that force it to adapt its internal structure through differentiation.

Every solution, however, causes a new set of selection pressures that threaten society's viability. Spencer was not a determinist, however: he never claimed that

  1. selection pressures will be felt in time for society to change in response;
  2. they will be felt and reacted to; or
  3. the solutions will always work.

In fact, he was in many ways a political sociologist,[7] and recognised that the degree of centralised and consolidated authority in a given polity could make or break its ability to adapt. In other words, he saw a general trend towards the centralisation of power as leading to stagnation and, ultimately, pressure to decentralise.

More specifically, Spencer recognised three functional needs or prerequisites that produce selection pressures: they are regulatory, operative (production) and distributive. He argued that all societies need to solve problems of control and coordination, production of goods, services and ideas, and, finally, to find ways of distributing these resources.

Initially, in tribal societies, these three needs are inseparable, and the kinship system is the dominant structure that satisfies them. As many scholars have noted, all institutions are subsumed under kinship organisation,[8] but, with increasing population (both in terms of sheer numbers and density), problems emerge with regards to feeding individuals, creating new forms of organisation — consider the emergent division of labour —, coordinating and controlling various differentiated social units, and developing systems of resource distribution.

The solution, as Spencer sees it, is to differentiate structures to fulfil more specialised functions; thus a chief or "big man" emerges, soon followed by a group of lieutenants, and later kings and administrators.

Perhaps Spencer's greatest obstacle to being widely discussed in modern sociology is the fact that much of his social philosophy is rooted in the social and historical context of Victorian England. He coined the term "survival of the fittest" in discussing the simple fact that small tribes or societies tend to be defeated or conquered by larger ones. Of course, many sociologists still use him (knowingly or otherwise) in their analyses, as is especially the case in the recent re-emergence of evolutionary theory.

Talcott Parsons

Talcott Parsons was heavily influenced by Durkheim and Max Weber, synthesising much of their work into his action theory, which he based on the system-theoretical concept and the methodological principle of voluntary action. He held that "the social system is made up of the actions of individuals."[9] His starting point, accordingly, is the interaction between two individuals faced with a variety of choices about how they might act,[10] choices that are influenced and constrained by a number of physical and social factors.[11]

Parsons determined that each individual has expectations of the other's action and reaction to his own behaviour, and that these expectations would (if successful) be "derived" from the accepted norms and values of the society they inhabit.[12] As Parsons himself emphasised, however, in a general context there would never exist any perfect "fit" between behaviours and norms, so such a relation is never complete or "perfect."

Social norms were always problematic for Parsons, who never claimed (as has often been alleged) that social norms were generally accepted and agreed upon as if this constituted some kind of universal law. Whether social norms were accepted or not was, for Parsons, simply a historical question.

As behaviours are repeated in more interactions, and these expectations are entrenched or institutionalised, a role is created. Parsons defines a "role" as the normatively-regulated participation "of a person in a concrete process of social interaction with specific, concrete role-partners."[13] Although any individual, theoretically, can fulfil any role, she is expected to conform to the norms governing the nature of the role she fulfils.[14]

Furthermore, one person can and does fulfil many different roles at the same time. In one sense, an individual can be seen to be a "composition"[15] of the roles he inhabits. Certainly, today, when asked to describe themselves, most people would answer with reference to their societal roles.

Parsons later developed the idea of roles into collectivities of roles that complement each other in fulfilling functions for society.[16] Some roles are bound up in institutions and social structures (economic, educational, legal and even gender-based). These are functional in the sense that they assist society in operating[17] and fulfil its functional needs so that society runs smoothly.

A society where there is no conflict, where everyone knows what is expected of him, and where these expectations are consistently met, is in a perfect state of equilibrium. The key processes for Parsons in attaining this equilibrium are socialisation and social control. Socialisation is important because it is the mechanism for transferring the accepted norms and values of society to the individuals within the system. Perfect socialisation occurs when these norms and values are completely internalised, when they become part of the individual's personality.[18]

Parsons states that "this point [...] is independent of the sense in which [the] individual is concretely autonomous or creative rather than 'passive' or 'conforming', for individuality and creativity, are to a considerable extent, phenomena of the institutionalization of expectations";[19] they are culturally constructed.

Socialisation is supported by the positive and negative sanctioning of role behaviours that do or do not meet these expectations.[20] A punishment could be informal, like a snigger or gossip, or more formalised, through institutions such as prisons and mental homes. If these two processes were perfect, society would become static and unchanging, and in reality this is unlikely to occur for long.

Parsons recognises this, stating that he treats "the structure of the system as problematic and subject to change,"[21] and that his concept of the tendency towards equilibrium "does not imply the empirical dominance of stability over change."[22] He does, however, believe that these changes occur in a relatively smooth way.

Individuals in interaction with changing situations adapt through a process of "role bargaining."[23] Once the roles are established, they create norms that guide further action and are thus institutionalised, creating stability across social interactions. Where the adaptation process cannot adjust, due to sharp shocks or immediate radical change, structural dissolution occurs and either new structures (and therefore a new system) are formed, or society dies.

This model of social change has been described as a "moving equilibrium,"[24] and emphasises a desire for social order.

Robert Merton

Robert Merton was a functionalist, and he fundamentally agreed with Parsons’ theory. However, he acknowledged that it was problematic, believing it to be too generalised [Holmwood, 2005:100]. Merton tended to emphasise middle-range theory rather than grand theory, meaning that he was able to deal specifically with some of the limitations in Parsons’ theory. He identified three main limitations: functional unity, universal functionalism and indispensability [Ritzer in Gingrich, 1999]. He also developed the concept of deviance and made the distinction between manifest and latent functions.

Merton criticised functional unity, saying that not all parts of a modern, complex society work for the functional unity of society. Some institutions and structures may have other functions, and some may even be generally dysfunctional, or be functional for some while being dysfunctional for others. This is because not all structures are functional for society as a whole. Some practices are only functional for a dominant individual or a group [Holmwood, 2005:91]. Here Merton introduces the concepts of power and coercion into functionalism and identifies the sites of tension which may lead to struggle or conflict. Merton states that by recognizing and examining the dysfunctional aspects of society we can explain the development and persistence of alternatives. Thus, as Holmwood states, “Merton explicitly made power and conflict central issues for research within a functionalist paradigm” [2005:91].

Merton also noted that there may be functional alternatives to the institutions and structures currently fulfilling the functions of society. This means that the institutions that currently exist are not indispensable to society. Merton states that “just as the same item may have multiple functions, so may the same function be diversely fulfilled by alternative items” [cited in Holmwood, 2005:91]. This notion of functional alternatives is important because it reduces the tendency of functionalism to imply approval of the status quo.

Merton’s theory of deviance is derived from Durkheim’s idea of anomie. It is central in explaining how internal changes can occur in a system. For Merton, anomie means a discontinuity between cultural goals and the accepted methods available for reaching them.

Merton believes that there are five situations facing an actor:

  1. Conformity: the actor accepts both the cultural goals and the institutionalised means of attaining them.
  2. Innovation: the actor accepts the goals but rejects, or lacks access to, the legitimate means, and devises new ones.
  3. Ritualism: the actor abandons the goals but continues to adhere rigidly to the accepted means.
  4. Retreatism: the actor rejects both the goals and the means.
  5. Rebellion: the actor rejects both the goals and the means and seeks to replace them with new ones.

Thus it can be seen that change can occur internally in society through either innovation or rebellion. It is true that society will attempt to control these individuals and negate the changes, but as the innovation or rebellion builds momentum, society will eventually adapt or face dissolution.

The last of Merton’s important contributions to functionalism was his distinction between manifest and latent functions. Manifest functions refer to the conscious intentions of actors; latent functions are the objective consequences of their actions, which are often unintended [Holmwood, 2005:90]. Merton used the example of the Hopi rain dance to show that sometimes an individual’s understanding of their motive for an action may not fully explain why that action continues to be performed. Sometimes actions fulfill a function of which the actor is unaware, and this is the latent function of an action.

Social change

Talcott Parsons analyzed society as a complex system of equilibria, but it is a distortion to claim that he believed these equilibria would ever be in some kind of "perfect" balance, or that a disturbed equilibrium would return "quickly" to its previous position. He never argued anything of the kind. On the contrary, Parsons maintained that in most societies the value-integration (and therefore the relative state of "equilibrium") is significantly incomplete, and that in a modern society anything approaching "complete" system-integration is utopian. Indeed, he argued that the built-in value-ambivalences and tensions which characterize almost all cultural systems make the idea of "optimal" social integration a sheer utopia. Parsons never claimed that one part of the societal system "must" adapt to another; no such "must" exists. He did maintain, however, that insufficient levels of system-adaptation would have various "problematic" consequences, depending on the exact historical situation. Naturally, if a society suffers severe integrative malfunctions, its survival is ultimately at stake; after all, many empires and civilizations have vanished as history has marched along.

From a theoretical point of view, Parsons discussed social evolution in the light of four distinct systemic processes. These are:

  1. Differentiation
  2. Adaptive Upgrading
  3. Inclusion
  4. Value Generalization

Structural functionalism and unilineal descent

In their attempt to explain the social stability of African "primitive" stateless societies where they undertook their fieldwork, Evans-Pritchard (1940) and Meyer Fortes (1945) argued that the Tallensi and the Nuer were primarily organised around unilineal descent groups. Such groups are characterised by common purposes, such as administering property or defending against attacks; they form a permanent social structure that persists well beyond the lifespan of their members. In the case of the Tallensi and the Nuer, these corporate groups were based on kinship, which in turn fitted into the larger structures of unilineal descent. Moreover, in this African context territorial divisions were aligned with lineages; the model therefore synthesised both blood and soil as two sides of one coin (cf. Kuper, 1988:195). Affinal ties with the parent through whom descent is not reckoned are considered to be merely complementary or secondary (Fortes created the concept of "complementary filiation"), with the reckoning of kinship through descent being considered the primary organising force of social systems. Because of this strong emphasis on unilineal descent, Evans-Pritchard's and Fortes' new kinship theory came to be called "descent theory".

Before long, descent theory had found its critics. Many African tribal societies seemed to fit this neat model rather well, although Africanists such as Richards argued that Fortes and Evans-Pritchard had deliberately downplayed internal contradictions and overemphasised the stability of the local lineage systems and their significance for the organisation of society.[25] In many Asian settings, however, the problems were even more obvious. In Papua New Guinea, the local patrilineal descent groups were fragmented and contained large numbers of non-agnates. Status distinctions did not depend on descent, and genealogies were too short to account for social solidarity through identification with a common ancestor. In particular, the phenomenon of cognatic (or bilateral) kinship posed a serious problem to the proposition that descent groups are the primary element behind the social structures of "primitive" societies.

Leach's (1966) critique came in the form of the classical Malinowskian argument, pointing out that "in Evans-Pritchard's studies of the Nuer and also in Fortes's studies of the Tallensi unilineal descent turns out to be largely an ideal concept to which the empirical facts are only adapted by means of fictions" (1966:8). People's self-interest, manoeuvring, manipulation and competition had been ignored. Moreover, descent theory neglected the significance of marriage and affinal ties, which were emphasised by Lévi-Strauss' structural anthropology, while overemphasising the role of descent. To quote Leach: "The evident importance attached to matrilateral and affinal kinship connections is not so much explained as explained away."[26]


Criticisms

In the 1960s, functionalism was criticized for being unable to account for social change, or for structural contradictions and conflict (and thus was often called "consensus theory"). The criticism that functionalism is static and has no concept of change has already been addressed above: while Parsons’ theory allows for change, it is an orderly process of change [Parsons, 1961:38], a moving equilibrium. Referring to Parsons’ theory of society as static is therefore inaccurate. It is true that it places emphasis on equilibrium and the maintenance or quick return to social order, but this is a product of the time in which Parsons was writing (post-World War II and the start of the Cold War). Society was in upheaval and fear abounded; social order was crucial at the time, and this is reflected in Parsons’ tendency to promote equilibrium and social order rather than social change.

Furthermore, Durkheim favored a radical form of guild socialism along with functionalist explanations. Marxism, too, while acknowledging social contradictions, still uses functionalist explanations. Parsons' evolutionary theory describes the differentiation and reintegration of systems and subsystems, and thus at least temporary conflict before reintegration (ibid). "The fact that functional analysis can be seen by some as inherently conservative and by others as inherently radical suggests that it may be inherently neither one nor the other." (Merton 1957: 39)

Stronger criticisms include the epistemological argument that functionalism is teleological: that is, it attempts to describe social institutions solely through their effects, and thereby does not explain the cause of those effects. However, Parsons drew directly on many of Durkheim’s concepts in creating his theory. Certainly Durkheim was one of the first theorists to explain a phenomenon with reference to the function it served for society. He said, “the determination of function is…necessary for the complete explanation of the phenomena” [cited in Coser, 1977:140]. However, Durkheim made a clear distinction between historical and functional analysis, saying, “when…the explanation of a social phenomenon is undertaken, we must seek separately the efficient cause which produces it and the function it fulfills” [cited in Coser, 1977:140]. If Durkheim made this distinction, then it is unlikely that Parsons did not. Merton, moreover, explicitly states that functional analysis does not seek to explain why an action happened in the first instance, but why it continues or is reproduced. He says that “latent functions…go far towards explaining the continuance of the pattern” [cited in Elster, 1990:130, emphasis added]. It can therefore be argued that functionalism does not explain the original cause of a phenomenon with reference to its effect, and is therefore not teleological.

Another criticism is the ontological argument that society cannot have "needs" as a human being does, and that even if society does have needs, they need not be met. Anthony Giddens argues that functionalist explanations may all be rewritten as historical accounts of individual human actions and consequences (see Structuration theory).

A further criticism directed at functionalism is that it contains no sense of agency, that individuals are seen as puppets, acting as their role requires. Yet Holmwood states that the most sophisticated forms of functionalism are based on “a highly developed concept of action” [2005:107], and, as was explained above, Parsons took as his starting point the individual and their actions. His theory did not, however, articulate how these actors exercise their agency in opposition to the socialisation and inculcation of accepted norms. As has been shown above, Merton addressed this limitation through his concept of deviance, and so it can be seen that functionalism allows for agency. It cannot, however, explain why individuals choose to accept or reject the accepted norms, or why and in what circumstances they choose to exercise their agency, and this remains a considerable limitation of the theory.

Further criticisms have been levelled at functionalism by proponents of other social theories, particularly conflict theorists, Marxists, feminists and postmodernists. Conflict theorists criticised functionalism’s concept of systems as giving far too much weight to integration and consensus, and neglecting independence and conflict [Holmwood, 2005:100]. Lockwood [in Holmwood, 2005:101], in line with conflict theory, suggested that Parsons’ theory missed the concept of system contradiction: it did not account for those parts of the system that might have tendencies towards mal-integration. According to Lockwood, it is these tendencies that come to the surface as opposition and conflict among actors. Parsons, however, thought that the issues of conflict and cooperation were very much intertwined, and sought to account for both in his model [Holmwood, 2005:103]. In this he was limited by his analysis of an ‘ideal type’ of society characterised by consensus. Merton, through his critique of functional unity, introduced into functionalism an explicit analysis of tension and conflict.

Marxism, which was revived soon after the emergence of conflict theory, criticised professional sociology (functionalism and conflict theory alike) for being partisan to advanced welfare capitalism [Holmwood, 2005:103]. Gouldner [in Holmwood, 2005:103] thought that Parsons’ theory specifically was an expression of the dominant interests of welfare capitalism, in that it justified institutions with reference to the function they fulfill for society. It may be that Parsons’ work implied or articulated that certain institutions were necessary to fulfill the functional prerequisites of society, but whether or not this is the case, Merton explicitly states that institutions are not indispensable and that there are functional alternatives. That he does not identify any alternatives to the current institutions does reflect a conservative bias, which, as has been stated before, is a product of the specific time in which he was writing.

As functionalism’s prominence was ending, feminism was on the rise, and it attempted a radical criticism of functionalism, arguing that functionalism neglected the suppression of women within the family structure. Holmwood [2005:103] shows, however, that Parsons did in fact describe the situations where tensions and conflict existed or were about to take place, even if he did not articulate those conflicts. Some feminists agree, suggesting that Parsons provided accurate descriptions of these situations [Johnson in Holmwood, 2005:103]. On the other hand, Parsons recognised that he had oversimplified his functional analysis of women in relation to work and the family, focusing on the positive functions of the family for society and not on its dysfunctions for women. Merton, too, although addressing situations where function and dysfunction occurred simultaneously, lacked a “feminist sensibility” [Holmwood, 2005:103], although, again, this was likely a product of the prevailing desire for social order.

Postmodernism, as a theory, is critical of claims of truth. Therefore the idea of grand theory that can explain society in all its forms is treated with skepticism at the least. This critique is important because it exposes the danger that grand theory can pose, when not seen as a limited perspective, as one way of understanding society.

Jeffrey Alexander (1985) sees functionalism as a broad school rather than a specific method or system such as Parsons', one capable of taking equilibrium (stability) as a reference point rather than an assumption, and of treating structural differentiation as a major form of social change. "The name 'functionalism' implies a difference of method or interpretation that does not exist." (Davis 1967: 401) This removes the determinism criticized above. Cohen argues that, rather than needs, a society has dispositional facts: features of the social environment that support the existence of particular social institutions but do not cause them. (ibid)

New support from multilevel selectionists

Some evolutionary theorists—including the biologist David Sloan Wilson and the anthropologists Robert Boyd and Peter Richerson—have provided support for structural functionalism by proposing a framework called multilevel selection. On this view, the structures of a society, such as religion, are Darwinian (biological or cultural) adaptations at the group level. Wilson has thereby put new life into the old "society as organism" metaphor.

Famous functionalists

Famous functionalists include:

See also