
In the social sciences generally, as well as in the social science of religion, the term theory is used in a multitude of ways. In a sense, every specific theory embodies a somewhat different idea of what theory means, so it is not surprising that this word tends to confuse people. For example, fully 93 articles in the Stanford Encyclopedia of Philosophy have “theory” in their titles, yet they approach it from almost as many different directions.

Citing the work of Rodney Stark and William Bainbridge, we offer the following general definition of a theory:

A theory is a set of statements, or hypotheses, about relationships among a set of abstract concepts. These statements say how and why the concepts are interrelated. Furthermore, these statements must give rise to implications that potentially are falsifiable empirically.


Rodney Stark and William Sims Bainbridge, A Theory of Religion (New York: Peter Lang, 1987), p. 13.
The following theories are discussed below.

There are many ways in which serious theories of the biological evolution of the human brain can offer hypotheses about the nature of religion and about variations in religious beliefs and practices across history and across subgroups in the population (Watts and Turner, 2014).  Most obviously, if evidence shows that religion has on balance been beneficial for humanity, it can be said to have evolved over time through natural selection from the varieties of ideas and activities oriented toward the supernatural that naturally spring up.  However, that simple idea leaves open whether the evolution was primarily biological or cultural, and it does not immediately suggest what kinds of research could clarify the mechanisms involved and establish the degree of truth to the theory.  Recently, innovative theory development and empirical research have raised legitimate hopes that fundamental questions about the human mind can be answered reliably using such techniques as functional magnetic resonance imaging “brain scans” and genetic sequencing to compare the biological characteristics of people who exhibit different patterns of behavior.  This really does seem to be a time of very rapid scientific advance, but it is too early to be entirely confident in any of the specific hypotheses or research findings.

One area that is relatively easy to understand but where conclusive research remains an accomplishment for the future concerns memory.  The brain’s methods for storing information appear to be very complex, placing different kinds of memory in different locations and probably coding them in different ways as well.  One standard distinction is between episodic memory, which remembers specific events, and semantic memory, which is more abstract and remembers facts or procedures that apply to multiple situations.  A memory of when a cat caused a Christmas tree to fall down by climbing it is episodic and associated with an image of exactly where that tree stood in which house, while knowledge that ornamented Christmas trees are a well-established tradition is semantic and does not refer to a specific time or place.  Harvey Whitehouse (2004) has suggested that different kinds of religious organizations may favor one or the other kind of memory.

Whitehouse says that sects emphasize episodic memory, while mainstream denominations emphasize semantic memory, although the difference is a matter of degree and all religions make some use of both.  Sects give greater significance to extreme religious experiences that happen under specific circumstances and are very emotional, thus having their enduring effect through episodic memories, for example of an adult baptism.  Low-intensity denominations make greater use of repetition of standard doctrines, and someone who was baptized as an infant will lack a memory of that episode, but will have learned beliefs and developed commitment gradually as an older child in Sunday school and other settings, relying upon semantic memory.

Although episodic and semantic memories are distinguishable, they blend into each other, and one of the interesting but underdeveloped topics of research is how religious episodic memories may change over time, for example to come semantically into conformity with the doctrines of the religion to which the individual belongs.  Somewhat controversially, Michael Carroll (1983, 1985) examined the ways in which ill-defined psychologically extreme episodes could be shaped by the beliefs of the community to become memories of apparitions of the Virgin Mary.  Cognitive scientists have explored more fully how episodic and other kinds of memory may change over time, especially becoming incorrect, often excessively simplified or adjusted to fit a stereotype (Conway 1997, 2001).  This raises the worrisome possibility that all memories of communication with supernatural beings are false, explainable in terms of common cognitive errors.

A frequently discussed theory of the origins of religion concerns the possibility that the human brain contains a module that evolved to model the behavior of other people, allowing us to interact and cooperate with each other.  This possibility has been theorized in various ways.  For example, H. Porter Abbott (2003) suggests that the main cognitive process that guides human action, above low-level details that may be non-verbal such as how to grasp and lift an object, is structured in terms of narrative.  In a narrative, a protagonist seeks to achieve a goal, faces obstacles, must gain resources and allies, and goes through a series of steps toward that goal.  In everyday life as well as in literary narratives, including the Bible, we understand the behavior of other people in terms of that kind of narrative structure.  Perhaps we also apply that mode of thinking to any complex process, and interpret natural processes like the weather in terms of the actions of conscious beings or gods.  Abbott notes the irony that our propensity to think in terms of narratives is a result of biological evolution, yet it makes it difficult for people to accept the theory of evolution through natural selection from random variations, because it is “non-narratable” and does not fit that model of intentional action.

A rather different but not necessarily contradictory cognitive theory postulates that there exists a distinct system in the brain, the so-called “mirror neurons” (Dapretto et al., 2006), that evolved specifically to model the behavior of other people.  How distinct these neurons may be is a question for empirical research, for example doing brain scans that identify where in the brain neurons are active when a person performs a specific action, like grasping and lifting an object, in comparison with brain scans when the person watches somebody else perform the same action.  Several cognitive scientists have argued that belief in gods reflects hyperactivity of a module in the mind that interprets the intentions of other minds (Boyer, 2001; Atran 2002; Barrett 2004).  The fact that some portion of the human brain serves practical functions very well does not mean that it does an equally good job outside the area of life for which it evolved.  Yet this kind of theory may have explanatory power, placing religious phenomena into new contexts of understanding without completely rejecting them.  For example, if we develop mental models in our own minds of the people closest to us in life, such as our parents, those models will endure after the biological deaths of those significant others.  That is to say, if a version of our deceased parent’s mind persists within our own brain, then it has entered a kind of afterlife, if not exactly the form described at the church we attend (Bering, 2006).

Recent research has taken this kind of thinking to a new level.  Clearly our brains do contain specialized modules, most obvious in the case of anatomically distinct components like the cerebrum and cerebellum, and people may differ in the relative strengths of some of these components, whether for genetic reasons or because environmental conditions favored development of one or another brain component during childhood.  One possibility considered by several cognitive scientists is that two modules in the brain may actually be in competition with each other: one perhaps based in mirror neurons that enables social bonding, and another that energizes abstract thought.  Given the complexity and constant change in human society, evolution may ultimately favor both, and even favor diversity such that individuals differ greatly in the dominance of one or the other of these two functions.  A standard game-theory principle in the evolution of a cognitively complex species is that selection favoring cooperation may set the stage for individualism, by giving some individuals the opportunity to exploit their more docile conspecifics (Maynard Smith 1982).

However, the human environment has been unstable over the millennia, as our species emerged from East Africa and developed technologies that allowed us to thrive in a vast diversity of environments; a diverse gene pool therefore allowed us to benefit from a division of labor, in which many people were content to live in a stable community, while some were better able to deal with novel circumstances.  This is the starting point for analysis by sociologist Satoshi Kanazawa (2010) in a provocatively titled article, “Why Liberals and Atheists Are More Intelligent.”  He does not assert that political liberals and atheists are better people, but that the brain component responsible for what psychologists traditionally called general intelligence evolved not to confer overall intellectual superiority but to handle novelty, such as changes in the surrounding environment.  Conservatives and religious people have stronger brain components devoted to maintenance of beneficial traditions and social solidarity.  Over the course of human history, both propensities combined to serve human survival and well-being, so neither kind of brain module became dominant in the gene pool.

A significant number and diversity of individually limited research studies already explore ideas like this from a variety of perspectives, raising new questions even as research methods improve.  A 2016 research article begins by framing one subset of these questions:

Prior work has established that analytic thinking is associated with disbelief in God, whereas religious and spiritual beliefs have been positively linked to social and emotional cognition. However, social and emotional cognition can be subdivided into a number of distinct dimensions, and some work suggests that analytic thinking is in tension with some aspects of social-emotional cognition. This leaves open two questions. First, is belief linked to social and emotional cognition in general, or a specific dimension in particular? Second, does the negative relationship between belief and analytic thinking still hold after relationships with social and emotional cognition are taken into account? (Jack et al., 2016:1; cf. Gervais and Norenzayan, 2012)

On the basis of eight studies, this particular article suggested that the brain function associated with religion might not be a hyperactive “other minds” module that misinterprets natural events as the actions of conscious supernatural beings, but rather a module responsible for ethical thinking and thus moral behavior.  Without jumping to the premature conclusion that we now understand the human mind, we should ponder how this theory places morality at the center of religion, rather than gods.  However, a recent review essay by Paul Bloom (2012:179), one of the most influential cognitive scientists who has studied religion, admitted that definitive scientific knowledge in this area may not yet exist: “I conclude that religion has powerfully good moral effects and powerfully bad moral effects, but these are due to aspects of religion that are shared by other human practices. There is surprisingly little evidence for a moral effect of specifically religious beliefs.”

One source of perplexity may be the fact that “religion” itself may have evolved through human history.  Sects and denominations are complex social movements and societal institutions that seek to serve multiple functions for members, emerging under particular historical conditions and employing many of the cultural and technological tools, like literature and architecture, that are employed by other institutions.  Cognitive science largely lacks a social science component, and thus its studies of religion examine individual human beings rather than church congregations, or even individuals playing distinct roles in an institution, such as clergy.

Kanazawa’s (2004) work offers a precautionary concept he calls the savanna principle: human brains evolved under conditions on the savanna of East Africa, when people lived in small hunter-gatherer bands that lacked large-scale formal organizations, and humans have not changed much biologically since then, even as we have developed many additional learned mental skills.  Thus, cognitive science and studies of the brain can provide insights of value for understanding modern institutions like church congregations and multi-million-member denominations, but they must be combined with social science perspectives designed to explain the modern world.  Furthermore, if the savanna principle is even only partly correct, then religion is only partially anchored by the results of human biological evolution, and very significant changes in religious cultures and organizations remain possible in the future.


Abbott, H. Porter.  2003.  “Unnarratable Knowledge: The Difficulty of Understanding Evolution by Natural Selection,” pp. 143-162 in Narrative Theory and the Cognitive Sciences, edited by David Herman.  Stanford, California: Center for the Study of Language and Information.

Atran, Scott.  2002.  In Gods We Trust: The Evolutionary Landscape of Religion. Oxford, England: Oxford University Press.

Barrett, Justin L.  2004.  Why Would Anyone Believe in God?  Walnut Creek, California: AltaMira.

Bering, Jesse M.  2006.  “The Cognitive Psychology of Belief in the Supernatural,” American Scientist 94: 142-149.

Bloom, Paul.  2012.  “Religion, Morality, Evolution,” Annual Review of Psychology 63:179-199.

Boyer, Pascal.  2001.  Religion Explained: The Evolutionary Origins of Religious Thought. New York: Basic Books.

Carroll, Michael P.  1983. “Visions of the Virgin Mary: The Effect of Family Structures on Marian Apparitions,” Journal for the Scientific Study of Religion 22:205-221.

Carroll, Michael P.  1985.  “The Virgin Mary at LaSalette and Lourdes: Whom Did the Children See?” Journal for the Scientific Study of Religion 24:56-74.

Conway, Martin A. (ed.).  1997.  Cognitive Models of Memory.  Cambridge, Massachusetts: MIT Press.

Conway, Martin A.  2001.  “Sensory-perceptual Episodic Memory and its Context: Autobiographical Memory,” Philosophical Transactions of the Royal Society: Biological Sciences 356(1413):1375-1384.

Dapretto, Mirella, Mari S. Davies, Jennifer H. Pfeifer, Ashley A. Scott, Marian Sigman, Susan Y. Bookheimer, and Marco Iacoboni.  2006.  “Understanding Emotions in Others: Mirror Neuron Dysfunction in Children with Autism Spectrum Disorders,” Nature Neuroscience 9(1):28-30.

Gervais, Will M., and Ara Norenzayan.  2012.  “Analytic Thinking Promotes Religious Disbelief,” Science 336: 493-496.

Jack, Anthony Ian, Jared Parker Friedman, Richard Eleftherios Boyatzis, and Scott Nolan Taylor.  2016.  “Why Do You Believe in God? Relationships between Religious Belief, Analytic Thinking, Mentalizing and Moral Concern,” PLOS ONE, DOI:10.1371/journal.pone.0149989.

Kanazawa, Satoshi.  2004.  “The Savanna Principle,” Managerial and Decision Economics 25(1):41-54.

Kanazawa, Satoshi. 2010.  “Why Liberals and Atheists Are More Intelligent,” Social Psychology Quarterly 73(1):33-57.

Maynard Smith, John.  1982.  Evolution and the Theory of Games.  New York: Cambridge University Press.  [Note: he considered “Maynard Smith,” without a hyphen, to be his surname.]

Watts, Fraser, and Leon Turner. 2014.  Evolution, Religion, and Cognitive Science.  New York: Oxford University Press.

Whitehouse, Harvey.  2004.  Modes of Religiosity: A Cognitive Theory of Religious Transmission.  Walnut Creek, California: AltaMira.

While widely applied in social science and the scholarship of religion, the term charisma is defined and used in many different ways, and thus needs to be used with care.  Ordinary dictionaries often distinguish two very broad meanings: (1) possessing a divine gift, and (2) having the charm to inspire devotion in other people.  The first definition immediately raises theological questions about what powers or special talents God gives to some people, and thus about which particular religious tradition provides the answers.  The second definition raises a host of questions in social psychology about how one human being actually influences others, and connects to longstanding debates about how the mass media confer celebrity status upon some public figures, including televangelists.

Within the social science of religion, there even exists a third definition, referring not to the charisma of an individual person but distinguishing charismatic movements, which heavily emphasize personal relationships, from more traditional or bureaucratic organizations that minimize this emotional factor.  It was Max Weber who most influentially distinguished three ideal types of authority: charismatic, traditional, and rational (or bureaucratic).  He defined charisma thus:

The term “charisma” will be applied to a certain quality of an individual personality by virtue of which he is considered extraordinary and treated as endowed with supernatural, superhuman, or at least specifically exceptional powers or qualities. These are such, as are not accessible to the ordinary person, but are regarded as of divine origin or as exemplary, and on the basis of them the individual concerned is treated as a “leader.” In primitive circumstances this peculiar kind of quality is thought of as resting on magical powers, whether of prophets, persons with a reputation for therapeutic or legal wisdom, leaders in the hunt, or heroes in war. How the quality in question would be ultimately judged from any ethical, aesthetic, or other such point of view is naturally entirely indifferent for purposes of definition. What is alone important is how the individual is actually regarded by those subject to charismatic authority, by his “followers” or “disciples.” (Weber 1922: 241-242)

Weber’s confidence in the clarity of the concept may have been misplaced, yet it certainly is the case that many religious traditions believe that some human beings of the past had a close connection with a deity, whether Pythia the Oracle of Delphi, Moses, Mohammad, or self-proclaimed prophet Major Jealous “Father” Divine.  Traditional beliefs are actually rather complex; for example, I Corinthians 12 suggests that many or even all people may have divine gifts, but different ones:

8 For to one is given by the Spirit the word of wisdom; to another the word of knowledge by the same Spirit;

9 To another faith by the same Spirit; to another the gifts of healing by the same Spirit;

10 To another the working of miracles; to another prophecy; to another discerning of spirits; to another divers kinds of tongues; to another the interpretation of tongues:

11 But all these worketh that one and the selfsame Spirit, dividing to every man severally as he will.

Weber scholar Guenther Roth noted that Weber’s concept of charisma was both troublesome and provocative, and complained that it had been oversimplified in more recent years: “In popular usage, ‘charisma’ has lost any distinctive meaning and merely denotes a personal attribute, the glamor and attractiveness of persons and objects.  In contrast to this diffuseness, there has been a strong tendency in academic usage to reduce charisma to a relationship between national leaders and manipulated and organized masses” (Roth 1975:148).

Other social and behavioral scientists who have studied charisma might agree with Roth’s description of the concept’s evolution while arguing that it was justified.  For example, Donald McIntosh (1970) applied the psychoanalytic thinking of Sigmund Freud to Weber’s typology of forms of authority, suggesting that God was a projection of the believer’s subconscious images of his or her own father.  The traditional form of authority reflected an early stage of childhood, while the charismatic form expressed the intensity of the Oedipal conflict, in which the son rises up to overthrow his father, seeking to take his place.  Today, when psychoanalysis has lost a good deal of respect within social science, this analysis is still valuable as an example of how it may be worthwhile to analyze charisma as a psychological relationship between leaders and followers.

Peter Berger praised Weber’s abstract conceptualization, but noted that recent historical research had disagreed with Weber’s description of the specific social location of many prophets in the Hebrew religious tradition.  Berger reported that prophecy was more integrated into traditional forms of leadership than Weber had realized, but agreed that charisma was a force for change: “Charisma represents the sudden eruption into history of quite new forces, often linked to quite new ideas.  Far from being ‘reflections’ or ‘functions’ of already existing social processes, the charismatic forces powerfully act back upon the pre-existing processes and, indeed, initiate new processes of their own” (Berger 1963:949-950).

Thus charisma need not operate from outside existing institutions, but regardless of its social location and the formal authority of the leaders who express it, it represents novelty.  Werner Stark (1965) offered a similar analysis but focused on the Roman Catholic Church, arguing that charisma need not fade in the routinization process identified by Weber, but can constantly reawaken, even within the most well-established religious organizations.

A number of studies in social psychology and organizational behavior have attempted to determine which attributes of leaders are linked in the minds of followers or observers, in a way that might define charisma empirically.  Much of the most systematic research concerns the qualities of business or political leaders, rather than religious leaders, but it may provide insights of general applicability.

One questionnaire charisma scale covered a wide range of ways in which corporate leaders may attract unusual respect from their employees (Agle et al. 2006), which factor analysis mapped onto five dimensions: (1) dynamic leadership, (2) exemplary leadership that sets a good and trustworthy example, (3) personal leadership that cares for the well-being of employees, (4) leader expectations for high performance by employees, and (5) leader willingness to take risks.  But one of the specific items that loaded heavily on the dynamic leadership dimension was “charismatic,” so that dimension is a more focused measure, containing these other items describing a leader:

  • is dynamic
  • has the ability to excite a group of people
  • when communicating, drives to motivate with every word, story, and inflection
  • communicates an exciting vision of the future of the organization
  • paints an exciting picture of the future of the organization

A somewhat similar study (Conger and Kanungo 1994) identified six factors related to charisma: (1) articulation of a vision, (2) sensitivity to conditions in the organization’s environment, (3) unconventional behavior, (4) willingness to take personal risks, (5) sensitivity to member needs, and (6) refusal to maintain the status quo.  The items in the first factor describe a charismatic leader as:

  • exciting public speaker
  • appears to be a skillful performer when presenting to a group
  • inspirational, able to motivate by articulating effectively the importance of what organizational members are doing
  • has vision, often brings up ideas about possibilities for the future
  • provides inspiring strategic and organizational goals
  • constantly generates new ideas for the future of the organization
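To make the measurement logic concrete, the following minimal sketch (in Python, using the scikit-learn library) shows how questionnaire items of this general kind can be factor-analyzed to recover a small number of latent leadership dimensions.  The synthetic data, item structure, and variable names are invented for illustration and do not reproduce the Agle et al. or Conger and Kanungo analyses.

  import numpy as np
  from sklearn.decomposition import FactorAnalysis

  rng = np.random.default_rng(0)

  # Hypothetical data: 300 respondents rate a leader on 10 items (1-7 scale).
  # Two latent traits (e.g., "dynamism" and "consideration") drive the ratings.
  n_respondents, n_items = 300, 10
  latent = rng.normal(size=(n_respondents, 2))             # hidden traits
  loadings = rng.uniform(0.6, 1.0, size=(2, n_items))      # true item loadings
  loadings[0, 5:] = 0.05                                    # items 0-4 reflect trait 1
  loadings[1, :5] = 0.05                                    # items 5-9 reflect trait 2
  noise = rng.normal(scale=0.5, size=(n_respondents, n_items))
  ratings = np.clip(np.round(4 + latent @ loadings + noise), 1, 7)

  # Fit a two-factor model and inspect which items load on which factor.
  fa = FactorAnalysis(n_components=2, random_state=0)
  fa.fit(ratings)
  for i, item_loadings in enumerate(fa.components_.T):
      print(f"item {i}: estimated loadings = {np.round(item_loadings, 2)}")

In the published studies the analogous computation was carried out on real survey responses; the point here is only the overall shape of the procedure, in which items that cluster together define an empirical dimension such as dynamic leadership.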

Influential theorist Edward Shils (1965: 201) argued that charisma is by no means limited to religion and that Weber’s conceptualizations were far too narrow, even though he had extensively considered secular manifestations of it.  For Shils, any person, institution or aspect of culture is charismatic if it represents “some very central feature of man’s existence and the cosmos in which he lives.” Thus it is appropriate to apply the term widely, if selectively: “The person who through sensitivity, cultivated or disciplined by practice and experience, by rationally controlled observation and analysis, by intuitive penetration, or by artistic disclosure, reaches or is believed to have attained contact with that ‘vital layer’ of reality is, by virtue of that contact, a charismatic person.”

More recently, Robert J. House (1996) has developed a theory of charisma as value-based leader behavior, rather compatible with the perspective of Shils, but more activist in quality.  By brightly reflecting shared human values, a leader represents some central features of human existence, chiefly our hopes and goals, what Shils would indeed have recognized as values.  But for House, a leader does not merely represent a valued goal, but promotes effective means to achieve it.  In collaboration with many other researchers, House has offered a rather complex system of theoretical concepts connected to charismatic leadership, with at least some evidence of their plausibility (House, Spangler and Woycke 1991; Shamir, House and Arthur 1993).

Yet this heavy focus on individual leaders did not invalidate alternative definitions of charisma, and social scientists of religion continued also to find value in Weber’s classification of some organizations as charismatic in structure, contrasting with others that are more traditional or rational (Nelson 1993).  This diversity of approaches implies that we are free to select the one best suited for our own research project, but in our publications we should be very clear, offering both a formal definition of charisma as we use the term and citations of the scholarly literature that provide a solid background of theory.


Agle, Bradley R., Nandu J. Nagarajan, Jeffrey A. Sonnenfeld, and Dhinu Srinivasan.  2006.  “Does CEO Charisma Matter?  An Empirical Analysis of the Relationships among Organizational Performance, Environmental Uncertainty, and Top Management Team Perceptions of CEO Charisma,” The Academy of Management Journal 49(1): 161-174.

Berger, Peter.  1963. “Charisma and Religious Innovation: The Social Location of Israelite Prophecy,” American Sociological Review 28(6): 940-950.

Conger, Jay A., and Rabindra N. Kanungo.  1994.  “Charismatic Leadership in Organizations: Perceived Behavioral Attributes and Their Measurement,” Journal of Organizational Behavior 15(5):439-452.

House, Robert J.  1996.  “Path-Goal Theory of Leadership: Lessons, Legacy, and a Reformulated Theory,” Leadership Quarterly 7(3): 323-352.

House, Robert J., William D. Spangler and James Woycke.  1991.  “Personality and Charisma in the U.S. Presidency: A Psychological Theory of Leader Effectiveness,” Administrative Science Quarterly 36(3): 364-396.

McIntosh, Donald.  1970.  “Weber and Freud: On the Nature and Sources of Authority,” American Sociological Review 35(5): 901-911.

Nelson, Reed E.  1993.  “Authority, Organization, and Societal Context in Multinational Churches,” Administrative Science Quarterly 38(4): 653-682.

Roth, Guenther.  1975.  “Socio-Historical Model and Developmental Theory: Charismatic Community, Charisma of Reason and the Counterculture,” American Sociological Review 40(2): 148-157.

Shamir, Boas, Robert J. House and Michael B. Arthur.  1993.  “The Motivational Effects of Charismatic Leadership: A Self-Concept Based Theory,” Organization Science 4(4): 577-594.

Shils, Edward.  1965.  “Charisma, Order, and Status,” American Sociological Review 30(2): 199-213.

Stark, Werner.  1965.  “The Routinization of Charisma: A Consideration of Catholicism,” Sociological Analysis 26(4):203-211.

Weber, Max.  1922.  Economy and Society.  Berkeley: University of California Press (1978).

The controversial term brainwashing refers to the possibility that coercive or deceptive indoctrination techniques can take control over a person’s mind, for example causing the individual to join a radical religious movement.  A considerable body of research indicates that popular versions of the brainwashing theory are simply wrong, yet its ideas can be linked to similar concepts that have legitimate scientific interest. In turn, these ideas also can illuminate phenomena studied by the social science of religion, but only if understood critically.

Whether called brainwashing, thought control, or something else, the concept entered popular culture through the apparent power of radical propaganda in the context of the coercive social structures of Nazi Germany and the Soviet Union.  Nineteen Eighty-Four by George Orwell (1949) is the best known of many works of fiction from the twentieth century that dramatized the idea.  Shortly after Orwell’s novel was published, there emerged a classic case of supposed brainwashing in the real world: the “coercive persuasion” (Schein et al., 1961) of American prisoners of war held by the Communists in the Korean War of 1950-1953.  Yet this episode transformed few if any of the captive soldiers into Communists.

The connection between brainwashing and recruitment to religious movements was powerfully made in the aftermath of the culturally turbulent 1960s, both in popular culture and among scholars.  Some parents who were concerned that their children had joined radical “cults” supported a deprogramming movement that used coercive techniques in attempts to reverse the sinister programming that the children had supposedly suffered (Patrick and Dulack 1974).  Social scientists David Bromley and Anson Shupe (1981:211) summarized the false assumption thus: “The centerpiece of the anti-cultists’ allegations is that cults brainwash their members through some combination of drugging, hypnosis, self-hypnosis, chanting or lecturing, and deprivation of food, sleep, and freedom of thought.”

It cannot be emphasized forcefully enough that the brainwashing theory does not, in fact, explain recruitment to radical religious movements, a conclusion based on extensive observations by researchers in the field who were able to gain access to a variety of such groups.  Only a small fraction of the people who attend the meetings of a radical group eventually join, rates of defection never reach zero even after years of membership, and it is fairly common for “cults” to disintegrate, whether through organized schisms or total collapse (Barker 1984, 1986).  Ironically, one piece of evidence often misinterpreted is that a few of these groups do seem to try to brainwash members, imposing strict lifestyle regimens and intensive indoctrination procedures, but that may reflect leaders’ awareness of how fragile their group is, or members’ genuine desire for leaders to help them transform their own identities.

Yet the brainwashing concept can offer valuable theoretical insights if considered coolly.  First we may wish to study what functions a belief such as brainwashing has for the believer.  A parent whose young adult child has joined a radical group may need an antidote for shame, and a practical solution for a problem facing the family.  A person may become a “deprogrammer” or take on some other form of rescue mission in order to gain social status, and yes, money.  Members of conventional religious denominations and secular individuals may want an explanation of recruitment to a radical religion that does not require them to take its beliefs seriously.  Thus the notion of brainwashing illustrates how theories can have instrumental value for the people who adopt them.

It also is worth noting that social science almost never assigns a hundred percent of the variance to one explanatory factor, which is to say that every effect has multiple causes.  Indeed, the indoctrination practices of every social movement, whether religious or political, radical or conventional, probably have some effect on recruits.  Thus a main problem with the brainwashing theory is its totalitarian quality: it leaves no room for other factors that pull people into a movement, or push them out (Snow and Machalek 1984).

A third qualifying observation is that the brainwashing theory is psychological, whereas research on radical religious movements tends to be sociological or anthropological.  The association of religious radicalism with mental pathology is not a new idea; a century before the 1960s, psychiatrists believed that many cases of severe mental illness were the result of “religious excitement” (Bainbridge 1984).  Writing in a collection of essays titled The Future of New Religious Movements, Bromley, Hadden and Hammond (1987:215-216) considered the psychology-sociology debate about brainwashing and expressed great concern about its dysfunctional quality:

  The psychologically based interpretation of cults in many respects represents a frontal assault on the sociological perspective, for it begs the very questions sociologists regard as central to an understanding of new religious groups and their individual members.  It ignores such issues as the role of sociocultural forces in the emergence of social movements; the interactional dimensions of affiliation and organizational participation; the role of subordinates in the creation and maintenance of charismatic authority; and the organization problems of social movements.  In essence, the brainwashing perspective represents a return to pathology theory, traditionally a perspective portraying social movements as havens for those unable to cope with the rigors of conventional society.  The anti-cult reformulation simply substitutes “manipulated” for “naturally occurring” helplessness.  At base, there is a challenge to the more dynamic, interactive conceptions of individual and group structure that dominate current sociological thinking.

A fourth broad area of debate concerns implications for legal determination of responsibility for harmful actions.  Probably the most famous and intensively debated case is that of heiress Patty Hearst, who was kidnapped in 1974 by a group calling itself the Symbionese Liberation Army but then apparently was converted to its cause and participated in a bank robbery (Isenberg 2000).  The equivalent of a brainwashing defense failed, and she was convicted and sentenced to a long prison term, but one president of the United States later commuted her sentence and another eventually pardoned her.  This sensational case should not obscure the fact that very serious scholarly debate circles around the contradiction between the religion-related concept of moral responsibility and the tendency of some theories in both psychology and sociology to have a deterministic character, as reflected in the title as well as the content of Daniel Robinson’s (1980) book, Psychology and Law: Can Justice Survive the Social Sciences?  Thus the brainwashing theory, while usually framed in simplistic and unscientific terms, may be a useful introduction to many profound issues in the social and behavioral sciences.


Bainbridge, William Sims.  1984.  “Religious Insanity in America: The Official Nineteenth-Century Theory,” Sociological Analysis 45: 223-240.

Barker, Eileen.  1984.  The Making of a Moonie: Choice or Brainwashing?  Oxford: Blackwell.

Barker, Eileen.  1986.  “Religious Movements: Cult and Anticult Since Jonestown,” Annual Review of Sociology, 12:329-346.

Bromley, David G., and Anson D. Shupe, Jr.  1981.  Strange Gods: The Great American Cult Scare.  Boston: Beacon.

Bromley, David G., Jeffrey K. Hadden, and Phillip E. Hammond.  1987.  “Reflections on the Scholarly Study of New Religious Movements,” pp. 210-217 in The Future of New Religious Movements, edited by David G. Bromley and Phillip E. Hammond.  Macon, GA: Mercer University Press.

Isenberg, Nancy.  2000.  “Not ‘Anyone’s Daughter’: Patty Hearst and the Postmodern Legal Subject,” American Quarterly, 52(4): 639-681.

Orwell, George.  1949.  Nineteen Eighty-Four.  New York: Harcourt, Brace.

Patrick, Ted, and Tom Dulack.  1974.  Let Our Children Go!  New York: Dutton.

Robinson, Daniel N.  1980.  Psychology and Law: Can Justice Survive the Social Sciences?  New York: Oxford University Press.

Schein, Edgar H., Inge Schneier, and Curtis H. Barker.  1961.  Coercive Persuasion.  New York: Norton.

Snow, David A., and Richard Machalek.  1984.  “The Sociology of Conversion,” Annual Review of Sociology, 10:167-190.


The term meme was coined by evolutionary biologist Richard Dawkins (1976) to name the equivalent of a gene in the inheritance and evolution of culture, and it can refer to a cultural element that combines with others to determine the character of a particular religious phenomenon.  Other leading figures of the period proposed different terminology; for example, Charles Lumsden and Edward O. Wilson (1981) suggested culturgen, while Luigi Luca Cavalli-Sforza and Marcus Feldman (1981) more modestly used the phrase cultural traits rather than devising a new word.  Subsequently, Dawkins (2006) rather aggressively attacked religion from the perspective of atheism, yet his term meme became far more popular than the alternatives, so we can use it in analysis of religious evolution without endorsing his personal views on the subject.  The concept that units of culture might function like biological genes emerged as part of the development of sociobiology (Wilson 1975), which initially analyzed the genetic roots of social behavior in a range of animals, from insects to mammals, then began analyzing the complex social patterns exhibited by human beings.

Evolutionary ideas in the social sciences have been significant for at least a century and a half, notably in the works of Herbert Spencer (1857), who believed that social evolution followed the same laws as biological evolution, but with a teleological quality that natural selection in biology lacked, because humans consciously seek to achieve a better future.  Building on the axiom that each cause tends to have more than one effect, and conjecturing that a multiplicity of effects become causes in their complex interaction, he deduced that all forms of long-term evolution would lead to increasing complexity.  Traditional societal institutions would not become extinct, but would coalesce around specific functions as society became more differentiated.  Thus, religion might focus on the ultimate meaning of human lives and the support of spiritual communities, while some of its traditional tasks, like lawmaking and the treatment of disease, would spin off to newer societal institutions, such as formal legislatures and the medical profession.  A century after Spencer, Talcott Parsons (1964) described religion explicitly as an evolutionary universal, absolutely required for human survival, much as lungs are universal in large land animals, even as other institutions of society assumed some of its ancient functions.

As appealing as the global philosophies of Spencer and Parsons may be, they do not offer much advice about how to conduct detailed research on religion, or culture more generally, in modern societies, or about how to make good sense of empirical results.  The meme concept introduces the possibility that rigorous mechanisms may operate in the process of cultural change, as they do in biological reproduction and heredity.  However, we may well doubt that cultural traits are governed by any underlying rigor in the way that biological genes are based on DNA structures.  In a critical essay on the widespread use of evolutionary metaphors in archaeology, James Boone and Eric Alden Smith (1998) noted the important distinction in biology between genotypes and phenotypes: the sequences of DNA base pairs that constitute genes versus the visible characteristics expressed in adult organisms.  They offered this model of how biological evolution works:

  1. Genetic variation is continually produced by mutation and recombination.
  2. This variation interacts with external environmental factors to shape phenotypes.
  3. These phenotypes and associated genotypes are differentially successful in surviving and reproducing.
  4. Offspring inherit some of the genes and thus tend to develop the associated phenotypes of their parents.
  5. The proliferation of more successful genotypes results in transgenerational increase in phenotypes that are better adapted to local environments.

The Boone and Smith critique argued that the equivalent of genotypes probably did not exist for the artifacts studied by archaeologists, and it triggered a more general debate as other scholars added their comments after the article.  Similarly, applying modern evolutionary thinking to linguistics has seemed promising but has proved difficult (Croft 2008), as is certainly the case for culture more generally (Aunger 2000).  By the beginning of 2016, Wikipedia was confusing as much as it clarified by presenting Internet meme and viral phenomenon as near synonyms, citing each biological metaphor on the page devoted to the other.  Indeed, one problem with the critiques of the meme concept is that they demanded that memetics translate exactly into the terms of genetics, when really only a metaphoric similarity is intended (Strong and Bainbridge 2003).  For example, Strong (1990) suggested that cultural evolution was Lamarckian, that is, that characteristics acquired during life can be inherited by subsequent generations.  Meme, after all, was coined by abbreviating the Greek mimeme, something imitated.

Nearly a century ago, William Fielding Ogburn (1922) offered a four-step theory of human cultural evolution that gave technological invention the key role, but can be translated into evolutionary and cultural terms:

  1. Invention in which new forms of knowledge are created, comparable to mutation in biological genetics.
  2. Accumulation of inventions, like the development of a complex genome of an advanced organism.
  3. Diffusion, comparable to the emergence of a complex gene pool for a diverse species.
  4. Adjustment, as the culture adapts to the set of recent technological innovations, as they interact in combination.

The result of the fourth step is analogous to a cultural phenotype, whereas the complex of technological inventions is the genotype.  This approach to cultural evolution continued to have some influence within the social science of technological development (White 1959), but it also is possible to transfer this mode of analysis to the development of religious traditions and institutions.  For example, religious organizations can be clustered into families, and when a sect breaks away from a denomination, it is like a child being born from a parent.  Religious innovation often involves a combination of ideas from two or more already existing religious movements, in a process where the elements that function like biological genes are specific beliefs and practices (Bainbridge 1985).
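The following toy sketch (Python) illustrates the cultural-genetics metaphor described above: a new religious movement inherits a recombined set of discrete memes (specific beliefs and practices) from two parent movements, with occasional innovation playing the role of mutation.  The belief names and probabilities are invented purely for illustration and are not drawn from the cited studies.

  import random

  random.seed(42)

  # Hypothetical parent movements, each defined by a set of discrete memes.
  parent_a = {"adult baptism", "faith healing", "vegetarianism", "weekly fast"}
  parent_b = {"infant baptism", "speaking in tongues", "tithing", "weekly fast"}
  novel_pool = ["communal living", "new scripture", "pacifism"]

  def found_movement(a, b, mutation_rate=0.1):
      """Recombine memes from two parent movements, occasionally innovating."""
      child = set()
      for meme in sorted(a | b):
          if random.random() < 0.5:        # each parental meme inherited with p = .5
              child.add(meme)
      if random.random() < mutation_rate:   # rare innovation, analogous to mutation
          child.add(random.choice(novel_pool))
      return child

  offshoot = found_movement(parent_a, parent_b)
  print("New movement's memes:", sorted(offshoot))

The metaphor, of course, carries no claim that such recombination follows rigorous rules comparable to DNA; it simply restates in concrete form the idea that beliefs and practices can be inherited, mixed, and occasionally invented anew.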


Aunger, Robert.  2000.  Darwinizing Culture: The Status of Memetics as a Science.  Oxford: Oxford University Press.

Bainbridge, William Sims.  1985.  “Cultural Genetics,” pp. 157-198 in Religious Movements, edited by Rodney Stark. New York: Paragon.

Boone, James L., and Eric Alden Smith.  1998.  “Is it Evolution Yet?  A Critique of Evolutionary Anthropology,” Current Anthropology 39(S1): S141-S174.

Cavalli-Sforza, Luigi Luca and Marcus W. Feldman. 1981. Cultural Transmission and Evolution. Princeton, N.J.: Princeton University Press.

Croft, William.  2008.  “Evolutionary Linguistics,” Annual Review of Anthropology 37: 219-234.

Dawkins, Richard. 1976. The Selfish Gene. New York: Oxford University Press.

Dawkins, Richard. 2006.  The God Delusion. Boston: Houghton Mifflin.

Lumsden, Charles J. and Edward O. Wilson. 1981. Genes, Mind, and Culture. Cambridge: Harvard University Press.

Ogburn, William Fielding.  1922.  Social Change with Respect to Culture and Original Nature.  New York: Huebsch.

Parsons, Talcott. 1964. “Evolutionary Universals in Society.” American Sociological Review 29: 339–357.

Spencer, Herbert.  1857.  “Progress: Its Law and Causes,” The Westminster Review 67: 445-485.

Strong, Gary W.  1990.  “Neo-Lamarckism in the Rediscovery of Culture,” Behavioral and Brain Sciences 13(1):92-93.

Strong, Gary W., and William Sims Bainbridge. 2003.  “Memetics: A Potential New Science,” pp. 318-325 in Converging Technologies for Improving Human Performance, edited by Mihail C. Roco and William Sims Bainbridge. Dordrecht, Netherlands: Kluwer.

White, Leslie A.  1959.  The Evolution of Culture: The Development of Civilization to the Fall of Rome.  New York: McGraw-Hill.

Wilson, Edward O.  1975.  Sociobiology: The New Synthesis.  Cambridge, Massachusetts: Harvard University Press.

Often defined simplistically as normlessness, the term anomie has been popular in social science at least since Emile Durkheim’s 1897 book on suicide, and thus it has been used in a variety of different ways.  Indeed, Stephen Marks (1974) has documented that Durkheim’s own thinking about the concept evolved across a number of his publications.  Applied to individuals, the adjective anomic often is used as a synonym for demoralized or alienated, and an anomic society may be considered disorganized.  Yet given the availability of these supposed synonyms, there is no good reason to dilute the meaning of anomie, and it is probably best used to refer to the model developed by Robert K. Merton in his 1938 essay, “Social Structure and Anomie,” which connects naturally to broader perspectives called strain theory and Structural Functionalism.  This more focused approach is especially effective in developing theoretical connections to religion.

Although Durkheim’s Suicide gives religion a prominent role, oddly it is not central to the part of his argument that focuses on anomie.  He frames a set of four categories (Dohrenwend 1959), distinguishing anomie from egoism, which can be defined as weak interpersonal bonds, and contrasting both with altruism, roughly the opposite of both anomie and egoism, and fatalism, which is discussed only in a footnote.  Rather than stressing the power of religion to add meaning to human life in his discussion of anomie, he examines the connection between economic fluctuations and the suicide rate, claiming that the rate rises both during very bad economic times and during very good economic times, aggravated in each case by the unusual nature of the situation.  In bad economic times people may despair, but in financial booms they may lose any sense of standards.  Human beings need a sense of stable goals and expectations within definite limits, Durkheim said, and economic instability erodes it.  Later researchers failed to find a connection between sharp economic upturns and suicide (Henry and Short 1954), yet this does not render anomie a vacuous concept, or reduce it to depression, a word that describes both a mood conducive to suicide and unusually poor economic conditions.  Rather, anomie can be interpreted in terms of the values and norms of society, both of which may be established and supported by religion (Stark and Bainbridge 1996: 18-19).

Merton’s anomie model describes relationships a person or group may have to the values and norms of society.  Values can be defined as the cultural goals people are expected to achieve in life, while norms are the legitimate means for achieving those goals.  In principle, one may accept or reject either of these independently of the other.  That gave Merton four possible combinations, each of which he named:

  1. Conformity: accept values, accept norms
  2. Innovation: accept values, reject norms
  3. Ritualism: reject values, accept norms
  4. Retreatism: reject values, reject norms

While these are presented as choices a person or group may make, the surrounding socio-economic conditions also play a role.  People are most likely to be conformists, accepting both the values and the norms, if as a practical matter they are able to achieve the culturally established goals by societally approved means.  What happens if, for some people, it is impossible to achieve the values by following the norms?  Innovation comes into play, and it may take many forms.  Much of the relevant literature cites criminal behavior as the most obvious example.  If the society establishes material wealth as the goal, then for some people conforming to the norms of getting a good education and working hard in a high-paying job will achieve that goal.  People who do not have access to a good education, or cannot benefit from the opportunity, may turn to crime to achieve wealth, thus violating the norms while continuing to seek the values.  For a variety of reasons, including incompetence as a criminal, or perhaps because of their personality disposition, some people ritualistically follow the norms while having given up hope of achieving the goals.  After a series of diverse disappointments, others may retreat into a condition in which they abandon hope of achieving the goals of society and also fail to follow the norms.
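As a purely illustrative restatement, the Merton typology can be written as a simple lookup keyed on whether a person accepts the cultural values (goals) and the norms (legitimate means); the short sketch below (Python) only encodes the two-by-two scheme and is not an empirical measurement procedure.

  # Merton's four adaptations, keyed on (accepts values, accepts norms).
  MERTON_TYPOLOGY = {
      (True, True): "Conformity",
      (True, False): "Innovation",
      (False, True): "Ritualism",
      (False, False): "Retreatism",
  }

  def classify(accepts_values: bool, accepts_norms: bool) -> str:
      """Return Merton's label for a given combination of acceptances."""
      return MERTON_TYPOLOGY[(accepts_values, accepts_norms)]

  # Example: someone who still pursues the culturally approved goals but
  # abandons the approved means falls into the innovation category.
  print(classify(accepts_values=True, accepts_norms=False))   # -> Innovation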

It is noteworthy that Merton named one of his categories innovation, rather than for example criminality.  Society has many norms, and violating some may not constitute a crime.  How do you ride to town?  On a horse, of course.  Yet inventing the automobile violates this expectation, and offers a new way to travel.  Similarly, innovation in the arts often involves deviating from established styles, as was the case many decades ago with the emergence of rock within popular music, which had established norms encouraging harmonious songs; by definition, innovation in art often requires dissonance with the old.  Innovation also occurs in and around religion.  Consider the fact that women were prevented from playing high-status roles in the conventional churches in past years.  This is a possible explanation for why Mary Baker Eddy innovated in creating Christian Science, or Aimee Semple McPherson founded the Foursquare Gospel Church.

Not all of Merton’s associates were entirely happy with his innovation category, and perhaps in response he invented a fifth category he called rebellion.  Standing somewhat outside the system of the four other categories, rebellion involves replacing standard values and norms with new ones.  Thus it is like a positive variant of retreatism, renouncing one culture while adopting or even creating a substitute.  It is an open question whether adding rebellion improves the four-category model, or confuses it.  And the connotation of the term rebellion may be rather harsh.  Where does one draw the lines between rebellion and the other categories?  Establishing a radical religious commune does sound like rebellion, except that its members might prefer a more positive term, such as liberation.  Suppose people who cannot earn wealth in order to gain the value of social status join or create a religious sect that confers status through their relationship with God?  Is that rebellion, innovation, retreatism or what?  Yet despite the ample room for quibbles such as these, Merton’s anomie model has been tremendously influential in the history of sociology.

Neil Smelser’s 1962 treatise, Theory of Collective Behavior, may be the most complex and influential development of Merton’s model.  It offers a general explanation for the emergence of social movements as well as fads, crazes and panics.  Smelser was a student and collaborator of Talcott Parsons, the leading Structural Functionalist, who assumed that each viable culture possessed a set of relatively stable values, often described as widely shared goals for social action, supported by systems of norms that constituted institutions (Parsons and Shils 1951).  Parsons (1964) argued that religion was one of the evolutionary universals of humanity, saying that all societies possess four features of supreme importance: religion, communication with language, social organization through kinship, and technology.  He concluded that these four are essential for human society, and that all lasting cultural developments must be based upon them.  While some individuals may be irreligious or skeptical, if everybody abandons religion, then the society will fall as surely as if everybody abandoned technology, he believed.

Yet Smelser agreed with Merton that the values or norms of a society often fail to function well for an individual member of society, or sometimes for an entire category of people.  Smelser’s analysis was not limited to the two concepts, values and norms, but added two others: mobilization, in which institutions of society such as churches define the roles that members of the society should perform, and facilities, the situational resources or barriers that assist people in attaining their goals or prevent attainment.  These are the four components of action that comprise the social system, and if they fall out of harmony with each other, the result is structural strain, which can either be called anomie or considered the most common cause of anomie.  Structural strain is necessary for the emergence of a radical social movement, but it is only the second step in a six-step value-added model, in which each step prepares the way for the next to take effect:

  1. Structural conduciveness: A precondition, such as the existence of a financial market before there can be a financial panic.
  2. Structural strain: Impairment in the relations between the four components of action.
  3. Generalized belief: An ideology that attributes a cause to the strain and says what to do about it.
  4. Precipitating factors: Dramatic events that trigger action based on the generalized belief.
  5. Mobilization of participants: Communication brings together people suffering from the strain who are ready to share the same generalized belief.
  6. Social control: Ways in which the regular institutions of society work against the emergence of unconventional collective behavior or a social movement.

An especially influential application of Smelser’s strain theory is the model of recruitment to a radical religion developed by John Lofland and Rodney Stark (1965), based on their observational study of a small branch of Sun Myung Moon’s Unification Church, early in its entrance to the United States from Korea.  Lofland and Stark were students in the same academic department where Smelser taught, but they also benefitted from other mentors, notably sociologist of religion Charles Y. Glock, with whom Stark collaborated.  The Lofland-Stark theoretical model is a series of seven steps that a person makes, perhaps in this order, to become what they called a deployable agent, a dedicated member of a “cult” who could be effective in converting new members:

For conversion it is necessary that a person:

  1. experience enduring, acutely felt tensions;
  2. within a religious problem-solving perspective;
  3. which leads to defining himself as a religious seeker;
  4. encountering the cult at a turning point in his life;
  5. wherein an affective bond to adherents is formed (or pre-exists);
  6. where extra-cult attachments are low or neutralized;
  7. and where, to become a “deployable agent,” exposure to intensive interaction is accomplished.

The acutely felt tensions in the first step may be the direct result of structural strain, if for example the individual person is unable to find satisfactory employment in a market-oriented economy, or an indirect result, as in cases when standard medical institutions fail to provide effective treatments for chronic disorders.  The individual experience of structural strain can be identified as anomie, especially following a very practical definition of the term: “The state of being without effective rules for living” (Stark and Bainbridge 1987: 217).  The fourth step in the model, turning point, may also provoke anomie, because the person experiences dislocation such as from a divorce, losing a job, or even more positively, graduating from school, when old forms of behavior may no longer be effective.  The sixth point is a time when social attachments are weak, thus aggravating anomie as well as representing the low social control that the last step of Smelser’s model says would leave a person free to deviate from standard norms.

The religious problem-solving perspective of the second step in the Lofland-Stark model corresponds to the generalized belief in the third step of Smelser’s model, and under conditions of anomie it drives a search for a new religion.  Alternative problem-solving perspectives would include political action to change society, psychiatric treatment to change the individual, or, as in Merton’s concept of innovation, even criminal behavior.  The fifth and seventh steps of the Lofland-Stark model develop social bonds with members of the group, achieving mobilization around new norms and values, as well as social incorporation into the group.  The work of Smelser, Lofland, and Stark shows how an important concept like anomie can be integrated logically with different but compatible concepts to produce a sophisticated theoretical model of human religious behavior.


Dohrenwend, Bruce P.  1959.  “Egoism, Altruism, Anomie, and Fatalism: A Conceptual Analysis of Durkheim’s Types,” American Sociological Review 24(4): 466-473.

Durkheim, Emile.  1897.  Suicide.  New York: Free Press [1951].

Henry, Andrew F., and James F. Short.  1954.  Suicide and Homicide.  Glencoe, IL: Free Press.

Lofland, John, and Rodney Stark.  1965.  “Becoming a World-Saver: A Theory of Conversion to a Deviant Perspective,” American Sociological Review 30: 862-875.

Marks, Stephen R.  1974.  “Durkheim’s Theory of Anomie,” American Journal of Sociology 80(2): 329-363.

Merton, Robert K.  1938.  “Social Structure and Anomie,” American Sociological Review 3(5): 672-682.

Parsons, Talcott.  1964.  “Evolutionary Universals in Society,” American Sociological Review 29:339-357.

Parsons, Talcott, and Edward A. Shils (eds.).  1951.  Toward a General Theory of Action.  Cambridge, MA: Harvard University Press.

Smelser, Neil J.  1962.  Theory of Collective Behavior.  New York: Free Press.

Stark, Rodney, and William Sims Bainbridge.  1996. Religion, Deviance and Social Control. New York: Routledge.

Stark, Rodney, and William Sims Bainbridge.  1987. A Theory of Religion. New York: Toronto/Lang.

Humans are theorized to have a natural need to form coherent mental models of the world, and thus they will exert effort to resolve any contradiction between two beliefs, or between a belief and a behavior.  As influentially stated in 1957 by Leon Festinger, the theory included two hypotheses:

  1. The existence of dissonance, being psychologically uncomfortable, will motivate the person to try to reduce the dissonance and achieve consonance.
  2. When dissonance is present, in addition to trying to reduce it, the person will actively avoid situations and information which would likely increase the dissonance. (page 3)

Festinger explicitly connects these abstract ideas to religion through the example of the Great Disappointment of 1843-1844, when William Miller’s prediction of the Second Coming was apparently disconfirmed.  The theory of cognitive dissonance predicts that people will join together to defend their beliefs against disconfirmation, perhaps resulting in religious innovation, when these conditions exist:

  1. A belief or set of beliefs is held with conviction by a number of people.
  2. The belief, at least in part, has sufficient implication for the affairs of the daily world so that the believers take action in accordance with the belief.
  3. The action is sufficiently important, and sufficiently difficult to undo, that the believers are, in a very real sense, committed to the belief.
  4. At least some part of the belief is sufficiently specific and concerned with the real world so that unequivocal disproof or disconfirmation is possible.
  5. This possible disconfirmation actually occurs, usually in the form of the nonoccurrence of a predicted event within the time limits set for its occurrence.
  6. The dissonance thus introduced between the belief and the information concerning the nonoccurrence of the predicted event exists in the cognitions of all the believers, and hence, social support in attempting to reduce the dissonance is easily obtained. (pages 247-248)

A major source of this application of the theory was a field observation study of a millenarian group that Festinger conducted with Henry W. Riecken and Stanley Schachter, published as the 1956 book When Prophecy Fails.  However, the theory is not limited to such rare events.  For example, the disconfirmation might be the failure of a religious healing ritual to help a sick person.  The second main hypothesis of the theory, that people actively avoid situations and information which would likely increase dissonance, may explain social distance between two religious groups with very different beliefs, even without any particular disconfirmation events.

Around the year 1950, many psychologists and social scientists developed theories of cognitive consistency, notable among them the balance theory of Fritz Heider (1946, 1958).  Originally, Heider was concerned with cognitive consistency within some well-defined situation, especially attitudes toward different elements of a causal system, such as when a person commits a harmful act and the person harmed develops a consistent negative attitude toward both the act and the perpetrator.  He then expanded his thinking to consider a number of configurations, of which the most relevant to religion might best be described as follows.  Person A and Person B often come into interaction with each other, in situations where their attitudes toward a religious Belief may be salient.  From the standpoint of Person A, there are two relationships: to Person B and to Belief.  If Person A has positive feelings toward Person B, accepts Belief, and Person B also accepts Belief, then the situation is cognitively consonant.  If Person A has positive feelings toward Person B, who accepts Belief, but Person A has no attitude toward Belief, there is a mild dissonance which can be resolved by Person A developing a positive attitude toward Belief.  But if Person A strenuously rejects Belief, then the dissonance causes a real problem, which might be resolved by coming to have negative feelings toward both Person B and Belief, or positive feelings toward both.  The transition from strong dissonance to consonance can be very difficult and has an unpredictable outcome, especially if other people and beliefs are also involved.

The literature on cognitive consistency and social behavior is vast and couched in a variety of terminologies.  One modern way of studying theories of cognition in complex social systems is through computer simulation, for example using a multi-agent system that models the attitudes of a large number of people (Bainbridge 2006).  One can program such a simulation to represent a social network in which agents start with randomly selected attitudes toward a belief, perhaps as simple as positive or negative, and a random pattern of social relations with neighboring agents.  The simulation then selects agents at random, checks whether the agent agrees with one of the adjacent agents on the belief, and creates a social bond between the two agents if they agree, or removes any bond if they disagree.  Over time, the social network evolves to maximize consonance, until only agents who agree have social bonds, producing a pair of separate networks, one accepting and the other rejecting the belief, even though the two may remain interspersed in the population.  A more complex version of the same program could find a dissonant triad of Agent A, Agent B and Belief, and at random change Agent A’s attitude toward either Agent B or Belief.  This also results eventually in a consonant situation, but with the potential for emergence of many local groups that are in agreement with each other, often separated from groups that share the same Belief but are situated on the other side of groups that disagree with them.  One could then give a particular Belief the power to motivate agents who accept it to build temporary social relationships with agents who disagree, which over time can have the effect of spreading the Belief throughout the simulated society.
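To make the procedure concrete, here is a minimal sketch in Python of the simpler variant just described.  The population size, the number of steps, and the choice to pair randomly selected agents (rather than restricting the check to current neighbors) are illustrative assumptions, not features of any published simulation.

```python
# Minimal sketch: agents hold a binary attitude toward one Belief, and a
# social bond between two randomly paired agents is created if they agree
# or removed if they disagree.  Population size, step count, and the random
# pairing rule are assumptions made for illustration only.
import random

N_AGENTS = 50
N_STEPS = 20000
random.seed(42)

# Random attitudes: +1 accepts the Belief, -1 rejects it.
attitude = [random.choice([1, -1]) for _ in range(N_AGENTS)]

# Random starting pattern of social relations, stored as unordered pairs.
bonds = set()
for _ in range(2 * N_AGENTS):
    a, b = random.sample(range(N_AGENTS), 2)
    bonds.add((min(a, b), max(a, b)))

for _ in range(N_STEPS):
    a, b = random.sample(range(N_AGENTS), 2)   # pick two agents at random
    pair = (min(a, b), max(a, b))
    if attitude[a] == attitude[b]:
        bonds.add(pair)        # agreement on the Belief creates a bond
    else:
        bonds.discard(pair)    # disagreement removes any existing bond

# After enough steps, essentially every surviving bond links agents who
# agree, so the network has split into one cluster of believers and one
# of non-believers.
print(len(bonds), all(attitude[a] == attitude[b] for a, b in bonds))
```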

The theory that religious beliefs can have a distinctive ability to overcome cognitive dissonance was proposed over a century ago by pioneer psychologist William James (1907).  His complex set of ideas does not map perfectly onto modern psychological theories, partly because his fundamental philosophical point was that a belief is true to the extent that it is pragmatically useful.  Specifically he argued that religious beliefs promote a better quality of life and benevolent society, quite apart from any scientific evidence about their objective truth or falsity.  He described many of the empiricist philosophers and scientists of his day as “tough-minded,” which caused them to be irreligious.  That is, they were especially sensitive to cognitive dissonance, demanding that all theories about the universe harmonize with each other.  In contrast, James said, people who were more religious were “tender-minded,” being more optimistic that contradictory beliefs could be simultaneously true.  Some of his extensive discussion of this paradox suggested that religions might offer principles that transcend dissonance, thereby bringing people together.



Bainbridge, William Sims.  2006.  God from the Machine: Artificial Intelligence Models of Religious Cognition.  Lanham, MD: AltaMira Press.

Festinger, Leon, Henry W. Riecken and Stanley Schachter. 1956. When Prophecy Fails. Minneapolis: University of Minnesota Press.

Festinger, Leon. 1957. A Theory of Cognitive Dissonance. Evanston, Illinois: Row, Peterson.

Heider, Fritz. 1946. “Attitudes and Cognitive Organization.” The Journal of Psychology 21: 107–112.

Heider, Fritz. 1958.  The Psychology of Interpersonal Relations.  New York: Wiley.

James, William. 1907.  Pragmatism: A New Name for Some Old Ways of Thinking. New York: Longmans, Green.

Human beings may have always believed that outsiders to the community are less trustworthy than community members, but only in the twentieth century was this observation formalized as Control Theory.  The pivotal statement of the theory was offered in the book Causes of Delinquency by Travis Hirschi (1969), who used questionnaire data augmented by police records to illustrate and test the proposition that high school boys are less likely to commit delinquent acts if they have strong social bonds.  But he went beyond that simple idea to identify four aspects of social connectedness:

  1. Attachment: Strong connections to other people, especially parents, that make a person sensitive to their wishes.
  2. Commitment: Investment of time and energy in a conventional line of activity that would be endangered by delinquency.
  3. Involvement: Being so busy in conventional activities, such as adult-organized recreation, that there is no time free to perform delinquent acts.
  4. Belief: The conviction that the norms of society are good and should be obeyed.

Religion could play a significant role in any of these, and of course involvement in church-based activities is especially obvious.  However, when Hirschi tested these four aspects of the theory, he found evidence for only the first three.  Legend has it that at this point, Rodney Stark asked Hirschi why he had not highlighted in his book manuscript the role religion played in preventing delinquency.  Hirschi replied that his data did not reveal any such normative power of religion for adolescents, and that exchange led to their influential journal article “Hellfire and Delinquency” (Hirschi and Stark 1969).  Years later, Stark and his collaborators (Stark et al. 1982) returned to this issue to explore possible geographic variations in the power of religion to deter delinquency.  The main finding may be called the Stark Effect: The power of religion to deter delinquency is significant when a substantial fraction of the population is religious, but absent when only a minority belong to religious organizations.  This reflects a very interesting general issue for social theory, in that the connections between variables within the minds of individuals may be conditioned by the organization of the surrounding society.  That is, in the part of California where Hirschi’s data were collected, churches are weak, so even boys who are believers may confine their faith within the walls of the church and ignore religious principles when they are in secular environments.

The Stark Effect also reminds us of the prior history of Control Theory.  The theory condensed the dominant perspective on deviant behavior of the Chicago School of Sociology in the third of a century before Hirschi wrote, a perspective that identified the cause of deviance as social disorganization (Papachristos 2012).  Over the early decades of the twentieth century, the sociology department at the University of Chicago dominated the discipline, with an intense focus on urban studies, often doing research on differences across the neighborhoods of Chicago itself.  A famous ethnography by Nels Anderson (1923) described the chaos in an area of Chicago he called Hobohemia, a “Bohemian” district populated by migrant workers and unemployed hobos.  Rigorous statistical studies looked at variations in rates of crime and delinquency (Shaw and McKay 1927).  A study that did not focus on religion, but whose findings have theoretical implications for religion, was The Gang by Frederic Thrasher (1927).  Its central principle was that when the community at large becomes disorganized, small-scale social organization will arise spontaneously.  While that principle was central to Thrasher’s theory of gangs, it could be applied as easily to the emergence of social groups that are not at all criminal, including religious sects and cults.  If traditionally dominant religious organizations lose their coherence, the result may be an eruption of reform and revival movements within these organizations, and novel religious movements entirely outside them.

The classic example of control theory applied to religion is Emile Durkheim’s (1897) study of suicide.  Decades before Durkheim wrote, social scientists had begun using geographic rates of suicide to develop and test theories, and his own contribution consisted primarily of providing a theoretical framework.  One of his key concepts was egoism, similar to but not to be confused with egotism, referring to an unhealthy individualism that left a person vulnerable.  He argued that the fragmentation of denominations and sects that marked Protestantism encouraged individualism, and thus increased the rate of egoistic suicides, compared with the more unified Catholicism.  Over the following century, sociological critics pointed out that Durkheim’s own data may not have supported his claim, for example because the high rates for Protestants seemed limited to German areas and thus may have reflected a different social cause.  But setting aside the debates over historical differences between Protestantism and Catholicism, more recent studies do seem to show that the coherence of religious communities is a factor deterring suicide (Pescosolido and Georgianna 1989).

Given how many variants of Control Theory exist, and Hirschi’s attempt to identify four subtheories within it, a survey of the social science of religion would identify many other models that could be given this label.  For example, Rosabeth Moss Kanter’s (1972) theory of commitment mechanisms is worth considering.  She developed it to explain the relative success of some nineteenth-century American utopian communities, which from her perspective only happened to be religious, although religious hope and faith may indeed have been required for these mechanisms to function.  Kanter identified six commitment mechanisms, which may be classified in three pairs.  The first in each pair requires giving something up, while the second involves receiving something in return:


  1. Sacrifice – Investment
  2. Renunciation – Communion
  3. Mortification – Transcendence

In many of the communes, members were required to give up ordinary material rewards, yet they were compensated by reasonably comfortable living conditions.  This could be considered in plain economic terms: they had sacrificed all their money to the commune, and would profit from this investment only if they remained loyal members.  They had renounced many social relationships with outsiders, but enjoyed an intense communion with fellow members.  They had accepted harsh criticism of the secular identities they had abandoned, but gained new and positive identities within the ideal community.  This last point did not necessarily require mortification of the flesh, but rather of their former self-images, in order to achieve transcendence of the human condition, which only religion seems capable of offering.  This model of commitment to religious communities may be applied to ordinary churches, but with milder versions of each of the six mechanisms.  The first in each pair dissolves the control that non-religious society may have exerted, while the second imposes control by the religious organization.  The vast literature on deviance, conformity and religion may include many plausible multi-component control theories, of which those by Hirschi and Kanter are prominent examples.


Anderson, Nels.  1923. The Hobo.  Chicago: University of Chicago Press.

Durkheim, Emile.  1897.  Suicide.  New York: Free Press (1951).

Hirschi, Travis.  1969.  Causes of Delinquency.  Berkeley, CA: University of California Press.

Hirschi, Travis, and Rodney Stark. 1969.  “Hellfire and Delinquency.” Social Problems 17:202-213.

Kanter, Rosabeth Moss.  1972.  Commitment and Community: Communes and Utopias in Sociological Perspective.  Cambridge, MA: Harvard University Press.

Papachristos, Andrew V.  2012.  “The Chicago School of Sociology.”  Pp. 472-479 in Leadership in Science and Technology, edited by William Sims Bainbridge.  Thousand Oaks, CA: SAGE.

Pescosolido, Bernice A., and Sharon Georgianna. 1989. “Durkheim, Suicide, and Religion: Toward a Network Theory of Suicide.” American Sociological Review 54:33-48.

Shaw, Clifford, and Henry D. McKay. 1927. Delinquency Areas. Chicago: University of Chicago Press.

Stark, Rodney, Lori Kent, and Daniel P. Doyle.  1982.  “Religion and Delinquency: The Ecology of a ‘Lost’ Relationship.” Journal of Research in Crime and Delinquency 18:4-24.

Thrasher, Frederic M. 1927.  The Gang.  Chicago: University of Chicago Press.



Proposed by criminologist Edwin Sutherland, Differential Association Theory has mainly been applied in social science studies of deviant behavior, so it is reasonable to extend it to the analysis of cults and sects, but Sutherland intended it as a general theory of social influence that could be applied very broadly.  The standard version was published in the fourth edition of his textbook, Principles of Criminology, and postulated that criminal behavior is chiefly learned within intimate personal groups (Sutherland 1947: 6-7).  An important part of that learning could be conceptualized as values, whether to view legal codes favorably or unfavorably, but it also could include practical skills such as techniques for committing a crime.  Sutherland explicitly stated that his theory described the ways people learn all forms of social behavior, and that criminal behavior serves the same needs and values as other kinds of behavior.

A central but frequently misunderstood principle of the theory was stated thus: “A person becomes delinquent because of an excess of definitions favorable to violation of law over definitions unfavorable to violation of law.”  Many scholars and students read the theory as if it were simply a model of social influence: if most of the people you associate with are criminals, you will be one, too.  But that is not at all what the theory is about, as the eleventh edition of Principles of Criminology clearly states (Sutherland et al. 1992: 92).  It is a theory of communication, what today would be included in information science, and it was written within the sociological school of thought called symbolic interactionism.  The word definition, used repeatedly by Sutherland, connects to the work of W. I. Thomas (1923: 42), who stressed the importance of the definition of the situation in determining human behavior.

For many years, beginning with Travis Hirschi’s 1969 book Causes of Delinquency, criminologists vigorously debated the relative merits of Differential Association Theory against Control Theory, which Hirschi favored.  In Control Theory, people become deviant largely because their social attachments to other people are weak, so this battle of explanations tended to reinforce the revisionist version of Differential Association Theory that focused on differential social ties with criminal subcultures versus conventional society.  Hirschi seemed to demonstrate that his own data, from a survey of California adolescents, entirely supported Control Theory, but the debate was never really settled, and at least one prominent criminologist found solid evidence in favor of Sutherland’s theory in Hirschi’s own dataset (Matsueda 1982).  Of course, there is no reason why both theories could not be correct, combining to explain deviance better than either could alone, especially when we recognize that Sutherland’s theory is not just about social bonds between people, but about the differential transmission of information and values.  Notably, the theory can give religion a prominent role to play in promulgating norms, the obvious example being the influence of powerfully expressed church sermons in convincing churchgoers to obey the laws of the society.

Thus, Differential Association Theory is about mental associations between concepts, and about how people learn definitions of the situation in complex patterns of communication with other people.  It can therefore be adapted to model phenomena other than crime, especially when a complete cultural consensus is lacking and people may receive communications that define things in different ways.  Sutherland expressed his theory in terms of nine propositions, which may easily be translated to describe the dynamics of religious belief and adherence (Bainbridge 2006: 47-48):

  1. Religious behavior is learned.
  2. In interaction with other persons in a process of communication.
  3. Principally within intimate personal groups.
  4. The learning includes: (a) religious practices, and (b) religious beliefs, attitudes, and values.
  5. The religious beliefs, attitudes and values are learned from definitions favorable to acceptance of the particular religion or unfavorable.
  6. A person converts to a religion because of an excess of definitions favorable to the religion over definitions unfavorable to the religion.
  7. Differential associations may vary in frequency, duration, priority and intensity.
  8. The process of religious learning involves all the mechanisms that are involved in any other learning.
  9. While religious behavior is an expression of general needs and values, it is not explained by those general needs and values, since nonreligious behavior is an expression of the same needs and values.

The seventh point deserves special scrutiny.  Suppose a person is receiving mixed messages about a religious belief, both pro and con.  If someone says the belief is true, that associates the belief with truth.  If someone else says the belief is false, that associates the belief with falsehood.  The differential association is the balance between these two kinds of message.  If the favorable association is communicated with greater frequency, then it outweighs the less frequent negative association.  The duration variable suggests that a long message, like a church sermon lasting twenty minutes, will outweigh a quick comment by someone who rejects the church.  Priority concerns earlier versus later messages, for example the power of religious instruction in childhood.  Intensity, in the context of symbolic interaction, may simply be how much emotion is invested in the message.
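One way to make the arithmetic of this balance explicit is sketched below in Python.  Sutherland never proposed a formula, so the multiplicative weighting of the four modalities, the attribute names, and the example messages are all assumptions introduced here purely for illustration.

```python
# Purely illustrative sketch of how the four modalities in point 7 might be
# combined into a "balance of definitions."  Sutherland offered no formula;
# the weighting scheme and the example numbers are assumptions.
from dataclasses import dataclass

@dataclass
class Definition:
    favorable: bool      # does the message favor the religion?
    frequency: float     # how often this kind of message is received
    duration: float      # e.g. a twenty-minute sermon vs. a quick remark
    priority: float      # earlier messages (childhood) weighted more heavily
    intensity: float     # emotional investment in the message

    def weight(self) -> float:
        return self.frequency * self.duration * self.priority * self.intensity

def differential_association(messages: list[Definition]) -> float:
    """Positive balance: favorable definitions outweigh unfavorable ones."""
    return sum(m.weight() if m.favorable else -m.weight() for m in messages)

messages = [
    Definition(True,  frequency=1.0, duration=20.0, priority=2.0, intensity=1.0),  # weekly sermon
    Definition(False, frequency=3.0, duration=1.0,  priority=1.0, intensity=0.5),  # offhand dismissals
]
print(differential_association(messages))   # > 0 predicts conversion or retention
```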

Aspects of this theory are quite compatible with other communication-related studies that date from about the time Sutherland was working, that have stood the test of time, and that can be applied to religious topics.  For example, in the 1950s many researchers were studying the influence of the mass media, which seemed to be able to influence citizens in an impersonal manner.  Or, the mass media were impersonating personal influence, as people might come to feel they had a personal relationship with television newscasters and actors.  The most relevant example is Jack Webb, who created the highly popular 1951-1959 TV crime drama, Dragnet, in which he himself played the role of a policeman named Joe Friday, who always triumphed over the bad guys in 276 episodes of this half-hour program.  Among the most significant sociological works of that decade was Personal Influence by Elihu Katz and Paul Lazarsfeld (1955), which offered a two-step model of diffusion of information and values.  Whether from the mass media, or some more obscure source, a few open-minded or creative members of the local community would become enthusiastic about something, then promote it through the immediate community, serving as opinion leaders.  Both the moralism dramatized in Dragnet and the enthusiasm of a local opinion leader increase the intensity of communication of norms, thus being quite compatible with Differential Association Theory.

Many social scientists have offered critiques and extensions of Sutherland’s theory.  Robert Burgess and Ronald Akers (1966) revised the theory considerably, removing it from the symbolic interactionist tradition and situating their version in Behaviorism.  This was a huge modification, implying that people accept a religious belief if they receive tangible rewards for doing so.  In their reformulation, differential association becomes the cost-benefit ratio, thus rather more relevant for understanding why people might steal property than why they pray for guidance, but not totally incompatible with use of the theory to explain at least some kinds of religious behavior.

In recent years, the Internet has revolutionized the mass media, fragmenting or diversifying the flood of information and norms, and giving people technologies that allow them to communicate 24/7, and that create online communities that may sometimes transmit intense definitions.  Thus, we seem to have entered a period in which Differential Association Theory is not only relevant but needs to be updated and expanded through a variety of innovative empirical research projects.


Bainbridge, William Sims.  2006. God from the Machine: Artificial Intelligence Models of Religious Cognition. Lanham, MD: AltaMira Press.

Burgess, Robert L., and  Ronald L. Akers.  1966. “A Differential Association-Reinforcement Theory of Criminal Behavior,” Social Problems 14(2): 128-147.

Hirschi, Travis.  1969.  Causes of Delinquency. Berkeley, CA: University of California Press.

Katz, Elihu, and Paul F. Lazarsfeld.  1955.  Personal Influence: The Part Played by People in the Flow of Mass Communications.  Glencoe, IL: Free Press.

Matsueda, Ross L.  1982.  “Testing Control Theory and Differential Association: A Causal Modeling Approach,” American Sociological Review 47:489-504.

Sutherland, Edwin H.  1947.  Principles of Criminology.  Chicago: Lippincott.

Sutherland, Edwin H., Donald R. Cressey, and David F. Luckenbill.  1992.  Principles of Criminology.  Lanham, MD: General Hall.

Thomas, William I.  1923.  The Unadjusted Girl.  Boston: Little, Brown.

This kind of theory postulates that the human mind naturally seeks simple models of reality, and that humans will tend to avoid extreme cognitive effort. This perspective is similar to, but distinguishable from, cognitive consistency theories.

The human mind can hold a vast number and variety of memories, from episodic recollections of specific past events and separate but connected glossaries of words in multiple languages, to manual skills like typing on a keyboard in which the letters of the alphabet are arranged in an illogical sequence.  Yet when guiding behavior, relatively few pieces of information seem to dominate, thereby facilitating quick and decisive action.  In 1956, psychologist George A. Miller observed that humans can hold about seven items in short-term memory, also called working memory, and can assign sensory inputs such as musical tones to a maximum of about seven categories.

Vast amounts of information may pass through short-term memory during the course of a life, but only a small fraction of these data persists beyond a few minutes, for example building up autobiographical memory over years (Conway 2001).  Miller was aware that the human brain fed information from long-term memory into short-term memory in what cognitive science came to call chunks, and chunking was one of the key methodologies developed for computer-based artificial intelligence over the subsequent decades (Newell 1990).  In reflecting upon one’s religion, at one moment a particular moving church service may be recalled, but at another moment the particular words of a verse in the Bible or the visual image of a crucifix may dominate consciousness.

In 1954, Gordon Allport offered a cognitive effort theory of prejudice, based on this axiom of cognitive efficiency.  In a study that frequently used examples from religion, he said that humans tend to conceptualize the world in terms of simple categories that require little effort to apply, rather than, for example, multiple dimensions of continuous variation.  Humans tend to classify other humans into distinct groups and assume that every member of a group shares the same primary characteristics.  This applies to stereotypes of members of religious groups other than our own.  While only empirical research can determine the extent to which this tendency constrains human cognitions about other faiths, it has proven relatively easy to program artificial intelligence systems that solve problems reasonably well while converging on suboptimal models of reality comparable to human prejudice (Bainbridge 1995).

Pascal Boyer (2001) and Justin Barrett (2004) argued that supernatural ideas must be minimally counterintuitive if they are going to be popular.  That is, they must contradict ordinary assumptions about life in only one or two important respects.  God, for example, is conceptualized as a person, having thoughts and desires just like humans, but also possessing special powers.  A minimally counterintuitive idea is by definition an example of cognitive simplicity.

If human short-term memory evolved to maximize cognitive efficiency and thus provide quick and decisive control over real-time human action, then it may not only be limited but also structured in a way that is oriented toward action.  H. Porter Abbott (2003) suggests that human thought organizes things in terms of narratives — stories in which protagonists face obstacles and take actions in pursuit of goals — and thus the scientific theory of evolution by natural selection from random variation cannot compete for popular acceptance with religious stories because it is unnarratable.  Large fractions of the holy scriptures of the world are stories, and deities often are framed as personified forces of nature.  It also is possible that belief in an afterlife reflects the fact that we develop mental models of the feelings, beliefs and intentions of the individual people closest to us, and the models persist after the deaths of the people (Bering 2006).

Cognitive efficiency theories can be applied to most domains of human endeavor, including the social science of religion itself.  The obvious example is Max Weber’s (1949) methodology of conceptualizing human thought, action and social organization in terms of ideal types, which he used when he introduced the church-sect duality into the sociology of religion.  Harvey Whitehouse (2004) has proposed a cognitively simple theory of the difference between a church and a sect.  Mainstream religion, relatively intellectual and involving a host of symbols, is rooted in semantic memory that learns through constant repetition of logically connected ideas.  Sectarian religion is more emotional and rooted in episodic memory that is activated by emotionally intense but rare religious experiences.  The two modes of religiosity, according to Whitehouse, spread socially through very different forms of activity in the brain that are functionally distinct and may be anatomically distinct as well.

Given his fundamental postulate that humans adopt behavioral strategies that reduce cognitive effort, it is noteworthy that Gordon Allport (1960) also developed a very influential distinction between two types of religious motivation, extrinsic and intrinsic.  Extrinsic religiousness is utilitarian, using religion to obtain rewards of value in ordinary life, even supernatural rewards directly from God.  Intrinsic religiousness is not greedy in this way, but reflects sincere adherence to the faith, even accepting the unselfish goals prescribed by it.  Allport considered intrinsic religiousness to be more mature, thus connecting this distinction to theories of human cognitive development.  Yet given how much he published about religion, Allport might himself be considered religious, and he may have framed his theories of human cognition at least in part to harmonize with his religious beliefs.

It is possible for an ideal type theory of religion to be objective, rather than an expression of academic or theological prejudice, if it is based on logical alternatives.  For example, Andrew Greeley (1989) postulated the cognitive theory that Catholics tend to have an analogical imagination, while Protestants have a dialectical imagination.  The analogical imagination holds that God is present in the world, expressing Himself through every aspect of creation, and it stresses the community.  The dialectical imagination holds that God has withdrawn from the sinful world, and it stresses the individual.  Of course, the metaphor of withdrawal suggests movement away from us along a spatial dimension, not a leap from inside our world to outside.  Also, Greeley’s dichotomy seems not to apply well to other religious traditions, notably those that are polytheistic.

Jupiter used to visit our world frequently and interacted directly with living humans, while Neptune and Pluto were as distant from us as the astronomical objects that bear their names.  Thus classical paganism could be described as both analogical and dialectical.  It would at least appear superficially that polytheisms are cognitively more complex than monotheism, perhaps because they represent a stage in the unification of separate tribes of people who worshiped different gods, or because their gods represented different features of nature that required centuries of cultural development to bring together.  Stark and Bainbridge (1987: 55-87) considered at some length the proposition that the evolution of religion would be toward greater simplicity, in part to avoid the dangers of magic, which falsely appears to serve Allport’s extrinsic motivations, but also reflecting the gradual unification of many smaller social systems into a few large ones.  However, they cautioned against the too-simple deduction that the end result would be pure monotheism, because, to put the point in Greeley’s terms, that could mean total withdrawal of God from the world, which would render Him simply irrelevant.

A very different critique of the cognitive simplicity theory was proposed by H. Porter Abbott (2013) and is suggested by the title of his book, Real Mysteries: Narrative and the Unknowable.  While drawing upon twentieth-century existentialism, his argument restates a very traditional perspective fundamental to some Asian religions, notably Zen Buddhism, but incorporated worldwide in mysticism: much of human experience cannot be put into words, and thus cannot be captured cognitively.  Miller focused on the chunked contents of short-term memory, ignoring the vastly more complex and even ineffable contents of the entire brain.  Allport’s extrinsic religion does so as well, and the lack of clarity in his concept of intrinsic religion may merely indicate that it is stored outside short-term memory.  Thus, cognitive efficiency theory may explain much, but it can explain some of religion’s fundamental qualities only with difficulty.

Attachment theory posits that religion can be explained by understanding the human need for attachment in general, and one’s relationship to his or her parents specifically.  Childhood experience is of particular importance to explaining adult religiosity from this perspective.  Attachment styles are generally divided into a trichotomy of:

  1. Secure,
  2. Avoidant, and
  3. Anxious/ambivalent.

Parental relationships are implicated in religiosity in one of two ways:

  1. Modeling, where one’s human attachment style serves as the pattern for attachment to religion and the divine, or
  2. Compensation, where religion serves as a substitute for human relationships.

Within the framework of the theory, the need for attachments is underpinned by biosocial impulses and desires that have developed over the period of human evolution.

Used in the psychology of religion, these approaches attempt to delineate the different motivations for the expression of religiosity.  Originally developed by Allport and Ross (1967), these measures were an effort to distinguish internal motivation to be religious from the external or immediate benefits offered by participation in a religious community.  In other words, the approach seeks to distinguish between “religion that serves as its own end goal” and religion “used in the service of other goals or needs–i.e. as an instrumental value” (Gorsuch 1988: 210).

This research led to many subsequent studies employing and refining the scale, most notably dividing the extrinsic dimension in two, as well as to the development of metrics to assess different orientations, such as Batson’s (1976) “quest” dimension.  Debates remain about whether some actions, such as attendance at worship services, are indicative of intrinsic or extrinsic religious motivation.

Originating in psychological studies of religion, these approaches highlight the ways in which individuals utilize religion to cope with difficult situations and make sense of events in their lives.  Research indicates that people typically have naturalistic and religious meaning systems operating simultaneously, and how these systems are combined to make sense of events depends on the characteristics of both the individual and the event in question.  Research on attributions is typically carried out in experimental settings where subjects are presented with scenarios or vignettes and asked how they would explain and understand the situations presented.

Similarly, research and theory indicate that religious coping is more likely to occur in situations perceived as uncontrollable.  As such, levels of religious coping vary not only by individual characteristics, but also by social location (status set).  Together these theories posit that religion helps satiate the human need for meaning and for explanations addressing existential concerns.

Interaction Ritual Chain Theory (IRC) is a perspective focusing on interactions and on the emotional input and feedback of individuals within those interactions.  Drawing on the work of Durkheim ([1912] 1995) and Erving Goffman (1959, 1967), the theory was formally posited by Randall Collins (1993, 2004; see his 2010 essay on micro-sociological approaches to religion).  IRC asserts that interactions produce or deplete the “emotional energy” of participants depending on key factors.  These factors include the physical co-presence of interactants, the exclusivity of the group, a mutual focus and mood, and bodily synchronization.  Given its situational emphasis, IRC is difficult to measure quantitatively, but it is insightful for qualitative research, especially in areas such as worship rituals.  Importantly, IRC theory also situates religious actors in social space and outlines the linkages between ritual, affect, and belief.

This perspective posits that religion survives and can thrive in pluralistic, modern society by embedding itself in subcultures that offer satisfying, morally-orienting collective identities which provide adherents with meaning and a sense of belonging.

In a pluralistic society, religious groups which are better at transmitting and employing the cultural tools needed to create both clear distinction from, and significant engagement and tension with, other relevant out-groups (short of becoming genuinely countercultural) will be relatively successful (Smith, 1998: 118-119).  Although similar to the religious economies claim that groups at a medium level of sociocultural tension will attract the most adherents, this perspective focuses more on issues of identity and symbolic boundaries by drawing on cultural sociology.

Studies of conversion, religious schisms, and secularization utilize social network theory to understand the influence of community and networks on the religious life of individuals, groups and societies. In his classic study of suicide, Emile Durkheim used religion as an indicator of how well or poorly a society was socially integrated; other research indicates that religion actively builds social networks (Durkheim 1897, Stark and Bainbridge 1981, Bainbridge 1987, Bainbridge 2006).

Religion may produce strong social networks, but it also depends upon them. Thus religion may feature as either the independent or dependent variable in studies related to this theoretical perspective. There are a number of studies of social networks where religious vitality is the result of levels of social integration (Bainbridge 1990, Stark and Bainbridge 1980, Bainbridge 1989).  Also see “Conversion Theory.”

Secularization theory has a long history, built around the idea that religion will become less powerful as a social institution with the progress of “modernity.”  For example, Berger defines the term as meaning: “The process by which sectors of society and culture are removed from the domination of religious institutions and symbols” (1967: 107).  Secularization perspectives are varied, but in general there are three levels upon which secularization is theorized to occur (see Tschannen 1991).  These are: 1) Macro – social differentiation; 2) Meso – the declining significance of religion in organizations; and 3) Individual – a reduction in levels of practice, belief, or affiliation at the individual level.  A hotly debated topic concerns the degree to which secularization is an inevitable process as societies “modernize,” or whether instances of secularization are “exceptions” (see Davie 2002; Martin 1991).  Another source of contention is whether all three levels of secularization are necessarily linked together or whether processes at one level may occur without those at another.

Secularization was for a long period the dominant perspective on religious change in the social sciences.  Yet there has never been a single theory of secularization.  A family of theories drew on classical sociological theory.  Durkheim posited that increasing social differentiation as a result of the expanding social division of labor would lead to the separation of the sacred and secular realms.  Secular institutions would become predominant, the “collective conscience” generated by religious participation would erode, and the functions performed by religion would be taken over by newly specialized institutions, such as the nation-state and the education system.  Weber proposed that the increasing dominance of instrumental rationality in economic and political institutions was leading to the “disenchantment” of everyday life, and eventually the eclipse of religious reason.  Marx saw religion as little more than an ideological system for the justification and perpetuation of class domination, arguing that as class consciousness and materialism advanced, religion would disappear.

In general, contemporary proponents of various secularization theories observe that modernity tends to erode religion’s plausibility, intensity, and authority. Further, these theories tend to posit the contemporary retreat of sacred institutions, the privatization of faith, and the “progressive shrinkage and decline of religion” in public life (Casanova 1994).   Within this broad consensus, however, there are a variety of theoretical positions.

Peter Berger (1967) offers a micro-level version of the theory that focuses on the plausibility of religious concepts.   He argues that changes in religious consciousness are due not only to science and the Enlightenment but also to the expanding social and cultural pluralism that is a central feature of liberalizing societies.  This confrontation with pluralism was posited to damage the plausibility of religious dogma.  When religious adherents encounter credible others with rival and fully incompatible claims to ultimate truth, their own certitudes begin to suffer.  In liberal societies, multiple religious and secular groups jostle for influence on the basis of philosophical and ethical claims, undercutting each of their claims to predominance, and ultimately leading to the privatization of religion in civil society.

There is also a macro version of the theory. In the decades following the Second World War, most social scientists expected convergence among the Western industrial democracies and, with varying degrees of lag, a general move toward modernity by the “developing” societies. Modernization was expected to bring prosperity and opportunity, initiating a “culture shift” to post-traditional values and lifestyles (Inglehart 1990). As Ronald Inglehart and Wayne Baker (2000) explain, “Modernization theorists … have argued that the world is changing in ways that erode traditional values. Economic development almost inevitably brings the decline of religion, parochialism and cultural differences.”  In fact, cross-national research does suggest a generally negative association between development and religiosity, though with some caveats.  More recently, the modernization theory of secularization has been modified. Neo-modernization theory now rejects the linear implications of past formulations and seeks to link the micro and macro-levels in its explanation for religious change.  The key micro-level factor is now held to be existential insecurity; the greater that insecurity, the more likely that people will be religious.  But where economic, political and social conditions have improved such that personal security improves, religion loses its impact.  This does not mean that the world is growing secular, however, since most of the world’s population growth is occurring in the poorest, most unstable, and most traditional societies (Norris and Inglehart 2004).

There is no unified theory of secularization, and some of the mechanisms proposed by secularization theorists remain obscure.  In addition, it is not clear why indicators of “secularization” are high in some modern societies – for instance, in many Western European societies – but lower in others such as the United States, or why some countries that are far less developed than the Western industrial democracies are more irreligious.  Berger (2002), once a prominent proponent of the secularization thesis, now declares, “Our age is not an age of secularization. On the contrary, it is an age of exuberant religiosity, much of it in the form of passionate movements with global outreach.”  Social scientists proposing the religious economies model have generally been highly skeptical of secularization theory.  Rodney Stark (1999) advised that the thesis be left to “rest in peace,” with contrary evidence and theoretical shortcomings effectively consigning it to the dustbin of history.

Others note that secularization often appears to be an intentional political project, rather than a spontaneous socio-cultural development (Smith 2003; Froese 2009).  Many studies identify the central role played by church–state institutions in causing variation in secularization across societies. Political mobilization on the basis of religion is often triggered by the efforts of political elites to reduce the public role of religion (institutional secularization) or extend governmental authority to domains previously organized by religious organizations. State regulation or penetration into areas once dominated by religion often provokes conflict, especially where it threatens the influence of religious authorities. “Reasonable” government regulation of religious expression or activities in the interest of secularization is often viewed as an attack on religion.

Finally, there are the new cultural approaches to secularization theory that argue against linear secularization narratives but still contend that secularization is manifest in the pluralism of religious worldviews and highly individualized assemblages of religious and supernatural beliefs. As is becoming apparent in Europe and the United States, even where belief in the supernatural remains, denominational and confessional attachments appear to be weakening.  As a result, religious preferences are becoming more individualized, the status of orthodox religious authorities is diminishing, and growing proportions of people seek a spirituality divorced from conventional religion (Lambert 2004).  As Phil Gorski (2005) aptly puts it, “the weakening of traditional Christianity appears not as a decline of religion per se … but as a return to polysemism, since the new worldviews are not uniformly theistic.”  Thus it is not that religion inevitably declines but that religion as it has been theologically and institutionally understood in the West for the last several centuries is being irresistibly altered.

Religious economies perspectives are built on rational choice assumptions.  In particular, these approaches assume that there are potential benefits to religious participation, both psychological and interpersonal, that people may seek.  Treating the overall level of religion in an area as analogous to a market, the perspective assumes an innate demand for some type of explanatory system and posits that differences in levels of participation can be at least partially explained by the supply of “religious goods” in an area.  Producers of these religious goods and individual consumers become central concepts, as well as governmental regulation of religion.  Other important concepts include the level of tension between a religious group and the surrounding socio-cultural environment and the level of “strictness” required to be a member of a religious group.

“Individuals act rationally, weighing costs and benefits of potential actions, and choosing those actions that maximize their net benefits.” (Iannaccone, 1997:26)

“Within the limits of their information and understanding, restricted by available options, guided by their preferences and tastes, humans attempt to make rational choices.” (Stark and Finke, 2000:38) 

“A religious economy … encompasses all of the religious activity going on in any society.” (Stark and Finke, 2000:35)
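The rational choice assumption expressed in these quotations can be given a toy illustration.  In the Python sketch below, the group names, the numeric benefits and costs, and the idea of collapsing strictness and tension into a single cost figure are invented for illustration; they are not drawn from Iannaccone or from Stark and Finke.

```python
# Purely illustrative sketch of the rational choice assumption: an actor
# surveys the available religious "suppliers" and picks the option with the
# greatest net benefit.  All names and numbers here are assumptions.
suppliers = {
    "low-tension church":  {"benefits": 5.0,  "costs": 1.0},
    "medium-tension sect": {"benefits": 9.0,  "costs": 4.0},
    "high-tension cult":   {"benefits": 10.0, "costs": 8.0},
    "no participation":    {"benefits": 0.0,  "costs": 0.0},
}

def net_benefit(option: dict) -> float:
    return option["benefits"] - option["costs"]

choice = max(suppliers, key=lambda name: net_benefit(suppliers[name]))
print(choice)   # the option that maximizes net benefit, given these numbers
```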

This theory holds that religion is just as important a feature of modern society as it is of traditional society, but it takes different forms and possesses different characteristics. While compatible with functionalist theories, this theory does not depend upon them, because it concerns the historical transformation of religion, whereas religion’s functions may be constant. Among the most influential variants of this perspective is the pattern variable theory of Parsons and Shils.

According to them, five socio-cultural dimensions of variation together describe modernization and are applicable to religion as well as to all other major institutions of society. With the traditional end of the dimensions on the left, the pattern variables are:

  1. Affectivity – Affective Neutrality
  2. Self-orientation – Collectivity-orientation
  3. Particularism – Universalism
  4. Ascription – Achievement
  5. Diffuseness – Specificity

Modern religion, as defined by the right end of these dimensions, is emotionally cooler than traditional religion; it is oriented toward very large social collectivities such as all humanity rather than the self, and it promulgates universalistic norms and hopes of salvation, placing responsibility for moral choice in the individual.  Modern religion also becomes differentiated from other institutions of the society.  These continua are intended to be five distinct dimensions, rather than aspects of one, implying that prior to the completion of modernization some societies and their religions might be mixed types.

A recent critique of this theory, which grants that it has been exceedingly influential, comes from Nils Gilman (2003).  Various forms of modernization theory are also intertwined with theories of secularization.  Although these links were at one time conceptualized in a teleological fashion (see Gorski and Altinordu 2008), more recent scholarship has suggested that the link between secularization and modernization is more complex (e.g., Almond et al. 2003; Casanova 1994; Warner 1993).

According to this perspective, religion exists because it serves an integrating function for society as a whole.  Durkheim suggested this when he argued that God represents the society, and that in worshiping God, society really reveres itself.  In this theory, the elements of the culture that are essential to the society’s survival are labeled sacred.  Unlike theorists of the rise and fall of civilizations, functionalists do not consider the survival of a religious culture to be problematic.  While flavors of this theory are common in older writings on religion, among the best texts to consult are Durkheim 1915, Parsons 1937, and Parsons 1964.

Some readers of these works would draw the inference that religious individuals are more likely to follow society’s norms than non-religious individuals. However, pure functionalist theory does not rise or fall on whether this hypothesis is true, because it concerns the society as a whole, rather than individuals. In his 1964 journal article, Parsons suggests that the proof of functionalist theories of religion can be found in the fact that all societies have possessed religion – that it is an evolutionary universal necessary for the survival of society. He also considers religion to be a precondition for the development of many of the apparently non-religious features of modern society.

According to this theory, fertility and mortality rates change in a predictable manner when a society evolves from a traditional to a modern form.  Initially, death rates are high, because of the primitive technology and economic poverty of a traditional society, so birth rates must also be high to sustain a stable population.  At the beginning of modernization, technological and economic progress reduces the death rate, but the birth rate remains high through social inertia, so there is a population explosion.  Eventually, the birth rate comes down as well, and the result is a stable population with low fertility and mortality rates.  A classic statement comes from Kingsley Davis.

By the mid-1980s, however, it was apparent that most advanced industrial or post-industrial societies had birth rates that were too low to offset even a greatly diminished death rate, and after the distorting effect of the changing age distribution had worked its way through, these societies seemed doomed to population collapse.  The immediate relevance to religion is two-fold, because low birth rates undercut some of the family-related variables that encourage religious participation, and because one of the few factors that could sustain fertility at the replacement level is religion.  However, as Nathan Keyfitz pointed out, only fundamentalist religions, like radical Islam, may have sufficient fertility, and thus society ironically may become more religious through modernization rather than less (Keyfitz 1987).

This perspective has been used to explain the relative success of more conservative religious movements, in spite of the high numbers of disaffiliates in the post-industrial world.
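A toy numerical illustration of the demographic transition and its post-industrial sequel may help.  The rates, timings, and starting population in this Python sketch are invented; only the qualitative pattern matters: the death rate falls first, the birth rate follows with a lag, producing a population explosion in between, and a birth rate that settles below the death rate eventually shrinks the population.

```python
# Toy illustration of the demographic transition.  All rates and timings
# are invented assumptions; only the qualitative shape is meaningful.
def birth_rate(year: int) -> float:
    if year < 40:
        return 0.040                           # traditional society: high fertility
    elif year < 80:
        return 0.040 - 0.0007 * (year - 40)    # fertility falls with a lag
    else:
        return 0.012                           # post-industrial: below replacement

def death_rate(year: int) -> float:
    return 0.035 if year < 20 else 0.015       # mortality falls early in modernization

population = 1_000_000.0
for year in range(120):
    population *= 1.0 + birth_rate(year) - death_rate(year)
    if year % 20 == 0:
        print(year, round(population))         # explosion, then stagnation and decline
```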

Asian religions, and some classical western philosophers, believed that history consisted of an endless series of cycles: the Wheel of Life, eternal return, or eternal recurrence. This idea can also be found in nineteenth-century European philosophies that related to religion, notably the work of Friedrich Nietzsche, and it is not entirely implausible that processes analogous to the individual’s life cycle occur on larger, societal scales.

Probably the most impressive cyclical theory that gives religion a central role is the one proposed by Pitirim A. Sorokin. For Sorokin, the most influential elements of culture are those that concern the inner experience of people, their images, ideas, volitions, feelings, and emotions. The essence of a culture is defined by the view people have of the nature of reality, the goals they value, and the means they emphasize in reaching these goals.

In his theory, each great civilization emerges out of a period of chaos with a coherent set of spiritual beliefs that give it strength. Often it is born in the development of a new religious tradition. At this point, it is what Sorokin called an ideational culture. A successful ideational culture grows and develops. With success, however, comes complacency. The society slowly loses its faith in spirituality, doubt sets in, and the culture begins to become sensate, a perspective on existence that is the opposite of ideational. A sensate culture believes that reality is whatever the sense organs perceive, and it does not believe in any supernatural world. Its aims are physical or sensual, and it seeks to achieve them through exploiting or changing the external world.

Depending upon circumstances, most people in a sensate society will exhibit one of three personality orientations. Active individuals use technology and empire-building to take charge of the material world. Passive individuals indulge themselves in pleasures of the flesh. And cynical individuals exploit the prevailing conditions for their own profit without any ideal to provide fundamental values. The entire cycle of which he wrote can take many centuries to complete, but Sorokin believed that western society was approaching a crisis point.

Ultimately, a sensate civilization is likely to crash, ushering in a new period of intense cultural chaos out of which a new ideational civilization may be born. Sorokin wrote, “Neither the decay of the Western society and culture, nor their death, is predicted by my thesis. What it does assert… is simply that one of the most important phases of their life history, the Sensate, is now ending and that we are turning toward its opposite through a period of transition. Such a period is always disquieting, grim, cruel, bloody, and painful” (vol. 3, p. 537).

Were Sorokin alive today, he probably would cite the rise of Islamic fundamentalism as confirmation of his theory, suggesting that it would result in widespread conflict and religious revival in Islamic societies, even as European Christian culture continued to descend toward at least temporary collapse.

It deserves note that the notion of a cyclical periodicity has also been applied to Western religions, and to narrower phenomena such as political movements.  For example, some researchers (Jelen 1991; Lienesch 1993) have suggested that religiously motivated political activity in the United States follows a cyclical pattern, based in part on the activation of negative affect toward out-groups (e.g., immigrants, homosexuals) and the subsequent mobilization of religious particularism.

Inspired by Lofland’s field research on recruitment to the Unification Church, which in its earliest days was sending evangelists from Korea to the United States, Conversion Theory offers a series of steps a person must go through in order to become a member of a new religious group:

  1. Experience enduring, acutely felt tensions;
  2. Within a religious problem-solving perspective;
  3. Which leads him to define himself as a religious seeker;
  4. Encountering the group at a turning point in his life;
  5. Wherein an affective bond is formed (or pre-exists) with one or more converts;
  6. Where extra-cult attachments are absent or neutralized;
  7. And where, if he is to become a deployable agent, he is exposed to intensive interaction (Lofland and Stark 1965).

Fifteen years after he had collaborated with Lofland, Stark worked with Bainbridge on a paper that implied that several of the steps of the model were unnecessary, and that frequency of social interaction – the last three steps – could be quite sufficient (Stark and Bainbridge 1980).  Although the process of conversion remains an important idea, the importance of social networks for conversion is the aspect of conversion theories that has had the most impact on further research.  More recently, scholars have taken the focus on networks and incorporated it into studies of religious switching, and even de-conversion.
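To make the structure of the model easier to see, the sketch below restates the seven conditions as a simple checklist, together with a "network-only" variant reflecting the Stark and Bainbridge emphasis on the final, interactional steps. The attribute names and the grouping of conditions are our own illustrative shorthand, not terminology from the original papers.

```python
# Illustrative sketch only: the seven Lofland-Stark conditions as a checklist.
# The attribute names and the "network-only" shortcut are shorthand invented
# for this example, not wording from the 1965 or 1980 papers.

from dataclasses import dataclass

@dataclass
class Candidate:
    enduring_tension: bool            # 1. acutely felt, long-lasting strain
    religious_problem_solving: bool   # 2. frames problems in religious terms
    self_defined_seeker: bool         # 3. defines self as a religious seeker
    turning_point: bool               # 4. meets the group at a life turning point
    affective_bond: bool              # 5. bond with one or more converts
    extracult_ties_neutralized: bool  # 6. outside attachments absent or weak
    intensive_interaction: bool       # 7. exposed to intensive interaction

def converts_full_model(c: Candidate) -> bool:
    """Lofland-Stark (1965): all seven conditions must hold."""
    return all(vars(c).values())

def converts_network_model(c: Candidate) -> bool:
    """Stark-Bainbridge (1980) emphasis: the social-network conditions
    (roughly the last three steps) may be sufficient on their own."""
    return (c.affective_bond
            and c.extracult_ties_neutralized
            and c.intensive_interaction)
```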

“Cognitive science is the interdisciplinary study of mind and intelligence, embracing philosophy, psychology, artificial intelligence, neuroscience, linguistics, and anthropology.” Paul Thagard in the Stanford Encyclopedia of Philosophy, available online.

Recent cognitive theories have not yet been fully integrated into the social science of religion, but researchers have many opportunities to expand the scope of cognitive science (Bainbridge 2006).  Wuthnow (2007) provides an introductory overview of the issues, extant literature, and potential for research avenues with regard to cognitive theories and religion.

There are actually several distinguishable cognitive theories of religious phenomena; they are:

  1. Attribution of Intentionality: Perhaps the most widely influential theory in the current cognitive science approach to religion holds that faith in supernatural beings is a cognitive error that naturally springs from the way the human brain evolved (Boyer 2001, Barrett 2004, Abbott 2003).
  2. Cognitive Consistency: Theories in this sub-category argue that humans have a natural need to form consistent mental models of the world, and thus they will exert effort to resolve any contradiction between two beliefs, or between a belief and a behavior (Festinger et al. 1956, Festinger 1957).
  3. Cognitive Efficiency: This kind of theory postulates that the human mind naturally seeks simple models of reality, and that humans will tend to avoid extreme cognitive effort. This perspective is similar to, but distinguishable from, cognitive consistency theories (Miller 1956, Newell 1990, Boyer 2001, Allport 1954, Bainbridge 1995).
  4. Modes of Memory: Harvey Whitehouse has argued that different styles of religion are based in different parts of the brain, specifically in different memory structures (Whitehouse 2004).
  5. Pragmatic Epistemology: Whereas some theories consider religion to be the result of cognitive errors, this theory argues that religion serves the interests of the individual and thus is true for that very reason (James 1948).

Theories in this broad category assert that each major civilization, and perhaps smaller units as well in prehistoric times and remote regions, has a degree of cultural coherence, often marked by a distinctive religion. When two such civilizations come into contact, they compete, sometimes for several centuries, with resultant religious conflict. Also, it seems likely that every civilization eventually will exhaust its central cultural principles and collapse. Thus these theories tend to concern the rise and fall of civilizations (See Gibbon 1776).

Gibbon suggests that the establishment of Christianity as the state religion was an attempt by Constantine and some of his successors to strengthen the Roman Empire. More recent scholars have argued that Christianity did indeed have a characteristic that made it suitable as a stabilizer of the state and perhaps the civilization in which the state was embedded, namely particularism. By rejecting the truth of alternative religions, it asserted a principle of central authority that could be useful to stabilize the governance of a large society (See O’Donnell 1977).

A more general civilization theory was proposed by Oswald Spengler. Spengler asserted that every great civilization was based on a single idea, and when this idea became exhausted, the civilization would fall. For modern Western civilization the idea was boundless space, and when the Age of Exploration came to an end by the year 1900, the doom of the West approached. Ideas like Spengler’s continue to be popular among politically conservative intellectuals, some of whom consider Christianity to be the central idea of the West (See Burnham 1964, Buchanan 2002).

Among the modern attempts to develop data that could test or refine civilization theories, the World Values Survey stands out. Publications based on it tend to give a mixed picture, with some evidence that major cultural blocs in the world do indeed have somewhat different values, but perhaps not markedly different ones (See Inglehart and Baker 2000).  Samuel Huntington was a well-known advocate of the idea that civilizations are based on competing and incompatible religious traditions, thereby limiting the possibilities for inter-civilization cooperation.


Churches are religious bodies in a relatively low state of tension with their environments.

Sects are religious bodies in a relatively high state of tension with their environments.

The sect-church process concerns the fact that new religious bodies necessarily begin as sects or new religious movements and that, if they are successful in attracting a substantial following, they will, over time, almost inevitably be gradually transformed into churches. That is, successful religious movements will shift their emphasis toward this world and away from the next, moving from high tension with the surrounding socio-cultural environment toward increasingly lower levels of tension. As this occurs, a religious body will become increasingly less able to satisfy members who desire a high-tension version of faith. As discontent grows, these people will become dissatisfied that the group is abandoning its original positions and practices. At some point this growing conflict within the group will erupt in a split, and the faction desiring a return to higher tension will found a new sect. If this movement proves successful, over time it too will be transformed into a church, and once again a split will occur. The result is an endless cycle of sect formation, transformation, schism, and rebirth. The many workings of this cycle account for the countless varieties of each of the major faiths (Finke and Stark 1992:44-45).

The church/sect cycle is proposed as a general theory of change in religious organizations over time.  It is rooted in the work of Max Weber and Ernst Troeltsch, but has recently been taken up by rational choice theorists.
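As a purely illustrative exercise, the toy simulation below treats the cycle described by Finke and Stark as a single quantity: a movement’s tension with its environment, which drifts downward as the movement accommodates to the world and resets when a schism founds a new sect. The drift rate, thresholds, and time scale are arbitrary assumptions made for this sketch, not parameters proposed in the literature.

```python
# Toy illustration of the sect-church cycle. The numeric values are arbitrary
# assumptions made for this sketch, not parameters from Finke and Stark.

import random

def run_cycle(generations=20, drift=0.08, schism_threshold=0.35, seed=1):
    """Track a movement's tension with its environment on a 0-1 scale
    (1.0 = high-tension sect, 0.0 = fully accommodated church)."""
    random.seed(seed)
    tension = 0.9                              # new movements begin as sects
    for gen in range(generations):
        tension -= drift * random.random()     # gradual accommodation
        if tension < schism_threshold:
            print(f"generation {gen:2d}: schism -- a new high-tension sect forms")
            tension = 0.9                      # the cycle begins again
        else:
            print(f"generation {gen:2d}: tension = {tension:.2f}")

run_cycle()
```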
