
Transcending Rigidity: Guideposts for Flourishing in an Unthinking World

Transcending Rigidity Part XIII: The Rise of Anxiety and the Return of Scapegoating

Brandon Cook

Comparison is the thief of joy. 
— Theodore Roosevelt

         In 2013, I watched the story of Justine Sacco unfold in real time. Before boarding a flight to South Africa, Sacco tweeted, “Going to Africa. Hope I don’t get AIDS. Just kidding. I’m white!” When I read the tweet, I thought she was pointing out injustice by making an incisive, albeit inappropriate, joke. It would be much like a white woman in 1960s Alabama saying sarcastically, “I hope I don’t get pulled over…oh, wait: I’m white!” (Unfortunately, I could probably replace “1960s Alabama” with “many places in America today.”) As a joke, it is (in my opinion) in bad taste, satire or not, callous to the reality of AIDS. Nevertheless, the tweet was not racist—not as I read it, anyway. Rather, she was roasting racism itself, pointing out that a woman’s freedom from privation due to her skin color speaks to something disturbing about our world. Later, Sacco said of the tweet, “To me it was so insane of a comment for anyone to make…I thought there was no way that anyone could possibly think it was literal.”[1]

         She misjudged her audience. Rather than receiving the tweet as satire and social commentary, people were outraged (or, at least, “outraged”), either not recognizing its sardonic nature or recognizing it but appalled at the chutzpah it took to make it—or perhaps simply because we live in a culture in which the first to take offense gains leverage and power, whether there is real offense or not. As Sacco flew through the air, #hasjustinelandedyet began trending on Twitter, along with tweets to the effect that “when this lady lands, her life is over.” They were right. Outrage is a contagion and spreads as one. Mobs are ruled by emotions, not by reason or thoughtfulness.[2] A day after she landed, she was fired from her job as a public relations executive (let not the irony be lost on you), her company having little choice but to dismiss her or become an object of abuse in its own right.

         The mobs chanting for her downfall on social media did not care what Sacco meant; she had broken a social taboo. At some point, the outrage escalated from anger at an inappropriate joke to the worst possible caricatures of her intent, to the schadenfreude of watching someone else pay for a mistake. And, because of our brave new technological world, all of this unfolded in real, digital time, a truly twenty-first-century phenomenon. When a culture is chronically anxious and always over-stimulated and “plugged in,” such contagion spreads easily. When we find someone who has broken cultural rules, there is a huge emotional payoff—and the relief of our anxiety—in scapegoating them: in watching them topple and then feeling right, certain, righteous, and justified as they fall. After all, we would never do what they did. We are good and virtuous; we are not like those others. For a while, such thinking makes us feel less anxious. We have staved off, for the moment, our suspicion that perhaps we are not so virtuous.

         Freedom from anxiety is a good thing. But how we deal with anxiety—how we get to that freedom—matters a great deal. Every life (not to mention every system, family, nation, marriage, and organization) carries anxiety, since life by its nature carries concerns. Everyone fears, longs, hopes, and worries about something. Furthermore, every system is, to some degree, an anxious system.[3] Since anxiety is simply a part of life, societies must deal with it; that is, in large part, their function. But how do they deal with it? And how do they deal with it when anxiety reaches critical levels, as it has in America in 2020 (even before the Covid-19 pandemic and before, even, the election of Donald Trump)? While studies document the rise of anxiety, I can rely—and you probably can, too—on anecdotal evidence. In the last few weeks I have had four separate conversations with people who said, “Everything feels like a powder keg waiting to explode” or “Things seem so polarized right now; I’ve never experienced this” or “Things are so anxious and it’s exhausting.” One person told me, “It used to be ‘I’m right and you’re wrong’ and now it’s ‘I’m right and you’re evil.’” Each noted, in their own way, the rise of anxiety in our culture. (Note: I wrote this paragraph before the events of January 2021 and the storming of the U.S. Capitol; you can imagine how elevated these sentiments might be in the aftermath.)

         In this essay, I want to explore two means of dealing with anxiety. Since we are not, on the whole, handling anxiety well as a society, this is a critical exploration. The first means is the scapegoating ritual, an ancient practice; the second is the rise of a morality defined by rule-keeping, a contemporary one. By the end of the essay, we will see that these two phenomena are intimately related.

Mimetic Theory: Finding a Scapegoat 

         In his seminal book I Saw Satan Fall Like Lightning, René Girard describes how human societies function based on mimesis—that is, through imitation. Mimetic theory, in simplest terms, holds that we learn what to desire, in part, by imitating what other people desire, as we compare ourselves to one another.[4] Thus, “Man is the creature who does not know what to desire. He turns to others in order to make up his mind; we desire what others desire because we imitate their desires.”[5] For this reason, humans go through intense seasons of comparison and imitation (junior high school comes to mind), and fashion tends to run in uniform trends that spread from coast to coast. We want something, in part, because other people want it or have it; we rely on seeing what others desire to determine our own desires. We pattern our lives on ideal models and try to find examples we can follow, giving rise to celebrity culture and people who are famous for being famous. This process of imitation often leads to rivalry, since we inevitably end up longing for the same objects of desire. A man will desire a woman, for example, not only because of an intrinsic attraction but because he knows she is desired by others. This dynamic sets up competition and potential violence across societies. Consider the Trojan War as an archetype: Helen serves as the object—the mimetic desire—over which a war is fought. Girard argued that this mimetic impulse in fact provides the basis for all competition in human societies.

          Indeed, we compete for the same jobs, partners, degrees, and careers. Ultimately, this competition creates anxiety in the form of envy, jealousy, and comparison. And this anxiety can escalate into what Girard called “the scapegoat mechanism.” The scapegoat is a random victim who becomes the focus of the growing anxiety generated by rivalry and conflict. If a scapegoat can be found whom rivals agree to blame, anxiety can be expelled through punishment of that scapegoat. It does not matter whether the scapegoat is actually guilty; if he or she is believed to be guilty, the scapegoating mechanism—the punishment or expulsion of the scapegoat from the community—can successfully “carry away” a society’s anxiety and sense of guilt. (One might think of the story of Jonah and the sailors, who wanted to rid their midst of the source of impurity and curse.) In ancient societies, the scapegoating mechanism was a means of controlling violence that could otherwise escalate. Sometimes the victim was human; later, animal sacrifice supplanted human sacrifice.

         Mimetic theory thus helps make sense of the ubiquity of scapegoating rituals throughout early human history, across diverse cultures. The ancient Athenians, for example, at the festival of the Thargelia in spring or early summer, scapegoated a man and a woman considered particularly unattractive. The two were feasted, paraded, beaten, and either driven out of the city or stoned to death. The expulsion of the hideous served to relieve anxiety, perhaps because ridding a group of the externally unattractive quiets the suspicion of internal, individual hideousness. In Israelite society, we read of a similar scapegoating ritual involving not humans but an actual goat: “[The High Priest Aaron] is to lay both hands on the head of the live goat and confess over it all the wickedness and rebellion of the Israelites—all their sins—and put them on the goat’s head. He shall send the goat away into the wilderness in the care of someone appointed for the task.”[6]

         Later, as societies developed, structures were put in place to deal with anxiety in alternate ways, such as evolving religious and legal structures that could curb violence without resorting to a literal scapegoat ritual (though humans have remained, up until now, quite eager to find some scapegoat to “other,” the blaming of whom continues to relieve anxiety). What seems a bizarre ritual to modern ears begins to make sense when we see it as a mechanism for reducing anxiety and conflict.

The Rise of Anxiety

         With its insights into the dynamics of competitive desire across human cultures, mimetic theory also provides perspective on how groups develop social norms in order to act uniformly. Through such norms, behavior that threatens group cohesion is suppressed. For developing societies, establishing norms and taboos is critical, since group cohesion is often necessary not only for cultural development but for physical survival. Societies, then, invariably set group identity against a seen or unseen “other,” so that group identity can be baselined and maintained. Social belonging is most easily attained by setting a “we” against a “them.” Scapegoating functions on the basis of comparison. We construct identities and compare ourselves to others: I am a man, a woman, a Christian, an atheist, a progressive, a liberal, a conservative. Then we find some “other” who is “bad” so that we can, by comparison, secure our belief that we are good. And by aligning with a community that shares our view(s), we bolster our sense of goodness and relieve our anxiety.

         Mimesis establishes a drive toward sameness and inclusion rather than toward being the “other,” the “outsider,” or “the guilty one.” As we desire the same things and shun the same things, our society coheres. So, for example, the ancient Israelites did not mix certain fabrics or eat certain foods, not because there was anything inherently immoral about such behavior, but because these were things the “others”—the Canaanites and the Philistines—did.

         These sorts of norms may seem arbitrary to us, but such purity codes are actually community-building codes; they help hold the tribe together.[7] Even prohibitions which appear arbitrary become moral prohibitions and take on moral character. As moral psychologist Jonathan Haidt says, in most places the social order is a moral order.[8] Indeed, for most of human history in most places, social norms have been moral norms. In many parts of the world, for example, marrying whomever your parents tell you to marry—which may strike 21st-century Westerners as social meddling—is a moral concern, since social stability is a moral matter. The good person listens to their parents and thereby preserves social order, and is seen and rewarded, in turn, for acting morally. Traditionally, then, social context has told us what is moral (that is, what is most beneficial for the group according to its dominant narrative of meaning) and therefore what is good and virtuous. Morality, in this sense, is whatever restrains the ego so that society can function harmoniously. Classical Greek philosophy—certainly in Socrates, Plato, and Aristotle—focused on the individual and in some sense created the eventual basis for Western individualism, but these philosophers never did so at the expense of the community.[9] Indeed, each was concerned with the function of virtue in creating a just, harmonious society within the polis, not with individualism as its own end. Morality has always been most concerned—at least up until our contemporary, post-Enlightenment Western world—with “whatever works” for social functioning.[10]

         This group-oriented, socio-centric approach now has stiff competition, namely in the hyper-individualism of Western liberalism ascendant since the Enlightenment. We are far less quick than our forebears to consider morality in the context of social or group dynamics over and above our “expressive individualism”—that is, our emotivist focus on what “works” for me as an individual.[11] As Alasdair MacIntyre writes, “Both [the ancient polis and the Medieval kingdom] are conceived as communities in which men in company pursue the human good and not merely as—what the modern liberal state takes itself to be—providing the arena in which each individual seeks his or her own private good.”[12] Or, again quoting Haidt: “Only recently has our social order become organized around the individual expression of individual freedom; [individuality] has only recently eclipsed the socio-centric approach.”[13]  

         There is a great tension, then, between our human need for a group morality and our Western philosophical insistence on rampant individualism. And this tension leads to great anxiety. We need group cohesion, but we have been raised and inculcated to prefer an individualism that cuts against the restraints of group membership. Furthermore, this individualism has transformed our very notion of “freedom.” In classical thought, both Greek and Christian, freedom required the imposition of self-restraint in order to develop the virtues. Virtue, in fact, until the modern era, centered on the necessity of self-restraint, and liberty meant “freedom from enslavement to our appetites.”[14] Freedom from our baser passions enables us to live a life of harmony and justice, heeding the better angels of our nature. As Patrick Deneen writes, “Liberty [classically speaking] is the learned capacity to govern one’s self using hybrid higher faculties of reason and spirit through the cultivation of virtue. The condition of doing as one wants is defined in this premodern view as one of slavery in which we are driven by our basest appetites to act against our better nature.”[15] How different our modern notion of freedom is: it is exactly the opposite, namely, to do whatever the hell one wants.[16]

         In fact, the idea of self-restraint is not only downplayed in liberal, democratic societies; it is seen as oppressive.[17] The idea of “conquering self” feels like an archaic ideal from a failed age, reminiscent of an oppressive aristocracy or a legalistic Mother Church. In the post-Enlightenment notion of “freedom,” classical self-restraint is an impediment to individual self-expression; to be free of any shackling impediment to self-expression is itself virtue. In such a milieu, any social pressure to align or restrain becomes an act of violence against one's personal will.

         I recently read a sentence which wonderfully captured this contemporary notion of freedom: “The key to happiness [is] freedom of choice where each individual would be able to choose what kind of lifestyle they want with no legal or other repercussions and enforced dogmas of any kind.”[18] It sounds wonderful, doesn’t it? It is also completely untenable, neglecting the reality that communities must have norms and that social cohesion is critical for human satisfaction and joy. It is these bonds that tie us together and make life meaningful. There can be oppressive bonds, of course (which is no doubt why the quote above resonates), but the notion that happiness can be centered only or even primarily in our unrestrained self-expression is a complete rewriting and undoing of centuries of moral practice—not just in the West but around the world. As Richard Rohr says, “The modern and postmodern world is the first period of history where a large number of people have been allowed to take their private lives and identities seriously. This marks a wonderful movement into individuation, but there is also a diminishment and fragility if that is all we have.”[19] In our society, the individual has become all, and the individual’s ability to choose has become the transcendent good. Our focus is on the expression of the individual, not the community, and our notion of freedom is now centered in our own self-expression. This represents a major change in moral structure.

         This re-ordering of virtue is, as Deneen argues in Why Liberalism Failed, liberalism’s fatal flaw. Our insistence on hyper-individualism and on freedom as the satisfaction of our appetites leaves us dissatisfied and adrift.[20] Thus anxiety rises, as we have no meaningful narratives about life and no social cohesion by which to share them.[21] Which raises a question: since we can no longer turn to scapegoating rituals to quell our anxiety, and since our grand narratives (religious, scientific, et al.) have failed…what do we turn to?

After Virtue: The Rise of Rule-Based “Morality”    

         Not only do we need a moral framework that tells us that we belong; we need one in which we can know that we are good, moral people. We are, as Jonathan Haidt elucidates in The Righteous Mind, wired for righteousness; that is, we are hard-wired to convince ourselves of our own moral goodness. But in the absence of a classical notion of virtue (freedom from our baser passions in order to pursue a given end), we need an alternative moral framework to prove our goodness. In contemporary society, that framework centers on rule-keeping. Indeed, we now have a moral framework largely driven by rules and rule-keeping.

         By rule, I mean a moral precept prescribed as a normative behavior, but of a very specific type. “Don’t kill anyone” is, of course, a moral precept accepted and deeply felt by all (except psychopaths); it is a moral rule and also a universal law. But it is not universal laws of this sort—Ten Commandments-type rules, we might call them—by which we judge each other, since they admit little difference of opinion and, thus, little room for comparison. We might say that a precept is only a rule, as opposed to a law, if it is not universal; that is, rules are matters of debate, social mores not universally accepted. “Don’t say the n-word unless you are black” is a moral rule generally accepted across American society; nevertheless, it is not universally accepted nor deeply felt by all (some conservatives question why black people can use the word and, on the farther right, there are some non-blacks who do use it). Such rules, as opposed to universally accepted laws, provide a basis for moral comparison and thus an opportunity (1) to be included in the group which abides by the rule while also (2) proving that we are good, virtuous people.

         Examples of contemporary moral rule-keeping debates include:

·      White privilege/systemic racism is a core societal issue versus “white privilege/systemic racism” is overblown or doesn’t exist 

·      The patriarchy is at the root of much societal harm versus “patriarchy” is misconstrued

·      The use of gendered versus non-gendered language  

          People on both sides of these issues are fully convinced that they are the moral rule-keepers: not only is their perspective correct, but the rules ensuing from their views are the right moral rules. Increasingly, issues both political and sociological are polarized such that opposing perspectives are made into moral rules by which we can prove our moral goodness. During the Covid-19 pandemic, I have heard the accusation that mask-wearers are “sheep” while people who don’t wear masks are selfish and evil. Between such polarized conclusions, conversation is seldom possible; indeed, in such a matrix, conversation is not the goal. Conversation is shunned in favor of the psychology of competition between two teams. As Jonathan Haidt says, “When people all share values, all share morals, they become a team.”[22] These “teams” are almost always aligned with a conservative bias on one hand and a progressive bias on the other. As a society we are largely no longer concerned with the interplay of conservative and progressive values—both of which are needed in a healthy, functioning society—but with how such values can be put in competition.

         Our contemporary debate, then, is not like the moral debate we might find across history in various times and nations, a battle between conservative and progressive impulses held together by a common will. Increasingly, the focus of our debate is not to reach consensus but to stage the competition by which we compare ourselves to one another. To do this, we need moral rules. And we need rules precisely because we lack an agreed-on moral framework and are therefore looking for some means by which to adjudicate and demonstrate our goodness. Said differently: our rule-keeping arises in the absence of any shared moral framework; we now construct our frameworks, often individually, in the absence of a broader societal narrative.

         Societally, we generally no longer believe (with a nod to Nietzsche and Camus) that there is a telos or any grand narrative that makes life meaningful. We no longer have stories that give us a unified notion of virtue, and thus our moral rules no longer flow from any unified grand narrative. The medieval “divine right of kings” flowed from the narrative of an ordered universe with a divine God at the top; now we believe the universe is chaotic and random. The chivalric code of knights centered on ideals of abnegation, flowing from the Christian narrative of humility and sacrifice; now we believe such narratives and values impede our expressive individualism. Our rules do not flow from any grand narrative, and yet we still attempt to use our values to establish a uniform morality in order to create the social cohesion we so badly lack. But they cannot become universal laws and must remain only rules, deeply debatable and often arbitrary and ambiguous in their application. Further, we divide into teams which become pre-eminent over any grander commitment (to narrative or to nation, for example). This is, in part, why we see such polarization and anxiety across our nation, as individuals take refuge in their preferred political archetype—conservative or progressive—in order to make of it a new grand narrative. Our liberal societies are now focused not so much on the good as on the right. And, more specifically, on who is right.

         The resulting rule-keeping, to be effective, must be dogmatic. Rule-keeping for the religiously rigid has always been dogmatic (“don’t drink, don’t chew, don’t run with girls who do”), since rule-keeping is how you demonstrate you are righteous and part of the tribe. But religion is far from the only domain for dogmatic rule-keeping. We see it on both the left and the right, politically speaking. Political discourse is often no such thing—not thoughtful conversation, certainly; more often, it is an automatic and thus unthinking response to “the other side,” a reflexive insistence on the rules of team red or team blue. Similarly, we do not take the time to think about Justine Sacco’s tweet. We simply intuit that she has broken some rule and that we are to be appalled at the violation. Context does not matter; the rule does. We are on the lookout for anyone who may have broken a rule. As Alasdair MacIntyre writes, “judgment has an indispensable role in the life of the virtuous man which it does not and could not have in, for example, the life of the merely law-abiding or rule-abiding man.”[23]

         Such unthinking dogmatism is the opposite of thoughtfulness. Thoughtfulness requires a consideration of time and place. Indeed, virtue, according to Aristotle, was the practice of “the right thing in the right time in the right place.” This lack of thoughtfulness is the symptom of our era’s great sickness. As Rowan Williams writes, “We suffer from a loss of patience with argument, real mutual persuasion; a loss of the idea that by mutual persuasion and careful argument we might have our minds enlarged to receive more of the truth.”[24] Patience and careful argumentation are the domain of the thoughtful. Rule-keeping requires no such thought.

         Moral rule-keeping, in fact, does not necessarily demand inner virtue at all. In a rule-keeping moral system, the primary goal, ultimately, is to be right, to beat the competition. The inner cultivation of virtue need not factor in at all. And if the goal is to win a competition, it is sufficient to have a semblance or simulacrum of virtue rather than virtue deeply internalized.[25] It is enough to determine who’s good and who’s bad (we are always in the “good” crowd, of course) in order to win the competition. We may know “the things you cannot say” and “the lines you must not cross,” but knowing the pitfalls to avoid does not mean that you are becoming good. Nevertheless, in a rule-keeping moral culture, it is enough to follow the rules or, at least, be seen to follow them. Goodness is no longer a telos we aspire to as the end point of a unified human life; alignment with social mores—the rules of our team—will suffice as goodness.

Rule-Keeping and The Rise of Hypocrisy-Hunting

         The competitive aspect of rule-keeping means that calling out the hypocrisy of others is one of the best ways to “win,” all while safeguarding our own sense of rightness. Rule-keeping thus has us constantly calling out the “sins” of others while ignoring any shortcoming in ourselves or our “team.” To admit weakness, in fact, becomes synonymous with defeat. It is no surprise, then, that calling out hypocrisy has become part and parcel of our modern moral culture.[26] And in a rule-keeping culture, hypocrisy abounds. After all, anytime someone abides by a rule that is not deeply felt in order to appear righteous, this is hypocrisy. Rule-keeping easily devolves into mere appearance-keeping, since rules may be kept out of fear of being seen breaking them and losing social status rather than out of internalized moral conviction. On the left, this leads to political correctness, which is often rightly (pardon the pun) lampooned. When rule-keeping becomes an attempt to out-imitate and out-rival another in the desire to appear moral, there is much to lampoon. In such a scenario, rule-keeping collapses on itself: everybody is simply trying not to get caught. Perception is ultimate and intention is negligible. I have been in many conversations about race where sincere white people were afraid to say anything for fear of saying the wrong thing. There can be no real conversation when people are too worried about breaking a moral rule, even if they are genuinely or sincerely confused (as opposed to being negligent or of ill will). And many of our societal matters of debate are truly complex, an issue we will explore further below.

         A corollary to hypocrisy-hunting is the loud insistence that you are right. Inner harmony can be simulated by external demonstration, often in the call to “cancel” another. The need to vigorously confirm our righteousness has given rise to what is popularly called “cancel culture.” Let me say that certainly some people need to be cancelled—that is, to have their platform removed. But often, platforms are removed simply because an offended party wants to be right and is not willing to consider an opposing point of view.[27]

         This reveals the need for yet another layer of consideration and discovery: in a rule-keeping competition, thinking (in the form of the patient, thoughtful conversation Rowan Williams had in mind) is, in some sense, entirely beside the point. There is no need to investigate our own view—or its flaws or limitations—too deeply, if at all. In fact, in a competition, this is exactly what you do not want to do. Thoughtfulness is not required, while vigorous attack is (as 99% of political ads demonstrate). We are a culture intoxicated with the entertainment value of a good zinger rather than the fruit of steady thoughtfulness.[28] Since our competing tribes (again, largely divided by progressive and conservative biases) are focused on our own rules and the absurdity of “the other’s” view, we can easily ignore any dissonance within our own view. “The other side is bad” is all we need to say, as creatively or, if that fails, as repetitively as we can.

         And now we begin to see that these two "tribes" are not as different as they believe themselves to be. Sure, the content of professed beliefs on left and right may differ (although, as Thomas Cahoone points out, in historical terms we have never been so unified; we are almost uniformly pro-limited government, pro-some form of government regulation, anti-aristocratic, anti-theocracy, anti-fascist, anti-communist, and so on).[29] But the way these beliefs are held is nearly identical on left and right, including radical group identification and allegiance to tribal norms and expectations which come to be taken as moral rules.

         The result is that societal issues never get fully addressed or discussed and therefore go unremedied. Consider: someone committed to using non-gendered pronouns may see their position as the only moral position without accounting for the difficulty of widespread adoption of non-gendered pronouns. Or transgender advocates may ignore broader societal questions, such as how gender is determined for athletic events and who can compete in girls’ and women’s sports. At the same time, those rejecting the use of non-gendered pronouns may see their view as the moral one while failing to account for the very real phenomenon of transgender individuals and the need for a thoughtful, compassionate societal response.[30]

         Let us sum up this Russian doll of phenomena resulting from a rule-keeping culture—a lack of thoughtfulness, a commitment to calling out hypocrisy, the rise of political correctness and cancel culture, and a lack of self-critique of our own political views—by saying that morality driven by rule-keeping suffers a unique blindness. The true purpose of rule-keeping is hidden from those who are captured by it. Moral rule-keeping often creates the feeling that it is about being good, virtuous people, when its deeper and actual purpose—quelling our anxiety by convincing us that we are the good ones—remains hidden. This is the failure of morality by rule-keeping. Rule-keeping is an idol which always demands more sacrifice: ever greater levels of virtue signaling, outrage, hypocrisy-hunting, and blindness to our own weaknesses. All the while, we may be blind to the fact that we are blind at all.

The Failure of Rules

         Rules used to enforce a moral system easily degenerate. Rather than providing us with a new moral code by which to enjoy social cohesion and proof that we are good, moral people, the rules become tools by which we segregate into tribes, which demand the exclusion of an “other.” Such exclusion provides a sense of belonging, since we are not on the outside with those non-rule-keepers, and a sense of righteousness, since we are not like “them.” Once this gambit is accepted, we come to need the polarization of our society, since it always gives us an “other” against which to vent our anxiety. “Every Republican who voted for Trump is evil.” “Those godless, baby-killing Democrats who want to destroy America.” Note that the subtle, though now rapid, increase in incensed rhetoric becomes unstoppable as our frameworks and worldviews are polarized against each other. In the past, it was possible to have secondary and tertiary points of disagreement while remaining in conversation; but since every point of disagreement is now part of the moral matrix of group rule-keeping, every minor point of disagreement can become a hill worth dying on—or at least an opportunity to berate the other side on Twitter. This dynamic applies to conservatives and progressives alike. A poll released before the 2020 U.S. Presidential election found that in both parties, over 70% of respondents believed that "If the wrong candidate wins this election, America will not recover."[31] This response is indicative of a certainty which reveals how deeply we have come to need an “other” to despise in order to prove our own rightness and certainty. We no longer need a broader society, in fact; we just need our tribe.[32]

         We do not recognize what is happening, as the blindness intrinsic to a rule-keeping morality begins to take our own sight. We insist that red or blue or progressive or conservative is right and that others are not only wrong but evil.[33] We need not investigate the flaws in our own views too deeply. We come to believe any news source that backs up our point of view is authentic, while everything else is “fake news.” Since we now have the technology—via the internet and highly niched news services with which we can align and thereby confirm our every opinion—we can be “right.”[34] And we form tribes that do not require our thoughtfulness but only our allegiance. Then complex matters of disagreement become the simplistic basis for demonization and exclusion, as we militarize our opinions to demonstrate how the “other” is bad and wrong. This either/or thinking affects everything we come to do. It affects the way we see others. It means we are always looking for a scapegoat. 

         Which brings us back to Justine Sacco.   

         Justine Sacco became a scapegoat, an object to be sacrificed to relieve us of our own cultural anxiety. I said that mimetic theory and morality by rule-keeping were deeply related, and no doubt by now it is clear why. Rule-keeping morality, now ascendant in the U.S., is mimetic theory par excellence. Rule-keeping is the comparison method which constantly demands an “other” to be found and scapegoated. The ancient scapegoating mechanism and contemporary rule-keeping are mechanisms for the same thing, namely, the relief of our anxiety. The scapegoat mechanism is re-emerging, and the volatility of social media is its greatest catalyst. 

         Sacco was not unvirtuous, yet she broke a societal rule. As the scapegoat mechanism continues, we will come to need a new scapegoat almost every day, until at some point the entire sacrificial system will collapse under its own weight.      

The Descent into Violence

         And what results when the system collapses? Let us remember that scapegoating is, ultimately, an expression of violence done to an other. 

         Most Americans were aghast at the storming of the U.S. Capitol in January 2021 and the loss of life that ensued. But we should not be surprised. Mimetic contagion, the rise of anxiety, and the adoption of the scapegoating mechanism always lead to a proliferation of violence across society. And while the events at the Capitol can be seen as a reaction on the right, the left is not immune to the violence intrinsic in a scapegoating culture. In fact, comparison and rule-keeping as catalysts to mimetic contagion often proliferate on the left over and above the right. Despite our culture’s supposed allegiance to the creed of “tolerance,” a progressive value of the left, we are often anything but tolerant. Or, more accurately, we are tolerant only within our own moral frameworks. As we look to the future, as things stand, we should expect the rise of scapegoating to lead, inexorably, to increased violence across our society.

         Is there a way out? Moral discourse, as opposed to moral rule-keeping, of course requires engagement with an opposing point of view. But discourse is increasingly scarce in an anxious society which has learned to deal with anxiety by scapegoating an “other.” Yet our salvation—at the risk of using such a weighty term—lies specifically in learning to re-engage in discourse with curiosity and thoughtfulness. In the next essays, we will begin mapping a course to just such an engagement. 

 


[1] See ‘Justine Sacco, the PR exec who was fired from IAC for her tweets, has landed back at IAC’s Match Group’ in Vox. 1/19/18 https://www.vox.com/2018/1/19/16911074/justine-sacco-iac-match-group-return-tweet [Accessed 1/3/21]. Sacco’s full comment: “To me it was so insane of a comment for anyone to make….I thought there was no way that anyone could possibly think it was literal….Unfortunately, I am not a character on ‘South Park’ or a comedian, so I had no business commenting on the epidemic in such a politically incorrect manner on a public platform...To put it simply, I wasn’t trying to raise awareness of AIDS or piss off the world or ruin my life. Living in America puts us in a bit of a bubble when it comes to what is going on in the third world. I was making fun of that bubble.”

[2] “A group experience takes place on a lower level of consciousness than the experience of an individual. This is due to the fact that, when many people gather together to share one common emotion, the total psyche emerging from the group is below the level of the individual psyche. If it is a very large group, the collective psyche will be more like the psyche of an animal, which is the reason why the ethical attitude of large organizations is always doubtful. The psychology of a large crowd inevitably sinks to the level of mob psychology. If, therefore, I have a so-called collective experience as a member of a group, it takes place on a lower level of consciousness than if I had the experience by myself alone.” Jung, C.G. The Archetypes and the Collective Unconscious. Princeton University Press. 1969. Page 125.

[3] See Failure of Nerve: Leadership in the Age of the Quick Fix by Edwin Friedman. Church Publishing. New York, NY. 2017.

[4] Another Frenchman, Jean-Jacques Rousseau, had previously noted how his contemporary society, as opposed to humans in a supposed “state of nature,” functioned based on competition and comparison.

[5]  René Girard qtd. in “Generative Scapegoating” by Robert G. Hammerton-Kelly, ed. Violent Origins: Walter Burkert, René Girard, and Jonathan Z. Smith on Ritual Killing and Cultural Formation, Stanford University Press. 1988. p. 122

[6] Leviticus 16:21. New International Version.

[7] Often the more austere the group social expectations are (that is, the requirements for belonging), the more cohesive the bond towards one’s own tribe and, therefore, the greater the fear of exclusion. This simultaneously produces a more deeply embedded suspicion and judgment of those outside of one’s group. Behavior solidifying tribal membership takes on moral character, as do its social prohibitions.

[8] Haidt, Jonathan. The Righteous Mind: Why Good People Are Divided by Politics and Religion. Vintage. February, 2013. Page 20.

[9] In Aristotle, for example, one could only become moral and virtuous in relationship to a community which included both intimate friends and a larger polis.

[10] There is an old Arab Bedouin saying: “I, against my brothers. I and my brothers against my cousins. I and my brothers and my cousins against the world. That is jungle law.” (Nafisa Haji) This view of the world and of morality, rooted in kinship, in which group identity is the primary means of knowing who you are, puts the individual’s happiness second. Only in our post-Industrial Revolution, post-Enlightenment world have these kinship structures been supplanted and has the individual’s happiness been placed before group membership and kinship bonds.

[11] Deneen, Patrick J. Why Liberalism Failed. Yale University Press. January 2018. Page 122.  For more on emotivist culture, see ‘Essay VII’ in this series.

[12] MacIntyre, Alasdair. After Virtue: A Study in Moral Theory. University of Notre Dame Press. Notre Dame, IN. March, 2007. Page 172. See also page 195: “For liberal individualism, a community is simply an arena in which individuals each pursue their own self-chosen conception of the good life, and political institutions exist to provide that degree of order which makes such self-determined activity possible.” 

[13] Jonathan Haidt in an interview with Jonathan Sacks. Morality in the 21st Century Podcast. “Episode 8: Jonathan Haidt.” September 3, 2018. Consider also this comment from MacIntyre: “Hence we lack, as [the ancient Greeks] did not, any public, generally shared communal mode either for representing political conflict or for putting our politics to the philosophical question.” (MacIntyre, 138) In ancient Athens, in other words (and contra our world), morality was—far from being individualistic—very much socio-centric. 

[14] Consider Isaiah Berlin’s two notions of freedom: “negative liberty” is the absence of constraints. “Positive liberty,” on the other hand, means realizing one’s purpose and potential and involves, therefore, far more than freedom from constraint. 

[15] Deneen, 113.

[16] As Deneen puts it: liberty is now “the agent’s ability to do whatever he likes…Modern theory defines liberty as the greatest possible pursuit and satisfaction of the appetites, while government is a conventional and unnatural obstacle to this pursuit. [Ancient theory understood] liberty to be achieved only through virtuous self-government.” Page 48. 

[17] Hegel had noted, after the French Revolution, that the notion of freedom championed in the Revolution was incomplete. Its notion of self was atomic, individual, and nothing but negation. That is, the ideals of the Revolution and, specifically, its notion of liberty would create a self who sees the world as limiting its soul’s freedom and thus, in the words of Lawrence Cahoone, “can only deal with the outside world by trying to destroy it.” See The Modern Political Tradition: Hobbes to Habermas by Lawrence Cahoone. ‘Lecture 13: Civil Society: Constant, Hegel, Tocqueville.’ The Great Courses. May 2014.

[18] ‘These Unusual Sexual Practices Will Set You Free’ by Claire Divino in Medium.com, 9/14/20 https://medium.com/sexography/these-social-norms-will-set-you-free-7ebf7f22e01f [Accessed 2/5/21] David Brooks summarizes and critiques this notion of freedom: “When you look back on it from the vantage of 2020, moral freedom, like the other dominant values of the time, contained within it a core assumption: If everybody does their own thing, then everything will work out for everybody. If everybody pursues their own economic self-interest, then the economy will thrive for all. If everybody chooses their own family style, then children will prosper. If each individual chooses his or her own moral code, then people will still feel solidarity with one another and be decent to one another. This was an ideology of maximum freedom and minimum sacrifice.” See ‘America is Having a Moral Convulsion’ by David Brooks in The Atlantic. 10/5/20 https://www.theatlantic.com/ideas/archive/2020/10/collapsing-levels-trust-are-devastating-america/616581/ [Accessed 1/21/21] As a side note, this political ideal of maximum individual freedom has an economic concomitant: Adam Smith had posited that consuming more than you need was a moral good, forming an economic corollary to “expressive individualism.”

[19] Rohr, Richard. “My Story” in Richard Rohr’s Daily Meditation. January 25th, 2021.  

[20] See footnote 14. Alexis de Tocqueville saw democracy itself as an atomizing, individualizing order driving people into isolation and outside previous social structures, bonds, and institutions. 

[21] Which, it should be noted, creates a vacuum which may be filled by atrocious narratives, such as—an example from our contemporary world—white supremacy. 

[22] Haidt, Jonathan. “The Moral Roots of Liberals and Conservatives,” Ted Talk. https://www.ted.com/talks/jonathan_haidt_the_moral_roots_of_liberals_and_conservatives/transcript?language=en#t-187811 [Accessed 2/12/21]  Haidt goes on to say that such engagement within the psychology of competition “shuts down open-minded thinking.”

[23] MacIntyre, 154.

[24] Williams, Rowan. Being Disciples: Essentials of the Christian Life. Eerdmans. 2016. Pages 24-25. We are witnessing the opposite of what Kant called enlarged thought, the consideration of another or, in this case, the thought of another.

[25] In classical notions of virtue, virtue could only be communicated externally if it was enjoyed internally. “The activity achieved and the activity enjoyed are one and the same state.” (MacIntyre, 197). In a rule-keeping morality, no such internalization is necessary. In a rule-keeping morality, no such internalization may be possible. 

[26] Consider the rise of the “canned apology” or what we might call the paint-by-numbers apology (“I’m very sorry to all I offended, this does not represent who I am,” et cetera). The apology, in an attempt to prove we are not hypocrites, has come to take on a formal structure which undercuts sincerity. At the same time, such apologies are never good enough for the crowd which senses the insincerity undergirding what has essentially become a society-wide rite or ritual, even if the person apologizing is genuinely remorseful. 

[27] This can be a function of Girard’s scapegoating mechanism: collectively arguing for the silencing of another functions as a scapegoating mechanism.

[28] As Neil Postman continually warned.

[29] Cahoone, ‘Lecture 36: Why Political Philosophy Matters.’

[30] The debate over personal pronouns, in particular, is a flashpoint in contemporary moral rule-keeping debates, no doubt because it is connected to sex and sexuality, which connects to our deepest desire for love and acceptance. The debate was perhaps made most prominent by psychologist Jordan Peterson’s refusal to use non-gendered pronouns (on the grounds that codification of speech under the law is a slippery slope towards totalitarianism). To use non-gendered pronouns is, to some, a deeply felt moral end and thus a societal rule that should be instantiated. To others, like Peterson, resistance to the legislation of such speech is a moral end (though on a personal level, they may have no issue with using non-gendered pronouns in individual relationships). Thus, moral debate is deeply charged and moral incommensurability—the inability to confidently compare moral ends (or even recognize that there are moral ends in competition)—is rife.

[31] From the Braver Angels YouGov Poll, October 22-23, 2020. https://braverangels.org/yougov-poll-results-2/

[32] Philosophically speaking, once our notion of freedom is merely Berlin’s “negative freedom”—that is, freedom from restraint—then we are free to get our way and have our unchallenged opinions or processes go painfully awry. To have our view challenged comes to feel like a violation of our freedom. 

[33] To say we are inclined towards dehumanizing “the other” does not mean that all moral issues are incommensurable or that all moral views are equal. There was great debate about civil rights in America in the 1960s, but only a bigot would now say that MLK and the Civil Rights movement were not moral goods. There often is a “right side to history.” The point of this essay is rather that most of our societal debates require a synthesis of discourse from both left and right.  

[34] "[Social media platforms] plainly encourage the vices most dangerous to a free society. They drive us to speak without listening, to approach others confrontationally rather than graciously, to spread conspiracies and rumours, to dismiss and ignore what we would rather not hear, to make the private public, to oversimplify a complex world, to react to one another much too quickly and curtly. They eat away at our capacity for patient toleration, our decorum, our forbearance, our restraint." Levin, Yuval. A Time to Build: From Family and Community to Congress and the Campus, How Recommitting to Our Institutions Can Revive the American Dream. Basic Books. 2020.

Transcending Rigidity Part VII: Moral Incommensurability and the Rise of Emotivist Culture

Brandon Cook

Disagreement About Moral Ends 

         Disagreement is not necessarily confusion. We can have a disagreement and yet be very clear about the terms of our disagreement—what we agree about in substance, and how our views differ. Confusion sets in when we are not sure what the disagreement is about, what you’re saying, or what I’m saying.

         Confusion also sets in when we are arguing with different assumptions about what is good, right, or meaningful. Consider, for example, this real-life anecdote: a self-described pro-lifer and pro-choicer walk into a bar for a civil conversation. (Too outlandish? Just humor me.) The pro-lifer states her case on the sacredness and sanctity of life; she has credible scientific evidence that life begins long before birth. The pro-choicer stakes her argument on the inviolable right of a woman to choose, which is rooted in the individual liberty core to our entire system of political thought. Both individuals, then, are arguing from a strong moral end. And both have different assumptions about which end is most meaningful. How does one resolve such a disagreement and come to a moral understanding of abortion when both participants are engaging from both reasoned and deeply felt passions? In fact, when two parties argue for a moral good with no means of prioritizing the goods in play, there is no way to resolve such a debate.

         Perhaps a simpler analogy will make the problem clearer: a couple gets into an argument about why the husband didn’t help the wife get their children out the door in the morning. “I was busy sweeping the floor,” the husband says. “Well, next time, please help me get the children out the door,” she replies. The issue at play here, also, is different assumptions about what is most important and what should have priority. In the husband’s mind, clearly, cleaning the floor was the more important end; in the wife’s, helping with the kids. Confusion reigns until these ends can be located, discussed, and prioritized. But imagine (silly as it might sound) that, as in the argument about abortion, the husband and wife can never arrive at a consensus about which end is most important and that neither is willing to budge. The argument would become intractable; just as we can’t resolve a domestic argument about cleaning the floor if we can’t establish a priority of ends, so we can’t have moral conversation if we have confusion about what ends are most important.[1]

         Every human being, as is illustrated in the above examples, holds moral ends—some good to which they aspire.[2] These goods may be nothing more than the pursuit of their own satisfaction or pleasure, but there is always some end pursued, consciously or unconsciously. As David Bentley Hart writes, “We act always toward an end that we desire, either morally, effectively, or pathologically.”[3] This ability to pursue an end is the freedom of our humanity. As Richard Niebuhr says, “The freedom of man appears…[as] the necessity of self-determination by final causes: [a man’s] practical reason appears as his ability to distinguish between inclusive and exclusive, immediate and ultimate ends and to relate means to ends.”[4] In other words, humans are those beings which are free because they can identify and chart a course towards some desired goal, which by necessity requires the prioritization of some ends over others.

         Abraham Maslow’s hierarchy of needs (see ‘Essay 6’ in this series) is an example of prioritizing moral ends. Maslow’s distinction between actualization and transcendence represents an attempt to name a felt good and draw a priority distinction; he placed transcendence above actualization. But imagine if someone said, “No, giving ourselves to something bigger than ourselves (transcendence) is not as important as our own self-expression (actualization).”[5] How would we resolve such a debate? We can appeal to common sense and basic human sensibilities or point to the outcomes of various philosophies of living, but there is little way to prove which end is higher. To do so, we would have to have some agreed upon end (in Greek, the word is telos, meaning an ultimate aim or endpoint) constituting the highest endpoint of a human life. We would have to have an agreement about what “the good life” is, philosophically speaking. But in moral debates, we almost always arrive at a place where we cannot logically resolve disagreement because we prioritize moral ends differently (a subject I will explore in future essays). Often, disputation takes place with growing frustration because neither side knows how to articulate this discrepancy in moral assumptions.

         In his magnum opus After Virtue: A Study in Moral Theory, Alasdair MacIntyre calls situations in which we have no means of prioritizing moral ends and resolving moral debate instances of “moral incommensurability.” Moral incommensurability is the inability to find an evaluative approach by which to measure moral ends against each other.[6] When our ends differ and we have no means to bridge the gap between my perception and yours, neither rhetorical flourish nor passionate insistence is able—generally speaking—to convince another or bridge the gap between different points of view. We have no means of resolving such moral incommensurability because “there is no rational way of deciding which type of claim is to be given priority or how one is to be weighted against the other.”[7] 

         This situation is the default experience in our culture. The orientation of almost all societal discourse is now rooted in moral incommensurability. Debates on social media exemplify this dynamic (to call them “debates” is a stretch, admittedly), but the dynamic is everywhere. We have little means of determining which ends are higher ends and why. The end result is chronic anxiety and a deterioration of discourse patterns. That is, we are losing our ability to talk about important matters, and we are increasingly anxious because of it.

No Guiding Narratives—No Big Deal?

         In the last two essays, I stated that what makes our historical era, post-Enlightenment, unique is the diminution or outright loss of guiding narratives—the stories we tell ourselves to make the world and our place in it meaningful. Moral incommensurability flourishes in such a world. We once believed that hierarchical religion and aristocracy would order the world and save our souls (the medieval worldview); then we believed that science would save us (Modernism, via the Enlightenment). So we no longer have a medieval worldview, with a clear narrative about God and aristocratic hierarchy; and neither do we have a Modern worldview, in which we trust science and reason to lead us into a bright and better future. Is this such a bad thing? Moral incommensurability was lessened under these guiding narratives, which made for less debate and, in some sense, greater clarity. But those narratives were faulty or incomplete, if based on no further evidence than that they stopped working and we no longer trust them.[8] If postmodernity is pessimism about finding any transcendent truth, and thus any guiding narrative…so what of it? If we enact virtue—if we “try to be good people,” to put it simply—for no other reason than it’s a practical, materialistic mandate (an evolutionary injunction, as it were) and with little other narrative or imperative behind it…does it really matter?

         After all, guiding narratives are not necessarily holy or good. They can be abused and manipulated for all sorts of reasons. The Christian narrative of “going forth to make disciples of all nations” was abused to forcibly baptize—along with rapine and pillaging—countless thousands, in both the Medieval period and in the Age of Reason, with the Crusades and the “Christian” conquest of the New World. Shouldn’t grand ideals, when they can be so easily manipulated and abused, be avoided?

         But it is not our lack of guiding narratives alone that is troubling, but the concomitant inability—always on the rise, so it seems—to have any civil conversation or disagreement at all, whether about moral ends, political opinions, or anything else. When great guiding narratives—such as those of Christendom or the Age of Reason—fail, we are less likely to put much stock in any guiding narrative. This bolsters moral incommensurability, giving rise to greater levels of debate. But if there is no means of having that debate—if people lack the skills or stomach to thoughtfully engage such debate, for example—chaos can be the only result. We are witnessing the rise, in fact, of a chaos-inducing pattern of discourse that follows this basic pattern:

(1) I am right. Certainty always masquerades as a sort of “righteousness.” It serves an emotional need by convincing one that “I am right and therefore I am good.”

(2) I can prove I’m right. I can confirm, through a media source, scriptural text, or other authority, my rightness. Our ability to confirm our biases has never been so readily available, given the proliferation of “news” sources so easily customized to fit our biases.

(3) You are wrong. Obviously, since I am right.

(4) Not only are you wrong, you are evil. This is the dehumanization of those holding a counter point of view. When under threat, dehumanization allows us to protect our viewpoint, thus allowing us to continue securing the emotional benefits of rightness we so crave. People are willing to defend their certainty to the hilt because what’s really at stake is their emotional security.

As you can see (and have no doubt experienced yourself), this pattern of thinking leaves no space at all for disagreement, let alone debate. This is a major problem, since vibrant debate—about guiding narratives, moral ends, political opinions, or anything else—is healthy. The alternative, after all, is totalitarianism, politically and intellectually. When we lose the ability to debate, to converse, to hold opposing tensions without de-humanizing each other, we are in uncharted, dangerous territory.[9] We are in the midst of a new moral culture.

The Rise of Emotivist Culture

         So what has changed? There has always been moral disagreement; that is nothing new. Why is moral incommensurability—marked by our increasing inability to have civil conversations outside our predilections and preferences—such a growing feature of our society? It is helpful to look to the past for comparison. Simply put, what moral cultures before us—Ancient, Medieval, and Modern—had in common was a belief that human beings had a function in the world. Philosophical thought was grounded in the notion that humanity has a telos, an ultimate end or aim. This formed the basis of moral conversation. In Aristotelian thought, for example, an object could be defined in part by its functional purpose, its telos.[10] And just as you could describe an object as good if it met its functional end (a walking stick being good because it fulfilled the purpose of a walking stick, or a statue being good because it fulfilled its end as an object of beauty in a garden, for example), so you could define a man by the pursuit of functional ends. Thus, you could take steps to define what a good man or woman was. Of course, there was vigorous debate about what the function or endpoint of humans is, but the belief was always that there are certain transcendent ends to which we can aspire. The universe was, in turn, conceived of as meaningful. In the ancient world, the cosmos was, in Stoic thought for example, guided by the Logos, a principle that maintained order and harmony in the universe. In Medieval Europe, the world was perceived to be guided by the divine right of kings, under God. In Enlightenment Europe, a man was defined as good if he lived according to rational ends. Thus, a Roman man was good by his ability to live harmoniously with nature, accepting his place within the patriarchy, just as a Medieval man was good by virtue of his faithful piety, just as an Enlightened modern was good through his use of reason. Up until the Enlightenment, then, the world was conceived of as meaningful and orderly, such that men and women could find meaningful ends—their own telos—within it.

         Now, by and large, we don’t believe the universe is ordered and meaningful, nor that there are any meaningful ends for us to pursue. We have no societal center. We have no guiding narrative. Where there are ends, they are the result of our need to construct ends for our own sanity, not because there is any transcendent reality behind them. Rules for moral life become a necessary arbitrariness. Now “meaning” is construed as a necessity simply to stave off chaos. There is largely no shared conception of what human beings are and thus, what our moral ends are. And since there is no notion of telos, there is little ability to discuss “the good life.” As MacIntyre writes, “Questions about the good life for man or the ends of human life are to be regarded from the public standpoint as systematically unsettlable.”[11] Before Modernity, to make progress in life was to move towards a given telos. We are now moral agents without a telos, without a function. “Each moral agent now [speaks] unconstrained by the externalities of divine law, natural teleology, or hierarchical authority.”[12] In other words, we have lost the sense that humanity has any function in the world at all.[13] In such a situation, how does one begin to have moral conversation?        

         At the same time, it’s important to note that we do not necessarily construe a loss of telos as a bad thing, given that we often perceived (and perceive) such ends as coming from restricting authorities. Indeed, many experience the end of ends as a liberation. Says MacIntyre, “Many of those who lived through this change in our predecessor culture [the loss of traditional narratives of meaning, such as the ancient or medieval worldviews described above] saw it as a deliverance both from the burdens of traditional theism and the confusions of teleological modes of thought.”[14] In other words, moral ends were associated with restrictive institutions and authorities—God or aristocracy, for example—and to be free of those ends is to be released from a burden. (One can imagine a child finally getting the keys to a car and breaking out of the constraining restrictions of life under mom and dad.) “Modernism,” says MacIntyre, is characterized to some degree by the loss of any claim to our duty or function, and this is often considered a gain, "as the emergence of the individual freed on the one hand from the social bonds of those constraining hierarchies which the modern world rejected at its birth and on the other hand from what modernity has taken to be the superstitions of teleology."[15]

         We are in the “active nihilism” that German philosopher Friedrich Nietzsche predicted. Philosophy since the Enlightenment has been a deconstruction of what we had believed but no longer can bring ourselves to believe—the Judeo-Christian narrative of a transcendent, personal, Creator God, for example, who promises life in a world to come, and so forth. Nietzsche predicted that a serious bloodletting and a battle of philosophical worldviews would lead to a new world in which supermen who embodied new values would emerge.

         Indeed, we are now trying to fill the void created by the loss of moral frameworks and agreed upon ends. In such an environment, any end can be king. What increasingly fills the void of our inability to dialogue is strong sentiment. Moral discourse becomes subject to emotional feeling.[16] This results in what MacIntyre calls an “emotivist culture” in which personal sentiment, as opposed to any agreed upon moral narrative, reigns. In an emotivist culture, authority is no longer vested in moral narratives, but in feeling. In such a mode, we are doubling down on individualism: what I feel, because it feels good (generally speaking, because it makes me feel right), must be right.[17] This brings us full circle: in such a situation, again quoting MacIntyre, “there is no rational way of deciding which type of claim is to be given priority or how one is to be weighted against the other.”[18] Emotivism and moral incommensurability go hand-in-hand. To put it in the language of our last essay: “Whatever I believe is just me doing me, and how dare you question that?” Once this sort of statement is accepted, there is no further basis for civil conversation. Disagreement becomes conflict. Conflict is taken as abuse. Opposing opinion becomes an attack. And the possibility of civil discourse is broken.

         So, on one hand, the loss of guiding narratives is a type of freedom, sure; but we are also in confusion. And the danger beyond the loss of moral frameworks is the increasing incivility which undermines even having a conversation, rendering discussion nearly impossible when we hold different moral assumptions. It’s not, in other words, that we have different opinions, but that, increasingly, we have no means to talk about those disagreements. The alternative is to conclude that we see the world clearly and are right and righteous. Moral incommensurability and emotivist culture, then, buttress the discourse outlined above: (1) I am right, (2) I can prove I’m right, (3) You are wrong, (4) Not only are you wrong, you are evil.

Emotivist Culture and The Loss of Civil Discourse

         The end result of an emotivist culture is confusion. Specifically, the confusion of confronting “one contingent arbitrariness against another.”[19] And the headwaters of emotivism play out downstream. Settled facts—indeed, the notion of any fact at all—become debatable. President Trump (an exemplar of emotivist culture par excellence) can say that his inauguration set attendance records despite photographic evidence because…well, because he strongly wishes it to be so. This sort of confusion also plays out in moral debate, as in the imagined abortion argument above, and in public and political debate of every sort. We live in an emotivist culture, with dizzying effects and amplified confusion, and such confusion undermines moral conversation. In addition to the confusions outlined in Essay 5 (“What Story Are We In?”), here are further confusions resulting from an emotivist culture:

         I. Loss of Discourse Patterns

         “Speaking your truth” (a cultural corollary of “you do you”) for authenticity’s sake is a valuable discourse pattern. Speaking honestly is generally far better than being false-faced. But if this is your only discourse pattern, you have annihilated the basis for moral conversation. If you believe something is true because you deeply wish it to be so, you have obviated the possibility of rational conversation. Further, once you have made the move that something is moral or praiseworthy because it is authentic or because you feel it strongly, then any moral narrative is permissible. A white supremacist can make a strong moral argument on these terms, in that his racism is deeply felt and is only, after all, “my experience.” When we pit the morality of an individual’s personal preference against a collective “we,” there is little space for thoughtful dialogue. Indeed, in such a situation, any narrative which “feels good,” even if it is a twisted narrative, can be justified. A dignity culture—by which I mean a culture which appeals to legal bodies to resolve disputes, rather than to direct retaliation such as dueling in an honor culture—has traditionally used public shaming to curb outrageous moral narratives. But if we are entering a new moral culture, as sociologists Bradley Campbell and Jason Manning posit, then curbing twisted narratives may be harder work moving forward.[20] The rise of White Nationalism is, again, one such example.

         Once rational dialogue is subsumed by emotivism, all sorts of evils are let out of Pandora’s box. Once I locate moral ends in my own self, based on my feelings alone, curiosity is interrupted and, over time, transformative modes of discourse are undercut. Soon, there’s no need to engage in thoughtful dialogue with anyone at all.

         II. Loss of Facts 

         In an emotivist culture, facts become increasingly meaningless. Facts that are dissonant with our preferences may be dismissed. Meanwhile, the rise of infotainment—statistics, information, and news media tailored to our preferences and emotional reactions—supplies us with armament to justify our preferences. How many times have you seen a fact rebutted on social media by a specious yet hard-to-debunk counterclaim, backed by some quasi-reputable science or media outlet? Or consider how this dynamic plays out politically: the Age of Trump is marked by the mainstreaming of ad hominem attacks over any debate of substance and the seeming disregard of facts altogether.[21] This is a serious matter. Such individuated emotivism loosens the bonds that hold us together.

         III. Rise of Anxiety

         The loss of thoughtful moral dialogue, unending stimulation via entertainment-driven news media and social media, and the loss of facts as hard realities lead, not surprisingly, to ongoing anxiety, ennui, languor, meaninglessness, and purposelessness. We are what Edwin Friedman calls “chronically anxious.”[22] Chronic anxiety creates a vicious loop; it reduces the capacity to be thoughtful, which is the very faculty that could undercut emotivist culture.[23]

Looking Forward

         The rise of moral incommensurability and emotivist culture makes it hard to progress to the spiritual maturity of Stage 4 (see the first two essays in this series). And it’s hard to see a way out of our confusion. But any path forward must start with understanding where we are. Before we chart that direction, then, let us further understand what emotivist culture creates, so that we can plot our course more surely. In the next essay, we will explore the moral superiority, virtue signaling, and dehumanizing which become part and parcel of an emotivist culture.

 


[1] Of course, in the floor cleaning scenario, one partner is very likely to “give in” or compromise, for the sake of domestic tranquility. In many moral debates, however, since passions are so deeply felt, neither side is willing to give in.

[2] “The ends to which men as members of such a species move are conceived by them as goods.” MacIntyre, Alasdair. After Virtue: A Study in Moral Theory. University of Notre Dame Press. Notre Dame, IN. Page 82.

[3] Hart, David Bentley. That All Shall Be Saved: Heaven, Hell, and Universal Salvation. Yale University Press. September, 2019. Page 42.

[4] Niebuhr, H. Richard. The Responsible Self: An Essay in Christian Moral Philosophy. Harper and Row Publishers. New York, NY. 1963. Page 51.

[5] This may seem a silly example, but consider Ayn Rand’s philosophy of objectivism, which does not draw a distinction between self-expression as an experience of one’s own happiness and transcendence.

[6] MacIntyre. See Chapter 6 and page 70, in particular.

[7] MacIntyre, 70. MacIntyre is talking, in the above sentence, about claims of utility made by Jeremy Bentham and rights over against traditional concepts of justice; nevertheless, we could just as easily talk about contemporary matters.

[8] Ferry, Luc. A Brief History of Thought: A Philosophical Guide to Living. Harper Perennial. 2011. Pages 212-214. And leaving aside the fact that they often allowed gross excesses or abuses.

[9] While we can’t hearken back to some non-existent ideal age, we can, nevertheless, look with serious concern at the future, questioning whether our moral advancement will continue. As Edwin Friedman says in Failure of Nerve, societies can continue to progress materially long after they have reached their moral and ideological zeniths and are, in fact, in decline. See Failure of Nerve: Leadership in the Age of the Quick Fix by Edwin Friedman. Church Publishing. New York, NY. 2017. See “Chapter I: Imaginative Gridlock and the Spirit of Adventure.”

[10] Among Aristotle’s four causes, this is the “final cause.”

[11] MacIntyre, 119.

[12] Ibid, 68.

[13] MacIntyre, 59-60.

[14] Ibid, 60.

[15] Ibid, 34.

[16] Again, to quote Jonathan Sacks: “We [have moved] from a world of ‘We’ to one of ‘I’, the private pursuit of personal desire.” Sacks, Jonathan in Covenant and Conversations: Life-Changing Ideas in the Parsha. “Making Space.” March 7, 2018.

[17] The collapsing of “we” to “I” was accomplished on a philosophical basis by the new empiricism which arose in the 17th and 18th centuries, “…by making every experiencing subject a closed realm; there is nothing beyond my experience for me to compare my experience with, so that the contrast between seems to me and is in fact can never be formulated.” MacIntyre, 80.

[18] MacIntyre, 70. In this sentence, MacIntyre is discussing claims of utility made by Jeremy Bentham as well as rights versus traditional concepts of justice, but he could just as easily be discussing contemporary matters.

[19] MacIntyre, 33.

[20] See “Microaggression and Moral Culture” by Bradley Campbell and Jason Manning. Comparative Sociology. Volume 13. 2014. Pages 692-726.

[21] Again, President Trump claimed, for example, that 1.5 million people came to his inauguration. See “Here’s What the Evidence Shows” by Timothy B. Lee in Vox.com, online. January 23, 2017. https://www.vox.com/policy-and-politics/2017/1/21/14347298/trump-inauguration-crowd-size [June 1, 2020]

[22] See Failure of Nerve: Leadership in the Age of the Quick Fix by Edwin Friedman. Church Publishing. New York, NY. 2017.

[23] “Cerebration that occurs in a reactive mode should not truly be labeled ‘thinking.’ The key to thinking lies in an emotional category, the differentiation of the thinker’s self.” Friedman, 137.

Transcending Rigidity Part VI: Self-Actualization and Transcendence

Brandon Cook

In the last essay, I stated that both authenticity and respecting others are high moral ends in our culture. In the vernacular, they are captured by the catchphrases “you do you” and the ancient-but-never-so-modern “do no harm.” In practice, the two values combine into “Do what you want as long as you don’t hurt anyone else.” The primary value, then, is self-expression and authenticity, and “doing no harm” becomes a sort of guardrail for how to “do you” without drawing a foul. If “you do you” is, increasingly, a primary cultural value, where does it come from and what might it lead to? We’ll take these two questions in turn.

“You do you”—the quest for authentic self-expression—sits within a broader question, as old as philosophy itself: “What is the good life?” Even the ancient Oracle at Delphi gave the sterling advice, “Know thyself.” “Express thyself,” we can imagine, might follow soon after.[1] But it was not until the rise of existentialism and phenomenology in the 19th and early 20th centuries that the idea of authenticity—being true to one’s self—picked up steam as a broader cultural value. An authentic life, in existential thought, is a life lived from inner conviction rather than a mindless alignment with cultural norms.[2] By this definition, for example, an unthinking Christian who is a Christian simply because he was born into a Christian household is inauthentic (as Søren Kierkegaard made clear in his critiques of 19th-century Denmark). But consider: thoughtful consideration of our inner motivation is a luxury. Who has time to contemplate authenticity when what matters is getting the crops in and keeping the children fed?[3] In a society committed to material survival—the situation of the great majority of societies and people across time and history—writing poetry and making art, let alone contemplating authenticity, are privileges which “come after.” Indeed, only once a society has enough stability, or those in a stable society have enough wealth, can the inner life become as important as the outer life. Only once survival is assured, and not before, do we have time for self-expression.

Twentieth-century psychologist Abraham Maslow created his famous “hierarchy of needs” (1943) in part to express this simple idea. The hierarchy contains, as its highest ideal, what Maslow called “self-actualization,” which mirrors the contemporary value of self-expression.[4]

 

[Image: Maslow’s hierarchy of needs]

Maslow’s hierarchy suggests that once we have our most basic needs met (food, water, shelter), we can progress to meeting deeper needs, such as self-respect and love, before attaining self-actualization. In our affluent Western world (despite its ongoing poverties), never have so many had the time and space to struggle with internal issues. Never have so many been situated as “kings and queens” of their own personal dominions. This may explain why depression and mental health issues are so prevalent; when you aren’t busy assuring your own survival, there’s a lot more time to get stuck in pathology.

The progress up Maslow’s hierarchy is simple and, in its basics, obvious; yet it is an essential rubric for understanding both basic human motivation and the progress of society and culture.[5] The same holds for families: think, for example, of how often a father or mother works hard to provide for the family’s external needs (the lower half of the pyramid), while their children, with the luxury of taking such security for granted, end up frustrated with their parents’ emotional or psychological distance (the upper half of the pyramid).

The hierarchy can also help us understand cultural dynamics and generational change. Consider, for example, how much grief millennials get for “being soft” or “more self-focused” compared to the generations that came immediately before. Millennials are not as focused on money and job security as boomers were; they are more committed, compared with previous generations, to deriving meaning from their work, putting purpose over paychecks. This focus can strike boomers as “wanting it all” and opens millennials to the criticism that they are spoiled, without the guts or backbone to “suck it up” as the boomers did. But what we are witnessing in such a generational dispute is a classic movement up the pyramid. We cease being satisfied with mere survival and aspire to thrive, which may include enjoying our work and feeling we are contributing to the world over and above simply making money. This is all well and good and how the world works (according to Maslow, anyway). Millennials are not meant to have the same values as the generation(s) that came before; that’s not how human development works. Many such disputes across generations are rooted in the shifting of values as you head up the pyramid.

At the top of his hierarchy, Maslow initially placed “self-actualization.” He said, simply, “What a man can be, he must be.”[6] We might say, then, that “self-actualization” means “realizing one’s potential.” One might become a great scholar or a great athlete or a good, moral person. But we might ask, “Towards what end?” or “With what goal in mind?” Even for Maslow, who coined the phrase, the idea is somewhat ambiguous. Actualization seems to be a drive that “just is.” But he was clear that someone who actualizes himself or herself may have to let go of the very human desire for social praise and approval, which stands in the way of self-expression and actualization. Indeed, Maslow classified everything below “self-actualization” as a deficiency need, which might indicate that our longing for love, connection, and friendship is actually a weakness standing in the way of our self-expression.

Beyond Self-Actualization

In later years, Maslow edited and re-cast his theory, stating that beyond self-actualization lies “transcendence.” Transcendence is the act of giving one’s self to something beyond one’s self. “Transcendence refers to the very highest and most inclusive or holistic levels of human consciousness, behaving and relating, as ends rather than means, to oneself, to significant others, to human beings in general, to other species, to nature, and to the cosmos.”[7]

In this distinction between self-actualization and transcendence, Maslow hits at the core distinction and critical issue: the battle between self-actualization as “you doing you” and transcendence as “giving yourself to another.” He re-cast the hierarchy to capture the truth that beyond self-expression, we long for (we are made for, if you will) transcendence. We are made for love, where love is action taken on behalf of another. There is no real meaning without something or someone we are willing to die for. There is no real meaning apart from a love that is willing to sacrifice for someone else or some greater cause.[8]

Imagine an Olympic champion who has self-actualized as the greatest athlete in her sport. Imagine her standing on the podium as her national anthem plays. But imagine she feels deflated in victory because her ego’s sole motivation was to prove herself, and there is no further carrot to chase and, worse, the end result—even her triumph—is unsatisfying. “Is this all there is?”, she might ask. But you might also imagine her feeling full and content, crying as the anthem plays because she longed to bring honor to her country, which she loves. Or imagine instead that she won with the goal of using Olympic fame to further a cause she cares deeply about. Of course, our motives are never so cut-and-dried, but you can see quite easily that even at a zenith of self-actualization, there is something beyond, something transcendent, which is the only thing that can actually satisfy us. We can only be content when we have someone or something greater than ourselves in mind.

Maslow’s hierarchy points us to a question about human ends. What satisfies us? What are we made for? Can we find self-expression meaningful in itself, or do we require something beyond it? And it illustrates a simple truth: because the West is so prosperous, generally speaking and despite ongoing deprivations, we have on the whole more time to explore what we can become. But the distinction between actualization and transcendence remains every bit as critical as it was in Maslow’s mind seventy years ago. What is our goal? Self-expression and actualization, with a focus on ourselves? Or transcendence, with a focus on something or someone beyond ourselves? And where do we look for our guiding ideals—to Instagram and social media? Reality TV? Social institutions? Indeed, does our ideal have anything to do with becoming virtuous—that is, becoming a person who truly cares for others? Or is it about cultivating a life of appearances? In a culture driven by the refrain “you do you,” the path of least resistance is the path of mere self-expression, and the path towards spiritual maturity is easily ignored. Neil Postman warned that we are on course to “amuse ourselves to death,” and perhaps there is nothing so amusing as endless attempts at our own self-expression.[9]

Christian Self-Actualization

At the same time, even as there is something deeply Christian about our contemporary virtue of “playing nice” (and certainly in the morality of “respecting others”), there is also something deeply Christian in the idea “you do you.” Authenticity is a Christian virtue. The Judeo-Christian narrative established universal human dignity through the world-changing idea that all individuals, regardless of birth or social status, have equal intrinsic value before God.[10] The Greeks taught that masters were, by nature, superior to their slaves, just as men were superior to women. This is aristocratic dogma in pure form: you are your birth and your station. The Judeo-Christian tradition upended all that.[11] Hear the words of the Apostle Paul: “In Christ, there is no longer Jew or Gentile, slave or free, male and female. For you are all one in Christ Jesus.”[12] In the Greco-Roman world, this was a call to revolution, an overthrowing of social systems with men above all and slaves at the bottom. And in this revolutionary idea of universal standing before God, a new morality, regardless of social status, was born. As it spread, Christianity revolutionized the world with its insistence that all human beings are equally valuable, and the Christian norm of caring for the victim and the marginalized—its founder himself being a victim of unjust violence—began to re-make the world.[13] Thus the ideals “do no harm,” “be fair,” “respect others,” and even “you do you” all trace back to the Judeo-Christian narrative. Our culture, which often blanches at Christianity, is, ironically, deeply Christian.[14] In fact, we now live in the continual paradox of a post-Christian world which still largely functions according to Judeo-Christian values. Our cultural institutions, specifically news media and universities, often criticize Christianity or Christian institutions using the language of Judeo-Christian morality while failing to recognize that they are operating within a Judeo-Christian ethic.
Of course, it must be noted, because our Christian institutions are often Christian in name only, failing to live up to the Judeo-Christian ethic, such critiques are not without merit. We should remember that Jesus critiqued the religion of his day for not living up to its own ideals.

If we are going to discuss actualization, then, we should consider what Christian actualization would mean. Having noted that the impulse towards authenticity and self-expression so current in our culture blooms from the Judeo-Christian insistence that all people have universal value, we must also recognize how easily this narrative is distorted. Self-expression was never contextualized within the Judeo-Christian ethic as “for its own sake” or “as its own end.” In Christianity, the highest ideal is always to give yourself for the sake of another.[15] And to deprive yourself (to negate yourself, which is in some sense the opposite of self-expression) for the sake of another, if need be. As Jonathan Sacks, former Chief Rabbi of the United Kingdom, writes:

The highest achievement is not self-expression but self-limitation: making space for something other and different from ourselves. The happiest marriages are those where each spouse makes space for the other to be his or herself. Great parents make space for their children. Great leaders make space for their followers. Great teachers make space for their pupils. They are there when they are needed, but don’t crush or inhibit or try to dominate. They practice…self-limitation, so that others have the space to grow. That is how God created the universe, and it is how we allow others to fill our lives with their glory.[16]

This is transcendence of the highest order. Sacks further comments on the danger of missing transcendence and focusing instead on our own personal self-actualization at the expense of others: “We [have moved],” he says, “from a world of ‘We’ to one of ‘I’, the private pursuit of personal desire.”[17] An apt reading of our culture and a prophetic warning.

Simply stated, the Western narrative has changed. Christianity, at its best, provided a guiding moral narrative in which we respect others and honor human dignity because this is what God Himself does and because we are called to become like God.[18] Love becomes the highest form of self-expression, even when it demands self-abnegation or the diminution of our comfort. Yet sacrifice in this context is self-expression, not its denial. Self-actualization becomes transcendence, and transcendence becomes self-actualization. Self-denial in the service of expressing love is something we find supremely meaningful!

Of course, in our contemporary world, we still find self-denial and sacrifice inspirational. Humans always will. We are made for the transcendent. But we now have very little moral narrative by which to contextualize sacrifice and self-denial. Now we seek to “become good” or “do good” without any sense that “becoming good” is part of a comprehensive narrative at all. Perhaps values such as “be fair” or “respect others” are simply materialistic mandates—evolutionary imperatives, as it were—which we “should” adhere to with an inchoate sense of conviction, but with little other supporting narrative to guide us. In the Christian ethic, the idea was to become like God who cares for all, and in so doing to secure a salvation.[19] Within Christian spirituality, true self-expression is rooted in and inseparable from humility and a belief that we cannot become truly good in ourselves apart from God, but only through union with him. But now the ideal is simply to be a good person, with little recourse to know whether we are actually becoming good or not, and whether it truly matters, anyway. In such a world, image and appearance easily substitute for actual virtue and true inner goodness. And when we become obsessed with curating an ideal life in which our own self-expression is the highest ideal, we have left the Judeo-Christian narrative altogether. Social media and technology become doors to endless entertainment and a world of simulacra and seeming, without the need to do or become good, let alone to be in caring, transformative conversation with people who hold different views or opinions. We need deeper values, then, than “be fair” and “do no harm” and “you do you.” We need a narrative that demands our transformation as we learn to care for others. We need a story that demands we listen to, hear, and thoughtfully consider others. A culture without such values ends up, by the path of least resistance, with “express yourself” as the dominant imperative.
Such a culture will shortly devour itself.

Our lack of vibrant moral narratives, then, makes it difficult to mature spiritually, or to live in the transcendent focus of Stage 4.[20] In the next essay, we will explore how our lack of guiding narratives is part of a greater cultural confusion.

 

[1] Or, as Shakespeare put it, “To thine own self be true.” Hamlet, Act 1, scene 3, 78-82.

[2] Scott Peck’s frank appraisal was that most people will remain perfectly happy to live in Stage 2 their entire lives. Peck, M. Scott. A Different Drum: Community Making and Peace. Touchstone Press. 1987. Page 199.

[3] Though it should be noted that it is often the poor who are most in touch with authenticity and least able to hide behind masks of inauthentic pretext. The poor are too close to the sharp edges of reality to pretend otherwise. Without glorifying poverty, Jesus calls this place “blessed.” See Matthew 5:5-12.

[4] Image from “Maslow’s Hierarchy of Needs” at https://www.simplypsychology.org/maslow.html [January 13, 2020]

[5] There is also an interesting overlay with Scott Peck’s schema (again, see Essay 1 in this series). Stage 2 spirituality is akin to conservative politics: it is about conserving what is, preventing chaos. This is the traditional/conservative archetype in both religion and politics, which we will explore in a future essay. In Stage 2 spirituality, you are less likely to question authorities or cultural/religious norms, and you may call this lack of questioning “faith.” In Stage 3 spirituality, there is an impulse beyond conserving what is, even if it means leaving the boundaries of Stage 2. This movement from security towards questioning is akin to the movement up Maslow’s pyramid. Of “self-actualizers,” Maslow says, “Their notions of right and wrong and of good and evil are often not the conventional ones.” Again, to overlay with Peck, self-actualization would describe those who have left—or internalized—the conventions of Stage 2, arriving at an authentic expression of morality even if it doesn’t line up with the Stage 2 structures and morality in which they were raised.

[6] Maslow, A.H. Motivation and Personality: A General Theory of Human Motivation Based Upon a Synthesis Primarily of Holistic and Dynamic Principles. Harper & Brothers. 1954. Page 93.

[7] Maslow, Abraham. Farther Reaches of Human Nature. New York. 1971. p. 269. Again, cf. M. Scott Peck’s Fourth Stage (Mystical/Communion), which is directly analogous to Maslow’s description of transcendence.

[8] “There is no greater love than to lay down one’s life for one’s friends.” John 15:13 NLT

[9] See Amusing Ourselves to Death: Public Discourse in the Age of Show Business. Penguin Books. 2005.

[10] For an excellent description of this revolution, see Luc Ferry’s A Brief History of Thought: A Philosophical Guide to Living. Harper Perennial. 2011. Pages 71-78.

[11] Though there was misogyny in Judaism and early Christianity, much as there is today. Revolutions are not accomplished in a moment.

[12] Galatians 3:28. The implications of this revolution are far-reaching and deeply biblical. The “you do you” narrative is, in some sense, dependent on the Judeo-Christian narrative as evidenced in Galatians. If each person is an image bearer of God, a child of God, then all are called upon to uniquely express themselves.

[13] For an exploration of this idea in its entirety, see Rene Girard’s incomparable I See Satan Fall Like Lightning. James G. Williams, trans. Orbis Books. 2001.

[14] You can imagine a Harvard or Berkeley professor arguing for dignity, which is an entirely Christian norm, while rejecting the notion that there is anything transcendent or supernatural about human beings.

[15] Again: “There is no greater love than to lay down one’s life for one’s friends.” John 15:13 NLT

[16] Sacks, Jonathan in Covenant and Conversations: Life-Changing Ideas in the Parsha. “Making Space.” March 7, 2018.

[17] In an interview with Jonathan Haidt. Morality in the 21st Century Podcast. “Episode 8: Jonathan Haidt.” September 3, 2018.

[18] Notwithstanding the failure of Christians and Christian institutions to live up to this ideal.

[19] Although even this aim has largely been subverted by the Church, replaced with the ideal of “going to heaven when you die,” with no demand on transformation/character change/caring for others in this world. See my book, with Bill Hull, The Cost of Cheap Grace: Reclaiming the Value of Discipleship. Navpress. 2020. “Chapter 3: The Gospel Americana.”

[20] See essay 1 and 2 in this series.

Transcending Rigidity Part V: What Story Are We In?

Brandon Cook

In the last essay, I asserted that while we have cultural values about which there is broad consensus (such as alleviating human suffering or protecting the marginalized), we 21st century-ites largely have no narrative of meaning from which we live. Our morality largely consists of “playing nice”: that is, in the words of Jonathan Haidt, “being fair” and “doing no harm”, or in the words of Charles Taylor, showing “respect” to others.[1] But there is little consensus about what these values mean, let alone how to implement them. The recent debate about detention centers at the US border and the separation of children from their parents is a prime example of a total lack of clarity about how to care for, let alone respect, those on society’s margins. Furthermore, all of this debate happens in contention with another Western value, which is the drive towards individual self-expression, what we might call being one’s true or authentic self. This drive is captured in the cultural catchphrases “you do you” and “live your best life.” The impulse to express one’s self has become a default cultural mandate. And certainly, in most contexts, self-expression is far better than self-repression. Still, it must be asked: to what extent is “expressing one’s self” a part of the good life (or not), and what is the goal of this self-expression?

Authenticity and self-expression have become high ideals in our culture and are now treated as ends in and of themselves. In fact, they should be treated as means to an end.[2] After all, we can express ourselves for many reasons, many of which do not make us (or anyone else) happier, let alone virtuous. I can call someone an expletive and say “I was just being honest,” but I should not mistake such honesty for virtue. I can, in like manner, be honest simply to blow off steam. Or I can be honest to encourage someone. I can be honest to confront them, with care and concern for their well-being. In each instance, the intent of the expression becomes a measure of virtue or the lack thereof. One must not, then, simply say, “Express yourself.” One must ask, “What is the goal of self-expression?” On social media (and in real life), what appears to be transparent self-disclosure may simply be an attempt to bolster the ego by garnering attention. If self-expression is simply its own end rather than the means towards a greater goal—such as loving another or creating beautiful art for the world—it becomes a ship with no harbor, doomed to endless sailing and no satisfaction.[3] If not focused on “willing the good of another,” self-expression becomes, simply, self-indulgence.[4] Indeed, if “living our best life” is defined in terms which do not move us towards the maturity of Stage 4 (see essays 1 and 2 in this series) and its focus on loving others, then we are climbing the wrong ladder, as Thomas Merton put it, destined one day to discover as much.[5] And if our “best life” consists in seeking new ways to express ourselves while hoping people respond with adulation or even envy (indeed, this seems to be much of the drive and temptation behind social media), we will always be unhappy, like the gluttons in Dante’s vision of hell: always eating, never satisfied. Self-expression without love, in fact, is probably a good definition of the terrible life.
As my friend Bryan Rouanzoin says, “authenticity without commitment is just self-indulgence.”[6]

Societal Confusion

We now live in a continual paradox. The values “be fair” and “do no harm,” at least superficially, incline us towards focusing on others. But the value “you be you” inclines us towards self-focus. In our new conception of morality, such as it stands, there is continual competition between the values be fair/do no harm/respect others and the cultural imperative to fully express one’s self.[7] Furthermore, self-expression is just as ambiguous as “respecting others” or “being fair.” What exactly, after all, does it mean to “do you”? The tension between these values generates increasing confusion.

We live at a time in which the pushback against the objectification of women is in high gear, thank God. The #metoo movement is not a Christian movement, but it certainly has its root in the Judeo-Christian tradition that all people have intrinsic value and none should suffer abuse because of disparities in power. Nevertheless, at the same time that our culture pushes against the objectification of women, pornography and the objectification of women proliferate. The Sports Illustrated Swimsuit Issue illustrates the paradox. Suffering a downturn due to the free flow of sexual content on the internet and due to cultural pressure to be more inclusive, the swimsuit edition has recently sought to diversify, focusing on models of different body types and even one wearing a burkini (a burqa combined with a bikini). Can the magazine, by diversifying, claim that it “empowers” women rather than objectifying them? How would we determine which is so?[8] That is, where does objectification end and empowerment begin? Similarly, is viewing harder expressions of pornography simply “you doing you,” or is it participating in the objectification—or, in some cases, the exploitation—of women? Are you doing harm? How do we decide?

To accommodate both narratives—“you do you” and “do no harm”—we must become double-minded about what is in and out, in terms of our moral narratives. Like Alice in Wonderland, things keep changing size and shape as we try to force them into our subjective perceptions of what is moral. Consider this example from an article about “normalizing rape fantasy,” which includes the line: “Rape fantasies aren’t really about rape.”[9] I’m a fan of nuance, but this sentiment is Orwellian. Of course a moral person is for sexual consent across the board in any sexual encounter, without exception; but the argument here is to also normalize rape fantasies—rape, by definition, being the total absence of consent—while maintaining that this sort of indulgence is not dangerous. There are some cases where you can’t have your cake and eat it, too. Nevertheless, since the prevailing sentiment in the ongoing sexual revolution is “Hey, I’m into it, so it’s okay!”, any space for moral discourse disappears. In fact, when moral discourse becomes located within the orientation “you do you,” the basis for moral discourse is obliterated. Once subjective feeling or preference becomes the trump card in moral discourse, there is shortly no basis for any moral discourse at all.

I saw an article on the website Buzzfeed called “Thirsty Men of the Week,” which consisted of muscular men, sweating, shirtless, or sweating and shirtless. I scrolled down to the article’s comments, curious to see what the reader reaction would be, knowing that Buzzfeed would never publish an analogous story featuring scantily-clad women. The comments, almost universally, decried the double standard: since Buzzfeed would never allow an article so blatantly objectifying women, why would they do so with men? At least here was some consistency. But what were the editors thinking? How could such an obvious double standard get by their censors? Is it (as I’m left to assume) that because women have so long been objectified, men should be, at least until the scales are equal?

These are examples of the tensions and the push/pull between our dominant cultural values of (a) “respecting others” and (b) “you do you”/“live your best life.” Sometimes they exist in harmony. Often, they do not. And where does one end and the other begin? Supermodel Chrissy Teigen, when criticized after being blocked on Twitter by President Donald Trump, responded: “I have a best-selling book, great boobs, a family I love, am literally eating pasta on a lake in Italy and I married rich.”[10] The response is amusing—and I get that, to some degree, she’s speaking tongue in cheek. But her response is also telling. Especially the “great boobs” comment. Is having ample breasts a part of the meaningful life? Of “you being you”? The statement might be read as entirely anti-feminist, grounding as it does the quality of one’s life in the quality of one’s breasts. On the other hand, Teigen is being herself! Living her best life! Pushing back against those who would question her independence and unique self-expression. In this sense, in defending herself, she’s being deeply feminist.

I am aware that the above examples focus on sex, sexuality, and gender. Indeed, I focus on these intentionally, because sex and gender tap into our deepest desire for loving connection with another. The intensity of feeling around these topics creates helpful contrast for seeing the contradictions in which we are living. Feminism in its broader history, specifically, provides an interesting example of the contradictions we don’t know how to resolve. What is now known as first wave feminism began in the 19th century and continued into the 20th, and was mostly concerned with the right to vote. Second wave feminism (1960s-1980s) critiqued the limiting societal roles—childbearing and homemaking—into which women were forced. Second wave feminism was often accompanied by women breaking out of social roles and in some sense becoming more masculine in order to break through closed doors: putting away makeup and exchanging an apron for a business suit. Second wave feminism in this sense rejected the objectification of women. Third wave feminism (mid 1990s to present) pushed back against this masculinization of women. If second wave feminism toned down the sex appeal of women, third wave feminism says in response, “No, be as sexy as you want!” (You can hear the clear affinity with the mantra “you do you” and even Teigen’s “great boobs” comment.)

But again, these distinctions can create confusion. Whereas dressing sexily was once seen as anti-feminist, now calling out overt or over-the-top sexualization can be seen as “slut shaming.” Which is right? And how do we know where good taste devolves into inappropriate sexualization of either men or women? We really have no idea. When we locate moral standards within the strong feeling of a subjective individual, there’s no possibility of moral conversation.[11] Morality becomes located in individual feeling and confirmation bias, which reinforces the cultural imperative “you do you,” distancing us from any unified basis for moral discourse.[12]

Technology, for all its gains, further imperils our situation. If moral conversation subsumed by personal preference is the fire, technology is gas on the flames. Social media creates echo chambers in which we learn there is no need to thoughtfully consider a disconfirming point of view—an opinion that does not fit within our preferences. If we disagree, we can quickly find someone or some news source who agrees with us, and we need not take the time to think about it; if television turned the world from thinking to entertainment, as Neil Postman so clearly described, the internet (and social media, in particular) has become television’s more powerful progeny. Attention spans are shrinking as media stimulation expands, further discouraging us from thoughtful consideration of anything at all. Once retreating into echo chambers and avoiding nuanced thought become habitual, we easily come to believe that our subjective preferences reflect objective reality and are not limited or biased (as all subjective preference must, in fact, be). The net result is that all social discourse, moral or otherwise, is increasingly framed as either/or.[13] This is the presentation of “fact” when we become too lazy to consider thoughtful nuance. In either/or thinking, either A is right/wrong or B is right/wrong, and there is no middle ground.

How many times have you seen an article claiming “A is the greatest thing in the world”, followed by an article the next week (or even the next day) claiming “actually, A is a terrible thing”?

Tom’s Shoes giving shoes away to the poor is great/Tom’s Shoes giving shoes away is the worst kind of disempowering charity.

Abortion is only about saving a child’s life/Abortion is only about the rights of women.

Republicans are the only party that cares about America/Republicans want to destroy the poor.

Democrats actually care about people/Democrats want to destroy America.

Of course, the nature of debate always involves opposing viewpoints in confrontation. But true discourse involves conversation, give and take, and respect, which is prerequisite for accessing the pertinent facts and details necessary for comprehension, let alone a profitable dialogue. In our culture, we have increasingly less (vanishingly less, I should probably say) time for such thoughtfulness. The result is a shrill tenor to public discourse, evident from our politics to our news media to our exchanges over social media.

Perils Ahead

Perhaps this does not seem perilous. Perhaps the confusion between the values “you do you” and “respect others” is no immediate danger. Perhaps our lack of awareness about how technology is changing our discourse patterns is not that big of a deal. But I would argue it is dangerous and it is a very big deal to increasingly “resolve” our moral dilemmas simply by yelling louder at each other, or by retreating to echo chambers where our own point of view is reinforced without thoughtful interrogation, or by relying on whatever personal preference is strongest within us. There be dragons! In fact, history tells us that once we forgo the need for transformative conversation, we begin to dehumanize each other to reinforce our belief that we are right. Personal prejudice and institutional racism, for example, are only kept in place by such dehumanization and misplaced certainty. Once such dehumanization takes root, violence inevitably follows.[14]

This context of confusion—a loss not only of moral narratives but of any way of talking about morality at all—makes it hard for us to progress to spiritual maturity. Indeed, the dehumanization ultimately produced by our confusion is the antithesis of the maturity of Stage 4. Further, our societal focus on “you do you” ends up separating us from the greatest moral narrative of all: a life well-lived as a life focused on caring for others. To this topic we will turn next.


[1] In an interview with Jonathan Sacks. Morality in the 21st Century Podcast. “Episode 8: Jonathan Haidt.” September 3, 2018; Taylor, Charles. Sources of the Self: The Making of the Modern Identity. Cambridge University Press. 1989. Pages 14-19.

[2] Of course, when discussing culture, it must be asked, “Whose culture are we talking about?” Talking about “culture” can be walking where angels fear to tread, since culture changes according to demographic. In 21st century America, the cultural concerns of a largely black audience will no doubt be different than those of a largely white audience. So, we must ask, in any cultural conversation, what indicators are being used to define culture. Universities and news media are generally taken as default cultural indicators, and these are helpful for providing concrete data on shifts in culture; new classes or news articles on any given subject may, for example, signal a change in cultural awareness. Since I am interested in discussing modes of public discourse, I take the news media as my prime cultural indicator. While it would take serious research to study and taxonomize the issues of concern in this essay—instances of the cultural values “be fair/do no harm” and “you do you,” in particular, as expressed in news media—I will largely rely on my anecdotal experience. Nevertheless, instances of these values, and of these values in competition, abound in the media, and I will cite numerous examples in the course of this essay.

[3] And of course, almost all of our action in the world involves dual motives, some virtuous, some self-involved, which we are not able to separate from one another. Nor should we wait to act until we feel purely motivated, which may have us waiting a long time and thus undercutting virtue altogether. Virtue becomes real in action and not before.

[4] St. Thomas’ excellent definition of love. St. Thomas Aquinas, STh I-II, q. 26, a. 4, corp. art.

[5]  See essay 1 in this series, “Transcending Rigidity: Four Phases of Spiritual Development, Part I.”

Stage 1 = Anti-social/Chaotic (“Chaos”)

Stage 2 = Formal/Institutional (“Bounded” or “the Boundaries Stage”)

Stage 3 = Doubt/Skepticism (“Deconstruction”)

Stage 4 = Mystical/Communion (“Union”)

The full Thomas Merton quote is: “People may spend their whole lives climbing the ladder of success only to find, once they reach the top, that the ladder is leaning against the wrong wall.” Qtd. in Richard Rohr’s “Orthopraxy.” August 26th, 2015. https://cac.org/reverse-mission-2015-08-26/

[6] In conversation, but I took the liberty of quoting him in The Cost of Cheap Grace: Reclaiming the Value of Discipleship by Bill Hull and Brandon Cook. Navpress. 2020. Page 58.

[7] In some sense, this tension mirrors the individualism of Western cultures and the communalism of Eastern cultures. Which is not to say this is an East/West issue but rather that these issues are archetypal and will express themselves archetypally throughout cultures.

[8] “Although the women in this year’s issue are an array of shapes and colors and backgrounds, Day admitted that the issue traditionally has focused on the male gaze, and with highly sexualized results. But she said the magazine is ‘evolving’ and that the criticism of the issue objectifying women is ‘subjective.’” Qtd. in “Is the Sports Illustrated Swimsuit Issue Still Relevant?” by Kali Hays. WWD. May 8, 2019.

https://wwd.com/business-news/media/is-the-sports-illustrated-swimsuit-issue-tyra-banks-alex-morganstill-relevant-1203125840/ [April 21, 2020]

[9] https://medium.com/@JessicaLexicus/reclaiming-my-rape-fantasies-fd469a0dfa9b I recognize that Medium is not exactly mainstream media, but this writer has more than 44,000 followers, and I take Medium as revelatory (if not predictive) of the cultural conversations that are coming. https://www.psychologytoday.com/us/blog/evolution-the-self/201411/don-t-call-them-rape-fantasies

[10] “I have a best selling book, great boobs, a family I love, am literally eating pasta on a lake in Italy and I married rich.” https://t.co/OHLfgnp8CL

— @chrissyteigen July 31, 2017

https://www.huffpost.com/entry/chrissy-teigen-trump-block_n_597f70c7e4b00bb8ff387405

[11] Ultimately, the expression “you do you” is rooted in what Alasdair MacIntyre calls an “emotivist culture,” in which moral evaluations are ultimately based on subjective feeling and individual preference rather than any larger guiding moral narrative. See MacIntyre, Alasdair. After Virtue: A Study in Moral Theory. University of Notre Dame Press. 2007. “Chapter 3: Emotivism: Social Content and Social Context.”

[12] See, again, MacIntyre, Alasdair.

[13] Remember that either/or thinking is a sign of Stage 2 thinking (see essay 1 in this series) which serves us as children and hampers us as adults.

[14] A cycle described in René Girard’s mimetic theory. See I See Satan Fall Like Lightning. Maryknoll, NY: Orbis Books. 2001.

Transcending Rigidity Part IV: The Quest for A New Morality

Brandon Cook

A Marvelous World Devoid of Meaning

The Enlightenment, as posited in the previous essay, was a flowering of science which, it was hoped, would bring about a new world order not based in faith or religion but in reason. It germinated a confident optimism that through scientific inquiry, humankind would find answers for its deepest questions, as science became the guiding authority in a new Stage 2.[1] This optimism was not rooted in the authority of the pope or the Bible but on empirical observation of the universe and on reason. The French revolutionaries even renamed Notre Dame and many other cathedrals “Temples of Reason.” 

And while the Enlightenment was much to our material good (penicillin, after all), it marks the beginning of ongoing skepticism across developed societies in the West. Its underlying project of securing human happiness has failed. That is, it has failed according to the Enlightenment standard of securing a world free of superstition and violence. The 20th century was the bloodiest century in history and proof, perhaps, that science would not save us from ourselves. And science itself, once the herald of a new age, has now been supplanted by technology (more on that below).

Meanwhile, scientific discovery has demonstrated that the physical universe—and our place in it—is far more complicated than Enlightenment philosophers supposed. The ancient and medieval mind generally perceived the world as purposed and purposeful, rationally designed for human flourishing. The universe’s very orderliness was a moral guide for human behavior, an invitation into moderation and harmonious living and, thus, virtue.[2] But the scientific revolution post-Enlightenment has killed this worldview, such that we now perceive the world as barbaric and, potentially, devoid of meaning. Darwinian evolution undermined the Church’s teaching about a purposeful creation, and Einstein’s theories of relativity and Heisenberg’s uncertainty principle reveal that we live in a chaotic universe which does not function according to the uniform physical laws once taken as given.[3] Thus, the world is conceptualized as confusing and chaotic, if not without its beauty.

Further, humans are more complicated than most Enlightenment theories allowed. Economist Adam Smith, author of The Wealth of Nations, assumed that men and women act in rational self-interest, and that assumption became the guiding tenet of modern economic theory. But facts and reason do not always change our minds, let alone our behavior.[4] And humans often act against their own long-term self-interest for short-term emotional gains. 

The complexity of the world—and of ourselves—has moved us far beyond the optimism of the Enlightenment into deep doubt and a generalized pessimism that there is anything we can truly know or any authority we can truly trust. A world in which everything has become interpretation.[5] This is a movement into the doubt and skepticism of Stage 3.

Indeed, this emergence of a world solidly situated in an ongoing Stage 3—a world in which we have no clear guiding authorities—is something new, historically speaking.[6] In the West, the ancient Greeks and Romans lived by a clear moral code: virtue was rooted in strength and honor. In The Iliad, for example, the moral man behaves courageously on the field of battle, and it’s the hero in arms who achieves immortality.[7] In the Roman world, the word “virtue” shares its root with “virility.” In the Christian/Medieval world, chastity and fidelity became the highest moral goods. Chivalric knighthood was meant to capture all that was virtuous about these values. To the Enlightenment philosophe, the man guided by reason was the man of morality. And this, perhaps, still comes closest to our contemporary ideal. But we 21st century-ites largely have no clear moral story nor narrative of meaning from which we live. We do have cardinal moral values, such as alleviating human suffering or protecting the marginalized, but we have little consensus on how to go about them.

Nor do we envision a golden age ahead in which we will follow science and reason into utopia. Such a future, in which humankind has put its greatest faults and errors—prejudice, racism, and violence—behind it, was popularly envisioned as recently as the original Star Trek series in the 1960s (its future vision a close corollary to the critiques and hopes of the Port Huron Statement, described in the last essay). But we no longer have such faith in science. Indeed, science, which was meant to guide us to new, golden shores, has largely been replaced with technology.[8] Science, according to the Enlightenment vision, was to liberate humankind. It was going to set us all free, empowering us to solve our biggest problems. But technology has taken over, and its aim is simply to create more and consume more. Creating new and better products has become an end in itself, rather than a path towards enlightenment or a new world. As French philosopher Luc Ferry writes:

        …contrary to the philosophy of the Enlightenment, which aimed at emancipation and human happiness, technology is well and truly a process without purpose and devoid of any objectives: ultimately, nobody knows any longer the direction in which the world is moving, because it is automatically governed by competition and in no sense directed by the conscious will of men collectively united behind a project, at the heart of a society…[9]

Consumption itself, then, has become the default guiding value in our world. We now buy new smartphones simply because we are expected to buy new smartphones, and we consume because that is what we do. And thus, we have come to live in what Neil Postman called a “technopoly,” in which “the culture seeks its authorization in technology, finds its satisfactions in technology, and takes its orders from technology.”[10] Our technopoly is marked by an endless stream of information and stimulation and little moral cipher by which to decode it. 

In the battle between the Orwellian vision of an all-powerful authoritarian state controlling a large population and Aldous Huxley’s vision of a population so doped up that the need for a totalitarian state is obviated, Huxley has won out.[11] Orwell envisioned, in 1984, a world entirely controlled by Big Brother, in which books are banned to control the population. Huxley envisioned a world in which there would be no need for the controlling apparatus of a central state because the populace would be so numbed out and over-stimulated. In such a society there is no need to outlaw books because no one wants to read, let alone think. Thus, Postman presciently titled his seminal work, in which he describes the ever-overstimulated technopoly we are becoming, Amusing Ourselves to Death. Our current technopoly provides pleasure and comfort but very little sense of meaning, and the end result is despair. We are surrounded by so much stimulus and information (via Facebook, cable news, the internet, et al.) that we have little time to make sense of it, let alone find meaning in the midst of it.

Thus, even as our technology continues to improve and progress, we are racked with doubt and skepticism about meaning, and our epistemology—our confidence about what we can know at all—has radically shifted, to the point where we hardly believe there are any objective facts at all, nor any coherent narrative behind them. With so much information, much of it false, how does one go about determining what is true or real? Further, we have no grand societal project around which to organize. No Enlightenment project, as it were. In such a world, whoever masters ratings (rather than reason) can be king, as the 2016 US presidential results perhaps reveal. The propaganda arm of President Donald Trump’s administration, which often consists simply of tweets from the president himself, is a dark omen of the world before us, so full of misinformation or outright deception. Optics and legerdemain, not facts, carry the day. Indeed, we have never had so many facts at our disposal, and yet confusion reigns about what they all mean and which ones we can trust.

How then can we have high-level conversation about what morality or virtue—and therefore, spiritual maturity—is? Indeed, philosopher and author Alasdair MacIntyre describes our contemporary situation as having not even the common language to have a conversation about morality, let alone to define what it is.[12] This, he says, is a catastrophe which we do not recognize. We are now lost in all the information available to us, with no moorings for how to interpret it. Studies reveal that if you take a group with any particular political leaning and show them facts that contradict their beliefs, they will not believe the facts.[13] This penchant for confirmation bias has always been a part of human thinking, but the proliferation of our technology means it can happen more quickly, as there is always a confirming voice on hand.

The Brave New Morality Which May Not Make You Virtuous

In a deconstructed world like ours, what becomes of morality? In our Stage 3 world, even without any clear guiding authority, nevertheless, values have emerged about which there is general consensus. Jonathan Haidt, author of The Righteous Mind, sums up this brave, new morality simply:

              1. Be fair.

              2. Do no harm.[14]

These values—being fair and doing no harm—form the basis of our new ethic, such as it stands. Or at least, they form the basis for contemporary debate, as we seek to draw boundaries and establish a new Stage 2. We might call it the ethic of “playing nice.” Philosopher and author Charles Taylor agrees with Haidt but uses slightly different language to describe the shape of our morality. Our moral horizons, says Taylor, are now anchored by the moral good of “respect” for others.[15] This entails, for example, alleviating human suffering.

All of this may seem obvious, but in fact such moral goods have not always been givens. In ancient Greece, for example, the warrior/honor ethic put a far greater emphasis on an individual’s own dignity than it did on caring for others. Succeeding on the battlefield was far more virtuous than, say, caring for the poor. In our world, on the other hand, respecting others has become the leading moral good. Nevertheless, apart from alleviating the most blatant forms of suffering, we don’t always have clarity about what is meant by respecting others. A man holding open a door for a woman was once (and in many places, still is) considered respectful; in some places, it is now considered an offense.

Such an illustration helps to reveal our contemporary problem. These values—being fair, doing no harm, respecting others—are not bad values, as such. Being fair and doing no harm are solid moral goods and not far removed from the central plank of Jesus’ morality: “do unto others as you would have them do unto you” (Matthew 7:12). Yet, on their own, devoid of broader context, they are threadbare goods for providing a framework of meaning. 

First of all, they are subjective. After all, what do you mean by “fair,” and what do I mean by it? What is harm, and do we agree on it? Sometimes what is best for someone—for their spiritual development—feels painful. It can certainly feel like harm even if it’s for their (or your, or my) good.[16] So what is included in “harm” and what is excluded? And do we have the language to speak of it? As a pastor, I’ve been around numerous debates in which Christians have said that excluding queer Christians from fellowship is an act of love, for their own good, whereas many queer Christians I know interpret this as a great act of harm.

Furthermore, “doing no harm” is a grand ideal, but it does not necessarily move us into the maturity of Stage 4. Doing no harm is not the same as doing good.[17] We need more than a “no” to move away from; we need a “yes” to move towards.[18] (No wonder people are so frustrated and despairing in our world.) And the new morality does not necessarily lead to internalized selflessness and generosity. Rather than internalizing goodness, the goal too easily becomes to be perceived as nice, or to convince ourselves that we are nice, without having to do any of the hard work of becoming virtuous. Social media facilitates this: simply post that Republicans/Democrats are idiotic/evil and you will feel you have accomplished something, with very little risk to your person. Post that we should be welcoming to immigrants and someone will likely post that first we must take care of our veterans (as if it’s an either/or). The result is people yelling louder and louder while no one listens or hears. Yet by this means we are able to relieve our anxiety: by comparing ourselves to others without having to take any virtuous action in the world, we feel the relief of righteous indignation. No wonder virtue signaling—the act of declaring your opinions in order to demonstrate your moral rightness to others—is rife all around us. And no wonder, as this game escalates, we live in an increasingly sanctimonious world in which people are continually afraid of what they can and can’t say.[19] A world in which people seek either to control others by regulating behavior on the left or to push back against political correctness on the right.

Second, since we have no consensus on what fairness and harm are, these values leave us firmly in the realm of individual morality with little sense of our communal connections. As Jonathan Sacks, former Chief Rabbi of the United Kingdom, puts it, “We [have moved] from a world of ‘We’ to one of ‘I’, the private pursuit of personal desire.”[20] Stage 4 maturity is rooted in seeing how all things are connected and caring, therefore, for others; it is the realm of “we,” not of “I.” On one hand, the values “be fair” and “do no harm” do make us think of others and thus push us towards Stage 4. But at the same time, we can become so concerned with what is fair to us, and so obsessed and offended with what feels like harm to us, that we end up stuck in a narrow, self-focused individualism which pushes us very little towards seeing others. That is, which does very little to move us to the maturity of Stage 4.

In the next essay, we will explore just how easily we get stuck in hyper-individualism and how this stuckness can keep us from spiritual maturity.

 

 

[1] Remembering that Stage 2 always needs an authority to anchor it. See http://www.storyflight.com/four-phases-of-spiritual-development-part-i

[2] The Stoics, whose thought anchored the ancient worldview and greatly influenced the Medieval one, believed that nature revealed a harmony with which an individual, through right behavior, could align in order to find flourishing and happiness.

[3] Newton’s laws of motion, for example, only go so far and break down near the speed of light. Time itself—that apparent constant—is, as Einstein proved, relative.

[4] See “Why Facts Don’t Change Our Minds” by Elizabeth Kolbert in The New Yorker. February 27, 2017. https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds [9/30/19]

[5] Derrida’s statement “Il n’y a pas de hors-texte”, meaning either “there is nothing outside the text” or “there is no outside-text” is popularly taken to mean that everything is interpretation. See Jordan B. Peterson’s commentary in 12 Rules for Life: An Antidote to Chaos. Random House. 2018. Page 311. 

[6] To be clear, I am saying that we are looking to establish a new Stage 2, which is the pattern of history, but we have never been less clear on how to do this. Nietzsche, for whatever we may agree or disagree with him in particulars, was perhaps correct in predicting an ongoing battle between competing ideologies before a new moral order would emerge.

[7] Meanwhile, the Greek (and then Roman) philosophers emphasized moderation and balance in the virtuous individual. Aristotle correlated this “golden mean” to behavior on the battlefield: the virtuous man is neither cowardly nor brash; rather, his courage, as with all virtues, lies between these extremes.

[8] Ferry, Luc. A Brief History of Thought: A Philosophical Guide to Living. Harper Perennial. 2011. Pages 212-213.

[9] Ferry. 213-214.

[10] See Postman’s Technopoly: The Surrender of Culture to Technology. Vintage. 1993. Page 71.

[11] An argument developed by Postman in Amusing Ourselves to Death: Public Discourse in the Age of Show Business. Penguin Books. 2005.

[12] See After Virtue by Alasdair MacIntyre. University of Notre Dame Press. 2007. “Chapter Two: The Nature of Moral Disagreement Today and the Claims of Emotivism.”

[13] See Brendan Nyhan and Jason Reifler. “When Corrections Fail: The Persistence of Political Misperceptions” in Political Behavior 32. 2010

[14] See the Morality in the 21st Century Podcast. “Episode 8: Jonathan Haidt.” 9/3/18.

[15] Taylor, Charles. Sources of the Self: The Making of Modern Identity. Cambridge University Press. 2003. Pages 14-15.

[16] I always remind myself that Jesus was kind but he was not always nice.

[17] In his formulation “Do unto others as you would have them do unto you,” Jesus goes further than his contemporary, Rabbi Hillel, who said, “That which is despicable to you, do not do to your fellow; this is the whole Torah, and the rest is commentary. Go and learn it.” Jesus extends this idea into a more positive moral statement. https://www.jewishvirtuallibrary.org/rabbi-hillel-quotes-on-judaism-and-israel [9/30/19]

[18] Thanks to my friend Mike Goldsworthy for articulating this so clearly, in personal conversations.

[19] Some of this fear is good and right. The #metoo movement, for example, rightfully made afraid those who abused positional power exploitatively.

[20] In an interview with Jonathan Haidt. Morality in the 21st Century Podcast. “Episode 8: Jonathan Haidt.” 9/3/18.

Transcending Rigidity: The Four Phases of Spiritual Development, Part III: The Patterns of History

Brandon Cook

In essays I and II, we explored the basic dynamics of spiritual development, following the Four Stages outlined by M. Scott Peck in his book The Different Drum.[1]

So far, we have looked at how the Four Stages of Spiritual Development function on an individual level (before proceeding, I suggest reading essays I and II). In this essay, we will expand our focus to see how the Four Stages also function on a collective level, in society and culture.

As cultures seek to establish meaningful values to guide human life—as we all, together, try to make sense of the world and our place in it—we are in an endless quest to avoid chaos (Stage 1), establish authoritative values and boundaries for right behavior (Stage 2), even while constantly questioning established authorities (Stage 3), as we try (or not, perhaps) to become spiritually mature (Stage 4). These dynamics can help us understand the societies in which we live and, ultimately, how to find meaning in a confused world.

Societal Shifts

In 1793, the streets of Paris ran with blood. During the Reign of Terror, the blood-letting and chaos into which the French Revolution descended, anyone too closely associated with the Ancien Régime (the old political order) was dragged into the streets and executed. Though the revolutionaries of 1789 had envisioned a new societal order based on science, reason, and logic—the virtues of the Enlightenment—Paris in 1793 was anything but enlightened.

What caused this sudden explosion? Societal change is most often a gradual process. And indeed, the Reign of Terror only seems at first glance to be a sudden outburst when, in fact, it had a long preamble: decades of discontent with the monarchy, growing doubts about the Church, and famines which drove countless Parisians to starvation and despair. Yet beneath the violence was the same longing that drives all revolutions: the quest for a new world. This hope for a bright tomorrow, for us or for those who come after us, drives us all. It is the stubborn hope and quest for meaning in a world which often seems meaningless.

Societies seek continuity and stability so that their cultures can define what is meaningful, such that it can be pursued.[2] And this cultural pursuit of meaning mirrors the individual’s: just as individuals develop towards spiritual maturity (or get stuck in a stage of development, or regress to a previous stage), so do cultures writ large. The word “spiritual” here, again, is used in its broadest sense and does not necessarily refer to religion. “Spiritual” includes the array of an individual’s (or culture’s) psychological and emotive processes involving humankind’s deepest and most sacred—and in this sense spiritual—longings and aspirations. Our search for love, beauty, and meaningfulness all falls under this definition of “spiritual,” whether someone believes in God or not. By this definition, all individuals and all societies are spiritual, whatever their creed, dogma, or lack thereof. And all societies are continually shifting, moving through the Four Stages of Spiritual Development as they seek spiritual fulfillment. Indeed, movements between the stages happen in the world around us all the time. If we zoom out from the individual to the societal level, we see movements through the stages in every era of human history; they are movements of the zeitgeist, the defining spirit of a cultural milieu. And the paradigm of the Four Stages helps us understand them.

The radicalized revolutionaries of Paris attempted, through violence, to deconstruct their Stage 2 world, defined as it was by the boundaries delineated by supreme institutions—the monarchy, aristocracy, and Church. From one perspective, the French Revolution was a movement from Stage 2 to Stage 3; the revolutionaries doubted and radically deconstructed the Stage 2 dogma of church and state which stood in the way of a new world order. From another perspective, they were simply moving towards a new Stage 2, in which the authority of science and reason would replace the authority of the Bible, the Pope, and the monarchy. This is how history works: Stage 2 breaks down and moves into Stage 3, and ultimately the values of Stage 3 form the boundaries of a new Stage 2. This is the most common pattern in history.[3] But that’s not what we witnessed in 1793—not immediately, anyway. Then, the nation reverted to the chaos of Stage 1. Obviously people don’t like to stay in chaos—we love stability, certainty, and predictability too much. But regressions into chaos do happen. Sometimes deconstruction takes a wrecking ball, so it seems.

The same movement—from Stage 2 to 1—can happen in reverse. In the 1970s, Iranian zealots revolted against what they viewed (accurately enough) as a licentious, corrupt sovereign. They were reacting against the perceived chaos of Stage 1 and moved their society into the highly rigid and legalistic strictures of an Islamic society. In other words, they moved their nation into Stage 2, to a formal, institutionally-centered theocracy.

And again, the most common pattern is a movement from Stage 2 to Stage 3. Consider an example from recent history: In the United States in 1962, the Students for a Democratic Society issued the Port Huron Statement, in which a younger generation decried the world being inherited from their forebears.[4] One sees in it the concerns of a people discontented with the security of Stage 2 (in this case, the security and prosperity of white-bread, post-World War II American society) and beginning to name and denounce the cracks in the dam of American life. The critique within the Port Huron Statement represents a movement of deconstruction, calling into question established values and boundaries. Indeed, in the first three, short paragraphs, the authorities of Stage 2—American might and fervor for what we might call “the American way” and “the American dream” and the values of “the Greatest Generation”—are essentially declared bankrupt.[5] There is racism in the South, nuclear antagonism across the hemispheres, general dread about the growth of the military-industrial complex, and deep conviction that the status quo being handed down is not good enough. The statement, then, represents a classic example of doubt, rejection of values, and thus a movement from Stage 2 to Stage 3.

The Port Huron Statement is the front edge of cultural change—indeed, of what is now called “cultural revolution” (there’s that word again)—that shock-waved across American society in the 1960s. The “hippies,” with dreams of an “Age of Aquarius”, helped uproot the conservative sexual norms of the 1950s and introduced a new, liberalized sexual ethic which, again, represents a movement from Stage 2 to 3. The cultural revolution of the 60s is another example of deconstruction, not unlike the French Revolution, in an attempt to find meaning. Thus does history move forward.

Of course, the social upheavals in Vietnam-era America were perceived by many not as forward movement but as a regression from Stage 2 into chaos. What we see, as always, depends on our own perspective and whatever stage we are in.[6] Nevertheless, the pattern of doubting and deconstructing what came before in order to build a new edifice—or perhaps a new foundation—is always the pattern. Each generation looks back in critique at the generation that came before. This societal pattern is mirrored at the individual level by what happens in nuclear families. As children become adolescents, they begin to question the Stage 2 boundaries outlined by parents, caregivers, and authorities. In some sense, we are all meant to challenge authority and to deconstruct Stage 2; it’s simply part of our development. But then we are left with the task of creating new boundaries, or embracing—in a new way, no doubt—the ones which came before. The film The Big Chill captures this tension well, as former 60s radicals are portrayed fifteen years later far removed from the radicalism of their youth and, in some instances, having become the new, conservative boundary-makers just a few years down the line.[7] Indeed, many 60s radicals became power-brokers in a 1980s America defined by the pursuit of wealth. Authorities come, authorities are dismantled, and then new authorities are put in place, and we march back and forth between Stage 2 and Stage 3. This is the pattern and patter of history, with occasional regressions into the chaos of Stage 1 (a la the French Revolution).

And of course, there are occasional glimpses into the spiritual maturity which marks Stage 4. Martin Luther King, Jr.’s “I Have a Dream” speech in 1963 (a year after the Port Huron Statement) marks one of these glimpses. Perhaps we are at times even gaining ground towards Stage 4, as Dr. King envisioned. Yet looking at the current political gridlock and mode of discourse in the US, it’s easy to conclude we are regressing, as a culture, into the tribalism of Stage 2. All the while, whatever the stage of our culture, rest assured it is grappling for a narrative in which we can find purpose and meaning. Nevertheless, despite the continual proliferation of our technology, our narratives about what is meaningful are up in the air. In fact, our technology serves in part to mask our inability to find meaningful narratives from which to live, covering the underlying despair now growing across American society.[8] We are wealthier than ever, yet wealth by itself cannot provide narratives of meaning.

In the next essay, we’ll explore how our society and culture is now stuck between Stage 2 and Stage 3, as we try to find new narratives of meaning in which we can anchor ourselves.


[1] Peck, M. Scott. The Road Less Traveled: A New Psychology Of Love, Traditional Values And Spiritual Growth. Touchstone/Simon & Schuster. 1985. Pages 187-199. Note: the language “Chaos/Boundaries/Deconstruction/ Union” overlaps with but varies from Peck’s language. I find this language, though altered, is helpful as shorthand.

[2] Society being the ordering of systems and power structures to support a large group of people, and culture being that society’s pursuit of truth and beauty through the exploration of art and science. Consider an example: Athens went to war in the fifth century BCE to maintain societal security—that is, to protect the population and its culture through strength of arms. The army, as a power structure of society, made space for the poets, philosophers, and high culture of Athens. The society ensures the culture’s survival such that, safe and secure, the culture can address the great philosophical questions—“Why are we here?”, “What is beauty?”, “How do we live meaningfully?”, and so forth. On one hand, this is obvious; yet the principle functions with increasing subtlety. The conflict between security and privacy in our contemporary world—a la Facebook and Google—maps to the same struggle between a power structure and the preservation of beauty (even if power is now represented by the reach of a corporation and beauty is simply the maintenance of privacy as a common good). Indeed, societies are always in this tension: how does one maintain order while creating enough space for beauty and art? The question becomes pointed the more authoritarian or totalitarian a society becomes. Individuals are in this same tension, between the need for security and the desire for thriving. As are individual relationships. Psychologist Esther Perel, for example, describes all marriage relationships as a tension between the desire for stability and the drive towards romance. (See Esther Perel in “Why is Modern Love So Damn Hard?” at https://estherperel.com/blog/why-modern-love-is-so-damn-hard [September 19, 2019].)

[3] Such patterns—generally movements from Stage 2 to 3—are usually long in emerging. It took centuries for Christian thought to overtake and supplant Stoicism, the dominant Greco-Roman philosophy, just as Judeo-Christian culture held sway for centuries before succumbing to the new science of the Enlightenment and the advent of a materialistic, Darwinian era. And we can still look back on the Enlightenment, over two centuries ago, as the clearest dividing line between us and the Medievals.

[4] 1 We are people of this generation, bred in at least modest comfort, housed now in universities, looking uncomfortably to the world we inherit.

2 When we were kids the United States was the wealthiest and strongest country in the world; the only one with the atom bomb, the least scarred by modern war, an initiator of the United Nations that we thought would distribute Western influence throughout the world. Freedom and equality for each individual, government of, by, and for the people—these American values we found good, principles by which we could live as men. Many of us began maturing in complacency.

3 As we grew, however, our comfort was penetrated by events too troubling to dismiss. First, the permeating and victimizing fact of human degradation, symbolized by the Southern struggle against racial bigotry, compelled most of us from silence to activism. Second, the enclosing fact of the Cold War, symbolized by the presence of the Bomb, brought awareness that we ourselves, and our friends, and millions of abstract "others" we knew more directly because of our common peril, might die at any time. We might deliberately ignore, or avoid, or fail to feel all other human problems, but not these two, for these were too immediate and crushing in their impact, too challenging in the demand that we as individuals take the responsibility for encounter and resolution.

Accessed at https://history.hanover.edu/courses/excerpts/111huron.html [September 23, 2019]

[5] “Greatest Generation” being a phrase coined by Tom Brokaw in his book of the same name. Random House. 2001.

[6] While remembering that these stages are merely descriptive and no one is simply “in” one stage across the board.

[7] The Big Chill. Dir. by Lawrence Kasdan. Columbia Pictures. 1983.

[8] Suicide rates, especially among younger white men (who have traditionally been the most privileged), have skyrocketed. See, for example, https://www.rollingstone.com/culture/culture-features/suicide-rate-america-white-men-841576/ [September 24, 2019]

Transcending Rigidity: Four Stages of Spiritual Development, Part II

Brandon Cook

The ultimate purpose of these essays is to explore rigidity, which happens when people get stuck in their psycho-spiritual development. But to understand rigidity, it’s helpful to understand some of the fundamental dynamics at play between the four stages. (Note: if you have not read the first essay, ‘Four Phases of Spiritual Development, Part I,’ you may want to stop and read that first.)

Relating to Other Stages

M. Scott Peck’s Four Stages of Spiritual Development looks like this:

1.  Anti-social/Chaotic (what I will simply call “Chaos”)

2.  Formal/Institutional (what I will call “Bounded” or “the Boundaries Stage”)

3.  Doubt/Skepticism (what I will call “Deconstruction”)

4.  Mystical/Communion (what I will call “Union”)

The goal of spiritual development is to mature into Stage 4, but entering Stage 4 is a dynamic process. The path is not a straight line, and to walk it, it helps to understand the dynamics at play between the stages. Indeed, Peck makes a number of observations about how people in each stage relate to the other stages. For example, people tend to look down on people who are in the stage they have just passed through.[1] The newly devoted believer in Stage 2 tends to judge a person in the chaos of Stage 1. (You can imagine someone shaking their head disapprovingly at “those sinners.”) By the same token, a person who has just deconstructed their faith and dwells in the doubt and skepticism of Stage 3 tends to view as naive the devoted believer of Stage 2.

But even as we judge those in the stage behind us, we “retain vestiges of the previous stages through which we have just come.”[2] Under stress, we are all tempted to lie, coerce, or manipulate to get our way, thus returning to Stage 1. Someone in Stage 2 might have an old drinking buddy come to town who, after some cajoling, pulls him back into the chaos of Stage 1. Someone who lives in the logic and skepticism of Stage 3 may also be oddly and irrationally superstitious about certain matters. Even in Stage 3 or 4, under pressure you might—like all of us—wish for some oracle or authority to cut through life’s ambiguities with clear answers, such as you experienced in Stage 2. And even in Stage 4, you might succumb to the fear of “what people may think” and downplay or deny your faith in order to appear coolly logical and collected, as you once were within the skepticism of Stage 3.

Not only do we tend to look down on those just behind us, but we are often threatened by those in the stages above us. “If people are one step ahead of us, we usually admire them. If they are two steps ahead of us, we usually think they are evil.”[3] For example, people in Stage 2 are told to “love sinners.” If they move past their initial judgmentalism, people in Stage 2 can become quite loving of those in Stage 1. Yet they will generally remain threatened by those in the stages above them. Consider Jesus: to some, he was a prophet who revealed the true heart of God; to others he was a lunatic, a rabble-rouser, a devil sent to tear down everything the nation held dear.[4] As Abraham Heschel writes, “The prophet…employs notes one octave too high for our ears.”[5] In a simplistic sense, Jesus was too far ahead, spiritually speaking, of those who crucified him. His notes were too high for their ears, and all they could hear was how he challenged them (and to their minds, threatened them) with his radical way of relating to the Law and to God. As Jesus himself said, “To him who has ears to hear, let him hear” (e.g., Mark 4:9). All this despite Jesus saying that he came “not to abolish but to fulfill the Law and the prophets” (Matthew 5:17). Jesus knew he was the fulfillment of the sacred, not its contradiction.[6] Nevertheless, what you see depends on your perspective, which depends on your own spiritual development. To the rigid mind stuck in Stage 2 (and remember, not all who are in Stage 2 are rigid or legalistic), Jesus was a threat. And because people tend to use the stability of Stage 2 to cover their own inner sense of chaos—of guilt and shame—they will protect their Stage 2 thinking without realizing they are actually protecting themselves. Jesus knew that teaching that God demands inner transformation and not external performance would get him in trouble.
And he knew that teaching that God is a universal Love which embraces all people would fall upon deaf ears and, furthermore, would be a threat to respectable people trying to perform their way into standing with God. Jesus was no dummy; he knew he was going to be killed by people stuck in Stage 2.

The Difficulty of Moving Towards Stage 4

It is important to note that though these stages describe spiritual development, we are not strictly speaking about (nor should we limit the stages to) religious development. In other words, while we can use the word “spiritual” in its most personal, intimate sense—of an individual’s belief in a higher power—we can also use the word “spiritual” in its broadest definition. “Spiritual” in this sense means the array of psychological and emotive processes which are not strictly religious and may not even involve a belief in God per se, but which involve humankind’s deepest and most sacred—and in this sense spiritual—aspirations.

It also bears saying, again, that the progression from one stage to the next is not a progression from bad to good. There are wonderful, sincere, and authentic people in Stage 2, and there are jerks in Stage 3. There are wonderful people who would never hurt a fly who get stuck in the chaos and addiction of Stage 1, and so on. Nevertheless, there is a real danger of getting stuck in a stage and never becoming mature. A progression to internalized maturity—to internal goodness—means a movement through the stages. You cannot be truly good, for example, if your motivation for doing good is fear that God will punish you if you don’t (as is often the case in Stage 2). That may be your first catalyst, but it must not be your last. You must progress to doing good for its own sake and for the sake of others.

Still, though human beings want to become mature and good, it’s very difficult to progress into Stage 4.[7] The great spiritual teachers, after all, have always taught that it takes loss to become fully human, which is to say fully spiritual. Suffering is the only thing that forces most of us into change. As David Brooks writes, “If you ask anybody, ‘What’s the activity that you had that made you who you are?’ no one says, ‘You know I had a really great vacation in Hawaii.’ …They say, ‘I had a period of struggle. I lost a loved one. I was in the Army. And that period of struggle…made me who I am.’”[8] Indeed, the very reason we don’t progress through the stages is that doing so involves struggle and suffering. It involves changing our understanding of the world, and we prefer instead “vacations in Hawaii” in the form of stability, certainty, and predictability. Furthermore, we live in a culture that encourages us to avoid discomfort and pain, often to the detriment of our souls (though of course, pain still finds us). So as humans, we tend to live in Stage 2 or 3. We want to avoid the chaos and shame of Stage 1, but we are also wary of the suffering and loss required to enter Stage 4. We all intuit what Jesus said: in becoming mature, we will “lose our life” (Matthew 10:39). That is, we will lose our ability to find ultimate meaning in our certainty, in our ability to hold things together, or in how fine and respectable we look (all of which may be of ultimate concern in Stage 2). Or we will lose our capacity to look calm, cool, collected, and detached in our skepticism (Stage 3). Stage 4, on the other hand, requires vulnerability, letting go of certainty, and giving your time, energy, care, and love away. This requires becoming humble and joyful even in the midst of not having all the answers. In Stage 4, you are more concerned with giving than receiving, though you understand that you can only give because you receive.
Stage 4 also requires—according to Jesus, anyway—holding tension, turning the other cheek, and forgiving those who crucify you. No wonder Stage 2 and 3 remain, in some sense, more appealing than Stage 4.

Temptations and Dangers 

The general progression to Stage 4 is difficult because we only get there through engaging suffering and loss. In fact, any change in our psychology is a type of death, because you have to let go of what you knew (or thought you knew) to learn something new. In that sense, you lose who you were in order to become someone new. And “it hurts to become real.”[9] Becoming always involves loss. But in addition to this general difficulty, each stage has specific, unique temptations and dangers which can sidetrack the progression to Stage 4. We will look at each Stage in turn. 

Stage 1

An alcoholic stuck in Stage 1 does not drink without reason. There is a genealogy to any addiction. She drinks because life is painful, and she relates to alcohol as a reliable friend simply because it helps her deal with the pain of being. As obvious as it sounds: life is painful because human beings can feel pain. We feel physical and emotional pain, like dogs and dolphins, but unlike animals, we also feel psychological and existential pain. The unique aspect of our humanity is this self-awareness, the pain of existence and of being. It is the pain of human longing. Because of it, we feel psychological pain: worry, fear, and insecurity. Indeed, each stage is meant to help us deal with this felt anxiety of being. And in each stage, maturity is difficult because it would require letting go of something that has kept us safe. In the case of an alcoholic, a bottle represents power over pain, at least temporarily. Putting down the bottle would mean not only accounting for those she has hurt through her addiction but also confronting the pain and unmet longing that feeds the addiction in the first place.

The temptation in Stage 1, then, is concluding that the only way you will be safe or feel relief is by clinging to what numbs you out. This keeps you stuck in a vicious cycle of guilt and shame.[10] Whereas guilt can motivate behavior change, shame is difficult to escape and can drive us back into the very anti-social and chaotic behavior that generated it. After all, if you have no hope for something better, at least you can have the comfort of a bottle or a sexual tryst. Relief for a moment seems better than despair, even if the despair comes due with compounded interest after the act. 

The film The Wrestler captures this dynamic poignantly, as Randy ‘The Ram’ Robinson returns to the wrestling ring—the only place where he has any sense of self or transcendence—even though it means his death because of a bad heart.[11] This is the power of shame: it keeps us returning to places of comfort that ultimately kill us. So the key temptation in Stage 1 is to let shame drive you into despair and hopelessness. 

Stage 2

The temptation in Stage 1 is straightforward compared with the subtleties of temptation that occur in Stage 2, where the rocks and sandbars are often hidden beneath the waterline. In Stage 2, because life seems to work quite well compared to the chaos of Stage 1, it’s easy to deny that there is any danger to be aware of at all. Someone caught in chaos will have a hard time denying that their life is in shambles, whereas a respectable person in Stage 2 can plausibly deny that there is anything at all to deny. A good suit covers a multitude of sins. People in Stage 2 can take pride in being good, thoughtful, and respectable folks, which can masquerade as spiritual maturity even if there’s nothing mature in it and even if they simply substitute rigidity for maturity.

In Stage 2, the chief temptation is an over-attachment to stability and security.[12] Stage 2 is about having answers and knowing how things should be or should be done. To be clear, we should not hold anything against security or stability; in and of themselves, they are good things, necessary for our development, and certainly better than chaos. But they can become ultimate things. That is, they can become idols we worship rather than gifts for which we give thanks.[13] Rather than being addicted to pleasure (Stage 1), someone in Stage 2 may be addicted to their sense of stability and control. This stability expresses itself as “having answers,” “knowing how things should be,” and fixating on “the right way to do things.” Remember, Peck called Stage 2 the formal stage because people get attached to the form of how things should be done.[14]

In Stage 2, we often see a fusing together of religious and political conservatism, because both provide a sense of safety and security. I recently saw a bumper sticker which read, “Heaven has closed borders; hell has open borders…think about that.” An inane sentiment and a lousy way to lobby for border control, yet a perfect example of how easily notions of religious and political security entwine. Each shares the goal of security, albeit in different forms. Another bumper sticker I’ve recently seen reads, “I stand for the flag, I kneel for the cross.” It is nearly impossible to imagine early Christians merging the imagery of the cross with any political symbol, certainly not the emblems of Rome, but in our country Christianity has been greatly reduced to a Gospel Americana in which the chief goal is “going to heaven when you die” (that is, achieving “eternal security”) rather than actually making any progression to spiritual maturity. Once this bastardization of Christianity is in place and security has been enshrined as the ultimate good, it’s a short leap to justifying any political philosophy which seems to provide it. No wonder the cross is so often draped in a flag across the churches of America. And no wonder so many people are disenchanted with the Church and Christianity, since we were made for far more than security. Many Christians have been taught only to love their own salvation, developing no genuine love for the world around them. Yet the height of spiritual maturity in Stage 4 is caring for others, which often involves a loss of security.[15] You cannot love another and remain perfectly secure, stable, and in control. Love means loss.[16] We can be confident Jesus would not have gone to the cross if his chief goal was security and stability.

When we look at this love of stability from a different point of view, we see that the rigid person in Stage 2 is often simply protecting themselves from their own insecurities or an inner sense of guilt and shame. (As with Stage 1, guilt and shame are underlying realities in Stage 2 and every other stage, for that matter.) But the sophistication of rigidity in Stage 2 is its built-in blindness, which does not exist in any of the other stages. The certainty inherent in Stage 2 (“we have the answers”) blinds us to—that is, allows us not to see—what’s actually happening within us. What’s really happening in Stage 2 rigidity is that an emotional need is being met; the emotional desire for control and safety is being satisfied. But blindness allows us to believe we are simply defending “the Truth,” which is always, in our minds, God’s truth. Thus, the temptation in Stage 2 is the temptation towards certainty—of thinking you have the answers and have God and the universe basically figured out. If you can look good or knowledgeable or competent, and if you can be certain about your answers, you can have stability to stave off any inner sense of vulnerability, all while believing you are simply “defending the truth.” This also prepares you to interpret any resistance or pushback not as a potentially helpful challenge to you but as an attack on “the Truth,” which you must defeat.

This sort of blindness is the very thing Jesus warned his disciples about. He discerned that the zeal of the Pharisees—the leading religious figures of his day—was not in actuality for the Law but for the control, power, and prestige the Law gave them. Jesus makes this explicit: “Beware of these teachers of religious law! For they like to parade around in flowing robes and receive respectful greetings as they walk in the marketplaces…Yet they shamelessly cheat widows out of their property and then pretend to be pious by making long prayers in public.”[17] In other words, how they appear to others is the driver, all masked in a veneer of devotion to God while they remain inwardly corrupt.

The respectability of the Pharisees is powerful. Seeming to have answers, and holding those answers with certainty, can simulate emotional maturity. But Jesus goes to great lengths to show that those with “the answers” may nevertheless remain spiritually immature. In Jesus’ Parable of the Good Samaritan, three men pass a man beaten and stricken on the side of the road, and only one stops to help him. Jesus then asks, “Which man was a good neighbor?”[18] That is, who was spiritually mature? It was not the respectable ones dressed in religious finery. Of course, in the teaching of our churches and religious institutions, we easily dismiss these religious hypocrites as “anyone but us” when in fact they may be a stunningly accurate representation of us. Yet the blindness outlined above prevents us from seeing the correlation.

In Jesus’ teaching, though he warns about chaos (Stage 1), the greatest danger is not chaos but getting stuck in religious rigidity (Stage 2).[19] In fact, Jesus never gets his dander up against people who are sinners or stuck in chaos. The story of the woman caught in adultery is most famous for Jesus’ words to her religious accusers: “Let him who is without sin among you cast the first stone” (John 8:7). And to the “sinful” woman at the well, Jesus says he will give “living water” (John 4:10). Jesus focuses not on the dangers of chaos but on the danger of getting stuck in Stage 2, from which it is much more difficult to extract oneself.[20] Stage 2 is where we are most likely to deceive ourselves into thinking we are good without any true goodness.

The irony of Jesus’ teaching is that you come to God not when you are secure, stable, and certain but when you are keenly aware that you don’t have all the answers nor have it all figured out. In some sense, you can’t really believe until you aren’t certain. You come to God—and to spiritual maturity—when you confess you don’t have all the resources nor all the answers necessary to navigate life.[21]

In terms of the difficulty of progressing from Stage 2 to Stage 4, two other points must be made. First, some people stay in Stage 2 simply because they fear they’ll “go to hell” if they don’t. That is, they think of faith as having the answers and never doubting what they’ve been taught, which will secure them eternal life.[22] Again, Jesus warns against this sort of merely intellectualized faith without corresponding emotional connection or tangible action in the world around us.[23] But to the rigid mind, anything short of ironclad certainty is seen as a loss of faith which can threaten hellfire, so they are very motivated to stay in Stage 2. There is a lesson for all of us in this, whether we are Christians or not: Jesus showed us, in stark terms, that it’s simply easier to be tribalistic and to cling to our own sense of certainty as one clings to an idol. It’s easiest of all to stay in Stage 2.

And finally, I have noticed a dynamic that can make it very difficult for people who have experienced high levels of chaos or trauma to move beyond Stage 2. For them, the ambiguity of Stage 3—the same ambiguity which can make faith sincere and authentic—seems to resemble the chaos of Stage 1. For this reason, people who have experienced high levels of trauma and chaos may be very committed to staying in the stability of Stage 2. Losing it would feel like a return to chaos, even if it’s actually a movement toward maturity.

In this essay, I have spent most of my time on Stage 2 simply because this is where most people get stuck. Nevertheless, temptation abounds in Stage 3, as well. 

Stage 3

We come to Stage 3 because the answers of Stage 2 stop working. We hit a wall that forces us to transcend what we’ve known or believed up until now. Many of Jesus’ earliest followers were poor, and suffering and poverty reveal the ambiguities of life, ushering us toward Stage 3. And in Stage 3, we come closer to an authentic experience of God.

At the same time, the temptation in Stage 3 is connected to the suffering of the world, which can make us believe there are no answers at all. Indeed, the temptation in Stage 3 is simply to not believe anything. The danger is cynicism, which—though it’s bad for your brain—carries a certain allure.[24] It allows us to tamp down the pain of hope and human longing. 

The cynicism of Stage 3 can manifest as arrogance. People in Stage 3 are less susceptible to rigidity by simple virtue of their skepticism, which admits no certainties about which to be rigid. But they can nevertheless become rigid in their arrogance. They may look down on and despise “those idiots” in Stage 2. And indeed, they may revert to the anti-socialism—the “living for yourself” ethos—of Stage 1. That is, they may have the cool confidence that comes from skepticism and yet not be progressing toward loving or caring for others. Stage 2 at least instills the impulse to love others (after all, it’s the right thing to do), but this impulse can be lost in Stage 3, even under a banner of progressivism. The “have it all” ethos of greed rampant in American culture is alive and well even among those who think they’ve dismantled such constructs.

The Cost of Technology

There are many temptations that make it difficult for us to mature toward Stage 4, and the difficulty can be exacerbated at every stage by the explosion in technology. Technology, of course, can be a boon: it allows us to hear and consider new thoughts faster than ever before. And yet technology has become an end in itself.[25] Now we simply consume because we consume, always wanting more—new gadgets and media and experiences. And all this consumption serves to shorten our attention span, such that we have little time to thoughtfully consider any new thought, even as we are bombarded by more and more of them. Indeed, technology, more than anything else, can lock us into unthinking rigidity. For example, the algorithms of Facebook ensure that we see stories we agree with and therefore will click on, so that its advertisers can make money and keep feeding the machine. It is confirmation bias par excellence: we see stories that appeal to our political sensibilities, so we get stories that reinforce our worldview. This deepens our silos and makes thoughtful dialogue more difficult. Social media is already a terrible place for dialogue (for proof, go read the comments on any recent political debate), and angry, argumentative debate of this sort—which is part and parcel of the discourse promulgated by social media—doesn’t change minds. In fact, the defensiveness it engenders only reinforces our rigidity. Technology used in this way simply numbs us out, diverting us from confronting reality—and our need to mature—as we endlessly entertain ourselves. Or, as Neil Postman put it, as we amuse ourselves to death.[26]

Aspiring to Stage 4: Counting the Cost

So, movements between the stages are not uncommon, but there are forces working against maturing to Stage 4. Jesus’ words “take up your cross, and follow me” are significant.[27] Moving into Stage 4 will exact a cost. Maturity means leaving behind that which has kept you safe, up until now, so that you can become who you might be.

At the same time, our movement toward maturity does not happen in a vacuum. Indeed, the culture around us is always in a state of flux and change. To mature, we have to be aware of that as well. In the next essay, we will explore how movement between the stages occurs not just on an individual level but in society as a whole.

 

[1] Peck, M. Scott. The Different Drum: Community Making and Peace. Touchstone Press, New York, NY. Page 195.

[2] Peck, 198.

[3] Peck, 195.

[4] See John 11:50.

[5] Heschel, A.J. The Prophets. Harper Perennial Modern Classics. 2001. Page 12.

[6] Matthew 5:17, paraphrased.

[7] I am reminded of Jesus’ words: “the gateway to life is very narrow and the road is difficult” (Matthew 7:14).

[8] David Brooks, qtd. in “What’s The Key To A Meaningful Life? You Might Not Like The Answer.”

https://www.huffingtonpost.com/entry/david-brooks-life-meaning_us_56e6f962e4b0b25c9182b0c3 [February 4, 2019].

[9] The honest admission of the Skin Horse in Margery Williams’ The Velveteen Rabbit. Harper Perennial Classics, New York, NY. 2013.

[10] Brené Brown makes it clear that guilt has to do with remorse for what we’ve done, but shame is connected to our sense of identity. In other words, guilt says, “what I’ve done is bad” while shame says, “I am bad.” See “Shame v Guilt” at https://brenebrown.com/articles/2013/01/14/shame-v-guilt/ [August 5, 2019].

[11] The Wrestler. Fox Searchlight Pictures. Dir. Darren Aronofsky. 2008.

[12] In future essays I will explore how the value of stability and security forms the basis of the conservative archetype.

[13] Tim Keller points out that idolatry is always taking a good thing—like money or sex—and making it into an “ultimate thing,” which it was never meant to be. See Counterfeit Gods: The Empty Promises of Money, Sex, and Power, and the Only Hope that Matters by Timothy Keller. Viking Press, New York, NY. 2009.

[14] It should be noted that beneath the mask of respectability found in Stage 2 there is very often a hidden chaos. The Southern states, which are also the most professedly Christian/religious states, report the highest incidence of pornography access. See “America’s Bible Belt states indulge in more online porn than other less religious states” in Christianity Today. https://www.christiantoday.com/article/americas-bible-belt-states-indulge-in-more-online-porn-than-other-less-religious-states/42045.htm [August 1, 2019].

[15] After all, “you must love the Lord your God with all your heart, all your soul, all your mind, and all your strength … [and] love your neighbor as yourself. No other commandment is greater than these” (Mark 12:30-31).

[16] “To love at all is to be vulnerable. Love anything and your heart will be wrung and possibly broken.” C.S. Lewis in The Four Loves. HarperOne. 2017. Page 155.

[17] Mark 12:38, 40.

[18] Luke 10:25-37, paraphrased.

[19] For a warning against chaos, see John 8:11.

[20] See Matthew 23.

[21] See, for example, Matthew 5:3-12, the Beatitudes.

[22] Ironically, though Jesus held the Scripture up as an authority as much as other first-century Jews, he related to it differently and rejected the oral Law which had been constructed around it.

[23] See Matthew 7:21-23 and Matthew 25:31-46.

[24] Studies have shown that prolonged cynicism affects brain function. See, for example, “Cynical? You may be hurting your brain health” in ScienceDaily. https://www.sciencedaily.com/releases/2014/05/140528163739.htm [August 1, 2019].

[25] As Luc Ferry writes, science has been overtaken by technology. Science was once held as hope for a new world, whereas technology is simply about creation for consumption’s sake. See “From Science to Technology: The Disappearance of Ends and the Triumph of Means” in A Brief History of Thought: A Philosophical Guide to Living. Harper Perennial. 2011. Page 211.

[26] Postman, Neil. Amusing Ourselves to Death: Public Discourse in the Age of Show Business. Penguin Books, New York, NY. 2005.

[27] See Matthew 16:24. NIV.

Transcending Rigidity: Four Stages of Spiritual Development, Part I

Brandon Cook

M. Scott Peck, a Harvard and Case Western Reserve-trained psychiatrist, wrote what is essentially the first self-help book, The Road Less Traveled, in 1978.[1] (To date, it has sold over ten million copies.) In 1980, Peck became a Christian, embracing again the faith of his youth. In later life, he participated in exorcisms with Malachi Martin, the chief exorcist of the Catholic Church.[2] Talk about a life.

In his book The Different Drum: Community Making and Peace (1987), Peck outlines the process of spiritual development in four stages.[3] I have found it to be one of the simplest and most helpful schemas for understanding how not only individuals but also systems—familial, organizational, and cultural—function.

Peck describes the 4 stages as:

1. Anti-social/Chaotic (what I will simply call “Chaos”)

2. Formal/Institutional (what I will call “Bounded” or “the Boundaries Stage”)

3.  Doubt/Skepticism (what I will call “Deconstruction”)

4.  Mystical/Communion (what I will call “Union”)

To briefly summarize the progression: 

Stage 1: The Anti-Social/Chaotic Stage

“Anti-social” does not mean avoiding parties or ignoring people, but rather being self-focused with little capacity to see beyond the needs and wants of one’s own self. A young child is in Stage 1, being acutely aware of and concerned with their own needs before they develop awareness of the needs of anyone else. This is obviously true of a baby, and if two-year-olds suddenly became the size of dinosaurs, none of us would survive. But adults can live in Stage 1, also. A drug addict who steals from his family to feed his addiction or someone who floats through meaningless sexual liaisons with little care or concern for his partners remains in the chaos of Stage 1. People in this stage are largely focused on gratifying their own needs even if it hurts others. Stage 1 is, therefore, a turbulent and emotionally charged stage not only for the person in it but also for those close to them.

A scriptural example of Stage 1 is the younger brother in The Parable of the Two Sons (Luke 15:11-32). He demands his inheritance early and leaves home, dishonoring his father. He wastes his money on alcohol and prostitutes. And he is forced to tend to pigs, lusting after their slop to sate his hunger. He is an embodiment, in turn, of both anti-social behavior and of chaos.

Stage 2: The Formal/Institutional Stage

The formal or institutional phase of spiritual development is one of religious awakening and consciousness. It involves a clear sense of right and wrong and, usually, of being part of a religious tribe or community. As antidotes to chaos, there are clear boundaries of both belief and behavior in Stage 2, centered around an authority—the pope, the Church, the Scripture, the tradition. These boundaries provide a rich matrix of meaning—the universe has purpose, God is real, what we do matters, and so forth.

Peck called Stage 2 the “formal” stage because in it people get attached to the form of how things are done. Church-goers in Stage 2, for example, may get upset if you change the order of worship songs. I remember someone coming to me as a pastor deeply disturbed that I had added liturgical words (“This is the Word of the Lord”/”Thanks be to God”) to our weekly Scripture readings; it reminded him of his Catholic upbringing, and not fondly. Our conversation took place in a good spirit, thank God—these sorts of conversations about “the way things should be done” are not always so pleasant. After all, people’s emotional safety gets attached to the form of how things are done, and they are often willing to fight to protect their sense of safety.

In Stage 2, people tend to act within a tight set of boundaries. They might not drink or see R-rated movies. The motivations for these convictions lie on a spectrum: they may be felt as a personal conviction, on the level of conscience (e.g., “Scripture says the best path is, at the very least, moderation, so I’ll just avoid drinking altogether”). Or they may be adopted simply to be part of a tribe with accepted norms (“We don’t do those things”). Or they may be based in fear (“doing X may bring judgment upon me, so I won’t do it”) rather than any pro-active ethic (“Y is the right and good thing to do, so I won’t do X”).

A scriptural example of Stage 2 is the nation of Israel coming under the Law at Sinai (Exodus 19-24). The nation moves from the chaos of slavery and desert wandering into a highly structured Law code which ensures tribal membership (including circumcision of the penis, in case anyone doubted they were serious). They revert to Stage 1, worshipping a golden calf (Exodus 32), just as they turn, later, to the chaos of worshipping foreign gods (e.g., Ezekiel 14:6); nevertheless, the identity of Israel will ultimately be defined as a tribe and nation under a Law, the Torah, which they see as instituted by God Himself. They will live, increasingly, within clearly delineated boundaries and under ritualized institutional practices. The nation’s story over more than a millennium, up to and beyond the destruction of the Temple in 70 AD, is the struggle to faithfully keep the Law and live within the prescribed boundaries.

Most stories of Jesus interacting with the Pharisees—the religious leaders of his day—reveal, not surprisingly, a people determined to stay within the rules. In fact, part of rabbinic tradition was “building a fence around the law”: adding stricter rules so that you could never even get close enough to the actual law to break it. If the law was “don’t boil a kid in its mother’s milk” (Exodus 23:19), the fence might be “never eat meat and cheese together.” In Stage 2, keeping these sorts of boundaries is of the highest importance. Jesus was crucified, in part, because he was so committed to questioning the boundaries.[4]

Another scriptural example of Stage 2 is the older brother in the Parable of the Two Sons (Luke 15:11-32). While his brother is out hiring prostitutes, he remains at home on his father’s estate, insistent that he has done right and has kept the rules. In fact, he is keenly aware of what he believes is due him because of it; he has lived within the boundaries and feels entitled because of it. And yet, though he has remained on his father’s estate, he doesn’t understand his father. Indeed, his father has to tell him “everything I have is yours” because he doesn’t understand his father’s generosity and goodness (Luke 15:31). Those in Stage 2 may be religious, but that doesn’t mean they understand or have a deep experience of God. Indeed, writes Peck:

Another thing characterizing the religious behavior of Stage 2 people is that their vision of God is almost entirely that of an external, transcendent Being. They have very little understanding of the immanent, indwelling God—the God of the Holy Spirit, or what Quakers call the Inner Light… And although they often consider Him loving, they also generally feel He possesses—and will use—punitive power.[5]

In Stage 2, the view of God is mostly “out there.” This transcendent God is awesome, with clear power; He is a holy God whom you can fear and worship but whose goodness you may not deeply experience and whom you may never come to truly love. Of course, such notions of love are often ignored or dismissed in Stage 2 anyway; the most important value in Stage 2 is stability and security, not love.

Stage 3: The Doubt/Skepticism Stage

Stage 2 gives us an ordered world, but there are deeper drives within us than the drive for stability. We want meaning and we want flourishing; we want to understand the nature of reality. And if we are going to believe, we want to believe sincerely and not just because we were raised to believe or because a tradition was handed down to us. Stage 3 is, thus, a period of deconstruction and reconstruction. It is dominated by an analysis of reality and a quest for truth. People in Stage 3 tend to become agnostics or atheists, or they embrace or return to their faith with a more mature and authentic posture. In either event, they are looking to rebuild their reality.

To understand Stage 3 you have to understand the transition from Stage 2 to 3: Stage 2 tends to break down as people internalize their values and discover that they differ from the values of their religious culture or institution. This leads to asking, “Who needs this (institutional/religious) way of thinking, anyway?” In other words, we are often forced into Stage 3 because Stage 2 simply stops making sense to us. The movement into Stage 3, for example, can be catalyzed by painful experiences which our thinking in Stage 2 can’t account for. In classical Christian thought, the beginning of Stage 3 therefore often aligns with what St. John of the Cross called “the dark night of the soul,” when what-has-worked-up-until-now or how-God-was-experienced-up-until-now no longer works—no longer provides a sense of vibrant and viable orientation to the world. We may realize that “bad things happen to good people” and be thrown into doubt and confusion. We may suffer personal loss and wonder where God is. Indeed, our image of God may seem completely inadequate in light of our suffering and human suffering in general. The world becomes less black and white, and more grays are admitted. Then doubts about the nature of God, the universe, and one’s own beliefs appear like cracks in a wall. People may slowly come to realize that their faith is formulaic and not authentic; that they are committed to it because it provides stability but not because they think, feel, believe, or experience it at a core level. The onset of this doubt phase is a period of unsettling destabilization, as previously held beliefs are deconstructed, and the movement into Stage 3 is thus generally a slow process, a dam bursting in stages after a long period of cracking.

If the movement from Stage 1 to Stage 2 is a conversion to belief, from chaos to formal religion, the movement from Stage 2 to 3 is a different type of conversion: to skepticism. At the same time, it does not require letting go of one’s faith. Or it may seem to demand exactly that. In either case, Stage 3 demands an authentic wrestling with the nature of reality, not simply reality as received growing up, as taught by one’s default authorities (parents, teachers, religious leaders). At the same time, Stage 3 truth-seekers may return to the teachings of those prior authorities. As Peck says, they often discover that the mosaic of truth, though bigger than they can comprehend, “strangely resembles those ‘primitive myths and superstitions’ their Stage 2 parents or grandparents believe in.”[6] In other words, they may land back where they began, though, no doubt, they will hold their beliefs differently. Stage 3, then, if it leads to faith, will lead to an inwardly-directed rather than outwardly-directed faith. That is, it will not simply be a faith dictated by external structures—family, church, synagogue, mosque; rather, faith will be centered on deeply felt, internal conscience.

A scriptural example of Stage 3 is the Book of Job, which was written to deal with the crisis of faith that occurs between Stages 2 and 3. Namely, Job has done things right (lived within the boundaries, as it were) and yet painful things are happening to him. The Book of Deuteronomy seems to promise that if you do right, good things will happen, but that isn’t Job’s experience.[7] What does that mean, and how does Job respond? And where on earth is God in the midst of it? The Book of Job addresses these questions without resolving them.[8] The brilliance of Scripture is that it addresses rather than shies away from such questions.

St. Paul on the Damascus Road (Acts 9:1-9ff) provides another example. Paul is a zealous Pharisee who, once again, has “done it right.” Or so he thinks. He is a real-life older brother, centered in Stage 2. But he is thrown into radical doubt and deconstruction when he encounters Jesus. In Paul’s case, the movement from Stage 2 to 3 is not from religious certainty to agnosticism or atheism but rather from religious certainty to religious doubt and uncertainty, which may be true for many people making the movement from 2 to 3.[9]

Stage 4: The Mystical/Communal Stage

It is important to note that while the progression from Stage 1 to 2 to 3 to 4 is linear, it’s not a progression in the goodness of human beings nor a statement on their commitment (or lack of commitment) to transformation. People who enter Stage 3 may doubt and become skeptics about God, becoming agnostics or atheists, all while continuing to develop as human beings. Ironically, they may be “nonbelievers” but, as Peck says, be “generally more spiritually developed than many content to remain in Stage 2.”[10] Or they may return to their faith with a more informed and matured posture. Indeed, the possibility of faith in Stage 3 is that it is transformed from formula (“I was born a Christian, so I believe it”) to sincere and authentic internalization (“I have experienced this, therefore I own it”). This posture makes it possible to deeply internalize the experience of God and, indeed, to head into an increasing experience of God and of spiritual wisdom.

In Stage 4, the Mystical/Communal stage:

1. You realize all things are connected. 

2. You appreciate mystery rather than resist it, as people in Stage 2 tend to do. (People in Stage 4 enter religion to encounter mystery; people in Stage 2 enter religion in order to escape it.)

3. You embrace emptiness, emptied of prejudices; you do not think in terms of factions but instead see underlying connectedness.[11]

What does this look like in simple terms? People in Stage 4 have moved beyond bitterness and embraced forgiveness. They take themselves less seriously. They know that they are connected to other human beings and, indeed, to all creation. They become kind (which does not always mean nice; they know that love can and does sting). They seek to care for others—for neighbors, just as Jesus taught. Above all, they understand that the heart of God is zealous, generous love. Thus, we tend to gravitate to people in Stage 4. They exude a wisdom that is captivating, a kindness that is an invitation.

A scriptural example of Stage 4 is (not surprisingly) Jesus. Listen to his words in the High Priestly Prayer of John 17:

“I am praying not only for these disciples but also for all who will ever believe in me through their message. I pray that they will all be one, just as you and I are one—as you are in me, Father, and I am in you. And may they be in us so that the world will believe you sent me.

“I have given them the glory you gave me, so they may be one as we are one. I am in them and you are in me. May they experience such perfect unity that the world will know that you sent me and that you love them as much as you love me.” (John 17:20-23, NLT)

Jesus focuses on how things are connected, or how he longs for them to be, anyway. This is why I call Stage 4 the “Union” stage. Union means seeing, as the New Testament says, that all things hold together in God and that God, as Jesus says above, wants us to be one with him.[12]

The conversion from Stage 1 (Anti-Social/Chaotic) to Stage 2 (Formal/Institutional) is generally sudden. One “sees the light,” as it were. The conversion from Stage 2 to 3 (Doubt/Skepticism) is generally gradual. And the conversion from Stage 3 to Stage 4 (Mystical/Communal) is the most gradual of all. Coming to know, experience, encounter, and rely on the goodness of God, despite the suffering of life, is a lifelong journey. It requires a growing awareness that God is not only “out there” but “right here,” not only transcendent but immanent. And that He is not only the All-Powerful One but also the All-Vulnerable, All-Suffering One. The understanding of God as immanent, “right here,” helps you understand that all things are touched by God and exist in God. Indeed, mysticism, as Peck means it, simply means seeing this ground-level connectedness. He calls Stage 4 “communal” because in it the individual sees they are part of a whole and not simply an autonomous individual. This has vast ramifications for how you treat other human beings, how you care for the world around you and, indeed, what you believe God is up to in the world.

Understanding the 4 Stages

Why is it important to understand the four stages? 

In the Creation stories in Genesis 1 and 2, one of the first things Adam does is name the animals. In other words, he labels reality. Awareness of the nature of reality is a prerequisite for flourishing. (As a Chinese proverb says, “The right naming of things is the beginning of wisdom.”) So understanding the four stages helps us on a personal level. It will change for the better how we interact with the world. Right understanding always does.

But furthermore (and without hyperbole), there has been no point in history in which understanding the dynamics of these four stages is as important as it is now. We live in an era where the ability to converse across stages is vanishing. People are locking down in rigidity—about religion, politics, political correctness—losing the ability to think and, therefore, to dialogue or thoughtfully converse. People have always gotten stuck in Stage 2, but our “always on” technology amplifies our stuckness. After all, our technological progress is a double-edged sword, affecting us for good and ill. The 24-hour news cycle is transforming our attention spans, and the algorithms of Facebook and Instagram ensure we are presented with articles and videos which simply reinforce the worldview we already hold. Alarmingly, facts no longer seem to matter, though we’ve never had more access to information and data. On a societal level, we are being shepherded into rigid Stage 2 thinking, into a reactive posture in which we defend our boundaries without thinking or analysis. This is a perilous development. It will require clear thinkers with mature hearts and minds to lead us out of it.

In the upcoming essays, I will focus on how people get stuck in Stage 2 and never progress beyond it. This means exploring religious rigidity, which prevents movement from Stage 2 to Stages 3 and 4. But first, in the next essay, I will look at the Four Stages as a whole, at a deeper level.

 

 

[1] Peck, M. Scott. The Road Less Traveled: A New Psychology of Love, Traditional Values and Spiritual Growth. Touchstone/Simon & Schuster. 1985.

[2] An experience he writes about in Glimpses of the Devil: A Psychiatrist's Personal Accounts of Possession. Free Press. 2005.

[3] Peck, M. Scott. The Different Drum: Community Making and Peace. Touchstone Press. 1987.

[4] For an illustration of Jesus questioning man-made boundaries, see Matthew 12:1-8.

[5] The Different Drum, 190.

[6] The Different Drum, 192.

[7] See Deuteronomy 28 and its promise that “If you fully obey,” God will bless you.

[8] See Job 38-41. God basically replies, “I’m God and you’re not.”

[9] The contemporary trend of millennials leaving the evangelical church represents a classic pattern of people moving from Stage 2 into Stage 3, in this case from some sort of faith to “no faith” or to some reconstructed version of faith. If or how these millennials return to church or re-construct the church will be a fascinating social study. See “Will Young Evangelicals Come Back to Church?” by Myriam Renaud.

https://religionandpolitics.org/2019/06/25/will-young-evangelicals-come-back-to-church/ [July 2, 2019].

[10] The Different Drum, 191.

[11] The Different Drum, 193.

[12] See Colossians 1:17.