Be Rational

Rationality is uncool. To describe someone with a slang word for the cerebral, like nerd, wonk, geek, or brainiac, is to imply they are terminally challenged in hipness. For decades, Hollywood screenplays and rock-song lyrics have equated joy and freedom with an escape from reason. “A man needs a little madness or else he never dares cut the rope and be free,” said Zorba the Greek. “Stop Making Sense,” advised Talking Heads; “Let’s go crazy,” adjured the Artist Formerly Known as Prince. Fashionable academic movements like postmodernism and critical theory (not to be confused with critical thinking) hold that reason, truth, and objectivity are social constructions that justify the privilege of dominant groups. These movements have an air of sophistication about them, implying that Western philosophy and science are provincial, old-fashioned, naïve to the diversity of ways of knowing found across periods and cultures. To be sure, not far from where I live in downtown Boston there is a splendid turquoise and gold mosaic that proclaims, “Follow reason.” But it is affixed to the Grand Lodge of the Masons, the fez- and apron-sporting fraternal organization that is the answer to the question “What’s the opposite of hip?”

My own position on rationality is “I’m for it.” Though I cannot argue that reason is dope, phat, chill, fly, sick, or da bomb, and strictly speaking I cannot even justify or rationalize reason, I will defend the message on the mosaic: we ought to follow reason.

To begin at the beginning: what is rationality? As with most words in common usage, no definition can stipulate its meaning exactly, and the dictionary just leads us in a circle: most define rational as “having reason,” but reason itself comes from the Latin ration-, often defined as “reason.”

A definition that is more or less faithful to the way the word is used is “the ability to use knowledge to attain goals.” Knowledge in turn is standardly defined as “justified true belief.” We would not credit someone with being rational if they acted on beliefs that were known to be false, such as looking for their keys in a place they knew the keys could not be, or if those beliefs could not be justified—if they came, say, from a drug-induced vision or a hallucinated voice rather than observation of the world or inference from some other true belief.

Emblem of Boston Masons' Grand Lodge

The beliefs, moreover, must be held in service of a goal. No one gets rationality credit for merely thinking true thoughts, like calculating the digits of pi, or cranking out the logical implications of a proposition (“Either 1 + 1 = 2 or the moon is made of cheese,” “If 1 + 1 = 3, then pigs can fly”). A rational agent must have a goal, whether it is to ascertain the truth of a noteworthy idea, called theoretical reason, or to bring about a noteworthy outcome in the world, called practical reason (“what is true” and “what to do”). Even the humdrum rationality of seeing rather than hallucinating is in the service of the ever-present goal built into our visual systems of knowing our surroundings.

A rational agent, moreover, must attain that goal not by doing something that just happens to work there and then, but by using whatever knowledge is applicable to the circumstances. Here is how William James distinguished a rational entity from a nonrational one that would at first appear to be doing the same thing:

Romeo wants Juliet as the filings want the magnet; and if no obstacles intervene he moves toward her by as straight a line as they. But Romeo and Juliet, if a wall be built between them, do not remain idiotically pressing their faces against its opposite sides like the magnet and the filings with the card. Romeo soon finds a circuitous way, by scaling the wall or otherwise, of touching Juliet’s lips directly. With the filings the path is fixed; whether it reaches the end depends on accidents. With the lover it is the end which is fixed; the path may be modified indefinitely.

With this definition the case for rationality seems all too obvious: do you want things or don’t you? If you do, rationality is what allows you to get them.

Now, this case for rationality is open to an objection. It advises us to ground our beliefs in the truth, to ensure that our inference from one belief to another is justified, and to make plans that are likely to bring about a given end. But that only raises further questions. What is “truth”? What makes an inference “justified”? How do we know that means can be found that really do bring about a given end? But the quest to provide the ultimate, absolute, final reason for reason is a fool’s errand. Just as an inquisitive three-year-old will reply to every answer to a “why” question with another “Why?,” the quest to find the ultimate reason for reason can always be stymied by a demand to provide a reason for the reason for the reason. Just because I believe P implies Q, and I believe P, why should I believe Q? Is it because I also believe [(P implies Q) and P] implies Q? But why should I believe that? Is it because I have still another belief, {[(P implies Q) and P] implies Q} implies Q?

This regress was the basis for Lewis Carroll’s 1895 story “What the Tortoise Said to Achilles,” which imagined the conversation that would unfold when the fleet-footed warrior caught up to (but could never overtake) the tortoise with the head start in Zeno’s second paradox. (In the time it took for Achilles to close the gap, the tortoise moved on, opening up a new gap for Achilles to close, ad infinitum.) Carroll was a logician as well as a children’s author, and in this article, published in the philosophy journal Mind, he imagines the warrior seated on the tortoise’s back and responding to the tortoise’s escalating demands to justify his arguments by filling up a notebook with thousands of rules for rules for rules. The moral is that reasoning with logical rules at some point must simply be executed by a mechanism that is hardwired into the machine or brain and runs because that’s how the circuitry works, not because it consults a rule telling it what to do. We program apps into a computer, but its CPU is not itself an app; it’s a piece of silicon in which elementary operations like comparing symbols and adding numbers have been burned. Those operations are designed (by an engineer, or in the case of the brain by natural selection) to implement laws of logic and mathematics that are inherent to the abstract realm of ideas.
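Carroll’s regress, and the way out of it, can be sketched in a few lines of code. This is purely my illustration, not Carroll’s or Pinker’s; the function names are invented for the example.

```python
# A sketch of Carroll's regress, under the assumption that every
# inference must cite an explicit rule. All names here are invented
# for illustration.

def hardwired_modus_ponens(p: bool, p_implies_q: bool) -> bool:
    """Modus ponens as a burned-in operation: the machine does not
    consult a rule saying it may infer Q; it simply computes it."""
    return p and p_implies_q

def tortoise_regress(depth: int) -> list[str]:
    """The tortoise's alternative: each rule must itself be admitted
    as a premise licensed by a further rule, producing premises
    without end (truncated here at the given depth)."""
    premises = ["P", "P -> Q"]
    rule = "(P and (P -> Q)) -> Q"
    for _ in range(depth):
        premises.append(rule)                           # write the rule into the notebook
        rule = f"({' and '.join(premises)}) -> Q"       # ...which needs a rule of its own
    return premises

# The hardwired operation terminates; the regress never licenses Q.
print(hardwired_modus_ponens(True, True))   # True: Q follows
print(len(tortoise_regress(3)))             # 5 premises and counting
```

The design point is the one the excerpt makes: `hardwired_modus_ponens` does not look anything up, so there is nothing left to justify; `tortoise_regress` treats each rule as one more premise, and so can fill the notebook forever without ever reaching Q.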

Now, Mr. Spock notwithstanding, logic is not the same thing as reasoning. But they are closely related, and the reasons the rules of logic can’t be executed by still more rules of logic (ad infinitum) also apply to the justification of reason by still more reason. In each case the ultimate rule has to be “Just do it.” At the end of the day the discussants have no choice but to commit to reason, because that’s what they committed themselves to at the beginning of the day, when they opened up a discussion of why we should follow reason. As long as people are arguing and persuading and then evaluating and accepting or rejecting the arguments—as opposed to, say, bribing or threatening each other into mouthing some words—it’s too late to ask about the value of reason. They’re already reasoning, and have tacitly accepted its value.

When it comes to arguing against reason, as soon as you show up, you lose. Let’s say you argue that rationality is unnecessary. Is that statement rational? If you concede it isn’t, then there’s no reason for me to believe it—you just said so yourself. But if you insist I must believe it because the statement is rationally compelling, you’ve conceded that rationality is the measure by which we should accept beliefs, in which case that particular one must be false. In a similar way, if you were to claim that everything is subjective, I could ask, “Is that statement subjective?” If it is, then you are free to believe it, but I don’t have to. Or suppose you claim that everything is relative. Is that statement relative? If it is, then it may be true for you right here and now but not for anyone else or after you’ve stopped talking. This is also why the recent cliché that we’re living in a “post-truth era” cannot be true. If it were true, then it would not be true, because it would be asserting something true about the era in which we are living.

This argument, laid out by the philosopher Thomas Nagel in The Last Word, is admittedly unconventional, as any argument about argument itself would have to be. Nagel compared it to Descartes’s argument that our own existence is the one thing we cannot doubt, because the very fact of wondering whether we exist presupposes the existence of a wonderer. The very fact of interrogating the concept of reason using reason presupposes the validity of reason. Because of this unconventionality, it’s not quite right to say that we should “believe in” reason or “have faith in” reason. As Nagel points out, that’s “one thought too many.” The masons (and the Masons) got it right: we should follow reason.

Now, arguments for truth, objectivity, and reason may stick in the craw, because they seem dangerously arrogant: “Who the hell are you to claim to have the absolute truth?” But that’s not what the case for rationality is about. The psychologist David Myers has said that the essence of monotheistic belief is: (1) There is a God and (2) it’s not me (and it’s also not you). The secular equivalent is: (1) There is objective truth and (2) I don’t know it (and neither do you). The same epistemic humility applies to the rationality that leads to truth. Perfect rationality and objective truth are aspirations that no mortal can ever claim to have attained. But the conviction that they are out there licenses us to develop rules we can all abide by that allow us to approach the truth collectively in ways that are impossible for any of us individually.

The rules are designed to sideline the biases that get in the way of rationality: the cognitive illusions built into human nature, and the bigotries, prejudices, phobias, and -isms that infect the members of a race, class, gender, sexuality, or civilization. These rules include principles of critical thinking and the normative systems of logic, probability, and empirical reasoning. They are implemented among flesh-and-blood people by social institutions that prevent people from imposing their egos or biases or delusions on everyone else. “Ambition must be made to counteract ambition,” wrote James Madison about the checks and balances in a democratic government, and that is how other institutions steer communities of biased and ambition-addled people toward disinterested truth. Examples include the adversarial system in law, peer review in science, editing and fact-checking in journalism, academic freedom in universities, and freedom of speech in the public sphere. Disagreement is necessary in deliberations among mortals. As the saying goes, the more we disagree, the more chance there is that at least one of us is right.

Though we can never prove that reasoning is sound or the truth can be known (since we would need to assume the soundness of reason to do it), we can stoke our confidence that they are. When we apply reason to reason itself, we find that it is not just an inarticulate gut impulse, a mysterious oracle that whispers truths into our ear. We can expose the rules of reason and distill and purify them into normative models of logic and probability. We can even implement them in machines that duplicate and exceed our own rational powers. Computers are literally mechanized logic, their smallest circuits called logic gates.
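The observation that computers are mechanized logic can be seen in miniature. Here is a small illustrative sketch (mine, not the book’s) that builds the familiar gates out of a single NAND operation, much as real hardware does:

```python
# Elementary logic burned into circuitry: every gate below is composed
# from NAND alone, as in actual hardware. Illustrative sketch only.

def nand(a: bool, b: bool) -> bool:
    return not (a and b)

def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

def implies(p: bool, q: bool) -> bool:
    # Material implication: P -> Q is equivalent to (not P) or Q.
    return or_(not_(p), q)

# The laws of logic hold because of how the operations compose,
# not because the circuit consults a rulebook.
assert all(implies(p, q) == ((not p) or q)
           for p in (False, True) for q in (False, True))
```

Nothing in the sketch "knows" logic; the truth tables fall out of the wiring, which is the sense in which the circuitry implements, rather than obeys, the laws of logic.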

Another reassurance that reason is valid is that it works. Life is not a dream, in which we pop up in disconnected locations and bewildering things happen without rhyme or reason. By scaling the wall, Romeo really does get to touch Juliet’s lips. And by deploying reason in other ways, we reach the moon, invent smartphones, and extinguish smallpox. The cooperativeness of the world when we apply reason to it is a strong indication that rationality really does get at objective truths.

And ultimately even relativists who deny the possibility of objective truth and insist that all claims are merely the narratives of a culture lack the courage of their convictions. The cultural anthropologists or literary scholars who avow that the truths of science are merely the narratives of one culture will still have their child’s infection treated with antibiotics prescribed by a physician rather than a healing song performed by a shaman. And though relativism is often adorned with a moral halo, the moral convictions of relativists depend on a commitment to objective truth. Was slavery a myth? Was the Holocaust just one of many possible narratives? Is climate change a social construction? Or are the suffering and danger that define these events really real—claims that we know are true because of logic and evidence and objective scholarship? Now relativists stop being so relative.

For the same reason there can be no tradeoff between rationality and social justice or any other moral or political cause. The quest for social justice begins with the belief that certain groups are oppressed and others privileged. These are factual claims and may be mistaken (as advocates of social justice themselves insist in response to the claim that it’s straight white men who are oppressed). We affirm these beliefs because reason and evidence suggest they are true. And the quest in turn is guided by the belief that certain measures are necessary to rectify those injustices. Is leveling the playing field enough? Or have past injustices left some groups at a disadvantage that can only be set right by compensatory policies? Would particular measures merely be feel-good signaling that leaves the oppressed groups no better off? Would they make matters worse? Advocates of social justice need to know the answers to these questions, and reason is the only way we can know anything about anything.

Admittedly, the peculiar nature of the argument for reason always leaves open a loophole. In introducing the case for reason, I wrote, “As long as people are arguing and persuading…,” but that’s a big “as long as.” Rationality rejecters can refuse to play the game. They can say, “I don’t have to justify my beliefs to you. Your demands for arguments and evidence show that you are part of the problem.” Instead of feeling any need to persuade, people who are certain they are correct can impose their beliefs by force. In theocracies and autocracies, authorities censor, imprison, exile, or burn those with the wrong opinions. In democracies the force is less brutish, but people still find means to impose a belief rather than argue for it. Modern universities—oddly enough, given that their mission is to evaluate ideas—have been at the forefront of finding ways to suppress opinions, including disinviting and drowning out speakers, removing controversial teachers from the classroom, revoking offers of jobs and support, expunging contentious articles from archives, and classifying differences of opinion as punishable harassment and discrimination. They respond as Ring Lardner recalled his father doing when the writer was a boy: “‘Shut up,’ he explained.”

If you know you are right, why should you try to persuade others through reason? Why not just strengthen solidarity within your coalition and mobilize it to fight for justice? One reason is that you would be inviting questions such as: Are you infallible? Are you certain that you’re right about everything? If so, what makes you different from your opponents, who also are certain they’re right? And from authorities throughout history who insisted they were right but who we now know were wrong? If you have to silence people who disagree with you, does that mean you have no good arguments for why they’re mistaken? The incriminating lack of answers to such questions could alienate those who have not taken sides, including the generations whose beliefs are not set in stone.

And another reason not to blow off persuasion is that you will have left those who disagree with you no choice but to join the game you are playing and counter you with force rather than argument. They may be stronger than you, if not now then at some time in the future. At that point, when you are the one who is canceled, it will be too late to claim that your views should be taken seriously because of their merits.

Excerpted, with permission, from Rationality: What It Is, Why It Seems Scarce, Why It Matters, by Steven Pinker. Published by Viking, an imprint of Penguin Random House LLC. Copyright © 2021 by Steven Pinker.


When Mr. Pinker started discussing the quest for social justice versus the need for rationality I thought he might say something enlightening. Alas, . . .

Postmodernism may have a point that knowledge is imperfect: every theory can be displaced by another if its successor is a better fit for the empirically observed, and the true test would appear to be time. But what postmodernists neglect to mention is that theories in science invariably fit empirical observation better as they displace previous ones, unless, of course, they have been captured by some external, ideologically motivated movement. Archimedes’ Principle is just as true today as it was in Ancient Greece, and Bernoulli’s Principle has lasted over 200 years. But let’s examine the claim that, instead of being truly universal, Western systems of knowledge, empiricism, and science are designed to unduly privilege white men over others.

If that were the case, then Western systems of knowledge would have no value or utility in other cultures; instead we see the reverse. Some may not value democracy as we do, but other cultures have adopted virtually every branch of Western knowledge, from actuarial science to biology. The most successful countries in Africa are those that kept their colonial institutions, while those that chose indigenous systems or fell prey to Soviet advisers preaching communism are poorer by every metric.

And to be fair, many of the ideas of the Enlightenment were not original concepts. The market had existed in several times and places, even though its existence was a rare and precious occurrence. Islamic civilisation had been an inheritor of the knowledge of antiquity; it adopted Indian numbering systems and pushed the boundaries of mathematics and optics. Unfortunately, with the dominion of clerics over science it lost not only its naming rights but also its civilisational ascendancy. Should we make the same mistake?

The true culprits in the failure to close educational attainment gaps between arbitrary groups are educational theorists. By failing to adopt and make central the most empirically evidenced theory of how the brain learns, Cognitive Load Theory, they have negated the power of public education to lift people up regardless of circumstance. Crucially, students need to commit large blocks of practical knowledge to long-term memory in areas like reading or maths if they are to have any chance of performing cognitively complex tasks, because working memory is puny in even the brightest minds. This axiom extends even to the value of the knowledge individuals can extract from the internet, which depends on their existing knowledge and the framing of very precise questions.

When educational systems are not geared to optimise learning through the best science available, other factors become more important, including but not limited to socio-economics, parental education, parental time engagement (including with the peer group), peer group, and the level of fathers in the community. But here’s the thing: it doesn’t have to be this way. Public education, whether provided by a conventional school or a charter, can still be transformative.

Newham is the second-poorest borough in London. Its crime rate is perennially high. It serves a diverse, multi-ethnic community. Yet recently Brampton Manor Academy placed more students at Oxbridge than Eton College did. Students receive detention for being a minute late and are expected to carry an activity book so staff know where they are supposed to be at all times. Parents are also subject to the discipline code. At the same time that Eton has been sacking teachers for encouraging controversial discussions on the subject of patriarchy, Brampton Manor Academy has been reinventing education as a knowledge-rich means of learning, with every resource geared to optimising every moment of the student day.

The failure to correct class and racial inequities is not a product of Western science or knowledge; nor does it stem from the lack of a universally accessible portal through which all can learn fairly according to their individual ability. It’s the impact that postmodernism has had in devaluing knowledge, and by extension devaluing most of all those who could benefit from its transformative power. It’s not that Western knowledge inherently privileges white people; it’s that educational theorists and bureaucrats have been inadvertently sabotaging the transmission of a universally accessible system of empirical knowledge to Black and Brown people, through the educationally poisonous doctrine of postmodernism.

Even now, as Cognitive Load Theory starts to become an educational fad, its most important feature is being de-emphasised: several of the resources for teachers I initially checked either neglected or completely omitted the importance of having pupils commit large segments of usable knowledge, or knowledge schemas, to long-term memory. Instead the emphasis seems to be on applying a working knowledge of some parts of Cognitive Load Theory to presentation skills, reducing ‘problem space’, and changing visual data to reduce split attention.

These may all be well and good, but unless most pupils learn their times tables by rote they will never be good at maths, and unless they learn phonics as well as whole-word reading, it is unlikely they will ever read for pleasure unless they are unusually bright. Perhaps educational theorists should learn the rest of the Plutarch quote on education: “Education is not the filling of a pail, but the lighting of a fire.” “For the mind does not require filling like a bottle, but rather, like wood, it only requires kindling to create in it an impulse to think independently and an ardent desire for the truth.” If they understood the broader reference, they would concede that one still needs the kindling (knowledge) as well as the fire to learn and understand.



The problem with this article is that it maintains an Enlightenment world-view, and is completely unaware of what I call the “German Turn” that starts with Kant’s “Critiques.”
I say that Kant’s notion that we cannot know “things-in-themselves” but only appearances gives us permission, e.g., to say that space and time are not absolute, as everybody knows, but, like, relative, OMG! Kant points directly to Einstein and quantum mechanics, do not pass Go.
The point of Kant’s Critique is that human knowledge does not come from reason but critique. In most of human history the kings and the priests and the philosophers decided what was true and you’d better not question them.
But we live in an “era of criticism” that recognizes that the royal road to knowledge is something like Karl Popper’s notion: it starts with a problem with current knowledge, then proceeds to a critique of current theory, then comes up with another theory, then waits for another problem to appear.
Of course, “critical theory” is a joke. You’d better bow to its truth or you will get canceled. No critique allowed.


I agree with this article in most ways, but the idea of an ‘objective reality’ divorced from our existence is the equivalent of religion for me. To me reality is a construct built of all the experiences and inferences based on successful predictability and open to integrating any new experience. The Platonic concept of a reality of which our experience is just a foggy reflection is the root of all the malevolent philosophical forces of history.


I enjoyed Steven’s article because it gave a decent round-up of why ‘reason’ in the broad sense is a superior icon of human progress to the self-cancelling schlock fantasizing that is postmodernism.

But I think Moleculist comes closest to the problem with Steven’s analysis by positing that objective reason can never be the sole source of intellectual/cultural authority, any more than subjective perception/feeling can.

I would assert that the subjective and objective mind are both crucial elements inside a model of consciousness where they dance together, in the same way that in an older parlance, faith and reason do.

If for any reason the dancing stops and that partnership breaks up, subjectivity becomes solipsism and objectivity becomes an abstract mind game; faith becomes blind and reason becomes anybody’s.

Each needs the other for vital services it cannot provide alone; those services secure the overall relationship and give a unified coherence and authority to their understandings…

Subjectivity is very grounding and personal. Objectivity provides overview and collective understanding. Together they deepen that understanding and give consciousness a layered robustness it would not otherwise have. And to succeed, each has to absorb enough of the other to appreciate its own weaknesses and strengths, which is critically important in the simpatico give and take of their dance.

The same goes for faith and reason. Faith must be sufficiently plausible for reason to want to dance with it. And reason must be sufficiently grounded and humble enough to know that the assumptions inside any model of reality and argument for it are infinitely regressive.

In the end all the underlying sources of knowledge are fundamentally opaque, because as the tools of abstract and practical knowledge evolve, so does the picture they paint.

My interest in this is in the breakup of unified consciousness and the mess that this is creating. Steven is having to defend reason and objectivity because faith and subjectivity haven’t just walked out of the dance. They want to take over the dance floor. Faith fundamentalism of both the traditional and the woke kind is on the march and increasingly prepared to use force to get its way.

Something really awful has to have happened, and Steven clearly hasn’t registered that that might be the problem. I am suggesting that the civilianized war economy that evolved out of total war mobilization, whereby war production run by military machines was replaced by production war run by marketing machines, looked pretty cool up front, but its long-term downstream results have been absolutely devastating across all platforms, whether we are talking about ecosystems, social systems, or existential/spiritual ‘software’ ones.

The ‘tragedy of the commons’ has turned into a slow-motion deregulatory and privatizing catastrophe that is now speeding up as our reality starts to dissolve in front of us. What we are looking at is a whole world order that is in trouble and getting a taste of its mortality.

The last time that happened was the Reformation, when the Medieval world was dissolved and replaced, ultimately, by modernity and capitalism. Everyone was forced back to what they would tolerate and the bottom lines for which they would fight. That continued on and off for 150 years.

I think these arguments are precursors to what may become a great disaggregation, as we move into a post-modern interregnum where old forces and new, solid contenders and opportunists and the best and worst of us contend for keeps in winner takes all struggles that will go on for generations, until the future becomes clearer and a new way forward makes itself plain…and where the dancers are talking to each other again and wending their way back to the dance floor.

Don’t hold your breath on that.


There is much I agree with in the comments of both Moleculist and c.d.eastmannagle.

I would further argue that Pinker’s rather uncritical notion of objectivity represents a form of old-school “Enlightenment Fundamentalism.” Ironically, it is precisely this academic fundamentalism that led to the postmodern swing toward subjectivity in the first place. The swing was, at first, a corrective, which became disastrous once it took over the academic (and finally the cultural) “dance floor.”

I think the underlying tension between objectivity/unity and postmodernism/diversity is best resolved by the philosophical position of Critical Realism. It acknowledges one’s positional limitations (lack of omniscience) while affirming that there is the Really Real (existence is not an illusion, and it is knowable because it is orderly).

Happily, critical realism is non-sectarian and can be held by both the secular materialist (who holds an a priori belief in a ‘closed universe’ that evolved from chaos to order) and the Christian philosopher (who holds an a priori belief in a designed universe open to revelation).

That said, my main critique of Pinker is that his arguments for rationalism have all been made before. So the real issue for him is: why do they fail so spectacularly with today’s university students? Why will he end up preaching to the choir? Why do young adults so fully embrace being irrational and incoherent?

You could argue that kids have been indoctrinated since grade school. But here is my deeper answer: because Pinker’s arguments do not give them any personal meaning.

His sort of Enlightenment rationalism had pretty much run aground in the academy by the late 1970s. Because re-introducing the original concepts upon which universities were founded was a non-starter, the only thing left to do was pursue technology and affluence. Ultimately that leaves a void.

So like it or not, social justice activism swooped in to fill it with a form of meaning and purpose. Pinker’s worldview will not undo that.


BTW, here is Karl Popper’s definition of “rationality” from Knowledge and the Mind-Body Problem.

I mean by “rationality” simply a critical attitude towards problems – the readiness to learn from our mistakes, and the attitude of consciously searching for our mistakes and our prejudices. Thus, I mean by “rationality” the attitude of conscious, critical error elimination.


This seems to hearken back to that Scientism thread (?) from one or two years ago (?). Those who argue for religion (perhaps just the theistic ones) start from a position that is not falsifiable. So too, those who argue for logic, rationality, and science simply understand/accept that those things just are. Different ways to skin a cat, I suppose.

Chris, the problem here is that ‘learning from our mistakes’ assumes what it needs to prove, which is that they were ‘mistakes’. ‘Mistakes’ are constructed in hindsight. Acting in real time is governed by foresight, which is always a gamble and loaded with elements that are unknowable when acting, even with the best preparation. Could one have prepared better? One can always prepare better. And your critics always think you could have prepared better, which is how they construct your ‘mistakes’: anything that doesn’t go as planned or hoped for.

We all have ‘prejudices’, because we couldn’t operate any sort of argument or position without them. ‘Prejudice’ has come to mean ‘views WE don’t like or approve of’. But really, prejudices are operating assumptions one can neither prove nor disprove. They form the design parameters for our views of reality. If we want to inspect them, we have to move into another critical model, which has its own assumptions/prejudices…ad nauseam.

And I think what that means is that all knowledge is provisional. Nobody ever gets the final word. But as we move through our particular time/space, some provisional understandings will get more traction because of the larger dynamics going on at the time, which give them their plausibility license and a potential use-by time horizon.

If we are going through a period of deregulation and privatization of the social system, certain prejudices, attitudes, analyses, and the behavior stemming from them will do much better and seem more compelling and legitimate than others, in a way they would not necessarily at any other time.

I would suggest we are starting to go through a transitional period where all that is up for grabs, which means we have to slug it out, literally and metaphorically, to see who wins, and wins most often.

What I am arguing for is the historicity of ideas; that they do not stand and fall purely on the validity of their own arguments. Context is critically important as to why one argument flies and another doesn’t.

I think postmodernism is unadorned schlock, but in a marketed world that runs on fantasy and perception and completely bypasses critical evaluation, it works a treat…

Hope that helps clarify matters.


About half-way thru the article I concluded that the author had caught up with the tortoise and was now sitting atop it and writing out the endless dialogue between them, but I couldn’t see any objective in the continuing journey.

Social Justice activism does seem to fill the god shaped hole nicely. Many people have pointed out the faith based nature of SJ culture.

It’s a very appealing religion actually, since it is easily customizable for individual needs - so long as one believes the fantasies of the other members of the cult, the cult will reciprocate.

As long as folkx are high on this stuff, reason and logic won’t be effective.

Reality will always win in the long run though. Fantasy cultures aren’t really designed to keep the electricity on, the water running, or food on the table.

Pinker’s arguments will likely be better received as the results of the policies implemented by the woke clerisy become more and more obvious to the rest of us. I only hope the revelation comes sooner rather than later.


Growing up, among a zillion other dad-isms, when the topic of rational thought came up my father used to say, “Man is not a rational animal, man is a rationalizing animal.”

He was saying that people will use a supposedly rational thinking process to reach the conclusions they wanted to reach anyway, or to convince others of those conclusions. For example, rationalization can convince you that printing money in the absence of increased production won’t devalue it. It can convince you that if we cut spending on police, black people will be safer. It can convince you that Christians are to be loathed while Muslims are to be respected.
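The money-printing example can be given rough arithmetic form via the textbook quantity theory of money, MV = PQ. The figures below are invented purely for illustration: holding velocity V and real output Q fixed, the price level P must scale in proportion to the money supply M.

```python
# Toy quantity-theory-of-money illustration: MV = PQ.
# With velocity V and real output Q held constant, solving for the
# price level gives P = MV/Q, so P scales linearly with M.

def price_level(money_supply: float, velocity: float, real_output: float) -> float:
    """Solve the exchange equation MV = PQ for the price level P."""
    return money_supply * velocity / real_output

V, Q = 2.0, 1000.0                      # hypothetical velocity and output
p_before = price_level(1000.0, V, Q)    # money supply M = 1000 -> P = 2.0
p_after = price_level(2000.0, V, Q)     # money supply doubled  -> P = 4.0

print(p_before, p_after)
```

Under these (deliberately simplistic) assumptions, doubling the money supply without increased production exactly doubles prices, which is the devaluation the rationalizer talks themselves out of.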

The development and maturation of the scientific method gave us tools to identify and weed out rationalizations by eliminating various types of bias. It introduced the concept of falsifiability, allowing us to determine whether something was testable rather than faith-based. It allowed us to pressure-test our rationalizations.

But the scientific method only works when it’s applied consistently and brutally. For example, many theoretical physicists today believe that matter is composed of tiny vibrating strings residing in ten dimensions (eleven, in the M-theory formulation). There is currently no way, and none in the foreseeable future, to test this theory.

It therefore fails the test of falsifiability. Despite this, many physicists feel strongly that superstring theory is valid, in large part because the math it’s based on is beautiful. By believing the theory is correct on the strength of its mathematical beauty, these scientists are, in my view, engaging in rationalized thought rather than rational thought.

That said, I’m not sure I’d care to live in a truly rational society. For example, it’s actually rational to consider eugenics as a means of eliminating the genetic defects that pile up when natural selection is no longer in play, as is the case with human society today. I absolutely don’t want to live in a society that engages in eugenics, though.

For me, rational thought needs to be combined with a set of values. Rational thought alone is scary.