
Reason and Reality in an Era of Conspiracy

Did Nelson Mandela die in prison or did he die years later? Many people reading this probably lived through such recent history, and know perfectly well that he died decades after his release. Many of our students, however, do not know this because they didn’t live through it, and they haven’t been taught it. That is trivially true, and not particularly worrisome. But in 2010 a quirky blogger named Fiona Broome noticed that many people she met – people who should know better – incorrectly believed that Mandela had died in prison. She dubbed this kind of widespread collective false memory the “Mandela Effect” and began gathering more cases, inspiring others to contribute examples to a growing online database. Ask your students if they’ve heard of the Mandela Effect and you will find that 95 percent are familiar with it. They are fascinated by it because it is a highly successful topic of countless YouTube videos and memes.

Examples of the Mandela Effect are amusing and easy to find online. Everyone thinks that Darth Vader said “Luke, I am your father,” but in fact he said, “No, I am your father.” Everyone thinks Forrest Gump said “Life is like a box of chocolates,” but he actually said, “Life was like a box of chocolates.” The Queen in Snow White never said “Mirror, mirror on the wall”; she said, “Magic mirror on the wall.” The popular children’s book and show “The Berenstein Bears” is actually “The Berenstain Bears.” Many people think evangelist Billy Graham died years ago but – at the time of this writing, at least – he lives on.

The Memory Illusion by Dr. Julia Shaw

There’s nothing particularly surprising or even interesting about such failures of memory. Our memories are deeply fallible and highly suggestible. A recent study revealed that psychologists can easily coax subjects to ‘remember’ committing a crime they never actually committed (see Dr. Julia Shaw’s recent book The Memory Illusion). With the right coaching I can start to remember the time I assaulted a stranger, even if I never did. Moreover, cultural memes – like famous lines of movie dialogue – naturally glitch, vary, and distort in the replication process, especially in mass-media replication. The version an actor misstates on a late-night talk show becomes the new urtext and rapidly replicates through the wider culture, replacing the original. All cultural transmission is a giant “telephone game.”
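To see how quickly noisy replication can corrupt a line, consider a toy simulation. (This is my own minimal sketch in Python; the phrase, the word pool, and the five percent error rate are arbitrary illustrative assumptions, not data from any study.)

```python
import random

random.seed(42)  # fixed seed so the toy run is reproducible

def noisy_copy(phrase, error_rate=0.05):
    """Copy a phrase word by word, occasionally substituting a word --
    a crude stand-in for misquotation during cultural transmission."""
    substitutes = ["luke", "magic", "mirror", "life", "always"]
    return " ".join(
        random.choice(substitutes) if random.random() < error_rate else word
        for word in phrase.split()
    )

line = "no i am your father"
for generation in range(1, 9):
    line = noisy_copy(line)
    print(f"generation {generation}: {line}")

# Any early mutation is copied faithfully ever after: the distorted
# version becomes the new 'original' for later retellings.
```

A few generations of copying are enough; once a distorted version dominates the channels of replication, it becomes the urtext for everyone downstream.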

But here’s where it gets weird. Students – yes, current undergrads – think the explanation for these strange false memories is that a parallel universe is occasionally spilling into ours. In that parallel universe, apparently, Mandela did die in prison, and the Snow White mirror phrase is how we remember it. Alternatively, a number of students believe that we humans are moving between these parallel universes unknowingly, and also time-traveling, so that our memories are distorted by the shifting timeline.

When I pressed my students, suggesting they were not really serious, they grew indignant. Like a gnostic elite, they frowned upon my failure to grasp the genius of their metaphysical conspiracy theory. As an antidote, I introduced Ockham’s Razor – the principle, originally set forth by William of Ockham (c. 1287–1347), that simpler explanations are preferable to metaphysically extravagant ones. When explaining an event, Ockham suggested, entities should not be multiplied unnecessarily. As he puts it in Summa Totius Logicae (i. 12), “It is futile to do with more things that which can be done with fewer.” When you have two competing theories that make the same predictions, the simpler one is usually better. I don’t need a world of demonic possession, for example, to explain why the delusional person on my bus this morning was talking to his hand.

William of Ockham

I turned to a student in the front row to illustrate the obviousness of Ockham’s Razor. “Imagine,” I said to him, “a series of weird bright lights appear over your neighborhood tonight around midnight. Now, which is easier to believe: that an alien invasion is happening, or that the military is testing a new technology?” Without blinking, the young man told me that the military would never be in his neighborhood, so aliens seemed more reasonable.

“But…but…” I stammered, “you have to assume a whole bunch of stuff to make the alien explanation work – like, aliens exist, aliens are intelligent, aliens have incredible technology, aliens have travelled across the solar system, aliens have evaded scientific corroboration, and also that aliens would come to your neighborhood while the military would not. Whereas, the list of assumptions for the military testing explanation is relatively small by comparison.” He just blinked in response.
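To make the Razor concrete, here is a deliberately crude sketch of that tally. (A toy illustration of assumption-counting only – the lists simply restate the assumptions above, and no serious epistemology reduces to counting list entries.)

```python
# Toy illustration of Ockham's Razor as assumption-counting.
# Both hypotheses 'explain' the lights equally well, so compare how many
# independent assumptions each one must smuggle in.
hypotheses = {
    "alien invasion": [
        "aliens exist",
        "aliens are intelligent",
        "aliens have incredible technology",
        "aliens have travelled here",
        "aliens have evaded scientific corroboration",
        "aliens would visit this neighborhood",
    ],
    "military test": [
        "the military tests new technology",
        "a test happened near this neighborhood tonight",
    ],
}

for name, assumptions in hypotheses.items():
    print(f"{name}: {len(assumptions)} assumptions")

# Prefer the hypothesis with the leaner list of assumptions.
preferred = min(hypotheses, key=lambda name: len(hypotheses[name]))
print(f"Ockham's Razor prefers: {preferred}")
```

None of this arithmetic moved my student, of course.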

Thus reproached, I switched to another example, asking students whether it was easier to believe a ghost slammed their door shut, or the wind did it. Some conceded that the wind was a leaner, more parsimonious explanation, but a great many students already believe strongly in the reality and ubiquity of ghosts, so referring to spirits did not seem to them like a violation of Ockham’s Razor. Ghost stories from family and friends, along with YouTube videos, had already served as a sort of ‘corroboration’ for their theory, so ghosts and wind had roughly equal metaphysical status in their world.

I had arrived at the death of reductio ad absurdum arguments – where you demonstrate the absurdity of your opponent’s view by showing that it has obviously ridiculous implications which your opponent has not yet appreciated. But one cannot offer such a refutation if every ridiculous implication seems equally reasonable to your opponent. You never get to an absurdity that both parties accept, and it’s just ‘beliefs’ all the way down. It is the bottomless pit of the undiscriminating mind. My students find it preferable to think that alternate universes are colliding to create alternate Mandela histories and Darth Vader dialogue than to think we simply remember stuff badly. Adiuva nos Deus – God help us.

Intense handwringing has attended the rise of fake news. The worries are justified, since the media has never been more unreliable, biased, and embattled. But the proposed solutions tend to focus on fixing the media, and no one dares suggest that we should be fixing the human mind. The human mind, however, is arguably broken, and educators must implement a rigorous curriculum of informal logic before our gathering gloom of fallacies, magical thinking, conspiracy theories, and dogma makes the Dark Ages look sunny by comparison. Obviously, it would be nice if our politicians and pundits were reliable purveyors of truth, but since that isn’t about to happen, we should instead be striving to create citizens who can see through these charlatans. Kids who believe their mistaken memories of movie lines are proof we’re all time-traveling or jumping between parallel realities are not those needed citizens.

Harvard legal scholars Cass R. Sunstein and Adrian Vermeule suggest that contemporary Americans are more credulous and committed to conspiracy theories because they have a “crippled epistemology.” When a person or community comes to believe that the twin towers fell because of a U.S. government plot, or that the 1969 moon landing never happened, or that AIDS was a manmade weapon, they reveal a crippled epistemology. And, according to Sunstein and Vermeule, as well as Cambridge political theorist David Runciman, the “crippling” results from a reduced number of informational sources or streams.1 When information enters a community through only a few restricted channels, the group becomes isolated, and its acceptance of ‘weird’ ideas doesn’t seem irrational or weird to those inside the group. We see plenty of evidence of this, as when people get all their news from Facebook and it trickles through a single ideological pipeline.

On this view, increasing the number of informational sources reduces conspiracy gullibility. Closed societies, like North Korea and to a lesser degree China, are more susceptible to fake news, conspiracy, and collective delusion. Belief systems in these closed environments are extremely resistant to correction, because alternative perspectives are unavailable. Online information bubbles produce some of the same results as political media censorship.

My own experience at the Creation Museum in Kentucky confirms the idea that informational isolation is highly distorting. I spent time touring Noah’s Ark, watching animatronic dinosaurs frolic with Adam and Eve, and talking to the director Ken Ham. The Evangelical isolation was obvious, and was even worn as a badge of honor by Ham and others, as if to say, “We are uncontaminated by secular information.” The Bible-belt audience for Creation Science gets all its news about the outside world from Evangelical cable TV shows, Christian radio programs, blogs, podcasts, and of course church. An Evangelical museum claiming the earth is only 6,000 years old is just icing on the mono-flavored informational cake.

People and dinosaurs co-exist in an exhibit at the Creation Museum in Kentucky. Photo by David Berkowitz (2011)

However, this kind of information isolation cannot explain our students’ gullibility. Yes, the average undergrad has a silo of narrow interests, but they are not deficient in informational streams. In fact, I want to argue they have the opposite problem. We now have almost unlimited information streams ready to hand on our laptops, tablets, smartphones, and other mass media. If I Google the 9/11 atrocity, it takes only two or three clicks to reach a guy in his mother’s basement, explaining in compelling detail how the Bush administration knew the attack was coming and engineered the collapse of the twin towers, how the U.S. government is a puppet of the Illuminati, and so on. If you don’t already have a logical method, or even an intuitive sense, for parsing the digital spray of theories, claims, images, videos, and so-called facts coming at you, then you are quickly adrift in what seem like equally reasonable theories.

During the Renaissance, Europe had a similar credulity problem. So much crazy stuff was coming back from the New World – animals, foods, peoples, etc. – that Europeans didn’t quite know what to believe. The most they could do was collect all this stuff into wunderkammern, or curiosity cabinets, and hope that systematic knowledge would make sense of it eventually. In a way, the current undergraduate mind is like the pre-Modern mind, chasing after weirdness, shiny objects, and connections that seem more like hermetic and alchemical systems. According to a recent survey by the National Science Foundation, for example, over half of young people today (aged 18 to 24) believe that astrology is a science providing real knowledge.2

During the last three years, I have surveyed around 600 students and found some depressing trends. Approximately half of these students believe they have dreams that predict the future. Half believe in ghosts. A third of them believe aliens already visit our planet. A third believe that AIDS is a man-made disease created to destroy specific social groups. A third believe that the 1969 moon landing never happened. And a third believe that Princess Diana was assassinated by the royal family. Importantly, it’s not the same third that believes all these things. There is not a consistently gullible group that believes every wacky thing. Rather, the same student will be utterly dogmatic about one strange theory, but dismissive and disdainful about another.

There appears to be a two-step breakdown in critical thinking. Unlimited information, without logical training, leads to a crude form of skepticism in students. Everything is doubtful and everything is possible. Since that state of suspended commitment is not tenable, it is usually followed by an almost arbitrary dogmatism.

From Socrates through Descartes to Michael Shermer today, doubting is usually thought to be an emancipatory step in critical thinking. Moderate skepticism keeps your mind open and pushes you to find the evidence or principles supporting controversial claims and theories. Philosopher David Hume, however, described a crude form of skepticism that leads to paranoia and then gullibility. He saw a kind of breakdown in critical thinking that presages our own.

David Hume (1711-1776)

“There is indeed a kind of brutish and ignorant skepticism,” Hume wrote, “which gives the vulgar a general prejudice against what they do not easily understand, and makes them reject every principle, which requires elaborate reasoning to prove and establish it.” Paranoid people, Hume explained, give their assent “to the most absurd tenets, which a traditional superstition has recommended to them. They firmly believe in witches; though they will not believe nor attend to the most simple proposition of Euclid.”3

Smug tone aside, Hume is onto something. Creationists have just enough skepticism to doubt evolution, climate deniers have just enough skepticism to doubt global warming, and millennial students have just enough to fall for the latest conspiracy theory, as well as ghosts, fortune telling, astrology, and so on. In the Creationism case, the crippled epistemology results from too little information, but in my Chicago undergrads the gullibility results from a tsunami of competing informational options. For our students, settling on a conspiracy closes an otherwise open confusion loop, converting distressing complexity and uncertainty into a reassuring answer – even if that answer is “time travel” or “aliens” or “the Illuminati.”

Another motive seems to be lurking in the background too, and it is insidious. The millennial generation does not like being wrong. They are unaccustomed to it. Their education – a unique blend of No Child Left Behind, helicopter parenting, and oppression olympics – has made them uncomfortable with Socratic criticism. When my colleague recently corrected the grammar on a student’s essay, the student scolded him for enacting “microaggressions” against her syntax. So, conspiracies no doubt seem especially attractive when they help to reinforce a student’s infallibility. Having an alternate dimension of Mandela Effect realities, for example, means the student is never wrong. I’m always right; it’s just reality that keeps changing.

There are two cures for all this bad thinking. One of them is out of our hands as educators, and involves growing up. While the demands of adult life are sometimes delayed – as graduates return to live at home and postpone starting families – there is an inevitable diminution of conspiracy indulgence when one is struggling to pay a mortgage, raise children, and otherwise succeed in the quotidian challenges of middle-class life. Interestingly, however, this reduction in wacky thinking is not provided by the light of reason and educational attainment, but rather by the inevitable suppression or inhibition resulting from the demands of the workaday world. Our students at thirty-something and forty-something are not converts to rationality, but merely distracted from their old adventures in gullibility.

The other cure for conspiracy thinking, and the problems of too little and too much information, is something we educators can provide. And it’s relatively cheap. We could require an informal logic course for every undergraduate, preferably in their freshman or sophomore year. The course should be taught by philosophers or those explicitly trained in logic. I’m not talking about some vague ‘critical thinking’ course that has been robbed of its logic component. I’m talking about learning to understand syllogisms, fallacies, criteria for argument evaluation, deduction, induction, burden of proof, cognitive biases, and so on. The informal logic course (as opposed to formal symbolic logic) uses real life arguments as instances of these fundamentals, focusing on reasoning skills in social and political debate, news, editorials, advertising, blogs, podcasts, institutional communications, and so on. The students are not acquiring these skills by osmosis through other courses. It needs to be made explicit in the curriculum.
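To give a taste of what such a course makes explicit, consider how mechanically an argument form can be checked. The sketch below (my own example, using nothing beyond Python’s standard library, and not drawn from any particular curriculum) brute-forces the truth table to show that modus ponens is valid while its look-alike, the fallacy of affirming the consequent, is not:

```python
from itertools import product

def implies(p, q):
    """Material conditional: 'if p then q' is false only when p is true and q is false."""
    return (not p) or q

def is_valid(premises, conclusion):
    """An argument form is valid iff no assignment of truth values makes
    every premise true while the conclusion is false."""
    for p, q in product([True, False], repeat=2):
        if all(premise(p, q) for premise in premises) and not conclusion(p, q):
            return False  # found a counterexample
    return True

# Modus ponens: if p then q; p; therefore q.
print(is_valid([implies, lambda p, q: p], lambda p, q: q))   # True (valid)

# Affirming the consequent: if p then q; q; therefore p.
print(is_valid([implies, lambda p, q: q], lambda p, q: p))   # False (invalid)
```

The second form is the engine of much conspiracy reasoning – “if the government were hiding something, we would see strange lights; we see strange lights; therefore the government is hiding something” – and a student who can name the fallacy is far harder to move with it.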

Studying logic gives students a way to weigh the information that is coming at them. It gives them the tools to discriminate between competing claims and theories. In the currently frantic ‘attention economy,’ logic also teaches the patience and grit needed to follow complex explanations through their legitimate levels of depth. It won’t matter how sensitive to diversity our students become, or how good their self-esteem is, if a lack of logic renders them profoundly gullible. More than just a curative for lazy conspiracy thinking, logic is a great bulwark against totalitarianism and manipulation. The best cure for fake news is smarter citizens.

 

Stephen T. Asma is Professor of Philosophy at Columbia College Chicago, and author of ten books including The Evolution of Imagination (University of Chicago Press, 2017). He writes regularly for the New York Times and Aeon. You can follow him on Twitter @stephen_asma 


References:

1 Sunstein, C. R. and Vermeule, A. (2009), “Conspiracy Theories: Causes and Cures,” Journal of Political Philosophy, 17: 202–227. And see Runciman’s work on a five-year funded research project, “Conspiracy and Democracy,” at http://www.conspiracyanddemocracy.org

2 National Science Foundation, https://www.nsf.gov/statistics/seind14/index.cfm/chapter-7/c7h.htm

3 See David Hume’s Dialogues Concerning Natural Religion (Hackett, 2nd ed., 1998), Part I.


Comments:

  1. Uri Harris says

    I think consistently teaching students that critical thinking = challenging power hierarchies plays a role as well. The advance of critical theory may very well lead to more conspiracy thought. Claire Lehmann mentioned in a recent article a critical theorist who wrote a thesis on the Australian Government’s vaccination policy being a conspiracy between Big Pharma and the WHO.


  3. Florina T says

    So, if we follow your reasoning to its logical end, people only need to be correctly educated to ditch religion altogether. Because nothing about it is logical and stuff.
    We have many thousands of years behind us already that prove this is not how it’s done.
    There’s absolutely nothing to prevent people from going on just as they do today, irrespective of any kind of training they undergo. They will continue to be dogmatic in relation to some theories and laugh at others.
    To move from propping up your intellect with dogmas to making sense of the world and your place in it – to understanding that you are essentially alone, a natural accident of evolution if you wish, that there is no higher purpose than the one you draw for yourself, and that there won’t be any prize or retribution whether you succeed or fail – and still feel at peace with yourself is not achieved by logic.
    It’s a decision which has nothing to do with logic; in fact, I would argue that it is illogical, since many people who come to this conclusion live in anguish, while those who are the most dogmatic seem to live more fulfilling lives by their own account.

    • Charles K. says

      Florina T – I like your comments very much. Very useful.

    • Dan Vesty says

      While I agree with your assertion that logic can’t solve all human problems, I think you may be over-egging the custard with your apparent insinuation that logic leads inevitably to existential despair. For example, although I consider myself a relatively rational, logical, scientifically-minded person, there’s nothing about my admittedly patchy understanding of the natural world that leads me to the understanding that I am ‘essentially alone’. Quite the opposite, actually: as far as I can tell there are people everywhere I look, and a bit more essential aloneness might actually be a welcome break… As for being a ‘natural accident of evolution’, I’m not really sure what you mean there either. After all, the concept of ‘accident’ only exists by contrast to the concept of ‘purposeful action’ – and as evolution is not a living thing acting in the world, it’s difficult to see how any of its products could be described as ‘accidents’.

    • LUKA PRPIĆ says

      Truth is irrational. Facts are true, but truth is not a fact.

  4. Markus says

    Excuse me, but in the original Brothers Grimm version of Snow White the queen interrogates the mirror with the “Mirror, mirror…” variant. So I guess most people remember correctly. I do not know where the “Magic mirror…” variant came from…

    • I suspect the whole Snow White story was made up anyway, so arguing over what her non-existent stepmother said isn’t very productive.

      Now if there really was a Snow White my question would be how did the Prince obtain consent before kissing her?

    • Torkulweef says

      »Spieglein, Spieglein an der Wand,
      Wer ist die Schönste im ganzen Land?«
      (“Mirror, mirror on the wall, who is the fairest in all the land?”)

  5. DiscoveredJoys says

    I suspect there is another cognitive bias at work here too. We are exposed, through multiple stories, news articles, social reactions, video games, and so on, to the narrative that the *single* cause of X is Y. It is far more comforting to believe that ‘because there is a single cause, I can understand it and do something about it’. The alternative is to accept that there are many causes and many effects, all interacting. How can you control your world when you are not aware of all the causes and don’t even understand how it all works?

    So it is comforting to believe that ‘God did it’ rather than messy evolution. That HIV is a government conspiracy rather than an unpredictable change in a virus. That a political or economic model held with enough conviction will explain the real world.

    Shit happens – for all sorts of reasons.

  6. Sarka says

    I agree with much of this, but would add a theory I read in a book, back in the 1980s, on hysteria and moral panics. Irritatingly I cannot remember the author’s name, but it dealt with such matters as the “alien abduction” mania in the US. The author made a persuasive case that American (US) society was a lot more prone to such manias, including conspiracy theories of all kinds, than Western Europe, and speculated as to why this was. Part of her answer was precisely the egalitarian and individualist ethos of the USA….Ordinary people there were in her view less inclined than Europeans to defer to intellectual authorities (govt. experts, the scientific establishment), and they believed, much more automatically, that one man’s opinion was as good as any other’s.

    Obviously, since the eighties this ethos has hardly diminished, and to some extent Europeans have started to catch up with it. Obviously, it is fostered by the Net now, and instant access to any version of reality you like. And it is also fostered (even among quite highly educated people) by changes in educational philosophy and practice that – especially in the humanities – try perversely to instil a “critical spirit” in children before they have learnt anything much to be critical about. Furthermore, in my subject – history – as in other “civics” sorts of subjects dealing with the state of society, the attempt to engage children’s interest and enthusiasm (excellent in theory) is often a matter of presenting “stories” in not just a simplified but a very sentimental way, with goodies and baddies. It’s not hard to come out of that education with a combination of overconfidence, very little competence or interest in complicated argument, a conviction that the world is divided into wicked predatory people and cuddly victims, and even a conviction that knowledge isn’t important – that just being the right sort of person with the right kind of views is all that is necessary.

    The problem is that this problem has arisen from developments that would often be good (egalitarianism, critical spirit, etc.) if they were real rather than somehow ersatz. I think philosophy courses in schools would be great, but I am not sure how much they will help, because what people need in their intellectual lives – good judgment – is not just an understanding of pure logic but (again, especially in humanities subjects) skill in judging materials (arguments, narratives, sorts of sources) that can’t really be learned in the abstract.

  7. Carl Sageman says

    This is a wonderful article. There are additional factors that would complement this article.

    We often seek the data that matches our own assumptions. For example, some feminists will only seek information from other feminists (excluding Christina Hoff Sommers, of course). To illustrate:

    I have talked to many feminists about domestic violence. I explained how government data from English-speaking countries show that about 30% of victims are male and that there is a lot of female-initiated violence (according to Jordan Peterson, females initiate more violence, but they don’t hit as hard). I contrasted this with a simple internet search (Google and Bing) that showed not one single male victim (only females) and thousands of mainstream news articles indicating that only men are responsible for domestic violence and are never victims (I have a partial list, which is very long). I even highlighted the oft-repeated statement “zero tolerance,” which appears to mean “zero tolerance if you’re male.” I asked how every mainstream newspaper got it wrong. The answers were:
    – you faked the search results (even when demonstrated at the time)
    – it’s a conspiracy by men to make feminists look bad
    – women don’t hit as hard (e.g. the woman who killed her boyfriend when she purposely hit him with a car somehow hit softer than a male would)
    – the governments are lying about the statistics
    – men do it more and therefore don’t deserve representation (contradicting the zero tolerance)
    – I don’t believe it (no justification given)

    How do you explain this? It’s exactly the same as the reporting on James Damore and his diversity memo. I tracked this across approximately 75 mainstream media outlet links. Not one outlet was willing to reference expert analysis from Quillette (4 experts). I contacted several media outlets with the link to the Quillette article and one outlet posted 2 more articles afterward and ignored the link. Even the factual feminist derided James on multiple occasions: her argument was that James “shouldn’t have said [what the experts all confirmed]” without any real criticism of James. It’s rare for Sommers to make such a fundamental mistake, especially repeatedly.

    I’m going to steer this back to feminism because feminism is so frequently wrong and yet, it’s revered by most public figures.

    How does feminism keep on getting it wrong? Wage gaps, domestic violence, or just about anything else dispelled by the Factual Feminist (C. H. Sommers) in her YouTube videos. The answer isn’t a lack of logic. I’ll give it several labels:
    – dogmatic ideology (equivalent to that of religion)
    – confirmation bias
    – echo chambers
    – self-serving bias
    – ingroup bias
    – fundamental attribution error
    – group attribution error (e.g. seen with Harvey Weinstein articles)
    – just-world hypothesis
    – identity politics
    – intersectionalism

    In summary, some groups (not just feminism) purposely choose to ignore fact and reality. For these groups, reality is variable (see postmodernism). Do you really think these groups want the truth, or simply don’t know the truth? Is feminism unaware of its many consistently perpetuated falsehoods? Even Steve Jobs admitted he knew the truth but chose to live in his reality distortion field when it suited him.

    No amount of logic education will correct this. It’s a philosophy, a set of values and a way of life. Nobody will hold these people accountable. It’s the accountability that I believe is missing.

    One last minor point. I don’t use Facebook, so, all of the fake news I see comes from the mainstream media (and nearly always the same outlets). Does Facebook drive the fake news (as suggested in the article), or is it ideologists who go into journalism who incite ideologically motivated groups on Facebook and Twitter through mainstream news articles (eg. On James Damore)? If we mischaracterise the source, we will fail to understand and combat this issue.

    I hope this comment doesn’t get filtered.

  8. “But here’s where it gets weird. Students – yes, current undergrads – think the explanation for these strange false memories is that a parallel universe is occasionally spilling into ours. In that parallel universe, apparently, Mandela did die in prison, and the Snow White mirror phrase is how we remember it. Alternatively, a number of students believe that we humans are moving between these parallel universes unknowingly, and also time-traveling so that our memories are distorted via the shifting time line.”

    I did a lot of pot at university too.

  9. Ben M. says

    The question no one has asked is, does Ockham’s Razor apply to Ockham’s Razor? Is an explanation of Ockham’s Razor ever the simplest way to communicate the concept of Ockham’s Razor? Furthermore, is Ockham’s Razor fundamentally different from K.I.S.S. (keep it simple, stupid)? If not, can we just settle on one? Or must we maintain a brow-level-based separation? Regarding the origins of these competing simplicity principles, is William of Ockham really more of an authority on logical matters than my great uncle Clive? After all, William was a 14th-century religious fanatic (friar) who may have been predisposed to choose simple answers like “God did it” over evolution, whereas Uncle Clive don’t believe in ‘nuthin since he got back from ‘Nam. Perhaps a better statement of the principle would go something like this: hypotheses appear simpler to people if they match all the other beliefs those people already hold. This reminds me of a quote from Einstein: “Common sense is the collection of prejudices acquired by age eighteen.” Logic is a pretty good meeting ground for trying to bridge the gap between people’s prejudices, but you could get even more basic. If you see lights in the sky, you could start out by admitting you have no idea what they are. Science requires you to first identify what you do not know, and finding out that you don’t know what you don’t know is an inevitable step in the development of any critical mind. An accurate mental model of the universe has never been a condition for being human. I would wager that people are the most rational and logical they have ever been in history, but now we are just learning all the things other people always believed, and it’s weird.

  10. yandoodan says

    Two (minor) points about your visit to the creationist museum.

    1. They believe the universe (including the Earth) to be 6,000 years old (not 4,000), having been created in 4004 BC. (This is “in the Bible” only to the extent that Bishop Ussher’s calculations are printed in the margins. Do not try to tell them this. It won’t work.)

    2. Confirmation is pointless. Talk to the people running the creationist museum and they will give you thousands of confirmatory instances. They will talk your head off. You won’t be able to get away. And do not get them on the subject of unconformities! Now, what about those instances where your theory survived strong attempts to falsify it?

  11. Something you may find intriguing are the studies within Lazy User Theory specific to information acquisition. I’m afraid I don’t have the cite at hand, but I recall one conducted among tribes in Africa, for example, which aligns with this. In that case, they found that tribe members were more willing to seek and accept information from lazy, tribal sources than from others which may hold greater credibility. It came down directly to laziness in acquisition, and I think that ties well to what we are currently experiencing. The ease of acquisition has overshadowed trustworthiness of source; unfortunately, this has been “labeled” as tribalism and bias, which does nothing but further the divide, because it is usually explained in a condescending tone. That same condescending tone was explored in the same study and was one of the reasons why the more reliable information source wasn’t considered – those were government media talking down to the uneducated masses.

    Consider the creationist argument and the 6,000-year age – when you point out that if everything is 6,000 years old, and you start mapping out from the Bible text the generations and the ages listed, it doesn’t add up – you’ve explained it in context, versus “bah, you’re an idiot, carbon dating (which has some challenges to validity, but I digress) says stuff is millions of years old,” which discounts their view condescendingly. Pointing out that the Bible was passed down by verbal tradition, with tellers using high-context language that morphed things over time, suddenly becomes a reasonable explanation of the 6,000 versus > 6,000 (aka, a biblical year != a complete orbit around the sun, which they weren’t even aware of back then).

    Using the example of ghosts and the readily available videos on YouTube and Ghost Hunters, etc. – these are easy to get to and see, but the Occam’s razor explanation isn’t as readily available. Ghost Hunters is an interesting show because they seek out more reasonable explanations for the things they see and hear, but that involves a cable-TV subscription and the right channel at the right time.

  12. There’s a difference between healthy skepticism and pseudo-skepticism that is not discussed in this article. This not-very-original material appears to be harping on the same old tunes of the James Randi crowd. All the same clichés are in place for a pseudo-skeptic manifesto of epic proportions, straight from the Michael Shermer playbook. No pseudo-skeptic manifesto is complete without a straw-man thrashing of religion, ghosts, telepathy, creationism, etc., leading up to a flat denial of anything challenging the philosophy of scientism. I can hear the mental gears of the author clicking into place like the ticking of the clockwork universe that informs his mechanistic materialism. He gathered his critical thinking skills from multiple subscriptions to ready-made “skeptical” resources. His “Occam’s razor” is a dull knife for sawing through controversial issues opposing his world view.

  13. Debbie says

    Isn’t the real problem human nature? People are uncomfortable — and reluctant to admit — not knowing. It’s easier to let the imagination fill the knowledge gaps. Wouldn’t it be more honest like this?
    Q: Did humans walk on the moon?
    A: I’ve seen some footage, but I don’t know.
    Q: Did the royals kill Diana?
    A: I thought it was a car crash in a tunnel, but I don’t know.
    Q: Have aliens visited earth?
    A: Sounds unlikely, but I don’t know.
    Q: Why don’t you remember that lyric correctly?
    A: I thought that’s how it went. I guess I just didn’t know.

  14. “We could require an informal logic course for every undergraduate, preferably in their freshman or sophomore year.”

    That leaves out all those who do not go to college but who still affect society through their choices. How about moving this “requirement” up into grade school or high school? Or, preferably, have it run through both, becoming more and more thorough as students age?

    Professional academia tends to have its own bias – it misses the sample of those who do not pay money to attend its classes, as though all thought and learning can only take place within its hallowed halls and is non-existent in other forms or venues. Since around 65-70% of Americans do not have a college degree, that seems a pretty big sample to ignore.

    “…there is an inevitable diminution of conspiracy indulgence when one is struggling to pay a mortgage, raise children, and otherwise succeed in the quotidian challenges of middle class.”

    A nod to Aristotle? With no pragmatic concerns, then, the poor and the wealthy are free to indulge in whatever irrationality they like. Take the fact from the previous bit – that 65-70% of adults have no college education and have (probably) been in the workforce since they were 18 or so – and combine it with the suggestion that conspiracy indulgences are naturally suppressed by “quotidian challenges,” and you get the idea that 65-70% of people actually have an OK grasp on reality… though simply through pragmatic needs.

    “…logic is a great bulwark against totalitarianism and manipulation.”

    Maybe, but emotion is the great manipulator of, and the engine that engages in, any sort of logic. I believe Hume said that. He also said something along the lines of reason itself being inert, able to produce (potentially coherent) beliefs, but only in service to the passions of the thinker. Any argument about the legitimacy of logic has, inherent within it, a motivation to wield it.

    “But one cannot offer such a refutation if every ridiculous implication seems equally reasonable to your opponent. You never get to an absurdity that both parties accept, and it’s just ‘beliefs’ all the way down.”

    In the context of Occam – I assume the student(s) don’t have a strong grasp of probabilities either, so we can blame that on maths – but really, I think there is a pertinent aphorism that goes something like this: “if you engage in an argument with a fool, the fool automatically wins.”
    Maybe that is why the wise are usually silent.

    In the case of being paid to engage with the fools (or children of fools), then hopefully there is a good benefits package associated with that.

  15. The psychological phenomena that you describe exist because that is what our current environment has created. Specifically, we now live in a world of extraordinary affluence, and the ancient evolutionary environment of extreme hardship and routine existential threat is effectively extinct. As a result, the natural feedback mechanism for reality error (e.g. extreme physical harm or early death) is no longer operative. Selection is no longer purging this type of error proclivity from our DNA legacy.

  16. Fuendetodos says

    “When a person or community comes to believe that the twin towers fell because of a U. S. government plot, or that the 1969 moon landing never happened, or that AIDS was a manmade weapon, they reveal a crippled epistemology.”
    So for the burning of Rome you prefer to believe in the culpability of Christian fanatics rather than an inside job ordered by Nero. You prefer to think a single communist burned down the Reichstag rather than follow the “conspirationist theory” of a false-flag attack staged by the Nazis. You prefer to think that the passports of plane hijackers can survive, intact, a plane crash into a tower, only to be found on the ground by a policeman and presented as proof on TV just a few hours after the crash. You prefer the collapse to thermite as the explanation for the fall of the towers. You have never seen a video of a plane crashing into the Pentagon, but you may prefer to believe the security cameras were not working. So who did you say is revealing a crippled epistemology?
    Time to analyze and change your own paradigm, maybe?

  17. University is, as observed, too late. Quite frankly, the only answer is the one Dorothy L. Sayers proposed years ago in The Lost Tools of Learning: bring back the trivium and the quadrivium, from the first year of schooling.


  19. Liaquat Ali says

    Stephen T. Asma, great lecture. I agree with you about judging any controversial rhetoric through critical thinking rather than taking it for granted. Logical thinking may play a crucial role in achieving such goals. If we examine present-day science deeply, we will come to know that it is based upon logic. Whether it be mathematics, physics, or any other science, its fundamental roots are derived from logic. I would like to give one example here to emphasize my point. Newton arrived at his famous formula f=ma by asking why the apple fell on his head instead of rising to the sky. This critical thinking, based upon pure logic, made it possible to achieve a fundamental appreciation of one of the basic forces of nature.

  20. gavin mc says

    These comments are just as good as, if not better than, the original article.

  21. spencer says

    there is a simple solution to the “education problem” identified which is nonetheless difficult to build: 1. identify student motivations to study (+/-). 2. establish individual (ideal) or generic incentives. 3. build a logic-based robot or computer (i.e. an operating system). 4. reward or incentivise (the parallel universe will punish) worked through logical outcomes (see puzzle solving computer games). disclaimer: proof must itself be logically certified (i.e. a lack of trust in any given authority undermines the certification)

  22. I propose that you cannot ‘teach someone how to think’, especially not by giving them a class on it! Something to be written down, memorized, it is not.

    Many human beings might grow up thinking diligently if nothing interfered with this otherwise naturally occurring process. What happens to us is our socio-educational culture, or as I sometimes refer to it: “the memorisation of many details.” We live in a system that prefers to have us “memorize what we are taught,” just like religious schools. I have fought it all my life.

    Increasingly poor critical thinking skills in the general population seem (to me) to be a direct result of this hidden educational philosophy. Otherwise, show me where in our educational and socio-economic systems critical thinking gets nurtured – or even tolerated.
