On Bullshit – updated English version


On Bullshit

Bullshit (BS) is often understood as nonsense dressed up as fact. The use of empty phrases is a common characteristic of BS. The term is generally used as a placeholder for things like humbug, claptrap, frippery or mumbo jumbo. BS always has something arbitrary about it and is characterized by vagueness. It feigns profound knowledge to the audience – often with nothing more than clever rhetorical tricks. It does not matter to the BShitter whether the communicated information is true or false; it is solely about achieving a goal by influencing the attitudes of the ‘audience’. It therefore seems plausible that BS and narcissism are strongly related.

So one could compare a BShitter with a liar, but according to Harry Frankfurt this would not capture the full picture. The main distinction: the liar works with a concept of “untruth”, while the BShitter is completely indifferent to whether a piece of information is true or false. Any “narrative” that could bring the BShitter closer to the desired result will be spontaneously selected. According to Frankfurt, the BShitter therefore operates beyond any concept of truth, and this is what makes him so damn dangerous: he cannot be pinned down and escapes every argument in a cloud of fog. You can convict the liar, but not the BShitter. So the BShitter keeps the audience on its toes and sets the pace, appearing to some people as a visionary leader…

Harry Frankfurt writes:

“It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction.”

How is BS created?

Harry Frankfurt sees BS as an inevitable evil in our society; in democracies especially, BS can thrive. This statement may sound unpleasant at first, but it becomes understandable when you consider the dilemma of the responsible citizen: whether it is about pandemics, international refugee policy or the local climate initiative, the ‘ideal citizen’ is always well informed and has a well-founded opinion on every issue. Unfortunately, we do not have the cognitive capacity to know and evaluate everything. So we tend to close information gaps with “narratives”, which may be plausible but are not necessarily true. This is at once the source of and the breeding ground for BS.

The desperate search for certainty in uncertain times accelerates BS, and there seems to be a direct relationship between the complexity of a situation and the need for simplified explanatory patterns. The BShitter creates the impression of knowing where things are going, and this greatly calms his followers in their mostly simple world view.

But there is another facet to the creation of BS, and it is related to economic circumstances. In my empirical observation there is not only the above-mentioned problem of trivialization, but also the wide field of silently tolerating obvious BS. This manifests itself in explicit or implicit customer/supplier relationships, in which economic dependency on the supplier side means that you put up with it and play along with the BS. You need the money.

Who is receptive to bullshit?

The only study I know of on this subject was published by the University of Waterloo (Canada) and bears the wonderful title:

“On the reception and detection of pseudo-profound bullshit”

The researchers wanted to find out which mental constitution increases susceptibility to BS and which characteristics are helpful in detecting it.

As test material, sentences were generated by a computer program: syntactically correct, but without any concrete statement, thus suggesting a semantic depth that is not actually there – classic BS. Example:

“Hidden meaning transforms unparalleled abstract beauty.”

For the insider: the authors of the study were inspired by the tweets of Deepak Chopra, a grandmaster of bullshit, who posts such BS via Twitter (see, for example, https://twitter.com/deepakchopra/status/483188737162358785).
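To make the mechanism tangible, here is a minimal sketch of how such a generator might work, assuming a simple template approach. The word lists and the template are invented for illustration; the study used its own generator and word material.

```python
import random

# Invented buzzword pools, purely for illustration (not the study's word list).
ADJECTIVES = ["hidden", "unparalleled", "infinite", "abstract", "timeless"]
NOUNS = ["meaning", "beauty", "awareness", "potential", "wholeness"]
VERBS = ["transforms", "unfolds", "transcends", "nurtures", "awakens"]

def pseudo_profound_sentence() -> str:
    """Fill a fixed grammatical template with random buzzwords.

    The output is always syntactically correct English, but carries
    no concrete statement: the hallmark of pseudo-profound BS.
    """
    return "{} {} {} {} {}.".format(
        random.choice(ADJECTIVES).capitalize(),
        random.choice(NOUNS),
        random.choice(VERBS),
        random.choice(ADJECTIVES),
        random.choice(NOUNS),
    )

if __name__ == "__main__":
    for _ in range(3):
        print(pseudo_profound_sentence())
```

The point of the sketch: grammar alone is cheap to produce. The apparent profundity comes entirely from the reader's expectation that the sentence must mean something.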

The study was designed to investigate two mechanisms behind susceptibility to BS. Essentially, these are cognitive biases of which the person concerned is unaware:

1. The acceptance of BS: how does one deal with the expectation that a statement must have a profound meaning?

2. The ability to identify BS by recognizing and naming logical inconsistencies. The authors of the study introduce the term ‘conflict monitoring’ in this context.

The two mechanisms could then be demonstrated quite well in various experiments (Cronbach’s α between 0.7 and 0.9; a short sketch of how this reliability coefficient is computed follows the list below). In summary, the following predispositions favour susceptibility to BS:

  • Low reflective ability, uncritical openness
  • Low cognitive abilities (e.g. language, fluid intelligence, numeracy, …)
  • Susceptibility to ontological confusion, i.e. the mixing-up of terms, meanings and contexts
  • Epistemologically suspect beliefs, including an affinity for religiosity and paranormal phenomena (remark: I call this pop esotericism, a trivialized form of mysticism with a consumerist mindset)
  • Last but not least, the authors cite the Barnum effect, among others, to explain BS susceptibility. Furthermore, the Dunning-Kruger effect should be considered as another precondition of BS.
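For readers who want a feel for the reported reliability numbers: the α values presumably refer to Cronbach’s alpha, the standard internal-consistency measure for rating scales. A minimal sketch, with invented ratings purely for illustration:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x k_items) rating matrix.

    alpha = k / (k - 1) * (1 - sum of item variances / variance of total score)
    Values between roughly 0.7 and 0.9 indicate that the items measure
    one underlying construct fairly consistently.
    """
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)       # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the sum score
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)

# Hypothetical example: 5 respondents rate the profoundness of 4
# generated sentences on a 1-5 scale (data invented for illustration).
ratings = np.array([
    [4, 5, 4, 4],
    [2, 1, 2, 1],
    [3, 3, 4, 3],
    [5, 4, 5, 5],
    [1, 2, 1, 2],
])
print(f"alpha = {cronbach_alpha(ratings):.2f}")
```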

A relative resistance to BS was shown by those participants who exhibited the following characteristics:

  • Critical thinking and logical analysis skills
  • “Reflective open-mindedness” (not adopting everything unquestioningly while still remaining open to new things, as a contrast to uncritical openness)
  • A large vocabulary and knowledge of word meanings, in order to recognize inconsistencies and pseudo-profound statements

So how to deal with BS?

Carl Sagan created a list of nine principles that should help ensure that one neither generates nor spreads BS. It is called the Baloney Detection Kit and belongs in the toolkit of every well-trained scientist – or at least it should.

  1. Wherever possible there must be independent confirmation of the “facts.”
  2. Encourage substantive debate on the evidence by knowledgeable proponents of all points of view.
  3. Arguments from authority carry little weight — “authorities” have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts.
  4. Spin more than one hypothesis. If there’s something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among “multiple working hypotheses,” has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy.
  5. Try not to get overly attached to a hypothesis just because it’s yours. It’s only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don’t, others will.
  6. Quantify. If whatever it is you’re explaining has some measure, some numerical quantity attached to it, you’ll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging.
  7. If there’s a chain of argument, every link in the chain must work (including the premise) — not just most of them.
  8. Occam’s Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler.
  9. Always ask whether the hypothesis can be, at least in principle, falsified. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result.

Conclusion

BS is everywhere. The point is to attenuate it, NOT to amplify it. I can’t even guarantee that this blog post doesn’t contain BS. What I can say for sure: I don’t want to work with narratives. I’d rather try to think like a scientist than like a storyteller.

References:

    • Harry Frankfurt (1986), “On Bullshit”, Raritan Quarterly Review
    • Gordon Pennycook et al. (2015), “On the reception and detection of pseudo-profound bullshit”, Judgment and Decision Making, Vol. 10, No. 6, pp. 549–563
    • Carl Sagan (1996), The Demon-Haunted World: Science as a Candle in the Dark
    • John Cleese on the Dunning-Kruger effect
    • A guitar instructor (2012 ff), Düsseldorf
