The truth is struggling. Ever since The Economist, in one of its September issues, discovered and described the “post-truth world” we’re living in[1], it has become fashionable to complain about the disintegration of truth. More recently, this complaint has been further intensified by a broad debate about the nature and role of “fake news”[2]. If only, common sense and common opinions seem to say, we could get rid of all misperceptions, errors, lies, and outright deceit once and for all – then, we could live in a peaceful world made of truths, happily ever after.
Of course, there’s an advantage in being surrounded by truths: If I know that the lake is not yet frozen, I won’t step onto the ice (and won’t drown); if I know that measles can be a fatal disease, I might opt for vaccination (and not get infected); if I suspect that the super-cheap phone offered on the internet will break after a dozen uses, I will most likely not buy it (and be spared the frustration); if I know that the friendly chap offering to transfer millions to my bank account is most likely not real, I can happily trash his email (and not be cheated).
At the same time, there are many situations when truth itself is neither black nor white, but comes cloaked in some washed-out shade of grey. These, I believe, are the situations we need to pay attention to – today more than ever, as past truths are getting dismantled (when new evidence emerges or old arguments go out of fashion), future truths are uncertain (because all predictions are uncertain, alas), and present truths are undermined by the ongoing virtualisation of our world, where (physical) reality loses its defining power to (digital) hallucinations whose reality we haven’t quite become accustomed to yet[3].
In trying to understand the ambiguity of truth itself, I have found four very different concepts helpful for categorising, thinking, and talking about what makes truth (and what doesn’t). I’m sharing them here, hoping that they can enrich our debates about truth – and thereby, ideally, also sharpen our tools for dealing with the inevitable lies we’ll always encounter as long as there’s space, speech, and beings moving around in both[4].
- The fallacy of intuitive truth: Ever since Daniel Kahneman and Amos Tversky published their first research on cognitive biases (for which Kahneman was later awarded the Nobel Prize in 2002, Tversky having died in 1996), the distinction between what seems intuitively true to us and what is statistically valid has become an enlightened household staple of discussions about truth (and falsehood)[5]. Linda – the famous thirty-one-year-old, single, outspoken, bright philosophy major who used to be deeply concerned with issues of discrimination and social justice in her student days and participated in anti-nuclear demonstrations – is NOT more likely to be a bank teller active in the feminist movement than to be a bank teller[6], simply because feminist bank tellers are a subset of bank tellers, and a conjunction can never be more probable than either of its parts. We might still feel pulled towards the first answer with all our guts – but mathematics is ruthless and doesn’t make it any more true, however much we wish it were. “The most coherent stories”, Kahneman writes, “are not necessarily the most probable, but they are plausible, and the notions of coherence, plausibility, and probability are easily confused by the unwary”[7]. Unfortunately, this fallacy (as well as the many variations on the theme described in Kahneman’s prolific work) doesn’t go away even once we know about its existence. Rather, whenever we notice our intuition raising its voice, we need to pause, think, and work hard with (often: against) its likely mistakes – this is what Kahneman calls “System 2”, “the conscious, reasoning self”[8].
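The conjunction rule behind the Linda problem can even be checked mechanically: whatever probabilities we assign, the chance of being a feminist bank teller can never exceed the chance of being a bank teller. A minimal sketch – the numbers here are invented purely for illustration:

```python
# Toy joint distribution over two binary traits:
#   T = "is a bank teller", F = "is active in the feminist movement".
# The numbers are arbitrary; the inequality below holds for ANY choice
# of non-negative probabilities, which is exactly Kahneman's point.
p = {
    ("T", "F"): 0.02,        # bank teller AND feminist
    ("T", "not F"): 0.03,    # bank teller, not feminist
    ("not T", "F"): 0.45,
    ("not T", "not F"): 0.50,
}

# Marginal probability of being a bank teller at all:
p_teller = p[("T", "F")] + p[("T", "not F")]

# Probability of the conjunction "bank teller AND feminist":
p_teller_and_feminist = p[("T", "F")]

# The conjunction can never be more probable than either conjunct:
assert p_teller_and_feminist <= p_teller
```

However coherent the "feminist bank teller" story feels, the subset relation makes the intuitive answer mathematically impossible.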
- The fallacy of socially acceptable truth: Another well-known distortion of truth is the phenomenon coined as “Groupthink” by Irving L. Janis in his 1970s academic bestseller[9]. Analysing a number of prominent political fiascoes, Janis identifies three types of symptoms that favour any individual’s alignment with the views, opinions, and decisions of the group they belong to – even if what’s at stake is a massive mistake resulting in catastrophe, war, or breakdown of systems (possibly including destruction of the group itself). The three types of symptoms are (I) overestimation of the group – consisting of “an illusion of invulnerability” and “unquestioned belief in the group’s inherent morality”, (II) closed-mindedness – consisting of “collective efforts to rationalise in order to discount warnings” and “stereotyped views of enemy leaders as too evil […] or too weak”, as well as (III) pressure toward uniformity – consisting of “self-censorship of deviations”, “a shared illusion of unanimity”, “direct pressure on any member who expresses strong arguments against any of the group’s stereotypes […]”, and “the emergence of self-appointed mind guards”[10]. Given that we’re all social beings who – at least some of the time – also want (and need) to belong to groups, we’re all in danger of succumbing to groupthink – which, by the way, is one of the main reasons the phenomenon of the “Filter Bubble”[11] is so important to understand. As Janis himself laid out, the only means to minimise the detrimental effects of groupthink are rigorous rules for the group that walk the fine line between preserving the group’s cohesiveness and trust, on the one hand, and opening its discussions to contrarian views, on the other – for instance, a number of institutionalised “devil’s advocates” explicitly established to challenge consensus and the cosiness of feeling in sync[12].
- The fallacy of relative truth: Buddhist epistemology is – among other elements – built around the distinction between ultimate (or absolute) truth and relative truth[13]. Ultimate truth – as its name suggests – is unsurpassable in all dimensions, and therefore (by definition) unspeakable (an observation also found in other religions, ideologies, and schools of thought). Relative truth, on the other hand, is further disambiguated into valid and invalid relative truth, whereby valid relative truth is – loosely speaking – everything that functions in the (outer) world that we live in, while invalid relative truth is a mismatch between (subjective) perception and (objective) outer world, often exemplified with the illusions experienced after taking drugs. Now, as valid relative truth describes a functioning relationship between subject(s) and object(s), it can look very different for different subjects – one classical example being water, which fish see as a living environment, while human beings see it as something to drink, and hungry ghosts (a certain type of inferior being within the Buddhist universe) see it as poison. All of these different perspectives function for those involved, given their respective reference systems – while, at the same time, none is “truer” than any other[14]. Again, as with the previous fallacies, we cannot escape the relativity that we live in. So the best we can do is try to understand (and eventually transcend) our own subjectivity while, at the same time, (as compassionately as possible) acknowledging others’ (differing) subjectivities.
- The fallacy of emotional truth: Finally, developmental-constructive psychology offers another – complementary – perspective on what constitutes our truths in day-to-day life, in particular when it comes to emotional learnings that shape our values, views, and behaviours through unconscious patterns of mind. My favourite psychological exposition of such workings of our mind is Bruce Ecker and Laurel Hulley’s explanation of the foundation of “Depth-Oriented Brief Therapy”[15]. In their interpretation, a person’s emotional truth (of a possibly consciously unwanted behaviour) is their unconscious need to display the behaviour in question because of its value in preventing or protecting something vitally important to them, often linked back to emotional wounds from the past. In psychotherapy, of course, identifying such emotional truths and undermining their power by moving the unconscious assumptions into the realm of the conscious mind is a well-established path towards healing. However, beyond the pathologically painful instances addressed by psychotherapy, the same logic of unconscious emotional truths holding us captive to certain patterns of thinking and acting remains equally valid. So digging up, exposing, and thereby overcoming such emotional truths as much as possible (in ourselves and – if we’re skilful enough – also in others) becomes another imperative to help us disentangle the messy web of conflicting truths.
The truth, indeed, is struggling. But – as these four points show – not so much because of sudden new developments undermining what used to be simple, straightforward, and unanimous truth for all. Rather, truth is and has always been a multi-headed beast[16], beaten, hacked, and torn into pieces by deceptively strong intuitions, seductive lures of group agreement, irritating relativities of all things subjective, and mesmerising spells of our unconscious emotional meaning-making.
As a consequence, getting closer to the truth is also a struggle – a constant process of disagreeing with ourselves and with others in search of a more refined truth[17]. So in the end, today’s challenge might not be one of telling the truth from all that is false, but of better and better understanding the different displays of truth and how to find mutual understanding, connections, or even common ground across them.
[1] The article from the September 10th, 2016 print edition is available online here [retrieved Dec 7, 2016].
[2] The phenomenon itself is, of course, nothing new, but as old as mankind conveying messages. For the more recent discussion, in particular regarding fake news on the internet, Wikipedia gives a pretty comprehensive overview here [retrieved Dec 7, 2016].
[3] See my earlier blog posts on The Digital Space of Phenomena (for thoughts on how the world around us changes as it becomes more virtual) and on “No virtualisation without co-creation” (on how this is closely linked with how our societies function – or not) [retrieved Dec 7, 2016].
[4] For a more inward-looking perspective, see my article “Embracing Lies” on Levekunst [retrieved Dec 7, 2016].
[5] For a (long) summary, read Daniel Kahneman’s “Thinking, Fast and Slow” (2011).
[6] The full description of the “Linda” experiment is in “Thinking, Fast and Slow”, Chapter 15 (pp. 156 sqq.).
[7] Ibid., p. 159, emphasis in the original text.
[8] Ibid., p. 21 – and for its workings, the rest of the book.
[9] In the following, I’m quoting from Irving L. Janis, “Groupthink”, Second Edition (1982).
[10] Ibid., p. 174 sq.
[11] Described by Eli Pariser in “The Filter Bubble” (2011).
[12] Spelled out in detail in “Groupthink”, Chapter 11 (pp. 260 sqq.).
[13] Further explanations and sources can be found in the pretty comprehensive Wikipedia article on the “Two truths doctrine” [retrieved Dec 7, 2016].
[14] Those more scientifically than spiritually inclined might, by the way, rewrite the previous sentences as an exposition of the theory of relativity as developed by Albert Einstein.
[15] Bruce Ecker and Laurel Hulley, “Depth-Oriented Brief Therapy” (1996).
[16] For a beautiful sensory illustration of this phenomenon, watch this video of “Ten Headed Beast” by Hundreds [retrieved Dec 7, 2016].
[17] I recently wrote about how to disagree gracefully: “Disagreement as Practice” for Levekunst [retrieved Dec 7, 2016].