‘Thinking, Fast and Slow’: Epistemology, Feedback Loops and the Science of Bias and Human Irrationality

R.C. Smith

If we take as a stated assumption, based on the growing body of science, that prejudice is pervasive – that human irrationality is, to put it philosophically, a central theme in the human struggle toward a rational society – I think one of the lessons is epistemological in form. Its basic reduction is this: instead of approaching the world of phenomena, a priori, through the conscious endeavor to understand, which implies openness in the learning subject as a sort of constant and normative orientation process, the standard paradigm of irrational society approaches the world a priori through hardened, established frameworks. It is the practice of what some call dogmatic thinking. It is the philosophical account of being deeply consumed by a paradigmatic emphasis on bias of opinion instead of an emphasis on fact.

I remain interested in the notion of social pathology, particularly in how the notion of “social pathology” opens the potential for a broader meta-analysis of human irrationality. Further, it is interesting to think about how human irrationality might become deeply normalized paradigmatically, in both thought and practice. One example is the increasing hostility and polarisation of political views, which has also been described in terms of the hardening of attitudes toward opposing views. The idea of pathology, in critical sociological terms, attempts to explain this (in part) through an account of positive and negative feedback loops. This concept can be found in a number of areas, from science and engineering to social systems theory. But the idea, applied in the current context, is simple: the more divisive and polarised people’s views and media consumption become (the latter increasingly biased and tribalised on both sides of the spectrum), the more this creates a sort of echo chamber that serves only to reinforce the extremity of those views. As polarisation widens into opposing extremes, the pathology of the cycle is such that it leads to increasingly irrational views that no longer have any first-order engagement with facts, unbiased research, or constructive rational debate.
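
To make the feedback-loop idea a little more concrete, here is a minimal, purely illustrative sketch of my own – a bounded-confidence-style toy model, not drawn from any of the studies cited here – in which agents only “hear” views already close to their own, so each update reinforces rather than moderates the starting position.

```python
# Purely illustrative toy model of an "echo chamber" feedback loop.
# Each agent only listens to views within a tolerance of its own,
# so every update reinforces rather than moderates its position.
import random

def step(opinions, tolerance=0.3, gain=0.05):
    updated = []
    for x in opinions:
        # Only "hear" peers/media whose views are already close (the echo chamber).
        heard = [y for y in opinions if abs(y - x) <= tolerance]
        local_mean = sum(heard) / len(heard)
        # Positive feedback: drift toward the like-minded mean, plus a small
        # self-reinforcing push away from the centre (0.0).
        x = x + gain * (local_mean - x) + gain * x
        updated.append(max(-1.0, min(1.0, x)))
    return updated

random.seed(1)
opinions = [random.uniform(-0.5, 0.5) for _ in range(100)]  # initially moderate
for _ in range(200):
    opinions = step(opinions)

extreme = sum(1 for x in opinions if abs(x) > 0.9)
print(f"{extreme}/100 agents now hold near-extreme views")
```

Run for a couple of hundred rounds, the initially moderate population drifts out toward the poles, which is roughly the dynamic the echo-chamber description above is pointing at.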

Another account of such pathological cycles or feedback loops can be found in a critique of the sociology of “worldviews”, in which these feedback loops essentially operate through the solidification of separate, hardened and almost absolute views of reality. But the core detail, at least in how I’ve thought about it in the past, concerns how the feedback loop leads to, or exemplifies, increasingly extreme positioning of views that are no longer rational in their epistemological engagement. The study of an issue or social phenomenon, the engagement in debate about varying interpretations, is not based on an openness to learn or to arrive at a more accurate account – rather, the drive seems to be to argue from the basis of one’s biases. This is not only counter-scientific and non-rational; it is very similar to the sort of patterned dogmatism critiqued by certain Enlightenment philosophers centuries ago.

Evidence of such trends can be found in both theoretical and empirical research, such as the widely cited Pew Research Center study on Political Polarization in the American Public.
But we can also deepen this discussion by noting that what one is also describing here is the implicit (and often explicit) presence of bias in people’s engagements, as well as what seems to have become the cultural acceptance of that bias in the name of “politics”. This signals the magnitude of the problem of human self-ignorance, individually and culturally. I think it is an epistemological problem inasmuch as, in other studies, we understand such bias, prejudice and dogma as sociologically, culturally, psychologically and even bio-chemically driven. What is required, to the best of my reading, is genuine critical thinking that challenges the very construction of these negative feedback loops and “worldviews” as well as, psychologically, the production of increasingly hardened attitudes.

Psychology

But let’s pause and think a little about the psychology. This problem seems to be more than instinct versus rational cognition, though that is one reduction. The notion of a “gut feeling” – an intuition or impulse based purely on experience (i.e., emotional history, past experiences, etc.) – could be argued to be one distinct basis of the irrational process of belief formation. It rests on subconscious decision-making processes, cognitive biases, memories and even bio-chemical reactions. There is a lot of incredibly interesting science in this area, especially in relation to survival training. One might think, too, of the psychology of self-affirmation theory, in which researchers describe how “much research suggests that people have a ‘psychological immune system’ that initiates protective adaptations when an actual or impending threat is perceived” (p. 184). Additionally, and interestingly, “At both the individual and collective levels, important domains of functioning—health, political decision-making, conflict, relationships, academic performance—call forth the motivation to defend the self. People defensively distort, deny, and misrepresent reality in a manner that protects self-integrity” (pp. 230-231).

Furthermore, there is a lot of evidence in psychology about how, when one perceives oneself to be under threat, the brain resorts to an evolved fight-or-flight reaction. The perceived threat can be physical or cognitive. Under anxiety, the brain shuts down (for lack of a better description). This state is certainly not conducive to rational deliberation, as areas of the brain that govern working memory, the processing of new information, and so on are inhibited while adrenaline floods the bloodstream and the hormone cortisol is released. Working memory, in particular, has been shown to be impaired in response to increases in cortisol. In other words, the body switches to relying on instinctual mechanisms.

In these situations, and even in those where one is faced with overwhelming or unclear information and uncertain decisions, studies have found that people tend to resort to cognitive shortcuts. In psychology such shortcuts are referred to as heuristics; they can be useful, but they also tend to lead to irrational decision-making. It’s an interesting concept, which refers back to the inhibiting of the brain and of the basic cognitive processes that might foster rational deliberation and consideration. Most recently I learned of a study in The Journal of Neuroscience, which furthered the discussion on heuristics.
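
As a hypothetical illustration of the shortcut idea – my own toy numbers, not taken from the study just mentioned – an availability-style heuristic judges how common something is from the few vivid instances that come to mind, rather than from the full record, and so systematically overestimates dramatic events.

```python
# Hypothetical illustration of an availability-style heuristic:
# estimating frequency from what comes to mind most easily (vivid events)
# rather than from the full record.
import random

random.seed(7)
# Full record: 1000 days, a dramatic event ("attack") on roughly 2% of them.
record = ["attack" if random.random() < 0.02 else "uneventful" for _ in range(1000)]

def slow_estimate(days):
    """Deliberate ('slow') estimate: count over the whole record."""
    return days.count("attack") / len(days)

def fast_estimate(days, recalled=20):
    """Heuristic ('fast') estimate: vivid days are over-represented in recall."""
    vivid = [d for d in days if d == "attack"]
    ordinary = [d for d in days if d == "uneventful"]
    sample = random.sample(vivid, min(recalled // 2, len(vivid)))
    sample += random.sample(ordinary, recalled - len(sample))
    return sample.count("attack") / len(sample)

print(f"deliberate estimate: {slow_estimate(record):.2f}")   # close to 0.02
print(f"heuristic estimate:  {fast_estimate(record):.2f}")   # far too high
```

The point of the sketch is only that the shortcut is cheap and often serviceable, but it answers a different question from the one actually asked of the evidence.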

There is too much science to link in one essay. But one of the basic ideas across the literature seems to be that, in many situations, our cognitive state can be incredibly reactive and certainly not logical. That some studies on heuristics, such as the one in The Journal of Neuroscience, suggest it is not all emotion, but potentially also cognitive laziness, adds even more intrigue to the total picture that researchers are slowly building.

From a behavioural perspective – and here I am thinking aloud from memory of resources I’ve studied over the years – it would seem that there are many instances in which one’s inclination is toward a very instinctual pattern of thinking, similar in design to the fight-or-flight reaction; though I don’t think it is necessarily deterministic, as survival studies show people can think rationally under threat or stress. And this makes sense in that, if my memory serves me correctly, the most primitive threat-response circuitry of the brain is understood to involve the amygdala. Located deep within the medial temporal lobe, this part of the brain is thought to be linked to both fear and pleasure. Some describe it as the “danger detector”. But the amygdala has also been linked to cognitive bias and biased behaviour.

On an epistemological level, these fascinating points of research remind me of the widely celebrated book Thinking, Fast and Slow (2012) by Daniel Kahneman. I discovered this book well after the particular period of interest and study that resulted in my thesis on social pathology, which is a shame, as it would have significantly impacted my arguments and would have served as a key reference.

Additionally, this discussion brings to mind a study I recently read on the epistemology of bias, particularly in relation to conspiracist ideation.

PLOS Study

The study, titled Epistemic beliefs’ role in promoting misperceptions and conspiracist ideation, is interesting in that it examines prejudice, bias and conspiracist ideation in relation to epistemology. As an empirical reference, it seems to drill down a bit deeper into an incredibly fundamental issue: human irrationality.

In short, the researchers found that “People who tend to trust their intuition or to believe that the facts they hear are politically biased are more likely to stand behind inaccurate beliefs. And those who rely on concrete evidence to form their beliefs are less likely to have misperceptions about high-profile scientific and political issues”.

As the lead researcher, Kelly Garrett, commented in the article linked above:

“Scientific and political misperceptions are dangerously common in the U.S. today. The willingness of large minorities of Americans to embrace falsehoods and conspiracy theories poses a threat to society’s ability to make well-informed decisions about pressing matters […]. A lot of attention is paid to our political motivations, and while political bias is a reality, we shouldn’t lose track of the fact that people have other kinds of biases too.”

In addition to the PLOS study, Kahneman’s book also seems to suggest an interesting connection between all the points discussed so far and the general cultural development of thinking fast as opposed to slow, rational thought and consideration. More than that, as the authors of the PLOS article suggest, one counter to epistemic beliefs not based on evidence is premised on “emphasizing the importance of evidence, cautious use of feelings, and trust that rigorous assessment by knowledgeable specialists is an effective guard against political manipulation”. It echoes calls that what is needed – at the very roots of culture and society – is a more scientific mindset. Perhaps it also signals the need for a slower, more thoughtful and considerate culture? Perhaps, too, it signals the need for more “critical thinking” and less political thinking?

All of these questions and many others encircle a deeper issue that seems to be the subject of increasing empirical acknowledgement. But more concisely, there is also the question as to whether – or to what extent – there is a social and structural component to the fostering or promoting of cultural groups based on misperception, bias and conspiracist ideation. This ties in to discourses on social pathology, in which there is a component of human stupidity and, thus, irrationality, understood to operate as a scar. It’s the idea of the frightened snail, which edges along with its tentacles extended until, in fear, it recedes back into its shell. The analogy serves also as a description of the hardened subject, for whom stupidity is, in a sense, a developmental and emotional scar. The key idea here concerns an enquiry into what role social, economic and broader environmental conditions play in fostering rational subjectivity, as opposed to irrational, fear-driven and hardened forms of subjectivity.

It is striking, too, in this age of post-modernism, in which post-empirical, post-truth developments provide ripe soil for the growth of conspiracist ideation and non-scientific approaches, that such conspiracies as the earth being flat often also refer in some way to hidden forces. It’s almost like the myth of the devil all over again. But what I am really angling toward is the question of sociohistorical-cultural context in addition to psychological, emotional and cognitive development. A lot of conspiracy theories are premised, as the PLOS study alluded, on manufacturing what I would describe as a substitute reality. As such, many of them seem to perform the same operation as the myth of the devil – and of religion writ large – insofar as they explain away, in a ‘just so’ sense, everything that the individual finds overwhelming or difficult in life and in the (real) social world. Unemployed or stuck in a dead-end job in one’s thirties? ‘It’s not my fault, it’s the Illuminati or some hidden New World Order!’ These sorts of explanations or justifications don’t seem uncommon – they indicate a belief that society is rigged, and that one’s struggles in life or in seeking personal success are the result of a massive hidden force, as opposed to one’s choices or concrete social issues like economic inequality. Conspiracies, in this sense, act as substitute realities that appease the psyche not only with respect to empirical injustice, struggle and suffering – that is, concrete structural social, economic and political issues – but also, on my observation, deflect from personal responsibility and the existential angst common to human experience. These “worldviews” become so entrenched that even the concept of facts is rendered meaningless.

Thinking broadly – and perhaps searching philosophically

What I am mostly curious about at the current juncture is how all of this might link together, especially considering the PLOS study with respect to the formation of closed and prejudiced – or dogmatic – systems of belief. Philosophically speaking, it is interesting to think about how prejudiced systems of belief seem to operate according to established predictors of misperception. Indeed, that is something the PLOS study hints at.

Another interesting study, from Stanford University, was recently published. It fits with recent trends in research findings and the slow piecing together of a much larger picture of human bias. It suggested that changing behaviors may be easier when people see norms changing. This raises a number of very interesting sociological questions. One example cited, as summarised in the article linked above, concerns how “people ate less meat and conserved more water when they thought those behaviors reflected how society is changing”. One can think of a long list of examples that would seem to support, or be supported by, this research. Think of so facile an example as changes in fashion trends, where the majority of people will think a new style looks ridiculous only for that same style to be normalized and supported by the majority five years later. One can draw examples of the same basic meaning from a number of areas.

More deeply, it is interesting to think of this research in relation to cognitive bias. Can it help explain why some norms, which may have become pathological or destructive, continue to be sustained? For example, think of the norms of gun culture in the U.S. in comparison to every other western society, particularly in relation to levels of gun violence. The data, at least when I last reviewed it, was striking. Moreover, given that studies of social pathology have a direct connection to the study of social norms, can it be said that there is a link between this research on behaviour in relation to norms and social bias more generally? One can think of a number of different types of bias in this context, including the pull of strong Bayesian priors.
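
To make the remark about Bayesian priors slightly more concrete, here is a minimal worked sketch with invented numbers (not taken from any study cited here): under Bayes’ rule, the same supporting evidence moves an open-minded prior a long way but barely dents a near-certain one, which is one formal way of describing a hardened attitude.

```python
# Minimal sketch (invented numbers) of how an entrenched prior behaves like a bias:
# identical evidence barely moves a near-certain belief.
def bayes_update(prior, p_evidence_if_true=0.8, p_evidence_if_false=0.2):
    """Posterior probability that a claim is true after one piece of supporting evidence."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

for prior in (0.50, 0.10, 0.01):
    posterior = prior
    for _ in range(3):  # three independent pieces of supporting evidence
        posterior = bayes_update(posterior)
    print(f"prior {prior:.2f} -> posterior after 3 updates: {posterior:.2f}")
```

With these numbers, someone who starts out almost certain the claim is false still ends up more likely than not to reject it after three pieces of supporting evidence; the prior does the work that this essay has been calling bias.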

Ultimately, these questions should be saved for another time, after more thorough research and study has been carried out. Indeed, a lot of the questions I am hinting at need to be weighed against the evidence. But this essay has, admittedly, aroused in me a deeper question about bias and what, in sociology, one might describe as “systemic trends”. If bias is so widespread and prevalent, and if human beings are (or can be) incredibly irrational, what does this say in epistemological terms about us in our present history? Let me put it another way: what does it say about the current successes of the enlightenment project, and how far do we still have to go to defeat the epistemologies of myth? Bias and prejudice are intimately linked to myth, which is itself perhaps the purest case of human irrationality.

A fascinating example, which I’ll cite here to touch on the deeper point I seem to be encircling, refers to an anti-vaccination movement in the 1990s. I recently read about this in relation to the latest science, which suggests, due to a variety of factors (some cyclical, some related to climate change), that Lyme disease is potentially about to explode among the populace. Currently, there is no vaccine, and Lyme disease remains a very urgent problem. But that’s not to say that there wasn’t a vaccine! Indeed, as Chelsea Whyte wrote, “We used to have one, but thanks to anti-vaccination activists, that is no longer the case”. What happened?

In the late 1990s, a race was on to make the first Lyme disease vaccine. By December 1998, the US Food and Drug Administration approved the release of Lymerix, developed by SmithKline Beecham, now GSK. But the company voluntarily withdrew the drug after only four years.

This followed a series of lawsuits – including one where recipients claimed Lymerix caused chronic arthritis. Influenced by now-discredited research purporting to show a link between the MMR vaccine and autism, activists raised the question of whether the Lyme disease vaccine could cause arthritis.

Media coverage and the anti-Lyme-vaccination groups gave a voice to those who believed their pain was due to the vaccine, and public support for the vaccine declined.

What is interesting about this example is that it is in no way uncommon, and it offers an interesting angle of insight into a problem similar to the one the PLOS study sought to investigate. A few years after the anti-vaccination movement won the media battle and swayed public opinion, comprehensive research in a retrospective study showed “only 905 reports for 1.4 million doses”. “Still,” writes Whyte, “the damage was done, and the vaccine was benched”. And even though there is a vaccine currently in early human trials, what is essentially myth – false or baseless knowledge thought of as true – remains, insofar as it will be an uphill battle to fight anti-vaccination lobbyists and re-educate the public.

Why this example, among many others, stands out is that it seems to correlate not only with the PLOS study but also with broader studies on the development of post-factualist, post-empirical culture. And the deeper question of this essay asks: what, if anything, underlies such developments in thought, perception and human behaviour?

Concluding reflections

To conclude: it will be fascinating to monitor the emerging research and growing body of evidence when it comes to understanding human bias. Which direction it all goes is difficult to say. We know human beings can be deeply biased. Not only is this a problem in greater society, it is one we must also constantly fight against in the natural sciences. But in thinking about epistemology in relation to the PLOS study, I suppose what I find interesting is the question of whether, if bias can take different forms – from the construction of some sort of complete worldview (think of a highly politicized subject) to a prejudiced belief about a particular topic (think of a generally pro-science, politically left individual who is nonetheless anti-GMO in the face of scientific consensus) – perhaps the simple reduction is one of science or anti-science? When I think of anti-science, and anti-reason for that matter, which can actually sometimes operate under the guise of pro-science and pro-reason, I think of a closed, repressed, dogmatic form of thinking that possesses very particular epistemological characteristics. What does one call such thinking? I have no answer. I’ve seen accounts under numerous headings: “uncritical thought” (associated with the critical thinking movement), anti-enlightenment epistemology, ideological thought, the epistemology of cognitive bias, and so on.

It also serves to emphasize things like Bayesian reasoning, critical thinking and critical reading (and, simply, general cognitive and epistemological agility and openness to new evidence), which are incredibly important skills and analytical tools when it comes to academic study and even daily experience, forming a precursor to exercising to the fullest extent one’s capacity to reason and to engage with the world in a rational way. Critical thinking has a deep place in science, and is increasingly informed by advanced scientific research in learning, cognitive and neural systems, and so on. The definitions of all these terms are very well known, with students introduced to critical thinking and reading exercises at undergraduate level (or earlier).

Having said that, it is worth noting that critical thinking is not necessarily a negative process, although some describe it this way. This is because the a priori aim of critical thinking and reading is not to find fault. Identifying, constructing and evaluating arguments is certainly one aspect of critical thinking, and this includes the ability to dissect arguments, locate underlying assumptions, and thus test those things for inconsistency. At its most basic, however, critical thinking is a deeply rational process – assessing the strength and weakness of an argument, especially when weighed against the evidence and a fuller assessment of the phenomenon or issue in question from all points of study in its complexity. Thus, it refers to systematic evaluation and problem-solving, as well as normative consideration and reflection on the status of one’s own beliefs and values as a subject, so as to ensure openness against potentially creeping bias and prejudice.

In this sense, it may not be entirely accurate, but I often think of “critical thinking” in epistemological terms as systemic thinking, in that one component of it is to seek to understand not only the local phenomenon but also the systems around it, or within which that phenomenon (or issue) exists, and thus also the logical connections between concepts, ideas and the thing itself. It is about deep, multidimensional and open consideration, inasmuch as it entails scrutinizing the work presented to see whether there are detectable biases that shape the author’s interpretation of facts and ideas. But beyond that, to think critically does not necessarily mean to think “politically”. It is much closer, in epistemological terms, to thinking objectively, slowly and with great deliberation.

And so maybe one lesson here – in a consideration of a deeply biased social world – is the need for more critical thinking, and less political thinking?

Perhaps another interesting question concerns whether, if bias and prejudice are as widespread and prevalent as emerging research on cognitive bias would seem to indicate, the current trajectory of social culture is not heading in the opposite direction from where it should be. Contemporary social culture seems, to speak more speculatively, predicated on “fast thinking” (to borrow from Kahneman) and heuristics. There seem to be a lot of reactionary debates, instead of thoughtful, informed and considerate engagements. With an overwhelming and endless flow of information – which almost acts as a new form of censorship, repressing genuine content, data and fundamental discussion – there seems to be a significant emphasis on, and discernible demand for, immediate reaction, click-bait headlines, news spectacle and watered-down literature, as opposed to thoughtful, well-informed and evidence-based deliberation that requires deep rational consideration. It leaves one to wonder: would the general coordinates of a science-based society or science-inspired culture not be represented by a completely different vector?