Why do people and groups ignore, deny and resist knowledge about society’s many problems? In a world of ‘alternative facts’, ‘fake news’, and ‘fact resistance’ that some believe could be remedied by ‘factfulness’ or ‘enlightenment’, the question has never been more pressing. Following years of ideologically polarised debates on this topic, the book seeks to further advance our understanding of the phenomenon of knowledge resistance by integrating insights from the social, economic, and evolutionary sciences. In current debates and studies, several vital factors are downplayed: that all people and institutions – even science – occasionally resist knowledge while calling their resistance ‘scepticism’, that knowledge resistance is not always irrational, that facts don’t equal truth, and that knowledge claims continuously need to be re-evaluated. Ignoring such key factors undermines the chances of reducing problematic knowledge resistance. Examples used in the book include controversies over climate change, the roots of violence, gender roles, religion, child-rearing, vaccination, genetically modified food, and artificial intelligence. In addition to accessible discussion of the scholarly literature and media sources, in-depth interviews with other renowned human scientists in the UK about their perspectives on knowledge resistance contribute to understanding this intriguing phenomenon. Moreover, the author shares his personal experiences of cultural clashes between different knowledge claims. The book is written for the educated public, students, and scholars interested in how people and groups handle knowledge controversies, and how such disputes can be resolved in the service of better managing the urgent social, environmental, and health-related problems of today.
This chapter uses the case of the measles vaccine scandal to show how, even when provided with overwhelming data showing that the vaccine doesn’t cause autism, we may under some circumstances resist such knowledge claims and even become more certain that the data are wrong or irrelevant. Although the chapter shares the common puzzlement about such fact resistance, it also introduces the book’s critique of ‘common-sense complaints’ about fact resistance, which rest on a naive view of people as either irrational or as capable of being saved from their fact resistance if only they are given more and better facts. The aim of the book is presented here: to rethink knowledge resistance by treating it as the multi-faceted and profoundly human phenomenon it is. Such rethinking requires that we do two things differently. We have to allow ourselves to gain insights from what the broadest range of human sciences have to say about knowledge resistance. Moreover, we have to leave some of our – usually negative – preconceptions about knowledge resistance aside. The chapter ends by briefly describing what the rest of the book will cover.
This chapter describes phenomena that are typically discussed in the same breath as knowledge resistance. Scepticism is here described as the opposite of knowledge resistance, since it implies a readiness to accept a knowledge claim once the evidence and arguments are strong enough. Knowledge resistance, on the other hand, implies immunity to evidence and arguments. The chapter also discusses the difference between fact resistance and knowledge resistance. The latter term is preferred in this book because the common complaints about fact resistance fail to address how it is possible to lie and deceive oneself with correct facts. Fact resistance is not the whole story when people and groups seem to avoid crucial insights about climate change, vaccination, crime prevention, or social well-being. The chapter shows why knowledge resistance is a far more useful concept for fully understanding and handling stubborn avoidance of insight. In introducing knowledge resistance, the chapter also indicates that it can’t be reduced to an emotional, passionate mindset. Nor can acceptance of seemingly valid knowledge claims be reduced to a ‘reason-oriented mindset’. The chapter ends with some additional remarks about what knowledge resistance is and isn’t.
This chapter begins by describing a scene where the author, as a Swedish high school student on an exchange trip to Colorado, witnessed a ritual of ‘moderate’ spanking while visiting some friends of the host family. The example illustrates how the author and the American father struggled to stay loyal to the knowledge beliefs of the social liberal Swedish culture (where physical punishment of children is prohibited) and of the conservative part of Colorado culture, respectively. This prompts a discussion of why it appears so important to us to stick not only to our community’s moral beliefs but also to its beliefs about what is true and false. The chapter shows how we often judge the quality of other people’s knowledge claims by checking what community they come from rather than testing how reliable the claims themselves seem to be. This leads the chapter to introduce the strongly social function of knowledge. Rather than as truth-seekers, we have evolved throughout the long history of humankind to use knowledge claims very flexibly in order to strengthen our social bonds with our group and mark our distinction from other groups.
In this chapter, the picture is broadened from descriptions of how community-based knowledge resistance takes place to why such patterns are so prevalent. To do this, the chapter combines perspectives on human biological evolution with anthropological and sociological thought. What – if anything – could be the ‘adaptive value’ of knowledge resistance in groups? Could different chances of survival and reproduction throughout humanity’s long history be associated with differences in people’s inclination to conform to the knowledge beliefs of their community? Community-based knowledge conformity is here illustrated by an examination of religious and scientific groups. The chapter also sheds light on such conformity in the context of male-dominated audiophile communities in their passionate pursuit of the perfect stereo-system sound. The more sceptical science is about the superiority of some audio equipment over others, the more it strengthens the communal sense of audiophiles, who insist that the double-blind comparisons of audio equipment carried out by scientists lack the sensitivity and sophistication of their golden ears.
This chapter moves from knowledge resistance between communities to how individuals handle knowledge when they reason with each other. We usually like to think that the point of reasoning with each other is to move closer to a more accurate understanding of the world. If that were the case, people would begin their reasoning with others in an open-minded way, by collecting and evaluating the available information and arguments systematically before shaping their knowledge beliefs. However, as court judges have known for centuries, the order is commonly the reverse. This is in line with new research in a strand of thought called argumentative theory. It stresses that the function of reasoning with others is primarily to win the argument and persuade the others of one’s own belief rather than to move closer to the truth. This chapter also raises the question of whether some phenomena have certain inherent features that make us particularly inclined to resist seemingly valid knowledge about them. Examples discussed include vaccination, climate change, and human evolution.
This chapter discusses an idea shared among some economists: people actively and consciously acquire only the knowledge that will be instrumental in achieving their clearly defined, substantive goals. Accordingly, people resist knowledge when acquiring it costs more time, money, and other resources than it benefits their efforts towards reaching their goals. The chapter presents three versions of this idea. The first contends that people always know what knowledge they resist, and that they oppose it for the reason mentioned above. The second is the notion that people usually resist knowledge in a goal-rational way, although they sometimes fail. The third contends that each individual is several persons over time, with partly conflicting preferences. Although each person within the individual always resists knowledge rationally given their time-limited goals, their resistance can be irrational given the goals of the individual’s other persons. The chapter adds an alternative to these views. Whereas the perspectives of ‘rational ignorance’ in this chapter focus on substantive costs (regarding time, money, effort, health, and the environment), the Dionysian, deeply social interest and the social rationality associated with it need to be incorporated into any analysis of the costs behind knowledge acquisition.
This chapter looks at two additional sides of knowledge resistance, sides that partly fall within ‘strategic ignorance’. The first is resisting knowledge when it carries with it a moral and social responsibility. This includes knowledge about our own or others’ genetically carried diseases, or practical information on how we could do more to reduce the suffering of others, for instance in the developing world. The second is resisting knowledge when ignorance opens up opportunities that would be difficult to pursue if we knew ‘too much’. Convincing others – and ourselves – that we are ignorant about specific occurrences can in some cases be of great value. Examples include certain innovative environments and even scientific fields where ‘fresh thinkers’ and ‘blank slates’ may sometimes be seen as increasing the chances of thinking outside the box. The chapter also shows three different ways of ignoring knowledge strategically, often in organisational settings: deny, dismiss, and divert. The chapter concludes with a brief discussion about how to assess and distinguish harmful versus beneficial knowledge resistance.
This chapter examines an influential factor in making sense of concrete cases of knowledge resistance: people’s concerns about what would happen in culture and politics if they didn’t resist a particular knowledge claim. The chapter uses the example of knowledge about biological evolution to illustrate three ways that people interpret such knowledge. The first is that society should imitate evolution, with its competition and its pressures, particularly on weaker individuals, as a model for society and politics. Social Darwinism exemplifies this view. The second is to resist knowledge claims that biological evolution interacts with culture in shaping, for instance, male control of and violence against women. Parts of the social sciences and cultural studies imply this view. The third is that we should recognise and learn about human evolution in order to better combat those of its cultural expressions that we acknowledge as harmful. Men’s frequent attempts to control and exert violence against women are an example. Although this chapter endorses the third view, it concludes with a self-critical discussion about the challenges that remain in choosing whether to resist knowledge that we know some groups will apply in harmful ways, or to accept it while combating those harmful applications.
The chapter discusses why the ‘rationality’ of resisting some knowledge does not mean that knowledge resistance is good. This leads the chapter to elaborate on what could be done – if anything – about knowledge resistance in cases that seem harmful. The chapter reiterates an earlier discussion of how difficult it is for individuals to end their biased handling of information. Some ‘Apollonian’ heuristics are presented, ways of thinking that may help us think straight and more logically when we face, for instance, biased media coverage. While such heuristics may ideally help us think with more ‘factfulness’ – a term popularised by Hans Rosling – they are far from sufficient for fighting the more profound knowledge resistance in society. To truly make a difference, we must first collaborate within and between organisations to this end. Second, measures will need to be taken mainly through structural changes in the rules of the game for how we handle knowledge. Concretely, the chapter suggests several ways in which structures can be altered. These include problem reframing, employing cognitive role models, and knowledge collaboration across groups.