Note to Self

You must realize that other people experience the world differently than you.

Until then, you won’t realize anything.

Admit you blindly carry your prejudice around with you everywhere.

Until then, you’ll think yourself a god.


Groupthink: The Human Evolutionary Superpower?

I recently listened to a lecture, A New Theory of Human Understanding, by Hugo Mercier. In the talk, he discusses the research and findings behind his new book, The Enigma of Reason.

It all starts with a question: how and why did human beings acquire the ability to reason? The standard answer is that reason exists to assist with decision making. We assume that reason emerged under selective pressure favouring smartness. It’s survival of the most accurate! The ‘fittest’ are those most adapted to figuring out what is real and true about the world. As a Darwinian explanation, this seems reasonable enough.

Or is it?

With his colleague Dan Sperber, Mercier proposes a counter-hypothesis: what if the original function of reason was not to make ‘right’ or ‘correct’ predictions or observations about the world, but to convince other people to think or act differently?

If you want someone else to do something, walking up to them and barking an order is highly ineffective. Even when commanding from a hierarchical position of authority, you must still accept that influencing the actions of another human is a nuanced affair. To change what someone does, you have to change something about the way they think. You have to give them an insight into why your idea is a good idea. They need to buy it. Trust is crucial. In short, even the simplest act of influencing another person requires you to reason with them.

In this account, reason emerges primarily to serve a social function. The original, adaptive purpose of reason was not to reach correct and factual conclusions about the world but to convince one another about our ideas — beliefs about how we should organize, behave, and work together. For Mercier, this theory goes a long way to explaining why our reasoning is so prone to bias and groupthink. Reason evolved socially, not by solitary philosophers roaming around and coming up with scientifically factual observations about the world. (The practice of intentional, self-aware objectivity is only a particular ‘use case’ for reason that develops far down the evolutionary line.)

This social model for the evolutionary origin of reason helps explain why we tend to strongly favour what we already believe and why we excel at using reason to justify our prejudice and bias. After all, if the original purpose of ‘reason 1.0’ was to enable us to function and communicate as a group, we should not be surprised that our capacity to self-organize around ideas far outperforms our devotion to sober, detached analysis. In evolutionary terms, group inclusion equals survival, making group identity primal and paramount.

Consider the modern implications. For instance, we might look at the concept of ‘conclusive evidence’ itself as a socially constructed and group-dependent cognitive framework: using evidence to make decisions is only an effective strategy to the extent that you are part of a group that shares your convictions about evidence in the first place.

The ramifications of Mercier and Sperber’s theory are significant. It invites us to critically rethink what it even means ‘to reason with someone’ in the first place. To suppose that presenting another person ‘with the facts’ is a coherent strategy for changing their mind is, perhaps, a mythology of its own.


Open-mindedness

A few days ago, I heard someone say, “You need an open mind,” at the beginning of a sentence to explain why their view of an issue was indisputably correct.

I chuckled at the irony.

Mostly, of course, I was laughing at myself.

Closed-mindedness is a curse and burden I tend to assume is carried by other people.

If they would just open their minds, then they’d see the light, accept the facts, and know the truth.

Most of the time, I use “a willingness to learn” as a laudatory description for those who have journeyed to the camp of my conclusions.

If travelers pack up and continue down the road — beyond the confines of my settlement — I rename their “willingness to learn” as rebellion, ignorance, heresy, or a listless lack of commitment.

I guess the only mind I need to worry about opening and closing is my own.


Prejudice and Power

A recent panel discussion at the London School of Economics highlights two distinctive ways to think about the nature of prejudice.

On the one hand, we might think about prejudice as a function of individual psychology. We might say that humans are wired by evolution to spot patterns and predict the behaviours of others. Yes, people make erroneous judgments all the time, such as ascribing incorrect beliefs and motives to individuals from observed external characteristics, but these glitches might be corrected by education and training.

The other way to describe prejudice is to locate it in a social environment. In this case, it is not merely the case that some individuals make incorrect prejudgments, but rather that the structures of society more or less train us to systematically assume certain things about one another. For example, if the media repeatedly pummels us with images of a particular group performing violent acts, we are far more likely to prejudge (unconsciously or otherwise) an identifiable individual from this group when we meet them on the street. Scale this up and you have oppression: societal discrimination operationalized to assign certain groups subordinate roles in society.

The idea here is that we need to see the causation of discrimination as cyclical: we have biological processes that equip us to make heuristic snap judgments about one another, yes, but we live in a social world that informs the objects and content of our prejudice. In other words: our prejudice frames the world and the world frames our prejudice. To understand how prejudice manifests itself, we need to realize that these two “distinctive” modes function reciprocally, not in isolation from one another.

This view of prejudice as a self-reinforcing social phenomenon leads us to a critical observation: we cannot fully talk about the prejudice of individuals without talking about who has power in society. Power and prejudice are inextricably linked to one another. Thinking about prejudice as an exclusively “personal” or “individualistic” problem fails to account for the ways that institutions and governments are critical actors in the social environments that nurture, inform, and bias the opinions of individuals. At this level, then, we need to think about prejudice as being as much systemic and structural as it is personal and individualistic.

If this is true, it does little good to place the “blame” for prejudice at the doorstep of an individual who has a preconceived opinion about another group of people, at least to the extent that their preconceived ideas came from reified “cues” in society. The end game of this perspective is not to fully write off the responsibility of the individual for acting in prejudicial or discriminatory ways, but rather to situate the individual in a context that attempts to account for the array of social, political, and corporate interests pushing and nudging them at all times.

If we say that prejudice is only a “natural” human phenomenon, we are effectively (and selectively?) ignoring all the other structures we have enshrined to organize society.


Check your hidden bias about your hidden bias

In a recent episode of Analysis, David Edmonds digs into some of the conflicting evidence and critiques of implicit bias research. Does the Implicit Association Test stand up to rigorous statistical scrutiny? Do survey tools that measure implicit bias actually predict bigoted, prejudiced, or discriminatory behaviour in the real world? How explanatory is the data we use to describe implicit bias? Do workshops that teach people about implicit bias give participants effective tools to constructively change their unconscious thinking… or do they inadvertently foster an atmosphere of distrust between majority and minority groups?

This programme provides a good reminder to maintain a healthy, indifferent skepticism about using indirect measures to infer or quantify the way that other human beings think about one another. At the risk of falling into an infinite rabbit hole of meta-analysis, the key takeaway here is to remember that it’s quite possible to be biased about one’s understanding of bias.
