Cognitive Biases 101

So as promised, Shane Mauss asked me to do a blogpost about cognitive biases - so here it is!

If you listen to my podcast with Shane and Adam you will hear how he actually thought I was just asking him to suggest a topic to investigate in the field of scientific enquiry more generally. The truth, though, is that a sound understanding of cognitive biases is important to keep in mind when thinking about psychedelic science, in part because such an exciting and dynamic area of interest can induce a kind of myopia. Those of us who want to see psychedelics become more widely disseminated are at risk of falling prey to biases in their favour (i.e. “I am going to recirculate all studies in favour of their usage and ignore those that don’t return such glowing results”).


This blogcast marks the start of a ridiculous, self-imposed workload. I want to increasingly make the podcast more of an organic discussion, and as opposed to assuming areas of interest for a blog, I will ask guests to suggest topics that I will subsequently go and research, so you don’t have to. This way I will delineate podcast conversations from blogcast education. I will also record the blog as an audio version so you can listen during your ‘found time’ if that works better for you.


We all have biases. They seem to constitute a universal human trait. As a human being, you have two choices - to be the type of person who has cognitive biases and acknowledges that they have them, or to be the type of person who has biases and does not acknowledge that they do.

The former type of person can at least be on the lookout for such biases, and their capacity to think objectively is therefore somewhat protected. The latter type of person is often subconsciously hijacked by their biases and is unreliable in their decision making. Their judgements and proclamations have an ‘infallible’ hue to them and they are very vulnerable to adopting dogmatic styles of thinking. They are not truly available for productive collaboration…

…This is why they are often the most dangerous person in the room.

In the current phase of the psychedelic renaissance it is important we have an understanding of cognitive biases and how they can nudge us into conflating our personal narrative with the way things actually are (i.e. confusing the map with the territory). Shane and Adam have noticed a current tendency in the psychedelic community for the adoption of a ‘psychedelic orthodoxy’. This troubling development is one we should call out wherever we find it, because dogma (with all its attendant bullshit), is often just what you get when biases all clump together unchallenged and then ossify into a big hard mess.

So here you go, Shane - thanks for doing the podcast.


A cognitive bias is a systematic error in thinking that affects the decisions and judgments that people make.

  • Some of these biases are related to memory. The way you remember an event may be biased for a number of reasons and that in turn can lead to biased thinking and decision-making.

  • Other cognitive biases might be related to problems with attention. Since attention is a limited resource, people have to be selective about what they pay attention to in the world around them. Because of this, subtle biases can creep in and influence the way you see and think about the world.


Our brains are constantly creating our own subjective social reality from the processing of perceptual input. Thus your construction of social reality, not the objective input, tends to dictate how you behave in the social world. Tripping on a classical psychedelic leaves you in no doubt about just how reductionist the construction of ordinary waking consciousness truly is.

Under the influence of psychedelics, the experience of the raw data of the world made manifest before you is - well,

it’s a lot.

To understand why we have cognitive biases you therefore need to appreciate their evolutionary utility.

Our ancestors were tasked with navigating through a perilous world of one proximal danger after another. Parsimony was the name of the game, and cognition that heavily filtered the ‘world at large’ down to the absolute bare essentials was thus selected for.

The only people who don’t think like this with any degree of regularity are babies, poets and people who are tripping (three groups who are famously good at beholding the wonder of the world in real time, but infamously terrible at remembering to take the bins out or do their tax returns on time).

The following poem, ‘Snow’ by Louis MacNeice, illustrates how such poetic minds can occasionally drop into this less reductionist world view. He describes how, upon walking into a room, his mind spontaneously becomes wide open:

The room was suddenly rich and the great bay-window was
Spawning snow and pink roses against it
Soundlessly collateral and incompatible:
World is suddener than we fancy it.

World is crazier and more of it than we think,
Incorrigibly plural. I peel and portion
A tangerine and spit the pips and feel
The drunkenness of things being various.


Such ‘drunkenness’ of thought just would not do for our neolithic ancestors. However, whilst the parsimonious system of ‘just enough information to survive’ has proven devastatingly effective from an evolutionary perspective, there is a trade-off:

our tendency to use mental shortcuts has allowed us to think faster but also means we are more likely to make mistakes.

Biases are the product of cognitive systems falling prey to such vulnerabilities - namely perceptual distortion, inaccurate judgment and illogical interpretation; phenomena which are broadly called ‘irrationality’. Like everyone else on God’s green earth, you have irrationality baked into your source code. Let that not be a damning indictment, but rather a liberating one - you are by no means alone in your irrationality, and as long as you acknowledge it you can control for it. You can educate yourself on the ways in which these biases are likely to show up in your thinking, you can train yourself to be a better thinker, and it is easier to do this in collaboration with others.

Another silver lining is the fact that whilst your thinking is irrational, it is often predictably so. Your thinking may be unreliable, but it is reliably unreliable.


In addition to the adaptive element of cognitive biases, there is a more physical causal factor. Our decision-making ability is constrained by the limitations of both our cognitive systems (i.e. the finite amount of ‘meat computer real estate’ imprisoned in our skulls) and the amount of accessible data we have to process (as well as the amount of time we have in which to process it). This means that when we try to solve a problem, we often end up reaching a solution that is different from the one we would reach if our cognitive systems were perfect, because we have finite processing power and an (obscenely) incomplete data set.

We are therefore rationally bounded. This means that we make decisions that are not necessarily optimal since instead of looking for the best solution that is available for a certain problem, we generally tend to look for a solution that is perceived as good enough given the circumstances. This inability to truly intake the ‘incorrigible plurality’ of the world is called bounded rationality.


A term you may have come across is heuristics - a type of mental shortcut. Our human default to conserve mental energy means we will often reach for an imperfect heuristic conclusion that is - on some level of cognitive analysis - deemed to be good enough for what it’s for.

As we have discovered, this pattern of cognitive miserliness has gotten us a long way from the East African plains, but it is prone to producing sub-optimal conclusions in three main ways. In order to understand these three types of derangement of outcome, we need a little background on two cognitive systems - known as System One and System Two.


  • System One is the term given to cognition responsible for intuitive processing. It is fast, spontaneous, automatic and effortless. This system can process seemingly disparate strands of thought in parallel, and it is strongly mediated by emotion. It is more Dionysian in nature. When you are intuitively and effortlessly listening to a story in your mother tongue (and it is invoking different strands of imagery and memory) you are utilising system one.

  • System Two is responsible for our conscious reasoning. It is slow, deliberate, controlled and effortful. When you are effortfully listening to directions in a language you barely understand (and in doing so trying to consciously both decipher the syntax and retrieve recently stored vocabulary) you are utilising system two.

These two comparative examples serve to demonstrate not just the visceral difference of operating in either of these systems (it explains for example why you can feel exhausted after a night of social conversation in a foreign language but rejuvenated by a chat in your mother tongue), but also the potential for transferability of tasks from one system to the other.

The arduous acquisition of mastery in a second language will over time shunt the processing of its comprehension predominantly from system two to system one.

Similarly, if you have gained any degree of mastery in chess, you might choose exactly the same move as you used to in certain specific situations - the difference being that in the past you could only arrive at the move through the laborious engagement of System Two, whereas now you just intuitively ‘see the board’ and ‘know’ that it is the best move to make.

Overall, based on this framework, cognitive biases occur primarily due to a failure of three cognitive mechanisms:

  • Failure of System One to generate a correct intuitive impression. This means that System One gives us a quick but incorrect solution to our problem.

  • Failure of System Two to monitor impressions generated by System One. This means that System Two fails to notice and correct faulty impressions generated by System One.

  • Failure of System Two to reason correctly. This means that System Two gives us an incorrect solution to our problem, due to a failure of our conscious reasoning process.


There are two main ways to classify cognitive biases.

  • The domain of thought implicated.

  • The upstream ‘cause’ of the bias.


In terms of the domain of thought implicated, different biases affect different areas of our cognition. Based on this criterion, some of the most common types of biases include:

  • Information biases. These relate to how we process information. The overkill effect is a bias that causes people to reject out of hand explanations they perceive as too complex. It is mediated by intelligence and antecedent knowledge, so when trying to persuade someone, make sure you are calibrating your information to their level of intellect and understanding; otherwise you will overwhelm them and they might paradoxically dismiss your point of view.

  • Belief biases. These are biases which relate to how we formulate belief. The backfire effect is a bias that demonstrates how the process of vehemently defending one’s pre-existing beliefs against evidence which shows them to be wrong serves to actually strengthen how tightly those beliefs are subsequently held. If you find yourself doubling down on a defensive argument, ask yourself why. Are you so sure that the trench you are digging deeper into contains the truth?

  • Decision-making biases. These are biases that affect the way we make decisions. We are social animals, so large swathes of our cognitive propensity have been selected for relative to how well they perform in a group setting. The bandwagon effect is a cognitive bias that causes people to do something because they believe that other people are doing the same thing. It is an example of the powerful phenomenon of conformity: we feel a need to act in accordance with others, and we often rely on other people’s judgment when deciding how to act. The psychological distress and discomfort generated by even the most banal proclamations of dissent from ‘groupthink’ should not be underestimated. Check out this example of the Asch conformity experiments to see how it works.

  • Calculation biases. These are biases that affect the way in which we calculate things such as probabilities or values. For example, the gambler’s fallacy is a cognitive bias that causes people to mistakenly believe that if something happens more frequently than normal during a given time period, then it will happen less frequently in the future. This occurs because people believe that a short sequence of random independent events should be representative of a longer sequence of events.

  • Memory biases. These are biases that emerge because of our imperfect encoding and retrieval of memory. For example, the rosy retrospection effect arises because, over time, we tend to disproportionately remember the good aspects of memories compared with the bad aspects of the same memories.

  • Social biases. These are biases that affect the way we perceive the social world around us, and thus how we behave in it. In our early toddler phase, we are ostensibly solipsistic in our outlook: we cannot ascribe divergent mental representations of the world to others, and this has been experimentally demonstrated. Thankfully, as we grow up, we graduate to a theory of mind, where we are able to understand that others have beliefs, desires, intentions, and perspectives that are different from our own. The spotlight effect (a cognitive bias that causes people to think that they are being observed and noticed by others more than they actually are) is perhaps an example of how we are occasionally vulnerable to slipping back into this heavily ego-centred viewpoint.

  • Another bias that can affect several areas of our cognition is confirmation bias, which is a cognitive bias that causes people to search for, favour, interpret and recall information in a way that confirms their preexisting beliefs. This bias can, for example, affect the way we acquire new information, as well as the way we remember old information and the way we evaluate different choices.
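The independence at the heart of the gambler’s fallacy is easy to check empirically. The sketch below is a minimal Python simulation (the streak length and flip count are arbitrary illustrative choices): it counts how often heads follows a run of three heads in a fair coin. If the fallacy were true, the rate would dip below 50%; in fact it stays put.

```python
import random

random.seed(42)  # arbitrary seed, just for reproducibility

# Simulate a long run of fair coin flips (True = heads).
flips = [random.random() < 0.5 for _ in range(100_000)]

# Look at the flip immediately following every streak of three heads.
# The gambler's fallacy says tails is now "due"; independence says
# heads should still come up about half the time.
after_streak = [
    flips[i + 3]
    for i in range(len(flips) - 3)
    if flips[i] and flips[i + 1] and flips[i + 2]
]

rate = sum(after_streak) / len(after_streak)
print(f"P(heads | three heads in a row) = {rate:.3f}")  # stays close to 0.5
```

Run it with different seeds and streak lengths: the coin never remembers its past, however strongly our System One insists otherwise.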

Often people want a big long list of very specific biases that they can rote-learn and then try to apply in their lives, but this is a bit like trying to remember exactly how to play a bundle of specific songs on the piano through mere muscle memory. If you practice technique and musical theory as they relate to the piano, after a while you can pretty much play any basic song - so try to think about domains of thought and relate specifically named biases to them. It’s a better way to have a working knowledge of the terms themselves anyway.


So in addition to classifying biases according to the domain of thought they implicate, they can be considered in terms of their upstream causes. There are several main causes of bias: heuristics, limited cognitive capacity, noisy information processing, the motivational nature of emotion, and the impact of social influence.

  • Heuristics

So one major source of cognitive bias is the taking of mental shortcuts, known as heuristics.

If you have ever been out of shape and then done a circuit training class that was far too difficult for you (but one that your fragile ego demanded you make it to the end of), then you might know the ‘embodied’ feeling of taking shortcuts. When the personal trainer isn’t looking, you go on your knees to do the push-ups; when you are running laps of the gym you might literally cut a few corners. All you are thinking about is conserving energy as best you can, just to make it to the end of the session without actually collapsing.

From an evolutionary perspective, our brains sort of learned to perform the cognitive equivalent of this! Think of your brain as similarly miserly with regard to output - but given that our brains are so tiny relative to all the things there are to know, this was actually our best evolutionary bet.

  • Limited cognitive capacity

This refers to our brain’s tendency to outsource the remembering of information. The google effect refers to how people have a tendency to forget info that they believe can be easily retrieved online. This isn’t a problem per se, and for a creature such as a human (with a finite amount of cognitive ‘hard drive’) it is actually quite an efficient thing to do. The problem arises when we develop a default disconnect between our conscious thought and our ‘in house’ data set: we cease being able to counteract our biases in ‘real time’ with cold hard evidence in our heads - because the evidence is no longer in our heads.

  • Noisy information processing

This means that the way we process information is subject to context. When a computer has a bit of information encoded into it, its ability to process that information does not depend on whether it was given the information in a humorous way. Humans, on the other hand, are biased towards differential processing and retrieval of information that was encountered in an emotional way. So, unlike the computer, you would probably be more likely to remember the information if the contextual ‘noise’ around it had been humorous.

  •  Affective motivation

Another interesting mediator of biases is affect - otherwise known as emotion.

We are more likely, for example, to develop a bias in favour of someone we have previously done a favour for - this is known as the Ben Franklin effect.

In his autobiography, Franklin explains how he dealt with the animosity of a rival legislator when he served in the Pennsylvania legislature in the 18th century:

“Having heard that he had in his library a certain very scarce and curious book, I wrote a note to him, expressing my desire of perusing that book and requesting he would do me the favour of lending it to me for a few days. He sent it immediately, and I return'd it in about a week with another note, expressing strongly my sense of the favour. When we next met in the House, he spoke to me (which he had never done before), and with great civility; and he ever after manifested a readiness to serve me on all occasions, so that we became great friends, and our friendship continued to his death”.

This occurs because people want to avoid cognitive dissonance, which could arise as a result of behaving in a favourable way towards someone that they either dislike or don’t like enough.

  • Social influence

This refers to how the way we classify other people impacts our cognition about (and our behaviour towards) them. For example, the outgroup homogeneity bias is a cognitive bias that causes people to view members of outside groups as being more similar to each other than members of groups that they are a part of. This occurs because people tend to allocate more of their attention to members of their own group since interactions with those people are generally perceived as more important.

When you encounter flat-out tribalism in our species (and you have some evolutionary understanding of how, in the past, people from ‘other tribes’ posed legitimate and significant infective and physical risks to your tribe), then you are more likely to perceive such tribalism as understandable. This is not the same as believing it to be acceptable - it just allows you a more nuanced appreciation of the troublesome primitive wells that exist in our psyche. Being conscious of such ancient evolutionary proclivities is probably your best bet for counteracting them in your day-to-day life.

Hopefully you can now appreciate that the roots of our biases go really deep. What is more, they overlap and intertwine…

For example, many heuristics-based biases occur due to our limited cognitive capacity, and many biases that have an emotional motivation are also affected by social factors.

You can’t really get under your biases - they are ancient - but you can work with them (more on that later).


Another way to distinguish biases is based on the involvement of emotion, or affect. ‘Hot’ cognition is cognition coloured by emotion, whereas ‘cold’ cognition implies cognitive processing which ostensibly functions independently of emotional involvement.

Hot biases are biases which are motivated by emotional considerations, such as our desire to have a positive self-image, or our need to feel that we made a choice that is valid from a moral perspective. Cold biases are biases which occur due to emotionally-neutral processes, such as our intent to make an optimal choice, or our intent to make a decision quickly.

A good way to think about this in relation to yourself is viscerally. If you feel you have identified a predictable deviation from reality in the way that you think, check in with yourself. If you can feel its affect (i.e. a subtle or not so subtle change in your emotional state), you are probably dealing with an emotionally mediated ‘hot’ bias. If you notice that you are systematically making a biased choice in a fairly dispassionate context (i.e. you have no strong feelings about it), then you might be dealing with a ‘cold’ bias.

As with the other criteria used to categorize cognitive biases, the hot/cold distinction can sometimes be difficult to apply, particularly for biases that are affected by emotional considerations only to a small degree. Nevertheless, the distinction is valuable in many cases and can be used in conjunction with other criteria in order to categorize cognitive biases and understand why they occur.


It can be a bit depressing to see how deep, diffuse and pernicious cognitive biases are. But that does not mean you should despair. Knowing about them is the absolute first step to stopping them from running roughshod and uncontested through your life. The fact that they seem to be a universal human trait should also give you some solace. We are all in this together - but you do still have a responsibility to yourself and others to control for your biases as much as you consciously can.


So just to recap, biases tend to cluster around three types of thinking errors, or cognitive failures.

  • Failure of System One (our intuitive system) to generate correct intuitions.

  • Failure of System Two (our reasoning system) to monitor and correct System One.

  • Failure of System Two to carry out a proper reasoning process.

The key to mitigating biases starts with three strategies which mirror these three failures.


Your system one is an intuition factory. It is wild and untrammelled, and given how expensive cognition is from a physiological perspective, we are more or less stuck with it as the predominant thinking style in our day to day lives. The goal is, therefore, to have it function less sub-optimally than it currently does.

A counterintuitive strategy on the path to thinking better is to actually practice jumping to conclusions on a more regular basis. Considered in isolation, this is terrible advice for a blog tasked with helping you to try and think better - but habituating yourself to make more use of system one is the first step on the road to becoming less irrational in your cognition. There is a statistical method buried within this seeming madness.

Within the field of statistics, there is a concept of power. The general rule of thumb is: the bigger your data set, the more powerful any subsequent statistical inferences will be. The same concept applies to the data set of information you carry around in your head. Your System One is effortless, so data collection is not a problem for it. Granted, it collects data randomly and poorly, but its hodgepodge sampling is better than none at all, and you need some raw ingredients to work with later on. It is a fundamental human right to habitually and fearlessly follow your System One wherever it may lead - as long as you confine such meanderings to the sacred privacy of your own thoughts.
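The bigger-sample-stronger-inference rule of thumb can be sketched with a toy simulation. (Power in the strict statistical sense concerns hypothesis tests; this is the looser point the text is making, that more samples mean steadier inferences. The coin bias and sample sizes below are illustrative choices, not from the text.)

```python
import random
import statistics

random.seed(0)  # arbitrary seed, for reproducibility

TRUE_RATE = 0.55  # a coin slightly biased towards heads (illustrative value)

def spread_of_estimates(sample_size: int, trials: int = 1000) -> float:
    """How much the estimated heads-rate varies across repeated experiments."""
    estimates = [
        sum(random.random() < TRUE_RATE for _ in range(sample_size)) / sample_size
        for _ in range(trials)
    ]
    return statistics.stdev(estimates)

# Bigger samples -> steadier estimates -> stronger inferences.
for n in (10, 100, 1000):
    print(f"n={n:>4}: estimate wobbles by about ±{spread_of_estimates(n):.3f}")
```

With ten flips the estimate swings wildly; with a thousand it settles down. System One’s scattershot data collection is the equivalent of growing your sample size before System Two ever runs a test.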

There is a tendency to reflexively repudiate ‘system one cognition’ when you first uncover the true extent of its irrationality, but one must accept that it is the best we have. System two is lazy - it does not seem to concern itself with pattern detection, with striking out into the unknown and with uncovering disparate connections and sensory inputs.


So now you have habituated your system one to fire on all cylinders, you can help it make better intuitions by creating tight feedback loops. Think of this as honing your system one’s intuitive capacity - by recruiting the monitoring skills of system two. This is the other side of the coin when it comes to becoming a better intuitive thinker.

Let’s demonstrate this coupling of system one and system two with a hypothetical example.

Imagine you move to a new city and decide you want to improve your ability to intuitively ‘read’ people - quickly and from a distance. You work in sales for a large international company. A friend invites you to a corporate party with many people you don’t know. You see an outgoing middle-aged chap across the room and decide to unleash your System One on him from a distance. Let’s say your System One rapidly and effortlessly arrives at the following conclusions about the stranger:

  • He is from the American midwest,

  • He works in oil and gas,

  • He works in sales,

  • He is married.

Do not concern yourself at this stage with how and why you arrived so rapidly at such conclusions. We already know that such conclusions have been derived from a lifetime of biases. Strategy one (firing up your intuition factory) has taught you to simply be conscious of the processing of system one. Now you have created and articulated to yourself several falsifiable conclusions. Now it is the turn of system two to strategise. (In a way, methodically generating a handful of explicit conclusions about the stranger is the work of system 2 already).

At this stage, your scattershot technique of data collection is going to be refined, because you are going to go over and actually talk to the person, with the express intention of finding out where they are from, what they do for a living, and what their marital status is.

So let’s say after five minutes of chit-chat you glean that they do indeed live in the American midwest, and whilst they do work in oil and gas, they are single and work in accounts - not sales. You have thus created a feedback loop. In a way, you have held the face of your intuitive biases up to the flame of empiricism.

Broadly speaking, two of your four conclusions were correct. You correctly ascertained the industry they worked in and where they were from. Utilise System Two to try and unpick why you thought this. It is an impossible task to comprehensively and consciously chase down all of the cascades of ‘priors’ that led you to these correct assumptions, but you might discover that you have a decent sensibility for such things. These intuitions (related to how you heuristically determine someone’s profession and place of origin) have proven themselves serviceable - until the next time they fire up, at which point they will be checked for veracity again.

The other two erroneous assumptions you made (that he was married and worked in sales) were based on what? Again, this dialogue between the methodical enquiry of System Two and the clandestine workings of System One will feel effortful in your head - and that is a good sign: it means you are properly recruiting System Two to the task at hand. You might consciously ‘squeeze out’ of System One the realisation that you have a bias to think that…

‘…all extraverted people are married and make great commissions in sales’.

Maybe you gain a little bit of insight that, as a deep introvert who themselves works in sales and has been unlucky in love, your lifetime pattern has been one of believing that your lack of ability to put yourself ‘out there’ has meant you have missed opportunities to both find a life partner and close sales in your job. Perhaps you are emotionally triggered to be jealous of people who you perceive to be extraverted because somewhere along the line, your system one has cultivated the ‘hot bias’ that extraversion is inextricably linked with successful careers in sales, and successful efforts to get out there and find a suitable life partner.

Standing across the room and seeing this extraverted stranger hold court at a party meant that your system one (heavily mediated by the emotion of jealousy) triggered that link between extraversion and perceived affluence in both love and money. As painful as this has been, you have successfully debiased yourself a little bit. Your moment of painful insight has updated and nuanced your ability to read social situations.

This ‘one-two’ combination of:

1) letting your System One effortlessly and incorrigibly seek out data points, which 2) your System Two subsequently integrates through tight feedback loops…

will start to pay dividends very quickly. Also, such habits of feedback compound and make you much more socially competent. From a wellbeing perspective, the return on investment is potentially huge, as your thinking becomes less sub-optimal over time.
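One way to picture this feedback loop is as a running ledger of snap judgements scored against later verification. The sketch below is purely illustrative (the `ledger` structure, `record` function and domain names are invented for this example, not a method described in the text), replaying the four party guesses above:

```python
from collections import defaultdict

# A toy "intuition ledger": log each System One snap judgement,
# then score it once System Two has verified the facts.
ledger = defaultdict(lambda: {"right": 0, "wrong": 0})

def record(domain: str, guess: str, truth: str) -> None:
    """Score one intuition against what conversation later revealed."""
    outcome = "right" if guess == truth else "wrong"
    ledger[domain][outcome] += 1

# The four guesses from the party example, checked in conversation.
record("origin", "midwest", "midwest")
record("industry", "oil and gas", "oil and gas")
record("job", "sales", "accounts")
record("marital status", "married", "single")

# Over time, this reveals which domains your intuition can be trusted in.
for domain, score in ledger.items():
    total = score["right"] + score["wrong"]
    print(f"{domain}: {score['right']}/{total} correct")
```

Kept up over many encounters, a tally like this is the feedback loop made explicit: System One proposes, System Two records, and the hit rates tell you where your intuitions are serviceable and where they need debiasing.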


To address the third failure, you need to improve your conscious reasoning process, in order to ensure that System Two reaches optimal solutions. You can accomplish this by implementing relevant metacognitive strategies, which promote a more effective reasoning process.

Note that in this context, metacognitive strategies are strategies that you can apply in order to regulate your cognition.

Different metacognitive strategies will be applicable in different scenarios. These strategies can be fairly simple and universal, such as increasing your awareness of the bias in question, or they can be more specific, such as creating psychological distance in order to mitigate the egocentric bias, which is the tendency to anchor other people’s viewpoints to your own.

With enough practice, the application of metacognitive strategies can become intuitive. However, their application can be effective even at an early stage, where you have to consciously remind yourself to use them. This stands in contrast with training System 1 to form better intuitions, which generally requires a significant amount of practice in order to reach a meaningful improvement in performance.


Hopefully, this has given you some scope for how beautifully irrational you are. Don’t be despondent about this, but rather look at it as an opportunity for the absolute necessity of connecting with other people. You have an unknown quotient of dumb, irrational and biased bullshit in your head - but so does everybody else. You might just need access to a specific aspect of their (slightly) less sub-optimal data set to help you out of a hole you repeatedly keep falling into. Rest assured there will come a day when some aspect of your data set is more accurate than theirs and you will have the opportunity to return the favour.

Shane Mauss’ excellent podcast Here We Are can be viewed as his attempt to flush out his own biases by speaking to brilliant (but biased) humans who know loads of stuff that he doesn’t. Perhaps we all need to engage in such a project of bias hunting within ourselves - with mirth, with humility, and with a default assumption that other people know stuff that we just don’t.

I have recently joined Twitter and have found the psychedelic science bubble to be no less infected with rampant biases than anywhere else. This is no doubt in large part an artefact of the medium (and it is why I go out of my way to meet podcast guests in person), but it is nonetheless depressing to see that I can reliably predict what views certain thought leaders in the field of psychedelics will hold on a range of unrelated topics. This suggests a developing monoculture: a narrowing of the Overton window resulting in one singular narrative, one side.

And he who knows only his side of the argument knows not even that.

Take a breath, relax - you are a fuckwit. So is everybody else.

Have a cup of tea and a biscuit.

We are going to be alright.

nb - a few sections of this blog are heavily indebted to the website Effectiviology. The full text is available here, and certain paragraphs were so well structured that, to avoid claims of plagiarism, you can read them here. The overall goal of this website is to make you a slightly better and more balanced thinker as you navigate the rapidly expanding world of psychedelic science - the more quality sources you can ingest the better. So if you want to go even deeper into the topic of cognitive biases, debiasing and behavioural economics, then I highly recommend the following resources:

  • Thinking, Fast and Slow

  • Predictably Irrational

  • The Art of Thinking Clearly

Niall Campbell