The Ladder of Inference: Climbing Down from Expert Bias


The Ladder of Inference is a concept developed by the late Harvard professor Chris Argyris to help explain why people looking at the same set of evidence can draw very different conclusions.

The difference comes from the idea that, based on their beliefs, people ‘choose’ what they see from amongst a mass of information.

More on that later, but first off, who fancies an experiment?

If I were being a bit hipster I could claim it as a Randomised Controlled Trial (RCT if you are uber hipster), but I’ll stick to plain old ‘experiment’.

Try this experiment at home or in the office:

  1. Go to Twitter and find a hashtag for a recent conference or seminar where people have been busily tweeting.
  2. The topic doesn’t really matter, but something reasonably linked to your area of interest/business might be useful.
  3. Search on the hashtag so that you can see a good selection of tweets – about 100 will do.
  4. Copy the 100 tweets and paste them into a document. The aim is to have about three pages of written text for people to look at (pretty straightforward so far).
  5. Now find two Test Subjects (people). Colleagues with very different views on the world would be good: Mr Grumpy and Miss Sunshine, or Cruella de Vil and Ronald McDonald (we all have them).
  6. Ask the Test Subjects to independently review the tweets and provide you with a summary of the key points emerging from the Twitter stream.
  7. Just to make it interesting, you could ask the Test Subjects to summarise their findings in no more than five tweets.
  8. Sit back and wait for the results.
  9. If you are feeling ambitious, you could repeat the experiment a number of times, with different Test Subjects, different collections of tweets or a different context.
  10. For example, choose Test Subjects with a very similar outlook. Before they do the analysis, brief one that you thought the seminar was excellent, and the other that you thought it was rubbish (sneaky!).
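If you like, the two-subject setup can even be sketched as a toy simulation. Everything below is made up for illustration – the tweets, the subjects’ names and the keyword lists are all hypothetical – but it shows how two different priors applied to the same stream produce two different ‘summaries’:

```python
# Toy simulation of the experiment: two "Test Subjects" review the same
# tweet stream, but each filters it through a different prior belief.
tweets = [
    "Great keynote, really inspiring ideas #conf",
    "Venue was freezing and the wifi failed #conf",
    "Loved the workshop on co-production #conf",
    "Panel discussion dodged the hard questions #conf",
    "Brilliant networking over lunch #conf",
    "Slides were unreadable from the back row #conf",
]

def summarise(tweets, belief_keywords):
    """Keep only the tweets that fit the subject's prior belief."""
    return [t for t in tweets
            if any(k in t.lower() for k in belief_keywords)]

# Miss Sunshine's prior: the event was excellent.
sunshine = summarise(tweets, ["great", "loved", "brilliant", "inspiring"])
# Mr Grumpy's prior: the event was rubbish.
grumpy = summarise(tweets, ["freezing", "failed", "dodged", "unreadable"])

print("Miss Sunshine saw:", sunshine)
print("Mr Grumpy saw:", grumpy)
```

Same six tweets in, two completely non-overlapping summaries out – which is the result the experiment is designed to surface.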

The Results. I’ve not formally run this experiment (yet), but I have seen a fair number of the Twitter-conversation summaries that people like to share about conferences. Storify summaries are almost mandatory for public sector conferences nowadays.

What has intrigued me is just how differently people can interpret and present the summary of the discussion at the same event. I appreciate that there will be a certain amount of bias and pushing of corporate messages. If your organisation is running the conference/seminar you will probably want to push key messages – dissenting voices and challenge probably aren’t something you are going to ‘share’ given the choice.

However, it is the variation in the summaries from the apparently independent/unbiased people that intrigues me. I’m sure that people will have very good intentions, but I do think there is a fair bit of ‘expert bias’ going on in these situations.

This is something neatly illustrated by The Ladder of Inference, developed by Chris Argyris and also used by Peter Senge in his book The Fifth Discipline.


The basic idea is that:

  • When presented with a range of information/data/facts, we select what fits with a belief we already hold about that situation.
  • All other information is ignored.
  • We make our decision on the ‘evidence’ we have selected (evidence based decisions are always the best).
  • Our beliefs become stronger based on that good decision we have made (a feedback loop).
  • In the future, when we look at information/data/facts, what we select will be influenced by our now more strongly held beliefs.

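The bullet points above describe a feedback loop, and a minimal sketch with made-up numbers shows how the selection step and the belief-strengthening step reinforce each other. Two people given the same data but different starting beliefs each end up more confident in the belief they started with (the function name and all values here are purely illustrative):

```python
def climb_the_ladder(observations, belief, window, rounds=3):
    """Illustration only: select the data near the current belief, make a
    'decision' from that selection, then narrow the selection window -
    the feedback loop that makes the belief harder to shift."""
    for _ in range(rounds):
        # Step 1: select only the facts that fit the existing belief.
        selected = [x for x in observations if abs(x - belief) <= window]
        if not selected:              # nothing fits the belief any more
            break
        # Step 2: an 'evidence-based' decision on the selected facts.
        belief = sum(selected) / len(selected)
        # Step 3: a confirmed belief narrows what we look at next time.
        window *= 0.5
    return belief

facts = [1, 2, 3, 8, 9, 10]           # everyone sees the same 'facts'
optimist = climb_the_ladder(facts, belief=2.0, window=2.0)
pessimist = climb_the_ladder(facts, belief=9.0, window=2.0)
print(optimist, pessimist)
```

Both start from identical data; the optimist ends up settled on 2.0 and the pessimist on 9.0, each with a shrinking window of facts they are still willing to consider.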
Both of the graphics I’ve used in this post illustrate the concept very effectively. If you would prefer, here is a 3-minute video, The Ladder of Inference Creates Bad Judgment, from Edward Muzio.

Am I suffering from ‘Expert Bias’? I’ve been worrying for a while that I’m a candidate for The Ladder of Inference. It’s not just how I view the tweets from a seminar, but just about everything where I’m presented with data/information/facts.

Everything I’ve ever experienced informs how I see things and gives me an ‘expert bias’. You might now be thinking, ‘that would never happen to me….’ (oh lucky you…..)

At one level it could be argued that I should stop getting anxious and get over it – that’s the job, to make sense of complicated information. At another level I would like to put some rigour into what I’m doing, how do I perform a cross check?

The work of Chris Argyris suggests a process of ‘climbing down’ The Ladder of Inference: going back down to the facts and looking at all of them more closely, attempting to remove your bias. The Edward Muzio video summarises this as:

  • Climb down the Ladder of Inference (a lovely metaphor)
  • Question your Assumptions and Conclusions (with a trusted colleague)
  • Seek Out Contrary Data (to test what you are seeing)

These are all good suggestions, but quite hard to do. What if you have no idea of how you are biased? The topic of unconscious bias – how you recognise it and how you deal with it – is the focus of HR training programmes in many organisations, usually linked to equalities and diversity work, but helpful in so many other areas.

At the end of the day, if we just get as far as recognising there is such a thing as The Ladder of Inference and Expert Bias, I think that’s a pretty good start. It might help me when I’m reading the next tweet Storify.

So, What’s the PONT?

  1. People can interpret the same set of ‘facts’ in different ways.
  2. Recognising it as Expert Bias, Unconscious Bias or The Ladder of Inference is helpful, as it can prevent wrong conclusions and bad decisions.
  3. Once your bias is recognised, you need to take steps to ‘Climb Down The Ladder’. It’s always better to fall from a low step.

About WhatsthePONT

I'm from Old South Wales and I'm interested in almost everything. Narrowing it down a bit: cooperatives, social enterprises, decent public services, complexity science, The Cynefin Framework, behavioural science and a sustainable future. In 2018/19 I completed a Winston Churchill Travelling Fellowship, looking at big cooperative enterprises and social businesses in NE Spain and the USA. You can find out more here:

12 Responses

  1. Great stuff! Weirdly enough, we have a Storify of our internal event with a session on Unconscious Bias that may be of interest!

    In terms of putting Storifys together, you’re right, it is easy to get sucked into sharing a positive view of things. What I’ve personally tried to do is include constructive feedback – what we do well, but also what we can learn from. Endlessly sharing praise without context isn’t useful, and neither are damning statements.

    I think Storify can be useful in combatting unconscious bias if you use it as a collective note-taking exercise (I’ll credit Ena Lloyd for that….). If you consciously include different perspectives, you can end up with a much more rounded and comprehensive account of what’s happened. If you’re just using it to capture your own content, then it’s much less useful and you’re in danger of re-affirming your own closely held beliefs.



  2. Interesting stuff! Thanks to those clever algorithms at Facebook, Twitter and Google, the Internet is becoming increasingly tailored to our own world view – entrenching those biases even further.

    I think part of keeping yourself honest is sharing your views with others (like blogging!). The difficult part is getting them shared with those who may not necessarily agree with you. (see above – the echo chamber effect).

    Personally, I’d like to see a bit more dissent and challenge at conferences. Although I feel like there’s a sliding scale for how much dissent there can be before it’s no longer a valuable learning experience. Case in point – BBC Question Time.

      1. That Aristotle bloke summed it up quite nicely. “It is the mark of an educated mind to be able to entertain a thought without accepting it.”

        Did some detective work and found something called Flocker – looks very useful indeed. I’ll be using that!

        The problem with Question Time (and occasionally Twitter?) is that they are poor platforms for constructive debate. The format suffers from people wanting to be seen publicly showboating and point-scoring rather than engaging in actual conversation (hmm… you could apply that to politics in general, huh!?).

        Yes – would be good to meet! If we don’t have a chance to cross paths before, I’m going to Gov Camp Cymru in September. Are you attending?

  3. This is all very true, but the basic idea, that we select or seek out evidence that supports our preconceived ideas, while ignoring that which doesn’t, is well known and there is a generally accepted term for it, which you haven’t used here:

    “Confirmation bias”.

  4. Angela

    Successful use of the Ladder of Inference

    Just like with any professional tool, to avoid professional bias in systems science you don’t just do whatever you like or pleases you at each step of the ladder. Each step follows principles, methods and standards, and since you are a professional, you are expected to know your system really well.

    For example, how do you retain or exclude observations in your field?
    Why do you use specific assumptions and not others?
    How do you avoid fallacies when moving from assumptions to conclusions?

    If you explain to somebody how you climbed the Ladder of Inference, they will understand the level of bias, if any.

    Just like using a hammer: you either hit the nail, or your fingers, or something else.
