
The Ladder of Inference is a concept developed by the late Harvard Professor Chris Argyris to help explain why people looking at the same set of evidence can draw very different conclusions.
The difference comes from the idea that, based on their beliefs, people ‘choose’ what they see in amongst a mass of information.
More on that later, but first off, who fancies an experiment?
If I were being a bit hipster I could claim it as a Randomised Controlled Trial (RCT if you are uber hipster), but I’ll stick to plain old ‘experiment’.
Try this experiment at home or in the office:
- Go to Twitter and find a hashtag for a recent conference or seminar where people have been busily tweeting.
- The topic doesn’t really matter, but something reasonably linked to your area of interest/business might be useful.
- Search on the hashtag so that you can see a good selection of tweets – about 100 will do.
- Copy the 100 tweets and paste them into a document. The aim is to have about 3 pages of written text for people to look at (pretty straightforward so far).
- Now find two Test Subjects (people). Colleagues with very different views on the world would be good. Mr Grumpy and Miss Sunshine, or Cruella de Vil and Ronald McDonald (we all have them).
- Ask the Test Subjects to independently review the tweets and provide you with a summary of the key points emerging from the Twitter stream.
- Just to make it interesting, you could ask the Test Subjects to summarise their findings in no more than 5 tweets.
- Sit back and wait for the results.
- If you are feeling ambitious, you could repeat this experiment a number of times, with different Test Subjects, different collections of tweets or a different context.
- For example, choose Test Subjects with a very similar outlook. Before they do the analysis, brief one that you thought the seminar was excellent, and the other that you thought it was rubbish (sneaky!).
The Results. I’ve not formally run this experiment (yet), but I have seen a fair number of the summaries of Twitter conversations that people like to share about conferences. Storify Twitter summaries are almost mandatory for public sector conferences nowadays.
What has intrigued me is just how differently people can interpret and present the summary of the discussion at the same event. I appreciate that there will be a certain amount of bias and pushing of corporate messages. If your organisation is running the conference/seminar you will probably want to push key messages – dissenting voices and challenge probably aren’t something you are going to ‘share’ given the choice.
However, it is the variation in the summaries from the apparently independent/unbiased people that intrigues me. I’m sure that people will have very good intentions, but I do think there is a fair bit of ‘expert bias’ going on in these situations.
This is something neatly illustrated by The Ladder of Inference, developed by Chris Argyris and also used by Peter Senge in his book The Fifth Discipline.

The basic idea is that:
- When presented with a range of information/data/facts, we select what fits with a belief we already hold about that situation.
- All other information is ignored.
- We make our decision on the ‘evidence’ we have selected (evidence-based decisions are always the best).
- Our beliefs become stronger based on that good decision we have made (a feedback loop).
- In the future, when we look at information/data/facts, what we select will be influenced by our now more strongly held beliefs.
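The feedback loop in the steps above can be sketched as a toy simulation. This is purely illustrative – the numeric ‘openness’ value, the update rule and all the names are my own assumptions, not part of Argyris’s (qualitative) model:

```python
# Toy sketch of the Ladder of Inference feedback loop.
# All parameters here are illustrative assumptions, not Argyris's model.
import random

random.seed(42)  # fixed seed so the sketch is repeatable

def select_evidence(data, belief, openness):
    """Keep 'facts' that fit the belief; contrary ones survive only by chance."""
    return [x for x in data if x == belief or random.random() < openness]

def run_ladder(rounds=5, openness=0.3):
    belief = "good"  # prior belief about, say, a seminar
    for _ in range(rounds):
        # A balanced stream of 'facts': half positive, half negative.
        data = ["good"] * 10 + ["bad"] * 10
        selected = select_evidence(data, belief, openness)
        # Decide based only on the selected 'evidence'...
        decision = max(set(selected), key=selected.count)
        # ...which reinforces the belief (the feedback loop),
        belief = decision
        # ...and makes us a little less open to contrary data next time.
        openness = max(0.0, openness - 0.05)
    return belief, openness

final_belief, final_openness = run_ladder()
print(final_belief, round(final_openness, 2))
```

Even though the incoming data is perfectly balanced, the selection step means the decision always confirms the starting belief, and each pass around the loop filters out a little more of the contrary evidence.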
Both of the graphics I’ve used in this post illustrate the concept very effectively. If you would prefer, here is a 3-minute video, The Ladder of Inference Creates Bad Judgement, from Edward Muzio.
Am I suffering from ‘Expert Bias’? I’ve been worrying for a while that I’m a candidate for The Ladder of Inference. It’s not just how I view the tweets from a seminar, but just about everything where I’m presented with data/information/facts.
Everything I’ve ever experienced informs how I see things and gives me an ‘expert bias’. You might now be thinking, ‘that would never happen to me….’ (oh lucky you…..)
At one level it could be argued that I should stop getting anxious and get over it – that’s the job, to make sense of complicated information. At another level, I would like to put some rigour into what I’m doing: how do I perform a cross-check?
The work from Chris Argyris suggests a process of ‘climbing down’ The Ladder of Inference – going back down to the facts and looking at all of them more closely, attempting to remove your bias. The Edward Muzio video summarises this as:
- Climb down the Ladder of Inference (a lovely metaphor)
- Question your Assumptions and Conclusions (with a trusted colleague)
- Seek Out Contrary Data (to test what you are seeing)
These are all good suggestions, but quite hard to do. What if you have no idea of how you are biased? The topic of unconscious bias, how you recognise it, and how you deal with it, is now the focus of HR Training Programmes in many organisations, usually linked to equalities and diversity work, but helpful in so many other areas.
At the end of the day, if we just get as far as recognising there is such a thing as The Ladder of Inference and Expert Bias, I think that’s a pretty good start. It might help me when I’m reading the next Storify of tweets.
So, What’s the PONT?
- People can interpret the same set of ‘facts’ in different ways.
- Recognising it as Expert Bias, Unconscious Bias or The Ladder of Inference is useful, as it can help prevent wrong conclusions and bad decisions.
- Once your bias is recognised, you need to take steps to ‘Climb Down The Ladder’. It’s always better to fall from a low step.
Great stuff! Weirdly enough, we have a Storify of our internal event with a session on Unconscious Bias that may be of interest (https://storify.com/GoodPracticeWAO/challenging-our-assumptions-wales-audit-office-int)!
In terms of putting Storifys together, you’re right, it is easy to get sucked into sharing a positive view of things. What I’ve personally tried to do is include constructive feedback – what we do well, but also what we can learn from. Endlessly sharing praise without context isn’t useful, and neither are damning statements.
I think Storify can be useful in combatting unconscious bias if you use it as a collective note-taking exercise (I’ll credit Ena Lloyd for that….). If you consciously include different perspectives, you can end up with a much more rounded and comprehensive account of what’s happened. If you’re just using it to capture your own content, then it’s much less useful and you’re in danger of re-affirming your own closely held beliefs.
Cheers!
Dyfrig
Thanks Dyfrig,
Plenty to think about on this topic.
In some ways the Storify approach does go some way in helping people identify any bias.
If you just listen to the spoken word, things are easy to dismiss and forget.
At least with Tweets they are written down in front of you – unfiltered.
The act of choosing or excluding them becomes a bit more of a deliberate act – so hopefully making you think about bias.
Outside of Tweets and Twitter it would be interesting to see how people choose what they select, and how they try to compensate for bias.
Thanks
Chris
Interesting stuff! Thanks to those clever algorithms at Facebook, Twitter and Google, the Internet is becoming increasingly tailored to our own world view – entrenching those biases even further.
I think part of keeping yourself honest is sharing your views with others (like blogging!). The difficult part is getting them shared with those who may not necessarily agree with you. (see above – the echo chamber effect).
Personally, I’d like to see a bit more dissent and challenge at conferences. Although I feel like there’s a sliding scale for how much dissent there can be before it’s no longer a valuable learning experience. Case in point – BBC Question Time.
Thanks Neil,
I think we’ve probably always gravitated to people who hold similar views to ourselves – it’s just so much easier in the digital world.
As a society we need to work out how we balance the positives with the negatives.
It’s not easy – I must admit that reading a stream of polar opposite opinions on Twitter usually leaves me with my head spinning.
It’s not until about 3am the next morning that I eventually work out what I think.
I had an interesting ‘echo chamber’ experience last week.
I was at an event where someone did an analysis of the Tweets using something called flocker? (I think it was called that – it kind of fits with the Twitter bird thing).
They made the observation that the tweets using the hashtag had the characteristics of an echo chamber – funnily enough I don’t think they got re-tweeted very much, even though they had posted using said hashtag 🙂
I need to chase that one up, and find out if it was flocker.
I had to stop watching BBC Question Time a while back. The Twitter ranting finished it off for me – and I do wonder if some of those people are just there for the entertainment value. Much better to listen to the Radio 4 version, Any Questions (that of course might just be a reflection of the stage in my life I’ve reached).
We must catch up for a chat.
Thanks
Chris
That Aristotle bloke summed it up quite nicely. “It is the mark of an educated mind to be able to entertain a thought without accepting it.”
Did some detective work and found something called Flocker here : http://flocker.outliers.es/ – looks very useful indeed. I’ll be using that!
The problem with Question Time (and occasionally Twitter?) is that they are poor platforms for constructive debate. The format suffers from people wanting to be seen publicly showboating and point-scoring rather than engaging in actual conversation (hmm… you could apply that to politics in general, huh!?).
Yes – would be good to meet! If we don’t have a chance to cross paths before, I’m going to Gov Camp Cymru in September. Are you attending?
This is all very true, but the basic idea, that we select or seek out evidence that supports our preconceived ideas, while ignoring that which doesn’t, is well known and there is a generally accepted term for it, which you haven’t used here:
“Confirmation bias”.
Successful use of the Ladder of Inference
Just like with any professional tool, to avoid professional bias in systems science you don’t just do whatever you like, or whatever pleases you, at each one of the steps of the ladder. Each step follows principles, methods, and standards, and since you are a professional, you are expected to know your system really well.
For example, how do you retain or exclude observations in your field?
Why do you use specific assumptions and not others?
How do you avoid fallacies from assumptions to conclusions?
If you explain to somebody how you climbed the ladder of inference, they will understand the level of bias, if any.
Just like using a hammer: you either hit the nail, or your fingers, or something else.