
Thinking fast and slow for human behaviour change

I’ve written recently about change management and models for human behaviour change. It’s become apparent that many of the successes and failures of health and other improvement initiatives are down to human motivation, behaviour and thinking.  Studying fields such as psychology that are concerned with how people think, behave and make decisions can give us some useful insights into why change and improvement can be so difficult.

My Christmas reading was Daniel Kahneman’s book “Thinking fast and slow”. Kahneman is a psychologist and economist who won a Nobel Prize in 2002 for his work on behavioural economics. Thinking fast and slow is all about why people think what they do and why they make the decisions they make. Kahneman calls thinking fast “System 1” and thinking slow “System 2”. System 1 operates automatically and quickly, with little effort and without voluntary control. System 2 requires mental effort and concentration.

For example, if you were asked what 2 + 5 is, you would instantly know that the answer is 7. System 1 gives us the answer automatically because it’s something we’ve learned through practice and repetition. If, however, I asked you to multiply 17 x 23, you’d probably have to think more carefully or get out a calculator (the answer is 391). Even so, System 1 would probably tell you that a guess of 200 would be too low and 500 too high. Kahneman claims that we all like to believe we think in a System 2 way, i.e. rationally, and that we use that ability to make informed decisions. Unfortunately, that’s not what happens in practice.

System 1 is fine because it uses what we have learned to react quickly, lowering the mental load we have to cope with. System 1 often uses “rules of thumb” to make decisions quickly, and usually these lead to good decisions. Occasionally, though, they lead to mistakes. Try this: a bat and a ball cost £1.10 between them. The bat costs £1 more than the ball. How much does the ball cost?

Most people answer 10p, but the correct answer is 5p. If the ball cost 10p and the bat cost £1 more, the bat would cost £1.10, making the total £1.20. System 1 evokes an answer that is intuitive, appealing and wrong! System 1 causes people to be overconfident and to place too much faith in intuition. Our brains are inherently lazy, so we default to System 1 decision-making, and with that come cognitive biases. Only if we come across something unusual, or if we make a conscious effort, do we engage System 2.
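The System 2 working is simple algebra: if the ball costs x, the bat costs x + £1, so x + (x + £1) = £1.10. That gives 2x = 10p, so the ball costs 5p and the bat £1.05.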

WYSIATI: What you see is all there is

System 1 is a “machine for jumping to conclusions” on the basis of limited information. Try this: Jo is 30 years old, outspoken and has tattoos. Jo lives in a Northern city and works as a car mechanic. Which of these statements is more probable?

1: Jo owns a Gundog

2: Jo owns a Terrier

Based on Kennel Club registration statistics, it is actually about 4 times more likely that statement 1 is correct. Yet some people will jump to conclusions because of their inherent biases, or will assume the (irrelevant) storyline is of some significance and outweighs the statistical evidence. It takes more mental effort to apply System 2 thinking and come up with the right answer; in fact, you’d probably have to search for the registration data unless you happen to be a dog expert.
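To make the System 2 reasoning explicit (a sketch, using the odds form of Bayes’ rule and treating the 4:1 registration ratio as the prior odds): posterior odds = likelihood ratio x prior odds. Nothing in the storyline genuinely distinguishes Gundog owners from Terrier owners, so the likelihood ratio is roughly 1 and the posterior odds stay close to the 4:1 prior. The base rate should dominate; System 1 lets a vivid but undiagnostic story override it.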

If you’re a driver and you’ve been “carved up” by a couple of white vans, System 1 could lead you to conclude that most white van drivers are careless or dangerous. Similarly, if your 75-year-old friend has smoked cigarettes all their life and not had lung cancer, System 1 may lead you to conclude that “smoking isn’t as dangerous as it’s made out to be”, despite there being plenty of contrary evidence and despite your experience being based on a sample of one!

When you bring individuals with their own supposedly rational views into a group, the whole group can end up rejecting robust scientific evidence completely irrationally. I’m sure you can think of several topical examples without me needing to spell them out. System 1 thinking has little understanding of logic and statistics, which is why we all need to be aware of this risk and become more careful, reflective users of data.

Recognising bias

System 1 takes shortcuts to make decisions. For example, confirmation bias means you tend to agree with information that supports something you already believe. If you’ve heard about a few cases of flooding in an area, you’re more likely to agree with research studies linking flooding to cuts in river dredging, even if they involved only very small sample sizes.

There is also an “availability” bias, where you overestimate the probability of something you have heard about often or find easy to remember. This is a particular danger in the world of health improvement, where a few cases of a disease might be discussed widely in the press, or a DNA test is developed and people rush to use it, while completely ignoring the low incidence of the condition or the fact that other conditions are far more important to address.

Kahneman discusses the “availability cascade”: a self-sustaining chain of events that may start with a few media reports of a problem, lead to widespread public panic and eventually result in policy changes by legislators. Often, the emotional reaction (e.g. people dying or having severe adverse reactions to a drug) becomes a story in itself, and the story can be accelerated by media headlines, social media groups and campaigning individuals who work to ensure a continuous supply of bad-news cases. Scientists who try to use data to dampen the fears sometimes face hostility or are accused of a cover-up, and we simply don’t get to hear about the 99.9% of people who haven’t been affected.

Hindsight bias occurs when people reconstruct a story to exaggerate how strongly they believed an event was going to happen. Apparently, there are many experts who “knew” the financial crash of 2008 was going to happen. It’s just odd that they never said anything about it beforehand!

System 1 thinking is intuitive and makes our lives easier by reducing the mental effort we need to expend when making decisions. Being aware of this helps us understand why it’s so hard to change people’s behaviour. These lessons from Thinking fast and slow are worth remembering because, otherwise, we will keep making the same mistakes and will not see the improvements we all want in areas such as health, criminal justice and sustainability. That’s something to bear in mind for those of us who help organisations manage change and create performance improvement.

