Why I read it:
I enjoy reading books about psychology and behaviour, so that I can understand the mind better and then try to transfer some of that learning to my professional and personal life. I’d heard Adam Grant speak at various events and on podcasts, and liked his personable and articulate style, coupled with his clear expertise. Several books that I’ve read recently had touched upon knowledge, confidence, and the Dunning-Kruger effect, and I wanted to find out more about how we can realise, embrace and utilise what we don’t know.
The book is split into three main parts (plus a conclusion): individual rethinking, which is about our own views; interpersonal rethinking, or the opening of other people’s minds; collective rethinking, which entails organisations or groups adapting together. Each part includes numerous studies, anecdotes, and crisp explanations from Grant, who makes the science behind each idea accessible and enjoyable for the reader.
Think Again makes compelling arguments about how we get stuck in our views and fall into traps such as biases, or not seeking out the right feedback. The premise is essentially that to make good decisions, we need to ‘think like a scientist’: in other words, be objective, gather as much evidence as possible, and then be open to adapting or changing based on that evidence. The book outlines some challenges we face as humans in following this approach, along with solutions to help us through this self-sabotage!
I could do this reflection again and come up with 10 more takeaways, for example Grant’s views on imposter syndrome, how we approach conflict by viewing a disagreement as a dance and not a battle, and finally how we use the principles of rethinking to benefit our future plans. My advice is to read the book!
- Preacher, prosecutor, politician – think like a scientist – Grant contends that we often take on three roles when promoting a view or idea: preacher, when we deliver sermons to protect or promote our ideas; prosecutor, when we find flaws in other people’s ideas and look to prove them wrong; and politician, where we seek to win over an audience. He advises us to think more objectively, to weigh up circumstances and facts, and to be more like a scientist. He uses various studies and examples to show the difference it can make when you follow sound, logical advice and evidence, rather than our tendency to follow gut or emotion. For each situation we are in, it might be worth asking whether we have fallen into one of those three roles, clouding our ability to make the best decision.
- Embrace the joy of being wrong – to think like a scientist, Grant argues, we should embrace the joy of being wrong by learning to detach from our ideas, and to detach our opinions from our beliefs. It takes humility to admit to ourselves and others that we are wrong, but research finds that it doesn’t make others view us as less competent; in fact, their view becomes more favourable if we welcome new ideas or evidence rather than reject them.
- Seek out information that goes against your views – studies of people who were asked to imagine the perspectives of their political opposites showed no greater appreciation of those views. What the research did find was that seeking out people with different views, and talking to them directly about their perspectives, had a big impact. Grant discusses how we must use as many clues as possible when making decisions: it’s not enough to imagine arguments or ideas that diverge from our own; we need to address them directly to find out as much as we can. Other research shows that if we talk to someone about their views, no matter how extreme, and see their strong belief in something first-hand, it builds respect for them as a person, regardless of what we think of their view.
- Psychological safety – in performance cultures, the drive for results means workers often don’t question their superiors, try out new ideas, or work collaboratively. Grant discusses the need to create a more open culture of questioning; his favourite question is ‘how do you know?’, a non-judgmental prompt that mixes curiosity with a desire to learn more. Grant conducted experiments in several organisations to improve psychological safety by asking managers to request feedback and criticism from their staff. That on its own didn’t have a high impact, so the researchers tweaked it and instead asked managers to share with their teams anecdotes about times they had received feedback and been able to act on it or improve. Having their managers admit they were fallible, and that they had benefited from the critique, fostered a culture in which staff felt safer to speak up and contribute. There is a great anecdote about introducing more psychological safety at the Gates Foundation, and the huge relief of employees when Melinda, whom staff couldn’t usually get an emotional read from, announced that she goes into a lot of meetings where there are things she doesn’t know. The staff felt safer in the knowledge that their seemingly perfect leader had gaps in her knowledge, and was brave enough to admit it.
Humility and confidence:
‘Humility is often misunderstood. It’s not a matter of having low self-confidence. One of the Latin roots of humility means ‘from the earth’. It’s about being grounded – recognising that we’re flawed and fallible. Confidence is a measure of how much you believe in yourself. Evidence shows that’s distinct from how much you believe in your methods. You can be confident in your ability to achieve a goal in the future while maintaining the humility to question whether you have the right tools in the present’
Grant explores three biases that can ultimately undo our ability to think like a scientist.
- Confirmation bias – seeing what we expect to see.
- Desirability bias – seeing what we want to see.
- ‘I’m not biased’ bias – believing you are more objective than others. Grant argues that smart, bright thinkers often fall into this trap, which makes it harder to rethink and adapt.
These biases may sound obvious and simple, but here’s what I did: I retraced my steps for a day and tried to retrospectively apply them to my decisions and actions. When had I simply looked for confirmation of something to validate my existing view, rather than doing the digging to see if it was actually right? When had I seen bias present itself in someone else, and made a judgment without weighing up the possibility of my own bias?
Question and reflect
The book taught me a lot about seeking out evidence and perspectives to ensure that my own ideology isn’t clouding my judgment – what steps can we build into our decision-making process so that this desire for evidence becomes something we always follow?
What circumstances would need to exist for you to admit you were wrong, or change a plan that you’d invested a lot of energy into? Someone’s view? Data? The book is about rethinking and adapting, and that’s something I found challenging at first, but Grant makes a compelling case for how pride doesn’t serve us well!
Read this if…
You are interested in human behaviour
You are a leader or someone who has to make decisions and want to gain a better understanding of how we might behave, versus how we could behave!