The general message of Irrationality is that, if there has ever been a rational decision taken in the history of humankind, it was a fluke.
Sutherland goes through various kinds of bias and error present in the way people think – obedience to authority, conformity to the group, a poor grasp of probability and statistics, being influenced by whatever you heard most recently, placing too much emphasis on unusual cases, looking for evidence that confirms your hypothesis rather than evidence that contradicts it, being influenced by the order in which information is provided, placing too much confidence in intuition, being unwilling to cut your losses, and so on and so on – and for each of them he provides examples of psychology experiments demonstrating that people systematically and repeatedly make the same stupid mistakes.
It’s a reminder that the scientific method is, in the end, just a whole series of elaborate ways to resist the tendency of the human mind to leap to the wrong conclusion. Not that science always gets it right first time as a result, but at least at its best there’s a cultural understanding within science that it’s very easy to be wrong in lots of different ways and that you have to be very careful and methodical to try to avoid error.
It also suggests that anyone who has to make complicated and important decisions – politicians, doctors, judges, engineers – could usefully take similar care to methodically eliminate systematic biases in the way they decide things, because they’re almost certainly less good at it than they think they are. That’s true of all of us, of course, but most of the decisions most of us make aren’t actually going to have particularly serious consequences.
Anyway, the book. It’s mainly made up of lots and lots of examples – often with several experiments described in a single paragraph – so it’s somewhat dense, and I should probably read it again if I want to take it all in, but it’s well written, which helps. And always interesting.