Tuesday, June 14, 2011

Tim Harford's "Adapt" - a book review

Tim Harford's new book “Adapt” is a wonderful read but difficult to pigeonhole. There's interesting stuff about the Iraq war, the financial crisis, development aid, randomized experiments, skunk works, the design of safety systems, whistleblowers, overconfidence and groupthink, not to mention a truly wonderful explanation of how a carbon tax would work and why environmentalists should embrace it. Even when he covers topics that have been ably covered by others elsewhere, he does so in a light and enjoyable way and manages to dig up new, cool anecdotes. It's partly a popularization of science, partly a business book; at times it almost moves into self-help territory, and at times it presents new and interesting perspectives on big topics (such as financial regulation). Still - though it may sound sprawling, I didn't really find it so when reading it. At one level it reads like a series of interesting pieces of journalism on different topics, but on another, there's an underlying thread of ideas that gradually emerges.

The way I read it, the main point of the book is that the problems we face are too complex for us to understand and solve from behind a desk. The evidence for this ranges from the failed predictions of experts to the extinction records of firms and the failure of high-level military strategies. There are a number of reasons why this is so, ranging from the difficulty of capturing and aggregating information at a sufficiently fine-grained level to our psychological tendency to trust our (frequently false) beliefs and suppress evidence that they're wrong. Still - we do solve problems - but this happens through an evolutionary process: we make lots of bets, each one small enough that failure is acceptable, and the winning bets identify “good enough for now” solutions that we replicate and grow. The best examples of this (as a method for human problem solving) are market economies and science. Lots of entrepreneurs hope to strike it big, and some of them combine the factors of production in a way that creates more value than the alternatives - thus making a profit (to put the point in an Austrian way). Lots of scientists state hypotheses, and some of them prove better at predicting the outcomes of experiments and quasi-experiments - thus having their hypotheses strengthened (on a related note - I recently argued together with a colleague that this process is broken in economics - see more on that here).

Harford also discusses a host of implications that follow from this: the need to “decouple” systems so that failure in a single component (such as a bank in the financial system) doesn't bring down the entire system, the need to finance both “highly certain” research ideas and “long shot” ideas, avoiding groupthink by including people likely to disagree (thus creating room for disagreement in the group) and by demanding disagreement, and using prizes to elicit experiments. He also discusses how such evolutionary processes can be exploited better in policy - which is where he gets to his beautiful explanation of how a carbon tax works by tilting the playing field (there are two chapters here that should be reworked into a pamphlet and handed out in schools and parliaments).
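To make the “tilting the playing field” idea concrete, here is a minimal sketch - the technologies and numbers are invented for illustration, they are not Harford's figures - of how pricing each tonne of CO2 changes which option looks cheapest, without anyone having to dictate the choice:

```python
# A minimal sketch of how a carbon price "tilts the playing field".
# All numbers are invented for illustration - they are not Harford's figures.

def cost_per_mwh(base_cost, tonnes_co2_per_mwh, carbon_price):
    """Cost of one MWh once emissions are priced at carbon_price per tonne."""
    return base_cost + tonnes_co2_per_mwh * carbon_price

# Hypothetical generation options: coal is cheaper up front but emits CO2.
sources = {
    "coal": {"base_cost": 40.0, "co2": 1.0},   # roughly 1 tonne CO2 per MWh (assumed)
    "wind": {"base_cost": 55.0, "co2": 0.0},   # dearer up front, but emission-free
}

for carbon_price in (0, 10, 20, 30):  # price per tonne of CO2
    costs = {name: cost_per_mwh(s["base_cost"], s["co2"], carbon_price)
             for name, s in sources.items()}
    cheapest = min(costs, key=costs.get)
    print(f"carbon price {carbon_price:>2}: {costs} -> cheapest: {cheapest}")
```

In this toy example the ranking flips somewhere between a price of 10 and 20 per tonne. The point, as I read Harford, is that the tax changes relative prices everywhere at once and then leaves the actual experimenting - which fuel, which technology, which behaviour - to millions of decentralized decisions.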

That's my brief take on the underlying “storyline” - but it doesn't do justice to the book, which reads like a string of intellectual firecrackers. The wide range of topics, however, also means that each is necessarily touched on lightly - it's an appetizer for a lot of ideas more than a fully satisfying meal. For instance, if success in the market (and elsewhere) consists of being the “lucky” winner who made a bet that - ahead of time - had no stronger claim to being right than the others, how does this factor into our views on entitlements and redistributive taxation? If prizes (such as the prize for a space-going flight) actually elicit large-scale, expensive experiments that we only need to pay for when they succeed - does this mean that they exploit some irrational overconfidence in the competitors? If people were sensible and unbiased in their estimates of success, would they spend more than their expected reward? And if not - wouldn't that mean the prize money would have to be sufficient to finance all the experiments, in which case it doesn't save us any money? To what extent does the desire for control play into the desire for top-down planning and control? (Imagine you were the prime minister - would you feel comfortable if loads of schools were allowed to try out whatever they felt like, risking the chance that some of them would beat kids or indoctrinate them in some way that blew up in the media?) In an online interview with Cory Doctorow, Harford states that

I also looked at the banking crisis and big industrial accidents such as Deepwater Horizon, and found that there were almost always people who could have blown the whistle — and sometimes did — but the message didn’t get through. So those communication lines need to be opened up and kept open.

Yes - but no… After all, if there's a host of signals coming up, most of them wrong, it might well be rational to have some filtering mechanism in place that also weeds out many of the correct signals, in order to avoid being swamped and misled by the wrong ones.
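A back-of-the-envelope illustration of that trade-off, with numbers invented purely to show the base-rate problem:

```python
# Invented numbers: suppose 1 in 200 internal warnings flags a genuine problem,
# and a screening step passes 90% of genuine warnings but also 10% of the noise.

total_warnings = 10_000
p_genuine = 1 / 200        # base rate of warnings that are actually right
sensitivity = 0.90         # share of genuine warnings the filter lets through
noise_pass_rate = 0.10     # share of spurious warnings that slip through anyway

genuine = total_warnings * p_genuine
spurious = total_warnings - genuine

passed_genuine = genuine * sensitivity
passed_spurious = spurious * noise_pass_rate
passed_total = passed_genuine + passed_spurious

print(f"warnings reaching decision-makers: {passed_total:.0f}")
print(f"of which genuine: {passed_genuine:.0f} ({passed_genuine / passed_total:.1%})")
print(f"genuine warnings killed by the filter: {genuine - passed_genuine:.0f}")
```

Even with a filter that is fairly good at both tasks, most of what gets through is still noise, and a few genuine warnings get killed along the way - so “open the communication lines and keep them open” is a harder design problem than it sounds.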

While we're on the topic of whistleblowers - I also wish he'd said a word or two about some of the biggest transparency cases of recent years. One is the whistleblower-friendly candidate Obama, who changed his tune once he got into office. This could have served as a way of discussing how hard it is to actually have people looking over your shoulder and criticizing you, even when you believe in (or at least see the arguments for) allowing them to do so. I would also have been interested in Tim Harford's views on Wikileaks, which in some ways is the biggest attempt to increase transparency in modern times - as well as his views on the conflicts it generated (a book championing the cause of whistleblowers should at least mention the awful treatment of alleged whistleblower Bradley Manning). Given the many stories from the Iraq war and the US military about the dangers of a strictly enforced official party line/strategy/story, the potential value of Wikileaks shining a light on what is actually going on seems pretty clear. Or at least worthy of discussion.

Given the number of topics covered in the book, there are obviously quibbles to be had with certain facts or with the way some topics are treated, but that's to be expected. More importantly, there were parts of the argument that I felt were missing - especially concerning how difficult it is to learn from experience. As documented in, for instance, Robyn Dawes' excellent “House of Cards” (in the context of psychology and the misguided beliefs of treatment professionals), there are clear cases where statistical decision rules consistently outperform human judgment, without this being enough to convince the experts who would gain from adopting them. Or consider this post on the backfire effect from the You Are Not So Smart blog, which discusses experiments suggesting that people can react to evidence that they were wrong by becoming even more entrenched in their wrong beliefs. The way politicians respond to arguments about the surprisingly weak effect of drug decriminalization on usage levels is another example. In terms of Harford's argument: adaptation and evolution not only require us to test things and find out what works - they also require us to accept what works and implement it more broadly. Given everything the book covers, he probably touches on this too - but if he wants his ideas taken up in policy circles, I think (that is, my gut feeling is) that this would be perhaps the hardest part.

Finally, the book could also have been tempered by applying its thesis to itself: has “planned evolution” been attempted, and did it actually work? As the book argues, the devil is often in the details, and seemingly good ideas based on solid case stories may turn out to work quite differently in practice than we expected.
