People choose to listen to charlatans even when it is against their interest to do so. That’s the message of some recent experiments.
Aristotelis Boukouras and colleagues got subjects to take a multiple choice exam during which they could choose to take help from one of two computerized advisors. One of these was an expert, who gave the answers that a panel of economists would. The other gave the answers that had been most popular with other people who had taken the test. They found that most people chose to take advice from the latter – even after they had been told that it was only giving the popular answers and even when they were paid for getting answers right. What’s more, even when people had the option of switching to the expert when they could see that the populist advisor was wrong, only around half did so. They concluded:
A charlatan espousing popular beliefs can lead laypeople to choose to follow her advice rather than the advice of a genuine expert. This is true even in the face of increasing negative evidence regarding the accuracy of the charlatan.
What’s going on here is a variation on the confirmation bias. “People have a strong tendency to follow the adviser who suggests similar answers to the people’s own priors,” the authors say. And it is the charlatan, giving the most popular answer, who is more likely to match those priors.
This has been corroborated by some different recent experiments by Basit Zafar and colleagues. They asked people which articles about the pandemic they wanted to read, having shown them just the headlines. They found that pessimists tended to choose articles with pessimistic headlines and optimists articles with optimistic ones. What’s more, having chosen articles in line with their priors they then revised their beliefs more if the article confirmed their priors – so pessimists who chose to read a pessimistic article became even more pessimistic about the outlook for deaths and jobs whilst optimists who chose optimistic stories became more optimistic. In this way, beliefs became more polarized; this finding of course confirms earlier evidence (pdf).
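The polarization mechanism described above can be sketched in a few lines. This is a purely illustrative toy model with made-up numbers, not the Zafar et al. design: agents choose the headline that matches their prior, and update more strongly when the article confirms it, so initially similar beliefs drift apart.

```python
# Toy model (illustrative only, not the authors' setup) of selective
# exposure plus asymmetric updating, the combination described above.

def update(belief, signal_pessimistic, w_confirm=0.3, w_disconfirm=0.1):
    """Shift belief toward the signal, taking a bigger step when it confirms."""
    target = 1.0 if signal_pessimistic else 0.0
    confirms = (signal_pessimistic and belief > 0.5) or \
               (not signal_pessimistic and belief < 0.5)
    w = w_confirm if confirms else w_disconfirm
    return belief + w * (target - belief)

def simulate(belief, rounds=10):
    for _ in range(rounds):
        # selective exposure: pick the article matching one's current prior
        chosen_pessimistic = belief > 0.5
        belief = update(belief, chosen_pessimistic)
    return belief

pessimist = simulate(0.6)  # starts mildly pessimistic
optimist = simulate(0.4)   # starts mildly optimistic
print(round(pessimist, 2), round(optimist, 2))  # 0.99 0.01
```

A gap of 0.2 in initial beliefs widens to nearly 1.0 after ten rounds, even though both agents saw "news" generated by the same process.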
Of course, these experiments have external validity. People believe Covid denialists and those who claimed economic benefits from Brexit. And they even believe charlatans when their own money is at stake – for example by investing in high-charging under-performing funds.
The thing about these experiments, though, is that people choose the charlatan even under ideal conditions. In the real world, charlatans have many other ways of building support, even leaving aside the biased and ignorant media. These include:
- Ideological homophily. Boukouras and colleagues asked questions that weren’t hot ideological issues. Many issues on which we seek expertise, however, are. This causes right-wingers to regard Econ101ers as experts and MMTers as cranks whereas leftists do the opposite.
- Herding effects. As Robert Shiller shows in Narrative Economics, stories can spread exactly like viruses. This is one reason why asset price bubbles occur and why friends and colleagues have shareholdings and asset allocations (pdf) that are more similar than they should be.
- Ignorance of selection bias. David Hirshleifer gives an example of this. People talk more about their investment successes than their failures. This causes listeners to over-estimate the probable success of active stock-picking relative to sticking money in tracker funds, and to over-invest in speculative stocks.
- Lack of incentives. Even when people have big money at stake, they sometimes make bad decisions, for example by buying high-charging, low-performance actively managed funds. How much more likely are they then to make mistakes when incentives to be right are absent? As Jason Brennan has said, “when it comes to politics, smart doesn’t pay, and dumb doesn’t hurt.”
- Snake oil sales tricks. In a brilliant paper (pdf), the late Werner Troesken showed how sellers of patent medicines stayed in business for decades by tricks such as: giving people a short-term pick-up with alcohol or opium that they mistook (pdf) for a genuine cure; appealing to people’s desperation (they knew, decades before prospect theory, that desperate people take risks); hyping their products and distinguishing them from others; and denigrating experts.
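Hirshleifer’s selection-bias point is simple arithmetic, which a tiny sketch makes vivid. The numbers here are my own illustrative assumptions, not his: if wins are recounted more often than losses, listeners who only hear the shared stories overestimate the underlying success rate.

```python
# Toy illustration (made-up numbers, not Hirshleifer's) of selection bias
# in reported investment outcomes.

def observed_success_rate(true_rate, p_share_win, p_share_loss):
    """Fraction of *shared* stories that are successes."""
    shared_wins = true_rate * p_share_win
    shared_losses = (1 - true_rate) * p_share_loss
    return shared_wins / (shared_wins + shared_losses)

# Suppose half of stock picks succeed, but people recount 80% of their
# wins and only 20% of their losses:
rate = observed_success_rate(0.5, 0.8, 0.2)
print(round(rate, 2))  # 0.8 — listeners infer picking works far better than it does
```

A 50/50 gamble looks like an 80% winner to anyone who only hears the stories people choose to tell.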
One implication of all this is that a public service broadcaster cannot be impartial in the way the BBC purports to be. If you offer people two sides of a story or two talking heads, many will choose the charlatan or false story over the true one. And we’ll get increased polarization – which might make for good TV but not necessarily for good politics or a good society.
But I think the implication is more devastating. All this undermines the conventional liberal faith in the marketplace of ideas. John Stuart Mill thought that “wrong opinions and practices gradually yield to fact and argument.” Experiments, however, confirm our real world experience that in fact the opposite can happen. And this isn’t simply because of our biased and dysfunctional media.
But let’s push the “marketplace of ideas” metaphor a step further. Markets must be embedded within social norms, rules and mechanisms if they are to work effectively, as Jesse Norman argues. In the marketplace of ideas these norms, rules and mechanisms are obviously inadequate. Which poses the question: what would effective ones look like? Too few people are asking this question – which might be because existing ones actually serve the interests of extractive capitalism rather well.