Here’s Scott Alexander grading his Trump predictions, and then assessing his grading performance:
According to my own judgment, I usually did better on predictions about race, and worse on other things. An optimistic take on this is that race has become so emotionally charged that most people have kind of crazy beliefs about it, which makes them easier to beat. A pessimistic take is that race has become so emotionally charged that everyone including me has crazy beliefs, which makes me a more biased judge and lets me award myself points more shamelessly than I would do anywhere else….
Did my liberal bias lead me to underestimate Trump? I'm not sure. He did better than I expected on the economy and on not starting wars. But he did worse than I expected on getting any of his policies enacted. Maybe this is what I should expect if I was suffering from liberal bias — maybe the liberal narrative was "Trump is an evil supervillain who will successfully complete all kinds of terrible things", which made me underestimate how well Trump would do on things I liked, but overestimate how well he would do on things I hated.
Scott Alexander is a rationalist, in a specific, recent sense of the word, and this passage exemplifies the rationalist thinking style. They put numbers on their predictions and evaluate their performance. They know their cognitive psychology and can namecheck their own biases. They “steelman” their opponents’ arguments to make sure they are being fair.
The influence of contemporary psychology has made people conscious of the myriad ways humans can be wrong: the fallacies in our reasoning, the unconscious biases of our minds. The rationalists have taken these ideas to heart. They’re dedicated to mental self-improvement. One prominent website is named LessWrong.
This is fine and admirable. The quality of argument on these sites is often classier and more thoughtful than a newspaper’s, and a world away from the idiocy of social media. The risk in this kind of mental self-improvement is that it misunderstands how intellectual progress works. First, you can’t get rid of these biases. Your brain is social; you indulge in identity-protective cognition; you fall in love with your hypotheses. That’s how humans are. And second, that’s fine, because intellectual progress doesn’t depend on you being better than that. Free societies get at the truth not by cultivating geniuses who rise above mortal constraints on thought, but by the clash of competing opinions among ordinary people, with all their biases and self-interest. The process is what matters. This is why, in the Anglosphere, justice is done via competing advocates rather than by an investigating judge.
All public ideas have politics attached. Crudely, homo economicus gives you markets and privatization; Thinking, Fast and Slow gives you the Nudge Unit. The danger of psychologism is that it encourages us to hand over our judgment to experts who arrive with charts explaining how our thinking is mistaken. Don’t second-guess yourself! Your perspective deserves to be heard. If it’s wrong (and controversial enough) there will be plenty of people to tell you so.
If you liked this piece, then I would love you to do three things:
Subscribe to this newsletter. It’s free and spam-free.
Share Wyclif’s Dust on social media, or forward it via email. By telling your friends and/or followers, you’ll be doing me a huge favour.
Read about the book I’m writing. It’s called Wyclif’s Dust, too. You can download a sample chapter.
“…there will be plenty of people to tell you so.” But you would do better still if you can internalise those critics. And you should stay open to the possibility that they are right and you are wrong; being aware of your biases can help with that.