Rory Sutherland wrote an article for Behavioral Scientist titled “Are We Too Impatient to Be Intelligent?” (17 September 2024).
« Businesspeople, governments, and politicians aren’t looking to solve problems; they’re looking to win arguments. And the way you win an argument is by pretending that what should be an open-ended question with many possible right answers isn’t one… This is a massive problem in decision-making. We try to close down the solution space of any problem in order to arrive at a single right answer that is difficult to argue with. »
« the Uber map. It doesn’t change how long you wait for the taxi. It changes the quality of the waiting time by reducing uncertainty… Too often, we optimize for the numerical thing, time and speed. We’re not optimizing for the emotional state, which is disquiet or anxiety. »
« So this is what’s happened to the world: optimization trumps human preference. The people who want to win the argument are effectively prepared to ignore human truths to preserve the integrity of the artificial model. »
« The Unaccountability Machine by Dan Davies is a fantastic book, which argues that people create these models because if you can reduce decision-making to an algorithm, or a formula, or a process, or a procedure, you avoid the risk of blame. Computer says no, effectively. »
« Instinctively, people love to codify things, and make them numerical, and turn them into optimization problems with a single right answer. Because the second you acknowledge ambiguity, you now have to exercise choice. If you can pretend there’s no ambiguity, then you haven’t made a decision, you can’t be blamed, you can’t be held responsible. And what’s the first thing you remove if you want to remove ambiguity from a model? You remove human psychology, because human psychology, particularly around time, is massively ambiguous. »
« a problem… which bedevils many technologies and many behaviors. It starts as an option, then it becomes an obligation. We welcome the technology at first because it presents us with a choice. But then everybody else has to adopt the technology, and we suddenly realize we’re worse off than we were when we started. »
« Most of you, if you were students, wrote essays or something like that as undergraduates, right? Fairly confident to say that nobody’s actually kept them? Nobody re-reads them. In fact, the essays you wrote are totally worthless. But the value wasn’t in the essay. What’s valuable is the effort you had to put in to produce the essay. Now, what AI essays do is they shortcut from the request to the delivery of the finished good and bypass the very part of the journey which is actually valuable—the time and effort you invest in constructing the essay in the first place. »
« Similarly, the valuable part of advertising is, to some extent, the process of producing it, not the advertising itself. Because it forces you to ask questions about a business which people mostly never get around to asking: What do we stand for? What’s our function? Who do we appeal to? Who’s our target audience? How do we present ourselves? How do we differentiate ourselves? How do we make ourselves look different and feel valuable to the people who encounter us? »
« The general assumption driven by these optimization models is always that faster is better. I think there are things we need to deliberately and consciously slow down for our own sanity and for our own productivity. If we don’t ask that question about what those things are, I think we’ll get things terribly, terribly wrong. »
Rory Sutherland is the vice chairman of Ogilvy and the author of Alchemy: The Surprising Power of Ideas Which Don’t Make Sense.