(This is not part of the Postrationality series. It’s just an isolated thought that I wanted to share.)
In order to change someone’s mind, you don’t have to change their beliefs. You just have to change their associations.
Let me unpack that a bit. By “change someone’s mind”, I mean change it in a way that affects their actions. In the rationalist community, we tend to see beliefs as the be-all and end-all of decision making. Based on our beliefs, we should choose our actions to maximize expected utility, and that’s all there is to decision-making. But in practice, beliefs are only part of our reasoning and decision-making processes.
Let me give you a (pretty obvious) example of how we can fail at decision-making despite having correct beliefs. Suppose your friend is coming over tonight, and you’re planning to make dinner. You know that your friend is a vegetarian, but when you’re at the supermarket, you forget this and buy chicken. This mistake leads to a loss of utility, since either you serve your friend chicken for dinner, or you eventually remember and have to run back out to the store. In either case, a failure of reasoning occurred, and it led to a loss of utility. But the problem here wasn’t with your beliefs; you knew about your friend’s dietary preferences. The problem was with your memory, and which beliefs you actually used when you were making the decision.
So that’s the point I’m trying to get at here: your decision doesn’t just depend on your beliefs; it also depends on which specific beliefs you actually use when you’re deciding. And we can’t just use every belief, because there are too many to reason with efficiently. So we have to do approximate inference, and restrict ourselves to a subset of our beliefs.
Fortunately, most beliefs will be totally irrelevant to a given decision. If you’re choosing what to buy for dinner, then it doesn’t really matter that DNA is stored in the nucleus of the cell. This means that in order to make good decisions, you need to figure out which of your beliefs are most relevant, and use those and only those when reasoning. In the example above, where the person bought chicken to serve to a vegetarian friend, that was a failure to retrieve and use a relevant belief. In general, when you forget something important, you are failing to retrieve a relevant belief.
This is where associations come in, because we use them in our belief-retrieval systems. Let’s say your vegetarian friend’s name is Steve: he plays in a metal band, he frequently wears mismatched socks, and one time he went bungee jumping off the Eiffel Tower. These are all things you associate with Steve; that is, when you think of Steve, these are the facts that might spring to mind. If we want to get slightly more formal, we can imagine that the mind contains a network of facts/entities/ideas, and that every pair of these has a strength of association between them. These strengths are mediated by context, so that in the context of cooking Steve dinner, your association to facts about his food preferences might grow stronger, and your association to facts about his socks weaker. This means that in order to remember that Steve is a vegetarian, you either need a strong base association between “Steve” and “vegetarian”, or you need the context of cooking Steve dinner to increase the score enough that it passes some threshold of relevance.
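To make the picture concrete, here’s a toy sketch of that model in Python. Everything in it is invented for illustration: the base strengths, the context multipliers, and the threshold are all made-up numbers, not a real cognitive model. The idea is just that a fact gets retrieved when its base association, scaled by the current context, clears a relevance threshold.

```python
# A toy model: facts are retrieved when (base strength × context
# multiplier) passes a relevance threshold. All numbers are invented
# for illustration.

# Base association strengths between "Steve" and various facts.
base_associations = {
    "vegetarian": 0.4,
    "plays in a metal band": 0.7,
    "wears mismatched socks": 0.6,
    "bungee jumped off the Eiffel Tower": 0.8,
}

# Context-dependent multipliers: the cooking-dinner context boosts
# food-related facts and suppresses sock-related ones.
context_weights = {
    "cooking Steve dinner": {
        "vegetarian": 3.0,
        "wears mismatched socks": 0.2,
    },
}

def retrieve(context, threshold=1.0):
    """Return the facts whose context-adjusted association strength
    passes the relevance threshold."""
    weights = context_weights.get(context, {})
    return [
        fact
        for fact, strength in base_associations.items()
        if strength * weights.get(fact, 1.0) >= threshold
    ]

print(retrieve("cooking Steve dinner"))  # only "vegetarian" clears it: 0.4 × 3.0 = 1.2
```

Note that in this sketch, “vegetarian” has a weak base association (it loses to the metal band and the bungee jump in a neutral context), and it is only the dinner context that pushes it over the threshold. That’s exactly the failure mode from the supermarket example: outside the right context, the relevant belief never gets retrieved.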
This is why, if you want to change how someone acts, you don’t need to change their beliefs. You just need to change their associations. Change which facts they (subconsciously) decide are relevant to the situation. Change what comes to mind when they think of a person or organization. The media does this all the time; it doesn’t even have to lie. It just has to broadcast information selectively. Suppose there’s a politician running for office, Senator Dick Head. You know that Senator Head once donated $5,000 to protecting the short-snouted snail, a cause that is dear to your heart. But he also cheated on his wife, and you find this morally repugnant. The media doesn’t care about the short-snouted snail, so it never reports on his donation. But the news channel you watch is constantly telling you what a horrible, awful cheater Senator Head is. So your association between “Senator Head” and “cheated on his wife” gets stronger, while your association to “cares about the short-snouted snail” remains weak. This means that by voting time, you are truly disgusted with Senator Head, and you vote for his competitor, Congressman Mike Rotch, instead.
So, in conclusion, reasoning is not just about which beliefs you possess. It’s also about which beliefs you actually use during a specific reasoning task. Thus, if you want to change someone’s mind, you don’t have to change their beliefs. You just need to change which beliefs they’re likely to use when reasoning.
Note: this post is not science. I cannot cite research that supports anything I just said (though I do think it’s reasonable, or I wouldn’t have written it). And I don’t know any mathematical models that reason by choosing beliefs according to strengths of association. I do know some researchers are working on how to choose which beliefs to use, but I have no idea how they’re going about it. So please don’t take anything I’ve said here as scientific fact. This is just informed speculation.