Life Is a School of Probability

2022-04-30 • 7 min read


The Nazarene said: “If you abide in my word, you are truly my disciples, and you will know the truth, and the truth will set you free.” (John 8:31-32 [ESV]) And: “[E]veryone who practices sin is a slave to sin. The slave does not remain in the house forever; the son remains forever. So if the Son sets you free, you will be free indeed.” (John 8:34-36 [ESV])

What does he mean? What truth is he talking about? How can the truth set someone free? Free from what?

Summary #

I argue that common belief systems are rigid and encourage us to treat beliefs as either true or false, and that a saner way of thinking is to accept uncertainty and describe beliefs using probabilities.

How Some Belief Systems Intend to Set Us Free #

Many belief systems are said to set us free in some such way. Christianity will free us from sin and eternal death, Marxism will free us from bourgeois oppression, fascism will free us from Marxism and FIRE from the drudgery of wage labour. This liberation can work in two ways, I think. The first kind of ideology gives you a new set of beliefs, and with those beliefs a new goal to strive for. In light of the new goal, one’s old habits are bad, but there are new, good habits for one to adopt instead. This is freeing because the new habits, unlike the old ones, are in the service of lofty (not base) goals and are therefore freely (not compulsively) chosen: “Everyone who practices sin is a slave to sin.”

The second kind of ideology gives you a new set of beliefs, but these beliefs don’t exactly cause you to change your goals. It assumes you already want to free yourself from such-and-such a yoke and simply gives you the means of doing so. It tells you: “What you’ve been doing is not working; here’s why, and here’s what you need to do instead.” Marxism might be an example of this: “Only the consciousness of the proletariat can point to the way that leads out of the impasse of capitalism.” (Lukács 1972)

One would not be too far off the mark were one to call the first kind enlightenment ideology and the second kind emancipation ideology. The former aims to change the human mind. The latter aims to change governments, institutions, norms and social relations – in other words, the world “out there”. Both aim to shift people’s perspectives. Those who are driven by their baser urges to lives of depravity can be saved only by realising that they are sinning, and by replacing their corrupted goals with lofty goals – so says Christianity. Those who are kept in fetters by an oppressive system can be saved only by realising what position they are in, and by adopting a new set of actions in light of this realisation – so says Marxism.

The Problem of Rigidity #

What if the prescribed beliefs are false? What if, as a result, the prescribed actions are not effective? I don’t think these ideologies have a good answer to these questions. The beliefs they prescribe are the reason they exist. If you don’t accept most of the things in the Nicene Creed, why bother with Christianity?

In other words, these ideologies are inflexible. As soon as you say that the Christian religion is true, there are all these other, related beliefs that must also be true, and you feel urged to believe these too. But some of the beliefs probably aren’t true, or at least it’s far from obvious that they are. Now you are stuck trying to justify shaky beliefs, telling yourself you believe things that you don’t really find plausible and in general having a hard time whenever you are confronted with evidence contrary to your beliefs.

This is essentially the point made in Zmigrod (2020):

Ideologies possess two essential qualities regardless of the content of their beliefs or ambition: they are doctrinal and relational. […] [T]he doctrinal component of ideologies is facilitated by the existence of a rigid dogma that the ideology embraces. […] [T]he relational facet of ideologies – characterized by parochial altruism towards fellow adherents and antagonism towards non-adherents and dissimilar others – is facilitated by processes of identity demarcation. All ideologies invent and adopt clear identity markers, such as flags, symbols, anthems, costumes, and rituals, which signal membership and devotion.

The second, “relational” quality is important because identity and group membership can hinder actively open-minded thinking (Galef 2021). But it’s the first, “doctrinal” quality that I’m concerned with here.

Compare this with rationality. One thing I like about rationality is that it is flexible. It does not prescribe any particular object-level belief or action.[1] It is merely the craft of finding more accurate beliefs, more worthy goals and actions that have a good chance of achieving those goals. The goals can change and the craft will remain useful. The beliefs can change and the craft will remain useful. The actions can change and … you get the point. I’m not saying that rationalists by definition have more accurate views than non-rationalists. My point is only that rationality is relatively goal-, belief- and action-agnostic.

(Effective altruism, in my interpretation, is a subset of rationality. If rationality is about finding better beliefs, goals and actions for achieving those goals, effective altruism is about finding better beliefs, sub-goals and actions, conditional on the overarching goal of “doing good”. But doing good is a very general goal that nearly everyone thinks is worth pursuing.)


The Problem of False Certainty #

A second problem with these ideologies is that they are unwelcoming towards uncertainty. Because they are founded on specific beliefs, they are meaningless if those beliefs are false; to permit uncertainty about the beliefs is to permit uncertainty about the ideology itself. The picture of knowledge this paints is the following: You need to go out there and find out what’s true and what’s false. If you’re faced with a hard question, you need to work at it until you find the truth, and then your mind will be at peace.

Hmm … I said before that I think these belief systems aim to set us free by substituting lofty goals for compulsive ones, or by giving us the tools to overcome our oppressors. Now I think that needs to be revised. Maybe what they do, or aim to do, or part of what they aim to do, is to free us from confusion. People find confusion unpleasant. They want to be sure. Assertive belief systems give them the certainty they desire.

van Prooijen and Krouwel (2019) make a similar argument about extreme political ideologies, namely that they

  1. are more likely to be adopted by people who suffer from “a sense of meaninglessness that stems from anxious uncertainty”;
  2. are associated with black-and-white thinking;
  3. lead to overconfident judgments; and
  4. lead to reduced tolerance for differing views.

And unlike me, they also support their argument with actual empirical evidence. They cite, for example, McGregor, Prentice, and Nash (2013), a study suggesting that people turn to extreme ideologies when events make them uncertain about the goals they are trying to reach, which supports (1). I interpret (1) as saying, roughly: “people who are not certain about which goals to adopt in life cast about for any goal whose supporters seem sure of themselves”.

I think (1) is the result of a model of the world in which certainty is readily available when needed. I also think that model is faulty because, for us humans, who always have incomplete information, there is no truth – there are only degrees of uncertainty. Or as Laplace (1825) put it: “Strictly speaking it may even be said that nearly all our knowledge is problematical […] the entire system of human knowledge is connected with the theory [of probability].” What we call true are those beliefs we are more than 99% (say) confident really are true; what we call false are those we are less than 1% (say) confident are true; everything else is a faint maybe.
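To make this way of talking concrete, here is a minimal sketch in Python. The function name is invented for illustration, and the 99%/1% cut-offs are simply the ones mentioned above; nothing about them is canonical.

```python
def everyday_label(credence: float) -> str:
    """Map a degree of confidence (between 0 and 1) onto the everyday
    labels 'true', 'false' and 'maybe', using the illustrative
    99%/1% cut-offs from the text."""
    if credence > 0.99:
        return "true"    # so confident that we simply call it true
    if credence < 0.01:
        return "false"   # so confident that we simply call it false
    return "maybe"       # everything in between remains a faint maybe

print(everyday_label(0.999))  # "true", e.g. "the sky is blue"
print(everyday_label(0.60))   # "maybe", where the interesting questions live
```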

There are many things of which we can say that they are almost certainly true or false (the sky is blue, etc.). These things are not very interesting because nearly everyone agrees that they are true or false. The interesting questions are those we have less evidence about. Because there is less evidence about them, and because reality is immensely complex, we should expect to be quite uncertain about them. That means the goal of finding the truth about them, of knowing for sure, is pure folly. So when somebody sets out to find the truth, they will either convince themself, without anything close to the evidence that would warrant it, that such-and-such a thing is certain, or they will be depressed by their failure to find it.

Instead, Galef (2021) advises us to embrace confusion. We should adopt as our goal not to find the truth, but to have more accurate beliefs. We should look at beliefs as falling on a continuous scale between certainly true and certainly false. Instead of “admitting that we’re wrong”, we should be updating our beliefs, so that we can incorporate new evidence without it clashing with the beliefs we already hold. Walter Bagehot gave us the aptest expression of this feeling: “Life is a school of probability.”[2]
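What does “updating our beliefs” look like in practice? One standard way to cash it out – not something the post or Galef spells out, so take this as a sketch under that assumption – is a single application of Bayes’ rule. The function name and all the numbers below are invented for illustration.

```python
def bayes_update(prior: float,
                 p_evidence_if_true: float,
                 p_evidence_if_false: float) -> float:
    """Return the posterior probability of a belief after seeing one
    piece of evidence, via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1.0 - prior)
    return numerator / denominator

# Start at 70% confidence, then observe something twice as likely if the
# belief is false as if it is true. The belief slides down the scale
# instead of flipping from "true" to "false".
posterior = bayes_update(prior=0.70,
                         p_evidence_if_true=0.20,
                         p_evidence_if_false=0.40)
print(round(posterior, 2))  # 0.54
```

Repeated small updates like this are what it means, in practice, to treat a belief as a point on a scale rather than a verdict.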

One no longer feels a need for certainty on any subject, but can be satisfied with uncertainty. One is happy to say: I don’t know anything about that; or: Seems likelier than the alternative. When faced with evidence contrary to one’s belief, one will think only: Okay!

References #

Galef, Julia. 2021. The Scout Mindset: Why Some People See Things Clearly and Others Don’t. Penguin.
Laplace, Pierre Simon. 1825. A Philosophical Essay on Probabilities.
Lukács, Georg. 1972. History and Class Consciousness: Studies in Marxist Dialectics. MIT Press.
McGregor, Ian, Mike Prentice, and Kyle Nash. 2013. “Anxious Uncertainty and Reactive Approach Motivation (RAM) for Religious, Idealistic, and Lifestyle Extremes.” Journal of Social Issues 69 (3): 537–63.
Prooijen, Jan-Willem van, and André P. M. Krouwel. 2019. “Psychological Features of Extreme Political Ideologies.” Current Directions in Psychological Science 28 (2): 159–63.
Zmigrod, Leor. 2020. “A Psychology of Ideology: Unpacking the Psychological Structure of Ideological Thinking.”

Footnotes #

  1. Of course some beliefs and goals are unusually rare or common among rationalists (as compared to the general population). That may be partly due to groupthink, but it is also the result of truth-seeking: it would be weird if a group aiming to find more accurate beliefs did not sometimes converge on the most plausible beliefs. The important thing is that these beliefs and goals are not prescribed – they are up for debate – you can reject them and still be a rationalist.

    Objection: Don’t rationalists all share the goal of having more accurate beliefs, and isn’t that essential to rationality? Reply: Some rationalists might argue that having accurate beliefs is only instrumentally important, as a means to making better decisions, and that if accurate beliefs were to hamper our ability to make good decisions, rationalists should not want them. But even assuming that wanting to have accurate beliefs is essential to rationality, given that “it’s good to have more accurate beliefs” is widely accepted, it seems fairly safe to rest your ideology on it. ↩︎

  2. One objection here could be that, to the extent that we know anything about these processes, what we know are mostly associations, not causal relationships. Maybe adopting these ideologies doesn’t cause resistance to actively open-minded thinking – maybe some other thing, say people’s personal dispositions, causes both. I think that’s possible, but in defence of the causation hypothesis I will point out that there’s a plausible mechanism for how the one could cause the other (the mechanism I outline in this post). ↩︎