Why Probability Is a Tool for Uncertainty: From Mars Life to Bayesian Logic
Exploring how probability quantifies uncertainty, the article examines examples from the chance of life on Mars to weather forecasts, explains subjective versus objective interpretations, discusses Bayes' theorem, Cromwell's Rule, and the foundational role of conditional probability in logical reasoning.
Probability and Uncertainty
How likely is it that there is life on Mars? What is the probability that an electron has a given mass? What was the chance of a sunny day on July 9, 1816?
Note that a binary question like “Is there life on Mars?” is often treated as having a yes/no answer, but what interests us is the probability of life given current data and our understanding of Martian physics and biology. That probability depends on the information we possess, not on an objective property of nature.
We use probability because we are uncertain about events, not because the events themselves are uncertain. This definition, which depends on our level of knowledge, is sometimes called the subjective definition of probability, explaining why the Bayesian school is referred to as subjective statistics. However, this does not mean every proposition is equally meaningful; it merely acknowledges that our understanding of the world is based on incomplete data and models.
Understanding the world without models or theory is impossible. Even if we could set aside societal assumptions, we remain biologically constrained: evolution has wired our brains to interpret new observations through the models we already hold. So we inevitably describe the world probabilistically: regardless of whether the underlying reality is deterministic or random, probability is our tool for measuring uncertainty.
Logic concerns effective inference. In Aristotelian or classical logic a proposition is either true or false, whereas in the Bayesian view probability treats certainty as a special case: a true proposition has probability 1 and a false one has probability 0.
Only when we have sufficient data showing that life can grow and reproduce on Mars would we assign probability 1 to “Life exists on Mars.” Assigning probability 0 is usually just as difficult, because undiscovered habitats or experimental errors may exist. This is Cromwell’s Rule: reserve probabilities of exactly 0 and 1 for logically true or false statements, and avoid them for empirical propositions. Cox proved mathematically that any consistent extension of logic to handle uncertainty must obey the rules of probability, which makes Bayes’ theorem a logical consequence of those rules. Bayesian statistics can thus be seen as an extension of logic for reasoning under uncertainty.
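A minimal sketch of why Cromwell's Rule matters in practice: under Bayes' theorem, a prior of exactly 0 or 1 can never be revised, no matter how strong the evidence. The likelihood values below are made-up numbers chosen purely for illustration.

```python
# Bayesian updating under repeated evidence. A prior of exactly 0 or 1
# is frozen forever; any prior strictly between them can move.

def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior P(H | data) via Bayes' theorem."""
    numerator = prior * likelihood_if_true
    evidence = numerator + (1 - prior) * likelihood_if_false
    return numerator / evidence

# Hypothetical strong evidence: data is 9x more likely if H is true.
for prior in (0.0, 0.01, 0.5, 1.0):
    posterior = prior
    for _ in range(10):  # observe the same strong evidence 10 times
        posterior = bayes_update(posterior, 0.9, 0.1)
    print(f"prior={prior:<5} posterior={posterior:.6f}")
```

With prior 0 the posterior stays at 0, and with prior 1 it stays at 1, while even a skeptical prior of 0.01 is driven close to 1 by the accumulated evidence.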
Probabilities lie between 0 and 1 (inclusive) and obey several rules, one of which is the multiplication rule:

P(A∩B) = P(A) × P(B|A)
The probability of both A and B occurring equals the probability of A times the probability of B given that A has occurred. Here P(A∩B) denotes the joint probability, while P(B|A) is the conditional probability of B given A, and the two have different real-world meanings: the probability that the road is wet is not the same as the probability that the road is wet given that it is raining. If A provides no information about B, then P(B|A) = P(B) and the events are independent. If A does carry information about B, the conditional probability may be higher or lower than P(B).
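The rain and wet-road example can be put into numbers. The probabilities below are invented for illustration; only the relationships between them matter.

```python
# Illustrating the multiplication rule P(A and B) = P(A) * P(B|A)
# with made-up numbers for the rain / wet-road example.

p_rain = 0.3             # P(A): it rains
p_wet_given_rain = 0.9   # P(B|A): road is wet given rain
p_wet_given_dry = 0.1    # P(B|not A): wet for other reasons (sprinklers, etc.)

# Joint probability via the multiplication rule
p_rain_and_wet = p_rain * p_wet_given_rain

# Total probability that the road is wet at all
p_wet = p_rain_and_wet + (1 - p_rain) * p_wet_given_dry

print(f"P(rain and wet) = {p_rain_and_wet:.2f}")   # 0.27
print(f"P(wet)          = {p_wet:.2f}")            # 0.34, not P(wet|rain) = 0.9

# Independence check: A and B are independent exactly when P(B|A) == P(B).
print("independent?", abs(p_wet_given_rain - p_wet) < 1e-9)
```

Here P(wet) = 0.34 while P(wet | rain) = 0.9, so rain clearly carries information about the road and the two events are not independent.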
Conditional probability is a core concept in statistics and is essential for understanding Bayes’ theorem. Rearranging the multiplication rule gives the definition of conditional probability: P(B|A) = P(A∩B) / P(A).
The definition requires P(A) > 0: we do not condition on events of probability 0. Dividing by P(A) renormalizes the probabilities over the reduced sample space in which the conditioning event A is known to have occurred.
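The reduced-sample-space view can be made concrete with a fair die, an example I am adding for illustration: conditioning on "the roll is even" throws away the odd outcomes and renormalizes the remaining probability mass.

```python
# Conditioning as renormalization over a reduced sample space:
# the probability of rolling a 2 given that the roll is even, for a fair die.

from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}
p = {outcome: Fraction(1, 6) for outcome in sample_space}  # fair die

event_a = {2, 4, 6}          # conditioning event A: "roll is even"
event_b = {2}                # event of interest B: "roll is 2"

p_a = sum(p[x] for x in event_a)                  # P(A)   = 1/2
p_a_and_b = sum(p[x] for x in event_a & event_b)  # P(A∩B) = 1/6

p_b_given_a = p_a_and_b / p_a   # (1/6) / (1/2) = 1/3
print(p_b_given_a)              # 1/3: mass renormalized over {2, 4, 6}
```

Within the reduced sample space {2, 4, 6} each outcome is equally likely, so the 1/6 mass on the outcome 2 becomes 1/3 after dividing by P(A) = 1/2.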
All probabilities are essentially conditional; there is no absolute probability. Whenever we discuss probability, implicit models, hypotheses, or conditions are involved. For instance, the chance of rain tomorrow differs on Earth, Mars, or elsewhere in the universe, and the probability of a coin landing heads depends on our assumptions about the coin’s bias.
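The coin example can be sketched in code. The point is that a seemingly unconditional P(heads) is really an average over the biases we consider plausible; the bias values and weights below are invented for illustration.

```python
# A seemingly absolute probability is conditional on a model. Here
# P(heads) is computed by averaging over hypothetical bias models,
# weighted by how plausible we find each one (made-up weights).

biases = {0.4: 0.2, 0.5: 0.6, 0.6: 0.2}   # P(bias): our model of the coin

# Law of total probability: P(heads) = sum over models of P(heads|bias) * P(bias)
p_heads = sum(bias * weight for bias, weight in biases.items())
print(p_heads)  # ~0.5 under these weights; a different model gives a different answer
```

Change the weights (say, after watching the coin land heads many times in a row) and P(heads) changes with them: the number was never a property of the coin alone, but of the coin plus our assumptions.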
Reference:
Osvaldo Martin, Bayesian Analysis with Python
Model Perspective
Insights, knowledge, and enjoyment from a mathematical modeling researcher and educator. Hosted by Haihua Wang, a modeling instructor and author of "Clever Use of Chat for Mathematical Modeling", "Modeling: The Mathematics of Thinking", "Mathematical Modeling Practice: A Hands‑On Guide to Competitions", and co‑author of "Mathematical Modeling: Teaching Design and Cases".