Evidence and Elimination: A Bayesian Interpretation of Falsification in Scientific Practice
By: Tommaso Costa
Potential Business Impact:
Strengthens scientific theories by comparing them against rivals.
The classical conception of falsification presents scientific theories as entities that are decisively refuted when their predictions fail. This picture has long been challenged by both philosophical analysis and scientific practice, yet the relationship between Popper's eliminative view of theory testing and Bayesian model comparison remains insufficiently articulated. This paper develops a unified account in which falsification is reinterpreted as a Bayesian process of model elimination. A theory is not rejected because it contradicts an observation in a logical sense; it is eliminated because it assigns vanishing integrated probability to the data in comparison with an alternative model. This reinterpretation resolves the difficulties raised by the Duhem-Quine thesis, clarifies the status of auxiliary hypotheses, and explains why ad hoc modifications reduce rather than increase theoretical credibility. The analysis is illustrated through two classical episodes in celestial mechanics, the discovery of Neptune and the anomalous precession of Mercury. In the Neptune case, an auxiliary hypothesis internal to Newtonian gravity dramatically increases the marginal likelihood of the theory, preserving it from apparent refutation. In the Mercury case, no permissible auxiliary modification can rescue the Newtonian model, while general relativity assigns high probability to the anomaly without adjustable parameters. The resulting posterior collapse provides a quantitative realisation of Popper's eliminative criterion. Bayesian model comparison therefore supplies the mathematical structure that Popper's philosophy lacked and offers a coherent account of scientific theory change as a process of successive eliminations within a space of competing models.
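The abstract's central claim, that a theory is eliminated when its marginal likelihood collapses relative to a rival, can be illustrated numerically. The following is a minimal sketch, not taken from the paper: the observation value, noise level, and prior width are all illustrative assumptions. It contrasts a parameter-free model that predicts the Mercury-style anomaly directly with a rival that can only fit it via an adjustable parameter under a broad prior, showing how integrating over that parameter dilutes the rival's marginal likelihood (the Occam penalty) and drives its posterior toward zero.

```python
import math

def gaussian(x, mu, sigma):
    """Gaussian density N(x; mu, sigma)."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Illustrative observation: an anomalous precession residual of ~43
# (arcsec/century) with assumed measurement noise sigma.
x_obs, sigma = 43.0, 1.0

# Model A (GR-like): predicts the anomaly with no adjustable parameters,
# so its marginal likelihood is just the likelihood of the data.
ml_gr = gaussian(x_obs, 43.0, sigma)

# Model B (Newtonian plus an ad hoc offset theta): the offset carries a
# broad uniform prior on [-500, 500]. The marginal likelihood integrates
# the likelihood over this prior (midpoint rule), spreading probability
# mass thinly across many values the data do not support.
lo, hi, n = -500.0, 500.0, 100_000
width = (hi - lo) / n
ml_newt = sum(
    gaussian(x_obs, lo + (i + 0.5) * width, sigma) for i in range(n)
) * width / (hi - lo)

# Bayes factor and posterior under equal prior odds: the adjustable
# model's posterior collapses even though it can "fit" the data exactly.
bayes_factor = ml_gr / ml_newt
post_newt = ml_newt / (ml_gr + ml_newt)
print(f"Bayes factor (A over B): {bayes_factor:.1f}")
print(f"Posterior of Model B:    {post_newt:.4f}")
```

With these assumed numbers the Bayes factor is of order several hundred in favour of the parameter-free model, so the rival's posterior falls below one percent: a toy version of the posterior collapse the abstract describes.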
Similar Papers
- From Hume to Jaynes: Induction as the Logic of Plausible Reasoning (Other Statistics): Makes reasoning about uncertain things more logical.
- The Bayesian Way: Uncertainty, Learning, and Statistical Reasoning (Methodology): Teaches computers to learn from past information.
- The Bayes Factor Reversal Paradox (Methodology): Bayesian math can give opposite answers.