From Hume to Jaynes: Induction as the Logic of Plausible Reasoning
By: Tommaso Costa
Potential Business Impact:
Recasts induction as logically coherent reasoning under uncertainty, giving a consistent basis for updating beliefs on incomplete information.
The problem of induction has persisted since Hume exposed the logical gap between repeated observation and universal inference. Traditional attempts to resolve it have oscillated between two extremes: the probabilistic optimism of Laplace and Jeffreys, who sought to quantify belief through probability, and the critical skepticism of Popper, who replaced confirmation with falsification. Both approaches, however, assume that induction must deliver certainty or its negation. In this paper, I argue that the problem of induction dissolves when recast in terms of logical coherence (understood as internal consistency of credences under updating) rather than truth. Following E. T. Jaynes, probability is interpreted not as a frequency or a decision rule but as the extension of deductive logic to incomplete information. Under this interpretation, Bayes's theorem is not an empirical statement but a consistency condition that constrains rational belief updating. Induction thus emerges as a special case of deductive reasoning applied to uncertain premises. Falsification appears as the limiting form of Bayesian updating, reached when new data drive posterior plausibility toward zero, while the Bayes factor quantifies the continuous spectrum of evidential strength. Through analytical examples, including Laplace's sunrise problem, Jeffreys's mixed prior, and confidence-based reformulations, I show that only the logic of plausible reasoning unifies these perspectives without contradiction. Induction, properly understood, is not a leap from past to future but the discipline of maintaining coherence between evidence, belief, and information.
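The abstract invokes Laplace's sunrise problem and the Bayes factor as a continuous measure of evidential strength. As a minimal illustration (not drawn from the paper itself), the sketch below computes Laplace's rule of succession and a Bayes factor for a binomial model with a uniform prior on the alternative; the specific numbers (10,000 observed sunrises, 8 successes in 10 trials, the point hypothesis p = 0.5) are illustrative assumptions.

from math import comb

def rule_of_succession(successes: int, trials: int) -> float:
    """Laplace's rule of succession: posterior predictive probability of a
    further success after observing `successes` in `trials`, under a uniform
    (Beta(1, 1)) prior on the success probability."""
    return (successes + 1) / (trials + 2)

def bayes_factor_binomial(k: int, n: int, p0: float) -> float:
    """Bayes factor comparing a point hypothesis H0: p = p0 against the
    alternative H1: p ~ Uniform(0, 1), for k successes in n Bernoulli trials.
    Under H1 the marginal likelihood of the data is 1 / (n + 1)."""
    marginal_h0 = comb(n, k) * p0**k * (1 - p0)**(n - k)
    marginal_h1 = 1.0 / (n + 1)  # integral of C(n,k) p^k (1-p)^(n-k) dp over [0, 1]
    return marginal_h0 / marginal_h1

if __name__ == "__main__":
    # Sunrise problem: the sun has risen on every one of n observed days.
    n_days = 10_000
    print(f"P(sunrise tomorrow) = {rule_of_succession(n_days, n_days):.6f}")

    # Evidential strength for a fair coin vs. an unknown bias, after 8 heads in 10 flips.
    bf = bayes_factor_binomial(k=8, n=10, p0=0.5)
    print(f"Bayes factor (H0: p = 0.5 vs H1: uniform p) = {bf:.3f}")

In this toy setting a Bayes factor well below 1 counts against the point hypothesis, while driving the posterior plausibility of a hypothesis toward zero recovers falsification as the limiting case described in the abstract.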