Bayesian probability


Bayesian probability is an interpretation of the concept of probability, in which, instead of frequency or propensity of some phenomenon, probability is interpreted as reasonable expectation representing a state of knowledge or as quantification of a personal belief.

The Bayesian interpretation of probability can be seen as an extension of propositional logic that allows reasoning with hypotheses; that is, with propositions whose truth or falsity is unknown. In the Bayesian view, a probability is assigned to a hypothesis, whereas under frequentist inference, a hypothesis is typically tested without being assigned a probability.

Bayesian probability belongs to the category of evidential probabilities; to evaluate the probability of a hypothesis, the Bayesian probabilist specifies a prior probability. This, in turn, is then updated to a posterior probability in the light of new, relevant data (evidence). The Bayesian interpretation provides a standard set of procedures and formulae to perform this calculation.
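
The calculation referred to here is an application of Bayes' theorem. Writing H for the hypothesis and E for the new evidence, the posterior probability is obtained from the prior by

    P(H | E) = P(E | H) · P(H) / P(E),

where P(H) is the prior probability of the hypothesis, P(E | H) is the likelihood of the evidence given the hypothesis, and P(E) is the marginal probability of the evidence.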

The term Bayesian derives from the 18th-century mathematician and theologian Thomas Bayes; the mathematician Pierre-Simon Laplace pioneered and popularized what is now called Bayesian probability.

History


The term Bayesian derives from Thomas Bayes, who proved a special case of what is now called Bayes' theorem in a paper titled "An Essay towards solving a Problem in the Doctrine of Chances". In that special case, the prior and posterior distributions were beta distributions and the data came from Bernoulli trials. It was Pierre-Simon Laplace (1749–1827) who introduced a general version of the theorem and used it to approach problems in celestial mechanics, medical statistics, reliability, and jurisprudence. Early Bayesian inference, which used uniform priors following Laplace's principle of insufficient reason, was called "inverse probability" because it infers backwards from observations to parameters, or from effects to causes. After the 1920s, "inverse probability" was largely supplanted by a collection of methods that came to be called frequentist statistics.
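
As an illustration of the special case described above, a beta prior updated by Bernoulli-trial data yields a beta posterior by simple parameter arithmetic. The following Python sketch uses made-up prior parameters and observations purely for illustration; it is not drawn from Bayes' or Laplace's work.

    # Minimal sketch of the beta-Bernoulli conjugate update (illustrative values).
    alpha_prior, beta_prior = 1.0, 1.0  # Beta(1, 1) prior: uniform over the success probability
    data = [1, 0, 1, 1, 0, 1]           # hypothetical Bernoulli outcomes (1 = success, 0 = failure)

    successes = sum(data)
    failures = len(data) - successes

    # Conjugacy: Beta(a, b) prior + Bernoulli data -> Beta(a + successes, b + failures) posterior.
    alpha_post = alpha_prior + successes
    beta_post = beta_prior + failures

    posterior_mean = alpha_post / (alpha_post + beta_post)
    print(f"Posterior: Beta({alpha_post:g}, {beta_post:g}), mean = {posterior_mean:.3f}")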

In the 20th century, the ideas of Laplace developed in two directions, giving rise to objective and subjective currents in Bayesian practice.

Harold Jeffreys' Theory of Probability (first published in 1939) played an important role in the revival of the Bayesian view of probability, followed by works by Abraham Wald (1950) and Leonard J. Savage (1954). The adjective Bayesian itself dates to the 1950s; the derived terms Bayesianism and neo-Bayesianism are of 1960s coinage. In the objectivist stream, the statistical analysis depends only on the model assumed and the data analysed. No subjective decisions need to be involved. In contrast, "subjectivist" statisticians deny the possibility of fully objective analysis for the general case.

In the 1980s, there was a dramatic growth in research and applications of Bayesian methods, mostly attributed to the discovery of Markov chain Monte Carlo methods and the consequent removal of many of the computational problems, and to an increasing interest in nonstandard, complex applications. While frequentist statistics remains strong (as demonstrated by the fact that much of undergraduate teaching is based on it), Bayesian methods are widely accepted and used, e.g., in the field of machine learning.