Bayesian linear regression

 

  • Bayesian linear regression is a type of conditional modeling in which the mean of one variable is described by a linear combination of other variables, with the goal of obtaining
    the posterior probability of the regression coefficients (as well as other parameters describing the distribution of the regressand) and ultimately allowing the out-of-sample
    prediction of the regressand (often labelled $y$) conditional on observed values of the regressors (usually $X$).

  • Under the conjugate prior for this model, the prior on the noise variance $\sigma^2$ is an inverse-gamma distribution, $\text{Inv-Gamma}(a_0, b_0)$; equivalently, it can also be
    described as a scaled inverse chi-squared distribution, $\text{Scale-inv-}\chi^2(\nu_0, s_0^2)$ with $\nu_0 = 2a_0$ and $s_0^2 = b_0/a_0$. Further, the conditional prior density
    $\rho(\beta \mid \sigma^2)$ is a normal distribution; in the notation of the normal distribution, the conditional prior distribution is $\mathcal{N}(\mu_0, \sigma^2 \Lambda_0^{-1})$.
    With the prior now specified, the posterior distribution can be expressed as
    $\rho(\beta, \sigma^2 \mid y, X) \propto \rho(y \mid X, \beta, \sigma^2)\, \rho(\beta \mid \sigma^2)\, \rho(\sigma^2)$. With some re-arrangement,[3] the posterior can be re-written
    so that the posterior mean $\mu_n$ of the parameter vector $\beta$ can be expressed in terms of the least squares estimator $\hat{\beta}$ and the prior mean $\mu_0$, with the
    strength of the prior indicated by the prior precision matrix $\Lambda_0$: $\mu_n = (X^\top X + \Lambda_0)^{-1}(X^\top X \hat{\beta} + \Lambda_0 \mu_0)$. To justify that $\mu_n$ is
    indeed the posterior mean, the quadratic terms in the exponential can be re-arranged as a quadratic form in $\beta - \mu_n$, as sketched below.
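
    The re-arrangement is the standard completing-the-square step; written in the notation above (a sketch of the step, not the cited derivation verbatim), the quadratic terms combine as

      (y - X\beta)^\top (y - X\beta) + (\beta - \mu_0)^\top \Lambda_0 (\beta - \mu_0)
        = (\beta - \mu_n)^\top (X^\top X + \Lambda_0)(\beta - \mu_n) + c,
      \qquad \mu_n = (X^\top X + \Lambda_0)^{-1}(X^\top y + \Lambda_0 \mu_0),

    where $c$ collects terms that do not involve $\beta$. Since $X^\top y = X^\top X \hat{\beta}$, this $\mu_n$ is exactly the precision-weighted combination of $\hat{\beta}$ and $\mu_0$ stated above.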

  • Model setup. Consider a standard linear regression problem, in which for $i = 1, \ldots, n$ we specify the mean of the conditional distribution of $y_i$ given a $k \times 1$
    predictor vector $x_i$: $y_i = x_i^\top \beta + \varepsilon_i$, where $\beta$ is a $k \times 1$ vector, and the $\varepsilon_i$ are independent and identically normally distributed
    random variables, $\varepsilon_i \sim \mathcal{N}(0, \sigma^2)$. This corresponds to the likelihood function
    $\rho(y \mid X, \beta, \sigma^2) \propto (\sigma^2)^{-n/2} \exp\!\left(-\tfrac{1}{2\sigma^2}(y - X\beta)^\top (y - X\beta)\right)$. The ordinary least squares solution is used to
    estimate the coefficient vector using the Moore–Penrose pseudoinverse, $\hat{\beta} = (X^\top X)^{-1} X^\top y$, where $X$ is the $n \times k$ design matrix, each row of which is a
    predictor vector $x_i^\top$, and $y$ is the column $n$-vector $(y_1, \ldots, y_n)^\top$. A runnable sketch of this setup follows this item.
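
    A minimal NumPy sketch of the model setup; the data are simulated, and the values of beta_true and sigma are illustrative choices, not from the source.

      import numpy as np

      rng = np.random.default_rng(0)

      # Simulated n x k design matrix and illustrative "true" coefficients.
      n, k = 100, 3
      X = rng.normal(size=(n, k))
      beta_true = np.array([1.5, -2.0, 0.5])
      sigma = 0.8

      # y_i = x_i' beta + eps_i with eps_i ~ N(0, sigma^2).
      y = X @ beta_true + rng.normal(scale=sigma, size=n)

      # Ordinary least squares via the Moore-Penrose pseudoinverse.
      beta_hat = np.linalg.pinv(X) @ y
      print("OLS estimate:", beta_hat)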

  • Here, the model is defined by the likelihood function $\rho(y \mid X, \beta, \sigma^2)$ and the prior distribution on the parameters, i.e. $\rho(\beta, \sigma^2)$.

  • Because we have chosen a conjugate prior, the marginal likelihood can also be easily computed by evaluating the following equality for arbitrary values of $\beta$ and $\sigma^2$
    (written out below).
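
    Written out, the equality is Bayes' theorem rearranged for the evidence:

      p(y \mid m) = \frac{p(\beta, \sigma^2 \mid m)\; p(y \mid X, \beta, \sigma^2)}{p(\beta, \sigma^2 \mid y, X, m)},

    which holds at every $(\beta, \sigma^2)$, so with the conjugate posterior known in closed form the right-hand side can be evaluated at any convenient point, for example the posterior mean.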

  • In this model, and under a particular choice of prior probabilities for the parameters—so-called conjugate priors—the posterior can be found analytically.

  • where the two factors correspond to the densities of $\mathcal{N}(\mu_n, \sigma^2 \Lambda_n^{-1})$ and $\text{Inv-Gamma}(a_n, b_n)$ distributions, with the parameters of these given
    by $\Lambda_n = X^\top X + \Lambda_0$, $\mu_n = \Lambda_n^{-1}(X^\top X \hat{\beta} + \Lambda_0 \mu_0)$, $a_n = a_0 + \tfrac{n}{2}$ and
    $b_n = b_0 + \tfrac{1}{2}\left(y^\top y + \mu_0^\top \Lambda_0 \mu_0 - \mu_n^\top \Lambda_n \mu_n\right)$, which illustrates Bayesian inference being a compromise between the
    information contained in the prior and the information contained in the sample (see the code sketch after this item).
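
    A minimal NumPy sketch of these update equations. The simulated data mirror the model-setup sketch above, and the prior hyperparameters (mu0, Lambda0, a0, b0) are illustrative
    choices, not values from the source.

      import numpy as np

      def conjugate_update(X, y, mu0, Lambda0, a0, b0):
          """Normal-inverse-gamma conjugate update: returns (mu_n, Lambda_n, a_n, b_n)."""
          XtX = X.T @ X
          beta_hat = np.linalg.pinv(X) @ y                      # OLS estimate
          Lambda_n = XtX + Lambda0                              # updated precision matrix
          mu_n = np.linalg.solve(Lambda_n, XtX @ beta_hat + Lambda0 @ mu0)
          a_n = a0 + len(y) / 2.0
          b_n = b0 + 0.5 * (y @ y + mu0 @ Lambda0 @ mu0 - mu_n @ Lambda_n @ mu_n)
          return mu_n, Lambda_n, a_n, b_n

      # Simulated data (same illustrative setup as the model-setup sketch above).
      rng = np.random.default_rng(0)
      n, k = 100, 3
      X = rng.normal(size=(n, k))
      y = X @ np.array([1.5, -2.0, 0.5]) + rng.normal(scale=0.8, size=n)

      # Weakly informative prior centred at zero (illustrative hyperparameters).
      mu0, Lambda0, a0, b0 = np.zeros(k), 0.1 * np.eye(k), 1.0, 1.0
      mu_n, Lambda_n, a_n, b_n = conjugate_update(X, y, mu0, Lambda0, a0, b0)
      print("posterior mean of beta:", mu_n)

      # Joint posterior draws: sigma^2 ~ Inv-Gamma(a_n, b_n), beta | sigma^2 ~ N(mu_n, sigma^2 Lambda_n^{-1}).
      sigma2_draw = 1.0 / rng.gamma(a_n, 1.0 / b_n)
      beta_draw = rng.multivariate_normal(mu_n, sigma2_draw * np.linalg.inv(Lambda_n))

    Out-of-sample prediction follows by pushing such posterior draws through $x_\ast^\top \beta$ and adding Gaussian noise; marginally this yields a Student-t predictive distribution.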

  • The prior belief about the parameters is combined with the data’s likelihood function according to Bayes’ theorem to yield the posterior belief about the parameters $\beta$ and
    $\sigma^2$.
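
    In symbols, using the notation of the model setup:

      \rho(\beta, \sigma^2 \mid y, X) = \frac{\rho(y \mid X, \beta, \sigma^2)\, \rho(\beta, \sigma^2)}{\rho(y \mid X)} \propto \rho(y \mid X, \beta, \sigma^2)\, \rho(\beta, \sigma^2).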

  • In fact, a “full” Bayesian analysis would require a joint likelihood $\rho(y, X \mid \beta, \sigma^2, \gamma)$ along with a prior $\rho(\beta, \sigma^2, \gamma)$, where $\gamma$
    symbolizes the parameters of the distribution for $X$.

  • In the Bayesian approach, the data are supplemented with additional information in the form of a prior probability distribution.

  • Moreover, under classic assumptions the regressors $X$ are considered chosen (for example, in a designed experiment) and therefore have a known probability distribution without
    parameters.

  • The prior can take different functional forms depending on the domain and the information that is available a priori.

  • Model complexity is already taken into account by the model evidence, because it marginalizes out the parameters by integrating over all possible values of $\beta$ and $\sigma^2$
    (the integral is written out below).
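
    Concretely, the evidence for a model $m$ is the integral

      p(y \mid m) = \iint p(y \mid X, \beta, \sigma^2)\, p(\beta, \sigma^2 \mid m)\; d\beta\, d\sigma^2,

    and for the normal-inverse-gamma conjugate prior used above it has the standard closed form

      p(y \mid m) = \frac{1}{(2\pi)^{n/2}} \sqrt{\frac{\det \Lambda_0}{\det \Lambda_n}} \cdot \frac{b_0^{a_0}}{b_n^{a_n}} \cdot \frac{\Gamma(a_n)}{\Gamma(a_0)}.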

 

Works Cited

Notes
1. See Jackman (2009), p. 101.
2. See Gelman et al. (2013), p. 354.
3. The intermediate steps of this computation can be found in O’Hagan (1994) at the beginning of the chapter on Linear models.
4. The intermediate steps are in Fahrmeir et al. (2009) on page 188.
5. The intermediate steps of this computation can be found in O’Hagan (1994) on page 257.
6. Carlin and Louis (2008) and Gelman et al. (2003) explain how to use sampling methods for Bayesian linear regression.

References
1. Box, G. E. P.; Tiao, G. C. (1973). Bayesian Inference in Statistical Analysis. Wiley. ISBN 0-471-57428-7.
2. Carlin, Bradley P.; Louis, Thomas A. (2008). Bayesian Methods for Data Analysis (Third ed.). Boca Raton, FL: Chapman and Hall/CRC. ISBN 1-58488-697-8.
3. Fahrmeir, L.; Kneib, T.; Lang, S. (2009). Regression. Modelle, Methoden und Anwendungen (Second ed.). Heidelberg: Springer. doi:10.1007/978-3-642-01837-4. ISBN 978-3-642-01836-7.
4. Gelman, Andrew; et al. (2013). “Introduction to regression models”. Bayesian Data Analysis (Third ed.). Boca Raton, FL: Chapman and Hall/CRC. pp. 353–380. ISBN 978-1-4398-4095-5.
5. Jackman, Simon (2009). “Regression models”. Bayesian Analysis for the Social Sciences. Wiley. pp. 99–124. ISBN 978-0-470-01154-6.
6. Rossi, Peter E.; Allenby, Greg M.; McCulloch, Robert (2006). Bayesian Statistics and Marketing. John Wiley & Sons. ISBN 0470863676.
7. O’Hagan, Anthony (1994). Bayesian Inference. Kendall’s Advanced Theory of Statistics. Vol. 2B (First ed.). Halsted. ISBN 0-340-52922-9.