DataTau
Ask DT: How can a prior be composed from a weighted sum of other priors?
5 points by Nadav 3500 days ago | 3 comments
I am reading Uber's blog post about predicting the destination of a ride based on its origin, time, and other features [http://blog.uber.com/passenger-destinations].

They describe composing the prior from a weighted sum of other priors. I am new to Bayesian inference, and so far all the priors I have seen were based on known distributions (normal/Poisson/binomial/etc.). I am confused about how/why this can be done: is it proven that the new prior's density function integrates to 1? And what about interdependence of the components?



5 points by alex 3500 days ago | link

This is just a mixture model. You can easily prove that a convex combination of PDFs is guaranteed to yield another PDF. For $f = \alpha f_1 + (1-\alpha) f_2$ with $0 \le \alpha \le 1$:

$\int f = \alpha \int f_1 + (1-\alpha) \int f_2 = \alpha + 1 - \alpha = 1$

where we assume that $f_1, f_2$ are valid PDFs, so each integrates to 1.
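You can also check this numerically (a sketch, not from the thread): the mixture weight and the two normal components below are arbitrary choices for illustration.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    # Density of a normal distribution with mean mu and std dev sigma.
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

alpha = 0.3  # mixture weight, chosen arbitrarily
x = np.linspace(-20, 20, 200001)

# Convex combination of two valid PDFs.
f = alpha * normal_pdf(x, -2.0, 1.0) + (1 - alpha) * normal_pdf(x, 5.0, 2.0)

# Numerical integral over a grid wide enough to capture both components.
total = np.trapz(f, x)
print(total)  # ~1.0
```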

-----

1 point by Nadav 3499 days ago | link

Thanks — so independence of the components is not a requirement?

-----

1 point by jcbozonier 3497 days ago | link

Any prior can be made to sum to 1 with proper scaling (normalization). Perhaps also check out Think Bayes: http://www.greenteapress.com/thinkbayes/

Downey gives some examples of constructing a prior from purely subjective information.
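As a hypothetical sketch in that spirit (the destination names and weights below are made up, not taken from Uber's post or from Downey's book): you assign arbitrary positive weights to hypotheses and divide by their sum to get a valid prior.

```python
# Made-up subjective weights over ride destinations.
raw_weights = {"airport": 5.0, "downtown": 3.0, "suburbs": 2.0}

# Normalize so the probabilities sum to 1.
total = sum(raw_weights.values())
prior = {dest: w / total for dest, w in raw_weights.items()}

print(prior)                # e.g. {'airport': 0.5, ...}
print(sum(prior.values()))  # 1.0
```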

-----
