Binomial
Let \(x\) be the number of successes in \(n\) independent trials, each with
probability of success \(\theta\), denoted by \(x|n,\theta
\sim Binomial(n,\theta)\), with probability mass function \[
p(x|n,\theta) = {n \choose x}\theta^x(1-\theta)^{n-x}, \qquad\qquad
x=0,1,\ldots,n,
\] with \[
E(x|n,\theta) = n\theta \ \ \ \mbox{and} \ \ \ V(x|n,\theta) =
n\theta(1-\theta).
\]
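As a quick numerical check (a minimal R sketch; the values of \(n\) and \(\theta\) below are illustrative and not taken from the text), these moments can be verified against R's dbinom:

```r
# Minimal check of the Binomial mean and variance (illustrative n and theta)
n     <- 10
theta <- 0.6
x     <- 0:n
pmf   <- dbinom(x, size = n, prob = theta)            # p(x | n, theta)

c(mean_formula = n * theta,                mean_numeric = sum(x * pmf))
c(var_formula  = n * theta * (1 - theta),  var_numeric  = sum((x - n * theta)^2 * pmf))
```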
Prior and posterior for \(\theta\)
Suppose we use a simple (non-informative) uniform prior for \(\theta\), i.e. \(\theta \sim U(0,1)\). Therefore \[
p(\theta|x,n) \propto \theta^x(1-\theta)^{n-x},
\] which is the kernel of a Beta distribution with parameters
\(x+1\) and \(n-x+1\).
Recall that when \(\theta \sim
Beta(a,b)\), then \[
p(\theta|a,b) =
\frac{\Gamma(a+b)}{\Gamma(a)\Gamma(b)}\theta^{a-1}(1-\theta)^{b-1},
\] with \(E(\theta|a,b)=a/(a+b)\) and \(V(\theta|a,b)=ab/((a+b)^2(a+b+1))\).
In our case, \(a=x+1\) and \(b=n-x+1\), so \[
E(\theta|x,n) = \frac{x+1}{n+2} \qquad \mbox{and} \qquad
V(\theta|x,n) = \frac{(x+1)(n-x+1)}{(n+2)^2(n+3)}.
\]
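These summaries are easy to compute in R; the helper below (beta_summary is my own illustrative name, not from the text) also covers the Negative Binomial case later, where the posterior is \(Beta(m+1,y+1)\):

```r
# Mean and variance of a Beta(a, b) distribution
beta_summary <- function(a, b) {
  c(mean = a / (a + b),
    var  = a * b / ((a + b)^2 * (a + b + 1)))
}

# Binomial sampling with a U(0,1) prior: the posterior is Beta(x + 1, n - x + 1)
n <- 10; x <- 6                 # illustrative values (they match the Data section below)
beta_summary(x + 1, n - x + 1)  # mean = 7/12, var = 35/(12^2 * 13)
```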
Prior predictive
The prior predictive is given by \[
p(x|n) = \int_0^1 {n \choose x}\theta^x(1-\theta)^{n-x}\,d\theta =
{n \choose x}\frac{\Gamma(x+1)\Gamma(n-x+1)}{\Gamma(n+2)}.
\]
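As a sanity check (a minimal R sketch with the same illustrative \(n\) and \(x\)), the closed form can be compared with direct numerical integration over \(\theta\):

```r
# Prior predictive p(x | n) under the U(0,1) prior: closed form vs numerical integration
n <- 10; x <- 6
closed  <- choose(n, x) * gamma(x + 1) * gamma(n - x + 1) / gamma(n + 2)
numeric <- integrate(function(th) dbinom(x, size = n, prob = th), 0, 1)$value
c(closed = closed, numeric = numeric)   # both equal 1/(n + 1) = 1/11
```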
Negative Binomial
Let \(y\) be the number of failures
before \(m\) successes based on
independent Bernoulli trials. Therefore the total number of trials is
\(n=m+y\). The random variable \(y\), given \(m\) and \(\theta\), follows what we call the
Negative Binomial distribution with parameters \(m\) and \(\theta\), denoted by
\(y|m,\theta \sim NB(m,\theta)\), with probability mass function \[
p(y|\theta,m) = {m+y-1 \choose y}\theta^m(1-\theta)^y, \qquad
y=0,1,2,\ldots,
\] with \[
E(y|m,\theta) = \frac{m(1-\theta)}{\theta} \qquad \mbox{and} \qquad
V(y|m,\theta) = \frac{m(1-\theta)}{\theta^2}.
\]
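R's dnbinom uses exactly this parameterization (y failures, size = m, prob = \(\theta\)), so the moments can again be checked numerically (a minimal sketch with illustrative values of \(m\) and \(\theta\)):

```r
# Negative Binomial: dnbinom(y, size = m, prob = theta) = choose(m + y - 1, y) theta^m (1 - theta)^y
m     <- 6
theta <- 0.6
y     <- 0:500                                  # truncated support; the remaining tail mass is negligible
pmf   <- dnbinom(y, size = m, prob = theta)

c(mean_formula = m * (1 - theta) / theta,    mean_numeric = sum(y * pmf))
c(var_formula  = m * (1 - theta) / theta^2,
  var_numeric  = sum((y - m * (1 - theta) / theta)^2 * pmf))
```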
Prior and posterior for \(\theta\)
Suppose, as in the Binomial case, we use a simple
(non-informative) uniform prior for \(\theta\), i.e. \(\theta \sim U(0,1)\). Therefore \[
p(\theta|y,m) \propto \theta^m(1-\theta)^y,
\] which is the kernel of a Beta distribution with parameters
\(m+1\) and \(y+1\). Hence \[
E(\theta|y,m) = \frac{m+1}{m+y+2} \qquad \mbox{and} \qquad
V(\theta|y,m) = \frac{(m+1)(y+1)}{(m+y+2)^2(m+y+3)}.
\]
Prior predictive
The prior predictive is given by \[
p(y|m) = \int_0^1 {m+y-1 \choose y} \theta^m(1-\theta)^y\,d\theta
=
{m+y-1 \choose y} \frac{\Gamma(m+1)\Gamma(y+1)}{\Gamma(m+y+2)}.
\]
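The same kind of check applies here (a minimal R sketch, using the illustrative values \(m=6\) and \(y=4\) from the Data section below):

```r
# Prior predictive p(y | m) under the U(0,1) prior: closed form vs numerical integration
m <- 6; y <- 4
closed  <- choose(m + y - 1, y) * gamma(m + 1) * gamma(y + 1) / gamma(m + y + 2)
numeric <- integrate(function(th) dnbinom(y, size = m, prob = th), 0, 1)$value
c(closed = closed, numeric = numeric)
```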
Data
Let us assume we observed the following data \[
\{0,1,1,0,0,1,0,1,1,1\}
\] For this dataset, the values of \((n,x)\) and \((m,y)\) for the Binomial and Negative
Binomial, respectively, are \[
(n,x) = (10,6) \qquad \mbox{and} \qquad (m,y)= (6,4).
\] Note that the last observation is a success, so under the Negative Binomial scheme sampling stops exactly at the \(m=6\)th success, leaving \(y=4\) failures.
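In R, the statistics used by both models can be read off the raw 0/1 sequence (a minimal sketch; the object names are mine):

```r
# Raw Bernoulli sequence and the statistics used by each model
data <- c(0, 1, 1, 0, 0, 1, 0, 1, 1, 1)

n <- length(data)       # Binomial: number of trials
x <- sum(data)          # Binomial: number of successes
m <- sum(data)          # Negative Binomial: sampling stopped at the m-th success (the last trial)
y <- sum(data == 0)     # Negative Binomial: failures before the m-th success
c(n = n, x = x, m = m, y = y)   # (10, 6, 6, 4)
```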
Posterior distributions
\[\begin{eqnarray*}
p(\theta|x,n) &\propto& \theta^x(1-\theta)^{n-x} =
\theta^6(1-\theta)^4\\
p(\theta|y,m) &\propto& \theta^m(1-\theta)^y =
\theta^6(1-\theta)^4,
\end{eqnarray*}\] so both posteriors are exactly the same.
In other words, for a given dataset of 0/1 Bernoulli trials, it does not
matter whether the data were collected as \(n\) iid draws (Binomial) or as iid draws
continued until \(m\) successes were obtained (Negative Binomial, \(m+y\) draws in total). The two
likelihoods are the same up to a multiplicative constant; therefore the posteriors are
identical. This is an instance of the Likelihood Principle. See my PhD
lecture, where I review the sufficiency, conditionality, and likelihood principles, among other topics: https://hedibert.org/wp-content/uploads/2017/04/principlesofdatareduction.pdf
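Since both posteriors are \(Beta(7,5)\), this can also be seen directly by evaluating the two densities on a grid of \(\theta\) values (a minimal R sketch):

```r
# Both posteriors are Beta(7, 5); their densities coincide everywhere
th            <- seq(0, 1, length.out = 501)
post_binomial <- dbeta(th, 6 + 1, 10 - 6 + 1)   # Beta(x + 1, n - x + 1)
post_negbin   <- dbeta(th, 6 + 1, 4 + 1)        # Beta(m + 1, y + 1)
max(abs(post_binomial - post_negbin))           # 0: the two curves are identical

plot(th, post_binomial, type = "l",
     xlab = expression(theta), ylab = "posterior density")
```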
Bayes factor
The Bayes factor is the ratio of the prior predictive densities:
\[
B = \frac{p(x|n)}{p(y|m)} =
\frac{{10 \choose 6}\frac{\Gamma(7)\Gamma(5)}{\Gamma(12)}}
{{9 \choose 4} \frac{\Gamma(7)\Gamma(5)}{\Gamma(12)}} =
\frac{{10 \choose 6}}{{9 \choose 4}} =
\frac{210}{126} = \frac{5}{3} \approx 1.67.
\] In words, for the observed data, the Binomial model is
slightly better than the Negative Binomial, despite the fact that both
models lead to identical posterior distributions for \(\theta\).
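The computation can be reproduced in R (a minimal sketch using the base beta function, \(B(a,b)=\Gamma(a)\Gamma(b)/\Gamma(a+b)\)):

```r
# Bayes factor: ratio of the two prior predictive densities for the observed data
n <- 10; x <- 6; m <- 6; y <- 4
p_binomial <- choose(n, x) * beta(x + 1, n - x + 1)          # prior predictive p(x | n)
p_negbin   <- choose(m + y - 1, y) * beta(m + 1, y + 1)      # prior predictive p(y | m)
c(B = p_binomial / p_negbin, check = choose(10, 6) / choose(9, 4))   # 5/3
```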