1 Models

Let us assume that observations \(x_1,\ldots,x_n\) are independent and identically distributed and that we are interested in comparing two competing models:

  1. Model \({\cal M}_1: \ Gaussian(\mu,1)\), or
  2. Model \({\cal M}_2: \ Cauchy(\mu,1)\),

where \(\mu\) is the mean of the \(X\)’s under the Gaussian model, \({\cal M}_1\), but not under the Cauchy model, \({\cal M}_2\), since the Cauchy distribution has no finite mean. Nonetheless, \(\mu\) can still be interpreted as the median/mode of the \(X\)’s under the Cauchy model.
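
To see how the two sampling models differ, the following minimal sketch (illustrative values only, not part of the analysis below) compares the two densities at a common location \(\mu=0\) and unit scale:

# Gaussian vs Cauchy densities with the same location and scale:
# the Cauchy tails decay much more slowly
z = c(0,1,3,5)
rbind(Gaussian=dnorm(z,0,1),Cauchy=dcauchy(z,0,1))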

2 Prior

Since \(\mu\) represents the same quantity (location) under both models, we will assume a common prior for \(\mu\): \[ \mu \sim N(0,\tau_0^2), \] where, for simplicity, we will assume that \(\tau_0^2=10\).
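
As a quick illustration of how diffuse this prior is (a one-liner added here, not part of the original analysis), its central 90% range is roughly \((-5.2,5.2)\):

# Central 90% interval of the N(0,10) prior for mu
qnorm(c(0.05,0.95),0,sqrt(10))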

3 Posterior

Under the Gaussian model, the posterior distribution of \(\mu\) follows from simple conjugacy: \[ \mu|x_{1:n},{\cal M}_1 \sim N(\mu_1,\tau_1^2), \] where \(\mu_1=\sum_{i=1}^n x_i/(n+1/\tau_0^2)\) and \(\tau_1^2=1/(n+1/\tau_0^2)\). So, posterior inference can be performed in closed form.
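
For completeness, the conjugate update follows from collecting the quadratic terms in \(\mu\) (recall that the sampling variance is fixed at 1): \[ p(\mu|x_{1:n},{\cal M}_1) \propto \exp\left\{-\tfrac{1}{2}\sum_{i=1}^n(x_i-\mu)^2-\tfrac{1}{2}\mu^2/\tau_0^2\right\} \propto \exp\left\{-\tfrac{1}{2}\left(n+1/\tau_0^2\right)\left(\mu-\mu_1\right)^2\right\}, \] with \(\mu_1\) and \(\tau_1^2\) as above.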

However, this is not the case for the posterior of \(\mu\) under the Cauchy model: \[ p(\mu|x_{1:n},{\cal M}_2) \propto \left[\prod_{i=1}^n p(x_i|\mu,{\cal M}_2)\right] p(\mu) \propto \left[\prod_{i=1}^n \frac{1}{1+(x_i-\mu)^2}\right]\exp\{-0.5\mu^2/\tau_0^2\}, \] which does not belong to any known family of distributions.

# Normalized posterior for model 1 (n is read from the global environment)
post1 = function(sumx,mu,tau2){
  dnorm(mu,sumx/(n+1/tau2),sqrt(1/(n+1/tau2)))
}
# Unnormalized posterior for model 2
post2 = function(x,mu,tau2){
  prod(dt(mu-x,df=1))*dnorm(mu,0,sqrt(tau2))
}
# Unnormalized posterior for model 2 (log)
logpost2 = function(x,mu,tau2){
  sum(dt(mu-x,df=1,log=TRUE))+dnorm(mu,0,sqrt(tau2),log=TRUE)
}
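
As a quick sanity check (with illustrative test values, not the data analyzed below), post2 and exp(logpost2) should agree up to floating-point error:

# Illustrative check: direct vs log-scale evaluation of the unnormalized posterior
x.test = c(-1,2)
c(post2(x.test,0.5,10),exp(logpost2(x.test,0.5,10)))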

4 Turning the Bayesian crank

# Hyperparameters
tau2=10

# Observed data
x    = c(-4.3,3.2)
sumx = sum(x)
n    = length(x)

# Plotting the posterior densities
par(mfrow=c(1,2))

mu = seq(-5,5,length=1000)
plot(mu,post1(sumx,mu,tau2),type="l",lwd=2,ylab="Posterior")
title("Model 1: Gaussian")
for (i in 1:n)
  points(x[i],0,pch=16,col=3)

mu = seq(-10,10,length=1000)
lp = rep(0,1000)
for (i in 1:1000)
  lp[i] = logpost2(x,mu[i],tau2)
lp1 = exp(lp-max(lp))
plot(mu,lp1,type="l",lwd=2,ylab="Prior*likelihood")
title("Model 2: Cauchy")
for (i in 1:n)
  points(x[i],0,pch=16,col=3)
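
The right panel displays the Cauchy-model posterior only up to a constant. As a side note (a sketch, not part of the original analysis), a mode of that unnormalized posterior, possibly only a local one, can be located with base R's optimize():

# Locate a (possibly local) mode of the unnormalized Cauchy-model posterior
optimize(function(m) logpost2(x,m,tau2),interval=c(-10,10),maximum=TRUE)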

5 Computing \(p(x_{1:n}|{\cal M}_2)\) via simple Riemann integration

M    = 1000000
mus  = seq(-15,15,length=M)
h    = mus[2]-mus[1]
prod = rep(0,M)
for (i in 1:M)
  prod[i] = h*post2(x,mus[i],tau2)
kappa = sum(prod)
kappa
## [1] 0.000775713
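
As a cross-check (a sketch, not part of the original analysis), base R's integrate() returns essentially the same normalizing constant:

# Adaptive quadrature over the real line as an alternative to the Riemann sum
integrate(Vectorize(function(m) post2(x,m,tau2)),lower=-Inf,upper=Inf)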

We can now plot both posterior densities, together with the common prior, in the same figure:

par(mfrow=c(1,1))
# Gaussian-model posterior (closed form)
plot(mu,post1(sumx,mu,tau2),type="l",lwd=2,xlab=expression(mu),
ylab="Posterior density")
# Cauchy-model posterior (grid evaluation, normalized by kappa)
lines(mu,exp(lp-log(kappa)),col=2,lwd=2)
# Common prior
lines(mu,dnorm(mu,0,sqrt(10)),col=4,lwd=2)
legend("topright",legend=c("Prior of mu","Posterior of mu (Gaussian model)","Posterior of mu (Cauchy model)"),col=c(4,1:2),lwd=2,lty=1)
for (i in 1:n)
  points(x[i],0,pch=16,col=3)

6 Computing other posterior summaries

Below we compute, via simple Riemann integration, the following summaries:

  1. \(E(\mu|x_{1:n},{\cal M}_2)\)
  2. \(E(\mu^2|x_{1:n},{\cal M}_2)\)
  3. \(Pr(\mu<1|x_{1:n},{\cal M}_2)\)

M      = 1000000
mus    = seq(-15,15,length=M)
h      = mus[2]-mus[1]
prod   = matrix(0,M,3)
for (i in 1:M){
  prod[i,1] = mus[i]*h*post2(x,mus[i],tau2)/kappa
  prod[i,2] = mus[i]^2*h*post2(x,mus[i],tau2)/kappa
  prod[i,3] = ifelse(mus[i]<1,1,0)*h*post2(x,mus[i],tau2)/kappa
}
E = sum(prod[,1])
E2 = sum(prod[,2])
V = E2-E^2
SD = sqrt(V)

tab = round(rbind(c(sumx/(n+1/tau2),sqrt(1/(n+1/tau2)),
pnorm(1,sumx/(n+1/tau2),sqrt(1/(n+1/tau2)))),
c(E,SD,sum(prod[,3]))),3)
rownames(tab) = c("Gaussian model","Cauchy model")
colnames(tab) = c("E(mu)","SD(mu)","Pr(mu<1)")
tab
##                 E(mu) SD(mu) Pr(mu<1)
## Gaussian model -0.524   0.69    0.986
## Cauchy model   -0.066   2.95    0.550
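
The same three summaries can be obtained a bit more compactly by forming the normalized grid weights only once; a minimal sketch:

# Normalized grid probabilities for the Cauchy-model posterior
dens = rep(0,M)
for (i in 1:M)
  dens[i] = post2(x,mus[i],tau2)
wgt = dens*h/kappa
c(sum(mus*wgt),sqrt(sum(mus^2*wgt)-sum(mus*wgt)^2),sum(wgt[mus<1]))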

7 SIR-based posterior inference under the Cauchy model

N = 1000000
# Proposal: N draws from the common prior N(0,10)
mus = rnorm(N,0,sqrt(10))
# Importance weights: Cauchy likelihood evaluated at each prior draw
w = rep(0,N)
for (i in 1:N)
  w[i] = prod(dt(x-mus[i],df=1))
# Resample (with replacement) proportionally to the weights
ind = sample(1:N,size=N,prob=w,replace=TRUE)
mus1 = mus[ind]

hist(mus1,prob=TRUE,xlab=expression(mu),ylab="Posterior density",main="",ylim=c(0,0.25))
mu = seq(-10,10,length=1000)
lines(mu,exp(lp-log(kappa)),col=2,lwd=2)
legend("topleft",legend=c("Exact posterior","MC-based posterior draws"),col=2:1,lty=1,lwd=2)

7.1 Summarizing the results

tab = rbind(tab,round(c(mean(mus1),sqrt(var(mus1)),mean(mus1<1)),3))
rownames(tab) = c("Gaussian model (closed form)","Cauchy model (Riemann)","Cauchy model (SIR)")
tab = cbind(tab[c(1,3),],
rbind(c(qnorm(0.05,sumx/(n+1/tau2),sqrt(1/(n+1/tau2))),
qnorm(0.95,sumx/(n+1/tau2),sqrt(1/(n+1/tau2)))),
quantile(mus1,c(0.05,0.95))))
tab
##                               E(mu) SD(mu) Pr(mu<1)        5%       95%
## Gaussian model (closed form) -0.524  0.690    0.986 -1.658866 0.6112473
## Cauchy model (SIR)           -0.065  2.952    0.550 -4.635423 3.8669082

8 Model comparison

The prior predictive for the Gaussian model is easy to derive: \[ p(x_{1:n}|{\cal M}_1) = \prod_{i=1}^n p_N(x_i|0,1+\tau_0^2). \]

priorpred.M1 = prod(dnorm(x,0,tau2+1))
priorpred.M2.Riemann = kappa
priorpred.M2.MonteCarlo = mean(w)
BF.Riemann = priorpred.M1/priorpred.M2.Riemann
BF.MonteCarlo = priorpred.M1/priorpred.M2.MonteCarlo
c(priorpred.M1,priorpred.M2.Riemann,priorpred.M2.MonteCarlo)
## [1] 0.0011680885 0.0007757130 0.0007756855
c(BF.Riemann,BF.MonteCarlo)
## [1] 1.505826 1.505879
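
If the two models are assigned equal prior probabilities (an assumption made here only for illustration), the Bayes factor converts directly into a posterior probability for the Gaussian model:

# Pr(M1|x) under equal prior model probabilities,
# using the Riemann-based Bayes factor computed above
postprob.M1 = BF.Riemann/(1+BF.Riemann)
postprob.M1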