1 The model

In this illustration we consider a simple Gaussian linear regression model whose errors follow a first-order autoregressive structure: for \(t=1,\ldots,n\), \[\begin{eqnarray*} y_t &=& \beta x_t + u_t\\ u_t &=& \rho u_{t-1} + \epsilon_t \end{eqnarray*}\] where the \(\epsilon_t\) follow a Gaussian white noise process with variance \(\sigma^2\). This model, among various extensions, was studied in Zellner and Tiao (1964) Bayesian Analysis of the Regression Model With Autocorrelated Errors, Journal of the American Statistical Association (JASA), 59, 763-778. The paper can be downloaded freely from either https://www.jstor.org/stable/2283097 or https://doi.org/10.2307/2283097.
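Before turning to the data, the generative model above can be sketched in a few lines of R. This is a minimal simulation (our own, not from the paper): the object names `xsim`, `ysim`, `usim` and the uniform regressor are illustrative choices, and the stationary initialization \(u_1 \sim N(0,\sigma^2/(1-\rho^2))\) assumes \(|\rho|<1\).

```r
set.seed(1964)
nsim  <- 16
beta  <- 3     # case I values of the exercise below
rho   <- 0.5
sigma <- 1
xsim <- runif(nsim, 1, 6)                           # illustrative regressor
usim <- numeric(nsim)
usim[1] <- rnorm(1, 0, sigma / sqrt(1 - rho^2))     # stationary start (|rho| < 1)
for (t in 2:nsim)
  usim[t] <- rho * usim[t - 1] + rnorm(1, 0, sigma) # AR(1) errors
ysim <- beta * xsim + usim                          # observed series
```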

2 The data

In one of their exercises, \(\beta=3\), \(\sigma^2=1\) and \(\rho=0.5\) (case I) or \(\rho=1.25\) (case II), and \(n=16\). The \(x\)'s are rescaled investment expenditures taken from Haavelmo (1947) Methods of measuring the marginal propensity to consume, JASA, 42, 105-122, which can be downloaded freely from either https://www.jstor.org/stable/2280191 or https://doi.org/10.2307/2280191.

x=c(3.0,3.9,6.0,4.2,5.2,4.7,5.1,4.5,6.0,3.9,4.1,2.2,1.7,2.7,3.3,4.8)
y1=c(9.500,12.649,18.794,12.198,14.372,13.909,14.556,14.700,18.281,
     13.890,10.318,5.473,4.044,6.361,7.036,13.368)
y2=c(9.500,13.024,19.975,14.270,16.760,15.923,16.931,17.111,22.195,
     18.992,18.338,14.012,13.873,17.855,20.099,27.549)
n = length(x)

par(mfrow=c(2,3))
plot(x,y1,type="n",col=0)
text(x,y1,label=1:n,col='blue')
ts.plot(y1-3*x)
acf(y1-3*x)

plot(x,y2,type="n",col=0)
text(x,y2,label=1:n,col='blue')
ts.plot(y2-3*x)
acf(y2-3*x)

3 Simple OLS fits

fit1 = lm(y1~x)
summary(fit1)
## 
## Call:
## lm(formula = y1 ~ x)
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -2.20858 -0.86812 -0.01237  0.74634  2.65171 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)    
## (Intercept)  -1.7209     1.1427  -1.506    0.154    
## x             3.3229     0.2683  12.385 6.23e-09 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.306 on 14 degrees of freedom
## Multiple R-squared:  0.9164, Adjusted R-squared:  0.9104 
## F-statistic: 153.4 on 1 and 14 DF,  p-value: 6.231e-09
fit2 = lm(y2~x)
summary(fit2)
## 
## Call:
## lm(formula = y2 ~ x)
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -5.9539 -2.3962 -0.3135  1.7708  9.0627 
## 
## Coefficients:
##             Estimate Std. Error t value Pr(>|t|)   
## (Intercept)  10.3998     3.2577   3.192  0.00652 **
## x             1.6847     0.7649   2.202  0.04489 * 
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 3.724 on 14 degrees of freedom
## Multiple R-squared:  0.2573, Adjusted R-squared:  0.2043 
## F-statistic: 4.851 on 1 and 14 DF,  p-value: 0.04489

3.1 Residual plots of the OLS fits

u1 = fit1$res
u2 = fit2$res
par(mfrow=c(1,2))
plot(u1,type="b")
plot(u2,type="b")
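A classical numerical screen for residual autocorrelation, not run in the original analysis, is the Durbin-Watson statistic \(\sum_t (u_t-u_{t-1})^2 / \sum_t u_t^2\): values near 2 suggest no first-order autocorrelation, values well below 2 suggest positive autocorrelation. It is easy to compute by hand (the helper `dw` is ours), e.g. `dw(u1)` and `dw(u2)` for the residuals above.

```r
# Durbin-Watson statistic: sum of squared first differences over sum of squares
dw <- function(u) sum(diff(u)^2) / sum(u^2)

dw(c(1, -1, 1, -1))  # alternating signs: DW near 4 (negative autocorrelation)
```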

3.2 Fitting AR(1) to the residuals

summary(lm(u1[2:n]~u1[1:(n-1)]))
## 
## Call:
## lm(formula = u1[2:n] ~ u1[1:(n - 1)])
## 
## Residuals:
##      Min       1Q   Median       3Q      Max 
## -2.05668 -0.71217  0.04631  0.38774  2.73466 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)
## (Intercept)   -0.08763    0.33698  -0.260    0.799
## u1[1:(n - 1)]  0.07222    0.27126   0.266    0.794
## 
## Residual standard error: 1.304 on 13 degrees of freedom
## Multiple R-squared:  0.005423,   Adjusted R-squared:  -0.07108 
## F-statistic: 0.07088 on 1 and 13 DF,  p-value: 0.7942
summary(lm(u2[2:n]~u2[1:(n-1)]))
## 
## Call:
## lm(formula = u2[2:n] ~ u2[1:(n - 1)])
## 
## Residuals:
##     Min      1Q  Median      3Q     Max 
## -3.6734 -0.8364 -0.2031  1.1456  3.9352 
## 
## Coefficients:
##               Estimate Std. Error t value Pr(>|t|)    
## (Intercept)     0.9994     0.5213   1.917 0.077438 .  
## u2[1:(n - 1)]   0.9972     0.1907   5.228 0.000163 ***
## ---
## Signif. codes:  0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
## 
## Residual standard error: 1.969 on 13 degrees of freedom
## Multiple R-squared:  0.6777, Adjusted R-squared:  0.6529 
## F-statistic: 27.34 on 1 and 13 DF,  p-value: 0.0001628
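As a sanity check on the lagged-residual regressions above, R's built-in `arima()` fits the same AR(1) by maximum likelihood; its `ar1` coefficient plays the role of the lag slope. A small simulated example (our own, with an assumed true coefficient of 0.8) shows it recovering the autoregressive parameter:

```r
set.seed(1947)
rho.true <- 0.8
e <- numeric(200)
for (t in 2:200)
  e[t] <- rho.true * e[t - 1] + rnorm(1)  # simulated AR(1) "residuals"
# ML fit of a zero-mean AR(1); coef["ar1"] estimates rho.true
fit.ar1 <- arima(e, order = c(1, 0, 0), include.mean = FALSE)
fit.ar1$coef["ar1"]
```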

4 Likelihood and prior

It is easy to see that \[\begin{eqnarray*} y_t &=& \beta x_t + u_t\\ \rho y_{t-1} &=& \rho \beta x_{t-1} + \rho u_{t-1}, \end{eqnarray*}\] such that \[ y_t - \rho y_{t-1} = \beta (x_t-\rho x_{t-1}) + \epsilon_t. \] So, for the sake of the exercise, assume that the likelihood is conditional on \(y_1\) and \(x_1\). Therefore \[ p(y_{2:n}|x_{2:n},\theta) \propto (\sigma^2)^{-(n-1)/2}\exp\left\{-\frac{1}{2\sigma^2} \sum_{t=2}^n (y_t - \rho y_{t-1} - \beta (x_t-\rho x_{t-1}))^2 \right\}, \] and assume that the prior for \(\theta=(\beta,\rho,\sigma^2)\) is \[ p(\beta,\rho,\sigma^2) \propto \frac{1}{\sigma^2}. \]
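The quasi-differencing identity above can also be checked numerically: for known \(\rho\), ordinary least squares of \(y_t - \rho y_{t-1}\) on \(x_t - \rho x_{t-1}\) (without intercept) recovers \(\beta\). A small simulated sketch, with our own choices of sample size and regressor:

```r
set.seed(2022)
nn <- 500; beta.true <- 3; rho.true <- 0.5
xx <- runif(nn, 1, 6)
uu <- numeric(nn)
for (t in 2:nn)
  uu[t] <- rho.true * uu[t - 1] + rnorm(1)
yy <- beta.true * xx + uu
# quasi-differenced variables, as in the identity above
ystar <- yy[2:nn] - rho.true * yy[1:(nn - 1)]
xstar <- xx[2:nn] - rho.true * xx[1:(nn - 1)]
coef(lm(ystar ~ xstar - 1))  # close to beta.true
```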

5 Contours of \(p(\beta,\rho|\mbox{data})\)

loglikelihoodprior = function(y,x,beta,rho,sig){
  # conditional log-likelihood (given y1 and x1) plus the log-prior,
  # where -2*log(sig) is log(1/sigma^2) evaluated at sigma
  n = length(y)
  sum(dnorm(y[2:n],rho*y[1:(n-1)]+beta*(x[2:n]-rho*x[1:(n-1)]),sig,log=TRUE))-2*log(sig)
}
M = 200
betas = seq(1,4.5,length=M)
rhos  = seq(-2,2,length=M)
sigs  = seq(0.1,4,length=M)
post  = array(0,c(M,M,M))
post2 = matrix(0,M,M)
for (i in 1:M)
  for (j in 1:M)
    for (k in 1:M)
      post[i,j,k] = loglikelihoodprior(y1,x,betas[i],rhos[j],sigs[k])
maxpost = max(post)
for (i in 1:M)
  for (j in 1:M)
    post2[i,j] = sum(exp(post[i,j,]-maxpost))

contour(betas,rhos,post2,xlab=expression(beta),ylab=expression(rho),nlevels=20,
        xlim=c(1,4),ylim=c(-1,2),drawlabels=FALSE)

6 SIR-based posterior inference for the case I dataset (\(\rho=0.5\))

M = 1000000
beta0 = runif(M,-5,5)
rho0 = runif(M,-3,3)
sig0 = runif(M,0.01,4)
w = rep(0,M)
for (i in 1:M)
  w[i] = loglikelihoodprior(y1,x,beta0[i],rho0[i],sig0[i])
w = exp(w-max(w))
k = sample(1:M,size=M,replace=TRUE,prob=w)
beta = beta0[k]
rho  = rho0[k]
sig  = sig0[k]
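With a uniform proposal this far from the posterior, the importance weights can be very uneven. A standard diagnostic, not part of the original analysis, is the effective sample size \(\mathrm{ESS} = (\sum_i w_i)^2/\sum_i w_i^2\): it equals \(M\) for equal weights and collapses toward 1 under severe weight degeneracy. The helper below is ours; apply `ess(w)` to the SIR weights before resampling.

```r
# effective sample size of a vector of (unnormalized) importance weights
ess <- function(w) sum(w)^2 / sum(w^2)

ess(rep(1, 100))  # equal weights: ESS equals the sample size, here 100
```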

par(mfrow=c(2,2))
hist(beta,main=expression(beta),ylab="Posterior",xlab="",prob=TRUE)
legend("topright",legend=c(
paste("5th perc.=",round(quantile(beta,0.05),2),sep=""),
paste("50th perc.=",round(quantile(beta,0.5),2),sep=""),
paste("95th perc.=",round(quantile(beta,0.95),2),sep="")),bty="n")                          
abline(v=3,lwd=2,col=2)
hist(rho,main=expression(rho),ylab="Posterior",xlab="",prob=TRUE)
legend("topright",legend=c(
paste("5th perc.=",round(quantile(rho,0.05),2),sep=""),
paste("50th perc.=",round(quantile(rho,0.5),2),sep=""),
paste("95th perc.=",round(quantile(rho,0.95),2),sep="")),bty="n")                          
hist(sig,main=expression(sigma),ylab="Posterior",xlab="",prob=TRUE)
legend("topright",legend=c(
paste("5th perc.=",round(quantile(sig,0.05),2),sep=""),
paste("50th perc.=",round(quantile(sig,0.5),2),sep=""),
paste("95th perc.=",round(quantile(sig,0.95),2),sep="")),bty="n")                          
abline(v=1,lwd=2,col=2)
plot(beta,rho,xlab=expression(beta),ylab=expression(rho))
contour(betas,rhos,post2,add=TRUE,col=3,nlevels=20,drawlabels=FALSE)
