Bayesian Linear Regression
Bayesian Linear Regression
In regression analysis, we look at the conditional distribution of the response variable at different levels of a predictor variable
Response variable
- Also called “dependent” or “outcome” variable
- What we want to explain or predict
- In simple linear regression, the response variable is continuous
Predictor variables
- Also called “independent” variables or “covariates”
- In simple linear regression, the predictor variable is usually also continuous
- Which variable we treat as the response and which as the predictor depends on our research question
Bayesian Linear Regression
Quick review of linear functions
Y is a response variable that is a linear function of the predictor variable:

Y = β0 + β1X

β0: intercept; the value of Y when X = 0
β1: slope; how much Y changes when X increases by 1 unit
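As a quick numeric illustration of the two coefficients (hypothetical values β0 = 2.0, β1 = 0.5):

```python
# Linear function Y = beta0 + beta1 * X with hypothetical coefficients.
beta0, beta1 = 2.0, 0.5

def y(x):
    return beta0 + beta1 * x

print(y(0))         # 2.0 -> the intercept: Y when X = 0
print(y(3) - y(2))  # 0.5 -> the slope: change in Y per 1-unit increase in X
```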
Intro to Bayesian simple linear regression
Likelihood:

y_i | β0, β1, σ², x_i ~ N(β0 + β1x_i, σ²),  i = 1, ..., n

Prior distributions:

β_j ~ N(h_j0, σ_j0²),  j = 1, ..., J
σ² ~ IG(u, v)

The posterior distribution is not straightforward to obtain analytically, so we implement MCMC techniques with WinBUGS.
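Because the posterior is sampled rather than derived, MCMC is the workhorse here. The sketch below is a minimal Gibbs sampler in Python (numpy standing in for WinBUGS), assuming synthetic data and hypothetical hyperparameters: prior means h_j0 = 0 with prior variances 100, and u = 2, v = 1 for the IG prior:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (assumed for illustration): y_i ~ N(b0 + b1*x_i, sigma2)
n = 200
x = rng.uniform(0, 10, n)
y = 1.0 + 2.0 * x + rng.normal(0, 1.0, n)

X = np.column_stack([np.ones(n), x])   # design matrix with intercept column
h0 = np.zeros(2)                       # prior means for (beta0, beta1)
V0inv = np.linalg.inv(np.diag([100.0, 100.0]))  # inverse of vague prior covariance
u, v = 2.0, 1.0                        # IG(u, v) hyperparameters (hypothetical)

XtX, Xty = X.T @ X, X.T @ y
beta, sigma2 = np.zeros(2), 1.0
draws = np.empty((2000, 3))
for t in range(2000):
    # beta | sigma2, y : conjugate multivariate-normal update
    Vn = np.linalg.inv(V0inv + XtX / sigma2)
    mn = Vn @ (V0inv @ h0 + Xty / sigma2)
    beta = rng.multivariate_normal(mn, Vn)
    # sigma2 | beta, y : inverse-gamma update (sample via 1/Gamma)
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(u + n / 2, 1.0 / (v + 0.5 * resid @ resid))
    draws[t] = [beta[0], beta[1], sigma2]

burn = draws[500:]
print(burn.mean(axis=0))   # posterior means, approximately (1.0, 2.0, 1.0)
```

Each sweep alternates the two conjugate full conditionals: β | σ², y is normal and σ² | β, y is inverse-gamma, which is the structure a Gibbs sampler (and WinBUGS) can exploit for this model.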
Examples
Willingness to Pay for Environmental Improvements in a Large City. For example, we can study the social benefits of a set of environmental and urban improvements planned for the waterfront of the City of Valencia (Spain):
Response variable:
How much are you willing to pay for this policy?
Covariates: Sex, Age, Income
Data: 80 individuals
[WinBUGS Doodle graph of the model: plates for(i in 1:n) and for(j in 1:4); nodes u, v, h, h0[j] (declared as a constant), beta0[j], beta[j], sigma2, sigma, mean[i], and the data nodes income[i], age[i], sex[i], WTP[i].]
Examples
Random Utility Model
Probit Model
Logit Model
Discrete choice experiment
Objectives:
1. Revealed preference models use random utilities
2. Probit models assume that utilities are multivariate normal
3. Probit MCMC generates latent, random utilities
4. Logit models assume that the random utilities have extreme value distributions.
5. Logit MCMC uses the Metropolis-Hastings algorithm
Random Utility Model
Utility for alternative m is:

Y_ijm = x_ijm'β + ε_ijm,  i = 1, ..., n;  j = 1, ..., J_i;  m = 1, ..., M+1

where there are n subjects (sample),
M+1 alternatives in the choice set, and
J_i choice occasions for subject i.
Subject picks alternative k if

Y_ijk ≥ Y_ijm  for all m

The probability of selecting k is

P(Y_ijk ≥ Y_ijm  for all m)

Statistical models:
- {ε_ijm} Normal: Probit Model
- {ε_ijm} Extreme Value: Logit Model
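The two error assumptions can be illustrated by simulating the random-utility mechanism directly. The sketch below assumes a hypothetical 3-alternative choice set with made-up systematic utilities:

```python
import numpy as np

rng = np.random.default_rng(1)

# Random-utility simulation (toy setup): 3 alternatives,
# utility = systematic part V + error; subject picks the argmax.
n, M = 10000, 3
V = np.array([0.0, 0.5, 1.0])          # hypothetical systematic utilities

# Probit-style errors: normal (iid standard normal here for simplicity)
choice_probit = np.argmax(V + rng.normal(size=(n, M)), axis=1)

# Logit-style errors: extreme value (Gumbel)
choice_logit = np.argmax(V + rng.gumbel(size=(n, M)), axis=1)

# Under Gumbel errors the selection probabilities are multinomial logit:
p_logit = np.exp(V) / np.exp(V).sum()
freq = np.bincount(choice_logit, minlength=M) / n
print(p_logit)   # closed-form logit probabilities
print(freq)      # simulated frequencies, close to p_logit
```

Under extreme value errors the choice probabilities have the closed form exp(V_k)/Σ_m exp(V_m); under normal errors there is no closed form, which is why probit MCMC works with the latent utilities instead.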
Logit model in WinBUGS
Likelihood:

y_i ~ Bernoulli(p_i)
logit(p_i) = ln( p_i / (1 - p_i) ) = β0 + β1X_1i + ... + β_kX_ki

Prior distributions:

β_j ~ N(h_j0, σ_j0²),  j = 1, ..., J
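As noted in the objectives, logit MCMC uses Metropolis-Hastings, because the logit full conditionals are not conjugate. A minimal random-walk Metropolis sketch in Python (numpy in place of WinBUGS; synthetic data and a hypothetical vague N(0, 100) prior on each β_j):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic logit data (assumed for illustration)
n = 500
x = rng.normal(size=n)
true_beta = np.array([-0.5, 1.5])
p = 1 / (1 + np.exp(-(true_beta[0] + true_beta[1] * x)))
yobs = rng.binomial(1, p)
X = np.column_stack([np.ones(n), x])

def log_post(beta):
    eta = X @ beta
    # Bernoulli log-likelihood + N(0, 100) log-prior (up to a constant)
    loglik = yobs @ eta - np.logaddexp(0.0, eta).sum()
    return loglik - 0.5 * (beta @ beta) / 100.0

beta, lp = np.zeros(2), log_post(np.zeros(2))
draws = []
for t in range(5000):
    prop = beta + rng.normal(scale=0.15, size=2)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept step
        beta, lp = prop, lp_prop
    draws.append(beta)

post = np.array(draws)[1000:]
print(post.mean(axis=0))   # approximately (-0.5, 1.5)
```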
Logit model in WinBUGS
Example: Discrete choice experiment
To study the value that car consumers place upon environmental concerns when purchasing a car
Response variable: Yes/No
Attributes: safety (yes/no), carbon dioxide emissions, acceleration from 0 to 100 km/h (< 10 sec and < 7.5 sec), second hand, and annual cost (900€, 1400€, 2000€).
Sample size: 150
Probit model in WinBUGS
Likelihood:

y_i ~ Bernoulli(p_i)
Φ^(-1)(p_i) = β0 + β1X_1i + ... + β_kX_ki

Prior distributions:

β_j ~ N(h_j0, σ_j0²),  j = 1, ..., J
σ² ~ IG(u, v)
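For the probit model, the latent-utility representation (objective 3 above) gives a clean Gibbs scheme in the style of Albert and Chib's data augmentation: sample each latent utility z_i from a truncated normal, then update β with a conjugate normal step. A sketch with toy data and a hypothetical N(0, 100) prior; for simplicity the error variance is fixed at 1 (the usual probit identification) rather than carrying the IG prior:

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(3)

# Synthetic probit data (assumed for illustration)
n = 400
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
true_beta = np.array([0.3, 1.0])
yobs = (X @ true_beta + rng.normal(size=n) > 0).astype(int)

V0inv = np.eye(2) / 100.0          # vague N(0, 100) priors on beta
Vn = np.linalg.inv(V0inv + X.T @ X)  # posterior covariance (error variance = 1)

beta = np.zeros(2)
draws = []
for t in range(2000):
    mu = X @ beta
    # Latent utilities: z_i ~ N(mu_i, 1), truncated to z>0 if y=1, z<0 if y=0
    lo = np.where(yobs == 1, -mu, -np.inf)
    hi = np.where(yobs == 1, np.inf, -mu)
    z = mu + truncnorm.rvs(lo, hi, size=n, random_state=rng)
    # beta | z : conjugate normal update
    beta = rng.multivariate_normal(Vn @ (X.T @ z), Vn)
    draws.append(beta)

post = np.array(draws)[500:]
print(post.mean(axis=0))   # approximately (0.3, 1.0)
```

Every full conditional is a standard distribution, so no Metropolis step is needed; this is the "Probit MCMC generates latent, random utilities" idea from the objectives.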
Hierarchical Logit
The hierarchical logistic regression model is a straightforward extension of the standard logit model.
Likelihood:

y_ij ~ Bernoulli(p_ij)
logit(p_ij) <- b_1j + b_2j x_2ij + … + b_kj x_kij

Priors:

b_jk ~ N(B_jk, T_k)  for all j, k
B_jk <- γ_k1 + γ_k2 z_j2 + … + γ_km z_jm
γ_qr ~ N(0, .001)  for all q, r
T_k ~ Gamma(.01, .01)
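A short simulation of this prior hierarchy (hypothetical sizes and γ values; recall that WinBUGS parameterizes the normal by its precision, so N(0, .001) is a vague prior with variance 1000, and T_k is a precision):

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical sizes: J = 50 groups, K = 2 coefficients per group,
# M = 2 group-level covariates z (including an intercept column).
J, K, M = 50, 2, 2
z = np.column_stack([np.ones(J), rng.normal(size=J)])  # group-level design

gamma = np.array([[0.5, 1.0],    # hypothetical gamma_k. rows map z_j to B_jk
                  [-0.3, 0.2]])
tau = np.array([4.0, 4.0])       # precisions T_k, i.e. sd = 1/sqrt(4) = 0.5

B = z @ gamma.T                  # B[j, k] = gamma_k1 + gamma_k2 * z_j2
b = B + rng.normal(size=(J, K)) / np.sqrt(tau)   # b_jk ~ N(B_jk, 1/T_k)

print(B.shape, b.shape)           # (50, 2) (50, 2)
print(np.abs(b - B).std(axis=0))  # close to 1/sqrt(tau) = 0.5
```

The group-level regression B_jk pools the coefficient vectors toward a common structure, while the precisions T_k control how tightly each coefficient is shrunk.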