<div dir="ltr">


















<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Good afternoon, everyone.<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt">I would like to announce a doctoral-level course that I will be teaching over the coming weeks here at Insper.  Details below.  Those interested should contact me to check feasibility.<span></span></span></p><p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt"><br></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt"><a href="http://hedibert.org/current-teaching/#tab-BayesianEconometrics-2018" style="color:rgb(5,99,193);text-decoration:underline"><span lang="PT-BR">http://hedibert.org/current-teaching/#tab-BayesianEconometrics-2018</span></a><span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-family:Calibri,sans-serif"><span style="font-size:21.3333px">Thanks and best regards,</span></p><p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Hedibert</span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt"> </span><br></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt">Objective<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt">The goal of the course is to enable the
student to decide critically among a Bayesian approach, a frequentist approach, or a
Bayesian-frequentist compromise when facing real-world problems in the fields
of micro- and macro-econometrics and finance, as well as in quantitative
marketing, strategy and business administration.<span>  </span>With this end in mind, we will visit well-known
Bayesian issues, such as prior specification, model comparison and
model averaging, but also study regularization via the Bayesian LASSO,
spike-and-slab and related schemes, “small n, large p” issues, and Bayesian statistical
learning via additive regression trees, random forests, large-scale VAR and
(dynamic) factor models.<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt">Course description<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt">Basic ingredients: prior, posterior and
predictive distributions, sequential Bayes, conjugate analysis,
exchangeability, principles of data reduction and decision theory.<span>  </span>Model criticism: Bayes factors, computing
marginal likelihoods, the Savage-Dickey ratio, reversible-jump MCMC, Bayesian model
averaging and the deviance information criterion.<span> 
</span>Modern computation via (Markov chain) Monte Carlo methods: Monte Carlo
integration, sampling-importance resampling, the Gibbs sampler and Metropolis-Hastings
algorithms.<span>  </span>Mixture models, hierarchical
models, Bayesian regularization, instrumental-variables modeling, large-scale
(sparse) factor modeling, Bayesian additive regression trees (BART) and related
topics, dynamic models, sequential Monte Carlo algorithms, and Bayesian methods in
microeconometrics, macroeconometrics, marketing and finance.<span></span></span></p>
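As a minimal illustration of the conjugate analysis and sequential Bayes mentioned above (a hypothetical sketch, not part of the course materials), consider the Beta-Binomial model, where the posterior from one batch of data serves as the prior for the next:

```python
# Beta-Binomial conjugate update (illustrative sketch only).
# Prior: theta ~ Beta(a, b); data: y successes in n Bernoulli trials.
# Posterior: theta | y ~ Beta(a + y, b + n - y).

def beta_binomial_update(a, b, y, n):
    """Return posterior Beta parameters after observing y successes in n trials."""
    return a + y, b + (n - y)

# Sequential Bayes: updating batch by batch gives the same posterior
# as a single update with the pooled data (a consequence of exchangeability).
a, b = 1.0, 1.0                               # uniform Beta(1, 1) prior
a1, b1 = beta_binomial_update(a, b, 3, 10)    # first batch: 3 successes in 10
a2, b2 = beta_binomial_update(a1, b1, 7, 10)  # second batch: 7 successes in 10
print((a2, b2))                               # identical to one update with 10/20
```

The posterior mean, (a + y) / (a + b + n), shows how the prior is pulled toward the sample proportion as n grows.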

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Part
I Bayesian ingredients<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Inference:
likelihood, prior, predictive and posterior distributions<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Model
criticism: Marginal likelihoods, Bayes factor, model averaging and decision
theory<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Computation:
An introduction to (Markov chain and sequential) Monte Carlo methods<span></span></span></p>
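To give a flavor of the Monte Carlo computation covered in Part I (a hypothetical sketch, not course material), here is a random-walk Metropolis sampler targeting a standard normal density, which needs only the log-density up to an additive constant:

```python
import math
import random

def log_target(x):
    # log N(0, 1) density up to an additive constant
    return -0.5 * x * x

def metropolis(n_iter=20000, step=1.0, seed=42):
    """Random-walk Metropolis: symmetric Gaussian proposal, so the
    acceptance ratio reduces to the target density ratio."""
    rng = random.Random(seed)
    x, draws = 0.0, []
    for _ in range(n_iter):
        proposal = x + rng.gauss(0.0, step)
        # accept with probability min(1, target(proposal) / target(x))
        if math.log(rng.random()) < log_target(proposal) - log_target(x):
            x = proposal
        draws.append(x)          # on rejection, the current state is repeated
    return draws

draws = metropolis()
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
print(round(mean, 2), round(var, 2))  # sample moments should be near 0 and 1
```

The same accept/reject skeleton extends to posterior sampling: replace `log_target` with the log-posterior (log-likelihood plus log-prior).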

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Part
II Multivariate models<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Large-scale
vector autoregressive models<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Factor
models and other dimension reduction models<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Time-varying
high-dimensional covariance models<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt"><span> </span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Part
III Modern Bayesian statistical learning<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Mixture
models and the Dirichlet process: handling non-Gaussian models<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Regularization:
sparsity via shrinkage and variable selection<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt">Large vector-autoregressive and factor models:
combining sparsity and parsimony<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span lang="PT-BR" style="font-size:16pt">Classification and support vector machines<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Regression
trees and random forests<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span style="font-size:16pt">Latent
Dirichlet allocation: text as data, text mining<span></span></span></p>

<p class="MsoNormal" style="margin:0cm 0cm 0.0001pt;font-size:12pt;font-family:Calibri,sans-serif"><span> </span></p>





</div>