[ABE-L] Fwd: ISBA-BNP webinar: Nhat Ho

Hedibert Lopes hedibert at gmail.com
Tue Nov 14 07:45:07 -03 2023



Begin forwarded message:

> From: ISBA-BNP Program Chair Jim Griffin <j.griffin at ucl.ac.uk>
> Date: November 14, 2023 at 6:48:06 AM GMT-3
> To: "Prof. Hedibert Lopes" <hedibert at gmail.com>
> Subject: ISBA-BNP webinar: Nhat Ho
> 
> 
> Dear all,
> 
> We are delighted to announce a new webinar in the series organized by BNP-ISBA, the Bayesian Nonparametrics section of ISBA.
>  
> The Zoom link for the webinar will be available the day before at https://bnp-isba.github.io/webinars.html, where you can also find the list of upcoming webinars (until February 2024).
> 
> DATE & TIME: 17:00 UTC on Wednesday, November 22, 2023. Note that 17:00 UTC corresponds to 12:00 US Eastern and 18:00 Central European Time.
> 
> SPEAKER: Nhat Ho (University of Texas at Austin)
> TITLE: On Excess Mass Behavior in Gaussian Mixture Models with Orlicz-Wasserstein Distances
> 
> Abstract: Dirichlet process mixture models (DPMM) in combination with Gaussian kernels have been an important modeling tool for numerous data domains in the biological, physical, and social sciences. However, this versatility in applications does not extend to strong theoretical guarantees for the underlying parameter estimates, for which only a logarithmic rate is achieved. In this work, we (re)introduce and investigate a metric, the Orlicz-Wasserstein distance, in the study of the Bayesian contraction behavior of the parameters. We show that, despite the overall slow convergence guarantees for all the parameters, posterior contraction for parameters happens at almost polynomial rates in outlier regions of the parameter space. Our theoretical results provide new insight into the convergence behavior of parameters arising in various settings of hierarchical Bayesian nonparametric models. In addition, we provide an algorithm to compute the metric by leveraging Sinkhorn divergences, and we validate our findings through a simulation study.
> 
> This talk is based on joint work with Aritra Guha and Long Nguyen.
> 
> Bio: Nhat Ho is currently an Assistant Professor of Statistics and Data Science at the University of Texas at Austin. He is a core member of the University of Texas at Austin Machine Learning Laboratory and senior personnel of the Institute for Foundations of Machine Learning. A central theme of his research focuses on four important aspects of complex and large-scale models and data: (1) interpretability, efficiency, and robustness of deep learning and complex machine learning models, including Transformer architectures, deep generative models, and convolutional neural networks; (2) scalability of optimal transport for machine learning and deep learning applications; (3) stability and optimality of optimization and sampling algorithms for solving complex statistical machine learning models; and (4) heterogeneity of complex data, including mixture and hierarchical models and Bayesian nonparametrics.
> 
> Best regards
> 
> --
> Jim Griffin
> ISBA - BNP Section
> Program Chair 2022-2023
> e-mail: j.griffin at ucl.ac.uk
>  
> 
> Sent from the International Society for Bayesian Analysis


More information about the abe mailing list