Description:
The Bayesian approach is attracting increasing interest in several applied fields, since it allows borrowing of information, coherent uncertainty quantification, shrinkage, inclusion of prior knowledge, modeling of complex dependence structures, and tractable inference via the posterior distribution. This session on JUNIOR CONTRIBUTIONS TO BAYESIAN MODELING AND INFERENCE hosts talks by junior researchers from Latin America who are working on advances in Bayesian inference from different perspectives, covering sampling, regression, and high-dimensional sparse models.
Organizer: j-ISBA Section
Isadora Antoniano, Bocconi University, Italy
Speakers:
Title:
A Bayesian approach to differential recruitment with respondent-driven sampling data
Abstract:
Respondent-driven sampling (RDS) is a sampling mechanism that has proven very effective for sampling hard-to-reach human populations connected through social networks. A small number of individuals, typically known to the researcher, are initially sampled and asked to recruit a small fixed number of their contacts who are also members of the target population. Subsequent sampling waves are produced by peer recruitment until the desired sample size is achieved. However, the researcher's lack of control over the sampling process has posed several challenges to producing valid statistical inference from RDS data. For instance, participants are generally assumed to recruit completely at random among their contacts, despite growing empirical evidence that suggests otherwise and the substantial sensitivity of most RDS estimators to this assumption. Our main contributions are to parameterize an alternative recruitment behavior and to propose a Bayesian estimator that corrects for nonrandom recruitment.
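To make the recruitment mechanism concrete, the short simulation below is a minimal, purely illustrative Python sketch: participants hand out a fixed number of coupons and choose among their not-yet-recruited contacts with odds inflated by a factor exp(alpha) when the contact shares their trait. The network, the binary trait, the parameter alpha, and the function rds_sample are our own assumptions for exposition and do not represent the speaker's parameterization of differential recruitment; alpha = 0 recovers the usual recruitment-completely-at-random assumption.

# Toy respondent-driven sampling with differential recruitment (illustration only).
import math
import random

random.seed(1)

# A small random contact network with a binary trait per node (hypothetical data).
n_nodes = 200
trait = [random.random() < 0.3 for _ in range(n_nodes)]
adj = {i: set() for i in range(n_nodes)}
for i in range(n_nodes):
    for j in random.sample(range(n_nodes), 8):
        if j != i:
            adj[i].add(j)
            adj[j].add(i)

def rds_sample(seeds, n_coupons=3, target=80, alpha=1.0):
    # Each participant recruits up to n_coupons contacts; a contact sharing the
    # recruiter's trait is chosen with probability proportional to exp(alpha),
    # others proportional to 1, so alpha = 0 means recruitment at random.
    sampled, in_sample, wave = list(seeds), set(seeds), list(seeds)
    while wave and len(sampled) < target:
        next_wave = []
        for r in wave:
            eligible = [c for c in adj[r] if c not in in_sample]
            for _ in range(min(n_coupons, len(eligible))):
                weights = [math.exp(alpha) if trait[c] == trait[r] else 1.0
                           for c in eligible]
                u, acc, pick = random.random() * sum(weights), 0.0, eligible[-1]
                for c, w in zip(eligible, weights):
                    acc += w
                    if u <= acc:
                        pick = c
                        break
                eligible.remove(pick)
                in_sample.add(pick)
                sampled.append(pick)
                next_wave.append(pick)
                if len(sampled) >= target:
                    return sampled
        wave = next_wave
    return sampled

sample = rds_sample(seeds=[0, 1, 2], alpha=1.5)
print(len(sample), sum(trait[i] for i in sample) / len(sample))

With alpha > 0 the trait composition of the sample can drift away from the population value, illustrating the kind of bias that a Bayesian correction for nonrandom recruitment targets.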
Title:
Extensions of Bayesian dynamic quantile linear models
Abstract:
Our initial aim is to present a new class of models, named dynamic quantile linear models, which combines dynamic linear models with distribution-free quantile regression to produce a robust statistical method. This class of models provides richer information on the effects of the predictors than traditional mean regression does, and it is robust to heteroscedasticity and outliers, accommodating the non-normal errors often encountered in practical applications. Bayesian inference for quantile regression proceeds by forming the likelihood function based on the asymmetric Laplace distribution, and a location-scale mixture representation of this distribution yields analytical expressions for the conditional posterior densities of the model. Thus, Bayesian inference for dynamic quantile linear models can be performed using an efficient Markov chain Monte Carlo algorithm or a fast sequential procedure suited to high-dimensional predictive modeling applications with massive data. The purpose of this work is to extend this class of models to account for structural features in the data, such as functional, time-dependent, and multivariate components. This will be done using hierarchical dynamic quantile linear models.
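For reference, the location-scale mixture mentioned above is commonly written as follows (e.g., Kozumi and Kobayashi, 2011); the notation here is ours and the speakers' formulation may differ in its details. For the p-th quantile,

\[
y_t = \mathbf{x}_t^{\top}\boldsymbol{\beta}_t + \theta v_t + \tau\sqrt{\sigma v_t}\,\varepsilon_t,
\qquad v_t \sim \mathrm{Exp}(\text{mean } \sigma), \quad \varepsilon_t \sim \mathrm{N}(0,1),
\]
\[
\theta = \frac{1-2p}{p(1-p)}, \qquad \tau^2 = \frac{2}{p(1-p)},
\]

so that, conditionally on the latent v_t, the observation equation is Gaussian and the standard dynamic linear model updates (and hence Gibbs sampling or sequential filtering steps) can be applied directly.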
Title:
Decoupling Shrinkage and Selection in Bayesian Factor Analysis
Abstract:
Informative sparsity-inducing priors have become a popular option for variable selection in a variety of statistical models. Despite their interpretability and ease of application, there is still disagreement on how to determine whether a parameter is truly null a posteriori. In this context, the DSS ("Decoupling Shrinkage and Selection") approach appears as an alternative that preserves the posterior information while providing an optimal selection of variables. In this work, we extend the DSS methodology to the Gaussian linear factor analysis model in order to obtain a sparse loading matrix, setting to zero the loadings that are not relevant to the model. To perform this selection, we introduce a utility function and a post-inference procedure. We illustrate our findings with simulations and an application in finance. The procedure is also used to explore the number of factors in the model.
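For orientation, in the linear regression setting DSS (Hahn and Carvalho, 2015) summarizes the posterior by minimizing the posterior expected loss

\[
\mathcal{L}(\boldsymbol{\gamma}) \;=\; \lambda \lVert \boldsymbol{\gamma} \rVert_0
\;+\; n^{-1}\,\mathbb{E}\!\left[\lVert X\boldsymbol{\beta} - X\boldsymbol{\gamma}\rVert_2^2 \,\middle|\, \mathbf{y}\right],
\]

which, up to an additive constant, penalizes the distance between the sparse fit \(X\boldsymbol{\gamma}\) and the posterior mean fit \(X\bar{\boldsymbol{\beta}}\), with the \(\ell_0\) term typically relaxed to an adaptive \(\ell_1\) penalty in practice. The factor-analysis version discussed in this talk replaces the regression fit with the fit implied by the loading matrix; the specific utility function and post-inference procedure are the speaker's own and are not reproduced here.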