In this short paper, I consider the variable selection problem in linear regression models and review two objective Bayesian methods for which I have been developing R code. These two methods, namely fractional Bayes factors and intrinsic priors, are useful when models must be compared in the absence of substantive prior information. In particular, they are useful when many variables are available for selection, so that exponentially many models must be compared and subjective prior elicitation under each model is virtually impossible. A case of special interest, which ultimately motivates my work on this topic, is when the structure of an acyclic directed graph is to be learned from data; in this case the model space is even larger, because each graph corresponds to a family of linear regression models.
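To fix ideas, the fractional Bayes factor admits a closed form for Gaussian linear models. The following R sketch is illustrative only and is not the code developed by the author: it assumes the improper reference prior proportional to 1/sigma on (beta, sigma), follows O'Hagan's closed form for nested normal linear models, and uses the common (but not unique) choice of training fraction b = (p + 1)/n, corresponding to a minimal training sample.

```r
# Illustrative sketch (not the author's code): closed-form fractional
# Bayes factor for comparing two nested Gaussian linear models under
# the improper reference prior p(beta, sigma) proportional to 1/sigma.
# X1 and X0 are design matrices (including the intercept column);
# b is the training fraction, with n * b > max(ncol(X1), ncol(X0)).
log_fbf <- function(y, X1, X0, b) {
  n <- length(y)
  rss <- function(X) sum(lm.fit(X, y)$residuals^2)  # residual sum of squares
  p1 <- ncol(X1)
  p0 <- ncol(X0)
  # Log fractional Bayes factor of model 1 versus model 0.
  lgamma((n - p1) / 2) - lgamma((n * b - p1) / 2) -
    lgamma((n - p0) / 2) + lgamma((n * b - p0) / 2) -
    (n * (1 - b) / 2) * (log(rss(X1)) - log(rss(X0)))
}

# Toy usage: model 1 (intercept plus covariate) versus model 0 (intercept only).
set.seed(1)
n <- 50
x <- rnorm(n)
y <- 1 + 2 * x + rnorm(n)
X1 <- cbind(1, x)
X0 <- matrix(1, n, 1)
b <- (ncol(X1) + 1) / n       # minimal training fraction (one convention)
log_fbf(y, X1, X0, b)         # large positive: strong evidence for model 1
```

Working on the log scale avoids numerical overflow when the evidence is strong, which is typical once many candidate models differ sharply in fit.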
Objective Bayesian comparison of linear regression models / LA ROCCA, Luca. - ELECTRONIC. - (2011), pp. 1-4. (Paper presented at the 8th International Meeting of the Classification and Data Analysis Group of the Italian Statistical Society (CLADAG 2011), held in Pavia, 7-9 September 2011).