Statistical Theory and Related Fields, 2019
2012
This paper presents a refinement of the Bayesian Information Criterion (BIC). While the original BIC selects models on the basis of complexity and fit, the so-called prior-adapted BIC allows us to choose among statistical models that differ on three scores: fit, complexity, and model size. The prior-adapted BIC can therefore accommodate comparisons among statistical models that differ only in the admissible parameter space, e.g., for choosing among models with different constraints on the parameters.
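As background for the abstract above: plain BIC trades off fit against complexity as −2 ln L̂ + k ln n, and the prior-adapted variant adds a model-size term not sketched here. A minimal illustration of ordinary BIC-based selection over polynomial degrees (data and candidate models are purely illustrative):

```python
import numpy as np

def bic(n, k, rss):
    """Gaussian BIC up to a constant: n*ln(rss/n) + k*ln(n)."""
    return n * np.log(rss / n) + k * np.log(n)

rng = np.random.default_rng(0)
n = 200
x = np.linspace(-1, 1, n)
y = 1.0 + 2.0 * x + rng.normal(0, 0.3, n)    # true model is linear

scores = {}
for degree in range(4):                       # candidate polynomial degrees 0..3
    X = np.vander(x, degree + 1)
    beta, rss, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = rss[0] if rss.size else np.sum((y - X @ beta) ** 2)
    scores[degree] = bic(n, degree + 1, rss)

best = min(scores, key=scores.get)
print(best)    # the linear model should typically win
```

The ln n penalty is what makes BIC consistent for a fixed candidate set; the abstracts below revisit exactly this penalty under MCMC output, composite likelihoods, and informative priors.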
2017
Markov chain Monte Carlo (MCMC) has been an indispensable tool for Bayesian analysis of complex statistical models, even for high-dimensional problems. However, a consistent criterion for selecting models based on MCMC output is still lacking; the existing deviance information criterion (DIC) is known to be inconsistent and not invariant under reparameterization. This paper proposes an Average BIC-like (ABIC) model selection criterion and an Average EBIC-like (AEBIC) model selection criterion for low- and high-dimensional problems, respectively; establishes their consistency under mild conditions; and illustrates their applications using generalized linear models. The proposed criteria overcome these shortcomings of DIC. The numerical results indicate that the proposed criteria can significantly outperform DIC as well as MLE-based criteria, such as AIC, BIC, and EBIC, in terms of model selection accuracy.
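The DIC this abstract criticizes is computed from posterior draws alone, which is what makes it convenient despite its inconsistency: DIC = D̄ + p_D with p_D = D̄ − D(θ̄). A minimal sketch for a Gaussian mean model, with direct posterior draws standing in for an MCMC chain (all values illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.0, size=100)            # data, noise sd known to be 1
n = y.size

# Posterior of the mean under a flat prior is N(ybar, 1/n); draws from it
# stand in for an MCMC chain here.
post = rng.normal(y.mean(), 1.0 / np.sqrt(n), size=5000)

def deviance(mu):
    """-2 * Gaussian log-likelihood with sigma = 1."""
    return np.sum((y - mu) ** 2) + n * np.log(2 * np.pi)

d_bar = np.mean([deviance(m) for m in post])   # posterior mean deviance
d_hat = deviance(post.mean())                  # deviance at the posterior mean
p_d = d_bar - d_hat                            # effective number of parameters
dic = d_bar + p_d
print(round(p_d, 2))                           # near 1 for this one-parameter model
```

Note that d_hat depends on the parameterization through the posterior mean, which is the source of the non-invariance the abstract mentions; the ABIC/AEBIC constructions are not sketched here.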
Statistical Theory and Related Fields, 2019
We present a new approach to model selection and Bayes factor determination, based on Laplace expansions (as in BIC), which we call Prior-based Bayes Information Criterion (PBIC). In this approach, the Laplace expansion is only done with the likelihood function, and then a suitable prior distribution is chosen to allow exact computation of the (approximate) marginal likelihood arising from the Laplace approximation and the prior. The result is a closed-form expression similar to BIC, but now involves a term arising from the prior distribution (which BIC ignores) and also incorporates the idea that different parameters can have different effective sample sizes (whereas BIC only allows one overall sample size n). We also consider a modification of PBIC which is more favorable to complex models.
Myriad model selection criteria (Bayesian and frequentist) have been proposed in the literature, aiming at selecting a single model regardless of its intended use. An honorable exception in the frequentist perspective is the "focused information criterion" (FIC), which aims at selecting a model based on the parameter of interest (the focus). This paper takes the same view in the Bayesian context; that is, a model may be good for one estimand but bad for another. The proposed method exploits the Bayesian model averaging (BMA) machinery to obtain a new criterion, the focused Bayesian model averaging (FoBMA), for which the best model is the one whose estimate is closest to the BMA estimate. In particular, for two models, this criterion reduces to the classical Bayesian model selection scheme of choosing the model with the highest posterior probability. The new method is applied in linear regression, logistic regression, and survival analysis. The criterion is especially important in epidemiological studies, in which the objective is often to determine a risk factor (focus) for a disease while adjusting for potential confounding factors.
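The FoBMA rule described above reduces to elementary arithmetic once posterior model probabilities and per-model estimates of the focus are in hand. A sketch with hypothetical numbers (the probabilities and estimates below are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical posterior model probabilities and per-model estimates of one
# focus parameter (e.g. a regression coefficient).
post_prob = np.array([0.50, 0.35, 0.15])
estimates = np.array([1.10, 0.90, 0.40])

bma = np.dot(post_prob, estimates)               # BMA estimate of the focus
best = int(np.argmin(np.abs(estimates - bma)))   # FoBMA pick: closest to BMA
print(best, round(bma, 3))
```

With only two models the BMA estimate lies between the two candidate estimates, nearer the one with the larger weight, so "closest to BMA" coincides with "highest posterior probability", matching the reduction the abstract states.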
Dalle Molle Institute for Perceptual Artificial Intelligence • P.O. Box 592 • Martigny • Valais • Switzerland • phone +41-27-721 77 11 • fax +41-27-721 77 12 • e-mail secretariat@idiap.ch • internet http://www.idiap.ch
2015
We propose information criteria that measure the prediction risk of a predictive density based on the Bayesian marginal likelihood from a frequentist point of view. We derive criteria for selecting variables in linear regression models, assuming a prior distribution on the regression coefficients, and then discuss the relationship between the proposed criteria and related criteria. Our method has three advantages. First, it is a compromise between the frequentist and Bayesian standpoints because it evaluates the frequentist's risk of the Bayesian model; it is therefore less influenced by prior misspecification. Second, the criteria exhibit consistency in selecting the true model. Third, when a uniform prior is assumed for the regression coefficients, the resulting criterion is equivalent to the residual information criterion (RIC) of Shi and Tsai (2002).
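The marginal-likelihood machinery underlying criteria of this kind can be illustrated in its simplest Gaussian form: with β ~ N(0, τ²I) and known noise variance σ², y is marginally N(0, σ²I + τ²XXᵀ), so candidate predictor subsets can be scored in closed form. This is only a sketch of the general setup, not the authors' criterion; σ², τ², and the data below are illustrative:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(2)
n, p = 100, 4
X = rng.normal(size=(n, p))
beta_true = np.array([1.5, -2.0, 0.0, 0.0])   # only the first two predictors act
y = X @ beta_true + rng.normal(0, 1.0, n)

def log_marginal(Xs, y, sigma2=1.0, tau2=1.0):
    """log N(y | 0, sigma2*I + tau2*Xs Xs^T): the marginal likelihood under
    beta ~ N(0, tau2*I) with known noise variance (an illustrative choice)."""
    C = sigma2 * np.eye(len(y)) + tau2 * (Xs @ Xs.T)
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y) + len(y) * np.log(2 * np.pi))

best_subset, best_score = None, -np.inf
for k in range(1, p + 1):
    for subset in combinations(range(p), k):
        score = log_marginal(X[:, list(subset)], y)
        if score > best_score:
            best_subset, best_score = subset, score
print(best_subset)    # the active predictors 0 and 1 should be included
```

The paper's criteria additionally evaluate the frequentist prediction risk of such Bayesian scores; that layer is not reproduced here.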
2011
Geostatistical analyses require an estimation of the covariance structure of a random field and its parameters jointly from noisy data. Whereas in some cases (as in that of a Matérn variogram) a range of structural models can be captured with one or a few parameters, in many other cases it is necessary to consider a discrete set of structural model alternatives, such as drifts and variograms. Ranking these alternatives and identifying the best among them has traditionally been done with the aid of information theoretic or Bayesian model selection criteria. There is an ongoing debate in the literature about the relative merits of these various criteria. We contribute to this discussion by using synthetic data to compare the abilities of two common Bayesian criteria, BIC and KIC, to discriminate between alternative models of drift as a function of sample size when drift and variogram parameters are unknown. Adopting the results of Markov Chain Monte Carlo simulations as reference we confirm that KIC reduces asymptotically to BIC and provides consistently more reliable indications of model quality than does BIC for samples of all sizes. Practical considerations often cause analysts to replace the observed Fisher information matrix entering into KIC with its expected value. Our results show that this causes the performance of KIC to deteriorate with diminishing sample size. These results are equally valid for one and multiple realizations of uncertain data entering into our analysis. Bayesian theory indicates that, in the case of statistically independent and identically distributed data, posterior model probabilities become asymptotically insensitive to prior probabilities as sample size increases. We do not find this to be the case when working with samples taken from an autocorrelated random field.
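The BIC/KIC contrast discussed above hinges on the Fisher information term that KIC carries and BIC drops. As a rough sketch only: one simplified form of KIC, ignoring the prior-density term and using the observed information, is −2 ln L̂ + k ln(n/2π) + ln|F̂|. The Gaussian example below uses the closed-form observed information at the MLE; the exact KIC variant in the paper may differ:

```python
import numpy as np

rng = np.random.default_rng(4)
y = rng.normal(5.0, 2.0, size=40)
n = y.size

mu, s2 = y.mean(), y.var()                     # Gaussian MLEs (ddof=0)
nll = 0.5 * n * (np.log(2 * np.pi * s2) + 1)   # -log-likelihood at the MLE
k = 2                                          # parameters: mu and sigma^2

# Observed Fisher information of (mu, sigma^2) at the MLE (block diagonal).
F = np.diag([n / s2, n / (2 * s2**2)])

bic = 2 * nll + k * np.log(n)
kic = 2 * nll + k * np.log(n / (2 * np.pi)) + np.linalg.slogdet(F)[1]
print(round(bic, 2), round(kic, 2))
```

Replacing F with its expected value is the substitution whose small-sample cost the abstract documents; here the two coincide, but in the drift/variogram models of the paper they do not.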
Journal of the American Statistical Association, 2010
For high-dimensional data sets with complicated dependency structures, the full likelihood approach often leads to intractable computational complexity. This complicates model selection, as most traditionally used information criteria require evaluation of the full likelihood. We propose a composite likelihood version of the Bayesian information criterion (BIC) and establish its consistency for selecting the true underlying model. Under some mild regularity conditions, the proposed BIC is shown to be selection consistent, where the number of potential model parameters is allowed to increase to infinity at a certain rate of the sample size. Simulation studies demonstrate the empirical performance of this new BIC criterion, especially in the scenario where the number of parameters increases with the sample size.
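Pairwise composite likelihood replaces the joint density with a sum of bivariate log-densities, and a BIC-type penalty is then attached to it. A sketch for choosing between independence and an exchangeable-correlation Gaussian model (note: the paper's criterion uses an effective dimension based on the composite score; the naive parameter count below is a simplification):

```python
import numpy as np

rng = np.random.default_rng(3)
n, d = 300, 5
rho_true = 0.5
cov = np.full((d, d), rho_true) + (1 - rho_true) * np.eye(d)  # exchangeable
y = rng.multivariate_normal(np.zeros(d), cov, size=n)

def pair_loglik(a, b, rho):
    """Bivariate standard-normal log-density with correlation rho, summed over data."""
    q = (a**2 - 2 * rho * a * b + b**2) / (1 - rho**2)
    return np.sum(-0.5 * q - 0.5 * np.log(1 - rho**2) - np.log(2 * np.pi))

def composite_loglik(y, rho):
    return sum(pair_loglik(y[:, i], y[:, j], rho)
               for i in range(d) for j in range(i + 1, d))

# Maximise the pairwise likelihood over a grid (crude but transparent).
grid = np.linspace(-0.9, 0.9, 181)
cl = np.array([composite_loglik(y, r) for r in grid])
rho_hat = grid[np.argmax(cl)]

# Composite-likelihood BIC with a naive parameter count.
clbic_exch = -2 * cl.max() + 1 * np.log(n)
clbic_indep = -2 * composite_loglik(y, 0.0)    # zero free parameters
print(round(rho_hat, 2), clbic_exch < clbic_indep)
```

Only bivariate densities are ever evaluated, which is the computational point of the construction; the consistency theory in the paper concerns how the penalty must be calibrated when the full likelihood is unavailable.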