
    Issues in the Multiple Try Metropolis mixing

    The Multiple Try Metropolis (MTM) algorithm is an advanced MCMC technique based on drawing and testing several candidates at each iteration. One of them is selected according to certain weights and then accepted or rejected according to a suitable acceptance probability. Since the computational cost grows with the number of tries, one expects the performance of an MTM scheme to improve as the number of tries increases. However, there are scenarios where increasing the number of tries does not produce a corresponding enhancement of performance. In this work, we describe these scenarios and introduce possible solutions to these issues.
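
    As a rough illustration of the scheme described above, here is a minimal sketch of one MTM step in Python, assuming a symmetric Gaussian proposal and the common weight choice w(y, x) = pi(y); the target, step size, and number of tries are illustrative, not taken from the paper.

        import numpy as np

        # Minimal MTM sketch: symmetric Gaussian proposal, weights w(y, x) = pi(y).
        def log_target(x):
            return -0.5 * x**2          # standard Gaussian target (assumption)

        def mtm_step(x, rng, n_tries=5, step=1.0):
            # 1) Draw several candidates around the current state.
            ys = x + step * rng.standard_normal(n_tries)
            wy = np.exp(log_target(ys))
            # 2) Select one candidate with probability proportional to its weight.
            y = rng.choice(ys, p=wy / wy.sum())
            # 3) Draw reference points around the selected candidate; the current
            #    state stands in for the last one.
            xs = np.append(y + step * rng.standard_normal(n_tries - 1), x)
            wx = np.exp(log_target(xs))
            # 4) Generalized Metropolis-Hastings acceptance test.
            return y if rng.uniform() < min(1.0, wy.sum() / wx.sum()) else x

        rng = np.random.default_rng(0)
        x, chain = 0.0, []
        for _ in range(1000):
            x = mtm_step(x, rng)
            chain.append(x)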

    The Generalized Weighted Lindley Distribution: Properties, Estimation and Applications

    In this paper, we propose a new lifetime distribution, namely the generalized weighted Lindley (GWL) distribution. The GWL distribution is a useful generalization of the weighted Lindley distribution, which accommodates increasing, decreasing, decreasing-increasing-decreasing, bathtub and unimodal hazard functions, making the GWL distribution a flexible model for reliability data. A significant account of the mathematical properties of the new distribution is presented. Different estimation procedures are also given, such as maximum likelihood estimators, method of moments, ordinary and weighted least squares, percentile, maximum product of spacings and minimum distance estimators. The different estimators are compared via extensive numerical simulations. Finally, we analyze two data sets for illustrative purposes, showing that the GWL outperforms several common three-parameter lifetime distributions.
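
    Since the GWL density itself is not reproduced in the abstract, the sketch below instead fits the base weighted Lindley distribution, f(x; theta, c) proportional to x^(c-1) (1+x) exp(-theta*x), by maximum likelihood; the data are synthetic stand-ins, and the GWL's extra shape parameter is not included.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import gammaln

        def wl_negloglik(params, x):
            # Negative log-likelihood of the weighted Lindley distribution.
            theta, c = params
            if theta <= 0 or c <= 0:
                return np.inf
            logf = ((c + 1) * np.log(theta) - np.log(theta + c) - gammaln(c)
                    + (c - 1) * np.log(x) + np.log1p(x) - theta * x)
            return -logf.sum()

        rng = np.random.default_rng(1)
        x = rng.gamma(shape=2.0, scale=0.5, size=200)   # stand-in lifetime data
        fit = minimize(wl_negloglik, x0=[1.0, 1.0], args=(x,), method="Nelder-Mead")
        theta_hat, c_hat = fit.x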

    Evidence of a Link between Default and Loss on Bank Loans from the Modeling of Competing Risks

    In this paper, we propose a method for comparing the relationship between the risks that lead a customer to default and the debt collection process that may recover that defaulter. Through the estimation of the competing risks that lead to the realization of the event of interest, we show that there is a significant relation between the intensity of default and the losses from defaulted loans in collection processes. To reach this goal, we investigate a competing risks model applied to the whole credit risk cycle of a bank loan portfolio. We estimate the competing causes related to the occurrence of default and then compare them with the estimated competing causes that lead loans to the write-off condition. To model the competing risks, we use a Poisson distribution for the number of competing causes and a Weibull distribution for the failure times. Maximum likelihood estimation is used for parameter estimation, and the model is applied to real data on personal loans.
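
    A minimal sketch of this competing-causes formulation follows, assuming M ~ Poisson(eta) latent causes with Weibull latent times, so that the observed time is their minimum and the population survival is S(t) = exp(-eta * F(t)); variable names and data are stand-ins, not the paper's.

        import numpy as np
        from scipy.optimize import minimize

        def negloglik(params, t, delta):
            # delta = 1 for observed events, 0 for right-censored times.
            eta, alpha, lam = np.exp(params)        # positivity via log-scale
            F = 1.0 - np.exp(-(t / lam) ** alpha)   # Weibull cdf
            logf = (np.log(alpha) - np.log(lam)
                    + (alpha - 1) * np.log(t / lam) - (t / lam) ** alpha)
            # Density of the observed minimum: eta * f(t) * exp(-eta * F(t)).
            loglik = delta * (np.log(eta) + logf) - eta * F
            return -loglik.sum()

        rng = np.random.default_rng(2)
        t = rng.weibull(1.5, 300)                   # stand-in loan lifetimes
        delta = rng.integers(0, 2, 300)             # stand-in censoring flags
        fit = minimize(negloglik, x0=np.zeros(3), args=(t, delta), method="Nelder-Mead")
        eta_hat, alpha_hat, lam_hat = np.exp(fit.x)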

    Bayesian model averaging: A systematic review and conceptual classification

    Bayesian Model Averaging (BMA) is an application of Bayesian inference to the problems of model selection, combined estimation and prediction that produces a straightforward model choice criterion and less risky predictions. However, the application of BMA is not always straightforward, leading to diverse assumptions and situational choices regarding its different aspects. Despite the widespread application of BMA in the literature, there have been few accounts of these differences and trends besides a few landmark reviews in the late 1990s and early 2000s, which therefore do not take into account the advancements made in the last 15 years. In this work, we present an account of these developments through a careful content analysis of 587 articles on BMA published between 1996 and 2014. We also develop a conceptual classification scheme to better describe this vast literature, understand its trends and future directions, and provide guidance for researchers interested in both the application and the development of the methodology. The results of the classification scheme and content review are then used to discuss the present and future of the BMA literature.
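
    For readers new to the methodology, the sketch below shows the basic BMA mechanics on a toy linear regression, approximating posterior model probabilities with BIC weights, p(M|D) proportional to exp(-BIC/2), under uniform model priors; the data and candidate models are illustrative assumptions.

        import numpy as np
        from itertools import combinations
        from sklearn.linear_model import LinearRegression

        rng = np.random.default_rng(3)
        X = rng.standard_normal((100, 3))
        y = X[:, 0] - 2 * X[:, 1] + 0.5 * rng.standard_normal(100)

        models, bics, preds = [], [], []
        n = len(y)
        for k in range(1, 4):
            for cols in combinations(range(3), k):
                Xs = X[:, cols]
                fit = LinearRegression().fit(Xs, y)
                rss = ((y - fit.predict(Xs)) ** 2).sum()
                # BIC of the Gaussian linear model (up to additive constants).
                bic = n * np.log(rss / n) + (len(cols) + 1) * np.log(n)
                models.append(cols); bics.append(bic); preds.append(fit.predict(Xs))

        bics = np.array(bics)
        weights = np.exp(-0.5 * (bics - bics.min()))
        weights /= weights.sum()                    # posterior model probabilities
        bma_pred = sum(w * p for w, p in zip(weights, preds))  # averaged prediction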

    The Stark Effect with Minimum Length

    We study the splitting of the energy spectrum of the hydrogen atom subjected to a uniform electric field (the Stark effect) when the Heisenberg algebra is deformed so as to lead to a minimum length. We use perturbation theory for the non-degenerate (n=1) and degenerate (n=2) cases, along with known results for the corrections to these levels caused by the minimum length applied purely to the hydrogen atom, so that we may find and estimate the minimum-length corrections to the Stark effect.
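
    For reference, the textbook (undeformed) Stark shifts that the minimum-length terms would correct are, in Gaussian units with field strength \mathcal{E} and Bohr radius a_0:

        \begin{align*}
          \Delta E_{n=1} &= -\tfrac{9}{4}\, a_0^{3}\, \mathcal{E}^{2}
            \quad \text{(second order; the linear shift vanishes)} \\
          \Delta E_{n=2} &= 0,\ \pm 3\, e\, a_0\, \mathcal{E}
            \quad \text{(first order, degenerate perturbation theory)}
        \end{align*}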

    Classification methods applied to credit scoring: A systematic review and overall comparison

    The need to control and effectively manage credit risk has led financial institutions to excel in improving techniques designed for this purpose, resulting in the development of various quantitative models by financial institutions and consulting companies. Hence, the growing number of academic studies on credit scoring shows a variety of classification methods applied to discriminate between good and bad borrowers. This paper therefore presents a systematic literature review relating the theory and application of binary classification techniques for credit scoring. The general results show the use and importance of the main techniques for credit rating, as well as some of the scientific paradigm changes throughout the years.
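
    As a concrete instance of the binary classifiers surveyed here, the sketch below scores borrowers with a logistic regression; the features and labels are synthetic stand-ins for real bureau data.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(4)
        X = rng.standard_normal((500, 4))            # e.g. income, debt, age, limit
        y = (X @ np.array([1.0, -0.5, 0.3, 0.8])
             + rng.standard_normal(500) > 0).astype(int)   # 1 = bad borrower

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = LogisticRegression().fit(X_tr, y_tr)
        score = clf.predict_proba(X_te)[:, 1]        # probability of default
        print("AUC:", roc_auc_score(y_te, score))    # common credit-scoring metric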

    Maximum Likelihood Estimation for the Weighted Lindley Distribution Parameters under Different Types of Censoring

    In this paper, the maximum likelihood equations for the parameters of the weighted Lindley distribution are studied under different types of censoring, such as type I, type II and random censoring mechanisms. A numerical simulation study is performed to evaluate the maximum likelihood estimates. The proposed methodology is illustrated on a real data set.
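
    A minimal sketch of the type-I censored case: observed failures contribute log f(x) to the likelihood, while units still alive at the censoring time tc contribute log S(tc). The survival function is obtained numerically here to keep the sketch self-contained (in practice one would use its closed form); the data are synthetic stand-ins.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.integrate import quad
        from scipy.special import gammaln

        def wl_logpdf(x, theta, c):
            # Log-density of the weighted Lindley distribution.
            return ((c + 1) * np.log(theta) - np.log(theta + c) - gammaln(c)
                    + (c - 1) * np.log(x) + np.log1p(x) - theta * x)

        def negloglik(params, x, delta, tc):
            theta, c = np.exp(params)                 # positivity via log-scale
            surv = 1.0 - quad(lambda u: np.exp(wl_logpdf(u, theta, c)), 0, tc)[0]
            surv = max(surv, 1e-300)                  # numerical guard
            # delta = 1: exact failure time; delta = 0: censored at tc.
            return (-(delta * wl_logpdf(x, theta, c)).sum()
                    - (1 - delta).sum() * np.log(surv))

        rng = np.random.default_rng(5)
        x = rng.gamma(2.0, 0.5, 200)                  # stand-in lifetimes
        tc = 1.5                                      # fixed censoring time
        delta = (x < tc).astype(float)
        x = np.minimum(x, tc)
        fit = minimize(negloglik, x0=[0.0, 0.0], args=(x, delta, tc), method="Nelder-Mead")
        theta_hat, c_hat = np.exp(fit.x)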

    Adaptive Rejection Sampling with fixed number of nodes

    The adaptive rejection sampling (ARS) algorithm is a universal random generator for drawing samples efficiently from a univariate log-concave target probability density function (pdf). ARS generates independent samples from the target via rejection sampling with high acceptance rates. Indeed, ARS yields a sequence of proposal functions that converge toward the target pdf, so that the probability of accepting a sample approaches one. However, sampling from the proposal pdf becomes more computationally demanding each time it is updated. In this work, we propose a novel ARS scheme, called Cheap Adaptive Rejection Sampling (CARS), where the computational effort for drawing from the proposal remains constant, decided in advance by the user. For generating a large number of desired samples, CARS is faster than ARS.
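
    The sketch below captures the fixed-node idea in the abstract, assuming a piecewise-exponential envelope built once from tangents to the log-density at user-chosen nodes (which must avoid stationary points) and never refined, so the cost per draw stays constant; the actual CARS construction in the paper may differ in detail.

        import numpy as np

        logf = lambda x: -0.5 * x**2      # log-concave target: standard normal
        dlogf = lambda x: -x              # its derivative

        def build_envelope(nodes):
            h, dh = logf(nodes), dlogf(nodes)
            # Abscissae where consecutive tangent lines intersect.
            z = ((h[1:] - h[:-1] - nodes[1:] * dh[1:] + nodes[:-1] * dh[:-1])
                 / (dh[:-1] - dh[1:]))
            z = np.concatenate(([-np.inf], z, [np.inf]))
            # Integral of exp(tangent_i) over its segment [z[i], z[i+1]].
            expz = lambda i, side: np.exp(h[i] + dh[i] * (z[side] - nodes[i]))
            areas = np.array([(expz(i, i + 1) - expz(i, i)) / dh[i]
                              for i in range(len(nodes))])
            return h, dh, z, areas

        def sample(n, nodes, rng):
            h, dh, z, areas = build_envelope(nodes)   # built once, never updated
            out = []
            while len(out) < n:
                i = rng.choice(len(nodes), p=areas / areas.sum())  # pick segment
                u = rng.uniform()
                # Inverse-cdf draw from the exponential segment.
                lo = np.exp(dh[i] * (z[i] - nodes[i])) if np.isfinite(z[i]) else 0.0
                hi = np.exp(dh[i] * (z[i + 1] - nodes[i])) if np.isfinite(z[i + 1]) else 0.0
                x = nodes[i] + np.log(lo + u * (hi - lo)) / dh[i]
                # Accept with probability f(x) / envelope(x).
                if np.log(rng.uniform()) < logf(x) - (h[i] + dh[i] * (x - nodes[i])):
                    out.append(x)
            return np.array(out)

        rng = np.random.default_rng(6)
        draws = sample(1000, nodes=np.array([-2.0, -0.7, 0.7, 2.0]), rng=rng)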

    Metamaterials from modified CPT-odd standard model extension and minimum length

    Here we discuss the standard model extension (SME) in the presence of a CPT-odd Lorentz-violating (LV) sector and of a deformed Heisenberg algebra that leads to a non-commutative theory with a minimum length (ML). We derive the set of Maxwell equations emerging from this theory and consider the consequences for the usual effects of electromagnetic waves in material media. We then consider the set of modified equations in material media and investigate metamaterial behaviour as a consequence of LV and ML. We show that a negative refraction index can be derived from the presence of a non-commutativity suitably tuned by the β parameter, while in the presence of LV we obtain the set of modified Maxwell equations in terms of the corresponding material fields, with terms depending explicitly on the interaction between the material fields (through the non-commutativity) and the background field due to the CPT-odd LV. We conclude that a new set of metamaterials can be derived as a consequence of CPT-odd LV and non-commutativity with minimum length.
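
    For orientation, the undeformed CPT-odd photon sector referred to here is the Carroll-Field-Jackiw term of the SME; in one common convention (factors and signs vary across the literature, and the minimum-length deformation of this paper is not included), the Lagrangian and the modified inhomogeneous Maxwell equation read:

        \begin{align*}
          \mathcal{L} &= -\tfrac{1}{4} F_{\mu\nu} F^{\mu\nu}
            + \tfrac{1}{2} (k_{AF})_{\kappa}\, \epsilon^{\kappa\lambda\mu\nu} A_{\lambda} F_{\mu\nu}
            - J^{\mu} A_{\mu}, \\
          \partial_{\mu} F^{\mu\nu} &+ 2\, (k_{AF})_{\mu} \tilde{F}^{\mu\nu} = J^{\nu},
          \qquad \tilde{F}^{\mu\nu} \equiv \tfrac{1}{2}\, \epsilon^{\mu\nu\alpha\beta} F_{\alpha\beta}.
        \end{align*}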

    The Fréchet Distribution: Estimation and Application, an Overview

    In this article, we consider the problem of estimating the parameters of the Fréchet distribution from both frequentist and Bayesian points of view. First, we briefly describe different frequentist approaches, namely maximum likelihood, method of moments, percentile estimators, L-moments, ordinary and weighted least squares, maximum product of spacings and maximum goodness-of-fit estimators, and compare them with respect to mean relative estimates, mean squared errors and the 95% coverage probability of the asymptotic confidence intervals using extensive numerical simulations. Next, we consider the Bayesian inference approach using reference priors. The Metropolis-Hastings algorithm is used to draw Markov chain Monte Carlo samples, which in turn are used to compute the Bayes estimates and to construct the corresponding credible intervals. Five real data sets related to the minimum flow of water on the Piracicaba River in Brazil are used to illustrate the applicability of the discussed procedures.
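
    A minimal sketch of the maximum likelihood route: SciPy exposes the Fréchet distribution as invweibull (shape c, location, scale), so the two-parameter fit is a one-liner; the simulated data stands in for the river-flow minima.

        import numpy as np
        from scipy.stats import invweibull

        rng = np.random.default_rng(7)
        data = invweibull.rvs(c=3.0, scale=2.0, size=150, random_state=rng)

        # Fit by maximum likelihood; floc=0 pins the location at zero, the
        # usual two-parameter Fréchet setup.
        c_hat, loc_hat, scale_hat = invweibull.fit(data, floc=0)
        print(f"shape={c_hat:.2f}, scale={scale_hat:.2f}")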