
    Robust rank correlation based screening

    Independence screening is a variable selection method that uses a ranking criterion to select significant variables, particularly for statistical models with nonpolynomial dimensionality or "large p, small n" paradigms, where p can be as large as an exponential of the sample size n. In this paper we propose a robust rank correlation screening (RRCS) method to deal with ultra-high dimensional data. The new procedure is based on the Kendall τ correlation coefficient between response and predictor variables rather than the Pearson correlation used by existing methods. The new method has four desirable features compared with existing independence screening methods. First, the sure independence screening property holds under only the existence of a second moment of the predictor variables, rather than exponential tails or the like, even when the number of predictor variables grows exponentially with the sample size. Second, it can handle semiparametric models such as transformation regression models and single-index models under a monotonicity constraint on the link function, without involving nonparametric estimation even when the models contain nonparametric functions. Third, the procedure is robust against outliers and influential points in the observations. Last, the use of indicator functions in rank correlation screening greatly simplifies the theoretical derivation, owing to the boundedness of the resulting statistics, compared with previous studies on variable screening. Simulations are carried out for comparisons with existing methods, and a real data example is analyzed. Comment: Published at http://dx.doi.org/10.1214/12-AOS1024 in the Annals of Statistics (http://www.imstat.org/aos/) by the Institute of Mathematical Statistics (http://www.imstat.org). arXiv admin note: text overlap with arXiv:0903.525
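    As a rough illustration of the marginal rank-correlation screening idea described above (a minimal sketch, not the authors' RRCS implementation), the following Python snippet ranks predictors by the absolute Kendall τ between each column of X and the response y and keeps the top d indices; the toy data, the cutoff d, and the use of scipy.stats.kendalltau are assumptions made here for illustration only.

        # Hypothetical sketch of marginal Kendall-tau screening (illustration only).
        import numpy as np
        from scipy.stats import kendalltau

        def rank_correlation_screening(X, y, d):
            """Rank predictors by |Kendall tau| with the response; keep the top d indices."""
            n, p = X.shape
            taus = np.empty(p)
            for j in range(p):
                taus[j], _ = kendalltau(X[:, j], y)  # rank-based, hence robust to outliers and heavy tails
            return np.argsort(-np.abs(taus))[:d]

        # Toy example: n = 100 observations, p = 1000 predictors, only the first 3 are active.
        rng = np.random.default_rng(0)
        X = rng.standard_normal((100, 1000))
        y = X[:, 0] + 0.8 * X[:, 1] - 0.5 * X[:, 2] + rng.standard_normal(100)
        print(rank_correlation_screening(X, y, d=20))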

    Rethinking China's underurbanization: An evaluation of its county-to-city upgrading policy

    "It has been argued in the literature that China is underurbanized in large part because of restrictions on migration. While the presence of migration barriers can help explain why existing cities fail to achieve their optimal size, it cannot explain the lack of cities. Although migration has become much easier over time, the number of cities in China has been rather stagnant. In this paper, we argue that lack of appropriate mechanisms for creating new cities is another reason for underurbanization. Under China's hierarchical governance structure, the only way to create new cities is through the centralized policy of upgrading existing counties or prefectures into cities. However, in practice the implementation of the county-to-city upgrading policy was more complicated than expected. Based on a county-level panel dataset, this paper shows that jurisdictions that were upgraded to cities prior to 1998 do not perform better relative to their counterparts that remain to be counties in terms of both economic growth and providing public services. The policy was retracted in 1997, freezing the number of county-level cities since then. This, in turn, contributes to the observed underurbanization." from authors' abstractUrbanization, City creation, Governance structure, Political centralization, Development strategies,

    Mean Volatility Regressions

    Motivated by increment process modeling for two correlated random and non-random systems arising from discrete-time asset pricing with both a risk-free asset and a risky security, we propose a class of semiparametric regressions for a combination of a non-random and a random system. Unlike classical regressions, the mean regression functions in the new model contain variance components, and the model variables are related to latent variables for which an economic interpretation can be given. The motivating example explains why the GARCH-M model, whose mean function contains a variance component, cannot cover the newly proposed models. Further, we show that statistical inference for the increment process cannot simply be handled by a two-step procedure working separately on the two systems involved, even though the increment process is a weighted sum of the two systems. We further investigate the asymptotic behavior of the estimators using sophisticated nonparametric smoothing. Monte Carlo simulations are conducted to examine finite-sample performance, and a real dataset published in the Almanac of China's Finance and Banking (2004 and 2005) is analyzed to illustrate the increment process of wealth in China's financial market from 2003 to 2004. Keywords: Non-random systems, Random systems, Semiparametric regression, Variance built-in mean
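    For orientation, the GARCH-in-mean model referred to above is the standard specification in which the conditional variance enters the mean equation directly; in its common GARCH(1,1)-M form (a textbook formulation, shown here only for context and not the semiparametric model proposed in the paper) it reads:

        r_t = \mu + \lambda \sigma_t^2 + \varepsilon_t, \qquad
        \varepsilon_t = \sigma_t z_t, \quad z_t \sim \text{i.i.d.}(0,1), \qquad
        \sigma_t^2 = \omega + \alpha \varepsilon_{t-1}^2 + \beta \sigma_{t-1}^2 .

    The abstract's point is that the proposed semiparametric mean-volatility regressions are not nested within this parametric form.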

    Long-distance thermal temporal ghost imaging over optical fibers

    A thermal ghost imaging scheme between two distant parties is proposed and experimentally demonstrated over long-distance optical fibers. In the scheme, weak thermal light is split into two paths. Photons in one path are spatially dispersed according to their frequencies by a spatial dispersion component; they then illuminate the object and record its spatial transmission information. Photons in the other path are temporally dispersed by a temporal dispersion component. Through coincidence measurement between photons in the two paths, the object can be imaged in the manner of ghost imaging, based on the frequency correlation between photons in the two paths. In the experiment, the weak thermal light source is prepared by spontaneous four-wave mixing in a silicon waveguide. The temporal dispersion is introduced by 50 km of single-mode fiber, which can also be regarded as a fiber link. Experimental results show that this scheme can be realized over long-distance optical fibers.
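    To convey the correlation principle behind the scheme (a simplified numerical sketch under assumed toy parameters, not the experimental analysis code from the paper), the Python snippet below simulates frequency-correlated photon pairs: the object arm transmits a photon with a frequency-dependent probability, the reference arm maps frequency to arrival time via temporal dispersion, and the coincidence histogram over reference arrival times recovers the object's transmission pattern.

        # Hypothetical toy simulation of frequency-correlated (temporal) ghost imaging.
        import numpy as np

        rng = np.random.default_rng(1)

        def object_transmission(f):
            """Toy 'object': two transmissive windows in the frequency domain (arbitrary units)."""
            return ((np.abs(f - 0.3) < 0.05) | (np.abs(f + 0.2) < 0.05)).astype(float)

        n_pairs = 200_000
        f = rng.uniform(-0.5, 0.5, n_pairs)                          # shared frequency of each correlated pair
        transmitted = rng.random(n_pairs) < object_transmission(f)   # object-arm (bucket) detection events
        t_ref = 1.0 * f                                              # reference arm: dispersion maps frequency to arrival time

        # Coincidence histogram over reference arrival times reveals the object's transmission.
        bins = np.linspace(-0.5, 0.5, 101)
        coincidences, _ = np.histogram(t_ref[transmitted], bins=bins)
        singles, _ = np.histogram(t_ref, bins=bins)
        image = coincidences / np.maximum(singles, 1)                # normalized ghost image vs. arrival time
        print(image.round(2))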

    Experimental preparation and verification of quantum money

    A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report an experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6×10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques. Comment: 12 pages, 4 figures
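    As a loose illustration of the kind of acceptance rule commonly used when verifying quantum banknotes (a minimal sketch only; the error threshold and the honest-state success probability below are assumptions for illustration and are not taken from the paper's security analysis), one can model a verification round as measuring many states and accepting the note only if the observed error rate stays below a threshold.

        # Hypothetical sketch of a threshold-based acceptance test for a quantum-money verification round.
        import numpy as np

        rng = np.random.default_rng(2)

        def verify(outcomes_correct, error_threshold=0.05):
            """Accept the banknote if the measured error rate stays below the threshold (threshold assumed)."""
            error_rate = 1.0 - np.mean(outcomes_correct)
            return error_rate < error_threshold

        # Toy verification round: 3.6 million measured states, each matching the bank's record with prob. 0.99 (assumed).
        n_states = 3_600_000
        outcomes = rng.random(n_states) < 0.99
        print(verify(outcomes))   # True for an honest note with low experimental error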