
    SNOWMASS WHITE PAPER - SLHC Endcap 1.4<y<4 Hadron Optical Calorimetry Upgrades in CMS with Applications to NLC/T-LEP, Intensity Frontier, and Beyond

    Radiation damage in the plastic scintillator and/or readout WLS fibers in the HE endcap calorimeter 1.4<y<4 in the CMS experiment at the LHC and SLHC will require remediation after 2018. We describe one alternative that uses the existing brass absorber in the endcap calorimeter and replaces the plastic scintillator tiles with BaF2 tiles, or with quartz tiles coated with thin (1-5 micron) films of radiation-hard p-terphenyl (pTP) or the fast phosphor ZnO:Ga. These tiles would be read out by easily replaceable arrays of straight, parallel WLS fibers coupled to clear plastic-clad quartz fibers of proven radiation resistance. We describe a second alternative with a new absorber matrix extending to 1.4<y<4 in a novel Analog Particle Flow Cerenkov Compensated Calorimeter, using a dual readout of quartz tiles and scintillating tiles (plastic, BaF2, pTP/ZnO:Ga thin-film-coated quartz, or liquid scintillator), also read out by easily replaceable arrays of parallel WLS fibers coupled to clear quartz transmitting fibers. An Analog Particle Flow Scintillator-Cerenkov Compensated Calorimeter has applications in NLC/T-LEP detectors and Intensity Frontier detectors.

    Decoding the Mechanism for the Origin of Dark Matter in the Early Universe Using LHC Data

    It is shown that LHC data can allow one to decode the mechanism by which dark matter is generated in the early universe in supersymmetric theories. We focus on two of the major mechanisms for such generation of dark matter: Stau Coannihilation (Stau-Co), where the neutralino is typically Bino-like, and annihilation on the Hyperbolic Branch (HB), where the neutralino has a significant Higgsino component. An investigation of how one may discriminate between the Stau-Co region and the HB region using LHC data is given for the mSUGRA model. The analysis utilizes several signatures, including multileptons, hadronic jets, b-tagging, and missing transverse momentum. A study of the SUSY signatures reveals several correlated smoking-gun signals allowing a clear discrimination between the Stau-Co and the HB regions where dark matter in the early universe can originate. Comment: 7 pages, 5 figures, 2 columns. Accepted for publication in Physical Review.

    A review of outlier detection procedures used in Surveying Engineering

    The method of least squares is the most widely used parameter estimation tool in surveying engineering. It is implemented by minimizing the sum of squares of weighted residuals. A key advantage of the method of least squares is that it gives an unbiased, minimum-variance estimate. Moreover, if the observation errors are normally distributed, results identical to those of the maximum likelihood method are obtained. However, the method of least squares requires observations free of gross errors and systematic biases to provide optimal results. Unfortunately, these undesired errors are often encountered in practice, so outlier diagnosis is an important issue in spatial data analysis. There are two different approaches to dealing with outliers: statistical outlier test methods and robust estimation. The Baarda and Pope methods are well-known hypothesis testing methods. On the other hand, there are numerous robust methods to eliminate or reduce the disruptive effects of outliers, such as M-estimation, L1-norm minimization, the least median of squares, and the least trimmed squares. Robust methods are useful for locating multiple outliers, yet the statistical testing approach can also be generalized to multiple outliers. Furthermore, reliability measures and robustness analysis enable us to assess the quality of our networks in terms of gross error detection and the effect of undetected errors. In this study, a review of outlier detection procedures is given and the main features of the methods are summarized. Finally, a statistical test for multiple outliers is applied to a GPS network.
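The statistical-testing approach the abstract mentions can be illustrated with a minimal sketch of Baarda-style data snooping on a least-squares line fit: each residual is standardized by its redundancy number and compared against a critical value from the normal distribution. The function name, the synthetic data, and the thresholds below are illustrative assumptions, not the specific formulation reviewed in the paper.

```python
# Hedged sketch: data snooping (w-test) after a least-squares line fit.
# All names, data, and thresholds here are illustrative assumptions.
import numpy as np

def fit_and_snoop(x, y, sigma=0.1, k=3.29):
    """Least-squares fit of y = b0 + b1*x, then a w-test on each residual.

    sigma : assumed a-priori standard deviation of the observations.
    k     : critical value (~3.29 for alpha = 0.001, two-sided normal).
    Returns the estimated parameters and the indices of flagged outliers.
    """
    A = np.column_stack([np.ones_like(x), x])   # design matrix
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    v = y - A @ beta                            # residuals
    # Redundancy numbers r_i = diag(I - A (A^T A)^{-1} A^T)
    H = A @ np.linalg.inv(A.T @ A) @ A.T
    r = 1.0 - np.diag(H)
    w = v / (sigma * np.sqrt(r))                # standardized w-statistics
    return beta, np.where(np.abs(w) > k)[0]

# Synthetic example: a straight line with one injected gross error.
rng = np.random.default_rng(0)
x = np.arange(10.0)
y = 2.0 + 0.5 * x + rng.normal(0, 0.1, 10)
y[5] += 1.0                                     # gross error at index 5
beta, flagged = fit_and_snoop(x, y)
print(beta, flagged)
```

In practice the contaminated observation would be removed (or down-weighted) and the adjustment repeated, which is exactly where robust methods such as M-estimation offer a single-pass alternative.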

    Analyzing the effect of local rounding error propagation on the maximal attainable accuracy of the pipelined Conjugate Gradient method

    Pipelined Krylov subspace methods typically offer improved strong scaling on parallel HPC hardware compared to standard Krylov subspace methods for large, sparse linear systems. In pipelined methods the traditional synchronization bottleneck is mitigated by overlapping time-consuming global communications with useful computations. However, to achieve this communication-hiding strategy, pipelined methods introduce additional recurrence relations for a number of auxiliary variables that are required to update the approximate solution. This paper studies the influence of the local rounding errors introduced by these additional recurrences in the pipelined Conjugate Gradient method. Specifically, we analyze the impact of local round-off effects on the attainable accuracy of the pipelined CG algorithm and compare it to the traditional CG method. Furthermore, we estimate the gap between the true residual and the recursively computed residual used in the algorithm. Based on this estimate we suggest an automated residual replacement strategy to reduce the loss of attainable accuracy in the final iterative solution. The resulting pipelined CG method with residual replacement improves the maximal attainable accuracy of pipelined CG while maintaining the efficient parallel performance of the pipelined method. This conclusion is substantiated by numerical results for a variety of benchmark problems. Comment: 26 pages, 6 figures, 2 tables, 4 algorithms.
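The gap between the true residual b - Ax and the recursively updated residual, and the idea of replacing the latter when the gap grows, can be sketched in a few lines. The sketch below uses plain CG rather than the pipelined variant the paper analyzes, and it recomputes the true residual every iteration purely for illustration (in practice the gap is estimated cheaply and replacement is triggered only occasionally); the tolerance tau is an assumed parameter.

```python
# Illustrative sketch: CG with a simple residual-replacement step.
# This is classical CG, not the pipelined variant; the replacement
# criterion (tau) and test matrix are assumptions for illustration.
import numpy as np

def cg_with_replacement(A, b, tol=1e-12, tau=1e-8, maxit=500):
    n = len(b)
    x = np.zeros(n)
    r = b.copy()                  # recursively updated residual
    p = r.copy()
    rr = r @ r
    replacements = 0
    for k in range(maxit):
        Ap = A @ p
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap           # recurrence: accumulates rounding error
        true_r = b - A @ x        # explicitly computed (true) residual
        # Replace the recursive residual when it drifts from the true one.
        if np.linalg.norm(r - true_r) > tau * np.linalg.norm(b):
            r = true_r
            replacements += 1
        rr_new = r @ r
        if np.sqrt(rr_new) < tol * np.linalg.norm(b):
            break
        p = r + (rr_new / rr) * p
        rr = rr_new
    return x, k + 1, replacements

# Small SPD test system (well-conditioned, so CG converges quickly).
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x, iters, nrep = cg_with_replacement(A, b)
print(iters, nrep, np.linalg.norm(b - A @ x))
```

For a well-conditioned system like this one the gap stays near machine precision and few or no replacements fire; the effect the paper quantifies becomes pronounced for ill-conditioned systems and for the extra recurrences of the pipelined variant.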