
    An origin for small neutrino masses in the NMSSM

    We consider the Next-to-Minimal Supersymmetric Standard Model (NMSSM), which provides a natural solution to the so-called mu problem by introducing a new gauge-singlet superfield S. We show that a new mechanism of neutrino mass suppression, based on the R-parity violating bilinear terms mu_i L_i H_u mixing neutrinos and higgsinos, arises within the NMSSM, thus offering an original solution to the neutrino mass problem (connected to the solution of the mu problem). We generate realistic (Majorana) neutrino mass values without requiring any strong hierarchy amongst the fundamental parameters, in contrast with alternative models. In particular, the ratio |mu_i/mu| can reach about 10^-1, unlike in the MSSM where it has to be much smaller than unity. We check that the obtained parameters also satisfy the collider constraints and internal consistencies of the NMSSM. The price to pay for this new cancellation-type mechanism of neutrino mass reduction is a certain fine tuning, which is significantly reduced in some regions of parameter space. Besides, we discuss the feasibility of our scenario when the R-parity violating bilinear terms have a common origin with the mu term, namely when they are generated via a VEV of the S scalar component from the couplings lambda_i S L_i H_u. Finally, we comment on some specific phenomenology of the NMSSM in the presence of R-parity violating bilinear terms. Comment: 21 pages, 5 figures, LaTeX file
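
    As a schematic illustration only (not the paper's exact expressions), bilinear terms mu_i L_i H_u give the neutrinos a Dirac-type mixing with the higgsino/neutralino sector, so the induced Majorana mass has the familiar seesaw form

        \[
          (m_\nu)_{ij} \;\sim\; \frac{m_{D,i}\, m_{D,j}}{M_{\tilde\chi}} ,
          \qquad m_{D,i} \propto \mu_i ,
        \]

    where M_{\tilde\chi} is a typical neutralino mass scale. A naive estimate of this kind scales roughly as (mu_i/mu)^2 times an electroweak/SUSY-scale mass, which is why |mu_i/mu| ordinarily has to be tiny; the cancellation mechanism described above is what allows the much larger ratio |mu_i/mu| ~ 10^-1.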

    A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment, which has to adhere strictly to the project schedule in order to be ready for launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and to reproduce nominal conditions as closely as possible. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and to compare them with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, in which the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements. Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI papers published on JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/jins
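
    A minimal sketch of the inject-and-compare approach described for the housekeeping telemetry (purely illustrative; the function and parameter names are hypothetical and not those of the actual DPC validation software):

        # Illustrative only: inject known housekeeping values into packets,
        # then check the timeline produced by the processing stage against
        # the injected reference values.
        def inject(packets, parameter, reference_values):
            """Overwrite `parameter` in each packet with a known reference value."""
            for packet, value in zip(packets, reference_values):
                packet[parameter] = value
            return packets

        def validate(reference_values, processed_timeline, tolerance=0.0):
            """Return True if every processed sample reproduces the injected value."""
            return all(abs(ref - out) <= tolerance
                       for ref, out in zip(reference_values, processed_timeline))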

    Naturalness and Fine Tuning in the NMSSM: Implications of Early LHC Results

    We study the fine tuning in the parameter space of the semi-constrained NMSSM, where most soft SUSY breaking parameters are universal at the GUT scale. We discuss the dependence of the fine tuning on the soft SUSY breaking parameters M_{1/2} and m_0, and on the Higgs masses in NMSSM-specific scenarios involving large singlet-doublet Higgs mixing or dominant Higgs-to-Higgs decays. Whereas these latter scenarios allow a priori for considerably less fine tuning than the constrained MSSM, the early LHC results rule out a large part of the parameter space of the semi-constrained NMSSM corresponding to low values of the fine tuning. Comment: 19 pages, 10 figures, bounds from SUSY searches with ~1/fb included
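
    For context, a commonly used quantitative measure of fine tuning in such studies (the precise definition adopted in the paper may differ) is the maximal logarithmic sensitivity of the electroweak scale to the fundamental parameters p_i (e.g. M_{1/2}, m_0, A_0, lambda, kappa):

        \[
          \Delta \;=\; \max_i \left| \frac{\partial \ln m_Z^2}{\partial \ln p_i} \right| ,
        \]

    so that larger \Delta corresponds to a more finely tuned point of parameter space.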

    Theoretical predictions for the direct detection of neutralino dark matter in the NMSSM

    We analyse the direct detection of neutralino dark matter in the framework of the Next-to-Minimal Supersymmetric Standard Model. After performing a detailed analysis of the parameter space, taking into account all the available constraints from LEPII, we compute the neutralino-nucleon cross section and compare the results with the sensitivity of detectors. We find that sizable values of the detection cross section, within the reach of dark matter detectors, are attainable in this framework. For example, neutralino-proton cross sections compatible with the sensitivity of present experiments can be obtained through the exchange of very light Higgses with m_{h_1^0} \lesssim 70 GeV. Such Higgses have a significant singlet composition, thus escaping detection and remaining in agreement with accelerator data. The lightest neutralino in these cases exhibits a large singlino-Higgsino composition, and a mass in the range 50 \lesssim m_{\tilde\chi_1^0} \lesssim 100 GeV. Comment: Final version to appear in JHEP. References added. LaTeX, 53 pages, 23 figures
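
    The enhancement from a light, singlet-like h_1 can be understood from the standard scaling of the spin-independent cross section with Higgs exchange (a generic expression, not the paper's full formula):

        \[
          \sigma^{\rm SI}_{\chi p} \;\propto\; \frac{\mu_{\chi p}^{2}}{\pi}
          \left( \sum_{i} \frac{g_{h_i\chi\chi}\, g_{h_i pp}}{m_{h_i}^{2}} \right)^{2} ,
        \]

    where \mu_{\chi p} is the neutralino-proton reduced mass; the 1/m_{h_i}^2 factor is why a Higgs as light as m_{h_1^0} \lesssim 70 GeV can push the cross section into the reach of present detectors.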

    Dark Matter in a Constrained NMSSM

    We explore the parameter space of a Constrained Next-to-Minimal Supersymmetric Standard Model with GUT scale boundary conditions (CNMSSM) and find regions where the relic density of the lightest neutralino is compatible with the WMAP measurement. We emphasize differences with the MSSM: cases where annihilation of the LSP occurs via a Higgs resonance at low values of \tan\beta, and cases where the LSP has a large singlino component. The particle spectrum as well as the theoretical and collider constraints are calculated with NMSSMTools. All neutralino annihilation and coannihilation processes are then computed with micrOMEGAs, taking into account higher order corrections to the Higgs sector. Comment: 17 pages, 6 figures, references added, some comments added, version to be published in JCAP
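
    As a rough guide to why a Higgs resonance or a sizable singlino component matters, the freeze-out relic density of a neutralino LSP scales inversely with its thermally averaged annihilation cross section (an order-of-magnitude relation, not the NMSSMTools/micrOMEGAs computation itself):

        \[
          \Omega_{\chi} h^{2} \;\approx\;
          \frac{3\times 10^{-27}\ {\rm cm^{3}\,s^{-1}}}{\langle \sigma_{\rm ann} v \rangle} ,
        \]

    so matching the WMAP value requires an annihilation cross section of roughly electroweak size, which resonant annihilation or efficient coannihilation can provide.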

    Spontaneous R-Parity Violation, A_4 Flavor Symmetry and Tribimaximal Mixing

    We explore the possibility of spontaneous R-parity violation in the context of A_4 flavor symmetry. Our model contains SU(3)_c \times SU(2)_L \times U(1)_Y singlet matter chiral superfields which are arranged as a triplet of A_4, as well as a few additional Higgs chiral superfields which are singlets under the MSSM gauge group and belong to triplet and singlet representations of the A_4 flavor symmetry. R-parity is broken spontaneously by the vacuum expectation values of the different sneutrino fields, and hence our model contains neutrino-neutralino as well as neutrino-MSSM gauge singlet higgsino mixings, in addition to the standard model neutrino-gauge singlet neutrino, gaugino-higgsino and higgsino-higgsino mixings. Because of all these mixings we have an extended neutral fermion mass matrix. We explore the low energy neutrino mass matrix of our model and point out that, with some specific constraints between the sneutrino vacuum expectation values as well as the MSSM gauge singlet Higgs vacuum expectation values, the low energy neutrino mass matrix leads to a tribimaximal mixing matrix. We also analyze the potential minimization for our model and show that one can realize a large vacuum expectation value of the SU(3)_c \times SU(2)_L \times U(1)_Y singlet sneutrino fields even when the other sneutrino vacuum expectation values are extremely small or even zero. Comment: 18 pages
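
    For reference, the tribimaximal pattern referred to above corresponds (in a common sign convention) to the lepton mixing matrix

        \[
          U_{\rm TBM} \;=\;
          \begin{pmatrix}
            \sqrt{2/3} & 1/\sqrt{3} & 0 \\
            -1/\sqrt{6} & 1/\sqrt{3} & -1/\sqrt{2} \\
            -1/\sqrt{6} & 1/\sqrt{3} & 1/\sqrt{2}
          \end{pmatrix} ,
        \]

    i.e. \sin^2\theta_{12} = 1/3, \sin^2\theta_{23} = 1/2 and \theta_{13} = 0.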

    Radiative contribution to neutrino masses and mixing in μνSSM

    In an extension of the minimal supersymmetric standard model (popularly known as the μνSSM), three right-handed neutrino superfields are introduced to solve the μ-problem and to accommodate the non-vanishing neutrino masses and mixing. Neutrino masses at the tree level are generated through R-parity violation and a seesaw mechanism. We have analyzed the full effect of one-loop contributions to the neutrino mass matrix. We show that the current three-flavour global neutrino data can be accommodated in the μνSSM, for both the tree-level and one-loop corrected analyses. We find that it is relatively easier to accommodate the normal hierarchical mass pattern compared to the inverted hierarchical or quasi-degenerate case, when one-loop corrections are included. Comment: 51 pages, 14 figures (58 .eps files), expanded introduction, other minor changes, references added
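
    Schematically (generic seesaw notation, not the paper's explicit matrices), the light neutrino mass matrix analysed here is the tree-level seesaw block of the extended neutralino mass matrix plus its one-loop correction,

        \[
          m_\nu^{\rm eff} \;=\; -\, m_D\, M^{-1} m_D^{T} \;+\; \Delta m_\nu^{(1)} ,
        \]

    where m_D collects the R-parity violating neutrino-neutralino mixings, M is the heavy neutralino/singlino block, and \Delta m_\nu^{(1)} denotes the one-loop contribution.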

    Optimization of Planck/LFI on--board data handling

    To assess stability against 1/f noise, the Low Frequency Instrument (LFI) onboard the Planck mission will acquire data at a rate much higher than the data rate allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an onboard pipeline, followed on ground by a reversing step. This paper illustrates the LFI scientific onboard processing designed to fit the allowed data rate. This is a lossy process tuned by a set of 5 parameters (Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies the level of distortion introduced by the onboard processing, EpsilonQ, as a function of these parameters, and describes the method used to optimize the onboard processing chain. The tuning procedure is based on an optimization algorithm applied to unprocessed and uncompressed raw data provided either by simulations, pre-launch tests, or data taken from the LFI operating in diagnostic mode. All the needed optimization steps are performed by an automated tool, OCA2, which ends with optimized parameters and produces a set of statistical indicators, among them the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr = 2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed up the process, an analytical model is developed that is able to extract most of the relevant information on EpsilonQ and Cr as a function of the signal statistics and the processing parameters. This model will be of interest for the instrument data analysis. The method was applied during ground tests when the instrument was operating in conditions representative of flight. Optimized parameters were obtained and the performance was verified: the required data rate of 35.5 kbps was achieved while keeping EpsilonQ at a level of 3.8% of the white noise rms, well within the requirements. Comment: 51 pages, 13 fig.s, 3 tables, pdflatex, needs JINST.csl, graphicx, txfonts, rotating; Issue 1.0 10 Nov 2009; Sub. to JINST 23 Jun 09, Accepted 10 Nov 09, Pub.: 29 Dec 09; This is a preprint, not the final version
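
    A schematic sketch of this kind of lossy step (illustrative only: the real pipeline also mixes sky and reference samples using r1 and r2, which is omitted here, and the parameter values below are made up):

        import numpy as np

        def onboard_process(samples, naver, q, offset):
            # Box-car average Naver consecutive samples, then quantize with
            # step q and offset O, yielding the integers that are packed and
            # losslessly compressed on board.
            n = (len(samples) // naver) * naver
            averaged = samples[:n].reshape(-1, naver).mean(axis=1)
            return np.round(averaged / q + offset).astype(int)

        def ground_reverse(quantized, q, offset):
            # On-ground reversing step: map integers back to approximate values.
            return (quantized - offset) * q

        rng = np.random.default_rng(0)
        raw = rng.normal(0.0, 1.0, 60000)           # white noise with unit rms
        naver, q, offset = 3, 0.3, 0.0
        rec = ground_reverse(onboard_process(raw, naver, q, offset), q, offset)
        avg = raw[: len(rec) * naver].reshape(-1, naver).mean(axis=1)
        eps_q = np.std(rec - avg) / np.std(raw)     # distortion vs. white-noise rms
        print(f"EpsilonQ ~ {eps_q:.1%}")            # ~ q/sqrt(12) for fine quantization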

    Off-line radiometric analysis of Planck/LFI data

    The Planck Low Frequency Instrument (LFI) is an array of 22 pseudo-correlation radiometers on board the Planck satellite, designed to measure temperature and polarization anisotropies in the Cosmic Microwave Background (CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the performance of the LFI, a software suite named LIFE has been developed. Its aim is to provide a common platform for analysing the results of the tests performed on the single components of the instrument (RCAs, Radiometric Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA). Moreover, its analysis tools are designed to be used during flight as well, to produce periodic reports on the status of the instrument. The LIFE suite has been developed using a multi-layered, cross-platform approach. It implements a number of analysis modules written in RSI IDL, each accessing the data through a portable and heavily optimized library of functions written in C and C++. One of the most important features of LIFE is its ability to run the same data analysis codes on both ground test data and real flight data. The LIFE software suite has been successfully used during the RCA/RAA tests and the Planck Integrated System Tests. Moreover, the software has also passed the verification for its in-flight use during the System Operations Verification Tests, held in October 2008. Comment: Planck LFI technical papers published by JINST: http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022

    Requirements on the LFI On-Board Compression

    Final version. The present document describes the requirements for the compression program and the on-board compression operations for Planck/LFI.