
    Channel Estimation for Diffusive Molecular Communications

    In molecular communication (MC) systems, the expected number of molecules observed at the receiver over time after the instantaneous release of molecules by the transmitter is referred to as the channel impulse response (CIR). Knowledge of the CIR is needed for the design of detection and equalization schemes. In this paper, we present a training-based CIR estimation framework for MC systems which aims at estimating the CIR based on the observed number of molecules at the receiver due to the emission of a sequence of known numbers of molecules by the transmitter. We distinguish two scenarios depending on whether or not statistical channel knowledge is available. In particular, we derive maximum likelihood (ML) and least sum of square errors (LSSE) estimators which do not require any knowledge of the channel statistics. For the case when statistical channel knowledge is available, the corresponding maximum a posteriori (MAP) and linear minimum mean square error (LMMSE) estimators are provided. As performance bounds, we derive the classical Cramér-Rao (CR) lower bound, valid for any unbiased estimator that does not exploit statistical channel knowledge, and the Bayesian CR lower bound, valid for any unbiased estimator that exploits statistical channel knowledge. Finally, we propose optimal and suboptimal training sequence designs for the considered MC system. Simulation results confirm the analysis and compare the performance of the proposed estimation techniques with the respective CR lower bounds. Comment: to appear in IEEE Transactions on Communications. arXiv admin note: text overlap with arXiv:1510.0861
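
    To make the estimation idea concrete, the sketch below fits a short CIR from a known training sequence by least squares; the sequence values, CIR taps, and the simple linear model with counting noise are illustrative assumptions, not the paper's exact molecular-counting signal model.

        import numpy as np

        # Illustrative least-squares (LSSE-style) CIR estimate from a known training
        # sequence; the model and all numbers below are assumptions for illustration.
        rng = np.random.default_rng(0)

        L = 4                                   # assumed number of CIR taps
        K = 20                                  # length of the training sequence
        s = rng.integers(0, 2, K) * 1e4         # known numbers of released molecules per slot
        c_true = np.array([0.8, 0.4, 0.2, 0.1]) * 1e-3   # hypothetical CIR taps

        # Training (convolution) matrix: row k holds s[k], s[k-1], ..., s[k-L+1]
        S = np.column_stack([np.concatenate([np.zeros(l), s[:K - l]]) for l in range(L)])

        r_mean = S @ c_true                          # expected observed counts
        r_obs = rng.poisson(r_mean).astype(float)    # noisy observation (counting noise)

        # LSSE estimate: minimises the sum of squared errors ||r_obs - S c||^2
        c_lsse, *_ = np.linalg.lstsq(S, r_obs, rcond=None)
        print("true CIR:     ", c_true)
        print("LSSE estimate:", c_lsse)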

    Trinification, the Hierarchy Problem and Inverse Seesaw Neutrino Masses

    In minimal trinification models, light neutrino masses can be generated via a radiative seesaw mechanism, where the masses of the right-handed neutrinos originate from loops involving Higgs and fermion fields at the unification scale. This mechanism is absent in models aiming at solving or ameliorating the hierarchy problem, such as low-energy supersymmetry, since the large seesaw scale disappears. In this case, neutrino masses need to be generated via a TeV-scale mechanism. In this paper, we investigate an inverse seesaw mechanism and discuss some phenomenological consequences. Comment: 10 pages, 11 figures
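
    For reference, the generic inverse seesaw structure takes the textbook form sketched below; the notation is illustrative and need not match the paper's conventions.

        % Standard inverse-seesaw mass matrix in the basis (nu_L, nu_R^c, S):
        \begin{equation}
          M_\nu =
          \begin{pmatrix}
            0     & m_D   & 0   \\
            m_D^T & 0     & M_R \\
            0     & M_R^T & \mu
          \end{pmatrix},
          \qquad
          m_{\text{light}} \simeq m_D \, (M_R^T)^{-1} \, \mu \, M_R^{-1} \, m_D^T ,
        \end{equation}
        % so the light neutrino masses are suppressed by the small lepton-number-violating
        % parameter mu rather than by a large seesaw scale, allowing M_R near the TeV scale.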

    Robust sparse principal component analysis.

    A method for principal component analysis is proposed that is sparse and robust at the same time. The sparsity delivers principal components that have loadings on a small number of variables, making them easier to interpret. The robustness makes the analysis resistant to outlying observations. The principal components correspond to directions that maximize a robust measure of the variance, with an additional penalty term to take sparseness into account. We propose an algorithm to compute the sparse and robust principal components. The method is applied to several real data examples, and diagnostic plots for detecting outliers and for selecting the degree of sparsity are provided. A simulation experiment studies the loss in statistical efficiency incurred by requiring both robustness and sparsity. Keywords: dispersion measure; projection pursuit; outliers; variable selection.
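
    As a rough illustration of the idea (maximizing a robust spread measure under an L1-type penalty), the toy projection-pursuit sketch below searches candidate directions taken from the data themselves; it is a simplification under stated assumptions, not the authors' algorithm.

        import numpy as np

        # Toy projection-pursuit sketch of a sparse, robust first principal component:
        # maximise a robust spread measure (the MAD) of the projected data minus an
        # L1 penalty on the loadings.  An illustration, not the paper's method.
        def mad(x):
            return 1.4826 * np.median(np.abs(x - np.median(x)))

        def sparse_robust_pc1(X, lam=0.2, cutoff=0.1):
            Xc = X - np.median(X, axis=0)          # robust, coordinatewise centring
            best_obj, best_a = -np.inf, None
            for row in Xc:                         # candidate directions from the data
                norm = np.linalg.norm(row)
                if norm == 0:
                    continue
                a = row / norm
                a[np.abs(a) < cutoff] = 0.0        # crude sparsification of small loadings
                if not np.any(a):
                    continue
                a /= np.linalg.norm(a)
                obj = mad(Xc @ a) - lam * np.sum(np.abs(a))   # penalised robust objective
                if obj > best_obj:
                    best_obj, best_a = obj, a
            return best_a

        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 5))
        X[:, 0] *= 3.0                             # direction with the largest spread
        X[:5] += 20.0                              # a few outlying observations
        print(sparse_robust_pc1(X))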

    Contrast estimation for parametric stationary determinantal point processes

    We study minimum contrast estimation for parametric stationary determinantal point processes. These processes form a useful class of models for repulsive (or regular, or inhibitive) point patterns and are already applied in numerous statistical applications. Our main focus is on minimum contrast methods based on Ripley's K-function or on the pair correlation function. Strong consistency and asymptotic normality of these procedures are proved under general conditions that only concern the existence of the process and its regularity with respect to the parameters. A key ingredient of the proofs is the recently established Brillinger mixing property of stationary determinantal point processes. This work may be viewed as a complement to the study of Y. Guan and M. Sherman, who establish the same kind of asymptotic properties for a large class of Cox processes, which in turn are models for clustering (or aggregation).
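
    As a concrete illustration of the contrast idea, the sketch below fits the range parameter of a hypothetical Gaussian-kernel DPP by matching a naive empirical Ripley's K-function to its theoretical counterpart; the kernel choice, the absence of edge correction, and the placeholder point pattern are all assumptions made for brevity.

        import numpy as np
        from scipy.optimize import minimize_scalar
        from scipy.spatial.distance import pdist

        # Minimum contrast sketch for a hypothetical Gaussian-kernel DPP on the unit
        # square, with kernel C(x) = rho * exp(-|x|^2 / alpha^2), pair correlation
        # g(r) = 1 - exp(-2 r^2 / alpha^2) and Ripley's K-function
        # K(r) = pi r^2 - (pi alpha^2 / 2) * (1 - exp(-2 r^2 / alpha^2)).
        def K_theta(r, alpha):
            return np.pi * r**2 - 0.5 * np.pi * alpha**2 * (1.0 - np.exp(-2.0 * r**2 / alpha**2))

        def K_hat(points, r, area=1.0):
            # Naive empirical K-function (no edge correction), enough for an illustration.
            n = len(points)
            d = pdist(points)
            return np.array([area * 2.0 * np.sum(d <= ri) / (n * (n - 1)) for ri in r])

        def fit_alpha(points, r_max=0.1, c=0.5):
            r = np.linspace(1e-3, r_max, 50)
            k_emp = K_hat(points, r)
            # Discretised squared contrast between empirical and model K, both to power c
            contrast = lambda a: np.sum((k_emp**c - K_theta(r, a)**c) ** 2)
            return minimize_scalar(contrast, bounds=(1e-3, 0.2), method="bounded").x

        rng = np.random.default_rng(2)
        pts = rng.uniform(size=(400, 2))   # placeholder pattern; a real fit would use DPP data
        print("fitted alpha:", fit_alpha(pts))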

    Memory Effects in Granular Material

    We present a combined experimental and theoretical study of memory effects in vibration-induced compaction of granular materials. In particular, the response of the system to an abrupt change in shaking intensity is measured. At short times after the perturbation, a granular analog of aging in glasses is observed. Using a simple two-state model, we are able to explain this short-time response. We also discuss the possibility for the system to obey an approximate pseudo-fluctuation-dissipation theorem relationship and relate our work to earlier experimental and theoretical studies of the problem. Comment: 5 pages, 4 figures, reference list changed

    SCIAMACHY: The new Level 0-1 Processor

    SCIAMACHY (SCanning Imaging Absorption spectroMeter for Atmospheric CHartographY) is a scanning nadir and limb spectrometer covering the wavelength range from 212 nm to 2386 nm in 8 channels. It is a joint project of Germany, the Netherlands and Belgium and was launched in February 2002 on the ENVISAT platform. After the platform failure in April 2012, SCIAMACHY is now in the post-processing phase F; its in-orbit lifetime was double the originally specified lifetime. SCIAMACHY was designed to measure column densities and vertical profiles of trace gas species in the mesosphere, the stratosphere and the troposphere (Bovensmann et al., 1999). It can detect a large number of atmospheric gases (e.g. O3, H2CO, CHOCHO, SO2, BrO, OClO, NO2, H2O, CO and CH4, among others) and can provide information about aerosols and clouds. The operational processing of SCIAMACHY is split into Level 0-1 processing (essentially providing calibrated radiances) and Level 1-2 processing, which provides geophysical products. The operational Level 0-1 processor has been completely re-coded and embedded in a newly developed framework that speeds up processing considerably. In the frame of the SCIAMACHY Quality Working Group activities, ESA is continuing the improvement of the archived data sets. Currently, Version 9 of the Level 0-1 processor is being implemented. It will include an updated degradation correction; several improvements in the SWIR spectral range, such as a better dark correction, an improved dead and bad pixel characterisation and an improved spectral calibration; improvements to the polarisation correction algorithm; and improvements to the geolocation through a better pointing characterisation. Additionally, a new format for the Level 1b and Level 1c products will be implemented. The Version 9 products will be available in netCDF version 4, aligned with the formats of the GOME-1 and Sentinel missions. We will present the first results of the new Level 0-1 processing in this paper.
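
    For users of the new products, opening a Version 9 Level 1b file in netCDF-4 might look like the minimal sketch below; the file name and the group and variable names are placeholders, since the abstract does not specify the product layout.

        from netCDF4 import Dataset   # requires the netCDF4 Python package

        # Minimal sketch of inspecting a hypothetical Version 9 Level 1b product;
        # "SCI_L1B_example.nc" and the group/variable names are placeholders.
        with Dataset("SCI_L1B_example.nc", "r") as nc:
            print(nc.groups.keys())                    # list the top-level groups
            # e.g. read a calibrated radiance array from an assumed group/variable:
            # radiance = nc["channel_1/radiance"][:]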

    ERIS: revitalising an adaptive optics instrument for the VLT

    ERIS is an instrument that will both extend and enhance the fundamental diffraction-limited imaging and spectroscopy capability of the VLT. It will replace two instruments that are now being maintained beyond their operational lifetimes, combine their functionality at a single focus, provide a new wavefront sensing module that makes use of the facility Adaptive Optics System, and considerably improve their performance. The instrument will be competitive with JWST in several regimes, and has outstanding potential for studies of the Galactic Center, exoplanets, and high-redshift galaxies. ERIS had its final design review in 2017 and is expected to be on sky in 2020. This contribution describes the instrument concept, outlines its expected performance, and highlights where it will most excel. Comment: 12 pages, Proc SPIE 10702 "Ground-Based and Airborne Instrumentation for Astronomy VII"

    Scanning tunneling spectroscopy of high-temperature superconductors

    Tunneling spectroscopy played a central role in the experimental verification of the microscopic theory of superconductivity in the classical superconductors. Initial attempts to apply the same approach to high-temperature superconductors were hampered by various problems related to the complexity of these materials. The use of scanning tunneling microscopy/spectroscopy (STM/STS) on these compounds allowed the main difficulties to be overcome. This success motivated a rapidly growing scientific community to apply this technique to high-temperature superconductors. This paper reviews the experimental highlights obtained over the last decade. We first recall the crucial efforts to gain control over the technique and to obtain reproducible results. We then discuss how the STM/STS technique has contributed to the study of some of the most unusual and remarkable properties of high-temperature superconductors: the unusually large gap values and the absence of scaling with the critical temperature; the pseudogap and its relation to superconductivity; the unprecedentedly small size of the vortex cores and its influence on vortex matter; the unexpected electronic properties of the vortex cores; and the combination of atomic resolution and spectroscopy leading to the observation of periodic local density of states modulations in the superconducting and pseudogap states, and in the vortex cores. Comment: To appear in RMP; 65 pages, 62 figures

    Mechanical thrombectomy in acute ischemic stroke: Consensus statement by ESO-Karolinska Stroke Update 2014/2015, supported by ESO, ESMINT, ESNR and EAN

    The original version of this consensus statement on mechanical thrombectomy was approved at the European Stroke Organisation (ESO)-Karolinska Stroke Update conference in Stockholm, 16-18 November 2014. The statement was later updated during 2015 with new clinical trial data, in accordance with a decision made at the conference. Revisions were made at a face-to-face meeting during the ESO Winter School in Berne in February and through email exchanges, and the final version was then approved by each society. The recommendations are identical to the original version, with evidence levels upgraded by 20 February 2015 and confirmed by 15 May 2015. The purpose of the ESO-Karolinska Stroke Update meetings is to provide updates on recent stroke therapy research and to discuss how the results may be implemented into clinical routine. Selected topics are discussed at consensus sessions, for which a consensus statement is prepared and discussed by the participants at the meeting. The statements are advisory to the ESO guidelines committee. This consensus statement includes recommendations on mechanical thrombectomy after acute stroke. The statement is supported by ESO, the European Society of Minimally Invasive Neurological Therapy (ESMINT), the European Society of Neuroradiology (ESNR), and the European Academy of Neurology (EAN). Peer reviewed