
    Automated real-time EEG sleep spindle detection for brain-state-dependent brain stimulation

    Sleep spindles are a hallmark electroencephalographic feature of non-rapid eye movement sleep, and are believed to be instrumental for sleep-dependent memory reactivation and consolidation. However, direct proof of their causal relevance is hard to obtain, and our understanding of their immediate neurophysiological consequences is limited. To investigate their causal role, spindles need to be targeted in real time with sensory or non-invasive brain-stimulation techniques. While fully automated offline detection algorithms are well established, spindle detection in real time is highly challenging due to their spontaneous and transient nature. Here, we present the real-time spindle detector, a robust multi-channel electroencephalographic signal-processing algorithm that enables the automated triggering of stimulation during sleep spindles in a phase-specific manner. We validated the real-time spindle detection method by streaming pre-recorded sleep electroencephalographic datasets to a real-time computer system running a Simulink® Real-Time™ implementation of the algorithm. Sleep spindles were detected with high sensitivity (~83%), precision (~78%), and a convincing F1-score (~81%) in reference to state-of-the-art offline algorithms (which reached similar or lower levels when compared with each other), for both naps and full nights, and largely independent of sleep scoring information. Detected spindles were comparable in frequency, duration, amplitude, and symmetry, and showed the typical time–frequency characteristics as well as a centroparietal topography. Spindles were detected close to their centre and reliably at the predefined target phase. The real-time spindle detection algorithm therefore empowers researchers to target spindles during human sleep, and apply the stimulation method and experimental paradigm of their choice.
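    The core logic of an amplitude-threshold spindle detector can be sketched as follows. This is a minimal illustration of the general approach, not the authors' actual algorithm: the sampling rate, window length, threshold, and synthetic burst are all illustrative assumptions, and a real detector would additionally band-pass filter the EEG to the sigma band (roughly 12–15 Hz) and estimate the instantaneous phase for phase-targeted stimulation.

    ```python
    import math

    FS = 250  # sampling rate in Hz (assumed)

    def detect_onset(sig, fs, win_s=0.25, thresh=0.3):
        """Return the time (s) when the causal sliding-window RMS of the
        signal first exceeds thresh, i.e. a putative spindle onset."""
        n = int(win_s * fs)
        for i in range(n, len(sig) + 1):
            window = sig[i - n:i]
            rms = math.sqrt(sum(x * x for x in window) / n)
            if rms > thresh:
                return i / fs
        return None

    # Synthetic signal: low-amplitude 13 Hz background with a high-amplitude
    # "spindle" burst from 2.0 to 2.8 s (band-pass filtering omitted for brevity).
    sig = [(1.0 if 2.0 <= t / FS < 2.8 else 0.05) * math.sin(2 * math.pi * 13 * t / FS)
           for t in range(5 * FS)]

    onset = detect_onset(sig, FS)  # detection shortly after the burst begins
    ```

    Because the RMS window must partially fill with burst samples before the threshold is crossed, the detection lags the true onset by a fraction of the window length, which is one reason real-time detection near the spindle centre is non-trivial.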

    ARIADNE: A Scientific Navigator to Find Your Way Through the Resource Labyrinth of Psychological Sciences

    Performing high-quality research is a challenging endeavor, especially for early career researchers, in many fields of psychological science. Most research is characterized by experiential learning, which can be time-consuming, error-prone, and frustrating. Although most institutions provide selected resources to help researchers with their projects, these resources are often expensive, spread out, hard to find, and difficult to compare with one another in terms of reliability, validity, usability, and practicability. A comprehensive overview of resources that are useful for researchers in psychological science is missing. To address this issue, we created ARIADNE: a living and interactive resource navigator that helps researchers search and use a dynamically updated database of resources ( https://igor-biodgps.github.io/ARIADNE ). In this tutorial, we aim to guide researchers through a standard research project using ARIADNE along the way. The open-access database covers a growing list of resources useful for each step of a research project, from the planning and designing of a study, through the collection and analysis of the data, to the writing and disseminating of the findings. We provide (a) a step-by-step guide on how to perform a research project (in the fields of biological psychology and neuroscience as a case example, but with broad application to neighboring fields) and (b) an overview of resources that are useful at different project steps. By explicitly highlighting open-access and open-source resources, we level the playing field for researchers from underprivileged countries or institutions, thereby facilitating open, fair, and reproducible research in the psychological sciences.

    Severe early onset preeclampsia: short and long term clinical, psychosocial and biochemical aspects

    Preeclampsia is a pregnancy-specific disorder commonly defined as de novo hypertension and proteinuria after 20 weeks' gestational age. It occurs in approximately 3-5% of pregnancies and is still a major cause of both foetal and maternal morbidity and mortality worldwide [1]. As extensive research has not yet elucidated the aetiology of preeclampsia, there are no rational preventive or therapeutic interventions available. The only rational treatment is delivery, which benefits the mother but is not in the interest of the foetus, if remote from term. Early onset preeclampsia (<32 weeks' gestational age) occurs in less than 1% of pregnancies. It is, however, often associated with maternal morbidity, as the risk of progression to severe maternal disease is inversely related to gestational age at onset [2]. Resulting prematurity is therefore the main cause of neonatal mortality and morbidity in patients with severe preeclampsia [3]. Although the discussion is ongoing, perinatal survival is suggested to be increased in patients with preterm preeclampsia by expectant, non-interventional management. This temporising treatment option to lengthen pregnancy includes the use of antihypertensive medication to control hypertension, magnesium sulphate to prevent eclampsia, and corticosteroids to enhance foetal lung maturity [4]. With optimal maternal haemodynamic status and a reassuring foetal condition, this results on average in an extension of 2 weeks. Prolongation of these pregnancies poses a great challenge for clinicians, who must balance potential maternal risks on the one hand against possible foetal benefits on the other. Clinical controversies regarding prolongation of preterm preeclamptic pregnancies still exist – also taking into account that preeclampsia is the leading cause of maternal mortality in the Netherlands [5] – a debate which is even more pronounced in very preterm pregnancies with questionable foetal viability [6-9].
Do maternal risks of prolongation of these very early pregnancies outweigh the chances of neonatal survival? Counselling of women with very early onset preeclampsia comprises not only knowledge of the outcome of those particular pregnancies, but also knowledge of the outcomes of these women's future pregnancies, which is of major clinical importance. This thesis opens with a review of the literature on identifiable risk factors for preeclampsia.

    Financial Performance Assessment of Cooperatives in Pelalawan Regency (Penilaian Kinerja Keuangan Koperasi di Kabupaten Pelalawan)

    This paper describes the development and financial performance of cooperatives in Pelalawan Regency during 2007-2008. The study covers primary and secondary cooperatives in 12 sub-districts. The method measures cooperative performance in terms of productivity, efficiency, growth, liquidity, and solvency. Productivity of cooperatives in Pelalawan was high, but efficiency remained low. Profit and income were high, liquidity was very high, and solvency was good.

    Enhancing precision in human neuroscience

    Human neuroscience has always been pushing the boundary of what is measurable. During the last decade, concerns about statistical power and replicability - in science in general, but also specifically in human neuroscience - have fueled an extensive debate. One important insight from this discourse is the need for larger samples, which naturally increases statistical power. An alternative is to increase the precision of measurements, which is the focus of this review. This option is often overlooked, even though statistical power benefits from increasing precision as much as from increasing sample size. Nonetheless, precision has always been at the heart of good scientific practice in human neuroscience, with researchers relying on lab traditions or rules of thumb to ensure sufficient precision for their studies. In this review, we encourage a more systematic approach to precision. We start by introducing measurement precision and its importance for well-powered studies in human neuroscience. We then elaborate on the determinants of precision in a range of neuroscientific methods (MRI, M/EEG, EDA, eye tracking, and endocrinology). We end by discussing how a more systematic evaluation of precision, and the application of the respective insights, can lead to an increase in reproducibility in human neuroscience.
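    The claim that power benefits from precision as well as from sample size can be illustrated with a simple power approximation for a one-sample z-test, where the total variance splits into a between-subject component and a measurement-noise component that shrinks with trial averaging. All numbers below are illustrative assumptions, not values from the review.

    ```python
    import math

    def approx_power(effect, sd_between, sd_noise, n_subjects, n_trials,
                     z_crit=1.96):
        """Approximate power of a one-sample z-test: the standard error
        combines between-subject variance with measurement noise that is
        reduced by averaging over n_trials repeated measurements."""
        se = math.sqrt((sd_between**2 + sd_noise**2 / n_trials) / n_subjects)
        z = effect / se - z_crit
        # standard normal CDF via the error function
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))

    # Same sample size, but ten trials per subject instead of one:
    low_precision  = approx_power(0.3, 1.0, 1.0, n_subjects=30, n_trials=1)
    high_precision = approx_power(0.3, 1.0, 1.0, n_subjects=30, n_trials=10)
    ```

    With these assumed values, increasing precision (more trials per subject) raises power without adding a single participant, which is the review's central point; the between-subject variance sets the ceiling on what precision alone can buy.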

    Measurement of the Bs0 → μ+μ- decay properties and search for the B0 → μ+μ- decay in proton-proton collisions at √s = 13 TeV

    Measurements are presented of the Bs0 → μ+μ- branching fraction and effective lifetime, as well as results of a search for the B0 → μ+μ- decay in proton-proton collisions at √s = 13 TeV at the LHC. The analysis is based on data collected with the CMS detector in 2016-2018, corresponding to an integrated luminosity of 140 fb⁻¹. The branching fraction of the Bs0 → μ+μ- decay and the effective Bs0 meson lifetime are the most precise single measurements to date. No evidence for the B0 → μ+μ- decay has been found. All results are found to be consistent with the standard model predictions and previous measurements. © 2023 The Author(s). Published by Elsevier B.V. This is an open access article under the CC BY license (http://creativecommons.org/licenses/by/4.0/). Funded by SCOAP3.

    Portable Acceleration of CMS Computing Workflows with Coprocessors as a Service

    Computing demands for large scientific experiments, such as the CMS experiment at the CERN LHC, will increase dramatically in the next decades. To complement the future performance increases of software running on central processing units (CPUs), explorations of coprocessor usage in data processing hold great potential and interest. Coprocessors are a class of computer processors that supplement CPUs, often improving the execution of certain functions due to architectural design choices. We explore the approach of Services for Optimized Network Inference on Coprocessors (SONIC) and study the deployment of this as-a-service approach in large-scale data processing. In the studies, we take a data processing workflow of the CMS experiment and run the main workflow on CPUs, while offloading several machine learning (ML) inference tasks onto either remote or local coprocessors, specifically graphics processing units (GPUs). With experiments performed at Google Cloud, the Purdue Tier-2 computing center, and combinations of the two, we demonstrate the acceleration of these ML algorithms individually on coprocessors and the corresponding throughput improvement for the entire workflow. This approach can be easily generalized to different types of coprocessors and deployed on local CPUs without decreasing the throughput performance. We emphasize that the SONIC approach enables high coprocessor utilization and portability, allowing workflows to run on different types of coprocessors.
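    The as-a-service pattern described above - the main workflow stays on CPUs and routes ML inference calls to a coprocessor service, falling back to local execution when no service is reachable - can be sketched as follows. The class and function names and the stand-in "model" are hypothetical illustrations, not the actual SONIC API.

    ```python
    def cpu_infer(batch):
        # local CPU fallback: a stand-in "model" that doubles each input
        return [2 * x for x in batch]

    class InferenceClient:
        """Route inference requests to a remote coprocessor service when one
        is configured, falling back to the local CPU path on failure."""

        def __init__(self, remote_infer=None):
            self.remote_infer = remote_infer  # callable sending to the service

        def infer(self, batch):
            if self.remote_infer is not None:
                try:
                    return self.remote_infer(batch)
                except ConnectionError:
                    pass  # service unreachable: degrade gracefully to CPU
            return cpu_infer(batch)

    def flaky_remote(batch):
        # simulated unreachable coprocessor service
        raise ConnectionError("coprocessor service down")

    client = InferenceClient(remote_infer=flaky_remote)
    result = client.infer([1, 2, 3])  # falls back to the CPU path
    ```

    Decoupling the workflow from the inference backend in this way is what makes the approach portable: swapping GPUs for another coprocessor type only changes what sits behind the service call, not the workflow code.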

    Measurement of Energy Correlators inside Jets and Determination of the Strong Coupling

    Energy correlators that describe energy-weighted distances between two or three particles in a hadronic jet are measured using an event sample of √s = 13 TeV proton-proton collisions collected by the CMS experiment and corresponding to an integrated luminosity of 36.3 fb⁻¹. The measured distributions are consistent with the trends in the simulation that reveal two key features of the strong interaction: confinement and asymptotic freedom. By comparing the ratio of the measured three- and two-particle energy correlator distributions with theoretical calculations that resum collinear emissions at approximate next-to-next-to-leading-logarithmic accuracy matched to a next-to-leading-order calculation, the strong coupling is determined at the Z boson mass: αS(mZ) = 0.1229 +0.0040/-0.0050, the most precise αS(mZ) value obtained using jet substructure observables.

    Search for the Z Boson Decay to ττμμ in Proton-Proton Collisions at √s = 13 TeV

    The first search for the Z boson decay to ττμμ at the CERN LHC is presented, based on data collected by the CMS experiment in proton-proton collisions at a center-of-mass energy of 13 TeV and corresponding to an integrated luminosity of 138 fb⁻¹. The data are compatible with the predicted background. For the first time, an upper limit at the 95% confidence level of 6.9 times the standard model expectation is placed on the ratio of the Z → ττμμ to Z → 4μ branching fractions. Limits are also placed on the six flavor-conserving four-lepton effective-field-theory operators involving two muons and two tau leptons, for the first time testing all such operators.