
    Preparation and Measurement of Three-Qubit Entanglement in a Superconducting Circuit

    Traditionally, quantum entanglement has played a central role in foundational discussions of quantum mechanics. The measurement of correlations between entangled particles can exhibit results at odds with classical behavior. These discrepancies increase exponentially with the number of entangled particles. When entanglement is extended from just two quantum bits (qubits) to three, the incompatibilities between classical and quantum correlation properties can change from a violation of inequalities involving statistical averages to sign differences in deterministic observations. With the ample confirmation of quantum mechanical predictions by experiments, entanglement has evolved from a philosophical conundrum to a key resource for quantum-based technologies such as quantum cryptography and computation. In particular, maximal entanglement of more than two qubits is crucial to the implementation of quantum error correction protocols. While entanglement of up to 3, 5, and 8 qubits has been demonstrated among spins, photons, and ions, respectively, entanglement in engineered solid-state systems has been limited to two qubits. Here, we demonstrate three-qubit entanglement in a superconducting circuit, creating Greenberger-Horne-Zeilinger (GHZ) states with a fidelity of 88%, measured with quantum state tomography. Several entanglement witnesses show violation of bi-separable bounds by 830 ± 80%. Our entangling sequence realizes the first step of basic quantum error correction, namely the encoding of a logical qubit into a manifold of GHZ-like states using a repetition code. The integration of encoding, decoding and error-correcting steps in a feedback loop will be the next milestone for quantum computing with integrated circuits.
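
    As a concrete illustration of the quantities quoted above, the following minimal NumPy sketch (not code from the paper) builds the ideal three-qubit GHZ state, mixes it with an arbitrary amount of white noise, and evaluates the state fidelity and the standard GHZ entanglement witness W = I/2 - |GHZ><GHZ|, whose expectation value is non-negative for every biseparable state. The noise level p is a made-up parameter, not the experimental value.

```python
import numpy as np

# Ideal three-qubit GHZ state: (|000> + |111>)/sqrt(2)
ghz = np.zeros(8)
ghz[0] = ghz[7] = 1 / np.sqrt(2)
rho_ghz = np.outer(ghz, ghz)          # ideal density matrix |GHZ><GHZ|

# Toy noisy state: GHZ mixed with the maximally mixed state.
# p is an arbitrary illustrative noise level, not the value from the experiment.
p = 0.12
rho = (1 - p) * rho_ghz + p * np.eye(8) / 8

# State fidelity with respect to the pure target: F = <GHZ| rho |GHZ>
fidelity = np.real(ghz @ rho @ ghz)

# Standard GHZ witness: W = I/2 - |GHZ><GHZ|.
# Tr(W rho) >= 0 for every biseparable state, so a negative value
# certifies genuine three-qubit entanglement.
W = np.eye(8) / 2 - rho_ghz
witness_value = np.real(np.trace(W @ rho))

print(f"Fidelity to GHZ: {fidelity:.3f}")
print(f"Witness expectation <W>: {witness_value:+.3f} (negative => genuine tripartite entanglement)")
```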

    A systematic genome-wide analysis of zebrafish protein-coding gene function

    Since the publication of the human reference genome, the identities of specific genes associated with human diseases are being discovered at a rapid rate. A central problem is that the biological activity of these genes is often unclear. Detailed investigations in model vertebrate organisms, typically mice, have been essential for understanding the activities of many orthologues of these disease-associated genes. Although gene-targeting approaches [1-3] and phenotype analysis have led to a detailed understanding of nearly 6,000 protein-coding genes [3, 4], this number falls considerably short of the more than 22,000 mouse protein-coding genes [5]. Similarly, in zebrafish genetics, one-by-one gene studies using positional cloning [6], insertional mutagenesis [7-9], antisense morpholino oligonucleotides [10], targeted re-sequencing [11-13], and zinc finger and TAL endonucleases [14-17] have made substantial contributions to our understanding of the biological activity of vertebrate genes, but again the number of genes studied falls well short of the more than 26,000 zebrafish protein-coding genes [18]. Importantly, for both mice and zebrafish, none of these strategies are particularly suited to the rapid generation of knockouts in thousands of genes and the assessment of their biological activity. Here we describe an active project that aims to identify and phenotype the disruptive mutations in every zebrafish protein-coding gene, using a well-annotated zebrafish reference genome sequence [18, 19], high-throughput sequencing and efficient chemical mutagenesis. So far we have identified potentially disruptive mutations in more than 38% of all known zebrafish protein-coding genes. We have developed a multi-allelic phenotyping scheme to efficiently assess the effects of each allele during embryogenesis and have analysed the phenotypic consequences of over 1,000 alleles. All mutant alleles and data are available to the community and our phenotyping scheme is adaptable to phenotypic analysis beyond embryogenesis.

    A structural heart-brain axis mediates the association between cardiovascular risk and cognitive function

    Elevated vascular disease risk is associated with poorer cognitive function, but the mechanism for this link is poorly understood. A leading theory, the structural-functional model, argues that vascular risk may drive adverse cardiac remodelling, which, in turn, leads to chronic cerebral hypoperfusion and subsequent brain structural damage. This model predicts that variation in heart and brain structure should be associated with both greater vascular risk and lower cognitive function. This study tests that prediction in a large sample of the UK Biobank (N = 11,962). We assemble and summarise vascular risk factors, cardiac magnetic resonance radiomics, brain structural and diffusion MRI indices, and cognitive assessments. We also extract “heart-brain axes” capturing the covariation in heart and brain structure. Many heart and brain measures, such as left ventricular end-diastolic volume and grey matter volume, partially explain the association between vascular risk and cognitive function. Notably, a heart-brain axis, capturing the correlation between lower myocardial intensity, lower grey matter volume, and poorer thalamic white matter integrity, completely mediates the association, supporting the structural-functional model. However, our findings also complicate this theory: brain structural variation cannot completely explain the association between heart structure and cognitive function. Our results broadly offer evidence for the structural-functional hypothesis, identify imaging biomarkers for this association by considering covariation in heart and brain structure, and generate novel hypotheses about how cardiovascular risk may be linked to cognitive function.
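
    The mediation claim above can be made concrete with a simple single-mediator (product-of-coefficients) analysis. The sketch below is an illustration on synthetic data using statsmodels OLS; it is not the authors' actual pipeline or the UK Biobank variables, and the variable names and effect sizes are invented for the example.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic data standing in for the study variables (illustration only):
# x = vascular risk score, m = "heart-brain axis" score (mediator), y = cognition.
rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)                       # vascular risk factor score
m = 0.5 * x + rng.normal(size=n)             # mediator driven by risk
y = -0.4 * m + 0.0 * x + rng.normal(size=n)  # cognition: effect of x flows via m

X = sm.add_constant(x)
XM = sm.add_constant(np.column_stack([x, m]))

total = sm.OLS(y, X).fit().params[1]     # c: total effect of risk on cognition
a = sm.OLS(m, X).fit().params[1]         # a: risk -> mediator
fit_y = sm.OLS(y, XM).fit()
direct = fit_y.params[1]                 # c': risk -> cognition, adjusting for mediator
b = fit_y.params[2]                      # b: mediator -> cognition

indirect = a * b                         # product-of-coefficients indirect effect
print(f"total={total:.3f} direct={direct:.3f} indirect={indirect:.3f} "
      f"proportion mediated={indirect / total:.2f}")
```

    When the direct effect c' is close to zero and the indirect effect a*b accounts for essentially all of the total effect, the mediator is said to completely mediate the association, which is the pattern reported for the heart-brain axis above.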

    Racism as a determinant of health: a systematic review and meta-analysis

    Despite a growing body of epidemiological evidence in recent years documenting the health impacts of racism, the cumulative evidence base has yet to be synthesized in a comprehensive meta-analysis focused specifically on racism as a determinant of health. This meta-analysis reviewed the literature focusing on the relationship between reported racism and mental and physical health outcomes. Data from 293 studies reported in 333 articles published between 1983 and 2013, and conducted predominantly in the U.S., were analysed using random effects models and mean weighted effect sizes. Racism was associated with poorer mental health (negative mental health: r = -.23, 95% CI [-.24, -.21], k = 227; positive mental health: r = -.13, 95% CI [-.16, -.10], k = 113), including depression, anxiety, psychological stress and various other outcomes. Racism was also associated with poorer general health (r = -.13, 95% CI [-.18, -.09], k = 30) and poorer physical health (r = -.09, 95% CI [-.12, -.06], k = 50). Moderation effects were found for some outcomes with regard to study and exposure characteristics. Effect sizes of racism on mental health were stronger in cross-sectional compared with longitudinal data and in non-representative samples compared with representative samples. Age, sex, birthplace and education level did not moderate the effects of racism on health. Ethnicity significantly moderated the effect of racism on negative mental health and physical health: the association between racism and negative mental health was significantly stronger for Asian American and Latino(a) American participants compared with African American participants, and the association between racism and physical health was significantly stronger for Latino(a) American participants compared with African American participants.
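
    For readers unfamiliar with how study-level correlations such as the r values above are pooled, the sketch below shows a generic random-effects pooling of correlations via Fisher's z transform and a DerSimonian-Laird estimate of between-study variance. The input correlations and sample sizes are made up; this is not the study's data, and the authors' exact estimator may differ.

```python
import numpy as np

def pool_correlations(rs, ns):
    """Random-effects pooling of Pearson correlations via Fisher's z
    and the DerSimonian-Laird estimate of between-study variance (tau^2)."""
    rs, ns = np.asarray(rs, float), np.asarray(ns, float)
    z = np.arctanh(rs)               # Fisher z transform of each correlation
    v = 1.0 / (ns - 3.0)             # within-study variance of z
    w = 1.0 / v                      # fixed-effect weights
    z_fe = np.sum(w * z) / np.sum(w)
    q = np.sum(w * (z - z_fe) ** 2)                  # Cochran's Q
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (len(rs) - 1)) / c)         # DerSimonian-Laird tau^2
    w_re = 1.0 / (v + tau2)                          # random-effects weights
    z_re = np.sum(w_re * z) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    lo, hi = z_re - 1.96 * se, z_re + 1.96 * se
    return np.tanh(z_re), (np.tanh(lo), np.tanh(hi))

# Illustrative (made-up) study-level correlations between reported racism
# and a negative mental-health outcome, with their sample sizes.
r_pooled, ci = pool_correlations([-0.30, -0.22, -0.18, -0.25], [150, 320, 90, 410])
print(f"pooled r = {r_pooled:.2f}, 95% CI [{ci[0]:.2f}, {ci[1]:.2f}]")
```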

    Search for a new gauge boson in pi(0) decays

    A search was made for a new light gauge boson X which might be produced in π⁰ → γ + X decays of neutral pions generated by 450 GeV protons in the CERN SPS neutrino target. The X's would penetrate the downstream shielding and be observed in the NOMAD detector via the Primakoff effect, in the process of X → π⁰ conversion in the external Coulomb field of a nucleus. With 1.45 × 10¹⁸ protons on target, 20 candidate events with energy between 8 and 140 GeV were found from the analysis of neutrino data. This number is in agreement with the expectation of 18.1 ± 2.8 background events from standard neutrino processes. A new 90% C.L. upper limit on the branching ratio, Br(π⁰ → γ + X) < (3.3 to 1.9) × 10⁻⁵, is obtained for X masses ranging from 0 to 120 MeV/c².
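
    To illustrate how a counting result like "20 observed events versus 18.1 ± 2.8 expected background" translates into an upper limit, the sketch below computes a simple Bayesian 90% C.L. upper limit on the number of signal events with a flat prior and the background fixed at its central value. This is a generic textbook construction, not the NOMAD collaboration's actual statistical procedure; it ignores the background uncertainty and the flux and acceptance factors needed to convert an event limit into a branching-ratio limit.

```python
import numpy as np
from scipy.stats import poisson

def upper_limit_signal(n_obs, bkg, cl=0.90, s_max=60.0, n_grid=60001):
    """Bayesian upper limit (flat prior on s >= 0) on the number of signal
    events in a Poisson counting experiment with n_obs observed events and
    bkg expected background events; background uncertainty is ignored."""
    s = np.linspace(0.0, s_max, n_grid)
    posterior = poisson.pmf(n_obs, bkg + s)   # flat prior: posterior ~ likelihood
    cdf = np.cumsum(posterior)
    cdf /= cdf[-1]                            # normalise over the grid
    return s[np.searchsorted(cdf, cl)]

# Numbers quoted in the abstract: 20 observed events, 18.1 expected background.
s_up = upper_limit_signal(n_obs=20, bkg=18.1)
print(f"~90% C.L. upper limit on signal events: {s_up:.1f}")
# Converting this into a branching-ratio limit would additionally require the
# pi0 flux, the X -> pi0 conversion probability and the detector acceptance.
```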

    Neutrinos

    The Proceedings of the 2011 workshop on Fundamental Physics at the Intensity Frontier. Science opportunities at the intensity frontier are identified and described in the areas of heavy quarks, charged leptons, neutrinos, proton decay, new light weakly-coupled particles, and nucleons, nuclei, and atoms.

    First clinical evaluation of a novel capacitive ECG system in patients with acute myocardial infarction

    The ECG plays a central role in the rapid diagnosis of acute myocardial infarction (MI). In haemodynamically unstable patients, adhesion of electrodes is sometimes difficult, and recording an ECG through layers of clothing has not been done so far. A novel capacitive measurement of ECG signals is possible without skin contact. Whether this technical innovation can be used in patients with MI is unclear. We evaluated a capacitive ECG system (cECG) in patients with anterior and inferior ST-elevation MI (STEMI) compared with patients without ST elevations in anterior and inferior leads. The cECG was recorded using a sensor array of 15 electrodes, from which the classical leads I, II, III, aVL, aVF and V1-V3 were calculated. 66 patients were included in the study. In addition to the conventional ECG (kECG), the novel cECG was recorded before reperfusion therapy was started. In a first round, 19 patients presented with anterior MI, 23 with inferior MI, and 7 with either left bundle branch block or lateral MI. Regarding anterior MI, a significant correlation (P < 0.05) was found between ST elevations in leads I, aVL, V2 and V3 when comparing cECG and kECG. In inferior MI, there was a significant correlation (P < 0.05) only in lead III between cECG and kECG, but not in II and aVF. Therefore, 17 additional patients were included in the study by placing an additional electrode on the chest, further away from the sensor array. ST elevations then correlated in all inferior leads II, III and aVF (P < 0.05), as measured in 9 patients with inferior MI. In addition, an inferior MI was correctly ruled out in 8 patients. It is possible to identify STEMIs by cECG. This innovative technique could play an important role in the pre-hospital period as well as in the hospital.
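
    The abstract states that the classical leads were calculated from the electrode array. The sketch below shows the conventional Einthoven/Goldberger/Wilson formulas for deriving those leads from electrode potentials; how the 15 capacitive electrodes are mapped onto equivalent limb and chest positions is not described in the abstract, so the input signals here are hypothetical placeholders.

```python
import numpy as np

def derive_standard_leads(ra, la, ll, v1, v2, v3):
    """Conventional derivation of limb, augmented and precordial leads from
    electrode potentials (Einthoven, Goldberger, Wilson). The mapping of the
    15-electrode capacitive array onto these equivalent positions is not
    specified in the abstract; inputs are hypothetical NumPy signal arrays."""
    leads = {
        "I":   la - ra,
        "II":  ll - ra,
        "III": ll - la,
        "aVL": la - (ra + ll) / 2,
        "aVF": ll - (ra + la) / 2,
    }
    wct = (ra + la + ll) / 3            # Wilson central terminal
    leads.update({"V1": v1 - wct, "V2": v2 - wct, "V3": v3 - wct})
    return leads

# Example with hypothetical one-second signals sampled at 500 Hz.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 500)
sig = {name: a * np.sin(2 * np.pi * 1.2 * t) + 0.02 * rng.standard_normal(t.size)
       for name, a in zip(["ra", "la", "ll", "v1", "v2", "v3"],
                          [0.1, 0.6, 1.0, 0.8, 1.2, 1.1])}
leads = derive_standard_leads(**sig)
print({k: round(float(np.ptp(v)), 3) for k, v in leads.items()})
```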

    Discovery of Salmonella trehalose phospholipids reveals functional convergence with mycobacteria.

    Salmonella species are among the world's most prevalent pathogens. Because the cell wall interfaces with the host, we designed a lipidomics approach to reveal pathogen-specific cell wall compounds. Among the molecules differentially expressed between Salmonella Paratyphi and S. Typhi, we focused on lipids that are enriched in S. Typhi, because it causes typhoid fever. We discovered a previously unknown family of trehalose phospholipids, 6,6'-diphosphatidyltrehalose (diPT) and 6-phosphatidyltrehalose (PT). Cardiolipin synthase B (ClsB) is essential for PT and diPT synthesis but not for cardiolipin biosynthesis. Chemotyping outperformed clsB homology analysis in evaluating synthesis of diPT. DiPT is restricted to a subset of Gram-negative bacteria: large amounts are produced by S. Typhi, lower amounts by other pathogens, and variable amounts by Escherichia coli strains. DiPT activates Mincle, a macrophage-activating receptor that also recognizes mycobacterial cord factor (6,6'-trehalose dimycolate). Thus, Gram-negative bacteria show convergent function with mycobacteria. Overall, we discovered a previously unknown immunostimulant that is selectively expressed among medically important bacterial species.
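
    The comparative lipidomics screen described above can be sketched generically: compare feature intensities between S. Typhi and S. Paratyphi replicates and flag features that are strongly Typhi-enriched. The code below is an illustrative toy using made-up intensities, a fold-change cut-off and Welch's t-test; it is not the authors' actual pipeline, software or thresholds.

```python
import numpy as np
from scipy import stats

def typhi_enriched_features(typhi, paratyphi, min_log2_fc=2.0, alpha=0.05):
    """Flag lipidomic features enriched in S. Typhi relative to S. Paratyphi.
    `typhi` and `paratyphi` are (n_replicates x n_features) intensity matrices.
    The thresholds are illustrative, not the values used in the study."""
    log2_fc = np.log2(typhi.mean(axis=0) + 1) - np.log2(paratyphi.mean(axis=0) + 1)
    _, pvals = stats.ttest_ind(typhi, paratyphi, axis=0, equal_var=False)  # Welch's t-test
    return np.flatnonzero((log2_fc >= min_log2_fc) & (pvals < alpha))

# Made-up intensities for 5 features across 4 replicates per strain;
# feature 2 plays the role of a Typhi-enriched lipid such as diPT.
rng = np.random.default_rng(7)
paratyphi = rng.normal(100, 10, size=(4, 5))
typhi = rng.normal(100, 10, size=(4, 5))
typhi[:, 2] += 900                        # strongly enriched in S. Typhi
print(typhi_enriched_features(typhi, paratyphi))   # expected output: [2]
```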

    Trends in Volunteering in Scandinavia

    In this chapter, we examine participation rate and time use trends in volunteering in Scandinavia from the beginning of the 1990s until the mid-2010s. The aim of the analysis is twofold. First, we aim to provide a descriptive analysis of the trends in volunteering in Scandinavia during the period under investigation. Second, we aim to determine whether and to what extent the socio-demographic and institutional changes in the Scandinavian societies during this period can explain the observed trends in volunteering. The results show that the overall levels of participation in volunteering are high and stable in the Scandinavian countries, with a small upward trend. The participation levels are all high in international comparison, but they are markedly higher in Norway and Sweden than in Denmark. Volunteers’ contributions of time appear relatively stable in Norway, but Denmark has witnessed a slight decline and Sweden a slight increase. The explanatory analysis revealed that nearly half of the upward trend in the levels of volunteering can be attributed to the expansion of education in the Scandinavian countries. The explanatory analysis also indicated that the gap in the levels of volunteering between Sweden and Norway on the one hand, and Denmark on the other, cannot be attributed to socio-demographic differences between the countries, as the gap remains unchanged when controlling for socio-demographic factors.