
    Hybrid silicon nanostructures with conductive ligands and their microscopic conductivities

    Silicon nanoparticles (SiNPs) functionalized with conjugated molecules offer a potential pathway to a new category of thermoelectric materials. While the thermoelectric performance of materials based on phenyl-acetylene capped SiNPs has been demonstrated, their low conductivity remains an obstacle to general application. A muon study of phenyl-acetylene capped SiNPs was recently carried out using the HiFi spectrometer at the Rutherford Appleton Laboratory, measuring the avoided level crossing (ALC) spectra as a function of temperature. The results show a reduction in the measured line width of the resonance above room temperature, suggesting an activated behaviour for this system. This indicates that muon spectroscopy could be a powerful method for investigating the microscopic conductivity of hybrid thermoelectric materials.
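    As an illustration of what "activated behaviour" implies quantitatively, a minimal sketch (with placeholder data, not the published measurements) fitting an Arrhenius law to carrier hopping rates inferred from the linewidth narrowing:

        # A minimal sketch, assuming an Arrhenius (activated) model for the
        # charge-carrier hopping rate inferred from the ALC linewidth narrowing.
        # The data points below are placeholders, not the published measurements.
        import numpy as np
        from scipy.optimize import curve_fit

        kB = 8.617e-5  # Boltzmann constant in eV/K

        def arrhenius(T, nu0, Ea):
            """Hopping rate nu(T) = nu0 * exp(-Ea / (kB * T))."""
            return nu0 * np.exp(-Ea / (kB * T))

        # Hypothetical hopping rates (MHz) extracted from linewidths above 300 K
        T = np.array([300.0, 320.0, 340.0, 360.0, 380.0])
        nu = np.array([1.1, 1.9, 3.1, 4.8, 7.0])

        popt, pcov = curve_fit(arrhenius, T, nu, p0=[1e4, 0.2])
        nu0, Ea = popt
        print(f"prefactor = {nu0:.3g} MHz, activation energy = {Ea:.3f} eV")

    A plot of log(rate) against 1/T would then be a straight line whose slope gives the activation energy, the usual signature of activated transport.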

    Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems

    A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system, and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
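    As a toy illustration of the two prerequisites (not the authors' genome:proteome model), the sketch below shows versatile, degenerate agents re-covering all demanded functions after a local perturbation:

        # Toy illustration (not the paper's model) of networked buffering:
        # agents whose capabilities partially overlap can absorb a local
        # loss of function by reassigning spare capacity.
        import random

        # Each agent performs more than one function (versatility), and
        # capabilities overlap across agents (degeneracy).
        agents = {
            "A": {"f1", "f2"},
            "B": {"f2", "f3"},
            "C": {"f3", "f1"},
            "D": {"f1", "f3"},
        }

        def cover(agents, demands):
            """Greedily assign one agent per demanded function; None if uncovered."""
            free = set(agents)
            assignment = {}
            for f in demands:
                candidates = [a for a in free if f in agents[a]]
                if not candidates:
                    return None  # a function is left unserved
                a = random.choice(candidates)
                assignment[f] = a
                free.discard(a)
            return assignment

        demands = ["f1", "f2", "f3"]
        print("intact:", cover(agents, demands))
        del agents["A"]  # local perturbation: knock out one agent
        print("after losing A:", cover(agents, demands))

    The greedy assignment is only illustrative, but it captures the point: because every function is covered by more than one agent, the loss of A is buffered by the network rather than by a dedicated backup.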

    A Measurement of Rb using a Double Tagging Method

    The fraction of Z -> bbbar events in hadronic Z decays has been measured by the OPAL experiment using the data collected at LEP between 1992 and 1995. The Z -> bbbar decays were tagged using displaced secondary vertices, and high momentum electrons and muons. Systematic uncertainties were reduced by measuring the b-tagging efficiency using a double tagging technique. Efficiency correlations between opposite hemispheres of an event are small, and are well understood through comparisons between real and simulated data samples. A value of Rb = 0.2178 ± 0.0011 ± 0.0013 was obtained, where the first error is statistical and the second systematic. The uncertainty on Rc, the fraction of Z -> ccbar events in hadronic Z decays, is not included in the errors. The dependence on Rc is Delta(Rb)/Rb = -0.056*Delta(Rc)/Rc, where Delta(Rc) is the deviation of Rc from the value 0.172 predicted by the Standard Model. The result for Rb agrees with the value of 0.2155 ± 0.0003 predicted by the Standard Model.
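    Schematically, the double tagging technique determines the b-tagging efficiency from the data themselves. Neglecting non-b backgrounds and hemisphere correlations (both of which the analysis accounts for), with N_had hadronic events, N_t tagged hemispheres and N_tt double-tagged events:

        \frac{N_t}{2N_{\mathrm{had}}} \approx \epsilon_b R_b, \qquad
        \frac{N_{tt}}{N_{\mathrm{had}}} \approx \epsilon_b^2 R_b
        \quad\Longrightarrow\quad
        \epsilon_b \approx \frac{2 N_{tt}}{N_t}, \qquad
        R_b \approx \frac{N_t^2}{4\,N_{tt}\,N_{\mathrm{had}}}

    As a worked example of the quoted Rc dependence: if Rc were 0.182 rather than 0.172 (Delta(Rc)/Rc = +5.8%), Rb would shift by -0.056 x 5.8% = -0.33%, i.e. by about -0.0007.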

    Measurement of the B+ and B0 lifetimes and search for CP(T) violation using reconstructed secondary vertices

    The lifetimes of the B+ and B0 mesons, and their ratio, have been measured in the OPAL experiment using 2.4 million hadronic Z0 decays recorded at LEP. Z0 -> bbbar decays were tagged using displaced secondary vertices and high momentum electrons and muons. The lifetimes were then measured using well-reconstructed charged and neutral secondary vertices selected in this tagged data sample. The results are tau(B+) = 1.643 ± 0.037 ± 0.025 ps, tau(B0) = 1.523 ± 0.057 ± 0.053 ps and tau(B+)/tau(B0) = 1.079 ± 0.064 ± 0.041, where in each case the first error is statistical and the second systematic. A larger data sample of 3.1 million hadronic Z0 decays has been used to search for CP- and CPT-violating effects by comparison of inclusive b and bbar hadron decays. No evidence for such effects is seen. The CP violation parameter Re(epsilon_B) is measured to be Re(epsilon_B) = 0.001 ± 0.014 ± 0.003, and the fractional difference between b and bbar hadron lifetimes is measured to be (Delta tau/tau)_b = [tau(b hadron) - tau(bbar hadron)]/tau(average) = -0.001 ± 0.012 ± 0.008.
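    As a consistency check on the central values, 1.643/1.523 ≈ 1.079. For two uncorrelated measurements, standard propagation for the ratio would read:

        \frac{\sigma_R}{R} = \sqrt{\left(\frac{\sigma_{\tau(B^+)}}{\tau(B^+)}\right)^2
        + \left(\frac{\sigma_{\tau(B^0)}}{\tau(B^0)}\right)^2}

    which with the statistical errors alone gives roughly ±0.047; the larger quoted ±0.064 comes from the experiment's own fit rather than from this naive formula.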

    Performance of the CMS Cathode Strip Chambers with Cosmic Rays

    The Cathode Strip Chambers (CSCs) constitute the primary muon tracking device in the CMS endcaps. Their performance has been evaluated using data taken during a cosmic ray run in fall 2008. Measured noise levels are low, with the number of noisy channels well below 1%. Coordinate resolutions were measured for all types of chambers and fall in the range 47 microns to 243 microns. The efficiencies for local charged track triggers, and for hit and segment reconstruction, were measured and are above 99%. The timing resolution per layer is approximately 5 ns.
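    As a generic sketch of how such percentage figures are typically quoted (placeholder counts, not CMS data), an efficiency measured from pass/total counts carries a binomial uncertainty:

        # Generic sketch: efficiency and binomial uncertainty from counts.
        # The counts below are placeholders, not CMS data.
        import math

        def efficiency(passed, total):
            eff = passed / total
            err = math.sqrt(eff * (1.0 - eff) / total)  # binomial, normal approx.
            return eff, err

        eff, err = efficiency(passed=9953, total=10000)
        print(f"efficiency = {eff:.4f} +/- {err:.4f}")  # ~0.9953 +/- 0.0007

    The normal approximation degrades as the efficiency approaches 1; dedicated analyses typically switch to interval methods such as Clopper-Pearson in that regime.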

    Tracing the Flow of Perceptual Features in an Algorithmic Brain Network

    The model of the brain as an information processing machine is a profound hypothesis in which neuroscience, psychology and theory of computation are now deeply rooted. Modern neuroscience aims to model the brain as a network of densely interconnected functional nodes. However, to model the dynamic information processing mechanisms of perception and cognition, it is imperative to understand brain networks at an algorithmic level, i.e. as the information flow that network nodes code and communicate. Here, using innovative methods (Directed Feature Information), we reconstructed examples of possible algorithmic brain networks that code and communicate the specific features underlying two distinct perceptions of the same ambiguous picture. In each observer, we identified a network architecture comprising one occipito-temporal hub where the features underlying both perceptual decisions dynamically converge. Our focus on detailed information flow represents an important step towards a new brain algorithmics to model the mechanisms of perception and cognition.
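    As a rough illustration of the kind of quantity involved, the sketch below is a plug-in conditional mutual information estimate on discretized signals, a standard building block of feature-specific information flow measures; it is not the authors' exact Directed Feature Information definition:

        # Minimal plug-in estimator of conditional mutual information I(F;Y|X)
        # on small discrete arrays; illustrative only.
        import numpy as np
        from collections import Counter

        def cmi(f, y, x):
            """I(F;Y|X) = sum p(f,y,x) log2[ p(f,y,x) p(x) / (p(f,x) p(y,x)) ]."""
            n = len(f)
            pfyx = Counter(zip(f, y, x))
            pfx = Counter(zip(f, x))
            pyx = Counter(zip(y, x))
            px = Counter(x)
            total = 0.0
            for (fi, yi, xi), c in pfyx.items():
                p = c / n
                total += p * np.log2(p * (px[xi] / n) /
                                     ((pfx[(fi, xi)] / n) * (pyx[(yi, xi)] / n)))
            return total

        rng = np.random.default_rng(0)
        f = rng.integers(0, 2, 1000)          # stimulus feature
        x = f ^ (rng.random(1000) < 0.1)      # "sender" node carries the feature
        y = x ^ (rng.random(1000) < 0.1)      # "receiver" inherits it from x
        # Near 0: the feature information in y is mediated by x
        print(f"I(F;Y|X) = {cmi(f.tolist(), y.tolist(), x.tolist()):.3f} bits")

    In this toy chain F -> X -> Y, conditioning on the sender X removes the feature information in the receiver Y, which is the intuition behind attributing the flow of a specific feature to a particular network path.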

    A randomised, double-blind, placebo-controlled trial of repeated nebulisation of non-viral cystic fibrosis transmembrane conductance regulator (CFTR) gene therapy in patients with cystic fibrosis

    BACKGROUND: Cystic fibrosis (CF) is a chronic, life-limiting disease caused by mutations in the CF transmembrane conductance regulator (CFTR) gene leading to abnormal airway surface ion transport, chronic lung infections, inflammation and eventual respiratory failure. With the exception of the small-molecule potentiator, ivacaftor (Kalydeco®, Vertex Pharmaceuticals, Boston, MA, USA), which is suitable for a small proportion of patients, there are no licensed therapies targeting the basic defect. The UK Cystic Fibrosis Gene Therapy Consortium has taken a cationic lipid-mediated CFTR gene therapy formulation through preclinical and clinical development. OBJECTIVE: To determine the clinical efficacy of the formulation delivered to the airways over a period of 1 year in patients with CF. DESIGN: This was a randomised, double-blind, placebo-controlled Phase IIb trial of the CFTR gene–liposome complex pGM169/GL67A. Randomisation was performed via InForm™ version 4.6 (Phase Forward Incorporated, Oracle, CA, USA) and was 1:1, except for patients in the mechanistic subgroups (2:1). Allocation was blinded by masking nebuliser chambers. SETTINGS: Data were collected at the clinical and scientific sites and entered onto a trial-specific InForm version 4.6 database. PARTICIPANTS: Patients with CF aged ≥ 12 years with forced expiratory volume in the first second (FEV1) between 50% and 90% predicted and any combination of CFTR mutations. The per-protocol group (≥ 9 doses) consisted of 54 patients receiving placebo (62 randomised) and 62 patients receiving gene therapy (78 randomised). INTERVENTIONS: Subjects received 5 ml of nebulised pGM169/GL67A (active) or 0.9% saline (placebo) at 28 (±5)-day intervals over 1 year. MAIN OUTCOME MEASURES: The primary end point was the relative change in percentage predicted FEV1 over the 12-month period. A number of secondary clinical outcomes were assessed alongside safety measures: other spirometric values; lung clearance index (LCI) assessed by multibreath washout; structural disease on computed tomography (CT) scan; the Cystic Fibrosis Questionnaire – Revised (CFQ-R), a validated quality-of-life questionnaire; exercise capacity and monitoring; systemic and sputum inflammatory markers; and adverse events (AEs). A mechanistic study was performed in a subgroup in whom transgene deoxyribonucleic acid (DNA) and messenger ribonucleic acid (mRNA) were measured alongside nasal and lower airway potential difference. RESULTS: There was a significant (p = 0.046) treatment effect (TE) of 3.7% [95% confidence interval (CI) 0.1% to 7.3%] in the primary end point at 12 months, and in secondary end points including forced vital capacity (FVC) (p = 0.031) and CT gas trapping (p = 0.048). Other outcomes, although not reaching statistical significance, favoured active treatment. Effects were noted by 1 month and were irrespective of sex, age or CFTR mutation class. Subjects with a more severe baseline FEV1 had a FEV1 TE of 6.4% (95% CI 0.8% to 12.1%) and greater changes in many other secondary outcomes. However, the more mildly affected group also demonstrated benefits, particularly in small airway disease markers such as LCI. The active group showed a significantly (p = 0.032) greater bronchial chloride secretory response. No difference in treatment-attributable AEs was seen between the placebo and active groups.
CONCLUSIONS: Monthly application of the pGM169/GL67A gene therapy formulation was associated with an improvement in lung function, other clinically relevant parameters and bronchial CFTR function, compared with placebo. LIMITATIONS: Although encouraging, the improvement in FEV1 was modest and was not accompanied by detectable improvement in patients’ quality of life. FUTURE WORK: Future work will focus on attempts to increase efficacy by increasing dose or frequency, the coadministration of a CFTR potentiator, or the use of modified viral vectors capable of repeated administration. TRIAL REGISTRATION: ClinicalTrials.gov NCT01621867
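    Schematically, the primary end point is a relative, not absolute, change in percent-predicted FEV1; a minimal sketch with made-up numbers (the 3.7% below is constructed to match the reported effect size, not derived from trial data):

        # Schematic only: the primary end point is the *relative* change in
        # percent-predicted FEV1. Values below are made up, not trial data.
        def relative_change(fev1_baseline, fev1_12m):
            return 100.0 * (fev1_12m - fev1_baseline) / fev1_baseline

        # e.g. a patient at 70% predicted whose FEV1 falls to 68% predicted:
        print(relative_change(70.0, 68.0))   # -2.86% relative change

        # Treatment effect = mean relative change (active) - mean (placebo);
        # a stable active arm vs. a declining placebo arm yields a positive TE.
        active, placebo = 0.2, -3.5          # hypothetical arm means (%)
        print("TE =", active - placebo)      # 3.7

    On this definition a positive treatment effect can reflect stabilisation of the active arm against decline in the placebo arm, rather than an outright gain in lung function.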

    Low potency toxins reveal dense interaction networks in metabolism

    Background: The chemicals of metabolism are constructed of a small set of atoms and bonds. This may be because chemical structures outside the chemical space in which life operates are incompatible with biochemistry, or because mechanisms to make or utilize such excluded structures have not evolved. In this paper I address the extent to which biochemistry is restricted to a small fraction of the chemical space of possible chemicals, a restricted subset that I call Biochemical Space. I explore evidence that this restriction is at least in part due to selection against specific structures, and suggest a mechanism by which this occurs. Results: Chemicals that contain structures that are outside Biochemical Space (UnBiological groups) are more likely to be toxic to a wide range of organisms, even though they have no specifically toxic groups and no obvious mechanism of toxicity. This correlation of UnBiological groups with toxicity is stronger for low potency (millimolar) toxins. I relate this to the observation that most chemicals interact with many biological structures at low millimolar concentrations. I hypothesise that life has to select its components not only to have a specific set of functions but also to avoid interactions with all the other components of life that might degrade their function. Conclusions: The chemistry of life has to form a dense, self-consistent network of chemical structures, and cannot easily be arbitrarily extended. The toxicity of arbitrary chemicals is a reflection of the disruption to that network occasioned by trying to insert a chemical into it without also selecting all the other components to tolerate that chemical. This suggests new ways to test for the toxicity of chemicals, and implies that engineering organisms to make high concentrations of materials such as chemical precursors or fuels may require more substantial engineering than just of the synthetic pathways involved.
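    A toy numeric illustration of the central correlation (fragment labels and potencies are invented stand-ins, not the paper's data or method):

        # Toy illustration (not the paper's method): score a molecule by the
        # fraction of its substructures lying outside a reference "Biochemical
        # Space", then compare scores against potency. Fragment labels are
        # stand-ins for real substructure fingerprints.
        import numpy as np

        biochemical_space = {"carboxyl", "amine", "hydroxyl", "phosphate", "thiol"}

        def unbiological_fraction(fragments):
            outside = [f for f in fragments if f not in biochemical_space]
            return len(outside) / len(fragments)

        molecules = {
            "metabolite-like": ["carboxyl", "amine", "hydroxyl"],
            "mixed":           ["carboxyl", "aryl-halide", "hydroxyl"],
            "unbiological":    ["aryl-halide", "nitro", "azide"],
        }
        # Hypothetical toxic potencies (mM); lower = more potent
        potency_mM = np.array([25.0, 8.0, 2.0])

        scores = np.array([unbiological_fraction(f) for f in molecules.values()])
        r = np.corrcoef(scores, np.log(potency_mM))[0, 1]  # strongly negative here
        print(dict(zip(molecules, scores)), f"corr(score, log potency) = {r:.2f}")

    The negative correlation in this toy mirrors the paper's claim in direction only: the more a molecule's substructures fall outside Biochemical Space, the more likely it is to disrupt the network of weak interactions that metabolism depends on.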

    Measurement of the W+W-gamma Cross-section and First Direct Limits on Anomalous Electroweak Quartic Gauge Couplings

    A study of W+W- events accompanied by hard photon radiation produced in e+e- collisions at LEP is presented. Events consistent with two on-shell W bosons and an isolated photon are selected from 183 pb^-1 of data recorded at sqrt(s) = 189 GeV. From these data, 17 W+W-gamma candidates are selected with photon energy greater than 10 GeV, consistent with the Standard Model expectation. These events are used to measure the e+e- -> W+W-gamma cross-section within a set of geometric and kinematic cuts: sigma(W+W-gamma) = 136 ± 37 ± 8 fb, where the first error is statistical and the second systematic. The photon energy spectrum is used to set the first direct, albeit weak, limits on possible anomalous contributions to the W+W-gamma-gamma and W+W-gamma-Z0 vertices: -0.070 GeV^-2 < a_0/Lambda^2 < 0.070 GeV^-2, -0.13 GeV^-2 < a_c/Lambda^2 < 0.19 GeV^-2, -0.61 GeV^-2 < a_n/Lambda^2 < 0.57 GeV^-2, where Lambda represents the energy scale for new physics.
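    Schematically, the cross-section within the stated cuts follows from counting (the published analysis includes background subtraction and efficiency details not reproduced here):

        \sigma_{WW\gamma} = \frac{N_{\mathrm{obs}} - N_{\mathrm{bkg}}}{\varepsilon\,\mathcal{L}_{\mathrm{int}}}

    With an integrated luminosity of 183 pb^-1, the measured 136 fb corresponds to sigma x L ≈ 25 events produced within the cuts, broadly consistent with the 17 selected candidates once selection efficiency and backgrounds are taken into account.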

    Transverse and Longitudinal Bose-Einstein Correlations in Hadronic Z0 Decays

    Bose-Einstein correlations in pairs of identical charged pions produced in a sample of 4.3 million Z0 hadronic decays are studied as a function of the three components of the momentum difference, transverse ("out" and "side") and longitudinal with respect to the thrust direction of the event. A significant difference between the transverse, r_t_side, and longitudinal, r_l, dimensions is observed, indicating that the emitting source of identical pions, as observed in the Longitudinally CoMoving System, has an elongated shape. This is observed with a variety of selection techniques. Specifically, the values of the parameters obtained by fitting the extended Goldhaber parametrisation to the correlation function C' = C^{DATA}/C^{MC} for two-jet events, selected with the Durham algorithm and resolution parameter ycut = 0.04, are r_t_out = (0.647 ± 0.011(stat) +0.022 -0.124(syst)) fm, r_t_side = (0.809 ± 0.009(stat) +0.019 -0.032(syst)) fm, r_l = (0.989 ± 0.011(stat) +0.030 -0.015(syst)) fm and r_l/r_t_side = 1.222 ± 0.027(stat) +0.075 -0.012(syst). The results are discussed in the context of a recent model of Bose-Einstein correlations based on string fragmentation.
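    For reference, a common form of the extended Goldhaber parametrisation used in such three-dimensional fits (schematic; normalisation and possible long-range terms omitted):

        C'(Q_{\mathrm{out}}, Q_{\mathrm{side}}, Q_{\mathrm{long}}) =
        N\left[1 + \lambda\,\exp\!\left(-r_{t,\mathrm{out}}^2 Q_{\mathrm{out}}^2
        - r_{t,\mathrm{side}}^2 Q_{\mathrm{side}}^2
        - r_l^2 Q_{\mathrm{long}}^2\right)\right]

    Here lambda measures the correlation strength and the r parameters the source dimensions along each axis; the ratio r_l/r_t_side > 1 quantifies the elongation reported above.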