Correcting for cell-type heterogeneity in epigenome-wide association studies: revisiting previous analyses
Epistasis not needed to explain low dN/dS
An important question in molecular evolution is whether an amino acid that
occurs at a given position makes an independent contribution to fitness, or
whether its effect depends on the state of other loci in the organism's genome,
a phenomenon known as epistasis. In a recent letter to Nature, Breen et al.
(2012) argued that epistasis must be "pervasive throughout protein evolution"
because the observed ratio between the per-site rates of non-synonymous and
synonymous substitutions (dN/dS) is much lower than would be expected in the
absence of epistasis. However, when calculating the expected dN/dS ratio in the
absence of epistasis, Breen et al. assumed that all amino acids observed in a
protein alignment at any particular position have equal fitness. Here, we relax
this unrealistic assumption and show that any dN/dS value can in principle be
achieved at a site, without epistasis. Furthermore, for all nuclear and
chloroplast genes in the Breen et al. dataset, we show that the observed dN/dS
values and the observed patterns of amino acid diversity at each site are
jointly consistent with a non-epistatic model of protein evolution.
Comment: This manuscript is in response to "Epistasis as the primary factor in
molecular evolution" by Breen et al., Nature 490, 535-538 (2012).
Molecular dynamics simulations of oscillatory Couette flows with slip boundary conditions
The effect of interfacial slip on steady-state and time-periodic flows of
monatomic liquids is investigated using non-equilibrium molecular dynamics
simulations. The fluid phase is confined between atomically smooth rigid walls,
and the fluid flows are induced by moving one of the walls. In steady shear
flows, the slip length increases almost linearly with shear rate. We found that
the velocity profiles in oscillatory flows are well described by the Stokes
flow solution with the slip length that depends on the local shear rate.
Interestingly, the rate dependence of the slip length obtained in steady shear
flows is recovered when the slip length in oscillatory flows is plotted as a
function of the local shear rate magnitude. For both types of flows, the
friction coefficient at the liquid-solid interface correlates well with the
structure of the first fluid layer near the solid wall.
Comment: 31 pages, 11 figures
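For reference, the continuum picture the velocity profiles are compared against is a Navier slip condition, in which the fluid velocity at the wall differs from the wall velocity by the slip length times the local shear rate. Below is a minimal steady-state sketch with an illustrative, constant slip length; the study itself extracts a rate-dependent slip length from the MD data, and the parameter values here are assumptions, not simulation inputs.

```python
import numpy as np

# Illustrative parameters in reduced (LJ-like) units; not values from the study.
U = 1.0     # speed of the moving lower wall
h = 20.0    # channel height
Ls = 3.0    # Navier slip length, taken constant here for simplicity

# Steady planar Couette flow with the Navier condition |u_fluid - u_wall| = Ls * |du/dz|
# applied at both walls.  The profile remains linear; slip merely lowers the shear rate.
shear_rate = U / (h + 2.0 * Ls)            # magnitude of du/dz
z = np.linspace(0.0, h, 101)
u = U * (h + Ls - z) / (h + 2.0 * Ls)      # velocity profile, moving wall at z = 0

print(f"shear rate   : {shear_rate:.4f}  (no-slip value: {U / h:.4f})")
print(f"slip velocity: {U * Ls / (h + 2.0 * Ls):.4f}  at each wall")
```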
Lightest sterile neutrino abundance within the nuMSM
We determine the abundance of the lightest (dark matter) sterile neutrinos
created in the Early Universe due to active-sterile neutrino transitions from
the thermal plasma. Our starting point is the field-theoretic formula for the
sterile neutrino production rate, derived in our previous work [JHEP
06(2006)053], which allows us to systematically incorporate all relevant effects,
and also to analyse various hadronic uncertainties. Our numerical results
differ moderately from previous computations in the literature, and lead to an
absolute upper bound on the mixing angles of the dark matter sterile neutrino.
Comparing this bound with existing astrophysical X-ray constraints, we find
that the Dodelson-Widrow scenario, which proposes sterile neutrinos generated
by active-sterile neutrino transitions to be the sole source of dark matter, is
only possible for sterile neutrino masses lighter than 3.5 keV (6 keV if all
hadronic uncertainties are pushed in one direction and the most stringent X-ray
bounds are relaxed by a factor of two). This upper bound may conflict with a
lower bound from structure formation, but a definitive conclusion necessitates
numerical simulations with the non-equilibrium momentum distribution function
that we derive. If other production mechanisms are also operative, no upper
bound on the sterile neutrino mass can be established.
Comment: 34 pages. v2: clarifications and a reference added; published
version. v3: erratum appended
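For orientation, the two constraints being combined have a simple structure. The radiative decay rate below is the standard expression behind the X-ray bounds (G_F is the Fermi constant, alpha the fine-structure constant, theta the active-sterile mixing angle, m_s the sterile neutrino mass); the abundance relation is only a rough proportionality, since its precise normalisation and mass dependence are exactly what the paper computes.

```latex
% Radiative decay driving the X-ray constraint:
\Gamma_{\nu_s \to \nu\gamma} \;=\; \frac{9\,\alpha\, G_F^{2}}{1024\,\pi^{4}}\,
  \sin^{2}(2\theta)\, m_s^{5},
\qquad
% Dodelson-Widrow production scales roughly as
\Omega_s h^{2} \;\propto\; \sin^{2}(2\theta)\, m_s^{2}.
```

Requiring the produced abundance to match the observed dark-matter density fixes sin^2(2θ) as a decreasing function of m_s, while the X-ray limits bound sin^2(2θ) from above with a steeper mass dependence; above a few keV the required mixing exceeds what the X-ray data allow, which is how an upper bound on the mass of the kind quoted above arises.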
Ten Years of Experience Training Non-Physician Anesthesia Providers in Haiti.
Surgery is increasingly recognized as an effective means of treating a proportion of the global burden of disease, especially in resource-limited countries. Often non-physicians, such as nurses, provide the majority of anesthesia; however, their training and formal supervision are often of low priority or even non-existent. To increase the number of safe anesthesia providers in Haiti, Médecins Sans Frontières has trained nurse anesthetists (NAs) for over 10 years. This article describes the challenges, outcomes, and future directions of this training program. From 1998 to 2008, 24 students graduated. Nineteen (79%) continue to work as NAs in Haiti and 5 (21%) have emigrated. In 2008, NAs were critical in providing anesthesia during a post-hurricane emergency where they performed 330 procedures. Mortality was 0.3% and not associated with lack of anesthesiologist supervision. The completion rate of this training program was high and the majority of graduates continue to work as nurse anesthetists in Haiti. Successful training requires a setting with a sufficient volume and diversity of operations, appropriate anesthesia equipment, a structured and comprehensive training program, and recognition of the training program by the national ministry of health and relevant professional bodies. Preliminary outcomes support findings elsewhere that NAs can be a safe and effective alternative where anesthesiologists are scarce. Training non-physician anesthetists is a feasible and important way to scale up surgical services in resource-limited settings.
How a Diverse Research Ecosystem Has Generated New Rehabilitation Technologies: Review of NIDILRR’s Rehabilitation Engineering Research Centers
Over 50 million United States citizens (1 in 6 people in the US) have a developmental, acquired, or degenerative disability. The average US citizen can expect to live 20% of his or her life with a disability. Rehabilitation technologies play a major role in improving the quality of life for people with a disability, yet widespread and highly challenging needs remain. Within the US, a major effort aimed at the creation and evaluation of rehabilitation technology has been the Rehabilitation Engineering Research Centers (RERCs) sponsored by the National Institute on Disability, Independent Living, and Rehabilitation Research. As envisioned at their conception by a panel of the National Academy of Science in 1970, these centers were intended to take a “total approach to rehabilitation”, combining medicine, engineering, and related science, to improve the quality of life of individuals with a disability. Here, we review the scope, achievements, and ongoing projects of an unbiased sample of 19 currently active or recently terminated RERCs. Specifically, for each center, we briefly explain the needs it targets, summarize key historical advances, identify emerging innovations, and consider future directions. Our assessment from this review is that the RERC program indeed involves a multidisciplinary approach, with 36 professional fields involved, although 70% of research and development staff are in engineering fields, 23% in clinical fields, and only 7% in basic science fields; significantly, 11% of the professional staff have a disability related to their research. We observe that the RERC program has substantially diversified the scope of its work since the 1970s, addressing more types of disabilities using more technologies, and, in particular, often now focusing on information technologies. RERC work also now often views users as integrated into an interdependent society through technologies that both people with and without disabilities co-use (such as the internet, wireless communication, and architecture). In addition, RERC research has evolved to view users as able to improve outcomes through learning, exercise, and plasticity (rather than being static), which can be optimally timed. We provide examples of rehabilitation technology innovation produced by the RERCs that illustrate this increasingly diversifying scope and evolving perspective. We conclude by discussing growth opportunities and possible future directions of the RERC program.
Increased insolation threshold for runaway greenhouse processes on Earth-like planets
Because the solar luminosity increases over geological timescales, Earth's
climate is expected to warm, increasing water evaporation which, in turn,
enhances the atmospheric greenhouse effect. Above a certain critical
insolation, this destabilizing greenhouse feedback can "runaway" until all the
oceans are evaporated. Through increases in stratospheric humidity, warming may
also cause oceans to escape to space before the runaway greenhouse occurs. The
critical insolation thresholds for these processes, however, remain uncertain
because they have so far been evaluated with unidimensional models that cannot
account for the dynamical and cloud feedback effects that are key stabilizing
features of Earth's climate. Here we use a 3D global climate model to show that
the threshold for the runaway greenhouse is about 375 W/m^2, significantly
higher than previously thought. Our model is specifically developed to quantify
the climate response of Earth-like planets to increased insolation in hot and
extremely moist atmospheres. In contrast with previous studies, we find that
clouds have a destabilizing feedback on the long-term warming. However,
subsident, unsaturated regions created by the Hadley circulation have a
stabilizing effect that is strong enough to defer the runaway greenhouse limit
to higher insolation than inferred from 1D models. Furthermore, because of
wavelength-dependent radiative effects, the stratosphere remains cold and dry
enough to hamper atmospheric water escape, even at large fluxes. This has
strong implications for Venus' early water history and extends the size of the
habitable zone around other stars.
Comment: Published in Nature. Online publication date: December 12, 2013.
Accepted version before journal editing and with Supplementary Information
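As a back-of-envelope check on what the quoted threshold implies, one can compare it with the present global-mean insolation and with a standard fit for the Sun's slow brightening. The numbers below (present solar constant, the Gough 1981 luminosity fit) are reference values assumed for this sketch, not inputs or results of the paper, and the threshold is read as a global-mean flux comparable to today's S0/4.

```python
# Back-of-envelope only; S0 and the Gough (1981) luminosity fit are standard
# reference values assumed here, not inputs or outputs of the paper.
S0 = 1361.0                    # present-day solar constant, W/m^2
mean_insolation = S0 / 4.0     # present global-mean insolation, ~340 W/m^2
threshold = 375.0              # runaway-greenhouse threshold quoted in the abstract, W/m^2

boost = threshold / mean_insolation
print(f"Required increase in solar luminosity: {100.0 * (boost - 1.0):.0f}%")

# Gough (1981): L(t)/L0 = 1 / (1 + 0.4 * (1 - t/t0)), with t the solar age.
t0 = 4.57e9                    # present solar age in years
t_reach = t0 * (1.0 - (1.0 / boost - 1.0) / 0.4)
print(f"Reached roughly {(t_reach - t0) / 1e9:.1f} Gyr from now")   # ~1 Gyr
```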
A mixed methods approach to evaluating community drug distributor performance in the control of neglected tropical diseases
BACKGROUND: Trusted literate, or semi-literate, community drug distributors (CDDs) are the primary implementers in integrated preventive chemotherapy (IPC) programmes for Neglected Tropical Disease (NTD) control. The CDDs are responsible for safely distributing drugs, galvanising communities to receive annual treatment repeatedly, often over many years, creating and updating treatment registers, monitoring for side-effects and compiling treatment coverage reports. These individuals are 'volunteers' for the programmes and do not receive remuneration for their annual work commitment. METHODS: A mixed methods approach, which included pictorial diaries to prospectively record CDD use of time, structured interviews and focus group discussions, was used to triangulate data on how 58 CDDs allocated their time towards their routine family activities and to NTD Programme activities in Uganda. The opportunity costs of CDD time were valued, performance was assessed by determining the relationship between time and programme coverage, and CDD motivation for participating in the programme was explored. RESULTS: Key findings showed that approximately 2.5 working weeks (range 0.6-11.4 working weeks) were spent on NTD Programme activities per year. The amount of time spent on NTD control activities increased significantly as the number of drug deliveries required within an IPC campaign rose from one to three. CDD time spent on NTD Programme activities significantly reduced the time available for subsistence and income-generating engagements. As CDDs took more time to complete NTD Programme activities, their treatment performance, in terms of validated coverage, significantly decreased. Motivation for the programme was reported as low and CDDs felt undervalued. CONCLUSIONS: CDDs contribute a considerable amount of opportunity cost to the overall economic cost of the NTD Programme in Uganda due to the commitment of their time. Nevertheless, programme coverage of at least 75%, as required by the World Health Organisation, is not being achieved, and vulnerable individuals may not have access to treatment as a consequence of sub-optimal performance by the CDDs due to workload and programmatic factors.
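A minimal sketch of the kind of opportunity-cost valuation described above; only the 2.5 working weeks figure comes from the abstract, while the wage, working days per week, and cadre size are hypothetical placeholders.

```python
# Back-of-envelope valuation of CDD time.  The 2.5 working weeks per year comes
# from the abstract; every other number is a hypothetical placeholder.
weeks_per_cdd = 2.5          # mean time on NTD Programme activities per year
days_per_week = 5            # hypothetical working days per week
daily_wage_usd = 2.0         # hypothetical local daily wage used to value time
cdds_per_district = 500      # hypothetical size of a district CDD cadre

cost_per_cdd = weeks_per_cdd * days_per_week * daily_wage_usd
print(f"Opportunity cost per CDD per year: ${cost_per_cdd:.2f}")
print(f"Hypothetical district-level total: ${cost_per_cdd * cdds_per_district:,.0f}")
```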
Monoidal computer III: A coalgebraic view of computability and complexity
Monoidal computer is a categorical model of intensional computation, where
many different programs correspond to the same input-output behavior. The
upshot of yet another model of computation is that a categorical formalism
should provide a much-needed high-level language for the theory of computation,
flexible enough to allow abstracting away the low level implementation details
when they are irrelevant, or taking them into account when they are genuinely
needed. A salient feature of the approach through monoidal categories is the
formal graphical language of string diagrams, which supports visual reasoning
about programs and computations.
In the present paper, we provide a coalgebraic characterization of monoidal
computer. It turns out that the availability of interpreters and specializers,
which make a monoidal category into a monoidal computer, is equivalent to the
existence of a *universal state space*, that carries a weakly final state
machine for any pair of input and output types. Being able to program state
machines in monoidal computers allows us to represent Turing machines, to
capture their execution, count their steps, as well as, e.g., the memory cells
that they use. The coalgebraic view of monoidal computer thus provides a
convenient diagrammatic language for studying computability and complexity.
Comment: 34 pages, 24 figures; in this version: added the Appendix
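The interpreters and specializers mentioned above have familiar counterparts in ordinary programming: a universal evaluator and partial evaluation in the sense of the s-m-n theorem. The Python sketch below is only an analogy to fix intuition, not the categorical construction itself; the function names are invented for the example.

```python
# Programs as data: a "program" here is the source text of a two-argument function f.
# run() plays the role of an interpreter (universal evaluator); specialize() plays the
# role of a specializer (s-m-n theorem): it hard-wires the first argument into a new program.

def run(program: str, x, y):
    """Interpret `program` (source defining f(x, y)) on the inputs x and y."""
    env = {}
    exec(program, env)              # define f in a fresh environment
    return env["f"](x, y)

def specialize(program: str, x) -> str:
    """Return the source of a one-argument program g(y) = f(x, y)."""
    return (
        f"{program}\n"
        f"_x = {x!r}\n"
        f"def g(y):\n"
        f"    return f(_x, y)\n"
    )

add_src = "def f(x, y):\n    return x + y\n"
add3_src = specialize(add_src, 3)

env = {}
exec(add3_src, env)
assert run(add_src, 3, 4) == env["g"](4) == 7   # specialization preserves behaviour
```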
