Effect of individual-level and socioeconomic factors on long-term survival after cataract surgery over a 30-year period
Purpose:
To evaluate survival and the risk for mortality after cataract surgery in relation to individual-level and socioeconomic factors in Scotland over 3 decades.
Setting:
Linked healthcare data, United Kingdom.
Design:
Representative population-based study.
Methods:
A 5% random sample of Scottish decedents linked to hospital records (1981 to 2012) was assessed. Survival time, survival probability, and determinants of mortality were evaluated after the first and second recorded hospital episodes for cataract surgery. Cox proportional-hazards regression models were used to assess the effect on mortality of individual-level and socioeconomic factors, including age, geographic location, socioeconomic status, and comorbidity.
Results:
The study evaluated linked administrative healthcare data from 9228 deceased patients who had cataract surgery. The mean survival time was 2383 days ± 1853 (SD). The survival probability decreased from 98% at 90 days after surgery to 22% at 10 years, 2% at 20 years, and 0% after 30 years. The mean age was 77 ± 9 years. Age (hazard ratio [HR] 3.66; 95% confidence interval [CI], 2.97-3.80; P < .001) and severe comorbidity (HR 1.68; 95% CI, 1.47-1.91; P < .001) were associated with an increased risk for mortality; women had a 20% lower risk than men (HR 0.80; 95% CI, 0.76-0.83; P < .001). Socioeconomic status and rural geographic location were not linked to mortality.
Conclusions:
Long-term survival after cataract surgery was determined by individual-level characteristics, reflecting the mortality patterns of aging populations. The mortality risk was independent of socioeconomic and geographic factors per se.
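As a purely illustrative companion to the analysis described above, the sketch below shows the kind of Cox proportional-hazards fit the abstract reports, using the third-party lifelines library on synthetic data; the column names and data-generating assumptions are hypothetical stand-ins, not the study's variables.

```python
# Minimal sketch of a Cox proportional-hazards fit of the kind described in
# the abstract. Data are synthetic; columns are hypothetical stand-ins.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age_at_surgery": rng.normal(77, 9, n).round(),
    "female": rng.integers(0, 2, n),
    "comorbidity_score": rng.poisson(1.0, n),
    "deprivation_quintile": rng.integers(1, 6, n),  # SES proxy, no effect here
})

# Synthetic survival times whose hazards mimic the reported directions:
# older age and comorbidity shorten survival, female sex lengthens it.
risk = (0.05 * (df["age_at_surgery"] - 77)
        + 0.4 * df["comorbidity_score"]
        - 0.2 * df["female"])
df["survival_days"] = rng.exponential(2400 * np.exp(-risk))
df["event"] = 1          # decedent sample: every record ends in death

cph = CoxPHFitter()
cph.fit(df, duration_col="survival_days", event_col="event")
cph.print_summary()      # hazard ratios with 95% CIs, analogous to the abstract
```

Because `deprivation_quintile` does not enter the synthetic hazard, its fitted HR should sit near 1, loosely mirroring the null socioeconomic finding reported above.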
Assessing framing of uncertainties in water management practice
Dealing with uncertainties in water management is an important issue, and one that will only grow in importance in light of global changes, particularly climate change. So far, uncertainties in water management have mostly been assessed from a scientific point of view and in quantitative terms. In this paper, we focus on the perspectives of water management practice, adopting a qualitative approach. We consider it important to know how uncertainties are framed in water management practice in order to develop practice-relevant strategies for dealing with them. Framing refers to how people make sense of the world. With the aim of identifying the important parameters for the framing of uncertainties in water management practice, we analyze uncertainty situations described by decision-makers in water management. The analysis builds on a series of "Uncertainty Dialogues" carried out within the NeWater project with water managers in the Rhine, Elbe and Guadiana basins in 2006. During these dialogues, representatives of these river basins were asked what uncertainties they encountered in their professional work and how they confronted them. Analysing these dialogues, we identified several important parameters of how uncertainties get framed. Our assumption is that making the framing of uncertainty explicit for water managers will allow them to deal better with the respective uncertainty situations. Keywords: Framing - Uncertainty - Water management practice
The antisaccade task as an index of sustained goal activation in working memory: modulation by nicotine
The antisaccade task provides a laboratory analogue of situations in which execution of the correct behavioural response requires the suppression of a more prepotent or habitual response. Errors (failures to inhibit a reflexive prosaccade towards a sudden-onset target) are significantly increased in patients with damage to the dorsolateral prefrontal cortex and patients with schizophrenia. Recent models of antisaccade performance suggest that errors are more likely to occur when the intention to initiate an antisaccade is insufficiently activated within working memory. Nicotine has been shown to enhance specific working memory processes in healthy adults. MATERIALS AND METHODS: We explored the effect of nicotine on antisaccade performance in a large sample (N = 44) of young adult smokers. Minimally abstinent participants attended two test sessions and were asked to smoke one of their own cigarettes between baseline and retest during one session only. RESULTS AND CONCLUSION: Nicotine reduced antisaccade errors and correct antisaccade latencies when delivered before optimum performance levels were achieved, suggesting that nicotine supports the activation of intentions in working memory during task performance. The implications of this research for current theoretical accounts of antisaccade performance, and for interpreting the increased rate of antisaccade errors found in some psychiatric patient groups, are discussed.
Astrobiological Complexity with Probabilistic Cellular Automata
The search for extraterrestrial life and intelligence constitutes one of the major endeavors in science, but it has so far been quantitatively modeled only rarely, and then in a cursory and superficial fashion. We argue that probabilistic cellular automata (PCA) represent the best quantitative framework for modeling the astrobiological history of the Milky Way and its Galactic Habitable Zone. The relevant astrobiological parameters are to be modeled as the elements of the input probability matrix for the PCA kernel. With the underlying simplicity of the cellular automata constructs, this approach enables a quick analysis of a large and ambiguous space of input parameters. We perform a simple clustering analysis of typical astrobiological histories and discuss the relevant boundary conditions of practical importance for planning and guiding actual empirical astrobiological and SETI projects. In addition to showing how the present framework is adaptable to more complex situations and updated observational databases from current and near-future space missions, we demonstrate how numerical results could offer a cautious rationale for the continuation of practical SETI searches.
Comment: 37 pages, 11 figures, 2 tables; added journal reference below
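To make the PCA framing concrete, here is a deliberately small sketch of a probabilistic cellular automaton of this general kind; the four states, the transition matrix, and the neighbour-colonisation rule are illustrative assumptions of this sketch, not the paper's actual parameters.

```python
# Toy probabilistic cellular automaton: a grid of Galactic Habitable Zone
# sites, each in one of four astrobiological states, updated from an input
# probability matrix. All states and rates below are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
EMPTY, SIMPLE, COMPLEX, TECH = range(4)

# P[i, j]: per-step probability that a site in state i transitions to state j.
P = np.array([
    [0.995, 0.005, 0.000, 0.000],   # abiogenesis is rare
    [0.001, 0.989, 0.010, 0.000],   # simple life may be sterilised or advance
    [0.001, 0.000, 0.989, 0.010],
    [0.001, 0.000, 0.000, 0.999],   # technological sites rarely regress
])
cum = P.cumsum(axis=1)              # cumulative rows for categorical sampling

def step(grid):
    """One synchronous PCA update, plus a crude neighbour-colonisation rule."""
    u = rng.random(grid.shape)
    new = (u[..., None] > cum[grid]).sum(axis=-1)   # sample next state per site
    tech = grid == TECH
    neigh = (np.roll(tech, 1, 0) | np.roll(tech, -1, 0)   # torus neighbourhood
             | np.roll(tech, 1, 1) | np.roll(tech, -1, 1))
    colonise = (grid == EMPTY) & neigh & (rng.random(grid.shape) < 0.05)
    new[colonise] = TECH
    return new

grid = np.zeros((100, 100), dtype=int)
for _ in range(1000):
    grid = step(grid)
# One simulated "astrobiological history": final counts of sites per state.
print(dict(zip(["empty", "simple", "complex", "tech"],
               np.bincount(grid.ravel(), minlength=4))))
```

Running many such histories over a sampled range of matrix entries is the kind of sweep over an "ambiguous space of input parameters" that the abstract describes.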
Algorithmic statistics: forty years later
Algorithmic statistics has two different (and almost orthogonal) motivations. From the philosophical point of view, it tries to formalize how statistics works and why some statistical models are better than others. After this notion of a "good model" is introduced, a natural question arises: is it possible that for some piece of data there is no good model? If so, how often do such bad ("non-stochastic") data appear "in real life"?
Another, more technical motivation comes from algorithmic information theory. In this theory, a notion of the complexity of a finite object (= the amount of information in this object) is introduced; it assigns to every object some number, called its algorithmic complexity (or Kolmogorov complexity). Algorithmic statistics provides a more fine-grained classification: for each finite object, some curve is defined that characterizes its behavior. It turns out that several different definitions give (approximately) the same curve.
In this survey we try to provide an exposition of the main results in the field (including full proofs for the most important ones), as well as some historical comments. We assume that the reader is familiar with the main notions of algorithmic information theory (Kolmogorov complexity).
Comment: Missing proofs added
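For readers who want the formal object, the curve mentioned above is, in one standard formulation, Kolmogorov's structure function; the following sketch states it up to the usual O(log) precision (C denotes Kolmogorov complexity, and S ranges over finite sets of strings containing x).

```latex
% Kolmogorov's structure function of a string x ("the curve" of the abstract):
h_x(\alpha) \;=\; \min \bigl\{\, \log_2 |S| \;:\; x \in S,\ S \text{ finite},\ C(S) \le \alpha \,\bigr\}
% A model S is "good" (sufficient) when C(S) + \log_2 |S| \approx C(x);
% non-stochastic data are those for which h_x(\alpha) stays far above
% C(x) - \alpha until \alpha is large.
```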
Networked buffering: a basic mechanism for distributed robustness in complex adaptive systems
A generic mechanism - networked buffering - is proposed for the generation of robust traits in complex systems. It requires two basic conditions to be satisfied: 1) agents are versatile enough to perform more than one functional role within a system, and 2) agents are degenerate, i.e. there exists partial overlap in the functional capabilities of agents. Given these prerequisites, degenerate systems can readily produce a distributed systemic response to local perturbations. Reciprocally, excess resources related to a single function can indirectly support multiple unrelated functions within a degenerate system. In models of genome:proteome mappings for which localized decision-making and modularity of genetic functions are assumed, we verify that such distributed compensatory effects cause enhanced robustness of system traits. The conditions needed for networked buffering to occur are neither demanding nor rare, supporting the conjecture that degeneracy may fundamentally underpin distributed robustness within several biotic and abiotic systems. For instance, networked buffering offers new insights into systems engineering and planning activities that occur under high uncertainty. It may also help explain recent developments in understanding the origins of resilience within complex ecosystems.
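As a toy rendering of the two prerequisites named above, the sketch below builds versatile agents (two functional roles each) with degenerate, partially overlapping repertoires, then checks whether demand can still be met after a localized knockout; the repertoire layout, demands, and greedy re-assignment are illustrative choices for this sketch, not the paper's genome:proteome model.

```python
# Toy networked buffering: versatile agents (two roles each) with degenerate,
# overlapping repertoires absorb a local knockout by re-assignment.
F = 6                                    # number of functions in the system
# Ten agents per "type"; type f covers function f and its neighbour f+1.
agents = [{f, (f + 1) % F} for f in range(F) for _ in range(10)]
demand = {f: 8 for f in range(F)}        # units of each function required

def covers(agents, demand):
    """Greedily re-assign agents, scarcest function first; report success."""
    need = dict(demand)
    for f in sorted(need, key=lambda g: sum(g in a for a in agents)):
        pool = [a for a in agents if f in a][:need[f]]
        need[f] -= len(pool)
        used = {id(a) for a in pool}
        agents = [a for a in agents if id(a) not in used]
    return all(v == 0 for v in need.values())

print("intact system meets demand:   ", covers(agents, demand))
# Local perturbation: knock out half (10 of 20) of the function-0 providers.
providers = [a for a in agents if 0 in a]
survivors = [a for a in agents if 0 not in a] + providers[10:]
print("perturbed system meets demand:", covers(survivors, demand))
```

Both checks succeed here because excess capacity held by overlapping repertoires is redirected to the perturbed function, the distributed compensatory effect the abstract describes.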
Horizontal DNA transfer mechanisms of bacteria as weapons of intragenomic conflict
Horizontal DNA transfer (HDT) is a pervasive mechanism of diversification in many microbial species, but its primary evolutionary role remains controversial. Much recent research has emphasised the adaptive benefit of acquiring novel DNA, but here we argue instead that intragenomic conflict provides a coherent framework for understanding the evolutionary origins of HDT. To test this hypothesis, we developed a mathematical model of a clonally descended bacterial population undergoing HDT through transmission of mobile genetic elements (MGEs) and genetic transformation. Including in the model the known bias of transformation toward the acquisition of shorter alleles suggested that transformation could be an effective means of counteracting the spread of MGEs. Both constitutive and transient competence for transformation were found to provide an effective defence against parasitic MGEs; transient competence could also be effective at permitting the selective spread of MGEs conferring a benefit on their host bacterium. The coordination of transient competence with cell-cell killing, observed in multiple species, was found to result in synergistic blocking of MGE transmission by releasing genomic DNA for homologous recombination while simultaneously reducing horizontal MGE spread by lowering the local cell density. To evaluate the feasibility of the functions suggested by the modelling analysis, we analysed genomic data from longitudinal sampling of individuals carrying Streptococcus pneumoniae. This revealed the frequent within-host coexistence of clonally descended cells that differed in their MGE infection status, a necessary condition for the proposed mechanism to operate. Additionally, we found multiple examples of MGEs inhibiting transformation through integrative disruption of genes encoding the competence machinery across many species, providing evidence of an ongoing "arms race." Reduced rates of transformation have also been observed in cells infected by MGEs that reduce the concentration of extracellular DNA through secretion of DNases. Simulations predicted that either mechanism of limiting transformation would benefit individual MGEs, but also that this tactic's effectiveness was limited by competition with other MGEs coinfecting the same cell. A further observed behaviour that we hypothesised to reduce elimination by transformation was MGE activation when cells become competent. Our model predicted that this response was effective at counteracting transformation, independently of competing MGEs. Therefore, this framework is able to explain both common properties of MGEs and the seemingly paradoxical bacterial behaviours of transformation and cell-cell killing within clonally related populations as the consequences of intragenomic conflict between self-replicating chromosomes and parasitic MGEs. The antagonistic nature of the different mechanisms of HDT over short timescales means their contribution to bacterial evolution is likely to be substantially greater than previously appreciated.
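The following minimal simulation illustrates just one thread of the dynamics described above: infectious MGE spread opposed by transformation's bias toward importing the shorter, uninfected allele. The update rule and all rates are assumptions of this sketch, not the paper's model.

```python
# Toy model of MGE spread vs. transformation-mediated "curing".
import random

random.seed(42)
N = 1000
BETA = 0.3   # horizontal MGE transmission coefficient per generation (assumed)

def step(pop, tau):
    """One generation: infectious MGE spread, then transformation curing.
    tau is the per-cell probability that transformation replaces the MGE
    insertion with the shorter, uninfected allele from extracellular DNA."""
    frac = sum(pop) / len(pop)
    out = []
    for infected in pop:
        if not infected and random.random() < BETA * frac:
            infected = True           # MGE acquired horizontally
        if infected and random.random() < tau:
            infected = False          # MGE deleted by homologous recombination
        out.append(infected)
    return out

for tau in (0.0, 0.05, 0.2):
    pop = [i < 50 for i in range(N)]  # start at 5% MGE prevalence
    for _ in range(100):
        pop = step(pop, tau)
    print(f"tau={tau:.2f}: prevalence after 100 generations = {sum(pop)/N:.2f}")
```

In this caricature the equilibrium prevalence falls as the transformation rate rises (roughly 1 - tau/BETA), one way of seeing how an allele-length bias could serve as a defence against parasitic MGEs.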
Evolutionary connectionism: algorithmic principles underlying the evolution of biological organisation in evo-devo, evo-eco and evolutionary transitions
The mechanisms of variation, selection and inheritance, on which evolution by natural selection depends, are not fixed over evolutionary time. Current evolutionary biology is increasingly focussed on understanding how the evolution of developmental organisations modifies the distribution of phenotypic variation, the evolution of ecological relationships modifies the selective environment, and the evolution of reproductive relationships modifies the heritability of the evolutionary unit. The major transitions in evolution, in particular, involve radical changes in developmental, ecological and reproductive organisations that instantiate variation, selection and inheritance at a higher level of biological organisation. However, current evolutionary theory is poorly equipped to describe how these organisations change over evolutionary time, and especially how that results in adaptive complexes at successive scales of organisation (the key problem is that evolution is self-referential, i.e. the products of evolution change the parameters of the evolutionary process). Here we first reinterpret the central open questions in these domains from a perspective that emphasises the common underlying themes. We then synthesise the findings from a developing body of work that is building a new theoretical approach to these questions by converting well-understood theory and results from models of cognitive learning. Specifically, connectionist models of memory and learning demonstrate how simple incremental mechanisms, adjusting the relationships between individually-simple components, can produce organisations that exhibit complex system-level behaviours and improve the adaptive capabilities of the system. We use the term “evolutionary connectionism” to recognise that, by functionally equivalent processes, natural selection acting on the relationships within and between evolutionary entities can result in organisations that produce complex system-level behaviours in evolutionary systems and modify the adaptive capabilities of natural selection over time. We review the evidence supporting the functional equivalences between the domains of learning and of evolution, and discuss the potential for this to resolve conceptual problems in our understanding of the evolution of developmental, ecological and reproductive organisations and, in particular, the major evolutionary transitions.
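As a pointer to the kind of connectionist model invoked above, the sketch below implements a minimal Hopfield-style associative memory: a simple incremental (Hebbian) rule adjusting pairwise relationships yields a system-level behaviour (pattern recall) that no individual unit encodes. Network size, patterns, and the recall loop are illustrative choices, not drawn from the paper.

```python
# Minimal Hopfield-style associative memory: Hebbian adjustment of pairwise
# weights stores patterns as attractors, a system-level behaviour
# (content-addressable recall) that no single unit encodes.
import numpy as np

rng = np.random.default_rng(3)
patterns = rng.choice([-1, 1], size=(3, 64))   # three stored +/-1 patterns

# Hebbian rule: strengthen connections between co-active units.
W = sum(np.outer(p, p) for p in patterns).astype(float)
np.fill_diagonal(W, 0)                         # no self-connections

state = patterns[0].astype(float)
state[:8] *= -1                                # corrupt 8 of 64 units
for _ in range(10):                            # synchronous recall dynamics
    state = np.sign(W @ state)
    state[state == 0] = 1.0
print("overlap with stored pattern:", int(state @ patterns[0]), "of 64")
```

The analogy the abstract develops is that natural selection acting on relationships between evolutionary entities can play the role the Hebbian weight updates play here.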
Biodiversity Loss and the Taxonomic Bottleneck: Emerging Biodiversity Science
Human domination of the Earth has resulted in dramatic changes to global and local patterns of biodiversity. Biodiversity is critical to human sustainability because it drives the ecosystem services that provide the core of our life-support system. As we, the human species, are the primary factor leading to the decline in biodiversity, we need detailed information about the biodiversity and species composition of specific locations in order to understand how different species contribute to ecosystem services and how humans can sustainably conserve and manage biodiversity. Taxonomy and ecology, the two fundamental sciences that generate knowledge about biodiversity, are associated with a number of limitations that prevent them from providing the information needed to fully understand the relevance of biodiversity in its entirety for human sustainability: (1) biodiversity conservation strategies that tend to be overly focused on research and policy on a global scale with little impact on local biodiversity; (2) the small knowledge base of extant global biodiversity; (3) a lack of much-needed site-specific data on the species composition of communities in human-dominated landscapes, which hinders ecosystem management and biodiversity conservation; (4) biodiversity studies with a lack of taxonomic precision; (5) a lack of taxonomic expertise and trained taxonomists; (6) a taxonomic bottleneck in biodiversity inventory and assessment; and (7) neglect of taxonomic resources and a lack of taxonomic service infrastructure for biodiversity science. These limitations are directly related to contemporary trends in research, conservation strategies, environmental stewardship, environmental education, sustainable development, and local site-specific conservation. Today's biological knowledge is built on the known global biodiversity, which represents barely 20% of what is currently extant (by the commonly accepted estimate of 10 million species) on planet Earth. Much remains unexplored and unknown, particularly in hotspot regions of Africa, South Eastern Asia, and South and Central America, including many developing or underdeveloped countries, where localized biodiversity is scarcely studied or described. "Backyard biodiversity", defined as local biodiversity near human habitation, refers to the natural resources and capital for ecosystem services at the grassroots level, which urgently needs to be explored, documented, and conserved, as it is the backbone of sustainable economic development in these countries. Beginning with early identification and documentation of local flora and fauna, taxonomy has documented global biodiversity and natural history based on the collection of "backyard biodiversity" specimens worldwide. However, this branch of science suffered a continuous decline in the latter half of the twentieth century, and has now reached a point of potential demise. At present there are very few professional taxonomists and trained local parataxonomists worldwide, while the need for, and demands on, taxonomic services by conservation and resource management communities are rapidly increasing. Systematic collections, the material basis of biodiversity information, have been neglected and abandoned, particularly at institutions of higher learning.
Considering the rapid increase in the human population and urbanization, human sustainability requires new conceptual and practical approaches to refocusing and energizing the study of the biodiversity that is the core of natural resources for sustainable development and biotic capital for sustaining our life-support system. In this paper we aim to document and extrapolate the essence of biodiversity, discuss the state and nature of the taxonomic demise and the trends of recent biodiversity studies, and suggest reasonable approaches to a biodiversity science that can facilitate the expansion of global biodiversity knowledge and create useful data on backyard biodiversity worldwide towards human sustainability.
Measurement of the inclusive and dijet cross-sections of b-jets in pp collisions at sqrt(s) = 7 TeV with the ATLAS detector
The inclusive and dijet production cross-sections have been measured for jets containing b-hadrons (b-jets) in proton-proton collisions at a centre-of-mass energy of sqrt(s) = 7 TeV, using the ATLAS detector at the LHC. The measurements use data corresponding to an integrated luminosity of 34 pb^-1. The b-jets are identified using either a lifetime-based method, where secondary decay vertices of b-hadrons in jets are reconstructed using information from the tracking detectors, or a muon-based method where the presence of a muon is used to identify semileptonic decays of b-hadrons inside jets. The inclusive b-jet cross-section is measured as a function of transverse momentum in the range 20 < pT < 400 GeV and rapidity in the range |y| < 2.1. The bbbar-dijet cross-section is measured as a function of the dijet invariant mass in the range 110 < m_jj < 760 GeV, the azimuthal angle difference between the two jets, and the angular variable chi in two dijet mass regions. The results are compared with next-to-leading-order QCD predictions. Good agreement is observed between the measured cross-sections and the predictions obtained using POWHEG + Pythia. MC@NLO + Herwig shows good agreement with the measured bbbar-dijet cross-section. However, it does not reproduce the measured inclusive cross-section well, particularly for central b-jets with large transverse momenta.
Comment: 10 pages plus author list (21 pages total), 8 figures, 1 table, final version published in European Physical Journal
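For context on the angular analysis, the dijet variable chi is conventionally defined from the two jets' rapidities as below; this is the standard definition used in such measurements, not text quoted from the paper itself.

```latex
% Standard dijet angular variable (y_1, y_2 are the two jets' rapidities):
\chi \;=\; e^{\,\lvert y_1 - y_2 \rvert}
% For massless 2 -> 2 scattering, \chi = (1 + |\cos\theta^*|)/(1 - |\cos\theta^*|),
% chosen so that QCD t-channel exchange yields a distribution roughly flat in \chi.
```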