Improving Interpretation of Cardiac Phenotypes and Enhancing Discovery With Expanded Knowledge in the Gene Ontology
BACKGROUND: A systems biology approach to cardiac physiology requires a comprehensive representation of how coordinated processes operate in the heart, as well as the ability to interpret relevant transcriptomic and proteomic experiments. The Gene Ontology (GO) Consortium provides structured, controlled vocabularies of biological terms that can be used to summarize and analyze functional knowledge for gene products. METHODS AND RESULTS: In this study, we created a computational resource to facilitate genetic studies of cardiac physiology by combining literature curation with an improved and expanded ontological representation of heart processes in the Gene Ontology. As a result, the Gene Ontology now contains terms that comprehensively describe the roles of proteins in cardiac muscle cell action potential, electrical coupling, and the transmission of the electrical impulse from the sinoatrial node to the ventricles. An evaluation of this approach for informing data analysis demonstrated that Gene Ontology annotations, analyzed within an expanded ontological context of heart processes, can help to identify candidate genes associated with arrhythmic disease risk loci. CONCLUSIONS: We determined that a combination of curation and ontology development for heart-specific genes and processes supports the identification and downstream analysis of genes responsible for the spread of the cardiac action potential through the heart. Annotating these genes and processes in a structured format facilitates data analysis and supports effective retrieval of gene-centric information about cardiac defects.
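The abstract does not give implementation details, but the kind of annotation-driven candidate-gene analysis it describes is often approximated with a term over-representation test. A minimal sketch, assuming hypothetical gene counts (none of these numbers come from the study) and using a standard hypergeometric test rather than the authors' actual pipeline:

```python
# Hypothetical sketch of a GO-term over-representation test for candidate genes
# at a disease locus. All counts below are illustrative placeholders, not data
# from the study.
from scipy.stats import hypergeom

background_genes = 20000   # annotated genes in the background (assumption)
term_genes = 150           # genes annotated to a cardiac conduction GO term (assumption)
locus_genes = 40           # genes under an arrhythmia risk locus (assumption)
overlap = 6                # locus genes carrying the GO annotation (assumption)

# P(X >= overlap) when drawing locus_genes genes without replacement
p_value = hypergeom.sf(overlap - 1, background_genes, term_genes, locus_genes)
print(f"Over-representation p-value: {p_value:.3g}")
```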
Recommendations for a core outcome set for measuring standing balance in adult populations: a consensus-based approach
Standing balance is imperative for mobility and avoiding falls. Use of an excessive number of standing balance measures has limited the synthesis of balance intervention data and hampered consistent clinical practice. The objective was to develop recommendations for a core outcome set (COS) of standing balance measures for research and practice among adults. The approach combined scoping reviews, literature appraisal, anonymous voting and face-to-face meetings with fourteen invited experts from a range of disciplines with international recognition in balance measurement and falls prevention. Consensus was sought over three rounds using pre-established criteria. The scoping review identified 56 existing standing balance measures validated in adult populations with evidence of use in the past five years, and these were considered for inclusion in the COS. Fifteen measures were excluded after the first round of scoring and a further 36 after round two. Five measures were considered in round three. Two measures reached consensus for recommendation, and the expert panel recommended that, at a minimum, either the Berg Balance Scale or the Mini Balance Evaluation Systems Test be used when measuring standing balance in adult populations. Inclusion of two measures in the COS may increase the feasibility of uptake but poses challenges for data synthesis. Adoption of the standing balance COS does not constitute a comprehensive balance assessment for any population, and users should include additional validated measures as appropriate. The absence of a gold standard for measuring standing balance has contributed to the proliferation of outcome measures. These recommendations represent an important first step towards greater standardization in the assessment and measurement of this critical skill and will inform clinical research and practice internationally.
Systematic review and meta-analysis of the diagnostic accuracy of ultrasonography for deep vein thrombosis
Background
Ultrasound (US) has largely replaced contrast venography as the definitive diagnostic test for deep vein thrombosis (DVT). We aimed to derive a definitive estimate of the diagnostic accuracy of US for clinically suspected DVT and identify study-level factors that might predict accuracy.
Methods
We undertook a systematic review, meta-analysis and meta-regression of diagnostic cohort studies that compared US to contrast venography in patients with suspected DVT. We searched Medline, EMBASE, CINAHL, Web of Science, Cochrane Database of Systematic Reviews, Cochrane Controlled Trials Register, Database of Reviews of Effectiveness, the ACP Journal Club, and citation lists (1966 to April 2004). Random effects meta-analysis was used to derive pooled estimates of sensitivity and specificity. Random effects meta-regression was used to identify study-level covariates that predicted diagnostic performance.
Results
We identified 100 cohorts comparing US to venography in patients with suspected DVT. Overall sensitivity for proximal DVT (95% confidence interval) was 94.2% (93.2 to 95.0), for distal DVT was 63.5% (59.8 to 67.0), and specificity was 93.8% (93.1 to 94.4). Duplex US had pooled sensitivity of 96.5% (95.1 to 97.6) for proximal DVT, 71.2% (64.6 to 77.2) for distal DVT and specificity of 94.0% (92.8 to 95.1). Triplex US had pooled sensitivity of 96.4% (94.4 to 97.1) for proximal DVT, 75.2% (67.7 to 81.6) for distal DVT and specificity of 94.3% (92.5 to 95.8). Compression US alone had pooled sensitivity of 93.8% (92.0 to 95.3) for proximal DVT, 56.8% (49.0 to 66.4) for distal DVT and specificity of 97.8% (97.0 to 98.4). Sensitivity was higher in more recently published studies and in cohorts with a higher prevalence of DVT and more proximal DVT, and was lower in cohorts that reported interpretation by a radiologist. Specificity was higher in cohorts that excluded patients with previous DVT. No studies were identified that compared repeat US to venography in all patients. Repeat US appears to have a positive yield of 1.3%, with 89% of these positives confirmed by venography.
Conclusion
Combined colour-Doppler US techniques have optimal sensitivity, while compression US has optimal specificity for DVT. However, all estimates are subject to substantial unexplained heterogeneity. The role of repeat scanning is very uncertain and based upon limited data.
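The review describes its pooling only as random effects meta-analysis of sensitivity and specificity. A minimal sketch of one common variant of that approach, DerSimonian-Laird pooling of logit-transformed sensitivities, is shown below; the (true positive, false negative) counts are invented for illustration and are not the cohorts analysed in the review:

```python
# Illustrative DerSimonian-Laird random-effects pooling of logit sensitivities.
# Study counts are hypothetical, not taken from the review.
import numpy as np

studies = [(95, 5), (180, 12), (60, 6), (240, 10)]  # hypothetical (TP, FN) per cohort

logits, variances = [], []
for tp, fn in studies:
    tp_c, fn_c = tp + 0.5, fn + 0.5            # continuity correction
    logits.append(np.log(tp_c / fn_c))          # logit of sensitivity
    variances.append(1 / tp_c + 1 / fn_c)       # approximate variance of the logit
logits, variances = np.array(logits), np.array(variances)

w = 1 / variances                               # fixed-effect weights
fixed = np.sum(w * logits) / np.sum(w)
q = np.sum(w * (logits - fixed) ** 2)           # Cochran's Q
tau2 = max(0.0, (q - (len(studies) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1 / (variances + tau2)                   # random-effects weights
pooled_logit = np.sum(w_re * logits) / np.sum(w_re)
se = np.sqrt(1 / np.sum(w_re))
lo, hi = pooled_logit - 1.96 * se, pooled_logit + 1.96 * se
expit = lambda x: 1 / (1 + np.exp(-x))
print(f"Pooled sensitivity {expit(pooled_logit):.3f} "
      f"(95% CI {expit(lo):.3f} to {expit(hi):.3f})")
```

Pooled specificity would be handled the same way with (true negative, false positive) counts; the meta-regression step adds study-level covariates on top of this model.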
Early, Goal-Directed Therapy for Septic Shock - A Patient-Level Meta-Analysis
BACKGROUND:
After a single-center trial and observational studies suggesting that early, goal-directed therapy (EGDT) reduced mortality from septic shock, three multicenter trials (ProCESS, ARISE, and ProMISe) showed no benefit. This meta-analysis of individual patient data from the three recent trials was designed prospectively to improve statistical power and explore heterogeneity of treatment effect of EGDT.
METHODS:
We harmonized entry criteria, intervention protocols, outcomes, resource-use measures, and data collection across the trials and specified all analyses before unblinding. After completion of the trials, we pooled data, excluding the protocol-based standard-therapy group from the ProCESS trial, and resolved residual differences. The primary outcome was 90-day mortality. Secondary outcomes included 1-year survival, organ support, and hospitalization costs. We tested for treatment-by-subgroup interactions for 16 patient characteristics and 6 care-delivery characteristics.
RESULTS:
We studied 3723 patients at 138 hospitals in seven countries. Mortality at 90 days was similar for EGDT (462 of 1852 patients [24.9%]) and usual care (475 of 1871 patients [25.4%]); the adjusted odds ratio was 0.97 (95% confidence interval, 0.82 to 1.14; P=0.68). EGDT was associated with greater mean (±SD) use of intensive care (5.3±7.1 vs. 4.9±7.0 days, P=0.04) and cardiovascular support (1.9±3.7 vs. 1.6±2.9 days, P=0.01) than was usual care; other outcomes did not differ significantly, although average costs were higher with EGDT. Subgroup analyses showed no benefit from EGDT for patients with worse shock (higher serum lactate level, combined hypotension and hyperlactatemia, or higher predicted risk of death) or for hospitals with a lower propensity to use vasopressors or fluids during usual resuscitation.
CONCLUSIONS:
In this meta-analysis of individual patient data, EGDT did not result in better outcomes than usual care and was associated with higher hospitalization costs across a broad range of patient and hospital characteristics. (Funded by the National Institute of General Medical Sciences and others; PRISM ClinicalTrials.gov number, NCT02030158.)
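The headline mortality comparison can be checked directly from the counts reported in the abstract (462/1852 deaths with EGDT vs 475/1871 with usual care). The reported 0.97 is an adjusted odds ratio, so the crude figure computed here will differ slightly; this is a sketch, not the trial's analysis:

```python
# Crude (unadjusted) odds ratio for 90-day mortality from the abstract's counts.
import math

egdt_dead, egdt_total = 462, 1852
usual_dead, usual_total = 475, 1871

a, b = egdt_dead, egdt_total - egdt_dead        # EGDT: deaths, survivors
c, d = usual_dead, usual_total - usual_dead     # usual care: deaths, survivors

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)    # Woolf standard error on the log scale
ci = [math.exp(math.log(odds_ratio) + z * se_log_or) for z in (-1.96, 1.96)]
print(f"Crude OR = {odds_ratio:.2f} (95% CI {ci[0]:.2f} to {ci[1]:.2f})")
```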
High genetic diversity at the extreme range edge: nucleotide variation at nuclear loci in Scots pine (Pinus sylvestris L.) in Scotland
Nucleotide polymorphism at 12 nuclear loci was studied in Scots pine populations across an environmental gradient in Scotland, to evaluate the impacts of demographic history and selection on genetic diversity. At eight loci, diversity patterns were compared between Scottish and continental European populations. At these loci, a similar level of diversity (θ_sil ≈ 0.01) was found in Scottish and mainland European populations, contrary to expectations for recent colonization; however, linkage disequilibrium decayed less rapidly in the Scottish populations (ρ = 0.0086 ± 0.0009 vs 0.0245 ± 0.0022 in mainland populations). Scottish populations also showed a deficit of rare nucleotide variants (multi-locus Tajima's D = 0.316 vs D = −0.379) and differed significantly from mainland populations in allelic frequency and/or haplotype structure at several loci. Within Scotland, western populations showed slightly reduced nucleotide diversity (π_tot = 0.0068) compared with those from the south and east (0.0079 and 0.0083, respectively) and an approximately three times higher recombination-to-diversity ratio (ρ/θ = 0.71 vs 0.15 and 0.18, respectively). By comparison with results from coalescent simulations, the observed allelic frequency spectrum in the western populations was compatible with a relatively recent bottleneck (0.00175 × 4Ne generations) that reduced the population to about 2% of its present size. However, heterogeneity in the allelic frequency distribution among geographical regions in Scotland suggests that subsequent admixture of populations with different demographic histories may also have played a role.
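For readers unfamiliar with the summary statistics quoted above, the sketch below computes nucleotide diversity (π) and Watterson's θ from a toy alignment; the sequences are invented and are not the study's loci, and the full Tajima's D normalisation and coalescent simulations are omitted:

```python
# Minimal sketch of per-site nucleotide diversity and Watterson's theta.
# The alignment is a hypothetical example, not data from the paper.
from itertools import combinations

alignment = [            # four aligned haplotypes at one toy locus
    "ATGCATGCAT",
    "ATGCATGGAT",
    "ATGAATGCAT",
    "ATGCATGCAT",
]
n = len(alignment)
length = len(alignment[0])

# Nucleotide diversity pi: average pairwise differences per site
pairwise_diffs = [sum(x != y for x, y in zip(s1, s2))
                  for s1, s2 in combinations(alignment, 2)]
pi = sum(pairwise_diffs) / len(pairwise_diffs) / length

# Watterson's theta: segregating sites scaled by the harmonic number a1
segregating = sum(len(set(col)) > 1 for col in zip(*alignment))
a1 = sum(1 / i for i in range(1, n))
theta_w = segregating / a1 / length

print(f"pi = {pi:.4f}, Watterson's theta = {theta_w:.4f}")
# Tajima's D is, up to a variance normalisation, the difference pi - theta_w;
# a positive multi-locus D (as reported for the Scottish populations) reflects
# a deficit of rare variants.
```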
Can an Integrated Approach Reduce Child Vulnerability to Anaemia? Evidence from Three African Countries.
Addressing the complex, multi-factorial causes of childhood anaemia is best done through integrated packages of interventions. We hypothesized that due to reduced child vulnerability, a "buffering" of risk associated with known causes of anaemia would be observed among children living in areas benefiting from a community-based health and nutrition program intervention. Cross-sectional data on the nutrition and health status of children 24-59 mo (N = 2405) were obtained in 2000 and 2004 from program evaluation surveys in Ghana, Malawi and Tanzania. Linear regression models estimated the association between haemoglobin and immediate, underlying and basic causes of child anaemia and variation in this association between years. Lower haemoglobin levels were observed in children assessed in 2000 compared to 2004 (difference -3.30 g/L), children from Tanzania (-9.15 g/L) and Malawi (-2.96 g/L) compared to Ghana, and the youngest (24-35 mo) compared to oldest age group (48-59 mo; -5.43 g/L). Children who were stunted, malaria positive and recently ill also had lower haemoglobin, independent of age, sex and other underlying and basic causes of anaemia. Despite ongoing morbidity, the risk of lower haemoglobin associated with malaria and recent illness decreased between survey years, suggesting decreased vulnerability to their anaemia-producing effects. Stunting remained an independent and unbuffered risk factor. Reducing chronic undernutrition is required to further reduce child vulnerability and ensure maximum impact of anaemia control programs. Buffering the impact of child morbidity, including malaria, on haemoglobin levels may be achieved in certain settings.
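The "variation in this association between years" is an interaction term in the regression. A minimal sketch of that kind of model is shown below; the variable names and data are simulated placeholders, not the programme evaluation surveys, and the real analysis included additional underlying and basic causes:

```python
# Sketch of a year-by-malaria interaction model for haemoglobin (g/L).
# Data are simulated; the covariate set is a simplified stand-in.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "year": rng.choice([2000, 2004], n),
    "malaria": rng.integers(0, 2, n),
    "stunted": rng.integers(0, 2, n),
    "age_mo": rng.integers(24, 60, n),
})
# Simulated haemoglobin with a smaller malaria penalty in 2004 ("buffering")
df["hb"] = (110 + 0.1 * df["age_mo"] - 4 * df["stunted"]
            - np.where(df["year"] == 2004, 3, 8) * df["malaria"]
            + rng.normal(0, 8, n))

model = smf.ols("hb ~ C(year) * malaria + stunted + age_mo", data=df).fit()
# The C(year):malaria interaction coefficient captures the buffering effect
print(model.params.filter(like="malaria"))
```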
Do Changes in the Pace of Events Affect One-Off Judgments of Duration?
Five experiments examined whether changes in the pace of external events influence people’s judgments of duration. In Experiments 1a–1c, participants heard pieces of music whose tempo accelerated, decelerated, or remained constant. In Experiment 2, participants completed a visuo-motor task in which the rate of stimulus presentation accelerated, decelerated, or remained constant. In Experiment 3, participants completed a reading task in which facts appeared on-screen at accelerating, decelerating, or constant rates. In all experiments, the physical duration of the to-be-judged interval was the same across conditions. We found no significant effects of temporal structure on duration judgments in any of the experiments, either when participants knew that a time estimate would be required (prospective judgments) or when they did not (retrospective judgments). These results provide a starting point for the investigation of how temporal structure affects one-off judgments of duration like those typically made in natural settings.
From food to pest: Conversion factors determine switches between ecosystem services and disservices
Ecosystem research focuses on goods and services, thereby ascribing beneficial values to ecosystems. Depending on the context, however, outputs from ecosystems can be both positive and negative. We examined how provisioning services of wild animals and plants can switch between being services and disservices. We studied agricultural communities in Laos to illustrate when and why these switches take place. Government restrictions on land use combined with economic and cultural changes have created perceptions of rodents and plants as problem species in some communities. In other communities that are maintaining shifting cultivation practices, the very same taxa were perceived as beneficial. We propose conversion factors that, in a given context, can determine where an individual taxon is located along a spectrum from ecosystem service to disservice, when, and for whom. We argue that the omission of disservices in ecosystem service accounts may lead governments to direct investments at inappropriate targets.
Early invaders - Farmers, the granary weevil and other uninvited guests in the Neolithic
The Neolithic and the spread of agriculture saw several introductions of insect species associated with the environments and activities of the first farmers. Fossil insect research from the Neolithic lake settlement of Dispilio in Macedonia, northern Greece, provides evidence for the early European introduction of a flightless weevil, the granary weevil, Sitophilus granarius, which has since become cosmopolitan and one of the most important pests of stored cereals. The records of the granary weevil from the Middle Neolithic in northern Greece illuminate the significance of surplus storage for the spread of agriculture. The granary weevil and the house fly, Musca domestica, were also introduced in the Neolithic of central Europe, with the expansion of Linearbandkeramik (LBK) culture groups. This paper reviews Neolithic insect introductions in Europe, including storage pests, discusses their distribution during different periods and the reasons behind the trends observed. Storage farming may be differentiated from pastoral farming on the basis of insect introductions arriving with incoming agricultural groups.
