56 research outputs found

    Co-activation of NF-κB and MYC renders cancer cells addicted to IL6 for survival and phenotypic stability

    NF-κB and MYC are found co-deregulated in human B-cell and plasma-cell cancers. In normal physiology, NF-κB is necessary for terminal B-to-plasma-cell differentiation, whereas MYC repression is required. It is thus unclear whether NF-κB/MYC co-deregulation is developmentally compatible in carcinogenesis and/or impacts cancer cell differentiation state, possibly uncovering unique sensitivities. Using a mouse system to trace cell lineage and oncogene activation, we found that NF-κB/MYC co-deregulation gave rise to cancers with a plasmablast-like phenotype, resembling human plasmablastic lymphoma and linked to t(8;14)[MYC-IGH] multiple myeloma. Notably, in contrast to NF-κB or MYC activation alone, co-deregulation rendered cells addicted to IL6 for survival and phenotypic stability. We propose that conflicting oncogene-driven differentiation pressures can be accommodated, at a cost, in poorly differentiated cancers.

    Regions of High Out-Of-Hospital Cardiac Arrest Incidence and Low Bystander CPR Rates in Victoria, Australia

    BACKGROUND: Out-of-hospital cardiac arrest (OHCA) remains a major public health issue, and research has shown that large regional variation in outcomes exists. Of the interventions associated with survival, the provision of bystander CPR is one of the most important modifiable factors. The aim of this study is to identify census areas with a high incidence of OHCA and low rates of bystander CPR in Victoria, Australia. METHODS: We conducted an observational study using prospectively collected population-based OHCA data from the state of Victoria in Australia. Using ArcGIS (ArcMap 10.0), we linked the location of the arrest using the dispatch coordinates (longitude and latitude) to Victorian Local Government Areas (LGAs). We used Bayesian hierarchical models with random effects on each LGA to provide shrunken estimates of the rates of bystander CPR and the incidence rates. RESULTS: Over the study period, 31,019 adult OHCAs were attended, of which 21,436 (69.1%) cases were of presumed cardiac etiology. Significant variation in the incidence of OHCA among LGAs was observed. There was a threefold difference in the incidence rate between the lowest and highest LGAs, ranging from 38.5 to 115.1 cases per 100,000 person-years. The overall rate of bystander CPR for bystander-witnessed OHCAs was 62.4%, with the rate increasing from 56.4% in 2008-2010 to 68.6% in 2010-2013. There was a 25.1% absolute difference in bystander CPR rates between the highest and lowest LGAs. CONCLUSION: Significant regional variation in OHCA incidence and bystander CPR rates exists throughout Victoria. Regions with high incidence and low bystander CPR participation can be identified and would make suitable targets for interventions to improve CPR participation rates.
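
    A minimal sketch of the partial-pooling approach the methods describe, assuming PyMC as the modelling library; the LGA count reflects Victoria's 79 LGAs, but the counts, priors, and variable names are illustrative assumptions, not the study's data or code.

```python
# Sketch: Bayesian hierarchical (partial-pooling) estimates of per-LGA
# bystander CPR rates. Synthetic data; priors and names are assumptions.
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_lga = 79                                   # Victoria has 79 LGAs
witnessed = rng.integers(20, 400, n_lga)     # witnessed OHCAs per LGA (synthetic)
cpr_given = rng.binomial(witnessed, 0.62)    # bystander CPR cases (synthetic)

with pm.Model():
    mu = pm.Normal("mu", 0.0, 1.5)             # state-wide mean, logit scale
    sigma = pm.HalfNormal("sigma", 1.0)        # between-LGA spread
    z = pm.Normal("z", 0.0, 1.0, shape=n_lga)  # non-centred LGA effects
    rate = pm.Deterministic("rate", pm.math.invlogit(mu + sigma * z))
    pm.Binomial("obs", n=witnessed, p=rate, observed=cpr_given)
    idata = pm.sample(1000, tune=1000, chains=2, random_seed=1)

# Posterior means are the "shrunken" per-LGA rates: estimates from LGAs
# with few arrests are pulled toward the state-wide mean.
print(idata.posterior["rate"].mean(dim=("chain", "draw")).values)
```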

    Why Is the Correlation between Gene Importance and Gene Evolutionary Rate So Weak?

    One of the few commonly believed principles of molecular evolution is that functionally more important genes (or DNA sequences) evolve more slowly than less important ones. This principle is widely used by molecular biologists in daily practice. However, recent genomic analysis of a diverse array of organisms found only weak negative correlations between the evolutionary rate of a gene and its functional importance, typically measured under a single benign lab condition. A frequently suggested cause of this finding is that gene importance determined in the lab differs from that in an organism's natural environment. Here, we test this hypothesis in yeast using gene importance values experimentally determined in 418 lab conditions or computationally predicted for 10,000 nutritional conditions. In no single condition or combination of conditions did we find a much stronger negative correlation, which is explainable by our subsequent finding that always-essential (enzyme) genes do not evolve significantly more slowly than sometimes-essential or always-nonessential ones. Furthermore, we verified that functional density, approximated by the fraction of amino acid sites within protein domains, is uncorrelated with gene importance. Thus, neither the lab-nature mismatch nor a potentially biased among-gene distribution of functional density explains the observed weakness of the correlation between gene importance and evolutionary rate. We conclude that the weakness is factual rather than artifactual. In addition to being weakened by population genetic factors, the correlation is likely to have been further weakened by the presence of multiple nontrivial rate determinants that are independent of gene importance. These findings notwithstanding, we show that the principle of slower evolution of more important genes does have some predictive power when genes with vastly different evolutionary rates are compared, explaining why the principle can be practically useful despite the weakness of the correlation.
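
    A self-contained sketch of the kind of correlation analysis involved, using synthetic values rather than the paper's yeast data: a weak rank correlation between importance and rate that is nonetheless statistically detectable, and that still separates genes at the rate extremes. All numbers below are invented for illustration.

```python
# Illustration: weak importance-rate correlation, measured with Spearman's
# rank correlation. Synthetic data; effect sizes are assumptions.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n_genes = 5000
importance = rng.random(n_genes)        # 0 = dispensable, 1 = essential
noise = rng.normal(0, 1, n_genes)       # rate determinants independent of importance
rate = -0.4 * importance + noise        # weak negative dependence

rho, p = spearmanr(importance, rate)
print(f"rho = {rho:.3f}, p = {p:.2e}")  # rho around -0.11: weak but significant

# Comparing rate extremes echoes the paper's point that the principle
# retains some predictive power for genes with very different rates:
slow = importance[rate < np.quantile(rate, 0.05)].mean()
fast = importance[rate > np.quantile(rate, 0.95)].mean()
print(f"mean importance: slowest 5% = {slow:.2f}, fastest 5% = {fast:.2f}")
```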

    Earthquake nucleation in the lower crust by local stress amplification

    Deep intracontinental earthquakes are poorly understood, despite their potential to cause significant destruction. Although lower crustal strength is currently a topic of debate, dry lower continental crust may be strong under high-grade conditions. Such strength could enable earthquake slip at high differential stress within a predominantly viscous regime, but requires further documentation in nature. Here, we analyse geological observations of seismic structures in exhumed lower crustal rocks. A granulite facies shear zone network dissects an anorthosite intrusion in Lofoten, northern Norway, and separates relatively undeformed, microcracked blocks of anorthosite. In these blocks, pristine pseudotachylytes decorate fault sets that link adjacent or intersecting shear zones. These fossil seismogenic faults are rarely >15 m in length, yet record single-event displacements of tens of centimetres, a slip/length ratio that implies >1 GPa stress drops. These pseudotachylytes represent direct identification of earthquake nucleation as a transient consequence of ongoing, localised aseismic creep.
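
    The >1 GPa figure follows from standard crack mechanics; a back-of-envelope check, assuming the circular-crack stress-drop relation and a typical crustal shear modulus (values chosen to match the ranges quoted above, not taken from the paper):

```python
# Stress-drop check via the standard circular-crack (Eshelby) relation:
#   dsigma = (7 * pi / 16) * mu * slip / a
# Shear modulus and exact slip/length values are assumed, chosen to match
# the "tens of centimetres" over "rarely >15 m" quoted in the abstract.
import math

mu = 30e9        # shear modulus of crustal rock, Pa (assumed typical value)
slip = 0.3       # single-event displacement, m
length = 15.0    # fault length, m
a = length / 2   # crack radius

stress_drop = (7 * math.pi / 16) * mu * slip / a
print(f"stress drop ≈ {stress_drop / 1e9:.1f} GPa")  # ≈ 1.6 GPa, i.e. >1 GPa
```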

    Gender-Associated Genes in Filarial Nematodes Are Important for Reproduction and Potential Intervention Targets

    Lymphatic filariasis is a neglected tropical disease that is caused by thread-like parasitic worms that live and reproduce in the lymphatic vessels of the human host. There are no vaccines to prevent filariasis, and available drugs are not effective against all stages of the parasite. In addition, recent reports suggest that filarial nematodes may be developing resistance to key medications. Therefore, there is an urgent need to identify new drug targets in filarial worms. The purpose of this study was to perform a genome-wide analysis of gender-associated gene transcription to improve understanding of key reproductive processes in filarial nematodes. Our results indicate that thousands of genes are differentially expressed in male and female adult worms. Many of those genes are involved in specific reproductive processes such as embryogenesis and spermatogenesis. In addition, expression of some of those genes is suppressed by tetracycline, a drug that leads to sterilization of adult female worms in many filarial species. Thus, gender-associated genes represent priority targets for the design of vaccines and drugs that interfere with reproduction of filarial nematodes. Additional work with this type of integrated systems biology approach should lead to important new tools for controlling filarial diseases.
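
    As a toy illustration of the underlying analysis (not the study's actual pipeline, which would use a dedicated transcriptomics framework), calling gender-associated genes can be sketched as per-gene tests between sexes with false-discovery-rate control; all counts and effect sizes below are invented:

```python
# Sketch: per-gene Welch t-tests (male vs female expression) with
# Benjamini-Hochberg correction. Synthetic log2 expression values.
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(7)
n_genes, n_rep = 12000, 4
male = rng.normal(8, 1, (n_genes, n_rep))    # log2 expression, male worms
female = rng.normal(8, 1, (n_genes, n_rep))  # log2 expression, female worms
female[:2000] += 1.5                         # 2000 genes truly female-biased

t, p = ttest_ind(male, female, axis=1, equal_var=False)
rejected, q, _, _ = multipletests(p, alpha=0.05, method="fdr_bh")
print(f"{rejected.sum()} genes called differentially expressed at FDR 5%")
```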

    Ecology and biogeography of megafauna and macrofauna at the first known deep-sea hydrothermal vents on the ultraslow-spreading Southwest Indian Ridge


    Man and the Last Great Wilderness: Human Impact on the Deep Sea

    The deep sea, the largest ecosystem on Earth and one of the least studied, harbours high biodiversity and provides a wealth of resources. Although humans have used the oceans for millennia, technological developments now allow exploitation of fisheries resources, hydrocarbons and minerals below 2000 m depth. The remoteness of the deep seafloor has promoted the disposal of residues and litter. Ocean acidification and climate change now bring a new dimension of global effects. Thus the challenges facing the deep sea are large and accelerating, providing a new imperative for the science community, industry and national and international organizations to work together to develop successful management of exploitation and conservation of the deep-sea ecosystem. This paper provides scientific expert judgement and a semi-quantitative analysis of past, present and future impacts of human-related activities on global deep-sea habitats within three categories: disposal, exploitation and climate change. The analysis is the result of a Census of Marine Life – SYNDEEP workshop (September 2008). A detailed review of known impacts and their effects is provided. The analysis shows how, in recent decades, the most significant anthropogenic activities that affect the deep sea have evolved from mainly disposal (past) to exploitation (present). We predict that, from now into the future, increases in atmospheric CO2 and the facets and consequences of climate change will have the greatest impact on deep-sea habitats and their fauna. Synergies between different anthropogenic pressures and associated effects are discussed, indicating that most synergies are related to increased atmospheric CO2 and climate change effects. We identify deep-sea ecosystems we believe are at higher risk from human impacts in the near future: benthic communities on sedimentary upper slopes, cold-water corals, canyon benthic communities and seamount pelagic and benthic communities. We conclude this review with a short discussion of protection and management methods.

    The Arid Zone Monitoring Project: combining Indigenous ecological expertise with scientific data analysis to assess the potential of using sign-based surveys to monitor vertebrates in the Australian deserts

    Deserts cover large areas and support substantial biodiversity; however, like other biomes, they are experiencing biodiversity loss. Monitoring biodiversity trends in deserts is rare, partly because of the logistical challenges of working in remote areas. This is true also in Australia, which has one of the largest and least populated desert areas worldwide, has suffered marked biodiversity loss since European colonisation, and has minimal large-scale biodiversity monitoring. However, Indigenous people of many Traditional Owner groups continue to live in, and care for, these deserts. Over the past two decades, Indigenous ranger groups have been collecting species records by using sign-based surveys, adding to work begun in the 1980s by researchers and government scientists. In sign-based surveys, the presence (or absence) of species is recorded by searching on sandy substrates for tracks, scats, burrows and diggings in a fixed area, or for a fixed time. Such surveys combine the tracking skills of Indigenous people with robust analytical methods. Here, we describe a desert-wide project that collated and analysed existing sign-based data to explore its potential for local-, regional- and national-scale biodiversity monitoring. The Arid Zone Monitoring Project also provided guidance about future monitoring designs and data-collection methods for varying survey objectives. The project collated data from 44 groups and individuals, comprising almost 15,000 surveys from over 5300 unique sites, with almost 49,000 detections of 65 native and 11 introduced species, including threatened and culturally significant species. Despite heterogeneity in survey objectives and data-collection methods, we were able to use the collated data to describe species distributions, understand correlates of suitable habitat, investigate temporal trends, and simulate the monitoring effort required to detect trends in over 25 vertebrate species at regional and national scales. Most importantly, we built a large collaboration, and produced informative maps and analyses, while respecting the intellectual property and diverse aspirations of the project partners. With this foundation in place, a national sign-based monitoring program for medium-large desert vertebrates seems achievable, if accompanied by overarching coordination and survey support, training, standardised data collection, improved sampling design, centralised data curation and storage, and regular communication.
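
    Simulating "the monitoring effort required to detect trends" is essentially a statistical power analysis; a minimal sketch under assumed detection rates and decline sizes (none of these numbers come from the project):

```python
# Sketch: power to detect a decline in a species' per-survey detection
# rate as a function of surveys per year. All rates are assumptions.
import numpy as np
import statsmodels.api as sm

def power(n_sites, years=10, p0=0.3, annual_decline=0.05, n_sim=200, seed=0):
    rng = np.random.default_rng(seed)
    t = np.repeat(np.arange(years), n_sites)         # survey year index
    hits = 0
    for _ in range(n_sim):
        p = p0 * (1 - annual_decline) ** t           # declining detection rate
        y = rng.binomial(1, p)                       # sign detected / not
        X = sm.add_constant(t.astype(float))
        fit = sm.Logit(y, X).fit(disp=0)             # logistic trend model
        if fit.pvalues[1] < 0.05 and fit.params[1] < 0:
            hits += 1                                # decline detected
    return hits / n_sim

for n in (10, 25, 50, 100):
    print(f"{n:>4} surveys/yr: power = {power(n):.2f}")
```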

    Real world hospital costs following stress echocardiography in the UK: a costing study from the EVAREST/BSE-NSTEP multi-centre study

    Background: Stress echocardiography is widely used to detect coronary artery disease, but little evidence on downstream hospital costs in real-world practice is available. We examined how stress echocardiography accuracy and downstream hospital costs vary across NHS hospitals and identified key factors that affect costs, to help inform future clinical planning and guidelines. Methods: Data on 7636 patients recruited from 31 NHS hospitals within the UK between 2014 and 2020 as part of the EVAREST/BSE-NSTEP clinical study were used. Data included all diagnostic tests, procedures, and hospital admissions for 12 months after a stress echocardiogram and were costed using NHS national unit costs. A decision tree was built to illustrate the clinical pathway and estimate average downstream hospital costs. Multi-level regression analysis was performed to identify variation in accuracy and costs at the patient, procedural, and hospital levels. Linear regression and extrapolation were used to estimate annual hospital cost savings associated with increasing predictive accuracy at hospital and national level. Results: Stress echocardiography accuracy varied with patient, hospital and operator characteristics. Hypertension, presence of wall motion abnormalities and a higher annual number of hospital cardiology outpatient attendances reduced accuracy, with adjusted odds ratios of 0.78 (95% CI 0.65 to 0.93), 0.27 (95% CI 0.15 to 0.48) and 0.99 (95% CI 0.98 to 0.99) respectively, whereas a prior myocardial infarction, angiotensin receptor blocker medication, and greater operator experience increased accuracy, with adjusted odds ratios of 1.77 (95% CI 1.34 to 2.33), 1.64 (95% CI 1.22 to 2.22), and 1.06 (95% CI 1.02 to 1.09) respectively. Average downstream costs were £646 per patient (SD £1796), with significant variation across hospitals: average downstream costs across the 31 hospitals ranged from £384 to £1730 per patient. False positive and false negative tests were associated with average downstream costs of £1446 (SD £601) and £4192 (SD £3332) respectively, driven by increased non-elective hospital admissions, with adjusted odds ratios of 2.48 (95% CI 1.08 to 5.66) and 21.06 (95% CI 10.41 to 42.59) respectively. We estimated that an increase in accuracy of 1 percentage point could save the UK NHS £3.2 million annually. Conclusion: This study provides real-world evidence of the downstream costs associated with stress echocardiography practice in the UK and estimates how improvements in accuracy could impact healthcare expenditure in the NHS. A real-world downstream costing approach could be adopted more widely in the evaluation of imaging tests and interventions to reflect actual value for money and support realistic planning.
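
    The headline figures can be related through simple decision-tree arithmetic. In the sketch below, the false-positive and false-negative costs are the abstract's reported averages, while the branch probabilities, true-positive/true-negative costs, and annual test volume are invented for illustration, so the result will not reproduce the study's £3.2 million estimate:

```python
# Sketch: expected downstream cost per patient as a weighted sum over
# the decision-tree branches, and the saving from a 1-percentage-point
# accuracy gain. TP/TN costs, branch mix, and volume are assumptions.
cost = {"TP": 2500.0, "TN": 250.0,   # assumed downstream costs, GBP
        "FP": 1446.0, "FN": 4192.0}  # abstract's reported averages, GBP

def expected_cost(p):
    """Average downstream cost per patient for branch probabilities p."""
    return sum(p[k] * cost[k] for k in cost)

baseline = {"TP": 0.150, "TN": 0.750, "FP": 0.060, "FN": 0.040}  # assumed mix
improved = {"TP": 0.155, "TN": 0.755, "FP": 0.055, "FN": 0.035}  # +1pp accuracy

saving_per_patient = expected_cost(baseline) - expected_cost(improved)
n_tests_per_year = 50_000            # assumed annual NHS test volume
print(f"saving ≈ £{saving_per_patient:.0f}/patient, "
      f"£{saving_per_patient * n_tests_per_year / 1e6:.1f}M/year")
```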