The influence of the atmospheric boundary layer on nocturnal layers of noctuids and other moths migrating over southern Britain
Insects migrating at high altitude over southern Britain have been continuously monitored by automatically operating, vertical-looking radars over a period of several years. On some occasions during the summer months, the migrants were observed to form well-defined layer concentrations, typically at heights of 200-400 m, in the stable night-time atmosphere. Under these conditions, insects are likely to have control over their vertical movements and to be selecting flight heights that are favourable for long-range migration. We therefore investigated the factors influencing the formation of these insect layers by comparing radar measurements of the vertical distribution of insect density with meteorological profiles generated by the UK Met Office’s Unified Model (UM). Radar-derived measurements of mass and displacement speed, along with data from Rothamsted Insect Survey light traps, provided information on the identity of the migrants. We present here three case studies in which noctuid and pyralid moths contributed substantially to the observed layers. The major meteorological factors influencing the layer concentrations appeared to be: (a) the altitude of the warmest air; (b) heights corresponding to temperature preferences or thresholds for sustained migration; and (c), on nights when air temperatures are relatively high, wind-speed maxima associated with the nocturnal jet. Back-trajectories indicated that layer duration may have been determined by the distance to the coast. Overall, the unique combination of meteorological data from the UM and insect data from entomological radar described here shows considerable promise for systematic studies of high-altitude insect layering.
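To make the comparison described above concrete, the short Python sketch below locates the peak of a vertical insect-density profile and compares it with the heights of the warmest air and of the wind-speed maximum in a meteorological profile. All of the profiles, heights and values are synthetic and purely illustrative; they are not radar observations or Unified Model output.

```python
import numpy as np

# Entirely synthetic vertical profiles (heights in m above ground level);
# shapes and values are invented for illustration only.
heights = np.arange(150, 1200, 45)                                    # radar range-gate centres
insect_density = np.exp(-((heights - 300.0) / 80.0) ** 2)             # mock layer near 300 m
temperature = 16.0 + 3.0 * np.exp(-((heights - 350.0) / 150.0) ** 2)  # mock nocturnal inversion
wind_speed = 6.0 + 5.0 * np.exp(-((heights - 450.0) / 120.0) ** 2)    # mock low-level jet

def peak_height(profile, heights):
    """Height at which a vertical profile reaches its maximum value."""
    return heights[np.argmax(profile)]

print(f"Insect layer peak:  {peak_height(insect_density, heights):.0f} m")
print(f"Warmest air:        {peak_height(temperature, heights):.0f} m")
print(f"Wind-speed maximum: {peak_height(wind_speed, heights):.0f} m")
```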
Pubertal development and prostate cancer risk: Mendelian randomization study in a population-based cohort.
BACKGROUND: Epidemiological studies have observed a positive association between an earlier age at sexual development and prostate cancer, but markers of sexual maturation in boys are imprecise and observational estimates are likely to suffer from a degree of uncontrolled confounding. To obtain causal estimates, we examined the role of pubertal development in prostate cancer using genetic polymorphisms associated with Tanner stage in adolescent boys in a Mendelian randomization (MR) approach. METHODS: We derived a weighted genetic risk score for pubertal development, combining 13 SNPs associated with male Tanner stage. A higher score indicated a later puberty onset. We examined the association of this score with prostate cancer risk, stage and grade in the UK-based ProtecT case-control study (n = 2,927), and used the PRACTICAL consortium (n = 43,737) as a replication sample. RESULTS: In ProtecT, the puberty genetic score was inversely associated with prostate cancer grade (odds ratio (OR) of high- vs. low-grade cancer, per tertile of the score: 0.76; 95 % CI, 0.64-0.89). In an instrumental variable estimation of the causal OR, later physical development in adolescence (equivalent to a difference of one Tanner stage between pubertal boys of the same age) was associated with a 77 % (95 % CI, 43-91 %) reduced odds of high Gleason prostate cancer. In PRACTICAL, the puberty genetic score was associated with prostate cancer stage (OR of advanced vs. localized cancer, per tertile: 0.95; 95 % CI, 0.91-1.00) and prostate cancer-specific mortality (hazard ratio amongst cases, per tertile: 0.94; 95 % CI, 0.90-0.98), but not with disease grade. CONCLUSIONS: Older age at sexual maturation is causally linked to a reduced risk of later prostate cancer, especially aggressive disease.

This work was supported by the World Cancer Research Fund (2011/419) and Cancer Research UK (C18281/A19169). The Integrative Epidemiology Unit (IEU) is supported by the MRC and the University of Bristol (G0600705, MC_UU_12013/19), and the Integrative Cancer Epidemiology Programme is supported by Cancer Research UK programme grant C18281/A19169. The National Institute for Health Research (NIHR) Bristol Nutrition Biomedical Research Unit is funded by the NIHR and is a partnership between University Hospitals Bristol NHS Foundation Trust and the University of Bristol. The ProtecT study is supported by the UK NIHR Health Technology Assessment (HTA) Programme (HTA 96/20/99; ISRCTN20141297). Funding for PRACTICAL and the iCOGS infrastructure came from: the European Community’s Seventh Framework Programme under grant agreement n° 223175 (HEALTH-F2-2009-223175) (COGS), Cancer Research UK (C1287/A10118, C1287/A10710, C12292/A11174, C1281/A12014, C5047/A8384, C5047/A15007, C5047/A10692, C8197/A16565), the National Institutes of Health (CA128978), the Post-Cancer GWAS initiative (1U19 CA148537, 1U19 CA148065 and 1U19 CA148112 – the GAME-ON initiative), the Department of Defence (W81XWH-10-1-0341), the Canadian Institutes of Health Research (CIHR) for the CIHR Team in Familial Risks of Breast Cancer, Komen Foundation for the Cure, the Breast Cancer Research Foundation, and the Ovarian Cancer Research Fund. We acknowledge support from the NIHR to the Biomedical Research Centre at The Institute of Cancer Research and The Royal Marsden NHS Foundation Trust.

This is the final version of the article. It first appeared from BioMed Central via http://dx.doi.org/10.1186/s12916-016-0602-
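As an illustration of the genetic-risk-score approach described in the abstract above, the Python sketch below combines simulated SNP dosages into a weighted score and forms a simple Wald-ratio instrumental-variable estimate from the gene-exposure and gene-outcome regressions. All genotypes, effect sizes and outcomes are simulated; nothing here reproduces the ProtecT or PRACTICAL data or the paper's actual estimation procedure.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Hypothetical data: dosages for 13 SNPs (0/1/2 copies of the puberty-delaying
# allele) and invented per-allele effects on Tanner stage.
n, n_snps = 5000, 13
genotypes = rng.binomial(2, 0.3, size=(n, n_snps)).astype(float)
snp_betas = rng.normal(0.05, 0.02, size=n_snps)

# Weighted genetic risk score: higher score = genetically later puberty.
grs = genotypes @ snp_betas

# Simulated exposure (Tanner stage) and binary outcome (high-grade cancer),
# purely for illustration.
tanner_stage = 3.0 + grs + rng.normal(0.0, 1.0, size=n)
p_high_grade = 1.0 / (1.0 + np.exp(2.0 + 0.5 * tanner_stage))
high_grade = rng.binomial(1, p_high_grade)

# Wald-ratio IV estimate: (gene-outcome log-odds) / (gene-exposure effect).
gene_exposure = sm.OLS(tanner_stage, sm.add_constant(grs)).fit().params[1]
gene_outcome = sm.Logit(high_grade, sm.add_constant(grs)).fit(disp=0).params[1]
print(f"Illustrative causal OR per Tanner stage: {np.exp(gene_outcome / gene_exposure):.2f}")
```

The Wald ratio is shown only because it is the simplest single-instrument estimator; a multi-SNP MR analysis would typically use two-stage or summary-statistic methods instead.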
Influenza A H5N1 clade 2.3.4 virus with a different antiviral susceptibility profile replaced clade 1 virus in humans in northern Vietnam.
BACKGROUND: Prior to 2007, highly pathogenic avian influenza (HPAI) H5N1 viruses isolated from poultry and humans in Vietnam were consistently reported to be clade 1 viruses, susceptible to oseltamivir but resistant to amantadine. Here we describe the re-emergence of human HPAI H5N1 virus infections in Vietnam in 2007 and the characteristics of the isolated viruses. METHODS AND FINDINGS: Respiratory specimens from patients suspected of being infected with avian influenza in 2007 were screened by influenza- and H5 subtype-specific polymerase chain reaction. Isolated H5N1 strains were further characterized by genome sequencing and drug susceptibility testing. Eleven poultry outbreak isolates from 2007 were included in the sequence analysis. Eight patients, all of them from northern Vietnam, were diagnosed with H5N1 in 2007, and five of them died. Phylogenetic analysis of H5N1 viruses isolated from humans and poultry in 2007 showed that clade 2.3.4 H5N1 viruses replaced clade 1 viruses in northern Vietnam. Four human H5N1 strains had eight-fold reduced in-vitro susceptibility to oseltamivir compared with clade 1 viruses. In two poultry isolates, the I117V mutation, which is associated with reduced susceptibility to oseltamivir, was found in the neuraminidase gene. No mutations conferring amantadine resistance were found in the M2 gene. CONCLUSION: In 2007, H5N1 clade 2.3.4 viruses replaced clade 1 viruses in northern Vietnam; they were susceptible to amantadine but showed reduced susceptibility to oseltamivir. Combination antiviral therapy with oseltamivir and amantadine is recommended for human cases in Vietnam.
Inhibition of Stearoyl-CoA Desaturase-1 Inactivates Acetyl-CoA Carboxylase and Impairs Proliferation in Cancer Cells: Role of AMPK
Cancer cells activate the biosynthesis of saturated fatty acids (SFA) and monounsaturated fatty acids (MUFA) in order to sustain an increasing demand for phospholipids with appropriate acyl composition during cell replication. We have previously shown that a stable knockdown of stearoyl-CoA desaturase 1 (SCD1), the main Δ9-desaturase that converts SFA into MUFA, in cancer cells decreases the rate of lipogenesis, reduces proliferation and in vitro invasiveness, and dramatically impairs tumor formation and growth. Here we report that pharmacological inhibition of SCD1 with a novel small molecule in cancer cells promoted the activation of AMP-activated protein kinase (AMPK) and a subsequent reduction of acetyl-CoA carboxylase activity, with a concomitant inhibition of glucose-mediated lipogenesis. Pharmacological inhibition of AMPK further decreased proliferation of SCD1-depleted cells, whereas AMPK activation restored proliferation to control levels. Addition of supraphysiological concentrations of glucose, or of pyruvate, the end product of glycolysis, did not reverse the low proliferation rate of SCD1-ablated cancer cells. Our data suggest that cancer cells require active SCD1 to control the rate of glucose-mediated lipogenesis, and that when SCD1 activity is impaired, cells downregulate SFA synthesis via AMPK-mediated inactivation of acetyl-CoA carboxylase, thus preventing the harmful effects of SFA accumulation.
Biomarkers of angiogenesis and their role in the development of VEGF inhibitors
Vascular endothelial growth factor (VEGF) has been confirmed as an important therapeutic target in randomised clinical trials in multiple disease settings. However, the extent to which individual patients benefit from VEGF inhibitors is unclear. If we are to optimise the use of these drugs, or develop combination regimens that build on their efficacy, it is critical to identify those patients who are likely to benefit, particularly as these agents can be toxic and are expensive. To this end, biomarkers have been evaluated in tissue, in the circulation and by imaging. Consistent drug-induced increases in plasma VEGF-A and blood pressure, as well as reductions in soluble VEGF-R2 and in dynamic contrast-enhanced MRI parameters, have been reported. In some clinical trials, biomarker changes were statistically significant and associated with clinical end points, but there is considerable heterogeneity between studies that is to some extent attributable to methodological issues. On the basis of observations with these biomarkers, it is now appropriate to conduct detailed prospective studies to define a suite of predictive, pharmacodynamic and surrogate response biomarkers that identify the patients most likely to benefit from this novel class of drugs and monitor their response to it.
Pan-cancer analysis of whole genomes
Cancer is driven by genetic change, and the advent of massively parallel sequencing has enabled systematic documentation of this variation at the whole-genome scale(1-3). Here we report the integrative analysis of 2,658 whole-cancer genomes and their matching normal tissues across 38 tumour types from the Pan-Cancer Analysis of Whole Genomes (PCAWG) Consortium of the International Cancer Genome Consortium (ICGC) and The Cancer Genome Atlas (TCGA). We describe the generation of the PCAWG resource, facilitated by international data sharing using compute clouds. On average, cancer genomes contained 4-5 driver mutations when combining coding and non-coding genomic elements; however, in around 5% of cases no drivers were identified, suggesting that cancer driver discovery is not yet complete. Chromothripsis, in which many clustered structural variants arise in a single catastrophic event, is frequently an early event in tumour evolution; in acral melanoma, for example, these events precede most somatic point mutations and affect several cancer-associated genes simultaneously. Cancers with abnormal telomere maintenance often originate from tissues with low replicative activity and show several mechanisms of preventing telomere attrition to critical levels. Common and rare germline variants affect patterns of somatic mutation, including point mutations, structural variants and somatic retrotransposition. A collection of papers from the PCAWG Consortium describes non-coding mutations that drive cancer beyond those in the TERT promoter(4); identifies new signatures of mutational processes that cause base substitutions, small insertions and deletions and structural variation(5,6); analyses timings and patterns of tumour evolution(7); describes the diverse transcriptional consequences of somatic mutation on splicing, expression levels, fusion genes and promoter activity(8,9); and evaluates a range of more-specialized features of cancer genomes(8,10-18).
Mapping geographical inequalities in access to drinking water and sanitation facilities in low-income and middle-income countries, 2000-17
Background: Universal access to safe drinking water and sanitation facilities is an essential human right, recognised in the Sustainable Development Goals as crucial for preventing disease and improving human wellbeing. Comprehensive, high-resolution estimates are important to inform progress towards achieving this goal. We aimed to produce high-resolution geospatial estimates of access to drinking water and sanitation facilities. Methods: We used a Bayesian geostatistical model and data from 600 sources across more than 88 low-income and middle-income countries (LMICs) to estimate access to drinking water and sanitation facilities on continuous continent-wide surfaces from 2000 to 2017, and aggregated results to policy-relevant administrative units. We estimated mutually exclusive and collectively exhaustive subcategories of facilities for drinking water (piped water on or off premises, other improved facilities, unimproved, and surface water) and sanitation (septic or sewer sanitation, other improved, unimproved, and open defecation) using ordinal regression. We also estimated the number of diarrhoeal deaths in children younger than 5 years attributed to unsafe facilities and the number of deaths averted by increased access to safe facilities in 2017, and analysed geographical inequality in access within LMICs. Findings: Across LMICs, access to both piped water and improved water overall increased between 2000 and 2017, with progress varying spatially. For piped water, the safest water facility type, access increased from 40.0% (95% uncertainty interval [UI] 39.4-40.7) to 50.3% (50.0-50.5), but was lowest in sub-Saharan Africa, where access to piped water was mostly concentrated in urban centres. Access to both sewer or septic sanitation and improved sanitation overall also increased across all LMICs during the study period. For sewer or septic sanitation, access was 46.3% (95% UI 46.1-46.5) in 2017, compared with 28.7% (28.5-29.0) in 2000. Although some units improved access to the safest drinking water or sanitation facilities since 2000, a large absolute number of people continued to lack access in several units with high (>80%) access to such facilities in 2017. More than 253 000 people did not have access to sewer or septic sanitation facilities in the city of Harare, Zimbabwe, despite 88.6% (95% UI 87.2-89.7) access overall. Many units were able to transition from the least safe facilities in 2000 to safe facilities by 2017; for units in which populations primarily practised open defecation in 2000, 686 (95% UI 664-711) of the 1830 (1797-1863) units transitioned to the use of improved sanitation. Geographical disparities in access to improved water across units decreased in 76.1% (95% UI 71.6-80.7) of countries from 2000 to 2017, and in 53.9% (50.6-59.6) of countries for access to improved sanitation, but remained evident subnationally in most countries in 2017. Interpretation: Our estimates, combined with geospatial trends in diarrhoeal burden, identify where efforts to increase access to safe drinking water and sanitation facilities are most needed. By highlighting areas with successful approaches or in need of targeted interventions, our estimates can enable precision public health to effectively progress towards universal access to safe water and sanitation. Copyright (C) 2020 The Author(s). Published by Elsevier Ltd.
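The ordinal-regression step described in the Methods above can be sketched as a proportional-odds model over the ordered facility categories, fitted here with statsmodels' OrderedModel on simulated cluster-level data. The covariates, data and single-level specification are invented for illustration and do not reproduce the study's Bayesian geostatistical model.

```python
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(1)

# Hypothetical cluster-level data: ordered sanitation categories
# (0 = open defecation, 1 = unimproved, 2 = other improved, 3 = sewer/septic)
# and two invented covariates; this is not the study's data or model.
n = 2000
year = rng.integers(0, 18, size=n)            # years since 2000
urban = rng.binomial(1, 0.4, size=n)
latent = 0.08 * year + 1.2 * urban + rng.logistic(size=n)
category = np.digitize(latent, bins=[0.5, 1.5, 2.5])

df = pd.DataFrame({
    "category": pd.Categorical(category, categories=[0, 1, 2, 3], ordered=True),
    "year": year,
    "urban": urban,
})

# Proportional-odds (ordinal logistic) regression over the ordered categories.
model = OrderedModel(df["category"], df[["year", "urban"]], distr="logit")
result = model.fit(method="bfgs", disp=False)
print(result.summary())

# Predicted probability of each category for an urban cluster in 2017.
print(result.model.predict(result.params, exog=np.array([[17.0, 1.0]])))
```

The ordinal model enforces mutually exclusive, collectively exhaustive categories by construction, which is why category probabilities always sum to one for any covariate values.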
