Has the 'Fast-Track' referral system affected the route of presentation and/or clinical outcomes in patients with colorectal cancer?
Background: The aim of this study is to determine whether the 'Fast-Track' referral system has changed the route by which patients present with colorectal cancer (CRC) and whether the route of presentation has any effect on clinical outcome. Methods: A retrospective cohort study of patients diagnosed with CRC under the care of two consultant colorectal surgeons between April 2006 and December 2012. The route by which patients presented was categorised as Fast-Track (FT), non-Fast-Track (non-FT) or acute. Outcome variables were operative intent, disease stage and 2- and 5-year survival. Results: A total of 558 patients were identified. One hundred ninety-seven patients (35.3%) were referred as FT, 108 (19.4%) presented acutely and 253 patients (45.3%) presented via other routes (non-FT). Over the study period, the route of presentation did not change significantly (P=0.135). There was no significant difference between FT and non-FT groups in terms of the proportion of patients undergoing potentially curative surgery (70.6 vs 74.3%, P=0.092) or with node-negative disease (48.2 vs 52.2%, P=0.796) nor was there any difference in 2-year or 5-year survival (74.1 vs 73.9%, P=0.837 and 52.3 vs 53.8%, P=0.889, respectively). Patients who presented acutely were less likely to undergo curative resection, had more advanced disease and had worse 2- and 5-year survival. Conclusions: The Fast-Track referral system has not affected the route by which patients present with CRC nor has it had any effect on clinical outcomes. Alternative strategies are required if the desired improvement in outcomes is to be achieved
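As a hedged illustration of the kind of comparison behind the survival figures above (not the authors' actual analysis), the sketch below runs a chi-squared test on 2-year survival in the FT and non-FT groups, with counts reconstructed approximately from the reported percentages; those reconstructed counts are assumptions.

```python
# Illustrative only: 2-year survival counts are back-calculated from the
# percentages quoted in the abstract (197 FT and 253 non-FT patients),
# so they are approximations, not the study's raw data.
from scipy.stats import chi2_contingency

ft_total, non_ft_total = 197, 253
ft_alive = round(0.741 * ft_total)          # ~74.1% 2-year survival, FT
non_ft_alive = round(0.739 * non_ft_total)  # ~73.9% 2-year survival, non-FT

table = [
    [ft_alive, ft_total - ft_alive],              # FT: alive, dead at 2 years
    [non_ft_alive, non_ft_total - non_ft_alive],  # non-FT: alive, dead at 2 years
]
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.3f}, p = {p:.3f}")  # a large p indicates no detectable difference
```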
Timescales of transformational climate change adaptation in sub-Saharan African agriculture
Climate change is projected to constitute a significant threat to food security if no adaptation actions are taken. Transformation of agricultural systems, for example switching crop types or moving out of agriculture, is projected to be necessary in some cases. However, little attention has been paid to the timing of these transformations. Here, we develop a temporal uncertainty framework using the CMIP5 ensemble to assess when and where cultivation of key crops in sub-Saharan Africa becomes unviable. We report potential transformational changes for all major crops during the twenty-first century, as climates shift and areas become unsuitable. For most crops, however, transformation is limited to small pockets (<15% of area), and only for beans, maize and banana is transformation more widespread (≈30% of area for maize and banana, 60% for beans). We envisage three overlapping adaptation phases to enable projected transformational changes: an incremental adaptation phase focused on improvements to crops and management, a preparatory phase that establishes appropriate policies and enabling environments, and a transformational adaptation phase in which farmers substitute crops, explore alternative livelihood strategies, or relocate. To best align policies with production triggers for no-regret actions, monitoring capacities to track farming systems as well as climate are needed.
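A minimal sketch of the idea behind such a temporal uncertainty framework, using entirely synthetic data: for each ensemble member, find the first year a crop suitability index drops below a viability threshold, then summarise the spread of those crossing years across the ensemble. The threshold, trend and variable names are illustrative assumptions, not values from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
years = np.arange(2006, 2100)
n_members = 30          # an ensemble of roughly CMIP5 size (assumption)
threshold = 0.5         # illustrative viability threshold

# Synthetic suitability index per member: a declining trend plus noise.
trend = np.linspace(0.9, 0.3, years.size)
suitability = trend + rng.normal(0, 0.05, size=(n_members, years.size))

def first_crossing(series, years, threshold):
    """First year the series falls below the threshold, or None if it never does."""
    below = np.where(series < threshold)[0]
    return years[below[0]] if below.size else None

crossings = np.array([c for c in (first_crossing(s, years, threshold)
                                  for s in suitability) if c is not None])

print("median transformation year:", int(np.median(crossings)))
print("10th-90th percentile range:", np.percentile(crossings, [10, 90]).astype(int))
```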
Harnessing learning biases is essential for applying social learning in conservation
Social learning can influence how animals respond to anthropogenic changes in the environment, determining whether animals survive novel threats and exploit novel resources or produce maladaptive behaviour and contribute to human-wildlife conflict. Predicting where social learning will occur and manipulating its use are, therefore, important in conservation, but doing so is not straightforward. Learning is an inherently biased process that has been shaped by natural selection to prioritize important information and facilitate its efficient uptake. In this regard, social learning is no different from other learning processes because it too is shaped by perceptual filters, attentional biases and learning constraints that can differ between habitats, species, individuals and contexts. The biases that constrain social learning are not understood well enough to accurately predict whether or not social learning will occur in many situations, which limits the effective use of social learning in conservation practice. Nevertheless, we argue that by tapping into the biases that guide the social transmission of information, the conservation applications of social learning could be improved. We explore the conservation areas where social learning is highly relevant and link them to biases in the cues and contexts that shape social information use. The resulting synthesis highlights many promising areas for collaboration between the fields and stresses the importance of systematic reviews of the evidence surrounding social learning practices. Funding: BBSRC David Phillips Fellowship (BB/H021817/1).
Both Positive and Negative Selection Pressures Contribute to the Polymorphism Pattern of the Duplicated Human CYP21A2 Gene.
The human steroid 21-hydroxylase gene (CYP21A2) participates in cortisol and aldosterone biosynthesis, and resides together with its paralogous (duplicated) pseudogene in a multiallelic copy number variation (CNV), called RCCX CNV. Concerted evolution caused by non-allelic gene conversion has been described in great ape CYP21 genes, and the same conversion activity is responsible for a serious genetic disorder of CYP21A2, congenital adrenal hyperplasia (CAH). In the current study, 33 CYP21A2 haplotype variants encoding 6 protein variants were determined from a European population. CYP21A2 was shown to be one of the most diverse human genes (HHe=0.949), but the diversity of intron 2 was greater still. Contrary to previous findings, the evolution of intron 2 did not follow concerted evolution, although the remaining part of the gene did. Fixed sites (different fixed alleles of sites in human CYP21 paralogues) significantly accumulated in intron 2, indicating that the excess of fixed sites was connected to the lack of effective non-allelic conversion and concerted evolution. Furthermore, positive selection was presumably focused on intron 2, and possibly associated with the previous genetic features. However, the positive selection detected by several neutrality tests was discerned along the whole gene. In addition, the clear signature of negative selection was observed in the coding sequence. The maintenance of the CYP21 enzyme function is critical, and could lead to negative selection, whereas the presumed gene regulation altering steroid hormone levels via intron 2 might help fast adaptation, which broadly characterizes the genes of human CNVs responding to the environment
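For readers unfamiliar with the diversity statistic quoted above, the short sketch below computes Nei's unbiased haplotype diversity, Hd = n/(n-1) * (1 - sum(p_i^2)), where p_i are sample haplotype frequencies; the haplotype labels and counts are invented for illustration and are not the study's data.

```python
from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's unbiased haplotype diversity for a list of haplotype labels."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p_sq = sum((c / n) ** 2 for c in counts.values())
    return n / (n - 1) * (1 - sum_p_sq)

# Fabricated example: six haplotypes observed in 15 chromosomes.
sample = ["H1"] * 5 + ["H2"] * 4 + ["H3"] * 3 + ["H4", "H5", "H6"]
print(round(haplotype_diversity(sample), 3))
```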
Attribution: how is it relevant for loss and damage policy and practice?
Attribution has become a recurring issue in discussions about Loss and Damage (L&D). In this highly politicised context, attribution is often associated with responsibility and blame, and linked to debates about liability and compensation. The aim of attribution science, however, is not to establish responsibility, but to further scientific understanding of causal links between elements of the Earth System and society. This research into causality could inform the management of climate-related risks through improved understanding of drivers of relevant hazards, or, more widely, vulnerability and exposure, with potential benefits regardless of political positions on L&D. Experience shows that it is nevertheless difficult to have open discussions about the science in the policy sphere. This is not only a missed opportunity, but also problematic in that it could inhibit understanding of scientific results and uncertainties, potentially leading to policy planning which does not have sufficient scientific evidence to support it. In this chapter, we first explore this dilemma for science-policy dialogue, summarising several years of research into stakeholder perspectives on attribution in the context of L&D. We then aim to provide clarity about the scientific research available, through an overview of research which might contribute evidence about the causal connections between anthropogenic climate change and losses and damages, including climate science, but also other fields which examine other drivers of hazard, exposure, and vulnerability. Finally, we explore potential applications of attribution research, suggesting that an integrated and nuanced approach has potential to inform planning to avert, minimise and address losses and damages. The key messages are:
In the political context of climate negotiations, questions about whether losses and damages can be attributed to anthropogenic climate change are often linked to issues of responsibility, blame, and liability.
Attribution science does not aim to establish responsibility or blame, but rather to investigate drivers of change.
Attribution science is advancing rapidly, and has potential to increase understanding of how climate variability and change is influencing slow onset and extreme weather events, and how this interacts with other drivers of risk, including socio-economic drivers, to influence losses and damages.
Over time, some uncertainties in the science will be reduced, as the anthropogenic climate change signal becomes stronger, and understanding of climate variability and change develops.
However, some uncertainties will not be eliminated. Uncertainty is common in science, and does not prevent useful applications in policy, but might determine which applications are appropriate. It is important to highlight that in attribution studies, the strength of evidence varies substantially between different kinds of slow onset and extreme weather events, and between regions. Policy-makers should not expect the later emergence of conclusive evidence about the influence of climate variability and change on specific incidences of losses and damages; and, in particular, should not expect the strength of evidence to be equal between events, and between countries.
Rather than waiting for further confidence in attribution studies, there is potential to start working now to integrate science into policy and practice, to help understand and tackle drivers of losses and damages, informing prevention, recovery, rehabilitation, and transformation
Low-level regulatory T-cell activity is essential for functional type-2 effector immunity to expel gastrointestinal helminths
Helminth infection is frequently associated with the expansion of regulatory T cells (Tregs) and suppression of immune responses to bystander antigens. We show that infection of mice with the chronic gastrointestinal helminth Heligmosomoides polygyrus drives rapid polyclonal expansion of Foxp3(+)Helios(+)CD4(+) thymic (t)Tregs in the lamina propria and mesenteric lymph nodes while Foxp3(+)Helios(-)CD4(+) peripheral (p)Treg expand more slowly. Notably, in partially resistant BALB/c mice parasite survival positively correlates with Foxp3(+)Helios(+)CD4(+) tTreg numbers. Boosting of Foxp3(+)Helios(+)CD4(+) tTreg populations by administration of recombinant interleukin-2 (rIL-2):anti-IL-2 (IL-2C) complex increased worm persistence by diminishing type-2 responsiveness in vivo, including suppression of alternatively activated macrophage and granulomatous responses at the sites of infection. IL-2C also increased innate lymphoid cell (ILC) numbers, indicating that Treg functions dominate over ILC effects in this setting. Surprisingly, complete removal of Tregs in transgenic Foxp3-DTR mice also resulted in increased worm burdens, with "immunological chaos" evident in high levels of the pro-inflammatory cytokines IL-6 and interferon-γ. In contrast, worm clearance could be induced by anti-CD25 antibody-mediated partial depletion of early Treg, alongside increased T helper type 2 responses and without incurring pathology. These findings highlight the overarching importance of the early Treg response to infection and the non-linear association between inflammation and the prevailing Treg frequency
An increased abundance of tumor-infiltrating regulatory T cells is correlated with the progression and prognosis of pancreatic ductal adenocarcinoma
CD4+CD25+Foxp3+ regulatory T cells (Tregs) can inhibit cytotoxic responses. Though several studies have analyzed Treg frequency in the peripheral blood mononuclear cells (PBMCs) of pancreatic ductal adenocarcinoma (PDA) patients using flow cytometry (FCM), few studies have examined how intratumoral Tregs might contribute to immunosuppression in the tumor microenvironment. Thus, the potential role of intratumoral Tregs in PDA patients remains to be elucidated. In this study, we found that the percentages of Tregs, CD4+ T cells and CD8+ T cells were all increased significantly in tumor tissue compared to control pancreatic tissue, as assessed via FCM, whereas the percentages of these cell types in PBMCs did not differ between PDA patients and healthy volunteers. The percentages of CD8+ T cells in tumors were significantly lower than in PDA patient PBMCs. In addition, the relative numbers of CD4+CD25+Foxp3+ Tregs and CD8+ T cells were negatively correlated in the tissue of PDA patients, and the abundance of Tregs was significantly correlated with tumor differentiation. Additionally, Foxp3+ T cells were observed more frequently in juxtatumoral stroma (immediately adjacent to the tumor epithelial cells). Patients showing an increased prevalence of Foxp3+ T cells had a poorer prognosis, and Foxp3+ T-cell prevalence was an independent prognostic factor for patient survival. These results suggest that Tregs may promote PDA progression by inhibiting the antitumor immunity of CD8+ T cells at local intratumoral sites. Moreover, a high proportion of Tregs in tumor tissues may reflect suppressed antitumor immunity.
Design and descriptive epidemiology of the Infectious Diseases of East African Livestock (IDEAL) project, a longitudinal calf cohort study in western Kenya
BACKGROUND: There is a widely recognised lack of baseline epidemiological data on the dynamics and impacts of infectious cattle diseases in east Africa. The Infectious Diseases of East African Livestock (IDEAL) project is an epidemiological study of cattle health in western Kenya with the aim of providing baseline epidemiological data, investigating the impact of different infections on key responses such as growth, mortality and morbidity, the additive and/or multiplicative effects of co-infections, and the influence of management and genetic factors. A longitudinal cohort study of newborn calves was conducted in western Kenya between 2007 and 2009. Calves were randomly selected from all those reported, using a two-stage clustered sampling strategy, and were recruited at between 3 and 7 days old. A team of veterinarians and animal health assistants carried out routine 5-weekly visits as well as clinical and postmortem visits. Blood and tissue samples were collected at all visits and screened, using a range of laboratory-based diagnostic methods, for over 100 different pathogens or infectious exposures. RESULTS: The study followed 548 calves over the first 51 weeks of life or until death, with additional visits whenever they were reported clinically ill. The cohort experienced a high all-cause mortality rate of 16%, with at least 13% of these due to infectious diseases. Only 307 (6%) of routine visits were classified as clinical episodes, with a further 216 reported by farmers; 54% of calves reached one year without a reported clinical episode. Mortality was mainly due to East Coast fever, haemonchosis and heartwater. Over 50 pathogens were detected in this population, with exposure to a further 6 viruses and bacteria. CONCLUSION: The IDEAL study has demonstrated that it is possible to mount population-based longitudinal animal studies. The results quantify, for the first time in an animal population, the high diversity of pathogens a population may have to deal with and the levels of co-infection with key pathogens such as Theileria parva. This study highlights the need to develop new systems-based approaches to studying pathogens in their natural settings, to understand the impacts of co-infections on clinical outcomes and to develop new evidence-based interventions that are relevant
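As a back-of-the-envelope illustration (not a result from the IDEAL dataset itself), the snippet below attaches a Wilson 95% confidence interval to the headline cumulative mortality, assuming roughly 16% of the 548 calves died; the implied death count is an assumption derived from the rounded percentage.

```python
from math import sqrt

def wilson_ci(events, n, z=1.96):
    """Wilson score 95% confidence interval for a binomial proportion."""
    p = events / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

n_calves = 548
deaths = round(0.16 * n_calves)   # ~88 deaths implied by the reported 16%
lo, hi = wilson_ci(deaths, n_calves)
print(f"cumulative mortality {deaths / n_calves:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```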
Environmental quality determines finder-joiner dynamics in socially foraging three-spined sticklebacks (Gasterosteus aculeatus)
Speculation on the origin of sub-baseline excursions of CH4 at Cape Grim
The Advanced Global Atmospheric Gases Experiment (AGAGE) program has historically measured in situ methane (CH4) at Cape Grim via gas chromatography with flame ionization detection (GC-FID) in 40-minutely grab samples. By adding continuous, high-precision in situ measurements of CH4 (Picarro cavity ring-down spectroscopy [CRDS]) at both Cape Grim, Tasmania, and Casey, Antarctica, a new feature has become apparent in the Cape Grim CH4 record. During the austral summer (December to February), the Cape Grim CH4 record periodically drops below baseline. For example, in Figure 1, a number of sustained episodes of depressed CH4 concentration can be seen below the baseline-selected data shown in red. Notably, these episodes are also seen in the GC-FID record. In this presentation, we examine these sub-baseline excursions of CH4. In conjunction with meteorology and a variety of other chemical species measured at Cape Grim, including radon, ozone, hydrogen and ethane, we speculate on a number of possible mechanisms that might be responsible for these dips in CH4 mixing ratio.
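One hedged sketch of how such sub-baseline excursions could be flagged automatically: estimate a smooth reference level as a rolling median of an hourly CH4 series and mark sustained runs that sit more than a chosen margin below it. The synthetic data, window, margin and run-length threshold are all assumptions; this is not the AGAGE baseline-selection procedure.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
idx = pd.date_range("2015-12-01", periods=24 * 90, freq="h")   # one austral summer, hourly
ch4 = pd.Series(1775 + rng.normal(0, 2, idx.size), index=idx)  # ppb, synthetic
ch4.iloc[500:560] -= 8                                          # an injected "dip" to detect

baseline = ch4.rolling("10D").median()   # smooth reference level (assumed window)
margin = 5.0                             # ppb below baseline to count as an excursion
below = ch4 < (baseline - margin)

# Keep only runs of at least 12 consecutive hours below the baseline.
group = (~below).cumsum()                          # identifier for each consecutive run
run_len = below.groupby(group).transform("sum")    # length of the below-baseline run
sustained = below & (run_len >= 12)
print("hours flagged as sub-baseline:", int(sustained.sum()))
```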
