
    Effects of break crops, and of wheat volunteers growing in break crops or in set-aside or conservation covers, all following crops of winter wheat, on the development of take-all (Gaeumannomyces graminis var. tritici) in succeeding crops of winter wheat

    Experiments on the Rothamsted and Woburn Experimental Farms studied the effects on take-all of different break crops and of set-aside/conservation covers that interrupted sequences of winter wheat. There was no evidence for different effects on take-all of the break crops per se, but the presence of volunteers in crops of oilseed rape increased the amount of take-all in the following wheat. Severity of take-all was closely related to the numbers of volunteers in the preceding break crops and covers, and was affected by the date of their destruction. Early destruction of set-aside/conservation covers was usually effective in preventing damaging take-all in the following wheat except, sometimes, when populations of volunteers were very large. The experiments were not designed to test the effects of sowing dates, but different amounts of take-all in the first wheats after breaks or covers apparently affected the severity of take-all in the following (second) wheats only where the latter were relatively late sown. In earlier-sown second wheats, take-all was consistently severe and unrelated to the severity of the disease in the preceding (first) wheats. Results from two very simple experiments suggested that substituting set-aside/conservation covers for winter wheat, for 1 year only, did not seriously interfere with the development of take-all disease or with the development or maintenance of take-all decline (TAD). With further research, it might be possible for growers wishing to exploit TAD to incorporate set-aside/conservation covers into their cropping strategies, and especially to avoid the worst effects of the disease on grain yield during the early stages of epidemics.

    Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.

    Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors for inclusion in the model. Internal model validation was carried out by bootstrap validation. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
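    The workflow this abstract describes (a multivariable risk model whose discrimination is internally validated by bootstrapping and summarised by a c-statistic) can be sketched briefly. The sketch below is not the study's code: the data are simulated, the logistic-regression form is an assumption, and the optimism-correction recipe is one standard way of carrying out bootstrap validation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 4544  # cohort size reported in the abstract

# Hypothetical stand-in predictors mirroring the six selected variables:
# age, sex, ASA grade, preoperative eGFR, planned open surgery, ACEi/ARB use.
X = np.column_stack([
    rng.normal(65, 12, n),     # age (years)
    rng.integers(0, 2, n),     # sex (1 = male)
    rng.integers(1, 5, n),     # ASA grade (I-IV)
    rng.normal(75, 20, n),     # preoperative eGFR (ml/min/1.73 m2)
    rng.integers(0, 2, n),     # planned open surgery
    rng.integers(0, 2, n),     # preoperative ACE inhibitor / ARB use
])

# Simulated outcome with made-up coefficients, tuned so the event rate
# is roughly the 14 per cent AKI rate reported.
lin = -2.5 + 0.03 * X[:, 0] - 0.02 * X[:, 3] + 0.5 * X[:, 4]
y = rng.binomial(1, 1 / (1 + np.exp(-lin)))

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent_c = roc_auc_score(y, model.predict_proba(X)[:, 1])

# Bootstrap internal validation (optimism correction): refit on each
# resample, compare its AUC on the resample with its AUC on the original
# data, and subtract the mean optimism from the apparent c-statistic.
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    m = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
    boot_c = roc_auc_score(y[idx], m.predict_proba(X[idx])[:, 1])
    test_c = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(boot_c - test_c)

print(f"apparent c-statistic:           {apparent_c:.3f}")
print(f"optimism-corrected c-statistic: {apparent_c - np.mean(optimism):.3f}")
```

    The optimism-corrected figure is the analogue of the internally validated c-statistic of 0·65 the study reports; with real data the correction guards against the model looking better on the data it was fitted to than it would on new patients.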

    The prevalence of anxiety and depression in people with age-related macular degeneration: a systematic review of observational study data

    Background: Comorbid mental health problems have been shown to have an adverse effect on the quality of life of people with common eye disorders. This study aims to assess whether symptoms of anxiety and/or depression are more prevalent in people with age-related macular degeneration (AMD) than in people without this condition. Methods: A systematic search of electronic databases (Medline, CINAHL, EMBASE, PsycINFO) from inception to February 2012 was conducted to identify studies of AMD populations which measured symptoms of anxiety and/or depression. Reference checking of relevant articles was also performed. Data on the study setting, prevalence and how anxiety and depression were measured were extracted from the papers. Critical appraisal was performed using the Critical Appraisal Skills Programme (CASP) tools. Results: A total of 16 papers were included in the review, from an original search result of 597. The prevalence estimates, taken from nine cross-sectional and cohort studies, ranged from 15.7% to 44% for depressive symptoms and from 9.6% to 30.1% for anxiety symptoms in people with AMD. The seven case–control studies found that people with AMD were more likely to experience symptoms of depression than those without AMD, but not more likely to experience symptoms of anxiety. Conclusions: Overall, the evidence suggests that symptoms of depression are more prevalent amongst AMD populations than anxiety symptoms. The heterogeneity of the studies included in this review makes it difficult to draw strong conclusions about the true prevalence of depression and anxiety symptoms in AMD populations, and prevented formal meta-analysis. Further research that specifies clinical anxiety and gives clear definitions of the type of AMD being investigated is required.

    Differential cellular and humoral immune responses in immunocompromised individuals following multiple SARS-CoV-2 vaccinations

    Introduction: The heterogeneity of the immunocompromised population means some individuals may exhibit variable, weak or reduced vaccine-induced immune responses, leaving them poorly protected from COVID-19 disease despite receiving multiple SARS-CoV-2 vaccinations. There are conflicting data on the immunogenicity elicited by multiple vaccinations in immunocompromised groups. The aim of this study was to measure both humoral and cellular vaccine-induced immunity in several immunocompromised cohorts and to compare them with immunocompetent controls. Methods: Cytokine release in peptide-stimulated whole blood, and neutralising antibody and baseline SARS-CoV-2 spike-specific IgG levels in plasma, were measured in rheumatology patients (n=29), renal transplant recipients (n=46), people living with HIV (PLWH) (n=27) and immunocompetent participants (n=64) after the third or fourth vaccination, all from a single blood sample. Cytokines were measured by ELISA and multiplex array. Neutralising antibody levels in plasma were determined by a 50% neutralising antibody titre assay, and SARS-CoV-2 spike-specific IgG levels were quantified by ELISA. Results: In infection-negative donors, IFN-γ, IL-2 and neutralising antibody levels were significantly reduced in rheumatology patients (p=0.0014, p=0.0415 and p=0.0319, respectively) and renal transplant recipients (p<0.0001, p=0.0005 and p<0.0001, respectively) compared with immunocompetent controls, with IgG antibody responses similarly affected. Conversely, cellular and humoral immune responses were not impaired in PLWH, or in individuals from any group with previous SARS-CoV-2 infection. Discussion: These results suggest that specific subgroups within immunocompromised cohorts could benefit from distinct, personalised immunisation or treatment strategies. Identification of vaccine non-responders could be critical to protecting those most at risk.
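    For context, the 50% neutralising antibody titre (NT50) used in the methods is conventionally the reciprocal plasma dilution at which neutralisation falls to 50%. Below is a minimal sketch of estimating it by linear interpolation on log dilution; this illustrates the general idea only, not the assay read-out or curve-fitting method actually used in the study, and the example values are hypothetical.

```python
import numpy as np

def nt50(dilutions, pct_neutralisation):
    """Estimate the 50% neutralising titre (NT50) by linear
    interpolation of percent neutralisation against log10 dilution.

    dilutions          -- reciprocal plasma dilutions, ascending (e.g. 20, 40, ...)
    pct_neutralisation -- percent neutralisation observed at each dilution
    """
    logd = np.log10(np.asarray(dilutions, dtype=float))
    pct = np.asarray(pct_neutralisation, dtype=float)
    for i in range(len(pct) - 1):
        # find the interval where neutralisation falls through 50%
        if pct[i] >= 50 >= pct[i + 1]:
            frac = (pct[i] - 50) / (pct[i] - pct[i + 1])
            return 10 ** (logd[i] + frac * (logd[i + 1] - logd[i]))
    return None  # 50% never crossed within the tested dilution range

# Hypothetical dilution series: neutralisation drops as plasma is diluted.
print(nt50([20, 40, 80, 160, 320, 640], [95, 88, 70, 48, 25, 10]))  # ~150
```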

    Abstracts from the Food Allergy and Anaphylaxis Meeting 2016


    The effect of non-ploughing on cereal diseases

    The change from orthodox cultivation to non-plough husbandry may influence the incidence of cereal diseases by its effects on the availability of inoculum of the responsible pathogens, the susceptibility of the host plants and the suitability of the environment for disease development. Failure to bury crop debris may aggravate trash-borne diseases; in addition, increased disease levels in autumn-sown crops are likely if non-ploughing techniques should lead to earlier sowing.

    Insurance companies' attitude to psychiatric illness.


    Epidemiology in sustainable systems
