
    Social exclusion in adult informal carers: A systematic narrative review of the experiences of informal carers of people with dementia and mental illness.

    Social exclusion has a negative impact on quality of life. People living with dementia or mental health disorders, as well as their informal carers, have been separately described as socially excluded. The objective of this systematic narrative review was to examine the extent to which social exclusion experienced by adult informal carers of people living with dementia or severe mental health disorders has been identified and described in the research literature. It synthesised qualitative and quantitative evidence and included the perspectives of carers themselves and of professionals. Eight electronic databases (1997–2017) were searched. Five relevant studies published between 2010 and 2016 were identified. All were qualitative and used interviews and focus groups. Study quality was variable and most studies were European. Two focused on carers of people living with dementia and three on carers of people with mental health disorders. Four investigated carers’ perspectives and experiences of social exclusion directly (a total of 137 carer participants, predominantly parents, spouses and adult children), while the fifth focused on the perceptions of 65 participants working in health and social care. Stigma, financial difficulties and social isolation were highlighted in four studies, and the challenges for carers in engaging in leisure activities were described in the fifth. Most studies conceptualised social exclusion as a form of stigma, or as resulting from stigma. One presented social exclusion as an element of carer burden. Two explicitly discussed the negative effects of social exclusion on carers. The dearth of research and the lack of specificity about social exclusion in carers were surprising. Future research should investigate aspects of social exclusion that may adversely affect carer well-being.

    Patient stratification and genomics: flares, fizzlers and foxes


    Cytokine mRNA expression responses to resistance, aerobic, and concurrent exercise in sedentary middle-aged men

    Concurrent resistance and aerobic exercise (CE) is recommended to ageing populations, though it is postulated to induce diminished acute molecular responses. Given that contraction-induced cytokine mRNA expression reportedly mediates remunerative postexercise molecular responses, it is necessary to determine whether cytokine mRNA expression may be diminished after CE. Eight middle-aged men (age, 53.3 ± 1.8 years; body mass index, 29.4 ± 1.4 kg·m⁻²) randomly completed (balanced for completion order) 8 × 8 leg extensions at 70% maximal strength (RE), 40 min of cycling at 55% of peak aerobic workload (AE), or (workload-matched) 50% RE and 50% AE (CE). Muscle (vastus lateralis) was obtained pre-exercise, and at 1 h and 4 h postexercise, and analyzed for changes in glycogen concentration, tumor necrosis factor (TNF)α, TNF receptor-1 and -2 (TNF-R1 and TNF-R2, respectively), interleukin (IL)-6, IL-6R, IL-1β, and IL-1 receptor-antagonist (IL-1ra). All exercise modes comparably upregulated cytokine mRNA expression at 1 h postexercise (TNFα, TNF-R1, TNF-R2, IL-1β, IL-6) (p < 0.05). Moreover, AE and RE upregulated IL-1β and IL-1ra expression, whereas CE upregulated IL-1β expression only (p < 0.05). In conclusion, in middle-aged men, all modes induced commensurate cytokine mRNA expression at 1 h postexercise; however, only CE resulted in ameliorated expression at 4 h postexercise. Whether the RE or AE components of CE are independently or cumulatively sufficient to upregulate cytokine responses, or whether they collectively inhibit cytokine mRNA expression, remains to be determined.

    Comparing badger (Meles meles) management strategies for reducing tuberculosis incidence in cattle

    This is the final version of the article, available from Public Library of Science via the DOI in this record. Bovine tuberculosis (bTB), caused by Mycobacterium bovis, continues to be a serious economic problem for the British cattle industry. The Eurasian badger (Meles meles) is partly responsible for maintenance of the disease and its transmission to cattle. Previous attempts to manage the disease by culling badgers have been hampered by social perturbation, which in some situations is associated with increases in the cattle herd incidence of bTB. Following the licensing of an injectable vaccine, we consider the relative merits of management strategies to reduce bTB in badgers, and thereby reduce cattle herd incidence. We used an established simulation model of the badger-cattle-TB system and investigated four proposed strategies: business as usual with no badger management, large-scale proactive badger culling, badger vaccination, and culling with a ring of vaccination around it. For ease of comparison with empirical data, model treatments were applied over 150 km² and were evaluated over the whole of a 300 km² area, comprising the core treatment area and a ring of approximately 2 km. The effects of treatment were evaluated over a 10-year period comprising treatment for five years and the subsequent five-year period without treatment. Against a background of existing disease control measures, where 144 cattle herd incidents might be expected over 10 years, badger culling prevented 26 cattle herd incidents while vaccination prevented 16. Culling in the core 150 km² plus vaccination in a ring around it prevented about 40 cattle herd breakdowns by partly mitigating the negative effects of culling, although this approach clearly required greater effort. While model outcomes were robust to uncertainty in parameter estimates, the outcomes of culling were sensitive to low rates of land access for culling, low culling efficacy, and the early cessation of a culling strategy, all of which were likely to lead to an overall increase in cattle disease. Funding: The UK Department for Environment, Food and Rural Affairs (Defra, http://ww2.defra.gov.uk/) funded this work. Defra designed the three strategies to investigate, but had no role in data collection and analysis, decision to publish, or preparation of the manuscript.

    Comparative effects of single-mode vs. duration-matched concurrent exercise training on body composition, low-grade inflammation, and glucose regulation in sedentary, overweight, middle-aged men

    The effect of duration-matched concurrent exercise training (CET) (50% resistance (RET) and 50% endurance (EET) training) on physiological training outcomes in untrained middle-aged men remains to be elucidated. Forty-seven men (age, 48.1 ± 6.8 years; body mass index, 30.4 ± 4.1 kg·m⁻²) were randomized into 12 weeks of EET (40-60 min of cycling), RET (10 exercises; 3-4 sets × 8-10 repetitions), CET (50% serial completion of RET and EET), or a control condition. The following were determined: intervention-based changes in fitness and strength; abdominal visceral adipose tissue (VAT), total body fat (TB-FM) and fat-free (TB-FFM) mass; plasma cytokines (C-reactive protein (CRP), tumor necrosis factor-α (TNFα), interleukin-6 (IL-6)); muscle protein content of p110α and glucose transporter 4 (GLUT4); mRNA expression of GLUT4, peroxisome proliferator-activated receptor-γ coactivator-1α-β, cytochrome c oxidase, hexokinase II, citrate synthase; oral glucose tolerance; and estimated insulin sensitivity. CET promoted commensurate improvements of aerobic capacity and muscular strength, and reduced VAT and TB-FM equivalently to EET and RET (p < 0.05). EET reduced area under the curve for glucose, insulin, and C-peptide, whilst CET reduced insulin and C-peptide, and RET reduced C-peptide only (p < 0.05). In middle-aged men, 12 weeks of duration-matched CET promoted commensurate changes in fitness and strength, abdominal VAT, plasma cytokines and insulin sensitivity, and an equidistant glucose tolerance response to EET and RET, despite no change in the measured muscle mechanisms associated with insulin action, glucose transport, and mitochondrial function.
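The "area under the curve" outcomes above are typically computed from serial blood samples with the trapezoidal rule. As a minimal sketch (the time points and glucose values below are illustrative assumptions, not data from the study):

```python
# Hypothetical sketch: area under the curve (AUC) for an oral glucose
# tolerance test, computed with the trapezoidal rule over sampled time points.

def auc_trapezoid(times, values):
    """Trapezoidal area under the curve for matched time/value series."""
    if len(times) != len(values) or len(times) < 2:
        raise ValueError("need matched time/value series of length >= 2")
    area = 0.0
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]                    # interval width (min)
        area += dt * (values[i] + values[i - 1]) / 2.0  # trapezoid segment
    return area

# Illustrative glucose values (mmol/L) at 0, 30, 60, 90, 120 min of an OGTT.
times = [0, 30, 60, 90, 120]
glucose = [5.2, 8.9, 8.1, 6.8, 5.9]
print(auc_trapezoid(times, glucose))  # total AUC in mmol/L·min
```

A lower AUC after training, with the same sampling schedule, indicates improved glucose tolerance; some analyses instead use incremental AUC (subtracting the baseline value), which this sketch does not do.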

    The Contribution of Transcriptomics to Biomarker Development in Systemic Vasculitis and SLE.

    A small but increasing number of gene expression-based biomarkers are becoming available for routine clinical use, principally in oncology and transplantation. These underscore the potential of gene expression arrays and RNA sequencing for biomarker development, but this potential has not yet been fully realized and most candidates do not progress beyond the initial report. The first part of this review examines the process of gene expression-based biomarker development, highlighting how systematic biases and confounding can significantly skew study outcomes. Adequate validation in an independent cohort remains the single best means of protecting against these concerns. The second part considers gene expression-based biomarkers in Systemic Lupus Erythematosus (SLE) and systemic vasculitis. The type 1 interferon inducible gene signature remains by far the most studied in autoimmune rheumatic disease. While initially presented as an objective, blood-based biomarker of active SLE, subsequent research has shown that it is not specific to SLE and that its association with disease activity is considerably more nuanced than first thought. Nonetheless, it is currently under evaluation in ongoing trials of anti-interferon therapy. Other candidate markers of note include a prognostic CD8+ T-cell gene signature validated in SLE and ANCA-associated vasculitis, and a disease activity biomarker for SLE derived from modules of tightly correlated genes. This is the author accepted manuscript. The final version is available from Bentham Science via http://dx.doi.org/10.2174/138161282166615031313025

    Demographic buffering and compensatory recruitment promotes the persistence of disease in a wildlife population.

    Demographic buffering allows populations to persist by compensating for fluctuations in vital rates, including disease-induced mortality. Using long-term data on a badger (Meles meles Linnaeus, 1758) population naturally infected with Mycobacterium bovis, we built an integrated population model to quantify impacts of disease, density and environmental drivers on survival and recruitment. Badgers exhibit a slow life-history strategy, having high rates of adult survival with low variance, and low but variable rates of recruitment. Recruitment exhibited strong negative density-dependence, but was not influenced by disease, while adult survival was density independent but declined with increasing prevalence of diseased individuals. Given that reproductive success is not depressed by disease prevalence, density-dependent recruitment of cubs is likely to compensate for disease-induced mortality. This combination of slow life history and compensatory recruitment promotes the persistence of a naturally infected badger population and helps to explain the badger's role as a persistent reservoir of M. bovis. Funding: NERC and the UK Department for Environment, Food and Rural Affairs.
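The compensation mechanism described above can be illustrated with a toy model (this is not the paper's integrated population model; all parameter values and the functional forms are assumptions for illustration): adult survival declines with disease prevalence, while recruitment is negatively density-dependent, so disease-induced mortality lowers density and thereby raises recruitment.

```python
# Toy annual-step population model (illustrative only, not the study's model):
# survival falls with disease prevalence; per-capita recruitment falls with
# density, so recruitment compensates for disease mortality.

def step(n, prevalence, k=100.0, s0=0.9, d=0.2, r0=0.5):
    """One annual time step for a toy population of size n.

    s0: baseline adult survival; d: survival cost per unit prevalence;
    r0: maximum per-capita recruitment; k: density scaling.
    All parameter values are assumed for illustration.
    """
    survival = s0 - d * prevalence           # disease lowers adult survival
    recruitment = r0 * max(0.0, 1 - n / k)   # fewer recruits at high density
    return n * (survival + recruitment)

n = 80.0
for _ in range(20):
    n = step(n, prevalence=0.3)
print(round(n, 1))  # settles near a positive equilibrium despite disease
```

With these assumed values the population converges to a stable size (the equilibrium solves s0 - d*p + r0*(1 - n/k) = 1, i.e. n = 68 here), mimicking how compensatory recruitment can keep an infected population, and hence the pathogen reservoir, persistent.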

    A cross sectional study of water quality from dental unit water lines in dental practices in the West of Scotland

    OBJECTIVE: To determine the microbiological quality of water from dental units in a general practice setting and current practice for disinfection of units. DESIGN: A cross-sectional study of the water quality from 40 dental units in 39 general practices and a questionnaire on the disinfection protocols used in those practices. SETTING: NHS practices in primary dental care. SUBJECTS: Thirty-nine general practices from the West of Scotland. METHODS: Water samples were collected on two separate occasions from dental units and analysed for microbiological quality by the total viable count (TVC) method. Water specimens were collected from the triple syringe, high speed outlet, cup filler and surgery tap. Each participating practitioner was asked to complete a questionnaire. RESULTS: Microbial contamination was highest from the high speed outlet, followed by the triple syringe and cup filler. On average, the TVC counts from the high speed water lines at 37 degrees C, and for the high speed lines, triple syringe and cup filler at 22 degrees C, were significantly higher than those from the control tap water specimens. The study included units from 11 different manufacturers with ages ranging from under one year to over eight years. The age of the dental unit did not appear to influence the level of microbial contamination. Five of the practices surveyed used disinfectants to clean the dental units, but these had no significant effect on the microbiological quality of the water. The majority of dental units (25 out of 40) were never flushed with water between patients. A number of different non-sterile irrigants were used for surgical procedures. CONCLUSION: The microbiological quality of water from dental units in general dental practice is poor compared with that from drinking water sources. Suitable sterile irrigants should be used for surgical procedures in dental practice. Further work is required to establish pragmatic decontamination regimens for dental unit water lines in the general dental practice setting.
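The total viable count (TVC) reported above rests on standard plate-count arithmetic: colonies on a countable plate are scaled by the dilution factor and the volume plated. A minimal sketch, with illustrative values that are not data from the study:

```python
# Hedged sketch of plate-count arithmetic behind a total viable count (TVC):
# CFU/mL = colonies counted * dilution factor / volume plated.
# The study incubated plates at 22 and 37 degrees C; values here are made up.

def cfu_per_ml(colonies, dilution_factor, volume_ml):
    """Colony-forming units per mL estimated from one countable plate."""
    if volume_ml <= 0:
        raise ValueError("plated volume must be positive")
    return colonies * dilution_factor / volume_ml

# Example: 42 colonies on a 10^-2 dilution plate, 0.1 mL plated.
print(cfu_per_ml(42, 100, 0.1))  # 42000.0 CFU/mL
```

In practice only plates within a countable range (commonly around 30-300 colonies) are used, and counts from replicate plates are averaged before scaling.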

    Evaluation of the efficacy of Alpron disinfectant for dental unit water lines

    AIMS: To assess the efficacy of a disinfectant, Alpron, for controlling microbial contamination within dental unit water lines. METHODS: The microbiological quality of water emerging from the triple syringe, high speed handpiece, cup filler and surgery hand wash basin of six dental units was assessed by total viable counts at 22 degrees C and 37 degrees C before and after treatment with Alpron solutions. RESULTS: The study found that the use of Alpron disinfectant solutions could reduce microbial counts in dental unit water lines to levels similar to those of drinking water. This effect was maintained in all units for up to six weeks following one course of treatment. In four out of six units the low microbial counts were maintained for 13 weeks. CONCLUSIONS: Disinfectants may have a short-term role to play in controlling microbial contamination of dental unit water lines to drinking water quality. However, in the longer term attention must be paid to redesigning dental units to discourage the build-up of microbial biofilms.