    Is late-life dependency increasing or not? A comparison of the Cognitive Function and Ageing Studies (CFAS)

    Background: Little is known about how dependency levels have changed between generational cohorts of older people. We estimated years lived in different care states at age 65 in 1991 and 2011 and made new projections of future demand for care. Methods: Two population-based studies of older people in defined geographical areas, conducted two decades apart (the Cognitive Function and Ageing Studies), provided prevalence estimates of dependency in four states: high (24-hour care); medium (daily care); low (less than daily); independent. Years in each dependency state were calculated by Sullivan's method. To project future demand, the proportions in each dependency state (by age group and sex) were applied to the 2014 England population projections. Findings: Between 1991 and 2011 there were significant increases in years lived from age 65 with low dependency (men: 1.7 years, 95% CI 1.0-2.4; women: 2.4 years, 95% CI 1.8-3.1) and high dependency (men: 0.9 years, 95% CI 0.2-1.7; women: 1.3 years, 95% CI 0.5-2.1). The majority of men's extra years of life were independent (36%) or with low dependency (36%), whereas for women the majority were spent with low dependency (58%) and only 5% were independent. There were substantial reductions in the proportions with medium and high dependency who lived in care homes; even so, if these dependency and care home proportions remain constant, further population ageing will require an extra 71,000 care home places by 2025. Interpretation: On average, older men now spend 2.4 years and women 3.0 years with substantial care needs (medium or high dependency), and most will live in the community. These findings have considerable implications for older people's families, who provide the majority of unpaid care, but they also supply valuable new information for governments and care providers planning the resources and funding required for the care of future ageing populations.
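Sullivan's method combines life-table person-years with the cross-sectional prevalence of each state. A minimal Python sketch of the calculation, using invented person-years and prevalence figures (these are illustrative numbers, not CFAS estimates):

```python
# Sullivan's method: weight life-table person-years in each age interval by
# the prevalence of each dependency state in that interval. All inputs below
# are invented for illustration; they are not CFAS data.

def sullivan(person_years, prevalence):
    """Expected years lived in each state from age 65.

    person_years: years lived per age interval (life-table nLx / l65).
    prevalence: {state: proportion in that state, per age interval}.
    """
    return {
        state: sum(py * p for py, p in zip(person_years, props))
        for state, props in prevalence.items()
    }

# Three age intervals: 65-74, 75-84, 85+; total life expectancy 19.0 years.
person_years = [9.5, 6.5, 3.0]
prevalence = {
    "independent": [0.80, 0.55, 0.25],
    "low":         [0.12, 0.25, 0.30],
    "medium":      [0.05, 0.12, 0.25],
    "high":        [0.03, 0.08, 0.20],
}

years = sullivan(person_years, prevalence)
# Because prevalences sum to 1 in each interval, the state-specific years
# partition total life expectancy at 65.
assert abs(sum(years.values()) - sum(person_years)) < 1e-9
```

Since the state prevalences sum to one within each age interval, the state-specific expectancies always add back up to total life expectancy at 65, which is what lets the abstract decompose extra years of life into independent, low, medium, and high dependency.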

    The lithic assemblage from Sugenya, a Pastoral Neolithic site of the Elmenteitan tradition in southwestern Kenya

    The spread of mobile pastoralism throughout eastern Africa in the mid- to late Holocene fundamentally reshaped social and economic strategies and occurred against the backdrop of major climatic and demographic change. Early stone-tool-using herders in these regions faced new and unpredictable environments. Lithic technological strategies from this ‘Pastoral Neolithic’ (PN) period (c. 5000–1400 BP) reflect the social and economic solutions to the novel environmental challenges faced by food-producing communities. In southern Kenya, the ‘Elmenteitan’ technological tradition appears during the PN in association with a specialised herding economy and distinct ceramic styles and settlement patterns. The Elmenteitan is known mostly from rockshelter sites in the Central Rift Valley; few open-air Elmenteitan sites have been extensively excavated, and fewer still have benefitted from comprehensive lithic analyses. This paper presents typological and technological analyses of the Elmenteitan site of Sugenya, located in the Lemek Valley of southwestern Kenya and excavated by Alison Simons in 2002. Technological patterns add resolution to Elmenteitan tool use and production in the region and contribute new insights into the organisation of Elmenteitan obsidian exchange networks.

    Treatment of enteric fever (typhoid and paratyphoid fever) with cephalosporins

    Background Typhoid and paratyphoid (enteric fever) are febrile bacterial illnesses common in many low‐ and middle‐income countries. The World Health Organization (WHO) currently recommends treatment with azithromycin, ciprofloxacin, or ceftriaxone due to widespread resistance to older, first‐line antimicrobials. Resistance patterns vary in different locations and are changing over time. Fluoroquinolone resistance in South Asia often precludes the use of ciprofloxacin. Extensively drug‐resistant strains of enteric fever have emerged in Pakistan. In some areas of the world, susceptibility to old first‐line antimicrobials, such as chloramphenicol, has re‐appeared. A Cochrane Review of the use of fluoroquinolones and azithromycin in the treatment of enteric fever has previously been undertaken, but the use of cephalosporins has not been systematically investigated and the optimal choice of drug and duration of treatment are uncertain. Objectives To evaluate the effectiveness of cephalosporins for treating enteric fever in children and adults compared to other antimicrobials. Search methods We searched the Cochrane Infectious Diseases Group Specialized Register, CENTRAL, MEDLINE, Embase, LILACS, the WHO ICTRP and ClinicalTrials.gov up to 24 November 2021. We also searched reference lists of included trials, contacted researchers working in the field, and contacted relevant organizations. Selection criteria We included randomized controlled trials (RCTs) in adults and children with enteric fever that compared a cephalosporin to another antimicrobial, a different cephalosporin, or a different treatment duration of the intervention cephalosporin. Enteric fever was diagnosed on the basis of blood culture, bone marrow culture, or molecular tests. Data collection and analysis We used standard Cochrane methods. Our primary outcomes were clinical failure, microbiological failure and relapse. 
Our secondary outcomes were time to defervescence, duration of hospital admission, convalescent faecal carriage, and adverse effects. We used the GRADE approach to assess certainty of evidence for each outcome. Main results We included 27 RCTs with 2231 total participants published between 1986 and 2016 across Africa, Asia, Europe, the Middle East and the Caribbean, with comparisons between cephalosporins and other antimicrobials used for the treatment of enteric fever in children and adults. The main comparisons are between antimicrobials in most common clinical use, namely cephalosporins compared to a fluoroquinolone and cephalosporins compared to azithromycin. Cephalosporin (cefixime) versus fluoroquinolones Clinical failure, microbiological failure and relapse may be increased in patients treated with cefixime compared to fluoroquinolones in three small trials published over 14 years ago: clinical failure (risk ratio (RR) 13.39, 95% confidence interval (CI) 3.24 to 55.39; 2 trials, 240 participants; low‐certainty evidence); microbiological failure (RR 4.07, 95% CI 0.46 to 36.41; 2 trials, 240 participants; low‐certainty evidence); relapse (RR 4.45, 95% CI 1.11 to 17.84; 2 trials, 220 participants; low‐certainty evidence). Time to defervescence in participants treated with cefixime may be longer compared to participants treated with fluoroquinolones (mean difference (MD) 1.74 days, 95% CI 0.50 to 2.98, 3 trials, 425 participants; low‐certainty evidence). 
Cephalosporin (ceftriaxone) versus azithromycin Ceftriaxone may result in a decrease in clinical failure compared to azithromycin, and it is unclear whether ceftriaxone has an effect on microbiological failure compared to azithromycin in two small trials published over 18 years ago and in one more recent trial, all conducted in participants under 18 years of age: clinical failure (RR 0.42, 95% CI 0.11 to 1.57; 3 trials, 196 participants; low‐certainty evidence); microbiological failure (RR 1.95, 95% CI 0.36 to 10.64, 3 trials, 196 participants; very low‐certainty evidence). It is unclear whether ceftriaxone increases or decreases relapse compared to azithromycin (RR 10.05, 95% CI 1.93 to 52.38; 3 trials, 185 participants; very low‐certainty evidence). Time to defervescence in participants treated with ceftriaxone may be shorter compared to participants treated with azithromycin (mean difference of −0.52 days, 95% CI −0.91 to −0.12; 3 trials, 196 participants; low‐certainty evidence). Cephalosporin (ceftriaxone) versus fluoroquinolones It is unclear whether ceftriaxone has an effect on clinical failure, microbiological failure, relapse, and time to defervescence compared to fluoroquinolones in three trials published over 28 years ago and two more recent trials: clinical failure (RR 3.77, 95% CI 0.72 to 19.81; 4 trials, 359 participants; very low‐certainty evidence); microbiological failure (RR 1.65, 95% CI 0.40 to 6.83; 3 trials, 316 participants; very low‐certainty evidence); relapse (RR 0.95, 95% CI 0.31 to 2.92; 3 trials, 297 participants; very low‐certainty evidence) and time to defervescence (MD 2.73 days, 95% CI −0.37 to 5.84; 3 trials, 285 participants; very low‐certainty evidence). 
It is unclear whether ceftriaxone decreases convalescent faecal carriage compared to the fluoroquinolone gatifloxacin (RR 0.18, 95% CI 0.01 to 3.72; 1 trial, 73 participants; very low‐certainty evidence); length of hospital stay may be longer in participants treated with ceftriaxone compared to participants treated with the fluoroquinolone ofloxacin (mean of 12 days (range 7 to 23 days) in the ceftriaxone group compared to a mean of 9 days (range 6 to 13 days) in the ofloxacin group; 1 trial, 47 participants; low‐certainty evidence). Authors' conclusions Based on very low‐ to low‐certainty evidence, ceftriaxone is an effective treatment for adults and children with enteric fever, with few adverse effects. Trials suggest that there may be no difference in the performance of ceftriaxone compared with azithromycin, fluoroquinolones, or chloramphenicol. Cefixime can also be used for treatment of enteric fever but may not perform as well as fluoroquinolones. We are unable to draw firm general conclusions on comparative contemporary effectiveness given that most trials were small and conducted over 20 years previously. Clinicians need to take into account current, local resistance patterns in addition to route of administration when choosing an antimicrobial.
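The risk ratios and confidence intervals quoted throughout this review follow the standard Katz log-interval construction for a 2×2 table. A small sketch with made-up counts (not data from any of the included trials):

```python
import math

def risk_ratio_ci(a, n1, b, n2, z=1.96):
    """Risk ratio of an event in group 1 vs group 2, with a Katz
    log-interval: exp(ln RR +/- z * SE), SE = sqrt(1/a - 1/n1 + 1/b - 1/n2)."""
    rr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    lo = math.exp(math.log(rr) - z * se)
    hi = math.exp(math.log(rr) + z * se)
    return rr, lo, hi

# Hypothetical counts: 20/120 failures on drug A vs 2/120 on drug B.
rr, lo, hi = risk_ratio_ci(20, 120, 2, 120)
# The point estimate is RR = 10.0; the very wide interval reflects the
# small event counts, mirroring the wide CIs reported in the abstract.
```

Wide intervals like RR 13.39 (95% CI 3.24 to 55.39) arise exactly this way: with few events, the standard error of log RR is dominated by the 1/a and 1/b terms.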

    Azithromycin Resistance in Shigella spp. in Southeast Asia.

    Infection by Shigella spp. is a common cause of dysentery in Southeast Asia. Antimicrobials are thought to be beneficial for treatment; however, antimicrobial resistance in Shigella spp. is becoming widespread. We aimed to assess the frequency and mechanisms associated with decreased susceptibility to azithromycin in Southeast Asian Shigella isolates and use these data to assess appropriate susceptibility breakpoints. Shigella isolates recovered in Vietnam and Laos were screened for susceptibility to azithromycin (15 μg) by disc diffusion and MIC. Phenotypic resistance was confirmed by PCR amplification of macrolide resistance loci. We compared the genetic relationships and plasmid contents of azithromycin-resistant Shigella sonnei isolates using whole-genome sequences. Of 475 available Shigella spp. isolates recovered in Vietnam and Laos between 1994 and 2012, 6/181 S. flexneri isolates (3.3%, MIC ≥ 16 mg/liter) and 16/294 S. sonnei isolates (5.4%, MIC ≥ 32 mg/liter) were phenotypically resistant to azithromycin. PCR amplification confirmed a resistance mechanism in 22/475 (4.6%) isolates (mphA in 19 isolates and ermB in 3 isolates). The susceptibility data demonstrated the acceptability of the S. flexneri (MIC ≥ 16 mg/liter, zone diameter ≤ 15 mm) and S. sonnei (MIC ≥ 32 mg/liter, zone diameter ≤ 11 mm) breakpoints, with a <3% discrepancy. Phylogenetic analysis demonstrated that decreased susceptibility arose sporadically in Vietnamese S. sonnei isolates on at least seven occasions between 2000 and 2009 but failed to become established. While the proposed susceptibility breakpoints may allow better recognition of resistant isolates, additional studies are required to assess the impact on clinical outcome. The potential emergence of azithromycin resistance highlights the need for alternative options for the management of Shigella infections in countries where Shigella is endemic.
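Breakpoint acceptability of the kind reported here (<3% discrepancy) can be checked by classifying each isolate twice, once by MIC and once by zone diameter, and counting discordant calls. A sketch using the S. sonnei breakpoints from the abstract and invented isolate measurements:

```python
# Classify each isolate by MIC and by disc-diffusion zone diameter, then
# count discordant calls. Breakpoints are the S. sonnei values from the
# abstract (resistant if MIC >= 32 mg/liter or zone <= 11 mm); the isolate
# measurements themselves are invented for illustration.

def discrepancy_rate(isolates, mic_bp, zone_bp):
    """Fraction of isolates where the MIC call and the zone call disagree."""
    discordant = sum(
        1 for mic, zone in isolates
        if (mic >= mic_bp) != (zone <= zone_bp)
    )
    return discordant / len(isolates)

# (MIC in mg/liter, zone diameter in mm); the last isolate is discordant:
# its zone diameter calls it resistant but its MIC does not.
isolates = [(64, 9), (32, 11), (2, 24), (4, 20), (0.5, 28), (16, 10)]
rate = discrepancy_rate(isolates, mic_bp=32, zone_bp=11)
```

A candidate breakpoint pair is acceptable when this rate stays below the tolerated discrepancy threshold across the isolate collection.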

    The CIPAZ study protocol: an open label randomised controlled trial of azithromycin versus ciprofloxacin for the treatment of children hospitalised with dysentery in Ho Chi Minh City, Vietnam

    Background: Diarrhoeal disease remains a common cause of illness and death in children <5 years of age. Faecal-oral infection by Shigella spp. causing bacillary dysentery is a leading cause of moderate-to-severe diarrhoea, particularly in low- and middle-income countries. In Southeast Asia, S. sonnei predominates and infections are frequently resistant to first-line treatment with the fluoroquinolone ciprofloxacin. While resistance to all antimicrobials is increasing, there may be theoretical and clinical benefits to prioritising treatment of bacillary dysentery with the azalide azithromycin. In this study we aim to measure the efficacy of treatment with azithromycin compared with ciprofloxacin, the current standard of care, for the treatment of children with bacillary dysentery. Methods and analysis: We will perform a multicentre, open-label, randomised controlled trial of two therapeutic options for the antimicrobial treatment of children hospitalised with dysentery. Children (6–60 months of age) presenting with symptoms and signs of dysentery at Children's Hospital 2 in Ho Chi Minh City will be randomised (1:1) to treatment with either oral ciprofloxacin (15 mg/kg twice daily for 3 days, standard of care) or oral azithromycin (10 mg/kg once daily for 3 days). The primary endpoint will be the proportion of participants with treatment failure (defined by clinical and microbiological parameters) by day 28 (+3 days) and will be compared between study arms by logistic regression modelling using treatment allocation as the main variable. Ethics and dissemination: The study protocol (version 1.2, dated 27th December 2018) has been approved by the Oxford Tropical Research Ethics Committee (47–18) and the ethical review board of Children's Hospital 2 (1341/NĐ2-CĐT). The study has also been approved by the Vietnamese Ministry of Health (5044/QĐ-BYT). Trial registration: Clinicaltrials.gov: NCT03854929 (February 26th 2019).

    An efficient strategy for evaluating new non-invasive screening tests for colorectal cancer: the guiding principles.

    New screening tests for colorectal cancer (CRC) are rapidly emerging. Conducting trials with mortality reduction as the end point supporting their adoption is challenging. We re-examined the principles underlying evaluation of new non-invasive tests in view of technological developments and identification of new biomarkers. A formal consensus approach involving a multidisciplinary expert panel revised eight previously established principles. Twelve newly stated principles emerged. Effectiveness of a new test can be evaluated by comparison with a proven comparator non-invasive test. The faecal immunochemical test is now considered the appropriate comparator, while colonoscopy remains the diagnostic standard. For a new test to be able to meet differing screening goals and regulatory requirements, flexibility to adjust its positivity threshold is desirable. A rigorous and efficient four-phased approach is proposed, commencing with small studies assessing the test's ability to discriminate between CRC and non-cancer states (phase I), followed by prospective estimation of accuracy across the continuum of neoplastic lesions in neoplasia-enriched populations (phase II). If these show promise, a provisional test positivity threshold is set before evaluation in typical screening populations. Phase III prospective studies determine single round intention-to-screen programme outcomes and confirm the test positivity threshold. Phase IV studies involve evaluation over repeated screening rounds with monitoring for missed lesions. Phases III and IV findings will provide the real-world data required to model test impact on CRC mortality and incidence. New non-invasive tests can be efficiently evaluated by a rigorous phased comparative approach, generating data from unbiased populations that inform predictions of their health impact

    Changing prevalence and treatment of depression among older people over two decades

    Background: Depression is a leading cause of disability, with older people particularly susceptible to poor outcomes. Aims: To investigate whether the prevalence of depression and antidepressant use have changed across two decades in older people. Method: The Cognitive Function and Ageing Studies (CFAS I and CFAS II) are two English population-based cohort studies of older people aged ≥65 years, with baseline measurements for each cohort conducted two decades apart (between 1990 and 1993 and between 2008 and 2011). Depression was assessed by the Geriatric Mental State examination and diagnosed with the Automated Geriatric Examination for Computer-Assisted Taxonomy algorithm. Results: In CFAS I, 7635 people aged ≥65 years were interviewed, of whom 1457 were diagnostically assessed. In CFAS II, 7762 people were interviewed and diagnostically assessed. Age-standardised depression prevalence in CFAS II was 6.8% (95% CI 6.3-7.5%), representing a non-significant decline from CFAS I (risk ratio 0.82, 95% CI 0.64-1.07, P = 0.14). At the time of CFAS II, 10.7% of the population (95% CI 10.0-11.5%) were taking antidepressant medication, more than twice the proportion in CFAS I (risk ratio 2.79, 95% CI 1.96-3.97, P < 0.0001). Among care home residents, depression prevalence was unchanged, but the use of antidepressants increased from 7.4% (95% CI 3.8-13.8%) to 29.2% (95% CI 22.6-36.7%). Conclusions: A substantial increase in the proportion of the population reporting taking antidepressant medication is seen across two decades for people aged ≥65 years. However, there was no evidence for a change in age-specific prevalence of depression.
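Age-standardised prevalence, as reported for CFAS II, is a weighted average of age-specific prevalences using a standard population's age structure, which makes the two cohorts comparable despite population ageing between them. A minimal sketch with hypothetical figures (not CFAS estimates):

```python
# Direct age standardisation: weight age-specific prevalences by a standard
# population's age structure. The prevalences and weights below are
# hypothetical, invented purely for illustration.

def age_standardised_prevalence(age_specific, standard_weights):
    """Weighted average of age-specific prevalences; weights sum to 1."""
    return sum(p * w for p, w in zip(age_specific, standard_weights))

prev = [0.06, 0.07, 0.09]        # prevalence at ages 65-74, 75-84, 85+
weights = [0.55, 0.33, 0.12]     # standard population age shares
std_prev = age_standardised_prevalence(prev, weights)
```

Because both cohorts are weighted to the same standard age structure, a difference in standardised prevalence reflects a change in age-specific risk rather than a shift in the age distribution.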

    Developing a predictive modelling capacity for a climate change-vulnerable blanket bog habitat: Assessing 1961-1990 baseline relationships

    Aim: Understanding the spatial distribution of high-priority habitats and developing predictive models using climate and environmental variables to replicate these distributions are desirable conservation goals. The aim of this study was to model and elucidate the contributions of climate and topography to the distribution of a priority blanket bog habitat in Ireland, and to examine how this might inform the development of a climate change predictive capacity for peatlands in Ireland. Methods: Ten climatic and two topographic variables were recorded for grid cells with a spatial resolution of 10 × 10 km, covering 87% of the mainland land surface of Ireland. Presence-absence data were matched to these variables and generalised linear models (GLMs) fitted to identify the main climatic and terrain predictor variables for occurrence of the habitat. Candidate predictor variables were screened for collinearity, and the accuracy of the final fitted GLM was evaluated using fourfold cross-validation based on the area under the curve (AUC) derived from a receiver operating characteristic (ROC) plot. The GLM-predicted habitat occurrence probability maps were compared against the actual distributions using GIS techniques. Results: Despite the apparent parsimony of the initial GLM using only climatic variables, further testing indicated collinearity among, for example, temperature and precipitation variables. Subsequent elimination of the collinear variables and inclusion of elevation data produced excellent performance based on the AUC scores of the final GLM. Mean annual temperature and total mean annual precipitation, in combination with elevation range, formed the most powerful explanatory variable group among those explored for the presence of blanket bog habitat. Main conclusions: The results confirm that this habitat's distribution can in general be modelled well using the non-collinear climatic and terrain variables tested at the grid resolution used.
Mapping the GLM-predicted distribution against the observed distribution replicated the projected occurrence of the habitat well over an extensive area. The methods developed will usefully inform future climate change predictive modelling for Ireland.
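The AUC used to evaluate the fitted GLM is the probability that a randomly chosen presence cell receives a higher predicted occurrence probability than a randomly chosen absence cell. A sketch of just that evaluation step, with invented labels and scores standing in for GLM output:

```python
# AUC via the rank (Mann-Whitney) formulation: the proportion of
# presence/absence pairs in which the presence cell scores higher,
# counting ties as half. Labels and scores are invented stand-ins
# for GLM-predicted occurrence probabilities.

def auc(labels, scores):
    pos = [s for y, s in zip(labels, scores) if y == 1]  # presence cells
    neg = [s for y, s in zip(labels, scores) if y == 0]  # absence cells
    wins = sum(
        1.0 if p > n else 0.5 if p == n else 0.0
        for p in pos for n in neg
    )
    return wins / (len(pos) * len(neg))

labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.2]  # predicted occurrence probabilities
# Presence cells outscore absence cells in 8 of the 9 pairings.
```

An AUC of 0.5 is chance-level discrimination and 1.0 is perfect ranking; in the fourfold cross-validation described above, the statistic is computed on each held-out fold in turn.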

    The chemical compound 'Heatin' stimulates hypocotyl elongation and interferes with the Arabidopsis NIT1-subfamily of nitrilases

    Temperature passively affects biological processes involved in plant growth. Therefore, it is challenging to study the dedicated temperature signalling pathways that orchestrate thermomorphogenesis, a suite of elongation growth-based adaptations that enhance leaf-cooling capacity. We screened a chemical library for compounds that restored hypocotyl elongation in the pif4-2-deficient mutant background at warm temperature conditions in Arabidopsis thaliana to identify modulators of thermomorphogenesis. The small aromatic compound 'Heatin', containing 1-iminomethyl-2-naphthol as a pharmacophore, was selected as an enhancer of elongation growth. We show that ARABIDOPSIS ALDEHYDE OXIDASES redundantly contribute to Heatin-mediated hypocotyl elongation. Following a chemical proteomics approach, the members of the NITRILASE1-subfamily of auxin biosynthesis enzymes were identified among the molecular targets of Heatin. Our data reveal that nitrilases are involved in promotion of hypocotyl elongation in response to high temperature and Heatin-mediated hypocotyl elongation requires the NITRILASE1-subfamily members, NIT1 and NIT2. Heatin inhibits NIT1-subfamily enzymatic activity in vitro and the application of Heatin accordingly results in the accumulation of NIT1-subfamily substrate indole-3-acetonitrile in vivo. However, levels of the NIT1-subfamily product, bioactive auxin (indole-3-acetic acid), were also significantly increased. It is likely that the stimulation of hypocotyl elongation by Heatin might be independent of its observed interaction with NITRILASE1-subfamily members. However, nitrilases may contribute to the Heatin response by stimulating indole-3-acetic acid biosynthesis in an indirect way. Heatin and its functional analogues present novel chemical entities for studying auxin biology

    Prevalence and factors associated with poor performance in the 5-chair stand test: findings from the Cognitive Function and Ageing Study II and proposed Newcastle protocol for use in the assessment of sarcopenia

    Background: Poor performance in the 5-chair stand test (5-CST) indicates reduced lower limb muscle strength. The 5-CST has been recommended for use in the initial assessment of sarcopenia, the accelerated loss of muscle strength and mass. In order to facilitate the use of the 5-CST in sarcopenia assessment, our aims were to (i) describe the prevalence and factors associated with poor performance in the 5-CST, (ii) examine the relationship between the 5-CST and gait speed, and (iii) propose a protocol for using the 5-CST. Methods: The population-based Cognitive Function and Ageing Study II recruited people aged 65 years and over from defined geographical localities in Cambridgeshire, Newcastle, and Nottingham. The study collected data for assessment of functional ability during home visits, including the 5-CST and gait speed. We used multinomial logistic regression to assess the associations between factors including the SARC-F questionnaire and the category of 5-CST performance: fast (<15 s), slow (≥15 s), or unable, with slow/unable classed as poor performance. We reviewed previous studies on the protocol used to carry out the 5-CST. Results: A total of 7190 participants aged 65+ from the three diverse localities of Cognitive Function and Ageing Study II were included (54.1% female). The proportion with poor performance in the 5-CST increased with age, from 34.3% at age 65–69 to 89.7% at age 90+. Factors independently associated with poor performance included positive responses to the SARC-F questionnaire, physical inactivity, depression, impaired cognition, and multimorbidity (all P < 0.005). Most people with poor performance also had slow gait speed (57.8%) or were unable to complete the gait speed test (18.4%). We found variation in the 5-CST protocol used, for example, timing until a participant stood up for the fifth time or until they sat down afterwards.
Conclusions: Poor performance in the 5-CST is increasingly common with age and is associated with a cluster of other factors that characterise risk for poor ageing, such as physical inactivity, impaired cognition, and multimorbidity. We recommend a low threshold for performing the 5-CST in clinical settings and provide a protocol for its use.
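The three-way outcome used in the multinomial regression can be derived directly from the recorded completion time, assuming the 15 s threshold stated in the Methods; the example times below are invented:

```python
# Derive the three-way 5-CST outcome from the recorded completion time,
# assuming the 15 s fast/slow threshold given in the abstract. None marks
# a participant who could not complete the test; times are invented.

def categorise_5cst(time_s):
    if time_s is None:
        return "unable"
    return "fast" if time_s < 15 else "slow"

times = [9.2, 16.5, None, 14.9]
cats = [categorise_5cst(t) for t in times]
# slow and unable together constitute "poor performance"
poor_fraction = sum(c != "fast" for c in cats) / len(cats)
```

Grouping slow and unable into a single poor-performance class is what yields the prevalence figures reported by age group in the Results.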