
    Inherited biotic protection in a Neotropical pioneer plant

    Chelonanthus alatus is a bat-pollinated, pioneer Gentianaceae that clusters in patches where still-standing, dried-out stems are interspersed among live individuals. Flowers bear circum-floral nectaries (CFNs) that are attractive to ants, and seed dispersal is both barochorous and anemochorous. Although live individuals never sheltered ant colonies in this study, dried-out hollow stems - which can remain standing for two years - did. Workers from species nesting in dried-out stems, as well as from ground-nesting species, exploited the CFNs of live C. alatus individuals in the same patches during the daytime, but were absent at night (when bat pollination occurs) on 60.5% of the plants. By visiting the CFNs, the ants indirectly protect the flowers - but not the plant foliage - from herbivorous insects. We show that this protection is provided mostly by species nesting in dried-out stems, predominantly Pseudomyrmex gracilis. Because dried-out stems remain standing for years and are regularly replaced, the result is an opportunistic but stable association in which colonies are sheltered by one generation of dead C. alatus while the live individuals nearby, belonging to the next generation, provide them with nectar; in turn, the ants protect their flowers from herbivores. We suggest that the investment in wood by C. alatus individuals, permitting still-standing, dried-out stems to shelter ant colonies, constitutes an extended phenotype, because foraging workers protect the flowers of live individuals in the same patch. Through this process, these dried-out stems also indirectly favor the reproduction (and so the fitness) of the next generation, including both their own offspring and that of their siblings, all adding up to a potential case of inclusive fitness in plants.

    Role of immunosuppression in an antibiotic stewardship intervention and its association with clinical outcomes and antibiotic use: protocol for an observational study (RISC-sepsis)

    © Author(s) (or their employer(s)) 2022. Re-use permitted under CC BY. Published by BMJ. Introduction Sepsis is characterised by a dysregulated immune response to infection, with exaggerated pro-inflammatory and anti-inflammatory responses. A predominant immunosuppressive profile affecting both innate and adaptive immune responses is associated with increased hospital-acquired infection and reduced infection-free survival. While hospital-acquired infection leads to additional antibiotic use, the role of the immunosuppressive phenotype in guiding complex decisions, such as those affecting antibiotic stewardship, is uncertain. This study is a mechanistic substudy embedded within a multicentre clinical and cost-effectiveness trial of biomarker-guided antibiotic stewardship. This mechanistic study aims to determine the effect of sepsis-associated immunosuppression on the trial outcome measures. Methods and analysis RISC-sepsis is a prospective, multicentre, exploratory, observational study embedded within the ADAPT-sepsis trial. A subgroup of 180 participants with antibiotics commenced for suspected sepsis, enrolled in the ADAPT-sepsis trial, will be recruited. Blood samples will be collected on alternate days until day 7. At each time point, blood will be collected for flow cytometric analysis into cell preservation tubes. Immunophenotyping will be performed at a central testing hub by flow cytometry. The primary outcome measures are monocyte human leucocyte antigen-DR; neutrophil CD88; programmed cell death-1 on monocytes, neutrophils and T lymphocytes; and the percentage of regulatory T cells. Secondary outcome measures will link to trial outcomes from the ADAPT-sepsis trial, including antibiotic days, occurrence of hospital-acquired infection, and length of ICU stay and hospital stay. Ethics and dissemination Ethical approval has been granted (IRAS 209815) and RISC-sepsis is registered with the ISRCTN (86837685). Study results will be disseminated by peer-reviewed publications, presentations at scientific meetings, and via patient and public participation groups and social media.

    Decay of interspecific avian flock networks along a disturbance gradient in Amazonia

    Our understanding of how anthropogenic habitat change shapes species interactions is in its infancy. This is in large part because analytical approaches such as network theory have only recently been applied to characterize complex community dynamics. Network models are a powerful tool for quantifying how ecological interactions are affected by habitat modification because they provide metrics that quantify community structure and function. Here, we examine how large-scale habitat alteration has affected ecological interactions among mixed-species flocking birds in Amazonian rainforest. These flocks provide a model system for investigating how habitat heterogeneity influences non-trophic interactions and the subsequent social structure of forest-dependent mixed-species bird flocks. We analyse 21 flock interaction networks throughout a mosaic of primary forest, fragments of varying sizes and secondary forest (SF) at the Biological Dynamics of Forest Fragments Project in central Amazonian Brazil. Habitat type had a strong effect on network structure at the levels of both species and flock. Frequency of associations among species, as summarized by weighted degree, declined with increasing levels of forest fragmentation and SF. At the flock level, clustering coefficients and overall attendance positively correlated with mean vegetation height, indicating a strong effect of habitat structure on flock cohesion and stability. Prior research has shown that trophic interactions are often resilient to large-scale changes in habitat structure because species are ecologically redundant. By contrast, our results suggest that behavioural interactions and the structure of non-trophic networks are highly sensitive to environmental change. Thus, a more nuanced, system-by-system approach may be needed when thinking about the resiliency of ecological networks. © 2013 The Author(s). Published by the Royal Society. All rights reserved.
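
    As a rough illustration of the two network metrics named above, the sketch below builds a small weighted association network and computes weighted degree (species level) and the mean clustering coefficient (flock level) with the networkx library; the species names and co-occurrence counts are hypothetical, not data from the study.

    ```python
    # A minimal sketch of the flock-network metrics described above.
    import networkx as nx

    # Hypothetical weighted network: edges join species pairs, weights count
    # how often the two species were recorded together in the same flock.
    edges = [
        ("Thamnomanes ardesiacus", "Myrmotherula axillaris", 12),
        ("Thamnomanes ardesiacus", "Xiphorhynchus pardalotus", 7),
        ("Myrmotherula axillaris", "Xiphorhynchus pardalotus", 5),
    ]
    G = nx.Graph()
    G.add_weighted_edges_from(edges)

    # Species-level metric: weighted degree (summed association frequency).
    weighted_degree = dict(G.degree(weight="weight"))

    # Flock-level metric: average (weighted) clustering coefficient.
    mean_clustering = nx.average_clustering(G, weight="weight")

    print(weighted_degree)
    print(round(mean_clustering, 3))
    ```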

    Exploring the relationship between grapheme colour-picking consistency and mental imagery

    Previous research has indicated a potential link between mental imagery and synaesthesia. However, these findings are mainly based on imagery self-report measures and recruitment of self-selected synaesthetes. To avoid issues of self-selection and demand effects we recruited participants from the general population, rather than synaesthetes specifically, and used colour-picking consistency tests for letters and numbers to assess a "synaesthete-like" experience. Mental imagery ability and mental rotation ability were assessed using both self-report measures and behavioural assessments. Consistency in colour-picking for letters (but not numbers) was predicted by performance on the visual mental imagery task, but not by a mental rotation task or self-report measures. Using the consistency score as a proxy measure of grapheme-colour synaesthesia, we provide more evidence for the suggestion that synaesthetic experience is associated with enhanced mental imagery, even when participants are naïve to the research topic.
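
    For readers unfamiliar with consistency testing, the sketch below shows one common way such a score is computed: summed pairwise distances between repeated colour picks for a single grapheme, here on made-up RGB values (batteries that score in CIELUV space follow the same pattern); the study's exact scoring scheme is not reproduced here.

    ```python
    # A minimal sketch of a colour-picking consistency score; lower scores
    # mean more consistent, more "synaesthete-like" choices.
    from itertools import combinations
    import math

    def consistency_score(picks):
        """Sum of pairwise Euclidean distances across repeated colour picks."""
        return sum(math.dist(a, b) for a, b in combinations(picks, 2))

    # Hypothetical data: RGB triples chosen for the letter 'A' on three trials.
    picks_for_a = [(250, 20, 30), (245, 25, 40), (240, 30, 35)]
    print(round(consistency_score(picks_for_a), 1))
    ```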

    Biomarker-Guided Antibiotic Duration for Hospitalized Patients With Suspected Sepsis: The ADAPT-Sepsis Randomized Clinical Trial

    © 2025 American Medical Association. All rights reserved. Importance: For hospitalized critically ill adults with suspected sepsis, procalcitonin (PCT) and C-reactive protein (CRP) monitoring protocols can guide the duration of antibiotic therapy, but the evidence of the effect and safety of these protocols remains uncertain. Objective: To determine whether decisions based on assessment of CRP or PCT safely result in a reduction in the duration of antibiotic therapy. Design, Setting, and Participants: A multicenter, intervention-concealed randomized clinical trial involving 2760 adults (≥18 years) in 41 UK National Health Service (NHS) intensive care units who required critical care within 24 hours of initiating intravenous antibiotics for suspected sepsis and were likely to continue antibiotics for at least 72 hours. Intervention: From January 1, 2018, to June 5, 2024, 918 patients were assigned to the daily PCT-guided protocol, 924 to the daily CRP-guided protocol, and 918 to standard care. Main Outcomes and Measures: The primary outcomes were total duration of antibiotics (effectiveness) and all-cause mortality (safety) to 28 days. Secondary outcomes included critical care unit data and hospital stay data. Ninety-day all-cause mortality was also collected. Results: Among the randomized patients (mean age, 60.2 [SD, 15.4] years; 60.3% males), there was a significant reduction in antibiotic duration from randomization to 28 days for those in the daily PCT-guided protocol compared with standard care (mean duration, 10.7 [SD, 7.6] days for standard care and 9.8 [SD, 7.2] days for PCT; mean difference, 0.88 days; 95% CI, 0.19 to 1.58; P =.01). For all-cause mortality up to 28 days, the daily PCT-guided protocol was noninferior to standard care, where the noninferiority margin was set at 5.4% (19.4% [170 of 878] of patients receiving standard care; 20.9% [184 of 879], PCT; absolute difference, 1.57; 95% CI, -2.18 to 5.32; P =.02). No difference was found in antibiotic duration for standard care vs the daily CRP-guided protocol (mean duration, 10.6 [SD, 7.7] days for CRP; mean difference, 0.09; 95% CI, -0.60 to 0.79; P =.79). For all-cause mortality, the daily CRP-guided protocol was inconclusive compared with standard care (21.1% [184 of 874] for CRP; absolute difference, 1.69; 95% CI, -2.07 to 5.45; P =.03). Conclusions and Relevance: Care guided by measurement of PCT safely reduces antibiotic duration compared with standard care, but CRP does not. All-cause mortality for CRP was inconclusive. Trial Registration: isrctn.org Identifier: ISRCTN47473244.
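
    The noninferiority result can be checked against the abstract's own counts. The sketch below recomputes the 28-day mortality risk difference and a normal-approximation 95% CI and compares the upper bound with the 5.4-point margin; the trial's exact statistical method may differ from this simple approximation.

    ```python
    # Recomputing the PCT-vs-standard-care noninferiority comparison from the
    # counts reported in the abstract (normal approximation).
    import math

    deaths_soc, n_soc = 170, 878   # standard care: 19.4% 28-day mortality
    deaths_pct, n_pct = 184, 879   # PCT-guided protocol: 20.9%
    margin = 5.4                   # prespecified margin, percentage points

    p1, p2 = deaths_soc / n_soc, deaths_pct / n_pct
    diff = (p2 - p1) * 100         # ~1.57 percentage points
    se = math.sqrt(p1 * (1 - p1) / n_soc + p2 * (1 - p2) / n_pct) * 100
    lo, hi = diff - 1.96 * se, diff + 1.96 * se   # ~(-2.18, 5.32)

    print(f"difference {diff:.2f}, 95% CI ({lo:.2f}, {hi:.2f})")
    # Noninferior only if the entire CI sits below the margin.
    print("noninferior" if hi < margin else "noninferiority not shown")
    ```

    The upper CI bound (about 5.32) falls just below the 5.4-point margin, which is why the PCT-guided protocol is declared noninferior while the CRP comparison (upper bound 5.45) is inconclusive.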

    Habitat Composition and Connectivity Predicts Bat Presence and Activity at Foraging Sites in a Large UK Conurbation

    Background: Urbanization is characterized by high levels of sealed land-cover, and small, geometrically complex, fragmented land-use patches. The extent and density of urbanized land-use is increasing, with implications for habitat quality, connectivity and city ecology. Little is known about densification thresholds for urban ecosystem function, and the responses of mammals and other nocturnal, cryptic taxa are poorly studied in this respect. Bats (Chiroptera) are sensitive to changing urban form at the species, guild and community levels, making them ideal model organisms for analyses of this nature. Methodology/Principal Findings: We surveyed bats around urban ponds in the West Midlands conurbation, United Kingdom (UK). Sites were stratified across five urban land classes, representing a gradient of built land-cover at the 1 km² scale. Models for bat presence and activity were developed using land-cover and land-use data from multiple radii around each pond. Structural connectivity of tree networks was used as an indicator of the functional connectivity between habitats. All species were sensitive to measures of urban density. Some were also sensitive to landscape composition and structural connectivity at different spatial scales. These results represent new findings for an urban area. The activity of Pipistrellus pipistrellus (Schreber 1774) exhibited a non-linear relationship with the area of built land-cover, being much reduced beyond a threshold of approximately 60% built surface. The presence of tree networks appears to mitigate the negative effects of urbanization for this species.
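
    A threshold response of this kind can be captured with a simple broken-stick model. The sketch below fits one to made-up activity data with a breakpoint fixed at 60% built cover; it is a deliberately simplified stand-in for the multi-radius models used in the study.

    ```python
    # A minimal broken-stick (piecewise linear) sketch of a threshold response.
    import numpy as np

    rng = np.random.default_rng(1)
    built = rng.uniform(0, 100, 200)   # % built land-cover around each pond
    threshold = 60.0                   # hypothetical breakpoint (~60%)

    # Simulated activity: roughly flat below the threshold, declining beyond it.
    excess = np.clip(built - threshold, 0, None)
    activity = 20 - 0.4 * excess + rng.normal(0, 2, 200)

    # Fit: intercept plus a slope on the part of built cover beyond 60%.
    X = np.column_stack([np.ones_like(built), excess])
    coef, *_ = np.linalg.lstsq(X, activity, rcond=None)
    print(f"baseline activity ~ {coef[0]:.1f}, "
          f"decline per % beyond 60% ~ {coef[1]:.2f}")
    ```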

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89⋅6 per cent) compared with that in countries with a middle (753 of 1242, 60⋅6 per cent; odds ratio (OR) 0⋅17, 95 per cent c.i. 0⋅14 to 0⋅21, P < 0⋅001) or low (363 of 860, 42⋅2 per cent; OR 0⋅08, 0⋅07 to 0⋅10, P < 0⋅001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference −9⋅4 (95 per cent c.i. −11⋅9 to −6⋅9) per cent; P < 0⋅001), but the relationship was reversed in low-HDI countries (+12⋅1 (+7⋅0 to +17⋅3) per cent; P < 0⋅001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0⋅60, 0⋅50 to 0⋅73; P < 0⋅001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
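
    The analysis style named in the Methods (multivariable logistic regression with a bootstrap) can be sketched as follows on simulated columns; the study's actual covariates, data and model specification are not reproduced here.

    ```python
    # A minimal sketch: adjusted odds ratio for checklist use on mortality,
    # with a patient-resampling bootstrap for the confidence interval.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 1000
    df = pd.DataFrame({
        "mortality": rng.binomial(1, 0.08, n),   # 30-day death (0/1)
        "checklist": rng.binomial(1, 0.7, n),    # checklist reported (0/1)
        "age": rng.normal(55, 15, n),
        "emergency": rng.binomial(1, 0.4, n),
    })

    fit = smf.logit("mortality ~ checklist + age + emergency", data=df).fit(disp=0)
    print(np.exp(fit.params["checklist"]))       # adjusted odds ratio

    # Bootstrap the checklist OR by resampling patients with replacement.
    ors = []
    for _ in range(200):
        boot = df.sample(n=n, replace=True)
        b = smf.logit("mortality ~ checklist + age + emergency", data=boot).fit(disp=0)
        ors.append(np.exp(b.params["checklist"]))
    print(np.percentile(ors, [2.5, 97.5]))       # bootstrap 95% CI
    ```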

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone.
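
    The risk-adjusted comparison in the Results can be sketched with a plain logistic regression on simulated columns named after the abstract's adjustment factors; the study's actual model was multilevel (hospital and country effects), which this simplified sketch omits.

    ```python
    # A minimal sketch: adjusted odds ratios for end colostomy formation.
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(2)
    n = 1635
    df = pd.DataFrame({
        "end_colostomy": rng.binomial(1, 0.22, n),  # colostomy vs anastomosis
        "low_hdi": rng.binomial(1, 0.07, n),        # low-HDI setting (0/1)
        "malignant": rng.binomial(1, 0.5, n),
        "emergency": rng.binomial(1, 0.3, n),
        "delay_48h": rng.binomial(1, 0.2, n),       # time to operation >= 48 h
        "perforation": rng.binomial(1, 0.4, n),
    })

    fit = smf.logit(
        "end_colostomy ~ low_hdi + malignant + emergency + delay_48h + perforation",
        data=df,
    ).fit(disp=0)
    print(np.exp(fit.params))   # adjusted odds ratios for each factor
    ```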