64 research outputs found

    Genome Sequence of the Saprophyte Leptospira biflexa Provides Insights into the Evolution of Leptospira and the Pathogenesis of Leptospirosis

    Get PDF
    Leptospira biflexa is a free-living saprophytic spirochete present in aquatic environments. We determined the genome sequence of L. biflexa, making it the first saprophytic Leptospira to be sequenced. The L. biflexa genome has 3,590 protein-coding genes distributed across three circular replicons: the major 3,604-kb chromosome, a smaller 278-kb replicon that also carries essential genes, and a third 74-kb replicon. Comparative sequence analysis provides evidence that L. biflexa is an excellent model for the study of Leptospira evolution; we conclude that 2,052 genes (61%) represent a progenitor genome that existed before divergence of pathogenic and saprophytic Leptospira species. Comparisons of the L. biflexa genome with two pathogenic Leptospira species reveal several major findings. Nearly one-third of the L. biflexa genes are absent in pathogenic Leptospira. We suggest that once incorporated into the L. biflexa genome, laterally transferred DNA undergoes minimal rearrangement due to physical restrictions imposed by high gene density and the limited presence of transposable elements. In contrast, the genomes of pathogenic Leptospira species undergo frequent rearrangements, often involving recombination between insertion sequences. Identification of genes common to the two pathogenic species, L. borgpetersenii and L. interrogans, but absent in L. biflexa, is consistent with a role for these genes in pathogenesis. Differences in the environmental sensing capacities of L. biflexa, L. borgpetersenii, and L. interrogans suggest a model in which loss of signal transduction functions in L. borgpetersenii has impaired its survival outside a mammalian host, whereas L. interrogans has retained environmental sensory functions that facilitate disease transmission through water.
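    The gene-content comparison summarized above can be illustrated with a minimal Python sketch that partitions ortholog groups into a shared (putative progenitor) core, saprophyte-specific genes, and genes shared only by the two pathogens. The ortholog-group identifiers and set sizes below are hypothetical placeholders, not data from the study; on real data the three input sets would come from an ortholog-clustering step across the genomes.

    # Minimal sketch of a gene-content comparison between a saprophyte and two
    # pathogens, assuming each genome is reduced to a set of ortholog-group IDs.
    # All identifiers below are hypothetical placeholders.

    def gene_content_summary(saprophyte, pathogen_a, pathogen_b):
        """Return counts analogous to the categories discussed in the abstract."""
        core = saprophyte & pathogen_a & pathogen_b                     # putative progenitor gene set
        saprophyte_only = saprophyte - (pathogen_a | pathogen_b)        # genes absent in pathogens
        pathogen_shared_only = (pathogen_a & pathogen_b) - saprophyte   # candidate pathogenesis-related genes
        return {
            "core": len(core),
            "core_fraction_of_saprophyte": len(core) / len(saprophyte),
            "saprophyte_only": len(saprophyte_only),
            "pathogen_shared_only": len(pathogen_shared_only),
        }

    if __name__ == "__main__":
        # Tiny toy example (hypothetical ortholog-group IDs).
        biflexa = {"og1", "og2", "og3", "og4", "og5"}
        borgpetersenii = {"og1", "og2", "og6"}
        interrogans = {"og1", "og2", "og6", "og7"}
        print(gene_content_summary(biflexa, borgpetersenii, interrogans))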

    Optimal and continuous anaemia control in a cohort of dialysis patients in Switzerland

    Get PDF
    BACKGROUND: Guidelines for the management of anaemia in patients with chronic kidney disease (CKD) recommend a minimal haemoglobin (Hb) target of 11 g/dL. Recent surveys indicate that this target is not met in many patients in Europe. In most studies, Hb is assessed only over a short period. The aim of this study was to examine the control of anaemia over a continuous long-term period in Switzerland. METHODS: A prospective multi-centre observational study was conducted in dialysed patients treated with recombinant human epoetin (EPO) beta over a one-year follow-up period, with monthly assessments of anaemia parameters. RESULTS: Three hundred and fifty patients from 27 centres, representing 14% of the dialysis population in Switzerland, were included. Mean Hb was 11.9 ± 1.0 g/dL and remained stable over time. Eighty-five percent of the patients achieved a mean Hb ≥ 11 g/dL. The mean EPO dose was 155 ± 118 IU/kg/week, delivered mostly by the subcutaneous route (64-71%). Mean serum ferritin and transferrin saturation were 435 ± 253 µg/L and 30 ± 11%, respectively. At month 12, adequate iron stores were found in 72.5% of patients, whereas absolute and functional iron deficiency were observed in only 5.1% and 17.8%, respectively. Multivariate analysis showed that diabetes unexpectedly influenced Hb towards higher levels (12.1 ± 0.9 g/dL; p = 0.02). One-year mortality was significantly lower in patients with Hb ≥ 11 g/dL than in those with Hb < 11 g/dL (7.3% vs 19.7%, p = 0.006). CONCLUSION: In comparison with European reference studies, this survey shows remarkable and continuous control of anaemia in Swiss dialysis centres. These results were achieved with moderately high EPO doses, mostly given subcutaneously, and careful management of iron therapy.
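    The iron-status categories reported above (adequate stores, absolute deficiency, functional deficiency) are conventionally defined from serum ferritin and transferrin saturation (TSAT). The sketch below uses typical guideline cut-offs (ferritin 100 µg/L, TSAT 20%); these thresholds are assumptions for illustration, not the study's stated criteria.

    # Hedged sketch: classify iron status from serum ferritin and TSAT.
    # The cut-offs (ferritin 100 microg/L, TSAT 20%) are commonly used guideline
    # values and are an assumption here, not the study's stated definitions.

    def iron_status(ferritin_ug_per_l: float, tsat_percent: float) -> str:
        if ferritin_ug_per_l < 100:
            return "absolute iron deficiency"    # depleted iron stores
        if tsat_percent < 20:
            return "functional iron deficiency"  # stores present but iron poorly available
        return "adequate iron stores"

    if __name__ == "__main__":
        # First call uses values close to the cohort means reported in the abstract.
        print(iron_status(435, 30))   # -> adequate iron stores
        print(iron_status(80, 25))    # -> absolute iron deficiency
        print(iron_status(300, 15))   # -> functional iron deficiency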

    Structure, Function, and Evolution of the Thiomonas spp. Genome

    Get PDF
    Bacteria of the Thiomonas genus are ubiquitous in extreme environments, such as arsenic-rich acid mine drainage (AMD). The genome of one of these strains, Thiomonas sp. 3As, was sequenced, annotated, and examined, revealing specific adaptations that allow this bacterium to survive and grow in its highly toxic environment. To explore genomic diversity as well as genetic evolution in Thiomonas spp., a comparative genomic hybridization (CGH) approach was used on eight different strains of the Thiomonas genus, including five strains of the same species. Our results suggest that the Thiomonas genome has evolved through the gain or loss of genomic islands and that this evolution is influenced by the specific environmental conditions in which the strains live.
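    As a rough illustration of the CGH approach described above, the sketch below compares per-gene hybridization signals of a test strain against the reference strain and flags genes whose log2 ratio falls below a cut-off as candidate absences (for example, genes on a variable genomic island). The signal values, gene names, and the -1.0 threshold are illustrative assumptions, not the study's data or parameters.

    # Minimal CGH-style presence/absence call: for each gene, compare the
    # hybridization signal of a test strain against the reference strain and
    # flag genes whose log2 ratio falls below a cut-off. Values are illustrative.
    import math

    def call_absent_genes(reference_signal, test_signal, log2_cutoff=-1.0):
        """Return gene IDs whose test/reference log2 ratio is below the cut-off."""
        absent = []
        for gene, ref in reference_signal.items():
            ratio = math.log2(test_signal.get(gene, 1e-9) / ref)
            if ratio < log2_cutoff:
                absent.append(gene)
        return absent

    if __name__ == "__main__":
        ref = {"arsB": 1200.0, "aioA": 950.0, "islandX_01": 800.0}
        strain = {"arsB": 1100.0, "aioA": 40.0, "islandX_01": 25.0}
        print(call_absent_genes(ref, strain))  # genes likely missing in this strain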

    ISARIC-COVID-19 dataset: A Prospective, Standardized, Global Dataset of Patients Hospitalized with COVID-19

    Get PDF

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance

    Get PDF
    INTRODUCTION: Investment in Africa over the past year with regard to severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing has led to a massive increase in the number of sequences, which, to date, exceeds 100,000 sequences generated to track the pandemic on the continent. These sequences have profoundly affected how public health officials in Africa have navigated the COVID-19 pandemic. RATIONALE: We demonstrate how the first 100,000 SARS-CoV-2 sequences from Africa have helped monitor the epidemic on the continent, how genomic surveillance expanded over the course of the pandemic, and how we adapted our sequencing methods to deal with an evolving virus. Finally, we also examine how viral lineages have spread across the continent in a phylogeographic framework to gain insights into the underlying temporal and spatial transmission dynamics for several variants of concern (VOCs). RESULTS: Our results indicate that the number of countries in Africa that can sequence the virus within their own borders is growing and that this is coupled with a shorter turnaround time from the time of sampling to sequence submission. Ongoing evolution necessitated the continual updating of primer sets, and, as a result, eight primer sets were designed in tandem with viral evolution and used to ensure effective sequencing of the virus. The pandemic unfolded through multiple waves of infection that were each driven by distinct genetic lineages, with B.1-like ancestral strains associated with the first pandemic wave of infections in 2020. Successive waves on the continent were fueled by different VOCs, with Alpha and Beta cocirculating in distinct spatial patterns during the second wave and Delta and Omicron affecting the whole continent during the third and fourth waves, respectively. Phylogeographic reconstruction points toward distinct differences in viral importation and exportation patterns associated with the Alpha, Beta, Delta, and Omicron variants and subvariants, when considering both Africa versus the rest of the world and viral dissemination within the continent. Our epidemiological and phylogenetic inferences therefore underscore the heterogeneous nature of the pandemic on the continent and highlight key insights and challenges, for instance, recognizing the limitations of low testing proportions. We also highlight the early warning capacity that genomic surveillance in Africa has had for the rest of the world with the detection of new lineages and variants, the most recent being the characterization of various Omicron subvariants. CONCLUSION: Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve. This is important not only to help combat SARS-CoV-2 on the continent but also because it can be used as a platform to help address the many emerging and reemerging infectious disease threats in Africa. In particular, capacity building for local sequencing within countries or within the continent should be prioritized because this is generally associated with shorter turnaround times, providing the most benefit to local public health authorities tasked with pandemic response and mitigation and allowing for the fastest reaction to localized outbreaks. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.
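    One of the metrics highlighted above is the turnaround time from sampling to sequence submission, which shortens when sequencing is done locally. The sketch below shows one plausible way to compute a per-country median turnaround time from collection and submission dates; the record fields and dates are hypothetical, not the study's dataset.

    # Hedged sketch: per-country median turnaround time (days from sample
    # collection to sequence submission). Records below are hypothetical.
    from collections import defaultdict
    from datetime import date
    from statistics import median

    records = [
        {"country": "A", "collected": date(2021, 6, 1),  "submitted": date(2021, 6, 20)},
        {"country": "A", "collected": date(2021, 7, 3),  "submitted": date(2021, 7, 18)},
        {"country": "B", "collected": date(2021, 6, 10), "submitted": date(2021, 9, 1)},
    ]

    delays = defaultdict(list)
    for r in records:
        delays[r["country"]].append((r["submitted"] - r["collected"]).days)

    for country, days in sorted(delays.items()):
        print(country, "median turnaround:", median(days), "days")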

    The evolving SARS-CoV-2 epidemic in Africa: Insights from rapidly expanding genomic surveillance.

    Get PDF
    Investment in severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) sequencing in Africa over the past year has led to a major increase in the number of sequences that have been generated and used to track the pandemic on the continent, a number that now exceeds 100,000 genomes. Our results show an increase in the number of African countries that are able to sequence domestically and highlight that local sequencing enables faster turnaround times and more-regular routine surveillance. Despite limitations of low testing proportions, findings from this genomic surveillance study underscore the heterogeneous nature of the pandemic and illuminate the distinct dispersal dynamics of variants of concern, particularly Alpha, Beta, Delta, and Omicron, on the continent. Sustained investment for diagnostics and genomic surveillance in Africa is needed as the virus continues to evolve while the continent faces many emerging and reemerging infectious disease threats. These investments are crucial for pandemic preparedness and response and will serve the health of the continent well into the 21st century.

    Neurological manifestations of COVID-19 in adults and children

    Get PDF
    Different neurological manifestations of coronavirus disease 2019 (COVID-19) in adults and children and their impact have not been well characterized. We aimed to determine the prevalence of neurological manifestations and in-hospital complications among hospitalized COVID-19 patients and ascertain differences between adults and children. We conducted a prospective multicentre observational study using the International Severe Acute Respiratory and emerging Infection Consortium (ISARIC) cohort across 1507 sites worldwide from 30 January 2020 to 25 May 2021. Analyses of neurological manifestations and neurological complications considered unadjusted prevalence estimates for predefined patient subgroups, and adjusted estimates as a function of patient age and time of hospitalization using generalized linear models. Overall, 161,239 patients (158,267 adults; 2,972 children) hospitalized with COVID-19 and assessed for neurological manifestations and complications were included. In adults and children, the most frequent neurological manifestations at admission were fatigue (adults: 37.4%; children: 20.4%), altered consciousness (20.9%; 6.8%), myalgia (16.9%; 7.6%), dysgeusia (7.4%; 1.9%), anosmia (6.0%; 2.2%) and seizure (1.1%; 5.2%). In adults, the most frequent in-hospital neurological complications were stroke (1.5%), seizure (1%) and CNS infection (0.2%). Each occurred more frequently in intensive care unit (ICU) than in non-ICU patients. In children, seizure was the only neurological complication to occur more frequently in ICU versus non-ICU (7.1% versus 2.3%, P < 0.001). Stroke prevalence increased with increasing age, while CNS infection and seizure steadily decreased with age. There was a dramatic decrease in stroke over time during the pandemic. Hypertension, chronic neurological disease and the use of extracorporeal membrane oxygenation were associated with increased risk of stroke. Altered consciousness was associated with CNS infection, seizure and stroke. All in-hospital neurological complications were associated with increased odds of death. The likelihood of death rose with increasing age, especially after 25 years of age. In conclusion, adults and children have different neurological manifestations and in-hospital complications associated with COVID-19. Stroke risk increased with increasing age, while CNS infection and seizure risk decreased with age.
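    The adjusted prevalence estimates mentioned above were obtained with generalized linear models of the outcomes as a function of patient age and time of hospitalization. The sketch below fits a comparable logistic GLM on synthetic data using statsmodels; it illustrates the model class only and is not the ISARIC analysis code, and the variable names and simulated effect sizes are assumptions.

    # Hedged sketch: logistic GLM of an in-hospital complication as a function of
    # age and time of hospitalization. Synthetic data; not the study's analysis.
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)
    n = 2000
    df = pd.DataFrame({
        "age": rng.uniform(1, 90, n),
        "month": rng.integers(0, 16, n),  # months since study start (assumption)
    })
    # Synthetic outcome: stroke risk rising with age, declining over calendar time.
    logit = -6 + 0.04 * df["age"] - 0.05 * df["month"]
    df["stroke"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    model = smf.glm("stroke ~ age + month", data=df,
                    family=sm.families.Binomial()).fit()
    print(model.summary())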

    Habitat Choice by Atlantic Salmon Parr in Relation to Turbulence at a Reach Scale

    Full text link
    The variables commonly used to describe the physical habitat of Atlantic salmon Salmo salar parr are average velocity, water depth, and substrate. A variety of micro- and mesohabitat models have been developed using these variables to assess habitat quality. However, Atlantic salmon parr live in highly turbulent streams and rivers in which intense fluctuations of water velocity occur. Laboratory experiments have shown that turbulence affects the behavior and energetics of fish. Nevertheless, habitat use in relation to the strong temporal variability of velocity in natural environments has rarely been studied. In this study, Atlantic salmon parr habitat was examined in relation to turbulence in the Patapedia River, Quebec. Rather than taking the usual approach of surveying a large population at one point in time, we used an intensive radiotelemetry tracking survey that focused on the habitat use of a few individual fish over an extended period. We analyzed habitat use in relation to several dynamic hydraulic variables. Our results revealed that under naturally turbulent conditions, the parr displayed high individual variability in their habitat use. Such heterogeneous use of habitat suggests that individuals are not constrained to a single habitat type. Furthermore, no differences were observed in habitat use among the four daily periods (dawn, day, dusk, and night) for individual parr.
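    A standard way to quantify the velocity fluctuations discussed above is turbulence intensity, the root-mean-square of the fluctuations divided by the mean velocity (TI = u_rms / u_mean). The sketch below computes it from a point velocity time series; the sampled values are synthetic, and the study's actual set of dynamic hydraulic variables may differ.

    # Hedged sketch: turbulence intensity TI = u_rms / u_mean from a point
    # velocity time series. Synthetic values; not the study's data.
    import math

    def turbulence_intensity(velocities):
        """RMS of velocity fluctuations divided by the mean velocity."""
        u_mean = sum(velocities) / len(velocities)
        fluctuations = [(u - u_mean) ** 2 for u in velocities]
        u_rms = math.sqrt(sum(fluctuations) / len(velocities))
        return u_rms / u_mean

    if __name__ == "__main__":
        # Hypothetical streamwise velocities (m/s) sampled at a parr position.
        series = [0.42, 0.55, 0.38, 0.61, 0.47, 0.52, 0.40, 0.58]
        print(f"Turbulence intensity: {turbulence_intensity(series):.2f}")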