20 research outputs found

    Réponse des cellules souches hématopoïétiques aux radiations ionisantes (Response of hematopoietic stem cells to ionizing radiation)

    No full text
    Le Kremlin-Bicêtre, Paris 11, BU Médecine (940432101) / Sudoc, France

    Thrombopoietin Administration Alleviates Hematopoietic Stem Cells Intrinsic Long Term Ionizing Radiation Damages. Identification of Preferential Cellular Expansion Sites

    Full text link
    Abstract Hematopoietic stem cells (HSC) are indispensable for the integrity of complex and long-lived organisms, since they can reconstitute the hematopoietic system for life and achieve long-term repopulation of lethally irradiated mice. Exposure of an organism to ionizing radiation (IR) causes dose-dependent bone marrow suppression and challenges the replenishment capacity of HSC. Yet, the precise damage that is generated remains largely unexplored. To better understand these effects, phenotypic and functional changes in the stem/progenitor compartments of sublethally irradiated mice were monitored over a ten-week period after radiation exposure. We report that shortly after sublethal IR exposure, HSC, defined by their repopulating ability, still segregate in the Hoechst dye-excluding side population (SP); yet, their Sca-1 (S) and c-Kit (K) expression levels are increased and severely reduced, respectively, with a concurrent increase in the proportion of SPSK cells positive for established indicators of HSC presence: CD150+/CD105+ and Tie2+. Virtually all HSC quickly but transiently mobilize to replenish the bone marrow of myelo-ablated mice. Ten weeks later, whereas bone marrow cellularity has recovered and hematopoietic homeostasis is restored, major phenotypic modifications can be observed within the c-Kit+ Sca-1+ Lin−/low (KSL) stem/progenitor compartment: CD150+/Flk2− and Flk2+ KSL cell frequencies are increased and dramatically reduced, respectively. CD150+ KSL cells also show impaired reconstitution capacity, accrued γ-H2AX foci and an increased tendency to apoptosis. This demonstrates that the KSL compartment is not properly restored 10 weeks after sublethal exposure, and that long-term IR-induced injury to the bone marrow proceeds, at least partially, through direct damage to the stem cell pool. Since thrombopoietin (TPO) has been shown to reduce hematopoietic injury when administered immediately after radiation exposure, we asked whether TPO could repair the permanent IR-induced damage we observed in the HSC compartment. We first found, in competitive transplant experiments, that a single TPO administration rescued the impaired reconstitution capacity of HSC from animals exposed to sublethal IR. In addition, we observed that TPO injection right after irradiation considerably attenuates IR-induced long-term injury to the stem/progenitor compartment. Finally, the use of marrow cells from transgenic, ubiquitously luciferase-expressing donors combined with bioluminescence imaging provided a valuable strategy for visualizing the improved homing of HSC from TPO-treated compared with untreated irradiated donors, and enabled the identification of preferential cellular expansion sites that were inaccessible to investigation in most studies. Electron microscopy analysis revealed that these sites also show differential megakaryocytopoietic activity, with marked differences in the proplatelets reaching the vascular sinus. Altogether, our data provide novel insights into the cellular response of HSC to IR and the beneficial effects of TPO administration on these cells.
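    The phenotypic compartments tracked above are defined purely by combinations of surface markers and dye exclusion. As a reading aid only, the following minimal Python sketch encodes those definitions as boolean gates; the classify() helper, the marker keys and the dictionary layout are illustrative assumptions, not part of the study's analysis pipeline.

        # Illustrative sketch (not the authors' pipeline): the phenotypic
        # definitions from the abstract expressed as simple boolean gates.
        # Marker keys and the classify() helper are hypothetical names.

        def classify(cell: dict) -> list:
            """Return the stem/progenitor labels matching a cell's markers."""
            labels = []
            # Side population (SP): Hoechst dye-excluding cells
            if cell.get("Hoechst_SP"):
                labels.append("SP")
                # SPSK: SP cells that are also Sca-1+ and c-Kit+
                if cell.get("Sca1") and cell.get("cKit"):
                    labels.append("SPSK")
            # KSL: c-Kit+ Sca-1+ Lin-/low stem/progenitor compartment
            if cell.get("cKit") and cell.get("Sca1") and not cell.get("Lin"):
                labels.append("KSL")
                # KSL subsets whose frequencies shift 10 weeks after IR
                if cell.get("CD150") and not cell.get("Flk2"):
                    labels.append("CD150+/Flk2- KSL")  # frequency increased
                if cell.get("Flk2"):
                    labels.append("Flk2+ KSL")         # frequency reduced
            return labels

        # Example: an SP cell that is Sca-1+ c-Kit+ Lin- CD150+ Flk2-
        print(classify({"Hoechst_SP": True, "Sca1": True, "cKit": True,
                        "Lin": False, "CD150": True, "Flk2": False}))
        # -> ['SP', 'SPSK', 'KSL', 'CD150+/Flk2- KSL']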

    Plasma Citrulline Level As a Biochemical Marker to Predict and Diagnose Graft-Versus-Host Disease

    Full text link
    Abstract Background: Graft-versus-host disease (GVHD), especially manifestations involving the gastro-intestinal (GI) tract, remains the major cause of morbidity and mortality after allogeneic stem cell transplantation (allo-SCT). Biochemical tools are needed both to predict and to prevent the development of acute GVHD. In addition, distinguishing infectious or toxic diarrhea from early manifestations of GI GVHD can be challenging. Citrulline is an amino acid produced by enterocytes of the small bowel, and plasma citrulline level (PCL) has been reported as a reliable quantitative marker of enterocyte mass and function. In an attempt to assess the impact of intestinal damage on the development of GVHD and on further outcomes after allo-SCT, we prospectively monitored PCL in 146 consecutive patients admitted to our unit between January 2013 and May 2014.
    Patients and methods: The study included 93 males and 53 females with a median age of 52 years (range, 18-69), who received an allo-SCT for a hematological malignancy (acute leukemia: n=78; lymphoma: n=20; myelodysplastic syndrome: n=17; myeloproliferative neoplasm: n=12; multiple myeloma: n=10; others: n=9). Fifty-one patients had progressive disease. Donors were HLA-identical siblings (n=50) or unrelated donors (n=96). The source of stem cells was bone marrow (BM; n=76), peripheral blood stem cells (PBSC; n=63) or cord blood (CB; n=7). Conditioning was myeloablative in 88 (60%) patients. Enteral nutrition (EN) was systematically offered and started shortly after conditioning; parenteral nutrition was provided when EN was refused or poorly tolerated. PCL was measured before conditioning (BC), monitored until day 30 after transplantation (d30), and assessed on the first day of admission in case of hospitalization for diarrhea.
    Results: After a median follow-up of 220 days (6-564), grade II-IV acute GVHD was observed in 31% of patients (n=45) (GI-GVHD: n=17; grade III-IV: n=15; corticoresistant GVHD: n=10) and chronic GVHD in 27% of patients (n=39) (extensive forms: n=13). The incidences of relapse and non-relapse mortality (NRM) were 18% and 21%, respectively. Mean PCL BC was 28 µmol/l (10-59). After a significant fall at d1, PCL increased slowly to a median of 16 µmol/l (0-52) at d15 and 17 µmol/l (1-66) at d30. PCL <15 µmol/l BC was associated with a higher risk of grade III-IV (p=0.008), GI (p=0.013) and corticoresistant (p<0.001) acute GVHD, and even of chronic GVHD (p=0.003). Although recipient age appeared protective of PCL (p=0.014), progressive disease BC was the only factor predictive of low citrulline BC in multivariate analysis [HR: 1.14; 95%CI 1.06-1.22] (p=0.001). Patients with PCL <15 µmol/l at d15 more often had grade II-IV acute GVHD than the others (49% vs 10%, p=0.002). Similar results were observed at d30 (p<0.001). It is noteworthy that in patients with PCL >15 µmol/l at d15 and/or d30, no grade III-IV, GI or corticoresistant acute GVHD was observed. In addition, the incidence of NRM in these patients was only 2% at d15 (vs 24%, p=0.002) and 3% at d30 (vs 25%, p=0.024). In multivariate analysis, only two factors influenced low d15 citrulline: EN refused or poorly tolerated [HR: 1.50; 95%CI 1.20-1.88] (p=0.001) and antithymocyte globulin incorporated in the conditioning [HR: 0.65; 95%CI 0.47-0.88] (p=0.001). D15 PCL <15 µmol/l was the only factor associated with grade II-IV acute GVHD [HR: 0.21; 95%CI 0.07-0.62] (p=0.005). Three factors were associated with better overall survival: d30 PCL >15 µmol/l [HR: 0.20; 95%CI 0.05-0.74] (p=0.016), BM graft vs PBSC/CB [HR: 0.19; 95%CI 0.05-0.79] (p=0.023) and good tolerance of EN [HR: 1.07; 95%CI 1.01-1.14] (p=0.026). Finally, 30 of the 146 patients were re-admitted for diarrhea post-transplant. The final diagnoses were GI-GVHD (n=20), infection (n=9) or functional diarrhea (n=1). In cases of confirmed GI-GVHD, mean PCL at admission was lower (5 µmol/l, range 1-9) than for the other diagnoses (22 µmol/l, range 8-47). PCL <15 µmol/l was significantly associated with GI-GVHD (p<0.001).
    Conclusion: Monitoring PCL until day 30 post-transplant seems simple and useful for identifying patients at higher risk of acute GVHD and NRM. Since d15 PCL correlates with EN tolerance and duration, a low PCL could be an indication for starting or prolonging EN in patients who receive allo-SCT. In addition, PCL could be an interesting diagnostic and decision-making tool in cases of post-transplant diarrhea. Disclosures: No relevant conflicts of interest to declare.
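    The thresholds above amount to a simple bedside rule: PCL below 15 µmol/l before conditioning, at d15 or at d30 flags a higher-risk patient. The short Python sketch below restates that rule for illustration only; the 15 µmol/l cut-off and the time points come from the abstract, while the function name and input format are assumptions.

        # Illustrative restatement of the PCL decision rule from the abstract.
        # The cut-off (15 µmol/l) and time points are taken from the text; the
        # helper name and dictionary layout are assumptions for this sketch.

        PCL_CUTOFF_UMOL_L = 15.0

        def flag_gvhd_risk(pcl: dict) -> dict:
            """Flag higher acute-GVHD/NRM risk from plasma citrulline (µmol/l).

            `pcl` maps time points ("baseline", "d15", "d30") to measured
            values; missing (None) values are skipped.
            """
            flags = {}
            for timepoint, value in pcl.items():
                if value is None:
                    continue
                flags[timepoint] = value < PCL_CUTOFF_UMOL_L
            return flags

        # Example: low citrulline at d15 that recovers by d30.
        print(flag_gvhd_risk({"baseline": 28.0, "d15": 12.0, "d30": 18.0}))
        # -> {'baseline': False, 'd15': True, 'd30': False}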

    Environmental controls on surf zone injuries on high-energy beaches

    No full text
    Abstract. The two primary causes of surf zone injuries (SZIs) worldwide, including fatal drowning and severe spinal injuries, are rip currents (rips) and shore-break waves. SZIs also result from surfing and bodyboarding activity. In this paper we address the primary environmental controls on SZIs along the high-energy meso-macrotidal surf beach coast of SW France. A total of 2523 SZIs recorded by lifeguards over 186 sample days during the summers of 2007, 2009 and 2015 were combined with measured and/or hindcast weather, wave, tide and beach morphology data. All SZIs occurred disproportionately on warm sunny days with low wind, likely because of increased beachgoer numbers and hazard exposure. Relationships were strongest for shore-break- and rip-related SZIs and weakest for surfing-related SZIs, the latter being also unaffected by tidal stage or range. Therefore, the analysis focused on bathers. Shore-break-related SZIs disproportionately occur during shore-normal incident waves with average to below-average wave height (significant wave height Hs = 0.75–1.5 m) and around higher water levels and large tide ranges, when waves break on the steepest section of the beach. In contrast, rip-related drownings occur disproportionately near neap low tide, coinciding with maximized channel rip flow activity, under shore-normal incident waves with Hs > 1.25 m and mean wave periods longer than 5 s. Additional drowning incidents occurred at spring high tide, presumably due to small-scale swash rips. The composite wave and tide parameters proposed by Scott et al. (2014) are key controlling factors determining SZI occurrence, although the risk ranges are not necessarily transferable to all sites. Summer beach and surf zone morphology is highly interannually variable, which is critical to SZI patterns. The upper beach slope can vary from 0.06 to 0.18 between summers, resulting in low and high numbers of shore-break-related SZIs, respectively. Summers with coast-wide highly (weakly) developed rip channels also result in widespread (scarce) rip-related drowning incidents. With life risk defined in terms of the number of people exposed to life-threatening hazards at a beach, the ability of morphodynamic models to simulate primary beach morphology characteristics a few weeks or months in advance is therefore of paramount importance to predict the primary surf-zone life risks along this coast.

    Environmental controls on surf zone injuries on high-energy beaches

    No full text
    Abstract. The two primary causes of surf zone injuries (SZIs) worldwide, including fatal drowning and severe spinal injuries, are rip currents (rips) and shore-break waves. SZIs also result from surfing and bodyboarding activity. In this paper we address the primary environmental controls on SZIs along the high-energy meso–macro-tidal surf beach coast of southwestern France. A total of 2523 SZIs recorded by lifeguards over 186 sample days during the summers of 2007, 2009 and 2015 were combined with measured and/or hindcast weather, wave, tide, and beach morphology data. All SZIs occurred disproportionately on warm sunny days with low wind, likely because of increased beachgoer numbers and hazard exposure. Relationships were strongest for shore-break- and rip-related SZIs and weakest for surfing-related SZIs, the latter being also unaffected by tidal stage or range. Therefore, the analysis focused on bathers. More shore-break-related SZIs occur during shore-normal incident waves with average to below-average wave height (significant wave height, Hs = 0.75–1.5 m) and around higher water levels and large tide ranges when waves break on the steepest section of the beach. In contrast, more rip-related drownings occur near neap low tide, coinciding with maximised channel rip flow activity, under shore-normal incident waves with Hs > 1.25 m and mean wave periods longer than 5 s. Additional drowning incidents occurred at spring high tide, presumably due to small-scale swash rips. The composite wave and tide parameters proposed by Scott et al. (2014) are key controlling factors determining SZI occurrence, although the risk ranges are not necessarily transferable to all sites. Summer beach and surf zone morphology is interannually highly variable, which is critical to SZI patterns. The upper beach slope can vary from 0.06 to 0.18 between summers, resulting in low and high shore-break-related SZIs, respectively. Summers with coast-wide highly (weakly) developed rip channels also result in widespread (scarce) rip-related drowning incidents. With life risk defined in terms of the number of people exposed to life-threatening hazards at a beach, the ability of morphodynamic models to simulate primary beach morphology characteristics a few weeks or months in advance is therefore of paramount importance for predicting the primary surf zone life risks along this coast.
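    The quoted wave and tide conditions can be read as a rough screening rule for the dominant hazard type. The Python sketch below only restates those ranges for illustration; it is not the authors' predictive model, and the function name, the categorical tide encoding and any bound not quoted in the abstract are simplifying assumptions.

        # Rough illustration of the hazard conditions described in the abstract.
        # Not the authors' model: the thresholds quoted in the text are reused,
        # everything else (names, tide encoding) is an assumption.

        def likely_hazards(hs_m, mean_period_s, tide_stage, tide_range):
            """Return the SZI hazard types favoured by the given conditions.

            hs_m          : significant wave height (m), shore-normal waves assumed
            mean_period_s : mean wave period (s)
            tide_stage    : "low" or "high"
            tide_range    : "neap", "mean" or "spring"
            """
            hazards = []
            # Shore-break injuries: average to below-average waves breaking on
            # the steep upper beach at higher water levels and large tide range.
            if 0.75 <= hs_m <= 1.5 and tide_stage == "high" and tide_range == "spring":
                hazards.append("shore-break injury")
            # Channel-rip drownings: neap low tide, Hs > 1.25 m, period > 5 s.
            if hs_m > 1.25 and mean_period_s > 5 and tide_stage == "low" and tide_range == "neap":
                hazards.append("rip-current drowning")
            return hazards

        print(likely_hazards(1.0, 6.0, "high", "spring"))  # ['shore-break injury']
        print(likely_hazards(1.6, 7.0, "low", "neap"))     # ['rip-current drowning']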

    Monitoring of Wilms’ Tumor 1 Expression As Minimal Residual Disease in Patients with Acute Myeloid Leukemia to Predict Relapse before and after Allogeneic Stem Cell Transplantation

    Full text link
    Abstract Background: Relapse remains the main cause of treatment failure in patients with acute myeloid leukemia (AML) after allogeneic stem cell transplantation (allo-SCT). The Wilms' tumor 1 gene (WT1) is reportedly over-expressed in 90% of patients with AML and could therefore be useful for monitoring minimal residual disease (MRD). However, the clinical utility of WT1 monitoring is still controversial, especially after allo-SCT. The aim of this study was to evaluate the usefulness of WT1 expression as a predictive marker of relapse before and after allo-SCT in 160 patients with AML treated in our unit between May 2005 and April 2014.
    Patients and methods: The study included 78 males and 82 females with a median age of 50 years (range, 18-68), who received an allo-SCT for AML. Hematologic status before transplant was complete remission (CR, n=127) or progressive disease (PD, n=33). Donors were HLA-identical siblings (n=57) or unrelated donors (n=103). The source of stem cells was bone marrow (BM; n=79), peripheral blood stem cells (n=39) or cord blood (n=9). Conditioning was myeloablative in 102 (64%) patients. WT1 expression was assessed by quantitative real-time PCR, and ELN criteria were used to define the cut-offs for positive MRD in BM and peripheral blood. Data were collected retrospectively at diagnosis, after the first induction chemotherapy, within a month before transplant, at day 100 post-transplant and at any time point after allo-SCT. A total of 1793 PCR assays were performed (BM: n=736; peripheral blood: n=1057), including 741 after allo-SCT.
    Results: After a median follow-up of 35 months (4-111), overall survival (OS), event-free survival (EFS), relapse and non-relapse mortality (NRM) were 58.8%, 55%, 30% and 13.1%, respectively. Thirty-six (22.5%) patients developed grade II-IV acute graft-versus-host disease (GVHD) and 55 (34.4%) chronic GVHD. MRD remained positive after induction chemotherapy in 47/103 (45.6%) evaluable patients, of whom 24 reached CR1 before allo-SCT and 23 relapsed. Among MRD-negative patients after induction, 25 (44.6%) relapsed before transplant. Positive post-induction and pre-transplant MRD were associated with progressive disease at transplant (p=0.27 and p<0.001, respectively). Only one patient with negative pre-transplant MRD was not in CR at transplant. Positive MRD post-transplant was significantly associated with the incidence of relapse (p<0.001). MRD was positive in 30/125 (24%) evaluable patients at day 100 post-transplant and in 54/140 (38.6%) evaluable patients at any time point after allo-SCT. Only 7 patients with positive MRD post-transplant did not relapse (MRD became negative without treatment during follow-up). Relapse was observed in 12/95 (12.6%) MRD-negative patients at day 100 and in only 1/86 (1.2%) patient with negative MRD at any time after allo-SCT. Post-transplant MRD turned positive on the same day as relapse in 26 cases and predicted relapse in 20 cases, with a median of 74 days (23-683) between the first MRD-positive sample and the diagnosis of relapse. Higher WT1 expression after induction chemotherapy, before transplant, at day 100 and at any time post-transplant was associated with inferior OS (p=0.03, 0.001, <0.001 and <0.001, respectively) and EFS (p=0.009, <0.001, <0.001 and <0.001, respectively). In multivariate analysis, positive MRD post-transplant was the strongest factor adversely influencing OS [HR: 6.42; 95%CI 3.28-12.58] (p<0.001) and EFS [HR: 10.86; 95%CI 5.45-21.72] (p<0.001).
    Conclusion: WT1 expression, even after allo-SCT, can serve as a reliable marker of MRD in AML. Monitoring WT1 expression after allo-SCT could allow early detection of relapse and the subsequent selection of patients who may benefit from modulation of immunosuppression or donor lymphocyte infusion (DLI). Disclosures: Quesnel: Oncoethix SA: Research Funding.
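    As a worked illustration of how qPCR-based WT1 MRD is commonly normalised, the sketch below expresses WT1 transcripts per 10^4 copies of the ABL control gene and applies tissue-specific positivity cut-offs. The cut-off values shown (250 for bone marrow, 50 for peripheral blood) are the widely cited ELN-derived figures and are used here as assumptions; they are not confirmed as the exact thresholds of this study, and the function name and input format are illustrative.

        # Illustrative sketch of WT1 MRD normalisation by quantitative RT-PCR.
        # WT1 transcripts are expressed per 10^4 copies of the ABL control gene.
        # The positivity cut-offs below (250 for bone marrow, 50 for peripheral
        # blood) are commonly cited ELN-derived values used here as assumptions,
        # not necessarily the exact thresholds applied in this study.

        CUTOFFS = {"BM": 250.0, "PB": 50.0}  # WT1 copies per 10^4 ABL copies

        def wt1_mrd(wt1_copies, abl_copies, tissue):
            """Return (normalised WT1 expression, MRD-positive flag) for a sample."""
            if abl_copies <= 0:
                raise ValueError("ABL copy number must be positive")
            normalised = wt1_copies / abl_copies * 1e4
            return normalised, normalised > CUTOFFS[tissue]

        # Example: a hypothetical day-100 bone marrow sample.
        expr, positive = wt1_mrd(wt1_copies=1200, abl_copies=30000, tissue="BM")
        print(f"{expr:.0f} WT1 copies / 10^4 ABL -> MRD {'positive' if positive else 'negative'}")
        # 400 WT1 copies / 10^4 ABL -> MRD positive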