Current worldwide nuclear cardiology practices and radiation exposure: results from the 65-country IAEA Nuclear Cardiology Protocols Cross-Sectional Study (INCAPS).
Cryptosporidium Priming Is More Effective than Vaccine for Protection against Cryptosporidiosis in a Murine Protein Malnutrition Model
Cryptosporidium is a major cause of severe diarrhea, especially in malnourished children. Using a murine model of C. parvum oocyst challenge that recapitulates clinical features of severe cryptosporidiosis during malnutrition, we interrogated the effect of protein malnutrition (PM) on primary and secondary responses to C. parvum challenge, and tested the differential ability of mucosal priming strategies to overcome the PM-induced susceptibility. We determined that while PM fundamentally alters systemic and mucosal primary immune responses to Cryptosporidium, priming with C. parvum (10⁶ oocysts) provides robust protective immunity against re-challenge despite ongoing PM. C. parvum priming restores mucosal Th1-type effectors (CD3+CD8+CD103+ T-cells) and cytokines (IFNγ and IL-12p40) that otherwise decrease with ongoing PM. Vaccination strategies with Cryptosporidium antigens expressed in the S. Typhi vector 908htr, however, do not enhance Th1-type responses to C. parvum challenge during PM, even though vaccination strongly boosts immunity in challenged, fully nourished hosts. Remote non-specific exposures to the attenuated S. Typhi vector alone or to the TLR9 agonist CpG ODN-1668 can partially attenuate C. parvum severity during PM, but neither as effectively as viable C. parvum priming. We conclude that although PM interferes with basal and vaccine-boosted immune responses to C. parvum, sustained reductions in disease severity are possible through mucosal activators of host defenses; specifically, C. parvum priming elicits impressively robust Th1-type protective immunity despite ongoing protein malnutrition. These findings add insight into potential correlates of Cryptosporidium immunity and future vaccine strategies in malnourished children.
Relationship of Circulating Soluble Urokinase Plasminogen Activator Receptor (suPAR) Levels to Disease Control in Asthma and Asthmatic Pregnancy
Asthma carries a high burden of morbidity if not controlled and frequently complicates pregnancy, posing a risk to pregnancy outcomes. An elevated plasma level of the inflammatory biomarker soluble urokinase plasminogen activator receptor (suPAR) is related to a worse prognosis in many conditions, such as infectious, autoimmune, or pregnancy-related diseases; however, the value of suPAR in asthma and asthmatic pregnancy is unknown. The present study aimed to investigate suPAR, CRP, and IL-6 levels in asthma (asthmatic non-pregnant, ANP; N = 38; female N = 27) and asthmatic pregnancy (AP; N = 15), compared to healthy non-pregnant controls (HNP; N = 29; female N = 19) and to healthy pregnant women (HP; N = 58). The relationship between suPAR levels and asthma control was also evaluated. The diagnostic efficacy of suPAR in asthma control was analyzed using ROC analysis. IL-6 and CRP levels were comparable in all study groups. Circulating suPAR levels were lower in HP and AP than in HNP and ANP subjects, respectively (2.01 [1.81-2.38] and 2.39 [2.07-2.69] vs. 2.60 [1.82-3.49] and 2.84 [2.33-3.72] ng/mL, respectively; p = 0.0001). suPAR and airway resistance correlated in ANP (r = 0.47, p = 0.004). ROC analysis of suPAR values in ANP patients with PEF above and below 80% yielded an AUC of 0.75 (95% CI: 0.57-0.92, p = 0.023), and with ACT total score above and below 20, an AUC of 0.80 (95% CI: 0.64-0.95, p = 0.006). The cut-off value of suPAR to discriminate between controlled and uncontrolled AP and ANP was 4.04 ng/mL. In conclusion, suPAR may aid the objective assessment of asthma control, since it correlates with airway resistance and has good sensitivity in the detection of impaired asthma control. The decrease in circulating suPAR levels detected in both healthy and asthmatic pregnant women presumably reflects pregnancy-induced immune tolerance.
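The ROC analysis reported above rests on a simple ranking statistic: the AUC equals the probability that a randomly chosen uncontrolled patient has a higher suPAR value than a randomly chosen controlled patient (ties counting half). As a generic illustration only (the values below are hypothetical, not the study's measurements), it can be computed directly:

```python
def roc_auc(positives, negatives):
    """AUC = probability that a randomly chosen positive case scores
    higher than a randomly chosen negative case (ties count 0.5)."""
    wins = 0.0
    for p in positives:
        for n in negatives:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(positives) * len(negatives))

# Hypothetical suPAR values (ng/mL) -- illustrative only.
uncontrolled = [4.2, 4.8, 3.9, 5.1]
controlled = [2.1, 2.9, 3.4, 4.0]
auc = roc_auc(uncontrolled, controlled)
```

An AUC of 0.5 means no discrimination; the study's values of 0.75 and 0.80 indicate moderate-to-good discrimination of asthma control status.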
Multi-messenger observations of a binary neutron star merger
On 2017 August 17, a binary neutron star coalescence candidate (later designated GW170817) with merger time 12:41:04 UTC was observed through gravitational waves by the Advanced LIGO and Advanced Virgo detectors. The Fermi Gamma-ray Burst Monitor independently detected a gamma-ray burst (GRB 170817A) with a time delay of ~1.7 s with respect to the merger time. From the gravitational-wave signal, the source was initially localized to a sky region of 31 deg² at a luminosity distance of 40 (+8/−8) Mpc and with component masses consistent with neutron stars. The component masses were later measured to be in the range 0.86 to 2.26 M⊙. An extensive observing campaign was launched across the electromagnetic spectrum, leading to the discovery of a bright optical transient (SSS17a, now with the IAU identification AT 2017gfo) in NGC 4993 (at ~40 Mpc) less than 11 hours after the merger by the One-Meter, Two Hemisphere (1M2H) team using the 1 m Swope Telescope. The optical transient was independently detected by multiple teams within an hour. Subsequent observations targeted the object and its environment. Early ultraviolet observations revealed a blue transient that faded within 48 hours. Optical and infrared observations showed a redward evolution over ~10 days. Following early non-detections, X-ray and radio emission were discovered at the transient's position ~9 and ~16 days, respectively, after the merger. Both the X-ray and radio emission likely arise from a physical process that is distinct from the one that generates the UV/optical/near-infrared emission. No ultra-high-energy gamma-rays and no neutrino candidates consistent with the source were found in follow-up searches. These observations support the hypothesis that GW170817 was produced by the merger of two neutron stars in NGC 4993, followed by a short gamma-ray burst (GRB 170817A) and a kilonova/macronova powered by the radioactive decay of r-process nuclei synthesized in the ejecta.
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background: Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world.
Methods: This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231.
Findings: Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p < 0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p < 0·001).
Interpretation: Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in low- and middle-income countries (LMICs) are needed to assess measures aiming to reduce this preventable complication.
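The adjusted odds ratio above comes from Bayesian multilevel logistic regression; for intuition, the unadjusted (crude) odds ratio for low-HDI versus high-HDI countries can be computed directly from the counts reported in the Findings. This is a sketch of the arithmetic only, not the study's actual analysis, which adjusts for case-mix:

```python
def odds_ratio(events_a, total_a, events_b, total_b):
    """Crude (unadjusted) odds ratio of group A relative to group B,
    from event counts and group totals in a 2x2 table."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

# SSI counts reported above: low-HDI (298 of 1282) vs high-HDI (691 of 7339).
crude_or = odds_ratio(298, 1282, 691, 7339)
```

The crude OR (~2.9) is far larger than the adjusted OR of 1.60, illustrating how much of the raw low- vs high-HDI contrast is accounted for by differences in patient and operative case-mix.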
Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.
BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 high-, 1540 middle-, and 507 low-HDI groups). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSI (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112.
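Propensity-score matching, as used above, pairs each laparoscopy patient with the open-surgery patient whose estimated probability of receiving laparoscopy is closest, so that matched groups are comparable on measured confounders. A minimal greedy 1:1 nearest-neighbour sketch, with hypothetical scores rather than the study's fitted model:

```python
def nearest_match(treated, control):
    """Greedy 1:1 nearest-neighbour matching on a scalar propensity
    score; returns (treated, control) pairs, each control used once."""
    pool = sorted(control)
    pairs = []
    for t in sorted(treated):
        if not pool:
            break  # no unmatched controls remain
        c = min(pool, key=lambda x: abs(x - t))
        pool.remove(c)
        pairs.append((t, c))
    return pairs

# Hypothetical propensity scores (probability of laparoscopy).
lap_scores = [0.30, 0.55, 0.62]
open_scores = [0.28, 0.33, 0.50, 0.70]
matched = nearest_match(lap_scores, open_scores)
```

Outcomes are then compared only within the matched pairs; real implementations typically also impose a caliper (a maximum allowed score distance) before accepting a match.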
Worldwide diagnostic reference levels for single-photon emission computed tomography myocardial perfusion imaging: findings from INCAPS.
OBJECTIVES: This study sought to establish worldwide and regional diagnostic reference levels (DRLs) and achievable administered activities (AAAs) for single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI). BACKGROUND: Reference levels serve as radiation dose benchmarks to compare individual laboratories against aggregated data, helping to identify sites in greatest need of dose reduction interventions. DRLs for SPECT MPI have previously been derived from national or regional registries. To date there have been no multiregional reports of DRLs for SPECT MPI from a single standardized dataset. METHODS: Data were submitted voluntarily to the INCAPS (International Atomic Energy Agency Nuclear Cardiology Protocols Study), a cross-sectional, multinational registry of MPI protocols. A total of 7,103 studies were included. DRLs and AAAs were calculated by protocol for each world region and for aggregated worldwide data. RESULTS: The aggregated worldwide DRLs for rest-stress or stress-rest studies employing technetium Tc 99m-labeled radiopharmaceuticals were 11.2 mCi (first dose) and 32.0 mCi (second dose) for 1-day protocols, and 23.0 mCi (first dose) and 24.0 mCi (second dose) for multiday protocols. Corresponding AAAs were 10.1 mCi (first dose) and 28.0 mCi (second dose) for 1-day protocols, and 17.8 mCi (first dose) and 18.7 mCi (second dose) for multiday protocols. For stress-only technetium Tc 99m studies, the worldwide DRL and AAA were 18.0 mCi and 12.5 mCi, respectively. Stress-first imaging was used in 26% to 92% of regional studies except in North America where it was used in just 7% of cases. Significant differences in DRLs and AAAs were observed between regions. CONCLUSIONS: This study reports reference levels for SPECT MPI for each major world region from one of the largest international registries of clinical MPI studies. 
Regional DRLs may be useful in establishing or revising guidelines, or simply in comparing individual laboratory protocols to regional trends. Organizations should continue to focus on establishing standardized reporting methods to improve the validity and comparability of regional DRLs.
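Diagnostic reference levels of this kind are conventionally set at the 75th percentile of the observed dose distribution, with achievable activities at the median; assuming that convention (the registry's exact method may differ), the computation reduces to a simple percentile over the pooled administered activities:

```python
def percentile(values, q):
    """Linear-interpolation percentile (q in [0, 100]) of a sample."""
    s = sorted(values)
    k = (len(s) - 1) * q / 100
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (s[c] - s[f]) * (k - f)

# Hypothetical administered activities (mCi) reported by several labs.
doses = [8.0, 9.5, 10.0, 10.5, 11.0, 12.0, 12.5, 14.0]
drl = percentile(doses, 75)  # reference level: 75th percentile
aaa = percentile(doses, 50)  # achievable activity: median
```

A laboratory whose typical administered activity exceeds the DRL for its protocol would be flagged as a candidate for dose-reduction intervention.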
A comprehensive introduction to the genetic basis of non-syndromic hearing loss in the Saudi Arabian population
Background: Hearing loss is a clinically and genetically heterogeneous disorder. Mutations at the DFNB1 locus have been reported to be the most common cause of autosomal recessive non-syndromic hearing loss worldwide. Apart from DFNB1, many other loci and their underlying genes have also been identified, and the basis of our study was to provide a comprehensive introduction to the delineation of the molecular basis of non-syndromic hearing loss in the Saudi Arabian population. This was performed by screening DFNB1 and initiating prioritized linkage analysis or homozygosity mapping for a pilot number of families in which DFNB1 had been excluded. Methods: Individuals from 130 families of Saudi Arabian tribal origin diagnosed with autosomal recessive non-syndromic sensorineural hearing loss were screened for mutations at the DFNB1 locus by direct sequencing. If negative, genome-wide linkage analysis or homozygosity mapping was performed using Affymetrix GeneChip® Human Mapping 250K/6.0 Arrays to identify regions containing known deafness-causing genes, which were subsequently sequenced. Results: Our results strongly indicate that DFNB1 accounts for only 3% of non-syndromic hearing loss in this Saudi Arabian population. Prioritized linkage analysis or homozygosity mapping in five separate families established that their hearing loss was caused by five different known deafness-causing genes, confirming the genetic heterogeneity of this disorder in the kingdom. Conclusion: The overall results of this study strongly suggest that the molecular basis of autosomal recessive non-syndromic deafness in Saudi Arabia is highly genetically heterogeneous. In addition, preliminary results indicate that there do not appear to be any common or more prevalent loci, genes, or mutations among patients of Saudi Arabian tribal origin with autosomal recessive non-syndromic hearing loss.
Protocol for a randomised, double-blind, placebo-controlled study of grass allergen immunotherapy tablet for seasonal allergic rhinitis: time course of nasal, cutaneous and immunological outcomes
Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy
Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89⋅6 per cent) compared with that in countries with a middle (753 of 1242, 60⋅6 per cent; odds ratio (OR) 0⋅17, 95 per cent c.i. 0⋅14 to 0⋅21, P < 0⋅001) or low (363 of 860, 42⋅2 per cent; OR 0⋅08, 0⋅07 to 0⋅10, P < 0⋅001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference −9⋅4 (95 per cent c.i. −11⋅9 to −6⋅9) per cent; P < 0⋅001), but the relationship was reversed in low-HDI countries (+12⋅1 (+7⋅0 to +17⋅3) per cent; P < 0⋅001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0⋅60, 0⋅50 to 0⋅73; P < 0⋅001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
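The "bootstrapped simulation" named in the Methods can be illustrated generically: resample each group with replacement many times, recompute the quantity of interest each time, and read an uncertainty interval off the resulting distribution. The sketch below uses hypothetical 0/1 mortality outcomes and a plain percentile interval; it is not the study's data or exact procedure:

```python
import random


def bootstrap_diff(group_a, group_b, reps=2000, seed=42):
    """95% percentile interval for the difference in event rates
    between two binary-outcome groups, via resampling with replacement."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(reps):
        a = [rng.choice(group_a) for _ in group_a]
        b = [rng.choice(group_b) for _ in group_b]
        diffs.append(sum(a) / len(a) - sum(b) / len(b))
    diffs.sort()
    return diffs[int(0.025 * reps)], diffs[int(0.975 * reps)]


# Hypothetical 0/1 outcomes (1 = death), illustrative only.
checklist = [1] * 6 + [0] * 94      # 6% mortality with checklist
no_checklist = [1] * 10 + [0] * 90  # 10% mortality without
lo, hi = bootstrap_diff(checklist, no_checklist)
```

An interval that excludes zero would suggest a real difference in rates; the study's actual inference additionally adjusts for patient and disease factors via multivariable models.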
