Extracellular Hsp72 concentration relates to a minimum endogenous criteria during acute exercise-heat exposure
Extracellular heat-shock protein 72 (eHsp72) concentration increases during exercise-heat stress when conditions elicit physiological strain. Differences in the severity of environmental and exercise stimuli have elicited varied responses to stress. The present study aimed to quantify the increase in eHsp72 with increasing exogenous heat stress, and to determine related endogenous markers of strain in an exercise-heat model. Ten males cycled for 90 min at 50% V̇O2peak in three conditions (TEMP, 20°C/63% RH; HOT, 30.2°C/51% RH; VHOT, 40.0°C/37% RH). Plasma was analysed for eHsp72 pre, immediately post and 24 h post each trial using a commercially available ELISA. Increased eHsp72 concentration was observed post VHOT trial (+172.4%) (P<0.05), but not after the TEMP (-1.9%) or HOT (+25.7%) conditions. eHsp72 returned to baseline values within 24 h in all conditions. Differences were observed in rectal temperature (Trec), rate of Trec increase, area under the curve for Trec above 38.5°C and 39.0°C, duration Trec ≥ 38.5°C and ≥ 39.0°C, and change in muscle temperature between VHOT and both TEMP and HOT, but not between TEMP and HOT. Each successively hotter condition also elicited significantly greater physiological strain, described by sweat rate, heart rate, physiological strain index, rating of perceived exertion and thermal sensation. Stepwise multiple regression identified rate of Trec increase and change in Trec as predictors of increased eHsp72 concentration. These data suggest that eHsp72 concentration increases once systemic temperature and sympathetic activity exceed a minimum endogenous criterion, elicited here during VHOT conditions, and is likely modulated by large, rapid changes in core temperature.
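The thermal-dose measures above (area under the Trec curve above 38.5°C and 39.0°C) reduce to trapezoidal integration of the temperature excess over a threshold. The sketch below illustrates that calculation; the function name, sampling interval and temperature values are illustrative assumptions, not data from the study.

```python
# Minimal sketch (made-up data): area under the rectal-temperature curve
# above a threshold, computed with the trapezoidal rule.
import numpy as np

def auc_above_threshold(time_min, trec_c, threshold_c=38.5):
    """Thermal dose (degC.min) of rectal temperature above `threshold_c`."""
    t = np.asarray(time_min, dtype=float)
    excess = np.clip(np.asarray(trec_c, dtype=float) - threshold_c, 0.0, None)
    # Trapezoidal rule: mean segment height times segment width, summed.
    return float(np.sum((excess[:-1] + excess[1:]) / 2.0 * np.diff(t)))

# Example: Trec sampled every 10 min over a 90-min trial (hypothetical values).
t = np.arange(0, 100, 10)  # minutes
trec = [37.0, 37.4, 37.9, 38.3, 38.6, 38.8, 39.0, 39.1, 39.2, 39.2]
print(f"AUC above 38.5 degC: {auc_above_threshold(t, trec):.1f} degC.min")
print(f"AUC above 39.0 degC: {auc_above_threshold(t, trec, 39.0):.1f} degC.min")
```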
Optimizing treatment with tumour necrosis factor inhibitors in rheumatoid arthritis—a proof of principle and exploratory trial: is dose tapering practical in good responders?
Objectives: RA patients receiving TNF inhibitors (TNFi) usually maintain their initial doses. The aim of the Optimizing Treatment with Tumour Necrosis Factor Inhibitors in Rheumatoid Arthritis trial was to evaluate whether tapering TNFi doses causes loss of clinical response. Methods: We enrolled RA patients receiving etanercept or adalimumab and a DMARD with DAS28 under 3.2 for over 3 months. Initially (months 0-6) patients were randomized to control (constant TNFi) or two experimental groups (tapering TNFi by 33 or 66%). Subsequently (months 6-12) control subjects were randomized to taper TNFi by 33 or 66%. Disease flares (DAS28 increasing ⩾0.6 with at least one additional swollen joint) were the primary outcome. Results: Two hundred and forty-four patients were screened, 103 randomized and 97 treated. In months 0-6 there were 8/50 (16%) flares in controls, 3/26 (12%) with 33% tapering and 6/21 (29%) with 66% tapering. Multivariate Cox analysis showed time to flare was unchanged with 33% tapering but was reduced with 66% tapering compared with controls (adjusted hazard ratio 2.81, 95% CI: 0.99, 7.94; P = 0.051). Analysing all tapered patients after controls were re-randomized (months 6-12) showed differences between groups: there were 6/48 (13%) flares with 33% tapering and 14/39 (36%) with 66% tapering. Multivariate Cox analysis showed 66% tapering reduced time to flare (adjusted hazard ratio 3.47, 95% CI: 1.26, 9.58; P = 0.016). Conclusion: Tapering TNFi by 33% has no impact on disease flares and appears practical in patients in sustained remission and low disease activity states. Trial registration: EudraCT, https://www.clinicaltrialsregister.eu, 2010-020738-24; ISRCTN registry, https://www.isrctn.com, 28955701
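As a rough illustration of the time-to-flare analysis reported above, the sketch below fits a multivariable Cox proportional hazards model with a tapering-group covariate using the lifelines package. The data frame, covariate names and values are simulated placeholders, not the trial dataset or its actual adjustment variables.

```python
# Illustrative sketch only: multivariable Cox model of time to flare.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 97  # number of treated patients reported above
df = pd.DataFrame({
    "taper66": rng.integers(0, 2, n),           # 1 = 66% taper, 0 = 33% taper
    "baseline_das28": rng.normal(2.4, 0.4, n),  # hypothetical adjustment covariate
    "weeks_to_flare": rng.exponential(40, n).clip(1, 26),
    "flare": rng.integers(0, 2, n),             # 1 = flare observed, 0 = censored
})

cph = CoxPHFitter()
cph.fit(df, duration_col="weeks_to_flare", event_col="flare")
cph.print_summary()  # exp(coef) column gives adjusted hazard ratios with 95% CIs
```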
Bioaccumulation and ecotoxicity of carbon nanotubes
Carbon nanotubes (CNT) have numerous industrial applications and may be released to the environment. In the aquatic environment, pristine or functionalized CNT show different dispersion behaviour, potentially leading to different risks of exposure along the water column. Data included in this review indicate that CNT do not readily cross biological barriers. When internalized, only a minimal fraction of CNT translocates into organism body compartments. The reported CNT toxicity depends on exposure conditions, model organism, CNT type, dispersion state and concentration. In ecotoxicological tests, aquatic organisms were generally found to be more sensitive than terrestrial organisms, invertebrates more sensitive than vertebrates, and single-walled CNT more toxic than double-/multi-walled CNT. Generally, the effect concentrations documented in the literature were above current modeled average environmental concentrations. Measurement data are needed to estimate environmental no-effect concentrations, and future studies with benchmark materials are needed to generate comparable results. Such studies should include better characterization of the starting materials, the dispersions and the biological fate, to provide better knowledge of exposure/effect relationships.
QCD and strongly coupled gauge theories: challenges and perspectives
We highlight the progress, current status, and open challenges of QCD-driven physics, in theory and in experiment. We discuss how the strong interaction is intimately connected to a broad sweep of physical problems, in settings ranging from astrophysics and cosmology to strongly coupled, complex systems in particle and condensed-matter physics, as well as to searches for physics beyond the Standard Model. We also discuss how success in describing the strong interaction impacts other fields, and, in turn, how such subjects can impact studies of the strong interaction. In the course of the work we offer a perspective on the many research streams which flow into and out of QCD, as well as a vision for future developments.
Vancomycin-resistant Enterococcus faecium sequence type 796 - rapid international dissemination of a new epidemic clone
Background: Vancomycin-resistant Enterococcus faecium (VRE) is a leading cause of hospital-acquired infections. New, presumably better-adapted strains of VRE appear unpredictably; it is uncertain how they spread despite improved infection control. We aimed to investigate the relatedness of a novel sequence type (ST) of vanB E. faecium - ST796 - very near its time of origin, using isolates from hospitals in three Australian states and New Zealand. Methods: Following near-simultaneous outbreaks of ST796 in multiple institutions, we collected colonization and bloodstream infection isolates and characterized their antimicrobial resistance (AMR) phenotypes and phylogenomic relationships using whole genome sequencing (WGS). Patient metadata were explored to trace the spread of ST796. Results: A novel clone of vanB E. faecium (ST796) was first detected at one Australian hospital in late 2011, then in two New Zealand hospitals linked by inter-hospital transfers from separate Melbourne hospitals. ST796 also appeared in hospitals in South Australia and New South Wales and was responsible for at least one major colonization outbreak in a neonatal intensive care unit, without identifiable links between centers. No exceptional AMR was detected in the isolates. While WGS analysis showed very limited diversity at the core genome, consistent with recent emergence of the clone, clustering by institution was observed. Conclusions: Evolution of new E. faecium clones, followed by recognized or unrecognized movement of colonized individuals and rapid intra-institutional cross-transmission, best explains the multicenter, multistate and international outbreak we observed.
Genomic and SNP Analyses Demonstrate a Distant Separation of the Hospital and Community-Associated Clades of Enterococcus faecium
Recent studies have pointed to the existence of two subpopulations of Enterococcus faecium, one containing primarily commensal/community-associated (CA) strains and one containing most clinical or hospital-associated (HA) strains, including those classified by multi-locus sequence typing (MLST) as belonging to the CC17 group. The HA subpopulation more frequently carries IS16, pathogenicity island(s), and plasmids or genes associated with antibiotic resistance, colonization, and/or virulence. Supporting the two-clade concept, we previously found a 3–10% difference between four genes from HA-clade strains vs. CA-clade strains, including a 5% difference between pbp5-R of ampicillin-resistant HA strains and pbp5-S of ampicillin-sensitive CA strains. To further investigate the core genome of these subpopulations, we studied 100 genes from 21 E. faecium genome sequences; our analyses of concatenated sequences, SNPs, and individual genes all identified two distinct groups. With the concatenated sequence, HA-clade strains differed by 0–1% from one another, while CA-clade strains differed from each other by 0–1.1%, with a 3.5–4.2% difference between the two clades. While many strains had a few genes that grouped in one clade with most of their genes in the other clade, one strain had 28% of its genes in the CA clade and 72% in the HA clade, consistent with the predicted role of recombination in the evolution of E. faecium. Using mutation-rate estimates for Escherichia coli, molecular clock calculations based on sSNP analysis indicate that these two clades may have diverged ≥1 million years ago or, using the higher mutation rate for Bacillus anthracis, ∼300,000 years ago. These data confirm the existence of two clades of E. faecium and show that the differences between the HA and CA clades occur at the core genomic level and long preceded the modern antibiotic era.
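The molecular clock estimate above is, at its core, simple arithmetic: divergence time ≈ synonymous distance per site divided by twice the substitution rate per site per year. The sketch below shows that calculation; the distance and rate values are placeholders chosen only to reproduce the order of magnitude quoted above, not the figures used in the study.

```python
# Worked-arithmetic sketch of a molecular-clock divergence estimate.
def divergence_time_years(subs_per_site, rate_per_site_per_year):
    # Factor of 2: both lineages accumulate substitutions since the split.
    return subs_per_site / (2.0 * rate_per_site_per_year)

d_s = 3e-4                 # hypothetical synonymous distance between clades
rate_slow = 1.5e-10        # hypothetical slow, E. coli-like rate (per site per year)
rate_fast = 5e-10          # hypothetical faster, B. anthracis-like rate

print(f"Slow clock: ~{divergence_time_years(d_s, rate_slow):,.0f} years")   # ~1,000,000
print(f"Fast clock: ~{divergence_time_years(d_s, rate_fast):,.0f} years")   # ~300,000
```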
Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.
BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 high-, 1540 middle-, and 507 low-HDI groups). Surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries after adjustment. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy was still associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSI (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and its availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112
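The case-mix adjustment described above combines a propensity model for receiving laparoscopy with a comparison of outcomes in matched groups. The sketch below shows one minimal version of that workflow on simulated data: a logistic propensity model, greedy 1:1 nearest-neighbour matching, and a crude matched-sample odds ratio for SSI. The covariates, effect sizes and matching routine are illustrative assumptions, not the study's actual analysis.

```python
# Minimal propensity-score matching sketch (simulated data, not the study dataset).
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "age": rng.normal(35, 12, n),
    "male": rng.integers(0, 2, n),
    "perforated": rng.integers(0, 2, n),
})
# Hypothetical treatment assignment and outcome, for illustration only.
p_lap = 1 / (1 + np.exp(-(-0.5 + 0.01 * df.age - 0.8 * df.perforated)))
df["laparoscopy"] = rng.random(n) < p_lap
p_ssi = 1 / (1 + np.exp(-(-2.0 + 0.9 * df.perforated - 0.8 * df.laparoscopy)))
df["ssi"] = rng.random(n) < p_ssi

# 1) Propensity of receiving laparoscopy given case-mix covariates.
covars = ["age", "male", "perforated"]
ps_model = LogisticRegression(max_iter=1000).fit(df[covars], df["laparoscopy"])
df["ps"] = ps_model.predict_proba(df[covars])[:, 1]

# 2) Greedy 1:1 nearest-neighbour matching on the propensity score.
treated = df[df.laparoscopy]
controls = df[~df.laparoscopy].copy()
pairs = []
for _, row in treated.iterrows():
    if controls.empty:
        break
    j = (controls["ps"] - row["ps"]).abs().idxmin()
    pairs.append((row["ssi"], controls.loc[j, "ssi"]))
    controls = controls.drop(index=j)

# 3) Crude matched-sample odds ratio for SSI (laparoscopy vs open).
t_ssi = np.array([t for t, _ in pairs], dtype=bool)
c_ssi = np.array([c for _, c in pairs], dtype=bool)
a, b = t_ssi.sum(), (~t_ssi).sum()
c, d = c_ssi.sum(), (~c_ssi).sum()
print(f"Matched-sample OR for SSI: {(a * d) / (b * c):.2f}")
```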
Prognostic model to predict postoperative acute kidney injury in patients undergoing major gastrointestinal surgery based on a national prospective observational cohort study.
Background: Acute illness, existing co-morbidities and the surgical stress response can all contribute to postoperative acute kidney injury (AKI) in patients undergoing major gastrointestinal surgery. The aim of this study was to prospectively develop a pragmatic prognostic model to stratify patients according to their risk of developing AKI after major gastrointestinal surgery. Methods: This prospective multicentre cohort study included consecutive adults undergoing elective or emergency gastrointestinal resection, liver resection or stoma reversal in 2-week blocks over a continuous 3-month period. The primary outcome was the rate of AKI within 7 days of surgery. Bootstrap stability was used to select clinically plausible risk factors for inclusion in the model. Internal validation was performed by bootstrapping. Results: A total of 4544 patients were included across 173 centres in the UK and Ireland. The overall rate of AKI was 14·2 per cent (646 of 4544) and the 30-day mortality rate was 1·8 per cent (84 of 4544). Stage 1 AKI was significantly associated with 30-day mortality (unadjusted odds ratio 7·61, 95 per cent c.i. 4·49 to 12·90; P < 0·001), with increasing odds of death with each AKI stage. Six variables were selected for inclusion in the prognostic model: age, sex, ASA grade, preoperative estimated glomerular filtration rate, planned open surgery and preoperative use of either an angiotensin-converting enzyme inhibitor or an angiotensin receptor blocker. Internal validation demonstrated good model discrimination (c-statistic 0·65). Discussion: Following major gastrointestinal surgery, AKI occurred in one in seven patients. This preoperative prognostic model identified patients at high risk of postoperative AKI. Validation in an independent data set is required to ensure generalizability.
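A minimal sketch of the modelling approach described above: a logistic prognostic model for postoperative AKI whose c-statistic is estimated with a bootstrap optimism correction. The predictors loosely mirror those listed in the abstract, but all variables, coefficients and data below are simulated stand-ins; the study's actual variable-selection procedure and dataset are not reproduced here.

```python
# Hedged sketch (simulated data): logistic AKI model with bootstrap-corrected c-statistic.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 4544
X = pd.DataFrame({
    "age": rng.normal(62, 15, n),
    "male": rng.integers(0, 2, n),
    "egfr": rng.normal(75, 20, n),
    "open_surgery": rng.integers(0, 2, n),
    "acei_arb": rng.integers(0, 2, n),
})
# Hypothetical outcome generated so that roughly one in seven develops AKI.
lp = -2.7 + 0.02 * X.age - 0.01 * X.egfr + 0.5 * X.open_surgery + 0.3 * X.acei_arb
y = (rng.random(n) < 1 / (1 + np.exp(-lp))).astype(int)

model = LogisticRegression(max_iter=1000).fit(X, y)
apparent_auc = roc_auc_score(y, model.predict_proba(X)[:, 1])

# Bootstrap optimism correction: refit on resamples, compare resample vs original AUC.
optimism = []
for _ in range(200):
    idx = rng.integers(0, n, n)
    m = LogisticRegression(max_iter=1000).fit(X.iloc[idx], y[idx])
    boot_auc = roc_auc_score(y[idx], m.predict_proba(X.iloc[idx])[:, 1])
    test_auc = roc_auc_score(y, m.predict_proba(X)[:, 1])
    optimism.append(boot_auc - test_auc)

print(f"Apparent c-statistic:          {apparent_auc:.3f}")
print(f"Optimism-corrected c-statistic: {apparent_auc - np.mean(optimism):.3f}")
```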
Implementing treat-to-target urate-lowering therapy during hospitalisations for gout flares.
OBJECTIVES: To evaluate a strategy designed to optimise care and increase uptake of urate-lowering therapy (ULT) during hospitalisations for gout flares. METHODS: We conducted a prospective cohort study to evaluate a strategy that combined optimal in-hospital gout management with a nurse-led follow-up appointment, followed by handover to primary care. Outcomes, including ULT initiation, urate target attainment, and re-hospitalisation rates, were compared between patients hospitalised for flares in the 12 months post-implementation and a retrospective cohort of hospitalised patients from the 12 months pre-implementation. RESULTS: 119 and 108 patients, respectively, were hospitalised for gout flares in the 12 months pre- and post-implementation. For patients with 6-month follow-up data available (n = 94 and n = 97, respectively), the proportion newly initiated on ULT increased from 49.2% pre-implementation to 92.3% post-implementation (age/sex-adjusted odds ratio (aOR) 11.5; 95% confidence interval (CI) 4.36-30.5; p < 0.001). After implementation, more patients achieved a serum urate ≤360 µmol/L within 6 months of discharge (10.6% pre-implementation vs. 26.8% post-implementation; aOR 3.04; 95% CI 1.36-6.78; p = 0.007). The proportion of patients re-hospitalised for flares was 14.9% pre-implementation vs. 9.3% post-implementation (aOR 0.53, 95% CI 0.22 to 1.32; p = 0.18). CONCLUSION: Over 90% of patients were initiated on ULT after implementing a strategy to optimise hospital gout care. Despite increased initiation of ULT during flares, recurrent hospitalisations were not more frequent following implementation. Significant relative improvements in urate target attainment were observed post-implementation; however, for the majority of hospitalised gout patients to achieve urate targets, closer primary-secondary care integration is still needed.
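The age/sex-adjusted odds ratios above can be obtained from a logistic regression with cohort (pre vs. post), age and sex as covariates. The sketch below shows this with statsmodels on simulated data; the cohort size, variable names and effect sizes are hypothetical placeholders, not the audit dataset.

```python
# Illustrative sketch: age/sex-adjusted odds ratio for ULT initiation (post vs pre).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 191  # roughly the combined follow-up cohorts described above
df = pd.DataFrame({
    "post": rng.integers(0, 2, n),   # 1 = post-implementation cohort
    "age": rng.normal(65, 14, n),
    "male": rng.integers(0, 2, n),
})
# Hypothetical outcome with ~50% initiation pre- and ~90% post-implementation.
lp = -0.2 + 2.4 * df.post + 0.005 * df.age + 0.1 * df.male
df["ult_initiated"] = (rng.random(n) < 1 / (1 + np.exp(-lp))).astype(int)

fit = smf.logit("ult_initiated ~ post + age + male", data=df).fit(disp=0)
or_post = np.exp(fit.params["post"])
ci_low, ci_high = np.exp(fit.conf_int().loc["post"])
print(f"Adjusted OR (post vs pre): {or_post:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```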
