220 research outputs found

    Synergistic role of ADP and Ca2+ in diastolic myocardial stiffness

    Heart failure (HF) with diastolic dysfunction has been attributed to increased myocardial stiffness that limits proper filling of the ventricle. Altered cross-bridge interaction may contribute significantly to high diastolic stiffness, but this has not been shown thus far. Cross-bridge interactions depend on cytosolic [Ca2+] and the regeneration of ATP from ADP. Depletion of the myocardial energy reserve is a hallmark of HF, leading to ADP accumulation and disturbed Ca2+ handling. Here, we investigated whether ADP elevation in concert with increased diastolic [Ca2+] promotes diastolic cross-bridge formation and force generation and thereby increases diastolic stiffness. ADP dose-dependently increased force production in the absence of Ca2+ in membrane-permeabilized cardiomyocytes from human hearts. Moreover, physiological levels of ADP increased actomyosin force generation in the presence of Ca2+ in both human and rat membrane-permeabilized cardiomyocytes. Diastolic stress measured at physiological lattice spacing and 37°C in the presence of pathological levels of ADP and diastolic [Ca2+] revealed a 76±1% contribution of cross-bridge interaction to total diastolic stress in rat membrane-permeabilized cardiomyocytes. Inhibition of creatine kinase (CK), which increases cytosolic ADP, in enzyme-isolated intact rat cardiomyocytes impaired diastolic re-lengthening associated with diastolic Ca2+ overload. In isolated Langendorff-perfused rat hearts, CK inhibition increased ventricular stiffness only in the presence of diastolic [Ca2+]. We propose that elevations of intracellular ADP in specific types of cardiac disease, including those in which myocardial energy reserve is limited, contribute to diastolic dysfunction by recruiting cross-bridges even at low Ca2+, thereby increasing myocardial stiffness.

    An autonomic performance environment for exascale

    Exascale systems will require new approaches to performance observation, analysis, and runtime decision-making to optimize for performance and efficiency. The standard first-person model, in which multiple operating system processes and threads observe themselves and record first-person performance profiles or traces for offline analysis, is not adequate to observe and capture interactions at shared resources in highly concurrent, dynamic systems. Further, it does not support mechanisms for runtime adaptation. Our approach, called APEX (Autonomic Performance Environment for eXascale), provides mechanisms for sharing information among the layers of the software stack, including hardware, operating and runtime systems, and application code, both new and legacy. The performance measurement components share information across layers, merging first-person data sets with information collected by third-person tools observing shared hardware and software states at the node and global levels. Critically, APEX provides a policy engine designed to guide runtime adaptation mechanisms to make algorithmic changes, re-allocate resources, or change scheduling rules when appropriate conditions occur.
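    The policy-engine idea described in this abstract can be illustrated with a minimal sketch: policies pair a triggering condition over shared performance state with an adaptation action. All names here are invented for illustration and are not the actual APEX API.

```python
# Hypothetical sketch of a policy engine: registered policies fire
# adaptation actions when their conditions hold over shared state.
# Not the real APEX API; names are illustrative only.

class PolicyEngine:
    """Evaluates registered (condition, action) policies against shared state."""

    def __init__(self):
        self._policies = []

    def register(self, condition, action):
        # condition: state -> bool; action: state -> None
        self._policies.append((condition, action))

    def evaluate(self, state):
        # Fire every action whose triggering condition holds; return names fired.
        fired = []
        for condition, action in self._policies:
            if condition(state):
                action(state)
                fired.append(action.__name__)
        return fired


# Example adaptation: shed worker threads when measured idle time is high.
state = {"idle_fraction": 0.42, "threads": 16}

def high_idle(s):
    return s["idle_fraction"] > 0.25

def shed_threads(s):
    s["threads"] = max(1, s["threads"] // 2)

engine = PolicyEngine()
engine.register(high_idle, shed_threads)
engine.evaluate(state)
print(state["threads"])  # 8
```

    In a real runtime the state would be fed by the first- and third-person measurement components the abstract describes, and actions would call into the scheduler or resource manager rather than mutate a dictionary.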

    Randomised, open-label, phase II study of Gemcitabine with and without IMM-101 for advanced pancreatic cancer

    Background: Immune Modulation and Gemcitabine Evaluation-1, a randomised, open-label, phase II, first-line, proof-of-concept study (NCT01303172), explored safety and tolerability of IMM-101 (heat-killed Mycobacterium obuense; NCTC 13365) with gemcitabine (GEM) in advanced pancreatic ductal adenocarcinoma. Methods: Patients were randomised (2:1) to IMM-101 (10 mg ml−1 intradermally)+GEM (1000 mg m−2 intravenously; n=75), or GEM alone (n=35). Safety was assessed on frequency and incidence of adverse events (AEs). Overall survival (OS), progression-free survival (PFS) and overall response rate (ORR) were collected. Results: IMM-101 was well tolerated, with a similar rate of AE and serious adverse event reporting in both groups after allowance for exposure. Median OS in the intent-to-treat population was 6.7 months for IMM-101+GEM vs 5.6 months for GEM; while not significant, the hazard ratio (HR) numerically favoured IMM-101+GEM (HR 0.68; 95% CI, 0.44–1.04; P=0.074). In a pre-defined metastatic subgroup (84%), OS was significantly improved from 4.4 to 7.0 months in favour of IMM-101+GEM (HR 0.54; 95% CI, 0.33–0.87; P=0.01). Conclusions: IMM-101 with GEM was as safe and well tolerated as GEM alone, and there was a suggestion of a beneficial effect on survival in patients with metastatic disease. This warrants further evaluation in an adequately powered confirmatory study.

    Predicting bee community responses to land-use changes: Effects of geographic and taxonomic biases

    Land-use change and intensification threaten bee populations worldwide, imperilling pollination services. Global models are needed to better characterise, project, and mitigate bees' responses to these human impacts. The available data are, however, geographically and taxonomically unrepresentative; most data are from North America and Western Europe, overrepresenting bumblebees and raising concerns that model results may not be generalizable to other regions and taxa. To assess whether the geographic and taxonomic biases of the data could undermine the effectiveness of models for conservation policy, we collated from the published literature a global dataset of bee diversity at sites facing land-use change and intensification, and assessed whether bee responses to these pressures vary across 11 regions (Western, Northern, Eastern and Southern Europe; North, Central and South America; Australia and New Zealand; South East Asia; Middle and Southern Africa) and between bumblebees and other bees. Our analyses highlight strong regionally based responses of total abundance, species richness and Simpson's diversity to land use, caused by variation in the sensitivity of species and potentially in the nature of threats. These results suggest that global extrapolation of models based on geographically and taxonomically restricted data may underestimate the true uncertainty, increasing the risk of ecological surprises.

    Rhesus Macaques (Macaca mulatta) Are Natural Hosts of Specific Staphylococcus aureus Lineages

    Currently, there is no animal model known that mimics natural nasal colonization by Staphylococcus aureus in humans. We investigated whether rhesus macaques are natural nasal carriers of S. aureus. Nasal swabs were taken from 731 macaques. S. aureus isolates were typed by pulsed-field gel electrophoresis (PFGE), spa repeat sequencing and multi-locus sequence typing (MLST), and compared with human strains. Furthermore, the isolates were characterized by several PCRs. Thirty-nine percent of the 731 macaques were positive for S. aureus. In general, the macaque S. aureus isolates differed from human strains: they formed separate PFGE clusters; 50% of the isolates were untypeable by agr genotyping; and 17 new spa types were identified, all belonging to new sequence types (STs). Furthermore, 66% of macaque isolates were negative for all superantigen genes. To determine S. aureus nasal colonization, three nasal swabs were taken from 48 duo-housed macaques during a 5-month period. In addition, sera were analyzed for immunoglobulin G and A levels directed against 40 staphylococcal proteins using a bead-based flow cytometry technique. Nineteen percent of the animals were negative for S. aureus, and 17% were three times positive. S. aureus strains were easily exchanged between macaques. The antibody response was less pronounced in macaques than in humans, and nasal carrier status was not associated with differences in serum anti-staphylococcal antibody levels. In conclusion, rhesus macaques are natural hosts of S. aureus, carrying host-specific lineages. Our data indicate that rhesus macaques are useful as an autologous model for studying S. aureus nasal colonization and infection prevention.

    Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

    Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89.6 per cent) compared with that in countries with a middle (753 of 1242, 60.6 per cent; odds ratio (OR) 0.17, 95 per cent c.i. 0.14 to 0.21, P < 0.001) or low (363 of 860, 42.2 per cent; OR 0.08, 0.07 to 0.10, P < 0.001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference −9.4 (95 per cent c.i. −11.9 to −6.9) per cent; P < 0.001), but the relationship was reversed in low-HDI countries (+12.1 (+7.0 to +17.3) per cent; P < 0.001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0.60, 0.50 to 0.73; P < 0.001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
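    As a quick illustration of the comparison reported above, an unadjusted odds ratio can be recovered from the raw counts. This is only a sketch: the paper's ORs come from multivariable models adjusting for patient and disease factors, so they differ slightly from this raw calculation.

```python
# Unadjusted odds ratio of checklist use, middle- vs high-HDI countries,
# from the counts reported in the abstract (753/1242 vs 2455/2741).
# The published ORs are adjusted, so this is only an approximation.

def odds_ratio(used_a, total_a, used_b, total_b):
    """OR of 'used' in group A relative to group B, from 2x2 counts."""
    odds_a = used_a / (total_a - used_a)
    odds_b = used_b / (total_b - used_b)
    return odds_a / odds_b

or_mid = odds_ratio(753, 1242, 2455, 2741)   # middle vs high HDI
print(round(or_mid, 2))  # 0.18, close to the reported adjusted OR of 0.17
```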

    The impact of surgical delay on resectability of colorectal cancer: An international prospective cohort study

    AIM: The SARS-CoV-2 pandemic has provided a unique opportunity to explore the impact of surgical delays on cancer resectability. This study aimed to compare resectability for colorectal cancer patients undergoing delayed versus non-delayed surgery. METHODS: This was an international prospective cohort study of consecutive colorectal cancer patients with a decision for curative surgery (January–April 2020). Surgical delay was defined as an operation taking place more than 4 weeks after the treatment decision, in a patient who did not receive neoadjuvant therapy. A subgroup analysis explored the effects of delay in elective patients only. The impact of longer delays was explored in a sensitivity analysis. The primary outcome was complete resection, defined as curative resection with an R0 margin. RESULTS: Overall, 5453 patients from 304 hospitals in 47 countries were included, of whom 6.6% (358/5453) did not receive their planned operation. Of the 4304 operated patients without neoadjuvant therapy, 40.5% (1744/4304) were delayed beyond 4 weeks. Delayed patients were more likely to be older, male and more comorbid, to have a higher body mass index, and to have rectal cancer and early-stage disease. Delayed patients had higher unadjusted rates of complete resection (93.7% vs. 91.9%, P = 0.032) and lower rates of emergency surgery (4.5% vs. 22.5%, P < 0.001). After adjustment, delay was not associated with a lower rate of complete resection (OR 1.18, 95% CI 0.90–1.55, P = 0.224), which was consistent in elective patients only (OR 0.94, 95% CI 0.69–1.27, P = 0.672). Longer delays were not associated with poorer outcomes. CONCLUSION: One in 15 colorectal cancer patients did not receive their planned operation during the first wave of COVID-19. Surgical delay did not appear to compromise resectability, raising the hypothesis that any reduction in long-term survival attributable to delays is likely to be due to micro-metastatic disease.

    A comprehensive overview of radioguided surgery using gamma detection probe technology

    The concept of radioguided surgery, which was first developed some 60 years ago, involves the use of a radiation detection probe system for the intraoperative detection of radionuclides. The use of gamma detection probe technology in radioguided surgery has expanded tremendously and has evolved into what is now considered an established discipline within the practice of surgery, revolutionizing the surgical management of many malignancies, including breast cancer, melanoma, and colorectal cancer, as well as the surgical management of parathyroid disease. The impact of radioguided surgery on the surgical management of cancer patients includes providing vital, real-time information to the surgeon regarding the location and extent of disease, as well as regarding the assessment of surgical resection margins. Additionally, it has allowed the surgeon to minimize the surgical invasiveness of many diagnostic and therapeutic procedures, while still maintaining maximum benefit to the cancer patient. In the current review, we have attempted to comprehensively evaluate the history, technical aspects, and clinical applications of radioguided surgery using gamma detection probe technology.