
    Biliary Bicarbonate Secretion Constitutes a Protective Mechanism against Bile Acid-Induced Injury in Man

Background: Cholangiocytes display a striking resistance to bile acids: whereas other cell types, such as hepatocytes, are susceptible to bile acid-induced toxicity and apoptosis already at micromolar concentrations, cholangiocytes are continuously exposed to the millimolar concentrations present in bile. We present a hypothesis suggesting that biliary secretion of HCO3− in man serves to protect cholangiocytes against bile acid-induced damage by fostering the deprotonation of apolar bile acids to more polar bile salts. Here, we tested whether bile acid-induced toxicity is pH-dependent and whether anion exchanger 2 (AE2) protects against bile acid-induced damage. Methods: A human cholangiocyte cell line was exposed to chenodeoxycholate (CDC), or its glycine conjugate, at 0.5 mM to 2.0 mM and at pH 7.4, 7.1, 6.7 or 6.4, or after knockdown of AE2. Cell viability and apoptosis were determined by WST and caspase-3/-7 assays, respectively. Results: Glycochenodeoxycholate (GCDC) uptake in cholangiocytes is pH-dependent. Furthermore, CDC and GCDC (pKa 4–5) induce cholangiocyte toxicity in a pH-dependent manner: 0.5 mM CDC and 1 mM GCDC had no effect on cell viability at pH 7.4, but at pH 6.4 decreased viability by >80% and increased caspase activity almost 10- and 30-fold, respectively. Acidification alone had no effect. AE2 knockdown led to 3- and 2-fold enhanced apoptosis induced by 0.75 mM CDC or 2 mM GCDC at pH 7.4. Discussion: These data support our hypothesis of a biliary HCO3− umbrella serving to protect human cholangiocytes against bile acid-induced injury. AE2 is a key contributor to this protective mechanism. The development and progression of cholangiopathies, such as primary biliary cirrhosis, may be a consequence of genetic and acquired functional defects in genes involved in maintaining the biliary HCO3− umbrella. Copyright (C) 2011 S. Karger AG, Basel.
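The pH dependence follows directly from the Henderson-Hasselbalch relationship: with a pKa of roughly 4–5, lowering the pH from 7.4 to 6.4 increases the fraction of protonated, membrane-permeant (apolar) bile acid about tenfold. A minimal sketch, assuming an illustrative pKa of 4.5 (within the 4–5 range quoted above):

```python
# Fraction of a monoprotic bile acid in the protonated (apolar, membrane-permeant)
# form at a given pH, from the Henderson-Hasselbalch equation.
def protonated_fraction(ph: float, pka: float = 4.5) -> float:
    return 1.0 / (1.0 + 10 ** (ph - pka))

for ph in (7.4, 7.1, 6.7, 6.4):
    print(f"pH {ph}: {protonated_fraction(ph):.4%} protonated")

# Relative enrichment of the apolar species when pH falls from 7.4 to 6.4
print(protonated_fraction(6.4) / protonated_fraction(7.4))  # roughly 10-fold
```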

Pelvic trauma: WSES classification and guidelines

Complex pelvic injuries are among the most dangerous and deadly trauma-related lesions. Different classification systems exist: some are based on the mechanism of injury, some on anatomic patterns, and some focus on the resulting instability requiring operative fixation. The optimal treatment strategy, however, should take into consideration the hemodynamic status, the anatomic impairment of pelvic ring function and the associated injuries. The management of pelvic trauma patients aims ultimately to restore homeostasis and normal pathophysiology together with the mechanical stability of the pelvic ring. Thus the management of pelvic trauma must be multidisciplinary and should ultimately be based on the physiology of the patient and the anatomy of the injury. This paper presents the World Society of Emergency Surgery (WSES) classification of pelvic trauma and the associated management guidelines.

    Frequent mutation of histone-modifying genes in non-Hodgkin lymphoma

    Follicular lymphoma (FL) and diffuse large B-cell lymphoma (DLBCL) are the two most common non-Hodgkin lymphomas (NHLs). Here we sequenced tumour and matched normal DNA from 13 DLBCL cases and one FL case to identify genes with mutations in B-cell NHL. We analysed RNA-seq data from these and another 113 NHLs to identify genes with candidate mutations, and then re-sequenced tumour and matched normal DNA from these cases to confirm 109 genes with multiple somatic mutations. Genes with roles in histone modification were frequent targets of somatic mutation. For example, 32% of DLBCL and 89% of FL cases had somatic mutations in MLL2, which encodes a histone methyltransferase, and 11.4% and 13.4% of DLBCL and FL cases, respectively, had mutations in MEF2B, a calcium-regulated gene that cooperates with CREBBP and EP300 in acetylating histones. Our analysis suggests a previously unappreciated disruption of chromatin biology in lymphomagenesis

Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy

Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89·6 per cent) than in countries with a middle (753 of 1242, 60·6 per cent; odds ratio (OR) 0·17, 95 per cent c.i. 0·14 to 0·21, P < 0·001) or low (363 of 860, 42·2 per cent; OR 0·08, 0·07 to 0·10, P < 0·001) HDI. Checklist use was less common in elective surgery than in emergency laparotomy in high-HDI countries (risk difference −9·4 (95 per cent c.i. −11·9 to −6·9) per cent; P < 0·001), but the relationship was reversed in low-HDI countries (+12·1 (+7·0 to +17·3) per cent; P < 0·001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0·60, 0·50 to 0·73; P < 0·001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
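The adjusted odds ratios above come from multivariable logistic regression. A minimal sketch of how such an adjusted OR for checklist use could be estimated with statsmodels; the covariates, effect sizes and data here are invented for illustration, not taken from the study:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5000

# Synthetic patient-level data (purely illustrative)
checklist = rng.integers(0, 2, n)            # checklist used before laparotomy
age = rng.normal(60, 15, n)                  # patient age
asa = rng.integers(1, 5, n)                  # ASA physical status grade
logit = -4 + 0.03 * (age - 60) + 0.5 * (asa - 2) - 0.5 * checklist
death30 = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # simulated 30-day mortality

# Multivariable logistic regression: mortality ~ checklist + age + ASA
X = sm.add_constant(np.column_stack([checklist, age, asa]))
fit = sm.Logit(death30, X).fit(disp=False)

or_checklist = np.exp(fit.params[1])
ci_low, ci_high = np.exp(fit.conf_int()[1])
print(f"adjusted OR for checklist use: {or_checklist:.2f} "
      f"(95% CI {ci_low:.2f} to {ci_high:.2f})")
```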

    Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study

Background Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p<0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p<0·001). Interpretation Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication. Funding DFID-MRC-Wellcome Trust Joint Global Health Trial Development Grant, National Institute for Health Research Global Health Research Unit Grant.
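The risk-adjusted comparison reported here used Bayesian multilevel logistic regression, with patients nested within hospitals. A rough sketch of that model class in PyMC on simulated data, with a hospital-level random intercept; the variable names, covariates and priors are assumptions for illustration, not the study's actual specification:

```python
import numpy as np
import pymc as pm
import arviz as az

rng = np.random.default_rng(1)
n_hospitals, n = 40, 2000

# Simulated data: hospital membership and two patient-level covariates
hospital = rng.integers(0, n_hospitals, n)
low_hdi = rng.integers(0, 2, n)              # operated in a low-HDI country
dirty_wound = rng.integers(0, 2, n)          # dirty operative field
hosp_effect = rng.normal(0, 0.3, n_hospitals)
p = 1 / (1 + np.exp(-(-2.0 + 0.5 * low_hdi + 1.0 * dirty_wound + hosp_effect[hospital])))
ssi = rng.binomial(1, p)                     # simulated 30-day SSI outcome

with pm.Model():
    # Partial pooling: one random intercept per hospital
    sigma_h = pm.HalfNormal("sigma_hospital", 1.0)
    a_h = pm.Normal("a_hospital", 0.0, sigma_h, shape=n_hospitals)
    b0 = pm.Normal("intercept", 0.0, 2.0)
    b_hdi = pm.Normal("b_low_hdi", 0.0, 1.0)
    b_dirty = pm.Normal("b_dirty_wound", 0.0, 1.0)
    pm.Bernoulli(
        "ssi",
        logit_p=b0 + a_h[hospital] + b_hdi * low_hdi + b_dirty * dirty_wound,
        observed=ssi,
    )
    idata = pm.sample(1000, tune=1000, target_accept=0.9)

# The posterior for b_low_hdi is the adjusted log odds ratio for low-HDI countries;
# exponentiating it gives the odds ratio with a credible interval.
print(az.summary(idata, var_names=["b_low_hdi"]))
```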

    Long-term outcomes for neoadjuvant versus adjuvant chemotherapy in early breast cancer: meta-analysis of individual patient data from ten randomised trials

Background Neoadjuvant chemotherapy (NACT) for early breast cancer can make breast-conserving surgery more feasible and might be more likely to eradicate micrometastatic disease than the same chemotherapy given after surgery. We investigated the long-term benefits and risks of NACT and the influence of tumour characteristics on outcome with a collaborative meta-analysis of individual patient data from relevant randomised trials. Methods We obtained information about prerandomisation tumour characteristics, clinical tumour response, surgery, recurrence, and mortality for 4756 women in ten randomised trials in early breast cancer that began before 2005 and compared NACT with the same chemotherapy given postoperatively. Primary outcomes were tumour response, extent of local therapy, local and distant recurrence, breast cancer death, and overall mortality. Analyses by intention-to-treat used standard regression (for response and frequency of breast-conserving therapy) and log-rank methods (for recurrence and mortality). Findings Patients entered the trials from 1983 to 2002 and median follow-up was 9 years (IQR 5–14), with the last follow-up in 2013. Most chemotherapy was anthracycline based (3838 [81%] of 4756 women). More than two-thirds (1349 [69%] of 1947) of women allocated NACT had a complete or partial clinical response. Patients allocated NACT had an increased frequency of breast-conserving therapy (1504 [65%] of 2320 treated with NACT vs 1135 [49%] of 2318 treated with adjuvant chemotherapy). NACT was associated with more frequent local recurrence than was adjuvant chemotherapy: the 15-year local recurrence risk was 21·4% for NACT versus 15·9% for adjuvant chemotherapy (5·5% increase [95% CI 2·4–8·6]; rate ratio 1·37 [95% CI 1·17–1·61]; p=0·0001). No significant difference between NACT and adjuvant chemotherapy was noted for distant recurrence (15-year risk 38·2% for NACT vs 38·0% for adjuvant chemotherapy; rate ratio 1·02 [95% CI 0·92–1·14]; p=0·66), breast cancer mortality (34·4% vs 33·7%; 1·06 [0·95–1·18]; p=0·31), or death from any cause (40·9% vs 41·2%; 1·04 [0·94–1·15]; p=0·45). Interpretation Tumours downsized by NACT might have higher local recurrence after breast-conserving therapy than tumours of the same dimensions in women who have not received NACT. Strategies to mitigate the increased local recurrence after breast-conserving therapy in tumours downsized by NACT should be considered (eg, careful tumour localisation, detailed pathological assessment, and appropriate radiotherapy).
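The recurrence and mortality comparisons above rely on log-rank methods applied to time-to-event data. A toy sketch of a log-rank comparison with lifelines, using invented follow-up times censored at 15 years rather than the trial data:

```python
import numpy as np
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)

# Invented times to local recurrence (years) for two hypothetical arms
t_nact = rng.exponential(60, 500)        # neoadjuvant arm
t_adjuvant = rng.exponential(75, 500)    # adjuvant arm

# Administrative censoring at 15 years of follow-up
event_nact = t_nact <= 15
event_adjuvant = t_adjuvant <= 15
t_nact = np.minimum(t_nact, 15)
t_adjuvant = np.minimum(t_adjuvant, 15)

result = logrank_test(t_nact, t_adjuvant,
                      event_observed_A=event_nact,
                      event_observed_B=event_adjuvant)
print(result.p_value)
```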

    Global variation in anastomosis and end colostomy formation following left-sided colorectal resection

    Background End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection. Methods This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model. Results In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001). Conclusion Global differences existed in the proportion of patients receiving end stomas after left-sided colorectal resection based on income, which went beyond case mix alone