Interaction of the tetracyclines with double-stranded RNAs of random base sequence: new perspectives on the target and mechanism of action
The 16S rRNA binding mechanism proposed for the antibacterial action of the tetracyclines does not explain their mechanism of action against non-bacterial pathogens. In addition, several contradictory base pairs have been proposed as their binding sites on the 16S rRNA. This study investigated the binding of minocycline and doxycycline to short double-stranded RNAs (dsRNAs) of random base sequence. These tetracyclines caused a dose-dependent decrease in the fluorescence intensities of 6-carboxyfluorescein (FAM)-labelled dsRNA and ethidium bromide (EtBr)-stained dsRNA, indicating that both drugs bind to dsRNA of random base sequence in a manner that is competitive with the binding of EtBr and other nucleic acid ligands often used as stains. This effect was observable in the presence of Mg2+. The binding of the tetracyclines to dsRNA changed features of the fluorescence emission spectra of the drugs and the CD spectra of the RNA, and inhibited RNase III cleavage of the dsRNA. These results indicate that the double-stranded structure of RNAs may play a more important role in their interaction with the tetracyclines than the specific base pairs that have hitherto been the subject of much investigation. Given the diverse functions of cellular RNAs, the binding of the tetracyclines to their double-stranded helices may alter the normal processing and functioning of the various biological processes they regulate. This could help to explain the wide range of action of the tetracyclines against various pathogens and disease conditions.
Receptor-Induced Dilatation in the Systemic and Intrarenal Adaptation to Pregnancy in Rats
Normal pregnancy is associated with systemic and intrarenal vasodilatation resulting in an increased glomerular filtration rate. This adaptive response occurs in spite of elevated circulating levels of angiotensin II (Ang II). In the present study, we evaluated the potential mechanisms responsible for this adaptation. The reactivity of the mesangial cells (MCs) cultured from 14-day-pregnant rats to Ang II was measured through changes in the intracellular calcium concentration ([Cai]). The expression levels of inducible nitric oxide synthase (iNOS), the Ang II-induced vasodilatation receptor AT2, and the relaxin (LGR7) receptor were evaluated in cultured MCs and in the aorta, renal artery and kidney cortex by real-time PCR. The intrarenal distribution of LGR7 was further analyzed by immunohistochemistry. The MCs displayed a relative insensitivity to Ang II, which was paralleled by an impressive increase in the expression levels of iNOS, AT2 and LGR7. These results suggest that the MCs also adapt to the pregnancy, thereby contributing to the maintenance of the glomerular surface area even in the presence of high levels of Ang II. The mRNA expression levels of AT2 and LGR7 also increased in the aorta, renal artery and kidney of the pregnant animals, whereas the expression of AT1 did not significantly change. This further suggests a role of these vasodilatation-induced receptors in the systemic and intrarenal adaptation during pregnancy. LGR7 was localized in the glomeruli and on the apical membrane of the tubular cells, with stronger labeling in the kidneys of pregnant rats. These results suggest a role of iNOS, AT2, and LGR7 in the systemic vasodilatation and intrarenal adaptation to pregnancy and also suggest a pivotal role for relaxin in the tubular function during gestation.
The Utilization of Aquatic Bushmeat from Small Cetaceans and Manatees in South America and West Africa
Aquatic bushmeat can be defined as the products derived from wild aquatic megafauna (e.g., marine mammals) that are used for human consumption and non-food purposes, including traditional medicine. It is obtained through illegal or unregulated hunts as well as from stranded (dead or alive) and bycaught animals. In most South American and West African countries aquatic mammals are or have been taken for bushmeat, including 33 small cetacean species and all three manatee species. Of these, two cetacean species are listed in the IUCN Red List as "near threatened," and one as "vulnerable," as are all manatee species. Additionally, 22 cetacean species are listed as "data deficient," hence some of these species may also be at risk. Although no reports (recent or otherwise) were found for some countries, caution is needed in concluding that aquatic bushmeat is not utilized in these nations. Moreover, although aquatic bushmeat is mostly obtained opportunistically and was likely originally taken only for local consumption, directed catches occur in most countries and may have reached unsustainable levels in some areas. For example, in Peru and Nigeria, thousands of small cetaceans are illegally hunted annually. Reliable, recent data and a better overall understanding of the drivers of aquatic bushmeat will be essential in the development of effective mitigation measures.
Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy
Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89⋅6 per cent) compared with that in countries with a middle (753 of 1242, 60⋅6 per cent; odds ratio (OR) 0⋅17, 95 per cent c.i. 0⋅14 to 0⋅21, P < 0⋅001) or low (363 of 860, 42⋅2 per cent; OR 0⋅08, 0⋅07 to 0⋅10, P < 0⋅001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference −9⋅4 (95 per cent c.i. −11⋅9 to −6⋅9) per cent; P < 0⋅001), but the relationship was reversed in low-HDI countries (+12⋅1 (+7⋅0 to +17⋅3) per cent; P < 0⋅001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0⋅60, 0⋅50 to 0⋅73; P < 0⋅001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries.
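As an arithmetic illustration, the unadjusted odds ratio for checklist use in middle- versus high-HDI countries can be recomputed from the raw counts reported above (753 of 1242 versus 2455 of 2741). A minimal Python sketch, using a Woolf (log-scale) confidence interval; note this crude value only happens to land close to the adjusted OR of 0⋅17 (0⋅14 to 0⋅21) reported by the study, which also accounted for patient and disease factors:

```python
import math

def odds_ratio(a, n1, b, n2):
    """Unadjusted odds ratio for event counts a/n1 vs b/n2,
    with a Woolf (log-scale) 95% confidence interval."""
    or_ = (a / (n1 - a)) / (b / (n2 - b))
    # Standard error of log(OR) from the four cell counts
    se = math.sqrt(1/a + 1/(n1 - a) + 1/b + 1/(n2 - b))
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Checklist use: middle-HDI (753 of 1242) vs high-HDI (2455 of 2741)
or_, lo, hi = odds_ratio(753, 1242, 2455, 2741)
print(f"OR {or_:.2f} (95% CI {lo:.2f} to {hi:.2f})")
# → OR 0.18 (95% CI 0.15 to 0.21)
```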
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p<0·001).
The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p<0·001). Interpretation Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication. Funding DFID-MRC-Wellcome Trust Joint Global Health Trial Development Grant, National Institute for Health Research Global Health Research Unit Grant.
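The crude incidence figures reported above follow directly from the event counts and group sizes; a small sanity-check sketch in Python, using the counts verbatim from the abstract:

```python
# 30-day SSI events and group sizes per HDI group, as reported in the abstract
groups = {
    "high-HDI":   (691, 7339),
    "middle-HDI": (549, 3918),
    "low-HDI":    (298, 1282),
}

for name, (events, n) in groups.items():
    # Crude incidence = events / group size, expressed as a percentage
    print(f"{name}: {100 * events / n:.1f}%")
# → high-HDI: 9.4%, middle-HDI: 14.0%, low-HDI: 23.2%
```

These crude proportions match the abstract; the paper's headline comparison (adjusted OR 1·60 for low-HDI countries) additionally accounts for case mix via the Bayesian multilevel models described in the Methods.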
Effects of caffeine on the electrophysiological, cognitive and motor responses of the central nervous system
Increasing frailty is associated with higher prevalence and reduced recognition of delirium in older hospitalised inpatients: results of a multi-centre study
Purpose Delirium is a neuropsychiatric disorder delineated by an acute change in cognition, attention, and consciousness. It is common, particularly in older adults, but poorly recognised. Frailty is the accumulation of deficits conferring an increased risk of adverse outcomes. We set out to determine how severity of frailty, as measured using the Clinical Frailty Scale (CFS), affected delirium rates and recognition in hospitalised older people in the United Kingdom. Methods Adults over 65 years were included in an observational multi-centre audit across UK hospitals, comprising two prospective rounds and one retrospective note review. CFS score, delirium status, and 30-day outcomes were recorded. Results The overall prevalence of delirium was 16.3% (483 patients). Patients with delirium were more frail than patients without delirium (median CFS 6 vs 4). The risk of delirium was greater with increasing frailty [OR 2.9 (1.8–4.6) in CFS 4 vs 1–3; OR 12.4 (6.2–24.5) in CFS 8 vs 1–3]. Higher CFS was associated with reduced recognition of delirium (OR of 0.7 (0.3–1.9) in CFS 4 compared to 0.2 (0.1–0.7) in CFS 8). These risks were both independent of age and dementia. Conclusion We have demonstrated an incremental increase in risk of delirium with increasing frailty. This has important clinical implications, suggesting that frailty may provide a more nuanced measure of vulnerability to delirium and poor outcomes. However, the most frail patients are least likely to have their delirium diagnosed, and there is a significant lack of research into the underlying pathophysiology of both of these common geriatric syndromes.
Global variation in anastomosis and end colostomy formation following left-sided colorectal resection
Background
End colostomy rates following colorectal resection vary across institutions in high-income settings, being influenced by patient, disease, surgeon and system factors. This study aimed to assess global variation in end colostomy rates after left-sided colorectal resection.
Methods
This study comprised an analysis of GlobalSurg-1 and -2 international, prospective, observational cohort studies (2014, 2016), including consecutive adult patients undergoing elective or emergency left-sided colorectal resection within discrete 2-week windows. Countries were grouped into high-, middle- and low-income tertiles according to the United Nations Human Development Index (HDI). Factors associated with colostomy formation versus primary anastomosis were explored using a multilevel, multivariable logistic regression model.
Results
In total, 1635 patients from 242 hospitals in 57 countries undergoing left-sided colorectal resection were included: 113 (6·9 per cent) from low-HDI, 254 (15·5 per cent) from middle-HDI and 1268 (77·6 per cent) from high-HDI countries. There was a higher proportion of patients with perforated disease (57·5, 40·9 and 35·4 per cent; P < 0·001) and subsequent use of end colostomy (52·2, 24·8 and 18·9 per cent; P < 0·001) in low- compared with middle- and high-HDI settings. The association with colostomy use in low-HDI settings persisted (odds ratio (OR) 3·20, 95 per cent c.i. 1·35 to 7·57; P = 0·008) after risk adjustment for malignant disease (OR 2·34, 1·65 to 3·32; P < 0·001), emergency surgery (OR 4·08, 2·73 to 6·10; P < 0·001), time to operation at least 48 h (OR 1·99, 1·28 to 3·09; P = 0·002) and disease perforation (OR 4·00, 2·81 to 5·69; P < 0·001).
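To see how much of the raw disparity the risk adjustment absorbs, a crude odds ratio for end colostomy in low- versus high-HDI settings can be approximated by back-calculating event counts from the reported percentages (52·2 per cent of 113 and 18·9 per cent of 1268). A hedged Python sketch; the rounded counts of 59 and 240 are reconstructions from the percentages, not figures taken from the study:

```python
def crude_odds_ratio(a, n1, b, n2):
    """Unadjusted odds ratio for event counts a/n1 vs b/n2."""
    return (a / (n1 - a)) / (b / (n2 - b))

# Approximate counts, back-calculated from the reported percentages:
# low-HDI: 52.2% of 113 patients ≈ 59; high-HDI: 18.9% of 1268 ≈ 240
print(round(crude_odds_ratio(59, 113, 240, 1268), 2))  # → 4.68
```

The crude value (about 4·7) is noticeably larger than the adjusted OR of 3·20 reported above, consistent with part of the low-HDI excess being explained by the adjusted-for factors such as emergency surgery and perforated disease.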
Conclusion
Global differences existed across income groups in the proportion of patients receiving end stomas after left-sided colorectal resection, beyond what case mix alone could explain.
Effect of hydrofluoric acid on acid decomposition mixtures for determining iron and other metallic elements in green vegetables.
The efficiency of the acid mixtures HNO3–HClO4–HF, HNO3–HCl–HF, HNO3–HClO4 and HNO3–HCl in the decomposition of four edible green vegetables, Gboma (Solanum macrocarpon), Aleefu (Amaranthus hybridus), Shoeley (Hibiscus sabdariffa) and Ademe (Corchorus olitorius), for flame atomic absorption spectrometric analysis of Fe, Mn, Mg, Cu, Zn and Ca was studied. Fe concentrations were the highest (120.61–710.10 mg/kg) and Cu concentrations the lowest (2.31–4.84 mg/kg) in all the samples. The Fe concentrations were more reproducible when HF was included in the decomposition mixtures. There were no significant differences in the concentrations of the other elements between acid mixtures with and without HF. Therefore, the inclusion of HF in the acid decomposition mixture ensures total and precise estimation of Fe in plant materials, but is not critical for the analysis of Mn, Mg, Cu, Zn and Ca. The performance of the decomposition procedures was verified by applying the methods to the Standard Reference Material IAEA-V-10 Hay Powder. Journal of Applied Science and Technology Vol. 12 (1&2) 2007: pp. 84-8
