A proposed quantitative methodology for the evaluation of the effectiveness of Human Element, Leadership and Management (HELM) training in the UK
In 2006, a review of maritime accidents found that non-technical skills (NTSs) are the single largest contributing factor towards such incidents. NTSs comprise both interpersonal and cognitive elements, including situational awareness, teamwork, decision making, leadership, management and communication skills. In a crisis situation, good NTSs allow a deck officer to quickly recognise that a problem exists and then harness the resources at their disposal to bring the situation back under control safely and efficiently. This paper has two aims. The first is to develop a methodology that enables educators to quantitatively assess the impact of Maritime and Coastguard Agency (MCA)-approved Human Element, Leadership and Management (HELM) training on deck officers' NTSs, with a view to identifying further training requirements. The second is to determine whether the HELM training provided to develop the NTSs of trainee deck officers is fit for purpose. To achieve these aims, a three-phase approach was adopted. First, a taxonomy of deck officers' NTSs was established, behavioural markers were identified and the relative importance of each attribute was calculated using the analytical hierarchy process (AHP). Second, a set of scenarios was identified for the assessment of deck officers' NTSs in a ship bridge simulator environment; a random sample of students who had completed the Chief Mate (CM) programme was drawn, and data on their NTS-related performance in the scenarios were collected. Finally, the collected data were fed into the evidential reasoning (ER) algorithm to produce utility values and, on the basis of these values, the effectiveness of the HELM training the students had received was evaluated
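The AHP weighting step described above follows a standard procedure; the sketch below illustrates it with a hypothetical four-attribute NTS taxonomy and an invented pairwise comparison matrix, not the paper's actual data.

```python
# A minimal AHP sketch: derive attribute weights from a pairwise comparison
# matrix via its principal eigenvector, then check consistency.
# Attributes and comparison values are illustrative assumptions only.
import numpy as np

attributes = ["situational awareness", "teamwork", "decision making", "communication"]

# Pairwise comparison matrix on Saaty's 1-9 scale (hypothetical values).
A = np.array([
    [1.0, 3.0, 2.0, 4.0],
    [1/3, 1.0, 1/2, 2.0],
    [1/2, 2.0, 1.0, 3.0],
    [1/4, 1/2, 1/3, 1.0],
])

# Weights: principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (random index for n = 4 is 0.90).
lam_max = eigvals.real[k]
ci = (lam_max - len(A)) / (len(A) - 1)
cr = ci / 0.90

for name, w in zip(attributes, weights):
    print(f"{name}: {w:.3f}")
print(f"consistency ratio: {cr:.3f} (acceptable if < 0.10)")
```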
Crystalline phases involved in the hydration of calcium silicate-based cements: Semi-quantitative Rietveld X-ray diffraction analysis
Chemical comparisons of powder and hydrated forms of calcium silicate cements (CSCs), and calculation of the alterations in tricalcium silicate (Ca3SiO5) and calcium hydroxide (Ca(OH)2), are essential for understanding their hydration processes. This study aimed to evaluate and compare these changes in ProRoot MTA, Biodentine and CEM cement. Powder and hydrated forms of tooth-coloured ProRoot MTA, Biodentine and CEM cement were subjected to X-ray diffraction (XRD) analysis with Rietveld refinement to semi-quantitatively identify and quantify the main phases involved in their hydration process. Data were reported descriptively. A reduction in Ca3SiO5 and formation of Ca(OH)2 were seen after the hydration of ProRoot MTA and Biodentine; however, in the case of CEM cement, no reduction of Ca3SiO5 and no formation of Ca(OH)2 were detected. The highest percentages of amorphous phases were seen in Biodentine samples. Ettringite was detected in the hydrated forms of ProRoot MTA and CEM cement but not in Biodentine
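Semi-quantitative Rietveld analysis typically converts refined scale factors into phase weight fractions via the Hill and Howard relation; the sketch below shows that step with placeholder scale factors and ZMV products rather than the values refined in this study.

```python
# Minimal sketch of semi-quantitative phase fractions from Rietveld refinement,
# using the standard relation w_p = S_p(ZMV)_p / sum_i S_i(ZMV)_i.
# Scale factors and Z*M*V products below are placeholders, not the study's data.

phases = {
    # phase: (refined scale factor S, Z*M*V product)
    "Ca3SiO5 (tricalcium silicate)": (1.2e-4, 5.1e5),
    "Ca(OH)2 (portlandite)":        (3.4e-4, 9.8e4),
    "CaCO3 (calcite)":              (2.0e-4, 1.5e5),
}

denominator = sum(s * zmv for s, zmv in phases.values())
for name, (s, zmv) in phases.items():
    w = s * zmv / denominator
    print(f"{name}: {100 * w:.1f} wt% of the crystalline fraction")
```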
The Coevolution of Virulence: Tolerance in Perspective
Coevolutionary interactions, such as those between host and parasite, predator and prey, or plant and pollinator, evolve subject to the genes of both interactors. It is clear, for example, that the evolution of pollination strategies can only be understood with knowledge of both the pollinator and the pollinated. Studies of the evolution of virulence, the reduction in host fitness due to infection, have nonetheless tended to focus on parasite evolution. Host-centric approaches have also been proposed—for example, under the rubric of "tolerance", the ability of hosts to minimize virulence without necessarily minimizing parasite density. Within the tolerance framework, however, there is room for more comprehensive measures of host fitness traits, and for fuller consideration of the consequences of coevolution. For example, the evolution of tolerance can result in changed selection on parasite populations, which should provoke parasite evolution despite the fact that tolerance is not directly antagonistic to parasite fitness. As a result, consideration of the potential for parasite counter-adaptation to host tolerance—whether evolved or medically manipulated—is essential to the emergence of a cohesive theory of biotic partnerships and robust disease control strategies
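To make the feedback on parasite selection concrete, here is a minimal toy model, not the authors' own, that assumes a standard transmission-virulence trade-off and lets host tolerance t discount the mortality cost of virulence; all parameter values are hypothetical.

```python
# Illustrative sketch of how host tolerance can shift selection on parasite
# virulence. Assumes a saturating transmission-virulence trade-off
# beta(v) = b*v/(v + k) and that tolerance t reduces the infection-induced
# mortality to (1 - t)*v. Parameters are arbitrary.
import numpy as np

b, k, mu = 5.0, 1.0, 0.5          # trade-off shape and background mortality (hypothetical)
v = np.linspace(0.01, 10, 2000)   # candidate virulence levels

for t in (0.0, 0.5, 0.9):
    r0 = (b * v / (v + k)) / (mu + (1 - t) * v)   # parasite fitness proxy (R0-like)
    v_opt = v[np.argmax(r0)]
    print(f"tolerance t = {t:.1f}: fitness-maximising virulence ~ {v_opt:.2f}")

# Higher tolerance weakens the mortality cost the parasite pays for virulence,
# so the fitness-maximising virulence rises: tolerance feeds back on parasite
# evolution even though it does not directly reduce parasite density.
```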
Multifractal and entropy analysis of resting-state electroencephalography reveals spatial organization in local dynamic functional connectivity
Functional connectivity of the brain fluctuates even in the resting-state condition. It has recently been reported that fluctuations of global functional network topology, and those of individual connections between brain regions, expressed multifractal scaling. To expand on these findings, in this study we investigated whether multifractality was indeed an inherent property of dynamic functional connectivity (DFC) at the regional level as well. Furthermore, we explored whether local DFC showed region-specific differences in its multifractal and entropy-related features. DFC analyses were performed on 62-channel, resting-state electroencephalography recordings of twelve young, healthy subjects. Surrogate data testing verified the true multifractal nature of regional DFC, which could be attributed to the presumed nonlinear nature of the underlying processes. Moreover, we found a characteristic spatial distribution of local connectivity dynamics, in that frontal and occipital regions showed stronger long-range correlation and a higher degree of multifractality, whereas the highest values of entropy were found over the central and temporal regions. The revealed topology reflected the underlying resting-state network organization of the brain well. The presented results and the proposed analysis framework could improve our understanding of how resting-state brain activity is spatio-temporally organized and may provide potential biomarkers for future clinical research
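Multifractality of a time series is commonly assessed with multifractal detrended fluctuation analysis (MFDFA); the sketch below applies a basic MFDFA to a synthetic signal standing in for a single regional DFC time series, and is not the authors' specific implementation or preprocessing pipeline.

```python
# Basic MFDFA sketch: estimate generalized Hurst exponents h(q).
# A spread of h(q) across q indicates multifractal scaling.
import numpy as np

def mfdfa(signal, scales, qs, order=1):
    """Return generalized Hurst exponents h(q) for each q."""
    profile = np.cumsum(signal - np.mean(signal))
    hq = []
    for q in qs:
        fq = []
        for s in scales:
            n_seg = len(profile) // s
            rms = []
            for i in range(n_seg):
                seg = profile[i * s:(i + 1) * s]
                x = np.arange(s)
                trend = np.polyval(np.polyfit(x, seg, order), x)
                rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
            rms = np.array(rms)
            if q == 0:
                fq.append(np.exp(0.5 * np.mean(np.log(rms ** 2))))
            else:
                fq.append(np.mean(rms ** q) ** (1.0 / q))
        # slope of log F_q(s) versus log s gives h(q)
        hq.append(np.polyfit(np.log(scales), np.log(fq), 1)[0])
    return np.array(hq)

rng = np.random.default_rng(0)
x = rng.standard_normal(2 ** 13)                 # placeholder for a DFC time series
scales = np.array([16, 32, 64, 128, 256, 512])
qs = np.array([-3.0, -1.0, 0.0, 1.0, 3.0])
h = mfdfa(x, scales, qs)
print(dict(zip(qs, np.round(h, 3))))
# White noise should give h(q) close to 0.5 for all q; a clear decrease of h(q)
# with increasing q suggests multifractality (to be confirmed against surrogates).
```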
Determinants of costs and the length of stay in acute coronary syndromes: a real-life analysis of more than 10 000 patients
Aims: The aim of this study was to investigate inpatient costs of acute coronary syndromes (ACS) in Switzerland and to assess the main cost drivers associated with this disease.
Methods and Results: We used the national multicenter registry AMIS (Acute Myocardial Infarction in Switzerland), which includes a representative sample of 65 hospitals and a total of 11 623 patient records. The following cost modules were analyzed: hospital stay, percutaneous coronary interventions (PCI) and thrombolysis. Expenses were assessed using data from official Swiss national statistical sources. Mean total costs per patient were 12 101 Euro (median 10 929 Euro; 95% CI: 1161–27 722 Euro). The length of stay ranged from one to 129 days, with a mean of 9.5 days (median 8.0 days; 95% CI: 1–23). Overall costs were independently influenced by age, gender and existing co-morbidities, e.g. cerebrovascular disease and diabetes (p < 0.0001).
Conclusion: Our study determined specific causes of the high costs associated with hospital treatment in a large, representative sample. The results should highlight unnecessary expenses and help policy makers evaluate the base case for a DRG (Diagnosis Related Groups) scenario in Switzerland. Cost weighting of the identified secondary diagnoses should be considered in the calculation and coding of a primary diagnosis for ACS
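The cost-driver analysis described above can be approximated with a log-linear regression of total cost on age, sex and comorbidities; the sketch below uses simulated placeholder data with invented coefficients, not the AMIS registry.

```python
# Minimal sketch of a cost-driver regression: log(total cost) on age, sex and
# comorbidities. All data and effect sizes are simulated for illustration.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000
df = pd.DataFrame({
    "age": rng.integers(40, 90, n),
    "male": rng.integers(0, 2, n),
    "diabetes": rng.integers(0, 2, n),
    "cerebrovascular": rng.integers(0, 2, n),
})
# Simulated costs with multiplicative effects (illustrative coefficients only).
df["cost"] = np.exp(8.5 + 0.01 * df["age"] + 0.10 * df["male"]
                    + 0.15 * df["diabetes"] + 0.20 * df["cerebrovascular"]
                    + rng.normal(0, 0.4, n))

# In a log-linear model, each coefficient approximates the relative cost
# increase associated with that factor.
model = smf.ols("np.log(cost) ~ age + male + diabetes + cerebrovascular", data=df).fit()
print(model.summary().tables[1])
```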
Pooled analysis of WHO Surgical Safety Checklist use and mortality after emergency laparotomy
Background: The World Health Organization (WHO) Surgical Safety Checklist has fostered safe practice for 10 years, yet its place in emergency surgery has not been assessed on a global scale. The aim of this study was to evaluate reported checklist use in emergency settings and examine the relationship with perioperative mortality in patients who had emergency laparotomy. Methods: In two multinational cohort studies, adults undergoing emergency laparotomy were compared with those having elective gastrointestinal surgery. Relationships between reported checklist use and mortality were determined using multivariable logistic regression and bootstrapped simulation. Results: Of 12 296 patients included from 76 countries, 4843 underwent emergency laparotomy. After adjusting for patient and disease factors, checklist use before emergency laparotomy was more common in countries with a high Human Development Index (HDI) (2455 of 2741, 89⋅6 per cent) compared with that in countries with a middle (753 of 1242, 60⋅6 per cent; odds ratio (OR) 0⋅17, 95 per cent c.i. 0⋅14 to 0⋅21, P < 0⋅001) or low (363 of 860, 42⋅2 per cent; OR 0⋅08, 0⋅07 to 0⋅10, P < 0⋅001) HDI. Checklist use was less common in elective surgery than for emergency laparotomy in high-HDI countries (risk difference −9⋅4 (95 per cent c.i. −11⋅9 to −6⋅9) per cent; P < 0⋅001), but the relationship was reversed in low-HDI countries (+12⋅1 (+7⋅0 to +17⋅3) per cent; P < 0⋅001). In multivariable models, checklist use was associated with a lower 30-day perioperative mortality (OR 0⋅60, 0⋅50 to 0⋅73; P < 0⋅001). The greatest absolute benefit was seen for emergency surgery in low- and middle-HDI countries. Conclusion: Checklist use in emergency laparotomy was associated with a significantly lower perioperative mortality rate. Checklist use in low-HDI countries was half that in high-HDI countries
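The association analysis described above combines multivariable logistic regression with bootstrapping; the sketch below reproduces that pattern on simulated placeholder data with assumed covariates, and is not the study's actual model specification.

```python
# Minimal sketch: adjusted odds ratio for checklist use on 30-day mortality,
# with a simple patient-level bootstrap. Data and covariates are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 5000
df = pd.DataFrame({
    "checklist": rng.integers(0, 2, n),
    "age": rng.integers(18, 90, n),
    "asa3plus": rng.integers(0, 2, n),   # hypothetical severity covariate
})
logit_p = -4 + 0.03 * df["age"] + 1.0 * df["asa3plus"] - 0.5 * df["checklist"]
df["died30"] = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

fit = smf.logit("died30 ~ checklist + age + asa3plus", data=df).fit(disp=0)
print("adjusted OR for checklist use:", np.exp(fit.params["checklist"]))

# Bootstrap the adjusted odds ratio by resampling patients with replacement.
ors = []
for _ in range(200):
    sample = df.sample(n=len(df), replace=True)
    f = smf.logit("died30 ~ checklist + age + asa3plus", data=sample).fit(disp=0)
    ors.append(np.exp(f.params["checklist"]))
print("bootstrap 95% interval:", np.percentile(ors, [2.5, 97.5]))
```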
Surgical site infection after gastrointestinal surgery in high-income, middle-income, and low-income countries: a prospective, international, multicentre cohort study
Background Surgical site infection (SSI) is one of the most common infections associated with health care, but its importance as a global health priority is not fully understood. We quantified the burden of SSI after gastrointestinal surgery in countries in all parts of the world. Methods This international, prospective, multicentre cohort study included consecutive patients undergoing elective or emergency gastrointestinal resection within 2-week time periods at any health-care facility in any country. Countries with participating centres were stratified into high-income, middle-income, and low-income groups according to the UN's Human Development Index (HDI). Data variables from the GlobalSurg 1 study and other studies that have been found to affect the likelihood of SSI were entered into risk adjustment models. The primary outcome measure was the 30-day SSI incidence (defined by US Centers for Disease Control and Prevention criteria for superficial and deep incisional SSI). Relationships with explanatory variables were examined using Bayesian multilevel logistic regression models. This trial is registered with ClinicalTrials.gov, number NCT02662231. Findings Between Jan 4, 2016, and July 31, 2016, 13 265 records were submitted for analysis. 12 539 patients from 343 hospitals in 66 countries were included. 7339 (58·5%) patients were from high-HDI countries (193 hospitals in 30 countries), 3918 (31·2%) patients were from middle-HDI countries (82 hospitals in 18 countries), and 1282 (10·2%) patients were from low-HDI countries (68 hospitals in 18 countries). In total, 1538 (12·3%) patients had SSI within 30 days of surgery. The incidence of SSI varied between countries with high (691 [9·4%] of 7339 patients), middle (549 [14·0%] of 3918 patients), and low (298 [23·2%] of 1282 patients) HDI (p<0·001). The highest SSI incidence in each HDI group was after dirty surgery (102 [17·8%] of 574 patients in high-HDI countries; 74 [31·4%] of 236 patients in middle-HDI countries; 72 [39·8%] of 181 patients in low-HDI countries). Following risk factor adjustment, patients in low-HDI countries were at greatest risk of SSI (adjusted odds ratio 1·60, 95% credible interval 1·05–2·37; p=0·030). 132 (21·6%) of 610 patients with an SSI and a microbiology culture result had an infection that was resistant to the prophylactic antibiotic used. Resistant infections were detected in 49 (16·6%) of 295 patients in high-HDI countries, in 37 (19·8%) of 187 patients in middle-HDI countries, and in 46 (35·9%) of 128 patients in low-HDI countries (p<0·001). Interpretation Countries with a low HDI carry a disproportionately greater burden of SSI than countries with a middle or high HDI and might have higher rates of antibiotic resistance. In view of WHO recommendations on SSI prevention that highlight the absence of high-quality interventional research, urgent, pragmatic, randomised trials based in LMICs are needed to assess measures aiming to reduce this preventable complication. Funding DFID-MRC-Wellcome Trust Joint Global Health Trial Development Grant, National Institute for Health Research Global Health Research Unit Grant
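A Bayesian multilevel logistic regression of the kind described above can be sketched with a hospital-level random intercept and HDI-group effects; the example below uses PyMC on simulated placeholder data, with variable names that are assumptions rather than the study's actual covariate set.

```python
# Minimal sketch of a Bayesian multilevel logistic regression for 30-day SSI
# with hospital random intercepts. All data below are simulated.
import numpy as np
import pymc as pm
import arviz as az

rng = np.random.default_rng(3)
n_hospitals, n_patients = 40, 2000
hospital = rng.integers(0, n_hospitals, n_patients)
hdi_group = rng.integers(0, 3, n_patients)           # 0 = high, 1 = middle, 2 = low
is_mid = (hdi_group == 1).astype(float)
is_low = (hdi_group == 2).astype(float)
true_eta = -2.0 + 0.3 * is_mid + 0.6 * is_low
ssi = rng.binomial(1, 1 / (1 + np.exp(-true_eta)))

with pm.Model() as model:
    intercept = pm.Normal("intercept", 0.0, 2.0)
    beta = pm.Normal("beta", 0.0, 1.0, shape=2)       # middle- and low-HDI effects vs high
    sigma_h = pm.HalfNormal("sigma_h", 1.0)
    a_h = pm.Normal("a_h", 0.0, sigma_h, shape=n_hospitals)   # hospital random intercepts
    eta = intercept + a_h[hospital] + beta[0] * is_mid + beta[1] * is_low
    pm.Bernoulli("ssi", p=pm.math.sigmoid(eta), observed=ssi)
    idata = pm.sample(500, tune=500, chains=2, target_accept=0.9)

# Posterior summaries; exp(beta) gives adjusted odds ratios versus high-HDI.
print(az.summary(idata, var_names=["beta", "sigma_h"]))
```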
Impact of clinical phenotypes on management and outcomes in European atrial fibrillation patients: a report from the ESC-EHRA EURObservational Research Programme in AF (EORP-AF) General Long-Term Registry
Background: Epidemiological studies in atrial fibrillation (AF) illustrate that clinical complexity increases the risk of major adverse outcomes. We aimed to describe the clinical phenotypes of European AF patients and analyse their differential clinical course. Methods: We performed a hierarchical cluster analysis based on Ward's method and squared Euclidean distance using 22 clinical binary variables, identifying the optimal number of clusters. We investigated differences in clinical management, use of healthcare resources and outcomes in a cohort of European AF patients from a Europe-wide observational registry. Results: A total of 9363 patients were available for this analysis. We identified three clusters: Cluster 1 (n = 3634; 38.8%), characterized by older patients and prevalent non-cardiac comorbidities; Cluster 2 (n = 2774; 29.6%), characterized by younger patients with a low prevalence of comorbidities; and Cluster 3 (n = 2955; 31.6%), characterized by prevalent cardiovascular risk factors/comorbidities. Over a mean follow-up of 22.5 months, Cluster 3 had the highest rate of cardiovascular events, all-cause death, and the composite outcome (combining the previous two) compared to Cluster 1 and Cluster 2 (all P < .001). An adjusted Cox regression showed that, compared to Cluster 2, Cluster 3 (hazard ratio (HR) 2.87, 95% confidence interval (CI) 2.27–3.62; HR 3.42, 95% CI 2.72–4.31; HR 2.79, 95% CI 2.32–3.35) and Cluster 1 (HR 1.88, 95% CI 1.48–2.38; HR 2.50, 95% CI 1.98–3.15; HR 2.09, 95% CI 1.74–2.51) showed a higher risk for the three outcomes, respectively. Conclusions: In European AF patients, three main clusters were identified, differentiated by the presence of comorbidities. Both the non-cardiac and the cardiac comorbidity clusters were associated with an increased risk of major adverse outcomes
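The clustering step described above (Ward's method with a squared-Euclidean criterion on binary clinical variables) can be reproduced in a few lines with SciPy; the sketch below uses a simulated placeholder matrix rather than the registry data.

```python
# Minimal sketch of Ward hierarchical clustering on binary clinical variables.
# Rows are patients, columns are 22 binary variables; data are simulated.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(42)
X = rng.integers(0, 2, size=(500, 22)).astype(float)

# Ward linkage minimises the within-cluster variance, i.e. the
# squared-Euclidean criterion used in the study.
Z = linkage(X, method="ward")

# Cut the dendrogram into three clusters, as identified above.
labels = fcluster(Z, t=3, criterion="maxclust")
for c in (1, 2, 3):
    print(f"Cluster {c}: n = {np.sum(labels == c)}")
```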
Clinical Use of Rivaroxaban: Pharmacokinetic and Pharmacodynamic Rationale for Dosing Regimens in Different Indications
Target-specific oral anticoagulants have become increasingly available as alternatives to traditional agents for the management of a number of thromboembolic disorders. To date, the direct Factor Xa inhibitor rivaroxaban is the most widely approved of the new agents. The dosing of rivaroxaban varies and adheres to specific schedules in each of the clinical settings in which it has been investigated. These regimens were devised based on the results of phase II dose-finding studies and/or pharmacokinetic modeling, and were demonstrated to be successful in randomized, phase III studies. In most cases, the pharmacodynamic profile of rivaroxaban permits once-daily dosing. A once-daily dose is indicated for the prevention of venous thromboembolism (VTE) in patients undergoing hip or knee replacement surgery, the long-term prevention of stroke in patients with non-valvular atrial fibrillation, and the long-term secondary prevention of recurrent VTE. Twice-daily dosing is required in the acute phase of treatment in patients with VTE and in the combination of rivaroxaban with standard single or dual antiplatelet therapy for secondary prevention after acute coronary syndrome events. This article reviews the empirical and clinical rationale supporting the dose regimens of rivaroxaban in each clinical setting
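The once-daily versus twice-daily rationale discussed above rests on how dosing frequency shapes peak-to-trough exposure; the sketch below is a generic one-compartment oral-dosing simulation with entirely hypothetical parameters, not a model of rivaroxaban's actual pharmacokinetics.

```python
# One-compartment oral dosing by superposition of single-dose profiles,
# comparing once-daily and twice-daily regimens of the same total daily dose.
# ka, ke, V and F are hypothetical values for illustration only.
import numpy as np

ka, ke, V, F = 1.0, 0.1, 50.0, 0.8   # absorption/elimination rate (1/h), volume (L), bioavailability

def conc(t, dose, tau, n_days=7):
    """Concentration at times t (h) for repeated oral doses every tau hours."""
    n_doses = int(24 * n_days / tau)
    c = np.zeros_like(t, dtype=float)
    for i in range(n_doses):
        dt = t - i * tau
        mask = dt > 0
        c[mask] += (F * dose * ka / (V * (ka - ke))) * (
            np.exp(-ke * dt[mask]) - np.exp(-ka * dt[mask]))
    return c

t = np.linspace(0, 24 * 7, 2000)
once_daily = conc(t, dose=20, tau=24)
twice_daily = conc(t, dose=10, tau=12)    # same total daily dose

last_day = t > 24 * 6                      # approximate steady state
for name, c in (("once daily 20", once_daily), ("twice daily 10", twice_daily)):
    print(f"{name}: Cmax = {c[last_day].max():.2f}, Cmin = {c[last_day].min():.2f} (arbitrary units)")
# Splitting the same daily dose gives a flatter profile (lower peak, higher
# trough), one of the exposure considerations behind regimen choice.
```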
Global challenges for seagrass conservation
Seagrasses, flowering marine plants that form underwater meadows, play a significant global role in supporting food security, mitigating climate change and supporting biodiversity. Although progress is being made to conserve seagrass meadows in select areas, most meadows remain under significant pressure, resulting in a decline in meadow condition and loss of function. Effective management strategies need to be implemented to reverse seagrass loss and enhance their fundamental role in coastal ocean habitats. Here we propose that seagrass meadows globally face a series of significant common challenges that must be addressed from a multifaceted and interdisciplinary perspective in order to achieve global conservation of seagrass meadows. The six main global challenges to seagrass conservation are (1) a lack of awareness of what seagrasses are and a limited societal recognition of the importance of seagrasses in coastal systems; (2) the status of many seagrass meadows is unknown, and up-to-date information on status and condition is essential; (3) understanding threatening activities at local scales is required to target management actions accordingly; (4) expanding our understanding of interactions between the socio-economic and ecological elements of seagrass systems is essential to balance the needs of people and the planet; (5) seagrass research should be expanded to generate scientific inquiries that support conservation actions; (6) increased understanding of the linkages between seagrass and climate change is required to adapt conservation accordingly. We also explicitly outline a series of proposed policy actions that will enable the scientific and conservation community to rise to these challenges. We urge the seagrass conservation community to engage stakeholders from local resource users to international policy-makers to address the challenges outlined here, in order to secure the future of the world's seagrass ecosystems and maintain the vital services which they supply
