
    Development of the (d,n) proton-transfer reaction in inverse kinematics for structure studies

    Transfer reactions have provided exciting opportunities to study the structure of exotic nuclei and are often used to inform studies relating to nucleosynthesis and applications. In order to benefit from these reactions and their application to rare isotope beams (RIBs), it is necessary to develop the tools and techniques to perform and analyze the data from reactions performed in inverse kinematics, that is, with targets of light nuclei and heavier beams. We are continuing to expand the transfer reaction toolbox in preparation for the next generation of facilities, such as the Facility for Rare Isotope Beams (FRIB), which is scheduled for completion in 2022. An important step in this process is to perform the (d,n) reaction in inverse kinematics, with analyses that include Q-value spectra and differential cross sections. In this way, proton-transfer reactions can be placed on the same footing as the more commonly used neutron-transfer reactions, such as (d,p), (9Be,8Be), and (13C,12C). Here we present an overview of the techniques used in (d,p) and (d,n), and some recent data from (d,n) reactions in inverse kinematics using stable beams of 12C and 16O. (9 pages, 4 figures; presented at the XXXV Mazurian Lakes Conference on Physics, Piaski, Poland.)
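    The Q-value spectrum mentioned above is built event by event from the energy and angle of the detected light ejectile using two-body kinematics. Below is a minimal, non-relativistic sketch of that reconstruction for a generic (d,n) measurement in inverse kinematics; the beam energy, neutron hit, and masses used in the example are illustrative placeholders, not values from the paper.

```python
import numpy as np

AMU_MEV = 931.494  # atomic mass unit in MeV/c^2

def q_value(E_beam, E_n, theta_n_deg, m_beam, m_n, m_recoil):
    """Non-relativistic two-body Q value for a (d,n) reaction in inverse
    kinematics, from the lab energy and angle of the outgoing neutron
    (deuteron target at rest).

    E_beam      : kinetic energy of the heavy beam (MeV)
    E_n         : kinetic energy of the detected neutron (MeV)
    theta_n_deg : neutron lab angle relative to the beam axis (degrees)
    m_*         : masses in MeV/c^2
    """
    theta = np.radians(theta_n_deg)
    return (E_n * (1.0 + m_n / m_recoil)
            - E_beam * (1.0 - m_beam / m_recoil)
            - 2.0 * np.sqrt(m_beam * m_n * E_beam * E_n) * np.cos(theta) / m_recoil)

# Illustrative only: a 12C(d,n)13N-like case with made-up beam energy and neutron hit.
m_12C, m_n, m_13N = 12.0 * AMU_MEV, 1.00866 * AMU_MEV, 13.00574 * AMU_MEV
print(q_value(E_beam=60.0, E_n=4.2, theta_n_deg=25.0,
              m_beam=m_12C, m_n=m_n, m_recoil=m_13N))
```

    Histogramming this quantity over many events gives the Q-value spectrum, in which peaks correspond to the states populated in the residual nucleus.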

    Beyond BMI: The “Metabolically healthy obese” phenotype & its association with clinical/subclinical cardiovascular disease and all-cause mortality -- a systematic review

    Background: A subgroup has emerged within the obese that does not display the typical metabolic disorders associated with obesity and is hypothesized to have a lower risk of complications. The purpose of this review was to analyze the literature that has examined the burden of cardiovascular disease (CVD) and all-cause mortality in the metabolically healthy obese (MHO) population. Methods: PubMed, the Cochrane Library, and Web of Science were searched from their inception until December 2012. Studies were included which clearly defined the MHO group (using either insulin sensitivity and/or components of metabolic syndrome AND obesity) and its association with either all-cause mortality, CVD mortality, incident CVD, and/or subclinical CVD. Results: A total of 20 studies were identified; 15 were cohort and 5 were cross-sectional. Eight studies used the NCEP Adult Treatment Panel III definition of metabolic syndrome to define “metabolically healthy”, while another nine used insulin resistance. Seven studies assessed all-cause mortality, seven assessed CVD mortality, and nine assessed incident CVD. MHO was found to be significantly associated with all-cause mortality in two studies (30%), CVD mortality in one study (14%), and incident CVD in three studies (33%). Of the six studies which examined subclinical disease, four (67%) showed significantly higher mean common carotid artery intima-media thickness (CCA-IMT), coronary artery calcium (CAC), or other subclinical CVD markers in the MHO as compared to their metabolically healthy normal-weight (MHNW) counterparts. Conclusions: MHO is an important, emerging phenotype with a CVD risk between that of healthy, normal-weight and unhealthy, obese individuals. Successful work towards a universally accepted definition of MHO would improve (and simplify) future studies and aid inter-study comparisons. The usefulness of a definition that includes insulin sensitivity and stricter criteria for metabolic syndrome components, as well as the potential addition of markers of fatty liver and inflammation, should be explored. Clinicians should be hesitant to reassure patients that the metabolically benign phenotype is safe, as increased risk of cardiovascular disease and death has been shown.

    eLearning resources to supplement postgraduate neurosurgery training.

    BACKGROUND: In an increasingly complex and competitive professional environment, improving methods to educate neurosurgical residents is key to ensuring high-quality patient care. Electronic (e)Learning resources promise interactive knowledge acquisition. We set out to give a comprehensive overview of available eLearning resources that aim to improve postgraduate neurosurgical training and to review the available literature. MATERIAL AND METHODS: A MEDLINE query was performed using the search term "electronic AND learning AND neurosurgery". Only peer-reviewed English-language articles on the use of any means of eLearning to improve theoretical knowledge in postgraduate neurosurgical training were included. Reference lists were cross-checked for further relevant articles. Captured parameters were the year, country of origin, method of eLearning reported, and type of article, as well as its conclusion. eLearning resources were additionally searched for using Google. RESULTS: Of the n = 301 articles identified by the MEDLINE search, n = 43 were analysed in detail. Applying defined criteria, n = 28 articles were excluded and n = 15 included. Most articles were generated within this decade, with groups from the USA, the UK and India having a leadership role. The majority of articles reviewed existing eLearning resources; others reported on the concept, development and use of newly generated eLearning resources. No article scientifically assessed the effectiveness of eLearning resources (against traditional learning methods) in terms of efficacy or costs. Only one article reported on satisfaction rates with an eLearning tool. All authors of articles dealing with eLearning and the use of new media in neurosurgery uniformly agreed on its great potential and increasing future use, but most also highlighted some weaknesses and possible dangers. CONCLUSION: This review found only a few articles dealing with the modern aspects of eLearning as an adjunct to postgraduate neurosurgery training. Comprehensive eLearning platforms offering didactic modules with clear learning objectives are rare. Two decades after the rise of eLearning in neurosurgery, some promising solutions are readily available, but the potential of eLearning has not yet been sufficiently exploited.

    Obtaining high resolution excitation functions with an active thick-target approach and validating them with mirror nuclei

    Measurement of fusion excitation functions for stable nuclei has largely been restricted to nuclei with significant natural abundance. Typically, investigating neighboring nuclei with low natural abundance has required obtaining isotopically enriched material. This restriction often limits the ability to perform such measurements. We report the measurement of a high-quality fusion excitation function for a 17O beam produced from unenriched material with 0.038% natural abundance. The measurement is enabled by using an active thick-target approach, and the accuracy of the result is validated using its mirror nucleus, 17F, and resonances. The result provides important information about the average fusion cross-section for the oxygen isotopic chain as a function of neutron excess. (4 pages, 4 figures.)
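    In an active thick-target measurement the beam slows down continuously inside the detection medium, so each reaction depth corresponds to a different beam energy and the whole excitation function is swept in a single run. The sketch below shows the standard thick-target unfolding, sigma(E) = (dY/dE) * (dE/dx) / (N_beam * n_target), under assumed inputs (yield histogram, stopping power, target density); it illustrates the general technique only and is not the analysis code of the paper.

```python
import numpy as np

BARN_PER_CM2 = 1e24  # 1 barn = 1e-24 cm^2

def excitation_function(counts_per_bin, bin_centers_mev, bin_width_mev,
                        dEdx_mev_per_cm, n_target_per_cm3, n_beam):
    """Convert an active thick-target yield histogram into sigma(E).

    The probability of fusing in a path element dx is sigma * n * dx, and
    dx = dE / (dE/dx), so the differential yield per unit beam energy is
        dY/dE = N_beam * sigma(E) * n / (dE/dx)
    which inverts to
        sigma(E) = (dY/dE) * (dE/dx) / (N_beam * n).

    counts_per_bin   : fusion events per bin of reconstructed beam energy
    bin_centers_mev  : beam energy at the reaction point for each bin (MeV)
    bin_width_mev    : width of the energy bins (MeV)
    dEdx_mev_per_cm  : stopping power of the beam in the target gas per bin (MeV/cm)
    n_target_per_cm3 : number density of target atoms (cm^-3)
    n_beam           : total number of incident beam particles
    """
    dY_dE = np.asarray(counts_per_bin, dtype=float) / bin_width_mev
    sigma_cm2 = dY_dE * np.asarray(dEdx_mev_per_cm) / (n_beam * n_target_per_cm3)
    return sigma_cm2 * BARN_PER_CM2  # cross-section in barns
```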

    Assessment of atherosclerotic plaque burden: comparison of AI-QCT versus SIS, CAC, visual and CAD-RADS stenosis categories

    This study assesses the agreement of Artificial Intelligence-Quantitative Computed Tomography (AI-QCT) with qualitative approaches to atherosclerotic disease burden codified in the multisociety 2022 CAD-RADS 2.0 Expert Consensus. A total of 105 patients who underwent cardiac computed tomography angiography (CCTA) for chest pain were evaluated by a blinded core laboratory using FDA-cleared software (Cleerly, Denver, CO) that performs AI-QCT, analyzing factors such as percent (%) stenosis, plaque volume, and plaque composition. AI-QCT plaque volume was then staged by recently validated prognostic thresholds and compared with CAD-RADS 2.0 clinical methods of plaque evaluation (segment involvement score (SIS), coronary artery calcium score (CACS), visual assessment, and CAD-RADS percent (%) stenosis) by expert consensus blinded to the AI-QCT core lab reads. The average age of subjects was 59 ± 11 years, and 44% were women; 50% of patients were at CAD-RADS 1–2 and 21% at CAD-RADS 3 and above by expert consensus. AI-QCT quantitative plaque burden staging had excellent agreement of 93% (k = 0.87, 95% CI: 0.79–0.96) with SIS. There was moderate agreement between AI-QCT quantitative plaque volume and categories of visual assessment (64.4%; k = 0.488 [0.38–0.60]) and CACS (66.3%; k = 0.488 [0.36–0.61]). Agreement between AI-QCT plaque volume stage and CAD-RADS % stenosis category was also moderate. There was discordance at small plaque volumes. With ongoing validation, these results demonstrate the potential of AI-QCT as a rapid, reproducible approach to quantifying total plaque burden.
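    Agreement figures of the kind quoted above (percent agreement plus Cohen's kappa) can be computed from two columns of per-patient stage assignments. A minimal sketch using scikit-learn follows; the stage labels and example values are invented for illustration and are not data from the study.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical per-patient stage assignments (e.g. 0 = none, 1 = mild,
# 2 = moderate, 3 = severe); not data from the study.
ai_qct_stage = np.array([0, 1, 1, 2, 3, 0, 2, 1, 0, 3])
sis_stage    = np.array([0, 1, 2, 2, 3, 0, 2, 1, 0, 3])

percent_agreement = np.mean(ai_qct_stage == sis_stage) * 100
kappa = cohen_kappa_score(ai_qct_stage, sis_stage)              # chance-corrected
weighted_kappa = cohen_kappa_score(ai_qct_stage, sis_stage,
                                   weights="quadratic")         # suited to ordinal scales

print(f"agreement = {percent_agreement:.1f}%, kappa = {kappa:.2f}, "
      f"weighted kappa = {weighted_kappa:.2f}")
```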

    Contextual and Granular Policy Enforcement in Database-backed Applications

    Database-backed applications rely on inlined policy checks to process users' private and confidential data in a policy-compliant manner, as traditional database access control mechanisms cannot enforce complex policies. However, application bugs due to missed checks are common in such applications and result in data breaches. While separating policy from code is a natural solution, many data protection policies specify restrictions based on the context in which data is accessed and how the data is used. Enforcing these restrictions automatically presents significant challenges, as the information needed to determine context requires a tight coupling between policy enforcement and an application's implementation. We present Estrela, a framework for enforcing contextual and granular data access policies. Working from the observation that API endpoints can be associated with salient contextual information in most database-backed applications, Estrela allows developers to specify API-specific restrictions on data access and use. Estrela provides a clean separation between policy specification and the application's implementation, which facilitates easier auditing and maintenance of policies. Policies in Estrela consist of pre-evaluation and post-evaluation conditions, which provide the means to modulate database access before a query is issued, and to impose finer-grained constraints on information release after the evaluation of a query, respectively. We build a prototype of Estrela and apply it to retrofit several real-world applications (from 1,000 to 80,000 LOC) to enforce different contextual policies. Our evaluation shows that Estrela can enforce policies with minimal overheads.
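    The split between pre-evaluation conditions (modulating the query before it is issued) and post-evaluation conditions (constraining what is released after the query runs) can be illustrated with a small, framework-agnostic sketch. This is not Estrela's actual API; the policy object, endpoint function, and data model below are hypothetical and only mirror the general structure described in the abstract.

```python
# Hypothetical sketch of endpoint-scoped pre-/post-evaluation policies;
# not the actual Estrela implementation or API.
from dataclasses import dataclass
from typing import Callable, Iterable, Optional

@dataclass
class Record:
    owner: str
    is_public: bool
    body: str

@dataclass
class Policy:
    # Runs before the query: may tighten the filter that reaches the database.
    pre: Callable[[dict, str], dict]
    # Runs after the query: decides, per row, what the caller may see.
    post: Callable[[Record, str], Optional[Record]]

def notes_endpoint_policy() -> Policy:
    def pre(query_filter: dict, user: str) -> dict:
        # Contextual restriction for this endpoint: non-admins only query their own rows.
        if user != "admin":
            query_filter = {**query_filter, "owner": user}
        return query_filter

    def post(row: Record, user: str) -> Optional[Record]:
        # Granular restriction on release: hide private rows belonging to other users.
        if row.owner != user and not row.is_public:
            return None
        return row

    return Policy(pre, post)

def run_endpoint(db: Iterable[Record], user: str, query_filter: dict,
                 policy: Policy) -> list:
    query_filter = policy.pre(query_filter, user)            # pre-evaluation condition
    rows = [r for r in db                                     # stand-in for the real query
            if all(getattr(r, k) == v for k, v in query_filter.items())]
    return [r for r in (policy.post(r, user) for r in rows) if r is not None]

db = [Record("alice", False, "private note"), Record("bob", True, "shared note")]
print(run_endpoint(db, user="alice", query_filter={}, policy=notes_endpoint_policy()))
```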