All-cause and cause-specific mortality in individuals with zero and minimal coronary artery calcium: A long-term, competing risk analysis in the Coronary Artery Calcium Consortium.
Background and aims: The long-term associations between zero or minimal coronary artery calcium (CAC) and cause-specific mortality are currently unknown, particularly after accounting for competing risks with other causes of death. Methods: We evaluated 66,363 individuals from the CAC Consortium (mean age 54 years, 33% women), a multi-center, retrospective cohort study of asymptomatic individuals undergoing CAC scoring for clinical risk assessment. Baseline evaluations occurred between 1991 and 2010. Results: Over a mean of 12 years of follow-up, individuals with CAC = 0 (45% prevalence, mean age 45 years) had stable, low rates of coronary heart disease (CHD) death and cardiovascular disease (CVD) death (ranging from 0.32 to 0.43 per 1000 person-years), and of all-cause death (1.38-1.62 per 1000 person-years). Cancer was the predominant cause of death in this group, yet rates were also very low (0.47-0.79 per 1000 person-years). Compared to CAC = 0, individuals with CAC 1-10 had an increased multivariable-adjusted risk of CVD death only under age 40. Individuals with CAC > 10 had multivariable-adjusted increased risks of CHD death, CVD death, and all-cause death at all ages, and a higher proportion of CVD deaths. Conclusions: CAC = 0 is a frequent finding among individuals undergoing CAC scanning for risk assessment and is associated with low rates of all-cause death over 12 years of follow-up. Our results support the emerging consensus that CAC = 0 identifies a unique population with a favorable all-cause prognosis that may be considered for more flexible treatment goals in primary prevention. Detection of any CAC in young adults could be used to trigger aggressive preventive interventions.
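As a rough illustration of the quantities this abstract reports (crude death rates per 1000 person-years, and cause-specific risk in the presence of competing causes of death), the following is a minimal Python sketch using the lifelines library; the toy data and column names are invented, not CAC Consortium data.

# Sketch: crude mortality rate per 1000 person-years and cause-specific
# cumulative incidence under competing risks (Aalen-Johansen estimator).
# Toy data; column names are hypothetical, not from the CAC Consortium.
import pandas as pd
from lifelines import AalenJohansenFitter

df = pd.DataFrame({
    "years_followed": [12.0, 8.5, 11.2, 9.7, 4.3, 10.6],
    # 0 = censored, 1 = CVD death, 2 = cancer death (competing cause)
    "event":          [0,    1,   0,    2,   1,   0],
})

# Crude rate per 1000 person-years for CVD death (event == 1)
person_years = df["years_followed"].sum()
cvd_rate = 1000 * (df["event"] == 1).sum() / person_years
print(f"CVD death rate: {cvd_rate:.2f} per 1000 person-years")

# Cumulative incidence of CVD death, treating cancer death as a
# competing risk rather than censoring it
ajf = AalenJohansenFitter()
ajf.fit(df["years_followed"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail(1))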
Development of the (d,n) proton-transfer reaction in inverse kinematics for structure studies
Transfer reactions have provided exciting opportunities to study the structure of exotic nuclei and are often used to inform studies relating to nucleosynthesis and applications. In order to benefit from these reactions and their application to rare ion beams (RIBs), it is necessary to develop the tools and techniques to perform, and to analyze the data from, reactions performed in inverse kinematics, that is, with targets of light nuclei and heavier beams. We are continuing to expand the transfer reaction toolbox in preparation for the next generation of facilities, such as the Facility for Rare Isotope Beams (FRIB), which is scheduled for completion in 2022. An important step in this process is to perform the (d,n) reaction in inverse kinematics, with analyses that include Q-value spectra and differential cross sections. In this way, proton-transfer reactions can be placed on the same level as the more commonly used neutron-transfer reactions, such as (d,p), (9Be,8Be), and (13C,12C). Here we present an overview of the techniques used in (d,p) and (d,n), and some recent data from (d,n) reactions in inverse kinematics using stable beams of 12C and 16O. Comment: 9 pages, 4 figures, presented at the XXXV Mazurian Lakes Conference on Physics, Piaski, Poland.
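The Q-value spectra mentioned above follow from standard two-body reaction kinematics. As a hedged illustration (not the authors' analysis code), the nonrelativistic Q-value for a reaction 1 + 2 -> 3 + 4, with the light ejectile 3 detected at lab angle theta, can be evaluated as below; the beam energy, neutron energy, and masses in the example call are invented.

# Sketch: nonrelativistic Q-value for a two-body reaction 1 + 2 -> 3 + 4,
# with the light ejectile (3) detected at lab angle theta3. For (d,n) in
# inverse kinematics: 1 = heavy beam, 2 = deuteron target, 3 = neutron.
import math

def q_value(T1, T3, theta3_deg, m1, m3, m4):
    """Q = T3(1 + m3/m4) - T1(1 - m1/m4) - 2*sqrt(m1*m3*T1*T3)*cos(theta3)/m4
    Energies in MeV; masses in u (only mass ratios enter)."""
    c = math.cos(math.radians(theta3_deg))
    return (T3 * (1 + m3 / m4)
            - T1 * (1 - m1 / m4)
            - 2.0 * math.sqrt(m1 * m3 * T1 * T3) * c / m4)

# e.g. a 12C beam on a deuteron target, neutron detected at 30 degrees
print(q_value(T1=60.0, T3=5.0, theta3_deg=30.0, m1=12.0, m3=1.0, m4=13.0))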
Feasibility study of electrocardiographic- and respiratory-gated, gadolinium-enhanced magnetic resonance angiography of pulmonary veins and the impact of heart rate and rhythm on study quality
Background: We aimed to assess the feasibility of 3-dimensional (3D), respiratory- and ECG-gated, gadolinium-enhanced magnetic resonance angiography (MRA) on a 3 Tesla (3 T) scanner for imaging the pulmonary veins (PV) and left atrium (LA). The impact of heart rate (HR) and the rhythm irregularity associated with atrial fibrillation (AF) on image and segmentation quality was also assessed. Methods: 101 consecutive patients underwent respiratory- and ECG-gated (ventricular end-systolic window) MRA for pre-AF-ablation imaging. Image quality (assessed by PV delineation) was scored as 1 = not visualized, 2 = poor, 3 = good, and 4 = excellent. Segmentation quality was scored on a similar 4-point scale. Signal-to-noise ratios (SNRs) were calculated for the LA, LA appendage (LAA), and PV. Contrast-to-noise ratios (CNRs) were calculated between myocardium and the LA, LAA, and PV, respectively. Associations between HR/rhythm and quality metrics were assessed. Results: 35 of 101 (34.7%) patients were in AF at the time of MRA. 100 (99%) patients had diagnostic studies, and 91 (90.1%) were of good or excellent quality. The overall mean ± standard deviation (SD) image quality score was 3.40 ± 0.69. Inter-observer agreement for image quality scores was substantial (kappa = 0.68; 95% confidence interval (CI): 0.46, 0.90). Neither HR adjusting for rhythm [odds ratio (OR) = 1.03, 95% CI = 0.98, 1.09; p = 0.22] nor rhythm adjusting for HR [OR = 1.25, 95% CI = 0.20, 7.69; p = 0.81] demonstrated an association with image quality. Similarly, SNRs and CNRs were largely independent of HR after adjusting for rhythm. Segmentation quality scores were good or excellent for 77.3% of patients (mean ± SD score = 2.91 ± 0.63), and scores did not significantly differ by baseline rhythm (p = 0.78). Conclusions: 3D respiratory- and ECG-gated, gadolinium-enhanced MRA of the PVs and LA on a 3 T system is feasible during ventricular end systole, achieving high image quality and high-quality image segmentation when imported into electroanatomic mapping systems. Quality is independent of HR and heart rhythm for this free-breathing, radiation-free alternative to current MRA- or CT-based approaches for pre-AF-ablation imaging of the PVs and LA.
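For context, the SNR and CNR metrics used in this study follow the usual region-of-interest (ROI) definitions: mean signal over noise standard deviation, and difference of mean signals over noise standard deviation. A minimal Python sketch with simulated ROI values, not study data:

# Sketch: SNR and CNR from ROI statistics. The ROI samples below are
# simulated stand-ins, not measurements from this study.
import numpy as np

def snr(roi_signal, roi_noise):
    return np.mean(roi_signal) / np.std(roi_noise)

def cnr(roi_a, roi_b, roi_noise):
    return (np.mean(roi_a) - np.mean(roi_b)) / np.std(roi_noise)

rng = np.random.default_rng(0)
la_roi = rng.normal(900, 40, 200)   # left atrium voxels (toy values)
myo_roi = rng.normal(400, 40, 200)  # myocardium voxels (toy values)
air_roi = rng.normal(0, 20, 200)    # background noise ROI (toy values)

print(f"SNR(LA) = {snr(la_roi, air_roi):.1f}")
print(f"CNR(LA vs myocardium) = {cnr(la_roi, myo_roi, air_roi):.1f}")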
Beyond BMI: The “Metabolically healthy obese” phenotype & its association with clinical/subclinical cardiovascular disease and all-cause mortality -- a systematic review
Background: A subgroup has emerged within the obese that does not display the typical metabolic disorders associated with obesity and is hypothesized to have a lower risk of complications. The purpose of this review was to analyze the literature that has examined the burden of cardiovascular disease (CVD) and all-cause mortality in the metabolically healthy obese (MHO) population. Methods: PubMed, the Cochrane Library, and Web of Science were searched from their inception until December 2012. Studies were included if they clearly defined the MHO group (using either insulin sensitivity and/or components of the metabolic syndrome AND obesity) and its association with all-cause mortality, CVD mortality, incident CVD, and/or subclinical CVD. Results: A total of 20 studies were identified; 15 were cohort and 5 were cross-sectional. Eight studies used the NCEP Adult Treatment Panel III definition of the metabolic syndrome to define “metabolically healthy”, while another nine used insulin resistance. Seven studies assessed all-cause mortality, seven assessed CVD mortality, and nine assessed incident CVD. MHO was found to be significantly associated with all-cause mortality in two studies (30%), CVD mortality in one study (14%), and incident CVD in three studies (33%). Of the six studies that examined subclinical disease, four (67%) showed significantly higher mean common carotid artery intima-media thickness (CCA-IMT), coronary artery calcium (CAC), or other subclinical CVD markers in the MHO compared to their metabolically healthy, normal-weight (MHNW) counterparts. Conclusions: MHO is an important, emerging phenotype with a CVD risk between those of healthy, normal-weight and unhealthy, obese individuals. Successful work towards a universally accepted definition of MHO would improve (and simplify) future studies and aid inter-study comparisons. The usefulness of a definition inclusive of insulin sensitivity and stricter criteria for metabolic syndrome components, as well as the potential addition of markers of fatty liver and inflammation, should be explored. Clinicians should be hesitant to reassure patients that the metabolically benign phenotype is safe, as increased risks of cardiovascular disease and death have been shown.
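Because the review stresses that MHO definitions vary across studies, a concrete example may help. The Python sketch below encodes one common ATP III-based variant (obesity plus fewer than two non-waist metabolic syndrome components); the variant choice and every threshold should be read as assumptions for illustration, not as the review's single definition.

# Sketch: one hypothetical ATP III-based MHO classification. Thresholds
# follow commonly cited ATP III cut-points, but the "healthy" rule itself
# (fewer than two non-waist components) is only one of several variants.
def non_waist_components(tg_mgdl, hdl_mgdl, sbp, dbp, glucose_mgdl, female):
    return sum([
        tg_mgdl >= 150,                       # elevated triglycerides
        hdl_mgdl < (50 if female else 40),    # low HDL, sex-specific
        sbp >= 130 or dbp >= 85,              # elevated blood pressure
        glucose_mgdl >= 100,                  # elevated fasting glucose
    ])

def is_mho(bmi, **components):
    # obese (BMI >= 30) yet "metabolically healthy" under this variant
    return bmi >= 30 and non_waist_components(**components) < 2

print(is_mho(bmi=32, tg_mgdl=120, hdl_mgdl=55,
             sbp=118, dbp=76, glucose_mgdl=92, female=True))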
Self-rated health is associated with the length of stay at the intensive care unit and hospital following cardiac surgery
eLearning resources to supplement postgraduate neurosurgery training.
BACKGROUND: In an increasingly complex and competitive professional environment, improving methods to educate neurosurgical residents is key to ensuring high-quality patient care. Electronic (e)Learning resources promise interactive knowledge acquisition. We set out to give a comprehensive overview of available eLearning resources that aim to improve postgraduate neurosurgical training and to review the available literature. MATERIAL AND METHODS: A MEDLINE query was performed using the search term "electronic AND learning AND neurosurgery". Only peer-reviewed English-language articles on the use of any means of eLearning to improve theoretical knowledge in postgraduate neurosurgical training were included. Reference lists were cross-checked for further relevant articles. Captured parameters were the year, country of origin, method of eLearning reported, and type of article, as well as its conclusion. eLearning resources were additionally searched for using Google. RESULTS: Of the n = 301 articles identified by the MEDLINE search, n = 43 were analysed in detail. Applying defined criteria, n = 28 articles were excluded and n = 15 included. Most articles were generated within this decade, with groups from the USA, the UK, and India having a leadership role. The majority of articles reviewed existing eLearning resources; others reported on the concept, development, and use of newly generated eLearning resources. No article scientifically assessed the effectiveness of eLearning resources (against traditional learning methods) in terms of efficacy or costs. Only one article reported on satisfaction rates with an eLearning tool. All authors of articles dealing with eLearning and the use of new media in neurosurgery uniformly agreed on its great potential and increasing future use, but most also highlighted some weaknesses and possible dangers. CONCLUSION: This review found only a few articles dealing with the modern aspects of eLearning as an adjunct to postgraduate neurosurgery training. Comprehensive eLearning platforms offering didactic modules with clear learning objectives are rare. Two decades after the rise of eLearning in neurosurgery, some promising solutions are readily available, but the potential of eLearning has not yet been sufficiently exploited.
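The MEDLINE query itself can be reproduced programmatically. A minimal sketch using Biopython's Entrez E-utilities wrapper, assuming only the search term reported above (the email address is a placeholder that NCBI requires you replace with your own):

# Sketch: reproducing the reported MEDLINE query via Biopython's Entrez
# wrapper. Result counts will differ from the review's n = 301, since
# PubMed content has grown since the original search.
from Bio import Entrez

Entrez.email = "you@example.org"  # placeholder; NCBI requires a real address
handle = Entrez.esearch(db="pubmed",
                        term="electronic AND learning AND neurosurgery",
                        retmax=500)
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} articles matched")
for pmid in record["IdList"][:5]:
    print(pmid)  # first few PubMed IDs for screening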
Obtaining high resolution excitation functions with an active thick-target approach and validating them with mirror nuclei
Measurement of fusion excitation functions for stable nuclei has largely been restricted to nuclei with significant natural abundance. Typically, investigating neighboring nuclei with low natural abundance has required obtaining isotopically enriched material. This restriction often limits the ability to perform such measurements. We report the measurement of a high-quality fusion excitation function for a 17O beam produced from unenriched material with 0.038% natural abundance. The measurement is enabled by using an active thick-target approach, and the accuracy of the result is validated using known resonances in its mirror nucleus 17F. The result provides important information about the average fusion cross-section for the oxygen isotopic chain as a function of neutron excess. Comment: 4 pages, 4 figures.
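The essence of the active thick-target technique is that the beam slows continuously inside the detector medium, so each interaction depth samples a different beam energy and a single run maps out a whole excitation function. A simplified Python sketch of that depth-to-energy mapping, with an invented power-law stopping power standing in for real energy-loss (e.g. SRIM) tables:

# Sketch: map interaction depth to beam energy by stepping dE/dx through
# the detector gas. The stopping power here is a toy power law, not data.
import numpy as np

def beam_energy_vs_depth(E0_mev, depth_cm, dedx):
    """Integrate dE/dx in small steps so depth -> beam energy."""
    energies = [E0_mev]
    for dx in np.diff(depth_cm):
        E = energies[-1] - dedx(energies[-1]) * dx
        energies.append(max(E, 1e-3))  # clamp so the toy dE/dx stays defined
    return np.array(energies)

def toy_dedx(E):
    # illustrative power-law stopping power in MeV/cm; not real SRIM output
    return 0.8 * E ** -0.7 + 0.05

depth = np.linspace(0.0, 50.0, 501)  # cm of detector gas (toy geometry)
E_of_x = beam_energy_vs_depth(50.0, depth, toy_dedx)
print(f"E at entrance: {E_of_x[0]:.1f} MeV, at 50 cm: {E_of_x[-1]:.1f} MeV")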
Assessment of atherosclerotic plaque burden: comparison of AI-QCT versus SIS, CAC, visual and CAD-RADS stenosis categories
This study assesses the agreement of Artificial Intelligence-Quantitative Computed Tomography (AI-QCT) with qualitative approaches to atherosclerotic disease burden codified in the multisociety 2022 CAD-RADS 2.0 Expert Consensus. 105 patients who underwent cardiac computed tomography angiography (CCTA) for chest pain were evaluated by a blinded core laboratory using FDA-cleared software (Cleerly, Denver, CO) that performs AI-QCT, analyzing factors such as % stenosis, plaque volume, and plaque composition. AI-QCT plaque volume was then staged by recently validated prognostic thresholds and compared with CAD-RADS 2.0 clinical methods of plaque evaluation (segment involvement score (SIS), coronary artery calcium score (CACS), visual assessment, and CAD-RADS percent (%) stenosis) by expert consensus blinded to the AI-QCT core lab reads. The average age of subjects was 59 ± 11 years; 44% were women, with 50% of patients at CAD-RADS 1–2 and 21% at CAD-RADS 3 and above by expert consensus. AI-QCT quantitative plaque burden staging had excellent agreement of 93% (k = 0.87; 95% CI: 0.79–0.96) with SIS. There was moderate agreement between AI-QCT quantitative plaque volume and categories of visual assessment (64.4%; k = 0.488 [0.38–0.60]) and CACS (66.3%; k = 0.488 [0.36–0.61]). Agreement between AI-QCT plaque volume stage and CAD-RADS % stenosis category was also moderate, with discordance at small plaque volumes. With ongoing validation, these results demonstrate the potential of AI-QCT as a rapid, reproducible approach to quantifying total plaque burden.
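The agreement statistics quoted above (percent agreement and kappa) are straightforward to compute. A minimal Python sketch with invented stage labels, using scikit-learn's unweighted Cohen's kappa (the study's exact weighting scheme is not stated in the abstract):

# Sketch: percent agreement and Cohen's kappa between two ratings of the
# same patients' plaque-burden stage (0-3). Labels below are toy data.
import numpy as np
from sklearn.metrics import cohen_kappa_score

ai_qct_stage = np.array([0, 1, 1, 2, 3, 2, 0, 1, 2, 3])
sis_stage    = np.array([0, 1, 2, 2, 3, 2, 0, 1, 1, 3])

agreement = np.mean(ai_qct_stage == sis_stage)
kappa = cohen_kappa_score(ai_qct_stage, sis_stage)
print(f"agreement = {agreement:.0%}, kappa = {kappa:.2f}")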
Contextual and Granular Policy Enforcement in Database-backed Applications
Database-backed applications rely on inlined policy checks to process users' private and confidential data in a policy-compliant manner, as traditional database access control mechanisms cannot enforce complex policies. However, application bugs due to missed checks are common in such applications and result in data breaches. While separating policy from code is a natural solution, many data protection policies specify restrictions based on the context in which data is accessed and how the data is used. Enforcing these restrictions automatically presents significant challenges, as the information needed to determine context requires a tight coupling between policy enforcement and an application's implementation. We present Estrela, a framework for enforcing contextual and granular data access policies. Working from the observation that API endpoints can be associated with salient contextual information in most database-backed applications, Estrela allows developers to specify API-specific restrictions on data access and use. Estrela provides a clean separation between policy specification and the application's implementation, which facilitates easier auditing and maintenance of policies. Policies in Estrela consist of pre-evaluation and post-evaluation conditions, which provide the means to modulate database access before a query is issued and to impose finer-grained constraints on information release after query evaluation, respectively. We build a prototype of Estrela and apply it to retrofit several real-world applications (ranging from 1,000 to 80,000 LOC) to enforce different contextual policies. Our evaluation shows that Estrela can enforce policies with minimal overhead.
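The abstract does not show Estrela's concrete policy syntax, but the pre-/post-evaluation structure it describes can be sketched as a hypothetical endpoint decorator in Python; every name below (policy, redact_emails, list_users) is invented for illustration and is not Estrela's API.

# Sketch: endpoint-attached policies with a pre-evaluation condition
# (gates database access before the query) and a post-evaluation condition
# (constrains what is released afterwards). All names are hypothetical.
def policy(pre=None, post=None):
    def wrap(endpoint):
        def guarded(user, *args, **kwargs):
            if pre and not pre(user):             # modulate access
                raise PermissionError("blocked")  # before the query runs
            rows = endpoint(user, *args, **kwargs)
            return post(user, rows) if post else rows  # constrain release
        return guarded
    return wrap

def redact_emails(user, rows):
    # post-evaluation condition: finer-grained constraint on release
    return [{**r, "email": r["email"] if user["is_admin"] else "hidden"}
            for r in rows]

@policy(pre=lambda u: u["logged_in"], post=redact_emails)
def list_users(user):
    return [{"name": "ada", "email": "ada@example.org"}]  # stand-in query

print(list_users({"logged_in": True, "is_admin": False}))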