39 research outputs found
Antimicrobial Stewardship Training for Infectious Diseases Fellows: Program Directors Identify a Curriculum Need
A needs assessment survey of infectious diseases (ID) training program directors identified gaps in educational resources for training and evaluating ID fellows in antimicrobial stewardship. An Infectious Diseases Society of America-sponsored core curriculum was developed to address that need.
Reflections on Seminole Rock: The Past, Present, and Future of Deference to Agency Regulatory Interpretations
Seminole Rock (or Auer) deference has captured the attention of scholars, policymakers, and the judiciary. That is why Notice & Comment, the blog of the Yale Journal on Regulation and the American Bar Association’s Section of Administrative Law & Regulatory Practice, hosted an online symposium on the subject from September 12 to September 23, 2016. The symposium contains over 20 contributions addressing different aspects of Seminole Rock deference.
Topics include: the history of Seminole Rock; empirical examinations of Seminole Rock; understanding Seminole Rock within agencies; Seminole Rock as applied to tax, environmental law, and criminal sentencing; why Seminole Rock matters; whether the Supreme Court should overrule Seminole Rock; whether overruling Seminole Rock would have unintended consequences; what the Supreme Court might do; what Congress might do; and the future of Seminole Rock.
Evaluation of the Infectious Diseases Society of America’s Core Antimicrobial Stewardship Curriculum for Infectious Diseases Fellows
Background
Antimicrobial stewardship (AS) programs are required by Centers for Medicare and Medicaid Services and should ideally have infectious diseases (ID) physician involvement; however, only 50% of ID fellowship programs have formal AS curricula. The Infectious Diseases Society of America (IDSA) formed a workgroup to develop a core AS curriculum for ID fellows. Here we study its impact.
Methods
ID program directors and fellows in 56 fellowship programs were surveyed regarding the content and effectiveness of their AS training before and after implementation of the IDSA curriculum. Fellows’ knowledge was assessed using multiple-choice questions. Fellows completing their first year of fellowship were surveyed before curriculum implementation (“pre-curriculum”) and compared with first-year fellows who completed the curriculum the following year (“post-curriculum”).
Results
Forty-nine (88%) program directors and 105 (67%) fellows completed the pre-curriculum surveys; 35 (64%) program directors and 79 (50%) fellows completed the post-curriculum surveys. Prior to IDSA curriculum implementation, only 51% of programs had a “formal” curriculum. After implementation, satisfaction with AS training increased among program directors (16% to 68%) and fellows (51% to 68%). Fellows’ confidence increased in 7 of 10 AS content areas. Knowledge scores improved from a mean of 4.6 to 5.1 correct answers out of 9 questions (P = .028). The major hurdle to curriculum implementation was time, both for formal teaching and for e-learning.
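The knowledge-score comparison above is between two independent cohorts (different fellows surveyed in each year), not a paired pre/post test. A minimal sketch of how such a comparison might be run; the abstract does not name the test used, so a Welch two-sample t-test and the simulated scores below are assumptions:

```python
# Hypothetical illustration: comparing knowledge scores between two
# independent fellow cohorts (pre- vs post-curriculum).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated scores out of 9 questions; means mirror the reported
# 4.6 vs 5.1 (the real survey data are not reproduced here).
pre = rng.binomial(n=9, p=4.6 / 9, size=105)   # 105 pre-curriculum fellows
post = rng.binomial(n=9, p=5.1 / 9, size=79)   # 79 post-curriculum fellows

# Welch's t-test: independent samples, unequal variances not assumed away
t, p = stats.ttest_ind(post, pre, equal_var=False)
print(f"mean pre={pre.mean():.2f}, mean post={post.mean():.2f}, p={p:.3f}")
```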
Conclusions
Effective AS training is a critical component of ID fellowship training. The IDSA Core AS Curriculum can enhance AS training, increase fellow confidence, and improve overall satisfaction among fellows and program directors.
Same data, different analysts: variation in effect sizes due to analytical decisions in ecology and evolutionary biology
Although variation in effect sizes and predicted values among studies of similar phenomena is inevitable, such variation far exceeds what might be produced by sampling error alone. One possible explanation for variation among results is differences among researchers in the decisions they make regarding statistical analyses. A growing array of studies has explored this analytical variability in different fields and has found substantial variability among results despite analysts having the same data and research question. Many of these studies have been in the social sciences, but one small "many analyst" study found similar variability in ecology.

We expanded the scope of this prior work by implementing a large-scale empirical exploration of the variation in effect sizes and model predictions generated by the analytical decisions of different researchers in ecology and evolutionary biology. We used two unpublished datasets, one from evolutionary ecology (blue tit, Cyanistes caeruleus, to compare sibling number and nestling growth) and one from conservation ecology (Eucalyptus, to compare grass cover and tree seedling recruitment). The project leaders recruited 174 analyst teams, comprising 246 analysts, to investigate the answers to prespecified research questions. Analyses conducted by these teams yielded 141 usable effects (compatible with our meta-analyses and with all necessary information provided) for the blue tit dataset, and 85 usable effects for the Eucalyptus dataset.

We found substantial heterogeneity among results for both datasets, although the patterns of variation differed between them. For the blue tit analyses, the average effect was convincingly negative, with less growth for nestlings living with more siblings, but there was near continuous variation in effect size from large negative effects to effects near zero, and even effects crossing the traditional threshold of statistical significance in the opposite direction. In contrast, the average relationship between grass cover and Eucalyptus seedling number was only slightly negative and not convincingly different from zero, and most effects ranged from weakly negative to weakly positive, with about a third of effects crossing the traditional threshold of significance in one direction or the other. However, there were also several striking outliers in the Eucalyptus dataset, with effects far from zero.

For both datasets, we found substantial variation in the variable selection and random effects structures among analyses, as well as in the ratings of the analytical methods by peer reviewers, but we found no strong relationship between any of these and deviation from the meta-analytic mean. In other words, analyses with results that were far from the mean were no more or less likely to have dissimilar variable sets, use random effects in their models, or receive poor peer reviews than those analyses that found results that were close to the mean. The existence of substantial variability among analysis outcomes raises important questions about how ecologists and evolutionary biologists should interpret published results, and how they should conduct analyses in the future.
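The core aggregation step in a "many analysts" study of this kind is a random-effects meta-analysis: pool one effect size per team, estimate the between-analysis variance, and measure each team's deviation from the pooled mean. A minimal sketch under stated assumptions; the effect sizes, variances, and the DerSimonian-Laird estimator below are illustrative stand-ins, not the study's actual data or method:

```python
# Pool per-team effect sizes and quantify heterogeneity among analysts.
import numpy as np

yi = np.array([-0.45, -0.30, -0.52, -0.10, 0.05, -0.38])  # per-team effects (e.g. Zr)
vi = np.array([0.02, 0.03, 0.02, 0.04, 0.05, 0.03])       # their sampling variances

# Fixed-effect weights and Cochran's Q (heterogeneity statistic)
w = 1.0 / vi
mu_fe = np.sum(w * yi) / np.sum(w)
Q = np.sum(w * (yi - mu_fe) ** 2)

# DerSimonian-Laird estimate of between-analysis variance tau^2
k = len(yi)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Random-effects pooled mean and each team's deviation from it
w_re = 1.0 / (vi + tau2)
mu_re = np.sum(w_re * yi) / np.sum(w_re)
print(f"tau^2 = {tau2:.3f}, pooled effect = {mu_re:.3f}")
print("deviations from meta-analytic mean:", np.round(yi - mu_re, 3))
```

A tau² well above zero indicates that the spread among teams exceeds what sampling error alone would produce, which is the phenomenon the abstract describes.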
MO487: DYSNATRAEMIA AND ASSOCIATED MORTALITY RISK THRESHOLDS ARE MODIFIED BY KIDNEY FUNCTION IN THE IRISH HEALTH SYSTEM: THE NATIONAL KIDNEY DISEASE SURVEILLANCE SYSTEM (NKDSS)
Abstract
Background and Aims
Dysnatraemia is associated with increased mortality risk in the general population, but it is unclear to what extent kidney function influences this relationship. We investigated the impact of dysnatraemia on total and cardiovascular (CV) mortality while exploring the concurrent impact of chronic kidney disease.
Method
We utilised data from the National Kidney Disease Surveillance System (NKDSS) to explore the association of serum sodium (Na+) (mmol/L) with mortality in a longitudinal cohort study. We identified all adult individuals (age > 18 years) who accessed health care between January 1st, 2007 and December 31st, 2013 in a regional health system, with complete data on serum Na+, associated laboratory indicators, and vital status up to December 31st, 2013 (n = 32,686). Patients receiving dialysis were excluded. The primary exposure was the serum Na+ first recorded during the study period for each patient with a concurrent serum glucose measurement. Chronic kidney disease was defined as eGFR < 60 ml/min/1.73 m² (versus ≥ 60) recorded at the index date. The association of serum Na+ with all-cause (ACM) and CV mortality was explored in categories and as a continuous variable using polynomial splines. Multivariable Cox regression with competing risks was used to determine hazard ratios (HR) and 95% confidence intervals, with adjustment for baseline health indicators.
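The modelling idea described here (serum Na+ entered into a Cox model as a flexible spline, so the hazard can follow a U-shape) can be sketched in a few lines. This is a minimal illustration, not the study's code: the file and column names (`cohort.csv`, `sodium`, `followup_years`, `died`, `age`, `egfr`, `glucose`) are hypothetical, a cubic B-spline basis stands in for the polynomial splines named above, and the competing-risks adjustment is omitted for brevity:

```python
# Cox proportional hazards model with a spline-expanded sodium term.
import pandas as pd
import patsy
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical file: one row per patient

# Expand sodium into a cubic B-spline basis so the hazard can bend
spline = patsy.dmatrix("bs(sodium, df=4) - 1", df, return_type="dataframe")
spline.columns = [f"na_bs{i}" for i in range(spline.shape[1])]

model_df = pd.concat(
    [df[["followup_years", "died", "age", "egfr", "glucose"]], spline], axis=1
)

cph = CoxPHFitter()
cph.fit(model_df, duration_col="followup_years", event_col="died")
cph.print_summary()  # spline coefficients jointly trace the non-linear risk
```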
Results
There were 5,118 deaths (15.7%) over a median follow-up of 5.5 years. In multivariable adjusted models, the association of serum Na+ with all-cause and CV mortality followed a non-linear, U-shaped pattern. For all-cause mortality, the optimal range for greatest survival was 139-146 mmol/L [HR 1.02 (1.00-1.03) and HR 1.19 (1.02-1.38), respectively], while for CV mortality the optimal range was much narrower at 134-143 mmol/L [HR 1.16 (1.02-1.23) and HR 1.09 (1.01-1.89), respectively] (Figure 1). The impact of serum Na+ on mortality was modified by baseline kidney function (P < 0.001 for interaction). In stratified analysis, the impact of serum Na+ on all-cause mortality was greatly attenuated among patients with eGFR < 60 ml/min/1.73 m² compared with those above this threshold. This pattern was replicated in analyses of CV mortality.
Conclusion
This study supports the view that hypernatraemia and hyponatraemia are better tolerated with poorer kidney function. The risk thresholds for mortality were much narrower for CV death than for all-cause death, suggesting these thresholds should be taken into account to inform decision making and therapeutic interventions.
Funding source
Health Research Board (HRB-SDAP-2019-036), Midwest Research and Education Foundation (MKid)
Ultrasound-guided approach to the cervical articular process joints in horses: a validation of the technique in cadavers
Summary
Objectives: To compare accuracy of the ultrasound-guided craniodorsal (CrD) approach with the dorsal (D) approach to the cervical articular process joints, and to evaluate the effect of the transducer, needle gauge, and operator experience.
Methods: Cervical articular process joints from 14 cadaveric neck specimens were injected using either a D or CrD approach, a linear (13 MHz) or microconvex (10 MHz) transducer, and an 18 or 20 gauge needle, by an experienced or inexperienced operator. The injectate consisted of an iodinated contrast material solution. Time taken for injection, number of redirects, and retrieval of synovial fluid were recorded. Accuracy was assessed using a scoring system for contrast seen on computed tomography (CT).
Results: Success rates for intra-articular injection of contrast, as detected by CT, were comparable between the D (61/68) and CrD (57/64) approaches. No significant effect of approach, transducer, or needle gauge was observed on injection accuracy, time taken to perform the injection, or number of redirects. Use of the 18 gauge needle was positively associated with retrieval of synovial fluid. A positive learning curve was observed for the inexperienced operator.
Clinical relevance: Both approaches to the cervical articular process joints were highly accurate. Ultrasound-guided injection of the cervical articular process joints is an easily learnt technique for an inexperienced veterinarian. Either approach may be employed in the field with a high level of accuracy, using widely available equipment.
Vasoactive Intestinal Peptide (VIP) Regulates Activity-Dependent Neuroprotective Protein (ADNP) Expression In Vivo
Prevalence of anaemia, iron, and vitamin deficiencies in the health system in the Republic of Ireland: a retrospective cohort study
Background: Anaemia is a common but treatable condition that predicts adverse clinical outcomes. However, standards of anaemia management vary considerably. Aim: To estimate the prevalence of anaemia and the extent of screening for common underlying causes in the healthcare system in the Republic of Ireland. Design & setting: We conducted a retrospective cohort study of 112,181 adult patients, aged ≥18 years, who had a full blood count performed in 2013, using data from the National Kidney Disease Surveillance System. Method: The prevalence of anaemia was determined across demographic and clinical subgroups, according to World Health Organization (WHO) definitions. The proportion screened for iron, vitamin B12, and folate deficiency was determined within a 3-month follow-up period, and the corresponding prevalence of each deficiency determined. Results: The overall prevalence of anaemia was 12.0% (95% confidence interval [CI] = 11.8% to 12.2%); it was higher in women than in men (13.2% versus 10.5%, P<0.001) and increased with advancing age (particularly >75 years) and worsening kidney function (8.2%, 10.9%, 33.2%, and 63.8% for estimated glomerular filtration rate [eGFR] categories >90, 60–89, 30–59, and <30 ml/min/1.73 m², respectively; P<0.001). After 3 months' follow-up, the proportion screened for iron deficiency was 11.2% based on transferrin saturation and 33.7% using serum ferritin. Screening for folate and B12 deficiency was 17.6% and 19.8%, respectively. Among screened patients, the prevalence of iron, B12, and folate deficiency was 37.0%, 6.3%, and 5.8%, respectively. Conclusion: The burden of anaemia in the healthcare system is substantial, especially for older patients and those with advanced kidney disease. Low screening rates for iron, B12, and folate deficiency are common and warrant quality improvement initiatives.
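The headline prevalence estimate can be checked with simple arithmetic: a normal-approximation 95% confidence interval for a proportion, using only the cohort size and prevalence reported in the abstract. A short sketch; the choice of the normal approximation is an assumption, since the abstract does not state how the CI was computed:

```python
# Reproduce the reported 95% CI for anaemia prevalence from n and p.
import math

n = 112_181   # cohort size from the abstract
p = 0.12      # overall anaemia prevalence (12.0%)

se = math.sqrt(p * (1 - p) / n)          # standard error of a proportion
lo, hi = p - 1.96 * se, p + 1.96 * se    # normal-approximation 95% CI
print(f"prevalence 12.0%, 95% CI {lo:.1%} to {hi:.1%}")  # ~11.8% to 12.2%
```

The result matches the interval quoted above (11.8% to 12.2%), which is what one would expect at this sample size.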
Impact of serum sodium concentrations, and effect modifiers on mortality in the Irish Health System
Abstract
Background
Abnormalities of serum sodium are associated with increased mortality risk in hospitalised patients, but it is unclear whether, and to what extent, other factors influence this relationship. We investigated the impact of dysnatraemia on total and cause-specific mortality in the Irish health system while exploring the concurrent impact of age, kidney function, and designated clinical work-based settings.
Methods
A retrospective cohort study of 32,666 participants was conducted using data from the National Kidney Disease Surveillance System. Hyponatraemia was defined as serum sodium <135 mmol/L and hypernatraemia as >145 mmol/L, with a normal range of 135–145 mmol/L. Multivariable Cox proportional hazards regression was used to estimate hazard ratios (HRs) and 95% confidence intervals (CIs), while penalised spline models further examined patterns of risk.
Results
There were 5,114 deaths (15.7%) over a median follow-up of 5.5 years. Dysnatraemia was present in 8.5% of patients overall. In multivariable analysis, both baseline and time-dependent serum sodium concentrations exhibited a U-shaped association with mortality. Hyponatraemia was significantly associated with increased risk of cardiovascular [HR 1.38 (1.18–1.61)], malignant [HR 2.49 (2.23–2.78)], and non-cardiovascular/non-malignant causes of death [HR 1.36 (1.17–1.58)], while hypernatraemia was significantly associated with cardiovascular [HR 2.16 (1.58–2.96)] and non-cardiovascular/non-malignant deaths [HR 3.60 (2.87–4.52)]. The sodium-mortality relationship was significantly influenced by age, level of kidney function, and the clinical setting at baseline (P < 0.001). For hyponatraemia, relative mortality risks were significantly higher for younger patients (interaction P < 0.001), for patients with better kidney function, and for patients attending general practice [HR 2.70 (2.15–3.36)] compared with other clinical settings. For hypernatraemia, age and kidney function remained significant effect modifiers, with patients attending outpatient departments experiencing greater risk [HR 9.84 (4.88–18.62)] than patients attending other clinical locations. Optimal serum sodium thresholds for mortality varied by level of kidney function, with a flattening of the mortality curve observed for patients with poorer kidney function.
Conclusion
Serum sodium concentrations outside the standard normal range adversely impact mortality and are associated with specific causes of death. The thresholds at which these risks appear vary by age and level of kidney function, and are modified in specific clinical settings within the health system.
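One standard way to obtain cause-specific hazard ratios like those quoted above is to fit a separate Cox model per cause of death, treating deaths from other causes as censoring events. A minimal sketch under stated assumptions; the file, column names, and event coding (`cohort.csv`, `cause`, `followup_years`, exposure indicators) are hypothetical, and this cause-specific-hazards approach is an illustration rather than the authors' exact method:

```python
# Cause-specific Cox models: one fit per cause, other deaths censored.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("cohort.csv")  # hypothetical: 'cause' in {"cv", "malignant", "other", "alive"}
covariates = ["hyponatraemia", "hypernatraemia", "age", "egfr"]

for cause in ["cv", "malignant", "other"]:
    sub = df[covariates].copy()
    sub["T"] = df["followup_years"]
    sub["E"] = (df["cause"] == cause).astype(int)  # deaths from other causes -> censored
    cph = CoxPHFitter().fit(sub, duration_col="T", event_col="E")
    hr = cph.hazard_ratios_
    print(cause, "HR hyponatraemia:", round(hr["hyponatraemia"], 2),
          "HR hypernatraemia:", round(hr["hypernatraemia"], 2))
```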
