531 research outputs found

    Occupational cooling practices of emergency first responders in the United States: A survey

    This is an accepted manuscript of an article published by Taylor & Francis in Temperature on 29/07/2018, available online: https://doi.org/10.1080/23328940.2018.1493907. The accepted version of the publication may differ from the final published version. © 2018 Informa UK Limited, trading as Taylor & Francis Group.
    Despite extensive documentation directed specifically toward mitigating thermal strain in first responders, we wished to ascertain the degree to which first responders apply cooling strategies, and what opinions are held by the various agencies/departments within the United States. An internet-based survey of first responders was distributed to the International Association of Fire Chiefs, International Association of Fire Fighters, National Bomb Squad Advisory Board, and the USA Interagency Board, and their subsequent departments and branches. Individual first responder departments were questioned regarding the use of pre-, concurrent, and post-cooling, the types of methods employed, and/or the reasons why they had not incorporated various methods in first responder deployment. Completed surveys were collected from 119 unique de-identified departments, including those working in law enforcement (29%), as firefighters (29%), EOD (28%), and HAZMAT technicians (15%). One hundred and eighteen departments (99%) reported heat strain/illness to be a risk to employee safety during occupational duties. The percentage of departments with at least one case of heat illness in the previous year was as follows: fire (39%), HAZMAT (23%), EOD (20%), and law enforcement (18%). Post-cooling was the most commonly implemented scheduled cooling method (63%). Fire departments were significantly more likely than other departments to use post-cooling, as well as to combine two types of scheduled cooling. Importantly, 25% of all departments surveyed provided no cooling whatsoever. The greatest barriers to personnel cooling were availability, cost, logistics, and knowledge.
Our findings could aid in a better understanding of current practices and perceptions of heat illness and injury prevention among United States first responders. Abbreviations: EOD: explosive ordnance disposal; HAZMAT: hazardous materials. This project is financially supported by the United States Government through the United States Department of Defense (DOD). Published version.

    Effects of climate change on the epidemiology of flood-related waterborne disease: A Systematic Literature Review

    Natural disasters, such as flooding related to extreme precipitation, can lead to many adverse health effects (e.g. waterborne disease). Several outbreaks of waterborne disease have been linked to extreme precipitation, and gastrointestinal infection has been shown to increase after floods. Climate change is likely to lead to a higher frequency of waterborne disease through increases in extreme precipitation and associated flooding affecting water and sanitation infrastructure. This review sought to answer two research questions: 1. Has the epidemiology of waterborne disease related to floods changed over time? 2. Can this difference be related to climate change? A literature search was conducted in MEDLINE and Embase for studies reporting on the epidemiology of waterborne disease related to flooding. Studies were screened against inclusion and exclusion criteria, with a total of 52 publications included. Studies of Campylobacter, dermatitis, pink eye, and schistosomiasis reported an association between floods and an increase in infection; adenovirus 40/41 and astrovirus showed a significant decrease in risk of disease related to flooding; and Cryptosporidium, Giardia, cholera, Escherichia coli, leptospirosis, Salmonella, Shigella, hepatitis A, rotavirus, sapovirus, and dysentery had mixed evidence. Several studies reported on disease outbreaks tied to a specific flood, but the majority were from events in the past 20 years. It is difficult to draw clear conclusions regarding how waterborne disease is or is not related to floods due to the varied comparisons and outcome definitions. Additionally, most studies were of recent events, precluding an analysis of any change over time. Continued research on flood-associated waterborne disease will allow for future analysis of epidemiological changes in response to alterations in climate.
In the meantime, public health officials in flood-prone areas should prepare for increases in waterborne disease by educating their constituents on flood safety and implementing interventions for prevention and treatment.

    Adjusting for unmeasured confounding in nonrandomized longitudinal studies: a methodological review

    OBJECTIVE: Motivated by recent calls to use electronic health records for research, we reviewed the application and development of methods for addressing the bias from unmeasured confounding in longitudinal data. DESIGN: Methodological review of existing literature. SETTING: We searched MEDLINE and EMBASE for articles addressing the threat to causal inference from unmeasured confounding in nonrandomized longitudinal health data through quasi-experimental analysis. RESULTS: Among the 121 studies included for review, 84 used instrumental variable analysis (IVA), of which 36 used lagged or historical instruments. Difference-in-differences (DiD) and fixed effects (FE) models were found in 29 studies. Five of these combined IVA with DiD or FE to try to mitigate time-dependent confounding. Other, less frequently used methods included prior event rate ratio adjustment, regression discontinuity nested within pre-post studies, propensity score calibration, perturbation analysis, and negative control outcomes. CONCLUSIONS: Well-established econometric methods such as DiD and IVA are commonly used to address unmeasured confounding in nonrandomized, longitudinal studies, but researchers often fail to take full advantage of available longitudinal information. A range of promising new methods have been developed, but further studies are needed to understand their relative performance in different contexts before they can be recommended for widespread use.
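
    Of the methods counted in this review, difference-in-differences is the most self-contained to illustrate. A minimal sketch on simulated data (all numbers hypothetical, not drawn from the reviewed studies): with treated and control units observed before and after an intervention, the DiD estimate is the coefficient on the treated × post interaction in an ordinary least squares regression.

```python
import numpy as np

# Simulated panel (hypothetical values): baseline differences between groups
# and a common time trend are both allowed; only the interaction identifies
# the treatment effect under the parallel-trends assumption.
rng = np.random.default_rng(1)
n = 500
treated = rng.integers(0, 2, n).astype(float)
post = rng.integers(0, 2, n).astype(float)
effect = 2.0  # true treatment effect built into the simulation
y = 1.0 + 0.5 * treated + 1.5 * post + effect * treated * post \
    + rng.normal(0, 1, n)

# OLS with an interaction term; beta[3] is the DiD estimate.
X = np.column_stack([np.ones(n), treated, post, treated * post])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"DiD estimate of the treatment effect: {beta[3]:.2f}")
```

    The same four-column design generalizes to the review's FE models by replacing the group indicator with unit-level dummies.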

    Impact of a centralized osteoporosis coordinator on post-fracture osteoporosis management: a cluster randomized trial

    SUMMARY: We conducted a cluster randomized trial evaluating the effect of a centralized coordinator who identifies and follows up with fracture patients and their primary care physicians about osteoporosis. Compared with controls, intervention patients were five times more likely to receive bone mineral density (BMD) testing and twice as likely to receive appropriate management. INTRODUCTION: To determine whether a centralized coordinator who follows up with fracture patients and their primary care physicians by telephone and mail (intervention) increases the proportion of patients who receive appropriate post-fracture osteoporosis management, compared with simple fall prevention advice (attention control). METHODS: A cluster randomized controlled trial was conducted in small community hospitals in the province of Ontario, Canada. Hospitals that treated between 60 and 340 fracture patients per year were eligible. Patients 40 years and older presenting with a low-trauma fracture were identified from Emergency Department records and enrolled in the trial. The primary outcome was ‘appropriate’ management, defined as a normal BMD test or taking osteoporosis medications. RESULTS: Thirty-six hospitals were randomized to either intervention or control, and 130 intervention and 137 control subjects completed the study. The mean age of participants was 65 ± 12 years and 69% were female. The intervention increased the proportion of patients who received appropriate management within 6 months of fracture: 45% in the intervention group compared with 26% in the control group (absolute difference of 19%; adjusted OR, 2.3; 95% CI, 1.3–4.1). The proportion who had a BMD test scheduled or performed was much higher, with 57% of intervention patients compared with 21% of controls (absolute difference of 36%; adjusted OR, 4.8; 95% CI, 3.0–7.0).
CONCLUSIONS: A centralized osteoporosis coordinator is effective in improving the quality of osteoporosis care in smaller communities that do not have on-site coordinators or direct access to osteoporosis specialists.

    The Global Longitudinal Study of Osteoporosis in Women (GLOW): rationale and study design

    SUMMARY: The Global Longitudinal study of Osteoporosis in Women (GLOW) is a prospective cohort study involving 723 physicians and 60,393 women aged ≥55 years. The data will provide insights into the management of fracture risk in older women over 5 years, patient experience with prevention and treatment, and the distribution of risk among older women on an international basis. INTRODUCTION: Data from cohort studies describing the distribution of osteoporosis-related fractures and risk factors are not directly comparable and do not compare regional differences in patterns of patient management and fracture outcomes. METHODS: GLOW is a prospective, multinational, observational cohort study. Practices typical of each region were identified through primary care networks organized for administrative, research, or educational purposes. Noninstitutionalized patients visiting each practice within the previous 2 years were eligible. Self-administered questionnaires were mailed, with 2:1 oversampling of women ≥65 years. Follow-up questionnaires will be sent at 12-month intervals for 5 years. RESULTS: A total of 723 physicians at 17 sites in ten countries agreed to participate. Baseline surveys were mailed (October 2006 to February 2008) to 140,416 subjects. After the exclusion of 3,265 women who were ineligible or had died, 60,393 agreed to participate. CONCLUSIONS: GLOW will provide contemporary information on patterns of management of fracture risk in older women over a 5-year period. The collection of data in a similar manner in ten countries will permit comparisons of patient experience with prevention and treatment and provide insights into the distribution of risk among older women on an international basis.

    Infectious Disease Threats in the Twenty-First Century: Strengthening the Global Response

    The world has developed an elaborate global health system as a bulwark against known and unknown infectious disease threats. The system consists of various formal and informal networks of organizations that serve different stakeholders; have varying goals, modalities, resources, and accountability; operate at different levels (local, national, regional, or global); and cut across the public, private-for-profit, and private-not-for-profit sectors. The evolving global health system has done much to protect and promote human health. However, the world continues to be confronted by longstanding, emerging, and reemerging infectious disease threats. These threats differ widely in terms of severity and probability. They also have varying consequences for morbidity and mortality, as well as for a complex set of social and economic outcomes. To various degrees, they are also amenable to alternative responses, ranging from clean water provision to regulation to biomedical countermeasures. Whether the global health system as currently constituted can provide effective protection against a dynamic array of infectious disease threats has been called into question by recent outbreaks of Ebola, Zika, dengue, Middle East respiratory syndrome, severe acute respiratory syndrome, and influenza and by the looming threat of rising antimicrobial resistance. The concern is magnified by rapid population growth in areas with weak health systems, urbanization, globalization, climate change, civil conflict, and the changing nature of pathogen transmission between human and animal populations. There is also potential for human-originated outbreaks emanating from laboratory accidents or intentional biological attacks.
This paper discusses these issues, along with the need for a (possibly self-standing) multi-disciplinary Global Technical Council on Infectious Disease Threats to address emerging global challenges with regard to infectious disease and associated social and economic risks. This Council would strengthen the global health system by improving collaboration and coordination across organizations (e.g., the WHO, Gavi, CEPI, national centers for disease control, and pharmaceutical manufacturers); filling in knowledge gaps with respect to (for example) infectious disease surveillance, research and development needs, financing models, supply chain logistics, and the social and economic impacts of potential threats; and making high-level, evidence-based recommendations for managing global risks associated with infectious disease.

    Bisphosphonates and risk of atrial fibrillation: a meta-analysis

    INTRODUCTION: Bisphosphonates are the most commonly used drugs for the prevention and treatment of osteoporosis. Although a recent FDA review of the results of clinical trials reported no clear link between bisphosphonates and serious or non-serious atrial fibrillation (AF), some epidemiologic studies have suggested an association between AF and bisphosphonates. METHODS: We conducted a meta-analysis of non-experimental studies to evaluate the risk of AF associated with bisphosphonates. Studies were identified by searching MEDLINE and EMBASE using a combination of Medical Subject Headings and keywords. Our search was limited to English-language articles. The pooled estimates of odds ratios (OR) as a measure of effect size were calculated using a random-effects model. RESULTS: Seven eligible studies with 266,761 patients were identified: three cohort, three case-control, and one self-controlled case series. Bisphosphonate exposure was not associated with an increased risk of AF [pooled multivariate OR 1.04, 95% confidence interval (CI) 0.92–1.16] after adjusting for known risk factors. Moderate heterogeneity was noted (I² = 62.8%). Stratified analyses by study design (cohort versus case-control studies) yielded similar results. Egger's and Begg's tests did not suggest evidence of publication bias (P = 0.90 and 1.00, respectively). No clear asymmetry was observed in the funnel plot analysis. Few studies compared risk between bisphosphonates or by dosing. CONCLUSIONS: Our study did not find an association between bisphosphonate exposure and AF. This finding is consistent with the FDA's statement.
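
    The pooled OR in a random-effects meta-analysis can be computed directly from each study's OR and 95% CI. A minimal sketch of the DerSimonian–Laird approach (the three input studies below are hypothetical, for illustration only, not the studies in this meta-analysis):

```python
import math

# Hypothetical per-study odds ratios with 95% CIs: (OR, lower, upper).
studies = [
    (1.10, 0.95, 1.27),
    (0.98, 0.85, 1.13),
    (1.05, 0.80, 1.38),
]

# Convert each OR and CI to a log-OR and its standard error.
y, se = [], []
for or_, lo, hi in studies:
    y.append(math.log(or_))
    se.append((math.log(hi) - math.log(lo)) / (2 * 1.96))

# Fixed-effect (inverse-variance) pooling, needed for Cochran's Q.
w_fixed = [1 / s**2 for s in se]
ybar_fixed = sum(w * yi for w, yi in zip(w_fixed, y)) / sum(w_fixed)

# DerSimonian-Laird between-study variance tau^2 (truncated at zero).
Q = sum(w * (yi - ybar_fixed) ** 2 for w, yi in zip(w_fixed, y))
df = len(y) - 1
C = sum(w_fixed) - sum(w**2 for w in w_fixed) / sum(w_fixed)
tau2 = max(0.0, (Q - df) / C)

# Random-effects weights, pooled log-OR, and back-transformed CI.
w_re = [1 / (s**2 + tau2) for s in se]
pooled = sum(w * yi for w, yi in zip(w_re, y)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))
or_pooled = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se_pooled),
      math.exp(pooled + 1.96 * se_pooled))
i2 = max(0.0, (Q - df) / Q) * 100 if Q > 0 else 0.0
print(f"pooled OR {or_pooled:.2f} "
      f"(95% CI {ci[0]:.2f}-{ci[1]:.2f}), I^2 {i2:.1f}%")
```

    With these illustrative inputs the between-study variance truncates to zero, so the random-effects and fixed-effect estimates coincide; with heterogeneous inputs (as in the paper's I² of 62.8%) tau² would widen the pooled CI.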

    On-time denosumab dosing recovered rapidly during the COVID-19 pandemic, yet remains suboptimal

    Timely administration of denosumab every 6 mo is critical in osteoporosis treatment to avoid the risk of multiple vertebral fractures upon denosumab discontinuation or delay. This study aimed to estimate the immediate and prolonged impact of the COVID-19 pandemic on the timing of denosumab doses. We identified older adults (≥66 yr) residing in the community who were due to receive denosumab between January 2016 and December 2020 using Ontario Drug Benefit data. We completed an interrupted time-series analysis to estimate the impact of the COVID-19 pandemic (March 2020) on the monthly proportion of on-time denosumab doses (183 ± 30 d). Analyses were stratified by user type: patients due for their second dose (novice users), third or fourth dose (intermediate users), or ≥5th dose (established users). In additional analyses, we considered patients living in nursing homes and patients switching to other osteoporosis drugs, and reported trends until February 2022. We studied 148,554 patients (90.9% female; mean [SD] age 79.6 [8.0] yr) receiving 648,221 denosumab doses. The average pre-pandemic proportion of on-time therapy was steady in the community, yet differed by user type: 64.9% for novice users, 72.3% for intermediate users, and 78.0% for established users. We identified an immediate overall decline in the proportion of on-time doses across all user types at the start of the pandemic: -17.8% (95% CI, -19.6 to -16.0). In nursing homes, the pre-pandemic proportion of on-time therapy was similar across user types (average 83.5%), with a small decline at the start of the pandemic: -3.2% (95% CI, -5.0 to -1.2). On-time therapy returned to pre-pandemic levels by October 2020 and was not affected by therapy switching. Although on-time dosing remains stable as of February 2022, approximately one-fourth of patients in the community do not receive denosumab on time. In conclusion, although pandemic disruptions to denosumab dosing were temporary, levels of on-time therapy remain suboptimal.
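
    The interrupted time-series design used in this study is commonly fit as a segmented regression, with a level change and a slope change at the interruption point. A minimal sketch on simulated monthly proportions (all numbers hypothetical, not the study's data):

```python
import numpy as np

# Simulated monthly proportion of on-time doses: pandemic begins at month 50,
# with a built-in level drop of 15 points followed by a recovery trend.
rng = np.random.default_rng(0)
n, break_t = 72, 50
t = np.arange(n)
post = (t >= break_t).astype(float)
time_since = np.where(t >= break_t, t - break_t, 0).astype(float)
signal = 75 + 0.02 * t - 15 * post + 1.5 * time_since
y = signal + rng.normal(0, 1.0, n)

# Segmented regression: intercept, pre-period trend, immediate level change
# at the break, and change in slope after the break.
X = np.column_stack([np.ones(n), t, post, time_since])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
intercept, trend, level_change, slope_change = beta
print(f"immediate level change: {level_change:.1f} percentage points")
```

    The study's reported -17.8% corresponds to this level-change coefficient; a seasonally adjusted or autocorrelation-robust model would be the more rigorous choice for real monthly data.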