
    Absence of bias against smokers in access to coronary revascularization after cardiac catheterization

    Objective. Many consider smoking to be a personal choice for which individuals should be held accountable. We assessed whether there is any evidence of bias against smokers in cardiac care decision-making by determining whether smokers were as likely as non-smokers to undergo revascularization procedures after cardiac catheterization. Design. Prospective cohort study. Subjects and setting. All patients undergoing cardiac catheterization in Alberta, Canada. Main measures. Patients were categorized as current smokers, former smokers, or never smokers, and then compared for their risk-adjusted likelihood of undergoing revascularization procedures (percutaneous coronary intervention or coronary artery bypass grafting) after cardiac catheterization. Results. Among 20,406 patients undergoing catheterization, 25.4% were current smokers at the time of catheterization, 36.6% were former smokers, and 38.0% had never smoked. When compared with never smokers (reference group), the hazard ratio for undergoing any revascularization procedure after catheterization was 0.98 (95% CI 0.93-1.03) for current smokers and 0.98 (0.94-1.03) for former smokers. The hazard ratio for undergoing coronary artery bypass grafting was 1.09 (1.00-1.19) for current smokers and 1.00 (0.93-1.08) for former smokers. For percutaneous coronary intervention, the hazard ratios were 0.93 (0.87-0.99) for current smokers and 1.00 (0.94-1.06) for former smokers. Conclusion. Despite potential for discrimination on the basis of smoking status, current and former smokers undergoing cardiac catheterization in Alberta, Canada were as likely to undergo revascularization procedures as catheterization patients who had never smoked.
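    The confidence intervals reported above are symmetric on the log scale, as is standard for hazard ratios. A minimal sketch of that arithmetic, using the reported figures for current smokers (the standard error is back-calculated from the published interval for illustration, not taken from the study):

    ```python
    import math

    def se_from_ci(lower, upper, z=1.96):
        """Standard error of log(HR) implied by a reported 95% CI."""
        return (math.log(upper) - math.log(lower)) / (2 * z)

    def hr_ci(hr, se_log_hr, z=1.96):
        """95% CI for a hazard ratio, computed on the log scale."""
        log_hr = math.log(hr)
        return (math.exp(log_hr - z * se_log_hr),
                math.exp(log_hr + z * se_log_hr))

    # Reported for current vs. never smokers (any revascularization):
    # HR 0.98 (95% CI 0.93-1.03)
    se = se_from_ci(0.93, 1.03)
    lo, hi = hr_ci(0.98, se)
    ```

    Recomputing the interval from the implied standard error reproduces the published bounds to two decimal places.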

    Canadian Pregnancy Outcomes in Rheumatoid Arthritis and Systemic Lupus Erythematosus

    Objective. To describe obstetrical and neonatal outcomes in Canadian women with rheumatoid arthritis (RA) or systemic lupus erythematosus (SLE). Methods. An administrative database of hospitalizations for neonatal delivery (1998–2009) from Calgary, Alberta was searched to identify women with RA (38 pregnancies) or SLE (95 pregnancies), and women from the general population matched on maternal age and year of delivery (150 and 375 pregnancies, resp.). Conditional logistic regression was used to calculate odds ratios (OR) for maternal and neonatal outcomes, adjusting for parity. Results. Women with SLE had increased odds for preeclampsia or eclampsia (SLE OR 2.16 (95% CI 1.10–4.21; P = 0.024); RA OR 2.33 (95% CI 0.76–7.14; P = 0.138)). Women with SLE had increased odds for cesarean section after adjustment for dysfunctional labour, instrumentation and previous cesarean section (OR 3.47 (95% CI 1.67–7.22; P < 0.001)). Neonates born to women with SLE had increased odds of prematurity (SLE OR 6.17 (95% CI 3.28–11.58; P < 0.001); RA OR 2.66 (95% CI 0.90–7.84; P = 0.076)) and of SGA (SLE OR 2.54 (95% CI 1.42–4.55; P = 0.002); RA OR 2.18 (95% CI 0.84–5.66; P = 0.108)) after adjusting for maternal hypertension. There was no excess risk of congenital defects in neonates. Conclusions. There is increased obstetrical and neonatal morbidity in Canadian women with RA or SLE.
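    The odds ratios above come from conditional logistic regression matched on maternal age and delivery year. As a simpler illustration of how an OR and its 95% CI arise from a 2x2 table (crude Woolf method, not the authors' matched model; the cell counts are hypothetical, with only the group sizes of 95 and 375 pregnancies taken from the abstract):

    ```python
    import math

    def odds_ratio_ci(a, b, c, d, z=1.96):
        """Crude odds ratio with Woolf (log-based) 95% CI from a 2x2 table:
        a = exposed cases, b = exposed non-cases,
        c = unexposed cases, d = unexposed non-cases."""
        or_ = (a * d) / (b * c)
        se = math.sqrt(1/a + 1/b + 1/c + 1/d)
        return (or_,
                math.exp(math.log(or_) - z * se),
                math.exp(math.log(or_) + z * se))

    # Hypothetical counts: 20 preeclampsia cases among 95 SLE pregnancies
    # vs. 40 among 375 matched control pregnancies
    or_, lo, hi = odds_ratio_ci(20, 75, 40, 335)
    ```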

    Accuracy of city postal code coordinates as a proxy for location of residence

    BACKGROUND: Health studies sometimes rely on postal code location as a proxy for the location of residence. This study compares the postal code location to that of the street address using a database from the Alberta Provincial Project for Outcome Assessment in Coronary Heart Disease (APPROACH(©)). Cardiac catheterization cases in an urban Canadian city were used for calendar year 1999. We determined location in meters for both the address (using the City of Calgary Street Network File in ArcView 3.2) and postal code location (using Statistics Canada's Postal Code Conversion File). RESULTS: The distance between the two estimates of location was measured for each case; 87.9% of the postal code locations were within 200 meters of the true address location (straight-line distances) and 96.5% were within 500 meters of the address location (straight-line distances). CONCLUSIONS: We conclude that postal code locations are a reasonably accurate proxy for address location. However, there may be research questions for which a more accurate description of location is required.
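    The threshold comparison described above reduces to straight-line distances between paired coordinates. A minimal sketch, assuming a projected, metre-based coordinate system (the points below are toy values, not APPROACH records):

    ```python
    import math

    def within_thresholds(address_pts, postal_pts, thresholds=(200, 500)):
        """Share of records whose postal-code centroid falls within each
        straight-line distance (metres) of the true address location."""
        dists = [math.dist(a, p) for a, p in zip(address_pts, postal_pts)]
        n = len(dists)
        return {t: sum(d <= t for d in dists) / n for t in thresholds}

    # Toy example with 4 records: (x, y) coordinates in metres
    addresses = [(0, 0), (1000, 0), (0, 1000), (500, 500)]
    centroids = [(50, 50), (1150, 0), (10, 1300), (505, 495)]
    shares = within_thresholds(addresses, centroids)
    ```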

    Global, regional, and national comparative risk assessment of 84 behavioural, environmental and occupational, and metabolic risks or clusters of risks for 195 countries and territories, 1990–2017 : a systematic analysis for the Global Burden of Disease Study 2017

    Background: The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2017 comparative risk assessment (CRA) is a comprehensive approach to risk factor quantification that offers a useful tool for synthesising evidence on risks and risk outcome associations. With each annual GBD study, we update the GBD CRA to incorporate improved methods, new risks and risk outcome pairs, and new data on risk exposure levels and risk outcome associations. Methods: We used the CRA framework developed for previous iterations of GBD to estimate levels and trends in exposure, attributable deaths, and attributable disability-adjusted life-years (DALYs), by age group, sex, year, and location for 84 behavioural, environmental and occupational, and metabolic risks or groups of risks from 1990 to 2017. This study included 476 risk outcome pairs that met the GBD study criteria for convincing or probable evidence of causation. We extracted relative risk and exposure estimates from 46 749 randomised controlled trials, cohort studies, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. Using the counterfactual scenario of theoretical minimum risk exposure level (TMREL), we estimated the portion of deaths and DALYs that could be attributed to a given risk. We explored the relationship between development and risk exposure by modelling the relationship between the Socio-demographic Index (SDI) and risk-weighted exposure prevalence and estimated expected levels of exposure and risk-attributable burden by SDI. 
Finally, we explored temporal changes in risk-attributable DALYs by decomposing those changes into six main component drivers of change as follows: (1) population growth; (2) changes in population age structures; (3) changes in exposure to environmental and occupational risks; (4) changes in exposure to behavioural risks; (5) changes in exposure to metabolic risks; and (6) changes due to all other factors, approximated as the risk-deleted death and DALY rates, where the risk-deleted rate is the rate that would be observed had we reduced the exposure levels to the TMREL for all risk factors included in GBD 2017. Findings: In 2017, 34.1 million (95% uncertainty interval [UI] 33.3-35.0) deaths and 1.21 billion (1.14-1.28) DALYs were attributable to GBD risk factors. Globally, 61.0% (59.6-62.4) of deaths and 48.3% (46.3-50.2) of DALYs were attributed to the GBD 2017 risk factors. When ranked by risk-attributable DALYs, high systolic blood pressure (SBP) was the leading risk factor, accounting for 10.4 million (9.39-11.5) deaths and 218 million (198-237) DALYs, followed by smoking (7.10 million [6.83-7.37] deaths and 182 million [173-193] DALYs), high fasting plasma glucose (6.53 million [5.23-8.23] deaths and 171 million [144-201] DALYs), high body-mass index (BMI; 4.72 million [2.99-6.70] deaths and 148 million [98.6-202] DALYs), and short gestation for birthweight (1.43 million [1.36-1.51] deaths and 139 million [131-147] DALYs). In total, risk-attributable DALYs declined by 4.9% (3.3-6.5) between 2007 and 2017. In the absence of demographic changes (ie, population growth and ageing), changes in risk exposure and risk-deleted DALYs would have led to a 23.5% decline in DALYs during that period. Conversely, in the absence of changes in risk exposure and risk-deleted DALYs, demographic changes would have led to an 18.6% increase in DALYs during that period.
The ratios of observed risk exposure levels to exposure levels expected based on SDI (O/E ratios) increased globally for unsafe drinking water and household air pollution between 1990 and 2017. This result suggests that development is occurring more rapidly than are changes in the underlying risk structure in a population. Conversely, nearly universal declines in O/E ratios for smoking and alcohol use indicate that, for a given SDI, exposure to these risks is declining. In 2017, the leading Level 4 risk factor for age-standardised DALY rates was high SBP in four super-regions: central Europe, eastern Europe, and central Asia; north Africa and Middle East; south Asia; and southeast Asia, east Asia, and Oceania. The leading risk factor in the high-income super-region was smoking, in Latin America and Caribbean was high BMI, and in sub-Saharan Africa was unsafe sex. O/E ratios for unsafe sex in sub-Saharan Africa were notably high, and those for alcohol use in north Africa and the Middle East were notably low. Interpretation: By quantifying levels and trends in exposures to risk factors and the resulting disease burden, this assessment offers insight into where past policy and programme efforts might have been successful and highlights current priorities for public health action. Decreases in behavioural, environmental, and occupational risks have largely offset the effects of population growth and ageing, in relation to trends in absolute burden. Conversely, the combination of increasing metabolic risks and population ageing will probably continue to drive the increasing trends in non-communicable diseases at the global level, which presents both a public health challenge and opportunity. We see considerable spatiotemporal heterogeneity in levels of risk exposure and risk-attributable burden. 
Although levels of development underlie some of this heterogeneity, O/E ratios show risks for which countries are overperforming or underperforming relative to their level of development. As such, these ratios provide a benchmarking tool to help to focus local decision making. Our findings reinforce the importance of both risk exposure monitoring and epidemiological research to assess causal connections between risks and health outcomes, and they highlight the usefulness of the GBD study in synthesising data to draw comprehensive and robust conclusions that help to inform good policy and strategic health planning.
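    For a single dichotomous risk factor, the burden attributable under a TMREL-style counterfactual reduces to the familiar population attributable fraction. A minimal illustration with made-up exposure and relative-risk values (the GBD CRA itself handles continuous exposures and mediation, which this sketch does not):

    ```python
    def attributable_fraction(prevalence, relative_risk):
        """Population attributable fraction for a dichotomous risk:
        PAF = p(RR - 1) / (p(RR - 1) + 1)."""
        excess = prevalence * (relative_risk - 1)
        return excess / (excess + 1)

    # Illustrative only: 25% of the population exposed, RR of 2.0
    paf = attributable_fraction(0.25, 2.0)
    ```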

    Mapping 123 million neonatal, infant and child deaths between 2000 and 2017

    Since 2000, many countries have achieved considerable success in improving child survival, but localized progress remains unclear. To inform efforts towards United Nations Sustainable Development Goal 3.2—to end preventable child deaths by 2030—we need consistently estimated data at the subnational level regarding child mortality rates and trends. Here we quantified, for the period 2000–2017, the subnational variation in mortality rates and number of deaths of neonates, infants and children under 5 years of age within 99 low- and middle-income countries using a geostatistical survival model. We estimated that 32% of children under 5 in these countries lived in districts that had attained rates of 25 or fewer child deaths per 1,000 live births by 2017, and that 58% of child deaths between 2000 and 2017 in these countries could have been averted in the absence of geographical inequality. This study enables the identification of high-mortality clusters, patterns of progress and geographical inequalities to inform appropriate investments and implementations that will help to improve the health of all populations.
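    The "deaths averted in the absence of geographical inequality" figure corresponds to a counterfactual in which every district attains some benchmark mortality rate. A toy sketch of that calculation, taking the lowest observed district rate as the benchmark (hypothetical districts; the study's actual counterfactual may be defined differently):

    ```python
    def averted_fraction(districts):
        """Fraction of deaths averted if every district attained the lowest
        observed mortality rate.
        Each district: (live_births, deaths_per_1000_live_births)."""
        best = min(rate for _, rate in districts)
        observed = sum(births * rate / 1000 for births, rate in districts)
        counterfactual = sum(births * best / 1000 for births, _ in districts)
        return (observed - counterfactual) / observed

    # Two hypothetical districts with equal births but unequal mortality
    frac = averted_fraction([(10_000, 20), (10_000, 80)])
    ```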

    A multi-region assessment of population rates of cardiac catheterization and yield of high-risk coronary artery disease

    <p>Abstract</p> <p>Background</p> <p>There is variation in cardiac catheterization utilization across jurisdictions. Previous work from Alberta, Canada, showed no evidence of a plateau in the yield of high-risk disease at cardiac catheterization rates as high as 600 per 100,000 population, suggesting that the optimal rate is higher. This work aims 1) to determine if a previously demonstrated linear relationship between the yield of high-risk coronary disease and cardiac catheterization rates persists with contemporary data and 2) to explore whether the linear relationship exists in other jurisdictions.</p> <p>Methods</p> <p>Detailed clinical information on all patients undergoing cardiac catheterization in 3 Canadian provinces was available through the Alberta Provincial Project for Outcomes Assessment in Coronary Heart Disease (APPROACH) and partner initiatives in British Columbia and Nova Scotia. Population rates of catheterization and high-risk coronary disease detection were calculated for each health region in these three provinces, and age-adjusted rates were produced using direct standardization. A mixed effects regression analysis was performed to assess the relationship between catheterization rate and high-risk coronary disease detection.</p> <p>Results</p> <p>In the contemporary Alberta data, we found a linear relationship between the population catheterization rate and the high-risk yield. Although the yield was slightly less in time period 2 (2002-2006) than in time period 1 (1995-2001), there was no statistical evidence of a plateau.
The linear relationship between catheterization rate and high-risk yield was similarly demonstrated in British Columbia and Nova Scotia and appears to extend, without a plateau in yield, to rates over 800 procedures per 100,000 population.</p> <p>Conclusions</p> <p>Our study demonstrates a consistent finding, over time and across jurisdictions, of linearly increasing detection of high-risk CAD as population rates of cardiac catheterization increase. This internationally-relevant finding can inform country-level planning of invasive cardiac care services.</p>
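    The age-adjusted rates "produced using direct standardization" mentioned above are weighted averages of age-specific rates, with weights taken from a standard population. A minimal sketch with hypothetical age strata (not the APPROACH data):

    ```python
    def direct_standardized_rate(stratum_rates, standard_pop):
        """Directly age-standardized rate: the weighted average of
        age-specific rates using standard-population counts as weights."""
        total = sum(standard_pop)
        return sum(r * w for r, w in zip(stratum_rates, standard_pop)) / total

    # Hypothetical catheterization rates per 100,000 in three age bands,
    # weighted by a hypothetical standard population
    rate = direct_standardized_rate([100, 500, 1500],
                                    [50_000, 30_000, 20_000])
    ```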

    A comparison between the APACHE II and Charlson Index Score for predicting hospital mortality in critically ill patients

    <p>Abstract</p> <p>Background</p> <p>Risk adjustment and mortality prediction in studies of critical care are usually performed using acuity of illness scores, such as Acute Physiology and Chronic Health Evaluation II (APACHE II), which emphasize physiological derangement. Common risk adjustment systems used in administrative datasets, like the Charlson index, are entirely based on the presence of co-morbid illnesses. The purpose of this study was to compare the discriminative ability of the Charlson index to the APACHE II in predicting hospital mortality in adult multisystem ICU patients.</p> <p>Methods</p> <p>This was a population-based cohort design. The study sample consisted of adult (>17 years of age) residents of the Calgary Health Region admitted to a multisystem ICU between April 2002 and March 2004. Clinical data were collected prospectively and linked to hospital outcome data. Multiple regression analyses were used to compare the performance of APACHE II and the Charlson index.</p> <p>Results</p> <p>The Charlson index was a poor predictor of mortality (C = 0.626). There was minimal difference between a baseline model containing age, sex and acute physiology score (C = 0.74) and models containing either chronic health points (C = 0.76) or Charlson index variations (C = 0.75, 0.76, 0.77). No important improvement in prediction occurred when the Charlson index was added to the full APACHE II model (C = 0.808 to C = 0.813).</p> <p>Conclusion</p> <p>The Charlson index does not perform as well as the APACHE II in predicting hospital mortality in ICU patients. However, when acuity of illness scores are unavailable or are not recorded in a standard way, the Charlson index might be considered as an alternative method of risk adjustment and therefore facilitate comparisons between intensive care units.</p>
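    The C statistics compared above are areas under the ROC curve: the probability that a randomly chosen death received a higher predicted risk than a randomly chosen survivor. A small pairwise-concordance sketch with toy scores (not APACHE II or Charlson predictions):

    ```python
    from itertools import product

    def c_statistic(scores, outcomes):
        """C statistic (AUC) by pairwise concordance over all
        death/survivor pairs; ties count as half-concordant."""
        deaths = [s for s, y in zip(scores, outcomes) if y == 1]
        survivors = [s for s, y in zip(scores, outcomes) if y == 0]
        pairs = list(product(deaths, survivors))
        conc = sum(1.0 if d > s else 0.5 if d == s else 0.0
                   for d, s in pairs)
        return conc / len(pairs)

    # Toy predicted risks and hospital deaths (1 = died, 0 = survived)
    c = c_statistic([0.9, 0.8, 0.4, 0.3, 0.2], [1, 1, 0, 1, 0])
    ```

    A model that ranked every death above every survivor would score 1.0; random ranking scores 0.5, which is why a Charlson-only C of 0.626 reads as poor discrimination.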

    Do coder characteristics influence validity of ICD-10 hospital discharge data?

    <p>Abstract</p> <p>Background</p> <p>Administrative data are widely used to study health systems and make important health policy decisions. Yet little is known about the influence of coder characteristics on administrative data validity in these studies. Our goal was to describe the relationship between several measures of validity in coded hospital discharge data and 1) coders' volume of coding (≥13,000 vs. <13,000 records), 2) coders' employment status (full- vs. part-time), and 3) hospital type.</p> <p>Methods</p> <p>This descriptive study examined 6 indicators of face validity in ICD-10 coded discharge records from 4 hospitals in Calgary, Canada between April 2002 and March 2007. Specifically, mean number of coded diagnoses, procedures, complications, Z-codes, and codes ending in 8 or 9 were compared by coding volume and employment status, as well as hospital type. The mean number of diagnoses was also compared across coder characteristics for 6 major conditions of varying complexity. Next, kappa statistics were computed to assess agreement between discharge data and linked chart data reabstracted by nursing chart reviewers. Kappas were compared across coder characteristics.</p> <p>Results</p> <p>422,618 discharge records were coded by 59 coders during the study period. The mean number of diagnoses per record decreased from 5.2 in 2002/2003 to 3.9 in 2006/2007, while the number of records coded annually increased from 69,613 to 102,842. Coders at the tertiary hospital coded the most diagnoses (5.0 compared with 3.9 and 3.8 at other sites). There was no variation by coder or site characteristics for any other face validity indicator. The mean number of diagnoses increased from 1.5 to 7.9 with increasing complexity of the major diagnosis, but did not vary with coder characteristics. 
Agreement (kappa) between coded data and chart review did not show any consistent pattern with respect to coder characteristics.</p> <p>Conclusions</p> <p>This large study suggests that coder characteristics do not influence the validity of hospital discharge data. Other jurisdictions might benefit from implementing employment programs similar to ours, e.g. a requirement for a 2-year college training program, a single management structure across sites, and rotation of coders between sites. Limitations include the few coder characteristics available for study due to privacy concerns.</p>
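    The kappa statistics above measure chance-corrected agreement between the coded discharge data and the reabstracted chart data. For a single binary code, Cohen's kappa from a 2x2 agreement table looks like this (the counts are hypothetical, not the study's):

    ```python
    def cohens_kappa(a, b, c, d):
        """Cohen's kappa for two binary raters from a 2x2 agreement table:
        a = both code present, b = coder yes / reviewer no,
        c = coder no / reviewer yes, d = both absent."""
        n = a + b + c + d
        p_obs = (a + d) / n                      # observed agreement
        p_exp = ((a + b) * (a + c)               # chance agreement from
                 + (c + d) * (b + d)) / n**2     # the marginal totals
        return (p_obs - p_exp) / (1 - p_exp)

    # Hypothetical: agreement on the presence of a diabetes code
    # in 200 linked records
    kappa = cohens_kappa(40, 10, 10, 140)
    ```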

    SnTox3 Acts in Effector Triggered Susceptibility to Induce Disease on Wheat Carrying the Snn3 Gene

    The necrotrophic fungus Stagonospora nodorum produces multiple proteinaceous host-selective toxins (HSTs) which act in effector triggered susceptibility. Here, we report the molecular cloning and functional characterization of the SnTox3-encoding gene, designated SnTox3, as well as the initial characterization of the SnTox3 protein. SnTox3 is a 693 bp intron-free gene with little obvious homology to other known genes. The predicted immature SnTox3 protein is 25.8 kDa in size. A 20 amino acid signal sequence as well as a possible pro sequence are predicted. Six cysteine residues are predicted to form disulfide bonds and are shown to be important for SnTox3 activity. Using heterologous expression in Pichia pastoris and transformation into an avirulent S. nodorum isolate, we show that SnTox3 encodes the SnTox3 protein and that SnTox3 interacts with the wheat susceptibility gene Snn3. In addition, the avirulent S. nodorum isolate transformed with SnTox3 was virulent on host lines expressing the Snn3 gene. SnTox3-disrupted mutants were deficient in the production of SnTox3 and avirulent on the Snn3 differential wheat line BG220. An analysis of genetic diversity revealed that SnTox3 is present in 60.1% of a worldwide collection of 923 isolates and occurs as eleven nucleotide haplotypes resulting in four amino acid haplotypes. The cloning of SnTox3 provides a fundamental tool for the investigation of the S. nodorum–wheat interaction, as well as vital information for the general characterization of necrotroph–plant interactions.