131 research outputs found

    Clinicopathological evaluation of chronic traumatic encephalopathy in players of American football

    IMPORTANCE: Players of American football may be at increased risk of long-term neurological conditions, particularly chronic traumatic encephalopathy (CTE). OBJECTIVE: To determine the neuropathological and clinical features of deceased football players with CTE. DESIGN, SETTING, AND PARTICIPANTS: Case series of 202 football players whose brains were donated for research. Neuropathological evaluations and retrospective telephone clinical assessments (including head trauma history) with informants were performed blinded. Online questionnaires ascertained athletic and military history. EXPOSURES: Participation in American football at any level of play. MAIN OUTCOMES AND MEASURES: Neuropathological diagnoses of neurodegenerative diseases, including CTE, based on defined diagnostic criteria; CTE neuropathological severity (stages I to IV or dichotomized into mild [stages I and II] and severe [stages III and IV]); informant-reported athletic history and, for players who died in 2014 or later, clinical presentation, including behavior, mood, and cognitive symptoms and dementia. RESULTS: Among 202 deceased former football players (median age at death, 66 years [interquartile range, 47-76 years]), CTE was neuropathologically diagnosed in 177 players (87%; median age at death, 67 years [interquartile range, 52-77 years]; mean years of football participation, 15.1 [SD, 5.2]), including 0 of 2 pre–high school, 3 of 14 high school (21%), 48 of 53 college (91%), 9 of 14 semiprofessional (64%), 7 of 8 Canadian Football League (88%), and 110 of 111 National Football League (99%) players. Neuropathological severity of CTE was distributed across the highest level of play, with all 3 former high school players having mild pathology and the majority of former college (27 [56%]), semiprofessional (5 [56%]), and professional (101 [86%]) players having severe pathology. 
Among 27 participants with mild CTE pathology, 26 (96%) had behavioral or mood symptoms or both, 23 (85%) had cognitive symptoms, and 9 (33%) had signs of dementia. Among 84 participants with severe CTE pathology, 75 (89%) had behavioral or mood symptoms or both, 80 (95%) had cognitive symptoms, and 71 (85%) had signs of dementia. CONCLUSIONS AND RELEVANCE: In a convenience sample of deceased football players who donated their brains for research, a high proportion had neuropathological evidence of CTE, suggesting that CTE may be related to prior participation in football. This study received support from NINDS (grants U01 NS086659, R01 NS078337, R56 NS078337, U01 NS093334, and F32 NS096803), the National Institute on Aging (grants K23 AG046377, P30AG13846 and supplement 0572063345-5, R01 AG1649), the US Department of Defense (grant W81XWH-13-2-0064), the US Department of Veterans Affairs (I01 CX001038), the Veterans Affairs Biorepository (CSP 501), the Veterans Affairs Rehabilitation Research and Development Traumatic Brain Injury Center of Excellence (grant B6796-C), the Department of Defense Peer Reviewed Alzheimer’s Research Program (grant 13267017), the National Operating Committee on Standards for Athletic Equipment, the Alzheimer’s Association (grants NIRG-15-362697 and NIRG-305779), the Concussion Legacy Foundation, the Andlinger Family Foundation, the WWE, and the NFL.

    An investigation of causes of false positive single nucleotide polymorphisms using simulated reads from a small eukaryote genome

    Background: Single Nucleotide Polymorphisms (SNPs) are widely used molecular markers, and their use has increased massively since the inception of Next Generation Sequencing (NGS) technologies, which allow detection of large numbers of SNPs at low cost. However, both NGS data and their analysis are error-prone, which can lead to the generation of false positive (FP) SNPs. We explored the relationship between FP SNPs and seven factors involved in mapping-based variant calling - quality of the reference sequence, read length, choice of mapper and variant caller, mapping stringency and filtering of SNPs by read mapping quality and read depth. This resulted in 576 possible factor level combinations. We used error- and variant-free simulated reads to ensure that every SNP found was indeed a false positive. Results: The variation in the number of FP SNPs generated ranged from 0 to 36,621 for the 120 million base pairs (Mbp) genome. All of the experimental factors tested had statistically significant effects on the number of FP SNPs generated and there was a considerable amount of interaction between the different factors. Using a fragmented reference sequence led to a dramatic increase in the number of FP SNPs generated, as did relaxed read mapping and a lack of SNP filtering. The choice of reference assembler, mapper and variant caller also significantly affected the outcome. The effect of read length was more complex and suggests a possible interaction between mapping specificity and the potential for contributing more false positives as read length increases. Conclusions: The choice of tools and parameters involved in variant calling can have a dramatic effect on the number of FP SNPs produced, with particularly poor combinations of software and/or parameter settings yielding tens of thousands in this experiment. 
Between-factor interactions make simple recommendations difficult for a SNP discovery pipeline, but the quality of the reference sequence is clearly of paramount importance. Our findings are also a stark reminder that it can be unwise to use the relaxed mismatch settings provided as defaults by some read mappers when reads are being mapped to a relatively unfinished reference sequence from, e.g., a non-model organism in its early stages of genomic exploration.
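The mapping-quality and read-depth filtering that the authors identify as one of the decisive factors can be sketched in a few lines. This is a minimal illustration only; the record fields and thresholds below (MQ ≥ 30, 10 ≤ DP ≤ 200) are assumptions, not the settings tested in the study:

```python
# Post-calling SNP filtering by mapping quality (MQ) and read depth (DP).
# Thresholds are illustrative assumptions, not the study's parameters.

def filter_snps(candidates, min_mq=30, min_dp=10, max_dp=200):
    """Keep only candidate SNPs that pass mapping-quality and depth filters.

    Very low MQ suggests ambiguous placement of reads (e.g. repeats);
    extreme depth can indicate collapsed repeats in the reference.
    """
    return [
        snp for snp in candidates
        if snp["MQ"] >= min_mq and min_dp <= snp["DP"] <= max_dp
    ]

# Hypothetical candidate calls for demonstration.
calls = [
    {"pos": 101, "MQ": 55, "DP": 42},   # passes both filters
    {"pos": 202, "MQ": 12, "DP": 35},   # fails MQ filter: ambiguous mapping
    {"pos": 303, "MQ": 48, "DP": 900},  # fails depth filter: possible repeat region
]
kept = filter_snps(calls)
print([snp["pos"] for snp in kept])  # → [101]
```

An upper depth bound is included because, as the study notes, false positives often arise where reads from distinct genomic copies are forced onto a single fragmented reference contig, inflating coverage at that site.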

    Purine-Rich Foods Intake and Recurrent Gout Attacks

    OBJECTIVE: To examine and quantify the relation between purine intake and the risk of recurrent gout attacks among gout patients. METHODS: The authors conducted a case-crossover study to examine associations of a set of putative risk factors with recurrent gout attacks. Individuals with gout were prospectively recruited and followed online for 1 year. Participants were asked about the following information when experiencing a gout attack: the onset date of the gout attack, clinical symptoms and signs, medications (including antigout medications), and presence of potential risk factors (including daily intake of various purine-containing food items) during the 2-day period prior to the gout attack. The same exposure information was also assessed over 2-day control periods. RESULTS: This study included 633 participants with gout. Compared with the lowest quintile of total purine intake over a 2-day period, the ORs of recurrent gout attacks were 1.17, 1.38, 2.21, and 4.76, respectively, with each increasing quintile (p for trend <0.001). The corresponding ORs were 1.42, 1.34, 1.77, and 2.41 for increasing quintiles of purine intake from animal sources (p for trend <0.001), and 1.12, 0.99, 1.32, and 1.39 from plant sources (p=0.04), respectively. The effect of purine intake persisted across subgroups by sex and by use of alcohol, diuretics, allopurinol, NSAIDs, and colchicine. CONCLUSIONS: The study findings suggest that acute purine intake increases the risk of recurrent gout attacks by almost fivefold among gout patients. Avoiding or reducing intake of purine-rich foods, especially those of animal origin, may help reduce the risk of gout attacks.
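In a case-crossover design such as this one, each participant's exposure before an attack is compared with that same participant's exposure during a control period, so every subject serves as their own control. With matched 2-day periods, a crude odds ratio can be estimated from the discordant pairs alone. The study itself used more sophisticated conditional regression modeling; the counts below are hypothetical and purely illustrative:

```python
# Crude matched-pair (McNemar-type) odds ratio for a case-crossover comparison:
#   OR = (pairs exposed in the hazard period only)
#        / (pairs exposed in the control period only)
# Concordant pairs (exposed in both or neither period) carry no information.
# The counts used below are hypothetical, not the study's data.

def matched_pair_or(exposed_case_only, exposed_control_only):
    """Discordant-pair odds ratio for a 1:1 matched design."""
    if exposed_control_only == 0:
        raise ValueError("OR undefined when no control-only-exposed pairs exist")
    return exposed_case_only / exposed_control_only

# Hypothetical counts: 120 participants reported high purine intake only
# before an attack, 50 only during a control period.
print(round(matched_pair_or(120, 50), 2))  # → 2.4
```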

    Barriers to Care and 1-Year Mortality Among Newly Diagnosed HIV-Infected People in Durban, South Africa

    Background: Prompt entry into HIV care is often hindered by personal and structural barriers. Our objective was to evaluate the impact of self-perceived barriers to health care on 1-year mortality among newly diagnosed HIV-infected individuals in Durban, South Africa. Methods: Before HIV testing at 4 outpatient sites, adults (≥18 years) were surveyed regarding perceived barriers to care, including (1) service delivery, (2) financial, (3) personal health perception, (4) logistical, and (5) structural barriers. We assessed deaths via phone calls and the South African National Population Register. We used multivariable Cox proportional hazards models to determine the association between number of perceived barriers and death within 1 year. Results: One thousand eight hundred ninety-nine HIV-infected participants enrolled. Median age was 33 years (interquartile range: 27–41 years), 49% were female, and median CD4 count was 192/μL (interquartile range: 72–346/μL). One thousand fifty-seven participants (56%) reported no barriers to care, 370 (20%) reported 1–3, and 460 (24%) reported >3. By 1 year, 250 [13%, 95% confidence interval (CI): 12% to 15%] participants had died. Adjusting for age, sex, education, baseline CD4 count, distance to clinic, and tuberculosis status, participants with 1–3 barriers (adjusted hazard ratio: 1.49, 95% CI: 1.06 to 2.08) and >3 barriers (adjusted hazard ratio: 1.81, 95% CI: 1.35 to 2.43) had higher 1-year mortality risk compared with those without barriers. Conclusions: HIV-infected individuals in South Africa who reported perceived barriers to medical care at diagnosis were more likely to die within 1 year. Targeted structural interventions, such as extended clinic hours, travel vouchers, and streamlined clinic operations, may improve linkage to care and antiretroviral therapy initiation for these people.

    The Impact of a Novel Computer-Based Decision Aid on Shared Decision-Making for Colorectal Cancer Screening: A Randomized Trial

    Eliciting patients’ preferences within a framework of shared decision-making (SDM) has been advocated as a strategy for increasing colorectal cancer (CRC) screening adherence. Our objective was to assess the effectiveness of a novel decision aid on SDM in the primary care setting.

    Aid-Assisted Decision-Making and Colorectal Cancer Screening: A Randomized Controlled Trial

    Shared decision-making (SDM) is a widely recommended yet unproven strategy for increasing colorectal cancer (CRC) screening uptake. Previous trials of decision aids to increase SDM and CRC screening uptake have yielded mixed results.