Efficiency and safety of varying the frequency of whole blood donation (INTERVAL): a randomised trial of 45 000 donors
Background:
Limits on the frequency of whole blood donation exist primarily to safeguard donor health. However, there is substantial variation across blood services in the maximum frequency of donations allowed. We compared standard practice in the UK with shorter inter-donation intervals used in other countries.
Methods:
In this parallel group, pragmatic, randomised trial, we recruited whole blood donors aged 18 years or older from 25 centres across England, UK. By use of a computer-based algorithm, men were randomly assigned (1:1:1) to 12-week (standard) versus 10-week versus 8-week inter-donation intervals, and women were randomly assigned (1:1:1) to 16-week (standard) versus 14-week versus 12-week intervals. Participants were not masked to their allocated intervention group. The primary outcome was the number of donations over 2 years. Secondary outcomes related to safety were quality of life, symptoms potentially related to donation, physical activity, cognitive function, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin. This trial is registered with ISRCTN, number ISRCTN24760606, and is ongoing but no longer recruiting participants.
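The abstract specifies only that allocation used a computer-based algorithm with a 1:1:1 ratio. As an illustration of how such an allocation is commonly implemented, here is a minimal permuted-block randomisation sketch in Python; the block size and arm labels are assumptions for illustration, not taken from the INTERVAL protocol:

```python
import random

def permuted_block_randomise(n_participants,
                             arms=("12-week", "10-week", "8-week"),
                             block_size=6):
    """Assign participants to arms using permuted blocks.

    Each block contains every arm an equal number of times, so the
    allocation ratio stays 1:1:1 even if recruitment stops mid-block.
    Block size and arm labels are illustrative assumptions only.
    """
    assert block_size % len(arms) == 0
    allocations = []
    while len(allocations) < n_participants:
        block = list(arms) * (block_size // len(arms))
        random.shuffle(block)   # randomise order within the block
        allocations.extend(block)
    return allocations[:n_participants]

# Example: allocate 12 male donors across the three inter-donation intervals
print(permuted_block_randomise(12))
```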
Findings:
45 263 whole blood donors (22 466 men, 22 797 women) were recruited between June 11, 2012, and June 15, 2014. Data were analysed for 45 042 (99·5%) participants. Men were randomly assigned to the 12-week (n=7452) versus 10-week (n=7449) versus 8-week (n=7456) groups; and women to the 16-week (n=7550) versus 14-week (n=7567) versus 12-week (n=7568) groups. In men, compared with the 12-week group, the mean amount of blood collected per donor over 2 years increased by 1·69 units (95% CI 1·59–1·80; approximately 795 mL) in the 8-week group and by 0·79 units (0·69–0·88; approximately 370 mL) in the 10-week group (p<0·0001 for both). In women, compared with the 16-week group, it increased by 0·84 units (95% CI 0·76–0·91; approximately 395 mL) in the 12-week group and by 0·46 units (0·39–0·53; approximately 215 mL) in the 14-week group (p<0·0001 for both). No significant differences were observed in quality of life, physical activity, or cognitive function across randomised groups. However, compared with the standard-frequency groups, more frequent donation resulted in more donation-related symptoms (eg, tiredness, breathlessness, feeling faint, dizziness, and restless legs, especially among men, for all listed symptoms), lower mean haemoglobin and ferritin concentrations, and more deferrals for low haemoglobin (p<0·0001 for each).
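The quoted volumes are consistent with roughly 470 mL per whole blood unit; this conversion factor is implied by the reported figures rather than stated in the abstract. A quick arithmetic check:

```python
# Reported differences in units collected over 2 years versus the standard
# interval, paired with the approximate volumes quoted in the abstract.
ML_PER_UNIT = 470  # implied by the abstract's own figures; an assumption

reported = {
    "men, 8-week":    (1.69, 795),
    "men, 10-week":   (0.79, 370),
    "women, 12-week": (0.84, 395),
    "women, 14-week": (0.46, 215),
}

for group, (units, quoted_ml) in reported.items():
    computed = units * ML_PER_UNIT
    print(f"{group}: {units} units x {ML_PER_UNIT} mL = {computed:.0f} mL "
          f"(abstract quotes ~{quoted_ml} mL)")
```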
Interpretation:
Over 2 years, more frequent donation than is standard practice in the UK collected substantially more blood without having a major effect on donors' quality of life, physical activity, or cognitive function, but resulted in more donation-related symptoms, deferrals, and iron deficiency.
Funding:
NHS Blood and Transplant, National Institute for Health Research, UK Medical Research Council, and British Heart Foundation
Certolizumab pegol prevents pro-inflammatory alterations in endothelial cell function
Background: Cardiovascular disease is a major comorbidity of rheumatoid arthritis (RA) and a leading cause of death. Chronic systemic inflammation involving tumour necrosis factor alpha (TNF) could contribute to endothelial activation and atherogenesis. A number of anti-TNF therapies are in current use for the treatment of RA, including certolizumab pegol (CZP; Cimzia®, UCB, Belgium). Anti-TNF therapy has been associated with reduced clinical cardiovascular disease risk and ameliorated vascular function in RA patients. However, the specific effects of TNF inhibitors on endothelial cell function are largely unknown. Our aim was to investigate the mechanisms underpinning the effects of CZP on TNF-activated human endothelial cells.
Methods: Human aortic endothelial cells (HAoECs) were cultured in vitro and exposed to a) TNF alone, b) TNF plus CZP, or c) neither agent. Microarray analysis was used to examine the transcriptional profile of cells treated for 6 hrs, and quantitative polymerase chain reaction (qPCR) analysed gene expression at 1, 3, 6 and 24 hrs. NF-κB localization and IκB degradation were investigated using immunocytochemistry, high-content analysis and western blotting. Flow cytometry was conducted to detect microparticle release from HAoECs.
Results: Transcriptional profiling revealed that while TNF alone had strong effects on endothelial gene expression, TNF and CZP in combination produced a global gene expression pattern similar to untreated control. The two most highly up-regulated genes in response to TNF treatment were the adhesion molecules E-selectin and VCAM-1 (q < 0.2 compared to control; p > 0.05 compared to TNF alone). The NF-κB pathway was confirmed as a downstream target of TNF-induced HAoEC activation, via nuclear translocation of NF-κB and degradation of IκB, effects which were abolished by treatment with CZP. In addition, flow cytometry detected increased production of endothelial microparticles in TNF-activated HAoECs, which was prevented by treatment with CZP.
Conclusions: We have found at a cellular level that a clinically available TNF inhibitor, CZP, reduces adhesion molecule expression and prevents TNF-induced activation of the NF-κB pathway. Furthermore, CZP prevents the production of microparticles by activated endothelial cells. This could be central to preventing the inflammatory environment underlying cardiovascular disease in RA, and measurement of microparticles has potential as a novel prognostic marker for future cardiovascular events in this patient group.
Disclosure statement: Y.A. received a research grant from UCB. I.B. received a research grant from UCB. S.H. received a research grant from UCB. All other authors have declared no conflicts of interest.
Small- and large-scale network structure of live fish movements in Scotland
Networks are increasingly being used as an epidemiological tool for studying the potential for disease transmission through animal movements in farming industries. We analysed the network of live fish movements for commercial salmonids in Scotland in 2003. This network showed a mixture of features that both aid and hinder disease transmission: spread is hindered by the network being fragmented, by a comparatively low mean number of connections (2.83), and by a low correlation between inward and outward connections (0.12) with moderate variance in these numbers (coefficients of dispersion of 0.99 for inward and 3.12 for outward connections); it is aided by low levels of clustering (0.060) and some non-random mixing (coefficient of assortativity 0.16). The estimated inter-site basic reproduction number R0 did not exceed 2.4 even at a high transmission rate. The network was strongly organised into communities, resulting in a high modularity index (0.82). Arc (directed connection) removal indicated that effective surveillance of a small number of connections may facilitate a large reduction in the potential for disease spread within the industry. Useful criteria for identifying these important arcs included degree- and betweenness-based measures, which could in future prove useful for prioritising surveillance
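For readers who want to compute metrics of this kind, here is a sketch using networkx on a toy directed movement network; the graph is purely illustrative, and none of the Scottish data are reproduced:

```python
import networkx as nx
import numpy as np
from networkx.algorithms.community import greedy_modularity_communities, modularity
from statistics import mean, variance

# Toy directed network of live fish movements between sites (illustrative only)
G = nx.DiGraph([("A", "B"), ("B", "C"), ("A", "C"),
                ("D", "A"), ("C", "D"), ("E", "A")])

in_deg = [d for _, d in G.in_degree()]
out_deg = [d for _, d in G.out_degree()]

# Mean number of connections and coefficients of dispersion (variance / mean)
print("mean degree:", mean(out_deg))
print("dispersion (in, out):",
      variance(in_deg) / mean(in_deg), variance(out_deg) / mean(out_deg))

# Correlation between each site's inward and outward connections
print("in/out correlation:", np.corrcoef(in_deg, out_deg)[0, 1])

# Clustering and non-random mixing (degree assortativity)
U = G.to_undirected()
print("mean clustering:", nx.average_clustering(U))
print("assortativity:", nx.degree_assortativity_coefficient(G))

# Community structure and modularity
communities = greedy_modularity_communities(U)
print("modularity:", modularity(U, communities))

# Betweenness of arcs, a candidate criterion for targeted surveillance
print("edge betweenness:", nx.edge_betweenness_centrality(G))
```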
Nucleocapsid protein structures from orthobunyaviruses reveal insight into ribonucleoprotein architecture and RNA polymerization
All orthobunyaviruses possess three genome segments of single-stranded negative sense RNA that are encapsidated with the virus-encoded nucleocapsid (N) protein to form a ribonucleoprotein (RNP) complex, which is uncharacterized at high resolution. We report the crystal structure of both the Bunyamwera virus (BUNV) N–RNA complex and the unbound Schmallenberg virus (SBV) N protein, at resolutions of 3.20 and 2.75 Å, respectively. Both N proteins crystallized as ring-like tetramers and exhibit a high degree of structural similarity despite classification into different orthobunyavirus serogroups. The structures represent a new RNA-binding protein fold. BUNV N possesses a positively charged groove into which RNA is deeply sequestered, with the bases facing away from the solvent. This location is highly inaccessible, implying that RNA polymerization and other critical base pairing events in the virus life cycle require RNP disassembly. Mutational analysis of N protein supports a correlation between structure and function. Comparison between these crystal structures and electron microscopy images of both soluble tetramers and authentic RNPs suggests the N protein does not bind RNA as a repeating monomer; thus, it represents a newly described architecture for bunyavirus RNP assembly, with implications for many other segmented negative-strand RNA viruses
Implementing interventions to reduce antibiotic use: a qualitative study in high-prescribing practices.
Background: Trials have shown that delayed antibiotic prescriptions (DPs) and point-of-care C-reactive protein testing (POC-CRPT) are effective in reducing antibiotic use in general practice, but these have not typically been implemented in high-prescribing practices. We aimed to explore the views of professionals from high-prescribing practices about the uptake and implementation of DPs and POC-CRPT to reduce antibiotic use.
Methods: This was a qualitative focus group study in English general practices. The highest antibiotic-prescribing practices in the West Midlands were invited to participate. Clinical and non-clinical professionals attended focus groups co-facilitated by two researchers. Focus groups were audio-recorded, transcribed verbatim and analysed thematically.
Results: Nine practices (50 professionals) participated. Four main themes were identified. Compatibility of strategies with clinical roles and experience: participants viewed the strategies as having limited value as 'clinical tools', perceiving them as useful only in 'rare' instances of clinical uncertainty and/or for those less experienced. Strategies as 'social tools': participants perceived the strategies as helpful for negotiating treatment decisions and educating patients, particularly those expecting antibiotics. Ambiguities: participants perceived ambiguities around when the strategies should be used and about their impact on antibiotic use. Influence of context: various other situational and practical issues were raised in relation to implementing the strategies.
Conclusions: High-prescribing practices do not view DPs and POC-CRPT as sufficiently useful 'clinical tools' in the way the current policy approach assumes when advocating their use to reduce clinical uncertainty and improve antimicrobial stewardship. Policy attention should instead focus on how these strategies may be used as 'social tools' to reduce unnecessary antibiotic use. Attention should also focus on the many ambiguities (concerns and questions) about, and contextual barriers to, using these strategies that need addressing to support wider and more consistent implementation.
Technology-assisted education in graduate medical education: a review of the literature
Studies on computer-aided instruction and web-based learning have left many questions unanswered about the most effective use of technology-assisted education in graduate medical education
Longer-term efficiency and safety of increasing the frequency of whole blood donation (INTERVAL): extension study of a randomised trial of 20 757 blood donors
Background:
The INTERVAL trial showed that, over a 2-year period, inter-donation intervals for whole blood donation can be safely reduced to meet blood shortages. We extended the INTERVAL trial for a further 2 years to evaluate the longer-term risks and benefits of varying inter-donation intervals, and to compare routine versus more intensive reminders to help donors keep appointments.
Methods:
The INTERVAL trial was a parallel group, pragmatic, randomised trial that recruited blood donors aged 18 years or older from 25 static donor centres of NHS Blood and Transplant across England, UK. Here we report on the prespecified analyses after 4 years of follow-up. Participants were whole blood donors who agreed to continue trial participation on their originally allocated inter-donation intervals (men: 12, 10, and 8 weeks; women: 16, 14, and 12 weeks). They were further block-randomised (1:1) to routine versus more intensive reminders using computer-generated random sequences. The prespecified primary outcome was units of blood collected per year analysed in the intention-to-treat population. Secondary outcomes related to safety were quality of life, self-reported symptoms potentially related to donation, haemoglobin and ferritin concentrations, and deferrals because of low haemoglobin and other factors. This trial is registered with ISRCTN, number ISRCTN24760606, and has completed.
Findings:
Between Oct 19, 2014, and May 3, 2016, 20 757 of the 38 035 invited blood donors (10 843 [58%] of invited men, 9914 [51%] of invited women) participated in the extension study. 10 378 (50%) were randomly assigned to routine reminders and 10 379 (50%) were randomly assigned to more intensive reminders. Median follow-up was 1·1 years (IQR 0·7–1·3). Compared with routine reminders, more intensive reminders increased blood collection by a mean of 0·11 units per year (95% CI 0·04–0·17; p=0·0003) in men and 0·06 units per year (0·01–0·11; p=0·0094) in women. During the extension study, each week shorter inter-donation interval increased blood collection by a mean of 0·23 units per year (0·21–0·25) in men and 0·14 units per year (0·12–0·15) in women (both p<0·0001). More frequent donation resulted in more deferrals for low haemoglobin (odds ratio per week shorter inter-donation interval 1·19 [95% CI 1·15–1·22] in men and 1·10 [1·06–1·14] in women), lower mean haemoglobin (difference per week shorter inter-donation interval −0·84 g/L [95% CI −0·99 to −0·70] in men and −0·45 g/L [−0·59 to −0·31] in women), and lower ferritin concentrations (percentage difference per week shorter inter-donation interval −6·5% [95% CI −7·6 to −5·5] in men and −5·3% [−6·5 to −4·2] in women; all p<0·0001). No differences were observed in quality of life, serious adverse events, or self-reported symptoms (p>0·0001 for tests of linear trend by inter-donation intervals), other than a higher reported frequency of doctor-diagnosed low iron concentrations and prescription of iron supplements in men (p<0·0001).
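The "per week shorter inter-donation interval" estimates come from treating the interval as a continuous covariate, so exponentiating a logistic regression slope yields the odds ratio per week. A minimal sketch on synthetic data; the simulated effect size merely echoes the reported OR of 1·19 in men, and nothing here is trial data:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Synthetic donors: weeks shorter than the standard interval (0, 2, or 4)
# and whether each donor was deferred for low haemoglobin.
weeks_shorter = rng.choice([0, 2, 4], size=5000)
# Simulate a true log-odds increase of ~0.17 per week shorter (OR ~ 1.19)
p = 1 / (1 + np.exp(-(-2.0 + 0.17 * weeks_shorter)))
deferred = rng.binomial(1, p)

X = sm.add_constant(weeks_shorter)
fit = sm.Logit(deferred, X).fit(disp=False)

# exp(slope) is the odds ratio per week shorter inter-donation interval
print("OR per week shorter:", np.exp(fit.params[1]))
```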
Interpretation:
During a period of up to 4 years, shorter inter-donation intervals and more intensive reminders resulted in more blood being collected without a detectable effect on donors' mental and physical wellbeing. However, donors had decreased haemoglobin concentrations and more self-reported symptoms compared with the initial 2 years of the trial. Our findings suggest that blood collection services could safely use shorter donation intervals and more intensive reminders to meet shortages, for donors who maintain adequate haemoglobin concentrations and iron stores.
Funding:
NHS Blood and Transplant, UK National Institute for Health Research, UK Medical Research Council, and British Heart Foundation
Challenges of COVID-19 Case Forecasting in the US, 2020–2021
During the COVID-19 pandemic, forecasting COVID-19 trends to support planning and response was a priority for scientists and decision makers alike. In the United States, COVID-19 forecasting was coordinated by a large group of universities, companies, and government entities led by the Centers for Disease Control and Prevention and the US COVID-19 Forecast Hub (https://covid19forecasthub.org). We evaluated approximately 9.7 million forecasts of weekly state-level COVID-19 cases for predictions 1–4 weeks into the future submitted by 24 teams from August 2020 to December 2021. We assessed coverage of central prediction intervals and weighted interval scores (WIS), adjusting for missing forecasts relative to a baseline forecast, and used a Gaussian generalized estimating equation (GEE) model to evaluate differences in skill across epidemic phases that were defined by the effective reproduction number. Overall, we found high variation in skill across individual models, with ensemble-based forecasts outperforming other approaches. Forecast skill relative to the baseline was generally higher for larger jurisdictions (e.g., states compared to counties). Over time, forecasts generally performed worst during periods of rapid change in reported cases (in either increasing or decreasing epidemic phases), with 95% prediction interval coverage dropping below 50% during the growth phases of the winter 2020, Delta, and Omicron waves. Ideally, case forecasts could serve as a leading indicator of changes in transmission dynamics. However, while most COVID-19 case forecasts outperformed a naïve baseline model, even the most accurate case forecasts were unreliable in key phases. Further research could improve forecasts of leading indicators, like COVID-19 cases, by leveraging additional real-time data, addressing performance across phases, improving the characterization of forecast confidence, and ensuring that forecasts are coherent across spatial scales. In the meantime, it is critical for forecast users to appreciate current limitations and use a broad set of indicators to inform pandemic-related decision making
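The weighted interval score used in this evaluation combines interval widths with penalties for missing the observation. Below is a minimal sketch of the standard WIS definition, written from the published formula (Bracher and colleagues) rather than from the Hub's own evaluation code:

```python
def interval_score(y, lower, upper, alpha):
    """Interval score of a central (1 - alpha) prediction interval at observation y."""
    return (upper - lower) \
        + (2 / alpha) * max(lower - y, 0) \
        + (2 / alpha) * max(y - upper, 0)

def weighted_interval_score(y, median, intervals):
    """Standard WIS: weighted sum of the absolute error of the median and the
    interval scores of K central intervals, with w0 = 1/2 and w_k = alpha_k / 2.

    `intervals` maps alpha -> (lower, upper) for each central (1 - alpha) interval.
    """
    total = 0.5 * abs(y - median)
    for alpha, (lower, upper) in intervals.items():
        total += (alpha / 2) * interval_score(y, lower, upper, alpha)
    return total / (len(intervals) + 0.5)

# Toy example: 120 observed weekly cases scored against one forecast
print(weighted_interval_score(
    y=120, median=100,
    intervals={0.5: (80, 130), 0.2: (60, 150), 0.05: (40, 190)},
))
```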
Evaluation of individual and ensemble probabilistic forecasts of COVID-19 mortality in the United States
Short-term probabilistic forecasts of the trajectory of the COVID-19 pandemic in the United States have served as a visible and important communication channel between the scientific modeling community and both the general public and decision-makers. Forecasting models provide specific, quantitative, and evaluable predictions that inform short-term decisions such as healthcare staffing needs, school closures, and allocation of medical supplies. Starting in April 2020, the US COVID-19 Forecast Hub (https://covid19forecasthub.org/) collected, disseminated, and synthesized tens of millions of specific predictions from more than 90 different academic, industry, and independent research groups. A multimodel ensemble forecast that combined predictions from dozens of groups every week provided the most consistently accurate probabilistic forecasts of incident deaths due to COVID-19 at the state and national level from April 2020 through October 2021. The performance of 27 individual models that submitted complete forecasts of COVID-19 deaths consistently throughout this period showed high variability in forecast skill across time, geospatial units, and forecast horizons. Two-thirds of the models evaluated showed better accuracy than a naïve baseline model. Forecast accuracy degraded as models made predictions further into the future, with probabilistic error at a 20-wk horizon three to five times larger than when predicting at a 1-wk horizon. This project underscores the role that collaboration and active coordination between governmental public-health agencies, academic modeling teams, and industry partners can play in developing modern modeling capabilities to support local, state, and federal response to outbreaks
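The multimodel ensemble combined component forecasts quantile by quantile. Here is a minimal sketch of a median-of-quantiles combination; this is one representative rule, not the Hub's exact production method, which evolved over the project:

```python
from statistics import median

def median_ensemble(component_forecasts):
    """Combine forecasts quantile by quantile: the ensemble value at each
    quantile level is the median of the component models' values at that
    level. A representative combination rule, not the Hub's exact code."""
    levels = sorted(component_forecasts[0])
    return {q: median(f[q] for f in component_forecasts) for q in levels}

# Three toy model forecasts of weekly deaths at the 0.25/0.5/0.75 quantiles
models = [
    {0.25: 90, 0.5: 110, 0.75: 140},
    {0.25: 80, 0.5: 100, 0.75: 130},
    {0.25: 95, 0.5: 120, 0.75: 160},
]
print(median_ensemble(models))
```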
Genetic determinants of risk in pulmonary arterial hypertension:international genome-wide association studies and meta-analysis
Background: Rare genetic variants cause pulmonary arterial hypertension, but the contribution of common genetic variation to disease risk and natural history is poorly characterised. We tested for genome-wide association for pulmonary arterial hypertension in large international cohorts and assessed the contribution of associated regions to outcomes. Methods: We did two separate genome-wide association studies (GWAS) and a meta-analysis of pulmonary arterial hypertension. These GWAS used data from four international case-control studies across 11 744 individuals with European ancestry (including 2085 patients). One GWAS used genotypes from 5895 whole-genome sequences and the other GWAS used genotyping array data from an additional 5849 individuals. Cross-validation of loci reaching genome-wide significance was sought by meta-analysis. Conditional analysis corrected for the most significant variants at each locus was used to resolve signals for multiple associations. We functionally annotated associated variants and tested associations with duration of survival. All-cause mortality was the primary endpoint in survival analyses. Findings: A locus near SOX17 (rs10103692, odds ratio 1·80 [95% CI 1·55–2·08], p=5·13 × 10⁻¹⁵) and a second locus in HLA-DPA1 and HLA-DPB1 (collectively referred to as HLA-DPA1/DPB1 here; rs2856830, 1·56 [1·42–1·71], p=7·65 × 10⁻²⁰) within the class II MHC region were associated with pulmonary arterial hypertension. The SOX17 locus had two independent signals associated with pulmonary arterial hypertension (rs13266183, 1·36 [1·25–1·48], p=1·69 × 10⁻¹²; and rs10103692). Functional and epigenomic data indicate that the risk variants near SOX17 alter gene regulation via an enhancer active in endothelial cells. Pulmonary arterial hypertension risk variants determined haplotype-specific enhancer activity, and CRISPR-mediated inhibition of the enhancer reduced SOX17 expression. The HLA-DPA1/DPB1 rs2856830 genotype was strongly associated with survival. Median survival from diagnosis in patients with pulmonary arterial hypertension with the C/C homozygous genotype was double (13·50 years [95% CI 12·07 to >13·50]) that of those with the T/T genotype (6·97 years [6·02–8·05]), despite similar baseline disease severity. Interpretation: This is the first study to report that common genetic variation at loci in an enhancer near SOX17 and in HLA-DPA1/DPB1 is associated with pulmonary arterial hypertension. Impairment of SOX17 function might be more common in pulmonary arterial hypertension than suggested by rare mutations in SOX17. Further studies are needed to confirm the association between HLA typing or rs2856830 genotyping and survival, and to determine whether HLA typing or rs2856830 genotyping improves risk stratification in clinical practice or trials. Funding: UK NIHR, BHF, UK MRC, Dinosaur Trust, NIH/NHLBI, ERS, EMBO, Wellcome Trust, EU, AHA, ACClinPharm, Netherlands CVRI, Dutch Heart Foundation, Dutch Federation of UMC, Netherlands OHRD and RNAS, German DFG, German BMBF, APH Paris, INSERM, Université Paris-Sud, and French ANR.
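The survival comparison by rs2856830 genotype rests on standard Kaplan-Meier estimation of median survival. A minimal sketch with the lifelines library on simulated durations; the scales are chosen only to echo the reported medians, and no patient data are used:

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(1)

# Simulated survival from diagnosis (years) for two genotype groups, with
# exponential scales set so the medians loosely echo 13.5 vs 7.0 years.
cc_times = rng.exponential(scale=13.5 / np.log(2), size=200)
tt_times = rng.exponential(scale=7.0 / np.log(2), size=200)
observed = np.ones(200, dtype=bool)  # no censoring in this toy example

for label, times in [("C/C", cc_times), ("T/T", tt_times)]:
    kmf = KaplanMeierFitter()
    kmf.fit(times, event_observed=observed, label=label)
    print(label, "median survival:",
          round(kmf.median_survival_time_, 2), "years")
```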