Infectious disease and health systems modelling for local decision making to control neglected tropical diseases
Most neglected tropical diseases (NTDs) have complex life cycles and are challenging to control. The “2020 goals” of control and elimination as a public health problem for a number of NTDs are the subject of significant international efforts and investments. Beyond 2020 there will be a drive to maintain these gains and to push for true local elimination of transmission. However, these diseases are affected by variations in vectors, human demography, access to water and sanitation, access to interventions and local health systems. We therefore argue that there will be a need to develop local quantitative expertise to support elimination efforts. If available now, quantitative analyses would provide updated estimates of the burden of disease, assist in the design of locally appropriate control programmes, estimate the effectiveness of current interventions and support ‘real-time’ updates to local operations. Such quantitative tools are increasingly available at an international scale for NTDs, but are rarely tailored to local scenarios. Localised expertise not only provides an opportunity for more relevant analyses, but also has a greater chance of developing positive feedback between data collection and analysis by demonstrating the value of data. This is essential, as rational programme design relies on good-quality data collection. It is also likely that if such infrastructure is provided for NTDs there will be an additional impact on the health system more broadly. Locally tailored quantitative analyses can help achieve sustainable and effective control of NTDs, but also underpin the development of local health care systems.
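The kind of locally tailored quantitative analysis argued for here can be illustrated with a toy model. The sketch below is not taken from the authors' work: it assumes a generic SIS-type compartmental transmission model with one mass drug administration (MDA) round per year, and all parameter values (beta, gamma, coverage, efficacy) are invented for illustration only.

```python
# Minimal sketch of a locally tailored transmission model (illustrative only).
# Assumptions not taken from the text: a generic SIS-type compartmental model,
# an annual MDA round with configurable coverage, and made-up parameter values.
import numpy as np

def simulate_sis_with_mda(beta=0.4, gamma=0.1, coverage=0.6, efficacy=0.9,
                          years=10, steps_per_year=365, i0=0.3):
    """Fraction of the population infected over time under yearly MDA."""
    dt = 1.0 / steps_per_year
    i = i0
    trajectory = []
    for step in range(years * steps_per_year):
        # Transmission and natural recovery (simple SIS dynamics).
        di = beta * i * (1 - i) - gamma * i
        i = max(0.0, i + di * dt)
        # Once a year, treat a fraction `coverage` of the population;
        # treatment clears infection with probability `efficacy`.
        if step % steps_per_year == 0 and step > 0:
            i *= (1 - coverage * efficacy)
        trajectory.append(i)
    return np.array(trajectory)

print(f"Prevalence after 10 years at 60% coverage: "
      f"{simulate_sis_with_mda(coverage=0.6)[-1]:.3f}")
print(f"Prevalence after 10 years at 80% coverage: "
      f"{simulate_sis_with_mda(coverage=0.8)[-1]:.3f}")
```

Re-running the sketch with different coverage values illustrates the sort of ‘what if’ question a local programme could answer once it is parameterised with its own data.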
Mass drug administration and beyond: how can we strengthen health systems to deliver complex interventions to eliminate neglected tropical diseases?
Achieving the 2020 goals for Neglected Tropical Diseases (NTDs) requires scale-up of Mass Drug Administration (MDA), which will require long-term commitment from national and global financing partners, strengthened national capacity and, at the community level, systems to monitor and evaluate activities and impact.
For some settings and diseases, MDA is not appropriate and alternative interventions are required. Operational research is necessary to identify how existing MDA networks can deliver this more complex range of interventions equitably.
The final stages of the different global programmes to eliminate NTDs require eliminating foci of transmission which are likely to persist in complex and remote rural settings. Operational research is required to identify how current tools and practices might be adapted to locate and eliminate these hard-to-reach foci.
Chronic disabilities caused by NTDs will persist after transmission of pathogens ceases. Development and delivery of sustainable services to reduce the NTD-related disability is an urgent public health priority.
LSTM and its partners are world leaders in developing and delivering interventions to control vector-borne NTDs and malaria, particularly in hard-to-reach settings in Africa. Our experience, partnerships and research capacity allow us to serve as a hub for developing, supporting, monitoring and evaluating global programmes to eliminate NTDs.
Fit for purpose: do we have the right tools to sustain NTD elimination?
Priorities for NTD control programmes will shift over the next 10–20 years as the elimination phase reaches the ‘end game’ for some NTDs and with the recognition that the control of other NTDs is much more problematic. The current goal of scaling up programmes based on preventive chemotherapy (PCT) will give way to sustaining NTD prevention through sensitive surveillance and rapid response to resurgence. A new suite of tools and approaches will be required for both PCT and Intensive Disease Management (IDM) diseases in this timeframe to enable disease-endemic countries to:
1. Sensitively and sustainably survey NTD transmission and prevalence in order to identify and respond quickly to resurgence.
2. Set relevant control targets based not only on epidemiological indicators but also entomological and ecological metrics and use decision support technology to help meet those targets.
3. Implement verified and cost-effective tools to prevent transmission throughout the elimination phase.
Liverpool School of Tropical Medicine (LSTM) and partners propose to evaluate and implement existing tools from other disease systems, as well as new tools in the pipeline, in order to support endemic-country ownership of NTD decision-making during the elimination phase and beyond.
The Holocene humid period in the Nefud Desert: Hunters and herders in the Jebel Oraf palaeolake basin, Saudi Arabia
Archaeological surveys and excavations in the Jebel Oraf palaeolake basin, north-western Saudi Arabia, have identified a well-preserved early- to mid-Holocene landscape. Two types of occupation site can be distinguished: nine small and ephemeral scatters from single occupation phases on the slopes of sand dunes and three hearth sites indicative of repeated occupation on palaeolake shorelines. In addition, 245 rock art panels, 81 cairns, and 15 stone structures were recorded. This diverse dataset provides an opportunity to reconstruct occupation patterns and changes in landscape use. A particularly important site, Jebel Oraf 2, documents two episodes of lake high stands at ca. 6500 BC and 5300 BC, flooding parts of the locality. Neolithic pastoralists likely occupied the site after the end of the wet season, when the terrain was dry. Earlier sites are located in dune embayments some 7–14 m above the shore of the palaeolake. These locations are consistent with hunting strategies identifiable in the rock art, which suggest wildlife was ambushed at watering places. Later rock art at Jebel Oraf also documents the hunting of wild camel in the Iron Age. The lithic industries documented in the Jebel Oraf basin support arguments for repeated contact with Levantine populations.
Evaluation of a novel malaria anti-sporozoite vaccine candidate, R21 in Matrix-M adjuvant, in the UK and Burkina Faso: two phase 1, first-in-human trials
Background
Malaria remains a substantial public health burden among young children in sub-Saharan Africa and a highly efficacious vaccine eliciting a durable immune response would be a useful tool for controlling malaria. R21 is a malaria vaccine comprising nanoparticles, formed from a circumsporozoite protein and hepatitis B surface antigen (HBsAg) fusion protein, without any unfused HBsAg, and is administered with the saponin-based Matrix-M adjuvant. This study aimed to assess the safety and immunogenicity of the malaria vaccine candidate, R21, administered with or without adjuvant Matrix-M in adults naïve to malaria infection and in healthy adults from malaria endemic areas.
Methods
In this Article we report two phase 1, first-in-human trials. The first trial was a phase 1a open-label study in the UK evaluating the safety and immunogenicity of R21 administered either alone or with 50 μg of Matrix-M. The second trial was a phase 1b randomised controlled trial in Burkina Faso. Adults had to be aged 18–50 years for enrolment in the phase 1a trial, and 18–45 years in the phase 1b trial. The phase 1a trial doses were 2 μg, 10 μg, and 50 μg R21/Matrix-M, and 50 μg R21 only. The phase 1b trial doses were 10 μg R21/Matrix-M and saline placebo. Matrix-M was always dosed at 50 μg. In the phase 1b trial, participants were block randomised into study groups by an independent statistician based at the University of Oxford using a randomisation code list, with allocation concealed using opaque sealed envelopes. The primary objective of the phase 1a trial was to assess the safety and tolerability of R21 with and without Matrix-M. The primary objective of the phase 1b trial was to assess the safety and tolerability of R21 with Matrix-M. Both trials are registered with ClinicalTrials.gov, NCT02572388 for phase 1a and NCT02925403 for phase 1b, and are completed.
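For illustration, a blocked randomisation list of the kind described for the phase 1b trial can be sketched as follows. The block size, the 1:1 allocation ratio (which need not match the trial), the arm labels and the seed are assumptions; in the trial itself the list was prepared by an independent statistician and allocations were concealed in opaque sealed envelopes.

```python
# Illustrative sketch of a blocked randomisation list. Block size, seed,
# allocation ratio and arm labels are assumptions, not details from the text.
import random

def block_randomisation_list(n_participants, arms=("R21/Matrix-M", "placebo"),
                             block_size=4, seed=1234):
    """Generate an allocation list in randomly permuted, balanced blocks."""
    rng = random.Random(seed)
    per_block = block_size // len(arms)
    allocations = []
    while len(allocations) < n_participants:
        block = list(arms) * per_block     # balanced block, e.g. 2 + 2
        rng.shuffle(block)                 # random order within the block
        allocations.extend(block)
    return allocations[:n_participants]

for participant, arm in enumerate(block_randomisation_list(13), start=1):
    print(f"Participant {participant:02d}: {arm}")
```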
Findings
Between Oct 1, 2015, and Jan 3, 2017, 31 individuals were enrolled in the phase 1a study. Six individuals were assigned to receive 2 μg R21/Matrix-M, eleven to 10 μg R21/Matrix-M, ten to 50 μg R21/Matrix-M, and four to 50 μg R21 only. Between Aug 26, 2016, and Sept 28, 2017, 13 individuals were enrolled in the phase 1b study. Eight individuals were assigned to receive 10 μg R21/Matrix-M, and five to placebo. Vaccinations were well tolerated, and most local and systemic adverse events were mild. There were no serious adverse events deemed related to vaccination. Two serious adverse events occurred: the first, in the 10 μg R21/Matrix-M group, was worsening of previously undisclosed or undiagnosed palindromic rheumatism and was deemed unlikely to be related to vaccination; the second, in the 2 μg R21/Matrix-M group, was hospital admission for an unplanned excision of a pre-existing Bartholin’s cyst, also unrelated to vaccination. In the phase 1a study, a total of 21 adverse events were recorded in the 2 μg R21/Matrix-M group, 103 in the 10 μg R21/Matrix-M group, 94 in the 50 μg R21/Matrix-M group, and 21 in the 50 μg R21 alone group. In the phase 1b study, twelve adverse events were recorded in the 10 μg R21/Matrix-M group and none in the placebo group.
Interpretation
R21 with Matrix-M adjuvant has an acceptable safety profile. These data have formed the basis for efficacy testing of this vaccine.
Effect of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker initiation on organ support-free days in patients hospitalized with COVID-19
IMPORTANCE Overactivation of the renin-angiotensin system (RAS) may contribute to poor clinical outcomes in patients with COVID-19.
OBJECTIVE To determine whether angiotensin-converting enzyme (ACE) inhibitor or angiotensin receptor blocker (ARB) initiation improves outcomes in patients hospitalized for COVID-19.
DESIGN, SETTING, AND PARTICIPANTS In an ongoing, adaptive platform randomized clinical trial, 721 critically ill and 58 non–critically ill hospitalized adults were randomized to receive an RAS inhibitor or control between March 16, 2021, and February 25, 2022, at 69 sites in 7 countries (final follow-up on June 1, 2022).
INTERVENTIONS Patients were randomized to receive open-label initiation of an ACE inhibitor (n = 257), ARB (n = 248), ARB in combination with DMX-200 (a chemokine receptor-2 inhibitor; n = 10), or no RAS inhibitor (control; n = 264) for up to 10 days.
MAIN OUTCOMES AND MEASURES The primary outcome was organ support–free days, a composite of hospital survival and days alive without cardiovascular or respiratory organ support through 21 days. The primary analysis was a bayesian cumulative logistic model. Odds ratios (ORs) greater than 1 represent improved outcomes.
RESULTS On February 25, 2022, enrollment was discontinued due to safety concerns. Among 679 critically ill patients with available primary outcome data, the median age was 56 years and 239 participants (35.2%) were women. Median (IQR) organ support–free days among critically ill patients was 10 (–1 to 16) in the ACE inhibitor group (n = 231), 8 (–1 to 17) in the ARB group (n = 217), and 12 (0 to 17) in the control group (n = 231) (median adjusted odds ratios of 0.77 [95% bayesian credible interval, 0.58-1.06] for improvement for ACE inhibitor and 0.76 [95% credible interval, 0.56-1.05] for ARB compared with control). The posterior probabilities that ACE inhibitors and ARBs worsened organ support–free days compared with control were 94.9% and 95.4%, respectively. Hospital survival occurred in 166 of 231 critically ill participants (71.9%) in the ACE inhibitor group, 152 of 217 (70.0%) in the ARB group, and 182 of 231 (78.8%) in the control group (posterior probabilities that ACE inhibitor and ARB worsened hospital survival compared with control were 95.3% and 98.1%, respectively).
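As a rough illustration of how the reported posterior probabilities relate to the reported odds ratios, the sketch below approximates the posterior of the log odds ratio for the ACE inhibitor comparison as a normal distribution matched to the published median (0.77) and 95% credible interval (0.58–1.06). This is a back-of-the-envelope reconstruction, not the trial's bayesian cumulative logistic model.

```python
# Back-of-the-envelope sketch: posterior probability of harm from posterior
# draws of an odds ratio. Assumption not in the text: the posterior of the
# log odds ratio is approximately normal, matched to the reported median OR
# of 0.77 and 95% credible interval of 0.58-1.06 for the ACE inhibitor group.
import numpy as np

rng = np.random.default_rng(0)

median_or, lo, hi = 0.77, 0.58, 1.06
mu = np.log(median_or)
sigma = (np.log(hi) - np.log(lo)) / (2 * 1.96)   # normal approximation

draws = rng.normal(mu, sigma, size=1_000_000)    # posterior draws of log(OR)
p_harm = np.mean(np.exp(draws) < 1)              # OR < 1 means worse outcomes here

print(f"Posterior probability of harm ~ {p_harm:.1%}")  # ~95%, close to the reported 94.9%
```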
CONCLUSIONS AND RELEVANCE In this trial, among critically ill adults with COVID-19, initiation of an ACE inhibitor or ARB did not improve, and likely worsened, clinical outcomes.
TRIAL REGISTRATION ClinicalTrials.gov Identifier: NCT0273570
Prevalence, associated factors and outcomes of pressure injuries in adult intensive care unit patients: the DecubICUs study
Funders: European Society of Intensive Care Medicine (doi: http://dx.doi.org/10.13039/501100013347); Flemish Society for Critical Care Nurses. Abstract: Purpose: Intensive care unit (ICU) patients are particularly susceptible to developing pressure injuries. Epidemiologic data are, however, unavailable. We aimed to provide an international picture of the extent of pressure injuries and factors associated with ICU-acquired pressure injuries in adult ICU patients. Methods: International 1-day point-prevalence study; follow-up for outcome assessment until hospital discharge (maximum 12 weeks). Factors associated with ICU-acquired pressure injury and hospital mortality were assessed by generalised linear mixed-effects regression analysis. Results: Data from 13,254 patients in 1117 ICUs (90 countries) revealed 6747 pressure injuries; 3997 (59.2%) were ICU-acquired. Overall prevalence was 26.6% (95% confidence interval [CI] 25.9–27.3). ICU-acquired prevalence was 16.2% (95% CI 15.6–16.8). The sacrum (37%) and heels (19.5%) were most affected. Factors independently associated with ICU-acquired pressure injuries were older age, male sex, being underweight, emergency surgery, higher Simplified Acute Physiology Score II, a lower Braden score, an ICU stay of more than 3 days, comorbidities (chronic obstructive pulmonary disease, immunodeficiency), organ support (renal replacement, mechanical ventilation on ICU admission), and being in a low- or lower-middle-income economy. Gradually increasing associations with mortality were identified for increasing severity of pressure injury: stage I (odds ratio [OR] 1.5; 95% CI 1.2–1.8), stage II (OR 1.6; 95% CI 1.4–1.9), and stage III or worse (OR 2.8; 95% CI 2.3–3.3). Conclusion: Pressure injuries are common in adult ICU patients. ICU-acquired pressure injuries are associated with mainly intrinsic factors and mortality. Optimal care standards, increased awareness, appropriate resource allocation, and further research into optimal prevention are pivotal to tackle this important patient safety threat.
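To make the reported modelling concrete, the sketch below fits a patient-level logistic regression on synthetic data. The covariate names, effect sizes and data are all invented, and the random effects of the study's generalised linear mixed-effects analysis (for example, by ICU or country) are omitted for brevity; it only illustrates how odds ratios for factors associated with ICU-acquired pressure injury are obtained.

```python
# Illustrative-only sketch: plain logistic regression on synthetic data.
# The real analysis used generalised linear mixed-effects regression
# (random effects omitted here); covariates and effects are invented.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 5000
df = pd.DataFrame({
    "age": rng.normal(62, 15, n),
    "male": rng.integers(0, 2, n),
    "underweight": rng.integers(0, 2, n),
    "mech_vent": rng.integers(0, 2, n),
})

# Synthetic outcome: ICU-acquired pressure injury, with made-up true effects.
logit = (-4.0 + 0.03 * df["age"] + 0.3 * df["male"]
         + 0.4 * df["underweight"] + 0.6 * df["mech_vent"])
p = 1 / (1 + np.exp(-logit))
df["pressure_injury"] = rng.binomial(1, p.to_numpy())

model = smf.logit("pressure_injury ~ age + male + underweight + mech_vent",
                  data=df).fit(disp=False)
print(np.exp(model.params))      # odds ratios per covariate
print(np.exp(model.conf_int()))  # 95% confidence intervals for the odds ratios
```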
