217 research outputs found

    Validity of the Polar V800 heart rate monitor to measure RR intervals at rest

    Purpose: To assess the validity of RR intervals and short-term heart rate variability (HRV) data obtained from the Polar V800 heart rate monitor, in comparison to an electrocardiograph (ECG). Method: Twenty participants completed an active orthostatic test using the V800 and ECG. An improved method for the identification and correction of RR intervals was employed prior to HRV analysis. Agreement of the data was assessed using intra-class correlation coefficients (ICC), Bland–Altman limits of agreement (LoA), and effect size (ES). Results: A small number of errors were detected between the ECG and Polar RR signals, with a combined error rate of 0.086 %. The RR intervals from ECG and V800 were significantly different, but with small ES for both supine corrected and standing corrected data (ES < 0.001), and ICC was > 0.999 for both supine and standing corrected intervals. When analysed with the same HRV software, no significant differences were observed in any HRV parameters, for either supine or standing; the data displayed small bias and tight LoA, strong ICC (>0.99) and small ES (≤0.029). Conclusions: The V800 improves over previous Polar models, with narrower LoA, stronger ICC and smaller ES for both the RR intervals and HRV parameters. The findings support the validity of the Polar V800 and its ability to produce RR interval recordings consistent with an ECG. In addition, HRV parameters derived from these recordings are also highly comparable.
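The Bland–Altman agreement statistics above (mean bias and 95% limits of agreement) take only a few lines to compute; the paired RR intervals below are invented for illustration and are not the study's recordings.

```python
import statistics

def bland_altman_loa(a, b):
    """Return mean bias and Bland-Altman 95% limits of agreement for paired data."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired RR intervals (ms) from an ECG and a V800
ecg  = [812, 798, 845, 830, 860, 790, 825]
v800 = [813, 797, 846, 831, 859, 791, 826]

bias, (lower, upper) = bland_altman_loa(ecg, v800)
print(round(bias, 3), round(lower, 3), round(upper, 3))
```

A tight interval (here well under ±3 ms) is the pattern the study reports; the ICC and effect-size calculations are omitted for brevity.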

    The impact of anthelmintic drugs on weight gain of smallholder goats in subtropical regions

    Helminth infections are recognised as a major impediment to the productivity of goats in smallholder production systems. We used a multilevel framework to estimate the impact that administration of locally available anthelmintic drugs can have on the weight gains of goats in smallholder settings in India and Tanzania. We recruited 234 goats from 92 households in Odisha state, India, and 253 goats from 15 households in the Dodoma region of Tanzania. The goats were non-pregnant adult females, and from each household a minimum of two goats was recruited wherever possible. Each goat was randomly assigned to treatment with a locally available anthelmintic drug or to non-treatment. Each animal was tagged, weighed and had its body condition score (BCS) assessed. Animals were followed up after 28 and 56 days and re-weighed. To account for local variations in exposure to helminths and for variations between households and herds, the data were analysed in a multilevel mixed model with herd in village as nested random effects. Over the 56 days of the study, the non-treated goats gained a mean of 30.64 grams per day in India (a daily gain of 0.23% of baseline body weight) and 66.01 grams per day in Tanzania (0.33% of baseline body weight). From the mixed model, the treated goats in India gained a mean of 25.22 grams per day more than non-treated goats, a significantly greater weight gain (p<0.001). In Tanzania, treated goats gained a mean of 9.878 grams per day more than non-treated goats, also a significant difference (p=0.007). Furthermore, in both India and Tanzania, goats that were lighter at the baseline survey gained more weight. In both studies the BCS of the treated goats improved by a greater amount than that of the non-treated goats. In this study we have demonstrated that in certain settings the administration of anthelmintic drugs has a clear beneficial impact on goat weight. We speculate that the beneficial impacts vary with the timing of administration, the drugs used and the helminth species challenge in the specific setting.
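A mixed model with herd nested in village, as described above, can be sketched with statsmodels; the synthetic goats below, and the assumed 15 g/day treatment effect, are placeholders rather than the study's data.

```python
# Sketch of a multilevel mixed model with herd nested in village, using
# statsmodels. The villages, herds and the assumed 15 g/day treatment
# effect are synthetic placeholders, not the study's data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for village in range(6):
    village_effect = rng.normal(0, 5)
    for herd in range(4):
        herd_effect = rng.normal(0, 3)
        for _ in range(8):
            treated = int(rng.integers(0, 2))
            gain = 30 + village_effect + herd_effect + 15 * treated + rng.normal(0, 8)
            rows.append({"village": village,
                         "herd": f"{village}-{herd}",  # herd labels unique within village
                         "treated": treated,
                         "gain": gain})
df = pd.DataFrame(rows)

# Random intercept for village (grouping factor) plus a variance component
# for herd within village; fixed effect of treatment on daily weight gain.
model = smf.mixedlm("gain ~ treated", df, groups="village",
                    vc_formula={"herd": "0 + C(herd)"})
fit = model.fit()
print(fit.params["treated"])  # estimated extra daily gain (g) for treated goats
```

With these simulated effects the fitted treatment coefficient lands near the true 15 g/day, while the village and herd variance components absorb the clustering that would otherwise bias the standard errors.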

    Beyond Traditional Feedback Channels: Extracting Requirements-Relevant Feedback from TikTok and YouTube

    The increasing importance of videos as a medium for engagement, communication, and content creation makes them critical for organizations to consider as a source of user feedback. However, sifting through vast amounts of video content on social media platforms to extract requirements-relevant feedback is challenging. This study delves into the potential of TikTok and YouTube, two widely used social media platforms focused on video content, for identifying relevant user feedback that may be further refined into requirements through subsequent requirement generation steps. We evaluated the prospect of videos as a source of user feedback by analyzing audio and visual text, and metadata (i.e., description/title), from 6276 videos of 20 popular products across various industries. We employed state-of-the-art deep learning transformer-based models and classified 3097 videos containing requirements-relevant information. We then clustered the relevant videos and found multiple requirements-relevant feedback themes for each of the 20 products. This feedback can later be refined into requirements artifacts. We found that product ratings (feature, design, performance), bug reports, and usage tutorials are persistent themes in the videos. Video-based social media such as TikTok and YouTube can provide valuable user insights, making them a powerful and novel resource for companies to improve customer-centric development.
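The classification step above used transformer-based deep learning models; as a self-contained stand-in, the toy Naive Bayes classifier below illustrates only the idea of filtering video text for requirements-relevant feedback. All captions and labels are invented.

```python
# Toy bag-of-words Naive Bayes relevance filter. The study used transformer
# models; this is only a minimal illustration of the filtering step.
import math
from collections import Counter

def train(docs):
    """docs: list of (text, label). Returns word counts, totals, label counts."""
    counts, totals, labels = {}, Counter(), Counter()
    for text, label in docs:
        labels[label] += 1
        c = counts.setdefault(label, Counter())
        for w in text.lower().split():
            c[w] += 1
            totals[label] += 1
    return counts, totals, labels

def classify(text, counts, totals, labels):
    vocab = {w for c in counts.values() for w in c}
    best, best_lp = None, -math.inf
    for label in labels:
        lp = math.log(labels[label] / sum(labels.values()))
        for w in text.lower().split():
            # Laplace smoothing over the shared vocabulary
            lp += math.log((counts[label][w] + 1) / (totals[label] + len(vocab)))
        if lp > best_lp:
            best, best_lp = label, lp
    return best

train_docs = [
    ("the app crashes when i upload a video", "relevant"),
    ("please add a dark mode option", "relevant"),
    ("battery drains fast after the update", "relevant"),
    ("unboxing my new phone today", "irrelevant"),
    ("like and subscribe for more content", "irrelevant"),
    ("thanks for watching everyone", "irrelevant"),
]
model = train(train_docs)
print(classify("the app crashes after the update", *model))  # prints: relevant
```

Videos flagged relevant would then feed the clustering step that surfaces per-product feedback themes.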

    The SRC family kinase inhibitor NXP900 demonstrates potent anti-tumor activity in squamous cell carcinomas

    NXP900 is a selective and potent SRC family kinase (SFK) inhibitor, currently being dosed in a phase 1 clinical trial, that locks SRC in the “closed” conformation, thereby inhibiting both kinase-dependent catalytic activity and kinase-independent functions. In contrast, several multi-targeted kinase inhibitors that inhibit SRC, including dasatinib and bosutinib, bind their target in the active “open” conformation, allowing SRC and other SFKs to act as a scaffold to promote tumorigenesis through non-catalytic functions. NXP900 exhibits a unique target selectivity profile, with sub-nanomolar activity against SFK members relative to other kinases. This results in highly potent and specific SFK pathway inhibition. Here, we demonstrate that esophageal squamous cell carcinomas (ESCC) and head and neck squamous cell carcinomas (HNSCC) are exquisitely sensitive to NXP900 treatment in cell culture and in vivo, and we identify a patient population that could benefit from treatment with NXP900.

    Global, regional, and national comparative risk assessment of 84 behavioural, environmental and occupational, and metabolic risks or clusters of risks for 195 countries and territories, 1990–2017: a systematic analysis for the Global Burden of Disease Study 2017

    Background: The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) 2017 comparative risk assessment (CRA) is a comprehensive approach to risk factor quantification that offers a useful tool for synthesising evidence on risks and risk outcome associations. With each annual GBD study, we update the GBD CRA to incorporate improved methods, new risks and risk outcome pairs, and new data on risk exposure levels and risk outcome associations. Methods: We used the CRA framework developed for previous iterations of GBD to estimate levels and trends in exposure, attributable deaths, and attributable disability-adjusted life-years (DALYs), by age group, sex, year, and location for 84 behavioural, environmental and occupational, and metabolic risks or groups of risks from 1990 to 2017. This study included 476 risk outcome pairs that met the GBD study criteria for convincing or probable evidence of causation. We extracted relative risk and exposure estimates from 46 749 randomised controlled trials, cohort studies, household surveys, census data, satellite data, and other sources. We used statistical models to pool data, adjust for bias, and incorporate covariates. Using the counterfactual scenario of theoretical minimum risk exposure level (TMREL), we estimated the portion of deaths and DALYs that could be attributed to a given risk. We explored the relationship between development and risk exposure by modelling the relationship between the Socio-demographic Index (SDI) and risk-weighted exposure prevalence and estimated expected levels of exposure and risk-attributable burden by SDI. 
Finally, we explored temporal changes in risk-attributable DALYs by decomposing those changes into six main component drivers of change, as follows: (1) population growth; (2) changes in population age structures; (3) changes in exposure to environmental and occupational risks; (4) changes in exposure to behavioural risks; (5) changes in exposure to metabolic risks; and (6) changes due to all other factors, approximated as the risk-deleted death and DALY rates, where the risk-deleted rate is the rate that would be observed had we reduced the exposure levels to the TMREL for all risk factors included in GBD 2017. Findings: In 2017, 34.1 million (95% uncertainty interval [UI] 33.3–35.0) deaths and 1.21 billion (1.14–1.28) DALYs were attributable to GBD risk factors. Globally, 61.0% (59.6–62.4) of deaths and 48.3% (46.3–50.2) of DALYs were attributed to the GBD 2017 risk factors. When ranked by risk-attributable DALYs, high systolic blood pressure (SBP) was the leading risk factor, accounting for 10.4 million (9.39–11.5) deaths and 218 million (198–237) DALYs, followed by smoking (7.10 million [6.83–7.37] deaths and 182 million [173–193] DALYs), high fasting plasma glucose (6.53 million [5.23–8.23] deaths and 171 million [144–201] DALYs), high body-mass index (BMI; 4.72 million [2.99–6.70] deaths and 148 million [98.6–202] DALYs), and short gestation for birthweight (1.43 million [1.36–1.51] deaths and 139 million [131–147] DALYs). In total, risk-attributable DALYs declined by 4.9% (3.3–6.5) between 2007 and 2017. In the absence of demographic changes (ie, population growth and ageing), changes in risk exposure and risk-deleted DALYs would have led to a 23.5% decline in DALYs during that period. Conversely, in the absence of changes in risk exposure and risk-deleted DALYs, demographic changes would have led to an 18.6% increase in DALYs during that period.
The ratios of observed risk exposure levels to exposure levels expected based on SDI (O/E ratios) increased globally for unsafe drinking water and household air pollution between 1990 and 2017. This result suggests that development is occurring more rapidly than are changes in the underlying risk structure in a population. Conversely, nearly universal declines in O/E ratios for smoking and alcohol use indicate that, for a given SDI, exposure to these risks is declining. In 2017, the leading Level 4 risk factor for age-standardised DALY rates was high SBP in four super-regions: central Europe, eastern Europe, and central Asia; north Africa and Middle East; south Asia; and southeast Asia, east Asia, and Oceania. The leading risk factor in the high-income super-region was smoking, in Latin America and Caribbean was high BMI, and in sub-Saharan Africa was unsafe sex. O/E ratios for unsafe sex in sub-Saharan Africa were notably high, and those for alcohol use in north Africa and the Middle East were notably low. Interpretation: By quantifying levels and trends in exposures to risk factors and the resulting disease burden, this assessment offers insight into where past policy and programme efforts might have been successful and highlights current priorities for public health action. Decreases in behavioural, environmental, and occupational risks have largely offset the effects of population growth and ageing, in relation to trends in absolute burden. Conversely, the combination of increasing metabolic risks and population ageing will probably continue to drive the increasing trends in non-communicable diseases at the global level, which presents both a public health challenge and opportunity. We see considerable spatiotemporal heterogeneity in levels of risk exposure and risk-attributable burden. 
Although levels of development underlie some of this heterogeneity, O/E ratios show risks for which countries are overperforming or underperforming relative to their level of development. As such, these ratios provide a benchmarking tool to help to focus local decision making. Our findings reinforce the importance of both risk exposure monitoring and epidemiological research to assess causal connections between risks and health outcomes, and they highlight the usefulness of the GBD study in synthesising data to draw comprehensive and robust conclusions that help to inform good policy and strategic health planning.
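The decomposition described in the Findings swaps one driver at a time between two time points, attributing each step's change in attributable DALYs to the swapped factor. A toy version with made-up numbers and only four factors (population size, age structure, attributable fraction, risk-deleted rate):

```python
# Toy factor-substitution decomposition of the change in attributable DALYs,
# in the spirit of the six-driver decomposition above. All numbers invented.
def dalys(pop, shares, pafs, rates):
    # attributable DALYs = population x sum over age groups of
    # (age share x population attributable fraction x risk-deleted rate)
    return pop * sum(s * f * r for s, f, r in zip(shares, pafs, rates))

year1 = dict(pop=1000.0, shares=[0.6, 0.4], pafs=[0.30, 0.50], rates=[0.020, 0.080])
year2 = dict(pop=1300.0, shares=[0.5, 0.5], pafs=[0.25, 0.45], rates=[0.018, 0.070])

# Swap one factor at a time from year1 to year2; each step's change is that
# factor's contribution (order-dependent, as in any sequential decomposition).
contributions = {}
current = dict(year1)
for key in ["pop", "shares", "pafs", "rates"]:
    before = dalys(**current)
    current[key] = year2[key]
    contributions[key] = dalys(**current) - before

total_change = dalys(**year2) - dalys(**year1)
assert abs(sum(contributions.values()) - total_change) < 1e-9
print(contributions)
```

The contributions telescope exactly to the total change, which is what lets the abstract report offsetting effects (population growth pushing DALYs up, falling exposure pulling them down) within one consistent total.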

    Mapping the Birch and Grass Pollen Seasons in the UK Using Satellite Sensor Time-series

    Grass and birch pollen are two major causes of seasonal allergic rhinitis (hay fever) in the UK and parts of Europe, affecting around 15–20% of the population. Current prediction of these allergens in the UK is based on (i) measurements of pollen concentrations at a limited number of monitoring stations across the country and (ii) general information about the phenological status of the vegetation. Thus, the current prediction methodology provides information at a coarse spatial resolution only. Most station-based approaches take into account only local observations of flowering, while only a small number of approaches take into account remote observations of land surface phenology. The systematic gathering of detailed information about vegetation status nationwide would therefore be of great potential utility. In particular, there exists an opportunity to use remote sensing to estimate phenological variables that are related to the flowering phenophase and, thus, pollen release. In turn, these estimates can be used to predict pollen release at a fine spatial resolution. In this study, time series of MERIS Terrestrial Chlorophyll Index (MTCI) data were used to predict two key phenological variables: the start of season and the peak of season. A technique was then developed to estimate the flowering phenophase of birch and grass from the MTCI time series. For birch, the timing of flowering was defined as the time after the start of the growing season when the MTCI value reached 25% of the maximum. Similarly, for grass this was defined as the time when the MTCI value reached 75% of the maximum. The predicted pollen release dates were validated with data from nine pollen monitoring stations in the UK. For both birch and grass, we obtained large positive correlations between the MTCI-derived start of the pollen season and the start of the pollen season defined using station data, with a slightly larger correlation observed for birch than for grass. The technique was applied to produce detailed maps of the flowering of birch and grass across the UK for each of the years from 2003 to 2010. The results demonstrate that the remote sensing-based maps of the onset of flowering of birch and grass for the UK, together with the pollen forecasts from the Met Office and the National Pollen and Aerobiology Research Unit (NPARU), can potentially provide more accurate information to pollen allergy sufferers in the UK.
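The flowering rule described above (the first date after the start of season when the index reaches 25% of its seasonal maximum for birch, or 75% for grass) can be sketched as follows; the time series is synthetic and uses arbitrary index units rather than real MTCI values.

```python
# Threshold rule for dating flowering from a vegetation-index time series.
# The triangular "green-up" curve below is synthetic, in arbitrary units.
def flowering_date(days, index, start_of_season, fraction):
    """First day >= start_of_season where the index reaches fraction * max."""
    peak = max(index)
    for d, v in zip(days, index):
        if d >= start_of_season and v >= fraction * peak:
            return d
    return None

days = list(range(0, 365, 8))                      # 8-day composite time steps
index = [max(0, 100 - abs(d - 180)) for d in days]  # synthetic seasonal curve

birch_day = flowering_date(days, index, start_of_season=80, fraction=0.25)
grass_day = flowering_date(days, index, start_of_season=80, fraction=0.75)
print(birch_day, grass_day)  # prints: 104 152
```

As in the study, the lower birch threshold yields an earlier predicted flowering date than the higher grass threshold on the same green-up curve.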

    Antiinflammatory Therapy with Canakinumab for Atherosclerotic Disease

    Background: Experimental and clinical data suggest that reducing inflammation without affecting lipid levels may reduce the risk of cardiovascular disease. Yet the inflammatory hypothesis of atherothrombosis has remained unproved. Methods: We conducted a randomized, double-blind trial of canakinumab, a therapeutic monoclonal antibody targeting interleukin-1β, involving 10,061 patients with previous myocardial infarction and a high-sensitivity C-reactive protein level of 2 mg or more per liter. The trial compared three doses of canakinumab (50 mg, 150 mg, and 300 mg, administered subcutaneously every 3 months) with placebo. The primary efficacy end point was nonfatal myocardial infarction, nonfatal stroke, or cardiovascular death. Results: At 48 months, the median reduction from baseline in the high-sensitivity C-reactive protein level was 26 percentage points greater in the group that received the 50-mg dose of canakinumab, 37 percentage points greater in the 150-mg group, and 41 percentage points greater in the 300-mg group than in the placebo group. Canakinumab did not reduce lipid levels from baseline. At a median follow-up of 3.7 years, the incidence rate for the primary end point was 4.50 events per 100 person-years in the placebo group, 4.11 events per 100 person-years in the 50-mg group, 3.86 events per 100 person-years in the 150-mg group, and 3.90 events per 100 person-years in the 300-mg group. The hazard ratios as compared with placebo were as follows: in the 50-mg group, 0.93 (95% confidence interval [CI], 0.80 to 1.07; P = 0.30); in the 150-mg group, 0.85 (95% CI, 0.74 to 0.98; P = 0.021); and in the 300-mg group, 0.86 (95% CI, 0.75 to 0.99; P = 0.031). The 150-mg dose, but not the other doses, met the prespecified multiplicity-adjusted threshold for statistical significance for the primary end point and for the secondary end point that additionally included hospitalization for unstable angina that led to urgent revascularization (hazard ratio vs. placebo, 0.83; 95% CI, 0.73 to 0.95; P = 0.005). Canakinumab was associated with a higher incidence of fatal infection than was placebo. There was no significant difference in all-cause mortality (hazard ratio for all canakinumab doses vs. placebo, 0.94; 95% CI, 0.83 to 1.06; P = 0.31). Conclusions: Antiinflammatory therapy targeting the interleukin-1β innate immunity pathway with canakinumab at a dose of 150 mg every 3 months led to a significantly lower rate of recurrent cardiovascular events than placebo, independent of lipid-level lowering. (Funded by Novartis; CANTOS ClinicalTrials.gov number, NCT01327846.)
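The per-100-person-year incidence rates quoted above are simple arithmetic; the event counts and follow-up totals below are invented (chosen only to land near the reported placebo and 150-mg rates), since the trial's patient-level data are not in the abstract.

```python
# Person-year incidence rates and a crude rate ratio. Event counts and
# person-years are hypothetical, not the trial's actual data.
def incidence_rate(events, person_years, per=100):
    """Events per `per` person-years of follow-up."""
    return per * events / person_years

placebo_rate = incidence_rate(events=610, person_years=13556)  # ~4.50 / 100 py
treated_rate = incidence_rate(events=520, person_years=13472)  # ~3.86 / 100 py

# Crude rate ratio; the trial's hazard ratios additionally come from a
# time-to-event (Cox) model, which this simple ratio does not replicate.
rate_ratio = treated_rate / placebo_rate
print(round(placebo_rate, 2), round(treated_rate, 2), round(rate_ratio, 2))
```

The crude ratio (~0.86 here) is close to the reported 150-mg hazard ratio only because these invented counts were picked to match; in the trial the hazard ratio comes from proportional-hazards modelling with censoring.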

    Prepared to react? Assessing the functional capacity of the primary health care system in rural Orissa, India to respond to the devastating flood of September 2008

    Background: Early detection of an impending flood and the availability of countermeasures to deal with it can significantly reduce its health impacts. In developing countries like India, public primary health care facilities are frontline organizations for dealing with disasters, particularly in rural settings. Evaluating preparedness capacities within existing systems is therefore necessary for developing robust response systems. Objective: The objective of the study is to assess the functional capacity of the primary health care system in Jagatsinghpur district of rural Orissa, India, to respond to the devastating flood of September 2008. Methods: An onsite survey was conducted in all 29 primary and secondary facilities in five rural blocks (administrative units) of Jagatsinghpur district in Orissa state. A pre-tested structured questionnaire was administered face to face in the facilities. The data were entered, processed and analyzed using STATA® 10. Results: Data from our primary survey clearly show that the healthcare facilities are ill prepared to handle floods despite facing them annually. Basic utilities like electricity backup and essential medical supplies are lacking during floods. A lack of human resources, together with missing standard operating procedures, pre-identified communication and incident command systems, and effective leadership, as well as weak financial structures, are the main factors hindering an adequate response to the floods. Conclusion: The 2008 flood challenged the primary curative and preventive health care services in Jagatsinghpur. Simple steps like developing facility-specific preparedness plans that detail standard operating procedures during floods and identify clear lines of command will go a long way in strengthening the response to future floods. Performance critiques provided by grassroots workers, like this one, should be used for institutional learning and effective preparedness planning. Additionally, each facility should maintain contingency funds for emergency response, along with local vendor agreements to ensure stock supplies during floods. The facilities should ensure that baseline public health standards for health care delivery identified by the Government are met in non-flood periods in order to improve the response during floods. Building strong public primary health care systems is a development challenge. The recovery phases of disasters should be seen as an opportunity to expand and improve services and facilities.

    Wideband radiofrequency pulse sequence for evaluation of myocardial scar in patients with cardiac implantable devices

    Background: Cardiac magnetic resonance (CMR) is a useful clinical tool to identify late gadolinium enhancement (LGE) in heart failure patients with cardiac implantable electronic devices (CIED). Identification of LGE in patients with CIED is limited by artifact, which can be improved with a wideband radiofrequency pulse sequence. Objective: The authors hypothesize that the quality of LGE images produced using a wideband pulse sequence in patients with devices is comparable to that produced using standard LGE sequences in patients without devices. Methods: Two independent readers reviewed LGE images of 16 patients with CIED and 7 patients without intracardiac devices to assess image quality, device-related artifact, and presence of LGE, using the American Society of Echocardiography/American Heart Association 17-segment model of the heart on a 4-point Likert scale. The mean and standard deviation of the image quality and artifact ratings were determined. Inter-observer reliability was determined by calculating Cohen's kappa coefficient. Statistical significance was determined by t-test at p ≤ 0.05 with a 95% confidence interval. Results: All patients underwent CMR without any adverse events. Overall image quality of wideband LGE images in patients with devices was significantly better than that of standard LGE images in patients without devices (p = 0.001), with a reduction in overall artifact rating (p = 0.05). Conclusion: Our study suggests that a wideband pulse sequence for LGE can be applied safely to heart failure patients with devices for detection of LV myocardial scar while maintaining image quality, reducing artifact, and following a routine imaging protocol after intravenous gadolinium contrast administration.
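Cohen's kappa, used above for inter-observer reliability, corrects raw agreement for the agreement expected by chance; the two readers' 4-point Likert ratings below are invented for illustration.

```python
# Cohen's kappa for two raters. The Likert ratings are invented examples,
# not the study's reader scores.
from collections import Counter

def cohens_kappa(r1, r2):
    """kappa = (observed agreement - chance agreement) / (1 - chance agreement)."""
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n
    c1, c2 = Counter(r1), Counter(r2)
    # Chance agreement from the product of each rater's marginal frequencies
    expected = sum(c1[k] * c2[k] for k in c1) / (n * n)
    return (observed - expected) / (1 - expected)

reader1 = [4, 3, 4, 2, 3, 4, 1, 2, 3, 4]
reader2 = [4, 3, 3, 2, 3, 4, 1, 2, 4, 4]
kappa = cohens_kappa(reader1, reader2)
print(round(kappa, 3))  # prints: 0.714
```

Here raw agreement is 0.80 but chance agreement is 0.30, giving kappa ≈ 0.71, which is why kappa is preferred over simple percent agreement for ordinal ratings.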

    Differential Modulation of Angiogenesis by Erythropoiesis-Stimulating Agents in a Mouse Model of Ischaemic Retinopathy

    BACKGROUND: Erythropoiesis-stimulating agents (ESAs) are widely used to treat anaemia, but concerns exist about their potential to promote pathological angiogenesis in some clinical scenarios. In the current study we assessed the angiogenic potential of three ESAs, epoetin delta, darbepoetin alfa and epoetin beta, using in vitro and in vivo models. METHODOLOGY/PRINCIPAL FINDINGS: The epoetins induced angiogenesis in human microvascular endothelial cells at high doses, although darbepoetin alfa was pro-angiogenic at low doses (1-20 IU/ml). ESA-induced angiogenesis was VEGF-mediated. In a mouse model of ischaemia-induced retinopathy, all ESAs induced generation of reticulocytes, but only epoetin beta exacerbated pathological (pre-retinal) neovascularisation in comparison to controls (p<0.05). Only epoetin delta induced a significant revascularisation response, which enhanced normality of the vasculature (p<0.05). This was associated with mobilisation of haematopoietic stem cells and their localisation to the retinal vasculature. Darbepoetin alfa also increased the number of active microglia in the ischaemic retina relative to the other ESAs (p<0.05). Darbepoetin alfa induced retinal TNF-α and VEGF mRNA expression up to 4-fold higher than epoetin delta (p<0.001). CONCLUSIONS: This study has implications for the treatment of patients, as there are clear differences in the angiogenic potential of the different ESAs.