
    Changes in body composition and average daily energy expenditure of men and women during arduous extended polar travel

    Weight and skin-fold measurements were made at five-day intervals during a 47-day expedition by six men and three women from the edge of the sea ice to the South Pole. From these, together with detailed manual records of the nutrition for individual participants, the average daily energy expenditure was determined before and after a resupply at approximately the mid-point of the expedition. Body weight fell for all participants during the expedition, with the overall loss being much smaller for the three female participants (-4.0, -4.0 and -4.4 kg) than for the male participants (mean±sd: -8.6±2.0 kg). Fat weight fell approximately linearly during the expedition, with total losses of -4.1, -6.5 and -2.5 kg for the three female participants and -6.8±1.7 kg for the male participants. Individual fat-free weight changed by a smaller amount overall: 0.13, 2.5 and -1.8 kg for the three female participants and -1.8±2.0 kg for the male participants, who, with one exception, lost fat-free tissue. All participants showed substantial variation in fat-free tissue weight during the expedition. Analysis of the daily energy expenditure showed that nutrition was adequate, although intake fell during the second part of the expedition; the reasons for this are unclear, but adaptation to the cold, altitude and workload are possible explanations. The validity of this time-averaged measurement for individual participants was determined by analysing moments about the mean of time-series actigraphy data from wrist-worn devices. The mean and autocorrelation function of the actigraphy data across subjects were analysed to determine whether measures could be compared between participants.
The first, second and third moments about the mean of the day-to-day activity were found to be time-invariant for individual subjects (χ2, p>0.05), and the normalized mean and autocorrelation measured over a day for each participant were indistinguishable from the mean of the group (χ2, p>0.05), allowing both longitudinal and cross-sectional analysis.
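The moment and autocorrelation checks described in this abstract can be sketched in a few lines; this is a minimal illustration under my own naming, not the study's analysis code:

```python
def moments_about_mean(xs, orders=(1, 2, 3)):
    """Central moments of an activity series (the first is always ~0)."""
    mu = sum(xs) / len(xs)
    return {k: sum((x - mu) ** k for x in xs) / len(xs) for k in orders}

def autocorrelation(xs, max_lag):
    """Normalized autocorrelation function; acf[0] == 1 by construction."""
    mu = sum(xs) / len(xs)
    d = [x - mu for x in xs]
    denom = sum(v * v for v in d)
    n = len(d)
    return [sum(d[i] * d[i + k] for i in range(n - k)) / denom
            for k in range(max_lag + 1)]
```

Time-invariance of the moments can then be checked by computing them over successive windows of each participant's record and testing the window-to-window differences, as the abstract describes.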

    Preserved circadian variation in cortisol and androgens during a ski traverse of Antarctica in summer

    Antarctic expeditions present extreme physiological challenges due to cold temperatures, high physical exertion, and 24-hour daylight. This observational study evaluated endocrine adaptation in nine participants (six men, three women) during a 47-day, 1,000 km unassisted ski traverse. Detailed salivary sampling was conducted before, during and after the expedition, corroborated by blood and hair sampling before and after the expedition. Cortisol, testosterone, and androstenedione were measured using mass spectrometry, and thyroid hormones via immunoassay. Diurnal cortisol, androstenedione and testosterone variation was preserved, while the morning cortisol increased during the expedition, suggesting that exercise demands overshadow the effects of continuous daylight in controlling hypothalamic-pituitary-adrenal and gonadal axis function. Morning testosterone decreased during the expedition, with a greater effect seen among men. No significant changes were seen in blood or hair steroid hormones. Gonadotropins in women indicated central suppression pre-expedition, normalizing post-expedition. Thyroid-stimulating hormone levels increased post-expedition without significant changes in free T3 or T4, consistent with mild polar T3 syndrome. These findings highlight the adaptability of hypothalamic-pituitary function to combined stressors of exercise, energy deficit, and cold. This is the first study to capture in situ endocrine responses during an Antarctic traverse, advancing our understanding of human adaptation in extreme environments.

    Mortality on Mount Everest, 1921-2006: descriptive study

    Objective To examine patterns of mortality among climbers on Mount Everest over an 86-year period.

    Healthcare Staff Perceptions and Misconceptions regarding Antibody Testing in the United Kingdom: Implications for the next steps for antibody screening

    Background Healthcare workers have been at increased risk of exposure, infection and serious complications from COVID-19. Antibody testing has been used to identify staff members previously infected by SARS-CoV-2 and has been rolled out rapidly in the United Kingdom, and a number of comment and editorial articles have been published raising concerns about antibody testing in this context. We present the perceptions of NHS healthcare workers in relation to SARS-CoV-2 antibody testing. Methods An electronic survey regarding perceptions of SARS-CoV-2 antibody testing was distributed to all healthcare workers at a major NHS tertiary hospital following implementation of antibody testing. Results In total, 560 healthcare workers completed the survey (80% female; 25% of BAME background; 58% frontline clinical staff). Exploring whether they had previously had COVID-19 was the primary reported reason for choosing to undergo antibody testing (85.2%). In the case of a positive antibody test, 72% reported that they would feel relieved, whilst 48% felt that they would be happier to work in a patient-facing area. Moreover, 12% responded that a positive test would mean “social distancing is less important”, with 34% of responders indicating that in this case they would be both less likely to catch COVID-19 and happier to visit friends/relatives. Conclusions NHS staff members primarily seek out SARS-CoV-2 antibody testing for an appropriate reason. Based on our findings, and given the lack of definitive data regarding the extent of immune protection conferred by a positive SARS-CoV-2 antibody test, significant concerns may be raised regarding healthcare workers' reported interpretation of positive antibody test results. This needs to be further explored and addressed to protect NHS staff and patients.

    Hypoxia is not the primary mechanism contributing to exercise-induced proteinuria

    Introduction Proteinuria increases at altitude and with exercise, potentially as a result of hypoxia. Using urinary alpha-1 acid glycoprotein (α1-AGP) levels as a sensitive marker of proteinuria, we examined the impact of relative hypoxia due to high altitude and blood pressure-lowering medication on post-exercise proteinuria. Methods Twenty individuals were pair-matched for sex, age and ACE genotype. They completed maximal exercise tests once at sea level and twice at altitude (5035 m). Losartan (100 mg/day; angiotensin-receptor blocker) and placebo were randomly assigned within each pair 21 days before ascent. The first altitude exercise test was completed within 24–48 hours of arrival (each pair within ~1 hour). Acetazolamide (125 mg two times per day) was administered immediately after this test for 48 hours until the second altitude exercise test. Results With placebo, post-exercise α1-AGP levels were similar at sea level and altitude. The odds ratio (OR) for increased resting α1-AGP at altitude versus sea level was 2.16 times greater without losartan. At altitude, the OR for reduced post-exercise α1-AGP (58% lower) was 2.25 times greater with losartan than placebo (p=0.059) despite similar pulse oximetry (SpO2) readings (p=0.95) between groups. Acetazolamide reduced post-exercise proteinuria by approximately threefold (9.3±9.7 vs 3.6±6.0 μg/min; p=0.025), although changes were not correlated (r=−0.10) with the significant improvements in SpO2 (69.1%±4.5% vs 75.8%±3.8%; p=0.001). Discussion Profound systemic hypoxia imposed by altitude does not result in greater post-exercise proteinuria than sea level. Losartan and acetazolamide may attenuate post-exercise proteinuria; however, further research is warranted.
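For readers unfamiliar with the odds ratios quoted in this abstract, the underlying calculation is the standard cross-product of a 2×2 table. The counts below are invented purely for illustration and are not the study's data:

```python
def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table:
        exposed:   a with outcome, b without
        unexposed: c with outcome, d without
    """
    return (a / b) / (c / d)

# Hypothetical counts: 6 of 10 treated vs 3 of 10 untreated
# participants showing the outcome of interest.
print(odds_ratio(6, 4, 3, 7))  # (6/4) / (3/7) = 3.5
```

An OR above 1 means the outcome is more likely in the exposed group; the study's ORs were estimated from paired comparisons rather than a raw table like this.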

    C3d positive donor-specific antibodies have a role in pre-transplant risk stratification of crossmatch positive HLA-incompatible renal transplantation: United Kingdom multicentre study

    Anti-HLA antibody characteristics help to risk-stratify patients and improve long-term renal graft outcomes. Complement activation by donor-specific antibody (DSA) is an important characteristic that may determine renal allograft outcome. There is heterogeneity in graft outcomes within moderate-to-high immunological risk (crossmatch-positive) cases. We explored the role of C3d-positive DSAs in sub-stratifying crossmatch-positive cases and related this to graft outcomes. We investigated 139 crossmatch-positive living-donor renal transplant recipients from four transplant centres in the United Kingdom. The C3d assay was performed on serum samples obtained at pretreatment (predesensitization) and Day 14 post-transplant. C3d-positive DSAs were found in 52 (37%) patients at pretreatment and in 37 (27%) patients at Day 14 post-transplant. Median follow-up was 48 months (IQR 20.47–77.57). In the multivariable analysis, pretreatment C3d-positive DSA was independently associated with reduced overall graft survival, with a hazard ratio of 3.29 (95% CI 1.37–7.86). The relative risk of death-censored five-year graft failure was 2.83 (95% CI 1.56–5.13). Patients with both pretreatment and Day 14 C3d-positive DSAs had the worst five-year graft survival at 45.5%, compared with 87.2% in patients with both pretreatment and Day 14 C3d-negative DSAs, with a relative risk of death-censored five-year graft failure of 4.26 (95% CI 1.79–10.09). In this multicentre study, we have demonstrated for the first time the utility of C3d analysis as a distinctive biomarker to sub-stratify the risk of poor graft outcome in crossmatch-positive living-donor renal transplantation.

    HLA antibody incompatible renal transplantation: long-term outcomes similar to deceased donor transplantation

    Background. HLA incompatible renal transplantation remains one of the best therapeutic options for a subgroup of patients who are highly sensitized and difficult to match, but little is known about its long-term graft and patient survival. Methods. One hundred thirty-four HLA incompatible renal transplantation patients from 2003 to 2018, with a median follow-up of 6.93 y, were analyzed retrospectively to estimate patient and graft survival. Outcomes were compared between groups defined by baseline crossmatch status and the type and timing of rejection episodes. Results. The overall patient survival was 95%, 90%, and 81%, and graft survival was 95%, 85%, and 70% at 1, 5, and 10 y, respectively. This was similar to the first-time deceased donor transplant cohort. Graft survival for the pretreatment complement-dependent cytotoxicity crossmatch (CDC)-positive group was significantly lower at 83%, 64%, and 40% at 1, 5, and 10 y, respectively, compared with the other groups (Bead/CDC, P = 0.007; CDC/Flow, P = 0.001; and microbead assay/flow cytometry crossmatch, P = 0.837), although those with a low CDC titer (<1 in 2) had outcomes comparable to the CDC-negative group. Female patients in general fared worse in both patient and graft survival outcomes in each of the 3 groups based on pretreatment crossmatch, although this did not reach statistical significance. Antibody-mediated rejection was the most frequent type of rejection, with a significant decline in graft survival by 10 y when compared with no rejection (P < 0.001). Rejection that occurred or continued to occur after the first 2 wk of transplantation caused a significant reduction in graft survival (P < 0.001), whereas good outcomes were seen in those with a single early rejection episode. Conclusions. One-, 5-, and 10-y HLA incompatible graft and patient survival is comparable to deceased donor transplantation and can be further improved by excluding high-CDC-titer cases. Antibody-positive female patients show worse long-term survival. Resolution of early rejection is associated with good long-term graft survival.
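Survival percentages at fixed horizons like those quoted here are conventionally Kaplan-Meier estimates, which account for patients censored before the horizon. A generic sketch of the estimator (not the study's code; the input data in the usage below are invented):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve.
    times: follow-up duration per patient (e.g. years)
    events: 1 = graft failure observed, 0 = censored
    Returns a list of (time, S(t)) pairs at each event time.
    """
    # Sort by time; at tied times, process events before censorings.
    pairs = sorted(zip(times, events), key=lambda te: (te[0], -te[1]))
    n_at_risk = len(pairs)
    s = 1.0
    curve = []
    for t, e in pairs:
        if e:  # an observed failure reduces the survival estimate
            s *= 1.0 - 1.0 / n_at_risk
            curve.append((t, s))
        n_at_risk -= 1  # failures and censorings both leave the risk set
    return curve
```

Group differences such as the CDC-positive versus CDC-negative comparison above would then be tested with a log-rank test on the two curves.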

    A vein bypass first versus a best endovascular treatment first revascularization strategy for patients with chronic limb-threatening ischaemia who require an infra-popliteal, with or without an additional more proximal infra-inguinal, revascularization procedure to restore limb perfusion: the BASIL-2 within-trial health economic analysis

    Background Chronic limb-threatening ischaemia (CLTI) places a considerable socioeconomic burden on health and social care systems worldwide. The objective of this health economic analysis was to investigate the cost-effectiveness (CEA) and cost-utility (CUA) of a vein bypass (VB) first versus a best endovascular treatment (BET) first revascularization strategy in patients with CLTI who require an infra-popliteal revascularization procedure to restore limb perfusion. Methods CEA and CUA were conducted from the perspective of the UK National Health Service. Patient-level resource use and health outcome data collected from the BASIL-2 trial over 2–7 years of follow-up were used to estimate incremental cost-effectiveness ratios expressed as cost per amputation-free life year (AFLY) and cost per quality-adjusted life year (QALY). The EQ-5D-5L was used to generate participant QALYs at 2 and 3 years. Results At two years, the mean (s.d.) discounted hospital cost was £15 742.59 (16 182.60) and £13 273.66 (15 446.92) in the VB-first and BET-first revascularization strategy groups, respectively. The lower costs (−£2524.23, 95% c.i. −£5844.93 to £1131.52) in the BET-first group were mainly due to a reduced number of days in hospital and lower procedural costs. BET-first was also more effective, yielding additional AFLYs (0.429, 95% c.i. 0.03 to 0.88) at 7 years and additional discounted QALYs (0.016, 95% c.i. −0.08 to 0.12) at 2 years. Conclusion A best endovascular treatment first revascularization strategy dominated a vein bypass first strategy in the cost-effectiveness and cost-utility analyses. The findings were robust across different scenarios and prespecified subgroups.
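The "dominated" conclusion follows from the standard incremental cost-effectiveness ratio (ICER). A minimal sketch, using the incremental values quoted in this abstract (everything else here is generic, not the trial's analysis code):

```python
def icer(delta_cost, delta_effect):
    """Incremental cost-effectiveness ratio: extra cost per extra unit
    of health effect (e.g. per QALY or per amputation-free life year)."""
    return delta_cost / delta_effect

# BET-first vs VB-first at two years: BET-first cost £2524.23 less and
# gained 0.016 discounted QALYs, so the ratio is negative -- BET-first
# is cheaper AND more effective, i.e. it dominates, and no willingness-
# to-pay threshold comparison is needed.
print(icer(-2524.23, 0.016))
```

When the signs of the cost and effect differences agree (one strategy costs more but gains more), the ICER is positive and is compared against a willingness-to-pay threshold instead.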