Application of the speed-duration relationship to normalize the intensity of high-intensity interval training
The tolerable duration of continuous high-intensity exercise is determined by the hyperbolic speed-tolerable duration (S-tLIM) relationship. However, application of the S-tLIM relationship to normalize the intensity of high-intensity interval training (HIIT) has yet to be considered, and this was the aim of the present study. Subjects completed a ramp-incremental test and a series of 4 constant-speed tests to determine the S-tLIM relationship. A sub-group of subjects (n = 8) then repeated 4 min bouts of exercise at the speeds predicted to induce intolerance at 4 min (WR4), 6 min (WR6) and 8 min (WR8), interspersed with 4 min recovery bouts, to the point of exercise intolerance (fixed WR HIIT) on different days, with the aim of establishing the work rate that could be sustained for 960 s (i.e. 4×4 min). A sub-group of subjects (n = 6) also completed 4 bouts of exercise interspersed with 4 min recovery, with each bout continued to the point of exercise intolerance (maximal HIIT), to determine the protocol that maximizes the amount of high-intensity work completed during 4×4 min HIIT. For fixed WR HIIT, the tLIM of the HIIT sessions was 399±81 s for WR4, 892±181 s for WR6 and 1517±346 s for WR8, with total exercise durations all significantly different from each other (P<0.050). For maximal HIIT, there was no difference in the tLIM of the 4 bouts (bout 1: 229±27 s; bout 2: 262±37 s; bout 3: 235±49 s; bout 4: 235±53 s; P>0.050). However, significantly less high-intensity work was completed during bouts 2 (153.5±40.9 m), 3 (136.9±38.9 m) and 4 (136.7±39.3 m) than during bout 1 (264.9±58.7 m; P<0.050). These data establish that WR6 provides the appropriate work rate to normalize the intensity of HIIT between subjects. Maximal HIIT provides a protocol that allows the relative contribution of the work rate profile to physiological adaptations to be considered during alternative intensity-matched HIIT protocols.
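The speeds used for WR4, WR6 and WR8 follow directly from the hyperbolic S-tLIM relationship. As a minimal illustrative sketch (not the authors' analysis code, and with made-up test values), the two parameters of the relationship, critical speed (CS) and the curvature constant D', can be estimated from the four constant-speed tests via the linear distance-time form d = CS·t + D', after which the speed predicted to cause intolerance at a target duration t is CS + D'/t:

```python
# Illustrative sketch only: estimating CS and D' from four constant-speed tests,
# then predicting the speeds expected to cause intolerance at 4, 6 and 8 min.
# The test speeds and times below are invented example values.
import numpy as np

t_lim = np.array([150.0, 300.0, 480.0, 720.0])   # time to intolerance in each test (s)
speed = np.array([4.8, 4.3, 4.0, 3.8])           # constant speed of each test (m/s)

distance = speed * t_lim                          # distance covered in each test (m)
CS, D_prime = np.polyfit(t_lim, distance, 1)      # d = CS*t + D': slope = CS, intercept = D'

# speed predicted to induce intolerance at a target duration t*: S = CS + D'/t*
for label, t_target in (("WR4", 240), ("WR6", 360), ("WR8", 480)):
    print(f"{label}: {CS + D_prime / t_target:.2f} m/s")
```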
Appetite, gut hormone and energy intake responses to low volume sprint interval and traditional endurance exercise.
Sprint interval exercise improves several health markers, but the appetite and energy balance response is unknown. This study compared the effects of sprint interval and endurance exercise on appetite, energy intake and gut hormone responses. Twelve healthy males [mean (SD): age 23 (3) years, body mass index 24.2 (2.9) kg·m⁻², maximum oxygen uptake 46.3 (10.2) mL·kg⁻¹·min⁻¹] completed three 8 h trials [control (CON), endurance exercise (END), sprint interval exercise (SIE)] separated by 1 week. Trials commenced upon completion of a standardised breakfast. Sixty minutes of cycling at 68.1 (4.3) % of maximum oxygen uptake was performed from 1.75-2.75 h in END. Six 30-s Wingate tests were performed from 2.25-2.75 h in SIE. Appetite ratings, acylated ghrelin and peptide YY (PYY) concentrations were measured throughout each trial. Food intake was monitored from buffet meals at 3.5 and 7 h and an overnight food bag. Appetite (P 0.05). Therefore, relative energy intake (energy intake minus the net energy expenditure of exercise) was lower in END than in CON (15.7 %; P = 0.006) and SIE (11.5 %; P = 0.082). An acute bout of endurance exercise resulted in lower appetite perceptions in the hours after exercise than sprint interval exercise and induced a greater 24 h energy deficit due to higher energy expenditure during exercise.
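A worked restatement of the relative energy intake calculation used above (the subtraction of resting expenditure over the exercise period is our assumption of the usual convention; it is not spelled out in the abstract):

$$\mathrm{REI} = \mathrm{EI} - \left(\mathrm{EE}_{\text{exercise}} - \mathrm{EE}_{\text{rest}}\right)$$

where EI is the total energy intake for the trial and the bracketed term is the net energy expenditure of the exercise bout.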
Sprint interval and sprint continuous training increases circulating CD34+ cells and cardio-respiratory fitness in young healthy women
The improvement of vascular health in the exercising limb can be attained by sprint interval training (SIT). However, the effects on systemic vascular function and on circulating angiogenic cells (CACs), which may contribute to endothelial repair, have not been investigated. Additionally, no comparison has been made between SIT and sprint continuous training (SCT), which requires a smaller time commitment.
Monocytes regulate the mechanism of T-cell death by inducing Fas-mediated apoptosis during bacterial infection.
Monocytes and T-cells are critical to the host response to acute bacterial infection, but monocytes are primarily viewed as amplifying the inflammatory signal. The mechanisms of cell death regulating T-cell numbers at sites of infection are incompletely characterized. T-cell death in cultures of peripheral blood mononuclear cells (PBMC) showed 'classic' features of apoptosis following exposure to pneumococci. Conversely, purified CD3+ T-cells cultured with pneumococci demonstrated necrosis with membrane permeabilization. The death of purified CD3+ T-cells was not inhibited by necrostatin, but required the bacterial toxin pneumolysin. Apoptosis of CD3+ T-cells in PBMC cultures required 'classical' CD14+ monocytes, which enhanced T-cell activation. CD3+ T-cell death was enhanced in HIV-seropositive individuals. Monocyte-mediated CD3+ T-cell apoptotic death was Fas-dependent both in vitro and in vivo. In the early stages of the T-cell-dependent host response to pneumococci, reduced Fas ligand-mediated T-cell apoptosis was associated with decreased bacterial clearance in the lung and increased bacteremia. In summary, monocytes converted pathogen-associated necrosis into Fas-dependent apoptosis and regulated levels of activated T-cells at sites of acute bacterial infection. These changes were associated with enhanced bacterial clearance in the lung and reduced levels of invasive pneumococcal disease.
Towards the minimal amount of exercise for improving metabolic health: beneficial effects of reduced-exertion high-intensity interval training
High-intensity interval training (HIT) has been proposed as a time-efficient alternative to traditional cardiorespiratory exercise training, but is very fatiguing. In this study, we investigated the effects of a reduced-exertion HIT (REHIT) exercise intervention on insulin sensitivity and aerobic capacity. Twenty-nine healthy but sedentary young men and women were randomly assigned to the REHIT intervention (men, n = 7; women, n = 8) or a control group (men, n = 6; women, n = 8). Subjects assigned to the control groups maintained their normal sedentary lifestyle, whilst subjects in the training groups completed three exercise sessions per week for 6 weeks. The 10-min exercise sessions consisted of low-intensity cycling (60 W) and one (first session) or two (all other sessions) brief ‘all-out’ sprints (10 s in week 1, 15 s in weeks 2–3 and 20 s in the final 3 weeks). Aerobic capacity (V̇O2peak) and the glucose and insulin responses to a 75-g glucose load (OGTT) were determined before and 3 days after the exercise program. Despite relatively low ratings of perceived exertion (RPE 13 ± 1), insulin sensitivity significantly increased by 28% in the male training group following the REHIT intervention (P < 0.05). V̇O2peak increased in the male training (+15%) and female training (+12%) groups (P < 0.01). In conclusion, we show that a novel, feasible exercise intervention can improve metabolic health and aerobic capacity. REHIT may offer a genuinely time-efficient alternative to HIT and conventional cardiorespiratory exercise training for improving risk factors of type 2 diabetes (T2D).
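To make the "minimal amount of exercise" point concrete, the short sketch below tallies the total commitment implied by the protocol as described above (3 × 10-min sessions per week for 6 weeks; 1 sprint in the first session and 2 in all others; sprint length 10 s in week 1, 15 s in weeks 2-3, 20 s in weeks 4-6). The parameter values come from the abstract; the per-week breakdown is our own arithmetic:

```python
# Tally of total training time and total 'all-out' sprint time for the REHIT
# protocol described above (values as stated in the abstract).
sessions_per_week, weeks, session_min = 3, 6, 10

sprint_seconds = 0
session_number = 0
for week in range(1, weeks + 1):
    sprint_len = 10 if week == 1 else (15 if week <= 3 else 20)   # sprint length (s)
    for _ in range(sessions_per_week):
        session_number += 1
        n_sprints = 1 if session_number == 1 else 2               # 1 sprint in session 1 only
        sprint_seconds += n_sprints * sprint_len

total_session_min = sessions_per_week * weeks * session_min
print(f"total training time over 6 weeks: {total_session_min} min")   # 180 min
print(f"total 'all-out' sprint time: {sprint_seconds} s")              # 590 s (~10 min)
```

In other words, the intervention involves roughly ten minutes of actual sprinting spread across three hours of easy cycling over the whole six weeks.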
Research into the Health Benefits of Sprint Interval Training Should Focus on Protocols with Fewer and Shorter Sprints
Over the past decade, it has been convincingly shown that regularly performing repeated brief supramaximal cycle sprints (sprint interval training [SIT]) is associated with aerobic adaptations and health benefits similar to or greater than with moderate-intensity continuous training (MICT). SIT is often promoted as a time-efficient exercise strategy, but the most commonly studied SIT protocol (4–6 repeated 30-s Wingate sprints with 4 min recovery, here referred to as ‘classic’ SIT) takes up to approximately 30 min per session. Combined with high associated perceived exertion, this makes classic SIT unsuitable as an alternative/adjunct to current exercise recommendations involving MICT. However, there are no indications that the design of the classic SIT protocol has been based on considerations regarding the lowest number or shortest duration of sprints to optimise time efficiency while retaining the associated health benefits. In recent years, studies have shown that novel SIT protocols with both fewer and shorter sprints are efficacious at improving important risk factors of noncommunicable diseases in sedentary individuals, and provide health benefits that are no worse than those associated with classic SIT. These shorter/easier protocols have the potential to remove many of the common barriers to exercise in the general population. Thus, based on the evidence summarised in this current opinion paper, we propose that there is a need for a fundamental change in focus in SIT research in order to move away from further characterising the classic SIT protocol and towards establishing acceptable and effective protocols that involve minimal sprint durations and repetitions.
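As a quick arithmetic check of the session length quoted above (the warm-up and cool-down durations are not specified here): six 30-s sprints plus the five intervening 4-min recoveries already amount to roughly 6 × 0.5 min + 5 × 4 min ≈ 23 min, so a full classic SIT session with warm-up and cool-down does indeed approach 30 min.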
Factors associated with completion of bowel cancer screening and the potential effects of simplifying the screening test algorithm
BACKGROUND: The primary colorectal cancer screening test in England is a guaiac faecal occult blood test (gFOBt). The NHS Bowel Cancer Screening Programme (BCSP) interprets tests on six samples on up to three test kits to determine a definitive positive or negative result. However, the test algorithm fails to achieve a definitive result for a significant number of participants because they do not comply with the programme requirements. This study identifies factors associated with failed compliance and modifications to the screening algorithm that will improve the clinical effectiveness of the screening programme. METHODS: The BCSP Southern Hub data for screening episodes started in 2006–2012 were analysed for participants aged 60–69 years. The variables included age, sex, level of deprivation, gFOBt results and clinical outcome. RESULTS: The data set included 1 409 335 screening episodes; 95.08% of participants had a definitively normal result on kit 1 (no positive spots). Among participants asked to complete a second or third gFOBt, 5.10% and 4.65%, respectively, failed to return a valid kit. Among participants referred for follow-up, 13.80% did not comply. Older age was associated with compliance at repeat testing but with non-compliance at follow-up. Increasing levels of deprivation were associated with non-compliance at repeat testing and follow-up. Modelling a reduction in the threshold for immediate referral led to a small increase in completion of the screening pathway. CONCLUSIONS: Reducing the number of positive spots required on the first gFOBt kit for referral for follow-up, together with targeted measures to improve compliance with follow-up, may improve completion of the screening pathway.
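To make the algorithm-simplification idea concrete, the sketch below is a purely illustrative decision rule for the first kit; the spot threshold and outcome labels are placeholders of ours, not the BCSP's actual rules. The point being modelled in the abstract is that lowering the immediate-referral threshold sends more participants straight to follow-up and removes repeat-kit steps where compliance is commonly lost.

```python
def kit1_outcome(positive_spots, referral_threshold=5):
    """Outcome of the first gFOBt kit given the number of positive windows (0-6).

    `referral_threshold` is a placeholder parameter, not the actual BCSP value:
    lowering it routes more episodes directly to follow-up instead of repeat kits.
    """
    if positive_spots == 0:
        return "definitive normal"
    if positive_spots >= referral_threshold:
        return "refer for follow-up"
    return "repeat kit required"   # episode can end without a definitive result here
```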
Specific Gene Expression Responses to Parasite Genotypes Reveal Redundancy of Innate Immunity in Vertebrates
Vertebrate innate immunity is the first line of defense against an invading pathogen and has long been assumed to be largely unspecific with respect to parasite/pathogen species. However, recent phenotypic evidence suggests that immunogenetic variation, i.e. allelic variability in genes associated with the immune system, results in host-parasite genotype-by-genotype interactions and thus specific innate immune responses. Immunogenetic variation is common in all vertebrate taxa, and this reflects an effective immunological function in complex environments. However, how host gene expression patterns of the innate immune response vary with within-species genetic diversity of macroparasites in vertebrates is unknown. We hypothesized that intra-specific variation among parasite genotypes must be reflected in host gene expression patterns. Here we used high-throughput RNA-sequencing to examine the effect of parasite genotypes on gene expression patterns of a vertebrate host, the three-spined stickleback (Gasterosteus aculeatus). By infecting naïve fish with distinct trematode genotypes of the species Diplostomum pseudospathaceum, we show that gene activity of innate immunity in three-spined sticklebacks depended on the identity of the infecting macroparasite genotype. In addition to a suite of genes indicative of a general response against the trematode, we also find parasite-strain-specific gene expression, in particular in complement system genes, despite similar infection rates of single-clone treatments. The observed discrepancy between infection rates and gene expression indicates the presence of alternative pathways that execute similar functions. This suggests that the innate immune system can induce redundant responses specific to parasite genotypes.
The Evolution of Compact Binary Star Systems
We review the formation and evolution of compact binary stars consisting of white dwarfs (WDs), neutron stars (NSs), and black holes (BHs). Binary NSs and BHs are thought to be the primary astrophysical sources of gravitational waves (GWs) within the frequency band of ground-based detectors, while compact binaries of WDs are important sources of GWs at lower frequencies to be covered by space interferometers (LISA). Major uncertainties in the current understanding of the properties of NSs and BHs most relevant to GW studies are discussed, including the treatment of the natal kicks which compact stellar remnants acquire during the core collapse of massive stars and the common envelope phase of binary evolution. We discuss the coalescence rates of binary NSs and BHs and prospects for their detection, as well as the formation and evolution of binary WDs and their observational manifestations. Special attention is given to AM CVn stars, compact binaries in which the Roche lobe is filled by another WD or a low-mass partially degenerate helium star, as these stars are thought to be the best LISA verification binary GW sources.
Incidence, demographics, and clinical characteristics of diabetes of the exocrine pancreas (type 3c): A retrospective cohort study
This study was conducted to describe the incidence of diabetes following pancreatic disease, assess how these patients are classified by clinicians, and compare clinical characteristics with type 1 and type 2 diabetes. Primary care records in England (n = 2,360,631) were searched for incident cases of adult-onset diabetes between 1 January 2005 and 31 March 2016. We examined demographics, diabetes classification, glycemic control, and insulin use in those with and without pancreatic disease (subcategorized into acute pancreatitis or chronic pancreatic disease) before diabetes diagnosis. Regression analysis was used to control for baseline potential risk factors for poor glycemic control (HbA1c ≥7% [53 mmol/mol]) and insulin requirement. We identified 31,789 new diagnoses of adult-onset diabetes. Diabetes following pancreatic disease (2.59 [95% CI 2.38-2.81] per 100,000 person-years) was more common than type 1 diabetes (1.64 [1.47-1.82]; P < 0.001). The 559 cases of diabetes following pancreatic disease were mostly classified by clinicians as type 2 diabetes (87.8%) and uncommonly as diabetes of the exocrine pancreas (2.7%). Diabetes following pancreatic disease was diagnosed at a median age of 59 years and BMI of 29.2 kg/m2. Diabetes following pancreatic disease was associated with poor glycemic control (adjusted odds ratio, 1.7 [1.3-2.2]; P < 0.001) compared with type 2 diabetes. Insulin use within 5 years was 4.1% (3.8-4.4) with type 2 diabetes, 20.9% (14.6-28.9) with diabetes following acute pancreatitis, and 45.8% (34.2-57.9) with diabetes following chronic pancreatic disease. Diabetes of the exocrine pancreas is frequently labeled type 2 diabetes but has worse glycemic control and a markedly greater requirement for insulin.
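For reference, the incidence figures quoted above follow the standard person-time definition (a restatement of the usual epidemiological formula, not one given in the abstract):

$$\text{incidence rate per } 100{,}000 \text{ person-years} = \frac{\text{number of new cases}}{\text{total person-years at risk}} \times 100{,}000$$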
