The effect of cooling prior to and during exercise on exercise performance and capacity in the heat: a meta-analysis
Exercise is impaired in hot, compared with moderate, conditions. The development of hyperthermia is strongly linked to this impairment and, as a result, many different strategies have been investigated to combat it. This meta-analysis focused on one of the most popular strategies: cooling. Pre-cooling has received the most attention, but cooling applied during the bout of exercise has also been investigated more recently, and both approaches were reviewed. We conducted a literature search and retrieved twenty-eight articles that investigated the effect of cooling administered either prior to (n=23) or during (n=5) an exercise test in hot (WBGT >26°C) conditions. Mean and weighted effect sizes (Cohen's d) were calculated. Overall, pre-cooling has a moderate (d=0.73) effect on subsequent performance, but the magnitude of the effect depends on the nature of the test. Sprint performance is impaired (d=-0.26), whereas intermittent performance and prolonged exercise are both improved following cooling (d=0.47 and d=1.91, respectively). Cooling during exercise also has a positive effect on performance and capacity (d=0.76). Improvements were observed in studies with and without cooling-induced physiological alterations, and the literature supports the suggestion of a dose-response relationship between cooling, thermal strain and improvements in performance and capacity. In summary, pre-cooling can improve subsequent intermittent and prolonged exercise performance and capacity in a hot environment, but sprint performance is impaired. Cooling during exercise also has a positive effect on exercise performance and capacity in a hot environment.
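As a hedged illustration of the effect-size arithmetic behind such a meta-analysis (the abstract does not state the exact weighting scheme, so simple sample-size weighting is assumed here), a minimal Python sketch:

```python
def cohens_d(mean_cooling, mean_control, sd_pooled):
    """Standardised mean difference (Cohen's d) between a cooling
    condition and a control condition."""
    return (mean_cooling - mean_control) / sd_pooled

def weighted_mean_effect(effect_sizes, sample_sizes):
    """Combine per-study effect sizes into a single summary estimate.
    Sample-size weighting is an assumption for illustration; the paper
    may use a different scheme (e.g. inverse-variance weighting)."""
    total_n = sum(sample_sizes)
    return sum(d * n for d, n in zip(effect_sizes, sample_sizes)) / total_n

# Hypothetical per-study values, not data taken from the paper:
print(weighted_mean_effect([0.47, 1.91, -0.26], [12, 10, 8]))
```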
Health effects in fish of long-term exposure to effluents from wastewater treatment works
The effects of simple mixtures of chemicals with similar mechanisms of action can be predicted using the concentration addition (CA) model. The ability of this model to predict the estrogenic effects of more complex mixtures, such as effluent discharges, however, has yet to be established. Effluents from 43 U.K. wastewater treatment works were analyzed for the presence of the principal estrogenic chemical contaminants: estradiol, estrone, ethinylestradiol, and nonylphenol. The measured concentrations were used to predict the estrogenic activity of each effluent, employing the CA model, based on the relative potencies of the individual chemicals in an in vitro recombinant yeast estrogen screen (rYES) and a short-term (14-day) in vivo rainbow trout vitellogenin induction assay. Based on the measured concentrations of the four chemicals in the effluents and their relative potencies in each assay, the calculated in vitro and in vivo responses compared well and ranged between 3.5 and 87 ng/L of estradiol equivalents (E2 EQ) for the different effluents. In the rYES, however, the measured E2 EQ concentrations in the effluents ranged between 0.65 and 43 ng E2 EQ/L and diverged from those predicted by the CA model. Deviations in the estimation of the estrogenic potency of the effluents by the CA model, compared with the measured responses in the rYES, are likely to have resulted from inaccuracies associated with the measurement of the chemicals in the extracts derived from the complex effluents. Such deviations could also arise from interactions between chemicals present in the extracts that disrupted the activation of the estrogen response elements in the rYES. E2 EQ concentrations derived from the vitellogenic response in fathead minnows exposed to a series of effluent dilutions were highly comparable with the E2 EQ concentrations derived from assessments of the estrogenic potency of these dilutions in the rYES. Together these data support the use of bioassays for determining the estrogenic potency of WwTW effluents, and they highlight the associated problems for modeling approaches that are reliant on measured concentrations of estrogenic chemicals.
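For clarity, the concentration addition calculation reduces to summing each chemical's measured concentration weighted by its potency relative to estradiol. A minimal sketch, using illustrative potency values (the actual assay-derived potencies are not reported in the abstract):

```python
# Relative potencies vs. estradiol (set to 1.0). The numbers below are
# placeholders for illustration; real potencies are assay-specific and
# are not given in the abstract.
relative_potency = {
    "estradiol": 1.0,
    "estrone": 0.3,           # assumed
    "ethinylestradiol": 10.0,  # assumed
    "nonylphenol": 0.00003,    # assumed
}

def e2_equivalents(concentrations_ng_per_l):
    """Concentration addition: each chemical's measured concentration
    (ng/L) scaled by its potency relative to estradiol, then summed,
    giving a predicted potency in ng E2 EQ/L."""
    return sum(conc * relative_potency[chem]
               for chem, conc in concentrations_ng_per_l.items())

# Hypothetical effluent measurement (ng/L):
sample = {"estradiol": 5.0, "estrone": 20.0,
          "ethinylestradiol": 0.5, "nonylphenol": 1000.0}
print(f"{e2_equivalents(sample):.2f} ng E2 EQ/L")
```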
Effects of environmental enrichment on survivorship, growth, sex ratio and behaviour in laboratory maintained zebrafish Danio rerio
Environmental enrichment involves increasing the complexity of a fish's environment in order to improve welfare. Researchers are legally obliged to consider the welfare of laboratory animals, and poor welfare may result in less robust data in experimental science. Laboratory zebrafish Danio rerio are usually kept in bare aquaria for ease of husbandry and, despite being a well-studied species, little is known about how laboratory housing affects their welfare. This study shows that environmental enrichment, in the form of the addition of gravel substratum and plants to the tank, affects survivorship, growth and behaviour in laboratory-maintained D. rerio. Larvae reared in enriched tanks had significantly higher survivorship than larvae reared in bare tanks. Effects of the tank conditions on growth were more variable. Females from enriched tanks had a higher body condition than females maintained in bare tanks, but intriguingly this was not the case for males, where the only difference was a more variable body condition in males maintained in bare tanks. Sex ratio in the rearing tanks did not differ between treatments. Resource monopolisation was higher for fish in enriched tanks than for those in bare tanks. Fish from enriched tanks displayed lower levels of behaviours associated with anxiety than fish from bare tanks when placed into a novel environment. Thus, this study demonstrates differences in welfare for D. rerio maintained under different environmental conditions, with enhancements in welfare more commonly associated with tank enrichment. University of Exeter Aquatic Resources Centre.
The nuclear receptors of Biomphalaria glabrata and Lottia gigantea: Implications for developing new model organisms
Nuclear receptors (NRs) are transcription regulators involved in an array of diverse physiological functions, including key roles in endocrine and metabolic function. The aim of this study was to identify nuclear receptors in the fully sequenced genome of the gastropod snail Biomphalaria glabrata, intermediate host for Schistosoma mansoni, and compare these to known vertebrate NRs, with a view to assessing the snail's potential as an invertebrate model organism for endocrine function, both as a prospective new test organism and to elucidate the fundamental genetic and mechanistic causes of disease. For comparative purposes, the genome of a second gastropod, the owl limpet Lottia gigantea, was also investigated for nuclear receptors. Thirty-nine and thirty-three putative NRs were identified from the B. glabrata and L. gigantea genomes, respectively, based on the presence of a conserved DNA-binding domain and/or ligand-binding domain. Nuclear receptor transcript expression was confirmed, and sequences were subjected to a comparative phylogenetic analysis, which demonstrated that these molluscs have representatives of all the major NR subfamilies (1-6). Many of the identified NRs are conserved between vertebrates and invertebrates; however, differences exist, most notably the absence of receptors of Group 3C, which includes some of the vertebrate endocrine hormone targets. The mollusc genomes also contain NR homologues that are present in insects and nematodes but not in vertebrates, such as Group 1J (HR48/DAF12/HR96). The identification of many shared receptors between humans and molluscs indicates the potential for molluscs as model organisms; however, the absence of several steroid hormone receptors indicates snail endocrine systems are fundamentally different. The National Centre for the Replacement, Refinement and Reduction of Animals in Research, Grant Ref: G0900802 to CSJ, LRN, SJ & EJR [www.nc3rs.org.uk].
Bioavailability in soils
The consumption of locally produced vegetables by humans may be an important exposure pathway for soil contaminants in many urban settings and for agricultural land use. Hence, prediction of metal and metalloid uptake by vegetables from contaminated soils is an important part of the Human Health Risk Assessment procedure. The behaviour of metals (cadmium, chromium, cobalt, copper, mercury, molybdenum, nickel, lead and zinc) and metalloids (arsenic, boron and selenium) in contaminated soils depends to a large extent on the intrinsic charge, valence and speciation of the contaminant ion, and on soil properties such as pH, redox status and contents of clay and/or organic matter. However, the chemistry and behaviour of the contaminant in soil alone cannot predict soil-to-plant transfer. Root uptake, root selectivity, ion interactions, rhizosphere processes, leaf uptake from the atmosphere and plant partitioning are important processes that ultimately govern the accumulation of metals and metalloids in edible vegetable tissues. Mechanistic models to accurately describe all these processes have not yet been developed, let alone validated under field conditions. Hence, to estimate risks from vegetable consumption, empirical models have been used to correlate concentrations of metals and metalloids in contaminated soils, soil physico-chemical characteristics, and concentrations of elements in vegetable tissues. These models should only be used within the bounds of their calibration, and often need to be re-calibrated or validated using local soil and environmental conditions on a regional or site-specific basis. Mike J. McLaughlin, Erik Smolders, Fien Degryse, and Rene Rietra.
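As an illustration of the empirical approach described above, a common choice is a log-linear regression of plant concentration on soil concentration and pH. This is a minimal sketch: the functional form is assumed and every number below is a fabricated placeholder for demonstration, not a value from the chapter.

```python
import numpy as np

# Log-linear empirical transfer model (assumed form):
#   log10(Cplant) = a + b*log10(Csoil) + c*pH
# All inputs are hypothetical placeholder data.
log_c_soil = np.log10([12.0, 45.0, 90.0, 150.0])  # soil Cd, mg/kg (hypothetical)
ph = np.array([5.2, 6.1, 6.8, 7.4])
log_c_plant = np.log10([0.8, 1.9, 2.4, 2.1])      # tissue Cd, mg/kg (hypothetical)

# Ordinary least squares fit of the coefficients a, b, c.
X = np.column_stack([np.ones_like(ph), log_c_soil, ph])
(a, b, c), *_ = np.linalg.lstsq(X, log_c_plant, rcond=None)

def predict_plant_conc(c_soil_mg_kg, soil_ph):
    """Predict vegetable-tissue concentration (mg/kg) from soil
    concentration and pH; valid only within the calibration range,
    as the abstract itself cautions."""
    return 10 ** (a + b * np.log10(c_soil_mg_kg) + c * soil_ph)

print(predict_plant_conc(60.0, 6.5))
```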
Cooling athletes with a spinal cord injury
Cooling strategies that help prevent a reduction in exercise capacity whilst exercising in the heat have received considerable research interest over the past three decades, especially in the lead-up to a relatively hot Olympic and Paralympic Games. Progressing into the next Olympic/Paralympic cycle, the host city, Rio de Janeiro, could again present an environmental challenge for competing athletes. Despite the interest and vast array of research into cooling strategies for the able-bodied athlete, less is known regarding the application of these strategies in the thermoregulatory-impaired spinal cord injured (SCI) athletic population. Individuals with an SCI have a reduced afferent input to the thermoregulatory centre and a loss of both sweating capacity and vasomotor control below the level of the spinal cord lesion. The magnitude of this thermoregulatory impairment is proportional to the level of the lesion: individuals with high-level lesions (tetraplegia) are at a greater risk of heat illness than individuals with lower-level lesions (paraplegia) at a given exercise intensity. Therefore, cooling strategies may be highly beneficial in this population group, even in moderate ambient conditions (~21 °C). This review was undertaken to examine the scientific literature that addresses the application of cooling strategies in individuals with an SCI. Each method is discussed with regard to its practical issues and potential underlying mechanism. For instance, site-specific cooling would be more suitable for an athlete with an SCI than whole-body water immersion, owing to the practical difficulties of administering the latter in this population group. From the studies reviewed, wearing an ice vest during intermittent sprint exercise has been shown to decrease thermal strain and improve performance. These garments have also been shown to be effective during exercise in the able-bodied. Drawing on additional findings from the able-bodied literature, the combination of methods used prior to and during exercise and/or during rest periods/half-time may increase the effectiveness of a strategy. However, owing to the paucity of research involving athletes with an SCI, it is difficult to establish an optimal cooling strategy. Future studies are needed to ensure that research outcomes can be translated into meaningful performance enhancements by investigating cooling strategies under the constraints of actual competition. Cooling strategies that meet the demands of intermittent wheelchair sports need to be identified, with particular attention to the logistics of the sport.
Improving zebrafish laboratory welfare and scientific research through understanding their natural history
Globally, millions of zebrafish (Danio rerio) are used for scientific laboratory experiments, for which researchers have a duty of care, with legal obligations to consider their welfare. Considering the growing use of the zebrafish as a vertebrate model for addressing a diverse range of scientific questions, optimising their laboratory conditions is of major importance both for welfare and for improving scientific research. However, most guidelines for the care and breeding of zebrafish for research are concerned primarily with maximising production and minimising costs, and pay little attention to the effects on welfare of the environments in which the fish are maintained, or to how those conditions affect their scientific research. Here we review the physical and social conditions in which laboratory zebrafish are kept, identifying and drawing attention to factors likely to affect their welfare and experimental science. We also identify a fundamental lack of knowledge of how zebrafish interact with many biotic and abiotic features in their natural environment, knowledge that is needed to support ways to optimise zebrafish health and well-being in the laboratory, and in turn the quality of scientific data produced. We advocate that the conditions under which zebrafish are maintained need to become a more integral part of research, and that we understand more fully how they influence experimental outcomes and in turn the interpretation of the data generated. University of Exeter.
Accommodation and vergence response gains to different near cues characterize specific esotropias
Aim. To describe preliminary findings of how the profile of the use of blur, disparity and proximal cues varies between non-strabismic groups and those with different types of esotropia.
Design. Case-control study.
Methodology. A remote haploscopic photorefractor measured simultaneous convergence and accommodation to a range of targets containing all combinations of binocular disparity, blur and proximal (looming) cues. Thirteen constant esotropes, 16 fully accommodative esotropes and 8 convergence excess esotropes were compared with age- and refractive-error-matched controls, and with 27 young adult emmetropic controls. All wore full refractive correction if not emmetropic. Response AC/A and CA/C ratios were also assessed.
Results. Cue use differed between the groups. Even esotropes with constant suppression and no binocular vision (BV) responded to disparity cues. The constant esotropes with weak BV showed trends towards more stable responses and better vergence and accommodation than those without any BV. The accommodative esotropes made less use of disparity cues to drive accommodation (p=0.04) and more use of blur to drive vergence (p=0.008) than controls. All esotropic groups failed to show the strong bias for better responses to disparity cues found in the controls, with convergence excess esotropes favoring blur cues. AC/A and CA/C ratios existed in an inverse relationship in the different groups (a minimal sketch of how these response ratios are computed follows this abstract). Accommodative lag of >1.0 D at 33 cm was common (46%) in the pooled esotropia groups compared with 11% in typical children (p=0.05).
Conclusion. Esotropic children use near cues differently from matched non-esotropic children, in ways characteristic of their deviations. Relatively higher weighting of blur cues was found in accommodative esotropia compared with matched controls.
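For readers unfamiliar with the two ratios above, the following is a minimal sketch of the conventional response AC/A and CA/C calculations. The units (prism dioptres, dioptres, metre angles) and the example values are assumptions for illustration, not measurements from this study:

```python
def response_ac_a(delta_vergence_pd, delta_accommodation_d):
    """Response AC/A ratio: accommodative convergence (prism dioptres)
    produced per dioptre of measured accommodative response."""
    return delta_vergence_pd / delta_accommodation_d

def response_ca_c(delta_accommodation_d, delta_vergence_ma):
    """Response CA/C ratio: convergence-driven accommodation (dioptres)
    produced per metre angle of measured vergence response."""
    return delta_accommodation_d / delta_vergence_ma

# Hypothetical responses to a target stepped from 1.0 m to 0.33 m:
print(response_ac_a(delta_vergence_pd=8.0, delta_accommodation_d=1.8))
print(response_ca_c(delta_accommodation_d=1.2, delta_vergence_ma=2.0))
```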
CAERvest® – a novel endothermic hypothermic device for core temperature cooling: safety and efficacy testing
Listeria monocytogenes in Milk Products
Milk and milk products are frequently identified as vectors for transmission of Listeria monocytogenes. Milk can be contaminated at farm level, either by indirect external contamination from the farm environment or, less frequently, by direct contamination of the milk from infection in the animal. Pasteurisation of milk will kill L. monocytogenes, but post-pasteurisation contamination, consumption of unpasteurised milk and manufacture of unpasteurised milk products can lead to milk being the cause of outbreaks of listeriosis. Therefore, there is a concern that L. monocytogenes in milk could pose a public health risk. To protect against this risk, there is a need for awareness of the issues, hygienic practices to reduce the risk, and adequate sampling and analysis to verify that the risk is controlled. This review will highlight the issues surrounding L. monocytogenes in milk and milk products, including possible control measures, thereby creating awareness of L. monocytogenes and contributing to the protection of public health.
