Should Research Ethics Encourage the Production of Cost-Effective Interventions?
This project considers whether and how research ethics can contribute to the provision of cost-effective medical interventions. Clinical research ethics represents an underexplored context for the promotion of cost-effectiveness. In particular, although scholars have recently argued that research on less-expensive, less-effective interventions can be ethical, there has been little or no discussion of whether ethical considerations justify curtailing research on more expensive, more effective interventions. Yet considering cost-effectiveness at the research stage can help ensure that scarce resources such as tissue samples or limited subject populations are employed where they do the most good; can support parallel efforts by providers and insurers to promote cost-effectiveness; and can ensure that research has social value and benefits subjects. I discuss and rebut potential objections to the consideration of cost-effectiveness in research, including the difficulty of predicting effectiveness and cost at the research stage, concerns about limitations in cost-effectiveness analysis, and worries about overly limiting researchers’ freedom. I then consider the advantages and disadvantages of having certain participants in the research enterprise, including IRBs, advisory committees, sponsors, investigators, and subjects, consider cost-effectiveness. The project concludes by qualifiedly endorsing the consideration of cost-effectiveness at the research stage. While incorporating cost-effectiveness considerations into the ethical evaluation of human subjects research will not on its own ensure that the health care system realizes cost-effectiveness goals, doing so nonetheless represents an important part of a broader effort to control rising medical costs.
Building Babies - Chapter 16
In contrast to birds, male mammals rarely help to raise offspring. Among all mammals, it is only in rodents, carnivores, and primates that males are sometimes intensively engaged in providing infant care (Kleiman and Malcolm 1981). Male caretaking of infants has long been recognized in nonhuman primates (Itani 1959). Given that infant care behavior can have a positive effect on the infant’s development, growth, well-being, or survival, why are male mammals not more frequently involved in “building babies”? We begin the chapter by defining a few relevant terms and introducing the theory and hypotheses that have historically addressed the evolution of paternal care. We then review empirical findings on male care among primate taxa, before focusing, in the final section, on our own work on paternal care in South American owl monkeys (Aotus spp.). We conclude the chapter with some suggestions for future studies.
Deutsche Forschungsgemeinschaft (HU 1746/2-1), the Wenner-Gren Foundation, the L.S.B. Leakey Foundation, the National Geographic Society, the National Science Foundation (BCS-0621020), the University of Pennsylvania Research Foundation, and the Zoological Society of San Diego.
Quality of Life and Menopause in Women with Physical Disabilities
Objective: The goal of this cross-sectional study was to explore quality of life (QOL) in a sample of postmenopausal women with physical disabilities due to polio contracted in childhood. A structural equation model was used to test the hypothesis that menopause symptoms have a minimal effect on QOL when disability-related variables are taken into account. Methods: A sample of 752 postmenopausal women completed a written survey. The structural equation model contained two measured predictors (age, severity of postpolio sequelae) and one latent predictor (menopause symptoms, defined by four measured indicators). Functional status (defined by two measured indicators) was included as a mediator, with QOL (defined by three measured indicators) as the outcome. Results: The original model yielded acceptable fit indices (CFI = 0.96, RMSEA = 0.055) but produced a number of unexpected relationships that proved to be artifacts after model respecification. The respecified model yielded a nonsignificant chi-square value, indicating no significant discrepancy between the proposed model and the observed data (chi-square = 18.5, df = 13, p = 0.138). All fit indices indicated a good fit: CFI = 0.997, NNFI = 0.987, chi-square/df = 1.43, and RMSEA = 0.024. Conclusions: When the effects of postpolio sequelae and functional status are included in the structural equation model, only the psychological symptoms of menopause play a prominent role in explaining QOL in this sample. Clinically, these findings suggest that attention to psychological symptoms is warranted, and that an exclusive focus on the physical aspects of menopause, to the exclusion of other midlife stressors and influences on a woman’s psychological well-being, ignores the larger context of these women’s lives. In particular, many women with disabilities may contend with additional or exacerbated stressors related to their disability.
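The hypothesised structure (two measured predictors, one latent predictor, a functional-status mediator, and a latent QOL outcome) can be written down compactly in SEM software. The following is a minimal sketch, assuming lavaan-style syntax via the Python package semopy; all indicator and variable names are placeholders, not the study's actual variables.

```python
# Minimal sketch (not the authors' code) of a model with this structure, using semopy.
# Indicator names (sym1..sym4, func1..func2, qol1..qol3, postpolio_severity) are placeholders.
from semopy import Model

model_desc = """
Menopause =~ sym1 + sym2 + sym3 + sym4
Function =~ func1 + func2
QOL =~ qol1 + qol2 + qol3
Function ~ age + postpolio_severity + Menopause
QOL ~ Function + Menopause + age + postpolio_severity
"""

# Measurement model: Menopause (4 indicators), Function (2 indicators), QOL (3 indicators).
# Structural model: age and post-polio severity predict functional status, which mediates
# their effect, together with menopause symptoms, on QOL.
model = Model(model_desc)  # parsing the specification requires no data

# With the survey responses in a pandas DataFrame `df`, fitting and fit indices
# (chi-square, CFI, RMSEA) would be obtained along these lines:
#   model.fit(df)
#   from semopy import calc_stats
#   print(calc_stats(model))
```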
Supervised exercise training as an adjunctive therapy for venous leg ulcers: study protocol for a randomised controlled trial
Background: Venous leg ulcers are common, chronic wounds that are painful and reduce quality of life. Compression therapy is known to assist in the healing of venous leg ulceration. Supervised exercise training that targets an improvement in calf muscle pump function might be a useful adjunctive therapy for enhancing ulcer healing and other aspects of physical and mental health. However, the evidence on exercise for individuals with venous leg ulcers is sparse. Here, we describe the protocol for a study that aims to assess the feasibility of undertaking a randomised controlled trial of a supervised exercise programme in people who are receiving compression for venous ulceration. Methods/Design: This is a randomised, controlled, assessor-blinded, two-centre feasibility trial with two parallel groups. Eighty adults who are receiving lower-limb compression for a venous leg ulcer will be randomly assigned to receive usual care (compression only) or usual care plus a 12-week supervised exercise programme. Participants in the exercise group will be invited to undertake three 60-minute sessions of supervised exercise each week, and each session will involve a combination of treadmill walking, upright cycling, and strength and flexibility exercises for the lower limbs. Participants will be assessed before randomisation and 3, 6 and 12 months after randomisation. Primary outcomes include rates of recruitment, retention and adherence. Secondary outcomes include time to ulcer healing, proportion of participants healed, percentage and absolute change in ulcer size, health-related quality of life (EQ-5D-5L and VEINES-QOL/Sym), lower-limb cutaneous microvascular function (laser Doppler flowmetry coupled with iontophoresis) and physical fitness (30-second sit-to-stand test, chair sit-and-reach test, 6-minute walk test and ankle range of motion). The costs associated with the exercise programme and health-care utilisation will be calculated. We will also complete interviews with a sub-sample of participants to explore their experiences of having a venous ulcer and the acceptability of the exercise intervention and study procedures. Discussion: Data from this study will be used to refine the supervised exercise programme, investigate the acceptability of the intervention and study design, and determine the most appropriate outcome measures, thereby providing the estimates needed to design an adequately powered trial across several centres.
The bashful and the boastful: prestigious leaders and social change in Mesolithic Societies
The creation and maintenance of influential leaders and authorities is one of the key themes of archaeological and historical enquiry. However, the social dynamics of authorities and leaders in the Mesolithic remain a largely unexplored area of study. The role and influence of authorities can be remarkably different in different situations, yet they exist in all societies and in almost all social contexts, from playgrounds to parliaments. Here we explore the literature on the dynamics of authority creation, maintenance and contestation in egalitarian societies, and discuss the implications for our interpretation and understanding of the formation of authorities and leaders and of changing social relationships within the Mesolithic.
Research into the Health Benefits of Sprint Interval Training Should Focus on Protocols with Fewer and Shorter Sprints
Over the past decade, it has been convincingly shown that regularly performing repeated brief supramaximal cycle sprints (sprint interval training [SIT]) is associated with aerobic adaptations and health benefits similar to or greater than those of moderate-intensity continuous training (MICT). SIT is often promoted as a time-efficient exercise strategy, but the most commonly studied SIT protocol (4–6 repeated 30-s Wingate sprints with 4 min of recovery, here referred to as ‘classic’ SIT) takes up to approximately 30 min per session. Combined with the high associated perceived exertion, this makes classic SIT unsuitable as an alternative/adjunct to current exercise recommendations involving MICT. However, there are no indications that the design of the classic SIT protocol was based on considerations regarding the lowest number or shortest duration of sprints needed to optimise time efficiency while retaining the associated health benefits. In recent years, studies have shown that novel SIT protocols with both fewer and shorter sprints are efficacious at improving important risk factors for noncommunicable diseases in sedentary individuals, and provide health benefits that are no worse than those associated with classic SIT. These shorter/easier protocols have the potential to remove many of the common barriers to exercise in the general population. Thus, based on the evidence summarised in this current opinion paper, we propose that there is a need for a fundamental change of focus in SIT research, away from further characterising the classic SIT protocol and towards establishing acceptable and effective protocols that involve minimal sprint durations and repetitions.
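The time-efficiency argument rests on simple arithmetic: six 30-s sprints separated by 4 min of recovery already account for most of a roughly 30-min session, whereas protocols with fewer, shorter sprints fit into well under 15 min. The sketch below illustrates this; the warm-up and cool-down durations and the "reduced" protocol configuration are illustrative assumptions, not figures taken from the paper.

```python
# Back-of-envelope session-time comparison. Warm-up/cool-down durations and the
# "reduced" protocol configuration are illustrative assumptions only.
def session_minutes(n_sprints, sprint_s, recovery_min, warmup_min=3.0, cooldown_min=2.0):
    """Total session time: warm-up + sprint work + between-sprint recovery + cool-down."""
    work_min = n_sprints * sprint_s / 60.0
    recovery_total = (n_sprints - 1) * recovery_min
    return warmup_min + work_min + recovery_total + cooldown_min

print(f"classic SIT (6 x 30 s, 4 min recovery): ~{session_minutes(6, 30, 4):.0f} min")
print(f"reduced SIT (2 x 20 s, 3 min recovery): ~{session_minutes(2, 20, 3):.0f} min")
```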
Nutritional correlates of koala persistence in a low-density population
It is widely postulated that nutritional factors drive bottom-up, resource-based patterns in herbivore ecology and distribution. There is, however, much controversy over the roles of different plant constituents and how these influence individual herbivores and herbivore populations. The density of koala (Phascolarctos cinereus) populations varies widely, and many attribute population trends to variation in the nutritional quality of the eucalypt leaves in their diet, but there is little evidence to support this hypothesis. We used a nested design involving sampling of trees at two spatial scales to investigate how leaf chemistry influences free-living koalas from a low-density population in south-east New South Wales, Australia. Using koala faecal pellets as a proxy for koala visitation to trees, we found an interaction between toxins and nutrients in leaves at a small spatial scale, whereby koalas preferred trees whose leaves had higher concentrations of available nitrogen but lower concentrations of sideroxylonals (secondary metabolites found exclusively in eucalypts) compared with neighbouring trees of the same species. We argue that taxonomic and phenotypic diversity is likely to be important when foraging in habitats of low nutritional quality, because it provides the diet choice needed to trade off nutrients against toxins and to minimise movement costs. Our findings suggest that immediate nutritional concerns are an important priority for folivores in low-quality habitats and imply that nutritional limitations play an important role in constraining folivore populations. We show that, with a careful experimental design, it is possible to make inferences about populations of herbivores that exist at extremely low densities and thus achieve a better understanding of how plant composition influences herbivore ecology and persistence.
IW and WF received a grant from the New South Wales (NSW) Department of Environment, Climate Change & Water.
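The nutrient-by-toxin interaction reported here is the kind of effect that can be examined with a simple binomial model of tree visitation. The sketch below is purely illustrative: the data are simulated, the variable names are hypothetical, and it ignores the study's nested, paired-tree design.

```python
# Illustrative only: logistic model of tree visitation (pellet presence) as a function
# of leaf available nitrogen, sideroxylonal concentration, and their interaction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n_trees = 200
leaves = pd.DataFrame({
    "available_n": rng.normal(1.0, 0.3, n_trees),    # % dry matter (hypothetical)
    "sideroxylonal": rng.normal(5.0, 2.0, n_trees),  # mg/g dry matter (hypothetical)
})
# Simulate visitation that increases with nitrogen and decreases with the toxin.
logit_p = -1.0 + 2.0 * leaves["available_n"] - 0.4 * leaves["sideroxylonal"]
leaves["pellets_present"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

# The interaction term asks whether the effect of nutrients depends on toxin level.
fit = smf.logit("pellets_present ~ available_n * sideroxylonal", data=leaves).fit(disp=0)
print(fit.summary())
```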
Defining functional diversity for lignocellulose degradation in a microbial community using multi-omics studies
Abstract

Background
Lignocellulose is one of the most abundant forms of fixed carbon in the biosphere. Current industrial approaches to the degradation of lignocellulose employ enzyme mixtures, usually from a single fungal species, which are only effective in hydrolyzing polysaccharides following biomass pre-treatments. While the enzymatic mechanisms of lignocellulose degradation have been characterized in detail in individual microbial species, the microbial communities that efficiently break down plant materials in nature are species rich and secrete a myriad of enzymes to perform “community-level” metabolism of lignocellulose. Single-species approaches are, therefore, likely to miss important aspects of lignocellulose degradation that will be central to optimizing commercial processes.

Results
Here, we investigated the microbial degradation of wheat straw in liquid cultures that had been inoculated with wheat straw compost. Samples taken at selected time points were subjected to multi-omics analysis with the aim of identifying new microbial mechanisms for lignocellulose degradation that could be applied in industrial pre-treatment of feedstocks. The phylogenetic composition of the community, based on sequenced bacterial and eukaryotic ribosomal genes, showed a gradual decrease in complexity and diversity over time due to microbial enrichment. Taxonomic affiliation of bacterial species showed dominance of Bacteroidetes and Proteobacteria and high relative abundance of the genera Asticcacaulis, Leadbetterella and Truepera. The eukaryotic members of the community were enriched in peritrich ciliates from the genus Telotrochidium, which thrived in the liquid cultures, whereas fungal species were present in low abundance. A targeted metasecretome approach, combined with metatranscriptomics analysis, identified 1127 proteins and showed the presence of numerous carbohydrate-active enzymes extracted from the biomass-bound fractions and from the culture supernatant. This revealed a wide array of hydrolytic cellulases, hemicellulases and carbohydrate-binding modules involved in lignocellulose degradation. The expression of these activities correlated with the changes in biomass composition observed by FTIR and ssNMR measurements.
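The reported decline in community complexity can be quantified with a standard diversity index computed per time point from the amplicon counts. The snippet below is a generic sketch with a hypothetical OTU table, not data from this study.

```python
# Generic sketch: Shannon diversity per time point from an OTU count table.
# The counts below are hypothetical and only illustrate the enrichment pattern.
import numpy as np

def shannon_diversity(counts):
    """Shannon index H' = -sum(p_i * ln p_i) over taxa with nonzero counts."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log(p)).sum())

# Rows: taxa (OTUs); columns: successive sampling time points.
otu_table = np.array([
    [120,  40,  10],   # taxon that declines as enrichment proceeds
    [ 80, 200, 400],   # taxon that comes to dominate the enriched culture
    [ 60,  50,  20],
    [ 40,  10,   0],
])

for t, counts in enumerate(otu_table.T):
    print(f"time point {t}: H' = {shannon_diversity(counts):.2f}")
# A falling H' across time points mirrors the loss of diversity during enrichment.
```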

Conclusions
A combination of mass spectrometry-based proteomics coupled with metatranscriptomics has enabled the identification of a large number of lignocellulose-degrading enzymes that can now be further explored for the development of improved enzyme cocktails for the treatment of plant-based feedstocks. In addition to the expected carbohydrate-active enzymes, our studies reveal a large number of unknown proteins, some of which may play a crucial role in community-based lignocellulose degradation.
This work was funded by Biotechnology and Biological Sciences Research Council (BBSRC) Grants BB/1018492/1, BB/K020358/1 and BB/P027717/1, the BBSRC Network in Biotechnology and Bioenergy BIOCATNET and São Paulo Research Foundation (FAPESP) Grant 10/52362-5. ERdA thanks EMBRAPA Instrumentation São Carlos and Dr. Luiz Alberto Colnago for providing the NMR facility and CNPq Grant 312852/2014-2. The authors would like to thank Deborah Rathbone and Susan Heywood from the Biorenewables Development Centre for technical assistance in rRNA amplicon sequencing.
Inheritance of deleterious mutations at both BRCA1 and BRCA2 in an international sample of 32,295 women
Background: Most BRCA1 or BRCA2 mutation carriers have inherited a single (heterozygous) mutation. Transheterozygotes (TH) who have inherited deleterious mutations in both BRCA1 and BRCA2 are rare, and the consequences of transheterozygosity are poorly understood.
Methods: From 32,295 female mutation carriers, we identified 93 TH (0.3 %). "Cases" were defined as TH, and "controls" were carriers of single mutations at BRCA1 (SH1) or BRCA2 (SH2). Matched SH1 "controls" carried a BRCA1 mutation found in the TH "case". Matched SH2 "controls" carried a BRCA2 mutation found in the TH "case". After matching the TH carriers with SH1 or SH2, 91 TH were matched to 9316 SH1, and 89 TH were matched to 3370 SH2.
Results: The majority of TH (45.2 %) involved the three common Jewish mutations. TH were more likely than SH1 and SH2 women to have ever been diagnosed with breast cancer (BC; p = 0.002). TH were more likely to be diagnosed with ovarian cancer (OC) than SH2 (p = 0.017), but not SH1. Age at BC diagnosis was the same in TH vs. SH1 (p = 0.231), but was on average 4.5 years younger in TH than in SH2 (p < 0.001). BC in TH was more likely to be estrogen receptor (ER) positive (p = 0.010) or progesterone receptor (PR) positive (p = 0.013) than in SH1, but less likely to be ER positive (p < 0.001) or PR positive (p = 0.012) than in SH2. Among 15 tumors from TH patients, there was no clear pattern of loss of heterozygosity (LOH) for BRCA1 or BRCA2 in either BC or OC.
Conclusions: Our observations suggest that clinical TH phenotypes resemble SH1. However, TH breast tumor marker characteristics are phenotypically intermediate to SH1 and SH2.
ACA and the CIMBA data management are funded by Cancer Research UK (C12292/A20861 and C12292/A11174). TRR was supported by R01-CA083855, R01-CA102776, and P50-CA083638. KLN, TMF, and SMD are supported by the Basser Research Center at the University of Pennsylvania. BP is supported by R01-CA112520. Cancer Research UK provided financial support for this work. ACA is a Senior Cancer Research UK Cancer Research Fellow. DFE is a Cancer Research UK Principal Research Fellow. Tumor analysis was funded by STOP CANCER (to SJR). Study-specific acknowledgements are as provided in the manuscript.
Outcomes of an inpatient refeeding protocol in youth with anorexia nervosa: Rady Children’s Hospital San Diego/University of California, San Diego
BACKGROUND: Current guidelines for nutritional rehabilitation in hospitalized restrictive eating disorder patients recommend a cautious approach to refeeding. Several studies suggest that higher calorie diets may be safe and effective, but these have traditionally excluded severely malnourished patients. The goal of this study was to evaluate the safety of a higher calorie nutritional rehabilitation protocol (NRP) in a broad sample of inpatients with restrictive eating disorders, including those who were severely malnourished. METHODS: A retrospective chart review was conducted among eating disorder inpatients admitted between January 2015 and March 2016. Patients were started on a lower calorie diet (≤1500 kcal/day) or a higher calorie diet (≥1500 kcal/day); the calorie prescription on admission was based on physician clinical judgement. The sample included patients aged 8–20 years with any DSM-5 restrictive eating disorder. Those who were severely malnourished (<75% expected body weight [EBW]) or required tube feeding during admission were included. Multivariable regression models were used to determine whether the level of nutritional rehabilitation was associated with hypophosphatemia, hypomagnesemia, or hypokalemia. RESULTS: The sample included 87 patients; mean age was 14.4 years (S.D. 32.7); 29% were <75% EBW. The majority (75.8%) were started on higher calorie diets (mean 1781 kcal/day). Controlling for rate of calorie change, initial %EBW, age, race/ethnicity, insurance, diagnosis, and NG/NJ tube placement, higher calorie diets were not associated with hypophosphatemia, hypomagnesemia, or hypokalemia on admission or within the first 72 h. Increased risk of hypophosphatemia on admission was associated with lower baseline %EBW. CONCLUSION: A higher calorie NRP was tolerated in this broad population of inpatients with restrictive eating disorders. Lower %EBW on admission was a more important predictor of hypophosphatemia than initial calorie level. Larger studies are required to demonstrate the safety of higher calorie diets in severely malnourished patients.
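The covariate-adjusted comparison described here corresponds to a standard multivariable logistic model of each electrolyte abnormality on calorie group plus the listed covariates. The sketch below is illustrative only: the data are simulated and the variable names are placeholders, not the study's dataset.

```python
# Illustrative only: covariate-adjusted logistic regression of hypophosphatemia on
# calorie group. Data are simulated; variable names are placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 300
chart = pd.DataFrame({
    "higher_calorie": rng.integers(0, 2, n),   # 1 = started on >=1500 kcal/day
    "ebw_pct": rng.normal(85, 10, n),          # % expected body weight on admission
    "age": rng.uniform(8, 20, n),
    "tube_fed": rng.integers(0, 2, n),
})
# Simulate risk driven mainly by lower %EBW; calorie group has no simulated effect.
logit_p = 4.5 - 0.06 * chart["ebw_pct"]
chart["hypophosphatemia"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

fit = smf.logit("hypophosphatemia ~ higher_calorie + ebw_pct + age + tube_fed",
                data=chart).fit(disp=0)
print(np.exp(fit.params))   # adjusted odds ratios for each term
```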
