Exploring the Use of Cost-Benefit Analysis to Compare Pharmaceutical Treatments for Menorrhagia
Background: The extra-welfarist theoretical framework tends to focus on health-related quality of life, whilst the welfarist framework captures a wider notion of well-being. EQ-5D and SF-6D are commonly used to value outcomes in chronic conditions with episodic symptoms, such as heavy menstrual bleeding (clinically termed menorrhagia). Because of their narrow health focus and the condition's periodic nature, these measures may be unsuitable. A viable alternative measure is willingness to pay (WTP), from the welfarist framework. Objective: We explore the use of WTP in a preliminary cost-benefit analysis comparing pharmaceutical treatments for menorrhagia.
Methods: A cost-benefit analysis was carried out based on an outcome of WTP. The analysis was set in UK primary care over a 24-month period, with a partial societal perspective. Ninety-nine women completed a WTP exercise from the ex-ante (pre-treatment/condition) perspective. Maximum average WTP values were elicited for two pharmaceutical treatments, the levonorgestrel-releasing intrauterine system (LNG-IUS) and oral treatment. Cost data were offset against WTP and the net present value derived for each treatment. Qualitative information explaining the WTP values was also collected.
Results: Oral treatment was indicated to be the more cost-beneficial intervention, costing £107 less than LNG-IUS and generating £7 more in benefits. The mean incremental net present value for oral treatment compared with LNG-IUS was £113. The WTP approach proved acceptable, as very few protest responses and non-responses were observed. Conclusion: The preliminary cost-benefit analysis results recommend oral treatment as the first-line treatment for menorrhagia. The WTP approach is a feasible alternative to the conventional EQ-5D/SF-6D approaches and offers advantages by capturing benefits beyond health, which is particularly relevant in menorrhagia.
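As a worked illustration of the arithmetic behind these figures, the sketch below recomputes the incremental net present value from the reported cost and benefit differences. The variable names are assumptions, and the small gap to the reported £113 presumably reflects rounding of the underlying means before the differences were quoted.

```python
# Minimal sketch of a WTP-based cost-benefit tally; figures are taken from
# the abstract, but the code and names are illustrative, not the authors'.

def net_present_value(wtp: float, cost: float) -> float:
    """Net present value of a treatment: benefit (WTP) minus cost."""
    return wtp - cost

# Reported differences: oral treatment cost £107 less and generated
# £7 more benefit than LNG-IUS.
delta_cost = -107.0  # oral minus LNG-IUS, GBP
delta_wtp = 7.0      # oral minus LNG-IUS, GBP

incremental_npv = delta_wtp - delta_cost  # 7 - (-107) = 114
print(f"Incremental NPV (oral vs LNG-IUS): £{incremental_npv:.0f}")
# The abstract reports £113; the £1 gap is presumably rounding.
```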
The effect of intervertebral cartilage on neutral posture and range of motion in the necks of sauropod dinosaurs
The necks of sauropod dinosaurs were a key factor in their evolution. The habitual posture and range of motion of these necks have been controversial, and computer-aided studies have argued for an obligatory sub-horizontal pose. However, such studies are compromised by their failure to take into account the important role of intervertebral cartilage. This cartilage takes very different forms in different animals: mammals and crocodilians have intervertebral discs, while birds have synovial joints in their necks. The form and thickness of cartilage varies significantly even among closely related taxa, and we cannot yet tell whether the neck joints of sauropods more closely resembled those of birds or mammals. Inspection of CT scans showed cartilage:bone ratios of 4.5% for Sauroposeidon and about 20% and 15% for two juvenile Apatosaurus individuals. In extant animals, this ratio varied from 2.59% for the rhea to 24% for a juvenile giraffe. It is not yet possible to disentangle ontogenetic and taxonomic signals, but mammal cartilage is generally three times as thick as that of birds. Our most detailed work, on a turkey, yielded a cartilage:bone ratio of 4.56%. Articular cartilage also added 11% to the length of the turkey's zygapophyseal facets. Simple image manipulation suggests that incorporating 4.56% of neck cartilage into an intervertebral joint of a turkey raises the neutral posture by 15°. If this were also true of sauropods, the true neutral pose of the neck would be much higher than has been depicted. An additional 11% of zygapophyseal facet length translates to 11% more range of motion at each joint. More precise quantitative results must await detailed modelling. In summary, including cartilage in our models of sauropod necks shows that they were longer, more elevated and more flexible than previously recognised.
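The cartilage corrections are simple proportional arithmetic. A hedged sketch applying the reported turkey figures follows; the baseline range-of-motion value is hypothetical, used only to show how the 11% facet-length gain scales joint excursion.

```python
# Minimal sketch of the cartilage corrections; the reported values come
# from the abstract, and the baseline ROM is a hypothetical placeholder.

def cartilage_bone_ratio(cartilage_thickness: float, bone_length: float) -> float:
    """Cartilage thickness expressed as a fraction of bony centrum length."""
    return cartilage_thickness / bone_length

neck_ratio = 0.0456   # reported turkey cartilage:bone ratio (4.56%)
facet_gain = 0.11     # reported lengthening of zygapophyseal facets (11%)

# Effect 1 (reported): inserting 4.56% cartilage into a turkey
# intervertebral joint raises the neutral posture by about 15° per joint.
neutral_lift_per_joint_deg = 15.0

# Effect 2: range of motion scales with facet length, so 11% longer facets
# yield ~11% more excursion at each joint.
base_rom_deg = 30.0  # hypothetical bony-only ROM for one joint
rom_with_cartilage = base_rom_deg * (1 + facet_gain)
print(f"ROM per joint: {base_rom_deg}° -> {rom_with_cartilage:.1f}°")
```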
The systematic guideline review: method, rationale, and test on chronic heart failure
Background: Evidence-based guidelines have the potential to improve healthcare. However, their de novo development requires substantial resources, especially for complex conditions, and adaptation may be biased by contextually influenced recommendations in source guidelines. In this paper we describe a new approach to guideline development, the systematic guideline review (SGR) method, and its application in the development of an evidence-based guideline for family physicians on chronic heart failure (CHF).
Methods: A systematic search for guidelines was carried out. Evidence-based guidelines on CHF management in adults in ambulatory care published in English or German between the years 2000 and 2004 were included. Guidelines on acute or right heart failure were excluded. Eligibility was assessed by two reviewers, methodological quality of selected guidelines was appraised using the AGREE instrument, and a framework of relevant clinical questions for diagnostics and treatment was derived. Data were extracted into evidence tables, systematically compared by means of a consistency analysis and synthesized in a preliminary draft. Most relevant primary sources were re-assessed to verify the cited evidence. Evidence and recommendations were summarized in a draft guideline.
Results: Of the 16 included guidelines, five were of good quality. A total of 35 recommendations were systematically compared: 25/35 were consistent, 9/35 inconsistent, and 1/35 un-rateable (derived from a single guideline). Of the 25 consistent recommendations, 14 were based on consensus, seven on evidence, and four differed in grading. Major inconsistencies were found in 3/9 of the inconsistent recommendations. We re-evaluated the evidence for 17 recommendations (evidence-based, differing evidence levels and minor inconsistencies); the majority were congruent. Incongruity was found where the stated evidence could not be verified in the cited primary sources, or where the evaluation in the source guidelines focused on treatment benefits and underestimated the risks. The draft guideline was completed in 8.5 man-months. The main limitation of this study was the lack of a second reviewer.
Conclusion: The systematic guideline review, including framework development, consistency analysis and validation, is an effective, valid, and resource-saving approach to the development of evidence-based guidelines.
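The consistency analysis reduces to careful bookkeeping over the compared recommendations. A minimal sketch of that tally, using the counts reported above (the data structure itself is an assumption):

```python
# Minimal sketch of the consistency-analysis bookkeeping; the category
# counts follow the abstract, but the representation is assumed.
from collections import Counter

# One label per compared recommendation (35 in the CHF guideline review).
labels = ["consistent"] * 25 + ["inconsistent"] * 9 + ["un-rateable"] * 1

tally = Counter(labels)
for category, n in tally.items():
    print(f"{category}: {n}/{len(labels)}")

# Within the 25 consistent recommendations, the abstract further splits:
# 14 consensus-based, 7 evidence-based, 4 differing only in grading.
assert 14 + 7 + 4 == tally["consistent"]
```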
The effect of temperature, gradient and load carriage on oxygen consumption, posture and gait characteristics
Purpose: The purpose of this experiment was to evaluate the effect of load carriage across a range of temperatures, to establish the interaction between cold exposure, the magnitude of change from unloaded to loaded walking, and gradient. Methods: Eleven participants (19-27 years) provided written informed consent before performing six randomly ordered walking trials at six temperatures (20°C, 10°C, 5°C, 0°C, -5°C and -10°C). Trials involved two unloaded walking bouts before and after loaded walking (18.2 kg) at 4 km.hr⁻¹, on 0% and 10% gradients, in 4-minute bouts. Results: The change in absolute oxygen consumption (V̇O₂) from the first unloaded bout to loaded walking was similar across all six temperatures. In the second unloaded bout, V̇O₂ at both -5°C and -10°C was greater than in the first. At -10°C, V̇O₂ increased from 1.60 ± 0.30 L.min⁻¹ to 1.89 ± 0.51 L.min⁻¹. Regardless of temperature, gradient had a greater effect on V̇O₂ and heart rate (HR) than backpack load. HR was unaffected by temperature. Stride length decreased with decreasing temperature, but trunk forward lean was greater during cold exposure. Conclusion: Decreased ambient temperature did not influence the magnitude of change in V̇O₂ from unloaded to loaded walking. However, in cold temperatures V̇O₂ was significantly higher than in warm conditions. The increased V̇O₂ in colder temperatures at the same exercise intensity is predicted to lead to earlier onset of fatigue and cessation of exercise. These results highlight the need to consider both appropriate clothing and fitness during cold exposure.
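The relative size of the cold-induced rise in V̇O₂ can be recovered directly from the reported means. A minimal sketch, using only the figures quoted in the abstract:

```python
# Minimal sketch: relative rise in V̇O₂ at -10°C between the two unloaded
# bouts; the means are from the abstract, the percentage is derived here.

vo2_first_unloaded = 1.60   # L·min⁻¹, first unloaded bout at -10°C
vo2_second_unloaded = 1.89  # L·min⁻¹, second unloaded bout at -10°C

rise = vo2_second_unloaded - vo2_first_unloaded
pct = 100 * rise / vo2_first_unloaded
print(f"V̇O₂ rose by {rise:.2f} L·min⁻¹ ({pct:.0f}%) at -10°C")  # ~18%
```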
Gaze-grasp coordination in obstacle avoidance: differences between binocular and monocular viewing
Most adults can skillfully avoid potential obstacles when acting in everyday cluttered scenes. We examined how gaze and hand movements are normally coordinated for obstacle avoidance and whether they are altered when binocular depth information is unavailable. Visual fixations and hand movement kinematics were recorded simultaneously while 13 right-handed subjects reached to grasp, with a precision grip, a cylindrical household object presented alone or with a potential obstacle (a wine glass) located to its left (the thumb's grasp side), to its right, or just behind it (both closer to the finger's grasp side), using binocular or monocular vision. Gaze and hand movement strategies differed significantly by view and obstacle location. With binocular vision, initial fixations were near the target's centre of mass (COM) around the time of hand movement onset, but usually shifted to end just above the thumb's grasp site at initial object contact, which was mainly made by the thumb, consistent with selecting this digit for guiding the grasp. Across all trials, this strategy was associated with faster hand movements and better end-point grip precision than monocular viewing, during which subjects usually continued to fixate the target closer to its COM despite a similar prevalence of thumb-first contacts. Subjects looked directly at the obstacle at each location on a minority of trials, and their overall fixations on the target were somewhat biased towards the grasp side nearest to it; these gaze behaviours were particularly marked on monocular trials with the obstacle behind the target, which also commonly ended in finger-first contact. Under both views, subjects avoided colliding with the wine glass when it was on the right (finger side) of the workspace by producing slower and straighter reaches; this location and the behind-obstacle location also resulted in 'safer' (i.e. narrower) peak grip apertures and longer deceleration times than when the goal object was alone or the obstacle was on its thumb side. However, monocular reach paths were more variable, and deceleration times were selectively prolonged on finger-side and behind-obstacle trials, with the latter condition further resulting in selectively increased grip closure times and corrections. Binocular vision thus provided added advantages for collision avoidance, known to require intact dorsal cortical stream processing, particularly when the target of the grasp and the potential obstacle were fairly closely separated in depth. Different accounts of the altered monocular gaze behaviour converge on the conclusion that additional perceptual and/or attentional resources are likely engaged when continuous binocular depth information is unavailable. Implications for people lacking binocular stereopsis are briefly considered.
Impairment of Auditory-Motor Timing and Compensatory Reorganization after Ventral Premotor Cortex Stimulation
Integrating auditory and motor information often requires precise timing, as in speech and music. In humans, the position of the ventral premotor cortex (PMv) in the dorsal auditory stream renders this area a node for auditory-motor integration. Yet it remains unknown whether the PMv is critical for auditory-motor timing, and which activity increases help to preserve task performance following its disruption. Sixteen healthy volunteers participated in two sessions, with fMRI measured at baseline and following repetitive transcranial magnetic stimulation (rTMS) of either the left PMv or a control region. Subjects synchronized left or right finger tapping to sub-second beat rates of auditory rhythms in the experimental task, and produced self-paced tapping during spectrally matched auditory stimuli in the control task. Left PMv rTMS impaired auditory-motor synchronization accuracy in the first sub-block following stimulation (p<0.01, Bonferroni corrected), but spared motor timing and attention to task. Task-related activity increased in the homologue right PMv, but did not predict the behavioral effect of rTMS. In contrast, the anterior midline cerebellum showed the most pronounced activity increase in less impaired subjects. The present findings suggest a critical role of the left PMv in feed-forward computations enabling accurate auditory-motor timing, which can be compensated by activity modulations in the cerebellum, but not in the homologue region contralateral to stimulation.
Laparoscopy in management of appendicitis in high-, middle-, and low-income countries: a multicenter, prospective, cohort study.
BACKGROUND: Appendicitis is the most common abdominal surgical emergency worldwide. Differences between high- and low-income settings in the availability of laparoscopic appendectomy, alternative management choices, and outcomes are poorly described. The aim was to identify variation in the surgical management and outcomes of appendicitis within low-, middle-, and high-Human Development Index (HDI) countries worldwide. METHODS: This is a multicenter, international prospective cohort study. Consecutive sampling of patients undergoing emergency appendectomy over 6 months was conducted. Follow-up lasted 30 days. RESULTS: 4546 patients from 52 countries underwent appendectomy (2499 high-, 1540 middle-, and 507 low-HDI groups). After adjustment, surgical site infection (SSI) rates were higher in low-HDI (OR 2.57, 95% CI 1.33-4.99, p = 0.005) but not middle-HDI countries (OR 1.38, 95% CI 0.76-2.52, p = 0.291), compared with high-HDI countries. A laparoscopic approach was common in high-HDI countries (1693/2499, 67.7%), but infrequent in low-HDI (41/507, 8.1%) and middle-HDI (132/1540, 8.6%) groups. After accounting for case-mix, laparoscopy was still associated with fewer overall complications (OR 0.55, 95% CI 0.42-0.71, p < 0.001) and SSIs (OR 0.22, 95% CI 0.14-0.33, p < 0.001). In propensity-score matched groups within low-/middle-HDI countries, laparoscopy remained associated with fewer overall complications (OR 0.23, 95% CI 0.11-0.44) and SSIs (OR 0.21, 95% CI 0.09-0.45). CONCLUSION: A laparoscopic approach is associated with better outcomes, and its availability appears to differ by country HDI. Despite the profound clinical, operational, and financial barriers to its widespread introduction, laparoscopy could significantly improve outcomes for patients in low-resource environments. TRIAL REGISTRATION: NCT02179112.
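The study's odds ratios come from adjusted models and propensity-score matching, but the underlying quantity is the familiar 2x2-table odds ratio with a Wald confidence interval. A hedged sketch with invented counts (not the study's raw data) follows:

```python
# Minimal sketch: unadjusted odds ratio and Wald 95% CI from a 2x2 table.
# The counts below are invented for illustration; the study's quoted ORs
# came from multivariable models and matching, not a raw table like this.
import math

def odds_ratio_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """OR and Wald CI for table [[a, b], [c, d]]: rows are exposed/unexposed
    groups, columns are event/no-event counts."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
    lo = math.exp(math.log(or_) - z * se_log_or)
    hi = math.exp(math.log(or_) + z * se_log_or)
    return or_, lo, hi

# Hypothetical counts: SSI events in low- vs high-HDI groups.
or_, lo, hi = odds_ratio_ci(a=30, b=477, c=60, d=2439)
print(f"OR {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```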
Task-Related Effects on the Temporal and Spatial Dynamics of Resting-State Functional Connectivity in the Default Network
Recent evidence points to two potentially fundamental aspects of the default network (DN) that have been relatively understudied. One is the temporal nature of the functional interactions among nodes of the network in the resting state, usually assumed to be static. The second is the possible influence of previous brain states on the spatial patterns (i.e., the brain regions involved) of functional connectivity (FC) in the DN at rest. The goal of the current study was to investigate modulations in both the spatial and temporal domains. We compared the resting-state FC of the DN in two runs separated by a 45-minute interval containing cognitive task execution. We used partial least squares (PLS), which allowed us to identify FC spatiotemporal patterns in the two runs and to determine differences between them. Our results revealed two primary modes of FC, assessed using a posterior cingulate seed: a robust correlation among DN regions that is stable both spatially and temporally, and a second pattern that is reduced in spatial extent and more variable temporally after cognitive tasks, switching between connectivity with certain DN regions and connectivity with other areas, including some task-related regions. The DN therefore seems to exhibit two simultaneous FC dynamics at rest. The first is spatially invariant and insensitive to previous brain states, suggesting that the DN maintains some temporally stable functional connections. The second is more variable and is seen more strongly when the resting state follows a period of task execution, suggesting an after-effect of the cognitive activity engaged during the task that carries over into resting-state periods.
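Underlying the PLS analysis is the basic seed-based FC computation: correlating a posterior cingulate seed time series with every other region in each run and comparing the resulting maps. A minimal sketch with simulated data (array shapes and names are assumptions; the study's actual analysis used PLS, not this direct subtraction):

```python
# Minimal sketch of seed-based functional connectivity with simulated data.
import numpy as np

def seed_fc(seed_ts: np.ndarray, region_ts: np.ndarray) -> np.ndarray:
    """Pearson correlation of a seed time series (T,) with each region's
    time series (T, R), returning an FC vector of length R."""
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    regions = (region_ts - region_ts.mean(axis=0)) / region_ts.std(axis=0)
    return (seed @ regions) / len(seed)

rng = np.random.default_rng(0)
T, R = 200, 90                       # time points, regions (illustrative)
run1 = rng.standard_normal((T, R))   # resting-state run before the tasks
run2 = rng.standard_normal((T, R))   # resting-state run after the tasks
pcc1, pcc2 = run1[:, 0], run2[:, 0]  # posterior cingulate seed column

# Compare pre- vs post-task seed maps region by region.
fc_change = seed_fc(pcc2, run2) - seed_fc(pcc1, run1)
print("largest pre/post-task FC shifts:", np.argsort(-np.abs(fc_change))[:5])
```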
Best Practices in Researching Service-Learning at Community Colleges
In recent years, an increasing number of community colleges have integrated some form of service-learning into their programs or courses with the idea that it will promote civic engagement, increase student satisfaction with their courses and college experience as a whole, and improve learning outcomes. A substantial body of research has been published on service-learning programs and outcomes at four-year institutions, but studies of service-learning at community colleges remain scarce. Because community colleges serve a purpose distinct from that of four-year colleges and universities, both in their mission and often in the students they serve, research on service-learning at community colleges should also be distinct from investigations at the four-year level.
Novel computational methods for increasing PCR primer design effectiveness in directed sequencing
Background: Polymerase chain reaction (PCR) is used in directed sequencing for the discovery of novel polymorphisms. As the first step in PCR-directed sequencing, effective PCR primer design is crucial for obtaining high-quality sequence data for target regions. Since current computational primer design tools are not fully tuned to stable underlying laboratory protocols, researchers may still be forced to iteratively optimize protocols for failed amplifications after the primers have been ordered. Furthermore, potentially identifiable factors which contribute to PCR failures have yet to be elucidated. This inefficient approach to primer design is further intensified in a high-throughput laboratory, where hundreds of genes may be targeted in one experiment. Results: We have developed a fully integrated computational PCR primer design pipeline that plays a key role in our high-throughput directed sequencing pipeline. Investigators may specify target regions defined through a rich set of descriptors, such as Ensembl accessions and arbitrary genomic coordinates. Primer pairs are then selected computationally to produce a minimal amplicon set capable of tiling across the specified target regions. As part of the tiling process, primer pairs are computationally screened to meet the criteria for success with one of two PCR amplification protocols. In the process of improving our sequencing success rate, which currently exceeds 95% for exons, we have discovered novel and accurate computational methods capable of identifying primers that may lead to PCR failures. We reveal the laboratory protocols and their associated, empirically determined computational parameters, and describe the novel computational methods which may benefit others in future primer design research. Conclusion: The high-throughput PCR primer design pipeline has been very successful in providing the basis for high-quality directed sequencing results and in minimizing costs associated with labor and reprocessing. The modular architecture of the primer design software has made it possible to readily integrate additional primer critique tests based on iterative feedback from the laboratory. As a result, the primer design software, coupled with the laboratory protocols, serves as a powerful tool for low- and high-throughput primer design to enable successful directed sequencing.
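To make the screening step concrete, the sketch below applies two textbook primer checks, GC content and the Wallace-rule melting temperature, to candidate oligos. The thresholds and the Wallace rule are generic heuristics assumed here for illustration, not the pipeline's empirically tuned parameters.

```python
# Minimal sketch of a computational primer screen; thresholds are generic
# assumptions, not the paper's empirically determined parameters.

def gc_fraction(primer: str) -> float:
    """Fraction of G/C bases in the primer."""
    p = primer.upper()
    return (p.count("G") + p.count("C")) / len(p)

def wallace_tm(primer: str) -> float:
    """Wallace rule: Tm = 2(A+T) + 4(G+C), a rough estimate for short oligos."""
    p = primer.upper()
    return 2 * (p.count("A") + p.count("T")) + 4 * (p.count("G") + p.count("C"))

def passes_screen(primer: str, gc=(0.40, 0.60), tm=(52.0, 62.0)) -> bool:
    """Accept primers whose GC content and Tm fall inside target windows."""
    return gc[0] <= gc_fraction(primer) <= gc[1] and tm[0] <= wallace_tm(primer) <= tm[1]

for candidate in ["ATGCGTACGTTAGCCTAGCA", "AAAATTTTAAAATTTTAAAA"]:
    print(candidate, "->", "keep" if passes_screen(candidate) else "reject")
```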