Synergies for Improving Oil Palm Production and Forest Conservation in Floodplain Landscapes
Lowland tropical forests are increasingly threatened with conversion to oil palm as global demand and high profits drive crop expansion throughout the world's tropical regions. Yet landscapes are not homogeneous, and regional constraints dictate land suitability for this crop. We conducted a regional study to investigate spatial and economic components of forest conversion to oil palm within a tropical floodplain in the Lower Kinabatangan, Sabah, Malaysian Borneo. The Kinabatangan ecosystem harbours significant biodiversity, including globally threatened species, but has suffered forest loss and fragmentation. We mapped the oil palm and forested landscapes (using object-based image analysis, classification and regression tree analysis, and on-screen digitising of high-resolution imagery) and undertook economic modelling. Within the study region (520,269 ha), 250,617 ha is cultivated with oil palm, 77% of which has high Net Present Value (NPV) estimates (637/ha/yr); but 20.5% is under-producing. In fact, 6.3% (15,810 ha) of oil palm is commercially redundant (with a negative NPV of -65/ha/yr) due to palm mortality from flood inundation. These areas would previously have been important riparian or flooded forest types. Moreover, 30,173 ha of unprotected forest remain, and despite its value for connectivity and biodiversity, 64% is allocated for future oil palm. However, we estimate that at minimum 54% of these forests are unsuitable for this crop due to inundation events. If conversion to oil palm occurs, we predict a further 16,207 ha will become commercially redundant. This means that over 32,000 ha of forest within the floodplain would have been converted for little or no financial gain, yet at significant cost to the ecosystem. Our findings have globally relevant implications for similar floodplain landscapes undergoing forest transformation to agriculture such as oil palm.
Understanding landscape-level constraints to this crop, and transferring these into policy and practice, may provide conservation and economic opportunities within these seemingly high-opportunity-cost landscapes.
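The economic modelling above turns on per-hectare net present value. A minimal, hypothetical sketch of the comparison (the discount rate and yearly cash flows here are illustrative assumptions, not the study's figures):

```python
def npv(rate, cash_flows):
    """Net present value of yearly cash flows,
    with cash_flows[0] occurring at year 0 (undiscounted)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

# Hypothetical per-hectare cash flows: establishment cost, then yearly returns.
productive = npv(0.10, [-3000] + [700] * 25)  # palms thrive
flooded = npv(0.10, [-3000] + [50] * 25)      # palms die from inundation
```

A plot whose palms repeatedly drown never recoups the establishment cost, which is how a cultivated stand can carry a negative NPV, as the abstract reports for flood-prone areas.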
Tailoring hyper-heuristics to specific instances of a scheduling problem using affinity and competence functions
Hyper-heuristics are high-level heuristics which coordinate lower-level ones to solve a given problem. Low-level heuristics, however, are not all equally competent at solving the given problem, and some do not work together as well as others. Hence the idea of measuring how good they are at solving the problem (their competence) and how well they work together (their affinity). Models of the affinity and competence properties are suggested and evaluated using previous information on the performance of the simple low-level heuristics. The resulting model values are used to improve the performance of the hyper-heuristic by tailoring it not only to the specific problem but to the specific instance being solved. The test case is a hard combinatorial problem, namely the Hybrid Flow Shop scheduling problem. Numerical results on randomly generated as well as real-world instances are included.
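One way to read the competence/affinity idea: when the hyper-heuristic picks the next low-level heuristic, it can score each candidate by blending its standalone competence with its affinity to the heuristic just applied. A hypothetical sketch (the linear blend, weights, and heuristic names are assumptions, not the paper's actual model):

```python
def select_next(prev, heuristics, competence, affinity, w=0.5):
    """Pick the next low-level heuristic by blending standalone
    competence with affinity to the previously applied one."""
    def score(h):
        a = affinity.get((prev, h), 0.0) if prev is not None else 0.0
        return w * competence[h] + (1 - w) * a
    return max(heuristics, key=score)

# Example: 'shift' works especially well straight after 'swap'.
competence = {"swap": 0.8, "shift": 0.6}
affinity = {("swap", "shift"): 0.9}
nxt = select_next("swap", ["swap", "shift"], competence, affinity)
```

The blend lets a heuristic with modest standalone competence be chosen when it pairs well with its predecessor, which is the instance-tailoring behaviour the abstract describes.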
COVID-19 Vaccine Booster Hesitancy among the Elderly in Malaysian Residential Care Homes: A Cross-Sectional Study in Klang Valley
The elderly are considered a high-risk group for severe outcomes and death from COVID-19 infection. Given the emergence of new COVID-19 variants and the waning over time of vaccine-induced immunity, booster doses of the vaccine have been advocated for those at risk to stay protected. This study aimed to determine the factors associated with hesitancy toward the second booster of the COVID-19 vaccine among the elderly residing in residential care homes. A cross-sectional study was conducted in 24 residential care homes in the Klang Valley using a face-to-face interview questionnaire. The study population included individuals aged 60 and above who had been fully vaccinated against COVID-19 up to the first booster dose. Second-booster hesitancy was assessed using the Oxford Vaccine Hesitancy Scale with seven items, the aggregate score of which ranges from seven to thirty-five; the higher the score, the greater the level of hesitancy. Multivariate linear regression was employed to determine factors associated with second-booster hesitancy, and a p-value < 0.05 was considered statistically significant. Data from 401 elderly individuals were included for analysis. The mean score of the Oxford Vaccine Hesitancy Scale was 21.6 ± 7.2. Predictors of second-booster hesitancy were identified. Age, Indian ethnicity, being a recipient of the Sinovac vaccine as the first COVID-19 booster, experiencing the death of close friends or immediate family members following COVID-19 vaccination, and negative messages (indicating that taking a booster dose is harmful) from caregivers, friends, or family members were found to be associated with an increased second-booster-hesitancy score. Conversely, positive messages (indicating that taking a booster is helpful) from the government and from caregivers, friends, or family members were identified as predictors associated with a reduction in the second-booster-hesitancy score.
While vaccines effectively combat severe COVID-19, the majority of the elderly hesitate before taking the second booster. Their hesitancy, rooted in a perception of low self-risk and reliance on protection from the initial doses, emphasizes the need for intervention by relevant bodies. Taking into consideration the risk, albeit relatively low, of potentially serious side effects following COVID-19 vaccinations, it is imperative that transparent, appropriate, and positive messaging regarding booster vaccines, particularly in the context of the elderly from residential care homes, be available. Encouraging this high-risk group to embrace the second booster aligns with the goal of maximizing protection within the vulnerable elderly population.
Mental turmoil, suicide risk, illness perception, and temperament, and their impact on quality of life in chronic daily headache
To evaluate the relationship among quality of life, temperament, illness perception, and mental turmoil in patients affected by chronic daily headache with concomitant medication overuse headache. Participants were 116 consecutive adult outpatients admitted to the Department of General Medicine of the Sant'Andrea Hospital in Rome between January 2007 and December 2007 with a diagnosis of chronic daily headache (illness duration >5 years). Patients were administered the Temperament Evaluation of Memphis, Pisa, Paris and San Diego-autoquestionnaire version (TEMPS-A), the Beck Hopelessness Scale (BHS), the Hamilton Rating Scale for Depression (HAM-D), the Mini-International Neuropsychiatric Interview (MINI), the Revised Illness Perception Questionnaire (IPQ), the Suicide Score Scale (SSS), and the Quality of Life Index (QL-Index). Twenty-eight percent of the patients evidenced moderate to severe depression, and 35% evidenced severe hopelessness. Analyses also indicated that quality of life, temperament, illness perception, and psychological turmoil are associated. However, a hierarchical multivariate regression analysis with quality of life as the dependent variable indicated that only a model with mental turmoil variables fit the data; further, only MINI suicidal intent was associated with quality of life (standardized regression coefficient = −0.55; t = −3.06; P < 0.01). Suicide risk may play a central role in affecting the quality of life of patients with chronic headache. The investigation of the interplay of factors that precipitate suicide risk should include assessment of chronic headache and its effects on wellbeing.
First-Borns Carry a Higher Metabolic Risk in Early Adulthood: Evidence from a Prospective Cohort Study
Birth order has been associated with early growth variability and subsequent increased adiposity, but the consequent effects of increased fat mass on metabolic risk during adulthood have not been assessed. We aimed to quantify the metabolic risk in young adulthood of being first-born relative to those born second or subsequently. Body composition and metabolic risk were assessed in 2,249 men, aged 17-19 years, from a birth cohort in southern Brazil. Metabolic risk was assessed using a composite z-score integrating standardized measurements of blood pressure, total cholesterol, high density lipoprotein, triglycerides and fat mass. First-borns had a lower birth weight z-score (Δ = -0.25, 95% CI -0.35, -0.15, p<0.001) but showed greater weight gain during infancy (change in weight z-score from birth to 20 months: Δ = 0.39, 95% CI 0.28-0.50, p<0.0001) and had greater mean height (Δ = 1.2 cm, 95% CI 0.7-1.6, p<0.0001) and weight (Δ = 0.34 kg, 95% CI 0.13-0.55, p<0.002) at 43 months. This greater weight and height tracked into early adulthood, with first-borns being significantly taller, heavier and with significantly higher fat mass than later-borns. The metabolic risk z-score was significantly higher in first-borns. First-born status is associated with significantly elevated adiposity and metabolic risk in young adult men in Brazil. Our results, linking cardiovascular risk with life history variables, suggest that metabolic risk may be associated with the worldwide trend to smaller family size and may interact with changes in behavioural or environmental risk factors.
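The composite metabolic-risk z-score used above standardises each marker across the sample and averages the resulting z-scores per subject. A minimal sketch of that construction (the marker names and values are illustrative, not study data):

```python
from statistics import mean, stdev

def composite_z(subjects):
    """Per-subject composite risk score: z-standardise each marker
    across the whole sample, then average a subject's z-scores."""
    markers = list(subjects[0])
    stats = {m: (mean(s[m] for s in subjects),
                 stdev(s[m] for s in subjects)) for m in markers}
    return [mean((s[m] - stats[m][0]) / stats[m][1] for m in markers)
            for s in subjects]

# Two illustrative subjects: systolic blood pressure and triglycerides only.
scores = composite_z([{"sbp": 110, "tg": 1.0},
                      {"sbp": 130, "tg": 2.0}])
```

Averaging z-scores puts markers with different units (mmHg, mmol/L, kg) on a common scale, so no single marker dominates the composite by virtue of its magnitude.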
Patient safety in primary care: a survey of general practitioners in the Netherlands
BACKGROUND: Primary care encompasses many different clinical domains and patient groups, which means that patient safety in primary care may be equally broad. Previous research on safety in primary care has focused on medication safety and incident reporting. In this study, the views of general practitioners (GPs) on patient safety were examined. METHODS: A web-based survey of a sample of GPs was undertaken. The items were derived from aspects of patient safety issues identified in a prior interview study. The questionnaire used 10 clinical cases and 15 potential risk factors to explore GPs' views on patient safety. RESULTS: A total of 68 GPs responded (51.5% response rate). None of the clinical cases was uniformly judged as particularly safe or unsafe by the GPs. Cases judged to be unsafe by a majority of the GPs concerned either the maintenance of medical records or prescription and monitoring of medication. Cases which only a few GPs judged as unsafe concerned hygiene, the diagnostic process, prevention and communication. The risk factors most frequently judged to constitute a threat to patient safety were a poor doctor-patient relationship, insufficient continuing education on the part of the GP and a patient age over 75 years. Language barriers and polypharmacy also scored high. Deviation from evidence-based guidelines and patient privacy in the reception/waiting room were not perceived as risk factors by most of the GPs. CONCLUSION: The views of GPs on safety and risk in primary care did not completely match those presented in published papers and policy documents. The GPs in the present study judged a broader range of factors than in previously published research on patient safety in primary care, including a poor doctor-patient relationship, to pose a potential threat to patient safety.
Other risk factors, such as infection prevention, deviation from guidelines and incident reporting, were judged by the GPs to be less relevant than policy makers consider them to be.
PPP1R12C Promotes Atrial Hypocontractility in Atrial Fibrillation
Background: Atrial fibrillation (AF), the most common sustained cardiac arrhythmia, increases thromboembolic stroke risk 5-fold. Although atrial hypocontractility contributes to stroke risk in AF, the molecular mechanisms reducing myofilament contractile function remain unknown. We tested the hypothesis that increased expression of PPP1R12C (protein phosphatase 1 regulatory subunit 12C), the PP1 (protein phosphatase 1) regulatory subunit targeting MLC2a (atrial myosin light chain 2), causes hypophosphorylation of MLC2a and results in atrial hypocontractility. Methods: Right atrial appendage tissues were isolated from human patients with AF versus sinus rhythm controls. Western blots, coimmunoprecipitation, and phosphorylation studies were performed to examine how the PP1c (PP1 catalytic subunit)-PPP1R12C interaction causes MLC2a dephosphorylation. In vitro studies of a pharmacological MRCK (myotonic dystrophy kinase-related Cdc42-binding kinase) inhibitor (BDP5290) in atrial HL-1 cells were performed to evaluate PP1 holoenzyme activity on MLC2a. Cardiac-specific lentiviral PPP1R12C overexpression was performed in mice to evaluate atrial remodeling with atrial cell shortening assays, echocardiography, and AF inducibility with electrophysiology studies. Results: In human patients with AF, PPP1R12C expression was increased 2-fold versus sinus rhythm controls (P=2.0×10⁻²; n=12 and 12 in each group) with >40% reduction in MLC2a phosphorylation (P=1.4×10⁻⁶; n=12 and 12 in each group). PPP1R12C-PP1c binding and PPP1R12C-MLC2a binding were significantly increased in AF (P=2.9×10⁻² and 6.7×10⁻³, respectively; n=8 and 8 in each group). In vitro studies utilizing drug BDP5290, which inhibits T560-PPP1R12C phosphorylation, demonstrated increased PPP1R12C binding with both PP1c and MLC2a and dephosphorylation of MLC2a.
Mice treated with lentiviral PPP1R12C vector demonstrated a 150% increase in left atrial size versus controls (P=5.0×10⁻⁶; n=12, 8, and 12), with reduced atrial strain and atrial ejection fraction. Pacing-induced AF in mice treated with lentiviral PPP1R12C vector was significantly higher than in controls (P=1.8×10⁻² and 4.1×10⁻², respectively; n=6, 6, and 5). Conclusions: Patients with AF exhibit increased levels of PPP1R12C protein compared with controls. PPP1R12C overexpression in mice increases PP1c targeting to MLC2a and causes MLC2a dephosphorylation, which reduces atrial contractility and increases AF inducibility. These findings suggest that PP1 regulation of sarcomere function at MLC2a is a key determinant of atrial contractility in AF.
Standards for clinical trials for treating TB
BACKGROUND: The value, speed of completion and robustness of the evidence generated by TB treatment trials could be improved by implementing standards for best practice. METHODS: A global panel of experts participated in a Delphi process, using a 7-point Likert scale to score and revise draft standards until consensus was reached. RESULTS: Eleven standards were defined: Standard 1, high-quality data on TB regimens are essential to inform clinical and programmatic management; Standard 2, the research questions addressed by TB trials should be relevant to affected communities, who should be included in all trial stages; Standard 3, trials should make every effort to be as inclusive as possible; Standard 4, the most efficient trial designs should be considered to improve the evidence base as quickly and cost-effectively as possible, without compromising quality; Standard 5, trial governance should be in line with accepted good clinical practice; Standard 6, trials should investigate and report strategies that promote optimal engagement in care; Standard 7, where possible, TB trials should include pharmacokinetic and pharmacodynamic components; Standard 8, outcomes should include frequency of disease recurrence and post-treatment sequelae; Standard 9, TB trials should aim to harmonise key outcomes and data structures across studies; Standard 10, TB trials should include biobanking; Standard 11, treatment trials should invest in capacity strengthening of local trial and TB programme staff. CONCLUSION: These standards should improve the efficiency and effectiveness of evidence generation, as well as the translation of research into policy and practice.
Hepatitis B virus genotypes/subgenotypes in voluntary blood donors in Makassar, South Sulawesi, Indonesia
Background: Hepatitis B virus (HBV) genotypes show varying geographic distributions. Molecular epidemiological study of HBV in particular areas of Indonesia is still limited. This study aimed to identify the prevalence of HBV genotypes/subgenotypes and of mutations in the basal core promoter (BCP) region among voluntary blood donors in Makassar, one of the biggest cities in the eastern part of Indonesia. A total of 214 hepatitis B surface antigen (HBsAg)-positive samples were enrolled in this study. HBV genotype/subgenotype was identified by a genotype-specific PCR method or by direct sequencing of the pre-S region. Mutations in the BCP were identified by direct sequencing of the corresponding region. Results: HBV/B and HBV/C were detected in 61.21% and 25.23% of the samples, while a mix of HBV/B and HBV/C was found in 12.62% of the samples. Based on the pre-S region, among HBV/B and HBV/C, HBV/B3 (95.00%) and HBV/C1 (58.82%) were predominant. Interestingly, HBV/D was identified in two samples (22.165.07 and 22.252.07). Complete genome sequences of the two HBV/D strains demonstrated that both belong to HBV/D6; the divergence between the two strains was 1.45%, while the divergences of strains 22.165.07 and 22.252.07 from the reference strain (AM422939/France) were 2.67%. The A1762T/G1764A mutation was observed in 1.96% and 5.36%, whereas the T1753V mutation was found in 2.94% and 1.79% of HBV/B and HBV/C, respectively. Conclusion: HBV/B and HBV/C are dominant in Makassar, similar to most areas of Indonesia. Mutations in the BCP, which might be associated with severity of liver disease, are less common.
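The 1.45% and 2.67% divergence figures above are pairwise distances between aligned genome sequences. A hypothetical sketch of the simplest such measure, an uncorrected p-distance (the study may well have used a corrected distance model instead):

```python
def percent_divergence(seq_a, seq_b):
    """Uncorrected p-distance between two aligned sequences of equal
    length, expressed as the percentage of mismatched positions."""
    if len(seq_a) != len(seq_b):
        raise ValueError("sequences must be aligned to equal length")
    mismatches = sum(a != b for a, b in zip(seq_a, seq_b))
    return 100.0 * mismatches / len(seq_a)

# One mismatch in four aligned bases -> 25% divergence.
d = percent_divergence("ACGT", "ACGA")
```

On a full-length HBV genome of roughly 3.2 kb, a 1.45% p-distance corresponds to on the order of 46 mismatched positions between the two strains.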
