
    Additional Saturday rehabilitation improves functional independence and quality of life and reduces length of stay: a randomised controlled trial

    Background Many inpatients receive little or no rehabilitation on weekends. Our aim was to determine what effect providing additional Saturday rehabilitation during inpatient rehabilitation had on functional independence, quality of life and length of stay compared to 5 days per week of rehabilitation. Methods This was a multicenter, single-blind (assessors) randomized controlled trial with concealed allocation and 12-month follow-up conducted in two publicly funded metropolitan inpatient rehabilitation facilities in Melbourne, Australia. Patients were eligible if they were adults (aged ≥18 years) admitted for rehabilitation for any orthopedic, neurological or other disabling condition, excluding those admitted for slow-stream rehabilitation/geriatric evaluation and management. Participants were randomly allocated to usual care Monday to Friday rehabilitation (control) or to Monday to Saturday rehabilitation (intervention). The additional Saturday rehabilitation comprised physiotherapy and occupational therapy. The primary outcomes were functional independence (functional independence measure (FIM), measured on an 18 to 126 point scale), health-related quality of life (EQ-5D utility index, measured on a 0 to 1 scale, and EQ-5D visual analog scale, measured on a 0 to 100 scale), and patient length of stay. Outcome measures were assessed on admission, discharge (primary endpoint), and at 6 and 12 months post discharge. Results We randomly assigned 996 adults (mean (SD) age 74 (13) years) to Monday to Saturday rehabilitation (n = 496) or usual care Monday to Friday rehabilitation (n = 500).
Relative to admission scores, intervention group participants had higher functional independence (mean difference (MD) 2.3, 95% confidence interval (CI) 0.5 to 4.1, P = 0.01) and health-related quality of life (MD 0.04, 95% CI 0.01 to 0.07, P = 0.009) on discharge and may have had a shorter length of stay by 2 days (95% CI 0 to 4, P = 0.1) when compared to control group participants. Intervention group participants were 17% more likely to have achieved a clinically significant change in functional independence of 22 FIM points or more (risk ratio (RR) 1.17, 95% CI 1.03 to 1.34) and 18% more likely to have achieved a clinically significant change in health-related quality of life (RR 1.18, 95% CI 1.04 to 1.34) on discharge compared to the control group. There was some maintenance of effect for functional independence and health-related quality of life at 6-month follow-up but not at 12-month follow-up. There was no difference in the number of adverse events between the groups (incidence rate ratio = 0.81, 95% CI 0.61 to 1.08). Conclusions Providing an additional day of rehabilitation improved functional independence and health-related quality of life at discharge and may have reduced length of stay for patients receiving inpatient rehabilitation.
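The risk ratios above follow the standard 2×2 analysis. As a minimal sketch of that computation (the responder counts below are hypothetical, since the abstract reports only the ratios), the point estimate and log-normal 95% CI can be obtained as:

```python
import math

def risk_ratio(events_a, n_a, events_b, n_b):
    """Risk ratio (group A vs group B) with a 95% log-normal CI."""
    rr = (events_a / n_a) / (events_b / n_b)
    # Delta-method standard error of log(RR)
    se = math.sqrt(1/events_a - 1/n_a + 1/events_b - 1/n_b)
    lo = math.exp(math.log(rr) - 1.96 * se)
    hi = math.exp(math.log(rr) + 1.96 * se)
    return rr, lo, hi

# Hypothetical counts for illustration only (not reported in the abstract)
rr, lo, hi = risk_ratio(280, 496, 241, 500)
```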

    Practical Verification of Decision-Making in Agent-Based Autonomous Systems

    We present a verification methodology for analysing the decision-making component in agent-based hybrid systems. Traditionally, hybrid automata have been used to both implement and verify such systems, but hybrid automata based modelling, programming and verification techniques scale poorly as the complexity of discrete decision-making increases, making them unattractive in situations where complex logical reasoning is required. In the programming of complex systems it has, therefore, become common to separate out logical decision-making into a separate, discrete, component. However, verification techniques have failed to keep pace with this development. We are exploring agent-based logical components and have developed a model checking technique for such components which can then be composed with a separate analysis of the continuous part of the hybrid system. Among other things, this allows program model checkers to be used to verify the actual implementation of the decision-making in hybrid autonomous systems.

    Validity of biomarkers of early circulatory impairment to predict outcome: a retrospective analysis

    Objectives: The definition of circulatory impairment in the premature infant is controversial. Current research suggests overdiagnosis and overtreatment. We aimed to analyse which biomarkers move clinicians to initiate cardiovascular treatment (CVT). The prognostic capacity for adverse outcome (death and/or moderate-severe brain damage by cranial ultrasound at term equivalent) of these biomarkers was evaluated. Study Design: Retrospective data analysis from preterm infants enrolled in a placebo-controlled trial on dobutamine for low superior vena cava (SVC) flow, who showed normal SVC flow within the first 24 h (not randomized). Five positive biomarkers were considered, including low MABP, serum lactate >4 mmol/L, BE < −9 mmol/L and low SVC flow (<51 ml/kg/min). Results: Ninety-eight infants formed the study cohort. Thirty-six received CVT (2–95 h). Logistic regression models adjusted for gestational age showed a positive association between CVT and the risk of death or moderate-severe abnormal cranial ultrasound at term equivalent (OR 5.2, 95% CI 1.8–15.1, p = 0.002). Low MABP and serum lactate >4 mmol/L were the most prevalent biomarkers at start of treatment. Low BE, high serum lactate and low SVC flow at first echocardiography showed a trend toward being associated with adverse outcome, although not statistically significant. Conclusions: Low blood pressure and high lactate are the most prevalent biomarkers used for CVT prescription. Lactic acidosis and low SVC flow early after birth showed a trend toward being associated with adverse outcome. These findings support using a combination of biomarkers for inclusion in a placebo-controlled trial on CVT during transitional circulation

    Common genetic variants near the Brittle Cornea Syndrome locus ZNF469 influence the blinding disease risk factor central corneal thickness

    Copyright: © 2010 Lu et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. Central corneal thickness (CCT), one of the most highly heritable human traits (h² typically >0.9), is important for the diagnosis of glaucoma and a potential risk factor for glaucoma susceptibility. We conducted genome-wide association studies in five cohorts from Australia and the United Kingdom (total N = 5058). Three cohorts were based on individually genotyped twin collections, with the remaining two cohorts genotyped on pooled samples from singletons with extreme trait values. The pooled sample findings were validated by individually genotyping the pooled samples together with additional samples also within extreme quantiles. We describe methods for efficient combined analysis of the results from these different study designs. We have identified and replicated quantitative trait loci on chromosomes 13 and 16 for association with CCT. The locus on chromosome 13 (nearest gene FOXO1) had an overall meta-analysis p-value for all the individually genotyped samples of 4.6×10⁻¹⁰. The locus on chromosome 16 was associated with CCT with p = 8.95×10⁻¹¹. The nearest gene to the associated chromosome 16 SNPs was ZNF469, a locus recently implicated in Brittle Cornea Syndrome (BCS), a very rare disorder characterized by abnormally thin corneas. Our findings suggest that in addition to rare variants in ZNF469 underlying CCT variation in BCS patients, more common variants near this gene may contribute to CCT variation in the general population
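The combined analysis across cohorts rests on meta-analysis of per-cohort association estimates. The paper describes methods tailored to mixing individually genotyped and pooled designs; the sketch below shows only the textbook inverse-variance fixed-effect form (an assumption for illustration, not the authors' exact procedure):

```python
import math

def fixed_effect_meta(betas, ses):
    """Inverse-variance weighted fixed-effect meta-analysis.

    betas: per-cohort effect estimates; ses: their standard errors.
    Returns pooled beta, pooled SE, and a two-sided z-test p-value.
    """
    weights = [1 / se**2 for se in ses]
    pooled_beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    z = pooled_beta / pooled_se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail probability
    return pooled_beta, pooled_se, p
```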

    Kaposi's Sarcoma-Associated Herpesvirus ORF57 Protein Binds and Protects a Nuclear Noncoding RNA from Cellular RNA Decay Pathways

    The control of RNA stability is a key determinant in cellular gene expression. The stability of any transcript is modulated through the activity of cis- or trans-acting regulatory factors as well as cellular quality control systems that ensure the integrity of a transcript. As a result, invading viral pathogens must be able to subvert cellular RNA decay pathways capable of destroying viral transcripts. Here we report that the Kaposi's sarcoma-associated herpesvirus (KSHV) ORF57 protein binds to a unique KSHV polyadenylated nuclear RNA, called PAN RNA, and protects it from degradation by cellular factors. ORF57 increases PAN RNA levels and its effects are greatest on unstable alleles of PAN RNA. Kinetic analysis of transcription pulse assays shows that ORF57 protects PAN RNA from a rapid cellular RNA decay process, but ORF57 has little effect on transcription or PAN RNA localization based on chromatin immunoprecipitation and in situ hybridization experiments, respectively. Using a UV cross-linking technique, we further demonstrate that ORF57 binds PAN RNA directly in living cells and we show that binding correlates with function. In addition, we define an ORF57-responsive element (ORE) that is necessary for ORF57 binding to PAN RNA and sufficient to confer ORF57-response to a heterologous intronless β-globin mRNA, but not its spliced counterparts. We conclude that ORF57 binds to viral transcripts in the nucleus and protects them from a cellular RNA decay pathway. We propose that KSHV ORF57 protein functions to enhance the nuclear stability of intronless viral transcripts by protecting them from a cellular RNA quality control pathway

    Metastatic melanoma in an esophagus demonstrating Barrett esophagus with high grade dysplasia

    BACKGROUND: Metastatic melanoma involving the esophagus is rare; the occurrence of metastatic melanoma in a background of Barrett esophagus is rarer still. We report a case of an 80-year-old male who presented to our institution for workup of Barrett esophagus with high-grade dysplasia and who proved to have metastatic melanoma occurring in the background of Barrett esophagus, the first report of this kind, to our knowledge, in the English literature. CASE PRESENTATION: An 80-year-old Caucasian male was diagnosed at an outside institution with Barrett esophagus with high-grade dysplasia and presented to our institution for therapy. The patient underwent endoscopic mucosal resection, using a band ligation technique, of an area of nodularity within the Barrett esophagus. Microscopic examination demonstrated extensive Barrett esophagus with high-grade dysplasia as well as a second tumor which was morphologically different from the surrounding high-grade dysplasia and which was positive for S-100, HMB 45 and Melan-A on immunohistochemistry, consistent with melanoma. Further workup of the patient demonstrated multiple radiologic lesions consistent with metastases. Molecular studies demonstrated that the melanoma was positive for the 1799T>A (V600E) mutation in the BRAF gene. The overall features of the tumor were most consistent with metastatic melanoma occurring in a background of Barrett esophagus with high-grade dysplasia. CONCLUSION: This case demonstrates a unique intersection between a premalignant condition (Barrett esophagus with high-grade dysplasia) and a separate malignancy (melanoma). This report also shows the utility of molecular testing to support the hypothesis of primary versus metastatic disease in melanoma

    Safety, Immunogenicity and Efficacy of Prime-Boost Vaccination with ChAd63 and MVA Encoding ME-TRAP against Plasmodium falciparum Infection in Adults in Senegal.

    Malaria transmission is in decline in some parts of Africa, partly due to the scaling up of control measures. If the goal of elimination is to be achieved, additional control measures including an effective and durable vaccine will be required. Studies utilising the prime-boost approach to deliver viral vectors encoding the pre-erythrocytic antigen ME-TRAP (multiple epitope thrombospondin-related adhesion protein) have shown promising safety, immunogenicity and efficacy in sporozoite challenge studies. More recently, a study in Kenyan adults, similar to that reported here, showed substantial efficacy against P. falciparum infection. One hundred and twenty healthy male volunteers living in a malaria-endemic area of Senegal were randomised to receive either the chimpanzee adenovirus (ChAd63) ME-TRAP as prime vaccination, followed eight weeks later by modified vaccinia Ankara (MVA) also encoding ME-TRAP as booster, or two doses of anti-rabies vaccine as a comparator. Prior to follow-up, antimalarials were administered to clear parasitaemia and then participants were monitored by PCR for malaria infection for eight weeks. The primary endpoint was time-to-infection with P. falciparum malaria, determined by two consecutive positive PCR results. Secondary endpoints included adverse event reporting, measures of cellular and humoral immunogenicity and a meta-analysis of combined vaccine efficacy with the parallel study in Kenyan adults. We show that this pre-erythrocytic malaria vaccine is safe and induces significant immunogenicity, with a peak T-cell response at seven days after boosting of 932 spot-forming cells (SFC)/10⁶ peripheral blood mononuclear cells (PBMC) compared to 57 SFC/10⁶ PBMC in the control group. However, vaccine efficacy was not observed: 12 of 57 ME-TRAP vaccinees became PCR positive during the intensive monitoring period as compared to 13 of the 58 controls (P = 0.80).
This trial confirms that vaccine efficacy against malaria infection in adults may be rapidly assessed using this efficient and cost-effective clinical trial design. Further efficacy evaluation of this vectored candidate vaccine approach in other malaria transmission settings and age-de-escalation into the main target age groups for a malaria vaccine is in progress

    Determinants and impact of role-related time use allocation on self-reported health among married men and women: a cross-national comparative study

    Background Research on the effects of marriage on health maintains that there is a gender-specific gradient, with men deriving far greater benefits than women. One reason provided for this difference is the disproportionate amount of time spent by women on housework and childcare. However, this hypothesis has yet to be explicitly tested for these role-related time use activities. This study provides empirical evidence on the association between role-related time use activities (i.e. housework, childcare and paid work) and self-reported health among married men and women. Methods Data from the Multinational Time Use Study (MTUS) on 32,881 men and 26,915 women from Germany, Italy, Spain, the UK and the US were analyzed. Seemingly unrelated regression (SUR) models and multivariable logistic regression were used to estimate the association between role-related time use activities and self-reported health among married men and women. Results The findings showed that education, occupation and number of children under 18 years old in the household were the most consistent predictors of time allocation among married men and women. Significant gender differences were also found in time allocation, with women sacrificing paid working time or reducing time devoted to housework for childcare. Men, in contrast, were less likely to reduce paid working hours to increase time spent on childcare, but instead reduced time allocation to housework. Allocating more time to paid work and childcare was associated with good health, whereas time spent on housework was associated with poor health, especially among women. Conclusions Time allocation to role-related activities has differential associations with health, and the effects vary by gender and across countries. To reduce the gender health gap among married men and women, public policies need to take social and gender roles into account

    A study protocol of a randomised controlled trial incorporating a health economic analysis to investigate if additional allied health services for rehabilitation reduce length of stay without compromising patient outcomes

    Background Reducing patient length of stay is a high priority for health service providers. Preliminary information suggests additional Saturday rehabilitation services could reduce the time a patient stays in hospital by three days. This large trial will examine whether providing additional physiotherapy and occupational therapy services on a Saturday reduces health care costs and improves the health of hospital inpatients receiving rehabilitation compared to the usual Monday to Friday service. We will also investigate the cost-effectiveness and patient outcomes of such a service. Methods/Design A randomised controlled trial will evaluate the effect of providing additional physiotherapy and occupational therapy for rehabilitation. Seven hundred and twelve patients receiving inpatient rehabilitation at two metropolitan sites will be randomly allocated to the intervention group or control group. The control group will receive usual care physiotherapy and occupational therapy from Monday to Friday, while the intervention group will receive the same amount of rehabilitation as the control group Monday to Friday plus a full physiotherapy and occupational therapy service on Saturday. The primary outcomes will be patient length of stay, quality of life (EuroQol questionnaire), the Functional Independence Measure (FIM), and health utilization and cost data. Secondary outcomes will assess clinical outcomes relevant to the goals of therapy: the 10 metre walk test, the timed up and go test, the Personal Care Participation Assessment and Resource Tool (PC PART), and the modified motor assessment scale. Blinded assessors will assess outcomes at admission and discharge, and follow-up data on quality of life, function and health care costs will be collected at 6 and 12 months after discharge. Between-group differences will be analysed with analysis of covariance using baseline measures as the covariate.
A health economic analysis will be carried out alongside the randomised controlled trial. Discussion This paper outlines the study protocol for the first fully powered randomised controlled trial incorporating a health economic analysis to establish if additional Saturday allied health services for rehabilitation inpatients reduce length of stay without compromising discharge outcomes. If successful, this trial will have substantial health benefits for the patients and for organizations delivering rehabilitation services
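The protocol's analysis plan (analysis of covariance with the baseline measure as covariate) can be sketched as a linear model on synthetic data; the sample size, scales and simulated group effect below are illustrative assumptions, not trial data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustration: discharge score modelled from group and baseline score
n = 200
group = rng.integers(0, 2, n)          # 0 = control, 1 = intervention
baseline = rng.normal(80, 10, n)       # admission score (hypothetical scale)
discharge = baseline + 5 * group + rng.normal(0, 5, n)

# ANCOVA as a linear model: discharge ~ intercept + group + baseline
X = np.column_stack([np.ones(n), group, baseline])
coef, *_ = np.linalg.lstsq(X, discharge, rcond=None)
print(f"adjusted group effect: {coef[1]:.2f}")  # close to the simulated effect of 5
```

The baseline covariate absorbs admission-score variation, so the group coefficient estimates the between-group difference adjusted for baseline, as the protocol specifies.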

    The management of non-valvular atrial fibrillation (NVAF) in Australian general practice: bridging the evidence-practice gap. A national, representative postal survey

    Background General practitioners (GPs) are ideally placed to bridge the widely noted evidence-practice gap between current management of NVAF and the need to increase anticoagulant use to reduce the risk of fatal and disabling stroke in NVAF. We aimed to identify gaps in current care, and asked GPs to identify potentially useful strategies to overcome barriers to best practice. Methods We obtained contact details for a random sample of 1000 GPs from a national commercial database. Randomly selected GPs were mailed a questionnaire after an advance letter. Standardised reminders were administered to enhance response rates. As part of a larger survey assessing GP management of NVAF, we included questions to explore GPs' risk assessment, estimates of stroke risk and GPs' perceptions of the risks and benefits of anticoagulation with warfarin. In addition, we explored GPs' perceived barriers to the wider uptake of anticoagulation, quality control of anticoagulation and their assessment of strategies to assist in managing NVAF. Results 596 out of 924 eligible GPs responded (64.4% response rate). The majority of GPs recognised that the benefits of warfarin outweighed the risks for three case scenarios in which warfarin is recommended according to Australian guidelines. In response to a hypothetical case scenario describing a patient with a supratherapeutic INR level of 5, 41.4% of the 596 GPs (n = 247) and 22.0% (n = 131) would be "highly likely" or "likely", respectively, to cease warfarin therapy and resume at a lower dose when INR levels are within therapeutic range. Only 27.9% (n = 166/596) would reassess the patient's INR levels within one day of recording the supratherapeutic INR. Patient contraindications to warfarin were reported to "usually" or "always" apply to the patients of 40.6% (n = 242/596) of GPs when considering whether or not to prescribe warfarin.
    Patient refusal to take warfarin "usually" or "always" applied to the patients of 22.3% (n = 133/596) of GPs. When asked to indicate the usefulness of strategies to assist in managing NVAF, the majority of GPs (89.1%, n = 531/596) reported that they would find patient educational resources outlining the benefits and risks of available treatments "quite useful" or "very useful". Just under two-thirds (65.2%; n = 389/596) reported that they would find point-of-care INR testing "quite" or "very" useful. An outreach specialist service and training to enable GPs to practice stroke medicine as a special interest were also considered to be "quite" or "very" useful by 61.9% (n = 369/596) of GPs. Conclusion This survey identified gaps, based on GP self-report, in the current care of NVAF. GPs themselves have provided guidance on the selection of implementation strategies to bridge these gaps. These results may inform future initiatives designed to reduce the risk of fatal and disabling stroke in NVAF.