Indications and trends of caesarean birth delivery in the current practice scenario
Background: The objective of the current study was to analyse the incidence, indications and trends of caesarean section (CS) delivery in our environment. Methods: A prospective study of the caesarean sections performed at V.S. general teaching hospital in Ahmedabad from January 2008 to December 2013. Results: Of 28,411 total deliveries, 11,629 women underwent CS. The CS rate, above 40%, remained relatively constant each year. 72.46% of patients were in the 20-29 years age group, and 39% were from the middle to higher socio-economic class. The CS rate among emergency patients was consistently above 50%, and around 40% among registered patients. Maternal indications for CS were twice as common as fetal indications; previous CS and fetal distress were the commonest maternal and fetal indications, respectively. Overall maternal morbidity in CS ranged from 8-10%, the commonest causes being blood transfusion and wound infection. Compared with normal delivery, neonatal morbidity was less than half and neonatal mortality almost one third. A rising CS trend was noted in patients with previous CS, fetal distress, oligohydramnios and failed induction, while a gradual but constant decline in the CS rate was noted among emergency patients and patients with CPD, obstructed labour and PROM. Conclusions: Although the higher CS rate is to some extent justifiable by the remarkable reduction in neonatal mortality and morbidity among high-risk patients, the CS rate in our environment is still three times the WHO recommendation. In a controlled environment with experienced staff, careful selection of patients for normal delivery among those with previous CS or breech presentation, together with scientific induction of labour, may address our concern for mother and newborn safety while keeping the CS rate low.
The Art of Measuring Physical Parameters in Galaxies: A Critical Assessment of Spectral Energy Distribution Fitting Techniques
Do COVID-19 Infectious Disease Models Incorporate the Social Determinants of Health? A Systematic Review
Objectives: To identify COVID-19 infectious disease models that accounted for social determinants of health (SDH). Methods: We searched MEDLINE, EMBASE, Cochrane Library, medRxiv, and the Web of Science from December 2019 to August 2020. We included mathematical modelling studies focused on humans investigating COVID-19 impact and including at least one SDH. We abstracted study characteristics (e.g., country, model type, social determinants of health) and appraised study quality using best-practices guidelines. Results: 83 studies were included. Most pertained to multiple countries (n = 15), the United States (n = 12), or China (n = 7). Most models were compartmental (n = 45) or agent-based (n = 7). Age was the most commonly incorporated SDH (n = 74), followed by gender (n = 15), race/ethnicity (n = 7) and remote/rural location (n = 6). Most models reflected the dynamic nature of infectious disease spread (n = 51, 61%) but few reported on internal (n = 10, 12%) or external (n = 31, 37%) model validation. Conclusion: Few models published early in the pandemic accounted for SDH other than age. Neglect of SDH in mathematical models of disease spread may result in foregone opportunities to understand differential impacts of the pandemic and to assess targeted interventions. Systematic Review Registration: [https://www.crd.york.ac.uk/prospero/display_record.php?ID=CRD42020207706], PROSPERO, CRD42020207706.
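To make "compartmental model incorporating an SDH such as age" concrete, here is a minimal illustrative sketch of an age-stratified SIR model. It is not taken from any of the reviewed studies; the parameter values and the 2x2 contact matrix are invented for illustration only.

```python
# Minimal age-stratified SIR sketch (illustrative assumptions throughout:
# beta, gamma, and the contact matrix are NOT from any reviewed study).
import numpy as np

def simulate_sir(beta, gamma, contact, s0, i0, days, dt=0.1):
    """Euler-integrate an SIR model stratified into age groups.

    s0, i0 : initial susceptible/infectious fractions per age group.
    contact[a][b] : relative rate at which group a contacts group b.
    Returns final (susceptible, infectious, recovered) fractions.
    """
    s, i = np.array(s0, float), np.array(i0, float)
    r = 1.0 - s - i
    for _ in range(int(days / dt)):
        force = beta * contact @ i          # force of infection per group
        new_inf = force * s * dt            # S -> I
        rec = gamma * i * dt                # I -> R
        s, i, r = s - new_inf, i + new_inf - rec, r + rec
    return s, i, r

contact = np.array([[3.0, 1.0],             # younger group: more contacts
                    [1.0, 2.0]])            # older group: fewer contacts
s, i, r = simulate_sir(beta=0.05, gamma=0.1, contact=contact,
                       s0=[0.999, 0.999], i0=[0.001, 0.001], days=160)
```

Stratifying the compartments this way is what lets a model express differential impact: under these assumed contacts, the higher-contact group accumulates a larger final attack rate, which an unstratified model cannot show.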
Construct Validity of the Late-Life Function and Disability Instrument in African American Breast Cancer Survivors
Limited data exist on the validity of the Late-Life Function and Disability (LLFD) instrument in cancer survivors. We examined the construct validity of the abbreviated LLFD instrument in a sample of African American breast cancer survivors (n = 181) aged 50 years and older, who completed the abbreviated LLFD instrument and questions about sociodemographic and lifestyle characteristics. Confirmatory factor analysis (CFA), Cronbach alphas, and structural models were used to evaluate construct validity. Minor modifications were made to the three-factor functional component of the inventory to improve model fit. Cronbach alphas (range 0.85–0.92) and inter-factor correlations (r = 0.3–0.5, all p < 0.05) were appropriate. The two-factor disability component fit the data, and Cronbach alphas (0.91 and 0.98) were appropriate, with a high inter-factor correlation (r = 0.95, p < 0.01). The average variance extracted (range = 0.55–0.93) and composite reliabilities (range = 0.86–0.98) were in acceptable ranges. Floor effects ranged from 7% for advanced lower function to 74% for personal role disability. Education and number of comorbidities were significantly correlated with functional outcomes. The abbreviated LLFD instrument had adequate construct validity in this sample of African American breast cancer survivors. Further studies are needed to examine the stability of the instrument over time.
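For readers unfamiliar with the reliability statistic reported above, Cronbach's alpha is computed from the item variances and the variance of the total score. The sketch below uses invented toy data (a shared latent trait plus item noise), not the study's data, purely to illustrate the calculation.

```python
# Minimal sketch of Cronbach's alpha for a multi-item scale.
# The toy data below is simulated for illustration, not study data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents-by-items matrix of scores.

    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))
    """
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)     # variance of summed score
    return (k / (k - 1)) * (1.0 - item_vars / total_var)

rng = np.random.default_rng(0)
trait = rng.normal(size=(200, 1))                 # shared latent trait
items = trait + 0.5 * rng.normal(size=(200, 4))   # four correlated items
alpha = cronbach_alpha(items)
```

With strongly correlated items like these, alpha lands near the top of the 0-1 range, matching the intuition that values above roughly 0.8 (as in the abstract's 0.85-0.98) indicate good internal consistency.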
Optimising the Use of Thromboprophylaxis in Atrial Fibrillation (AF): Exploring Factors Affecting Decision-Making
University of Technology Sydney. Graduate School of Health. The risk of stroke is five-fold higher among patients with atrial fibrillation (AF) than among those without AF; indeed, thromboembolic strokes occurring in AF patients are more disabling and fatal than those in patients without AF. This increased morbidity and mortality due to stroke in patients with atrial fibrillation has become a major global healthcare burden, and for this reason stroke prevention (with antithrombotic agents as the mainstay therapy) has been a critical feature of AF management. Although warfarin (an oral vitamin K antagonist) has traditionally been used to prevent stroke in AF patients, its complex pharmacology (a narrow therapeutic index requiring regular therapeutic monitoring, and interactions with food, alcohol, and other medications) and prescribers' concerns about patients' nonadherence to therapy make decision-making around the initiation of therapy quite complicated. Consequently, anticoagulants are underutilised in many 'at-risk' patients, exposing them to an increased risk of a preventable stroke. Our hospital-based study used a decision-making support tool, the computerised antithrombotic risk assessment tool (CARAT), developed from local and international guidelines to assist in therapy selection based on an individualised risk-versus-benefit assessment for each patient. It observed a marginal increase in anticoagulant prescription among eligible patients (57.8% vs 64.7%, P = 0.35) compared with baseline prescription. However, many at-risk patients were still not prescribed anticoagulants as recommended by CARAT, and clinicians' agreement with the CARAT recommendations was low. This may have been due to clinicians' perceived fears of risks such as falls, bleeding, and patients' nonadherence to therapy.
To increase clinicians' acceptance of the CARAT tool, studies should further explore its validity in predicting clinical outcomes.
Recently, the direct oral anticoagulants (DOACs) have become available for thromboprophylaxis in patients with AF. These agents have safety and efficacy (in stroke prevention) profiles comparable to warfarin therapy. They also offer some practical advantages over warfarin: they do not require regular therapeutic drug monitoring, and their interactions with food, alcohol and other medications are limited. However, the DOACs are not completely devoid of risks or challenges to their use. These challenges include: a) the lack of specific drug monitoring tests; b) complicated management of renally impaired patients; c) limited access to and/or unavailability of antidotes for the management of DOAC-related acute bleeding; d) high 'out-of-pocket' costs for patients in some countries; and e) the potential for patient nonadherence (due to the more frequent dosing required with dabigatran and apixaban). Such conditions present specific challenges for clinicians when prescribing these medications for long-term stroke prophylaxis in patients with AF. Following the 2014 listing of DOACs on the Pharmaceutical Benefits Scheme (PBS), which subsidises DOACs for stroke prevention in AF, it was important to report the utilisation of anticoagulants in local Australian settings. It was also necessary to update CARAT (version 2.0) to assess whether prescriptions were based on these revised guidelines. Our study (in a hospital setting in Sydney) found that 52.0% of patients were prescribed anticoagulants. Warfarin was the first-choice anticoagulant, prescribed for two-thirds of these patients, while the remaining one-third were on DOACs. However, most of the patients eligible for anticoagulants were not prescribed them, and were instead either prescribed antiplatelets or kept on nil therapy.
In this thesis, a structured literature review explored the factors influencing patients' preferences for, and adherence to, warfarin versus the DOACs, because research suggests that patients have an important role in the decision-making process for antithrombotic therapy selection in AF. This review discussed patients' perspectives on medications, and its findings were synthesised into a framework depicting five interacting dimensions of adherence: 1) therapy-related factors; 2) patient-related factors; 3) condition-related factors; 4) social-economic factors; and 5) health-system factors. From this study it was clear that patients' views about treatment must be incorporated into the decision-making process to facilitate a) treatment; b) adherence; and c) good clinical outcomes. In line with this study, a further study evaluated the information within web-based resources designed to educate patients on thromboprophylaxis in AF. Content and thematic analyses were conducted on these resources, and the information they provided was found to vary. The implied bias of some resources towards specific anticoagulant therapies, and their imbalanced information on the importance of anticoagulation in AF, might misinform or confuse patients. Therefore, patients' engagement in shared decision-making and adherence to medicines might be undermined by the suboptimal quality of information provided in these resources.
An elegant scheme of self-testing for multipartite Bell inequalities
Abstract Self-testing is the most accurate form of certification of quantum devices. While self-testing in bipartite Bell scenarios has been thoroughly studied, self-testing in the more complex multipartite Bell scenarios remains largely unexplored. We present a simple and broadly applicable self-testing scheme for N-partite correlation Bell inequalities with two binary-outcome observables per party. To showcase the versatility of our proof technique, we obtain self-testing statements for the MABK and WWWŻB families of linear Bell inequalities and Uffink's family of quadratic Bell inequalities. In particular, we show that the N-partite MABK and Uffink's quadratic Bell inequalities self-test the GHZ state and anti-commuting observables for each party. While the former uniquely specifies the state, the latter allows for an arbitrary relative phase. To demonstrate the operational relevance of the relative phase, we introduce Uffink's complex-valued N-partite Bell expression, whose extremal values self-test the GHZ states and uniquely specify the relative phase.
