Relationships between components of blood pressure and cardiovascular events in patients with stable coronary artery disease and hypertension
Observational studies have shown a J-shaped relationship between diastolic blood pressure (BP) and cardiovascular events in hypertensive patients with coronary artery disease. We investigated whether the increased risk associated with low diastolic BP reflects elevated pulse pressure (PP). In 22 672 hypertensive patients with coronary artery disease from the CLARIFY registry (Prospective Observational Longitudinal Registry of Patients With Stable Coronary Artery Disease), followed for a median of 5.0 years, BP was measured annually and averaged. The relationships between PP and diastolic BP, alone or combined, and the primary composite outcome (cardiovascular death or myocardial infarction) were analyzed using multivariable Cox proportional hazards models. Adjusted hazard ratios for the primary outcome were 1.62 (95% confidence interval [CI], 1.40–1.87), 1.00 (reference), 1.07 (95% CI, 0.94–1.21), 1.54 (95% CI, 1.32–1.79), and 2.34 (95% CI, 1.95–2.81) for PP <45, 45 to 54 (reference), 55 to 64, 65 to 74, and ≥75 mm Hg, respectively, and 1.50 (95% CI, 1.31–1.72), 1.00 (reference), and 1.58 (95% CI, 1.42–1.77) for diastolic BPs of <70, 70 to 79 (reference), and ≥80 mm Hg, respectively. In a cross-classification analysis of diastolic BP and PP, the relationship between diastolic BP and the primary outcome remained J-shaped when the analysis was restricted to patients with the lowest-risk PP (45–64 mm Hg), with adjusted hazard ratios of 1.53 (95% CI, 1.27–1.83), 1.00 (reference), and 1.54 (95% CI, 1.34–1.75) in the <70, 70 to 79 (reference), and ≥80 mm Hg subgroups, respectively. The J-shaped relationship between diastolic BP and cardiovascular events in hypertensive patients with coronary artery disease persists in patients within the lowest-risk PP range and is therefore unlikely to be solely the consequence of an increased PP reflecting advanced vascular disease.
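The pressure bands used in this analysis can be made concrete with a short sketch. The Python helpers below (hypothetical function names; the paper publishes no code) assign a patient's averaged pulse pressure and diastolic BP to the reported categories, alongside the published adjusted hazard-ratio point estimates:

```python
def pp_category(pp_mmhg):
    """Pulse-pressure band (mm Hg) as defined in the analysis above."""
    if pp_mmhg < 45:
        return "<45"
    if pp_mmhg < 55:
        return "45-54"   # reference band
    if pp_mmhg < 65:
        return "55-64"
    if pp_mmhg < 75:
        return "65-74"
    return ">=75"

def dbp_category(dbp_mmhg):
    """Diastolic-BP band (mm Hg): <70, 70-79 (reference), >=80."""
    if dbp_mmhg < 70:
        return "<70"
    if dbp_mmhg < 80:
        return "70-79"   # reference band
    return ">=80"

# Published adjusted hazard ratios for the primary outcome (point estimates only):
HR_PP = {"<45": 1.62, "45-54": 1.00, "55-64": 1.07, "65-74": 1.54, ">=75": 2.34}
HR_DBP = {"<70": 1.50, "70-79": 1.00, ">=80": 1.58}
```

Both mappings show the J shape: risk rises at both extremes relative to the middle (reference) band.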
Understanding the challenges to implementing case management for people with dementia in primary care in England: a qualitative study using Normalization Process Theory
Background
Case management has been suggested as a way of improving the quality and cost-effectiveness of support for people with dementia. In this study we adapted a model of case management that had been successful in the United States and implemented it in primary care in England. The results are reported elsewhere, but a key finding was that little case management took place. This paper reports the findings of the process evaluation, which used Normalization Process Theory to understand the barriers to implementation.
Methods
Ethnographic methods were used to explore the views and experiences of case management. Interviews with 49 stakeholders (patients, carers, case managers, health and social care professionals) were supplemented with observation of case managers during meetings and initial assessments with patients. Transcripts and field notes were analysed initially using the constant comparative approach and emerging themes were then mapped onto the framework of Normalization Process Theory.
Results
The primary focus during implementation was on the case managers as isolated individuals, with little attention being paid to the social or organizational context within which they worked. Barriers relating to each of the four main constructs of Normalization Process Theory were identified, with a lack of clarity over the scope and boundaries of the intervention (coherence); variable investment in the intervention (cognitive participation); a lack of resources, skills and training to deliver case management (collective action); and limited reflection and feedback on the case manager role (reflexive monitoring).
Conclusions
Despite the intuitive appeal of case management to all stakeholders, there were multiple barriers to implementation in primary care in England including: difficulties in embedding case managers within existing well-established community networks; the challenges of protecting time for case management; and case managers’ inability to identify, and act on, emerging patient and carer needs (an essential, but previously unrecognised, training need). In the light of these barriers it is unclear whether primary care is the most appropriate setting for case management in England. The process evaluation highlights key aspects of implementation and training to be addressed in future studies of case management for dementia
Non-destructive monitoring of viability in an ex vivo organ culture model of osteochondral tissue
Organ culture is an increasingly important tool in research, with advantages over monolayer cell culture due to the inherent natural environment of tissues. Successful organ cultures must retain cell viability. The aim of this study was to produce viable and non-viable osteochondral organ cultures to assess the accumulation of soluble markers in the conditioned medium for predicting tissue viability. Porcine femoral osteochondral plugs were cultured for 20 days, with the addition, on day 6, of Triton X-100 (to induce necrosis), camptothecin (to induce apoptosis) or no toxic additives. Tissue viability was assessed by the tissue-destructive XTT (sodium 3'-[1-[(phenylamino)-carbonyl]-3,4-tetrazolium]-bis(4-methoxy-6-nitro)benzene-sulfonic acid hydrate) assay and LIVE/DEAD® staining of the cartilage at days 0, 6 and 20. Tissue structure was assessed by histological evaluation using haematoxylin & eosin and safranin O. Conditioned medium was assessed every 3-4 days for glucose depletion and for levels of lactate dehydrogenase (LDH), alkaline phosphatase (AP), glycosaminoglycans (GAGs), and matrix metalloproteinase (MMP)-2 and MMP-9. Necrotic cultures showed an immediate reduction in glucose consumption and an immediate increase in LDH, GAG, MMP-2 and MMP-9 levels. Apoptotic cultures showed a delayed reduction in glucose consumption and a delayed increase in LDH, a small rise in MMP-2 and MMP-9, but no significant effect on GAGs released into the conditioned medium. The data showed that tissue viability could be monitored by assessing the conditioned medium for the aforementioned markers, negating the need for tissue-destructive assays. Physiologically relevant whole- or part-joint organ culture models, necessary for research and pre-clinical assessment of therapies, could be monitored this way, reducing the need to sacrifice tissues to determine viability and hence reducing the sample numbers necessary.
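As an illustration only, the marker patterns described above (immediate LDH/GAG/MMP release under necrosis versus delayed LDH release without GAG release under apoptosis) could be turned into a simple rule-of-thumb classifier. The fold-change thresholds below are hypothetical, not values from the study:

```python
def classify_viability(ldh_fold_early, ldh_fold_late, gag_fold_late):
    """Heuristic call on culture state from conditioned-medium markers.

    Inputs are fold changes versus the pre-insult (day-6) baseline,
    measured shortly after the insult (early) and at the end of culture
    (late). Thresholds are illustrative, not derived from the study.
    """
    if ldh_fold_early >= 2.0:
        # Immediate LDH release: the necrosis-like pattern reported above.
        return "necrosis-like"
    if ldh_fold_late >= 2.0 and gag_fold_late < 1.5:
        # Delayed LDH rise without GAG release: the apoptosis-like pattern.
        return "apoptosis-like"
    return "viable"
```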
Transverse-energy distributions at midrapidity in p+p, d+Au, and Au+Au collisions at √s_NN = 62.4–200 GeV and implications for particle-production models
Measurements of the midrapidity transverse energy distribution, dE_T/dη, are presented for p+p, d+Au, and Au+Au collisions at √s_NN = 200 GeV and additionally for Au+Au collisions at √s_NN = 62.4 and 130 GeV. The dE_T/dη distributions are first compared with the number of nucleon participants N_part, number of binary collisions N_coll, and number of constituent-quark participants N_qp calculated from a Glauber model based on the nuclear geometry. For Au+Au, ⟨dE_T/dη⟩/N_part increases with N_part, while ⟨dE_T/dη⟩/N_qp is approximately constant for all three energies. This indicates that the two-component ansatz, dE_T/dη ∝ (1−x) N_part/2 + x N_coll, which has been used to represent E_T distributions, is simply a proxy for N_qp, and that the N_coll term does not represent a hard-scattering component in E_T distributions. The dE_T/dη distributions of Au+Au and d+Au are then calculated from the measured p+p E_T distribution using two models that both reproduce the Au+Au data. However, while the number-of-constituent-quark-participant model agrees well with the d+Au data, the additive-quark model does not.
Comment: 391 authors, 24 pages, 19 figures, and 15 tables. Submitted to Phys. Rev. C. Plain-text data tables for the points plotted in figures for this and previous PHENIX publications are publicly available at http://www.phenix.bnl.gov/papers.htm
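For reference, the two-component ansatz discussed in the abstract above is conventionally written dE_T/dη ∝ (1−x)·N_part/2 + x·N_coll, with x the fitted "hard" fraction. A minimal sketch, with x left as a free parameter (any values supplied to it are illustrative, not fit results from the paper):

```python
def two_component(n_part, n_coll, x):
    """Two-component ansatz for midrapidity E_T (or multiplicity) production:
    a 'soft' term scaling with nucleon participant pairs (N_part / 2) plus a
    'hard' term scaling with the number of binary collisions (N_coll)."""
    return (1.0 - x) * n_part / 2.0 + x * n_coll
```

With x = 0 this reduces to pure participant-pair (wounded-nucleon) scaling; the result above argues that the full expression effectively tracks the number of constituent-quark participants N_qp rather than isolating a genuine hard-scattering component.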
Relationship Between Time in Therapeutic Range and Comparative Treatment Effect of Rivaroxaban and Warfarin: Results From the ROCKET AF Trial
Background: Time in therapeutic range (TTR) is a standard quality measure of the use of warfarin. We assessed the relative effects of rivaroxaban versus warfarin at the level of trial-center TTR (cTTR), since such an analysis preserves randomized comparisons. Methods and Results: TTR was calculated using the Rosendaal method, without exclusion of international normalized ratio (INR) values obtained during warfarin initiation. Measurements during warfarin interruptions >7 days were excluded. INRs were performed via standardized finger-stick point-of-care devices at least every 4 weeks. The primary efficacy endpoint (stroke or non-central nervous system embolism) was examined by quartiles of cTTR and by cTTR as a continuous function. Centers with the highest cTTRs by quartile had lower-risk patients, as reflected by lower CHADS2 scores (P<0.0001) and a lower prevalence of prior stroke or transient ischemic attack (P<0.0001). Sites with higher cTTR were predominantly from North America and Western Europe. The treatment effect of rivaroxaban versus warfarin on the primary endpoint was consistent across a wide range of cTTRs (P for interaction=0.71). The hazard of major and non-major clinically relevant bleeding increased with cTTR (P for interaction=0.001); however, the estimated reduction by rivaroxaban compared with warfarin in the hazard of intracranial hemorrhage was preserved across a wide range of threshold cTTR values. Conclusions: The treatment effect of rivaroxaban compared with warfarin for the prevention of stroke and systemic embolism is consistent regardless of cTTR.
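The Rosendaal method named above linearly interpolates the INR between consecutive measurements and counts each interpolated day as in or out of the therapeutic range (2.0–3.0 in atrial fibrillation). A self-contained sketch of the core calculation; it deliberately does not reproduce the trial's exclusion of measurements during warfarin interruptions >7 days:

```python
from datetime import date

def rosendaal_ttr(measurements, low=2.0, high=3.0):
    """Time in therapeutic range by the Rosendaal linear-interpolation method.

    `measurements` is a date-sorted list of (date, INR) tuples. The INR is
    assumed to change linearly between consecutive measurements; each day in
    the interval is classified as in or out of [low, high]. Returns the
    fraction of interpolated days in range (0.0 if fewer than two readings).
    """
    in_range_days = 0
    total_days = 0
    for (d0, inr0), (d1, inr1) in zip(measurements, measurements[1:]):
        span = (d1 - d0).days
        if span <= 0:
            continue  # skip duplicate or out-of-order dates
        for step in range(span):
            inr = inr0 + (inr1 - inr0) * step / span
            total_days += 1
            if low <= inr <= high:
                in_range_days += 1
    return in_range_days / total_days if total_days else 0.0
```

For example, a patient whose INR climbs linearly from 1.0 to 3.0 over ten days is credited with the second half of that interval (the days the interpolated INR is at or above 2.0), giving a TTR of 0.5.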
Search for dark matter in events with heavy quarks and missing transverse momentum in pp collisions with the ATLAS detector
This article reports on a search for dark matter pair production in association with bottom or top quarks in 20.3 fb−1 of pp collisions collected at √s = 8 TeV by the ATLAS detector at the LHC. Events with large missing transverse momentum are selected when produced in association with high-momentum jets of which one or more are identified as jets containing b-quarks. Final states with top quarks are selected by requiring a high jet multiplicity and in some cases a single lepton. The data are found to be consistent with the Standard Model expectations and limits are set on the mass scale of effective field theories that describe scalar and tensor interactions between dark matter and Standard Model particles. Limits on the dark-matter–nucleon cross-section for spin-independent and spin-dependent interactions are also provided. These limits are particularly strong for low-mass dark matter. Using a simplified model, constraints are set on the mass of dark matter and of a coloured mediator suitable to explain a possible signal of annihilating dark matter.
Proposed model of integrated care to improve health outcomes for individuals with multimorbidities
Search for direct production of charginos and neutralinos in events with three leptons and missing transverse momentum in √s = 8 TeV pp collisions with the ATLAS detector
A search for the direct production of charginos and neutralinos in final states with three leptons and missing transverse momentum is presented. The analysis is based on 20.3 fb−1 of √s = 8 TeV proton-proton collision data delivered by the Large Hadron Collider and recorded with the ATLAS detector. Observations are consistent with the Standard Model expectations and limits are set in R-parity-conserving phenomenological Minimal Supersymmetric Standard Models and in simplified supersymmetric models, significantly extending previous results. For simplified supersymmetric models of direct chargino (χ̃±1) and next-to-lightest neutralino (χ̃02) production with decays to the lightest neutralino (χ̃01) via either all three generations of sleptons, staus only, gauge bosons, or Higgs bosons, χ̃±1 and χ̃02 masses are excluded up to 700 GeV, 380 GeV, 345 GeV, or 148 GeV respectively, for a massless χ̃01.
