Are there valid proxy measures of clinical behaviour?
Background: Accurate measures of health professionals' clinical practice are critically important to guide health policy decisions, as well as for professional self-evaluation and for research-based investigation of clinical practice and process of care. It is often not feasible or ethical to measure behaviour through direct observation, and rigorous behavioural measures are difficult and costly to use. The aim of this review was to identify the current evidence relating to the relationships between proxy measures and direct measures of clinical behaviour. In particular, the accuracy of medical record review, clinician self-reported and patient-reported behaviour was assessed relative to directly observed behaviour.
Methods: We searched: PsycINFO; MEDLINE; EMBASE; CINAHL; Cochrane Central Register of Controlled Trials; Science/Social Science Citation Index; Current Contents (social & behavioural med/clinical med); ISI conference proceedings; and Index to Theses. Inclusion criteria: empirical, quantitative studies examining clinical behaviours. An independent, direct measure of behaviour (by standardised patient, other trained observer, or by video/audio recording) was considered the 'gold standard' for comparison. Proxy measures of behaviour included retrospective self-report, patient report, or chart review. All titles, abstracts, and full-text articles retrieved by electronic searching were screened for inclusion and abstracted independently by two reviewers. Disagreements were resolved by discussion with a third reviewer where necessary.
Results: Fifteen reports originating from 11 studies met the inclusion criteria. The method of direct measurement was by standardised patient in six reports, trained observer in three reports, and audio/video recording in six reports. Multiple proxy measures of behaviour were compared in five of 15 reports. Only four of 15 reports used appropriate statistical methods to compare measures. Some direct measures failed to meet our validity criteria. The accuracy of patient report and chart review as proxy measures varied considerably across a wide range of clinical actions. The evidence for clinician self-report was inconclusive.
Conclusion: Valid measures of clinical behaviour are of fundamental importance to accurately identify gaps in care delivery, improve quality of care, and ultimately to improve patient care. However, the evidence base for three commonly used proxy measures of clinicians' behaviour is very limited. Further research is needed to better establish the methods of development, application, and analysis for a range of both direct and proxy measures of behaviour.
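To make concrete what an "appropriate statistical method" for comparing a proxy with a direct measure might look like, the sketch below computes Cohen's kappa for hypothetical binary ratings of whether a clinical action was performed, as recorded by chart review versus direct observation. This is an illustrative example only and is not drawn from any of the included studies.

    # Hypothetical agreement check between a proxy measure (chart review) and a
    # direct measure (observation) of whether a clinical action was performed.
    # Cohen's kappa corrects raw percent agreement for agreement expected by chance.
    def cohens_kappa(proxy, direct):
        n = len(proxy)
        observed = sum(p == d for p, d in zip(proxy, direct)) / n
        p1, d1 = sum(proxy) / n, sum(direct) / n            # marginal "performed" rates
        expected = p1 * d1 + (1 - p1) * (1 - d1)            # chance agreement
        return (observed - expected) / (1 - expected)

    chart_review = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]           # hypothetical data
    observation  = [1, 0, 0, 1, 0, 1, 1, 1, 0, 1]
    print(f"kappa = {cohens_kappa(chart_review, observation):.2f}")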
Lattice QCD without topology barriers
As the continuum limit is approached, lattice QCD simulations tend to get
trapped in the topological charge sectors of field space and may consequently
give biased results in practice. We propose to bypass this problem by imposing
open (Neumann) boundary conditions on the gauge field in the time direction.
The topological charge can then flow in and out of the lattice, while many
properties of the theory (the hadron spectrum, for example) are not affected.
Extensive simulations of the SU(3) gauge theory, using the HMC and the closely
related SMD algorithm, confirm the absence of topology barriers if these
boundary conditions are chosen. Moreover, the calculated autocorrelation times
are found to scale approximately like the square of the inverse lattice
spacing, thus supporting the conjecture that the HMC algorithm is in the
universality class of the Langevin equation.
Comment: Plain TeX source, 26 pages, 4 figures included
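For orientation, the autocorrelation times discussed here are integrated autocorrelation times, tau_int = 1/2 + sum_t rho(t), where rho(t) is the normalized autocorrelation of an observable such as the topological charge. A minimal Python sketch of a generic estimator with a simple windowing cut-off (an assumption of this illustration, not the analysis code used in the paper):

    import numpy as np

    def integrated_autocorrelation_time(series, max_lag=None):
        """Estimate tau_int of a Monte Carlo time series, e.g. the topological charge."""
        x = np.asarray(series, dtype=float)
        x = x - x.mean()
        n = len(x)
        var = np.dot(x, x) / n
        max_lag = max_lag or n // 4
        tau = 0.5
        for t in range(1, max_lag):
            rho = np.dot(x[:-t], x[t:]) / ((n - t) * var)   # normalized autocorrelation at lag t
            if rho < 0:      # crude window: stop once correlations drop into noise
                break
            tau += rho
        return tau

    # tau_int growing roughly like 1/a^2 as the lattice spacing a decreases would
    # reflect the scaling behaviour reported above.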
Evaluation of Dynamic Cell Processes and Behavior Using Video Bioinformatics Tools
Just as body language can reveal a person’s state of well-being, dynamic changes in cell behavior and
morphology can be used to monitor processes in cultured cells. This chapter discusses how CL-Quant
software, a commercially available video bioinformatics tool, can be used to extract quantitative data on:
(1) growth/proliferation, (2) cell and colony migration, (3) reactive oxygen species (ROS) production, and
(4) neural differentiation. Protocols created using CL-Quant were used to analyze both single cells and
colonies. Time-lapse experiments in which different cell types were subjected to various chemical
exposures were done using Nikon BioStations. Proliferation rate was measured in human embryonic stem
cell colonies by quantifying colony area (pixels) and in single cells by measuring confluency (pixels).
Colony and single cell migration were studied by measuring total displacement (distance between the
starting and ending points) and total distance traveled by the colonies/cells. To quantify ROS production,
cells were pre-loaded with MitoSOX Red™, a mitochondrial ROS (superoxide) indicator, treated with
various chemicals, then total intensity of the red fluorescence was measured in each frame. Lastly, neural
stem cells were incubated in differentiation medium for 12 days, and time lapse images were collected
daily. Differentiation of neural stem cells was quantified using a protocol that detects young neurons. CL-Quant
software can be used to evaluate biological processes in living cells, and the protocols developed in
this project can be applied to basic research and toxicological studies, or to monitor quality control in
culture facilities.
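As a concrete illustration of the two migration metrics, total displacement (net start-to-end distance) and total distance traveled (summed path length) can be computed from tracked centroid coordinates as in the short Python sketch below; the coordinates are hypothetical and the code is a generic illustration, not CL-Quant's internal protocol.

    import math

    def migration_metrics(track):
        """track: list of (x, y) centroid positions, one per time-lapse frame."""
        displacement = math.dist(track[0], track[-1])                        # net start-to-end distance
        distance = sum(math.dist(a, b) for a, b in zip(track, track[1:]))    # summed path length
        return displacement, distance

    # Hypothetical centroid track of one cell or colony (pixel coordinates).
    track = [(10, 10), (12, 14), (15, 13), (20, 18), (19, 25)]
    disp, dist = migration_metrics(track)
    print(f"displacement = {disp:.1f} px, distance traveled = {dist:.1f} px")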
Forecasting Tunisian type 2 diabetes prevalence to 2027: validation of a simple model.
BACKGROUND: Most projections of type 2 diabetes (T2D) prevalence are simply based on demographic change (i.e. ageing). We developed a model to predict future trends in T2D prevalence in Tunisia, explicitly taking into account trends in major risk factors (obesity and smoking). This could improve assessment of policy options for prevention and health service planning.
METHODS: The IMPACT T2D model uses a Markov approach to integrate population, obesity and smoking trends to estimate future T2D prevalence. We developed a model for the Tunisian population from 1997 to 2027, and validated the model outputs by comparing them with a subsequent T2D prevalence survey conducted in 2005.
RESULTS: The model estimated that the prevalence of T2D among Tunisians aged over 25 years was 12.0% in 1997 (95% confidence interval 9.6%-14.4%), increasing to 15.1% (12.5%-17.4%) in 2005. Between 1997 and 2005, observed prevalence in men increased from 13.5% to 16.1% and in women from 12.9% to 14.1%. The model forecast a dramatic rise in prevalence by 2027 (26.6% overall, 28.6% in men and 24.7% in women). However, if obesity prevalence declined by 20% in the 10 years from 2013, and if smoking decreased by 20% over the 10 years from 2009, a 3.3% reduction in T2D prevalence could be achieved in 2027 (2.5% in men and 4.1% in women).
CONCLUSIONS: This innovative model provides a reasonably close estimate of T2D prevalence for Tunisia over the 1997-2027 period. The diabetes burden is now a significant public health challenge. Our model predicts that this burden will increase significantly in the next two decades. Tackling obesity, smoking and other T2D risk factors thus needs urgent action. Tunisian decision makers have therefore defined two strategies: obesity reduction and tobacco control. Responses will be evaluated in future population surveys.
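To give a flavour of the modelling approach, the sketch below projects prevalence forward year by year by moving a fraction of the non-diabetic population into the diabetic state, with incidence scaled by a risk-factor trend. All rates and trend values are hypothetical placeholders rather than the calibrated IMPACT T2D inputs, and mortality is ignored for brevity.

    # Toy Markov-style prevalence projection (hypothetical parameters throughout;
    # not the calibrated IMPACT T2D model, and mortality is ignored for brevity).
    def project_prevalence(years, prevalence0, base_incidence, risk_trend):
        prevalence = prevalence0
        for year in range(years):
            incidence = base_incidence * (1 + risk_trend * year)  # e.g. obesity-driven rise
            prevalence += (1 - prevalence) * incidence            # new cases among the susceptible
        return prevalence

    print(f"{project_prevalence(30, 0.12, 0.006, 0.02):.1%}")      # prevalence after 30 years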
Exotic particles below the TeV from low scale flavour theories
A flavour gauge theory is observable only if the symmetry is broken at
relatively low energies. The intrinsic parity-violation of the fermion
representations in a flavour theory describing quark, lepton and higgsino
masses and mixings generically requires anomaly cancellation by new fermions.
Benchmark supersymmetric flavour models are built and studied to argue that: i)
the flavour symmetry breaking should be about three orders of magnitude above
the higgsino mass, enough also to efficiently suppress FCNC and CP violations
coming from higher-dimensional operators; ii) new fermions with exotic decays
into lighter particles are typically required at scales of the order of the
higgsino mass.
Comment: 19 pages, references added, one comment and one footnote added, results unchanged
Forces between clustered stereocilia minimize friction in the ear on a subnanometre scale
The detection of sound begins when energy derived from acoustic stimuli
deflects the hair bundles atop hair cells. As hair bundles move, the viscous
friction between stereocilia and the surrounding liquid poses a fundamental
challenge to the ear's high sensitivity and sharp frequency selectivity. Part
of the solution to this problem lies in the active process that uses energy for
frequency-selective sound amplification. Here we demonstrate that a
complementary part involves the fluid-structure interaction between the liquid
within the hair bundle and the stereocilia. Using force measurement on a
dynamically scaled model, finite-element analysis, analytical estimation of
hydrodynamic forces, stochastic simulation and high-resolution interferometric
measurement of hair bundles, we characterize the origin and magnitude of the
forces between individual stereocilia during small hair-bundle deflections. We
find that the close apposition of stereocilia effectively immobilizes the
liquid between them, which reduces the drag and suppresses the relative
squeezing but not the sliding mode of stereociliary motion. The obliquely
oriented tip links couple the mechanotransduction channels to this least
dissipative coherent mode, whereas the elastic horizontal top connectors
stabilize the structure, further reducing the drag. As measured from the
distortion products associated with channel gating at physiological stimulation
amplitudes of tens of nanometres, the balance of forces in a hair bundle
permits a relative mode of motion between adjacent stereocilia that encompasses
only a fraction of a nanometre. A combination of high-resolution experiments
and detailed numerical modelling of fluid-structure interactions reveals the
physical principles behind the basic structural features of hair bundles and
shows quantitatively how these organelles are adapted to the needs of sensitive
mechanotransduction.
Comment: 21 pages, including 3 figures. For supplementary information, please see the online version of the article at http://www.nature.com/natur
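(The "dynamically scaled model" mentioned above rests on a standard hydrodynamic similarity argument, noted here for context rather than taken from the paper: the Reynolds number Re = \rho v L/\mu is matched between the enlarged replica and the real hair bundle by using a larger, more slowly moving structure in a more viscous liquid, so the ratio of inertial to viscous forces, and hence the flow pattern, is preserved and forces measured on the model can be rescaled to stereociliary dimensions.)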
New directions in cellular therapy of cancer: a summary of the summit on cellular therapy for cancer
A summit on cellular therapy for cancer discussed and presented advances related to the use of adoptive cellular therapy for melanoma and other cancers. The summit revealed that this field is advancing rapidly. Conventional cellular therapies, such as tumor infiltrating lymphocytes (TIL), are becoming more effective and more available. Gene therapy is becoming an important tool in adoptive cell therapy. Lymphocytes are being engineered to express high affinity T cell receptors (TCRs), chimeric antibody-T cell receptors (CARs) and cytokines. T cell subsets with more naïve and stem cell-like characteristics have been shown in pre-clinical models to be more effective than unselected populations, and it is now possible to reprogram T cells and to produce T cells with stem cell characteristics. In the future, combinations of adoptive transfer of T cells and specific vaccination against the cognate antigen can be envisaged to further enhance the effectiveness of these therapies.
Tuning supersymmetric models at the LHC: A comparative analysis at two-loop level
We provide a comparative study of the fine tuning amount (Delta) at the
two-loop leading log level in supersymmetric models commonly used in SUSY
searches at the LHC. These are the constrained MSSM (CMSSM), non-universal
Higgs masses models (NUHM1, NUHM2), non-universal gaugino masses model (NUGM)
and GUT related gaugino masses models (NUGMd). Two definitions of the fine
tuning are used, the first (Delta_{max}) measures maximal fine-tuning wrt
individual parameters while the second (Delta_q) adds their contribution in
"quadrature". As a direct result of two theoretical constraints (the EW minimum
conditions), fine tuning (Delta_q) emerges as a suppressing factor (effective
prior) of the averaged likelihood (under the priors), under the integral of the
global probability of measuring the data (Bayesian evidence p(D)). For each
model, there is little difference between Delta_q, Delta_{max} in the region
allowed by the data, with similar behaviour as functions of the Higgs, gluino,
stop mass or SUSY scale (m_{susy}=(m_{\tilde t_1} m_{\tilde t_2})^{1/2}) or
dark matter and g-2 constraints. The analysis has the advantage that by
replacing any of these mass scales or constraints by their latest bounds one
easily infers for each model the value of Delta_q, Delta_{max} or vice versa.
For all models, minimal fine tuning is achieved for M_{higgs} near 115 GeV with
a Delta_q\approx Delta_{max}\approx 10 to 100 depending on the model, and in
the CMSSM this is actually a global minimum. Due to a strong (
exponential) dependence of Delta on M_{higgs}, for a Higgs mass near 125 GeV,
the above values of Delta_q\approx Delta_{max} increase to between 500 and
1000. Possible corrections to these values are briefly discussed.
Comment: 23 pages, 46 figures; references added; some clarifications (section 2)
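For reference, the two measures compared above are conventionally built from the logarithmic sensitivities Delta_i = |\partial \ln M_Z^2 / \partial \ln p_i| of the electroweak scale to the input parameters p_i, with Delta_{max} = max_i Delta_i and Delta_q = (\sum_i Delta_i^2)^{1/2}, i.e. the individual sensitivities added in quadrature. This is the standard Barbieri-Giudice-type convention, stated here for orientation; the paper's precise definitions may differ in detail.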
A precision study of the fine tuning in the DiracNMSSM
Recently the DiracNMSSM has been proposed as a possible solution to reduce
the fine tuning in supersymmetry. We determine the degree of fine tuning needed
in the DiracNMSSM with and without non-universal gaugino masses and compare it
with the fine tuning in the GNMSSM. To apply reasonable cuts on the allowed
parameter regions we perform a precise calculation of the Higgs mass. In
addition, we include the limits from direct SUSY searches and dark matter
abundance. We find that both models are comparable in terms of fine tuning,
with the minimal fine tuning in the GNMSSM slightly smaller.
Comment: 20 pages + appendices, 10 figures
Single-Scale Natural SUSY
We consider the prospects for natural SUSY models consistent with current
data. Recent constraints make the standard paradigm unnatural so we consider
what could be a minimal extension consistent with what we now know. The most
promising such scenarios extend the MSSM with new tree-level Higgs interactions
that can lift its mass to at least 125 GeV and also allow for flavor-dependent
soft terms so that the third generation squarks are lighter than current bounds
on the first and second generation squarks. We argue that a common feature of
almost all such models is the need for a new scale near 10 TeV, such as a scale
of Higgsing or confinement of a new gauge group. We consider the question
whether such a model can naturally derive from a single mass scale associated
with supersymmetry breaking. Most such models simply postulate new scales,
leaving their proximity to the scale of MSSM soft terms a mystery. This
coincidence problem may be thought of as a mild tuning, analogous to the usual
mu problem. We find that a single mass scale origin is challenging, but suggest
that a more natural origin for such a new dynamical scale is the gravitino
mass, m_{3/2}, in theories where the MSSM soft terms are a loop factor below
m_{3/2}. As an example, we build a variant of the NMSSM where the singlet S is
composite, and the strong dynamics leading to compositeness is triggered by
masses of order m_{3/2} for some fields. Our focus is the Higgs sector, but our
model is compatible with a light stop (with the other generation squarks heavy,
or with R-parity violation or another mechanism to hide them from current
searches). All the interesting low-energy mass scales, including linear terms
for S playing a key role in EWSB, arise dynamically from the single scale
m_{3/2}. However, numerical coefficients from RG effects and wavefunction
factors in an extra dimension complicate the otherwise simple story.
Comment: 32 pages, 3 figures; version accepted by JHE
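(For orientation, "a loop factor below m_{3/2}" uses the conventional meaning of a loop factor, a suppression of order g^2/(16\pi^2), roughly 10^{-3}-10^{-2} depending on the couplings, so soft terms generated at one loop sit at about m_{3/2}/(16\pi^2) up to coupling factors. This numerical note is generic and not a quantitative statement from the paper.)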
