A novel protein isoform of the RON tyrosine kinase receptor transforms human pancreatic duct epithelial cells.
The MST1R gene is overexpressed in pancreatic cancer, producing elevated levels of the RON tyrosine kinase receptor protein. While mutations in MST1R are rare, alternative splice variants have previously been reported in epithelial cancers. We report a novel RON isoform discovered in human pancreatic cancer. Partial splicing of exons 5 and 6 (P5P6) produces a RON isoform that lacks the first extracellular immunoglobulin-plexin-transcription domain. The splice variant is detected in 73% of xenografts derived from pancreatic adenocarcinoma patients and in 71% of pancreatic cancer cell lines. Peptides specific to RON P5P6, detected in human pancreatic cancer specimens by mass spectrometry, confirm translation of the protein isoform. The P5P6 isoform is constitutively phosphorylated, present in the cytoplasm, and traffics to the plasma membrane. Expression of P5P6 in immortalized human pancreatic duct epithelial (HPDE) cells activates downstream AKT, and in human pancreatic epithelial nestin-expressing cells it activates both the AKT and MAPK pathways. Inhibiting RON P5P6 in HPDE cells with the small-molecule inhibitor BMS-777607 blocked constitutive activation and decreased AKT signaling. P5P6 transforms NIH3T3 cells and induces tumorigenicity in HPDE cells. Resultant HPDE-P5P6 tumors develop a dense stromal compartment similar to that seen in pancreatic cancer. In summary, we have identified a novel, constitutively active isoform of the RON tyrosine kinase receptor that has transforming activity and is expressed in human pancreatic cancer. These findings provide additional insight into the biology of the RON receptor in pancreatic cancer and are clinically relevant to the study of RON as a potential therapeutic target.
Probabilistic Analysis of Facility Location on Random Shortest Path Metrics
The facility location problem is an NP-hard optimization problem. Therefore,
approximation algorithms are often used to solve large instances. Such
algorithms often perform much better than worst-case analysis suggests, which
is why probabilistic analysis is a widely used tool for analyzing them. Most
research on probabilistic analysis of NP-hard optimization problems involving
metric spaces, such as the facility location problem, has focused on Euclidean
instances; instances with independent (random) edge lengths, which are
non-metric, have also been studied. We would like to extend this knowledge to
other, more general, metrics.
We investigate the facility location problem using random shortest path
metrics. We analyze some probabilistic properties of a simple greedy heuristic
that yields a solution to the facility location problem: opening the $\kappa$
cheapest facilities (with $\kappa$ depending only on the facility opening
costs). If the facility opening costs are such that $\kappa$ is not too large,
then we show that this heuristic is asymptotically optimal. On the other hand,
for large values of $\kappa$, the analysis becomes more difficult, and we
provide a closed-form expression as an upper bound for the expected
approximation ratio. In the special case where all facility opening costs are
equal, this closed-form expression simplifies further, or even approaches
optimality if the opening costs are sufficiently small.
Comment: A preliminary version accepted to CiE 201
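The greedy heuristic described above is simple enough to sketch in code. The following is a minimal illustration under stated assumptions, not the authors' implementation: it opens the kappa cheapest facilities and connects every client to its nearest open facility; the distance matrix, opening costs, and the choice of kappa are all assumed inputs.

```python
def greedy_facility_location(dist, opening_costs, kappa):
    """Open the kappa cheapest facilities, then serve every client
    from its nearest open facility.  `dist` is a symmetric matrix of
    client-facility distances; `kappa` is assumed to be chosen from
    the opening costs as described in the paper."""
    n = len(opening_costs)
    # Open the kappa facilities with the smallest opening costs.
    open_facs = sorted(range(n), key=lambda i: opening_costs[i])[:kappa]
    total = sum(opening_costs[i] for i in open_facs)
    # Connect each client to its nearest open facility.
    for client in range(n):
        total += min(dist[client][fac] for fac in open_facs)
    return total, open_facs
```

The heuristic never reconsiders its choice of open facilities, which is what makes its probabilistic analysis tractable.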
Extending the applicability of the dose addition model to the assessment of chemical mixtures of partial agonists by using a novel toxic unit extrapolation method
This article has been made available through the Brunel Open Access Publishing Fund. Dose addition, a commonly used concept in toxicology for the prediction of chemical mixture effects, cannot readily be applied to mixtures of partial agonists with differing maximal effects. Due to its mathematical features, effect levels that exceed the maximal effect of the least efficacious compound present in the mixture cannot be calculated. This poses problems when dealing with mixtures likely to be encountered in realistic assessment situations, where chemicals often show differing maximal effects. To overcome this limitation, we developed a pragmatic solution that extrapolates the toxic units of partial agonists to effect levels beyond their maximal efficacy. We extrapolated different additivity expectations that reflect theoretically possible extremes and validated this approach with a mixture of 21 estrogenic chemicals in the E-Screen, an assay that measures the proliferation of human breast cancer epithelial cells. We found that the dose-response curves of the estrogenic agents exhibited widely varying shapes, slopes and maximal effects, which made it necessary to extrapolate mixture responses above 14% proliferation. Our toxic unit extrapolation approach predicted all mixture responses accurately. It extends the applicability of dose addition to combinations of agents with differing saturating effects and removes an important bottleneck that has severely hampered the use of dose addition in the past. © 2014 Scholze et al.
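For a fixed-ratio mixture, classical dose addition predicts the mixture concentration producing effect level x as ECx_mix = 1 / Σ_i (p_i / ECx_i), where p_i is compound i's fraction in the mixture and ECx_i its individual effect concentration. The sketch below is illustrative only, not the authors' code; it shows the standard formula and the exact limitation the paper addresses: ECx_i simply does not exist for a partial agonist whose maximal effect lies below x.

```python
def dose_addition_ecx(fractions, ecx_values):
    """Classical dose-addition prediction for a fixed-ratio mixture:
    ECx_mix = 1 / sum(p_i / ECx_i).  `fractions` are the mixture
    ratios p_i (summing to 1); `ecx_values` are the individual ECx
    concentrations, with None marking a partial agonist whose maximal
    effect is below the target effect level x."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "mixture fractions must sum to 1"
    for ecx in ecx_values:
        if ecx is None:
            # This is the gap the toxic unit extrapolation method fills.
            raise ValueError("ECx undefined: effect level exceeds a "
                             "partial agonist's maximal effect")
    return 1.0 / sum(p / ecx for p, ecx in zip(fractions, ecx_values))
```

Whenever any component's maximal effect falls below x, the classical formula breaks down, which is why the authors extrapolate toxic units beyond maximal efficacy instead.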
The importance of sustained attention in early Alzheimer's disease
INTRODUCTION: There is conflicting evidence regarding impairment of sustained attention in early Alzheimer's disease (AD). We examine whether sustained attention is impaired and predicts deficits in other cognitive domains in early AD. METHODS: Fifty-one patients with early AD (MMSE > 18) and 15 healthy elderly controls were recruited. The sustained attention to response task (SART) was used to assess sustained attention. A subset of 25 patients also performed tasks assessing general cognitive function (ADAS-Cog), episodic memory (Logical Memory scale, Paired Associates Learning), executive function (verbal fluency, grammatical reasoning) and working memory (digit and spatial span). RESULTS: AD patients were significantly impaired on the SART compared to healthy controls (total error β = 19.75, p = 0.027). SART errors significantly correlated with MMSE score (Spearman's rho = -0.338, p = 0.015) and significantly predicted deficits in ADAS-Cog (β = 0.14, p = 0.004). DISCUSSION: Patients with early AD have significant deficits in sustained attention, as measured using the SART. This may impair performance on general cognitive testing and should therefore be taken into account during clinical assessment and the everyday management of individuals with early AD.
Probabilistic Analysis of Optimization Problems on Generalized Random Shortest Path Metrics
Simple heuristics often show a remarkable performance in practice for
optimization problems. Worst-case analysis often falls short of explaining this
performance. Because of this, "beyond worst-case analysis" of algorithms has
recently gained a lot of attention, including probabilistic analysis of
algorithms.
The instances of many optimization problems are essentially a discrete metric
space. Probabilistic analysis for such metric optimization problems has
nevertheless mostly been conducted on instances drawn from Euclidean space,
which provides a structure that is usually heavily exploited in the analysis.
However, most instances from practice are not Euclidean. Little work has been
done on metric instances drawn from other, more realistic, distributions. Some
initial results have been obtained by Bringmann et al. (Algorithmica, 2013),
who have used random shortest path metrics on complete graphs to analyze
heuristics.
The goal of this paper is to generalize these findings to non-complete
graphs, especially Erdős–Rényi random graphs. A random shortest path
metric is constructed by drawing independent random edge weights for each edge
in the graph and setting the distance between every pair of vertices to the
length of a shortest path between them with respect to the drawn weights. For
such instances, we prove that the greedy heuristic for the minimum distance
maximum matching problem, the nearest neighbor and insertion heuristics for the
traveling salesman problem, and a trivial heuristic for the $k$-median problem
all achieve a constant expected approximation ratio. Additionally, we show a
polynomial upper bound for the expected number of iterations of the 2-opt
heuristic for the traveling salesman problem.
Comment: An extended abstract appeared in the proceedings of WALCOM 201
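The metric construction described above can be sketched directly in code. This is an illustrative Python sketch under assumed parameters (exponentially distributed edge weights, a common choice in this line of work), not the paper's exact setup:

```python
import random

def random_shortest_path_metric(n, p, rate=1.0, seed=None):
    """Draw an Erdos-Renyi random graph G(n, p), assign each edge an
    independent exponentially distributed weight, and return the
    all-pairs shortest-path distances (Floyd-Warshall).  Vertex pairs
    in different components keep distance infinity."""
    rng = random.Random(seed)
    INF = float("inf")
    d = [[0.0 if i == j else INF for j in range(n)] for i in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if rng.random() < p:           # edge present with probability p
                w = rng.expovariate(rate)  # independent random edge weight
                d[i][j] = d[j][i] = w
    # Floyd-Warshall: the distance between two vertices is the length
    # of a shortest path between them with respect to the drawn weights.
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[j][i] = d[i][k] + d[k][j]
    return d
```

By construction the result is symmetric, has zero diagonal, and satisfies the triangle inequality on every connected component, so it is a metric whenever the graph is connected.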
Health literacy, health status, and healthcare utilization of Taiwanese adults: results from a national survey
Abstract Background Low health literacy is considered a worldwide health threat. The purpose of this study is to assess the prevalence and socio-demographic covariates of low health literacy in Taiwanese adults and to investigate the relationships between health literacy, health status, and health care utilization. Methods A national survey of 1493 adults was conducted in 2008. Health literacy was measured using the Mandarin Health Literacy Scale. Health status was measured based on self-rated physical and mental health. Health care utilization was measured based on self-reported outpatient clinic visits, emergency room visits, and hospitalizations. Results Approximately thirty percent of adults were found to have low (inadequate or marginal) health literacy. They tended to be older, have fewer years of schooling, have lower household income, and reside in less populated areas. Inadequate health literacy was associated with poorer mental health (OR, 0.57; 95% CI, 0.35-0.91). No association was found between health literacy and health care utilization, even after adjusting for other covariates. Conclusions Low (inadequate and marginal) health literacy is prevalent in Taiwan. High prevalence of low health literacy is not necessarily indicative of the need for interventions. Systematic efforts to evaluate the impact of low health literacy on health outcomes in other countries would help to illuminate features of health care delivery and financing systems that may mitigate the adverse health effects of low health literacy.
Reinstated episodic context guides sampling-based decisions for reward.
How does experience inform decisions? In episodic sampling, decisions are guided by a few episodic memories of past choices. This process can yield choice patterns similar to model-free reinforcement learning; however, samples can vary from trial to trial, causing decisions to vary. Here we show that context retrieved during episodic sampling can cause choice behavior to deviate sharply from the predictions of reinforcement learning. Specifically, we show that, when a given memory is sampled, choices (in the present) are influenced by the properties of other decisions made in the same context as the sampled event. This effect is mediated by fMRI measures of context retrieval on each trial, suggesting a mechanism whereby cues trigger retrieval of context, which then triggers retrieval of other decisions from that context. This result establishes a new avenue by which experience can guide choice and, as such, has broad implications for the study of decisions
Measuring the Hidden Aspects of Solar Magnetism
2008 marks the 100th anniversary of the discovery of astrophysical magnetic
fields, when George Ellery Hale recorded the Zeeman splitting of spectral lines
in sunspots. With the introduction of Babcock's photoelectric magnetograph it
soon became clear that the Sun's magnetic field outside sunspots is extremely
structured. The measured field strengths were found to increase as the
spatial resolution improved. It was therefore necessary to come up
with methods to go beyond the spatial resolution limit and diagnose the
intrinsic magnetic-field properties without dependence on the quality of the
telescope used. The line-ratio technique that was developed in the early 1970s
revealed a picture where most flux that we see in magnetograms originates in
highly bundled, kG fields with a tiny volume filling factor. This led to
interpretations in terms of discrete, strong-field magnetic flux tubes embedded
in a rather field-free medium, and a whole industry of flux tube models at
increasing levels of sophistication. This magnetic-field paradigm has now been
shattered with the advent of high-precision imaging polarimeters that allow us
to apply the so-called "Second Solar Spectrum" to diagnose aspects of solar
magnetism that have been hidden to Zeeman diagnostics. It is found that the
bulk of the photospheric volume is seething with intermediately strong, tangled
fields. In the new paradigm the field behaves like a fractal with a high degree
of self-similarity, spanning about 8 orders of magnitude in scale size, down to
scales of order 10 m.
Comment: To appear in "Magnetic Coupling between the Interior and the
Atmosphere of the Sun", eds. S.S. Hasan and R.J. Rutten, Astrophysics and
Space Science Proceedings, Springer-Verlag, Heidelberg, Berlin, 200
A demonstration of 'broken' visual space
It has long been assumed that there is a distorted mapping between real and ‘perceived’ space, based on demonstrations of systematic errors in judgements of slant, curvature, direction and separation. Here, we have applied a direct test to the notion of a coherent visual space. In an immersive virtual environment, participants judged the relative distance of two squares displayed in separate intervals. On some trials, the virtual scene expanded by a factor of four between intervals although, in line with recent results, participants did not report any noticeable change in the scene. We found that there was no consistent depth ordering of objects that can explain the distance matches participants made in this environment (e.g. A > B > D yet also A < C < D) and hence no single one-to-one mapping between participants’ perceived space and any real 3D environment. Instead, factors that affect pairwise comparisons of distances dictate participants’ performance. These data contradict, more directly than previous experiments, the idea that the visual system builds and uses a coherent 3D internal representation of a scene
The development of path integration: combining estimations of distance and heading
Efficient daily navigation is underpinned by path integration, the mechanism by which we use self-movement information to update our position in space. This process is well understood in adulthood, but there has been relatively little study of path integration in childhood, leading to an underrepresentation in accounts of navigational development. Previous research has shown that calculation of distance and heading both tend to be less accurate in children than in adults, although there have been no studies of the combined calculation of distance and heading that typifies naturalistic path integration. In the present study, 5-year-olds and 7-year-olds took part in a triangle-completion task, where they were required to return to the start point of a multi-element path using only idiothetic information. Performance was compared to a sample of adult participants, who were found to be more accurate than children on measures of landing error, heading error, and distance error. 7-year-olds were significantly more accurate than 5-year-olds on measures of landing error and heading error, although the difference between groups was much smaller for distance error. All measures were reliably correlated with age, demonstrating a clear development of path integration abilities within the age range tested. Taken together, these data make a strong case for the inclusion of path integration within developmental models of spatial navigational processing.