The correspondence between augmentations and rulings for Legendrian knots
We strengthen the link between holomorphic and generating-function invariants
of Legendrian knots by establishing a formula relating the number of
augmentations of a knot's contact homology to the complete ruling invariant of
Chekanov and Pushkar.
Comment: v2: 10 pages, 3 figures; minor revisions, to appear in Pacific J. Math.
Invariants of Legendrian Knots and Coherent Orientations
We provide a translation between Chekanov's combinatorial theory for
invariants of Legendrian knots in the standard contact R^3 and a relative
version of Eliashberg and Hofer's Contact Homology. We use this translation to
transport the idea of ``coherent orientations'' from the Contact Homology world
to Chekanov's combinatorial setting. As a result, we obtain a lifting of
Chekanov's differential graded algebra invariant to an algebra over Z[t,t^{-1}]
with a full Z grading.
Comment: 32 pages, 17 figures; small technical corrections to proof of Thm 3.7 and Example 4.
Pharmacy Curriculum Outcomes Assessment (PCOA) as Predictor of Performance on NAPLEX
Objectives: The purpose of the study was to respond to students' inquiries regarding the relationship between student performance on the PCOA, administered in early spring of the P3 year, and performance on the NAPLEX, administered post-graduation.
Method: PCOA scores for two of the four content areas, Pharmaceutical Sciences and Clinical Sciences, from the 2012 and 2013 administrations of the assessment to P3 students were compared with the same students' scores on the 2013 and 2014 NAPLEX taken post-graduation. A Pearson product-moment correlation coefficient was calculated to measure the linear correlation between the two sets of exam scores. Additionally, a linear regression was used to assess how much of the variability in NAPLEX scores the PCOA predictor explained.
Results: The Pearson product-moment correlation coefficient for the combined PCOA content areas, Pharmaceutical Science and Clinical Science, was r = .572. A linear regression established that PCOA Pharmaceutical Science and Clinical Science scores were statistically significant predictors of NAPLEX scores, p
Implications: Students taking the PCOA in the P3 year of their PharmD program may find value in using their performance in the Pharmaceutical Science and Clinical Science areas of the assessment to predict their performance on the NAPLEX, which is blueprinted to these areas of study.
Naturally Occurring Isoleucyl-tRNA Synthetase without tRNA-dependent Pre-transfer Editing
Isoleucyl-tRNA synthetase (IleRS) is unusual among aminoacyl-tRNA synthetases in having a tRNA-dependent pre-transfer editing activity. Alongside the typical bacterial IleRS (such as Escherichia coli IleRS), some bacteria also have eukaryote-like enzymes that cluster with eukaryotic IleRSs and exhibit low sensitivity to the antibiotic mupirocin. Our phylogenetic analysis suggests that the ileS1 and ileS2 genes of contemporary bacteria are the descendants of genes that might have arisen by an ancient duplication event before the separation of bacteria and archaea. We present an analysis of the evolutionary constraints on the synthetic and editing reactions in eukaryotic/eukaryote-like IleRSs, which share a common origin but diverged through adaptation to different cell environments. The enzyme from the yeast cytosol exhibits tRNA-dependent pre-transfer editing analogous to that of E. coli IleRS. This argues for the presence of this proofreading in the common ancestor of both IleRS types and for an ancient origin of the synthetic-site-based quality control step. Yet surprisingly, the eukaryote-like enzyme from Streptomyces griseus lacks this capacity; at the same time, its synthetic site displays a 10^3-fold drop in sensitivity to mupirocin relative to the yeast enzyme. The discovery that pre-transfer editing is optional in IleRSs lends support to the notion that the conserved post-transfer editing domain is the main checkpoint in these enzymes. We substantiated this by showing that under error-prone conditions S. griseus IleRS is able to rescue the growth of an E. coli strain lacking functional IleRS, providing the first evidence that tRNA-dependent pre-transfer editing in IleRS is not essential for cell viability.
Target specificity among canonical nuclear poly(A) polymerases in plants modulates organ growth and pathogen response
Polyadenylation of pre-mRNAs is critical for efficient nuclear export, stability, and translation of the mature mRNAs, and thus for gene expression. The bulk of pre-mRNAs are processed by canonical nuclear poly(A) polymerase (PAPS). Both vertebrate and higher-plant genomes encode more than one isoform of this enzyme, and these are coexpressed in different tissues. However, in neither case is it known whether the isoforms fulfill different functions or polyadenylate distinct subsets of pre-mRNAs. Here we show that the three canonical nuclear PAPS isoforms in Arabidopsis are functionally specialized owing to their evolutionarily divergent C-terminal domains. A strong loss-of-function mutation in PAPS1 causes a male gametophytic defect, whereas a weak allele leads to reduced leaf growth that results in part from a constitutive pathogen response. By contrast, plants lacking both PAPS2 and PAPS4 function are viable with wild-type leaf growth. Polyadenylation of SMALL AUXIN UP RNA (SAUR) mRNAs depends specifically on PAPS1 function. The resulting reduction in SAUR activity in paps1 mutants contributes to their reduced leaf growth, providing a causal link between polyadenylation of specific pre-mRNAs by a particular PAPS isoform and plant growth. This suggests the existence of an additional layer of regulation in plant and possibly vertebrate gene expression, whereby the relative activities of canonical nuclear PAPS isoforms control de novo synthesized poly(A) tail length and hence expression of specific subsets of mRNAs.
Effect of sedimentary heterogeneities in the sealing formation on predictive analysis of geological CO2 storage
Numerical models of geologic carbon sequestration (GCS) in saline aquifers use multiphase flow characteristic curves (relative permeability and capillary pressure) to represent the interactions of the non-wetting CO2 and the wetting brine. Relative permeability data for many sedimentary formations are scarce, so mathematical correlations are commonly used to generate the fluid flow characteristics of these formations. The flow models are essential for predicting CO2 storage capacity and trapping mechanisms in the geological media. Observing pressure dissipation across the storage and sealing formations is relevant for storage capacity and geomechanical analysis during CO2 injection.
This paper evaluates the relevance of representing relative permeability variations in the sealing formation when modelling geological CO2 sequestration processes. Here we concentrate on gradational changes in the lower part of the caprock, particularly how they affect pressure evolution within the entire sealing formation when duly represented by relative permeability functions.
The results demonstrate the importance of accounting for pore size variations in the mathematical model adopted to generate the characteristic curves for GCS analysis. Gradational changes at the base of the caprock influence the magnitude of pressure that propagates vertically into the caprock from the aquifer, especially at the critical zone (i.e. the region overlying the CO2 plume accumulating at the reservoir-seal interface). A higher degree of overpressure and CO2 storage capacity was observed at the base of caprocks that showed gradation. These results illustrate the need to obtain reliable relative permeability functions for GCS, beyond just permeability and porosity data. The study provides a formative principle for geomechanical simulations that study the possibility of pressure-induced caprock failure during CO2 sequestration.
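The "mathematical correlations" mentioned above are typically Brooks-Corey-type functions, in which a pore-size distribution index controls the shape of the relative permeability and capillary pressure curves — the quantity that changes across the gradational zone at the caprock base. A minimal sketch, with all parameter values (residual saturations, entry pressure, index) assumed purely for illustration:

```python
# Illustrative Brooks-Corey-type correlations of the kind used to generate
# relative permeability and capillary pressure curves when measured data are
# scarce. Parameter values are assumed for demonstration, not from the study.

def effective_saturation(sw, swr=0.3, snwr=0.05):
    """Normalised wetting-phase (brine) saturation.

    swr: residual brine saturation; snwr: residual CO2 saturation."""
    return (sw - swr) / (1.0 - swr - snwr)

def krw(sw, lam=2.0, swr=0.3, snwr=0.05):
    """Brine (wetting-phase) relative permeability; lam is the
    pore-size distribution index."""
    se = effective_saturation(sw, swr, snwr)
    return se ** ((2.0 + 3.0 * lam) / lam)

def krn(sw, lam=2.0, swr=0.3, snwr=0.05):
    """CO2 (non-wetting-phase) relative permeability."""
    se = effective_saturation(sw, swr, snwr)
    return (1.0 - se) ** 2 * (1.0 - se ** ((2.0 + lam) / lam))

def pc(sw, pe=0.2e6, lam=2.0, swr=0.3, snwr=0.05):
    """Capillary pressure [Pa]; pe is the entry pressure."""
    se = effective_saturation(sw, swr, snwr)
    return pe * se ** (-1.0 / lam)
```

Representing a gradational caprock base then amounts to letting `lam` and `pe` vary with depth rather than assigning the seal a single uniform curve, which is the modelling choice the paper argues matters for the predicted overpressure.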
Robot life: simulation and participation in the study of evolution and social behavior.
This paper explores the use of robots to simulate evolution, in particular Hamilton's Law. The use of robots raises several questions that this paper seeks to address. The first concerns the role of the robots in biological research: do they simulate something (life, evolution, sociality) or do they participate in something? The second concerns the physicality of the robots: what difference does embodiment make to the role of the robot in these experiments? Thirdly, how do life, embodiment, and social behavior relate in contemporary biology, and why is it possible for robots to illuminate this relation? These questions are provoked by a strange similarity that has not been noted before: between the problem of simulation in the philosophy of science and Deleuze's reading of Plato on the relationship of ideas, copies, and simulacra.
Quality science from quality measurement: The role of measurement type with respect to replication and effect size magnitude in psychological research
Copyright: © 2018 Kornbrot et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
The quality of psychological studies is currently a major concern. The Many Labs Project (MLP) and the Open Science Collaboration (OSC) have collected key data on replicability and statistical effect sizes. We build on this work by investigating the role played by three measurement types: ratings, proportions, and unbounded measures (those without conceptual upper limits, e.g. time). Both replicability and effect sizes depend on the amount of variability due to extraneous factors. We predicted that the role of such extraneous factors might depend on measurement type, and would be greatest for ratings, intermediate for proportions, and least for unbounded measures. Our results support this conjecture. OSC replication rates for unbounded measures (43%) and proportions (40%) combined are reliably higher than those for ratings (20%; effect size w = .20). MLP replication rates for the original studies are: proportion = .74, ratings = .40 (effect size w = .33). Original effect sizes (Cohen's d) are highest for unbounded measures (OSC cognitive = 1.45, OSC social = .90); next for proportions (OSC cognitive = 1.01, OSC social = .84, MLP = .82); and lowest for ratings (OSC social = .64, MLP = .31). These findings are of key importance to scientific methodology and design, even if the reasons for their occurrence are still at the level of conjecture.
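The two effect-size measures quoted in the abstract — Cohen's w for comparing proportions (such as replication rates) and Cohen's d for standardised mean differences — can be sketched as below. The counts in the usage example are invented for illustration and are not the study's data.

```python
# Minimal sketch of the two effect-size measures used above.
from math import sqrt

def cohens_w(observed, expected_props):
    """Cohen's w = sqrt(chi-square / N), for observed cell counts
    against expected cell proportions."""
    n = sum(observed)
    chi2 = sum((o - n * e) ** 2 / (n * e)
               for o, e in zip(observed, expected_props))
    return sqrt(chi2 / n)

def cohens_d(mean1, mean2, sd_pooled):
    """Cohen's d: standardised difference between two means."""
    return (mean1 - mean2) / sd_pooled

# Illustrative use: 60 of 100 studies replicate where 50 were expected.
w = cohens_w([60, 40], [0.5, 0.5])
```

By Cohen's conventional benchmarks, w around .1, .3, and .5 mark small, medium, and large effects, which is how the reported w = .20 and w = .33 are usually read.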
Cardioprotection by systemic dosing of thymosin beta four following ischemic myocardial injury
Thymosin beta 4 (Tβ4) was previously shown to reduce infarct size and improve contractile performance in chronic myocardial ischemic injury via two phases of action: an acute phase, just after injury, when Tβ4 preserves ischemic myocardium via antiapoptotic or anti-inflammatory mechanisms; and a chronic phase, when Tβ4 activates the growth of vascular or cardiac progenitor cells. In order to differentiate between the effects of Tβ4 during the acute and chronic phases, and to obtain detailed hemodynamic and biomarker data on the effects of Tβ4 treatment suitable for use in clinical studies, we tested Tβ4 in a rat model of chronic myocardial ischemia using two dosing regimens: short term dosing (Tβ4 administered only during the first 3 days following injury), and long term dosing (Tβ4 administered during the first 3 days following injury and also every third day until the end of the study). Tβ4 administered throughout the study reduced infarct size and resulted in significant improvements in hemodynamic performance; however, chamber volumes and ejection fractions were not significantly improved. Tβ4 administered only during the first 3 days following injury tended to reduce infarct size and chamber volumes and to improve hemodynamic performance. Plasma biomarkers of myocyte injury were significantly reduced by Tβ4 treatment during the acute injury period, and plasma ANP levels were significantly reduced in both dosing groups. Surprisingly, neither acute nor chronic Tβ4 treatment significantly increased blood vessel density in peri-infarct regions. These results suggest the following: repeated dosing may be required to achieve clinically measurable improvements in cardiac function post-myocardial infarction (MI); improvement in cardiac function may be observed in the absence of a high degree of angiogenesis; and plasma biomarkers of cardiac function and myocardial injury are sensitive pharmacodynamic biomarkers of the effects of Tβ4.
Towards Machine Wald
The past century has seen a steady increase in the need to estimate and
predict complex systems and to make (possibly critical) decisions with
limited information. Although computers have made possible the numerical
evaluation of sophisticated statistical models, these models are still designed
\emph{by humans} because there is currently no known recipe or algorithm for
dividing the design of a statistical model into a sequence of arithmetic
operations. Indeed, enabling computers to \emph{think} as \emph{humans} do
when faced with uncertainty is challenging in several major ways:
(1) Finding optimal statistical models has yet to be formulated as a well-posed
problem when information on the system of interest is incomplete and comes in
the form of a complex combination of sample data, partial knowledge of
constitutive relations and a limited description of the distribution of input
random variables. (2) The space of admissible scenarios along with the space of
relevant information, assumptions, and/or beliefs, tend to be infinite
dimensional, whereas calculus on a computer is necessarily discrete and finite.
To this end, this paper explores the foundations of a rigorous framework
for the scientific computation of optimal statistical estimators/models and
reviews their connections with Decision Theory, Machine Learning, Bayesian
Inference, Stochastic Optimization, Robust Optimization, Optimal Uncertainty
Quantification, and Information Based Complexity.
Comment: 37 pages
