Empirical Determinations of Key Physical Parameters Related to Classical Double Radio Sources
Multi-frequency radio observations of the radio bridge of powerful classical
double radio sources can be used to determine: the beam power of the jets
emanating from the AGN; the total time the source will actively produce jets
that power large-scale radio emission; the thermal pressure of the medium in
the vicinity of the radio source; and the total mass, including dark matter, of
the galaxy or cluster of galaxies traced by the ambient gas that surrounds the
radio source. The theoretical constructs that allow a determination of each of
these quantities using radio observations are presented and discussed.
Empirical determinations of each of these quantities are obtained and analyzed.
A sample of 14 radio galaxies and 8 radio-loud quasars with redshifts between
zero and two, for which there is enough radio information to determine the
physical parameters listed above, was studied in detail.
(abridged) Comment: Submitted to ApJ, LaTeX, 26 total pages of text which includes
captions & two tables, plus 13 EPS figures & 1 table
Global Cosmological Parameters Determined Using Classical Double Radio Galaxies
A sample of 20 powerful extended radio galaxies with redshifts between zero
and two was used to determine constraints on global cosmological parameters.
Data for six radio sources were obtained from the VLA archive, analyzed, and
combined with the sample of 14 radio galaxies used previously by Guerra & Daly
to determine cosmological parameters. The results are consistent with our
previous results, and indicate that the current value of the mean mass density
of the universe is significantly less than the critical value. A universe with
a mean mass density equal to the critical value is ruled out at 99.0%
confidence, and best-fitting values of the mean mass density in matter are
obtained assuming zero space curvature and zero cosmological constant,
respectively. Identical results obtain when the low-redshift bin, which
includes Cygnus A, is excluded; the constraints therefore do not depend on
whether the radio source Cygnus A is included. The method does
not rely on a zero-redshift normalization.
The radio properties of each source are also used to determine the density of
the gas in the vicinity of the source, and the beam power of the source. The
six new radio sources have physical characteristics similar to those found for
the original 14 sources. The density of the gas around these radio sources is
typical of gas in present-day clusters of galaxies, and typical beam powers
are also determined. Comment: 39 pages, includes 21 figures; accepted to ApJ
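The standard-ruler idea behind these constraints can be illustrated with a toy calculation: in a low-density universe the angular diameter distance to a given redshift is larger than in a critical-density universe, so an object of fixed proper size subtends a smaller angle. This is only a sketch, not the Guerra & Daly fitting machinery; the value H0 = 70 km/s/Mpc and the zero-cosmological-constant cosmologies below are illustrative assumptions.

```python
import math

C_KM_S = 299792.458  # speed of light, km/s
H0 = 70.0            # assumed Hubble constant, km/s/Mpc (illustrative)

def angular_diameter_distance(z, omega_m, steps=10000):
    """D_A in Mpc for a matter + curvature universe with no cosmological constant."""
    omega_k = 1.0 - omega_m
    # comoving distance by trapezoidal integration of c / H(z)
    dz = z / steps
    integral = 0.0
    for i in range(steps + 1):
        zi = i * dz
        e = math.sqrt(omega_m * (1 + zi)**3 + omega_k * (1 + zi)**2)
        w = 0.5 if i in (0, steps) else 1.0
        integral += w / e
    dc = (C_KM_S / H0) * integral * dz
    # curvature correction gives the transverse comoving distance in an open universe
    if omega_k > 1e-8:
        sk = math.sqrt(omega_k)
        dm = (C_KM_S / H0) / sk * math.sinh(sk * dc * H0 / C_KM_S)
    else:
        dm = dc
    return dm / (1.0 + z)

# A rod of fixed proper size subtends different angles in the two cosmologies:
for om in (1.0, 0.2):
    print(om, round(angular_diameter_distance(1.5, om), 1))
```

At z = 1.5 the low-density universe yields the larger angular diameter distance, which is the lever arm that lets high-redshift standard yardsticks discriminate between the cosmologies.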
Pantothenamides Are Potent, On-Target Inhibitors of Plasmodium falciparum Growth When Serum Pantetheinase Is Inactivated
Growth of the virulent human malaria parasite Plasmodium falciparum is dependent on an extracellular supply of pantothenate (vitamin B5) and is susceptible to inhibition by pantothenate analogues that hinder pantothenate utilization. In this study, on the hunt for pantothenate analogues with increased potency relative to those reported previously, we screened a series of pantothenamides (amide analogues of pantothenate) against P. falciparum and show for the first time that analogues of this type possess antiplasmodial activity. Although the active pantothenamides in this series exhibit only modest potency under standard in vitro culture conditions, we show that the potency of pantothenamides is selectively enhanced when the parasite culture medium is pre-incubated at 37°C for a prolonged period. We present evidence that this finding is linked to the presence in Albumax II (a serum substitute routinely used for in vitro cultivation of P. falciparum) of pantetheinase activity: the activity of an enzyme that hydrolyzes the pantothenate metabolite pantetheine, for which pantothenamides also serve as substrates. Pantetheinase activity, and thereby pantothenamide degradation, is reduced following incubation of Albumax II-containing culture medium for a prolonged period at 37°C, revealing the true, sub-micromolar potency of pantothenamides. Importantly, we show that the potent antiplasmodial effect of pantothenamides is attenuated with pantothenate, consistent with the compounds inhibiting parasite proliferation specifically by inhibiting pantothenate and/or CoA utilization. Additionally, we show that the pantothenamides interact with P. falciparum pantothenate kinase, the first enzyme involved in converting pantothenate to coenzyme A.
This is the first demonstration of on-target antiplasmodial pantothenate analogues with sub-micromolar potency, and highlights the potential of pantetheinase-resistant pantothenamides as antimalarial agents.
Substrate-blind photonic integration based on high-index glass materials
Conventional photonic integration technologies are inevitably substrate-dependent, as different substrate platforms stipulate vastly different device fabrication methods and processing compatibility requirements. Here we capitalize on the unique monolithic integration capacity of composition-engineered non-silicate glass materials (amorphous chalcogenides and transition metal oxides) to enable multifunctional, multi-layer photonic integration on virtually any technically important substrate platform. We show that high-index glass film deposition and device fabrication can be performed at low temperatures (< 250 °C) without compromising their low-loss characteristics, and are thus fully compatible with monolithic integration on a broad range of substrates including semiconductors, plastics, textiles, and metals. Application of the technology is highlighted through three examples: demonstration of high-performance mid-IR photonic sensors on fluoride crystals, direct fabrication of photonic structures on graphene, and 3-D photonic integration on flexible plastic substrates.
Nodal dynamics, not degree distributions, determine the structural controllability of complex networks
Structural controllability has been proposed as an analytical framework for
making predictions regarding the control of complex networks across myriad
disciplines in the physical and life sciences (Liu et al., Nature
473(7346):167-173, 2011). Although the integration of control theory and
network analysis is important, we argue that the application of the structural
controllability framework to most if not all real-world networks leads to the
conclusion that a single control input, applied to the power dominating set
(PDS), is all that is needed for structural controllability. This result is
consistent with the well-known fact that controllability and its dual
observability are generic properties of systems. We argue that more important
than issues of structural controllability are the questions of whether a system
is almost uncontrollable, whether it is almost unobservable, and whether it
possesses almost pole-zero cancellations. Comment: 1 figure, 6 pages
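For context, the structural-controllability framework being critiqued computes the minimum number of driver nodes from a maximum matching over the bipartite out/in representation of a directed network (Liu et al. 2011). A minimal sketch using Kuhn's augmenting-path matching on two toy networks (the graphs are hypothetical examples, not networks from the paper):

```python
def min_driver_nodes(n, edges):
    """Minimum driver nodes for a directed network (Liu et al. 2011):
    N_D = max(N - |maximum matching|, 1), with the matching taken over
    the bipartite graph of out-copies versus in-copies of the nodes."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
    match_in = [-1] * n  # match_in[v] = out-node currently matched to in-copy of v

    def augment(u, seen):
        # try to match out-copy of u, reassigning earlier matches if needed
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if match_in[v] == -1 or augment(match_in[v], seen):
                match_in[v] = u
                return True
        return False

    matching = sum(augment(u, set()) for u in range(n))
    return max(n - matching, 1)

# A directed chain 0 -> 1 -> 2 -> 3 is controllable from a single input:
print(min_driver_nodes(4, [(0, 1), (1, 2), (2, 3)]))  # 1
# A star 0 -> {1, 2, 3} leaves two in-copies unmatched, so three inputs:
print(min_driver_nodes(4, [(0, 1), (0, 2), (0, 3)]))  # 3
```

The abstract's point is that for most real networks the structural answer collapses to a single input on the power dominating set, so the more informative questions concern near-uncontrollability rather than this count.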
Does publication bias inflate the apparent efficacy of psychological treatment for major depressive disorder? A systematic review and meta-analysis of US National Institutes of Health-funded trials
Background: The efficacy of antidepressant medication has been shown empirically to be overestimated due to publication bias, but this has only been inferred statistically with regard to psychological treatment for depression. We assessed directly the extent of study publication bias in trials examining the efficacy of psychological treatment for depression.
Methods and Findings: We identified US National Institutes of Health grants awarded to fund randomized clinical trials comparing psychological treatment to control conditions or other treatments in patients diagnosed with major depressive disorder for the period 1972–2008, and we determined whether those grants led to publications. For studies that were not published, data were requested from investigators and included in the meta-analyses. Thirteen (23.6%) of the 55 funded grants that began trials did not result in publications, and two others never started. Among comparisons to control conditions, adding unpublished studies (Hedges' g = 0.20; 95% CI -0.11 to 0.51; k = 6) to published studies (g = 0.52; 0.37 to 0.68; k = 20) reduced the psychotherapy effect size point estimate (g = 0.39; 0.08 to 0.70) by 25%. Moreover, these findings may overestimate the "true" effect of psychological treatment for depression, as outcome reporting bias could not be examined quantitatively.
Conclusion: The efficacy of psychological interventions for depression has been overestimated in the published literature, just as it has been for pharmacotherapy. Both are efficacious, but not to the extent that the published literature would suggest. Funding agencies and journals should archive both original protocols and raw data from treatment trials to allow the detection and correction of outcome reporting bias. Clinicians, guideline developers, and decision makers should be aware that the published literature overestimates the effects of the predominant treatments for depression.
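The pooling arithmetic reported above can be sketched with a standard fixed-effect (inverse-variance) meta-analysis. The study-level effect sizes and standard errors below are hypothetical placeholders, not the trial data; only the 0.52 to 0.39 reduction check uses numbers from the abstract.

```python
def pool_fixed_effect(studies):
    """Inverse-variance fixed-effect pooling of effect sizes.
    `studies` is a list of (g, se) pairs: Hedges' g and its standard error."""
    weights = [1.0 / se**2 for _, se in studies]
    g_pooled = sum(w * g for (g, _), w in zip(studies, weights)) / sum(weights)
    se_pooled = (1.0 / sum(weights)) ** 0.5
    return g_pooled, se_pooled

# Hypothetical study-level values, for illustration only:
published = [(0.55, 0.15), (0.48, 0.12), (0.53, 0.18)]
unpublished = [(0.22, 0.20), (0.18, 0.25)]
g_all, se_all = pool_fixed_effect(published + unpublished)

# The reduction reported in the abstract: 0.52 -> 0.39 is a 25% drop.
print(round((0.52 - 0.39) / 0.52, 2))  # 0.25
```

Adding the lower-effect unpublished studies pulls the pooled estimate down, which is exactly the mechanism by which publication bias inflates the published average.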
Conformal Inverse Optimization
Inverse optimization has been increasingly used to estimate unknown
parameters in an optimization model based on decision data. We show that such a
point estimation is insufficient in a prescriptive setting where the estimated
parameters are used to prescribe new decisions. The prescribed decisions may be
of low quality and misaligned with human intuition, and are thus unlikely to be
adopted. To tackle this challenge, we propose conformal inverse optimization,
which seeks to learn an uncertainty set for the unknown parameters and then
solve a robust optimization model to prescribe new decisions. Under mild
assumptions, we show that our method enjoys provable guarantees on solution
quality, as evaluated using both the ground-truth parameters and the decision
maker's perception of the unknown parameters. Our method demonstrates strong
empirical performance compared to classic inverse optimization.
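The learn-then-robustify pipeline described above can be caricatured on a toy problem. This is only a sketch under strong simplifying assumptions: the finite decision set, the box-shaped uncertainty set, and, crucially, the use of simulated perceived cost vectors for calibration are all illustrative; the actual method estimates the uncertainty set from decision data alone.

```python
import random

random.seed(0)

# Toy setting: a decision is one of four fixed vectors; an unknown cost
# vector scores them; the decision maker picks an approximate minimizer.
DECISIONS = [(1.0, 0.0), (0.0, 1.0), (0.7, 0.4), (0.4, 0.8)]
C_TRUE = (1.0, 2.0)  # hidden ground truth, used only to simulate data

def cost(c, x):
    return c[0] * x[0] + c[1] * x[1]

def observe():
    """Simulate one observed decision under a noisy perception of C_TRUE."""
    c = (C_TRUE[0] + random.gauss(0, 0.3), C_TRUE[1] + random.gauss(0, 0.3))
    return min(DECISIONS, key=lambda x: cost(c, x)), c

data = [observe() for _ in range(200)]

# Point estimate of c: average of the perceived costs (a stand-in for an
# inverse-optimization fit, which would use the decisions alone).
c_hat = tuple(sum(c[i] for _, c in data) / len(data) for i in range(2))

# Conformal-style calibration: radius = 90th percentile of ||c - c_hat||_inf.
scores = sorted(max(abs(c[i] - c_hat[i]) for i in range(2)) for _, c in data)
r = scores[int(0.9 * len(scores))]

# Robust recommendation: minimize worst-case cost over the box around c_hat.
def worst_case(x):
    return max(cost((c_hat[0] + s0 * r, c_hat[1] + s1 * r), x)
               for s0 in (-1, 1) for s1 in (-1, 1))

x_robust = min(DECISIONS, key=worst_case)
print(x_robust)
```

The robust step guards the recommendation against estimation error: even if the point estimate is off, the prescribed decision performs well for every cost vector in the calibrated set, which is the solution-quality guarantee the abstract refers to.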
On Dynamic Programming Decompositions of Static Risk Measures in Markov Decision Processes
Optimizing static risk-averse objectives in Markov decision processes is
difficult because they do not admit standard dynamic programming equations
common in Reinforcement Learning (RL) algorithms. Dynamic programming
decompositions that augment the state space with discrete risk levels have
recently gained popularity in the RL community. Prior work has shown that these
decompositions are optimal when the risk level is discretized sufficiently.
However, we show that these popular decompositions for
Conditional-Value-at-Risk (CVaR) and Entropic-Value-at-Risk (EVaR) are
inherently suboptimal regardless of the discretization level. In particular, we
show that a saddle point property assumed to hold in prior literature may be
violated. However, a decomposition does hold for Value-at-Risk and our proof
demonstrates how this risk measure differs from CVaR and EVaR. Our findings are
significant because risk-averse algorithms are used in high-stakes
environments, making their correctness much more critical.
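For reference, Value-at-Risk and Conditional-Value-at-Risk can be computed empirically as follows. Quantile conventions vary, and this sample-based sketch is an illustration of the risk measures themselves, not the distributional MDP decompositions analyzed in the paper.

```python
import math

def var(sample, alpha):
    """Value-at-Risk at level alpha for losses: the alpha-quantile
    (here: the smallest order statistic covering an alpha fraction)."""
    s = sorted(sample)
    idx = min(int(math.ceil(alpha * len(s))) - 1, len(s) - 1)
    return s[max(idx, 0)]

def cvar(sample, alpha):
    """Conditional Value-at-Risk: mean loss in the worst (1 - alpha) tail."""
    s = sorted(sample)
    cut = int(math.floor(alpha * len(s)))
    tail = s[cut:]
    return sum(tail) / len(tail)

losses = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0, 9.0, 10.0]
print(var(losses, 0.9))   # 9.0
print(cvar(losses, 0.9))  # 10.0
```

CVaR always weakly dominates VaR at the same level, since it averages over the tail beyond the quantile; the paper's argument turns on how these tail-averaging measures behave under state augmentation, where VaR decomposes but CVaR and EVaR do not.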
The Marine Microbial Eukaryote Transcriptome Sequencing Project (MMETSP): illuminating the functional diversity of eukaryotic life in the oceans through transcriptome sequencing
Current sampling of genomic sequence data from eukaryotes is relatively poor, biased, and inadequate to address important questions about their biology, evolution, and ecology; this Community Page describes a resource of 700 transcriptomes from marine microbial eukaryotes to help understand their role in the world's oceans.
The 3rd DBCLS BioHackathon: improving life science data integration with Semantic Web technologies.
BACKGROUND: BioHackathon 2010 was the third in a series of meetings hosted by the Database Center for Life Sciences (DBCLS) in Tokyo, Japan. The overall goal of the BioHackathon series is to improve the quality and accessibility of life science research data on the Web by bringing together representatives from public databases, analytical tool providers, and cyber-infrastructure researchers to jointly tackle important challenges in the area of in silico biological research. RESULTS: The theme of BioHackathon 2010 was the 'Semantic Web', and all attendees gathered with the shared goal of producing Semantic Web data from their respective resources, and/or consuming or interacting with those data using their tools and interfaces. We discussed topics including guidelines for designing semantic data and the interoperability of resources, and we developed tools and clients for analysis and visualization. CONCLUSION: We provide a meeting report from BioHackathon 2010, in which we describe the discussions, decisions, and breakthroughs made as we moved towards compliance with Semantic Web technologies - from source provider, through middleware, to the end-consumer.
