Social simulations: improving interdisciplinary understanding of scientific positioning and validity
Because of features that appear to be inherent in many social systems, modellers face complicated and subjective choices in positioning the scientific contribution of their research. This leads to a diversity of approaches and terminology, making interdisciplinary assessment of models highly problematic. Such modellers ideally need some kind of accessible, interdisciplinary framework to better understand and assess these choices. Existing texts tend either to take a specialised metaphysical approach or to focus on more pragmatic aspects such as the simulation process or descriptive protocols for how to present such research. Without a sufficiently neutral treatment of why a particular set of methods and style of model might be chosen, these choices can become entwined with the ideological and terminological baggage of a particular discipline. This paper attempts to provide such a framework. We begin with an epistemological model, which gives a standardised view on the types of validation available to the modeller, and their impact on scientific value. This is followed by a methodological framework, presented as a taxonomy of the key dimensions over which approaches are ultimately divided. Rather than working top-down from philosophical principles, we characterise the issues as a practitioner would see them. We believe that such a characterisation can be done 'well enough', where 'well enough' represents a common frame of reference for all modellers, which nevertheless respects the essence of the debate's subtleties and can be accepted as such by a majority of 'methodologists'. We conclude by discussing the limitations of such an approach, and potential further work for such a framework to be absorbed into existing, descriptive protocols and general social simulation texts.
Algebraic tools for dealing with the atomic shell model. I. Wavefunctions and integrals for hydrogen-like ions
Today, the 'hydrogen atom model' is known to play its role not only in teaching the basic elements of quantum mechanics but also in building up effective theories in atomic and molecular physics, quantum optics, plasma physics, or even in the design of semiconductor devices. Therefore, the analytical as well as numerical solutions of the hydrogen-like ions are frequently required, both for analyzing experimental data and for carrying out quite advanced theoretical studies. In order to support fast and consistent access to these (Coulomb-field) solutions, here we present the Dirac program, which was originally developed for studying the properties and dynamical behaviour of (hydrogen-like) ions. In the present version, a set of Maple procedures is provided for the Coulomb wave and Green's functions, applying the (wave) equations of both the nonrelativistic and the relativistic theory. Apart from the interactive access to these functions, a number of radial integrals are also implemented in the Dirac program, which may help the user to construct transition amplitudes and cross sections as they occur frequently in the theory of ion-atom and ion-photon collisions.

Comment: 23 pages, 1 figure.
Missing data in trial-based cost-effectiveness analysis: An incomplete journey.
Cost-effectiveness analyses (CEA) conducted alongside randomised trials provide key evidence for informing healthcare decision making, but missing data pose substantial challenges. Recently, there have been a number of developments in methods and guidelines addressing missing data in trials. However, it is unclear whether these developments have permeated CEA practice. This paper critically reviews the extent of and methods used to address missing data in recently published trial-based CEA. Issues of the Health Technology Assessment journal from 2013 to 2015 were searched. Fifty-two eligible studies were identified. Missing data were very common; the median proportion of trial participants with complete cost-effectiveness data was 63% (interquartile range: 47%-81%). The most common approach for the primary analysis was to restrict analysis to those with complete data (43%), followed by multiple imputation (30%). Half of the studies conducted some sort of sensitivity analyses, but only 2 (4%) considered possible departures from the missing-at-random assumption. Further improvements are needed to address missing data in cost-effectiveness analyses conducted alongside randomised trials. These should focus on limiting the extent of missing data, choosing an appropriate method for the primary analysis that is valid under contextually plausible assumptions, and conducting sensitivity analyses to departures from the missing-at-random assumption.
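To make the recommended workflow concrete, the following is a minimal sketch (not taken from the review) of multiple imputation for trial cost-effectiveness data combined with a simple delta-adjustment sensitivity analysis for departures from missing-at-random. The column names (arm, cost, qaly), the willingness-to-pay threshold, and the delta values are illustrative assumptions.

```python
# A minimal sketch (not from the review) of multiple imputation for trial-based
# CEA data with a delta-adjustment sensitivity analysis for MNAR departures.
# Column names (arm, cost, qaly), the willingness-to-pay threshold and the delta
# values are illustrative assumptions.
import numpy as np
import pandas as pd
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer


def pooled_incremental_net_benefit(df, m=20, wtp=20_000, delta=0.0, seed=0):
    """Impute missing costs/QALYs m times, shift imputed QALYs by `delta`
    (a departure from missing-at-random), and pool the incremental net
    benefit with Rubin's rules. Returns (estimate, standard error)."""
    missing_qaly = df["qaly"].isna().to_numpy()
    estimates, variances = [], []
    for i in range(m):
        imputer = IterativeImputer(sample_posterior=True, random_state=seed + i)
        completed = pd.DataFrame(
            imputer.fit_transform(df[["arm", "cost", "qaly"]]),
            columns=["arm", "cost", "qaly"],
        )
        # Delta adjustment: assume unobserved QALYs are systematically lower
        # (or higher) than the missing-at-random imputations predict.
        completed.loc[missing_qaly, "qaly"] += delta
        completed["nb"] = wtp * completed["qaly"] - completed["cost"]
        by_arm = completed.groupby("arm")["nb"]
        inb = by_arm.mean().diff().iloc[-1]            # intervention minus control
        var = (by_arm.var() / by_arm.count()).sum()    # variance of that difference
        estimates.append(inb)
        variances.append(var)
    pooled = np.mean(estimates)                        # Rubin's rules pooling
    within = np.mean(variances)
    between = np.var(estimates, ddof=1)
    return pooled, np.sqrt(within + (1 + 1 / m) * between)
```

Re-running the function over a range of delta values (for example 0 to -0.1 QALYs for unobserved outcomes) then shows how robust the base-case conclusion is to plausible missing-not-at-random scenarios.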
Primary Structure and Catalytic Mechanism of the Epoxide Hydrolase from Agrobacterium radiobacter AD1
The epoxide hydrolase gene from Agrobacterium radiobacter AD1, a bacterium that is able to grow on epichlorohydrin as the sole carbon source, was cloned by means of the polymerase chain reaction with two degenerate primers based on the N-terminal and C-terminal sequences of the enzyme. The epoxide hydrolase gene coded for a protein of 294 amino acids with a molecular mass of 34 kDa. An identical epoxide hydrolase gene was cloned from chromosomal DNA of the closely related strain A. radiobacter CFZ11. The recombinant epoxide hydrolase was expressed at levels of up to 40% of the total cellular protein content in Escherichia coli BL21(DE3), and the purified enzyme had a kcat of 21 s⁻¹ with epichlorohydrin. Amino acid sequence similarity of the epoxide hydrolase with eukaryotic epoxide hydrolases, haloalkane dehalogenase from Xanthobacter autotrophicus GJ10, and bromoperoxidase A2 from Streptomyces aureofaciens indicated that it belonged to the α/β-hydrolase fold family. This conclusion was supported by secondary structure predictions and analysis of the secondary structure with circular dichroism spectroscopy. The catalytic triad residues of epoxide hydrolase are proposed to be Asp107, His275, and Asp246. Replacement of these residues with Ala/Glu, Arg/Gln, and Ala, respectively, resulted in a dramatic loss of activity for epichlorohydrin. The reaction mechanism of epoxide hydrolase proceeds via a covalently bound ester intermediate, as was shown by single turnover experiments with the His275 → Arg mutant of epoxide hydrolase in which the ester intermediate could be trapped.
Black Hole Spin via Continuum Fitting and the Role of Spin in Powering Transient Jets
The spins of ten stellar black holes have been measured using the continuum-fitting method. These black holes are located in two distinct classes of X-ray binary systems, one that is persistently X-ray bright and another that is transient. Both the persistent and transient black holes remain for long periods in a state where their spectra are dominated by a thermal accretion disk component. The spin of a black hole of known mass and distance can be measured by fitting this thermal continuum spectrum to the thin-disk model of Novikov and Thorne; the key fit parameter is the radius of the inner edge of the black hole's accretion disk. Strong observational and theoretical evidence links the inner-disk radius to the radius of the innermost stable circular orbit, which is trivially related to the dimensionless spin parameter a_* of the black hole (|a_*| < 1). The ten spins that have so far been measured by this continuum-fitting method range widely from a_* ≈ 0 to a_* > 0.95. The robustness of the method is demonstrated by the dozens or hundreds of independent and consistent measurements of spin that have been obtained for several black holes, and through careful consideration of many sources of systematic error. Among the results discussed is a dichotomy between the transient and persistent black holes; the latter have higher spins and larger masses. Also discussed is recently discovered evidence in the transient sources for a correlation between the power of ballistic jets and black hole spin.

Comment: 30 pages. Accepted for publication in Space Science Reviews. Also to appear in hard cover in the Space Sciences Series of ISSI "The Physics of Accretion onto Black Holes" (Springer Publisher). Changes to Sections 5.2, 6.1 and 7.4. Section 7.4 responds to Russell et al. 2013 (MNRAS, 431, 405), who find no evidence for a correlation between the power of ballistic jets and black hole spin.
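The "trivial" relation referred to above is the standard Bardeen-Press-Teukolsky expression for the ISCO radius of a Kerr black hole, a textbook result quoted here for convenience rather than reproduced from the paper. In units where G = c = 1:

```latex
% ISCO radius of a Kerr black hole of mass M and spin a_*; the upper sign is for
% prograde (co-rotating) orbits, the lower sign for retrograde orbits.
\[
  r_{\rm ISCO} = M\left[\,3 + Z_2 \mp \sqrt{(3 - Z_1)(3 + Z_1 + 2Z_2)}\,\right],
\]
\[
  Z_1 = 1 + \left(1 - a_*^2\right)^{1/3}
        \left[(1 + a_*)^{1/3} + (1 - a_*)^{1/3}\right], \qquad
  Z_2 = \sqrt{3a_*^2 + Z_1^2}.
\]
% r_ISCO runs from 6M at a_* = 0 down to M for a maximally spinning (a_* = 1)
% prograde disk, which is what makes the inner-disk radius a spin diagnostic.
```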
Grain Surface Models and Data for Astrochemistry
The cross-disciplinary field of astrochemistry exists to understand the formation, destruction, and survival of molecules in astrophysical environments. Molecules in space are synthesized via a large variety of gas-phase reactions and reactions on dust-grain surfaces, where the surface acts as a catalyst. A broad consensus has been reached in the astrochemistry community on how to suitably treat gas-phase processes in models, and also on how to present the necessary reaction data in databases; however, no such consensus has yet been reached for grain-surface processes. A team of ∼25 experts covering observational, laboratory and theoretical (astro)chemistry met in the summer of 2014 at the Lorentz Center in Leiden with the aim of providing solutions for this problem and reviewing the current state of the art of grain-surface models, both in terms of technical implementation into models and the most up-to-date information available from experiments and chemical computations. This review builds on the results of this workshop and gives an outlook for future directions.
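As context for what "grain-surface models" typically compute, a widely used rate-equation treatment (sketched here in generic form as an assumption about common practice, not as the workshop's recommendation) evolves the mean surface population N_i of each species per grain:

```latex
% Generic rate-equation form for grain-surface chemistry: accretion from the gas,
% thermal desorption, and diffusive (Langmuir-Hinshelwood) surface reactions.
\[
  \frac{dN_i}{dt} = k_{{\rm acc},i}\, n_{{\rm gas},i}
    - k_{{\rm des},i}\, N_i
    + \sum_{j,k \to i} k_{jk} N_j N_k
    - N_i \sum_{j} k_{ij} N_j ,
\]
\[
  k_{{\rm des},i} = \nu_{0,i}\, e^{-E_{{\rm b},i}/T_{\rm dust}}, \qquad
  k_{ij} = \kappa_{ij}\,\frac{R_{{\rm hop},i} + R_{{\rm hop},j}}{N_{\rm sites}},
  \qquad R_{{\rm hop},i} = \nu_{0,i}\, e^{-E_{{\rm diff},i}/T_{\rm dust}} ,
\]
% where nu_0 is a characteristic vibrational frequency, E_b and E_diff are binding
% and diffusion barriers, kappa_ij is a reaction probability, and N_sites is the
% number of binding sites per grain.
```

Much of the lack of consensus the review addresses concerns, at least in part, how the parameters entering such expressions are chosen and implemented.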
Digital technology and governance in transition: The case of the British Library
Commentary on the organizational consequences of the new information and communications technologies (ICTs) is pervaded by a powerful imagery of disaggregation and a tendency for 'virtual' forms of production to be seen as synonymous with the 'end' of bureaucracy. This paper questions the underlying assumptions of the 'virtual organization', highlighting the historically enduring, diversified character of the bureaucratic form. The paper then presents case study findings on the web-based access to information resources now being provided by the British Library (BL). The case study evidence produces two main findings. First, radically decentralised virtual forms of service delivery are heavily dependent on new forms of capacity-building and information aggregation. Second, digital technology is embedded in an inherently contested and contradictory context of institutional change. Current developments in the management and control of digital rights are consistent with the commodification of the public sphere. However, the evidence also suggests that scholarly access to information resources is being significantly influenced by the 'information society' objectives of the BL and other institutional players within the network of UK research libraries.
An Integrated TCGA Pan-Cancer Clinical Data Resource to Drive High-Quality Survival Outcome Analytics
For a decade, The Cancer Genome Atlas (TCGA) program collected clinicopathologic annotation data along with multi-platform molecular profiles of more than 11,000 human tumors across 33 different cancer types. TCGA clinical data contain key features representing the democratized nature of the data collection process. To ensure proper use of this large clinical dataset associated with genomic features, we developed a standardized dataset named the TCGA Pan-Cancer Clinical Data Resource (TCGA-CDR), which includes four major clinical outcome endpoints. In addition to detailing major challenges and statistical limitations encountered during the effort of integrating the acquired clinical data, we present a summary that includes endpoint usage recommendations for each cancer type. These TCGA-CDR findings appear to be consistent with cancer genomics studies independent of the TCGA effort and provide opportunities for investigating cancer biology using clinical correlates at an unprecedented scale. Analysis of clinicopathologic annotations for over 11,000 cancer patients in the TCGA program leads to the generation of the TCGA Clinical Data Resource, which provides recommendations for clinical outcome endpoint usage for 33 cancer types.
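As a usage illustration only (the abstract does not specify the resource's file layout or column names, so those below are assumptions), a standardized endpoint table of this kind lends itself to straightforward survival analysis, for example Kaplan-Meier curves per cancer type with the lifelines package:

```python
# Minimal sketch: Kaplan-Meier overall-survival curves per cancer type from a
# standardized clinical endpoint table. Column names ("cancer_type", "OS_time",
# "OS_event") and the file name are illustrative assumptions, not the TCGA-CDR schema.
import pandas as pd
import matplotlib.pyplot as plt
from lifelines import KaplanMeierFitter

clinical = pd.read_csv("clinical_endpoints.csv")  # one row per patient

ax = plt.subplot(111)
for cancer_type, group in clinical.groupby("cancer_type"):
    # Drop patients with no usable endpoint; the paper's recommendations would
    # dictate which endpoint is appropriate for each cancer type.
    group = group.dropna(subset=["OS_time", "OS_event"])
    kmf = KaplanMeierFitter()
    kmf.fit(durations=group["OS_time"], event_observed=group["OS_event"],
            label=cancer_type)
    kmf.plot_survival_function(ax=ax, ci_show=False)

ax.set_xlabel("Time (days)")
ax.set_ylabel("Survival probability")
plt.tight_layout()
plt.show()
```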
Pan-Cancer Analysis of lncRNA Regulation Supports Their Targeting of Cancer Genes in Each Tumor Context
Long noncoding RNAs (lncRNAs) are commonly dysregulated in tumors, but only a handful are known to play pathophysiological roles in cancer. We inferred lncRNAs that dysregulate cancer pathways, oncogenes, and tumor suppressors (cancer genes) by modeling their effects on the activity of transcription factors, RNA-binding proteins, and microRNAs in 5,185 TCGA tumors and 1,019 ENCODE assays. Our predictions included hundreds of candidate onco- and tumor-suppressor lncRNAs (cancer lncRNAs) whose somatic alterations account for the dysregulation of dozens of cancer genes and pathways in each of 14 tumor contexts. To demonstrate proof of concept, we showed that perturbations targeting OIP5-AS1 (an inferred tumor suppressor) and TUG1 and WT1-AS (inferred onco-lncRNAs) dysregulated cancer genes and altered proliferation of breast and gynecologic cancer cells. Our analysis indicates that, although most lncRNAs are dysregulated in a tumor-specific manner, some, including OIP5-AS1, TUG1, NEAT1, MEG3, and TSIX, synergistically dysregulate cancer pathways in multiple tumor contexts.
