Nonparametric estimation of concave production technologies by entropic methods
An econometric methodology is developed for nonparametric estimation of concave production technologies. The methodology, based on the principle of maximum likelihood, uses entropic distance and convex programming techniques to estimate production functions.
Keywords: convex programming, production functions, entropy
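The abstract does not spell out the estimator, but the general idea of imposing concavity through convex programming can be illustrated. The sketch below fits a concave, monotone production function to synthetic single-input data with cvxpy, using Afriat-style hypograph constraints; it substitutes a least-squares objective for the paper's maximum-likelihood/entropic-distance criterion, so it illustrates the constraint structure rather than the authors' actual method.

```python
# Minimal sketch (not the paper's entropic-distance estimator): fit a concave,
# monotone production function to (x, y) data via convex programming, with a
# least-squares objective standing in for the maximum-likelihood criterion.
import cvxpy as cp
import numpy as np

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(1, 10, 40))          # single input
y = 3 * np.sqrt(x) + rng.normal(0, 0.3, 40)  # noisy concave "production" output

f = cp.Variable(40)      # fitted output levels f(x_i)
g = cp.Variable(40)      # supergradients at each x_i

# Concavity (Afriat-style) constraints: f_j <= f_i + g_i * (x_j - x_i) for all i, j
constraints = [f[j] <= f[i] + g[i] * (x[j] - x[i])
               for i in range(40) for j in range(40) if i != j]
constraints += [g >= 0]  # monotonically increasing technology

problem = cp.Problem(cp.Minimize(cp.sum_squares(y - f)), constraints)
problem.solve()
print("fitted values:", np.round(f.value, 2))
```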
QPLEX: A Computational Modeling and Analysis Methodology for Stochastic Systems
This book introduces QPLEX, a powerful computational framework designed for modeling and analyzing nonstationary stochastic systems with large state spaces. The methodology excels at rapidly and accurately generating approximate distributions of system performance over time, offering a robust tool for understanding the dynamics of such systems. QPLEX circumvents the curse of dimensionality by imposing conditional independence, which may be represented via a probabilistic graphical model, and exploiting model dynamics. It is specifically crafted for transient analysis of nonstationary systems, often encountered in practical applications but rarely addressed by traditional techniques. It can work directly with empirical distributions and requires no stability assumptions. Since its output is not noisy, QPLEX is tailor-made for sensitivity analysis and optimization. The methodology’s few model primitives are flexible enough to specify a rich array of models. For example, models representing queueing networks can exhibit challenging characteristics such as short operational horizons; time-varying arrival rates, service durations, and numbers of servers; and complex routing of entities. The text is accessible to those with engineering, computer science, or mathematics backgrounds and knowledge of probability and stochastic models at the advanced undergraduate level. Many fully worked-out examples aid the comprehension of the concepts and calculations, ensuring readers can effectively apply the methods to real-world systems and making this book a valuable resource for researchers and practitioners alike. This is an open access book
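QPLEX's own model primitives are not reproduced in this listing. As a point of reference for the kind of output the book targets, the sketch below computes the transient queue-length distribution of a small queue with a time-varying arrival rate by integrating the Kolmogorov forward equations on a truncated state space; the rates, horizon, and truncation level are arbitrary illustration values.

```python
# Illustrative only: not QPLEX itself. Compute the transient queue-length
# distribution of a small Mt/M/s queue by integrating the Kolmogorov forward
# equations on a truncated state space {0, ..., N}.
import numpy as np
from scipy.integrate import solve_ivp

N = 60                                   # state-space truncation
mu, servers = 1.0, 5                     # service rate per server, number of servers
lam = lambda t: 3.0 + 2.0 * np.sin(t)    # time-varying arrival rate

def forward_equations(t, p):
    dp = np.zeros_like(p)
    for n in range(N + 1):
        rate_in = lam(t) * (p[n - 1] if n > 0 else 0.0) \
                + mu * min(n + 1, servers) * (p[n + 1] if n < N else 0.0)
        rate_out = (lam(t) if n < N else 0.0) + mu * min(n, servers)
        dp[n] = rate_in - rate_out * p[n]
    return dp

p0 = np.zeros(N + 1); p0[0] = 1.0        # system starts empty
sol = solve_ivp(forward_equations, (0.0, 10.0), p0, t_eval=[2.0, 5.0, 10.0])
print("P(queue length <= 10) at t = 10:", sol.y[:11, -1].sum())
```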
Task workflow design and its impact on performance and volunteers' subjective preference in virtual citizen science
Virtual citizen science platforms allow non-scientists to take part in scientific research across a range of disciplines. What they ask of volunteers varies considerably in terms of task type, variety, required user judgement and user freedom, yet these design choices have received little direct investigation. A study was performed with the Planet Four: Craters project to investigate the effect of task workflow design on both volunteer experience and the scientific results volunteers produce. Participants' questionnaire responses indicated a preference for interfaces providing greater autonomy and variety, with free-text responses suggesting that autonomy was the more important of the two. This did not translate into improved performance, however: the most autonomous interface did not yield significantly better data volume, agreement or accuracy than the less autonomous interfaces. The interface with the fewest task types, least variety and least autonomy resulted in the greatest data coverage. Agreement, both between participants and with the expert equivalent, was significantly improved when the interface most directly afforded tasks that captured the required underlying data (i.e. crater position or diameter). The implication for the designers of virtual citizen science platforms is that they have a balancing act to perform, weighing the importance of user satisfaction, the data needs of the science case and the resources that can be committed, both in terms of time and of data reduction
The burden of TTN variants in the genomic era: analysis of 18,462 individuals from the Solve-RD consortium and general recommendations
Purpose:
Titin, the largest protein in the human body, has been associated with several disease phenotypes caused by variants in the TTN gene. With around 20% of the population carrying a rare TTN variant and over 60 million genomes expected to have been sequenced worldwide by 2025, interpreting these findings presents major challenges. This study analyzed TTN variants in the Solve-RD cohort, the European network for unsolved rare disease cases.
Methods:
We collected data from 11,072 individuals with suspected rare diseases and 7,390 healthy relatives from the Solve-RD consortium, checking and manually reviewing TTN variants. We then used a filtering approach focused on clinical relevance, and we provided updated recommendations based on recent literature.
Results:
Among the cohort, 240 individuals (1.3%) carried at least one heterozygous TTN truncating variant (TTNtv), with a 3.8% prevalence in the neuromuscular subgroup, primarily composed of unsolved cases. Four individuals received a titinopathy diagnosis. Additionally, 99 participants (0.5%) had a TTNtv in a high-cardiac-PSI exon (>80%), and four had an overt cardiomyopathy.
Conclusion:
This study highlights the need for a standardized approach to TTN variants and for investigation of the missing heritability in myopathic individuals with heterozygous TTNtv. Establishing consensus on PSI-based thresholds will be essential for assessing cardiac risk and guiding the management of asymptomatic individuals
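The consortium's actual pipeline and file formats are not given in the abstract; the sketch below only illustrates the kind of PSI-based filtering it describes, keeping truncating variants that fall in exons with cardiac PSI above 80%. The column names and example records are hypothetical.

```python
# A minimal sketch of the PSI-based filtering described above; the column names
# and records are hypothetical, and the 80% cut-off follows the abstract, not
# the consortium's actual pipeline.
import pandas as pd

variants = pd.DataFrame({
    "sample_id":   ["P1", "P2", "P3", "P4"],
    "consequence": ["stop_gained", "missense_variant", "frameshift_variant", "stop_gained"],
    "cardiac_psi": [95.0, 99.0, 45.0, 82.5],   # percent-spliced-in of the affected exon in heart
})

truncating = {"stop_gained", "frameshift_variant",
              "splice_acceptor_variant", "splice_donor_variant"}

# Keep truncating variants falling in high-cardiac-PSI exons (>80%), the
# subgroup flagged in the Results section for cardiac follow-up.
flagged = variants[variants["consequence"].isin(truncating) & (variants["cardiac_psi"] > 80)]
print(flagged)
```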
DESCANT and β-Delayed Neutron Measurements at TRIUMF
The DESCANT array (Deuterated Scintillator Array for Neutron Tagging) consists of up to 70 detectors, each filled with approximately 2 liters of deuterated benzene. This scintillator material offers pulse-shape discrimination (PSD) capabilities to distinguish between neutrons and γ-rays interacting with the scintillator material. In addition, the anisotropic nature of n – d scattering allows for the determination of the neutron energy spectrum directly from the pulse height spectrum, complementing the traditional time-of-flight (ToF) information. DESCANT can be coupled either to the TIGRESS (TRIUMF-ISAC Gamma-Ray Escape Suppressed Spectrometer) γ-ray spectrometer [1] located in the ISAC-II [2] hall of TRIUMF for in-beam experiments, or to the GRIFFIN (Gamma-Ray Infrastructure For Fundamental Investigations of Nuclei) γ-ray spectrometer [3] located in the ISAC-I hall of TRIUMF for decay spectroscopy experiments
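DESCANT's actual PSD analysis parameters are not stated in the abstract; the sketch below shows a generic tail-to-total charge-comparison ratio on a digitized scintillator pulse, a standard quantity used to separate neutron from γ-ray events. The window lengths and the toy pulse shape are placeholders, not detector calibration values.

```python
# Illustrative only: a generic tail-to-total charge-comparison PSD ratio on a
# digitized scintillator pulse. Window lengths are placeholders, not DESCANT's
# actual analysis parameters.
import numpy as np

def psd_ratio(waveform, baseline_samples=20, tail_start=30):
    """Return the tail-to-total integral ratio of a baseline-subtracted pulse."""
    pulse = waveform - waveform[:baseline_samples].mean()   # subtract baseline
    total = pulse.sum()
    tail = pulse[tail_start:].sum()                         # slow-decay component
    return tail / total if total > 0 else 0.0

# Neutron-induced recoils produce more delayed light than gamma rays, so a larger
# tail fraction suggests a neutron; the cut value would be calibrated per detector.
toy_pulse = np.concatenate([np.zeros(20), np.exp(-np.arange(200) / 15.0)])
print("PSD ratio:", round(psd_ratio(toy_pulse), 3))
```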
Ground-State and Pairing-Vibrational Bands with Equal Quadrupole Collectivity in 124Xe
The nuclear structure of 124Xe has been investigated via measurements of the β+/EC decay of 124Cs with the 8π γ-ray spectrometer at the TRIUMF-ISAC facility. …
For the remainder of this abstract, please visit: http://dx.doi.org/10.1103/PhysRevC.91.04432
Two-Neutron Transfer Reaction Mechanisms in 12C(6He, 4He)14C using a Realistic Three-Body 6He Model
The reaction mechanisms of the two-neutron transfer reaction 12C(6He,4He)14C have been studied at Elab=30 MeV at the TRIUMF ISAC-II facility using the Silicon Highly-segmented Array for Reactions and Coulex (SHARC) charged-particle detector array. Optical potential parameters have been extracted from the analysis of the elastic scattering angular distribution. The new potential has been applied to the study of the transfer angular distribution to the second 2+ state at 8.32 MeV in 14C, using a realistic three-body 6He model and advanced shell-model calculations for the carbon structure, allowing the relative contributions of the simultaneous and sequential two-neutron transfer to be calculated. The reaction model provides a good description of the 30-MeV data set and shows that the simultaneous process is the dominant transfer mechanism. Sensitivity tests show that the final results can be considerably affected by the choice of optical potentials. A reanalysis of data measured previously at Elab=18 MeV, however, is not as well described by the same reaction model, suggesting that higher-order effects in the reaction mechanism need to be included
(Re)defining salesperson motivation: current status, main challenges, and research directions
The construct of motivation is one of the central themes in selling and sales management research. Yet, to date, no review article exists that surveys the construct (from both an extrinsic and an intrinsic motivation perspective), critically evaluates its current status, examines key challenges apparent in the extant research, and suggests new research opportunities based on a thorough review of past work. The authors explore how motivation is defined, the major theories underpinning motivation, how motivation has historically been measured, and the key methodologies used over time. In addition, attention is given to principal drivers and outcomes of salesperson motivation. A summarizing appendix of key articles in salesperson motivation is provided
Perinatal and Socioeconomic Risk Factors for Variable and Persistent Cognitive Delay at 24 and 48 Months of Age in a National Sample
The objective of this paper is to examine patterns of cognitive delay at 24 and 48 months and quantify the effects of perinatal and sociodemographic risk factors on persistent and variable cognitive delay. Using data from 7,200 children in the Early Childhood Longitudinal Study, Birth Cohort (ECLS-B), multiple logistic regression models identified significant predictors of low cognitive functioning at 24 and 48 months. Additional multiple logistic models predicting cognitive delay at 48 months were estimated separately for children with and without delay at 24 months. Of the nearly 1,000 children delayed at 24 months, 24.2% remained delayed by 48 months; 7.9% of the children not delayed at 24 months exhibited delay at 48 months. Low and very low birthweight increased cognitive delay risk at 24, but not 48 months. Low maternal education had a strongly increasing effect (OR = 2.3 at 24 months, OR = 13.7 at 48 months), as did low family income (OR = 1.4 at 24 months, OR = 7.0 at 48 months). Among children delayed at 24 months, low maternal education predicted delay even more strongly at 48 months (OR = 30.5). Low cognitive functioning is highly dynamic from 24 to 48 months. Although gestational factors including low birthweight increase children’s risk of cognitive delay at 24 months, low maternal education and family income are more prevalent in the pediatric population and are much stronger predictors of both persistent and emerging delay between ages 24 and 48 months
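The ECLS-B data and the authors' exact model specification are not reproduced here; the sketch below fits a multiple logistic regression of the same general form on synthetic data with hypothetical variable names and exponentiates the coefficients, to show how odds ratios of the kind reported above are computed.

```python
# A hedged sketch of the kind of multiple logistic regression reported above,
# run on synthetic data with hypothetical variable names -- not the ECLS-B data
# or the authors' exact model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "low_birthweight":   rng.binomial(1, 0.08, n),
    "low_maternal_educ": rng.binomial(1, 0.15, n),
    "low_family_income": rng.binomial(1, 0.20, n),
})
# Synthetic outcome generated from an assumed logistic model
logit = -2.5 + 0.6 * df.low_birthweight + 1.2 * df.low_maternal_educ + 0.5 * df.low_family_income
df["delayed_48m"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

model = smf.logit("delayed_48m ~ low_birthweight + low_maternal_educ + low_family_income",
                  data=df).fit()
print(np.exp(model.params))   # exponentiated coefficients are the odds ratios
```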
Correction to: Solving patients with rare diseases through programmatic reanalysis of genome-phenome data
In the original publication of this article, the consortium author lists were missing
