
    Statistical Agent Based Modelization of the Phenomenon of Drug Abuse

    Get PDF
    We introduce a statistical agent-based model to describe the phenomenon of drug abuse and its dynamical evolution at the individual and global level. The agents are heterogeneous with respect to their intrinsic inclination to drugs, their budget attitude and their social environment. The various levels of drug use were inspired by the professional description of the phenomenon, which permits a direct comparison with all available data. We show that certain elements are of great importance in starting drug use, for example rare events in personal experience that allow an individual to overcome the barrier to occasional drug use. Analyzing how the system reacts to perturbations is very important for understanding its key elements, and it provides strategies for effective policy making. The present model represents the first step towards a realistic description of this phenomenon and can easily be generalized in various directions. Comment: 12 pages, 5 figures
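
    A minimal sketch of the kind of agent-based dynamics described above is given below. The agent attributes (inclination, budget attitude, social pressure), the rare-event probability and the escalation rule are illustrative assumptions for demonstration only, not the model used in the paper.

```python
import random

# Illustrative agent-based sketch (not the paper's actual model): each agent has
# an intrinsic inclination to drugs, a budget attitude, and feels a social
# pressure proportional to the current prevalence of use; a rare random "life
# event" can push a non-user past the barrier to occasional use.
class Agent:
    def __init__(self):
        self.inclination = random.random()   # intrinsic inclination to drugs
        self.budget = random.random()        # budget attitude
        self.use_level = 0                   # 0 = non-user, 1 = occasional, 2 = regular

    def step(self, social_pressure, rare_event_prob=0.01):
        if self.use_level == 0:
            barrier = 1.0 - self.inclination
            push = social_pressure + (1.0 if random.random() < rare_event_prob else 0.0)
            if push > barrier:               # a rare event can overcome the barrier
                self.use_level = 1
        elif self.use_level == 1 and self.inclination * self.budget > 0.5:
            self.use_level = 2               # escalation rule, purely illustrative

def simulate(n_agents=1000, n_steps=100):
    agents = [Agent() for _ in range(n_agents)]
    for _ in range(n_steps):
        prevalence = sum(a.use_level > 0 for a in agents) / n_agents
        for a in agents:
            a.step(social_pressure=prevalence)
    return [a.use_level for a in agents]

if __name__ == "__main__":
    levels = simulate()
    print({level: levels.count(level) for level in (0, 1, 2)})
```

    Because social pressure is coupled to prevalence in this toy version, rare events act as seeds whose effect is then amplified, which is the kind of sensitivity to perturbations the abstract refers to.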

    Statistical Properties of Height of Japanese Schoolchildren

    Full text link
    We study height distributions of Japanese schoolchildren based on statistical data obtained from the school health survey of the Ministry of Education, Culture, Sports, Science and Technology, Japan. Our analysis shows that the distribution of height changes from a lognormal distribution to a normal distribution during puberty. Comment: 2 pages, 2 figures, submitted to J. Phys. Soc. Jpn.; resubmitted to J. Phys. Soc. Jpn. after some revision
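
    A sketch of the kind of distributional comparison involved: fit both a normal and a log-normal model to a height sample and compare their log-likelihoods. The data below are synthetic, generated only for illustration; the paper analyses the school health survey data.

```python
import numpy as np
from scipy import stats

# Compare a normal and a log-normal fit to a synthetic sample of heights (cm).
rng = np.random.default_rng(0)
heights = rng.lognormal(mean=np.log(150.0), sigma=0.05, size=5000)

# Fit both candidate distributions and compare log-likelihoods.
mu, sd = stats.norm.fit(heights)
shape, loc, scale = stats.lognorm.fit(heights, floc=0)

ll_norm = stats.norm(mu, sd).logpdf(heights).sum()
ll_lognorm = stats.lognorm(shape, loc, scale).logpdf(heights).sum()
print(f"normal log-likelihood:     {ll_norm:.1f}")
print(f"log-normal log-likelihood: {ll_lognorm:.1f}")
```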

    Ground states of two-dimensional ±J Edwards-Anderson spin glasses

    Full text link
    We present an exact algorithm for finding all the ground states of the two-dimensional Edwards-Anderson ±J spin glass and characterize its performance. We investigate how the ground states change with increasing system size and with increasing antiferromagnetic bond ratio x. We find that some system properties show very large and strongly non-Gaussian variations between realizations. Comment: 15 pages, 21 figures, 2 tables, uses revtex4 macros
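
    For orientation, the sketch below enumerates all ground states of a tiny ±J lattice by brute force. It only illustrates the problem definition (finding every minimum-energy spin configuration for a given bond realization and antiferromagnetic bond ratio x); the paper's exact algorithm is far more sophisticated and scales to much larger systems.

```python
import itertools
import random

# Brute-force enumeration of ALL ground states of a tiny 2D +/-J Edwards-Anderson
# lattice with periodic boundaries (illustration only; exponential in system size).
L = 3                                    # 3x3 lattice, 9 spins -> 2^9 configurations
x = 0.5                                  # antiferromagnetic bond ratio
random.seed(1)

# bonds[((i, j), (k, l))] = +1 (ferromagnetic) or -1 (antiferromagnetic)
bonds = {}
for i in range(L):
    for j in range(L):
        for di, dj in ((0, 1), (1, 0)):
            ni, nj = (i + di) % L, (j + dj) % L
            bonds[((i, j), (ni, nj))] = -1 if random.random() < x else +1

def energy(spins):
    # H = -sum over nearest neighbours of J_ij * s_i * s_j
    return -sum(J * spins[a] * spins[b] for (a, b), J in bonds.items())

sites = [(i, j) for i in range(L) for j in range(L)]
configs = []
for assignment in itertools.product((-1, +1), repeat=len(sites)):
    spins = dict(zip(sites, assignment))
    configs.append((energy(spins), assignment))

e0 = min(e for e, _ in configs)
ground_states = [s for e, s in configs if e == e0]
print(f"ground-state energy: {e0}, degeneracy: {len(ground_states)}")
```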

    The normal distribution is not normal. Rethinking pays and rewards

    Get PDF

    Decellularization of pericardial tissue and its impact on tensile viscoelasticity and glycosaminoglycan content

    Get PDF
    Bovine pericardium is a collagenous tissue commonly used as a natural biomaterial in the fabrication of cardiovascular devices. For tissue engineering purposes, this xenogeneic biomaterial must be decellularized to remove cellular antigens. With this in mind, three decellularization protocols were compared in terms of their effectiveness in extracting cellular material, their effect on glycosaminoglycan (GAG) content and, finally, their effect on tensile biomechanical behavior. Tissue decellularization was achieved by treatment with t-octyl phenoxy polyethoxy ethanol (Triton X-100), tridecyl polyethoxy ethanol (ATE) or alkaline treatment, each followed by treatment with nucleases (DNase/RNase). The quantified residual DNA content (3.0 ± 0.4%, 4.4 ± 0.6% and 5.6 ± 0.7% for Triton X-100, ATE and alkaline treatment, respectively) and the absence of nuclear structures (hematoxylin and eosin staining) were indicators of effective cell removal. Likewise, it was found that the native tissue GAG content decreased to 61.6 ± 0.6%, 62.7 ± 1.1% and 88.6 ± 0.2% for Triton X-100, ATE and alkaline treatment, respectively. In addition, an alteration in the tissue stress relaxation characteristics was observed after alkaline treatment. We can conclude that the three decellularization agents preserved the collagen structural network, the anisotropy, and the tensile modulus, tensile strength and maximum strain at failure of the native tissue.

    Lasers and optics: Looking towards third generation gravitational wave detectors

    Get PDF
    Third generation terrestrial interferometric gravitational wave detectors will likely require significant advances in laser and optical technologies to reduce two of the main limiting noise sources: thermal noise due to mirror coatings and quantum noise arising from a combination of shot noise and radiation pressure noise. Increases in laser power and possible changes of the operational wavelength require new high power laser sources and new electro-optic modulators and Faraday isolators. Squeezed light can be used to further reduce the quantum noise, while nano-structured optical components can be used to reduce or eliminate mirror coating thermal noise as well as to implement all-reflective interferometer configurations that avoid thermal effects in mirror substrates. This paper is intended to give an overview of the current state of the art and future trends in these areas of ongoing research and development. NSF/PHY0555453, NSF/PHY0757968, NSF/PHY0653582, DFG/SFB/407, DFG/SFB/TR7, DFG/EXC/QUEST
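
    As a rough numerical illustration of the quantum-noise trade-off mentioned above, the sketch below uses simplified, textbook-style simple-Michelson expressions in which the shot-noise contribution falls as 1/√P while the radiation-pressure contribution grows as √P. The prefactors, test-mass value, wavelength and power values are placeholders, not figures from the paper.

```python
import numpy as np

# Approximate simple-Michelson displacement-noise scalings (illustration only;
# exact prefactors depend on the interferometer configuration): increasing the
# circulating power P trades shot noise against radiation-pressure noise.
hbar = 1.054571817e-34        # J*s
c = 2.998e8                   # m/s
wavelength = 1064e-9          # m, placeholder operating wavelength
m = 40.0                      # kg, placeholder test-mass scale
f = 100.0                     # Hz, measurement frequency
Omega = 2 * np.pi * f

for P in (10.0, 100.0, 1000.0):   # circulating power in W, placeholder values
    shot = np.sqrt(hbar * c * wavelength / (2 * np.pi * P))                    # ~ 1/sqrt(P)
    rad_pressure = np.sqrt(2 * np.pi * hbar * P / (c * wavelength)) / (m * Omega**2)  # ~ sqrt(P)
    print(f"P = {P:6.0f} W  shot ~ {shot:.2e} m/rtHz  rad. pressure ~ {rad_pressure:.2e} m/rtHz")
```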

    Using combined diagnostic test results to hindcast trends of infection from cross-sectional data

    Get PDF
    Infectious disease surveillance is key to limiting the consequences of infectious pathogens and to maintaining animal and public health. Following the detection of a disease outbreak, a response in proportion to the severity of the outbreak is required. It is thus critical to obtain accurate information concerning the origin of the outbreak and its forward trajectory. However, there is often a lack of situational awareness that may lead to over- or under-reaction. There is a widening range of tests available for detecting pathogens, with typically different temporal characteristics, e.g. in terms of when peak test response occurs relative to time of exposure. We have developed a statistical framework that combines response-level data from multiple diagnostic tests and is able to ‘hindcast’ (infer the historical trend of) an infectious disease epidemic. Assuming diagnostic test data from a cross-sectional sample of individuals infected with a pathogen during an outbreak, we use a Bayesian Markov Chain Monte Carlo (MCMC) approach to estimate the time of exposure and the overall epidemic trend in the population prior to the time of sampling. We evaluate the performance of this statistical framework on data simulated from epidemic trend curves and show that we can recover the parameter values of those trends. We also apply the framework to epidemic trend curves taken from two historical outbreaks: a bluetongue outbreak in cattle and a whooping cough outbreak in humans. Together, these results show that hindcasting can estimate the time since infection for individuals, provide accurate estimates of epidemic trends, and be used to distinguish whether an outbreak is increasing or past its peak. We conclude that if the temporal characteristics of diagnostics are known, it is possible to recover epidemic trends of both human and animal pathogens from cross-sectional data collected at a single point in time.
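
    The sketch below illustrates the hindcasting idea in its simplest form: a Metropolis sampler infers one individual's time since exposure from a single test value, given an assumed mean test-response curve. The curve shape, noise level and prior are illustrative assumptions made for the example, not the authors' framework.

```python
import numpy as np

# Minimal Metropolis sketch: infer time since exposure t from one observed test
# value, given an assumed mean response curve (rise and decay around a peak).
rng = np.random.default_rng(42)

def mean_response(t, peak_time=10.0, width=5.0):
    # assumed test kinetics: response rises then decays after exposure
    return np.exp(-((t - peak_time) ** 2) / (2 * width ** 2))

true_t = 14.0                                         # days since exposure (synthetic truth)
observed = mean_response(true_t) + rng.normal(0, 0.05)

def log_posterior(t, obs, sigma=0.05, t_max=60.0):
    if t < 0 or t > t_max:
        return -np.inf                                # flat prior on [0, t_max]
    return -0.5 * ((obs - mean_response(t)) / sigma) ** 2

samples, t = [], 5.0
for _ in range(20000):
    proposal = t + rng.normal(0, 1.0)
    if np.log(rng.uniform()) < log_posterior(proposal, observed) - log_posterior(t, observed):
        t = proposal
    samples.append(t)

post = np.array(samples[5000:])                       # drop burn-in
# Note: with a single test the posterior is bimodal (rising vs falling phase).
print(f"posterior time since exposure: {post.mean():.1f} +/- {post.std():.1f} days")
```

    Note that a single test value is typically consistent with two exposure times, one on the rising and one on the falling phase of the response curve, so this toy posterior is bimodal; combining tests with different temporal characteristics, as in the paper, is what resolves that ambiguity.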

    Carrier-envelope offset stable, coherently combined ytterbium-doped fiber CPA delivering 1 kW of average power

    Get PDF
    We present a carrier-envelope offset (CEO) stable ytterbium-doped fiber chirped-pulse amplification system employing the technology of coherent beam combining and delivering more than 1 kW of average power at a pulse repetition rate of 80 MHz. The CEO stability of the system is 220 mrad rms, characterized out-of-loop with an f-to-2f interferometer in a frequency offset range of 10 Hz to 20 MHz. The high-power amplification system boosts the average power of the CEO stable oscillator by five orders of magnitude while increasing the phase noise by only 100 mrad. No evidence of CEO noise deterioration due to coherent beam combining is found. Low-frequency CEO fluctuations at the chirped-pulse amplifier are suppressed by a “slow loop” feedback. To the best of our knowledge, this is the first demonstration of a coherently combined laser system delivering an outstanding average power and high CEO stability at the same time. © 2020 Optical Society of America
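
    As a small illustration of how an rms figure such as 220 mrad relates to a measured phase-noise spectrum, the sketch below integrates a phase-noise power spectral density over the 10 Hz to 20 MHz band. The spectral density used here is synthetic, not the measured data from the paper.

```python
import numpy as np

# Integrate a CEO phase-noise power spectral density S_phi(f) [rad^2/Hz] from
# f_low to f_high to obtain the rms phase noise, as in an out-of-loop f-to-2f
# characterization. The PSD below is a toy model (1/f part plus white floor).
f = np.logspace(1, np.log10(20e6), 2000)     # 10 Hz ... 20 MHz
S_phi = 1e-3 / f + 1e-9                      # synthetic PSD, rad^2/Hz

rms_total = np.sqrt(np.trapz(S_phi, f))
print(f"integrated rms phase noise: {rms_total * 1e3:.0f} mrad (10 Hz - 20 MHz)")
```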

    Problems with Using the Normal Distribution – and Ways to Improve Quality and Efficiency of Data Analysis

    Get PDF
    Background: The Gaussian or normal distribution is the most established model to characterize quantitative variation of original data. Accordingly, data are summarized using the arithmetic mean and the standard deviation, as x̄ ± SD, or with the standard error of the mean, x̄ ± SEM. This, together with corresponding bars in graphical displays, has become the standard way to characterize variation. Methodology/Principal Findings: Here we question the adequacy of this characterization, and of the model. The published literature provides numerous examples for which such descriptions appear inappropriate because, based on the “95% range check”, their distributions are obviously skewed. In these cases, the symmetric characterization is a poor description and may trigger wrong conclusions. To solve the problem, it is enlightening to consider the causes of variation. Multiplicative causes are in general far more important than additive ones, and benefit from a multiplicative (or log-) normal approach. Fortunately, quite similar to the normal, the log-normal distribution can now be handled easily and characterized at the level of the original data with the help of both a new sign, ×/ (“times-divide”), and the corresponding notation. Analogous to x̄ ± SD, it connects the multiplicative (or geometric) mean x* and the multiplicative standard deviation s* in the form x* ×/ s*, which is advantageous and recommended. Conclusions/Significance: The corresponding shift from the symmetric to the asymmetric view will substantially increase the quality and efficiency of data analysis
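
    The sketch below computes the quantities recommended in the abstract for log-normally distributed data: the multiplicative (geometric) mean x*, the multiplicative standard deviation s*, and the asymmetric interval x* ×/ s* (from x*/s* to x*·s*), the analogue of x̄ ± SD.

```python
import numpy as np

# Multiplicative (geometric) mean x* and multiplicative standard deviation s*
# for skewed, log-normally distributed data; the 68% interval is then
# x*/s* .. x* * s* ("x* times-divide s*"), analogous to mean +/- SD.
rng = np.random.default_rng(0)
data = rng.lognormal(mean=2.0, sigma=0.5, size=10_000)

log_data = np.log(data)
x_star = np.exp(log_data.mean())        # multiplicative (geometric) mean
s_star = np.exp(log_data.std(ddof=1))   # multiplicative standard deviation (dimensionless)

print(f"x* = {x_star:.2f}, s* = {s_star:.2f}")
print(f"68% range: {x_star / s_star:.2f} to {x_star * s_star:.2f}")
```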