
    Adjoint bi-continuous semigroups and semigroups on the space of measures

    For a given bi-continuous semigroup T on a Banach space X we define its adjoint on an appropriate closed subspace X^o of the norm dual X'. Under some abstract conditions this adjoint semigroup is again bi-continuous with respect to the weak topology σ(X^o, X). An application is the following: for a Polish space K we consider operator semigroups on the space C(K) of bounded, continuous functions (endowed with the compact-open topology) and on the space M(K) of bounded Baire measures (endowed with the weak*-topology). We show that the bi-continuous semigroups on M(K) are precisely the adjoints of bi-continuous semigroups on C(K). We also prove that the class of bi-continuous semigroups on C(K) with respect to the compact-open topology coincides with the class of equicontinuous semigroups with respect to the strict topology. In general, if K is not a Polish space, this is not the case.
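    For orientation, a minimal sketch of the duality relation behind such an adjoint construction; the notation T^o for the adjoint semigroup is assumed here, since the abstract only names the subspace X^o:

```latex
% Sketch under assumed notation: the adjoint semigroup acts on the invariant
% subspace X^o \subseteq X' via the duality pairing with X.
\[
  \langle T^{\circ}(t)\varphi,\; x \rangle \;=\; \langle \varphi,\; T(t)x \rangle,
  \qquad \varphi \in X^{\circ},\ x \in X,\ t \ge 0 .
\]
% In the C(K)/M(K) application this reads, for \mu \in M(K) and f \in C(K):
\[
  \int_{K} f \,\mathrm{d}\bigl(T^{\circ}(t)\mu\bigr) \;=\; \int_{K} T(t)f \,\mathrm{d}\mu .
\]
```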

    The Borexino detector at the Laboratori Nazionali del Gran Sasso

    Borexino, a large-volume detector for low-energy neutrino spectroscopy, is currently running underground at the Laboratori Nazionali del Gran Sasso, Italy. The main goal of the experiment is the real-time measurement of sub-MeV solar neutrinos, and particularly of the mono-energetic (862 keV) Be7 electron-capture neutrinos, via neutrino-electron scattering in an ultra-pure liquid scintillator. This paper is mostly devoted to the description of the detector structure, the photomultipliers, the electronics, and the trigger and calibration systems. The real performance of the detector, which always meets, and sometimes exceeds, design expectations, is also shown. Some important aspects of the Borexino project, i.e. the fluid-handling plants, the purification techniques and the filling procedures, are not covered in this paper and are, or will be, published elsewhere (see Introduction and Bibliography). Comment: 37 pages, 43 figures, to be submitted to NI

    Simulation approach for assessing the performance of the γEWMA control chart

    i) Purpose: The purpose of this paper is to evaluate the performance of a modified EWMA control chart (the γEWMA control chart), which takes the data distribution into account and incorporates its correlation structure, by simulating in-control and out-of-control processes and selecting an adequate value of the smoothing parameter under these conditions. ii) Design/methodology/approach: This paper is based on a simulation approach using the methodology for evaluating statistical methods proposed by Morris et al. (2019). Data were generated from a simulation considering two factors associated with the data: (1) the skewness of the quality variable distribution, as an indicator of its shape; (2) the autocorrelation structure between observations, modeled by an AR(1) process. In addition, one factor associated with the process was considered: the shift in the process mean. In the following step, when the control chart is modeled, a fourth factor intervenes: the smoothing parameter. Finally, three indicators defined from the Run Length are used to evaluate the performance of the γEWMA control chart with respect to these factors and their interactions. iii) Findings: The interaction analysis of the four factors shows that the modeling and parameter selection differ for out-of-control and in-control processes; therefore, the considerations and parameters selected for each case must be carefully analyzed. For out-of-control processes, it is better to preserve the original features of the distribution (mean and variance) for the calculation of the control limits. It makes sense that highly autocorrelated observations require a smaller smoothing parameter, since the correlation structure enables the preservation of relevant information in past data. iv) Originality/value: The γEWMA control chart has advantages because it gathers, in a single control chart, the process and modelling characteristics and the structure of the data. Although there are other proposals for modified EWMA charts, none of them simultaneously analyze the four factors or their interactions. The proposed γEWMA allows setting an appropriate smoothing parameter when these three factors are considered.
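    As an illustration of the kind of simulation setup described above, here is a minimal Python sketch: it generates AR(1) data with an optional sustained mean shift, applies a plain EWMA recursion (not the exact γEWMA statistic, whose definition is not given in the abstract), and estimates Run-Length-based performance. The values of λ, L, φ and the shift size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ar1(n, phi, shift=0.0):
    """AR(1) process x_t = phi * x_{t-1} + eps_t, plus a sustained mean shift."""
    eps = rng.normal(0.0, 1.0, n)
    x = np.empty(n)
    x[0] = eps[0]
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x + shift

def ewma_run_length(x, sigma, lam=0.2, L=3.0):
    """EWMA recursion z_t = lam*x_t + (1-lam)*z_{t-1}; run length = first index
    where |z_t| exceeds the asymptotic limit L * sigma * sqrt(lam / (2 - lam))."""
    limit = L * sigma * np.sqrt(lam / (2.0 - lam))
    z = 0.0
    for t, xt in enumerate(x, start=1):
        z = lam * xt + (1.0 - lam) * z
        if abs(z) > limit:
            return t
    return len(x)  # no signal within the simulated horizon

phi = 0.5
sigma_ic = 1.0 / np.sqrt(1.0 - phi**2)  # in-control (phase I) standard deviation of the AR(1) process
arl_ic = np.mean([ewma_run_length(simulate_ar1(5000, phi), sigma_ic) for _ in range(200)])
arl_oc = np.mean([ewma_run_length(simulate_ar1(5000, phi, shift=2.0), sigma_ic) for _ in range(200)])
print(f"ARL in-control ~ {arl_ic:.0f}  |  ARL with 2-sigma shift ~ {arl_oc:.0f}")
```

    With positively autocorrelated data, the in-control average run length of this plain chart drops well below the nominal i.i.d. value, which is the kind of effect a chart that incorporates the correlation structure is meant to account for.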

    Clinical and laboratory variability in a cohort of patients diagnosed with type 1 VWD in the United States

    Von Willebrand disease (VWD) is the most common inherited bleeding disorder, and type 1 VWD is the most common VWD variant. Despite its frequency, the diagnosis of type 1 VWD remains the subject of much debate. In order to study the spectrum of type 1 VWD in the United States, the Zimmerman Program enrolled 482 subjects with a previous diagnosis of type 1 VWD without stringent laboratory diagnostic criteria. VWF laboratory testing and full-length VWF gene sequencing were performed for all index cases and healthy control subjects in a central laboratory. Bleeding phenotype was characterized using the ISTH Bleeding Assessment Tool. At study entry, 64% of subjects had VWF:Ag or VWF:RCo below the lower limit of normal, while 36% had normal VWF levels. VWF sequence variations were most frequent in subjects with VWF:Ag < 30 IU/dL (82%), while subjects with type 1 VWD and VWF:Ag ≥ 30 IU/dL had an intermediate frequency of variants (44%). Subjects whose VWF testing was normal at study entry had a rate of sequence variations similar to that of the healthy controls, at 14% of subjects. All subjects with severe type 1 VWD and VWF:Ag ≤ 5 IU/dL had an abnormal bleeding score, but otherwise the bleeding score did not correlate with the VWF:Ag level. Subjects with a historical diagnosis of type 1 VWD had rates of abnormal bleeding scores similar to those of subjects with low VWF levels at study entry. Type 1 VWD in the United States is highly variable, and bleeding symptoms are frequent in this population.

    Machine learning methods for generating high dimensional discrete datasets

    The development of platforms and techniques for emerging Big Data and Machine Learning applications requires the availability of real-life datasets. A possible solution is to synthesize datasets that reflect the patterns of real ones using a two-step approach: first, a real dataset X is analyzed to derive relevant patterns Z and, then, such patterns are used to reconstruct a new dataset X' that preserves the main characteristics of X. This survey explores two possible approaches: (1) constraint-based generation and (2) probabilistic generative modeling. The former is devised using inverse frequent itemset mining (IFM) techniques and consists of generating a dataset that satisfies given support constraints on the itemsets of an input set, typically the frequent ones. For the latter approach, recent developments in probabilistic generative modeling (PGM) are explored that model the generation as a sampling process from a parametric distribution, typically encoded as a neural network. The two approaches are compared by providing an overview of their instantiations for the case of discrete data and discussing their pros and cons. This article is categorized under: Fundamental Concepts of Data and Knowledge > Big Data Mining; Technologies > Machine Learning; Algorithmic Development > Structure Discover
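    To make the two-step idea concrete, here is a minimal Python sketch of the probabilistic-generation route under a deliberately simple independence model, where the patterns Z are just per-item supports; the IFM and neural PGM methods surveyed in the paper are far richer than this.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy transaction dataset X: rows = transactions, columns = items (1 = item present).
X = rng.binomial(1, p=[0.8, 0.5, 0.1, 0.3], size=(1000, 4))

# Step 1: derive patterns Z from X. Here Z is simply the per-item support
# (an independence model); constraint-based IFM would instead fix the supports
# of selected itemsets, and neural PGMs would fit a parametric joint distribution.
item_support = X.mean(axis=0)

# Step 2: sample a synthetic dataset X' from the fitted parametric distribution.
X_synth = rng.binomial(1, p=item_support, size=X.shape)

# Check that the characteristic captured by Z (item supports) is preserved in X'.
print("original supports :", np.round(X.mean(axis=0), 3))
print("synthetic supports:", np.round(X_synth.mean(axis=0), 3))
```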

    The relationship between target joints and direct resource use in severe haemophilia

    Objectives: Target joints are a common complication of severe haemophilia. While factor replacement therapy constitutes the majority of costs in haemophilia, the relationship between target joints and non-drug-related direct costs (NDDCs) has not been studied. Methods: Data on haemophilia patients without inhibitors were drawn from the ‘Cost of Haemophilia across Europe – a Socioeconomic Survey’ (CHESS) study, a cost assessment in severe haemophilia A and B across five European countries (France, Germany, Italy, Spain, and the United Kingdom) in which 139 haemophilia specialists provided demographic and clinical information for 1285 adult patients. NDDCs were calculated using publicly available cost data, including 12-month ambulatory and secondary care activity: haematologist and other specialist consultations, medical tests and examinations, bleed-related hospital admissions, and payments to professional care providers. A generalized linear model was developed to investigate the relationship between NDDCs and target joints (areas of chronic synovitis), adjusted for patient covariates. Results: Five hundred and thirteen patients (42% of the sample) had no diagnosed target joints; a total of 1376 target joints (range 1–10) were recorded in the remaining 714 patients. Mean adjusted NDDCs for persons with no target joints were EUR 3134 (standard error (SE) EUR 158); for persons with one or more target joints, mean adjusted NDDCs were EUR 3913 (SE EUR 157; average mean effect EUR 779; p < 0.001). Conclusions: Our analysis suggests that the presence of one or more target joints has a significant impact on NDDCs for patients with severe haemophilia, ceteris paribus. Prevention and management of target joints should be an important consideration in managing haemophilia patients.
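    As an illustration of the kind of adjusted comparison reported above, the following Python sketch fits a generalized linear model on synthetic stand-in data; the Gamma family, log link, covariate set, and all numbers are assumptions for illustration only, not the study's actual specification or results.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)

# Synthetic stand-in for CHESS-style data: annual non-drug direct cost,
# target-joint indicator, and one patient covariate (hypothetical values).
n = 1000
has_tj = rng.binomial(1, 0.55, n)
age = rng.normal(40, 12, n)
mean_cost = np.exp(8.0 + 0.22 * has_tj + 0.005 * age)   # hypothetical data-generating process
cost = rng.gamma(shape=2.0, scale=mean_cost / 2.0)
df = pd.DataFrame({"nddc": cost, "has_target_joint": has_tj, "age": age})

# Gamma GLM with log link (an assumed specification, common for right-skewed cost data),
# adjusting for a patient covariate as in an adjusted-cost comparison.
model = smf.glm(
    "nddc ~ has_target_joint + age",
    data=df,
    family=sm.families.Gamma(link=sm.families.links.Log()),
).fit()
print(model.summary())

# Average marginal effect of having one or more target joints on predicted costs.
pred_tj0 = model.predict(df.assign(has_target_joint=0)).mean()
pred_tj1 = model.predict(df.assign(has_target_joint=1)).mean()
print(f"Adjusted mean cost difference: {pred_tj1 - pred_tj0:.0f}")
```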

    Siamese Network for Fake Item Detection

    Currently, most multimedia users choose to purchase items through e-commerce. Nevertheless, one of the main concerns of online shopping is the possibility of receiving counterfeit products. It is therefore crucial to verify the authenticity of a product by adopting an automatic mechanism that validates the similarity between the purchased item and the delivered one. To this end, we propose a Siamese Network model for detecting forged items. Preliminary experimentation on a publicly available dataset proves the effectiveness of our solution.
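    For readers unfamiliar with the architecture, here is a minimal PyTorch sketch of a Siamese network trained with a contrastive loss; the encoder layout, loss function, margin, and image sizes are illustrative assumptions, not the model described in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseNet(nn.Module):
    """Twin encoder: both images pass through the same weights; similarity is
    judged from the distance between the two embeddings."""
    def __init__(self, embedding_dim: int = 128):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(64 * 4 * 4, embedding_dim),
        )

    def forward(self, x1, x2):
        return self.encoder(x1), self.encoder(x2)

def contrastive_loss(z1, z2, label, margin: float = 1.0):
    """label = 1 for matching (authentic) pairs, 0 for non-matching (suspect) pairs."""
    dist = F.pairwise_distance(z1, z2)
    return torch.mean(label * dist.pow(2) + (1 - label) * F.relu(margin - dist).pow(2))

# Toy usage: a batch of catalogue/delivered image pairs with random pair labels.
model = SiameseNet()
img_a, img_b = torch.randn(8, 3, 64, 64), torch.randn(8, 3, 64, 64)
labels = torch.randint(0, 2, (8,)).float()
z1, z2 = model(img_a, img_b)
loss = contrastive_loss(z1, z2, labels)
loss.backward()
print(f"loss: {loss.item():.3f}, max pair distance: {F.pairwise_distance(z1, z2).max().item():.3f}")
```

    At inference time, a pair whose embedding distance exceeds a chosen threshold would be flagged as a potential counterfeit; the threshold itself is a tuning choice.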

    The TOTEM Experiment at the CERN Large Hadron Collider

    The TOTEM Experiment will measure the total pp cross-section with the luminosity-independent method and study elastic and diffractive scattering at the LHC. To achieve optimum forward coverage for charged particles emitted by the pp collisions in the interaction point IP5, two tracking telescopes, T1 and T2, will be installed on each side in the pseudorapidity region 3.1 < |η| < 6.5, and Roman Pot stations will be placed at distances of 147 m and 220 m from IP5. Being an independent experiment but technically integrated into CMS, TOTEM will first operate in standalone mode to pursue its own physics programme and at a later stage together with CMS for a common physics programme. This article gives a description of the TOTEM apparatus and its performance.

    Pneumonic versus Nonpneumonic Exacerbations of Chronic Obstructive Pulmonary Disease.

    Patients with chronic obstructive pulmonary disease (COPD) often suffer acute exacerbations (AECOPD) and community-acquired pneumonia (CAP), termed nonpneumonic and pneumonic exacerbations of COPD, respectively. Abnormal host defense mechanisms may play a role in the specificity of the systemic inflammatory response. Because this response is associated with certain biomarkers measured at admission (e.g., C-reactive protein), these markers can help discriminate between AECOPD and CAP, especially in cases with doubtful infiltrates and advanced lung impairment. Fever, sputum purulence, chills, and pleuritic pain are typical clinical features of CAP in a patient with COPD, whereas isolated dyspnea at admission has been reported to predict AECOPD. Although CAP may have a worse outcome in terms of mortality (in-hospital and short-term), length of hospitalization, and early readmission rates, this has only been confirmed in a few prospective studies. There is a lack of methodologically sound research confirming the relative impact of severe AECOPD and COPD + CAP. Here, we review studies reporting head-to-head comparisons between AECOPD and COPD + CAP in hospitalized patients. We focus on the epidemiology, risk factors, systemic inflammatory response, clinical and microbiological characteristics, outcomes, and treatment approaches. Finally, we briefly discuss some proposals on how research in this area should be oriented in the future.