
    Sensitive and Scalable Online Evaluation with Theoretical Guarantees

    Multileaved comparison methods generalize interleaved comparison methods to provide a scalable approach for comparing ranking systems based on regular user interactions. Such methods enable the increasingly rapid research and development of search engines. However, existing multileaved comparison methods that provide reliable outcomes do so by degrading the user experience during evaluation. Conversely, current multileaved comparison methods that maintain the user experience cannot guarantee correctness. Our contribution is two-fold. First, we propose a theoretical framework for systematically comparing multileaved comparison methods using the notions of considerateness, which concerns maintaining the user experience, and fidelity, which concerns reliable correct outcomes. Second, we introduce a novel multileaved comparison method, Pairwise Preference Multileaving (PPM), that performs comparisons based on document-pair preferences, and prove that it is considerate and has fidelity. We show empirically that, compared to previous multileaved comparison methods, PPM is more sensitive to user preferences and scalable with the number of rankers being compared.
    Comment: CIKM 2017, Proceedings of the 2017 ACM on Conference on Information and Knowledge Management
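
    To make the multileaving setting concrete, here is a minimal sketch of the general idea (team-draft-style multileaving, not the authors' PPM algorithm): several rankers' lists are merged into one result list, each slot remembers which ranker supplied it, and clicks are credited back to the contributing rankers. All function names are illustrative.

```python
import random

def team_draft_multileave(rankings, length, seed=0):
    """Combine several ranked lists into one multileaved list.

    Returns the combined list and, for each slot, the index of the
    ranker that contributed the document (used to credit clicks).
    """
    rng = random.Random(seed)
    combined, credit, placed = [], [], set()
    while len(combined) < length:
        order = list(range(len(rankings)))
        rng.shuffle(order)          # random turn order each round
        progressed = False
        for r in order:
            # take this ranker's highest-ranked document not yet placed
            doc = next((d for d in rankings[r] if d not in placed), None)
            if doc is not None:
                combined.append(doc)
                credit.append(r)
                placed.add(doc)
                progressed = True
            if len(combined) == length:
                break
        if not progressed:          # all candidate documents exhausted
            break
    return combined, credit

def score_clicks(credit, clicked_slots):
    """Credit each ranker one point per click on a slot it filled."""
    wins = [0] * (max(credit) + 1)
    for slot in clicked_slots:
        wins[credit[slot]] += 1
    return wins
```

    A ranker that accumulates more click credit than its competitors is preferred; PPM refines this by inferring pairwise document preferences rather than per-slot credit.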

    Implicit surfaces with globally regularised and compactly supported basis functions

    We consider the problem of constructing a function whose zero set is to represent a surface, given sample points with surface normal vectors. The contributions include a novel means of regularising multi-scale compactly supported basis functions that leads to the desirable properties previously only associated with fully supported bases; we show this is equivalent to a Gaussian process with a modified covariance function. We also provide a regularisation framework for simpler and more direct treatment of surface normals, along with a corresponding generalisation of the representer theorem. We demonstrate the techniques on 3D problems of up to 14 million data points, as well as 4D time series data.
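
    For readers unfamiliar with the setup, this is a minimal sketch of evaluating an implicit surface built from compactly supported basis functions (here a standard Wendland C2 kernel, used for illustration only; it is not the authors' regularisation scheme). The surface is the zero set of the weighted kernel sum.

```python
import numpy as np

def wendland_c2(r):
    """Wendland C2 compactly supported kernel, support radius 1."""
    r = np.clip(r, 0.0, 1.0)     # kernel is exactly zero beyond r = 1
    return (1.0 - r) ** 4 * (4.0 * r + 1.0)

def implicit_value(x, centers, weights, radius):
    """Evaluate f(x) = sum_i w_i * phi(|x - c_i| / radius).

    The reconstructed surface is the zero set {x : f(x) = 0}.
    """
    d = np.linalg.norm(centers - x, axis=1) / radius
    return float(weights @ wendland_c2(d))
```

    Compact support is what makes such schemes scale to millions of points: each evaluation only involves centers within one support radius, so the interpolation system is sparse.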

    Assessment on experimental bacterial biofilms and in clinical practice of the efficacy of sampling solutions for microbiological testing of endoscopes

    Opinions differ on the value of microbiological testing of endoscopes, which varies according to the technique used. We compared the efficacy on bacterial biofilms of sampling solutions used for the surveillance of the contamination of endoscope channels. To compare efficacy, we used an experimental model of a 48-h Pseudomonas biofilm grown on endoscope internal tubing. Sampling of this experimental biofilm was performed with a Tween 80-lecithin-based solution, saline, and sterile water. We also performed a randomized prospective study during routine clinical practice in our hospital, randomly sampling reprocessed endoscopes with two different solutions. Biofilm recovery, expressed as a logarithmic ratio of bacteria recovered to bacteria initially present in the biofilm, was significantly more effective with the Tween 80-lecithin-based solution than with saline solution (P = 0.002) and sterile water (P = 0.002). There was no significant difference between saline and sterile water. In the randomized clinical study, the rates of endoscopes that were contaminated with the Tween 80-lecithin-based sampling solution and the saline were 8/25 and 1/25, respectively (P = 0.02), and the mean numbers of bacteria recovered were 281 and 19 CFU/100 ml (P = 0.001), respectively. In conclusion, the efficiency, and therefore the value, of monitoring endoscope reprocessing by microbiological cultures depends on the sampling solution used. A sampling solution with a tensioactive action is more efficient than saline in detecting biofilm contamination of endoscopes.

    Microdissection of human chromosomes by a laser microbeam

    A laser microbeam apparatus, based on an excimer-laser-pumped dye laser, is used to microdissect human chromosomes and to isolate a single chromosome slice.

    Differentiating muscle damage from myocardial injury by means of the serum creatine kinase (CK) isoenzyme MB mass measurement/total CK activity ratio

    We immunoenzymometrically measured creatine kinase (CK) isoenzyme MB in extracts of myocardium and in homogenates of five different skeletal muscles. CK-MB concentrations in the former averaged 80.9 micrograms/g wet tissue; in the skeletal muscles they varied widely, being (e.g.) 25-fold greater in diaphragm than in psoas. CK-MB in skeletal muscles ranged from 0.9 to 44 ng/U of total CK; the mean for myocardium was 202 ng/U. In sera from 10 trauma and 36 burn patients without myocardial involvement, maximum ratios for CK-MB mass/total CK activity averaged 7 (SEM 1) ng/U and 18 (SEM 6) ng/U, respectively. Except for an infant (220 ng/U), the highest ratio we found for serum after muscular damage was 38 ng/U. In contrast, the mean maximum ratio determined in 23 cases of acute myocardial infarction exceeded 200 ng/U. Among seven determinations performed 8 to 32 h after onset of symptoms, each infarct patient demonstrated at least one ratio greater than or equal to 110 ng/U. Ratios observed after infarct were unrelated to treatment received during the acute phase. We propose a CK-MB/total CK ratio of 80 ng/U as the cutoff value for differentiating myocardial necrosis from muscular injury.
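
    The proposed decision rule is simple arithmetic, sketched below for illustration (function names and unit choices are ours; the 80 ng/U cutoff is the one proposed in the abstract). Because mass and activity are both measured per unit serum volume, the volume cancels and the ratio comes out in ng/U.

```python
def ckmb_ratio(ckmb_mass_ng_per_l, total_ck_u_per_l):
    """CK-MB mass / total CK activity ratio, in ng/U."""
    return ckmb_mass_ng_per_l / total_ck_u_per_l

def suggests_myocardial_necrosis(ratio_ng_per_u, cutoff=80.0):
    """Apply the 80 ng/U cutoff proposed for myocardial necrosis."""
    return ratio_ng_per_u >= cutoff
```

    For example, a serum with 40,000 ng/l CK-MB mass and 200 U/l total CK activity gives a ratio of 200 ng/U, well above the cutoff, whereas the highest post-muscular-damage ratio reported (38 ng/U, infant excepted) falls below it.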

    Consistency of safety and efficacy of new oral anticoagulants across subgroups of patients with atrial fibrillation.

    AIMS: The well-known limitations of vitamin K antagonists (VKA) led to development of new oral anticoagulants (NOAC) in non-valvular atrial fibrillation (NVAF). The aim of this meta-analysis was to determine the consistency of treatment effects of NOAC irrespective of age, comorbidities, or prior VKA exposure. METHODS AND RESULTS: All randomized, controlled phase III trials comparing NOAC to VKA up to October 2012 were eligible provided their results (stroke/systemic embolism (SSE) and major bleeding (MB)) were reported according to age (≤ or >75 years), renal function, CHADS2 score, presence of diabetes mellitus or heart failure, prior VKA use or previous cerebrovascular events. Interactions were considered significant at p < 0.05. Three studies (50,578 patients) were included, respectively evaluating apixaban, rivaroxaban, and dabigatran versus warfarin. A trend towards interaction with heart failure (p = 0.08) was observed with respect to SSE reduction, this being greater in patients not presenting heart failure (RR = 0.76 [0.67-0.86]) than in those with heart failure (RR = 0.90 [0.78-1.04]). Significant interaction (p = 0.01) with CHADS2 score was observed, NOAC achieving a greater reduction in bleeding risk in patients with a score of 0-1 (RR 0.67 CI 0.57-0.79) than in those with a score ≥2 (RR 0.85 CI 0.74-0.98). Comparison of MB in patients with (RR 0.97 CI 0.79-1.18) and without (RR 0.76 CI 0.65-0.88) diabetes mellitus showed a similar trend (p = 0.06). No other interactions were found. All subgroups derived benefit from NOAC in terms of SSE or MB reduction. CONCLUSIONS: NOAC appeared to be more effective and safer than VKA in reducing SSE or MB irrespective of patient comorbidities. Thromboembolism risk, evaluated by CHADS2 score and, to a lesser extent, diabetes mellitus modified the treatment effects of NOAC without complete loss of benefit with respect to MB reduction.
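
    The RR values quoted above are risk ratios: the event rate in the NOAC arm divided by the event rate in the VKA arm, with values below 1 favouring NOAC. A minimal sketch of the calculation (illustrative function, not the meta-analytic pooling used in the study, which also weights and combines trials):

```python
def relative_risk(events_treat, n_treat, events_ctrl, n_ctrl):
    """Risk ratio: event rate under treatment / event rate under control."""
    return (events_treat / n_treat) / (events_ctrl / n_ctrl)
```

    For example, 10 events among 100 treated patients versus 20 events among 100 controls yields RR = 0.5, a halving of risk under treatment.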

    A meta-analysis of effectiveness studies on computer technology-supported language learning

    With the aim of summarizing years of research comparing pedagogies for second/foreign language teaching supported with computer technology and pedagogies not supported by computer technology, a meta-analysis was conducted of empirical research investigating language outcomes. Thirty-seven studies yielding 52 effect sizes were included, following a search of literature from 1970 to 2006 and screening of studies based on stated criteria. The differences in research designs required subdivision of studies, but overall results favored the technology-supported pedagogy, with a small but positive and statistically significant effect size. Second/foreign language instruction supported by computer technology was found to be at least as effective as instruction without technology, and in studies using rigorous research designs the CALL groups outperformed the non-CALL groups. The analyses of instructional conditions, characteristics of participants, and conditions of the research design did not provide reliable results because of the small number of effect sizes representing each group. The meta-analysis results provide an empirically-based response to the question of whether or not technology-supported pedagogies enhance language learning, and the process of conducting the meta-analysis pointed to areas in research methodology that would benefit from attention in future research.
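
    The "effect sizes" pooled in such a meta-analysis are typically standardized mean differences. A minimal sketch of the most common variant, Cohen's d with a pooled standard deviation (illustrative; the abstract does not state which estimator the authors used):

```python
import math

def cohens_d(mean_t, sd_t, n_t, mean_c, sd_c, n_c):
    """Standardized mean difference between treatment and control.

    Uses the pooled standard deviation, weighting each group's
    variance by its degrees of freedom (n - 1).
    """
    pooled_sd = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                          / (n_t + n_c - 2))
    return (mean_t - mean_c) / pooled_sd
```

    For instance, a CALL group scoring 75 versus a non-CALL group scoring 70, both with SD 10 and n = 30, gives d = 0.5, a medium effect by conventional benchmarks; "small but positive" effects like the one reported are typically in the 0.2-0.4 range.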

    Optimal treatment allocations in space and time for on-line control of an emerging infectious disease

    A key component in controlling the spread of an epidemic is deciding where, when and to whom to apply an intervention. We develop a framework for using data to inform these decisions in real time. We formalize a treatment allocation strategy as a sequence of functions, one per treatment period, that map up-to-date information on the spread of an infectious disease to a subset of locations where treatment should be allocated. An optimal allocation strategy optimizes some cumulative outcome, e.g. the number of uninfected locations, the geographic footprint of the disease or the cost of the epidemic. Estimation of an optimal allocation strategy for an emerging infectious disease is challenging because spatial proximity induces interference between locations, the number of possible allocations is exponential in the number of locations, and because disease dynamics and intervention effectiveness are unknown at outbreak. We derive a Bayesian on-line estimator of the optimal allocation strategy that combines simulation-optimization with Thompson sampling. The estimator proposed performs favourably in simulation experiments. This work is motivated by and illustrated using data on the spread of white-nose syndrome, a highly fatal infectious disease devastating bat populations in North America.
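
    To illustrate the Thompson-sampling ingredient, here is a toy sketch (not the authors' simulation-optimization estimator): each location carries a Beta posterior over its transmission risk, one risk is sampled per location, and the treatment budget goes to the locations with the highest sampled risk; observed outcomes then update the posteriors. All names and the per-location independence assumption are ours.

```python
import random

def thompson_allocate(alpha, beta, budget, rng=random):
    """Pick `budget` locations to treat via Thompson sampling.

    alpha[i], beta[i] parameterize a Beta posterior over location i's
    risk of transmitting infection; sample one risk per location and
    treat the locations with the highest sampled risk.
    """
    sampled = [rng.betavariate(a, b) for a, b in zip(alpha, beta)]
    ranked = sorted(range(len(sampled)),
                    key=lambda i: sampled[i], reverse=True)
    return ranked[:budget]

def update_posterior(alpha, beta, treated, became_infected):
    """Conjugate Beta update from outcomes at the treated locations."""
    for i in treated:
        if became_infected[i]:
            alpha[i] += 1.0
        else:
            beta[i] += 1.0
```

    Sampling from the posterior, rather than always treating the highest-mean locations, is what balances exploring uncertain locations against exploiting known hot spots; the paper's estimator replaces the per-location posterior with a posterior over disease-dynamics models and optimizes allocations by simulation.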