
    Single-pixel phase-corrected fiber bundle endomicroscopy with lensless focussing capability.

    In this paper a novel single-pixel method for coherent imaging through an endoscopic fiber bundle is presented. The use of a single-pixel detector allows greater sensitivity over a wider range of wavelengths, which could have significant applications in endoscopic fluorescence microscopy. First, the principle of lensless focussing at the distal end of a coherent fiber bundle is simulated to examine the impact of pixelation at microscopic scales. Next, an experimental optical correlator system using spatial light modulators (SLMs) is presented. A simple contrast imaging method of characterizing and compensating phase aberrations introduced by fiber bundles is described. Experimental results are then presented showing that our phase compensation method enables characterization of the optical phase profile of individual fiberlets. After applying this correction, early results demonstrating the ability of the system to electronically adjust the focal plane at the distal end of the fiber bundle are presented. The structural similarity index (SSIM) between the simulated image and the experimental focus-adjusted image increases noticeably when the phase correction is applied, and the retrieved image is visually recognizable. Strategies to improve image quality are discussed. G. Gordon would like to acknowledge support from a Henslow Research Fellowship from the Cambridge Philosophical Society, as well as research funding from the Cambridge Cancer Centre and Cancer Research UK. S. Bohndiek would like to acknowledge research funding from a Cancer Research UK Career Establishment Award and the CRUK-EPSRC Cancer Imaging Centre in Cambridge and Manchester. This is the final version of the article. It first appeared from IEEE via http://dx.doi.org/10.1109/JLT.2015.243681
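    The SSIM comparison used above can be sketched in a few lines. Below is a minimal single-window SSIM in plain Python, assuming grayscale intensities scaled to [0, 1]; it illustrates the metric only and is not the paper's implementation (production code, e.g. scikit-image's structural_similarity, uses local sliding windows):

```python
# Minimal single-window SSIM for two equal-length grayscale images,
# given as flat lists of intensities in [0, 1]. Constants c1, c2 use
# the common defaults (K1=0.01, K2=0.03, dynamic range 1).
def ssim(x, y, c1=0.01 ** 2, c2=0.03 ** 2):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    return ((2 * mx * my + c1) * (2 * cov + c2)) / (
        (mx * mx + my * my + c1) * (vx + vy + c2)
    )

img = [0.1, 0.5, 0.9, 0.4]
print(ssim(img, img))  # identical images score 1.0
```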

    RNA-seq transcriptional profiling of peripheral blood leukocytes from cattle infected with Mycobacterium bovis

    Bovine tuberculosis, caused by infection with Mycobacterium bovis, is a major endemic disease affecting cattle populations worldwide, despite the implementation of stringent surveillance and control programs in many countries. The development of high-throughput functional genomics technologies, including gene expression microarrays and RNA-sequencing (RNA-seq), has enabled detailed analysis of the host transcriptome response to M. bovis infection, particularly at the macrophage and peripheral blood level. In the present study, we have analyzed the peripheral blood leukocyte (PBL) transcriptome of eight natural M. bovis-infected and eight age- and sex-matched non-infected control Holstein-Friesian animals using RNA-seq. In addition, we compared gene expression profiles generated using RNA-seq with those previously generated using the high-density Affymetrix(®) GeneChip(®) Bovine Genome Array platform from the same PBL-extracted RNA. A total of 3,250 differentially expressed (DE) annotated genes were detected in the M. bovis-infected samples relative to the controls (adjusted P-value ≤0.05), with the number of genes displaying decreased relative expression (1,671) exceeding those with increased relative expression (1,579). Ingenuity(®) Systems Pathway Analysis (IPA) of all DE genes revealed enrichment for genes with immune function. Notably, transcriptional suppression was observed among several of the top-ranking canonical pathways including Leukocyte Extravasation Signaling. Comparative platform analysis demonstrated that RNA-seq detected a larger number of annotated DE genes (3,250) relative to the microarray (1,398), of which 917 genes were common to both technologies and displayed the same direction of expression. Finally, we show that RNA-seq had an increased dynamic range compared to the microarray for estimating differential gene expression
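    The "adjusted P-value ≤ 0.05" cutoff above refers to a multiple-testing correction applied across thousands of genes. The abstract does not name the exact procedure, but the Benjamini-Hochberg false-discovery-rate adjustment is the standard choice in RNA-seq pipelines; here is a generic sketch on toy p-values:

```python
# Benjamini-Hochberg FDR adjustment (generic sketch on toy p-values,
# not the paper's specific pipeline). Each raw p-value is scaled by
# n/rank, with a running minimum enforcing monotonicity from the
# least to the most significant test.
def bh_adjust(pvals):
    n = len(pvals)
    order = sorted(range(n), key=lambda i: pvals[i])
    adjusted = [0.0] * n
    running_min = 1.0
    for rank in range(n, 0, -1):
        i = order[rank - 1]
        running_min = min(running_min, pvals[i] * n / rank)
        adjusted[i] = running_min
    return adjusted

pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.060, 0.074, 0.205]
adj = bh_adjust(pvals)
significant = [p for p in adj if p <= 0.05]
print(significant)  # only the two smallest survive adjustment
```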

    An optical Fourier transform coprocessor with direct phase determination.

    The Fourier transform is a ubiquitous mathematical operation which arises naturally in optics. We propose and demonstrate a practical method to optically evaluate a complex-to-complex discrete Fourier transform. By implementing the Fourier transform optically we can overcome the limiting O(n log n) complexity of fast Fourier transform algorithms. Efficiently extracting the phase from the well-known optical Fourier transform is challenging. By appropriately decomposing the input and exploiting symmetries of the Fourier transform we are able to determine the phase directly from straightforward intensity measurements, creating an optical Fourier transform with O(n) apparent complexity. Performing larger optical Fourier transforms requires higher resolution spatial light modulators, but the execution time remains unchanged. This method could unlock the potential of the optical Fourier transform to permit 2D complex-to-complex discrete Fourier transforms with a performance that is currently untenable, with applications across information processing and computational physics
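    The "decomposing the input and exploiting symmetries" idea rests on two standard DFT facts: the transform is linear, so a complex input splits into the transforms of its real and imaginary parts, and a real input's transform is Hermitian-symmetric. The NumPy sketch below is a digital illustration of both properties on assumed random test data, not the optical implementation itself:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(8) + 1j * rng.standard_normal(8)

# Linearity: DFT(a + ib) = DFT(a) + i*DFT(b), so a complex-to-complex
# transform decomposes into two transforms of real-valued inputs.
F = np.fft.fft(x)
F_split = np.fft.fft(x.real) + 1j * np.fft.fft(x.imag)
assert np.allclose(F, F_split)

# Hermitian symmetry: for a real input r, F[N-k] = conj(F[k]), which
# constrains how much phase information intensity measurements hide.
Fr = np.fft.fft(x.real)
assert np.allclose(Fr[1:], np.conj(Fr[1:][::-1]))
```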

    Whole-transcriptome, high-throughput RNA sequence analysis of the bovine macrophage response to Mycobacterium bovis infection in vitro

    BACKGROUND: Mycobacterium bovis, the causative agent of bovine tuberculosis, is an intracellular pathogen that can persist inside host macrophages during infection via a diverse range of mechanisms that subvert the host immune response. In the current study, we have analysed and compared the transcriptomes of M. bovis-infected monocyte-derived macrophages (MDM) purified from six Holstein-Friesian females with the transcriptomes of non-infected control MDM from the same animals over a 24 h period using strand-specific RNA sequencing (RNA-seq). In addition, we compare gene expression profiles generated using RNA-seq with those previously generated by us using the high-density Affymetrix® GeneChip® Bovine Genome Array platform from the same MDM-extracted RNA. RESULTS: A mean of 7.2 million reads from each MDM sample mapped uniquely and unambiguously to single Bos taurus reference genome locations. Analysis of these mapped reads showed 2,584 genes (1,392 upregulated; 1,192 downregulated) and 757 putative natural antisense transcripts (558 upregulated; 119 downregulated) that were differentially expressed based on sense and antisense strand data, respectively (adjusted P-value ≤ 0.05). Of the differentially expressed genes, 694 were common to both the sense and antisense data sets, with the direction of expression (i.e. up- or downregulation) positively correlated for 693 genes and negatively correlated for the remaining gene. Gene ontology analysis of the differentially expressed genes revealed an enrichment of immune, apoptotic and cell signalling genes. Notably, the number of differentially expressed genes identified from RNA-seq sense strand analysis was greater than the number of differentially expressed genes detected from microarray analysis (2,584 genes versus 2,015 genes). Furthermore, our data reveal a greater dynamic range in the detection and quantification of gene transcripts for RNA-seq compared to microarray technology. 
CONCLUSIONS: This study highlights the value of RNA-seq in identifying novel immunomodulatory mechanisms that underlie host-mycobacterial pathogen interactions during infection, including possible complex post-transcriptional regulation of host gene expression involving antisense RNA

    Anonymous Single-Sign-On for n designated services with traceability

    Anonymous Single-Sign-On authentication schemes have been proposed to allow users to access a service protected by a verifier without revealing their identity, a property that has become more important with the introduction of strong privacy regulations. In this paper we describe a new approach whereby anonymous authentication to different verifiers is achieved via authorisation tags and pseudonyms. The particular innovation of our scheme is that authentication can only occur between a user and its designated verifier for a service, and the verification cannot be performed by any other verifier. The benefit of this authentication approach is that it prevents information leakage of a user's service access information, even if the verifiers for these services collude with each other. Our scheme also supports a trusted third party who is authorised to de-anonymise the user and reveal her complete service access information if required. Furthermore, our scheme is lightweight because it does not rely on attribute or policy-based signature schemes to enable access to multiple services. The scheme's security model is given together with a security proof, an implementation and a performance evaluation. Comment: 3
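    The collusion-resistance property can be visualised with a toy construction (emphatically not the paper's scheme, which uses authorisation tags and a formally proven protocol): if each pseudonym is derived per designated verifier from a user-held secret, colluding verifiers see unrelated identifiers and cannot link the user's service accesses.

```python
import hashlib
import hmac

# Toy per-verifier pseudonyms: the same user secret with different
# verifier IDs yields unlinkable identifiers. Illustrative only; a real
# anonymous SSO scheme additionally needs the authorisation tags,
# security proofs and de-anonymisation support described in the paper.
def pseudonym(user_secret: bytes, verifier_id: bytes) -> str:
    return hmac.new(user_secret, verifier_id, hashlib.sha256).hexdigest()

secret = b"user-master-secret"  # hypothetical value for illustration
p_a = pseudonym(secret, b"verifier-A")
p_b = pseudonym(secret, b"verifier-B")
assert p_a != p_b  # colluding verifiers cannot match identifiers
```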

    A review of clinical decision-making: Models and current research

    Aims and objectives: The aim of this paper was to review the current literature with respect to clinical decision-making models and the educational application of models to clinical practice. This was achieved by exploring the function and related research of the three available models of clinical decision making: the information processing model, the intuitive-humanist model and the clinical decision making model. Background: Clinical decision-making is a unique process that involves the interplay between knowledge of pre-existing pathological conditions, explicit patient information, nursing care and experiential learning. Historically, two models of clinical decision making are recognised in the literature: the information processing model and the intuitive-humanist model. The usefulness and application of both models have been examined in relation to the provision of nursing care and care-related outcomes. More recently, a third model of clinical decision making has been proposed. This new multidimensional model contains elements of the information processing model but also examines patient-specific elements that are necessary for cue and pattern recognition. Design: Literature review. Methods: Evaluation of the literature generated from MEDLINE, CINAHL, OVID, PUBMED and EBSCO systems and the Internet from 1980 – November 2005

    Quantum resource estimates for computing elliptic curve discrete logarithms

    We give precise quantum resource estimates for Shor's algorithm to compute discrete logarithms on elliptic curves over prime fields. The estimates are derived from a simulation of a Toffoli gate network for controlled elliptic curve point addition, implemented within the framework of the quantum computing software tool suite LIQUi|⟩. We determine circuit implementations for reversible modular arithmetic, including modular addition, multiplication and inversion, as well as reversible elliptic curve point addition. We conclude that elliptic curve discrete logarithms on an elliptic curve defined over an n-bit prime field can be computed on a quantum computer with at most 9n + 2⌈log₂(n)⌉ + 10 qubits using a quantum circuit of at most 448n³log₂(n) + 4090n³ Toffoli gates. We are able to classically simulate the Toffoli networks corresponding to the controlled elliptic curve point addition as the core piece of Shor's algorithm for the NIST standard curves P-192, P-224, P-256, P-384 and P-521. Our approach allows gate-level comparisons to recent resource estimates for Shor's factoring algorithm. The results also support estimates given earlier by Proos and Zalka and indicate that, for current parameters at comparable classical security levels, the number of qubits required to tackle elliptic curves is less than for attacking RSA, suggesting that indeed ECC is an easier target than RSA. Comment: 24 pages, 2 tables, 11 figures. v2: typos fixed and reference added. ASIACRYPT 201
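    The quoted bounds are straightforward to evaluate for the NIST curves named in the abstract; the sketch below simply plugs each field size n into the stated formulas (illustrative arithmetic on the paper's upper bounds, not an independent estimate):

```python
import math

def ecdlp_resources(n):
    # Upper bounds quoted in the abstract: 9n + 2*ceil(log2(n)) + 10
    # qubits and 448*n^3*log2(n) + 4090*n^3 Toffoli gates for an
    # n-bit prime field.
    qubits = 9 * n + 2 * math.ceil(math.log2(n)) + 10
    toffolis = 448 * n ** 3 * math.log2(n) + 4090 * n ** 3
    return qubits, toffolis

for n in (192, 224, 256, 384, 521):
    qubits, toffolis = ecdlp_resources(n)
    print(f"P-{n}: {qubits} qubits, ~{toffolis:.2e} Toffoli gates")
```

For P-256, for example, the qubit bound evaluates to 2,330, which is the kind of figure the abstract compares against the qubit count for attacking RSA at comparable classical security.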

    Dealing with missing standard deviation and mean values in meta-analysis of continuous outcomes: a systematic review

    Background: Rigorous, informative meta-analyses rely on availability of appropriate summary statistics or individual participant data. For continuous outcomes, especially those with naturally skewed distributions, summary information on the mean or variability often goes unreported. While full reporting of original trial data is the ideal, we sought to identify methods for handling unreported mean or variability summary statistics in meta-analysis. Methods: We undertook two systematic literature reviews to identify methodological approaches used to deal with missing mean or variability summary statistics. Five electronic databases were searched, in addition to the Cochrane Colloquium abstract books and the Cochrane Statistics Methods Group mailing list archive. We also conducted cited reference searching and emailed topic experts to identify recent methodological developments. Details recorded included the description of the method, the information required to implement the method, any underlying assumptions and whether the method could be readily applied in standard statistical software. We provided a summary description of the methods identified, illustrating selected methods in example meta-analysis scenarios. Results: For missing standard deviations (SDs), following screening of 503 articles, fifteen methods were identified in addition to those reported in a previous review. These included Bayesian hierarchical modelling at the meta-analysis level; summary statistic level imputation based on observed SD values from other trials in the meta-analysis; a practical approximation based on the range; and algebraic estimation of the SD based on other summary statistics. Following screening of 1124 articles for methods estimating the mean, one approximate Bayesian computation approach and three papers based on alternative summary statistics were identified. 
Illustrative meta-analyses showed that when replacing a missing SD the approximation using the range minimised loss of precision and generally performed better than omitting trials. When estimating missing means, a formula using the median, lower quartile and upper quartile performed best in preserving the precision of the meta-analysis findings, although in some scenarios, omitting trials gave superior results. Conclusions: Methods based on summary statistics (minimum, maximum, lower quartile, upper quartile, median) reported in the literature facilitate more comprehensive inclusion of randomised controlled trials with missing mean or variability summary statistics within meta-analyses
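    As a concrete sketch of the two headline approaches above, a range-based SD approximation and a quartile-based mean estimate, the commonly used textbook forms are shown below. The exact formulas vary across the methods the review catalogues, so treat these as illustrative rather than as the paper's recommended estimators:

```python
def sd_from_range(minimum, maximum):
    # Practical approximation for a missing standard deviation:
    # roughly 95% of a normal distribution lies within +/- 2 SD,
    # so SD is taken as range / 4.
    return (maximum - minimum) / 4

def mean_from_quartiles(q1, median, q3):
    # Simple estimate of a missing mean from the reported median
    # and lower/upper quartiles.
    return (q1 + median + q3) / 3

print(sd_from_range(10, 50))          # -> 10.0
print(mean_from_quartiles(4, 6, 10))  # -> 6.666...
```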

    Diffuse Gamma Rays: Galactic and Extragalactic Diffuse Emission

    "Diffuse" gamma rays consist of several components: truly diffuse emission from the interstellar medium, the extragalactic background, whose origin is not yet firmly established, and the contribution from unresolved and faint Galactic point sources. One approach to unravel these components is to study the diffuse emission from the interstellar medium, which traces the interactions of high-energy particles with interstellar gas and radiation fields. Because of its origin, such emission is potentially able to reveal much about the sources and propagation of cosmic rays. The extragalactic background, if reliably determined, can be used in cosmological and blazar studies. Studying the derived "average" spectrum of faint Galactic sources may give a clue to the nature of the emitting objects. Comment: 32 pages, 28 figures, kapproc.cls. Chapter to the book "Cosmic Gamma-Ray Sources," to be published by Kluwer ASSL Series, Edited by K. S. Cheng and G. E. Romero. More details can be found at http://www.gamma.mpe-garching.mpg.de/~aws/aws.htm