
    Manifold regularization based on Nyström type subsampling

    In this paper, we study Nyström type subsampling for large-scale kernel methods to reduce the computational complexity of big data. We discuss a multi-penalty regularization scheme based on Nyström type subsampling, motivated by well-studied manifold regularization schemes. We develop a theoretical analysis of the multi-penalty least-squares regularization scheme under a general source condition in the vector-valued function setting, so the results also apply to multi-task learning problems. We achieve optimal minimax convergence rates for multi-penalty regularization using the concept of effective dimension for an appropriate subsampling size. We discuss an aggregation approach based on the linear function strategy to combine various Nyström approximants. Finally, we demonstrate the performance of multi-penalty regularization based on Nyström type subsampling on the Caltech-101 data set for multi-class image classification and the NSL-KDD benchmark data set for the intrusion detection problem.
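The core computational trick the abstract relies on can be illustrated in isolation: Nyström subsampling replaces the full n × n kernel matrix with a low-rank factorization built from m ≪ n landmark points. A minimal sketch with a toy RBF kernel and uniformly chosen landmarks (illustrative only, not the paper's multi-penalty scheme):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel between rows of X and rows of Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))   # n = 200 points in R^3
m = 20                          # subsample size, m << n
idx = rng.choice(len(X), size=m, replace=False)
Xm = X[idx]                     # Nystrom landmark points

K_nm = rbf_kernel(X, Xm)        # n x m cross-kernel
K_mm = rbf_kernel(Xm, Xm)       # m x m landmark kernel
# Rank-m Nystrom approximation: K ~= K_nm K_mm^+ K_nm^T
K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

# Full matrix computed here only to measure the approximation error;
# in a large-scale solver it is never formed.
K_full = rbf_kernel(X, X)
rel_err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
```

Downstream solvers need only the n × m and m × m factors, which is where the complexity saving for big data comes from.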

    Structure of CdTe/ZnTe superlattices

    The structure of CdTe/ZnTe superlattices has been analyzed through θ/2θ x‐ray diffraction, photoluminescence, and in situ reflection high‐energy electron diffraction (RHEED) measurements. Samples are found to break away from Cd_(x)Zn_(1−x)Te buffer layers as a consequence of the 6% lattice mismatch in this system. However, defect densities in these superlattices are seen to drop dramatically away from the buffer layer interface, accounting for the intense photoluminescence and high average strain fields seen in each of our samples. Observed variations in residual strains suggest that growth conditions play a role in forming misfit defects. This could explain discrepancies with calculated values of critical thickness based on models which neglect growth conditions. Photoluminescence spectra reveal that layer‐to‐layer growth proceeded with single monolayer uniformity, suggesting highly reproducible growth. Our results give hope for relatively defect‐free Cd_(x)Zn_(1−x)Te/Cd_(y)Zn_(1−y)Te superlattices with the potential for applications to optoelectronics offered by intense visible light emitters.

    South Asian ethnicity is associated with a lower prevalence of atrial fibrillation despite greater prevalence of established risk factors: a population-based study in Bradford Metropolitan District

    Aims: Previous studies indicate that South Asians (SAs) may have a reduced risk of developing atrial fibrillation (AF) despite having a higher prevalence of traditional cardiovascular risk factors. This observational study was designed to explore the relative differences between SAs and Whites in a well-defined, multi-ethnic population with careful consideration of traditional cardiovascular risk factors that are thought to contribute to the development of AF. Methods and results: Anonymized data from 417 575 adults were sourced from primary care records within Bradford Metropolitan District, UK. Atrial fibrillation diagnosis was indicated by presence on the AF Quality Outcomes Framework register. Self-reported ethnicity was mapped to census ethnic codes. Age-standardized prevalence rates of AF were calculated for comparison between the White and SA populations, giving relative proportions of 2.39% and 0.40%, respectively, in our study sample. Multivariable logistic regression analysis was performed to estimate the odds of developing AF given SA ethnicity. Adjustment for age, sex, and established risk factors found a 71% reduction in odds of AF in SAs when compared with Whites [odds ratio (OR): 0.29, 95% confidence interval (CI): 0.26–0.32]. When stratified by ethnicity, analyses revealed significantly different odds of AF for patients with diabetes; diabetes was not associated with the development of AF in the SA population (OR: 0.81, 95% CI: 0.63–1.05). Conclusion: This study, in a multi-ethnic population, presents ethnicity as a predictor of AF, with prevalence significantly lower in SAs when compared with Whites. This is despite SAs having a higher frequency of established risk factors for the development of AF, such as ischaemic heart disease, heart failure, hypertension, and type 2 diabetes.
    These findings are consistent with previous literature and add weight to the need for further investigation, although this is the first study to investigate the differential associations of individual risk factors with development of AF.
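For readers unfamiliar with the statistics, the headline figure (OR: 0.29) is an adjusted estimate from logistic regression, but the unadjusted version of the same quantity can be computed directly from a 2×2 table. A minimal sketch with made-up counts (illustrative only, not the Bradford data) using Woolf's confidence interval:

```python
import numpy as np

# Toy 2x2 table of counts (hypothetical, for illustration only):
#                    AF cases   non-cases
a, b = 40, 9960    # South Asian
c, d = 500, 19500  # White

or_hat = (a * d) / (b * c)                 # unadjusted odds ratio
se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)    # Woolf's SE on the log scale
lo, hi = np.exp(np.log(or_hat) + np.array([-1.96, 1.96]) * se_log)
```

With these toy counts the odds ratio is about 0.16, and the 95% CI excludes 1, the same shape of evidence the study reports after adjustment.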

    How much noise can be added in cardiac X-ray imaging without loss in perceived image quality?

    Dynamic X-ray imaging systems are used for interventional cardiac procedures to treat coronary heart disease. X-ray settings are controlled automatically by specially-designed X-ray dose control mechanisms whose role is to ensure an adequate level of image quality is maintained with an acceptable radiation dose to the patient. Current commonplace dose control designs quantify image quality by performing a simple technical measurement directly from the image. However, the utility of cardiac X-ray images is in their interpretation by a cardiologist during an interventional procedure, rather than in a technical measurement. With the long-term goal of devising a clinically-relevant image quality metric for an intelligent dose control system, we aim to investigate the relationship of image noise with clinical professionals’ perception of dynamic image sequences. Computer-generated noise was added, in incremental amounts, to angiograms of five different patients selected to represent the range of adult cardiac patient sizes. A two-alternative forced choice staircase experiment was used to determine the amount of noise which can be added to patient image sequences without changing image quality as perceived by clinical professionals. Twenty-five viewing sessions (five for each patient) were completed by thirteen observers. Results demonstrated scope to increase the noise of cardiac X-ray images by up to 21% ± 8% before it is noticeable by clinical professionals. This indicates a potential for 21% radiation dose reduction, since X-ray image noise and radiation dose are directly related; this would be beneficial to both patients and personnel.
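The claimed link between noise and dose rests on the standard quantum-noise model, in which noise variance scales inversely with dose. Under that assumption, a full-dose image can be degraded to mimic a lower-dose acquisition by adding zero-mean Gaussian noise of the right magnitude; a sketch of that simulation (toy image and hypothetical noise level, not the study's angiograms):

```python
import numpy as np

def simulate_dose_reduction(image, sigma_orig, dose_fraction, rng):
    """Add zero-mean Gaussian noise so the result mimics an image
    acquired at dose_fraction (0 < f <= 1) of the original dose.
    Quantum-noise model: sigma_total^2 = sigma_orig^2 / f, hence
    sigma_add^2 = sigma_orig^2 * (1/f - 1)."""
    sigma_add = sigma_orig * np.sqrt(1.0 / dose_fraction - 1.0)
    return image + rng.normal(0.0, sigma_add, size=image.shape)

rng = np.random.default_rng(1)
img = rng.normal(100.0, 2.0, size=(64, 64))  # toy frame, noise sigma ~ 2
# Simulate a ~21% dose cut (dose fraction 0.79, as in the abstract's figure)
low = simulate_dose_reduction(img, 2.0, 0.79, rng)
```

The same model run in reverse justifies the conclusion: if observers tolerate the extra noise, the corresponding dose fraction could have been used at acquisition time.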

    Improving outcomes in interstitial lung disease through the application of bioinformatics and systems biology

    Idiopathic pulmonary fibrosis (IPF) and chronic obstructive pulmonary disease (COPD) are two distinct respiratory diseases whose features, including pathogenesis and progression, are not fully understood. However, for both diseases clinicians utilise changes in serial pulmonary function measurements to gain an insight into disease severity and control. More accurate prediction of disease progression would be beneficial, particularly for IPF, given that the variability of its clinical course is unknown at the time of diagnosis. Home-based, real-time monitoring of disease progression by spirometry has provided an opportunity to optimise the delivery of treatment and reduce the length of clinical trials, improving the potential to understand the mechanisms underlying disease progression and to generate effective treatment. In light of this, the motivation for this project is to understand the mathematical features within daily pulmonary function time series generated by IPF patients. It is hoped that statistical models of pulmonary function time series will aid the identification of significant clinical events such as acute exacerbation. The mathematical techniques used to identify potentially important features within pulmonary function time series involved the autocorrelation function, critical transitions, and detrended fluctuation analysis (DFA). Temporal properties, such as serial correlation, abrupt changes in trend, and complexity, were assessed using time series from the PROFILE clinical trial and the London COPD cohort. Forced vital capacity (FVC) measurements were found to be correlated with the previous day’s reading, which may inform the sampling rate of lung function during clinical trials. The presence of short-term memory within FVC time series will influence the management of missing data within clinical trials, particularly methods of imputation.
    Also, FVC time series exhibit long-term memory and adaptability, supporting the role of FVC as a surrogate marker for IPF disease progression.
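The day-to-day correlation reported for FVC can be made concrete with the sample autocorrelation function. A minimal sketch on a synthetic AR(1) series standing in for daily home spirometry (the coefficient 0.7, baseline 3.0 L, and noise level are illustrative, not estimates from the PROFILE or COPD data):

```python
import numpy as np

def lag_autocorr(x, lag=1):
    # Sample autocorrelation of a series at a given lag
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rng = np.random.default_rng(2)
n = 500
fvc = np.empty(n)
fvc[0] = 3.0
for t in range(1, n):
    # AR(1): today's FVC depends on yesterday's plus measurement noise
    fvc[t] = 3.0 * (1 - 0.7) + 0.7 * fvc[t - 1] + rng.normal(0, 0.05)

r1 = lag_autocorr(fvc, lag=1)  # estimate of the day-to-day correlation
```

A lag-1 autocorrelation well above zero, as here, is exactly the kind of short-term memory the abstract says should shape sampling rates and imputation of missing readings.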

    Can Image Enhancement Allow Radiation Dose to Be Reduced Whilst Maintaining the Perceived Diagnostic Image Quality Required for Coronary Angiography?

    Objectives: The aim of this research was to quantify the reduction in radiation dose facilitated by image processing alone for percutaneous coronary intervention (PCI) patient angiograms, without reducing the perceived image quality required to confidently make a diagnosis. Methods: Incremental amounts of image noise were added to five PCI angiograms, simulating the angiogram as having been acquired at corresponding lower dose levels (10–89% dose reduction). Sixteen observers with relevant experience scored the image quality of these angiograms in three states: with no image processing, and with each of two different modern image processing algorithms applied. These algorithms are used on state-of-the-art and previous generation cardiac interventional X-ray systems. Ordinal regression allowing for random effects and the delta method were used to quantify the dose reduction possible by the processing algorithms, for equivalent image quality scores. Results: Observers rated the quality of the images processed with the state-of-the-art and previous generation image processing with a 24.9% and 15.6% dose reduction, respectively, as equivalent in quality to the unenhanced images. The dose reduction facilitated by the state-of-the-art image processing relative to previous generation processing was 10.3%. Conclusions: Results demonstrate that statistically significant dose reduction can be facilitated with no loss in perceived image quality using modern image enhancement; the most recent processing algorithm was more effective in preserving image quality at lower doses. Advances in knowledge: Image enhancement was shown to maintain perceived image quality in coronary angiography at a reduced level of radiation dose, using computer software to produce synthetic images from real angiograms simulating a reduction in dose.

    Electronic Structure of Te and As Covered Si(211)

    Electronic and atomic structures of the clean, As-covered, and Te-covered Si(211) surface are studied using the pseudopotential density functional method. The clean surface is found to have (2×1) and rebonded (1×1) reconstructions as stable surface structures, but no π-bonded chain reconstruction. Binding energies of As and Te adatoms at a number of symmetry sites on the ideal and (2×1) reconstructed surfaces have been calculated because of their importance in the epitaxial growth of CdTe and other materials on the Si(211) surface. The special symmetry sites on these surfaces having the highest binding energies for isolated As and Te adatoms are identified. More significantly, several sites are found to be nearly degenerate in binding energy, which has important consequences for epitaxial growth processes. Optimal structures calculated for 0.5 ML of As and Te coverage reveal that the As adatoms dimerize on the surface while the Te adatoms do not. However, both As- and Te-covered surfaces are found to be metallic in nature. (17 pages, 9 figures; accepted for publication in Phys. Rev.)

    Random sampling of signals concentrated on compact set in localized reproducing kernel subspace of L^p(R^n)

    The paper is devoted to studying the stability of random sampling in a localized reproducing kernel space. We show that if the sampling set on a compact set Ω discretizes the integral norm of simple functions up to a given error, then the sampling set is stable for the set of functions concentrated on Ω. Moreover, we prove that, with overwhelming probability, O(μ(Ω)(log μ(Ω))^3) random points uniformly distributed over Ω yield a stable set of sampling for functions concentrated on Ω. (17 pages)
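For orientation, a stable set of sampling in this L^p setting is typically defined through a two-sided norm equivalence. A schematic statement of the probabilistic result (the constants A and B, the failure probability, and the exact normalization are illustrative, not taken from the paper) reads:

```latex
% Schematic random sampling inequality: with probability at least 1-\delta,
% for all f in the localized reproducing kernel subspace concentrated on \Omega,
A \, \|f\|_{L^p}^p
  \;\le\; \frac{\mu(\Omega)}{N} \sum_{j=1}^{N} |f(x_j)|^p
  \;\le\; B \, \|f\|_{L^p}^p ,
\qquad N \gtrsim \mu(\Omega)\,\bigl(\log \mu(\Omega)\bigr)^{3},
```

where x_1, ..., x_N are independent points drawn uniformly from Ω. The lower inequality is the stability claim: the random samples capture a fixed fraction of the norm of every concentrated function.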