A Path to Implement Precision Child Health Cardiovascular Medicine.
Congenital heart defects (CHDs) affect approximately 1% of live births and are a major source of childhood morbidity and mortality, even in countries with advanced healthcare systems. Along with phenotypic heterogeneity, the underlying etiology of CHDs is multifactorial, involving genetic, epigenetic, and/or environmental contributors. Clear dissection of the underlying mechanism is a powerful step toward establishing individualized therapies. However, the majority of CHDs have yet to be clearly diagnosed with respect to their underlying genetic and environmental factors, and fewer still have effective therapies. Although the survival rate for CHDs is steadily improving, there is still a significant unmet need to refine diagnostic precision and establish targeted therapies that optimize quality of life and minimize future complications. In particular, proper identification of disease-associated genetic variants in humans has been challenging, and this greatly impedes our ability to delineate the gene-environment interactions that contribute to the pathogenesis of CHDs. Implementing a systematic, multileveled approach can establish a continuum from phenotypic characterization in the clinic to molecular dissection using combined next-generation sequencing platforms and validation studies in suitable models at the bench. Key elements necessary to advance the field are: first, proper delineation of the phenotypic spectrum of CHDs; second, defining the molecular genotype/phenotype by combining whole-exome sequencing and transcriptome analysis; third, integration of phenotypic, genotypic, and molecular datasets to identify molecular networks contributing to CHDs; and fourth, generation of relevant disease models and multileveled experimental investigations. To achieve all these goals, access to high-quality biological specimens from well-defined patient cohorts is a crucial step.
Therefore, establishing a CHD BioCore is an essential infrastructure and a critical step on the path toward precision child health cardiovascular medicine.
Solving variational inequalities defined on a domain with infinitely many linear constraints
We study a variational inequality problem whose domain is defined by infinitely many linear inequalities. A discretization method and an analytic-center-based inexact cutting plane method are proposed. Under proper assumptions, convergence results for both methods are given. We also provide numerical examples to illustrate the proposed methods.
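The cutting-plane idea above can be illustrated on a toy semi-infinite LP rather than a variational inequality (the constraint family and tolerances below are invented for illustration): solve a relaxation with finitely many constraints, then add the most violated member of the infinite family and repeat.

```python
import numpy as np
from scipy.optimize import linprog

# Toy semi-infinite LP (stand-in for a domain with infinitely many linear
# constraints): maximize x1 + x2 subject to
#   x1*cos(t) + x2*sin(t) <= 1  for all t in [0, pi/2],  x >= 0.
# The true optimum is x = (1/sqrt(2), 1/sqrt(2)) with value sqrt(2).
ts = [0.0, np.pi / 2]                       # initial finite discretization
grid = np.linspace(0.0, np.pi / 2, 2001)    # separation oracle: check a fine grid
for _ in range(100):
    A = np.array([[np.cos(t), np.sin(t)] for t in ts])
    res = linprog(c=[-1.0, -1.0], A_ub=A, b_ub=np.ones(len(ts)),
                  bounds=[(0, None), (0, None)])
    x = res.x
    viol = np.cos(grid) * x[0] + np.sin(grid) * x[1] - 1.0
    k = int(np.argmax(viol))
    if viol[k] < 1e-6:                      # (near-)feasible for all t: done
        break
    ts.append(grid[k])                      # add the most violated constraint
print(x[0] + x[1])                          # close to sqrt(2) ~ 1.41421
```

Only a handful of cuts are needed in practice, since each added constraint roughly halves the angular gap around the binding direction.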
A reduction method for semi-infinite programming by means of a global stochastic approach
We describe a reduction algorithm for solving semi-infinite programming problems. The proposed algorithm uses the simulated annealing method equipped with function stretching as a multi-local procedure, and a penalty technique for the finite optimization process. An exponential penalty merit function is reduced along each search direction to ensure convergence from any starting point. Our preliminary numerical results seem to show that the algorithm is very promising in practice.
Algoritmi Research Center; Fundação para a Ciência e a Tecnologia (FCT), bolsa POCI/MAT/58957/200
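A minimal sketch of just the multi-local ingredient described above (plain simulated annealing on a toy one-dimensional multimodal function; the function-stretching transform and the exponential penalty for the semi-infinite constraints are omitted, and all tuning constants are assumptions):

```python
import numpy as np

def f(x):
    # toy multimodal objective with several local minima
    return x**2 + 10.0 * np.sin(3.0 * x)

def simulated_annealing(x0, n_iter=20000, t0=10.0, cooling=0.9995, seed=0):
    rng = np.random.default_rng(seed)
    x, fx, temp = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(n_iter):
        cand = np.clip(x + rng.normal(scale=1.0), -5.0, 5.0)   # random move
        fc = f(cand)
        # accept downhill moves always, uphill moves with Boltzmann probability
        if fc < fx or rng.random() < np.exp(-(fc - fx) / temp):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
        temp *= cooling                                        # geometric cooling
    return best_x, best_f

best_x, best_f = simulated_annealing(x0=4.0)
print(best_x, best_f)   # the deepest basins of f lie near x = -0.51 and x = 1.56
```

A multi-local procedure would rerun this search after stretching the objective around each minimizer found, so that already-visited basins are deflected.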
Information geometry of Gaussian channels
We define a local Riemannian metric tensor in the manifold of Gaussian channels and the distance that it induces. We adopt an information-geometric approach and define a metric derived from the Bures-Fisher metric for quantum states. The resulting metric inherits several desirable properties from the Bures-Fisher metric and is operationally motivated by distinguishability considerations: it serves as an upper bound to the attainable quantum Fisher information for the channel parameters using Gaussian states, under generic constraints on the physically available resources. Our approach naturally includes the use of entangled Gaussian probe states. We prove that the metric enjoys desirable properties such as stability and covariance. As a byproduct, we also obtain some general results in Gaussian channel estimation that are the continuous-variable analogs of previously known results in finite dimensions. We prove that optimal probe states are always pure and bounded in the number of ancillary modes, even in the presence of constraints on the reduced state input into the channel. This has experimental and computational implications: it limits the complexity of optimal experimental setups for channel estimation and reduces the computational requirements for evaluating the metric; indeed, we construct a converging algorithm for its computation. We provide explicit formulae for computing the multiparametric quantum Fisher information for dissipative channels probed with arbitrary Gaussian states, and provide the optimal observables for the estimation of the channel parameters (e.g. bath couplings, squeezing, and temperature).
Comment: 19 pages, 4 figures
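A small, hedged instance of Gaussian-channel estimation (a textbook special case, not the metric constructed in the abstract above): for a pure-loss channel with transmissivity eta probed by a coherent state |alpha>, the output stays coherent with amplitude sqrt(eta)*alpha, so the quantum Fisher information follows numerically from the overlap of neighboring output states and matches the known closed form alpha**2 / eta.

```python
import numpy as np

def coherent_overlap(b1, b2):
    # |<b1|b2>| for coherent states with real amplitudes
    return np.exp(-0.5 * (b1 - b2) ** 2)

def qfi_numeric(alpha, eta, d=1e-5):
    # QFI via the Bures-distance limit 8*(1 - sqrt(fidelity))/d**2,
    # where for pure states sqrt(fidelity) is just the overlap.
    out = lambda e: np.sqrt(e) * alpha              # output amplitude
    f = coherent_overlap(out(eta), out(eta + d))
    return 8.0 * (1.0 - f) / d**2

alpha, eta = 2.0, 0.5
print(qfi_numeric(alpha, eta))   # ~ alpha**2 / eta = 8.0
```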
Air pollution control with semi-infinite programming
Environmental issues are more important than ever in modern society, and complying with stricter legal thresholds on pollution emissions raises an important economic issue. This paper presents some ideas on the use of optimization tools to help in the planning and control of stationary pollution sources.
Three main semi-infinite programming approaches are described. The first consists in optimizing an objective function while the pollution level in a given region is kept below a given threshold. In the second approach, the maximum pollution level in a given region is computed, and in the third, an air pollution abatement problem is considered. These formulations make it possible to obtain the best control parameters and the maximum-pollution positions, where the sampling stations should be placed.
A specific modeling language was used to code four academic problems. Numerical results computed with a semi-infinite programming solver are shown.
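The second approach above (computing the maximum pollution level over a region, which also locates the sampling station) can be sketched on a toy surrogate model; the Gaussian-decay concentration function, source positions, and emission rates below are all invented for illustration.

```python
import numpy as np

# Toy "maximum pollution level" problem: ground-level concentration from two
# stationary sources, modeled with a simple Gaussian decay (assumed surrogate,
# not a real dispersion model).
sources = [((2.0, 3.0), 5.0), ((7.0, 6.0), 3.0)]   # (position, emission rate)

def concentration(x, y):
    c = 0.0
    for (sx, sy), q in sources:
        r2 = (x - sx) ** 2 + (y - sy) ** 2
        c += q * np.exp(-r2 / 4.0)          # decay with distance from source
    return c

# Discretize the region [0, 10]^2 and locate the peak concentration,
# i.e. where a sampling station should be placed:
xs, ys = np.meshgrid(np.linspace(0, 10, 501), np.linspace(0, 10, 501))
c = concentration(xs, ys)
i, j = np.unravel_index(np.argmax(c), c.shape)
print(f"max level {c[i, j]:.3f} at ({xs[i, j]:.2f}, {ys[i, j]:.2f})")
```

In the semi-infinite formulation, the grid plays the role of a discretization of the infinite index set (every point of the region); a finer grid or a local refinement step sharpens the estimate of the maximum-pollution position.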
Algorithm Engineering in Robust Optimization
Robust optimization is a young and emerging field of research that has received considerable interest over the last decade. In this paper, we argue that the algorithm engineering methodology fits the field of robust optimization very well and yields a rewarding new perspective on both the current state of research and open research directions.
To this end, we go through the algorithm engineering cycle of design and analysis of concepts, development and implementation of algorithms, and theoretical and experimental evaluation. We show that many ideas of algorithm engineering have already been applied in publications on robust optimization. Most work on robust optimization is devoted to the analysis of concepts and the development of algorithms, some papers deal with the evaluation of a particular concept in case studies, and work comparing concepts is only just beginning. A remaining drawback in many papers on robustness is the missing link that feeds the results of the experiments back into the design.
Wnt11 regulates cardiac chamber development and disease during perinatal maturation
Ventricular chamber growth and development during the perinatal circulatory transition is critical for functional adaptation of the heart. However, the chamber-specific programs of neonatal heart growth are poorly understood. Using integrated systems genomics and functional biology analyses of the perinatal chamber-specific transcriptome, we identified Wnt11 as a prominent regulator of chamber-specific proliferation. Importantly, downregulation of Wnt11 expression was associated with cyanotic congenital heart defect (CHD) phenotypes and correlated with O2 saturation levels in hypoxemic infants with Tetralogy of Fallot (TOF). Perinatal hypoxia treatment in mice suppressed Wnt11 expression and induced myocyte proliferation more robustly in the right ventricle, modulating Rb1 protein activity. Wnt11 inactivation was sufficient to induce myocyte proliferation in perinatal mouse hearts and reduced Rb1 protein levels and phosphorylation in neonatal cardiomyocytes. Finally, downregulated Wnt11 in the hearts of hypoxemic TOF infants was associated with Rb1 suppression and induction of proliferation markers. This study reveals a previously uncharacterized function of Wnt11-mediated signaling as an important player in programming the chamber-specific growth of the neonatal heart, a function that influences chamber-specific development and pathogenesis in response to hypoxia and cyanotic CHDs. Defining the underlying regulatory mechanism may yield chamber-specific therapies for infants born with CHDs.
4D MUSIC CMR: value-based imaging of neonates and infants with congenital heart disease
Background: 4D Multiphase Steady State Imaging with Contrast (MUSIC) acquires high-resolution volumetric images of the beating heart during uninterrupted ventilation. We aim to evaluate the diagnostic performance and clinical impact of 4D MUSIC in a cohort of neonates and infants with congenital heart disease (CHD).
Methods: Forty consecutive neonates and infants with CHD (age range 2 days to 2 years, weight 1 to 13 kg) underwent 3.0 T CMR with ferumoxytol enhancement (FE) at a single institution. Independently, two readers graded the diagnostic image quality of intra-cardiac structures and related vascular segments on FE-MUSIC and breath-held FE-CMRA images using a four-point scale. Correlation of the CMR findings with surgery and other imaging modalities was performed in all patients. Clinical impact was evaluated in consensus with referring surgeons and cardiologists. One point was given for each of five key outcome measures: 1) change in overall management; 2) change in surgical approach; 3) reduction in the need for diagnostic catheterization; 4) improved assessment of risk-to-benefit for planned intervention and discussion with parents; 5) accurate pre-procedural roadmap.
Results: All FE-CMR studies were completed successfully, safely, and without adverse events. On a four-point scale, the average FE-MUSIC image quality scores were >3.5 for intra-cardiac structures and >3.0 for coronary arteries. Intra-cardiac morphology and vascular anatomy were well visualized with good interobserver agreement (r = 0.46). Correspondence between the findings on MUSIC, surgery, correlative imaging, and autopsy was excellent. The average clinical impact score was 4.2 ± 0.9. In five patients with discordant findings on echo/MUSIC (n = 5) and catheter angiography/MUSIC (n = 1), findings on FE-MUSIC were shown to be accurate at autopsy (n = 1) and surgery (n = 4). The decision to undertake biventricular vs univentricular repair was amended in 2 patients based on FE-MUSIC findings. Plans for surgical approaches that would have involved circulatory arrest were amended in two of 28 surgical cases. In all 28 cases requiring procedural intervention, FE-MUSIC provided accurate dynamic 3D roadmaps and more confident risk-to-benefit assessments for proposed interventions.
Conclusions: FE-MUSIC CMR has high clinical impact by providing accurate, high-quality, simple, and safe dynamic 3D imaging of cardiac and vascular anatomy in neonates and infants with CHD. The findings influenced patient management in a positive manner.
L2-norm multiple kernel learning and its application to biomedical data fusion
Background: This paper introduces the notion of optimizing different norms in the dual problem of support vector machines with multiple kernels. The selection of norms yields different extensions of multiple kernel learning (MKL), such as L∞, L1, and L2 MKL. In particular, L2 MKL is a novel method that leads to non-sparse optimal kernel coefficients, in contrast to the sparse kernel coefficients optimized by the existing L∞ MKL method. In real biomedical applications, L2 MKL may have advantages over sparse integration methods for thoroughly combining complementary information in heterogeneous data sources.
Results: We provide a theoretical analysis of the relationship between the L2 optimization of kernels in the dual problem and the L2 coefficient regularization in the primal problem. Understanding the dual L2 problem grants a unified view on MKL and enables us to extend the L2 method to a wide range of machine learning problems. We implement L2 MKL for ranking and classification problems and compare its performance with the sparse L∞ and the averaging L1 MKL methods. The experiments are carried out on six real biomedical data sets and two large-scale UCI data sets. L2 MKL yields better performance on most of the benchmark data sets. In particular, we propose a novel L2 MKL least squares support vector machine (LSSVM) algorithm, which is shown to be an efficient and promising classifier for large-scale data set processing.
Conclusions: This paper extends the statistical framework of genomic data fusion based on MKL. Allowing non-sparse weights on the data sources is an attractive option in settings where we believe most data sources to be relevant to the problem at hand and want to avoid the "winner-takes-all" effect seen in L∞ MKL, which can be detrimental to performance in prospective studies. The notion of optimizing L2 kernels can be straightforwardly extended to ranking, classification, regression, and clustering algorithms. To tackle the computational burden of MKL, this paper proposes several novel LSSVM-based MKL algorithms. Systematic comparison on real data sets shows that LSSVM MKL has comparable performance to conventional SVM MKL algorithms. Moreover, large-scale numerical experiments indicate that, when cast as semi-infinite programming, LSSVM MKL can be solved more efficiently than SVM MKL.
Availability: The MATLAB code of the algorithms implemented in this paper is downloadable from http://homes.esat.kuleuven.be/~sistawww/bioi/syu/l2lssvm.html.
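A minimal sketch of the LSSVM building block underlying the approach above, with a fixed non-sparse combination of two kernels (the uniform weights, toy data, and hyperparameters are all assumptions; the actual MKL step, which optimizes the kernel weights, e.g. via semi-infinite programming, is omitted):

```python
import numpy as np

def rbf(X1, X2, gamma):
    # Gaussian (RBF) kernel matrix between two sample sets
    d = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

def lssvm_train(K, y, C=10.0):
    # LSSVM classifier: solve the KKT linear system instead of a QP
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * K + np.eye(n) / C
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega
    rhs = np.concatenate([[0.0], np.ones(n)])
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]                  # bias b, dual weights alpha

# two toy "data sources" -> two kernels, combined with fixed uniform weights
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + X[:, 1])
K = 0.5 * rbf(X, X, 0.5) + 0.5 * rbf(X, X, 5.0)   # non-sparse kernel mix
b, alpha = lssvm_train(K, y)
pred = np.sign(K @ (alpha * y) + b)               # decision on training points
print((pred == y).mean())                         # training accuracy
```

Training reduces to one dense linear solve, which is the efficiency argument made for LSSVM MKL; a full MKL variant would re-optimize the two mixing weights rather than fixing them at 0.5.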
