
    Finite unions of balls in C^n are rationally convex

    It is shown that the rational convexity of any finite union of disjoint closed balls in $\mathbb{C}^n$ follows easily from the results of Duval and Sibony. Comment: v2, minor edits, 2 pages.

    Uniformization of strictly pseudoconvex domains

    It is shown that two strictly pseudoconvex Stein domains with real analytic boundaries have biholomorphic universal coverings provided that their boundaries are locally biholomorphically equivalent. This statement can be regarded as a higher-dimensional analogue of the Riemann uniformization theorem.

    On detecting harmonic oscillations

    In this paper, we focus on the following testing problem: assume that we are given observations of a real-valued signal along the grid $0,1,\ldots,N-1$, corrupted by white Gaussian noise. We want to distinguish between two hypotheses: (a) the signal is a nuisance, i.e., a linear combination of $d_n$ harmonic oscillations of known frequencies, and (b) the signal is the sum of a nuisance and a linear combination of a given number $d_s$ of harmonic oscillations with unknown frequencies, such that the distance (measured in the uniform norm on the grid) between the signal and the set of nuisances is at least $\rho>0$. We propose a computationally efficient test for distinguishing between (a) and (b) and show that its "resolution" (the smallest value of $\rho$ for which (a) and (b) are distinguished with a given confidence $1-\alpha$) is $O(\sqrt{\ln(N/\alpha)/N})$, with the hidden factor depending solely on $d_n$ and $d_s$ and independent of the frequencies in question. We show that this resolution, up to a factor which is polynomial in $d_n$, $d_s$ and logarithmic in $N$, is the best possible under the circumstances. We further extend the outlined results to the case of nuisances and signals close to linear combinations of harmonic oscillations, and provide illustrative numerical results. Comment: Published at http://dx.doi.org/10.3150/14-BEJ600 in Bernoulli (http://isi.cbs.nl/bernoulli/) by the International Statistical Institute/Bernoulli Society (http://isi.cbs.nl/BS/bshome.htm).
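    A minimal sketch of the detection setting, not the test constructed in the paper: a classical periodogram-peak statistic already separates pure white noise from noise plus a sinusoid of unknown frequency. The grid size, amplitude, frequency, and phase below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N = 256
    t = np.arange(N)

    # Hypothesis (a): pure white Gaussian noise (here with no known-frequency nuisance).
    noise = rng.standard_normal(N)

    # Hypothesis (b): noise plus one harmonic oscillation of unknown frequency (bin 17).
    signal = noise + 1.5 * np.cos(2 * np.pi * 17 * t / N + 0.3)

    def peak_statistic(x):
        """Largest normalized periodogram ordinate -- a classical detection statistic."""
        return np.max(np.abs(np.fft.rfft(x)) ** 2) / len(x)

    print(peak_statistic(noise), peak_statistic(signal))
    ```

    For white noise the statistic concentrates around $\ln N$, while a coherent sinusoid of amplitude $a$ contributes a peak of order $a^2 N/4$, which is what makes resolutions of order $\sqrt{\ln(N/\alpha)/N}$ plausible.
    
    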

    Solving Variational Inequalities with Monotone Operators on Domains Given by Linear Minimization Oracles

    The standard algorithms for solving large-scale convex-concave saddle point problems, or, more generally, variational inequalities with monotone operators, are proximal-type algorithms which at every iteration need to compute a prox-mapping, that is, to minimize over the problem's domain $X$ the sum of a linear form and the specific convex distance-generating function underlying the algorithm in question. Relative computational simplicity of prox-mappings, the standard requirement when implementing proximal algorithms, clearly implies the possibility to equip $X$ with a relatively cheap Linear Minimization Oracle (LMO) able to minimize linear forms over $X$. There are, however, important situations where a cheap LMO is available, but no proximal setup with easy-to-compute prox-mappings is known. This fact motivates our goal in this paper: to develop techniques for solving variational inequalities with monotone operators on domains given by Linear Minimization Oracles. The techniques we develop can be viewed as a substantial extension of the method of nonsmooth convex minimization over an LMO-represented domain proposed in [5].
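    To make the LMO concept concrete, here is the simplest LMO-based scheme, the Frank-Wolfe (conditional gradient) method, on a toy smooth minimization over the probability simplex; the paper's algorithms for general monotone variational inequalities are substantially more involved, so this is only an illustration of why an LMO call replaces a prox-mapping.

    ```python
    import numpy as np

    def lmo_simplex(g):
        """Linear Minimization Oracle over the probability simplex:
        argmin_{x in simplex} <g, x> is the vertex e_i with i = argmin_i g_i."""
        x = np.zeros_like(g)
        x[np.argmin(g)] = 1.0
        return x

    def frank_wolfe(grad, x0, iters=200):
        """Basic conditional-gradient loop: only LMO calls, no prox-mappings."""
        x = x0
        for k in range(iters):
            s = lmo_simplex(grad(x))
            gamma = 2.0 / (k + 2)          # standard open-loop step size
            x = (1 - gamma) * x + gamma * s
        return x

    # Toy problem: minimize ||x - c||^2 over the simplex; gradient is 2(x - c).
    c = np.array([0.1, 0.5, 0.4])
    x = frank_wolfe(lambda x: 2 * (x - c), np.ones(3) / 3)
    ```

    Since `c` lies in the simplex, the optimum is `c` itself, and the classical $O(1/k)$ guarantee bounds the objective gap after 200 iterations by $2LD^2/(k+2) \approx 0.04$ here.
    
    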

    Accuracy guarantees for L1-recovery

    We discuss two new methods of recovery of sparse signals from noisy observations based on $\ell_1$-minimization. They are closely related to well-known techniques such as Lasso and the Dantzig Selector. However, these estimators come with efficiently verifiable guarantees of performance. By optimizing these bounds with respect to the method parameters we are able to construct estimators with better statistical properties than the commonly used ones. We also show how these techniques allow us to provide efficiently computable accuracy bounds for Lasso and the Dantzig Selector. We link our performance estimates to well-known results of Compressive Sensing and justify the proposed approach with an oracle inequality relating the properties of the recovery algorithms to the best estimation performance achievable when the signal support is known. We demonstrate how the estimates can be computed using the Non-Euclidean Basis Pursuit algorithm.
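    For readers unfamiliar with the $\ell_1$-recovery setting, a standard solver for the Lasso problem is sketched below (proximal gradient / ISTA); this is neither of the paper's estimators nor its Non-Euclidean Basis Pursuit algorithm, and the problem dimensions, sparsity, and regularization weight are illustrative assumptions.

    ```python
    import numpy as np

    def ista_lasso(A, y, lam, iters=500):
        """Proximal gradient (ISTA) for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
        L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth gradient
        x = np.zeros(A.shape[1])
        for _ in range(iters):
            g = A.T @ (A @ x - y)              # gradient of 0.5*||Ax - y||^2
            z = x - g / L
            x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)  # soft-thresholding
        return x

    # Toy sparse-recovery instance: 3-sparse signal, 40 noisy Gaussian measurements.
    rng = np.random.default_rng(1)
    A = rng.standard_normal((40, 100)) / np.sqrt(40)
    x_true = np.zeros(100)
    x_true[[3, 17, 60]] = [2.0, -1.5, 1.0]
    y = A @ x_true + 0.01 * rng.standard_normal(40)
    x_hat = ista_lasso(A, y, lam=0.05)
    ```

    With far fewer measurements than unknowns, the $\ell_1$ penalty drives most coordinates of `x_hat` exactly to zero, recovering the support of `x_true` up to the usual shrinkage bias.
    
    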

    Non-asymptotic confidence bounds for the optimal value of a stochastic program

    We discuss a general approach to building non-asymptotic confidence bounds for stochastic optimization problems. Our principal contribution is the observation that a Sample Average Approximation of a problem supplies upper and lower bounds for the optimal value of the problem which are essentially more accurate than the quality of the corresponding optimal solutions. At the same time, such bounds are more reliable than "standard" confidence bounds obtained through the asymptotic approach. We also discuss bounding the optimal value of MinMax Stochastic Optimization and stochastically constrained problems. We conclude with a simulation study illustrating the numerical behavior of the proposed bounds.
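    The classical Monte Carlo construction behind SAA-based bounds can be sketched on a toy problem, $\min_{x\in[-1,1]} \mathbb{E}[(x-\xi)^2]$ with $\xi \sim \mathcal{N}(0.3, 1)$, whose true optimal value is $\mathrm{Var}(\xi)=1$; the replication counts and confidence level below are illustrative assumptions, not the paper's prescriptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def saa_value(sample):
        """SAA of min_{x in [-1,1]} E[(x - xi)^2]: the minimizer is the clipped sample mean."""
        x = np.clip(sample.mean(), -1.0, 1.0)
        return x, np.mean((x - sample) ** 2)

    # Lower confidence bound: average the SAA optimal values over M independent
    # replications (E[SAA optimal value] <= true optimal value), minus a
    # normal-approximation margin.
    M, N = 20, 500
    vals = np.array([saa_value(rng.normal(0.3, 1.0, N))[1] for _ in range(M)])
    lower = vals.mean() - 1.64 * vals.std(ddof=1) / np.sqrt(M)

    # Upper confidence bound: fix one candidate solution and evaluate its expected
    # cost on a fresh sample (any fixed feasible x has cost >= the optimal value).
    x_cand, _ = saa_value(rng.normal(0.3, 1.0, N))
    fresh = rng.normal(0.3, 1.0, 5000)
    costs = (x_cand - fresh) ** 2
    upper = costs.mean() + 1.64 * costs.std(ddof=1) / np.sqrt(len(costs))

    print(lower, upper)   # brackets the true optimal value Var(xi) = 1
    ```

    The downward bias of the SAA optimal value is what makes the replication average a valid lower-bound estimator, while the upper bound only requires evaluating one fixed solution out of sample.
    
    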