Intrinsic Volumes of Polyhedral Cones: A combinatorial perspective
The theory of intrinsic volumes of convex cones has recently found striking
applications in areas such as convex optimization and compressive sensing. This
article provides a self-contained account of the combinatorial theory of
intrinsic volumes for polyhedral cones. Direct derivations of the General
Steiner formula, the conic analogues of the Brianchon-Gram-Euler and the
Gauss-Bonnet relations, and the Principal Kinematic Formula are given. In
addition, a connection between the characteristic polynomial of a hyperplane
arrangement and the intrinsic volumes of the regions of the arrangement, due to
Klivans and Swartz, is generalized and some applications are presented.
Comment: Survey, 23 pages.
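As a concrete illustration of the objects this survey studies (the example is not taken from the survey itself), the intrinsic volumes of the nonnegative orthant R^d_+ are known in closed form: v_k = C(d, k) / 2^d. A minimal sketch in Python, which also checks the general fact that the intrinsic volumes of any closed convex cone sum to one:

```python
from math import comb

def orthant_intrinsic_volumes(d):
    """Intrinsic volumes v_0, ..., v_d of the nonnegative orthant R^d_+.

    For this cone the volumes have the closed form v_k = C(d, k) / 2^d.
    """
    return [comb(d, k) / 2**d for k in range(d + 1)]

v = orthant_intrinsic_volumes(4)
print(v)          # [0.0625, 0.25, 0.375, 0.25, 0.0625]
# The intrinsic volumes of any closed convex cone sum to 1,
# a consequence of the General Steiner formula at lambda = 0.
print(sum(v))     # 1.0
```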
Effective Condition Number Bounds for Convex Regularization
We derive bounds relating Renegar's condition number to quantities that
govern the statistical performance of convex regularization in settings that
include the ℓ1-analysis setting. Using results from conic integral
geometry, we show that the bounds can be made to depend only on a random
projection, or restriction, of the analysis operator to a lower dimensional
space, and can still be effective if these operators are ill-conditioned. As an
application, we get new bounds for the undersampling phase transition of
composite convex regularizers. Key tools in the analysis are Slepian's
inequality and the kinematic formula from integral geometry.
Comment: 17 pages, 4 figures. arXiv admin note: text overlap with arXiv:1408.301
Gordon's inequality and condition numbers in conic optimization
The probabilistic analysis of condition numbers has traditionally been
approached from different angles; one is based on Smale's program in complexity
theory and features integral geometry, while the other is motivated by
geometric functional analysis and makes use of the theory of Gaussian
processes. In this note we explore connections between the two approaches in
the context of the biconic homogeneous feasibility problem and the condition
numbers motivated by conic optimization theory. Key tools in the analysis are
Slepian's and Gordon's comparison inequalities for Gaussian processes,
interpreted as monotonicity properties of moment functionals, and their
interplay with ideas from conic integral geometry.
Generic Model Refactorings
Many modeling languages share some common concepts and principles. For example, Java, MOF, and UML share some aspects of the concepts of classes, methods, attributes, and inheritance. However, model transformations such as refactorings specified for a given language cannot be readily reused for another language because their related metamodels may be structurally different. Our aim is to enable a flexible reuse of model transformations across various metamodels. Thus, in this paper, we present an approach allowing the specification of generic model transformations, in particular refactorings, so that they can be applied to different metamodels. Our approach relies on two mechanisms: (1) an adaptation based mainly on the weaving of aspects; (2) the notion of model typing, an extension of object typing in the model-oriented context. We validated our approach by performing some experiments that consisted of specifying three well-known refactorings (Encapsulate Field, Move Method, and Pull Up Method) and applying each of them onto three different metamodels (Java, MOF, and UML).
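For readers unfamiliar with the refactorings named above, here is a minimal, language-agnostic sketch of what Encapsulate Field does, written in Python rather than the paper's Java/MOF/UML setting (the `Account` class and its invariant are hypothetical, chosen purely for illustration):

```python
# Before the refactoring: the field is public, so clients mutate it directly.
class AccountBefore:
    def __init__(self, balance):
        self.balance = balance

# After "Encapsulate Field": the field becomes private and all access goes
# through an accessor (a Python property), so invariants live in one place.
class AccountAfter:
    def __init__(self, balance):
        self._balance = balance

    @property
    def balance(self):
        return self._balance

    @balance.setter
    def balance(self, value):
        if value < 0:
            raise ValueError("balance cannot be negative")
        self._balance = value

acct = AccountAfter(100)
acct.balance = 50
print(acct.balance)  # 50
```

Existing client code that reads or writes `balance` keeps working unchanged, which is what makes the refactoring behavior-preserving from the caller's point of view.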
Convex recovery of a structured signal from independent random linear measurements
This chapter develops a theoretical analysis of the convex programming method
for recovering a structured signal from independent random linear measurements.
This technique delivers bounds for the sampling complexity that are similar
to recent results for standard Gaussian measurements, but the argument
applies to a much wider class of measurement ensembles. To demonstrate the
power of this approach, the paper presents a short analysis of phase retrieval
by trace-norm minimization. The key technical tool is a framework, due to
Mendelson and coauthors, for bounding a nonnegative empirical process.
Comment: 18 pages, 1 figure. To appear in "Sampling Theory, a Renaissance."
v2: minor corrections. v3: updated citations and increased emphasis on
Mendelson's contribution
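The recovery setting these abstracts study can be reproduced numerically in a few lines. The sketch below is not the chapter's analysis; it solves the lasso formulation of ℓ1 recovery by plain iterative soft-thresholding (ISTA), with the problem sizes, seed, and regularization weight chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, s = 80, 200, 5          # measurements, ambient dimension, sparsity

# Ground-truth s-sparse signal and independent Gaussian measurements.
x_true = np.zeros(d)
x_true[rng.choice(d, size=s, replace=False)] = rng.standard_normal(s)
A = rng.standard_normal((n, d)) / np.sqrt(n)
y = A @ x_true

# ISTA: proximal gradient descent on 0.5*||Ax - y||^2 + lam*||x||_1.
lam = 0.01
step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1 / Lipschitz constant of the gradient
x = np.zeros(d)
for _ in range(2000):
    grad = A.T @ (A @ x - y)
    z = x - step * grad
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft-threshold

rel_err = np.linalg.norm(x - x_true) / np.linalg.norm(x_true)
print(rel_err)
```

With n = 80 measurements for a 5-sparse signal in dimension 200, the sample size sits comfortably above the ℓ1 phase transition, so the relative error comes out small; shrinking n toward the transition makes recovery fail, which is the phenomenon the phase-transition bounds quantify.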
