A new ADMM algorithm for the Euclidean median and its application to robust patch regression
The Euclidean Median (EM) of a set of points in a Euclidean space
is the point x minimizing the (weighted) sum of the Euclidean distances of x to
the points in the set. While there exists no closed-form expression for the EM,
it can nevertheless be computed using iterative methods such as the Weiszfeld
algorithm. The EM has classically been used as a robust estimator of centrality
for multivariate data. It was recently demonstrated that the EM can be used to
perform robust patch-based denoising of images by generalizing the popular
Non-Local Means algorithm. In this paper, we propose a novel algorithm for
computing the EM (and its box-constrained counterpart) using variable splitting
and the method of augmented Lagrangian. The attractive feature of this approach
is that the subproblems involved in the ADMM-based optimization of the
augmented Lagrangian can be resolved using simple closed-form projections. The
proposed ADMM solver is used for robust patch-based image denoising and is
shown to exhibit faster convergence compared to an existing solver.
Comment: 5 pages, 3 figures, 1 table. To appear in Proc. IEEE International
Conference on Acoustics, Speech, and Signal Processing, April 19-24, 201
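The Weiszfeld iteration named in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration of the classical fixed-point method, not the paper's ADMM solver; the small epsilon guard against division by zero is an implementation choice, not part of the original algorithm.

```python
import numpy as np

def weiszfeld(points, weights=None, tol=1e-8, max_iter=1000):
    """Compute the (weighted) Euclidean median of `points` (shape (n, d))
    by the Weiszfeld fixed-point iteration."""
    pts = np.asarray(points, dtype=float)
    w = np.ones(len(pts)) if weights is None else np.asarray(weights, dtype=float)
    x = np.average(pts, axis=0, weights=w)   # start at the weighted mean
    for _ in range(max_iter):
        d = np.linalg.norm(pts - x, axis=1)
        d = np.maximum(d, 1e-12)             # guard: iterate may land on a data point
        coef = w / d
        x_new = (coef[:, None] * pts).sum(axis=0) / coef.sum()
        if np.linalg.norm(x_new - x) < tol:
            return x_new
        x = x_new
    return x
```

For collinear points the Euclidean median reduces to the ordinary median, which gives a quick sanity check: `weiszfeld([[0, 0], [1, 0], [10, 0]])` converges to the middle point (1, 0), illustrating the estimator's robustness to the outlying point at 10.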
A First Step Towards Nuance-Oriented Interfaces for Virtual Environments
Designing usable interfaces for virtual environments (VEs) is not a trivial task. Much of the difficulty stems from the complexity and volume of the input data, and as a result many VEs ignore much of this data when their interfaces are created. Using machine learning (ML), we introduce the notion of a nuance that can be used to increase the precision and power of a VE interface. An experiment verifying the existence of nuances using a neural network (NN) is discussed, and a listing of guidelines to follow is given. We also review reasons why traditional ML techniques are difficult to apply to this problem.
Affordances and Feedback in Nuance-Oriented Interfaces
Virtual Environments (VEs) and perceptive user interfaces must deal with complex users and their modes of interaction. One way to approach this problem is to recognize users' nuances (subtle conscious or unconscious actions). In exploring nuance-oriented interfaces, we attempted to let users work as they preferred without being biased by feedback or affordances in the system. The hope was that we would discover the users' innate models of interaction. The results of two user studies were that users are guided not by any innate model but by affordances and feedback in the interface. So, without this guidance, even the most obvious and useful components of an interface will be ignored.
Tubulation pattern of membrane vesicles coated with biofilaments
Narrow membrane tubes are commonly pulled out from the surface of
phospholipid vesicles using forces applied either through laser or magnetic
tweezers or through the action of processive motor proteins. Recent examples
have emerged where such tubes spontaneously grow from vesicles coated with
bioactive cytoskeletal filaments (e.g. FtsZ, microtubules) in the presence of GTP.
We show how a soft vesicle deforms due to the interplay between its topology,
local curvature and the forces due to the active filaments. We present results
from Dynamically Triangulated Monte Carlo simulations of a spherical continuum
membrane coated with a nematic field and show how the intrinsic curvature of
the filaments and their ordering interactions drive membrane tubulation. We
predict interesting patterns of nematic defects, on curved 2D membrane
surfaces, which promote tube formation. Implications of our model for more
dynamic cases, where vesicles coated with an active mixture of microtubules and
myosin show shape oscillations, are also discussed. All these cases point to a
common theme: defect locations on 2D membrane surfaces are hot spots of
membrane deformation activity.
Comment: 8 pages, 7 figures
Data Driven Surrogate Based Optimization in the Problem Solving Environment WBCSim
Large scale, multidisciplinary engineering designs are always difficult due to the complexity and dimensionality of these problems. Direct coupling between the analysis codes and the optimization routines can be prohibitively time consuming due to the complexity of the underlying simulation codes. One way of tackling this problem is to construct computationally cheap(er) approximations of the expensive simulations that mimic the behavior of the simulation model as closely as possible. This paper presents a data driven, surrogate based optimization algorithm that uses a trust region based sequential approximate optimization (SAO) framework and a statistical sampling approach based on design of experiment (DOE) arrays. The algorithm is implemented using techniques from two packages, SURFPACK and SHEPPACK, that provide a collection of approximation algorithms to build the surrogates, and three different DOE techniques (full factorial (FF), Latin hypercube sampling (LHS), and central composite design (CCD)) are used to train the surrogates. The results are compared with the optimization results obtained by directly coupling an optimizer with the simulation code. The biggest concern in using the SAO framework based on statistical sampling is the generation of the required database: as the number of design variables grows, the computational cost of generating the required database grows rapidly. A data driven approach is proposed to tackle this situation, where the trick is to run the expensive simulation if and only if a nearby data point does not exist in the cumulatively growing database. Over time the database matures and is enriched as more and more optimizations are performed. Results show that the proposed methodology dramatically reduces the total number of calls to the expensive simulation runs during the optimization process.
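The "run the expensive simulation if and only if a nearby data point does not exist" trick described above amounts to a nearest-neighbor cache in front of the simulator. The sketch below shows that idea in isolation; `expensive_simulation` is a hypothetical stand-in for a WBCSim run, and the distance threshold is an assumed tuning parameter, not a value from the paper.

```python
import numpy as np

def expensive_simulation(x):
    """Hypothetical stand-in for a costly simulation run (e.g. a WBCSim model)."""
    return float(np.sum(np.asarray(x, dtype=float) ** 2))

class SimulationCache:
    """Call the expensive simulation only when no stored design point
    lies within `radius` of the requested point; otherwise reuse the
    nearest stored result."""

    def __init__(self, radius=0.1):
        self.radius = radius
        self.X = []      # stored design points
        self.y = []      # stored simulation results
        self.calls = 0   # number of actual expensive evaluations

    def evaluate(self, x):
        x = np.asarray(x, dtype=float)
        if self.X:
            d = np.linalg.norm(np.array(self.X) - x, axis=1)
            i = int(np.argmin(d))
            if d[i] <= self.radius:
                return self.y[i]          # nearby point exists: reuse it
        self.calls += 1                   # otherwise pay for a real run
        val = expensive_simulation(x)
        self.X.append(x)
        self.y.append(val)
        return val
```

As the abstract notes, the payoff compounds: each optimization run enriches the shared database, so later runs hit the cache more often and trigger fewer expensive evaluations.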