On the number of representations of n as a linear combination of four triangular numbers
Let $\mathbb{Z}$ and $\mathbb{N}$ be the set of integers and the set of positive integers, respectively. For $a,b,c,d,n\in\mathbb{N}$, let $t(a,b,c,d;n)$ be the number of representations of $n$ as $aT_x+bT_y+cT_z+dT_w$, where $T_k=k(k+1)/2$ denotes a triangular number. In this paper we obtain explicit formulas for $t(a,b,c,d;n)$ in the cases $(a,b,c,d)=(1,3,9,9)$, $(1,1,3,9)$, $(1,3,3,9)$, $(1,1,9,9)$, $(1,9,9,9)$ and $(1,1,1,9)$.
Comment: 18 pages
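The inline formulas in this abstract were lost during extraction, so the notation above is a reconstruction. As a rough illustration of the quantity being counted (not the paper's method, which derives closed-form formulas), the following Python sketch counts such representations by brute force; the function names and the choice of nonnegative indices x, y, z, w (so that T_0 = 0 is allowed) are assumptions.

```python
from itertools import product

def triangular(k):
    """k-th triangular number T_k = k*(k+1)/2."""
    return k * (k + 1) // 2

def count_representations(a, b, c, d, n):
    """Brute-force count of tuples (x, y, z, w) of nonnegative integers
    with a*T_x + b*T_y + c*T_z + d*T_w == n."""
    def index_bound(coeff):
        # Smallest k with coeff * T_k > n; indices then range over 0 .. k-1.
        k = 0
        while coeff * triangular(k) <= n:
            k += 1
        return k
    count = 0
    for x, y, z, w in product(range(index_bound(a)), range(index_bound(b)),
                              range(index_bound(c)), range(index_bound(d))):
        if a * triangular(x) + b * triangular(y) + c * triangular(z) + d * triangular(w) == n:
            count += 1
    return count

# Example: representations of 20 with the coefficient quadruple (1, 1, 1, 9).
print(count_representations(1, 1, 1, 9, 20))
```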
Notes on a conjecture of Manoussakis concerning Hamilton cycles in digraphs
In 1992, Manoussakis conjectured that a strongly 2-connected digraph D on n vertices is hamiltonian if for every two distinct pairs of independent vertices {x, y} and {w, z} we have d(x) + d(y) + d(w) + d(z) >= 4n - 3. In this note we show that D has a Hamilton path, which provides affirmative evidence in support of this conjecture.
Comment: 8 pages
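The inline math in this abstract did not survive extraction, so the degree-sum bound 4n - 3 above is a reconstruction of the commonly cited form of the conjecture. As a small illustration of that condition only (not the note's proof technique), here is a Python sketch that tests the degree-sum requirement over all pairs of independent vertex pairs in a digraph given by its arc set; all names, and the bound passed in, are illustrative assumptions.

```python
from itertools import combinations

def meets_degree_condition(vertices, arcs, bound):
    """Test whether every two distinct pairs of independent vertices have
    total degree at least `bound` in the digraph given by `arcs`.
    Vertices u, v are independent if neither arc (u, v) nor (v, u) exists;
    d(v) is the in-degree plus the out-degree of v."""
    degree = {v: 0 for v in vertices}
    for u, v in arcs:
        degree[u] += 1
        degree[v] += 1
    independent_pairs = [(u, v) for u, v in combinations(sorted(vertices), 2)
                         if (u, v) not in arcs and (v, u) not in arcs]
    return all(degree[x] + degree[y] + degree[w] + degree[z] >= bound
               for (x, y), (w, z) in combinations(independent_pairs, 2))

# Directed 4-cycle on {0, 1, 2, 3}: the only independent pairs are (0, 2) and (1, 3),
# every vertex has degree 2, and 8 < 4*4 - 3, so the condition fails.
V = [0, 1, 2, 3]
A = {(0, 1), (1, 2), (2, 3), (3, 0)}
print(meets_degree_condition(V, A, bound=4 * len(V) - 3))  # False
```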
Streaming Coreset Constructions for M-Estimators
We introduce a new method of maintaining a (k,epsilon)-coreset for clustering M-estimators over insertion-only streams. Let (P,w) be a weighted set (where w : P -> [0,infty) is the weight function) of points in a rho-metric space (meaning a set X equipped with a positive-semidefinite symmetric function D such that D(x,z) <= rho(D(x,y) + D(y,z)) for all x,y,z in X). For any set of points C, we define COST(P,w,C) = sum_{p in P} w(p) min_{c in C} D(p,c). A (k,epsilon)-coreset for (P,w) is a weighted set (Q,v) such that for every set C of k points, (1-epsilon)COST(P,w,C) <= COST(Q,v,C) <= (1+epsilon)COST(P,w,C). Essentially, the coreset (Q,v) can be used in place of (P,w) for all operations concerning the COST function. Coresets, as a method of data reduction, are used to solve fundamental problems in machine learning on streaming and distributed data.
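As a concrete reading of the definitions above, here is a small Python sketch of the COST function and of the (k, epsilon)-coreset inequality; the function names and the choice of squared distance on the real line as the rho-metric (rho = 2) are illustrative assumptions, not part of the paper.

```python
import random

def cost(P, w, C, D):
    """COST(P, w, C) = sum_{p in P} w(p) * min_{c in C} D(p, c)."""
    return sum(wp * min(D(p, c) for c in C) for p, wp in zip(P, w))

def coreset_holds_for(P, w, Q, v, C, D, epsilon):
    """Check the (k, epsilon)-coreset inequality for one candidate center set C:
    (1 - epsilon)*COST(P, w, C) <= COST(Q, v, C) <= (1 + epsilon)*COST(P, w, C)."""
    full, reduced = cost(P, w, C, D), cost(Q, v, C, D)
    return (1 - epsilon) * full <= reduced <= (1 + epsilon) * full

# Squared distance on the real line is a rho-metric with rho = 2,
# since (x - z)^2 <= 2 * ((x - y)^2 + (y - z)^2).
D = lambda x, y: (x - y) ** 2

P = [random.uniform(0, 10) for _ in range(1000)]
w = [1.0] * len(P)
C = [2.5, 7.5]                                    # a candidate set of k = 2 centers
print(cost(P, w, C, D))
print(coreset_holds_for(P, w, P, w, C, D, 0.1))   # trivially True when Q = P
```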
M-estimators are functions D(x,y) that can be written as psi(d(x,y)), where (X, d) is a true metric (i.e. 1-metric) space. Special cases of M-estimators include the well-known k-median (psi(x) = x) and k-means (psi(x) = x^2) functions. Our technique takes an existing offline construction for an M-estimator coreset and converts it into the streaming setting, where n data points arrive sequentially. To our knowledge, this is the first streaming construction for any M-estimator that does not rely on the merge-and-reduce tree. For example, our coreset for streaming metric k-means uses O(epsilon^{-2} k log k log n) points of storage. The previous state-of-the-art required storing at least O(epsilon^{-2} k log k log^{4} n) points.
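To make the M-estimator definition concrete, the sketch below builds D = psi composed with d from a true metric d, instantiates the k-median and k-means cases named in the abstract, and checks on random triples that the k-means case behaves as a rho-metric with rho = 2. This only illustrates the definitions; it is not the paper's streaming construction, and the helper names are assumptions.

```python
import random

def m_estimator(psi, d):
    """Build D(x, y) = psi(d(x, y)) from a loss psi and a true metric d."""
    return lambda x, y: psi(d(x, y))

d = lambda x, y: abs(x - y)                    # a true (1-)metric on the reals
k_median = m_estimator(lambda t: t, d)         # psi(t) = t
k_means = m_estimator(lambda t: t * t, d)      # psi(t) = t^2
print(k_median(1.0, 4.0), k_means(1.0, 4.0))   # 3.0 and 9.0

# The k-means case satisfies D(x, z) <= 2 * (D(x, y) + D(y, z)) for every triple,
# i.e. it is a rho-metric with rho = 2.
rho = 2
triples = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(10000)]
print(all(k_means(x, z) <= rho * (k_means(x, y) + k_means(y, z))
          for x, y, z in triples))             # expected: True
```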
Recovering metric from full ordinal information
Given a geodesic space (E, d), we show that full ordinal knowledge of the metric d, i.e. knowledge of the function D_d : (w, x, y, z) -> 1_{d(w,x) <= d(y,z)} recording whether d(w,x) <= d(y,z), determines the metric d uniquely, up to a constant factor. For a subspace E_n of n points of E, converging in Hausdorff distance to E, we construct a metric d_n on E_n based only on the knowledge of D_d on E_n, and we establish a sharp upper bound on the Gromov-Hausdorff distance between (E_n, d_n) and (E, d).
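As an illustration of what "full ordinal information" provides, the following Python sketch tabulates the indicator D_d on a finite point set and checks that rescaling the metric leaves it unchanged, which is why d can only be recovered up to a constant factor. The function names are assumptions made for this sketch, not the paper's construction of d_n.

```python
from itertools import product

def ordinal_information(points, d):
    """Tabulate D_d(w, x, y, z) = 1 if d(w, x) <= d(y, z) and 0 otherwise,
    over all quadruples of indices into a finite point set."""
    n = len(points)
    return {(w, x, y, z): int(d(points[w], points[x]) <= d(points[y], points[z]))
            for w, x, y, z in product(range(n), repeat=4)}

pts = [0.0, 0.3, 1.1, 2.0, 4.2]           # five points on the real line
d = lambda a, b: abs(a - b)
D = ordinal_information(pts, d)
print(D[(0, 1, 2, 3)])                    # 1, since d(0.0, 0.3) <= d(1.1, 2.0)

# Rescaling the metric by any constant factor preserves every comparison,
# so the ordinal information is unchanged.
print(ordinal_information(pts, lambda a, b: 3.0 * d(a, b)) == D)  # True
```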
