Joint and Competitive Caching Designs in Large-Scale Multi-Tier Wireless Multicasting Networks
Caching and multicasting are two promising methods to support massive content
delivery in multi-tier wireless networks. In this paper, we consider a random
caching and multicasting scheme with caching distributions in the two tiers as
design parameters, to achieve efficient content dissemination in a two-tier
large-scale cache-enabled wireless multicasting network. First, we derive
tractable expressions for the successful transmission probability, both in the
general regime and in the high-SNR, high-user-density regime, utilizing tools
from stochastic geometry. Then, for the case of a
single operator for the two tiers, we formulate the optimal joint caching
design problem to maximize the successful transmission probability in the
asymptotic region, which is nonconvex in general. By using the block successive
approximate optimization technique, we develop an iterative algorithm, which is
shown to converge to a stationary point. Next, for the case of two different
operators, one for each tier, we formulate the competitive caching design game
where each tier maximizes its successful transmission probability in the
asymptotic region. We show that the game has a unique Nash equilibrium (NE) and
develop an iterative algorithm, which is shown to converge to the NE under a
mild condition. Finally, by numerical simulations, we show that the proposed
designs achieve significant gains over existing schemes.
Comment: 30 pages, 6 figures, submitted to IEEE GLOBECOM 2017 and IEEE Trans.
Commun.
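The competitive design above rests on best-response dynamics converging to a
Nash equilibrium. The sketch below illustrates that convergence pattern on a
toy two-player concave game with hypothetical payoffs (it is not the paper's
actual caching game, whose payoffs involve the stochastic-geometry
expressions): each tier repeatedly plays its best response, and the iteration
contracts to the unique NE.

```python
def best_response(x_other):
    # Best response for the toy payoff u(x, y) = x*(1 - 0.5*y) - x**2 / 2,
    # maximized over a caching probability x constrained to [0, 1].
    # Setting du/dx = 0 gives x = 1 - 0.5*y, clipped to the feasible set.
    return min(max(1.0 - 0.5 * x_other, 0.0), 1.0)

def iterate_to_ne(x1=0.0, x2=0.0, tol=1e-9, max_iter=1000):
    """Gauss-Seidel best-response iteration.

    Converges here because the best-response map is a contraction
    (slope 0.5 < 1), mirroring the 'mild condition' under which the
    paper's algorithm converges to the NE.
    """
    for _ in range(max_iter):
        nx1 = best_response(x2)
        nx2 = best_response(nx1)
        if abs(nx1 - x1) < tol and abs(nx2 - x2) < tol:
            return nx1, nx2
        x1, x2 = nx1, nx2
    return x1, x2

# For this toy game the unique NE is x1 = x2 = 2/3
# (solve x = 1 - 0.5*(1 - 0.5*x)).
x1, x2 = iterate_to_ne()
```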
Ordering-sensitive and Semantic-aware Topic Modeling
Topic modeling of textual corpora is an important and challenging problem. Most
previous work adopts the "bag-of-words" assumption, which ignores word
ordering. This assumption simplifies computation, but it discards both the
ordering information and the semantics of words in context. In this paper, we
present a Gaussian Mixture Neural Topic Model
(GMNTM) which incorporates both the ordering of words and the semantic meaning
of sentences into topic modeling. Specifically, we represent each topic as a
cluster of multi-dimensional vectors and embed the corpus into a collection of
vectors generated by the Gaussian mixture model. Each word is affected not only
by its topic, but also by the embedding vector of its surrounding words and the
context. The Gaussian mixture components and the topic of documents, sentences
and words can be learnt jointly. Extensive experiments show that our model can
learn better topics and more accurate word distributions for each topic.
Quantitatively, compared to state-of-the-art topic modeling approaches, GMNTM
obtains significantly better performance in terms of perplexity, retrieval
accuracy, and classification accuracy.
Comment: To appear in proceedings of AAAI 201
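The core idea of representing each topic as a Gaussian component and softly
assigning embedding vectors to topics can be sketched with plain mixture
responsibilities. The function below is an illustrative stand-in (spherical
Gaussians, shared variance, made-up names), not the GMNTM inference procedure
itself, which jointly learns embeddings and mixture parameters.

```python
import numpy as np

def soft_topic_assignments(vecs, means, weights, var=1.0):
    """Responsibilities p(topic k | vector v) under a spherical Gaussian mixture.

    vecs:    (N, D) embedding vectors (words, sentences, or documents)
    means:   (K, D) topic centers
    weights: (K,)   mixture weights, summing to 1
    """
    # Squared distances between every vector and every topic center: (N, K).
    d2 = ((vecs[:, None, :] - means[None, :, :]) ** 2).sum(axis=-1)
    # Unnormalized log-posterior of each topic for each vector.
    log_p = np.log(weights)[None, :] - d2 / (2.0 * var)
    # Subtract the row max before exponentiating, for numerical stability.
    log_p -= log_p.max(axis=1, keepdims=True)
    p = np.exp(log_p)
    return p / p.sum(axis=1, keepdims=True)
```

In a GMNTM-style model these responsibilities would feed back into updating
both the topic centers and the word/sentence embeddings, so that a word's topic
reflects its surrounding context rather than a bag-of-words count.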
