The Optimal Size of Stochastic Hodgkin-Huxley Neuronal Systems for Maximal Energy Efficiency in Coding of Pulse Signals
The generation and conduction of action potentials constitute a fundamental
means of communication in the nervous system and are metabolically expensive.
In this paper, we investigate the energy efficiency of neural systems in the
process of transferring pulse signals with action potentials. By computer
simulation of a stochastic version of the Hodgkin-Huxley model with a detailed
description of random ion channel gating, and by analytically solving a
bistable neuron model that mimics action potential generation as a particle
crossing the barrier of a double-well potential, we find an optimal number of
ion channels that maximizes the energy efficiency of a single neuron. We also
investigate the energy efficiency of a neuron population in which input pulse
signals are represented by synchronized spikes and read out by a downstream
coincidence-detector neuron. We find an optimal combination of the number of
neurons in the population and the number of ion channels in each neuron that
maximizes the energy efficiency. The energy efficiency depends on the
characteristics of the input signals, e.g., the pulse strength and the
inter-pulse intervals. We argue that the trade-off between reliability of
signal transmission and energy cost may influence the size of neural systems
if energy use is constrained.
Comment: 22 pages, 10 figures
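For intuition, here is a minimal sketch of the kind of trade-off the abstract describes. It assumes, purely for illustration, that channel noise scales as 1/sqrt(N), that pulse detection is a simple threshold crossing under Gaussian noise, and that metabolic cost grows linearly with the number of channels N; under these assumptions the information transmitted per unit energy peaks at a finite N. The model, parameter values, and function names are illustrative assumptions, not the paper's stochastic Hodgkin-Huxley simulation.

```python
# Toy sketch (not the paper's model): energy efficiency vs. number of ion channels N.
# Assumptions: channel noise ~ 1/sqrt(N); detection = threshold crossing under
# Gaussian noise; metabolic cost = fixed cost + linear per-channel cost.
import numpy as np
from math import erf

def detection_probability(n_channels, pulse_strength=1.0, threshold=0.8):
    """Probability that a pulse (plus channel noise) crosses the firing threshold."""
    noise_sigma = 1.0 / np.sqrt(n_channels)        # channel-number-dependent noise
    z = (pulse_strength - threshold) / (noise_sigma * np.sqrt(2.0))
    return 0.5 * (1.0 + erf(z))

def energy_efficiency(n_channels, p_pulse=0.5):
    """Bits of information about the input pulse per unit metabolic cost."""
    p_hit = detection_probability(n_channels)                      # spike given pulse
    p_false = detection_probability(n_channels, pulse_strength=0.0)  # spike given no pulse

    def h(p):  # binary entropy
        p = np.clip(p, 1e-12, 1 - 1e-12)
        return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

    # Mutual information of the binary channel (pulse present/absent vs. spike/no spike).
    p_spike = p_pulse * p_hit + (1 - p_pulse) * p_false
    info = h(p_spike) - p_pulse * h(p_hit) - (1 - p_pulse) * h(p_false)
    cost = 1.0 + 0.01 * n_channels                 # arbitrary energy units
    return info / cost

if __name__ == "__main__":
    sizes = np.arange(10, 2000, 10)
    eff = [energy_efficiency(int(n)) for n in sizes]
    best = int(sizes[int(np.argmax(eff))])
    print("Toy-model optimal channel number:", best)
```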
On the Depth of Deep Neural Networks: A Theoretical View
People believe that depth plays an important role in the success of deep
neural networks (DNNs). However, as far as we know, this belief lacks solid
theoretical justification. We investigate the role of depth from the
perspective of the margin bound, in which the expected error is upper bounded
by the empirical margin error plus a Rademacher Average (RA) based capacity
term. First, we derive an upper bound for the RA of DNNs and show that it
increases with depth; this indicates a negative impact of depth on test
performance. Second, we show that deeper networks tend to have larger
representation power (measured by a Betti-number-based complexity) than
shallower networks in the multi-class setting, and thus can lead to a smaller
empirical margin error; this implies a positive impact of depth. Together,
these two results show that for DNNs with a restricted number of hidden
units, increasing depth is not always beneficial, since there is a trade-off
between the positive and negative impacts. These results inspire us to seek
alternative ways to realize the positive impact of depth, e.g., imposing
margin-based penalty terms on the cross-entropy loss so as to reduce the
empirical margin error without increasing depth. Our experiments show that in
this way we achieve significantly better test performance.
Comment: AAAI 201
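As a rough illustration of the proposed remedy, the sketch below adds a hinge-style margin penalty to the softmax cross-entropy loss, penalizing examples whose multi-class margin falls below a target gamma. The specific penalty form and the parameters gamma and lam are assumptions chosen for illustration; the paper's exact penalty terms are not reproduced here.

```python
# Hedged sketch: cross-entropy plus a hinge-style multi-class margin penalty.
import numpy as np

def cross_entropy_with_margin_penalty(logits, labels, gamma=1.0, lam=0.1):
    """logits: (batch, classes) array; labels: (batch,) integer class indices."""
    batch = logits.shape[0]

    # Standard softmax cross-entropy.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(batch), labels].mean()

    # Multi-class margin: true-class score minus the best competing score.
    true_scores = logits[np.arange(batch), labels]
    masked = logits.copy()
    masked[np.arange(batch), labels] = -np.inf
    runner_up = masked.max(axis=1)
    margins = true_scores - runner_up

    # Penalize examples whose margin falls below gamma (illustrative choice).
    penalty = np.maximum(0.0, gamma - margins).mean()
    return ce + lam * penalty

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    logits = rng.normal(size=(8, 5))
    labels = rng.integers(0, 5, size=8)
    print(cross_entropy_with_margin_penalty(logits, labels))
```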
A Game-theoretic Machine Learning Approach for Revenue Maximization in Sponsored Search
Sponsored search is an important monetization channel for search engines, in
which an auction mechanism is used to select the ads shown to users and
determine the prices charged to advertisers. Several studies in the
literature have investigated how to design an auction mechanism so as to
optimize the revenue of the search engine. However, due to the unrealistic
assumptions these studies rely on, their practical value is not very clear.
In this paper, we propose a novel \emph{game-theoretic machine
learning} approach, which naturally combines machine learning and game theory,
and learns the auction mechanism using a bilevel optimization framework. In
particular, we first learn a Markov model from historical data to describe how
advertisers change their bids in response to an auction mechanism, and then for
any given auction mechanism, we use the learnt model to predict its
corresponding future bid sequences. Next we learn the auction mechanism through
empirical revenue maximization on the predicted bid sequences. We show that the
empirical revenue will converge when the prediction period approaches infinity,
and a Genetic Programming algorithm can effectively optimize this empirical
revenue. Our experiments indicate that the proposed approach is able to produce
a much more effective auction mechanism than several baselines.
Comment: Twenty-Third International Joint Conference on Artificial Intelligence (IJCAI 2013)
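A minimal sketch of the bilevel pipeline described above: fit a Markov model of bid transitions from historical sequences, roll it forward under a candidate mechanism, and select the mechanism with the highest empirical revenue. The mechanism family (a single reserve price on a second-price auction), the discretized bid states, and the grid search standing in for Genetic Programming are illustrative assumptions; in particular, the sketch omits the dependence of advertiser responses on the mechanism that the paper's Markov model captures.

```python
# Hedged sketch of the bilevel idea: (1) fit a Markov model of bid changes,
# (2) simulate future bids, (3) pick the mechanism with highest empirical revenue.
import numpy as np

N_STATES = 10                      # discretized bid levels 0..9 (assumption)

def fit_markov(bid_histories):
    """Estimate a bid-transition matrix from historical bid sequences."""
    counts = np.ones((N_STATES, N_STATES))          # Laplace smoothing
    for seq in bid_histories:
        for a, b in zip(seq[:-1], seq[1:]):
            counts[a, b] += 1
    return counts / counts.sum(axis=1, keepdims=True)

def simulate_revenue(transition, reserve, horizon=200, n_bidders=3, seed=0):
    """Roll bids forward and compute second-price-with-reserve revenue per round.

    Note: here bid dynamics do not react to the mechanism, a simplification
    relative to the paper, where responses to the mechanism are modeled."""
    rng = np.random.default_rng(seed)
    bids = rng.integers(0, N_STATES, size=n_bidders)
    revenue = 0.0
    for _ in range(horizon):
        ranked = np.sort(bids)[::-1]
        if ranked[0] >= reserve:                    # winner clears the reserve
            revenue += max(ranked[1], reserve)      # pays max(runner-up, reserve)
        # Each bidder's next bid is drawn from the fitted Markov model.
        bids = np.array([rng.choice(N_STATES, p=transition[b]) for b in bids])
    return revenue / horizon

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    histories = [rng.integers(0, N_STATES, size=50).tolist() for _ in range(20)]
    T = fit_markov(histories)
    best = max(range(N_STATES), key=lambda r: simulate_revenue(T, r))
    print("Best reserve price (toy grid search):", best)
```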
