Asymptotic description of stochastic neural networks. I - existence of a Large Deviation Principle
We study the asymptotic law of a network of interacting neurons as the number of neurons becomes infinite. The dynamics of the neurons are described by a set of stochastic differential equations in discrete time. The neurons interact through their synaptic weights, which are correlated Gaussian random variables. Unlike previous works, which made the biologically unrealistic assumption that the weights are i.i.d. random variables, we allow them to be correlated. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network, as well as the averaged law (with respect to the synaptic weights) of these trajectories. Our main result is that the image law through the empirical measure satisfies a large deviation principle with a good rate function. We provide an analytical expression of this rate function in terms of the spectral representation of certain Gaussian processes.
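For reference, the statement that the image laws Q^N of the empirical measure satisfy a large deviation principle with good rate function H means, in the standard sense (a generic formulation following the usual Dembo–Zeitouni definition; the notation Q^N and H here is generic rather than the paper's own):

\[
-\inf_{\mu \in \operatorname{int} A} H(\mu) \;\le\; \liminf_{N\to\infty} \frac{1}{N}\log Q^N(A) \;\le\; \limsup_{N\to\infty} \frac{1}{N}\log Q^N(A) \;\le\; -\inf_{\mu \in \overline{A}} H(\mu)
\]

for every Borel set A of measures, where "good" means that the level sets \(\{H \le c\}\) are compact. The rate function thus quantifies the exponential cost, in the number of neurons N, of observing an atypical empirical distribution of trajectories.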
Asymptotic description of stochastic neural networks. II - Characterization of the limit law
We continue the development, started in Part I, of the asymptotic description of certain stochastic neural networks. We use the Large Deviation Principle (LDP) and the good rate function H announced there to prove that H has a unique minimum mu_e, a stationary measure on the set of trajectories. We characterize this measure by its two marginals: at time 0, and from time 1 to T. The second marginal is a stationary Gaussian measure. With an eye on applications, we show that its mean and covariance operator can be computed inductively. Finally, we use the LDP to establish various convergence results, both averaged and quenched.
Euler Characteristic in Odd Dimensions
It is well known that the Euler characteristic of an odd-dimensional compact manifold is zero. An Euler complex is a combinatorial analogue of a compact manifold. We present an elementary proof of the corresponding result for Euler complexes.
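For comparison, the classical manifold statement follows from Poincaré duality. A standard sketch (the textbook homological argument, with \(\mathbb{Z}/2\) coefficients so that orientability is not needed; this is not the combinatorial proof given in the paper): writing \(b_i = \dim_{\mathbb{Z}/2} H_i(M;\mathbb{Z}/2)\) for a compact n-manifold M,

\[
\chi(M) \;=\; \sum_{i=0}^{n} (-1)^i b_i, \qquad b_i = b_{n-i} \ \text{(Poincaré duality)}.
\]

When n is odd, i and n-i have opposite parity, so the terms \((-1)^i b_i\) and \((-1)^{n-i} b_{n-i}\) cancel pairwise; since n+1 is even there is no unpaired middle term, and hence \(\chi(M) = 0\).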
The mean-field limit of a network of Hopfield neurons with correlated synaptic weights
We study the asymptotic behaviour of asymmetric neuronal dynamics in a network of Hopfield neurons. The randomness in the network is modelled by random couplings, which are correlated centered Gaussian random variables. We prove that the annealed law of the empirical measure satisfies a large deviation principle without any condition on time. We prove that the good rate function of this large deviation principle achieves its minimum value at a unique Gaussian measure which is not Markovian. This implies almost sure convergence of the empirical measure under the quenched law. We prove that the limit equations are expressed as a countably infinite set of linear non-Markovian SDEs.
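The passage from the annealed LDP to quenched almost-sure convergence is the standard Borel–Cantelli argument (a generic sketch; here \(\hat\mu_N\) denotes the empirical measure, d any metric for weak convergence, \(P_J\) the quenched law given the weights J, and \(\mu_e\) the unique minimizer). Since H is a good rate function vanishing only at \(\mu_e\), for every \(\varepsilon > 0\),

\[
c_\varepsilon \;:=\; \inf\{\, H(\mu) : d(\mu,\mu_e) \ge \varepsilon \,\} \;>\; 0,
\]

so the annealed upper bound gives \(\mathbb{E}_J\big[P_J\big(d(\hat\mu_N,\mu_e) \ge \varepsilon\big)\big] \le e^{-N c_\varepsilon/2}\) for N large. These bounds are summable in N, so by Fubini and Borel–Cantelli the convergence \(\hat\mu_N \to \mu_e\) holds almost surely for almost every realization J of the synaptic weights, i.e. under the quenched law.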
