Carl Menger and Friedrich von Wieser on the Role of Knowledge and Beliefs in the Emergence and Evolution of Institutions
In this article we start from the well-known contribution of the Austrian school with respect to the problem of knowledge and its role in inter-individual coordination. Focusing on two authors of this school, its founding father Carl Menger and Friedrich von Wieser, we show that they both appreciate the role of knowledge in the emergence of economic and social institutions. However, their divergences regarding methodological individualism and subjectivism lead them to provide two different perspectives on the emergence and dynamics of institutions. This is exemplified by Menger's and Wieser's ways of dealing with the emergence of money: on the one hand, Menger takes for granted the involuntary formation of shared knowledge about the validity of social institutions such as money; on the other hand, Wieser favours an explanation whereby collective beliefs are more than shared knowledge, since they have some autonomy vis-à-vis individuals.
Statistics for the Luria-Delbrück distribution
The Luria-Delbrück distribution is a classical model of mutations in cell
kinetics. It is obtained as a limit when the probability of mutation tends to
zero and the number of divisions to infinity. It can be interpreted as a
compound Poisson distribution (for the number of mutations) of exponential
mixtures (for the developing time of mutant clones) of geometric distributions
(for the number of cells produced by a mutant clone in a given time). The
probabilistic interpretation, and a rigorous proof of convergence in the
general case, are deduced from classical results on Bellman-Harris branching
processes. The two parameters of the Luria-Delbrück distribution are the
expected number of mutations, which is the parameter of interest, and the
relative fitness of normal cells compared to mutants, which is the heavy tail
exponent. Both can be simultaneously estimated by the maximum likelihood method.
However, the computation becomes numerically unstable as soon as the maximal
value of the sample is large, which occurs frequently due to the heavy tail
property. Based on the empirical generating function, robust estimators are
proposed and their asymptotic variance is given. They are comparable in
precision to maximum likelihood estimators, with a much broader range of
calculability, better numerical stability, and negligible computing time.
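The compound structure described above (a Poisson number of mutations, each clone's size geometric with an exponentially distributed development time) is straightforward to simulate. A minimal sketch, with hypothetical function names, follows; it relies only on the structure stated in the abstract and on the classical fact that for relative fitness rho = 1 a single clone has size k with probability 1/(k(k+1)):

```python
import math
import random

def poisson(lam, rng):
    """Poisson sample via Knuth's multiplication method (small lam only)."""
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        p *= rng.random()
        if p < threshold:
            return k
        k += 1

def clone_size(rho, rng):
    """Cells produced by one mutant clone: geometric with success
    probability exp(-T), where T ~ Exp(rho) is the clone's development time."""
    t = rng.expovariate(rho)
    p = math.exp(-t)
    if p >= 1.0:                          # degenerate case T == 0
        return 1
    u = 1.0 - rng.random()                # uniform on (0, 1]
    # Inversion of the geometric tail P(K > k) = (1 - p)^k
    return 1 + int(math.log(u) / math.log1p(-p))

def luria_delbruck(m, rho, rng):
    """Compound Poisson: total mutant count over Poisson(m) clones."""
    return sum(clone_size(rho, rng) for _ in range(poisson(m, rng)))
```

For rho = 1 the clone-size law 1/(k(k+1)) has an infinite mean, which is the heavy-tail property responsible for the numerical instability of maximum likelihood on samples with large maxima.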
Crossings of smooth shot noise processes
In this paper, we consider smooth shot noise processes and their expected
number of level crossings. When the kernel response function is sufficiently
smooth, the mean number of crossings function is obtained through an integral
formula. Moreover, as the intensity increases, or equivalently, as the number
of shots becomes larger, a normal convergence to the classical Rice's formula
for Gaussian processes is obtained. The Gaussian kernel function, that
corresponds to many applications in physics, is studied in detail and two
different regimes are exhibited.
Comment: Published in the Annals of Applied Probability (http://www.imstat.org/aap/) by the Institute of Mathematical Statistics (http://www.imstat.org/); http://dx.doi.org/10.1214/11-AAP807
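A quick toy simulation (an illustration of the setup, not the paper's integral formula) shows the objects involved: a shot noise built from a Gaussian kernel, its sample mean compared with the theoretical mean lambda times the kernel integral, and a grid-based count of level crossings:

```python
import math
import random

def shot_noise_on_grid(points, grid, kernel):
    """X(t) = sum_i g(t - tau_i): superposition of kernel responses."""
    return [sum(kernel(t - tau) for tau in points) for t in grid]

rng = random.Random(1)
lam = 20.0                       # intensity of the Poisson point process
a, b = -4.0, 44.0                # padded window (kernel negligible beyond |4|)
# Point count ~ Poisson(lam * (b - a)); normal approximation for large mean.
mu = lam * (b - a)
n_points = max(0, round(rng.gauss(mu, math.sqrt(mu))))
points = [rng.uniform(a, b) for _ in range(n_points)]

gauss_kernel = lambda t: math.exp(-t * t / 2.0)
grid = [2.0 + 0.04 * i for i in range(901)]      # observe on [2, 38]
x = shot_noise_on_grid(points, grid, gauss_kernel)

# Theoretical mean: lam * integral(g) = lam * sqrt(2*pi)
mean_x = sum(x) / len(x)

# Count crossings of the mean level via sign changes on the grid
level = lam * math.sqrt(2.0 * math.pi)
crossings = sum(1 for i in range(len(x) - 1)
                if (x[i] - level) * (x[i + 1] - level) < 0)
```

Increasing `lam` here is the regime in which the abstract's normal convergence to Rice's formula for Gaussian processes applies.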
Money, Banking and Dynamics: Schumpeter vs. Hayek
In the first section we discuss the Wicksellian origins of Schumpeter's and Hayek's approaches to money and banking in the context of dynamic economic analysis. The second section compares the role played by banks and credit in Schumpeter's and Hayek's explanations of economic fluctuations. We conclude by contrasting both authors' perceptions of economic dynamics.
Keywords: banking; credit theory; business cycles
Dynamic robust duality in utility maximization
A celebrated financial application of convex duality theory gives an explicit
relation between the following two quantities:
(i) The optimal terminal wealth of the problem
to maximize the expected U-utility of the terminal wealth
generated by admissible portfolios in a market
with the risky asset price process modeled as a semimartingale;
(ii) The optimal scenario of the dual problem to minimize
the expected V-value of dQ/dP over a family of equivalent local
martingale measures Q, where V is the convex conjugate function of the
concave function U.
In this paper we consider markets modeled by Itô-Lévy processes. In the
first part we use the maximum principle in stochastic control theory to extend
the above relation to a dynamic relation, valid for all t in [0, T].
We prove in particular that the optimal adjoint process for the primal problem
coincides with the optimal density process, and that the optimal adjoint
process for the dual problem coincides with the optimal wealth process. In the terminal time case t = T we recover the classical duality
connection above. We get moreover an explicit relation between the optimal
portfolio and the optimal measure. We also obtain that the
existence of an optimal scenario is equivalent to the replicability of a
related T-claim.
In the second part we present robust (model uncertainty) versions of the
optimization problems in (i) and (ii), and we prove a similar dynamic relation
between them. In particular, we show how to get from the solution of one of the
problems to the other. We illustrate the results with explicit examples.
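In standard notation (the symbols below are not fixed by the abstract; they serve only as a reminder of the classical static duality between (i) and (ii)), the relation reads:

```latex
u(x) := \sup_{\varphi} E\big[U(X^{\varphi}_T)\big], \qquad
v(y) := \inf_{Q \in \mathcal{M}} E\Big[V\Big(y\,\tfrac{dQ}{dP}\Big)\Big], \qquad
V(y) := \sup_{x>0}\big(U(x) - xy\big),
\\[4pt]
u(x) = \inf_{y>0}\big(v(y) + xy\big), \qquad
X^{*}_T = (U')^{-1}\Big(y^{*}\,\tfrac{dQ^{*}}{dP}\Big),
```

where the last identity links the optimal terminal wealth to the optimal scenario, the connection the paper extends to a dynamic, time-indexed relation.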
A Parameterized Algebra for Event Notification Services
Event notification services are used in various applications such as digital libraries, stock tickers, traffic control, or facility management. However, to our knowledge, a common semantics of events in event notification services has not been defined so far. In this paper, we propose a parameterized event algebra which describes the semantics of composite events for event notification systems. The parameters serve as a basis for flexible handling of duplicates in both primitive and composite events.
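As a hypothetical illustration (the function name and the single `reuse` parameter below are invented, not taken from the paper), a duplicate-handling parameter can change the semantics of even the simplest composite operator, a sequence of two primitive events:

```python
def detect_seq(events, a, b, reuse=False):
    """Detect the composite event 'a followed by b' in a time-ordered
    stream of (type, timestamp) pairs.

    reuse=True  : every earlier a-instance pairs with every later b
                  (duplicates allowed).
    reuse=False : each primitive instance is consumed by at most one match.
    """
    if reuse:
        a_times = [t for (e, t) in events if e == a]
        return [(ta, tb) for (e, tb) in events if e == b
                for ta in a_times if ta < tb]
    pending, matches = [], []
    for e, t in events:
        if e == a:
            pending.append(t)          # remember unmatched a-instances
        elif e == b and pending:
            matches.append((pending.pop(0), t))   # consume oldest a
    return matches
```

On the stream A@1, A@2, B@3, B@4 the first setting yields four matches, the second only two, which is exactly the kind of semantic variation a parameterized algebra makes explicit.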
Knowledge and beliefs in economics: the case of the Austrian tradition
The contribution focuses on the problem of the influence of individual knowledge and beliefs on the working of economic activity within the Austrian tradition of economic thought. More specifically, the contributions of von Mises, Hayek and Schumpeter are investigated. These contributions show a large variety of answers concerning the relation between individual and social beliefs. This variety is not exhaustive, but it substantially contributes to a better understanding of contemporary theoretical debates.
Keywords: individual/social beliefs; shared knowledge; subjectivism; social rules
Why Global Integration May Lead to Terrorism: An Evolutionary Theory of Mimetic Rivalry
We study the emergence of the recent form of terrorism using evolutionary game theory. The model is an economic interpretation of René Girard's theory of mimetic rivalry. This theory presents terrorism as the result of competition between countries, when the desire to imitate the leading country is frustrated by the impossibility of doing so. We define a multi-country setup where interaction takes place in an international trade game, which is a coordination game. Countries follow a simple behavioral rule, trying to reduce the gap between the maximal payoff obtained and their own payoff. In a coordination game, this may lead to mimetic rivalry behavior, that is, the deliberate choice of a strategy degrading the situation of the leading country. Paradoxically, we find that the desire for convergence may lead to a more partitioned world economy.
Keywords: terrorism; evolutionary game theory; mimetic rivalry; risk-dominance
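A toy version of the behavioral rule (the payoff matrix, population size, and revision protocol below are illustrative assumptions, not the paper's model) can be sketched as follows: countries repeatedly flip their strategy in a pairwise coordination game whenever the flip narrows the gap to the leading payoff, even when it does so by hurting the leader:

```python
import random

# Illustrative 2-strategy coordination game (rows: own strategy,
# columns: partner's strategy) -- an assumption, not from the paper.
A = [[2, 0], [0, 1]]

def payoffs(s):
    """Each country plays the pairwise game against every other country."""
    n = len(s)
    return [sum(A[s[i]][s[j]] for j in range(n) if j != i) for i in range(n)]

def gap(s, i):
    """Gap between the maximal payoff obtained and country i's own payoff."""
    p = payoffs(s)
    return max(p) - p[i]

def revise(s, rng):
    """A random country flips its strategy if the flip narrows its gap --
    possibly by degrading the leading country's payoff (mimetic rivalry)."""
    i = rng.randrange(len(s))
    alt = s[:]
    alt[i] = 1 - alt[i]
    return alt if gap(alt, i) < gap(s, i) else s

rng = random.Random(3)
s = [rng.randrange(2) for _ in range(10)]
for _ in range(200):
    s = revise(s, rng)
```

Because a revision is applied only when it reduces the reviser's own gap, the dynamics can settle on configurations that lower aggregate payoffs, the partitioning effect the abstract describes.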
Societal Comparison and Social Change of the Family Division of Labour
In comparative societal analysis, methodological problems are not mere methodological questions but genuinely theoretical ones. We take the example of an international comparison of the division of paid and unpaid work to show how a specific methodology had to be worked out. This methodology nevertheless falls within a preliminary theoretical framework; it is partly the fruit of a "specific" theory. More fundamentally still, it leads to a "general" theory, namely a theory of social change. In this way, we study the conditions under which a specific theory can pass into a general theory by transferring the paradigm of societal regulations.
Keywords: paid and unpaid work; family; international comparison; societal regulations; methodology/theory; task sharing; social change
Likelihood-Free Parallel Tempering
Approximate Bayesian Computation (ABC) methods (or likelihood-free methods)
have appeared in the past fifteen years as useful methods to perform Bayesian
analyses when the likelihood is analytically or computationally intractable.
Several ABC methods have been proposed: Markov chain Monte Carlo (MCMC)
methods have been developed by Marjoram et al. (2003) and by Bortot et al.
(2007), for instance, and sequential methods have been proposed among others by
Sisson et al. (2007), Beaumont et al. (2009) and Del Moral et al. (2009). Until
now, while ABC-MCMC methods remain the reference, sequential ABC methods have
appeared to outperform them (see for example McKinley et al. (2009) or Sisson
et al. (2007)). In this paper a new algorithm combining population-based MCMC
methods with ABC requirements is proposed, using an analogy with the Parallel
Tempering algorithm (Geyer, 1991). Performances are compared with existing ABC
algorithms on simulations and on a real example.
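The flavour of such an algorithm can be sketched on a toy problem (everything below — the Gaussian model, the tolerance ladder, the swap rule — is an illustrative assumption, not the paper's algorithm): several ABC-MCMC chains run at different tolerances, which play the role of temperatures, and adjacent chains occasionally swap states when each current state is also acceptable at the other chain's tolerance:

```python
import math
import random

rng = random.Random(42)

# Toy model: y_i ~ N(theta, 1), i = 1..n; summary statistic = sample mean;
# flat prior on [-10, 10].
n, theta_true = 50, 2.0
ybar = sum(rng.gauss(theta_true, 1.0) for _ in range(n)) / n

def summary(theta):
    # Sample mean of n pseudo-observations, drawn directly as N(theta, 1/n).
    return rng.gauss(theta, 1.0 / math.sqrt(n))

eps = [0.05, 0.2, 1.0]          # tolerance ladder: cold -> hot "temperatures"
theta = [ybar] * len(eps)       # start every chain near the data (cheap hack)
dist = []
for e in eps:                   # initial states acceptable at each tolerance
    d = abs(summary(ybar) - ybar)
    while d >= e:
        d = abs(summary(ybar) - ybar)
    dist.append(d)

samples = []
for it in range(4000):
    # One ABC-MCMC step per chain (symmetric proposal, flat prior):
    # accept iff the simulated summary falls within the chain's tolerance.
    for j, e in enumerate(eps):
        prop = theta[j] + rng.gauss(0.0, 0.5)
        if abs(prop) <= 10.0:
            d = abs(summary(prop) - ybar)
            if d < e:
                theta[j], dist[j] = prop, d
    # Swap move between adjacent tolerances (parallel-tempering analogy):
    # accept iff each current state is acceptable at the other tolerance.
    if rng.random() < 0.1:
        j = rng.randrange(len(eps) - 1)
        if dist[j] < eps[j + 1] and dist[j + 1] < eps[j]:
            theta[j], theta[j + 1] = theta[j + 1], theta[j]
            dist[j], dist[j + 1] = dist[j + 1], dist[j]
    if it >= 500:
        samples.append(theta[0])   # keep only the coldest (tightest) chain

estimate = sum(samples) / len(samples)
```

The hot chains explore freely under loose tolerances while the swap moves feed their states to the cold chain, the same rationale as temperature swaps in Geyer's Parallel Tempering.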
