550 research outputs found
Word shape analysis for a hybrid recognition system
This paper describes two wholistic recognizers developed for use in a hybrid recognition system. The recognizers use information about the word shape, which is strongly related to word zoning. One of the recognizers is explicitly limited by the accuracy of the zoning information extraction; the other is designed to avoid this limitation. Both recognizers use very simple sets of features and fuzzy-set-based pattern matching techniques. This simplicity is intended to increase their robustness, but it also causes problems with disambiguating the results. A verification mechanism, using letter alternatives as compound features, is introduced; the letter alternatives are obtained from a segmentation-based recognizer coexisting in the hybrid system. Despite some remaining disambiguation problems, the wholistic recognizers are found capable of outperforming the segmentation-based recognizer, and when the recognizers work together in the hybrid system, the results are significantly better than those of the individual recognizers. Recognition results are reported and compared.
Entropy of the Universe
After a discussion on several limiting cases where General Relativity turns
into less sophisticated theories, we find that in the correct thermodynamical
and cosmological weak field limit of Einstein's field equations the entropy of
the Universe is R^(3/2) -- dependent, where R stands for the radius of the
causally related Universe. Thus, entropy grows in the Universe, contrary to
Standard Cosmology prediction.
Comment: To be published by the International Journal of Theoretical Physics.
Diversification and limited information in the Kelly game
Financial markets, with their vast range of different investment
opportunities, can be seen as a system of many different simultaneous games
with diverse and often unknown levels of risk and reward. We introduce
generalizations of the classic Kelly investment game [Kelly (1956)] that
incorporate these features, and use them to investigate the influence of
diversification and limited information on Kelly-optimal portfolios. In
particular, we present approximate formulas for optimizing diversified
portfolios and exact results for optimal investment in unknown games where the
only available information is past outcomes.
Comment: 11 pages, 4 figures.
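As a hedged illustration of the starting point the abstract builds on (the classic single-game Kelly criterion, not the paper's generalized or limited-information variants), the optimal stake for a binary bet with win probability p and net odds b can be sketched as:

```python
import math

def kelly_fraction(p: float, b: float) -> float:
    """Classic Kelly fraction f* = (p*b - q) / b for a binary bet.

    p: probability of winning; b: net odds (profit per unit staked).
    Returns the fraction of bankroll to wager, clamped at 0 for
    negative-expectation bets (where the optimal stake is nothing).
    """
    q = 1.0 - p
    return max(0.0, (p * b - q) / b)

def expected_log_growth(f: float, p: float, b: float) -> float:
    """Expected log-growth rate of the bankroll when staking fraction f."""
    q = 1.0 - p
    return p * math.log(1.0 + f * b) + q * math.log(1.0 - f)

# A 60/40 even-money game: the Kelly-optimal stake is 20% of bankroll.
f_star = kelly_fraction(0.6, 1.0)
print(round(f_star, 6))  # → 0.2
```

The log-growth function makes the "optimal" claim checkable: staking more or less than f* yields a lower expected growth rate, which is the property the paper's diversified and limited-information generalizations extend.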
An error in temporal error theory
Within the philosophy of time there has been a growing interest in positions that deny the reality of time. Those positions, whether motivated by arguments from physics or metaphysics, have a shared conclusion: time is not real. What has not been made wholly clear, however, is exactly what denying the reality of time entails. Time is unreal, sure. But what does that mean?
There has, within the recent literature, been only one sustained attempt to spell out exactly what it would mean to endorse a (so-called) temporal error theory, a theory that denies the reality of time: Baron and Miller's 'What is temporal error theory?'. Although their paper makes significant strides in spelling out what would be required of a temporal error theory, my claim in this paper is that their position must be rejected and replaced. As well as looking to reject Baron and Miller's position, I also look to provide that replacement.
Vector Theory of Gravity
We propose a gravitation theory based on an analogy with electrodynamics on
the basis of a vector field. For the first time, to calculate the basic
gravitational effects in the framework of a vector theory of gravity, we use a
Lagrangian written with gravitational radiation neglected and generalized to
the case of ultra-relativistic speeds. This allows us to accurately calculate
the values of all three major gravity experiments: the values of the perihelion
shift of Mercury, the light deflection angle in the gravity field of the Sun
and the value of radar echo delay. The calculated values coincide with the
observed ones. It is shown that, in this theory, there exists a model of an
expanding Universe.
Comment: 9 pages.
Intelligent Financial Fraud Detection Practices: An Investigation
Financial fraud is an issue with far reaching consequences in the finance
industry, government, corporate sectors, and for ordinary consumers. Increasing
dependence on new technologies such as cloud and mobile computing in recent
years has compounded the problem. Traditional methods of detection involve
extensive use of auditing, where a trained individual manually observes reports
or transactions in an attempt to discover fraudulent behaviour. This method is
not only time consuming, expensive and inaccurate, but in the age of big data
it is also impractical. Not surprisingly, financial institutions have turned to
automated processes using statistical and computational methods. This paper
presents a comprehensive investigation of financial fraud detection practices
using such data mining methods, with a particular focus on computational
intelligence-based techniques. The practices are classified by key aspects
such as the detection algorithm used, the fraud type investigated, and the
success rate achieved. Issues and challenges associated with current
practices, and potential directions for future research, are also identified.
Comment: Proceedings of the 10th International Conference on Security and
Privacy in Communication Networks (SecureComm 2014).
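As a minimal hedged sketch of the statistical end of such automated detection (a generic outlier rule, not any specific method surveyed in the paper), a z-score test can flag transactions whose amounts deviate sharply from an account's history:

```python
import math

def flag_outliers(amounts, threshold=2.5):
    """Flag transactions whose amount lies more than `threshold`
    population standard deviations from the mean of the history.

    Returns the indices of flagged transactions. An empty list is
    returned when the history has zero variance (nothing stands out).
    """
    n = len(amounts)
    mean = sum(amounts) / n
    variance = sum((x - mean) ** 2 for x in amounts) / n
    std = math.sqrt(variance)
    if std == 0.0:
        return []
    return [i for i, x in enumerate(amounts)
            if abs(x - mean) / std > threshold]

# Eight ordinary purchases followed by one anomalous transaction.
history = [25.0, 30.0, 27.0, 22.0, 31.0, 26.0, 24.0, 29.0, 500.0]
print(flag_outliers(history))  # → [8]
```

A single extreme value inflates the standard deviation it is measured against, which caps the achievable z-score for short histories; that is why the sketch uses a modest threshold, and why production systems typically prefer robust statistics or learned models of the kind the paper surveys.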
A relativistic action-at-a-distance description of gravitational interactions?
It is shown that certain aspects of gravitation may be described using a
relativistic action-at-a-distance formulation. The equations of motion of the
model presented are invariant under Lorentz transformations and agree with the
equations of Einstein's theory of General Relativity, at the first
Post-Newtonian approximation, for any number of interacting point masses.
On a modified-Lorentz-transformation based gravity model confirming basic GRT experiments
Implementing Poincar\'e's `geometric conventionalism' a scalar
Lorentz-covariant gravity model is obtained based on gravitationally modified
Lorentz transformations (or GMLT). The modification essentially consists of an
appropriate space-time and momentum-energy scaling ("normalization") relative
to a nondynamical flat background geometry according to an isotropic,
nonsingular gravitational `affecting' function Phi(r). Elimination of the
gravitationally `unaffected' S_0 perspective by local composition of space-time
GMLT recovers the local Minkowskian metric and thus preserves the invariance of
the locally observed velocity of light. The associated energy-momentum GMLT
provides a covariant Hamiltonian description for test particles and photons
which, in a static gravitational field configuration, endorses the four `basic'
experiments for testing General Relativity Theory: gravitational i) deflection
of light, ii) precession of perihelia, iii) delay of radar echo, iv) shift of
spectral lines. The model recovers the Lagrangian of the Lorentz-Poincar\'e
gravity model by Torgny Sj\"odin and integrates elements of the precursor
gravitational theories, with spatially Variable Speed of Light (VSL) by
Einstein and Abraham, and gravitationally variable mass by Nordstr\"om.
Comment: v1: 14 pages, extended version of conf. paper PIRT VIII, London,
2002. v2: section added on effective tensorial rank, references added,
appendix added, WEP issue deleted, abstract and other parts rewritten, same
results (to appear in Found. Phys.).
Is "the theory of everything'' merely the ultimate ensemble theory?
We discuss some physical consequences of what might be called ``the ultimate
ensemble theory'', where not only worlds corresponding to say different sets of
initial data or different physical constants are considered equally real, but
also worlds ruled by altogether different equations. The only postulate in this
theory is that all structures that exist mathematically exist also physically,
by which we mean that in those complex enough to contain self-aware
substructures (SASs), these SASs will subjectively perceive themselves as
existing in a physically ``real'' world. We find that it is far from clear that
this simple theory, which has no free parameters whatsoever, is observationally
ruled out. The predictions of the theory take the form of probability
distributions for the outcome of experiments, which makes it testable. In
addition, it may be possible to rule it out by comparing its a priori
predictions for the observable attributes of nature (the particle masses, the
dimensionality of spacetime, etc.) with what is observed.
Comment: 29 pages, revised to match version published in Annals of Physics.
The New Scientist article and color figures are available at
http://www.sns.ias.edu/~max/toe_frames.html or from [email protected]
