Crossings as a side effect of dependency lengths
The syntactic structure of sentences exhibits a striking regularity:
dependencies tend not to cross when drawn above the sentence. We investigate
two competing explanations. The traditional hypothesis is that this trend
arises from an independent principle of syntax that reduces crossings
practically to zero. An alternative to this view is the hypothesis that
crossings are a side effect of dependency lengths, i.e. sentences with shorter
dependency lengths should tend to have fewer crossings. We are able to reject
the traditional view in the majority of languages considered. The alternative
hypothesis can lead to a more parsimonious theory of language.
Comment: the discussion section has been expanded significantly; in press in Complexity (Wiley).
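To make the notion of a crossing concrete, here is a minimal sketch (not taken from the paper; the function and the toy example are illustrative, assuming 0-indexed word positions): two dependencies drawn as arcs above the sentence cross exactly when they share no word and exactly one endpoint of one lies strictly between the endpoints of the other.

from itertools import combinations

def crossings(edges):
    """Count pairwise crossings among dependencies drawn above a sentence.

    edges is a list of (i, j) pairs of word positions. Two dependencies
    cross iff they share no word and exactly one endpoint of one lies
    strictly between the endpoints of the other.
    """
    count = 0
    for (a, b), (c, d) in combinations(edges, 2):
        a, b = sorted((a, b))
        c, d = sorted((c, d))
        if len({a, b, c, d}) < 4:
            continue  # dependencies sharing a word cannot cross
        if (a < c < b) != (a < d < b):
            count += 1
    return count

print(crossings([(0, 2), (1, 3)]))  # interleaved arcs: 1 crossing
print(crossings([(0, 3), (1, 2)]))  # nested arcs: 0 crossings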
A commentary on "The now-or-never bottleneck: a fundamental constraint on language", by Christiansen and Chater (2016)
In a recent article, Christiansen and Chater (2016) present a fundamental
constraint on language, i.e. a now-or-never bottleneck that arises from our
fleeting memory, and explore its implications, e.g., chunk-and-pass processing,
outlining a framework that promises to unify different areas of research. Here
we explore additional support for this constraint and suggest further
connections from quantitative linguistics and information theory.
The meaning-frequency law in Zipfian optimization models of communication
According to Zipf's meaning-frequency law, words that are more frequent tend
to have more meanings. Here it is shown that a linear dependency between the
frequency of a form and its number of meanings is found in a family of models
of Zipf's law for word frequencies. This is evidence for a weak version of the
meaning-frequency law. Interestingly, that weak law (a) is not an inevitable property of the assumptions of the family and (b) is found at least in the narrow regime where those models exhibit Zipf's law for word frequencies.
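One simple way in which such a linear dependency can arise is sketched below, only as an illustration and under assumptions that are mine rather than the paper's: a binary form-meaning matrix with entries a_{ij}, m equiprobable meanings, and each meaning expressed by a single form, so that p(s_i | r_j) = a_{ij}. Then

\[
  p(s_i) \;=\; \sum_{j=1}^{m} p(r_j)\, p(s_i \mid r_j)
         \;=\; \frac{1}{m} \sum_{j=1}^{m} a_{ij}
         \;=\; \frac{\mu_i}{m},
\]

where \mu_i is the number of meanings connected to form s_i. Under these toy assumptions the frequency of a form is proportional to its number of meanings, a linear dependency of the kind the abstract calls the weak meaning-frequency law.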
The sum of edge lengths in random linear arrangements
Spatial networks are networks where nodes are located in a space equipped
with a metric. Typically, the space is two-dimensional and, until recently, the metric traditionally considered was the Euclidean distance. In spatial networks, the cost of a link depends on the edge length,
i.e. the distance between the nodes that define the edge. Hypothesizing that
there is pressure to reduce the length of the edges of a network requires a
null model, e.g., a random layout of the vertices of the network. Here we
investigate the properties of the distribution of the sum of edge lengths in random linear arrangements of vertices, a setting that has many applications in different fields. A random linear arrangement consists of an ordering of the nodes of a network, all possible orderings being equally likely. The
distance between two vertices is one plus the number of intermediate vertices
in the ordering. Compact formulae for the 1st and 2nd moments about zero as
well as the variance of the sum of edge lengths are obtained for arbitrary
graphs and trees. We also analyze the evolution of that variance in Erdős-Rényi graphs and its scaling in uniformly random trees. Various developments and applications for future research are suggested.
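The quantity under study can be explored numerically with a minimal Monte Carlo sketch (the function names and the toy star tree are mine; the paper derives the moments in closed form, which a simulation like this can only approximate):

import random
import statistics

def sum_edge_lengths(edges, position):
    """Sum of edge lengths for a given linear arrangement.

    position maps each vertex to its place in the ordering; the length of
    an edge is the absolute difference of the positions of its endpoints,
    i.e. one plus the number of intermediate vertices.
    """
    return sum(abs(position[u] - position[v]) for u, v in edges)

def moments_in_random_arrangements(vertices, edges, samples=20_000):
    """Monte Carlo estimate of the mean and variance of the sum of edge
    lengths over uniformly random linear arrangements of the vertices."""
    values = []
    places = list(range(len(vertices)))
    for _ in range(samples):
        random.shuffle(places)
        position = dict(zip(vertices, places))
        values.append(sum_edge_lengths(edges, position))
    return statistics.mean(values), statistics.variance(values)

# Toy example: a star tree with a hub attached to four leaves.
vertices = ["hub", "a", "b", "c", "d"]
edges = [("hub", leaf) for leaf in ["a", "b", "c", "d"]]
print(moments_in_random_arrangements(vertices, edges))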
The placement of the head that minimizes online memory: a complex systems approach
It is well known that the length of a syntactic dependency determines its
online memory cost. Thus, the problem of the placement of a head and its
dependents (complements or modifiers) that minimizes online memory is
equivalent to the problem of the minimum linear arrangement of a star tree.
However, how that length is translated into cognitive cost is not known. This
study shows that the online memory cost is minimized when the head is placed at
the center, regardless of the function that transforms length into cost,
provided only that this function is strictly monotonically increasing. Online
memory defines a quasi-convex adaptive landscape with a single central minimum
if the number of elements is odd and two central minima if that number is even.
We discuss various aspects of the dynamics of word order of subject (S), verb
(V) and object (O) from a complex systems perspective and suggest that word
orders tend to evolve by swapping adjacent constituents from an initial or
early SOV configuration that is attracted towards a central word order by
online memory minimization. We also suggest that the stability of SVO is due to
at least two factors, the quasi-convex shape of the adaptive landscape in the
online memory dimension and online memory adaptations that avoid regression to
SOV. Although OVS is also optimal for placing the verb at the center, its low
frequency is explained by its long distance to the seminal SOV in the
permutation space.
Comment: Minor changes (language improved; typos in Eqs. 5, 6 and 13 corrected).
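The central-placement result for a star tree is easy to check numerically; the sketch below is illustrative only, assuming positions 0 to n with the head at one of them, every other position holding a dependent, and g any strictly increasing cost function:

def online_memory_cost(head_position, n_dependents, g):
    """Total cost of placing the head among its n dependents.

    Positions run from 0 to n_dependents; each dependency contributes
    g(distance between the head and that dependent).
    """
    return sum(g(abs(head_position - pos))
               for pos in range(n_dependents + 1)
               if pos != head_position)

# The minimum sits at the central position for different increasing
# cost functions, in line with the result stated in the abstract.
for g in (lambda d: d, lambda d: d ** 2):
    costs = [online_memory_cost(p, 6, g) for p in range(7)]
    print(costs, "-> minimum at position", costs.index(min(costs)))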
Dynamically Induced Zeeman Effect in Massless QED
It is shown that in non-perturbative massless QED an anomalous magnetic
moment is dynamically induced by an applied magnetic field. The induced
magnetic moment produces a Zeeman splitting for electrons in Landau levels
higher than zero. The expressions for the non-perturbative Lande g-factor and Bohr magneton are obtained. Possible applications of this effect are outlined.
Comment: Extensively revised version with several misprints and formulas corrected. In this new version we included the non-perturbative Lande g-factor and Bohr magneton.
A correction on Shiloach's algorithm for minimum linear arrangement of trees
More than 30 years ago, Shiloach published an algorithm to solve the minimum
linear arrangement problem for undirected trees. Here we fix a small error in
the original version of the algorithm and discuss its effect on subsequent
literature. We also improve some aspects of the notation.
Comment: A new introductory paragraph has been added; error solutions and notation improvements are discussed in more depth.
Beyond description. Comment on "Approaching human language with complex networks" by Cong & Liu
Comment on "Approaching human language with complex networks" by Cong & Li
The placement of the head that maximizes predictability. An information theoretic approach
The minimization of the length of syntactic dependencies is a
well-established principle of word order and the basis of a mathematical theory
of word order. Here we complete that theory from the perspective of information
theory, adding a competing word order principle: the maximization of
predictability of a target element. These two principles are in conflict: to
maximize the predictability of the head, the head should appear last, which
maximizes the costs with respect to dependency length minimization. The
implications of such a broad theoretical framework for understanding the optimality, diversity and evolution of the six possible orderings of subject, object and verb are reviewed.
Comment: in press in Glottometrics.
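The conflict stated above can be illustrated with a small worked calculation (mine, under the star-tree reading of a head with n dependents occupying positions 1 to n+1): the sum of dependency lengths for a head at position p is

\[
  D(p) \;=\; \sum_{k=1}^{p-1} k \;+\; \sum_{k=1}^{\,n+1-p} k
       \;=\; \frac{(p-1)p}{2} \;+\; \frac{(n+1-p)(n+2-p)}{2},
\]

which reaches its maximum, n(n+1)/2, when the head is placed first or last, and its minimum, of the order of n^2/4, when the head is placed at the center. Placing the head last therefore favors predictability at the largest possible dependency-length cost.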
