Hilbert-Samuel multiplicities of certain deformation rings
We compute presentations of crystalline framed deformation rings of a
two-dimensional representation $\bar{\rho}$ of the absolute Galois group of
$\mathbb{Q}_p$, when $\bar{\rho}$ has scalar semi-simplification and the
Hodge-Tate weights are small. In the non-trivial cases, we show that the
special fibre is geometrically irreducible and generically reduced, and that
the Hilbert-Samuel multiplicity takes one of three values depending on
$\bar{\rho}$. We show that in the last two cases the deformation ring is not
Cohen-Macaulay.
Comment: 12 pages; Prop. 5 and Lemma 2 are new, showing that the special fibre
of the universal deformation ring is geometrically irreducible and
generically reduced
The Behaviorisms of Skinner and Quine: Genesis, Development, and Mutual Influence
In April 1933, two bright young Ph.D.s were elected to the Harvard Society of Fellows: the psychologist B. F. Skinner and the philosopher/logician W. V. Quine. Both men would become among the most influential scholars of their time; Skinner leads the "Top 100 Most Eminent Psychologists of the 20th Century," whereas philosophers have selected Quine as the most important Anglophone philosopher after the Second World War.1 At the height of their fame, Skinner and Quine became "Edgar Pierce twins"; the latter obtaining the endowed chair at Harvard's department of philosophy, the former taking up the position at Harvard's psychology department.2 Besides these biographical parallels, there also…
Relaxation Penalties and Priors for Plausible Modeling of Nonidentified Bias Sources
In designed experiments and surveys, known laws or design features provide
checks on the most relevant aspects of a model and identify the target
parameters. In contrast, in most observational studies in the health and social
sciences, the primary study data do not identify and may not even bound target
parameters. Discrepancies between target and analogous identified parameters
(biases) are then of paramount concern, which forces a major shift in modeling
strategies. Conventional approaches are based on conditional testing of
equality constraints, which correspond to implausible point-mass priors. When
these constraints are not identified by available data, however, no such
testing is possible. In response, implausible constraints can be relaxed into
penalty functions derived from plausible prior distributions. The resulting
models can be fit within familiar full or partial likelihood frameworks. The
absence of identification renders all analyses part of a sensitivity analysis.
In this view, results from single models are merely examples of what might be
plausibly inferred. Nonetheless, just one plausible inference may suffice to
demonstrate inherent limitations of the data. Points are illustrated with
misclassified data from a study of sudden infant death syndrome. Extensions to
confounding, selection bias and more complex data structures are outlined.
Comment: Published in at http://dx.doi.org/10.1214/09-STS291 the Statistical
Science (http://www.imstat.org/sts/) by the Institute of Mathematical
Statistics (http://www.imstat.org)
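The core move in the abstract above — replacing an equality constraint that the data cannot test with a penalty derived from a plausible prior — can be sketched as a penalized likelihood fit. This is an illustrative toy, not the paper's analysis: the misclassification setup, prior means, and standard deviations below are assumptions chosen for the example.

```python
# Toy sketch: a binary outcome is recorded with imperfect sensitivity and
# specificity. The conventional constraint "sensitivity = specificity = 1"
# is implausible; instead we relax it into a penalty (minus log of a normal
# prior on the logit scale) and maximize the penalized likelihood.
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # inverse-logit

rng = np.random.default_rng(0)

# Hypothetical data: true prevalence 0.10, recorded with sensitivity 0.85
# and specificity 0.95 (these values are assumptions for the demo).
n = 5000
truth = rng.random(n) < 0.10
recorded = np.where(truth, rng.random(n) < 0.85, rng.random(n) > 0.95)
y = int(recorded.sum())

def neg_penalized_loglik(theta):
    # theta = logits of (prevalence, sensitivity, specificity).
    p, se, sp = expit(theta)
    q = p * se + (1 - p) * (1 - sp)          # P(recorded positive)
    loglik = y * np.log(q) + (n - y) * np.log(1 - q)
    # Penalty from normal priors on logit(se) and logit(sp);
    # mean 1.7 (≈ logit 0.85) and SD 0.5 are illustrative choices.
    prior_mean, prior_sd = 1.7, 0.5
    penalty = ((theta[1] - prior_mean) ** 2 +
               (theta[2] - prior_mean) ** 2) / (2 * prior_sd ** 2)
    return -loglik + penalty

fit = minimize(neg_penalized_loglik, x0=np.zeros(3))
prevalence = expit(fit.x[0])
```

With a single observed count, the three parameters are not identified by the data alone; the prior-derived penalty is what makes the fit well-posed, which is why any one such fit is only an example of what might plausibly be inferred.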
The tenth order mock theta functions revisited
In this paper we consider the first four of the eight identities between the
tenth order mock theta functions, found in Ramanujan's lost notebook. These
were originally proved by Choi. Here we give an alternative (much shorter)
proof.
Comment: 11 pages; preprint, submitted for publication
Exploring sensor data management
The increasing availability of cheap, small, low-power sensor hardware and the ubiquity of wired and wireless networks have led to the prediction that `smart environments' will emerge in the near future. The sensors in these environments collect detailed information about the situation people are in, which is used to enhance information-processing applications present on their mobile and `ambient' devices.

Bridging the gap between sensor data and application information poses new requirements for data management. This report discusses what these requirements are and documents ongoing research that explores ways of thinking about data management suited to them: a more sophisticated control flow model, data models that incorporate time, and ways to deal with the uncertainty in sensor data.
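Two of the requirements named above — data models that incorporate time and explicit handling of uncertainty — can be illustrated with a toy data model. The type and field names here are our own invention, not from the report:

```python
# Toy sketch: each sensor reading carries a timestamp and an explicit
# noise estimate, so applications can reason about when a value was
# measured and how much to trust it.
from dataclasses import dataclass
from datetime import datetime, timedelta
from statistics import fmean

@dataclass(frozen=True)
class Reading:
    sensor_id: str
    value: float        # measured quantity, e.g. temperature in degrees C
    stddev: float       # sensor noise, carried along with the value
    at: datetime        # time the sample was taken

def window_mean(readings, start, length):
    """Average the readings that fall inside [start, start + length)."""
    inside = [r.value for r in readings if start <= r.at < start + length]
    return fmean(inside) if inside else None

# One reading per second from a single (hypothetical) temperature sensor.
t0 = datetime(2006, 1, 1, 12, 0)
log = [Reading("temp-1", 20.0 + i * 0.1, 0.5, t0 + timedelta(seconds=i))
       for i in range(10)]
avg = window_mean(log, t0, timedelta(seconds=5))  # mean of the first five readings
```

Keeping the timestamp and noise estimate inside the record, rather than as out-of-band metadata, is one simple way to let queries such as windowed aggregates take time and uncertainty into account.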
