The surprising implications of familial association in disease risk
Background: A wide range of diseases show some degree of clustering in
families; family history is therefore an important aspect for clinicians when
making risk predictions. Familial aggregation is often quantified in terms of a
familial relative risk (FRR), and although at first glance this measure may
seem simple and intuitive as an average risk prediction, its implications are
not straightforward.
Methods: We use two statistical models for the distribution of disease risk
in a population: a dichotomous risk model that gives an intuitive understanding
of the implication of a given FRR, and a continuous risk model that facilitates
a more detailed computation of the inequalities in disease risk. Published
estimates of FRRs are used to produce Lorenz curves and Gini indices that
quantify the inequalities in risk for a range of diseases.
Results: We demonstrate that even a moderate familial association in disease
risk implies a very large difference in risk between individuals in the
population. We give examples of diseases for which this is likely to be true,
and we further demonstrate the relationship between the point estimates of FRRs
and the distribution of risk in the population.
Conclusions: The variation in risk for several severe diseases may be larger
than the variation in income in many countries. The implications of familial
risk estimates should be recognized by epidemiologists and clinicians.
Comment: 17 pages, 5 figures
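The dichotomous risk model described in this abstract lends itself to a short numerical sketch. Below is a minimal, self-contained computation of the Gini index for a hypothetical population in which a small high-risk group carries much of the risk; the risk values and group sizes are illustrative, not estimates from the paper.

```python
# Minimal sketch: Gini index of disease risk under a dichotomous risk model.
# The risks below are hypothetical, not taken from the paper.

def gini(risks):
    """Gini index of a finite sample, via the sorted-sample formula:
    G = 2 * sum(rank_i * x_i) / (n * sum(x)) - (n + 1) / n."""
    xs = sorted(risks)
    n = len(xs)
    total = sum(xs)
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return 2.0 * weighted / (n * total) - (n + 1.0) / n

# Dichotomous model: 10% of the population carries a 20-fold higher risk.
risks = [0.01] * 90 + [0.2] * 10
print(gini(risks))  # ≈ 0.59: a moderate FRR can imply large risk inequality
```

The Lorenz curve can be read off the same sorted sample: plot the cumulative share of total risk carried by the bottom fraction of the population.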
Tears, remorse and reparation in Henrik Ibsen's Peer Gynt
For more than 100 years, Henrik Ibsen’s Peer Gynt has been interpreted in the light of Søren Kierkegaard. With a problematic self as the essence of the play, interpreters have emphasized a Kierkegaardian choice, necessary for Peer to become an integrated person. This paper challenges these interpretations by focusing on mourning as a way to develop the self in Peer Gynt. The reading reveals a striking correspondence, in structure and dynamics, between Peer’s way of dealing with feelings such as sadness, guilt and remorse and Klein’s model of the paranoid-schizoid and depressive positions. Peer faces painful feelings throughout the play. He identifies them quite easily, but is unable to tolerate the pain and avoids it through omnipotent fantasies, manic manoeuvres and denial. Hence, no reparation through mourning takes place, his development is arrested, and he is unable to form a genuine love relationship with Solveig. The reading demonstrates an impressively profound complexity in Ibsen’s representation of Peer’s character, and a striking richness of detail in how it corresponds to Klein’s anthropology.
Additive Intensity Regression Models in Corporate Default Analysis
We consider additive intensity (Aalen) models as an alternative to the multiplicative intensity (Cox) models for analyzing the default risk of a sample of rated, nonfinancial U.S. firms. The setting allows for estimating and testing the significance of time-varying effects. We use a variety of model-checking techniques to identify misspecifications. In our final model, we find evidence of time variation in the effects of distance-to-default and short-to-long-term debt. We also identify interactions between distance-to-default and other covariates, and the quick ratio covariate is significant. None of our macroeconomic covariates are significant.
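In the additive (Aalen) model, the cumulative regression functions are typically estimated by least squares over the risk set at each event time. The following is a minimal sketch of that idea with an intercept and a single covariate, assuming distinct event times; the data are hypothetical and this is not the authors' implementation.

```python
# Sketch of cumulative regression functions in an additive intensity model
# alpha_i(t) = b0(t) + b1(t) * x_i, with one hypothetical covariate x.
# At each event time, solve the normal equations (X'X) dB = X' dN over
# the current risk set; assumes distinct event times.

def aalen_increments(times, events, x):
    order = sorted(range(len(times)), key=lambda i: times[i])
    B0 = B1 = 0.0
    path = []
    for k, i in enumerate(order):
        if not events[i]:          # censored: no increment
            continue
        risk = order[k:]           # subjects still at risk at this event
        n = len(risk)
        sx = sum(x[j] for j in risk)
        sxx = sum(x[j] ** 2 for j in risk)
        det = n * sxx - sx * sx
        if det == 0:               # covariate constant in risk set: skip
            continue
        # dN is 1 for subject i, 0 otherwise; 2x2 solve of (X'X) dB = X' dN
        dB0 = (sxx - sx * x[i]) / det
        dB1 = (n * x[i] - sx) / det
        B0 += dB0
        B1 += dB1
        path.append((times[i], B0, B1))
    return path
```

Plotting B1(t) against t gives a direct visual check for the kind of time-varying covariate effects the abstract refers to: a non-linear cumulative path suggests an effect that changes over time.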
Nonparametric survival analysis of epidemic data
This paper develops nonparametric methods for the survival analysis of
epidemic data based on contact intervals. The contact interval from person i to
person j is the time between the onset of infectiousness in i and infectious
contact from i to j, where we define infectious contact as a contact sufficient
to infect a susceptible individual. We show that the Nelson-Aalen estimator
produces an unbiased estimate of the contact interval cumulative hazard
function when who-infects-whom is observed. When who-infects-whom is not
observed, we average the Nelson-Aalen estimates from all transmission networks
consistent with the observed data using an EM algorithm. This converges to a
nonparametric MLE of the contact interval cumulative hazard function that we
call the marginal Nelson-Aalen estimate. We study the behavior of these methods
in simulations and use them to analyze household surveillance data from the
2009 influenza A(H1N1) pandemic. In an appendix, we show that these methods
extend chain-binomial models to continuous time.
Comment: 30 pages, 6 figures
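When who-infects-whom is observed, the estimator named above is the standard Nelson-Aalen estimator for right-censored data. A minimal sketch, using made-up contact-interval observations rather than the household surveillance data:

```python
# Minimal sketch of the Nelson-Aalen cumulative hazard estimator:
# at each event time t, add (# events at t) / (# at risk just before t).
# Times and censoring indicators below are hypothetical.

def nelson_aalen(times, events):
    data = sorted(zip(times, events))   # events[i] = 1 event, 0 censored
    n = len(data)
    H, at_risk, out = 0.0, n, []
    i = 0
    while i < n:
        t = data[i][0]
        d = r = 0                       # events and total removals at time t
        while i < n and data[i][0] == t:
            d += data[i][1]
            r += 1
            i += 1
        if d:
            H += d / at_risk
            out.append((t, H))
        at_risk -= r
    return out

# Hypothetical contact intervals (days), with one censored pair at day 3 and 6:
print(nelson_aalen([2, 3, 3, 5, 6], [1, 0, 1, 1, 0]))
```

The marginal Nelson-Aalen estimate described in the abstract averages such curves over all transmission networks consistent with the data, with weights updated by an EM algorithm.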
Graphical models for marked point processes based on local independence
A new class of graphical models capturing the dependence structure of events
that occur in time is proposed. The graphs represent so-called local
independences, meaning that the intensities of certain types of events are
independent of some (but not necessarily all) events in the past. This dynamic
concept of independence is asymmetric, similar to Granger non-causality, so
that the corresponding local independence graphs differ considerably from
classical graphical models. Hence a new notion of graph separation, called
delta-separation, is introduced and implications for the underlying model as
well as for likelihood inference are explored. Benefits regarding facilitation
of reasoning about and understanding of dynamic dependencies as well as
computational simplifications are discussed.
Comment: To appear in the Journal of the Royal Statistical Society Series
Does Cox analysis of a randomized survival study yield a causal treatment effect?
The final publication (Aalen, Odd O., Richard J. Cook, and Kjetil Røysland. Does Cox analysis of a randomized survival study yield a causal treatment effect? Lifetime Data Analysis 21(4) (2015): 579-593. DOI: 10.1007/s10985-015-9335-y) is available at http://link.springer.com/article/10.1007/s10985-015-9335-y
Statistical methods for survival analysis play a central role in the assessment
of treatment effects in randomized clinical trials in cardiovascular disease, cancer, and
many other fields. The most common approach to analysis involves fitting a Cox
regression model including a treatment indicator, and basing inference on the large
sample properties of the regression coefficient estimator. Despite the fact that treatment
assignment is randomized, the hazard ratio is not a quantity which admits a causal
interpretation in the case of unmodelled heterogeneity. This problem arises because
the risk sets beyond the first event time are comprised of the subset of individuals who
have not previously failed. The balance in the distribution of potential confounders
between treatment arms is lost by this implicit conditioning, whether or not censoring
is present. Thus while the Cox model may be used as a basis for valid tests of the
null hypotheses of no treatment effect if robust variance estimates are used, modeling
frameworks more compatible with causal reasoning may be preferable in general for
estimation.
Canadian Institutes for Health Research (FRN 13887); Canada Research Chair (Tier 1) – CIHR funded (950-226626
A semi-Markov model for stroke with piecewise-constant hazards in the presence of left, right and interval censoring.
This paper presents a parametric method for fitting semi-Markov models with piecewise-constant hazards in the presence of left, right and interval censoring. We investigate transition intensities in a three-state illness-death model with no recovery. We relax the Markov assumption by adjusting the intensity for the transition from state 2 (illness) to state 3 (death) for the time spent in state 2, through a time-varying covariate. This requires the exact time of the transition from state 1 (healthy) to state 2. When the data are subject to left or interval censoring, this time is unknown. In the estimation of the likelihood, we account for interval censoring by integrating out all possible times for the transition from state 1 to state 2. For left censoring, we use an Expectation-Maximisation-inspired algorithm. A simulation study assesses the performance of the method. The proposed combination of statistical procedures provides great flexibility. We illustrate the method in an application using data on stroke onset for the older population from the UK Medical Research Council Cognitive Function and Ageing Study.
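A piecewise-constant hazard gives a closed-form cumulative hazard and survival function, which is what makes the likelihood contributions in such models tractable. A minimal sketch for a single transition intensity, with hypothetical cut points and rates (not values from the study):

```python
# Minimal sketch: cumulative hazard and survival under a piecewise-constant
# hazard. rates[k] applies on [cuts[k], cuts[k+1]); the last piece is
# open-ended. Cut points and rates below are hypothetical.
import math

def cum_hazard(t, cuts, rates):
    H = 0.0
    for k, rate in enumerate(rates):
        lo = cuts[k]
        hi = cuts[k + 1] if k + 1 < len(cuts) else float("inf")
        if t <= lo:
            break
        H += rate * (min(t, hi) - lo)   # exposure time within this piece
    return H

def survival(t, cuts, rates):
    return math.exp(-cum_hazard(t, cuts, rates))

# Hypothetical transition intensity: 0.1/yr on [0,1), 0.2/yr on [1,2), 0.3/yr after.
print(survival(2.5, [0.0, 1.0, 2.0], [0.1, 0.2, 0.3]))  # ≈ exp(-0.45)
```

Integrating such survival terms over the unknown state-1-to-state-2 transition time, as the abstract describes, is what handles the interval-censored likelihood contributions.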
Distilling Information Reliability and Source Trustworthiness from Digital Traces
Online knowledge repositories typically rely on their users or dedicated
editors to evaluate the reliability of their content. These evaluations can be
viewed as noisy measurements of both information reliability and information
source trustworthiness. Can we leverage these noisy evaluations, often biased,
to distill a robust, unbiased and interpretable measure of both notions?
In this paper, we argue that the temporal traces left by these noisy
evaluations give cues on the reliability of the information and the
trustworthiness of the sources. Then, we propose a temporal point process
modeling framework that links these temporal traces to robust, unbiased and
interpretable notions of information reliability and source trustworthiness.
Furthermore, we develop an efficient convex optimization procedure to learn the
parameters of the model from historical traces. Experiments on real-world data
gathered from Wikipedia and Stack Overflow show that our modeling framework
accurately predicts evaluation events, provides an interpretable measure of
information reliability and source trustworthiness, and yields interesting
insights about real-world events.
Comment: Accepted at the 26th World Wide Web conference (WWW-17)
