A Recurrent Neural Network Survival Model: Predicting Web User Return Time
The size of a website's active user base directly affects its value. Thus, it
is important to monitor and influence a user's likelihood to return to a site.
Essential to this is predicting when a user will return. Current
state-of-the-art approaches to this problem come in two flavors: (1) Recurrent
Neural Network (RNN) based solutions and (2) survival analysis methods. We observe
that both techniques are severely limited when applied to this problem.
Survival models can only incorporate aggregate representations of users instead
of automatically learning a representation directly from a raw time series of
user actions. RNNs can automatically learn features, but cannot be directly
trained with examples of non-returning users, who have no target value for
their return time. We develop a novel RNN survival model that removes the
limitations of both state-of-the-art methods. We demonstrate that this model can
successfully be applied to return time prediction on a large e-commerce dataset
with a better ability to discriminate between returning and non-returning
users than either method applied in isolation.
Comment: Accepted into ECML PKDD 2018; 8 figures and 1 table
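The combination described above can be made concrete with a small sketch: an RNN consumes each user's raw action sequence and emits the parameters of a survival distribution, so censored (non-returning) users contribute through the survival function instead of being dropped. The GRU, the Weibull output head, and all names below are illustrative assumptions, not the authors' architecture.

```python
# A minimal sketch, assuming a Weibull output head: the RNN summarizes the
# action sequence, and a censored likelihood handles non-returning users.
import torch
import torch.nn as nn

class RNNSurvival(nn.Module):
    def __init__(self, n_features, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)  # Weibull scale and shape (log-space)

    def forward(self, x):
        _, h = self.rnn(x)                # h: (1, batch, hidden)
        out = self.head(h.squeeze(0))
        scale = torch.exp(out[:, 0])      # lambda > 0
        shape = torch.exp(out[:, 1])      # k > 0
        return scale, shape

def censored_weibull_nll(scale, shape, t, observed):
    """Negative log-likelihood: log f(t) for users whose return was observed,
    log S(t) for users censored at the end of the observation window."""
    log_h = torch.log(shape) - torch.log(scale) \
            + (shape - 1) * (torch.log(t) - torch.log(scale))   # log hazard
    log_S = -((t / scale) ** shape)                             # log survival
    return -(observed * (log_h + log_S) + (1 - observed) * log_S).mean()
```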
Lorentz violation, Gravity, Dissipation and Holography
We reconsider Lorentz Violation (LV) at the fundamental level. We show that
Lorentz Violation is intimately connected with gravity and that LV couplings in
QFT must always be fields in a gravitational sector. Diffeomorphism invariance
must remain intact, and the LV couplings must transform as tensors under
coordinate/frame changes. Searching for LV is therefore one of the most sensitive ways of
looking for new physics, either new interactions or modifications of known
ones. Energy dissipation/Cerenkov radiation is shown to be a generic feature of
LV in QFT. A general computation is done in strongly coupled theories with
gravity duals. It is shown that in scale invariant regimes, the energy
dissipation rate depends non-trivially on two characteristic exponents, the
Lifshitz exponent and the hyperscaling violation exponent.
Comment: LaTeX, 51 pages, 9 figures. (v2) References and comments added; misprints corrected.
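For readers unfamiliar with the two exponents, a standard hyperscaling-violating Lifshitz geometry in which both appear is sketched below; the paper's conventions and normalizations may differ.

```latex
% A common parametrization of a hyperscaling-violating Lifshitz metric
% (illustrative; not necessarily the form used in the paper).
\[
ds^2 = r^{2\theta/d}\left(-\frac{dt^2}{r^{2z}}
       + \frac{dr^2 + dx_i\,dx^i}{r^2}\right),
\]
% Under t -> \lambda^z t, x -> \lambda x, r -> \lambda r the metric scales
% as ds^2 -> \lambda^{2\theta/d} ds^2: z is the Lifshitz (dynamical)
% exponent, \theta the hyperscaling violation exponent; z = 1, \theta = 0
% recovers AdS.
```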
Estimating time-to-onset of adverse drug reactions from spontaneous reporting databases.
BACKGROUND: Analyzing the time-to-onset of adverse drug reactions from treatment exposure contributes to meeting pharmacovigilance objectives, i.e. identification and prevention. Post-marketing data are available from reporting systems. Times-to-onset from such databases are right-truncated, because some patients who were exposed to the drug and will eventually develop the adverse drug reaction will do so after the time of analysis and are thus not included in the data. The methods developed for right-truncated data are not widely known and have never been used in pharmacovigilance. We assess the use of appropriate methods, as well as the consequences of not taking right truncation into account (the naïve approach), on parametric maximum likelihood estimation of the time-to-onset distribution.
METHODS: Both approaches, naïve or accounting for right truncation, were compared in a simulation study. We used twelve scenarios for the exponential distribution and twenty-four for the Weibull and log-logistic distributions. These scenarios are defined by a set of parameters: the parameters of the time-to-onset distribution, the probability of this distribution falling within an observable interval of values, and the sample size. An application to lymphomas reported after anti-TNF-α treatment in the French pharmacovigilance database is presented.
RESULTS: The simulation study shows that the bias and the mean squared error can be unacceptably large when right truncation is ignored, whereas the truncation-based estimator always performs better, often satisfactorily, and the gap between the two can be large. For the real dataset, the estimated expected times-to-onset under the two approaches differ by at least 58 weeks, which is not negligible. This difference is obtained for the Weibull model, under which the estimated probability of the distribution falling within an observable interval of values is close to 1.
CONCLUSIONS: It is necessary to take right truncation into account when estimating the time-to-onset of adverse drug reactions from spontaneous reporting databases.
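A minimal sketch of the truncation adjustment described above, under assumed Weibull data and illustrative variable names: each reported time-to-onset t_i is observed only because it fell below its report-specific truncation bound tau_i, so the truncation-based likelihood conditions on that event via f(t_i)/F(tau_i), while the naïve fit omits the correction.

```python
# Compare the naive and truncation-adjusted maximum likelihood fits on
# simulated right-truncated Weibull times-to-onset (all values made up).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import weibull_min

def neg_loglik(params, t, tau, truncated=True):
    shape, scale = np.exp(params)              # keep parameters positive
    ll = weibull_min.logpdf(t, shape, scale=scale)
    if truncated:                              # condition on t <= tau
        ll -= weibull_min.logcdf(tau, shape, scale=scale)
    return -ll.sum()

rng = np.random.default_rng(0)
t_all = weibull_min.rvs(1.5, scale=50.0, size=5000, random_state=rng)
tau = rng.uniform(10.0, 120.0, size=5000)      # time from exposure to analysis
keep = t_all <= tau                            # only already-onset cases are reported
t, tau = t_all[keep], tau[keep]

for trunc in (False, True):                    # naive fit vs truncation-based fit
    res = minimize(neg_loglik, x0=[0.0, np.log(30.0)], args=(t, tau, trunc))
    print("truncated" if trunc else "naive", np.exp(res.x))
```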
On the relationship between the reversed hazard rate and elasticity
Despite hazard and reversed hazard rates sharing a number of similar aspects, reversed hazard functions are far less frequently used, and understanding their meaning is not a simple task. The aim of this paper is to expand the usefulness of the reversed hazard function by relating it to other well-known concepts broadly used in economics: (linear or cumulative) rates of increase and elasticity. This will make it possible (i) to improve our understanding of the consequences of using a particular distribution and, in certain cases, (ii) to introduce our hypotheses and knowledge about the random process in a more meaningful and intuitive way, thus providing a means of arriving at distributions that would otherwise be hardly imaginable or justifiable.
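Although the abstract does not display the relation, the standard definitions make the connection concrete: the elasticity of the cdf at t equals t times the reversed hazard rate (the paper's cumulative-rate formulation may be more general).

```latex
% Standard definitions relating the reversed hazard rate to elasticity;
% the paper's (linear or cumulative) rate-of-increase version may be broader.
\[
\tilde r(t) \;=\; \frac{f(t)}{F(t)} \;=\; \frac{d}{dt}\,\log F(t),
\qquad
\varepsilon_F(t) \;=\; \frac{t\,f(t)}{F(t)} \;=\; t\,\tilde r(t),
\]
% so the elasticity of the cdf at t is t times the reversed hazard rate.
```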
Search for High Mass Photon Pairs in p-pbar --> gamma-gamma-jet-jet Events at sqrt(s)=1.8 TeV
A search has been carried out for events in the channel p-barp --> gamma
gamma jet jet. Such a signature can characterize the production of a
non-standard Higgs boson together with a W or Z boson. We refer to this
non-standard Higgs, having standard model couplings to vector bosons but no
coupling to fermions, as a "bosonic Higgs." With the requirement of two high
transverse energy photons and two jets, the diphoton mass (m(gamma gamma))
distribution is consistent with expected background. A 90(95)% C.L. upper limit
on the cross section as a function of mass is calculated, ranging from
0.60(0.80) pb for m(gamma gamma) = 65 GeV/c^2 to 0.26(0.34) pb for m(gamma
gamma) = 150 GeV/c^2, corresponding to a 95% C.L. lower limit on the mass of a
bosonic Higgs of 78.5 GeV/c^2.
Comment: 9 pages, 3 figures. Replacement has new H->gamma gamma branching ratios and corresponding new mass limit.
Search For Heavy Pointlike Dirac Monopoles
We have searched for central production of a pair of photons with high
transverse energies in p-pbar collisions at sqrt(s) = 1.8 TeV using data
collected with the D0 detector at the Fermilab Tevatron in 1994--1996. If they
exist, virtual heavy pointlike Dirac monopoles could rescatter pairs of nearly
real photons into this final state via a box diagram. We observe no excess of
events above background, and set lower 95% C.L. limits on the mass of a spin 0,
1/2, or 1 Dirac monopole.
Comment: 12 pages, 4 figures
Confidence interval estimation for the changepoint of treatment stratification in the presence of a qualitative covariate-treatment interaction
The goal in stratified medicine is to administer the "best" treatment to a patient. Not all patients might benefit from the same treatment; the choice of best treatment can depend on certain patient characteristics. In this article, it is assumed that a time-to-event outcome is considered as a patient-relevant outcome and that a qualitative interaction between a continuous covariate and treatment exists, i.e., that patients with different values of one specific covariate should be treated differently. We suggest and investigate different methods for confidence interval estimation for the covariate value at which the treatment recommendation should be changed, based on data collected in a randomized clinical trial. An adaptation of Fieller's theorem, the delta method, and different bootstrap approaches (normal, percentile-based, wild bootstrap) are investigated and compared in a simulation study. Extensions to multivariable problems are presented and evaluated. We observed appropriate confidence interval coverage following Fieller's theorem irrespective of sample size, but at the cost of very wide or even infinite confidence intervals. The delta method and the wild bootstrap approach provided the smallest intervals but inadequate coverage for small to moderate event numbers, also depending on the location of the true changepoint. For the percentile-based bootstrap, wide intervals were observed, and it was slightly conservative regarding coverage, whereas the normal bootstrap did not provide acceptable results for many scenarios. The described methods were also applied to data from a randomized clinical trial comparing two treatments for patients with symptomatic, severe carotid artery stenosis, considering the patient's age as a predictive marker.
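As an illustration of one of the bootstrap variants discussed above, the sketch below computes a percentile-bootstrap confidence interval for the changepoint -b1/b3, the covariate value at which the estimated treatment effect b1 + b3*x changes sign (the ratio form is also why Fieller's theorem applies). A linear model stands in for the trial's time-to-event model, and all names and values are made up.

```python
# Percentile bootstrap for the covariate changepoint -b1/b3 in a model with
# a covariate-treatment interaction (illustrative linear-model stand-in).
import numpy as np

rng = np.random.default_rng(1)
n = 400
x = rng.uniform(40, 80, n)                      # e.g. patient age
z = rng.integers(0, 2, n)                       # randomized treatment arm
y = 1.0 + z * (-3.0 + 0.05 * x) + rng.normal(0, 1, n)  # true changepoint: 60

def changepoint(x, z, y):
    X = np.column_stack([np.ones_like(x), z, x, z * x])
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    return -b[1] / b[3]                          # effect b1 + b3*x crosses zero

boot = np.empty(2000)
for i in range(boot.size):
    idx = rng.integers(0, n, n)                  # resample patients with replacement
    boot[i] = changepoint(x[idx], z[idx], y[idx])

print("estimate:", changepoint(x, z, y))
print("95% percentile CI:", np.percentile(boot, [2.5, 97.5]))
```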
Fitting censored quantile regression by variable neighborhood search
Quantile regression is an increasingly important topic in statistical analysis. However, censored quantile regression is hard to fit numerically because the objective function to be minimized is neither convex nor concave in the regressors. The performance of standard methods is not satisfactory, particularly if a high degree of censoring is present. The usual approach is to simplify (linearize) the objective function and to show theoretically that such an approximation converges to the optimal values. In this paper, we suggest a new approach: solving the optimization problem (nonlinear, nonconvex, and nondifferentiable) directly. Our method is based on variable neighborhood search, a recent and successful technique for solving global optimization problems. The presented results indicate that our method can considerably improve the quality of the censored quantile regression estimator.
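A minimal sketch of the direct approach, assuming a Powell-type censored quantile objective: with right censoring the fitted conditional quantile is capped at the censoring point, which makes the loss neither convex nor concave, and a basic variable neighborhood search (shake, local search, move or widen) explores it. The shaking radii and the Nelder-Mead local step are illustrative choices, not the paper's algorithm.

```python
# Variable neighborhood search over Powell's censored quantile loss
# (illustrative sketch; tuning constants are assumptions).
import numpy as np
from scipy.optimize import minimize

def rho(u, tau):                        # quantile-regression check function
    return np.where(u >= 0, tau * u, (tau - 1) * u)

def powell_loss(beta, X, y, c, tau):
    pred = np.minimum(X @ beta, c)      # right censoring caps the fitted quantile
    return rho(y - pred, tau).sum()

def vns(X, y, c, tau, k_max=5, iters=50, seed=0):
    rng = np.random.default_rng(seed)
    best = np.zeros(X.shape[1])
    best_val = powell_loss(best, X, y, c, tau)
    for _ in range(iters):
        k = 1
        while k <= k_max:
            cand = best + rng.normal(0, 0.1 * k, best.shape)   # shake in N_k
            res = minimize(powell_loss, cand, args=(X, y, c, tau),
                           method="Nelder-Mead")               # local search
            if res.fun < best_val:
                best, best_val, k = res.x, res.fun, 1          # move, restart at N_1
            else:
                k += 1                                         # widen neighborhood
    return best
```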
A copula model for marked point processes
Many chronic diseases feature recurring clinically important events. In addition, however, there
often exists a random variable which is realized upon the occurrence of each event reflecting the
severity of the event, a cost associated with it, or possibly a short term response indicating the
effect of a therapeutic intervention. We describe a novel model for a marked point process which
incorporates a dependence between continuous marks and the event process through the use of a
copula function. The copula formulation ensures that event times can be modeled by any intensity
function for point processes, and any multivariate model can be specified for the continuous
marks. The relative efficiency of joint versus separate analyses of the event times and the marks is
examined through simulation under random censoring. An application to data from a recent trial
in transfusion medicine is given for illustration.
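The dependence structure the model targets can be illustrated with a short simulation, under assumed marginals and a made-up copula correlation: gap times come from an arbitrary intensity model, marks from an arbitrary marginal, and a Gaussian copula ties each mark to its gap.

```python
# Simulate a marked point process whose continuous marks are linked to the
# event gaps through a Gaussian copula (all parameter values are made up).
import numpy as np
from scipy.stats import norm, expon, gamma

rng = np.random.default_rng(2)
rho_c = 0.6                                   # assumed copula correlation
n_events = 200

u = rng.uniform(size=n_events)                # uniforms driving the gap times
z = rho_c * norm.ppf(u) + np.sqrt(1 - rho_c**2) * rng.normal(size=n_events)
v = norm.cdf(z)                               # Gaussian copula links u and v

gaps = expon.ppf(u, scale=2.0)                # any intensity model for events
marks = gamma.ppf(v, a=3.0, scale=1.0)        # any marginal model for marks
times = np.cumsum(gaps)                       # event times of the point process
```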
Limits on WWZ and WW-gamma couplings from p-bar p --> e-nu jj X events at sqrt(s) = 1.8 TeV
We present limits on anomalous WWZ and WW-gamma couplings from a search for
WW and WZ production in p-bar p collisions at sqrt(s)=1.8 TeV. We use p-bar p
-> e-nu jjX events recorded with the D0 detector at the Fermilab Tevatron
Collider during the 1992-1995 run. The data sample corresponds to an integrated
luminosity of 96.0+-5.1 pb^(-1). Assuming identical WWZ and WW-gamma coupling
parameters, the 95% CL limits on the CP-conserving couplings are
-0.33<lambda<0.36 (Delta-kappa=0) and -0.43<Delta-kappa<0.59 (lambda=0), for a
form factor scale Lambda = 2.0 TeV. Limits based on other assumptions are also
presented.
Comment: 11 pages, 2 figures, 2 tables
