A general bound for the limiting distribution of Breitung's statistic
Author's draft, December 21, 2007. We consider the Breitung (2002, Journal of Econometrics 108, 343–363) statistic ξn, which provides a nonparametric test of the I(1) hypothesis. If ξ denotes the limit in distribution of ξn as n → ∞, we prove (Theorem 1) that 0 ≤ ξ ≤ 1/π², a result that holds under any assumption on the underlying random variables. The result is a special case of a more general result (Theorem 3), which we prove using the so-called cotangent method associated with Cauchy's residue theorem.
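For concreteness, Breitung's statistic is a variance-ratio of the partial sums of the series. The sketch below uses the commonly stated form ξn = n⁻² Σ S_t² / Σ y_t² (S_t the partial sums of y); the paper's exact normalization may differ, so treat this as illustrative only:

```python
import numpy as np

def breitung_statistic(y):
    """Variance-ratio statistic xi_n = n^-2 * sum(S_t^2) / sum(y_t^2),
    where S_t are the partial sums of y (illustrative normalization)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    S = np.cumsum(y)  # partial sums S_t
    return S.dot(S) / (n**2 * y.dot(y))

# An I(1) sample path: a random walk driven by i.i.d. normal innovations.
rng = np.random.default_rng(0)
xi = breitung_statistic(np.cumsum(rng.standard_normal(500)))
```

Small values of ξn are evidence for stationarity; under the I(1) null the statistic stays bounded away from zero.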
Seroprevalence of Hepatitis E among Boston Area Travelers, 2009-2010
We determined the prevalence of IgG antibodies to hepatitis E virus (anti-HEV IgG) among travelers
attending Boston-area travel health clinics from 2009 to 2010. Pre-travel samples were available for 1,356 travelers,
with paired pre- and post-travel samples for 450 (33%). Eighty of 1,356 (6%) pre-travel samples were positive
for anti-HEV IgG. Compared with participants who had never lived in nor traveled to a highly endemic
country, the pre-travel prevalence odds ratio (POR) of anti-HEV IgG among participants born in or with a history
of previous travel to a highly endemic country was increased (POR = 4.8, 95% CI = 2.3–10.3 and POR = 2.6,
95% CI = 1.4–5.0, respectively). Among participants with previous travel to a highly endemic country, anti-HEV
IgG was associated with age > 40 years (POR = 3.7, 95% CI = 1.3–10.2) and travel history to ≥ 3 highly endemic
countries (POR = 2.7, 95% CI = 1.2–5.9). Two participants may have contracted HEV infection during their
2009–2010 trip.
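As a reminder of the quantity reported throughout this abstract, a prevalence odds ratio from a 2×2 table is (a/b)/(c/d), with an approximate Wald confidence interval on the log scale. A minimal sketch with made-up counts (not the study's data):

```python
import math

def prevalence_odds_ratio(a, b, c, d):
    """POR from a 2x2 table: a/b = positive/negative among exposed,
    c/d = positive/negative among unexposed."""
    return (a / b) / (c / d)

def wald_ci(a, b, c, d, z=1.96):
    """Approximate 95% confidence interval for the odds ratio,
    computed on the log scale with the usual Wald standard error."""
    por = prevalence_odds_ratio(a, b, c, d)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)
    return math.exp(math.log(por) - z * se), math.exp(math.log(por) + z * se)

# Hypothetical counts (NOT the study's data): 20/80 anti-HEV positive/negative
# among exposed participants, 5/95 among unexposed.
por = prevalence_odds_ratio(20, 80, 5, 95)  # (20/80)/(5/95) = 4.75
lo, hi = wald_ci(20, 80, 5, 95)
```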
ToyArchitecture: Unsupervised Learning of Interpretable Models of the World
Research in Artificial Intelligence (AI) has focused mostly on two extremes:
either on small improvements in narrow AI domains, or on universal theoretical
frameworks which are usually uncomputable, incompatible with theories of
biological intelligence, or lacking practical implementations. The goal of this
work is to combine the main advantages of the two: to follow a big picture
view, while providing a particular theory and its implementation. In contrast
with purely theoretical approaches, the resulting architecture should be usable
in realistic settings, but also form the core of a framework containing all the
basic mechanisms, into which it should be easier to integrate additional
required functionality.
In this paper, we present a novel, purposely simple, and interpretable
hierarchical architecture which combines multiple different mechanisms into one
system: unsupervised learning of a model of the world, learning the influence
of one's own actions on the world, model-based reinforcement learning,
hierarchical planning and plan execution, and symbolic/sub-symbolic integration
in general. The learned model is stored in the form of hierarchical
representations with the following properties: 1) they are increasingly more
abstract, but can retain details when needed, and 2) they are easy to
manipulate in their local and symbolic-like form, thus also allowing one to
observe the learning process at each level of abstraction. On all levels of the
system, the representation of the data can be interpreted in both a symbolic
and a sub-symbolic manner. This enables the architecture to learn efficiently
using sub-symbolic methods and to employ symbolic inference.
Effective Interaction Techniques for the Gamow Shell Model
We apply a contour deformation technique in momentum space to the newly
developed Gamow shell model, and study the drip-line nuclei 5He, 6He and 7He. A
major problem in Gamow shell-model studies of nuclear many-body systems is the
increasing dimensionality of many-body configurations due to the large number
of resonant and complex continuum states necessary to reproduce bound and
resonant state energies. We address this problem using two different effective
operator approaches generalized to the complex momentum plane. These are the
Lee-Suzuki similarity transformation method for complex interactions and the
multi-reference perturbation theory method. The combination of these two
approaches results in a large truncation of the relevant configurations
compared with direct diagonalization. This offers interesting perspectives for
studies of weakly bound systems.
Comment: 18 pages, 17 figs, Revtex
On the need for a nonlinear subscale turbulence term in POD models as exemplified for a high Reynolds number flow over an Ahmed body
We investigate a hierarchy of eddy-viscosity terms in POD Galerkin models to
account for a large fraction of unresolved fluctuation energy. These Galerkin
methods are applied to Large Eddy Simulation data for a flow around the
vehicle-like bluff body called the Ahmed body. This flow poses three challenges for any
reduced-order model: a high Reynolds number, coherent structures with broadband
frequency dynamics, and meta-stable asymmetric base flow states. The Galerkin
models are found to be most accurate with modal eddy viscosities as proposed by
Rempfer & Fasel (1994). Robustness of the model solution with respect to
initial conditions, eddy viscosity values and model order is only achieved for
state-dependent eddy viscosities as proposed by Noack, Morzynski & Tadmor
(2011). Only the POD system with state-dependent modal eddy viscosities can
address all challenges of the flow characteristics. All parameters are
analytically derived from the Navier-Stokes based balance equations with the
available data. We arrive at simple general guidelines for robust and accurate
POD models which can be expected to hold for a large class of turbulent flows.
Comment: Submitted to the Journal of Fluid Mechanics
Putting theory oriented evaluation into practice
Evaluations of gaming simulations and business games as teaching devices are typically end-state driven. This emphasis fails to detect how the simulation being evaluated does or does not bring about its desired consequences. This paper advances the use of a logic model approach, which takes a holistic perspective aiming to include all elements associated with the situation created by a game. The use of the logic model approach is illustrated as applied to Simgame, a board game created for secondary school level business education in six European Union countries.
Cognitive bias modification for social anxiety in adults who stutter: a feasibility study of a randomised controlled trial
Objective: To determine the feasibility and acceptability of a computerised treatment for social anxiety disorder for adults who stutter, including identification of recruitment, retention and completion rates, large cost drivers and selection of the most appropriate outcome measure(s) to inform the design of a future definitive trial. Design: Two-group parallel design (treatment vs placebo), double-blinded feasibility study. Participants: 31 adults who stutter. Intervention: Attention training via an online probe detection task in which the stimuli were images of faces displaying neutral and disgusted expressions. Main outcome measures: Psychological measures: Structured Clinical Interview Global Assessment of Functioning score; Liebowitz Social Anxiety Scale; Social Phobia and Anxiety Inventory; State-Trait Anxiety Inventory; Unhelpful Thoughts and Beliefs about Stuttering. Speech fluency: percent syllables stuttered. Economic evaluation: resource use questionnaire; EuroQol three-dimension questionnaire. Acceptability: Likert Scale questionnaire of experience of trial, acceptability of the intervention and randomisation procedure. Results: Feasibility of the recruitment strategy was demonstrated. Participant feedback indicated that the intervention and definitive trial, including randomisation, would be acceptable to adults who stutter. Of the 31 participants who were randomised, 25 provided data at all three data collection points. Conclusions: The feasibility study informed components of the intervention. Modifications to the design are needed before a definitive trial can be undertaken. Trial registration number: ISRCTN55065978.
Kaon photoproduction: background contributions, form factors and missing resonances
The photoproduction p(gamma, K+)Lambda process is studied within a
field-theoretic approach. It is shown that the background contributions
constitute an important part of the reaction dynamics. We compare predictions
obtained with three plausible techniques for dealing with these background
contributions. It appears that the extracted resonance parameters drastically
depend on the applied technique. We investigate the implications of the
corrections to the functional form of the hadronic form factor in the contact
term, recently suggested by Davidson and Workman (Phys. Rev. C 63, 025210). The
role of background contributions and hadronic form factors for the
identification of the quantum numbers of "missing" resonances is discussed.
Comment: 11 pages, 7 eps figures, submitted to Phys. Rev.
The context, influences and challenges for undergraduate nurse clinical education: Continuing the dialogue
Introduction – Approaches to clinical education are highly diverse and becoming increasingly complex to sustain in complex milieus.
Objective – To identify the influences and challenges of providing nurse clinical education in the undergraduate setting and to illustrate emerging solutions.
Method – A discursive exploration into the broad and varied body of evidence, including peer-reviewed and grey literature.
Discussion – Internationally, enabling undergraduate clinical learning opportunities faces a range of challenges. These can be illustrated under two broad themes: (1) legacies from the past and the inherent features of nurse education, and (2) challenges of the present, including population changes, workforce changes, and the disconnection between the health and education sectors. Responses to these challenges are triggering the emergence of novel approaches, such as collaborative models.
Conclusion – Ongoing challenges in providing accessible, effective and quality clinical learning experiences are apparent.
Numerical models of collisions between core-collapse supernovae and circumstellar shells
Recent observations of luminous Type IIn supernovae (SNe) provide compelling
evidence that massive circumstellar shells surround their progenitors. In this
paper we investigate how the properties of such shells influence the SN
lightcurve by conducting numerical simulations of the interaction between an
expanding SN and a circumstellar shell ejected a few years prior to core
collapse. Our parameter study explores how the emergent luminosity depends on a
range of circumstellar shell masses, velocities, geometries, and wind mass-loss
rates, as well as variations in the SN mass and energy. We find that the shell
mass is the most important parameter, in the sense that higher shell masses (or
higher ratios of M_shell/M_SN) lead to higher peak luminosities and higher
efficiencies in converting shock energy into visual light. Lower mass shells
can also cause high peak luminosities if the shell is slow or if the SN ejecta
are very fast, but only for a short time. Sustaining a high luminosity for
durations of more than 100 days requires massive circumstellar shells of order
10 M_sun or more. This reaffirms previous comparisons between pre-SN shells and
shells produced by giant eruptions of luminous blue variables (LBVs), although
the physical mechanism responsible for these outbursts remains uncertain. The
lightcurve shape and observed shell velocity can help diagnose the approximate
size and density of the circumstellar shell, and it may be possible to
distinguish between spherical and bipolar shells with multi-wavelength
lightcurves. These models are merely illustrative. One can, of course, achieve
even higher luminosities and longer duration light curves from interaction by
increasing the explosion energy and shell mass beyond values adopted here.
Comment: Accepted for publication in MNRAS. Tables of numerical results (SN lightcurves and velocities) to be published online.
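The stated trend that higher M_shell/M_SN ratios give higher conversion efficiencies can be motivated by a toy momentum-conserving (fully inelastic) collision, in which the fraction of ejecta kinetic energy dissipated is M_shell/(M_ej + M_shell). A rough sketch of this order-of-magnitude argument (not the paper's radiation-hydrodynamics model):

```python
def dissipated_fraction(m_ej, m_sh):
    """Fraction of ejecta kinetic energy dissipated when ejecta of mass m_ej
    sweep up a stationary shell of mass m_sh in a fully inelastic collision.
    Momentum conservation gives merged velocity v' = m_ej*v/(m_ej + m_sh),
    so the lost kinetic-energy fraction is m_sh/(m_ej + m_sh)."""
    return m_sh / (m_ej + m_sh)

# Higher M_shell/M_SN ratios dissipate (and can radiate) more of the
# ejecta kinetic energy; masses are in arbitrary consistent units.
low  = dissipated_fraction(m_ej=10.0, m_sh=1.0)    # 1/11
high = dissipated_fraction(m_ej=10.0, m_sh=10.0)   # 1/2
```

Only part of the dissipated energy emerges as visual light, but the scaling with shell mass carries over to the simulated lightcurves.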
