High-resolution imaging of planet host candidates. A comprehensive comparison of different techniques
The Kepler mission has discovered thousands of planet candidates. Some of these have already been discarded; more than 200 have been confirmed by follow-up observations, and several hundred have been validated. Most of them, however, still await confirmation. Priorities (in terms of the probability that a candidate is a real planet) must therefore be established for
subsequent observations. The motivation of this work is to provide a set of
isolated (good) host candidates to be further tested by other techniques. We
identify close companions of the candidates that could have contaminated the
light curve of the planet host. We used the AstraLux North instrument at the 2.2 m telescope of the Calar Alto Observatory to obtain
diffraction-limited images of 174 Kepler objects of interest. The lucky-imaging
technique used in this work is compared to other AO and speckle imaging
observations of Kepler planet host candidates. We define a new parameter, the blended source confidence level (BSC), to assess the probability that an object hosts a blended, undetected eclipsing binary capable of producing the detected transit. We find that 67.2% of the observed Kepler hosts are isolated within our detectability limits, and 32.8% have at least one visual companion at angular separations below 6 arcsec. We find close companions (below 3 arcsec) for 17.2% of the sample. The planet properties of this sample of
non-isolated hosts are revised. We report one possible S-type binary
(KOI-3158). We also report three possible false positives (KOIs 1230.01,
3649.01, and 3886.01) due to the presence of close companions. The BSC
parameter is calculated for all the isolated targets and compared to both the
value prior to any high-resolution image and, when possible, to observations
from previous high-spatial resolution surveys in the Kepler sample.
Comment: Accepted for publication in A&A on April 29, 2014; 32 pages, 11 figures, 11 tables
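The abstract only names the BSC, so the sketch below is a rough, hypothetical illustration of the kind of calculation such a parameter involves: given a transit depth and the contrast curve from a high-resolution image, estimate the chance that a blended eclipsing binary too faint to be detected could still produce the transit. The background-star density model and all names here are placeholders, not the authors' actual definition.

import numpy as np

def blend_probability(depth, contrast_sep, contrast_dmag, star_density_per_mag,
                      max_sep=6.0, max_eclipse=0.5):
    """Toy estimate of the chance that an undetected blended eclipsing binary
    could mimic a transit of the given depth.

    depth                : observed transit depth (fraction of flux)
    contrast_sep         : angular separations of the contrast curve [arcsec]
    contrast_dmag        : detectable magnitude difference at each separation
    star_density_per_mag : callable, expected background stars per arcsec^2 per
                           magnitude at a given delta-magnitude (hypothetical model)
    """
    # Faintest companion (in delta-mag) that could still produce the transit,
    # assuming the blended binary eclipses by at most `max_eclipse` of its flux.
    dmag_max = 2.5 * np.log10(max_eclipse / depth)

    # Expected number of undetected-but-dangerous blends: integrate the density
    # over each annulus, between the detection limit and dmag_max.
    n_expected = 0.0
    for i in range(len(contrast_sep) - 1):
        r_in = contrast_sep[i]
        r_out = min(contrast_sep[i + 1], max_sep)
        if r_in >= max_sep:
            break
        area = np.pi * (r_out**2 - r_in**2)   # annulus area [arcsec^2]
        lo = contrast_dmag[i]                  # detection limit at this separation
        if dmag_max > lo:
            dmags = np.linspace(lo, dmag_max, 20)
            n_expected += area * np.trapz(star_density_per_mag(dmags), dmags)

    # Poisson step: probability of at least one such blend.
    return 1.0 - np.exp(-n_expected)

# Example with made-up numbers: a 500 ppm transit and a flat toy density model.
sep = np.array([0.2, 0.5, 1.0, 2.0, 3.0, 6.0])
dmag = np.array([4.0, 6.0, 7.5, 8.5, 9.0, 9.0])
print(blend_probability(5e-4, sep, dmag, lambda m: 1e-3 * np.ones_like(m)))

The final step simply converts the expected number of dangerous blends into the probability of at least one being present.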
Determination of the Joint Confidence Region of Optimal Operating Conditions in Robust Design by Bootstrap Technique
Robust design has been widely recognized as a leading method in reducing
variability and improving quality. The engineering statistics literature focuses mainly on finding "point estimates" of the optimum operating conditions for robust design, and various procedures for calculating such point estimates have been proposed. Although point estimation is important for continuous quality improvement, the immediate question is "how accurate are these optimum operating conditions?" The answer is to consider interval estimation for a single variable or
joint confidence regions for multiple variables.
In this paper, with the help of the bootstrap technique, we develop
procedures for obtaining joint "confidence regions" for the optimum operating
conditions. Two different procedures using Bonferroni and multivariate normal
approximation are introduced. The proposed methods are illustrated and
substantiated using a numerical example.
Comment: Two tables, three figures
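The abstract gives no implementation details, so the following is a minimal sketch of the two approaches it names, assuming a generic resampling setup: bootstrap replicates of the optimum operating conditions are summarized either by Bonferroni-adjusted per-coordinate percentile intervals or by a multivariate normal (ellipsoidal) approximation. The estimator passed in is a placeholder for the actual robust-design optimization.

import numpy as np
from scipy import stats

def bootstrap_joint_region(data, estimate_optimum, n_boot=2000, alpha=0.05, seed=0):
    """Bootstrap the optimum operating conditions and summarize two joint regions.

    data             : (n, ...) array of experimental observations
    estimate_optimum : callable mapping a resampled data set to a vector of
                       optimum operating conditions (placeholder for the real
                       robust-design optimization step)
    """
    rng = np.random.default_rng(seed)
    n = len(data)
    boots = np.array([estimate_optimum(data[rng.integers(0, n, n)])
                      for _ in range(n_boot)])
    p = boots.shape[1]

    # (1) Bonferroni: per-coordinate percentile intervals at level alpha / p,
    #     whose Cartesian product gives a conservative joint (1 - alpha) box.
    a = alpha / p
    box = np.percentile(boots, [100 * a / 2, 100 * (1 - a / 2)], axis=0).T

    # (2) Multivariate normal approximation: an ellipsoid from the bootstrap
    #     mean and covariance, with a chi-square Mahalanobis radius.
    mean = boots.mean(axis=0)
    cov = np.cov(boots, rowvar=False)
    radius2 = stats.chi2.ppf(1 - alpha, df=p)
    return box, (mean, cov, radius2)

# Toy usage: the "optimum" of a noisy two-factor response, estimated by the mean.
rng = np.random.default_rng(1)
toy_data = rng.normal([1.0, -0.5], 0.3, size=(50, 2))
box, ellipsoid = bootstrap_joint_region(toy_data, lambda d: d.mean(axis=0))
print(box)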
Application of Bayesian model averaging to measurements of the primordial power spectrum
Cosmological parameter uncertainties are often stated assuming a particular
model, neglecting the model uncertainty, even when Bayesian model selection is
unable to identify a conclusive best model. Bayesian model averaging is a
method for assessing parameter uncertainties in situations where there is also
uncertainty in the underlying model. We apply model averaging to the estimation
of the parameters associated with the primordial power spectra of curvature and
tensor perturbations. We use CosmoNest and MultiNest to compute the model evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR, BOOMERanG, and CBI, plus large-scale structure data from the SDSS DR7. We find that the model-averaged 95% credible interval for the spectral index using all of the data is 0.940 < n_s < 1.000, where n_s is specified at a pivot scale of 0.015 Mpc^{-1}. For the tensors, model averaging can tighten the credible upper limit, depending on prior assumptions.
Comment: 7 pages with 7 figures included
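The averaging step itself is straightforward; as a hedged sketch with made-up evidences and samples standing in for CosmoNest/MultiNest output, the posterior for n_s under each model is weighted by that model's evidence (times any model prior), and the credible interval is read off the resulting mixture.

import numpy as np

def model_averaged_interval(samples_by_model, log_evidence, prior=None, cred=0.95, seed=0):
    """Combine per-model posterior samples of a parameter into a model-averaged
    credible interval, weighting each model by its evidence and model prior."""
    rng = np.random.default_rng(seed)
    logZ = np.asarray(log_evidence, dtype=float)
    prior = np.ones(len(logZ)) if prior is None else np.asarray(prior, dtype=float)
    logw = logZ + np.log(prior)
    w = np.exp(logw - logw.max())
    w /= w.sum()                                    # posterior model probabilities

    # Draw from the mixture: allocate draws to models, then resample each model's chain.
    counts = rng.multinomial(20000, w)
    draws = np.concatenate([rng.choice(s, size=c)
                            for s, c in zip(samples_by_model, counts)])
    lo, hi = np.percentile(draws, [100 * (1 - cred) / 2, 100 * (1 + cred) / 2])
    return lo, hi, w

# Toy example: n_s samples from a Harrison-Zel'dovich-like model (n_s fixed at 1)
# and a tilted model, with made-up log-evidences.
ns_hz = np.full(5000, 1.0)
ns_tilted = np.random.default_rng(1).normal(0.965, 0.014, 5000)
print(model_averaged_interval([ns_hz, ns_tilted], log_evidence=[-0.3, 0.0]))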
Routine Crime in Exceptional Times: The Impact of the 2002 Winter Olympics on Citizen Demand for Police Services
Despite their rich theoretical and practical importance, criminologists have paid scant attention to patterns of crime and responses to crime during exceptional events. Throughout the world, large-scale political, social, economic, cultural, and sporting events have become commonplace. Disasters such as blackouts, hurricanes, tornadoes, and tsunamis present similar opportunities. Such events often tax the capacity of jurisdictions to provide safety and security in response to the exceptional event, as well as to meet “routine” public safety needs. This article examines “routine” crime, as measured by calls for police service, official crime reports, and police arrests, in Salt Lake City before, during, and after the 2002 Olympic Games. The analyses suggest that while a rather benign demographic among attendees and the presence of large numbers of social control agents might have been expected to decrease calls for police service for minor crime, such calls actually increased in Salt Lake City during this period. The implications of these findings are considered for theories of routine activities, as well as systems capacity.
Synthetic LISA: Simulating Time Delay Interferometry in a Model LISA
We report on three numerical experiments on the implementation of Time-Delay
Interferometry (TDI) for LISA, performed with Synthetic LISA, a C++/Python
package that we developed to simulate the LISA science process at the level of
scientific and technical requirements. Specifically, we study the laser-noise
residuals left by first-generation TDI when the LISA armlengths have a
realistic time dependence; we characterize the armlength-measurement accuracies needed for effective laser-noise cancellation in both
first- and second-generation TDI; and we estimate the quantization and
telemetry bitdepth needed for the phase measurements. Synthetic LISA generates
synthetic time series of the LISA fundamental noises, as filtered through all
the TDI observables; it also provides a streamlined module to compute the TDI
responses to gravitational waves according to a full model of TDI, including
the motion of the LISA array and the temporal and directional dependence of the
armlengths. We discuss the theoretical model that underlies the simulation, its
implementation, and its use in future investigations on system characterization
and data-analysis prototyping for LISA.
Comment: 18 pages, 14 EPS figures, REVTeX 4. Accepted PRD version. See http://www.vallis.org/syntheticlisa for information on the Synthetic LISA software package
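To make the laser-noise-cancellation idea concrete, here is a toy illustration (not Synthetic LISA's actual implementation) of an unequal-arm Michelson-style combination for a static configuration with integer-sample delays: both arm readouts carry the same laser phase noise with different round-trip delays, and cross-delaying and differencing them cancels that noise exactly while the secondary instrument noise survives.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
laser = rng.normal(size=n)                 # common laser phase noise p(t)
noise1 = 1e-3 * rng.normal(size=n)         # secondary (instrument) noise in arm 1
noise2 = 1e-3 * rng.normal(size=n)         # secondary (instrument) noise in arm 2

d1, d2 = 137, 211                          # round-trip delays 2*L1, 2*L2 (in samples)

def delay(x, k):
    """Shift a series by k samples (toy stand-in for fractional-delay interpolation)."""
    return np.roll(x, k)

# Phase readouts of the two arms: delayed minus prompt laser noise, plus instrument noise.
y1 = delay(laser, d1) - laser + noise1
y2 = delay(laser, d2) - laser + noise2

# First-generation Michelson-style TDI combination: cross-delay and difference.
X = (y1 - delay(y1, d2)) - (y2 - delay(y2, d1))

print("raw readout rms:", y1.std())
print("TDI X rms      :", X.std())         # ~ instrument-noise level; laser noise cancelled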
A management architecture for active networks
In this paper we present an architecture for network and applications management, which is based on the Active Networks paradigm and shows the advantages of network programmability. The stimulus to develop this architecture arises from an actual need to manage a cluster of active nodes, where it is often required to redeploy network assets and modify node connectivity. In our architecture, a remote front-end of the managing entity allows the operator to design new network topologies, to check the status of the nodes, and to configure them. Moreover, the proposed framework allows the operator to explore an active network, monitor the active applications, query each node, and install programmable traps. In order to take advantage of Active Networks technology, we introduce active SNMP-like MIBs and agents, which are dynamic and programmable. The programmable management agents make tracing distributed applications a feasible task. We propose a general framework that can inter-operate with any active execution environment. In this framework, both the manager and the monitor front-ends communicate with an active node (the Active Network Access Point) through the XML language. A gateway service translates the queries from XML to an active packet language and injects the code into the network. We demonstrate the implementation of an active network gateway for PLAN (Packet Language for Active Networks) in a testbed of forty active nodes. Finally, we discuss an application of the active management architecture to detect the causes of network failures by tracing network events in time.
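The abstract does not specify the XML query format or the PLAN code it maps to, so the following is a purely hypothetical sketch of the gateway's translation step: a small XML management request is parsed and rendered into a placeholder active-packet payload; the element names and output format are invented for illustration only.

import xml.etree.ElementTree as ET

# Hypothetical XML management request sent by the front-end to the
# Active Network Access Point (element names are invented for illustration).
request = """
<mgmt-request op="query">
  <target node="node-07"/>
  <variable name="activeApplications"/>
</mgmt-request>
"""

def xml_to_active_packet(xml_text):
    """Translate a management request into a toy active-packet payload string
    (a stand-in for the PLAN code the real gateway would inject)."""
    root = ET.fromstring(xml_text)
    op = root.get("op")
    node = root.find("target").get("node")
    var = root.find("variable").get("name")
    # A real gateway would emit PLAN code here; this is only a placeholder format.
    return f'(inject {op} (node "{node}") (mib-var "{var}"))'

print(xml_to_active_packet(request))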
A Bayesian spatio-temporal model of panel design data: airborne particle number concentration in Brisbane, Australia
This paper outlines a methodology for semi-parametric spatio-temporal
modelling of data which is dense in time but sparse in space, obtained from a
split panel design, the most feasible approach to covering space and time with
limited equipment. The data are hourly averaged particle number concentration
(PNC) and were collected as part of the Ultrafine Particles from Transport
Emissions and Child Health (UPTECH) project. Two weeks of continuous
measurements were taken at each of a number of government primary schools in
the Brisbane Metropolitan Area. The monitoring equipment was taken to each
school sequentially. The school data are augmented by data from long-term
monitoring stations at three locations in Brisbane, Australia.
Fitting the model helps describe the spatial and temporal variability at a
subset of the UPTECH schools and the long-term monitoring sites. The temporal
variation is modelled hierarchically with penalised random walk terms, one
common to all sites and a term accounting for the remaining temporal trend at
each site. Parameter estimates and their uncertainty are computed in a
computationally efficient approximate Bayesian inference environment, R-INLA.
The temporal part of the model explains daily and weekly cycles in PNC at the
schools, which can be used to estimate the exposure of school children to
ultrafine particles (UFPs) emitted by vehicles. At each school and long-term
monitoring site, peaks in PNC can be attributed to the morning and afternoon
rush hour traffic and new particle formation events. The spatial component of
the model describes the school to school variation in mean PNC at each school
and within each school ground. It is shown how the spatial model can be
expanded to identify spatial patterns at the city scale with the inclusion of
more spatial locations.
Comment: Draft of this paper presented at ISBA 2012 as a poster; part of the UPTECH project
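The verbal description of the hierarchy above can be written schematically as follows; this is a plausible reading of the structure described (a common penalised random-walk trend, a site-specific residual trend, and site-level effects carrying the spatial variation), not the authors' exact specification:

\begin{align*}
  y_{s,t} &= \mu_s + f(t) + g_s(t) + \varepsilon_{s,t},
  & \varepsilon_{s,t} &\sim \mathcal{N}(0, \sigma^2_{\varepsilon}) \\
  f(t)    &: \text{penalised random-walk trend common to all sites} \\
  g_s(t)  &: \text{penalised random-walk trend specific to site } s \\
  \mu_s   &: \text{site-level mean carrying the school-to-school (spatial) variation}
\end{align*}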
Kepler-91b: a planet at the end of its life. Planet and giant host star properties via light-curve variations
The evolution of planetary systems is intimately linked to the evolution of
their host star. Our understanding of the whole planetary evolution process is
based on the large planet diversity observed so far. To date, only a few tens of planets have been discovered orbiting stars ascending the Red Giant Branch. Although several theories have been proposed, the question of how planets die remains open due to small-number statistics. In this work we study the
giant star Kepler-91 (KOI-2133) in order to determine the nature of a
transiting companion. This system was detected by the Kepler Space Telescope.
However, its planetary confirmation is needed. We confirm the planetary nature of the object transiting the star Kepler-91 by deriving its planetary mass and radius. Asteroseismic analysis provides the stellar radius and mass. We find that its eccentric orbit brings the planet very close to the stellar atmosphere at pericenter. Kepler-91b could represent the stage prior to the planet engulfment recently detected for BD+48 740. Our estimates show that Kepler-91b will be swallowed by its host star in less than 55 Myr. Among the confirmed planets around giant stars, this is the planetary-mass body closest to its host star. At pericenter passage, the star covers around 10% of the sky as seen from the planet. The planetary atmosphere seems to be inflated, probably due to the high stellar irradiation.
Comment: 21 pages, 8 tables, and 11 figures
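As a back-of-envelope check on the quoted ~10% sky coverage, using only the simple spherical-cap approximation (which may differ from the exact geometry used in the paper): a stellar disc of angular radius \(\theta\) seen from the planet covers a solid-angle fraction

\[
  \frac{\Omega}{4\pi} = \frac{1 - \cos\theta}{2},
  \qquad
  \frac{1 - \cos\theta}{2} \approx 0.10
  \;\Rightarrow\;
  \theta \approx 37^\circ ,
\]

i.e. an apparent stellar diameter of roughly 74 degrees under this crude approximation.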
Baseline T cell dysfunction by single cell network profiling in metastatic breast cancer patients.
Background: We previously reported the results of a multicentric prospective randomized trial of chemo-refractory metastatic breast cancer patients testing the efficacy of two doses of TGFβ blockade during radiotherapy. Despite a lack of objective responses to the combination, patients who received the higher dose of the TGFβ-blocking antibody fresolimumab had better overall survival than those assigned to the lower dose (hazard ratio of 2.73, p = 0.039). They also demonstrated improved peripheral blood mononuclear cell (PBMC) counts and an increase in the CD8 central memory pool. We performed additional analysis on residual PBMC using single cell network profiling (SCNP).
Methods: The original trial randomized metastatic breast cancer patients to either 1 or 10 mg/kg of fresolimumab, every 3 weeks for 5 cycles, combined with radiotherapy to a metastatic site at weeks 1 and 7 (22.5 Gy given in 3 doses of 7.5 Gy). Trial immune monitoring results were previously reported. In 15 patients with available residual blood samples, additional functional studies were performed and compared with data obtained in parallel from seven healthy female donors (HD): SCNP was applied to analyze T cell receptor (TCR)-modulated signaling via CD3 and CD28 crosslinking and measurement of evoked phosphorylation of AKT and ERK in CD4 and CD8 T cell subsets defined by PD-1 expression.
Results: At baseline, a significantly higher level of expression (p < 0.05) of PD-L1 was identified in patient monocytes compared to HD. TCR modulation revealed dysfunction of circulating T cells in patient baseline samples as compared to HD, and this was more pronounced in PD-1+ cells. Treatment with radiotherapy and fresolimumab did not resolve this dysfunctional signaling. However, in vitro PD-1 blockade enhanced TCR signaling in patient PD-1+ T cells and not in PD-1- T cells or in PD-1+ T cells from HD.
Conclusions: Functional T cell analysis suggests that baseline T cell functionality is hampered in metastatic breast cancer patients, at least in part mediated by the PD-1 signaling pathway. These preliminary data support the rationale for investigating the possible beneficial effects of adding PD-1 blockade to improve responses to TGFβ blockade and radiotherapy.
Trial registration: NCT01401062
State and dynamical parameter estimation for open quantum systems
Following the evolution of an open quantum system requires full knowledge of
its dynamics. In this paper we consider open quantum systems for which the
Hamiltonian is "uncertain". In particular, we treat in detail a simple system
similar to that considered by Mabuchi [Quant. Semiclass. Opt. 8, 1103 (1996)]:
a radiatively damped atom driven by an unknown Rabi frequency (as
would occur for an atom at an unknown point in a standing light wave). By
measuring the environment of the system, knowledge about the system state, and
about the uncertain dynamical parameter, can be acquired. We find that these
two sorts of knowledge acquisition (quantified by the posterior distribution for the Rabi frequency and by the conditional purity of the system, respectively) are quite distinct processes, which are not strongly correlated. Also, the quality and quantity of knowledge gain depend strongly on the type of monitoring scheme. We compare five different detection schemes (direct, adaptive, homodyne detection of each of the two quadratures, and heterodyne) using four different measures of the knowledge gain (Shannon information about the Rabi frequency, variance in the Rabi frequency, long-time system purity, and short-time system purity).
Comment: 14 pages, 18 figures
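For orientation, the quantities named above can be written down generically, writing the unknown Rabi frequency as \(\Omega\) (a standard notation choice) and the measurement record as \(R\); these are common conventions rather than necessarily the paper's exact definitions:

\[
  P(\Omega \mid R) \;\propto\; P(R \mid \Omega)\, P(\Omega),
\]
\[
  I(R) = \int d\Omega\; P(\Omega \mid R)\,
         \log_2\!\frac{P(\Omega \mid R)}{P(\Omega)},
  \qquad
  V(R) = \langle \Omega^2 \rangle_R - \langle \Omega \rangle_R^{2},
  \qquad
  p(t) = \mathrm{Tr}\!\left[\rho_R(t)^{2}\right],
\]

where \(\rho_R(t)\) is the system state conditioned on the record; the conditional purity \(p(t)\) is evaluated at short and long times for the last two measures.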