Synthetic LISA: Simulating Time Delay Interferometry in a Model LISA
We report on three numerical experiments on the implementation of Time-Delay
Interferometry (TDI) for LISA, performed with Synthetic LISA, a C++/Python
package that we developed to simulate the LISA science process at the level of
scientific and technical requirements. Specifically, we study the laser-noise
residuals left by first-generation TDI when the LISA armlengths have a
realistic time dependence; we characterize the armlength-measurements
accuracies that are needed to have effective laser-noise cancellation in both
first- and second-generation TDI; and we estimate the quantization and
telemetry bitdepth needed for the phase measurements. Synthetic LISA generates
synthetic time series of the LISA fundamental noises, as filtered through all
the TDI observables; it also provides a streamlined module to compute the TDI
responses to gravitational waves according to a full model of TDI, including
the motion of the LISA array and the temporal and directional dependence of the
armlengths. We discuss the theoretical model that underlies the simulation, its
implementation, and its use in future investigations on system characterization
and data-analysis prototyping for LISA.
Comment: 18 pages, 14 EPS figures, REVTeX 4. Accepted PRD version. See http://www.vallis.org/syntheticlisa for information on the Synthetic LISA software package
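The laser-noise cancellation that TDI performs can be illustrated in a few lines of code. The sketch below is not part of the Synthetic LISA package; it assumes static, unequal armlengths, integer-sample delays, and laser phase noise as the only noise source, and shows how the first-generation unequal-arm Michelson combination X removes that noise.

```python
import numpy as np

# Minimal sketch (not Synthetic LISA itself): laser-noise cancellation by the
# first-generation unequal-arm Michelson TDI combination X, assuming static
# armlengths and integer-sample delays.  Delay values are illustrative.
rng = np.random.default_rng(0)
n = 100_000
C = rng.standard_normal(n)    # laser phase noise time series (white, illustrative)

d2, d3 = 160, 168             # round-trip delays 2*L_2/c and 2*L_3/c in samples (illustrative)

def delay(x, k):
    """Shift a series by k samples, i.e. x(t - k*dt), zero-padding the start."""
    return np.concatenate([np.zeros(k), x[:x.size - k]])

# Round-trip phase measurements along the two arms, laser noise only
s1 = delay(C, d2) - C         # arm 2 round trip
s2 = delay(C, d3) - C         # arm 3 round trip

# First-generation TDI X: re-delay each arm's signal by the other arm's round trip
X = (s1 - delay(s1, d3)) - (s2 - delay(s2, d2))

print("rms laser noise in a single arm:", s1.std())
print("rms laser noise left in X      :", X[d2 + d3:].std())  # ~0 up to float rounding
```

With time-dependent armlengths, as studied in the paper, this cancellation is only approximate for first-generation TDI, which is what motivates the second-generation combinations.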
DC-Prophet: Predicting Catastrophic Machine Failures in DataCenters
When will a server fail catastrophically in an industrial datacenter? Is it
possible to forecast these failures so preventive actions can be taken to
increase the reliability of a datacenter? To answer these questions, we have
studied what are probably the largest, publicly available datacenter traces,
containing more than 104 million events from 12,500 machines. Among these
samples, we observe and categorize three types of machine failures, all of
which are catastrophic and may lead to information loss, or even worse,
reliability degradation of a datacenter. We further propose DC-Prophet, a
two-stage framework based on One-Class Support Vector Machine and Random
Forest. DC-Prophet extracts surprising patterns and accurately predicts the
next failure of a machine. Experimental results show that DC-Prophet achieves
an AUC of 0.93 in predicting the next machine failure, and an F3-score of 0.88
(out of 1). On average, DC-Prophet outperforms other classical machine learning
methods by 39.45% in F3-score.
Comment: 13 pages, 5 figures, accepted by 2017 ECML PKDD
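As a rough illustration of the two-stage idea (not the paper's actual features or traces), the scikit-learn sketch below trains a One-Class SVM on healthy machines to score surprising patterns and feeds that score, together with the raw features, to a Random Forest; the F3-score uses beta = 3, which weights recall more heavily than precision.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import fbeta_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Sketch of a two-stage OCSVM -> Random Forest pipeline in the spirit of
# DC-Prophet; features and labels are synthetic placeholders, not the
# Google-trace features used in the paper.
rng = np.random.default_rng(42)
X = rng.standard_normal((5000, 8))                                     # per-machine usage features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.standard_normal(5000) > 2.5).astype(int)  # "next failure"

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Stage 1: One-Class SVM trained on healthy machines flags surprising patterns.
ocsvm = OneClassSVM(nu=0.05, gamma="scale").fit(X_tr[y_tr == 0])
anom_tr = ocsvm.decision_function(X_tr).reshape(-1, 1)
anom_te = ocsvm.decision_function(X_te).reshape(-1, 1)

# Stage 2: Random Forest predicts the next failure from the raw features
# augmented with the stage-1 anomaly score.
rf = RandomForestClassifier(n_estimators=200, random_state=0)
rf.fit(np.hstack([X_tr, anom_tr]), y_tr)
proba = rf.predict_proba(np.hstack([X_te, anom_te]))[:, 1]

print("AUC:", roc_auc_score(y_te, proba))
print("F3 :", fbeta_score(y_te, proba > 0.5, beta=3))  # F3 weights recall heavily
```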
"SMEs, Information Risk Management, and ROI"
Recent research in the area of standards accreditation has shown that the rate of take-up of ISO27001 (Information Security Management) by organisations has been disappointing in many Western countries, compared to the picture emerging in Asia and to the rollout of previous international standards that relate to information management, such as ISO9001.
In this paper, a researcher and a practitioner from the UK investigate possible reasons for a lesser interest in pursuing certification for organisational Information Security Management Systems (ISMS) across Western countries. They also share their perceptions and concerns that the current attitudes of UK small businesses towards complying with standards and legislation mean that they may be taking unnecessary risks with their corporate and personal data, under the possibly misguided notion that other priorities are more important during these current recessionary times.
The authors use an economics-based approach in proposing a solution to the problem. On the one hand they review the research that has provided methods for putting a figure on the value of corporate and personal data in larger organisations, and for applying the principles of managing information risk as appropriate to SMEs. On the other hand they look at economics-related issues such as market pressure, insurance, outsourcing, and the legal and regulatory matters regarding privacy of personal data. The result provides a case for showing SMEs that, apart from the moral matter of being “good for the business”, there are very sound economic reasons for an SME developing an ISMS and getting ISO27001 certified.
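A minimal example of the kind of economics-based argument referred to above is a return-on-security-investment calculation built on annualised loss expectancy; every figure below is an illustrative placeholder rather than a number from the paper.

```python
# Illustrative annualised-loss-expectancy / ROSI arithmetic of the kind an
# economics-based case for an SME ISMS might use; every figure below is a
# made-up placeholder, not data from the paper.
asset_value = 50_000                 # estimated value of corporate/personal data (GBP)
exposure_factor = 0.6                # fraction of that value lost in a typical breach
single_loss_expectancy = asset_value * exposure_factor

annual_rate_of_occurrence = 0.3      # expected breaches per year without controls
ale_before = single_loss_expectancy * annual_rate_of_occurrence

control_cost = 2_000                 # yearly cost of running the ISMS / certification
mitigated_rate = 0.1                 # expected breaches per year with controls
ale_after = single_loss_expectancy * mitigated_rate

rosi = (ale_before - ale_after - control_cost) / control_cost
print(f"ALE before: {ale_before:,.0f}  ALE after: {ale_after:,.0f}  ROSI: {rosi:.0%}")
```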
Application of Bayesian model averaging to measurements of the primordial power spectrum
Cosmological parameter uncertainties are often stated assuming a particular
model, neglecting the model uncertainty, even when Bayesian model selection is
unable to identify a conclusive best model. Bayesian model averaging is a
method for assessing parameter uncertainties in situations where there is also
uncertainty in the underlying model. We apply model averaging to the estimation
of the parameters associated with the primordial power spectra of curvature and
tensor perturbations. We use CosmoNest and MultiNest to compute the model
evidences and posteriors, using cosmic microwave background data from WMAP, ACBAR,
BOOMERanG and CBI, plus large-scale structure data from the SDSS DR7. We find
that the model-averaged 95% credible interval for the spectral index using all
of the data is 0.940 < n_s < 1.000, where n_s is specified at a pivot scale
0.015 Mpc^{-1}. For the tensors, model averaging can tighten the credible upper
limit, depending on prior assumptions.
Comment: 7 pages with 7 figures included
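The model-averaging step itself is simple once per-model evidences and posterior samples are available: each model's posterior is weighted by its evidence times its model prior. The sketch below uses toy numbers in place of the CosmoNest/MultiNest outputs.

```python
import numpy as np

# Sketch of Bayesian model averaging over n_s: each model contributes its
# posterior samples weighted by (evidence * model prior).  The evidences and
# samples are toy placeholders standing in for CosmoNest/MultiNest output.
rng = np.random.default_rng(1)

models = {
    # name: (log-evidence, posterior samples of n_s) -- illustrative numbers only
    "HZ (n_s = 1)":         (-0.0, np.full(20_000, 1.0)),
    "power law (n_s free)": (-1.2, rng.normal(0.963, 0.014, 20_000)),
}

logZ = np.array([v[0] for v in models.values()])
prior = np.ones(len(models)) / len(models)            # equal model priors
post_prob = np.exp(logZ - logZ.max()) * prior
post_prob /= post_prob.sum()                          # P(M_i | D)

# Build the model-averaged posterior by resampling each model in proportion
# to its posterior probability, then read off a 95% credible interval.
n_draw = 100_000
counts = rng.multinomial(n_draw, post_prob)
avg_samples = np.concatenate(
    [rng.choice(v[1], size=c) for (v, c) in zip(models.values(), counts)]
)
lo, hi = np.percentile(avg_samples, [2.5, 97.5])
print({name: round(p, 3) for name, p in zip(models, post_prob)})
print(f"model-averaged 95% interval: {lo:.3f} < n_s < {hi:.3f}")
```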
A Bayesian spatio-temporal model of panel design data: airborne particle number concentration in Brisbane, Australia
This paper outlines a methodology for semi-parametric spatio-temporal
modelling of data which is dense in time but sparse in space, obtained from a
split panel design, the most feasible approach to covering space and time with
limited equipment. The data are hourly averaged particle number concentration
(PNC) and were collected as part of the Ultrafine Particles from Transport
Emissions and Child Health (UPTECH) project. Two weeks of continuous
measurements were taken at each of a number of government primary schools in
the Brisbane Metropolitan Area. The monitoring equipment was taken to each
school sequentially. The school data are augmented by data from long term
monitoring stations at three locations in Brisbane, Australia.
Fitting the model helps describe the spatial and temporal variability at a
subset of the UPTECH schools and the long-term monitoring sites. The temporal
variation is modelled hierarchically with penalised random walk terms, one
common to all sites and a term accounting for the remaining temporal trend at
each site. Parameter estimates and their uncertainty are computed in a
computationally efficient approximate Bayesian inference environment, R-INLA.
The temporal part of the model explains daily and weekly cycles in PNC at the
schools, which can be used to estimate the exposure of school children to
ultrafine particles (UFPs) emitted by vehicles. At each school and long-term
monitoring site, peaks in PNC can be attributed to the morning and afternoon
rush hour traffic and new particle formation events. The spatial component of
the model describes the school to school variation in mean PNC at each school
and within each school ground. It is shown how the spatial model can be
expanded to identify spatial patterns at the city scale with the inclusion of
more spatial locations.
Comment: Draft of this paper presented at ISBA 2012 as poster, part of the UPTECH project
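The hierarchical model is fitted with R-INLA in the paper; as a much cruder illustration of the temporal decomposition it targets, the Python sketch below separates a shared diurnal cycle from site-specific offsets on synthetic hourly PNC data. Site names and values are placeholders, not UPTECH measurements.

```python
import numpy as np
import pandas as pd

# Exploratory sketch only: the paper fits penalised random-walk terms in R-INLA;
# here we just separate a common daily cycle from site-specific residual offsets
# on synthetic hourly PNC data.
rng = np.random.default_rng(7)
hours = pd.date_range("2012-01-01", periods=24 * 7 * 4, freq="h")
sites = ["school_A", "school_B", "long_term_1"]            # placeholder site names

frames = []
for i, site in enumerate(sites):
    diurnal = 1 + 0.5 * np.sin(2 * np.pi * (hours.hour - 8) / 24)   # rush-hour-like cycle
    pnc = np.exp(diurnal + 0.2 * i + 0.3 * rng.standard_normal(len(hours)))
    frames.append(pd.DataFrame({"time": hours, "site": site, "pnc": pnc}))
df = pd.concat(frames, ignore_index=True)

df["log_pnc"] = np.log(df["pnc"])
common_cycle = df.groupby(df["time"].dt.hour)["log_pnc"].mean()     # shared diurnal term
df["residual"] = df["log_pnc"] - df["time"].dt.hour.map(common_cycle)
site_offset = df.groupby("site")["residual"].mean()                 # crude site-level offset
print(site_offset.round(2))
```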
Activity Recognition and Prediction in Real Homes
In this paper, we present work in progress on activity recognition and
prediction in real homes using either binary sensor data or depth video data.
We present our field trial and set-up for collecting and storing the data, our
methods, and our current results. We compare the accuracy of predicting the
next binary sensor event using probabilistic methods and Long Short-Term Memory
(LSTM) networks, include time information to improve prediction accuracy, and
predict both the next sensor event and its mean time of occurrence with a
single LSTM model. We investigate transfer learning between apartments and
show that it is possible to pre-train the model with data from other apartments
and achieve good accuracy in a new apartment straight away. In addition, we
present preliminary results from activity recognition using low-resolution
depth video data from seven apartments, and classify four activities - no
movement, standing up, sitting down, and TV interaction - by using a relatively
simple processing method where we apply an Infinite Impulse Response (IIR)
filter to extract movements from the frames prior to feeding them to a
convolutional LSTM network for the classification.
Comment: 12 pages, Symposium of the Norwegian AI Society NAIS 201
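The movement-extraction step can be sketched with a first-order IIR (exponential) background filter; the filter coefficient, frame size and threshold below are illustrative rather than the settings used in the field trial.

```python
import numpy as np

# Minimal sketch of a first-order IIR background filter used to extract
# movement from low-resolution depth frames before a convolutional LSTM;
# alpha, frame size and threshold are illustrative, not the paper's settings.
rng = np.random.default_rng(3)
frames = rng.integers(0, 255, size=(100, 32, 24)).astype(float)   # fake depth video

alpha = 0.95                         # IIR smoothing factor: background update rate
background = frames[0].copy()
movement = np.zeros_like(frames)

for t, frame in enumerate(frames):
    movement[t] = np.abs(frame - background)                # deviation from slow background
    background = alpha * background + (1 - alpha) * frame   # y[t] = a*y[t-1] + (1-a)*x[t]

masks = (movement > 20).astype(np.float32)                  # crude motion mask per frame
print("fraction of 'moving' pixels per frame:", masks.mean(axis=(1, 2))[:5])
# `masks` (or `movement`) would then be fed to a ConvLSTM classifier.
```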
Kepler-91b: a planet at the end of its life. Planet and giant host star properties via light-curve variations
The evolution of planetary systems is intimately linked to the evolution of
their host star. Our understanding of the whole planetary evolution process is
based on the large planet diversity observed so far. To date, only a few tens of
planets have been discovered orbiting stars ascending the Red Giant Branch.
Although several theories have been proposed, the question of how planets die
remains open due to small-number statistics. In this work we study the
giant star Kepler-91 (KOI-2133) in order to determine the nature of a
transiting companion. This system was detected by the Kepler Space Telescope.
However, its planetary confirmation is needed. We confirm the planetary nature
of the object transiting the star Kepler-91 by deriving its mass and planetary
radius. Asteroseismic analysis yields the stellar radius and mass. We find that
the planet's eccentric orbit brings it to just above the stellar atmosphere at
pericenter. Kepler-91b could represent the stage prior to planet engulfment,
recently detected for BD+48 740. Our
estimations show that Kepler-91b will be swallowed by its host star in less
than 55 Myr. Among the confirmed planets around giant stars, this is the
planetary-mass body closest to its host star. At pericenter passage, the star
subtends an angle covering around 10% of the sky as seen from the planet. The
planetary atmosphere appears to be inflated, probably due to the high stellar
irradiation.
Comment: 21 pages, 8 tables and 11 figures
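The sky-coverage statement follows from simple geometry: the fraction of the sky covered by a star of radius R_* seen from a distance d is (1 - sqrt(1 - (R_*/d)^2))/2. The orbital values used below are placeholders chosen only so the sketch reproduces a coverage of roughly 10%; they are not the published Kepler-91b parameters.

```python
import numpy as np

# Illustrative geometry only: fraction of the planet's sky covered by a star of
# radius R_star seen from pericenter distance d is (1 - sqrt(1 - (R_star/d)**2)) / 2.
a_over_rstar = 1.8            # placeholder semi-major axis in stellar radii (not the published value)
ecc = 0.07                    # placeholder eccentricity (not the published value)

d_peri = a_over_rstar * (1 - ecc)                     # pericenter distance in stellar radii
x = 1.0 / d_peri                                      # R_star / d at pericenter
half_angle = np.degrees(np.arcsin(x))                 # apparent angular radius of the star
sky_fraction = 0.5 * (1 - np.sqrt(1 - x**2))          # solid-angle fraction of the sky

print(f"pericenter: {d_peri:.2f} R_star, angular diameter: {2 * half_angle:.0f} deg, "
      f"sky coverage: {sky_fraction:.1%}")
```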
Constraining the False Positive Rate for Kepler Planet Candidates with Multi-Color Photometry from the GTC
Using the OSIRIS instrument installed on the 10.4-m Gran Telescopio Canarias
(GTC) we acquired multi-color transit photometry of four small (Rp < 5 R_Earth)
short-period (P < 6 days) planet candidates recently identified by the Kepler
space mission. These observations are part of a program to constrain the false
positive rate for small, short-period Kepler planet candidates. Since planetary
transits should be largely achromatic when observed at different wavelengths
(excluding the small color changes due to stellar limb darkening), we use the
observed transit color to identify candidates as either false positives (e.g.,
a blend with a stellar eclipsing binary either in the background/foreground or
bound to the target star) or validated planets. Our results include the
identification of KOI 225.01 and KOI 1187.01 as false positives and the
tentative validation of KOI 420.01 and KOI 526.01 as planets. The probability
of identifying two false positives out of a sample of four targets is less than
1%, assuming an overall false positive rate for Kepler planet candidates of 10%
(as estimated by Morton & Johnson 2011). Therefore, these results suggest a
higher false positive rate for the small, short-period Kepler planet candidates
than has been theoretically predicted by other studies which consider the
Kepler planet candidate sample as a whole. Furthermore, our results are
consistent with a recent Doppler study of short-period giant Kepler planet
candidates (Santerne et al. 2012). We also investigate how the false positive
rate for our sample varies with different planetary and stellar properties. Our
results suggest that the false positive rate varies significantly with orbital
period and is largest at the shortest orbital periods (P < 3 days), where there
is a corresponding rise in the number of detached eclipsing binary stars...
(truncated)
Comment: 13 pages, 12 figures, 3 tables; revised for MNRAS
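The achromaticity test at the heart of the method can be sketched as a comparison of fitted transit depths in two bands: a genuine planetary transit should show nearly the same depth in different colors, while a blended eclipsing binary generally does not. The depths and uncertainties below are synthetic placeholders, not GTC/OSIRIS measurements.

```python
import numpy as np

# Sketch of a multi-color achromaticity test; all depths and errors are made up.
def chromaticity_sigma(depth_blue, err_blue, depth_red, err_red):
    """Depth difference between two bands in units of its combined uncertainty."""
    return abs(depth_blue - depth_red) / np.hypot(err_blue, err_red)

candidates = {
    # name: (depth_blue, err_blue, depth_red, err_red) in fractional flux, illustrative
    "candidate_1": (0.0010, 0.0001, 0.0011, 0.0001),   # achromatic -> consistent with a planet
    "candidate_2": (0.0010, 0.0001, 0.0018, 0.0001),   # chromatic  -> likely blended binary
}

for name, (db, eb, dr, er) in candidates.items():
    sig = chromaticity_sigma(db, eb, dr, er)
    verdict = "likely false positive" if sig > 3 else "consistent with a planet"
    print(f"{name}: depth difference = {sig:.1f} sigma -> {verdict}")
```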
Six Peaks Visible in the Redshift Distribution of 46,400 SDSS Quasars Agree with the Preferred Redshifts Predicted by the Decreasing Intrinsic Redshift Model
The redshift distribution of all 46,400 quasars in the Sloan Digital Sky
Survey (SDSS) Quasar Catalog III, Third Data Release, is examined. Six peaks
that fall within the redshift window below z = 4 are visible. Their positions
agree with the preferred redshift values predicted by the decreasing intrinsic
redshift (DIR) model, even though this model was derived using completely
independent evidence. A power spectrum analysis of the full dataset confirms
the presence of a single, significant power peak at the expected redshift
period. Power peaks with the predicted period are also obtained when the upper
and lower halves of the redshift distribution are examined separately. The
periodicity detected is in linear z, as opposed to log(1+z). Because the peaks
in the SDSS quasar redshift distribution agree well with the preferred
redshifts predicted by the intrinsic redshift relation, we conclude that this
relation, and the peaks in the redshift distribution, likely both have the same
origin, and this may be intrinsic redshifts, or a common selection effect.
However, because of the way the intrinsic redshift relation was determined it
seems unlikely that one selection effect could have been responsible for both.
Comment: 12 pages, 12 figures, accepted for publication in the Astrophysical Journal
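A periodicity search of this kind can be sketched by binning the redshifts in linear z, removing the smooth trend in N(z), and taking a power spectrum of the residuals. The redshifts below are synthetic placeholders with a purely illustrative injected period, not the SDSS quasar sample.

```python
import numpy as np

# Sketch of a periodicity test on a redshift distribution: histogram in linear z,
# detrend, and look for a peak in the power spectrum.  Synthetic data only.
rng = np.random.default_rng(5)
z = rng.uniform(0.1, 4.0, 46_400)                                    # smooth synthetic background
z = np.concatenate([z, rng.normal(np.arange(0.6, 4.0, 0.6), 0.03, (2000, 6)).ravel()])

counts, edges = np.histogram(z, bins=400, range=(0.0, 4.0))
centers = 0.5 * (edges[:-1] + edges[1:])

trend = np.poly1d(np.polyfit(centers, counts, 5))(centers)           # smooth N(z) trend
resid = counts - trend

power = np.abs(np.fft.rfft(resid)) ** 2
freq = np.fft.rfftfreq(len(resid), d=centers[1] - centers[0])        # cycles per unit z
peak = freq[1:][np.argmax(power[1:])]                                # skip the DC term
print(f"strongest periodicity: period in z ~ {1.0 / peak:.2f}")
```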
