Estimating the Impact of Highways on Average Travel Velocities and Market Size
In this paper we examine the link between additions to highway infrastructure and the development of a market area. We do so by first relating highway travel speeds to added highway mileage and then relating travel speed to the size of the market area. This approach bypasses issues in the public finance literature that derive from estimates of highway infrastructure spending. Also, rather than examining the effects of improved transportation efficiency on enhancements of productivity, this research examines their effect on enhancements in demand for local production. Our thought, which is borne out in the literature, is that industry-level productivity in a metropolitan area may be improved only marginally by lower delivered prices of inputs due to very localized improvements in the freight transportation system. On the other hand, the market for locally produced goods and services will expand somewhat uniformly across industries due to generally improved traffic movements in a metropolitan area. By applying this approach to data from the Texas Transportation Institute, we find a significant but small positive effect of highways and arterials (as opposed to other roadways) on changes in metropolitan urbanized area and in metropolitan population. This suggests that demand for local production may well be enhanced by expansions of highway and principal arterial infrastructure.
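To make the two-stage structure of the approach concrete, here is a minimal sketch assuming a hypothetical metro-level dataset with invented column names (delta_mileage, travel_speed, market_size); the paper's actual specification, controls, and Texas Transportation Institute variables are not reproduced here.

```python
# Illustrative two-stage OLS sketch; data and column names are invented.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
df = pd.DataFrame({"delta_mileage": rng.uniform(0, 50, 100)})  # added highway miles
df["travel_speed"] = 30 + 0.2 * df["delta_mileage"] + rng.normal(0, 2, 100)
df["market_size"] = 500 + 8 * df["travel_speed"] + rng.normal(0, 30, 100)

# Stage 1: relate highway travel speed to added highway/arterial mileage.
stage1 = sm.OLS(df["travel_speed"], sm.add_constant(df["delta_mileage"])).fit()
# Stage 2: relate the size of the market area to travel speed.
stage2 = sm.OLS(df["market_size"], sm.add_constant(df["travel_speed"])).fit()

print(stage1.params["delta_mileage"], stage2.params["travel_speed"])
```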
Hearing the Victim's Voice: Analysis of Victims' Advocate Participation in the Trial Proceeding of the International Criminal Court
Shape memory alloy based smart landing gear for an airship
The design and development of a shape memory alloy based smart landing gear for aerospace vehicles is based on a novel design approach. The smart landing gear comprises a landing beam, an arch, and a superelastic nickel-titanium shape memory alloy element. This design is of a generic nature and is applicable to a certain class of light aerospace vehicles. In this paper a specific case of the shape memory alloy based smart landing gear design and development applicable to a radio-controlled semirigid airship (radio-controlled blimp) of 320 m³ volume is presented. A judicious combination of carbon fiber reinforced plastic for the landing beam, cane (a naturally occurring plant product) wrapped with carbon fiber reinforced plastic for the arch, and superelastic shape memory alloy is used in the development. An appropriate sizing of the arch and landing beam is arrived at to meet the dual requirement of low weight and high energy dissipation while undergoing "large elastic" (large nonlinear recoverable elastic strain) deformations to ensure soft landings when the airship impacts the ground. The soft landing is required to ensure that shock and vibration are minimized (to protect the sensitive payload). The inherently large energy-dissipating character of the superelastic shape memory alloy element in the tensile mode of deformation and the superior elastic bounce-back features of the landing gear provide the ideal solution. A nonlinear analysis based on the classical and finite element method approach is followed to analyze the structure. Necessary experiments and tests have been conducted to check the veracity of the design. Good correlation has been found between the analyses and testing. This exercise is intended to provide an alternate method of developing an efficient landing gear with satisfactory geometry for a "certain class of light aerospace vehicles" such as airships, rotorcraft, and other light unmanned air vehicles.
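As a side note on how the energy-dissipation requirement mentioned above is commonly quantified (not a detail taken from this paper), the energy dissipated per loading cycle by a superelastic element equals the area enclosed by its tensile stress-strain hysteresis loop. A minimal sketch with an idealized flag-shaped loop and invented plateau stresses:

```python
import numpy as np

# Idealized superelastic loop (illustrative numbers, not the paper's data):
# loading plateau near 500 MPa, unloading plateau near 250 MPa,
# and a transformation strain window of roughly 1-6 %.
strain = np.array([0.0, 0.01, 0.06, 0.07, 0.06, 0.01, 0.0])
stress = np.array([0.0, 500., 500., 550., 250., 250., 0.0])   # MPa

# Energy dissipated per unit volume = enclosed loop area (shoelace formula).
area = 0.5 * abs(np.dot(strain, np.roll(stress, -1))
                 - np.dot(stress, np.roll(strain, -1)))
print(f"dissipated energy density ~ {area:.2f} MJ/m^3")  # MPa x strain = MJ/m^3
```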
Sequential Quantiles via Hermite Series Density Estimation
Sequential quantile estimation refers to incorporating observations into
quantile estimates in an incremental fashion thus furnishing an online estimate
of one or more quantiles at any given point in time. Sequential quantile
estimation is also known as online quantile estimation. This area is relevant
to the analysis of data streams and to the one-pass analysis of massive data
sets. Applications include network traffic and latency analysis, real time
fraud detection and high frequency trading. We introduce new techniques for
online quantile estimation based on Hermite series estimators in the settings
of static quantile estimation and dynamic quantile estimation. In the static
quantile estimation setting we apply the existing Gauss-Hermite expansion in a
novel manner. In particular, we exploit the fact that Gauss-Hermite
coefficients can be updated in a sequential manner. To treat dynamic quantile
estimation we introduce a novel expansion with an exponentially weighted
estimator for the Gauss-Hermite coefficients which we term the Exponentially
Weighted Gauss-Hermite (EWGH) expansion. These algorithms go beyond existing
sequential quantile estimation algorithms in that they allow arbitrary
quantiles (as opposed to pre-specified quantiles) to be estimated at any point
in time. In doing so we provide a solution to online distribution function and
online quantile function estimation on data streams. In particular we derive an
analytical expression for the CDF and prove consistency results for the CDF
under certain conditions. In addition we analyse the associated quantile
estimator. Simulation studies and tests on real data reveal the Gauss-Hermite
based algorithms to be competitive with a leading existing algorithm. Comment: 43 pages, 9 figures. Improved version incorporating referee comments, as appears in Electronic Journal of Statistics.
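A minimal sketch of the sequential updating idea, assuming a truncated Gauss-Hermite expansion of the density in orthonormal Hermite functions; the class and parameter names are illustrative, and the CDF is obtained here by simple numerical integration rather than the analytical expression derived in the paper.

```python
import numpy as np
from math import factorial, pi, sqrt
from numpy.polynomial.hermite import hermval

class SequentialHermiteQuantiles:
    """Sketch of an online quantile estimator built on a truncated
    Gauss-Hermite density expansion (names and defaults are illustrative)."""

    def __init__(self, order=10, decay=None):
        self.order = order            # truncation order K
        self.decay = decay            # None -> static; lambda in (0,1) -> EWGH-style
        self.coef = np.zeros(order + 1)
        self.n = 0
        # normalisation constants of the orthonormal Hermite functions h_k
        self._norm = np.array([1.0 / sqrt(2.0**k * factorial(k) * sqrt(pi))
                               for k in range(order + 1)])

    def _h(self, x):
        # h_k(x) = norm_k * H_k(x) * exp(-x^2/2), k = 0..K (physicists' H_k)
        H = np.array([hermval(x, np.eye(self.order + 1)[k])
                      for k in range(self.order + 1)])
        return self._norm * H * np.exp(-0.5 * x * x)

    def update(self, x):
        """Fold one new observation into the coefficient estimates."""
        h, self.n = self._h(x), self.n + 1
        if self.decay is None:                      # static: running mean of h_k(X)
            self.coef += (h - self.coef) / self.n
        else:                                       # dynamic: exponential weighting
            self.coef = (1.0 - self.decay) * self.coef + self.decay * h

    def cdf(self, x, lower=-10.0, n_grid=2000):
        """CDF estimate by numerically integrating the density estimate."""
        grid = np.linspace(lower, x, n_grid)
        dens = np.clip([self.coef @ self._h(t) for t in grid], 0.0, None)
        return float(np.sum(0.5 * (dens[1:] + dens[:-1]) * np.diff(grid)))

    def quantile(self, p, lo=-10.0, hi=10.0, tol=1e-3):
        """Arbitrary quantile, at any time, by bisection on the estimated CDF."""
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            lo, hi = (mid, hi) if self.cdf(mid) < p else (lo, mid)
        return 0.5 * (lo + hi)

# Toy usage: stream standard-normal data and query the median online.
est = SequentialHermiteQuantiles(order=12)
for obs in np.random.default_rng(0).normal(size=2000):
    est.update(obs)
print("estimated median:", est.quantile(0.5))
```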
Nonparametric Transient Classification using Adaptive Wavelets
Classifying transients based on multi-band light curves is a challenging but
crucial problem in the era of GAIA and LSST since the sheer volume of
transients will make spectroscopic classification unfeasible. Here we present a
nonparametric classifier that uses the transient's light curve measurements to
predict its class given training data. It implements two novel components: the
first is the use of the BAGIDIS wavelet methodology - a characterization of
functional data using hierarchical wavelet coefficients. The second novelty is
the introduction of a ranked probability classifier on the wavelet coefficients
that handles both the heteroscedasticity of the data and the
potential non-representativity of the training set. The ranked classifier is
simple and quick to implement while a major advantage of the BAGIDIS wavelets
is that they are translation invariant, hence they do not need the light curves
to be aligned to extract features. Further, BAGIDIS is nonparametric so it can
be used for blind searches for new objects. We demonstrate the effectiveness of
our ranked wavelet classifier against the well-tested Supernova Photometric
Classification Challenge dataset in which the challenge is to correctly
classify light curves as Type Ia or non-Ia supernovae. We train our ranked
probability classifier on the spectroscopically-confirmed subsample (which is
not representative) and show that it gives good results for all supernovae with
observed light curve timespans greater than 100 days (roughly 55% of the
dataset). For such data, we obtain a Ia efficiency of 80.5% and a purity of
82.4% yielding a highly competitive score of 0.49 whilst implementing a truly
"model-blind" approach to supernova classification. Consequently this approach
may be particularly suitable for the classification of astronomical transients
in the era of large synoptic sky surveys. Comment: 14 pages, 8 figures. Published in MNRAS.
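A rough sketch of the overall pipeline (wavelet features extracted from light curves, followed by a probabilistic classifier), using an ordinary discrete wavelet transform from PyWavelets and a k-nearest-neighbours classifier as generic stand-ins; the paper's BAGIDIS wavelets and ranked probability classifier are different and are not reproduced here, and the light-curve data below are synthetic.

```python
import numpy as np
import pywt
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

def wavelet_features(times, flux, n_grid=64, wavelet="haar"):
    """Resample an irregularly sampled light curve onto a fixed grid and
    return its flattened discrete wavelet coefficients as a feature vector."""
    grid = np.linspace(times.min(), times.max(), n_grid)
    return np.concatenate(pywt.wavedec(np.interp(grid, times, flux), wavelet))

def toy_light_curve(is_ia):
    """Synthetic stand-in for a supernova light curve (illustration only)."""
    t = np.sort(rng.uniform(0.0, 120.0, 40))
    width = 20.0 if is_ia else 35.0
    flux = np.exp(-((t - 40.0) / width) ** 2) + 0.05 * rng.normal(size=t.size)
    return t, flux

labels = rng.integers(0, 2, 200)                       # 1 = "Ia", 0 = "non-Ia"
X = np.vstack([wavelet_features(*toy_light_curve(y)) for y in labels])

clf = KNeighborsClassifier(n_neighbors=15).fit(X[:150], labels[:150])
print("held-out accuracy:", clf.score(X[150:], labels[150:]))
print("P(Ia) for first held-out object:", clf.predict_proba(X[150:151])[0, 1])
```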
Flexible multiply towpreg
This invention relates to an improved flexible towpreg and a method of production therefor. The improved flexible towpreg comprises a plurality of towpreg plies which comprise reinforcing filaments and matrix forming material; the reinforcing filaments being substantially wetout by the matrix forming material such that the towpreg plies are substantially void-free composite articles, and the towpreg plies having an average thickness less than about 100 microns. The method of production for the improved flexible towpreg comprises the steps of spreading the reinforcing filaments to expose individually substantially all of the reinforcing filaments; coating the reinforcing filaments with the matrix forming material in a manner causing interfacial adhesion of the matrix forming material to the reinforcing filaments; forming the towpreg plies by heating the matrix forming material contacting the reinforcing filaments until the matrix forming material liquefies and coats the reinforcing filaments; and cooling the towpreg plies in a manner such that substantial cohesion between neighboring towpreg plies is prevented until the matrix forming material solidifies.
Towards the Future of Supernova Cosmology
For future surveys, spectroscopic follow-up for all supernovae will be
extremely difficult. However, one can use light curve fitters to obtain the
probability that an object is a Type Ia. One may consider applying a
probability cut to the data, but we show that the resulting non-Ia
contamination can lead to biases in the estimation of cosmological parameters.
A different method, which allows the use of the full dataset and results in
unbiased cosmological parameter estimation, is Bayesian Estimation Applied to
Multiple Species (BEAMS). BEAMS is a Bayesian approach to the problem which
includes the uncertainty in the types in the evaluation of the posterior. Here
we outline the theory of BEAMS and demonstrate its effectiveness using both
simulated datasets and SDSS-II data. We also show that it is possible to use
BEAMS if the data are correlated, by introducing a numerical marginalisation
over the types of the objects. This is largely a pedagogical introduction to
BEAMS with references to the main BEAMS papers. Comment: Replaced under married name Lochner (formerly Knights). 3 pages, 2 figures. To appear in the Proceedings of the 13th Marcel Grossmann Meeting (MG13), Stockholm, Sweden, 1-7 July 2012.
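A minimal sketch of the core BEAMS idea for uncorrelated data, in which each supernova contributes a mixture of an Ia and a non-Ia likelihood term weighted by its type probability; the Gaussian form, the non-Ia model, and all argument names below are illustrative assumptions rather than the papers' actual implementation.

```python
import numpy as np

def beams_log_likelihood(mu_obs, sigma, mu_ia, mu_nonia, p_ia):
    """Per-object mixture likelihood: p_ia * L_Ia + (1 - p_ia) * L_nonIa,
    summed in log space over all candidates (all inputs are 1-D arrays of
    distance moduli / uncertainties / type probabilities; names are made up)."""
    def gauss(model):
        return np.exp(-0.5 * ((mu_obs - model) / sigma) ** 2) / (
            np.sqrt(2.0 * np.pi) * sigma)
    mixture = p_ia * gauss(mu_ia) + (1.0 - p_ia) * gauss(mu_nonia)
    return float(np.sum(np.log(mixture)))

# Toy call with made-up numbers: three candidates, two confidently Type Ia.
print(beams_log_likelihood(
    mu_obs=np.array([36.1, 38.0, 40.3]),
    sigma=np.array([0.15, 0.2, 0.3]),
    mu_ia=np.array([36.0, 38.1, 40.0]),     # theory prediction if Type Ia
    mu_nonia=np.array([37.0, 39.1, 41.0]),  # hypothetical non-Ia population model
    p_ia=np.array([0.95, 0.9, 0.4]),
))
```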
Extending BEAMS to incorporate correlated systematic uncertainties
New supernova surveys such as the Dark Energy Survey, Pan-STARRS and the LSST
will produce an unprecedented number of photometric supernova candidates, most
with no spectroscopic data. Avoiding biases in cosmological parameters due to
the resulting inevitable contamination from non-Ia supernovae can be achieved
with the BEAMS formalism, allowing for fully photometric supernova cosmology
studies. Here we extend BEAMS to deal with the case in which the supernovae are
correlated by systematic uncertainties. The analytical form of the full BEAMS
posterior requires evaluating 2^N terms, where N is the number of supernova
candidates. This 'exponential catastrophe' makes a direct evaluation unfeasible even
for N of order 100. We circumvent the exponential catastrophe by marginalising
numerically instead of analytically over the possible supernova types: we
augment the cosmological parameters with nuisance parameters describing the
covariance matrix and the types of all the supernovae, \tau_i, that we include
in our MCMC analysis. We show that this method deals well even with large,
unknown systematic uncertainties without a major increase in computational
time, whereas ignoring the correlations can lead to significant biases and
incorrect credible contours. We then compare the numerical marginalisation
technique with a perturbative expansion of the posterior based on the insight
that future surveys will have exquisite light curves and hence the probability
that a given candidate is a Type Ia will be close to unity or zero, for most
objects. Although this perturbative approach changes computation of the
posterior from a 2^N problem into an N^2 or N^3 one, we show that it leads to
biases in general through a small number of misclassifications, implying that
numerical marginalisation is superior. Comment: Resubmitted under married name Lochner (formerly Knights). Version 3: major changes, including a large-scale analysis with thousands of MCMC chains. Matches version published in JCAP. 23 pages, 8 figures.
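A minimal sketch of the numerical-marginalisation idea for the correlated case: condition the likelihood on an explicit vector of type assignments tau (one per candidate) built from the Ia and non-Ia model vectors, and let the sampler update tau alongside the cosmological parameters instead of summing over all 2^N assignments analytically. The Gaussian form and all names below are illustrative assumptions, not the paper's code.

```python
import numpy as np

def correlated_beams_loglike(mu_obs, cov, mu_ia, mu_nonia, tau):
    """Gaussian log-likelihood conditioned on a type assignment tau
    (tau[i] = 1 for Ia, 0 for non-Ia); with a full covariance matrix the
    residual vector depends on every tau_i jointly, which is why the types
    are carried as nuisance parameters in the MCMC (names are made up)."""
    model = np.where(tau == 1, mu_ia, mu_nonia)
    resid = mu_obs - model
    _, logdet = np.linalg.slogdet(cov)
    chi2 = resid @ np.linalg.solve(cov, resid)
    return -0.5 * (chi2 + logdet + len(mu_obs) * np.log(2.0 * np.pi))

# Toy call: three correlated candidates, first two assigned as Type Ia.
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.04, 0.01],
                [0.00, 0.01, 0.09]])
print(correlated_beams_loglike(
    mu_obs=np.array([36.1, 38.0, 40.3]),
    cov=cov,
    mu_ia=np.array([36.0, 38.1, 40.0]),
    mu_nonia=np.array([37.0, 39.1, 41.0]),
    tau=np.array([1, 1, 0]),
))
```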
