An approximate solution to the optimal coordination problem for autonomous vehicles at intersections
In this paper, we address the problem of optimal and safe coordination of autonomous vehicles through a traffic intersection. We state the problem as a finite-time, constrained optimal control problem, a combinatorial optimization problem that is difficult to solve in real time. We propose a low-complexity computational scheme based on a hierarchical decomposition of the original optimal control formulation, in which a central coordination problem is solved together with a number of local optimal control problems, one for each vehicle. We show how the proposed decomposition reduces the complexity of the central problem, provided that approximate parametric solutions of the local problems are available beforehand. We derive conditions for the construction of the parametric approximations and demonstrate the method with a numerical example.
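The decomposition can be illustrated with a toy script: suppose each vehicle's local optimal cost has been approximated offline by a quadratic in its assigned crossing time (playing the role of the parametric solutions the abstract refers to); the central problem then reduces to choosing a crossing order and crossing times. The vehicle data, the quadratic cost form, and the brute-force search below are illustrative assumptions, not the paper's actual scheme.

```python
import itertools

# Hypothetical parametric approximation: each vehicle's local optimal cost
# as a quadratic in its assigned crossing time t. In the paper's scheme such
# approximations are constructed offline; these coefficients are invented.
vehicles = {
    "A": {"t_free": 4.0, "w": 1.0},   # preferred crossing time, cost weight
    "B": {"t_free": 4.2, "w": 2.0},
    "C": {"t_free": 6.0, "w": 1.5},
}

SEPARATION = 1.0  # required time gap between consecutive crossings


def local_cost(v, t):
    """Approximate local cost of vehicle v crossing at time t: w*(t - t_free)^2."""
    p = vehicles[v]
    return p["w"] * (t - p["t_free"]) ** 2


def central_problem():
    """Brute-force the combinatorial central problem: try every crossing
    order and, within each order, give each vehicle the earliest feasible
    crossing time (optimal here because each local cost is convex and
    minimised at t_free)."""
    best = None
    for order in itertools.permutations(vehicles):
        t_prev, total, times = float("-inf"), 0.0, {}
        for v in order:
            t = max(vehicles[v]["t_free"], t_prev + SEPARATION)
            times[v] = t
            total += local_cost(v, t)
            t_prev = t
        if best is None or total < best[0]:
            best = (total, times)
    return best


cost, times = central_problem()
```

With three vehicles the enumeration is trivial; the point of the paper's hierarchical scheme is precisely that the central problem stays cheap because only the precomputed parametric costs, not the full local optimal control problems, appear in it.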
Coordination of Cooperative Autonomous Vehicles: Toward safer and more efficient road transportation
While intelligent transportation systems come in many shapes and sizes, arguably the most transformational realization will be the autonomous vehicle. As such vehicles become commercially available in the coming years, first on dedicated roads and under specific conditions, and later on all public roads at all times, a phase transition will occur. Once a sufficient number of autonomous vehicles is deployed, the opportunity for explicit coordination appears. This article treats this challenging network control problem, which lies at the intersection of control theory, signal processing, and wireless communication. We provide an overview of the state of the art, while at the same time highlighting key research directions for the coming decades.
Gamma-Ray Bursts in Circumstellar Shells: A Possible Explanation for Flares
It is now generally accepted that long-duration gamma-ray bursts (GRBs) are due to the collapse of massive rotating stars. The precise collapse process itself, however, is not yet fully understood. Strong winds, outbursts, and intense ionizing UV radiation from single stars or strongly interacting binaries are expected to destroy the molecular cloud cores in which these stars are born and to create highly complex circumburst environments for the explosion. Such environments might imprint features on GRB light curves that uniquely identify the nature of the progenitor and its collapse. We have performed numerical simulations of realistic environments for a variety of long-duration GRB progenitors with ZEUS-MP, and have developed an analytical method for calculating GRB light curves in these profiles. Although a full, three-dimensional, relativistic magnetohydrodynamical computational model is required to precisely describe the light curve of a GRB in complex environments, our method can provide a qualitative understanding of these phenomena. We find that, in the context of the standard afterglow model, massive shells around GRBs produce strong signatures in their light curves, which can distinguish them from bursts occurring in uniform media or steady winds. These features can constrain the mass of the shell and the properties of the wind before and after the ejection. Moreover, the interaction of the GRB with the circumburst shell is seen to produce features that are consistent with observed X-ray flares, which are often attributed to delayed energy injection by the central engine. Our algorithm for computing light curves is also applicable to GRBs in a variety of environments, such as those in high-redshift cosmological halos or protogalaxies, both of which will soon be targets of future surveys such as JANUS or Lobster.
Strong and auxiliary forms of the semi-Lagrangian method for incompressible flows
We present a review of the semi-Lagrangian method for advection-diffusion and incompressible Navier-Stokes equations discretized with high-order methods. In particular, we compare the strong form, where the departure points are computed directly via backwards integration, with the auxiliary form, where an auxiliary advection equation is solved instead; the latter is also referred to as the Operator Integration Factor Splitting (OIFS) scheme. For intermediate time-step sizes the auxiliary form is preferable, but for large time steps only the strong form is stable.
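As a minimal sketch of the strong form, the snippet below advances the 1D constant-velocity advection equation by tracing each grid node back to its departure point and interpolating the old field there linearly; production solvers use high-order interpolation and velocity-dependent backward trajectory integration, so this is only schematic. Note that the chosen time step exceeds the CFL limit, where an explicit Eulerian scheme would be unstable, illustrating the stability property mentioned above.

```python
import numpy as np

def semi_lagrangian_step(f, u, dt, dx):
    """One strong-form semi-Lagrangian step for f_t + u f_x = 0 on a
    periodic grid: trace each node backwards to its departure point
    (exact here because u is constant) and linearly interpolate the
    old field at that point."""
    n = f.size
    x = np.arange(n) * dx
    x_dep = (x - u * dt) % (n * dx)        # departure points
    j = np.floor(x_dep / dx).astype(int)   # left neighbour index
    w = x_dep / dx - j                     # linear interpolation weight
    return (1.0 - w) * f[j % n] + w * f[(j + 1) % n]

# Large time step: CFL = u*dt/dx ≈ 3.8 > 1, yet the scheme stays stable.
n, u, dt = 64, 0.3, 0.2
dx = 1.0 / n
x = np.arange(n) * dx
f = np.exp(-200.0 * (x - 0.5) ** 2)        # Gaussian bump centred at x = 0.5
for _ in range(5):                          # advance to t = 1 (displacement 0.3)
    f = semi_lagrangian_step(f, u, dt, dx)
```

Because each new value is a convex combination of old values, the field stays bounded by its initial extrema regardless of the time-step size; the price is the numerical diffusion of the low-order interpolation, which is why the high-order variants reviewed in the paper matter.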
A semi-Lagrangian micro-macro method for viscoelastic flow calculations
We present in this paper a semi-Lagrangian algorithm for calculating viscoelastic flows in which a dilute polymer solution is modeled by the FENE dumbbell kinetic model. In this algorithm, the material derivative operator of the Navier–Stokes equations (the macroscopic flow equations) is discretized in time by a semi-Lagrangian formulation of the second-order backward difference formula (BDF2). This discretization leads to solving a linear generalized Stokes problem at each time step. For the stochastic differential equations of the microscopic-scale model, we use the second-order predictor-corrector scheme proposed in [22], applied along the forward trajectories of the centers of mass of the dumbbells. Important features of the algorithm are (1) the new semi-Lagrangian projection scheme; (2) the scheme to move and locate both the mesh points and the dumbbells; and (3) the calculation and space discretization of the polymer stress. The algorithm has been tested on the 2D 10:1 contraction benchmark problem and has proved to be accurate and stable, being able to deal with flows at high Weissenberg (Wi) numbers; specifically, by adjusting the size of the time step we obtain solutions at Wi=444.
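The microscopic half of such a micro-macro scheme can be sketched in isolation: the snippet below integrates the nondimensional FENE dumbbell SDE for an ensemble of dumbbells using an Euler predictor followed by a trapezoidal corrector, one plausible reading of a second-order predictor-corrector scheme (the actual method of reference [22] and its coupling to the macroscopic solver are not reproduced here). All parameter values are illustrative.

```python
import numpy as np

b = 50.0                 # FENE extensibility parameter (illustrative)
dt = 0.005
rng = np.random.default_rng(0)

def drift(Q, kappa):
    """Drift of the nondimensional FENE dumbbell SDE
    dQ = (kappa @ Q - 0.5 * F(Q)) dt + dW, with spring force
    F(Q) = Q / (1 - |Q|^2 / b)."""
    fene = Q / (1.0 - np.sum(Q * Q, axis=1, keepdims=True) / b)
    return Q @ kappa.T - 0.5 * fene

def predictor_corrector_step(Q, kappa):
    """Euler predictor, then a trapezoidal corrector reusing the same
    Brownian increment -- a stand-in for the scheme of reference [22]."""
    dW = rng.normal(scale=np.sqrt(dt), size=Q.shape)
    Q_star = Q + drift(Q, kappa) * dt + dW                      # predictor
    return Q + 0.5 * (drift(Q, kappa) + drift(Q_star, kappa)) * dt + dW

kappa = np.zeros((2, 2))   # quiescent flow: zero velocity gradient
Q = np.zeros((500, 2))     # ensemble of 500 dumbbells in 2D
for _ in range(2000):      # relax towards the FENE equilibrium distribution
    Q = predictor_corrector_step(Q, kappa)
mean_ext = float(np.mean(np.sum(Q * Q, axis=1)))  # ~ 2b/(b+4) ≈ 1.85 at equilibrium
```

In the paper's setting the same ensemble would be transported along the flow trajectories and averaged into a polymer stress that feeds back into the generalized Stokes problem; only the stochastic integrator is shown here.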
The Long-Baseline Neutrino Experiment: Exploring Fundamental Symmetries of the Universe
The preponderance of matter over antimatter in the early Universe, the
dynamics of the supernova bursts that produced the heavy elements necessary for
life and whether protons eventually decay --- these mysteries at the forefront
of particle physics and astrophysics are key to understanding the early
evolution of our Universe, its current state and its eventual fate. The
Long-Baseline Neutrino Experiment (LBNE) represents an extensively developed
plan for a world-class experiment dedicated to addressing these questions. LBNE
is conceived around three central components: (1) a new, high-intensity
neutrino source generated from a megawatt-class proton accelerator at Fermi
National Accelerator Laboratory, (2) a near neutrino detector just downstream
of the source, and (3) a massive liquid argon time-projection chamber deployed
as a far detector deep underground at the Sanford Underground Research
Facility. This facility, located at the site of the former Homestake Mine in
Lead, South Dakota, is approximately 1,300 km from the neutrino source at
Fermilab -- a distance (baseline) that delivers optimal sensitivity to neutrino
charge-parity symmetry violation and mass ordering effects. This ambitious yet
cost-effective design incorporates scalability and flexibility and can
accommodate a variety of upgrades and contributions. With its exceptional
combination of experimental configuration, technical capabilities, and
potential for transformative discoveries, LBNE promises to be a vital facility
for the field of particle physics worldwide, providing physicists from around
the globe with opportunities to collaborate in a twenty to thirty year program
of exciting science. In this document we provide a comprehensive overview of
LBNE's scientific objectives, its place in the landscape of neutrino physics
worldwide, the technologies it will incorporate and the capabilities it will
possess.
Development of a Consumer Reported Outcome Measure for Personal Care Products: The Rationale
© 2024 The Author(s). This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY), https://creativecommons.org/licenses/by/4.0/. Background: Cosmetic products are one of the fastest-growing segments of personal care products in the United States, and one of the critical elements in the sales and growth of cosmetics is leveraging claims. Unlike pharmaceutical claims, claims for personal care products in the US are usually not reviewed and do not require regulatory approval before products are sold. Regulatory agencies have little oversight of how advertisements are presented to consumers, yet they have cited well-known companies and brands for deceptive advertising and forced costly market withdrawals, affecting investors' financial stakes and customers' confidence alike. Objectives: We conducted a literature search and a survey. The literature search aimed to identify the methodologies currently available for substantiating the advertising of personal care products (including cosmetics). The survey was conducted with regulatory professionals to understand the use of these methodologies. Methods: The survey was developed and distributed to regulatory professionals in different capacities within the cosmetic and personal care industry who had extensive experience constructing and substantiating advertising claims for cosmetic and personal care products. The questionnaire comprised 9 questions covering socio-demographic characteristics and regulatory experience in validating claims. Results: We received 63 responses to the 1,354 forms sent to regulatory professionals who validate advertising claims. The results show that 85% of the respondents use FDA guidance, while the remaining 15% use in-house or other non-governmental guides. Moreover, 58% apply some form of risk-benefit assessment when evaluating claim substantiation, while 42% do not.
Conclusion: Although the respondents qualifying the claims possess the experience and technical knowledge of cosmetic and personal care products, the standards presently used in the US are not designed to validate the substantiation of advertising claims. There is therefore a need for a more robust methodology for evaluating the validation and substantiation of advertising claims. A technique based on personal experiences, known as Personal Reported Outcomes (PRO), is already approved and used for pharmaceutical products. Leveraging PRO techniques can help develop a "consumer reported outcome measure" (CROM) tool for validating claim substantiation in the advertising of cosmetic and personal care products. Keywords: Personal Reported Outcomes, CROM, FDA, FTC, Cosmetic products
Optimisation-based coordination of connected, automated vehicles at intersections
In this paper, we analyse the performance of a model predictive controller for the coordination of connected, automated vehicles at intersections. The problem has combinatorial complexity, and we propose to solve it approximately using a two-stage procedure in which (1) the order in which the vehicles cross the intersection is found by solving a mixed-integer quadratic program, and (2) the control commands are subsequently found by solving a nonlinear program. We show that the controller is persistently safe and compare its performance against traffic lights and two simpler optimisation-based coordination schemes. The results show that our approach outperforms the considered alternatives in terms of both energy consumption and travel-time delay, especially for medium to high traffic loads.
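The two-stage structure can be mimicked in a few lines: stage (1) is replaced here by exhaustive enumeration of crossing orders with an earliest-feasible-time heuristic schedule (a stand-in for the mixed-integer QP), and stage (2) by the closed-form minimum-effort control of a double-integrator vehicle (a stand-in for the nonlinear program). Distances, speeds, and the time gap are invented for illustration.

```python
from itertools import permutations

# Hypothetical scenario: each vehicle is d metres from the intersection
# travelling at v0 m/s; the numbers are illustrative, not from the paper.
vehicles = {"A": {"d": 40.0, "v0": 10.0},
            "B": {"d": 45.0, "v0": 10.0},
            "C": {"d": 80.0, "v0": 10.0}}
GAP = 1.5  # required time gap between consecutive crossings (s)


def effort(d, v0, T):
    """Minimum control effort min ∫u² dt for a double integrator covering
    distance d in time T from initial speed v0 (final speed free)."""
    return 3.0 * (d - v0 * T) ** 2 / T ** 3


def stage1_order():
    """Stage (1) stand-in: enumerate crossing orders and schedule each
    vehicle at the earliest feasible time, instead of solving a MIQP."""
    best = None
    for order in permutations(vehicles):
        t_prev, cost, sched = 0.0, 0.0, {}
        for v in order:
            d, v0 = vehicles[v]["d"], vehicles[v]["v0"]
            T = max(d / v0, t_prev + GAP)   # free-flow time, or pushed back
            sched[v] = T
            cost += effort(d, v0, T)
            t_prev = T
        if best is None or cost < best[0]:
            best = (cost, sched)
    return best


def stage2_command(v, T, t):
    """Stage (2) stand-in: closed-form minimum-effort acceleration
    u(t) = 3 (d - v0 T)(T - t) / T³ for the assigned crossing time T."""
    d, v0 = vehicles[v]["d"], vehicles[v]["v0"]
    return 3.0 * (d - v0 * T) * (T - t) / T ** 3


cost, schedule = stage1_order()
```

The split mirrors the abstract's point: the discrete order is fixed first, after which each vehicle's continuous control reduces to a well-conditioned (here even closed-form) problem.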
