A New Executive Order for Improving Federal Regulation? Deeper and Wider Cost-Benefit Analysis
An updated version of this article was published in the University of Pennsylvania Law Review. For over two decades, federal agencies have been required to analyze the benefits and costs of significant regulatory actions and to show that the benefits justify the costs. But the regulatory state continues to suffer from significant problems, including poor priority-setting, unintended adverse side-effects, and, on occasion, high costs for low benefits. In many cases, agencies do not offer an adequate account of either costs or benefits, and hence the commitment to cost-benefit balancing is not implemented in practice. A major current task is to ensure a deeper and wider commitment to cost-benefit analysis, properly understood. We explain how this task might be accomplished and offer a proposed executive order that would move regulation in better directions. In the course of the discussion, we explore a number of pertinent issues, including the actual record of the last two decades, the precautionary principle, the value of 'prompt letters', the role of distributional factors, and the need to incorporate independent agencies within the system of cost-benefit balancing.
Algorithms, Automation, and News
This special issue examines the growing importance of algorithms and automation in the gathering, composition, and distribution of news. It connects a long line of research on journalism and computation with scholarly and professional terrain yet to be explored. Taken as a whole, these articles share some of the noble ambitions of the pioneering publications on ‘reporting algorithms’, such as a desire to see computing help journalists in their watchdog role by holding power to account. However, they also go further, firstly by addressing the fuller range of technologies that computational journalism now encompasses, from chatbots and recommender systems to artificial intelligence and atomised journalism. Secondly, they advance the literature by demonstrating the increased variety of uses for these technologies, including engaging underserved audiences, selling subscriptions, and recombining and re-using content. Thirdly, they problematize computational journalism by, for example, pointing out some of the challenges inherent in applying AI to investigative journalism and in trying to preserve public service values. Fourthly, they offer suggestions for future research and practice, including by presenting a framework for developing democratic news recommenders and another that may help us think about computational journalism in a more integrated, structured manner.
Mathematical modeling of chemically reactive pollutants in indoor air
A general mathematical model is presented for predicting the concentrations of chemically reactive compounds in indoor air. The model accounts for the effects of ventilation, filtration, heterogeneous removal, direct emission, and photolytic and thermal chemical reactions. The model is applied to the introduction of photochemically reactive pollutants into a museum gallery, and the predicted NO, NO_x-NO, and O_3 concentrations are compared to measured data. The model predicts substantial production of several species due to chemical reaction, including HNO_2, HNO_3, NO_3, and N_2O_5. Circumstances in which homogeneous chemistry may assume particular importance are identified and include buildings with glass walls, indoor combustion sources, and direct emission of olefins.
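The core of such a model is a mass balance on each species. A minimal single-species sketch of that balance (ventilation supply, filtration on intake, heterogeneous wall loss, direct emission) can be integrated numerically; the paper's full model additionally couples many species through photolytic and thermal chemistry, and every parameter value below is a hypothetical placeholder, not from the study:

```python
# Minimal well-mixed single-species indoor box model (illustrative only;
# all parameter values are hypothetical, not taken from the paper).
def simulate_indoor_concentration(
    c_out=50.0,        # outdoor concentration (ppb)
    air_exchange=1.0,  # air changes per hour (Q/V)
    filter_eff=0.3,    # fractional removal by intake filtration
    deposition=0.2,    # heterogeneous wall-loss rate (1/h)
    emission=5.0,      # direct indoor emission (ppb/h)
    hours=24.0,
    dt=0.01,           # Euler time step (h)
):
    c, t = 0.0, 0.0
    while t < hours:
        supply = air_exchange * (1.0 - filter_eff) * c_out
        loss = (air_exchange + deposition) * c
        c += (supply + emission - loss) * dt
        t += dt
    return c

steady = simulate_indoor_concentration()
# analytic steady state: (a(1-f)*c_out + E) / (a + k_d) = 40/1.2 ppb
```

With these placeholder inputs the indoor level settles below the outdoor level, as expected when filtration and deposition add removal pathways beyond air exchange.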
Protection of Works of Art From Atmospheric Ozone
Assesses the colorfastness of organic colorants and watercolor pigments tested in atmospheric ozone. This is a summary of a full report from the Environmental Quality Laboratory, California Institute of Technology, Pasadena.
Centre Vortices in SU(3)
Copyright owned by the author(s) under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike Licence. We investigate the effectiveness of using smearing as a means to generate a preconditioning transformation for gauge fields prior to fixing to Maximal Centre Gauge. This still leaves the gauge-fixed field in the original gauge orbit. As expected, we find that this preconditioning leads to higher maxima of the gauge-fixing condition, resulting in lower numbers of P-vortices. We also find that removing vortices appears to give a loss of confinement for all cases, but that the string tension as measured from vortex-only configurations drops from about 65% to as low as 26% when using the preconditioning method.
Nonlinear pharmacokinetics of imipramine and its active metabolite, desipramine
Graduate school: Graduate School of Pharmaceutical Sciences, Chiba University. Degree: Doctor of Pharmacy, Chiba University (Otsu No. 80)
Implementing tradable permits for sulfur oxides emissions : a case study in the South Coast Air Basin
Tradable emissions permits have important theoretical advantages over source-specific technical standards as a means for controlling pollution. Nonetheless, difficulties can arise in trying to implement an efficient, competitive market in emissions permits. Simple workable versions of the market concept may fail to achieve the competitive equilibrium, or to take account of important complexities in the relationship between the pattern of emissions and the geographical distribution of pollution. Existing regulatory law may severely limit the range of market opportunities that states can adopt.
This report examines the feasibility of tradable permits for controlling particulate sulfates in the Los Angeles airshed. Although the empirical part of the paper deals with a specific case, the methods developed have general applicability. Moreover, the particular market design that is proposed -- an auction process that involves no net revenue collection by the state -- has attractive features as a general model.
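One way to make the zero-net-revenue idea concrete is a uniform-price auction in which all proceeds are rebated to sources in proportion to their initial permit holdings, so the state collects nothing on net. The sketch below is a hypothetical mechanism loosely in that spirit, not the report's actual design; the sources, bids, and holdings are illustrative:

```python
# Hypothetical zero-net-revenue uniform-price permit auction (a sketch
# in the spirit of the report's proposal, not its actual mechanism).
def clear_auction(bids, supply, holdings):
    """bids: list of (source, price, quantity); holdings: source -> initial permits.
    Returns (clearing_price, allocation, net_payment_by_source)."""
    allocated = {s: 0 for s in holdings}
    remaining, price = supply, 0.0
    # Serve bids from highest willingness-to-pay downward.
    for source, p, q in sorted(bids, key=lambda b: -b[1]):
        if remaining <= 0:
            break
        take = min(q, remaining)
        allocated[source] += take
        remaining -= take
        price = p  # uniform price = lowest accepted bid
    proceeds = price * (supply - remaining)
    total_held = sum(holdings.values())
    # Rebate all proceeds in proportion to initial holdings: net revenue is zero.
    net = {s: price * allocated[s] - proceeds * holdings[s] / total_held
           for s in holdings}
    return price, allocated, net

price, allocated, net = clear_auction(
    bids=[("A", 10.0, 40), ("B", 8.0, 40), ("C", 5.0, 40)],
    supply=60,
    holdings={"A": 20, "B": 20, "C": 20},
)
```

By construction the net payments sum to zero: high-value bidders pay, low-value sources receive, and permits flow to their highest-valued uses.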
A Multi-Code Analysis Toolkit for Astrophysical Simulation Data
The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/), an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structured adaptive mesh refinement (AMR) data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation, and topologically-connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.
Comment: 18 pages, 6 figures, emulateapj format. Resubmitted to Astrophysical Journal Supplement Series with revisions from referee. yt can be found at http://yt.enzotools.org
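Conceptually, the "projection" operation the abstract mentions integrates a field along a line of sight to produce a 2-D column image. The toy version below shows only that idea on a uniform grid with NumPy; yt's actual implementation handles AMR hierarchies, units, and parallelism, and the grid size and cell size here are arbitrary assumptions:

```python
import numpy as np

# Toy axis-aligned projection on a uniform grid (conceptual sketch only;
# yt's real projections operate on AMR data with proper unit handling).
def project(field, axis, cell_size):
    """Column integral of `field` along `axis`; returns a 2-D array."""
    return field.sum(axis=axis) * cell_size

rng = np.random.default_rng(0)
density = rng.random((16, 16, 16))            # toy 3-D density cube
column = project(density, axis=2, cell_size=0.1)  # 2-D column density
```

Summing cell values times the path length through each cell is the discrete analogue of the line-of-sight integral that a column-density map represents.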
Kinetics of submicron oleic acid aerosols with ozone: A novel aerosol mass spectrometric technique
The reaction kinetics of submicron oleic ((Z)-9-octadecenoic) acid aerosols with ozone was studied using a novel aerosol mass spectrometric technique. In the apparatus a flow of size-selected aerosols is introduced into a flow reactor where the particles are exposed to a known density of ozone for a controlled period of time. The aerosol flow is then directed into an aerosol mass spectrometer for particle size and composition analyses. Data from these studies were used to: (a) quantitatively model the size-dependent kinetics process, (b) determine the aerosol size change due to uptake of ozone, (c) assess reaction stoichiometry, and (d) obtain qualitative information about the volatility of the reaction products. The reactive uptake probability for ozone on oleic acid particles obtained from modeling is 1.6 (±0.2) × 10^(−3) with an upper limit for the reacto-diffusive length of ∼10 nm. Atmospheric implications of the results are discussed.
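The reported uptake probability can be turned into a surface flux with standard gas-kinetic theory, J = γ·c̄·n/4, where c̄ = √(8RT/πM) is the mean molecular speed. The sketch below uses the study's γ = 1.6 × 10⁻³; the ozone mixing ratio (100 ppb at 1 atm, 298 K) is an illustrative assumption, not a value from the abstract:

```python
import math

# Back-of-the-envelope collision-limited O3 flux to a particle surface,
# J = gamma * c_bar * n / 4. gamma is from the study; the ambient O3
# level below (100 ppb) is an assumed illustrative value.
R = 8.314        # gas constant, J/(mol K)
T = 298.0        # temperature, K
M_O3 = 0.048     # molar mass of O3, kg/mol
GAMMA = 1.6e-3   # reactive uptake probability (reported value)

def mean_speed(molar_mass, temperature=T):
    """Mean molecular speed from kinetic theory, m/s."""
    return math.sqrt(8.0 * R * temperature / (math.pi * molar_mass))

def uptake_flux(gamma, n_gas, molar_mass):
    """Molecules taken up per m^2 of particle surface per second."""
    return gamma * mean_speed(molar_mass) * n_gas / 4.0

n_o3 = 100e-9 * 2.46e25      # 100 ppb of air at ~1 atm, 298 K (molecules/m^3)
flux = uptake_flux(GAMMA, n_o3, M_O3)
```

Because γ is of order 10⁻³, only about one in a thousand gas-surface collisions leads to reaction, which is why the reacto-diffusive length stays small (∼10 nm).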
