Smart Columbus Grid Modernization: AMI Innovation Adoption
Course Code: ENR/AEDE 4567

Advanced Metering Infrastructure (AMI) is an integrated system of smart meters, communication networks, and data-management systems that enables two-way communication between utilities and customers. The technology is intended to improve efficiency, to identify and respond to outages more quickly, and to better monitor and control the distribution system. Consumers see the direct benefits of this technology only if they engage with the new features it provides. This analysis highlights the strengths and weaknesses of AMI from the consumer perspective, for marketing purposes, for the City of Columbus as an extension of the Smart Columbus initiatives, in order to capitalize on positive sustainable behavior change.

Smart Columbus; Columbus Division of Power; AEP Ohio

Academic Major: Environment, Economy, Development, and Sustainability
Sea-level Rise, Storm Surges, and Extreme Precipitation in Coastal New Hampshire: Analysis Of Past And Projected Trends
Probing Mechanical and Chemical Instabilities in Neutron-Rich Matter
The isospin-dependence of mechanical and chemical instabilities is
investigated within a thermal and nuclear transport model using a Skyrme-type
phenomenological equation of state for neutron-rich matter. Respective roles of
the nuclear mean field and the 2-body stochastic scattering on the evolution of
density and isospin fluctuations in either mechanically or chemically unstable
regions of neutron-rich matter are investigated. It is found that the mean
field overwhelmingly dominates the fast growth of both fluctuations, while the
2-body scattering influences significantly the later growth of the isospin
fluctuation only. The magnitude of both fluctuations decreases with the
increasing isospin asymmetry because of the larger reduction of the attractive
isoscalar mean field by the stronger repulsive neutron symmetry potential in
the more neutron-rich matter. Moreover, it is shown that the isospin
fractionation happens later, but grows faster in the more neutron-rich matter.
Implications of these results to current experiments exploring properties of
neutron-rich matter are discussed.

Comment: 18 pages & 15 figures; Nuclear Physics A (2001), in press.
Superpave5: What is It?
INDOT revised the asphalt specification in 2019 based on the concept of Superpave 5—that higher in-place density yields improved durability. This session will provide insight into the science of how Superpave 5 works while giving designers an understanding of how it affects design and construction specifications.
The Scintillator Upgrade of IceTop: Performance of the prototype array
The IceCube Collaboration plans to upgrade IceTop, the present surface
array, with scintillator detectors augmented by radio antennas. As one of
several goals, the scintillator detectors will be used to measure and mitigate
the effects of snow accumulation on the IceTop tanks: the increasing energy
threshold and efficiency loss are currently the largest sources of
systematic uncertainty in shower reconstruction and mass-composition
analysis. In addition, the upgrade will provide useful experience for the
development of next generation neutrino detectors proposed for the South Pole.
In the austral summer season 2017-2018, two full "stations" were installed near
the center of the IceTop array. Each station features custom-designed
electronics and consists of seven detectors, each having an active area of
1.5 m<sup>2</sup> of plastic scintillator and wavelength-shifting fibers read out by a
Silicon Photomultiplier. In this contribution we review the detector design and
performance, and show results from more than one year of operation of the
prototype stations. During that year several thousand air shower events have
been measured in coincidence with IceTop.

Comment: Presented at the 36th International Cosmic Ray Conference (ICRC 2019). See arXiv:1907.11699 for all IceCube contributions.
A model-model and data-model comparison for the early Eocene hydrological cycle
A range of proxy observations have recently provided constraints on how
Earth's hydrological cycle responded to early Eocene climatic changes.
However, comparisons of proxy data to general circulation model (GCM)
simulated hydrology are limited and inter-model variability remains poorly
characterised. In this work, we undertake an intercomparison of GCM-derived
precipitation and <i>P</i> − <i>E</i> distributions within the extended EoMIP ensemble
(Eocene Modelling Intercomparison Project; Lunt et al., 2012), which includes
previously published early Eocene simulations performed using five GCMs
differing in boundary conditions, model structure, and precipitation-relevant
parameterisation schemes.
<br><br>
We show that an intensified hydrological cycle, manifested in enhanced
global precipitation and evaporation rates, is simulated for all Eocene
simulations relative to preindustrial conditions. This is primarily due to elevated
atmospheric paleo-CO<sub>2</sub>, resulting in elevated temperatures, although the
effects of differences in paleogeography and ice sheets are also important
in some models. For a given CO<sub>2</sub> level, globally averaged precipitation rates
vary widely between models, largely arising from different simulated surface
air temperatures. Models with a similar global sensitivity of precipitation
rate to temperature (d<i>P</i>∕d<i>T</i>) display different regional precipitation responses
for a given temperature change. Regions that are particularly sensitive to
model choice include the South Pacific, tropical Africa, and the Peri-Tethys,
which may represent targets for future proxy acquisition.
<br><br>
A comparison of early and middle Eocene leaf-fossil-derived precipitation
estimates with the GCM output illustrates that GCMs generally underestimate
precipitation rates at high latitudes, although a possible seasonal bias of
the proxies cannot be excluded. Models which warm these regions, either via
elevated CO<sub>2</sub> or by varying poorly constrained model parameter values, are
most successful in simulating a match with geologic data. Further data from
low-latitude regions and better constraints on early Eocene CO<sub>2</sub> are now
required to discriminate between these model simulations given the large
error bars on paleoprecipitation estimates. Given the clear differences
between simulated precipitation distributions within the ensemble, our
results suggest that paleohydrological data offer an independent means by
which to evaluate model skill for warm climates.
Export of nutrient rich Northern Component Water preceded early Oligocene Antarctic glaciation
The onset of North Atlantic Deep Water formation is thought to have coincided with Antarctic ice-sheet growth about 34 million years ago (Ma). However, this timing is debated, in part due to questions over the geochemical signature of the ancient Northern Component Water (NCW) formed in the deep North Atlantic. Here we present detailed geochemical records from North Atlantic sediment cores located close to sites of deep-water formation. We find that prior to 36 Ma, the northwestern Atlantic was stratified, with nutrient-rich, low-salinity bottom waters. This restricted basin transitioned into a conduit for NCW that began flowing southwards approximately one million years before the initial Antarctic glaciation. The probable trigger was tectonic adjustments in subarctic seas that enabled an increased exchange across the Greenland–Scotland Ridge. The increasing surface salinity and density strengthened the production of NCW. The late Eocene deep-water mass differed in its carbon isotopic signature from modern values as a result of the leakage of fossil carbon from the Arctic Ocean. Export of this nutrient-laden water provided a transient pulse of CO2 to the Earth system, which perhaps caused short-term warming, whereas the long-term effect of enhanced NCW formation was a greater northward heat transport that cooled Antarctica.
A Nested Genetic Algorithm for Explaining Classification Data Sets with Decision Rules
Our goal in this paper is to automatically extract a set of decision rules
(rule set) that best explains a classification data set. First, a large set of
decision rules is extracted from a set of decision trees trained on the data
set. The rule set should be concise and accurate, with maximum coverage and a
minimum number of inconsistencies. This problem can be formalized as a modified
version of the weighted budgeted maximum coverage problem, known to be NP-hard.
To solve the combinatorial optimization problem efficiently, we introduce a
nested genetic algorithm which we then use to derive explanations for ten
public data sets.
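The selection step described above—choosing a bounded subset of candidate rules that maximizes coverage while penalizing inconsistencies—can be sketched with a simple genetic algorithm over bit masks. This is a minimal illustration, not the paper's actual implementation: the fitness weights, population settings, and toy data below are all assumptions.

```python
import random

def fitness(mask, rule_cover, rule_conflicts, budget):
    """Score a rule subset: examples covered minus an inconsistency penalty."""
    covered = set()
    conflicts = 0
    cost = 0
    for flag, cover, conf in zip(mask, rule_cover, rule_conflicts):
        if flag:
            covered |= cover       # union of examples this rule covers
            conflicts += conf      # inconsistencies contributed by this rule
            cost += 1
    if cost > budget:
        return -1.0                # infeasible: exceeds the rule-set budget
    return len(covered) - 0.5 * conflicts  # 0.5 is an assumed penalty weight

def genetic_rule_selection(rule_cover, rule_conflicts, budget,
                           pop_size=30, generations=60, seed=0):
    """Evolve 0/1 masks over the candidate rules toward a high-fitness subset."""
    rng = random.Random(seed)
    n = len(rule_cover)
    pop = [tuple(rng.randint(0, 1) for _ in range(n)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda m: fitness(m, rule_cover, rule_conflicts, budget),
                 reverse=True)
        survivors = pop[: pop_size // 2]       # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n)
            child = list(a[:cut] + b[cut:])    # one-point crossover
            child[rng.randrange(n)] ^= 1       # single-bit mutation
            children.append(tuple(child))
        pop = survivors + children
    return max(pop, key=lambda m: fitness(m, rule_cover, rule_conflicts, budget))
```

The nested structure in the paper refers to a GA inside a GA; the single-level loop here only illustrates the underlying budgeted-coverage search.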
Font Size and Presentation Rate's Influence on Participants' JOLs and Memory Performance
Peer user approval based binary whitelisting
Enterprises face challenges in monitoring execution of software binaries. This disclosure describes social voting for enterprise level binary whitelisting. Per techniques of this disclosure, a peer user driven approval process is utilized for binary whitelisting. At a time of launch of a binary that is not pre-approved, a user is provided with information associated with the binary and directed to the social voting process. The user designates a peer user and requests that the peer user approve execution of the binary. The peer user is provided with information about the requesting user and about the binary. Approval by the peer user can be used to enable local binary execution by the requesting user. If the peer user does not approve execution, the binary is flagged as blockable, and execution is denied.
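The approval flow above—pre-approved binaries run immediately, unknown binaries wait on a designated peer's vote—can be sketched as a small state machine. The class and method names below are illustrative assumptions, not part of the disclosure.

```python
from dataclasses import dataclass, field

PENDING, APPROVED, BLOCKED = "pending", "approved", "blocked"

@dataclass
class WhitelistService:
    """Tracks per-(user, binary) approval state for non-pre-approved binaries."""
    preapproved: set = field(default_factory=set)   # hashes approved enterprise-wide
    decisions: dict = field(default_factory=dict)   # (user, hash) -> (state, peer)

    def request_launch(self, user, binary_hash, peer):
        """User attempts to launch a binary, naming a peer to vote on it."""
        if binary_hash in self.preapproved:
            return APPROVED
        self.decisions[(user, binary_hash)] = (PENDING, peer)
        return PENDING

    def peer_decision(self, user, binary_hash, peer, approve):
        """The designated peer approves or denies the pending request."""
        state, expected_peer = self.decisions.get((user, binary_hash), (None, None))
        if state != PENDING or peer != expected_peer:
            raise ValueError("no matching pending request for this peer")
        new_state = APPROVED if approve else BLOCKED  # denied -> flagged blockable
        self.decisions[(user, binary_hash)] = (new_state, peer)
        return new_state

    def can_execute(self, user, binary_hash):
        """Approval enables local execution for the requesting user only."""
        if binary_hash in self.preapproved:
            return True
        state, _ = self.decisions.get((user, binary_hash), (None, None))
        return state == APPROVED
```

Keying decisions on the (user, binary) pair mirrors the disclosure's point that a peer's approval enables execution locally for the requesting user, not enterprise-wide.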
