Some results on contractive mappings as related to pattern recognition
Several of the techniques used in pattern recognition are reformulated as the problem of determining fixed points of a function. If x₀ is a fixed point of f and if f is contractive at x₀, then, for any y belonging to a sufficiently small neighborhood of x₀, the orbit of y will converge to x₀. Several general results regarding contractive mappings are developed, with emphasis on functions
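A minimal numerical illustration of this convergence (our example, not the paper's): f(x) = cos(x) is contractive near its fixed point, so iterating the orbit from a nearby starting point converges to it.

```python
import math

def orbit_limit(f, y, tol=1e-12, max_iter=1000):
    """Follow the orbit y, f(y), f(f(y)), ... until successive values agree."""
    for _ in range(max_iter):
        y_next = f(y)
        if abs(y_next - y) < tol:
            return y_next
        y = y_next
    raise RuntimeError("orbit did not converge")

# f(x) = cos(x) satisfies |f'(x)| = |sin(x)| < 1 near its fixed point, so any
# starting point in a neighbourhood converges to x0 ~ 0.7390851.
print(orbit_limit(math.cos, 1.0))
```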
Fear of model misspecification and the robustness premium
Robust decision making implies welfare costs, or robustness premia, when the approximating model is the true data generating process. To examine the importance of these premia at the aggregate level, we employ a simple two-sector dynamic general equilibrium model with human capital and introduce an additional form of precautionary behavior. The latter arises from the robust decision maker's ability to reduce the effects of model misspecification by allocating time and existing human capital to this end. We find that the extent of the robustness premia critically depends on the productivity of time relative to that of human capital. When the relative efficiency of time is low, despite transitory welfare costs, there are gains from following robust policies in the long run. In contrast, high relative productivity of time implies misallocation costs that remain even in the long run. Finally, depending on the technology used to reduce model uncertainty, we find that while increasing the fear of model misspecification leads to a net increase in precautionary behavior, investment and output can fall.
Micro-geographic risk factors for malarial infection.
BACKGROUND: Knowledge of geography is integral to the study of insect-borne infectious disease such as malaria. This study was designed to evaluate whether geographic parameters are associated with malarial infection in the East Sepik province of Papua New Guinea (PNG), a remote area where malaria is a major cause of morbidity and mortality.
METHODS: A global positioning system (GPS) unit was used at each village to collect elevation, latitude and longitude data. Concurrently, a sketch map of each village was generated and the villages were sub-divided into regions of roughly equal populations. Blood samples were taken from subjects in each region using filter paper collection. The samples were later processed using nested PCR for qualitative determination of malarial infection. The area was mapped using the GPS information and overlaid with prevalence data. Data tables were examined using traditional chi-square statistical techniques. A logistic regression analysis was then used to determine the significance of geographic risk factors including elevation, distance from administrative centre, and village of residence.
RESULTS: Three hundred and thirty-two samples were included (24% of the total estimated population). Ninety-six were positive, yielding a prevalence of 29%. Chi-square testing within each village found a non-random distribution of cases across sub-regions (p < 0.05). Multivariate logistic regression techniques suggested malarial infection changed with elevation (OR = 0.64 per 10 m, p < 0.05) and distance from administrative centre (OR = 1.3 per 100 m, p < 0.05).
CONCLUSION: These results suggest that malarial infection is significantly and independently associated with lower elevation and greater distance from administrative centre in a rural area in PNG. This type of analysis can provide information that may be used to target specific areas in developing countries for malaria prevention and treatment.
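A sketch of the kind of multivariate logistic regression described; the data and column names below are hypothetical stand-ins, with predictors rescaled so the fitted odds ratios read in the study's units (per 10 m of elevation, per 100 m of distance).

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical data standing in for the 332 field samples (not the study's data).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "infected":   rng.binomial(1, 0.29, 332),   # nested-PCR result (0/1)
    "elevation":  rng.uniform(20, 120, 332),    # metres above sea level
    "dist_admin": rng.uniform(0, 2000, 332),    # metres from admin centre
})

# Rescale predictors so the coefficients correspond to the reported units.
X = sm.add_constant(pd.DataFrame({
    "elev_10m":  df["elevation"] / 10.0,
    "dist_100m": df["dist_admin"] / 100.0,
}))
fit = sm.Logit(df["infected"], X).fit(disp=False)
print(np.exp(fit.params))  # odds ratios; the study reports OR = 0.64 and OR = 1.3
```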
On the Cyclicality and Stability of Real Earnings
We show in this paper that important insights into the cyclical behaviour of wages can be gained by dividing (real) average hourly earnings into their straight-time hourly wage and overtime components. Our motivation is based on the idea of employment-contingent contracts. BLS published and unpublished statistics are used to decompose average earnings into (i) the straight-time wage rate, (ii) the 'mark-up' needed to achieve an overtime worker's earnings rate, and (iii) the proportion of workers working overtime. Using monthly manufacturing data from 1962–1997, cyclicality measures of these components are based on contemporaneous bivariate correlations using four alternative detrending methods, while stability is examined using recursive estimation and testing methods. While the wage rate is generally acyclical and unstable, the other two components are highly pro-cyclical and relatively stable.
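The cyclicality measure can be sketched as follows; the synthetic series and the Hodrick-Prescott filter (one plausible choice, since the paper's four detrending methods are not named here) are our assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Synthetic monthly series standing in for the 1962-1997 BLS data.
rng = np.random.default_rng(0)
n = 432  # months
employment = np.cumsum(rng.normal(0.1, 1.0, n))  # business-cycle proxy
wage_rate = np.cumsum(rng.normal(0.1, 1.0, n))   # straight-time wage component

# Detrend each series; lamb = 129600 is a common monthly HP-filter setting.
wage_cycle, _ = sm.tsa.filters.hpfilter(wage_rate, lamb=129600)
emp_cycle, _ = sm.tsa.filters.hpfilter(employment, lamb=129600)

# Contemporaneous bivariate correlation between the cyclical components.
print(np.corrcoef(wage_cycle, emp_cycle)[0, 1])
```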
Rural Child Care in Missouri: How to Improve it
During 1997 and 1998, a large study called Project REACH (Rural EArly CHildhood Professional Development Initiative) was conducted in a series of interventions over a 16-month period in rural Missouri. The training and follow-up were intensive, continuous and individualized. An overview of the results is provided.
Real Wages and the Cycle: The View from the Frequency Domain
In the time domain, the observed cyclical behavior of the real wage hides a range of economic influences that give rise to cycles of differing lengths and amplitudes. This may serve to produce a distorted picture of wage cyclicality. Here, we employ frequency domain methods that allow us to decompose wages into cyclical components and to assess the relative contribution of each component. These are discussed in relation to wages alone (the univariate case) and to wages in relation to production or employment-based measures of the cycle (the multivariate case). In the multivariate dimension, we derive methods for determining whether (i) wage and business cycles cohere, (ii) lead-lag or contemporaneous relationships exist, and (iii) the degree of coherency between wage and business cycles is time-dependent. We establish that real wages are strongly procyclical and that the business cycle is the dominant associated influence.
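As a toy version of the coherency question, scipy's standard spectral estimate can stand in for the paper's frequency domain machinery; the series below are synthetic, not the paper's data.

```python
import numpy as np
from scipy.signal import coherence

# Synthetic monthly series: a shared 5-year "business cycle" plus noise.
rng = np.random.default_rng(1)
t = np.arange(480)                      # 40 years of months
cycle = np.sin(2 * np.pi * t / 60)      # 60-month (5-year) cycle
wages = cycle + 0.5 * rng.normal(size=t.size)
output = cycle + 0.5 * rng.normal(size=t.size)

# Squared coherency by frequency: values near 1 mean the series cohere at
# that frequency (fs=12 samples/year puts f in cycles per year).
f, Cxy = coherence(wages, output, fs=12, nperseg=120)
print(f[np.argmax(Cxy)], Cxy.max())     # peaks near 0.2 cycles/year
```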
Developing and Researching PhET simulations for Teaching Quantum Mechanics
Quantum mechanics is difficult to learn because it is counterintuitive, hard to visualize, mathematically challenging, and abstract. The Physics Education Technology (PhET) Project, known for its interactive computer simulations for teaching and learning physics, now includes 18 simulations on quantum mechanics designed to improve learning of this difficult subject. Our simulations include several key features to help students build mental models and intuitions about quantum mechanics: visual representations of abstract concepts and microscopic processes that cannot be directly observed, interactive environments that directly couple students' actions to animations, connections to everyday life, and efficient calculations so students can focus on the concepts rather than the math. Like all PhET simulations, these are developed using the results of education research and feedback from educators, and are tested in student interviews and classroom studies. This article provides an overview of the PhET quantum simulations and their development. We also describe research demonstrating their effectiveness and share some insights about student thinking that we have gained from our research on quantum simulations.
Chemical climatology: a case study for ozone
In 1872, the Scottish chemist Robert Angus Smith established the basis of ‘chemical climatology’, explicitly designed to assess the human health impact of the ‘man-made climates’ in cities. Since then, usage of chemical climatology has been sporadic. However, with large volumes of atmospheric composition data available from campaign measurements, monitoring and modelling, as well as pollutant impact studies, an updated framework based on Angus Smith's principles would be a useful resource for both scientists and policy makers. By analogy with the use of the term climate in other areas (e.g. meteorological or political), a modern chemical climatology framework is described, highlighting impact-focused principles. To derive the chemical climatology, the impact of atmospheric composition is first identified (e.g. damage to human health). The impact is then linked to the state of atmospheric composition in time and space (e.g. ozone concentrations in the UK, 1990–2010). Finally, the drivers of the state are assessed (e.g. emissions, chemical background, chemical precursors, meteorology).
Two chemical climates are presented: ozone–human health and ozone–vegetation. The chemical climates are derived from measurements at the two UK European Monitoring and Evaluation Programme (EMEP) monitoring ‘supersites’: Auchencorth Moss and Harwell. The impacts of O3 on human health and on vegetation are assessed using the SOMO35 and AOT40 metrics, respectively. Drivers of significant spatial variation in these impacts across the UK, and of temporal changes at Harwell between 1990 and 2011, are discussed, as well as the relative importance of hemispheric, regional and local O3 chemical processing and its precursors. The individual site assessments are placed in regional context through the statistical evaluation of O3 variation across Europe.
The chemical climatology framework allows the integration of individual scientific studies focussing on specific processes within the impact, state and driver space into a synthesised and more general understanding. This approach provides opportunities for developing an understanding of the multiple impacts considered for each chemical component, allowing identification of common drivers of impacts and, potentially, holistically considered mitigation strategies.
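For concreteness, here is a sketch of the two impact metrics as commonly defined: SOMO35 sums the daily maximum 8-hour running-mean ozone above 35 ppb, and AOT40 sums daylight-hour exceedances above 40 ppb over May-July. Window and unit conventions vary by protocol, and this is not the authors' code.

```python
import numpy as np
import pandas as pd

def somo35(hourly_o3):
    """Annual sum of the daily maximum 8-hour running-mean O3 above 35 ppb."""
    mda8 = hourly_o3.rolling(8).mean().resample("D").max()
    return (mda8 - 35.0).clip(lower=0).sum()        # ppb-days

def aot40(hourly_o3):
    """Sum of hourly exceedances above 40 ppb, daylight hours, May-July."""
    daylight = hourly_o3.between_time("08:00", "20:00")
    season = daylight[daylight.index.month.isin([5, 6, 7])]
    return (season - 40.0).clip(lower=0).sum()      # ppb-hours

# Hypothetical hourly ozone record for one year (ppb).
idx = pd.date_range("2010-01-01", "2010-12-31 23:00", freq="h")
o3 = pd.Series(30 + 15 * np.random.default_rng(2).random(len(idx)), index=idx)
print(somo35(o3), aot40(o3))
```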
Minimum Decision Cost for Quantum Ensembles
For a given ensemble of independent and identically prepared particles, we calculate the binary decision costs of different strategies for measurement of polarised spin-1/2 particles. The result proves that, for any given values of the prior probabilities and any number of constituent particles, the cost for a combined measurement is always less than or equal to that for any combination of separate measurements upon sub-ensembles. The Bayes cost, which is that associated with the optimal strategy (i.e., a combined measurement), is obtained in a simple closed form.
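The flavour of the result can be reproduced with the standard Helstrom bound on binary state discrimination; the sketch below (our illustration, not the paper's closed form) evaluates the minimum error probability for combined measurements on N copies of two spin-1/2 pure states, and the error indeed shrinks with N.

```python
import numpy as np

def helstrom_error(rho0, rho1, p0):
    """Minimum Bayes error for discriminating rho0 vs rho1 with priors (p0, 1-p0):
    P_e = (1 - ||(1-p0)*rho1 - p0*rho0||_1) / 2."""
    gamma = (1.0 - p0) * rho1 - p0 * rho0
    return 0.5 * (1.0 - np.abs(np.linalg.eigvalsh(gamma)).sum())

def pure_state(theta):
    """Density matrix of a spin-1/2 pure state at Bloch polar angle theta."""
    psi = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    return np.outer(psi, psi)

# A combined measurement on N copies acts on the tensor-power states; its cost
# is never worse than separate measurements on sub-ensembles.
rho0, rho1 = pure_state(0.0), pure_state(np.pi / 3)
r0, r1 = rho0, rho1
for N in (1, 2, 3):
    print(N, helstrom_error(r0, r1, p0=0.5))
    r0, r1 = np.kron(r0, rho0), np.kron(r1, rho1)
```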
Logic gates at the surface code threshold: Superconducting qubits poised for fault-tolerant quantum computing
A quantum computer can solve hard problems - such as prime factoring, database searching, and quantum simulation - at the cost of needing to protect fragile quantum states from error. Quantum error correction provides this protection by distributing a logical state among many physical qubits via quantum entanglement. Superconductivity is an appealing platform, as it allows for constructing large quantum circuits and is compatible with microfabrication. For superconducting qubits, the surface code is a natural choice for error correction, as it uses only nearest-neighbour coupling and rapidly cycled entangling gates. The gate fidelity requirements are modest: the per-step fidelity threshold is only about 99%. Here, we demonstrate a universal set of logic gates in a superconducting multi-qubit processor, achieving an average single-qubit gate fidelity of 99.92% and a two-qubit gate fidelity up to 99.4%. This places Josephson quantum computing at the fault-tolerant threshold for surface code error correction. Our quantum processor is a first step towards the surface code, using five qubits arranged in a linear array with nearest-neighbour coupling. As a further demonstration, we construct a five-qubit Greenberger-Horne-Zeilinger (GHZ) state using the complete circuit and full set of gates. The results demonstrate that Josephson quantum computing is a high-fidelity technology, with a clear path to scaling up to large-scale, fault-tolerant quantum circuits.
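A toy statevector sketch (ours, not the experiment) of the GHZ construction: a Hadamard on the first qubit followed by a chain of nearest-neighbour CNOTs, matching the linear-array connectivity described.

```python
import numpy as np
from functools import reduce

n = 5
I = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

def on_qubits(gate, start, width):
    """Embed a gate acting on `width` adjacent qubits from `start` into n qubits."""
    ops = [I] * start + [gate] + [I] * (n - start - width)
    return reduce(np.kron, ops)

state = np.zeros(2 ** n)
state[0] = 1.0                              # |00000>
state = on_qubits(H, 0, 1) @ state          # Hadamard on qubit 0
for q in range(n - 1):                      # nearest-neighbour CNOT chain
    state = on_qubits(CNOT, q, 2) @ state

# GHZ state: equal superposition of |00000> and |11111>.
print(state[0], state[-1])                  # both ~ 0.7071
```

Each CNOT only couples adjacent qubits, which is the same constraint the surface code imposes on the hardware.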