High resolution radiometric measurements of convective storms during the GATE experiment
Using passive microwave data from the NASA CV-990 aircraft and radar data collected during the Global Atmospheric Research Program Atlantic Tropical Experiment (GATE), an empirical model was developed relating brightness temperatures sensed at 19.35 GHz to surface rainfall rates. This model agreed well with theoretical computations of the relationship between microwave radiation and precipitation in the tropics. The GATE aircraft microwave data were then used to determine the detailed structure of convective systems. The high spatial resolution of the data permitted identification of individual cells that retained unique identities throughout their lifetimes within larger cloud masses, and allowed analysis of the effects of cloud merger.
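To make the retrieval idea concrete, here is a minimal sketch of inverting a saturating brightness-temperature model for rain rate. The functional form and every coefficient (tbOcean, tbMax, k) are illustrative assumptions, not the empirical fit derived from the GATE data.

```typescript
// Illustrative inversion of a saturating brightness-temperature model,
// Tb(R) = tbMax - (tbMax - tbOcean) * exp(-k * R), for rain rate R (mm/h).
// All coefficients below are hypothetical stand-ins, not the paper's fit.
const tbOcean = 170; // background ocean brightness temperature (K), assumed
const tbMax = 270;   // saturation brightness temperature (K), assumed
const k = 0.2;       // saturation rate constant (per mm/h), assumed

function rainRateFromTb(tb: number): number {
  // Clamp into the physically meaningful range before inverting,
  // so the logarithm below is always defined.
  const clamped = Math.min(Math.max(tb, tbOcean), tbMax - 0.1);
  return -Math.log((tbMax - clamped) / (tbMax - tbOcean)) / k;
}

console.log(rainRateFromTb(200).toFixed(1)); // ~1.8 mm/h for Tb = 200 K
```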
Bit error rate measurement above and below bit rate tracking threshold
Bit error rate is measured by sending a pseudo-random noise (PRN) code test signal, simulating digital data, through the digital equipment to be tested. An incoming signal representing the response of the equipment under test, together with any added noise, is received and tracked by being compared with a locally generated PRN code. Once the locally generated PRN code matches the incoming signal, a tracking lock is obtained. The incoming signal is then integrated and compared bit by bit against the locally generated PRN code, and differences between the compared bits are counted as bit errors.
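A minimal sketch of the measurement scheme, assuming a 7-bit maximal-length LFSR as the PRN source and a reference generator that is already in lock; the taps, seed, and channel noise model are illustrative choices, not details of the patented apparatus.

```typescript
// A PRN sequence from a linear-feedback shift register stands in for the
// test signal; a noisy copy is compared bit-by-bit against the locally
// generated reference, and mismatches are counted as bit errors.
function* lfsr7(seed: number): Generator<number> {
  let state = seed & 0x7f; // 7-bit register, maximal-length taps at bits 7 and 6
  while (true) {
    const bit = ((state >> 6) ^ (state >> 5)) & 1;
    state = ((state << 1) | bit) & 0x7f;
    yield bit;
  }
}

const nBits = 100_000;
const flipProb = 0.001; // assumed channel bit-error probability

const tx = lfsr7(0x5a); // transmitted PRN test signal
const rx = lfsr7(0x5a); // locally generated reference, already in lock

let errors = 0;
for (let i = 0; i < nBits; i++) {
  let sent = tx.next().value as number;
  const ref = rx.next().value as number;
  if (Math.random() < flipProb) sent ^= 1; // the channel flips this bit
  if (sent !== ref) errors++;              // mismatch counted as a bit error
}
console.log(`BER estimate: ${(errors / nBits).toExponential(2)}`);
```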
Quantum computing with nearest neighbor interactions and error rates over 1%
Large-scale quantum computation will only be achieved if experimentally implementable quantum error correction procedures are devised that can tolerate experimentally achievable error rates. We describe a quantum error correction procedure that requires only a 2-D square lattice of qubits that can interact with their nearest neighbors, yet can tolerate quantum gate error rates over 1%. The precise maximum tolerable error rate depends on the error model, and we calculate values in the range 1.1-1.4% for various physically reasonable models. Even the lowest value represents the highest threshold error rate calculated to date in a geometrically constrained setting, and a 50% improvement over the previous record.
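To illustrate what operating below such a threshold buys, here is a small sketch using a heuristic scaling often quoted for distance-d surface codes, p_L ≈ A(p/p_th)^⌈d/2⌉. This formula is not from the abstract: the prefactor A is an assumption, and the threshold is set to the paper's lower estimate of 1.1%.

```typescript
// Back-of-envelope heuristic: logical error rate of a distance-d code
// scales roughly as A * (p / pTh)^ceil(d / 2) below threshold.
const pTh = 0.011; // the paper's lower threshold estimate, 1.1%
const A = 0.1;     // assumed prefactor, for illustration only

function logicalErrorRate(p: number, d: number): number {
  return A * Math.pow(p / pTh, Math.ceil(d / 2));
}

for (const d of [3, 7, 11, 15]) {
  console.log(`d=${d}: p_L ≈ ${logicalErrorRate(0.005, d).toExponential(2)}`);
}
// With p = 0.5% < 1.1%, increasing d suppresses logical errors exponentially.
```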
A joint model for vehicle type and fuel type choice: evidence from a cross-nested logit study
In the face of growing concerns about greenhouse gas emissions, there is increasing interest in forecasting the likely demand for alternative fuel vehicles. This paper presents an analysis of stated preference survey data on California consumer responses to a joint vehicle type choice and fuel type choice experiment. Our study recognises that this choice process potentially involves high correlations that an analyst may not be able to adequately represent in the modelled utility components. We further hypothesise that a cross-nested logit structure can capture more of the correlation patterns than the standard nested logit structure in such a multi-dimensional choice process. Our empirical analysis and a brief forecasting exercise produce evidence to support these assertions. The implications of these findings extend beyond the context of the demand for alternative fuel vehicles to the analysis of multi-dimensional choice processes in general. Finally, an extension verifies that further gains can be made by using mixed GEV structures, allowing for random heterogeneity in addition to the flexible correlation structures.
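For readers unfamiliar with the cross-nested logit, the sketch below computes standard CNL choice probabilities, in which each alternative can be allocated across several nests. The utilities, allocation weights, and nest scale parameters are invented for illustration and are not estimates from the paper's survey data.

```typescript
// Cross-nested logit (CNL) choice probabilities:
// P(i) = sum_m [ S_m^lam_m / sum_n S_n^lam_n ] * [ a_mi^(1/lam_m) e^(V_i/lam_m) / S_m ],
// where S_m = sum_j a_mj^(1/lam_m) e^(V_j/lam_m).
function cnlProbabilities(
  V: number[],       // systematic utilities, one per alternative
  alpha: number[][], // alpha[m][i]: allocation of alternative i to nest m
  lambda: number[],  // lambda[m]: nest scale parameter in (0, 1]
): number[] {
  // Per-nest inclusive sums S_m.
  const S = lambda.map((lam, m) =>
    V.reduce((s, v, i) => s + Math.pow(alpha[m][i], 1 / lam) * Math.exp(v / lam), 0),
  );
  const denom = S.reduce((s, sm, m) => s + Math.pow(sm, lambda[m]), 0);
  return Array.from({ length: V.length }, (_, i) =>
    lambda.reduce((p, lam, m) => {
      const term = Math.pow(alpha[m][i], 1 / lam) * Math.exp(V[i] / lam);
      return p + (Math.pow(S[m], lam) / denom) * (term / S[m]);
    }, 0),
  );
}

// Two hypothetical nests (say, a body-type nest and a fuel-type nest)
// sharing one alternative; allocation weights sum to 1 per alternative.
const probs = cnlProbabilities(
  [0.5, 0.2, -0.1],
  [[0.7, 0.3, 0.0], [0.3, 0.7, 1.0]],
  [0.5, 0.8],
);
console.log(probs.map(p => p.toFixed(3)).join(" ")); // probabilities sum to 1
```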
Topological code Autotune
Many quantum systems are being investigated in the hope of building a large-scale quantum computer. All of these systems suffer from decoherence, resulting in errors during the execution of quantum gates. Quantum error correction enables reliable quantum computation given unreliable hardware. Unoptimized topological quantum error correction (TQEC), while still effective, performs very suboptimally, especially at low error rates. Hand-optimizing the classical processing associated with a TQEC scheme for a specific system to achieve better error tolerance can be extremely laborious. We describe a tool, Autotune, capable of performing this optimization automatically, and give two highly distinct examples of its use in which it greatly outperforms unoptimized TQEC. Autotune is designed to facilitate the precise study of real hardware running TQEC, with every quantum gate having a realistic, physics-based error model.
Modelling Organic Dairy Production Systems
In this study, a large number of organic dairy production strategies were compared in terms of physical and financial performance through the integrated use of computer simulation models and organic case study farm data. Production and financial data from three organic case study farms were used as a basis for the modelling process, to ensure that the modelled systems were based on real sets of resources that might be available to a farmer. The case study farms were selected to represent a range of farming systems in terms of farm size, concentrate use and location. This paper describes the process used to model the farm systems: the integration of the three models used, and the use of indicators to assess the modelled farm systems in terms of physical sustainability and financial performance.
Refactoring Legacy JavaScript Code to Use Classes: The Good, The Bad and The Ugly
JavaScript systems are becoming increasingly complex and large. To tackle the challenges involved in implementing these systems, the language is evolving to include several constructions for programming-in-the-large. For example, although the language is prototype-based, the latest JavaScript standard, named ECMAScript 6 (ES6), provides native support for implementing classes. Even though most modern web browsers support ES6, only a few applications use the class syntax. In this paper, we analyze the process of migrating structures that emulate classes in legacy JavaScript code to adopt the new syntax for classes introduced by ES6. We apply a set of migration rules on eight legacy JavaScript systems. In our study, we document: (a) cases that are straightforward to migrate (the good parts); (b) cases that require manual and ad-hoc migration (the bad parts); and (c) cases that cannot be migrated due to limitations and restrictions of ES6 (the ugly parts). Six out of eight systems (75%) contain instances of bad and/or ugly cases. We also collect the perceptions of JavaScript developers about migrating their code to use the new syntax for classes.
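To make the migration concrete, the sketch below (written in TypeScript so the legacy pattern type-checks) shows a class emulated with a constructor function and prototype assignments rewritten with ES6 class syntax. The Stack example is invented for illustration and is far simpler than the cases the paper's migration rules handle.

```typescript
// Legacy style: a "class" emulated with a constructor function
// and methods attached to its prototype.
function Stack(this: { items: unknown[] }) {
  this.items = [];
}
Stack.prototype.push = function (this: { items: unknown[] }, item: unknown) {
  this.items.push(item);
};
Stack.prototype.pop = function (this: { items: unknown[] }) {
  return this.items.pop();
};

// Migrated: the same structure expressed with ES6 class syntax,
// a mechanical rewrite (one of the "good parts").
class StackES6 {
  private items: unknown[] = [];
  push(item: unknown): void { this.items.push(item); }
  pop(): unknown { return this.items.pop(); }
}

const legacy = new (Stack as any)();
legacy.push(42);
console.log(legacy.pop());          // 42
const modern = new StackES6();
modern.push(42);
console.log(modern.pop());          // 42, identical behavior
```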
