Using bootstrap to assess uncertainties of VLBI results I. The method and image-based errors
Very Long Baseline Interferometric (VLBI) observations of quasar jets enable
one to measure many theoretically expected effects. Estimating the significance
of observational findings is complicated by the correlated noise in the image
plane. A reliable and well justified approach to estimate the uncertainties of
VLBI results is needed as well as significance testing criteria. We propose to
use bootstrap for both tasks. Using simulations we find that bootstrap-based
errors for the full intensity, rotation measure, and spectral index maps have
coverage closer to the nominal values than conventionally obtained errors. The
proposed method naturally takes into account heterogeneous interferometric
arrays (such as Space VLBI) and can be easily extended to account for
instrumental calibration factors.
Comment: Accepted for publication in MNRAS. 11 pages, 16 figures.
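The core of the proposal is the classic bootstrap loop: resample the data, redo the imaging, and take the spread of the resulting maps as the error estimate. The paper's actual pipeline works on calibrated visibilities and CLEAN models; the snippet below is only a minimal sketch of that loop, with a hypothetical `reimage` callable standing in for the imaging step.

```python
import numpy as np

def bootstrap_pixel_errors(observed, model, reimage, n_boot=200, seed=0):
    """Per-pixel image errors from bootstrapping residual visibilities.

    observed, model -- complex visibility arrays of the same shape
    reimage         -- callable mapping a visibility array to an image array
                       (a stand-in for the CLEAN/imaging step)
    """
    rng = np.random.default_rng(seed)
    residuals = observed - model
    images = []
    for _ in range(n_boot):
        # Resample residuals with replacement, add them back to the model,
        # and re-image the perturbed data set.
        idx = rng.integers(0, residuals.size, residuals.size)
        images.append(reimage(model + residuals[idx]))
    # The per-pixel scatter over replications is the bootstrap error map.
    return np.stack(images).std(axis=0)
```

The same scatter, computed over derived maps (spectral index, rotation measure) rather than raw intensity, yields the errors whose coverage the paper tests against simulations.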
Zero-Parity Stabbing Information
Everett et al. introduced several varieties of stabbing information for the
lines determined by pairs of vertices of a simple polygon P, and established
their relationships to vertex visibility and other combinatorial data. In the
same spirit, we define the ``zero-parity (ZP) stabbing information'' to be a
natural weakening of their ``weak stabbing information,'' retaining only the
distinction among {zero, odd, even>0} in the number of polygon edges stabbed.
Whereas the weak stabbing information's relation to visibility remains an open
problem, we completely settle the analogous questions for zero-parity
information, with three results: (1) ZP information is insufficient to
distinguish internal from external visibility graph edges; (2) but it does
suffice for all polygons that avoid a certain complex substructure; and (3) the
natural generalization of ZP information to the continuous case of smooth
curves does distinguish internal from external visibility.
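The zero-parity alphabet is easy to make concrete. The paper's precise stabbing definition and its treatment of degenerate incidences are Everett et al.'s; the sketch below simply counts, for each vertex pair, the polygon edges whose endpoints lie strictly on opposite sides of the line through the pair, then collapses the count to {zero, odd, even>0}.

```python
from itertools import combinations

def orient(a, b, c):
    """Sign of the cross product (b - a) x (c - a)."""
    v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
    return (v > 0) - (v < 0)

def zp_class(k):
    """Collapse a stabbing count into the zero-parity alphabet."""
    if k == 0:
        return "zero"
    return "odd" if k % 2 else "even>0"

def zp_information(poly):
    """For each vertex pair of a simple polygon (list of (x, y) tuples),
    count the edges properly crossed by the line through the two vertices,
    i.e. edges whose endpoints lie strictly on opposite sides of it."""
    n = len(poly)
    edges = [(poly[i], poly[(i + 1) % n]) for i in range(n)]
    info = {}
    for i, j in combinations(range(n), 2):
        p, q = poly[i], poly[j]
        k = sum(orient(p, q, a) * orient(p, q, b) < 0 for a, b in edges)
        info[(i, j)] = zp_class(k)
    return info
```

Edges with an endpoint exactly on the line are ignored here; a careful implementation would handle such degeneracies explicitly.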
Getting sick and paying for it
In certain situations, Americans who become chronically ill have to pay higher rates to continue their health insurance coverage. Indeed, although the majority of Americans are insured, hardly anyone is fully protected against the risk that their next insurance policy will cost considerably more than their current one.
Keywords: Insurance, Health
Quantitative Analysis of Health Insurance Reform: Separating Community Rating from Income Redistribution
Two key components of the upcoming health reform are a reorganization of the individual health insurance market and an increase in income redistribution in the economy. Which component contributes more to the welfare outcome of the reform? We address this question by constructing a general equilibrium life cycle model that incorporates both medical expense and labor income risks. We replicate the key features of the current health insurance system in the U.S. and calibrate the model using the Medical Expenditure Panel Survey dataset. We find that the reform reduces the number of uninsured by more than a factor of four. It also brings significant welfare gains, equivalent to almost one percent of annual consumption. However, these welfare gains mostly come from the redistributive measures embedded in the reform. If the reform only reorganizes the individual market and introduces an individual mandate but does not include any income-based transfers, the welfare gains are much smaller. This result is driven mostly by the fact that most uninsured people have low incomes: the high burden of health insurance premiums on this group is relieved far more by income-based measures than by the new rules in the individual market.
Keywords: health insurance, health reform, risk sharing, general equilibrium
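The modeling ingredients named in the abstract (a life cycle, medical expense risk, a consumption-saving choice) can be illustrated with a toy dynamic program. This is a deliberately stripped-down stand-in, not the paper's general equilibrium model, and every parameter below is illustrative rather than the paper's MEPS-based calibration.

```python
import numpy as np

# Toy finite-horizon consumption-saving problem with an i.i.d. medical
# expense shock: V_t(a) = E_m max_{a'} u((1+r)a + y - m - a') + beta V_{t+1}(a')
T, beta, r, income = 10, 0.96, 0.04, 1.0
grid = np.linspace(0.0, 10.0, 201)            # asset grid
shocks = [(0.0, 0.8), (1.5, 0.2)]             # (medical expense, probability)

def u(c):
    return np.log(c) if c > 1e-9 else -1e12   # log utility, penalize c <= 0

V = np.zeros(len(grid))                       # terminal value V_T = 0
for t in range(T):
    newV = np.empty(len(grid))
    for i, a in enumerate(grid):
        ev = 0.0
        for m, p in shocks:                   # expectation over the shock
            cash = (1 + r) * a + income - m
            # choose next-period assets a' < cash so consumption is positive
            vals = [u(cash - ap) + beta * V[j]
                    for j, ap in enumerate(grid) if ap < cash]
            ev += p * (max(vals) if vals else u(0))
        newV[i] = ev
    V = newV
```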
Accounting for non-annuitization
Why don't people buy annuities? Several explanations have been provided in the previous literature: a large fraction of preannuitized wealth in retirees' portfolios; adverse selection; bequest motives; and medical expense uncertainty. This paper uses a quantitative model to assess the importance of these impediments to annuitization and also studies three newer explanations: the government safety net of means-tested transfers; the illiquidity of housing wealth; and restrictions on the minimum amount of investment in annuities. The paper shows that, quantitatively, the last three explanations play a large role in reducing annuity demand. The minimum consumption floor turns out to be important in explaining the lack of annuitization, especially for people in the lower income quintiles, who are well insured by this provision. The minimum annuity purchase requirement involves a large upfront investment and is binding for many, especially if housing wealth cannot be easily annuitized. Among the traditional explanations, preannuitized wealth makes the largest quantitative contribution to the annuity puzzle.
Keywords: Accounting
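As a reference point for the annuity puzzle, the actuarially fair price of a life annuity paying one unit per year is the survival-weighted discounted sum of its payments, P = Σ_t S_t / (1 + r)^t. The sketch below computes this with a made-up survival curve and interest rate; the impediments the paper studies (the consumption floor, purchase minima, illiquid housing) all act to make even this fair price unattractive for parts of the population.

```python
import numpy as np

r = 0.03
# Toy cumulative survival probabilities to each future year (illustrative).
surv = np.cumprod(np.linspace(0.99, 0.60, 35))
price = sum(s / (1 + r) ** (t + 1) for t, s in enumerate(surv))
print(f"fair price of a 1-per-year life annuity: {price:.2f}")
```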
Influence of physico-mechanical properties on the choice of metallurgical slag processing technology
The physical and mechanical properties of slags after primary processing – thermal (thermal shock) and mechanical action on the melt – are analyzed, together with the tendency of dump slags to disintegrate through structural transformations and the possible kinds of mechanical processing for obtaining different types of slag products.
A technological scheme for processing dump slags (also applicable to slags of current production) is presented, with extraction of the metal and production of a wide range of slag products, including cementitious materials.
Vulnerable Open Source Dependencies: Counting Those That Matter
BACKGROUND: Vulnerable dependencies are a known problem in today's
open-source software ecosystems because OSS libraries are highly interconnected
and developers do not always update their dependencies. AIMS: In this paper we
present a precise methodology that combines code-based analysis of patches
with information on build, test, and update dates, and on dependency groups,
extracted from the code repository itself, and thereby caters to the needs of
industrial practice for the correct allocation of development and audit
resources. METHOD: To
understand the industrial impact of the proposed methodology, we considered the
200 most popular OSS Java libraries used by SAP in its own software. Our
analysis included 10905 distinct GAVs (group, artifact, version) when
considering all the library versions. RESULTS: We found that about 20% of the
dependencies affected by a known vulnerability are not deployed and therefore
pose no danger to the analyzed library, because they cannot be exploited in
practice. Developers of the analyzed libraries are able to fix (and are
actually responsible for) 82% of the deployed vulnerable dependencies. The
vast majority (81%) of vulnerable dependencies can be fixed simply by updating
to a newer version, while 1% of the vulnerable dependencies in our sample are
halted and therefore potentially require a costly mitigation strategy.
CONCLUSIONS: Our case study shows that correct counting allows software
development companies to receive actionable information about their library
dependencies and thus to allocate costly development and audit resources
correctly; such resources are spent inefficiently when measurements are
distorted.
Comment: This is a pre-print of the paper that appears, with the same title, in the proceedings of the 12th International Symposium on Empirical Software Engineering and Measurement, 2018.
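The counting scheme in the abstract reduces to a small triage: a vulnerable GAV only matters if it is actually deployed, and a deployed one is either fixable by updating or "halted" with no fixed release. The sketch below mirrors that logic on toy data; the field names and example coordinates are illustrative, not part of the paper's or SAP's tooling.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Dependency:
    gav: str                      # "group:artifact:version"
    vulnerable: bool
    deployed: bool                # shipped at runtime, not build/test-only
    fixed_version: Optional[str]  # earliest non-vulnerable release, if any

def triage(deps):
    """Split vulnerable dependencies into the abstract's three buckets."""
    vuln = [d for d in deps if d.vulnerable]
    return {
        "not_deployed": [d for d in vuln if not d.deployed],
        "update_fixes": [d for d in vuln if d.deployed and d.fixed_version],
        "halted":       [d for d in vuln if d.deployed and not d.fixed_version],
    }

deps = [
    Dependency("org.example:lib-a:1.2", True, False, "1.3"),
    Dependency("org.example:lib-b:2.0", True, True, "2.1"),
    Dependency("org.example:lib-c:0.9", True, True, None),
]
print({bucket: len(items) for bucket, items in triage(deps).items()})
```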
The Source of Maser Emission W33C (G12.8-0.2)
Results of observations of the maser sources toward the W33C region
(G12.8-0.2) carried out on the 22-m radio telescope of the Pushchino Radio
Astronomy Observatory in the 1.35-cm H2O line and on the Large radio telescope
in Nancay (France) in the main (1665 and 1667 MHz) and satellite (1612 and 1720
MHz) OH lines are reported. Multiple, strongly variable short-lived H2O
emission features were detected in a broad interval of radial velocities, from
-7 to 55 km/s. OH maser emission in the 1667-MHz line was discovered in a
velocity range of 35-41 km/s. Stokes parameters of maser emission in the main
OH lines 1665 and 1667 MHz were measured. Zeeman splitting was detected in the
1665-MHz line at 33.4 and 39.4 km/s and in the 1667 MHz line only at 39.4 km/s.
The magnetic field intensity was estimated. An appreciable variability of the Zeeman
splitting components was observed at 39 and 39.8 km/s in both main lines. The
extended spectrum and fast variability of the H2O maser emission together with
the variability of the Zeeman splitting components in the main OH lines can be
due to the composite clumpy structure of the molecular cloud and to the
presence in it of large-scale rotation and bipolar outflow as well as of
turbulent motions of material.
Comment: 7 pages, 2 tables, 8 figures, accepted by Astronomicheskii Zhurnal (Astronomy Reports).
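The field estimate from Zeeman splitting is a one-line conversion: B = ΔV / b, where ΔV is the velocity separation of the right- and left-circular components and b is the line's splitting coefficient. The coefficients below are the values commonly quoted for the OH ground-state main lines; the paper may adopt slightly different numbers, and the ΔV in the example is made up.

```python
# Zeeman splitting coefficients for the OH main lines, in km/s per milligauss
# (standard literature values; check against the paper's adopted numbers).
B_COEFF = {"1665": 0.590, "1667": 0.354}

def field_mG(delta_v_kms: float, line: str) -> float:
    """Line-of-sight magnetic field in mG from the velocity splitting."""
    return delta_v_kms / B_COEFF[line]

print(f"{field_mG(0.3, '1665'):.2f} mG")  # a 0.3 km/s splitting at 1665 MHz
```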
SUBSTANTIATION OF MUD PREPARATION TECHNOLOGY
The aims of this work are to study the advantages of hydrodynamic cavitation, to calculate the frequency of cavitation oscillations from the device parameters, to obtain a formula for determining the dispersion time of the material, and to study the flow of drilling fluid in the device using the SolidWorks program.
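The abstract's device-specific formula for the oscillation frequency is not reproduced here, but the standard dimensionless criterion for hydrodynamic cavitation is the cavitation number, σ = (p − p_v) / (½ρv²); cavitation becomes likely as σ drops toward unity. The sketch below evaluates it for illustrative water values only.

```python
def cavitation_number(p_pa: float, p_vapor_pa: float, rho: float, v: float) -> float:
    """sigma = (p - p_v) / (0.5 * rho * v^2), dimensionless."""
    return (p_pa - p_vapor_pa) / (0.5 * rho * v ** 2)

# Water near 20 C: vapor pressure ~2.34 kPa, density ~998 kg/m^3,
# flow speed 15 m/s through the device throat (illustrative numbers).
sigma = cavitation_number(p_pa=101_325, p_vapor_pa=2_340, rho=998.0, v=15.0)
print(f"sigma = {sigma:.2f}")
```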
