
    Impact-induced acceleration by obstacles

    We explore a surprising phenomenon in which an obstruction accelerates, rather than decelerates, a moving flexible object. It has been claimed that the right kind of discrete chain falling onto a table falls faster than a free-falling body. We confirm and quantify this effect, reveal its complicated dependence on angle of incidence, and identify multiple operative mechanisms. Prior theories for direct impact onto flat surfaces, which involve a single constitutive parameter, match our data well if we account for a characteristic delay length that must impinge before the onset of excess acceleration. Our measurements provide a robust determination of this parameter. This supports the possibility of modeling such discrete structures as continuous bodies with a complicated constitutive law of impact that includes angle of incidence as an input.
    Comment: small changes and corrections, added reference

    Detecting the Cosmic Gravitational Wave Background with the Big Bang Observer

    The detection of the Cosmic Microwave Background Radiation (CMB) was one of the most important cosmological discoveries of the last century. With the development of interferometric gravitational wave detectors, we may be in a position to detect the gravitational equivalent of the CMB in this century. The Cosmic Gravitational Background (CGB) is likely to be isotropic and stochastic, making it difficult to distinguish from instrument noise. The contribution from the CGB can be isolated by cross-correlating the signals from two or more independent detectors. Here we extend previous studies that considered the cross-correlation of two Michelson channels by calculating the optimal signal-to-noise ratio that can be achieved by combining the full set of interferometry variables that are available with a six-link triangular interferometer. In contrast to the two-channel case, we find that the relative orientation of a pair of coplanar detectors does not affect the signal-to-noise ratio. We apply our results to the detector design described in the Big Bang Observer (BBO) mission concept study and find that BBO could detect a background with \Omega_{gw} > 2.2 \times 10^{-17}.
    Comment: 15 pages, 12 figures
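The cross-correlation idea at the heart of this approach can be illustrated with a toy simulation (a hedged sketch, not the BBO analysis; all amplitudes and sample counts below are arbitrary assumptions): a common stochastic signal shared by two channels survives averaging of their product, while each channel's independent noise averages away.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000  # number of samples (arbitrary)

# A common stochastic "background" far below the noise floor of
# either channel, plus independent instrument noise per channel.
background = 0.2 * rng.standard_normal(n)
ch1 = background + rng.standard_normal(n)
ch2 = background + rng.standard_normal(n)

# In either channel alone the background is invisible:
# Var(ch_i) ~ 1 + 0.04, indistinguishable from a noise estimate.
# Cross-correlating isolates it, since the noises are independent:
# <ch1 * ch2> -> Var(background) = 0.04.
cross = np.mean(ch1 * ch2)
print(cross)
```

The estimate tightens as 1/sqrt(n), which is why long integration times and multiple independent channel pairs matter for a stochastic background.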

    A comparison of sensitivity-specificity imputation, direct imputation and fully Bayesian analysis to adjust for exposure misclassification when validation data are unavailable.

    Measurement error is an important source of bias in epidemiological studies. We illustrate three approaches to sensitivity analysis for the effect of measurement error: imputation of the 'true' exposure based on specifying the sensitivity and specificity of the measured exposure (SS); direct imputation (DI) using a regression model for the predictive values; and adjustment based on a fully Bayesian analysis. We deliberately misclassify smoking status in data from a case-control study of lung cancer. We then implement the SS and DI methods using fixed-parameter (FBA) and probabilistic (PBA) bias analyses, and a Bayesian analysis using the Markov chain Monte Carlo program WinBUGS, to show how well each recovers the original association. The 'true' smoking-lung cancer odds ratio (OR), adjusted for sex in the original dataset, was OR = 8.18 [95% confidence limits (CL): 5.86, 11.43]; after misclassification, it decreased to OR = 3.08 (nominal 95% CL: 2.40, 3.96). The adjusted point estimates from all three approaches were always closer to the 'true' OR than the OR estimated from the unadjusted misclassified smoking data, and the adjusted interval estimates were always wider than the unadjusted interval estimate. When the imputed misclassification parameters departed much from the actual misclassification, the 'true' OR was often omitted from the FBA intervals, whereas it was always included in the PBA and Bayesian intervals. These results illustrate how PBA and Bayesian analyses can be used to better account for uncertainty and bias due to measurement error.
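The SS-style correction can be sketched with the standard matrix-method formula on a made-up 2x2 table (the counts, sensitivity, and specificity below are hypothetical illustrations, not the study's data): with assumed sensitivity Se and specificity Sp, the observed exposed count satisfies a_obs = Se*a + (1 - Sp)*(n - a), which inverts to the expression below.

```python
def true_exposed(a_obs, n, se, sp):
    """Invert a_obs = se*a + (1 - sp)*(n - a) for the true
    exposed count a, given assumed sensitivity and specificity."""
    return (a_obs - (1 - sp) * n) / (se + sp - 1)

def odds_ratio(a1, n1, a0, n0):
    return (a1 / (n1 - a1)) / (a0 / (n0 - a0))

# Hypothetical observed counts (exposed, total) for cases and controls.
a_cases, n_cases = 700, 1000
a_ctrls, n_ctrls = 400, 1000
se, sp = 0.90, 0.95  # assumed misclassification parameters

or_obs = odds_ratio(a_cases, n_cases, a_ctrls, n_ctrls)
or_adj = odds_ratio(true_exposed(a_cases, n_cases, se, sp), n_cases,
                    true_exposed(a_ctrls, n_ctrls, se, sp), n_ctrls)
print(round(or_obs, 2), round(or_adj, 2))  # -> 3.5 4.64
```

As in the abstract, nondifferential misclassification attenuates the OR toward the null, and correcting it moves the estimate back up; a probabilistic (PBA) or Bayesian version would draw Se and Sp from distributions rather than fixing them.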

    A search for HI in some peculiar faint dwarf galaxies

    We present a deep Giant Metrewave Radio Telescope (GMRT) search for HI 21 cm emission from three dwarf galaxies, viz. POX 186, SC 24 and KKR 25. Based, in part, on previous single dish HI observations, these galaxies have been classified as a BCD, a dwarf irregular and a transition galaxy, respectively. However, in conflict with previous single dish detections, we do not detect HI in SC 24 or KKR 25. We suggest that the previous single dish measurements were probably confused by local Galactic emission. In the case of POX 186, we confirm the previous non-detection of HI, but with substantially improved limits on its HI mass. Our derived upper limits on the HI mass of SC 24 and KKR 25 are similar to the typical HI mass limit for dwarf spheroidal galaxies, whereas in the case of POX 186, we find that its gas content is somewhat smaller than is typical of BCD galaxies.
    Comment: Accepted for publication in MNRAS

    A Formalization of the Theorem of Existence of First-Order Most General Unifiers

    This work presents a formalization of the theorem of existence of most general unifiers in first-order signatures in the higher-order proof assistant PVS. The distinguishing feature of this formalization is that it remains close to textbook proofs that are based on proving the correctness of the well-known Robinson first-order unification algorithm. The formalization was applied inside a PVS development for term rewriting systems that provides a complete formalization of the Knuth-Bendix Critical Pair theorem, among other relevant theorems of rewriting theory. In addition, the formalization methodology has proved to be of practical use for verifying the correctness of unification algorithms in the style of Robinson's original algorithm.
    Comment: In Proceedings LSFA 2011, arXiv:1203.542
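For readers unfamiliar with the algorithm being verified, a minimal Robinson-style unifier can be sketched as follows (an illustrative Python sketch with a hypothetical tuple encoding of terms, unrelated to the PVS formalization): walk both terms through the current substitution, bind variables subject to the occurs check, and recurse over the arguments of matching function symbols.

```python
def is_var(t):
    # Variables are strings beginning with '?'; anything else is a
    # constant or a compound term (fname, arg1, ..., argk).
    return isinstance(t, str) and t.startswith('?')

def walk(t, subst):
    # Follow variable bindings to a representative term.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def occurs(v, t, subst):
    # Occurs check: does variable v appear inside term t?
    t = walk(t, subst)
    if t == v:
        return True
    return isinstance(t, tuple) and any(occurs(v, a, subst) for a in t[1:])

def unify(t1, t2, subst=None):
    # Returns a most general unifier as a dict, or None on failure.
    subst = {} if subst is None else subst
    t1, t2 = walk(t1, subst), walk(t2, subst)
    if t1 == t2:
        return subst
    if is_var(t1):
        return None if occurs(t1, t2, subst) else {**subst, t1: t2}
    if is_var(t2):
        return None if occurs(t2, t1, subst) else {**subst, t2: t1}
    if (isinstance(t1, tuple) and isinstance(t2, tuple)
            and t1[0] == t2[0] and len(t1) == len(t2)):
        for a, b in zip(t1[1:], t2[1:]):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None  # clash of distinct function symbols or constants

# f(?x, g(?y)) unified with f(a, g(?x)) yields {'?x': 'a', '?y': 'a'}.
mgu = unify(('f', '?x', ('g', '?y')), ('f', 'a', ('g', '?x')))
print(mgu)
# The occurs check rejects ?x = f(?x):
print(unify('?x', ('f', '?x')))
```

A textbook correctness proof of exactly this recursive structure is what the formalization mechanizes: termination, soundness of the returned substitution, and its maximal generality.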

    Conceptualizing cultures of violence and cultural change

    The historiography of violence has undergone a distinct cultural turn as attention has shifted from examining violence as a clearly defined (and countable) social problem to analysing its historically defined 'social meaning'. Nevertheless, the precise nature of the relationship between 'violence' and 'culture' is still being established. How are 'cultures of violence' formed? What impact do they have on violent behaviour? How do they change? This essay examines some of the conceptual aspects of the relationship between culture and violence. It brings together empirical research on nineteenth-century England with recent research results from other European contexts to examine three aspects of that relationship, organised under the labels 'seeing violence', 'identifying the violent' and 'changing violence'. Within a particular society, narratives regarding particular kinds of behaviour shape cultural attitudes. The notion of 'violence' is thus defined in relation to physically aggressive acts as well as by being connected to other kinds of attitudes and contexts. As a result, the boundaries are set between physical aggression which is legitimate and that which is illegitimate (and thus 'violence'). Once 'violence' is defined, particular cultures form ideas about who is responsible for it: reactions to violence become associated with social arrangements such as class and gender as well as with attitudes toward the self. Finally, cultures of violence make efforts to tame or eradicate illegitimate forms of physical aggression. This process is connected not only to the development of new forms of power (e.g., new policing or punishment strategies) but also to less tangible cultural influences which aim at changing the behaviour defined as violence (in particular among the social groups identified as violent).
Even if successful, this three-tiered process of seeing violence, identifying the violent and changing violence begins anew, emphasising the ways that cultures of violence develop through a continuous process of re-evaluation and reinvention.

    Surface Brightness Profiles of Composite Images of Compact Galaxies at z~4-6 in the HUDF

    The Hubble Ultra Deep Field (HUDF) contains a significant number of B, V and i'-band dropout objects, many of which were recently confirmed to be young star-forming galaxies at z~4-6. These galaxies are individually too faint for accurate measurement of their radial surface brightness profiles. Their average light profiles are nonetheless of great potential interest, since they may contain clues to the time since the onset of significant galaxy assembly. We separately co-add V, i' and z'-band HUDF images of sets of z~4, 5 and 6 objects, pre-selected to have nearly identical compact sizes and the roundest shapes. From these stacked images, we are able to study the averaged radial structure of these objects at a much higher signal-to-noise ratio than is possible for an individual faint object. Here we explore the reliability and usefulness of this stacking technique for compact objects at z~4-6 in the HUDF. Our results are: (1) image stacking provides reliable and reproducible average surface brightness profiles; (2) the shapes of the average surface brightness profiles show that even the faintest z~4-6 objects are resolved; and (3) if late-type galaxies dominate the population of galaxies at z~4-6, as previous HST studies have shown, then limits on dynamical age estimates for these galaxies from their profile shapes are comparable with the SED ages obtained from the broadband colors. We also present accurate measurements of the sky background in the HUDF and its associated 1-sigma uncertainties.
    Comment: 10 pages, 9 figures, 2 tables, emulateapj; Accepted for publication in The Astronomical Journal
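The gain from stacking can be demonstrated with a toy simulation (a hedged sketch with entirely synthetic "cutouts"; the profile shape, noise level, and cutout size are arbitrary assumptions, not HUDF values): averaging N noisy cutouts of similar objects suppresses the sky noise by roughly sqrt(N), revealing a radial profile invisible in any single image.

```python
import numpy as np

rng = np.random.default_rng(1)
size, n_obj = 41, 100
yy, xx = np.mgrid[:size, :size]
r = np.hypot(yy - size // 2, xx - size // 2)

# Synthetic cutouts: a faint exponential light profile buried in
# sky noise, so each individual image has poor signal-to-noise.
model = 0.2 * np.exp(-r / 3.0)
cutouts = model + 0.5 * rng.standard_normal((n_obj, size, size))

# Co-adding beats the per-pixel noise down by ~sqrt(n_obj).
stack = cutouts.mean(axis=0)

# Azimuthally averaged radial surface brightness profile.
profile = [stack[(r >= b) & (r < b + 1)].mean() for b in range(15)]
print([round(p, 3) for p in profile])
```

In a single cutout the central signal (0.2) sits well below the per-pixel noise (0.5); in the stack the noise drops to roughly 0.05 and the declining profile becomes measurable, which is the effect the paper exploits for its averaged profiles.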

    Wood Dust in Joineries and Furniture Manufacturing: An Exposure Determinant and Intervention Study.

    To assess wood dust exposures and their determinants in joineries and furniture manufacturing, and to evaluate the efficacy of specific interventions on dust emissions under laboratory conditions. In a subsequent follow-up study in a small sample of joinery workshops, we also aimed to develop, implement, and evaluate a cost-effective and practicable intervention to reduce dust exposures. Personal inhalable dust (n = 201) was measured in 99 workers from 10 joineries and 3 furniture-making factories. To assess exposure determinants, full-shift video exposure monitoring (VEM) was conducted in 19 workers and task-based VEM in 32 workers (in 7 joineries and 3 furniture factories). We assessed the efficacy of vacuum extraction on hand tools and of using vacuum cleaners instead of sweeping and dry wiping under laboratory conditions. These measures were subsequently implemented in three joinery workshops with 'high' (>4 mg m-3) and one with 'low' (<2 mg m-3) baseline exposures. We also included two control workshops (one 'low' and one 'high' exposure workshop) in which no interventions were implemented. Exposures were measured 4 months prior to and 4 months following the intervention. Average (geometric mean) exposures in joinery and furniture making were 2.5 mg m-3 [geometric standard deviation (GSD) 2.5] and 0.6 mg m-3 (GSD 2.3), respectively. In joinery workers, cleaning was associated with a 3.0-fold higher (P < 0.001) dust concentration compared to low-exposure tasks (e.g. gluing), while the use of hand tools showed 3.0- to 11.0-fold higher (P < 0.001) exposures. In furniture makers, we found a 5.4-fold higher exposure (P < 0.001) with use of a table/circular saw. Laboratory efficiency experiments showed a 10-fold decrease in exposure (P < 0.001) when using a vacuum cleaner. Vacuum extraction on hand tools combined with a downdraft table reduced exposures by 42.5% for routing (P < 0.1) and 85.5% for orbital sanding (P < 0.001).
Following the intervention measures in joineries, a borderline statistically significant (P < 0.10) reduction in exposure of 30% was found in workshops with 'high' baseline exposures, but no reduction was seen in the workshop with 'low' baseline exposures. Wood dust exposure is high in joinery workers and (to a lesser extent) furniture makers, with frequent use of hand tools and cleaning being key drivers of exposure. Vacuum extraction on hand tools and alternative cleaning methods reduced workplace exposures substantially, but may be insufficient to achieve compliance with current occupational exposure limits.
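The summary statistics used above can be reproduced on synthetic data (a sketch with made-up log-normal samples, not the study's measurements): the geometric mean and GSD are simply the exponentials of the mean and standard deviation of the log-transformed exposures.

```python
import math
import numpy as np

rng = np.random.default_rng(2)

# Synthetic shift-average exposures (mg m-3), log-normally distributed
# as occupational dust measurements typically are; the GM and GSD of
# 2.5 merely echo the joinery figures for illustration.
samples = np.exp(rng.normal(math.log(2.5), math.log(2.5), size=5000))

logs = np.log(samples)
gm = math.exp(logs.mean())           # geometric mean
gsd = math.exp(logs.std(ddof=1))     # geometric standard deviation
print(round(gm, 2), round(gsd, 2))
```

A GSD of 2.5 means roughly two thirds of shifts fall within a factor of 2.5 of the geometric mean, which is why a few high-exposure tasks (cleaning, hand tools) can dominate compliance with an exposure limit.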