138,488 research outputs found

    The Complex X-ray Spectrum of the Seyfert 1.5 Source NGC 6860

    The X-ray spectrum of the Seyfert 1.5 source NGC 6860 is among the most complex of the sources detected in the Swift Burst Alert Telescope all-sky survey. A short XMM-Newton follow-up observation of the source revealed a flat spectrum both above and below 2 keV. To uncover the complexity of the source, in this paper we analyze both a 40 ks Suzaku and a 100 ks XMM-Newton observation of NGC 6860. While the spectral state of the source changed between the newer observations presented here and the earlier short XMM-Newton spectrum, showing a higher flux and a steeper power law component, the spectrum of NGC 6860 is still complex, with clearly detected warm absorption signatures. We find that a two-component warm ionized absorber is present in the soft spectrum, with column densities of about 10^20 and 10^21 cm^-2, ionization parameters of xi = 180 and 45 erg cm s^-1, and outflow velocities for each component in the range of 0-300 km s^-1. Additionally, in the hard spectrum we find a broad (approx. 11000 km s^-1) Fe K-alpha emission line, redshifted by approx. 2800 km s^-1. Comment: 35 pages, 9 figures, Accepted to Ap
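    As a quick illustration of the quoted Fe K-alpha numbers, the short sketch below converts the reported velocity shift and width into an observed line energy and width in keV. It assumes a 6.4 keV neutral Fe K-alpha rest energy, a first-order Doppler approximation, and treats the 11000 km s^-1 figure as a velocity width; none of these conversions are spelled out in the abstract itself.

        # Hypothetical sketch: observed energy and width of the Fe K-alpha line
        # from the velocities quoted in the abstract. The 6.4 keV rest energy and
        # the first-order Doppler formula are assumptions made for illustration.
        C_KM_S = 299_792.458          # speed of light [km/s]
        E_REST_KEV = 6.4              # neutral Fe K-alpha rest energy [keV] (assumed)

        def shifted_energy(e_rest_kev, v_redshift_km_s):
            """Observed line energy for a small redshift velocity, first order in v/c."""
            return e_rest_kev * (1.0 - v_redshift_km_s / C_KM_S)

        def width_kev(e_obs_kev, v_width_km_s):
            """Line width in keV corresponding to a given velocity width."""
            return e_obs_kev * v_width_km_s / C_KM_S

        e_obs = shifted_energy(E_REST_KEV, 2800.0)     # ~2800 km/s redshift
        width = width_kev(e_obs, 11000.0)              # ~11000 km/s broad line
        print(f"observed energy ~ {e_obs:.2f} keV, width ~ {width:.2f} keV")
        # -> roughly 6.34 keV and 0.23 keV with these inputs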

    Understanding CP phase-dependent measurements at neutrino superbeams in terms of bi-rate graphs

    We discuss the impact of the true value of the CP phase on the mass hierarchy, CP violation, and CP precision measurements at neutrino superbeams and related experiments. We use a complete statistical experiment simulation, including spectral information, systematics, correlations, and degeneracies, to produce the results. However, since it is very complicated to understand the results in terms of a complete experiment simulation, we show the corresponding bi-rate graphs as useful tools to investigate the CP phase dependencies qualitatively. Unlike bi-probability graphs, which are based upon oscillation probabilities, bi-rate graphs use the total event rates of two measurements simultaneously as a function of the CP phase. Since they allow error bars for direct quantitative estimates, they can be used for a direct comparison with a complete statistical experiment simulation. We find that bi-rate graphs can describe the CP phase dependencies of the mentioned measurements at neutrino superbeam setups, as well as the role of the $\mathrm{sgn}(\Delta m_{31}^2)$-degeneracy. As one of the most interesting results, we discuss the dependence of the CP precision measurement as a function of the CP phase itself, which leads to "CP patterns". It turns out that this dependence is rather strong, which means that one has to be careful when comparing the CP precisions of different experiments. Comment: Major revisions: scope reduced and discussions simplified. Summary and conclusions unchanged. 14 pages, 4 figures. Final version to appear in PR
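    To make the bi-rate idea concrete, here is a minimal sketch of how such a graph is traced out: in vacuum, the total appearance rate of a run depends on the CP phase only through cos(delta) and sin(delta), with the sin(delta) term flipping sign between the neutrino and antineutrino runs, so the pair of rates describes an ellipse as delta varies. The coefficient values, the Poisson error bars, and the simple two-run setup are illustrative assumptions, not the experiments simulated in the paper.

        import numpy as np

        # Toy bi-rate graph: each total rate is written R(delta) = A + B*cos(delta)
        # + C*sin(delta); the sin(delta) term changes sign for the antineutrino run.
        # The coefficients below are assumed purely for illustration.
        A_NU, B_NU, C_NU = 120.0, 25.0, 30.0    # neutrino-run coefficients (assumed)
        A_NB, B_NB, C_NB = 60.0, 12.0, 15.0     # antineutrino-run coefficients (assumed)

        def rate_nu(delta):
            return A_NU + B_NU * np.cos(delta) + C_NU * np.sin(delta)

        def rate_nubar(delta):
            return A_NB + B_NB * np.cos(delta) - C_NB * np.sin(delta)

        # Trace the curve in the (R_nu, R_nubar) plane and attach statistical
        # (Poisson) error bars, which is what allows quantitative estimates.
        for delta in np.linspace(0.0, 2.0 * np.pi, 9):
            r, rbar = rate_nu(delta), rate_nubar(delta)
            print(f"delta = {delta:4.2f}: R_nu = {r:6.1f} +/- {np.sqrt(r):4.1f}, "
                  f"R_nubar = {rbar:6.1f} +/- {np.sqrt(rbar):4.1f}")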

    American terror: from Oklahoma City to 9/11 and after

    Throughout American history, both terrorism and extremism have been constructed, evoked or ignored strategically by the state, media and public at different points, in order to disown and demonize political movements whenever their ideologies and objectives become problematic or inconvenient: because they overlap with, and thus compromise, the legitimacy of the dominant ideology and the democratic credentials of the state; because they conflict with the dominant ideology or hegemonic order; because they offend the general (voting) public; or because they expose the fallacies of national unity and bi-polar opposition in the face of foreign enemies or international conflicts, such as the war on terror. This chapter looks at how domestic extreme right terrorism has been constructed, represented, evoked or ignored in the American political imagination in the post-civil rights era, with a particular focus on its changing status following the Oklahoma City bombing and 9/11.

    The Generation of Memory: Reflections on the “Memory Boom” in Contemporary Historical Studies

    Jay Winter delivered the following in the form of a lecture at the Canadian War Museum on 31 October 2000. A distinguished academic, Winter has been writing about the cultural history of the First World War for nearly three decades. He has taught at the University of Cambridge in England and is presently at Yale University. Since 1988, he has been a director of the Historial de la grande guerre in Peronne, an important war museum in northern France. In this capacity, he has become familiar with a great many institutions of war and military history around the world, and he has great knowledge of the important historical and intellectual debates that will be fundamental to the creation of a new Canadian War Museum, which is now slated to open in May 2005. Probably Winter’s best-known book is Sites of Memory, Sites of Mourning: the Great War in European Cultural History, published in 1995. In it, he argues that the rituals of mourning associated with commemoration after the First World War had a history stretching far back in human life and experience. In this he contradicts the thinking of Canadian historian Modris Eksteins, who argued that the Great War marked the birth of the modern age. Lately, Daniel Sherman has proposed that commemorative ceremonies and memorials are significantly politicized in the interests of state control. In the following paper Winter warns against the dangers of collective memory being collapsed into “a set of stories formed by or about the state” while also providing a rich overview of the great importance that attention to memory and culture studies has taken on in contemporary thought. These cannot be ignored in any serious attempt to lay the intellectual foundation of any new museum, and perhaps may have specific relevance to a new war museum.

    Development of a new Thomson parabola spectrometer for analysis of laser accelerated ions

    This thesis details my work on developing a new Thomson parabola spectrometer for use at the SCARLET Laser Facility at The Ohio State University. The SCARLET laser is a 300 TW system reaching peak intensities exceeding 10^21 W/cm^2 and is used to study laser-matter interactions and plasma phenomena. These interactions accelerate multiple types of particles, and to understand them it is necessary to have diagnostic tools that characterize the accelerated particles. A common device for measuring the charged particles is a Thomson parabola spectrometer, which uses parallel electric and magnetic fields that are perpendicular to the incoming particles. This causes deflection of the particles based on their charge-to-mass ratio and energy, so the spectrometer allows us to determine which particle species are present and what their energy range is. I designed a new spectrometer to replace the existing Thomson parabola spectrometer, which had problems during operation that reduced performance. Using a MATLAB code, I first modeled the performance of the new design to determine physical dimensions and field strengths that would allow 1 MeV resolution of protons up to a maximum energy of 40 MeV. This resulted in a 5 cm long magnetic field region with a field strength of 0.12 T and 10 cm electrodes with a voltage difference of 6 kV. These physical dimensions were used to create a SolidWorks model. As of this writing, the newly designed Thomson parabola spectrometer has been built and is currently being installed for use on future experiments.
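    The deflection estimate behind such a design can be sketched in a few lines: in the small-angle limit, the magnetic displacement at the detector scales as q*B*L_B*(L_B/2 + D)/p and the electric displacement as q*E*L_E*(L_E/2 + D)/(p*v). The field lengths, field strength, and voltage below follow the abstract; the electrode gap, the drift distance to the detector, and the co-located field geometry are assumptions made purely for illustration and are not the thesis design values.

        import numpy as np

        # Hypothetical sketch of the small-angle deflection model for a Thomson
        # parabola spectrometer. Magnet length, field strength, electrode length
        # and voltage follow the abstract; GAP and DRIFT are assumed, and the E
        # and B field regions are treated as starting at the same plane.
        Q_E = 1.602176634e-19                  # elementary charge [C]
        C = 2.99792458e8                       # speed of light [m/s]
        M_P = 938.272e6 * Q_E / C**2           # proton mass [kg]

        L_B, B_FIELD = 0.05, 0.12              # magnet length [m], field [T]
        L_E, VOLTAGE = 0.10, 6.0e3             # electrode length [m], voltage [V]
        GAP = 0.01                             # electrode gap [m]          (assumed)
        DRIFT = 0.30                           # field exit to detector [m] (assumed)

        def deflections(kinetic_energy_mev, charge_state=1, mass=M_P):
            """Small-angle magnetic and electric displacements [m] at the detector."""
            t = kinetic_energy_mev * 1e6 * Q_E                 # kinetic energy [J]
            e_tot = t + mass * C**2                            # total energy [J]
            p = np.sqrt(e_tot**2 - (mass * C**2)**2) / C       # relativistic momentum
            v = p * C**2 / e_tot                               # speed [m/s]
            e_field = VOLTAGE / GAP                            # electric field [V/m]
            y_mag = charge_state * Q_E * B_FIELD * L_B * (L_B / 2 + DRIFT) / p
            y_ele = charge_state * Q_E * e_field * L_E * (L_E / 2 + DRIFT) / (p * v)
            return y_mag, y_ele

        for t_mev in (1, 5, 10, 20, 40):
            y_mag, y_ele = deflections(t_mev)
            print(f"{t_mev:4.1f} MeV proton: magnetic {y_mag*1e3:6.2f} mm, "
                  f"electric {y_ele*1e3:6.2f} mm")

    In a sketch like this, an energy-resolution requirement such as 1 MeV at 40 MeV translates roughly into checking that the magnetic displacements of neighboring proton energies differ by more than the trace width on the detector, which is what drives the choice of field strength, field length, and drift distance.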

    Walla Walla Valley Viticultural Area

    Produced by the Walla Walla Valley Wine Alliance, this map depicts the Walla Walla Valley AVA (American Viticultural Area), which crosses state boundaries. The Walla Walla Valley AVA extends from Southern Washington to Northern Oregon.

    Quantum and Classical Message Identification via Quantum Channels

    We discuss concepts of message identification in the sense of Ahlswede and Dueck via general quantum channels, extending investigations for classical channels, initial work for classical-quantum (cq) channels, and "quantum fingerprinting". We show that the identification capacity of a discrete memoryless quantum channel for classical information can be larger than that for transmission; this is in contrast to all previously considered models, where it turns out to equal the common randomness capacity (which equals the transmission capacity in our case): in particular, for a noiseless qubit, we show the identification capacity to be 2, while the transmission and common randomness capacities are 1. Then we turn to a natural concept of identification of quantum messages (i.e. a notion of "fingerprint" for quantum states). This is much closer to quantum information transmission than its classical counterpart (for one thing, the code length grows only exponentially, compared to double exponentially for classical identification). Indeed, we show how the problem exhibits a nice connection to visible quantum coding. Astonishingly, for the noiseless qubit channel this capacity turns out to be 2: in other words, one can compress two qubits into one, and this is optimal. In general, however, we conjecture the quantum identification capacity to be different from the classical identification capacity. Comment: 18 pages, requires Rinton-P9x6.cls. On the occasion of Alexander Holevo's 60th birthday. Version 2 has a few theorems knocked off: Y. Steinberg has pointed out a crucial error in my statements on simultaneous ID codes. They are all gone and replaced by a speculative remark. The central results of the paper are all unharmed. In v3: proof of Proposition 17 corrected, without change of its statement