
    Decadal-scale thermohaline variability in the Atlantic sector of the Southern Ocean

    An enhanced Altimetry Gravest Empirical Mode (AGEM), including both adiabatic and diabatic trends, is developed for the Antarctic Circumpolar Current (ACC) south of Africa using updated hydrographic CTD sections, Argo data, and satellite altimetry. This AGEM has improved accuracy compared to traditional climatologies and other proxy methods. The AGEM for the Atlantic Southern Ocean offers an ideal technique to investigate the thermohaline variability over the past two decades in a key region for water mass exchanges and transformation. To assess and attribute changes in the hydrography of the region, we separate the changes into adiabatic and diabatic components. Integrated over the upper 2000 dbar of the ACC south of Africa, results show mean adiabatic changes of 0.16 ± 0.11°C decade−1 and 0.006 ± 0.014 decade−1, and diabatic differences of −0.044 ± 0.13°C decade−1 and −0.01 ± 0.017 decade−1 for temperature and salinity, respectively. The trends of the resultant AGEM, which includes both adiabatic and diabatic variability (termed AD-AGEM), show a significant increase in the heat content of the upper 2000 dbar of the ACC, with a mean warming of 0.12 ± 0.087°C decade−1. This study focuses on the Antarctic Intermediate Water (AAIW) mass, where negative diabatic trends dominate positive adiabatic differences in the Subantarctic Zone (SAZ), with results indicating a cooling (−0.17°C decade−1) and freshening (−0.032 decade−1) of AAIW in this area, whereas south of the SAZ positive adiabatic and diabatic trends together create a cumulative warming (0.31°C decade−1) and salinification (0.014 decade−1) of AAIW.
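As a back-of-the-envelope consistency check (not the authors' computation), the reported adiabatic and diabatic temperature trends can simply be summed and compared with the quoted AD-AGEM warming; the values below are taken directly from the abstract.

```python
# Sketch: summing the adiabatic and diabatic temperature trends reported
# for the upper 2000 dbar of the ACC south of Africa. Values come from the
# abstract; the plain sum is illustrative, not the authors' method.

adiabatic_T = 0.16    # °C per decade (adiabatic component)
diabatic_T = -0.044   # °C per decade (diabatic component)

total_T = adiabatic_T + diabatic_T
print(f"combined temperature trend: {total_T:.3f} °C/decade")
# ≈ 0.116 °C/decade, consistent with the reported 0.12 ± 0.087 °C/decade
```

The small residual relative to the quoted 0.12°C decade−1 is just rounding of the published components.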

    Staying true with the help of others: doxastic self-control through interpersonal commitment

    I explore the possibility and rationality of interpersonal mechanisms of doxastic self-control, that is, ways in which individuals can make use of other people in order to get themselves to stick to their beliefs. I look, in particular, at two ways in which people can make interpersonal epistemic commitments, and thereby willingly undertake accountability to others, in order to get themselves to maintain their beliefs in the face of anticipated “epistemic temptations”. The first way is through the avowal of belief, and the second is through the establishment of collective belief. I argue that both of these forms of interpersonal epistemic commitment can function as effective tools for doxastic self-control, and, moreover, that the control they facilitate should not be dismissed as irrational from an epistemic perspective.

    Using XDAQ in Application Scenarios of the CMS Experiment

    XDAQ is a generic data acquisition software environment that emerged from a rich set of use-cases encountered in the CMS experiment. These cover the deployment for multiple sub-detectors and the operation of different processing and networking equipment, as well as a distributed collaboration of users with different needs. The use of the software in various application scenarios has demonstrated the viability of the approach. We discuss two applications: the tracker local DAQ system for front-end commissioning and the muon chamber validation system. The description is completed by a brief overview of XDAQ. Comment: Conference CHEP 2003 (Computing in High Energy and Nuclear Physics), La Jolla, CA.

    Circulation, retention, and mixing of waters within the Weddell-Scotia Confluence, Southern Ocean: The role of stratified Taylor columns

    The waters of the Weddell-Scotia Confluence (WSC) lie above the rugged topography of the South Scotia Ridge in the Southern Ocean. Meridional exchanges across the WSC transfer water and tracers between the Antarctic Circumpolar Current (ACC) to the north and the subpolar Weddell Gyre to the south. Here, we examine the role of topographic interactions in mediating these exchanges, and in modifying the waters transferred. A case study is presented using data from a free-drifting, intermediate-depth float, which circulated anticyclonically over Discovery Bank on the South Scotia Ridge for close to 4 years. Dimensional analysis indicates that the local conditions are conducive to the formation of Taylor columns. Contemporaneous ship-derived transient tracer data enable estimation of the rate of isopycnal mixing associated with this column, with values of O(1000 m²/s) obtained. Although necessarily coarse, this is of the same order as the rate of isopycnal mixing induced by transient mesoscale eddies within the ACC. A picture emerges of the Taylor column acting as a slow, steady blender, retaining the waters in the vicinity of the WSC for lengthy periods during which they can be subject to significant modification. A full regional float data set, bathymetric data, and a Southern Ocean state estimate are used to identify other potential sites for Taylor column formation. We find that they are likely to be sufficiently widespread to exert a significant influence on water mass modification and meridional fluxes across the southern edge of the ACC in this sector of the Southern Ocean.
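The dimensional analysis alluded to above can be sketched with the textbook Taylor-column criterion: blocking is expected when the fractional height of the topography h/H exceeds the Rossby number Ro = U/(|f|L). All numerical values below are hypothetical placeholders chosen only to illustrate the scaling, not measured parameters for Discovery Bank.

```python
# Hedged sketch of a Taylor-column criterion check. A Taylor column is
# commonly expected over an obstacle when its fractional height h/H
# exceeds the Rossby number Ro = U / (|f| L). All numbers below are
# illustrative placeholders, NOT observed values for Discovery Bank.

def rossby_number(U, f, L):
    """Ro = U / (|f| L): ratio of inertial to Coriolis accelerations."""
    return U / (abs(f) * L)

U = 0.05       # background flow speed (m/s), hypothetical
f = -1.3e-4    # Coriolis parameter near 60°S (1/s)
L = 100e3      # horizontal scale of the bank (m), hypothetical
h = 1500.0     # obstacle height above surrounding seafloor (m), hypothetical
H = 4000.0     # total water depth (m), hypothetical

ro = rossby_number(U, f, L)
blocking = h / H
print(f"Ro = {ro:.4f}, h/H = {blocking:.2f}, "
      f"Taylor column expected: {blocking > ro}")
```

With these weak-flow, large-bank values, Ro is a few times 10⁻³ while h/H is order one, so the criterion is comfortably satisfied; stratification modifies the quantitative threshold, which is why the paper speaks of stratified Taylor columns.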

    The CMS Event Builder

    The data acquisition system of the CMS experiment at the Large Hadron Collider will employ an event builder which will combine data from about 500 data sources into full events at an aggregate throughput of 100 GByte/s. Several architectures and switch technologies have been evaluated for the DAQ Technical Design Report by measurements with test benches and by simulation. This paper describes studies of an EVB test-bench based on 64 PCs acting as data sources and data consumers and employing both Gigabit Ethernet and Myrinet technologies as the interconnect. In the case of Ethernet, protocols based on Layer-2 frames and on TCP/IP are evaluated. Results from ongoing studies, including measurements on throughput and scaling, are presented. The architecture of the baseline CMS event builder is also outlined. The event builder is organised into two stages with intelligent buffers in between. The first stage contains 64 switches performing a first level of data concentration by building super-fragments from fragments of 8 data sources. The second stage combines the 64 super-fragments into full events. This architecture allows installation of the second stage of the event builder in steps, with the overall throughput scaling linearly with the number of switches in the second stage. Possible implementations of the components of the event builder are discussed and the expected performance of the full event builder is outlined. Comment: Conference CHEP0
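A rough per-link budget follows from the figures quoted above (500 sources, 100 GByte/s aggregate, 8 fragments per super-fragment); the arithmetic below is an illustrative sketch assuming a uniform load, not part of the design report.

```python
# Back-of-the-envelope throughput budget for the two-stage event builder.
# Figures (500 sources, 100 GByte/s, 8 sources per super-fragment) are from
# the abstract; the per-link arithmetic assumes perfectly uniform load.

aggregate_Bps = 100e9         # 100 GByte/s total event-building throughput
n_sources = 500               # front-end data sources
sources_per_superfrag = 8     # fragments combined into one super-fragment

per_source = aggregate_Bps / n_sources                      # per data source
per_superfrag = per_source * sources_per_superfrag          # per super-fragment stream

print(f"per source:         {per_source/1e6:.0f} MByte/s")      # 200 MByte/s
print(f"per super-fragment: {per_superfrag/1e6:.0f} MByte/s")   # 1600 MByte/s
```

This is why the first stage matters: concentrating 8 fragments into one super-fragment trades 500 modest 200 MByte/s flows for 64 fatter streams that the second-stage switches must each sustain.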

    Modeling interannual dense shelf water export in the region of the Mertz Glacier Tongue (1992-2007)

    Ocean observations around the Australian-Antarctic basin show the importance of coastal latent heat polynyas near the Mertz Glacier Tongue (MGT) to the formation of Dense Shelf Water (DSW) and associated Antarctic Bottom Water (AABW). Here, we use a regional ocean/ice shelf model to investigate the interannual variability of the export of DSW from the Adélie (west of the MGT) and the Mertz (east of the MGT) depressions from 1992 to 2007. The variability in the model is driven by changes in observed surface heat and salt fluxes. The model simulates an annual mean export of DSW through the Adélie sill of about 0.07 ± 0.06 Sv. From 1992 to 1998, the export of DSW through the Adélie (Mertz) sills peaked at 0.14 Sv (0.29 Sv) during July to November. During periods of mean to strong polynya activity (defined by the surface ocean heat loss), DSW formed in the Adélie depression can spread into the Mertz depression via the cavity under the MGT. An additional simulation, where ocean/ice shelf thermodynamics have been disabled, highlights the fact that models without ocean/ice shelf interaction processes will significantly overestimate rates of DSW export. The melt rates of the MGT are 1.2 ± 0.4 m yr−1 during periods of average to strong polynya activity and can increase to 3.8 ± 1.5 m yr−1 during periods of sustained weak polynya activity, due to the increased presence of relatively warmer water interacting with the base of the ice shelf. The increased melting of the MGT during a weak polynya state can cause further freshening of the DSW and ultimately limits the production of AABW.
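To put the simulated export in volume terms, the Sverdrup figures quoted above can be converted with the standard definition 1 Sv = 10⁶ m³/s; the conversion below is a generic unit calculation, not part of the model.

```python
# Converting the simulated Dense Shelf Water export into volume terms.
# 1 Sverdrup (Sv) = 1e6 m^3/s. The 0.07 Sv annual mean is taken from the
# abstract; the conversion itself is generic arithmetic.

SV = 1e6                   # m^3/s per Sverdrup
mean_export_sv = 0.07      # annual-mean DSW export through the Adélie sill

m3_per_s = mean_export_sv * SV
m3_per_year = m3_per_s * 365.25 * 86400   # Julian year in seconds

print(f"{m3_per_s:.0f} m^3/s ≈ {m3_per_year:.2e} m^3/yr")
```

That is roughly 2 × 10¹² m³ of dense water per year from the Adélie sill alone, which gives a sense of the volumes feeding AABW formation in this sector.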

    Nonequilibrium spectral diffusion due to laser heating in stimulated photon echo spectroscopy of low temperature glasses

    A quantitative theory is developed, which accounts for heating artifacts in three-pulse photon echo (3PE) experiments. The heat diffusion equation is solved and the average value of the temperature in the focal volume of the laser is determined as a function of the 3PE waiting time. This temperature is used in the framework of nonequilibrium spectral diffusion theory to calculate the effective homogeneous linewidth of an ensemble of probe molecules embedded in an amorphous host. The theory fits recently observed plateaus and bumps without introducing a gap in the distribution function of flip rates of the two-level systems or any other major modification of the standard tunneling model. Comment: 10 pages, RevTeX, 6 EPS figures, accepted for publication in Phys. Rev.
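The starting point referred to above is the standard heat diffusion equation; the source term representing laser absorption in the focal volume is written here in a generic form, as the abstract does not specify it.

```latex
% Standard heat diffusion equation; D is the thermal diffusivity of the
% host and Q(r,t) a source term modeling laser absorption in the focal
% volume (generic form, assumed here):
\begin{equation}
  \frac{\partial T(\mathbf{r}, t)}{\partial t}
    = D \, \nabla^{2} T(\mathbf{r}, t) + Q(\mathbf{r}, t)
\end{equation}
```

Solving this for the focal-volume average of T as a function of the waiting time is what feeds the nonequilibrium spectral diffusion calculation described in the abstract.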

    Coagulopathy in Zellweger spectrum disorders: a role for vitamin K

    Introduction: Zellweger spectrum disorders (ZSDs) are caused by an impairment of peroxisome biogenesis, resulting in multiple metabolic abnormalities. This leads to a range of symptoms, including hepatic dysfunction and coagulopathy. This study evaluated the incidence and severity of coagulopathy and the effect of oral and intravenous (IV) vitamin K supplementation in ZSD. Methods: Data were retrospectively retrieved from the medical records of 30 ZSD patients to study coagulopathy and the effect of oral vitamin K on proteins induced by vitamin K absence (PIVKA-II) levels. Five patients from the cohort with a prolonged prothrombin time, low factor VII, and elevated PIVKA-II levels received 10 mg of vitamin K IV. Laboratory results, including thrombin generation, at baseline and 72 h after vitamin K administration were examined. Results: In the retrospective cohort, four patients (13.3%) experienced intracranial bleedings and 14 (46.7%) reported minor bleeding. No thrombotic events occurred. PIVKA-II levels decreased by 38% after the start of oral vitamin K therapy. In the five patients with a coagulopathy despite oral vitamin K treatment, IV vitamin K caused an additional decrease (23%) in PIVKA-II levels and increased thrombin generation. Conclusion: Bleeding complications frequently occur in ZSD patients due to liver disease and vitamin K deficiency. Vitamin K deficiency is partly corrected by oral vitamin K supplementation, and IV vitamin K additionally improves vitamin K status, as shown by a further decrease of PIVKA-II and improved thrombin generation.

    The CMS event builder demonstrator based on Myrinet

    The data acquisition system for the CMS experiment at the Large Hadron Collider (LHC) will require a large, high-performance event building network. Several switch technologies are currently being evaluated in order to compare different architectures for the event builder. One candidate is Myrinet. This paper describes the demonstrator which has been set up to study a small-scale (8×8) event builder based on a Myrinet switch. Measurements are presented on throughput, overhead, and scaling for various traffic conditions. Results are shown on event building with a push architecture.

    Commissioning of the CMS High Level Trigger

    The CMS experiment will collect data from the proton-proton collisions delivered by the Large Hadron Collider (LHC) at a centre-of-mass energy of up to 14 TeV. The CMS trigger system is designed to cope with unprecedented luminosities and LHC bunch-crossing rates of up to 40 MHz. The unique CMS trigger architecture employs only two trigger levels. The Level-1 trigger is implemented using custom electronics, while the High Level Trigger (HLT) is based on software algorithms running on a large cluster of commercial processors, the Event Filter Farm. We present the major functionalities of the CMS High Level Trigger system as of the start of LHC beam operations in September 2008. The validation of the HLT system in the online environment with Monte Carlo simulated data and its commissioning during cosmic-ray data-taking campaigns are discussed in detail. We conclude with a description of the HLT operations with the first circulating LHC beams, before the incident that occurred on 19 September 2008.