    DESCENDANTS OF KISH OTAMAN YOSYP HLADKY

    The figure of Yosyp Mykhailovych Hladky, the last Kish Otaman of the Danubian Zaporozhian Sich and later of the Azov Cossack Host, has escaped neither the attention of historians [1] nor popular memory [2]. The activity of this undoubtedly charismatic man has received mixed assessments in both earlier and current historiography. Recently, thorough studies by the Zaporizhzhia historian Liudmyla Malenko have appeared, devoted to the history of the Azov Cossack Host [3] and specifically to the activity of its otaman, Y. Hladky [4]. The researcher introduced a substantial corpus of new sources into scholarly circulation, and the genealogy of the Hladky family also came within L. Malenko's field of view. Yet as early as the late 1880s this question had been addressed by the well-known researcher of the Zaporozhzhia, Dmytro Ivanovych Yavornytsky (1855-1940). He was perhaps the first to describe Y. Hladky's family relations more or less fully. Already in his first major monograph, "Zaporozhzhia in the Remnants of Antiquity and the Legends of the People," D. Yavornytsky devoted many lines to Y. Hladky and his descendants [5]. The main source for that study was the documents of the Hladky family archive, and Yavornytsky was assisted in this matter by his friend from Oleksandrivsk, the well-known researcher of the history and folklore of the local region, Yakiv Pavlovych Novytsky (1847-1925). The historian's body of work also includes a special article devoted to Y. Hladky and his genealogy [6].

    The role of history and strength of the oceanic forcing in sea level projections from Antarctica with the Parallel Ice Sheet Model

    Mass loss from the Antarctic Ice Sheet constitutes the largest uncertainty in projections of future sea level rise. Ocean-driven melting underneath the floating ice shelves and subsequent acceleration of the inland ice streams are the major reasons for currently observed mass loss from Antarctica and are expected to become more important in the future. Here we show that for projections of future mass loss from the Antarctic Ice Sheet, it is essential (1) to better constrain the sensitivity of sub-shelf melt rates to ocean warming and (2) to include the historic trajectory of the ice sheet. In particular, we find that while the ice sheet response in simulations using the Parallel Ice Sheet Model is comparable to the median response of models in three Antarctic Ice Sheet Intercomparison projects – initMIP, LARMIP-2 and ISMIP6 – conducted with a range of ice sheet models, the projected 21st century sea level contribution differs significantly depending on these two factors. For the highest emission scenario RCP8.5, this leads to projected ice loss ranging from 1.4 to 4.0 cm of sea level equivalent in simulations in which ISMIP6 ocean forcing drives the PICO ocean box model where parameter tuning leads to a comparably low sub-shelf melt sensitivity and in which no surface forcing is applied. This is opposed to a likely range of 9.1 to 35.8 cm using the exact same initial setup, but emulated from the LARMIP-2 experiments with a higher melt sensitivity, even though both projects use forcing from climate models and melt rates are calibrated with previous oceanographic studies. Furthermore, using two initial states, one with a previous historic simulation from 1850 to 2014 and one starting from a steady state, we show that while differences between the ice sheet configurations in 2015 seem marginal at first sight, the historic simulation increases the susceptibility of the ice sheet to ocean warming, thereby increasing mass loss from 2015 to 2100 by 5 % to 50 %. 
    Hindcasting past ice sheet changes with numerical models would thus provide valuable tools to better constrain projections. Our results emphasize that the uncertainty that arises from the forcing is of the same order of magnitude as the ice dynamic response for future sea level projections.

    Antarctic sub-shelf melt rates via PICO

    Ocean-induced melting below ice shelves is one of the dominant drivers for mass loss from the Antarctic Ice Sheet at present. An appropriate representation of sub-shelf melt rates is therefore essential for model simulations of marine-based ice sheet evolution. Continental-scale ice sheet models often rely on simple melt parameterizations, in particular for long-term simulations, when fully coupled ice–ocean interaction becomes computationally too expensive. Such parameterizations can account for the influence of the local depth of the ice-shelf draft or its slope on melting. However, they do not capture the effect of ocean circulation underneath the ice shelf. Here we present the Potsdam Ice-shelf Cavity mOdel (PICO), which simulates the vertical overturning circulation in ice-shelf cavities and thus enables the computation of sub-shelf melt rates consistent with this circulation. PICO is based on an ocean box model that coarsely resolves ice-shelf cavities and uses a boundary layer melt formulation. We implement it as a module of the Parallel Ice Sheet Model (PISM) and evaluate its performance under present-day conditions of the Southern Ocean. We identify a set of parameters that yield two-dimensional melt rate fields that qualitatively reproduce the typical pattern of comparably high melting near the grounding line and lower melting or refreezing towards the calving front. PICO captures the wide range of melt rates observed for Antarctic ice shelves, from an average of about 0.1 m a⁻¹ for cold sub-shelf cavities, for example underneath the Ross or Ronne ice shelves, to 16 m a⁻¹ for warm cavities such as in the Amundsen Sea region. This makes PICO a computationally feasible and more physical alternative to melt parameterizations purely based on ice draft geometry.
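To make the "boundary layer melt formulation" idea concrete, the sketch below computes a melt rate from the thermal driving at the ice base in a single box, i.e. the difference between the ambient ocean temperature and the pressure- and salinity-dependent freezing point. This is a minimal illustration of the kind of law PICO builds on, not PICO itself: the exchange velocity `GAMMA_T` and the freezing-point coefficients are generic placeholder values, not PICO's calibrated parameters, and the full model couples several such boxes through an overturning circulation.

```python
import math

# Physical constants (typical values; illustrative, not PICO's calibration)
RHO_W = 1028.0    # sea water density (kg m^-3)
RHO_I = 910.0     # ice density (kg m^-3)
C_PW = 3974.0     # specific heat capacity of sea water (J kg^-1 K^-1)
L_ICE = 3.34e5    # latent heat of fusion of ice (J kg^-1)
GAMMA_T = 1e-5    # turbulent heat exchange velocity (m s^-1), assumed

SECONDS_PER_YEAR = 3600 * 24 * 365.25

def freezing_point(salinity_psu, draft_depth_m):
    """In situ freezing point (deg C) via a common linearization
    T_f = a*S + b + c*z, with z negative below sea level."""
    a, b, c = -0.0572, 0.0788, 7.77e-4
    return a * salinity_psu + b + c * (-abs(draft_depth_m))

def melt_rate(ocean_temp_c, salinity_psu, draft_depth_m):
    """Basal melt rate (m of ice per year) from thermal driving."""
    thermal_driving = ocean_temp_c - freezing_point(salinity_psu, draft_depth_m)
    m_per_s = GAMMA_T * (RHO_W * C_PW) / (RHO_I * L_ICE) * thermal_driving
    return m_per_s * SECONDS_PER_YEAR

# A warm cavity (Amundsen-Sea-like water at depth) melts much faster
# than a cold cavity with near-freezing water.
warm = melt_rate(1.0, 34.5, 800.0)
cold = melt_rate(-1.8, 34.5, 500.0)
```

Even this toy version reproduces the qualitative contrast in the abstract: warm cavities yield melt rates one to two orders of magnitude above cold ones, because the melt rate scales linearly with thermal driving.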

    A high efficiency photon veto for the Light Dark Matter eXperiment

    Fixed-target experiments using primary electron beams can be powerful discovery tools for light dark matter in the sub-GeV mass range. The Light Dark Matter eXperiment (LDMX) is designed to measure missing momentum in high-rate electron fixed-target reactions with beam energies of 4 GeV to 16 GeV. A prerequisite for achieving several important sensitivity milestones is the capability to reject backgrounds associated with few-GeV bremsstrahlung by twelve orders of magnitude while maintaining high efficiency for signal. The primary challenge arises from events with photo-nuclear reactions faking the missing-momentum signature of a dark matter signal. We present a methodology developed for the LDMX detector concept that is capable of the required rejection. By employing a detailed Geant4-based model of the detector response, we demonstrate that the sampling calorimetry proposed for LDMX can achieve better than 10⁻¹³ rejection of few-GeV photons. This suggests that the luminosity-limited sensitivity of LDMX can be realized at 4 GeV and higher beam energies.
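A rejection claim at the 10⁻¹³ level is typically demonstrated statistically: if zero background events survive the veto in a large Monte Carlo sample, an upper limit on the per-photon inefficiency follows from the binomial distribution (the "rule of three": roughly 3/n at 95% CL for n trials). The sketch below shows that arithmetic; the sample size is an illustrative assumption, not LDMX's actual simulated statistics.

```python
import math

def upper_limit_zero_events(n_trials, cl=0.95):
    """Exact binomial upper limit on a per-event probability p when
    0 successes are observed in n_trials: solve (1 - p)^n = 1 - cl,
    i.e. p = 1 - (1 - cl)^(1/n), computed with expm1 for accuracy."""
    return -math.expm1(math.log(1.0 - cl) / n_trials)

# Assumed sample size: ~3e13 simulated few-GeV photons, none passing
# the veto, supports a ~1e-13 upper limit on the veto inefficiency.
n_photons = 3e13
limit = upper_limit_zero_events(n_photons)
```

The exact limit is ln(20)/n ≈ 3.0/n, which is where the shorthand "rule of three" comes from.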

    Structural and Functional Evolution of the Trace Amine-Associated Receptors TAAR3, TAAR4 and TAAR5 in Primates

    The family of trace amine-associated receptors (TAAR) comprises 9 mammalian TAAR subtypes, with intact gene and pseudogene numbers differing considerably even between closely related species. To date, the best-characterized subtype is TAAR1, which activates the Gs protein/adenylyl cyclase pathway upon stimulation by trace amines and psychoactive substances like MDMA or LSD. Recently, a chemosensory function involving recognition of volatile amines was proposed for murine TAAR3, TAAR4 and TAAR5. Humans can smell volatile amines despite carrying open reading frame (ORF) disruptions in TAAR3 and TAAR4. Therefore, we set out to study the functional and structural evolution of these genes with a special focus on primates. Functional analyses showed that ligands activating the murine TAAR3, TAAR4 and TAAR5 do not activate intact primate and mammalian orthologs, although these orthologs evolve under purifying selection and hence must be functional. We also find little evidence for positive selection that could explain the functional differences between mouse and other mammals. Our findings rather suggest that the previously identified volatile-amine TAAR3–5 agonists reflect the high agonist promiscuity of TAAR, and that the ligands driving purifying selection of these TAAR in mouse and other mammals still await discovery. More generally, our study points out how analyses in an evolutionary context can help to interpret functional data generated in single species.