
    Usefulness of the ManageMed Screen (MMS) and the Screening for Self-Medication Safety Post Stroke (S5) for Assessing Medication Management Capacity for Clients Post-Stroke

    Occupational therapists need to efficiently and accurately screen a client’s medication management capacity, especially for clients post-stroke. Most therapists are not aware of, nor do they utilize, specific assessments of medication management capacity. The purpose of this pilot study was to compare the results of the ManageMed Screen (MMS), the Screening for Self-Medication Safety Post Stroke (S5), and the Montreal Cognitive Assessment (MoCA) in a population of rehabilitation clients post-stroke to determine the usefulness of the medication assessment tools in clinical practice. These screens were designed for use in occupational therapy practice as well as other healthcare professions: the MMS was validated for the general adult population, the S5 for clients post-stroke, and the MoCA is a cognitive screen used with adult clients with a variety of diagnoses including stroke. The MoCA was used to explore the potential relationship between cognition and medication management capacity. Study participants included five clients post-stroke and three occupational therapists. Clients were screened by the occupational therapists with the MMS, S5, and MoCA, and the clinicians also participated in a focus group to assess the perceived usefulness of the screens. Results demonstrated that the correlation between the MMS and S5 scores was not statistically significant (r = .671, p = .215). No consistent relationship between the MoCA and the MMS could be established for these five clients: the correlation between the MMS and MoCA scores was non-significant (r = .205, p = .741). The correlation between the S5 and MoCA scores, also computed in SPSS, was likewise non-significant (r = -.287, p = .640). Additionally, through the focus group, clinicians deemed both the MMS and S5 useful, but felt the MMS was the more useful screen for their clinical practice with regard to efficient and practical use with clients post-stroke in a rehabilitation setting.
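    For readers who want to reproduce this style of analysis, the sketch below shows how pairwise Pearson correlations and p-values like those reported above could be computed in Python with scipy; the score arrays are hypothetical placeholders, not the study's data, which were analyzed in SPSS.

```python
# Sketch: pairwise Pearson correlations of the kind reported in the abstract.
# The score lists below are hypothetical placeholders standing in for the
# five clients' screen results; they are NOT the study's actual data.
from scipy.stats import pearsonr

mms_scores  = [8, 6, 9, 5, 7]       # hypothetical MMS scores
s5_scores   = [12, 10, 14, 9, 8]    # hypothetical S5 scores
moca_scores = [22, 18, 25, 16, 20]  # hypothetical MoCA scores

for name, a, b in [("MMS vs S5", mms_scores, s5_scores),
                   ("MMS vs MoCA", mms_scores, moca_scores),
                   ("S5 vs MoCA", s5_scores, moca_scores)]:
    r, p = pearsonr(a, b)
    # With n = 5, a correlation must be very large to reach p < .05,
    # which is why even the reported r = .671 was non-significant.
    print(f"{name}: r = {r:.3f}, p = {p:.3f}")
```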

    Realfast: Real-Time, Commensal Fast Transient Surveys with the Very Large Array

    Radio interferometers have the ability to precisely localize and better characterize the properties of sources. This ability is having a powerful impact on the study of fast radio transients, where a few milliseconds of data are enough to pinpoint a source at cosmological distances. However, recording interferometric data at millisecond cadence produces a terabyte-per-hour data stream that strains networks, computing systems, and archives. This challenge mirrors that of other domains of science, where the science scope is limited by the computational architecture as much as by the physical processes at play. Here, we present a solution to this problem in the context of radio transients: realfast, a commensal, fast transient search system at the Jansky Very Large Array. Realfast uses a novel architecture to distribute fast-sampled interferometric data to a 32-node, 64-GPU cluster for real-time imaging and transient detection. By detecting transients in situ, we can trigger the recording of data for those rare, brief instants when the event occurs and reduce the recorded data volume by a factor of 1000. This makes it possible to commensally search a data stream that would otherwise be impossible to record. This system will search for millisecond transients in more than 1000 hours of data per year, potentially localizing several Fast Radio Bursts, pulsars, and other sources of impulsive radio emission. We describe the science scope for realfast, the system design, expected outcomes, and ways real-time analysis can help in other fields of astrophysics. Comment: Accepted to ApJS Special Issue on Data; 11 pages, 4 figures.
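    As a rough illustration of the data-volume argument above, the sketch below does the back-of-the-envelope arithmetic for commensal triggering: the ~1 TB/hour raw rate comes from the abstract, while the cutout length and candidate rate are assumed round numbers, not figures from the paper.

```python
# Back-of-the-envelope sketch of the commensal-triggering data savings
# described above. The raw rate follows the abstract (~1 TB/hour of
# millisecond-cadence visibilities); the trigger window length and event
# rate are assumed round numbers for illustration only.
raw_rate_tb_per_hour = 1.0      # ~1 TB/hour fast-sampled stream (abstract)
window_s = 3.0                  # assumed cutout length saved per candidate
candidates_per_hour = 1.0       # assumed candidate trigger rate

raw_bytes_per_s = raw_rate_tb_per_hour * 1e12 / 3600.0
saved_bytes_per_hour = raw_bytes_per_s * window_s * candidates_per_hour
reduction = (raw_rate_tb_per_hour * 1e12) / saved_bytes_per_hour

print(f"raw stream: {raw_bytes_per_s / 1e6:.0f} MB/s")
print(f"saved per hour: {saved_bytes_per_hour / 1e9:.2f} GB")
print(f"reduction factor: ~{reduction:.0f}x")   # order 1000, as in the abstract
```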

    Hydrogen Epoch of Reionization Array (HERA)

    The Hydrogen Epoch of Reionization Array (HERA) is a staged experiment to measure 21 cm emission from the primordial intergalactic medium (IGM) throughout cosmic reionization (z = 6-12), and to explore earlier epochs of our Cosmic Dawn (z ~ 30). During these epochs, early stars and black holes heated and ionized the IGM, introducing fluctuations in 21 cm emission. HERA is designed to characterize the evolution of the 21 cm power spectrum to constrain the timing and morphology of reionization, the properties of the first galaxies, the evolution of large-scale structure, and the early sources of heating. The full HERA instrument will be a 350-element interferometer in South Africa consisting of 14-m parabolic dishes observing from 50 to 250 MHz. Currently, 19 dishes have been deployed on site and the next 18 are under construction. HERA has been designated as an SKA Precursor instrument. In this paper, we summarize HERA's scientific context and provide forecasts for its key science results. After reviewing the current state of the art in foreground mitigation, we use the delay-spectrum technique to motivate high-level performance requirements for the HERA instrument. Next, we present the HERA instrument design, along with the subsystem specifications that ensure that HERA meets its performance requirements. Finally, we summarize the schedule and status of the project. We conclude by suggesting that, given the realities of foreground contamination, current-generation 21 cm instruments are approaching their sensitivity limits. HERA is designed to bring both the sensitivity and the precision to deliver its primary science on the basis of proven foreground filtering techniques, while developing new subtraction techniques to unlock new capabilities. The result will be a major step toward realizing the widely recognized scientific potential of 21 cm cosmology. Comment: 26 pages, 24 figures, 2 tables.
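    The delay-spectrum technique referenced above can be illustrated with a minimal sketch: Fourier transforming a visibility spectrum along frequency confines spectrally smooth foregrounds to low delays, leaving higher delays available for the 21 cm signal. The band, foreground model, and taper below are assumptions for illustration, not HERA's analysis pipeline.

```python
# Minimal sketch of the delay-spectrum idea: a smooth power-law foreground
# plus noise, Fourier transformed along frequency. The smooth component
# concentrates at low delays. All numbers are assumed for illustration.
import numpy as np

nfreq = 1024
freqs = np.linspace(100e6, 200e6, nfreq)              # assumed 100-200 MHz band
smooth_foreground = 1e3 * (freqs / 150e6) ** -2.5     # assumed power-law spectrum
noise = np.random.normal(scale=1.0, size=nfreq)       # stand-in for signal + noise
visibility = smooth_foreground + noise

# Taper before transforming to control spectral leakage, then FFT to delay.
window = np.blackman(nfreq)
delay_spectrum = np.fft.fftshift(np.fft.fft(visibility * window))
delays = np.fft.fftshift(np.fft.fftfreq(nfreq, d=freqs[1] - freqs[0]))

power = np.abs(delay_spectrum) ** 2
print("peak power at delay (ns):", delays[np.argmax(power)] * 1e9)  # ~0 for smooth foregrounds
```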

    On the complexity of the Saccharomyces bayanus taxon: hybridization and potential hybrid speciation

    Although the genus Saccharomyces has been thoroughly studied, some species in the genus have not yet been accurately resolved; an example is S. bayanus, a taxon that includes genetically diverse lineages of pure and hybrid strains. This diversity makes the assignment and classification of strains belonging to this species unclear and controversial. They have been subdivided by some authors into two varieties (bayanus and uvarum), which have been raised to the species level by others. In this work, we evaluate the complexity of 46 different strains included in the S. bayanus taxon by means of PCR-RFLP analysis and by sequencing of 34 gene regions and one mitochondrial gene. Using the sequence data, and based on the S. bayanus var. bayanus reference strain NBRC 1948, a hypothetical pure S. bayanus was reconstructed for these genes; it showed alleles with similarity values lower than 97% with the S. bayanus var. uvarum strain CBS 7001, and of 99–100% with the non-S. cerevisiae portion of S. pastorianus Weihenstephan 34/70 and with the new species S. eubayanus. Among the S. bayanus strains under study, different levels of homozygosity, hybridization and introgression were found; however, no pure S. bayanus var. bayanus strain was identified. These S. bayanus hybrids can be classified into two types: homozygous (type I) and heterozygous (type II) hybrids, indicating that they originated from different hybridization processes. Therefore, a putative evolutionary scenario involving two different hybridization events between S. bayanus var. uvarum and unknown European S. eubayanus-like strains can be postulated to explain the genomic diversity observed in our S. bayanus var. bayanus strains.
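    The allele assignment logic described above, based on pairwise sequence similarity against reference strains, can be sketched as a simple threshold rule; the gene names, identity values, and exact cutoffs in the example below are hypothetical stand-ins, not the study's data or method.

```python
# Illustrative sketch of threshold-based allele assignment: compare a
# strain's allele against two reference lineages and label it by percent
# identity. Thresholds are loosely modeled on the abstract (<97% vs.
# S. bayanus var. uvarum, 99-100% vs. S. eubayanus); gene names and
# identity values below are hypothetical.
def assign_allele(identity_vs_uvarum: float, identity_vs_eubayanus: float) -> str:
    """Assign an allele to a parental lineage from pairwise % identity."""
    if identity_vs_eubayanus >= 99.0:
        return "S. eubayanus-like (var. bayanus parent)"
    if identity_vs_uvarum >= 99.0:
        return "S. bayanus var. uvarum"
    return "divergent / possible introgression"

# Hypothetical gene-by-gene identities for one strain (not real data):
alleles = {"MET2": (96.1, 99.6), "GSY1": (99.8, 96.4), "CAT8": (95.2, 99.3)}
for gene, (uva, euba) in alleles.items():
    print(gene, "->", assign_allele(uva, euba))
# A strain carrying a mix of both parental alleles across genes would look
# like one of the hybrid types discussed above.
```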

    Optimizing sparse RFI prediction using deep learning

    Radio frequency interference (RFI) is an ever-present limiting factor for radio telescopes, even in the most remote observing locations. When looking to retain the maximum amount of sensitivity and reduce contamination for Epoch of Reionization studies, the identification and removal of RFI is especially important. In addition to improved RFI identification, we must also take into account the computational efficiency of the RFI-identification algorithm as radio interferometer arrays such as the Hydrogen Epoch of Reionization Array (HERA) grow larger in number of receivers. To address this, we present a deep fully convolutional neural network (DFCN) that is comprehensive in its use of interferometric data, where both amplitude and phase information are used jointly for identifying RFI. We train the network using simulated HERA visibilities containing mock RFI, yielding a known ‘ground truth’ data set for evaluating the accuracy of various RFI algorithms. Evaluation of the DFCN model is performed on observations from the 67-dish build-out, HERA-67, and achieves a data throughput of 1.6 × 10^5 HERA time-ordered 1024-channel visibilities per hour per GPU. We determine that, relative to an amplitude-only network, including visibility phase adds important adjacent time–frequency context, which increases discrimination between RFI and non-RFI. Including phase at prediction time achieves a recall of 0.81, a precision of 0.58, and an F2 score of 0.75 as applied to our HERA-67 observations.
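    For reference, the F2 score quoted above weights recall more heavily than precision; the short check below reproduces the reported value from the stated precision and recall using the standard F-beta formula (the formula is standard, the numbers come from the abstract).

```python
# Quick check of the F-beta relationship behind the numbers above:
# F_beta = (1 + beta^2) * P * R / (beta^2 * P + R), with beta = 2
# weighting recall more heavily than precision.
def f_beta(precision: float, recall: float, beta: float = 2.0) -> float:
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

precision, recall = 0.58, 0.81                  # values reported in the abstract
print(f"F2 = {f_beta(precision, recall):.2f}")  # ~0.75, matching the abstract
```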

    Phase II study of two dose schedules of C.E.R.A. (Continuous Erythropoietin Receptor Activator) in anemic patients with advanced non-small cell lung cancer (NSCLC) receiving chemotherapy

    BACKGROUND: C.E.R.A. (Continuous Erythropoietin Receptor Activator) is an innovative agent with unique erythropoietin receptor activity and a prolonged half-life. This study evaluated C.E.R.A. once weekly (QW) or once every 3 weeks (Q3W) in patients with anemia and advanced non-small cell lung cancer (NSCLC) receiving chemotherapy. METHODS: In this Phase II, randomized, open-label, multicenter, dose-finding study, patients (n = 218) with Stage IIIB or IV NSCLC and hemoglobin (Hb) ≤ 11 g/dL were randomized to one of six treatment groups of C.E.R.A. administered subcutaneously for 12 weeks: 0.7, 1.4, or 2.1 μg/kg QW or 2.1, 4.2, or 6.3 μg/kg Q3W. The primary endpoint was the average Hb level between baseline and end of initial treatment (defined as the last Hb measurement before dose reduction or transfusion, or the value at week 13). Hematopoietic response (Hb increase ≥ 2 g/dL or achievement of Hb ≥ 12 g/dL, with no blood transfusion in the previous 28 days, determined in two consecutive measurements within a 10-day interval) was also measured. RESULTS: Dose-dependent Hb increases were observed, although the magnitude of increase was moderate. Hematopoietic response rate was also dose dependent, achieved by 51% and 62% of patients in the 4.2 and 6.3 μg/kg Q3W groups, respectively, and 63% of the 2.1 μg/kg QW group. In the Q3W groups, the proportion of early responders (defined as ≥ 1 g/dL increase in Hb from baseline during the first 22 days) increased with increasing C.E.R.A. dose, reaching 41% with the highest dose. In the 6.3 μg/kg Q3W group, 15% of patients received a blood transfusion. There was a trend toward higher mean Hb increases and lower transfusion use in the Q3W groups than in the QW groups. C.E.R.A. was generally well tolerated. CONCLUSION: C.E.R.A. administered QW or Q3W showed clinical activity and safety in patients with NSCLC. There were dose-dependent increases in Hb responses. C.E.R.A. appeared to be more effective when the same dose over time was given Q3W rather than QW, with a suggestion that C.E.R.A. 6.3 μg/kg Q3W provided the best efficacy in this study. However, further dose-finding studies using higher doses are required to determine the optimal C.E.R.A. dose regimen in cancer patients receiving chemotherapy.
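    The hematopoietic-response endpoint defined above is a rule that can be expressed directly in code; the sketch below is one possible reading of that criterion, with hypothetical measurement records rather than trial data.

```python
# Sketch of the hematopoietic-response criterion described above: Hb increase
# >= 2 g/dL from baseline or Hb >= 12 g/dL, confirmed in two consecutive
# measurements within a 10-day interval, with no transfusion in the prior
# 28 days. One possible reading of the rule; the records below are
# hypothetical, not trial data.
from datetime import date

def meets_threshold(hb: float, baseline_hb: float) -> bool:
    return (hb - baseline_hb) >= 2.0 or hb >= 12.0

def hematopoietic_response(baseline_hb, measurements, transfusion_dates):
    """measurements: list of (date, hb) in chronological order."""
    for (d1, hb1), (d2, hb2) in zip(measurements, measurements[1:]):
        consecutive = (d2 - d1).days <= 10
        no_recent_transfusion = all(
            (d2 - t).days > 28 or (d2 - t).days < 0 for t in transfusion_dates
        )
        if consecutive and no_recent_transfusion and \
           meets_threshold(hb1, baseline_hb) and meets_threshold(hb2, baseline_hb):
            return True
    return False

# Hypothetical example:
measurements = [(date(2024, 1, 1), 10.2), (date(2024, 2, 5), 12.1), (date(2024, 2, 12), 12.4)]
print(hematopoietic_response(baseline_hb=10.0, measurements=measurements, transfusion_dates=[]))
```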

    Impacts and Statistical Mitigation of Missing Data on the 21 cm Power Spectrum: A Case Study with the Hydrogen Epoch of Reionization Array

    The precise characterization and mitigation of systematic effects is one of the biggest roadblocks impeding the detection of the fluctuations of cosmological 21 cm signals. Missing data in radio cosmological experiments, often due to radio frequency interference (RFI), pose a particular challenge to power spectrum analysis, as they can lead to the ringing of bright foreground modes in Fourier space, heavily contaminating the cosmological signals. Here we show that the problem of missing data becomes even more arduous in the presence of systematic effects. Using a realistic numerical simulation, we demonstrate that partially flagged data combined with systematic effects can introduce significant foreground ringing. We show that such an effect can be mitigated through inpainting the missing data. We present a rigorous statistical framework that incorporates the process of inpainting missing data into a quadratic estimator of the 21 cm power spectrum. Under this framework, the uncertainties associated with our inpainting method and its impact on power spectrum statistics can be understood. These results are applied to the latest Phase II observations taken by the Hydrogen Epoch of Reionization Array, forming a crucial component in power spectrum analyses as we move toward detecting 21 cm signals in an increasingly noisy RFI environment.
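    To make the ringing argument concrete, the toy sketch below flags a few channels of a smooth foreground spectrum and compares the power far from zero delay with and without a simple inpainting step; the spectrum, flag pattern, and linear-interpolation fill are illustrative assumptions, not the statistical framework or inpainting method of the paper.

```python
# Toy illustration of the effect described above: gaps (flagged RFI channels)
# in an otherwise smooth foreground spectrum scatter power to high delays when
# Fourier transformed, and filling (inpainting) the gaps suppresses that
# ringing. The spectrum, flag pattern, and simple linear interpolation used
# here are illustrative assumptions only.
import numpy as np

nfreq = 512
freqs = np.linspace(100e6, 200e6, nfreq)
spectrum = 1e3 * (freqs / 150e6) ** -2.5          # smooth foreground

flags = np.zeros(nfreq, dtype=bool)
flags[200:210] = True                             # a block of flagged channels
flags[400] = True

gapped = spectrum.copy()
gapped[flags] = 0.0                               # flagged data zeroed out

inpainted = spectrum.copy()
inpainted[flags] = np.interp(freqs[flags], freqs[~flags], spectrum[~flags])

def high_delay_power(x):
    d = np.abs(np.fft.fft(x * np.blackman(nfreq))) ** 2
    return d[50:nfreq // 2].sum()                 # power well away from zero delay

print("high-delay power, flagged  :", high_delay_power(gapped))
print("high-delay power, inpainted:", high_delay_power(inpainted))
```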

    Methods of Error Estimation for Delay Power Spectra in 21 cm Cosmology

    Precise measurements of the 21 cm power spectrum are crucial for understanding the physical processes of hydrogen reionization. Currently, this probe is being pursued by low-frequency radio interferometer arrays. As these experiments come closer to making a first detection of the signal, error estimation will play an increasingly important role in setting robust measurements. Using the delay power spectrum approach, we have produced a critical examination of different ways that one can estimate error bars on the power spectrum. We do this through a synthesis of analytic work, simulations of toy models, and tests on small amounts of real data. We find that, although computed independently, the different error bar methodologies are in good agreement with each other in the noise-dominated regime of the power spectrum. For our preferred methodology, the predicted probability distribution function is consistent with the empirical noise power distributions from both simulated and real data. This analysis is mainly in support of the forthcoming HERA upper limit, but is also expected to be more generally applicable.
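    One of the checks described above, comparing the empirical scatter of noise-only power estimates against an analytic expectation in the noise-dominated regime, can be sketched with a toy Monte Carlo; the noise level, channel count, and estimator below are assumptions for illustration, not the paper's preferred methodology.

```python
# Toy Monte Carlo of a noise-dominated delay power estimate: cross-multiply
# the Fourier transforms of two independent noise realizations (so the
# estimator has zero mean without signal) and compare the empirical scatter
# to the analytic expectation. All parameters are assumed for illustration.
import numpy as np

rng = np.random.default_rng(0)
nfreq, nreal, sigma = 256, 10000, 1.0

def noise_visibility():
    return rng.normal(scale=sigma, size=nfreq) + 1j * rng.normal(scale=sigma, size=nfreq)

powers = np.array([
    np.real(np.fft.fft(noise_visibility()) * np.conj(np.fft.fft(noise_visibility())))[10]
    for _ in range(nreal)
])

# Each FFT bin's real and imaginary parts have variance nfreq * sigma^2, so
# Re(X * conj(Y)) of two independent bins has variance 2 * (nfreq * sigma^2)^2.
empirical_std = powers.std()
analytic_std = np.sqrt(2) * nfreq * sigma ** 2
print(f"empirical std: {empirical_std:.1f}, analytic std: {analytic_std:.1f}")
```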