Observational Mass-to-Light Ratio of Galaxy Systems: from Poor Groups to Rich Clusters
We study the mass-to-light ratio of galaxy systems from poor groups to rich
clusters, and present for the first time a large database for useful
comparisons with theoretical predictions. We extend a previous work, where B_j
band luminosities and optical virial masses were analyzed for a sample of 89
clusters. Here we also consider a sample of 52 more clusters, 36 poor clusters,
7 rich groups, and two catalogs, of about 500 groups each, recently identified
in the Nearby Optical Galaxy sample by using two different algorithms. We
obtain the blue luminosity and virial mass for all systems considered. We
devote a large effort to establishing the homogeneity of the resulting values,
as well as to considering comparable physical regions, i.e. those included
within the virial radius. By analyzing a fiducial, combined sample of 294
systems we find that the mass increases faster than the luminosity: the linear
fit gives M\propto L_B^{1.34 \pm 0.03}, with a tendency for a steeper increase
in the low--mass range. Our present results agree with the previous work, but
are superior owing to the much higher statistical significance and the wider
dynamical range covered (about 10^{12}-10^{15} M_solar). We present a
comparison between our results and the theoretical predictions on the relation
between M/L_B and halo mass, obtained by combining cosmological numerical
simulations and semianalytic modeling of galaxy formation.
Comment: 25 pages, 12 eps figures, accepted for publication in Ap
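The power-law fit quoted in the abstract can be illustrated with a minimal sketch: an ordinary least-squares fit in log-log space recovers the slope b of M ∝ L_B^b. The numbers below are synthetic, not the paper's data; only the slope value 1.34 comes from the abstract.

```python
import random
import statistics

# Hypothetical illustration (not the paper's measurements): recovering the
# power-law slope b in M ∝ L_B^b via a linear fit in log-log space, the same
# kind of fit the abstract quotes (b = 1.34 ± 0.03).
random.seed(0)

true_slope = 1.34
log_L = [random.uniform(9.0, 12.5) for _ in range(300)]            # fake log10 L_B
log_M = [2.0 + true_slope * x + random.gauss(0.0, 0.1) for x in log_L]  # fake log10 M

# OLS slope in log space: b = cov(log L, log M) / var(log L)
mx, my = statistics.fmean(log_L), statistics.fmean(log_M)
cov = sum((x - mx) * (y - my) for x, y in zip(log_L, log_M))
var = sum((x - mx) ** 2 for x in log_L)
b = cov / var
print(f"fitted slope b = {b:.2f}")
```

With 300 points and small scatter, the recovered slope is close to the input value, mirroring how a mass increasing faster than luminosity shows up as b > 1 in such a fit.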
Lowering the energy threshold in COSINE-100 dark matter searches
COSINE-100 is a dark matter detection experiment that uses NaI(Tl) crystal
detectors operating at the Yangyang underground laboratory in Korea since
September 2016. Its main goal is to test the annual modulation observed by the
DAMA/LIBRA experiment with the same target medium. Recently DAMA/LIBRA has
released data with an energy threshold lowered to 1 keV, and the persistent
annual modulation behavior is still observed at 9.5σ. By lowering the
energy threshold for electron recoils to 1 keV, COSINE-100 annual modulation
results can be compared to those of DAMA/LIBRA in a model-independent way.
Additionally, the event selection methods provide access to dark matter
particles from a few GeV down to the sub-GeV range through constant-rate
studies. In this article, we discuss the COSINE-100 event selection algorithm,
its validation, and the efficiencies near the threshold.
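The model-independent comparison described above rests on fitting the standard annual-modulation form, R(t) = R0 + Sm·cos(2π(t − t0)/T), with period T = 365.25 d and phase t0 fixed at day 152.5 (June 2), as in the DAMA/LIBRA analyses. A minimal sketch with fabricated rates (not COSINE-100 code):

```python
import math
import random

# Illustrative sketch: extracting the modulation amplitude Sm from binned
# event rates, with the phase fixed so Sm is a linear parameter.
# All rate values below are made up.
T, t0 = 365.25, 152.5
R0, Sm_true = 3.0, 0.01          # fake constant rate and modulation amplitude

random.seed(1)
days = list(range(0, 1460, 15))                           # ~4 years of bins
cosw = [math.cos(2 * math.pi * (t - t0) / T) for t in days]
rate = [R0 + Sm_true * c + random.gauss(0.0, 0.002) for c in cosw]

# OLS of the rate against the cosine template gives Sm directly.
mc = sum(cosw) / len(cosw)
mr = sum(rate) / len(rate)
num = sum((c - mc) * (r - mr) for c, r in zip(cosw, rate))
den = sum((c - mc) ** 2 for c in cosw)
Sm = num / den
print(f"fitted modulation amplitude Sm = {Sm:.4f}")
```

Lowering the analysis threshold enlarges the event sample entering such a fit, which is why the threshold work described in the abstract directly sharpens the modulation comparison.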
Taboo and Rewriting in the Italian Translation of A Casa dos Budas Ditosos
The most recent translation studies propose a descriptive rather than prescriptive approach, and that is the perspective adopted throughout the present work. This research presents a case study that examines, in descriptive terms, the rewriting of the novel A casa dos budas ditosos by João Ubaldo Ribeiro and its insertion into the Italian cultural system. Drawing on Itamar Even-Zohar's polysystem theory and André Lefevere's theory of rewriting, the Italian and Brazilian cultural contexts are analyzed in order to identify the systemic position of the work; the paratextual elements and the reviews are examined to uncover the reasons for publishing the translation of the novel; and finally the textual elements, that is, various segments of the source text and the target text, are studied with the aim of identifying the possible norms that shaped the translator's behavior during the translation process. The purpose of the research is to understand how cultural and imaginary-related issues were handled and, above all, how the taboos were translated.
KEYWORDS: Taboo, Rewriting, Translation, A casa dos budas ditosos, João Ubaldo Ribeiro
A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline
The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the
handling of the scientific and housekeeping telemetry. It is a critical
component of the Planck ground segment, which has to adhere strictly to the
project schedule to be ready for launch and flight operations. In order to
guarantee the quality necessary to achieve the objectives of the Planck
mission, the design and development of the Level 1 software has followed the
ESA Software Engineering Standards. A fundamental step in the software life
cycle is the Verification and Validation of the software. The purpose of this
work is to show an example of procedures, test development and analysis
successfully applied to a key software project of an ESA mission. We present
the end-to-end validation tests performed on the Level 1 of the LFI-DPC, by
detailing the methods used and the results obtained. Different approaches have
been used to test the scientific and housekeeping data processing. Scientific
data processing has been tested by injecting signals with known properties
directly into the acquisition electronics, in order to generate a test dataset
of real telemetry data and reproduce as much as possible nominal conditions.
For the HK telemetry processing, validation software has been developed to
inject known parameter values into a set of real housekeeping packets and
perform a comparison with the corresponding timelines generated by the Level 1.
With the proposed validation and verification procedure, where the on-board and
ground processing are viewed as a single pipeline, we demonstrated that the
scientific and housekeeping processing of the Planck-LFI raw data is correct
and meets the project requirements.
Comment: 20 pages, 7 figures; this paper is part of the Prelaunch status LFI
papers published on JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/jins
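The housekeeping validation strategy described above, injecting known parameter values into packets and comparing them with the timelines produced by the processing chain, can be sketched with a toy packet layout. The format below is invented for illustration; it is not the real Planck telemetry format.

```python
import struct

# Sketch of an end-to-end HK check: encode known (timestamp, value) pairs,
# run them through a decoding step, and verify the reconstructed timeline
# matches what was injected. Packet layout is hypothetical.

def encode_hk_packet(timestamp: int, value: float) -> bytes:
    """Pack one (timestamp, parameter value) pair into a toy HK packet."""
    return struct.pack(">If", timestamp, value)

def decode_hk_packet(packet: bytes):
    """Ground-segment side: unpack the packet back into a timeline sample."""
    return struct.unpack(">If", packet)

# Known parameter values injected at the source...
injected = [(t, 20.0 + 0.1 * t) for t in range(10)]
packets = [encode_hk_packet(t, v) for t, v in injected]

# ...and the timeline reconstructed by the processing chain.
timeline = [decode_hk_packet(p) for p in packets]

# End-to-end check: every reconstructed sample matches the injection
# (a small tolerance absorbs the float32 round-trip).
ok = all(t_out == t_in and abs(v_out - v_in) < 1e-4
         for (t_in, v_in), (t_out, v_out) in zip(injected, timeline))
print("validation passed:", ok)
```

Viewing encoding and decoding as a single pipeline, as the abstract does for the on-board and ground processing, makes the comparison a direct equality check on the timeline.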
Optimization of Planck/LFI on--board data handling
To assess stability against 1/f noise, the Low Frequency Instrument (LFI)
onboard the Planck mission will acquire data at a rate much higher than that
allowed by its telemetry bandwidth of 35.5 kbps. The data are processed by an
onboard pipeline, followed on the ground by a reversing step. This paper
illustrates the LFI scientific onboard processing used to fit within the
allowed data rate. This is a lossy process tuned by a set of five parameters
(Naver, r1, r2, q, O) for each of the 44 LFI detectors. The paper quantifies
the level
of distortion introduced by the onboard processing, EpsilonQ, as a function of
these parameters. It describes the method of optimizing the onboard processing
chain. The tuning procedure is based on an optimization algorithm applied to
unprocessed and uncompressed raw data provided by simulations, by prelaunch
tests, or by the LFI operating in diagnostic mode. All the needed
optimization steps are performed by an automated tool, OCA2, which ends with
optimized parameters and produces a set of statistical indicators, among them
the compression rate Cr and EpsilonQ. For Planck/LFI the requirements are Cr =
2.4 and EpsilonQ <= 10% of the rms of the instrumental white noise. To speed
up the process, an analytical model is developed that is able to extract most of
the relevant information on EpsilonQ and Cr as a function of the signal
statistics and the processing parameters. This model will be of interest for
the instrument data analysis. The method was applied during ground tests when
the instrument was operating in conditions representative of flight. Optimized
parameters were obtained and the performance verified: the required data rate
of 35.5 kbps was achieved while keeping EpsilonQ at 3.8% of the white noise
rms, well within the requirements.
Comment: 51 pages, 13 figures, 3 tables, pdflatex, needs JINST.csl, graphicx,
txfonts, rotating; Issue 1.0 10 nov 2009; Sub. to JINST 23Jun09, Accepted
10Nov09, Pub.: 29Dec09; This is a preprint, not the final version
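Two of the onboard steps described above, averaging over Naver consecutive samples and quantizing with step q and offset O, can be sketched as follows. The parameter values are made up, and the real LFI chain also includes the r1/r2 sky-load mixing and lossless compression, which are omitted here.

```python
import math
import random

# Toy sketch of averaging + quantization and of measuring the resulting
# distortion EpsilonQ as a fraction of the white-noise rms (the abstract's
# requirement is EpsilonQ <= 10%). Parameter values are illustrative only.
random.seed(2)
Naver, q, O = 4, 0.05, 0.0
sigma_white = 1.0                    # rms of the simulated white noise

raw = [random.gauss(0.0, sigma_white) for _ in range(40000)]

# Step 1: average Naver consecutive samples.
avg = [sum(raw[i:i + Naver]) / Naver for i in range(0, len(raw), Naver)]

# Step 2: quantize with step q and offset O (what gets telemetered)...
quantized = [round((x - O) / q) for x in avg]
# ...and the on-ground "reversing" step that reconstructs the signal.
restored = [n * q + O for n in quantized]

# EpsilonQ: rms of the quantization distortion over the white-noise rms.
err = [a - b for a, b in zip(avg, restored)]
eps_q = math.sqrt(sum(e * e for e in err) / len(err)) / sigma_white
print(f"EpsilonQ = {100 * eps_q:.1f}% of white-noise rms")
```

For uniform quantization the distortion rms is roughly q/√12, so the step size q (together with Naver) is the main lever trading compression ratio against EpsilonQ, which is what the tuning procedure in the abstract optimizes.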
Off-line radiometric analysis of Planck/LFI data
The Planck Low Frequency Instrument (LFI) is an array of 22
pseudo-correlation radiometers on-board the Planck satellite to measure
temperature and polarization anisotropies in the Cosmic Microwave Background
(CMB) in three frequency bands (30, 44 and 70 GHz). To calibrate and verify the
performance of the LFI, a software suite named LIFE has been developed. Its
aims are to provide a common platform for analyzing the results of the
tests performed on the single components of the instrument (RCAs, Radiometric
Chain Assemblies) and on the integrated Radiometric Array Assembly (RAA).
Moreover, its analysis tools are designed to be used during flight as well,
to produce periodic reports on the status of the instrument. The LIFE suite has
been developed using a multi-layered, cross-platform approach. It implements a
number of analysis modules written in RSI IDL, each accessing the data through
a portable and heavily optimized library of functions written in C and C++. One
of the most important features of LIFE is its ability to run the same data
analysis codes both using ground test data and real flight data as input. The
LIFE software suite has been successfully used during the RCA/RAA tests and the
Planck Integrated System Tests. Moreover, the software has also passed the
verification for its in-flight use during the System Operations Verification
Tests, held in October 2008.
Comment: Planck LFI technical papers published by JINST:
http://www.iop.org/EJ/journal/-page=extra.proc5/1748-022
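The layered design described above, analysis modules sitting on a common data-access library so the same code runs on ground test data and flight data, can be illustrated schematically. The real LIFE suite is written in IDL over a C/C++ library; the names below are invented for illustration only.

```python
from abc import ABC, abstractmethod

# Sketch of the design idea only: one data-access interface, so an analysis
# module is written once and runs unchanged on either data origin.

class DataSource(ABC):
    """Common interface the analysis modules are written against."""
    @abstractmethod
    def load_timeline(self, channel: str) -> list[float]: ...

class GroundTestSource(DataSource):
    def load_timeline(self, channel):
        return [0.1, 0.2, 0.3]        # stand-in for test-campaign data

class FlightSource(DataSource):
    def load_timeline(self, channel):
        return [0.11, 0.19, 0.31]     # stand-in for decoded flight telemetry

def mean_level(source: DataSource, channel: str) -> float:
    """An analysis module: identical code path for both data origins."""
    data = source.load_timeline(channel)
    return sum(data) / len(data)

for src in (GroundTestSource(), FlightSource()):
    print(type(src).__name__, round(mean_level(src, "LFI27M"), 3))
```

Keeping the origin-specific logic behind one interface is what lets the same analysis codes validate the instrument on the ground and then monitor it in flight, as the abstract highlights.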
Planck pre-launch status: Low Frequency Instrument calibration and expected scientific performance
We give the calibration and scientific performance parameters of the Planck
Low Frequency Instrument (LFI) measured during the ground cryogenic test
campaign. These parameters characterise the instrument response and constitute
our best pre-launch knowledge of the LFI scientific performance. The LFI shows
excellent stability and rejection of instrumental systematic effects;
measured noise performance shows that LFI is the most sensitive instrument of
its kind. The set of measured calibration parameters will be updated during
flight operations through the end of the mission.
Comment: Accepted for publication in Astronomy and Astrophysics. Astronomy &
Astrophysics, 2010 (acceptance date: 12 Jan 2010)
Chromosomal imbalances are uncommon in chagasic megaesophagus
Background: Chagas' disease is a human tropical parasitic illness, and a subset of chronic patients develop megaesophagus or megacolon. The esophageal dilation is known as chagasic megaesophagus (CM), and one of the severe late consequences of CM is an increased risk of esophageal squamous cell carcinoma (ESCC). Based on the association between CM and ESCC, we investigated whether genes frequently showing unbalanced copy numbers in ESCC were altered in CM, using fluorescence in situ hybridization (FISH).
Methods: A total of 50 formalin-fixed, paraffin-embedded esophageal mucosa specimens (40 chagasic megaesophagus, CM; 10 normal esophageal mucosa, NM) were analyzed. DNA FISH probes were tested for the FHIT, TP63, PIK3CA, EGFR, FGFR1, MYC, CDKN2A, YES1 and NCOA3 genes, and for centromeric sequences from chromosomes 3, 7 and 9.
Results: No differences between the superficial and basal layers of the epithelial mucosa were found, except for a loss of EGFR copy number in the esophageal basal layer of the CM group. The mean copy numbers of CDKN2A and CEP9 and the frequency of nuclei with loss of PIK3CA were significantly different in the CM group compared with normal mucosa; marginal levels of deletions in TP63, FHIT, PIK3CA, EGFR, CDKN2A and YES1, and gains at PIK3CA, TP63, FGFR1, MYC, CDKN2A and NCOA3, were detected in a few CM cases, mainly with dilation grades III and IV. All changes occurred at very low levels.
Conclusions: Genomic imbalances common in esophageal carcinomas are not present in chagasic megaesophagus, suggesting that these features will not be effective markers for risk assessment of ESCC in patients with chagasic megaesophagus.
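The kind of group comparison reported above (mean copy number in CM versus NM specimens) can be sketched with a simple permutation test. All numbers below are fabricated for illustration; they are not the study's measurements.

```python
import random
import statistics

# Illustrative permutation test: does mean gene copy number per specimen
# differ between a CM group and an NM group? Data are made up.
random.seed(3)
cm = [1.8, 1.9, 2.1, 1.7, 2.0, 1.9, 1.8, 2.0]   # fake CM mean copy numbers
nm = [2.0, 2.1, 1.9, 2.0, 2.1]                  # fake NM mean copy numbers

observed = statistics.fmean(cm) - statistics.fmean(nm)

# Shuffle group labels many times and count how often a difference at least
# as extreme as the observed one arises by chance.
pooled = cm + nm
n_perm = 10000
count = 0
for _ in range(n_perm):
    random.shuffle(pooled)
    diff = (statistics.fmean(pooled[:len(cm)])
            - statistics.fmean(pooled[len(cm):]))
    if abs(diff) >= abs(observed):
        count += 1
p_value = count / n_perm
print(f"mean difference = {observed:.3f}, permutation p = {p_value:.3f}")
```

A large p-value here is consistent with the study's conclusion that the copy-number changes in CM, where present at all, occur at very low levels.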
