4,944 research outputs found

    An Economic Analysis of Electron Accelerators and Cobalt-60 for Irradiating Food

    Average costs per pound of irradiating food are similar for the electron accelerator and cobalt-60 irradiators analyzed in this study, but initial investment costs can vary by $1 million. Irradiation costs range from 0.5 to 7 cents per pound and decrease as annual volumes treated increase. Cobalt-60 is less expensive than electron beams for annual volumes below 50 million pounds. For radiation source requirements above the equivalent of 1 million curies of cobalt-60, electron beams are more economical.
    Keywords: food irradiation, electron accelerators, cobalt-60, cost comparison, economies of size, Food Consumption/Nutrition/Food Safety
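
    The economies-of-size claim above can be sketched numerically: average cost per pound falls as a fixed annual cost is spread over more volume. The breakeven volume (about 50 million pounds per year) comes from the abstract; the fixed- and variable-cost parameters below are hypothetical placeholders chosen only so the outputs land in the abstract's 0.5 to 7 cents-per-pound range.

```python
def cheaper_source(annual_volume_lbs):
    """Pick the cheaper irradiation source for a given annual volume.

    The ~50 million lbs/year breakeven is taken from the abstract
    (equivalently, ~1 million curies of cobalt-60); above it,
    electron beams win.
    """
    return "cobalt-60" if annual_volume_lbs < 50_000_000 else "electron beam"


def avg_cost_cents_per_lb(annual_volume_lbs,
                          fixed_annual_cost_dollars=1_500_000,
                          variable_cents_per_lb=0.5):
    """Average cost per pound: fixed cost spread over volume plus a
    per-pound variable cost. Parameter values are illustrative, not
    from the study."""
    fixed_cents = fixed_annual_cost_dollars * 100
    return fixed_cents / annual_volume_lbs + variable_cents_per_lb
```

    With these placeholder parameters, a 30-million-pound plant averages 5.5 cents per pound while a 300-million-pound plant averages 1.0 cent, reproducing the shape (though not the exact numbers) of the cost curves in the study.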

    Horn antenna with v-shaped corrugated surface

    The corrugated shape is easily machined for millimeter-wave applications and is better suited to folding antenna designs. Measured performance showed that "V" corrugations and rectangular corrugations have nearly the same pattern beamwidth, gain, and impedance; however, "V" corrugations have higher relative power loss.

    Deep Structured Features for Semantic Segmentation

    We propose a highly structured neural network architecture for semantic segmentation with an extremely small model size, suitable for low-power embedded and mobile platforms. Specifically, our architecture combines i) a Haar wavelet-based tree-like convolutional neural network (CNN), ii) a random layer realizing a radial basis function kernel approximation, and iii) a linear classifier. While stages i) and ii) are completely pre-specified, only the linear classifier is learned from data. We apply the proposed architecture to outdoor scene and aerial image semantic segmentation and show that the accuracy of our architecture is competitive with conventional pixel classification CNNs. Furthermore, we demonstrate that the proposed architecture is data efficient in the sense of matching the accuracy of pixel classification CNNs when trained on a much smaller data set. Comment: EUSIPCO 2017, 5 pages, 2 figures
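
    Stage ii) above, a fixed random layer approximating an RBF kernel, can be sketched with random Fourier features (in the Rahimi-Recht style): a random projection followed by a cosine, with nothing learned. This is a generic illustration of the technique, not the paper's exact layer; all names and parameter values here are ours.

```python
import numpy as np


def random_fourier_features(X, n_features=2048, gamma=1.0, seed=0):
    """Fixed random feature map z(x) whose inner products approximate
    the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2).

    The projection W and phases b are random and frozen -- mirroring
    the pre-specified (non-learned) stage of the architecture; only a
    linear classifier on top of z(x) would be trained.
    """
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    # Spectral sampling for the Gaussian kernel: omega ~ N(0, 2*gamma*I)
    W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
    return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)
```

    With enough random features, z(x) @ z(y) converges to the exact kernel value, which is why a linear classifier on these features behaves like a kernel machine.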

    Practical Full Resolution Learned Lossless Image Compression

    We propose the first practical learned lossless image compression system, L3C, and show that it outperforms the popular engineered codecs PNG, WebP, and JPEG 2000. At the core of our method is a fully parallelizable hierarchical probabilistic model for adaptive entropy coding which is optimized end-to-end for the compression task. In contrast to recent autoregressive discrete probabilistic models such as PixelCNN, our method i) models the image distribution jointly with learned auxiliary representations instead of exclusively modeling the image distribution in RGB space, and ii) only requires three forward passes to predict all pixel probabilities instead of one per pixel. As a result, L3C obtains over two orders of magnitude speedup when sampling compared to the fastest PixelCNN variant (Multiscale-PixelCNN). Furthermore, we find that learning the auxiliary representation is crucial and significantly outperforms predefined auxiliary representations such as an RGB pyramid. Comment: Updated preprocessing and Table 1, see A.1 in supplementary. Code and models: https://github.com/fab-jul/L3C-PyTorch
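
    The connection between a probabilistic model and lossless compression in the abstract above rests on a standard fact: an entropy coder (e.g. arithmetic coding) can store a symbol at essentially -log2 p(symbol) bits under the model's predicted probability. The sketch below shows only that accounting; L3C's contribution is making the predictions cheap to compute for all pixels in three forward passes, which this toy sequential loop does not attempt.

```python
import math


def code_length_bits(symbols, predict):
    """Ideal entropy-coding cost of a symbol stream: the sum of
    -log2 p(symbol | context) under a probabilistic model.

    `predict(context, symbol)` returns the model's probability for
    the next symbol given all previously coded symbols. A better
    model assigns higher probabilities and hence yields fewer bits.
    """
    total_bits = 0.0
    context = []
    for s in symbols:
        p = predict(context, s)
        total_bits += -math.log2(p)
        context.append(s)
    return total_bits
```

    For example, a uniform model over 4 symbols costs exactly 2 bits per symbol, while a model that predicts the actual data with probability 0.5 costs only 1 bit per symbol; the compression gain is entirely in the quality of `predict`.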

    The Possibility of Transfer(?): A Comprehensive Approach to the International Criminal Tribunal for Rwanda’s Rule 11bis To Permit Transfer to Rwandan Domestic Courts

    We present a learned image compression system based on GANs, operating at extremely low bitrates. Our proposed framework combines an encoder, a decoder/generator, and a multi-scale discriminator, which we train jointly for a generative learned compression objective. The model synthesizes details it cannot afford to store, obtaining visually pleasing results at bitrates where previous methods fail and show strong artifacts. Furthermore, if a semantic label map of the original image is available, our method can fully synthesize unimportant regions in the decoded image, such as streets and trees, from the label map, proportionally reducing the storage cost. A user study confirms that for low bitrates, our approach is preferred to state-of-the-art methods, even when they use more than double the bits. Comment: E. Agustsson, M. Tschannen, and F. Mentzer contributed equally to this work. ICCV 2019 camera ready version

    Autophagy: an affair of the heart

    Whether an element of routine housekeeping or in the setting of imminent disaster, it is a good idea to get one’s affairs in order. Autophagy, the process of recycling organelles and protein aggregates, is a basal homeostatic process and an evolutionarily conserved response to starvation and other forms of metabolic stress. Our understanding of the role of autophagy in the heart is changing rapidly as new information becomes available. This review examines the role of autophagy in the heart in the setting of cardioprotection, hypertrophy, and heart failure. Contradictory findings are reconciled in light of recent developments. The preponderance of evidence favors a beneficial role for autophagy in the heart under most conditions.

    Is the corporate elite disintegrating? Interlock boards and the Mizruchi hypothesis

    This paper proposes an approach for comparing interlocked board networks over time to test for statistically significant change. In addition to contributing to the conversation about whether the Mizruchi hypothesis (that a disintegration of power is occurring within the corporate elite) holds or not, we propose novel methods for the longitudinal investigation of a series of social networks whose nodes undergo a few modifications at each time point. Methodologically, our contribution is twofold: we extend a Bayesian model heretofore applied to compare two time periods so that it covers a longer time span, and we define and employ the concept of a hull of a sequence of social networks, which makes it possible to circumvent the problem of nodes changing over time.
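
    One plausible reading of the "hull" device described above is a fixed common node set, the union of all nodes appearing in any snapshot, so that every network in the sequence can be compared on the same vertex set even as boards enter and leave. The sketch below is our interpretation for illustration only; the paper's formal definition may differ.

```python
def node_hull(snapshots):
    """Union of all nodes appearing in any snapshot of the sequence.

    Each snapshot is a (nodes, edges) pair. Interpretation ours: one
    way to read the paper's 'hull' of a sequence of social networks.
    """
    hull = set()
    for nodes, _edges in snapshots:
        hull |= set(nodes)
    return hull


def on_hull(snapshot, hull):
    """Re-express one period's network on the common hull: nodes in
    the hull but absent from this period simply appear as isolates,
    so all snapshots live on the same fixed node set."""
    _nodes, edges = snapshot
    return sorted(hull), set(edges)
```

    With every period expressed on the hull, the sequence of networks becomes a sequence of edge sets over one vertex set, which is the form a longitudinal Bayesian comparison needs.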

    Risk factors for breast cancer in a population with high incidence rates.

    Background: This report examines generally recognized breast cancer risk factors and years of residence in Marin County, California, an area with high breast cancer incidence and mortality rates.
    Methods: Eligible women who were residents of Marin County diagnosed with breast cancer in 1997-99, and women without breast cancer obtained through random-digit dialing, frequency-matched by cases' age at diagnosis and ethnicity, participated in either full in-person or abbreviated telephone interviews.
    Results: In multivariate analyses, 285 cases were statistically significantly more likely than 286 controls to report being premenopausal, never having used birth control pills, a lower highest lifetime body mass index, four or more mammograms in 1990-94, beginning drinking after the age of 21, drinking on average two or more drinks per day, the highest quartile of pack-years of cigarette smoking, and having been raised in an organized religion. Cases and controls did not differ significantly with regard to having a first-degree relative with breast cancer, a history of benign breast biopsy, previous radiation treatment, age at menarche, parity, use of hormone replacement therapy, age when first living in Marin County, or total years lived in Marin County. Results for several factors differed between women aged under 50 years and those 50 years and over.
    Conclusions: Despite similar distributions of several known breast cancer risk factors, case-control differences in alcohol consumption suggest that risk in this high-risk population might be modifiable. Intensive study of this or other areas with similarly high incidence might reveal other important risk factors proximate to diagnosis.