
    Two Measures of Dependence

    Two families of dependence measures between random variables are introduced. They are based on the Rényi divergence of order α and the relative α-entropy, respectively, and both dependence measures reduce to Shannon's mutual information when their order α is one. The first measure shares many properties with the mutual information, including the data-processing inequality, and can be related to the optimal error exponents in composite hypothesis testing. The second measure does not satisfy the data-processing inequality, but appears naturally in the context of distributed task encoding.
    Comment: 40 pages; 1 figure; published in Entropy.
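    For orientation, a standard form of the Rényi divergence of order α that the first measure builds on (generic notation, not necessarily the paper's) is

        D_\alpha(P \| Q) = \frac{1}{\alpha - 1} \log \sum_x P(x)^\alpha \, Q(x)^{1 - \alpha}, \qquad \alpha \in (0, 1) \cup (1, \infty),

    which converges to the Kullback-Leibler divergence as α → 1; applied to a joint law and the product of its marginals, a measure of this kind recovers Shannon's mutual information at α = 1.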

    Codes for Tasks and Rényi Entropy Rate

    A task is randomly drawn from a finite set of tasks and is described using a fixed number of bits. All the tasks that share its description must be performed. Upper and lower bounds on the minimum ρ-th moment of the number of performed tasks are derived. The key is an analog of the Kraft Inequality for partitions of finite sets. When a sequence of tasks is produced by a source of a given Rényi entropy rate of order 1/(1+ρ) and n tasks are jointly described using nR bits, it is shown that for R larger than the Rényi entropy rate, the ρ-th moment of the ratio of performed tasks to n can be driven to one as n tends to infinity, and that for R less than the Rényi entropy rate it tends to infinity. This generalizes a recent result for IID sources by the same authors. A mismatched version of the direct part is also considered, where the code is designed according to the wrong law. The penalty incurred by the mismatch can be expressed in terms of a divergence measure that was shown by Sundaresan to play a similar role in the Massey-Arikan guessing problem.
    Comment: 5 pages, to be presented at ISIT 2014; minor changes in the presentation, added a reference.
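    As a pointer, the Rényi entropy of order α that underlies the rate threshold can be written in standard notation (an illustrative definition, not the paper's exact statement) as

        H_\alpha(X) = \frac{1}{1 - \alpha} \log \sum_x P_X(x)^\alpha,

    with the entropy rate taken as the limit of (1/n) H_α(X_1, ..., X_n); here the relevant order is α = 1/(1+ρ), so larger moments ρ correspond to smaller orders α.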

    On Multipath Fading Channels at High SNR

    This paper studies the capacity of discrete-time multipath fading channels. It is assumed that the number of paths is finite, i.e., that the channel output is influenced by the present and by the L previous channel inputs. A noncoherent channel model is considered where neither transmitter nor receiver is cognizant of the fading's realization, but both are aware of its statistics. The focus is on capacity at high signal-to-noise ratios (SNR). In particular, the capacity pre-loglog, defined as the limiting ratio of the capacity to loglog SNR as SNR tends to infinity, is studied. It is shown that, irrespective of the number of paths L, the capacity pre-loglog is 1.
    Comment: To be presented at the 2008 IEEE International Symposium on Information Theory (ISIT), Toronto, Canada; replaced with version that appears in the proceedings.
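    In symbols, the pre-loglog defined in the abstract is

        \Lambda = \lim_{\mathsf{SNR} \to \infty} \frac{C(\mathsf{SNR})}{\log \log \mathsf{SNR}},

    and the result states that Λ = 1 for every finite number of paths L.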

    On the Listsize Capacity with Feedback

    The listsize capacity of a discrete memoryless channel is the largest transmission rate for which the expectation (or, more generally, the ρ-th moment) of the number of messages that could have produced the output of the channel approaches one as the blocklength tends to infinity. We show that for channels with feedback this rate is upper-bounded by the maximum of Gallager's E₀ function divided by ρ, and that equality holds when the zero-error capacity of the channel is positive. To establish this inequality we prove that feedback does not increase the cutoff rate. Relationships to other notions of channel capacity are explored.
    Comment: 17 pages. Fixed some typos; minor changes in the presentation. Published in the IEEE Transactions on Information Theory. Presented in part at the 2013 IEEE Information Theory Workshop.
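    For context, Gallager's E₀ function for a channel law W and input distribution Q is commonly written (standard form, shown here only for orientation) as

        E_0(\rho, Q) = -\log \sum_y \Bigl( \sum_x Q(x) \, W(y \mid x)^{1/(1+\rho)} \Bigr)^{1+\rho},

    so the stated upper bound reads max_Q E₀(ρ, Q)/ρ; at ρ = 1 this maximum is the cutoff rate, which the paper shows feedback cannot increase.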