Bounds on the Capacity of the Relay Channel with Noncausal State Information at Source
We consider a three-terminal state-dependent relay channel with the channel
state available non-causally at only the source. Such a model may be of
interest for node cooperation in the framework of cognition, i.e.,
collaborative signal transmission involving cognitive and non-cognitive radios.
We study the capacity of this communication model. One principal difficulty in
this setup arises because the relay does not know the channel state. In the
discrete memoryless (DM) case, we establish lower bounds on the channel capacity.
For the Gaussian case, we derive lower and upper bounds on the channel
capacity. The upper bound is strictly better than the cut-set upper bound. We
show that one of the developed lower bounds comes close to the upper bound,
asymptotically, for certain ranges of rates.

Comment: 5 pages, submitted to the 2010 IEEE International Symposium on Information Theory.
The Future of Information Activities in the CGIAR: A System-Wide Strategy
Revised draft of a system-wide information and communication strategy by the Center Directors' Committee on Information, Documentation and Training, chaired by Roberto Lenton. The paper was prepared on the basis of an ISNAR workshop in June 1994. An earlier draft was discussed at TAC 64, and this one at TAC 65 in October 1994. TAC made an interim comment, included in the report of TAC 65. The strategy sets out the information needs of the CGIAR during the 1990s, and the opportunities that new information and communication technologies represent for improving joint action and collaboration among IARCs, and between IARCs and their partners outside the CGIAR System. There is also a memorandum from the CBC Chair, Bonte-Friedheim, transmitting the paper and describing its progress. Agenda document at TAC 64 and 65.
Multiaccess Channels with State Known to One Encoder: Another Case of Degraded Message Sets
We consider a two-user state-dependent multiaccess channel in which only one
of the encoders is informed, non-causally, of the channel states. Two
independent messages are transmitted: a common message transmitted by both the
informed and uninformed encoders, and an individual message transmitted by only
the uninformed encoder. We derive inner and outer bounds on the capacity region
of this model in the discrete memoryless case as well as the Gaussian case.
Further, we show that the bounds for the Gaussian case are tight in some
special cases.

Comment: 5 pages, Proc. of the IEEE International Symposium on Information Theory, ISIT 2009, Seoul, Korea.
Low-density parity-check codes for asymmetric distributed source coding
The research work is partially funded by the Strategic Educational Pathways Scholarship Scheme (STEPS-Malta). This scholarship is partly financed by the European Union - European Social Fund (ESF 1.25).

Low-Density Parity-Check (LDPC) codes achieve good performance, tending towards the Slepian-Wolf bound, when used as channel codes in Distributed Source Coding (DSC). Most LDPC codes found in the literature are designed assuming a random distribution of transmission errors. However, certain DSC applications can predict the error locations with a certain level of accuracy. This feature can be exploited to design application-specific LDPC codes that outperform traditional LDPC codes. This paper proposes a novel architecture for asymmetric DSC in which the encoder estimates the location of the errors within the side information. It then interleaves the bits having a high probability of error to the beginning of the codeword, and the LDPC codes are designed to provide a higher level of protection to these front bits. Simulation results show that correct localization of errors pushes the performance of the system on average 13.3% closer to the Slepian-Wolf bound, compared to randomly constructed LDPC codes. If the error-localization prediction fails, such that the errors are randomly distributed, the performance remains in line with that of the traditional DSC architecture. Peer-reviewed.
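A minimal sketch of the interleaving step described in the abstract (our own illustration under stated assumptions, not the paper's implementation): given predicted per-bit error probabilities for the side information, the most error-prone positions are permuted to the front of the codeword, where the custom LDPC code is assumed to give stronger protection. The function names and the toy data are hypothetical.

```python
def interleave_by_error_probability(bits, error_probs):
    """Return (reordered_bits, permutation): bit positions with the
    highest predicted error probability are moved to the front."""
    order = sorted(range(len(bits)), key=lambda i: error_probs[i], reverse=True)
    return [bits[i] for i in order], order

def deinterleave(reordered_bits, order):
    """Invert the permutation applied by interleave_by_error_probability."""
    bits = [0] * len(reordered_bits)
    for front_pos, original_pos in enumerate(order):
        bits[original_pos] = reordered_bits[front_pos]
    return bits

bits  = [1, 0, 1, 1, 0, 0]
probs = [0.01, 0.30, 0.05, 0.40, 0.02, 0.10]  # predicted error probabilities
shuffled, order = interleave_by_error_probability(bits, probs)
# Positions 3 and 1 (highest predicted error) now lead the codeword.
assert deinterleave(shuffled, order) == bits
```

The decoder only needs the same permutation (derivable from the shared error-probability estimates) to restore the original bit order after LDPC decoding.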
Abstract Model Counting: A Novel Approach for Quantification of Information Leaks
acmid: 2590328; keywords: model checking, quantitative information flow, satisfiability modulo theories, symbolic execution; location: Kyoto, Japan; numpages: 10.

We present a novel method for Quantitative Information Flow (QIF) analysis. We show how the problem of computing information leakage can be viewed as an extension of the Satisfiability Modulo Theories (SMT) problem. This view enables us to develop a framework for QIF analysis based on the DPLL(T) framework used in SMT solvers. We then show that the methodology of Symbolic Execution (SE) also fits our framework. Based on these ideas, we build two QIF analysis tools: the first employs CBMC, a bounded model checker for ANSI C, and the second is built on top of Symbolic PathFinder, a symbolic executor for Java. We use these tools to quantify leaks in industrial code such as C programs from the Linux kernel, a Java tax program from the European project HATS, and anonymity protocols.
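To make the connection between model counting and leakage concrete: for a deterministic program, the channel-capacity measure of leakage is log2 of the number of distinct observable outputs. The toy below enumerates secrets by brute force purely for illustration; the tools described above count outputs symbolically via SMT rather than by enumeration, and the example programs are our own hypothetical ones.

```python
import math

def leakage_bits(program, secrets):
    """Brute-force 'model count': number of distinct outputs the program
    can produce over all secret inputs, converted to bits of leakage."""
    outputs = {program(s) for s in secrets}
    return math.log2(len(outputs))

# Hypothetical password check: two observable outcomes, so exactly 1 bit leaks.
assert leakage_bits(lambda s: s == 42, range(256)) == 1.0

# Copying the secret leaks everything: log2(256) = 8 bits.
assert leakage_bits(lambda s: s, range(256)) == 8.0
```

Replacing the set comprehension with a symbolic output count (e.g. repeatedly asking a solver for a model and blocking it) yields the same quantity without enumerating the secret space.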
A web of stakeholders and strategies: A case of broadband diffusion in South Korea
When a new technology is launched, its diffusion becomes an issue of importance. Various stakeholders influence diffusion; which stakeholders these are, and what roles they play, remains to be determined. This paper outlines how the strategies pursued by a government acting as the key stakeholder affected the diffusion of a new technology. The analysis is based on a theoretical framework derived from innovation diffusion and stakeholder theories. The empirical evidence comes from a study of broadband development in South Korea. A web of stakeholders and strategies is drawn in order to identify the major stakeholders involved and highlight their relations. The case of South Korea offers implications for other countries that are pursuing broadband diffusion strategies.
Integrating information to bootstrap information extraction from web sites
In this paper we propose a methodology to learn to extract domain-specific information from large repositories (e.g. the Web) with minimum user intervention. Learning is seeded by integrating information from structured sources (e.g. databases and digital libraries). Retrieved information is then used to bootstrap learning for simple Information Extraction (IE) methodologies, which in turn produce more annotation to train more complex IE engines. All the corpora for training the IE engines are produced automatically by integrating information from different sources, such as available corpora and services (e.g. databases or digital libraries). User intervention is limited to providing an initial URL and adding information missed by the different modules when the computation has finished. The information added or deleted by the user can then be reused to provide further training, thereby improving recall and/or precision. We are currently applying this methodology to mining web sites of Computer Science departments. Peer-reviewed.
Environmental issues and the geological storage of CO2 : a discussion document
Increasing CO2 emissions will lead to climate change and ocean acidification, with severe consequences for ecosystems and for human society. Strategies are being sought to reduce emissions, including the geological storage of CO2. Existing storage projects operate within established oil and gas regulatory frameworks, but if other geological formations (beyond oil and gas reservoirs) are used, these regulations may not apply. At UK and European levels, the potential environmental impacts of uncontrolled CO2 releases from storage sites have been highlighted as significant for regulators, so a new regulatory framework may be needed. The precautionary principle is likely to be adopted by regulators, making it important that the effects of acute and chronic exposures of ecosystems to CO2 leakages are evaluated. Consequently, existing regulations are likely to be developed to include specific recommendations concerning leakages. This review shows that much of the basic data needed to assist regulators in this process simply does not exist.