
    Bayesian comparison of latent variable models: Conditional vs marginal likelihoods

    Typical Bayesian methods for models with latent variables (or random effects) involve directly sampling the latent variables along with the model parameters. In high-level software code for model definitions (using, e.g., BUGS, JAGS, Stan), the likelihood is therefore specified as conditional on the latent variables. This can lead researchers to perform model comparisons via conditional likelihoods, where the latent variables are considered model parameters. In other settings, however, typical model comparisons involve marginal likelihoods, where the latent variables are integrated out. This distinction is often overlooked even though it can have a large impact on the comparisons of interest. In this paper, we clarify and illustrate these issues, focusing on the comparison of conditional and marginal Deviance Information Criteria (DICs) and Watanabe-Akaike Information Criteria (WAICs) in psychometric modeling. The conditional/marginal distinction corresponds to whether the model should be predictive for the clusters that are in the data or for new clusters (where "clusters" typically correspond to higher-level units like people or schools). Correspondingly, we show that marginal WAIC corresponds to leave-one-cluster-out (LOcO) cross-validation, whereas conditional WAIC corresponds to leave-one-unit-out (LOuO). These results lead to recommendations on the general application of the criteria to models with latent variables. Comment: Manuscript in press at Psychometrika; 31 pages, 8 figures.
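
    To make the conditional/marginal distinction concrete, the following is a minimal numerical sketch (not the authors' code) for a Gaussian random-intercept model: conditional WAIC scores pointwise likelihoods given the sampled latent effects (units as the prediction target), while marginal WAIC integrates the latent effects out per cluster by Monte Carlo (clusters as the target). The model, the stand-in posterior draws, and all names are illustrative assumptions.

        import numpy as np
        from scipy.stats import norm
        from scipy.special import logsumexp

        rng = np.random.default_rng(0)

        # Toy data: J clusters (e.g., persons), n observations per cluster.
        J, n = 10, 20
        true_u = rng.normal(0.0, 1.0, size=J)
        y = true_u[:, None] + rng.normal(0.0, 0.5, size=(J, n))

        # Stand-in posterior draws (in practice these come from BUGS/JAGS/Stan).
        S = 200
        mu = rng.normal(0.0, 0.1, size=S)                # grand mean
        sigma_u = np.abs(rng.normal(1.0, 0.1, size=S))   # cluster SD
        sigma_e = np.abs(rng.normal(0.5, 0.05, size=S))  # residual SD
        u = rng.normal(0.0, 1.0, size=(S, J)) * sigma_u[:, None]  # sampled latent effects

        def waic(log_lik):
            """log_lik: S x N matrix of pointwise log-likelihoods over S posterior draws."""
            lppd = np.sum(logsumexp(log_lik, axis=0) - np.log(log_lik.shape[0]))
            p_waic = np.sum(np.var(log_lik, axis=0, ddof=1))
            return -2.0 * (lppd - p_waic)

        # Conditional: latent effects treated as parameters; points are units (LOuO flavour).
        cond_ll = np.stack([
            norm.logpdf(y, loc=mu[s] + u[s][:, None], scale=sigma_e[s]).ravel()
            for s in range(S)
        ])

        # Marginal: integrate the latent effects out per cluster by Monte Carlo (LOcO flavour).
        M = 200
        marg_ll = np.empty((S, J))
        for s in range(S):
            u_sim = rng.normal(0.0, sigma_u[s], size=(M, 1, 1))
            per_cluster = norm.logpdf(y[None, :, :], loc=mu[s] + u_sim,
                                      scale=sigma_e[s]).sum(axis=2)  # M x J
            marg_ll[s] = logsumexp(per_cluster, axis=0) - np.log(M)

        print("conditional WAIC:", waic(cond_ll))
        print("marginal WAIC:  ", waic(marg_ll))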

    Hash-based signatures for the internet of things

    While numerous digital signature schemes exist in the literature, most real-world systems rely on RSA-based signature schemes or on the digital signature algorithm (DSA), including its elliptic curve cryptography variant ECDSA. In this position paper we review a family of alternative signature schemes, based on hash functions, and we make the case for their application in Internet of Things (IoT) settings. Hash-based signatures provide post-quantum security and make only minimal security assumptions, in general requiring only a secure cryptographic hash function. This makes them extremely flexible, as they can be implemented on top of any hash function that satisfies basic security properties. Hash-based signatures also feature numerous parameters, defining aspects such as signing speed and key size, that enable trade-offs in constrained environments. Simplicity of implementation and customization make hash-based signatures an attractive candidate for the IoT ecosystem, which is composed of a number of diverse, constrained devices.
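
    As a concrete illustration of the "only a secure hash function" point, below is a minimal Lamport one-time signature over SHA-256. It is a sketch for illustration only, not one of the standardized stateful schemes (e.g., XMSS or LMS) that an IoT deployment would actually use, and each key pair may sign only a single message.

        import hashlib
        import os

        H = lambda b: hashlib.sha256(b).digest()
        N_BITS = 256  # we sign 256-bit message digests

        def keygen():
            # Two random 32-byte preimages per digest bit; public key is their hashes.
            sk = [[os.urandom(32), os.urandom(32)] for _ in range(N_BITS)]
            pk = [[H(s0), H(s1)] for s0, s1 in sk]
            return sk, pk

        def bits(digest):
            return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(N_BITS)]

        def sign(sk, message: bytes):
            d = H(message)
            return [sk[i][b] for i, b in enumerate(bits(d))]   # reveal one preimage per bit

        def verify(pk, message: bytes, sig):
            d = H(message)
            return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits(d)))

        sk, pk = keygen()   # NOTE: a Lamport key pair must sign only ONE message
        sig = sign(sk, b"sensor reading: 23.5C")
        print(verify(pk, b"sensor reading: 23.5C", sig))   # True
        print(verify(pk, b"tampered reading", sig))        # False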

    Energy benefits and emergent space use patterns of an empirically parameterized model of memory-based patch selection

    Many species frequently return to previously visited foraging sites. This bias towards familiar areas suggests that remembering information from past experience is beneficial. Such a memory-based foraging strategy has also been hypothesized to give rise to restricted space use (i.e. a home range). Nonetheless, the benefits of empirically derived memory-based foraging tactics and the extent to which they give rise to restricted space use patterns are still relatively unknown. Using a combination of stochastic agent-based simulations and deterministic integro-difference equations, we developed an adaptive link (based on energy gains as a foraging currency) between memory-based patch selection and its resulting spatial distribution. We used a memory-based foraging model developed and parameterized with patch selection data of free-ranging bison Bison bison in Prince Albert National Park, Canada. Relative to random use of food patches, simulated foragers using both spatial and attribute memory are more efficient, particularly in landscapes with clumped resources. However, a certain amount of random patch use is necessary to avoid frequent returns to relatively poor-quality patches, or to avoid being caught in a relatively poor-quality area of the landscape. Notably, in landscapes with clumped resources, simulated foragers that kept a reference point of the quality of recently visited patches, and returned to previously visited patches when local patch quality was poorer than the reference point, experienced higher energy gains compared to random patch use. Furthermore, the model of memory-based foraging resulted in restricted space use in simulated landscapes and replicated the restricted space use observed in free-ranging bison reasonably well. Our work demonstrates the adaptive value of spatial and attribute memory in heterogeneous landscapes, and how home ranges can be a byproduct of non-omniscient foragers using past experience to minimize temporal variation in energy gains.
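
    The patch-selection logic can be sketched with a toy agent-based simulation (not the empirically parameterized bison model): a forager remembers the quality of patches it has sampled and, with probability 1 - epsilon, returns to the best remembered patch, otherwise it explores at random. The landscape and all parameters are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate(n_patches=50, steps=500, epsilon=0.2, use_memory=True):
            quality = rng.gamma(shape=2.0, scale=1.0, size=n_patches)  # static patch qualities
            memory = {}      # patch index -> last observed intake ("attribute memory")
            gains = 0.0
            for _ in range(steps):
                if use_memory and memory and rng.random() > epsilon:
                    patch = max(memory, key=memory.get)   # return to best remembered patch
                else:
                    patch = rng.integers(n_patches)       # random exploration
                intake = rng.normal(quality[patch], 0.2)  # noisy energy gain
                gains += intake
                memory[patch] = intake                    # update the reference for this patch
            return gains / steps

        print("memory-based mean gain:", simulate(use_memory=True))
        print("random-use mean gain:  ", simulate(use_memory=False))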

    PDFS: Practical Data Feed Service for Smart Contracts

    Smart contracts are a new paradigm that emerged with the rise of blockchain technology. They allow mutually distrusting parties to arrange agreements. These agreements are encoded as program code and deployed on a blockchain platform, where all participants execute them and maintain their state. Smart contracts are promising since they are automated and decentralized, thus limiting the involvement of trusted third parties, and can contain monetary transfers. Due to these features, many people believe that smart contracts will revolutionize the way we think of distributed applications, information sharing, financial services, and infrastructures. To release the potential of smart contracts, it is necessary to connect the contracts with the outside world, such that they can understand and use information from other infrastructures. For instance, smart contracts would benefit greatly from access to web content. However, there are many challenges associated with realizing such a system, and despite the existence of many proposals, no solution is simultaneously secure, provides easily parsable data, introduces small overheads, and is easy to deploy. In this paper we propose PDFS, a practical system for data feeds that combines the advantages of the previous schemes and introduces new functionalities. PDFS extends content providers by including new features for data transparency and consistency validations. This combination provides multiple benefits, such as content that is easy to parse and efficient authenticity verification, without breaking natural trust chains. PDFS keeps content providers auditable, mitigates their malicious activities (like data modification or censorship), and allows them to create a new business model. We show how PDFS is integrated with existing web services, report on a PDFS implementation, and present results from conducted case studies and experiments. Comment: Blockchain; Smart Contracts; Data Authentication; Ethereum.
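
    A hedged sketch of the basic building block such a data feed relies on, not the PDFS protocol itself: a content provider signs a feed item off-chain and a verifier (standing in for the contract-side check) validates it against the provider's registered public key. The feed fields, keys, and URL are made up for illustration.

        import json
        import time
        from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
        from cryptography.exceptions import InvalidSignature

        # Content provider side
        provider_key = Ed25519PrivateKey.generate()
        registered_pubkey = provider_key.public_key()   # assumed to be registered on-chain

        feed_item = {"url": "https://example.com/price/ETHUSD",
                     "timestamp": int(time.time()),
                     "payload": {"price": "1234.56"}}
        message = json.dumps(feed_item, sort_keys=True).encode()
        signature = provider_key.sign(message)

        # Verifier side (stand-in for the on-chain consistency/authenticity check)
        def verify_feed(pubkey, item, sig):
            try:
                pubkey.verify(sig, json.dumps(item, sort_keys=True).encode())
                return True
            except InvalidSignature:
                return False

        print(verify_feed(registered_pubkey, feed_item, signature))   # True
        feed_item["payload"]["price"] = "0.01"                        # tampering
        print(verify_feed(registered_pubkey, feed_item, signature))   # False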

    Keyword-Based Delegable Proofs of Storage

    Cloud users (clients) with limited storage capacity at their end can outsource bulk data to the cloud storage server. A client can later access her data by downloading the required data files. However, a large fraction of the data files the client outsources to the server is often archival in nature: the client keeps these files for backup purposes and accesses them less frequently. An untrusted server can thus delete some of these archival data files in order to save space (and allocate it to other clients) without being detected by the client (data owner). Proofs of storage enable the client to audit her data files uploaded to the server in order to ensure the integrity of those files. In this work, we introduce one type of (selective) proofs of storage that we call keyword-based delegable proofs of storage, where the client wants to audit all her data files containing a specific keyword (e.g., "important"). Moreover, our scheme satisfies the notion of public verifiability, where the client can delegate the auditing task to a third-party auditor who audits the set of files corresponding to the keyword on behalf of the client. We formally define the security of a keyword-based delegable proof-of-storage protocol. We construct such a protocol based on an existing proof-of-storage scheme and analyze the security of our protocol. We argue that the techniques we use can be applied atop any existing publicly verifiable proof-of-storage scheme for static data. Finally, we discuss the efficiency of our construction. Comment: A preliminary version of this work has been published at the International Conference on Information Security Practice and Experience (ISPEC 2018).
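
    The keyword-scoped auditing idea can be illustrated with a toy spot-check (this is not the publicly verifiable proof-of-storage construction of the paper): the client keeps a keyword-to-file index and per-block digests, and a delegated auditor challenges the server on random blocks of every file tagged with the requested keyword. File names, keywords, and the block size are invented.

        import hashlib
        import os
        import random

        BLOCK = 1024

        def digest(block: bytes) -> bytes:
            return hashlib.sha256(block).digest()

        # Setup (client side): outsource files, keep only small metadata locally.
        files = {"report.txt": (b"important " * 500, {"important", "backup"}),
                 "photo.raw": (os.urandom(4096), {"backup"})}

        server_store = {name: data for name, (data, _) in files.items()}  # untrusted server
        keyword_index = {}                                                # kept by client/auditor
        block_digests = {}
        for name, (data, keywords) in files.items():
            blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
            block_digests[name] = [digest(b) for b in blocks]
            for kw in keywords:
                keyword_index.setdefault(kw, set()).add(name)

        # Audit (delegable): given a keyword, check random blocks of matching files.
        def audit(keyword, challenges_per_file=2):
            for name in keyword_index.get(keyword, ()):
                n_blocks = len(block_digests[name])
                for i in random.sample(range(n_blocks), min(challenges_per_file, n_blocks)):
                    served = server_store[name][i * BLOCK:(i + 1) * BLOCK]  # server's response
                    if digest(served) != block_digests[name][i]:
                        return False
            return True

        print(audit("important"))            # True while the server stores the data intact
        server_store["report.txt"] = b""     # server deletes an archival file
        print(audit("important"))            # False: the deletion is detected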

    Unifying Parsimonious Tree Reconciliation

    Evolution is a process that is influenced by various environmental factors, e.g., the interactions between different species, genes, and biogeographical properties. Hence, it is interesting to study the combined evolutionary history of multiple species, their genes, and the environment they live in. A common approach to address this research problem is to describe each individual evolution as a phylogenetic tree and construct a tree reconciliation which is parsimonious with respect to a given event model. Unfortunately, most of the previous approaches are designed only for host-parasite systems, for gene tree/species tree reconciliation, or for biogeography. Hence, a method is desirable that addresses the general problem of mapping phylogenetic trees and covers all varieties of coevolving systems, including, e.g., predator-prey and symbiotic relationships. To close this gap, we introduce a generalized cophylogenetic event model considering the combinatorially complete set of local coevolutionary events. We give a dynamic programming based heuristic for solving the maximum parsimony reconciliation problem in time O(n^2), for two phylogenies each with at most n leaves. Furthermore, we present an exact branch-and-bound algorithm which uses the results from the dynamic programming heuristic for discarding partial reconciliations. The approach has been implemented as a Java application which is freely available from http://pacosy.informatik.uni-leipzig.de/coresym. Comment: Peer-reviewed and presented as part of the 13th Workshop on Algorithms in Bioinformatics (WABI 2013).
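
    For orientation, the sketch below implements the classical LCA-based gene tree/species tree reconciliation (duplication counting), i.e., the special case that a generalized cophylogenetic event model extends; it is not the paper's algorithm, and the trees are invented.

        class Node:
            def __init__(self, name, children=()):
                self.name, self.children = name, list(children)

        def postorder(t):
            for c in t.children:
                yield from postorder(c)
            yield t

        def depths(root, d=0, out=None):
            out = {} if out is None else out
            out[root] = d
            for c in root.children:
                depths(c, d + 1, out)
            return out

        def parents(root, out=None):
            out = {} if out is None else out
            for c in root.children:
                out[c] = root
                parents(c, out)
            return out

        def lca(a, b, depth, parent):
            while a is not b:
                if depth[a] >= depth[b]:
                    a = parent[a]
                else:
                    b = parent[b]
            return a

        # Species tree ((A,B),C) and a gene tree with one duplication above A/B.
        A, B, C = Node("A"), Node("B"), Node("C")
        S = Node("S_root", [Node("S_AB", [A, B]), C])
        gA1, gA2, gB, gC = Node("A"), Node("A"), Node("B"), Node("C")
        G = Node("g_root", [Node("g_dup", [Node("g1", [gA1, gB]), gA2]), gC])

        depth, parent = depths(S), parents(S)
        species_by_name = {n.name: n for n in postorder(S)}

        mapping, duplications = {}, 0
        for g in postorder(G):
            if not g.children:
                mapping[g] = species_by_name[g.name]          # leaf: map by name
            else:
                m = mapping[g.children[0]]
                for c in g.children[1:]:
                    m = lca(m, mapping[c], depth, parent)
                mapping[g] = m
                if any(mapping[c] is m for c in g.children):  # LCA equals a child's image
                    duplications += 1

        print("duplication events:", duplications)   # expect 1 (at node g_dup)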

    Human phosphodiesterase 4D7 (PDE4D7) expression is increased in TMPRSS2-ERG positive primary prostate cancer and independently adds to a reduced risk of post-surgical disease progression

    Background: There is an acute need to uncover biomarkers that reflect the molecular pathologies underpinning prostate cancer progression and poor patient outcome. We have previously demonstrated that in prostate cancer cell lines PDE4D7 is downregulated in advanced cases of the disease. To investigate further the prognostic power of PDE4D7 expression during prostate cancer progression and assess how downregulation of this PDE isoform may affect disease outcome, we have examined PDE4D7 expression in physiologically relevant primary human samples. Methods: In total, 1405 patient samples across 8 publicly available qPCR, Affymetrix Exon 1.0 ST array, and RNA sequencing data sets were screened for PDE4D7 expression. The TMPRSS2-ERG gene rearrangement status of patient samples was determined by transformation of the exon array and RNA-seq expression data to robust z-scores, followed by the application of a threshold >3 to define a positive TMPRSS2-ERG gene fusion event in a tumour sample. Results: We demonstrate that PDE4D7 expression positively correlates with primary tumour development. We also show a positive association with the highly prostate cancer-specific gene rearrangement between TMPRSS2 and the ETS transcription factor family member ERG. In addition, we find that in primary TMPRSS2-ERG-positive tumours PDE4D7 expression is significantly positively correlated with low-grade disease and a reduced likelihood of progression after primary treatment. Conversely, PDE4D7 transcript levels are significantly decreased in castration-resistant prostate cancer (CRPC). Conclusions: We further characterise and add physiological relevance to PDE4D7 as a novel marker that is associated with the development and progression of prostate tumours. We propose that the assessment of PDE4D7 levels may provide a novel, independent predictor of post-surgical disease progression.
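
    A minimal sketch of the robust z-score thresholding step described in the methods: ERG expression values are converted to robust z-scores, and samples with z > 3 are called TMPRSS2-ERG fusion positive. The median/MAD definition of the robust z-score and the example expression values are our assumptions, not taken from the paper.

        import numpy as np

        def robust_z(x):
            x = np.asarray(x, dtype=float)
            med = np.median(x)
            mad = np.median(np.abs(x - med))
            return (x - med) / (1.4826 * mad)   # 1.4826 makes MAD consistent with the SD

        # Hypothetical log2 ERG expression across tumour samples
        erg_expression = np.array([5.1, 4.8, 5.3, 5.0, 9.7, 4.9, 10.2, 5.2])
        z = robust_z(erg_expression)
        fusion_positive = z > 3

        for value, zi, call in zip(erg_expression, z, fusion_positive):
            print(f"ERG={value:5.1f}  z={zi:5.2f}  TMPRSS2-ERG positive: {bool(call)}")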

    Chosen-ciphertext security from subset sum

    We construct a public-key encryption (PKE) scheme whose security is polynomial-time equivalent to the hardness of the Subset Sum problem. Our scheme achieves the standard notion of indistinguishability against chosen-ciphertext attacks (IND-CCA) and can be used to encrypt messages of arbitrary polynomial length, improving upon a previous construction by Lyubashevsky, Palacio, and Segev (TCC 2010) which achieved only the weaker notion of semantic security (IND-CPA) and whose concrete security decreases with the length of the message being encrypted. At the core of our construction is a trapdoor technique which originates in the work of Micciancio and Peikert (Eurocrypt 2012).
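
    This is not the IND-CCA scheme itself, only a small illustration of the modular Subset Sum problem on which its security rests: an instance with a planted solution is trivial to generate, while recovering the solution by brute force scales exponentially in the dimension. Parameters are toy-sized.

        from itertools import combinations
        import secrets

        n, bits = 16, 64                  # toy parameters; real instances are far larger
        M = 2 ** bits
        a = [secrets.randbelow(M) for _ in range(n)]
        secret = [secrets.randbelow(2) for _ in range(n)]       # planted 0/1 solution
        T = sum(ai for ai, si in zip(a, secret) if si) % M

        def brute_force(a, T, M):
            # Exhaustive 2^n search; any subset matching the target sum is accepted.
            for k in range(len(a) + 1):
                for idx in combinations(range(len(a)), k):
                    if sum(a[i] for i in idx) % M == T:
                        return [1 if i in idx else 0 for i in range(len(a))]
            return None

        print("planted:  ", secret)
        print("recovered:", brute_force(a, T, M))   # feasible only because n is tiny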

    A Particle-based Multiscale Solver for Compressible Liquid-Vapor Flow

    To describe complex flow systems accurately, it is in many cases important to account for the properties of fluid flows on a microscopic scale. In this work, we focus on the description of liquid-vapor flow with a sharp interface between the phases. The local phase dynamics at the interface can be interpreted as a Riemann problem, for which we develop a multiscale solver in the spirit of the heterogeneous multiscale method, using a particle-based microscale model to augment the macroscopic two-phase flow system. The application of a microscale model makes it possible to use the intrinsic properties of the fluid at the microscale, instead of formulating (ad hoc) constitutive relations.
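
    A schematic sketch of the heterogeneous-multiscale coupling pattern described above, not the authors' solver: a 1D finite-volume update uses a standard upwind flux in the bulk, while the flux at the detected interface is delegated to a microscale routine that, in the full method, would run a particle simulation of the local Riemann problem. Here that routine is a stub, and Burgers' equation stands in for the two-phase system.

        import numpy as np

        def macro_flux(u):
            return 0.5 * u ** 2          # bulk flux (Burgers' equation as a stand-in)

        def micro_interface_flux(u_left, u_right):
            # Placeholder for the particle-based microscale Riemann solver: the real
            # method would simulate the two adjacent states with a particle model and
            # return a time-averaged interfacial flux. Here we simply average.
            return 0.5 * (macro_flux(u_left) + macro_flux(u_right))

        nx, dx, dt, steps = 100, 1.0 / 100, 0.004, 100
        u = np.where(np.arange(nx) < nx // 2, 1.0, 0.2)   # "liquid" left, "vapor" right

        for _ in range(steps):
            flux = macro_flux(u[:-1])                      # upwind flux at interior faces
            i_face = np.argmax(np.abs(np.diff(u)))         # crude sharp-interface detector
            flux[i_face] = micro_interface_flux(u[i_face], u[i_face + 1])
            u[1:-1] -= dt / dx * (flux[1:] - flux[:-1])    # finite-volume update

        print("interface located near cell", np.argmax(np.abs(np.diff(u))))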