
    Freshness-Aware Thompson Sampling

    To track the evolving content that users consume, researchers have recently modelled the interaction between users and Context-Aware Recommender Systems (CARS) as a bandit problem, in which the system must balance exploration against exploitation. In this vein, we propose to study the freshness of the user's content in CARS through the bandit problem. We introduce in this paper an algorithm named Freshness-Aware Thompson Sampling (FA-TS), which manages the recommendation of fresh documents according to the risk level of the user's situation. Intensive evaluation and detailed analysis of the experimental results reveal several important findings about the exploration/exploitation (exr/exp) behaviour.
    Comment: 21st International Conference on Neural Information Processing. arXiv admin note: text overlap with arXiv:1409.772
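    The abstract does not give the FA-TS algorithm itself, but the plain Beta-Bernoulli Thompson sampling it builds on can be sketched as follows; the arm probabilities and round count here are illustrative assumptions, not values from the paper:

```python
import random

def thompson_sampling(arms, rounds=1000, seed=0):
    """Beta-Bernoulli Thompson sampling.

    `arms` holds the (unknown to the learner) success probability of each
    arm; each round we sample from every arm's Beta posterior and play the
    arm with the largest draw, then update its posterior with the reward.
    """
    rng = random.Random(seed)
    successes = [1] * len(arms)  # Beta(1, 1) uniform priors
    failures = [1] * len(arms)
    total_reward = 0
    for _ in range(rounds):
        draws = [rng.betavariate(successes[i], failures[i])
                 for i in range(len(arms))]
        i = max(range(len(arms)), key=lambda k: draws[k])
        reward = 1 if rng.random() < arms[i] else 0  # Bernoulli feedback
        total_reward += reward
        successes[i] += reward
        failures[i] += 1 - reward
    return successes, failures, total_reward
```

    Because the posterior of the better arm concentrates, exploration decays naturally over time; FA-TS, per the abstract, additionally modulates this trade-off by the freshness of documents and the risk of the user's situation.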

    Status Quo Analysis of the Flathead River Conflict

    Status quo analysis algorithms developed within the paradigm of the graph model for conflict resolution are applied to an international river basin conflict involving the United States and Canada to assess the likelihood of various compromise resolutions. The conflict arose because the state of Montana feared that further expansion of the Sage Creek Coal Company facilities in Canada would pollute the Flathead River, which flows from British Columbia into Montana. Significant insights not generally available from a static stability analysis are obtained about potential resolutions of the conflict under study and about how decision makers’ interactions may direct the conflict to distinct resolutions. The analyses also show how political considerations may affect a particular decision maker’s choice, thereby influencing the evolution of the conflict.

    Universality of Long-Range Correlations in Expansion-Randomization Systems

    We study the stochastic dynamics of sequences evolving by single site mutations, segmental duplications, deletions, and random insertions. These processes are relevant for the evolution of genomic DNA. They define a universality class of non-equilibrium 1D expansion-randomization systems with generic stationary long-range correlations in a regime of growing sequence length. We obtain explicitly the two-point correlation function of the sequence composition and the distribution function of the composition bias in sequences of finite length. The characteristic exponent χ of these quantities is determined by the ratio of two effective rates, which are explicitly calculated for several specific sequence evolution dynamics of the universality class. Depending on the value of χ, we find two different scaling regimes, which are distinguished by the detectability of the initial composition bias. All analytic results are accurately verified by numerical simulations. We also discuss the non-stationary build-up and decay of correlations, as well as more complex evolutionary scenarios, where the rates of the processes vary in time. Our findings provide a possible example for the emergence of universality in molecular biology.
    Comment: 23 pages, 15 figures
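    A toy version of such expansion-randomization dynamics, and the two-point correlation function of the sequence composition it mentions, can be sketched in a few lines; the mutation rate and duplication length below are arbitrary illustrative choices, not the rates analysed in the paper:

```python
import random

def evolve(seq, steps, mu, dup_len, rng):
    """Toy expansion-randomization dynamics on a +1/-1 sequence:
    with probability mu flip a random site (single-site mutation),
    otherwise duplicate a random segment in place (segmental duplication)."""
    for _ in range(steps):
        i = rng.randrange(len(seq))
        if rng.random() < mu:
            seq[i] = -seq[i]
        else:
            seq[i:i] = seq[i:i + dup_len]  # tandem duplication grows the sequence
    return seq

def correlation(seq, r):
    """Two-point composition correlation C(r) = <s_i s_{i+r}>."""
    n = len(seq) - r
    return sum(seq[i] * seq[i + r] for i in range(n)) / n
```

    Duplication copies composition locally while mutation randomizes it, and the competition between the two effective rates is what sets the exponent χ in the paper's analysis.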

    Implantable RF-coiled chip packaging

    In this paper, we present an embedded chip integration technology that utilizes silicon housings and flexible parylene radio frequency (RF) coils. As a demonstration of this technology, a flexible parylene RF coil has been integrated with an RF identification (RFID) chip. The coil has an inductance of 16 μH, with two layers of metal completely encapsulated in parylene-C. The functionality of the embedded chip is verified using an RFID reader module. Accelerated-lifetime soak testing has been performed in saline, and the results show that the silicon chip is well protected and that the lifetime of our parylene-encapsulated RF coil at 37 °C is more than 20 years.

    Baryon enhancement in high-density QCD and relativistic heavy ion collisions

    We argue that the collinear factorization of the fragmentation functions in high energy nuclear collisions breaks down at transverse momenta p_T ≲ Q_s/g due to high parton densities in the colliding hadrons and/or nuclei. We find that gluon recombination dominates in that p_T region. We calculate the inclusive cross-section for π meson and nucleon production using the low energy theorems for the scale anomaly in QCD, and compare our quantitative baryon-to-meson ratio to the RHIC data.
    Comment: 4 pages, 2 figures; Contribution to Quark Matter 2008 in Jaipur, India; submitted to J. Phys.

    The Nullity of Bicyclic Signed Graphs

    Let Γ be a signed graph and let A(Γ) be the adjacency matrix of Γ. The nullity of Γ is the multiplicity of the eigenvalue zero in the spectrum of A(Γ). In this paper we characterize the signed graphs of order n with nullity n-2 or n-3, and introduce a graph transformation that preserves the nullity. As an application we determine the unbalanced bicyclic signed graphs of order n with nullity n-3 or n-4, and the bicyclic signed graphs (including simple bicyclic graphs) of order n with nullity n-5.
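    The nullity defined above is straightforward to compute numerically for a concrete signed graph: since the adjacency matrix is real symmetric, count near-zero eigenvalues of its spectrum. The tolerance below is an assumption for float arithmetic, not part of the paper:

```python
import numpy as np

def nullity(signed_adj, tol=1e-9):
    """Multiplicity of eigenvalue 0 of a signed graph's adjacency matrix,
    where entries are +1 (positive edge), -1 (negative edge), 0 (no edge)."""
    A = np.asarray(signed_adj, dtype=float)
    eigs = np.linalg.eigvalsh(A)  # symmetric matrix -> real eigenvalues
    return int(np.sum(np.abs(eigs) < tol))
```

    For example, the path on three vertices has spectrum {−√2, 0, √2} and hence nullity 1, regardless of the edge signs (the two signings are switching-equivalent and cospectral).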