
    Balanced Allocations: A Simple Proof for the Heavily Loaded Case

    We provide a relatively simple proof that the expected gap between the maximum load and the average load in the two-choice process is bounded by $(1+o(1))\log\log n$, irrespective of the number of balls thrown. The theorem was first proven by Berenbrink et al. Their proof uses heavy machinery from Markov-chain theory, and some of the calculations are done using computers. In this manuscript we provide a significantly simpler proof that is not aided by computers and is self-contained. The simplification comes at the cost of weaker bounds on the low-order terms and a weaker tail bound for the probability of deviating from the expectation.
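
    The two-choice process described in this abstract is easy to simulate; the sketch below is illustrative (the function name and parameters are mine, not from the paper) and shows the gap between the maximum and average load staying small even when the number of balls far exceeds the number of bins.

```python
import random

def two_choice_max_gap(n_bins, n_balls, seed=0):
    """Throw n_balls into n_bins; each ball samples two bins uniformly at
    random and is placed in the less loaded of the two. Returns the gap
    between the maximum load and the average load."""
    rng = random.Random(seed)
    loads = [0] * n_bins
    for _ in range(n_balls):
        i, j = rng.randrange(n_bins), rng.randrange(n_bins)
        # place the ball in the less loaded of the two sampled bins
        loads[i if loads[i] <= loads[j] else j] += 1
    return max(loads) - n_balls / n_bins
```

    Even in the heavily loaded case (e.g. 10,000 balls into 100 bins), the gap stays a small constant, in line with the $(1+o(1))\log\log n$ bound.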

    Online Makespan Minimization with Parallel Schedules

    In online makespan minimization a sequence of jobs $\sigma = J_1, \ldots, J_n$ has to be scheduled on $m$ identical parallel machines so as to minimize the maximum completion time of any job. We investigate the problem with an essentially new model of resource augmentation. Here, an online algorithm is allowed to build several schedules in parallel while processing $\sigma$. At the end of the scheduling process the best schedule is selected. This model can be viewed as providing an online algorithm with extra space, which is invested to maintain multiple solutions. The setting is of particular interest in parallel processing environments where each processor can maintain a single solution or a small set of solutions. We develop a $(4/3+\epsilon)$-competitive algorithm, for any $0<\epsilon\leq 1$, that uses $(1/\epsilon)^{O(\log (1/\epsilon))}$ schedules. We also give a $(1+\epsilon)$-competitive algorithm, for any $0<\epsilon\leq 1$, that builds a polynomial number, $(m/\epsilon)^{O(\log (1/\epsilon)/\epsilon)}$, of schedules. This value depends on $m$ but is independent of the input $\sigma$. The performance guarantees are nearly best possible. We show that any algorithm that achieves a competitiveness smaller than $4/3$ must construct $\Omega(m)$ schedules. Our algorithms make use of novel guessing schemes that (1) predict the optimum makespan of a job sequence $\sigma$ to within a factor of $1+\epsilon$ and (2) guess the job processing times and their frequencies in $\sigma$. In (2) we have to sparsify the universe of all guesses so as to reduce the number of schedules to a constant. The competitive ratios achieved using parallel schedules are considerably smaller than those in the standard problem without resource augmentation.
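
    The "guess the optimum makespan" idea in scheme (1) can be sketched offline in a few lines. This is only an illustration of the guess-and-test pattern, not the authors' online algorithm: a greedy list scheduler is run against geometrically spaced guesses of OPT and rejects a guess whose load bound is violated (all function names and parameters are mine).

```python
def greedy_with_guess(jobs, m, guess):
    """List-schedule jobs on m machines, but reject (return None) if some
    machine would exceed 2*guess. When guess >= OPT, greedy always keeps
    loads <= OPT + max job <= 2*guess, so a correct guess is never rejected."""
    loads = [0.0] * m
    for p in jobs:
        k = loads.index(min(loads))
        if loads[k] + p > 2 * guess:
            return None
        loads[k] += p
    return max(loads)

def guess_and_schedule(jobs, m, eps=0.5):
    """Try geometrically spaced guesses of OPT, starting from a standard
    lower bound, and return the first makespan that succeeds."""
    guess = max(max(jobs), sum(jobs) / m)  # standard lower bounds on OPT
    best = None
    while best is None:
        best = greedy_with_guess(jobs, m, guess)
        guess *= (1 + eps)
    return best
```

    For example, with jobs [3, 3, 2, 2, 2] on 2 machines, OPT is 6 and the sketch returns a makespan of 7, within the 2*guess bound.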

    Balanced Allocation on Graphs: A Random Walk Approach

    In this paper we propose algorithms for allocating $n$ sequential balls into $n$ bins that are interconnected as a $d$-regular $n$-vertex graph $G$, where $d\ge 3$ can be any integer. Let $l$ be a given positive integer. In each round $t$, $1\le t\le n$, ball $t$ picks a node of $G$ uniformly at random and performs a non-backtracking random walk of length $l$ from the chosen node. Then it allocates itself on one of the visited nodes with minimum load (ties are broken uniformly at random). Suppose that $G$ has a sufficiently large girth and $d=\omega(\log n)$. Then we establish an upper bound, in terms of $l$, on the maximum number of balls at any bin after allocating $n$ balls by the algorithm, called the {\it maximum load}, that holds with high probability. We also show that the upper bound is at most an $O(\log\log n)$ factor above the lower bound that is proved for the algorithm. In particular, we show that if we set $l=\lfloor(\log n)^{\frac{1+\epsilon}{2}}\rfloor$ for any constant $\epsilon\in (0, 1)$, and $G$ has girth at least $\omega(l)$, then the maximum load attained by the algorithm is bounded by $O(1/\epsilon)$ with high probability. Finally, we slightly modify the algorithm to obtain similar results for balanced allocation on $d$-regular graphs with $d\in[3, O(\log n)]$ and sufficiently large girth.
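
    The allocation rule in this abstract translates directly into code. The sketch below is illustrative only (graph representation, function name, and tie-breaking mechanics are my assumptions): each ball starts at a random node, performs a non-backtracking walk, and settles on a minimum-load visited node.

```python
import random

def walk_allocate(graph, walk_len, seed=0):
    """Allocate one ball per node of a regular graph: pick a random start,
    walk walk_len steps without backtracking, and place the ball on a
    minimum-load visited node (ties broken at random).
    graph: adjacency list {node: [neighbors]}. Returns the maximum load."""
    rng = random.Random(seed)
    load = {v: 0 for v in graph}
    for _ in range(len(graph)):                       # n balls for n nodes
        v = rng.choice(list(graph))
        visited, prev = [v], None
        for _ in range(walk_len):
            choices = [u for u in graph[v] if u != prev]  # no backtracking
            if not choices:
                break
            prev, v = v, rng.choice(choices)
            visited.append(v)
        # min load first; the random second key breaks ties uniformly
        target = min(visited, key=lambda u: (load[u], rng.random()))
        load[target] += 1
    return max(load.values())
```

    On a 10-node cycle (girth 10, longer than the walk), 10 balls yield a maximum load of at most a few balls.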

    Brand gender and consumer-based brand equity on Facebook: The mediating role of consumer-brand engagement and brand love

    Brand gender has been suggested as a relevant source of consumer-based brand equity (CBBE). The purpose of this paper is to deepen understanding of the relationship between brand gender and CBBE by analyzing the mediating role of consumer–brand engagement (CBE) and brand love (BL) in this relationship. This research was conducted on Facebook, the dominant global social media platform. The hypotheses were tested using structural equation modeling. Results support 6 of the 9 hypotheses, with significant relationships between the analyzed constructs. This study advances prior work by showing that brand gender has an indirect and relevant impact on CBBE through BL and CBE. Therefore, this research confirms the advantages of clear gender positioning and extends prior research by suggesting that brands with a strong gender identity will encourage BL and CBE.

    Locally Optimal Load Balancing

    This work studies distributed algorithms for locally optimal load balancing: we are given a graph of maximum degree $\Delta$, and each node has up to $L$ units of load. The task is to distribute the load more evenly so that the loads of adjacent nodes differ by at most $1$. If the graph is a path ($\Delta = 2$), it is easy to solve the fractional version of the problem in $O(L)$ communication rounds, independently of the number of nodes. We show that this is tight, and we show that it is possible to solve also the discrete version of the problem in $O(L)$ rounds in paths. For the general case ($\Delta > 2$), we show that fractional load balancing can be solved in $\operatorname{poly}(L,\Delta)$ rounds and discrete load balancing in $f(L,\Delta)$ rounds for some function $f$, independently of the number of nodes.

    Comment: 19 pages, 11 figures
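
    A minimal local rule for the path case can be sketched as follows; this is my illustrative version of a synchronous local balancing round, not the algorithm from the paper. In each round, every node compares itself with its neighbors and sends one unit across any edge whose endpoints differ by at least 2.

```python
def balance_path(loads, rounds):
    """Discrete load balancing on a path: in each synchronous round, one
    unit moves across every edge whose endpoint loads differ by >= 2.
    Terminates with adjacent loads differing by at most 1 (locally optimal)."""
    loads = list(loads)
    for _ in range(rounds):
        moves = [0] * len(loads)
        for i in range(len(loads) - 1):
            if loads[i] - loads[i + 1] >= 2:        # send one unit right
                moves[i] -= 1; moves[i + 1] += 1
            elif loads[i + 1] - loads[i] >= 2:      # send one unit left
                moves[i + 1] -= 1; moves[i] += 1
        loads = [a + b for a, b in zip(loads, moves)]
    return loads
```

    For example, the path [0, 5, 0] settles to [2, 1, 2] in two rounds: every pair of neighbors then differs by at most 1, even though the global spread is not zero, which is exactly the "locally optimal" notion studied.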

    Statistical mechanics of budget-constrained auctions

    Finding the optimal assignment in budget-constrained auctions is a combinatorial optimization problem with many important applications, a notable example being the sale of advertisement space by search engines (in this context the problem is often referred to as the off-line AdWords problem). Based on the cavity method of statistical mechanics, we introduce a message-passing algorithm that is capable of efficiently solving random instances of the problem drawn from a natural distribution, and we derive from its properties the phase diagram of the problem. As the control parameter (the average value of the budgets) is varied, we find two phase transitions delimiting a region in which long-range correlations arise.

    Comment: Minor revision
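
    To make the problem concrete, here is a simple greedy baseline for the off-line AdWords setting, not the message-passing algorithm from the paper: each item (query) goes to the highest bidder whose remaining budget can still cover its bid. The data layout and function name are my assumptions.

```python
def greedy_adwords(bids, budgets):
    """Greedy baseline for the off-line AdWords problem.
    bids[i][a] = bid of advertiser a on item i; budgets[a] = spending cap.
    Assigns each item to the highest affordable bidder; returns revenue."""
    spent = [0.0] * len(budgets)
    revenue = 0.0
    for item_bids in bids:
        best, best_bid = None, 0.0
        for a, bid in enumerate(item_bids):
            # a bid is feasible only if it fits in the remaining budget
            if bid > best_bid and spent[a] + bid <= budgets[a]:
                best, best_bid = a, bid
        if best is not None:
            spent[best] += best_bid
            revenue += best_bid
    return revenue
```

    Greedy is generally suboptimal here; the coupling between items through the shared budget constraints is exactly what makes the problem hard and what the cavity-method analysis captures.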

    On the Growth of Al2O3 Scales

    Understanding the growth of Al2O3 scales requires knowledge of the details of the chemical reactions at the scale–gas and scale–metal interfaces, which in turn requires specifying how the creation/annihilation of O and Al vacancies occurs at these interfaces. The availability of the necessary electrons and holes to allow for such creation/annihilation is a crucial aspect of the scaling reaction. The electronic band structure of polycrystalline Al2O3 thus plays a decisive role in scale formation and is considered in detail, including the implications of a density functional theory (DFT) calculation of the band structure of a Σ7 bicrystal boundary, for which the atomic structure of the boundary was known from an independent DFT energy-minimization calculation and from comparisons with an atomic-resolution transmission electron micrograph of the same boundary. DFT calculations of the formation energy of O and Al vacancies in bulk Al2O3 in various charge states as a function of the Fermi energy suggested that electronic conduction in Al2O3 scales most likely involves excitation of both electrons and holes, which are localized on singly charged O vacancies and doubly charged Al vacancies, respectively. We also consider the variation of the Fermi level across the scale and the bending (“tilting”) of the conduction band minimum and valence band maximum due to the electric field developed during the scaling reaction. The band structure calculations suggest a new mechanism for the “reactive element” effect (the improved oxidation resistance that results from segregation of Y, Hf, etc., to grain boundaries in Al2O3 scales): namely, that the effect is due to the modification of the near-band-edge grain-boundary defect states rather than to any blocking of diffusion pathways, as previously postulated.
    Secondly, Al2O3 scale formation is dominated by grain-boundary as opposed to lattice diffusion, and there is unambiguous evidence for countercurrent transport of both O and Al in Al2O3 scale-forming alloys. We postulate that such transport is mediated by the migration of grain-boundary disconnections containing charged jogs, rather than by the jumping of isolated point defects in random high-angle grain boundaries.
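
    The Fermi-level dependence of charged-defect formation energies mentioned above follows the standard first-order relation E_f(q, E_F) = E_f(q, 0) + q·E_F. The sketch below uses entirely hypothetical numbers (not values from the paper) to show how the preferred charge state switches as the Fermi level moves across the gap.

```python
def formation_energy(e0, charge, fermi_level):
    """Formation energy of a charged defect vs. Fermi level (standard
    first-order dependence): E_f(q, E_F) = E_f(q, 0) + q * E_F.
    e0: formation energy at E_F = 0 (eV); charge q in units of e."""
    return e0 + charge * fermi_level

def crossover(e0_a, q_a, e0_b, q_b):
    """Fermi level at which two charge states have equal formation energy."""
    return (e0_b - e0_a) / (q_a - q_b)
```

    For instance, with hypothetical values e0 = 2.0 eV for a +1 defect and e0 = 8.0 eV for a -2 defect, the two formation energies cross at E_F = 2.0 eV; below that level the positive defect is favored, above it the negative one.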

    The power of choice in network growth

    The "power of choice" has been shown to radically alter the behavior of a number of randomized algorithms. Here we explore the effects of choice on models of tree and network growth. In our models each new node has k randomly chosen contacts, where k > 1 is a constant. It then attaches to whichever one of these contacts is most desirable in some sense, such as its distance from the root or its degree. Even when the new node has just two choices, i.e., when k=2, the resulting network can be very different from a random graph or tree. For instance, if the new node attaches to the contact which is closest to the root of the tree, the distribution of depths changes from Poisson to a traveling wave solution. If the new node attaches to the contact with the smallest degree, the degree distribution is closer to uniform than in a random graph, so that with high probability there are no nodes in the network with degree greater than O(log log N). Finally, if the new node attaches to the contact with the largest degree, we find that the degree distribution is a power law with exponent -1 up to degrees roughly equal to k, with an exponential cutoff beyond that; thus, in this case, we need k >> 1 to see a power law over a wide range of degrees.

    Comment: 9 pages, 4 figures
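
    The min-degree variant of the growth model is easy to simulate; the sketch below is my illustration of the rule (function name and the choice to allow repeated contact samples are assumptions). Each new node samples k existing nodes and attaches to the one with the smallest current degree.

```python
import random

def grow_min_degree(n, k=2, seed=0):
    """Grow an n-node tree: each new node samples k existing nodes
    uniformly at random and attaches to the one with the smallest current
    degree. Returns the list of final degrees."""
    rng = random.Random(seed)
    degree = [1, 1]                       # start from a single edge
    for _ in range(n - 2):
        contacts = [rng.randrange(len(degree)) for _ in range(k)]
        target = min(contacts, key=lambda v: degree[v])
        degree[target] += 1               # attach the new node here
        degree.append(1)                  # the new node has one edge
    return degree
```

    Running this for n = 1000 with k = 2 gives a tree (999 edges, so degrees sum to 1998) whose maximum degree stays very small, consistent with the O(log log N) claim in the abstract.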

    Managing Climate Risk

    At the heart of the traditional approach to strategy in the climate change dilemma lies the assumption that the global community, by applying a set of powerful analytical tools, can predict the future of climate change accurately enough to choose a clear strategic direction. We claim that this approach may underestimate uncertainty in order to lay out a vision of future events precise enough to be captured in a discounted-cost-flow analysis in integrated assessment models. However, since the future of climate change is truly uncertain, this approach might at best be marginally helpful and at worst downright dangerous: underestimating uncertainty can lead to strategies that do not defend the world against unexpected and sometimes even catastrophic threats. Another danger lies at the other extreme: if the global community cannot find a strategy that works under traditional analysis, or if uncertainties are so large that clear messages are absent, it may abandon the analytical rigor of its planning process altogether and base decisions on instinct and on whatever future consensus is easiest to agree upon. In this paper we outline a system for deriving strategic decisions under uncertainty for the climate change dilemma. What follows is a framework for determining the level of uncertainty surrounding strategic decisions and for tailoring strategy to that uncertainty. Our core argument is that a robust strategy towards climate change involves building a technological portfolio of mitigation and adaptation measures that includes sufficient opposite technological positions to the underlying baseline emission scenarios, given the uncertainties of the entire physical and socioeconomic system in place. In the case of mitigation, the opposite technological positions with the highest leverage are particular types of sinks.
    A robust climate risk management portfolio can only work when the opposite technological positions are readily available when needed, and they therefore have to be prepared in advance. It is precisely the flexibility of these technological options that has to be quantified from the perspective of the uncertain nature of the underlying system and compared to the cost of creating these options, rather than comparing their cost with expected losses in a net-present-value type of analysis. We conclude that climate policy, especially under consideration of the precautionary principle, would look very different if uncertainties were taken explicitly into account.