4,517 research outputs found

    Sequential Posted Price Mechanisms with Correlated Valuations

    Full text link
    We study the revenue performance of sequential posted price mechanisms and some natural extensions, in a general setting where the valuations of the buyers are drawn from a correlated distribution. Sequential posted price mechanisms are conceptually simple mechanisms that work by proposing a take-it-or-leave-it offer to each buyer. We apply sequential posted price mechanisms to single-parameter multi-unit settings in which each buyer demands only one item and the mechanism can assign the service to at most k of the buyers. For standard sequential posted price mechanisms, we prove that even when the valuation distribution has finite support, no sequential posted price mechanism can extract a constant fraction of the optimal expected revenue, even with unlimited supply. We extend this result to the case of a continuous valuation distribution when various standard assumptions hold simultaneously. In fact, it turns out that the best fraction of the optimal revenue that is extractable by a sequential posted price mechanism is proportional to the ratio of the highest and lowest possible valuations. We prove that two simple generalizations of these mechanisms achieve a better revenue performance: if the sequential posted price mechanism has, for each buyer, the option of either proposing an offer or asking the buyer for its valuation, then an Omega(1/max{1,d}) fraction of the optimal revenue can be extracted, where d denotes the degree of dependence of the valuations, ranging from complete independence (d=0) to arbitrary dependence (d=n-1). Moreover, when we generalize sequential posted price mechanisms further, such that the mechanism can make a take-it-or-leave-it offer to the i-th buyer that depends on the valuations of all buyers except i's, we prove that a constant fraction (2-sqrt{e})/4 ≈ 0.088 of the optimal revenue can always be extracted.
    Comment: 29 pages; to appear in WINE 201
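    To make the mechanism class concrete, here is a minimal sketch of a sequential posted price mechanism run against a toy correlated valuation distribution. The common-shock distribution, the fixed-price policy, and all parameters are invented for illustration and are not taken from the paper.

```python
import random

def sequential_posted_price(valuations, prices, k):
    """Offer each buyer a take-it-or-leave-it price in a fixed order;
    stop once k units have been sold. Returns the revenue collected."""
    revenue, sold = 0.0, 0
    for value, price in zip(valuations, prices):
        if sold >= k:
            break
        if value >= price:   # the buyer accepts iff the offer is at most its value
            revenue += price
            sold += 1
    return revenue

def sample_correlated_valuations(n):
    """Toy correlated distribution: a common 'market level' shock
    shifts every buyer's valuation together."""
    level = random.choice([1.0, 10.0, 100.0])
    return [level * random.uniform(0.9, 1.1) for _ in range(n)]

n, k, trials = 5, 2, 10_000
prices = [9.0] * n   # one fixed-price policy; choosing prices well is the hard part
average = sum(
    sequential_posted_price(sample_correlated_valuations(n), prices, k)
    for _ in range(trials)
) / trials
print(f"average revenue over {trials} runs: {average:.2f}")
```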

    ASCORE: an up-to-date cardiovascular risk score for hypertensive patients reflecting contemporary clinical practice developed using the (ASCOT-BPLA) trial data.

    No full text
    A number of risk scores already exist to predict cardiovascular (CV) events. However, scores developed with data collected some time ago might not accurately predict the CV risk of contemporary hypertensive patients who benefit from more modern treatments and management. Using data from the randomised clinical trial Anglo-Scandinavian Cardiac Outcomes Trial-BPLA, with 15 955 hypertensive patients without previous CV disease receiving contemporary preventive CV management, we developed a new risk score predicting the 5-year risk of a first CV event (CV death, myocardial infarction or stroke). Cox proportional hazard models were used to develop a risk equation from baseline predictors. The final risk model (ASCORE) included age, sex, smoking, diabetes, previous blood pressure (BP) treatment, systolic BP, total cholesterol, high-density lipoprotein-cholesterol, fasting glucose and creatinine as baseline variables. A simplified model (ASCORE-S) excluding laboratory variables was also derived. Both models showed very good internal validity. User-friendly integer score tables are reported for both models. Applying the latest Framingham risk score to our data significantly overpredicted the observed 5-year risk of the composite CV outcome. We conclude that risk scores derived using older databases (such as Framingham) may overestimate the CV risk of patients receiving current BP treatments; therefore, 'updated' risk scores are needed for current patients.
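    For readers unfamiliar with how a Cox-derived risk equation is applied: the t-year risk is 1 - S0(t)^exp(lp), where lp is the linear predictor and S0(t) the baseline survival. The sketch below uses this standard formula with made-up coefficients and baseline survival; the actual ASCORE coefficients and integer score tables are only in the paper.

```python
import math

# Placeholder coefficients and baseline survival, invented for illustration;
# these are NOT the published ASCORE values.
BETA = {
    "age_per_10y": 0.35,
    "male": 0.25,
    "smoker": 0.40,
    "diabetes": 0.55,
    "sbp_per_20mmHg": 0.20,
}
S0_5YR = 0.97  # hypothetical baseline 5-year event-free survival

def five_year_risk(age, male, smoker, diabetes, sbp):
    """Standard Cox risk equation: risk(t) = 1 - S0(t) ** exp(linear predictor)."""
    lp = (
        BETA["age_per_10y"] * (age - 60) / 10
        + BETA["male"] * male
        + BETA["smoker"] * smoker
        + BETA["diabetes"] * diabetes
        + BETA["sbp_per_20mmHg"] * (sbp - 140) / 20
    )
    return 1.0 - S0_5YR ** math.exp(lp)

# Example: a 65-year-old non-smoking diabetic man with systolic BP 155 mmHg.
print(f"5-year risk: {five_year_risk(65, male=1, smoker=0, diabetes=1, sbp=155):.1%}")
```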

    The potassic sedimentary rocks in Gale Crater, Mars, as seen by ChemCam on board Curiosity

    Get PDF
    The Mars Science Laboratory rover Curiosity encountered potassium-rich clastic sedimentary rocks at two sites in Gale Crater, the waypoints Cooperstown and Kimberley. These rocks include several distinct meters-thick sedimentary outcrops ranging from fine sandstone to conglomerate, interpreted to record an ancient fluvial or fluvio-deltaic depositional system. From ChemCam Laser-Induced Breakdown Spectroscopy (LIBS) chemical analyses, this suite of sedimentary rocks has an overall mean K2O abundance that is more than 5 times higher than that of the average Martian crust. The combined analysis of ChemCam data with stratigraphic and geographic locations reveals that the mean K2O abundance increases upward through the stratigraphic section. Chemical analyses across each unit can be represented as mixtures of several distinct chemical components, i.e., mineral phases, including K-bearing minerals, mafic silicates, Fe-oxides, and Fe-hydroxides/oxyhydroxides. Possible K-bearing minerals include alkali feldspars (including anorthoclase and sanidine) and K-bearing phyllosilicates such as illite. Mixtures of different source rocks, including a potassium-rich rock located on the rim and walls of Gale Crater, are the likely origin of the observed chemical variations within each unit. Physical sorting may also have played a role in the K enrichment in the Kimberley formation. The occurrence of these potassic sedimentary rocks provides additional evidence for the chemical diversity of the crust exposed at Gale Crater.
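    Representing a measured bulk composition as a non-negative mixture of candidate mineral end-members is commonly posed as a non-negative least-squares problem. The sketch below illustrates that idea with invented end-member compositions and an invented observation; it is not the paper's unmixing procedure, and none of the numbers come from the ChemCam data set.

```python
import numpy as np
from scipy.optimize import nnls

# Invented end-member compositions in wt% (rows: oxides, columns: phases).
oxides = ["SiO2", "Al2O3", "FeOT", "K2O"]
endmembers = np.array([
    [64.0, 50.0,  0.0],   # SiO2
    [18.5,  8.0,  0.0],   # Al2O3
    [ 0.5, 25.0, 90.0],   # FeOT
    [14.0,  0.5,  0.0],   # K2O
])
phases = ["K-feldspar-like", "mafic-silicate-like", "Fe-oxide-like"]

observed = np.array([45.0, 12.0, 25.0, 5.5])   # one made-up LIBS composition

# Solve observed ~= endmembers @ fractions, subject to fractions >= 0.
fractions, residual = nnls(endmembers, observed)
for phase, f in zip(phases, fractions):
    print(f"{phase}: {f:.2f}")
print(f"residual norm: {residual:.2f}")
```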

    Health services research in the public healthcare system in Hong Kong: An analysis of over 1 million antihypertensive prescriptions between 2004-2007 as an example of the potential and pitfalls of using routinely collected electronic patient data

    Get PDF
    Objectives: Increasing use is being made of routinely collected electronic patient data in health services research. The aim of the present study was to evaluate the potential usefulness of a comprehensive database used routinely in the public healthcare system in Hong Kong, using antihypertensive drug prescriptions in primary care as an example.

    Methods: Data on antihypertensive drug prescriptions were retrieved from the electronic Clinical Management System (e-CMS) of all primary care clinics run by the Hospital Authority (HA) in the New Territories East (NTE) cluster of Hong Kong between January 2004 and June 2007. Information was also retrieved on patients' demographic and socioeconomic characteristics, visit type (new or follow-up), and relevant diseases (International Classification of Primary Care, ICPC codes).

    Results: 1,096,282 visit episodes were accessed, representing 93,450 patients. Patients' demographic and socio-economic details were recorded in all cases. Prescription details for anti-hypertensive drugs were missing for only 18 patients (0.02%). However, the ICPC code was missing for 36,409 patients (39%). Significant independent predictors of whether disease codes were applied included patient age > 70 years (OR 2.18), female gender (OR 1.20), district of residence (ORs 0.32-0.41 in more rural districts), type of clinic (OR 1.45 in Family Medicine Specialist Clinics) and type of visit (OR 2.39 for follow-up visits).

    Among the 57,041 patients with an ICPC code, uncomplicated hypertension (ICPC K86) was recorded in 45,859 patients (82.1%). The characteristics of these patients were very similar to those of the non-coded group, suggesting that most non-coded patients on antihypertensive drugs are likely to have uncomplicated hypertension.

    Conclusion: The e-CMS database of the HA in Hong Kong varies in quality in terms of recorded information. Potential future health services research using demographic and prescription information is highly feasible, but for disease-specific research dependent on ICPC codes some caution is warranted. In the case of uncomplicated hypertension, future research on pharmaco-epidemiology (such as prescription patterns) and clinical issues (such as side-effects of medications on metabolic parameters) seems feasible given the large size of the data set and the comparability of coded and non-coded patients.
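    As a reminder of how a figure like "OR 2.18 for age > 70" is read, here is a toy unadjusted odds ratio computed from an invented 2x2 table. The paper's ORs come from a multivariable model, so this is illustrative only.

```python
# Invented counts for a 2x2 table: ICPC code recorded vs missing, by age group.
coded_older, uncoded_older = 30_000, 9_000     # patients aged > 70
coded_younger, uncoded_younger = 27_000, 18_000

odds_older = coded_older / uncoded_older       # odds of being coded if older
odds_younger = coded_younger / uncoded_younger
print(f"unadjusted odds ratio (age > 70 vs younger): {odds_older / odds_younger:.2f}")
# -> 2.22 here; an OR above 1 means coding was more likely for older patients.
```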

    A simulation tool for better management of retinal services

    Get PDF
    Background: Advances in the management of retinal diseases have been fast-paced as new treatments become available, resulting in increasing numbers of patients receiving treatment in hospital retinal services. These patients require frequent, long-term follow-up and repeated treatments, resulting in increased pressure on clinical workloads. Owing to limited clinic capacity, many National Health Service (NHS) clinics are failing to maintain recommended follow-up intervals for patients receiving care. Clear and robust long-term retinal service models are therefore required to assess and respond to the needs of local populations, both currently and in the future. Methods: A discrete event simulation (DES) tool was developed to facilitate the improvement of retinal services by identifying efficiencies and cost savings within the pathway of care. For a mid-size hospital in England serving a population of over 500,000, we used 36 months of patient-level data in conjunction with statistical forecasting and simulation to predict the impact of making changes within the service. Results: We present a simulation of increased demand and a potential solution: the 'Treat and Extend' (T&E) regimen, which is reported to result in better outcomes, in combination with virtual clinics, which improve quality, effectiveness and productivity and thus increase capacity. Without virtual clinics, where T&E is implemented alongside the current service, we observe a sharp increase in the number of follow-ups, the number of anti-VEGF injections, and the utilisation of resources. When T&E is combined with virtual clinics, the impact on utilisation of resources is negligible (almost 0%). Conclusions: Expansion of services to accommodate the increasing number of patients seen and treated in retinal services is feasible with service re-organisation. Some form of initial investment is inevitably required to implement service expansion through T&E and virtual clinics. However, modelling with DES indicates that such investment is outweighed by cost reductions in the long term, as more patients receive optimal treatment and retain vision with better outcomes. The model also shows that the service will experience an average 10% increase in surplus capacity.
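    To show the flavour of a discrete event simulation of a clinic, here is a minimal sketch using the simpy library: Poisson arrivals, exponential treatment times, and a fixed number of treatment slots. It is not the paper's model; the capacity, rates, and session length are invented.

```python
import random
import simpy

def patient(env, clinic, mean_treat, waits):
    """One patient: queue for a slot, record the wait, then be treated."""
    arrive = env.now
    with clinic.request() as slot:
        yield slot                       # wait for a free treatment slot
        waits.append(env.now - arrive)   # time spent queueing
        yield env.timeout(random.expovariate(1 / mean_treat))

def arrivals(env, clinic, rate, mean_treat, waits):
    """Generate patients with exponentially distributed inter-arrival times."""
    while True:
        yield env.timeout(random.expovariate(rate))
        env.process(patient(env, clinic, mean_treat, waits))

random.seed(1)
env = simpy.Environment()
clinic = simpy.Resource(env, capacity=3)   # hypothetical: 3 injection slots
waits = []
env.process(arrivals(env, clinic, rate=0.25, mean_treat=10, waits=waits))
env.run(until=8 * 60)                      # one 8-hour session, in minutes
print(f"mean wait: {sum(waits) / len(waits):.1f} min over {len(waits)} patients")
```

    Changing `capacity` (e.g. to model virtual clinics freeing up slots) or `rate` (to model growing demand) and re-running is exactly the kind of what-if analysis such a tool supports.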

    Logarithmic Corrections to Schwarzschild and Other Non-extremal Black Hole Entropy in Different Dimensions

    Full text link
    The Euclidean gravity method has been successful in computing logarithmic corrections to extremal black hole entropy in terms of low energy data, and gives results in perfect agreement with the microscopic results in string theory. Motivated by this success, we apply Euclidean gravity to compute logarithmic corrections to the entropy of various non-extremal black holes in different dimensions, taking special care of the integration over zero modes and keeping track of the ensemble in which the computation is done. These results provide a strong constraint on any ultraviolet completion of the theory if the latter is able to give an independent computation of the entropy of non-extremal black holes from a microscopic description. For Schwarzschild black holes in four space-time dimensions, the macroscopic result seems to disagree with the existing result in loop quantum gravity.
    Comment: LaTeX, 40 pages; corrected small typos and added reference
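    As a schematic reminder of the structure of such corrections (the specific coefficients are theory- and ensemble-dependent and are computed in the paper), the one-loop corrected entropy takes the form:

```latex
% Schematic form of one-loop logarithmic corrections to black hole entropy;
% the coefficients below are placeholders, not the paper's results.
\begin{align}
  S_{\rm BH} &= \frac{A_H}{4 G_N} + C \,\ln \frac{A_H}{G_N} + \cdots, \\
  C &= C_{\rm local} + C_{\rm zero\ modes},
\end{align}
% where $C_{\rm local}$ is fixed by the integrated four-derivative
% (Seeley--DeWitt) heat kernel coefficient of the massless fluctuations,
% and $C_{\rm zero\ modes}$ accounts for the zero-mode integration and
% the choice of ensemble.
```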

    Logarithmic Corrections to Extremal Black Hole Entropy from Quantum Entropy Function

    Get PDF
    We evaluate the one-loop determinant of matter multiplet fields of N=4 supergravity in the near horizon geometry of quarter BPS black holes, and use it to calculate logarithmic corrections to the entropy of these black holes using the quantum entropy function formalism. We show that even though individual fields give non-vanishing logarithmic contributions to the entropy, the net contribution from all the fields in the matter multiplet vanishes. Thus logarithmic corrections to the entropy of quarter BPS black holes, if present, must be independent of the number of matter multiplet fields in the theory. This is consistent with the microscopic results. During our analysis we also determine the complete spectrum of small fluctuations of matter multiplet fields in the near horizon geometry.
    Comment: LaTeX file, 52 pages; v2: minor corrections, references added
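    For context, the quantum entropy function defines the horizon degeneracy of an extremal black hole as a path integral over the near-horizon AdS2 geometry; the schematic form below follows Sen's formulation, with the one-loop determinants of the abstract entering as fluctuation determinants in this path integral.

```latex
% Quantum entropy function (schematic): the horizon degeneracy of an
% extremal black hole with charges $q_i$ as an $AdS_2$ path integral,
% with an IR-finite part prescription on the expectation value.
\begin{equation}
  d_{\rm hor}(\vec q) \;=\;
  \Big\langle \exp\Big[ -\,i\, q_i \oint d\theta\, A^i_\theta \Big]
  \Big\rangle^{\rm finite}_{AdS_2}
\end{equation}
% One-loop determinants of the fluctuating fields multiply this path
% integral and generate the $\ln A_H$ corrections to the entropy.
```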

    Alternative low-cost adsorbent for water and wastewater decontamination derived from eggshell waste: an overview

    Get PDF
    With the current global trend towards more stringent environmental standards, technical applicability and cost-effectiveness have become key factors in the selection of adsorbents for water and wastewater treatment. Recently, various low-cost adsorbents derived from agricultural waste, industrial by-products or natural materials have been intensively investigated. In this respect, eggshells from egg-breaking operations constitute a significant waste disposal problem for the food industry, so the development of value-added by-products from this waste is to be welcomed. The egg processing industry is very competitive, with low profit margins due to global competition and cheap imports. Additionally, the costs associated with eggshell disposal (mainly to landfill sites) are significant and are expected to continue increasing as landfill taxes increase. The aim of the present review is to provide an overview of the development of low-cost adsorbents derived from eggshell by-products.

    Monitoring and evaluation of human resources for health: an international perspective

    Get PDF
    BACKGROUND: Despite the undoubted importance of human resources to the functioning of health systems, there is little consistency between countries in how human resource strategies are monitored and evaluated. This paper presents an integrated approach for developing an evidence base on human resources for health (HRH) to support decision-making, drawing on a framework for health systems performance assessment. METHODS: Conceptual and methodological issues in selecting indicators for HRH monitoring and evaluation are discussed, and a range of primary and secondary data sources that might be used to generate indicators are reviewed. Descriptive analyses are conducted drawing primarily on one type of source, namely routinely reported data on the numbers of health personnel and medical schools as covered by national reporting systems and compiled by the World Health Organization. Regression techniques are used to triangulate a given HRH indicator calculated from different data sources across multiple countries. RESULTS: Major variations in the supply of health personnel and training opportunities are found to occur by region. However, certain discrepancies are also observed when the same indicator is measured from different sources, possibly related to differences in occupational classification or in the representativeness of the sources. CONCLUSION: Evidence-based information is needed to better understand trends in HRH. Although a range of sources exists that can potentially be used for HRH assessment, the information that can be derived from many of these individual sources precludes refined analysis. A variety of data sources and analytical approaches, each with its own strengths and limitations, is required to reflect the complexity of HRH issues. In order to enhance cross-national comparability, data collection efforts should be processed through the use of internationally standardized classifications (in particular, for occupation, industry and education) at the greatest level of detail possible.
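    A minimal sketch of what triangulating one indicator across sources can look like: fitting a simple least-squares line between two sources' country-level estimates of the same indicator. The indicator name and all values are invented for illustration; the paper's actual regression specification may differ.

```python
import numpy as np

# Hypothetical physician density (per 1,000 population) for five countries,
# as reported by two different sources.
source_a = np.array([0.4, 1.2, 2.1, 3.0, 3.8])   # e.g. national census
source_b = np.array([0.5, 1.0, 2.4, 2.9, 4.1])   # e.g. facility registry

slope, intercept = np.polyfit(source_a, source_b, 1)   # fit b ~ a
r = np.corrcoef(source_a, source_b)[0, 1]
print(f"b ≈ {slope:.2f}·a + {intercept:.2f}, r = {r:.2f}")
# A slope far from 1 or a low correlation flags systematic discrepancies,
# e.g. differing occupational classifications between the two sources.
```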