
    An improvement of the Berry--Esseen inequality with applications to Poisson and mixed Poisson random sums

    By a modification of the method applied in (Korolev and Shevtsova, 2009), the inequalities $\rho(F_n,\Phi)\le\frac{0.335789(\beta^3+0.425)}{\sqrt{n}}$ and $\rho(F_n,\Phi)\le\frac{0.3051(\beta^3+1)}{\sqrt{n}}$ are proved for the uniform distance $\rho(F_n,\Phi)$ between the standard normal distribution function $\Phi$ and the distribution function $F_n$ of the normalized sum of an arbitrary number $n\ge1$ of independent identically distributed random variables with zero mean, unit variance and finite third absolute moment $\beta^3$. The first of these inequalities sharpens the best known version of the classical Berry--Esseen inequality, since $0.335789(\beta^3+0.425)\le0.335789(1+0.425)\beta^3<0.4785\beta^3$ by virtue of the condition $\beta^3\ge1$, and 0.4785 is the best known upper estimate of the absolute constant in the classical Berry--Esseen inequality. The second inequality is applied to lower the upper estimate of the absolute constant in the analogue of the Berry--Esseen inequality for Poisson random sums to 0.3051, which is strictly less than the least possible value of the absolute constant in the classical Berry--Esseen inequality. As a corollary, the estimates of the rate of convergence in limit theorems for compound mixed Poisson distributions are refined. Comment: 33 pages
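    The arithmetic behind the sharpening claim can be checked directly; a minimal sketch (our own verification, not code from the paper):

```python
# Worst case of the comparison is beta3 = 1 (the moment condition gives beta3 >= 1):
# 0.335789 * (beta3 + 0.425) <= 0.335789 * (1 + 0.425) * beta3 < 0.4785 * beta3.
c = 0.335789 * (1 + 0.425)
print(c)  # 0.478499...
assert c < 0.4785
```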

    Proof-theoretic semantics, a problem with negation and prospects for modality

    This paper discusses proof-theoretic semantics, the project of specifying the meanings of the logical constants in terms of the rules of inference governing them. I concentrate on Michael Dummett's and Dag Prawitz's philosophical motivations and give precise characterisations of the crucial notions of harmony and stability, placed in the context of proving normalisation results in systems of natural deduction. I point out a problem with defining the meaning of negation in this framework and discuss prospects for an account of the meanings of modal operators in terms of rules of inference.

    Implicit complexity for coinductive data: a characterization of corecurrence

    We propose a framework for reasoning about programs that manipulate coinductive data as well as inductive data. Our approach is based on using equational programs, which support a seamless combination of computation and reasoning, and on using productivity (fairness), rather than bisimulation, as the fundamental assertion; the latter is expressible in terms of the former. As an application of this framework, we give an implicit characterization of corecurrence: a function is definable using corecurrence iff its productivity is provable using coinduction for formulas in which data-predicates do not occur negatively. This is an analog, albeit in weaker form, of a characterization of recurrence (i.e. primitive recursion) in [Leivant, Unipolar induction, TCS 318, 2004]. Comment: In Proceedings DICE 2011, arXiv:1201.034
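    For a concrete, if informal, feel for the kind of definitions in question, here is a small corecursive stream example of our own (not from the paper), written with Python generators; productivity means each element of the output stream arrives after finitely many computation steps:

```python
from itertools import islice

def nats(n=0):
    # Corecursive stream of natural numbers: productive, since each
    # element is yielded after finitely many steps.
    while True:
        yield n
        n += 1

def smap(f, xs):
    # Stream map, defined by corecurrence on the output stream:
    # each output element depends only on one input element.
    for x in xs:
        yield f(x)

print(list(islice(smap(lambda x: x * x, nats()), 5)))  # [0, 1, 4, 9, 16]
```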

    General-elimination stability

    General-elimination harmony articulates Gentzen's idea that the elimination-rules are justified if they infer from an assertion no more than can already be inferred from the grounds for making it. Dummett described the rules as not only harmonious but stable if the E-rules allow one to infer no more and no less than the I-rules justify. Pfenning and Davies call the rules locally complete if the E-rules are strong enough to allow one to infer the original judgement. A method is given of generating harmonious general-elimination rules from a collection of I-rules. We show that the general-elimination rules satisfy Pfenning and Davies' test for local completeness, but question whether that is enough to show that they are stable. Alternative conditions for stability are considered, including equivalence between the introduction- and elimination-meanings of a connective, and recovery of the grounds for assertion, finally generalizing the notion of local completeness to capture Dummett's notion of stability satisfactorily. We show that the general-elimination rules meet the last of these conditions, and so are indeed not only harmonious but also stable.
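    As a standard textbook illustration (ours, not part of the abstract), the general-elimination rule for conjunction infers from $A \wedge B$ exactly what already follows from the grounds $A$ and $B$ for asserting it:

```latex
% General-elimination rule for conjunction (standard example, added for
% illustration): from A /\ B, together with a derivation of C from the
% discharged assumptions [A] and [B], infer C.
\[
\frac{A \wedge B \qquad
      \begin{array}{c} [A]\;\;[B] \\ \vdots \\ C \end{array}}
     {C}\ {\wedge}E_{\mathrm{gen}}
\]
```

    Local completeness in Pfenning and Davies' sense then amounts to taking $C := A \wedge B$ and re-deriving the original judgement from the discharged assumptions by $\wedge$-introduction.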

    A Bell Inequality Analog in Quantum Measure Theory

    One obtains Bell's inequalities if one posits a hypothetical joint probability distribution, or {\it measure}, whose marginals yield the probabilities produced by the spin measurements in question. The existence of a joint measure is in turn equivalent to a certain causality condition known as ``screening off''. We show that if one assumes, more generally, a joint {\it quantal measure}, or ``decoherence functional'', one obtains instead an analogous inequality weaker by a factor of $\sqrt{2}$. The proof of this ``Tsirel'son inequality'' is geometrical and rests on the possibility of associating a Hilbert space to any strongly positive quantal measure. These results lead both to a {\it question}: ``Does a joint measure follow from some quantal analog of `screening off'?'', and to the {\it observation} that non-contextual hidden variables are viable in histories-based quantum mechanics, even if they are excluded classically. Comment: 38 pages, TeX. Several changes and added comments to bring out the meaning more clearly. Minor rewording and extra acknowledgements; now closer to published version

    Models of HoTT and the Constructive View of Theories

    Homotopy Type Theory and its model theory provide a novel formal semantic framework for representing scientific theories. This framework supports a constructive view of theories, according to which a theory is essentially characterised by its methods. The constructive view of theories was defended earlier by Ernest Nagel and a number of other philosophers of the past, but the logical means available to them did not allow them to build formal representational frameworks that implement this view.

    Towards a canonical classical natural deduction system

    This paper studies a new classical natural deduction system, presented as a typed calculus named \lml. It is designed to be isomorphic to Curien-Herbelin's calculus, both at the level of proofs and of reduction, and the isomorphism is based on the correct correspondence between cut (resp. left-introduction) in sequent calculus and substitution (resp. elimination) in natural deduction. It is a combination of Parigot's $\lambda\mu$-calculus with the idea of ``coercion calculus'' due to Cervesato-Pfenning, accommodating let-expressions in a surprising way: they expand Parigot's syntactic class of named terms. This calculus aims to be the simultaneous answer to three problems. The first problem is the lack of a canonical natural deduction system for classical logic. \lml is not yet another classical calculus, but rather a canonical reflection in natural deduction of the impeccable treatment of classical logic by sequent calculus. The second problem is the lack of a formalization of the usual semantics of Curien-Herbelin's calculus, which explains co-terms and cuts as, respectively, contexts and hole-filling instructions. The mentioned isomorphism is the required formalization, based on the precise notions of context and hole-expression offered by \lml. The third problem is the lack of a robust process of ``read-back'' into natural deduction syntax of calculi in the sequent calculus format, which affects mainly the recent proof-theoretic efforts to derive $\lambda$-calculi for call-by-value. An isomorphic counterpart to the $Q$-subsystem of Curien-Herbelin's calculus is derived, obtaining a new $\lambda$-calculus for call-by-value, combining control and let-expressions. Fundação para a Ciência e a Tecnologia (FCT)

    Assessment-schedule matching in unanchored indirect treatment comparisons of progression-free survival in cancer studies

    Background The timing of efficacy-related clinical events recorded at scheduled study visits in clinical trials is interval-censored, with the interval duration pre-determined by the study protocol. Events may happen at any time during that interval but can only be detected during a planned or unplanned visit. Disease progression in oncology is a notable example where the time to an event is affected by the schedule of visits within a study. This can become a source of bias when studies with varying assessment schedules are used in unanchored comparisons using methods such as matching-adjusted indirect comparisons. Objective We illustrate assessment-time bias (ATB) in a simulation study based on data from a recent study in second-line treatment for locally advanced or metastatic urothelial carcinoma, and present a method to adjust for differences in assessment schedule when comparing progression-free survival (PFS) against a competing treatment. Methods A multi-state model for death and progression was used to generate simulated death and progression times, from which PFS times were derived. PFS data were also generated for a hypothetical comparator treatment by applying a constant hazard ratio (HR) to the baseline treatment. Simulated PFS times for the two treatments were then aligned to different assessment schedules so that progression events were only observed at set visit times, and the data were analysed to assess the bias and standard error of estimates of HRs between the two treatments with and without assessment-schedule matching (ASM). Results ATB is strongly affected by the event rate at the first assessment time; in our examples, the bias ranged from 3 to 11% as the event rate increased. The proposed method relies on individual-level data from a study and adjusts the timing of progression events to the comparator's schedule by shifting them forward or backward without altering the patients' actual follow-up time. The method removed the bias almost completely in all scenarios without affecting the precision of estimates of comparative effectiveness. Conclusions Considering the increasing use of unanchored comparative analyses for novel cancer treatments based on single-arm studies, the proposed method offers a relatively simple means of improving the accuracy of estimates of the relative benefits of treatments on progression times.
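    A minimal sketch of the core idea (our own illustration; the function name and schedules are hypothetical, not the authors' code): a progression time observed on one study's visit grid is re-aligned to the comparator's assessment schedule by moving it to a comparator visit, without extending the patient's actual follow-up:

```python
import bisect

def snap_to_schedule(event_time, visit_times, followup_end):
    """Shift an event time to the first comparator visit at or after it,
    capped at the patient's actual follow-up end (illustrative only)."""
    i = bisect.bisect_left(visit_times, event_time)
    if i < len(visit_times) and visit_times[i] <= followup_end:
        return visit_times[i]
    # No later visit within follow-up: fall back to the last earlier visit.
    return visit_times[max(i - 1, 0)]

# Comparator assesses every 8 weeks; an event detected at week 18 in the
# original study would first be observable at the week-24 comparator visit.
comparator_visits = [8, 16, 24, 32, 40]
print(snap_to_schedule(18, comparator_visits, followup_end=40))  # -> 24
```

    A full implementation would also handle censoring and competing death events, per the multi-state model described in the abstract.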

    Patient-reported outcome measures of the impact of cancer on patient’s everyday lives: a systematic review

    Purpose: Patients with advanced disease are living longer, and commonly used patient-reported outcome measures (PROMs) may miss relevant elements of the quality of extended survival. This systematic review examines the measures used to capture aspects of the quality of survival, including the impact on patients' everyday lives, such as finances, work and family roles. Methods: Searches were conducted in MEDLINE, EMBASE, CINAHL and PsycINFO, restricted to English language articles. Information on study characteristics, instruments and outcomes was systematically extracted and synthesised. A predefined set of criteria was used to rate the quality of studies. Results: From 2761 potentially relevant articles, 22 met all inclusion criteria, including 10 concerning financial distress, 3 on roles and responsibilities and 9 on multiple aspects of social well-being. Generally, studies were not of high quality; many lacked bias-free participant selection, had confounding factors and had not accounted for all participants. High levels of financial distress were reported and were associated with multiple demographic factors such as age and income. There were few reports concerned with impacts on patients' roles and responsibilities in everyday life, although practical and emotional struggles with parenting were identified. Social difficulties were common and associated with multiple factors, including being a caregiver. Many studies were single time-point surveys and used non-validated measures. Exceptions were the use of the COST measure and the Social Difficulties Inventory (SDI), validated measures of financial and social distress, respectively. Conclusions: Impact on some important parts of patients' everyday lives is insufficiently and inconsistently captured. Further PROM development focussing on roles and responsibilities, including work and caring for dependents, is warranted. 
    Implications for Cancer Survivors: Factors such as finances, employment and responsibility for caring for dependents (e.g. children and elderly relatives) can affect the well-being of cancer survivors. There is a need to ensure that any instruments used to assess patients' social well-being are broad enough to include these areas, so that any difficulties arising can be better understood and appropriately supported.

    Strict functionals for termination proofs
