Efficient Certified Resolution Proof Checking
We present a novel propositional proof tracing format that eliminates complex
processing, thus enabling efficient (formal) proof checking. The benefits of
this format are demonstrated by implementing a proof checker in C, which
outperforms a state-of-the-art checker by two orders of magnitude. We then
formalize the theory underlying propositional proof checking in Coq, and
extract a correct-by-construction proof checker for our format from the
formalization. An empirical evaluation using 280 unsatisfiable instances from
the 2015 and 2016 SAT competitions shows that this certified checker usually
performs comparably to a state-of-the-art non-certified proof checker. Using
this format, we formally verify the recent 200 TB proof of the Boolean
Pythagorean Triples conjecture
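To make the checking task concrete, the sketch below shows the core operation any resolution proof checker must perform: verifying a single resolution step against a clause database. The clause representation, function names, and step layout are illustrative assumptions, not the paper's trace format or its C and Coq checkers.

```python
# Minimal sketch of the core check in a resolution proof checker.
# Clauses are frozensets of non-zero integer literals (negation = sign flip).
# Illustrative only; this does not implement the paper's proof tracing format.

def resolve(c1, c2, pivot):
    """Resolve c1 (containing pivot) with c2 (containing -pivot)."""
    assert pivot in c1 and -pivot in c2
    return (c1 - {pivot}) | (c2 - {-pivot})

def check_step(db, antecedent_ids, pivot, claimed):
    """Verify that resolving the two antecedents on `pivot` yields `claimed`."""
    c1, c2 = (db[i] for i in antecedent_ids)
    return resolve(c1, c2, pivot) == claimed

# Example: from (x1 v x2) and (~x1 v x3), resolving on x1 gives (x2 v x3).
db = {1: frozenset({1, 2}), 2: frozenset({-1, 3})}
assert check_step(db, (1, 2), 1, frozenset({2, 3}))
```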
Efficient Certified RAT Verification
Clausal proofs have become a popular approach to validate the results of SAT
solvers. However, validating clausal proofs in the most widely supported format
(DRAT) is expensive even in highly optimized implementations. We present a new
format, called LRAT, which extends the DRAT format with hints that facilitate a
simple and fast validation algorithm. Checking validity of LRAT proofs can be
implemented using trusted systems such as the languages supported by theorem
provers. We demonstrate this by implementing two certified LRAT checkers, one
in Coq and one in ACL2
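The sketch below illustrates, under simplifying assumptions, the kind of hint-driven check that makes LRAT-style validation cheap: to justify a new clause, its negation is assumed and the hinted clauses are visited in order, each becoming unit or yielding a conflict. The data layout and function are hypothetical and do not reproduce the Coq or ACL2 checkers.

```python
# Hedged sketch of a hint-driven RUP check in the spirit of LRAT:
# to add clause C, assume all its literals false, then visit the hinted
# clauses in order; each must become unit (extending the assignment) or
# empty (conflict), at which point C is implied. Illustrative only.

def check_rup_with_hints(db, new_clause, hint_ids):
    assignment = {-lit for lit in new_clause}   # negate the new clause
    for cid in hint_ids:
        pending = [lit for lit in db[cid] if -lit not in assignment]
        if not pending:                         # conflict reached: C is implied
            return True
        if len(pending) == 1:                   # unit clause: propagate
            assignment.add(pending[0])
        else:
            return False                        # hint did not propagate
    return False

# Example: db has (x1) and (~x1 v x2); adding (x2) is justified by hints [1, 2].
db = {1: [1], 2: [-1, 2]}
print(check_rup_with_hints(db, [2], [1, 2]))    # True
```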
Poincaré on the Foundation of Geometry in the Understanding
This paper is about Poincaré’s view of the foundations of geometry. According to the established view, which has been inherited from the logical positivists, Poincaré, like Hilbert, held that axioms in geometry are schemata that provide implicit definitions of geometric terms, a view he expresses by stating that the axioms of geometry are “definitions in disguise.” I argue that this view does not accord well with Poincaré’s core commitment in the philosophy of geometry: the view that geometry is the study of groups of operations. In place of the established view I offer a revised view, according to which Poincaré held that axioms in geometry are in fact assertions about invariants of groups. Groups, as forms of the understanding, are prior in conception to the objects of geometry and afford the proper definition of those objects, according to Poincaré. Poincaré’s view therefore contrasts sharply with Kant’s foundation of geometry in a unique form of sensibility. According to my interpretation, axioms are not definitions in disguise because they themselves implicitly define their terms, but rather because they disguise the definitions which imply them
A Model-Based Analysis of GC-Biased Gene Conversion in the Human and Chimpanzee Genomes
GC-biased gene conversion (gBGC) is a recombination-associated process that favors the fixation of G/C alleles over A/T alleles. In mammals, gBGC is hypothesized to contribute to variation in GC content, rapidly evolving sequences, and the fixation of deleterious mutations, but its prevalence and general functional consequences remain poorly understood. gBGC is difficult to incorporate into models of molecular evolution and so far has primarily been studied using summary statistics from genomic comparisons. Here, we introduce a new probabilistic model that captures the joint effects of natural selection and gBGC on nucleotide substitution patterns, while allowing for correlations along the genome in these effects. We implemented our model in a computer program, called phastBias, that can accurately detect gBGC tracts about 1 kilobase or longer in simulated sequence alignments. When applied to real primate genome sequences, phastBias predicts gBGC tracts that cover roughly 0.3% of the human and chimpanzee genomes and account for 1.2% of human-chimpanzee nucleotide differences. These tracts fall in clusters, particularly in subtelomeric regions; they are enriched for recombination hotspots and fast-evolving sequences; and they display an ongoing fixation preference for G and C alleles. They are also significantly enriched for disease-associated polymorphisms, suggesting that they contribute to the fixation of deleterious alleles. The gBGC tracts provide a unique window into historical recombination processes along the human and chimpanzee lineages. They supply additional evidence of long-term conservation of megabase-scale recombination rates accompanied by rapid turnover of hotspots. Together, these findings shed new light on the evolutionary, functional, and disease implications of gBGC. The phastBias program and our predicted tracts are freely available. © 2013 Capra et al
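As a rough, hedged illustration of why a gBGC-like force skews substitution patterns (this is not the phastBias model itself), the snippet below applies Kimura's standard relative fixation probability to an assumed population-scaled bias coefficient B, showing how weak-to-strong (A/T to G/C) changes are favored and the reverse disfavored.

```python
# Hedged sketch (not the phastBias model): how a population-scaled gBGC
# coefficient B tilts fixation of "weak" (A/T) -> "strong" (G/C) mutations,
# using Kimura's fixation probability relative to a neutral mutation.
import math

def relative_fixation(S):
    """Fixation probability relative to neutral for scaled coefficient S."""
    if abs(S) < 1e-12:
        return 1.0
    return S / (1.0 - math.exp(-S))

B = 1.0                                  # assumed gBGC strength inside a tract
ws = relative_fixation(+B)               # A/T -> G/C, favored by gBGC
sw = relative_fixation(-B)               # G/C -> A/T, disfavored
print(f"W->S boosted to {ws:.2f}x neutral, S->W reduced to {sw:.2f}x neutral")
```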
A Unifying Model of Genome Evolution Under Parsimony
We present a data structure called a history graph that offers a practical
basis for the analysis of genome evolution. It conceptually simplifies the
study of parsimonious evolutionary histories by representing both substitutions
and double cut and join (DCJ) rearrangements in the presence of duplications.
The problem of constructing parsimonious history graphs thus subsumes related
maximum parsimony problems in the fields of phylogenetic reconstruction and
genome rearrangement. We show that tractable functions can be used to define
upper and lower bounds on the minimum number of substitutions and DCJ
rearrangements needed to explain any history graph. These bounds become tight
for a special type of unambiguous history graph called an ancestral variation
graph (AVG), which constrains in its combinatorial structure the number of
operations required. We finally demonstrate that for a given history graph, a
finite set of AVGs describes all parsimonious interpretations of that graph, and
this set can be explored with a few sampling moves.
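For illustration only, and not the paper's formal definition, a history-graph-like structure can be sketched as segments carrying optional base labels, branch edges recording ancestry (including duplications), and adjacencies between segment ends on which DCJ rearrangements would act:

```python
# Illustrative sketch (not the paper's formal definition): segments as nodes,
# optional base labels, branch edges recording ancestry, and adjacencies
# between segment ends on which DCJ operations act.
from dataclasses import dataclass, field

@dataclass
class Segment:
    name: str
    base: str | None = None            # None models an unlabelled ancestor
    children: list["Segment"] = field(default_factory=list)  # branch edges

@dataclass
class HistoryGraph:
    segments: list[Segment] = field(default_factory=list)
    adjacencies: set[tuple[str, str]] = field(default_factory=set)  # segment ends

    def add_branch(self, parent: Segment, child: Segment) -> None:
        parent.children.append(child)

# Example: an ancestral segment with two descendant copies (a duplication),
# one of which carries a substitution.
root = Segment("anc", None)
g = HistoryGraph(segments=[root])
g.add_branch(root, Segment("copy1", "A"))
g.add_branch(root, Segment("copy2", "G"))
```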
Encoding monomorphic and polymorphic types
Most automatic theorem provers are restricted to untyped logics, and existing translations from typed logics are bulky or unsound. Recent research proposes monotonicity as a means to remove some clutter. Here we pursue this approach systematically, analysing formally a variety of encodings that further improve on efficiency while retaining soundness and completeness. We extend the approach to rank-1 polymorphism and present alternative schemes that lighten
the translation of polymorphic symbols based on the novel notion of “cover”. The new encodings are implemented, and partly proved correct, in Isabelle/HOL. Our evaluation finds them vastly superior to previous schemes
Encoding monomorphic and polymorphic types
Many automatic theorem provers are restricted to untyped logics, and existing translations from typed logics are bulky or unsound. Recent research proposes monotonicity as a means to remove some clutter when translating monomorphic to untyped first-order logic. Here we pursue this approach systematically, analysing formally a variety of encodings that further improve on efficiency while retaining soundness and completeness. We extend the approach to rank-1 polymorphism and present alternative schemes that lighten the translation of polymorphic symbols based on the novel notion of “cover”. The new encodings are implemented in Isabelle/HOL as part of the Sledgehammer tool. We include informal proofs of soundness and correctness, and have formalized the monomorphic part of this work in Isabelle/HOL. Our evaluation finds the new encodings vastly superior to previous schemes
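As background for what such translations do, the sketch below shows the classic unoptimized guard-based encoding of typed quantification into untyped first-order logic; the monotonicity-based and cover-based schemes described above exist precisely to avoid most of this clutter. The predicate and function names are invented for illustration.

```python
# Hedged sketch of a classic guard-based type encoding into untyped
# first-order logic (the general idea behind such translations, not the
# paper's optimized monotonicity- or cover-based schemes): a typed
# quantifier "forall x : tau. phi" becomes "forall x. guard_tau(x) -> phi".

def encode_forall(var, ty, body):
    """Encode a typed universal quantifier with a type-guard predicate."""
    return f"forall {var}. (guard_{ty}({var}) -> {body})"

def encode_const(const, ty):
    """Typing axiom stating that a constant inhabits its type."""
    return f"guard_{ty}({const})"

# Example: "forall n : nat. leq(zero, n)" plus a typing axiom for zero.
print(encode_forall("n", "nat", "leq(zero, n)"))
print(encode_const("zero", "nat"))
```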
Soundness and completeness proofs by coinductive methods
We show how codatatypes can be employed to produce compact, high-level proofs of key results in logic: the soundness and completeness of proof systems for variations of first-order logic. For the classical completeness result, we first establish an abstract property of possibly infinite derivation trees. The abstract proof can be instantiated for a wide range of Gentzen and tableau systems for various flavors of first-order logic. Soundness becomes interesting as soon as one allows infinite proofs of first-order formulas. This forms the subject of several cyclic proof systems for first-order logic augmented with inductive predicate definitions studied in the literature. All the discussed results are formalized using Isabelle/HOL’s recently introduced support for codatatypes and corecursion. The development illustrates some unique features of Isabelle/HOL’s new coinductive specification language such as nesting through non-free types and mixed recursion–corecursion
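Python has no codatatypes, so the following is only a rough analogue of the idea, not the Isabelle/HOL development: a possibly infinite derivation tree can be modelled with lazily computed children, so that only the explored prefix is ever constructed.

```python
# Rough analogue only (Python lacks codatatypes): a possibly infinite
# derivation tree represented with lazily computed children, echoing the
# idea of coinductive trees; this is not the Isabelle/HOL formalization.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Tree:
    rule: str
    children: Callable[[], List["Tree"]]   # thunk, so the tree may be infinite

def infinite_derivation(n: int = 0) -> Tree:
    """An infinite 'derivation' in which every node defers to one premise."""
    return Tree(rule=f"step_{n}", children=lambda: [infinite_derivation(n + 1)])

# Only the explored prefix is ever built.
t = infinite_derivation()
print(t.rule, t.children()[0].rule, t.children()[0].children()[0].rule)
```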
A framework for orthology assignment from gene rearrangement data
Gene rearrangements have successfully been used in phylogenetic reconstruction and comparative genomics, but usually under the assumption that all genomes have the same gene content and that no gene is duplicated. While these assumptions allow one to work with organellar genomes, they are too restrictive when comparing nuclear genomes. The main challenge is how to deal with gene families, specifically, how to identify orthologs. While searching for orthologies is a common task in computational biology, it is usually done using sequence data. We approach that problem using gene rearrangement data, provide an optimization framework in which to phrase the problem, and present some preliminary theoretical results.
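Purely as an illustration of the input setting (not the authors' optimization framework), genomes can be written as signed gene orders in which a gene family may have several copies, so that orthology assignment must decide which copies correspond:

```python
# Illustrative input representation only (not the paper's framework):
# genomes as signed gene orders where a gene family may have several copies,
# so orthology assignment must pick which copies correspond across genomes.
from collections import Counter

genome_a = [+1, -2, +3, +2]          # family 2 is duplicated in genome A
genome_b = [+1, +2, -3]

def family_counts(genome):
    """Count copies per gene family, ignoring orientation."""
    return Counter(abs(g) for g in genome)

print(family_counts(genome_a))       # Counter({2: 2, 1: 1, 3: 1})
print(family_counts(genome_b))       # Counter({1: 1, 2: 1, 3: 1})
```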
Does regulating private long-term care facilities lead to better care? A study from Quebec, Canada
Objective. In the province of Quebec, Canada, long-term residential care is provided by 2 types of facilities: publicly-funded accredited facilities and privately-owned facilities in which care is privately financed and delivered. Following evidence that private facilities were delivering inadequate care, the provincial government decided to regulate this industry. We assessed the impact of regulation on care quality by comparing quality assessments made before and after regulation. In both periods, public facilities served as a comparison group.
Design. A cross-sectional study conducted in 2010-2012 that incorporates data collected in 1995-2000.
Settings. Random samples of private and public facilities from 2 regions of Quebec.
Participants. Random samples of disabled residents aged 65 years and over. In total, 451 residents from 145 care settings assessed in 1995-2000 were compared to 329 residents from 102 care settings assessed in 2010-2012.
Intervention. Regulation introduced by the province in 2005, effective February 2007.
Main outcome measure. Quality of care measured with the QUALCARE Scale.
Results. After regulation, fewer small-size facilities were in operation in the private market. Between the 2 study periods, the proportion of residents with severe disabilities decreased in private facilities while it remained over 80% in their public counterparts. Meanwhile, quality of care improved significantly in private facilities, while worsening in their public counterparts, even after controlling for confounding.
Conclusions. The private industry now provides better care to its residents. Improvement in care quality likely results in part from the closure of small homes and a change in resident case-mix
