We present results and conjectures on the van der Waerden numbers w(2;3,t) and on the new palindromic van der Waerden numbers pdw(2;3,t). We have computed the new number w(2;3,19) = 349, and we provide lower bounds for 20 <= t <= 39, where for t <= 30 we conjecture these lower bounds to be exact. The lower bounds for 24 <= t <= 30 refute the conjecture that w(2;3,t) <= t^2, and we present an improved conjecture. We also investigate regularities in the good partitions (certificates) to better understand the lower bounds. Motivated by such regularities, we introduce *palindromic van der Waerden numbers* pdw(k; t_0,...,t_{k-1}), defined like the ordinary van der Waerden numbers w(k; t_0,...,t_{k-1}), but allowing only palindromic solutions (good partitions), i.e., solutions reading the same from both ends. We compute pdw(2;3,t) for 3 <= t <= 27, and we provide lower bounds, which we conjecture to be exact, for t <= 35. All computations are based on SAT solving, and we discuss the various relations between SAT solving and Ramsey theory. In particular we introduce a novel (open-source) SAT solver, the tawSolver, which performs best on the SAT instances studied here, and which is actually the original DLL-solver, but with an efficient implementation and a modern heuristic typical for look-ahead solvers (applying the theory developed in the SAT handbook article).
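To make the SAT connection concrete, here is a minimal Python sketch (ours, not the generator used for the computations) of the standard encoding of w(2;3,t) as a CNF in DIMACS format: variable i true means vertex i gets colour 1, positive clauses forbid arithmetic progressions of length 3 in colour 0, and negative clauses forbid progressions of length t in colour 1.

    def vdw_cnf(n, t):
        # Clauses expressing: {1,...,n} is 2-coloured such that colour 0
        # contains no arithmetic progression (AP) of length 3, and
        # colour 1 none of length t.  Variable i: vertex i gets colour 1.
        clauses = []
        for d in range(1, n):                      # common difference of the AP
            for a in range(1, n - 2*d + 1):        # APs of length 3 (colour 0)
                clauses.append([a, a + d, a + 2*d])
            for a in range(1, n - (t-1)*d + 1):    # APs of length t (colour 1)
                clauses.append([-(a + i*d) for i in range(t)])
        return clauses

    def dimacs(n, clauses):
        return "\n".join([f"p cnf {n} {len(clauses)}"]
                         + [" ".join(map(str, C)) + " 0" for C in clauses])

    # w(2;3,19) = 349: the instance for n = 348 is satisfiable,
    # for n = 349 it is unsatisfiable.
    print(dimacs(349, vdw_cnf(349, 19)))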
For investigations into the structure of MU (minimally unsatisfiable clause-sets or conjunctive normal forms), singular DP-reduction is a fundamental tool, applying DP-reduction F -> DP_v(F) in case variable v occurs in one polarity only once. We consider sDP(F), the set of all results of applying singular DP-reduction to F in MU as long as possible, obtaining non-singular F' in MU with the same deficiency, i.e., delta(F') = delta(F). Our main results are: 1. For all F', F'' in sDP(F) we have n(F') = n(F''). 2. If F is saturated (F in SMU), then we have |sDP(F)| = 1. 3. If F is "eventually saturated", that is, sDP(F) is a subset of SMU, then all F', F'' in sDP(F) are isomorphic (establishing "confluence modulo isomorphism"). The results are obtained by a detailed analysis of singular DP-reduction for F in MU. As an application we obtain that singular DP-reduction for F in MU(2) (i.e., delta(F) = 2) is confluent modulo isomorphism (using the fundamental characterisation of MU(2) by Kleine Buening). The background for these considerations is the general project of the classification of MU in terms of the deficiency.
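A minimal sketch of singular DP-reduction on plain clause-sets (sets of frozensets of integer literals); the function names are ours, and the sketch assumes F has no pure variables (as holds for F in MU):

    def dp(F, v):
        # DP-reduction: replace all clauses containing variable v by all
        # non-tautological resolvents on v.
        pos = {C for C in F if v in C}
        neg = {C for C in F if -v in C}
        res = {frozenset((P - {v}) | (N - {-v})) for P in pos for N in neg}
        res = {R for R in res if not any(-l in R for l in R)}
        return (F - pos - neg) | res

    def sdp(F):
        # Apply singular DP-reduction as long as some variable occurs in
        # one polarity only once; the result is non-singular.
        while True:
            for v in {abs(l) for C in F for l in C}:
                if sum(v in C for C in F) == 1 or sum(-v in C for C in F) == 1:
                    F = dp(F, v)
                    break
            else:
                return F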
Based on the theory of hardness hd(F) for clause-sets F (introduced here and further developed here), we develop a systematic approach towards classes interesting for the area of knowledge compilation for the clausal entailment problem, as well as towards providing "soft" ("good") target classes for SAT translation. The class of unit-refutation complete clause-sets was introduced in 1994, and is denoted here by UC_1. It turns out to be exactly the class of clause-sets of hardness at most 1. Thus we obtain a natural generalisation, the classes UC_k of clause-sets which are "unit-refutation complete of level k". Utilising the strong connections to (tree-)resolution complexity and (nested) input resolution, we develop fundamental methods for the determination of hardness (the level k in UC_k). UC_1 also turns out to be precisely the class SLUR (Single Lookahead Unit Resolution) introduced in 1995. A natural hierarchy SLUR_k based on SLUR is derived, and we show SLUR_k = UC_k, offering an alternative interpretation of UC_k. Finally we consider the problem of "irredundant" clause-sets in UC_k. For 2-CNF we show that strong minimisations are possible in polynomial time, while already for (very special) Horn clause-sets minimisation is NP-complete. We conclude with a discussion of open problems and future directions.
We consider the problem of generalising boolean formulas in conjunctive normal form by allowing non-boolean variables, with the goal of maintaining combinatorial properties. We build up a solid foundation for (generalised) clause-sets, including the notion of autarky systems, the interplay between autarkies and resolution, and basic notions of (DP-)reductions. As a basic combinatorial parameter of generalised clause-sets we introduce the (generalised) notion of deficiency, which in the boolean case is the difference between the number of clauses and the number of variables. Autarky theory plays a fundamental role here, and we concentrate especially on matching autarkies (based on matching theory). A central result is the computation of the lean kernel (the largest lean subset) of a (generalised) clause-set in polynomial time for bounded maximal deficiency.
First we study the properties of the direct translation (or "encoding") of generalised clause-sets into boolean clause-sets. Then we turn to irredundant clause-sets, which generalise minimally unsatisfiable clause-sets, and we prove basic properties. The simplest irredundant clause-sets are hitting clause-sets, and we provide characterisations and generalisations. Unsatisfiable irredundant clause-sets are the minimally unsatisfiable clause-sets, and we provide basic tools. These tools allow us to characterise the minimally unsatisfiable clause-sets of minimal deficiency. Finally we provide a new translation of generalised clause-sets into boolean clause-sets, the nested translation, which preserves the conflict structure. As an application, we can generalise results for boolean clause-sets regarding the hermitian rank/defect, especially the characterisation of unsatisfiable hitting clause-sets where between every two clauses we have exactly one conflict. We conclude with a list of open problems, and a discussion of the "generic translation scheme".
@Article{Kullmann2009OKlibrary, author = "Oliver Kullmann", title = "The \texttt{OKlibrary}: Introducing a ``Holistic'' Research Platform for (Generalised) {SAT} Solving", journal = "Studies in Logic", year = 2009, volume = 2, number = 1, pages = "20-53" }
The OKlibrary is introduced, an open-source library supporting research and development in the area of (generalised) SAT-solving, and available at http://www.ok-sat-library.org. We discuss history, motivation and architecture of the library, and outline its current extent. The differences to existing platforms are explained: We understand the approach of the OKlibrary as “holistic”, so that the OKlibrary includes for example not only program code, but also research plans and methods for processing and evaluation of experiments (as well as experimental results themselves). This leads to the understanding of the OKlibrary as an “IRE”, an “integrated research environment” (however not building a single monolithic block, but based on the Unix/Linux tradition of a “tool chest”). “Generalised satisfiability problems” are understood as a generalisation of SAT towards CSP, and we discuss the basic ideas. To conclude we state 10 research problems which we regard as fundamental for advancing core SAT solving.
Capturing propositional logic, constraint satisfaction problems and systems of polynomial equations, we introduce the notion of "systems with finite instantiation by partial assignments", fipa-systems for short, which are independent of special representations of problem instances, but which are based on an axiomatic approach with instantiation (or substitution) by partial assignments as the fundamental notion. Fipa-systems seem to constitute the most general framework allowing for a theory of resolution with non-trivial upper and lower bounds. For every fipa-system we generalise relativised hierarchies originating from generalised Horn formulas, and obtain hierarchies of problem instances with recognition and satisfiability decision in polynomial time and linear space, quasi-automatising relativised and generalised tree resolution and utilising a general "quasi-tight" lower bound for tree resolution. And generalising width-restricted resolution, for every fipa-system a (stronger) family of hierarchies of unsatisfiable instances with polynomial time recognition is introduced, weakly automatising relativised and generalised full resolution and utilising a general lower bound for full resolution. Key words: satisfiability problem (SAT), systems with instantiation, propositional logic, constraint satisfaction problems, polynomial time hierarchies, generalised resolution, lower bounds for resolution, upper bounds for SAT algorithms, automatisation of proof systems, generalised input resolution, generalised width restricted resolution, induced width of constraint satisfaction problems
We study the problem of deleting clauses from conjunctive normal forms (clause-sets) which cannot contribute to any proof of unsatisfiability. For that purpose we introduce the notion of an autarky system, which detects deletion of superfluous clauses from a clause-set F and yields a canonical normal form. Clause-sets where no clauses can be deleted are called lean, a natural weakening of minimally unsatisfiable clause-sets, including also satisfiable instances. Three special examples for autarky systems are considered: general autarkies, linear autarkies (based on linear programming) and matching autarkies (based on matching theory). We give new characterizations of lean clause-sets in terms of qualitative matrix analysis, while matching lean clause-sets are characterized in terms of deficiency (the difference between the number of clauses and the number of variables), by having a cyclic associated transversal matroid, and also in terms of fully indecomposable matrices. Finally we discuss how to obtain polynomial time satisfiability decision for clause-sets with bounded deficiency, and we make a few steps towards a general theory of autarky systems.
A conjunctive normal form is minimal unsatisfiable if it is unsatisfiable, and deleting any clause makes it satisfiable. The deficiency of a formula is the difference of the number of clauses and the number of variables. It is known that every minimal unsatisfiable formula has positive deficiency. Until recently, polynomial-time algorithms were known to recognize minimal unsatisfiable formulas with deficiency 1 and 2. We present an algorithm which recognizes minimal unsatisfiable formulas with any fixed deficiency in polynomial time.
Investigations on the monoid of autark assignments, and the introduction of linear autarkies, which can be searched for in polynomial time by means of Linear Programming. Clause-sets satisfiable by a linear autarky are called linearly satisfiable and include 2-clause-sets, Horn clause-sets, q-Horn clause-sets, clause-sets satisfiable by iterated elimination of pure literals, and matched clause-sets.
Motivated by improved SAT algorithms (see below), a natural parameterized generalization GER of Extended Resolution (ER) is introduced. GER allows, as does ER, the introduction of new clauses, namely blocked clauses. ER can polynomially simulate GER, but GER allows special cases for which exponential lower bounds can be proven. An example is the addition of blocked clauses without new variables and of bounded width (length).
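For illustration, the blocking condition is easy to state in code (a sketch of the standard definition, with our naming): clause C is blocked for a literal l with respect to F if every clause of F containing -l produces a tautological resolvent with C on l.

    def is_blocked(C, F):
        # C: clause as a set of integer literals; F: iterable of clauses.
        def blocks(l):
            return all(any(-x in C for x in D if x != -l)
                       for D in F if -l in D)
        return any(blocks(l) for l in C)

    # {1, 2} is blocked for literal 1 w.r.t. {{-1, -2}}: the resolvent
    # on 1 contains the clashing pair 2, -2.
    assert is_blocked({1, 2}, [{-1, -2}])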
We prove the worst-case upper bound 1.5045^n for the time complexity of 3-SAT decision in the number n of variables, introducing new methods for the analysis as well as new algorithmic techniques. We add new 2- and 3-clauses, called "blocked clauses", generalising the extension rule of "Extended Resolution". Our methods for estimating the size of trees lead to a refined measure of formula complexity of 3-clause-sets and can be applied also to arbitrary trees.
@InProceedings{Kullmann19963SAT, author = "Oliver Kullmann", title = "Worst-case Analysis, {3-SAT} Decision and Lower Bounds: Approaches for Improved {SAT} Algorithms", pages = "261-313", crossref = "Dimacs1996SAT" } @Proceedings{Dimacs1996SAT, title = "Satisfiability Problem: Theory and Applications (DIMACS Workshop March 11-13, 1996)", year = 1997, editor = "Dingzhu Du and Jun Gu and Panos M. Pardalos", volume = 35, series = "DIMACS Series in Discrete Mathematics and Theoretical Computer Science", publisher = "American Mathematical Society", note = "ISBN 0-8218-0479-0" }
New methods for worst-case analysis and (3-)SAT decision are presented. The focus is on the central ideas leading to the improved bound 1.5045^n for 3-SAT decision (n is the number of variables). The implications for SAT decision in general are discussed and elucidated by a number of hypotheses. In addition an exponential lower bound for a general class of SAT-algorithms is given, and the only possibilities to remain below this bound are pointed out.
We solved the Boolean Pythagorean triples problem, showing that for any two subsets A, B of the set N of non-zero natural numbers whose union is N, at least one of A, B contains a Pythagorean triple, i.e., three natural numbers a, b, c with a^2 + b^2 = c^2.
This mathematical problem had been open since the 1980s. Unlike all other problems from Ramsey theory solved via SAT solving, here a fundamental existence problem was settled.
We used the Cube and Conquer method, a hybrid SAT solving method developed initially in the context of computing van der Waerden numbers.
The certificate of the computation currently yields the longest mathematical proof.
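The encoding itself is simple; a minimal Python sketch (ours): one boolean variable per number, and for every Pythagorean triple two clauses forbidding it to be monochromatic.

    def pythagorean_cnf(n):
        # Variable i true: number i goes into part A, false: part B.
        # Known result: satisfiable for n = 7824, unsatisfiable for n = 7825.
        root = {i*i: i for i in range(1, n + 1)}
        clauses = []
        for a in range(1, n + 1):
            for b in range(a, n + 1):
                c = root.get(a*a + b*b)
                if c is not None:
                    clauses.append([a, b, c])       # not all of a,b,c in B
                    clauses.append([-a, -b, -c])    # not all of a,b,c in A
        return clauses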
We consider the algorithmic task of computing a maximal autarky for a clause-set F, i.e., a partial assignment which satisfies every clause of F it touches, and where this property is destroyed by adding any non-empty set of further assignments.
We employ SAT solvers as oracles, using various capabilities. Using the standard SAT oracle, log_2(n(F)) oracle calls suffice, where n(F) is the number of variables, but the drawback is that (translated) cardinality constraints are employed, which makes this approach less efficient in practice.
Using an extended SAT oracle, motivated by the capabilities of modern SAT solvers, we show how to compute maximal autarkies with 2 n(F)^{1/2} simpler oracle calls. This novel algorithm combines the previous two main approaches, based on the autarky-resolution duality and on SAT translations.
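While finding a maximal autarky is the hard part, checking the autarky condition is immediate; a small sketch (our helper names):

    def is_autarky(phi, F):
        # phi: consistent partial assignment as a set of literals;
        # F: clause-set.  phi is an autarky for F iff it satisfies
        # every clause of F that it touches.
        for C in F:
            touched = any(l in phi or -l in phi for l in C)
            satisfied = any(l in phi for l in C)
            if touched and not satisfied:
                return False
        return True

    # {3} satisfies the second clause and leaves the first untouched:
    assert is_autarky({3}, [{1, -2}, {2, 3}])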
Various "hardness" measures have been studied for resolution, providing theoretical insight into the proof complexity of resolution and its fragments, as well as explanations for the hardness of instances in SAT solving. In this paper we aim at a unified view of a number of hardness measures, including different measures of width, space and size of resolution proofs. Our main contribution is a unified game-theoretic characterisation of these measures. As consequences we obtain new relations between the different hardness measures. In particular, we prove a generalised version of Atserias and Dalmau's result on the relation between resolution width and space.
We consider the problem of finding good representations, via boolean conjunctive normal forms F (clause-sets), of systems S of XOR-constraints x_1 + ... + x_n = e, e in {0,1} (also called parity constraints), i.e., systems of linear equations over the two-element field. These representations are to be used as parts of SAT problems F^* >= F, such that F^* has "good" properties for SAT solving. The basic quality criterion is "arc consistency", that is, for every partial assignment phi to the variables of S, all assignments x_i = e forced by phi are determined by unit-clause propagation on the result phi * F of the application. We show that there is no arc-consistent representation of polynomial size for arbitrary S. The proof combines the basic method by Bessiere et al. 2009 on the relation between monotone circuits and "consistency checkers", adapted and simplified in the underlying report, with the lower bound on monotone circuits for monotone span programs in Babai et al. 1999. On the other side, our basic positive result is that computing an arc-consistent representation is fixed-parameter tractable in the number m of equations of S. To obtain stronger representations, instead of mere arc-consistency we consider the class PC of propagation-complete clause-sets, as introduced in Bordeaux et al. 2012. The stronger criterion is now F in PC, which requires for all partial assignments, possibly involving also the auxiliary (new) variables in F, that forced assignments can be determined by unit-clause propagation. We analyse the basic translation, which for m = 1 lies in PC, but fails badly already for m = 2, and we show how to repair this.
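For concreteness, a sketch (ours) of the basic translation of a single XOR-constraint: split x_1 + ... + x_n = e into ternary XORs via a chain of auxiliary variables, each ternary XOR contributing its four clauses.

    def xor3(x, y, z):
        # CNF for x + y = z over GF(2): four clauses, each excluding
        # one of the four falsifying assignments.
        return [[x, y, -z], [x, -y, z], [-x, y, z], [-x, -y, -z]]

    def xor_cnf(xs, e, aux):
        # CNF for xs[0] + ... + xs[-1] = e, with auxiliary variables
        # aux, aux+1, ... holding the partial sums (basic translation).
        clauses, acc = [], xs[0]
        for x in xs[1:]:
            clauses += xor3(acc, x, aux)
            acc, aux = aux, aux + 1
        clauses.append([acc] if e == 1 else [-acc])
        return clauses

    # x1 + x2 + x3 = 1, auxiliary variables 4 and 5:
    print(xor_cnf([1, 2, 3], 1, 4))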
The class SLUR (Single Lookahead Unit Resolution) was introduced in [Schlipf, Annexstein, Franco, Swaminathan, 1995] as an umbrella class for efficient SAT solving. [Balyo, Gursky, Cepek, Kucera, Vlcek, 2012] extended this class in various ways to hierarchies covering all of CNF (all clause-sets). We introduce a hierarchy SLUR_k which we argue is the natural "limit" of such approaches. The second source for our investigations is the class UC of unit-refutation complete clause-sets introduced in [del Val, 1994]. Via the theory of (tree-resolution based) "hardness" of clause-sets as developed in the ECCC report, the Annals paper and [Ansotegui, Bonet, Levy, Manya, 2008], we obtain a natural generalisation UC_k, containing those clause-sets which are "unit-refutation complete of level k", which is the same as having hardness at most k. Utilising the strong connections to (tree-)resolution complexity and (nested) input resolution, we develop fundamental methods for the determination of hardness (the level k in UC_k). A fundamental insight now is that SLUR_k = UC_k holds for all k. We can thus exploit both streams of intuitions and methods for the investigations of these hierarchies.
We study DP-reduction (or "variable elimination") for minimally unsatisfiable clause-sets, concentrating on questions related to confluence.
We present a new SAT approach, called cube-and-conquer, targeted at reducing solving time on hard instances. This two-phase approach partitions a problem into many thousands (or millions) of cubes using lookahead techniques. Afterwards, a conflict-driven solver tackles the problem, using the cubes to guide the search. On several hard SAT-competition benchmarks, our hybrid approach outperforms both lookahead and conflict-driven solvers. Moreover, because cube-and-conquer is easy to parallelise, it is a competitive alternative for solving SAT problems in parallel. This approach was originally developed for solving hard van der Waerden problems, and for these (hard, unsatisfiable) problems the approach is not only very well parallelisable, but outperforms all other (parallel or not) SAT-solvers in terms of total run-time by at least a factor of two.
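A conceptual sketch of the two phases (ours; lookahead_var and cdcl_solve are stand-ins for a lookahead branching heuristic and a CDCL back-end, not a real solver API):

    def cube_and_conquer(F, cutoff, lookahead_var, cdcl_solve):
        # Phase 1: split F into up to 2^cutoff cubes (conjunctions of
        # decision literals) chosen by a lookahead heuristic.
        def cubes(assumptions, depth):
            if depth == cutoff:
                yield assumptions
            else:
                v = lookahead_var(F, assumptions)
                yield from cubes(assumptions + [v], depth + 1)
                yield from cubes(assumptions + [-v], depth + 1)
        # Phase 2: a CDCL solver finishes each cube (embarrassingly
        # parallel); F is satisfiable iff some cube is.
        return any(cdcl_solve(F, cube) for cube in cubes([], 0))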
Talk given at the Fifth Workshop on Formal and Automated Theorem Proving and Applications, a somewhat extended and improved version of the talk given by Marijn Heule at HVC 2011 (see also the documentation of SplittingViaOKsolver).
We consider the question of the existence of variables with few occurrences in boolean conjunctive normal forms (clause-sets). Let mvd(F) for a clause-set F denote the minimal variable-degree. Our main result is the upper bound mvd(F) <= nM(sigma(F)) <= sigma(F) + 1 + lg(sigma(F)) for lean clause-sets F, in dependency on the surplus sigma(F). Lean clause-sets, defined as having no non-trivial autarkies, generalise minimally unsatisfiable clause-sets. For the surplus we have sigma(F) <= delta(F) = c(F) - n(F), using the deficiency delta(F) of clause-sets, the difference between the number of clauses and the number of variables. nM(k) is the k-th "non-Mersenne" number, skipping in the sequence of natural numbers all numbers of the form 2^n - 1. As an application of the upper bound we obtain that clause-sets F violating mvd(F) <= nM(sigma(F)) must have a non-trivial autarky. It is open whether such an autarky can be found in polynomial time.
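The non-Mersenne numbers are easy to enumerate; a few lines of Python (ours):

    def nM(k):
        # k-th non-Mersenne number: enumerate 1, 2, 3, ... skipping all
        # numbers of the form 2^n - 1 (i.e., those m where m+1 is a
        # power of two).
        m, count = 0, 0
        while count < k:
            m += 1
            if (m + 1) & m != 0:    # m+1 not a power of two
                count += 1
        return m

    print([nM(k) for k in range(1, 11)])   # [2, 4, 5, 6, 8, 9, 10, 11, 12, 13]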
Problems from exact Ramsey theory are used as test problems for SAT solving. Based on the Green-Tao theorem, the Green-Tao numbers are introduced. Directions of the parameter space are investigated where these numbers only grow linearly. Using standard SAT solvers, the basic Green-Tao numbers are computed. For handling the non-boolean cases, the generic translation scheme is introduced, offering an infinite variety of translations ("encodings") of hypergraph colouring problems to boolean SAT.
"Green-Tao numbers" are defined like Van der Waerden numbers , but based on Green-Tao's theorem . Basic properties are investigated, and SAT translation of Ramsey-theory-type problem instances in general is discussed.
We study complement-invariant clause-sets F (closed under elementwise complementation of clauses). The reduced deficiency of a clause-set F is defined as delta_r(F) := 1/2 (delta(F) - n(F)), where delta(F) = c(F) - n(F) is the deficiency. The maximal reduced deficiency is delta_r^*(F), the maximum of delta_r(F') over all sub-clause-sets F'. If F is complement-invariant and linearly lean, then we have delta_r^*(F) = delta_r(F). We show polynomial time SAT decision for complement-invariant clause-sets F with delta_r^*(F) = 0, exploiting the (non-trivial) decision algorithm for sign-non-singular matrices given by [Robertson, Seymour, Thomas, 1999]. Minimally unsatisfiable complement-invariant clause-sets fulfil delta_r(F) >= 0, and we immediately get polynomial time decidability of minimally unsatisfiable complement-invariant clause-sets F with delta_r(F) = 0, but we can give also more direct algorithms and characterisations.
Combining graph theory and linear algebra, we study SAT problems of low "linear algebra complexity", considering formulas with bounded hermitian rank. We show polynomial time SAT decision of the class of formulas with hermitian rank at most one, applying methods from hypergraph transversal theory. Applications to heuristics for SAT algorithms and to the structure of minimally unsatisfiable clause-sets are discussed.
We study the symmetric conflict matrix of a multi-clause-set, where the entry at position (i, j) is the number of clashes between clauses i and j. The conflict matrix of a multi-clause-set, interpreted either as a multi-graph or as a distance matrix, yields a new link between investigations on the satisfiability problem (in a wider sense) and investigations on the biclique partition number and on the addressing problem (introduced by Graham and Pollak) in graph theory and combinatorics. An application is given by the class of what are called 1-uniform hitting clause-sets in this article, where each pair of (different) clauses clashes in exactly one literal. Endre Boros conjectured at the SAT'98 workshop that each 1-uniform hitting clause-set of deficiency 1 contains a literal occurring only once. Kleine Buening and Zhao showed that under this assumption every 1-uniform hitting clause-set must have deficiency at most 1. We show that the conjecture is false (there are known star-free biclique decompositions of every complete graph with at least nine vertices), but the conclusion is right (a special case of the Graham-Pollak theorem, attributed to Witsenhausen). A basic notion for investigations on the combinatorics of clause-sets is the deficiency of multi-clause-sets (the difference of the number of clauses and the number of variables). We introduce the related notion of hermitian defect, based on the notion of the hermitian rank of a hermitian matrix introduced by Gregory, Watts and Shader. The notion of hermitian defect makes it possible to combine eigenvalue techniques (especially Cauchy's interlacing inequalities) with matching techniques, and can be seen as the thread underlying our introduction to the subject.
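The conflict matrix is directly computable; a small sketch (ours), together with a 1-uniform hitting example of deficiency 1:

    def conflict_matrix(F):
        # Entry (i, j): number of clashing literal pairs between
        # clauses i and j of the (multi-)clause-set F.
        m = len(F)
        return [[sum(1 for l in F[i] if -l in F[j]) for j in range(m)]
                for i in range(m)]

    # 1-uniform hitting clause-set of deficiency 1 (3 clauses, 2 variables):
    F = [{1, 2}, {-1, 2}, {-2}]
    print(conflict_matrix(F))   # all off-diagonal entries equal 1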
We consider the deficiency delta(F) = c(F) - n(F) and the maximal deficiency delta^*(F), the maximum of delta(F') over all sub-clause-sets F' of F. Here c(F) is the number of clauses, and n(F) is the number of variables. Combining ideas from matching and matroid theory with techniques from the area of resolution refutations, we prove that for clause-sets F with delta^*(F) <= k, where k is considered to be constant, the SAT problem, the Minimal Unsatisfiability problem and the MAXSAT problem are decidable in polynomial time (previously, only poly-time decidability of the Minimal Unsatisfiability problem was known, and that only for k = 1). Of basic importance here is the transversal matroid of the bipartite clause-variable graph of the clause-set. Regarding resolution, a fundamental tool is singular DP-reduction. Also an elegant characterisation of MU(1) and SMU(1) is given (minimally unsatisfiable and saturated minimally unsatisfiable clause-sets of deficiency 1).
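The maximal deficiency is computable in polynomial time via bipartite matching: by the defect version of Koenig's theorem, delta^*(F) = c(F) - M, where M is the size of a maximum matching in the clause-variable graph. A sketch using networkx (our encoding of the graph):

    import networkx as nx
    from networkx.algorithms import bipartite

    def maximal_deficiency(F):
        # delta^*(F) = c(F) minus the size of a maximum matching in the
        # bipartite clause-variable graph.
        G = nx.Graph()
        clause_nodes = [("C", i) for i in range(len(F))]
        G.add_nodes_from(clause_nodes, bipartite=0)
        for i, C in enumerate(F):
            for l in C:
                G.add_edge(("C", i), ("V", abs(l)))
        matching = bipartite.maximum_matching(G, top_nodes=clause_nodes)
        return len(F) - len(matching) // 2

    # A minimally unsatisfiable clause-set of deficiency 1:
    print(maximal_deficiency([{1}, {-1, 2}, {-2}]))   # 1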
A new form of width-restricted resolution, called "CUBI resolution" ("closure under bounded input resolution"), is presented. CUBI resolution makes it possible to eliminate the dependency on the maximal input clause-length in the general width lower bound for resolution from [Ben-Sasson, Wigderson 1999], and bypasses the poly-time decidability problem of "k-resolution" introduced in [Kleine Büning 1993] (by slightly strengthening the calculus). Generalising the well-known equivalence of the refutational power of Unit Resolution and Input Resolution, CUBI resolution simulates "nested input resolution".
"Search trees", "branching trees", "backtracking trees" or "enumeration trees" are at the heart of many (complete) approaches towards hard combinatorial problems, constraint problems, and SAT problems. Given many choices for branching, the fundamental question is how to guide the choices so that the resulting trees are (relatively) small. In this chapter the first systematic treatment is given, and a general theory of heuristics guiding the construction of "branching trees" is developed.
Minimal unsatisfiability describes the reduced kernel of unsatisfiable formulas. The investigation of this property is very helpful in understanding the reasons for unsatisfiability as well as the behaviour of SAT-solvers and proof calculi. The study of minimal unsatisfiability is the study of the structure of problem instances without redundancies. The study of autarkies considers the redundancies themselves, in various guises related to partial assignments which satisfy some part of the problem instance while leaving the rest "untouched". Autarky theory creates many bridges to combinatorics, algebra and logic, and the second part of this chapter provides a solid foundation of the basic ideas and results of autarky theory.
Talks related to conference papers are available there.
Adapting the Dagstuhl-talk on the foundations of the Cube-and-Conquer method for a general logic-audience.
Extending the Toronto-talk on the foundations of the Cube-and-Conquer method.
Motivated by the Pythagorean Triples Problem, the Cube-and-Conquer method is explained, together with the theoretical foundations of look-ahead in the Handbook chapter.
Taking our solution of the Pythagorean Triples Problem as the starting point, the talk sketches applications of statistical methods to SAT solving.
Overview on minimal unsatisfiability with respect to variable degrees, full clauses, and the classification for small deficiencies. For a longer overview see the SAT 2016 Workshop talk.
Overview on minimal unsatisfiability with respect to variable degrees, full clauses, and the classification for small deficiencies. For a shorter version see the Logic Colloquium talk.
At least amongst computer scientists it is well-known that progress over the last 15 years in SAT, the decision problem whether a propositional formula is satisfiable, has dramatically changed industrial applications in the area of verification of hardware and software. It is also well-known that SAT is likely the best tool available to tackle concrete combinatorial instances, coming for example from Ramsey theory. On the theory side, connections to random structures and to complexity theory are prominent. We consider here a less well-known stream of research, which can be understood as part of the general quest of "understanding unsatisfiability" (where we believe that this is possible).
More precisely, we are investigating "minimally unsatisfiable" clause-sets (MUs for short), where, as with critically colourable hypergraphs, unsatisfiability is destroyed by removing any clause ("condition"). See the Handbook chapter for a general overview, and see especially the introduction of the Report for the connections to combinatorics.
Since the fundamental work by Aharoni and Linial, it is known that such MUs must have at least one more clause than variables, and this is known nowadays as deficiency delta >= 1, where delta = number of clauses - number of variables.
The great goal is the Finiteness Conjecture, which states that for bounded deficiency k there are only finitely many MU-"patterns". A step towards this is determining extremal parameter values in delta, like degrees of variables and the number of "full clauses", which are clauses containing all variables. In terms of covers of the boolean hypercube by subcubes, where "MU" means an irredundant (or minimal) cover, full clauses correspond to singleton subcubes.
After a general introduction, the results of our latest work are discussed, which gives a lower bound on the maximal number of full clauses, and introduces connections to elementary number theory and recursion schemes (meta-Fibonacci sequences).
An overview on SAT solving.
An overview on the project of classification of minimally unsatisfiable clause-sets in the layers given by the deficiency.
On the relationships between circuits and good (SAT-) representations of boolean functions.
The basic problem considered in this tutorial is as follows: Given variables with finite domains and constraints in various forms, find a "good" SAT translation. "Encoding" the non-boolean values into boolean values as well as "decomposing" (complex) constraints into elementary constraints (like clauses) are well-studied problems, however it seems that yet a general perspective is missing. I will make an attempt to outline some elements of a general theory, especially discussing what "good" could mean here, and how to achieve it.
SAT solving has become an important tool to solve search problems for example from the verification domain, combinatorics, or for cryptanalysis. An obvious problem is the translation into CNF (conjunctive normal form): how to get a good translation? We try to initiate a theory of "good SAT representations", as a kind of "knowledge representation for SAT solvers". Currently, we concentrate on translations of general building blocks, which could be for example some types of constraints (for example cardinality constraints) or components like the S-box in cryptographic ciphers. The first theory-tool available is the notion of "hardness" hd(F), which measures in a certain sense how hard it is for a SAT solver to retrieve information from the representation F of a boolean function f. The talk presents this notion of "hardness", how it fits into the landscape of "representations", and how to generalise this approach.
Two areas where SAT solving and proof complexity should be able to fertilise each other.
Presenting an approach towards good representations of boolean functions, based on an algorithmic measurement of the "hardness" of a clause-set F according to the hardness of deriving all consequences from F by tree-resolution.
Two topics in the area of SAT-translations are treated in this talk: translating a non-boolean CNF (with non-boolean variables) into a boolean CNF, via the generic translation; and translating boolean functions (as black boxes) into CNFs, employing the notion of hardness.
Only reports which did not (yet) appear elsewhere are listed here.
We consider unsatisfiable "hitting clause-sets" (UHITs), special cases of MUs, which correspond to the partitions of the boolean hypercube by subcubes; they are also known as orthogonal or disjoint DNF tautologies.
We prove that such a nonsingular UHIT, where the deficiency is at most 3, has at most 7 variables.
An important new concept here is that of "clause-irreducible clause-sets", where no non-trivial subset is logically equivalent to a single clause.
We establish a new bridge between propositional logic and elementary number theory. The main objects are "minimally unsatisfiable clause-sets", short "MUs", unsatisfiable conjunctive normal forms rendered satisfiable by elimination of any clause. In other words, irredundant coverings of the boolean hypercube by subcubes.
The main parameter for MUs is the "deficiency" k, the difference between the number of clauses and the number of variables (the difference between the number of elements in the covering and the dimension of the hypercube), and the fundamental fact is that k >= 1 holds.
A "full clause" in an MU contains all variables (corresponding to a singleton in the covering). We show the lower bound S_2(k) <= FCM(k), where FCM(k) is the maximal number of full clauses in MUs of deficiency k, while S_2(k) is the smallest n such that 2^k divides n!.
The proof rests on two methods: on the logic-combinatorial side, applying subsumption resolution and its inverse, a fundamental method since Boole in 1854 introduced the expansion method; on the arithmetical side, analysing certain recursions, combining an application-specific recursion with a recursion from the field of meta-Fibonacci sequences (indeed S_2 equals twice the Conolly sequence).
A further tool is the consideration of unsatisfiable "hitting clause-sets" (UHITs), special cases of MUs, which correspond to the partitions of the boolean hypercube by subcubes; they are also known as orthogonal or disjoint DNF tautologies. We actually show the sharper lower bound S_2(k) <= FCH(k), where FCH(k) is the maximal number of full clauses in UHITs of deficiency k. We conjecture that for all k holds S_2(k) = FCH(k), which would establish a surprising connection between the extremal combinatorics of (un)satisfiability and elementary number theory.
We apply the lower bound to analyse the structure of MUs and UHITs.
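The number-theoretic side is readily computable via Legendre's formula, by which the 2-adic valuation of n! equals n minus the binary digit sum of n; a sketch (ours):

    def S2(k):
        # Smallest n such that 2^k divides n!, using Legendre's formula
        # v_2(n!) = n - s_2(n), with s_2(n) the binary digit sum of n.
        n = 0
        while n - bin(n).count("1") < k:
            n += 1
        return n

    # Twice the Conolly sequence:
    print([S2(k) for k in range(1, 11)])   # [2, 4, 4, 6, 8, 8, 8, 10, 12, 12]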
We investigate connections between SAT (the propositional satisfiability problem) and combinatorics, around the minimum degree (occurrence) of variables in various forms of redundancy-free boolean conjunctive normal forms (clause-sets).
Lean clause-sets do not have non-trivial autarkies, that is, it is not possible to satisfy some clauses and leave the other clauses untouched. The deficiency of a clause-set is the difference of the number of clauses and the number of variables. We prove a precise upper bound on the minimum variable degree in dependency on the deficiency.
If a clause-set does not fulfil this upper bound, then it must have a non-trivial autarky; we show that the autarky-reduction (elimination of affected clauses) can be done in polynomial time, while it is open to find the autarky itself in polynomial time.
Then we investigate this upper bound for the special case of minimally unsatisfiable clause-sets. Here we show that the bound can be improved.
We consider precise relations, and the investigations have a certain number-theoretical flavour. We try to build a bridge from logic to combinatorics (especially to hypergraph colouring), and thus we discuss the background and open problems thoroughly, and provide many examples and explanations.
We present a general framework for good CNF-representations of boolean constraints, to be used for translating decision problems into SAT problems (i.e., deciding satisfiability for conjunctive normal forms). We apply it to the representation of systems of XOR-constraints, also known as systems of linear equations over the two-element field, or systems of parity constraints.
The general framework defines the notion of "representation", and provides several methods to measure the quality of the representation by the complexity ("hardness") needed for making implicit "knowledge" of the representation explicit (to a SAT-solving mechanism). We obtain general upper and lower bounds.
Applied to systems of XOR-constraints, we show a super-polynomial lower bound on "good" representations under very general circumstances. A corresponding upper bound shows fixed-parameter tractability in the number of constraints.
The measurement underlying this upper bound ignores the auxiliary variables needed for shorter representations of XOR-constraints. Improved upper bounds (for special cases) take them into account, and a rich picture begins to emerge, under the various hardness measurements.
We aim at a unified view of a number of hardness measures, including different measures of width, space and size of resolution proofs. We also extend these measures to all clause-sets (possibly satisfiable).
Knowledge Compilation (KC) studies the compilation of boolean functions f into some formalism F, which allows answering all queries of a certain kind in polynomial time. Due to its relevance for SAT solving, we concentrate on the query type "clausal entailment" (CE), i.e., whether a clause C follows from f or not, and we consider subclasses of CNF, i.e., clause-sets F with special properties. We consider the hierarchies UC_k <= WC_k, which were introduced by the authors in 2012. Each level allows CE queries. We show that for each k there are (sequences of) boolean functions with polysize representations in UC_{k+1}, but with an exponential lower bound on representations in WC_k. Such a separation was previously only known for k = 0. We also consider PC < UC, the class of propagation-complete clause-sets. We show that there are (sequences of) boolean functions with polysize representations in UC, while there is an exponential lower bound for representations in PC. These separations are steps towards a general conjecture determining the representation power of the hierarchies PC_k < UC_k <= WC_k. The strong form of this conjecture also allows auxiliary variables, as discussed in depth in the Outlook.
We aim at providing a foundation of a theory of "good" SAT representations F of boolean functions f. We argue that the hierarchy UC_k of unit-refutation complete clause-sets of level k, introduced by the authors, provides the most basic target classes, that is, F in UC_k is to be achieved for k as small as feasible. If F does not contain new variables, i.e., F is equivalent (as a CNF) to f, then F in UC_1 is similar to "achieving (generalised) arc consistency" known from the literature (it is somewhat weaker, but theoretically much nicer to handle). We show that for polysize representations of boolean functions in this sense, the hierarchy UC_k is strict. The boolean functions for these separations are "doped" minimally unsatisfiable clause-sets of deficiency 1; these functions have been introduced in [Sloan, Soerenyi, Turan, 2007], and we generalise their construction and show a correspondence to a strengthened notion of irredundant sub-clause-sets. Turning from lower bounds to upper bounds, we believe that many common CNF representations fit into the UC_k scheme, and we give some basic tools to construct representations in UC_1 with new variables, based on the Tseitin translation. Note that regarding new variables the UC_1-representations are stronger than mere "arc consistency", since the new variables are not excluded from consideration.
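The underlying test for hardness at most 1 needs only unit-clause propagation: F is in UC_1 iff for every clause C with F |= C, propagating the negation of C on F yields the empty clause. A sketch of the propagation r_1 (ours):

    def unit_propagation(F):
        # r_1 reduction: repeatedly assign unit clauses; stops when no
        # unit is left or the empty clause has been derived.
        F = {frozenset(C) for C in F}
        while True:
            unit = next((C for C in F if len(C) == 1), None)
            if unit is None or frozenset() in F:
                return F
            l, = unit
            F = {C - {-l} for C in F if l not in C}

    def ucp_refutes(F, literals):
        # Does unit propagation derive the empty clause from F plus
        # the given unit assumptions?
        G = list(F) + [[l] for l in literals]
        return frozenset() in unit_propagation(G)

    assert ucp_refutes([[1, 2], [-1, 2]], [-2])   # here F |= {2}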
@Unpublished{KullmannLuckhardt1997Taut, author = "Oliver Kullmann and Horst Luckhardt", title = "Deciding propositional tautologies: Algorithms and their complexity", note = "Preprint, 82 pages; the ps-file can be obtained at \url{http://cs.swan.ac.uk/~csoliver/Artikel/tg.ps}", year = 1997, month = "January" }
@Unpublished{KullmannLuckhardt1999Taut, author = "Oliver Kullmann and Horst Luckhardt", title = "Algorithms for {SAT}/{TAUT} decision based on various measures", note = "Preprint, 71 pages; the ps-file can be obtained from \url{http://cs.swan.ac.uk/~csoliver/Artikel/TAUT.ps}", year = 1999, month = "February" }
Please see here for the status of the open problems given at the end of the report.