OKlibrary  0.2.1.6
Search.hpp File Reference

Plans regarding searching for short resolution proofs.



Detailed Description

Plans regarding searching for short resolution proofs.

Todo:
Finding short tree-resolution proofs
  • See optimal_splitting_tree in ComputerAlgebra/Satisfiability/Lisp/Backtracking/SplittingTrees.mac for a backtracking approach.
  • Implement the dynamic-programming algorithm for computing the tree-resolution complexity of all possible clauses over var(F) for a given clause-set F (see the sketch at the end of this list):
    1. Let trc(C) be the tree-resolution complexity of C w.r.t. F, that is, the minimal number of nodes in a tree-resolution derivation of (exactly) C from F.
    2. So trc(C) is an odd natural number.
    3. At the beginning trc(C) is "open" for all C over F.
    4. Then in round i = 1, 2, ... we determine all clauses C with trc(C) = 2i-1 (by marking exactly those clauses with 2i-1).
    5. In round 1 exactly the clauses of F are marked with 1.
    6. In round i > 1, a yet unmarked clause C gets marked (with 2i-1) if it is the resolvent of marked clauses C', C'' such that trc(C') + trc(C'') + 1 = 2i-1.
  • For extracting an optimal tree resolution refutation for the empty clause, when marking a clause one should also store pointers to the parent clauses.
  • Can further restrictions be handled for boolean resolution?
  • For non-boolean clause-sets there is still only one form of resolution; however, if signed literals are allowed, then more forms are possible.
  • If we are only interested in the empty clause, then we can stop as soon as we have found it (and the level reached then is exactly the tree-resolution complexity of F).
  • Are there other strategies to make the procedure a bit more efficient?
    1. One possibility is to restrict the maximal clause-size considered (for 2-CNF it is actually sufficient to consider only clauses of length at most 2).
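  • A minimal C++ sketch of this marking procedure (hypothetical code, not part of the library; clauses are represented as sets of integers, with -v denoting the negation of variable v, and all function names are made up for illustration). It only targets the empty clause, runs for a bounded number of rounds, and indicates by a comment where parent-pointers for proof extraction would be stored:

    // Hypothetical sketch of the dynamic-programming rounds for trc (C++17).
    #include <map>
    #include <optional>
    #include <set>
    #include <utility>
    #include <vector>

    typedef std::set<int> Clause;          // literals as integers, -v = complement of v
    typedef std::vector<Clause> ClauseSet;

    // Resolvent of C1 (containing v) and C2 (containing -v);
    // no result (std::nullopt) if the resolvent would be tautological.
    std::optional<Clause> resolvent(const Clause& C1, const Clause& C2, const int v) {
      Clause R;
      for (const int x : C1) if (x != v) R.insert(x);
      for (const int x : C2) if (x != -v) R.insert(x);
      for (const int x : R) if (R.count(-x)) return std::nullopt;
      return R;
    }

    // Marking rounds: returns trc(empty clause) w.r.t. F, or 0 if the empty clause
    // was not reached within max_rounds; trc collects all computed complexities
    // (for extracting an optimal refutation one would additionally store the two
    // parent clauses whenever a clause gets marked).
    unsigned long trc_empty_clause(const ClauseSet& F, const unsigned long max_rounds,
                                   std::map<Clause, unsigned long>& trc) {
      for (const Clause& C : F) trc[C] = 1;                     // round 1: the axioms
      if (trc.count(Clause())) return 1;
      for (unsigned long i = 2; i <= max_rounds; ++i) {
        const unsigned long target = 2*i - 1;
        std::vector<std::pair<Clause, unsigned long>> new_marks;
        for (const auto& p1 : trc)
          for (const auto& p2 : trc) {
            if (p1.second + p2.second + 1 != target) continue;  // wrong complexity-sum
            for (const int x : p1.first)
              if (x > 0 && p2.first.count(-x))
                if (const auto R = resolvent(p1.first, p2.first, x))
                  if (!trc.count(*R)) new_marks.push_back({*R, target});
          }
        for (const auto& m : new_marks) trc.insert(m);          // mark after the round
        if (trc.count(Clause())) return target;
      }
      return 0;
    }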
Todo:
Regular-resolution complexity
  • One could handle it like resolution complexity, only with the additional restriction that no resolution variable occurs twice on any path.
Todo:
How to compute the minimal DP-resolution complexity?
  • Is it necessary to run through all permutations of the set of variables?
  • The question here is about the notion of complexity:
    1. One could consider ordinary resolution complexity, only that the order of variables on all paths is the same.
    2. Or one considers the sum of the sizes of the levels of the DP-procedure.
  • If one considers resolution complexity, then one can use the methods for (full) resolution proofs (or for regular-resolution proofs).
  • When considering the sum of sizes, perhaps one has to run through all permutations.
    1. Via a greedy heuristic we start with a "promising" ordering.
    2. Then we can abandon any other ordering as soon as its sum becomes bigger than what we have already achieved.
    3. Likely subsumption elimination should be applied throughout.
    4. If an ordering v_1, ..., v_k was ruled out (a partial ordering, only specifying an initial segment), then actually all orderings which use a permutation of these v_1, ..., v_k as an initial segment are ruled out as well!
    5. One needs a nice way of implementing this, but it should achieve a substantial shortcut; a sketch follows at the end of this list.
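  • A rough C++ sketch of such a branch-and-bound over the orderings (hypothetical code, not part of the library); it combines the cut-off from point 2 with the caching of ruled-out initial segments from points 4 and 5, takes the sum of the sizes of the successive clause-sets as cost measure, and omits the subsumption elimination of point 3; these choices are simplifying assumptions:

    // Hypothetical sketch: searching over variable-orderings for the DP-procedure.
    #include <limits>
    #include <set>
    #include <vector>

    typedef std::set<int> Clause;
    typedef std::set<Clause> ClauseSet;
    typedef std::set<int> VarSet;

    // The DP-operator for variable v: all non-tautological resolvents on v
    // together with the clauses not containing v.
    ClauseSet dp_eliminate(const ClauseSet& F, const int v) {
      std::vector<Clause> pos, neg; ClauseSet G;
      for (const Clause& C : F) {
        if (C.count(v)) pos.push_back(C);
        else if (C.count(-v)) neg.push_back(C);
        else G.insert(C);
      }
      for (const Clause& P : pos)
        for (const Clause& N : neg) {
          Clause R; bool tautological = false;
          for (const int x : P) if (x != v) R.insert(x);
          for (const int x : N) if (x != -v) {
            if (R.count(-x)) { tautological = true; break; }
            R.insert(x);
          }
          if (!tautological) G.insert(R);
        }
      return G;
    }

    // Branch-and-bound: "used" is the set of variables eliminated so far, "sum"
    // the sum of the sizes of the clause-sets produced so far, "best" the best
    // complete sum found so far, and "ruled_out" the variable-sets of ruled-out
    // initial segments (so permutations of such a segment are not tried again).
    void search(const ClauseSet& F, const VarSet& remaining, const VarSet& used,
                const unsigned long sum, unsigned long& best,
                std::set<VarSet>& ruled_out) {
      if (sum >= best) { ruled_out.insert(used); return; }  // cut-off (point 2)
      if (remaining.empty()) { best = sum; return; }        // complete ordering found
      if (ruled_out.count(used)) return;                    // cached segment (points 4,5)
      for (const int v : remaining) {
        const ClauseSet G = dp_eliminate(F, v);
        VarSet rem2 = remaining; rem2.erase(v);
        VarSet used2 = used; used2.insert(v);
        search(G, rem2, used2, sum + G.size(), best, ruled_out);
      }
    }

    unsigned long min_dp_sum(const ClauseSet& F, const VarSet& vars) {
      // Point 1: instead of "infinity", best could be initialised with the sum
      // obtained from a greedy ("promising") ordering.
      unsigned long best = std::numeric_limits<unsigned long>::max();
      std::set<VarSet> ruled_out;
      search(F, vars, VarSet(), 0, best, ruled_out);
      return best;
    }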
Todo:
Resolution complexity
  • First we should concentrate on the decision problem whether the input F has a resolution refutation of length at most k.
  • One approach is to translate the search for a resolution refutation into a (generalised) SAT problem: we have k variables for the k steps in the proof (presented as a sequence), and the values are either clauses from F or pairs of indices for a resolution step. See "Finding short resolution proofs via SAT" below.
  • Preprocessing:
    1. Subsumption elimination should be performed first.
    2. And then the lean kernel is to be computed (removing all clauses which cannot participate in a resolution refutation).
    3. We must also determine the necessary clauses (since they have to be placed in the proof).
  • Backtracking approach:
    1. A question is whether we should enumerate all unsatisfiable sub-clause-sets and take each as the basis for a search (in such a search all of these clauses must then be used).
    2. An easy criterion for seeing that the goal cannot be reached is given by the observation that each resolution step removes at most one literal, so clauses which are too long to reach the empty clause within the given number of steps can be removed (see the sketch after this list).
    3. And when reusing a clause C, it must be smallest w.r.t. subsumption.
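  • Two of these reductions sketched in C++ (hypothetical helper names, not part of the library; the exact counting convention for the proof length is left open here, so remaining_steps just denotes the number of resolution steps still available):

    // Hypothetical preprocessing helpers for the decision problem.
    #include <algorithm>
    #include <set>
    #include <vector>

    typedef std::set<int> Clause;
    typedef std::vector<Clause> ClauseSet;

    // Subsumption elimination: remove every clause which is a proper superset
    // of another clause of F (duplicate clauses would have to be removed separately).
    ClauseSet subsumption_elimination(const ClauseSet& F) {
      ClauseSet G;
      for (const Clause& C : F) {
        bool subsumed = false;
        for (const Clause& D : F)
          if (D != C && std::includes(C.begin(), C.end(), D.begin(), D.end())) {
            subsumed = true; break;
          }
        if (!subsumed) G.push_back(C);
      }
      return G;
    }

    // Length-based filter: each resolution step removes at most one literal on the
    // path from a clause down to the empty clause, so a clause needing more steps
    // than are still available cannot participate in the refutation.
    ClauseSet remove_too_long(const ClauseSet& F, const unsigned long remaining_steps) {
      ClauseSet G;
      for (const Clause& C : F)
        if (C.size() <= remaining_steps) G.push_back(C);
      return G;
    }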
Todo:
2-CNF
  • Implement the Buresh-Oppenheim/Mitchell algorithm for computing in polynomial time the minimal resolution complexity of a boolean clause-set F in 2-CNF.
Todo:
Finding short resolution proofs via SAT
  • The basic functionality has been established with shortresref_aloamo_fcl (a small counting sketch for such an encoding follows at the end of this list).
  • We need standardisation (of the variables).
  • And we need additional conditions (to narrow the search):
    1. One could impose a linear order on the resolution variables for the cases where both parent-clauses are axioms.
    2. For F in MU, every axiom must be used at least once.
    3. For arbitrary F we can demand that every resolvent is used at least once.
    4. For F in MU we can also demand that every variable is used at least once as resolution-variable. Are there stronger conditions, taking the variable-occurrences into account? Seems not so easy, due to possible contractions of literal-occurrences.
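  • As a small illustration of the size of the encoding described in "Resolution complexity" above (this is not the encoding produced by shortresref_aloamo_fcl, but only an assumed natural direct encoding with one boolean variable per step/value pair, and unordered pairs of earlier positions as resolvent-values):

    // Hypothetical counting helpers (not library code).
    #include <cstdint>

    // Number of values of the non-boolean variable for step i (1-based):
    // the m axioms plus all unordered pairs of earlier positions.
    std::uint64_t domain_size(const std::uint64_t m, const std::uint64_t i) {
      const std::uint64_t earlier = i - 1;
      return m + earlier * (earlier - 1) / 2;
    }

    // Total number of boolean variables in such a direct encoding for m axioms
    // and proof length k; the at-least-one/at-most-one ("aloamo") clauses per
    // step then constrain each step to exactly one value.
    std::uint64_t direct_encoding_variables(const std::uint64_t m, const std::uint64_t k) {
      std::uint64_t total = 0;
      for (std::uint64_t i = 1; i <= k; ++i) total += domain_size(m, i);
      return total;
    }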

Definition in file Search.hpp.