Cores.hpp File Reference

Plans related to irredundant cores. More...

Go to the source code of this file.

Detailed Description

Plans related to irredundant cores.

  • Tests: Cover all branches of the function.
  • Write documentation:
    1. The algorithm (and the underlying reasoning) is not completely trivial.
    2. So it needs to be explained properly, with examples.
    3. Also a proof of correctness is needed.
  • Yet we choose just an arbitrary clause for the branching:
    1. A reasonable heuristic would be to choose a clause of minimal length.
    2. The target would be to get the smallest irredundant cores at the beginning.
    3. Otherwise the choice doesn't seem to have any real influence.
    4. However, we first remove the clause! So perhaps we should choose a clause of maximal length?
    5. One would guess that choosing a clause of maximal length tends to produce a first core of small size, while choosing a clause of minimal length leads more quickly to the first core, since we get more forced clauses.
    6. These considerations should be more relevant to the sampling version below.
    7. Perhaps we supply the heuristic as a parameter (selecting a clause C from Possible).
    8. Should this heuristic also have access to Forced? Likely yes.
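To illustrate the branching scheme and the role of the heuristic parameter, here is a minimal Python sketch (the module itself is Maxima code, so all names and signatures here are illustrative assumptions): clauses are modelled as frozensets of integer literals, impliesp is a brute-force stand-in for the implication predicate impliesp(F,C,S), and choose is the hypothetical heuristic parameter selecting a clause C from Possible.

```python
# Illustrative Python sketch; the actual module is written in Maxima.
from itertools import product

def variables(F):
    """All variables occurring in clause-set F (clauses as frozensets of int literals)."""
    return sorted({abs(l) for C in F for l in C})

def satisfiable(F):
    """Brute-force satisfiability test over the variables of F."""
    vs = variables(F)
    for bits in product([False, True], repeat=len(vs)):
        a = dict(zip(vs, bits))
        if all(any(a[abs(l)] == (l > 0) for l in C) for C in F):
            return True
    return False

def impliesp(F, C):
    """F |= C iff F plus the negated literals of C (as units) is unsatisfiable."""
    return not satisfiable(F | {frozenset([-l]) for l in C})

def all_forced_irr_cores(Forced, Possible, choose=lambda P: min(P, key=len)):
    """Yield all irredundant cores of Forced + Possible containing Forced.
    'choose' is the branching heuristic (default: a clause of minimal length)."""
    changed = True
    while changed:
        changed = False
        for C in set(Possible):
            rest = (Forced | Possible) - {C}
            if not impliesp(rest, C):          # necessary: C is in every core
                Forced, Possible = Forced | {C}, Possible - {C}
                changed = True
            elif impliesp(Forced, C):          # unusable: C follows from Forced
                Possible = Possible - {C}
                changed = True
    if not Possible:
        # final irredundancy check: clauses forced by branching (not by
        # necessity) may have become redundant later
        if all(not impliesp(Forced - {C}, C) for C in Forced):
            yield frozenset(Forced)
        return
    C = choose(Possible)
    # first branch: remove C (the remaining clauses still imply C)
    yield from all_forced_irr_cores(Forced, Possible - {C}, choose)
    # second branch: force C into the core
    yield from all_forced_irr_cores(Forced | {C}, Possible - {C}, choose)
```

For example, for F = {{1}, {-1,2}, {2}} the two irredundant cores {{1},{-1,2}} and {{1},{2}} are enumerated; swapping in a maximal-length choose only changes the order of enumeration, which is exactly the point of the discussion above.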
  • If we monitor the computation and decide that the currently obtained clause-sets are enough --- is there any way to stop the computation while keeping the current results? Ask on the Maxima mailing list.
    1. One possibility would be to push the results into a global variable.
    2. Or, better, to provide a variable into which these results are pushed. So we need to provide the *name* of the variable?
  • Monitoring:
    1. The variable counter_irr is only relevant for monitoring, so perhaps it should not be defined as if it were of general relevance.
    2. It would be informative to print out also the path to the current node (at the entry), as list of 1s and 2s.
      • Perhaps this list is a bit long; printing it out as a string saves space.
      • We could also send all monitoring output to a file.
      • Perhaps we should work on general methods and standards for monitoring. Where to put them? Perhaps into a new supermodule Programming?
  • A variation of all_forced_irr_cores_bydef would postpone the irredundancy check until just before "return({Forced})". However, I (OK) cannot imagine this to be more efficient.
  • We could write a version of all_forced_irr_cores_bydef, which only works on unsatisfiable clause-sets. On the other hand, not much advantage seems to follow from this additional knowledge?
  • DONE (the corrected version has on the first branch only deletions) An alternative version of all_forced_irr_cores_bydef would make the recursive branching with switched branches, delaying as much as possible the inclusion of clauses. Perhaps this is handled by the randomised version sample_irr_cores.
Sampling of irredundant cores
  • The task is to obtain a random sample of N irredundant cores (if that many exist).
  • A problem is that these samples will stay close to the first completed path (due to the backtracking process).
  • So perhaps we create another version, which after each completed sample starts a complete new path.
  • The problem with that is that it takes endlessly long to go down a path just to find one instance.
  • So perhaps here we just go down the path, only checking whether the selected clause is necessary, unusable, or a choice clause.
  • And only at the end do we check whether the obtained clause-set is actually irredundant --- if not, then we just restart.
  • For testing we then use a function "irredundant_corep" (for checking whether a clause-set is an irredundant core of another clause-set).
  • DONE Where are Maxima random generators?
  • DONE A parameter p controls the probability of going left or right in the recursive splitting step.
  • DONE Then p=0 and p=1 correspond to the existing version and the alternative version (with switched branches) mentioned above. Using N=inf we then obtain all irredundant cores.
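The restart-based sampling scheme above, together with the testing predicate irredundant_corep, could look roughly as follows. This is again an illustrative Python sketch, not the Maxima code: clauses are frozensets of integer literals, implication is decided by brute force, and the signatures (including the parameters N, p and the seeded rng) are assumptions.

```python
# Illustrative Python sketch; the actual module is written in Maxima.
import random
from itertools import product

def satisfiable(F):
    vs = sorted({abs(l) for C in F for l in C})
    for bits in product([False, True], repeat=len(vs)):
        a = dict(zip(vs, bits))
        if all(any(a[abs(l)] == (l > 0) for l in C) for C in F):
            return True
    return False

def impliesp(F, C):
    return not satisfiable(F | {frozenset([-l]) for l in C})

def irredundant_corep(G, F):
    """Is G an irredundant core of F?  G is a subset of F, G implies every
    clause of F, and no clause of G follows from the remaining clauses of G."""
    return (G <= F
            and all(impliesp(G, C) for C in F)
            and all(not impliesp(G - {C}, C) for C in G))

def sample_irr_cores(F, N, p, rng=None):
    """Sample N distinct irredundant cores of F.  Going down a path, each
    selected clause is only classified as necessary, unusable, or a choice
    clause; a choice clause is included with probability p.  Only at the end
    is irredundancy checked; on failure we simply restart.
    Caution: loops forever if F has fewer than N irredundant cores."""
    rng = rng or random.Random(0)
    found = set()
    while len(found) < N:
        Forced, Possible = set(), set(F)
        while Possible:
            C = Possible.pop()
            if not impliesp(Forced | Possible, C):
                Forced.add(C)                  # necessary
            elif impliesp(Forced, C):
                pass                           # unusable: drop
            elif rng.random() < p:
                Forced.add(C)                  # choice clause: include
            # else: choice clause, excluded (safe: the rest implies C)
        core = frozenset(Forced)
        if irredundant_corep(core, frozenset(F)):
            found.add(core)
    return found
```

Since every exclusion happens only when the remaining clauses imply the excluded clause, each completed path yields a clause-set equivalent to F; only irredundancy can fail, which is exactly what the final irredundant_corep check (and restart) handles.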
Finding an irredundant core of minimum size
Organisation: DONE
  • DONE Create sub-modules "Cores.mac".
  • DONE From ComputerAlgebra/Satisfiability/Lisp/MinimalUnsatisfiability/Basics.mac the following functions should belong here:
    1. min_size_mus_bydef(F, S)
    2. contains_us_bydef
    3. first_mus_fcs
    4. first_irr_fcs
    5. all_irr_cores_bydef
    6. all_forced_irr_cores_bydef
  • DONE We should group these functions better together.
  • DONE We should use an "impliesp(F,C,S)" predicate.
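By way of illustration, a min_size_mus_bydef-style search can be sketched in Python as exhaustive search by increasing size; note that a minimum-size unsatisfiable subset is automatically minimally unsatisfiable. The solver parameter S of the Maxima function is dropped here in favour of a hard-coded brute-force satisfiability test, so this sketch is an assumption about the by-definition approach, not a transcription.

```python
# Illustrative Python sketch; the actual module is written in Maxima.
from itertools import combinations, product

def satisfiable(F):
    """Brute-force satisfiability test (clauses as frozensets of int literals)."""
    vs = sorted({abs(l) for C in F for l in C})
    for bits in product([False, True], repeat=len(vs)):
        a = dict(zip(vs, bits))
        if all(any(a[abs(l)] == (l > 0) for l in C) for C in F):
            return True
    return False

def min_size_mus(F):
    """Smallest unsatisfiable sub-clause-set of F, by exhaustive search over
    subsets of increasing size; returns None if F is satisfiable."""
    clauses = list(F)
    for k in range(len(clauses) + 1):
        for sub in combinations(clauses, k):
            if not satisfiable(set(sub)):
                return frozenset(sub)
    return None
```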
"Adaptive core search"
  • Consider the algorithm from [Bruni 2003, DAM, Approximating minimal unsatisfiable subformulae by means of adaptive core search].
  • The ideas are:
    1. Use clause-branching, and count for each clause how often it is branched on and how often it fails.
    2. This yields a measure of "clause hardness".
    3. Now after some iterations of a DLL-solver, select a certain percentage of the hardest clauses, and let them be the first trial core.
    4. Run the solver only on this trial core.
    5. If no solution within a certain time, then the core is still too hard, so remove some clauses.
    6. If trial core found unsatisfiable, then stop.
    7. If core found satisfiable, but the assignment is not overall satisfying, then add all clauses falsified by it, and a certain percentage of the hardest clauses, and restart with this new trial core.
    8. Completeness is ensured by no longer performing "contraction" (i.e., the removal of clauses) after some time, so that the current trial core then has to be solved completely.
    9. If the problem was not solved, then the solver tries again as before, but the clauses added in the current "intensification phase" are never removed, so that this "base core" must grow and eventually, if not finished before, become the whole formula which is then solved by ordinary DLL.
  • Apparently these ideas have since been used by learning algorithms.
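The core of steps 1-9 can be sketched as a grow-the-trial-core loop. The Python skeleton below is a simplification under stated assumptions: the clause-hardness counters, which in the actual algorithm come from the DLL-solver's branching and failure statistics, are replaced by an arbitrary pluggable score, the solver is a brute-force model search, and the time limits and contraction/freezing machinery of steps 5, 8 and 9 are omitted.

```python
# Illustrative Python skeleton of an adaptive-core-search-style loop;
# not the algorithm from the paper, and not the module's Maxima code.
from itertools import product

def find_model(F, vs):
    """First satisfying assignment of F over variables vs, else None."""
    for bits in product([False, True], repeat=len(vs)):
        a = dict(zip(vs, bits))
        if all(any(a[abs(l)] == (l > 0) for l in C) for C in F):
            return a
    return None

def adaptive_core_search(F, hardness, fraction=0.5):
    """Grow a trial core of the 'hardest' clauses until it is unsatisfiable
    (return it) or a model of all of F is found (return None).
    'hardness' is a stand-in score; the real measure comes from solver
    statistics (branch/failure counts per clause)."""
    vs = sorted({abs(l) for C in F for l in C})
    ranked = sorted(F, key=hardness, reverse=True)
    k = max(1, int(fraction * len(ranked)))
    core = set(ranked[:k])                     # initial trial core
    while True:
        model = find_model(core, vs)
        if model is None:
            return core                        # trial core unsatisfiable: stop
        falsified = {C for C in F
                     if all(model[abs(l)] != (l > 0) for l in C)}
        if not falsified:
            return None                        # model satisfies all of F
        # expand: add every falsified clause plus more hard clauses, restart
        core |= falsified
        rest = sorted(set(F) - core, key=hardness, reverse=True)
        core |= set(rest[:k])
```

The returned core is an unsatisfiable subset but, as in the paper's title, only an approximation of a minimal unsatisfiable subformula.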
Exploiting the duality between MAXSAT and MUSAT
  • See [Liffiton, Sakallah; SAT 2005; On Finding All Minimally Unsatisfiable Subformulas].

Definition in file Cores.hpp.