
Combining Conflict-Driven Clause Learning and Chronological Backtracking for Propositional Model Counting

14 pages. Published: December 10, 2019

Abstract

In propositional model counting, also called #SAT, the search space needs to be explored exhaustively, in contrast to SAT, where the task is to determine whether a propositional formula is satisfiable. While state-of-the-art SAT solvers are based on non-chronological backtracking, it has also been shown that backtracking chronologically does not significantly degrade solver performance. It is therefore natural to investigate the combination of chronological backtracking with conflict-driven clause learning (CDCL) for #SAT. We present a calculus for #SAT combining chronological backtracking with CDCL and provide a formal proof of its correctness.
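To make the contrast drawn in the abstract concrete: for counting, a solver must explore the entire search space rather than stop at the first satisfying assignment. The following Python sketch only illustrates that baseline task, namely exhaustive model counting by recursive splitting; it is not the calculus of the paper, which additionally integrates clause learning and chronological backtracking. The function name count_models, the clause encoding, and the example formula are invented for this sketch.

from typing import FrozenSet, List, Optional

Clause = FrozenSet[int]  # positive int = variable, negative int = its negation

def count_models(clauses: List[Clause], variables: List[int]) -> int:
    """Count satisfying assignments of a CNF formula over `variables`.

    Unlike SAT, the recursion never stops at the first model: both branches
    of every decision are explored, and a branch whose clauses are all
    satisfied contributes 2**(number of still-unassigned variables) models.
    Assumes every clause only mentions variables from `variables`.
    """

    def assign(cls: List[Clause], lit: int) -> Optional[List[Clause]]:
        # Make `lit` true: drop satisfied clauses and remove the falsified
        # literal from the others; an empty clause signals a conflict.
        result = []
        for c in cls:
            if lit in c:
                continue
            reduced = c - {-lit}
            if not reduced:
                return None
            result.append(reduced)
        return result

    def count(cls: Optional[List[Clause]], free: List[int]) -> int:
        if cls is None:
            return 0                   # conflict under this partial assignment
        if not cls:
            return 2 ** len(free)      # all clauses satisfied: free vars are arbitrary
        var, rest = free[0], free[1:]
        # Split on the next variable and sum the counts of both branches.
        return count(assign(cls, var), rest) + count(assign(cls, -var), rest)

    return count(clauses, variables)

# Example: (x1 or x2) and (not x1 or x2) has exactly 2 models over {x1, x2}.
formula = [frozenset({1, 2}), frozenset({-1, 2})]
print(count_models(formula, [1, 2]))   # prints 2

For the example formula the two models are {x1=False, x2=True} and {x1=True, x2=True}; a plain SAT solver would stop after finding either one, whereas the counter keeps exploring the remaining branches.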

Keyphrases: #sat, chronological backtracking, conflict-driven clause learning, model counting, propositional calculus, rules, sat

In: Diego Calvanese and Luca Iocchi (editors). GCAI 2019. Proceedings of the 5th Global Conference on Artificial Intelligence, vol 65, pages 113-126.

BibTeX entry
@inproceedings{GCAI2019:Combining_Conflict_Driven_Clause,
  author    = {Sibylle Möhle and Armin Biere},
  title     = {Combining Conflict-Driven Clause Learning and Chronological Backtracking for Propositional Model Counting},
  booktitle = {GCAI 2019. Proceedings of the 5th Global Conference on Artificial Intelligence},
  editor    = {Diego Calvanese and Luca Iocchi},
  series    = {EPiC Series in Computing},
  volume    = {65},
  publisher = {EasyChair},
  bibsource = {EasyChair, https://easychair.org},
  issn      = {2398-7340},
  url       = {/publications/paper/wBnB},
  doi       = {10.29007/vgg4},
  pages     = {113-126},
  year      = {2019}}