Complexity of minimum-size arc-inconsistency explanations
Abstract
Explaining the outcome of programs has become one of the main concerns in AI research. In constraint programming, a user may want the system to explain why a given variable assignment is not feasible, or how it came to the conclusion that the problem has no solution. One answer to the latter is to return to the user a sequence of simple reasoning steps that lead to inconsistency. Arc consistency is a well-known form of reasoning that can be understood by a human. We consider explanations as sequences of propagation steps of a constraint on a variable (i.e., the ubiquitous revise function in arc-consistency algorithms) that lead to inconsistency. We characterize several cases for which providing a shortest such explanation is easy: for instance, when constraints are binary and variables have maximum degree two. However, these polynomial cases are tight. For instance, providing a shortest explanation is NP-hard when constraints are binary and the maximum degree is three, even if the number of variables is bounded. It also remains NP-hard on trees, despite the fact that arc consistency is a decision procedure on trees. Finally, the problem is not even FPT-approximable unless the FPT ≠ W[2] hypothesis is false.
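To make the notion of explanation concrete, the sketch below (Python; the representation of domains and constraints is an illustrative assumption, not the paper's formal model) shows a revise step in the style of AC-3 and a two-step sequence of revisions that empties a domain, i.e., an arc-inconsistency explanation for the unsatisfiable pair of constraints x < y and y < x over domains {1, 2}.

```python
# Minimal sketch of the revise(x, c) step from arc-consistency algorithms.
# A constraint c is assumed to be a pair ((u, v), allowed), where allowed is
# the set of permitted (value of u, value of v) pairs; domains maps each
# variable to a set of values. These conventions are illustrative only.

def revise(domains, c, x):
    """Remove from the domain of x every value with no support on c.
    Returns True if at least one value was pruned."""
    (u, v), allowed = c
    y = v if x == u else u
    pruned = False
    for a in list(domains[x]):
        # a is supported iff some value b of the other variable y makes the pair allowed
        if x == u:
            supported = any((a, b) in allowed for b in domains[y])
        else:
            supported = any((b, a) in allowed for b in domains[y])
        if not supported:
            domains[x].discard(a)
            pruned = True
    return pruned

# An explanation, in the sense of the abstract, is a sequence of
# (constraint, variable) revisions whose application empties some domain.
domains = {"x": {1, 2}, "y": {1, 2}}
lt_xy = (("x", "y"), {(1, 2)})   # constraint x < y: only (x=1, y=2) is allowed
lt_yx = (("y", "x"), {(1, 2)})   # constraint y < x: only (y=1, x=2) is allowed

explanation = [(lt_xy, "x"), (lt_yx, "y")]
for c, var in explanation:
    revise(domains, c, var)
print(domains)  # {'x': {1}, 'y': set()} -- the wiped-out domain of y certifies inconsistency
```

Here the sequence has length two, and no shorter sequence of revisions empties a domain for this toy instance; the paper studies the complexity of finding such a shortest sequence in general.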