NLocalSAT: Boosting Local Search with Solution Prediction (2020-05-02)

Abstract

NLocalSAT addresses the random-initialization problem of stochastic local search (SLS) solvers.

NLocalSAT combines SLS with a solution prediction model, which boosts SLS by changing initialization assignments with a neural network.
NLocalSAT is a method that boosts SLS solvers with a neural network, applied in an off-line way.
(NeuroCore makes its predictions online, which adds a large overhead to CDCL. NLocalSAT instead uses the prediction off-line: for each SAT problem, the neural network is evaluated only once.)

Experiments are conducted on five SLS solvers (CCAnr, Sparrow, CPSparrow, YalSAT, and probSAT).
Experimental results show that NLocalSAT significantly increases the number of problems solved and decreases the solving time for hard problems.

Introduction

state-of-the-art SAT solvers :
CDCL solvers are based on backtracking search: they assign one variable at a time and backtrack when a conflict occurs.
SLS solvers initialize an assignment for all variables and then find a solution by constantly flipping the assignment of variables to optimize some score.
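To make this concrete, here is a minimal sketch of the SLS loop just described (greedy flips over a full assignment), assuming DIMACS-style integer literals; it is illustrative only and not any particular solver's code:

```python
import random

def num_satisfied(clauses, assignment):
    # A clause is satisfied if any literal agrees with the assignment
    # (positive literal k is true iff variable k is assigned True).
    return sum(
        any((lit > 0) == assignment[abs(lit)] for lit in clause)
        for clause in clauses
    )

def score_after_flip(clauses, assignment, v):
    flipped = dict(assignment)
    flipped[v] = not flipped[v]
    return num_satisfied(clauses, flipped)

def sls_solve(clauses, n_vars, max_flips=10_000):
    # Random initialization -- the step NLocalSAT replaces with a predicted assignment.
    assignment = {v: random.random() < 0.5 for v in range(1, n_vars + 1)}
    for _ in range(max_flips):
        if num_satisfied(clauses, assignment) == len(clauses):
            return assignment                     # all clauses satisfied
        # Flip the variable that most improves the number of satisfied clauses.
        best = max(range(1, n_vars + 1),
                   key=lambda v: score_after_flip(clauses, assignment, v))
        assignment[best] = not assignment[best]
    return None                                   # give up after max_flips
```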

A GCN is a neural network model over graph structures that extracts both structural information and node-level information from a graph.

Model

The solver finds the final solution under the guidance of the neural network.
Input: the formula is transferred into a formula graph and fed to a graph-based neural network (a gated graph neural network, GGNN) to extract features.
Output: a candidate solution, produced by a multi-layer perceptron on top of the neural network.
This candidate solution is used to initialize the assignment of an SLS SAT solver and guide its search process.


  1. Formula graph (to capture the structural information of an input formula)
    The formula is first transferred to CNF, a conjunction of clauses C1 ∧ C2 ∧ ··· ∧ Cn,
    where each clause Ci = Li1 ∨ Li2 ∨ ··· ∨ Lim and each literal Lij is either xk or ¬xk (one possible graph construction is sketched below).
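A hedged sketch of one way to turn such a CNF into a graph representation (a bipartite literal-clause adjacency matrix); the paper's exact graph construction may differ, and `build_formula_graph` is an illustrative name:

```python
import numpy as np

def build_formula_graph(clauses, n_vars):
    """clauses: DIMACS-style clauses, e.g. [[1, -2], [2, 3]] for (x1 ∨ ¬x2) ∧ (x2 ∨ x3)."""
    n_lits = 2 * n_vars                      # node k-1 for literal xk, node n_vars+k-1 for ¬xk

    def lit_node(lit):
        return lit - 1 if lit > 0 else n_vars + (-lit) - 1

    # Literal-clause incidence matrix: entry (l, c) = 1 iff literal l occurs in clause c.
    adj = np.zeros((n_lits, len(clauses)), dtype=np.float32)
    for c, clause in enumerate(clauses):
        for lit in clause:
            adj[lit_node(lit), c] = 1.0
    return adj

adj = build_formula_graph([[1, -2], [2, 3]], n_vars=3)
print(adj.shape)  # (6, 2)
```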

2. Graph-Based Neural Network (predicts the candidate solution for a SAT problem)
It consists of a gated graph convolutional network (GGCN) that extracts structural information from the graph,
and a two-layer perceptron that predicts the solution.
GGCN: extracts features of the variables;
it takes the adjacency matrix as input and outputs the features of each variable extracted from the graph (a minimal sketch follows).
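A minimal, hedged sketch of a gated graph network step along these lines (message passing over the formula-graph adjacency followed by a GRU update); the layer sizes, the number of propagation rounds, and the class name are assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

class GatedGraphEncoder(nn.Module):
    def __init__(self, dim=64, rounds=8):
        super().__init__()
        self.rounds = rounds
        self.init_state = nn.Parameter(torch.randn(1, dim))  # learned initial node state
        self.msg = nn.Linear(dim, dim)                        # message transformation
        self.gru = nn.GRUCell(dim, dim)                       # gated node-state update

    def forward(self, adj):
        # adj: (n_nodes, n_nodes) adjacency over all nodes of the formula graph.
        n_nodes = adj.shape[0]
        h = self.init_state.repeat(n_nodes, 1)
        for _ in range(self.rounds):
            messages = adj @ self.msg(h)   # aggregate transformed neighbour states
            h = self.gru(messages, h)      # gated update of each node's features
        return h                           # per-node feature vectors

encoder = GatedGraphEncoder()
adj = torch.zeros(6, 6)                    # toy symmetric adjacency
adj[0, 4] = adj[4, 0] = 1.0
features = encoder(adj)                    # shape (6, 64)
```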

  1. Two-layer perceptron
    After the GGCN, the node vectors contain the structural information of the literals.
    Through a softmax function, we get the probability of each variable being true.

Loss function: computed against the ground truth of each variable vi (a known satisfying assignment).
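A hedged sketch of the prediction head and loss described above: a two-layer perceptron over each variable's feature vector, a softmax over {false, true}, and a loss against the ground-truth value of each variable vi. The use of cross-entropy and all layer sizes are assumptions consistent with the softmax output, not quotes from the paper:

```python
import torch
import torch.nn as nn

class SolutionHead(nn.Module):
    def __init__(self, dim=64, hidden=64):
        super().__init__()
        # Two-layer perceptron producing one logit per polarity (false / true).
        self.mlp = nn.Sequential(nn.Linear(dim, hidden), nn.ReLU(), nn.Linear(hidden, 2))

    def forward(self, var_features):
        return self.mlp(var_features)               # (n_vars, 2) logits

head = SolutionHead()
var_features = torch.randn(5, 64)                   # toy features for 5 variables
logits = head(var_features)
probs_true = torch.softmax(logits, dim=-1)[:, 1]    # probability each variable is true
labels = torch.tensor([1, 0, 1, 1, 0])              # ground-truth value of each variable vi
loss = nn.functional.cross_entropy(logits, labels)  # compare prediction with ground truth
```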

Training Data

The dataset comes from SAT Race 2015 (the 2015 SAT competition).
Goal: predict a solution to each SAT problem.
Non-unique solutions: the solver maintains a counter for each variable and for each negated variable; whichever of x and ¬x is chosen more often across solutions is labeled 1 (true).
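A hedged sketch of this labeling rule (one counter per literal, majority polarity wins); the tie-breaking toward true is my choice for illustration:

```python
from collections import Counter

def label_from_solutions(solutions, n_vars):
    """solutions: list of dicts mapping variable index -> bool (satisfying assignments)."""
    counts = Counter()
    for sol in solutions:
        for v in range(1, n_vars + 1):
            counts[v if sol[v] else -v] += 1          # count x and ¬x separately
    # Label each variable with the polarity seen more often (ties -> true here).
    return {v: counts[v] >= counts[-v] for v in range(1, n_vars + 1)}

sols = [{1: True, 2: False}, {1: True, 2: True}]      # two solutions of a toy formula
print(label_from_solutions(sols, n_vars=2))           # {1: True, 2: True}
```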

How an SLS solver works:



The initial assignment matters a lot.
Solvers restart the search by reassigning new random values to the variables after a period of time without any score increase.
Before starting the solver, we run our neural network once to predict a solution.
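A hedged sketch of how that prediction is plugged in: the network is evaluated once, its output is thresholded into an initial assignment, and this assignment replaces the random initialization. The optional noise parameter and the function name are illustrative, not the paper's exact restart policy:

```python
import random

def init_from_prediction(probs_true, noise=0.0):
    """probs_true: dict mapping variable index -> predicted probability of being true."""
    assignment = {}
    for v, p in probs_true.items():
        if random.random() < noise:
            assignment[v] = random.random() < 0.5   # keep some randomness, e.g. on restarts
        else:
            assignment[v] = p >= 0.5                # follow the network's prediction
    return assignment

# The SLS solver then runs its usual flip loop from this assignment instead of a
# fully random one.
probs = {1: 0.92, 2: 0.13, 3: 0.55}
print(init_from_prediction(probs))                  # e.g. {1: True, 2: False, 3: True}
```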



Modified SLS solvers: the initialization assignment is taken from the neural network's prediction instead of random values.



Results:

The experimental results show that solvers with NLocalSAT solve more problems than the original ones.



Word&Phrase

促进 promote, facilitate, accelerate, boost, stimulate, advance

in an off-line way. 离线

促使 cause, prompt, urge, induce, bring, push

induces limited overhead, 产生有限开销

fine-tuned 微调

counter-intuitive 违反直觉的

cause the SLS solvers to get stuck

have isomorphic problems between these two datasets 两个数据集的同构问题
