Search Space and Average Proof Length of Resolution

H. Kleine Büning, T. Lettmann
FB 17 Mathematik/Informatik
Universität-GH-Paderborn
Postfach 16 21
D-4790 Paderborn (Germany)
E-mail: kbcsl@uni-paderborn.de, lettmann@uni-paderborn.de

Abstract

In this paper we introduce a definition of search trees for resolution based proof procedures. This definition describes more clearly the differences between the restrictions of resolution. Applying this concept to monotone restrictions of the resolution, it is shown that the average proof length for propositional formulas is at most four times as large as for unrestricted resolution. The search trees used within this paper also allow the representation of space bounded resolution.

1 Introduction

Many efforts have been made to classify proof procedures such as resolution, restrictions of resolution, and other systems like cutting plane systems, Davis-Putnam algorithms etc. with respect to the minimal proof length, see e.g. [6]. Restrictions of the resolution proof procedure can also be classified in this way [2], [3], [4], [5], [9], [10]. But these papers deal with worst-case complexities. In practice many restrictions, e.g. N-resolution, prove to be very efficient even though they are disqualified by looking at the proof length in the worst case. This may indicate that measuring the minimal proof length is not really helpful in comparing proof systems with respect to practical applications.

In order to get a notion of the efficiency of a restriction in the average case, we have to investigate the search space of these restrictions carefully. Our approach is to define a search space that is structured as a tree containing all possible derivations, each given by the sequence of resolution steps that are performed. We also want to integrate a principle of locality: every node within the search tree contains all information needed to generate the subtree with this node as its root. Many restrictions of resolution also have the property of preserving derivability during the further stages of a proof. That means if a clause can be obtained by a resolution step once in a derivation, this resolution step is also possible at any later stage of the proof. Note that linear restrictions do not fulfill this monotony condition. But for monotone restrictions, e.g. for N-resolution, it is proven that the average proof length is at most four times as large as for unrestricted resolution. Furthermore, we study how this concept of search trees can be combined with space bounded resolution restrictions. Here, the memory for storing clauses during a derivation is bounded by functions in the length of the starting formula.

2 Preliminaries

We assume that the reader is familiar with the basic notions of propositional logic and resolution. A resolution step can be seen as an operation on a set of clauses Σ with the following actions:

1. Select resolvable clauses φ_1, φ_2 from Σ.
2. Resolve φ_1, φ_2: φ_1, φ_2 ⊢_RES φ.
3. Add φ to Σ (if φ is a new clause).
4. Remove some clauses from Σ.

It is well known that the empty clause can be generated by a sequence of applications of this operation if and only if the formula is unsatisfiable. Resolution remains complete for several restrictions, e.g. N-resolution (one of the parent clauses must be a negative clause) and linear resolution (one of the parent clauses is the previous resolvent), which have been developed in order to reduce the degree of nondeterminism, i.e. the number of choices for the parent clauses. This means that the restrictions reduce the size of the search space. With additional strategies, take for example the level-saturation strategy or the unit-preference strategy, deterministic proof systems can be obtained. (For an overview of restrictions of resolution and additional strategies see e.g. [1], [8], [11].)

In our terminology a resolution based procedure X-Res is a resolution operation with some local property and with or without some strategy determining in which order the local resolution operation has to be applied to the clauses. Therefore, we say that X-Res is a restriction of the resolution. Thus, a resolution based procedure X-Res can be described as

1. Select resolvable clauses φ_1, φ_2 according to the restriction X-Res.
2. Resolve φ_1, φ_2: φ_1, φ_2 ⊢_RES φ.
3. Add φ to Σ (if φ is a new clause).
4. Remove some clauses from Σ according to the restriction X-Res.

For a resolution based proof procedure X-Res we define a derivation via X-Res in the following way.

Definition 2.1 (derivation via X-Res)
For a propositional formula Σ in CNF and a clause φ we say φ is derivable from Σ via X-Res, written Σ ⊢_X-Res φ, if and only if φ is a clause of Σ or there is a finite sequence φ_1, ..., φ_n of clauses with φ_n = φ and φ_{i_1}, φ_{i_2} ⊢_RES φ_i, where φ_{i_1} and φ_{i_2} are clauses in Σ ∪ {φ_1, ..., φ_{i-1}} fulfilling the special conditions of X-Res. The clauses φ_1, ..., φ_n are called the intermediate clauses of the derivation. For a formula Σ and a resolution based procedure X-Res the closure is defined as X-Res(Σ) = {φ | Σ ⊢_X-Res φ}.
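The closure X-Res(Σ) of Definition 2.1 can be computed as a fixpoint: permitted resolution steps are applied until no new clause appears. The following minimal Python sketch illustrates this idea, assuming clauses are represented as frozensets of nonzero integers (a negative integer standing for a negated variable); the predicate name `allowed`, which stands in for the side condition of X-Res, and all other identifiers are our own illustration, not notation from the paper.

```python
from itertools import combinations

Clause = frozenset  # e.g. frozenset({1, -2}) stands for (A ∨ ¬B)

def resolvents(c1, c2):
    """All clauses obtainable by one resolution step on c1 and c2."""
    out = set()
    for lit in c1:
        if -lit in c2:
            out.add(Clause((c1 - {lit}) | (c2 - {-lit})))
    return out

def closure(sigma, allowed=lambda c1, c2, derived: True):
    """X-Res(Σ): all clauses derivable via steps permitted by `allowed`."""
    derived = set(sigma)
    changed = True
    while changed:
        changed = False
        for c1, c2 in combinations(list(derived), 2):
            if not allowed(c1, c2, derived):
                continue
            for r in resolvents(c1, c2):
                if r not in derived:
                    derived.add(r)
                    changed = True
    return derived

# N-resolution: one parent clause must consist of negative literals only.
n_res = lambda c1, c2, _: all(l < 0 for l in c1) or all(l < 0 for l in c2)
sigma = [Clause({1, 2}), Clause({-1}), Clause({-2})]
print(Clause() in closure(sigma, n_res))   # True: Σ is unsatisfiable
```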

3 Search Tree

There are different approaches to defining the search space of a resolution based proof procedure. For example, all possible ways of applying an inference rule to the initially given set of clauses and to clauses derived from them can determine the search space for a formula. Often, the search space is seen as the set of all initially given or derivable clauses, and the proof procedure investigates these clauses using a systematic search strategy. Our approach also represents all possible derivations. But we try to give a representation that is independent of minor constraints such as the order of clauses and that also represents concepts such as proof length and resolution restrictions.

As mentioned above, we will define the search space as a search tree. We do not use the concept of search trees in its classical meaning; our search tree represents a decision tree. Each node represents a sequence of resolution steps, and different paths in the tree represent different choices for the parent clauses.

In order to make the idea of our definition clear, we will discuss N-resolution as an example of a restriction of the resolution. In the case of N-resolution (one of the parent clauses must be a negative clause) we can divide the set of clauses of the initial formula Σ into Σ_N, the set of negative clauses, and Σ \ Σ_N, the clauses with at least one positive literal. Thus, the set of all possible choices for the parent clauses of the first resolution step using N-resolution can be represented by Σ_N × (Σ \ Σ_N). Therefore, we use this product as the label of the root of the search tree for Σ. Now there may be several choices of clauses we can resolve using N-resolution. For each pair of clauses (φ_1, φ_2) ∈ Σ_N × (Σ \ Σ_N) for which φ_1, φ_2 ⊢_X-Res φ and φ is a new clause (i.e. φ ∉ Σ), we introduce a separate successor node. The label of such a son is determined by (Σ_N ∪ {φ}) × (Σ \ Σ_N) if φ is a negative clause and by Σ_N × ((Σ \ Σ_N) ∪ {φ}) otherwise. Then we continue with each node and perform the above steps analogously. More generally, the label of a successor node can be computed by a successor function, which essentially depends on the restriction, the label of the father and the resolvent. If the empty clause is contained in the label, then no further successor of this node will exist.

On the one hand, the number of successor nodes of a node shows the level of nondeterminism we have for a resolution step. Each implementation of a resolution restriction follows one path from the root of the search tree to a leaf. But within the search tree all computations depending on different clause orders are represented simultaneously. On the other hand, we require that a successor node exists only if a new clause was deduced. In practice some effort is also necessary to determine whether a deduced clause really is new. If we also represented such unsuccessful tries as nodes, we would get infinite search trees.
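To make the root label Σ_N × (Σ \ Σ_N) concrete, the short sketch below (same clause encoding as above; helper names are our own) splits a formula into its negative clauses and the rest and lists the resolvable pairs, i.e. the possible choices for the first N-resolution step.

```python
def is_negative(clause):
    """A negative clause contains negative literals only."""
    return all(lit < 0 for lit in clause)

def resolvable(c1, c2):
    """True if c1 and c2 contain a complementary pair of literals."""
    return any(-lit in c2 for lit in c1)

def n_res_root_label(sigma):
    """Root label Σ_N × (Σ \ Σ_N) of the N-resolution search tree."""
    sigma_n = frozenset(c for c in sigma if is_negative(c))
    return sigma_n, frozenset(sigma) - sigma_n

# Example: Σ = {(A ∨ B), ¬A, ¬B}
sigma = [frozenset({1, 2}), frozenset({-1}), frozenset({-2})]
sigma_n, rest = n_res_root_label(sigma)
first_steps = [(c1, c2) for c1 in sigma_n for c2 in rest if resolvable(c1, c2)]
print(len(first_steps))   # 2 possible choices for the first N-resolution step
```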

Definition 3.1 (search tree for resolution)
For a resolution based proof procedure X-Res the search tree is a labeled tree T_X-Res(Σ) specified by (Σ, I, S_X-Res, ⊢_X-Res), where Σ is a formula, I is the label of the root node and S_X-Res computes the labels of successor nodes, determined by the specified resolution restriction ⊢_X-Res, in the following way. Let Σ_1 × Σ_2 be the label of some node in the search tree.

1. If the empty clause is contained in Σ_1 ∪ Σ_2, then the node is a leaf node.
2. For each pair φ_1 ∈ Σ_1, φ_2 ∈ Σ_2 to which one resolution step according to the X-resolution can be applied, φ_1, φ_2 ⊢_X-Res φ and φ ∉ Σ_1 ∪ Σ_2 (φ is a new clause), there is a successor node labeled by S(Σ_1 × Σ_2, φ).

For the sake of simplicity we assume S(Σ_1 × Σ_2, □) = {(□, □)}. A node with this label cannot have a successor node. Therefore, we say that leaves in the search tree are labeled by the empty clause □.

Each branch within the search tree contains at least one derivation tree. But for different branches these trees might not be different, because the sequence of resolution steps is coded within a branch of the search tree. Therefore, we can look upon a path from the root to a leaf □ in a search tree as a refutation. For any sequence of resolution steps we can easily determine the corresponding branch within the search tree. For the opposite direction we have no information concerning the parent clauses of a resolution step. E.g. there are two ways to derive A from {(A ∨ B), (A ∨ C), (A ∨ ¬B), (A ∨ ¬C)}. In order to distinguish the two possible resolution steps within the search tree we could introduce edge labels that specify the parent clauses of the corresponding resolution step.

Let us consider search trees for some resolution strategies.

Examples:

1. Search tree for unrestricted resolution: T_Res(Σ) is given by (Σ, Σ × Σ, S_Res, ⊢_Res). If we have a node with label Σ_1 × Σ_2 not containing the empty clause, then for each pair (φ_1, φ_2) ∈ Σ_1 × Σ_2 with φ_1, φ_2 ⊢_Res φ and φ a new clause for Σ_1 × Σ_2 we have a successor node. The label of a successor node is S_Res(Σ_1 × Σ_2, φ) = (Σ_1 ∪ {φ}) × (Σ_2 ∪ {φ}). Obviously, there is no need to use Cartesian products as labels here, as both sets are equal in the label of each node.

2. Search tree for N-resolution: T_N-Res(Σ) is given by (Σ, I_N-Res, S_N-Res, ⊢_N-Res). The label of the root node is I_N-Res = Σ_N × (Σ \ Σ_N), with Σ_N denoting the set of negative clauses in Σ (clauses consisting of negative literals only). Using

   S_N-Res(Σ_1 × Σ_2, φ) = (Σ_1 ∪ {φ}) × Σ_2 if φ is a negative clause, and Σ_1 × (Σ_2 ∪ {φ}) otherwise,

   we obtain labels whose first set contains negative clauses only. Therefore, each resolution step is an N-resolution step. Analogously, the search tree T_P-Res for the P-resolution (one of the parent clauses is a positive clause) can be described.

3. Search tree for unit-resolution: T_U-Res(Σ) is given by (Σ, I_U-Res, S_U-Res, ⊢_U-Res). Define I_U-Res = U(Σ) × Σ, with U(Σ) the set of unit clauses in Σ, and

   S_U-Res(Σ_1 × Σ_2, φ) = (Σ_1 ∪ {φ}) × (Σ_2 ∪ {φ}) if φ is a unit clause, and Σ_1 × (Σ_2 ∪ {φ}) otherwise.

For resolution, N-resolution and unit-resolution the search tree T_X-Res(Σ) has a remarkable property: if we leave all leaf nodes and all adjacent edges aside, we get a subtree of T_X-Res(Σ). Within this subtree all paths from the root to a leaf have the same length, i.e. the subtree is balanced. We call search trees with this property prebalanced trees.
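Definition 3.1 and Examples 1-3 can be read as a recursive tree builder: a node carries a label (Σ_1, Σ_2), and every ordered pair with a new resolvent spawns a son whose label is computed by the successor function of the chosen restriction. The sketch below is a purely illustrative rendering under the clause encoding used before; the exhaustive expansion is only feasible for very small formulas, and all identifiers are our own.

```python
from itertools import product

def resolvents(c1, c2):
    return {frozenset((c1 - {l}) | (c2 - {-l})) for l in c1 if -l in c2}

def s_res(s1, s2, phi):                       # Example 1: unrestricted resolution
    return s1 | {phi}, s2 | {phi}

def s_n_res(s1, s2, phi):                     # Example 2: N-resolution
    if all(l < 0 for l in phi):
        return s1 | {phi}, s2
    return s1, s2 | {phi}

def s_u_res(s1, s2, phi):                     # Example 3: unit-resolution
    if len(phi) <= 1:
        return s1 | {phi}, s2 | {phi}
    return s1, s2 | {phi}

def search_tree(label, successor):
    """Expand a node of T_X-Res(Σ): return (label, list of subtrees)."""
    s1, s2 = label
    if frozenset() in s1 | s2:                # empty clause in the label: leaf node
        return label, []
    sons = []
    for c1, c2 in product(s1, s2):
        for phi in resolvents(c1, c2):
            if phi not in s1 | s2:            # only new clauses give successor nodes
                sons.append(search_tree(successor(s1, s2, phi), successor))
    return label, sons

# Σ = {A, ¬A ∨ B, ¬B}; root label Σ × Σ as in Example 1.
sigma = frozenset({frozenset({1}), frozenset({-1, 2}), frozenset({-2})})
tree = search_tree((sigma, sigma), s_res)
print(len(tree[1]))   # 4: the new resolvents B and ¬A, each from two ordered parent pairs
```

Note that the leaf convention S(Σ_1 × Σ_2, □) = {(□, □)} is approximated here by simply adding □ to the label, which turns the node into a leaf at its next expansion.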

A bit more complicated is the representation of the search tree for linear resolution.

Example:

4. Search tree for linear resolution: T_lin-Res(Σ) is given by (Σ, I_lin-Res, S_lin-Res, ⊢_lin-Res). Define I_lin-Res = Σ × Σ as in the case of unrestricted resolution. The first part of the label of every other node contains the clause resolved last, the second part contains the initial clauses and all ancestor clauses of this resolution path:

   S_lin-Res(Σ_1 × Σ_2, φ) = {φ} × (Σ_2 ∪ {φ}).

Again we look at the subtree of T_X-Res(Σ) we obtain by leaving all leaf nodes and all adjacent edges aside. But this time the subtree in general has paths of different lengths. This has two reasons: firstly, the length of a derivation depends on the choice of the clauses for the first resolution step; secondly, in each further resolution step one of the clauses is already given, therefore the choice is limited.

4 Average Proof Length

A lot of work has been done on comparing proof systems with respect to the minimal proof length. But in practice some restrictions of the resolution, for which formulas with superpolynomial minimal proofs exist whereas unrestricted resolution has polynomial minimal proof length, are much more efficient than unrestricted resolution. Hence, in the following we will compare the average proof length for a class of restrictions of the resolution.

In order to motivate the definition of this class of restrictions let us consider the N-resolution again. If we have a derivation Σ ⊢_N-Res □, then we can extend the sequence of derived clauses by doing arbitrarily many resolution steps that are not necessary for the derivation of the empty clause. Therefore, we can deduce the closure N-Res(Σ) − {□} and finally generate the empty clause. That means for any node in the search tree T_N-Res(Σ) the path from the root to this node can be extended in such a way that along that path all clauses of N-Res(Σ) \ Σ are generated.

Definition 4.1 (monotone restrictions for resolution)
Let X-resolution be some resolution based proof procedure. We say the X-resolution is monotone if and only if for each Σ the following holds: let Σ ⊢_X-Res φ and let φ_1, ..., φ_n with φ_n = φ be the sequence of intermediate clauses of a derivation of φ. If there is a clause φ'_n ≠ φ_n such that φ_1, ..., φ_{n-1}, φ'_n are the intermediate clauses of another derivation via X-Res, then φ_1, ..., φ_{n-1}, φ'_n, φ_n also describe a possible derivation via X-Res.

Note that linear resolution is not monotone in this sense. For example, take the formula Σ = ¬A ∧ A ∧ (¬A ∨ C). Resolving the first two clauses results in the empty clause. But resolving the last two clauses results in the clause C, and no further linear resolution step is possible. Examples of monotone restrictions are unrestricted resolution, unit-resolution, input-resolution and other strategies based on sets of support like N-resolution and P-resolution. (Note that input-resolution results in linear proofs, but the restriction that one parent clause has to be from the initial clause set keeps monotony.)
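The non-monotonicity of linear resolution can be replayed mechanically on the formula Σ = ¬A ∧ A ∧ (¬A ∨ C): one branch refutes Σ immediately, while the other derives C and then gets stuck, because the last resolvent has to be a parent of the next step. A small sketch (same clause encoding as before; helper names are our own):

```python
def resolvents(c1, c2):
    return {frozenset((c1 - {l}) | (c2 - {-l})) for l in c1 if -l in c2}

def linear_steps(center, side_clauses):
    """All linear resolution steps that use `center` as the enforced parent clause."""
    return {r for c in side_clauses for r in resolvents(center, c)}

# Σ = ¬A ∧ A ∧ (¬A ∨ C), with A encoded as 1 and C as 2.
NOT_A, A, NOT_A_OR_C = frozenset({-1}), frozenset({1}), frozenset({-1, 2})
sigma = {NOT_A, A, NOT_A_OR_C}

# Branch 1: resolve ¬A with A first -> empty clause, refutation found.
print(resolvents(NOT_A, A))                      # {frozenset()}

# Branch 2: resolve A with (¬A ∨ C) first -> C; C has no complementary literal
# in Σ ∪ {C}, so no further linear resolution step is possible.
C = frozenset({2})
print(linear_steps(C, sigma | {C}))              # set(): the branch is stuck
```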

The average proof length of (X-)resolution applied to a formula Σ is the sum over the lengths of all refutations divided by the number of different refutations. In terms of the associated search tree T_(X-)Res(Σ), the average proof length is the average external path length of the tree. If the formula is unsatisfiable, then each leaf is the empty clause. Otherwise, in case of a monotone resolution restriction X-Res, a leaf is reached after generating the closure X-Res(Σ). We recall the definition of the average (external) path length of trees.

Definition 4.2 (average (external) path length)
Let T be a tree with root r and L(T) the set of leaves of the tree. The average external path length A(T) is defined as

   A(T) = ( Σ_{l ∈ L(T)} |p(r, l)| ) / #(L(T)),

where p(r, l) is the path from the root r to the leaf l, |p(r, l)| is the number of edges on this path and #(L(T)) is the number of leaves of T. The average path length A*(T) is defined as

   A*(T) = ( Σ_{i ∈ N(T)} |p(r, i)| ) / #(N(T)),

where N(T) is the set of nodes of T.

For monotone restrictions of the resolution we show that the average proof length is at most four times as large as the average proof length for unrestricted resolution. (This bound is not proven to be sharp.)

Theorem 4.3
1. For any satisfiable formula Σ and any restriction of the resolution, A(T_X-Res(Σ)) ≤ A(T_Res(Σ)) holds.
2. If the X-resolution is monotone and Σ is not satisfiable, then

   A(T_X-Res(Σ)) ≤ [ (#(X-Res(Σ)) − #(Σ)) / (#(Res(Σ)) − #(Σ)) ] · (4·A(T_Res(Σ)) − 1).

The fraction (#(X-Res(Σ)) − #(Σ)) / (#(Res(Σ)) − #(Σ)) is less than or equal to 1, because for each restriction of the resolution the size of the closure X-Res(Σ) is less than or equal to the size of Res(Σ). Therefore, we obtain A(T_X-Res(Σ)) ≤ 4·A(T_Res(Σ)). Often the closure X-Res(Σ) is much smaller than Res(Σ). If we could prove that on average the fraction (#(X-Res(Σ)) − #(Σ)) / (#(Res(Σ)) − #(Σ)) is less than 1/k for some k > 4, we would even obtain that on average A(T_X-Res(Σ)) < A(T_Res(Σ)) holds. That would mean X-resolution is better than resolution. But so far no result in this area is known.

Within the proof of theorem 4.3 we will make use of some general properties of trees. For that reason we first prove some properties with respect to the average (external) path length of trees. Note that a tree is balanced if and only if each leaf of T has the same depth.
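Definition 4.2 translates directly into code. The sketch below represents a tree as a dict mapping each node to the list of its sons (this representation and all names are our own) and computes A(T) and A*(T) by one traversal from the root.

```python
def depths(tree, root):
    """Map each node to its depth, i.e. |p(r, node)|, by a DFS from the root."""
    out, stack = {root: 0}, [root]
    while stack:
        node = stack.pop()
        for child in tree.get(node, []):
            out[child] = out[node] + 1
            stack.append(child)
    return out

def avg_external_path_length(tree, root):        # A(T): average over the leaves
    d = depths(tree, root)
    leaves = [n for n in d if not tree.get(n)]
    return sum(d[n] for n in leaves) / len(leaves)

def avg_path_length(tree, root):                 # A*(T): average over all nodes
    d = depths(tree, root)
    return sum(d.values()) / len(d)

# A root with two sons, one of which has a single son of its own.
t = {"r": ["a", "b"], "a": ["c"]}
print(avg_external_path_length(t, "r"))   # leaves b, c at depths 1 and 2 -> 1.5
print(avg_path_length(t, "r"))            # depths 0, 1, 1, 2 -> 1.0
```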

Lemma 4.4
For each balanced tree T, A(T) ≤ 2·A*(T) holds.

Proof: We prove the lemma by induction on the depth t of the tree and on the number n of sons of the root r. Note that for any balanced tree T with depth t the average external path length A(T) equals t. For a tree T with depth 1 and an arbitrary number n of sons of the root we obtain A*(T) = n/(n+1), and therefore A(T) ≤ 2·A*(T).

Now let a balanced tree with depth t be given. If the root has exactly one son, we omit the root node and the edge to the single successor node and obtain a balanced tree T' with depth t − 1. By the induction hypothesis we know

   t − 1 ≤ 2·A*(T') = 2·( Σ_{i ∈ N(T)} |p(r, i)| − #(N(T)) + 1 ) / ( #(N(T)) − 1 ).

Each path from the root to a node in T' corresponds to a path in T that is exactly one edge longer. In T we additionally have the path from the root of T to the root of T'. For the numbers of nodes in T and T' we have #(N(T)) − 1 = #(N(T')). So we have

   2·( Σ_{i ∈ N(T)} |p(r, i)| − #(N(T)) + 1 ) ≥ #(N(T))·t − t − #(N(T)) + 1.

Since t > 1 and #(N(T)) ≥ t + 1 we get

   2·Σ_{i ∈ N(T)} |p(r, i)| ≥ #(N(T))·t,

and division by #(N(T)) leads to the result 2·A*(T) ≥ t = A(T).

Now we suppose that the root of T has n > 1 sons while the depth of T is t. We consider T as a join of n trees T_1, ..., T_n of height t (T_i is given by the root node r, exactly one of the successor nodes s_i of r and the subtree under s_i), and hence we have

   2·A*(T) = 2·( Σ_{j=1}^{n} Σ_{i ∈ N(T_j)} |p(r, i)| ) / ( Σ_{j=1}^{n} #(N(T_j)) − (n − 1) ).

Note that the root node r was counted in each of T_1, ..., T_n. Applying the induction hypothesis to T_1, ..., T_n (2·Σ_{i ∈ N(T_j)} |p(r, i)| ≥ #(N(T_j))·t for 1 ≤ j ≤ n) implies

   2·Σ_{j=1}^{n} Σ_{i ∈ N(T_j)} |p(r, i)| ≥ t·Σ_{j=1}^{n} #(N(T_j)) − t·(n − 1),

and therefore 2·A*(T) ≥ t.

Since the above result is only proven for balanced trees, it can only be applied to resolution strategies with certain properties of the search tree.
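Lemma 4.4 can be spot-checked numerically, for example on complete b-ary trees, which are balanced. The following small sketch (our own, not part of the paper) compares A(T) = t with 2·A*(T) for a few such trees.

```python
from itertools import product

def complete_tree_depths(branching, depth):
    """Node depths of the complete `branching`-ary tree of the given depth."""
    return [d for d in range(depth + 1) for _ in range(branching ** d)]

for b, t in product([1, 2, 3], [1, 2, 3, 4]):
    ds = complete_tree_depths(b, t)
    a_ext = t                                   # balanced: every leaf has depth t
    a_star = sum(ds) / len(ds)                  # A*(T), averaged over all nodes
    assert a_ext <= 2 * a_star, (b, t)          # Lemma 4.4
print("Lemma 4.4 holds on these examples")
```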

Definition 4.5 (prebalanced tree, monotone tree, leaf degree)
A tree T is called prebalanced if and only if removing each leaf of T leads to a balanced tree. A tree T is called monotone if and only if for any inner node n' and any successor node n of n' it holds that n has at least k leaves as direct successors if n' has k leaves as direct successors. A tree T has leaf degree k if and only if each inner node has at most k leaves as direct successors and there is an inner node with k leaves as direct successors.

Let T be a tree with leaf degree r and k ≥ r. Then T^k is the tree we obtain by adding to each inner node of T with p leaves as direct successors k − p new leaves. That means every inner node of T^k has exactly k leaves as direct successors.

Lemma 4.6
For each monotone and prebalanced tree T with leaf degree less than or equal to k, A(T^k) ≤ 2·A(T) holds.

Proof: The average external path length of T^k is at most the depth t of T. Since T is prebalanced and monotone, the average external path length of T is at least t/2. This can be seen by a simple counting argument: there exists an injective mapping associating to each leaf of T with depth r < t/2 a leaf with depth at least t − r.
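The padding construction T^k of Definition 4.5 is straightforward to state in code: every inner node receives additional leaves until it has exactly k leaf sons. The sketch below (dict representation and names as in the earlier sketches, all our own) computes the leaf degree, builds T^k and compares the two average external path lengths of Lemma 4.6 on a small monotone, prebalanced example.

```python
def leaves_and_inner(tree):
    nodes = set(tree) | {c for cs in tree.values() for c in cs}
    inner = {n for n in nodes if tree.get(n)}
    return nodes - inner, inner

def leaf_degree(tree):
    """Largest number of leaf sons of any inner node (Definition 4.5)."""
    leaves, inner = leaves_and_inner(tree)
    return max(sum(c in leaves for c in tree[n]) for n in inner)

def pad(tree, k):
    """T^k: give every inner node exactly k leaf sons by adding fresh leaves."""
    leaves, inner = leaves_and_inner(tree)
    padded = {n: list(cs) for n, cs in tree.items()}
    for n in inner:
        missing = k - sum(c in leaves for c in tree[n])
        padded[n] += [f"{n}_pad{i}" for i in range(missing)]
    return padded

def avg_external(tree, root):
    depth, stack, total, count = {root: 0}, [root], 0, 0
    while stack:
        n = stack.pop()
        for c in tree.get(n, []):
            depth[c] = depth[n] + 1
            stack.append(c)
        if not tree.get(n):
            total, count = total + depth[n], count + 1
    return total / count

# A small monotone, prebalanced tree: the root has a leaf and an inner son,
# and the inner son has two leaves.
t = {"r": ["l1", "x"], "x": ["l2", "l3"]}
k = leaf_degree(t)                       # 2
tk = pad(t, k)
print(avg_external(tk, "r"), 2 * avg_external(t, "r"))   # Lemma 4.6: first ≤ second
```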

Proof (of theorem 4.3):

Ad 1: If Σ is satisfiable, then each path in the search tree T_Res(Σ) has the same length. Since the search tree T_X-Res(Σ) is a subtree of T_Res(Σ), we obtain the desired inequality.

Ad 2: Let an unsatisfiable formula Σ and a monotone restriction X-Res of the resolution be given. Note that Res itself is monotone. For this monotone restriction X-Res and the resolution Res we define some variants of the associated search trees T_(X-)Res(Σ).

1. T̄_(X-)Res(Σ) is the tree we get by removing each leaf from the tree T_(X-)Res(Σ).
2. T^t_(X-)Res(Σ) is obtained by removing each leaf with label □ and depth less than t, where t is the depth of the tree T_(X-)Res(Σ).
3. Let k be the number of complementary literals in the closure (X-)Res(Σ). Then T^k_(X-)Res(Σ) is the tree we obtain by adding to each inner node of T_(X-)Res(Σ) edges leading to new leaves with label □, such that each inner node has k leaves with label □.

Next we will show three propositions.

1. A(T_X-Res(Σ)) ≤ A(T^t_X-Res(Σ)) = [ (#(X-Res(Σ)) − #(Σ)) / (#(Res(Σ)) − #(Σ)) ] · A(T^t_Res(Σ))

   Since X-Res is monotone, the average external path length of T^t_X-Res(Σ) is #(X-Res(Σ)) − #(Σ). Analogously we see A(T^t_Res(Σ)) = #(Res(Σ)) − #(Σ). No leaf □ of T_X-Res(Σ) with a path from the root shorter than #(X-Res(Σ)) − #(Σ) occurs in the tree T^t_X-Res(Σ). Therefore, we obtain A(T_X-Res(Σ)) ≤ A(T^t_X-Res(Σ)).

2. A(T^k_Res(Σ)) ≤ 2·A(T_Res(Σ))

   The inequality follows from lemma 4.6, because T_Res(Σ) is monotone and prebalanced.

3. A(T^t_Res(Σ)) + 1 ≤ 2·A(T^k_Res(Σ))

   2·A(T^k_Res(Σ))
     = 2·( [Σ_{l ∈ L(T^k_Res(Σ))} |p(r, l)|] / #(L(T^k_Res(Σ))) − 1 ) + 2
     = 2·( [k · Σ_{i ∈ N(T̄_Res(Σ))} (|p(r, i)| + 1)] / [k · #(N(T̄_Res(Σ)))] − 1 ) + 2
     = 2·( [Σ_{i ∈ N(T̄_Res(Σ))} |p(r, i)|] / #(N(T̄_Res(Σ))) ) + 2
     ≥ [Σ_{l ∈ L(T̄_Res(Σ))} |p(r, l)|] / #(L(T̄_Res(Σ))) + 2        (T̄_Res(Σ) is a balanced tree, lemma 4.4)
     = (#(Res(Σ)) − #(Σ) − 1) + 2
     = #(Res(Σ)) − #(Σ) + 1
     = [Σ_{l ∈ L(T^t_Res(Σ))} |p(r, l)|] / #(L(T^t_Res(Σ))) + 1
     = A(T^t_Res(Σ)) + 1

From these inequalities we obtain

   A(T_X-Res(Σ)) ≤ [ (#(X-Res(Σ)) − #(Σ)) / (#(Res(Σ)) − #(Σ)) ] · A(T^t_Res(Σ))           (proposition 1)
                ≤ [ (#(X-Res(Σ)) − #(Σ)) / (#(Res(Σ)) − #(Σ)) ] · (2·A(T^k_Res(Σ)) − 1)    (because of proposition 3)
                ≤ [ (#(X-Res(Σ)) − #(Σ)) / (#(Res(Σ)) − #(Σ)) ] · (4·A(T_Res(Σ)) − 1)      (because of proposition 2)

5 Space Requirements

The results of this section were originally presented in [7]. But in the context of search trees we get a clear notion of the basic idea, which had to be expressed by complicated definitions in [7]. In the previous sections we have considered resolution operations adding the resolvent to the formula. This results in a stepwise increase of the size of the formula. But in practice often some clauses will be removed during the deduction in order to reduce the search space. A well-known example is the subsumption rule, which says that a clause π can be omitted if we have a clause σ with σ ⊆ π (as a new resolvent). The degree of each node in the search tree T_(X-)Res(Σ) is strongly connected to the number of clauses in the labels (formulas), because the number of pairs of clauses which can be resolved is the number of sons of the node. Besides the various resolution restrictions we can demand that the length of these formulas is bounded. A reasonable measure for this space requirement is the number of clauses we actually have to store during a deduction.

Definition 5.1 (space bounded resolution)
Let a function f: CNF → ℕ be given. For a formula Σ in CNF and a clause φ ∉ Σ we define the space bounded resolution with bound f (SB(f)-resolution, ⊢^f_Res), using the notation of definition 2.1, by: Σ ⊢^f_Res φ if and only if there is a Res derivation with intermediate clauses φ_1, ..., φ_n and there are formulas Σ_0 = Σ, Σ_1, ..., Σ_n such that φ_{i_1}, φ_{i_2} ∈ Σ_{i-1}, Σ_i ⊆ Σ_{i-1} ∪ {φ_i} and |Σ_i| ≤ f(|Σ|) hold.

The above definition says that after resolving two clauses, we keep the new resolvent and can remove one or more clauses. If we reach the space bound f(|Σ|), we have to remove at least one of the old clauses. So we have a restricted number of clauses we can actually store during a deduction. The formulas Σ_i represent the stored clauses. In terms of the search tree, a SB(f)-resolution refutation is a path from the root to a node □ where each node is labeled by Σ_1 × Σ_2 with |Σ_1 ∪ Σ_2| ≤ f(|Σ|).

For example, in the case of unit-resolution in each resolution step L, (¬L ∨ π) ⊢ π the parent clause (¬L ∨ π) can be replaced by π without changing satisfiability. Hence, each unit-resolution refutation can be shortened to get a SB(k)-resolution refutation, where k is the number of clauses of the initial formula. Another example is input-resolution. For input-resolution the root of the search tree is labeled with Σ × Σ. All of the remaining inner nodes of the search tree have a label Σ × {φ}, where φ is the resolvent of a pair of clauses chosen from the sets of the label of the predecessor. That means the actual resolvent is stored and the previous resolvent is removed. Obviously, the size of the labels other than the root label is |Σ| + 1.

It is well known that unit-resolution and input-resolution are not complete. For SB(f)-resolution (together with strategies such as N-resolution) incompleteness may also be the result of the lack of storage. This leads immediately to the question for which functions f the SB(f)-resolution remains complete. Let k be the number of clauses of the initial formula. Then there exist unsatisfiable formulas for which no SB(k)-resolution refutation exists. But for each unsatisfiable formula Σ we can construct a resolution refutation with space bound f(x) = 2x, i.e. using at most 2·|Σ| clauses [7]. It would be of interest to study different restrictions of resolution with respect to their space requirements.

6 Conclusion

Our definition of search trees as a concept to compare the sizes of search spaces seems to be useful for at least monotone restrictions of resolution. Further research is needed to classify other restrictions, e.g. linear resolution. We also want to investigate whether our concept is appropriate to compare essentially different proof procedures as well, e.g. the resolution calculus and tableau methods. In order to strengthen the presented results it would also be interesting to investigate the average size of the closure X-Res(Σ) for different strategies.

References

[1] C.-L. Chang, R. C.-T. Lee: Symbolic Logic and Mechanical Theorem Proving, Academic Press (1973)
[2] Z. Galil: On the Complexity of Regular Resolution and the Davis-Putnam Procedure, Theoretical Computer Science 4 (1977), pp. 23-46
[3] A. Goerdt: Unrestricted Resolution versus N-Resolution, Theoretical Computer Science 93 (1992), pp. 159-167
[4] A. Goerdt: Davis-Putnam Resolution versus Unrestricted Resolution, Annals of Mathematics and Artificial Intelligence 6 (1992), pp. 169-184
[5] A. Goerdt: Regular Resolution versus Unrestricted Resolution, Proc. GWAI (1990), Fachberichte Informatik; also to appear in SIAM Journal on Computing
[6] A. Haken: The Intractability of Resolution, Theoretical Computer Science 39 (1985), pp. 297-308
[7] H. Kleine Büning: Minimal Space Requirement for Resolution, submitted for publication
[8] D. W. Loveland: Automated Theorem Proving: A Logical Basis, North-Holland (1978)
[9] G. S. Tseitin: On the Complexity of Derivations in Propositional Calculus, in A. O. Slisenko (Ed.): Studies in Constructive Mathematics and Mathematical Logic, Part II (1970), pp. 115-125
[10] A. Urquhart: Hard Examples for Resolution, Journal of the ACM 34 (1987), pp. 209-219
[11] L. Wos, R. Overbeek, E. Lusk, J. Boyle: Automated Reasoning: Introduction and Applications, Prentice Hall (1984)

6 Conclusion Our denition of search trees as a concept to compare the sizes of search spaces seems to be useful for at least monotone restrictions of resolution. Further research is needed to classify other restrictions, e.g. linear resolution. We also want to investigate whether our concept is appropriate to compare also essentially dierent proof procedures, e.g. resolution calculus and tableau methods. In order to strengthen the presented results it would also be interesting to investigate into the average size of closure X? Res () for dierent strategies. References [] C.-L. Chang, R. C.-T. Lee: Symbolic Logic and mechanical theorem proving, Academic ress (973) [2] Z. Galil: On the Complexity of Regular Resolution and the Davis{utnam rocedure, Theoretical Computer Science 4 (977), pp. 23{46 [3] A. Goerdt: Unrestricted resolution versus N-resolution, Theoretical Computer Science 93 (992) 59 { 67 [4] A. Goerdt: Davis-utnam resolution versus unrestricted resolution, Annals of Math. and AI 6 (992) 69 { 84 [5] A. Goerdt: Regular resolution versus unrestricted resolution, roc. GWAI (990), Fachberichte Informatik; also to appear in SIAM Journ. of Comp. [6] A. Haken: The Intractability of Resolution, Theoretical Computer Science 39 (985), pp. 297-308 [7] H. Kleine Buning: Minimal Space Requirement for Resolution, submitted for publication [8] D. W. Loveland: Automated Theorem roving: A Logical Basis, North Holland (978) [9] G. S. Tseitin: On the Complexity of Derivations in ropositional Calculus, in A. O. Silenko (Ed.): Studies in Constructive Mathematics and Mathematical Logic, art II (970), pp. 5{ 25 [0] A. Urquhart: Hard Examples for Resolution, Journal of the ACM 34 (987), pp. 209{29 [] L. Wos, R. Overbeek, E. Lusk, J. Boyle: Automated Reasoning: Introduction and Applications, rentice Hall (984)