The Propositional Complexity of First-Order Theorem Proving Strategies

Chapter in the book: The Efficiency of Theorem Proving Strategies

Part of the book series: Computational Intelligence (CI)


Abstract

The efficiency of a theorem prover is more directly influenced by the total number of inferences performed before a proof is found than by the size of the final proof. In general, in the field of automated deduction for full first-order logic, there has been a great deal of attention devoted to the completeness of strategies but little to their efficiency, in the sense of the total work expended in the search for a proof. The main efficiency considerations to date have to do with the times needed by particular implementations to find proofs of particular example theorems, or with the efficiencies of decision procedures for specialized theories. Of course, there has also been work on the efficiencies of low-level operations employed by theorem provers (such as unification). It is informative (and fun) to evaluate a prover by running it on a series of examples, but this could well be supplemented by analytical results. To this end, a theoretical study would be useful. It would be nice to know something about the behaviors of proposed new strategies without having to read and understand papers about them or having to run them on examples. Theoretical measures of search space size would permit this. Such measures would also make it easier to weed out bad strategies early and would stimulate the development of good ones. There is more at issue than just a quantitative measure of performance — analytical measures reveal something about how a strategy works, and how it does subgoaling. This gives some insight into the strategy. A theoretical approach could also help to pinpoint problem areas and weaknesses in a method and lead to improvements. In general, theory does not replace experiment but it does supplement it, and provides insights that might otherwise be missed. Theory tends to make general statements and to be machine-independent, whereas experiment tends to deal in specifics and to be machine-dependent. This paper is an attempt to initiate (or further) a theory of the search efficiency of automated theorem proving.

Many shall run to and fro, and knowledge shall be increased. Dan. 12:4


Authors

D. A. Plaisted and Y. Zhu
Copyright information

© 1997 Friedr. Vieweg & Sohn Verlagsgesellschaft mbH, Braunschweig/Wiesbaden

About this chapter

Cite this chapter

Plaisted, D.A., Zhu, Y. (1997). The Propositional Complexity of First-Order Theorem Proving Strategies. In: The Efficiency of Theorem Proving Strategies. Computational Intelligence. Vieweg+Teubner Verlag. https://doi.org/10.1007/978-3-322-93862-6_1

  • DOI: https://doi.org/10.1007/978-3-322-93862-6_1

  • Publisher Name: Vieweg+Teubner Verlag

  • Print ISBN: 978-3-528-05574-5

  • Online ISBN: 978-3-322-93862-6
