AI Algorithms, Data Structures, and
Idioms in Prolog, Lisp, and Java

George F. Luger
William A. Stubblefield



Executive Editor Michael Hirsch
Acquisitions Editor Matt Goldstein
Editorial Assistant Sarah Milmore
Managing Editor Jeff Holcomb
Digital Assets Manager Marianne Groth
Senior Media Producer Bethany Tidd
Marketing Manager Erin Davis
Senior Author Support/Technology Specialist Joe Vetere
Senior Manufacturing Buyer Carol Melville
Text Design, Composition, and Illustrations George F. Luger
Cover Design Barbara Atkinson
Cover Image © Tom Barrow

Many of the designations used by manufacturers and sellers to distinguish their products are claimed as
trademarks. Where those designations appear in this book, and Addison-Wesley was aware of a
trademark claim, the designations have been printed in initial caps or all caps.

Copyright © 2009 Pearson Education, Inc. All rights reserved. No part of this publication may be
reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic,
mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher.
Printed in the United States of America. For information on obtaining permission for use of material in this
work, please submit a written request to Pearson Education, Inc., Rights and Contracts Department, 501
Boylston Street, Suite 900, Boston, MA 02116, fax (617) 671-3447, or online at
http://www.pearsoned.com/legal/permissions.htm.

ISBN-13: 978-0-13-607047-4
ISBN-10: 0-13-607047-7

1 2 3 4 5 6 7 8 9 10—OPM—12 11 10 09 08



Contents
Preface ix

Part I Language Idioms and the Master Programmer 1


Chapter 1 Idioms, Patterns, and Programming 3
1.1 Introduction: Idioms and Patterns 3
1.2 Selected Examples of Language Idioms 6
1.3 A Brief History of Three Programming Paradigms 11
1.4 A Summary of Our Task 15

Part II Programming in Prolog 17


Chapter 2 Prolog: Representation 19
2.1 Introduction: Logic-Based Representation 19
2.2 Prolog Syntax 20
2.3 Creating, Changing, and Tracing a Prolog Computation 24
2.4 Lists and Recursion in Prolog 25
2.5 Structured Representation and Inheritance Search 28
Exercises 32

Chapter 3 Abstract Data Types and Search 33


3.1 Introduction 33
3.2 Using cut to Control Search in Prolog 36
3.3 Abstract Data Types (ADTs) in Prolog 38
Exercises 42

Chapter 4 Depth-, Breadth-, and Best-First Search 43


4.1 Production System Search in Prolog 43
4.2 A Production System Solution of the FWGC Problem 46
4.3 Designing Alternative Search Strategies 52
Exercises 58

Chapter 5 Meta-Linguistic Abstraction, Types, and Meta-Interpreters 59


5.1 Meta-Interpreters, Types, and Unification 59
5.2 Types in Prolog 61
5.3 Unification, Variable Binding, and Evaluation 64
Exercises 68

Chapter 6 Three Meta-Interpreters: Prolog in Prolog, EXSHELL, and a Planner 69
6.1 An Introduction to Meta-Interpreters: Prolog in Prolog 69
6.2 A Shell for a Rule-Based System 73
6.3 A Prolog Planner 82
Exercises 85

Chapter 7 Machine Learning Algorithms in Prolog 87


7.1 Machine Learning: Version Space Search 87
7.2 Explanation Based Learning in Prolog 100
Exercises 106

Chapter 8 Natural Language Processing in Prolog 107


8.1 Natural Language Understanding 107
8.2 Prolog Based Semantic Representation 108
8.3 A Context-Free Parser in Prolog 111
8.4 Probabilistic Parsers in Prolog 114
8.5 A Context-Sensitive Parser in Prolog 119
8.6 A Recursive Descent Semantic Net Parser 120
Exercises 123

Chapter 9 Dynamic Programming and the Earley Parser 125


9.1 Dynamic Programming Revisited 125
9.2 The Earley Parser 126
9.3 The Earley Parser in Prolog 134
Exercises 139

Chapter 10 Prolog: Final Thoughts 141


10.1 Towards a Procedural Semantics 141
10.2 Prolog and Automated Reasoning 144
10.3 Prolog Idioms, Extensions, and References 145

Part III Programming in Lisp 149


Chapter 11 S-Expressions, the Syntax of Lisp 151
11.1 Introduction to Symbol Expressions 151
11.2 Control of Lisp Evaluation 154
11.3 Programming in Lisp: Creating New Functions 156
11.4 Program Control: Conditionals and Predicates 157
Exercises 160


Chapter 12 Lists and Recursive Search 161

12.1 Functions, Lists, and Symbolic Computing 161


12.2 Lists as Recursive Structures 163
12.3 Nested Lists, Structure, and car/cdr Recursion 166
Exercises 168

Chapter 13 Variables, Datatypes, and Search 171


13.1 Variables and Datatypes 171
13.2 Search: The Farmer, Wolf, Goat, and Cabbage Problem 177
Exercises 182

Chapter 14 Higher-Order Functions and Flexible Search 185


14.1 Higher-Order Functions and Abstraction 185
14.2 Search Strategies in Lisp 189
Exercises 193

Chapter 15 Unification and Embedded Languages in Lisp 195


15.1 Introduction 195
15.2 Interpreters and Embedded Languages 203
Exercises 205

Chapter 16 Logic Programming in Lisp 207


16.1 A Simple Logic Programming Language 207
16.2 Streams and Stream Processing 209
16.3 A Stream-Based Logic Programming Interpreter 211
Exercises 217

Chapter 17 Lisp-shell: An Expert System Shell in Lisp 219


17.1 Streams and Delayed Evaluation 219
17.2 An Expert System Shell in Lisp 223
Exercises 232

Chapter 18 Semantic Networks, Inheritance, and CLOS 233


18.1 Semantic Nets and Inheritance in Lisp 233
18.2 Object-Oriented Programming Using CLOS 237
18.3 CLOS Example: A Thermostat Simulation 244
Exercises 250

Chapter 19 Machine Learning in Lisp 251


19.1 Learning: The ID3 Algorithm 251
19.2 Implementing ID3 259


Exercises 266

Chapter 20 Lisp: Final Thoughts 267

Part IV Programming in Java 269


Chapter 21 Java, Representation and Object-Oriented Programming 273
21.1 Introduction to O-O Representation and Design 273
21.2 Object Orientation 274
21.3 Classes and Encapsulation 275
21.4 Polymorphism 276
21.5 Inheritance 277
21.6 Interfaces 280
21.7 Scoping and Access 282
21.8 The Java Standard Library 283
21.9 Conclusions: Design in Java 284
Exercises 285

Chapter 22 Problem Spaces and Search 287


22.1 Abstraction and Generality in Java 287
22.2 Search Algorithms 288
22.3 Abstracting Problem States 292
22.4 Traversing the Solution Space 295
22.5 Putting the Framework to Use 298
Exercises 303

Chapter 23 Java Representation for Predicate Calculus and Unification 305


23.1 Introduction to the Task 305
23.2 A Review of the Predicate Calculus and Unification 307
23.3 Building a Predicate Calculus Problem Solver in Java 310
23.4 Design Discussion 320
23.5 Conclusions: Mapping Logic into Objects 322
Exercises 323

Chapter 24 A Logic-Based Reasoning System 325


24.1 Introduction 325
24.2 Reasoning in Logic as Searching an And/Or Graph 325
24.3 The Design of a Logic-Based Reasoning System 329
24.4 Implementing Complex Logic Expressions 330
24.5 Logic-Based Reasoning as And/Or Graph Search 335
24.6 Testing the Reasoning System 346


24.7 Design Discussion 348


Exercises 350

Chapter 25 An Expert System Shell 351


25.1 Introduction: Expert Systems 351
25.2 Certainty Factors and the Unification Problem Solver 352
25.3 Adding User Interactions 358
25.4 Design Discussion 360
Exercises 361

Chapter 26 Case Studies: JESS and other Expert System Shells in Java 363
26.1 Introduction 363
26.2 JESS 363
26.3 Other Expert System Shells 364
26.4 Using Open Source Tools 365

Chapter 27 ID3: Learning from Examples 367


27.1 Introduction to Supervised Learning 367
27.2 Representing Knowledge as Decision Trees 367
27.3 A Decision Tree Induction Program 370
27.4 ID3: An Information Theoretic Tree Induction Algorithm 385
Exercises 388

Chapter 28 Genetic and Evolutionary Computing 389


28.1 Introduction 389
28.2 The Genetic Algorithm: A First Pass 389
28.3 A GA Implementation in Java 393
28.4 Conclusion: Complex Problem Solving and Adaptation 401
Exercises 401

Chapter 29 Case Studies: Java Machine Learning Software Available on the Web 403
29.1 Java Machine Learning Software 403

Chapter 30 The Earley Parser: Dynamic Programming in Java 405


30.1 Chart Parsing 405
30.2 The Earley Parser: Components 406
30.3 The Earley Parser: Java Code 408
30.4 The Completed Parser 414
30.5 Generating Parse Trees from Charts and Grammar Rules 419
Exercises 422


Chapter 31 Case Studies: Java Natural Language Tools on the Web 423
31.1 Java Natural Language Processing Software 423
31.2 LingPipe from the University of Pennsylvania 423
31.3 The Stanford Natural Language Processing Group Software 425
31.4 Sun’s Speech API 426

Part V Model Building and the Master Programmer 429

Chapter 32 Conclusion: The Master Programmer 431


32.1 Paradigm-Based Abstractions and Idioms 431
32.2 Programming as a Tool for Exploring Problem Domains 433
32.3 Programming as a Social Activity 434
32.4 Final Thoughts 437

Bibliography 439

Index 443



Preface
What we have to learn to do, we learn by doing…

- Aristotle, Ethics

Why Another Programming Language Book?

Writing a book about designing and implementing representations and
search algorithms in Prolog, Lisp, and Java presents the authors with a
number of exciting opportunities.
The first opportunity is the chance to compare three languages that give
very different expression to the many ideas that have shaped the evolution
of programming languages as a whole. These core ideas, which also
support modern AI technology, include functional programming, list
processing, predicate logic, declarative representation, dynamic binding,
meta-linguistic abstraction, strong-typing, meta-circular definition, and
object-oriented design and programming. Lisp and Prolog are, of course,
widely recognized for their contributions to the evolution, theory, and
practice of programming language design. Java, the youngest of this trio, is
both an example of how the ideas pioneered in these earlier languages
have shaped modern applicative programming, as well as a powerful tool
for delivering AI applications on personal computers, local networks, and
the world wide web.
The second opportunity this book affords is a chance to look at Artificial
Intelligence from the point of view of the craft of programming. Although
we sometimes are tempted to think of AI as a theoretical position on the
nature of intelligent activity, the complexity of the problems AI addresses
has made it a primary driver of progress in programming languages,
development environments, and software engineering methods. Both Lisp
and Prolog originated expressly as tools to address the demands of
symbolic computing. Java draws on object-orientation and other ideas that
can trace their roots back to AI programming. What is more important, AI
has done much to shape our thinking about program organization, data
structures, knowledge representation, and other elements of the software
craft. Anyone who understands how to give a simple, elegant formulation
to unification-based pattern matching, logical inference, machine learning
theories, and the other algorithms discussed in this book has taken a large
step toward becoming a master programmer.
The book’s third, and in a sense, unifying focus lies at the intersection of
these points of view: how does a programming language’s formal structure
interact with the demands of the art and practice of programming to
create the idioms that define its accepted use? By idiom, we mean a set of
conventionally accepted patterns for using the language in practice.
Although not the only way of using a language, an idiom defines patterns
of use that have proven effective, and constitute a common understanding
among programmers of how to use the language. Programming language
idioms do much to both enable, as well as support, ongoing
communication and collaboration between programmers.
These, then, are the three points of view that shape our discussion of AI
programming. It is our hope that they will help to make this book more
than a practical guide to advanced programming techniques (although it is
certainly that). We hope that they will communicate the intellectual depth
and pleasure that we have found in mastering a programming language
and using it to create elegant and powerful computer programs.
The Design of this Book

There are five sections of this book. The first, made up of a single chapter,
lays the conceptual groundwork for the sections that follow. This first
chapter provides a general introduction to programming languages and
style, and asks questions such as “What is a master programmer?”, “What is a
programming language idiom?”, and “How are identical design patterns
implemented in different languages?” Next, we introduce a number of
design patterns specific to supporting data structures and search strategies
for complex problem solving. These patterns are discussed in a “language
neutral” context, with pointers to the specifics of the individual
programming paradigms presented in the subsequent sections of our
book. The first chapter ends with a short historical overview of the
evolution of the logic-based, functional, and object-oriented approaches to
computer programming languages.
Part II of this book presents Prolog. For readers that know the rudiments
of first-order predicate logic, the chapters of Part II can be seen as a
tutorial introduction to Prolog, the language for programming in logic.
For readers lacking any knowledge of the propositional and predicate
calculi we recommend reviewing an introductory textbook on logic.
Alternatively, Luger (2005, Chapter 2) presents a full introduction to both
the propositional and predicate logics. The Luger introduction includes a
discussion, as well as a pseudo code implementation, of unification, the
pattern-matching algorithm at the heart of the Prolog engine.
The design patterns that make up Part II begin with the “flat” logic-based
representation for facts, rules, and goals that one might expect in any
relational data base formalism. We next show how recursion, supported by
unification-based pattern matching, provides a natural design pattern for
tree and graph search algorithms. We then build a series of abstract data
types, including sets, stacks, queues, and priority queues that support
patterns for search. These are, of course, abstract structures, crafted for
the specifics of the logic-programming environment that can search across
state spaces of arbitrary content and complexity. We then build and
demonstrate the “production system” design pattern that supports rule
based programming, planning, and a large number of other AI
technologies. Next, we present structured representations, including
semantic networks and frame systems in Prolog and demonstrate
techniques for implementing single and multiple inheritance
representation and search. Finally, we show how the Prolog design
patterns presented in Part II can support the tasks of machine learning
and natural language understanding.
Lisp and functional programming make up Part III. Again, we present the
material on Lisp in the form of a tutorial introduction. Thus, a
programmer with little or no experience in Lisp is gradually introduced to
the critical data structures and search algorithms of Lisp that support
symbolic computing. We begin with the (recursive) definition of symbol-
expressions, the basic components of the Lisp language. Next we present
the “assembly instructions” for symbol expressions, including car, cdr, and
cons. We then assemble new patterns for Lisp with cond and defun.
Finally, we demonstrate the creation and/or evaluation of symbol
expressions with quote and eval. Of course, the ongoing discussion of
variables, binding, scope, and closures is critical to building more complex
design patterns in Lisp.
Once the preliminary tools and techniques for Lisp are presented, we
describe and construct many of the design patterns seen earlier in the
Prolog section. These include patterns supporting breadth-first, depth-
first, and best-first search as well as meta-interpreters for rule-based
systems and planning. We build and demonstrate a recursion-based
unification algorithm that supports a logic interpreter in Lisp as well as a
stream processor with delayed evaluation for handling potentially infinite
structures. We next present data structures for building semantic networks
and object systems. We then present the Common Lisp Object system
(CLOS) libraries for building object and inheritance based design patterns.
We close Part III by building design patterns that support decision-tree
based machine learning.
Java and its idioms are presented in Part IV. Because of the complexities
of the Java language, Part IV is not presented as a tutorial introduction to
the language itself. It is expected that the reader has completed at least an
introductory course in Java programming, or at the very least, has seen
object-oriented programming in another applicative language such as
C++, C#, or Objective C. But once we can assume a basic understanding
of Java tools, we do provide a tutorial introduction to many of the design
patterns of the language.
The first chapter of Part IV, after a brief overview of the origins of Java,
goes through many of the features of an object-oriented language that will
support the creation of design patterns in that environment. These
features include the fundamental data structuring philosophy of
encapsulation, polymorphism, and inheritance. Based on these concepts
we briefly address the analysis, iterative design, programming and test
phases for engineering programs. After the introductory chapter we begin
pattern building in Java, first considering the representation issue and how
to represent predicate calculus structures in Java. This leads to building
patterns that support breadth-first, depth-first, and best-first search. Based
on patterns for search, we build a production system, a pattern that
supports the rule-based expert system. Our further design patterns
support the application areas of natural language processing and machine
learning. An important strength that Java offers, again because of its
object-orientation and modularity is the use of public domain (and other)
libraries available on the web. We include in the Java section a number of
web-supported AI algorithms, including tools supporting work in natural
language, genetic and evolutionary programming (a-life), natural language
understanding, and machine learning (WEKA).
The final component of the book, Part V, brings together many of the
design patterns introduced in the earlier sections. It also allows the authors
to reinforce many of the common themes that are, of necessity,
distributed across the various components of the presentation. We
conclude with general comments supporting the craft of programming.
Using this Book

This book is designed for three primary purposes. The first is as a
programming language component of a general class in Artificial
Intelligence. From this viewpoint, the authors see as essential that the AI
student build the significant algorithms that support the practice of AI.
This book is designed to present exactly these algorithms. However, in the
normal lecture/lab approach taken to teaching Artificial Intelligence at the
University level, we have often found that it is difficult to cover more than
one language per quarter or semester course. Therefore we expect that the
various parts of this material, those dedicated to either Lisp, Prolog, or
Java, would be used individually to support programming the data
structures and algorithms presented in the AI course itself. In a more
advanced course in AI it would be expected that the class cover more than
one of these programming paradigms.
The second use of this book is for university classes exploring
programming paradigms themselves. Many modern computer science
departments offer a final year course in comparative programming
environments. The three languages covered in our book offer excellent
examples on these paradigms. We also feel that a paradigms course should
not be based on a rapid survey of a large number of languages while doing
a few “finger exercises” in each. Our philosophy for a paradigms course is
to get the student more deeply involved in fewer languages, with these
typically representing the declarative, functional, and object-oriented
approaches to programming. We also feel that the study of idiom and
design patterns in different environments can greatly expand the skill set
of the graduating student. Thus, our philosophy of programming is built
around the language idioms and design patterns presented in Part I and
summarized in Part V. We see these as an exciting opportunity for
students to appreciate the wealth and diversity of modern computing
environments. We feel this book offers exactly this opportunity.
The third intent of this book is to offer the professional programmer the
chance to continue their education through the exploration of multiple
programming idioms, patterns, and paradigms. For these readers we also
feel the discussion of programming idioms and design patterns presented
throughout our book is important. We are all struggling to achieve the
status of the master programmer.
We have built each chapter in this book to reflect the materials that would
be covered in either one or two classroom lectures, or about an hour’s
effort, if the reader is going through this material by herself. There are a
small number of exercises at the end of most chapters that may be used to
reinforce the main concepts of that chapter. There is also, near the end of
each chapter, a summary statement of the core concepts covered.
Acknowledgments

First, we must thank several decades of students and colleagues at the
University of New Mexico. These friends not only suggested, helped
design, and tested our algorithms but have also challenged us to make
them better.
Second, we owe particular thanks to colleagues who wrote algorithms and
early drafts of chapters. These include Stan Lee (PhD student at UNM)
for the Prolog chapter on Earley parsing, Breanna Ammons (MS in CS at
UNM) for the Java version of the Earley parser and, along with Robert
Spurlock (CS undergraduate at UNM), the web-based NLP chapter, Paul
DePalma (Professor of CS at Gonzaga University) for the Java Genetic
Algorithms chapter, and Chayan Chakrabarti (MS in CS at UNM) for the
web-based machine learning chapter in Java.
Third, there are several professional colleagues to whom we owe particular
debts. These include David MacQueen, University of Chicago, one of the
creators of SML, Manuel Hermenegildo, The Prince of Asturias Endowed
Chair of Computer Science at UNM and a designer of Ciao Prolog, Paul
De Palma, Professor of Computer Science at Gonzaga University, and
Alejandro Cdebaca, our friend and former student, who helped design
many of the algorithms of the Java chapters.
Fourth, we thank our friends at Pearson Education who have supported
our various creative writing activities over the past two decades. We
especially acknowledge our editors Alan Apt, Karen Mossman, Keith
Mansfield, Owen Knight, Simon Plumtree, and Matt Goldstein, along with
their various associate editors, proof readers, and web support personnel.
We also thank our wives, children, family, and friends; all those that have
made our lives not just survivable, but intellectually stimulating and
enjoyable.
Finally, to our readers, we salute you: the art, science, and practice of
programming is great fun; enjoy it!

GL
BS
July 2008
Albuquerque



PART I: Language Idioms and the Master Programmer

all good things - trout as well as eternal salvation - come by grace and grace comes by art and art does not
come easy…

- Norman Maclean, (1989) A River Runs Through It

Language and Idioms

In defining a programming language idiom, an analogy with natural
language use might help. If I ask a friend, “Do you know what time it is?”
or equivalently “Do you have a watch?”, I would be surprised if she simply
said “yes” and turned away. These particular forms for asking someone for
the time of day are idiomatic in that they carry a meaning beyond their
literal interpretation. Similarly, a programming language idiom consists of
those patterns of use that good programmers accept as elegant, expressive
of their design intent, and that best harness the language’s power. Good
idiomatic style tends to be specific to a given language or language
paradigm: the way experienced programmers organize a Prolog program
would not constitute accepted Java style.
Language idioms serve two roles. The first is to enhance communication
between programmers. As experienced programmers know, we do not
simply write code for a compiler; we also write it for each other. Writing in
a standard idiom makes it easier for other people to understand our intent,
and to maintain and/or extend our code. Second, a language’s idiom helps
us to make sure we fully use the power the language designers have
afforded us. People design a language with certain programming styles in
mind. In the case of Java, that style was object-oriented programming, and
getting full benefit of such Java features as inheritance, scoping, automatic
garbage collection, exception handling, type checking, packages, interfaces,
and so forth requires writing in an object-oriented idiom. A primary goal of
this book is to explore and give examples of good idioms in three diverse
language paradigms: the declarative (logic-based), functional, and object-
oriented.
The Master Programmer

The goal of this book is to develop the idea and describe the practice of
the master programmer. This phrase carries a decidedly working class
connotation, suggesting the kind of knowledge and effort that comes
through long practice and the transmission of tools and skills from master
to student through the musty rituals of apprenticeship. It certainly suggests
something beyond the immaculate formalization that we generally associate
with scientific disciplines. Indeed, most computer science curricula
downplay this craft of programming, favoring discussions of computability
and complexity, algorithms, data structures, and the software engineer’s
formalist longings. In reality, the idea of programming as a craft that
demands skill and dedication is widely accepted in practical circles. Few
major successful programming projects have existed that did not owe
significant components of that success to the craftsmanship of such
individuals.
But, what then, do master programmers know?
The foundation of a master programmer’s knowledge is a strong
understanding of the core domains of computer science. Although working
programmers may not spend much (or any) time developing and
publishing theorems, they almost always have a deep, intuitive grasp of
algorithms, data structures, logic, complexity, and other aspects of the
theory of formal systems. We could compare this to a master welder’s
understanding of metallurgy: she may not have a theoretician’s grasp of
metallic crystalline structure, but her welds do not crack. This book
presumes a strong grounding in these computer science disciplines.
Master programmers also tend to be language fanatics, exhibiting a fluency
in several programming languages, and an active interest in anything new
and unusual. We hope that our discussion of three major languages will
appeal to the craftsman’s fascination with their various tools and
techniques. We also hope that, by contrasting these three major languages
in a sort of “comparative language” discussion, we will help programmers
refine their understanding of what a language can provide, and the needs
that continue to drive the evolution of programming languages.



1 Idioms, Patterns, and Programming

Chapter Objectives

This chapter introduces the ideas that we use to organize our thinking about
languages and how they shape the design and implementation of programs.
These are the ideas of language, idiom, and design pattern.

Chapter Contents

1.1 Introduction
1.2 Selected Examples of AI Language Idioms
1.3 A Brief History of Three Programming Paradigms
1.4 A Summary of Our Task

1.1 Introduction
Idioms and Patterns

As with any craft, programming contains an undeniable element of
experience. We achieve mastery through long practice in solving the
problems that inevitably arise in trying to apply technology to actual
problem situations. In writing a book that examines the implementation of
major AI algorithms in a trio of languages, we hope to support the reader’s
own experience, much as a book of musical etudes helps a young musician
with their own exploration and development.
As important as computational theory, tools, and experience are to a
programmer’s growth, there is another kind of knowledge that they only
suggest. This knowledge comes in the form of pattern languages and
idioms, and it forms a major focus of this book. The idea of pattern
languages originated in architecture (Alexander et al. 1977) as a way of
formalizing the knowledge an architect brings to the design of buildings
and cities that will both support and enhance the lives of their residents. In
recent years, the idea of pattern languages has swept the literature on
software design (Gamma, et al. 1995; Coplein & Schmidt 1995; Evans
2003), as a way of capturing a master’s knowledge of good, robust program
structure.
A design pattern describes a typical design problem, and outlines an
approach to its solution. A pattern language consists of a collection of
related design patterns. In the book that first proposed the use of pattern
languages in architecture, Christopher Alexander et al. (1977, page x) state
that a pattern
describes a problem which occurs over and over again in our environment, and
then describes the core of the solution to that problem, in such a way that you
can use this solution a million times over, without ever doing it the same way
twice.
Design patterns capture and communicate a form of knowledge that is
essential to creating computer programs that users will embrace, and that
programmers will find to be elegant, logical, and maintainable. They
address programming and languages, not in terms of Turing completeness,
language paradigms, compiler semantics, or any of the other qualities that
constitute the core of computer science, but rather as tools for practical
problem solving. To a large extent, you can think of this book as
presenting a pattern language of the core problems of AI programming,
and examples – the patterns – of their solution.
Idioms are a form and structure for knowledge that helps us bridge the
differences between patterns as abstract descriptions of a problem and its
solutions and an understanding of how best to implement that solution in a
given programming language. A language idiom is the expression of a
design pattern in a given language. In this sense, design patterns + idioms =
quality programs.
Sample Design Patterns

Consider, for example, the simple, widely used design pattern that we can
call map that applies some operator O to every element of a list L. We can
express this pattern in a pseudo code function as follows:
map(operator O, list L)
{
    if (L contains no elements) quit;
    h ← the first element of L;
    apply O to h;
    map(O, L minus h);
}
This map function produces a stream of results: O applied to each element
of the list L. As our definition of pattern specifies, this describes a solution
to a recurring problem, and also fosters unlimited variations, depending on
the type of the elements that make up the list L, and the nature of the
operator, O.
Now, let us consider a fragment of Lisp code that implements this same
map pattern, where f is the mapped operator (in Lisp a function) and
list is the list:
(defun map (f list)
  (cond ((null list) nil)
        (t (cons (apply f (list (car list)))
                 (map f (cdr list))))))
This function map, created by using the built-in Lisp defun function, not
only implements the map pattern, but also illustrates elements of the Lisp
programming idiom. These include the use of the operators car and cdr to
separate the list into its head and tail, the use of the cons operator to place
the results into a new list, and also the use of recursion to move down the
list. Indeed, this idiom of recursively working through a list is so central to
Lisp, that compiler writers are expected to optimize this sort of tail
recursive structure into a more efficient iterative implementation.
Let us now compare the Lisp map to a Java implementation that
demonstrates how idioms vary across languages:

public Vector map(Vector l)
{
    Vector result = new Vector();
    Iterator iter = l.iterator();
    while(iter.hasNext())
    {
        result.add(f(iter.next()));
    }
    return result;
}
The most striking difference between the Java version and the Lisp version
is that the Java version is iterative. We could have written our list search in
a recursive form (Java supports recursion, and compilers should optimize it
where possible), but Java also offers us iterators for moving through lists.
Since the authors of Java provide us with list iterators, and we can assume
they are implemented efficiently, it makes sense to use them. The Java
idiom differs from the Lisp idiom accordingly.
Furthermore, the Java version of map creates the new variable, result.
When the iterator completes its task, result will be a vector of
elements, each the result of applying f to each element of the input list
(vector). Finally, result must be explicitly returned to the external
environment. In Lisp, however, the resulting list of mapped elements is the
result of invoking the function map (because it is returned as a direct
result of evaluating the map function).
Finally, we present a Prolog version of map. Of course in Prolog, map will
be represented as a predicate. This predicate has three arguments, the
first the function, f, which will be applied to every element of the list that
is the second argument of the predicate. The third argument of the
predicate map is the list resulting from applying f to each element of the
second argument. The pattern [X|Y] is the Prolog list representation,
where X is the head of the list (car in Lisp) and Y is the list that is the rest
of the list (cdr in Lisp). The is operator binds the result of f applied to
H to the variable NH. As with Lisp, the map relationship is defined
recursively, although no tail recursive optimization is possible in this case.
Further clarifications of this Prolog specification are presented in Part II.
map(f, [ ], [ ]).
map(f, [H|T], [NH|NT]) :-
    NH is f(H), map(f, T, NT).
In the three examples above we see a very simple example of a pattern
having different idioms in each language, the eval&assign pattern. This
pattern evaluates some expression and assigns the result to a variable. In
Java, as we saw above, = simply assigns the evaluated expression on its
right-hand-side to the variable on its left. In Lisp this same activity requires
the cons of an apply of f to an element of the list. The resulting
symbol expression is then simply returned as part of the evaluated function
map. In Prolog, using the predicate representation, there are similar
differences between assignment (based on unification with patterns such as
[H|T] and =) and evaluation (using is or making f be a goal).
Understanding and utilizing these idioms is an essential aspect of mastering
a programming language, in that they represent expected ways the language
will be used. This not only allows programmers more easily to understand,
maintain, and extend each other’s code, but also allows us to remain
consistent with the language designer’s assumptions and implementation
choices.
1.2 Selected Examples of AI Language Idioms
We can think of this book, then, as presenting some of the most important
patterns supporting Artificial Intelligence programming, and demonstrating
their implementation in the appropriate idioms of three major languages.
Although most of these patterns were introduced in this book’s companion
volume, Artificial Intelligence: Structures and Strategies for Complex Problem Solving
(Luger 2009), it is worthwhile to summarize a subset of them briefly in this
introduction.
Symbolic Computing: The Issue of Representation

Artificial Intelligence rests on two basic ideas: first, representation or the use
of symbol structures to represent problem solving knowledge (state), and
second, search, the systematic consideration of sequences of operations on
these knowledge structures to solve complex problems. Symbolic
computing embraces a family of patterns for representing state and then
manipulating these (symbol) structures, as opposed to only performing
arithmetic calculations on states. Symbolic computing methods are the
foundation of artificial intelligence: in a sense, everything in this book rests
upon them. The recursive list-handling algorithm described above is a
fundamental symbolic computing pattern, as are the basic patterns for tree
and graph manipulation. Lisp was developed expressly as a language for
symbolic computing, and its s-expression representation (see Chapter 11)
has proved general, powerful and long-lived.
As we develop the examples of this book, pay close attention to how these
simple patterns of list, tree, and graph manipulation combine to form the
more complex, problem specific patterns described below.
Search

Search in AI is also fundamental and complementary to representation (as
is emphasized throughout our book). Prolog, in fact, incorporates a form of
search directly into its language semantics. In addition to forming a
foundation of AI, search introduces many of its thorniest problems. In
most interesting problems, search spaces tend to be intractable, and much
of AI theory examines the use of heuristics to control this complexity. As
has been pointed out from the very beginnings of AI (Feigenbaum and
Feldman 1963, Newell and Simon 1976) support of intelligent search
places the greatest demands on AI programming.
Search related design patterns and problems we will examine in this book
include implementations of the basic search algorithms (breadth-first,
depth-first, and best-first), management of search history, and the recovery
of solution paths with the use of those histories.
A particularly interesting search related problem is in the representation
and generation of problem states. Conceptually, AI search algorithms are
general: they can apply to any search space. Consequently, we will define
general, reusable search “frameworks” that can be applied to a range of
problem representations and operations for generating new states. How
the different programming paradigms address this issue is illuminating in
terms of their language-based idioms.
Lisp makes no syntactic distinction between functions and data structures:
both can be represented as symbol expressions (see s-expression, Chapter
11), and both can be handled identically as Lisp objects. In addition, Lisp
does not enforce strong typing on s-expressions. These two properties of
the language allow us to define a general search algorithm that takes as
parameters the starting problem state, and a list of Lisp functions, often
using the map design pattern described earlier, for producing child states.
Prolog includes a list representation that is very similar to lists in Lisp, but
differs in having built-in search and pattern matching in a language
supporting direct representation of predicate calculus rules. Implementing
a generalized search framework in Prolog builds on this language’s unique
idioms. We define the operators for generating states as rules, using pattern
matching to determine when these rules apply. Prolog offers explicit meta-
level controls that allow us to direct the pattern matching, and control its
built-in search.
Java presents its own unique idioms for generalizing search. Although Java
provides a “reflection” package that allows us to manipulate its objects,
methods, and their parameters directly, this is not as simple to do as in Lisp
or Prolog. Instead, we will use Java interface definitions to specify the
methods a state object must have at a general level, and define search
algorithms that take as states instances of any class that instantiates the
appropriate interface (see Chapters 22-24).
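
To give a flavor of this interface-based idiom before those chapters, the following minimal sketch is illustrative only: the State interface, its methods, and the breadthFirstSearch routine shown here are hypothetical simplifications, not the framework actually developed in Chapters 22-24.

import java.util.*;

// A hypothetical State interface: any problem state must report whether
// it is a goal and must be able to generate its child states.
interface State {
    boolean isGoal();
    List<State> getChildren();
}

class Search {
    // Generic breadth-first search over any class implementing State.
    // Returns a goal state, or null if the space is exhausted. Assumes the
    // State classes provide sensible equals() and hashCode() methods.
    public static State breadthFirstSearch(State start) {
        Deque<State> open = new ArrayDeque<>();   // states awaiting expansion
        Set<State> closed = new HashSet<>();      // states already expanded
        open.add(start);
        while (!open.isEmpty()) {
            State current = open.removeFirst();
            if (current.isGoal()) return current;
            closed.add(current);
            for (State child : current.getChildren())
                if (!closed.contains(child) && !open.contains(child))
                    open.addLast(child);          // FIFO order gives breadth-first behavior
        }
        return null;
    }
}
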
These three approaches to implementing search are powerful lessons in the
differences in language idioms, and the way they relate to a common set of
design patterns. Although each language implements search in a unique
manner, the basic search algorithms (breadth-, depth-, or best-first) behave
identically in each. Similarly, each search algorithm involves a number of
design patterns, including the management of problem states on a list, the
ordering of the state list to control search, and the application of state-
transition operators to a state to produce its descendants. These design
patterns are clearly present in all algorithms; it is only at the level of
language syntax, semantics, and idioms that these implementations differ.
Pattern Matching

Pattern matching is another support technology for AI programming that
spawns a number of useful design patterns. Approaches to pattern
matching can vary from checking for identical memory locations, to
comparing simple regular-expressions, to full pattern-based unification
across predicate calculus expressions, see Luger (2009, Section 2.3). Once
again, the differences in the way each language implements pattern
matching illustrate critical differences in their semantic structure and
associated idioms.
Prolog provides unification pattern matching directly in its interpreter:
unification and search on Predicate Calculus based data structures are the
basis of Prolog semantics. Here, the question is not how to implement
pattern matching, but how to use it to control search, the flow of program
execution, and the use of variable bindings to construct problem solutions
as search progresses. In this sense, Prolog gives rise to its own very unique
language idioms.
Lisp, in contrast, requires that we implement unification pattern matching
ourselves. Using its basic symbolic computing capabilities, Lisp makes it
straightforward to match recursively the tree structures that implicitly
define predicate calculus expressions. Here, the main design problem
facing us is the management of variable bindings across the unification
algorithm. Because Lisp is so well suited to this type of implementation,
we can take its implementation of unification as a “reference
implementation” for understanding both Prolog semantics, and the Java
implementation of the same algorithm.
Unlike Lisp, which allows us to use nested s-expressions to define tree
structures, Java is a strongly typed language. Consequently, our Java
implementation will depend upon a number of user-created classes to
define expressions, constants, variables, and variable bindings. As with our
implementation of search, the differences between the Java and Lisp
implementations of pattern matching are interesting examples of the
differences between the two languages, their distinct idioms, and their
differing roles in AI programming.
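
The fragment below suggests, in rough outline only, what such user-created classes might look like; the names (Term, Constant, Variable, Unifier) are assumptions made for this sketch, compound function expressions are omitted, and this is not the code of Chapter 23.

import java.util.*;

// Hypothetical minimal term classes for unification over constants and
// variables; compound (function) expressions are left out for brevity.
abstract class Term { }

class Constant extends Term {
    final String name;
    Constant(String name) { this.name = name; }
    public boolean equals(Object o) {
        return o instanceof Constant && ((Constant) o).name.equals(name);
    }
    public int hashCode() { return name.hashCode(); }
}

class Variable extends Term {
    final String name;
    Variable(String name) { this.name = name; }
}

class Unifier {
    // Attempts to unify two terms under an existing set of bindings.
    // Returns an extended binding map, or null on failure.
    static Map<Variable, Term> unify(Term s, Term t, Map<Variable, Term> bindings) {
        if (bindings == null) return null;
        s = resolve(s, bindings);
        t = resolve(t, bindings);
        if (s.equals(t)) return bindings;                  // identical terms unify trivially
        if (s instanceof Variable) return bind((Variable) s, t, bindings);
        if (t instanceof Variable) return bind((Variable) t, s, bindings);
        return null;                                       // distinct constants do not unify
    }

    // Follows a chain of bindings to the term a variable currently stands for.
    static Term resolve(Term t, Map<Variable, Term> bindings) {
        while (t instanceof Variable && bindings.containsKey(t))
            t = bindings.get(t);
        return t;
    }

    // Returns a new binding map extended with v = t, leaving the original untouched.
    static Map<Variable, Term> bind(Variable v, Term t, Map<Variable, Term> bindings) {
        Map<Variable, Term> extended = new HashMap<>(bindings);
        extended.put(v, t);
        return extended;
    }
}
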
Structured Types and Inheritance (Frames)

Although the basic symbolic structures (lists, trees, etc.) supported by all
these languages are at the foundation of AI programming, a major focus of
AI work is on producing representations that reflect the way people think
about problems. This leads to more complex structures that reflect the
organization of taxonomies, similarity relationships, ontologies, and other
cognitive structures. One of the most important of these comes from
frame theory (Minsky 1975; Luger 2009, Section 7.1), and is based on
structured data types (collections of individual attributes combined in a
single object or frame), explicit relationships between objects, and the use of
class inheritance to capture hierarchical organizations of classes and their
attributes.
These representational principles have proved so effective for practical
knowledge representation that they formed the basis of object-oriented
programming: Smalltalk, the CommonLisp Object System libraries
(CLOS), C++, and Java. Just as Prolog bases its organization on predicate
calculus and search, and Lisp builds on (functional) operations on symbolic
structures, so Java builds directly on these ideas of structured
representation and inheritance.
This approach of object-oriented programming underlies a large number of
design patterns and their associated idioms (Gamma, et al. 1995; Coplein &
Schmidt 1995), as merited by the expressiveness of the approach. In this
book, we will often focus on the use of structured representations not
simply for design of program code, but also as a tool for knowledge
representation.
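
A small, contrived Java fragment (the Bird and Penguin classes below are invented for this illustration, not drawn from the later chapters) shows how class inheritance captures the kind of taxonomic default-and-exception structure that frame systems formalize:

// A toy taxonomy: attributes defined on a superclass are inherited by its
// subclasses, much as slots are inherited down a frame system's "isa" hierarchy.
class Bird {
    int legs = 2;                           // attribute shared by the whole class
    boolean canFly() { return true; }       // default value, inherited unless overridden
}

class Penguin extends Bird {
    boolean canFly() { return false; }      // subclass exception to the inherited default
}

class TaxonomyDemo {
    public static void main(String[] args) {
        Bird tweety = new Bird();
        Bird opus = new Penguin();
        System.out.println(tweety.legs + " " + tweety.canFly());  // prints: 2 true
        System.out.println(opus.legs + " " + opus.canFly());      // prints: 2 false
    }
}
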
Meta-Linguistic Abstraction

Meta-linguistic abstraction is one of the most powerful ways of organizing
programs to solve complex problems. In spite of its imposing title, the
idea behind meta-linguistic abstraction is straightforward: rather than trying
to write a solution to a hard problem in an existing programming language,
use that language to create another language that is better suited to solving
the problem. We have touched on this idea briefly in this introduction in
our mention of general search frameworks, and will develop it throughout
the book (e.g., Chapters 5, 15, 26).
One example of meta-linguistic abstraction that is central to AI is the idea
of an inference engine: a program that takes a declarative representation of
domain knowledge in the form of rules, frames or some other
representation, and applies that knowledge to problems using general
inference algorithms. The commonest example of an inference engine is
found in a rule-based expert system shell. We will develop such a shell,
EXSHELL in Prolog (Chapter 6), Lisp-shell in Lisp (Chapter 17), and an
equivalent system in Java (Chapter 26), providing similar semantics in all
three language environments. This will be a central focus of the book, and
will provide an in-depth comparison of the programming idioms supported
by each of these languages.
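
As a foretaste, and only as a toy sketch under simplifying assumptions (facts and premises here are bare strings, and the Rule and TinyEngine classes are invented for this illustration, far simpler than EXSHELL or its Lisp and Java counterparts), a forward-chaining inference engine can be surprisingly small:

import java.util.*;

// A toy forward-chaining inference engine: the rules are declarative data,
// and one general algorithm applies them to whatever facts it is given.
class Rule {
    final List<String> premises;
    final String conclusion;
    Rule(List<String> premises, String conclusion) {
        this.premises = premises;
        this.conclusion = conclusion;
    }
}

class TinyEngine {
    // Repeatedly fires any rule whose premises all hold, adding its conclusion,
    // until no new facts can be derived.
    static Set<String> forwardChain(List<Rule> rules, Set<String> facts) {
        boolean changed = true;
        while (changed) {
            changed = false;
            for (Rule r : rules)
                if (facts.containsAll(r.premises) && facts.add(r.conclusion))
                    changed = true;
        }
        return facts;
    }

    public static void main(String[] args) {
        List<Rule> rules = List.of(
            new Rule(List.of("has feathers"), "is a bird"),
            new Rule(List.of("is a bird", "does not fly"), "may be a penguin"));
        Set<String> facts = new HashSet<>(List.of("has feathers", "does not fly"));
        System.out.println(forwardChain(rules, facts));
        // Prints the two given facts plus "is a bird" and "may be a penguin".
    }
}
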
Knowledge-Level Design

This discussion of AI design patterns and language idioms has proceeded
from simple features, such as basic, list-based symbol processing, to more
powerful AI techniques such as frame representations and expert system
shells. In doing so, we are adopting an organization parallel to the
theoretical discussion in Artificial Intelligence: Structures and Strategies for
Complex Problem Solving (Luger 2009). We are building a set of tools for
programming at what Allen Newell (1982) has called the knowledge level.

Figure 1.1 Levels of a Knowledge-Based System, adapted from Newell (1982).
Allen Newell (1982) has distinguished between the knowledge level and the
symbol level in describing an intelligent system. As may be seen in Figure 1.1
(adapted from Newell, 1982), the symbol level is concerned with the
particular formalisms used to represent problem solving knowledge, for
example the predicate calculus. Above this symbol level is the knowledge
level concerned with the knowledge content of the program and the way in
which that knowledge is used.
The distinction between the symbol and knowledge level is reflected in the
architectures of expert systems and other knowledge-based programs (see
Chapters 6, 15, and 25). Since the user will understand these programs in
terms of their knowledge content, these programs must preserve two
invariants: first, as just noted, there must be a knowledge-level
characterization, and second, there must be a clear distinction between this
knowledge and its control. We see this second invariant when we utilize the
production system design pattern in Chapters 6, 15, and 25. Knowledge level
concerns include questions such as: What queries will be made of the
system? What objects and/or relations are important in the domain? How
is new knowledge added to the system? Will information change over time?
How will the system need to reason about its knowledge? Does the
problem domain include missing or uncertain information?
The symbol level, just below the knowledge level, defines the knowledge
representation language, whether it be direct use of the predicate calculus
or production rules. At this level decisions are made about the structures
required to represent and organize knowledge. This separation from the
knowledge level allows the programmer to address such issues as
expressiveness, efficiency, and ease of programming that are not relevant
to the program’s higher-level intent and behavior.
The implementation of the algorithm and data structure level constitutes a still
lower level of program organization, and defines an additional set of design
considerations. For instance, the behavior of a logic-based or function-
based program should be unaffected by the use of a hash table, heap, or
binary tree for implementing its symbol tables. These are implementation
decisions and invisible at higher levels. In fact, most of the techniques used
to implement representation languages for AI are common computer
science techniques, including binary trees and tables, and an important
component of the knowledge-level design hypothesis is that they be hidden
from the programmer.
In thinking of knowledge level programming, we are defining a hierarchy
that uses basic programming language constructs to create more
sophisticated symbol processing languages, and uses these symbolic
languages to capture knowledge of complex problem domains. This is a
natural hierarchy that moves from machine models that reflect an
underlying computer architecture of variables, assignments and processes,
to a symbolic layer that works with more abstract ideas of symbolic
representation and inference. The knowledge level looks beyond symbolic
form to the semantics of problem solving domains and their associated
knowledge relationships.
The importance of this multi-level approach to system design cannot be
overemphasized: it allows a programmer to ignore the complexity hidden
at lower levels and focus on issues appropriate to the current level of
abstraction. It allows the theoretical foundations of artificial intelligence to
be kept free of the nuances of a particular implementation or programming
language. It allows us to modify an implementation, improving its
efficiency or porting it to another machine, without affecting its
specification and behavior at higher levels. But the AI programmer begins
addressing the problem-solving task from the programming language level.

In fact, we may characterize the programmer’s ability to use design patterns
and their associated idioms as her ability to bridge and link the algorithms
and data structures afforded by different language paradigms with the
symbol level in the process of building expressive knowledge-intensive
programs.
To a large extent, then, our goal in writing this book is to give the reader
the intellectual tools for programming at the knowledge level. Just as an
experienced musician thinks past the problems of articulating individual
notes and chords on their instrument to the challenges of harmonic and
rhythmic structure in a composition, or an architect looks beyond the
layout of floor plans to ways buildings will interact with their occupants
over time, we believe the goal of a programmer’s development is to think
of computer programs in terms of the knowledge they incorporate, and the
way they engage human beings in the patterns of their work,
communication and relationships. Becoming the “master programmer” we
mentioned earlier in this introduction requires the ability to think in terms
of the human activities a program will support, and simultaneously to
understand the many levels of abstraction, algorithms, and data structures
that lie between those activities and the comparatively barren structures of
the “raw” programming language.
1.3 A Brief History of Three Programming Paradigms
We conclude this chapter by giving a brief description of the origins of the
three programming languages we present. We also give a cursory
description of the three paradigms these languages represent. These details
are precursors of and an introduction to the material presented in the next
three parts of this book.
Logic Programming in Prolog
Like Lisp, Prolog gains much of its power and elegance from its
foundations in mathematics. In the case of Prolog, those foundations are
predicate logic and resolution theorem proving. Of the three languages
presented in this book, Prolog may well seem unusual to most
programmers in that it is a declarative, rather than procedural, language. A
Prolog program is simply a statement, in first-order predicate calculus, of
the logical conditions a solution to a problem must satisfy. The declarative
semantics do not tell the computer what to do, only the conditions a
solution must satisfy. Execution of a Prolog program relies on search to
find a set of variable bindings that satisfy the conditions stated in the
particular goals required by the program. This declarative semantics makes
Prolog extremely powerful for a large class of problems that are of
particular interest to AI. These include constraint satisfaction problems,
natural language parsing, and many search problems, as will be
demonstrated in Part II.
A logic program is a set of specifications in formal logic; Prolog uses the
first-order predicate calculus. Indeed, the name itself comes from
programming in logic. An interpreter executes the program by
systematically making inferences from logic specifications. The idea of
using the representational power of the first-order predicate calculus to
express specifications for problem solving is one of the central
contributions Prolog has made to computer science in general and to
artificial intelligence in particular. The benefits of using first-order
predicate calculus for a programming language include a clean and elegant
syntax and a well-defined semantics.
The implementation of Prolog has its roots in research on theorem proving
by J.A. Robinson (Robinson 1965), especially the creation of algorithms for
resolution refutation systems. Robinson designed a proof procedure called
resolution, which is the primary method for computing with Prolog. For a
more complete description of resolution refutation systems and of Prolog
as Horn clause refutation, see Luger (2009, Chapter 14).
Because of these features, Prolog has proved to be a useful vehicle for
investigating such experimental programming issues as automatic code
generation, program verification, and design of high-level specification
languages. As noted above, Prolog and other logic-based languages support
a declarative programming style—that is, constructing a program in terms
of high-level descriptions of a problem’s constraints—rather than a
procedural programming style—writing programs as a sequence of
instructions for performing an algorithm. This mode of programming
essentially tells the computer “what is true” and “what needs to be proven
(the goals)” rather than “how to do it.” This allows programmers to focus
on problem solving as creating sets of specifications for a domain rather
than the details of writing low-level algorithmic instructions for “what to
do next.”
The first Prolog program was written in Marseille, France, in the early
1970s as part of a project in natural language understanding (Colmerauer,
Kanoui et al. 1973, Roussel 1975, Kowalski 1979). The theoretical
background for the language is discussed in the work of Kowalski, Hayes,
and others (Hayes 1977, Kowalski 1979, Kowalski 1979, Lloyd 1984). The
major development of the Prolog language was carried out from 1975 to
1979 at the Department of Artificial Intelligence of the University of
Edinburgh. The people at Edinburgh responsible for the first “road
worthy” implementation of Prolog were David H.D. Warren and Fernando
Pereira. They produced the first Prolog interpreter robust enough for
delivery to the general computing community. This product was built using
the “C” language on the DEC-system 10 and could operate in both
interpretive and compiled modes (Warren, Pereira, et al. 1979).
Further descriptions of this early code and comparisons of Prolog with
Lisp may be found in Warren et al. (Warren, Pereira, et al. 1977). This
“Warren and Pereira” Prolog became the early standard. The book
Programming in Prolog (Clocksin and Mellish 1984, now in its fifth edition)
was created by two other researchers at the Department of Artificial
Intelligence, Bill Clocksin and Chris Mellish. This book quickly became the
chief vehicle for delivering Prolog to the computing community. We use
this standard, which has come to be known as Edinburgh Prolog. In fact,
all the Prolog code in this book may be run on the public domain
interpreter SWI-Prolog (to find, Google on swi-prolog).
Functional Programming in Lisp
Lisp was arguably the first programming language to ground its semantics
in mathematical theory: the theory of partial recursive functions (McCarthy
1960, Church 1941). In contrast to most of its contemporaries, which
essentially presented the architecture of the underlying computer in a
higher-level form, this mathematical grounding has given Lisp unusual
power, durability and influence. Ideas such as list-based data structures,
functional programming, and dynamic binding, which are now accepted
features of mainstream programming languages, can trace their origins to
earlier work in Lisp. Meta-circular definition, in which compilers and
interpreters for a language are written in a core version of the language
itself, was the basis of the first, and subsequent Lisp implementations. This
approach, still revolutionary after more than fifty years, replaces
cumbersome language specifications with an elegant, formal, public,
testable meta-language kernel that supports the continued growth and
refinement of the language.
Lisp was first proposed by John McCarthy in the late 1950s. The language
was originally intended as an alternative model of computation based on
the theory of recursive functions. In an early paper, McCarthy (McCarthy
1960) outlined his goals: to create a language for symbolic rather than
numeric computation, to implement a model of computation based on the
theory of recursive functions (Church 1941), to provide a clear definition
of the language’s syntax and semantics, and to demonstrate formally the
completeness of this computational model. Although Lisp is one of the
oldest computing languages still in active use (along with FORTRAN and
COBOL), the careful thought given to its original design and the
extensions made to the language through its history have kept it in the
vanguard of programming languages. In fact, this programming model has
proved so effective that a number of other languages have been based on
functional programming, including SCHEME, SML-NJ, FP, and OCAML.
In fact, several of these newer languages, e.g., SCHEME and SML-NJ,
have been designed specifically to reclaim the semantic clarity of the earlier
versions of Lisp.
The list is the basis of both programs and data structures in Lisp: Lisp is an
acronym for list processing. Lisp provides a powerful set of list-handling
functions implemented internally as linked pointer structures. Lisp gives
programmers the full power and generality of linked data structures while
freeing them, with real-time garbage collection, from the responsibility for
explicitly managing pointers and pointer operations.
Originally, Lisp was a compact language, consisting of functions for
constructing and accessing lists (car, cdr, cons), defining new functions
(defun), detecting equality (eq), and evaluating expressions (quote,
eval). The only means for building program control were recursion and a
single conditional. More complicated functions, when needed, were
defined in terms of these primitives. Through time, the best of these new
functions became part of the language itself. This process of extending the
language by adding new functions led to the development of numerous
dialects of Lisp, often including hundreds of specialized functions for data
structuring, program control, real and integer arithmetic, input/output
(I/O), editing Lisp functions, and tracing program execution. These
dialects are the vehicle by which Lisp has evolved from a simple and
elegant theoretical model of computing into a rich, powerful, and practical
environment for building large software systems. Because of the
proliferation of early Lisp dialects, the Defense Advanced Research
Projects Agency in 1983 proposed a standard dialect for the language,
known as Common Lisp.
Although Common Lisp has emerged as the lingua franca of Lisp dialects,
a number of simpler dialects continue to be widely used. One of the most
important of these is SCHEME, an elegant rethinking of the language that
has been used both for AI development and for teaching the fundamental
concepts of computer science. The dialect we use throughout the
remainder of our book is Common Lisp. All our code may be run on a
current public domain interpreter built by Carnegie Mellon University,
called CMUCL (Google CMUCL).
Object-Oriented Programming in Java
Java is the third language considered in this book. Although it does not
have Lisp or Prolog’s long historical association with Artificial Intelligence,
it has become extremely important as a tool for delivering practical AI
applications. There are two primary reasons for this. The first is Java’s
elegant, dynamic implementation of object-oriented programming, a
programming paradigm with its roots in AI, that has proven its power for
use in building AI programs through Smalltalk, Flavors, the Common Lisp
Object System (CLOS), and other object-oriented systems. The second
reason for Java’s importance to AI is that it has emerged as a primary
language for delivering tools and content over the world-wide-web. Java’s
ease of programming and the large amounts of reusable code available to
programmers greatly simplify the coding of complex programs involving
AI techniques. We demonstrate this in the final chapters of Part IV.
Object-oriented programming is based on the idea that programs can be
best modularized in terms of objects: encapsulated structures of data and
functionality that can be referenced and manipulated as a unit. The power
of this programming model is enhanced by inheritance, or the ability to
define sub-classes of more general objects that inherit and modify their
functionality, and the subtle control object-oriented languages provide over
the scoping of variables and functions alike.
The first language to build object-oriented representations was created in
Norway in the 1960s. Simula-67 was, appropriately, a simulation language.
Simulation is a natural application of object-oriented programming in that
language objects are used to represent objects in the domain being
simulated. Indeed, this ability to easily define isomorphisms between the
representations in an object-oriented program and a simulation domain has
carried over into modern object-oriented programming style, where
programmers are encouraged to model domain objects and their
interactions directly in their code.
Perhaps the most elegant formulation of the object-oriented model is in
the Smalltalk programming language, built at Xerox PARC in the early
1970s. Smalltalk not only presented a very pure form of object-oriented
programming, but also used it as a tool for graphics programming. Many of
the ideas now central to graphics interfaces, such as manipulable screen
objects, event driven interaction, and so on, found their early
implementation in the Smalltalk language. Other, later implementations of
object-oriented programming include C++, Objective C, C#, and the Common
Lisp Object System. The success of the model has made it rare to find a
programming language that does not incorporate at least some object-
oriented ideas.
Our first introduction of object-oriented languages is with the Common
Lisp Object System in Chapter 18 of Part III. However, in Part IV, we
have chosen Java to present the use of object-oriented tools for AI
programming. Java offers an elegant implementation of object-orientation
that implements single inheritance, dynamic binding, interface definitions,
packages, and other object concepts in a language syntax that most
programmers will find natural. Java is also widely supported and
documented.
The primary reason, however, for including Java in this book is its great
success as a practical programming language for a large number and variety
of applications, most notably those on the world-wide-web. One of the
great benefits of object-oriented programming languages is that the ability
to define objects combining data and related methods in a single structure
encourages the development of reusable software objects.
Although Java is, at its core, a relatively simple language, the efforts of
thousands of programmers have led to large amounts of high-quality, often
open source, Java code. This includes code for networking, graphics,
processing html and XML, security, and other techniques for programming
on the world-wide-web. We will examine a number of public domain Java
tools for AI, such as expert system rule engines, machine learning
algorithms, and natural language parsers. In addition, the modularity and
control of the object-oriented model supports the development of large
programs. This has led to the embedding of AI techniques in larger and
indeed more ordinary programs. We see Java as an essential language for
delivering AI in practical contexts, and will discuss the Java language in this
context. In this book we refer primarily to public domain interpreters most
of which are easily web accessible.

1.4 A Summary of Our Task


We hope that in reading this introductory chapter, you have come to see
that our goal in writing this book is not simply to present basic
implementation strategies for major Artificial Intelligence algorithms.
Rather, our goal is to look at programming languages as tools for the
intellectual activities of design, knowledge modeling, and system
development.
Computer programming has long been the focus both for scientific theory
and engineering practice. These disciplines have given us powerful tools
for the definition and analysis of algorithms and for the practical
management of large and small programming projects. In writing this
book, it has been our overarching goal to provide a third perspective on
programming languages: as tools for the art of designing systems to
support people in their thinking, communication, and work.
It is in this third perspective that the ideas of idioms and patterns become
important. It is not our goal simply to present examples of artificial
intelligence algorithms that can be reused in a narrow range of situations.
Our goal is to use these algorithms – with all their complexity and
challenges – to help programmers build a repertoire of patterns and idioms
that can serve well across a wide range of practical problem solving
situations. The examples of this book are not ends in themselves; they are
only small steps in the maturation of the master programmer. Our goal is
to see them as starting points for developing programmers’ skills. We hope
you will share our enthusiasm for these remarkable artist’s tools and the
design patterns and idioms they both enable and support.



PART II: Programming in Prolog

The only way to rectify our reasonings is to make them as tangible as those of the mathematicians, so that
we can find our error at a glance, and when there are disputes among persons we can simply say, “Let us
calculate… to see who is right.”
—Leibniz, The Art of Discovery

As an implementation of logic programming, Prolog makes many
important contributions to AI problem solving. First and foremost is its
direct and transparent representation and interpretation of predicate
calculus expressions. The predicate calculus has been an important
representational scheme in AI from the beginning, used everywhere from
automated reasoning to robotics research. A second contribution to AI is
the ability to create meta-predicates or predicates that can constrain,
manipulate, and interpret other predicates. This makes Prolog ideal for
creating meta-interpreters or interpreters written in Prolog that can
interpret subsets of Prolog code. We will do this many times in the
following chapters, writing interpreters for expert rule systems, exshell,
interpreters for machine learning using version space search and
explanation based learning models, and deterministic and stochastic natural
language parsers.
Most importantly Prolog has a declarative semantics, a means of directly
expressing problem relationships in AI. Prolog also has built-in unification,
some high-powered techniques for pattern matching, and a depth-first left
to right search. For a full description of Prolog representation, unification,
and search, as well as a comparison of the Prolog interpreter to an automated
theorem prover, we recommend Luger (2009, Section 14.3) or references
mentioned in Chapter 10. We will also address many of the important
issues of Prolog and logic programming for artificial intelligence
applications in the chapters that make up Part II.
In Chapter 2 we present the basic Prolog syntax and several simple
programs. These programs demonstrate the use of the predicate calculus as
a representation language. We show how to monitor the Prolog
environment and demonstrate the use of the cut with Prolog’s built in
depth-first left-to-right search. We also present simple structured
representations including semantic nets and frames and present a simple
recursive algorithm that implements inheritance search.
In Chapter 3 we create abstract data types (ADTs) in Prolog. These ADTs
include stacks, queues, priority queues, and sets. These data types are the basis
for many of the search and control algorithms in the remainder of Part II.

In particular, they are used to build a production system in Chapter 4, which
can perform depth-first, breadth-first, and best-first or heuristic search. They also
are critical to algorithms later in Part II including building planners,
parsers, and algorithms for machine learning.
In Chapter 5 we begin to present the family of design patterns expressed
through building meta-interpreters. But first we consider a number of
important Prolog meta-predicates, predicates whose domains of interpretation
are Prolog expressions themselves. For example, atom(X) succeeds if X is
bound to an atom, that is if X is instantiated at the time of the atom(X)
test. Meta-predicates may also be used for imposing type constraints on
Prolog interpretations, and we present a small database that enforces
Prolog typing constraints.
In Chapter 6 meta-predicates are used for designing meta-interpreters in
Prolog. We begin by building a Prolog interpreter in Prolog. We extend
this interpreter to rule-based expert system processing with exshell and
then build a robot planner using add- and delete-lists along the lines of the
older STRIPS problem solver (Fikes and Nilsson 1972, Nilsson 1980).
In Chapter 7 we demonstrate Prolog as a language for machine learning,
with the design of meta-interpreters for version space search and explanation-
based learning. In Chapter 8 we build a number of natural language
parsers/generators in Prolog, including context-free, context-sensitive,
probabilistic, and a recursive descent semantic net parser.
In Chapter 9 we present the Earley parser, a form of chart parsing, an
important contribution to interpreting natural language structures. The
Earley algorithm is built on ideas from dynamic programming (Luger 2009,
Section 4.1.2 and 15.2.2) where the chart captures sub-parse components
as they are generated while the algorithm moves across the words of the
sentence. Possible parses of the sentence are retrieved from the chart after
completion of its left-to-right generation of the chart.
Part II ends with Chapter 10 where we return to the discussion of the
general issues of programming in logic, the design of meta-interpreters, and
issues related to procedural versus declarative representation for problem
solving. We end Chapter 10 presenting an extensive list of references on
the Prolog language.



2 Prolog: Representation

Chapter Objectives
Prolog’s fundamental representations are described and built:
Facts
Rules
The and, or, not, and imply connectives
The environment for Prolog is presented:
The program as a data base of facts and relations between facts
Predicates for creating and modifying this data base
Prolog’s procedural semantics is described with examples
Pattern-matching
Left-to-right depth-first search
Backtracking on variable bindings
The built-in predicates for monitoring Prolog’s execution are presented
spy and trace
The list representation and recursive search are introduced
Examples of member check and writing out lists
Representations for structured hierarchies are created in Prolog
Semantic net and frame systems
Inherited properties determined through recursive (tree) search

Chapter Contents
2.1 Introduction: Logic-Based Representation
2.2 Syntax for Predicate Calculus Programming
2.3 Creating, Changing and Tracing a Prolog Computation
2.4 Lists and Recursion in Prolog
2.5 Structured Representations and Inheritance Search in Prolog

2.1 Introduction: Logic-Based Representation


Prolog and Logic
Prolog is a computer language that uses many of the representational
strengths of the First-Order Predicate Calculus (Luger 2009, Chapter 2).
Because Prolog has this representational power it can express general
relationships between entities. This allows expressions such as “all females
are intelligent” rather than the limited representations of the propositional
calculus: “Kate is intelligent”, “Sarah is intelligent”, “Karen is intelligent”,
and so on for a very long time!
As in the Predicate Calculus, predicates offer the primary (and only)
representational structure in Prolog. Predicates can have zero or more
arguments, where their arity is the number of arguments. Functions may
only be represented as the argument of a predicate; they cannot be a
program statement in themselves. Prolog predicates have the usual and,
or, not and implies connectives. The predicate representation along
with its connectives is presented in Section 2.2.

Prolog also takes on many of the declarative aspects of the Predicate
Calculus in the sense that a program is simply the set of all true predicates
that describe a domain. The Prolog interpreter can be seen as a “theorem
prover” that takes the user’s query and determines whether or not it is true,
as well as what variable substitutions might be required to make the query
true. If the query is not true in the context of the program’s specifications,
the interpreter says “no.”
2.2 Prolog Syntax
Facts, Rules and Connectives
Although there are numerous dialects of Prolog, the syntax used
throughout this text is that of the original Warren and Pereira C-Prolog as
described by Clocksin and Mellish (2003). We begin with the set of
connectives that can take atomic predicates and join them with other
expressions to make more complex relationships. There are, because of the
usual keyboard conventions, a number of differences between Prolog and
predicate calculus syntax. In C-Prolog, for example, the symbol :- replaces
the ← of first-order predicate calculus. The Prolog connectives include:
ENGLISH        PREDICATE CALCULUS      Prolog
and            ∧                       ,
or             ∨                       ;
only if        ←                       :-
not            ¬                       not
In Prolog, predicate names and bound variables are expressed as a
sequence of alphanumeric characters beginning with an alphabetic.
Variables are represented as a string of alphanumeric characters beginning
(the first character, at least) with an uppercase alphabetic. Thus:
likes(X, susie).
or, better,
likes(Everyone, susie).
could represent the fact that “everyone likes Susie.” Note that the scope of
all variables is universal to that predicate, i.e., when a variable is used in a
predicate it is understood that it is true for all the domain elements within
its scope. For example,
likes(george, Y), likes(susie, Y).
represents the set of things (or people) liked by BOTH George and Susie.
Similarly, suppose it was desired to represent in Prolog the following
relationships: “George likes Kate and George likes Susie.” This could be
stated as:
likes(george, kate), likes(george, susie).
Likewise, “George likes Kate or George likes Susie”:
likes(george, kate); likes(george, susie).
Finally, “George likes Susie if George does not like Kate”:
likes(george, susie) :- not(likes(george, kate)).

These examples show how the predicate calculus connectives are expressed
in Prolog. The predicate names (likes), the number or order of parameters,
and even whether a given predicate always has the same number of
parameters are determined by the design requirements (the implicit
“semantics”) of the problem.
The form Prolog expressions take, as in the examples above, is a restricted
form of the full predicate calculus called the “Horn Clause calculus.” There
are many reasons supporting this restricted form; most important is the
power and computational efficiency of a resolution refutation system. For details
see Luger (2009, Chapter 14).
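To make the restriction concrete, the following sketch (the predicates here are our own illustrations, not from the text) shows the Horn clause form that every Prolog clause must take: a single positive literal as the conclusion, with a conjunction of goals to the right of the :- symbol.
% Facts are unit clauses with no body.
parent(tom, liz).
parent(liz, ann).
% A rule: exactly one positive literal (grandparent/2) to the left of :-,
% a conjunction of goals to the right.
grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
% A clause with a disjunctive or negated conclusion, such as
% "parent(X, Y) or ancestor(X, Y)", is not a Horn clause and cannot
% appear as the head of a Prolog rule.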
A Simple Prolog Program
A Prolog program is a set of specifications in the first-order predicate
calculus describing the objects and relations in a problem domain. The set
of specifications is referred to as the database for that problem. The Prolog
interpreter responds to questions about this set of specifications. Queries to
the database are patterns in the same logical syntax as the database entries.
The Prolog interpreter uses pattern-directed search to find whether these
queries logically follow from the contents of the database.
The interpreter processes queries, searching the database in left to right
depth-first order to find out whether the query is a logical consequence of
the database of specifications. Prolog is primarily an interpreted language.
Some versions of Prolog run in interpretive mode only, while others allow
compilation of part or all of the set of specifications for faster execution.
Prolog is an interactive language; the user enters queries in response to the
Prolog prompt, “?-“.
Let us describe a “world” consisting of George’s, Kate’s, and Susie’s likes
and dislikes. The database might contain the following set of predicates:
likes(george, kate).
likes(george, susie).
likes(george, wine).
likes(susie, wine).
likes(kate, gin).
likes(kate, susie).
This set of specifications has the obvious interpretation, or mapping, into
the world of George and his friends. That world is a model for the database
(Luger 2009, Section 2.3). The interpreter may then be asked questions:
?- likes(george, kate).
Yes
?- likes(kate, susie).
Yes
?- likes(george, X).
X = kate
;
X = susie
;
X = wine
;
no
?- likes(george, beer).
no
Note first that in the request likes(george, X), successive user
prompts (;) cause the interpreter to return all the terms in the database
specification that may be substituted for the X in the query. They are
returned in the order in which they are found in the database: kate before
susie before wine. Although it goes against the philosophy of
nonprocedural specifications, a determined order of evaluation is a
property of most interpreters implemented on sequential machines.
To summarize: further responses to queries are produced when the user
prompts with the ; (or). This forces the rejection of the current solution
and a backtrack on the set of Prolog specifications for answers. Continued
prompts force Prolog to find all possible solutions to the query. When no
further solutions exist, the interpreter responds no.
This example also illustrates the closed world assumption or negation as failure.
Prolog assumes that “anything is false whose opposite is not provably
true.” For the query likes(george, beer), the interpreter looks for
the predicate likes(george, beer) or some rule that could
establish likes(george, beer). Failing this, the request is false.
Prolog assumes that all knowledge of the world is present in the database.
The closed world assumption introduces a number of practical and
philosophical difficulties in the language. For example, failure to include a
fact in the database often means that its truth is unknown; the closed world
assumption treats it as false. If a predicate were omitted or there were a
misspelling, such as likes(george, beeer), the response remains
no. The negation-as-failure issue is an important topic in AI research. Though
negation-as-failure is a simple way to deal with the problem of unspecified
knowledge, more sophisticated approaches, such as multi-valued logics
(true, false, unknown) and nonmonotonic reasoning (see Luger
2009, Section 9.1), provide a richer interpretive context.
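The behavior just described can be checked directly at the interpreter. The following sketch uses the likes database above; the exact form of the negation operator (not or \+) varies between Prolog dialects.
?- not(likes(george, beer)).   % the proof attempt for likes(george, beer)
yes                            % fails, so its negation succeeds
?- not(likes(george, kate)).   % this fact is provable from the database,
no                             % so its negation fails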
The Prolog expressions just seen are examples of fact specifications. Prolog
also supports rule predicates to describe relationships between facts. We use
the logical implication :- . For rules, only one predicate is permitted on
the left-hand side of the if symbol :-, and this predicate must be a positive
literal, which means it cannot have not in front of it. All predicate calculus
expressions that contain logical implication must be reduced to this form,
referred to as Horn clause logic. In Horn clause form, the left-hand side
(conclusion) of an implication must be a single positive literal. The Horn
clause calculus is equivalent to the full first-order predicate calculus for proofs
by refutation (Luger 2009, Chapter 14).
Suppose we add to the specifications of the previous database a rule for
determining whether two people are friends. This may be defined:
friends(X, Y) :- likes(X, Z), likes(Y, Z).
This expression might be interpreted as “X and Y are friends if there exists
a Z such that X likes Z and Y likes Z.” Two issues are important here. First,
because neither the predicate calculus nor Prolog has global variables, the
scopes (extent of definition) of X, Y, and Z are limited to the friends
rule. Second, values bound to, or unified with, X, Y, and Z are consistent
across the entire expression. The treatment of the friends rule by the
Prolog interpreter is seen in the following example.
With the friends rule added to the set of specifications of the preceding
example, we can query the interpreter:
?- friends(george, susie).
yes
To solve this query, Prolog searches the database using the backtrack
algorithm. Briefly, backtrack examines each predicate specification in the
order that it was placed in the Prolog database. If the variable bindings of the
specification satisfy the query it accepts them. If they don’t, the interpreter
goes on to the next specification. If the interpreter runs into a dead end,
i.e., no variable substitution satisfies it, then it backs up looking for other
variable bindings for the predicates it has already satisfied. For example,
using the predicate specifications of our current example, the query
friends(george, susie) is unified with the conclusion of the rule
friends(X, Y) :- likes(X, Z), likes(Y, Z), with X as
george and Y as susie. The interpreter looks for a Z such that
likes(george, Z) is true and uses the first fact, with Z as kate.
The interpreter then tries to determine whether likes(susie,
kate) is true. When it is found to be false, using the closed world
assumption, this value for Z (kate) is rejected. The interpreter backtracks
to find a second value for Z. likes(george, Z) then matches the
second fact, with Z bound to susie. The interpreter then tries to match
likes(susie, susie). When this also fails, the interpreter goes
back to the database for yet another value for Z. This time wine is found
in the third predicate, and the interpreter goes on to show that
likes(susie, wine) is true. In this case wine is the binding that
ties george and susie.
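The search just described can be summarized as follows; this is our own annotation of the goal sequence, not actual interpreter output.
?- friends(george, susie).
% matches friends(X, Y) :- likes(X, Z), likes(Y, Z).
%   with X = george and Y = susie
% try Z = kate:  likes(george, kate) succeeds, likes(susie, kate) fails
% try Z = susie: likes(george, susie) succeeds, likes(susie, susie) fails
% try Z = wine:  likes(george, wine) succeeds, likes(susie, wine) succeeds
yes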
It is important to state the relationship between universal and existential
quantification in the predicate calculus and the treatment of variables in a
Prolog program. When a variable is placed in the specifications of a Prolog
database, it is universally quantified. For example, likes(susie, Y)
means, according to the semantics of the previous examples, “Susie likes
everyone.” In the course of interpreting a query, any term, or list, or
predicate from the domain of Y, may be bound to Y. Similarly, in the rule
friends(X, Y) :- likes(X, Z), likes(Y, Z), any X, Y,
and Z that meets the specifications of the expression are used.
To represent an existentially quantified variable in Prolog, we may take two
approaches. First, if the existential value of a variable is known, that value
may be entered directly into the database. Thus, likes(george,
wine) is an instance of likes(george, Z).
Second, to find an instance of a variable that makes an expression true, we
query the interpreter. For example, to find whether a Z exists such that
likes(george, Z) is true, we put this query to the interpreter. It will
find whether a value of Z exists under which the expression is true. Some
Prolog interpreters find all existentially quantified values; C-Prolog requires
repeated user prompts (;), as shown previously, to get all values.
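Many Prolog systems also provide built-in predicates that collect all such bindings at once. The sketch below uses findall/3, which is available in most interpreters (including SWI-Prolog), although it is not introduced until later in the text.
?- findall(Z, likes(george, Z), All).
All = [kate, susie, wine]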
2.3 Creating, Changing, and Tracing a Prolog Computation
In building a Prolog program the database of specifications is created first.
In an interactive environment the predicate assert can be used to add
new predicates to the set of specifications. Thus:
?- assert(likes(david, sarah)).
adds this predicate to the computing specifications. Now, with the query:
?- likes(david, X).
X = sarah.
is returned. assert allows further control in adding new specifications to
the database: asserta(P) asserts the predicate P at the beginning of all
the predicates P, and assertz(P) adds P at the end of all the predicates
named P. This is important for search priorities and building heuristics. To
remove a predicate P from the database retract(P) is used. (It should
be noted that in many Prologs assert can be unpredictable in that the
exact entry time of the new predicate into the environment can vary
depending on what other things are going on, affecting both the indexing
of asserted clauses as well as backtracking.)
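A short sketch of the difference between asserta and assertz, using the likes database of Section 2.2; the atoms prolog and lisp are illustrative only.
?- asserta(likes(george, prolog)).   % placed before the existing likes clauses
yes
?- assertz(likes(george, lisp)).     % placed after the existing likes clauses
yes
?- likes(george, X).
X = prolog       % the asserta clause is found first; kate, susie, wine,
                 % and finally lisp follow on further ; prompts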
It soon becomes tedious to create a set of specifications using the
predicates assert and retract. Instead, the good programmer takes
her favorite editor and creates a file containing all the Prolog program’s
specifications. Once this file is created, call it myfile, and Prolog is
called, then the file is placed in the database by the Prolog command
consult. Thus:
?- consult(myfile).
yes
integrates the predicates in myfile into the database. A short form of the
consult predicate, and better for adding multiple files to the database,
uses the list notation, to be seen shortly:
?- [myfile].
yes
If there are any syntax errors in your Prolog code the consult operator
will describe them at the time it is called.
The predicates read and write are important for user/system
communication. read(X) takes the next term from the current input
stream and binds it to X. Input expressions are terminated with a “.”
write(X) puts X in the output stream. If X is unbound then an integer
preceded by an underline is printed (_69). This integer represents the
internal bookkeeping on variables necessary in a theorem-proving
environment (see Luger 2009, Chapter 14).
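A minimal sketch combining read and write in a user-defined predicate; the predicate name echo is our own.
% echo/0 reads one term from the current input stream and writes it
% back out, followed by a newline.
echo :- read(X), write(X), nl.
Calling ?- echo. and then typing a term such as hello. (note the terminating period) causes hello to be printed.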
The Prolog predicates see and tell are used to read information from
and place information into files. see(X) opens the file X and defines the
current input stream as originating in X. If X is not bound to an available
file see(X) fails. Similarly, tell(X) opens a file for the output stream.
If no file X exists, tell(X) creates a file named by the bound value of X.
seen(X) and told(X) close the respective files.
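The following sketch sends every likes fact to a file using tell; the file name results and the predicate save_likes are illustrative only, and we close the output stream with the zero-argument told of most Edinburgh-style systems.
% save_likes/0 writes all likes facts to the file results.
save_likes :-
    tell(results),
    likes(X, Y), write(likes(X, Y)), nl,
    fail.                % drive backtracking through every likes fact
save_likes :- told.      % all facts written; close the file
The failure-driven loop in the first clause is a common Prolog idiom for iterating over all solutions to a goal.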
A number of Prolog predicates are important in helping keep track of the
state of the Prolog database as well as the state of computing about the
database; the most important of these are listing, trace, and spy. If
we use listing(predicate_name) where predicate_name is
the name of a predicate, such as friends (above), all the clauses with
that predicate name in the database are returned by the interpreter. Note
that the number of arguments of the predicate is not indicated; in fact, all
uses of the predicate, regardless of the number of arguments, are returned.
trace allows the user to monitor the progress of the Prolog interpreter.
This monitoring is accomplished by printing to the output file every goal
that Prolog attempts, which is often more information than the user wants
to have. The tracing facilities in Prolog are often rather cryptic and take
some study and experience to understand. The information available in a
trace of a Prolog program usually includes the following:
The depth level of recursive calls (marked left to right on line).
When a goal is tried for the first time (sometimes call is used).
When a goal is successfully satisfied (with an exit).
When a goal has further matches possible (a retry).
When a goal fails because all attempts to satisfy it have failed (a fail).
The goal notrace stops the exhaustive tracing.
When a more selective trace is required the goal spy is useful. This
predicate takes a predicate name as argument but sometimes is defined as a
prefix operator where the predicate to be monitored is listed after the
operator. Thus, spy member causes the interpreter to print to output all
uses of the predicate member. spy can also take a list of predicates
followed by their arities: spy[member/2, append/3] monitors
member with two arguments and append with three. nospy removes
these spy points.
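The sketch below shows how these predicates are typically invoked in an Edinburgh-style interpreter; the exact trace output differs from system to system, so none is reproduced here.
?- spy(member).                    % report every use of member
?- member(c, [a, b, c]).           % now reported at the spy points
?- nospy(member).                  % remove the spy point
?- trace, member(c, [a, b, c]).    % exhaustive trace of a single query
?- notrace.                        % turn exhaustive tracing off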
2.4 Lists and Recursion in Prolog
The previous subsections presented Prolog syntax with several simple
examples. These examples introduced Prolog as an engine for computing
with predicate calculus expressions (in Horn clause form). This is
consistent with all the principles of predicate calculus inference presented
in Luger (2009, Chapter 2). Prolog uses unification for pattern matching
and returns the bindings that make an expression true. These values are
unified with the variables in a particular expression and are not bound in
the global environment.
Recursion is the primary control mechanism for Prolog programming. We
will demonstrate this with several examples. But first we consider some
simple list-processing examples. The list is a data structure consisting of
ordered sets of elements (or, indeed, lists). Recursion is the natural way to
process the list structure. Unification and recursion come together in list
processing in Prolog. The set of elements of a list are enclosed by brackets,
[ ], and are separated by commas. Examples of Prolog lists are:
[1, 2, 3, 4]
[[george, kate], [allen, amy], [richard, shirley]]
[tom, dick, harry, fred]
[ ]
The first elements of a list may be separated from the tail of the list by the
bar operator, |. The tail of a list is the list with its first element removed.
For instance, when the list is [tom,dick,harry,fred], the first
element is tom and the tail is the list [dick, harry, fred]. Using
the vertical bar operator and unification, we can break a list into its
components:
If [tom, dick, harry, fred] is matched to [X | Y],
then X = tom and Y = [dick, harry, fred].
If [tom,dick,harry,fred] is matched to the pattern
[X, Y | Z], then X = tom, Y = dick, and Z =
[harry, fred].
If [tom, dick, harry, fred] is matched to [X, Y, Z |
W], then X = tom, Y = dick, Z = harry, and W =
[fred].
If [tom, dick, harry, fred] is matched to [W, X, Y,
Z | V], then W = tom, X = dick, Y = harry, Z = fred,
and V = [ ].
[tom, dick, harry, fred] will not match [V, W, X, Y,
Z | U].
[tom, dick, harry, fred] will match [tom, X |
[harry, fred]], to give X = dick.
Besides “tearing lists apart” to get at particular elements, unification can be
used to “build” the list structure. For example, if X = tom and Y =
[dick], then when L unifies with [X | Y], L will be bound to [tom,
dick]. Thus terms separated by commas before the | are all elements of
the list, and the structure after the | is always a list, the tail of the list.
Let’s take a simple example of recursive processing of lists: the member
check. We define a predicate to determine whether an item, represented by
X, is in a list. This predicate member takes two arguments, an element and
a list, and is true if the element is a member of the list. For example:
?- member(a, [a, b, c, d, e]).
yes
?- member(a, [1, 2, 3, 4]).
no
?- member(X, [a, b, c]).
X = a
;
X = b
;
X = c
;
no
To define member recursively, we first test if X is the first item in the list:
member(X, [X | T]).
This tests whether X and the first element of the list are identical. Note that
this pattern will match no matter what X is bound to: an atom, a list,
whatever! If the two are not identical, then it is natural to check whether X
is an element of the rest (T) of the list. This is defined by:
member(X, [Y | T]) :- member(X, T).
The two lines of Prolog for checking list membership are then:
member(X, [X | T]).
member(X, [Y | T]) :- member(X, T).
This example illustrates the importance of Prolog’s built-in order of search
with the terminating condition placed before the recursive call, that is, to be
tested before the algorithm recurs. If the order of the predicates is reversed,
the terminating condition may never be checked. We now trace
member(c,[a,b,c]), with numbering:
1: member(X, [X | T]).
2: member(X, [Y | T]) :- member(X, T).
?- member(c, [a, b, c]).
call 1. fail, since c <> a
call 2. X = c, Y = a, T = [b, c], member(c, [b, c])?
   call 1. fail, since c <> b
   call 2. X = c, Y = b, T = [c], member(c, [c])?
      call 1. success, c = c
   yes (to second call 2.)
yes (to first call 2.)
yes
Good Prolog style suggests the use of anonymous variables. These serve as an
indication to the programmer and interpreter that certain variables are used
solely for pattern-matching purposes, with the variable binding itself not
part of the computation process. Thus, when we test whether the element
X is the same as the first item in the list we usually say: member(X,
[X|_]). The use of the _ indicates that even though the tail of the list
plays a crucial part in the unification of the query, the content of the tail of
the list is unimportant. In the member check the anonymous variable
should be used in the recursive statement as well, where the value of the
head of the list is unimportant:
member(X, [X | _]).
member(X, [_ | T]) :- member(X, T).
Writing out a list one element to a line is a nice exercise for understanding
both lists and recursive control. Suppose we wish to write out the list
[a,b,c,d]. We could define the recursive command:
writelist([ ]).
writelist([H | T]) :- write(H), nl, writelist(T).
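Called on the example list, writelist prints one element per line and then succeeds when the recursion reaches the empty list:
?- writelist([a, b, c, d]).
a
b
c
d
yes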

Luger_all_wcopyright_COsfixed.pd43 43 5/15/2008 6:34:58 PM


Another Random Scribd Document
with Unrelated Content
carried him from me down the field. On another
occasion I found myself between two knights who were
vying with each other to see who could strike me down
the first. I warded off their fury with what skill I had
until one of them was stricken from behind by a hand
that was as sudden as it was sure. The other I struck a
fortunate blow for I stunned him so hard that he rode
off the field to nurse his wound.

Late in the afternoon I was knocked from my horse, but


had wit enough left to scramble again into the saddle. I
was tossed here and there with driving force as the
battle swayed this way or that. My helmet was dented in
from the swing of a mace. My right arm near the
shoulder was numbed from over action and from a
sword beat that had landed on it.

But I came out of it with a whole skin and no bones


broken which was enough to make me thankful. As for
Charles of Gramont, I never laid eyes on him from the
outbreak of the fight. It was long after dark when I
found him inquiring among the troops who had been
near me if they knew if I had fallen. When he saw me
he threw out his hands. I must confess that a kind of
weakness came over me at the sight of my companion.
As though we were children we flew to each other’s
arms and cried like babies.

Then came the parting. It is true that the Black Prince 297
asked us to go along with him to Bordeaux to stay there
for the winter with the promise that he would take us
with him in the early spring on a campaign into Spain.
For a while we were divided two ways, but the longing
for home won in the end. Charles was anxious to get
home to put his house in order and (now that he was
left alone) to give care to the estate. As for me, I knew
that my brother, André, was lying awake far into the
nights, wondering what had become of me and whether
he would ever lay eyes on me again. Besides the fall
was coming on (it was already September) and I knew
the streams were full of fish and that the woods about
my home were thick with game.

You should have been present in our village when we


rode in. The country folk (they had been warned of our
coming beforehand) gathered from the fields. They
wore their best of everything and I can tell you that
their simple dress of velvet jerkins, their breeches of
leather, their hats with feathers in them, never looked
more welcome or more pleasing to my eye. You would
have thought it was some great holiday for the country
players were assembled. Jugglers and sleight-of-hand
artists and to my surprise the man with the birds whom
I had met on my journey out, came to greet us and to
display the best of their wares. And in the midst of all
the merrymaking it was my brother, André, who was the
proudest man alive. He never left my side and when my
name was mentioned, he boasted of my courage and
my strength of will that led me on a quest through the
heart of our enemies, till I had to turn my face away in
shame.

We settled down to the quiet life of the countryside. The 298


first snows of winter came and the fields about the
house were covered white, when a courier rode into the
yard. He was from Bordeaux on his way to the great city
of Paris to negotiate for peace and a return of the King.
He had been commanded, he said, to deliver a letter
from his master, the Black Prince.

With my brother André looking over my shoulder, I


broke the seal and read,
At Bordeaux.
December

To Henri la Mar, the Norman,


My lad,

It has long been in my minde to write you a lettre


of thanks for the helpfull deed you performed. Your
name shall always be enscrolled in my memorie
and I shall think of you as a brave and worthie
servant of your countrie. If there come a time when
you wish to try your hande as a soldier of England,
you will but come to me.

Your timely warning saved an army from


destruction. Not only that, it saved your land and
fireside from the greed of your enemies.

Edward.

Postscriptum.

It may be to your interest to learn that De Marsac


recovered from the blow I gave him when we
fought together on the highway. But he was slain
later at Poitiers.

That was all.

“Well, Henri,” said André, “that letter is worth while.” 299


Transcriber’s Notes

Copyright notice provided as in the original—this e-text


is public domain in the country of publication.
Silently corrected palpable typos; left non-standard
spellings and dialect unchanged.
In the text versions, delimited italics text in
_underscores_ (the HTML version reproduces the font
form of the printed book.)
*** END OF THE PROJECT GUTENBERG EBOOK THE MESSENGER
OF THE BLACK PRINCE ***

Updated editions will replace the previous one—the old editions


will be renamed.

Creating the works from print editions not protected by U.S.


copyright law means that no one owns a United States
copyright in these works, so the Foundation (and you!) can copy
and distribute it in the United States without permission and
without paying copyright royalties. Special rules, set forth in the
General Terms of Use part of this license, apply to copying and
distributing Project Gutenberg™ electronic works to protect the
PROJECT GUTENBERG™ concept and trademark. Project
Gutenberg is a registered trademark, and may not be used if
you charge for an eBook, except by following the terms of the
trademark license, including paying royalties for use of the
Project Gutenberg trademark. If you do not charge anything for
copies of this eBook, complying with the trademark license is
very easy. You may use this eBook for nearly any purpose such
as creation of derivative works, reports, performances and
research. Project Gutenberg eBooks may be modified and
printed and given away—you may do practically ANYTHING in
the United States with eBooks not protected by U.S. copyright
law. Redistribution is subject to the trademark license, especially
commercial redistribution.

START: FULL LICENSE


THE FULL PROJECT GUTENBERG
LICENSE
PLEASE READ THIS BEFORE YOU DISTRIBUTE OR USE THIS WORK

To protect the Project Gutenberg™ mission of promoting the


free distribution of electronic works, by using or distributing this
work (or any other work associated in any way with the phrase
“Project Gutenberg”), you agree to comply with all the terms of
the Full Project Gutenberg™ License available with this file or
online at www.gutenberg.org/license.

Section 1. General Terms of Use and


Redistributing Project Gutenberg™
electronic works
1.A. By reading or using any part of this Project Gutenberg™
electronic work, you indicate that you have read, understand,
agree to and accept all the terms of this license and intellectual
property (trademark/copyright) agreement. If you do not agree
to abide by all the terms of this agreement, you must cease
using and return or destroy all copies of Project Gutenberg™
electronic works in your possession. If you paid a fee for
obtaining a copy of or access to a Project Gutenberg™
electronic work and you do not agree to be bound by the terms
of this agreement, you may obtain a refund from the person or
entity to whom you paid the fee as set forth in paragraph 1.E.8.

1.B. “Project Gutenberg” is a registered trademark. It may only


be used on or associated in any way with an electronic work by
people who agree to be bound by the terms of this agreement.
There are a few things that you can do with most Project
Gutenberg™ electronic works even without complying with the
full terms of this agreement. See paragraph 1.C below. There
are a lot of things you can do with Project Gutenberg™
electronic works if you follow the terms of this agreement and
help preserve free future access to Project Gutenberg™
electronic works. See paragraph 1.E below.
1.C. The Project Gutenberg Literary Archive Foundation (“the
Foundation” or PGLAF), owns a compilation copyright in the
collection of Project Gutenberg™ electronic works. Nearly all the
individual works in the collection are in the public domain in the
United States. If an individual work is unprotected by copyright
law in the United States and you are located in the United
States, we do not claim a right to prevent you from copying,
distributing, performing, displaying or creating derivative works
based on the work as long as all references to Project
Gutenberg are removed. Of course, we hope that you will
support the Project Gutenberg™ mission of promoting free
access to electronic works by freely sharing Project Gutenberg™
works in compliance with the terms of this agreement for
keeping the Project Gutenberg™ name associated with the
work. You can easily comply with the terms of this agreement
by keeping this work in the same format with its attached full
Project Gutenberg™ License when you share it without charge
with others.

1.D. The copyright laws of the place where you are located also
govern what you can do with this work. Copyright laws in most
countries are in a constant state of change. If you are outside
the United States, check the laws of your country in addition to
the terms of this agreement before downloading, copying,
displaying, performing, distributing or creating derivative works
based on this work or any other Project Gutenberg™ work. The
Foundation makes no representations concerning the copyright
status of any work in any country other than the United States.

1.E. Unless you have removed all references to Project


Gutenberg:

1.E.1. The following sentence, with active links to, or other


immediate access to, the full Project Gutenberg™ License must
appear prominently whenever any copy of a Project
Gutenberg™ work (any work on which the phrase “Project
Gutenberg” appears, or with which the phrase “Project
Gutenberg” is associated) is accessed, displayed, performed,
viewed, copied or distributed:

This eBook is for the use of anyone anywhere


in the United States and most other parts of
the world at no cost and with almost no
restrictions whatsoever. You may copy it, give it
away or re-use it under the terms of the
Project Gutenberg License included with this
eBook or online at www.gutenberg.org. If you
are not located in the United States, you will
have to check the laws of the country where
you are located before using this eBook.

1.E.2. If an individual Project Gutenberg™ electronic work is


derived from texts not protected by U.S. copyright law (does not
contain a notice indicating that it is posted with permission of
the copyright holder), the work can be copied and distributed to
anyone in the United States without paying any fees or charges.
If you are redistributing or providing access to a work with the
phrase “Project Gutenberg” associated with or appearing on the
work, you must comply either with the requirements of
paragraphs 1.E.1 through 1.E.7 or obtain permission for the use
of the work and the Project Gutenberg™ trademark as set forth
in paragraphs 1.E.8 or 1.E.9.

1.E.3. If an individual Project Gutenberg™ electronic work is posted with the permission of the copyright holder, your use and
distribution must comply with both paragraphs 1.E.1 through
1.E.7 and any additional terms imposed by the copyright holder.
Additional terms will be linked to the Project Gutenberg™
License for all works posted with the permission of the copyright
holder found at the beginning of this work.

1.E.4. Do not unlink or detach or remove the full Project
Gutenberg™ License terms from this work, or any files
containing a part of this work or any other work associated with
Project Gutenberg™.

1.E.5. Do not copy, display, perform, distribute or redistribute this electronic work, or any part of this electronic work, without
prominently displaying the sentence set forth in paragraph 1.E.1
with active links or immediate access to the full terms of the
Project Gutenberg™ License.

1.E.6. You may convert to and distribute this work in any binary,
compressed, marked up, nonproprietary or proprietary form,
including any word processing or hypertext form. However, if
you provide access to or distribute copies of a Project
Gutenberg™ work in a format other than “Plain Vanilla ASCII” or
other format used in the official version posted on the official
Project Gutenberg™ website (www.gutenberg.org), you must,
at no additional cost, fee or expense to the user, provide a copy,
a means of exporting a copy, or a means of obtaining a copy
upon request, of the work in its original “Plain Vanilla ASCII” or
other form. Any alternate format must include the full Project
Gutenberg™ License as specified in paragraph 1.E.1.

1.E.7. Do not charge a fee for access to, viewing, displaying, performing, copying or distributing any Project Gutenberg™
works unless you comply with paragraph 1.E.8 or 1.E.9.

1.E.8. You may charge a reasonable fee for copies of or providing access to or distributing Project Gutenberg™
electronic works provided that:

• You pay a royalty fee of 20% of the gross profits you derive from the use of Project Gutenberg™ works
calculated using the method you already use to
calculate your applicable taxes. The fee is owed to the
owner of the Project Gutenberg™ trademark, but he has
agreed to donate royalties under this paragraph to the
Project Gutenberg Literary Archive Foundation. Royalty
payments must be paid within 60 days following each
date on which you prepare (or are legally required to
prepare) your periodic tax returns. Royalty payments
should be clearly marked as such and sent to the
Project Gutenberg Literary Archive Foundation at the
address specified in Section 4, “Information about
donations to the Project Gutenberg Literary Archive
Foundation.”

• You provide a full refund of any money paid by a user who notifies you in writing (or by e-mail) within 30 days
of receipt that s/he does not agree to the terms of the
full Project Gutenberg™ License. You must require such
a user to return or destroy all copies of the works
possessed in a physical medium and discontinue all use
of and all access to other copies of Project Gutenberg™
works.

• You provide, in accordance with paragraph 1.F.3, a full refund of any money paid for a work or a replacement
copy, if a defect in the electronic work is discovered and
reported to you within 90 days of receipt of the work.

• You comply with all other terms of this agreement for free distribution of Project Gutenberg™ works.

1.E.9. If you wish to charge a fee or distribute a Project Gutenberg™ electronic work or group of works on different
terms than are set forth in this agreement, you must obtain
permission in writing from the Project Gutenberg Literary
Archive Foundation, the manager of the Project Gutenberg™
trademark. Contact the Foundation as set forth in Section 3
below.

1.F.

1.F.1. Project Gutenberg volunteers and employees expend considerable effort to identify, do copyright research on,
transcribe and proofread works not protected by U.S. copyright
law in creating the Project Gutenberg™ collection. Despite these
efforts, Project Gutenberg™ electronic works, and the medium
on which they may be stored, may contain “Defects,” such as,
but not limited to, incomplete, inaccurate or corrupt data,
transcription errors, a copyright or other intellectual property
infringement, a defective or damaged disk or other medium, a
computer virus, or computer codes that damage or cannot be
read by your equipment.

1.F.2. LIMITED WARRANTY, DISCLAIMER OF DAMAGES - Except for the “Right of Replacement or Refund” described in
paragraph 1.F.3, the Project Gutenberg Literary Archive
Foundation, the owner of the Project Gutenberg™ trademark,
and any other party distributing a Project Gutenberg™ electronic
work under this agreement, disclaim all liability to you for
damages, costs and expenses, including legal fees. YOU AGREE
THAT YOU HAVE NO REMEDIES FOR NEGLIGENCE, STRICT
LIABILITY, BREACH OF WARRANTY OR BREACH OF CONTRACT
EXCEPT THOSE PROVIDED IN PARAGRAPH 1.F.3. YOU AGREE
THAT THE FOUNDATION, THE TRADEMARK OWNER, AND ANY
DISTRIBUTOR UNDER THIS AGREEMENT WILL NOT BE LIABLE
TO YOU FOR ACTUAL, DIRECT, INDIRECT, CONSEQUENTIAL,
PUNITIVE OR INCIDENTAL DAMAGES EVEN IF YOU GIVE
NOTICE OF THE POSSIBILITY OF SUCH DAMAGE.

1.F.3. LIMITED RIGHT OF REPLACEMENT OR REFUND - If you discover a defect in this electronic work within 90 days of
receiving it, you can receive a refund of the money (if any) you
paid for it by sending a written explanation to the person you
received the work from. If you received the work on a physical
medium, you must return the medium with your written
explanation. The person or entity that provided you with the
defective work may elect to provide a replacement copy in lieu
of a refund. If you received the work electronically, the person
or entity providing it to you may choose to give you a second
opportunity to receive the work electronically in lieu of a refund.
If the second copy is also defective, you may demand a refund
in writing without further opportunities to fix the problem.

1.F.4. Except for the limited right of replacement or refund set forth in paragraph 1.F.3, this work is provided to you ‘AS-IS’,
WITH NO OTHER WARRANTIES OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO WARRANTIES OF
MERCHANTABILITY OR FITNESS FOR ANY PURPOSE.

1.F.5. Some states do not allow disclaimers of certain implied warranties or the exclusion or limitation of certain types of
damages. If any disclaimer or limitation set forth in this
agreement violates the law of the state applicable to this
agreement, the agreement shall be interpreted to make the
maximum disclaimer or limitation permitted by the applicable
state law. The invalidity or unenforceability of any provision of
this agreement shall not void the remaining provisions.

1.F.6. INDEMNITY - You agree to indemnify and hold the Foundation, the trademark owner, any agent or employee of the
Foundation, anyone providing copies of Project Gutenberg™
electronic works in accordance with this agreement, and any
volunteers associated with the production, promotion and
distribution of Project Gutenberg™ electronic works, harmless
from all liability, costs and expenses, including legal fees, that
arise directly or indirectly from any of the following which you
do or cause to occur: (a) distribution of this or any Project
Gutenberg™ work, (b) alteration, modification, or additions or
deletions to any Project Gutenberg™ work, and (c) any Defect
you cause.

Section 2. Information about the Mission
of Project Gutenberg™
Project Gutenberg™ is synonymous with the free distribution of
electronic works in formats readable by the widest variety of
computers including obsolete, old, middle-aged and new
computers. It exists because of the efforts of hundreds of
volunteers and donations from people in all walks of life.

Volunteers and financial support to provide volunteers with the assistance they need are critical to reaching Project
Gutenberg™’s goals and ensuring that the Project Gutenberg™
collection will remain freely available for generations to come. In
2001, the Project Gutenberg Literary Archive Foundation was
created to provide a secure and permanent future for Project
Gutenberg™ and future generations. To learn more about the
Project Gutenberg Literary Archive Foundation and how your
efforts and donations can help, see Sections 3 and 4 and the
Foundation information page at www.gutenberg.org.

Section 3. Information about the Project Gutenberg Literary Archive Foundation
The Project Gutenberg Literary Archive Foundation is a non-
profit 501(c)(3) educational corporation organized under the
laws of the state of Mississippi and granted tax exempt status
by the Internal Revenue Service. The Foundation’s EIN or
federal tax identification number is 64-6221541. Contributions
to the Project Gutenberg Literary Archive Foundation are tax
deductible to the full extent permitted by U.S. federal laws and
your state’s laws.

The Foundation’s business office is located at 809 North 1500 West, Salt Lake City, UT 84116, (801) 596-1887. Email contact
links and up to date contact information can be found at the
Foundation’s website and official page at
www.gutenberg.org/contact

Section 4. Information about Donations to the Project Gutenberg Literary Archive Foundation
Project Gutenberg™ depends upon and cannot survive without
widespread public support and donations to carry out its mission
of increasing the number of public domain and licensed works
that can be freely distributed in machine-readable form
accessible by the widest array of equipment including outdated
equipment. Many small donations ($1 to $5,000) are particularly
important to maintaining tax exempt status with the IRS.

The Foundation is committed to complying with the laws regulating charities and charitable donations in all 50 states of
the United States. Compliance requirements are not uniform
and it takes a considerable effort, much paperwork and many
fees to meet and keep up with these requirements. We do not
solicit donations in locations where we have not received written
confirmation of compliance. To SEND DONATIONS or determine
the status of compliance for any particular state visit
www.gutenberg.org/donate.

While we cannot and do not solicit contributions from states where we have not met the solicitation requirements, we know
of no prohibition against accepting unsolicited donations from
donors in such states who approach us with offers to donate.

International donations are gratefully accepted, but we cannot make any statements concerning tax treatment of donations
received from outside the United States. U.S. laws alone swamp
our small staff.
Please check the Project Gutenberg web pages for current
donation methods and addresses. Donations are accepted in a
number of other ways including checks, online payments and
credit card donations. To donate, please visit:
www.gutenberg.org/donate.

Section 5. General Information About Project Gutenberg™ electronic works
Professor Michael S. Hart was the originator of the Project
Gutenberg™ concept of a library of electronic works that could
be freely shared with anyone. For forty years, he produced and
distributed Project Gutenberg™ eBooks with only a loose
network of volunteer support.

Project Gutenberg™ eBooks are often created from several printed editions, all of which are confirmed as not protected by
copyright in the U.S. unless a copyright notice is included. Thus,
we do not necessarily keep eBooks in compliance with any
particular paper edition.

Most people start at our website which has the main PG search
facility: www.gutenberg.org.

This website includes information about Project Gutenberg™, including how to make donations to the Project Gutenberg
Literary Archive Foundation, how to help produce our new
eBooks, and how to subscribe to our email newsletter to hear
about new eBooks.

Welcome to our website – the ideal destination for book lovers and
knowledge seekers. With a mission to inspire endlessly, we offer a
vast collection of books, ranging from classic literary works to
specialized publications, self-development books, and children's
literature. Each book is a new journey of discovery, expanding
knowledge and enriching the soul of the reader.

Our website is not just a platform for buying books, but a bridge
connecting readers to the timeless values of culture and wisdom. With
an elegant, user-friendly interface and an intelligent search system,
we are committed to providing a quick and convenient shopping
experience. Additionally, our special promotions and home delivery
services ensure that you save time and fully enjoy the joy of reading.

Let us accompany you on the journey of exploring knowledge and personal growth!
