Genetic Algorithms and Genetic Programming
Modern Concepts and Practical Applications



Numerical Insights
Series Editor
A. Sydow, GMD-FIRST, Berlin, Germany

Editorial Board
P. Borne, École de Lille, France; G. Carmichael, University of Iowa, USA;
L. Dekker, Delft University of Technology, The Netherlands; A. Iserles, University of
Cambridge, UK; A. Jakeman, Australian National University, Australia;
G. Korn, Industrial Consultants (Tucson), USA; G.P. Rao, Indian Institute of Technology,
India; J.R. Rice, Purdue University, USA; A.A. Samarskii, Russian Academy of Science,
Russia; Y. Takahara, Tokyo Institute of Technology, Japan

The Numerical Insights series aims to show how numerical simulations provide valuable insights
into the mechanisms and processes involved in a wide range of disciplines. Such simulations
provide a way of assessing theories by comparing simulations with observations. These models are
also powerful tools which serve to indicate where both theory and experiment can be improved.
In most cases the books will be accompanied by software on disk demonstrating working
examples of the simulations described in the text.
The editors will welcome proposals using modelling, simulation and systems analysis
techniques in the following disciplines: physical sciences; engineering; environment; ecology;
biosciences; economics.

Volume 1
Numerical Insights into Dynamic Systems: Interactive Dynamic System Simulation with
Microsoft® Windows™ and NT™
Granino A. Korn

Volume 2
Modelling, Simulation and Control of Non-Linear Dynamical Systems: An Intelligent Approach
using Soft Computing and Fractal Theory
Patricia Melin and Oscar Castillo

Volume 3
Principles of Mathematical Modeling: Ideas, Methods, Examples
A.A. Samarskii and A. P. Mikhailov

Volume 4
Practical Fourier Analysis for Multigrid Methods
Roman Wienands and Wolfgang Joppich

Volume 5
Effective Computational Methods for Wave Propagation
Nikolaos A. Kampanis, Vassilios A. Dougalis, and John A. Ekaterinaris

Volume 6
Genetic Algorithms and Genetic Programming: Modern Concepts and Practical Applications
Michael Affenzeller, Stephan Winkler, Stefan Wagner, and Andreas Beham
Genetic Algorithms and Genetic Programming
Modern Concepts and Practical Applications

Michael Affenzeller, Stephan Winkler, Stefan Wagner, and Andreas Beham



Chapman & Hall/CRC
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487‑2742
© 2009 by Taylor & Francis Group, LLC
Chapman & Hall/CRC is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works


Printed in the United States of America on acid‑free paper
10 9 8 7 6 5 4 3 2 1

International Standard Book Number‑13: 978‑1‑58488‑629‑7 (Hardcover)

This book contains information obtained from authentic and highly regarded sources. Reasonable
efforts have been made to publish reliable data and information, but the author and publisher cannot
assume responsibility for the validity of all materials or the consequences of their use. The
authors and publishers have attempted to trace the copyright holders of all material reproduced
in this publication and apologize to copyright holders if permission to publish in this form has not
been obtained. If any copyright material has not been acknowledged please write and let us know so
we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced,
transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or
hereafter invented, including photocopying, microfilming, and recording, or in any information
storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access
www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222
Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides
licenses and registration for a variety of users. For organizations that have been granted a
photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and
are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging‑in‑Publication Data

Genetic algorithms and genetic programming : modern concepts and practical applications / Michael Affenzeller ... [et al.].
p. cm. -- (Numerical insights ; v. 6)
Includes bibliographical references and index.
ISBN 978‑1‑58488‑629‑7 (hardcover : alk. paper)
1. Algorithms. 2. Combinatorial optimization. 3. Programming (Mathematics)
4. Evolutionary computation. I. Affenzeller, Michael. II. Title. III. Series.

QA9.58.G46 2009
006.3’1‑‑dc22 2009003656

Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com
and the CRC Press Web site at http://www.crcpress.com



Contents

List of Tables xi

List of Figures xv

List of Algorithms xxiii

Introduction xxv

1 Simulating Evolution: Basics about Genetic Algorithms 1


1.1 The Evolution of Evolutionary Computation . . . . . . . . . 1
1.2 The Basics of Genetic Algorithms . . . . . . . . . . . . . . . 2
1.3 Biological Terminology . . . . . . . . . . . . . . . . . . . . . 3
1.4 Genetic Operators . . . . . . . . . . . . . . . . . . . . . . . . 6
1.4.1 Models for Parent Selection . . . . . . . . . . . . . . . 6
1.4.2 Recombination (Crossover) . . . . . . . . . . . . . . . 7
1.4.3 Mutation . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.4.4 Replacement Schemes . . . . . . . . . . . . . . . . . . 9
1.5 Problem Representation . . . . . . . . . . . . . . . . . . . . . 10
1.5.1 Binary Representation . . . . . . . . . . . . . . . . . . 11
1.5.2 Adjacency Representation . . . . . . . . . . . . . . . . 12
1.5.3 Path Representation . . . . . . . . . . . . . . . . . . . 13
1.5.4 Other Representations for Combinatorial Optimization
Problems . . . . . . . . . . . . . . . . . . . . . . . . . 13
1.5.5 Problem Representations for Real-Valued Encoding . . 14
1.6 GA Theory: Schemata and Building Blocks . . . . . . . . . . 14
1.7 Parallel Genetic Algorithms . . . . . . . . . . . . . . . . . . . 17
1.7.1 Global Parallelization . . . . . . . . . . . . . . . . . . 18
1.7.2 Coarse-Grained Parallel GAs . . . . . . . . . . . . . . 19
1.7.3 Fine-Grained Parallel GAs . . . . . . . . . . . . . . . 20
1.7.4 Migration . . . . . . . . . . . . . . . . . . . . . . . . . 21
1.8 The Interplay of Genetic Operators . . . . . . . . . . . . . . 22
1.9 Bibliographic Remarks . . . . . . . . . . . . . . . . . . . . . 23

2 Evolving Programs: Genetic Programming 25


2.1 Introduction: Main Ideas and Historical Background . . . . . 26
2.2 Chromosome Representation . . . . . . . . . . . . . . . . . . 28
2.2.1 Hierarchical Labeled Structure Trees . . . . . . . . . . 28


2.2.2 Automatically Defined Functions and Modular Genetic Programming . . . . . . . . . . . . . . . 35
2.2.3 Other Representations . . . . . . . . . . . . . . . . . . 36
2.3 Basic Steps of the GP-Based Problem Solving Process . . . . 37
2.3.1 Preparatory Steps . . . . . . . . . . . . . . . . . . . . 37
2.3.2 Initialization . . . . . . . . . . . . . . . . . . . . . . . 39
2.3.3 Breeding Populations of Programs . . . . . . . . . . . 39
2.3.4 Process Termination and Results Designation . . . . . 41
2.4 Typical Applications of Genetic Programming . . . . . . . . 43
2.4.1 Automated Learning of Multiplexer Functions . . . . . 43
2.4.2 The Artificial Ant . . . . . . . . . . . . . . . . . . . . 44
2.4.3 Symbolic Regression . . . . . . . . . . . . . . . . . . . 46
2.4.4 Other GP Applications . . . . . . . . . . . . . . . . . 49
2.5 GP Schema Theories . . . . . . . . . . . . . . . . . . . . . . 50
2.5.1 Program Component GP Schemata . . . . . . . . . . . 51
2.5.2 Rooted Tree GP Schema Theories . . . . . . . . . . . 52
2.5.3 Exact GP Schema Theory . . . . . . . . . . . . . . . . 54
2.5.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 59
2.6 Current GP Challenges and Research Areas . . . . . . . . . 59
2.7 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
2.8 Bibliographic Remarks . . . . . . . . . . . . . . . . . . . . . 62

3 Problems and Success Factors 65


3.1 What Makes GAs and GP Unique among Intelligent
Optimization Methods? . . . . . . . . . . . . . . . . . . . . . 65
3.2 Stagnation and Premature Convergence . . . . . . . . . . . . 66

4 Preservation of Relevant Building Blocks 69


4.1 What Can Extended Selection Concepts Do to Avoid
Premature Convergence? . . . . . . . . . . . . . . . . . . . . 69
4.2 Offspring Selection (OS) . . . . . . . . . . . . . . . . . . . . 70
4.3 The Relevant Alleles Preserving Genetic Algorithm (RAPGA) 73
4.4 Consequences Arising out of Offspring Selection and RAPGA 76

5 SASEGASA – More than the Sum of All Parts 79


5.1 The Interplay of Distributed Search and Systematic Recovery
of Essential Genetic Information . . . . . . . . . . . . . . . . 80
5.2 Migration Revisited . . . . . . . . . . . . . . . . . . . . . . . 81
5.3 SASEGASA: A Novel and Self-Adaptive Parallel Genetic
Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
5.3.1 The Core Algorithm . . . . . . . . . . . . . . . . . . . 83
5.4 Interactions among Genetic Drift, Migration, and Self-Adaptive
Selection Pressure . . . . . . . . . . . . . . . . . . . . . . . . 86


6 Analysis of Population Dynamics 89


6.1 Parent Analysis . . . . . . . . . . . . . . . . . . . . . . . . . 89
6.2 Genetic Diversity . . . . . . . . . . . . . . . . . . . . . . . . 90
6.2.1 In Single-Population GAs . . . . . . . . . . . . . . . . 90
6.2.2 In Multi-Population GAs . . . . . . . . . . . . . . . . 91
6.2.3 Application Examples . . . . . . . . . . . . . . . . . . 92

7 Characteristics of Offspring Selection and the RAPGA 97


7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . 97
7.2 Building Block Analysis for Standard GAs . . . . . . . . . . 98
7.3 Building Block Analysis for GAs Using Offspring Selection . 103
7.4 Building Block Analysis for the Relevant Alleles Preserving GA
(RAPGA) . . . . . . . . . . . . . . . . . . . . . . . . . . . . 113

8 Combinatorial Optimization: Route Planning 121


8.1 The Traveling Salesman Problem . . . . . . . . . . . . . . . . 121
8.1.1 Problem Statement and Solution Methodology . . . . 122
8.1.2 Review of Approximation Algorithms and Heuristics . 125
8.1.3 Multiple Traveling Salesman Problems . . . . . . . . . 130
8.1.4 Genetic Algorithm Approaches . . . . . . . . . . . . . 130
8.2 The Capacitated Vehicle Routing Problem . . . . . . . . . . 139
8.2.1 Problem Statement and Solution Methodology . . . . 140
8.2.2 Genetic Algorithm Approaches . . . . . . . . . . . . . 147

9 Evolutionary System Identification 157


9.1 Data-Based Modeling and System Identification . . . . . . . 157
9.1.1 Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
9.1.2 An Example . . . . . . . . . . . . . . . . . . . . . . . 159
9.1.3 The Basic Steps in System Identification . . . . . . . . 166
9.1.4 Data-Based Modeling Using Genetic Programming . . 169
9.2 GP-Based System Identification in HeuristicLab . . . . . . . 170
9.2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . 170
9.2.2 Problem Representation . . . . . . . . . . . . . . . . . 171
9.2.3 The Functions and Terminals Basis . . . . . . . . . . . 173
9.2.4 Solution Representation . . . . . . . . . . . . . . . . . 178
9.2.5 Solution Evaluation . . . . . . . . . . . . . . . . . . . 182
9.3 Local Adaption Embedded in Global Optimization . . . . . . 188
9.3.1 Parameter Optimization . . . . . . . . . . . . . . . . . 189
9.3.2 Pruning . . . . . . . . . . . . . . . . . . . . . . . . . . 192
9.4 Similarity Measures for Solution Candidates . . . . . . . . . 197
9.4.1 Evaluation-Based Similarity Measures . . . . . . . . . 199
9.4.2 Structural Similarity Measures . . . . . . . . . . . . . 201


10 Applications of Genetic Algorithms: Combinatorial Optimization 207
10.1 The Traveling Salesman Problem . . . . . . . . . . . . . . . . 208
10.1.1 Performance Increase of Results of Different Crossover
Operators by Means of Offspring Selection . . . . . . . 208
10.1.2 Scalability of Global Solution Quality by SASEGASA 210
10.1.3 Comparison of the SASEGASA to the Island-Model
Coarse-Grained Parallel GA . . . . . . . . . . . . . . . 214
10.1.4 Genetic Diversity Analysis for the Different GA Types 217
10.2 Capacitated Vehicle Routing . . . . . . . . . . . . . . . . . . 221
10.2.1 Results Achieved Using Standard Genetic Algorithms 222
10.2.2 Results Achieved Using Genetic Algorithms with
Offspring Selection . . . . . . . . . . . . . . . . . . . . 226

11 Data-Based Modeling with Genetic Programming 235


11.1 Time Series Analysis . . . . . . . . . . . . . . . . . . . . . . 235
11.1.1 Time Series Specific Evaluation . . . . . . . . . . . . . 236
11.1.2 Application Example: Design of Virtual Sensors for
Emissions of Diesel Engines . . . . . . . . . . . . . . . 237
11.2 Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
11.2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . 251
11.2.2 Real-Valued Classification with Genetic Programming 251
11.2.3 Analyzing Classifiers . . . . . . . . . . . . . . . . . . . 252
11.2.4 Classification Specific Evaluation in GP . . . . . . . . 258
11.2.5 Application Example: Medical Data Analysis . . . . . 263
11.3 Genetic Propagation . . . . . . . . . . . . . . . . . . . . . . . 285
11.3.1 Test Setup . . . . . . . . . . . . . . . . . . . . . . . . 285
11.3.2 Test Results . . . . . . . . . . . . . . . . . . . . . . . . 286
11.3.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . 288
11.3.4 Additional Tests Using Random Parent Selection . . . 289
11.4 Single Population Diversity Analysis . . . . . . . . . . . . . . 292
11.4.1 GP Test Strategies . . . . . . . . . . . . . . . . . . . . 292
11.4.2 Test Results . . . . . . . . . . . . . . . . . . . . . . . . 293
11.4.3 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . 297
11.5 Multi-Population Diversity Analysis . . . . . . . . . . . . . . 300
11.5.1 GP Test Strategies . . . . . . . . . . . . . . . . . . . . 300
11.5.2 Test Results . . . . . . . . . . . . . . . . . . . . . . . . 301
11.5.3 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . 303
11.6 Code Bloat, Pruning, and Population Diversity . . . . . . . . 306
11.6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . 306
11.6.2 Test Strategies . . . . . . . . . . . . . . . . . . . . . . 307
11.6.3 Test Results . . . . . . . . . . . . . . . . . . . . . . . . 309
11.6.4 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . 318

Conclusion and Outlook 321


Symbols and Abbreviations 325

References 327



List of Tables

7.1 Parameters for test runs using a conventional GA. . . . . . 99


7.2 Parameters for test runs using a GA with offspring selection. 104
7.3 Parameters for test runs using the relevant alleles preserving
genetic algorithm. . . . . . . . . . . . . . . . . . . . . . . . . 113

8.1 Exemplary edge map of the parent tours for an ERX operator. 138

9.1 Data-based modeling example: Training data. . . . . . . . . 160


9.2 Data-based modeling example: Test data. . . . . . . . . . . 164

10.1 Overview of algorithm parameters. . . . . . . . . . . . . . . 209


10.2 Experimental results achieved using a standard GA. . . . . 209
10.3 Experimental results achieved using a GA with offspring se-
lection. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
10.4 Parameter values used in the test runs of the SASEGASA
algorithms with single crossover operators as well as with a
combination of the operators. . . . . . . . . . . . . . . . . . 211
10.5 Results showing the scaling properties of SASEGASA with
one crossover operator (OX), with and without mutation. . 211
10.6 Results showing the scaling properties of SASEGASA with
one crossover operator (ERX), with and without mutation. 212
10.7 Results showing the scaling properties of SASEGASA with
one crossover operator (MPX), with and without mutation. 212
10.8 Results showing the scaling properties of SASEGASA with a
combination of crossover operators (OX, ERX, MPX), with
and without mutation. . . . . . . . . . . . . . . . . . . . . . 213
10.9 Parameter values used in the test runs of a island model GA
with various operators and various numbers of demes. . . . 215
10.10 Results showing the scaling properties of an island GA with
one crossover operator (OX) using roulette-wheel selection,
with and without mutation. . . . . . . . . . . . . . . . . . . 215
10.11 Results showing the scaling properties of an island GA with
one crossover operator (ERX) using roulette-wheel selection,
with and without mutation. . . . . . . . . . . . . . . . . . . 216
10.12 Results showing the scaling properties of an island GA with
one crossover operator (MPX) using roulette-wheel selection,
with and without mutation. . . . . . . . . . . . . . . . . . . 216


10.13 Parameter values used in the CVRP test runs applying a stan-
dard GA. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 223
10.14 Results of a GA using roulette-wheel selection, 3-tournament
selection and various mutation operators. . . . . . . . . . . 226
10.15 Parameter values used in CVRP test runs applying a GA with
OS. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
10.16 Results of a GA with offspring selection and population sizes
of 200 and 400 and various mutation operators. The configu-
ration is listed in Table 10.15. . . . . . . . . . . . . . . . . . 232
10.17 Showing results of a GA with offspring and a population size
of 500 and various mutation operators. The configuration is
listed in Table 10.15. . . . . . . . . . . . . . . . . . . . . . . 234

11.1 Linear correlation of input variables and the target values (NOx) in the NOx data set I. . . . . . . . . . . . . . . . . 240
11.2 Mean squared errors on training data for the NOx data set I. 241
11.3 Statistic features of the identification relevant variables in the
NOx data set II. . . . . . . . . . . . . . . . . . . . . . . . . 246
11.4 Linear correlation coefficients of the variables relevant in the
NOx data set II. . . . . . . . . . . . . . . . . . . . . . . . . 248
11.5 Statistic features of the variables in the NOx data set III. . 250
11.6 Linear correlation coefficients of the variables relevant in the
NOx data set III. . . . . . . . . . . . . . . . . . . . . . . . 250
11.7 Exemplary confusion matrix with three classes . . . . . . . 253
11.8 Exemplary confusion matrix with two classes . . . . . . . . 254
11.9 Set of function and terminal definitions for enhanced GP-
based classification. . . . . . . . . . . . . . . . . . . . . . . . 264
11.10 Experimental results for the Thyroid data set. . . . . . . . . 270
11.11 Summary of the best GP parameter settings for solving clas-
sification problems. . . . . . . . . . . . . . . . . . . . . . . . 271
11.12 Summary of training and test results for the Wisconsin data
set: Correct classification rates (average values and standard
deviation values) for 10-fold CV partitions, produced by GP
with offspring selection. . . . . . . . . . . . . . . . . . . . . 279
11.13 Comparison of machine learning methods: Average test ac-
curacy of classifiers for the Wisconsin data set. . . . . . . . 280
11.14 Confusion matrices for average classification results produced
by GP with OS for the Melanoma data set. . . . . . . . . . 280
11.15 Comparison of machine learning methods: Average test ac-
curacy of classifiers for the Melanoma data set. . . . . . . . 281
11.16 Summary of training and test results for the Thyroid data
set: Correct classification rates (average values and standard
deviation values) for 10-fold CV partitions, produced by GP
with offspring selection. . . . . . . . . . . . . . . . . . . . . 282


11.17 Comparison of machine learning methods: Average test ac-


curacy of classifiers for the Thyroid data set. . . . . . . . . 283
11.18 GP test strategies. . . . . . . . . . . . . . . . . . . . . . . . 285
11.19 Test results. . . . . . . . . . . . . . . . . . . . . . . . . . . . 286
11.20 Average overall genetic propagation of population partitions. 287
11.21 Additional test strategies for genetic propagation tests. . . . 289
11.22 Test results in additional genetic propagation tests (using ran-
dom parent selection). . . . . . . . . . . . . . . . . . . . . . 290
11.23 Average overall genetic propagation of population partitions
for random parent selection tests. . . . . . . . . . . . . . . . 290
11.24 GP test strategies. . . . . . . . . . . . . . . . . . . . . . . . 293
11.25 Test results: Solution qualities. . . . . . . . . . . . . . . . . 294
11.26 Test results: Population diversity (average similarity values;
avg., std.). . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
11.27 Test results: Population diversity (maximum similarity val-
ues; avg., std.). . . . . . . . . . . . . . . . . . . . . . . . . . 296
11.28 GP test strategies. . . . . . . . . . . . . . . . . . . . . . . . 302
11.29 Multi-population diversity test results of the GP test runs
using the Thyroid data set. . . . . . . . . . . . . . . . . . . 303
11.30 Multi-population diversity test results of the GP test runs
using the NOx data set III. . . . . . . . . . . . . . . . . . 304
11.31 GP parameters used for code growth and bloat prevention
tests. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 307
11.32 Summary of the code growth prevention strategies applied in
these test series. . . . . . . . . . . . . . . . . . . . . . . . . . 308
11.33 Performance of systematic and ES-based pruning strategies. 310
11.34 Formula size progress in test series (d). . . . . . . . . . . . . 311
11.35 Quality of results produced in test series (d). . . . . . . . . 311
11.36 Formula size and population diversity progress in test series
(e). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 312
11.37 Formula size and population diversity progress in test series
(f). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 313
11.38 Quality of results produced in test series (f). . . . . . . . . . 313
11.39 Formula size and population diversity progress in test series
(g). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 314
11.40 Quality of results produced in test series (g). . . . . . . . . 314
11.41 Formula size and population diversity progress in test series
(h). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
11.42 Quality of results produced in test series (h). . . . . . . . . 316
11.43 Comparison of best models on training and validation data
(bt and bv, respectively). . . . . . . . . . . . . . . . . . . . 317
11.44 Formula size and population diversity progress in test series
(i). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 320
11.45 Quality of results produced in test series (i). . . . . . . . . . 320



List of Figures

1.1 The canonical genetic algorithm with binary solution encoding. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Schematic display of a single point crossover. . . . . . . . . 8
1.3 Global parallelization concepts: A panmictic population struc-
ture (shown in left picture) and the corresponding master–
slave model (right picture). . . . . . . . . . . . . . . . . . . 18
1.4 Population structure of a coarse-grained parallel GA. . . . . 19
1.5 Population structure of a fine-grained parallel GA; the special
case of a cellular model is shown here. . . . . . . . . . . . . 20

2.1 Exemplary programs given as rooted, labeled structure trees. 30


2.2 Exemplary evaluation of program (a). . . . . . . . . . . . . 31
2.3 Exemplary evaluation of program (b). . . . . . . . . . . . . 32
2.4 Exemplary crossover of programs (1) and (2) labeled as par-
ent1 and parent2, respectively. Child1 and child2 are possible
new offspring programs formed out of the genetic material of
their parents. . . . . . . . . . . . . . . . . . . . . . . . . . . 34
2.5 Exemplary mutation of a program: The programs mutant1,
mutant2, and mutant3 are possible mutants of parent. . . . 35
2.6 Intron-augmented representation of an exemplary program in
PDGP [Pol99b]. . . . . . . . . . . . . . . . . . . . . . . . . . 38
2.7 Major preparatory steps of the basic GP process. . . . . . . 38
2.8 The genetic programming cycle [LP02]. . . . . . . . . . . . . 40
2.9 The GP-based problem solving process. . . . . . . . . . . . 41
2.10 GA and GP flowcharts: The conventional genetic algorithm
and genetic programming. . . . . . . . . . . . . . . . . . . . 42
2.11 The Boolean multiplexer with three address bits; (a) general
black box model, (b) addressing data bit d5 . . . . . . . . . . 44
2.12 A correct solution to the 3-address Boolean multiplexer prob-
lem [Koz92b]. . . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.13 The Santa Fe trail. . . . . . . . . . . . . . . . . . . . . . . . 45
2.14 A Santa Fe trail solution. The black points represent nodes
referencing to the Prog3 function. . . . . . . . . . . . . . . . 46
2.15 A symbolic regression example. . . . . . . . . . . . . . . . . 48
2.16 Exemplary formulas. . . . . . . . . . . . . . . . . . . . . . . 49
2.17 Programs matching Koza’s schema H=[(+ x 3), y]. . . . . . 51


2.18 The rooted tree GP schema ∗(=, = (x, =)) and three exem-
plary programs of the schema’s semantics. . . . . . . . . . . 53
2.19 The GP schema H = +(*(=,x),=) and exemplary u and l
schemata. Cross bars indicate crossover points; shaded re-
gions show the parts of H that are replaced by “don’t care”
symbols. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56
2.20 The GP hyperschema ∗(#, = (x, =)) and three exemplary
programs that are a part of the schema’s semantics. . . . . 56
2.21 The GP schema H = +(∗(=, x), =) and exemplary U and L
hyperschema building blocks. Cross bars indicate crossover
points; shaded regions show the parts of H that are modified. 57
2.22 Relation between approximate and exact schema theorems for
different representations and different forms of crossover (in
the absence of mutation). . . . . . . . . . . . . . . . . . . . 58
2.23 Examples for bloat. . . . . . . . . . . . . . . . . . . . . . . . 60

4.1 Flowchart of the embedding of offspring selection into a genetic algorithm. . . . . . . . . . . . . . . . . . . . . . . . . 71
4.2 Graphical representation of the gene pool available at a cer-
tain generation. Each bar represents a chromosome with its
alleles representing the assignment of the genes at the certain
loci. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74
4.3 The left part of the figure represents the gene pool at gener-
ation i and the right part indicates the possible size of gen-
eration i + 1 which must not go below a minimum size and
also not exceed an upper limit. These parameters have to be
defined by the user. . . . . . . . . . . . . . . . . . . . . . . . 74
4.4 Typical development of actual population size between the
two borders (lower and upper limit of population size) dis-
playing also the identical chromosomes that occur especially
in the last iterations. . . . . . . . . . . . . . . . . . . . . . . 76

5.1 Flowchart of the reunification of subpopulations of a SASEGASA
(light shaded subpopulations are still evolving, whereas dark
shaded ones have already converged prematurely). . . . . . 84
5.2 Quality progress of a typical run of the SASEGASA algo-
rithm. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
5.3 Selection pressure curves for a typical run of the SASEGASA
algorithm. . . . . . . . . . . . . . . . . . . . . . . . . . . . . 86
5.4 Flowchart showing the main steps of the SASEGASA. . . . 87

6.1 Similarity of solutions in the population of a standard GA
after 20 and 200 iterations, shown in the left and the right
charts, respectively. . . . . . . . . . . . . . . . . . . . . . . . 93


6.2 Histograms of the similarities of solutions in the population
of a standard GA after 20 and 200 iterations, shown in the
left and the right charts, respectively. . . . . . . . . . . . . . 94
6.3 Average similarities of solutions in the population of a stan-
dard GA over for the first 2,000 and 10,000 iterations, shown
in the upper and lower charts, respectively. . . . . . . . . . 95
6.4 Multi-population specific similarities of the solutions of a par-
allel GA’s populations after 5,000 generations. . . . . . . . . 96
6.5 Progress of the average multi-population specific similarity
values of a parallel GA’s solutions, shown for 10,000 genera-
tions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96

7.1 Quality progress for a standard GA with OX crossover for
mutation rates of 0%, 5%, and 10%. . . . . . . . . . . . . . 99
7.2 Quality progress for a standard GA with ERX crossover for
mutation rates of 0%, 5%, and 10%. . . . . . . . . . . . . . 101
7.3 Quality progress for a standard GA with MPX crossover for
mutation rates of 0%, 5%, and 10%. . . . . . . . . . . . . . 102
7.4 Distribution of the alleles of the global optimal solution over
the run of a standard GA using OX crossover and a mutation
rate of 5% (remaining parameters are set according to Table
7.1). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
7.5 Quality progress for a GA with offspring selection, OX, and
a mutation rate of 5%. . . . . . . . . . . . . . . . . . . . . . 105
7.6 Quality progress for a GA with offspring selection, MPX, and
a mutation rate of 5%. . . . . . . . . . . . . . . . . . . . . . 106
7.7 Quality progress for a GA with offspring selection, ERX, and
a mutation rate of 5%. . . . . . . . . . . . . . . . . . . . . . 107
7.8 Quality progress for a GA with offspring selection, ERX, and
no mutation. . . . . . . . . . . . . . . . . . . . . . . . . . . 108
7.9 Quality progress for a GA with offspring selection using a
combination of OX, ERX, and MPX, and a mutation rate of
5%. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
7.10 Success progress of the different crossover operators OX, ERX,
and MPX, and a mutation rate of 5%. The plotted graphs
represent the ratio of successfully produced children to the
population size over the generations. . . . . . . . . . . . . . 110
7.11 Distribution of the alleles of the global optimal solution over
the run of an offspring selection GA using ERX crossover
and a mutation rate of 5% (remaining parameters are set
according to Table 7.2). . . . . . . . . . . . . . . . . . . . . 111
7.12 Distribution of the alleles of the global optimal solution over
the run of an offspring selection GA using ERX crossover and
no mutation (remaining parameters are set according to Table
7.2). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 112


7.13 Quality progress for a relevant alleles preserving GA with OX
and a mutation rate of 5%. . . . . . . . . . . . . . . . . . . 114
7.14 Quality progress for a relevant alleles preserving GA with
MPX and a mutation rate of 5%. . . . . . . . . . . . . . . . 115
7.15 Quality progress for a relevant alleles preserving GA with
ERX and a mutation rate of 5%. . . . . . . . . . . . . . . . 115
7.16 Quality progress for a relevant alleles preserving GA using a
combination of OX, ERX, and MPX, and a mutation rate of
5%. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
7.17 Quality progress for a relevant alleles preserving GA using a
combination of OX, ERX, and MPX, and mutation switched
off. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 116
7.18 Distribution of the alleles of the global optimal solution over
the run of a relevant alleles preserving GA using a combi-
nation of OX, ERX, and MPX, and a mutation rate of 5%
(remaining parameters are set according to Table 7.3). . . . 118
7.19 Distribution of the alleles of the global optimal solution over
the run of a relevant alleles preserving GA using a combina-
tion of OX, ERX, and MPX without mutation (remaining are
set parameters according to Table 7.3). . . . . . . . . . . . . 119

8.1 Exemplary nearest neighbor solution for a 51-city TSP instance ([CE69]). . . . . . . . . . . . . . . . . . . . . . . . . . 126
8.2 Example of a 2-change for a TSP instance with 7 cities. . . 128
8.3 Example of a 3-change for a TSP instance with 11 cities. . . 129
8.4 Example for a partially matched crossover. . . . . . . . . . . 134
8.5 Example for an order crossover. . . . . . . . . . . . . . . . . 135
8.6 Example for a cyclic crossover. . . . . . . . . . . . . . . . . 136
8.7 Exemplary result of the sweep heuristic for a small CVRP. . 144
8.8 Exemplary sequence-based crossover. . . . . . . . . . . . . . 149
8.9 Exemplary route-based crossover. . . . . . . . . . . . . . . . 151
8.10 Exemplary relocate mutation. . . . . . . . . . . . . . . . . . 152
8.11 Exemplary exchange mutation. . . . . . . . . . . . . . . . . 152
8.12 Example for a 2-opt mutation for the VRP. . . . . . . . . . 153
8.13 Example for a 2-opt∗ mutation for the VRP. . . . . . . . . . 153
8.14 Example for an or-opt mutation for the VRP. . . . . . . . . 154

9.1 Data-based modeling example: Training data. . . . . . . . . 160


9.2 Data-based modeling example: Evaluation of an optimally fit
linear model. . . . . . . . . . . . . . . . . . . . . . . . . . . 161
9.3 Data-based modeling example: Evaluation of an optimally fit
cubic model. . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
9.4 Data-based modeling example: Evaluation of an optimally fit
polynomial model (n = 10). . . . . . . . . . . . . . . . . . . 162


9.5 Data-based modeling example: Evaluation of an optimally fit
polynomial model (n = 20). . . . . . . . . . . . . . . . . . . 163
9.6 Data-based modeling example: Evaluation of an optimally fit
linear model (evaluated on training and test data). . . . . . 163
9.7 Data-based modeling example: Evaluation of an optimally fit
cubic model (evaluated on training and test data). . . . . . 164
9.8 Data-based modeling example: Evaluation of an optimally fit
polynomial model (n = 10) (evaluated on training and test
data). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 165
9.9 Data-based modeling example: Summary of training and test
errors for varying numbers of parameters n. . . . . . . . . . 165
9.10 The basic steps of system identification. . . . . . . . . . . . 167
9.11 The basic steps of GP-based system identification. . . . . . 170
9.12 Structure tree representation of a formula. . . . . . . . . . . 179
9.13 Structure tree crossover and the functions basis. . . . . . . . 181
9.14 Simple examples for pruning in GP. . . . . . . . . . . . . . . 195
9.15 Simple formula structure and all included pairs of ancestors
and descendants (genetic information items). . . . . . . . . 202

10.1 Quality improvement using offspring selection and various
crossover operators. . . . . . . . . . . . . . . . . . . . . . . . 210
10.2 Degree of similarity/distance for all pairs of solutions in a
SGA’s population of 120 solution candidates after 10 genera-
tions. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
10.3 Genetic diversity in the population of a conventional GA over
time. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
10.4 Genetic diversity of the population of a GA with offspring
selection over time. . . . . . . . . . . . . . . . . . . . . . . . 219
10.5 Genetic diversity of the entire population over time for a
SASEGASA with 5 subpopulations. . . . . . . . . . . . . . . 220
10.6 Quality progress of a standard GA using roulette wheel se-
lection on the left and 3-tournament selection the right side,
applied to instances of the Taillard CVRP benchmark: tai75a
(top) and tai75b (bottom). . . . . . . . . . . . . . . . . . . . 223
10.7 Genetic diversity in the population of a GA with roulette
wheel selection (shown on the left side) and 3-tournament
selection (shown on the right side). . . . . . . . . . . . . . . 225
10.8 Box plots of the qualities produced by a GA with roulette
and 3-tournament selection, applied to the problem instances
tai75a (top) and tai75b (bottom). . . . . . . . . . . . . . . . 227
10.9 Quality progress of the offspring selection GA for the in-
stances (from top to bottom) tai75a and tai75b. The left col-
umn shows the progress with a population size of 200, while
in the right column the GA with offspring selection uses a
population size of 400. . . . . . . . . . . . . . . . . . . . . . 229


10.10 Influence of the crossover operators SBX and RBX on each


generation of an offspring selection algorithm. The lighter
line represents the RBX; the darker line represents the SBX. 230
10.11 Genetic diversity in the population of an GA with offspring
selection and a population size of 200 on the left and 400 on
the right for the problem instances tai75a and tai75b (from
top to bottom). . . . . . . . . . . . . . . . . . . . . . . . . . 231
10.12 Box plots of the offspring selection GA with a population size
of 200 and 400 for the instances tai75a and tai75b. . . . . . 233
10.13 Box plots of the GA with 3-tournament selection against the
offspring selection GA for the instances tai75a (shown in the
upper part) and tai75b (shown in the lower part). . . . . . . 233

11.1 Dynamic diesel engine test bench at the Institute for Design
and Control of Mechatronical Systems, JKU Linz. . . . . . . 238
11.2 Evaluation of the best model produced by GP for test strategy
(1). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
11.3 Evaluation of the best model produced by GP for test strategy
(2). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 242
11.4 Evaluation of models for particulate matter emissions of a
diesel engine (snapshot showing the evaluation of the model
on validation / test samples). . . . . . . . . . . . . . . . . . 244
11.5 Errors distribution of models for particulate matter emissions. 244
11.6 Cumulative errors of models for particulate matter emissions. 245
11.7 Target NOx values of NOx data set II, recorded over approximately
30 minutes at 20 Hz recording frequency yielding
∼36,000 samples. . . . . . . . . . . . . . . . . . . . . . . . . 247
11.8 Target HoribaNOx values of NOx data set III. . . . . . . . 248
11.9 Target HoribaNOx values of NOx data set III, samples 6000
– 7000. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 249
11.10 Two exemplary ROC curves and their area under the ROC
curve (AUC). . . . . . . . . . . . . . . . . . . . . . . . . . . 255
11.11 An exemplary graphical display of a multi-class ROC (MROC)
matrix. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 257
11.12 Classification example: Several samples with original class
values C1, C2, and C3 are shown; the class ranges result from
the estimated values for each class and are indicated as cr1,
cr2, and cr3. . . . . . . . . . . . . . . . . . . . . . . . . . . . 261
11.13 An exemplary hybrid structure tree of a combined formula
including arithmetic as well as logical functions. . . . . . . . 265
11.14 Graphical representation of the best result we obtained for
the Thyroid data set, CV-partition 9: Comparison of original
and estimated class values. . . . . . . . . . . . . . . . . . . . 272
11.15 ROC curves and their area under the curve (AUC) values for
classification models generated for Thyroid data, CV-set 9. 273


11.16 MROC charts and their maximum and average area under
the curve (AUC) values for classification models generated
for Thyroid data, CV-set 9. . . . . . . . . . . . . . . . . . . 274
11.17 Graphical representation of a classification model (formula),
produced for 10-fold cross validation partition 3 of the Thy-
roid data set. . . . . . . . . . . . . . . . . . . . . . . . . . . 275
11.18 pctotal values for an exemplary run of series I. . . . . . . . . 287
11.19 pctotal values for an exemplary run of series II. . . . . . . . 287
11.20 pctotal values for an exemplary run of series III. . . . . . . . 288
11.21 Selection pressure progress in two exemplary runs of test se-
ries III and V (extended GP with gender specific parent se-
lection and strict offspring selection). . . . . . . . . . . . . . 291
11.22 Distribution of similarity values in an exemplary run of NOx
test series A, generation 200. . . . . . . . . . . . . . . . . . 297
11.23 Distribution of similarity values in an exemplary run of NOx
test series A, generation 4000. . . . . . . . . . . . . . . . . . 298
11.24 Distribution of similarity values in an exemplary run of NOx
test series (D), generation 20. . . . . . . . . . . . . . . . . . 298
11.25 Distribution of similarity values in an exemplary run of NOx
test series (D), generation 95. . . . . . . . . . . . . . . . . . 299
11.26 Population diversity progress in exemplary Thyroid test runs
of series (A) and (D) (shown in the upper and lower graph,
respectively). . . . . . . . . . . . . . . . . . . . . . . . . . . 299
11.27 Exemplary multi-population diversity of a test run of Thyroid
series F at iteration 50, grayscale representation. . . . . . . 305
11.28 Code growth in GP without applying size limits or complexity
punishment strategies (left: standard GP, right: extended
GP). . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 310
11.29 Progress of formula complexity in one of the test runs of series
(1g), shown for the first ∼400 iterations. . . . . . . . . . . . 315
11.30 Progress of formula complexity in one of the test runs of series
(1h) (shown left) and one of series (2h) (shown right). . . . 316
11.31 Model with best fit on training data: Model structure and
full evaluation. . . . . . . . . . . . . . . . . . . . . . . . . . 318
11.32 Model with best fit on validation data: Model structure and
full evaluation. . . . . . . . . . . . . . . . . . . . . . . . . . 318
11.33 Errors distributions of best models: Charts I, II, and III show
the errors distributions of the model with best fit on training
data evaluated on training, validation, and test data, respec-
tively; charts IV, V, and VI show the errors distributions of
the model with best fit on validation data evaluated on train-
ing, validation, and test data, respectively. . . . . . . . . . . 319
11.34 A simple workbench in HeuristicLab 2.0. . . . . . . . . . . . 323



List of Algorithms

1.1 Basic workflow of a genetic algorithm. . . . . . . . . . . . . . 3


4.1 Definition of a genetic algorithm with offspring selection. . . . 72
9.1 Exhaustive pruning of a model m using the parameters h1, h2,
minimizeModel, cpmax, and detmax. . . . . . . . . . . . . . . 196
9.2 Evolution strategy inspired pruning of a model m using the
parameters λ, maxUnsuccRounds, h1, h2, minimizeModel,
cpmax, and detmax. . . . . . . . . . . . . . . . . . . . . . . . . 198
9.3 Calculation of the evaluation-based similarity of two models m1
and m2 with respect to data base data . . . . . . . . . . . . . 200
9.4 Calculation of the structural similarity of two models m1 and
m2 . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 205

Introduction

Essentially, this book is about algorithmic developments in the context of
genetic algorithms (GAs) and genetic programming (GP); we also describe
their applications to significant combinatorial optimization problems as well
as structure identification using HeuristicLab as a platform for algorithm de-
velopment. The main aim of the theoretical considerations is to obtain a
better understanding of the basic workflow of GAs and GP, in order to estab-
lish new bionic, problem independent theoretical concepts and to substantially
increase the achievable solution quality.

The book is structured into a theoretical and an empirical part. The aim of
the theoretical part is to describe the important and characteristic properties
of the basic genetic algorithm as well as the main characteristics of the algo-
rithmic extensions introduced here. The empirical part of the book elaborates
two case studies: On the one hand, the traveling salesman problem (TSP) and
the capacitated vehicle routing problem (CVRP) are used as representatives
for GAs applied to combinatorial optimization problems. On the other hand,
GP-based nonlinear structure identification applied to time series and clas-
sification problems is analyzed to highlight the properties of the algorithmic
measures in the field of genetic programming. The borderlines between theory
and practice become indistinct in some parts as it is also necessary to describe
theoretical properties on the basis of practical examples in the first part of the
book. For this purpose we go back to some small-dimensioned TSP instances
that are perfectly suited for theoretical GA considerations.

Research concerning the self-adaptive interplay between selection and the
applied solution manipulation operators (crossover and mutation) is the basis
for the algorithmic developments presented in this book. The ultimate goal in
this context is to avoid the disappearance of relevant building blocks and to
support the combination of those alleles from the gene pool that carry solution
properties of highly fit individuals. As we show in comparative test series, in
conventional GAs and GP this relevant genetic information is likely to get lost
quite early and can only be reintroduced into the population's gene pool by
mutation. This dependence on
mutation can be drastically reduced by new generic selection principles based
upon either self-adaptive selection pressure steering (offspring selection, OS)
or self-adaptive population size adjustment as proposed in the relevant alleles
preserving genetic algorithm (RAPGA). Both algorithmic extensions help to ensure
the survival of essential genetic information by supporting the survival of relevant
alleles rather than the survival of above-average chromosomes. This
is achieved by defining the survival probability of a new child chromosome
depending on the child’s fitness in comparison to the fitness values of its own
parents. With these measures it becomes possible to channel the relevant
alleles, which are initially scattered in the entire population, to single chro-
mosomes at the end of the genetic search process.
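To make this mechanism more tangible, the following minimal Python sketch illustrates one possible offspring selection rule: a newly created child is admitted to the next generation only if it outperforms its own parents. The concrete acceptance threshold (the better parent's fitness), the success_ratio parameter, and the operator signatures (select, crossover, mutate, fitness) are illustrative assumptions rather than the exact formulation developed later in the book; maximization of fitness is assumed.

    def offspring_selection_generation(population, fitness, select, crossover, mutate,
                                       success_ratio=1.0, max_effort=100000):
        # One generation with offspring selection (illustrative sketch):
        # children are produced until enough of them have outperformed
        # their own parents; the ratio of produced to accepted children
        # indicates how much selection pressure was needed.
        accepted, produced = [], 0
        target = int(success_ratio * len(population))
        while len(accepted) < target and produced < max_effort:
            parent1 = select(population, fitness)
            parent2 = select(population, fitness)
            child = mutate(crossover(parent1, parent2))
            produced += 1
            # survival criterion (assumed here): the child must be at least
            # as fit as the better of its two parents
            if fitness(child) >= max(fitness(parent1), fitness(parent2)):
                accepted.append(child)
        # fill any remaining slots, e.g., with selected parents (one possible policy)
        while len(accepted) < len(population):
            accepted.append(select(population, fitness))
        selection_pressure = produced / max(len(accepted), 1)
        return accepted, selection_pressure

In such a sketch the effort spent per generation grows as the population converges, which is precisely the kind of self-adaptive selection pressure signal that offspring selection exploits.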

The SASEGASA algorithm is a special coarse-grained parallel GA; the
acronym “SASEGASA” stands for Self-Adaptive Segregative Genetic
Algorithm including aspects of Simulated Annealing. SASEGASA combines
offspring selection with enhanced parallelization concepts in order to avoid
premature convergence, one of the major problems with GAs. As we will
show for the TSP, it becomes possible to scale the robustness and particularly
the achievable solution quality by the number of subpopulations.

Due to the high focus on sexual recombination, evolution strategies (ES)
are not considered explicitly in this book. Nevertheless, many of the theoret-
ical considerations are heavily inspired by evolution strategies, especially the
aspect of selection after reproduction and (self-)adaptive selection pressure
steering. Aside from other variants of evolutionary computation, further
inspiration is borrowed from fields such as population genetics. The
implementation of bionic ideas for algorithmic developments is quite prag-
matic and ignores debates on principles that are discussed in natural sciences.
Of course, we are aware that artificial evolution as performed in an evolutionary
algorithm operates at a high level of abstraction compared to its biological role
model.

The problem-oriented part of the book is dedicated to the application of
the algorithmic concepts described in this book to benchmark as well as real
world problems. Concretely, we examine the traveling salesman problem and
the capacitated vehicle routing problem (which is thematically related to the
TSP), but more in step with actual practice, as representatives of combina-
torial optimization problems.

Time series and classification analysis are used as application areas of data-
based structure identification with genetic programming working with for-
mula trees representing mathematical models. As a matter of principle, we
use standard problem representations and the appropriate problem-specific
genetic operators known from GA and GP theory for the experiments shown
in these chapters. The focus is set on the comparison of results achievable with
standard GA and GP implementations to the results achieved using the ex-
tended algorithmic concepts described in this book. These enhanced concepts
do not depend on a concrete problem representation and its operators; their
influences on population dynamics in GA and GP populations are analyzed,
too.


Additional material related to the research described in this book is provided
on the book’s homepage at http://gagp2009.heuristiclab.com. Among other
things, this website provides some of the software used as well as dynamic
presentations of representative test runs.



Chapter 1
Simulating Evolution: Basics about Genetic Algorithms

1.1 The Evolution of Evolutionary Computation


Work on what is nowadays called evolutionary computation started in the
sixties of the 20th century in the United States and Germany. There have
been two basic approaches in computer science that copy evolutionary mech-
anisms: evolution strategies (ES) and genetic algorithms (GA). Genetic al-
gorithms go back to Holland [Hol75], an American computer scientist and
psychologist who developed his theory not only with the aim of solving
optimization problems but also in order to study self-adaptiveness in biological
processes. Essentially, this is the reason why genetic algorithms are much closer
to the biological model than evolution strategies. The theoretical foundations
of evolution strategies were formed by Rechenberg and Schwefel (see for ex-
ample [Rec73] or [Sch94]), whose primary goal was optimization. Although
these two concepts have many aspects in common, they developed almost in-
dependently from each other in the USA (where GAs were developed) and
Germany (where research was done on ES).
Both approaches work with a population model whereby the genetic informa-
tion of each individual of a population is in general different. Among other
things this genotype includes a parameter vector which contains all necessary
information about the properties of a certain individual. Before the intrinsic
evolutionary process takes place, the population is initialized arbitrarily; evo-
lution, i.e., replacement of the old generation by a new generation, proceeds
until a certain termination criterion is fulfilled.
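The shared workflow just described can be summarized in a short skeleton; the following Python sketch is only an illustration, and the function names (init_population, evaluate, make_next_generation, terminated) are placeholders for problem-specific components, not code from this book.

    def evolutionary_algorithm(init_population, evaluate, make_next_generation, terminated):
        # Generic loop shared by evolution strategies and genetic algorithms:
        # an arbitrarily initialized population is evaluated and repeatedly
        # replaced by a new generation until a termination criterion is met.
        population = init_population()
        fitness_values = [evaluate(individual) for individual in population]
        generation = 0
        while not terminated(population, fitness_values, generation):
            # selection, recombination, and mutation happen inside this step
            population = make_next_generation(population, fitness_values)
            fitness_values = [evaluate(individual) for individual in population]
            generation += 1
        return population, fitness_values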
The major difference between evolution strategies and genetic algorithms
lies in the representation of the genotype and in the way the operators are used
(which are mutation, selection, and eventually recombination). In contrast
to GAs, where the main role of the mutation operator is simply to avoid
stagnation, mutation is the primary operator of evolution strategies.
Genetic programming (GP), an extension of the genetic algorithm, is a
domain-independent, biologically inspired method that is able to create com-
puter programs from a high-level problem statement. In fact, virtually all
problems in artificial intelligence, machine learning, adaptive systems, and
automated learning can be recast as a search for a computer program; genetic
programming provides a way to search for a computer program in the space
of computer programs (as formulated by Koza in [Koz92a]). Similar to GAs,
GP works by imitating aspects of natural evolution, but whereas GAs are
intended to find arrays of characters or numbers, the goal of a GP process is
to search for computer programs (or, for example, formulas) solving the opti-
mization problem at hand. As in every evolutionary process, new individuals
(in GP’s case, new programs) are created. They are tested, and the fitter ones
in the population succeed in creating children of their own whereas unfit ones
tend to disappear from the population.
In the following sections we give a detailed description of the basics of
genetic algorithms in Section 1.2, take a look at the corresponding biological
terminology in Section 1.3, and characterize the operators used in GAs in
Section 1.4. Then, in Section 1.5 we discuss problem representation issues,
and in Section 1.6 we summarize the schema theory, an essentially important
concept for understanding not only how, but also why GAs work. Parallel
GA concepts are given in Section 1.7, and finally we discuss the interplay of
genetic operators in Section 1.8.

1.2 The Basics of Genetic Algorithms


Concerning its internal functioning, a genetic algorithm is an iterative pro-
cedure which usually operates on a population of constant size and is basically
executed in the following way:
An initial population of individuals (also called “solution candidates” or
“chromosomes”) is generated randomly or heuristically. During each itera-
tion step, also called a “generation,” the individuals of the current population
are evaluated and assigned a certain fitness value. In order to form a new pop-
ulation, individuals are first selected (usually with a probability proportional
to their relative fitness values), and then produce offspring candidates which
in turn form the next generation of parents. This ensures that the expected
number of times an individual is chosen is approximately proportional to its
relative performance in the population. For producing new solution candi-
dates genetic algorithms use two operators, namely crossover and mutation:

• Crossover is the primary genetic operator: It takes two individuals,
called parents, and produces one or two new individuals, called offspring,
by combining parts of the parents. In its simplest form, the operator
works by swapping (exchanging) substrings before and after a randomly
selected crossover point.

• The second genetic operator, mutation, is essentially an arbitrary modification
which helps to prevent premature convergence by randomly sampling new
points in the search space. In the case of bit strings,
mutation is applied by simply flipping bits randomly in a string with a
certain probability called mutation rate.
Genetic algorithms are stochastic iterative algorithms, which cannot guar-
antee convergence; termination is hereby commonly triggered by reaching a
maximum number of generations or by finding an acceptable solution or more
sophisticated termination criteria indicating premature convergence. We will
discuss this issue in further detail within Chapter 3.
The so-called standard genetic algorithm (SGA), which represents the basis
of almost all variants of genetic algorithms, is given in Algorithm 1.1 (which
is formulated as in [Tom95], for example).

Algorithm 1.1 Basic workflow of a genetic algorithm.


Produce an initial population of individuals
Evaluate the fitness of all individuals
while termination condition not met do
Select fitter individuals for reproduction and produce new individuals
(crossover and mutation)
Evaluate fitness of new individuals
Generate a new population by inserting some new “good” individuals and
by erasing some old “bad” individuals
end while
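
To make this workflow more concrete, the following minimal Python sketch implements
the loop of Algorithm 1.1 for bit-string chromosomes. It is an illustration only:
the OneMax fitness function, the population size, and the crossover and mutation
rates are assumptions chosen for this example rather than recommendations of this book.

import random

def one_max(bits):
    # illustrative fitness function: number of 1-bits in the string
    return sum(bits)

def simple_ga(n_bits=30, pop_size=50, generations=100,
              p_crossover=0.7, p_mutation=0.01):
    # produce an initial population of random bit strings
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        fitness = [one_max(ind) for ind in pop]          # evaluate all individuals
        new_pop = []
        while len(new_pop) < pop_size:
            # fitness-proportional selection of two parents
            p1, p2 = random.choices(pop, weights=fitness, k=2)
            c1, c2 = p1[:], p2[:]
            if random.random() < p_crossover:             # single point crossover
                cut = random.randint(1, n_bits - 1)
                c1 = p1[:cut] + p2[cut:]
                c2 = p2[:cut] + p1[cut:]
            for child in (c1, c2):                        # bit flip mutation
                for i in range(n_bits):
                    if random.random() < p_mutation:
                        child[i] = 1 - child[i]
                new_pop.append(child)
        pop = new_pop[:pop_size]                          # generational replacement
    return max(pop, key=one_max)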

A special and quite restricted GA variant, that has represented the basis for
theoretical considerations for a long period of time, is given in Figure 1.1. This
chart sketches a GA with binary representation operating with generational
replacement, a population of constant size, and the following genetic opera-
tors: roulette wheel selection, single point crossover, and bit flip mutation.
This special type of genetic algorithms, which is the basis for theoretical GA
research such as the well known schema theorem and accordingly the building
block hypothesis, is also called the canonical genetic algorithm (CGA).

FIGURE 1.1: The canonical genetic algorithm with binary solution encoding.

1.3 Biological Terminology


The approximative way of solving optimization problems by genetic algo-
rithms holds a strong analogy to the basic principles of biological evolution.
The fundamentals of the natural evolution theory, as it is considered nowa-
days, mainly refer to the theories of Charles Darwin, which were published
in 1859 in his well-known work “The Origin of Species By Means of Natural
Selection or the Preservation of Favoured Races in the Struggle for Life” (re-
vised edition: [Dar98]). In this work Darwin states the following five major
ideas:

• Evolution, change in lineages, occurs and occurred over time.

• All creatures have common descent.

• Natural selection determines changes in nature.

• Gradual change, i.e., nature changes somehow successively.

• Speciation, i.e., Darwin claimed that the process of natural selection
results in populations diverging enough to become separate species.

Although some of Darwin’s proposals were not new, his ideas (particularly
those on common descent and natural selection) provided the first solid foun-
dation upon which evolutionary biology has been built.
At this point it may be useful to formally introduce some essential parts of
the biological terminology which are used in the context of genetic algorithms:

• All living organisms consist of cells containing the same set of one or
more chromosomes, i.e., strings of DNA. A gene can be understood
as an “encoder” of a characteristic, such as eye color. The different
possibilities for a characteristic (e.g., brown, green, blue, gray) are called
alleles. Each gene is located at a particular position (locus) on the
chromosome.

• Most organisms have multiple chromosomes in each cell. The sum of all
chromosomes, i.e., the complete collection of genetic material, is called
the genome of the organism and the term genotype refers to the partic-
ular set of genes contained in a genome. Therefore, if two individuals
have identical genomes, they are said to have the same genotype.

• Organisms whose chromosomes are arranged in pairs are called diploid,
whereas organisms with unpaired chromosomes are called haploid. In
nature, most sexually reproducing species are diploid. Humans for in-
stance have 23 pairs of chromosomes in each somatic cell in their body.
Recombination (crossover) occurs during sexual reproduction in the fol-
lowing way:

• For producing a new child, the genes of the parents are combined to
eventually form a new diploid set of chromosomes. Offspring are sub-
ject to mutation where elementary parts of the DNA (nucleotides) are
changed. The fitness of an organism (individual) is typically defined as
its probability to reproduce, or as a function of the number of offspring
the organism has produced.

For the sake of simplification, in genetic algorithms the term chromosome
refers to a solution candidate (in the first GAs encoded as a bit string). The genes are
either single bits or small blocks of neighboring bits that encode a particular
element of the solution. Even if an allele usually is either 0 or 1, for larger
alphabets more alleles are possible at each locus.
As a further simplification to the biological role model, crossover typically
operates by exchanging genetic material between two haploid parents whereas
mutation is implemented by simply flipping the bit at a randomly chosen locus.
Finally it is remarkable that most applications of genetic algorithms employ
haploid single-chromosome individuals, although it is the evolution of mankind
that has inspired the GA community the most. This is most probably due to
the easier and more effective representation and implementation of
single-chromosome individuals.

1.4 Genetic Operators


In the following, the main genetic operators, namely parent selection,
crossover, mutation, and replacement are to be described. The focus hereby
lies on a functional description of the principles rather than to give a complete
overview of operator concepts; for more details about genetic operators the
interested reader is referred to textbooks as for example [DLJD00].

1.4.1 Models for Parent Selection


In genetic algorithms a fitness function assigns a score to each individual in
a population; this fitness value indicates the quality of the solution represented
by the individual. The fitness function is often given as part of the problem
description or based on the objective function; developing an appropriate
fitness function may also involve the use of simulation, heuristic techniques, or
the knowledge of an expert. Evaluating the fitness function for each individual
should be relatively fast due to the number of times it will be invoked. If
the evaluation is likely to be slow, then concepts of parallel and distributed
computing, an approximate function evaluation technique, or a technique,
that only considers elements that have changed, may be employed.
Once a population has been generated and its fitness has been measured,
the set of solutions, that are selected to be “mated” in a given generation, is
produced. In the standard genetic algorithm (SGA) the probability, that a
chromosome of the current population is selected for reproduction, is propor-
tional to its fitness.
In fact, there are many ways of accomplishing this selection. These include:
• Proportional selection (roulette wheel selection):
The classical SGA utilizes this selection method which has been pro-
posed in the context of Holland’s schema theorem (which will be ex-
plained in detail in Section 1.6). Here the expected number of descen-
dants for an individual i is given as pi = fi / f̄ with f : S → R+ denoting
the fitness function and f̄ representing the average fitness of all indi-
viduals. Therefore, each individual of the population is represented by
a space proportional to its fitness. By repeatedly spinning the wheel,
individuals are chosen using random sampling with replacement. In or-
der to make proportional selection independent from the dimension of
the fitness values, so-called windowing techniques are usually employed.

Further variants of proportional selection aim to reduce the dominance
of a single or a group of highly fit individuals (“super individuals”) by
stochastic sampling techniques (as for example explained in [DLJD00]).

• Linear-rank selection:
In the context of linear-rank selection the individuals of the population
are ordered according to their fitness and copies are assigned in such a
way that the best individual receives a pre-determined multiple of the
number of copies the worst one receives [GB89]. On the one hand rank
selection implicitly reduces the dominating effects of “super individuals”
in populations (i.e., individuals that are assigned a significantly better
fitness value than all other individuals), but on the other hand it warps
the difference between close fitness values, thus increasing the selection
pressure in stagnant populations. Even if linear-rank selection has been
used with some success, it ignores the information about fitness differ-
ences of different individuals and violates the schema theorem.

• Tournament selection:
There are a number of variants on this theme. The most common one
is k-tournament selection where k individuals are selected from a pop-
ulation and the fittest individual of the k selected ones is considered
for reproduction. In this variant selection pressure can be scaled quite
easily by choosing an appropriate number for k.
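
As an illustration of the selection schemes described above, the following sketch
implements proportional (roulette wheel) selection and k-tournament selection. The
representation of the population as a list of individuals with a parallel list of
non-negative fitness values, as well as the default tournament size, are assumptions
made only for this example.

import random

def roulette_wheel_select(population, fitnesses):
    # each individual occupies a slice of the wheel proportional to its fitness;
    # sampling is done with replacement, as in the classical SGA
    # (assumes non-negative fitness values with a positive total)
    pick = random.uniform(0, sum(fitnesses))
    running = 0.0
    for individual, fit in zip(population, fitnesses):
        running += fit
        if running >= pick:
            return individual
    return population[-1]            # numerical safety net

def tournament_select(population, fitnesses, k=3):
    # draw k individuals at random and return the fittest of them;
    # a larger k corresponds to higher selection pressure
    contenders = random.sample(range(len(population)), k)
    best = max(contenders, key=lambda i: fitnesses[i])
    return population[best]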

1.4.2 Recombination (Crossover)


In its easiest formulation, which is suggested in the canonical GA for binary
encoding, crossover takes two individuals and cuts their chromosome strings
at some randomly chosen position. The produced substrings are then swapped
to produce two new full length chromosomes.
Conventional crossover techniques for binary representation include:

• Single point crossover:


A single random cut is made, producing two head sections and two
tail sections. The two tail sections are then swapped to produce two
new individuals (chromosomes); Figure 1.2 schematically sketches this
crossover method which is also called one point crossover.

FIGURE 1.2: Schematic display of a single point crossover.

• Multiple point crossover:


One natural extension of the single point crossover is the multiple point
crossover: In a n-point crossover there are n crossover points and sub-
strings are swapped between the n points. According to some re-
searchers, multiple-point crossover is more suitable to combine good fea-
tures present in strings because it samples uniformly along the full length
of a chromosome [Ree95]. At the same time, multiple-point crossover be-
comes more and more disruptive with an increasing number of crossover
points, i.e., the evolvement of longer building blocks becomes more and
more difficult. Decreasing the number of crossover points during the
run of the GA may be a good compromise.

• Uniform crossover:
Given two parents, each gene in the offspring is created by copying
the corresponding gene from one of the parents. The selection of the
corresponding parent is undertaken via a randomly generated crossover
mask: At each index, the offspring gene is taken from the first parent
if there is a 1 in the mask at this index, and otherwise (if there is a 0
in the mask at this index) the gene is taken from the second parent.
Due to this construction principle uniform crossover does not support
the evolvement of higher order building blocks.
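
The following sketch illustrates single point and uniform crossover for bit strings
represented as Python lists; the function names and the list-based chromosome layout
are assumptions of this example.

import random

def single_point_crossover(parent1, parent2):
    # cut both parents at one random position and swap the tail sections
    cut = random.randint(1, len(parent1) - 1)
    return parent1[:cut] + parent2[cut:], parent2[:cut] + parent1[cut:]

def uniform_crossover(parent1, parent2):
    # a random binary mask decides, position by position, from which parent
    # each offspring gene is copied
    mask = [random.randint(0, 1) for _ in parent1]
    child1 = [a if m == 1 else b for a, b, m in zip(parent1, parent2, mask)]
    child2 = [b if m == 1 else a for a, b, m in zip(parent1, parent2, mask)]
    return child1, child2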

The choice of an appropriate crossover operator depends very much on the
representation of the search space (see also Section 1.5). Sequencing problems
as routing problems for example often require operators different from the ones
described above as almost all generated children may be situated outside of
the space of valid solutions.
In higher order representations, a variety of real-number combination op-
erators can be employed, such as the average and geometric mean. Domain
knowledge can be used to design local improvement operators which some-
times allow more efficient exploration of the search space around good solu-
tions. For instance, knowledge could be used to determine the appropriate
locations for crossover points.
As the number of proposed problem-specific crossover-techniques has been
growing that much over the years, it would go beyond the scope of the present
book even to discuss the more important ones. For a good discussion of
crossover-related issues and further references the reader is referred to [Mic92]
and [DLJD00].


1.4.3 Mutation
Mutations allow undirected jumps to slightly different areas of the search
space. The basic mutation operator for binary coded problems is bitwise
mutation. Mutation occurs randomly and very rarely with a probability pm ;
typically, this mutation rate is less than ten percent. In some cases mutation
is interpreted as generating a new bit and in others it is interpreted as flipping
the bit.
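
A minimal sketch of bitwise (bit flip) mutation as just described could look as
follows; the default mutation rate is purely illustrative.

import random

def bit_flip_mutation(chromosome, p_m=0.01):
    # every bit is flipped independently with the (small) probability p_m
    return [1 - bit if random.random() < p_m else bit for bit in chromosome]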
In higher order alphabets, such as integer numbering formulations, muta-
tion takes the form of replacing an allele with a randomly chosen value in the
appropriate range with probability pm . However, for combinatorial optimiza-
tion problems, such mutation schemes can cause difficulties with chromosome
legality; for example, multiple copies of a given value can occur which might
be illegal for some problems (including routing). Alternatives suggested in
literature include pairwise swap and shift operations as for instance described
in [Car94].
In addition, adaptive mutation schemes similar to mutation in the context
of evolution strategies are worth mentioning. Adaptive mutation schemes
vary either the rate, or the form of mutation, or both during a GA run. For
instance, mutation is sometimes defined in such a way that the search space
is explored uniformly at first and more locally towards the end, in order to do
a kind of local improvement of candidate solutions [Mic92].

1.4.4 Replacement Schemes


After having generated a new generation of descendants (offspring) by
crossover and mutation, the question arises which of the new candidates should
become members of the next generation. In the context of evolution strategies
this fact determines the life span of the individuals and substantially influ-
ences the convergence behavior of the algorithm. A further strategy influenc-
ing replacement quite drastically is offspring selection which will be discussed
separately in Chapter 4. The following schemes are possible replacement
mechanisms for genetic algorithms:

• Generational Replacement:
The entire population is replaced by its descendants. Similar to the
(µ, λ) evolution strategy it might therefore happen that the fitness of
the best individual decreases at some stage of evolution. Additionally,
this strategy puts into perspective the dominance of a few individuals
which might help to avoid premature convergence [SHF94].

• Elitism:
The best individual (or the n best individuals, respectively) of the pre-
vious generation are retained for the next generation which theoretically
allows immortality similar to the (µ + λ) evolution strategy and might
be critical with respect to premature convergence. The special and commonly
applied strategy of just retaining one (the best) individual of the
last generation is also called the “golden cage model,” which is a special
case of n-elitism with n = 1. If mutation is applied to the elite in order
to prevent premature convergence, the replacement mechanism is called
“weak elitism.”

• Delete-n-last:
The n weakest individuals are replaced by n descendants. If n ≪ |POP|
we speak of a steady-state replacement scheme; for n = 1 the changes
between the old and the new generation are certainly very small and n =
|POP| gives the already introduced generational replacement strategy.

• Delete-n:
In contrast to the delete-n-last replacement strategy, here not the n
weakest but rather n arbitrarily chosen individuals of the old generation
are replaced, which on the one hand reduces the convergence speed of
the algorithm but on the other hand also helps to avoid premature
convergence (compare elitism versus weak elitism).

• Tournament Replacement:
Competitions are run between sets of individuals from the last and the
actual generation, with the winners becoming part of the new popula-
tion.

A detailed description of replacement schemes and their effects can be found
for example in [SHF94], [Mic92], [DLJD00], and [Mit96].
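
To illustrate two of the replacement schemes listed above, the following sketch
implements generational replacement with n-elitism as well as the delete-n-last
scheme. The data layout (a list of individuals plus a parallel list of fitness
values) and the function names are assumptions of this example.

def elitist_generational_replacement(old_pop, old_fits, offspring, off_fits, n_elite=1):
    # keep the n_elite best individuals of the old generation and fill the
    # remaining slots with the best offspring (n_elite = 1 is the "golden cage model")
    elite = sorted(zip(old_fits, old_pop), key=lambda t: t[0], reverse=True)[:n_elite]
    rest = sorted(zip(off_fits, offspring), key=lambda t: t[0], reverse=True)
    new = elite + rest[:len(old_pop) - n_elite]
    return [ind for _, ind in new]

def delete_n_last(old_pop, old_fits, offspring, n):
    # replace the n weakest individuals of the old generation by n descendants
    # (assumes that at least n offspring are available)
    ranked = sorted(zip(old_fits, old_pop), key=lambda t: t[0], reverse=True)
    survivors = [ind for _, ind in ranked[:len(old_pop) - n]]
    return survivors + offspring[:n]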

1.5 Problem Representation


As already stated before, the first genetic algorithm presented in literature
[Hol75] used binary vectors for the representation of solution candidates (chro-
mosomes). Consequently, the first solution manipulation operators (single
point crossover, bit mutation) have been developed for binary representation.
Furthermore, this very simple GA, also commonly known as the canonical
genetic algorithm (CGA), represents the basis for extensive theoretical in-
spections, resulting in the well known schema theorem and the building block
hypothesis ([Hol75], [Gol89]). This background theory will be examined sep-
arately in Section 1.6, as it defines the scope of almost any GA as it should
ideally be and distinguishes GAs from almost any other heuristic optimization
technique.
The unique selling point of GAs is to compile so-called building blocks,
i.e., somehow linked parts of the chromosome which become larger as the
algorithm proceeds, advantageously with respect to the given fitness function.


In other words, one could define the claim of a GA as to be an algorithm which
is able to assemble the basic modules of highly fit or even globally optimal
solutions (which the algorithm of course does not know about). These basic
modules are with some probability already available in the initial population,
but widespread over many individuals; the algorithm therefore has to compile
these modules in such a clever way that continuously growing sequences of
highly qualified alleles, the so-called building blocks, are formed.
Compared to heuristic optimization techniques based on neighborhood
search (as tabu search [Glo86] or simulated annealing [KGV83], for exam-
ple), the methodology of GAs to combine partial solutions (by crossover) is
potentially much more robust with respect to getting stuck in local but not
global optimal solutions; this tendency of neighborhood-based searches de-
notes a major drawback of these heuristics. Still, when applying GAs the
user has to draw much more attention on the problem representation in or-
der to help the algorithm to fulfill the claim stated above. In that sense the
problem representation must allow the solution manipulation operators, es-
pecially crossover, to combine alleles of different parent individuals. This is
because crossover is responsible for combining the properties of two solution
candidates which may be located in very different regions of the search space
so that valid new solution candidates are built. This is why the problem rep-
resentation has to be designed in a way that crossover operators are able to
build valid new children (solution candidates) with a genetic make up that
consists of the union set of its parent alleles.
Furthermore, as a tribute to the general functioning of GAs, the crossover
operators also have to support the potential development of higher-order
building blocks (longer allele sequences). Only if the genetic operators for
a certain problem representation show these necessary solution manipulator
properties, the corresponding GA can be expected to work as it should, i.e.,
in the sense of a generalized interpretation of the building block hypothesis.
Unfortunately, a lot of more or less established problem representations are
not able to fulfill these requirements, as they do not support the design of
potentially suited crossover operators. Some problem representations will be
considered exemplarily in the following attracting notice to their ability to
allow meaningful crossover procedures. Even if mutation, the second solution
manipulation concept of GAs, is also of essential importance, the design of
meaningful mutation operators is much less challenging as it is a lot easier
to fulfill the requirements of a suited mutation operator (which in fact is to
introduce a small amount of new genetic information).

1.5.1 Binary Representation


In the early years of GA research there was a strong focus on binary encod-
ing of solution candidates. To some extent, an outgrowth of these ambitions
is certainly the binary representation for the TSP. There have been different
ways to use binary representation for the TSP, the most straightforward
one being to encode each city as a string of log2 n bits and a solution candidate
as a string of n(log2 n) bits. Crossover is then simply performed by applying
single-point crossover as proposed by Holland [Hol75]. Further attempts us-
ing binary encoding have been proposed using binary matrix representation
([FM91], [HGL93]). In [HGL93], Homaifar and Guan for example defined a
matrix element in the i-th row and the j-th column to be 1 if and only if in the
tour city j is visited after city i; they also applied one- or two- point crossover
on the parent matrices, which for one-point crossover means that the child
tour is created by just taking the column vectors left of the crossover point
from one parent, and the column vectors right of the crossover point from the
other parent.

Obviously, these strategies lead to highly illegal tours which are then re-
paired by additional repair strategies [HGL93], which is exactly the point
where a GA can no longer act as it is supposed to. As the repair strate-
gies have to introduce a high amount of genetic information which is neither
from the one nor from the other parent, child solutions emerge whose genetic
make-up has only little in common with its own parents; this counteracts the
general functioning of GAs as given in a more general interpretation of the
schema theorem and the according building block hypothesis.

1.5.2 Adjacency Representation

Using the adjacency representation for the TSP (as described in [LKM+ 99],
e.g.), a city j is listed in position i if and only if the tour leads from city i to
city j. Based on the adjacency representation, the so-called alternating edges
crossover has been proposed for example which basically works as follows:
First it chooses an edge from one parent and continues with the position of
this edge in the other parent representing the next edge, etc. The partial
tour is built up by choosing edges from the two parents alternatingly. In case
this strategy would produce a cycle, the edge is not added, but instead the
operator randomly selects an edge from the edges which do not produce a
cycle and continues in the way described above.

Compared to the crossover operators based on binary encoding, this strategy
has the obvious advantage that a new child is built up from edges of its
own parents. However, also this strategy is not very well suited as a fur-
ther claim to crossover is not fulfilled at all: The alternating edges crossover
cannot inherit longer tour segments and therefore longer building blocks can-
not establish. As a further development to the alternating edges crossover,
the so-called sub-tour chunks crossover aims to put things right by not alter-
nating the edges but sub-tours of the two parental solutions. However, the
capabilities of this strategy are also rather limited.


1.5.3 Path Representation


The most natural representation of a TSP tour is given by the path repre-
sentation. Within this representation, the n cities of a tour are put in order
according to a list of length n, so that the order of cities to be visited is given
by the list entries with an imaginary edge from the last to the first list entry. A
lot of crossover and mutation operators have been developed based upon this
representation, and most of the nowadays used TSP solution methods using
GAs are realized using path representation. Despite obvious disadvantages
like the equivocality of this representation (the same tour can be described in
2n different ways for a symmetrical TSP and in n different ways for an asym-
metrical TSP) this representation has allowed the design of quite powerful
operators like the order crossover (OX) or the edge recombination crossover
(ERX) which are able to inherit parent sub-tours to child solutions with only
a rather small ratio of edges stemming from none of its own parents which is
essential for GAs. A detailed description of these operators is given in Chapter
8.
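
To give an impression of how operators working on the path representation can pass
parental sub-tours on to child solutions, the following sketch shows a simplified
variant of the order crossover (OX) mentioned above. It is not the exact formulation
used in Chapter 8; in particular, the way the remaining cities are filled in is an
assumption of this illustration.

import random

def order_crossover(parent1, parent2):
    # copy a randomly chosen sub-tour from parent1 and fill the remaining
    # positions with the missing cities in the order in which they occur in parent2
    n = len(parent1)
    i, j = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[i:j + 1] = parent1[i:j + 1]
    fill = [city for city in parent2 if city not in child]
    open_positions = [k for k in range(n) if child[k] is None]
    for pos, city in zip(open_positions, fill):
        child[pos] = city
    return child

# usage example with tours over six cities:
# order_crossover([0, 1, 2, 3, 4, 5], [5, 3, 1, 0, 4, 2])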

1.5.4 Other Representations for Combinatorial Optimization Problems
Combinatorial optimization problems that are more in step with actual
practice than the TSP require more complex problem representations, which
makes it even more difficult for the designer of genetic solution manipulation
operators to construct crossover operators that fulfill the essential require-
ments.
Challenging optimization tasks arise in the field of logistics and production
planning optimization where the capacitated vehicle routing problem with
(CVRPTW, [Tha95]) and without time windows (CVRP, [DR59]) as well as
the job shop scheduling problem (JSSP [Tai93]) denote abstracted standard
formulations which are used for the comparison of optimization techniques on
the basis of widely available standardized benchmark problems. Tabu search
[Glo86] and genetic algorithms are considered the most powerful optimiza-
tion heuristics for these rather practical combinatorial optimization problems
[BR03].
Cheng et al. as well as Yamada and Nakano give a comprehensive review
of problem representations and corresponding operators for applying Genetic
Algorithms to the JSSP in [CGT99] and [YN97], respectively.
For the CVRP, Bräysy and Gendreau give a detailed overview about the
application of local search algorithms in [BG05a] and about the application
of metaheuristics in [BG05b]; concrete problem representations and crossover
operators for GAs are outlined in [PB96] and [Pri04]. Furthermore, the appli-
cation of extended GA concepts to the CVRP will be covered in the practical
part of this book within Chapter 10.


1.5.5 Problem Representations for Real-Valued Encoding


When using real-valued encoding, a solution candidate is represented as
a real-valued vector in which the dimension of the chromosomes is constant
and equal to the dimension of the solution vectors. Crossover concepts are
distinguished into discrete and continuous recombination where the discrete
variants copy the exact allele values of the parent chromosomes to the child
chromosome whereas the continuous variants perform some kind of averaging.
Mutation operators for real-valued encoding either slightly modify all po-
sitions of the gene or introduce major changes to only some (often just one)
position. Often a mixture of different crossover and mutation techniques leads
to the best results for real-valued GAs. A comprehensive review of crossover
and mutation techniques including also more sophisticated techniques like
multi-parent recombination is given in [DLJD00].
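
A minimal sketch of discrete and intermediate (averaging) recombination as well as a
simple Gaussian mutation for real-valued vectors is given below; the operator names,
the fixed mutation strength, and the per-position mutation probability are
illustrative assumptions.

import random

def discrete_recombination(parent1, parent2):
    # copy each position unchanged from one randomly chosen parent
    return [random.choice(pair) for pair in zip(parent1, parent2)]

def intermediate_recombination(parent1, parent2):
    # continuous variant: average the parental values position by position
    return [(a + b) / 2.0 for a, b in zip(parent1, parent2)]

def gaussian_mutation(vector, sigma=0.1, p_m=0.1):
    # slightly modify some positions with normally distributed noise
    return [x + random.gauss(0.0, sigma) if random.random() < p_m else x
            for x in vector]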
Although real-valued encoding is a problem representation which is espe-
cially suited for evolution strategies or particle swarm optimization rather
than for GAs, a lot of operators have been established also for GAs which are
quite similar to modern implementations of ES that make use of recombina-
tion [Bey01]. Real-valued encoding for GAs distinguishes itself from typical
discrete representations for combinatorial optimization problems in that point
that the evolvement of longer and longer building block sequences in terms of
adjacent alleles is of minor or no importance. Nevertheless, GA-based tech-
niques like offspring selection have proven to be a very powerful optimization
technique also for this kind of problem representation especially in case of
highly multimodal fitness landscapes [AW05].

1.6 GA Theory: Schemata and Building Blocks


Researchers working in the field of GAs have put a lot of effort into the
analysis of the genetic operators (crossover, mutation, selection). In order to
achieve better analysis and understanding, Holland has introduced a construct
called schema [Hol75]:
Under the assumption of a canonical GA with binary string representation
of individuals, the symbol alphabet {0,1,#} is considered where {#} (don't
care) is a special wild card symbol that matches both 0 and 1.
A schema is a string with fixed and variable symbols. For example, the schema
[0#11#01] is a template that matches the following four strings: [0011001],
[0011101], [0111001], and [0111101]. The symbol # is never actually manip-
ulated by the genetic algorithm; it is just a notational device that makes it
easier to talk about families of strings.
Essentially, Holland’s idea was that every evaluated string actually gives
partial information about the fitness of the set of possible schemata of which
the string is a member. Holland analyzed the influence of selection, crossover,
and mutation on the expected number of schemata, when going from one
generation to the next. A detailed discussion of related analysis can be found
in [Gol89]; in the context of the present work we only outline the main results
and their significance.
Assuming fitness proportional replication, the number m of individuals of
the population belonging to a particular schema H at time t + 1 is related to
the same number at the time t as

m(H, t + 1) = m(H, t) · fH(t) / f̄(t)                                    (1.1)

where fH(t) is the average fitness value of the string representing schema H,
while f̄(t) is the average fitness value over all strings within the population.
Assuming that a particular schema remains above the average by a fixed
amount c·f̄(t) for a number t of generations, the solution of the equation given
above can be formulated as the following exponential growth equation:

m(H, t) = m(H, 0)(1 + c)t (1.2)


where m(H, 0) stands for the number of schemata H in the population at
time 0, c denotes a positive integer constant, and t ≥ 0.
The importance of this result is the exponentially increasing number of
trials to above average schemata.
The effect of crossover which breaks strings apart (at least in the case of
canonical genetic algorithms) is that it reduces the exponential increase by
a quantity that is proportional to the crossover rate pc and depends on the
defining length δ of a schema on the string of length l:

                         pc · δ(H) / (l − 1)                             (1.3)
The defining length δ of a schema is the distance between the first and
the last fixed string position. For example, for the schema [###0#0101]
δ = 9 − 4 = 5. Obviously, short defining length schemata are less likely to
be disrupted by a single point crossover operator. The main result is that
above average schemata with short defining lengths will still be sampled at an
exponential increasing rate. These schemata with above average fitness and
short defining length are the so-called building blocks and play an important
role in the theory of genetic algorithms.
The effects of mutation are described in a rather straightforward way: If
the bit mutation probability is pm, then the probability of survival of a single
bit is 1 − pm; since single bit mutations are independent, the total survival
probability is therefore (1 − pm)^l with l denoting the string length. But in the
context of schemata only the fixed, i.e., non-wildcard, positions matter. This
number is called the order o(H) of schema H and equals l minus the number
of “don't care” symbols. Then the probability of surviving a mutation for a
certain schema H is (1 − pm)^o(H), which can be approximated by 1 − o(H)·pm
for pm ≪ 1.
Summarizing the described effects of mutation, crossover, and reproduction,
we end up with Holland’s well known schema theorem [Hol75]:

m(H, t + 1) ≥ m(H, t) · (fH(t) / f̄(t)) · [1 − pc · δ(H)/(l − 1) − o(H) · pm]        (1.4)
The result essentially says that the number of short schemata with low order
and above average quality grows exponentially in subsequent generations of a
genetic algorithm.
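
For illustration, the following sketch computes the defining length δ(H), the order
o(H), and the lower bound (1.4) for a schema given as a string over the alphabet
{0, 1, #}. The function names are of course not part of the theory, and the sketch
assumes a string length l > 1 and at least one fixed position.

def defining_length(schema):
    # distance between the first and the last fixed (non-'#') position
    fixed = [i for i, symbol in enumerate(schema) if symbol != '#']
    return fixed[-1] - fixed[0]

def order(schema):
    # number of fixed positions of the schema
    return sum(1 for symbol in schema if symbol != '#')

def schema_theorem_bound(m_H, f_H, f_avg, schema, p_c, p_m):
    # expected number of instances of H in the next generation, eq. (1.4)
    l = len(schema)
    disruption = p_c * defining_length(schema) / (l - 1) + order(schema) * p_m
    return m_H * (f_H / f_avg) * (1.0 - disruption)

# e.g., defining_length("###0#0101") == 5 and order("###0#0101") == 5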
Still, even if the schema theorem is a very important result in GA theory, it
is obtained under idealized conditions that do not hold for most practical GA
applications. Both the individual representation and the genetic operators are
often different from those used by Holland. The building block hypothesis has
been found reliable in many cases but it also depends on the representation and
on the genetic operators. Therefore, it is easy to find or to construct problems
for which it is not verified. These so-called deceptive problems are studied in
order to find out the inherent limitations of GAs, and which representations
and operators can make them more tractable. A more detailed description of
the underlying theory can for instance be found in [Raw91] or [Whi93].
The major drawback of the building block theory is given by the fact
that the underlying GA (binary encoding, proportional selection, single-point
crossover, strong mutation) is applicable only to very few problems as it re-
quires more sophisticated problem representations and corresponding oper-
ators to tackle challenging real-world problems. Therefore, a more general
theory is an intense topic in GA research since its beginning. Some theo-
retically interesting approaches like the forma theory of Radcliffe and Surry
[RS94], who consider a so-called forma as a more general schema for arbitrary
representations, state requirements to the operators, which cannot be fulfilled
for practical problems with their respective constraints.
By the end of the last millennium, Stephens and Waelbroeck ([SW97],
[SW99]) developed an exact GA schema theory. The main idea is to de-
scribe the total transmission probability α of a schema H so that α(H, t) is
the probability that at generation t the individuals of the GA’s population
will match H (for a GA working on fixed-length bit strings). Assuming a
crossover probability pxo, α(H, t) is calculated as¹:
α(H, t) = (1 − pxo) · p(H, t) + (pxo / (N − 1)) · Σ_{i=1}^{N−1} p(L(H, i), t) · p(R(H, i), t)        (1.5)

with L(H, i) and R(H, i) being the left and right parts of schema H, respec-
tively, and p(H, t) the probability of selecting an individual matching H to
become a parent. The “left” part of a schema H is thereby produced by replacing
all elements of H at the positions from the given index i to N with
“don’t care” symbols (with N being the length of the bit strings); the “right”
part of a schema H is produced by replacing all elements of H from position
1 to i with “don’t care.” The summation is over all positions from 1 to N − 1,
i.e., over all possible crossover points.
Stephens later generalized this GA schema theory to variable-length GAs; see
for example [SPWR02].

¹ We here give the slightly modified version as stated in [LP02]; it is equivalent
to the results in [SW97] and [SW99] assuming pm = 0.

Keeping in mind that the ultimate goal of any heuristic optimization tech-
nique is to approximately and efficiently solve highly complex real-world prob-
lems rather than stating a mathematically provable theory that holds only
under very restricted conditions, our intention for an extended building block
theory is a not so strict formulation that in return can be interpreted for ar-
bitrary GA applications. At the same time, the enhanced variants of genetic
algorithms and genetic programming proposed in this book aim to support the
algorithms in their intention to operate in the sense of an extended building
block interpretation discussed in the following chapters.

1.7 Parallel Genetic Algorithms

The basic idea behind many parallel and distributed programs is to divide
a task into partitions and solve them simultaneously using multiple proces-
sors. This divide-and-conquer approach can be used in different ways, and
leads to different methods to parallelize GAs where some of them change the
behavior of the GA whereas others do not. Some methods (as for instance
fine-grained parallel GAs) can exploit massively parallel computer architec-
tures, while others (coarse-grained parallel GAs, e.g.) are better qualified for
multi-computers with fewer and more powerful processing elements. Detailed
descriptions and classifications of distributed GAs are given in [CP01], [CP97]
or [AT99] and [Alb05]; the scalability of parallel GAs is discussed in [CPG99].
A further and newer variant of parallel GAs which is based on offspring selec-
tion (see Chapter 4) is the so-called SASEGASA algorithm which is discussed
in Chapter 5.

In a rough classification, parallel GA concepts established in GA textbooks
(as for example [DLJD00]) can be classified into global parallelization, coarse-
grained parallel GAs, and fine-grained parallel GAs, where the most popular
model for practical applications is the coarse-grained model, also very well
known as the island model.

FIGURE 1.3: Global parallelization concepts: A panmictic population structure
(shown in left picture) and the corresponding master–slave model (right
picture).

1.7.1 Global Parallelization


Similar to the sequential GA, in the context of global parallelization there is
only one single panmictic² population and selection considers all individuals,
i.e., every individual has a chance to mate with any other. The behavior
of the algorithm remains unchanged and the global GA has exactly the same
qualitative properties as a sequential GA. The most common operation that is
parallelized is the evaluation of the individuals as the calculation of the fitness
of an individual is independent from the rest of the population. Because of
this the only necessary communication during this phase is in the distribution
and collection of the workload.
One master node executes the GA (selection, crossover, and mutation), and
the evaluation of fitness is divided among several slave processors. Parts of
the population are assigned to each of the available processors, in that they
return the fitness values for the subset of individuals they have received. Due
to their centered and hierarchical communication order, global parallel GAs
are also known as single-population master–slave GAs.
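
A rough sketch of such a single-population master–slave evaluation, using a process
pool as a stand-in for the slave processors, is shown below; the placeholder fitness
function and the number of slave processes are assumptions of this example.

from multiprocessing import Pool

def fitness(individual):
    # placeholder for the (expensive) fitness evaluation
    return sum(individual)

def evaluate_population(population, n_slaves=4):
    # the master distributes the individuals to the slaves and collects the
    # fitness values; selection, crossover, and mutation remain sequential
    with Pool(processes=n_slaves) as pool:
        return pool.map(fitness, population)

if __name__ == "__main__":
    pop = [[1, 0, 1, 1], [0, 0, 1, 0], [1, 1, 1, 1]]
    print(evaluate_population(pop))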
Figure 1.3 shows the population structure of a master–slave parallel GA:
This panmictic GA has all its individuals (indicated by the black spots) in the
same population. The master stores the population, executes the GA opera-
tions, and distributes individuals to the slaves; the slaves compute the fitness
of the individuals. As a consequence, global parallelization can be efficient
only if the bottleneck in terms of runtime consumption is the evaluation of
the fitness function.
Globally parallel GAs are quite easy to implement, and they can be a quite
efficient method of parallelization if the evaluation requires considerable com-
putational effort compared to the effort required for the operations carried out
by the master node. However, they do not influence the qualitative properties
of the corresponding sequential GA.

² In general, a population is called panmictic when all individuals are possible
mating partners.

1.7.2 Coarse-Grained Parallel GAs


FIGURE 1.4: Population structure of a coarse-grained parallel GA.

In the case of a coarse-grained parallel GA, the population is divided into
multiple subpopulations (also called islands or demes) that evolve mostly
isolated from each other and only occasionally exchange individuals during
phases called migration. This process is controlled by several parameters
which will be explained later in Section 1.7.4. In contrast to the global paral-
lelization model, coarse-grained parallel GAs introduce fundamental changes
in the structure of the GA and have a different behavior than a sequential GA.
Coarse-grained parallel GAs are also known as distributed GAs because they
are usually implemented on computers with distributed memories. Litera-
ture also frequently uses the notation “island parallel GAs” because there is a
model in population genetics called the island model that considers relatively
isolated demes.
Figure 1.4 schematically shows the design of a coarse-grained parallel GA:
Each circle represents a simple GA, and there is (infrequent) communication
between the populations. The qualitative performance of a coarse-grained
parallel GA is influenced by the number and size of its demes and also by
the information exchange between them (migration). The main idea of this
type of parallel GAs is that relatively isolated demes will converge to differ-
ent regions of the solution-space, and that migration and recombination will
combine the relevant solution parts [SWM91]. However, at present there is
only one model in the theory of coarse-grained parallel GAs that considers
the concept of selection pressure for recombining the favorable attributes of
solutions evolved in the different demes, namely the SASEGASA algorithm
(which will be described later in Chapter 5). Coarse-grained parallel GAs
are the most frequently used parallel GA concept, as they are quite easy to
implement and are a natural extension to the general concept of sequential
GAs making use of commonly available cluster computing facilities.
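
The following skeleton sketches a coarse-grained (island) GA with a unidirectional
ring topology and synchronous migration of the best individuals. The helper
run_generation is only a placeholder for one generation of a concrete GA, the fitness
function is a dummy, and all parameter values are purely illustrative.

import random

def fitness(individual):
    # placeholder fitness function (assumption of this sketch)
    return sum(individual)

def run_generation(deme):
    # placeholder: selection, crossover, and mutation of a concrete GA go here
    return deme

def island_ga(n_demes=4, deme_size=20, n_bits=30, generations=100,
              migration_interval=10, migration_rate=0.1):
    demes = [[[random.randint(0, 1) for _ in range(n_bits)]
              for _ in range(deme_size)] for _ in range(n_demes)]
    n_migrants = max(1, int(migration_rate * deme_size))
    for gen in range(1, generations + 1):
        demes = [run_generation(d) for d in demes]        # demes evolve in isolation
        if gen % migration_interval == 0:                 # synchronous migration phase
            migrants = [sorted(d, key=fitness, reverse=True)[:n_migrants] for d in demes]
            for i in range(n_demes):
                target = demes[(i + 1) % n_demes]         # unidirectional ring topology
                target.sort(key=fitness)                  # worst individuals first
                target[:n_migrants] = [m[:] for m in migrants[i]]
    return demes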


1.7.3 Fine-Grained Parallel GAs

FIGURE 1.5: Population structure of a fine-grained parallel GA; the special
case of a cellular model is shown here.

Fine-grained models consider a large number of very small demes; Figure 1.5
sketches a fine-grained parallel GA. This class of parallel GAs has one spatially
distributed population; it is suited for massively parallel computers, but it
can also be implemented on other supercomputing architectures. A typical
example is the diffusion model [Müh89] which represents an intrinsic parallel
GA-model.

The basic idea behind this model is that the individuals are spread through-
out the global population like molecules in a diffusion process. Diffusion
models are also called cellular models. In the diffusion model a processor
is assigned to each individual and recombination is restricted to the local
neighborhood of each individual.

A recent research topic in the area of parallel evolutionary computation is
the combination of certain aspects of the different population models resulting
in so-called hybrid parallel GAs. Most of the hybrid parallel GAs are coarse-
grained at the upper level and fine-grained at the lower levels. Another way
to hybridize parallel GAs is to use coarse-grained GAs at the high as well as
at the low levels in order to force stronger mixing at the low levels using high
migration rates and a low migration rate at the high level [CP01]. Using this
strategy, computer cluster environments at different locations can collectively
work on a common problem with only little communication overhead (due to
the low migration rates at the high level).


1.7.4 Migration
Especially for coarse-grained parallel GAs the concept of migration is con-
sidered to be the main success criterion in terms of achievable solution quality.
The most important parameters for migration are:

• The communication topology which defines the interconnections between
the subpopulations (demes)

• The migration scheme which controls which individuals (best, random)
migrate from one deme to another and which individuals should be
replaced (worst, random, doubles)

• The migration rate which determines how many individuals migrate

• The migration interval or migration gap that determines the frequency
of migrations

The most essential question concerning migration is when and to which ex-
tent migration should take place. Much theoretical work considering this has
already been done; for a survey of these efforts see [CP97] or [Alb05]. It is
very usual for parallel GAs that migration occurs synchronously meaning that
it occurs at predetermined constant intervals. However, synchronous migra-
tion is known to be slow and inefficient in some cases [AT99]. Asynchronous
migration schemes perform communication between demes only after specific
events. The migration rate which determines how many individuals undergo
migration at every exchange can be expressed as a percentage of the popula-
tion size or as an absolute value. The majority of articles in this field suggest
migration rates between 5% and 20% of the population size. However, the
choice of this parameter is considered to be very problem dependent [AT99].
A recent overview of various migration techniques is given in [CP01].
Recent theory of self-adaptive selection pressure steering (see Chapters 4
and 5) plays a major role in defying the conventions of recent parallel GA-
theory. Within these models it becomes possible to detect local premature
convergence, i.e., premature convergence in a certain deme. Thus, local pre-
mature convergence can be detected independently in all demes, which should
give a high potential in terms of efficiency especially for parallel implementa-
tions. Furthermore, the fact that selection pressure is adjusted self-adaptively
with respect to the potential of genetic information stored in the certain demes
makes the concept of a parallel GA much more independent in terms of mi-
gration parameters (see [Aff05] and Chapter 5).


1.8 The Interplay of Genetic Operators

In order to allow an efficient performance of a genetic algorithm, a beneficial
interplay of exploration and exploitation should be possible. Critical factors
for this interplay are the genetic operators selection, crossover, and mutation.

The job of crossover is to advantageously combine alleles of selected (above
average) chromosomes which may stem from different regions of the search
space. Therefore, crossover is considered to rather support the aspect of
breadth search. Mutation slightly modifies certain chromosomes at times and
thus brings new alleles into the gene pool of a population in order to avoid
stagnation. As mutation modifies the genetic make-up of certain chromosomes
only slightly it is primarily considered as a depth search operator. However,
via mutation newly introduced genetic information does also heavily support
the aspect of breadth search if crossover is able to “transport” this new genetic
information to other chromosomes in other search space regions. As we will
show later in this book, this aspect of mutation is of prime importance for an
efficient functioning of a GA.

The aspect of migration in coarse-grained parallel GAs should also be mentioned
in our considerations about the interplay of operators. In this kind
of parallel GAs, migration functions somehow like a meta-model of mutation
introducing new genetic information into certain demes at the chromosome-
level whereas mutation introduces new genetic information at the allele level.
Concerning migration, a well-adjusted interplay between breadth and depth
search is aimed to function in the way that breadth search is supported in
the intra-migration phases by allowing the certain demes to drift to different
regions of the search space until a certain stage of stagnation is reached; the
demes have expanded over the search space. Then migration comes into play
by introducing new chromosomes stemming from other search space regions
in order to avoid stagnation in the certain demes; this then causes the demes
to contract again slightly which from a global point of view tends to support
the aspect of depth search in the migration phases. The reason for this is that
migration causes an increase of genetic diversity in the specific demes on the
one hand, but on the other hand it decreases the diversity over all islands.
This global loss of genetic diversity can be interpreted as an exploitation of
the search space.

This overall strategy is especially beneficial in the case of highly multimodal
search spaces, as is the case for complex combinatorial optimization prob-
lems.

© 2009 by Taylor & Francis Group, LLC


Another Random Scribd Document
with Unrelated Content
Chap. XII.
Of the advantage of adversity.

1. It is good for us to have sometimes troubles and adversities; for


they make a man enter into himself, that he may know that he is in
a state of banishment, and may not place his hopes in any thing of
this world.

It is good that we sometimes suffer contradictions, and that men


have an evil or imperfect opinion of us; even when we do and
intend well.

These things are often helps to humility, and defend us from vain
glory.

For then we better run to God our inward witness, when outwardly
we are despised by men, and little credit is given to us.

2. Therefore should a man establish himself in such a manner in


God, as to have no need of seeking many comforts from men.

When a man of good will is troubled or tempted, or afflicted with


evil thoughts; then he better understands what need he hath of
God, without whom he finds he can do no good.

Then also he laments; he sighs, and prays by reason of the


miseries which he suffers.

Then he is weary of living longer: and wishes death to come that


he may be dissolved and be with Christ.

Then also he well perceives that perfect security and full peace
cannot be found in this world.

Chap. XIII.
Of resisting temptation.
1. As long as we live in this world, we cannot be without tribulation
and temptation.

Hence it is written in Job: Man's life upon earth is a


temptation.

Therefore ought every one to be solicitous about his temptations,


and to watch in prayer; lest the devil, (who never sleeps, but goes
about seeking whom he may devour,) find room to deceive
him.

No man is so perfect and holy as not to have sometimes


temptations: and we cannot be wholly without them.

2. Temptations are often very profitable to a man, although they be


troublesome and grievous: for in them a man is humbled, purified,
and instructed.

All the saints have passed through many tribulations and


temptations, and have profited by them: and they who could not
support temptations, have become reprobates, and fell off.

There is not any order so holy, nor place so retired, where there
are not temptations and adversities.

3. A man is never entirely secure from temptations as long as he


lives: because we have within us the source of temptations, having
been born in concupiscence.

When one temptation or tribulation is over, another comes on: and


we shall have always something to suffer, because we have lost the
good of our original happiness.

Many seek to fly temptations, and fall more grievously into them.

By flight alone we cannot overcome: but by patience and true


humility we are made stronger than all our enemies.
4. He that only declines them outwardly, and does not pluck out
the root, will profit little; nay, temptations will sooner return to him,
and he will find himself in a worse condition.

By degrees, and by patience, with longanimity, thou shalt, by God's


grace, better overcome them, than by harshness and thine own
importunity.

In temptation, often take counsel, and deal not roughly with one
that is tempted: but comfort him, as thou wouldst wish to be done
to thyself.

5. Inconstancy of mind, and small confidence in God, is the


beginning of all temptations.

For as a ship without a rudder is tossed to and fro by the waves:


so the man who is remiss, and who quits his resolution, is many
ways tempted.

Fire tries iron, and temptation tries a just man.

We often know not what we can do: but temptation discovers what
we are.

6. However, we must be watchful, especially in the beginning of


temptation: because then the enemy is easier overcome, when he
is not suffered to come in at the door of the soul, but is kept out
and resisted at his first knock.

Whence a certain man said: Withstand the beginning, after-


remedies come too late.

For first a bare thought comes to the mind, then a strong


imagination; afterwards delight, and evil motion and consent.

And thus, by little and little, the wicked enemy gets full entrance,
when he is not resisted in the beginning.
And how much the longer a man is negligent in resisting: so much
the weaker does he daily become in himself, and the enemy
becomes stronger against him.

7. Some suffer great temptations in the beginning of their conversion, and some in the end.

And some there are who are much troubled in a manner all their
life time.

Some are but lightly tempted, according to the Wisdom and equity
of the ordinance of God, who weighs the state and merits of men,
and pre-ordains all for the salvation of his elect.

8. We must not therefore despair when we are tempted, but pray to God with so much the more fervour, that he may vouchsafe to help us in all tribulations: who, no doubt, according to the saying of St. Paul, will make such issue with the temptation, that we may be able to sustain it. 1 Corinthians x.

Let us therefore humble our souls, under the hand of God in all
temptations and tribulations: for the humble in spirit he will save
and exalt.

9. In temptations and tribulations, a man is proved what progress he has made: and in them there is greater merit, and his virtue appears more conspicuous.

Nor is it much if a man be devout and fervent when he feels no trouble: but if in the time of adversity he bears up with patience, there will be hope of a great advancement.

Some are preserved from great temptations, and are often overcome in daily little ones: that being humbled, they may never presume of themselves in great things, who are weak in such small occurrences.

Chap. XIV.
Of avoiding rash judgment.

1. Turn thy eyes back upon thyself, and see thou judge not the
doings of others.

In judging others a man labours in vain, often errs, and easily sins;
but in judging and looking into himself, he always labours with
fruit.

We frequently judge of a thing according as we have it at heart: for we easily lose true judgment through private affection.

If God were always the only object of our desire, we should not so
easily be disturbed at the resistance of our opinions.

2. But there is often something lies hid within, or occurs from without, which draws us along with it.

Many secretly seek themselves in what they do, and are not
sensible of it.

They seem also to continue in good peace, when things are done
according to their will and judgment: but if it fall out contrary to
their desires, they are soon moved and become sad.

Difference of thoughts and opinions is too frequently the source of dissensions amongst friends and neighbours, amongst religious and devout persons.

3. An old custom is with difficulty relinquished: and no man is led willingly farther than himself sees or likes.

If thou reliest more upon thine own reason or industry than upon
the virtue that subjects to Jesus Christ, thou wilt seldom and hardly
be an enlightened man: for God will have us perfectly subject to
himself, and to transcend all reason by inflamed love.
Chap. XV.
Of works done out of charity.

1. Evil ought not to be done, either for any thing in the world, or
for the love of any man: but for the profit of one that stands in
need, a good work is sometimes freely to be omitted, or rather to
be changed for a better.

For, by doing thus, a good work is not lost, but is changed into a
better.

Without charity, the outward work profiteth nothing: but whatever is done out of charity, be it ever so little and contemptible, all becomes fruitful.

For God regards more with how much affection and love a person
performs a work, than how much he does.

2. He does much who loves much.

He does much that does well what he does.

He does well who regards rather the common good than his own
will.

That seems often to be charity which is rather natural affection: because our own natural inclination, self-will, hope of retribution, desire of our own interest, will seldom be wanting.

3. He that has true and perfect charity, seeks himself in no one thing: but desires only the glory of God in all things.

He envies no man, because he loves no private joy; nor does he desire to rejoice in himself: but above all good things, he wishes to be made happy in God.

He attributes nothing of good in any man, but refers it totally to
God, from whom all things proceed as from their fountain, in the
enjoyment of whom all the saints repose as in their last end.

Ah! if a man had but one spark of perfect charity, he would doubtless perceive that all earthly things are full of vanity.

Chap. XVI.
Of bearing the defects of others.

1. What a man cannot amend in himself or others, he must bear with patience, till God ordains otherwise.

Think, that it perhaps is better so for thy trial and patience: without
which, our merits are little worth.

Thou must, nevertheless, under such impressions, earnestly pray that God may vouchsafe to help thee, and that thou mayest bear them well.

2. If any one being once or twice admonished, does not comply, contend not with him: but commit all to God, that his will may be done, and he may be honoured in all his servants, who knows how to convert evil into good.

Endeavour to be patient in supporting others' defects and infirmities of what kind so ever: because thou also hast many things which others must bear withal.

If thou canst not make thyself such a one as thou wouldst: how
canst thou expect to have another according to thy liking?

We would willingly have others perfect: and yet we mend not our own defects.
3. We would have others strictly corrected: but are not willing to be
corrected ourselves.

The large liberty of others displeases us: and yet we would not be
denied any thing we ask for.

We are willing that others should be bound up by laws: and we suffer not ourselves by any means to be restrained.

Thus it is evident how seldom we weigh our neighbour in the same balance with ourselves.

If all were perfect: what then should we have to suffer from others
for God's sake?

4. But now God has so disposed things, that we may learn to bear
one another's burdens: for there is no man without defect; no man
without his burden: no man sufficient for himself; no man wise
enough for himself: but we must support one another, comfort one
another, assist, instruct, and admonish one another.

But how great each one's virtue is, best appears by occasion of
adversity: for occasions do not make a man frail, but shew what he
is.

Chap. XVII.
Of a monastic life.

1. Thou must learn to renounce thy own will in many things, if thou
wilt keep peace and concord with others.

It is no small matter to live in a monastery, or in a congregation, and to converse therein without reproof, and to persevere faithful till death.

Blessed is he who has there lived well, and made a happy end.
If thou wilt stand as thou oughtest, and make a due progress, look
upon thyself as a banished man, and a stranger upon earth.

Thou must be content to be made a fool for Christ, if thou wilt lead
a religious life.

2. The habit and the tonsure contribute little; but a change of manners, and an entire mortification of the passions, make a true religious man.

He that seeks here any other thing than purely God and the
salvation of his soul, will find nothing but trouble and sorrow.

Neither can he long remain in peace, who does not strive to be the
least, and subject to all.

3. Thou camest hither to serve, not to govern: know that thou art
called to suffer and to labour, not to be idle and talkative.

Here then men are tried as gold in the furnace.

Here no man can stand, unless he be willing with all his heart to
humble himself for the love of God.

Chap. XVIII.
Of the example of the holy fathers.

1. Look upon the lively examples of the holy fathers, in whom true
perfection and religion was most shining, and thou wilt see how
little, and almost nothing, that is which we do.

Alas! what is our life if compared to theirs?

The saints and friends of Christ served the Lord in hunger and
thirst; in cold and nakedness; in labour and weariness; in watchings
and fastings; in prayers and holy meditations; in persecutions and
many reproaches.

2. Ah! how many and how grievous tribulations have the apostles,
martyrs, confessors, virgins, and all the rest, gone through, who
have been willing to follow Christ's footsteps: for they hated their
lives in this world, that they might possess them for eternity.

O! how strict and mortified a life did the holy fathers lead in the
desert! How long and grievous temptations did they endure! how
often were they molested by the enemy! What frequent and fervent
prayers did they offer to God! What rigorous abstinence did they go
through! What great zeal and fervour had they for their spiritual
progress! How strong a war did they wage for overcoming vice!
How pure and upright was their intention to God!

They laboured all the day, and in the nights, they gave themselves
to long prayers: though even whilst they were at work, they ceased
not from mental prayer.

3. They spent all their time profitably: every hour seemed short
which they spent with God: and through the great sweetness of
divine contemplation, they forgot even the necessity of their bodily
refreshment.

They renounced all riches, dignities, honours, friends, and kindred; they desired to have nothing of this world; they scarce allowed themselves the necessaries of life: the serving the body even in necessity, was irksome to them.

They were poor, therefore, as to earthly things: but very rich in grace and virtues.

Outwardly they wanted, but inwardly they were refreshed with divine graces and consolations.
4. They were strangers to the world: but near and familiar friends
to God.

They seemed to themselves as nothing, and were despised by this world: but in the eyes of God they were very valuable and beloved.

They stood in true humility, they lived in simple obedience, they walked in charity and patience: and therefore they daily advanced in spirit, and obtained great favour with God.

They were given as an example for all religious: and ought more to
excite us to make good progress, than the number of the lukewarm
to grow slack.

5. O! how great was the fervour of all religious in the beginning of their holy institution!

O! how great was their devotion in prayer! how great their zeal for
virtue!

How great discipline was in force amongst them! How great reverence and obedience in all, flourished under the rule of a superior!

The footsteps remaining still bear witness that they were truly
perfect and holy men: who waging war so stoutly, trod the world
under their feet.

Now he is thought great who is not a transgressor: and who can with patience endure what he hath undertaken.

6. Ah! the lukewarmness and negligence of our state, that we so quickly fall away from our former fervour, and are now even weary of living through sloth and tepidity!

Would to God that advancement in virtues were not wholly asleep in thee, who hast often seen many examples of the devout!

Chap. XIX.
Of the exercises of a good religious man.

1. The life of a good religious man ought to be eminent in all virtue: that he may be such interiorly, as he appears to men in his exterior.

And with good reason ought he to be much more in his interior, than he exteriorly appears; because he who beholds us is God, of whom we ought exceedingly to stand in awe, wherever we are, and like angels walk pure in his sight.

We ought every day to renew our resolution, and excite ourselves to fervour, as if it were the first day of our conversion, and to say:

Help me, O Lord God, in my good resolution, and in thy holy service, and give me grace now this day perfectly to begin; for what I have hitherto done, is nothing.

2. According as our resolution is, will the progress of our advancement be; and he had need of much diligence who would advance much.

Now if he that makes a strong resolution often fails: what will he do who seldom or but weakly resolves?

The falling off from our resolution happens divers ways: and a
small omission in our exercises seldom passeth without some loss.

The resolutions of the just depend on the grace of God, rather than
on their own wisdom: and in whom they always put their trust,
whatever they take in hand.

For man proposes, but God disposes: nor is the way of man in his
own hands.
3. If for piety's sake, or with a design to the profit of our brother,
we sometimes omit our accustomed exercises, it may afterwards be
easily recovered.

But if through a loathing of mind, or negligence, it be lightly let alone, it is no small fault, and will prove hurtful.

Let us endeavour what we can, we shall still be apt to fail in many things.

But yet we must always resolve on something certain, and in particular against those things which hinder us most.

We must examine and order well both our exterior and interior: because both conduce to our advancement.

4. If thou canst not continually recollect thyself, do it sometimes, and at least once a day, that is, at morning or evening.

In the morning resolve, in the evening examine thy performances: how thou hast behaved this day in word, work, or thought: because in these perhaps thou hast often offended God and thy neighbour.

Prepare thyself like a man to resist the wicked attacks of the devil;
bridle gluttony, and thou shalt the easier restrain all carnal
inclinations.

Be never altogether idle: but either reading, or writing, or praying, or meditating, or labouring in something that may be for the common good.

Yet in bodily exercises, a discretion is to be used: nor are they equally to be undertaken by all.

5. Those things which are not common are not to be done in public: for particular things are more safely done in private.

But take care thou be not slack in common exercises, and more
forward in things of thy own particular devotion: but having fully,
and faithfully performed what thou art bound to, and what is
enjoined thee, if thou hast any time remaining, give thyself to
thyself according as thy devotion shall incline thee.

All cannot have the self same exercise: but this is more proper for
one, and that for another.

Moreover, according to the diversity of times, divers exercises are more pleasing: for some relish better on festival days, others on common days.

We stand in need of one kind in time of temptation, and of another in time of peace and rest.

Some we willingly think on when we are sad, others when we are joyful in the Lord.

6. About the time of the principal festivals, we must renew our good exercises: and more fervently implore the prayers of the saints.

We ought to make our resolution from festival to festival: as if we were then to depart out of this world, and to come to the everlasting festival.

Therefore we ought carefully to prepare ourselves at times of devotion; and to converse more devoutly, and keep all observances more strictly, as being shortly to receive the reward of our labour from God.

7. And if it be deferred, let us believe that we are not well prepared, and that we are as yet unworthy of the great glory which shall be revealed in us at the appointed time: and let us endeavour to prepare ourselves better for our departure.

Blessed is that servant, says the evangelist St. Luke, whom
when his Lord shall come he shall find watching. Amen, I
say to you, he shall set him over all his possessions. Luke xii.

Chap. XX.
Of the love of solitude and silence.

1. Seek a proper time to retire into thyself, and often think of the
benefits of God.

Let curiosities alone.

Read such matters as may rather move thee to compunction, than give thee occupation.

If thou wilt withdraw thyself from superfluous talk and idle visits, as
also from giving ear to news and reports, thou wilt find time
sufficient and proper to employ thyself in good meditations.

The greatest saints avoided the company of men as much as they could, and chose to live to God in secret.

2. As often as I have been amongst men, said one, I have returned less a man: this we often experience when we talk long.

It is easier to be altogether silent, than not to exceed in words.

It is easier to keep retired at home, than to be able to be sufficiently upon one's guard abroad.

Whosoever, therefore, aims at arriving at internal and spiritual things, must, with Jesus, go aside from the crowd.

No man is secure in appearing abroad, but he who would willingly
lie hid at home.

No man securely speaks, but he who loves to hold his peace.

No man securely governs, but he who would willingly live in subjection.

No man securely commands, but he who has learned well to obey.

3. No man securely rejoiceth, unless he has within him the testimony of a good conscience.

Yet the security of the saints was always full of the fear of God.

Neither were they less careful or humble in themselves because they were shining with great virtues and grace.

But the security of the wicked arises from pride and presumption;
and will end in deceiving themselves.

Never promise thyself security in this life, though thou seemest to be a good religious man, or a devout hermit.

4. Oftentimes they that were better in the judgments of men, have been in greater danger by reason of their too great confidence.

So that it is better for many not to be altogether free from temptations, but to be often assaulted; that they may not be too secure: lest, perhaps, they be lifted up with pride, or take more liberty to go aside after exterior comforts.

O! how good a conscience would that man preserve, who would never seek after transitory joy, nor ever busy himself with the world.

O! how great peace and tranquillity would he possess, who would cut off all vain solicitude, and only think of the things of God and his salvation, and place his whole hope in God.

5. No man is worthy of heavenly comfort who has not diligently exercised himself in holy compunction.

If thou wouldst find compunction in thy heart, retire into thy chamber, and shut out the tumults of the world, as it is written: Have compunction in your chambers. Psalms iv.

Thou shalt find in thy cell what thou shalt often lose abroad.

Thy cell, if thou continue in it, grows sweet: but if thou keep not to
it, it becomes tedious and distasteful.

If in the beginning of thy conversion thou accustom thyself to remain in thy cell, and keep it well; it will be to thee afterwards a dear friend, and a most agreeable delight.

6. In silence and quiet the devout soul goes forward, and learns
the secrets of the scriptures.

There she finds floods of tears, with which she may wash and
cleanse herself every night: that she may become so much the
more familiar with her Maker, by how much the farther she lives
from all worldly tumult.

For God with his holy angels will draw nigh to him, who withdraws
himself from his acquaintance and friends.

It is better to lie hid, and take care of one's self, than neglecting
one's self to work even miracles.

It is commendable for a religious man, to go seldom abroad, to fly being seen, and not to desire to see men.

7. Why wilt thou see what thou must not have? The world
passeth and its concupiscences. 1 John ii.
The desires of sensuality draw thee abroad: but when the hour is
past, what dost thou bring home, but a weight upon thy
conscience, and a dissipation of heart.

A joyful going abroad often brings forth a sorrowful coming home, and a merry evening makes a sad morning.

So all carnal joy enters pleasantly; but in the end brings remorse
and death.

What canst thou see elsewhere which thou seest not here? Behold
the heaven and the earth, and all the elements; for of these are all
things made.

8. What canst thou see any where which can continue long under
the sun?

Thou thinkest perhaps to be satisfied, but thou canst not attain to it.

If thou couldst see any thing at once before thee, what would it be
but a vain sight?

Lift up thine eyes to God on high, and pray for thy sins and
negligences.

Leave vain things to vain people: but mind thou the things which
God has commanded thee.

Shut thy doors upon thee, and call to thee Jesus thy beloved.

Stay with him in thy cell, for thou shalt not find so great peace any
where else.

If thou hadst not gone abroad, and hearkened to rumours, thou hadst kept thyself better in good peace: but since thou art delighted sometimes to hear news, thou must from thence suffer a disturbance of heart.

Chap. XXI.
Of compunction of heart.

1. If thou wilt make any progress keep thyself in the fear of God,
and be not too free, but restrain all thy senses under discipline, and
give not thyself up to foolish mirth.

Give thyself to compunction of heart, and thou shalt find devotion.

Compunction opens the way to much good, which dissolution is wont quickly to lose.

It is wonderful that any man can heartily rejoice in this life, who
weighs and considers his banishment, and the many dangers of his
soul.

2. Through levity of heart, and the little thought we have of our defects, we feel not the sorrows of our soul: but often vainly laugh, when in all reason we ought to weep.

There is no true liberty, nor good joy, but in the fear of God with a
good conscience.

Happy is he who can cast away all impediments of distractions, and recollect himself to the union of holy communion.

Happy is he who separates himself from all that may burthen or defile his conscience.

Strive manfully: custom is overcome by custom.

If thou canst let men alone, they will let thee do what thou hast to
do.
3. Busy not thyself with other men's affairs, nor entangle thyself
with the causes of great ones.

Have always an eye upon thyself in the first place: and take special
care to admonish thyself preferably to all thy dearest friends.

If thou hast not the favour of men, be not grieved thereat: but let
thy concern be, that thou dost not carry thyself so well and so
circumspectly as it becomes a servant of God, and a devout
religious man to demean himself.

It is oftentimes more profitable and more secure for a man not to have many comforts in this life; especially according to the flesh.

Yet, that we have not divine comforts, or seldom experience them, is our own fault: because we do not seek compunction of heart, nor cast off altogether vain and outward satisfactions.

4. Acknowledge thyself unworthy of divine consolation, and rather worthy of much tribulation.

When a man has perfect compunction, then the whole world is to him burdensome and distasteful.

A good man always finds subject enough for mourning and weeping.

For whether he considers himself, or thinks of his neighbour, he knows that no man lives here without tribulations; and the more thoroughly he considers himself, the more he grieves.

The subjects for just grief and interior compunction are our vices
and sins, in which we lie entangled in such manner, as seldom to
be able to contemplate heavenly things.

5. If thou wouldst oftener think of thy death, than of a long life, no doubt but thou wouldst more fervently amend thyself.

And if thou didst seriously consider in thy heart the future
punishments of hell and purgatory, I believe thou wouldst willingly
endure labour and pain, and fear no kind of austerity.

But because these things reach not the heart, and we still love the
things which flatter us, therefore we remain cold and very sluggish.

6. It is oftentimes a want of spirit, which makes the wretched body so easily complain.

Pray therefore humbly to our Lord, that he may give thee the spirit
of compunction: and say with the prophet: Feed me, Lord, with
the food of tears, and give me drink of tears in measure.

Chap. XXII.
Of the consideration of the misery of man.

1. Thou art miserable wherever thou art, and which way soever
thou turnest thyself, unless thou turn thyself to God.

Why art thou troubled because things do not succeed with thee
according to thy will and desire?

Who is there that has all things according to his will?

Neither I, nor thou, nor any man upon earth.

There is no man in the world without some trouble or affliction, though he be a king or a pope.

Who is there that is most at ease? doubtless he who is willing to suffer something for God's sake.

2. Many unstable and weak men are apt to say: behold how well
such a one lives, how rich, how great, how mighty and powerful!
But attend to heavenly goods, and thou wilt see that all these
temporal things are nothing, but very uncertain, and rather
burdensome: because they are never possessed without care and
fear.

The happiness of a man consisteth not in having temporal things in abundance, but a moderate competency sufficeth.

It is truly a misery to live upon earth.

The more a man desireth to be spiritual, the more this present life
becomes distasteful to him: because he the better understands,
and more clearly sees the defects of human corruption.

For to eat, drink, watch, sleep, rest, labour, and to be subject to other necessities of nature, is truly a great misery and affliction to a devout man, who desires to be released, and free from all sin.

3. For the inward man is very much burdened with the necessities
of the body in this world.

And therefore the prophet devoutly prays to be freed from them, saying: From my necessities deliver me, O Lord. Psalms xxiv.

But wo to them that know not their own misery, and more wo to
them that love this miserable and corruptible life.

For some there are who love it to that degree, although they can
scarce get necessaries by labouring or begging, that if they could
live always here, they would not care at all for the kingdom of God.

4. O senseless people, and infidels in heart, who lie buried so deep in earthly things, as to relish nothing but the things of the flesh!

Miserable wretches! they will in the end find to their cost, how vile
a nothing that was which they so much loved.
But the saints of God, and all the devout friends of Christ, made no
account of what pleased the flesh, or flourished in this life; but
their whole hope and intentions aspired to eternal goods.

Their whole desire tended upwards to things everlasting and invisible; for fear lest the love of visible things should draw them down to things below.

Lose not, brother, thy confidence of going forward to spiritual things; there is yet time, the hour is not yet past.

5. Why wilt thou put off thy resolution from day to day? Arise, and
begin this very moment, and say: Now is the time for doing, and
now is the time to fight; now is the proper time to amend my life.

When thou art troubled and afflicted, then is the time to merit.

Thou must pass through fire and water, before thou comest to
refreshment.

Unless thou do violence to thyself, thou wilt not overcome vice.

As long as we carry about us this frail body, we cannot be without sin, nor live without uneasiness and sorrow.

We would fain be at rest from all misery: but because we have lost
innocence by sin, we have also lost true happiness.

We must therefore have patience, and wait for the mercy of God,
till iniquity pass away, and this mortality be swallowed up by
immortal life.

6. O! how great is human frailty, which is always prone to vice!

To-day thou confessest thy sins, and to-morrow thou again committest what thou hast confessed!

Now thou resolvest to take care, and an hour after thou dost as if
thou hadst never resolved.

We have reason therefore to humble ourselves, and never to think much of ourselves, since we are so frail and inconstant.

That may also quickly be lost through negligence, which with much
labour and time was hardly gotten by grace.

7. What will become of us yet in the end: who grow lukewarm so very soon?

Wo be to us if we are for giving ourselves to rest, as if we had already met with peace and security, when there does not appear any mark of true sanctity in our conversation.

It would be very needful that we should yet again, like good novices, be instructed in all good behaviour: if so, perhaps there would be hopes of some future amendment, and greater spiritual progress.

Chap. XXIII.
Of the thoughts of death.

1. Very quickly must thou be gone from hence: see then how
matters stand with thee: a man is here to-day, and to-morrow he is
vanished.

And when he is taken away from the sight, he is quickly also out of
mind.

O! the dulness and hardness of man's heart, which only thinks on what is present, and looks not forward to things to come!

Thou oughtest in every action and thought so to order thyself, as if thou wert immediately to die.

If thou hadst a good conscience, thou wouldst not much fear death.

It were better for thee to fly sin, than to be afraid of death.

If thou art not prepared to-day, how wilt thou be to-morrow?

To-morrow is an uncertain day; and how dost thou know that thou
shalt be alive to-morrow?

2. What benefit is it to live long, when we advance so little?

Ah! long life does not always make us better, but often adds to our
guilt!

Would to God we had behaved ourselves well in this world, even for one day!

Many count the years of their conversion; but oftentimes the fruit
of amendment is but small.

If it be frightful to die, perhaps it will be more dangerous to live longer.

Blessed is he that has always the hour of his death before his eyes,
and every day disposes himself to die.

If thou hast at any time seen a man die, think that thou must also
pass the same way.

3. In the morning, imagine thou shalt not live till night: and when
evening comes, presume not to promise thyself the next morning.

Be therefore always prepared, and live in such a manner, that death may never find thee unprovided.

Many die suddenly, and when they little think of it: For the Son of
Man will come at the hour when he is not looked for.
Matthew xxiv. When that last hour shall come, thou wilt begin to
have quite other thoughts of thy whole past life: and thou wilt be
exceedingly grieved that thou hast been so negligent and remiss.

4. How happy and prudent is he who strives to be such now in this life, as he desires to be found at his death.

For it will give a man a great confidence of dying happily, if he has a perfect contempt of the world, a fervent desire of advancing in virtue, a love for discipline, the spirit of penance, a ready obedience, self-denial, and patience in bearing all adversities for the love of Christ.

Thou mayest do many good things whilst thou art well: but when
thou art sick, I know not what thou wilt be able to do.

Few are improved by sickness; they also that travel much abroad
seldom become holy.

5. Trust not in thy friends and kinsfolks, nor put off the welfare of
thy soul to hereafter: for men will sooner forget thee than thou
imaginest.

It is better now to provide in time and send some good before thee, than to trust to others helping thee after thy death.

If thou art not now careful for thyself, who will be careful for thee
hereafter?

The present time is very precious: Now are the days of salvation: now is an acceptable time.

But it is greatly to be lamented, that thou dost not spend this time
more profitably: wherein thou mayest acquire a stock on which
thou mayest live for ever! The time will come, when thou wilt wish
for one day or hour to amend: and I know not whether thou wilt
obtain it.
6. O my dearly beloved, from how great a danger mayest thou
deliver thyself: from how great a fear mayest thou be freed, if thou
wilt but now be always fearful, and looking for death! Strive now so
to live, that in the hour of thy death thou mayest rather rejoice
than fear.

Learn now to die to the world, that then thou mayest begin to live
with Christ.

Learn now to despise all things, that then thou mayest freely go to
Christ.

Chastise thy body now by penance, that thou mayest then have an
assured confidence.

7. Ah! fool! why dost thou think to live long, when thou art not
sure of one day?

How many thinking to live long, have been deceived, and unexpectedly have been snatched away.

How often hast thou heard related, that such a one was slain by
the sword; another drowned; another falling from on high, broke
his neck: this man died at the table; that other came to his end
when he was at play.

Some have perished by fire; some by the sword; some by pestilence; and some by robbers.

Thus death is the end of all, and man's life passeth suddenly like a
shadow.

8. Who will remember thee when thou art dead; and who will pray
for thee?

Do now, beloved, do now all thou canst, because thou knowest not when thou shalt die: nor dost thou know what shall befal thee after death.

Whilst thou hast time, heap up to thyself riches that will never die;
think of nothing but thy salvation; care for nothing but the things
of God.

Make now to thyself friends, by honouring the saints of God, and imitating their actions; that when thou shalt fail in this life, they may receive thee into everlasting dwellings.

9. Keep thyself as a pilgrim, and a stranger upon earth, to whom the affairs of this world do not in the least belong.

Keep thy heart free, and raised upwards to God; because thou hast
not here a lasting city.

Send thither thy daily prayer, with sighs and tears; that after death
thy spirit may be worthy to pass happily to our Lord. Amen.

Chap. XXIV.
Of judgment and the punishment of sins.

1. In all things look to thy end, and how thou shalt be able to
stand before a severe Judge, to whom nothing is hidden: who
takes no bribes, nor receives excuses, but will judge that which is
just.

O most wretched and foolish sinner, what answer wilt thou make to
God, who knows all thy evils? thou who sometimes art afraid of the
looks of an angry man.

Why dost thou not provide for thy self against the day of judgment,
when no man can be excused or defended by another; but every
one shall have enough to do to answer for himself?
At present thy labour is profitable; thy tears are acceptable; thy
sighs will be heard, and thy sorrow is satisfactory, and may purge
away thy sins.

2. A patient man hath a great and wholesome purgatory, who receiving injuries is more concerned at another person's sin than his own wrong; who willingly prays for his adversaries, and from his heart forgives offences; who delays not to ask forgiveness of others; who is easier moved to compassion than to anger; who frequently useth violence to himself, and labours to bring the flesh wholly under subjection to the spirit.

It is better now to purge away our sins, and cut up our vices, than
to reserve them to be purged hereafter.

Truly, we deceive ourselves through the inordinate love we bear to our flesh.

3. What other things shall that fire feed on but thy sins?

The more thou sparest thyself now, and followest the flesh, the
more grievously shalt thou suffer hereafter, and the more fuel dost
thou lay up for that fire.

In what things a man has more sinned, in those shall he be more heavily punished.

There the slothful shall be pricked forward with burning goads, and
the glutton will be tormented with extreme hunger and thirst.

There the luxurious and the lovers of pleasure will be covered all
over with burning pitch and stinking brimstone, and the envious,
like mad dogs, will howl for grief.

4. There is no vice which will not have its proper torments.

There the proud will be filled with all confusion; and the covetous be straitened with most miserable want.

There one hour of suffering will be more sharp, than a hundred years here spent in the most rigid penance.

There is no rest, no comfort for the damned: but here there is sometimes intermission of labour, and we receive comfort from our friends.

Be careful at present, and sorrowful for thy sins: that in the day of
judgment thou mayest be secure with the blessed.

For then the just shall stand with great constancy against
those that afflicted and oppressed them. Wisdom v.

Then will he stand to judge: who now humbly submits himself to the judgment of men.

Then the poor and humble will have great confidence: and the
proud will fear on every side.

5. Then it will appear that he was wise in this world, who learned
for Christ's sake to be a fool, and despised.

Then all tribulation suffered with patience will be pleasing, and all
iniquity shall stop her mouth. Psalms cvi.

Then every devout person will rejoice, and the irreligious will be
sad.

Then the flesh that has been mortified shall triumph more than if it
had always been pampered in delights.

Then shall the mean habit shine, and fine clothing appear
contemptible.