Introduction to Algorithms for Data Mining and Machine Learning

Xin-She Yang
Middlesex University
School of Science and Technology
London, United Kingdom
Academic Press is an imprint of Elsevier
125 London Wall, London EC2Y 5AS, United Kingdom
525 B Street, Suite 1650, San Diego, CA 92101, United States
50 Hampshire Street, 5th Floor, Cambridge, MA 02139, United States
The Boulevard, Langford Lane, Kidlington, Oxford OX5 1GB, United Kingdom
Copyright © 2019 Elsevier Inc. All rights reserved.
No part of this publication may be reproduced or transmitted in any form or by any means, electronic or
mechanical, including photocopying, recording, or any information storage and retrieval system, without
permission in writing from the publisher. Details on how to seek permission, further information about the
Publisher’s permissions policies and our arrangements with organizations such as the Copyright Clearance Center
and the Copyright Licensing Agency, can be found at our website: www.elsevier.com/permissions.
This book and the individual contributions contained in it are protected under copyright by the Publisher (other
than as may be noted herein).
Notices
Knowledge and best practice in this field are constantly changing. As new research and experience broaden our
understanding, changes in research methods, professional practices, or medical treatment may become necessary.
Practitioners and researchers must always rely on their own experience and knowledge in evaluating and using
any information, methods, compounds, or experiments described herein. In using such information or methods
they should be mindful of their own safety and the safety of others, including parties for whom they have a
professional responsibility.
To the fullest extent of the law, neither the Publisher nor the authors, contributors, or editors, assume any liability
for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or
from any use or operation of any methods, products, instructions, or ideas contained in the material herein.

Library of Congress Cataloging-in-Publication Data


A catalog record for this book is available from the Library of Congress

British Library Cataloguing-in-Publication Data


A catalogue record for this book is available from the British Library

ISBN: 978-0-12-817216-2

For information on all Academic Press publications


visit our website at https://www.elsevier.com/books-and-journals

Publisher: Candice Janco


Acquisition Editor: J. Scott Bentley
Editorial Project Manager: Michael Lutz
Production Project Manager: Nilesh Kumar Shah
Designer: Miles Hitchen
Typeset by VTeX
Contents

About the author
Preface
Acknowledgments

1 Introduction to optimization
  1.1 Algorithms
    1.1.1 Essence of an algorithm
    1.1.2 Issues with algorithms
    1.1.3 Types of algorithms
  1.2 Optimization
    1.2.1 A simple example
    1.2.2 General formulation of optimization
    1.2.3 Feasible solution
    1.2.4 Optimality criteria
  1.3 Unconstrained optimization
    1.3.1 Univariate functions
    1.3.2 Multivariate functions
  1.4 Nonlinear constrained optimization
    1.4.1 Penalty method
    1.4.2 Lagrange multipliers
    1.4.3 Karush–Kuhn–Tucker conditions
  1.5 Notes on software

2 Mathematical foundations
  2.1 Convexity
    2.1.1 Linear and affine functions
    2.1.2 Convex functions
    2.1.3 Mathematical operations on convex functions
  2.2 Computational complexity
    2.2.1 Time and space complexity
    2.2.2 Complexity of algorithms
  2.3 Norms and regularization
    2.3.1 Norms
    2.3.2 Regularization
  2.4 Probability distributions
    2.4.1 Random variables
    2.4.2 Probability distributions
    2.4.3 Conditional probability and Bayesian rule
    2.4.4 Gaussian process
  2.5 Bayesian network and Markov models
  2.6 Monte Carlo sampling
    2.6.1 Markov chain Monte Carlo
    2.6.2 Metropolis–Hastings algorithm
    2.6.3 Gibbs sampler
  2.7 Entropy, cross entropy, and KL divergence
    2.7.1 Entropy and cross entropy
    2.7.2 KL divergence
  2.8 Fuzzy rules
  2.9 Data mining and machine learning
    2.9.1 Data mining
    2.9.2 Machine learning
  2.10 Notes on software

3 Optimization algorithms
  3.1 Gradient-based methods
    3.1.1 Newton's method
    3.1.2 Newton's method for multivariate functions
    3.1.3 Line search
  3.2 Variants of gradient-based methods
    3.2.1 Stochastic gradient descent
    3.2.2 Subgradient method
    3.2.3 Conjugate gradient method
  3.3 Optimizers in deep learning
  3.4 Gradient-free methods
  3.5 Evolutionary algorithms and swarm intelligence
    3.5.1 Genetic algorithm
    3.5.2 Differential evolution
    3.5.3 Particle swarm optimization
    3.5.4 Bat algorithm
    3.5.5 Firefly algorithm
    3.5.6 Cuckoo search
    3.5.7 Flower pollination algorithm
  3.6 Notes on software

4 Data fitting and regression
  4.1 Sample mean and variance
  4.2 Regression analysis
    4.2.1 Maximum likelihood
    4.2.2 Linear regression
    4.2.3 Linearization
    4.2.4 Generalized linear regression
    4.2.5 Goodness of fit
  4.3 Nonlinear least squares
    4.3.1 Gauss–Newton algorithm
    4.3.2 Levenberg–Marquardt algorithm
    4.3.3 Weighted least squares
  4.4 Overfitting and information criteria
  4.5 Regularization and Lasso method
  4.6 Notes on software

5 Logistic regression, PCA, LDA, and ICA
  5.1 Logistic regression
  5.2 Softmax regression
  5.3 Principal component analysis
  5.4 Linear discriminant analysis
  5.5 Singular value decomposition
  5.6 Independent component analysis
  5.7 Notes on software

6 Data mining techniques
  6.1 Introduction
    6.1.1 Types of data
    6.1.2 Distance metric
  6.2 Hierarchical clustering
  6.3 k-Nearest-neighbor algorithm
  6.4 k-Means algorithm
  6.5 Decision trees and random forests
    6.5.1 Decision tree algorithm
    6.5.2 ID3 algorithm and C4.5 classifier
    6.5.3 Random forest
  6.6 Bayesian classifiers
    6.6.1 Naive Bayesian classifier
    6.6.2 Bayesian networks
  6.7 Data mining for big data
    6.7.1 Characteristics of big data
    6.7.2 Statistical nature of big data
    6.7.3 Mining big data
  6.8 Notes on software

7 Support vector machine and regression
  7.1 Statistical learning theory
  7.2 Linear support vector machine
  7.3 Kernel functions and nonlinear SVM
  7.4 Support vector regression
  7.5 Notes on software

8 Neural networks and deep learning
  8.1 Learning
  8.2 Artificial neural networks
    8.2.1 Neuron models
    8.2.2 Activation models
    8.2.3 Artificial neural networks
  8.3 Back propagation algorithm
  8.4 Loss functions in ANN
  8.5 Optimizers and choice of optimizers
  8.6 Network architecture
  8.7 Deep learning
    8.7.1 Convolutional neural networks
    8.7.2 Restricted Boltzmann machine
    8.7.3 Deep neural nets
    8.7.4 Trends in deep learning
  8.8 Tuning of hyperparameters
  8.9 Notes on software

Bibliography
Index
About the author

Xin-She Yang obtained his PhD in Applied Mathematics from the University of Oxford. He then worked at Cambridge University and the National Physical Laboratory (UK) as a Senior Research Scientist. He is now a Reader at Middlesex University London and an elected Bye-Fellow at Cambridge University.
He is also the IEEE Computer Intelligence Society (CIS) Chair for the Task Force on Business Intelligence and Knowledge Management, Director of the International Consortium for Optimization and Modelling in Science and Industry (iCOMSI), and an Editor of Springer's book series Springer Tracts in Nature-Inspired Computing (STNIC).
With more than 20 years of research and teaching experience, he has authored 10 books and edited more than 15 books. He has published more than 200 research papers in international peer-reviewed journals and conference proceedings, with more than 36,800 citations. He has been on the prestigious lists of Clarivate Analytics and Web of Science highly cited researchers in 2016, 2017, and 2018. He serves on the editorial boards of many international journals, including International Journal of Bio-Inspired Computation, Elsevier's Journal of Computational Science (JoCS), International Journal of Parallel, Emergent and Distributed Systems, and International Journal of Computer Mathematics. He is also the Editor-in-Chief of the International Journal of Mathematical Modelling and Numerical Optimisation.
Preface

Both data mining and machine learning are becoming popular subjects for university courses and industrial applications. This popularity is partly driven by the Internet and social media because they generate a huge amount of data every day, and the understanding of such big data requires sophisticated data mining techniques. In addition, many applications such as facial recognition and robotics have extensively used machine learning algorithms, leading to the increasing popularity of artificial intelligence. From a more general perspective, both data mining and machine learning are closely related to optimization. After all, in many applications we have to minimize costs, errors, energy consumption, and environmental impact and to maximize sustainability, productivity, and efficiency. Many problems in data mining and machine learning are usually formulated as optimization problems so that they can be solved by optimization algorithms. Therefore, optimization techniques are closely related to many techniques in data mining and machine learning.
Courses on data mining, machine learning, and optimization are often compulsory for students studying computer science, management science, engineering design, operations research, data science, finance, and economics. All students have to develop a certain level of data modeling skills so that they can process and interpret data for classification, clustering, curve fitting, and prediction. They should also be familiar with machine learning techniques that are closely related to data mining so as to carry out problem solving in many real-world applications. This book provides an introduction to all the major topics for such courses, covering the essential ideas of all key algorithms and techniques for data mining, machine learning, and optimization.
Though there are over a dozen good books on such topics, most of them are either too specialized for a specific readership or too lengthy (often over 500 pages). This book fills the gap with a compact and concise approach, focusing on the key concepts, algorithms, and techniques at an introductory level. The main approach of this book is informal, theorem-free, and practical. Using an informal approach, all fundamental topics required for data mining and machine learning are covered, and readers can gain basic knowledge of all important algorithms with a focus on their key ideas, without worrying about tedious, rigorous mathematical proofs. In addition, the practical approach provides about 30 worked examples so that readers can see how each step of the algorithms and techniques works. Thus, readers can build their understanding and confidence gradually, in a step-by-step manner. Furthermore, with the minimal requirements of basic high-school mathematics and some basic calculus, such an informal and practical style also enables readers to learn the contents by self-study and at their own pace.
This book is suitable for undergraduates and graduates to rapidly develop all the
fundamental knowledge of data mining, machine learning, and optimization. It can
also be used by students and researchers as a reference to review and refresh their
knowledge in data mining, machine learning, optimization, computer science, and data
science.

Xin-She Yang
January 2019 in London
Acknowledgments

I would like to thank all my students and colleagues who have given valuable feedback
and comments on some of the contents and examples of this book. I also would like to
thank my editors, J. Scott Bentley and Michael Lutz, and the staff at Elsevier for their
professionalism. Last but not least, I thank my family for all the help and support.

Xin-She Yang
January 2019
1 Introduction to optimization

Contents
1.1 Algorithms
  1.1.1 Essence of an algorithm
  1.1.2 Issues with algorithms
  1.1.3 Types of algorithms
1.2 Optimization
  1.2.1 A simple example
  1.2.2 General formulation of optimization
  1.2.3 Feasible solution
  1.2.4 Optimality criteria
1.3 Unconstrained optimization
  1.3.1 Univariate functions
  1.3.2 Multivariate functions
1.4 Nonlinear constrained optimization
  1.4.1 Penalty method
  1.4.2 Lagrange multipliers
  1.4.3 Karush–Kuhn–Tucker conditions
1.5 Notes on software

This book introduces the fundamental concepts and algorithms related to optimization, data mining, and machine learning. The main requirement is some understanding of high-school mathematics and basic calculus; however, we will review and introduce some of the mathematical foundations in the first two chapters.

1.1 Algorithms
An algorithm is an iterative, step-by-step procedure for computation. The detailed procedure can be a simple description, an equation, or a series of descriptions in combination with equations. Finding the roots of a polynomial, checking whether a natural number is prime, and generating random numbers are all tasks that can be carried out by algorithms.

1.1.1 Essence of an algorithm


In essence, an algorithm can be written as an iterative equation or a set of iterative equations. For example, to find the square root of a > 0, we can use the iterative equation

x_{k+1} = \frac{1}{2}\Big(x_k + \frac{a}{x_k}\Big),   (1.1)

where k is the iteration counter (k = 0, 1, 2, ...), starting from an initial guess such as x_0 = 1.

Example 1
As an example, if x_0 = 1 and a = 4, then we have

x_1 = \frac{1}{2}\Big(1 + \frac{4}{1}\Big) = 2.5.   (1.2)

Similarly, we have

x_2 = \frac{1}{2}\Big(2.5 + \frac{4}{2.5}\Big) = 2.05,  x_3 = \frac{1}{2}\Big(2.05 + \frac{4}{2.05}\Big) \approx 2.0006,   (1.3)

x_4 \approx 2.00000009,   (1.4)

which is very close to the true value \sqrt{4} = 2. The accuracy of this iterative formula or algorithm is high: it reaches about seven decimal places of accuracy after only four iterations.
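
The iteration in Eq. (1.1) is easy to implement. Below is a minimal Python sketch (our own illustration, not code from the book; the function and variable names are ours) that reproduces the numbers in Example 1:

```python
def sqrt_iter(a, x0=1.0, n_iter=4):
    """Approximate sqrt(a) via Eq. (1.1): x_{k+1} = (x_k + a/x_k) / 2."""
    x = x0
    for k in range(1, n_iter + 1):
        x = 0.5 * (x + a / x)
        print(f"x_{k} = {x:.8f}")
    return x

sqrt_iter(4.0)  # x_1 = 2.5, x_2 = 2.05, x_3 = 2.00060976, x_4 = 2.00000009
```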

The convergence is very quick if we start from different initial values such as x_0 = 10 and even x_0 = 100. However, for an obvious reason, we cannot start with x_0 = 0 due to division by zero.
Finding the square root \sqrt{a} is equivalent to solving the equation

f(x) = x^2 - a = 0,   (1.5)

which is again equivalent to finding the roots of the polynomial f(x). We know that Newton's root-finding algorithm can be written as

x_{k+1} = x_k - \frac{f(x_k)}{f'(x_k)},   (1.6)

where f'(x) is the first derivative or gradient of f(x). In this case, we have f'(x) = 2x, so Newton's formula becomes

x_{k+1} = x_k - \frac{x_k^2 - a}{2x_k},   (1.7)

which can be written as

x_{k+1} = \Big(x_k - \frac{x_k}{2}\Big) + \frac{a}{2x_k} = \frac{1}{2}\Big(x_k + \frac{a}{x_k}\Big).   (1.8)

This is exactly what we have in Eq. (1.1).

Newton's method has rigorous mathematical foundations and guaranteed convergence under certain conditions. Eq. (1.6) is more general, but the gradient information f'(x) is needed; in addition, for the formula to be valid, we must have f'(x_k) ≠ 0.
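
Eq. (1.6) is just as easy to code for a general differentiable function. A minimal sketch (our own, assuming the caller supplies f and its derivative); applied to f(x) = x^2 - 4, it reduces to the square-root iteration above:

```python
def newton(f, df, x0, tol=1e-12, max_iter=100):
    """Newton's root-finding iteration, Eq. (1.6): x <- x - f(x)/f'(x)."""
    x = x0
    for _ in range(max_iter):
        dfx = df(x)
        if dfx == 0:  # the formula is only valid when f'(x_k) != 0
            raise ZeroDivisionError("zero derivative encountered")
        step = f(x) / dfx
        x -= step
        if abs(step) < tol:
            break
    return x

print(newton(lambda x: x**2 - 4, lambda x: 2 * x, x0=1.0))  # 2.0
```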

1.1.2 Issues with algorithms

The advantage of the algorithm given in Eq. (1.1) is that it converges very quickly. However, careful readers may have asked: we know that \sqrt{4} = ±2, so how can we find the other root, −2, in addition to +2? Even if we use different initial values such as x_0 = 10 or x_0 = 0.5, we can only reach x∗ = 2, not −2.
What happens if we start with x_0 < 0? From x_0 = −1 we have

x_1 = \frac{1}{2}\Big(-1 + \frac{4}{-1}\Big) = -2.5,  x_2 = \frac{1}{2}\Big(-2.5 + \frac{4}{-2.5}\Big) = -2.05,   (1.9)

x_3 \approx -2.0006,  x_4 \approx -2.00000009,   (1.10)

which approaches −2 very quickly. If we start from x_0 = −10 or x_0 = −0.5, we always get x∗ = −2, never +2.
This highlights a key issue here: the final solution seems to depend on the initial
starting point for this algorithm, which is true for many algorithms.
Now the relevant question is: how do we know where to start to get a particular
solution? The general short answer is “we do not know”. Thus, some knowledge of
the problem under consideration or an educated guess may be useful to find the final
solution.
In fact, most algorithms may depend on the initial configuration, and such algorithms often carry out search moves locally. Thus, this type of algorithm is often referred to as local search. A good algorithm should be able to "forget" its initial configuration, though such algorithms may not exist at all for most types of problems. What we need in general is global search, which attempts to find final solutions that are less sensitive to the initial starting point(s).
Another important issue in our discussions is that the gradient information f'(x) is necessary for some algorithms such as Newton's method given in Eq. (1.6). This poses certain requirements on the smoothness of the function f(x). For example, we know that |x| is not differentiable at x = 0. Thus, we cannot directly use Newton's method to find the roots of f(x) = |x|x^2 - a = 0 for a > 0; some modifications are needed.
There are other issues related to algorithms, such as the setting of parameters, slow rates of convergence, condition numbers, and iteration structures. All these make algorithm design and usage somewhat challenging, and we will discuss these issues in more detail later in this book.

1.1.3 Types of algorithms


An algorithm can only do a specific computational task (at most a class of computational tasks), and no single algorithm can do all tasks. Thus, algorithms can be classified according to their purposes. An algorithm that finds the roots of a polynomial belongs to root-finding algorithms, whereas an algorithm for ranking a set of numbers belongs to sorting algorithms. There are many classes of algorithms for different purposes. Even for the same purpose such as sorting, there are many different algorithms, such as merge sort, bubble sort, quicksort, and others.

We can also categorize algorithms in terms of their characteristics. The root-finding algorithms we just introduced are deterministic algorithms because the final solutions are exactly the same if we start from the same initial guess: we obtain the same set of solutions every time we run the algorithm. On the other hand, we may introduce some randomization into the algorithm, for example, by using purely random initial points. Every time we run the algorithm, we use a new random initial guess. In this case, the algorithm has some nondeterministic nature, and such algorithms are referred to as stochastic. Sometimes, using randomness may be advantageous. For example, in the example of \sqrt{4} = ±2 using Eq. (1.1), random initial values (both positive and negative) can allow the algorithm to find both roots. In fact, a major trend in modern metaheuristics is to use randomization to suit different purposes.
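
As a small illustration of this point, the sketch below (our own) restarts the iteration of Eq. (1.1) from random initial values; negative starting points converge to −2 and positive ones to +2, so the randomized runs recover both roots:

```python
import random

def sqrt_iter(a, x0, n_iter=50):
    x = x0
    for _ in range(n_iter):
        x = 0.5 * (x + a / x)
    return x

random.seed(1)
starts = [random.uniform(-10, 10) for _ in range(20)]
roots = {round(sqrt_iter(4.0, x0), 6) for x0 in starts}
print(roots)  # {-2.0, 2.0}: random restarts find both roots
```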
For the algorithms to be introduced in this book, we are mainly concerned with algorithms for data mining, optimization, and machine learning. We use a relatively unified approach to link algorithms in data mining and machine learning to algorithms for optimization.

1.2 Optimization

Optimization is everywhere, from engineering design to business planning. After all,


time and resources are limited, and optimal use of such valuable resources is crucial.
In addition, designs of products have to maximize the performance, sustainability, and
energy efficiency and to minimize the costs. Therefore, optimization is important for
many applications.

1.2.1 A simple example


Let us start with a very simple example: design a container with volume capacity V_0 = 10 m^3. As the main cost is related to the cost of materials, the main aim is to minimize the total surface area S.
The first thing we have to decide is the shape of the container (cylinder, cube, sphere or ellipsoid, or a more complex geometry). For simplicity, let us start with a cylindrical shape with radius r and height h (see Fig. 1.1).

Figure 1.1 Design of a cylindrical container.

The total surface area of a cylinder is

S = 2\pi r^2 + 2\pi rh,   (1.11)

and the volume is

V = \pi r^2 h.   (1.12)

There are only two design variables, r and h, and one objective function S to be minimized. Obviously, if there is no capacity constraint, then we can choose not to build the container, and the cost of materials is zero for r = 0 and h = 0. However,

the constraint requirement means that we have to build a container with fixed volume V_0 = \pi r^2 h = 10 m^3. Therefore, this optimization problem can be written as

minimize S = 2\pi r^2 + 2\pi rh,   (1.13)

subject to the equality constraint

\pi r^2 h = V_0 = 10.   (1.14)

To solve this problem, we can first use the equality constraint to reduce the number of design variables by solving for h. We have

h = \frac{V_0}{\pi r^2}.   (1.15)

Substituting it into (1.13), we get

S = 2\pi r^2 + 2\pi rh = 2\pi r^2 + 2\pi r \frac{V_0}{\pi r^2} = 2\pi r^2 + \frac{2V_0}{r}.   (1.16)

This is a univariate function. From basic calculus we know that the minimum or maximum can occur at a stationary point, where the first derivative is zero, that is,

\frac{dS}{dr} = 4\pi r - \frac{2V_0}{r^2} = 0,   (1.17)

which gives

r^3 = \frac{V_0}{2\pi},  or  r = \Big(\frac{V_0}{2\pi}\Big)^{1/3}.   (1.18)

Thus, the height satisfies

\frac{h}{r} = \frac{V_0/(\pi r^2)}{r} = \frac{V_0}{\pi r^3} = 2.   (1.19)
This means that the height is twice the radius: h = 2r. Thus, the minimum surface area is

S_* = 2\pi r^2 + 2\pi rh = 2\pi r^2 + 2\pi r(2r) = 6\pi r^2 = 6\pi \Big(\frac{V_0}{2\pi}\Big)^{2/3} = \frac{6\pi}{\sqrt[3]{4\pi^2}}\, V_0^{2/3}.   (1.20)

For V_0 = 10, we have

r = \Big(\frac{V_0}{2\pi}\Big)^{1/3} = \Big(\frac{10}{2\pi}\Big)^{1/3} \approx 1.1675,  h = 2r \approx 2.335,

and the total surface area is

S_* = 2\pi r^2 + 2\pi rh \approx 25.69.
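
These analytical results are easy to verify numerically. Below is a sketch using SciPy (our own illustration; it assumes SciPy is installed), which minimizes the reduced objective of Eq. (1.16) over r:

```python
import numpy as np
from scipy.optimize import minimize_scalar

V0 = 10.0
S = lambda r: 2 * np.pi * r**2 + 2 * V0 / r  # Eq. (1.16), with h eliminated

res = minimize_scalar(S, bounds=(0.01, 10.0), method="bounded")
print(res.x, 2 * res.x, res.fun)  # approx 1.1675, 2.335, 25.69
```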

It is worth pointing out that this optimal solution is based on the assumption or requirement to design a cylindrical container. If we decide to use a sphere with radius R, we know that its volume and surface area are

V_0 = \frac{4\pi}{3} R^3,  S = 4\pi R^2.   (1.21)

We can solve for R directly:

R^3 = \frac{3V_0}{4\pi},  or  R = \Big(\frac{3V_0}{4\pi}\Big)^{1/3},   (1.22)

which gives the surface area

S = 4\pi \Big(\frac{3V_0}{4\pi}\Big)^{2/3} = \frac{4\pi \sqrt[3]{9}}{\sqrt[3]{16\pi^2}}\, V_0^{2/3}.   (1.23)

Since 6\pi/\sqrt[3]{4\pi^2} \approx 5.5358 and 4\pi\sqrt[3]{9}/\sqrt[3]{16\pi^2} \approx 4.83598, we have S < S_*; that is, the surface area of a sphere is smaller than the minimum surface area of a cylinder with the same volume. In fact, for the same V_0 = 10, we have

S(\text{sphere}) = \frac{4\pi\sqrt[3]{9}}{\sqrt[3]{16\pi^2}}\, V_0^{2/3} \approx 22.47,   (1.24)

which is smaller than S_* = 25.69 for a cylinder.
This highlights the importance of the choice of design type (here in terms of shape) before we can do any truly useful optimization. Obviously, there are many other factors that can influence the choice of design, including the manufacturability of the design, stability of the structure, ease of installation, space availability, and so on. For a container, in most applications, a cylinder may be much easier to produce than a sphere, and thus the overall cost may be lower in practice. Though there are many factors to be considered in engineering design, for the purpose of optimization here we will focus only on the improvement and optimization of a design with well-posed mathematical formulations.

1.2.2 General formulation of optimization

Whatever the real-world application may be, it is usually possible to formulate an optimization problem in a generic form [49,53,160]. All optimization problems with explicit objectives can in general be expressed as a nonlinearly constrained optimization problem:

maximize/minimize f(x),  x = (x_1, x_2, \ldots, x_D)^T \in \mathbb{R}^D,
subject to \phi_j(x) = 0  (j = 1, 2, \ldots, M),
           \psi_k(x) \le 0  (k = 1, \ldots, N),   (1.25)

where f (x), φj (x), and ψk (x) are scalar functions of the design vector x. Here the
components xi of x = (x1 , . . . , xD )T are called design or decision variables, and they
can be either continuous, discrete, or a mixture of these two. The vector x is often
called the decision vector, which varies in a D-dimensional space RD .
It is worth pointing out that we use a column vector here for x (thus with transpose T). We can also use a row vector x = (x_1, \ldots, x_D), and the results will be the same. Different textbooks may use slightly different formulations. Once we are aware of such minor variations, it should cause no difficulty or confusion.
In addition, the function f (x) is called the objective function or cost function,
φj (x) are constraints in terms of M equalities, and ψk (x) are constraints written as
N inequalities. So there are M + N constraints in total. The optimization problem
formulated here is a nonlinear constrained problem. Here the inequalities ψk (x) ≤ 0
are written as “less than”, and they can also be written as “greater than” via a simple
transformation by multiplying both sides by −1.
The space spanned by the decision variables is called the search space RD , whereas
the space formed by the values of the objective function is called the objective or
response space, and sometimes the landscape. The optimization problem essentially
maps the domain RD or the space of decision variables into the solution space R (or
the real axis in general).
The objective function f(x) can be either linear or nonlinear. If the constraints \phi_j and \psi_k are all linear, the problem becomes a linearly constrained problem. Furthermore, when \phi_j, \psi_k, and the objective function f(x) are all linear, it becomes a linear programming problem [35]. If the objective is at most quadratic with linear constraints, then it is called a quadratic programming problem. If all the values of the decision variables can only be integers, then this type of linear programming is called integer programming or integer linear programming.
On the other hand, if no constraints are specified, so that x_i can take any values on the real axis (or any integers), then the optimization problem is referred to as an unconstrained optimization problem.
As a very simple example of optimization problems without any constraints, we discuss the search for the maxima or minima of a univariate function.
Figure 1.2 A simple multimodal function f(x) = x^2 e^{-x^2}.

Example 2
For example, to find the maximum of the univariate function

f(x) = x^2 e^{-x^2},  -\infty < x < \infty,   (1.26)

is a simple unconstrained problem, whereas the following problem is a simple constrained minimization problem:

f(x_1, x_2) = x_1^2 + x_1 x_2 + x_2^2,  (x_1, x_2) \in \mathbb{R}^2,   (1.27)

subject to

x_1 \ge 1,  x_2 - 2 = 0.   (1.28)
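
To show how such a formulation maps onto software, here is a sketch (our own; the book does not prescribe this code) that passes the constrained problem of Eqs. (1.27)-(1.28) to scipy.optimize.minimize:

```python
from scipy.optimize import minimize

f = lambda x: x[0]**2 + x[0] * x[1] + x[1]**2       # Eq. (1.27)
cons = [
    {"type": "ineq", "fun": lambda x: x[0] - 1.0},  # x1 >= 1
    {"type": "eq",   "fun": lambda x: x[1] - 2.0},  # x2 - 2 = 0
]
res = minimize(f, x0=[2.0, 2.0], constraints=cons)
print(res.x, res.fun)  # approx [1, 2] and 7: the minimum sits on the boundary x1 = 1
```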

It is worth pointing out that the objectives are explicitly known in all the optimization problems to be discussed in this book. However, in reality, it is often difficult to quantify what we want to achieve, but we still try to optimize certain things, such as the degree of enjoyment or service quality on holiday. In other cases, it may be impossible to write the objective function in any explicit mathematical form.
From basic calculus we know that, for a given curve described by f(x), its gradient f'(x) describes the rate of change. When f'(x) = 0, the curve has a horizontal tangent at that particular point. This means that it becomes a point of special interest. In fact, the maximum or minimum of a curve occurs at

f'(x∗) = 0,   (1.29)

which is a critical condition or stationary condition. The solution x∗ to this equation corresponds to a stationary point, and there may be multiple stationary points for a given curve.
To see whether it is a maximum or minimum at x = x∗, we have to use the information of the second derivative f''(x). In fact, f''(x∗) > 0 corresponds to a minimum, whereas f''(x∗) < 0 corresponds to a maximum. Let us see a concrete example.

Example 3
To find the minimum of f(x) = x^2 e^{-x^2} (see Fig. 1.2), we have the stationary condition f'(x) = 0, or

f'(x) = 2x e^{-x^2} + x^2 (-2x) e^{-x^2} = 2(x - x^3) e^{-x^2} = 0.
As e^{-x^2} > 0, we have

x(1 - x^2) = 0,  or  x = 0 and x = ±1.

The second derivative is given by

f''(x) = 2e^{-x^2}(1 - 5x^2 + 2x^4),

which is an even function with respect to x.
So at x = ±1, we have f''(±1) = 2[1 - 5(±1)^2 + 2(±1)^4] e^{-(±1)^2} = -4e^{-1} < 0. Thus, there are two maxima that occur at x∗ = ±1 with f_max = e^{-1}. At x = 0, we have f''(0) = 2 > 0, so the minimum of f(x) occurs at x∗ = 0 with f_min(0) = 0.
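
The calculus in Example 3 can also be checked symbolically. A short sketch with SymPy (our own; it assumes SymPy is available):

```python
import sympy as sp

x = sp.symbols("x", real=True)
f = x**2 * sp.exp(-x**2)

stationary = sp.solve(sp.diff(f, x), x)  # [-1, 0, 1]
f2 = sp.diff(f, x, 2)
for s in stationary:
    print(s, sp.simplify(f2.subs(x, s)), f.subs(x, s))
# f'' = -4/e < 0 at x = -1 and x = 1 (maxima, f = 1/e); f'' = 2 > 0 at x = 0 (minimum, f = 0)
```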

Whatever the objective is, we have to evaluate it many times. In most cases, the
evaluations of the objective functions consume a substantial amount of computational
power (which costs money) and design time. Any efficient algorithm that can reduce
the number of objective evaluations saves both time and money.
In mathematical programming, there are many important concepts, and we will
first introduce a few related concepts: feasible solutions, optimality criteria, the strong
local optimum, and weak local optimum.

1.2.3 Feasible solution

A point x that satisfies all the constraints is called a feasible point and thus is a feasible solution to the problem. The set of all feasible points is called the feasible region (see Fig. 1.3).

Figure 1.3 (a) Feasible domain with nonlinear inequality constraints ψ_1(x) and ψ_2(x) (left) and linear inequality constraint ψ_3(x). (b) An example with an objective of f(x) = x^2 subject to x ≥ 2 (right).

For example, we know that the domain of f(x) = x^2 consists of all real numbers. If we want to minimize f(x) without any constraint, all solutions such as x = −1, x = 1, and x = 0 are feasible. In fact, the feasible region is the whole real axis. Obviously, x = 0 corresponds to f(0) = 0 as the true minimum.
However, if we want to find the minimum of f(x) = x^2 subject to x ≥ 2, then it becomes a constrained optimization problem. Points such as x = 1 and x = 0 are no longer feasible because they do not satisfy x ≥ 2. In this case, the feasible solutions are all the points that satisfy x ≥ 2, so x = 2, x = 100, and x = 10^8 are all feasible. It is obvious that the minimum occurs at x = 2 with f(2) = 2^2 = 4; that is, the optimal solution for this problem occurs at the boundary point x = 2 (see Fig. 1.3).
Figure 1.4 Local optima, weak optima, and global optimality.

1.2.4 Optimality criteria

A point x∗ is called a strong local maximum of the nonlinearly constrained optimization problem if f(x) is defined in a δ-neighborhood N(x∗, δ) and satisfies f(x∗) > f(u) for all u ∈ N(x∗, δ), where δ > 0 and u ≠ x∗. If x∗ is not a strong local maximum, then the inclusion of equality in the condition f(x∗) ≥ f(u) for all u ∈ N(x∗, δ) defines the point x∗ as a weak local maximum (see Fig. 1.4). Local minima can be defined in a similar manner when > and ≥ are replaced by < and ≤, respectively.
Fig. 1.4 shows various local maxima and minima. Point A is a strong local maximum, whereas point B is a weak local maximum because there are many (in fact, infinitely many) different values of x that lead to the same value of f(x∗). Point D is the global maximum, and point E is the global minimum. In addition, point F is a strong local minimum. Point C is also a strong local minimum, but it has a discontinuity in f'(x∗), so the stationary condition f'(x∗) = 0 is not valid there. We will not deal with these types of minima or maxima in detail.
As we briefly mentioned before, for a smooth curve f(x), optimal solutions usually occur at stationary points where f'(x) = 0. This is not always the case because optimal solutions can also occur at the boundary, as we have seen in the previous example of minimizing f(x) = x^2 subject to x ≥ 2. In our present discussion, we will assume that both f(x) and f'(x) are always continuous, or that f(x) is everywhere twice continuously differentiable. Obviously, the information of f'(x) alone is not sufficient to determine whether a stationary point is a local maximum or minimum. Thus, higher-order derivatives such as f''(x) are needed, but we do not make any assumption at this stage. We will further discuss this in detail in the next section.

1.3 Unconstrained optimization

Optimization problems can be classified as either unconstrained or constrained. Unconstrained optimization problems can in turn be subdivided into univariate and multivariate problems.

1.3.1 Univariate functions

The simplest optimization problem without any constraints is probably the search for the maxima or minima of a univariate function f(x). For unconstrained optimization problems, the optimality occurs at the critical points given by the stationary condition f'(x) = 0.
However, this stationary condition is just a necessary condition, not a sufficient condition. If f'(x∗) = 0 and f''(x∗) > 0, it is a local minimum. Conversely, if f'(x∗) = 0 and f''(x∗) < 0, then it is a local maximum. However, if f'(x∗) = 0 and f''(x∗) = 0, care should be taken because f''(x) may be indefinite (both positive and negative) when x → x∗; in that case, x∗ corresponds to a saddle point.
For example, for f(x) = x^3, we have

f'(x) = 3x^2,  f''(x) = 6x.   (1.30)

The stationary condition f'(x) = 3x^2 = 0 gives x∗ = 0. However, we also have

f''(x∗) = f''(0) = 0.

In fact, f(x) = x^3 has a saddle point at x∗ = 0 because f'(0) = 0 and f'' changes sign from f''(0+) > 0 to f''(0−) < 0 as x moves from positive to negative.

Example 4
For example, to find the maximum or minimum of the univariate function

f(x) = 3x^4 - 4x^3 - 12x^2 + 9,  -\infty < x < \infty,

we first have to find its stationary points x∗ where the first derivative f'(x) is zero, that is,

f'(x) = 12x^3 - 12x^2 - 24x = 12(x^3 - x^2 - 2x) = 0.

Since f'(x) = 12(x^3 - x^2 - 2x) = 12x(x + 1)(x - 2) = 0, we have

x∗ = -1,  x∗ = 2,  x∗ = 0.

The second derivative of f(x) is simply

f''(x) = 36x^2 - 24x - 24.

From basic calculus we know that the maximum requires f''(x∗) ≤ 0, whereas the minimum requires f''(x∗) ≥ 0.
At x∗ = -1, we have

f''(-1) = 36(-1)^2 - 24(-1) - 24 = 36 > 0,

so this point corresponds to a local minimum

f(-1) = 3(-1)^4 - 4(-1)^3 - 12(-1)^2 + 9 = 4.

Similarly, at x∗ = 2, f''(x∗) = 72 > 0, and thus we have another local minimum

f(x∗) = -23.

However, at x∗ = 0, we have f''(0) = -24 < 0, which corresponds to a local maximum f(0) = 9. This maximum is not a global maximum because the global maxima of f(x) occur at x = ±∞.
The global minimum occurs at x∗ = 2 with f(2) = -23.
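
Since f'(x) in Example 4 is a cubic polynomial, its roots can equally be found numerically. A sketch (our own) using NumPy:

```python
import numpy as np

f  = lambda x: 3 * x**4 - 4 * x**3 - 12 * x**2 + 9
f2 = lambda x: 36 * x**2 - 24 * x - 24

# Roots of f'(x) = 12x^3 - 12x^2 - 24x (coefficients in descending powers)
for s in sorted(np.roots([12, -12, -24, 0]).real):
    kind = "minimum" if f2(s) > 0 else "maximum"
    print(f"x* = {s:.0f}: f = {f(s):.0f} (local {kind})")
# x* = -1: f = 4 (local minimum)
# x* = 0:  f = 9 (local maximum)
# x* = 2:  f = -23 (local minimum)
```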

The maximization of a function f(x) can be converted into the minimization of A − f(x), where A is usually a large positive number (though A = 0 will also do). For example, we know that the maximum of f(x) = e^{-x^2}, x ∈ (−∞, ∞), is 1 at x∗ = 0. This problem can be converted into the minimization of −f(x). For this reason, optimization problems can be expressed as either minimization or maximization, depending on the context and the convenience of the formulation.
In fact, in the optimization literature, some books formulate all optimization problems in terms of maximization, whereas others write them in terms of minimization, though they are in essence dealing with the same problems.

1.3.2 Multivariate functions

We can extend the optimization procedure for univariate functions to multivariate functions using partial derivatives and relevant conditions. Let us start with the example

minimize f(x, y) = x^2 + y^2,  x, y \in \mathbb{R}.   (1.31)

It is obvious that x = 0 and y = 0 is a minimum solution because f(0, 0) = 0. The question is how to solve this problem formally. We can extend the stationary condition to partial derivatives, requiring \partial f/\partial x = 0 and \partial f/\partial y = 0. In this case, we have

\frac{\partial f}{\partial x} = 2x + 0 = 0,  \frac{\partial f}{\partial y} = 0 + 2y = 0.   (1.32)

The solution is obviously x∗ = 0 and y∗ = 0.
Now how do we know whether it corresponds to a maximum or minimum? If we try to use the second derivatives, we have four different partial derivatives, such as f_{xx} and f_{yy}; which one should we use? In fact, we need to define the Hessian matrix from these second partial derivatives:

H = \begin{pmatrix} f_{xx} & f_{xy} \\ f_{yx} & f_{yy} \end{pmatrix}
  = \begin{pmatrix} \frac{\partial^2 f}{\partial x^2} & \frac{\partial^2 f}{\partial x \partial y} \\ \frac{\partial^2 f}{\partial y \partial x} & \frac{\partial^2 f}{\partial y^2} \end{pmatrix}.   (1.33)

Since

\frac{\partial^2 f}{\partial x \partial y} = \frac{\partial^2 f}{\partial y \partial x},   (1.34)

we can conclude that the Hessian matrix is always symmetric. In the case of f(x, y) = x^2 + y^2, it is easy to check that the Hessian matrix is

H = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix}.   (1.35)

Mathematically speaking, if H is positive definite, then the stationary point (x∗, y∗) corresponds to a local minimum. Similarly, if H is negative definite, then the stationary point corresponds to a maximum. The definiteness of a symmetric matrix is controlled by its eigenvalues. For this simple diagonal matrix H, its eigenvalues are its two diagonal entries, 2 and 2. As both eigenvalues are positive, this matrix is positive definite. Since the Hessian matrix here does not involve x or y, it is positive definite in the whole search domain (x, y) ∈ \mathbb{R}^2, so we can conclude that the solution at the point (0, 0) is the global minimum.
Obviously, this is a particular case. In general, the Hessian matrix depends on the independent variables, but the definiteness test conditions still apply. That is, positive definiteness at a stationary point means a local minimum. Alternatively, for bivariate functions, we can define the determinant of the Hessian matrix in Eq. (1.33) as

\Delta = \det(H) = f_{xx} f_{yy} - (f_{xy})^2.   (1.36)

At the stationary point (x∗, y∗), if Δ > 0 and f_{xx} > 0, then (x∗, y∗) is a local minimum. If Δ > 0 but f_{xx} < 0, then it is a local maximum. If Δ = 0, the test is inconclusive, and we have to use other information such as higher-order derivatives. However, if Δ < 0, then it is a saddle point. A saddle point is a special point where a local minimum occurs along one direction, whereas a local maximum occurs along another (orthogonal) direction.

Example 5
To minimize f(x, y) = (x - 1)^2 + x^2 y^2, we have

\frac{\partial f}{\partial x} = 2(x - 1) + 2xy^2 = 0,  \frac{\partial f}{\partial y} = 0 + 2x^2 y = 0.   (1.37)

The second condition gives y = 0 or x = 0. Substituting y = 0 into the first condition, we have x = 1. However, x = 0 does not satisfy the first condition. Therefore, we have the solution x∗ = 1 and y∗ = 0.
For our example with f = (x - 1)^2 + x^2 y^2, we have

\frac{\partial^2 f}{\partial x^2} = 2y^2 + 2,  \frac{\partial^2 f}{\partial x \partial y} = \frac{\partial^2 f}{\partial y \partial x} = 4xy,  \frac{\partial^2 f}{\partial y^2} = 2x^2,   (1.38)

and thus we have

H = \begin{pmatrix} 2y^2 + 2 & 4xy \\ 4xy & 2x^2 \end{pmatrix}.   (1.39)

At the stationary point (x∗, y∗) = (1, 0), the Hessian matrix becomes

H = \begin{pmatrix} 2 & 0 \\ 0 & 2 \end{pmatrix},

which is positive definite because its repeated eigenvalue 2 is positive. Alternatively, we have Δ = 4 > 0 and f_{xx} = 2 > 0. Therefore, (1, 0) is a local minimum.
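
The definiteness test is easy to automate by computing eigenvalues numerically. A small sketch (our own) for the Hessian of Eq. (1.39):

```python
import numpy as np

def hessian(x, y):
    """Hessian of f(x, y) = (x - 1)^2 + x^2 y^2, from Eq. (1.38)."""
    return np.array([[2 * y**2 + 2, 4 * x * y],
                     [4 * x * y,    2 * x**2]])

eig = np.linalg.eigvalsh(hessian(1.0, 0.0))  # eigenvalues of the symmetric Hessian
print(eig)                    # [2. 2.]
print(bool(np.all(eig > 0)))  # True: positive definite, so (1, 0) is a local minimum
```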

In fact, for a multivariate function f(x_1, x_2, \ldots, x_n) in an n-dimensional space, the stationary condition can be extended to

G = \nabla f = \Big(\frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \ldots, \frac{\partial f}{\partial x_n}\Big)^T = 0,   (1.40)

where G is called the gradient vector. The second derivative test becomes the definiteness of the Hessian matrix

H = \begin{pmatrix}
\frac{\partial^2 f}{\partial x_1^2} & \frac{\partial^2 f}{\partial x_1 \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_1 \partial x_n} \\
\frac{\partial^2 f}{\partial x_2 \partial x_1} & \frac{\partial^2 f}{\partial x_2^2} & \cdots & \frac{\partial^2 f}{\partial x_2 \partial x_n} \\
\vdots & \vdots & \ddots & \vdots \\
\frac{\partial^2 f}{\partial x_n \partial x_1} & \frac{\partial^2 f}{\partial x_n \partial x_2} & \cdots & \frac{\partial^2 f}{\partial x_n^2}
\end{pmatrix}.   (1.41)

At a stationary point defined by G = \nabla f = 0, positive definiteness of H gives a local minimum, whereas negative definiteness corresponds to a local maximum. In essence, the eigenvalues of the Hessian matrix H determine the local behavior of the function. Note that positive semidefiniteness of H at a stationary point is only a necessary condition for a local minimum; positive definiteness is sufficient.

1.4 Nonlinear constrained optimization

As most real-world problems are nonlinear, nonlinear mathematical programming forms an important part of mathematical optimization methods. A broad class of nonlinear programming problems concerns the minimization or maximization of f(x) subject to no constraints, and another important class is the minimization of a quadratic objective function subject to nonlinear constraints. There are many other types of nonlinear programming problems as well.
Nonlinear programming problems are often classified according to the convexity of the defining functions. An interesting property of a convex function f is that the vanishing of the gradient, \nabla f(x∗) = 0, guarantees that the point x∗ is a global minimum (or, for a concave function, a global maximum) of f. We will introduce the concept of convexity in the next chapter. If a function is not convex or concave, then it is much more difficult to find its global minima or maxima.

1.4.1 Penalty method

For simple function optimization with equality and inequality constraints, a common method is the penalty method. For the optimization problem

minimize f(x),  x = (x_1, \ldots, x_n)^T \in \mathbb{R}^n,
subject to \phi_i(x) = 0  (i = 1, \ldots, M),  \psi_j(x) \le 0  (j = 1, \ldots, N),   (1.42)

the idea is to define a penalty function \Pi so that the constrained problem is transformed into an unconstrained one:

\Pi(x, \mu_i, \nu_j) = f(x) + \sum_{i=1}^{M} \mu_i \phi_i^2(x) + \sum_{j=1}^{N} \nu_j \max\{0, \psi_j(x)\}^2,   (1.43)

where \mu_i \gg 1 and \nu_j \ge 0.
For example, let us solve the following minimization problem:

minimize f(x) = 40(x - 1)^2,  x \in \mathbb{R},  subject to g(x) = x - a \ge 0,   (1.44)

where a is a given value. Obviously, without this constraint, the minimum value occurs at x = 1 with f_min = 0. If a < 1, then the constraint will not affect the result. However, if a > 1, then the minimum should occur at the boundary x = a (which can be seen by inspecting or visualizing the objective function and the constraint). Now we can define a penalty function \Pi(x) using a penalty parameter \mu \gg 1. We have

\Pi(x, \mu) = f(x) + \mu [g(x)]^2 = 40(x - 1)^2 + \mu (x - a)^2,   (1.45)

which converts the original constrained optimization problem into an unconstrained problem. From the stationarity condition \Pi'(x) = 0 we have

80(x - 1) + 2\mu(x - a) = 0,  or  x_* = \frac{40 + \mu a}{40 + \mu}.   (1.46)
For the particular case a = 1, we have x∗ = 1, and the result does not depend on μ. However, in the case of a > 1 (say, a = 5), the result will depend on μ. When a = 5 and μ = 100, we have x∗ = (40 + 100 × 5)/(40 + 100) ≈ 3.8571. If μ = 1000, then this gives x∗ = (40 + 1000 × 5)/(40 + 1000) ≈ 4.8462. Both values are still some distance from the exact solution x_true = a = 5. If we use μ = 10^4, then we have x∗ ≈ 4.9841. Similarly, for μ = 10^5, we have x∗ ≈ 4.9984. This clearly demonstrates that the solution in general depends on μ and approaches the true solution only as μ becomes very large. However, it is very difficult to use extremely large values of μ without causing extra computational difficulties.
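
The dependence on μ is easy to observe numerically. A sketch (our own) that minimizes the penalty function Π(x, μ) of Eq. (1.45) for a = 5 with increasing μ:

```python
from scipy.optimize import minimize_scalar

a = 5.0
for mu in [1e2, 1e3, 1e4, 1e5]:
    penalty = lambda x: 40 * (x - 1)**2 + mu * (x - a)**2  # Eq. (1.45)
    x_star = minimize_scalar(penalty).x
    print(f"mu = {mu:.0e}: x* = {x_star:.4f}")
# x* = 3.8571, 4.8462, 4.9841, 4.9984: the minimizer (40 + mu*a)/(40 + mu)
# approaches a = 5 only as mu grows very large
```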
“If my prophecy turns out right,” said the Evil One, “give the child to
me.”
“I will give it,” said Adam.
Now the child, when born, was very fair to look upon, and Adam,
though he repented of his rash promise, did not venture to break his
word; so he gave the child to Eblis, that is to say, he named it Abd-
el-Hareth, or Servant of Hareth, instead of Abd-Allah, Servant of
God. And after living two years it died.[90]
Thus Satan became an associate in the affairs of man.
But others tell the conclusion of the story somewhat differently. They
say that the child Abd-el-Hareth became the progenitor of the whole
race of Satyrs, nightmares, and hobgoblins.
Maimonides says that the Sabians attribute to Adam the introduction
of the worship of the moon, on which account they call him the
prophet or apostle of the moon.[91]
A large number of books are attributed to Adam. The passage in
Genesis, This is the Book of the generations of Adam,[92] led many to
suppose that Moses quoted from a book written by our first parent.
That such an apocryphal book did exist in after-times, appears from
the fact of Pope Gelasius in his decrees rejecting it as spurious. He
speaks of it as “the book which is called the Book of the generations
of Adam or Geneseos.” And the Rabbis say that this book was
written by Adam, after he had seen all his posterity brought out
before him, as already related. And this book, they say, Adam gave
to Enoch.[93]
Beside this, there existed an Apocalypse of Adam, which is
mentioned by S. Epiphanius, who quotes a passage from it, in which
Adam describes the Tree of Life, which produced twelve kinds of fruit
every year.[94] And George Syncellus, in his Chronicle, extracts a
portion of an apocryphal Life of Adam.
Amongst the Revelations of S. Amadeus are found two psalms,
which, in vision, he heard had been composed by Adam. One was
on the production of Eve; the other was a hymn of repentance, a joint
composition of the two outcasts. It runs as follows:—
Adam.—“Adonai, my Lord God, have mercy upon me for Thy great
goodness, and according to the multitude of Thy mercies do away
my transgressions. I am bowed down with trouble, Thy waves and
storms have gone over me. Deliver me, O God, and save me from
the flood of many waters. Hear my words, O Heavens, and all ye that
dwell in them. May the Angels bear up all my thoughts and words to
Thee, and may the celestial virtues declare them. May the Lord bend
His compassionate ear to my lowly petition. May He hear my prayer,
and let the cry of my heart reach Him. Thou, O God, art the true and
most brilliant light; all other lights are mingled with darkness. Thou
art the sun that knowest no down-setting, that dwellest in
inaccessible light. Thou art the end to which all flesh come. Thou art
the only satisfaction of all the blessed.”
Eve.—“Adonai, Lord God, have mercy upon me for Thy great
goodness, and for the multitude of Thy mercies do away my
transgressions. Thou before all things didst create the immoveable
heaven as a holy and exalted home, and Thou didst adorn it with
angel spirits, to whom Thou didst in goodness declare Thy purposes.
They were the bright morning stars who sang to Thee through ages
of ages. Thou didst form the moveable heaven and Thou didst set in
it the watery clouds. Those waters are under the immoveable
heaven, and are above all that live and move. Thou didst create the
light; the beauteous sun, the moon with the five planets didst Thou
place in the midst, and didst fix the signs and constellations. Thou
didst produce four elements, and didst kindle all with Thy wisdom.”
Adam.—“Adonai, Lord God, have mercy upon me for Thy great
goodness, and for the multitude of Thy mercies do away my
transgressions. Thou hast cast out the proud and rebel dragon with
Thy mighty arm. Thou hast put down the mighty from their seat and
hast exalted the humble and meek. Thou hast filled the hungry with
good things, and the rich Thou hast sent empty away. Thou didst
fashion me in Thine own image of the dust of earth, and destine me,
mortal, to be immortal; and me, frail, to endure. Thou didst lead me
into the place of life and joy, and didst surround me with all good
things; Thou didst put all things under my feet, and didst reveal to
me Thy great name, Adonai. Thou didst give me Eve, to be a help
meet for me, whom Thou didst draw from my side.”
Adam.—“Adonai, Lord and God, have mercy upon me for Thy great
goodness, and for the multitude of Thy mercies do away my
transgressions; for Thou hast made me the head of all men. Thou
hast inspired me and my consort with Thy wisdom, and hast given us
a free will and placed our lot in our own hands. But Thou hast given
us precepts and laws, and hast placed life and death before us that
we might keep Thy commandments, and in keeping them find life;
but if we keep them not, we shall die. Lucifer, the envious one, saw
and envied. He fought against us and prevailed. Conquered by
angels, he conquered man, and subjugated all his race. I have
sinned. I am he who have committed iniquity. If I had refused in my
free will, neither Eve nor the Enemy could have obtained my
destruction. But being in honour I had no understanding and I lost
my dignity. I am like to the cattle, the horse, and the mule, which
have no understanding.”
Eve.—“Adonai, Lord and God, have mercy upon me for Thy great
goodness, and for the multitude of Thy mercies do away mine
offences. Great is our God, and great is His mercy; His goodness is
unmeasured. He will supply the remedy to our sin, that if we will to
rise, we may be able to arise; He has appointed His Son, the glorifier
of all, and our Redeemer; and He has appointed the Holy Mother to
be our mediatrix, in whose image He has built me, Eve, the mother
of all flesh. He has fashioned the Mother after the likeness of her
daughter. He has made the father after the image and likeness of His
Son; and He will blot out our transgressions for His merits, if we yield
our wills thereto, and receive His sacraments. He will receive a free-
will offering, and He will not despise a contrite heart. To those going
towards Him, He will fly with welcome, He will pardon their offences
and will crown them with glory.”
Adam.—“Adonai, Lord and God, have mercy upon me for Thy great
goodness, and for the multitude of Thy mercies do away mine
offences. O God, great is the abundance of Thy sweetness. Blessed
are all they that hope in Thee. After the darkness Thou bringest in
the light; and pain is converted into joy. Thou repayest a thousand
for a hundred, and for a thousand Thou givest ten thousand. For the
least things, Thou rewardest with the greatest things; and for
temporal joys, Thou givest those that are eternal. Blessed are they
that keep Thy statutes, and bend their necks to Thy yoke. They shall
dwell in Thy tabernacle and rest upon Thy holy hill. They shall be
denizens of Thy courts with Thee, whose roofs shine above gold and
precious stones. Blessed are they who believe in the triune God, and
will to know His ways. We all sing, Glory to the Father, and to the
Son, and to the Holy Ghost, and we magnify our God. As in the
beginning the angels sang, so shall we now and ever, and in ages of
ages. Amen.”[95]
Manasseh Ben-Israel has preserved a prophecy of Adam, that the
world is to last seven thousand years. He says this secret was
handed down from Adam to Enoch, and from Enoch to Noah, and
from Noah to Shem.[96]
At Hebron is a cave, “which,” says an old traveller, “Christians and
Turks point out as having been the place where Adam and Eve
bewailed their sins for a hundred years. This spot is towards the
west, in a valley, about a hundred paces from the Damascene field; it
is a dark grotto, not very long or broad, very low, in a hard rock, and
not apparently artificial, but natural. This valley is called La valle de
Lagrime, the Vale of Tears, as they shed such copious tears over
their transgressions.”[97]
Abu Mohammed Mustapha Ben-Alschit Hasen, in his Universal
History, says that Adam’s garment of fig-leaves, in which he went out
of Eden, was left by him, when he fell, on Adam’s Peak in Ceylon.
There it dried to dust, and the dust was scattered by the wind over
the island, and from this sprang the odoriferous plants which grow
there.[98]
Adam is said to have not gone altogether empty-handed out of
Paradise. Hottinger, in his Oriental History, quoting Jewish
authorities, says: “Adam having gone into the land of Babel, took
with him many wonderful things, amongst others a tree with flowers,
leaves and branches of gold, also a stone tree, also the leaves of a
tree so strong that they were inconsumable in fire, and so large as to
be able to shelter under them ten thousand men of the stature of
Adam; and he carried about with him two of these leaves, of which
one would shelter two men, or clothe them.”[99] Of these trees we
read in the Gemara that the Rabbi Canaan asked of the Rabbi
Simon, son of Assa, who had gone to see them, whether this was
true. He was told in reply that it was so, and that at the time of the
Captivity the Jews had seated themselves under these trees, and in
their shadow had found consolation.
But Palestine seems also to have possessed some of the trees of
Adam’s planting, for Jacob Vitriacus in his Jewish History says:
“There are in that land wonderful trees, which for their pre-excellence
are called Apples of Paradise, bearing oblong fruit, very sweet and
unctuous, having a most delicious savour, bearing in one cluster
more than a hundred compressed berries. The leaves of this tree are
a cubit long and half a cubit wide. There are three other trees
producing beautiful apples or citrons, in which the bite of a man’s
teeth is naturally manifest, wherefore they are called Adam’s
Apples.”[100] Hottinger says that at Tripoli grows a tree called Almaus,
or Adam’s apple, with a green head, and leaves like outspread
fingers, no branches, but only leaves, and with a fruit like a bean-
pod, of delicious flavour, and an odour of roses. Buntingius, in his
Itinerary, describes an Adam’s apple which he tasted at Alexandria,
and he said the taste was like pears, and the clusters of prodigious
size, with twenty in each cluster, like magnificent bunches of grapes.
But the most remarkable fact about them was that, if one of the fruit
were cut with a knife, the figure of a crucifix was found to be
contained in it.[101] And this tree was supposed to have been the
forbidden tree, and the fruit to have thus brought hope as it also
brought death to the eater. Nider, “In Formicario,” also relates that
this fruit, thus marked with the form of the Crucified, grows in
Granada.[102]
“At Beyrut, of which S. Nicodemus was the first bishop,” writes the
Friar, Ignatius von Rheinfelden, “I saw a wonderful fruit which is
called by the Arabs, Mauza, and by the Christians Adam’s fig. This
fruit grows upon a trunk in clusters of fifty or more, and hangs down
towards the ground on account of its weight. The fruit is in shape
something like a cucumber, and is a span long, yellow, and tasting
something like figs. The Christians of those parts say it is the fruit of
which Adam and Eve ate in Paradise, and they argue thus: first,
there are no apples in those parts; secondly, S. Jerome translated
the word in the Bible, Mauza; thirdly, if the fruit be cut, within it is
seen the figure of a crucifix, and they conclude thereby that the first
parents were showed by this figure how their sin would be atoned;
fourthly, the leaves being three ells long and half an ell wide, were
admirably adapted to make skirts of, when Adam and Eve were
conscious of their nakedness. And Holy Scripture says nothing of
apples, but says merely—fruit. But whether this was the fruit or not, I
leave to others to decide.”[103]
Adam is said by the Easterns to have received from Raphael a
magic ring, which became his symbol, and which he handed down to
his descendants selected to know and read mysteries. This was no
other than the ‘crux ansata,’ or handled cross, so common on
Egyptian monuments as the hieroglyph of Life out of death. The
circle symbolized the apple, and thus the Carthusian emblem, which
bears the motto “Stat crux dum volvitur orbis,” is in reality the mystic
symbol of Adam. “Which,” says the Arabic philosopher, Ibn-ephi,
“Mizraim received from Ham, and Ham from Noah, and Noah from
Enoch, and Enoch from Seth, and Seth from Adam, and Adam from
the angel Raphael. Ham wrought with it great marvels, and Hermes
received it from him and placed it amongst the hieroglyphics. But this
character signifies the progress and motion of the Spirit of the world,
and it was a magic seal, kept secret among their mysteries, and a
ring constraining demons.”[104]
VI.
CAIN AND ABEL.

After that the child given to Satan died, says Tabari, Adam had
another son, and he called him Seth, and Seth was prophet in the
room of his father, after the death of Adam.
Adam had many more children; every time that Eve bore, she bare
twins, whereof one was male, the other female, and the twins were
given to one another as husband and wife.
Now Adam sought to give to Abel the twin sister of Cain, when she
was old enough to be married, but Cain (Kabil, in Arabic) was
dissatisfied.[105] Adam said to the brothers, Cain and Abel, “Go, my
sons, and sacrifice to the Lord; and he whose sacrifice is accepted,
shall have the young girl. Take each of you offerings in your hand
and go, sacrifice to the Lord, and He shall decide.”
Abel was a shepherd, and he took the fattest of the sheep, and bore
it to the place of sacrifice; but Cain, who was a tiller of the soil, took a
sheaf of corn, the poorest he could find, and placed it on the altar.
Then fire descended from heaven and consumed the offering of
Abel, so that not even the cinders remained; but the sheaf of Cain
was left untouched.
Adam gave the maiden to Abel, and Cain was sore vexed.
One day, Abel was asleep on a mountain. Cain took a stone and
crushed his head. Then he threw the corpse on his back, and carried
it about, not knowing what to do with it; but he saw two crows
fighting, and one killed the other; then the crow that survived dug a
hole in the earth with his beak, and buried the dead bird. Cain said, “I
have not the sense of this bird. I too will lay my brother in the
ground.” And he did so.
When Adam learned the death of his son, he set out in search of
Cain, but could not find him; then he recited the following lines:—

“Every city is alike, each mortal man is vile,
The face of earth has desert grown, the sky has ceased to smile,
Every flower has lost its hue, and every gem is dim.
Alas! my son, my son is dead; the brown earth swallows him!
We one have had in midst of us whom death has not yet found,
No peace for him, no rest for him, treading the blood-drenched ground.”[106]

This is how the story is told in the Midrash:[107] Cain and Abel could
not agree, for, what one had, the other wanted; then Abel devised a
scheme that they should make a division of property, and thus
remove the possibility of contention. The proposition pleased Cain.
So Cain took the earth, and all that is stationary, and Abel took all
that is moveable.
But the envy which lay in the heart of Cain gave him no rest. One
day he said to his brother, “Remove thy foot, thou standest on my
property; the plain is mine.”
Then Abel ran upon the hills, but Cain cried, “Away, the hills are
mine!” Then he climbed the mountains, but still Cain followed him,
calling, “Away! the stony mountains are mine.”
In the Book of Jasher the cause of quarrel is differently stated. One
day the flock of Abel ran over the ground Cain had been ploughing;
Cain rushed furiously upon him and bade him leave the spot. “Not,”
said Abel, “till you have paid me for the skins of my sheep and wool
of their fleeces used for your clothing.” Then Cain took the coulter
from his plough, and with it slew his brother.[108]
The Targum of Jerusalem says, the subject of contention was that
Cain denied a Judgment to come and Eternal Life; and Abel argued
for both.[109] The Rabbi Menachem, however, asserts that the point
on which they strove was whether a word was written zizit or zizis in
the Parascha.[110]
“And when they were in the field together, the brothers quarrelled,
saying, ‘Let us divide the world.’ One said, ‘The earth you stand on is
my soil.’ The other said, ‘You are standing on my earth.’ One said,
‘The Holy Temple shall stand on my lot;’ the other said, ‘It shall stand
on my lot.’ So they quarrelled. Now there were born with Abel two
daughters, his sisters. Then said Cain, ‘I will take the one I choose, I
am the eldest;’ Abel said, ‘They were born with me, and I will have
them both to wife.’ And when they fought, Abel flung Cain down and
was above him; and he lay on Cain. Then Cain said to Abel, ‘Are we
not both sons of one father; why wilt thou kill me?’ And Abel had
compassion, and let Cain get up. And so Cain fell on him and killed
him. From this we learn not to render good to the evil, for, because
Abel showed mercy to Cain, Cain took advantage of it to slay
Abel.”[111]
S. Methodius the Younger refers to this tradition. He says: “Be it
known that Adam and Eve when they left Paradise were virgins. But
the third year after the expulsion from Eden, they had Cain, their
first-born, and his sister Calmana; and after this, next year, they had
Abel and his sister Deborah. But in the three hundredth year of
Adam’s life, Cain slew his brother, and Adam and Eve wailed over
him a hundred years.”[112]
Eutychius, Patriarch of Alexandria, says, “When Adam and Eve
rebelled against God, He expelled them from Paradise at the ninth
hour on Friday to a certain mountain in India, and He bade them
produce children to increase and multiply upon the earth. Adam and
Eve therefore became parents, first of a boy named Cain, and of a
girl named Azrun, who were twins; then of another boy named Abel,
and of a twin sister named Owain, or in Greek Laphura.
“Now, when the children were grown up, Adam said to Eve, ‘Let Cain
marry Owain, who was born with Abel, and let Abel have Azrun, who
was born with Cain.’ But Cain said to his mother, ‘I will marry my own
twin sister, and Abel shall marry his.’ For Azrun was prettier than
Owain. But when Adam heard this, he said, ‘It is contrary to the
precept that thou shouldst marry thy twin sister.’
“Now Cain was a tiller of the ground, but Abel was a pastor of sheep.
Adam said to them, ‘Take of the fruits of the earth, and of the young
of the sheep, and ascend the top of this holy mountain, and offer
there the best and choicest to God.’ Abel offered of the best and
fattest of the first-born of the flock. Now as they were ascending the
summit of the mountain, Satan put it into the head of Cain to kill his
brother, so as to get Azrun. For that reason his oblation was not
accepted by God. Therefore he was the more inflamed with rage
against Abel, and as they were going down the mount, he rushed
upon him and beat him about the head with a stone and killed him.
Adam and Eve bewailed Abel a hundred years with the greatest
grief.... And God cast out Cain whilst he was still unmarried into the
land of Nod. But Cain carried off with him his sister Azrun.”[113]
The Rabbi Zadok said, “This was the reason why Cain slew Abel.
His twin sister and wife was not at all good-looking. Then he said, ‘I
will kill my brother Abel, and carry off his wife.’”[114]
Gregory Abulfaraj gives this account of the strife: “According to the
opinion of Mar Theodosius, thirty years after he was expelled from
Paradise, Adam knew his wife Eve, and she bore twins, Cain and his
sister Climia; and after thirty more years she bore Abel and his twin
sister Lebuda. Then, seventy years after when Adam wanted to
marry one of the brothers with the twin sister of the other, Cain
refused, asking to have his own twin sister.”[115]
The Pseudo-Athanasius says, “Up to this time no man had died so
that Cain should know how to kill. The devil instructed him in this in a
dream.”[116]
Leonhard Marius on Genesis iv. says, “As to what instrument Cain
used, Scripture is silent. Chrysostom calls it a sword; Prudentius, a
spade; Irenæus, an axe; Isidore says simply, steel; but artists
generally paint a club, and Abulensis thinks he was killed with
stones.” Reuchlin thinks, as iron was not discovered till the times of
Tubal-cain, the weapon must have been made of wood, and he
points out how much more this completes the type of Christ.[117]
Cain and Abel had been born and had lived with Adam in the land of
Adamah; but after Cain slew his brother, he was cast out into the
land Erez, and wherever he went, swords sounded and flashed as
though thirsting to smite him. And he fled that land and came to
Acra, where he had children, and his descendants who live there to
this day have two heads.[118]
Before Cain slew his brother, says the Targum of Jerusalem, the
earth brought forth fruits as the fruits of Eden; but from the day that
blood was spilt upon it, thistles and thorns sprang up; for the face of
earth grew sad, its joy was gone, the stain was on its brow.
Abel’s offering had been of the fattest of his sheep, the Targum adds,
but Cain offered flax.[119]
Abel’s offering, say certain Rabbis, was not perfect; for he offered
the chief part to God, but the remainder he dedicated to the Devil,
and Cain offered the chief part to Satan, and only the remainder to
God.[120]
The Rabbi Johanan said, Cain exclaimed when accused by God of
the murder, “My iniquity is greater than I can bear,” and this is
supposed to mean, “My iniquity is too great to be atoned for, except
by my brother rising from the earth and slaying me.” What did the
Holy One then? He took one letter of the twenty-two which are in the
Law, and He wrote it on the arm of Cain, as it is written, “He put a
mark upon him.”[121]
After Abel was slain, the dog which had kept his sheep guarded his
body, says the Midrash. Adam and Eve sat beside it and wept, and
knew not what to do. Then said a raven whose friend was dead, “I
will teach Adam a lesson,” and he dug a hole in the soil and laid his
friend there and covered him up. And when Adam saw this, he said
to Eve, “We will do the same with Abel.” God rewarded the raven for
this by promising that none should ever injure his young, that he
should always have meat in abundance, and that his prayer for rain
should be immediately answered.[122]
But the Rabbi Johanan taught that Cain buried his brother to hide
what he had done from the eye of God, not knowing that God can
see even the most secret things.[123]
According to some Rabbis, all good souls are derived from Abel and
all bad souls from Cain. Cain’s soul was derived from Satan, his
body alone was from Eve; for the Evil Spirit Sammael, according to
some, Satan, according to others, deceived Eve, and thus Cain was
the son of the Evil One.[124] All the children of Cain also became
demons of darkness and nightmares, and therefore it is, say the
Cabbalists, that there is no mention in Genesis of the death of any of
Cain’s offspring.[125]
When Cain had slain his brother, we are told in Scripture that he fled.
Certain Rabbis give the reason:—He feared lest Satan should kill
him: now Satan has no power over any one whose face he does not
see, thus he had none over Lot’s wife till she turned her face towards
Sodom, and he could see it; and Cain fled, to keep his face from
being seen by the Evil One, and thus give him an opportunity of
taking his life.[126]
With regard to the mark put upon Cain, there is great divergence of
opinion. Some say that his tongue turned white; others, that he was
given a peculiar dress; others, that his face became black; but the
most prevalent opinion is that he became covered with hair, and a
horn grew in the midst of his forehead.
The Little Genesis says, Cain was born when Adam was aged
seventy, and Abel when he was seventy-seven.
The book of the penitence of Adam gives us some curious details.
When Cain had killed his brother, he was filled with terror, for he saw
the earth quivering. He cast the body into a hole and covered it with
dust, but the earth threw the body out. Then he dug another hole and
heaped earth on his brother’s corpse, but again the earth rejected it.
When God appeared before him, Cain trembled in all his limbs, and
God said to him, “Thou tremblest and art in fear; this shall be thy
sign.” And from that moment he quaked with a perpetual ague.
The Rabbis give another mark as having been placed on Cain. They
say that a horn grew out of the midst of his forehead. He was killed
by a son of Lamech, who, being shortsighted, mistook him for a wild
beast; but in the Little Genesis it is said that he was killed by the fall
of his house, in the year 930, the same day that Adam died.
According to the same authority, Adam and Eve bewailed Abel
twenty-eight years.
The Talmud relates the following beautiful incident.
God had cursed Cain, and he was doomed to a bitter punishment;
but moved, at last, by Cain’s contrition, He placed on his brow the
symbol of pardon.
Adam met Cain, and looked with wonder on the seal or token, and
asked,—
“How hast thou turned away the wrath of the Almighty?”
“By confession of sin and repentance,” answered the fratricide.
“Woe is me!” cried Adam, smiting his brow; “is the virtue of
repentance so great, and I knew it not! And by repentance I might
have altered my lot!”[127]
Tabari says that Cain was the first worshipper of fire. Eblis (Satan)
appeared to him and told him that the reason of the acceptance of
Abel’s sacrifice was, that he had invoked the fire that fell on it and
consumed it; Cain had not done this, and therefore fire had not come
down on his oblation. Cain believed this, and adored fire, and taught
his children to do the same.[128]
Cain, says Josephus, having wandered over the earth with his wife,
settled in the land of Nod. But his punishment, so far from proving of
advantage to him, proved only a stimulus to his violence and
passion; and he increased his wealth by rapine, and he encouraged
his children and friends to live by robbery and in luxury. He also
corrupted the primitive simplicity in which men lived, by the
introduction amongst them of weights and measures, by placing
boundaries, and walling cities.[129]
John Malala says the same: “Cain was a tiller of the ground till he
committed the crime of slaying his brother; after that, he lived by
violence, his hand being against every man, and he invented and
taught men the use of weights, measures, and boundaries.”[130]