José Unpingco

Python for
Probability,
Statistics, and
Machine Learning
Second Edition
José Unpingco
San Diego, CA, USA

ISBN 978-3-030-18544-2
ISBN 978-3-030-18545-9 (eBook)


https://doi.org/10.1007/978-3-030-18545-9
1st edition: © Springer International Publishing Switzerland 2016
2nd edition: © Springer Nature Switzerland AG 2019
This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part
of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations,
recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission
or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar
methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this
publication does not imply, even in the absence of a specific statement, that such names are exempt from
the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this
book are believed to be true and accurate at the date of publication. Neither the publisher nor the
authors or the editors give a warranty, expressed or implied, with respect to the material contained
herein or for any errors or omissions that may have been made. The publisher remains neutral with regard
to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
To Irene, Nicholas, and Daniella, for all
their patient support.
Preface to the Second Edition

This second edition is updated for Python version 3.6+. Furthermore, many existing
sections have been revised for clarity based on feedback from the first version. The
book is now over thirty percent larger than the original with new material about
important probability distributions, including key derivations and illustrative code
samples. Additional important statistical tests are included in the statistics chapter
including the Fisher Exact test and the Mann–Whitney–Wilcoxon Test. A new
section on survival analysis has been included. The most significant addition is the
section on deep learning for image processing with a detailed discussion of gradient
descent methods that underpin all deep learning work. There is also substantial
discussion regarding generalized linear models. As before, there are more
Programming Tips that illustrate effective Python modules and methods for scientific
programming and machine learning. There are 445 runnable code blocks that have
been tested for accuracy, so you can try them out for yourself in your own code.
Over 158 graphical visualizations (almost all generated using Python) illustrate the
concepts that are developed both in code and in mathematics. We also discuss and use
key Python modules such as NumPy, Scikit-learn, SymPy, SciPy, lifelines, CVXPY,
Theano, Matplotlib, Pandas, TensorFlow, StatsModels, and Keras.
As with the first edition, all of the key concepts are developed mathematically
and are reproducible in Python, to provide the reader with multiple perspectives on
the material. As before, this book is not designed to be exhaustive and reflects the
author’s eclectic industrial background. The focus remains on concepts and
fundamentals for day-to-day work using Python in the most expressive way possible.


Acknowledgements

I would like to acknowledge the help of Brian Granger and Fernando Perez, two
of the originators of the Jupyter Notebook, for all their great work, as well as the
Python community as a whole, for all their contributions that made this book
possible. Hans Petter Langtangen is the author of the Doconce [1] document preparation
system that was used to write this text. Thanks to Geoffrey Poore [2] for his work
with PythonTeX and LaTeX, both key technologies used to produce this book.

San Diego, CA, USA José Unpingco


February 2019

References

1. H.P. Langtangen, DocOnce markup language, https://github.com/hplgit/doconce


2. G.M. Poore, PythonTeX: reproducible documents with LaTeX, Python, and more. Comput. Sci.
Discov. 8(1), 014010 (2015)
Preface to the First Edition

This book will teach you the fundamental concepts that underpin probability and
statistics and illustrate how they relate to machine learning via the Python language
and its powerful extensions. This is not a good first book in any of these topics
because we assume that you already had a decent undergraduate-level introduction
to probability and statistics. Furthermore, we also assume that you have a good
grasp of the basic mechanics of the Python language itself. Having said that, this
book is appropriate if you have this basic background and want to learn how to use
the scientific Python toolchain to investigate these topics. On the other hand, if you
are comfortable with Python, perhaps through working in another scientific field,
then this book will teach you the fundamentals of probability and statistics and how
to use these ideas to interpret machine learning methods. Likewise, if you are a
practicing engineer using a commercial package (e.g., MATLAB, IDL), then you
will learn how to effectively use the scientific Python toolchain by reviewing
concepts you are already familiar with.
The most important feature of this book is that everything in it is reproducible
using Python. Specifically, all of the code, all of the figures, and (most of) the text is
available in the downloadable supplementary materials that correspond to this book
as IPython Notebooks. IPython Notebooks are live interactive documents that allow
you to change parameters, recompute plots, and generally tinker with all of the
ideas and code in this book. I urge you to download these IPython Notebooks and
follow along with the text to experiment with the topics covered. I guarantee doing
this will boost your understanding because the IPython Notebooks allow for
interactive widgets, animations, and other intuition-building features that help make
many of these abstract ideas concrete. As an open-source project, the entire
scientific Python toolchain, including the IPython Notebook, is freely available.
Having taught this material for many years, I am convinced that the only way to
learn is to experiment as you go. The text provides instructions on how to get
started installing and configuring your scientific Python environment.


This book is not designed to be exhaustive and reflects the author’s eclectic
background in industry. The focus is on fundamentals and intuitions for day-to-day
work, especially when you must explain the results of your methods to a
nontechnical audience. We have tried to use the Python language in the most expressive
way possible while encouraging good Python-coding practices.

Acknowledgements

I would like to acknowledge the help of Brian Granger and Fernando Perez, two
of the originators of the Jupyter/IPython Notebook, for all their great work, as well
as the Python community as a whole, for all their contributions that made this book
possible. Additionally, I would also like to thank Juan Carlos Chavez for his
thoughtful review. Hans Petter Langtangen is the author of the Doconce [14]
document preparation system that was used to write this text. Thanks to Geoffrey
Poore [25] for his work with PythonTeX and LaTeX.

San Diego, CA, USA José Unpingco


February 2016
Contents

1 Getting Started with Scientific Python . . . . . . . . . . . . . . . . . . . . . . . 1


1.1 Installation and Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2
1.2 Numpy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2.1 Numpy Arrays and Memory . . . . . . . . . . . . . . . . . . . . . 6
1.2.2 Numpy Matrices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.2.3 Numpy Broadcasting . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.2.4 Numpy Masked Arrays . . . . . . . . . . . . . . . . . . . . . . . . . 13
1.2.5 Floating-Point Numbers . . . . . . . . . . . . . . . . . . . . . . . . 13
1.2.6 Numpy Optimizations and Prospectus . . . . . . . . . . . . . . 17
1.3 Matplotlib . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
1.3.1 Alternatives to Matplotlib . . . . . . . . . . . . . . . . . . . . . . . 19
1.3.2 Extensions to Matplotlib . . . . . . . . . . . . . . . . . . . . . . . . 20
1.4 IPython . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20
1.5 Jupyter Notebook . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
1.6 Scipy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
1.7 Pandas . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
1.7.1 Series . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
1.7.2 Dataframe . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
1.8 Sympy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
1.9 Interfacing with Compiled Libraries . . . . . . . . . . . . . . . . . . . . . . 32
1.10 Integrated Development Environments . . . . . . . . . . . . . . . . . . . . 33
1.11 Quick Guide to Performance and Parallel Programming . . . . . . . 34
1.12 Other Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
2 Probability . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
2.1.1 Understanding Probability Density . . . . . . . . . . . . . . . . 40
2.1.2 Random Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
2.1.3 Continuous Random Variables . . . . . . . . . . . . . . . . . . . 46


2.1.4 Transformation of Variables Beyond Calculus . . . . . . . . 49


2.1.5 Independent Random Variables . . . . . . . . . . . . . . . . . . . 51
2.1.6 Classic Broken Rod Example . . . . . . . . . . . . . . . . . . . . 53
2.2 Projection Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
2.2.1 Weighted Distance . . . . . . . . . . . . . . . . . . . . . . . . . . . . 57
2.3 Conditional Expectation as Projection . . . . . . . . . . . . . . . . . . . . 58
2.3.1 Appendix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
2.4 Conditional Expectation and Mean Squared Error . . . . . . . . . . . . 65
2.5 Worked Examples of Conditional Expectation and Mean Square
Error Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
2.5.1 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
2.5.2 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 72
2.5.3 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
2.5.4 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
2.5.5 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
2.5.6 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
2.6 Useful Distributions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
2.6.1 Normal Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . 83
2.6.2 Multinomial Distribution . . . . . . . . . . . . . . . . . . . . . . . . 84
2.6.3 Chi-square Distribution . . . . . . . . . . . . . . . . . . . . . . . . . 86
2.6.4 Poisson and Exponential Distributions . . . . . . . . . . . . . . 89
2.6.5 Gamma Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
2.6.6 Beta Distribution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
2.6.7 Dirichlet-Multinomial Distribution . . . . . . . . . . . . . . . . . 93
2.7 Information Entropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
2.7.1 Information Theory Concepts . . . . . . . . . . . . . . . . . . . . 96
2.7.2 Properties of Information Entropy . . . . . . . . . . . . . . . . . 98
2.7.3 Kullback–Leibler Divergence . . . . . . . . . . . . . . . . . . . . 99
2.7.4 Cross-Entropy as Maximum Likelihood . . . . . . . . . . . . . 100
2.8 Moment Generating Functions . . . . . . . . . . . . . . . . . . . . . . . . . . 101
2.9 Monte Carlo Sampling Methods . . . . . . . . . . . . . . . . . . . . . . . . 104
2.9.1 Inverse CDF Method for Discrete Variables . . . . . . . . . . 105
2.9.2 Inverse CDF Method for Continuous Variables . . . . . . . 107
2.9.3 Rejection Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 108
2.10 Sampling Importance Resampling . . . . . . . . . . . . . . . . . . . . . . . 113
2.11 Useful Inequalities . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
2.11.1 Markov’s Inequality . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
2.11.2 Chebyshev’s Inequality . . . . . . . . . . . . . . . . . . . . . . . . . 116
2.11.3 Hoeffding’s Inequality . . . . . . . . . . . . . . . . . . . . . . . . . 118
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120

3 Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
3.2 Python Modules for Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . 124
3.2.1 Scipy Statistics Module . . . . . . . . . . . . . . . . . . . . . . . . 124
3.2.2 Sympy Statistics Module . . . . . . . . . . . . . . . . . . . . . . . 125
3.2.3 Other Python Modules for Statistics . . . . . . . . . . . . . . . 126
3.3 Types of Convergence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 126
3.3.1 Almost Sure Convergence . . . . . . . . . . . . . . . . . . . . . . 126
3.3.2 Convergence in Probability . . . . . . . . . . . . . . . . . . . . . . 129
3.3.3 Convergence in Distribution . . . . . . . . . . . . . . . . . . . . . 131
3.3.4 Limit Theorems . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
3.4 Estimation Using Maximum Likelihood . . . . . . . . . . . . . . . . . . . 133
3.4.1 Setting Up the Coin-Flipping Experiment . . . . . . . . . . . 135
3.4.2 Delta Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 145
3.5 Hypothesis Testing and P-Values . . . . . . . . . . . . . . . . . . . . . . . . 147
3.5.1 Back to the Coin-Flipping Example . . . . . . . . . . . . . . . . 149
3.5.2 Receiver Operating Characteristic . . . . . . . . . . . . . . . . . 152
3.5.3 P-Values . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 154
3.5.4 Test Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
3.5.5 Testing Multiple Hypotheses . . . . . . . . . . . . . . . . . . . . . 163
3.5.6 Fisher Exact Test . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
3.6 Confidence Intervals . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 166
3.7 Linear Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 169
3.7.1 Extensions to Multiple Covariates . . . . . . . . . . . . . . . . . 178
3.8 Maximum A-Posteriori . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183
3.9 Robust Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 188
3.10 Bootstrapping . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
3.10.1 Parametric Bootstrap . . . . . . . . . . . . . . . . . . . . . . . . . . 200
3.11 Gauss–Markov . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
3.12 Nonparametric Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 205
3.12.1 Kernel Density Estimation . . . . . . . . . . . . . . . . . . . . . . 205
3.12.2 Kernel Smoothing . . . . . . . . . . . . . . . . . . . . . . . . . . . . 207
3.12.3 Nonparametric Regression Estimators . . . . . . . . . . . . . . 213
3.12.4 Nearest Neighbors Regression . . . . . . . . . . . . . . . . . . . . 214
3.12.5 Kernel Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
3.12.6 Curse of Dimensionality . . . . . . . . . . . . . . . . . . . . . . . . 219
3.12.7 Nonparametric Tests . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
3.13 Survival Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 228
3.13.1 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 231
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236

4 Machine Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237


4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 237
4.2 Python Machine Learning Modules . . . . . . . . . . . . . . . . . . . . . . 237
4.3 Theory of Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
4.3.1 Introduction to Theory of Machine Learning . . . . . . . . . 244
4.3.2 Theory of Generalization . . . . . . . . . . . . . . . . . . . . . . . 249
4.3.3 Worked Example for Generalization/Approximation
Complexity . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
4.3.4 Cross-Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 256
4.3.5 Bias and Variance . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260
4.3.6 Learning Noise . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 265
4.4 Decision Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 268
4.4.1 Random Forests . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 275
4.4.2 Boosting Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 277
4.5 Boosting Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
4.5.1 Boosting Trees . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 281
4.6 Logistic Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 285
4.7 Generalized Linear Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . 295
4.8 Regularization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 300
4.8.1 Ridge Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 304
4.8.2 Lasso Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 309
4.9 Support Vector Machines . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 311
4.9.1 Kernel Tricks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 315
4.10 Dimensionality Reduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 317
4.10.1 Independent Component Analysis . . . . . . . . . . . . . . . . . 321
4.11 Clustering . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 325
4.12 Ensemble Methods . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
4.12.1 Bagging . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 329
4.12.2 Boosting . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 331
4.13 Deep Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 334
4.13.1 Introduction to Tensorflow . . . . . . . . . . . . . . . . . . . . . . 343
4.13.2 Understanding Gradient Descent . . . . . . . . . . . . . . . . . . 350
4.13.3 Image Processing Using Convolutional Neural
Networks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 363
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 379
Notation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 381
Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 383
Chapter 1
Getting Started with Scientific Python

Python is fundamental to data science and machine learning, as well as an
ever-expanding list of areas including cybersecurity and web programming. The
fundamental reason for Python’s widespread use is that it provides the software glue that
permits easy exchange of methods and data across core routines typically written in
Fortran or C.
Python is a language geared toward scientists and engineers who may not have
formal software development training. It is used to prototype, design, simulate, and
test without getting in the way because Python provides an inherently easy and
incremental development cycle, interoperability with existing codes, access to a large
base of reliable open-source codes, and a hierarchical compartmentalized design
philosophy. Python is known for enhancing user productivity because it reduces
development time (i.e., time spent programming), although this convenience can
come at the cost of increased program run-time.
Python is an interpreted language. This means that Python codes run on a Python
virtual machine that provides a layer of abstraction between the code and the
platform it runs on, thus making codes portable across different platforms. For example,
the same script that runs on a Windows laptop can also run on a Linux-based
supercomputer or on a mobile phone. This makes programming easier because the virtual
machine handles the low-level details of implementing the business logic of the script
on the underlying platform.
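As a small illustration of this portability (our sketch, not an example from the text), the standard-library platform module lets the identical script report whichever platform it happens to be running on:

```python
import platform
import sys

# The same script runs unmodified on Windows, Linux, or MacOS;
# the virtual machine supplies the platform-specific details.
print(platform.system())      # e.g., 'Linux', 'Windows', or 'Darwin'
print(sys.version_info[:2])   # the interpreter version, e.g., (3, 6)
```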
Python is a dynamically typed language, which means that the interpreter itself
figures out the representative types (e.g., floats, integers) interactively or at run-time.
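For example, the following short session (a sketch of our own) shows the interpreter rebinding the same name to different types at run-time:

```python
x = 3            # the interpreter infers an integer
print(type(x))   # <class 'int'>

x = x / 2        # true division rebinds x to a float
print(type(x))   # <class 'float'>

x = "three"      # the same name can later hold a string
print(type(x))   # <class 'str'>
```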
This is in contrast to a language like Fortran that has compilers that study the code
from beginning to end, perform many compiler-level optimizations, link intimately
with the existing libraries on a specific platform, and then create an executable that is
henceforth liberated from the compiler. As you may guess, the compiler’s access to
the details of the underlying platform means that it can utilize optimizations that
exploit chip-specific features and cache memory. Because the virtual machine abstracts
away these details, it means that the Python language does not have programmable
access to these kinds of optimizations. So, where is the balance between the ease


of programming the virtual machine and these key numerical optimizations that are
crucial for scientific work?
The balance comes from Python’s native ability to bind to compiled Fortran and C
libraries. This means that you can send intensive computations to compiled libraries
directly from the interpreter. This approach has two primary advantages. First, it gives
you the fun of programming in Python, with its expressive syntax and lack of visual
clutter. This is a particular boon to scientists who typically want to use software as
a tool as opposed to developing software as a product. The second advantage is that
you can mix-and-match different compiled libraries from diverse research areas that
were not otherwise designed to work together. This works because Python makes
it easy to allocate and fill memory in the interpreter, pass it as input to compiled
libraries, and then recover the output back in the interpreter.
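A minimal sketch of this glue pattern using NumPy (this example is ours, and it assumes NumPy is installed): the array is allocated and filled in the interpreter, while the heavy numerical loop runs inside NumPy’s compiled internals rather than in interpreted Python:

```python
import numpy as np

# Allocate and fill memory in the interpreter...
x = np.linspace(0, 2 * np.pi, 1_000_000)

# ...then hand it to compiled routines: np.sin and .sum() loop over
# the million elements in C, not in an interpreted Python loop.
y = np.sin(x)
print(abs(y.sum()))  # nearly zero, since sine is antisymmetric over a full period
```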
Moreover, Python provides a multiplatform solution for scientific codes. As an
open-source project, Python itself is available anywhere you can build it, even though
it typically comes standard nowadays, as part of many operating systems. This means
that once you have written your code in Python, you can just transfer the script to
another platform and run it, as long as the third-party compiled libraries are also
available there. What if the compiled libraries are absent? Building and configuring
compiled libraries across multiple systems used to be a painstaking job, but as
scientific Python has matured, a wide range of libraries have now become available across
all of the major platforms (i.e., Windows, MacOS, Linux, Unix) as prepackaged
distributions.
Finally, scientific Python facilitates maintainability of scientific codes because
Python syntax is clean, free of semi-colon litter and other visual distractions that
makes code hard to read and easy to obfuscate. Python has many built-in testing,
documentation, and development tools that ease maintenance. Scientific codes are
usually written by scientists unschooled in software development, so having solid
software development tools built into the language itself is a particular boon.
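For instance, the standard library’s built-in doctest module turns examples embedded in a docstring into tests (a sketch using a hypothetical variance function of our own):

```python
def variance(data):
    """Population variance of a sequence of numbers.

    >>> variance([1, 2, 3, 4])
    1.25
    """
    n = len(data)
    mu = sum(data) / n
    return sum((x - mu) ** 2 for x in data) / n

if __name__ == "__main__":
    import doctest
    doctest.testmod()  # silent on success; reports any failing docstring examples
```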

1.1 Installation and Setup

The easiest way to get started is to download the freely available Anaconda
distribution provided by Anaconda (anaconda.com), which is available for all of the major
platforms. On Linux, even though most of the toolchain is available via the built-in
Linux package manager, it is still better to install the Anaconda distribution because
it provides its own powerful package manager (i.e., conda) that can keep track of
changes in the software dependencies of the packages that it supports. Note that if
you do not have administrator privileges, there is also a corresponding Miniconda
distribution that does not require these privileges. Regardless of your platform, we
recommend Python version 3.6 or better.
You may have encountered other Python variants on the web, such as IronPython
(Python implemented in C#) and Jython (Python implemented in Java). In this text,
we focus on the C implementation of Python (i.e., known as CPython), which is, by
far, the most popular implementation. These other Python variants permit specialized,
native interaction with libraries in C# or Java (respectively), which is still possible
(but clunky) using CPython. Even more Python variants exist that implement the
low-level machinery of Python differently for various reasons, beyond interacting with
native libraries in other languages. Most notable of these is Pypy that implements a
just-in-time compiler (JIT) and other powerful optimizations that can substantially
speed up pure Python codes. The downside of Pypy is that its coverage of some
popular scientific modules (e.g., Matplotlib, Scipy) is limited or nonexistent which
means that you cannot use those modules in code meant for Pypy.
If you want to install a Python module that is not available via the conda manager,
the pip installer is available. This installer is the main one used outside of the
scientific computing community. The key difference between the two installers is that
conda implements a satisfiability solver that checks for version conflicts among
and between installed packages. This can result in conda downgrading
certain packages to accommodate a proposed package installation. The pip installer
does not check for such conflicts; it checks only whether the proposed package already has its
dependencies installed, installing them if not and removing existing incompatible
modules. The following command line uses pip to install the given Python module:
Terminal> pip install package_name

The pip installer will download the package you want and its dependencies and
install them in the existing directory tree. This works beautifully in the case where
the package in question is pure-Python, without any system-specific dependencies.
Otherwise, this can be a real nightmare, especially on Windows, which lacks freely
available Fortran compilers. If the module in question is a C library, one way to cope
is to install the freely available Visual Studio Community Edition, which usually
has enough to compile many C-codes. This platform dependency is the problem
that conda was designed to solve by making the binary dependencies of the various
platforms available instead of attempting to compile them. On a Windows system,
if you installed Anaconda and registered it as the default Python installation (it asks
during the install process), then you can use the high-quality Python wheel files on
Christoph Gohlke’s laboratory site at the University of California, Irvine where he
kindly makes a long list of scientific modules available.1 Failing this, you can try
the conda-forge site, which is a community-powered repository of modules that
conda is capable of installing, but which are not formally supported by Anaconda.
Note that conda-forge allows you to share scientific Python configurations with
your remote colleagues using authentication so that you can be sure that you are
downloading and running code from users you trust.
Again, if you are on Windows, and none of the above works, then you may want
to consider installing a full virtual machine solution, as provided by VMWare's
Player or Oracle's VirtualBox (both freely available under liberal terms), or with

1 Wheel files are a Python distribution format that you download and install using pip as in pip
install file.whl. Christoph names files according to Python version (e.g., cp27 means Python
2.7) and chipset (e.g., amd32 vs. Intel win32).
4 1 Getting Started with Scientific Python

the Windows subsystem for Linux (WSL) that is built into Windows 10. Using
either of these, you can set up a Linux machine running on top of Windows, which
should cure these problems entirely! The great part of this approach is that you
can share directories between the virtual machine and the Windows system so that
you don’t have to maintain duplicate data files. Anaconda Linux images are also
available on the cloud by Platform as a Service (PaaS) providers like Amazon Web
Services and Microsoft Azure. Note that for the vast majority of users, especially
newcomers to Python, the Anaconda distribution should be more than enough on
any platform. It is just worth highlighting the Windows-specific issues and
associated workarounds early on. Note that there are other well-maintained scientific Python
Windows installers like WinPython and PythonXY. These provide the spyder
integrated development environment, which is a very MATLAB-like environment for
transitioning MATLAB users.

1.2 Numpy

As we touched upon earlier, to use a compiled scientific library, the memory allocated
in the Python interpreter must somehow reach this library as input. Furthermore, the
output from these libraries must likewise return to the Python interpreter. This two-
way exchange of memory is essentially the core function of the Numpy (numerical
arrays in Python) module. Numpy is the de facto standard for numerical arrays in
Python. It arose as an effort by Travis Oliphant and others to unify the preexisting
numerical arrays in Python. In this section, we provide an overview and some tips
for using Numpy effectively, but for much more detail, Travis’ freely available book
[1] is a great place to start.
Numpy provides specification of byte-sized arrays in Python. For example, below
we create an array of three numbers, each 4 bytes long (32 bits at 8 bits per byte)
as shown by the itemsize property. The first line imports Numpy as np, which is
the recommended convention. The next line creates an array of 32-bit floating-point
numbers. The itemsize property shows the number of bytes per item.

>>> import numpy as np # recommended convention


>>> x = np.array([1,2,3],dtype=np.float32)
>>> x
array([1., 2., 3.], dtype=float32)
>>> x.itemsize
4
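Because itemsize is the per-element size, the total memory footprint is itemsize times the number of elements, which Numpy reports directly through the nbytes attribute. A quick check:

```python
import numpy as np

x = np.array([1, 2, 3], dtype=np.float32)
print(x.itemsize)   # bytes per element: 4
print(x.nbytes)     # total bytes = itemsize * number of elements: 12
```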

In addition to providing uniform containers for numbers, Numpy provides a
comprehensive set of universal functions (i.e., ufuncs) that process arrays element-wise
without additional looping semantics. Below, we show how to compute the
element-wise sine using Numpy,

>>> np.sin(np.array([1,2,3],dtype=np.float32) )
array([0.84147096, 0.9092974 , 0.14112 ], dtype=float32)
This computes the sine of the input array [1,2,3], using Numpy’s unary function,
np.sin. There is another sine function in the built-in math module, but the Numpy
version is faster because it does not require explicit looping (i.e., using a for loop)
over each of the elements in the array. That looping happens in the compiled np.sin
function itself. Otherwise, we would have to do looping explicitly as in the following:
>>> from math import sin
>>> [sin(i) for i in [1,2,3]] # list comprehension
[0.8414709848078965, 0.9092974268256817, 0.1411200080598672]
Numpy uses common-sense casting rules to resolve the output types. For example,
if the inputs had been an integer-type, the output would still have been a floating-
point type. In this example, we provided a Numpy array as input to the sine function.
We could have also used a plain Python list instead and Numpy would have built the
intermediate Numpy array (e.g., np.sin([1,1,1])). The Numpy documentation
provides a comprehensive (and very long) list of available ufuncs.
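As a quick illustration of these casting rules, note that an integer input to np.sin still produces a floating-point output:

```python
import numpy as np

x = np.array([1, 2, 3])   # integer dtype input
y = np.sin(x)             # the ufunc upcasts to floating point
print(y.dtype)            # float64
```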
Numpy arrays come in many dimensions. For example, the following shows a
two-dimensional 2x3 array constructed from two conforming Python lists.
>>> x=np.array([ [1,2,3],[4,5,6] ])
>>> x.shape
(2, 3)
Note that Numpy is limited to 32 dimensions unless you build it for more.2 Numpy
arrays follow the usual Python slicing rules in multiple dimensions as shown below
where the : colon character selects all elements along a particular axis.
>>> x=np.array([ [1,2,3],[4,5,6] ])
>>> x[:,0] # 0th column
array([1, 4])
>>> x[:,1] # 1st column
array([2, 5])
>>> x[0,:] # 0th row
array([1, 2, 3])
>>> x[1,:] # 1st row
array([4, 5, 6])
You can also select subsections of arrays by using slicing as shown below
>>> x=np.array([ [1,2,3],[4,5,6] ])
>>> x
array([[1, 2, 3],
[4, 5, 6]])

2 See arrayobject.h in the Numpy source code.



>>> x[:,1:] # all rows, 1st thru last column


array([[2, 3],
[5, 6]])
>>> x[:,::2] # all rows, every other column
array([[1, 3],
[4, 6]])
>>> x[:,::-1] # reverse order of columns
array([[3, 2, 1],
[6, 5, 4]])

1.2.1 Numpy Arrays and Memory

Some interpreted languages implicitly allocate memory. For example, in MATLAB,
you can extend a matrix by simply tacking on another dimension as in the following
MATLAB session:
>> x=ones(3,3)
x =
1 1 1
1 1 1
1 1 1
>> x(:,4)=ones(3,1) % tack on extra dimension
x =
1 1 1 1
1 1 1 1
1 1 1 1
>> size(x)
ans =
3 4

This works because MATLAB arrays use pass-by-value semantics so that slice oper-
ations actually copy parts of the array as needed. By contrast, Numpy uses pass-by-
reference semantics so that slice operations are views into the array without implicit
copying. This is particularly helpful with large arrays that already strain available
memory. In Numpy terminology, slicing creates views (no copying) and advanced
indexing creates copies. Let’s start with advanced indexing.
If the indexing object (i.e., the item between the brackets) is a non-tuple sequence
object, another Numpy array (of type integer or boolean), or a tuple with at least
one sequence object or Numpy array, then indexing creates copies. For the above
example, to accomplish the same array extension in Numpy, you have to do something
like the following:

>>> x = np.ones((3,3))
>>> x
array([[1., 1., 1.],
[1., 1., 1.],

[1., 1., 1.]])


>>> x[:,[0,1,2,2]] # notice duplicated last dimension
array([[1., 1., 1., 1.],
[1., 1., 1., 1.],
[1., 1., 1., 1.]])
>>> y=x[:,[0,1,2,2]] # same as above, but do assign it to y

Because of advanced indexing, the variable y has its own memory because the
relevant parts of x were copied. To prove it, we assign a new element to x and see that
y is not updated.
>>> x[0,0]=999 # change element in x
>>> x # changed
array([[999., 1., 1.],
[ 1., 1., 1.],
[ 1., 1., 1.]])
>>> y # not changed!
array([[1., 1., 1., 1.],
[1., 1., 1., 1.],
[1., 1., 1., 1.]])

However, if we start over and construct y by slicing (which makes it a view) as shown
below, then the change we made does affect y because a view is just a window into
the same memory.

>>> x = np.ones((3,3))
>>> y = x[:2,:2] # view of upper left piece
>>> x[0,0] = 999 # change value
>>> x
array([[999., 1., 1.],
[ 1., 1., 1.],
[ 1., 1., 1.]])
>>> y
array([[999., 1.],
[ 1., 1.]])

Note that if you want to explicitly force a copy without any indexing tricks, you
can do y=x.copy(). The code below works through another example of advanced
indexing versus slicing.

>>> x = np.arange(5) # create array


>>> x
array([0, 1, 2, 3, 4])
>>> y=x[[0,1,2]] # index by integer list to force copy
>>> y
array([0, 1, 2])
>>> z=x[:3] # slice creates view

>>> z # note y and z have same entries


array([0, 1, 2])
>>> x[0]=999 # change element of x
>>> x
array([999, 1, 2, 3, 4])
>>> y # note y is unaffected,
array([0, 1, 2])
>>> z # but z is (it's a view).
array([999, 1, 2])
In this example, y is a copy, not a view, because it was created using advanced
indexing whereas z was created using slicing. Thus, even though y and z have the
same entries, only z is affected by changes to x. Note that the flags property of
Numpy arrays can help sort this out until you get used to it.
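Beyond the flags property, modern Numpy versions (np.shares_memory appeared in Numpy 1.11, an assumption about your installed version) offer a direct way to check whether two arrays overlap in memory, which makes the view-versus-copy distinction easy to verify:

```python
import numpy as np

x = np.arange(5)
z = x[:3]            # slicing -> view into x's memory
y = x[[0, 1, 2]]     # advanced indexing -> independent copy
print(np.shares_memory(x, z))   # True: z is a view
print(np.shares_memory(x, y))   # False: y owns its own data
print(z.base is x)              # a view keeps a reference to its base array
```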
Manipulating memory using views is particularly powerful for signal and image
processing algorithms that require overlapping fragments of memory. The following
is an example of how to use advanced Numpy to create overlapping blocks that do
not actually consume additional memory,
>>> from numpy.lib.stride_tricks import as_strided
>>> x = np.arange(16,dtype=np.int64)
>>> y=as_strided(x,(7,4),(16,8)) # overlapped entries
>>> y
array([[ 0, 1, 2, 3],
[ 2, 3, 4, 5],
[ 4, 5, 6, 7],
[ 6, 7, 8, 9],
[ 8, 9, 10, 11],
[10, 11, 12, 13],
[12, 13, 14, 15]])
The above code creates a range of integers and then overlaps the entries to create a
7x4 Numpy array. The final argument in the as_strided function is the strides,
which are the steps in bytes to move in the row and column dimensions, respectively.
Thus, the resulting array steps eight bytes in the column dimension and sixteen bytes
in the row dimension. Because the integer elements in the Numpy array are eight
bytes, this is equivalent to moving by one element in the column dimension and by
two elements in the row dimension. The second row in the Numpy array starts at
sixteen bytes (two elements) from the first entry (i.e., 2) and then proceeds by eight
bytes (by one element) in the column dimension (i.e., 2,3,4,5). The important
part is that memory is re-used in the resulting 7x4 Numpy array. The code below
demonstrates this by reassigning elements in the original x array. The changes show
up in the y array because they point at the same allocated memory.
>>> x[::2]=99 # assign every other value
>>> x
array([99, 1, 99, 3, 99, 5, 99, 7, 99, 9, 99, 11, 99, 13, 99, 15])

>>> y # the changes appear because y is a view


array([[99, 1, 99, 3],
[99, 3, 99, 5],
[99, 5, 99, 7],
[99, 7, 99, 9],
[99, 9, 99, 11],
[99, 11, 99, 13],
[99, 13, 99, 15]])

Bear in mind that as_strided does not check that you stay within memory block
bounds. So, if the size of the target matrix is not filled by the available data, the
remaining elements will come from whatever bytes are at that memory location. In
other words, there is no default filling by zeros or other strategy that defends memory
block bounds. One defense is to explicitly control the dimensions as in the following
code:

>>> n = 8 # number of elements


>>> x = np.arange(n) # create array
>>> k = 5 # desired number of rows
>>> y = as_strided(x,(k,n-k+1),(x.itemsize,)*2)
>>> y
array([[0, 1, 2, 3],
[1, 2, 3, 4],
[2, 3, 4, 5],
[3, 4, 5, 6],
[4, 5, 6, 7]])
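If your Numpy version is recent enough (1.20 or later, an assumption here), the sliding_window_view function performs this bounds bookkeeping for you and produces the same overlapping view without the risk of reading past the end of the buffer:

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

x = np.arange(8)
y = sliding_window_view(x, 4)   # overlapping windows without manual strides
print(y)
```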

1.2.2 Numpy Matrices

Matrices in Numpy are similar to Numpy arrays but they can only have two
dimensions. They implement row–column matrix multiplication as opposed to
element-wise multiplication. If you have two matrices you want to multiply, you can either
create them directly or convert them from Numpy arrays. For example, the following
shows how to create two matrices and multiply them.

>>> import numpy as np


>>> A=np.matrix([[1,2,3],[4,5,6],[7,8,9]])
>>> x=np.matrix([[1],[0],[0]])
>>> A*x
matrix([[1],
[4],
[7]])

This can also be done using arrays as shown below



>>> A=np.array([[1,2,3],[4,5,6],[7,8,9]])
>>> x=np.array([[1],[0],[0]])
>>> A.dot(x)
array([[1],
[4],
[7]])
Numpy arrays support element-wise multiplication, not row–column multiplication.
You must use Numpy matrices for this kind of multiplication unless you use the inner
product np.dot, which also works in multiple dimensions (see np.tensordot for
more general dot products). Note that Python 3.x has a new @ notation for matrix
multiplication so we can re-do the last calculation as follows:
>>> A @ x
array([[1],
[4],
[7]])
It is unnecessary to cast all multiplicands to matrices for multiplication. In the
next example, everything until the last line is a Numpy array, and thereafter we cast the
array as a matrix with np.matrix, which then uses row–column multiplication. Note
that it is unnecessary to cast the x variable as a matrix because the left-to-right order
of the evaluation takes care of that automatically. If we need to use A as a matrix
elsewhere in the code, then we should bind it to another variable instead of re-casting
it every time. If you find yourself casting back and forth for large arrays, passing the
copy=False flag to matrix avoids the expense of making a copy.
>>> A=np.ones((3,3))
>>> type(A) # array not matrix
<class 'numpy.ndarray'>
>>> x=np.ones((3,1)) # array not matrix
>>> A*x
array([[1., 1., 1.],
[1., 1., 1.],
[1., 1., 1.]])
>>> np.matrix(A)*x # row-column multiplication
matrix([[3.],
[3.],
[3.]])
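Note that recent Numpy releases discourage np.matrix in favor of plain arrays combined with the @ operator, which gives row–column multiplication without any casting at all:

```python
import numpy as np

A = np.ones((3, 3))
x = np.ones((3, 1))
print(A @ x)   # row-column product on plain arrays; result is an ndarray
```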

1.2.3 Numpy Broadcasting

Numpy broadcasting is a powerful way to make implicit multidimensional grids for
expressions. It is probably the single most powerful feature of Numpy and the most
difficult to grasp. Proceeding by example, consider the vertices of a two-dimensional
unit square as shown below

>>> X,Y=np.meshgrid(np.arange(2),np.arange(2))
>>> X
array([[0, 1],
[0, 1]])
>>> Y
array([[0, 0],
[1, 1]])

Numpy’s meshgrid creates two-dimensional grids. The X and Y arrays have
corresponding entries that match the coordinates of the vertices of the unit square (e.g.,
(0, 0), (0, 1), (1, 0), (1, 1)). To add the x- and y-coordinates, we could use X and Y
as in X+Y shown below. The output is the sum of the vertex coordinates of the unit
square.

>>> X+Y
array([[0, 1],
[1, 2]])

Because the two arrays have compatible shapes, they can be added together element-
wise. It turns out we can skip a step here and not bother with meshgrid to implicitly
obtain the vertex coordinates by using broadcasting as shown below

>>> x = np.array([0,1])
>>> y = np.array([0,1])
>>> x
array([0, 1])
>>> y
array([0, 1])
>>> x + y[:,None] # add broadcast dimension
array([[0, 1],
[1, 2]])
>>> X+Y
array([[0, 1],
[1, 2]])

In the line x + y[:,None], the None Python singleton tells Numpy to make copies of y along this
dimension to create a conformable calculation. Note that np.newaxis can be used
instead of None to be more explicit. The following lines show that we obtain the
same output as when we used the X+Y Numpy arrays. Note that without broadcasting
x+y=array([0, 2]) which is not what we are trying to compute. Let’s continue
with a more complicated example where we have differing array shapes.

>>> x = np.array([0,1])
>>> y = np.array([0,1,2])
>>> X,Y = np.meshgrid(x,y)
>>> X
array([[0, 1],

[0, 1],
[0, 1]])
>>> Y
array([[0, 0],
[1, 1],
[2, 2]])
>>> X+Y
array([[0, 1],
[1, 2],
[2, 3]])
>>> x+y[:,None] # same as with meshgrid
array([[0, 1],
[1, 2],
[2, 3]])
In this example, the array shapes are different, so the addition of x and y is
not possible without Numpy broadcasting. The last line shows that broadcasting
generates the same output as using the compatible array generated by meshgrid.
This shows that broadcasting works with different array shapes. For the sake of
comparison, the call to meshgrid creates two conformable arrays, X and Y. On the
last line, x+y[:,None] produces the same output as X+Y without the meshgrid. We
can also put the None dimension on the x array as x[:,None]+y which would give
the transpose of the result.
Broadcasting works in multiple dimensions also. The output shown has shape
(4,3,2). On the last line, the x+y[:,None] produces a two-dimensional array
which is then broadcast against z[:,None,None], which duplicates itself along the
two added dimensions to accommodate the two-dimensional result on its left (i.e., x
+ y[:,None]). The caveat about broadcasting is that it can potentially create large,
memory-consuming, intermediate arrays. There are methods for controlling this by
re-using previously allocated memory but that is beyond our scope here. Formulas
in physics that evaluate functions on the vertices of high dimensional grids are great
use-cases for broadcasting.
>>> x = np.array([0,1])
>>> y = np.array([0,1,2])
>>> z = np.array([0,1,2,3])
>>> x+y[:,None]+z[:,None,None]
array([[[0, 1],
[1, 2],
[2, 3]],

[[1, 2],
[2, 3],
[3, 4]],

[[2, 3],
[3, 4],
[4, 5]],

[[3, 4],
[4, 5],
[5, 6]]])
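If your Numpy provides np.broadcast_shapes (added in Numpy 1.20, an assumption about your installed version), you can predict the broadcast result shape without allocating any arrays, which is handy when the intermediate arrays would be large:

```python
import numpy as np

# Shapes of x, y[:,None], and z[:,None,None] from the example above
print(np.broadcast_shapes((2,), (3, 1), (4, 1, 1)))   # (4, 3, 2)
```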

1.2.4 Numpy Masked Arrays

Numpy provides a powerful method to temporarily hide array elements without
changing the shape of the array itself,
>>> from numpy import ma # import masked arrays
>>> x = np.arange(10)
>>> y = ma.masked_array(x, x<5)
>>> print (y)
[-- -- -- -- -- 5 6 7 8 9]
>>> print (y.shape)
(10,)
Note that the elements in the array for which the logical condition (x<5) is true are
masked, but the size of the array remains the same. This is particularly useful in
plotting categorical data, where you may only want those values that correspond to
a given category for part of the plot. Another common use is for image processing,
wherein parts of the image may need to be excluded from subsequent processing.
Note that creating a masked array does not force an implicit copy operation unless
the copy=True argument is used. For example, changing an element in x does change
the corresponding element in y, even though y is a masked array,
>>> x[-1] = 99 # change this
>>> print(x)
[ 0 1 2 3 4 5 6 7 8 99]
>>> print(y)# masked array changed!
[-- -- -- -- -- 5 6 7 8 99]
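Reductions on a masked array skip the masked entries, which is what makes masking useful for the categorical and image-processing cases above. For example:

```python
import numpy as np
from numpy import ma

x = np.arange(10)
y = ma.masked_array(x, x < 5)
print(y.sum())    # 5+6+7+8+9 = 35: masked entries are ignored
print(y.mean())   # 7.0
```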

1.2.5 Floating-Point Numbers

There are precision limitations when representing floating-point numbers on a
computer with finite memory. For example, the following shows these limitations when
adding two simple numbers,
>>> 0.1 + 0.2
0.30000000000000004

So, then, why is the output not 0.3? The issue is the floating-point representation of
the two numbers and the algorithm that adds them. To represent an integer in binary,
we just write it out in powers of 2. For example, 230 = (11100110)_2. Python can
do this conversion using string formatting,
>>> print('{0:b}'.format(230))
11100110
To add integers, we just add up the corresponding bits and fit them into the allowable
number of bits. Unless there is an overflow (the results cannot be represented with
that number of bits), then there is no problem. Representing floating point is trickier
because we have to represent these numbers as binary fractions. The IEEE 754
standard requires that floating-point numbers be represented as ±C × 2^E where C is
the significand (mantissa) and E is the exponent.
To represent a regular decimal fraction as a binary fraction, we need to compute
the expansion of the fraction in the form a_1/2 + a_2/2^2 + a_3/2^3 + ... In other
words, we need to find the a_i coefficients. We can do this using the same process we
would use for a decimal fraction: just keep dividing by the fractional powers of 1/2
and keep track of the whole and fractional parts. Python’s divmod function can do
most of the work for this. For example, to represent 0.125 as a binary fraction,
>>> a = 0.125
>>> divmod(a*2,1)
(0.0, 0.25)
The first item in the tuple is the quotient and the other is the remainder. If the quotient
is one, then the corresponding a_i term is one; otherwise it is zero. For
this example, we have a_1 = 0. To get the next term in the expansion, we just keep
multiplying by 2, which moves us rightward along the expansion to a_{i+1} and so on.
Then,
>>> a = 0.125
>>> q,a = divmod(a*2,1)
>>> print (q,a)
0.0 0.25
>>> q,a = divmod(a*2,1)
>>> print (q,a)
0.0 0.5
>>> q,a = divmod(a*2,1)
>>> print (q,a)
1.0 0.0
The algorithm stops when the remainder term is zero. Thus, we have that 0.125 =
(0.001)_2. The specification requires that the leading term in the expansion be one.
Thus, we have 0.125 = (1.000)_2 × 2^-3. This means the significand is 1 and the
exponent is -3.
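We can cross-check this decomposition with Python's built-in float.hex method, which displays the significand and power-of-two exponent of any float directly:

```python
# float.hex shows the exact significand and power-of-two exponent
print((0.125).hex())   # 0x1.0000000000000p-3: significand 1, exponent -3
```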
Now, let’s get back to our main problem 0.1+0.2 by developing the representation
0.1 by coding up the individual steps above.

>>> a = 0.1
>>> bits = []
>>> while a>0:
... q,a = divmod(a*2,1)
... bits.append(q)
...
>>> print (''.join(['%d'%i for i in bits]))
0001100110011001100110011001100110011001100110011001101

Note that the representation has an infinitely repeating pattern. This means that we
have (1.1001 1001 1001 ...)_2 × 2^-4, with the block 1001 repeating. The IEEE standard does not have a way to represent infinitely
repeating sequences. Nonetheless, we can compute the value exactly,

\sum_{n=1}^{\infty} \left( \frac{1}{2^{4n-3}} + \frac{1}{2^{4n}} \right) = \frac{3}{5}

Thus, 0.1 ≈ 1.6 × 2^-4. Per the IEEE 754 standard, for the float type, we have
24 bits for the significand and 23 bits for the fractional part. Because we
cannot represent the infinitely repeating sequence, we have to round off at 23 bits,
10011001100110011001101. Thus, whereas the significand's representation used
to be 1.6, with this rounding, it is now
>>> b = '10011001100110011001101'
>>> 1+sum([int(i)/(2**n) for n,i in enumerate(b,1)])
1.600000023841858

Thus, we now have 0.1 ≈ 1.600000023841858 × 2^-4 = 0.10000000149011612. For
the 0.2 expansion, we have the same repeating sequence with a different exponent,
so that we have 0.2 ≈ 1.600000023841858 × 2^-3 = 0.20000000298023224. To
add 0.1+0.2 in binary, we must adjust the exponents until they match the higher of
the two. Thus,
0.11001100110011001100110
+1.10011001100110011001101
--------------------------
10.01100110011001100110011

Now, the sum has to be scaled back to fit into the significand’s available bits so the
result is 1.00110011001100110011010 with exponent -2. Computing this in the
usual way as shown below gives the result.
>>> k='00110011001100110011010'
>>> print('%0.12f'%((1+sum([int(i)/(2**n)
... for n,i in enumerate(k,1)]))/2**2))
0.300000011921

which matches what we get with numpy



>>> import numpy as np


>>> print('%0.12f'%(np.float32(0.1) + np.float32(0.2)))
0.300000011921

The entire process proceeds the same for 64-bit floats. Python has fractions
and decimal modules that allow more exact number representations. The decimal
module is particularly important for certain financial computations.
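For example, the fractions module can display the exact rational value that the literal 0.1 actually stores as a 64-bit float, and decimal shows that same stored value as a decimal expansion:

```python
from fractions import Fraction
from decimal import Decimal

# The exact rational value stored for the literal 0.1 (a 64-bit float)
print(Fraction(0.1))   # 3602879701896397/36028797018963968
# The same stored value written out as a decimal expansion
print(Decimal(0.1))
```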
Round-off Error. Let’s consider the example of adding 100,000,000 and 10 in
32-bit floating point.

>>> print('{0:b}'.format(100000000))
101111101011110000100000000

This means that 100,000,000 = (1.01111101011110000100000000)_2 × 2^26.
Likewise, 10 = (1.010)_2 × 2^3. To add these, we have to make the exponents match as in
the following,

1.01111101011110000100000000
+0.00000000000000000000001010
-------------------------------
1.01111101011110000100001010

Now, we have to round off because we only have 23 bits to the right of the binary
point, and we obtain 1.01111101011110000100001, thus losing the trailing 010 bits.
This effectively makes the decimal 10 = (1010)_2 we started out with become 8 =
(1000)_2. Thus, using Numpy again,
>>> print(format(np.float32(100000000) + np.float32(10),'10.3f'))
100000008.000

The problem here is that the order of magnitude between the two numbers was so
great that it resulted in a loss of the smaller number's significand bits as it was right-shifted.
When summing numbers like these, the Kahan summation algorithm (see
math.fsum()) can effectively manage these round-off errors.

>>> import math


>>> math.fsum([np.float32(100000000),np.float32(10)])
100000010.0
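Note that math.fsum actually uses an exact-summation algorithm rather than plain Kahan summation, but the compensated idea is easy to sketch (a minimal illustration, not the fsum implementation):

```python
def kahan_sum(values):
    """Compensated (Kahan) summation: carry the rounding error forward."""
    total = 0.0
    c = 0.0                    # compensation for lost low-order bits
    for v in values:
        y = v - c              # restore the error from the previous step
        t = total + y          # big + small: low-order bits of y may be lost
        c = (t - total) - y    # recover exactly what was just lost
        total = t
    return total

print(kahan_sum([0.1] * 10))   # compensates the drift of naive summation
```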

Cancellation Error. Cancellation error (loss of significance) results when two nearly
equal floating-point numbers are subtracted. Let's consider subtracting 0.1111112
and 0.1111111. As binary fractions, we have the following,

1.11000111000111001000101 E-4
-1.11000111000111000110111 E-4
---------------------------
0.00000000000000000001110
her; it was best he should be free.
When he had been with her last, he had told her that his ensuing
absence must perforce be longer than usual, and this she thought
would be the best time for her purpose.
‘Dear Frank,’ she wrote at the end of a pitiful little letter, ‘I am going
to ask you not to come here next week. This will surprise you, for in
all my other letters I have told you that what I most look forward to in
life is your visits. But I have been thinking, dear, that it will be best for
us to part for ever. I often ask myself if we love one another as much
as we did, and I am afraid we do not. A loveless married life would
be too dreadful to live through, and I dare not risk it. It is better that
the parting should come through me. Do not fancy that I am
reproaching you; I cannot, for to me you are above reproach, above
blame. All I feel is that our affection is colder, so we had better part.
God bless you, Frank; I can never tell you how deeply I have loved
you.—Elsie.’
Frank was almost stunned by the receipt of this letter. He read it and
re-read it till every word seemed burnt into his brain. That the girl’s
love for him was less, he did not believe; he could read undiminished
affection in the vague phraseology, in the studied carefulness to take
equal blame on herself. That she should be jealous, was out of the
question; long years of experience had taught him that this was
totally foreign to her trustful nature. There was but one conclusion to
come to. She had given him up because she thought his happiness
involved. Yet she wished him to be free; might it not be ungracious to
refuse to accept her gift?
Free! There was a terrible fascination in the sound. Be the bondage
ever so pleasant, be it even preferable to liberty itself, the idea of
freedom is irresistibly alluring. If the same bondage will be chosen
again, there is a delight in the consciousness that it will be your own
untrammelled choice. Frank was aware of a wild exultation when he
realised the fact that he was once more a free agent. In the first flush
of liberty, poor Elsie’s image faded out of sight, and that of the siren
took its place. Now, without wrong, he might follow his inclinations.
He determined to write to Elsie, but knew not what to say, and put it
off till the morrow.
There could be no harm in going to the house of his fascinator; it
was pleasant to think that he might now speak, think, look, without
any mental reservations; there would be no longer any need to
watch his actions, or to force back the words which would tell her
that she exercised a deadly power over him. The girl received him
with a winning smile, yet, when he touched her hand, he did not feel
his brain throb or his blood rush madly through his veins as he had
expected. He bore his part through the evening quietly, and owned
that it was a pleasant one; still, the flavour was not what he had
expected. He called to mind that when he was abroad for the first
time, he had been served with a peculiar dish, which he
remembered, and often longed for when unattainable. After several
years, he had visited the same café and ordered the same dish. The
same cook prepared it, and the same waiter served it, but the taste
was not the same; expectation had heightened the flavour, and the
real was inferior to the ideal.
So it was with Frank. Before, when the siren had seemed
unattainable, he had luxuriated in her beauty, admired her grace and
genius, and revelled in her wit; now, when he felt he might call these
his own, his eye began to detect deficiencies. The girl noted his
critical attitude, and chafed at the calmness of his keen, watchful
glance. Where was the open admiration she used to read in his
eyes? Piqued at his indifference, she grew silent and irritable; and
when he bade her farewell, both were conscious that an ideal had
been shattered.
He buttoned his overcoat, and prepared for a long walk to the lonely
chambers where he lived the usual careless, comfortless life of a
bachelor whose purse is limited. All the way home he submitted
himself to a deep and critical examination. He felt as if he was sitting
by the ashes of a failing fire which he had no means of replenishing;
the night was coming, and he must sit in the cold. If passion died out,
where was he to look for the sympathy, the respect, the true
friendliness which alone can supply its place in married life? Then he
thought of Elsie. He had made a mistake, but a very common
mistake. He had thought that the excitement of his interest, the
enchaining of his fancy, and the enthralment of his senses, was love,
and lo! it was only passion. He analysed his feelings more deeply
yet, and getting below the surface-currents which are stirred by the
winds, saw that the quiet waters beneath had kept unswervingly on
their course.
When he reached his chambers, he sat down by his table and drew
paper and ink towards him. ‘I shall not accept your dismissal, Elsie,’
he wrote hurriedly in answer to her piteous letter: ‘I should be very
shallow if I could not read the motive which prompted your letter. I
shall come down as usual, and we will talk over it till we understand
each other fully. Till then, you must believe me when I tell you that I
love you all the more for your act of sacrifice, and that I love you
more now than I have ever done before.’
Frank and Elsie have been long married, and are content. There is
no fear of his swerving again; but the event described left its mark on
Frank. He knows now that he was on the verge of committing a
grievous mistake, and one which might have darkened all his future
life. For it is not great events, involving tragedies and tears, that
impress themselves most deeply upon the body of our habits and
thoughts; but the tendency of our life, as in the case before us, is
often most deeply affected by what is no more than ‘an every-day
occurrence.’
A NIGHT IN A WELL.
The station of Rawal Pindi, in which the following incident took place,
is a large military cantonment in the Punjab, about a hundred miles
from the Indus at Attock, where the magnificent bridge across the
rapid river now completes the connection by rail between the
presidency towns of Calcutta, Madras, and Bombay with Peshawur,
our frontier outpost, which, like a watchful sentinel, stands looking
straight into the gloomy portal of the far-famed Khyber Pass. It was
at Rawal Pindi that the meeting took place between the Viceroy of
India, Lord Dufferin, and the present Ameer of Afghanistan, before
whom were then paraded not only the garrison of Rawal Pindi, or, as
it is more generally known in those parts, by the familiar abbreviation
of Pindi—a Punjabi word signifying a village—but a goodly array of
the three arms, artillery, cavalry, and infantry, drawn from the
garrisons of the Punjab and North-west Provinces of India. In
ordinary times, the troops in garrison at Pindi consist of four or five
batteries of royal artillery, both horse and field; a regiment of British,
and one of Indian cavalry; and one regiment of British, and two of
Bengal infantry, with a company of sappers and miners. The
barracks—or, as they are called in India, the lines—occupied by
these troops extend across the Grand Trunk Road leading to
Peshawur, those of the royal artillery being almost, if not quite on the
extreme right, and it is here that the occurrence which gives the
heading to this article took place.
In front of the lines of each regiment is the quarter-guard belonging
to it, at a distance of two or three hundred yards from the centre
barrack. The men of this guard are turned out and inspected once by
day and once by night by the officer on duty, technically known as
the orderly officer. In rear of the quarter-guard, as has been already
said, are the men’s barracks; and in rear of them the cook-houses
and horse-lines, amongst and behind which are large wells—‘pucka
wells,’ as they are called, from being lined for a long way down and
about the surface with brick-work and cement, in distinction from the
ordinary ‘cutcha wells,’ which are merely circular holes dug until
water is reached.
The pucka wells in the Pindi cantonments are from twelve to
fourteen feet in diameter, and from thirty to forty feet from the surface
to the water. They are surrounded by low parapets; and from each
well extend long troughs of brick and cement, into which the water
drawn from the well is conducted by channels, for the use of the
horses and other cattle belonging to the artillery or cavalry. The low
parapets round the wells are sufficient protection, at all events in the
daytime; though instances are not unfrequent when accidents have
occurred on a dark night to goats, sheep, and even bullocks straying
from their tethers, especially when a dust-storm has been adding by
its turmoil to the bewilderment of all so unfortunate as to be caught
abroad in it. The writer has on more than one occasion been
compelled to stand or sit for hours behind some protecting wall or
tree, when the darkness at noonday was so great that his hand,
though held close to his eyes, was with difficulty discernible.
such a state of things are added the roar of the wind and the beating
of broken branches of trees, wisps of straw, and other articles caught
up and hurtled along, it may be easily imagined how dazed and
perplexed is the condition of every creature so exposed. A dust-
storm, however, had nothing to say to the accident with which we
have to do.
In rear of the cook-houses, wells, &c., come the mess-house and the
bungalows in which the officers reside, each in its own compound or
inclosure, about eighty or a hundred yards square, and about a
quarter of a mile from the men’s lines.
One night in the cold season of 1866-67, as well as I can remember,
the subaltern on duty at Pindi was Lieutenant Black—as we will call
him—of the Royal Horse Artillery. He was well known in the arm of
the service to which he belonged as a bold and fearless horseman,
who had distinguished himself on many occasions as a race-rider
both at home and abroad. On the evening in question he remained
playing billiards in the mess-house until it was time to visit the
quarter-guard in front of the lines. A little before midnight he mounted
his horse at the door of the mess, and started. It was very dark; but
he knew the road well, and had perfect faith in his horse, a favourite
charger; so, immediately on passing the gate of the mess
compound, he set off, as was his custom, at a smart canter along the
straight road leading to the barracks. He passed through these, and
soon reached the guard, which he turned out, and finding all present
and correct, proceeded to return to his own bungalow, having
completed his duty for the day. He rode through the lines by the way
he had come; but then, being in a hurry to get to bed, he left the
main road and took a short-cut across an open space.
Notwithstanding the darkness, the horse was cantering freely on, no
doubt as anxious as his master to reach his comfortable stall, when
all at once Black felt him jump over some obstacle, which he cleared,
and the next moment horse and rider were falling through the air;
and a great splash and crash were the last things of which Black had
any consciousness. After an interval—how long he couldn’t tell—
sensation slowly returned, and he became aware that he was still
sitting in his saddle, but bestriding a dead horse. His legs were in
water; and the hollow reverberation of his voice when he shouted for
help, as he did until he could do so no longer, informed him that he
had fallen into one of the huge wells somewhere in the lines. It was
intensely dark; but he soon became aware that there were other
living creatures in the well, for from its sides came occasional weird
rustlings and hissings, which added considerably to the horror of his
situation, by creating a vague feeling of dread of some unknown
danger close at hand.
Slowly the long night passed, and he could plainly hear the gongs of
the different regiments as the hours were struck on them, and the
sentries, as if in mockery, crying the usual ‘All’s well.’ Gradually day
began to dawn, and light to show up above at the mouth of the well.
By degrees, his prison became less dim, and he could see his
surroundings. He was bestriding his dead charger, which lay
crumpled up with a broken neck at the bottom of the well, in which
was not more than three feet of water. Black himself, except for the
shock, was uninjured. His legs were pretty well numbed, from being
so long in the water, but there were no bones broken; and barring
the terrible jar to his system, he was sound in every respect. As the
sun arose, he began to peer about, and again tried to make himself
heard above ground. This caused a renewal of the peculiar rustlings
and hissings we have referred to; and he was now enabled to verify
what he had dreaded and suspected when he first heard them in the
dark. All round the sides of the well were holes, tenanted by snakes,
most of them of the deadly cobra tribe, and many, seemingly, of an
extraordinary size. Presently, like muffled thunder, the morning gun
roused the sleepers in the various barracks, and the loud reveille
quickly following it, brought hope of speedy release to the worn-out
watcher.
The bheesties coming to draw water were the first to discover him,
and their loud cries soon surrounded the mouth of the well with
stalwart artillerymen. Drag-ropes were brought from the nearest
battery; and Black, barely able to attach them to his body, was at
length drawn, to all appearance more dead than alive, to upper air,
unable to reply to the eager questionings of those by whom he was
surrounded. He was placed on a hospital litter, and hurried off to his
own bungalow. Under careful treatment, and thanks to a splendid
constitution, he was in a short time again fit for duty.
When recounting the events of the night, Black didn’t forget to
mention his sensations at hearing the hissings all round him, and
which the darkness at first made him think to be closer even than
they were. This at once caused a proposal to be made for a raid
upon the inhabitants of the holes; but he begged that they should not
be disturbed, saying that they could do no harm where they were,
and that he couldn’t but feel deeply grateful for their forbearance in
confining themselves to hissing his first and, he sincerely hoped, his
last appearance in a well.
PERSEPHONÉ.
A LAY OF SPRING.[1]
Through the dusky halls of Hadës
Thrills the echo of a voice,
Full of love, and full of longing:
‘Come, and bid my heart rejoice!
Daughter, all the world is barren,
While I mourn thy long delay!’
It is fond Demeter calling
On her lost Persephoné.

Sad she leans, the queen of Hadës,
On the gloomy monarch’s breast,
When upon her fettered senses
Falls that wail of Earth distrest;
And it woos her latent fancy
With a dream of days gone by—
And her heart responds in rapture
To that eager parent-cry!

Gently from the shadowy circle
Of his arms she lifts her head,
And its youthful beauty lightens
Even the Kingdom of the Dead.
Half a-dreaming, yet resistless
To the voice that bids her come,
Soft she murmurs: ‘Mother calls me;
Hermes waits to lead me home.’

‘Wilt thou leave me? I have loved thee,
Held thee dear as queenly wife;
It was Zeus who gave thee to me—
Life to Death, and Death to Life!’
Still a-dreaming and bewildered,
‘Ah!’ she says, complaining low,
‘Hear ye not Demeter calling?
King and husband, let me go!’

Lingeringly he yields his darling,
But she leaves the Shadow-land
With his spell upon her spirit,
With his chain upon her hand.
‘She will come again,’ he whispers,
‘And our union earth must own;
Young Life drawn from Death’s embraces
Will return to share his throne!’

. . . . .

Pure and queenly, all immortal,
Stands she ’neath her native skies:
Cloud and sunbeam, dew and rainbow,
Mingle in her lucid eyes:
Fitful smiles and vivid blushes
Blend to banish every tear,
And, like lute, her tender accents
Fall upon Demeter’s ear:

‘Mother, from the heart of Hadës
I have come again to thee!’—
Desert wide and boundless welkin,
Grove and valley, hill and sea,
All the animate creation,
All the haunts of listening day,
Echo with Demeter’s answer:
‘Hail, my child Persephoné!’

Lo! the world awakes to rapture;
Love rejoices, gods are glad,
Flowers unfold around her footfalls,
Youth in virgin garb is clad;
All the Muses chant a welcome;
Nymph and Naïad swell the strain;
Dancing sunbeams, laughing waters,
Aid the triumph of her train.

Where she moves, a magic whisper
Stirs the world to wanton mirth;
Winter flies before her presence;
Forms of beauty find new birth;
Nature’s languid pulses flutter
With the fervid breath of Spring,
Zephyrs tell to opening blossoms:
‘Eros comes to reign as king!’

Ah! while life breaks forth in music,
Emerald hues, and heavenly light,
Warmth, and love, and fairest promise,
Still a vision of the night
Glides athwart the happy Present,
Vague as harvest hopes in May;
’Tis a dream of gloomy Hadës
Haunts the young Persephoné!

So, to Mother Earth she falters:
‘Though thy daughter, still his wife.
Zeus decrees in kingly fashion,
Death shall hold the hand of Life:
Zeus decrees, and in one circle
Life and Death doth still combine.
Though I crown thee with my beauty,
Though my soul is part of thine,
Yet the mighty Hadës holds me
By a power that is divine.

‘But, sweet mother, Life can only
Be withdrawn. It never dies.
From the heart of sombre Hadës,
At thy call I will arise.
Year by year thy eager summons
Shall have power to break the chain,
And in all her youthful glory,
Will thy daughter come again.

‘Yet, because his spell must ever
Lie upon my charmèd soul,
He, the gloomy Lord of Shadows,
Shall my wayward will control.
As I heard thee call, my mother,
So his call I must obey;
Even here shall come his mandate,
And I may not answer nay.
Ah! when harvest fruits are garnered,
Mourn thy child Persephoné.’

Jessie M. E. Saxby.

[1] Persephoné, according to the Greek mythology, was the
daughter of Zeus (the Heavens) and Demeter (the Earth). Various
legends are related of her, one of the later and most beautiful
being that, when young, she was carried off by Pluto (ruler of the
spirits of the dead), and by him made Queen of Hadës (the nether
world). Her mother, in agony at her loss, searched for her all over
the earth with torches, until at last she discovered her abode. The
gods, moved by the mother’s distress, sent a messenger to bring
Persephoné back, and Pluto consented to let her go on condition
that she returned and spent a portion of every year with him. From
this, Persephoné became among the ancients the symbol of
Spring, her disappearance to the lower world coinciding with
winter, and her reappearance in the upper world bringing back
vegetable life and beauty.

Printed and Published by W. & R. Chambers, 47 Paternoster Row,
London, and 339 High Street, Edinburgh.
All rights reserved.
*** END OF THE PROJECT GUTENBERG EBOOK CHAMBERS'S
JOURNAL OF POPULAR LITERATURE, SCIENCE, AND ART,
FIFTH SERIES, NO. 114, VOL. III, MARCH 6, 1886 ***

Updated editions will replace the previous one—the old editions
will be renamed.

Creating the works from print editions not protected by U.S.
copyright law means that no one owns a United States copyright
in these works, so the Foundation (and you!) can copy and
distribute it in the United States without permission and without
paying copyright royalties. Special rules, set forth in the General
Terms of Use part of this license, apply to copying and
distributing Project Gutenberg™ electronic works to protect the
PROJECT GUTENBERG™ concept and trademark. Project
Gutenberg is a registered trademark, and may not be used if
you charge for an eBook, except by following the terms of the
trademark license, including paying royalties for use of the
Project Gutenberg trademark. If you do not charge anything for
copies of this eBook, complying with the trademark license is
very easy. You may use this eBook for nearly any purpose such
as creation of derivative works, reports, performances and
research. Project Gutenberg eBooks may be modified and
printed and given away—you may do practically ANYTHING in
the United States with eBooks not protected by U.S. copyright
law. Redistribution is subject to the trademark license, especially
commercial redistribution.

START: FULL LICENSE
THE FULL PROJECT GUTENBERG LICENSE
PLEASE READ THIS BEFORE YOU DISTRIBUTE OR USE THIS WORK

To protect the Project Gutenberg™ mission of promoting the
free distribution of electronic works, by using or distributing this
work (or any other work associated in any way with the phrase
“Project Gutenberg”), you agree to comply with all the terms of
the Full Project Gutenberg™ License available with this file or
online at www.gutenberg.org/license.

Section 1. General Terms of Use and
Redistributing Project Gutenberg™
electronic works
1.A. By reading or using any part of this Project Gutenberg™
electronic work, you indicate that you have read, understand,
agree to and accept all the terms of this license and intellectual
property (trademark/copyright) agreement. If you do not agree to
abide by all the terms of this agreement, you must cease using
and return or destroy all copies of Project Gutenberg™
electronic works in your possession. If you paid a fee for
obtaining a copy of or access to a Project Gutenberg™
electronic work and you do not agree to be bound by the terms
of this agreement, you may obtain a refund from the person or
entity to whom you paid the fee as set forth in paragraph 1.E.8.

1.B. “Project Gutenberg” is a registered trademark. It may only
be used on or associated in any way with an electronic work by
people who agree to be bound by the terms of this agreement.
There are a few things that you can do with most Project
Gutenberg™ electronic works even without complying with the
full terms of this agreement. See paragraph 1.C below. There
are a lot of things you can do with Project Gutenberg™
electronic works if you follow the terms of this agreement and
help preserve free future access to Project Gutenberg™
electronic works. See paragraph 1.E below.
1.C. The Project Gutenberg Literary Archive Foundation (“the
Foundation” or PGLAF), owns a compilation copyright in the
collection of Project Gutenberg™ electronic works. Nearly all the
individual works in the collection are in the public domain in the
United States. If an individual work is unprotected by copyright
law in the United States and you are located in the United
States, we do not claim a right to prevent you from copying,
distributing, performing, displaying or creating derivative works
based on the work as long as all references to Project
Gutenberg are removed. Of course, we hope that you will
support the Project Gutenberg™ mission of promoting free
access to electronic works by freely sharing Project
Gutenberg™ works in compliance with the terms of this
agreement for keeping the Project Gutenberg™ name
associated with the work. You can easily comply with the terms
of this agreement by keeping this work in the same format with
its attached full Project Gutenberg™ License when you share it
without charge with others.

1.D. The copyright laws of the place where you are located also
govern what you can do with this work. Copyright laws in most
countries are in a constant state of change. If you are outside
the United States, check the laws of your country in addition to
the terms of this agreement before downloading, copying,
displaying, performing, distributing or creating derivative works
based on this work or any other Project Gutenberg™ work. The
Foundation makes no representations concerning the copyright
status of any work in any country other than the United States.

1.E. Unless you have removed all references to Project
Gutenberg:

1.E.1. The following sentence, with active links to, or other
immediate access to, the full Project Gutenberg™ License must
appear prominently whenever any copy of a Project
Gutenberg™ work (any work on which the phrase “Project
Gutenberg” appears, or with which the phrase “Project
Gutenberg” is associated) is accessed, displayed, performed,
viewed, copied or distributed.