USING OPENCL
Advances in Parallel Computing
This book series publishes research and development results on all aspects of parallel
computing. Topics may include one or more of the following: high-speed computing
architectures (Grids, clusters, Service Oriented Architectures, etc.), network technology,
performance measurement, system software, middleware, algorithm design,
development tools, software engineering, services and applications.

Series Editor:
Professor Dr. Gerhard R. Joubert

Volume 21
Recently published in this series
Vol. 20. I. Foster, W. Gentzsch, L. Grandinetti and G.R. Joubert (Eds.), High
Performance Computing: From Grids and Clouds to Exascale
Vol. 19. B. Chapman, F. Desprez, G.R. Joubert, A. Lichnewsky, F. Peters and T. Priol
(Eds.), Parallel Computing: From Multicores and GPU’s to Petascale
Vol. 18. W. Gentzsch, L. Grandinetti and G. Joubert (Eds.), High Speed and Large Scale
Scientific Computing
Vol. 17. F. Xhafa (Ed.), Parallel Programming, Models and Applications in Grid and
P2P Systems
Vol. 16. L. Grandinetti (Ed.), High Performance Computing and Grids in Action
Vol. 15. C. Bischof, M. Bücker, P. Gibbon, G.R. Joubert, T. Lippert, B. Mohr and F.
Peters (Eds.), Parallel Computing: Architectures, Algorithms and Applications

Volumes 1–14 published by Elsevier Science.

ISSN 0927-5452 (print)
ISSN 1879-808X (online)
Using OpenCL
Programming Massively Parallel Computers

Janusz Kowalik
16477-107th PL NE, Bothell, WA 98011, USA

and

Tadeusz Puźniakowski
UG, MFI, Wit Stwosz Street 57, 80-952 Gdańsk, Poland

Amsterdam • Berlin • Tokyo • Washington, DC

© 2012 The authors and IOS Press.

All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or
transmitted, in any form or by any means, without prior written permission from the publisher.

ISBN 978-1-61499-029-1 (print)
ISBN 978-1-61499-030-7 (online)
Library of Congress Control Number: 2012932792
doi:10.3233/978-1-61499-030-7-i

Publisher
IOS Press BV
Nieuwe Hemweg 6B
1013 BG Amsterdam
Netherlands
fax: +31 20 687 0019
e-mail: order@iospress.nl

Distributor in the USA and Canada


IOS Press, Inc.
4502 Rachael Manor Drive
Fairfax, VA 22032
USA
fax: +1 703 323 3668
e-mail: iosbooks@iospress.com

LEGAL NOTICE
The publisher is not responsible for the use which might be made of the following information.

PRINTED IN THE NETHERLANDS


This book is dedicated to Alex, Bogdan and Gabriela
with love and consideration.

Preface
This book contains the most important and essential information required for designing correct and efficient OpenCL programs. Some details have been omitted but can be found in the provided references. The authors assume that readers are familiar with the basic concepts of parallel computation, have some programming experience with C or C++, and have a fundamental understanding of computer architecture. All terms, definitions and function signatures in this book have been copied from the official API documents available on the website of the OpenCL standard's creators.
The book was written in 2011, when OpenCL was in transition from its infancy to maturity as a practical programming tool for solving real-life problems in science and engineering. Earlier, the Khronos Group had successfully defined the OpenCL specifications, and several companies had developed stable OpenCL implementations ready for learning and testing. A significant contribution to programming heterogeneous computers was made by NVIDIA, which created one of the first working systems for programming massively parallel computers, CUDA. OpenCL has borrowed several key concepts from CUDA. At this time (fall 2011), one can install OpenCL on a heterogeneous computer and perform meaningful computing experiments. Since OpenCL is relatively new, there are not many experienced users or sources of practical information. One can find some helpful publications about OpenCL on the Web, but there is still a shortage of complete descriptions of the system suitable for students and for potential users from the scientific and engineering application communities.
Chapter 1 provides short but realistic examples of code using MPI and OpenMP so that readers can compare these two mature and very successful systems with the fledgling OpenCL. MPI, used for programming clusters, and OpenMP, used for shared memory computers, have achieved remarkable worldwide success for several reasons. Both were designed by groups of parallel computing specialists who thoroughly understood scientific and engineering applications and software development tools. Both MPI and OpenMP are very compact and easy to learn. Our experience indicates that it is possible to teach scientists or students from disciplines other than computer science how to use MPI and OpenMP in a matter of hours. We hope that OpenCL will benefit from this experience and achieve similar success in the near future.
Paraphrasing Albert Einstein, we need to simplify OpenCL as much as possible, but not more. The reader should keep in mind that OpenCL will keep evolving and that pioneer users always pay an additional price in terms of initially longer program development time and suboptimal performance before they gain experience. The goal of achieving simplicity in OpenCL programming requires an additional comment. By supporting heterogeneous computing, OpenCL offers us the opportunity to select diverse parallel processing devices manufactured by different vendors in order to achieve near-optimal or optimal performance. We can select multi-core CPUs, GPUs, FPGAs and other parallel processing devices to fit the problem we want to solve. This flexibility is welcomed by many users of HPC technology, but it has a price.
Programming heterogeneous computers is somewhat more complicated than writing programs in conventional MPI and OpenMP. We hope this gap will disappear as OpenCL matures and becomes universally used for solving large scientific and engineering problems.

Acknowledgements
It is our pleasure to acknowledge the assistance and contributions of several persons who helped us write and publish this book.
First of all, we express our deep gratitude to Prof. Gerhard Joubert, who accepted the book as a volume in the book series he edits, Advances in Parallel Computing. We are proud to have our book in his very prestigious series.
Two members of the Khronos organization, Elizabeth Riegel and Neil Trevett, helped us evaluate the initial draft of Chapter 2, Fundamentals, and provided valuable feedback. We thank them for the feedback and for their offer to promote the book among the Khronos Group member companies.
Our thanks are due to NVIDIA for two hardware grants that enabled the computing work related to the book.
Our thanks are due to Piotr Arłukowicz, who contributed two sections to the book and helped us with editorial issues related to using LaTeX and the Blender3D open-source modeling program.
We thank two persons who helped us improve the book's structure and language: Dominic Eschweiler from FIAS, Germany, and Roberta Scholz from Redmond, USA.
We also thank several friends and family members who supported our book-writing effort indirectly in various ways.
Janusz Kowalik
Tadeusz Puźniakowski

How to read this book


The text and the source code presented in this book are set in different fonts. Here are examples of the different typographic styles used:
variable – for example:
. . . the variable platform represents an object of class . . .
type or class name – for example:
. . . the value is always of type cl_ulong. . .
. . . is an object of class cl::Platform. . .
constant or macro – for example:
. . . the value CL_PLATFORM_EXTENSIONS means that. . .
function, method or constructor – for example:
. . . the host program has to execute clGetPlatformIDs. . .
. . . can be retrieved using cl::Platform::getInfo method. . .
. . . the context is created by the cl::Context constructor. . .
file name – for example:
. . . the cl.h header file contains. . .
keyword – for example:
. . . identified by the __kernel qualifier. . .

Contents

1 Introduction 1
1.1 Existing Standard Parallel Programming Systems . . . . . . . . . . . . . . 1
1.1.1 MPI . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1.2 OpenMP . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
1.2 Two Parallelization Strategies: Data Parallelism and Task Parallelism . 9
1.2.1 Data Parallelism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.2.2 Task Parallelism . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
1.2.3 Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
1.3 History and Goals of OpenCL . . . . . . . . . . . . . . . . . . . . . . . . . . 12
1.3.1 Origins of Using GPU in General Purpose Computing . . . . . . 12
1.3.2 Short History of OpenCL . . . . . . . . . . . . . . . . . . . . . . . . 13
1.4 Heterogeneous Computer Memories and Data Transfer . . . . . . . . . . 14
1.4.1 Heterogeneous Computer Memories . . . . . . . . . . . . . . . . . 14
1.4.2 Data Transfer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 15
1.4.3 The Fourth Generation CUDA . . . . . . . . . . . . . . . . . . . . . 15
1.5 Host Code . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
1.5.1 Phase a. Initialization and Creating Context . . . . . . . . . . . . 17
1.5.2 Phase b. Kernel Creation, Compilation and Preparations . . . . . 17
1.5.3 Phase c. Creating Command Queues and Kernel Execution . . . 17
1.5.4 Finalization and Releasing Resource . . . . . . . . . . . . . . . . . 18
1.6 Applications of Heterogeneous Computing . . . . . . . . . . . . . . . . . . 18
1.6.1 Accelerating Scientific/Engineering Applications . . . . . . . . . 19
1.6.2 Conjugate Gradient Method . . . . . . . . . . . . . . . . . . . . . . 19
1.6.3 Jacobi Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
1.6.4 Power Method . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
1.6.5 Monte Carlo Methods . . . . . . . . . . . . . . . . . . . . . . . . . . 22
1.6.6 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
1.7 Benchmarking CGM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
1.7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
1.7.2 Additional CGM Description . . . . . . . . . . . . . . . . . . . . . . 24
1.7.3 Heterogeneous Machine . . . . . . . . . . . . . . . . . . . . . . . . 24
1.7.4 Algorithm Implementation and Timing Results . . . . . . . . . . 24
1.7.5 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25

2 OpenCL Fundamentals 27
2.1 OpenCL Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
2.1.1 What is OpenCL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
2.1.2 CPU + Accelerators . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
2.1.3 Massive Parallelism Idea . . . . . . . . . . . . . . . . . . . . . . . . 27
2.1.4 Work Items and Workgroups . . . . . . . . . . . . . . . . . . . . . . 29
2.1.5 OpenCL Execution Model . . . . . . . . . . . . . . . . . . . . . . . . 29
2.1.6 OpenCL Memory Structure . . . . . . . . . . . . . . . . . . . . . . . 30
2.1.7 OpenCL C Language for Programming Kernels . . . . . . . . . . . 30
2.1.8 Queues, Events and Context . . . . . . . . . . . . . . . . . . . . . . 30
2.1.9 Host Program and Kernel . . . . . . . . . . . . . . . . . . . . . . . . 31
2.1.10 Data Parallelism in OpenCL . . . . . . . . . . . . . . . . . . . . . . 31
2.1.11 Task Parallelism in OpenCL . . . . . . . . . . . . . . . . . . . . . . 32
2.2 How to Start Using OpenCL . . . . . . . . . . . . . . . . . . . . . . . . . . . 32
2.2.1 Header Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.2.2 Libraries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.2.3 Compilation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
2.3 Platforms and Devices . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 34
2.3.1 OpenCL Platform Properties . . . . . . . . . . . . . . . . . . . . . . 36
2.3.2 Devices Provided by Platform . . . . . . . . . . . . . . . . . . . . . 37
2.4 OpenCL Platforms – C++ . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
2.5 OpenCL Context to Manage Devices . . . . . . . . . . . . . . . . . . . . . . 41
2.5.1 Different Types of Devices . . . . . . . . . . . . . . . . . . . . . . . 43
2.5.2 CPU Device Type . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
2.5.3 GPU Device Type . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.5.4 Accelerator . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
2.5.5 Different Device Types – Summary . . . . . . . . . . . . . . . . . . 44
2.5.6 Context Initialization – by Device Type . . . . . . . . . . . . . . . 45
2.5.7 Context Initialization – Selecting Particular Device . . . . . . . . 46
2.5.8 Getting Information about Context . . . . . . . . . . . . . . . . . . 47
2.6 OpenCL Context to Manage Devices – C++ . . . . . . . . . . . . . . . . . 48
2.7 Error Handling . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 50
2.7.1 Checking Error Codes . . . . . . . . . . . . . . . . . . . . . . . . . . 50
2.7.2 Using Exceptions – Available in C++ . . . . . . . . . . . . . . . . 53
2.7.3 Using Custom Error Messages . . . . . . . . . . . . . . . . . . . . . 54
2.8 Command Queues . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
2.8.1 In-order Command Queue . . . . . . . . . . . . . . . . . . . . . . . 55
2.8.2 Out-of-order Command Queue . . . . . . . . . . . . . . . . . . . . 57
2.8.3 Command Queue Control . . . . . . . . . . . . . . . . . . . . . . . 60
2.8.4 Profiling Basics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
2.8.5 Profiling Using Events – C example . . . . . . . . . . . . . . . . . . 61
2.8.6 Profiling Using Events – C++ example . . . . . . . . . . . . . . . 63
2.9 Work-Items and Work-Groups . . . . . . . . . . . . . . . . . . . . . . . . . 65
2.9.1 Information About Index Space from a Kernel . . . . . . . . . . 66
2.9.2 NDRange Kernel Execution . . . . . . . . . . . . . . . . . . . . . . 67
2.9.3 Task Execution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
2.9.4 Using Work Offset . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70

2.10 OpenCL Memory . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
2.10.1 Different Memory Regions – the Kernel Perspective . . . . . . . . 71
2.10.2 Relaxed Memory Consistency . . . . . . . . . . . . . . . . . . . . . 73
2.10.3 Global and Constant Memory Allocation – Host Code . . . . . . 75
2.10.4 Memory Transfers – the Host Code . . . . . . . . . . . . . . . . . . 78
2.11 Programming and Calling Kernel . . . . . . . . . . . . . . . . . . . . . . . . 79
2.11.1 Loading and Compilation of an OpenCL Program . . . . . . . . . 81
2.11.2 Kernel Invocation and Arguments . . . . . . . . . . . . . . . . . . 88
2.11.3 Kernel Declaration . . . . . . . . . . . . . . . . . . . . . . . . . . . . 90
2.11.4 Supported Scalar Data Types . . . . . . . . . . . . . . . . . . . . . 90
2.11.5 Vector Data Types and Common Functions . . . . . . . . . . . . . 92
2.11.6 Synchronization Functions . . . . . . . . . . . . . . . . . . . . . . . 94
2.11.7 Counting Parallel Sum . . . . . . . . . . . . . . . . . . . . . . . . . 96
2.11.8 Parallel Sum – Kernel . . . . . . . . . . . . . . . . . . . . . . . . . . 97
2.11.9 Parallel Sum – Host Program . . . . . . . . . . . . . . . . . . . . . 100
2.12 Structure of the OpenCL Host Program . . . . . . . . . . . . . . . . . . . . 103
2.12.1 Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 104
2.12.2 Preparation of OpenCL Programs . . . . . . . . . . . . . . . . . . . 106
2.12.3 Using Binary OpenCL Programs . . . . . . . . . . . . . . . . . . . . 107
2.12.4 Computation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 109
2.12.5 Release of Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . 113
2.13 Structure of OpenCL host Programs in C++ . . . . . . . . . . . . . . . . . 114
2.13.1 Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 115
2.13.2 Preparation of OpenCL Programs . . . . . . . . . . . . . . . . . . . 115
2.13.3 Using Binary OpenCL Programs . . . . . . . . . . . . . . . . . . . . 116
2.13.4 Computation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 120
2.13.5 Release of Resources . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
2.14 The SAXPY Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122
2.14.1 Kernel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
2.14.2 The Example SAXPY Application – C Language . . . . . . . . . . 123
2.14.3 The example SAXPY application – C++ language . . . . . . . . 128
2.15 Step by Step Conversion of an Ordinary C Program to OpenCL . . . . . 131
2.15.1 Sequential Version . . . . . . . . . . . . . . . . . . . . . . . . . . . . 132
2.15.2 OpenCL Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . 132
2.15.3 Data Allocation on the Device . . . . . . . . . . . . . . . . . . . . . 134
2.15.4 Sequential Function to OpenCL Kernel . . . . . . . . . . . . . . . 135
2.15.5 Loading and Executing a Kernel . . . . . . . . . . . . . . . . . . . . 136
2.15.6 Gathering Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . 139
2.16 Matrix by Vector Multiplication Example . . . . . . . . . . . . . . . . . . . 139
2.16.1 The Program Calculating matrix × vector . . . . . . . . . . . . 140
2.16.2 Performance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
2.16.3 Experiment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
2.16.4 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 144

3 Advanced OpenCL 147


3.1 OpenCL Extensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
3.1.1 Different Classes of Extensions . . . . . . . . . . . . . . . . . . . . 147

3.1.2 Detecting Available Extensions from API . . . . . . . . . . . . . . 148
3.1.3 Using Runtime Extension Functions . . . . . . . . . . . . . . . . . 149
3.1.4 Using Extensions from OpenCL Program . . . . . . . . . . . . . . 153
3.2 Debugging OpenCL codes . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
3.2.1 Printf . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 155
3.2.2 Using GDB . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 157
3.3 Performance and Double Precision . . . . . . . . . . . . . . . . . . . . . . 162
3.3.1 Floating Point Arithmetics . . . . . . . . . . . . . . . . . . . . . . . 162
3.3.2 Arithmetics Precision – Practical Approach . . . . . . . . . . . . . 165
3.3.3 Profiling OpenCL Application . . . . . . . . . . . . . . . . . . . . . 172
3.3.4 Using the Internal Profiler . . . . . . . . . . . . . . . . . . . . . . . 173
3.3.5 Using External Profiler . . . . . . . . . . . . . . . . . . . . . . . . . 180
3.3.6 Effective Use of Memories – Memory Access Patterns . . . . . . . 183
3.3.7 Matrix Multiplication – Optimization Issues . . . . . . . . . . . . 189
3.4 OpenCL and OpenGL . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 194
3.4.1 Extensions Used . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 195
3.4.2 Libraries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
3.4.3 Header Files . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196
3.4.4 Common Actions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 197
3.4.5 OpenGL Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . 198
3.4.6 OpenCL Initialization . . . . . . . . . . . . . . . . . . . . . . . . . . 201
3.4.7 Creating Buffer for OpenGL and OpenCL . . . . . . . . . . . . . . 203
3.4.8 Kernel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 209
3.4.9 Generating Effect . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 213
3.4.10 Running Kernel that Operates on Shared Buffer . . . . . . . . . . 215
3.4.11 Results Display . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 216
3.4.12 Message Handling . . . . . . . . . . . . . . . . . . . . . . . . . . . . 218
3.4.13 Cleanup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 219
3.4.14 Notes and Further Reading . . . . . . . . . . . . . . . . . . . . . . . 221
3.5 Case Study – Genetic Algorithm . . . . . . . . . . . . . . . . . . . . . . . . 221
3.5.1 Historical Notes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
3.5.2 Terminology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 221
3.5.3 Genetic Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . 222
3.5.4 Example Problem Definition . . . . . . . . . . . . . . . . . . . . . . 225
3.5.5 Genetic Algorithm Implementation Overview . . . . . . . . . . . 225
3.5.6 OpenCL Program . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 226
3.5.7 Most Important Elements of Host Code . . . . . . . . . . . . . . . 234
3.5.8 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 241
3.5.9 Experiment Results . . . . . . . . . . . . . . . . . . . . . . . . . . . 241

A Comparing CUDA with OpenCL 245


A.1 Introduction to CUDA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 245
A.1.1 Short CUDA Overview . . . . . . . . . . . . . . . . . . . . . . . . . . 245
A.1.2 CUDA 4.0 Release and Compatibility . . . . . . . . . . . . . . . . . 245
A.1.3 CUDA Versions and Device Capability . . . . . . . . . . . . . . . . 247
A.2 CUDA Runtime API Example . . . . . . . . . . . . . . . . . . . . . . . . . . 249
A.2.1 CUDA Program Explained . . . . . . . . . . . . . . . . . . . . . . . 251

A.2.2 Blocks and Threads Indexing Formulas . . . . . . . . . . . . . . . 257
A.2.3 Runtime Error Handling . . . . . . . . . . . . . . . . . . . . . . . . 260
A.2.4 CUDA Driver API Example . . . . . . . . . . . . . . . . . . . . . . . 262

B Theoretical Foundations of Heterogeneous Computing 269


B.1 Parallel Computer Architectures . . . . . . . . . . . . . . . . . . . . . . . . 269
B.1.1 Clusters and SMP . . . . . . . . . . . . . . . . . . . . . . . . . . . . 269
B.1.2 DSM and ccNUMA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 270
B.1.3 Parallel Chip Computer . . . . . . . . . . . . . . . . . . . . . . . . . 270
B.1.4 Performance of OpenCL Programs . . . . . . . . . . . . . . . . . . 270
B.2 Combining MPI with OpenCL . . . . . . . . . . . . . . . . . . . . . . . . . . 277

C Matrix Multiplication – Algorithm and Implementation 279


C.1 Matrix Multiplication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 279
C.2 Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 279
C.2.1 OpenCL Kernel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 279
C.2.2 Initialization and Setup . . . . . . . . . . . . . . . . . . . . . . . . . 280
C.2.3 Kernel Arguments . . . . . . . . . . . . . . . . . . . . . . . . . . . . 282
C.2.4 Executing Kernel . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 282

D Using Examples Attached to the Book 285


D.1 Compilation and Setup . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 286
D.1.1 Linux . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 286
D.1.2 Windows . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 286

Bibliography and References 289

Chapter 1

Introduction

1.1. Existing Standard Parallel Programming Systems


The last decade of the 20th century and the first decade of the 21st century can be called the Era of Parallel Computing. In this period, not only were extremely powerful supercomputers designed and built, but two de facto standard parallel programming systems for scientific and engineering applications were successfully introduced worldwide: MPI (Message Passing Interface) for clusters of computers and OpenMP for shared memory multi-processors. Both systems are predecessors of the subject of this book, OpenCL. They deserve a short technical description and discussion, which will help show the differences between the older MPI and OpenMP systems and the newer OpenCL.

1.1.1. MPI
MPI is a programming system, not a programming language. It is a library of functions for C and subroutines for FORTRAN that are used for message communication between the parallel processes created by MPI. The message-passing computing model (Fig. 1.1) is a collection of interconnected processes that use their local memories exclusively. Each process has an individual address identification number called the rank. Ranks are used for sending and receiving messages and for workload distribution. There are two kinds of messages: point-to-point messages and collective messages. Point-to-point message functions in C take several parameters: the address of the sender, the address of the receiver, the message size and type, and some additional information. In general, message parameters describe the nature of the transmitted data and the delivery envelope.
The collection of processes that can exchange messages is called a communicator. In the simplest case there is only one communicator and each process is assigned to one processor. In more general settings there are several communicators, and a single processor may serve several processes. A process rank is usually an integer running from 0 to p−1, where p is the total number of processes. It is also possible to identify processes in a more general way than by consecutive integers. For example, a process ID can be a pair of numbers, such as a point (x, y) in Cartesian space. This method of identifying processes can be very helpful in handling matrix operations or partial differential equations (PDEs) in two-dimensional Cartesian space.

Figure 1.1: The message passing model.

An example of a collective message is the broadcast, which sends data from a single process to all other processes. Other collective messages, such as Gather and Scatter, are very helpful and are often used in computational mathematics algorithms.
To illustrate MPI, consider a parallel computation of the SAXPY operation. SAXPY is the linear algebra operation z = ax + y, where a is a constant scalar and x, y, z are vectors of the same size. The name SAXPY comes from the BLAS routine computing Single-precision a times x Plus y. An example of a SAXPY operation is presented in Section 2.14.
The following assumptions and notations are made:

1. The number of processes is p.
2. The vector size n is divisible by p.
3. Only the part of the code for the parallel calculation of the vector z will be written.
4. All required MPI initialization instructions have been executed in the front part of the code.
5. The vectors x and y have been initialized.
6. The process ID is called my_rank and runs from 0 to p−1. The number of vector element pairs that must be computed by each process is N = n/p. The communicator is the entire system.
7. n = 10000 and p = 10, so N = 1000.

The C loop below is executed by every process, from my_rank equal to 0 to my_rank equal to p − 1:

{
    int i;
    for (i = my_rank * N; i < (my_rank + 1) * N; i++)
        z[i] = a * x[i] + y[i];
}

For example, the process with my_rank=1 will add one thousand element pairs a*x[i] and y[i], from i=N=1000 to 2N−1=1999. The process with my_rank=p−1=9 will add elements from i=n−N=9000 to n−1=9999.
One can now appreciate the usefulness of assigning a rank ID to each process. This simple concept makes it possible to send and receive messages and to share workloads as illustrated above. Of course, before each process can execute its code for computing a part of the vector z, it must receive the needed data. This can be done by one process initializing the vectors x and y and sending the appropriate blocks of data to all remaining processes. The computation of SAXPY is an example of data parallelism: each process executes the same code on different data.
To implement task parallelism, where several processes execute different programs, process ranks can be used. In the SAXPY example, the block of code containing the loop is executed by all p processes without exception. But if one process, for example the process with rank 0, has to do something different from all the others, this can be accomplished as follows:

if (my_rank == 0)
    { /* execute the task specified here */ }

This block of code will be executed only by the process with my_rank equal to 0. In the absence of if (my_rank == something) instructions, all processes execute the same block of code. This technique can be used when processes have to perform the several different computations required by a task-parallel algorithm. MPI is universal: it can express every kind of algorithmic parallelism.
An important concern in parallel computing is efficiency. MPI is often
computationally efficient because all processes use only their local memories. On
the other hand, processes require network communication to place data in the proper
process memories at the proper times. This need for moving data is called algorithmic
synchronization, and it creates communication overhead that degrades
parallel program performance. Performance may suffer significantly if the
program sends many short messages. The communication overhead can be reduced
by grouping data for communication: larger user-defined data types produce larger
messages that are sent less frequently. The benefit is paying the message latency
fewer times.
On the negative side of the MPI evaluation scorecard is its inability to support incre-
mental (part-by-part) conversion of a serial code into an algorithmically equivalent
MPI parallel code. This problem is attributed to the relatively high level at which
an MPI program is designed. To design an MPI program, one has to modify algorithms.
This work is done at an earlier stage of the parallel computing process than develop-
ing parallel codes. Algorithmic-level modification usually cannot be done piecewise;
it has to be done all at once. In contrast, in OpenMP a programmer makes changes to
sequential codes written in C or FORTRAN.
Fig. 1.2 shows the difference. Code-level modification makes incremental
conversion possible. In a commonly practiced conversion technique, a serial code is
converted incrementally, from the most compute-intensive program parts to the least
compute-intensive parts, until the parallelized code runs sufficiently fast. For further
study of MPI, the book [1] is highly recommended.

1. Mathematical model.
2. Computational model.
3. Numerical algorithm and parallel conversion: MPI
4. Serial code and parallel modification: OpenMP
5. Computer runs.

Figure 1.2: The computer solution process. OpenMP will be discussed in the next
section.

1.1.2. OpenMP
OpenMP is an application programming interface (API) for shared address space
computers. It contains a number of compiler directives that instruct C/C++ or FORTRAN
“OpenMP aware” compilers to execute certain instructions in parallel and distribute
the workload among multiple parallel threads. Shared address space computers fall
into two major groups: centralized memory multiprocessors, also called Symmetric
Multi-Processors (SMP), as shown in Fig. 1.3, and Distributed Shared Memory (DSM)
multiprocessors, whose common representative is the cache coherent Non-Uniform
Memory Access (ccNUMA) architecture shown in Fig. 1.4.

Figure 1.3: The bus based SMP multiprocessor.

SMP computers are also called Uniform Memory Access (UMA) machines. They were the
first commercially successful shared memory computers and they remain popu-
lar. Their main advantage is uniform memory access. Unfortunately, for large num-
bers of processors the bus becomes overloaded and performance deteriorates. For
this reason, bus-based SMP architectures are limited to about 32 or 64 proces-
sors. Beyond these sizes, the single address space memory has to be physically
distributed: every processor owns a chunk of the single address space.
Both architectures have single address space and can be programmed by using
OpenMP. OpenMP parallel programs are executed by multiple independent threads

Figure 1.4: ccNUMA with cache coherent interconnect.

that are streams of instructions having access to shared and private data, as shown
in Fig. 1.5.

Figure 1.5: The threads’ access to data.

The programmer can explicitly define data that are shared and data that are
private. Private data can be accessed only by the thread owning these data. The flow
of OpenMP computation is shown in Fig. 1.6. One thread, the master thread, runs
continuously from the beginning to the end of the program execution. The worker
threads are created for the duration of parallel regions and then are terminated.
When a programmer inserts a proper compiler directive, the system creates a
team of worker threads and distributes workload among them. This operation point
is called the fork. After forking, the program enters a parallel region where parallel
processing is done. After all threads finish their work, the program worker threads
cease to exist or are redeployed while the master thread continues its work. The
event of returning to the master thread is called the join. The fork and join tech-
nique may look like a simple and easy way for parallelizing sequential codes, but the

Figure 1.6: The fork and join OpenMP code flow.

reader should not be deceived. There are several difficulties, which will be described
and discussed shortly. First of all, how does one determine whether several tasks can
be run in parallel?
In order for any group of computational tasks to be correctly executed in parallel,
they have to be independent. That means that the result of the computation does not
depend on the order of task execution. Formally, two programs or parts of programs
are independent if they satisfy the Bernstein conditions shown in 1.1.

I_j ∩ O_i = ∅
I_i ∩ O_j = ∅                (1.1)
O_i ∩ O_j = ∅

Bernstein's conditions for task independence.

The letters I and O signify input and output sets. The ∩ symbol denotes the intersection
of the two sets belonging to tasks i and j. In practice, determining whether some group
of tasks can be executed in parallel has to be done by the programmer and may not
be easy.
To discuss other shared memory computing issues, consider a very
simple programming task: computing the dot product of two large vectors x and y
with n components each. The sequential computation is accomplished by a simple
C loop

double dp = 0;
for (int i = 0; i < n; i++)
    dp += x[i]*y[i];

Inserting, in front of the above for-loop, the OpenMP directive for parallelizing
the loop and declaring shared and private variables leads to:

double dp = 0;
#pragma omp parallel for shared(x,y,dp) private(i)
for (i = 0; i < n; i++)
    dp += x[i]*y[i];

Unfortunately, this code runs into a serious difficulty in computing dp. The
difficulty arises because the update dp += x[i]*y[i]; is not atomic. This means
that more than one thread may attempt to update the value of dp simultaneously.
If this happens, the value of dp will depend on the timing of the individual threads'
operations. This situation is called a data race. The difficulty can be removed in
two ways.
One way is to use the OpenMP critical construct #pragma omp critical in
front of the statement dp += x[i]*y[i];. The critical directive forces threads to up-
date dp one at a time. In this way, the updating statement dp += x[i]*y[i] becomes
atomic and is executed correctly: every thread executes this operation
alone and to completion, without interference from other threads. With this addition,
the code becomes:

double dp = 0;
#pragma omp parallel for shared(x,y,dp) private(i)
for (i = 0; i < n; i++) {
    #pragma omp critical
    dp += x[i]*y[i];
}

In general, the block of code following the critical construct is executed by
one thread at a time. In our case, the critical block is just one line of code,
dp += x[i]*y[i];. The second way to fix the problem is to use the reduction
clause, which ensures that all partial results of the dot product are added correctly.
Below is a correct fragment of code for computing dp with the reduction clause.

double dp = 0;
#pragma omp parallel for reduction(+:dp) shared(x,y) private(i)
for (i = 0; i < n; i++)
    dp += x[i]*y[i];

Use of the critical construct amounts to serializing the computation of the critical
region, the block of code following the critical construct. For this reason, large critical
regions degrade program performance and ought to be avoided. Using the reduction
clause is more efficient and preferable. The reader may have noticed that the loop
counter variable i is declared private, so each thread updates its loop counter in-
dependently, without interference. In addition to the parallelizing construct parallel
for, which applies to C for-loops, there is a more general sections construct that paral-
lelizes independent sections of code. The sections construct makes it possible to apply
OpenMP to task parallelism, where several threads compute different sections of
a code. For computing two parallel sections, the code structure is:

#pragma omp parallel
{
    #pragma omp sections
    {
        #pragma omp section
        /* some program segment computation */
        #pragma omp section
        /* another program segment computation */
    }   /* end of sections block */
}       /* end of parallel region */

OpenMP has tools that can be used by programmers for improving performance.
One of them is the clause nowait. Consider the program fragment:

#pragma omp parallel shared(a,b,c,d,e,f) private(i,j)
{
    #pragma omp for nowait
    for (i = 0; i < n; i++)
        c[i] = a[i] + b[i];
    #pragma omp for
    for (j = 0; j < m; j++)
        d[j] = e[j]*f[j];
    #pragma omp barrier
    g = func(d);
}   /* end of the parallel region; implied barrier */

In the first parallelized loop there is the clause nowait. Since the second loop's
variables do not depend on the results of the first loop, it is possible to use the clause
nowait, telling the compiler that as soon as any thread finishes its share of the first
loop, it can start on the second loop's work without waiting for the other threads.
This speeds up computation by reducing the potential waiting time for threads that
finish their work at different times.
On the other hand, the construct #pragma omp barrier is inserted after the
second loop to make sure that the second loop is fully completed before g, a function
of d, is calculated. At the barrier, all threads computing the second loop must wait
for the last thread to finish before they proceed. In addition to the explicit barrier
construct #pragma omp barrier used by the programmer, there are also implicit
barriers inserted by OpenMP automatically at the end of every parallel region for
the purpose of thread synchronization. To sum up: barriers should be used sparingly
and only when necessary; nowait clauses should be used as frequently as possible,
provided that their use is safe.
Finally, we have to point out that a major performance issue in numerical com-
putation is the use of cache memories. For example, in computing the matrix/vector
product c = Ab, two mathematically equivalent methods could be used. In the first
method, the elements of the vector c are computed by multiplying each row of A by
the vector b, i.e., by computing dot products. An inferior performance is obtained if c
is computed as the sum of the columns of A multiplied by the elements of b. In this
case, the program accesses the columns of A, which is not the way the matrix data
are stored in the main memory (row-major order in C) and transferred to caches.
For further in-depth study of OpenMP and its performance, reading [2] is highly
recommended. Of special value for parallel computing practitioners are Chapter 5
“How to get Good Performance by Using OpenMP” and Chapter 6 “Using OpenMP
in the Real World”. Chapter 6 offers advice for and against using combined MPI and
OpenMP. A chapter on combining MPI and OpenMP can also be found in [3]. Like
MPI and OpenMP, the OpenCL system is standardized. It has been designed to run
regardless of processor types, operating systems and memories. This makes OpenCL
programs highly portable, but the method of developing OpenCL codes is more com-
plicated. The OpenCL programmer has to deal with several low-level programming
issues, including memory management.

1.2. Two Parallelization Strategies: Data Parallelism and Task Parallelism

There are two strategies for designing parallel algorithms and related codes: data
parallelism and task parallelism. This section describes both concepts.

1.2.1. Data Parallelism

Data parallelism, also called Single Program Multiple Data (SPMD), is very common
in computational linear algebra. In a data parallel code, data structures such as ma-
trices are divided into blocks, sets of rows, or sets of columns, and a single program
performs identical operations on these partitions, which contain different data. An
example of data parallelism is the matrix/vector multiplication a = Ab, where every
element of a can be computed by performing the dot product of one row of A and
the vector b. Fig. 1.7 shows this data parallel concept.

Figure 1.7: Computing matrix/vector product.

1.2.2. Task Parallelism

The task parallel approach is more general. It is assumed that there are multiple
different independent tasks that can be computed in parallel. The tasks operate on
their own data sets. Task parallelism can be called Multiple Programs Multiple Data
(MPMD). A small example of a task parallel problem is shown in Fig. 1.8. The
directed graph indicates the task execution precedence. Two tasks can execute in
parallel if they are not dependent on each other.

Figure 1.8: Task dependency graph.

In the case shown in Fig. 1.8, there are two options for executing the entire
set of tasks. Option 1: execute tasks T1, T2 and T4 in parallel, followed by task T3
and finally T5. Option 2: execute tasks T1 and T2 in parallel, then T3 and T4 in
parallel, and finally T5. In both cases the total computing work is equal, but the time
to solution may not be.

1.2.3. Example
Consider a problem that can be computed both ways, via data parallelism and via
task parallelism. The problem is to calculate C = A × B − (D + E), where A, B, D
and E are all square matrices of size n × n. An obvious task parallel version would
be to compute the two tasks A × B and D + E in parallel and then subtract the sum
from the product. Of course, the task of computing A × B and the task of computing
the sum D + E can each be calculated in a data parallel fashion. Here there are two
levels of parallelism: the higher task level and the lower data parallel level. Similar
multilevel parallelism is common in real-world applications. Not surprisingly, there
is also a direct data parallel method for this problem, based on the observation that
every element of C can be computed directly and independently from the coefficients
of A, B, D and E. This computation is shown in equation 1.2.


c_ij = Σ_{k=0}^{n−1} a_ik b_kj − d_ij − e_ij        (1.2)

The direct computation of C.

Equation (1.2) means that it is possible to calculate all n² elements of C
in parallel using the same formula. This is good news for OpenCL devices that can
handle only one kernel and the related data parallel computation. Those devices will
be discussed in the chapters that follow. The matrix C can be computed using three
standard programming systems: MPI, OpenMP and OpenCL.
Using MPI, the matrix C could be partitioned into sub-matrix components, with
the subcomponents assigned to processes. Every process would compute a set of
elements of C using expression 1.2. The main concern here is not the computation
itself but the ease of assembling results and minimizing the cost of communication
while distributing data and gathering the results. A reasonable partitioning is to
divide C into blocks of rows that can be scattered and gathered as user-defined
data types. Each process would get a set of the rows of A, D and E and the entire
matrix B. If the matrix C size is n × n and the number of processes is p, then each
process gets n/p rows of C to compute. If a new data type is defined as n/p rows,
the data can easily be distributed to processes by strips of rows, and then the results
can be gathered. The suggested algorithm is a data parallel method. The data
partitioning is shown in Fig. 1.9.

Figure 1.9: Strips of data needed by a process to compute the topmost strip of C.

The data needed by each process include one strip of A, D and E and the entire
matrix B. Each process of rank 0 <= rank < p computes one strip of C's rows. After fin-
ishing the computation, the matrix C can be assembled by the collective MPI communi-
cation function MPI_Gather. An alternative approach would be partitioning C into
blocks and assigning to each process the computation of one block of C. However,
assembling the results would be harder than in the strip partitioning case.
OpenMP would take advantage of the task parallel approach. First, the subtasks
A × B and D + E would be parallelized and computed separately, and then C = A ×
B − (D + E) would be computed, as shown in Fig. 1.10. One weakness of task parallel
programs is the efficiency loss if the parallel tasks do not represent equal workloads.
For example, in the matrix C computation, the task of computing A × B and the task of
computing D + E are not load-equal. Unequal workloads could cause some threads to
idle unless tasks are selected dynamically for execution by the scheduler. In general,
data parallel implementations are well load balanced and tend to be more efficient.
In an OpenCL implementation of the data parallel option for computing C, a
single kernel function would compute the elements of C by equation 1.2, where the
sum represents the dot products of A × B and the remaining terms represent subtracting
D and E. If the matrix size is n = 1024, a compute device could execute over one
million work-items in parallel. Additional performance gain can be achieved by using
the tiling technique [4].

He died.”
With these words the poor woman bowed her head upon her
hands and sobbed brokenly. The girls, feeling heartily sorry for her
trouble but helpless to comfort her, rose awkwardly to their feet and
picked up their skates from the floor where they had thrown them.
Billie went over to the sobbing woman and patted her shyly on
the shoulder.
“I—I wish I could help you,” she ventured. “I—we are dreadfully
sorry for you.”
Then as the woman neither moved nor made an answer, Billie
motioned to Laura and Vi and they stepped quietly from the room
into the chill of the open, closing the door softly behind them.
CHAPTER IV—GENEROUS PLANS

The girls talked a great deal of Mrs. Haddon and her trouble as they
put on their skates and slowly skated back to the Hall.
“It must be dreadful,” Laura was saying thoughtfully just as the
three towers of the school loomed up before them, “not to have
enough to eat. Just think of it, girls, to be hungry—and not have
enough to eat!”
No wonder this condition of affairs seemed unusually horrible, in
fact almost impossible to luxury-loving Laura, whose father was one
of the richest and most influential men in rich and influential North
Bend. To Laura it seemed incredible that every one should not have
enough and to spare of the good things that, rightly used, go to
make happiness in this strange old world. She had never known
what it was to have a wish that was not gratified almost on the
instant.
“Yes, it must be awful,” Billie answered soberly, in response to
Laura’s exclamation. “And I’m sure,” she added decidedly, “that I
won’t be able to enjoy another good meal until I know that those
three poor little kiddies and Mrs. Haddon have had all they could
possibly eat—for once, at least.”
“What do you mean?” they asked, wonderingly.
“We’ll pack a basket,” planned Billie, growing excited over the
great idea which had just that minute occurred to her. “We’ll put
everything in it that we can possibly think of, chicken sandwiches
and a bottle of currant jelly, a thermos bottle of hot coffee and
another of milk for the children——”
“Say wake up, wake up,” begged Laura, irreverently. “Where do
you suppose we are going to get all this stuff anyway? It’s too late
to go to town——”
“Who said anything about going to town?” Billie interrupted
impatiently. “I’m going straight to Miss Walters and tell her all about
the Haddon family and ask her to let us raid the kitchen and make
up the basket ourselves. We can pay for the things,” she added, as
an afterthought.
“It’s a bright idea—but it takes nerve,” said Laura slangily. “Miss
Walters may not like the idea of feeding the countryside.”
“I’m not asking her to feed the countryside,” Billie retorted,
adding comfortably as a picture of Miss Walters, white-haired, blue-
eyed and sweet, rose before her: “I’m sure she will let us do it just
this once.”
For Miss Walters, strict though she was at maintaining discipline
in the school, was nevertheless generosity and kindness itself to
every one about her.
“But,” said Laura, uttering one last protest, “I don’t believe Mrs.
Haddon would accept anything that looked like charity. She’s too
proud.”
“We won’t take any chances on her being too proud to accept
it,” said Billie decidedly, adding with a chuckle: “We’ll do the way the
boys used to do on Hallowe’en, ring the bell and run.”
They had no other chance to talk, for in a minute they were
surrounded by about a dozen of their classmates who all began
scolding them at once about running away and demanded to know
where they had been, so that plans for the Haddons were pushed
temporarily into the background.
Laughing and shouting to each other the girls took off their
skates and scrambled up the long terraced hill that led to Three
Towers.
If the Hall and its surroundings were beautiful in the summer
time, it was even more attractive in the winter. The ivy that covered
the green-gray stone of the building was now frosted white with
snow and ice, and this, catching the ruddy gleam of the afternoon
sun, gave the Hall the appearance of a great, sparkling jewel.
The three towers which gave the school its name made the
place seem like some castle of old, and the surrounding trees and
shrubbery, heavily coated with snow and icicles, gave to the old
building just the air of mystery that it needed.
The beauty of the familiar place struck Billie afresh, and she
stopped short suddenly and gazed up at it with loving eyes.
“Isn’t it lovely to have a place like this to come home to?” she
said, as the girls looked at her inquiringly, “when you are tired and
cold and——”
“Hungry,” finished Laura, giving her a shove. “Giddap, Billie,
you’re slowing down the works.”
“Slang again,” sighed Vi, plaintively, as Billie obligingly
“giddaped.” “If I should tell Miss Walters——”
“You would never live to tell another tale,” prophesied Laura,
amid a gale of laughter from the girls. “Two sneaks and tattletales
are enough,” she added significantly, as she caught sight of Amanda
Peabody and Eliza Dilks walking a little ahead of them.
“I wonder where Connie and Nellie have kept themselves,” said
Billie, as she with the other girls crowded through the wide door of
the Hall.
“They were up in the dorm, cramming for the exams when I
saw them last,” said a tall girl at Billie’s elbow. She had evidently not
been with the girls on the lake, for she wore no coat or hat and she
carried a book under each arm as though she also had been
studying.
“Oh, hello, Carol!” greeted Billie, putting an arm about the tall
girl and sweeping her toward the stairs. “So you’ve been grinding
away as usual when you ought to have been out getting some good
fresh air. My, you look as pale as a ghost.”
For the tall girl, so studiously inclined, was none other than
Caroline Brant, who had been such a good friend to Billie upon her
arrival at Three Towers Hall the year before. The girls were all fond
of Caroline, in spite of the undeniable fact that she was one of those
usually despised students commonly known as “grinds.”
“You know I don’t skate,” Caroline said in response to Billie’s
accusation. “And I never could see why people prefer freezing their
toes and noses to staying comfortably indoors.”
“You’re an old lamb,” said Billie with a squeeze. “But there are
lots of things that you never will see!”
As Caroline had predicted, the chums found Connie Danvers and
Nellie Bane in the dormitory, curled up uncomfortably on the bed,
heads bent disconsolately over two thick and bulky history books.
When the door burst open and the chums swung into the room,
skates slung over shoulders, eyes bright and cheeks glowing from
exercise, the two on the bed flung away their books and looked
despairingly at the newcomers.
“Great heavens, here they are back already,” cried Connie,
running her hands wildly through her fluffy hair. “And I haven’t
learned more than five dates so I can say them straight.”
“And that’s just five more than I have learned,” cried Billie gayly,
dropping her skates in a corner and flinging herself on the edge of
the bed. “Come closer, girls,” she added, lowering her voice to a
mysterious whisper while Nellie and Connie wriggled over to her. “I
would whisper in thine ear. We have met with an adventure!”
CHAPTER V—BEARDING THE LION

The one word “adventure” was enough to make the girls all interest
at once. Caroline Brant wedged herself into a square inch of space
on the bed between Connie and the bedpost, and as Rose Belser
came in at that moment the girls motioned her to join them.
“What’s up?” asked Rose, flinging off her cap and scarf as she
came. “Billie been getting into mischief again? Or is it only trouble
this time?”
“Trouble, I guess,” said Billie, and then she told them the
astonishing tale of what had happened that afternoon. But instead
of being interested as she had expected them to be, the girls
actually seemed disappointed.
“Well, was that all you had to tell us?” asked Connie, when she
had finished. “I’m surprised at you, Billie. I thought you had really
done something exciting.”
“Yes,” added Rose, in her aggravating little drawl, as she rose to
get ready for dinner, “it was awfully good of you to rescue those
three annoying little brats and return them to their distracted mother
and all that. But I don’t see anything dreadfully hair-raising about it.”
Rose read books that were too old for her and ran with girls
who were too old for her and so she herself contrived to seem much
older than she was. And sometimes Billie found this manner
extremely irritating, in spite of the fact that she and Rose were
friends—now.
“I suppose it doesn’t seem very exciting to you,” she said, as
she pulled off her cap and unwound the muffler from about her
neck. “But I presume you would be a little bit more interested if it
was you who didn’t have enough to eat.”
“Don’t be mad at us, Billie,” Connie begged, patting Billie’s hand
soothingly. “Of course we all feel sorry for the poor little kiddies and
their mother and we want to help them all we can. But you can’t
blame us for being disappointed when you said you had had an
adventure.”
“I wonder if you would call it an adventure,” mused Billie, more
to herself than to them, “if one of us should find that stolen
invention and claim the twenty thousand dollars reward for it!”
Her classmates stopped what they were doing and stared at her.
“Wh—what did you say?” demanded Connie.
“You heard me,” said Billie, with a grin.
“But, Billie, you know that’s absurd,” said Rose, in her best
drawl. “How could we possibly hope to find a thing that has been
missing for a couple of years?”
“It may be absurd,” said Billie good-naturedly, pulling the ribbon
from her curls and brushing them vigorously. “I think it sounds
foolish myself. But while there’s life, there’s hope. Hand me that
comb, will you, Vi?”
A few minutes later the big gong sounded through the halls,
announcing gratefully to the hungry girls that dinner was ready. And
now that the vinegary Misses Dill had gone, delight reigned supreme
in the dining hall.
The girls had all they could possibly eat of good satisfying food
and they were allowed to chatter as much as they would as long as
they did not become too noisy.
But although they had chicken for dinner and cranberry sauce
and creamed cauliflower, things all of which she especially liked,
Billie enjoyed it less than any meal she had ever eaten.
Again and again before her eyes arose the reproachful images
of the three little Haddons, undersized, undernourished, half-
starved.
She could hardly wait until dessert had been served, and then,
with a murmured word to Laura and Vi, she excused herself from the
table and went in search of Miss Walters.
She found that lady in the act of drinking her after-dinner coffee
in the privacy of her own little domain.
Miss Walters had a suite of three rooms all to herself: a
bedroom, a dressing-room and a sitting-room, and all three of the
rooms were fitted up in a manner that befitted a queen.
The sitting-room was done in mahogany and blue. An exquisite
Persian rug of dull blue covered the floor and the rich mahogany
furniture was all upholstered in blue velour. The curtain draperies
were all of this same rich blue over cream-colored lace. In the center
of the room was a huge mahogany library table upon which stood a
handsome reading lamp with a blue silk shade.
Billie, who had never been in this sanctum before and who had
seen Miss Walters only in her office, was amazed when, in reply to
her timid knock, the principal invited her to enter.
For a moment she stood dumbly staring, while Miss Walters set
down her cup and looked up with a smile. The smile changed to a
look of surprise and then to annoyance as the principal saw who the
intruder was.
“It must be something very important to bring you here at this
hour, Beatrice,” said Miss Walters, while poor Billie began to wish
herself back in the security of dormitory C. She was too frightened
to explain her presence, and yet she knew that Miss Walters
expected an explanation. “What is it you wish?” asked the latter,
impatiently.
“I—I’m sorry,” said Billie at last, backing away toward the door.
“I shouldn’t have come—but I thought—that is, I thought it was
important.” She was half through the door by this time, and Miss
Walters, her annoyance changing to amusement, took pity on her.
“What was important?” she asked, adding, as Billie still
continued to back away: “Come in here, Billie Bradley, and shut that
door. There’s a draft in the hall.”
Relieved at the use of the familiar name Billie, the girl obeyed,
shutting the door softly, then turned imploringly to the teacher.
“Sit down,” commanded the latter, pointing to one of the blue
velour armchairs near by. “Now tell me the ‘important thing’ you
came about while I finish my coffee.”
Billie made poor work of her story at first, for she was still
wondering how she had ever had the courage to approach Miss
Walters in the privacy of her sanctum sanctorum, but as she went on
she became less self-conscious and was encouraged by Miss Walters’
unfeigned interest.
And when, at the end of the recital, Miss Walters reached over
and patted her hand and told her she had been quite right in coming
to her as she had, Billie was in the seventh heaven of delight.
“With poverty behind them, fortune and comfort ahead, and
then again, desolation!” Miss Walters mused, talking more to herself
than Billie. “How the human mind can stand up under the strain is a
mystery to me. Poor, starving little mites and pitiful, noble mother,
fighting for her young with the only weapons she has. Lucky mother
to have come to the notice of a girl like you, Billie Bradley,” she
added, turning upon Billie so warm and bright a smile that the girl’s
heart swelled with pride and adoration.
“Then you will let us help the Haddons?” she asked breathlessly.
“More than that,” smiled Miss Walters. “I will help you to help
them. I think it is too late to follow out your plan of taking them
something to-night.” But she added as she saw Billie’s bright face
fall: “But we will pack a basket full to the brim with good things early
to-morrow morning and you and Laura and Violet may take them to
the cottage after breakfast. Only, you must walk around the lake. I
could not take the chance of your skating after what happened this
afternoon.”
Billie stammered out some incoherent words of thanks, Miss
Walters patted her cheek, and in another moment she found herself
standing outside in the hall in a sort of happy daze.
A girl passed her, eyed her curiously, went on a few steps and
then came back. It was Eliza Dilks.
“In Miss Walters’ room at night,” said the sneering voice that
Billie knew only too well. “No wonder you get away with everything
—teacher’s pet.”
Billie started to retort angrily, but knowing that silence was the
very worst punishment one could inflict upon Eliza she merely
shrugged her shoulders, turned up her straight little nose as far as it
would go and walked off, leaving Eliza fuming helplessly.
When Billie reached the dormitory she found the girls waiting
for her in an agitated group. There was not one of them who would
have dared to approach Miss Walters after school hours unless it had
been about a matter of life and death importance, and they had
more than half expected that Billie would be carried back on a
stretcher.
When they found out what had really happened they welcomed
Billie as a hero should be welcomed. They lifted her on their
shoulders and carried her round the dormitory, chanting school
songs till a warning hiss from one of the girls near the door sent
them scuttling. By the time Miss Arbuckle reached the dormitory,
they were bent decorously over their text-books, seeking what
knowledge they might discover!
Next morning, true to her word, Miss Walters herself
superintended the packing of an immense basket with all the
dainties at her command. There were chicken and roast beef
sandwiches, half of a leg of lamb, two or three different kinds of
jelly, some rice pudding left over from the night before, a big slab of
cake, two quarts of fresh milk, and some beef tea made especially
for the Haddons.
And the girls, feeling more important than they had ever felt
before in their lives, marched off after breakfast, during school hours
—Miss Walters having personally excused them from class—joyfully
bent upon playing the good Samaritan.
“I never knew,” said Laura, as if she were making a great
discovery, “that it could make you so happy to be kind to somebody
else!”
CHAPTER VI—TROUBLE

It was the girls’ intention at first to leave the hamper of good things
before the Haddons’ door so that Mrs. Haddon would have no
chance of refusing the gift through pride.
But when they came to the little cottage after half an hour of
steady walking, they found to their dismay that Fate had taken a
hand and spoiled all their plans.
For Mrs. Haddon herself, a shawl over her head and looking
even more worried and anxious than she had when they had seen
her before, rounded the corner of the house and met them just as
they reached the door.
For a moment the girls had a panicky impulse to drop the
basket and run, but on second thought they decided that that would
be just about the worst thing they could possibly do. And while they
were trying to think up something to say, Mrs. Haddon took the
whole situation entirely out of their hands.
At first she did not seem to recognize them, but the next instant
her face lighted up with relief and she opened the door of the
cottage, beckoning them to enter.
“Just stay here in the kitchen a minute where it’s warm,” she
directed them in a strained tone, and before the girls had time to
draw their breath she had disappeared from the room, leaving the
classmates alone.
“Now we’ve gone and spilled the beans,” whispered slangy
Laura, eyeing the blameless hamper disapprovingly as she warmed
her chilled hands before the stove. “I don’t suppose she will touch a
thing now, and after we went and walked all this way, and
everything, too——”
“Sh-h,” cautioned Billie, a hand to her lips. “She’s coming back.”
At that moment Mrs. Haddon did indeed come back into the
kitchen. She closed the door very gently behind her and then came
quickly toward the girls.
“Listen,” she said breathlessly. “I don’t know who sent you, just
now. Maybe it was God.” She caught her breath on the words and
the girls regarded her wonderingly and a little fearfully. For
goodness’ sake! what was she talking about?
“Anyway, you’ve come,” went on the woman, swiftly. “And if you
want to, you can do me a great favor.”
“What is it?” they asked together.
“Run for the nearest doctor, one of you—or all of you,” said the
woman, her words stumbling over one another in her agitation.
“Peter, my little boy, is sick. If I don’t have a doctor very soon, he
may die.”
“Oh, where is the nearest doctor?” asked Billie, breathlessly, her
eyes big with sympathy. “Tell me and I’ll go.”
“Half a mile down the road!” said the woman. “Dr. Ramsey! In
the big white house! These are his office hours. He should be at
home. I just went to a neighbor’s, but she was not at home and I
could not go myself. Peter would have been alone——”
“I’ll go, and I’ll have him back here in half an hour,” promised
Billie, running to the door as she spoke. But Laura grabbed her skirt
and held on to it.
“No, you stay here. I’ll go,” she said, thinking desperately of the
food hamper and fearing that if Billie went for the doctor she would
probably have to explain their mission.
“I’ll go with you,” volunteered Vi, with the same thought in
mind, and before Billie could do more than blink, her two chums had
flashed through the door, closing it with a sharp little click behind
them. Then it opened again for an instant and Laura put her pretty
head inside.
“You always could explain things so much better than the rest of
us, Billie,” she said, by way of excuse, it is to be supposed—and then
the door closed again.
It was good for Billie at that moment that she had been blessed
with a sense of humor. Otherwise, she might have been a little put
out.
As it was, she took it as a joke on her and turned back
resignedly to her task of telling why they had come to proud Polly
Haddon.
The latter was pacing the floor anxiously. Then, as a little moan
came from the next room, she flew to the patient, leaving Billie
entirely alone.
The latter regarded the hamper uncertainly for a moment, then,
with a sigh, she lifted it from the floor to the rickety kitchen table.
“I’ll let her see all the good things first,” she decided wisely, as
she removed the cover from the basket, exposing to view its inviting
contents. “Then maybe she’ll be too busy looking at them to be
angry.”
So busy was she that she did not hear Mrs. Haddon reënter the
room. Neither did she know that the latter was staring unbelievingly
over her shoulder till a slight exclamation of wonder made her start
and whirl round suddenly.
“Where did you get all that?” asked the woman, her eyes still
fixed on the contents of the basket. “And what is it for?”
“It’s—it’s for you—if you will take it, please,” stammered Billie, in
her surprise and confusion saying what came first to her mind. “We
—we thought maybe—maybe the kiddies would like the beef tea and
milk and—and—things——” she finished weakly, thinking resentfully
that the girls, or one of them anyway, might have stayed and helped
her out.
But after all, she need not have worried. For an instant the look
that Billie had expected and dreaded flared into Polly Haddon’s eyes
—a look of outraged pride. But then the woman thought of the
children—and she had no pride.
“You said you brought some beef tea?” she repeated, bending
eagerly over the basket. “And milk?”
“Two quarts of milk,” cried Billie, joyfully, the relief she felt
singing in her voice. “And we made the beef tea fresh this morning.
Why—why—what’s the matter?”
For Polly Haddon’s black eyes had filled with tears and she had
turned away impatiently to hide them. Beneath the worn old shawl,
her thin shoulders shook in an effort to suppress her hysterical sobs.
Then Billie ran to her and put her young arms around her and
Polly Haddon, who had struggled so long and so bravely alone, clung
to the girl hungrily while she fought for self-control.
“It’s so long!” she said huskily, “so long since any one did
anything for us—for my babies——” Her voice broke, and for a
minute she just clung to Billie and let tears wash some of the
bitterness from her heart. Then she straightened up suddenly, wiped
the tears from her eyes with a handkerchief that Billie had slipped
into her hand, and holding the girl off at arm’s length regarded her
intently.
“It seems,” said the woman softly, while Billie looked up at her
out of clear, grave eyes, “that when things get as bad as they can be
the Lord sends somebody to help. This time he sent you. Hark!
What’s that?”
It was only the restless turning of a feverish little body in bed,
but the mother was instantly alert.
“The beef tea!” she directed, and Billie quickly handed her one
of the bottles. “He has had hardly any real nourishment since day
before yesterday,” Polly Haddon went on as she poured the liquid
into one of the pans on the stove and sniffed of it hungrily. “Strong
beef tea is just what the little fellow needs.”
Billie wondered while she watched Mrs. Haddon with pitying
eyes. No nourishment for almost two days! Why, if they had not
come the children might have starved to death!
“Where are the two little girls?” she asked, remembering
suddenly that she had seen no sign of them.
Mrs. Haddon said nothing for so long that Billie began to think
she had not heard her question. Then the woman turned and faced
the girl, holding a steaming cup of beef broth in her hand.
“I’ve kept them in bed, too,” she said. “I was afraid they had
caught cold, and then, too—one feels less hungry if one doesn’t
move about.”
Then abruptly she turned and once more left the room. Billie
would have followed, but the thought that perhaps Polly Haddon
would not wish her to held her back. The woman had accepted the
food for her children’s sake, because they were practically starving.
But in spite of that she was very proud. Perhaps she would not wish
to have Billie see the poverty-stricken bareness of the rooms
beyond. So Billie stayed in the kitchen and waited.
Her eyes strayed nervously to an alarm clock that ticked away
on a shelf over the sink. She wished the girls would come with the
doctor. If little Peter was as sick as his mother thought he was, every
minute might be precious. And besides that, they must get back to
school.
Then she heard the girls’ voices mingled with the gruff tones of
a man—the doctor, of course—and her heart jumped with relief. The
next moment the door was flung open and Laura and Vi came in,
followed by an immense man who seemed to completely fill the
narrow doorway. Then Polly Haddon appeared in the doorway
between the two rooms, an empty cup in her hand. At sight of the
doctor she set down the cup and motioned him eagerly into the
other room.
The latter glanced curiously at Billie, flung his hat on the kitchen
table in passing, and disappeared with Mrs. Haddon into the sick
room.
“Just luck that we happened to catch the doctor on his way
out,” panted Laura, for the big man had hustled the girls back to the
cottage on a run. “Say, Billie,” she added, her eyes lighting on the
opened hamper, “I see you did the trick. Any bones broken?”
“Tell us about it,” begged Vi.
“I’ll tell you on the way home,” said Billie, her eye once more on
the clock. “Miss Walters told us not to stay long, you know. We were
to come right back.”
“Gracious, look at the time!” cried Laura, in consternation,
following Billie’s eyes to the clock. “Miss Walters will think we have
eloped.”
“I wish we could wait and see what the doctor says,” protested
Vi, hanging back, and just then Billie raised a warning finger.
“Listen,” she said.
The doctor had raised his voice for a moment and his words
came clearly to the girls where they stood near the door.
“The boy is very sick, Mrs. Haddon,” he said. “It will take good
nursing to pull him through and plenty of nourishing food.” He
lowered his voice again and the rest of what he said was lost in a
meaningless murmur.
In the kitchen the girls stared at each other.
“Plenty of nourishing food,” whispered Billie. “Where is he going
to get it?”
“I guess,” said Laura, as she opened the door, “it is up to us!”