Data Structures and Algorithm Analysis in C, Second Edition (China Reprint Edition)
Author: Mark Allen Weiss
ISBN: 9787111312802, 7111312805
Edition: 2
Year: 2010
Language: English
File details: PDF, 114.91 MB
[Cover]

Data Structures and Algorithm Analysis in C (Second Edition)
Mark Allen Weiss

www.pearsonhighered.com
ISBN 978-7-111-31280-2
English reprint edition copyright © 2010 by Pearson Education Asia Limited
and China Machine Press.
Original English language title: Data Structures and Algorithm Analysis in C,
Second Edition (ISBN 978-0-201-49840-0) by Mark Allen Weiss, Copyright © 1997.
All rights reserved.
Published by arrangement with the original publisher, Pearson Education, Inc.,
publishing as Addison-Wesley.
For sale and distribution in the People’s Republic of China exclusively (except
Taiwan, Hong Kong SAR and Macau SAR).

tJtWPearson Education Asia


flL *tWfl11’FjI,
T+$ARnflficj (4J+Ifl. i1En+1ft
tthL:) WW°
fPearson Education (±flW)tkL)

$wpn, ft:vtR
*1*ItJiM

: 01-2010-4175
*4€:

(CIP) fl

t-Wflff: C141& (fl)tk 2Jt&) / () 4i (Weiss,


MA.) t4hi: I1kItflj’f±, 2010.7
(fl&1)
4Jt: Data Structures and Algorithm Analysis in C, Second Edition
ISBN 978-7-llt-31280-2

It” ffi.©flt4j-fl fl5YW-fl ©Ci—


W.cDTP3II.l2 ©TP312
)i&t±—t

+RCIPflIfl (2010) l3l749

t1L4itLEJkLBJtE ( $M[KWEFk#522 IIIi&IIII 100037)

4hT)4j[j (1JR± /1 Efl’j

l50mmx2l4mm l6.5EFJ*
ISBN 978-7-111-31280-2
tift: 45.005

FLWJ$4S, 1znt, ifUT. ,


tNt: (010) 88378991; 88361066
(010) 68326294; 88379649; 68995259
&&: (010) 88379604
imt1u: hzjsj@hzbook.com
tnrn, *Wff*M*t*n flflfl*fl fV
t4*fltThEth &IELSWØIf4
, flflfliVc±flfñLflB. flL flAk
ftflWrt, Ak-*fltt, ttflfl+

tt’P, 4fltJTlWfl, 3S4W5ITflflWL4!, R3EV**fl,


flflL #-
i, flflht*?#HbT aNB’fWtJU9ktiME,
tflAtfl HtLiftifl0
ftMfl; Akt fl±flSL fl1*I
t*flBt1H1fltIflT, Nttfl4M4*flJL
-h*ifltfl#O3VFSflfl2ib0 hFt, Kli—ft

, &&-flfl. A*2J°
4WIS$flfli 3b&ftfl”. 1998

4Jfl, tH]ti#IWtflT. fl[9hfl*#±O in


41fl4WJJ, {f]Pearson, McGraw-Hill, Elsevier, MIT, John

Wiley & Sons, Cengage W1}L1LTaM1E, Ak


S. Tanenbaum, Bjarne

Stroustrup, Brain W. Kernighan, Dennis Ritchie, Jim Gray, Afred V.

Aho, John E. Hopcroft, Jeffrey D. Uliman, Abraham Silberschatz,

William Stallings, Donald E. Knuth, John L. Hennessy, Larry L.

U
Peterson*—4fttAfi,
Ilk, flflTh IU+&*L *JJW NT)SAk

L
“t±flf4flt” kIt #JT 34flj4ffiJjfl, N
fl4ZflTrtWflfl, JtflTa1Iiflnfl
iv

If
j+, “i4”
Tt
, #iE
“I1n

11TfliJE0
ft, WtJL*I
iUfl
Ijni
1flI±IE, i9fl1:

*: www.hzbook.com

hzjsj@hzbook.com =

(010) 88379604

J4hj1:
100037
PREFACE

Purpose/Goals
This book describes data structures, methods of organizing large amounts of data,
and algorithm analysis, the estimation of the running time of algorithms. As computers
become faster and faster, the need for programs that can handle large amounts

of input becomes more acute. Paradoxically, this requires more careful attention to
efficiency, since inefficiencies in programs become most obvious when input sizes are
large. By analyzing an algorithm before it is actually coded, students can decide if a
particular solution will be feasible. For example, in this text students look at specific
problems and see how careful implementations can reduce the time constraint for
large amounts of data from 16 years to less than a second. Therefore, no algorithm
or data structure is
presented without an explanation of its running time. In some
cases, minute details that affect the running time of the implementation are explored.
Once a solution method is determined, a program must still be written. As
computers have become more
powerful, the problems they must solve have become
larger and more complex, requiring development of more intricate programs. The
goal of this text is to teach students good programming and algorithm analysis skills
simultaneously so that they can develop such programs with the maximum amount
of efficiency.
This book is suitable for either an advanced data structures (CS7) course or
a first-year graduate course in algorithm analysis. Students should have some knowledge
of intermediate programming, including such topics as pointers and recursion,
and some background in discrete math.

Approach
I believe it is important for students to learn how to program for themselves, not
how to copy programs from a book. On the other hand, it is virtually impossible to
discuss realistic programming issues without including sample code. For this reason,
the book usually provides about one-half to three-quarters of an implementation,
and the student is encouraged to supply the rest. Chapter 12, which is new to this
edition, discusses additional data structures with an emphasis on implementation
details.

The algorithms in this book are presented in ANSI C, which, despite some
flaws, is arguably the most popular systems programming language. The use of C
instead of Pascal allows the use of dynamically allocated arrays (see, for instance,
rehashing in Chapter 5). It also produces simplified code in several places, usually
because the and (&&) operation is short-circuited.
Most criticisms of C center on the fact that it is easy to write code that is barely
readable. Some of the more standard tricks, such as the simultaneous assignment
and testing against 0 via

if (x=y)

are generally not used in the text, since the loss of clarity is compensated by only a
few keystrokes and no increased speed. I believe that this book demonstrates that
unreadable code can be avoided by exercising reasonable care.
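For concreteness, here is a small sketch (mine, not from the book; the function names are invented) contrasting the compressed assign-and-test idiom with the clearer form the text prefers:

```c
/* Compressed idiom: assignment and test against 0 in one expression.
   Counts the entries before the terminating 0. */
int CountTerse( const int *P )
{
    int X, N = 0;
    while( ( X = *P++ ) )   /* assigns, then tests the result */
        N++;
    return N;
}

/* The clearer equivalent, in the spirit the text recommends. */
int CountClear( const int *P )
{
    int N = 0;
    while( *P != 0 )
    {
        P++;
        N++;
    }
    return N;
}
```

Both behave identically; the second costs a few keystrokes and no speed.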

Overview

Chapter 1 contains review material on discrete math and recursion. I believe the only
way to be comfortable with recursion is to see good uses over and over. Therefore,
recursion is prevalent in this text, with examples in every chapter except Chapter 5.
Chapter 2 deals with algorithm analysis. This chapter explains asymptotic analysis
and its major weaknesses. Many examples are provided, including an in-depth
explanation of logarithmic running time. Simple recursive programs are analyzed
by intuitively converting them into iterative programs. More complicated divide-
and-conquer programs are introduced, but some of the analysis (solving recurrence
relations) is implicitly delayed until Chapter 7, where it is performed in detail.
Chapter 3 covers lists, stacks, and queues. The emphasis here is on coding
these data structures using ADTs, fast implementation of these data structures, and
an exposition of some of their uses. There are almost no programs (just routines),
but the exercises contain plenty of ideas for programming assignments.
Chapter 4 covers trees, with an emphasis on search trees, including external
search trees (B-trees). The UNIX file system and expression trees are used as examples.
AVL trees and splay trees are introduced but not analyzed. Seventy-five percent of the
code is written, leaving similar cases to be completed by the student. More careful
treatment of search tree implementation details is found in Chapter 12. Additional

coverage of trees, such as file compression and game trees, is deferred until Chapter
10. Data structures for an external medium are considered as the final topic in

several chapters.
Chapter 5 is a relatively short chapter concerning hash tables. Some analysis is
performed, and extendible hashing is covered at the end of the chapter.
Chapter 6 is about priority queues. Binary heaps are covered, and there is
additional material on some of the theoretically interesting implementations of

priority queues. The Fibonacci heap is discussed in Chapter 11, and the pairing heap
is discussed in Chapter 12.

Chapter 7 covers sorting. It is very specific with respect to coding details and
analysis. All the important general-purpose sorting algorithms are covered and
compared. Four algorithms are analyzed in detail: insertion sort, Shellsort, heapsort,
and quicksort. The analysis of the average-case running time of heapsort is new to
this edition. External sorting is covered at the end of the chapter.
Chapter 8 discusses the disjoint set algorithm with proof of the running time.
This is a short and specific chapter that can be skipped if Kruskal’s algorithm is not
discussed.
Chapter 9 covers graph algorithms. Algorithms on graphs are interesting, not
only because they frequently occur in practice but also because their running time is
so heavily dependent on the proper use of data structures. Virtually all of the standard

algorithms are presented along with appropriate data structures, pseudocode, and
analysis of running time. To place these problems in a proper context, a short
discussion on complexity theory (including NP-completeness and undecidability) is
provided.
Chapter 10 covers algorithm design by examining common problem-solving
techniques. This chapter is heavily fortified with examples. Pseudocode is used in
these later chapters so that the student’s appreciation of an example algorithm is not
obscured by implementation details.
Chapter 11 deals with amortized analysis. Three data structures from Chapters
4 and 6 and the Fibonacci heap, introduced in this chapter, are analyzed.
Chapter 12 is new to this edition. It covers search tree algorithms, the k-d tree,
and the pairing heap. This chapter departs from the rest of the text by providing

complete and careful implementations for the search trees and pairing heap. The
material is structured so that the instructor can integrate sections into discussions
from other chapters. For example, the top-down red black tree in Chapter 12 can
be discussed under AVL trees (in Chapter 4).

Chapters 1-9 provide enough material for most one-semester data structures
courses. If time permits, then Chapter 10 can be covered. A graduate course
on algorithm analysis could cover Chapters 7-11. The advanced data structures


analyzed in Chapter 11 can easily be referred to in the earlier chapters. The
discussion of NP-completeness in Chapter 9 is far too brief to be used in such a
course. Garey and Johnson’s book on NP-completeness can be used to augment this
text.

Exercises

Exercises, provided at the end of each chapter, match the order in which material
is presented. The last exercises may address the chapter as a whole rather than a
specific section. Difficult exercises are marked with an asterisk, and more challenging
exercises have two asterisks.
A solutions manual containing solutions to almost all the exercises is available
to instructors from the Addison-Wesley Publishing Company.

References

References are placed at the end of each chapter. Generally the references either
are historical, representing the original source of the material, or they represent

extensions and improvements to the results given in the text. Some references
represent solutions to exercises.

Code Availability
The program code in this book is available via anonymous ftp at aw.com. It
is also accessible through the World Wide Web; the URL is
http://www.aw.com/cseng/ (follow the links from there). The exact location of
this material may change.

Acknowledgments
Many, many people have helped me in the preparation of books in this series. Some
are listed in other versions of the book; thanks to all.
For this edition, I would like to thank my editors at Addison-Wesley, Carter
Shanklin and Susan Hartman. Teri Hyde did another wonderful job with the
production, and Matthew Harris and his staff at Publication Services did their usual
fine work putting the final pieces together.

M. A. W.

Miami, Florida
July, 1996
CONTENTS

1 Introduction 1
1.1. What's the Book About? 1
1.2. Mathematics Review 3
1.2.1. Exponents 3
1.2.2. Logarithms 3
1.2.3. Series 4
1.2.4. Modular Arithmetic 5
1.2.5. The P Word 6
1.3. A Brief Introduction to Recursion 8
Summary 12
Exercises 12
References 13

2 Algorithm Analysis 15
2.1. Mathematical Background 15
2.2. Model 18
2.3. What to Analyze 18
2.4. Running Time Calculations 20
2.4.1. A Simple Example 21
2.4.2. General Rules 21
2.4.3. Solutions for the Maximum Subsequence Sum Problem 24
2.4.4. Logarithms in the Running Time 28
2.4.5. Checking Your Analysis 33
2.4.6. A Grain of Salt 33
Summary 34
Exercises 35
References 39
3 Lists, Stacks, and Queues 41
3.1. Abstract Data Types (ADTs) 41
3.2. The List ADT 42
3.2.1. Simple Array Implementation of Lists 43
3.2.2. Linked Lists 43
3.2.3. Programming Details 44
3.2.4. Common Errors 49
3.2.5. Doubly Linked Lists 51
3.2.6. Circularly Linked Lists 52
3.2.7. Examples 52
3.2.8. Cursor Implementation of Linked Lists 57
3.3. The Stack ADT 62
3.3.1. Stack Model 62
3.3.2. Implementation of Stacks 63
3.3.3. Applications 71
3.4. The Queue ADT 79
3.4.1. Queue Model 79
3.4.2. Array Implementation of Queues 79
3.4.3. Applications of Queues 84
Summary 85
Exercises 85

4 Trees 89
4.1. Preliminaries 89
4.1.1. Implementation of Trees 90
4.1.2. Tree Traversals with an Application 91
4.2. Binary Trees 95
4.2.1. Implementation 96
4.2.2. Expression Trees 97
4.3. The Search Tree ADT—Binary Search Trees 100
4.3.1. MakeEmpty 101
4.3.2. Find 101
4.3.3. FindMin and FindMax 103
4.3.4. Insert 104
4.3.5. Delete 105
4.3.6. Average-Case Analysis 107
4.4. AVL Trees 110
4.4.1. Single Rotation 112
4.4.2. Double Rotation 115
4.5. Splay Trees 123
4.5.1. A Simple Idea (That Does Not Work) 124
4.5.2. Splaying 126
4.6. Tree Traversals (Revisited) 132
4.7. B-Trees 133
Summary 138
Exercises 139
References 146

5 Hashing 149
5.1. General Idea 149
5.2. Hash Function 150
5.3. Separate Chaining 152
5.4. Open Addressing 157
5.4.1. Linear Probing 157
5.4.2. Quadratic Probing 160
5.4.3. Double Hashing 164
5.5. Rehashing 165
5.6. Extendible Hashing 168
Summary 171
Exercises 172
References 175

6 Priority Queues (Heaps) 177
6.1. Model 177
6.2. Simple Implementations 178
6.3. Binary Heap 179
6.3.1. Structure Property 179
6.3.2. Heap Order Property 180
6.3.3. Basic Heap Operations 182
6.3.4. Other Heap Operations 186
6.4. Applications of Priority Queues 189
6.4.1. The Selection Problem 189
6.4.2. Event Simulation 191
6.5. d-Heaps 192
6.6. Leftist Heaps 193
6.6.1. Leftist Heap Property 193
6.6.2. Leftist Heap Operations 194
6.7. Skew Heaps 200
6.8. Binomial Queues 202
6.8.1. Binomial Queue Structure 202
6.8.2. Binomial Queue Operations 204
6.8.3. Implementation of Binomial Queues 205
Summary 212
Exercises 212
References 216

7 Sorting 219
7.1. Preliminaries 219
7.2. Insertion Sort 220
7.2.1. The Algorithm 220
7.2.2. Analysis of Insertion Sort 221
7.3. A Lower Bound for Simple Sorting Algorithms 221
7.4. Shellsort 222
7.4.1. Worst-Case Analysis of Shellsort 224
7.5. Heapsort 226
7.5.1. Analysis of Heapsort 228
7.6. Mergesort 230
7.6.1. Analysis of Mergesort 232
7.7. Quicksort 235
7.7.1. Picking the Pivot 236
7.7.2. Partitioning Strategy 237
7.7.3. Small Arrays 240
7.7.4. Actual Quicksort Routines 240
7.7.5. Analysis of Quicksort 241
7.7.6. A Linear-Expected-Time Algorithm for Selection 245
7.8. Sorting Large Structures 247
7.9. A General Lower Bound for Sorting 247
7.9.1. Decision Trees 247
7.10. Bucket Sort 250
7.11. External Sorting 250
7.11.1. Why We Need New Algorithms 251
7.11.2. Model for External Sorting 251
7.11.3. The Simple Algorithm 251
7.11.4. Multiway Merge 253
7.11.5. Polyphase Merge 254
7.11.6. Replacement Selection 255
Summary 256
Exercises 257
References 261

8 The Disjoint Set ADT 263
8.1. Equivalence Relations 263
8.2. The Dynamic Equivalence Problem 264
8.3. Basic Data Structure 265
8.4. Smart Union Algorithms 269
8.5. Path Compression 271
8.6. Worst Case for Union-by-Rank and Path Compression 273
8.6.1. Analysis of the Union/Find Algorithm 273
8.7. An Application 279
Summary 279
Exercises 280
References 281

9 Graph Algorithms 283
9.1. Definitions 283
9.1.1. Representation of Graphs 284
9.2. Topological Sort 286
9.3. Shortest-Path Algorithms 290
9.3.1. Unweighted Shortest Paths 291
9.3.2. Dijkstra's Algorithm 295
9.3.3. Graphs with Negative Edge Costs 304
9.3.4. Acyclic Graphs 305
9.3.5. All-Pairs Shortest Path 308
9.4. Network Flow Problems 308
9.4.1. A Simple Maximum-Flow Algorithm 309
9.5. Minimum Spanning Tree 313
9.5.1. Prim's Algorithm 314
9.5.2. Kruskal's Algorithm 316
9.6. Applications of Depth-First Search 319
9.6.1. Undirected Graphs 320
9.6.2. Biconnectivity 322
9.6.3. Euler Circuits 326
9.6.4. Directed Graphs 329
9.6.5. Finding Strong Components 331
9.7. Introduction to NP-Completeness 332
9.7.1. Easy vs. Hard 333
9.7.2. The Class NP 334
9.7.3. NP-Complete Problems 335
Summary 337
Exercises 337
References 343

10 Algorithm Design Techniques 347
10.1. Greedy Algorithms 347
10.1.1. A Simple Scheduling Problem 348
10.1.2. Huffman Codes 351
10.1.3. Approximate Bin Packing 357
10.2. Divide and Conquer 365
10.2.1. Running Time of Divide and Conquer Algorithms 366
10.2.2. Closest-Points Problem 368
10.2.3. The Selection Problem 373
10.2.4. Theoretical Improvements for Arithmetic Problems 376
10.3. Dynamic Programming 380
10.3.1. Using a Table Instead of Recursion 380
10.3.2. Ordering Matrix Multiplications 383
10.3.3. Optimal Binary Search Tree 387
10.3.4. All-Pairs Shortest Path 390
10.4. Randomized Algorithms 392
10.4.1. Random Number Generators 394
10.4.2. Skip Lists 397
10.4.3. Primality Testing 399
10.5. Backtracking Algorithms 401
10.5.1. The Turnpike Reconstruction Problem 403
10.5.2. Games 407
Summary 413
Exercises 415
References 422

11 Amortized Analysis 427
11.1. An Unrelated Puzzle 428
11.2. Binomial Queues 428
11.3. Skew Heaps 433
11.4. Fibonacci Heaps 435
11.4.1. Cutting Nodes in Leftist Heaps 436
11.4.2. Lazy Merging for Binomial Queues 439
11.4.3. The Fibonacci Heap Operations 442
11.4.4. Proof of the Time Bound 443
11.5. Splay Trees 445
Summary 449
Exercises 450
References 451

12 Advanced Data Structures and Implementation 453
12.1. Top-Down Splay Trees 453
12.2. Red Black Trees 457
12.2.1. Bottom-Up Insertion 462
12.2.2. Top-Down Red Black Trees 463
12.2.3. Top-Down Deletion 465
12.3. Deterministic Skip Lists 469
12.4. AA-Trees 476
12.5. Treaps 482
12.6. k-d Trees 485
12.7. Pairing Heaps 488
Summary 494
Exercises 495
References 497

Index 501
CHAPTER 1

Introduction

In this chapter, we discuss the aims and goals of this text and briefly review
programming concepts and discrete mathematics. We will

• See that how a program performs for reasonably large input is just as important
  as its performance on moderate amounts of input.
• Summarize the basic mathematical background needed for the rest of the book.
• Briefly review recursion.

1.1. What’s the Book About?

Suppose you have a group of N numbers and would like to determine the kth largest.
This is known as the selection problem. Most students who have had a programming
course or two would have no difficulty writing a program to solve this problem.
There are quite a few "obvious" solutions.
One way to solve this problem would be to read the N numbers into an array,
sort the array in decreasing order by some simple algorithm such as bubblesort, and
then return the element in position k.
A somewhat better algorithm might be to read the first k elements into an array
and sort them (in decreasing order). Next, each remaining element is read one by
one. As a new element arrives, it is
ignored if it is smaller than the kth element
in the array. Otherwise, it is placed in its correct spot in the array, bumping one
element out of the array. When the algorithm ends, the element in the kth position
is returned as the answer.
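The second algorithm can be sketched in C as follows. This is my own sketch (0-indexed, with invented names and a malloc-based buffer), not code from the book, which develops better approaches in Chapter 7:

```c
#include <stdlib.h>

/* Return the kth largest of the N values in A: keep a k-element
   buffer sorted in decreasing order, and bump out the smallest
   element whenever a larger one arrives.  Assumes 1 <= k <= N. */
int KthLargest( const int A[ ], int N, int k )
{
    int *Buf = malloc( k * sizeof( int ) );
    int i, j, Answer;

    /* Sort the first k elements in decreasing order by insertion. */
    for( i = 0; i < k; i++ )
    {
        int Tmp = A[ i ];
        for( j = i; j > 0 && Buf[ j - 1 ] < Tmp; j-- )
            Buf[ j ] = Buf[ j - 1 ];
        Buf[ j ] = Tmp;
    }

    /* Each remaining element is ignored if it is smaller than the
       kth element; otherwise it is placed in its correct spot. */
    for( i = k; i < N; i++ )
        if( A[ i ] > Buf[ k - 1 ] )
        {
            for( j = k - 1; j > 0 && Buf[ j - 1 ] < A[ i ]; j-- )
                Buf[ j ] = Buf[ j - 1 ];
            Buf[ j ] = A[ i ];
        }

    Answer = Buf[ k - 1 ];
    free( Buf );
    return Answer;
}
```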

Both algorithms are simple to code, and you are encouraged to do so. The
natural questions, then, are which is better and, more important, is either
algorithm good enough? A simulation using a random file of 1 million elements
and k = 500,000 will show that neither algorithm finishes in a reasonable amount
of time; each requires several days of computer processing to terminate (albeit



eventually with a correct answer). An alternative method, discussed in Chapter 7,
gives a solution in about a second. Thus, although our proposed algorithms work,
they cannot be considered good algorithms, because they are entirely impractical for
input sizes that a third algorithm can handle in a reasonable amount of time.
A second problem is to solve a popular word puzzle. The input consists of a
two-dimensional array of letters and a list of words. The object is to find the words
in the puzzle. These words may be horizontal, vertical, or diagonal in any direction.
As an example, the puzzle shown in Figure 1.1 contains the words this, two, fat,
and that. The word this begins at row 1, column 1, or (1,1), and extends to (1,4);
two goes from (1,1) to (3,1); fat goes from (4,1) to (2,3); and that goes from (4,4)
to (1,1).
Again, there are at least two straightforward algorithms that solve the problem.
For each word in the word list, we check each ordered
triple (row, column,
orientation) for the presence of the word. This amounts to lots of nested for loops
but is basically straightforward.
Alternatively, for each ordered quadruple (row, column, orientation, number
of characters) that doesn’t run off an end of the puzzle, we can test whether the
word indicated is in the word list. Again, this amounts to lots of nested for loops. It
is possible to save some time if the maximum number of characters in any word is
known.
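The inner test both approaches share, checking one word at one starting position in one orientation, might look like this in C. This is a sketch of my own (0-indexed, with a fixed grid size and invented names), not code from the book; the eight orientations are the (DRow, DCol) pairs drawn from {-1, 0, 1}, excluding (0, 0):

```c
#define ROWS 4
#define COLS 4

/* Return 1 if Word appears in Puzzle starting at (Row, Col) and
   stepping by (DRow, DCol); return 0 on a mismatch or if the word
   runs off an edge of the board. */
int WordAt( char Puzzle[ ROWS ][ COLS ], const char *Word,
            int Row, int Col, int DRow, int DCol )
{
    int i;

    for( i = 0; Word[ i ] != '\0'; i++, Row += DRow, Col += DCol )
        if( Row < 0 || Row >= ROWS || Col < 0 || Col >= COLS ||
            Puzzle[ Row ][ Col ] != Word[ i ] )
            return 0;
    return 1;
}
```

The first algorithm calls this once per (word, row, column, orientation); the second walks each quadruple and looks the extracted string up in the word list instead.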
It is relatively easy to code up either method of solution and solve many of the
real-life puzzles commonly published in magazines. These typically have 16 rows, 16
columns, and 40 or so words. Suppose, however, we consider the variation where
only the puzzle board is given and the word list is essentially an English dictionary.
Both of the solutions proposed require considerable time to solve this problem and
therefore are not acceptable. However, it is possible, even with a large word list, to
solve the problem in a matter of seconds.
An important concept is that, in many problems, writing a working program is
not good enough. If the program is to be run on a large data set, then the running
time becomes an issue. Throughout this book we will see how to estimate the
running time of a program for large inputs and, more important, how to compare
the running times of two programs without actually coding them. We will see

techniques for drastically improving the speed of a program and for determining
program bottlenecks. These techniques will enable us to find the section of the code
on which to concentrate our optimization efforts.

Figure 1.1 Sample word puzzle

       1 2 3 4
    1  t h i s
    2  w a t s
    3  o a h g
    4  f g d t

1.2. Mathematics Review

This section lists some of the basic formulas you need to memorize or be able to

derive and reviews basic proof techniques.

1.2.1. Exponents
    X^A X^B = X^(A+B)

    X^A / X^B = X^(A-B)

    (X^A)^B = X^(AB)

    X^N + X^N = 2X^N ≠ X^(2N)

    2^N + 2^N = 2^(N+1)

1.2.2. Logarithms
In computer science, all logarithms are to the base 2 unless specified otherwise.

DEFINITION: X^A = B if and only if log_X B = A

Several convenient equalities follow from this definition.

THEOREM 1.1.

    log_A B = log_C B / log_C A ;  C > 0

PROOF:
Let X = log_C B, Y = log_C A, and Z = log_A B. Then, by the definition of
logarithms, C^X = B, C^Y = A, and A^Z = B. Combining these three equalities
yields (C^Y)^Z = C^X = B. Therefore, X = YZ, which implies Z = X/Y,
proving the theorem.

THEOREM 1.2.

    log AB = log A + log B

PROOF:
Let X = log A, Y = log B, and Z = log AB. Then, assuming the default base
of 2, 2^X = A, 2^Y = B, and 2^Z = AB. Combining the last three equalities yields
2^X 2^Y = 2^Z = AB. Therefore, X + Y = Z, which proves the theorem.
Some other useful formulas, which can all be derived in a similar manner,
follow.

    log A/B = log A - log B

    log(A^B) = B log A

    log X < X  for all X > 0

    log 1 = 0,  log 2 = 1,  log 1,024 = 10,  log 1,048,576 = 20

1.2.3. Series

The easiest formulas to remember are

    sum_{i=0}^{N} 2^i = 2^(N+1) - 1

and the companion,

    sum_{i=0}^{N} A^i = (A^(N+1) - 1) / (A - 1)

In the latter formula, if 0 < A < 1, then

    sum_{i=0}^{N} A^i <= 1 / (1 - A)

and as N tends to infinity, the sum approaches 1/(1 - A). These are the "geometric series"
formulas.
We can derive the last formula for sum_{i=0}^{infinity} A^i (0 < A < 1) in the following
manner. Let S be the sum. Then

    S = 1 + A + A^2 + A^3 + A^4 + A^5 + ...

Then

    AS = A + A^2 + A^3 + A^4 + A^5 + ...

If we subtract these two equations (which is permissible only for a convergent series),
virtually all the terms on the right side cancel, leaving

    S - AS = 1

which implies that

    S = 1 / (1 - A)
We can use this same technique to compute sum_{i=1}^{infinity} i/2^i, a sum that occurs
frequently. We write

    S = 1/2 + 2/4 + 3/8 + 4/16 + 5/32 + ...

and multiply by 2, obtaining

    2S = 1 + 2/2 + 3/4 + 4/8 + 5/16 + ...

Subtracting these two equations yields

    S = 1 + 1/2 + 1/4 + 1/8 + 1/16 + ...

Thus, S = 2.
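A quick numerical check of this result (my own helper, not from the text): summing the first several dozen terms of i/2^i lands essentially on 2.

```c
/* Sum the first Terms terms of sum_{i>=1} i / 2^i, which the
   derivation above shows converges to 2. */
double WeightedGeometricSum( int Terms )
{
    double Sum = 0.0, Power = 1.0;
    int i;

    for( i = 1; i <= Terms; i++ )
    {
        Power /= 2.0;       /* Power is now 1 / 2^i */
        Sum += i * Power;
    }
    return Sum;
}
```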
Another type of common series in analysis is the arithmetic series. Any such
series can be evaluated from the basic formula

    sum_{i=1}^{N} i = N(N + 1)/2 ≈ N^2 / 2

For instance, to find the sum 2 + 5 + 8 + ... + (3k - 1), rewrite it as
3(1 + 2 + 3 + ... + k) - (1 + 1 + 1 + ... + 1), which is clearly 3k(k + 1)/2 - k. Another way to
remember this is to add the first and last terms (total 3k + 1), the second and next-
to-last terms (total 3k + 1), and so on. Since there are k/2 of these pairs, the total
sum is k(3k + 1)/2, which is the same answer as before.
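The worked example is easy to verify mechanically (a small sketch of mine, not from the text): sum 2 + 5 + 8 + ... + (3k - 1) directly and compare against the closed form k(3k + 1)/2.

```c
/* Direct summation of 2 + 5 + 8 + ... + (3k - 1), i.e. the terms
   3i - 1 for i = 1 .. k, for comparison with k(3k + 1)/2. */
int ArithSum( int k )
{
    int i, Sum = 0;

    for( i = 1; i <= k; i++ )
        Sum += 3 * i - 1;
    return Sum;
}
```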

The next two formulas pop up now and then but are fairly uncommon.

    sum_{i=1}^{N} i^2 = N(N + 1)(2N + 1)/6 ≈ N^3 / 3

    sum_{i=1}^{N} i^k ≈ N^(k+1) / |k + 1| ,  k ≠ -1

When k = -1, the latter formula is not valid. We then need the following
formula, which is used far more in computer science than in other mathematical
disciplines. The numbers H_N are known as the harmonic numbers, and the sum
is known as a harmonic sum. The error in the following approximation tends to
γ ≈ 0.57721566, which is known as Euler's constant.

    H_N = sum_{i=1}^{N} 1/i ≈ log_e N
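The approximation is easy to observe numerically (my own helper, not from the text): for large N, H_N - log_e N is very close to Euler's constant.

```c
#include <math.h>

/* H_N = 1 + 1/2 + ... + 1/N; per the text, H_N ≈ log_e N with the
   error tending to Euler's constant, gamma ≈ 0.57721566. */
double Harmonic( int N )
{
    double Sum = 0.0;
    int i;

    for( i = 1; i <= N; i++ )
        Sum += 1.0 / i;
    return Sum;
}
```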
These two formulas are just general algebraic manipulations.

    sum_{i=1}^{N} f(N) = N f(N)

    sum_{i=n0}^{N} f(i) = sum_{i=1}^{N} f(i) - sum_{i=1}^{n0-1} f(i)

1.2.4. Modular Arithmetic

We say that A is congruent to B modulo N, written A ≡ B (mod N), if N divides
A - B. Intuitively, this means that the remainder is the same when either A or B
is divided by N. Thus, 81 ≡ 61 ≡ 1 (mod 10). As with equality, if A ≡ B (mod N),
then A + C ≡ B + C (mod N) and AD ≡ BD (mod N).
There are many theorems that apply to modular arithmetic, and some of them
require extraordinary proofs in number theory. We will use modular arithmetic
sparingly, and the preceding theorems will suffice.
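A tiny check of these congruence facts (my own helper, not from the text):

```c
/* A is congruent to B modulo N exactly when N divides A - B.
   The divisibility test works for negative differences as well. */
int Congruent( int A, int B, int N )
{
    return ( A - B ) % N == 0;
}
```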

1.2.5. The P Word

The two most common ways of proving statements in data structure analysis
are proof by induction and proof by contradiction (and occasionally proof by
intimidation, used by professors only). The best way of proving that a theorem is
false is by exhibiting a counterexample.

Proof by Induction
A proof by induction has two standard parts. The first step is proving a base
case, that is, establishing that a theorem is true for some small (usually degenerate)
value(s); this step is almost always trivial. Next, an inductive hypothesis is assumed.
Generally this means that the theorem is assumed to be true for all cases up to some
limit k. Using this assumption, the theorem is then shown to be true for the next
value, which is typically k + 1. This proves the theorem (as long as k is finite).

As an example, we prove that the Fibonacci numbers, F0 = 1, F1 = 1, F2 = 2,
F3 = 3, F4 = 5, ..., F_i = F_{i-1} + F_{i-2}, satisfy F_i < (5/3)^i, for i >= 1. (Some
definitions have F0 = 0, which shifts the series.) To do this, we first verify that
the theorem is true for the trivial cases. It is easy to verify that F1 = 1 < 5/3 and
F2 = 2 < 25/9; this proves the basis. We assume that the theorem is true for i = 1,
2, ..., k; this is the inductive hypothesis. To prove the theorem, we need to show
that F_{k+1} < (5/3)^(k+1). We have

    F_{k+1} = F_k + F_{k-1}

by the definition, and we can use the inductive hypothesis on the right-hand side,
obtaining

    F_{k+1} < (5/3)^k + (5/3)^(k-1)
            = (3/5)(5/3)^(k+1) + (3/5)^2 (5/3)^(k+1)
            = (3/5)(5/3)^(k+1) + (9/25)(5/3)^(k+1)

which simplifies to

    F_{k+1} < (3/5 + 9/25)(5/3)^(k+1)
            = (24/25)(5/3)^(k+1)
            < (5/3)^(k+1)

proving the theorem.
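The bound can also be spot-checked by computation (my own sketch, not from the text; it uses the text's indexing, F0 = 1, F1 = 1, F2 = 2, ...):

```c
#include <math.h>

/* Fibonacci numbers with the text's indexing (F0 = 1, F1 = 1,
   F2 = 2, ...); returns F_i for i >= 0, so we can spot-check the
   proved bound F_i < (5/3)^i. */
long Fib( int i )
{
    long Prev = 1, Cur = 1;
    int j;

    for( j = 2; j <= i; j++ )
    {
        long Next = Prev + Cur;
        Prev = Cur;
        Cur = Next;
    }
    return Cur;
}
```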
As a second example, we establish the following theorem.

THEOREM 1.3.

    If N >= 1, then sum_{i=1}^{N} i^2 = N(N + 1)(2N + 1)/6

The proof is by induction. For the basis, it is readily seen that the theorem is true
when N = 1. For the inductive hypothesis, assume that the theorem is true for
1 <= k <= N. We will establish that, under this assumption, the theorem is true
for N + 1. We have

    sum_{i=1}^{N+1} i^2 = sum_{i=1}^{N} i^2 + (N + 1)^2

Applying the inductive hypothesis, we obtain

    sum_{i=1}^{N+1} i^2 = N(N + 1)(2N + 1)/6 + (N + 1)^2
                        = (N + 1)[ N(2N + 1)/6 + (N + 1) ]
                        = (N + 1)(2N^2 + 7N + 6)/6
                        = (N + 1)(N + 2)(2N + 3)/6

Thus,

    sum_{i=1}^{N+1} i^2 = (N + 1)[(N + 1) + 1][2(N + 1) + 1]/6

proving the theorem.

Proof by Counterexample
The statement F_k <= k^2 is false. The easiest way to prove this is to compute
F_11 = 144 > 11^2 = 121.

Proof by Contradiction
Proof by contradiction proceeds by assuming that the theorem is false and showing
that this assumption implies that some known property is false, and hence the
original assumption was erroneous. A classic example is the proof that there is an
infinite number of primes. To prove this, we assume that the theorem is false, so
that there is some largest prime P_k. Let P_1, P_2, ..., P_k be all the primes in order and
consider

    N = P_1 P_2 P_3 ... P_k + 1

Clearly, N is larger than P_k, so by assumption N is not prime. However, none of
P_1, P_2, ..., P_k divides N exactly, because there will always be a remainder of 1.
This is a contradiction, because every number is either prime or a product of primes.
Hence, the original assumption, that P_k is the largest prime, is false, which implies
that the theorem is true.

int
F( int X )
{
/* 1*/     if( X == 0 )
/* 2*/         return 0;
           else
/* 3*/         return 2 * F( X - 1 ) + X * X;
}

Figure 1.2 A recursive function

1.3. A Brief Introduction to Recursion

Most mathematical functions that we are familiar with are described by a simple
formula. For instance, we can convert temperatures from Fahrenheit to Celsius by
applying the formula

    C = 5(F - 32)/9

Given this formula, it is trivial to write a C function; with declarations and braces
removed, the one-line formula translates to one line of C.
Mathematical functions are sometimes defined in a less standard form. As an
example, we can define a function F, valid on nonnegative integers, that satisfies
F(0) = 0 and F(X) = 2F(X - 1) + X^2. From this definition we see that F(1) = 1,
F(2) = 6, F(3) = 21, and F(4) = 58. A function that is defined in terms of itself

is called recursive. C allows functions to be recursive. It is important to remember


that what C provides is merely an attempt to follow the recursive spirit. Not all
mathematically recursive functions are efficiently (or correctly) implemented by
C’s simulation of recursion. The idea is that the recursive function F ought to be
expressible in only a few lines, just like a nonrecursive function. Figure 1.2 shows
the recursive implementation of F.
Lines 1 and 2 handle what is known as the base case, that is, the value for
which the function is directly known without resorting to recursion. Just as declaring
F(X) = 2F(X - 1) + X^2 is meaningless, mathematically, without including the fact
that F(0) = 0, the recursive C function doesn't make sense without a base case.
Line 3 makes the recursive call.

There are several important and possibly confusing points about recursion. A
common question is: Isn't this just circular logic? The answer is that although we are
defining a function in terms of itself, we are not defining a particular instance of the
function in terms of itself. In other words, evaluating F(5) by computing F(5) would
be circular. Evaluating F(5) by computing F(4) is not circular—unless, of course,
F(4) is evaluated by eventually computing F(5). The two most important issues are

*Using recursion for numerical calculations is usually a bad idea. We have done so to illustrate the basic
points.

probably the how and why questions. In Chapter 3, the how and why issues are
formally resolved. We will give an incomplete description here.
It turns out that recursive calls are handled no differently from any others. If F
is called with the value of 4, then line 3 requires the computation of 2 * F(3) + 4 * 4.
Thus, a call is made to compute F(3). This requires the computation of 2 * F(2) + 3 *
3. Therefore, another call is made to compute F(2). This means that 2 * F(1) + 2 * 2
must be evaluated. To do so, F(1) is computed as 2 * F(0) + 1 * 1. Now, F(0) must
be evaluated. Since this is a base case, we know a priori that F(0) = 0. This enables
the completion of the calculation for F(1), which is now seen to be 1. Then F(2),
F(3), and finally F(4) can be determined. All the bookkeeping needed to keep track
of pending function calls (those started but waiting for a recursive call to complete),
along with their variables, is done by the computer automatically. An important
point, however, is that recursive calls will keep on being made until a base case is
reached. For instance, an attempt to evaluate F(-1) will result in calls to F(-2),
F(-3), and so on. Since this will never get to a base case, the program won't be able
to compute the answer (which is undefined anyway). Occasionally, a much more
subtle error is made, which is exhibited in Figure 1.3. The error in the program in
Figure 1.3 is that Bad(1) is defined, by line 3, to be Bad(1). Obviously, this doesn't
give any clue as to what Bad(1) actually is. The computer will thus repeatedly
make calls to Bad(1) in an attempt to resolve its values. Eventually, its bookkeeping
system will run out of space, and the program will crash. Generally, we would say
that this function doesn't work for one special case but is correct otherwise. This
isn't true here, since Bad(2) calls Bad(1). Thus, Bad(2) cannot be evaluated either.
Furthermore, Bad(3), Bad(4), and Bad(5) all make calls to Bad(2). Since Bad(2)
is unevaluable, none of these values are either. In fact, this program doesn't work
for any value of N, except 0. With recursive programs, there is no such thing as a
"special case."
These considerations lead to the first two fundamental rules of recursion:

1. Base cases. You must always have some base cases, which can be solved
without recursion.
2. Making progress. For the cases that are to be solved recursively, the recursive
call must always be to a case that makes progress toward a base case.

Figure 1.3 A nonterminating recursive program

int
Bad( unsigned int N )
{
/* 1*/     if( N == 0 )
/* 2*/         return 0;
           else
/* 3*/         return Bad( N / 3 + 1 ) + N - 1;
}

Throughout this book, we will use recursion to solve problems. As an example
of nonmathematical use, consider a large dictionary. Words in dictionaries are
defined in terms of other words. When we look up a word, we might not always
understand the definition, so we might have to look up words in the definition.
Likewise, we might not understand some of those, so we might have to continue
this search for a while. Because the dictionary is finite, eventually either (1) we will
come to a point where we understand all of the words in some definition (and thus
understand that definition and retrace our path through the other definitions) or
(2) we will find that the definitions are circular and we are stuck, or that some word
we need to understand for a definition is not in the dictionary.

Our recursive strategy to understand words is as follows: If we know the
meaning of a word, then we are done; otherwise, we look the word up in the
dictionary. If we understand all the words in the definition, we are done; otherwise,
we figure out what the definition means by recursively looking up the words we
don’t know. This procedure will terminate if the dictionary is well defined but can

loop indefinitely if a word is either not defined or circularly defined.

Printing Out Numbers

Suppose we have a positive integer, N, that we wish to print out. Our routine will
have the heading PrintOut(N). Assume that the only I/O routines available will
take a single-digit number and output it to the terminal. We will call this routine
PrintDigit; for example, PrintDigit(4) will output a 4 to the terminal.
Recursion provides a very clean solution to this problem. To print out 76234,
we need to first print out 7623 and then print out 4. The second step is easily
accomplished with the statement PrintDigit( N % 10 ), but the first doesn't seem any
simpler than the original problem. Indeed it is virtually the same problem, so we can
solve it recursively with the statement PrintOut( N / 10 ).
This tells us how to solve the general problem, but we still need to make sure
that the program doesn't loop indefinitely. Since we haven't defined a base case yet,
it is clear that we still have something to do. Our base case will be PrintDigit(N) if
0 <= N < 10. Now PrintOut(N) is defined for every positive number from 0 to 9,
and larger numbers are defined in terms of a smaller positive number. Thus, there is
no cycle. The entire procedure* is shown in Figure 1.4.
We have made no effort to do this efficiently. We could have avoided using the
mod routine (which is very expensive) because N % 10 = N - ⌊N/10⌋ * 10.†

Recursion and Induction

Let us prove (somewhat) rigorously that the recursive number-printing program
works. To do so, we'll use a proof by induction.

*The term procedure refers to a function that returns void.
†⌊X⌋ is the largest integer that is less than or equal to X.

void
PrintOut( unsigned int N )   /* Print nonnegative N */
{
    if( N >= 10 )
        PrintOut( N / 10 );
    PrintDigit( N % 10 );
}

Figure 1.4 Recursive routine to print an integer

THEOREM 1.4.

    The recursive number-printing algorithm is correct for N >= 0.

PROOF (BY INDUCTION ON THE NUMBER OF DIGITS IN N):

First, if N has one digit, then the program is trivially correct, since it merely
makes a call to PrintDigit. Assume then that PrintOut works for all numbers
of k or fewer digits. A number of k + 1 digits is expressed by its first k digits
followed by its least significant digit. But the number formed by the first k digits
is exactly ⌊N/10⌋, which, by the inductive hypothesis, is correctly printed, and
the last digit is N mod 10, so the program prints out any (k + 1)-digit number
correctly. Thus, by induction, all numbers are correctly printed.
This proof probably seems a little strange in that it is virtually identical to the
algorithm description. It illustrates that in designing a recursive program, all smaller
instances of the same problem (which are on the path to a base case) may be assumed
to work correctly. The recursive program needs only to combine solutions to smaller

problems, which are “magically”obtained by recursion, into a solution for the

current problem. The mathematical justification for this is proof by induction. This
gives the third rule of recursion:

3. Design rule. Assume that all the recursive calls work.

This rule is important because it means that when designing recursive programs,
you generally don't need to know the details of the bookkeeping arrangements, and
you don't have to try to trace through the myriad of recursive calls. Frequently, it is
extremely difficult to track down the actual sequence of recursive calls. Of course,
in many cases this is an indication of a good use of recursion, since the computer is
being allowed to work out the complicated details.
The main problem with recursion is the hidden bookkeeping costs. Although
these costs are almost always justifiable, because recursive programs not only simplify
the algorithm design but also tend to give cleaner code, recursion should never be
used as a substitute for a simple for loop. We’ll discuss the overhead involved in
recursion in more detail in Section 3.3.
When writing recursive routines, it is crucial to keep in mind the four basic
rules of recursion:

1. Base cases. You must always have some base cases, which can be solved
without recursion.

2. Making progress. For the cases that are to be solved recursively, the recursive
call must always be to a case that makes progress toward a base case.
3. Design rule. Assume that all the recursive calls work.
4. Compound interest rule. Never duplicate work by solving the same instance
of a problem in separate recursive calls.

The fourth rule, which will be justified (along with its nickname) in later sections,
is the reason that it is generally a bad idea to use recursion to evaluate simple
mathematical functions, such as the Fibonacci numbers. As long as you keep these
rules in mind, recursive programming should be straightforward.

Summary

This chapter sets the stage for the rest of the book. The time taken by an algorithm
confronted with large amounts of input will be an important criterion for deciding if
it is a good algorithm. (Of course, correctness is most important.) Speed is relative.
What is fast for one problem on one machine might be slow for another problem or
a different machine. We will begin to address these issues in the next chapter and
will use the mathematics discussed here to establish a formal model.

Exercises

1.1 Write a program to solve the selection problem. Let k = N/2. Draw a table
showing the running time of your program for various values of N.


1.2 Write a program to solve the word puzzle problem.
1.3 Write a procedure to output an arbitrary real number (which might be negative)
using only PrintDigit for I/O.
1.4 C allows statements of the form

#include filename

which reads filename and inserts its contents in place of the include statement.
Include statements may be nested; in other words, the file filename may itself
contain an include statement, but, obviously, a file can’t include itself in any
chain. Write a program that reads in a file and outputs the file as modified by
the include statements.
1.5 Prove the following formulas:
a. log X < X for all X > 0
b. log(A^B) = B log A
1.6 Evaluate the following sums:
a. Σ_{i=0}^{∞} 1/4^i
b. Σ_{i=0}^{∞} i/4^i
*c. Σ_{i=0}^{∞} i^2/4^i
**d. Σ_{i=0}^{∞} i^N/4^i

1.7 Estimate

    Σ_{i=⌊N/2⌋}^{N} 1/i

*1.8 What is 2^100 (mod 5)?


1.9 Let F_i be the Fibonacci numbers as defined in Section 1.2. Prove the following:
a. Σ_{i=1}^{N-2} F_i = F_N - 2
b. F_N < φ^N, with φ = (1 + √5)/2
**c. Give a precise closed-form expression for F_N.

1.10 Prove the following formulas:
a. Σ_{i=1}^{N} (2i - 1) = N^2
b. Σ_{i=1}^{N} i^3 = ( Σ_{i=1}^{N} i )^2

References

There are many good textbooks covering the mathematics reviewed in this chapter.
A small subset is [1], [2], [3], [9], [10], and [11]. Reference [9] is specifically geared
toward the analysis of algorithms. It is the first volume of a three-volume series that
will be cited throughout this text. More advanced material is covered in [5].
Throughout this book we will assume a knowledge of C [8]. Occasionally,
we add a feature where necessary for clarity. We also assume familiarity with
pointers and recursion (the recursion summary in this chapter is meant to be a quick
review). We will attempt to provide hints on their use where appropriate throughout
the textbook. Readers not familiar with these should consult [12] or any good
intermediate programming textbook.
General programming style is discussed in several books. Some of the classics
are [4], [6], and [7].

1. M. O. Albertson and J. P. Hutchinson, Discrete Mathematics with Algorithms, John
Wiley & Sons, New York, 1988.
2. Z. Bavel, Math Companion for Computer Science, Reston Publishing Co., Reston, Va.,
1982.
3. R. A. Brualdi, Introductory Combinatorics, North-Holland, New York, 1977.
4. E. W. Dijkstra, A Discipline of Programming, Prentice Hall, Englewood Cliffs, N.J.,
1976.
3. R. A. Brualdi, Introductory Combinatorics, North-Holland, New York, 1977.
4. E. W. Dijkstra, A Discipline of Programming, Prentice Hall, Englewood Cliffs, N.J.,
1976.

5. R. L. Graham, D. E. Knuth, and O. Patashnik, Concrete Mathematics, Addison-Wesley,
Reading, Mass., 1989.
6. D. Gries, The Science of Programming, Springer-Verlag, New York, 1981.
7. B. W. Kernighan and P. J. Plauger, The Elements of Programming Style, 2d ed., McGraw-
Hill, New York, 1978.
8. B. W. Kernighan and D. M. Ritchie, The C Programming Language, 2d ed., Prentice
Hall, Englewood Cliffs, N.J., 1988.
9. D. E. Knuth, The Art of Computer Programming, Vol. 1: Fundamental Algorithms, 2d
ed., Addison-Wesley, Reading, Mass., 1973.
10. F. S. Roberts, Applied Combinatorics, Prentice Hall, Englewood Cliffs, N.J., 1984.
11. A. Tucker, Applied Combinatorics, 2d ed., John Wiley & Sons, New York, 1984.
12. M. A. Weiss, Efficient C Programming: A Practical Approach, Prentice Hall, Englewood
Cliffs, N.J., 1995.
CHAPTER 2

Algorithm Analysis
An algorithm is a clearly specified set of simple instructions to be followed to solve
a problem. Once an algorithm is given for a problem and decided (somehow) to be
correct, an important step is to determine how much in the way of resources, such
as time or space, the algorithm will require. An algorithm that solves a problem but
requires a year is hardly of any use. Likewise, an algorithm that requires a gigabyte
of main memory is not (currently) useful on most machines.
In this chapter, we shall discuss

How to estimate the time required for a program.


How to reduce the running time of a program from days or years to fractions
of a second.
The results of careless use of recursion.

Very efficient algorithms to raise a number to a power and to compute the

greatest common divisor of two numbers.

2.1. Mathematical Background


The analysis required to estimate the resource use of an algorithm is generally a
theoretical issue, and therefore a formal framework is required. We begin with some
mathematical definitions.
Throughout the book we will use the following four definitions:

DEFINITION: T(N) = O(f(N)) if there are positive constants c and n_0 such that
T(N) <= c f(N) when N >= n_0.

DEFINITION: T(N) = Ω(g(N)) if there are positive constants c and n_0 such that
T(N) >= c g(N) when N >= n_0.

DEFINITION: T(N) = Θ(h(N)) if and only if T(N) = O(h(N)) and T(N) = Ω(h(N)).

DEFINITION: T(N) = o(p(N)) if T(N) = O(p(N)) and T(N) ≠ Θ(p(N)).

The idea of these definitions is to establish a relative order among functions. Given
two functions, there are usually points where one function is smaller than the other
function, so it does not make sense to claim, for instance, f(N) < g(N). Thus,
we compare their relative rates of growth. When we apply this to the analysis of
algorithms, we shall see why this is the important measure.
Although 1,000N is larger than N^2 for small values of N, N^2 grows at a
faster rate, and thus N^2 will eventually be the larger function. The turning point is
N = 1,000 in this case. The first definition says that eventually there is some point
n_0 past which c · f(N) is always at least as large as T(N), so that if constant factors
are ignored, f(N) is at least as big as T(N). In our case, we have T(N) = 1,000N,
f(N) = N^2, n_0 = 1,000, and c = 1. We could also use n_0 = 10 and c = 100.
Thus, we can say that 1,000N = O(N^2) (order N-squared). This notation is
known as Big-Oh notation. Frequently, instead of saying "order ...," one says
"Big-Oh ...."
If we use the traditional inequality operators to compare growth rates, then
the first definition says that the growth rate of T(N) is less than or equal to (<=)
that of f(N). The second definition, T(N) = Ω(g(N)) (pronounced "omega"), says
that the growth rate of T(N) is greater than or equal to (>=) that of g(N). The
third definition, T(N) = Θ(h(N)) (pronounced "theta"), says that the growth rate
of T(N) equals (=) the growth rate of h(N). The last definition, T(N) = o(p(N))
(pronounced "little-oh"), says that the growth rate of T(N) is less than (<) the
growth rate of p(N). This is different from Big-Oh, because Big-Oh allows the
possibility that the growth rates are the same.
To prove that some function T(N) = O(f(N)), we usually do not apply these
definitions formally but instead use a repertoire of known results. In general, this
means that a proof (or determination that the assumption is incorrect) is a very simple
calculation and should not involve calculus, except in extraordinary circumstances
(not likely to occur in an algorithm analysis).
When we say that T(N) = O(f(N)), we are guaranteeing that the function
T(N) grows at a rate no faster than f(N); thus f(N) is an upper bound on T(N).
Since this implies that f(N) = Ω(T(N)), we say that T(N) is a lower bound on
f(N).
As an example, N^3 grows faster than N^2, so we can say that N^2 = O(N^3)
or N^3 = Ω(N^2). f(N) = N^2 and g(N) = 2N^2 grow at the same rate, so both
f(N) = O(g(N)) and f(N) = Ω(g(N)) are true. When two functions grow at
the same rate, then the decision of whether or not to signify this with Θ() can
depend on the particular context. Intuitively, if g(N) = 2N^2, then g(N) = O(N^4),
g(N) = O(N^3), and g(N) = O(N^2) are all technically correct, but the last option
is the best answer. Writing g(N) = Θ(N^2) says not only that g(N) = O(N^2), but
also that the result is as good (tight) as possible.
The important things to know are

RULE 1:

If T_1(N) = O(f(N)) and T_2(N) = O(g(N)), then
(a) T_1(N) + T_2(N) = max( O(f(N)), O(g(N)) ),
(b) T_1(N) * T_2(N) = O( f(N) * g(N) ).

Function      Name

c             Constant
log N         Logarithmic
log^2 N       Log-squared
N             Linear
N log N
N^2           Quadratic
N^3           Cubic
2^N           Exponential

Figure 2.1 Typical growth rates

RULE 2:

If T(N) is a polynomial of degree k, then T(N) = Θ(N^k).

RULE 3:

log^k N = O(N) for any constant k. This tells us that logarithms grow very
slowly.

This information is sufficient to arrange most of the common functions by
growth rate (see Figure 2.1).
Several points are in order. First, it is very bad style to include constants or low-
order terms inside a Big-Oh. Do not say T(N) = O(2N^2) or T(N) = O(N^2 + N).
In both cases, the correct form is T(N) = O(N^2). This means that in any analysis
that will require a Big-Oh answer, all sorts of shortcuts are possible. Lower-order
terms can generally be ignored, and constants can be thrown away. Considerably
less precision is required in these cases.

Second, we can always determine the relative growth rates of two functions f(N)
and g(N) by computing lim_{N→∞} f(N)/g(N), using L'Hôpital's rule if necessary.*
The limit can have four possible values:

The limit is 0: This means that f(N) = o(g(N)).
The limit is c ≠ 0: This means that f(N) = Θ(g(N)).
The limit is ∞: This means that g(N) = o(f(N)).
The limit oscillates: There is no relation (this will not happen in our context).

Using this method almost always amounts to overkill. Usually the relation between
f(N) and g(N) can be derived by simple algebra. For instance, if f(N) = N log N
and g(N) = N^1.5, then to decide which of f(N) and g(N) grows faster, one really
needs to determine which of log N and N^0.5 grows faster. This is like determining

*L'Hôpital's rule states that if lim_{N→∞} f(N) = ∞ and lim_{N→∞} g(N) = ∞, then lim_{N→∞} f(N)/g(N) =
lim_{N→∞} f'(N)/g'(N), where f'(N) and g'(N) are the derivatives of f(N) and g(N), respectively.

which of log^2 N or N grows faster. This is a simple problem, because it is already
known that N grows faster than any power of a log. Thus, g(N) grows faster than
f(N).
One stylistic note: It is bad to say f(N) <= O(g(N)), because the inequality is
implied by the definition. It is wrong to write f(N) >= O(g(N)), which does not
make sense.

2.2. Model

In order to analyze algorithms in a formal framework, we need a model of
computation. Our model is basically a normal computer, in which instructions are
computation. Our model is basically a normal computer, in which instructions are
executed sequentially. Our model has the standard repertoire of simple instructions,
such as addition, multiplication, comparison, and assignment, but, unlike the case

with real computers, it takes exactly one time unit to do anything (simple). To be
reasonable, we will assume that, like a modern computer, our model has fixed-size
(say, 32-bit) integers and that there are no fancy operations, such as matrix inversion
or sorting, that clearly cannot be done in one time unit. We also assume infinite

memory.
This model clearly has some weaknesses. Obviously, in real life, not all operations
take exactly the same time. In particular, in our model one disk read counts
the same as an addition, even though the addition is typically several orders of
magnitude faster. Also, by assuming infinite memory, we never worry about page
faulting, which can be a real problem, especially for efficient algorithms.

2.3. What to Analyze


The most important resource to analyze is generally the running time. Several factors
affect the running time of a program. Some, such as the compiler and computer
used, are obviously beyond the scope of any theoretical model, so, although they are
important, we cannot deal with them here. The other main factors are the algorithm
used and the input to the algorithm.
Typically, the size of the input is the main consideration. We define two
functions, Tavg(N) and Tworst(N), as the average and worst-case running time,
respectively, used by an algorithm on input of size N. Clearly, Tavg(N) <= Tworst(N).
If there is more than one input, these functions may have more than one argument.

Generally, the quantity required is the worst-case time, unless otherwise specified.
One reason for this is that it provides a bound for all input, including
particularly bad input, which an average-case analysis does not provide. The other
reason is that average-case bounds are usually much more difficult to compute. In
some instances, the definition of "average" can affect the result. (For instance, what
is average input for the following problem?)



As an example, in the next section, we shall consider the following problem:

MAXIMUM SUBSEQUENCE SUM PROBLEM:

Given (possibly negative) integers A_1, A_2, ..., A_N, find the maximum value
of Σ_{k=i}^{j} A_k. (For convenience, the maximum subsequence sum is 0 if all the
integers are negative.)

Example:

For input -2, 11, -4, 13, -5, -2, the answer is 20 (A_2 through A_4).
This problem is interesting mainly because there are so many algorithms to
solve it, and the performance of these algorithms varies drastically. We will discuss
four algorithms to solve this problem. The running time on some computer (the
exact computer is unimportant) for these algorithms is given in Figure 2.2.
There are several important things worth noting in this table. For a small
amount of input, the algorithms all run in a blink of the eye, so if only a small
amount of input is expected, it might be silly to expend a great deal of effort to
design a clever algorithm. On the other hand, there is a large market these days
for rewriting programs that were written five years ago based on a no-longer-valid
assumption of small input size. These programs are now too slow, because they used
poor algorithms. For large amounts of input, algorithm 4 is clearly the best choice
(although algorithm 3 is still usable).
Second, the times given do not include the time required to read the input. For
algorithm 4, the time merely to read in the input from a disk is likely to be an order
of magnitude larger than the time required to solve the problem. This is typical of
many efficient algorithms. Reading the data is generally the bottleneck; once the
data are read, the problem can be solved quickly. For inefficient algorithms this
is not true, and significant computer resources must be used. Thus it is important
that, whenever possible, algorithms be efficient enough not to be the bottleneck of a
problem.
Figure 2.3 shows the growth rates of the running times of the four algorithms.
Even though this graph encompasses only values of N ranging from 10 to 100, the
relative growth rates are still evident. Although the graph for algorithm 3 seems
linear, it is easy to verify that it is not by using a straight-edge (or piece of paper).
Figure 2.4 shows the performance for larger values. It dramatically illustrates how
useless inefficient algorithms are for even moderately large amounts of input.

Figure 2.2 Running times of several algorithms for maximum subsequence sum
(in seconds)

        Algorithm       1          2          3             4
        Time            O(N^3)     O(N^2)     O(N log N)    O(N)

Input   N = 10          0.00103    0.00045    0.00066       0.00034
Size    N = 100         0.47015    0.01112    0.00486       0.00063
        N = 1,000       448.77     1.1233     0.05843       0.00333
        N = 10,000      NA         111.13     0.68631       0.03042
        N = 100,000     NA         NA         8.0113        0.29832

Figure 2.3 Plot (N vs. milliseconds) of various maximum subsequence sum
algorithms [curves for Alg 1, O(N^3); Alg 2, O(N^2); Alg 3, O(N log N);
Alg 4, O(N); N from 0 to 100]

Figure 2.4 Plot (N vs. seconds) of various maximum subsequence sum algorithms
[curves for Alg 1, O(N^3); Alg 2, O(N^2); Alg 3, O(N log N); Alg 4, O(N);
N from 0 to 10,000]

2.4. Running Time Calculations

There are several ways to estimate the running time of a program. The previous
table was obtained empirically. If two programs are expected to take similar times,
probably the best way to decide which is faster is to code them both up and run
them!

Generally, there are several algorithmic ideas, and we would like to eliminate
the bad ones early, so an analysis is usually required. Furthermore, the ability to do
an analysis usually provides insight into designing efficient algorithms. The analysis
also generally pinpoints the bottlenecks, which are worth coding carefully.
To simplify the analysis, we will adopt the convention that there are no particular
units of time. Thus, we throw away leading constants. We will also throw away
low-order terms, so what we are essentially doing is computing a Big-Oh running
time. Since Big-Oh is an upper bound, we must be careful never to underestimate the
running time of the program. In effect, the answer provided is a guarantee that the
program will terminate within a certain time period. The program may stop earlier
than this, but never later.

2.4.1. A Simple Example

Here is a simple program fragment to calculate Σ_{i=1}^{N} i^3:

int
Sum( int N )
{
    int i, PartialSum;

/* 1*/     PartialSum = 0;
/* 2*/     for( i = 1; i <= N; i++ )
/* 3*/         PartialSum += i * i * i;
/* 4*/     return PartialSum;
}

The analysis of this program is simple. The declarations count for no time.
Lines 1 and 4 count for one unit each. Line 3 counts for four units per time executed
(two multiplications, one addition, and one assignment) and is executed N times,
for a total of 4N units. Line 2 has the hidden costs of initializing i, testing i <= N,
and incrementing i. The total cost of all these is 1 to initialize, N + 1 for all the
tests, and N for all the increments, which is 2N + 2. We ignore the costs of calling
the function and returning, for a total of 6N + 4. Thus, we say that this function is
O(N).
If we had to perform all this work every time we needed to analyze a program,
the task would quickly become infeasible. Fortunately, since we are giving the
answer in terms of Big-Oh, there are lots of shortcuts that can be taken without
affecting the final answer. For instance, line 3 is obviously an 0(1) statement (per

execution), so it is silly to count precisely whether it is two, three, or four units; it


does not matter. Line 1 is obviously insignificant compared with the for loop, so it
is silly to waste time here. This leads to several general rules.

2.4.2. General Rules

RULE 1—FOR LOOPS:

The running time of a for loop is at most the running time of the statements
inside the for loop (including tests) times the number of iterations.

RULE 2—NESTED FOR LOOPS:

Analyze these inside out. The total running time of a statement inside a group
of nested loops is the running time of the statement multiplied by the product
of the sizes of all the for loops.

As an example, the following program fragment is O(N^2):

for( i = 0; i < N; i++ )
    for( j = 0; j < N; j++ )
        k++;
RULE 3—CONSECUTIVE STATEMENTS:

These just add (which means that the maximum is the one that counts; see
rule 1(a) on page 16).

As an example, the following program fragment, which has O(N) work followed
by O(N^2) work, is also O(N^2):

for( i = 0; i < N; i++ )
    A[ i ] = 0;
for( i = 0; i < N; i++ )
    for( j = 0; j < N; j++ )
        A[ i ] += A[ j ] + i + j;

RULE 4—IF/ELSE:

For the fragment

if( Condition )
    S1
else
    S2

the running time of an if/else statement is never more than the running time of
the test plus the larger of the running times of S1 and S2.

Clearly, this can be an overestimate in some cases, but it is never an underestimate.


Other rules are obvious, but a basic strategy of analyzing from the inside (or
deepest part) out works. If there are function calls, these must be analyzed first. If
there are recursive procedures, there are several options. If the recursion is really
just a thinly veiled for loop, the analysis is usually trivial. For instance, the following
function is really just a simple loop and is O(N):

long int
Factorial( int N )
{
    if( N <= 1 )
        return 1;
    else
        return N * Factorial( N - 1 );
}
Chapter 2 Algorithm Analysis 23

This example is really a poor use of recursion. When recursion is properly used,
it is difficult to convert the recursion into a simple loop structure. In this case, the
analysis will involve a recurrence relation that needs to be solved. To see what might
happen, consider the following program, which turns out to be a horrible use of
recursion:

long int
Fib( int N )
{
/* 1*/    if( N <= 1 )
/* 2*/        return 1;
          else
/* 3*/        return Fib( N - 1 ) + Fib( N - 2 );
}

At first glance, this seems like a very clever use of recursion. However, if the
program is coded up and run for values of N around 30, it becomes apparent that
this program is terribly inefficient. The analysis is fairly simple. Let T(N) be the
running time for the function Fib(N). If N = 0 or N = 1, then the running time is
some constant value, which is the time to do the test at line 1 and return. We can
say that T(0) = T(1) = 1 because constants do not matter. The running time for
other values of N is then measured relative to the running time of the base case. For
N ≥ 2, the time to execute the function is the constant work at line 1 plus the work
at line 3. Line 3 consists of an addition and two function calls. Since the function
calls are not simple operations, they must be analyzed by themselves. The first
function call is Fib(N - 1) and hence, by the definition of T, requires T(N - 1) units
of time. A similar argument shows that the second function call requires T(N - 2)
units of time. The total time required is then T(N - 1) + T(N - 2) + 2, where the
2 accounts for the work at line 1 plus the addition at line 3. Thus, for N ≥ 2, we
have the following formula for the running time of Fib(N):

T(N) = T(N - 1) + T(N - 2) + 2
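The recurrence can be checked numerically. The following is a minimal sketch (the function name FibTime is made up for illustration) that tabulates T(N) iteratively from the base cases T(0) = T(1) = 1:

```c
/* Tabulate the recurrence T(N) = T(N-1) + T(N-2) + 2,
   with T(0) = T(1) = 1, using a simple loop.  This gives
   the exact operation count of the recursive Fib without
   actually paying its exponential cost. */
long
FibTime( int N )
{
    long Tprev = 1, Tcurr = 1;   /* T(0) and T(1) */
    int  i;

    for( i = 2; i <= N; i++ )
    {
        long Tnext = Tcurr + Tprev + 2;
        Tprev = Tcurr;
        Tcurr = Tnext;
    }
    return Tcurr;
}
```

For instance, FibTime(10) evaluates to 265, and the count roughly multiplies by the golden ratio with each increment of N, confirming the exponential growth derived next.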

Since Fib(N) = Fib(N - 1) + Fib(N - 2), it is easy to show by induction that
T(N) ≥ Fib(N). In Section 1.2.5, we showed that Fib(N) < (5/3)^N. A similar
calculation shows that (for N > 4) Fib(N) ≥ (3/2)^N, and so the running time of
this program grows exponentially. This is about as bad as possible. By keeping a
simple array and using a for loop, the running time can be reduced substantially.
This program is slow because there is a huge amount of redundant work being
performed, violating the fourth major rule of recursion (the compound interest rule),
which was presented in Section 1.3. Notice that the first call on line 3, Fib(N - 1),
actually computes Fib(N - 2) at some point. This information is thrown away
and recomputed by the second call on line 3. The amount of information thrown
away compounds recursively and results in the huge running time. This is perhaps
the finest example of the maxim “Don’t compute anything more than once” and
should not scare you away from using recursion. Throughout this book, we shall
see outstanding uses of recursion.
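The linear-time alternative just mentioned (keeping a simple array and using a for loop) can be sketched as follows. FastFib is an illustrative name, and the book's convention Fib(0) = Fib(1) = 1 is assumed:

```c
/* Iterative Fibonacci: each value is computed exactly once
   and stored in an array, so the running time is O(N)
   rather than exponential.  Valid for 0 <= N <= 90,
   assuming a 64-bit long (larger values overflow). */
long
FastFib( int N )
{
    long F[ 91 ];
    int  i;

    F[ 0 ] = F[ 1 ] = 1;
    for( i = 2; i <= N; i++ )
        F[ i ] = F[ i - 1 ] + F[ i - 2 ];
    return F[ N ];
}
```

Since each F[ i ] depends only on the two previous entries, the array could even be reduced to two scalar variables, bringing the space down to O(1) as well.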