Fundamentals of Python: First Programs
Third Edition

Kenneth A. Lambert
This is an electronic version of the print textbook. Due to electronic rights restrictions,
some third party content may be suppressed. Editorial review has deemed that any suppressed
content does not materially affect the overall learning experience. The publisher reserves the right
to remove content from this title at any time if subsequent rights restrictions require it. For
valuable information on pricing, previous editions, changes to current editions, and alternate
formats, please visit www.cengage.com/highered to search by ISBN#, author, title, or keyword for
materials in your areas of interest.
Important Notice: Media content referenced within the product description or the product
text may not be available in the eBook version.
Fundamentals of Python: First Programs, Third Edition
Kenneth A. Lambert

SVP, Product: Cheryl Costantini
VP, Product: Thais Alencar
Portfolio Product Director: Rita Lombard
Portfolio Product Manager: Tran Pham
Product Assistant: Anh Nguyen
VP, Product Marketing: Jason Sakos
Cover Image Source: Armagadon/shutterstock.com

© 2024 Cengage Learning, Inc. ALL RIGHTS RESERVED. Previous edition(s): © 2015, © 2012.
WCN: 02-300
No part of this work covered by the copyright herein may be reproduced or distributed in any form or by any means, except as permitted by U.S. copyright law, without the prior written permission of the copyright owner.
Unless otherwise noted, all content is Copyright © Cengage Learning, Inc.
The names of all products mentioned herein are used for identification purposes only and may be trademarks or registered trademarks of their respective owners. Cengage Learning disclaims any affiliation, association, connection with, sponsorship, or endorsement by such owners.
Library of Congress Control Number: 2023904384
Cengage is a leading provider of customized learning solutions. Our employees reside in nearly 40 different countries and serve digital learners in 165 countries around the world. Find your local representative at: www.cengage.com.
Brief Contents
About the Author xi
Preface xiii
Chapter 1 Introduction 1
Glossary 439
Index 453
Contents
About the Author xi
Preface xiii
Chapter 1
Introduction 1
1.1 Two Fundamental Ideas of Computer Science: Algorithms and Information Processing 1
    Algorithms 2
    Information Processing 3
1.2 The Structure of a Modern Computer System 4
    Computer Hardware 4
    Personal Computing and Networks (1975–1990) 12
    Consultation, Communication, and E-Commerce (1990–2000) 14
    Mobile Applications and Ubiquitous Computing (2000–present) 15
1.4 Getting Started with Python Programming 16
    Running Code in the Interactive Shell 16
Chapter 2
Software Development, Data Types, and Expressions 27
2.1 The Software Development Process 27
2.2 Strings, Assignment, and Comments 32
    Data Types 33
    String Literals 33
    Escape Sequences 34
    String Concatenation 34
    Variables and the Assignment Statement 35
    Program Comments and Docstrings 36
2.3 Numeric Data Types and Character Sets 37
    Integers 37
    Floating-Point Numbers 37
    Character Sets 38
2.4 Expressions 39
    Arithmetic Expressions 39
    Mixed-Mode Arithmetic and Type Conversions 41
2.5 Using Functions and Modules 43
    Calling Functions: Arguments and Return Values 43
Chapter 3
Loops and Selection Statements 53
3.1 Definite Iteration: The for Loop 53
    Executing a Statement a Given Number of Times 54
    Count-Controlled Loops 55
    Augmented Assignment 56
    Loop Errors: Off-by-One Error 56
    Traversing the Contents of a Data Sequence 57
    Specifying the Steps in the Range 57
    Logical Operators and Compound Boolean Expressions 68
    Short-Circuit Evaluation 70
    Testing Selection Statements 70
3.4 Conditional Iteration: The while Loop 71
    The Structure and Behavior of a while Loop 72
Chapter 4
Strings and Text Files 87
4.1 Accessing Characters and Substrings in Strings 87
    The Structure of Strings 87
    The Subscript Operator 88
    Slicing for Substrings 89
    Testing for a Substring with in Operator 90
4.2 Data Encryption 91
4.3 Strings and Number Systems 93
    The Positional System for Representing Numbers 93
    Converting Binary to Decimal 94
    Converting Decimal to Binary 95
    Conversion Shortcuts 95
    Octal and Hexadecimal Numbers 96
4.4 String Methods 97
4.5 Text Files 100
    Text Files and Their Format 100
    Writing Text to a File 101
    Writing Numbers to a File 101
    Reading Text from a File 101
Chapter 5
Lists and Dictionaries 115
5.1 Lists 115
    List Literals and Basic Operators 116
    Replacing an Element in a List 118
    List Methods for Inserting and Removing Elements 119
    Searching a List 120
    Sorting a List 121
    Mutator Methods and the Value None 121
    Aliasing and Side Effects 122
    Equality: Object Identity and Structural Equivalence 123
    Example: Using a List to Find the Median of a Set of Numbers 123
    Tuples 124
5.2 Defining Simple Functions 126
    The Syntax of Simple Function Definitions 126
    Parameters and Arguments 127
    The return Statement 127
    Boolean Functions 127
    Defining a main Function 127
5.3 Dictionaries 132
    Dictionary Literals 132
    Adding Keys and Replacing Values 132
    Accessing Values 133
    Removing Keys 133
    Traversing a Dictionary 133
    Example: The Hexadecimal System Revisited 134
    Example: Finding the Mode of a List of Values 135
Summary 141
Key Terms 142
Review Questions 142
Programming Exercises 143
Debugging Exercise 144
Chapter 6
Design with Functions 145
6.1 A Quick Review of What Functions Are and How They Work 145
    Functions as Abstraction Mechanisms 146
    Functions Eliminate Redundancy 146
6.3 Managing a Program’s Namespace 156
    Module Variables, Parameters, and Temporary Variables 156
Chapter 7
Design with Recursion 165
7.1 Design with Recursive Functions 165
    Defining a Recursive Function 166
    Recursive Algorithms 167
    Tracing a Recursive Function 167
    Using Recursive Definitions to Construct Recursive Functions 168
    Recursion in Sentence Structure 168
    Infinite Recursion 170
    The Costs and Benefits of Recursion 170
7.2 Higher-Order Functions 179
    Functions as First-Class Data Objects 179
    Mapping 180
    Filtering 181
    Reducing 181
    Using lambda to Create Anonymous Functions 182
    Creating Jump Tables 182
Summary 185
Key Terms 185
Review Questions 186
Programming Exercises 187
Debugging Exercise 188
Chapter 8
Simple Graphics and Image Processing 189
8.1 Simple Graphics 189
    Overview of Turtle Graphics 190
    Turtle Operations 190
    Setting Up a turtle.cfg File and Running IDLE 192
    Object Instantiation and the turtle Module 192
    Drawing Two-Dimensional Shapes 194
    Examining an Object’s Attributes 195
    Manipulating a Turtle’s Screen 196
    Taking a Random Walk 196
    Colors and the RGB System 197
    Example: Filling Radial Patterns with Random Colors 198
8.2 Image Processing 203
    Analog and Digital Information 204
    Sampling and Digitizing Images 204
    Image File Formats 204
    Image-Manipulation Operations 205
    The Properties of Images 205
    The images Module 205
    A Loop Pattern for Traversing a Grid 208
    A Word on Tuples 209
    Converting an Image to Black and White 209
    Converting an Image to Grayscale 210
    Copying an Image 211
    Blurring an Image 212
    Edge Detection 213
    Reducing the Image Size 214
Summary 216
Key Terms 217
Review Questions 217
Programming Exercises 218
Debugging Exercise 221
Chapter 9
Graphical User Interfaces 223
9.1 The Behavior of Terminal-Based Programs and GUI-Based Programs 224
    The Terminal-Based Version 224
    The GUI-Based Version 225
    Event-Driven Programming 226
9.2 Coding Simple GUI-Based Programs 226
    A Simple “Hello World” Program 227
    A Template for All GUI Programs 228
    The Syntax of Class and Method Definitions 228
    Subclassing and Inheritance as Abstraction Mechanisms 229
9.3 Windows and Window Components 229
    Windows and Their Attributes 230
    Window Layout 230
    Types of Window Components and Their Attributes 232
    Displaying Images 233
9.4 Command Buttons and Responding to Events 235
9.5 Input and Output with Entry Fields 237
    Text Fields 237
    Integer and Float Fields for Numeric Data 238
9.6 Defining and Using Instance Variables 240
9.7 Other Useful GUI Resources 246
    Using Nested Frames to Organize Components 247
    Multiline Text Areas 248
    File Dialogs 250
    Obtaining Input with Prompter Boxes 252
    Check Buttons 253
    Radio Buttons 254
    Keyboard Events 256
    Working with Colors 256
    Using a Color Chooser 258
Summary 260
Key Terms 260
Review Questions 261
Programming Exercises 262
Debugging Exercise 263
Chapter 10
Design with Classes 265
10.1 Getting Inside Objects and Classes 266
    Rational Number Arithmetic and Operator Overloading 281
    Equality and the __eq__ Method 282
    The __repr__ Method for Printing an Object in IDLE 283
    Savings Accounts and Class Variables 284
    Putting the Accounts into a Bank 286
    Using pickle for Permanent Storage of Objects 288
    Input of Objects and the try-except Statement 289
    Example 2: The Dealer and a Player in the Game of Blackjack 308
    The Costs and Benefits of Object-Oriented Programming 312
Summary 315
Key Terms 316
Review Questions 317
Programming Exercises 318
Debugging Exercise 320
Chapter 11
Data Analysis and Visualization 321
11.1 Some Basic Functions for Analyzing a Data Set 322
    Computing the Maximum, Minimum, and Mean 323
    Computing the Median 323
    Computing the Mode and Modes 324
    Computing the Standard Deviation 325
    Using the NumPy Library 326
11.2 Visualizing a Data Set 333
    Pie Charts 335
    Bar Charts 337
    Scatter Plots 339
    Line Plots 340
    Histograms 342
11.3 Working with More Complex Data Sets 343
    Creating a Data Set with pandas 344
    Visualizing Data with pandas and matplotlib.pyplot 344
    Accessing Columns and Rows in a Data Frame 345
    Creating a Data Frame from a CSV File 346
    Cleaning the Data in a Data Frame 347
    Accessing Other Attributes of a Data Frame 348
Summary 354
Key Terms 354
Review Questions 354
Programming Exercises 355
Chapter 12
Multithreading, Networks, and Client/Server Programming 357
12.1 Threads and Processes 357
    Threads 358
    Sleeping Threads 360
    Producer, Consumer, and Synchronization 362
12.2 The Readers and Writers Problem 368
    Implementing the Interface of the SharedCell Class 370
    Implementing the Helper Methods of the SharedCell Class 371
    Testing the SharedCell Class with a Counter Object 372
    Defining a Thread-Safe Class 373
12.3 Networks, Clients, and Servers 374
    IP Addresses 374
    Ports, Servers, and Clients 375
    Sockets and a Day/Time Client Script 375
    A Day/Time Server Script 377
    Handling Multiple Clients Concurrently 380
Summary 388
Key Terms 389
Review Questions 389
Programming Exercises 390
Debugging Exercise 392
Chapter 13
Searching, Sorting, and Complexity Analysis 393
13.1 Measuring the Efficiency of Algorithms 394
    Measuring the Run Time of an Algorithm 394
    Counting Instructions 396
13.2 Complexity Analysis 398
    Orders of Complexity 398
    Big-O Notation 400
Appendix A
Python Resources 429
Appendix B
Installing the images and breezypythongui Libraries 431
Appendix C
The API for Image Processing 433
Appendix D
Transition from Python to Java and C++ 435
Appendix E
Suggestions for Further Reading 437
Glossary 439
Index 453
About the Author
Kenneth A. Lambert is Professor of Computer Science Emeritus at Washington and Lee University. He taught
introductory programming courses for 37 years and has been an active researcher in computer science education. Lambert has co-authored a series of introductory C++ textbooks with Douglas Nance and Thomas Naps
and a series of introductory Java textbooks with Martin Osborne. He is the author of Python textbooks for CS1
and CS2 college-level courses and a Python textbook for teens. He is also the co-creator of the BreezySwing
framework and is the creator of the breezypythongui framework.
Dedication
To my grandchildren—Lucy, Wyatt, Cuba, and Van
Kenneth A. Lambert
Lexington, VA
Preface
“Everyone should learn how to code.” That’s my favorite quote from Suzanne Keen, formerly the Thomas
Broadus Professor of English and Dean of the College at Washington and Lee University, where I have taught
computer science for more than 30 years. The quote also states the reason why I wrote the first and second
editions of Fundamentals of Python: First Programs, and why I now offer you this third edition. The book is
intended for an introductory course in programming and problem solving. It covers the material taught in a
typical Computer Science 1 (CS1) course at the undergraduate or high school level.
Why Python?
Computer technology and applications have become increasingly more sophisticated over the past three
decades, and so has the computer science curriculum, especially at the introductory level. Today’s students
learn a bit of programming and problem solving and are then expected to move quickly into topics like software
development, complexity analysis, and data structures that 35 years ago were relegated to advanced courses.
In addition, the ascent of object-oriented programming as the dominant paradigm of problem solving has led
instructors and textbook authors to implant powerful, industrial-strength programming languages such as
C++ and Java in the introductory curriculum. As a result, instead of experiencing the rewards and excitement of solving
problems with computers, beginning computer science students often become overwhelmed by the combined tasks
of mastering advanced concepts as well as the syntax of a programming language.
This book uses the Python programming language as a way of making the first year of studying computer science
more manageable and attractive for students and instructors alike. Python has the following pedagogical benefits:
• Python has simple, conventional syntax. Python statements are very close to those of pseudocode algorithms,
and Python expressions use the conventional notation found in algebra. Thus, students can spend less time
learning the syntax of a programming language and more time learning to solve interesting problems.
• Python has safe semantics. Any expression or statement whose meaning violates the definition of the language
produces an error message.
• Python scales well. It is very easy for beginners to write simple programs in Python. Python also includes all
of the advanced features of a modern programming language, such as support for data structures and object-
oriented software development, for use when they become necessary.
• Python is highly interactive. Expressions and statements can be entered at an interpreter’s prompts to allow the
programmer to try out experimental code and receive immediate feedback. Longer code segments can then be
composed and saved in script files to be loaded and run as modules or standalone applications.
• Python is general purpose. In today’s context, this means that the language includes resources for
contemporary applications, including media computing and networks.
• Python is free and is in widespread use in industry. Students can download Python to run on a variety of
devices. There is a large Python user community, and expertise in Python programming has great résumé value.
To summarize these benefits, Python is a comfortable and flexible vehicle for expressing ideas about computation,
both for beginners and for experts. If students learn these ideas well in the first course, they should have no problems
making a quick transition to other languages needed for courses later in the curriculum. Most importantly, beginning
students will spend less time staring at a computer screen and more time thinking about interesting problems to solve.
Chapter 1 introduces computer science by focusing on two fundamental ideas, algorithms and information processing. A brief overview of computer hardware and software, followed by an extended discussion of the history of
computing, sets the context for computational problem solving.
Chapters 2 and 3 cover the basics of problem solving and algorithm development using the standard control
structures of expression evaluation, sequencing, Boolean logic, selection, and iteration with the basic numeric data
types. Emphasis in these chapters is on problem solving that is both systematic and experimental, involving algorithm
design, testing, and documentation.
Chapters 4 and 5 introduce the use of strings, text files, lists, and dictionaries. These data structures are both
remarkably easy to manipulate in Python and support some interesting applications. Chapter 5 also introduces simple
function definitions as a way of organizing algorithmic code.
Chapter 6 explores the technique and benefits of procedural abstraction with function definitions. Top-down design
and stepwise refinement with functions are examined as means of structuring code to solve complex problems. Details
of namespace organization (parameters, temporary variables, and module variables) and communication among software components are discussed.
Chapter 7 examines recursive design with functions. A section on functional programming with higher-order functions shows how to exploit functional design patterns to simplify solutions.
Chapter 8 focuses on the use of existing objects and classes to compose programs. Special attention is paid to
the application programming interface (API), or set of methods, of a class of objects and the manner in which objects
cooperate to solve problems. This chapter also introduces two contemporary applications of computing: graphics and
image processing. These are areas in which object-based programming is particularly useful.
Chapter 9 introduces the definition of new classes to construct graphical user interfaces (GUIs). The chapter
contrasts the event-driven model of GUI programs with the process-driven model of terminal-based programs. The
chapter explores the creation and layout of GUI components, as well as the design of GUI-based applications using
the model/view pattern. The initial approach to defining new classes in this chapter is unusual for an introductory
textbook: students learn that the easiest way to define a new class is to customize an existing class using subclassing
and inheritance.
Chapter 10 continues the exploration of object-oriented design with the definition of entirely new classes. Several
examples of simple class definitions from different application domains are presented. Some of these are then integrated into more realistic applications to show how object-oriented software components can be used to build complex
systems. Emphasis is on designing appropriate interfaces for classes that exploit polymorphism.
Chapter 11 introduces tools and techniques for performing data analysis, a fast-growing application area of computer science. Topics include the acquisition and cleaning of data sets, applying functions to determine relationships
among data, and deploying graphs, plots, and charts to visualize these relationships.
Chapter 12 covers advanced material related to several important areas of computing: concurrent programming,
networks, and client/server applications. This chapter thus gives students challenging experiences near the end of the
first course. This chapter introduces multithreaded programs and the construction of simple network-based client/
server applications.
Chapter 13 covers some topics addressed at the beginning of a traditional CS2 course. This chapter introduces
complexity analysis with big-O notation. Enough material is presented to enable you to perform simple analyses of
the running time and memory usage of algorithms and data structures, using search and sort algorithms as examples.
The third edition includes the following new features and updates:
• A new chapter (Chapter 7) on design with recursion. This chapter incorporates and expands on material on
recursive functions and higher-order functions from Chapter 6 of the second edition.
• A new chapter (Chapter 11) on data analysis and visualization. This chapter introduces tools and techniques
for acquiring data sets, cleaning them, and applying functions to them to determine relationships which can be
visualized in plots, charts, and graphs.
• Updated coverage of the history of computing in Chapter 1.
• New fail-safe programming sections added to most chapters to demonstrate best practices for programming
securely.
• New list of key terms in each chapter.
• Updated end-of-chapter review questions and programming exercises.
• End-of-chapter programming exercises mapped to the learning objectives for each chapter.
• New debugging exercises in each chapter provide examples of challenging programming errors and give you
experience in diagnosing and correcting them.
• Several new case studies as well as new or updated programming exercises.
• Text revisions throughout with a focus on readability.
Each chapter includes the following features to support learning:
• Chapter Objectives: Each chapter begins with a set of learning objectives that describe the skills and
concepts you will acquire from a careful reading of the chapter.
• Chapter Summary: Each chapter ends with a summary of the major concepts covered in the chapter.
• Key Terms: When a technical term is introduced in the text, it appears in boldface. The list of terms appears
after the chapter summary. Definitions of the key terms are provided in the glossary.
Exercise
Exercises: Most major sections of each chapter end with exercise questions that reinforce the reading by asking
basic questions about the material in the section.
Case Study
Case Studies: The Case Studies present complete Python programs ranging from the simple to the
substantial. To emphasize the importance and usefulness of the software development life cycle, case
studies are discussed in the framework of a user request, followed by analysis, design, implementation, and
suggestions for testing, with well-defined tasks performed at each stage. Some case studies are extended in
end-of-chapter programming exercises.
Fail-Safe Programming
Fail-Safe Programming: Fail-Safe Programming sections include a discussion of ways to make a program
detect and respond gracefully to disturbances in its runtime environment.
Review Questions
Review Questions: Multiple-choice review questions allow you to revisit the concepts presented in each
chapter.
Programming Exercises
Programming Exercises: Each chapter ends with a set of programming projects of varying difficulty. Each
programming exercise is mapped to one or more relevant chapter learning objectives and gives you the
opportunity to design and implement a complete program that utilizes major concepts presented in that
chapter.
Debugging Exercises
Debugging Exercises: Debugging exercises illustrate a typical program error with suggestions for repairing it.
Additional resources provided with the book include the following:
• A software toolkit for image processing: This book comes with an open-source Python toolkit for the easy
image processing discussed in Chapter 8. The toolkit can be obtained with the ancillaries at www.cengage.com
or at https://kennethalambert.com/python/
• A software toolkit for GUI programming: This book comes with an open-source Python toolkit for the easy GUI
programming introduced in Chapter 9. The toolkit can be obtained with the ancillaries at www.cengage.com or
at https://kennethalambert.com/breezypythongui/
• Appendices: Five appendices include information on obtaining Python resources, installing the toolkits, using
the toolkits’ interfaces, and suggestions for further reading.
• Glossary: Definitions of key terms are collected in a glossary.
Cengage’s instructional materials seek to affirm the fullness of human diversity with respect to ability, language, culture, gender, age, socioeconomics, and other forms of human difference that students may bring to the classroom.
Across the computing industry, standard coding language such as “Master” and “Slave” is being retired in favor of language that is more inclusive, such as “Supervisor/Worker,” “Primary/Replica,” or “Leader/Follower.” Different software development and social media companies are adopting their own replacement language, and there is currently no shared standard. In addition, the terms “Master” and “Slave” remain deeply embedded in legacy code, and understanding this terminology remains necessary for new programmers. When required for understanding, Cengage will introduce the non-inclusive term in the first instance but will then provide appropriate replacement terminology for the remainder of the discussion or example. We appreciate your feedback as we work to make our products more inclusive for all.
For more information about Cengage’s commitment to inclusivity and diversity, please visit https://www.cengage.com/inclusion-diversity/
Course Solutions
Online Learning Platform: MindTap
Today’s leading online learning platform, MindTap for Fundamentals of Python, Third Edition, provides complete control to craft a personalized, engaging learning experience that challenges students, builds confidence, and elevates
performance.
MindTap introduces students to core concepts from the beginning of the course, using a simplified learning path
that progresses from understanding to application and delivers access to eTextbooks, study tools, interactive media,
auto-graded assessments, and performance analytics.
MindTap activities for Fundamentals of Python: First Programs are designed to help students build the skills needed
in today’s workforce. Research shows employers seek critical thinkers, troubleshooters, and creative problem-solvers
to stay relevant in our fast-paced, technology-driven world. MindTap achieves this with assignments and activities
that provide hands-on practice and real-life relevance. Students are guided through assignments that reinforce basic
knowledge and understanding before moving on to more challenging problems.
All MindTap activities and assignments are tied to defined chapter learning objectives. Hands-on coding labs pro-
vide real-life application and practice. Readings and dynamic visualizations support the lecture, while a post-course
assessment measures exactly how much a student has learned. MindTap provides the analytics and reporting to easily
see where the class stands in terms of progress, engagement, and completion rates. The content and learning path
can be used as provided, customized directly in the MindTap platform, or integrated into the Learning Management
System (LMS) to meet the needs of a particular course. Instructors can control what students see and when they see
it. Learn more at https://www.cengage.com/mindtap.
In addition to the readings, the MindTap for Fundamentals of Python: First Programs, Third Edition includes the following:
• Coding labs. These supplemental assignments provide real-world application and encourage students to
practice new programming concepts in a complete online IDE. New and improved Guided Feedback provides
personalized and immediate feedback to students as they proceed through their coding assignments so that
they can understand and correct errors in their code.
• Gradeable assessments and activities. All assessments and activities from the readings are available as
gradeable assignments within MindTap, including Exercises and Review Questions.
• Video quizzes. These graded assessments provide a visual explanation of foundational programming concepts
that can be applied across multiple languages. Questions accompany each video to confirm understanding of
new material.
• Interactive activities. These embedded interactive flowcharts, tabbed explorations, and click-to-reveal
experiences are designed to engage students and help them assess their understanding of introductory
computer science concepts as they progress through their chapter readings.
• Interactive study aids. Flashcards and PowerPoint lectures help users review main concepts from the units.
Supplemental Package
Instructor and Student Resources
Additional instructor and student resources for this product are available online.
Instructor assets include an Instructor’s Manual, Educator’s Guide, PowerPoint® slides, and a test bank powered
by Cognero®. Student assets include data sets. Sign up or sign in at www.cengage.com to search for and access this
product and its online resources.
• Instructor Manual. The Instructor Manual that accompanies this textbook includes additional instructional
material to assist in class preparation, including items such as Overviews, Chapter Objectives, Teaching Tips,
Quick Quizzes, Class Discussion Topics, Additional Projects, Additional Resources, and Key Terms.
• Test Bank. Cengage Testing Powered by Cognero is a flexible, online system that allows you to:
■ Author, edit, and manage test bank content from multiple Cengage solutions.
■ Create multiple test versions in an instant.
■ Deliver tests from your LMS, your classroom, or wherever you want.
• PowerPoint Presentations. This text provides PowerPoint slides to accompany each chapter. Slides may
be used to guide classroom presentations, to make available to students for chapter review, or to print as
classroom handouts. Files are provided for every figure in the text. Instructors may use the files to customize
PowerPoint slides, illustrate quizzes, or create handouts.
• Solution and Answer Guide. Solutions and rationales to review questions and exercises are provided to assist
with grading and student understanding.
• Solutions. Solutions to all programming exercises and case studies are available. If an input file is needed to run
a programming exercise, it is included with the solution file.
• Data Files. Data files necessary to complete some of the steps in the programming exercises are available. If an
input file is needed to run a program, it is included with the source code.
• Educator’s Guide. The Educator’s Guide contains a detailed outline of the corresponding MindTap course.
• Transition Guide. The Transition Guide outlines information on what has changed from the Second Edition.
Supplements can be found at https://faculty.cengage.com/. Sign In or create an account, then search for this title.
You can save the title for easy access and then download the resources that you need.
Acknowledgments
I would like to thank my good friend, Martin Osborne, for many years of advice, friendly criticism, and encouragement on several of my book projects. I am also grateful to the many students and faculty colleagues at Washington
and Lee University who have used earlier editions of this book and given helpful feedback on it over the life of those
editions.
In addition, I would like to thank the following reviewers for the time and effort they contributed to Fundamentals of Python: Eric Williamson, Liberty University, and Jason Carman, Horry-Georgetown Technical College.
Thank you also to Danielle Shaw, who helped to ensure that the content of all data and solution files used for this text was correct and accurate.
Finally, thanks to the individuals at Cengage who made this book possible: Tran Pham, Product Manager; Mary Convertino, Learning Designer; Michelle Ruelos Cannistraci, Senior Content Manager; Troy Dundas, Technical Content Developer; Spencer Peppet, Developmental Editor; Ann Shaffer, Developmental Editor; and Ethan Wheel, Product Assistant.
Chapter 1
Introduction
Learning Objectives
When you complete this chapter, you will be able to:
› 1.1 Describe the basic features of an algorithm
› 1.2 Explain how hardware and software collaborate in a computer’s architecture
› 1.3 Summarize a brief history of computing
› 1.4 Compose and run a simple Python program
As a reader of this book, you almost certainly have played a video game and listened to digital music. It’s likely
that you have watched a movie on Netflix after preparing a snack in a microwave. Chances are that today you
will make a phone call, send or receive a text message, take a photo, or consult your favorite social network
on a smartphone, which is a small computer. You and your friends have most likely used a desktop or laptop
computer to do significant coursework in high school or college.
Computer technology is almost everywhere: in our homes, schools, and in the places where we work
and play. Computer technology is essential to modern entertainment, education, medicine, manufacturing,
communications, government, and commerce. We have digital lifestyles in an information-based economy.
Some people even claim that nature itself performs computations on information structures present in DNA
and in the relationships among subatomic particles.
In the following chapters you will learn about computer science, which is the study of computation that has
made this new technology and this new world possible. You will also learn how to use computers effectively
and appropriately to enhance your own life and the lives of others.
1.1 Two Fundamental Ideas of Computer Science: Algorithms and Information Processing
Like most areas of study, computer science focuses on a broad set of interrelated ideas. Two of the most basic
ones are algorithms and information processing. In this section, these ideas are introduced in an informal way.
You will examine them in more detail in later chapters.
Algorithms
People computed long before the invention of modern computing devices, and many continue to use devices that we
might consider primitive. For example, consider how merchants made change for customers in marketplaces before
the existence of credit cards, pocket calculators, or cash registers. Making change can be a complex activity. It takes
some mental effort to get it right every time. Let’s consider what’s involved in this process.
According to one method, the first step is to compute the difference between the purchase price and the amount
of money that the customer gives the merchant. The result of this calculation is the total amount that the merchant
must return to the purchaser. For example, if you buy a dozen eggs at the farmers’ market for $2.39 and you give the
farmer a $10 bill, she should return $7.61 to you. To produce this amount, the merchant selects the appropriate coins
and bills that add up to $7.61.
According to another method, the merchant starts with the purchase price and goes toward the amount
given. First, coins are selected to bring the price to the next dollar amount (in this case, $0.61 = 2 quarters, 1
dime, and 1 penny), then dollars are selected to bring the price to the next five-dollar amount (in this case,
$2), and then, in this case, a $5 bill completes the transaction. As you will see in this book, there can be many
possible methods or algorithms that solve the same problem, and the choice of the best one is a skill you will
acquire with practice.
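As a rough illustration of how such a procedure might eventually be automated, here is a minimal Python sketch, not taken from the text, of the first method: compute the difference and then count it out. The function name, the set of denominations, and the largest-denomination-first selection are illustrative assumptions, and amounts are kept in cents to sidestep floating-point rounding.

# A minimal sketch (not from the text) of making change by computing the
# difference and then handing back the largest denominations first.
# Amounts are in cents to avoid floating-point rounding problems.

def make_change(price_cents, amount_given_cents):
    """Return a list of (denomination_in_cents, count) pairs for the change due."""
    denominations = [2000, 1000, 500, 100, 25, 10, 5, 1]  # $20 bill down to a penny
    change = []
    remaining = amount_given_cents - price_cents
    for value in denominations:
        count = remaining // value
        if count > 0:
            change.append((value, count))
            remaining -= count * value
    return change

print(make_change(239, 1000))  # [(500, 1), (100, 2), (25, 2), (10, 1), (1, 1)]

For the $2.39 purchase paid with a $10 bill, this returns one $5 bill, two $1 bills, two quarters, one dime, and one penny, which matches the change described above.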
Few people can subtract three-digit numbers without resorting to some manual aids, such as pencil and paper.
As you learned in grade school, you can carry out subtraction with pencil and paper by following a sequence of well-
defined steps. You have probably done this many times but never made a list of the specific steps involved. Making
such lists to solve problems is something computer scientists do all the time. For example, the following list of steps
describes the process of subtracting two numbers using a pencil and paper:
Step 1 Write down the two numbers, with the larger number above the smaller number and their digits aligned
in columns from the right.
Step 2 Assume that you will start with the rightmost column of digits and work your way left through the
various columns.
Step 3 Write down the difference between the two digits in the current column of digits, borrowing a 1 from
the top number’s next column to the left if necessary.
Step 4 If there is no next column to the left, stop. Otherwise, move to the next column to the left, and go back
to Step 3.
If the computing agent (in this case a human being) follows each of these simple steps correctly, the entire process
results in a correct solution to the given problem. We assume in Step 3 that the agent already knows how to compute
the difference between the two digits in any given column, borrowing if necessary.
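The four steps translate almost directly into code. The following Python sketch, not from the text and with an illustrative function name, works through the digit columns from right to left and borrows from the next column to the left when necessary, assuming the first number is at least as large as the second.

# A sketch of the pencil-and-paper subtraction algorithm described above.

def column_subtract(larger, smaller):
    """Subtract smaller from larger digit by digit, assuming larger >= smaller >= 0."""
    top = [int(digit) for digit in str(larger)]
    bottom = [int(digit) for digit in str(smaller).rjust(len(top), "0")]  # align columns
    result = []
    for column in range(len(top) - 1, -1, -1):   # Step 2: rightmost column first
        if top[column] < bottom[column]:         # Step 3: borrow a 1 if necessary
            top[column] += 10
            top[column - 1] -= 1
        result.insert(0, top[column] - bottom[column])
    return int("".join(str(digit) for digit in result))

print(column_subtract(1000, 239))   # prints 761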
To make change, most people can select the combination of coins and bills that represent the correct change amount
without any manual aids, other than the coins and bills. But the mental calculations involved can still be described in
a manner similar to the preceding steps, and we can resort to writing them down on paper if there is a dispute about
the correctness of the change.
The sequence of steps that describes each of these computational processes is called an algorithm. Informally,
an algorithm is like a recipe. It provides a set of instructions that tells us how to do something, such as make change,
bake bread, or put together a piece of furniture. More precisely, an algorithm describes a process that ends with a
solution to a problem. The algorithm is also one of the fundamental ideas of computer science. An algorithm has the
following features:
2. Each step in an algorithm must be well defined, in the sense that the computing agent can actually carry it out. For example, any computing agent capable of arithmetic can compute the difference between two digits. So, an algorithmic step that says “compute the difference between two digits” would be well defined. On the other hand, a step that says “divide a number by 0” is not well defined, because no computing agent could carry it out.
3. An algorithm describes a process that eventually halts after arriving at a solution to a problem. For example,
the process of subtraction halts after the computing agent writes down the difference between the two
digits in the leftmost column of digits.
4. An algorithm solves a general class of problems. For example, an algorithm that describes how to make
change should work for any two amounts of money whose difference is greater than or equal to $0.00.
Creating a list of steps that describe how to make change might not seem like a major accomplishment to you. But the
ability to break a task down into its component parts is one of the main jobs of a computer programmer. Once you
write an algorithm to describe a particular type of computation, you can build a machine to do the computing. Put
another way, if you can develop an algorithm to solve a problem, you can automate the task of solving the problem.
You might not feel compelled to write a computer program to automate the task of making change, because you can
probably already make change yourself fairly easily. But suppose you needed to do a more complicated task—such as
sorting a list of 100 names. In that case, a computer program would be very handy.
Computers can be designed to run a small set of algorithms for performing specialized tasks, such as operating
a microwave. But we can also build computers, like the one on your desktop, that are capable of performing a
task described by any algorithm. These computers are truly general-purpose problem-solving machines. They
are unlike any machines that were built before, and they have formed the basis of the completely new world in
which we live.
Later in this book, we introduce a notation for expressing algorithms and some suggestions for designing algorithms.
You will see that algorithms and algorithmic thinking are critical underpinnings of any computer system.
Information Processing
Since people first learned to write several thousand years ago, they have processed information. Information itself
has taken many forms in its history, from the marks impressed on clay tablets in ancient Mesopotamia; to the first
written texts in ancient Greece; to the printed words in the books, newspapers, and magazines mass-produced
since the European Renaissance; to the abstract symbols of modern mathematics and science used during the
past 350 years. Only recently, however, have human beings developed the capacity to automate the processing of
information by building computers. In the modern world of computers, information is also commonly referred to
as data. But what is information?
Like mathematical calculations, information processing can be described with algorithms. In our earlier example
of making change, the subtraction steps involved manipulating symbols used to represent numbers and money. In
carrying out the instructions of any algorithm, a computing agent manipulates information. The computing agent
starts with some given information (known as input), transforms this information according to well-defined rules, and
produces new information, known as output.
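A tiny Python sketch, not from the text, makes the pattern concrete: the program below receives input, transforms it according to a well-defined rule, and produces output.

# Input -> transformation -> output, in miniature.

price = float(input("Enter the purchase price: "))   # input
given = float(input("Enter the amount given: "))     # input
change = round(given - price, 2)                     # transformation (a well-defined rule)
print("Change due:", change)                         # output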
It is important to recognize that the algorithms that describe information processing can also be represented as
information. Computer scientists have been able to represent algorithms in a form that can be executed effectively
and efficiently by machines. They have also designed real machines, called electronic digital computers, which are
capable of executing algorithms.
Computer scientists more recently discovered how to represent many other things, such as images, music, human
speech, and video, as information. Many of the media and communication devices that we now take for granted would be
impossible without this new kind of information processing. We examine many of these achievements in more detail in
later chapters.
Exercise 1-1
These short end-of-section exercises are intended to stimulate your thinking about computing.
2. Write an algorithm that describes the second part of the process of making change (counting out the
coins and bills).
4. Describe an instruction that is not well defined and thus could not be included as a step in an algorithm.
Give an example of such an instruction.
6. List four devices that use computers and describe the information that they process. (Hint: Think of the
inputs and outputs of the devices.)
1.2 The Structure of a Modern Computer System

Computer Hardware
The basic hardware components of a computer are memory, a central processing unit (CPU), and a set of
input/output devices, as shown in Figure 1-1.
[Figure 1-1: memory, the CPU, and input/output devices]
Human users primarily interact with the input and output devices. The input devices include a keyboard, a mouse,
a trackpad, a microphone, and a touchscreen. Common output devices include a monitor and speakers. Computers can
also communicate with the external world through various ports that connect them to networks and to other devices
such as smartphones and digital cameras. The purpose of most input devices is to convert information that human
beings deal with, such as text, images, and sounds, into information for computational processing. The purpose of
most output devices is to convert the results of this processing back to human-usable form.
Computer memory is set up to represent and store information in electronic form. Specifically, information is stored
as patterns of binary digits (1s and 0s). To understand how this works, consider a basic device such as a light switch,
which can only be in one of two states, on or off. Now suppose there is a bank of switches that control 16 small lights
in a row. By turning the switches off or on, we can represent any pattern of 16 binary digits (1s and 0s) as patterns of
lights that are on or off. As you will see later in this book, computer scientists have discovered how to represent any
information, including text, images, and sound, in binary form.
Now, suppose there are 8 of these groups of 16 lights. We can select any group of lights and examine or change the
state of each light within that collection. We have just developed a tiny model of computer memory. The memory has
8 cells, each of which can store 16 bits of binary information. A diagram of this model, in which the memory cells are
filled with binary digits, is shown in Figure 1-2. This memory is also sometimes called primary memory, internal memory, or random access memory (RAM).
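The following Python sketch, not from the text, mimics this tiny model: a list stands in for the 8 memory cells, and each cell's contents are displayed as a pattern of 16 binary digits.

# A model of 8 memory cells, each holding a 16-bit pattern.

cells = [0] * 8                      # all switches start in the "off" state

cells[0] = 0b0000000001000001        # store a bit pattern (65, a common code for 'A')
cells[1] = 2024                      # store an ordinary number

for address, value in enumerate(cells):
    print(address, format(value, "016b"))   # show each cell as 16 binary digits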
The information stored in memory can represent any type of data, such as numbers, text, images, sound, or the
instructions of a program. Once the information is stored in memory, we typically want to do something with it—that
is, we want to process it. The part of a computer that is responsible for processing data is the central processing unit
(CPU). This device, which is also sometimes called a processor, consists of electronic switches arranged to perform
simple logical, arithmetic, and control operations. The CPU executes an algorithm by fetching its binary instructions
from memory, decoding them, and executing them. Executing an instruction might involve fetching other binary
information—the data—from memory as well.
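To make the fetch-decode-execute cycle concrete, here is a toy simulation in Python; the miniature instruction set is invented purely for illustration:

# A toy CPU: each instruction is an (operation, operand) pair held in "memory."
program = [("LOAD", 7), ("ADD", 5), ("PRINT", None), ("HALT", None)]
accumulator = 0
counter = 0                                    # index of the next instruction
while True:
    operation, operand = program[counter]      # fetch
    counter += 1
    if operation == "LOAD":                    # decode and execute
        accumulator = operand
    elif operation == "ADD":
        accumulator += operand
    elif operation == "PRINT":
        print(accumulator)                     # displays 12
    elif operation == "HALT":
        break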
The processor can locate data in a computer’s primary memory very quickly. However, these data persist only as long as the computer receives electric power. If the power fails or is turned off, the data in primary memory are lost. Clearly, a more permanent type of memory is needed to preserve data. This more permanent type of memory is called external or secondary memory, and it comes in several forms. Magnetic storage media, such as tapes and hard disks, allow bit patterns to be stored as patterns in a magnetic field. Semiconductor storage media, such as flash
memory sticks and universal serial bus (USB) drives, perform much the same function with a different technology,
as do optical storage media, such as compact disks (CDs) and digital video disks (DVDs). Some of these secondary
storage media can hold much larger quantities of information than the internal memory of a computer.
Computer Software
You have learned that a computer is a general-purpose problem-solving machine. To solve any computable problem,
a computer must be capable of executing any algorithm. Because it is impossible to anticipate all of the problems for
which there are algorithmic solutions, there is no way to hardwire all potential algorithms into a computer’s hardware.
Instead, a set of basic operations is built into the hardware’s processor, and every algorithm must be expressed in terms of these operations. The
algorithms are converted to binary form and then loaded, with their data, into the computer’s memory. The processor
can then execute the algorithms’ instructions by running the hardware’s more basic operations.
Any programs that are stored in memory so that they can be executed later are called software. A program stored
in computer memory must be represented in binary digits, which is also known as machine code. Loading machine
code into computer memory one digit at a time would be a tedious, error-prone task for human beings. It would be
convenient if we could automate this process to get it right every time. For this reason, computer scientists have
developed another program, called a loader, to perform this task. A loader takes a set of machine language instructions
as input and loads them into the appropriate memory locations. When the loader is finished, the machine language
program is ready to execute. Obviously, the loader cannot load itself into memory, so this is one of those algorithms
that must be hardwired into the computer.
Now that a loader exists, you can load and execute other programs that make the development, execution, and
management of programs easier. This type of software is called system software. The most important example of system
software is a computer’s operating system. You are probably already familiar with at least one of the most popular
operating systems, such as Linux, Apple’s macOS, and Microsoft’s Windows. An operating system is responsible for
managing and scheduling several concurrently running programs. It also manages the computer’s memory, including the
external storage, and manages communications between the CPU, the input/output devices, and other computers on a
network. An important part of any operating system is its file system, which allows human users to organize their data and
programs in permanent storage. Another important function of an operating system is to provide user interfaces—that
is, ways for the human user to interact with the computer’s software. A terminal-based interface accepts inputs from a
keyboard and displays text output on a monitor screen. A graphical user interface (GUI) organizes the monitor screen
around the metaphor of a desktop, with windows containing icons for folders, files, and applications. This type of user
interface also allows the user to manipulate images with a pointing device such as a mouse. A touchscreen interface
supports more direct manipulation of these visual elements with gestures such as pinches and swipes of the user’s
fingers. Devices that respond verbally and in other ways to verbal commands are also becoming widespread.
Another major type of software is called applications software, or simply apps. An application is a program that is
designed for a specific task, such as editing a document or displaying a web page. Applications include web browsers,
word processors, spreadsheets, database managers, graphic design packages, music production systems, and games,
among millions of others. As you begin learning to write computer programs, you will focus on writing simple applications.
As you have learned, computer hardware can execute only instructions that are written in binary form—that is, in
machine language. Writing a machine language program, however, would be an extremely tedious, error-prone task. To
ease the process of writing computer programs, computer scientists have developed high-level programming languages
for expressing algorithms. These languages resemble English and allow the author to express algorithms in a form
that other people can understand.
A programmer typically starts by writing high-level language statements in a text editor. The programmer then
runs another program called a translator to convert the high-level program code into executable code. Because it is
possible for a programmer to make grammatical mistakes even when writing high-level code, the translator checks
for syntax errors before it completes the translation process. If it detects any of these errors, the translator alerts the
programmer via error messages. The programmer then has to revise the program. If the translation process succeeds
without a syntax error, the program can be executed by the run-time system. The run-time system might execute the
program directly on the hardware or run yet another program called an interpreter or virtual machine to execute the
program. Figure 1-3 shows the steps and software used in the coding process.
[Figure 1-3: Steps in the coding process, from creating the high-level language program to the program outputs]
Exercise 1-2
1. List two examples of input devices and two examples of output devices.
4. What is the difference between a terminal-based interface and a graphical user interface (GUI)?
1.3 A Not-So-Brief History of Computing Systems
A device known as the abacus also appeared in ancient times. The abacus helped people perform simple arithmetic.
Users calculated sums and differences by sliding beads on a grid of wires (see Figure 1-5a). The configuration of beads
on the abacus served as the data.
In the seventeenth century, the French mathematician Blaise Pascal (1623–1662) built one of the first mechanical
devices to automate the process of addition (see Figure 1-5b). The addition operation was embedded in the configuration
of gears within the machine. The user entered the two numbers to be added by rotating some wheels. The sum or output
number appeared on another rotating wheel. The German mathematician Gottfried Wilhelm Leibniz (1646–1716) built
another mechanical calculator that included other arithmetic functions such as multiplication. Leibniz, who invented
calculus concurrently with Newton, went on to propose the idea of computing with symbols as one of our most basic
intellectual activities. He argued for a universal language in which one could solve any problem by calculating.
Early in the nineteenth century, the French engineer Joseph-Marie Jacquard (1752–1834) designed and constructed
a machine that automated the process of weaving (see Figure 1-5c). Until then, each row in a weaving pattern had to
be set up by hand, a quite tedious, error-prone process. Jacquard’s loom was designed to accept input in the form of
a set of punched cards. Each card described a row in a pattern of cloth. Although it was still an entirely mechanical
device, Jacquard’s loom possessed something that previous devices had lacked—the ability to execute an algorithm
automatically. The set of cards expressed the algorithm or set of instructions that controlled the behavior of the
loom. If the loom operator wanted to produce a different pattern, he just had to run the machine with a different
set of cards.
The British mathematician Charles Babbage (1792–1871) took the concept of a programmable computer a step
further by designing a model of a machine that, conceptually, bore a striking resemblance to a modern general-
purpose computer. Babbage conceived his machine, which he called the Analytical Engine, as a mechanical device.
His design called for four functional parts: a mill to perform arithmetic operations, a store to hold data and a program,
an operator to run the instructions from punched cards, and an output to produce the results on punched cards.
Sadly, Babbage’s computer was never built. The project perished for lack of funds near the time when Babbage
himself passed away.
[Figure 1-5: (a) Abacus; (b) Pascal’s calculator; (c) Jacquard’s loom]
In the last two decades of the nineteenth century, a U.S. Census Bureau statistician named Herman Hollerith
(1860–1929) developed a machine that automated data processing for the U.S. Census. Hollerith’s machine, which
had the same component parts as Babbage’s Analytical Engine, simply accepted a set of punched cards as input and
then tallied and sorted the cards. His machine greatly shortened the time it took to produce statistical results on the
U.S. population. Government and business organizations seeking to automate their data processing quickly adopted
Hollerith’s punched card machines. Hollerith was also one of the founders of a company that eventually became
International Business Machines (IBM).
Also in the nineteenth century, the British secondary school teacher George Boole (1815–1864) developed a system
of logic. This system consisted of a pair of values, TRUE and FALSE, and a set of three primitive operations on these
values, AND, OR, and NOT. Boolean logic eventually became the basis for designing the electronic circuitry to process
binary information.
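Python, which you will begin using later in this chapter, includes Boole’s two values and three operations directly, as the values True and False and the operators and, or, and not. For example:

>>> True and False
False
>>> True or False
True
>>> not True
False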
A half century later, in the 1930s, the British mathematician Alan Turing (1912–1954) explored the theoretical
foundations and limits of algorithms and computation. Turing’s essential contributions were to develop the concept
of a universal machine that could be specialized to solve any computable problem and to demonstrate that some
problems are unsolvable by computers.
The needs of the combatants in World War II pushed the development of computer hardware into high gear. Several
teams of scientists and engineers in the United States, England, and Germany independently created the first generation
of general-purpose digital electronic computers during the 1940s. All of these scientists and engineers used Claude Shannon’s innovation of expressing binary digits and logical operations in terms of electronic switching devices. Among these
groups was a team at Harvard University under the direction of Howard Aiken. Their computer, called the Mark I,
became operational in 1944 and did mathematical work for the U.S. Navy during the war. The Mark I was considered
an electromechanical device, because it used a combination of magnets, relays, and gears to store and process data.
Another team under J. Presper Eckert and John Mauchly, at the University of Pennsylvania, produced a computer
called the Electronic Numerical Integrator and Calculator (ENIAC). The ENIAC calculated ballistics tables for the artillery
of the U.S. Army toward the end of the war. Because the ENIAC used entirely electronic components, it was almost a
thousand times faster than the Mark I.
Two other electronic digital computers were completed a bit earlier than the ENIAC. They were the Atanasoff–Berry Computer (ABC), built by John Atanasoff and Clifford Berry at Iowa State University in 1942, and the Colossus, built in England in 1943 by a team of engineers led by Tommy Flowers to support the codebreakers at Bletchley Park, where Alan Turing also worked. The ABC was created to solve systems of simultaneous linear equations. Although the ABC’s function was much narrower than that of the ENIAC, the ABC is now regarded as the first electronic digital computer. The Colossus, whose existence remained secret for decades after the war, was used to break the German Lorenz teleprinter cipher during the war.
The first electronic digital computers, sometimes called mainframe computers, consisted of vacuum tubes,
wires, and plugs, and they filled entire rooms. Although they were much faster than people at computing, by current
standards they were extraordinarily slow and prone to breakdown. The early computers were also extremely difficult
to program. To enter or modify a program, a team of workers had to rearrange the connections among the vacuum
tubes by unplugging and replugging the wires. Each program was loaded by literally hardwiring it into the computer.
With thousands of wires involved, it was easy to make a mistake.
The memory of these first computers stored only data, not the program that processed the data. As we have
seen, the idea of a stored program had appeared more than a century earlier in Jacquard’s loom and in Babbage’s design for the
Analytical Engine. In 1946, John von Neumann realized that the instructions of the programs could also be stored in
binary form in an electronic digital computer’s memory. His research group at Princeton developed one of the first
modern stored-program computers.
Although the size, speed, and applications of computers have changed dramatically since those early days, the
basic architecture and design of the electronic digital computer have remained remarkably stable.
In the early 1950s, computer scientists realized that a symbolic notation could be used instead of machine code,
and the first assembly languages appeared. The programmers would enter mnemonic codes for operations, such as
ADD and OUTPUT, and for data variables, such as SALARY and RATE, at a keypunch machine. The keystrokes punched
a set of holes in a small card for each instruction. The programmers then carried their stacks of cards to a system
operator, who placed them in a device called a card reader. This device translated the holes in the cards to patterns
in the computer’s memory. A program called an assembler then translated the application programs in memory to
machine code, and they were executed.
Programming in assembly language was an improvement over programming in machine code, since the symbolic
notation used in assembly languages was easier for people to read and understand. Another advantage was that the
assembler could catch some programming errors before the program was actually executed. However, the symbolic
notation still appeared a bit arcane when compared with the notations of conventional mathematics. To remedy this
problem, John Backus, a programmer working for IBM, developed FORTRAN (Formula Translation Language) in 1954.
Programmers, many of whom were mathematicians, scientists, and engineers, could now use conventional algebraic
notation. FORTRAN programmers still entered their programs on a keypunch machine, but the computer executed
them after they were translated to machine code by a compiler.
FORTRAN was considered ideal for numerical and scientific applications. However, expressing the kind of data
used in data processing—in particular, textual information—was difficult. For example, FORTRAN was not practical
for processing information that included people’s names, addresses, Social Security numbers, and the financial data
of corporations and other institutions. In the early 1960s, a team led by Rear Admiral Grace Murray Hopper developed
COBOL (Common Business Oriented Language) for data processing in the U.S. government. Banks, insurance companies,
and other institutions were quick to adopt its use in data-processing applications.
Also in the late 1950s and early 1960s, John McCarthy, a computer scientist at MIT, developed a powerful and elegant
notation called LISP (List Processing) for expressing computations. Based on a theory of recursive functions (a subject
covered in Chapter 7), LISP captured the essence of symbolic information processing. A student of McCarthy, Steve
“Slug” Russell, coded the first interpreter for LISP in 1960. The interpreter accepted LISP expressions directly as inputs,
evaluated them, and printed their results. In its early days, LISP was used primarily for laboratory experiments in an
area of research known as artificial intelligence. More recently, LISP has been touted as an ideal language for solving difficult or complex problems.
Although they were among the first high-level programming languages, FORTRAN and LISP have survived for decades.
They have undergone modifications to improve their capabilities and have served as models for the development of
many other programming languages. COBOL, by contrast, is no longer in active use but has survived in the form of
legacy programs that must still be maintained.
These new, high-level programming languages had one feature in common: abstraction. In science or any other
area of enquiry, an abstraction allows humans to reduce complex ideas or entities to simpler ones. For example, a set
of 10 assembly language instructions might be replaced with an equivalent algebraic expression that consists of only
five symbols in FORTRAN. Any time you can say more with less, you are using an abstraction. The use of abstraction is
also found in other areas of computing, such as hardware design and information architecture. The complexities do not
actually go away, but the abstractions hide them from view. Abstraction allows computer scientists to conceptualize,
design, and build ever more sophisticated and complex systems.
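A small Python illustration of this idea follows; the machine-level mnemonics in the comments are invented, and the numbers are arbitrary:

rate, hours, bonus = 20.0, 40, 100.0
# The next line replaces a sequence of machine-level steps such as
# LOAD rate, MULT hours, ADD bonus, STORE pay.
pay = rate * hours + bonus
print(pay)    # displays 900.0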
The development of the integrated circuit in the early 1960s allowed computer engineers to build ever smaller, faster,
and less-expensive computer hardware components. They perfected a process of photographically etching transistors
and other solid-state components onto very thin wafers of silicon, leaving an entire processor and memory on a single
chip. In 1965, Gordon Moore, one of the founders of the computer chip manufacturer Intel, made a prediction that came
to be known as Moore’s Law. This prediction states that the processing speed and storage capacity of hardware will
increase and its cost will decrease by approximately a factor of 2 every 18 months. This trend has held true for more
than 50 years. For example, in 1965 there were about 50 electrical components on a chip, whereas by 2000, a chip could
hold over 40 million components. Without the integrated circuit, human beings would not have gone to the moon in
1969, and we would not have the powerful and inexpensive handheld devices that we now use on a daily basis.
Minicomputers the size of a large office desk appeared in the 1960s, when the means of developing and running
programs were changing. Until then, a computer was typically located in a restricted area with a single human operator.
Programmers composed their programs on keypunch machines in another room or building. They then delivered their
stacks of cards to the computer operator, who loaded them into a card reader and compiled and ran the programs
in sequence on the computer. Programmers then returned to pick up the output results in the form of new stacks of
cards or printouts. This mode of operation, also called batch processing, might cause a programmer to wait days for
results, including error messages.
The increases in processing speed and memory capacity enabled computer scientists to develop the first time-
sharing operating system. John McCarthy, the creator of the programming language LISP, recognized that a program
could automate many of the functions performed by the human system operator. When memory, including magnetic
secondary storage, became large enough to hold several users’ programs at the same time, they could be scheduled
for concurrent processing. Each process associated with a program would run for a slice of time and then yield the
CPU to another process. All of the active processes would repeatedly cycle for a turn with the CPU until they finished.
Several users could now run their own programs simultaneously by entering commands at separate terminals
connected to a single computer. As processor speeds continued to increase, each user gained the illusion that a time-
sharing computer system belonged entirely to them.
By the late 1960s, programmers could enter program input at a terminal and also see program output immediately displayed
on a cathode ray tube (CRT) screen. Compared to its predecessors, this new computer system was both highly interactive
and much more accessible to its users. Interactive, multiuser computer systems could now support the development of
large, complex software applications by teams of programmers. One such group, under the direction of Margaret Hamilton,
constructed the programs that controlled the command and lunar landing modules for the Apollo Space Mission. She
coined the term software engineering to refer to the construction of large software systems using a disciplined method
of requirements analysis, design, coding, and testing that involves the coordination of a team of specialized developers.
Many relatively small- and medium-sized institutions, such as universities, were now able to afford computers.
These machines were used not only for data processing and engineering applications but also for teaching and research
in the new and rapidly growing field of computer science.
What form would these personal computers take, and how would their owners use them? Two decades earlier, in 1945, Douglas Engelbart
had read an article in The Atlantic Monthly titled “As We May Think” that had posed this question and offered some
answers. The author, Vannevar Bush, a scientist at MIT, predicted that computing devices would serve as repositories
of information and, ultimately, of all human knowledge. Owners of computing devices would consult this information
by browsing through it with pointing devices, and they would contribute information to the knowledge base almost at
will. Engelbart agreed that the primary purpose of the personal computer would be to augment the human intellect,
and he spent the rest of his career designing computer systems that would accomplish this goal.
During the late 1960s, Engelbart built the first pointing device, or mouse. He also designed software to represent
windows, icons, and pull-down menus on a bit-mapped display screen. He demonstrated that a computer user could
not only enter text at the keyboard but could also directly manipulate the icons that represent files, folders, and
computer applications on the screen.
But for Engelbart, personal computing did not mean computing in isolation. He participated in the first experiment
to connect computers in a network, and he believed that soon people would use computers to communicate, share
information, and collaborate on team projects.
Engelbart developed his first experimental system, which he called NLS (oN-Line System) and later Augment, on a minicomputer at SRI. In the early 1970s, several members of his team moved to the Xerox Palo Alto Research Center (PARC), where they worked with a team under Alan Kay to develop the first desktop computer system. Called the Alto, this system had many of the features of Engelbart’s
Augment, as well as email and a functioning hypertext (a forerunner of the World Wide Web). Kay’s group also developed
a programming language called Smalltalk, which was designed to create programs for the new computer and to teach
programming to children. Kay’s goal was to develop a personal computer the size of a large notebook, which he called
the Dynabook. Unfortunately for Xerox, the company’s management had more interest in photocopy machines than
in the work of Kay’s visionary research group. However, a young entrepreneur named Steve Jobs visited the Xerox lab
and saw the Alto in action. Almost a decade later, in 1984, Apple Computer, the now-famous company founded by Steve
Jobs, brought forth the Macintosh, the first successful mass-produced personal computer with a GUI.
While Kay’s group was busy building the computer system of the future in their research lab, dozens of hobbyists
gathered near San Francisco to found the Homebrew Computer Club, the first personal computer users group. They
met to share ideas, programs, hardware, and applications for personal computing. The first mass-produced personal
computer, the Altair, appeared in 1975. The Altair contained Intel’s 8080 processor, an early microprocessor chip. But
from the outside, the Altair looked and behaved more like a miniature version of the early computers than the Alto.
Programs and their input had to be entered by flipping switches, and output was displayed by a set of lights. However,
the Altair was small enough for personal computing enthusiasts to carry home, and input/output devices eventually
were invented to support the processing of text and sound.
The Osborne and the Kaypro were among the first mass-produced interactive personal computers. They boasted
tiny display screens and keyboards, with floppy disk drives for loading system software, applications software, and
users’ data files. Early personal computing applications were word processors, spreadsheets, and games such as Pac-
Man and Spacewar! These computers also ran CP/M (Control Program for Microcomputers), the first personal computer
(PC)–based operating system.
In the early 1980s a college dropout named Bill Gates and his partner Paul Allen built their own operating system
software, which they called Microsoft Disk Operating System (MS-DOS). They then arranged a deal with the giant
computer manufacturer IBM to supply MS-DOS for the new line of PCs that the company intended to mass produce.
This deal proved to be a very advantageous one for Gates’s company, Microsoft. Not only did Microsoft receive a fee for each computer sold, but it also got a head start on supplying applications software that would run on its operating system. Brisk sales of the IBM PC and its clones to individuals and institutions quickly made MS-DOS the world’s
most widely used operating system. Within a few years, Gates and Allen had become billionaires, and within a decade,
Gates had become the world’s richest man, a position he held for 13 straight years.
Also in the 1970s, the U.S. government began to support the development of a network that would connect computers
at military installations and research universities. The first such network, called Advanced Research Projects Agency
Network (ARPANET), connected four computers at SRI, University of California at Los Angeles (UCLA), University of
California Santa Barbara, and the University of Utah. Bob Metcalfe, a researcher associated with Kay’s group at Xerox,
developed a software protocol called Ethernet for operating a network of computers. Ethernet allowed computers to
communicate in a local area network (LAN) within an organization and also with computers in other organizations via
a wide area network (WAN). By the mid-1980s, the ARPANET had grown into what we now call the Internet, connecting
computers owned by large institutions, small organizations, and individuals all over the world.
Desktop and laptop computers now not only performed useful work but also gave their users new means of personal
expression. This decade saw the rise of computers as communication tools, with email, instant messaging, bulletin
boards, chat rooms, and the World Wide Web.
Perhaps the most interesting story from this period concerns Tim Berners-Lee, the creator of the World Wide Web.
In the late 1980s, Berners-Lee, then a software engineer at CERN, the European particle physics laboratory in Geneva, Switzerland, began
to develop some ideas for using computers to share information. Computer engineers had been linking computers to
networks for several years, and it was already common in research communities to exchange files and send and receive
email around the world. However, the vast differences in hardware, operating systems, file formats, and applications
still made it difficult for users who were not adept at programming to access and share this information. Berners-Lee
was interested in creating a common medium for sharing information that would be easy to use, not only for scientists
but also for any other person capable of manipulating a keyboard and mouse and viewing the information on a monitor.
Berners-Lee was familiar with Vannevar Bush’s vision of a web-like consultation system, Engelbart’s work on NLS
Augment, and also with the first widely available hypertext systems. One of these systems, Apple Computer’s HyperCard,
broadened the scope of hypertext to hypermedia. HyperCard allowed authors to organize not just text but also images,
sound, video, and executable applications into webs of linked information. However, a HyperCard database sat only
on standalone computers; the links could not carry HyperCard data from one computer to another. Furthermore, the
supporting software ran only on Apple’s computers.
Berners-Lee realized that networks could extend the reach of a hypermedia system to any computers connected
to the net, making their information available worldwide. To preserve its independence from particular operating
systems, the new medium would need to have universal standards for distributing and presenting the information.
To ensure this neutrality and independence, no private corporation or individual government could own the medium
and dictate the standards.
Berners-Lee built the software for this new medium, which we now call the World Wide Web, in 1992. The software
used many of the existing mechanisms for transmitting information over the Internet. People contribute information
to the web by publishing files on computers known as web servers. The web server software on these computers is
responsible for answering requests for viewing the information stored on the web server. To view information on the
web, people use software called a web browser. In response to a user’s commands, a web browser sends a request for
information across the Internet to the appropriate web server. The server responds by sending the information back
to the browser’s computer, called a web client, where it is displayed or rendered in the browser.
Although Berners-Lee wrote the first web server and web browser software, he made two other even more important
contributions. First, he designed a set of rules, called Hypertext Transfer Protocol (HTTP), which allows any server
and browser to talk to each other. Second, he designed a language, Hypertext Markup Language (HTML), which allows
browsers to structure the information to be displayed on web pages. He then made all of these resources available to
anyone for free.
Berners-Lee’s invention and gift of this universal information medium is a truly remarkable achievement. Today
there are millions of web servers in operation around the world. Anyone with the appropriate training and resources—
companies, government, nonprofit organizations, and private individuals—can start up a new web server or obtain
space on one. Web browser software now runs not only on desktop and laptop computers but also on handheld devices
such as cell phones.
The growth of the Internet, the web, and related software technologies also transformed manufacturing, retail sales,
and finance in the latter half of this decade. Computer-supported automation dramatically increased productivity, while
eliminating high-paying jobs for many people. Firms established and refined the chains of production and distribution
of goods, from raw materials to finished products to retail sales, which were increasingly cost-effective and global in
scope. Computer technology facilitated in large part the spread of giant big-box stores like Walmart and the rise of
online stores like Amazon, while driving many local retailers out of business and creating a workforce of part-timers
without benefits.
The technology that made online stores pervasive, called web applications, presented a revolution in the way in
which software services were delivered to people. Instead of purchasing and running software for specific applications
to run on one’s own computer, one could obtain access to a specific service through a web browser. The web application
providing this service ran on a remote computer or server located at the provider’s place of business. The web browser
played the role of the client, front end, or user interface for millions of users to access the same server application for
a given service. Client/server applications had already been in use for email, bulletin boards, and chat rooms on the
Internet, so this technology was simply deployed on the web when it became available.
The final major development of this decade took place in a computer lab at Stanford University, where two graduate
students, Sergey Brin and Larry Page, developed algorithms for indexing and searching the web. The outcome of their
work added a new verb to the dictionary: to google. Today, much of the world’s economy and research relies upon
Google’s various search platforms.
Meanwhile, cellular technology became widespread, with millions of people beginning to use the first cell phones.
These devices, which allowed calls to be made from a simple mechanical keypad, were “dumb” compared to today’s
smartphones. But cellular technology provided the basis for what was soon to come. At about the same time, wireless
technology began to allow computers to communicate through the air to a base station with an Internet connection.
The conditions for mobile and ubiquitous computing were now in place, awaiting only the kinds of devices and apps
that would make them useful and popular.
No one foresaw the types of devices and applications that mobile computing would make possible better
than Steve Jobs (the founder of Apple Computer, mentioned earlier). During the final dozen years of his life, Jobs
brought forward from Apple several devices and technologies that revolutionized not only computing but also
the way in which people engaged in cultural pursuits. The devices were the iPod, which began as a digital music
player but evolved into a handheld general-purpose computing device; the iPhone, which added cellular phone
technology to the iPod’s capabilities; and the iPad, which realized Alan Kay’s dreams of a personal notebook
computer. All of these devices utilized touchscreen and voice recognition technology, which eliminated the need
for bulky mechanical keypads.
The associated software technologies came in the form of Apple’s iLife suite, a set of applications that allowed
users to organize various types of media (music, photos, video, and books); and Apple’s iTunes, iBooks, and App
Stores, vendor sites that allowed developers to market mobile media and applications. The web browser that for a
decade had given users access to web apps became just another type of app in the larger world of mobile computing.
The new millennium has seen another major addition to the digital landscape: social networking applications.
Although various Internet forums, such as chatrooms and bulletin board systems, had been in use for a couple of
decades, their use was not widespread. In 2004, Mark Zuckerberg, a student at Harvard University, changed all that
when he launched Facebook from his college dorm room. The application allowed students to join a network to share
their profiles; post messages, photos, and videos; and generally communicate as “friends.” Participation in this network
rapidly spread to include more than a billion users. Social networking technology now includes many other variations,
as exemplified by LinkedIn, Twitter, Tumblr, Flickr, and Instagram.
During the past decade (2010–2020), as computing applications have migrated from standalone desktop machines
to mobile devices, the storage of data has moved from individual devices to giant server farms to which these devices
are wirelessly connected. Our data, including music, photos, text, financial assets, and geolocation, are now located
in a digital cloud, which we can access from our phones, watches, TV sets, and automobiles, among other things.
Cloud computing and wireless technology also underlie an even broader Internet of Things (IoT), in which practically any physical object (including my cat) that contains the appropriate computer chip can send and receive digital information.
We conclude this not-so-brief overview by mentioning the rise of a technology known as big data. Governments,
businesses, and hackers continually monitor Internet traffic for various purposes. This “clickstream” can be “mined”
to learn users’ preferences, interests, and behavior patterns to better serve them, exploit them, or spy on them. For
example, an online store might advertise a product on a person’s Facebook page immediately after that person viewed
a similar product while shopping online or mentioned the product in the presence of a virtual assistant such as Alexa.
Researchers in the field of data science have created algorithms that process massive amounts of data to discover
trends and predict outcomes.
To summarize this history, one trend ties the last several decades of computing together: rapid technical progress.
Processes and the things in which they are embedded have become automated, programmable, smaller, faster, highly
interconnected, and easily visualized and interpreted.
If you want to learn more about the history of computing, consult the sources listed in Appendix E. We now turn
to an introduction to programming in Python.
1.4 Getting Started with Python Programming
The version of Python appearing in the accompanying screenshot is 3.10.4. This book assumes that you will use Python 3 rather than Python 2. There
are substantial differences between the two versions, and many examples used in this book will not work with Python 2.
A shell window contains an opening message followed by the special symbol >>>, called a shell prompt. The cursor at
the shell prompt waits for you to enter a Python command. Note that you can get immediate help by entering help at
the shell prompt or selecting help from the window’s drop-down menu.
When you enter an expression or a statement, Python evaluates it and displays its result, if there is one, followed
by a new prompt. The next few lines show the evaluation of several expressions and statements.
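Such a session might look like the following (the particular expressions and statements are illustrative):

>>> 3 + 4
7
>>> "Hello " + "world!"    # a comment after an expression
'Hello world!'
>>> x = 10                 # a statement produces no visible output
>>> x * 2
20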
Note the use of colors in the Python code. The IDLE programming environment uses color-coding to help the reader
pick out different elements in the code. In this example, the items within quotation marks are in green, the names of
standard functions are in purple, program comments are in red, and the responses of IDLE to user commands are in
blue. The remaining code is in black. Table 1-1 lists the color-coding scheme used in all program code in this book.
The Python shell is useful for experimenting with short expressions or statements to learn new features of the
language, as well as for consulting documentation on the language. To quit the Python shell, you can either select the
window’s close box or press the CTRL-D key combination.
The means of developing more complex and interesting programs are examined in the rest of this section.
The programmer can also force the output of a value by using the print function. The simplest form for using
this function looks like the following:
print(<expression>)
This example shows you the basic syntax (or grammatical rule) for using the print function. The angle brackets
(the < and > symbols) enclose a type of phrase. In actual Python code, you would replace this syntactic form, including
the angle brackets, with an example of that type of phrase. In this case, <expression> is shorthand for any Python
expression, such as 3 + 4.
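For instance, printing the value of this expression in the shell looks like this:

>>> print(3 + 4)
7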
When running the print function, Python first evaluates the expression and then displays its value. In the example
shown earlier, print was used to display some text. The following is another example:
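A sketch of this session:

>>> print("Hi there")
Hi there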
In this example, the text "Hi there" is the text that we want Python to display. In programming terminology,
this piece of text is referred to as a string. In Python code, a string is always enclosed in quotation marks. However,
the print function displays a string without the quotation marks.
You can also write a print function that includes two or more expressions separated by commas. In such a case,
the print function evaluates the expressions and displays their results, separated by single spaces, on one line. The
syntax for a print statement with two or more expressions looks like the following:
print(<expression>,…, <expression>)
Note the ellipsis (…) in this syntax example. The ellipsis indicates that you could include multiple expressions
after the first one. Whether it outputs one or multiple expressions, the print function always ends its output with a
newline. In other words, it displays the values of the expressions, and then it moves the cursor to the next line on the
console window.
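For example, the following call (the wording of the string is illustrative) prints a label and a value on one line:

>>> print("The sum is", 3 + 4)
The sum is 7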
To begin the next output on the same line as the previous one, you can include the keyword argument end="", which says “end the line with an empty string instead of a newline,” as the last argument to print.
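For example, the following pair of statements (the strings are illustrative) prints its output on a single line:

print("The sum is ", end="")    # ends with an empty string instead of a newline
print(3 + 4)                    # continues on the same line, displaying: The sum is 7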
As you create programs in Python, you’ll often want your programs to ask the user for input. You can do this by
using the input function. This function causes the program to stop and wait for the user to enter a value from the
keyboard. When the user presses the Return or Enter key, the function accepts the input value and makes it available to
the program. A program that receives an input value in this manner typically saves it for further processing.
The following example receives an input string from the user and saves it for further processing.
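A sketch of the session, with the user’s typed response, Ken Lambert, following the prompt:

>>> name = input("Enter your name: ")
Enter your name: Ken Lambert

When this statement runs, the input function does two things: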
1. Displays a prompt for the input. In this example, the prompt is "Enter your name: ".
2. Receives a string of keystrokes, called characters, entered at the keyboard and returns the string to the
shell.
How does the input function know what to use as the prompt? The text in parentheses, "Enter your name: ",
is an argument for the input function that tells it what to use for the prompt. An argument is a piece of information
that a function needs to do its work.
The string returned by the function in our example is saved by assigning it to the variable name. The form of an
assignment statement with the input function is the following:
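Sketched in the angle-bracket notation used earlier:

<variable name> = input(<prompt string>)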
A variable identifier, or variable for short, is just a name for a value. When a variable receives its value in an input
statement, the variable then refers to this value. If the user enters the name "Ken Lambert" in our last example, the
value of the variable name can be viewed as follows:
>>> name
'Ken Lambert'
The input function always builds a string from the user’s keystrokes and returns it to the program. After inputting
strings that represent numbers, the programmer must convert them from strings to the appropriate numeric types. In
Python, there are two type conversion functions for this purpose, called int (for integers) and float (for floating-
point numbers). The next session inputs two integers and displays their sum:
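A sketch of such a session (the prompts and the numbers entered are illustrative):

>>> first = int(input("Enter the first number: "))
Enter the first number: 23
>>> second = int(input("Enter the second number: "))
Enter the second number: 44
>>> print("The sum is", first + second)
The sum is 67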
Note that the int function is called with each result returned by the input function. The two numbers are added,
and then their sum is output. Table 1-2 summarizes the functions introduced in this subsection.
More complex programs are usually composed and saved in files, which are also the means by which Python programs are distributed to others. Most important, as you know from writing term papers, files allow
you to save, safely and permanently, many hours of work.
To compose and execute programs in this manner, you perform the following steps:
1. Select the option New File from the File menu of the shell window.
2. In the new window, enter Python expressions or statements on separate lines, in the order in which you
want Python to execute them.
3. At any point, you may save the file by selecting File/Save. If you do this, you should use a .py extension. For
example, your first program file might be named myprogram.py.
4. To run this file of code as a Python script, select Run Module from the Run menu or press the F5 key.
The command in Step 4 reads the code from the saved file and executes it. If Python executes any print functions
in the code, you will see the outputs as usual in the shell window. If the code requests any inputs, the interpreter will
pause to allow you to enter them. Otherwise, program execution continues invisibly behind the scenes. When the
interpreter has finished executing the last instruction, it quits and returns you to the shell prompt.
Figure 1-7 shows an IDLE window containing a complete script that prompts the user for the width and height of
a rectangle, computes its area, and outputs the result:
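A sketch of such a script (the variable names and prompts here are illustrative):

"""
Computes and prints the area of a rectangle
from the width and height entered by the user.
"""
width = int(input("Enter the width: "))
height = int(input("Enter the height: "))
area = width * height
print("The area is", area)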
When the script is run from the IDLE window, it produces the interaction with the user in the shell window shown
in Figure 1-8.
This can be a slightly less interactive way of executing programs than entering them directly at Python’s interpreter
prompt. However, running the script from the IDLE window will allow you to construct some complex programs, test
them, and save them in program libraries that you can reuse or share with others.
Whether you enter code at the shell prompt or run it from a script, the Python interpreter processes it in three steps:
1. The interpreter reads a Python expression or statement, also called the source code, and verifies that it is
well formed. In this step, the interpreter behaves like a strict English teacher who rejects any sentence that
does not adhere to the grammar rules, or syntax, of the language. As soon as the interpreter encounters
such an error, it halts translation with an error message.
2. If a Python expression is well formed, the interpreter then translates it to an equivalent form in a low-level
language called byte code. When the interpreter runs a script, it completely translates it to byte code.
3. This byte code is next sent to another software component, called the Python virtual machine (PVM), where
it is executed. If another error occurs during this step, execution also halts with an error message.
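Although you will not need it for some time, Python’s standard dis module can display the byte code for a small piece of source code, which makes step 2 a bit more concrete:

import dis

dis.dis("3 + 4")    # prints the byte-code instructions for this expression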
Exercise 1-3
1. Describe what happens when the programmer enters the string “Greetings!” in the Python shell.
2. Write a line of code that prompts the user for his or her name and saves the user’s input in a variable
called name.
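Consider a shell session along the following lines (the prompt and the value entered are illustrative, and Python’s full error traceback is abbreviated to its final line):

>>> length = input("Enter the length: ")
Enter the length: 44
>>> print(lenth)
NameError: name 'lenth' is not defined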
The first statement assigns an input value to the variable length. The next statement attempts to print the value
of the variable lenth. Python responds that this name is not defined. Although the programmer might have meant to
write the variable length, Python can read only what the programmer actually entered. This is a good example of the
rule that a computer can read only the instructions it receives, not the instructions we intend to give it.
The next statement attempts to print the value of the correctly spelled variable. However, Python still generates
an error message.
>>>  print(length)
SyntaxError: unexpected indent
In this error message, Python explains that this line of code is unexpectedly indented. In fact, there is an extra
space before the word print. Indentation is significant in Python code. Each line of code entered at a shell prompt or
in a script must begin in the leftmost column, with no leading spaces. The only exception to this rule occurs in control
statements and definitions, where nested statements must be indented one or more spaces.
You might think that it would be painful to keep track of indentation in a program. However, the Python language
is much simpler than other programming languages. Consequently, there are fewer types of syntax errors to encounter
and correct, and a lot less syntax for you to learn!
In our final example, the programmer attempts to add two numbers but forgets to include the second one:
>>> 3 +
SyntaxError: invalid syntax
In later chapters, you will learn more about other kinds of program errors and how to repair the code that generates
them.
Summary
• One of the most fundamental ideas of computer science is the algorithm. An algorithm is a sequence of
instructions for solving a problem. A computing agent can carry out these instructions to solve a problem in a
finite amount of time.
• Another fundamental idea of computer science is information processing. Practically any relationship among real-
world objects can be represented as information or data. Computing agents manipulate information and transform
it by following the steps described in algorithms.
• Real computing agents can be constructed out of hardware devices. These consist of a CPU, memory, and input
and output devices. The CPU contains circuitry that executes the instructions described by algorithms. The
memory contains switches that represent binary digits. All information stored in memory is represented in binary
form. Input devices such as a keyboard and flatbed scanner and output devices such as a monitor and speakers
transmit information between the computer’s memory and the external world. These devices also transfer
information between a binary form and a form that human beings can use.
• Some real computers, such as those in fitness trackers and home thermostats, are specialized for a small set of
tasks, whereas a desktop or laptop computer is a general-purpose problem-solving machine.
• Software provides the means whereby different algorithms can be run on a general-purpose hardware device.
The term software can refer to editors and interpreters for developing programs; an operating system for
managing hardware devices; user interfaces for communicating with human users; and applications such as word
processors, spreadsheets, database managers, games, and media-processing programs.
• Software is written in programming languages. Languages such as Python are high level; they resemble English
and allow authors to express their algorithms clearly to other people. A program called an interpreter translates a
Python program to a lower-level form that can be executed on a real computer.
• The Python shell provides a command prompt for evaluating and viewing the results of Python expressions and
statements. IDLE is an integrated development environment that allows the programmer to save programs in files
and load them into a shell for testing.
• Python scripts are programs that are saved in files and run from a terminal command prompt. An interactive
script consists of a set of input statements, statements that process these inputs, and statements that output the
results.
• When a Python program is executed, it is translated into byte code. This byte code is then sent to the PVM for
further interpretation and execution.
• Syntax is the set of rules for forming correct expressions and statements in a programming language. When the interpreter encounters a syntax error in a Python program, it halts translation with an error message. Two examples of errors reported by the interpreter are a reference to a variable that does not yet have a value (a name error) and an indentation that is unexpected (a syntax error).
Key Terms
abacus, abstraction, algorithm, applications software, argument, artificial intelligence, assembler, assembly languages, batch processing, big data, binary digits, bits, bit-mapped display screen, byte code, card reader, cathode ray tube (CRT) screen, central processing unit (CPU), client, client/server applications, cloud computing, compiler, computing agent, concurrent processing,
data networks, data science, digital cloud, executed, file system, graphical user interface (GUI), hardware, high-level programming languages, hypermedia, information processing, input, input/output devices, integrated circuit, Internet of Things (IoT), interpreter, keypunch machine, loader, machine code, magnetic storage media, mainframe computers, memory, microprocessor, Moore's Law,
newline, operating system, optical storage media, output, personal digital assistants (PDAs), ports, primary memory, processor, programs, programming languages, program libraries, Python Shell, Python virtual machine (PVM), random access memory (RAM), run-time system, secondary memory, semiconductor storage media, server, server farms, shell, software, software engineering
Review Questions
1. Which of the following is an example of an algorithm?
a. An audio CD
b. A refrigerator
c. An automobile
d. A stereo speaker
a. Speaker
b. Microphone
c. Printer
d. Display screen
a. A compiler
b. A text editor
c. A loader
d. An interpreter
a. Edit, compile, and run Python programs
b. Just edit Python programs
c. Just compile Python programs
d. Just run Python programs
10. What is the set of rules for forming sentences in a language called?
a. Semantics
b. Pragmatics
c. Syntax
d. Logic
Programming Exercises
1. Write a Python program in a file named myinfo.py that prints (displays) your name, address, and telephone
number. (LO: 1.4)
2. Open an IDLE window and enter the program from Figure 1-7 that computes the area of a rectangle. Save the program to a file named rectangle.py, load it into the shell by pressing the F5 key, and correct any errors that occur. Test the program with different inputs by running it at least three times. (LO: 1.4)
3. Write a program in a file named triangle.py to compute the area of a triangle. Issue the appropriate prompts
for the triangle’s base and height. Then, use the formula .5 * base * height to compute the area. Test the
program from an IDLE window. (LO: 1.4)
4. Write and test a program in a file named circle.py that computes the area of a circle. This program
should request a number representing a radius as input from the user. It should use the formula
3.14 * radius ** 2 to compute the area and then output this result suitably labeled. (LO: 1.4)
5. A cuboid is a solid figure bounded by six rectangular faces. Its dimensions are its height, width, and depth.
Write a Python program in a file named cuboid.py that computes and prints the volume of a cuboid, given its
height, width, and depth as inputs. The volume is just the product of these three inputs. The output should be
labeled as “cubic units.” (LO: 1.4)
Debugging Exercise
Consider the following interaction at the Python shell:
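One session consistent with the outputs described next (the prompt text and the input values are assumed for illustration) is:
>>> first = input("Enter the first number: ")
Enter the first number: 23
>>> second = input("Enter the second number: ")
Enter the second number: 44
>>> print(first + second)
2344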
The expected output is 67, but the output of this computation is 2344. Explain what causes this error and describe
how to correct it.
Chapter 2
Software Development, Data Types, and Expressions
Learning Objectives
When you complete this chapter, you will be able to:
› 2.1 Describe the basic phases of software development: analysis, design, coding, and testing
› 2.2 Use strings for the terminal input and output of text
› 2.3 Use integers and floating-point numbers in arithmetic operations
› 2.4 Construct arithmetic expressions
› 2.5 Import functions from library modules
This chapter begins with a discussion of the software development process, followed by a case study in which
you walk through the steps of program analysis, design, coding, and testing. The chapter also examines the
basic elements that form programs. These include the data types for text and numbers and the expressions
that manipulate them. The chapter concludes with an introduction to the use of functions and modules in
simple programs.
1. Customer request: In this phase, the programmers receive a broad statement of a problem that is
potentially amenable to a computerized solution. This step is also called the user requirements phase.
2. Analysis: The programmers determine what the program will do. This is sometimes viewed as a process of
clarifying the specifications for the problem.
3. Design: The programmers determine how the program will do its task.
4. Implementation: The programmers write the program. This step is also called the coding phase.
5. Integration: Large programs have many parts. In the integration phase, these parts are brought together into
a smoothly functioning whole, usually not an easy task.
6. Maintenance: Programs usually have a long life; a lifespan of 5 to 15 years is common for software. During
this time, requirements change, errors are detected, and minor or major modifications are made.
The phases of the waterfall model are shown in Figure 2-1. As you can see, the figure resembles a waterfall, in
which the results of each phase flow down to the next. However, if a developer makes a mistake in an early phase, it
may require them to back up and redo some of the work. Modifications made during maintenance also require backing
up to earlier phases. Taken together, these phases are also called the software development life cycle.
[Figure 2-1: The waterfall model of software development: Analysis, Design, Implementation, Integration, and Maintenance, with each phase verified or tested before its results flow down to the next.]
Although the diagram depicts distinct phases, this does not mean that developers must analyze and design a
complete system before coding it. Modern software development is usually incremental and iterative. This means that analysis and design may produce a rough draft, skeletal version, or prototype of a system for coding; after some testing, developers then back up to earlier phases to fill in more details. For purposes of introducing this process, however, this
chapter treats these phases as distinct.
Programs rarely work perfectly the first time they are run, which is why they should be subjected to extensive and
careful testing. Many people think that testing is an activity that applies only to the implementation and integration
phases; however, you should scrutinize the outputs of each phase carefully. Keep in mind that mistakes found early are
much less expensive to correct than those found late. Figure 2-2 illustrates some relative costs of repairing mistakes
when found in different phases. These are not just financial costs but also costs in time and effort.