Data Modeling and Database Design, 2nd Edition

The document presents the front matter of the second edition of 'Data Modeling and Database Design' by Narayan S. Umanath and Richard W. Scamell, published by Cengage Learning. It outlines the contents of the book, including chapters on database systems, conceptual data modeling, logical data modeling, normalization, and database implementation.


Data modeling/database design life cycle

[Figure: a flow diagram of the data modeling/database design life cycle. The universe of interest yields a requirements specification, which divides into process specifications (the input to process modeling, which produces a process model) and data specifications (the input to data modeling). Conceptual data modeling produces the conceptual design/schema: first a Presentation Layer ER model and then a Design-Specific ER model, each consisting of an ER diagram plus a list of other semantic integrity constraints [ER Modeling Grammar]. Logical data modeling maps the Design-Specific ER model to a technology-independent logical schema [Information-Preserving Grammar], which normalization refines before it is mapped to a technology-dependent logical schema [Relational Modeling Grammar]. Physical data modeling then yields the physical design/schema.]
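The life cycle above ends in a technology-dependent schema expressed in a concrete DBMS's data definition language. As a minimal sketch of that end product (not taken from the book; the tables, columns, and the choice of SQLite are hypothetical illustrations), the output of the normalization stage might be implemented like this:

```python
import sqlite3

# A technology-dependent logical schema: the normalized relations,
# expressed here as SQLite DDL. Table and column names are
# hypothetical; the book's own examples differ.
ddl = """
CREATE TABLE department (
    dept_no   INTEGER PRIMARY KEY,
    dept_name TEXT NOT NULL UNIQUE
);
CREATE TABLE employee (
    emp_no    INTEGER PRIMARY KEY,
    emp_name  TEXT NOT NULL,
    dept_no   INTEGER NOT NULL,
    -- referential integrity constraint carried forward from the ER model
    FOREIGN KEY (dept_no) REFERENCES department (dept_no)
);
"""

conn = sqlite3.connect(":memory:")
conn.executescript(ddl)

# The schema itself is metadata -- data about data -- and is queryable.
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table' ORDER BY name")]
print(tables)  # ['department', 'employee']
```

Each earlier stage of the life cycle constrains this final artifact: the foreign-key clause, for instance, is the physical trace of a relationship type first captured in the conceptual ER model.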

Copyright 2015 Cengage Learning. All Rights Reserved. May not be copied, scanned, or duplicated, in whole or in part. Due to electronic rights, some third party content may be suppressed from the eBook and/or eChapter(s).
Editorial review has deemed that any suppressed content does not materially affect the overall learning experience. Cengage Learning reserves the right to remove additional content at any time if subsequent rights restrictions require it.
DATA MODELING AND
DATABASE DESIGN

DATA MODELING AND
DATABASE DESIGN
Second Edition

Narayan S. Umanath
University of Cincinnati
Richard W. Scamell
University of Houston

Australia • Brazil • Mexico • Singapore • United Kingdom • United States

This is an electronic version of the print textbook. Due to electronic rights restrictions,
some third party content may be suppressed. Editorial review has deemed that any suppressed
content does not materially affect the overall learning experience. The publisher reserves the right
to remove content from this title at any time if subsequent rights restrictions require it. For
valuable information on pricing, previous editions, changes to current editions, and alternate
formats, please visit www.cengage.com/highered to search by ISBN#, author, title, or keyword for
materials in your areas of interest.

Data Modeling and Database Design, Second Edition
Narayan S. Umanath and Richard W. Scamell

Production Director: Patty Stephan
Product Manager: Clara Goosman
Managing Developer: Jeremy Judson
Content Developer: Wendy Langeurd
Product Assistant: Brad Sullender
Senior Marketing Manager: Eric La Scola
IP Analyst: Sara Crane
Senior IP Project Manager: Kathryn Kucharek
Manufacturing Planner: Ron Montgomery
Art and Design Direction, Production Management, and Composition: PreMediaGlobal
Cover Image: © VikaSuh/www.Shutterstock.com

© 2015 Cengage Learning
WCN: 02-200-203

ALL RIGHTS RESERVED. No part of this work covered by the copyright herein may be reproduced, transmitted, stored, or used in any form or by any means graphic, electronic, or mechanical, including but not limited to photocopying, recording, scanning, digitizing, taping, Web distribution, information networks, or information storage and retrieval systems, except as permitted under Section 107 or 108 of the 1976 United States Copyright Act, without the prior written permission of the publisher.

For product information and technology assistance, contact us at Cengage Learning Customer & Sales Support, 1-800-354-9706.
For permission to use material from this text or product, submit all requests online at www.cengage.com/permissions. Further permissions questions can be e-mailed to permissionrequest@cengage.com.

Library of Congress Control Number: 2014934580
ISBN-13: 978-1-285-08525-8
ISBN-10: 1-285-08525-6

Cengage Learning
20 Channel Center Street
Boston, MA 02210
USA

Cengage Learning is a leading provider of customized learning solutions with office locations around the globe, including Singapore, the United Kingdom, Australia, Mexico, Brazil, and Japan. Locate your local office at www.cengage.com/global.

Cengage Learning products are represented in Canada by Nelson Education, Ltd.

To learn more about Cengage Learning Solutions, visit www.cengage.com.

Purchase any of our products at your local college store or at our preferred online store, www.cengagebrain.com.

Printed in the United States of America
1 2 3 4 5 6 7 18 17 16 15 14

To Beloved Bhagwan Sri Sathya Sai Baba, the very source
of my thoughts, words, and deeds
To my Graduate Teaching Assistants and students,
the very source of my inspiration
To my dear children, Sharda and Kausik, always concerned
about their dad overworking
To my dear wife Lalitha, a pillar of courage I always lean on
Uma

There is a verse that says


Focus on what I’m doing right now
And tell me that you appreciate me
So that I learn to feel worthy
And motivated to do more
Led by my family, I have always been surrounded by people
(friends, teachers, and students) who
With their kind thoughts, words, and deeds treat me in this way.
This book is dedicated to these people.
Richard

BRIEF CONTENTS

Preface xvii
Chapter 1
Database Systems: Architecture and Components 1

Part I: Conceptual Data Modeling

Chapter 2
Foundation Concepts 30

Chapter 3
Entity-Relationship Modeling 79

Chapter 4
Enhanced Entity-Relationship (EER) Modeling 141

Chapter 5
Modeling Complex Relationships 197

Part II: Logical Data Modeling

Chapter 6
The Relational Data Model 280

Part III: Normalization

Chapter 7
Functional Dependencies 358

Chapter 8
Normal Forms Based on Functional Dependencies 395

Chapter 9
Higher Normal Forms 467


Part IV: Database Implementation Using the Relational Data Model

Chapter 10
Database Creation 506

Chapter 11
Relational Algebra 539

Chapter 12
Structured Query Language (SQL) 567

Chapter 13
Advanced Data Manipulation Using SQL 635

Appendix A
Data Modeling Architectures Based on the Inverted Tree
and Network Data Structures 719

Appendix B
Object-Oriented Data Modeling Architectures 731

Selected Bibliography 739

Index 743

TABLE OF CONTENTS

Preface xvii

Chapter 1 Database Systems: Architecture and Components 1


1.1 Data, Information, and Metadata 1
1.2 Data Management 3
1.3 Limitations of File-Processing Systems 3
1.4 The ANSI/SPARC Three-Schema Architecture 6
1.4.1 Data Independence Defined 8
1.5 Characteristics of Database Systems 10
1.5.1 What Is a Database System? 11
1.5.2 What Is a Database Management System? 12
1.5.3 Advantages of Database Systems 15
1.6 Data Models 17
1.6.1 Data Models and Database Design 17
1.6.2 Data Modeling and Database Design in a Nutshell 19
Chapter Summary 25
Exercises 25

Part I: Conceptual Data Modeling

Chapter 2 Foundation Concepts 30


2.1 A Conceptual Modeling Framework 30
2.2 ER Modeling Primitives 30
2.3 Foundations of the ER Modeling Grammar 32
2.3.1 Entity Types and Attributes 32
2.3.2 Entity and Attribute-Level Data Integrity Constraints 35
2.3.3 Relationship Types 38
2.3.4 Structural Constraints of a Relationship Type 43
2.3.5 Base Entity Types and Weak Entity Types 52
2.3.6 Cluster Entity Type: A Brief Introduction 57
2.3.7 Specification of Deletion Constraints 58
Chapter Summary 70
Exercises 71

Chapter 3 Entity-Relationship Modeling 79


3.1 Bearcat Incorporated: A Case Study 79
3.2 Applying the ER Modeling Grammar to the Conceptual Modeling Process 81
3.2.1 The Presentation Layer ER Model 82
3.2.2 The Presentation Layer ER Model for Bearcat Incorporated 85


3.2.3 The Design-Specific ER Model 104


3.2.4 The Decomposed Design-Specific ER Model 111
3.3 Data Modeling Errors 119
3.3.1 Vignette 1 120
3.3.2 Vignette 2 127
Chapter Summary 134
Exercises 134

Chapter 4 Enhanced Entity-Relationship (EER) Modeling 141


4.1 Superclass/subclass Relationship 142
4.1.1 A Motivating Exemplar 142
4.1.2 Introduction to the Intra-Entity Class Relationship Type 143
4.1.3 General Properties of a Superclass/subclass Relationship 145
4.1.4 Specialization and Generalization 146
4.1.5 Specialization Hierarchy and Specialization Lattice 154
4.1.6 Categorization 157
4.1.7 Choosing the Appropriate EER Construct 160
4.1.8 Aggregation 166
4.2 Converting from the Presentation Layer to a Design-Specific EER Diagram 168
4.3 Bearcat Incorporated Data Requirements Revisited 170
4.4 ER Model for the Revised Story 171
4.5 Deletion Rules for Intra-Entity Class Relationships 182
Chapter Summary 188
Exercises 188

Chapter 5 Modeling Complex Relationships 197


5.1 The Ternary Relationship Type 198
5.1.1 Vignette 1—Madeira College 198
5.1.2 Vignette 2—Get Well Pharmacists, Inc. 203
5.2 Beyond the Ternary Relationship Type 205
5.2.1 The Case for a Cluster Entity Type 205
5.2.2 Vignette 3—More on Madeira College 206
5.2.3 Vignette 4—A More Complex Entity Clustering 212
5.2.4 Cluster Entity Type—Additional Examples 212
5.2.5 Madeira College—The Rest of the Story 216
5.2.6 Clustering a Recursive Relationship Type 221
5.3 Inter-Relationship Integrity Constraint 224
5.4 Composites of Weak Relationship Types 230
5.4.1 Inclusion Dependency in Composite Relationship Types 230
5.4.2 Exclusion Dependency in Composites of Weak Relationship Types 231
5.5 Decomposition of Complex Relationship Constructs 234
5.5.1 Decomposing Ternary and Higher-Order Relationship Types 234
5.5.2 Decomposing a Relationship Type with a Multi-Valued Attribute 235
5.5.3 Decomposing a Cluster Entity Type 240
5.5.4 Decomposing Recursive Relationship Types 241
5.5.5 Decomposing a Weak Relationship Type 244


5.6 Validation of the Conceptual Design 246


5.6.1 Fan Trap 246
5.6.2 Chasm Trap 251
5.6.3 Miscellaneous Semantic Traps 253
5.7 Cougar Medical Associates 257
5.7.1 Conceptual Model for CMA: The Genesis 259
5.7.2 Conceptual Model for CMA: The Next Generation 265
5.7.3 The Design-Specific ER Model for CMA: The Final Frontier 266
Chapter Summary 273
Exercises 273

Part II: Logical Data Modeling

Chapter 6 The Relational Data Model 280


6.1 Definition 280
6.2 Characteristics of a Relation 282
6.3 Data Integrity Constraints 283
6.3.1 The Concept of Unique Identifiers 284
6.3.2 Referential Integrity Constraint in the Relational Data Model 290
6.4 A Brief Introduction to Relational Algebra 291
6.4.1 Unary Operations: Selection (σ) and Projection (π) 292
6.4.2 Binary Operations: Union (∪), Difference (−), and Intersection (∩) 293
6.4.3 The Natural Join (*) Operation 295
6.5 Views and Materialized Views in the Relational Data Model 296
6.6 The Issue of Information Preservation 297
6.7 Mapping an ER Model to a Logical Schema 298
6.7.1 Information-Reducing Mapping of ER Constructs 298
6.7.2 An Information-Preserving Mapping 315
6.8 Mapping Enhanced ER Model Constructs to a Logical Schema 320
6.8.1 Information-Reducing Mapping of EER Constructs 321
6.8.2 Information-Preserving Grammar for Enhanced ER Modeling Constructs 328
6.9 Mapping Complex ER Model Constructs to a Logical Schema 336
Chapter Summary 345
Exercises 347

Part III: Normalization

Chapter 7 Functional Dependencies 358


7.1 A Motivating Exemplar 359
7.2 Functional Dependencies 365
7.2.1 Definition of Functional Dependency 365
7.2.2 Inference Rules for Functional Dependencies 366
7.2.3 Minimal Cover for a Set of Functional Dependencies 367
7.2.4 Closure of a Set of Attributes 372
7.2.5 When Do FDs Arise? 374


7.3 Candidate Keys Revisited 374


7.3.1 Deriving Candidate Key(s) by Synthesis 375
7.3.2 Deriving Candidate Keys by Decomposition 379
7.3.3 Deriving a Candidate Key—Another Example 382
7.3.4 Prime and Non-prime Attributes 386
Chapter Summary 390
Exercises 390

Chapter 8 Normal Forms Based on Functional Dependencies 395


8.1 Normalization 395
8.1.1 First Normal Form (1NF) 396
8.1.2 Second Normal Form (2NF) 398
8.1.3 Third Normal Form (3NF) 401
8.1.4 Boyce-Codd Normal Form (BCNF) 404
8.1.5 Side Effects of Normalization 407
8.1.6 Summary Notes on Normal Forms 418
8.2 The Motivating Exemplar Revisited 420
8.3 A Comprehensive Approach to Normalization 424
8.3.1 Case 1 424
8.3.2 Case 2 431
8.3.3 A Fast-Track Algorithm for a Non-Loss, Dependency-Preserving
Solution 436
8.4 Denormalization 442
8.5 Role of Reverse Engineering in Data Modeling 443
8.5.1 Reverse Engineering the Normalized Solution of Case 1 445
8.5.2 Reverse Engineering the Normalized Solution of URS2 (Case 3) 451
8.5.3 Reverse Engineering the Normalized Solution of URS3 (Case 2) 453
Chapter Summary 457
Exercises 458

Chapter 9 Higher Normal Forms 467


9.1 Multi-Valued Dependency 467
9.1.1 A Motivating Exemplar for Multi-Valued Dependency 467
9.1.2 Multi-Valued Dependency Defined 469
9.1.3 Inference Rules for Multi-Valued Dependencies 470
9.2 Fourth Normal Form (4NF) 472
9.3 Resolution of a 4NF Violation—A Comprehensive Example 476
9.4 Generality of Multi-Valued Dependencies and 4NF 478
9.5 Join-Dependencies and Fifth Normal Form (5NF) 480
9.6 A Thought-Provoking Exemplar 490
9.7 A Note on Domain Key Normal Form (DK/NF) 497
Chapter Summary 498
Exercises 498


Part IV: Database Implementation Using the Relational Data Model

Chapter 10 Database Creation 506


10.1 Data Definition Using SQL 507
10.1.1 Base Table Specification in SQL/DDL 507
10.2 Data Population Using SQL 524
10.2.1 The INSERT Statement 525
10.2.2 The DELETE Statement 528
10.2.3 The UPDATE Statement 530
Chapter Summary 532
Exercises 532

Chapter 11 Relational Algebra 539


11.1 Unary Operators 542
11.1.1 The Select Operator 542
11.1.2 The Project Operator 544
11.2 Binary Operators 546
11.2.1 The Cartesian Product Operator 546
11.2.2 Set Theoretic Operators 549
11.2.3 Join Operators 551
11.2.4 The Divide Operator 557
11.2.5 Additional Relational Operators 560
Chapter Summary 563
Exercises 563

Chapter 12 Structured Query Language (SQL) 567


12.1 SQL Queries Based on a Single Table 569
12.1.1 Examples of the Selection Operation 569
12.1.2 Use of Comparison and Logical Operators 572
12.1.3 Examples of the Projection Operation 578
12.1.4 Grouping and Summarizing 580
12.1.5 Handling Null Values 583
12.1.6 Pattern Matching in SQL 593
12.2 SQL Queries Based on Binary Operators 597
12.2.1 The Cartesian Product Operation 597
12.2.2 SQL Queries Involving Set Theoretic Operations 599
12.2.3 Join Operations 602
12.2.4 Outer Join Operations 608
12.2.5 SQL and the Semi-Join and Semi-Minus Operations 612
12.3 Subqueries 613
12.3.1 Multiple-Row Uncorrelated Subqueries 613
12.3.2 Multiple-Row Correlated Subqueries 625
12.3.3 Aggregate Functions and Grouping 628
Chapter Summary 631
Exercises 631


Chapter 13 Advanced Data Manipulation Using SQL 635


13.1 Selected SQL:2003 Built-In Functions 635
13.1.1 The SUBSTRING Function 636
13.1.2 The CHAR_LENGTH (char) Function 639
13.1.3 The TRIM Function 640
13.1.4 The TRANSLATE Function 643
13.1.5 The POSITION Function 644
13.1.6 Combining the INSTR and SUBSTR Functions 645
13.1.7 The DECODE Function and the CASE Expression 646
13.1.8 A Query to Simulate the Division Operation 649
13.2 Some Brief Comments on Handling Dates and Times 651
13.3 Hierarchical Queries 656
13.3.1 Using the CONNECT BY and START WITH Clauses with
the PRIOR Operator 658
13.3.2 Using the LEVEL Pseudo-Column 660
13.3.3 Formatting the Results from a Hierarchical Query 661
13.3.4 Using a Subquery in a START WITH Clause 661
13.3.5 The SYS_CONNECT_BY_PATH Function 663
13.3.6 Joins in Hierarchical Queries 664
13.3.7 Incorporating a Hierarchical Structure into a Table 665
13.4 Extended GROUP BY Clauses 668
13.4.1 The ROLLUP Operator 668
13.4.2 Passing Multiple Columns to ROLLUP 669
13.4.3 Changing the Position of Columns Passed to ROLLUP 671
13.4.4 Using the CUBE Operator 672
13.4.5 The GROUPING () Function 674
13.4.6 The GROUPING SETS Extension to the GROUP BY Clause 676
13.4.7 The GROUPING_ID () 677
13.4.8 Using a Column Multiple Times in a GROUP BY Clause 679
13.5 Using the Analytical Functions 681
13.5.1 Analytical Function Types 682
13.5.2 The RANK () and DENSE_RANK () Functions 684
13.5.3 Using ROLLUP, CUBE, and GROUPING SETS Operators with
Analytical Functions 687
13.5.4 Using the Window Functions 688
13.6 A Quick Look at the MODEL Clause 692
13.6.1 MODEL Clause Concepts 693
13.6.2 Basic Syntax of the MODEL Clause 693
13.6.3 An Example of the MODEL Clause 694
13.7 A Potpourri of Other SQL Queries 700
13.7.1 Concluding Example 1 700
13.7.2 Concluding Example 2 702
13.7.3 Concluding Example 3 704
13.7.4 Concluding Example 4 704
13.7.5 Concluding Example 5 705
Chapter Summary 706
Exercises 707
SQL Project 711


Appendix A Data Modeling Architectures Based on the Inverted Tree


and Network Data Structures 719
A.1 Logical Data Structures 719
A.1.1 Inverted Tree Structure 719
A.1.2 Network Data Structure 721
A.2 Logical Data Model Architectures 722
A.2.1 Hierarchical Data Model 722
A.2.2 CODASYL Data Model 726
Summary 729
Selected Bibliography 729

Appendix B Object-Oriented Data Modeling Architectures 731


B.1 The Object-Oriented Data Model 731
B.1.1 Overview of OO Concepts 732
B.1.2 A Note on UML 735
B.2 The Object-Relational Data Model 737
Summary 738
Selected Bibliography 738

Selected Bibliography 739


Index 743

PREFACE

QUOTE
Everything should be made as simple as possible—but no simpler.
—Albert Einstein

Popular business database books typically provide broad coverage of a wide variety of
topics, including data modeling, database design and implementation, database
administration, the client/server database environment, the Internet database envi-
ronment, distributed databases, and object-oriented database development. This is
invariably at the expense of deeper treatment of critical topics, such as principles of
data modeling and database design. Using current business database books in our
courses, we found that in order to properly cover data modeling and database design,
we had to augment the texts with significant supplemental material (1) to achieve
precision and detail and (2) to impart the depth necessary for the students to gain a
robust understanding of data modeling and database design. In addition, we ended up
skipping several chapters as topics to be covered in a different course. We also know
other instructors who share this experience. Broad coverage of many database topics
in a single book is appropriate for some audiences, but that is not the aim of this
book.
The goal of Data Modeling and Database Design, Second Edition is to provide
core competency in the areas that every Information Systems (IS), Computer Science
(CS), and Computer Information Systems (CIS) student and professional should
acquire: data modeling and database design. It is our experience that this set of
topics is the most essential for database professionals, and that, covered in sufficient
depth, these topics alone require a full semester of study. It is our intention to
address these topics at the level of technical depth achieved in CS textbooks, yet to make
them palatable to the business student or IS professional with little sacrifice of precision. We
deliberately refrain from the mathematics and algorithmic solutions usually found in
CS textbooks, yet we attempt to capture the precision therein via heuristic
expressions.
Data Modeling and Database Design, Second Edition provides not just hands-on
instruction in current data modeling and database design practices but also a
thorough conceptual background for those practices. We do not subscribe to the idea
that a textbook should limit itself to describing what is actually being practiced.
Teaching only what is being practiced is bound to lead to knowledge stagnation.
Where do practitioners learn what they know? Did they invent the relational data
model? Did they invent the ER model? We believe that it is our responsibility to
present not only industry “best practices” but also to provide students (future practi-
tioners) with concepts and techniques that are not necessarily used in industry today


but can enliven their practice and help it evolve without knowledge stagnation. One
of the coauthors of this book has worked in the software development industry for
over 15 years, with a significant focus on database development. His experience indi-
cates that having a richness of advanced data modeling constructs available enhances
the robustness of database design and that practitioners readily adopt these techni-
ques in their design practices.
In a nutshell, our goal is to take an IS/CS/CIS student/professional through an
intense educational experience, starting at conceptual modeling and culminating in a
fully implemented database design—nothing more and nothing less. This educational
journey is briefly articulated in the following paragraphs.

STRUCTURE
We have tried very hard to make the book “fluff-free.” It is our hope that every sen-
tence in the book, including this preface, adds value to a reader’s learning (and foot-
notes are no exception to this statement).
The book begins with an introduction to rudimentary concepts of data, metadata,
and information, followed by an overview of data management. Pointing out the limita-
tions of file-processing systems, Chapter 1 introduces database systems as a solution to
overcome these limitations. The architecture and components of a database system that
makes this possible are discussed. The chapter concludes with the presentation of a
framework for the database system design life cycle. Following the introductory chapter
on database systems architecture and components, the book contains four parts.

Part I: Conceptual Data Modeling


Part I addresses the topic of conceptual data modeling—that is, modeling at the high-
est level of abstraction, independent of the limitations of the technology employed to
deploy the database system. Four chapters (Chapters 2–5) provide an extensive
discussion of conceptual data modeling. Chapter 2 lays the groundwork using the
Entity-Relationship (ER) modeling grammar as the principal means
to model a database application domain. Chapter 3 elaborates on the use of the ER
modeling grammar in progressive layers and exemplifies the modeling technique with
a comprehensive case called Bearcat Incorporated. This is followed by a presentation
in Chapter 4 of richer data modeling constructs that overlap with object-oriented
modeling constructs. The Bearcat Incorporated story is further enriched to demon-
strate the value of Enhanced ER (EER) modeling constructs. Chapter 5 provides
exclusive coverage of modeling complex relationships that have meaningful real-world
significance. At the end of Part I, the reader ought to be able to fully appreciate the
value of conceptual data modeling in the database system design life cycle.
This second edition of Data Modeling and Database Design includes the follow-
ing major enhancements:
• The material in Chapters 2 and 3 has been reorganized and better streamlined
so that the reader not only learns the ER modeling grammar but is able
to develop very simple applications of ER modeling. In Chapter 3, the modeling
method steps have been reconfigured across the Presentation Layer and
Design-Specific layer of the ER model. Also, the unique learning technique
via error detection exclusively developed by us is presented at the end of
Chapter 3.
• The intra-entity class relationships are introduced with a new simpler exam-
ple at the beginning of Chapter 4.
• The already extensive coverage of complex relationships in Chapter 5 is aug-
mented by a few newer modeling ideas. Additional examples clarifying
decomposition of complex relationships in preparation for logical model
mapping have also been added to this chapter.

Part II: Logical Data Modeling


Part II of the book is dedicated to the discussion of migration of a conceptual data
model to its logical counterpart. Since the relational data model architecture forms
the basis for the logical data modeling discussed in this textbook, Chapter 6 focuses
on its characteristics. Other logical data modeling architectures prevalent in some
legacy systems (the hierarchical data model and the CODASYL data model) appear in
Appendix A. An introduction to object-oriented data modeling concepts is presented
in Appendix B. The rest of Chapter 6 describes techniques to map a conceptual data
model to its logical counterpart. An information-preserving logical data modeling
grammar is introduced and contrasted with existing popular mapping techniques that
are information reducing. A comprehensive set of examples is used to clarify the use
and value of the information-preserving grammar.
An important addition to the current edition of the book is a section on mapping
complex relationship types to the logical tier.

Part III: Normalization


Part III addresses the critical question of the “goodness” of a database design that
results from the conceptual and logical data modeling processes. Normalization is
introduced as the “scientific” way to verify and improve the quality of a logical
schema that is available at this stage in the database design. Three chapters are
employed to cover the topic of normalization. In Chapter 7, we take a look at data
redundancy in a relation schema and see how it manifests as a problem. We then
trace the problem to its source—namely, undesirable functional dependencies. To
that end, we first learn about functional dependencies axiomatically and how infer-
ence rules (Armstrong’s axioms) can be used to derive candidate keys of a relation
schema. In Chapter 8, the solution offered by the normalization process to data
redundancy problems triggered by undesirable functional dependencies is presented.
After discussing first, second, third, and Boyce-Codd normal forms individually, we
examine the side effects of normalization—namely, dependency preservation and
non-loss decomposition and their consequences. Next, we present real-world scenar-
ios of deriving full-fledged relational schemas (sets of relation schemas), given sets of
functional dependencies using several examples. The useful topic of denormalization
is covered next. Reverse engineering a normalized relational schema to the concep-
tual tier often forges insightful understanding of the database design and enables a
database designer to become a better data modeler. Despite its practical utility, this
topic is rarely covered in database textbooks. Chapter 9 completes the discussion of
normalization by examining multi-valued dependency (MVD) and join-dependency
(JD) and their impact on a relation schema in terms of fourth normal form (4NF) and
Project/Join normal form, viz., PJNF (also known as fifth normal form—5NF)
respectively.
An interesting enhancement in Chapter 8 is the introduction of a fast-track algo-
rithm to achieve a non-loss, dependency-preserving 3NF design. Two distinct exam-
ples demonstrating the use of the algorithm are presented. The discussion of MVD
and 4NF, of JD and 5NF, and their respective expressiveness of ternary and n-ary
relationships is presented in Chapter 9. Additional examples offer unique insights into
apparently conflicting alternative solutions.

Part IV: Database Implementation Using the Relational Database Model


Part IV pertains to database implementation using the relational data model. Spread
over four chapters, this part of the book covers relational algebra and the ANSI/ISO
standard Structured Query Language (SQL). Chapter 10 focuses on the data defini-
tion language (DDL) aspect of SQL. Included in the discussion are the SQL schema
evolution statements for adding, altering, or dropping table structures, attributes,
constraints, and supporting structures. This is followed by the development of SQL/
DDL script for a comprehensive case about a college registration system. The chapter
also includes the use of INSERT, UPDATE, and DELETE statements in populating a
database and performing database maintenance.
Chapters 11, 12, and 13 focus on relational algebra and the use of SQL for data
manipulation. Chapter 11 concentrates on E. F. Codd’s eight original relational alge-
bra operations as a means to specify the logic for data retrieval from a relational
database. SQL, the most common way that relational algebra is implemented for data
retrieval operations, is the subject of Chapter 12. Chapter 13 covers a number of
built-in functions used by SQL to work with strings, dates, and times, and it illustrates
how SQL can be used to do retrievals against hierarchically structured data. This
chapter also provides an introduction to some of the features of SQL that facilitate
the summarization and analysis of data. The chapter ends with an SQL database
project that provides students with a real-life scenario to test and apply the skills and
concepts presented in Part IV.

FEATURES OF EACH CHAPTER


Since our objective is a crisp and clear presentation of rather intricate subject matter,
each chapter begins with a simple introduction, followed by the treatment of the
subject matter, and concludes with a chapter summary and a set of exercises based
on the subject matter.

WHAT MAKES THIS BOOK DIFFERENT?


Every book has strengths and weaknesses. If lack of breadth in the coverage of
database topics is considered a weakness, we have deliberately chosen to be weak in
that dimension. We have not planned this book to be another general book on
database systems. We have chosen to limit the scope of this book exclusively to data
modeling and database design since we firmly believe that this set of topics is the
core of database systems and must be learned in depth by every IS/CS/CIS student
and practitioner. Any system designed robustly has the potential to best serve the
needs of the users. More importantly, a poor design is a virus that can ruin an
enterprise.
In this light, we believe these are the unique strengths of this book:
• It presents conceptual modeling using the entity-relationship modeling gram-
mar including extensive discussion of the enhanced entity-relationship (ER)
model.
We believe that a conceptual model should capture all possible constraints
conveyed by the business rules implicit in users’ requirement specifica-
tions. To that end, we posit that an ER diagram is not an ER model unless
accompanied by a comprehensive specification of characteristics of and
constraints pertaining to attributes. We accomplish this via a list of
semantic integrity constraints (sort of a conceptual data dictionary) that
will accompany an ER diagram, a unique feature that we have not seen in
other database textbooks. We also seek to demonstrate the systematic
development of a multi-layer conceptual data model via a comprehensive
illustration at the beginning of each Part. We consider the multi-layer
modeling strategy and the heuristics for systematic development as unique
features of this book.
• It includes substantial coverage of higher-degree relationships and other
complex relationships in the entity-relationship diagram.
Most business database books seem to provide only a cursory treatment of
complex relationships in an ER model. We not only cover relationships
beyond binary relationships (e.g., ternary and higher-degree relationships),
we also clarify the nuances pertaining to the necessity and efficacy of
higher-degree relationships and the various conditions under which even
recursive and binary relationships are aggregated in interesting ways to
form cluster entity types.
• It discusses the information-preserving issue in data model mapping and
introduces a new information-preserving grammar for logical data modeling.
Many computer scientists have noted that the major difficulty of logical
database design (i.e., transforming an ER schema into a schema in the lan-
guage of some logical model) is the information preservation issue. Indeed,
assuring a complete mapping of all modeling constructs and constraints
that are inherent, implicit or explicit, in the source schema (e.g., ER/EER
model) is problematic since constraints of the source model often cannot be
represented directly in terms of structures and constraints of the target
model (e.g., relational schema). In such a case, they must be realized
through application programs; alternatively, an information-reducing trans-
formation must be accepted (Fahrner and Vossen, 1995). In their research,
initially presented at the Workshop on Information Technologies (WITS) in
the ICIS (International Conference on Information Systems) in Brisbane,
Australia, Umanath and Chiang (2000) describe a logical modeling grammar
that generates an information preserving transformation. Umanath
further revised this modeling grammar based on the feedback received at
WITS. We have included this logical modeling grammar as a unique com-
ponent of this textbook.
• It includes unique features under the topic of normalization rarely covered in
business database books:
• Inference rules for functional dependencies (Armstrong’s axioms)
and derivations of candidate keys from a set of functional
dependencies
• Derivation of canonical covers for a set of semantically obvious func-
tional dependencies
• Rich examples to clarify the basic normal forms (first, second, third,
and Boyce-Codd)
• Derivation of a complete logical schema from a large set of functional
dependencies considering lossless (non-additive) join properties and
dependency preservation
• Reverse engineering a logical schema to an entity-relationship diagram
• Advanced coverage of fourth and fifth normal form (project-join normal
form, abbreviated “PJNF”) using a variety of examples
• It supports in-depth coverage of relational algebra with a significant number
of examples of their operationalization in ANSI/ISO SQL.

A NOTE TO THE INSTRUCTOR


The content of this book is designed for a rigorous one-semester course in database
design and development and may be used at both undergraduate and graduate levels.
Technical emphasis can be tempered by minimizing or eliminating the coverage of
some of the following topics from the course syllabus: Enhanced Entity-Relationship
(EER) Modeling (Chapter 4) and the related data model mapping topics in Chapter 6
(Section 6.8) on Mapping Enhanced ER Modeling Constructs to a Logical Schema;
Modeling Complex Relationships (Chapter 5); and higher normal forms (Chapter 9).
The suggested exclusions will not impair the continuity of the subject matter in the
rest of the book.

SUPPORTING TECHNOLOGIES
Any business database book can be effective only when supporting technologies are
made available for student use. Yet, we don’t think that the type of book we are writ-
ing should be married to any commercial product. The specific technologies that will
render this book highly effective include a drawing tool (such as Microsoft Visio), a
software engineering tool (such as ERWIN, ORACLE/Designer, or Visible Analyst),
and a relational database management system (RDBMS) product (such as ORACLE,
SQL Server, or DB2).


SUPPLEMENTAL MATERIALS
The following supplemental materials are available to instructors when this book is
used in a classroom setting. Some of these materials may also be found on the
Cengage Learning Web site at www.cengage.com.
• Electronic Instructor’s Manual: The Instructor’s Manual assists in class
preparation by providing suggestions and strategies for teaching the text, and
solutions to the end-of-chapter questions/problems.
• Sample Syllabi and Course Outline: The sample syllabi and course outlines
are provided as a foundation to begin planning and organizing your course.
• Cognero Test Bank: Cognero allows instructors to create and administer
printed, computer (LAN-based), and Internet exams. The Test Bank includes
an array of questions that correspond to the topics covered in this text,
enabling students to generate detailed study guides that include page refer-
ences for further review. The computer-based and Internet testing compo-
nents allow students to take exams at their computers, and also save
the instructor time by automatically grading each exam. The Test Bank is
also available in Blackboard and WebCT versions posted online at
www.course.com.
• PowerPoint Presentations: Microsoft PowerPoint slides for each chapter are
included as a teaching aid for classroom presentation, to make available to
students on the network for chapter review, or to be printed for classroom
distribution. Instructors can add their own slides for additional topics they
introduce to the class.
• Figure Files: Figure files from each chapter are provided for the instructor’s
use in the classroom.
• Data Files: Data files containing scripts to populate the database tables used
as examples in Chapters 11 and 12 are provided on the Cengage Learning
Web site at www.cengage.com.

ACKNOWLEDGMENTS
We have never written a textbook before. We have been using books written by our
academic colleagues, always supplemented with handouts that we developed our-
selves. Over the years, we accumulated a lot of supplemental material. In the begin-
ning, we took the positive feedback from the students about the supplemental
material rather lightly until we started to see comments like “I don’t know why I
bought the book; the instructor’s handouts were so good and much clearer than the
book” in the student evaluation forms. Our impetus to write a textbook thus origi-
nated from the consistent positive feedback from our students.
We also realized that, contrary to popular belief, business students are certainly
capable of assimilating intricate technical concepts; the trick is to frame the concepts
in meaningful business scenarios. The unsolicited testimonials from our alumni about
the usefulness of the technical depth offered in our database course in solving real-
world design problems reinforced our faith in developing a book focused exclusively
on data modeling and database design that was technically rigorous but permeated
with business relevance.
Since we both teach database courses regularly, we have had the opportunity to
field-test the manuscript of this book for close to 10 years at both undergraduate-level
and graduate-level information systems courses in the Carl Lindner College of
Business at the University of Cincinnati and in the C. T. Bauer College of Business at
the University of Houston. Hundreds of students—mostly business students—have
used earlier drafts of this textbook so far. Interestingly, even the computer science
and engineering students taking our courses have expressed their appreciation of the
content. This is a long preamble to acknowledge one of the most important and for-
mative elements in the creation of this book: our students.
The students’ continued feedback (comments, complaints, suggestions, and criticisms)
has significantly contributed to the improvement of the content. As we were
cycling through revisions of the manuscript, the graduate teaching assistants of
Dr. Umanath were a constant source of inspiration. Their meaningful questions and
suggestions added significant value to the content of this book. Dr. Scamell was ably
assisted by his graduate assistants as well.
We would also like to thank the following reviewers whose critiques, comments,
and suggestions helped shape every chapter of this book’s first edition:
Akhilesh Bajaj, University of Tulsa
Iris Junglas, Florida State University
Margaret Porciello, State University of New York/Farmingdale
Sandeep Purao, Pennsylvania State University
Jaymeen Shah, Texas State University
Last, but by no means the least, we gratefully acknowledge the significant contri-
bution of Deb Kaufmann and Kent Williams, the development editors of our first and
second editions, respectively. We cannot thank them enough for their thorough and
also prompt and supportive efforts.
Enjoy!

N. S. Umanath

R. W. Scamell

CHAPTER 1
DATABASE SYSTEMS: ARCHITECTURE AND COMPONENTS

Data modeling and database design involve elements of both art and engineering.
Understanding user requirements and modeling them in the form of an effective logical
database design is an artistic process. Transforming the design into a physical database
with functionally complete and efficient applications is an engineering process.
To better comprehend what drives the design of databases, it is important to under-
stand the distinction between data and information. Data consists of raw facts—that is,
facts that have not yet been processed to reveal their meaning. Processing these facts
provides information on which decisions can be based.
Timely and useful information requires that data be accurate and stored in a manner
that is easy to access and process. And, like any basic resource, data must be managed
carefully. Data management is a discipline that focuses on the proper acquisition, storage,
maintenance, and retrieval of data. Typically, the use of a database enables efficient and
effective management of data.
This chapter introduces the rudimentary concepts of data and how information
emerges from data when viewed through the lens of metadata. Next, the discussion
addresses data management, contrasting file-processing systems with database systems.
This is followed by brief examples of desktop, workgroup, and enterprise databases. The
chapter then presents a framework for database design that describes the multiple tiers of
data modeling and how these tiers function in database design. This framework serves as a
roadmap to guide the reader through the remainder of the book.

1.1 DATA, INFORMATION, AND METADATA


Although the terms are often used interchangeably, information is different from data.
Data can be viewed as raw material consisting of unorganized facts about things, events,
activities, and transactions. While data may have implicit meaning, the lack of
organization renders it valueless. By contrast, information is data in context—that is,
data that has been organized into a specific context such that it has value to its recipient.
As an example, consider the digits 2357111317. What does this string of digits
represent? One response is that they are simply 10 meaningless digits. Another might be
the number 31 (obtained by summing the 10 digits). A mathematician may see a set of
prime numbers, viz., 2, 3, 5, 7, 11, 13, 17. Another might see a person’s phone number with
the first three digits constituting the area code and the remaining seven digits the local
phone number. On the other hand, if the first digit is used to represent a person’s gender
(1 for male and 2 for female) and the remaining nine digits the person’s Social Security
number, the 10 digits would mean something else. Numerous other interpretations are pos-
sible, but without a context it is impossible to say what the digits represent. However, when
framed in a specific context (such as being told that the first digit represents a person’s
gender and the remaining digits the Social Security number), the data is transformed into
information. It is important to note that “information” is not necessarily the “Truth” since
the same data yields different information based on the context; information is an inference.
Metadata, in a database environment, is data that describes the properties of data. It
contains a complete definition or description of database structure (i.e., the file structure,
data type, and storage format of each data item), and other constraints on the stored data.
For example, when the structure of the 10 digits 2357111317 is revealed, the 10 digits
become information, such as a phone number. Metadata defines this structure. In other
words, through the lens of metadata, data takes on specific meaning and yields information.1
Metadata may be characterized as follows:
• The lens to view data and infer information
• A precise definition of the context for framing the data
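To make the lens metaphor concrete, here is a small sketch (our illustration, not part of the text) in which the same string of digits from the earlier example is framed by two different pieces of metadata:

```python
# The same raw digits yield different information depending on the
# metadata used to frame them.

RAW = "2357111317"

def as_phone(digits):
    """Metadata: first 3 digits are an area code, last 7 the local number."""
    return {"area_code": digits[:3], "local_number": digits[3:]}

def as_person(digits):
    """Metadata: first digit is gender (1 = male, 2 = female);
    the remaining 9 digits are a Social Security number."""
    gender = {"1": "male", "2": "female"}[digits[0]]
    return {"gender": gender, "ssn": digits[1:]}

print(as_phone(RAW))   # {'area_code': '235', 'local_number': '7111317'}
print(as_person(RAW))  # {'gender': 'female', 'ssn': '357111317'}
```

Each function plays the role of metadata here: the raw digits never change, but the information inferred from them does.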
Table 1.1 contains metadata for the data associated with a manufacturing plant. Later
in this chapter, we will see that in a database environment, metadata is recorded in what
is called a data dictionary.

Record Type  Data Element     Data Type   Size  Source   Role     Domain
PLANT        Pl_name          Alphabetic  30    Stored   Non-key
PLANT        Pl_number        Numeric     2     Stored   Key      Integer values from 10 to 20
PLANT        Budget           Numeric     7     Stored   Non-key
PLANT        Building         Alphabetic  20    Stored   Non-key
PLANT        No_of_employees  Numeric     4     Derived  Non-key

TABLE 1.1 Some metadata for a manufacturing plant

As reflected in Table 1.1, the smallest unit of data is called a data element. A group of
related data elements treated as a unit (such as Pl_name, Pl_number, Budget, Building,
and No_of_employees) is called a record type. A set of values for the data elements
constituting a record type is called a record instance or simply a record. A file is a
collection of records. A file is sometimes referred to as a data set. A company with
10 plants would have a PLANT file or a PLANT data set that contains 10 records.

1
With the advent of the data warehouse, the term “metadata” assumes a more comprehensive
meaning to include business and technical metadata, which is outside the scope of the current
discussion.
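The PLANT record type can be sketched in code. The element names, types, sizes, and the domain rule follow Table 1.1; the validation routine itself is our own illustration of how a data dictionary constrains stored records, not something from the text:

```python
# Metadata for the PLANT record type, acting as a simple data dictionary.
PLANT_METADATA = {
    "Pl_name":   {"type": str, "size": 30},
    "Pl_number": {"type": int, "size": 2, "domain": range(10, 21)},  # 10..20
    "Budget":    {"type": int, "size": 7},
    "Building":  {"type": str, "size": 20},
}

def validate(record):
    """Return a list of metadata-constraint violations for one PLANT record."""
    errors = []
    for element, rules in PLANT_METADATA.items():
        value = record.get(element)
        if not isinstance(value, rules["type"]):
            errors.append(f"{element}: wrong data type")
        elif len(str(value)) > rules["size"]:
            errors.append(f"{element}: exceeds size {rules['size']}")
        elif "domain" in rules and value not in rules["domain"]:
            errors.append(f"{element}: outside domain")
    return errors

good = {"Pl_name": "Eastside", "Pl_number": 12, "Budget": 500000, "Building": "B1"}
bad  = {"Pl_name": "Westside", "Pl_number": 42, "Budget": 250000, "Building": "B2"}
print(validate(good))  # []
print(validate(bad))   # ['Pl_number: outside domain']
```

A list of such record instances would then constitute the PLANT file (data set).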

1.2 DATA MANAGEMENT


This book focuses strictly on management of data, as opposed to the management of
human resources. Data management involves four actions: (a) data creation, (b) data
retrieval, (c) data modification or updating, and (d) data deletion. Two data management
functions support these four actions: Data must be accessed and, for ease of access, data
must be organized.
Despite today’s sophisticated information technologies, there are still only two pri-
mary approaches for accessing data. One is sequential access, where in order to get to the
nth record in a data set it is necessary to pass through the previous n–1 records in the
data set. The second approach is direct access, where it is possible to get to the nth
record without having to pass through the previous n–1 records. While direct access is
useful for ad hoc querying of information, sequential access remains essential for
transaction processing applications such as generating payroll, grade reports, and
utility bills.
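The difference between the two approaches can be sketched as follows (our illustration; the employee numbers and record counts are made up):

```python
# Sequential access: to reach the nth record, read past the n-1 before it.
# Direct access: compute the record's location and go straight to it.

records = [{"emp_no": 100 + i, "name": f"Employee {i}"} for i in range(1000)]

def sequential_read(target_emp_no):
    reads = 0
    for rec in records:               # must pass through preceding records
        reads += 1
        if rec["emp_no"] == target_emp_no:
            return rec, reads
    return None, reads

def direct_read(target_emp_no):
    position = target_emp_no - 100    # location computed from the key
    return records[position], 1       # a single read, no scan

rec, reads = sequential_read(600)
print(reads)                          # 501 reads to reach the 501st record
print(direct_read(600)[1])            # 1 read for the same record
```

For a payroll run that touches every record anyway, the sequential scan is no penalty; for an ad hoc query about one employee, the direct computation is far cheaper.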
In order to access data, the data must be organized. For sequential access, this means
that all records in a file must be stored (organized) through some order using a unique
identifier, such as employee number, inventory number, flight number, account number,
or stock symbol. This is called sequential organization. A serial (unordered) collection of
records, also known as a “heap file,” cannot provide sequential access. For direct
access, the records in a file can be stored serially and organized either randomly or by
using an external index. A randomly organized file is one in which the value of a unique
identifier is processed by some sort of transformation routine (often called a “hashing
algorithm”) that computes the location of records within the file (relative record
numbers). An indexed file makes use of an index external to the data set similar in nature
to the one found at the back of this book to identify the location where a record is
physically stored.
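The two direct-access organizations just described can be sketched as follows (the modulo "hashing algorithm" and the key values are our own toy choices, not from the text):

```python
FILE_SLOTS = 13  # number of relative record positions in the file

def hash_location(key):
    """A toy transformation routine: key modulo the number of slots."""
    return key % FILE_SLOTS

# Randomly organized file: each record's slot is computed from its key.
hashed_file = [None] * FILE_SLOTS
for key in (1001, 1007, 1015):
    hashed_file[hash_location(key)] = {"key": key}

# Indexed file: records stored serially; an external index maps key -> location.
serial_file = [{"key": 1015}, {"key": 1001}, {"key": 1007}]
index = {rec["key"]: pos for pos, rec in enumerate(serial_file)}

print(hashed_file[hash_location(1007)])  # {'key': 1007}
print(serial_file[index[1007]])          # {'key': 1007}
```

A production hashing algorithm must also handle collisions (two keys computing the same slot), which this toy version deliberately sidesteps.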
As discussed in Section 1.5, a database takes advantage of software called a database
management system (DBMS) that sits on top of a set of files physically organized as
sequential files and/or as some form of direct access files. A DBMS facilitates data access
in a database without burdening a user with the details of how the data is physically
organized.

1.3 LIMITATIONS OF FILE-PROCESSING SYSTEMS


Computer applications in the 1960s and 1970s focused primarily on automating clerical
tasks. These applications made use of records stored in separate files and thus were
called file-processing systems. Although file-processing systems for information systems
applications have been useful for many years, database technology has rendered them
obsolete except for their use in a few legacy systems such as some payroll and customer

Copyright 2015 Cengage Learning. All Rights Reserved. May not be copied, scanned, or duplicated, in whole or in part. Due to electronic rights, some third party content may be suppressed from the eBook and/or eChapter(s).
Editorial review has deemed that any suppressed content does not materially affect the overall learning experience. Cengage Learning reserves the right to remove additional content at any time if subsequent rights restrictions require it.
Chapter 1

billing systems. Nonetheless, understanding their limitations provides insight into the development of and justification for database systems.
Figure 1.1 shows three file-processing systems for a hypothetical university. One pro-
cesses data for students, another processes data for faculty and staff, and a third processes
data for alumni. In such an environment, each file-processing system has its own collec-
tion of private files and programs that access these files.



FIGURE 1.1 An example of a file-processing environment

While an improvement over the manual systems that preceded them, file-processing
systems suffer from a number of limitations:
• Lack of data integrity—Data integrity ensures that data values are correct,
consistent, complete, and current. Duplication of data in isolated file-
processing systems leads to the possibility of inconsistent data, making it
difficult to identify which of the duplicated values is correct, complete,
and/or current. This creates data integrity problems. For example, if an
employee who is also a student and an alumnus changes his or her mailing
address, files that contain the mailing address in three different file-
processing systems require updating to ensure consistency of information
across the board. Data redundancy across the three file-processing
systems not only creates maintenance inefficiencies, it also leads to the
problem of not knowing which is the current, correct, and/or complete
address of the person.
• Lack of standards—Organizations with file-processing systems often lack or
find it difficult to enforce standards for naming data items as well as for
accessing, updating, and protecting data. The absence of such standards can

Database Systems: Architecture and Components

lead to unauthorized access and accidental or intentional damage to or
destruction of data. In essence, security and confidentiality of information
may be compromised.
• Lack of flexibility/maintainability—Information systems make it possible
for end users to develop information requirements that they had never
envisioned previously. This inevitably leads to a substantial increase in
requests for new queries and reports. However, file-processing systems are
dependent upon a programmer who has to either write or modify program
code to meet these information requirements from isolated data. The result
is often information requests that go unsatisfied or programs that are
inefficiently written, poorly documented, and difficult to maintain.
These limitations are actually symptoms resulting from two fundamental problems:
lack of integration of related data and lack of program-data independence.
• Lack of data integration—Data is separated and isolated, and ownership of
data is compartmentalized, resulting in limited data sharing. For example, to
produce a list of employees who are students and alumni at the same time,
data from multiple files must be accessed. This process can be quite complex
and time consuming since a program has to access and perform logical com-
parisons across independent files containing employee, student, and alumni
data. In short, lack of integration of data contributes to all of the problems
listed previously as symptoms.
• Lack of program-data independence—In a file-processing environment, the
structural layout of each file is embedded in the application programs. That
is, the metadata of a file is fully coded in each application program that uses
the particular file. Perhaps the most often-cited example of the program-data
dependence problem occurred during the file-processing era, when it was
common for an organization to expand the zip code field from five digits to
nine digits. In order to implement this change, every program in the
employee, student, and alumni file-processing systems containing the zip
code field had to be identified (often a time-consuming process itself) and
then modified to conform to the new file structure. This not only required
modification of each program and its documentation but also recompiling and
retesting of the program. Likewise, if a decision was made to change the
organization of a file from indexed to random, then, since the structure of
the file was mapped into every program using it, each of those programs
had to be modified. Identifying all the affected programs for corrective action
was not a simple task, either. Thus, because of lack of program-data inde-
pendence, file-processing systems lack flexibility since they are not amenable
to structural changes in data. Program-data dependence also exacerbates data
security and confidentiality problems.
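The zip code scenario can be made concrete with a small sketch. The fixed-width record layout below is hypothetical; what matters is that the layout is embedded in the program itself, so widening the zip field forces every such program to change:

```python
# Hypothetical fixed-width employee record: the file layout is coded
# directly into the program -- the essence of program-data dependence.
RECORD_LAYOUT = {
    "emp_id": (0, 6),    # columns 0-5
    "name":   (6, 26),   # columns 6-25
    "zip":    (26, 31),  # columns 26-30: a five-digit zip code
}

def parse_record(line: str) -> dict:
    """Slice one line of the file according to the embedded layout."""
    return {field: line[start:end].strip()
            for field, (start, end) in RECORD_LAYOUT.items()}

rec = parse_record("001234Jane Doe            77204")
print(rec["zip"])  # '77204'
# Expanding zip to nine digits changes the offsets above, so every
# program embedding this layout must be edited, recompiled, and retested.
```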
It is only through attacking the problems of lack of program-data independence and
lack of integration of related data that the limitations of file-processing systems can be


eliminated. If a way is found to deal with these problems so as to establish centralized
control of data, then unnecessary redundancy can be reduced, data can be shared,
standards can be enforced, security restrictions can be applied, and integrity can be
maintained. One of the objectives of database systems is to integrate data without programmer
intervention in a way that eliminates data redundancy. The other objective of database
systems is to establish program-data independence, so that programs that access the data
are immune to changes in storage structure (how the data is physically organized) and
access technique.
The Time Life company experienced many of these problems in its early days.
Time Life was established in 1961 as a book-marketing division. It took its name from
Time and Life magazines, which at the time were two of the most popular weeklies
on the market. Time Life gained fame as a seller of book series that were mailed to
households in monthly installments, operating as book sales clubs. Most of the series
were more or less encyclopedic in nature (e.g., The LIFE History of the United States,
The Time-Life Encyclopedia of Gardening, The Great Cities, The American
Wilderness, etc.), providing the basics of the subjects in the way it might be done in
a series of lectures aimed at the general public. Over the years, more than 50 series
were published.
During the 1970s and first half of the 1980s, Time Life exhibited all of the character-
istics of a file-processing system. A separate collection of files was maintained for each
book series. Thus, when the company sought to promote a new series to its existing cus-
tomer base, a customer who had purchased or was currently subscribing to several book
series already would receive multiple copies of the same glossy brochure promoting the
new series. In addition, it was not uncommon for a customer to receive the same bro-
chure at multiple addresses if that customer had used different mailing addresses when
subscribing to different publications. In the mid-1980s, the company replaced its separate
file-processing systems with an integrated database system that eliminated much of the
data duplication and lack of data integrity that characterized the previous file-processing
environment in which it had been operating.

1.4 THE ANSI/SPARC THREE-SCHEMA ARCHITECTURE


In the 1970s, the Standards Planning and Requirements Committee (SPARC) of the
American National Standards Institute (ANSI) offered a solution to these problems by
proposing what came to be known as the ANSI/SPARC three-schema architecture.2 The
ANSI/SPARC three-schema architecture, as illustrated in Figure 1.2, consists of three per-
spectives of metadata in a database. The conceptual schema is the nucleus of the three-
schema architecture. Located between the external schema and internal schema, the
conceptual schema represents the global conceptual view of the structure of the entire
database for a community of users. By insulating applications/programs from changes in
physical storage structure and data access strategy, the conceptual schema achieves
program-data independence in a database environment.

2
In a database context, the word “schema” stands for “description of metadata.”


FIGURE 1.2 The ANSI/SPARC three-schema architecture

The external schema3 consists of a number of different user views4 or subschemas,


each describing portions of the database of interest to a particular user or group of users.
The conceptual schema represents the global view of the structure of the entire database
for a community of users. The conceptual schema is the consolidation of user views. The
data specification (metadata) for the entire database is captured by the conceptual

3
While an external schema is technically a collection of external subschemas or views, the term
“external schema” is used here in the context of either an individual user view or a collection of
different user views.
4
Informally, a “view” is a term that describes the information of interest to a user or a group of
users, where a user can be either an end user or a programmer. See Chapter 6 (Section 6.4) for
a more precise definition of a “view.”


schema. The internal schema describes the physical structure of the stored data (how the
data is actually laid out on storage devices) and the mechanism used to implement the
access strategies (indexes, hashed addresses, and so on). The internal schema is con-
cerned with efficiency of data storage and access mechanisms in the database. Thus, the
internal schema is technology dependent, while the conceptual schema and external
schemas are technology independent. In principle, user views are generated on demand
through logical reference to data items in the conceptual schema independent of the logi-
cal or physical structure of the data.

1.4.1 Data Independence Defined


Data independence is the central concept driving a database system, and the very purpose
of a three-schema architecture is to enable data independence. The theme underlying the
concept of data independence is that when a schema at a lower level is changed, the
higher-level schemas themselves are unaffected by such changes. In other words, when a
change is made to storage structure or access strategy in the internal schema, there will be
no need to make any changes in the conceptual or external schemas; only the mapping
information between a schema and the higher-level schemas (the rules for transforming
requests and results between levels of schema) needs to be changed. Only then can it be
said that data independence is fully supported.
For instance, suppose direct access to data ordered by zip code is required. This
may be recorded as “direct access” in the conceptual schema, and a certain type of
indexing technique may be employed in the internal schema. This fact will be available
as the mapping information so that if/when the indexing technique in the internal schema
is changed, only the mapping information gets changed, and the conceptual schema is
unaffected. Incidentally, the external views are completely shielded from even the
knowledge of this change in the internal schema. That is, the specification and implementa-
tion of a change in the indexing mechanism on zip code does not require any modification
and testing of the application programs that use the external views containing zip code.
This capacity to change the internal schema without having to change the conceptual
or external schema is sometimes referred to as physical data independence. The internal
schema may be changed when certain file structures are reorganized or new indexes are
created to improve database performance. Physical data independence enables
implementation of such changes without requiring any corresponding changes in the
conceptual or external schemas.
Likewise, enhancements to the conceptual schema in the form of growth or restructur-
ing will have no impact on any of the external views (subschemas) since all external views
are spawned from the conceptual schema only by logical reference to elements in the
conceptual schema. For instance, redefinition of logical structures of a data model (such as
adding or restructuring tables in a relational database) may sometimes be in order. Since
the external views (subschemas) are generated exclusively by logical references, the user
views are immune to such logical design changes in the conceptual schema. This property is
often called logical data independence. Logical data independence also enables a user
(external) view to be immune to changes in the other user views.
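Modern relational DBMSs exhibit this behavior directly. The sketch below uses SQLite as a stand-in DBMS with a hypothetical alumni table; the external view, defined purely by logical reference, keeps working after the conceptual schema is restructured:

```python
import sqlite3

# Conceptual schema: a base table; external schema: a view defined
# only by logical reference to the conceptual schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE alumni (name TEXT, zip TEXT)")
conn.execute("CREATE VIEW mailing_list AS SELECT name, zip FROM alumni")
conn.execute("INSERT INTO alumni VALUES ('Lee', '45221')")

# Restructure the conceptual schema: add a column to the base table.
conn.execute("ALTER TABLE alumni ADD COLUMN degree TEXT")

# The user view is unaffected by the change underneath it --
# logical data independence.
zip_code = conn.execute("SELECT zip FROM mailing_list").fetchone()[0]
print(zip_code)  # 45221
conn.close()
```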
A file-processing system, in contrast, may be viewed as a two-schema architecture
consisting of the internal schema and the programmer’s view (external schema), as shown


in Figure 1.3. Here, the programmer’s view corresponds to the physical structure of the
data, meaning that the physical structure of data (internal schema) is fully mapped
(incorporated) into the application program. The file-processing system lacks program-
data independence because any modification to the storage structure or access strategy in
the internal schema necessitates changes to application programs and subsequent recom-
pilation and testing. In the absence of a conceptual schema, the internal schema struc-
tures are necessarily mapped directly to external views (or subschemas). Consequently,
changes in the internal schema require appropriate changes in the external schema;
therefore, data independence is lost. Because changes to the internal schema, such as
incorporating new user requirements and accommodating technological enhancements,
are expected in a typical application environment, absence of a conceptual schema essen-
tially sacrifices data independence. In short, file-processing systems lack data indepen-
dence because they employ what amounts to a two-schema architecture.

FIGURE 1.3 The file-processing system: a two-schema architecture

The three-schema architecture described in this section is required to achieve data


independence. It is worthwhile to remind the reader at this point that the conceptual
schema, external schema, and internal schema are essentially expressions of metadata in a
hierarchical transition from technology-independent state to technology-dependent state.
The complete data modeling and design process is about modeling metadata.

1.5 CHARACTERISTICS OF DATABASE SYSTEMS
Database systems seek to overcome the two root causes of the limitations that plague file-
processing systems by creating a single integrated set of files that can be accessed by all
users. This integrated set of files is known as a database. A database management system
(typically referred to as a DBMS) is a collection of general-purpose software that facilitates
the processes of defining, constructing, and manipulating a database for various applica-
tions. Figure 1.4 provides a layman’s view of the difference between a database and a
database management system. This illustration shows how neither a user nor a program-
mer is able to access data in the database without going through the database manage-
ment system software. Whether a program is written in Java, C, COBOL, or some other
language, the program must “ask” the DBMS for the data, and the DBMS will fetch the
data. SQL (Structured Query Language) has been established as the language for acces-
sing data in a database by the International Organization for Standardization (ISO) and
the American National Standards Institute (ANSI). Accordingly, any application program
that seeks to access a database must do so via embedded SQL statements.
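As a hedged illustration of a program "asking" the DBMS for data (using SQLite as the DBMS and a hypothetical student table), the program issues SQL statements and never touches the physical files itself:

```python
import sqlite3

# The program never reads the database's physical files directly;
# it hands every request to the DBMS as an SQL statement.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO student VALUES (1, 'Pat')")

# The DBMS fetches the data; how it is physically organized is hidden.
row = conn.execute("SELECT name FROM student WHERE id = 1").fetchone()
print(row[0])  # Pat
conn.close()
```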

FIGURE 1.4 An early view of a database system



An important purpose of this book is to discuss how to organize the data items
conceptualized in Figure 1.4. In reality, data items do not exist in one big pool surrounded by
the database management system. Several different architectures exist for organizing this
data. One is a hierarchical organization, another is a network organization, and a third is
relational; in this book, the relational approach is emphasized.5 While the data items that
collectively comprise the database at the physical level are stored as sequential, indexed,
and random files, the DBMS is a layer on top of these files that frees the user and appli-
cation programs from the burden of knowing the structures of the physical files (unlike a
file-processing system).
Next, let us look more closely at what constitutes a database, a database management
system, and finally a database system.

1.5.1 What Is a Database System?


A system is generally defined as a set of interrelated components working together for
some purpose. A database system is a self-describing collection of interrelated data. A
database system includes data and metadata. Here are the properties of a database system:
• Data consists of recorded facts that have implicit meaning.
• Viewed through the lens of metadata, the meaning of recorded data becomes
explicit.
• A database is self-describing in that the metadata is recorded within the
database, not in application programs.
• A database is a collection of files whose records are logically related to one
another. In contrast with that of a file-processing system, integration of data
as needed is the responsibility of the DBMS software instead of the
programmer.
• Embedded pointers and various forms of indexes exist in the database system
to facilitate access to the data.
A database system may be classified as single-user or multi-user. A single-user data-
base system supports only one user at a time. In other words, if user A is using the data-
base, users B and C must wait until user A has completed his or her database work. When
a single-user database runs on a personal computer, it is also called a desktop database
system. In contrast, a multi-user database system supports multiple users concurrently. If
the multi-user database supports a relatively small number of users (usually fewer than
50) or a specific workgroup within an organization, it is called a workgroup database sys-
tem. If the database is used by the entire organization and supports many users (more
than 50, usually hundreds) across many locations, the database is known as an enterprise
database system.
The term “enterprise database system” is somewhat misleading. In the early days of
database processing, the goal was to have a single database for the entire organization.

5
Two relatively new data modeling architectures (the object-oriented data model and the object-
relational model) also exist. Appendix B briefly discusses each of these architectures. Appendix A
reviews architectures based on the hierarchical and network organizations.


While this type of database is possible in a small organization, in large organizations
multiple databases exist that are indeed used enterprise-wide. For example, large oil
companies have databases organized by function: one database for exploration,
one database for refining, another for marketing, a fourth for royalty payments, and
so on. On the other hand, a consumer-product company might have several product
databases. Each one of these is an enterprise database system since its use extends
enterprise-wide.
A natural extension to the enterprise database system is the concept of distributed
database systems. With the tremendous strides made in network and data communication
technology in the last two decades, distribution of databases over a wide geographic area
has become highly feasible. A distributed database (DDB) is a collection of multiple logi-
cally interrelated databases that may be geographically dispersed over a computer net-
work. A distributed database management system (DDBMS) essentially manages a
distributed database while rendering the geographical distribution of data transparent to
the user community. The advent of DDBs facilitated replacement of the large, centralized,
monolithic databases of the 1980s with decentralized autonomous database systems inter-
related via a computer network.
Another important trend that emerged in the 1990s is the development of data ware-
houses. The distinguishing characteristic of a data warehouse is that it is mainly intended
for decision-support applications used by knowledge workers. As a consequence, data
warehouses are optimized for information retrieval rather than transaction processing. By
definition, a data warehouse is subject-oriented, integrated, nonvolatile, and time-variant.
Since its relatively recent inception, data warehousing has evolved rapidly in large
corporations to support business intelligence, data mining, decision analytics, and
customer relationship management (CRM).

1.5.2 What Is a Database Management System?


Figure 1.5 illustrates the components of a database system, consisting of the DBMS,
database, data dictionary, and data repository. A database management system (DBMS)
is a collection of general-purpose software that facilitates the processes of defining, con-
structing, and manipulating a database. The major components of a DBMS include: one
or more query languages; tools for generating reports; facilities for providing security,
integrity, backup, and recovery; a data manipulation language for accessing the database;
and a data definition language used to define the structure of data. As shown in
Figure 1.5, Structured Query Language (SQL) plays an integral role in each of these
components. SQL is used in the data definition language (DDL) for creating the struc-
ture of database objects such as tables, views, and synonyms. SQL statements are also
generated by programming languages used to build reports in order to access data from
the database. In addition, people involved in the data administration function use data
control languages (DCLs) that make use of SQL statements to (a) control the resource
locking required in a multi-user environment, (b) facilitate backup and recovery from
failures, and (c) provide the security required to ensure that users access only the data
that they are authorized to use.
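A minimal DDL sketch (again using SQLite; the table and view names are hypothetical) shows database objects being defined in SQL and recorded by the DBMS as metadata:

```python
import sqlite3

# DDL statements define database objects; the DBMS records their
# definitions in its own catalog, making the database self-describing.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE employee (emp_id INTEGER PRIMARY KEY, name TEXT, zip TEXT)")
conn.execute(
    "CREATE VIEW houston_staff AS "
    "SELECT name FROM employee WHERE zip LIKE '77%'")

# Query the catalog itself to list the objects just defined.
objects = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master ORDER BY name")]
print(objects)  # ['employee', 'houston_staff']
conn.close()
```

DCL statements such as GRANT and REVOKE follow the same SQL syntax in multi-user DBMSs; SQLite, as a single-user embedded engine, omits them.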


FIGURE 1.5 Components of a database system

Data manipulation languages (DMLs) facilitate the retrieval, insertion, deletion, and
modification of data in a database. SQL is the most well-known nonprocedural6 DML and
can be used to specify many complex database operations in a concise manner. Most
DBMS products also include procedural language extensions to supplement the capabilities
of SQL, such as Oracle PL/SQL. SQL can also be embedded in host languages
such as C, Java, Visual Basic, and COBOL, in which pre-compilers extract data
manipulation commands written in SQL from a program and send them to a DML
compiler for compilation into object code for subsequent database access by the
run-time subsystem.7
requests to the file manager of the operating system to retrieve data from the physical files
of the database.
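The nonprocedural character of SQL DML can be seen in a short sketch (SQLite again; the account table is hypothetical): a single declarative statement specifies what to retrieve, and the DBMS decides how:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE account (acct_no INTEGER, balance REAL)")
conn.executemany("INSERT INTO account VALUES (?, ?)",
                 [(101, 250.0), (102, 75.0), (103, 500.0)])

# Nonprocedural retrieval: no loops, no access paths -- the DBMS's
# run-time subsystem chooses how to fetch the qualifying rows.
big = [r[0] for r in conn.execute(
    "SELECT acct_no FROM account WHERE balance > 100 ORDER BY acct_no")]
print(big)  # [101, 103]
conn.close()
```

A procedural equivalent would open the file, read each record, test the balance, and accumulate matches explicitly.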
Much as a dictionary is a reference book that provides information about the form,
origin, function, meaning, and syntax of words, a data dictionary in a DBMS environment

6
SQL is known to be a nonprocedural language since it only specifies what data to retrieve as
opposed to specifying how actually to retrieve it. A procedural language specifies how to retrieve
data in addition to what data to retrieve.
7
The run-time subsystem of a database management system processes applications created by the
various design tools at run time.

Exploring the Variety of Random
Documents with Different Content
138 FIGHTING THE NORTH VIETNAMESE taking casualties
and requested a helicopter medevac. In an effort to pick up some of
the casualties, Captain Ronald D. Bennett of HMM-363 attempted to
land his UH-34D within the 2d Battalion's perimeter. Those on the
ground waved him off because of intense enemy fire. As Captain
Bennett pulled away, enemy fire hit the rear of the helicopter,
separating the tail pylon. The aircraft crashed, rolled and began
burning about 150 meters outside the Marine lines. Bennett and a
gunner, Corporal Edward Clem, died in the crash. Second Lieutenant
Vernon J. Sharpless and Lance Corporal Howard J. Cones, both
seriously injured, managed to crawl from the burning wreckage. A
second helicopter from HMM-363, piloted by Captain Frank T. Grassi,
tried to land to pick up the survivors but could not. Enemy fire hit
Grassi in the leg and arm, damaged the helicopter, and slightly
wounded one of the gunners and a Navy hospital corpsman. The
aircraft limped away as far as Strong Point C-2 where it made a
forced landing. Captain James E. Murphy, the 2d Battalion, 4th
Marines' air liaison officer, who had been calling in air strikes in front
of Company E, saw Bennett's helicopter go down. With his radio still
on his back, Murphy crawled out to the downed helicopter, moving
past NVA soldiers in his path. He found the two survivors near the
burning helicopter. The three Marines were surrounded and there
was no way Murphy could get them back to Marine lines.
Fortunately, the enemy soldiers in the area either did not know the
three men were there or simply did not care. Captain Murphy could
hear NVA soldiers nearby and see some movement, however, and
called in air strikes within 50 meters of the crashed helicopter with
the aid of an airborne observer in an O-lC aircraft overhead. The
latter eventually managed to direct a Marine A-4 attack aircraft to
deliver a line of smoke so that a UH-1 helicopter could land and
rescue the three Marines.28 The rescue helicopter was a UH-1C from
the U.S. Army's 190th Helicopter Assault Company whose pilot
volunteered to make the pickup. Enemy fire hit the aircraft twice
during the rescue and the pilot suffered a minor wound in the arm.
The UH-lC also managed to reach Strong Point C-2 where it, too,
made a forced landing. Lieutenant Colonel Studt's observation during
his short period of command convinced him of the need for
reinforcements. At his request, the 9th Marines ordered the 3d
Battalion, 3d Marines at C-2 Bridge to send two companies and a
small command group to the 2d Battalion, 4th Marines' position.29
Company F still occupied its exposed position and Studt decided to
move it within the battalion perimeter. He directed the company to
have its attached engineers blow up the excess ammunition, but
they were unable to do so.* After several hours of fruitless attempts
by the engineers, Studt told the company to leave the ammunition
and join the rest of the battalion. The battalion had direct
observation of the ammunition pile and would cover it by fire.30
Company F reached the perimeter near dusk. The two companies
from the 3d Battalion, 3d Marines arrived at about the same time.31
With these reinforcements, the 2d Battalion, 4th Marines was ready
for any NVA attacks that evening. Studt recounted the night's
subsequent events: From before dusk . . . until almost 0200 in the
morning, we were under almost continuous attacks by both direct
and indirect fire, and our perimeter was hit again and again by
ground attacks. . . . The wounded were being accumulated in the
vicinity of my CP, which consisted of foxholes, and their suffering
was a cause of anguish. After several attempts to medevac them by
helicopter were aborted due to intense enemy fire, we came up with
the plan that on signal every man on the perimeter would open fire
on known or suspected enemy positions ... for a few minutes with
an intense volume of fire. During this brief period, a volunteer pilot .
. . succeeded in zipping into the zone and removing our emergency
medevacs. The [trick] . . . probably would not have worked again.32
The ground attacks ceased around 0200 in the morning of the 27th,
but the Marines heard enemy movement for the rest of the night as
the North Vietnamese removed their dead and wounded. Dawn
revealed 19 enemy bodies within or in sight of the Marine positions.
Lieutenant Colonel Studt decided not to send anyone to sweep the
area since any movement still drew enemy artillery and mortar
fire.33 The enemy completed its departure by dawn. The Marines
soon did likewise; on orders from the 9th Marines, the battalion
made a tactical withdrawal. Still harassed by enemy rocket and
mortar fire and carrying the remainder of its dead and wounded, the
2d Battalion, 4th Marines moved by echelon to Strong Point C-2 and
then to Cam Lo.34

*The reason for the failure to detonate the ammunition is not clear
from the records. Studt himself wrote in 1981 that he never knew
the reason. Col John C. Studt, Comments on draft ms, 9Jul81
(Vietnam comment file, MCHC, Washington, D.C.)

During the period 25-27 October, eight 2d
Battalion Marines died and 45 suffered wounds, giving the battalion
an effective strength of around 300 Marines. Known NVA casualties
were the 19 bodies counted by the battalion on 27 October. The
battalion moved back to Dong Ha on the 28th and resumed its role
as the regimental reserve. Lieutenant Colonel William Wiese took
command of the 2d Battalion, 4th Marines and Lieutenant Colonel
Studt returned to his duties at the 9th Marines' command post.*
That day a message from Lieutenant General Cushman arrived, the
last line of which read "2/4 has met and beaten the best the enemy
had to offer. Well done."35 Kingfisher listed 1,117 enemy killed and
five captured; Marine casualties totaled 340 killed and 1,461
wounded. General Westmoreland described the operation as a
"crushing defeat" of the enemy. The Con Thien area remained a
grim place. The constant danger of artillery, rocket, and mortar fire,
and massed infantry assaults, and the depressing drizzle and mud
from which there was no escape, combined to make it miserable for
the Marines there. Neuropsychiatric or "shell shock" casualties,
relatively unheard of elsewhere in South Vietnam, were not unusual.
Duty on and around the drab hill mass was referred to by all Marines
as their "Turn in the Barrel," or "the Meatgrinder."36

Medina/Bastion Hill/Lam Son 138

On 5 October, in conjunction with the arrival of a
fourth U.S. Army brigade in southern I Corps, Colonel Herbert E.
Ing, Jr.'s 1st Marines, consisting of two battalions, came under the
operational control of the 3d Marine Division and moved north from
the Da Nang TAOR to the southern part of Quang Tri Province. On
the 11th, the regiment, reinforced by SLF Alpha, started Operation
Medina in the rugged hills of the Hai Lang National Forest.** The
operation was part of III MAF's comprehensive program to destroy
enemy base areas previously left alone because of lack of forces.
The Hai Lang forest area south of Quang Tri was the enemy's Base
Area 101, the support area for the 5th and 6th NVA Regiments.

*Studt had hoped to retain command but Colonel Smith was more
interested in keeping him as the regimental operations officer.
"Unfortunately," wrote Smith, "I extolled [Studt's] virtues so much to
General Tompkins that he was grabbed later to take over a battalion
at Khe Sanh where he distinguished himself." Colonel Richard B.
Smith, Comments on draft ms, 21May81 (Vietnam comment file,
MCHC, Washington, D.C.)

**SLF Alpha's (BLT 1/3) move to its Medina blocking positions had
the code-name Operation Bastion Hill.

[Department of Defense Photo (USMC) A189393: Marines and
journalists wait on 2 October in the safety of a trench beside Con
Thien's landing zone until the arrival and touchdown of the
helicopter that will take them from the base back to Dong Ha.]
Northeast of the Medina AO, two ARVN airborne battalions
conducted Operation Lam Son 138. Medina started as Lieutenant
Colonel Albert F. Belbusti's 1st Battalion, 1st Marines and Lieutenant
Colonel Archie Van Winkle's 2d Battalion, 1st Marines made a
helicopter assault landing in the forest. After landing they cleared
the area around LZ Dove and then swept in a northeasterly direction
while BLT 1/3 blocked to the east. At 0330 on the 11th, Company C
of BLT 1/3 came under mortar and small arms fire, followed by a
ground assault. The company drove off the attackers and the
fighting subsided. The next day both of the 1st Marines' battalions
continued searching to the southwest, while BLT 1/3 remained in its
blocking positions. At 1515, Company C, 1st Marines was moving
through thick jungle when the point element engaged 10 NVA
soldiers. The exchange of fire wounded several Marines. Company C
pulled back to a small clearing
and established a perimeter before calling in helicopters to pick up
wounded.

[Department of Defense Photo (USMC) A421900: Operation Medina
begins early on 11 October as two battalions of the 1st Marines
make a helicopter assault into Landing Zone Dove in a III MAF drive
to clear enemy base areas in the thick Hai Lang forest, located
approximately 12 miles south of Quang Tri City.]

Just after the evacuation
was completed, three NVA companies attacked Company C from two
sides. The firefight continued as darkness fell; hand grenades
figured heavily in the exchange. The battle surged back and forth
across the small clearing. At the height of the struggle a grenade
landed in the company command post. Corporal William T. Perkins,
Jr., a combat photographer attached to the company, yelled,
"Grenade!" and threw himself on the deadly missile. The explosion
killed him.* Lieutenant Colonel Belbusti reinforced Company C with
Company D and the two companies drove off the attacking NVA
force. Dawn on the 13th revealed 40 enemy dead around the
Marines' position. The enemy attack had killed eight Marines and
wounded 39. After these two fights, the enemy avoided further
contact; Medina turned into a search for small groups of North
Vietnamese in the nearly impenetrable forests. The 1st Marines did
find a number of base camps, but the enemy had evacuated the
sites. The Marines captured more than four tons of rice, 16
weapons, and a quantity of small arms ammunition.

*Corporal Perkins received a posthumous Medal of Honor, becoming
the first Marine combat photographer to receive the nation's highest
award. See Appendix D for Corporal Perkins' citation.

[3d MarDiv ComdC, October 1967: An air observer with the 1st
Battalion, 1st Marines directs an air strike early in Operation Medina
on enemy positions located on an adjacent ridgeline.]

The enemy's
efforts to elude the sweeping Marine units resulted in the largest
action of the companion Operation Lam Son 138. On the morning of
20 October, the 416th NVA Battalion, a subordinate unit of the 5th
NVA Regiment, collided with one of the ARVN airborne companies
involved in Operation Lam Son 138. The airborne company held and,
after reinforcement, killed 197 North Vietnamese in the day-long
battle. Operation Medina ended on the 20th. The SLF battalion
transferred to Colonel William L. Dick's 4th Marines, which was
conducting Operation Fremont to the south. The 1st Marines stayed
in the former Medina area and started Operation Osceola the same
day. Osceola was an unspectacular, but systematic, search for enemy
forces in the Hai Lang forest.

Adjustments Within the 3d Marine Division

A new series of operations began in November. Only Osceola
continued from October.

[3d MarDiv ComdC, October 1967: A Marine with the 1st Battalion,
1st Marines places explosives before blowing a helicopter landing
zone in the Hai Lang forest during Operation Medina.]

[Department of Defense Photo (USMC) A193856: An LVTP-5 carries
members of the 1st Amphibian Tractor Battalion, operating as
infantrymen, on a sweep of the shoreline north of the Cua Viet River
in September. The battalion continued these patrols in November
and December in Operation Napoleon.]

The 3d Marine Division split the Kingfisher TAOR in two: Kentucky,
embracing the region including Gio Linh, Con Thien, Cam Lo, and
Dong Ha came under the control of Colonel Richard B. Smith's 9th
Marines; and Lancaster, to the west, covered Camp Carroll, the
Rockpile, and Ca Lu under Colonel Joseph E. LoPrete's 3d Marines.
The division renamed Operation Ardmore at Khe Sanh to Scotland
and continued it as a one-battalion operation under the control of
Colonel David E. Lownds' 26th Marines. On the coast, the 1st
Amphibian Tractor Battalion conducted Operation Napoleon north of
the Cua Viet River. In Thua Thien Province, Colonel William L. Dick's
4th Marines continued to cover approaches to Hue City west of
Route 1 as Operation Neosho replaced Fremont. The 3d Marine
Division had tactical responsibility for all territory west of Highway 1
in the northern two provinces of Quang Tri and Thua Thien, while
the 1st ARVN Division was responsible for all terrain east of the road
except for the Napoleon operational area north of the Cua Viet River.
Artillery support for all of these operations came from Colonel Edwin
S. Schick, Jr.'s 12th Marines. Composed of five Marine artillery
battalions, three Army artillery battalions, and two Marine separate
batteries, it was the
largest artillery regiment in the history of the Marine Corps. The
reinforced regiment's 220 weapons37 were located throughout the
division TAOR. Each infantry regiment could call upon a direct
support battalion of 105mm howitzers. In addition, the artillery
regiment's medium 155mm howitzers and guns, and heavy 8-inch
howitzers and 175mm guns, provided reinforcing or general support
fires. While the new operations were beginning, the division
headquarters at Phu Bai prepared for a visit from Vice President
Hubert H. Humphrey on 1 November. After the stop at the division
command post, the Vice President flew over the division's area of
operations. Upon his return to Da Nang, he presented the
Presidential Unit Citation to the 3d Marine Division for "extraordinary
heroism and outstanding performance of duty in action against
North Vietnamese and insurgent Communist forces in the Republic of
Vietnam from 8 March 1965 to 15 September 1967." After pinning
the streamer on the division colors, the Vice President warmly
congratulated the division commander, Major General Hochmuth.
This was the last official ceremony that the general attended. Major
General Hochmuth died on 14 November when his UH-1E exploded
and crashed five miles northwest of Hue. Colonel William L. Dick,
commanding the 4th Marines at Phu Bai, learned of the crash around
1400 on 14 November. Since he had a helicopter sitting on a pad at
his headquarters, Dick, accompanied by his operations officer, Major
James D. Beans, and the regimental sergeant major, quickly reached
the crash scene. Colonel Dick described the rescue attempt: After
several passes, I spotted the Huey upside down in a rice paddy filled
to the brim by the heavy rains which had been falling for several
weeks. ... I directed the helicopter pilot to land on the paddy dike
nearest the crash site from where the three of us walked through
about 200 yards of paddy water until we reached the wreckage.
There were flames on the water's surface around the aircraft. While
the sergeant major attempted to extinguish these, Major Beans and
I commenced diving beneath the surface, groping through the water
for possible survivors. We had no idea just how long it had been
since the crash had occurred. This was a difficult task, as you can
imagine, since the water was full of silt, not to mention leeches, and
impossible to see through. The three of us were joined by a
Vietnamese farmer who refused to identify himself and could be
distinguished only by a small gold crucifix around his neck. The four
of us, after getting rid of the aviation fuel flames, repeatedly went
below the surface into the helicopter cabin and by touch, finally
found the bodies, one by one, of the six who had died in the crash.
The helicopter had turned upside down just before impact which
made the situation even more difficult. The last body recovered was
General Hochmuth. I found him in the rear seat of the helicopter,
the spot where he usually traveled when visiting the various
command posts.38

[3d MarDiv ComdC, November 1967: A machine gun team from
Company F, 2d Battalion, 9th Marines pauses during its movement in
November in Operation Lancaster in the 9th Marines' portion of the
former Operation Kingfisher area. The team wears its ammunition
bandolier-style.]

[Department of Defense Photo (USMC) A190235: MajGen Bruno A.
Hochmuth, the commanding general of the 3d Marine Division,
wearing a rainsuit as protection from the monsoon, sits in a UH-1E
helicopter prior to a routine inspection of the divisional area on 7
November, one week prior to his death.]

Major General
Rathvon McC. Tompkins, a veteran of more than 32 years' Marine
service and holder of the Navy Cross as a battalion commander at
Saipan, received immediate orders as General Hochmuth's
replacement. Brigadier General Louis Metzger, the assistant division
commander, assumed command until General Tompkins arrived from
the United States on 28 November. One of General Tompkins' first
steps after his arrival was to discuss the overall situation with his
division operations officer, Colonel James R. Stockman, who had
commanded an 81mm mortar platoon under Tompkins on Saipan.
"Tell me," said Tompkins, "about the operational folklore in the
division's area of operations." Stockman replied with, among other
things, descriptions of the enemy and the terrain and the
frustrations of fighting under the restrictions imposed by MACV and
Washington.

[Department of Defense Photo (USMC) A189947: MajGen Rathvon
McC. Tompkins inspects an honor guard on 28 November during the
ceremony at Da Nang in which he assumed command of the 3d
Marine Division after MajGen Hochmuth's death.]

Stockman recalled that Tompkins disliked the
system which considered infantry battalions as interchangeable parts
to be shifted from one regimental headquarters to another,
depending upon the tactical situation. Tompkins accepted it,
however, as "temporary operational folklore," which he would have
to live with. "He faced," wrote Stockman, "a worsening operational
situation in the late part of 1967 with as much fortitude and
optimism as humanly possible."39 During November and December,
the most significant activity in the 3d Marine Division's zone of action
was small unit fighting near the strongpoint obstacle system around
Con Thien and Gio Linh. In November, platoon and company-size
NVA units operated from well camouflaged bunkers in the area,
trying to ambush Marine patrols and to hinder the system's
construction. The Marines countered with attacks that drove the NVA
units out of their positions on four different occasions during
November, killing 65 Communists. In addition, Marine patrols found
and destroyed three extensive bunker systems.

[Department of Defense Photo (USMC) A189948: Company G, 2d
Battalion, 9th Marines engages an NVA unit on 3 December during
the portion of Operation Kentucky conducted to prevent enemy
interference with the construction of Strongpoint A-3 of the barrier
system south of the DMZ. The 3d Marine Division originally planned
to call this protective effort Operation Newton but decided on 28
November to consider it as simply part of Kentucky.]

[Department of Defense Photo (USMC) A189997: A patrol from
Company F, 9th Marines, part of the screening effort during the
construction of Strongpoint A-3, moves out carefully after finding
fresh enemy footprints and bunkers on 22 December.]

On 29 November, three Marine battalions and two ARVN
battalions began clearing operations within the Kentucky TAOR
between Con Thien and Gio Linh, the planned site of Strong Point
A-3 of the proposed barrier plan, or "McNamara wall." The Marine
units swept south of Con Thien eastward to Site A-3, while the ARVN
units moved from near Gio Linh westward to clear a road to the
strong point location. The following day, Lieutenant Colonel William
M. Cryan's 2d Battalion, 9th Marines found a North Vietnamese
company in bunkers two and one-half miles northeast of Con Thien.
The battalion maneuvered to envelop the enemy and overran the
position by 1800, killing 41 defenders. Marine casualties totaled 15
killed and 53 wounded requiring evacuation. Although the Marine
and ARVN units continued screening operations north of A-3 during
December, the largest engagement during the month took place
southeast of Gio Linh in the Napoleon area of operation. Lieutenant
Colonel Edward R. Toner's 1st Amphibian Tractor Battalion and
Company F, 2d Battalion, 4th Marines were protecting the movement
of building materials to Strongpoint C-4 on the coast, two kilometers
north of the Cua Viet River. Company F, under the operational control
of the tractor battalion, occupied Strongpoint C-4. Platoon and
squad patrols
routinely operated 2,000 meters north of C-4 as forward security for
both the strongpoint and the battalion's position at Cua Viet port
facility. Early in the afternoon of the 10th, two squads patrolled near
the fishing village of Ha Loi Tay. Their operational area was a sea of
sand dunes, interrupted by a strip of scrub pine growth and
hedgerows dotting the coastline. As they approached a break in the
coastal tree line south of the village, sniper fire surprised them. The
Marines fired back, killing eight North Vietnamese. The enemy fire
killed one Marine and wounded three in this brief encounter. As the
Marines checked the area, they discovered 20-25 NVA soldiers, some
wearing American helmets and flak jackets. The Marines opened fire
and called for reinforcements. The company commander, First
Lieutenant Michael H. Gavlick, radioed the situation to the battalion
CP, and then took a platoon and the third squad of the engaged
platoon forward to join the battle. Contact continued throughout the
afternoon. Before dark, Lieutenant Colonel Toner ordered two
provisional rifle platoons from his Company B and two LVTH-6s to go
to the scene of contact to assist. As darkness settled, Lieutenant
Gavlick drew his composite force into a tight perimeter. At 0630 on
the 11th, the composite unit moved out under a light drizzle toward
the area of the previous day's action. At 0800, lead elements spotted
40 of the enemy trying to move south across the break in the tree
line. The Marines observed 11 NVA soldiers digging a mortar position
and another 15 moving behind a sand dune to the north. While the
Marines took these enemy under fire with artillery and the LVTH-6s,
Lieutenant Colonel Toner moved his Company A, organized as an
infantry unit, and his command group to Strong Point C-4. At the
same time, the U.S. advisor with the ARVN battalion occupying
Strong Point A-l, 2,500 meters across the sand dunes west of the
contact, asked if his battalion could help. Toner asked the ARVN
battalion to move a unit into a blocking position southwest of the
action. The NVA force had moved around to the west of the Marines
and were now attacking from the south. The advisor informed Toner
that an ARVN company would move to the desired blocking position.
Fifteen of the enemy had already attacked the Marines and,
although driven off, had fired 10 RPG antitank rounds. One of these
rounds hit a LVTH-6 on the bow, but the round glanced off without
damaging the tractor. The LVTH-6 destroyed the antitank gunners'
position with direct 105mm howitzer fire. The number of enemy
troops involved in the battle increased. A 30-minute firefight began;
Gavlick's composite company took heavy small arms fire from three
sides, then the Communists began hitting the Marines with mortars.
Throughout the action, the two LVTH-6s maneuvered back and forth
to engage the enemy, often firing at ranges between 50 to 150
meters. The remaining four LVTH-6s at Cua Viet and a detachment
of 4.2-inch mortars at C-4 added their fire to the battle. As the
Marines tightened their perimeter, the NVA made a second assault.
Fifty-five of the enemy attacked from the north, 12 more came in
from the northeast, and 20 others from the south. Again, mortar fire
supported their assault. The Marines responded with artillery, and
used naval gunfire to hold back enemy reinforcements. The
Communist assault failed, but individual soldiers continued to pop up
around the perimeter. One audacious NVA mortar crew, protected by
infantry, went into action on an exposed sand dune only 90 meters
from the Marine perimeter. They fired six rounds before machine
guns and direct fire from one of the LVTH-6s killed them.

[3d MarDiv ComdC, December 1967: PFC F. N. Bunton carries a
small Christmas tree on his pack while on Operation Kentucky with
Company C, 1st Battalion, 4th Marines in December.]