FIFTH EDITION

Using Multivariate Statistics

Barbara G. Tabachnick
California State University, Northridge

Linda S. Fidell
California State University, Northridge

Boston • New York • San Francisco
Mexico City • Montreal • Toronto • London • Madrid • Munich • Paris
Hong Kong • Singapore • Tokyo • Cape Town • Sydney
Executive Editor: Susan Hartman
Editorial Assistant: Therese Felser
Marketing Manager: Karen Natale
Senior Production Administrator: Donna Simons
Editorial Production Service: Omegatype Typography, Inc.
Composition Buyer: Andrew Erryo
Manufacturing Buyer: Andrew Turso
Electronic Composition: Omegatype Typography, Inc.
Cover Administrator: Joel Gendron

For related titles and support materials, visit our online catalog at www.ablongman.com.

Copyright © 2007, 2001, 1996 Pearson Education, Inc.

All rights reserved. No part of the material protected by this copyright notice may be reproduced
or utilized in any form or by any means, electronic or mechanical, including photocopying, recording,
or by any information storage and retrieval system, without written permission from the copyright holder.

To obtain permission(s) to use materials from this work, please submit a written request to Allyn and Bacon,
Permissions Department, 75 Arlington Street, Boston, MA 02116, or fax your request to 617-848-7320.

Between the time web site information is gathered and published, it is not unusual for some sites to have
closed. Also, the transcription of URLs can result in typographical errors. The publisher would appreciate
notification where these occur so that they may be corrected in subsequent editions.

ISBN 0-205-45938-2

Printed in the United States of America

10 9 8 7 6 5 4 3 2 1    RRD    10 09 08 07 06
CONTENTS

Preface xxvii

1 Introduction 1

1.1 Multivariate Statistics: Why? 1


1.1.1 The Domain of Multivariate Statistics: Numbers of IVs
and DVs 1
1.1.2 Experimental and Nonexperimental Research 2
1.1.3 Computers and Multivariate Statistics 4
1.1.4 Garbage In, Roses Out? 5
1.2 Some Useful Definitions 5
1.2.1 Continuous. Discrete, and Dichotomous Data 5
1.2.2 Samples and Populations 7
1.2.3 Descriptive and Inferential Statistics 7
1.2.4 Orthogonality: Standard and Sequential Analyses 8
1.3 Linear Combinations of Variables 10
1.4 Number and Nature of Variables to Include 11
1.5 Statistical Power 11
1.6 Data Appropriate for Multivariate Statistics 12
1.6.1 The Data Matrix 12
1.6.2 The Correlation Matrix 13
1.6.3 The Variance-Covariance Matrix 14
1.6.4 The Sum-of-Squares and Cross-Products Matrix 14
1.6.5 Residuals 16
1.7 Organization of the Book 16

2 A Guide to Statistical Techniques: Using the Book 17
2.1 Research Questions and Associated Techniques 17
2.1.1 Degree of Relationship among Variables 17
2.1.1.1 Bivariate r 17
2.1.1.2 Multiple R 18
2.1.1.3 Sequential R 18
2.1.1.4 Canonical R 18
2.1.1.5 Multiway Frequency Analysis 19
2.1.1.6 Multilevel Modeling 19


2.1.2 Significance of Group Differences 19


2.1.2.1 One-Way ANOVA and t Test 19
2.1.2.2 One-Way ANCOVA 20
2.1.2.3 Factorial ANOVA 20
2.1.2.4 Factorial ANCOVA 20
2.1.2.5 Hotelling's T² 21
2.1.2.6 One-Way MANOVA 21
2.1.2.7 One-Way MANCOVA 21
2.1.2.8 Factorial MANOVA 22
2.1.2.9 Factorial MANCOVA 22
2.1.2.10 Profile Analysis of Repeated Measures 23
2.1.3 Prediction of Group Membership 23
2.1.3.1 One-Way Discriminant 23
2.1.3.2 Sequential One-Way Discriminant 24
2.1.3.3 Multiway Frequency Analysis (Logit) 24
2.1.3.4 Logistic Regression 24
2.1.3.5 Sequential Logistic Regression 25
2.1.3.6 Factorial Discriminant Analysis 25
2.1.3.7 Sequential Factorial Discriminant Analysis 25
2.1.4 Structure 25
2.1.4.1 Principal Components 25
2.1.4.2 Factor Analysis 26
2.1.4.3 Structural Equation Modeling 26
2.1.5 Time Course of Events 26
2.1.5.1 Survival/Failure Analysis 26
2.1.5.2 Time-Series Analysis 27
2.2 Some Further Comparisons 27
2.3 A Decision Tree 28
2.4 Technique Chapters 31
2.5 Preliminary Check of the Data 32

3 Review of Univariate and Bivariate Statistics 33


3.1 Hypothesis Testing 33
3.1.1 One-Sample z Test as Prototype 33
3.1.2 Power 36
3.1.3 Extensions of the Model 37
3.1.4 Controversy Surrounding Significance Testing 37
3.2 Analysis of Variance 37
3.2.1 One-Way Between-Subjects ANOVA 39
3.2.2 Factorial Between-Subjects ANOVA 42
3.2.3 Within-Subjects ANOVA 43
3.2.4 Mixed Between-Within-Subjects ANOVA 46
3.2.5 Design Complexity 47
3.2.5.1 Nesting 47
3.2.5.2 Latin-Square Designs 47
3.2.5.3 Unequal n and Nonorthogonality 48
3.2.5.4 Fixed and Random Effects 49
3.2.6 Specific Comparisons 49
3.2.6.1 Weighting Coefficients for Comparisons 50
3.2.6.2 Orthogonality of Weighting Coefficients 50
3.2.6.3 Obtained F for Comparisons 51
3.2.6.4 Critical F for Planned Comparisons 52
3.2.6.5 Critical F for Post Hoc Comparisons 53
3.3 Parameter Estimation 53
3.4 Effect Size 54
3.5 Bivariate Statistics: Correlation and Regression 56
3.5.1 Correlation 56
3.5.2 Regression 57
3.6 Chi-Square Analysis 58

4 Cleaning Up Your Act: Screening Data
Prior to Analysis 60

4.1 Important Issues in Data Screening 61
4.1.1 Accuracy of Data File 61
4.1.2 Honest Correlations 61
4.1.2.1 Inflated Correlation 61
4.1.2.2 Deflated Correlation 61
4.1.3 Missing Data 62
4.1.3.1 Deleting Cases or Variables 63
4.1.3.2 Estimating Missing Data 66
4.1.3.3 Using a Missing Data Correlation Matrix 70
4.1.3.4 Treating Missing Data as Data 71
4.1.3.5 Repeating Analyses with and without Missing Data 71
4.1.3.6 Choosing among Methods for Dealing
with Missing Data 71
4.1.4 Outliers 72
4.1.4.1 Detecting Univariate and Multivariate Outliers 73
4.1.4.2 Describing Outliers 76
4.1.4.3 Reducing the Influence of Outliers 77
4.1.4.4 Outliers in a Solution 77
4.1.5 Normality, Linearity, and Homoscedasticity 78
4.1.5.1 Normality 79
4.1.5.2 Linearity 83
4.1.5.3 Homoscedasticity, Homogeneity of Variance, and
Homogeneity of Variance-Covariance Matrices 85

4.1.6 Common Data Transformations 86
4.1.7 Multicollinearity and Singularity 88
4.1.8 A Checklist and Some Practical Recommendations 91
4.2 Complete Examples of Data Screening 92
4.2.1 Screening Ungrouped Data 92
4.2.1.1 Accuracy of Input, Missing Data, Distributions,
and Univariate Outliers 93
4.2.1.2 Linearity and Homoscedasticity 96
4.2.1.3 Transformation 98
4.2.1.4 Detecting Multivariate Outliers 99
4.2.1.5 Variables Causing Cases to Be Outliers 100
4.2.1.6 Multicollinearity 104
4.2.2 Screening Grouped Data 105
4.2.2.1 Accuracy of Input, Missing Data, Distributions,
Homogeneity of Variance, and Univariate Outliers 105
4.2.2.2 Linearity 110
4.2.2.3 Multivariate Outliers 111
4.2.2.4 Variables Causing Cases to Be Outliers 113
4.2.2.5 Multicollinearity 114

5 Multiple Regression 117
5.1 General Purpose and Description 117
5.2 Kinds of Research Questions 118
5.2.1 Degree of Relationship 119
5.2.2 Importance of IVs 119
5.2.3 Adding IVs 119
5.2.4 Changing IVs 120
5.2.5 Contingencies among IVs 120
5.2.6 Comparing Sets of IVs 120
5.2.7 Predicting DV Scores for Members of a New Sample 120
5.2.8 Parameter Estimates 121
5.3 Limitations to Regression Analyses 121
5.3.1 Theoretical Issues 122
5.3.2 Practical Issues 123
5.3.2.1 Ratio of Cases to IVs 123
5.3.2.2 Absence of Outliers among the IVs and on the DV 124
5.3.2.3 Absence of Multicollinearity and Singularity 124
5.3.2.4 Normality, Linearity, Homoscedasticity of Residuals 125
5.3.2.5 Independence of Errors 128
5.3.2.6 Absence of Outliers in the Solution 128
5.4 Fundamental Equations for Multiple Regression 128
5.4.1 General Linear Equations 129
5.4.2 Matrix Equations 131
5.4.3 Computer Analyses of Small-Sample Example 134

5.5 Major Types of Multiple Regression 136


5.5.1 Standard Multiple Regression 136
5.5.2 Sequential Multiple Regression 138
5.5.3 Statistical (Stepwise) Regression 138
5.5.4 Choosing among Regression Strategies 143
5.6 Some Important Issues 144
5.6.1 Importance of IVs 144
5.6.1.1 Standard Multiple Regression 146
5.6.1.2 Sequential or Statistical Regression 146
5.6.2 Statistical Inference 146
5.6.2.1 Test for Multiple R 147
5.6.2.2 Test of Regression Components 148
5.6.2.3 Test of Added Subset of IVs 149
5.6.2.4 Confidence Limits around B and Multiple R² 150
5.6.2.5 Comparing Two Sets of Predictors 152
5.6.3 Adjustment of R² 153
5.6.4 Suppressor Variables 154
5.6.5 Regression Approach to ANOVA 155
5.6.6 Centering when Interactions and Powers of IVs
Are Included 157
5.6.7 Mediation in Causal Sequences 159
5.7 Complete Examples of Regression Analysis 161
5.7.1 Evaluation of Assumptions 161
5.7.1.1 Ratio of Cases to IVs 161
5.7.1.2 Normality, Linearity, Homoscedasticity,
and Independence of Residuals 161
5.7.1.3 Outliers 165
5.7.1.4 Multicollinearity and Singularity 167
5.7.2 Standard Multiple Regression 167
5.7.3 Sequential Regression 174
5.7.4 Example of Standard Multiple Regression with Missing Values
Multiply Imputed 179
5.8 Comparison of Programs 188
5.8.1 SPSS Package 188
5.8.2 SAS System 191
5.8.3 SYSTAT System 194

6 Analysis of Covariance 195


6.1 General Purpose and Description 195
6.2 Kinds of Research Questions 198
6.2.1 Main Effects of IVs 198
6.2.2 Interactions among IVs 198
6.2.3 Specific Comparisons and Trend Analysis 199
6.2.4 Effects of Covariates 199

6.2.5 Effect Size 199
6.2.6 Parameter Estimates 199
6.3 Limitations to Analysis of Covariance 200
6.3.1 Theoretical Issues 200
6.3.2 Practical Issues 201
6.3.2.1 Unequal Sample Sizes, Missing Data, and Ratio of Cases
to IVs 201
6.3.2.2 Absence of Outliers 201
6.3.2.3 Absence of Multicollinearity and Singularity 201
6.3.2.4 Normality of Sampling Distributions 202
6.3.2.5 Homogeneity of Variance 202
6.3.2.6 Linearity 202
6.3.2.7 Homogeneity of Regression 202
6.3.2.8 Reliability of Covariates 203
6.4 Fundamental Equations for Analysis of Covariance 203
6.4.1 Sums of Squares and Cross Products 204
6.4.2 Significance Test and Effect Size 208
6.4.3 Computer Analyses of Small-Sample Example 209
6.5 Some Important Issues 211
6.5.1 Choosing Covariates 211
6.5.2 Evaluation of Covariates 212
6.5.3 Test for Homogeneity of Regression 213
6.5.4 Design Complexity 213
6.5.4.1 Within-Subjects and Mixed
Within-Between Designs 213
6.5.4.2 Unequal Sample Sizes 217
6.5.4.3 Specific Comparisons and Trend Analysis 218
6.5.4.4 Effect Size 221
6.5.5 Alternatives to ANCOVA 221
6.6 Complete Example of Analysis of Covariance 223
6.6.1 Evaluation of Assumptions 223
6.6.1.1 Unequal n and Missing Data 224
6.6.1.2 Normality 224
6.6.1.3 Linearity 224
6.6.1.4 Outliers 224
6.6.1.5 Multicollinearity and Singularity 227
6.6.1.6 Homogeneity of Variance 228
6.6.1.7 Homogeneity of Regression 230
6.6.1.8 Reliability of Covariates 230
6.6.2 Analysis of Covariance 230
6.6.2.1 Main Analysis 230
6.6.2.2 Evaluation of Covariates 235
6.6.2.3 Homogeneity of Regression Run 237
6.7 Comparison of Programs 240
6.7.1 SPSS Package 240

6.7.2 SAS System 240
6.7.3 SYSTAT System 240

7 Multivariate Analysis of Variance and Covariance 243


7.1 General Purpose and Description 243
7.2 Kinds of Research Questions 247
7.2.1 Main Effects of IVs 247
7.2.2 Interactions among IVs 247
7.2.3 Importance of DVs 247
7.2.4 Parameter Estimates 248
7.2.5 Specific Comparisons and Trend Analysis 248
7.2.6 Effect Size 248
7.2.7 Effects of Covariates 248
7.2.8 Repeated-Measures Analysis of Variance 249
7.3 Limitations to Multivariate Analysis of Variance
and Covariance 249
7.3.1 Theoretical Issues 249
7.3.2 Practical Issues 250
7.3.2.1 Unequal Sample Sizes. Missing Data, and Power 250
7.3.2.2 Multivariate Normality 251
7.3.2.3 Absence of Outliers 251
7.3.2.4 Homogeneity of Variance-Covariance Matrices 251
7.3.2.5 Linearity 252
7.3.2.6 Homogeneity of Regression 252
7.3.2.7 Reliability of Covariates 253
7.3.2.8 Absence of Multicollinearity and Singularity 253
7.4 Fundamental Equations for Multivariate Analysis
of Variance and Covariance 253
7.4.1 Multivariate Analysis of Variance 253
7.4.2 Computer Analyses of Small-Sample Example 261
7.4.3 Multivariate Analysis of Covariance 264
7.5 Some Important Issues 268
7.5.1 MANOVA vs. ANOVAs 268
7.5.2 Criteria for Statistical Inference 269
7.5.3 Assessing DVs 270
7.5.3.1 Univariate F 270
7.5.3.2 Roy-Bargmann Stepdown Analysis 271
7.5.3.3 Using Discriminant Analysis 272
7.5.3.4 Choosing among Strategies for Assessing DVs 273
7.5.4 Specific Comparisons and Trend Analysis 273
7.5.5 Design Complexity 274
7.5.5.1 Within-Subjects and Between-Within Designs 273
7.5.5.2 Unequal Sample Sizes 276

7.6 Complete Examples of Multivariate Analysis of Variance
and Covariance 277
7.6.1 Evaluation of Assumptions 277
7.6.1.1 Unequal Sample Sizes and Missing Data 277
7.6.1.2 Multivariate Normality 279
7.6.1.3 Linearity 279
7.6.1.4 Outliers 279
7.6.1.5 Homogeneity of Variance-Covariance Matrices 280
7.6.1.6 Homogeneity of Regression 281
7.6.1.7 Reliability of Covariates 284
7.6.1.8 Multicollinearity and Singularity 285
7.6.2 Multivariate Analysis of Variance 285
7.6.3 Multivariate Analysis of Covariance 296
7.6.3.1 Assessing Covariates 296
7.6.3.2 Assessing DVs 296
7.7 Comparison of Programs 307
7.7.1 SPSS Package 307
7.7.2 SAS System 310
7.7.3 SYSTAT System 310

8 Profile Analysis: The Multivariate Approach


to Repeated Measures 311
8.1 General Purpose and Description 311
8.2 Kinds of Research Questions 312
8.2.1 Parallelism of Profiles 312
8.2.2 Overall Difference among Groups 313
8.2.3 Flatness of Profiles 313
8.2.4 Contrasts Following Profile Analysis 313
8.2.5 Parameter Estimates 313
8.2.6 Effect Size 314
8.3 Limitations to Profile Analysis 314
8.3.1 Theoretical Issues 314
8.3.2 Practical Issues 315
8.3.2.1 Sample Size, Missing Data, and Power 315
8.3.2.2 Multivariate Normality 315
8.3.2.3 Absence of Outliers 315
8.3.2.4 Homogeneity of Variance-Covariance Matrices 315
8.3.2.5 Linearity 316
8.3.2.6 Absence of Multicollinearity and Singularity 316
8.4 Fundamental Equations for Profile Analysis 316
8.4.1 Differences in Levels 316
8.4.2 Parallelism 318

8.4.3 Flatness 321
8.4.4 Computer Analyses of Small-Sample Example 323
8.5 Some Important Issues 329
8.5.1 Univariate vs. Multivariate Approach to Repeated Measures 329
8.5.2 Contrasts in Profile Analysis 331
8.5.2.1 Parallelism and Flatness Significant, Levels Not Significant
(Simple-effects Analysis) 333
8.5.2.2 Parallelism and Levels Significant, Flatness Not Significant
(Simple-effects Analysis) 336
8.5.2.3 Parallelism, Levels, and Flatness Significant
(Interaction Contrasts) 339
8.5.2.4 Only Parallelism Significant 339
8.5.3 Doubly-Multivariate Designs 339
8.5.4 Classifying Profiles 345
8.5.5 Imputation of Missing Values 345
8.6 Complete Examples of Profile Analysis 346
8.6.1 Profile Analysis of Subscales of the WISC 346
8.6.1.1 Evaluation of Assumptions 346
8.6.1.2 Profile Analysis 351
8.6.2 Doubly-Multivariate Analysis of Reaction Time 360
8.6.2.1 Evaluation of Assumptions 360
8.6.2.2 Doubly-Multivariate Analysis of Slope and Intercept 363
8.7 Comparison of Programs 371
8.7.1 SPSS Package 373
8.7.2 SAS System 373
8.7.3 SYSTAT System 374

9 Discriminant Analysis 375


9.1 General Purpose and Description 375
9.2 Kinds of Research Questions 378
9.2.1 Significance of Prediction 378
9.2.2 Number of Significant Discriminant Functions 378
9.2.3 Dimensions of Discrimination 379
9.2.4 Classification Functions 379
9.2.5 Adequacy of Classification 379
9.2.6 Effect Size 379
9.2.7 Importance of Predictor Variables 380
9.2.8 Significance of Prediction with Covariates 380
9.2.9 Estimation of Group Means 380
9.3 Limitations to Discriminant Analysis 381
9.3.1 Theoretical Issues 381

9.3.2 Practical Issues 381


9.3.2.1 Unequal Sample Sizes, Missing Data, and Power 381
9.3.2.2 Multivariate Normality 382
9.3.2.3 Absence of Outliers 382
9.3.2.4 Homogeneity of Variance-Covariance Matrices 382
9.3.2.5 Linearity 383
9.3.2.6 Absence of Multicollinearity and Singularity 383
9.4 Fundamental Equations for Discriminant Analysis 384
9.4.1 Derivation and Test of Discriminant Functions 384
9.4.2 Classification 387
9.4.3 Computer Analyses of Small-Sample Example 389
9.5 Types of Discriminant Function Analyses 395
9.5.1 Direct Discriminant Analysis 395
9.5.2 Sequential Discriminant Analysis 396
9.5.3 Stepwise (Statistical) Discriminant Analysis 396
9.6 Some Important Issues 397
9.6.1 Statistical Inference 397
9.6.1.1 Criteria for Overall Statistical Significance 397
9.6.1.2 Stepping Methods 397
9.6.2 Number of Discriminant Functions 398
9.6.3 Interpreting Discriminant Functions 398
9.6.3.1 Discriminant Function Plots 398
9.6.3.2 Structure Matrix of Loadings 400
9.6.4 Evaluating Predictor Variables 401
9.6.5 Effect Size 402
9.6.6 Design Complexity: Factorial Designs 403
9.6.7 Use of Classification Procedures 404
9.6.7.1 Cross-Validation and New Cases 405
9.6.7.2 Jackknifed Classification 405
9.6.7.3 Evaluating Improvement in Classification 405
9.7 Complete Example of Discriminant Analysis 407
9.7.1 Evaluation of Assumptions 407
9.7.1.1 Unequal Sample Sizes and Missing Data 407
9.7.1.2 Multivariate Normality 408
9.7.1.3 Linearity 408
9.7.1.4 Outliers 408
9.7.1.5 Homogeneity of Variance-Covariance Matrices 411
9.7.1.6 Multicollinearity and Singularity 411
9.7.2 Direct Discriminant Analysis 412
9.8 Comparison of Programs 430
9.8.1 SPSS Package 430
9.8.2 SAS System 430
9.8.3 SYSTAT System 436

10 Logistic Regression 437


10.1 General Purpose and Description 437
10.2 Kinds of Research Questions 439
10.2.1 Prediction of Group Membership or Outcome 439
10.2.2 Importance of Predictors 439
10.2.3 Interactions among Predictors 440
10.2.4 Parameter Estimates 440
10.2.5 Classification of Cases 440
10.2.6 Significance of Prediction with Covariates 440
10.2.7 Effect Size 441
10.3 Limitations to Logistic Regression Analysis 441
10.3.1 Theoretical Issues 441
10.3.2 Practical Issues 442
10.3.2.1 Ratio of Cases to Variables 442
10.3.2.2 Adequacy of Expected Frequencies and Power 442
10.3.2.3 Linearity in the Logit 443
10.3.2.4 Absence of Multicollinearity 443
10.3.2.5 Absence of Outliers in the Solution 443
10.3.2.6 Independence of Errors 443
10.4 Fundamental Equations for Logistic Regression 444
10.4.1 Testing and Interpreting Coefficients 445
10.4.2 Goodness-of-Fit 446
10.4.3 Comparing Models 448
10.4.4 Interpretation and Analysis of Residuals 448
10.4.5 Computer Analyses of Small-Sample Example 449
10.5 Types of Logistic Regression 453
10.5.1 Direct Logistic Regression 454
10.5.2 Sequential Logistic Regression 454
10.5.3 Statistical (Stepwise) Logistic Regression 454
10.5.4 Probit and Other Analyses 456
10.6 Some Important Issues 457
10.6.1 Statistical Inference 457
10.6.1.1 Assessing Goodness-of-Fit of Models 457
10.6.1.2 Tests of Individual Variables 459
10.6.2 Effect Size for a Model 460
10.6.3 Interpretation of Coefficients Using Odds 461
10.6.4 Coding Outcome and Predictor Categories 464
10.6.5 Number and Type of Outcome Categories 464
10.6.6 Classification of Cases 468
10.6.7 Hierarchical and Nonhierarchical Analysis 468

10.6.8 Importance of Predictors 469
10.6.9 Logistic Regression for Matched Groups 469
10.7 Complete Examples of Logistic Regression 469
10.7.1 Evaluation of Limitations 470
10.7.1.1 Ratio of Cases to Variables and Missing Data 470
10.7.1.2 Multicollinearity 473
10.7.1.3 Outliers in the Solution 474
10.7.2 Direct Logistic Regression with Two-Category Outcome
and Continuous Predictors 474
10.7.2.1 Limitation: Linearity in the Logit 474
10.7.2.2 Direct Logistic Regression with Two-Category Outcome 474
10.7.3 Sequential Logistic Regression with Three Categories
of Outcome 481
10.7.3.1 Limitations of Multinomial Logistic Regression 481
10.7.3.2 Sequential Multinomial Logistic Regression 48 1
10.8 Comparisons of Programs 499
10.8.1 SPSS Package 499
10.8.2 SAS System 504
10.8.3 SYSTAT System 504

11 Survival/Failure Analysis 506


11.1 General Purpose and Description 506
11.2 Kinds of Research Questions 507
11.2.1 Proportions Surviving at Various Times 507
11.2.2 Group Differences in Survival 508
11.2.3 Survival Time with Covariates 508
11.2.3.1 Treatment Effects 508
11.2.3.2 Importance of Covariates 508
11.2.3.3 Parameter Estimates 508
11.2.3.4 Contingencies among Covariates 508
11.2.3.5 Effect Size and Power 509
11.3 Limitations to Survival Analysis 509
11.3.1 Theoretical Issues 509
11.3.2 Practical Issues 509
11.3.2.1 Sample Size and Missing Data 509
11.3.2.2 Normality of Sampling Distributions, Linearity,
and Homoscedasticity 510
11.3.2.3 Absence of Outliers 510
11.3.2.4 Differences between Withdrawn and
Remaining Cases 510
11.3.2.5 Change in Survival Conditions over Time 510
11.3.2.6 Proportionality of Hazards 510
11.3.2.7 Absence of Multicollinearity 510

11.4 Fundamental Equations for Survival Analysis 511


11.4.1 Life Tables 511
11.4.2 Standard Error of Cumulative Proportion Surviving 513
11.4.3 Hazard and Density Functions 514
11.4.4 Plot of Life Tables 515
11.4.5 Test for Group Differences 515
11.4.6 Computer Analyses of Small-Sample Example 517
11.5 Types of Survival Analyses 524
11.5.1 Actuarial and Product-Limit Life Tables
and Survivor Functions 524
11.5.2 Prediction of Group Survival Times from Covariates 524
11.5.2.1 Direct, Sequential, and Statistical Analysis 527
11.5.2.2 Cox Proportional-Hazards Model 527
11.5.2.3 Accelerated Failure-Time Models 529
11.5.2.4 Choosing a Method 535
11.6 Some Important Issues 535
11.6.1 Proportionality of Hazards 535
11.6.2 Censored Data 537
11.6.2.1 Right-Censored Data 537
11.6.2.2 Other Forms of Censoring 537
11.6.3 Effect Size and Power 538
11.6.4 Statistical Criteria 539
11.6.4.1 Test Statistics for Group Differences
in Survival Functions 539
11.6.4.2 Test Statistics for Prediction from Covariates 540
11.6.5 Predicting Survival Rate 540
11.6.5.1 Regression Coefficients (Parameter Estimates) 540
11.6.5.2 Odds Ratios 540
11.6.5.3 Expected Survival Rates 541
11.7 Complete Example of Survival Analysis 541
11.7.1 Evaluation of Assumptions 543
11.7.1.1 Accuracy of Input, Adequacy of Sample Size, Missing Data,
and Distributions 543
11.7.1.2 Outliers 545
11.7.1.3 Differences between Withdrawn and Remaining Cases 549
11.7.1.4 Change in Survival Experience over Time 549
11.7.1.5 Proportionality of Hazards 549
11.7.1.6 Multicollinearity 551
11.7.2 Cox Regression Survival Analysis 551
11.7.2.1 Effect of Drug Treatment 552
11.7.2.2 Evaluation of Other Covariates 552
11.8 Comparison of Programs 559
11.8.1 SAS System 559
11.8.2 SPSS Package 559
11.8.3 SYSTAT System 566

12 Canonical Correlation 567


12.1 General Purpose and Description 567
12.2 Kinds of Research Questions 568
12.2.1 Number of Canonical Variate Pairs 568
12.2.2 Interpretation of Canonical Variates 569
12.2.3 Importance of Canonical Variates 569
12.2.4 Canonical Variate Scores 569
12.3 Limitations 569
12.3.1 Theoretical Limitations 569
12.3.2 Practical Issues 570
12.3.2.1 Ratio of Cases to IVs 570
12.3.2.2 Normality, Linearity, and Homoscedasticity 570
12.3.2.3 Missing Data 571
12.3.2.4 Absence of Outliers 571
12.3.2.5 Absence of Multicollinearity and Singularity 571
12.4 Fundamental Equations for Canonical Correlation 572
12.4.1 Eigenvalues and Eigenvectors 573
12.4.2 Matrix Equations 575
12.4.3 Proportions of Variance Extracted 579
12.4.4 Computer Analyses of Small-Sample Example 580
12.5 Some Important Issues 586
12.5.1 Importance of Canonical Variates 586
12.5.2 Interpretation of Canonical Variates 587
12.6 Complete Example of Canonical Correlation 587
12.6.1 Evaluation of Assumptions 588
12.6.1.1 Missing Data 588
12.6.1.2 Normality, Linearity, and Homoscedasticity 588
12.6.1.3 Outliers 591
12.6.1.4 Multicollinearity and Singularity 595
12.6.2 Canonical Correlation 595
12.7 Comparison of Programs 604
12.7.1 SAS System 604
12.7.2 SPSS Package 604
12.7.3 SYSTAT System 606

13 Principal Components and Factor Analysis 607


13.1 General Purpose and Description 607
13.2 Kinds of Research Questions 610
13.2.1 Number of Factors 610

13.2.2 Nature of Factors 611


13.2.3 Importance of Solutions and Factors 611
13.2.4 Testing Theory in FA 611
13.2.5 Estimating Scores on Factors 611
13.3 Limitations 611
13.3.1 Theoretical Issues 611
13.3.2 Practical Issues 612
13.3.2.1 Sample Size and Missing Data 613
13.3.2.2 Normality 613
13.3.2.3 Linearity 613
13.3.2.4 Absence of Outliers among Cases 613
13.3.2.5 Absence of Multicollinearity and Singularity 614
13.3.2.6 Factorability of R 614
13.3.2.7 Absence of Outliers among Variables 614
13.4 Fundamental Equations for Factor Analysis 615
13.4.1 Extraction 616
13.4.2 Orthogonal Rotation 620
13.4.3 Communalities, Variance, and Covariance 621
13.4.4 Factor Scores 622
13.4.5 Oblique Rotation 625
13.4.6 Computer Analyses of Small-Sample Example 628
13.5 Major Types of Factor Analyses 633
13.5.1 Factor Extraction Techniques 633
13.5.1.1 PCA vs. FA 634
13.5.1.2 Principal Components 635
13.5.1.3 Principal Factors 636
13.5.1.4 Image Factor Extraction 636
13.5.1.5 Maximum Likelihood Factor Extraction 636
13.5.1.6 Unweighted Least Squares Factoring 636
13.5.1.7 Generalized (Weighted) Least Squares Factoring 637
13.5.1.8 Alpha Factoring 637
13.5.2 Rotation 637
13.5.2.1 Orthogonal Rotation 638
13.5.2.2 Oblique Rotation 638
13.5.2.3 Geometric Interpretation 640
13.5.3 Some Practical Recommendations 642
13.6 Some Important Issues 643
13.6.1 Estimates of Communalities 643
13.6.2 Adequacy of Extraction and Number of Factors 644
13.6.3 Adequacy of Rotation and Simple Structure 646
13.6.4 Importance and Internal Consistency of Factors 647
13.6.5 Interpretation of Factors 649
13.6.6 Factor Scores 650
13.6.7 Comparisons among Solutions and Groups 651

13.7 Complete Example of FA 651


13.7.1 Evaluation of Limitations 652
13.7.1.1 Sample Size and Missing Data 652
13.7.1.2 Normality 652
13.7.1.3 Linearity 652
13.7.1.4 Outliers 652
13.7.1.5 Multicollinearity and Singularity 657
13.7.1.6 Outliers among Variables 657
13.7.2 Principal Factors Extraction with Varimax Rotation 657
13.8 Comparison of Programs 671
13.8.1 SPSS Package 674
13.8.2 SAS System 675
13.8.3 SYSTAT System 675

14 Structural Equation Modeling 676


14.1 General Purpose and Description 676
14.2 Kinds of Research Questions 680
14.2.1 Adequacy of the Model 680
14.2.2 Testing Theory 680
14.2.3 Amount of Variance in the Variables Accounted for
by the Factors 680
14.2.4 Reliability of the Indicators 680
14.2.5 Parameter Estimates 680
14.2.6 Intervening Variables 681
14.2.7 Group Differences 681
14.2.8 Longitudinal Differences 681
14.2.9 Multilevel Modeling 681
14.3 Limitations to Structural Equation Modeling 682
14.3.1 Theoretical Issues 682
14.3.2 Practical Issues 682
14.3.2.1 Sample Size and Missing Data 682
14.3.2.2 Multivariate Normality and Absence of Outliers 683
14.3.2.3 Linearity 683
14.3.2.4 Absence of Multicollinearity and Singularity 683
14.3.2.5 Residuals 684
14.4 Fundamental Equations for Structural Equations Modeling 684
14.4.1 Covariance Algebra 684
14.4.2 Model Hypotheses 686
14.4.3 Model Specification 688
14.4.4 Model Estimation 690
14.4.5 Model Evaluation 694
14.4.6 Computer Analysis of Small-Sample Example 696

14.5 Some Important Issues 709


14.5.1 Model Identification 709
14.5.2 Estimation Techniques 713
14.5.2.1 Estimation Methods and Sample Size 714
14.5.2.2 Estimation Methods and Nonnormality 714
14.5.2.3 Estimation Methods and Dependence 715
14.5.2.4 Some Recommendations for Choice
of Estimation Method 715
14.5.3 Assessing the Fit of the Model 715
14.5.3.1 Comparative Fit Indices 716
14.5.3.2 Absolute Fit Index 718
14.5.3.3 Indices of Proportion of Variance Accounted 718
14.5.3.4 Degree of Parsimony Fit Indices 719
14.5.3.5 Residual-Based Fit Indices 720
14.5.3.6 Choosing among Fit Indices 720
14.5.4 Model Modification 721
14.5.4.1 Chi-Square Difference Test 721
14.5.4.2 Lagrange Multiplier (LM) Test 721
14.5.4.3 Wald Test 723
14.5.4.4 Some Caveats and Hints on Model Modification 728
14.5.5 Reliability and Proportion of Variance 728
14.5.6 Discrete and Ordinal Data 729
14.5.7 Multiple Group Models 730
14.5.8 Mean and Covariance Structure Models 731
14.6 Complete Examples of Structural Equation
Modeling Analysis 732
14.6.1 Confirmatory Factor Analysis of the WISC 733
14.6.1.1 Model Specification for CFA 7 72
14.6.1.2 Evaluation of Assumptions for CFA 733
14.6.1.3 CFA Model Estimation and Preliminary Evaluation 734
14.6.1.4 Model Modification 743
14.6.2 SEM of Health Data 750
14.6.2.1 SEM Model Specification 750
14.6.2.2 Evaluation of Assumptions for SEM 751
14.6.2.3 SEM Model Estimation and Preliminary Evaluation 755
14.6.2.4 Model Modification 759
14.7 Comparison of Programs 773
14.7.1 EQS 773
14.7.2 LISREL 773
14.7.3 AMOS 780
14.7.4 SAS System 780

15 Multilevel Linear Modeling 781


15.1 General Purpose and Description 781

15.2 Kinds of Research Questions 784


15.2.1 Group Differences in Means 784
15.2.2 Group Differences in Slopes 784
15.2.3 Cross-Level Interactions 785
15.2.4 Meta-Analysis 785
15.2.5 Relative Strength of Predictors at Various Levels 785
15.2.6 Individual and Group Structure 785
15.2.7 Path Analysis at Individual and Group Levels 786
15.2.8 Analysis of Longitudinal Data 786
15.2.9 Multilevel Logistic Regression 786
15.2.10 Multiple Response Analysis 786
15.3 Limitations to Multilevel Linear Modeling 786
15.3.1 Theoretical Issues 786
15.3.2 Practical Issues 787
15.3.2.1 Sample Size, Unequal-n, and Missing Data 787
15.3.2.2 Independence of Errors 788
15.3.2.3 Absence of Multicollinearity and Singularity 785,
15.4 Fundamental Equations 789
15.4.1 Intercepts-Only Model 792
15.4.1.1 The Intercepts-Only Model: Level-1 Equation 793
15.4.1.2 The Intercepts-Only Model: Level-2 Equation 793
15.4.1.3 Computer Analysis of Intercepts-Only Model 794
15.4.2 Model with a First-Level Predictor 799
15.4.2.1 Level-1 Equation for a Model with a Level-1 Predictor 799
15.4.2.2 Level-2 Equations for a Model with a Level-1 Predictor 801
15.4.2.3 Computer Analysis of a Model with a Level-1 Predictor 802
15.4.3 Model with Predictors at First and Second Levels 807
15.4.3.1 Level-1 Equation for Model with Predictors
at Both Levels 807
15.4.3.2 Level-2 Equations for Model with Predictors
at Both Levels 807
15.4.3.3 Computer Analyses of Model with Predictors
at First and Second Levels 808
15.5 Types of MLM 814
15.5.1 Repeated Measures 814
15.5.2 Higher-Order MLM 819
15.5.3 Latent Variables 819
15.5.4 Nonnormal Outcome Variables 820
15.5.5 Multiple Response Models 821
15.6 Some Important Issues 822
15.6.1 Intraclass Correlation 822
15.6.2 Centering Predictors and Changes in Their Interpretations 823
15.6.3 Interactions 826
15.6.4 Random and Fixed Intercepts and Slopes 826

15.6.5 Statistical Inference 810


15.6.5.1 Assessing Models 830
15.6.5.2 Tests of Individual Effects 831
15.6.6 Effect Size 832
15.6.7 Estimation Techniques and Convergence Problems 833
15.6.8 Exploratory Model Building 834
15.7 Complete Example of MLM 835
15.7.1 Evaluation of Assumptions 835
15.7.1.1 Sample Sizes, Missing Data, and Distributions 835
15.7.1.2 Outliers 838
15.7.1.3 Multicollinearity and Singularity 839
15.7.1.4 Independence of Errors: Intraclass Correlations 839
15.7.2 Multilevel Modeling 840
15.8 Comparison of Programs 852
15.8.1 SAS System 852
15.8.2 SPSS Package 856
15.8.3 HLM Program 856
15.8.4 MLwiN Program 857
15.8.5 SYSTAT System 857

16 Multiway Frequency Analysis 858


16.1 General Purpose and Description 858
16.2 Kinds of Research Questions 859
16.2.1 Associations among Variables 859
16.2.2 Effect on a Dependent Variable 860
16.2.3 Parameter Estimates 860
16.2.4 Importance of Effects 860
16.2.5 Effect Size 860
16.2.6 Specific Comparisons and Trend Analysis 860
16.3 Limitations to Multiway Frequency Analysis 861
16.3.1 Theoretical Issues 861
16.3.2 Practical Issues 861
16.3.2.1 Independence 861
16.3.2.2 Ratio of Cases to Variables 861
16.3.2.3 Adequacy of Expected Frequencies 862
16.3.2.4 Absence of Outliers in the Solution 863
16.4 Fundamental Equations for Multiway Frequency Analysis 863
16.4.1 Screening for Effects 864
16.4.1.1 Total Effect 865
16.4.1.2 First-Order Effects 866
16.4.1.3 Second-Order Effects 867
16.4.1.4 Third-Order Effect 871

16.4.2 Modeling 871
16.4.3 Evaluation and Interpretation 874
16.4.3.1 Residuals 874
16.4.3.2 Parameter Estimates 874
16.4.4 Computer Analyses of Small-Sample Example 880
16.5 Some Important Issues 887
16.5.1 Hierarchical and Nonhierarchical Models 887
16.5.2 Statistical Criteria 888
16.5.2.1 Tests of Models 888
16.5.2.2 Tests of Individual Effects 888
16.5.3 Strategies for Choosing a Model 889
16.5.3.1 SPSS HILOGLINEAR (Hierarchical) 889
16.5.3.2 SPSS GENLOG (General Log-Linear) 889
16.5.3.3 SAS CATMOD and SPSS LOGLINEAR
(General Log-Linear) 890
16.6 Complete Example of Multiway Frequency Analysis 890
16.6.1 Evaluation of Assumptions: Adequacy
of Expected Frequencies 890
16.6.2 Hierarchical Log-Linear Analysis 891
16.6.2.1 Preliminary Model Screening 891
16.6.2.2 Stepwise Model Selection 893
16.6.2.3 Adequacy of Fit 895
16.6.2.4 Interpretation of the Selected Model 901
16.7 Comparison of Programs 908
16.7.1 SPSS Package 911
16.7.2 SAS System 912
16.7.3 SYSTAT System 913

17 An Overview of the General Linear Model 913


17.1 Linearity and the General Linear Model 913
17.2 Bivariate to Multivariate Statistics and Overview
of Techniques 913
17.2.1 Bivariate Form 913
17.2.2 Simple Multivariate Form 914
17.2.3 Full Multivariate Form 917
17.3 Alternative Research Strategies 918

18 Time-Series Analysis (available online at
www.ablongman.com/tabachnick5e) 18-1
18.1 General Purpose and Description 18-1

18.2 Kinds of Research Questions 18-3


18.2.1 Pattern of Autocorrelation 18-5
18.2.2 Seasonal Cycles and Trends 18-5
18.2.3 Forecasting 18-5
18.2.4 Effect of an Intervention 18-5
18.2.5 Comparing Time Series 18-5
18.2.6 Time Series with Covariates 18-6
18.2.7 Effect Size and Power 18-6
18.3 Assumptions of Time-Series Analysis 18-6
18.3.1 Theoretical Issues 18-6
18.3.2 Practical Issues 18-6
18.3.2.1 Normality of Distributions of Residuals 18-6
18.3.2.2 Homogeneity of Variance and Zero Mean of Residuals 18-7
18.3.2.3 Independence of Residuals 18-7
18.3.2.4 Absence of Outliers 18-7
18.4 Fundamental Equations for Time-Series ARIMA Models 18-7
18.4.1 Identification ARIMA (p, d, q) Models 18-8
18.4.1.1 Trend Components, d: Making the Process Stationary 18-8
18.4.1.2 Auto-Regressive Components 18-11
18.4.1.3 Moving Average Components 18-12
18.4.1.4 Mixed Models 18-13
18.4.1.5 ACFs and PACFs 18-13
18.4.2 Estimating Model Parameters 18-16
18.4.3 Diagnosing a Model 18-19
18.4.4 Computer Analysis of Small-Sample
Time-Series Example 18-19
18.5 Types of Time-Series Analyses 18-27
18.5.1 Models with Seasonal Components 18-27
18.5.2 Models with Interventions 18-30
18.5.2.1 Abrupt, Permanent Effects 18-32
18.5.2.2 Abrupt, Temporary Effects 18-32
18.5.2.3 Gradual, Permanent Effects 18-38
18.5.2.4 Models with Multiple Interventions 18-38
18.5.3 Adding Continuous Variables 18-38
18.6 Some Important Issues 18-41
18.6.1 Patterns of ACFs and PACFs 18-41
18.6.2 Effect Size 18-44
18.6.3 Forecasting 18-45
18.6.4 Statistical Methods for Comparing Two Models 18-45
18.7 Complete Example of a Time-Series Analysis 18-47
18.7.1 Evaluation of Assumptions 18-48
18.7.1.1 Normality of Sampling Distributions 18-48
18.7.1.2 Homogeneity of Variance 18-48
18.7.1.1 Outliers 18-48

18.7.2 Baseline Model Identification and Estimation 18-48


18.7.3 Baseline Model Diagnosis 18-49
18.7.4 Intervention Analysis 18-55
18.7.4.1 Model Diagnosis 18-55
18.7.4.2 Model Interpretation 18-56
18.8 Comparison of Programs 18-60
18.8.1 SPSS Package 18-61
18.8.2 SAS System 18-61
18.8.3 SYSTAT System 18-61

Appendix A A Skimpy Introduction to Matrix Algebra 924


A.1 The Trace of a Matrix 925
A.2 Addition or Subtraction of a Constant to a Matrix 925
A.3 Multiplication or Division of a Matrix by a Constant 925
A.4 Addition and Subtraction of Two Matrices 926
A.5 Multiplication, Transposes, and Square Roots of Matrices 927
A.6 Matrix "Division" (Inverses and Determinants) 929
A.7 Eigenvalues and Eigenvectors: Procedures
for Consolidating Variance from a Matrix 930

Appendix B Research Designs for Complete Examples 934


B.l Women's Health and Drug Study 934
B.2 Sexual Attraction Study 935
B.3 Learning Disabilities Data Bank 938
B.4 Reaction Time to Identify Figures 939
B.5 Field Studies of Noise-Induced Sleep Disturbance 939
B.6 Clinical Trial for Primary Biliary Cirrhosis 940
B.7 Impact of Seat Belt Law 940

Appendix C Statistical Tables 941


C.1 Normal Curve Areas 942
C.2 Critical Values of the t Distribution for α = .05 and .01,
Two-Tailed Test 943

C.3 Critical Values of the F Distribution 944


C.4 Critical Values of Chi Square (χ²) 949
C.5 Critical Values for Squared Multiple Correlation (R²) in Forward
Stepwise Selection 950
C.6 Critical Values for Fmax (s²max/s²min) Distribution for α = .05
and .01 952

References 953

Index 963
PREFACE

Obesity threatened, and we've had to consider putting the book on a diet. We've added only one
chapter this time around, Multilevel Linear Modeling (Chapter 15), and some spiffy new techniques
for dealing with missing data (in Chapter 4). Otherwise, we've mostly streamlined and said good-
bye to some old friends. We've forsaken the Time-Series Analysis chapter in the text, but you'll be
able to download it from the publisher's web site at www.ablongman.com/tabachnick5e. Another
sadly forsaken old friend is SYSTAT. We still love the program, however, for its right-to-the-point
analyses and terrific graphics, and are pleased that most of the graphics have been incorporated into
SPSS. Although absent from demonstrations, features of SYSTAT, and any other programs we've
cut, still appear in the last sections of Chapters 5 through 16, and in online Chapter 18, where pro-
grams are compared. We've changed the order of some chapters: canonical correlation seemed rather
difficult to appear as early as it did, and survival analysis seemed to want to snuggle up to logistic
regression. Actually, the order doesn't seem to matter much: perusal of syllabi on the Web convinces
us that professors feel free to present chapters in any order they choose-and that's fine with us.
Multilevel linear modeling (MLM) seems to have taken the world by storm; how did we ever
live without it? Real life is hierarchical-students come to us within classrooms, teachers work
within different schools, patients share wards and nursing staff, and audiences attend different per-
formances. We hardly ever get to break these groups apart for research purposes, so we have to deal
with intact groups and all their shared experiences. MLM lets us do this without violating all of the
statistical assumptions we learned to know and hate. Now that SAS and SPSS can deal with these
models, we're ready to tackle the real world. Hence, a new chapter.
SAS and SPSS also now offer reasonable ways to impute missing data through multiple-
imputation techniques and fully assess missing data patterns, respectively. We expanded Chapter 4
to demonstrate these enhancements. SPSS and SAS keep adding goodies, which we'll try to show
off. As before, we adapt our syntax from Windows menus whenever possible, and all of our data sets
are available on the book's web page (www.ablongman.com/tabachnick5e). We've also paid more
attention to effect sizes and, especially, confidence intervals around effect sizes. Michael Smithson of
the Australian National University has kindly given us permission to include some nifty SPSS and
SAS syntax and data files in our web page downloads. Jim Steiger and Rachel Fouladi have gra-
ciously given us permission to include their DOS program that finds confidence intervals around R².
One thing we'll never change is our practical bent, focusing on the benefits and limitations of
applications of a technique to a data set-when, why, and how to do it. The math is wonderful, and
we suggest (but don't insist) that students follow along through section four of each chapter using
readily available software for matrix manipulations or spreadsheets. But we still feel that under-
standing the math is not enough to insure appropriate analysis of data. And our readers assure us that
they really are able to apply the techniques without a great deal of attention to the math of section
four. Our small-sample examples remain silly; alas, our belly dancing days are over. As for our most
recent reviewers, kindly provided by our publisher, we had the three bears checking out beds: too
hard, too soft, and just right. So we've not changed the tone or level of difficulty.
Some extremely helpful advice was offered by Steve Osterlind of the University of
Missouri-Columbia and Jeremy Jewell of Southern Illinois University-Edwardsville. We also


heartily thank Lisa Harlow of the University of Rhode Island, who wrote an extensive, insightful
review of the entire fourth edition of our book in Structural Equation Modeling in 2002. We again
thank the reviewers of earlier editions of our book, but fears of breaking the backs of current students
dissuade us from listing them all once more. You know who you are; we still care. Our thanks to the
reviewers of this edition: Joseph Benz, University of Nebraska-Kearney; Stanley Cohen, West Vir-
ginia University; Michael Granaas, University of South Dakota; Marie Hammond, Tennessee State
University at Nashville; Josephine Korchmaros, Southern Illinois University; and Scott Roesch, San
Diego State University.
As always, the improvements are largely due to reviewers and those colleagues who have taken
the time to email us with suggestions and corrections. Any remaining errors and lack of clarity are
due to us alone. As always, we hope the book provides a few smiles as well as help in analyzing data.

Barbara G. Tabachnick
Linda S. Fidell
CHAPTER 1

Introduction

1.1 Multivariate Statistics: Why?


Multivariate statistics are increasingly popular techniques used for analyzing complicated data sets.
They provide analysis when there are many independent variables (IVs) and/or many dependent
variables (DVs), all correlated with one another to varying degrees. Because of the difficulty of
addressing complicated research questions with univariate analyses and because of the availability
of canned software for performing multivariate analyses, multivariate statistics have become widely
used. Indeed, a standard univariate statistics course only begins to prepare a student to read research
literature or a researcher to produce it.
But how much harder are the multivariate techniques? Compared with the multivariate meth-
ods, univariate statistical methods are so straightforward and neatly structured that it is hard to
believe they once took so much effort to master. Yet many researchers apply and correctly interpret
results of intricate analysis of variance before the grand structure is apparent to them. The same can
be true of multivariate statistical methods. Although we are delighted if you gain insights into the full
multivariate general linear model,¹ we have accomplished our goal if you feel comfortable selecting
and setting up multivariate analyses and interpreting the computer output.
Multivariate methods are more complex than univariate by at least an order of magnitude. But
for the most part, the greater complexity requires few conceptual leaps. Familiar concepts such as
sampling distributions and homogeneity of variance simply become more elaborate.
Multivariate models have not gained popularity by accident--or even by sinister design. Their
growing popularity parallels the greater complexity of contemporary research. In psychology, for
example, we are less and less enamored of the simple, clean, laboratory study in which pliant, first-
year college students each provides us with a single behavioral measure on cue.

1.1.1 The Domain of Multivariate Statistics: Numbers of IVs


and DVs
Multivariate statistical methods are an extension of univariate and bivariate statistics. Multivariate
statistics are the complete or general case, whereas univariate and bivariate statistics are special cases

¹Chapter 17 attempts to foster such insights.


of the multivariate model. If your design has many variables, multivariate techniques often let you
perform a single analysis instead of a series of univariate or bivariate analyses.
Variables are roughly dichotomized into two major types-independent and dependent. Inde-
pendent variables (IVs) are the differing conditions (treatment vs. placebo) to which you expose your
subjects, or characteristics (tall or short) that the subjects themselves bring into the research situa-
tion. IVs are usually considered predictor variables because they predict the DVs-the response or
outcome variables. Note that IV and DV are defined within a research context; a DV in one research
setting may be an IV in another.
Additional terms for IVs and DVs are predictor-criterion, stimulus-response, task-performance,
or simply input-output. We use IV and DV throughout this book to identify variables that belong on
one side of an equation or the other, without causal implication. That is, the terms are used for conve-
nience rather than to indicate that one of the variables caused or determined the size of the other.
The term univariate statistics refers to analyses in which there is a single DV. There may be,
however, more than one IV. For example, the amount of social behavior of graduate students (the
DV) is studied as a function of course load (one IV) and type of training in social skills to which stu-
dents are exposed (another IV). Analysis of variance is a commonly used univariate statistic.
Bivariate statistics frequently refers to analysis of two variables where neither is an experi-
mental IV and the desire is simply to study the relationship between the variables (e.g., the relation-
ship between income and amount of education). Bivariate statistics, of course, can be applied in an
experimental setting, but usually they are not. Prototypical examples of bivariate statistics are the
Pearson product-moment correlation coefficient and chi square analysis. (Chapter 3 reviews univari-
ate and bivariate statistics.)
With multivariate statistics, you simultaneously analyze multiple dependent and multiple inde-
pendent variables. This capability is important in both nonexperimental (correlational or survey) and
experimental research.

1.1.2 Experimental and Nonexperimental Research


A critical distinction between experimental and nonexperimental research is whether the researcher
manipulates the levels of the IVs. In an experiment, the researcher has control over the levels (or con-
ditions) of at least one IV to which a subject is exposed by determining what the levels are, how they
are implemented, and how and when cases are assigned and exposed to them. Further, the experi-
menter randomly assigns subjects to levels of the IV and controls all other influential factors by hold-
ing them constant, counterbalancing, or randomizing their influence. Scores on the DV are expected to
be the same, within random variation, except for the influence of the IV (Campbell & Stanley. 1966).
If there are systematic differences in the DV associated with levels of the IV, these differences are
attributed to the IV.
For example, if groups of undergraduates are randomly assigned to the same material but dif-
ferent types of teaching techniques, and afterward some groups of undergraduates perform better
than others, the difference in performance is said, with some degree of confidence. to be caused by
the difference in teaching technique. In this type of research, the terms independent and dependent
have obvious meaning: the value of the DV depends on the manipulated level of the IV. The IV is
manipulated by the experimenter and the score on the DV depends on the level of the IV.
In nonexperimental (correlational or survey) research, the levels of the IV(s) are not manipu-
lated by the researcher. The researcher can define the IV, but has no control over the assignment of
subjects to levels of it. For example, groups of people may be categorized into geographic area of res-
idence (Northeast, Midwest, etc.), but only the definition of the variable is under researcher control.
Except for the military or prison, place of residence is rarely subject to manipulation by a researcher.
Nevertheless, a naturally occurring difference like this is often considered an IV and is used to pre-
dict some other nonexperimental (dependent) variable such as income. In this type of research, the
distinction between IVs and DVs is usually arbitrary and many researchers prefer to call IVs predic-
tors and DVs criterion variables.
In nonexperimental research, it is very difficult to attribute causality to an IV. If there is a sys-
tematic difference in a DV associated with levels of an IV, the two variables are said (with some
degree of confidence) to be related, but the cause of the relationship is unclear. For example, income
as a DV might be related to geographic area, but no causal association is implied.
Nonexperimental research takes many forms, but a common example is the survey. Typically,
many people are surveyed, and each respondent provides answers to many questions, producing a
large number of variables. These variables are usually interrelated in highly complex ways, but uni-
variate and bivariate statistics are not sensitive to this complexity. Bivariate correlations between all
pairs of variables, for example, could not reveal that the 20 to 25 variables measured really represent
only two or three "supervariables."
Or, if a research goal is to distinguish among subgroups in a sample (e.g., between Catholics
and Protestants) on the basis of a variety of attitudinal variables, we could use several univariate t
tests (or analyses of variance) to examine group differences on each variable separately. But if the
variables are related, which is highly likely, the results of many t tests are misleading and statistically
suspect.
With the use of multivariate statistical techniques, complex interrelationships among variables
are revealed and assessed in statistical inference. Further, it is possible to keep the overall Type I
error rate at, say, 5%, no matter how many variables are tested.
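To make the inflation concrete, here is a rough worked illustration of our own (not drawn from a particular study), assuming k separate tests are each run at α = .05 and are independent of one another:

$$\alpha_{\text{familywise}} = 1 - (1 - \alpha)^{k}, \qquad \text{e.g., } 1 - (.95)^{20} \approx .64 \text{ for } k = 20.$$

That is, with 20 separate tests the chance of at least one spurious rejection climbs from the nominal 5% toward roughly 64% (the exact figure depends on how the tests are correlated); multivariate tests, and the corrections noted in Chapter 3, are designed to keep that familywise rate near 5%.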
Although most multivariate techniques were developed for use in nonexperimental research,
they are also useful in experimental research in which there may be multiple IVs and multiple DVs.
With multiple IVs, the research is usually designed so that the IVs are independent of each other and
a straightforward correction for numerous statistical tests is available (see Chapter 3). With multiple
DVs, a problem of inflated error rate arises if each DV is tested separately. Further, at least some of
the DVs are likely to be correlated with each other, so separate tests of each DV reanalyze some of
the same variance. Therefore, multivariate tests are used.
Experimental research designs with multiple DVs were unusual at one time. Now, however,
with attempts to make experimental designs more realistic, and with the availability of computer pro-
grams, experiments often have several DVs. It is dangerous to run an experiment with only one DV
and risk missing the impact of the IV because the most sensitive DV is not measured. Multivariate
statistics help the experimenter design more efficient and more realistic experiments by allowing
measurement of multiple DVs without violation of acceptable levels of Type I error.
One of the few considerations not relevant to choice of statistical technique is whether the data
are experimental or correlational. The statistical methods "work" whether the researcher manipulated
the levels of the IV. But attribution of causality to results is crucially affected by the experimental-
nonexperimental distinction.

1.1.3 Computers and Multivariate Statistics


One answer to the question "Why multivariate statistics?" is that the techniques are now accessible
by computer. Only the most dedicated number cruncher would consider doing real-life-sized prob-
lems in multivariate statistics without a computer. Fortunately, excellent multivariate programs are
available in a number of computer packages.
Two packages are demonstrated in this book. Examples are based on programs in SPSS (Sta-
tistical Package for the Social Sciences) and SAS.
If you have access to both packages, you are indeed fortunate. Programs within the packages
do not completely overlap, and some problems are better handled through one package than the
other. For example, doing several versions of the same basic analysis on the same set of data is par-
ticularly easy with SPSS whereas SAS has the most extensive capabilities for saving derived scores
from data screening or from intermediate analyses.
Chapters 5 through 16 (the chapters that cover the specialized multivariate techniques) and Chap-
ter 18 (available at www.ablongman.com/tabachnick5e) offer explanations and illustrations of a variety
of programs² within each package and a comparison of the features of the programs. We hope that once
you understand the techniques, you will be able to generalize to virtually any multivariate program.
Recent versions of the programs are implemented in Windows, with menus that implement
most of the techniques illustrated in this book. All of the techniques may be implemented through
syntax, and syntax itself is generated through menus. Then you may add or change syntax as desired
for your analysis. For example, you may "paste" menu choices into a syntax window in SPSS, edit
the resulting text, and then run the program. Also, syntax generated by SPSS menus is saved in the
"journal" file (spss.jn1) which also may be accessed and copied into a syntax window. Syntax gener-
ated by SAS menus is recorded in a "log" file. The contents may then be copied to an interactive win-
dow, edited, and run. Do not overlook the help files in these programs. Indeed, SAS and SPSS now
provide the entire set of user manuals on CD, often with more current information than is available
in printed manuals.
Our demonstrations in this book are based on syntax generated through menus whenever fea-
sible. We would love to show you the sequence of menu choices, but space does not permit. And, for
the sake of parsimony, we have edited program output to illustrate the material that we feel is the
most important for interpretation. We have also edited out some of the unnecessary (because it is
default) syntax that is generated through menu choices.
With commercial computer packages, you need to know which version of the package you are
using. Programs are continually being changed, and not all changes are immediately implemented at
each facility. Therefore, many versions of the various programs are simultaneously in use at differ-
ent institutions; even at one institution, more than one version of a package is sometimes available.
Program updates are often corrections of errors discovered in earlier versions. Occasionally,
though, there are major revisions in one or more programs or a new program is added to the package.
Sometimes defaults change with updates, so that output looks different although syntax is the same.
Check to find out which version of each package you are using. Then be sure that the manual you are
using is consistent with the version in use at your facility. Also check updates for error correction in
previous releases that may be relevant to some of your previous runs.
Except where noted, this book reviews Windows versions of SPSS Version 13 and SAS Ver-
sion 9.1. Information on availability and versions of software, macros, books, and the like changes
almost daily. We recommend the Internet as a source of "keeping up."
2We have retained descriptions of features of SYSTAT in these sections despite the removal of detailed demonstrations of that program in this edition.
1.1.4 Garbage In, Roses Out?
The trick in multivariate statistics is not in computation; that is easily done as discussed above. The
trick is to select reliable and valid measurements, choose the appropriate program, use it correctly,
and know how to interpret the output. Output from commercial computer programs, with their beau-
tifully formatted tables, graphs, and matrices, can make garbage look like roses. Throughout this
book, we try to suggest clues that reveal when the true message in the output more closely resembles
the fertilizer than the flowers.
Second, when you use multivariate statistics, you rarely get as close to the raw data as you do
when you apply univariate statistics to a relatively few cases. Errors and anomalies in the data that
would be obvious if the data were processed by hand are less easy to spot when processing is entirely
by computer. But the computer packages have programs to graph and describe your data in the sim-
plest univariate terms and to display bivariate relationships among your variables. As discussed in
Chapter 4, these programs provide preliminary analyses that are absolutely necessary if the results of
multivariate programs are to be believed.
There are also certain costs associated with the benefits of using multivariate procedures. Ben-
efits of increased flexibility in research design, for instance, are sometimes paralleled by increased
ambiguity in interpretation of results. In addition, multivariate results can be quite sensitive to which
analytic strategy is chosen (cf. Section 1.2.4) and do not always provide better protection against sta-
tistical errors than their univariate counterparts. Add to this the fact that occasionally you still cannot
get a firm statistical answer to your research questions, and you may wonder if the increase in com-
plexity and difficulty is warranted.
Frankly, we think it is. Slippery as some of the concepts and procedures are, these statistics
provide insights into relationships among variables that may more closely resemble the complexity
of the "real" world. And sometimes you get at least partial answers to questions that could not be
asked at all in the univariate framework. For a complete analysis, making sense of your data usually
requires a judicious mix of multivariate and univariate statistics.
And the addition of multivariate statistical methods to your repertoire makes data analysis a lot
more fun. If you liked univariate statistics, you will love multivariate statistics!3

1.2 Some Useful Definitions


In order to describe multivariate statistics easily, it is useful to review some common terms in research
design and basic statistics. Distinctions were made in preceding sections between IVs and DVs and between experimental and nonexperimental research. Additional terms that are encountered repeatedly in the book but not necessarily related to each other are described in this section.

1.2.1 Continuous, Discrete, and Dichotomous Data


In applying statistical techniques of any sort, it is important to consider the type of measurement and
the nature of the correspondence between numbers and the events that they represent. The distinction
made here is among continuous, discrete, and dichotomous variables; you may prefer to substitute the terms interval or quantitative for continuous and nominal, categorical, or qualitative for dichotomous and discrete.

3Don't even think about it.



Continuous variables are measured on a scale that changes values smoothly rather than in steps. Continuous variables take on any value within the range of the scale, and the size of the number reflects the amount of the variable. Precision is limited by the measuring instrument, not by the nature of the scale itself. Some examples of continuous variables are time as measured on an old-fashioned analog clock face, annual income, age, temperature, distance, and grade point average (GPA).

Discrete variables take on a finite and usually small number of values, and there is no smooth
transition from one value or category to the next. Examples include time as displayed by a digital
clock, continents, categories of religious affiliation, and type of community (rural or urban).
Sometimes discrete variables are used in multivariate analyses as if continuous if there are
numerous categories and the categories represent a quantitative attribute. For instance, a variable that
represents age categories (where, say, 1 stands for 0 to 4 years, 2 stands for 5 to 9 years, 3 stands for
10 to 14 years, and so on up through the normal age span) can be used because there are a lot of cat-
egories and the numbers designate a quantitative attribute (increasing age). But the same numbers
used to designate categories of religious affiliation are not in appropriate form for analysis with
many of the techniques4 because religions do not fall along a quantitative continuum.
Discrete variables composed of qualitatively different categories are sometimes analyzed after
being changed into a number of dichotomous or two-level variables (e.g., Catholic vs. non-Catholic,
Protestant vs. non-Protestant, Jewish vs. non-Jewish, and so on until the degrees of freedom are
used). Recategorization of a discrete variable into a series of dichotomous ones is called dummy vari-
able coding. The conversion of a discrete variable into a series of dichotomous ones is done to limit
the relationship between the dichotomous variables and others to linear relationships. A discrete
variable with more than two categories can have a relationship of any shape with another variable,
and the relationship is changed arbitrarily if assignment of numbers to categories is changed.
Dichotomous variables, however, with only two points, can have only linear relationships with other
variables; they are, therefore, appropriately analyzed by methods using correlation in which only lin-
ear relationships are analyzed.
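To make dummy variable coding concrete, here is a minimal sketch in Python (our illustration; the book itself demonstrates SPSS and SAS, and the category labels and cases below are hypothetical). A four-category religious-affiliation variable is recoded into three dichotomous variables, one fewer than the number of categories, with the last category serving as the reference group:

```python
# Minimal sketch of dummy variable coding (hypothetical data, not from the text).
# A discrete variable with k qualitative categories is recoded into k - 1
# dichotomous (0/1) variables, so that only linear relationships with other
# variables are possible.

categories = ["Catholic", "Protestant", "Jewish"]   # a fourth group, "other/none", is the reference

# Hypothetical affiliation codes for six cases:
# 1 = Catholic, 2 = Protestant, 3 = Jewish, 4 = other/none
affiliation = [1, 3, 2, 1, 4, 2]

dummies = {
    f"{name} vs. not": [1 if code == k + 1 else 0 for code in affiliation]
    for k, name in enumerate(categories)
}

for label, values in dummies.items():
    print(label, values)
```

Each of the resulting dichotomous variables can then enter a correlational analysis, because a two-level variable can have only a linear relationship with other variables.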
The distinction between continuous and discrete variables is not always clear. If you add
enough digits to the digital clock, for instance, it becomes for all practical purposes a continuous
measuring device, whereas time as measured by the analog device can also be read in discrete cate-
gories such as hours or half hours. In fact, any continuous measurement may be rendered discrete (or
dichotomous) with some loss of information, by specifying cutoffs on the continuous scale.
The property of variables that is crucial to application of multivariate procedures is not the type
of measurement so much as the shape of distribution, as discussed in Chapter 4 and elsewhere. Non-
normally distributed continuous variables and dichotomous variables with very uneven splits
between the categories present problems to several of the multivariate analyses. This issue and its
resolution are discussed at some length in Chapter 4.
Another type of measurement that is used sometimes produces a rank order (ordinal) scale.
This scale assigns a number to each subject to indicate the subject's position vis-à-vis other subjects along some dimension. For instance, ranks are assigned to contestants (first place, second place, third place, etc.) to provide an indication of who was best, but not by how much. A problem with ordinal measures is that their distributions are rectangular (one frequency per number) instead of normal, unless tied ranks are permitted and they pile up in the middle of the distribution.

4Some multivariate techniques (e.g., logistic regression, SEM) are appropriate for all types of variables.

In practice, we often treat variables as if they are continuous when the underlying scale is thought to be continuous but the measured scale actually is ordinal, the number of categories is large (say, seven or more), and the data meet other assumptions of the analysis. For instance, the
number of correct items on an objective test is technically not continuous because fractional values
are not possible, but it is thought to measure some underlying continuous variable such as course
mastery. Another example of a variable with ambiguous measurement is one measured on a Likert-
type scale in which consumers rate their attitudes toward a product as "strongly like," "moderately
like," "mildly like,'' "neither like nor dislike," "mildly dislike," "moderately dislike," or "strongly
dislike." As mentioned previously, even dichotomous variables may be treated as if continuous under
some conditions. Thus, we often use the term "continuous" throughout the remainder of this book
whether the measured scale itself is continuous or the variable is to be treated as if continuous. We
use the term "discrete" for variables with a few categories, whether the categories differ in type or
quantity.

1.2.2 Samples and Populations


Samples are measured in order to make generalizations about populations. Ideally, samples are
selected, usually by some random process, so that they represent the population of interest. In real
life, however, populations are frequently best defined in terms of samples, rather than vice versa; the
population is the group from which you were able to randomly sample.
Sampling has somewhat different connotations in nonexperimental and experimental research.
In nonexperimental research, you investigate relationships among variables in some predefined pop-
ulation. Typically, you take elaborate precautions to ensure that you have achieved a representa-
tive sample of that population; you define your population, then do your best to randomly sample
from it.5
In experimental research, you attempt to create different populations by treating subgroups
from an originally homogeneous group differently. The sampling objective here is to ensure that all
subjects come from the same population before you treat them differently. Random sampling con-
sists of randomly assigning subjects to treatment groups (levels of the IV) to ensure that, before dif-
ferential treatment, all subsamples come from the same population. Statistical tests provide evidence
as to whether, after treatment, all samples still come from the same population. Generalizations
about treatment effectiveness are made to the type of subjects who participated in the experiment.

1.2.3 Descriptive and Inferential Statistics


Descriptive statistics describe samples of subjects in terms of variables or combinations of variables.
Inferential statistical techniques test hypotheses about differences in populations on the basis of
measurements made on samples of subjects. If reliable differences are found, descriptive statistics
are then used to provide estimations of central tendency, and the like, in the population. Descriptive
statistics used in this way are called parameter estimates.
Use of inferential and descriptive statistics is rarely an either-or proposition. We are usually
interested in both describing and making inferences about a data set. We describe the data, find reliable differences or relationships, and estimate population values for the reliable findings. However, there are more restrictions on inference than there are on description. Many assumptions of multivariate statistical methods are necessary only for inference. If simple description of the sample is the major goal, many assumptions are relaxed, as discussed in Chapters 5 through 16 and 18 (online).

5Strategies for random sampling are discussed in many sources, including Levy and Lemeshow (1990), Rea and Parker (1997), and de Vaus (2002).

1.2.4 Orthogonality: Standard and Sequential Analyses


Orthogonality is a perfect nonassociation between variables. If two variables are orthogonal, know-
ing the value of one variable gives no clue as to the value of the other; the correlation between them
is zero.
Orthogonality is often desirable in statistical applications. For instance, factorial designs for
experiments are orthogonal when two or more IVs are completely crossed with equal sample sizes
in each combination of levels. Except for use of a common error term, tests of hypotheses about main
effects and interactions are independent of each other; the outcome of each test gives no hint as to the
outcome of the others. In orthogonal experimental designs with random assignment of subjects,
manipulation of the levels of the IV, and good controls, changes in value of the DV can be unambiguously attributed to various main effects and interactions.
Similarly, in multivariate analyses, there are advantages if sets of IVs or DVs are orthogonal.
If all pairs of IVs in a set are orthogonal, each IV adds, in a simple fashion, to prediction of the DV.
Consider income as a DV with education and occupational prestige as IVs. If education and occupa-
tional prestige are orthogonal, and if 35% of the variability in income may be predicted from educa-
tion and a different 45% is predicted from occupational prestige, then 80% of the variance in income
is predicted from education and occupational prestige together.
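The additive property of orthogonal IVs is easy to verify numerically. The following sketch (an illustration we have added, using simulated data rather than the education and income figures above) constructs two exactly uncorrelated predictors and confirms that the squared correlation of the DV with each predictor separately sums, to within rounding error, to the squared multiple correlation from both together:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Build two exactly orthogonal (uncorrelated) IVs.
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
x1 -= x1.mean()
x2 -= x2.mean()
x2 -= x1 * (x1 @ x2) / (x1 @ x1)      # remove any overlap of x2 with x1

# A DV to which each IV contributes its own share of variance.
y = 0.6 * x1 + 0.7 * x2 + rng.standard_normal(n)

def r_squared(y, predictors):
    """Squared multiple correlation of y with the given predictor columns."""
    X = np.column_stack([np.ones(len(y))] + predictors)
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - ((y - X @ b).var() / y.var())

r2_1 = r_squared(y, [x1])
r2_2 = r_squared(y, [x2])
r2_both = r_squared(y, [x1, x2])
print(r2_1, r2_2, r2_both)            # r2_1 + r2_2 equals r2_both (within rounding)
```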
Orthogonality can easily be illustrated in Venn diagrams, as shown in Figure 1.1. Venn diagrams represent shared variance (or correlation) as overlapping areas between two (or more) circles. The total variance for income is one circle. The section with horizontal stripes represents the part of income predictable from education, and the section with vertical stripes represents the part predictable from occupational prestige; the circle for education overlaps the circle for income 35% and the circle for occupational prestige overlaps 45%. Together, they account for 80% of the variability in income because education and occupational prestige are orthogonal and do not themselves overlap. There are similar advantages if a set of DVs is orthogonal. The overall effect of an IV can be partitioned into effects on each DV in an additive fashion.

FIGURE 1.1 Venn diagram for Y (income), X1 (education), and X2 (occupational prestige).

Usually, however, the variables are correlated with each other (nonorthogonal). IVs in nonexperimental designs are often correlated naturally; in experimental designs, IVs become correlated when unequal numbers of subjects are measured in different cells of the design. DVs are usually correlated because individual differences among subjects tend to be consistent over many attributes.
When variables are correlated, they have shared or overlapping variance. In the example of Figure 1.2, education and occupational prestige correlate with each other. Although the independent contribution made by education is still 35% and that by occupational prestige is 45%, their joint contribution to prediction of income is not 80% but rather something smaller due to the overlapping area shown by the arrow in Figure 1.2(a). A major decision for the multivariate analyst is how to handle the variance that is predictable from more than one variable. Many multivariate techniques have at least two strategies for handling it; some have more.
In standard analysis, the overlapping variance contributes to the size of summary statistics of
the overall relationship but is not assigned to either variable. Overlapping variance is disregarded in
assessing the contribution of each variable to the solution. Figure 1.2(a) is a Venn diagram of a stan-
dard analysis in which overlapping variance is shown as overlapping areas in circles; the unique con-
tributions of X1 and X2 to prediction of Y are shown as horizontal and vertical areas, respectively, and the total relationship between Y and the combination of X1 and X2 is those two areas plus the area with the arrow. If X1 is education and X2 is occupational prestige, then in standard analysis, X1 is "credited with" the area marked by the horizontal lines and X2 by the area marked by vertical lines. Neither of the IVs is assigned the area designated with the arrow. When X1 and X2 substantially overlap each other, very little horizontal or vertical area may be left for either of them despite the fact that they are both related to Y. They have essentially knocked each other out of the solution.
Sequential analyses differ in that the researcher assigns priority for entry of variables into
equations, and the first one to enter is assigned both unique variance and any overlapping variance it
has with other variables. Lower-priority variables then are assigned on entry their unique and any
remaining overlapping variance. Figure 1.2(b) shows a sequential analysis for the same case as Fig-
ure 1.2(a), where X1 (education) is given priority over X2 (occupational prestige). The total variance explained is the same as in Figure 1.2(a), but the relative contributions of X1 and X2 have changed; education now shows a stronger relationship with income than in the standard analysis, whereas the relation between occupational prestige and income remains the same.

FIGURE 1.2 Standard (a) and sequential (b) analyses of the relationship between Y, X1, and X2. Horizontal shading depicts variance assigned to X1; vertical shading depicts variance assigned to X2. In (a), the area marked by the arrow represents variance in the relationship that contributes to the solution but is assigned to neither X1 nor X2; in (b), X1 is given priority over X2.
The choice of strategy for dealing with overlapping variance is not trivial. If variables are cor-
related, the overall relationship remains the same but the apparent importance of variables to the
solution changes depending on whether a standard or a sequential strategy is used. If the multivari-
ate procedures have a reputation for unreliability, it is because solutions change, sometimes dramat-
ically, when different strategies for entry of variables are chosen. However, the strategies also ask
different questions of the data, and it is incumbent on the researcher to determine exactly which
question to ask. We try to make the choices clear in the chapters that follow.
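The following sketch illustrates the difference numerically. It uses simulated data (none of the numbers come from the text; the variable names merely echo the education and prestige example) and computes a standard partition, in which each IV is credited only with its unique contribution, and a sequential partition, in which the first-entered IV also receives the overlapping variance:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5_000

# Simulated, correlated IVs standing in for "education" and "occupational prestige".
educ = rng.standard_normal(n)
prestige = 0.6 * educ + 0.8 * rng.standard_normal(n)
income = 0.5 * educ + 0.5 * prestige + rng.standard_normal(n)

def r2(y, *cols):
    """Squared multiple correlation of y with the given predictors."""
    X = np.column_stack([np.ones(len(y)), *cols])
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return 1 - ((y - X @ b).var() / y.var())

total = r2(income, educ, prestige)

# Standard analysis: each IV is credited only with its unique contribution;
# the overlap counts toward the total but is assigned to neither IV.
unique_educ = total - r2(income, prestige)
unique_prestige = total - r2(income, educ)
overlap = total - unique_educ - unique_prestige

# Sequential analysis with education entered first: education keeps the overlap,
# and prestige is credited only with the increment it adds.
seq_educ = r2(income, educ)
seq_prestige = total - seq_educ

print("total R2:  ", total)
print("standard:  ", unique_educ, unique_prestige, "overlap:", overlap)
print("sequential:", seq_educ, seq_prestige)
```

The total squared multiple correlation is identical under both strategies; only the apportionment of the overlapping variance changes.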

1.3 Linear Combinations of Variables


Multivariate analyses combine variables to do useful work such as predict scores or predict group
membership. The combination that is formed depends on the relationships among the variables and
the goals of analysis, but in most cases, the combination that is formed is a linear combination. A linear combination is one in which each variable is assigned a weight (e.g., W1), and then products of weights and variable scores are summed to predict a score on a combined variable. In Equation 1.1, Y' (the predicted DV) is predicted by a linear combination of X1 and X2 (the IVs).

Y' = W1X1 + W2X2                                                    (1.1)

If, for example, Y' is predicted income, X1 is education, and X2 is occupational prestige, the best prediction of income is obtained by weighting education (X1) by W1 and occupational prestige (X2) by W2 before summing. No other values of W1 and W2 produce as good a prediction of income.
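A small worked sketch may make this concrete. The code below uses hypothetical scores (not data from the text) and ordinary least squares to find the weights of Equation 1.1; an intercept is included here only for realism and is not part of Equation 1.1:

```python
import numpy as np

# Hypothetical scores for five cases: years of education, a prestige rating, and income.
education = np.array([12.0, 16.0, 10.0, 18.0, 14.0])
prestige = np.array([40.0, 65.0, 30.0, 70.0, 55.0])
income = np.array([35.0, 72.0, 28.0, 80.0, 58.0])

# Least squares finds W1 and W2 (plus an intercept) so that the linear
# combination Y' = W1*X1 + W2*X2 predicts income as well as possible.
design = np.column_stack([np.ones(len(income)), education, prestige])
weights, *_ = np.linalg.lstsq(design, income, rcond=None)

predicted_income = design @ weights   # the linear combination Y'
print("intercept, W1, W2:", weights)
print("predicted income Y':", predicted_income)
```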
Notice that Equation 1.1 includes neither X1 nor X2 raised to powers (exponents) nor a product of X1 and X2. This seems to severely restrict multivariate solutions until one realizes that X1 could itself be a product of two different variables or a single variable raised to a power. For example, X1
might be education squared. A multivariate solution does not produce exponents or cross-products of
IVs to improve a solution, but the researcher can include Xs that are cross-products of IVs or are IVs
raised to powers. Inclusion of variables raised to powers or cross-products of variables has both the-
oretical and practical implications for the solution. Berry (1993) provides a useful discussion of
many of the issues.
The size of the W values (or some function of them) often reveals a great deal about the rela-
tionship between DV and IVs. If, for instance, the W value for some IV is zero, the IV is not needed in the best DV-IV relationship. Or, if some IV has a large W value, then the IV tends to be important
to the relationship. Although complications (to be explained later) prevent interpretation of the mul-
tivariate solution from the sizes of the W values alone, they are nonetheless important in most multi-
variate procedures.
The combination of variables can be considered a supervariable, not directly measured but
worthy of interpretation. The supervariable may represent an underlying dimension that predicts
something or optimizes some relationship. Therefore, the attempt to understand the meaning of the
combination of IVs is worthwhile in many multivariate analyses.
In the search for the best weights to apply in combining variables, computers do not try out all
possible sets of weights. Various algorithms have been developed to compute the weights. Most
algorithms involve manipulation of a correlation matrix, a variance-covariance matrix, or a sum-of-
squares and cross-products matrix. Section 1.6 describes these matrices in very simple terms and shows their development from a very small data set. Appendix A describes some terms and manipulations appropriate to matrices. In the fourth sections of Chapters 5 through 16 and 18 (online), a small hypothetical sample of data is analyzed by hand to show how the weights are derived for each analysis. Though this information is useful for a basic understanding of multivariate statistics, it is not necessary for applying multivariate techniques fruitfully to your research questions and may, sadly, be skipped by those who are math aversive.
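For readers who are not math aversive, the three matrices just mentioned are easy to compute from a tiny data set. The sketch below (hypothetical numbers, and our own helper code rather than anything from the programs discussed earlier) builds the sum-of-squares and cross-products matrix from deviation scores and then derives the variance-covariance and correlation matrices from it:

```python
import numpy as np

# A very small hypothetical data matrix: rows are cases, columns are variables.
data = np.array([[500.0, 3.2, 1.0],
                 [650.0, 3.8, 2.0],
                 [420.0, 2.9, 1.0],
                 [580.0, 3.5, 2.0]])

deviations = data - data.mean(axis=0)

# Sum-of-squares and cross-products (SSCP) matrix of deviation scores.
sscp = deviations.T @ deviations

# Variance-covariance matrix: SSCP divided by degrees of freedom (n - 1).
cov = sscp / (len(data) - 1)          # matches np.cov(data, rowvar=False)

# Correlation matrix: covariances divided by products of standard deviations.
sd = np.sqrt(np.diag(cov))
corr = cov / np.outer(sd, sd)         # matches np.corrcoef(data, rowvar=False)

print(sscp, cov, corr, sep="\n\n")
```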

1.4 Number and Nature of Variables to Include


Attention to the number of variables included in analysis is important. A general rule is to get the best
solution with the fewest variables. As more and more variables are included, the solution usually
improves, but only slightly. Sometimes the improvement does not compensate for the cost in degrees of freedom of including more variables, so the power of the analyses diminishes.
A second problem is overfitting. With overfitting, the solution is very good, so good in fact, that it is unlikely to generalize to a population. Overfitting occurs when too many variables are
included in an analysis relative to the sample size. With smaller samples, very few variables can be
analyzed. Generally, a researcher should include only a limited number of uncorrelated variables in
each analysis,6 fewer with smaller samples. We give guidelines for the number of variables that can
be included relative to sample size in the third section of Chapters 5-16 and 18.
Additional considerations for inclusion of variables in a multivariate analysis include cost,
availability, meaning, and theoretical relationships among the variables. Except in analysis of struc-
ture, one usually wants a small number of valid, cheaply obtained, easily available, uncorrelated
variables that assess all the theoretically important dimensions of a research area. Another important
consideration is reliability. How stable is the position of a given score in a distribution of scores when
measured at different times or in different ways? Unreliable variables degrade an analysis whereas
reliable ones enhance it. A few reliable variables give a more meaningful solution than a large num-
ber of less reliable variables. Indeed, if variables are sufficiently unreliable, the entire solution may
reflect only measurement error. Further considerations for variable selection are mentioned as they
apply to each analysis.

1.5 Statistical Power


A critical issue in designing any study is whether there is adequate power. Power, as you may recall,
represents the probability that effects that actually exist have a chance of producing statistical sig-
nificance in your eventual data analysis. For example, do you have a large enough sample size to
show a significant relationship between GRE and GPA if the actual relationship is fairly large? What
if the relationship is fairly small? Is your sample large enough to reveal significant effects of treat-
ment on your DV(s)? Relationships among power and errors of inference are discussed in Chapter 3.
Issues of power are best considered in the planning stage of a study, when the researcher determines the required sample size. The researcher estimates the size of the anticipated effect (e.g., an expected mean difference), the variability expected in assessment of the effect, the desired alpha level (ordinarily 0.05), and the desired power (often .80). These four estimates are required to determine necessary sample size. Failure to consider power in the planning stage often results in failure to find a significant effect (and an unpublishable study). The interested reader may wish to consult Cohen (1965, 1988), Rossi (1990), or Sedlmeier and Gigerenzer (1989) for more detail.

6The exceptions are analyses of structure, such as factor analysis, in which numerous correlated variables are measured.
Cohen (1965, 1988). Rossi ( 1990), or Sedlnleier and Giperenzer ( 1989) for more detail.
There is a great deal of software available to help you estimate the power available with various
sample sizes for various statistical techniques, and to help you determine necessary sample size given
a desired level of power (e.g., an 80% probability of achieving a significant result if an effect exists)
and expected sizes of relationships. One of these programs that estimates power for several techniques
is PASS (NCSS, 2002). Many other programs are reviewed (and sometimes available as shareware) on
the Internet. Issues of power relevant to each of the statistical techniques are discussed in chapters cov-
ering those techniques.
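As a rough illustration of what such programs do, power for a simple two-group mean comparison can also be approximated by Monte Carlo simulation. In the sketch below, the effect size, alpha, and target power are illustrative assumptions, not recommendations from the text:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

effect = 0.5          # anticipated mean difference, in standard deviation units
alpha = 0.05          # Type I error rate
target_power = 0.80   # desired power

def estimated_power(n_per_group, reps=2_000):
    """Monte Carlo estimate of power for an independent-groups t test."""
    hits = 0
    for _ in range(reps):
        a = rng.standard_normal(n_per_group)
        b = rng.standard_normal(n_per_group) + effect
        if stats.ttest_ind(a, b).pvalue < alpha:
            hits += 1
    return hits / reps

# Increase the group size until the estimated power reaches the target.
n = 10
while estimated_power(n) < target_power:
    n += 5
print("approximate sample size per group:", n)
```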

1.6 Data Appropriate for Multivariate Statistics


An appropriate data set for multivariate statistical methods consists of values on a number of vari-
ables for each of several subjects or cases. For continuous variables, the values are scores on vari-
ables. For example, if the continuous variable is the GRE (Graduate Record Examination), the values
for the various subjects are scores such as 500, 650, 420, and so on. For discrete variables, values are
number codes for group membership or treatment. For example, if there are three teaching tech-
niques, students who receive one technique are arbitrarily assigned a "1," those receiving another
technique are assigned a "2," and so on.

1.6.1 The Data Matrix


The data matrix is an organization of scores in which rows (lines) represent subjects and columns
represent variables. An example of a data matrix with six subjects7 and four variables is in Table 1.1.
For example, X1 might be type of teaching technique, X2 score on the GRE, X3 GPA, and X4 gender,
with women coded 1 and men coded 2.
Data are entered into a data file with long-term storage accessible by computer in order to
apply computer techniques to them. Each subject starts with a new row (line). Information identify-
ing the subject is typically entered first, followed by the value of each variable for that subject.

TABLE 1.1 A Data Matrix of Hypothetical Scores

Student    X1    X2    X3    X4

7Normally, of course, there are many more than six subjects.
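To make the layout concrete, the sketch below builds a small data matrix of the kind just described, with one row per subject and one column per variable; the values are hypothetical and are not those of Table 1.1:

```python
# A small data matrix (hypothetical values). Each row is one subject; the columns
# are X1 = teaching technique (1-3), X2 = GRE score, X3 = GPA, and
# X4 = gender (1 = women, 2 = men).
data_matrix = [
    [1, 500, 3.2, 1],
    [2, 650, 3.9, 2],
    [3, 420, 2.8, 1],
    [1, 580, 3.5, 2],
    [2, 610, 3.6, 1],
    [3, 540, 3.1, 2],
]

for subject, row in enumerate(data_matrix, start=1):
    print(f"Student {subject}:", row)
```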
on me now? I didn’t bring this situation on myself—you’re my witness
that I didn’t. But here it is; it’s a fact; we’ve got to face it.”
She lowered her eyes and voice to whisper painfully: “To face Anne’s
love for you?”
“Yes.”
“Her determination—?”
“Her absolute determination.”
His words made her tremble again; there had always been moments
when his reasonableness alarmed her more than his anger, because
she knew that, to be so gentle, he must be certain of eventually
gaining his point. But she gathered resolution to say: “And if I take
back my threats, as you call them? If I take back all I’ve said—‘clear’
you entirely? That’s what you want, I understand? If I promise that,”
she panted, “will you promise too—promise me to find a way out?”
His hand fell from her shoulder, and he drew back a step. “A way out
—now? But there isn’t any.”
Mrs. Clephane stood up. She remembered wondering long ago—
one day when he had been very tender—how cruel Chris could be.
The conjecture, then, had seemed whimsical, almost morbid; now
she understood that she had guessed in him from the outset this
genius for reaching, at the first thrust, to the central point of his
antagonist’s misery.
“You’ve seen my daughter, then?”
“I’ve seen her—yes. This morning. It was she who sent me here.”
“If she’s made up her mind, why did she send you?”
“To tell you how she’s suffering. She thinks, you know—” He
wavered again for a second or two, and then brought out: “She’s
very unhappy about the stand you take. She thinks you ought to say
something to ... to clear up....”