
Advanced Information and Knowledge Processing

Zhengming Ding
Handong Zhao
Yun Fu

Learning
Representation for
Multi-View Data
Analysis
Models and Applications
Advanced Information and Knowledge
Processing

Series editors
Lakhmi C. Jain
Bournemouth University, Poole, UK, and
University of South Australia, Adelaide, Australia
Xindong Wu
University of Vermont
Information systems and intelligent knowledge processing are playing an increasing
role in business, science and technology. Recently, advanced information systems
have evolved to facilitate the co-evolution of human and information networks
within communities. These advanced information systems use various paradigms
including artificial intelligence, knowledge management, and neural science as well
as conventional information processing paradigms. The aim of this series is to
publish books on new designs and applications of advanced information and
knowledge processing paradigms in areas including but not limited to aviation,
business, security, education, engineering, health, management, and science. Books
in the series should have a strong focus on information processing—preferably
combined with, or extended by, new results from adjacent sciences. Proposals for
research monographs, reference books, coherently integrated multi-author edited
books, and handbooks will be considered for the series and each proposal will be
reviewed by the Series Editors, with additional reviews from the editorial board and
independent reviewers where appropriate. Titles published within the Advanced
Information and Knowledge Processing series are included in Thomson Reuters’
Book Citation Index.

More information about this series at http://www.springer.com/series/4738


Zhengming Ding
Handong Zhao
Yun Fu

Learning Representation
for Multi-View Data Analysis
Models and Applications

Zhengming Ding
Indiana University-Purdue University Indianapolis
Indianapolis, IN, USA

Handong Zhao
Adobe Research
San Jose, CA, USA

Yun Fu
Northeastern University
Boston, MA, USA

ISSN 1610-3947 ISSN 2197-8441 (electronic)


Advanced Information and Knowledge Processing
ISBN 978-3-030-00733-1 ISBN 978-3-030-00734-8 (eBook)
https://doi.org/10.1007/978-3-030-00734-8

Library of Congress Control Number: 2018961715

© Springer Nature Switzerland AG 2019


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part
of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations,
recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission
or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar
methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this
publication does not imply, even in the absence of a specific statement, that such names are exempt from
the relevant protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this
book are believed to be true and accurate at the date of publication. Neither the publisher nor the
authors or the editors give a warranty, express or implied, with respect to the material contained herein or
for any errors or omissions that may have been made. The publisher remains neutral with regard to
jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Preface

This book equips readers to handle complex multi-view data representation, centered around several major visual applications and sharing many tips and insights through a unified learning framework. This framework can model most existing multi-view learning and domain adaptation methods, enriching readers' understanding of their similarities and differences in terms of data organization, problem settings, and research goals.
A comprehensive review covers the key recent research on multi-view data analysis, i.e., multi-view clustering, multi-view classification, zero-shot learning, and domain adaptation. More practical challenges in multi-view data analysis are also discussed, including incomplete, unbalanced, and large-scale multi-view learning. Learning representation for multi-view data analysis covers a wide range of applications in the research fields of big data, human-centered computing, pattern recognition, digital marketing, Web mining, and computer vision.
This book consists of ten chapters. Chapter 1 introduces the background and unified model of multi-view data representations. Part I, which includes Chaps. 2–4, introduces unsupervised learning for multi-view data analysis. Chapter 2 presents unsupervised representation learning methods for two multi-view scenarios: one treats various data sources as multiple views; the other treats different splits of one data source as multiple views. Chapter 3 addresses the more challenging and practical incomplete multi-view clustering problem. Chapter 4 introduces a novel outlier detection problem in the multi-view setting and correspondingly proposes a multi-view outlier detection framework. Part II, which includes Chaps. 5 and 6, presents multi-view data analysis for supervised multi-view classification. Chapter 5 presents two multi-view classification models: one is a dual low-rank decomposition multi-view subspace model, and the other is a cross-view auto-encoder. Chapter 6 shows an adaptive latent semantic representation model in a sparse dictionary learning scheme for zero-shot learning (a special case of the multi-view classification problem). Part III, which includes Chaps.
7–10, presents multi-view data analysis for domain adaptation. Chapter 7 presents a missing-modality transfer learning model for the problem where the target modality is not available in the training stage. Chapter 8 discusses the multi-source transfer learning problem where all the sources are incomplete. Chapter 9 proposes
three deep domain adaptation models to address the challenge where the target data have limited or no labels. Following this, Chap. 10 provides a deep domain generalization model that handles a target domain unavailable in the training stage, with only multiple related sources at hand.
This book is intended for audiences with backgrounds in computer science, information systems, data science, statistics, and mathematics. Other potential audiences may come from broader fields of science and engineering, since this topic has potential applications in many disciplines.
We would like to thank our collaborators Ming Shao, Hongfu Liu, and Shuyang
Wang. We would also like to thank our editor Helen Desmond from Springer for her help and support.

Indianapolis, IN, USA Zhengming Ding


San Jose, CA, USA Handong Zhao
Boston, MA, USA Yun Fu
September 2018
Contents

1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 What Are Multi-view Data and Problem? . . . . . . . . . . . . . . . . . 1
1.2 A Unified Perspective . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.3 Organization of the Book . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4

Part I Unsupervised Multi-view Learning


2 Multi-view Clustering with Complete Information . . . . . . . . . . . . . 9
2.1 Deep Multi-view Clustering . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.1.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 9
2.1.2 Deep Semi-NMF Formulation . . . . . . . . . . . . . . . . . . . 11
2.1.3 Experiments on Face Benchmarks . . . . . . . . . . . . . . . . 16
2.1.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.2 Ensemble Subspace Clustering . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.2.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
2.2.2 Ensemble Formulation with Sparse and Block-Wise
Constraints . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
2.2.3 Experiments on Face, Object, Motion Benchmarks . . . . 34
2.2.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
3 Multi-view Clustering with Partial Information . . . . . . . . . . . . . . . 51
3.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
3.2 Incomplete Multi-view Clustering . . . . . . . . . . . . . . . . . . . . . . 53
3.2.1 Incomplete Case Formulation . . . . . . . . . . . . . . . . . . . 53
3.2.2 Complete Graph Laplacian . . . . . . . . . . . . . . . . . . . . . 54
3.2.3 Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
3.2.4 Complexity Analysis . . . . . . . . . . . . . . . . . . . . . . . . . 57


3.3 Experiment on Synthetic and Real-World Data . . . . . . . . . . . . . 58


3.3.1 Experimental Result . . . . . . . . . . . . . . . . . . . . . . . . . . 59
3.3.2 Convergence Study . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
3.3.3 Parameter Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63
3.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
4 Multi-view Outlier Detection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 67
4.2 Related Works . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
4.3 Multi-view Outlier Detection Method . . . . . . . . . . . . . . . . . . . . 70
4.3.1 The Proposed Consensus Based Algorithm . . . . . . . . . 70
4.3.2 Outlier Measurement Criterion . . . . . . . . . . . . . . . . . . 72
4.4 Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
4.4.1 Algorithm Derivation . . . . . . . . . . . . . . . . . . . . . . . . . 73
4.4.2 Complexity Analysis . . . . . . . . . . . . . . . . . . . . . . . . . 76
4.5 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 76
4.5.1 Synthetic Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 77
4.5.2 Real-World Data . . . . . . . . . . . . . . . . . . . . . . . . . . . . 78
4.5.3 Analytical Experiments . . . . . . . . . . . . . . . . . . . . . . . . 85
4.5.4 Application on Saliency Detection . . . . . . . . . . . . . . . . 90
4.5.5 Application on Face Reconstruction . . . . . . . . . . . . . . . 91
4.6 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 92
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93

Part II Supervised Multi-view Classification


5 Multi-view Transformation Learning . . . . . . . . . . . . . . . . . . . . . . . 99
5.1 Dual Low-Rank Decomposition for Multi-view Learning . . . . . 99
5.1.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
5.1.2 Robust Multi-view Subspace Learning . . . . . . . . . . . . . 101
5.1.3 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
5.2 Coupled Marginalized Auto-encoders for Cross-domain
Multi-view Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
5.2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 111
5.2.2 The Proposed Algorithm . . . . . . . . . . . . . . . . . . . . . . . 113
5.2.3 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 118
5.3 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 123
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 124
6 Zero-Shot Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
6.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
6.2 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 128
6.3 The Proposed Algorithm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 130

6.3.1 Learning Latent Semantic Dictionary . . . . . . . . . . . . . . 131


6.3.2 Adaptive Graph Guided Latent Semantics . . . . . . . . . . 132
6.3.3 Optimization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 133
6.3.4 ZSL with Fast Inference . . . . . . . . . . . . . . . . . . . . . . . 135
6.4 Experiment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 136
6.4.1 Dataset & Experimental Setting . . . . . . . . . . . . . . . . . . 136
6.4.2 Zero-Shot Classification . . . . . . . . . . . . . . . . . . . . . . . 137
6.4.3 Zero-Shot Retrieval . . . . . . . . . . . . . . . . . . . . . . . . . . 138
6.4.4 Empirical Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . 140
6.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 142

Part III Transfer Learning


7 Missing Modality Transfer Learning . . . . . . . . . . . . . . . . . . . . . . . 147
7.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 147
7.1.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
7.2 Transfer Learning via Latent Low-Rank Constraint . . . . . . . . . . 150
7.2.1 Conference Version Revisit . . . . . . . . . . . . . . . . . . . . . 150
7.2.2 Transfer Learning with Dictionary Constraint . . . . . . . . 151
7.2.3 Low-Rank Transfer with Latent Factor . . . . . . . . . . . . 152
7.3 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 162
7.3.1 Datasets and Experiments Setting . . . . . . . . . . . . . . . . 163
7.3.2 Convergence and Property in Two Directions . . . . . . . 164
7.3.3 Recognition Results . . . . . . . . . . . . . . . . . . . . . . . . . . 166
7.3.4 Parameter Property and Training Time . . . . . . . . . . . . 170
7.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 172
8 Multi-source Transfer Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . 175
8.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 175
8.2 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 177
8.3 Incomplete Multi-source Transfer Learning . . . . . . . . . . . . . . . 178
8.3.1 Effective Incomplete Multi-source Alignment . . . . . . . . 179
8.3.2 Cross-Domain Knowledge Transfer . . . . . . . . . . . . . . . 180
8.3.3 Cross-Source Knowledge Alignment . . . . . . . . . . . . . . 183
8.3.4 Solving Objective Function . . . . . . . . . . . . . . . . . . . . . 185
8.3.5 Complexity Analysis . . . . . . . . . . . . . . . . . . . . . . . . . 189
8.3.6 Generalization Bound Analysis . . . . . . . . . . . . . . . . . . 189
8.4 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 190
8.4.1 Synthetic Experiment . . . . . . . . . . . . . . . . . . . . . . . . . 190
8.4.2 Real-world Datasets . . . . . . . . . . . . . . . . . . . . . . . . . . 191
8.4.3 Discussion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 193

8.4.4 Property Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . 196


8.4.5 Incomplete Single Source Comparison . . . . . . . . . . . . . 199
8.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 200
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 201
9 Deep Domain Adaptation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
9.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 203
9.2 Stacked Low-Rank Coding . . . . . . . . . . . . . . . . . . . . . . . . . . . 204
9.2.1 Single-Layer Low-Rank Coding . . . . . . . . . . . . . . . . . 205
9.2.2 Optimization Solution . . . . . . . . . . . . . . . . . . . . . . . . . 208
9.2.3 Complexity Analysis . . . . . . . . . . . . . . . . . . . . . . . . . 210
9.2.4 Experimental Results . . . . . . . . . . . . . . . . . . . . . . . . . 211
9.3 Deep Low-Rank Coding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 215
9.3.1 Preliminaries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217
9.3.2 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 217
9.3.3 Deep Transfer Low-Rank Coding . . . . . . . . . . . . . . . . 218
9.3.4 Non-linear Representation . . . . . . . . . . . . . . . . . . . . . . 224
9.3.5 Experimental Results . . . . . . . . . . . . . . . . . . . . . . . . . 227
9.4 Spectral Bisection Tree Guided Deep Adaptive Exemplar
Autoencoder . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234
9.4.1 Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234
9.4.2 Data Composition via Spectral Bisection Tree . . . . . . . 236
9.4.3 Deep Adaptive Exemplar Autoencoder . . . . . . . . . . . . 237
9.4.4 Experimental Results . . . . . . . . . . . . . . . . . . . . . . . . . 243
9.4.5 Datasets and Experimental Setting . . . . . . . . . . . . . . . . 244
9.4.6 Results and Discussion . . . . . . . . . . . . . . . . . . . . . . . . 244
9.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 246
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 246
10 Deep Domain Generalization . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
10.1 Background . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 251
10.2 Related Work . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 253
10.3 Deep Generalized Transfer Learning . . . . . . . . . . . . . . . . . . . . 254
10.3.1 Motivation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 254
10.3.2 Deep Neural Networks Revisit . . . . . . . . . . . . . . . . . . 255
10.3.3 Deep Generalized Transfer Learning . . . . . . . . . . . . . . 255
10.3.4 Model Training . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 258
10.4 Experiments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 260
10.4.1 Datasets and Experimental Setting . . . . . . . . . . . . . . . . 260
10.4.2 Comparison Experiments . . . . . . . . . . . . . . . . . . . . . . 261
10.4.3 Self-evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 264
10.5 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 266
References . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 267
Chapter 1
Introduction

Multi-view data generated from various viewpoints or multiple sensors are commonly seen in real-world applications. For example, the popular commercial depth sensor Kinect uses both visible light and near-infrared sensors for depth estimation; autonomous driving systems use both visual and radar sensors to produce real-time 3D information on the road; and face analysis algorithms prefer face images from different views for high-fidelity reconstruction and recognition. However, such data pose an enormous challenge: data across various views exhibit a large divergence that prevents fair comparison. Generally, different views tend to be treated as different domains drawn from different distributions. Thus, there is an urgent need to mitigate the view divergence when facing specific problems, by either fusing the knowledge across multiple views or adapting knowledge from some views to others. Since different terms are used for "multi-view" data analysis and its aliases, we first give a formal definition and narrow down our research focus to differentiate it from related but distinct lines of work.

1.1 What Are Multi-view Data and Problem?

Definition 1 (Multi-view Data) (Fig. 1.1): Assume we have a set of data X = {X_1, X_2, ..., X_v} from v views, e.g., face poses, camera views, and types of features. In this book, we are especially interested in two cases of data correspondence. First, the samples across the v views are in correspondence (i.e., a sample-wise relationship holds), which falls into conventional multi-view learning. Second, the samples across different views have no correspondence, which falls into the transfer learning scenario, where discriminant knowledge is transferred.

First, multi-view learning aims to merge the knowledge from different views, either to uncover common knowledge or to employ the complementary knowledge in

Fig. 1.1 Different scenarios of multi-view data analytics. a Different types of features from single
image; b different sources to represent information; c–e images from different viewpoints

specific views to assist learning tasks. For example, in vision, multiple features
extracted from the same object by various visual descriptors, e.g., LBP, SIFT, and HOG, are highly discriminative in recognition tasks. Another example is multi-modal data
captured, represented, and stored in varied formats, e.g., near-infrared versus visible face images, or paired images and text. For multi-view learning, the goal is to fuse the knowledge
from multiple views to facilitate common learning tasks, e.g., clustering and classi-
fication. The key challenge is exploring data correspondence across multiple views.
The mappings among different views are able to couple view-specific knowledge
while additional labels would help formulate supervised regularizers. The general
setting of multi-view clustering is to group n data samples in v different views (e.g.,
v types of features, sensors, or modalities) by fusing the knowledge across different
views to seek a consistent clustering result. The general setting of multi-view classification is to build a model given v views of training data. In the test stage, there are two different scenarios. In the first, one view is used to recognize other views with the learned model; in this case, the label sets of the training and test data differ. In the second, specific to multi-feature-based learning, the v-view training data are used to seek a model by fusing the cross-view knowledge, and they also serve as gallery data to recognize v-view probe data.
Second, domain adaptation attempts to transfer knowledge from labeled source domains to ease the learning burden in target domains with few or no labeled samples. For example, in surveillance, faces are captured by a long-wave infrared sensor at night, while the recognition model is trained on regular face images collected under visible light. Conventional domain adaptation methods seek a domain-invariant representation of the data, or modify classifiers, to mitigate the marginal or conditional distribution mismatch between source and target domains.
The goal of domain adaptation is to transfer knowledge from well-labeled sources
to unlabeled targets, which accounts for the more general settings that some source
views are labeled while target views are unlabeled. The general setting of domain
adaptation is that we build a model on both labeled source data and unlabeled target
data. Then we use the model to predict the unlabeled target data, either the same data as in the training stage or different data; these correspond to transductive and inductive domain adaptation, respectively.
There are different strategies for dealing with multi-view data, e.g., translation, fusion, alignment, co-learning, and representation learning. This book focuses on representation learning and fusion. The following chapters discuss multi-view data analytic algorithms, organized around our proposed unified model (Sect. 1.2), from three aspects. Furthermore, we discuss the challenging situation where test data are sampled from unknown categories, e.g., zero-shot learning, and more challenging tasks with incomplete data, e.g., missing modality transfer learning, incomplete multi-source adaptation, and domain generalization.

1.2 A Unified Perspective

Due to the distribution divergence across different views, view-invariant feature learning is a widely used and promising technique for addressing multi-view challenges. Generally, multiple view-specific linear or non-linear mapping functions
would be sought to transform the original multi-view data into a new common space
by identifying dedicated alignment strategies with various loss functions. Specifi-
cally, we could formulate them into a common objective including two parts: (1)
multi-view alignment term; (2) feature learning regularizer, namely:


$$\min_{f_1(\cdot),\ldots,f_v(\cdot)} \; \sum_{i,j=1,\; i<j}^{v} \mathcal{A}\big(f_i(X_i),\, f_j(X_j)\big) \;+\; \lambda \sum_{k=1}^{v} \mathcal{R}\big(f_k(X_k)\big),$$

where f_i(·) is the feature learning function for view i, either a linear or non-linear mapping or a deep network.
The first common term, A(·), is a pairwise symmetric alignment function across multiple views, used either to fuse the knowledge among multiple views or to transfer knowledge across different views. Due to their different problem settings, multi-view learning and domain adaptation explore different strategies for defining this loss. While multi-view learning employs data correspondence (i.e., sample-wise relationships, with or without labels) to seek a common representation, domain adaptation employs domain- or class-wise relationships during model learning to obtain discriminant, domain-invariant features.
The second common term, R(·), is a feature learning regularizer that incorporates the label information, the intrinsic structure of the data, or both during mapping learning. For example, logistic regression, Softmax regression, and graph regularizers are usually incorporated to carry label and manifold information. In deep learning, this term is most often Softmax regression. Some multi-view learning algorithms merge the feature learning regularizer into the alignment term. Generally, the formulation of this second term is very similar between multi-view learning and domain adaptation within our research scope.
Under this unified model, we cover both shallow and deep learning approaches for multi-view data analysis, e.g., subspace learning, matrix factorization, low-rank modeling, deep auto-encoders, deep neural networks, and deep convolutional neural networks. For example, the multi-view clustering models explored include multi-view matrix factorization, multi-view subspace learning, and multi-view deep structure learning in the unsupervised setting.

1.3 Organization of the Book

The rest of this book is organized as follows. The first two parts are for multi-view
data analysis with sample-wise correspondence; and the third part is for multi-view
data analysis with class-wise correspondence.
Part I focuses on developing unsupervised multi-view clustering (MVC) models.
It consists of the following three chapters. Chapter 2 explores complementary infor-
mation across views to benefit the clustering problem and presents a deep matrix
factorization framework for MVC, where semi-nonnegative matrix factorization is
adopted to learn the hierarchical semantics of multi-view data in a layer-wise fashion.
To maximize the mutual information from each view, we enforce the non-negative
representation of each view in the final layer to be the same. Furthermore, to respect
the intrinsic geometric structure of each view's data, graph regularizers are introduced to couple the output representations of the deep structures.
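The layer-wise factorization rests on semi-nonnegative matrix factorization (semi-NMF), X ≈ ZH with H ≥ 0, solved with the multiplicative updates of Ding et al. The single-layer sketch below is a minimal illustration of that building block, not the full multi-view deep model with graph regularizers.

```python
import numpy as np

def semi_nmf(X, k, iters=200, seed=0):
    """Single-layer semi-NMF: X ~= Z @ H with H >= 0, Z unconstrained.

    Uses the multiplicative H-update of Ding et al.; stacking such
    factorizations layer-wise yields the deep structure described above.
    """
    rng = np.random.default_rng(seed)
    H = np.abs(rng.standard_normal((k, X.shape[1])))
    for _ in range(iters):
        # Basis update: exact least squares given H.
        Z = X @ H.T @ np.linalg.pinv(H @ H.T)
        A, B = Z.T @ X, Z.T @ Z
        Ap, An = np.maximum(A, 0), np.maximum(-A, 0)  # positive/negative parts
        Bp, Bn = np.maximum(B, 0), np.maximum(-B, 0)
        # Multiplicative update: keeps H nonnegative, decreases ||X - ZH||^2.
        H *= np.sqrt((Ap + Bn @ H) / (An + Bp @ H + 1e-9))
    return Z, H
```

In the multi-view version, each view keeps its own stack of basis factors Z while the final-layer H is shared (or coupled) across views.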
Chapter 3 considers an underlying problem hidden behind the emerging multi-
view techniques: what if the data from one or more views are missing? We propose an unsupervised method that handles incomplete multi-view data by transforming
the original and incomplete data to a new and complete representation in a latent
space. Different from the existing efforts that simply project data from each view
into a common subspace, a novel graph Laplacian term with a good probabilistic
interpretation is proposed to couple the incomplete multi-view samples. In such a
way, a compact global structure over the entire heterogeneous data is well preserved,
leading to a strong grouping discriminability.
Chapter 4 presents a multi-view outlier detection algorithm based on clustering
techniques to identify two different types of data outliers with abnormal behaviors.
We first give the definition of both types of outliers in multi-view setting. Then we
propose a multi-view outlier detection method with a novel consensus regularizer
on the latent representations. Specifically, we explicitly characterize each kind of
outliers by the intrinsic cluster assignment labels and sample-specific errors. We
experimentally show that this practice generalizes well when the number of views is
greater than two. Last but not least, we make a thorough discussion on the connection
and difference between the proposed consensus-regularization and the state-of-the-art
pairwise-regularization.

Part II proposes to solve multi-view classification problems including zero-shot
learning (a special problem of multi-view learning). This part includes the following
two chapters.
Chapter 5 presents two multi-view transformation learning algorithms. First, we
develop a Robust Multi-view Subspace Learning algorithm (RMSL) through dual
low-rank decompositions, which seeks a low-dimensional view-invariant
subspace for multi-view data. Generally, one sample lies in two kinds of structures:
the class structure and the view structure, which are intertwined with
one another in the original feature space. Through dual low-rank
decompositions, RMSL aims to disassemble the two intertwined structures from each
other in the low-dimensional subspace. Second, we propose a Coupled Marginal-
ized Denoising Auto-encoders framework, whose core idea is to build two types of
marginalized denoising auto-encoders for effective feature extraction. Specifically,
the intermediate dataset is treated as one of two views in one domain, therefore, one
domain has two views while the other domain only has one view.
Chapter 6 aims to precisely recognize unseen categories through a shared
visual-semantic function, which is built on the seen categories and is expected to
adapt well to unseen categories. We tackle this issue by exploiting the intrinsic relationship
in the semantic manifold and enhancing the transferability of visual-semantic func-
tion. Specifically, we propose an Adaptive Latent Semantic Representation (ALSR)
model in a sparse dictionary learning scheme, where a generic semantic dictionary
is learned to connect the latent semantic space with visual feature space. To build
a fast inference model, we explore a non-linear network to approximate the latent
sparse semantic representation, which lies in the semantic manifold space.
Part III discusses the transfer learning scenarios when the multi-view data are
with class-wise correspondence. This part includes the following four chapters.
Chapter 7 defines the Missing Modality Problem in transfer learning, since we often
confront the problem that no target data are available, especially when data
are multi-modal. In this situation, the target modality is blind in the training
stage, while only the source modality can be obtained. To this end, we propose a
novel transfer learning framework by extending conventional transfer learning into
two directions to handle the Missing Modality Problem. By borrowing an auxiliary
database with the same complete modalities, our model can learn appropriate low-
dimensional subspaces from cross-modality direction and cross-database one.
Chapter 8 attempts to utilize incomplete multiple sources for effective knowledge
transfer to facilitate the learning task in target domain. Nowadays, it is common
to see multiple sources available for knowledge transfer, each of which, however,
may not include complete classes information of the target domain. Naively merging
multiple sources together would lead to inferior results due to the large divergence
among multiple sources. The core idea is to seek an appropriate domain-free subspace
where relevant knowledge for the target from multiple sources is coupled and reinforced
to compensate for any missing data in other sources. Specifically, the proposed
Incomplete Multi-source Transfer Learning (IMTL) model is designed to
minimize the marginal and conditional distribution discrepancy from two directions:
cross-domain transfer and cross-source transfer.

Chapter 9 develops three novel deep domain adaptation approaches. First, we
propose a Deep Low-Rank Coding framework (DLRC) for transfer learning. The core
idea of DLRC is to jointly learn a deep structure of feature representation and transfer
knowledge via an iterative structured low-rank constraint, which aims to deal with
the mismatch between source and target domains layer by layer. Second, we propose
a novel Deep Transfer Low-rank Coding (DTLC) framework to uncover more shared
knowledge across source and target in a multi-layer manner. Specifically, we extend
traditional low-rank coding with one dictionary to multi-layer dictionaries by jointly
building multiple latent common dictionaries shared by two domains. Third, we
propose a novel deep model called “Deep Adaptive Exemplar AutoEncoder”, where
we build a spectral bisection tree to generate source-target data compositions as
the training pairs fed to autoencoders, and impose a low-rank coding regularizer to
ensure the transferability of the learned hidden layer.
Chapter 10 explores how to fight off this challenge by capturing knowledge from
multiple source domains and generalizing to unseen target domains. In reality,
we often confront cases where the target data are totally blind in
the training stage, which is extremely challenging since we have no prior knowledge
of the target. However, existing domain generalization research efforts all employ
shallow structures, so it is difficult for them to well uncover the rich information
within the complex data. To this end, we desire to explore deep structure learning in
domain generalization to uncover more effective knowledge across multiple sources.
Part I
Unsupervised Multi-view Learning
Chapter 2
Multi-view Clustering with Complete Information

Abstract Multi-view Clustering (MVC) has garnered more attention recently since
many real-world data are comprised of different representations or views. The key
is to explore complementary information to benefit the clustering problem. In this
chapter, we consider the conventional complete-view scenario. Specifically, in the
first section, we present a deep matrix factorization framework for MVC, where
semi-nonnegative matrix factorization is adopted to learn the hierarchical semantics
of multi-view data in a layer-wise fashion. In the second section, we make an exten-
sion and consider the different sampled feature sets as multi-view data. We propose
a novel graph-based method, Ensemble Subspace Segmentation under Block-wise
constraints (ESSB), which is jointly formulated in the ensemble learning framework.

2.1 Deep Multi-view Clustering1

2.1.1 Overview

Traditional clustering aims to identify groups of “similar behavior” in single-view
data (Von Luxburg 2007; Liu et al. 2015; Steinwart 2015; Tao et al. 2016; Liu et al.
2016; Li et al. 2017). As real-world data are often captured from multiple sources
or represented by several distinct feature sets (Cai et al. 2013a; Ding and Fu 2014;
Gao et al. 2015; Zhao and Fu 2015; Wang et al. 2016), MVC has been intensively studied
recently, leveraging the heterogeneous data to achieve the same goal. Different
features characterize different information from the data set. For example, an image
can be described by different characteristics, e.g., color, texture, shape and so on.
These multiple types of features can provide useful information from different views.
MVC aims to integrate multiple feature sets together, and uncover the consistent
latent information from different views. Extensive research efforts have been made

1 This chapter is reprinted with permission from AAAI. “Multi-view Clustering via Deep Matrix
Factorization”. 31st AAAI Conference on Artificial Intelligence, pp. 2921–2927, 2017.
© Springer Nature Switzerland AG 2019
Z. Ding et al., Learning Representation for Multi-View Data Analysis,
Advanced Information and Knowledge Processing,
https://doi.org/10.1007/978-3-030-00734-8_2

in developing effective MVC methods (Cai et al. 2013a; Gao et al. 2015; Xu et al.
2016; Zhao et al. 2016). Along this line, Kumar et al. developed co-regularized Multi-
view spectral clustering to do clustering on different views simultaneously with a
co-regularization constraint (Kumar et al. 2011). Gao et al. proposed to perform
clustering on the subspace representation of each view simultaneously guided by a
common cluster structure for the consistence across different views (Gao et al. 2015).
A good survey can be found in Xu et al. (2013).
Recently, many research efforts on MVC have achieved promising performance
based on Non-negative Matrix Factorization (NMF) and its variants, because the
non-negativity constraints allow for better interpretability (Guan et al. 2012; Trigeorgis
et al. 2014). The general idea is to seek a common latent factor through non-negative
matrix factorization among Multi-view data (Liu et al. 2013; Zhang et al. 2014, 2015).
Semi Non-negative Matrix Factorization (Semi-NMF), one of the most popular
variants of NMF, was proposed to extend NMF by relaxing the factorized basis matrix
to be real-valued. This practice allows Semi-NMF to have wider applications in the
real world than NMF. Apart from exploring Semi-NMF in the MVC application for the
first time, our method has another distinction from the existing NMF-based MVC
methods: we adopt a deep structure to conduct Semi-NMF hierarchically as shown in
Fig. 2.1. As illustrated, through the deep Semi-NMF structure, we push data samples
from the same class closer layer by layer, borrowing the idea of hierarchical
representation learning from deep learning (Bengio 2009). Note that the proposed method
is different from the existing deep auto-encoder based MVC approaches (Andrew
et al. 2013; Wang et al. 2015), though all of these methods adopt a deep structure. One major
difference is that Andrew et al. (2013), Wang et al. (2015) are based on Canonical
Correlation Analysis (CCA), which is limited to 2-view case, while our method has
no such limitation.

Fig. 2.1 Framework of our proposed method. Same shape denotes the same class. For demonstra-
tion purposes, we only show the two-view case, where two deep matrix factorization structures
are proposed to capture rich information behind each view in a layer-wise fashion. With the deep
structure, samples from the same class but different views gather close to each other to generate
more discriminative representation

To sum up, in this section we propose a deep MVC algorithm through graph reg-
ularized semi-nonnegative matrix factorization. The key is to build a deep structure
through semi-nonnegative matrix factorization to seek a common feature represen-
tation with more consistent knowledge to facilitate clustering. To the best of our
knowledge, this is the first attempt applying semi-nonnegative matrix factorization
to MVC in a deep structure. We summarize our major contributions as follows:
• Deep Semi-NMF structure is built to capture the hidden information by leveraging
benefits of strong interpretability from Semi-NMF and effective feature learning
from deep structures. Through this deep matrix factorization structure, we
disassemble unimportant factors layer by layer and generate an effective consensus
representation in the final layer for MVC.
• To respect the intrinsic geometric relationship among data samples, we introduce
graph regularizers to guide the shared representation learning in each view. This
practice makes the consensus representation in the final layer preserve most shared
structures across multiple graphs. It can be considered as a fusion scheme to boost
the final MVC performance.

2.1.2 Deep Semi-NMF Formulation

2.1.2.1 Overview of Semi-NMF

As a variant of NMF, Semi-NMF (Ding et al. 2010) extends the application of
traditional NMF from non-negative input to mix-sign input, while still preserving
the strong interpretability at the same time. Its objective function can be expressed as:

$$\min_{Z,\; H \ge 0} \|X - ZH\|_F^2, \qquad (2.1)$$

where X ∈ R^{d×n} denotes the input data with n samples, each of which is a
d-dimensional feature vector. In the discussion on the equivalence of Semi-NMF and
K-means clustering (Ding et al. 2010), Z ∈ R^{d×K} can be considered as the cluster
centroid matrix,² and H ∈ R^{K×n}, H ≥ 0, is the “soft” cluster assignment matrix
in the latent space.³ Similar
to the traditional NMF, the compact representation H uncovers the hidden semantics
by simulating the part-based representation in human brain, i.e., psychological and
physiological interpretation.
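The factorization of Eq. (2.1) alternates a closed-form least-squares step for the real-valued basis Z with the nonnegativity-preserving multiplicative step of Ding et al. (2010). Below is a minimal illustrative NumPy sketch; the random initialization and iteration count are arbitrary choices, not prescribed by the text:

```python
import numpy as np

def semi_nmf(X, K, n_iter=200, seed=0):
    """Single-layer Semi-NMF (Ding et al. 2010): X ~= Z H with H >= 0 and Z
    real-valued. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    H = np.abs(rng.standard_normal((K, n)))   # nonnegative initialization
    pos = lambda M: (np.abs(M) + M) / 2       # negative entries zeroed
    neg = lambda M: (np.abs(M) - M) / 2       # positive entries zeroed
    for _ in range(n_iter):
        # Z has a closed-form least-squares solution: Z = X H^T (H H^T)^(-1)
        Z = X @ H.T @ np.linalg.pinv(H @ H.T)
        # Multiplicative update keeps H nonnegative
        ZtX, ZtZ = Z.T @ X, Z.T @ Z
        H = H * np.sqrt((pos(ZtX) + neg(ZtZ) @ H) /
                        (neg(ZtX) + pos(ZtZ) @ H + 1e-12))
    return Z, H
```

On mix-sign input, H stays nonnegative while Z is unconstrained, which is exactly what distinguishes Semi-NMF from plain NMF.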
While in reality, natural data may contain different modalities (or factors), e.g.,
expression, illumination, pose in face datasets (Samaria and Harter 1994; Georghi-
ades et al. 2001). Single NMF is not strong enough to eliminate the effect of

2 For a neat presentation, we do not follow the notation style in Ding et al. (2010), and remove the
mix-sign notation “±” on X and Z , which does not affect the rigorousness.
3 In some literature (Ding et al. 2010; Zhao et al. 2015), Semi-NMF is also called the soft version
of K-means clustering.
those undesirable factors and extract the intrinsic class information. To solve this,
Trigeorgis et al. (2014) showed that a deep model based on Semi-NMF yields
promising results in data representation. The multi-layer decomposition process can be
expressed as
$$X \approx Z_1 H_1^+, \quad X \approx Z_1 Z_2 H_2^+, \quad \ldots, \quad X \approx Z_1 \cdots Z_m H_m^+ \qquad (2.2)$$

where Z_i denotes the ith layer basis matrix and H_i^+ is the ith layer representation
matrix. Trigeorgis et al. (2014) proved that each hidden representation layer is able to
identify different attributes. Inspired by this work, we propose an MVC method
based on the deep matrix factorization technique.
In the MVC setting, let us denote X = {X^(1), ..., X^(v), ..., X^(V)} as the data
sample set, where V represents the number of views and X^(v) ∈ R^{d_v×n}, with d_v
denoting the dimensionality of the vth-view data and n the number of data samples.
Then we formulate our model as:

$$\min_{Z_i^{(v)},\,H_i^{(v)},\,H_m,\,\alpha^{(v)}} \; \sum_{v=1}^{V} (\alpha^{(v)})^{\gamma} \left( \left\| X^{(v)} - Z_1^{(v)} Z_2^{(v)} \cdots Z_m^{(v)} H_m \right\|_F^2 + \beta\,\mathrm{tr}\!\left(H_m L^{(v)} H_m^{T}\right) \right)$$
$$\text{s.t.}\quad H_i^{(v)} \ge 0,\; H_m \ge 0,\; \sum_{v=1}^{V} \alpha^{(v)} = 1,\; \alpha^{(v)} \ge 0, \qquad (2.3)$$

where X^(v) is the given data for the vth view; Z_i^(v), i ∈ {1, 2, ..., m}, is the ith layer
mapping for view v; m is the number of layers; H_m is the consensus latent representation
for all views; α^(v) is the weighting coefficient for the vth view; and γ is the parameter to
control the weight distribution. L^(v) is the graph Laplacian of the graph for view v,
where each graph is constructed in k-nearest-neighbor (k-NN) fashion. The weight
matrix of the graph for view v is A^(v) and L^(v) = D^(v) − A^(v), where D_ii^(v) = Σ_j A_ij^(v)
(He and Niyogi 2003; Ding and Fu 2016).
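The graph terms in Eq. (2.3) only require each view's k-NN affinity matrix A^(v) and its unnormalized Laplacian. A minimal sketch of that construction follows; binary edge weights and max-symmetrization are assumptions here, since the text does not fix a weighting scheme:

```python
import numpy as np

def knn_laplacian(X, k=5):
    """Unnormalized graph Laplacian L = D - A from a symmetrized k-NN graph.
    X: (d, n) data of one view, columns are samples."""
    n = X.shape[1]
    # pairwise squared Euclidean distances between columns
    sq = (X ** 2).sum(axis=0)
    D2 = sq[:, None] + sq[None, :] - 2 * X.T @ X
    A = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(D2[i])[1:k + 1]   # skip the sample itself at position 0
        A[i, idx] = 1.0                     # binary weights (an assumption)
    A = np.maximum(A, A.T)                  # symmetrize the directed k-NN graph
    return np.diag(A.sum(axis=1)) - A       # L = D - A
```

Each row of L sums to zero and L is positive semi-definite, the standard properties the graph regularizer relies on.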

Remark 1 Due to the homology of Multi-view data, the final layer representation
Hm(v) for vth view data should be close to each other. Here, we use the consensus
Hm as a constraint to enforce Multi-view data to share the same representation after
multi-layer factorization.

Remark 2 Multiple graphs are constructed to constrain the common representation


learning so that the geometric structure in each view could be well preserved for the
final clustering. Moreover, the novel graph term could fuse the geometric knowledge
from multiple views to make the common representation more consistent.

2.1.2.2 Optimization

To expedite the approximation of the variables in the proposed model, each of the
layers is pre-trained to obtain an initial approximation of the variables Z_i^(v) and H_i^(v)
for the ith layer in the vth view. The effectiveness of pre-training has been proven
before (Hinton and Salakhutdinov 2006) on deep autoencoder networks. Similar
to Trigeorgis et al. (2014), we decompose the input data matrix X^(v) ≈ Z_1^(v) H_1^(v)
to perform the pre-training, where Z_1^(v) ∈ R^{d_v×p_1} and H_1^(v) ∈ R^{p_1×n}. Then the vth
view feature matrix H_1^(v) is decomposed as H_1^(v) ≈ Z_2^(v) H_2^(v), where Z_2^(v) ∈ R^{p_1×p_2}
and H_2^(v) ∈ R^{p_2×n}. Here p_1 and p_2 are the dimensionalities for layer 1 and layer 2,
respectively.⁴ We continue to do so until all layers are pre-trained. Following
this, the weights of each layer are fine-tuned by alternating minimization of
the proposed objective function Eq. (2.3). First, we denote the cost function as

$$\mathcal{C} = \sum_{v=1}^{V} (\alpha^{(v)})^{\gamma} \left( \|X^{(v)} - Z_1^{(v)} Z_2^{(v)} \cdots Z_m^{(v)} H_m\|_F^2 + \beta\,\mathrm{tr}\big(H_m L^{(v)} H_m^{T}\big) \right).$$

Update rule for the weight matrix Z_i^(v). We minimize the objective with
respect to Z_i^(v) by fixing the rest of the variables in the vth view for the ith layer.
Setting ∂C/∂Z_i^(v) = 0 gives the solution

$$Z_i^{(v)} = (\Phi^{T}\Phi)^{-1}\Phi^{T} X^{(v)} \tilde{H}_i^{(v)T} \big(\tilde{H}_i^{(v)} \tilde{H}_i^{(v)T}\big)^{-1} = \Phi^{\dagger} X^{(v)} \tilde{H}_i^{(v)\dagger}, \qquad (2.4)$$

where Φ = Z_1^(v) ⋯ Z_{i−1}^(v), H̃_i^(v) denotes the reconstruction (or the learned latent
feature) of the ith layer's feature matrix in the vth view, and † represents the
Moore–Penrose pseudo-inverse.
Update rule for the weight matrix H_i^(v) (i < m). Following Ding et al. (2010), the
update rule for H_i^(v) (i < m) is formulated as follows:

$$H_i^{(v)} = H_i^{(v)} \odot \sqrt{\frac{[\Phi^{T} X^{(v)}]^{pos} + [\Phi^{T}\Phi H_i^{(v)}]^{neg}}{[\Phi^{T} X^{(v)}]^{neg} + [\Phi^{T}\Phi H_i^{(v)}]^{pos}}}, \qquad (2.5)$$

where [M]^pos denotes a matrix in which all the negative elements are replaced by 0;
similarly, [M]^neg denotes one in which all the positive elements are replaced by 0. That is,

$$\forall k,j:\quad [M]_{kj}^{pos} = \frac{|M_{kj}| + M_{kj}}{2}, \qquad [M]_{kj}^{neg} = \frac{|M_{kj}| - M_{kj}}{2}. \qquad (2.6)$$
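The pos/neg split of Eq. (2.6) and the multiplicative update of Eq. (2.5) translate directly into NumPy. The small eps added to the denominator is a numerical-safety assumption, not part of the original derivation:

```python
import numpy as np

def pos(M):
    """[M]^pos of Eq. (2.6): negative entries replaced by 0."""
    return (np.abs(M) + M) / 2

def neg(M):
    """[M]^neg of Eq. (2.6): positive entries replaced by 0 (sign flipped)."""
    return (np.abs(M) - M) / 2

def update_H(H, Phi, X, eps=1e-12):
    """One multiplicative update of Eq. (2.5) for an intermediate layer (i < m).
    Phi is the product of basis matrices above this layer; H >= 0 is preserved."""
    PtX = Phi.T @ X
    PtPH = Phi.T @ Phi @ H
    return H * np.sqrt((pos(PtX) + neg(PtPH)) / (neg(PtX) + pos(PtPH) + eps))
```

Note the identity M = [M]^pos − [M]^neg, which the convergence proof below relies on.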

Update rule for the weight matrix H_m (i.e., H_i^(v) with i = m). Since H_m involves the
graph term, the updating rule and convergence property have never been investigated
before. We give the updating rule first, followed by the proof of its convergence
property:

$$H_m = H_m \odot \sqrt{\frac{[\Phi^{T} X^{(v)}]^{pos} + [\Phi^{T}\Phi H_m]^{neg} + G_u(H_m, A)}{[\Phi^{T} X^{(v)}]^{neg} + [\Phi^{T}\Phi H_m]^{pos} + G_d(H_m, A)}}, \qquad (2.7)$$

where G_u(H_m, A) = β([H_m A^(v)]^pos + [H_m D^(v)]^neg) and G_d(H_m, A) = β([H_m A^(v)]^neg + [H_m D^(v)]^pos).

4 For ease of presentation, we denote the dimensionalities (layer size) from layer 1 to layer m
as [p_1 ... p_m] in the experiments.
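For the final layer, Eq. (2.7) adds the graph terms G_u and G_d to the same multiplicative pattern. The sketch below shows one view's contribution only; the sum over views and the α^(v) weights of Eq. (2.3) are omitted for brevity, and the eps term is a numerical-safety assumption:

```python
import numpy as np

def update_Hm(Hm, Phi, X_v, A_v, beta, eps=1e-12):
    """Graph-regularized multiplicative update of Eq. (2.7), one view's part.
    A_v: symmetric nonnegative affinity matrix of view v; D_v is its degree
    matrix, so the graph gradient contribution is beta * Hm (D_v - A_v)."""
    pos = lambda M: (np.abs(M) + M) / 2
    neg = lambda M: (np.abs(M) - M) / 2
    D_v = np.diag(A_v.sum(axis=1))
    PtX = Phi.T @ X_v
    PtPH = Phi.T @ Phi @ Hm
    Gu = beta * (pos(Hm @ A_v) + neg(Hm @ D_v))   # goes to the numerator
    Gd = beta * (neg(Hm @ A_v) + pos(Hm @ D_v))   # goes to the denominator
    return Hm * np.sqrt((pos(PtX) + neg(PtPH) + Gu) /
                        (neg(PtX) + pos(PtPH) + Gd + eps))
```

Because every term in the ratio is nonnegative and Hm enters multiplicatively, the nonnegativity constraint Hm >= 0 is maintained automatically.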

Theorem 2.1 The limiting solution of the update rule in Eq. (2.7) satisfies the KKT
condition.

Proof We introduce the Lagrangian function

$$\mathcal{L}(H_m) = \sum_{v=1}^{V} (\alpha^{(v)})^{\gamma} \left( \|X^{(v)} - Z_1^{(v)} Z_2^{(v)} \cdots Z_m^{(v)} H_m\|_F^2 + \beta\,\mathrm{tr}(H_m L^{(v)} H_m^{T}) \right) - \mathrm{tr}(\eta H_m^{T}), \qquad (2.8)$$

where the Lagrangian multiplier η enforces the nonnegativity constraint H_m ≥ 0. The
zero-gradient condition gives ∂L(H_m)/∂H_m = 2Φ^T(ΦH_m − X^(v)) + 2βH_m(D^(v) −
A^(v)) − η = 0. From the complementary slackness condition, we obtain

$$\big[ 2\Phi^{T}(\Phi H_m - X^{(v)}) + 2\beta H_m (D^{(v)} - A^{(v)}) \big]_{kl} (H_m)_{kl} = \eta_{kl} (H_m)_{kl} = 0. \qquad (2.9)$$

This is a fixed-point equation that the solution must satisfy at convergence.
The limiting solution of Eq. (2.7) satisfies the fixed-point equation. At convergence,
H_m^(∞) = H_m^(t+1) = H_m^(t) = H_m, i.e.,

$$(H_m)_{kl} = (H_m)_{kl} \sqrt{\frac{[\Phi^{T} X^{(v)}]_{kl}^{pos} + [\Phi^{T}\Phi H_m]_{kl}^{neg} + [G_u(H_m, A)]_{kl}}{[\Phi^{T} X^{(v)}]_{kl}^{neg} + [\Phi^{T}\Phi H_m]_{kl}^{pos} + [G_d(H_m, A)]_{kl}}}. \qquad (2.10)$$

Note that Φ^T X^(v) = [Φ^T X^(v)]^pos − [Φ^T X^(v)]^neg; Φ^TΦH_m = [Φ^TΦH_m]^pos −
[Φ^TΦH_m]^neg; H_m D^(v) = [H_m D^(v)]^pos − [H_m D^(v)]^neg; and H_m A^(v) = [H_m A^(v)]^pos −
[H_m A^(v)]^neg. Thus Eq. (2.10) reduces to

$$\big[ 2\Phi^{T}(\Phi H_m - X^{(v)}) + 2\beta H_m (D^{(v)} - A^{(v)}) \big]_{kl} (H_m)_{kl}^2 = 0. \qquad (2.11)$$

Equation (2.11) is identical to Eq. (2.9). Both equations require that at least one of
the two factors be equal to zero. The first factors in both equations are identical. For
the second factor, (H_m)_{kl} or (H_m)²_{kl}, if (H_m)_{kl} = 0 then (H_m)²_{kl} = 0, and vice versa.
Therefore, if Eq. (2.9) holds, Eq. (2.11) also holds, and vice versa.

Update rule for the weight α^(v). Similar to Cai et al. (2013b), for ease of
representation, let us denote R^(v) = ‖X^(v) − Z_1^(v) Z_2^(v) ⋯ Z_m^(v) H_m‖²_F + β tr(H_m L^(v) H_m^T).
The objective in Eq. (2.3) with respect to α^(v) is written as

$$\min_{\alpha^{(v)}} \sum_{v=1}^{V} (\alpha^{(v)})^{\gamma} R^{(v)}, \quad \text{s.t.} \;\; \sum_{v=1}^{V} \alpha^{(v)} = 1, \;\; \alpha^{(v)} \ge 0. \qquad (2.12)$$

The Lagrange function of Eq. (2.12) is written as

$$\min_{\alpha^{(v)}} \sum_{v=1}^{V} (\alpha^{(v)})^{\gamma} R^{(v)} - \lambda \Big( \sum_{v=1}^{V} \alpha^{(v)} - 1 \Big), \qquad (2.13)$$

where λ is the Lagrange multiplier. Taking the derivative of Eq. (2.13) with respect
to α^(v) and setting it to zero, we obtain

$$\alpha^{(v)} = \left( \frac{\lambda}{\gamma R^{(v)}} \right)^{\frac{1}{\gamma-1}}. \qquad (2.14)$$

Substituting the α^(v) of Eq. (2.14) into the constraint Σ_{v=1}^{V} α^(v) = 1, we obtain

$$\alpha^{(v)} = \frac{\big( \gamma R^{(v)} \big)^{\frac{1}{1-\gamma}}}{\sum_{v=1}^{V} \big( \gamma R^{(v)} \big)^{\frac{1}{1-\gamma}}}. \qquad (2.15)$$

It is interesting to see that with only one parameter γ we can control the different
weights for different views. When γ approaches ∞, we get equal weights; when γ
is close to 1, the view whose R^(v) value is smallest is assigned weight 1, and the
others are assigned 0.
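The closed-form weights of Eq. (2.15) are one line of NumPy, and the two limiting behaviors of γ described above can be checked numerically (assuming γ > 1, so that smaller residuals receive larger weights):

```python
import numpy as np

def view_weights(R, gamma):
    """Closed-form view weights alpha^(v) of Eq. (2.15) from residuals R^(v)."""
    R = np.asarray(R, dtype=float)
    w = (gamma * R) ** (1.0 / (1.0 - gamma))  # negative exponent for gamma > 1
    return w / w.sum()                         # normalize so weights sum to 1
```

For example, with residuals R = [1, 2, 4] and γ = 2 the weights are proportional to 1/R, while a very large γ yields nearly uniform weights.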
This completes all the update rules. We repeat the updates iteratively
until convergence. The entire algorithm is outlined in Algorithm 2.1. After obtaining
the optimized H_m, standard spectral clustering (Ng et al. 2001) is performed on the
graph built from H_m via the k-NN algorithm.
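This final clustering step can be sketched with scikit-learn, whose SpectralClustering supports a k-NN affinity directly; treating the columns of H_m as samples follows the convention of this section, while the sklearn dependency itself is an implementation choice, not prescribed by the text:

```python
import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_from_Hm(Hm, n_clusters, k=5):
    """Standard spectral clustering (Ng et al. 2001) on a k-NN graph built
    from the consensus representation Hm (shape: features x samples)."""
    sc = SpectralClustering(n_clusters=n_clusters,
                            affinity="nearest_neighbors",  # k-NN graph
                            n_neighbors=k,
                            assign_labels="kmeans",
                            random_state=0)
    return sc.fit_predict(Hm.T)  # sklearn expects samples as rows
```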

2.1.2.3 Time Complexity

Our deep matrix factorization model is composed of two stages, i.e., pre-training and
fine-tuning, so we analyze them separately. To simplify the analysis, we assume the
dimensions in all the layers (i.e., the layer sizes) are the same, denoted p, and the
original feature dimensions for all the views are the same, denoted d. V is the number
of views and m is the number of layers.

Algorithm 2.1: Optimization Solution of Problem (2.3)

Input: Multi-view data X^(v), tuning parameters γ, β, the layer sizes d_i, the number of classes k.
1  Initialize:
2  for all layers in each view do
3      (Z_i^(v), H_i^(v)) ← SemiNMF(H_{i−1}^(v), d_i)
4  end
5  while not converged do
6      for all layers in each view do
7          H̃_i^(v) ← H_m if i = m; otherwise H̃_i^(v) ← Z_{i+1}^(v) H̃_{i+1}^(v)
8          Φ ← Z_1^(v) ⋯ Z_{i−1}^(v)
9          Z_i^(v) ← Φ† X^(v) H̃_i^(v)†
10         H_i^(v) ← update via Eq. (2.7) if i = m; otherwise update via Eq. (2.5)
11     end
12     Update α^(v) via Eq. (2.15)
13 end
Output: Weight matrices Z_i^(v), feature matrices H_i^(v) (i ≠ m), and the consensus H_m of the final layer.

In the pre-training stage, the Semi-NMF process and graph construction are the
time-consuming parts. The complexity is of order O(V m t_p (dnp + np² + pd² + pn² + dn²)),
where t_p is the number of iterations to achieve convergence in the Semi-NMF
optimization process. Normally, p < d, thus the computational cost is
T_pre = O(V m t_p (dnp + pd² + dn²)) for the pre-training stage. Similarly, in the
fine-tuning stage, the time complexity is of order T_fine = O(V m t_f (dnp + pd² + pn²)),
where t_f is the number of iterations in the fine-tuning stage. To sum up, the overall
computational cost is T_total = T_pre + T_fine.

2.1.3 Experiments on Face Benchmarks

We choose three face image/video benchmarks in our experiments, as faces contain
rich structural information, which is beneficial for manifesting the strengths of the deep
NMF structure. A brief introduction of the datasets and preprocessing steps follows.
Yale consists of 165 images of 15 subjects in raw pixels. Each subject has 11
images, with different conditions, e.g., facial expressions, illuminations, with/without
glasses, lighting conditions, etc. Extended Yale B consists of 38 subjects of face
images. Each subject has 64 faces images under various lighting conditions and
poses. In this work, the first 10 subjects, 640 images data are used for experiment.
Notting-Hill is a well-known video face benchmark (Zhang et al. 2009), which is
generated from the movie “Notting Hill”. There are 5 major cast members, with 4660 faces
in 76 tracks.

For these datasets, we follow the preprocessing strategy of Cao et al. (2015). First,
all the images are resized to 48 × 48 and then three kinds of features are extracted,
i.e., intensity, LBP (Ahonen et al. 2006) and Gabor (Feichtinger and Strohmer 1998).
Specifically, LBP is a 59-dimensional histogram over 9 × 10 pixel patches generated
from the cropped images. The scale parameter λ in the Gabor wavelets is fixed at 4,
with four orientations θ = {0°, 45°, 90°, 135°}, on a cropped image of size 25 × 30 pixels.
For the comparison baselines, we have the following. (1) BestSV performs stan-
dard spectral clustering (Ng et al. 2001) on the features in each view. We report the best
performance. (2) ConcatFea concatenates all the features, and then performs stan-
dard spectral clustering. (3) ConcatPCA concatenates all the features, then projects
the original features into a low-dimensional subspace via PCA. Spectral clustering
is applied on the projected feature representation. (4) Co-Reg (SPC) (Kumar et al.
2011) co-regularizes the clustering hypotheses to enforce that the memberships from
different views agree with each other. (5) Co-Training (SPC) (Kumar and Daume
III 2011) borrows the idea of the co-training strategy to alternately modify the graph
structure of each view using other views’ information. (6) Min-D(isagreement) (de
Sa 2005) builds a bipartite graph which derives from the “minimizing-disagreement”
idea. (7) MultiNMF (Liu et al. 2013) applies NMF to project each view data to the
common latent subspace. This method can be roughly considered as a one-layer
version of our proposed method. (8) NaMSC (Cao et al. 2015) first applies the method
of Hu et al. (2014) to each view's data, then combines the learned representations and
feeds them to spectral clustering. (9) DiMSC (Cao et al. 2015) investigates the complementary
information of representations of Multi-view data by introducing a diversity term.
This work is also one of the most recent approaches in MVC. We do not make a
comparison with deep auto-encoder based methods (Andrew et al. 2013; Wang et
al. 2015), because these CCA-based methods cannot fully utilize more than two views
of data, leading to an unfair comparison.
To make a comprehensive evaluation, we use six different evaluation metrics
including normalized mutual information (NMI), accuracy (ACC), adjusted
rand index (AR), F-score, Precision and Recall. For details about the metrics,
readers could refer to Kumar and Daume III (2011), Cao et al. (2015). For all the
metrics, higher value denotes better performance. Different measurements favor dif-
ferent properties, thus a comprehensive view can be acquired from the diverse results.
For each experiment, we repeat 10 times and report the mean values along with stan-
dard deviations.
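Most of the reported metrics are available in scikit-learn, and the pairwise Precision, Recall, and F-score can be derived from the pair confusion matrix. A sketch follows (ACC additionally requires Hungarian matching of cluster labels to classes and is omitted here):

```python
import numpy as np
from sklearn import metrics
from sklearn.metrics.cluster import pair_confusion_matrix

def evaluate(y_true, y_pred):
    """NMI, AR, and pairwise Precision/Recall/F-score for a clustering result."""
    # pair_confusion_matrix counts ordered sample pairs; ratios are unaffected
    (tn, fp), (fn, tp) = pair_confusion_matrix(y_true, y_pred)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return {
        "NMI": metrics.normalized_mutual_info_score(y_true, y_pred),
        "AR": metrics.adjusted_rand_score(y_true, y_pred),
        "F-score": 2 * precision * recall / (precision + recall),
        "Precision": precision,
        "Recall": recall,
    }
```

All five values equal 1.0 for a perfect clustering, even under an arbitrary permutation of cluster labels.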

2.1.3.1 Result

Tables 2.1 and 2.2 tabulate the results on datasets Yale and Extended YaleB. Our
method outperforms all the other competitors. For the dataset Yale, we raise the
performance bar by around 7.57% in NMI, 5.08% in ACC, 8.22% in AR, 6.56% in
F-score, 10.13% in Precision and 4.61% in Recall. On average, we improve the state-
of-the-art DiMSC by more than 7%. The possible reason why our method improves
a lot is that both image data in Yale and Extended YaleB contain multiple factors, i.e.,

Table 2.1 Results of 6 different metrics (mean ± standard deviation) on dataset Yale

Method     | NMI           | ACC           | AR            | F-score       | Precision     | Recall
BestSV     | 0.654 ± 0.009 | 0.616 ± 0.030 | 0.440 ± 0.011 | 0.475 ± 0.011 | 0.457 ± 0.011 | 0.495 ± 0.010
ConcatFea  | 0.641 ± 0.006 | 0.544 ± 0.038 | 0.392 ± 0.009 | 0.431 ± 0.008 | 0.415 ± 0.007 | 0.448 ± 0.008
ConcatPCA  | 0.665 ± 0.037 | 0.578 ± 0.038 | 0.396 ± 0.011 | 0.434 ± 0.011 | 0.419 ± 0.012 | 0.450 ± 0.009
Co-Reg     | 0.648 ± 0.002 | 0.564 ± 0.000 | 0.436 ± 0.002 | 0.466 ± 0.000 | 0.455 ± 0.004 | 0.491 ± 0.003
Co-Train   | 0.672 ± 0.006 | 0.630 ± 0.001 | 0.452 ± 0.010 | 0.487 ± 0.009 | 0.470 ± 0.010 | 0.505 ± 0.007
Min-D      | 0.645 ± 0.005 | 0.615 ± 0.043 | 0.433 ± 0.006 | 0.470 ± 0.006 | 0.446 ± 0.005 | 0.496 ± 0.006
MultiNMF   | 0.690 ± 0.001 | 0.673 ± 0.001 | 0.495 ± 0.001 | 0.527 ± 0.000 | 0.512 ± 0.000 | 0.543 ± 0.000
NaMSC      | 0.671 ± 0.011 | 0.636 ± 0.000 | 0.475 ± 0.004 | 0.508 ± 0.007 | 0.492 ± 0.003 | 0.524 ± 0.004
DiMSC      | 0.727 ± 0.010 | 0.709 ± 0.003 | 0.535 ± 0.001 | 0.564 ± 0.002 | 0.543 ± 0.001 | 0.586 ± 0.003
Ours       | 0.782 ± 0.010 | 0.745 ± 0.011 | 0.579 ± 0.002 | 0.601 ± 0.002 | 0.598 ± 0.001 | 0.613 ± 0.002

pose, expression, illumination, etc. The existing MVC methods only involve one layer
of representation, e.g., one layer factor decomposition in MultiNMF or the practice of
self-representation (i.e., coefficient matrix Z in NaMSC and DiMSC Cao et al. 2015).
However, our proposed approach can extract the meaningful representation layer by
layer. Through the deep representation, we eliminate the influence of undesirable
factors, and keep the core information (i.e., class/id information) in the final layer.
Table 2.3 lists the performance on video data Notting-Hill. This dataset is more
challenging than the previous two image datasets, since the illumination conditions
vary dramatically and the source of lighting is arbitrary. Moreover, there is no fixed
expression pattern in the Notting-Hill movie, on the contrary to datasets Yale and
Extended YaleB. We observe from the tables that our method reports the superior
results in five metrics. The only exception is NMI, where our performance is slightly
worse than DiMSC's, by only 0.25%. Therefore, we safely draw the conclusion that our
proposed method generally achieves better clustering performance in the challenging
video dataset Notting-Hill.

2.1.3.2 Analysis

In this subsection, the robustness and stability of the proposed model are evaluated.
The convergence property is first studied in terms of objective value and NMI

Table 2.2 Results of 6 different metrics (mean ± standard deviation) on dataset Extended YaleB

Method     | NMI           | ACC           | AR            | F-score       | Precision     | Recall
BestSV     | 0.360 ± 0.016 | 0.366 ± 0.059 | 0.225 ± 0.018 | 0.303 ± 0.011 | 0.296 ± 0.010 | 0.310 ± 0.012
ConcatFea  | 0.147 ± 0.005 | 0.224 ± 0.012 | 0.064 ± 0.003 | 0.159 ± 0.002 | 0.155 ± 0.002 | 0.162 ± 0.002
ConcatPCA  | 0.152 ± 0.003 | 0.232 ± 0.005 | 0.069 ± 0.002 | 0.161 ± 0.002 | 0.158 ± 0.001 | 0.164 ± 0.002
Co-Reg     | 0.151 ± 0.001 | 0.224 ± 0.000 | 0.066 ± 0.001 | 0.160 ± 0.000 | 0.157 ± 0.001 | 0.162 ± 0.000
Co-Train   | 0.302 ± 0.007 | 0.186 ± 0.001 | 0.043 ± 0.001 | 0.140 ± 0.001 | 0.137 ± 0.001 | 0.143 ± 0.002
Min-D      | 0.186 ± 0.003 | 0.242 ± 0.018 | 0.088 ± 0.001 | 0.181 ± 0.001 | 0.174 ± 0.001 | 0.189 ± 0.002
MultiNMF   | 0.377 ± 0.006 | 0.428 ± 0.002 | 0.231 ± 0.001 | 0.329 ± 0.001 | 0.298 ± 0.001 | 0.372 ± 0.002
NaMSC      | 0.594 ± 0.004 | 0.581 ± 0.013 | 0.380 ± 0.002 | 0.446 ± 0.004 | 0.411 ± 0.002 | 0.486 ± 0.001
DiMSC      | 0.635 ± 0.002 | 0.615 ± 0.003 | 0.453 ± 0.000 | 0.504 ± 0.006 | 0.481 ± 0.002 | 0.534 ± 0.001
Ours       | 0.649 ± 0.002 | 0.763 ± 0.001 | 0.512 ± 0.002 | 0.564 ± 0.001 | 0.525 ± 0.001 | 0.610 ± 0.001

performance. Then analytical experiments on three key model parameters, β, γ,
and layer size, are conducted.
Convergence analysis. In Theorem 2.1, we theoretically show that the most
complex updating for Hm satisfies KKT conditions. To experimentally show the
convergence property of the whole model, we compute the objective value of Eq. (2.3)
in each iteration. The corresponding parameters γ , β and layer size are set as 0.5,
0.1 and [100, 50], respectively. The objective value curve is plotted in red in Fig. 2.2.
We observe that the objective value decreases steadily and gradually converges
after around 100 iterations. The average NMI (in blue) has two stages
before converging: from #1 to #14, the NMI increases dramatically; then from #15 to
#30, it slightly bumps and reaches the best at around the convergence point. For the
sake of safety, the maximum number of iterations is set to 150 for all the experiments.
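The stopping rule described above, iterating until the objective plateaus with a hard cap of 150 iterations, can be sketched generically as follows. This is an illustrative sketch, not the authors' code: `objective` and `update_step` are hypothetical stand-ins for Eq. (2.3) and the Hm update, demonstrated here on a toy quadratic.

```python
def run_until_converged(objective, update_step, state, max_iters=150, tol=1e-6):
    """Iterate update_step, recording the objective value each iteration.

    Stops early when the relative decrease of the objective falls below
    `tol`, and always stops after `max_iters` iterations (capped at 150,
    matching the safety margin used in the text).
    """
    history = [objective(state)]
    for _ in range(max_iters):
        state = update_step(state)
        history.append(objective(state))
        prev, curr = history[-2], history[-1]
        if abs(prev - curr) <= tol * max(abs(prev), 1.0):
            break
    return state, history

# Toy stand-in: minimise f(x) = x^2 with a damped update x <- 0.5 x.
state, history = run_until_converged(
    objective=lambda x: x * x,
    update_step=lambda x: 0.5 * x,   # hypothetical update rule
    state=8.0,
)
# As with the red curve in Fig. 2.2, the objective is non-increasing.
assert all(a >= b for a, b in zip(history, history[1:]))
```

Plotting `history` against the iteration index reproduces the kind of convergence curve the text describes; a per-iteration NMI trace can be recorded the same way.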
Parameter analysis. The proposed method has four sets of parameters:
the balancing parameters β and γ, the layer size pi, and the number of nearest neighbors
k used when constructing the k-NN graph. Selecting k in k-NN graph construction
is an open problem (He and Niyogi 2003). Owing to limited space, only
the analysis experiments for the first three parameters are included in this section;
we find, however, that k = 5 usually achieves relatively good results.
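A k-NN graph of the kind mentioned above can be built in a few lines. The sketch below is a generic construction (symmetric binary adjacency under Euclidean distance, with k = 5 as the text suggests), not the authors' exact procedure:

```python
import numpy as np

def knn_graph(X, k=5):
    """Build a symmetric binary k-NN adjacency matrix from the rows of X.

    Each sample is connected to its k nearest neighbours under Euclidean
    distance; the graph is symmetrised with a logical OR.
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances via the expansion
    # ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a.b
    sq = (X ** 2).sum(axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    np.fill_diagonal(d2, np.inf)        # exclude self-loops
    nn = np.argsort(d2, axis=1)[:, :k]  # k nearest neighbours per row
    A = np.zeros((n, n), dtype=bool)
    rows = np.repeat(np.arange(n), k)
    A[rows, nn.ravel()] = True
    return A | A.T                      # symmetrise

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 4))            # 20 toy samples, 4 features
A = knn_graph(X, k=5)
assert (A == A.T).all() and A.sum(axis=1).min() >= 5
```

After symmetrisation each node has at least k neighbours; a weighted variant would replace the binary entries with, e.g., heat-kernel weights.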
Figure 2.3 shows the influence of the parameter γ on the NMI results under
three different layer-size settings, i.e., {[100 50], [500 50], [500 200]}. Parameter
CHAPTER II.

CHILD-LIFE IN THE LOWELL COTTON-MILLS.

In attempting to describe the life and times of the early mill-girls,
it has seemed best for me to write my story in the first person; not
so much because my own experience is of importance, as that it is,
in some respects, typical of that of many others who lived and
worked with me.
Our home was in Boston, in Leverett Court, now Cotting Street,
where I was born the year the corner-stone was laid for the Bunker
Hill Monument, as my mother told me always to remember. We lived
there until I was nearly seven years of age, and, although so young,
I can remember very vividly scenes and incidents which took place
at that time. We lived under the shadow of the old jail (near where
Wall Street now runs), and we children used to hear conversation,
not meant for small ears, between the prisoners and the persons in
the court who came there to see them.
All the land on which the North Union Station now stands, with
the railway lines connected with it, and also the site of many of the
streets, particularly Lowell Street, was then a part of the Mill-pond,
or was reclaimed from the Bay. The tide came in at the foot of
Leverett Court, and we could look across the water and see the
sailing vessels coming and going. There the down-east wood-
coasters landed their freight; many a time I have gone “chipping”
there, and once a generous young skipper offered me a stick of
wood, which I did not dare to take.
In 1831, under the shadow of a great sorrow, which had made
her four children fatherless,—the oldest but seven years of age,—my
mother was left to struggle alone; and, although she tried hard to
earn bread enough to fill our hungry mouths, she could not do it,
even with the help of kind friends. And so it happened that one of
her more wealthy neighbors, who had looked with longing eyes on
the one little daughter of the family, offered to adopt me. But my
mother, who had had a hard experience in her youth in living
amongst strangers, said, “No; while I have one meal of victuals a
day, I will not part with my children.” I always remembered this
speech because of the word “victuals,” and I wondered for a long
time what this good old Bible word meant.
My father was a carpenter, and some of his fellow-workmen
helped my mother to open a little shop, where she sold small stores,
candy, kindling-wood, and so on, but there was no great income
from this, and we soon became poorer than ever. Dear me! I can
see the small shop now, with its jars of striped candy, its loaves of
bread, the room at the back where we all lived, and my oldest
brother (now a “D.D.”) sawing the kindling-wood which we sold to
the neighbors.
That was a hard, cold winter; and for warmth’s sake my mother
and her four children all slept in one bed, two at the foot and three
at the head,—but her richer neighbor could not get the little
daughter; and, contrary to all the modern notions about hygiene, we
were a healthful and a robust brood. We all, except the baby, went
to school every day, and Saturday afternoons I went to a charity
school to learn to sew. My mother had never complained of her
poverty in our hearing, and I had accepted the conditions of my life
with a child’s trust, knowing nothing of the relative difference
between poverty and riches. And so I went to the sewing-school, like
any other little girl who was taking lessons in sewing and not as a
“charity child;” until a certain day when something was said by one
of the teachers, about me, as a “poor little girl,”—a thoughtless
remark, no doubt, such as may be said to-day in “charity schools.”
When I went home I told my mother that the teacher said I was
poor, and she replied in her sententious manner, “You need not go
there again.”
Shortly after this my mother’s widowed sister, Mrs. Angeline
Cudworth, who kept a factory boarding-house in Lowell, advised her
to come to that city. She secured a house for her, and my mother,
with her little brood and her few household belongings, started for
the new factory town.
We went by the canal-boat, The Governor Sullivan, and a long
and tiresome day it was to the weary mother and her four active
children, though the children often varied the scene by walking on
the tow-path under the Lombardy poplars, riding on the gates when
the locks were swung open, or buying glasses of water at the
stopping-places along the route.
When we reached Lowell, we were carried at once to my aunt’s
house, whose generous spirit had well provided for her hungry
relations; and we children were led into her kitchen, where, on the
longest and whitest of tables, lay, oh, so many loaves of bread!
After our feast of loaves we walked with our mother to the
Tremont Corporation, where we were to live, and at the old No. 5
(which imprint is still legible over the door), in the first block of
tenements then built, I began my life among factory people. My
mother kept forty boarders, most of them men, mill-hands, and she
did all her housework, with what help her children could give her
between schools; for we all, even the baby three years old, were
kept at school. My part in the housework was to wash the dishes,
and I was obliged to stand on a cricket in order to reach the sink!
My mother’s boarders were many of them young men, and
usually farmers’ sons. They were almost invariably of good character
and behavior, and it was a continual pleasure for me and my
brothers to associate with them. I was treated like a little sister,
never hearing a word or seeing a look to remind me that I was not
of the same sex as my brothers. I played checkers with them,
sometimes “beating,” and took part in their conversation, and it
never came into my mind that they were not the same as so many
“girls.” A good object-lesson for one who was in the future to
maintain, by voice and pen, her belief in the equality of the sexes!
I had been to school constantly until I was about ten years of
age, when my mother, feeling obliged to have help in her work
besides what I could give, and also needing the money which I could
earn, allowed me, at my urgent request (for I wanted to earn money
like the other little girls), to go to work in the mill. I worked first in
the spinning-room as a “doffer.” The doffers were the very youngest
girls, whose work was to doff, or take off, the full bobbins, and
replace them with the empty ones.
I can see myself now, racing down the alley, between the
spinning-frames, carrying in front of me a bobbin-box bigger than I
was. These mites had to be very swift in their movements, so as not
to keep the spinning-frames stopped long, and they worked only
about fifteen minutes in every hour. The rest of the time was their
own, and when the overseer was kind they were allowed to read,
knit, or even to go outside the mill-yard to play.
Some of us learned to embroider in crewels, and I still have a
lamb worked on cloth, a relic of those early days, when I was first
taught to improve my time in the good old New England fashion.
When not doffing, we were often allowed to go home, for a time,
and thus we were able to help our mothers in their housework. We
were paid two dollars a week; and how proud I was when my turn
came to stand up on the bobbin-box, and write my name in the
paymaster’s book, and how indignant I was when he asked me if I
could “write.” “Of course I can,” said I, and he smiled as he looked
down on me.
The working-hours of all the girls extended from five o’clock in
the morning until seven in the evening, with one-half hour for
breakfast and for dinner. Even the doffers were forced to be on duty
nearly fourteen hours a day, and this was the greatest hardship in
the lives of these children. For it was not until 1842 that the hours of
labor for children under twelve years of age were limited to ten per
day; but the “ten-hour law” itself was not passed until long after
some of these little doffers were old enough to appear before the
legislative committee on the subject, and plead, by their presence,
for a reduction of the hours of labor.
I do not recall any particular hardship connected with this life,
except getting up so early in the morning, and to this habit, I never
was, and never shall be, reconciled, for it has taken nearly a lifetime
for me to make up the sleep lost at that early age. But in every
other respect it was a pleasant life. We were not hurried any more
than was for our good, and no more work was required of us than
we were able easily to do.
Most of us children lived at home, and we were well fed, drinking
both tea and coffee, and eating substantial meals (besides
luncheons) three times a day. We had very happy hours with the
older girls, many of whom treated us like babies, or talked in a
motherly way, and so had a good influence over us. And in the long
winter evenings, when we could not run home between the doffings,
we gathered in groups and told each other stories, and sung the old-
time songs our mothers had sung, such as “Barbara Allen,” “Lord
Lovell,” “Captain Kid,” “Hull’s Victory,” and sometimes a hymn.
Among the ghost stories I remember some that would delight the
hearts of the “Society for Psychical Research.” The more imaginative
ones told of what they had read in fairy books, or related tales of old
castles and distressed maidens; and the scene of their adventures
was sometimes laid among the foundation stones of the new mill,
just building.
And we told each other of our little hopes and desires, and what
we meant to do when we grew up. For we had our aspirations; and
one of us, who danced the “shawl dance,” as she called it, in the
spinning-room alley, for the amusement of her admiring
companions, discussed seriously with another little girl the scheme
of their running away together, and joining the circus. Fortunately,
there was a grain of good sense lurking in the mind of this gay little
lassie, with the thought of the mother at home, and the scheme was
not carried out.
There was another little girl, whose mother was suffering with
consumption, and who went out of the mill almost every forenoon,
to buy and cook oysters, which she brought in hot, for her mother’s
luncheon. The mother soon went to her rest, and the little daughter,
after tasting the first bitter experience of life, followed her. Dear
Lizzie Osborne! little sister of my child-soul, such friendship as ours
is not often repeated in after life! Many pathetic stories might be told
of these little fatherless mill-children, who worked near their
mothers, and who went hand in hand with them to and from the
mill.
I cannot tell how it happened that some of us knew about the
English factory children, who, it was said, were treated so badly, and
were even whipped by their cruel overseers. But we did know of it,
and used to sing, to a doleful little tune, some verses called, “The
Factory Girl’s Last Day.” I do not remember it well enough to quote it
as written, but have refreshed my memory by reading it lately in
Robert Dale Owen’s writings:—

“THE FACTORY GIRL’S LAST DAY.

“’Twas on a winter morning,
The weather wet and wild,
Two hours before the dawning
The father roused his child,
Her daily morsel bringing,
The darksome room he paced,
And cried, ‘The bell is ringing—
My hapless darling, haste!’

. . . . . .

The overlooker met her
As to her frame she crept;
And with his thong he beat her,
And cursed her when she wept.
It seemed as she grew weaker,
The threads the oftener broke,
The rapid wheels ran quicker,
And heavier fell the stroke.”

The song goes on to tell the sad story of her death while her
“pitying comrades” were carrying her home to die, and ends:—

“That night a chariot passed her,
While on the ground she lay;
The daughters of her master,
An evening visit pay.
Their tender hearts were sighing,
As negroes’ wrongs were told,
While the white slave was dying
Who gained her father’s gold.”

In contrast with this sad picture, we thought of ourselves as well
off, in our cosey corner of the mill, enjoying ourselves in our own
way, with our good mothers and our warm suppers awaiting us
when the going-out bell should ring.
Holidays came when repairs to the great mill-wheel were going
on, or some late spring freshet caused the shutting down of the mill;
these were well improved. With what freedom we enjoyed those
happy times! My summer play-house was the woodshed, which my
mother always had well filled; how orderly and with what precision
the logs were sawed and piled with the smooth ends outwards! The
catacombs of Paris reminded me of my old playhouse. And here, in
my castle of sawed wood, was my vacation retreat, where, with my
only and beloved wooden doll, I lunched on slices of apple cut in
shape so as to represent what I called “German half-moon cakes.” I
piled up my bits of crockery with sticks of cinnamon to represent
candy, and many other semblances of things, drawn from my
mother’s housekeeping stores.
The yard which led to the shed was always green, and here many
half-holiday duties were performed. We children were expected to
scour all the knives and forks used by the forty men-boarders, and
my brothers often bought themselves off by giving me some trifle,
and I was left alone to do the whole. And what a pile of knives and
forks it was! But it was no task, for did I not have the open yard to
work in, with the sky over me, and the green grass to stand on, as I
scrubbed away at my “stent”? I don’t know why I did not think such
long tasks a burden, nor of my work in the mill as drudgery. Perhaps
it was because I expected to do my part towards helping my mother
to get our living, and had never heard her complain of the hardships
of her life.
On other afternoons I went to walk with a playmate, who, like
myself, was full of romantic dreams, along the banks of the
Merrimack River, where the Indians had still their tents, or on
Sundays, to see the “new converts” baptized. These baptizings in the
river were very common, as the tanks in the churches were not
considered apostolic by the early Baptists of Lowell.
Sometimes we rambled by the “race-way” or mill-race, which
carried the water into the flume of the mill, along whose inclining
sides grew wild roses, and the “rock-loving columbine;” and we used
to listen to see if we could hear the blue-bells ring,—this was long
before either of us had read a line of poetry.
The North Grammar school building stood at the base of a hilly
ridge of rocks, down which we coasted in winter, and where in
summer, after school-hours, we had a little cave, where we
sometimes hid, and played that we were robbers; and together we
rehearsed the dramatic scenes in “Alonzo and Melissa,” “The Children
of the Abbey,” or the “Three Spaniards;” we were turned out of
doors with Amanda, we exclaimed “Heavens!” with Melissa, and
when night came on we fled from our play-house pursued by the
dreadful apparition of old Don Padilla through the dark windings of
those old rocks, towards our commonplace home. “Ah!” as some
writer has said, “if one could only add the fine imagination of those
early days to the knowledge and experience of later years, what
books might not be written!”
Our home amusements were very original. We had no toys,
except a few homemade articles or devices of our own. I had but a
single doll, a wooden-jointed thing, with red cheeks and staring
black eyes. Playing-cards were tabooed, but my elder brother (the
incipient D.D.), who had somehow learned the game of high-low-
jack, set about making a pack. The cards were cut out of thick
yellow pasteboard, the spots and figures were made in ink, and, to
disguise their real character, the names of the suits were changed.
Instead of hearts, diamonds, spades, and clubs, they were called
charity, love, benevolence, and faith. The pasteboard was so thick
that all together the cards made a pile at least two or three feet
high, and they had to be shuffled in sections! He taught my second
brother and me the game of high-low-jack; and, with delightful
secrecy, as often as we could steal away, we played in the attic,
keeping the cards hidden, between whiles, in an old hair trunk. In
playing the game we got along very well with the names of the face-
cards,—the “queen of charity,” the “king of love,” and so on; but the
“ten-spot of faith,” and particularly the “two-spot of benevolence”
(we had never heard of the “deuce”) was too much for our sense of
humor, and almost spoiled the “rigor of the game.”
I was a “little doffer” until I became old enough to earn more
money; then I tended a spinning-frame for a little while; and after
that I learned, on the Merrimack corporation, to be a drawing-in girl,
which was considered one of the most desirable employments, as
about only a dozen girls were needed in each mill. We drew in, one
by one, the threads of the warp, through the harness and the reed,
and so made the beams ready for the weaver’s loom. I still have the
two hooks I used so long, companions of many a dreaming hour,
and preserve them as the “badge of all my tribe” of drawing-in girls.
It may be well to add that, although so many changes have been
made in mill-work, during the last fifty years, by the introduction of
machinery, this part of it still continues to be done by hand, and the
drawing-in girl—I saw her last winter, as in my time—still sits on her
high stool, and with her little hook patiently draws in the thousands
of threads, one by one.
CHAPTER III.

THE LITTLE MILL-GIRL’S ALMA MATER.

The education of a child is an all-around process, and he or she
owes only a part of it to school or college training. The child to
whom neither college nor school is open must find his whole
education in his surroundings, and in the life he is forced to lead. As
the cotton-factory was the means of the early schooling of so large a
number of men and women, who, without the opportunity thus
afforded, could not have been mentally so well developed, I love to
call it their Alma Mater. For, without this incentive to labor, this
chance to earn extra money and to use it in their own way, their
influence on the times, and also, to a certain extent, on modern
civilization, would certainly have been lost.
I had been to school quite constantly until I was nearly eleven
years of age, and then, after going into the mill, I went to some of
the evening schools that had been established, and which were
always well filled with those who desired to improve their scant
education, or to supplement what they had learned in the village
school or academy. Here might often be seen a little girl puzzling
over her sums in Colburn’s Arithmetic, and at her side another “girl”
of fifty poring over her lesson in Pierpont’s National Reader.
Some of these schools were devoted to special studies. I went to
a geography school, where the lessons were repeated in unison in a
monotonous sing-song tone, like this: “Lake Winnipeg! Lake
Winnipeg! Lake Titicaca! Lake Titicaca! Memphremagog!
Memphremagog!” and also to a school where those who fancied
they had thoughts were taught by Newman’s Rhetoric to express
them in writing. In this school, the relative position of the subject
and the predicate was not always well taught by the master; but
never to mix a metaphor or to confuse a simile was a lesson he
firmly fixed in the minds of his pupils.
As a result of this particular training, I may say here, that, while I
do not often mix metaphors, I am to this day almost as ignorant of
what is called “grammar” as Dean Swift, who, when he went up to
answer for his degree, said he “could not tell a subject from a
predicate;” or even James Whitcomb Riley, who said he “would not
know a nominative if he should meet it on the street.”
The best practical lesson in the proper use of at least one
grammatical sentence was given to me by my elder brother (not two
years older than I) one day, when I said, “I done it.” “You done it!”
said he, taking me by the shoulder and looking me severely in the
face; “Don’t you ever let me hear you say I done it again, unless you
can use have or had before it.” I also went to singing-school, and
became a member of the church choir, and in this way learned many
beautiful hymns that made a lasting impression on the serious part
of my nature.
The discipline our work brought us was of great value. We were
obliged to be in the mill at just such a minute, in every hour, in order
to doff our full bobbins and replace them with empty ones. We went
to our meals and returned at the same hour every day. We worked
and played at regular intervals, and thus our hands became deft, our
fingers nimble, our feet swift, and we were taught daily habits of
regularity and of industry; it was, in fact, a sort of manual training or
industrial school.
Some of us were fond of reading, and we read all the books we
could borrow. One of my mother’s boarders, a farmer’s daughter
from “the State of Maine,” had come to Lowell to work, for the
express purpose of getting books, usually novels, to read, that she
could not find in her native place. She read from two to four
volumes a week; and we children used to get them from the
circulating library, and return them, for her. In exchange for this, she
allowed us to read her books, while she was at work in the mill; and
what a scurrying there used to be home from school, to get the first
chance at the new book!
It was as good as a fortune to us, and all for six and a quarter
cents a week! In this way I read the novels of Richardson, Madame
D’Arblay, Fielding, Smollett, Cooper, Scott, Captain Marryatt, and
many another old book not included in Mr. Ruskin’s list of “one
hundred good books.” Passing through the alembic of a child’s pure
mind, I am not now conscious that the reading of the doubtful ones
did me any lasting harm. But I should add that I do not advise such
indiscriminate reading among young people, and there is no need of
it, since now there are so many good books, easy of access, which
have not the faults of those I was obliged to read. Then, there was
no choice. To-day, the best of reading, for children and young
people, can be found everywhere.
“Lalla Rookh” was the first poem I ever read, and it awoke in me,
not only a love of poetry, but also a desire to try my own hand at
verse-making.
And so the process of education went on, and I, with many
another “little doffer,” had more than one chance to nibble at the
root of knowledge. I had been to school for three months in each
year, until I was about thirteen years old, when my mother, who was
now a little better able to do without my earnings, sent me to the
Lowell High School regularly for two years, adding her constant
injunction, “Improve your mind, try and be somebody.” There I was
taught a little of everything, including French and Latin; and I may
say here that my “little learning,” in French at least, proved “a
dangerous thing,” as I had reason to know some years later, when I
tried to speak my book-French in Paris, for it might as well have
been Choctaw, when used as a means of oral communication with
the natives of that fascinating city.
The Lowell high school, in about 1840, was kept in a wooden
building over a butcher’s shop, but soon afterwards the new high
school, still in use, was provided, and it was co-educational. How
well I remember some of the boys and girls, and I recall them with
pleasure if not with affection. I could name them now, and have
noted with pride their success in life. A few are so high above the
rest that one would be surprised to know that they received the
principal part of their school education in that little high school room
over the butcher’s shop.
I left the high school when fifteen years of age, my school
education completed; though after that I took private lessons in
German, drawing, and dancing! About this time my elder brother
and I made up our minds that our mother had worked hard long
enough, and we prevailed on her to give up keeping boarders. This
she did, and while she remained in Lowell we supported the home
by our earnings. I was obliged to have my breakfast before daylight
in the winter. My mother prepared it over night, and while I was
cooking and eating it I read such books as Stevens’s “Travels” in
Yucatan and in Mexico, Tasso’s “Jerusalem Delivered,” and “Lights
and Shadows of Scottish Life.” My elder brother was the clerk in the
counting-room of the Tremont Corporation, and the agent, Mr.
Charles L. Tilden,—whom I thank, wherever he may be,—allowed
him to carry home at night, or over Sunday, any book that might be
left on his (the agent’s) desk; by this means I read many a beloved
volume of poetry, late into the night and on Sunday. Longfellow, in
particular, I learned almost by heart, and so retentive is the young
memory that I can repeat, even now, whole poems.
I read and studied also at my work; and as this was done by the
job, or beam, if I chose to have a book in my lap, and glance at it at
intervals, or even write a bit, nothing was lost to the “corporation.”
Lucy Larcom, in her “New England Girlhood,” speaks of the
windows in the mill on whose sides were pasted newspaper
clippings, which she calls “window gems.” It was very common for
the spinners and weavers to do this, as they were not allowed to
read books openly in the mill; but they brought their favorite
“pieces” of poetry, hymns, and extracts, and pasted them up over
their looms or frames, so that they could glance at them, and
commit them to memory. We little girls were fond of reading these
clippings, and no doubt they were an incentive to our thoughts as
well as to those of the older girls, who went to “The Improvement
Circle,” and wrote compositions.
A year or two after this I attempted poetry, and my verses began
to appear in the newspapers, in one or two Annuals, and later in The
Lowell Offering.
In 1846 I wrote some verses which were published in the Lowell
Journal, and these caused me to make the acquaintance of the sub-
editor of that paper, who afterwards became my life companion. I
speak of this here because, in my early married life, I found the
exact help that I needed for continued education,—the leisure to
read good books, sent to my husband for review, in the quiet of my
secluded home. For I had neither the gowns to wear nor the
disposition to go into society, and as my companion was not willing
to go without me, in the long evenings, when the children were in
bed and I was busy making “auld claes look amaist as good as new,”
he read aloud to me countless books on abstruse political and
general subjects, which I never should have thought of reading for
myself.
These are the “books that have helped me.” In fact, of all the
books I have read, I remember but very few that have not helped
me. Thus I had the companionship of a mind more mature, wiser,
and less prone to unrealities than my own; and if it seems to the
reader that my story is that of one of the more fortunate ones
among the working-girls of my time, it is because of this needed
help, which I received almost at the beginning of my womanhood.
And for this, as well as for those early days of poverty and toil, I am
devoutly and reverently thankful.
The religious experience of a young person oftentimes forms a
large part of the early education or development; and mine is
peculiar, since I am one of the very few persons, in this country at
least, who have been excommunicated from a Protestant church.
And I cannot speak of this event without showing the strong
sectarian tendencies of the time.
As late as 1843-1845 Puritan orthodoxy still held sway over nearly
the whole of New England; and the gloomy doctrines of Jonathan
Edwards, now called his “philosophy,” held a mighty grasp on the
minds of the people, all other denominations being frowned upon.
The Episcopal church was considered “little better than the Catholic,”
and the Universalists and the Unitarians were treated with even less
tolerance by the “Evangelicals” than any sect outside these
denominations is treated to-day. The charge against the Unitarians
was that they did not believe all of the Bible, and that they preached
“mere morality rather than religion.”
My mother, who had sat under the preaching of the Rev. Paul
Dean, in Boston, had early drifted away from her hereditary church
and its beliefs; but she had always sent her children to the
Congregational church and Sunday-school, not wishing, perhaps, to
run the same risk for their souls that she was willing to take for her
own, thus keeping us on the “safe side,” as it was called, with regard
to our eternal salvation. Consequently, we were well taught in the
belief of a literal devil, in a lake of brimstone and fire, and in the
“wrath of a just God.”
The terrors of an imaginative child’s mind, into which these
monstrous doctrines were poured, can hardly be described, and their
lasting effect need not be dwelt upon. It was natural that young
people who had minds of their own should be attracted to the new
doctrine of a Father’s love, as well as to the ministers who preached
it; and thus in a short time the mill girls and boys made a large part
of the congregation of those “unbelieving” sects which had come to
disturb the “ancient solitary reign” of primitive New England
orthodoxy.
I used often to wish that I could go to the Episcopal Sunday-
school, because their little girls were not afraid of the devil, were
allowed to dance, and had so much nicer books in their Sunday-
school library. “Little Henry and his Bearer,” and “The Lady of the
Manor,” in which was the story of “The Beautiful Estelle,” were lent
to me; and the last-named was a delight and an inspiration. But the
little “orthodox” girls were not allowed to read even religious novels;
and one of my work-mates, whose name would surprise the reader,
and who afterwards outgrew such prejudices, took me to task for
buying a paper copy of Scott’s “Redgauntlet,” saying, “Why, Hattie,
do you not know that it is a novel?”
We had frequent discussions among ourselves on the different
texts of the Bible, and debated such questions as, “Is it a sin to read
novels?” “Is it right to read secular books on Sunday?” or, “Is it
wicked to play cards or checkers?” By this it will be seen that we
were made more familiar with the form, than with the spirit or the
teaching, of Christianity.
In the spring of 1840 there was a great revival in Lowell, and
some of the little girls held prayer-meetings, after school, at each
other’s houses, and many of them “experienced religion.” I went
sometimes to these meetings, and one night, when I was walking
home by starlight, for the days were still short, one of the older girls
said to me, “Are you happy?” “Do you love Jesus?” “Do you want to
be saved?”—“Why, yes,” I answered. “Then you have experienced
religion,” said the girl; “you are converted.” I was startled at the
idea, but did not know how to deny it, and I went home in an
exalted state of feeling; and, as I looked into the depths of the
heavens above me, there came to my youthful mind the first
glimmer of thought on spiritual themes.
It was an awakening, but not a conversion, for I had been
converted from nothing to nothing. I was at once claimed as a
“young convert,” went to the church prayer-meeting, told my
“experience” as directed, and was put on probation for admission to
the church. Meanwhile, I had been advised not to ask my mother’s
consent to this step, because she was a Universalist, and might
object. But I did not follow this advice; and when I told her of my
desire, she simply answered, “If you think it will make you any
happier, do so, but I do not believe you will be satisfied.” I have
always been very thankful to my mother for giving me this freedom
in my young life,—

“Not to be followed hourly, watched and noosed,”—

this chance in such an important matter to learn to think and to act
for myself. In fact, she always carried out this principle, and never to
my recollection coerced her children on any important point, but
taught them to “see for themselves.”
When the day came for me to be admitted into the church, I,
with many other little girls, was sprinkled; and, when I stood up to
repeat the creed, I can truly say that I knew no more what were the
doctrines to which I was expected to subscribe, than I did about the
Copernican System or the Differential Calculus. And I might have
said, with the disciples at Ephesus, I “have not so much as heard
whether there be any Holy Ghost.” For, although I had been regularly
to church and to Sunday-school, I had never seen the Articles of
Belief, nor had I been instructed concerning the doctrines, or the
sacredness of the vow I was about to take upon me. Nor, from the
frequent backsliding among the young converts, do I think my case
was a singular one, although, so far as I know, I was the only one
who backslid enough to be excommunicated.
And later, when I was requested to subscribe to the Articles of
Belief, I found I could not accept them, particularly a certain part,
which related to the day of judgment and what would follow
thereafter. I have reviewed this document, and am able to quote the
exact words which were a stumbling-block to me. “We believe ...
that at the day of judgment the state of all will be unalterably fixed,
and that the punishment of the wicked and the happiness of the
righteous will be endless.”
When the service was over, I went home, feeling as if I had done
something wrong. I thought of my mother, whom my church people
called an “unbeliever;” of my dear little brother who had been
drowned, and whose soul might be lost, and I was most unhappy. In
fact, so serious was I for many days, that no doubt my church
friends thought me a most promising young convert.
Indeed I was converted, but not in the way they supposed; for I
had begun to think on religious subjects, and the more I thought the
less I believed in the doctrines of the church to which I belonged.
Doubts of the goodness of God filled my mind, and unbelief in the
Father’s love and compassion darkened my young life. What a
conversion! The beginning of long years of doubt and of struggle in
search of spiritual truths.
After a time I went no more to my church meetings, and began
to attend those of the Universalists; but, though strongly urged, as a
“come-outer,” to join that body, I did not do so, being fearful of
subscribing to a belief whose mysteries I could neither understand
nor explain.
Hearing that I was attending the meetings of another
denomination, my church appointed three persons, at least one of
whom was a deacon, to labor with me. They came to our house one
evening, and, while my mother and I sat at our sewing, they plied
me with questions relating to my duty as a church member, and
arguments concerning the articles of belief; these I did not know
how to answer, but my mother, who had had some experience in
“religious” disputes, gave text for text, and I remember that,
although I trembled at her boldness, I thought she had the best of
it.
Meanwhile, I sat silent, with downcast eyes, and when they
threatened me with excommunication if I did not go to the church
meetings, and “fulfil my covenant,” I mustered up courage to say,
with shaking voice, “I do not believe; I cannot go to your church,
even if you do excommunicate me.”
When my Universalist friends heard of this threat of
excommunication, they urged the preparation of a letter to the
church, giving my reasons for non-attendance; and this was
published in a Lowell newspaper, July 30, 1842. In this letter, which
my elder brother helped me to prepare,—in fact, I believe wrote the
most of it,—several arguments against the Articles of Belief are
given; and the letter closes with a request to “my brothers and
sisters,” to erase my name from “your church books rather than to
follow your usual course, common in cases similar to my own, to
excommunicate the heretic.”
This request was not heeded, and shortly after a committee of
three was “then appointed to take farther steps;” and this committee
reported that they had “visited and admonished” me without
success; and in November, 1842, the following vote was passed, and
is recorded in the church book:—

“Nov. 21, 1842.

Whereas, it appears that Miss Harriet Hanson has violated her
covenant with this church,—first, by repeated and regular absence
from the ordinances of the gospel, second, by embracing sentiments
deemed by this church heretical; and whereas, measures have been
taken to reclaim her, but ineffectual; therefore,
Voted, that we withdraw our fellowship from the said Miss Hanson
until she shall give satisfactory evidence of repentance.”

And thus, at seventeen years of age, I was excommunicated from
the church of my ancestors, and for no fault, no sin, no crime, but
simply because I could not subscribe conscientiously to doctrines
which I did not comprehend. I relate this phase of my youthful
experience here in detail, because it serves to show the methods
which were then in use to cast out or dispose of those members who
could not subscribe to the doctrines of the dominant church of New
England.
For some time after this, I was quite in disgrace with some of my
work-mates, and was called a “heretic” and a “child of perdition” by
my church friends. But, as I did not agree, even in this, with their
opinions, but went my “ain gait,” it followed that, although I
remained for a time something of a heretic, I was not an unbeliever
in sacred things nor did I prove to be a “child of perdition.” But this
experience made me very unhappy, and gave me a distaste for
religious reading and thinking, and for many years the Bible was a
sealed book to me, until I came to see in the Book, not the letter of
dogma, but rather the spirit of truth and of revelation. This
experience also repressed the humorous side of my nature, which is
every one’s birthright, and made me for a time a sort of youthful
cynic; and I allowed myself to feel a certain contempt for those of
my work-mates who, though they could not give clear reason for
their belief, still remained faithful to their “covenant.”
There were two or three little incidents connected with this
episode in my life that may be of interest. A little later, when I
thought of applying for the position of teaching in a public school, I
was advised by a well-meaning friend not to attempt it, “for,” the
friend added, “you will not succeed, for how can a Universalist pray
in her school?”
Several years after my excommunication, when I had come to
observe that religion and “mere morality” do not always go together,
I had a final interview with one of the deacons who had labored with
me. He was an overseer in the room where I worked, and I had
noticed his familiar manner with some of the girls, who did not like it
any better than I did; and one day, when his behavior was unusually
offensive, I determined to speak to him about it.
I called him to my drawing-in frame, where I was sitting at work,
and said to him something like this: “I have hard work to believe
that you are one of those deacons who came to labor with a young
girl about belonging to your church. I don’t think you set the
example of good works you then preached to me.” He gave me a
look, but did not answer; and shortly after, as I might have
expected, I received an “honorable discharge” from his room.
But let me acknowledge one far-reaching benefit that resulted
from my being admitted to the Orthodox church, a benefit which
came to me in the summer of 1895. Because of my baptism,
administered so long ago, I was enabled to officiate as god-mother
to my grandchild and namesake, in Pueblo, Colorado,—one among
the first of the little girls born on a political equality with the little
boys of that enlightened State, born, as one may say, with the ballot
in her hand! And to any reader who has an interest in the final result
of my religious experience, I may add, that, as late as 1898, I
became a communicant of the Episcopal Church.
When the time came for me to become engaged to the man of
my choice, having always believed in the old-fashioned idea that
there should be no secrets between persons about to marry, I told
him, among my other shortcomings, as the most serious of all, the
story of my excommunication. To my great surprise, he laughed
heartily, derided the whole affair, and wondered at the serious view I
had always taken of it; and later he enjoyed saying to some of his
gentlemen friends, as if it were a good joke, “Did you know my wife
had been excommunicated from the church?”
And I too, long since have learned, that no creed—

“Can fix our doom,
Nor stay the eternal Love from His intent,
While Hope remaining bears her verdant bloom.”
CHAPTER IV.

THE CHARACTERISTICS OF THE EARLY FACTORY GIRLS.

When I look back into the factory life of fifty or sixty years ago, I
do not see what is called “a class” of young men and women going
to and from their daily work, like so many ants that cannot be
distinguished one from another; I see them as individuals, with
personalities of their own. This one has about her the atmosphere of
her early home. That one is impelled by a strong and noble purpose.
The other,—what she is, has been an influence for good to me and
to all womankind.
Yet they were a class of factory operatives, and were spoken of
(as the same class is spoken of now) as a set of persons who earned
their daily bread, whose condition was fixed, and who must continue
to spin and to weave to the end of their natural existence. Nothing
but this was expected of them, and they were not supposed to be
capable of social or mental improvement. That they could be
educated and developed into something more than mere work-
people, was an idea that had not yet entered the public mind. So
little does one class of persons really know about the thoughts and
aspirations of another! It was the good fortune of these early mill-
girls to teach the people of that time that this sort of labor is not
degrading; that the operative is not only “capable of virtue,” but also
capable of self-cultivation.
At the time the Lowell cotton-mills were started, the factory girl
was the lowest among women. In England, and in France
particularly, great injustice had been done to her real character; she
was represented as subjected to influences that could not fail to
destroy her purity and self-respect. In the eyes of her overseer she
was but a brute, a slave, to be beaten, pinched, and pushed about.
It was to overcome this prejudice that such high wages had been
offered to women that they might be induced to become mill-girls, in
spite of the opprobrium that still clung to this “degrading
occupation.” At first only a few came; for, though tempted by the
high wages to be regularly paid in “cash,” there were many who still
preferred to go on working at some more genteel employment at
seventy-five cents a week and their board.
But in a short time the prejudice against factory labor wore away,
and the Lowell mills became filled with blooming and energetic New
England women. They were naturally intelligent, had mother-wit,
and fell easily into the ways of their new life. They soon began to
associate with those who formed the community in which they had
come to live, and were invited to their houses. They went to the
same church, and sometimes married into some of the best families.
Or if they returned to their secluded homes again, instead of being
looked down upon as “factory girls” by the squire’s or the lawyer’s
family, they were more often welcomed as coming from the
metropolis, bringing new fashions, new books, and new ideas with
them.
In 1831 Lowell was little more than a factory village. Several
corporations were started, and the cotton-mills belonging to them
were building. Help was in great demand; and stories were told all
over the country of the new factory town, and the high wages that
were offered to all classes of work-people,—stories that reached the
ears of mechanics’ and farmers’ sons, and gave new life to lonely
and dependent women in distant towns and farmhouses. Into this
Yankee El Dorado, these needy people began to pour by the various
modes of travel known to those slow old days. The stage-coach and
the canal-boat came every day, always filled with new recruits for
this army of useful people. The mechanic and machinist came, each
with his home-made chest of tools, and oftentimes his wife and little
ones. The widow came with her little flock and her scanty
housekeeping goods to open a boarding-house or variety store, and
so provided a home for her fatherless children. Many farmers’
daughters came to earn money to complete their wedding outfit, or
buy the bride’s share of housekeeping articles.
Women with past histories came, to hide their griefs and their
identity, and to earn an honest living in the “sweat of their brow.”
Single young men came, full of hope and life, to get money for an
education, or to lift the mortgage from the home-farm. Troops of
young girls came by stages and baggage-wagons, men often being
employed to go to other States and to Canada, to collect them at so
much a head, and deliver them at the factories.
A very curious sight these country girls presented to young eyes
accustomed to a more modern style of things. When the large
covered baggage-wagon arrived in front of a block on the
corporation, they would descend from it, dressed in various and
outlandish fashions, and with their arms brimful of bandboxes
containing all their worldly goods. On each of these was sewed a
card, on which one could read the old-fashioned New England name
of the owner. And sorrowful enough they looked, even to the fun-
loving child who has lived to tell the story; for they had all left their
pleasant country homes to try their fortunes in a great
manufacturing town, and they were homesick even before they
landed at the doors of their boarding-houses. Years after, this scene
dwelt in my memory; and whenever anyone said anything about
being homesick, there rose before me the picture of a young girl
with a sorrowful face and a big tear in each eye, clambering down
the steps at the rear of a great covered wagon, holding fast to a
cloth-covered bandbox, drawn up at the top with a string, on which
was sewed a paper bearing the name of Plumy Clay!
Some of these girls brought diminutive hair trunks covered with
the skin of calves, spotted in dun and white, even as when they did
skip and play in daisy-blooming meads. And when several of them
were set together in front of one of the blocks, they looked like their
living counterparts, reposing at noontide in the adjacent field. One of
this kind of trunks has been handed down to me as an heirloom.
The hair is worn off in patches; it cannot be invigorated, and it is
now become a hairless heirloom. Within its hide-bound sides are
safely stowed away the love-letters of a past generation,—love-
letters that agitated the hearts of the grandparents of to-day; and I
wonder that their resistless ardor has not long ago burst its wrinkled
sides. It is relegated to distant attics, with its ancient crony, “ye
bandbox,” to enjoy an honored and well-earned repose.
Ah me! when some of us, its contemporaries, are also past our
usefulness, gone clean out of fashion, may we also be as resigned,
yea, as willing, to be laid quietly on some attic shelf!
These country girls had queer names, which added to the
singularity of their appearance. Samantha, Triphena, Plumy, Kezia,
Aseneth, Elgardy, Leafy, Ruhamah, Lovey, Almaretta, Sarepta, and
Florilla were among them.
Their dialect was also very peculiar. On the broken English and
Scotch of their ancestors was ingrafted the nasal Yankee twang; so
that many of them, when they had just come daown, spoke a
language almost unintelligible. But the severe discipline and ridicule
which met them was as good as a school education, and they were
soon taught the “city way of speaking.”
Their dress was also peculiar, and was of the plainest of
homespun, cut in such an old-fashioned style that each young girl
looked as if she had borrowed her grandmother’s gown. Their only
head-covering was a shawl, which was pinned under the chin; but
after the first pay-day, a “shaker” (or “scooter”) sunbonnet usually
replaced this primitive head-gear of their rural life.
But the early factory girls were not all country girls. There were
others also, who had been taught that “work is no disgrace.” There
were some who came to Lowell solely on account of the social or
literary advantages to be found there. They lived in secluded parts of
New England, where books were scarce, and there was no cultivated
society. They had comfortable homes, and did not perhaps need the
money they would earn; but they longed to see this new “City of
Spindles,” of which they had heard so much from their neighbors
and friends, who had gone there to work.
And the fame of the circulating libraries, that were soon opened,
drew them and kept them there, when no other inducement would
have been sufficient.
The laws relating to women were such, that a husband could
claim his wife wherever he found her, and also the children she was
trying to shield from his influence; and I have seen more than one
poor woman skulk behind her loom or her frame when visitors were
approaching the end of the aisle where she worked. Some of these
were known under assumed names, to prevent their husbands from
trusteeing their wages. It was a very common thing for a male
person of a certain kind to do this, thus depriving his wife of all her
wages, perhaps, month after month. The wages of minor children
could be trusteed, unless the children (being fourteen years of age)
were given their time. Women’s wages were also trusteed for the
debts of their husbands, and children’s for the debts of their parents.
As an instance, my mother had some financial difficulties when I
was fifteen years old, and to save herself and me from annoyance,
she gave me my time. The document reads as follows:—

“Be it known that I, Harriet Hanson, of Lowell, in consideration
that my minor daughter Harriet J. has taken upon herself the whole
burden of her own support, and has undertaken and agreed to
maintain herself henceforward without expense to me, do hereby
release and quitclaim unto her all profits and wages which she may
hereafter earn or acquire by her skill or labor in any occupation,—
and do hereby disclaim all right to collect or interfere with the same.
And I do give and release unto her the absolute control and disposal
of her own time according to her own discretion, without
interference from me. It being understood that I am not to be
chargeable hereafter with any expense on her account.
(Signed) Harriet Hanson.
July 2, 1840.”

It must be remembered that at this date woman had no property
rights. A widow could be left without her share of her husband’s (or
the family) property, a legal “incumbrance” to his estate. A father
could make his will without reference to his daughter’s share of the
inheritance. He usually left her a home on the farm as long as she
remained single. A woman was not supposed to be capable of
spending her own or of using other people’s money. In
Massachusetts, before 1840, a woman could not legally be treasurer
of her own sewing-society, unless some man were responsible for
her.
The law took no cognizance of woman as a money-spender. She
was a ward, an appendage, a relict. Thus it happened, that if a
woman did not choose to marry, or, when left a widow, to re-marry,
she had no choice but to enter one of the few employments open to
her, or to become a burden on the charity of some relative.
In almost every New England home could be found one or more
of these women, sometimes welcome, more often unwelcome, and
leading joyless, and in many instances unsatisfactory, lives. The
cotton-factory was a great opening to these lonely and dependent
women. From a condition approaching pauperism they were at once
placed above want; they could earn money, and spend it as they
pleased; and could gratify their tastes and desires without restraint,
and without rendering an account to anybody. At last they had found
a place in the universe; they were no longer obliged to finish out
their faded lives mere burdens to male relatives. Even the time of
these women was their own, on Sundays and in the evening after
the day’s work was done. For the first time in this country woman’s
labor had a money value. She had become not only an earner and a
producer, but also a spender of money, a recognized factor in the
political economy of her time. And thus a long upward step in our
material civilization was taken; woman had begun to earn and hold
her own money, and through its aid had learned to think and to act
for herself.
Among the older women who sought this new employment were
very many lonely and dependent ones, such as used to be
mentioned in old wills as “incumbrances” and “relicts,” and to whom
a chance of earning money was indeed a new revelation. How well I
remember some of these solitary ones! As a child of eleven years, I
often made fun of them—for children do not see the pathetic side of
human life—and imitated their limp carriage and inelastic gait. I can
see them now, even after sixty years, just as they looked,—
depressed, modest, mincing, hardly daring to look one in the face,
so shy and sylvan had been their lives. But after the first pay-day
came, and they felt the jingle of silver in their pockets, and had
begun to feel its mercurial influence, their bowed heads were lifted,
their necks seemed braced with steel, they looked you in the face,
sang blithely among their looms or frames, and walked with elastic
step to and from their work. And when Sunday came, homespun
was no longer their only wear; and how sedately gay in their new
attire they walked to church, and how proudly they dropped their
silver fourpences into the contribution-box! It seemed as if a great
hope impelled them,—the harbinger of the new era that was about
to dawn for them and for all women-kind.
In passing, let me not forget to pay a tribute, also, to those noble
single and widowed women, who are “set solitary in families,” but
whose presence cements the domestic fabric, and whose influence is
unseen and oftentimes unappreciated, until they are taken away and
the integral part of the old home-life begins to crumble.
Except in rare instances, the rights of the early mill-girls were
secure. They were subject to no extortion; if they did extra work
they were always paid in full, and their own account of labor done
by the piece was always accepted. They kept the figures, and were
paid accordingly. This was notably the case with the weavers and
drawing-in girls. Though the hours of labor were long, they were not
over-worked; they were obliged to tend no more looms and frames
than they could easily take care of, and they had plenty of time to sit
and rest. I have known a girl to sit idle twenty or thirty minutes at a
time. They were not driven, and their work-a-day life was made
easy. They were treated with consideration by their employers, and
there was a feeling of respectful equality between them. The most
favored of the girls were sometimes invited to the houses of the
dignitaries of the mills, showing that the line of social division was
not rigidly maintained.
Their life in the factory was made pleasant to them. In those days
there was no need of advocating the doctrine of the proper relation
between employer and employed. Help was too valuable to be ill-
treated. If these early agents, or overseers, had been disposed to
exercise undue authority, or to establish unjust or arbitrary laws, the
high character of the operatives, and the fact that women employees
were scarce would have prevented it. A certain agent of one of the
first corporations in Lowell (an old sea-captain) said to one of his
boarding-house keepers, “I should like to rule my help as I used to
rule my sailors, but so many of them are women I do not dare to do
it.”
The knowledge of the antecedents of these operatives was the
safeguard of their liberties. The majority of them were as well born
as their “overlookers,” if not better; and they were also far better
educated.
The agents and overseers were usually married men, with
families of growing sons and daughters. They were members, and
sometimes deacons, of the church, and teachers in the same
Sunday-school with the girls employed under them. They were
generally of good morals and temperate habits, and often exercised
a good influence over their help. The feeling that the agents and
overseers were interested in their welfare caused the girls, in turn,
to feel an interest in the work for which their employers were
responsible. The conscientious among them took as much pride in
spinning a smooth thread, drawing in a perfect web, or in making
good cloth, as they would have done if the material had been for