Machine Learning Applications
IEEE Press
445 Hoes Lane
Piscataway, NJ 08854

IEEE Press Editorial Board


Sarah Spurgeon, Editor in Chief
Jón Atli Benediktsson, Anjan Bose, James Duncan, Amin Moeness,
Desineni Subbaram Naidu, Behzad Razavi, Jim Lyke, Hai Li, Brian Johnson,
Ahmet Murat Tekalp, Jeffrey Reed, Diomidis Spinellis, Adam Drobot,
Tom Robertazzi
Machine Learning Applications

From Computer Vision to Robotics

Edited by

Indranath Chatterjee
Department of Computer Engineering
Tongmyong University
Busan, South Korea

Sheetal Zalte
Department of Computer Science
Shivaji University
Kolhapur, Maharashtra, India
Copyright © 2024 by The Institute of Electrical and Electronics Engineers, Inc. All rights reserved.

Published by John Wiley & Sons, Inc., Hoboken, New Jersey.


Published simultaneously in Canada.

No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form
or by any means, electronic, mechanical, photocopying, recording, scanning, or otherwise, except as
permitted under Section 107 or 108 of the 1976 United States Copyright Act, without either the prior
written permission of the Publisher, or authorization through payment of the appropriate per-­copy fee
to the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, (978) 750-­8400, fax
(978) 750-­4470, or on the web at www.copyright.com. Requests to the Publisher for permission should
be addressed to the Permissions Department, John Wiley & Sons, Inc., 111 River Street, Hoboken, NJ
07030, (201) 748-­6011, fax (201) 748-­6008, or online at http://www.wiley.com/go/permission.

Trademarks: Wiley and the Wiley logo are trademarks or registered trademarks of John Wiley & Sons,
Inc. and/or its affiliates in the United States and other countries and may not be used without written
permission. All other trademarks are the property of their respective owners. John Wiley & Sons, Inc.
is not associated with any product or vendor mentioned in this book.

Limit of Liability/Disclaimer of Warranty: While the publisher and author have used their best efforts
in preparing this book, they make no representations or warranties with respect to the accuracy
or completeness of the contents of this book and specifically disclaim any implied warranties of
merchantability or fitness for a particular purpose. No warranty may be created or extended by
sales representatives or written sales materials. The advice and strategies contained herein may not
be suitable for your situation. You should consult with a professional where appropriate. Further,
readers should be aware that websites listed in this work may have changed or disappeared between
when this work was written and when it is read. Neither the publisher nor authors shall be liable for
any loss of profit or any other commercial damages, including but not limited to special, incidental,
consequential, or other damages.

For general information on our other products and services or for technical support, please contact our
Customer Care Department within the United States at (800) 762-­2974, outside the United States at
(317) 572-­3993 or fax (317) 572-­4002.

Wiley also publishes its books in a variety of electronic formats. Some content that appears in print
may not be available in electronic formats. For more information about Wiley products, visit our web
site at www.wiley.com.

Library of Congress Cataloging-­in-­Publication Data

Names: Chatterjee, Indranath, editor. | Zalte, Sheetal S., editor. | John


Wiley & Sons, publisher.
Title: Machine learning applications : from computer vision to robotics /
edited by Indranath Chatterjee, Sheetal Zalte.
Description: Hoboken, New Jersey : Wiley-IEEE Press, [2024] | Includes
bibliographical references and index.
Identifiers: LCCN 2023043411 (print) | LCCN 2023043412 (ebook) | ISBN
9781394173327 (cloth) | ISBN 9781394173334 (adobe pdf) | ISBN
9781394173341 (epub)
Subjects: LCSH: Machine learning–Industrial applications. | Machine
learning–Scientific applications. | Deep learning (Machine
learning)–Industrial applications. | Deep learning (Machine
learning)–Scientific applications.
Classification: LCC Q325.5 .M321323 2024 (print) | LCC Q325.5 (ebook) |
DDC 006.3/1–dc23/eng/20231023
LC record available at https://lccn.loc.gov/2023043411
LC ebook record available at https://lccn.loc.gov/2023043412

Cover Design: Wiley


Cover Image: © Yuichiro Chino/Getty Images

Set in 9.5/12.5pt STIXTwoText by Straive, Pondicherry, India



Contents

About the Authors xiii


Preface xv

1 Statistical Similarity in Machine Learning 1


Dmitriy Klyushin
1.1 ­Introduction 1
1.2 ­Featureless Machine Learning 2
1.3 ­Two-­Sample Homogeneity Measure 3
1.4 ­The Klyushin–Petunin Test 3
1.5 ­Experiments and Applications 4
1.6 ­Summary 6
­ References 6

2 Development of ML-Based Methodologies for Adaptive Intelligent
E-Learning Systems and Time Series Analysis Techniques 11
Indra Kumari, Indranath Chatterjee, and Minho Lee
2.1 ­Introduction 11
2.1.1 Machine Learning 12
2.1.2 Types of Machine Learning 12
2.1.3 Learning Methods 13
2.1.4 E-­Learning with Machine Learning 14
2.1.5 Need for Machine Learning 15
2.2 ­Methodological Advancement of Machine Learning 16
2.2.1 Automatic Learner Profiling Agent 16
2.2.2 Learning Materials’ Content Indexing Agent 17
2.2.3 Adaptive Learning 17
2.2.4 Proposed Research 18
2.2.5 Multi-­Perspective Learning 18
2.2.6 Machine Learning Recommender Agent for Customization 19
2.2.6.1 E-­Learning 19
2.2.7 Data Creation 19
2.2.8 Naïve Bayes Model 19
2.2.9 K-­Means Model 20
2.3 ­Machine Learning on Time Series Analysis 21
2.3.1 Time Series Representation 22
2.3.2 Time Series Classification 24
2.3.3 Time Series Forecasting 25
2.4 ­Conclusion 26
­ Acknowledgment 28
Conflict of Interest 28
­References 28

3 Time-Series Forecasting for Stock Market Using Convolutional
Neural Network 31
Partha Pratim Deb, Diptendu Bhattacharya, Indranath Chatterjee,
and Sheetal Zalte
3.1 ­Introduction 31
3.2 ­Materials 33
3.3 ­Methodology 33
3.3.1 The Convolutional Neural Network 34
3.4 ­Accuracy Measurement 35
3.5 ­Result and Discussion 35
3.6 ­Conclusion 47
­­ Acknowledgement 47
References 48

4 Comparative Study for Applicability of Color Histograms for CBIR
Used for Crop Leaf Disease Detection 49
Jayamala Kumar Patil, Sampada Abhijit Dhole, Vinay Sampatrao Mandlik,
and Sachin B. Jadhav
4.1 ­Introduction 49
4.2 ­Literature Review 50
4.3 ­Methodology 51
4.3.1 Color Features 52
4.3.1.1 RGB Color Model/Space 53
4.3.1.2 HSV Color Space 53
4.3.1.3 YCbCr Color Space 54
4.3.1.4 Color Histogram 54
4.3.2 Database 54
4.3.3 Parameters for Performance Analysis 57
4.3.4 Experimental Procedure for CBIR Using Color Histogram
for Detection of Disease 58
4.4 Results and Discussions 60
4.4.1 Results of CBIR Using Color Histogram for Detection of Soybean
Alfalfa Mosaic Virus Disease 60
4.4.2 Results of CBIR Using Color Histogram for Detection of Soybean
Septoria Brown Spot (SBS) Disease 62
4.4.3 Results of CBIR Using Color Histogram for Detection of Soybean
Healthy Leaf 63
4.5 ­Conclusion 63
­ References 65
­Biographies of Authors 67

5 Stock Index Forecasting Using RNN-Long Short-Term Memory 69
Partha Pratim Deb, Diptendu Bhattacharya, and Sheetal Zalte
5.1 ­Introduction 69
5.2 ­Materials 71
5.3 ­Methodology 71
5.3.1 RNN 71
5.3.2 LSTM 72
5.4 ­Result and Discussion 73
5.4.1 Comparison Table for the Method TAIEX 80
5.4.2 Comparison Table for Method BSE-­SENSEX 80
5.4.3 Comparison Table for Method KOSPI 80
5.5 ­Conclusion 81
­ Acknowledgement 83
­References 84

6 Study and Analysis of Machine Learning Models for Detection
of Phishing URLs 85
Shreyas Desai, Sahil Salunkhe, Rashmi Deshmukh, and Shital Gaikwad
6.1 ­Introduction 85
6.2 ­Literature Review 86
6.3 ­Methodology 87
6.3.1 Proposed Work 87
6.3.2 Traditional Methods 87
6.3.2.1 Blacklist Method 88
6.3.2.2 Heuristic-­Based Model 88
6.3.2.3 Visual Similarity 89
6.3.2.4 Machine Learning–Based Approach 89
6.4 ­Results and Experimentation 89
6.4.1 Dataset Creation 89
6.4.2 Feature Extraction 90
6.4.3 Training Data and Comparison 90
6.4.3.1 XGB (eXtreme Gradient Boosting) 90
6.4.3.2 Logistic Regression (LR) 90
6.4.3.3 RFC (Random Forest Classifier) 91
6.4.3.4 Decision Tree 91
6.4.3.5 SVM (Support Vector Machines) 91
6.4.3.6 KNN (K-­Nearest Neighbors) 91
6.5 ­Model-­Metric Analysis 91
6.6 ­Conclusion 94
­References 94

7 Real-World Applications of BC Technology in Internet
of Things 97
Pardeep Singh, Ajay Kumar, and Mayank Chopra
7.1 ­Introduction 97
7.1.1 Relevance and Benefits of Blockchain Technology
Applications 98
7.2 ­Review of Existing Study 100
7.3 ­Background of Blockchain 101
7.3.1 Blockchain Stakeholders 101
7.3.2 What is Bitcoin? 102
7.3.3 Emergence of Bitcoin 102
7.3.4 Working of Bitcoin 102
7.3.5 Risk in Bitcoin 103
7.3.6 Legal Issues in Bitcoin 103
7.4 ­Blockchain Technology in Internet of Things 104
7.4.1 Need of Integrating Blockchain with IoT 104
7.4.1.1 IoT Data Traceability and Reliability 105
7.4.1.2 Superior Interoperability 105
7.4.1.3 Increased Security 105
7.4.1.4 IoT System Autonomous Interactions 106
7.4.2 Hyperledger 106
7.4.3 Ethereum 107
7.4.4 IOTA 107
7.5 ­Challenges and Concerns in Integrating Blockchain with
the IoT 108
7.5.1 Blockchain Challenges and Concern 108
7.5.1.1 Scalability 108
7.5.1.2 Privacy Infringement 109
7.5.2 Privacy and Security issues with Internet of Things 109
7.6 ­Blockchain Applications for the Internet of Things
(BIoT Applications) 110
7.6.1 BIoT Applications for Smart Agriculture 111
7.6.2 Blockchain for Smart Agriculture 111
7.6.3 Intelligent Irrigation Driven by IoT 111
7.7 ­Application of BIoT in Healthcare 112
7.7.1 Interoperability 113
7.7.2 Improved Analytics and Data Storage 113
7.7.3 Increased Security 113
7.7.4 Immutability 114
7.7.5 Quicker Services 114
7.7.5.1 Transparency 114
7.8 ­Application of BIoT in Voting 115
7.9 ­Application of BIoT in Supply Chain 116
7.10 ­Summary 116
­References 117

8 Advanced Persistent Threat: Korean Cyber Security Knack
Model Impost and Applicability 123
Indra Kumari and Minho Lee
8.1 ­Introduction 123
8.2 ­Background Study 124
8.3 ­Literature Review 126
8.4 ­Research Questions 131
8.5 ­Research Objectives 131
8.6 ­Research Hypothesis 131
8.7 ­Phases of APT Outbreak 131
8.7.1 Gain Access 132
8.7.2 Establish Foothold 132
8.7.3 Deepen Access 133
8.7.4 Move Laterally 133
8.7.5 Look, Learn, and Remain 133
8.8 ­Research Methodology 134
8.8.1 South Korea Cyber Security Initiatives and Applicability 135
8.8.2 Korea’s Cyber-­Security Program Proposals 137
8.8.2.1 Modernized Multi-­Negotiator Retreat Arrangement 137
8.8.2.2 Headway of the Realms Exemplary 137
8.8.2.3 Scrutiny of Over apt in Cyber Retreat 137
8.8.2.4 Indiscriminate Inconsistency Revealing 138
8.9 ­A Deception Exemplary of Counter-­Offensive 138
8.10 ­Conclusion 141
­­ Acknowledgment 142
Conflict of Interest 142
­ References 142
9 Integration of Blockchain Technology and Internet of Things:
Challenges and Solutions 145
Aman Kumar Dhiman and Ajay Kumar
9.1 ­Introduction 145
9.2 ­Overview of Blockchain–IoT Integration 146
9.3 ­How Blockchain–IoT Work Together 146
9.3.1 Network in IoT Devices 147
9.3.2 Network in IoT with Blockchain Technology 148
9.3.3 Data Flow in IoT Devices 148
9.3.4 Data Flow in IoT with Blockchain 149
9.3.5 The Role of Blockchain in IoT 149
9.3.6 The Role of IoT in Blockchain 150
9.4 ­Blockchain–IoT Applications 151
9.5 ­Related Studies on Integration of IoT and Blockchain Applications 153
9.6 ­Challenges of Blockchain–IoT Integration 155
9.7 ­Solutions of Blockchain-­IoT Integration 155
9.8 ­Future Directions for Blockchain–IoT Integration 156
9.9 ­Conclusion 157
­ References 157

10 Machine Learning Techniques for SWOT Analysis of
Online Education System 161
Priyanka P. Shinde, Varsha P. Desai, T. Ganesh Kumar,
Kavita S. Oza, and Sheetal Zalte-­Gaikwad
10.1 ­Introduction 161
10.2 ­Motivation 162
10.3 ­Objectives 163
10.4 ­Methodology 163
10.5 ­Dataset Preparation 164
10.6 ­Data Visualization and Analysis 170
10.6.1 Observations 171
10.7 ­Machine Learning Techniques Implementation 178
10.7.1 K-­Nearest Neighbors 178
10.7.2 Decision Tree 178
10.7.3 Random Forest 178
10.7.4 Support Vector Machine 179
10.7.5 Logistic Regression 179
10.8 ­Conclusion 179
­ References 180
11 Crop Yield and Soil Moisture Prediction Using Machine Learning
Algorithms 183
Debarghya Acharjee, Nibedita Mallik, Dipa Das, Mousumi Aktar,
and Parijata Majumdar
11.1 ­Introduction 183
11.2 ­Literature Review 185
11.3 ­Methodology 187
11.4 ­Result and Discussion 190
11.5 ­Conclusion 191
­ References 193

12 Multirate Signal Processing in WSN for Channel Capacity and
Energy Efficiency Using Machine Learning 195
Prashant R. Dike, T. S. Vishwanath, V. M. Rohokale, and D. S. Mantri
12.1 ­Introduction 195
12.2 ­Energy Management in WSN 197
12.3 ­Different Strategies to Increase Energy Efficiency 197
12.4 ­Algorithm Development 198
12.5 ­Results 202
12.6 ­Summary 203
­References 203

13 Introduction to Mechanical Design of AI-­Based Robotic System 207


Mohammad Zubair
13.1 ­Introduction 207
13.2 ­Mechanisms in a Robot 209
13.2.1 Serial Manipulator 209
13.2.2 Parallel Manipulator 209
13.3 ­Kinematics 212
13.3.1 Degree of Freedom 214
13.3.2 Position and Orientation in a Robotic System 215
13.4 ­Conclusion 216
­­­ Acknowledgment 217
Conflict of Interest 217
References 217

Index 219

About the Authors

Dr. Indranath Chatterjee is working as
a professor in the Department of Computer
Engineering at Tongmyong University,
Busan, South Korea. He received his PhD
in Computational Neuroscience from the
Department of Computer Science,
University of Delhi, Delhi, India. His
research areas include computational
­neuroscience, schizophrenia, medical
imaging, fMRI, and machine learning. He
has authored and edited nine books on
computer science and neuroscience
­published by renowned international
publishers. To date, he has published more
than 70 research papers in international
journals and conferences. He is the
recipient of various global awards in neuroscience. He is serving as the Chief
Section Editor of a few renowned international journals and as a member of the
advisory board and editorial board of various international journals and open-­
science organizations worldwide. He is working on several projects of government
and non-­government organizations as PI/co-­PI, related to medical imaging and
machine learning for a broader societal impact, in collaboration with several
universities globally. He is an active professional member of the Association of
Computing Machinery (ACM, USA), the Organization of Human Brain Mapping
(OHBM, USA), the Federation of European Neuroscience Society (FENS,
Belgium), the Association for Clinical Neurology and Mental Health (ACNM,
India), and the International Neuroinformatics Coordinating Facility (INCF,
Sweden).

Dr. Sheetal S. Zalte-Gaikwad is an
assistant professor in the Computer
Science Department at Shivaji University,
Kolhapur, India. She received her BSc and MSc degrees from Pune University,
India, and earned her PhD in mobile ad-hoc networks at Shivaji University,
India. She has 14 years of teaching experience in computer science. She has
published more than 40 research papers in reputed international journals and
conferences. She has also published book chapters with Springer, Bentham,
CRC Press (Taylor & Francis), and Wiley. Her research areas are MANET,
VANET, and blockchain security. She has also authored a book, Computational
Theory: Problems and Solutions, and worked as the lead editor for the book
Synergistic Interaction of Big Data with Cloud Computing for Industry 4.0
(CRC Press, Taylor & Francis, USA).

Preface

In our rapidly evolving world, the transformative power of machine learning
(ML) and deep learning (DL) technologies is undeniable. From robotics and vehi-
cle automation to financial services, retail, manufacturing, healthcare, and
beyond, ML and DL are revolutionizing industries and driving improvements in
business operations. The potential of these advanced technologies to enhance our
lives and reshape our future is immense.
In this book, we delve into the remarkable advancements made possible by ML
and DL, showcasing case studies that demonstrate how these technologies have
facilitated breakthroughs in business intelligence, enabling faster and more effi-
cient decision-­making processes. We explore a wide range of applications, from
facial recognition to natural language processing, and illustrate how ML and DL
play a central role in the continuous learning and data simulation capabilities of
cars in real-­time.
While it is crucial to acknowledge the potential challenges and implications
associated with ML and DL, it is equally important to recognize the positive
impact they can have on our society. This book aims to shed light on real-­world
examples that highlight how ML and DL can create better technology to support
modern thinking. Whether you are a novice or a specialist in the field, these cap-
tivating case studies will offer valuable insights into various applications where
ML and DL techniques play a significant role.
Within these pages, we uncover the inner workings of ML algorithms, revealing
how they transform digital images, which are mere series of numbers, into mean-
ingful patterns through image processing techniques. We also explore the com-
plex landscapes of risk modeling, genomic sequencing, and modeling, where ML
and DL implementations require extensive cloud environments with high-­
performance data processing and management capabilities.
Moreover, we examine the competitive landscape of ML-­ and DL-­based plat-
forms, where major vendors such as Amazon, Google, Microsoft, IBM, and others
vie for customers by offering comprehensive services encompassing data collec-
tion, classification, modeling, training, and application deployment.
The influence of ML and DL technologies transcends boundaries,
revolutionizing nearly every industry worldwide. This book is dedicated to
providing extensive coverage of these groundbreaking technologies and illustrat-
ing how they are reshaping industries and our lives.
We explore the vast domain of computer vision and its wide-­ranging applica-
tions, from everyday life scenarios to the Internet of Things and brain–computer
interfaces. With the ability to detect and track humans across multiple streams of
data, ML and computer vision represent significant leaps forward, offering tre-
mendous potential in terms of efficiency, productivity, revenue generation, and
profitability.
We also examine the critical role played by ML and computer vision in our digi-
tal society. They empower individuals with great ideas and limited resources to
succeed in business while also enabling established enterprises to harness and
analyze the data they collect. Moreover, we highlight how ML contributes to
cybersecurity by effectively tracking and preventing monetary frauds online,
using examples like PayPal’s ML-­powered tools for detecting money laundering.
Throughout this book, we aim to cultivate an understanding of the vital impor-
tance of ML and computer vision in our AI-­driven era. By exploring real-­world
applications across diverse disciplines and daily-­life scenarios, we hope to provide
readers with state-­of-­the-­art algorithms and practical insights that underscore the
value of AI in future applications.
Embark on this journey with us as we uncover the exciting world of ML and DL,
where cutting-­edge technology meets real-­world impact. May this book empower
you to grasp the immense potential of these technologies and inspire you to
explore and contribute to their further advancement.
Enjoy the exploration!

November 2023

Indranath Chatterjee
Busan, South Korea

Sheetal Zalte
Kolhapur, Maharashtra, India
1

Statistical Similarity in Machine Learning


Dmitriy Klyushin
Department of Computer Science and Cybernetics, Kyiv, Ukraine

1.1 Introduction

In machine learning, the accuracy of algorithms depends on how accurately the
hypothesis about the proximity of objects in the feature space is fulfilled. It is this
property that guarantees the possibility of generalization based on training samples.
The hypothesis of proximity (similarity) of objects in the feature space assumes that
objects of one class form a compact region with a smooth boundary. A classic
demonstration of this conjecture is the famous Fisher iris problem, in which points
of three classes form easily separable and dense clouds on a plane. This problem
illustrates both the strength and the weakness of the compactness hypothesis. The
strength of this hypothesis is that we can easily draw boundaries between sets of
points and classify them. The weakness of the compactness hypothesis is that
we cannot generalize it to the case when the object is defined not by one point, but
by many points. Such situations often arise in medical research, when we take a lot
of cells from a patient and measure different features of these cells. As a result, a
patient is represented not by a vector in a feature space, but by a matrix of feature
samples (moreover, the order of the numbers in the columns of this matrix is
random). Of course, it is possible to reduce this matrix to a vector by averaging
values in the columns and considering only a vector of means, but it is obvious that
this leads to the loss of important information about the distribution of feature values.
In fact, what can be said about a distribution, knowing only the estimate of its
mathematical expectation?

Machine Learning Applications: From Computer Vision to Robotics, First Edition.


Edited by Indranath Chatterjee and Sheetal Zalte.
© 2024 The Institute of Electrical and Electronics Engineers, Inc.
Published 2024 by John Wiley & Sons, Inc.
The hypothesis of compactness ignores the randomness of training data, so we
must replace it with an alternative postulate on the proximity between random
samples, guided by the laws of mathematical statistics. We propose to use the
well-­known concept of sample homogeneity in mathematical statistics, i.e. a
hypothesis that samples are drawn from the same distribution. Returning to the
terminology of machine learning, this means that samples of features of objects
have identical distributions. Within this approach, we can use a wide variety of
statistical criteria to test the homogeneity hypothesis.
In this chapter, we introduce an alternative concept of proximity in machine
learning and propose to use the hypothesis about homogeneity of samples
instead of the hypothesis of compactness, as well as provide examples of its
effective use.

1.2 Featureless Machine Learning

The pioneers of featureless, or relational, machine learning were the scientific
schools of Duin (Duin et al. 1997, 1999; Pekalska and Duin 2001; Pekalska and Duin 2005)
and Mottl (Mottl et al. 2001, 2002, 2017; Seredin et al. 2012). Their idea was to
replace a feature vector of an object by a similarity measure to the training dataset
using a metric. It is obvious that this is not a solution to the problem of classifica-
tion of objects using a matrix of feature values. The point is that in such cases, it
is necessary to use not geometric but statistical tools, for example, two-­sample
tests of homogeneity, such as the Kolmogorov–Smirnov test and the Mann–
Whitney–Wilcoxon test. Using these criteria, we can test the hypothesis that fea-
ture samples are homogeneous. However, this is not a complete solution of the
posed problem. Testing the homogeneity hypothesis using the tests mentioned
above and various other tests, for example, Cramer–von Mises and Anderson–
Darling, we cannot obtain a numerical measure of similarity. These tests provide
only p-­values that denote the probability of the samples being homogeneous. We
shall describe a solution allowing measuring the similarity between samples as
follows.
To fill the distance matrix, Euclidean and pseudo-­Euclidean distances, as well
as kernels, are used. It is quite obvious that such an approach is not acceptable for
estimating the similarity between matrices whose columns are random samples
of features. The use of metrics in such cases is impossible.
Recently, the minimal learning machine (Kulis 2013) and the extreme minimal
learning machine (de Souza Junior et al. 2015) were developed. The authors used
nonlinear distance regression, estimating dissimilarity between objects. There are
numerous metrics and learning techniques in this field (Mesquita et al. 2017; Caldas
et al. 2018; Florêncio et al. 2018; Maia et al. 2018; Cao et al. 2019; Kärkkäinen 2019;

Bicego 2020; Florêncio et al. 2020; Nanni et al. 2020; Silva et al. 2020, etc.). Detailed
surveys of these issues are provided in Costa et al. (2020) and Hämäläinen et al. (2020).
All these methods use the Euclidean distance. Therefore, they are unacceptable for
solving the problem stated above: to classify objects represented by matrices of
independent identically distributed random values.
Our goal is to extend the featureless approach to similarity-­based classification
using the nonparametric similarity measure and nonparametric two-­sample test
of homogeneity. Due to the nonparametric nature of these tools, we do not use
any assumption about a hypothetical distribution of training sample. Also, as we
shall demonstrate below, these tools are universal in the sense that using the pro-
posed test, we can test the homogeneity hypothesis for all possible variants: differ-
ent location parameters and the same scale parameter, the same location
parameter and different scale parameters, and both different location and scale
parameters. The proposed similarity measure also is universal because it is appli-
cable to both samples without ties and with ties (duplicates).

1.3 Two-Sample Homogeneity Measure

Consider training samples a = (a1, a2, …, an) ∈ A and b = (b1, b2, …, bn) ∈ B from
populations A and B obeying distributions F and G that are absolutely continuous.
The classification problem for a test sample c = (c1, c2, …, cn) is reduced to testing
the homogeneity of c and a, and of c and b. There are various nonparametric two-
sample tests of homogeneity (Derrick et al. 2019). However, every test has its own
drawbacks. For example, the Kolmogorov–Smirnov test is universal in the
sense that it tests the general hypothesis F = G, but it is very sensitive to outliers
and needs large samples. The Wilcoxon signed-rank test is not universal
because it tests only the hypothesis about a location shift (i.e. whether E(a) signifi-
cantly differs from E(c)). In our opinion, the most effective and universal tool was
developed in Klyushin and Petunin (2003).
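
To make the p-value-only behavior concrete, here is a minimal sketch (an
illustration assuming NumPy and SciPy; the sample values and the shift of 0.5
are ours, and this is not the chapter's method):

```python
import numpy as np
from scipy.stats import ks_2samp, mannwhitneyu

rng = np.random.default_rng(42)
a = rng.normal(0.0, 1.0, 100)  # sample from F
c = rng.normal(0.5, 1.0, 100)  # sample from G with a location shift

# Both tests answer "are F and G the same?" with a single p-value only;
# they do not return a graded measure of similarity between the samples.
print("Kolmogorov-Smirnov p-value:", ks_2samp(a, c).pvalue)
print("Mann-Whitney-Wilcoxon p-value:", mannwhitneyu(a, c).pvalue)
```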

1.4 The Klyushin–Petunin Test

The two-sample test of homogeneity (Klyushin and Petunin 2003) is
nonparametric. This test uses Hill's assumption (Hill 1968): for exchangeable
random values a_1, a_2, …, a_n ∈ F with a continuous distribution, we have

\[
P\left( x \in \left( a_{(i)}, a_{(j)} \right) \right) = \frac{j - i}{n + 1}, \quad j > i, \tag{1.1}
\]

where a(i) and a(j) are order statistics and x obeys F.

Let us find

\[
h_{ij} = \frac{\#\left\{ c_k \in \left( a_{(i)}, a_{(j)} \right) \right\}}{n}
\]

and estimate the deviation of the observable relative frequency h_ij from the
expected probability (1.1) by constructing a confidence interval for the
probability of success in the Bernoulli scheme (Pires and Amado 2008). Since the
p-statistics is invariant with respect to the selection of a confidence interval for a
binomial proportion (Klyushin and Martynenko 2021), we may select the
simplest one for computations, the Wilson confidence interval
\( I_{ij}^{(n)} = \left( p_{ij}^{(1)}, p_{ij}^{(2)} \right) \), where

\[
p_{ij}^{(1)} = \frac{h_{ij} n + 0.5 g^2 - g \sqrt{h_{ij} \left( 1 - h_{ij} \right) n + 0.25 g^2}}{n + g^2}, \qquad
p_{ij}^{(2)} = \frac{h_{ij} n + 0.5 g^2 + g \sqrt{h_{ij} \left( 1 - h_{ij} \right) n + 0.25 g^2}}{n + g^2}. \tag{1.2}
\]

If g = 3, the confidence level of (1.2) is greater than 0.95 (Klyushin and
Petunin 2003). Since the number of all intervals \( \left( a_{(i)}, a_{(j)} \right) \)
with i < j equals \( N = \frac{n(n-1)}{2} \), the homogeneity measure for samples
a and c is

\[
h = \frac{1}{N} \# \left\{ p_{ij} = \frac{j - i}{n + 1} \in I_{ij}^{(n)} \right\}. \tag{1.3}
\]

Note that h in (1.3) is also a binomial proportion. Therefore, the test for
homogeneity may be formulated in the following way: the samples are
homogeneous if the confidence interval for the binomial proportion h covers
0.95; otherwise, homogeneity is rejected.
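
The computation is easy to sketch in code. The following is a minimal reading
of equations (1.1)–(1.3) in Python with NumPy (the function name p_statistic
is ours, and the code assumes equal sample sizes as in the chapter's setup; it is
an illustration, not the authors' implementation):

```python
import numpy as np

def p_statistic(a, c, g=3.0):
    """Homogeneity measure h of (1.3) between samples a and c.

    A sketch of equations (1.1)-(1.3); assumes len(c) == len(a) == n,
    as in the chapter's setup. Loops over all N = n(n-1)/2 intervals,
    so it is slow for large n.
    """
    a = np.sort(np.asarray(a, dtype=float))
    c = np.asarray(c, dtype=float)
    n = len(a)
    hits, total = 0, 0
    for i in range(n):
        for j in range(i + 1, n):
            # h_ij: relative frequency of c falling into (a_(i), a_(j))
            h_ij = np.count_nonzero((c > a[i]) & (c < a[j])) / n
            # Wilson confidence interval (1.2) with parameter g
            center = h_ij * n + 0.5 * g ** 2
            margin = g * np.sqrt(h_ij * (1.0 - h_ij) * n + 0.25 * g ** 2)
            lo = (center - margin) / (n + g ** 2)
            hi = (center + margin) / (n + g ** 2)
            # Expected probability (1.1); the difference j - i is the same
            # for 0-based loop indices and 1-based order statistics
            p_ij = (j - i) / (n + 1)
            hits += int(lo <= p_ij <= hi)
            total += 1
    return hits / total
```

Following the rule above, two samples would be declared homogeneous when the
confidence interval for the binomial proportion h covers 0.95.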

1.5 Experiments and Applications

Consider the results of two numerical experiments in which pairs of samples were
drawn from the normal distributions Gaussian(α, 1) and Gaussian(0, 1) (location
shift) and Gaussian(0, 1 − α) and Gaussian(0, 1) (scale shift). For each α, we
considered 100 pairs of samples containing 100 random numbers each. The
p-statistics and the p-value of the Kolmogorov–Smirnov statistics (KS-statistics)
were averaged. The null hypothesis is accepted if the p-statistic is greater than 0.95
or the p-value of the KS-test is greater than 0.05. We tested hypotheses about shifts
of the location and scale parameters. In the first case, the null hypothesis supposes
that the distributions have the same mathematical expectation. In the latter case,
the null hypothesis
supposes that distributions have the same standard deviations. The results are
presented in Figure 1.1.

Figure 1.1 Behavior of P- and KS-statistics in testing the location and scale shift
hypotheses. (The figure plots the p-statistics and the p-value of the KS-statistics
against α from 0.0 to 1.0, with four curves: P-statistics (location shift),
P-statistics (scale shift), KS-statistics (location shift), and KS-statistics
(scale shift).)
In Figure 1.1, we see that the p-statistics decreases as α increases. The point
where we can reject the null hypothesis about both the location and scale shift is
α = 0.3. The Kolmogorov–Smirnov test detects the location shift when α = 0.1
and the scale shift when α = 0.2.
The graph demonstrates the very high sensitivity of both tests. But the high
sensitivity of the KS-statistics has a negative side: for α > 0.2, the p-value of the
KS-test stays close to 0. Therefore, it merely recognizes the fact that the
distributions are different but does not estimate the magnitude of this difference.
In contrast, the graph of the p-statistics is monotonic and may be used as a
similarity measure between samples over the whole range of the parameter α.
These results are easily reproduced for pairs of samples drawn from various
distributions (lognormal, uniform, gamma, etc.). Examples are given in Klyushin
and Martynenko (2021).
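
A rough reproduction sketch of the location-shift experiment (reusing the
hypothetical p_statistic function above together with SciPy's KS test; the chapter
does not give its experimental code, so this only mirrors the described setup)
might look like:

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
for alpha in np.round(np.arange(0.0, 1.05, 0.1), 1):
    p_stats, ks_pvals = [], []
    for _ in range(100):                    # 100 pairs of samples
        a = rng.normal(alpha, 1.0, 100)     # Gaussian(alpha, 1)
        c = rng.normal(0.0, 1.0, 100)       # Gaussian(0, 1)
        p_stats.append(p_statistic(a, c))
        ks_pvals.append(ks_2samp(a, c).pvalue)
    print(f"alpha={alpha}: mean p-statistic={np.mean(p_stats):.3f}, "
          f"mean KS p-value={np.mean(ks_pvals):.3f}")
```

The scale-shift experiment is analogous, drawing a from Gaussian(0, 1 − alpha)
instead.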
The test based on the p-statistics successfully estimated the similarity and
difference between the feature samples of patients with breast cancer (Andrushkiw
et al. 2007), detected change points in time series (Klyushin and Martynenko 2021),
and compared forecasting models for the COVID-19 epidemic curve
(Klyushin 2021). The applications of the p-statistics are not bounded by the
abovementioned problems. It may be useful, for instance, for testing whether two
rankings come from the same distribution (Balázs et al. 2022) and for constructing
a statistical depth (Goibert et al. 2022). In the best case, the proposed test is more
effective due to its universal nature; in the worst case, it is as effective as the
Kolmogorov–Smirnov and other tests (Klyushin 2021). The p-statistics is a
so-called "soft" similarity measure. In contrast to other tests, the p-statistics is
stable with respect to outliers and anomalies. Therefore, it is a natural measure of
similarity between two samples.

1.6 Summary

The ability of a machine learning algorithm to generalize results obtained from
training datasets depends on the underlying hypotheses. Classical discriminant
analysis uses the compactness hypothesis. This geometric hypothesis is not
applicable to the classification of random samples since for such samples, the
concept of distance is meaningless. We propose an alternative concept of
homogeneity of objects, which are considered homogeneous if their sample
features have identical distribution. The Klyushin–Petunin two-­sample test
successfully tests the hypotheses about location and scale shift with high sensitivity
and specificity. The future scope of the work is related to the development of
analogous tests for multivariate samples.

­References

Andrushkiw, R.I., Boroday, N.V., Klyushin, D.A., and Petunin, Y.I. (2007). Computer-­
Aided Cytogenetic Method of Cancer Diagnosis. New York: Nova Publishers.
Balázs, R., Baranyi, M., and Héberger, K. (2022). Testing rankings with cross-­
validation. arXiv https://doi.org/10.48550/arXiv.2105.11939.
Bicego, M. (2020). Dissimilarity random forest clustering. IEEE International
Conference on Data Mining (ICDM), Sorrento, Italy (17–20 November 2020),
pp. 936–941. IEEE. https://doi.org/10.1109/ICDM50108.2020.00105.
Caldas, W.L., Gomes, J.P.P., and Mesquita, D.P.P. (2018). Fast Co-­MLM: an efficient
semi-­supervised co-­training method based on the minimal learning machine. New
Generation Computing 36: 41–58. https://doi.org/10.1007/s00354-­017-­0027-­x.
Cao, H., Bernard, S., Sabourin, R., and Heutte, L. (2019). Random forest
dissimilarity based multi-­view learning for radiomics application. Pattern
Recognition 88: 185–197. https://doi.org/10.1016/j.patcog.2018.11.011.

Costa, Y.M.G., Bertolini, D., Britto, A.S. et al. (2020). The dissimilarity approach: a
review. Artificial Intelligence Review 53: 2783–2808. https://doi.org/10.1007/
s10462-­019-­09746-­z.
Derrick, B., White, P., and Toher, D. (2019). Parametric and non-­parametric tests for
the comparison of two samples which both include paired and unpaired
observations. Journal of Modern Applied Statistical Methods 18: eP2847.
https://doi.org/10.22237/jmasm/1556669520.
Duin, R.P.W., de Ridder, D., and Tax, D.N.J. (1997). Experiments with a featureless
approach to pattern recognition. Pattern Recognition Letters 18: 1159–1166.
https://doi.org/10.1016/S0167-­8655(97)00138-­4.
Duin, R.P.W., Pekalska, E., and de Ridder, D. (1999). Relational discriminant analysis.
Pattern Recognition Letters 20: 1175–1181. https://doi.org/10.1016/S0167-
­8655(99)00085-­9.
Florêncio, J.A., Dias, M.L.D., and de Souza Junior, A.H. (2018). A fuzzy c-means-
based approach for selecting reference points in minimal learning machines. In:
Fuzzy Information Processing (ed. G.A. Barreto and R. Coelho), 398–407. Cham:
Springer International Publishing. https://doi.org/10.1007/978-­3-­319-­95312-­0_34.
Florêncio, J.A., Oliveira, S.A., Gomes, J.P., and da Rocha Neto, A.R. (2020). A new
perspective for minimal learning machines: a lightweight approach.
Neurocomputing 401: https://doi.org/10.1016/j.neucom.2020.03.088.
Goibert, M., Clémençon, S., Irurozki, E., and Mozharovskyi, P. (2022). Statistical depth
functions for ranking distributions: definitions, statistical learning and applications.
Proceedings of the 25th International Conference on Artificial Intelligence and
Statistics AISTATS 2022, Valence, Spain (28–30 March 2022). https://hal.archives-­
ouvertes.fr/hal-­03537148/document. https://doi.org/10.48550/arXiv.2201.08105.
Hämäläinen, J., Alencar, A., Kärkkäinen, T. et al. (2020). Minimal learning machine:
theoretical results and clustering-­based reference point selection. Journal of
Machine Learning Research 21: 1–29. http://jmlr.org/papers/v21/19-­786.html.
Hill, B.M. (1968). Posterior distribution of percentiles: bayes’ theorem for sampling
from a population. Journal of the American Statistical Association 63: 677–691.
Kärkkäinen, T. (2019). Extreme minimal learning machine: ridge regression with
distance-­based basis. Neurocomputing 342: 33–48. https://doi.org/10.1016/j.
neucom.2018.12.078.
Klyushin, D. (2021). Non-­parametric k-­sample tests for comparing forecasting
models. Polibits 62: 33–41. http://www.polibits.gelbukh.com/2020_62/Non-­
Parametric%20k-­Sample%20Tests%20for%20Comparing%20Forecasting%20Models.
pdf. https://doi.org/10.17562/PB-­62-­4.
Klyushin, D. and Martynenko, I. (2021). Nonparametric test for change point
detection in time series. Proceeding of 3rd International Workshop ʻModern
Machine Learning Technologies and Data Scienceʼ, MoMLeT&DS 2021. Volume I:
Main Conference, Lviv-­Shatsk, Ukraine (5–6 June 2021), pp. 117–127. https://
ceur-­ws.org/Vol-­2917/paper11.pdf (accessed 12 November 2022).

Klyushin, D.A. and Petunin, Y.I. (2003). A nonparametric test for the equivalence of
populations based on a measure of proximity of samples. Ukrainian Mathematical
Journal 55: 181–198. https://doi.org/10.1023/A:1025495727612.
Kulis, B. (2013). Metric learning: a survey. Foundations and Trends in Machine
Learning 5: 287–364. https://doi.org/10.1561/2200000019.
Maia, A.N., Dias, M.L.D., Gomes, J.P.P., and da Rocha Neto, A.R. (2018). Optimally
selected minimal learning machine. In: Intelligent Data Engineering and
Automated Learning – IDEAL (ed. H. Yin, D. Camacho, P. Novais, and A.J.
Tallón-­Ballesteros), 670–678. Cham: Springer International Publishing. https://doi.
org/10.1007/978-­3-­030-­33617-­2.
Mesquita, D.P.P., Gomes, J.P.P., and de Souza Junior, A.H. (2017). Ensemble of
efficient minimal learning machines for classification and regression. Neural
Processing Letters 46: 751–766. https://doi.org/10.1007/s11063-­017-­9587-­5.
Mottl, V., Dvoenko, S., Seredin, O. et al. (2001). Featureless pattern recognition in an
imaginary Hilbert space and its application to protein fold classification. Machine
Learning and Data Mining in Pattern Recognition, Leipzig, Germany (25–27 July
2001), pp. 322–336. Lecture Notes in Computer Science, 2123. https://doi.org/
10.1007/3-­540-­44596-­X_26.
Mottl, V., Seredin, O., Dvoenko, S. et al. (2002). Featureless pattern recognition in an
imaginary Hilbert space. International Conference on Pattern Recognition 2: 88–91.
https://doi.org/10.1109/ICPR.2002.1048244.
Mottl, V., Seredin, O., and Krasotkina, O. (2017). Compactness hypothesis, potential
functions, and rectifying linear space. Machine Learning: International Conference
Commemorating the 40th Anniversary of Emmanuil Braverman’s Decease, Boston,
MA, USA (28–30 April 2017), Invited Talks. https://doi.org/10.1007/978-­3-­319-­
99492-­5_3.
Nanni, L., Rigo, A., Lumini, A., and Brahnam, S. (2020). Spectrogram classification
using dissimilarity space. Applied Sciences 10: 4176. https://doi.org/10.3390/
app10124176.
Pekalska, E. and Duin, R.P.W. (2001). On combining dissimilarity representations. In:
Multiple Classifier Systems,. LNCS, 2096 (ed. J. Kittler and F. Roli), 359–368. Berlin:
Springer–Verlag. https://doi.org/10.1007/3-­540-­48219-­9_36.
Pekalska, E. and Duin, R.P.W. (2005). The Dissimilarity Representation for Pattern
Recognition, Foundations and Applications. Singapore: World Scientific.
Pires, A.M. and Amado, C. (2008). Interval estimators for a binomial proportion:
Comparison of twenty methods. REVSTAT–Statistical Journal 6 (2): 165–197.
https://doi.org/10.57805/revstat.v6i2.63.
Seredin O., Mottl, V., Tatarchuk, A. et al. (2012). Convex support and relevance vector
machines for selective multimodal pattern recognition. Proceedings of the 21st
International Conference on Pattern Recognition (ICPR2012), Tsukuba, Japan
(11–15 November 2012), pp. 1647–1650. IEEE.

da Silva, A.C.F., Saïs, F., Waller, E., and Andres, F. (2020). Dissimilarity-­based
approach for identity link invalidation. IEEE 29th International Conference on
Enabling Technologies: Infrastructure for Collaborative Enterprises (WETICE),
Bayonne, France (10–13 September 2020), pp. 251–256. IEEE. https://doi.
org/10.1109/WETICE49692.2020.00056.
de Souza Junior, A.H., de Corona, F., Barreto, G.A. et al. (2015). Minimal learning
machine: a novel supervised distance-­based approach for regression and
classification. Neurocomputing 164: 34–44. https://doi.org/10.1016/
j.neucom.2014.11.073.
2

Development of ML-Based Methodologies for Adaptive Intelligent
E-Learning Systems and Time Series Analysis Techniques

Indra Kumari¹, Indranath Chatterjee², and Minho Lee¹
¹ Korea Institute of Science and Technology Information (KISTI), University of Science and Technology (UST), Daejeon, South Korea
² Department of Computer Engineering, Tongmyong University, Busan, South Korea

2.1 Introduction

Machine learning (ML), a subfield of artificial intelligence (AI), focuses on creating
and studying software that can teach itself new skills: ML is the study of how to
program computers to learn and make decisions in ways that are indistinguishable
from human intelligence (Sarker 2021). The term "machine learning" refers to a
technique whereby a computer is taught to optimize a performance metric by
analyzing and learning from examples. Generalization and representation are at
the heart of ML; the system's ability to generalize to novel data samples is a key
feature. According to Herbert Simon, "learning" is the process through which a
system experiences adaptive alterations that improve its performance on a given
task or collection of activities the next time it is used. Tom Mitchell explains that
"a computer program is said to learn from experience E with respect to some class
of tasks T and performance measure P" if its performance on tasks in T, as
measured by P, improves with experience E. Robots with AI can learn from their
experiences, identify patterns, and infer their meaning (Patel and Patel 2016).
ML and AI have become so pervasive in our daily lives that they are no longer the
purview of specialized researchers trying to crack a difficult issue. Instead of being
a fluke, this development has a very natural feel to it. Organizations are now able
to harness a massive amount of data in developing solutions with far-­reaching

commercial benefits, thanks to the exponential development in processing speed
and the introduction of better algorithms for tackling complicated and tough
issues. The availability of rich data, new algorithms, and unique methodologies in
its numerous applications make financial services, banking, and insurance one of
the most important industries with a very high potential in reaping the advantages
of ML and AI. Because companies have only scratched the surface of quickly devel-
oping fields like deep neural networks and reinforcement learning, the potential of
employing these approaches in many applications remains ­significantly untapped.
Organizations are reaping the benefits of cutting-­edge ML applications in areas
such as customer segmentation for targeted marketing of newly released products,
the development of optimal portfolio strategies, the identification and prevention of
money laundering and other illegal activities in the financial markets, the imple-
mentation of more intelligent and effective credit risk management practices, and
the maintenance of compliance with regulatory frameworks in financial, account-
ing, and other operational areas. However, the full potential of ML and AI has yet
to be realized. Businesses need to take advantage of these capabilities if they want
to gain and keep a competitive edge over the long run. One of the main reasons for
the slow adoption of AI/ML models and methods in financial applications is the
lack of familiarity and trust in deploying them in critical and privacy-sensitive
applications. The "black-box" nature of such models and frameworks, which
obscures how their internal operations produce outputs and how those outputs
are validated, also impedes faster acceptance and deployment of such models in
real-world settings.

2.1.1 Machine Learning


All intelligent applications today take advantage of ML’s capabilities. By analyz-
ing large amounts of data automatically, ML can help uncover insights that would
otherwise be inaccessible through conventional software (Malakar et al. 2019). In
recent years, ML has expanded into every area of study. The advent of ML has
made it possible for computers to teach themselves. It is used to endow machines
with a capacity for judgment comparable to that of humans. A machine's
proficiency grows over time as it accumulates knowledge in the form of data that
may be used to teach it (Masetic and Subasi 2016).

2.1.2 Types of Machine Learning


There are three main categories of ML, which are as follows:
1) Supervised learning: For this method of training, the training data explicitly
map outcomes to their corresponding inputs. The system “learns” by parsing
the training data for patterns in how inputs lead to desired outcomes.
Supervised learning is useful for two distinct sorts of problems: classification
and regression. The goal of classification is to organize data into meaningful
groups. Predicting the value of a variable or variables is the goal of regression
analysis (Kumar et al. 2021).
2) Unsupervised learning: To train a computer in unsupervised learning, it is
fed data that have not been labeled in any way, and the program must discover
the patterns on its own. Through its algorithms, the system compiles a
synthesis of the information. Clustering is an example of unsupervised
learning: items that share similarities are assembled into distinct groups,
called clusters (Qiu et al. 2016); see the sketch after this list.
3) Reinforcement learning: The purpose of learning in a reinforcement setting
is to acquire the skills necessary to maximize the likelihood of success. To act
is the job of an agent. The agent performs an action by first learning about the
context in which it will occur. The agent keeps itself in a certain mental state
to learn about its surroundings. The agent acquires knowledge of its surround-
ings via exploration and experimentation. The agent uses the reward function
to gain knowledge of its surroundings. The agent receives either a positive
reward or a negative punishment based on the acts it does. The agent’s goal is
to maximize positive reinforcement and minimize detrimental feedback.
Human experts in the application domain are unnecessary for reinforcement
learning. Reinforcement learning is useful in many contexts, including self-­
driving automobiles (Prasad et al. 2019).
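
The sketch below contrasts the first two categories on the classic Fisher iris data
(a minimal illustration assuming scikit-learn is available; it is not tied to any
system described in this chapter):

```python
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Supervised learning: labeled examples guide the mapping from inputs to outcomes.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = KNeighborsClassifier().fit(X_tr, y_tr)
print("classification accuracy:", clf.score(X_te, y_te))

# Unsupervised learning: the same data without labels, grouped by similarity alone.
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
print("cluster sizes:", [int((km.labels_ == k).sum()) for k in range(3)])
```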

2.1.3 Learning Methods


Multiple methods of education are discernible. You can classify the various
methods of learning as:
1) Generalization Learning and
2) Discovery learning
We classify various methods of instruction as follows:
1) Rote learning
2) Learning by taking advice
3) Learning by example
4) Learning in problem-­solving
The following categories have been established for more study:
1) Supervised Learning
2) Unsupervised Learning
3) Reinforcement Learning
4) Semi-­Supervised Learning

To infer a function from data that have previously been characterized is the
purpose of supervised learning. The training data consist of instructional exam-
ples. Instances are represented by pairs, each of which has a class and an input
value. Supervised learning algorithms take a training set of examples and utilize
them to infer a function that can be used to map further instances. In the best-­case
situation, the algorithm can reliably assign accurate labels to newly encountered
instances. The challenge of unsupervised learning in ML is to classify data with-
out labels into groups with a predefined degree of similarity (Bharti et al. 2021).
The lack of a clear error signal while looking at provided examples prevents the
learner from focusing on the best approach. Of the aforementioned learning prob-
lems, reinforcement learning is the broadest. Instead of being instructed about
what to do by a superior, a reinforcement learning agent must learn by experience.
To solve a problem, a learner employs reinforcement learning, in which he takes
action on his surroundings and receives feedback in the form of a reward or pun-
ishment. The system discovers the best plan of action by making mistakes.
According to research, the most beneficial plan of action may consist of a series
of actions carefully crafted to maximize returns (Bottou 2010). There is a lot of
unlabeled data but not a lot of tagged data in many real-­world learning domains
like biology or text processing. As a rule, it takes a lot of effort and money to create
data that have been appropriately categorized. As a result, semi-supervised
learning (SSL) refers to the method of learning from a combination of labeled
and unlabeled information. This kind
of learning combines features of both supervised and unsupervised methods.
Semi-­supervised learning excels in scenarios when there is more unlabeled data
available than labeled data. This occurs when the cost of collecting data points is
low, while the cost of obtaining labels is high.
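
As a small illustration of the semi-supervised setting just described, the sketch
below hides most labels and lets a self-training wrapper exploit the unlabeled
points (assuming scikit-learn, whose convention marks unlabeled samples with
-1; the dataset and the 80% masking rate are our choices):

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.semi_supervised import SelfTrainingClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
rng = np.random.default_rng(0)

# Hide 80% of the labels: unlabeled data are cheap, labels are expensive.
y_partial = y.copy()
y_partial[rng.random(len(y)) < 0.8] = -1

# Self-training: a supervised base learner repeatedly labels its own
# most confident predictions on the unlabeled points.
model = SelfTrainingClassifier(SVC(probability=True)).fit(X, y_partial)
print("accuracy against the full labels:", model.score(X, y))
```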

2.1.4 E-­Learning with Machine Learning


E-­learning refers to the process of acquiring knowledge via the use of electronic
means of communication and dissemination. Further limiting its scope to the
internet, Rosenberg defined e-­learning as “the use of Internet technologies to
deliver a broad array of solutions that enhances knowledge and performance.”
Helping students learn and grow as a result of their experiences is what is com-
monly referred to under many different names, including e-­learning, virtual
learning, web-­based learning, computer-­assisted learning, Internet learning, dis-
tributed learning, and distance teaching. In the context of online learning,
the student interacts with the course’s teacher and other students in asynchro-
nous, collaborative activities that take place via the Internet. Several researchers
over the past few years have examined many different aspects of e-­learning, with
a primary focus on creating novel strategies for content presentation and enabling
greater student–teacher interaction and collaboration (Alharbi and Doncker 2019).
The development of reskilling and upskilling and the supplementation of the
conventional educational system are all attributable to the advent of e-learning.
E-­learning uses web technologies to make a structured, learner-­focused, interac-
tive, and facilitated learning environment available to anyone, anywhere, and at
any time. An effective e-­learning platform will have some sort of adaptive mecha-
nism. The primary goal of adaptive systems is to facilitate individualized digital
instruction. The foundation of meaningful learning is a constructivist method of
conceptually modeling an individual’s existing body of knowledge and experience
with an eye toward adaptation. This latter method relies on ML algorithms and
has made great strides toward mimicking a human expert’s performance. When
daily learning patterns are included in the model, desirable results are attained,
giving e-­learning platforms a distinct advantage in the market. Efficient incorpo-
ration of personalization into educational websites requires an e-­learning system
to dynamically detect the user model. If we use semantic analysis of e-­content and
global ontologies in an inferential manner, we can construct a learner model that
stores our inferences about the user’s perspective on the e-­content (Anjaria and
Guddeti 2014).

2.1.5 Need for Machine Learning


What if we used ML instead of programming our computers to do work? Programs
that learn and develop based on their “experience” may be necessary due to the
complexity of the issue and the need for adaptability:

1) Tasks performed by animals/humans: There are many things that we humans do regularly, but we do not reflect deeply enough on our processes to
extract a clearly defined program. Tasks like driving, voice recognition, and
visual comprehension are all examples. After being exposed to enough training
instances, state-­of-­the-­art ML systems and programs that “learn from their
experience” perform well in all of these areas.
2) Tasks beyond human capabilities: The analysis of very large and complex
data sets is yet another broad category that can benefit from ML techniques;
examples include astronomical data, converting medical archives into medical
knowledge, weather prediction, genomic data analysis, web search engines,
and electronic commerce. As more and more information is captured digitally,
it is becoming clear that vast troves of valuable knowledge lie dormant in databases too massive and complicated for humans to comprehend. The combination of learning programs with the almost infinite
memory capacity and ever-­increasing processing power of computers opens
up new vistas in the field of learning to find significant patterns in big and
complicated data sets.

2.2 Methodological Advancement of Machine Learning

Students come from a variety of backgrounds and have varying learning styles and
pedagogical requirements. The primary goal of an adaptive e-­learning system is to
identify the specific requirements of each learner and then, following the training
process, supply that learner with content that is tailored to his or her specific
needs. Using ML and deep learning (DL) models with the right dataset may make
the training process of an e-­learning system more robust. In addition, an efficient
intelligent mechanism is needed to automatically classify this content as belong-
ing to the learner’s category in a reasonable amount of time. This reduces the time
spent by the learner searching through the vast amounts of content available
within the e-­learning environment to find something relevant to their specific
needs. By doing so, we can tailor the information to each user. However, a multi-­
agent approach can be used in an e-­learning system to tailor e-­content to each
student by tracking how they engage with the system and gathering data on their
preferred methods of instruction (Araque et al. 2017).

2.2.1 Automatic Learner Profiling Agent


The method of developing a conceptual model through learner profiling allows
the system to discover the learner profile and implicitly classify learners. E-­content
can then be provided to students based on their categories, and teachers can reach
out to students who are less interactive to help them understand the material and
pinpoint where their problems lie. The learner model is essential to adaptive
e-­learning because it describes the learner’s characteristics and provides recom-
mendations based on those descriptions. A learner’s profile is automatically
derived using their learning style, currently enrolled courses, and knowledge.
Several AI approaches can help with this by grouping online course participants
based on their shared characteristics and prior experiences.
To deduce the user profile from user activity, the intelligent profiling agent
makes use of these data. Because the agent uses expert knowledge to assess user
features, it might be considered an expert system (Chikersal et al. 2015). To select
the most appropriate service to suggest, another agent is employed. The agent
mimics the human expert’s teaching style by using ML to figure out how to best
serve the learner in question. Several factors, such as the probability or certainty
factor in representing the user’s behavior, are dependent on the model used to
describe behavior, profile, and services. While the idea of context is utilized to
model the course material, the representation of a student’s knowledge state may
be done using probabilistic logic. In this way, we have a formalized definition
of the content model, the student model, and the instructional strategy. Each

student’s profile in the profiling system has a record of their academic and social
background. The user's history is modeled to create the student model. Students will receive customized course materials based on their unique
learning plans, which are generated using data from both the student and con-
tent models.

2.2.2 Learning Materials’ Content Indexing Agent


The cognitive level at which a learner acquires information about a subject dic-
tates the learning materials that should be used to teach that subject. E-­content
resources may be selected for optimal performance in the learning process by con-
sidering the specific habits of each student. A model is needed that can propose services in which each student is linked to a list of prerequisite concepts; by following this approach, learners can select useful resources, both physical and digital (Fancellu et al. 2016). This procedure necessitates specific conditions, including
the retrieval of relevant materials via appropriate indexing. Web-­based e-­learning
solutions that adhere to industry standards are abundant; a prominent example is the Sharable Content Object Reference Model (SCORM) specification, which promotes reusable and interoperable learning content. To ensure compatibility among SCORM-conformant systems, SCORM prescribes a specific method for developing Learning Management Systems and instructional material.

2.2.3 Adaptive Learning


Learning in which proficiency is adapted to either one's surroundings or one's learning task is known as adaptive learning. Data, history, and experience
are the building blocks of education. When it comes to various forms of educa-
tion, a technique that works wonders in one context might not be the best
option. Humans employ a wide variety of tactics for learning new material and
tackling challenges. Learning a new language requires a different approach
than studying for a math test. The learning issue, or what is to be learned and
the goals of the learning process, is intrinsically linked to the learning procedure
itself. Therefore, comprehension of the issue is necessary for the selection of a
learning approach. The first step in adaptive learning is to identify the nature of
the learning problem, and the second is to select, in real-­time, the most appro-
priate strategy for addressing that problem. This requires more than just trying out a few novel approaches or enlisting the aid of a few eager students; the best method can be decided upon only after careful data selection. Adaptive learning is a
strategy that modifies learning environments based on a set of policies. It is
conditional on the specifics of the input circumstance and the user’s real sur-
roundings (Vanschoren et al. 2014).

Predicting the efficacy of ML algorithms, ranking learning algorithms, and


selecting the most appropriate classification approach are all topics of study in the
fields of ML and data mining. The algorithm selection problem is investigated in
this chapter. Several algorithms exist that can approximate or precisely solve a given problem; the goal is to find the best algorithm(s) and suggest them to the user.
In reality, the data characteristics of different datasets influence the performance of a classifier in different ways, which accounts for the observed variation in results. According to the extensive empirical results presented by Romero et al. (2013), "decision trees do well on credit datasets, k-nearest neighbors excel on
problems concerning images.” There is widespread agreement that data charac-
teristics should be considered when selecting an appropriate algorithm. This is
extremely useful for practical purposes.

2.2.4 Proposed Research


The proposed classification algorithm recommendation method differs from
the aforementioned work in how it characterizes a dataset and how it handles
datasets that are similar to the one being characterized. In more detail, the
k-­nearest neighbor (k-­NN) method finds the k datasets that are most similar to
the new dataset (Jankowski 2013). The algorithms that perform best on data-
sets that are most similar to the one in question are then suggested as a place to
begin when addressing the new classification problem. Experimental results on 38 UCI
datasets with nine different classifiers corroborate the efficiency of the pro-
posed method.
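As a rough sketch of this idea, the following code recommends classifiers for a new dataset by finding its nearest neighbors in a table of dataset meta-features; the meta-features, the stored best-classifier results, and the normalization scheme are illustrative assumptions, not the exact setup evaluated on the 38 UCI datasets.

# A minimal sketch of k-NN-based algorithm recommendation over dataset
# meta-features; all values here are invented for illustration.
import numpy as np

# Each historical dataset is summarized by meta-features, e.g.
# (n_instances, n_features, n_classes), with its best-known classifier recorded.
meta_features = np.array([
    [1000, 20, 2],    # dataset A
    [150, 4, 3],      # dataset B
    [5000, 100, 10],  # dataset C
])
best_classifier = ["decision_tree", "k_nn", "naive_bayes"]

def recommend(new_dataset, k=2):
    """Return the classifiers that did best on the k most similar datasets."""
    scale = meta_features.max(axis=0)  # normalize so no single scale dominates
    dists = np.linalg.norm(meta_features / scale - np.asarray(new_dataset) / scale, axis=1)
    nearest = np.argsort(dists)[:k]
    return [best_classifier[i] for i in nearest]

print(recommend([1200, 18, 2]))  # e.g. ['decision_tree', 'k_nn']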

2.2.5 Multi-Perspective Learning


Information and knowledge about the system, gathered from a variety of sources,
must be recorded in its entirety. If you want to maximize your learning, you need
to make use of as much data as possible. There is a new viewpoint to be gained with
each new piece of data. In this regard, some points of view are crucial while others
are less so. When information regarding other views is lacking, making decisions
can be tough. This means that there is a chance of learning gaps. To effectively
make decisions that take into account several factors, it is vital to acquire knowl-
edge from a variety of sources (Bhatt et al. 2012).
Learning from “knowledge and information obtained and created from many
viewpoints” is what we mean when we talk about multi-­perspective learning. To
achieve multi-­perspective learning, it is necessary to record data, system charac-
teristics, and connections while considering several viewpoints. The multi-­
perspective approach to learning may increase the complexity of education in
general, but it also offers many more educational opportunities. One’s point of

view may be defined as their current mental or factual condition. All the informa-
tion from the current window pertains to a certain issue domain. Multi-­perspective
learning is a process wherein individual perspectives such as P1, P2, …, and Pn are combined to support decision-making.

2.2.6 Machine Learning Recommender Agent for Customization


2.2.6.1 E-Learning
The model takes into account both the students’ tastes and the content of the
online courses. We develop a recommender agent using an ML model to establish
a connection between a learner’s preferences and an appropriate e-­learning cate-
gory and material (Prudencio et al. 2011). In this study, we employ Naive Bayes
and K-­Means, two ML models, to categorize the course contents into different
types of e-­learning resources. The system may operate in both an unsupervised
learning mode and a supervised learning mode, the latter of which makes use of
previously collected data for training purposes. The concept allows the system to
provide recommendations for learning resources that are tailored to each student.
The data were used to compare the efficiency of the two ML models.

2.2.7 Data Creation


The model is implemented and assessed using a custom dataset compiled from
user survey data collected via a Google form. Ninety-­six resources representing
various types of education are gathered with the help of college students. All stu-
dent requests have been entered into a 12 by 1500 grid form. To keep track of how
often certain concepts appear in the material the student has selected, the values
in the columns are used, e.g. Material = {Web application, C-­Programming,
Python Programming, Java …etc.}, Category = {Software, Artificial Intelligence,
Multi-­media, Networks …. etc.}. After collecting the data, in this step, the dataset
undergoes preprocessing and cleaning. The information must be transformed
from its current textual format into a numerical one (Alkawaz et al. 2018). The
information has been used to test the ML models’ performance. Various Python
libraries are used for preliminary data cleaning and analysis, and the scikit-learn package is used to run the experiments.
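As a hedged sketch of this preprocessing step, the code below assumes the survey export is a CSV file named survey.csv with textual Material and Category columns; the file name and column names are illustrative assumptions, not details given in the chapter.

# A minimal preprocessing sketch with pandas and scikit-learn.
import pandas as pd
from sklearn.preprocessing import LabelEncoder

df = pd.read_csv("survey.csv")  # hypothetical export of the Google-form data

# Basic cleaning: drop incomplete responses and normalize whitespace and case.
df = df.dropna(subset=["Material", "Category"])
df["Material"] = df["Material"].str.strip()
df["Category"] = df["Category"].str.strip().str.title()

# Convert the textual fields to numeric codes so the ML models can consume them.
for col in ["Material", "Category"]:
    df[col] = LabelEncoder().fit_transform(df[col])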

2.2.8 Naïve Bayes Model


The NB model is implemented as a Bayesian supervised learning classifier.
Classifying the e-learning resources for individualized e-content recommendation requires determining how closely the concepts are related to the materials. Bayes'
20 2 Development of ML-­Based Methodologies for Adaptive Intelligent E-­Learning Systems

theorem was used to determine the posterior probability P(G|L) of the class (Content; Category), as follows:

P(G|L) = (P(L|G) × P(G)) / P(L)

In this formula, P(L|G) denotes the likelihood of the content given the class, while the prior probability of the class, P(G), and the prior probability of the predictor, P(L), together determine the posterior probability of the class given the content. The conditional probability between the categories and the content is calculated from the data's history.
The procedure applies LabelEncoder().fit_transform() to the data set, which converts each string into a machine-readable numerical code. The data are then represented in the form the Naive Bayes model expects.

For example:
The statement datafram['Term'] = number.fit_transform(datafram['Term']) (where number is assumed to be a LabelEncoder instance) replaces the term "Software" with the number "10," "Multimedia" with the number "8," and so on.
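The encode-then-classify flow can be sketched as follows; the toy materials and categories and the choice of scikit-learn's CategoricalNB are illustrative assumptions, since the chapter does not name a specific Naive Bayes variant.

# A minimal sketch of encoding strings and fitting a Naive Bayes classifier.
from sklearn.naive_bayes import CategoricalNB
from sklearn.preprocessing import LabelEncoder

materials = ["C-Programming", "Python Programming", "Web application",
             "Java", "Python Programming", "C-Programming"]
categories = ["Software", "Software", "Multimedia",
              "Software", "Artificial Intelligence", "Software"]

# LabelEncoder().fit_transform() maps each distinct string to an integer code.
enc_m, enc_c = LabelEncoder(), LabelEncoder()
X = enc_m.fit_transform(materials).reshape(-1, 1)  # content (G)
y = enc_c.fit_transform(categories)                # category (L)

# CategoricalNB applies Bayes' theorem with per-feature categorical likelihoods.
clf = CategoricalNB().fit(X, y)
new = enc_m.transform(["Python Programming"]).reshape(-1, 1)
print(enc_c.inverse_transform(clf.predict(new)))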

2.2.9 K-Means Model


K-means, a form of clustering that groups learning objects into k categories using learners' past data, is an unsupervised algorithm that requires little or no labeled training data. The objects are clustered so that their average distance from
the cluster center is minimized. There are 12 words and 96 materials represented
in the collected data. The challenge for each material is to determine which cluster center it most closely resembles, assuming that all feature values are real scalars. Each cluster of points is defined by its centroid. Reassigning materials to clusters takes into account their proximity to the centers of existing clusters or, put another way, their similarity to the materials already in those clusters (Badrinarayanan et al. 2017). K random terms
{t1, t2, …, tk} are selected as seeds. After a predetermined number of cycles, the
centroid positions will remain stable.
For every material mi: assign mi to the nearest term tj (cluster) such that distance(mi, tj) is minimal. Next, update the seeds to the centroid of each cluster: for each cluster tj, set tj to the mean of the materials mi assigned to it.
As illustrated previously, there are 12 terms, where each term represents a cluster and each cluster contains eight materials. The data set must be transformed into
2.3 ­Machine Learning on Time Series Analysi 21

a numerical format so that it can be used as input to our model. The function np.random.uniform() has been employed to produce random numbers (here, eight numbers for each cluster, yielding 96 materials across all clusters) in the range from 0 to 1, to which a fixed offset is subsequently added.

For example:
The statement data['Software'] = np.random.uniform(0,1,8)+2 generates the feature values for one cluster; the k-means algorithm then partitions the learning-material adaptability model into clusters. When a brand-new set of topics and a fresh set of student profiles are introduced, unexpected complications might occur. To account for these variations, the frequency-table database is dynamically updated, so that less inaccuracy is introduced into the system's predictions of relevant material based on a user's profile (Bayar and Stamm 2018).
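A minimal sketch of this clustering setup is shown below; it mirrors the uniform(0, 1) plus fixed-offset recipe described above, but the offsets and random seeds are illustrative assumptions.

# A minimal sketch of clustering 96 synthetic materials into 12 term clusters.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(42)

# Simulate 12 terms x 8 materials = 96 materials; one offset per term keeps
# the clusters separable, echoing the frequency-table construction.
X = np.vstack([rng.uniform(0, 1, size=(8, 1)) + offset for offset in range(12)])

kmeans = KMeans(n_clusters=12, n_init=10, random_state=0).fit(X)

# Each material is assigned to its nearest centroid; centroids are the
# per-cluster means, recomputed until the assignments stabilize.
print(kmeans.labels_[:8])                        # the first term's materials
print(kmeans.cluster_centers_.ravel().round(2))  # one center per term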

2.3 Machine Learning on Time Series Analysis

Time series are mathematical constructs that denote a series of data ordered and
indexed by time. In addition, a time series is a collection of measurements taken
at regular intervals in the past, called yt, each of which has a real value and a time
stamp. The importance of the data’s ordering across time is what sets time series
apart from other types of information. Time series values are typically collected
by keeping track of some process with time, with measurements taken at set
­intervals. One mathematical definition of a time series is

yt = {y1, y2, …, yn}  (2.1)

Since a time series can only be observed a finite number of times, the underlying
process can be assumed to be a set of random variables in n dimensions. In
addition, it is beneficial to assume the underlying process is a stochastic one,
which allows for an infinite number of observations. If the observed data can be described by a mathematical function yt = f(time), the series is said to be deterministic; if the data instead follow yt = f(time, ϵ), where ϵ is a random term, the series is said to be nondeterministic or stochastic. Furthermore, stationarity is an important feature in
time series. Properties (such as statistical properties) of a stationary time series
remain constant across time (Birajdar and Mankar 2013).
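As a quick illustration of checking stationarity in practice, the sketch below applies the augmented Dickey-Fuller test; the statsmodels dependency, the synthetic series, and the 0.05 threshold are illustrative assumptions not drawn from the chapter.

# A minimal stationarity check with the augmented Dickey-Fuller test.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(0)
white_noise = rng.normal(size=300)              # constant mean and variance
random_walk = np.cumsum(rng.normal(size=300))   # mean drifts over time

for name, series in [("white noise", white_noise), ("random walk", random_walk)]:
    p_value = adfuller(series)[1]
    verdict = "stationary" if p_value < 0.05 else "non-stationary"
    print(f"{name}: p = {p_value:.3f} -> {verdict}")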
Moreover, when we talk about the statistical properties, we are referring to
things like the time series’ mean value, auto-­correlation, and variance. Univariate
time series (UTS) and multivariate time series (MTS) are the two primary classifi-
cations of time series data. One way to think about an MTS is as a collection of numerous UTSs. Both UTS and MTS are widely available now because of the

proliferation of numerous real-world applications and human activities that produce them. Examples range from biometrics, such as signatures and voice recordings, to stock market data and medical signals. Time series data involve different
sources such as monthly financial data, yearly birth rate data, hourly internet
data, and annual temperature data.
Time series data come from a wide variety of modern-­day sources. From bio-
logical signals and weather recordings to stock market rates and countless other
sources, data are constantly being generated from a wide variety of human activi-
ties and practical applications. The time series data are typically gathered through
the use of sensors that record physical quantities and then convert those readings
into signals that can be easily understood by computers or humans. There has
been a surge in interest, both in academia and in the field, in modeling non-­
temporal data as time series. It has been demonstrated that such data types can be transformed into time series signals, including those used in video retrieval,
picture retrieval, handwriting recognition, and text mining jobs (Dang et al. 2019).
There is a lot of interest in studying how to extract useful information from the
massive amounts of time series data produced by a variety of applications (whether
they are strictly temporal or simply sequential); nonetheless, it remains a difficult
issue to solve. The primary objective of time series analysis is to derive actionable
statistics and other features from time series data. Due to the inherent temporal
ordering of time series, the study of these problems is more challenging than other
data mining tasks. "Time series data mining" means mining time series data for insights. In particular, time series data mining sits at the intersection of mathematics, computer science, ML, AI, and statistics as applied to time series data (Flach 2012). When analyzing time series data, scientists are on the lookout for
things like anomalies, commonalities, and natural groupings. Classification,
grouping, prediction, segmentation, anomaly detection, representation, and index-
ing are all typical time series data mining activities. This chapter focuses on issues surrounding the representation, categorization, and prediction of time series.

2.3.1 Time Series Representation


Since learning directly from time series data is typically inefficient and laborious,
the fundamental concern in time series mining is how to represent the time series
data. Proposing algorithms that directly work on raw time series is computation-
ally costly due to the large dimensionality of the time series. All real-world time series data are highly dimensional, which is a major concern for the representation problem. Additionally, several time series data mining systems are predicated on time series representation and modeling (Jaiswal and Srivastava 2020). When dealing with time series data, the large dimensionality therefore motivates compact, lower-dimensional representations that preserve the series' essential shape.
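One common compact representation is Piecewise Aggregate Approximation (PAA), sketched below purely as an illustrative example; the chapter does not prescribe this particular method.

# A minimal sketch of Piecewise Aggregate Approximation (PAA): shrink a long
# series into a few segment means while preserving its coarse shape.
import numpy as np

def paa(series, n_segments):
    """Reduce a 1-D series to n_segments values by averaging equal-width windows."""
    segments = np.array_split(np.asarray(series, dtype=float), n_segments)
    return np.array([segment.mean() for segment in segments])

t = np.linspace(0, 4 * np.pi, 1000)
signal = np.sin(t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
print(paa(signal, 8).round(2))  # 1,000 points compressed to 8 mean values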