Learn TensorFlow 2.0: Implement Machine Learning and Deep Learning Models with Python, 1st Edition, by Pramod Singh

The document provides information about the book 'Learn TensorFlow 2.0: Implement Machine Learning and Deep Learning Models with Python' by Pramod Singh and Avinash Manure, including its content structure and chapters. It covers various topics such as supervised learning, neural networks, image processing, natural language processing, and model deployment using TensorFlow 2.0. Additionally, it includes links to download the book and other related resources.



Learn TensorFlow 2.0
Implement Machine Learning and Deep Learning Models with Python

Pramod Singh
Avinash Manure

Learn TensorFlow 2.0: Implement Machine Learning and Deep Learning
Models with Python
Pramod Singh, Bangalore, Karnataka, India
Avinash Manure, Bangalore, India

ISBN-13 (pbk): 978-1-4842-5560-5
ISBN-13 (electronic): 978-1-4842-5558-2


https://doi.org/10.1007/978-1-4842-5558-2

Copyright © 2020 by Pramod Singh, Avinash Manure


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or
part of the material is concerned, specifically the rights of translation, reprinting, reuse of
illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way,
and transmission or information storage and retrieval, electronic adaptation, computer software,
or by similar or dissimilar methodology now known or hereafter developed.
Trademarked names, logos, and images may appear in this book. Rather than use a trademark
symbol with every occurrence of a trademarked name, logo, or image, we use the names, logos,
and images only in an editorial fashion and to the benefit of the trademark owner, with no
intention of infringement of the trademark.
The use in this publication of trade names, trademarks, service marks, and similar terms, even if
they are not identified as such, is not to be taken as an expression of opinion as to whether or not
they are subject to proprietary rights.
While the advice and information in this book are believed to be true and accurate at the date of
publication, neither the authors nor the editors nor the publisher can accept any legal
responsibility for any errors or omissions that may be made. The publisher makes no warranty,
express or implied, with respect to the material contained herein.
Managing Director, Apress Media LLC: Welmoed Spahr
Acquisitions Editor: Celestin Suresh John
Development Editor: James Markham
Coordinating Editor: Aditee Mirashi
Cover designed by eStudioCalamar
Cover image designed by Freepik (www.freepik.com)
Distributed to the book trade worldwide by Springer Science+Business Media New York,
233 Spring Street, 6th Floor, New York, NY 10013. Phone 1-800-SPRINGER, fax (201) 348-4505,
e-mail orders-ny@springer-sbm.com, or visit www.springeronline.com. Apress Media, LLC is a
California LLC, and the sole member (owner) is Springer Science+Business Media Finance Inc
(SSBM Finance Inc). SSBM Finance Inc is a Delaware corporation.
For information on translations, please e-mail rights@apress.com, or visit www.apress.com/
rights-permissions.
Apress titles may be purchased in bulk for academic, corporate, or promotional use. eBook
versions and licenses are also available for most titles. For more information, reference our Print
and eBook Bulk Sales web page at www.apress.com/bulk-sales.
Any source code or other supplementary material referenced by the authors in this book is available
to readers on GitHub, via the book’s product page, located at www.apress.com/978-1-4842-5560-5.
For more detailed information, please visit www.apress.com/source-code.
Printed on acid-free paper

I dedicate this book to my wife, Neha, my son, Ziaan, and
my parents. Without you, this book wouldn’t have
been possible. You complete my world and are the
source of my strength.
—Pramod Singh

I dedicate this book to my wife, Jaya, for constantly encouraging me to do the best in whatever I undertake,
and also my mom and dad, for their unconditional love
and support, which have made me what I am today.
Last but not least, my thanks to Pramod, for trusting me
and giving me the opportunity to coauthor this book.
—Avinash Manure

Table of Contents

About the Authors .......................................... ix
About the Technical Reviewer ............................... xi
Acknowledgments ............................................ xiii
Introduction ............................................... xv

Chapter 1: Introduction to TensorFlow 2.0 .................. 1
  Tensor + Flow = TensorFlow ............................... 2
    Components and Basis Vectors ........................... 3
  Tensor ................................................... 6
    Rank ................................................... 7
    Shape .................................................. 7
    Flow ................................................... 7
  TensorFlow 1.0 vs. TensorFlow 2.0 ........................ 9
    Usability-Related Changes .............................. 10
    Performance-Related Changes ............................ 16
  Installation and Basic Operations in TensorFlow 2.0 ...... 17
    Anaconda ............................................... 17
    Colab .................................................. 17
    Databricks ............................................. 19
  Conclusion ............................................... 24

Chapter 2: Supervised Learning with TensorFlow ............. 25
  What Is Supervised Machine Learning? ..................... 25
  Linear Regression with TensorFlow 2.0 .................... 28
  Implementation of a Linear Regression Model, Using TensorFlow and Keras ... 29
  Logistic Regression with TensorFlow 2.0 .................. 37
  Boosted Trees with TensorFlow 2.0 ........................ 47
    Ensemble Technique ..................................... 47
    Gradient Boosting ...................................... 49
  Conclusion ............................................... 52

Chapter 3: Neural Networks and Deep Learning with TensorFlow ... 53
  What Are Neural Networks? ................................ 53
    Neurons ................................................ 54
    Artificial Neural Networks (ANNs) ...................... 55
    Simple Neural Network Architecture ..................... 57
  Forward and Backward Propagation ......................... 58
  Building Neural Networks with TensorFlow 2.0 ............. 61
    About the Data Set ..................................... 61
  Deep Neural Networks (DNNs) .............................. 67
  Building DNNs with TensorFlow 2.0 ........................ 68
  Estimators Using the Keras Model ......................... 71
  Conclusion ............................................... 74

Chapter 4: Images with TensorFlow .......................... 75
  Image Processing ......................................... 76
  Convolutional Neural Networks ............................ 77
    Convolutional Layer .................................... 77
    Pooling Layer .......................................... 80
    Fully Connected Layer .................................. 81
  ConvNets Using TensorFlow 2.0 ............................ 82
  Advanced Convolutional Neural Network Architectures ...... 89
  Transfer Learning ........................................ 93
    Transfer Learning and Machine Learning ................. 95
  Variational Autoencoders Using TensorFlow 2.0 ............ 97
    Autoencoders ........................................... 97
    Applications of Autoencoders ........................... 98
    Variational Autoencoders ............................... 98
    Implementation of Variational Autoencoders Using TensorFlow 2.0 ... 99
  Conclusion ............................................... 106

Chapter 5: Natural Language Processing with TensorFlow 2.0 ... 107
  NLP Overview ............................................. 107
  Text Preprocessing ....................................... 109
    Tokenization ........................................... 110
    Word Embeddings ........................................ 112
  Text Classification Using TensorFlow ..................... 113
    Text Processing ........................................ 115
    Deep Learning Model .................................... 119
    Embeddings ............................................. 120
  TensorFlow Projector ..................................... 123
  Conclusion ............................................... 129

Chapter 6: TensorFlow Models in Production ................. 131
  Model Deployment ......................................... 132
    Isolation .............................................. 133
    Collaboration .......................................... 133
    Model Updates .......................................... 134
    Model Performance ...................................... 134
    Load Balancer .......................................... 134
  Python-Based Model Deployment ............................ 135
    Saving and Restoring a Machine Learning Model .......... 135
    Deploying a Machine Learning Model As a REST Service ... 138
    Templates .............................................. 142
    Challenges of Using Flask .............................. 145
  Building a Keras TensorFlow-Based Model .................. 146
  TF ind deployment ........................................ 151
  Conclusion ............................................... 159

Index ...................................................... 161
About the Authors
Pramod Singh is currently employed as a
machine learning expert at Walmart Labs. He
has extensive hands-on experience in machine
learning, deep learning, artificial intelligence
(AI), data engineering, designing algorithms,
and application development. He has spent
more than ten years working on multiple
data projects at different organizations. He’s
the author of three books: Machine Learning
with PySpark, Learn PySpark, and Learn TensorFlow 2.0. He is also a
regular speaker at major tech conferences, such as O’Reilly Strata Data
and AI Conferences. Pramod holds a BTech in electrical engineering from
Mumbai University and an MBA from Symbiosis University. He also holds
data science certification from IIM–Calcutta. Pramod lives in Bangalore,
India, with his wife and three-year-old son. In his spare time, he enjoys
playing guitar, coding, reading, and watching football.


Avinash Manure is a senior data scientist at Publicis Sapient with more than eight
years of experience using data to solve real-
world business challenges. He is proficient
in deploying complex machine learning and
statistical modeling algorithms/techniques to
identify patterns and extract valuable insights
for key stakeholders and organizational
leadership.
Avinash holds a bachelor’s degree in electronics engineering from
Mumbai University and holds an MBA in marketing from the University
of Pune. He and his wife are currently settled in Bangalore. He enjoys
traveling to new places and reading motivational books.

About the Technical Reviewer
Jojo Moolayil is an AI professional and author
of three books on machine learning, deep
learning, and the Internet of Things (IoT). He
is currently working as a research scientist—AI
at Amazon Web Services, in their Vancouver,
British Columbia, office.
Jojo was born and raised in Pune, India,
and graduated from the University of Pune
with a major in information technology
engineering. His passion for problem solving
and data-driven decision making led him to start a career with Mu
Sigma Inc., the world’s largest pure-play analytics provider. There, he
was responsible for developing machine learning and decision science
solutions to complex problems for major health care and telecom
companies. He later worked with Flutura (an IoT analytics startup) and
General Electric, with a focus on industrial AI, in Bangalore.
In his current role with Amazon, he works on researching and developing
large-scale AI solutions to combat fraud and enrich the customers’ payment
experience in the cloud. Jojo is also actively involved as a tech reviewer and
AI consultant to leading publishers and has reviewed more than a dozen
books on machine learning, deep learning, and business analytics.
You can reach Jojo at the following:

• www.jojomoolayil.com/

• www.linkedin.com/in/jojo62000

• twitter.com/jojo62000

Acknowledgments
This is my third book with Apress, and a lot of thought went into writing
it. The main objective was to introduce to the IT community the critical
changes introduced in the new version of TensorFlow. I hope readers will
find it useful, but first, I’d like to thank a few people who helped me along
the journey. First, I must thank the most important person in my life, my
beloved wife, Neha, who selflessly supported me throughout and sacrificed
so much to ensure that I completed this book.
I must also thank my coauthor, Avinash Manure, who expended a great
amount of effort to complete the project on time. In addition, my thanks to
Celestin Suresh John, who believed in me and offered me this opportunity
to write another book for Apress. Aditee Mirashi is one of the best editors
in India. This is my third book with her, and it was quite exciting to
collaborate again. She was, as usual, extremely supportive and always
available to accommodate my requests. To James Markham, who had
the patience to review every line of code and check the appropriateness
of each example, thank you for your feedback and your encouragement.
It really made a difference to me and the book. I also want to thank my
mentors who have constantly encouraged me to chase my dreams. Thank
you Sebastian Keupers, Dr. Vijay Agneeswaran, Sreenivas Venkatraman,
Shoaib Ahmed, and Abhishek Kumar.
Finally, I am infinitely grateful to my son, Ziaan, and my parents, for
the endless love and support, irrespective of circumstances. You all make
my world beautiful.
—Pramod Singh


This is my first book, and a very special one indeed. As mentioned by Pramod, the objective of this book is to introduce readers to TensorFlow
2.0 and explain how this platform has evolved over the years to become
one of the most popular and user-friendly source libraries for machine
learning currently available. I would like to thank Pramod for having
confidence in me and giving me the golden opportunity to coauthor this
book. As this is my first book, Pramod has been guiding and helping me to
complete it.
I would like to thank my wife, Jaya, who made sure I had the right
environment at home to concentrate and complete this book on time. I
would also like to thank the publishing team—Aditee Mirashi, Matthew
Moodie, and James Markham—who have helped me immensely in
ensuring that this book reaches the audience in its best state. I would
also like to thank my mentors, who made sure I grew professionally and
personally by always supporting me in my dreams and guiding me toward
them. Thank you Tristan Bishop, Erling Amundson, Deepak Jain, Dr.
Vijay Agneeswaran, and Abhishek Kumar for all the support that you have
extended to me. Last but not least, I would like to acknowledge my parents,
my friends, and colleagues, who have always been there in my tough times
and motivated me to follow my dreams.
—Avinash Manure

Introduction
Google has been a pioneer in introducing groundbreaking technology and
products. TensorFlow is no exception, when it comes to efficiency and
scale, yet there have been some adoption challenges that have convinced
Google’s TensorFlow team to implement changes to facilitate ease of use.
Therefore, the idea of writing this book was simply to introduce to readers
these important changes made by the TensorFlow core team. This book
focuses on different aspects of TensorFlow, in terms of machine learning,
and goes deeper into the internals of the recent changes in approach. This
book is a good reference point for those who seek to migrate to TensorFlow
to perform machine learning.
This book is divided into three sections. The first offers an introduction
to data processing using TensorFlow 2.0. The second section discusses
using TensorFlow 2.0 to build machine learning and deep learning models.
It also includes natural language processing (NLP) using TensorFlow 2.0.
The third section covers saving and deploying TensorFlow 2.0 models in
production. This book also is useful for data analysts and data engineers,
as it covers the steps of big data processing using TensorFlow 2.0. Readers
who want to transition to the data science and machine learning fields
will also find that this book provides a practical introduction that can lead
to more complicated aspects later. The case studies and examples given
in the book make it really easy to follow and understand the relevant
fundamental concepts. Moreover, there are very few books available
on TensorFlow 2.0, and this book will certainly increase the readers’

knowledge. The strength of this book lies in its simplicity and the applied
machine learning to meaningful data sets.
We have tried our best to inject our entire experience and knowledge
into this book and feel it is specifically relevant to what businesses are
seeking to solve real challenges. We hope you gain some useful takeaways
from it.

CHAPTER 1

Introduction to TensorFlow 2.0
The intent of this book is to introduce readers to the latest version of the
TensorFlow library. Therefore, this first chapter focuses mainly on what has
changed in the TensorFlow library since its first version, TensorFlow 1.0.
We will cover the various changes, in addition to highlighting the specific
parts for which changes are yet to be introduced. This chapter is divided
into three sections: the first discusses the internals of TensorFlow;
the second focuses on the changes that have been implemented in
TensorFlow 2.0 after TensorFlow 1.0; and the final section covers
TensorFlow 2.0 installation methods and basic operations.
You may already be aware that TensorFlow is widely used as a
machine learning implementation library. It was created by Google
as part of the Google Brain project and was later made available as an
open source product, as there were multiple machine learning and
deep learning frameworks that were capturing the attention of users.
With open source availability, more and more people in the artificial
intelligence (AI) and machine learning communities were able to adopt
TensorFlow and build features and products on top of it. It not only
helped users with implementation of standard machine learning and
deep learning algorithms but also allowed them to implement customized
and differentiated versions of algorithms for business applications and


various research purposes. In fact, it soon became one of the most popular
libraries in the machine learning and AI communities—so much so that
people have been building a huge number of apps using TensorFlow
under the hood. This is principally owing to the fact that Google itself uses
TensorFlow in most of its products, whether Google Maps, Gmail,
or other apps.
While TensorFlow had its strengths in certain areas, it also had a few
limitations, owing to which developers found it a bit difficult to adopt,
compared to such other libraries as PyTorch, Theano, and OpenCV. As
Google’s TensorFlow team took the feedback of the TensorFlow
community seriously, it went back to the drawing board and started
working on most of the changes required to make TensorFlow even more
effective and easy to work with, launching the TensorFlow 2.0 alpha version in early 2019.
previous hurdles, in order to allow developers to use TensorFlow even
more seamlessly. In this chapter, we will go over those changes one by one,
but before covering these, let us spend some time understanding what
exactly TensorFlow is and what makes it one of the best available options
to perform machine learning and deep learning today.

Tensor + Flow = TensorFlow


Tensors are the building blocks of TensorFlow, as all computations are
done using tensors. So, what exactly is a tensor?
According to the definition provided by Google’s TensorFlow team,
A tensor is a generalization of vectors and matrices to poten-
tially higher dimensions. Internally, TensorFlow represents
tensors as n-dimensional arrays of base datatypes.
But we would like to delve a little deeper into tensors, in order to
provide more than a general overview of what they are. We would like to
compare them with vectors or matrices, so as to highlight the key dynamic


property that makes tensors so powerful. Let us start with a simple vector.
A vector is commonly understood as something that has a magnitude and
a direction. Simply put, it is an array that contains an ordered list of values.
Without its direction, a vector reduces to a scalar value that has only magnitude.
A vector can be used to represent n number of things. It can represent
area and different attributes, among other things. But let’s move beyond
just magnitude and direction and try to understand the real components
of a vector.

Components and Basis Vectors


Let’s suppose we have a vector Â, as shown in Figure 1-1. This is currently
represented without any coordinate system consideration, but most of us
are already aware of the Cartesian coordinate system (x, y, z axis).

Figure 1-1. Simple vector

If the vector Â is represented in a three-dimensional space, it will look something like what is shown in Figure 1-2. This vector Â can also be represented with the help of basis vectors.


Figure 1-2. Vector in three-dimensional space

Basis vectors are associated with the coordinate system and can be used
to represent any vector. These basis vectors have a length of 1 and, hence, are
also known as unit vectors. The direction of these basis vectors is determined
by their respective coordinates. For example, for three-dimensional representation, we have three basis vectors (x̂, ŷ, ẑ), so x̂ would have the direction of the x axis coordinate, and the ŷ basis vector would have the direction of the y axis. Similarly, this would be the case for ẑ.
Once the basis vectors are present, we can use the coordinate system
to find the components that represent the original vector Â. For simplicity,
and to understand the components of the vector well, let’s reduce the
coordinate system from three dimensions to two. So, now the vector Â
looks something like what is shown in Figure 1-3.

Figure 1-3. 2-dimensional view


To find the first component of the vector Â along the x axis, we will project it onto the x axis, as shown in Figure 1-4. Now, wherever
the projection meets the x axis is known as the x component, or first
component, of the vector.

Figure 1-4. Vector Magnitude

If you look carefully, you can easily recognize this x component as the
sum of a few basis vectors along the x axis. In this case, adding three basis
vectors will give the x component of vector Â. Similarly, we can find the y
component of vector Â by projecting it on the y axis and adding up the
basis vectors (2 ŷ) along the y axis to represent it. In simple terms, we can
think of this as how much one has to move in the x axis direction and y axis
direction in order to reach vector Â.

 = 3 xˆ + 2 yˆ

One other thing worth noting is that as the angle between vector Â
and the x axis increases, the x component decreases, but the y component
increases. Vectors are part of a bigger class of objects known as tensors.
If we end up multiplying a vector with another vector, we get a result
that is a scalar quantity, whereas if we multiply a vector with a scalar
value, it just increases or decreases in the same proportion, in terms of
its magnitude, without changing its direction. However, if we multiply


a vector with a tensor, it will result in a new vector that has a changed
magnitude as well as a new direction.
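These three multiplication behaviors can be verified numerically. The sketch below is our own illustration, not code from the book; the particular rank-2 tensor chosen is a 90-degree rotation matrix, so the exact numbers are an assumption made for the example.

```python
import tensorflow as tf

A = tf.constant([3.0, 2.0])                 # the vector Â = 3x̂ + 2ŷ
dot = tf.tensordot(A, A, axes=1)            # vector times vector: a scalar (3*3 + 2*2 = 13)
scaled = 2.0 * A                            # scalar times vector: same direction, doubled magnitude
T = tf.constant([[0.0, -1.0],
                 [1.0,  0.0]])              # a rank-2 tensor (here, a 90-degree rotation)
rotated = tf.linalg.matvec(T, A)            # tensor times vector: new magnitude and direction

print(dot.numpy())      # 13.0
print(scaled.numpy())   # [6. 4.]
print(rotated.numpy())  # [-2. 3.]
```

Note how `rotated` points along a different direction than `A`, while `scaled` merely stretches it.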

Tensor
At the end of the day, a tensor is also a mathematical entity with which to
represent different properties, similar to a scalar, vector, or matrix. It is true
that a tensor is a generalization of a scalar or vector. In short, tensors are
multidimensional arrays that have some dynamic properties. A vector is a
one-dimensional tensor, whereas two-dimensional tensors are matrices
(Figure 1-5).

Figure 1-5. Tensors

Tensors can be of two types: constant or variable.
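As a quick TensorFlow 2.0 sketch of the two types (the values here are illustrative): a constant's value is fixed at creation, while a variable's value can be updated in place.

```python
import tensorflow as tf

# A constant tensor: its value cannot change after creation
c = tf.constant([[1.0, 2.0], [3.0, 4.0]])

# A variable tensor: its value can be updated in place, which is
# how trainable parameters such as model weights are stored
v = tf.Variable([[1.0, 2.0], [3.0, 4.0]])
v.assign_add(tf.ones_like(v))  # add 1 to every element

print(c.numpy().tolist())  # [[1.0, 2.0], [3.0, 4.0]]
print(v.numpy().tolist())  # [[2.0, 3.0], [4.0, 5.0]]
```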


Rank
Ranking tensors can sometimes be confusing for some people, but in terms
of tensors, rank simply indicates the number of directions required to
describe the properties of an object, meaning the dimensions of the array
contained in the tensor itself. Breaking this down for different objects, a
scalar doesn’t have any direction and, hence, automatically becomes a
rank 0 tensor, whereas a vector, which can be described using only one
direction, becomes a first rank tensor. The next object, which is a matrix,
requires two directions to describe it and becomes a second rank tensor.
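The rank of each of these objects can be checked directly with tf.rank. A minimal sketch (the values are illustrative):

```python
import tensorflow as tf

scalar = tf.constant(32)                 # no direction needed: rank 0
vector = tf.constant([3, 4, 5])          # one direction: rank 1
matrix = tf.constant([[1, 2], [3, 4]])   # two directions: rank 2

print(tf.rank(scalar).numpy())  # 0
print(tf.rank(vector).numpy())  # 1
print(tf.rank(matrix).numpy())  # 2
```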

Shape
The shape of a tensor represents the number of values in each dimension.

Scalar—32: The shape of the tensor would be [ ].

Vector—[3, 4, 5]: The shape of the first rank tensor would be [3].

Matrix—[[1, 2, 3], [4, 5, 6], [7, 8, 9]]: The second rank tensor would
have a shape of [3, 3].
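The same three objects can be inspected in TensorFlow 2.0 through the shape attribute. A minimal sketch, using the values above:

```python
import tensorflow as tf

scalar = tf.constant(32)
vector = tf.constant([3, 4, 5])
matrix = tf.constant([[1, 2, 3], [4, 5, 6], [7, 8, 9]])

print(scalar.shape.as_list())  # []
print(vector.shape.as_list())  # [3]
print(matrix.shape.as_list())  # [3, 3]
```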

Flow
Now comes the second part of TensorFlow: flow. This is basically an
underlying graph computation framework that uses tensors for its
execution. A typical graph consists of two entities: nodes and edges, as
shown in Figure 1-6. Nodes are also called vertices.


Figure 1-6. Typical graph

The edges are essentially the connections between the nodes/vertices
through which the data flows, and nodes are where actual computation
takes place. Now, in general, the graph can be cyclic or acyclic, but in
TensorFlow, it is always acyclic, meaning no path can start and end at the same node.
Let’s consider a simple computational graph, as shown in Figure 1-7, and
explore some of its attributes.

Figure 1-7. Computational graph


The nodes in the graph indicate some sort of computation, such as
addition, multiplication, division, etc., except for the leaf nodes, which
contain the actual tensors with either constant or variable values to be
operated upon. These tensors flow through the edges or connections
between nodes, and the computation at the next node results in formation
of a new tensor. So, in the sample graph, a new tensor m is created through
a computation at the node using other tensors x and y. The thing to focus
on in this graph is that computations take place only at the next stage
after leaf nodes, as leaf nodes can only be simple tensors, which become
input for next-node computation flowing through edges. We can also
represent the computations at each node through a hierarchical structure.
The nodes at the same level can be executed in parallel, as there is no
interdependency between them. In this case, m and n can be calculated
in parallel at the same time. This attribute of graph helps to execute
computational graphs in a distributed manner, which allows TensorFlow
to be used for large-scale applications.

TensorFlow 1.0 vs. TensorFlow 2.0


Although TensorFlow was well adopted by the IT community after it was
made available on an open source basis, there were still a lot of gaps, in
terms of its user-friendliness. Users found it somewhat difficult to write
TensorFlow-based code. Therefore, there was a lot of critical feedback
by the developer and research communities regarding a few aspects of
TensorFlow. As a result, the TensorFlow core development team started
incorporating the suggested changes, to make the product easier to use
and more effective. This section reviews those changes that have been
incorporated into the TensorFlow 2.0 beta version. There are mainly three
broad categories of changes that have been introduced in TensorFlow 2.0.


1. Usability-related modifications

2. Performance-related modifications

3. Deployment-related modifications

In this chapter, we are going to focus on only the first two categories, as
Chapter 6 covers TensorFlow model deployment.

Usability-Related Changes
The first category of changes mainly focused on TensorFlow’s ease of use
and more consistent APIs. To go through these changes in detail, we have
further subcategorized them according to three broad types.

1. Simpler APIs

2. Improved documentation

3. More inbuilt data sources

Simpler APIs
One of the most common criticisms of TensorFlow concerned its
APIs, which were not user-friendly; thus, a major focus of TensorFlow 2.0
has been on overhauling them. TensorFlow 2.0 now provides two levels
of APIs:

1. High-level APIs

2. Lower-level APIs

High-Level APIs
The high-level APIs make it easier to use TensorFlow for various
applications, as these APIs are more intuitive in nature. These new
high-level APIs have made debugging relatively easier than in earlier versions.
As TensorFlow 1.0 was graph control–based, users were not able to debug
their programs easily. TensorFlow 2.0 has now introduced eager execution,
which performs operations and returns output instantly.

Lower-Level APIs
The other available set of APIs are the lower-level APIs, which offer much
more flexibility and configurability, allowing users to define and
parameterize models according to their specific requirements.

Session Execution

Readers who have used earlier versions of TensorFlow must have gone
through the conventional procedure, session execution, to get to an
operational graph, which likely consisted of the following steps:

1. First, create the tf.Graph object and set it to the default graph
for the current scope.

2. Declare the computation part in TensorFlow: c = tf.matmul(m, n).

3. Define the variable sharing and scope, as required.

4. Create and configure the tf.Session to build the graph and
connect to the tf.Session.

5. Initialize all the variables in advance.

6. Use the tf.Session.run method to start the computation.

7. The tf.Session.run then triggers a procedure to compute the
final output.


Eager Execution

With eager execution, TensorFlow 2.0 adopts a radically different approach
and removes the need to execute most of the preceding steps.

1. TensorFlow 2.0 doesn't require the graph definition.

2. TensorFlow 2.0 doesn't require the session execution.

3. TensorFlow 2.0 doesn't make it mandatory to initialize variables.

4. TensorFlow 2.0 doesn't require variable sharing via scopes.

To understand these differences in detail, let's consider an example
using TensorFlow 1.0 vs. TensorFlow 2.0.

[In]: import tensorflow as tf
[In]: tfs=tf.InteractiveSession()
[In]: c1=tf.constant(10,name='x')
[In]: print(c1)
[Out]: Tensor("x:0", shape=(), dtype=int32)
[In]: tfs.run(c1)
[Out]: 10

Import the new version of TensorFlow.

[In]: ! pip install -q tensorflow==2.0.0-beta1
[In]: import tensorflow as tf
[In]: print(tf.__version__)
[Out]: 2.0.0-beta1
[In]: c_1=tf.constant(10)
[In]: print(c_1)
[Out]: tf.Tensor(10, shape=(), dtype=int32)

# Operations


TensorFlow 1.0

[In]: c2=tf.constant(5.0,name='y')
[In]: c3=tf.constant(7.0,tf.float32,name='z')
[In]: op1=tf.add(c2,c3)
[In]: op2=tf.multiply(c2,c3)
[In]: tfs.run(op2)
[Out]: 35.0
[In]: tfs.run(op1)
[Out]: 12.0

TensorFlow 2.0

[In]: c2=tf.constant(5.0)
[In]: c3=tf.constant(7.0)
[In]: op_1=tf.add(c2,c3)
[In]: print(op_1)
[Out]: tf.Tensor(12.0, shape=(), dtype=float32)
[In]: op_2=tf.multiply(c2,c3)
[In]: print(op_2)
[Out]: tf.Tensor(35.0, shape=(), dtype=float32)

TensorFlow 1.0

g = tf.Graph()
with g.as_default():
    a = tf.constant([[10,10],[11.,1.]])
    x = tf.constant([[1.,0.],[0.,1.]])
    b = tf.Variable(12.)
    y = tf.matmul(a, x) + b
    init_op = tf.global_variables_initializer()

with tf.Session() as sess:
    sess.run(init_op)
    print(sess.run(y))


TensorFlow 2.0

a = tf.constant([[10,10],[11.,1.]])
x = tf.constant([[1.,0.],[0.,1.]])
b = tf.Variable(12.)
y = tf.matmul(a, x) + b
print(y.numpy())

Note With TensorFlow 1.0 graph execution, the program state
(such as variables) is stored in global collections, and their lifetime
is managed by the tf.Session object. By contrast, during eager
execution, the lifetime of state objects is determined by the lifetime of
their corresponding Python object.

tf.function
Another powerful introduction of TensorFlow 2.0 is its tf.function
capability, which converts relevant Python code into a formidable
TensorFlow graph. It combines the flexibility of eager execution and
strength of graph computations. As mentioned, TensorFlow 2.0 doesn’t
require the creation of a tf.Session object. Instead, simple Python
functions can be translated into a graph, using the tf.function decorator.
In simple terms, in order to define a graph in TensorFlow 2.0, we must
define a Python function and decorate it with @tf.function.
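As a minimal sketch (the function name and input values are our own, for illustration):

```python
import tensorflow as tf

# Decorating a plain Python function with @tf.function traces it
# into a TensorFlow graph, while it can still be called like a
# normal function
@tf.function
def scale_and_shift(x, w, b):
    return tf.add(tf.multiply(x, w), b)

result = scale_and_shift(tf.constant(2.0), tf.constant(3.0), tf.constant(1.0))
print(result.numpy())  # 7.0
```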

Keras
tf.keras was originally meant for small-scale models, as it had very
simple APIs, but it was not scalable. TensorFlow also had introduced
estimators that were designed for scaling and distributed training of
machine learning models. Estimators had a huge advantage, as they offered
fault-tolerant training in a distributed environment, but their APIs were
not very user-friendly and were often regarded as confusing and a little
hard to consume. With this in mind, TensorFlow 2.0 has introduced the
standardized version of tf.keras, which combines the simplicity of Keras
and power of estimators.
The code for tf.keras in TensorFlow versions 1.13 and 2.0 remains the
same, but what has changed under the hood is the integration of Keras
with new features of TensorFlow 2.0. To elaborate a little bit, if a particular
piece of code was run with tf.keras in version 1.13, it would build a
graph-based model that ran a session under the hood, which we initiated
in the code. In version 2.0, the same model definition would run in eager
mode, without any modification whatsoever.
With eager mode, it becomes easy to debug the code, compared to
earlier graph-based execution. In eager mode, the data set pipelines
behave exactly as those of a NumPy array, but TensorFlow takes care of the
optimization in the best possible manner. Graphs are still very much part
of TensorFlow but operate in the background.

Redundancy
Another useful feedback from the community regarding TensorFlow usage
was that there were too many redundant components, which created
confusion when using them in different places. For example, there were
multiple optimizers and layers that one had to choose from while building
the model. TensorFlow 2.0 has removed all the redundant elements and
now comes with just one set of optimizers, metrics, losses, and layers.
Duplicative classes have also been reduced, making it easier for users to
figure out what to use and when.


Improved Documentation and More Inbuilt Data Sources


TensorFlow.org now contains much more exhaustive and detailed
documentation for TensorFlow. This was critical from the user’s
perspective, as earlier versions had limited examples and tutorials for
reference. This new documentation includes a lot of new data sources
(small as well as big) for users to make use of in their programs or for
learning purposes. The new APIs also make it very easy to import any new
data source in TensorFlow. Some of the data sets from different domains
that are made available within TensorFlow are shown in Table 1-1.

Table 1-1. Data Sets Within TensorFlow 2.0

Sr. No   Category     Data set
1        Text         imdb_reviews, squad
2        Image        mnist, imagenet2012, coco2014, cifar10
3        Video        moving_mnist, starcraft_video, bair_robot_pushing_small
4        Audio        nsynth
5        Structured   titanic, iris

Performance-Related Changes
The TensorFlow development team also claims that new changes have
improved product performance over earlier versions. Based on training
and inference results using different processors (GPUs, TPUs), it seems
TensorFlow has roughly doubled its speed, on average.


Installation and Basic Operations in TensorFlow 2.0
There are multiple ways in which we can use TensorFlow (local as well as
cloud). In this section, we go over three ways in which TensorFlow 2.0 can
be used locally as well as in the cloud.

1. Anaconda

2. Colab

3. Databricks

Anaconda
This is the simplest way of using TensorFlow on a local system. We can pip
install the latest version of TensorFlow, as follows:

[In]: pip install -q tensorflow==2.0.0-beta1

Colab
The most convenient way to use TensorFlow, provided by Google’s
TensorFlow team, is Colab. Short for Colaboratory, this represents the idea
of collaboration and online laboratories. It is a free Jupyter-based web
environment requiring no setup, as it comes with all the dependencies
prebuilt. It provides an easy and convenient way to let users write
TensorFlow code within their browser, without having to worry about any
sort of installations and dependencies. Let’s go over the steps to see how to
use Google Colab for TensorFlow 2.0.

1. Go to https://colab.research.google.com. You
will see that the console has multiple options, as
shown in Figure 1-8.


Figure 1-8. Python Notebook Colaboratory (Colab) console

2. Select the relevant option from the console, which
contains the following five tabs:

a. Examples. Shows the default notebooks provided in Colab

b. Recent. The last few notebooks that the user worked on

c. Google Drive. The notebooks linked to the user's Google
Drive account

d. GitHub. The option to link the notebooks present in the user's
GitHub account

e. Upload. The option to upload a new ipynb or github file

3. Click New Python 3 Notebook, and a new Colab
notebook will appear, as shown in Figure 1-9.


Figure 1-9. New notebook

4. Install and import TensorFlow 2.0 (Beta).

[In]: ! pip install -q tensorflow==2.0.0-beta1
[In]: import tensorflow as tf
[In]: print(tf.__version__)
[Out]: 2.0.0-beta1

Another great advantage of using Colab is that it allows you to build
your models on GPU in the back end, using Keras, TensorFlow, and
PyTorch. It also provides 12GB RAM, with usage up to 12 hours.

Databricks
Another way to use TensorFlow is through the Databricks platform. The
method of installing TensorFlow on Databricks is shown following, using
a community edition account, but the same procedure can be adopted for
business account usage as well. The first step is to log in to the Databricks
account and spin up a cluster of desired size (Figures 1-10–1-12).


Figure 1-10. Databricks

Figure 1-11. Clusters


Figure 1-12. Cluster settings

Once the cluster is up and running, we go to the Libraries options of
the cluster, via Actions, as shown in Figure 1-13.

Figure 1-13. Cluster library

Within the Libraries tab, if the cluster already has a set of pre-installed
libraries, they will be listed, or, in the case of a new cluster, no packages
will be installed. We then click the Install New button (Figure 1-14).


Figure 1-14. Installing a new library

This will open a new window with multiple options to import or
install a new library in Databricks (Figure 1-15). We select PyPI, and in
the Package option, we mention the version of TensorFlow required to be
installed, as shown in Figure 1-16.

Figure 1-15. PyPI source

Figure 1-16. TensorFlow package


It will take some time, and we can then see TensorFlow successfully
installed in Databricks, under Libraries. We can now open a new or
existing notebook using the same cluster (Figures 1-17 and 1-18).

Figure 1-17. Running cluster

Figure 1-18. New notebook

The final step is simply to import TensorFlow in the notebook and
validate the version. We can print the TensorFlow version, as shown in
Figure 1-19.


Figure 1-19. TensorFlow notebook

Conclusion
In this chapter, we explained the fundamental difference between a vector
and a tensor. We also covered the major differences between previous
and current versions of TensorFlow. Finally, we went over the process
of installing TensorFlow locally as well as in a cloud environment (with
Databricks).

CHAPTER 2

Supervised Learning
with TensorFlow
In this chapter, we explain the concept of supervised machine
learning. Next, we take a deep dive into supervised machine learning
techniques such as linear regression, logistic regression, and boosted trees.
Finally, we demonstrate all the aforementioned techniques, using
TensorFlow 2.0.

What Is Supervised Machine Learning?


First, let us quickly review the concept of machine learning and then see
what supervised machine learning is, with the help of an example.
As defined by Arthur Samuel in 1959, machine learning is the field
of study that gives computers the ability to learn without being explicitly
programmed. The aim of machine learning is to build programs whose
performance improves automatically with some input parameters, such as
data, performance criteria, etc. The programs become more data-driven,
in terms of making decisions or predictions. We may not be aware of it,
but machine learning has taken over our daily lives, from recommending
products on online portals to self-driving cars that can take us from point
A to point B without our driving them or employing a driver.

© Pramod Singh, Avinash Manure 2020


P. Singh and A. Manure, Learn TensorFlow 2.0,
https://doi.org/10.1007/978-1-4842-5558-2_2

Machine learning is a part of artificial intelligence (AI) and mainly
comprises three types:

1. Supervised machine learning

2. Unsupervised machine learning

3. Reinforcement learning

Let us explore supervised machine learning via an example and
then implement different techniques using TensorFlow 2.0. Note that
unsupervised machine learning and reinforcement learning are beyond
the scope of this book.
Imagine a three-year-old seeing a kitten for the first time. How would
the child react? The child doesn’t know what he/she is seeing. He or she
might initially experience a feeling of curiosity, fear, or joy. It is only after
his or her parents pet the kitten that the child realizes the animal might not
harm him/her. Later, the child might be comfortable enough to hold the
kitten and play with it. Now, the next time the child sees a kitten, he/she
may instantly recognize it and start playing with it, without the initial fear
or curiosity it felt toward the kitten previously. The child has learned that
the kitten is not harmful, and, instead, he/she can play with it. This is how
supervised learning works in real life.
In the machine world, supervised learning is done by providing a
machine inputs and labels and asking it to learn from them. For example,
using the preceding example, we can provide to the machine pictures
of kittens, with the corresponding label (kitten), and ask it to learn the
intrinsic features of a kitten, so that it can generalize well. Later, if we
provide an image of another kitten without a label, the machine will be
able to predict that the image is that of a kitten.
Supervised learning usually comprises two phases: training and
testing/prediction. In the training phase, a set of the total data, called
a training set, is provided to the machine learning algorithm, made up
of input data (features) as well as output data (labels). The aim of the
training phase is to make sure the algorithm learns as much as possible
from the input data and forms a mapping between input and output, such
that it can be used to make predictions. In the test/prediction phase, the
remaining set of data, called a test set, is provided to the algorithm and
comprises only the input data (features) and not the labels. The aim of the
test/prediction phase is to check how well the model is able to learn and
generalize. If the accuracy of the training and test sets differs too much,
we can infer that the model might have mapped the input and output of
training data too closely, and, therefore, it was not able to generalize the
unseen data (test set) well. This is generally known as overfitting.
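A simple way to think about this check, sketched in plain Python with made-up accuracy numbers (the 0.1 threshold is an arbitrary rule of thumb, not a standard):

```python
# Hypothetical accuracies from a trained model (illustrative values)
train_accuracy = 0.98
test_accuracy = 0.72

# A large gap between training and test accuracy is a common
# warning sign that the model has overfit the training data
gap = train_accuracy - test_accuracy
if gap > 0.1:
    print("Model is likely overfitting")
```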
A typical supervised machine learning architecture is shown in
Figure 2-1.

Figure 2-1. Supervised machine learning architecture


Within supervised learning, if we are to predict numeric values, this
is called regression, whereas if we are to predict classes or categorical
variables, we call that classification. For example, if the aim is to predict
the sales (in dollars) a company is going to earn (numeric value), this
comes under regression. If the aim is to determine whether a customer will
buy a product from an online store or to check if an employee is going to
churn or not (categorical yes or no), this is a classification problem.
Classification can be further divided into binary and multi-class.
Binary classification deals with classifying two outcomes, i.e., either yes
or no. Multi-class classification yields multiple outcomes. For example,
a customer is categorized as a hot prospect, warm prospect, or cold
prospect, etc.

Linear Regression with TensorFlow 2.0


In linear regression, as with any other regression problem, we are trying
to map the inputs and the output, such that we are able to predict the
numeric output. We try to form a simple linear regression equation:

y = mx + b

In this equation, y is the numeric output that we are interested in, and
x is the input variable, i.e., part of the features set. m is the slope of the line,
and b is the intercept. For multi-variate input features (multiple linear
regression), we can generalize the equation, as follows:

y = m1x1 + m2x2 + m3x3 + … + mnxn + b

where x1, x2, x3, …, xn are different input features, m1, m2, m3, …, mn are
the slopes for different features, and b is the intercept.
This equation can also be represented graphically, as shown in
Figure 2-2 (in 2D).
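Before turning to TensorFlow, the multiple regression equation itself can be sketched in a few lines of plain Python (the slopes, intercept, and feature values below are made up, purely for illustration):

```python
def predict(features, slopes, intercept):
    # y = m1*x1 + m2*x2 + ... + mn*xn + b
    return sum(m * x for m, x in zip(slopes, features)) + intercept

y = predict(features=[1.0, 2.0, 3.0], slopes=[0.5, -1.0, 2.0], intercept=4.0)
print(y)  # 8.5
```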


Figure 2-2. Linear regression graph

Here, we can clearly see that there is a linear relation between label y
and feature inputs X.

Implementation of a Linear Regression Model, Using TensorFlow and Keras
We will implement the linear regression method in TensorFlow 2.0, using
the Boston housing data set and the LinearRegressor estimator available
within the TensorFlow package.

1. Import the required modules.

[In]: from __future__ import absolute_import, division, print_function, unicode_literals

[In]: import numpy as np
[In]: import pandas as pd
[In]: import seaborn as sb
[In]: import tensorflow as tf
[In]: from tensorflow import keras as ks


[In]: from tensorflow.estimator import LinearRegressor
[In]: from sklearn import datasets
[In]: from sklearn.model_selection import train_test_split
[In]: from sklearn.metrics import mean_squared_error, r2_score
[In]: print(tf.__version__)
[Out]: 2.0.0-rc1

2. Load and configure the Boston housing data set.

[In]: boston_load = datasets.load_boston()
[In]: feature_columns = boston_load.feature_names
[In]: target_column = boston_load.target
[In]: boston_data = pd.DataFrame(boston_load.data,
columns=feature_columns).astype(np.float32)
[In]: boston_data['MEDV'] = target_column.astype(np.float32)
[In]: boston_data.head()

[Out]:

3. Check the relation between the variables, using
pairplot and a correlation graph.

[In]: sb.pairplot(boston_data, diag_kind="kde")
[Out]:

for Harper's Weekly and The London Chronicle, to report on the gold-rush. He
also wrote a book on his experience, Klondike Stampede, published in 1900.
In 1899 he married Minnie Bell Sharp, of Woodstock, but by 1900 Adney was
again in the Northwest, this time as special correspondent for Collier's
magazine at Nome, Alaska, during the gold-rush of that year. On his return to
New York, Adney engaged in illustrating outdoor scenes and also lectured for
the Society for the Prevention of Cruelty to Animals. In 1908 he contributed to
a Harper's Outdoor Book for Boys. From New York he removed to Montreal
and became a citizen of Canada, entering the Canadian Army as a Lieutenant
of Engineers in 1916. He was assigned to the construction of training models
and was on the staff of the Military College, mustering out in 1919. He then
made his home in Montreal, engaging in painting and illustrating. From his
early years in Woodstock he had made a hobby of the study of birch-bark
canoes, and while in Montreal he became honorary consultant to the Museum
of McGill University, dealing with Indian lore. By 1925 Adney had assembled a
great deal of material and, to clarify his ideas, he began construction of scale
models of each type of canoe, carrying on a very extensive correspondence
with Indians, factors and other employees (retired and active) of the Hudson's
Bay Company, and with government agents on the Indian Reservations. He
also made a number of expeditions to interview Indians. Possessing linguistic
ability in Malecite, he was much interested in all the Indian languages; this
helped him in his canoe studies.
Owing to personal and financial misfortunes, he and his wife (then blind)
returned in the early 1930's to her family homestead in Woodstock, where
Mrs. Adney died in 1937. Adney continued his work under the greatest
difficulties, including ill-health, until his death, October 10, 1950. He did not
succeed in completing his research and had not organized his collection of
papers and notes for publication when he died.
Through the farsightedness of Frederick Hill, then director of The Mariners'
Museum, Newport News, Virginia, Adney had, ten years before his death,
deposited in the museum over a hundred of his models and a portion of his
papers. After his death his son Glenn Adney cooperated in placing in The
Mariners' Museum the remaining papers dealing with bark canoes, thus
completing the "Adney Collection."
Frederick Hill's appreciation of the scope and value of the collection prompted
him to seek my assistance in organizing this material with a view to
publication. Though the Adney papers were apparently complete and were
found, upon careful examination, to contain an immense amount of valuable
information, they were in a highly chaotic state. At the request of The
Mariners' Museum, I have assembled the pertinent papers and have compiled
from Adney's research notes as complete a description as I could of bark
canoes, their history, construction, decoration and use. I had long been
interested in the primitive watercraft of the Americas, but I was one of those
who had discontinued research on bark canoes upon learning of Adney's
work. The little I had accomplished dealt almost entirely with the canoes of
Alaska and British Columbia; from these I had turned to dugouts and to the
skin boats of the Eskimo. Therefore I have faced with much diffidence the
task of assembling and preparing the Adney papers for publication,
particularly since it was not always clear what Adney had finally decided about
certain matters pertaining to canoes. His notes were seldom arranged in a
sequence that would enable the reader to decide which, of a number of
solutions or opinions given, were Adney's final ones.
Adney's interest in canoes, as canoes, was very great, but his interest in
anthropology led him to form many opinions about pre-Columbian migrations
of Indian tribes and about the significance of the decorations used in some
canoes. His papers contain considerable discussion of these matters, but they
are in such state that only an ethnologist could edit and evaluate them. In
addition, my own studies lead me to conclude that the mere examination of
watercraft alone is insufficient evidence upon which to base opinions as far-
reaching as those of Adney. Therefore I have not attempted to present in this
work any of Adney's theories regarding the origin or ethnological significance
of the canoes discussed. I have followed the same practice with those Adney
papers which concern Indian language, some of which relate to individual
tribal canoe types and are contained in the canoe material. (Most of his
papers on linguistics are now in The Peabody Museum, Salem,
Massachusetts.)
The strength and weaknesses of Adney's work, as shown in his papers,
drawings, and models, seem to me to be fully apparent. That part dealing
with the eastern Indians, with whom he had long personal contact, is by far
the most voluminous and, perhaps, the most accurate. The canoes used by
Indians west of the St. Lawrence as far as the western end of the Great Lakes
and northward to the west side of Hudson's Bay are, with a few exceptions,
covered in somewhat less detail, but the material nonetheless appears ample
for our purpose. The canoes used in the Canadian Northwest, except those
from the vicinity of Great Slave Lake, and in Alaska were less well described.
It appears that Adney had relatively little opportunity to examine closely the
canoes used in Alaska, during his visit there in 1900, and that he later was
unable to visit those American museums having collections that would have
helped him with regard to these areas. As a result, I have found it desirable to
add my own material on these areas, drawn largely from the collections of
American museums and from my notes on construction details.
An important part of Adney's work deals with the large canoes used in the fur
trade. Very little beyond the barest of descriptions has been published and,
with but few exceptions, contemporary paintings and drawings of these
canoes are obviously faulty. Adney was fortunate enough to have been able to
begin his research on these canoes while there were men alive who had built
and used them. As a result he obtained information that would have been lost
within, at most, the span of a decade. His interest was doubly keen,
fortunately, for Adney not only was interested in the canoes as such, he also
valued the information for its aid in painting historical scenes. As a result,
there is hardly a question concerning fur trade canoes, whether of model,
construction, decoration, or use, that is not answered in his material.
I have made every effort to preserve the results of Adney's investigations of
the individual types in accurate drawings or in the descriptions in the text. It
was necessary to redraw and complete most of Adney's scale drawings of
canoes, for they were prepared for model-building rather than for publication.
Where his drawings were incomplete, they could be filled in from his scale
models and notes. It must be kept in mind that in drawing plans of primitive
craft the draftsman must inevitably "idealize" the subject somewhat, since a
drawing shows fair curves and straight lines which the primitive craft do not
have in all cases. Also, the inboard profiles are diagrammatic rather than
precise, because, in the necessary reduction of the full-size canoe to a
drawing, this is the only way to show its "form" in a manner that can be
interpreted accurately and that can be reproduced in a model or full size, as
desired. It is necessary to add that, though most of the Adney plans were
measured from full-size canoes, some were reconstructed from Indian
models, builders' information, or other sources. Thanks to Adney's thorough
knowledge of bark construction, the plans are highly accurate, but there are
still chances for error, and these are discussed where they occur.
Although reconstruction of extinct canoe types is difficult, for the strange
canoes of the Beothuk Indians of Newfoundland Adney appears to have
solved some of the riddles posed by contemporary descriptions and the few
grave models extant (the latter may have been children's toys). Whether or
not his reconstructed canoe is completely accurate cannot be determined; at
least it conforms reasonably well to the descriptions and models, and Adney's
thorough knowledge of Indian craftsmanship gives weight to his opinions and
conclusions. This much can be said: the resulting canoe would be a practical
one and it fulfills very nearly all descriptions of the type known today.
Adney's papers and drawings dealing with the construction of bark canoes are
most complete and valuable. So complete as to be almost a set of "how-to-
do-it" instructions, they cover everything from the selection of materials and
use of tools to the art of shaping and building the canoe. An understanding of
these building instructions is essential to any sound examination of the bark
canoes of North America, for they show the limitations of the medium and
indicate what was and what was not reasonable to expect from the finished
product.
In working on Adney's papers, it became obvious that this publication could
not be limited to birch-bark canoes, since canoes built of other barks and
even some covered with skins appear in the birch bark areas. Because of this,
and to explain the technical differences between these and the birch canoes,
skin-covered canoes have been included. I have also appended a chapter on
Eskimo skin boats and kayaks. This material I had originally prepared for
inclusion in the Encyclopedia Arctica, publication of which was cancelled after
one volume had appeared. As a result, the present work now covers the
native craft, exclusive of dugouts, of all North America north of Mexico.
In my opinion the value of the information gathered by Edwin Tappan Adney
is well worth the effort that has been expended to bring it to its present form,
and any merit that attaches to it belongs largely to Adney himself, whose long
and painstaking research, carried on under severe personal difficulties, is the
foundation of this study.
Howard Irving Chapelle
Curator of Transportation, Museum of History and Technology
Chapter One
EARLY HISTORY
The development of bark canoes in North America before the arrival of the
white men cannot satisfactorily be traced. Unlike the dugout, the bark canoe
is too perishable to survive in recognizable form buried in a bog or submerged
in water, so we have little or no visual evidence of very great age upon which
to base sound assumptions.
Records of bark canoes, contained in the reports of the early white explorers
of North America, are woefully lacking in detail, but they at least give grounds
for believing that the bark canoes even then were highly developed, and were
the product of a very long period of existence and improvement prior to the
first appearance of Europeans.
The Europeans were most impressed by the fact that the canoes were built of
bark reinforced by a light wooden frame. The speed with which they could be
propelled by the Indians also caused amazement, as did their light weight and
marked strength, combined with a great load-carrying capacity in shallow
water. It is remarkable, however, that although bark canoes apparently
aroused so much admiration among Europeans, so little accurate and
complete information appears in their writings.
With two notable exceptions, to be discussed later, early explorers,
churchmen, travellers, and writers were generally content merely to mention
the number of persons in a canoe. The first published account of variations in
existing forms of the American bark canoe does not occur until 1724, and the
first known illustration of a bark canoe accurate enough to indicate its tribal
designation appeared only two years earlier. This fact makes any detailed
examination of the early books dealing with North America quite unprofitable
as far as precise information on bark canoes is concerned.
The first known reference by a Frenchman to the bark canoe is that of
Jacques Cartier, who reported that he saw two bark canoes in 1535; he said
the two carried a total of 17 men. Champlain was the first to record any
definite dimensions of the bark canoes; he wrote that in 1603 he saw, near
what is now Quebec, bark canoes 8 to 9 paces long and 1½ paces wide, and
he added that they might transport as much as a pipe of wine yet were light
enough to be carried easily by one man. If a pace is taken as about 30 inches,
then the canoes would have been between 20 and 23 feet long, between 40
and 50 inches beam and capable of carrying about half a ton, English
measurements. These were apparently Algonkin canoes. Champlain was
impressed by the speed of the bark canoes; he reported that his fully manned
longboat was passed by two canoes, each with two paddlers. As will be seen,
he was perhaps primarily responsible for the rapid adoption of bark canoes by
the early French in Canada.
The first English reference that has been found is in the records of Captain
George Weymouth's voyage. He and his crew in 1603 saw bark canoes to the
westward of Penobscot Bay, on what is now the coast of Maine. The English
were impressed, just as Champlain had been, by the speed with which canoes
having but three or four paddlers could pass his ship's boat manned with four
oarsmen. Weymouth also speaks admiringly of the fine workmanship shown in
the structure of the canoes.
When Champlain attacked the Iroquois, on what is now Lake Champlain, he
found that these Indians had "oak" bark (more probably elm) canoes capable
of carrying 10, 15, and 18 men. This would indicate that the maximum size of
the Iroquois canoes was about 30 to 33 feet long. The illustrations in his
published account indicate canoes about 30 feet long; but early illustrations of
this kind were too often the product of the artist's imagination, just as were
the delineations of the animals and plants of North America.
As an example of what may be deduced from other early French accounts,
Champlain in 1615, with a companion and 12 Indians, embarked at La Chine
in two bark canoes for a trip to the Great Lakes. He stated that the two
canoes, with men and baggage aboard, were over-crowded. Taking one of
these canoes as having 7 men and baggage aboard, it seems apparent that it
was not much larger than the largest of the canoes Champlain had seen in
1603 on the St. Lawrence. But in 1672, Louis Joliet and Father Jacques
Marquette traveled in two canoes, carrying a total of 5 French and 25 Indians
—say 14 in one canoe and 16 in the other. These canoes, then, must have
been at least 28 feet long over the gunwales, exclusive of the round of the
ends, or about 30 feet overall. The Chevalier Henri de Tonti, one of La Salle's
officers, mentions a canoe carrying 30 men—probably 14 paddlers on each
side, a steersman, and a passenger or officer. Such a capacity might indicate a
canoe about 40 feet over the gunwales, though this seems very long indeed;
it is more probable that the canoe would be about 36 feet long.
Another of La Salle's officers, Baron de LaHontan, gave the first reasonably
complete account that has been found of the size and character of a birch-
bark canoe. This was written at Montreal June 29, 1684. After stating that he
had seen at least a hundred bark canoes in his journeys, he said that birch-
bark canoes ranged in length from 10 to 28 pieds and were capable of
carrying from 2 to 14 persons. The largest, when carrying cargo, might be
handled by three men and could carry 2,000 pounds of freight (20 quintals).
These large canoes were safe and never upset. They were built of bark peeled
in the winter; hot water was thrown on the bark to make it pliable, so that it
could be rolled up after it was removed from the tree. The canoes were built
of more than one piece of bark as a rule.
The large canoes, he reports, were 28 pieds long, 4½ pieds wide and 20
pouces deep, top of gunwale to top of frames on bottom. The last indicates
"inside" measurement; in this the length would be over the gunwales, not
overall, and the beam inside the gunwales, not extreme. He also says the
canoes had a lining or sheathing of cedar "splints" or plank and, inside this,
cedar ribs or frames. The bark was the thickness of an écu (this coin, a
crown, was a little less than ⅛ inch thick), the sheathing the thickness of two
écus, and the ribs of three. The ends of the ribs were pointed and these were
seated in holes in the underside of the gunwales. There were 8 crosspieces
(thwarts) between the gunwales (note: such a canoe would commonly have 9
thwarts; LaHontan may have erred here).
The canoes were convenient, he says, because of their great lightness and
shallow draft, but they were easily damaged. Hence they had to be loaded
and unloaded afloat and usually required repairs to the bark covers at the end
of each day. They had to be staked down at night, so that a strong wind
might not damage or blow them away; but this light weight permitted them to
be carried with ease by two men, one at each end, and this suited them for
use on the rivers of Canada, where rapids and falls made carrying frequently
necessary. These canoes were of no value on the Lakes, LaHontan states, as
they could not be used in windy weather; though in good weather they might
cross lakes and might go four or five leagues on open water. The canoes
carried small sails, but these could be used only with fair winds of moderate
force. The paddlers might kneel, sit, or stand to paddle and pole the canoes.
The paddle blade was 20 pouces long, 6 wide, and 4 lignes thick; the handle
was of the diameter of a pigeon's egg and three pieds long. The paddlers also
had a "setting pole," to pole the canoes in shoal water. The canoes were alike
at both ends and cost 80 écus (LaHontan's cost 90), and would last not more
than five or six years. The foregoing is but a condensed extract of LaHontan's
lively account.
In translating LaHontan's measurements a pied is taken as 12.79 inches, a
pouce as about 1 1/16 inches. The French fathom, or brasse, as used in colonial
Canada, was the length from finger-tip to finger-tip of the arms outstretched
and so varied, but may be roughly estimated as about 64 inches; this was the
"fathom" used later in classing fur-trade canoes for length. In English
measurements his large canoe would have been about 30 feet long over the
gunwales and, perhaps, almost 33 feet overall, 57½ inches beam inside the
gunwales, or about 60 inches extreme beam. The depth inside would be 21 or
21¾ inches bottom to top of gunwale amidships. LaHontan also described the
elm-bark canoes of the Iroquois as being large and wide enough to carry 30
paddlers, 15 on a side, sitting or standing. Here again a canoe about 40 feet
long is indicated. He said that these elm-bark canoes were crude, heavy and
slow, with low sides, so that once he and his men reached an open lake, he
no longer feared pursuit by the Iroquois in these craft.
Figure 2
Page From a Manuscript of 1771, "Observations
on Hudsons Bay," by Alexander Graham,
Factor, now in the archives of the Hudson's
Bay Company in London. The birch-bark
canoe at the top, the kayak below, and the
paddles are obviously drawn by one not
trained to observe as an artist.
From the slight evidence offered in such records as these, it appears that the
Indians may have had, when the Europeans first reached Canada, canoes at
least as long as the 5-fathom or 5½-fathom canoe of later times. It appears
also that these dimensions applied to the canoes of the Great Lakes area and
perhaps to the elm-bark canoes of the Iroquois as well. Probably there were
canoes as short as 10 feet, used as one-man hunting and fishing boats, and it
is plainly evident that canoes between this length and about 24 feet were very
common. The evidence in La Salle's time, in the last half of the seventeenth
century, must be taken with some caution, as French influence on the size of
large canoes may have by then come into play. The comparison between the
maximum length of the Iroquois canoes, inferred from the report of
Champlain, and that suggested by LaHontan, might indicate this growth.
Beginning as early as 1660, the colonial government of Canada issued congés
or trading licenses. These were first granted to the military officers or their
families; later the congés were issued to all approved traders, and the fees
were used for pensions of the military personnel. Records of these licenses,
preserved from about 1700, show that three men commonly made up the
crew of a trading canoe in the earliest years, but that by 1725 five men were
employed, by 1737 seven men, and by 1747 seven or eight men. However, as
LaHontan has stated that in his time three men were sufficient to man a large
canoe with cargo, it is evident that the congés offer unreliable data and do
not necessarily prove that the size of canoes had increased during this period.
The increase in the crews may have been brought about by the greater
distances travelled, with an increased number of portages or, perhaps, by
heavier items of cargo.
The war canoe does not appear in these early accounts as a special type.
According to the traditions of the eastern Micmac and Malecite Indians, their
war canoes were only large enough to carry three or four warriors and so
must not have exceeded 18 feet in length. These were built for speed, narrow
and with very sharp ends; the bottom was made as smooth as was possible.
Each canoe carried the insignia of its warriors, that is, each man's personal
mark or sign. A canoe carrying a war leader had only his personal mark, none
for the rest of the crew. It is possible to regard the large canoes of the
Iroquois as "war canoes" since they were used in the pursuit of French raiders
in LaHontan's time. However, the Iroquois did not build the canoes primarily
for war; in early times these fierce tribesmen preferred to take to the warpath
in the dead of winter and to raid overland on snowshoes. In open weather,
they used the rough, short-lived and quickly built elm-bark canoes to cross
streams and lakes or to follow waterways, discarding them when the
immediate purpose was accomplished. Probably it was the French who really
produced the bark "war canoes," for they appear to have placed great
emphasis on large canoes for use of the military, as indicated by LaHontan's
concern with the largest canoes of his time. Perhaps large bark canoes were
once used on the Great Lakes for war parties, but, if so, no mention of a
special type has been found in the early French accounts. The sparse
references suggest that both large and small canoes were used by the war
parties but that no special type paralleling the characteristics of the Micmac
and Malecite war canoes existed in the West. The huge dugout war canoe of
the Indians of the Northwest Coast appears to have had no counterpart in
size among the birch or elm bark canoes.
Except for LaHontan, the early French writers who refer to the use of sail
agree that the canoes were quite unfitted for sailing. It is extremely doubtful
that the prehistoric Indians using bark canoes were acquainted with sails,
though it is possible that the coastal Indians might have set up a bush in the
bow to utilize a following wind and thus lighten the labor of paddling.
However, once the Indian saw the usefulness of a sail demonstrated by white
men, he was quick to adopt it; judging from the LaHontan reference, the use
of sails in canoes must have become well established in some areas by 1685.
One of the most important elements in the history of the canoe is its early
adoption by the French. Champlain was the first to recommend its use by
white men. He stated that the bark canoe would be very necessary in trade
and exploration, pointing out that in order to penetrate the back country
above the rapids at Montreal, during the short summer season, and to come
back in time to return to France for the winter (unless the winter was to be
spent in Canada) the canoe would have to be used. With it the small and
large streams could be navigated safely and the numerous overland carries
could be quickly made. Also, of course, Indians could be employed as crews
without the need of training them to row. This general argument in favor of
the bark canoe remained sound after the desirability of going home to France
for the winter had ceased to influence French ideas. The quick expansion of
the French fur trade in the early seventeenth century opened up the western
country into the Great Lakes area and to the northward. It was soon
discovered that by using canoes on the ancient canoe route along the Ottawa
River goods could reach the western posts on the Lakes and be transported
north early enough to reach the northernmost posts before the first freeze-up
occurred. The use of sailing vessels on the Lakes did not enable this to be
accomplished, so that until the railroads were built in western Canada, the
canoe remained the mode of transport for the fur trade in this area. Even
after the railways were built, canoe traffic remained important until well into
the first half of the twentieth century as part of the local system of
transportation in the northwestern country of Canada.
Figure 3
Canoes From LaHontan's Nouveaux Voyages ...
dans l'Amerique Septentrionale, showing
crude representations typical of early writers.
The unsatisfactory illustrations accompanying early published accounts have
been mentioned. The earliest recognizable canoe to be shown in an
illustration is the reasonably accurate drawing of a Micmac canoe that appears
in Bacqueville de la Poterie's book, published in 1722. LaFiteau, another
Frenchman, in 1724 published a book that not only contains recognizable
drawings but points out reasons for the variation in the appearance of bark
canoes:
The Abenacquis, for example, are less high in the sides, less large,
and more flat at the two ends; in a way they are almost level for
their whole extent; because those who travel on their small rivers
are sure to be troubled and struck by the branches of trees that
border and extend over the water. On the other hand, the
Outaouacs [Ottawas] and the nations of the upper country having
to do their navigation on the St. Lawrence River where there are
many falls and rapids, or especially on the Lakes where there is
always a very considerable swell, must have high ends.
His illustrations show that his low-ended canoes were of Micmac type but that
his high-ended canoes were not of the Ottawa River or Great Lakes types but
rather of the eastern Malecite of the lower St. Lawrence valley. This Jesuit
missionary also noted that the canoes were alike at the ends and that the
paddles were of maple and about 5 feet long, with blades 18 inches long and
6 wide. He observed that bark canoes were unfitted for sailing.
Figure 4
Lines of an Old Birch-Bark Canoe, probably Micmac, brought to England
in 1749 from New England. This canoe was not alike at both ends,
although apparently intended to be so by the builder. (From
Admiralty Collection of Draughts, National Maritime Museum,
Greenwich.)
The early English settlers of New England and New York were acquainted with
the canoe forms of eastern Indians such as the Micmac, Malecite, Abnaki, and
the Iroquois. Surviving records, however, show no detailed description of
these canoes by an English writer and no illustration until about 1750. At this
time a bark canoe, apparently Micmac, was brought from Portsmouth, New
Hampshire, to England and delivered to Lord Anson who had it placed in the
Boat House of the Chatham Dockyard. There it was measured and a scale
drawing was made by Admiralty draftsmen; the drawing is now in the
Admiralty Collection of Draughts, in the National Maritime Museum at
Greenwich. A redrawing of this plan appears opposite. It probably represents
a war canoe, since a narrow, sharp-ended canoe is shown. The bottom,
neither flat nor fully round, is a rounded V-shape; this may indicate a canoe
intended for coastal waters. Other drawings, of a later date, showing crude
plans of canoes, exist in Europe but none yet found appear as carefully drawn
as the Admiralty plan, a scale drawing, which seems to be both the earliest
and the most accurate 18th-century representation of a tribal type of
American Indian bark canoe.
Due to the rapid development of the French fur trade, and the attendant
exploration, a great variety of canoe types must have become known to the
French by 1750, yet little in the way of drawings and no early scale plans have
been found. This is rather surprising, not only because the opportunity for
observation existed but also because a canoe factory was actually operated by
the French. The memoirs of Colonel Franquet, Military Engineer-in-Chief for
New France, contain extensive references to this factory as it existed in 1751.
The canoe factory was located at Trois Rivières, just below Montreal, on the
St. Lawrence. A standard large canoe was built, and the rate of production
was then 20 a year. Franquet gives as the dimensions of the canoes the
following (converted to English measurement): length 36 feet, beam about
5½ feet, and depth about 33 inches. Much of his description is not clear, but
it seems evident that the canoe described was very much like the later grand
canot, or large canoe, of the fur trade. The date at which this factory was
established is unknown; it may have existed as early as 1700, as might have
been required by the rapid expansion of the French trade and other activities
in the last half of the previous century. It is apparent from early comments
that the French found the Indian canoe-builders unreliable, not to say most
uncertain, as a source of supply. The need for large canoes for military and
trade operations had forced the establishment of such a factory as soon as
Europeans could learn how to build the canoes. This would, in fact, have been
the only possible solution.
Of course, it must not be assumed that the bark canoes were the only
watercraft used by the early French traders. They used plank boats as well,
ranging from scows to flat-bottomed bateaux and ship's boats, and they also
had some early sailing craft built on the Great Lakes and on the lower St.
Lawrence. The bateau, shaped much like a modern dory but with a sharp
stern, was adopted by the English settlers as well as the French. In early
colonial times this form of boat was called by the English a "battoe," or
"Schenectady Boat," and later, an "Albany Boat." It was sharp at both ends, usually had straight flaring sides and a flat bottom, and was commonly built of white pine plank. Some, however, had rounded sides and lapstrake
planking, as shown by a plan of a bateau of 1776 in the Admiralty Collection
of Draughts. Early bateaux had about the same range of size as the bark
canoes but later ones were larger.
After the English gained control of Canada, the records of the Hudson's Bay
Company, and of individual traders and travellers such as Alexander Henry, Jr.,
and Alexander MacKenzie, at the end of the eighteenth century, give much
material on the fur-trade canoes but little on the small Indian canoes. In
general, these records show that the fur-trade canoe of the West was
commonly 24 feet long inside the gunwales, exclusive of the curves of bow
and stern; 4 feet 9 inches beam; 26 inches deep; and light enough to be
carried by two men, as MacKenzie recorded, "three or four miles without
resting on a good road." But the development of the fur-trade canoes is best
left for a later chapter.
The use of the name "canoe" for bark watercraft does not appear to have been
taken from a North American Indian usage. The early French explorers and
travellers called these craft canau (pl. canaux). As this also meant "canal," the
name canot (pl. canots) was soon substituted. But some early writers
preferred to call the canoe écorce de bouleau, or birch-bark, and sometimes
the name used was merely the generic petite embarcation, or small boat. The
early English term was "canoa," later "canoe." The popular uses of canoe,
canoa, canau, and canot are thought to have begun early in the sixteenth
century as the adaptation of a Carib Indian word for a dugout canoe.
Summary
It will be seen that the early descriptions of the North American bark canoes
are generally lacking in exact detail. Yet this scanty information strongly
supports the claim that bark canoes were highly developed and that the only
influence white men exercised upon their design was related to an increase in
size of the large canoes that may have taken place in the late seventeenth
and early eighteenth centuries. The very early recognition of the speed, fine
construction, and general adaptability of the bark canoes to wilderness travel
sustains this view. The two known instances mentioned of early accurate
illustration emphasize that distinct variations in tribal forms of canoes existed,
and that these were little changed between early colonial times and a
relatively recent period, despite steadily increasing influence of the European.
Chapter Two
MATERIALS and TOOLS
Bark of the paper birch was the material preferred by the North American
Indians for the construction of their canoes, although other barks were used
where birch was not available. This tree (Betula papyrifera Marsh.), also
known as the canoe birch, is found in good soil, often near streams, and
where growing conditions are favorable it becomes large, reaching a height of
a hundred feet, with a butt diameter of thirty inches or more. Its range forms
a wide belt across the continent, with the northern limits in Canada along a
line extending westward from Newfoundland to the southern shores of
Hudson Bay and thence generally northwestward to Great Bear Lake, the
Yukon River, and the Alaskan coast. The southern limits extend roughly
westward from Long Island to the southern shores of Lake Erie and through
central Michigan to Lake Superior, thence through Wisconsin, northern
Nebraska, and northwesterly through the Dakotas, northern Montana, and
northern Washington to the Pacific Coast. The trees are both abundant and
large in the eastern portion of the belt, particularly in Newfoundland, Quebec,
the Maritime Provinces, Ontario, Maine, and New Hampshire, in contrast to
the western areas. Near the limits of growth to the north and south the trees
are usually small and scattered.
The leaves are rather small, deep green, and pointed-oval, and are often
heart-shaped at the base. The edges of the leaves are rather coarsely toothed
along the margin, which is slightly six-notched. The small limbs are black,
sometimes spotted with white, and the large ones are white.
The bark of the tree has an aromatic odor when freshly peeled, and is chalky
white, marked with black splotches on either side of limbs or where branches once grew. Elsewhere on the bark, dark, or black, horizontal
lines of varying lengths also appear. The lower part of the tree, to about the
height of winter snows, has bark that is usually rough, blemished and thin;
above this level, to the height of the lowest large limbs, the bark is often only
slightly blemished and is thick and well formed. The bark is made up of paper-like layers whose color deepens with each layer, from the chalky white of the exterior through creamy buff to a light tan on the inner layer. A gelatinous
greenish to yellow rind, or cambium layer, lies between the bark and the wood
of the trunk; its characteristics are different from those of the rest of the bark.
The horizontal lines that appear on each successive paper-like layer do not
appear on the rind.
The thickness of the bark cannot be judged from the size of a tree and may
vary markedly among trees of the same approximate size in a single grove.
The thickness varies from a little less than one-eighth to over three-sixteenths
inch; bark with a thickness of one-quarter inch or more is rarely found. For
canoe construction, bark must be over one-eighth inch thick, tough, and from
a naturally straight trunk of sufficient diameter and length to give reasonably
large pieces. The "eyes" must be small and not so closely spaced as to allow
the bark to split easily in their vicinity.
The bark can be peeled readily when the sap is flowing. In winter, when the
exterior of the tree is frozen, the bark can be removed only when heat is
applied. During a prolonged thaw, however, this may be accomplished without
the application of heat. Bark peeled from the tree during a winter thaw, and
early in the spring or late in the fall, usually adheres strongly to the inner rind,
which comes away from the tree with the bark. The act of peeling, however,
puts a strain on the bark, so that only tough, well-made bark can be removed
under these conditions. This particular characteristic caused Indians in the
east to call bark with the rind adhering "winter bark," even though it might
have been peeled from a tree during the warm weather of early summer.
Since in large trees the flow of sap usually starts later than in small ones, the
period in which good bark is obtainable may extend into late June in some
localities. Upon exposure to air and moisture, the inner rind first turns orange-
red and gradually darkens with age until in a few years it becomes dark
brown, or sepia. If it is first moistened, the rind can be scraped off, and this allows it to be employed in decoration, enough being left to form designs.
Hence winter bark was prized.
To the eastern Indians "summer bark" was a poor grade that readily
separated into its paper-like layers, a characteristic of bark peeled in hot
weather, or of poorly made bark in any season. In the west, however, high-
quality bark was often scarce and, therefore, the distinction between winter
and summer bark does not seem to have been made. Newfoundland once had
excellent canoe bark, as did the Maritime Provinces, Maine, New Hampshire,
and Quebec, but the best bark was found back from the seacoast. Ontario
and the country to the immediate north of Lake Superior are also said to have
produced bark of high quality for canoe building.
The bark of the paper birch was preferred for canoe building because it could
be obtained in quite large sheets clear of serious blemishes; because its grain
ran around the tree rather than along the line of vertical tree growth, so that
sheets could be "sewn" together to obtain length in a canoe; and because the
bark was resinous and not only did not stretch and shrink as did other barks,
but also had some elasticity when green, or when kept damp. This elasticity,
of course, was lost once the bark was allowed to become dry through
exposure to air and sunshine, a factor which controlled to some extent the
technique of its employment.
Many other barks were employed in bark canoe construction, but in most
instances the craft were for temporary or emergency use and were discarded
after a short time. Such barks as spruce (Picea), elm (Ulmus), chestnut
(Castanea dentata L.), hickory (Carya spp.), basswood (Tilia spp.), and
cottonwood (Populus spp.) are said to have been used in bark canoe
construction in some parts of North America. Birches other than the paper
birch could be used, but most of them produced bark that was thin and
otherwise poor, and was considered unsuitable for the better types of canoes.
Barks other than birch usually had rough surfaces that had to be scraped
away, in order to make the material flexible enough for canoe construction.
Spruce bark had some of the good qualities of the paper birch bark, but to a
far less degree, and was considered at best a mere substitute. Non-resinous barks, because of their structure, could not be joined together to gain length,
and their characteristic shrinkage and swelling made it virtually impossible to
keep them attached to a solid framework for any great length of time.
Figure 5
Ojibway Indian carrying spruce roots, Lac
Seul, Ont., 1919. (Canadian Geological
Survey photo.)
The material used for "sewing" together pieces of birch bark was most
commonly the root of the black spruce (Picea mariana (Mill.) B.S.P.), which
grows in much of the area where the paper birch exists. The root of this
particular spruce is long but of small diameter; it is tough, durable, and
flexible enough for the purpose. The tree usually grows in soft, moist ground,
so that the long roots are commonly very close to the surface, where they
could easily be dug up with a sharp stick or with the hands. In some areas of
favorable growing conditions, the roots of the black spruce could be obtained
in lengths up to 20 feet, yet with a maximum diameter no larger than that of
a lead pencil.
Figure 6
Roll of Bark for a Hunting Canoe. Holding the bark is the intended
builder, Vincent Mikans, then (in 1927), at age 100, the oldest Indian
on the Algonkin Reserve at Golden Lake, Ont.
Other roots could be used in an emergency, such as those of the other
spruces, as well as of the northern white-cedar (Thuja occidentalis L.),
tamarack (hackmatack or eastern larch) (Larix laricina (Du Roi) K. Koch) and
jack pine (Pinus banksiana Lamb.), the last named being used extensively by
some of the western tribes. Although inferior to the black spruce for sewing,
these and other materials were used for sewing bark; even rawhide was
employed for some purposes in canoe construction by certain tribes.
Canoes built of nonresinous barks were usually lashed, instead of sewn, by
thongs of such material as the inner bark of the northern white cedar,
basswood, elm, or hickory, for the reason stated earlier. Spruce root was also
used for lashings, if readily available. Since sheets of birch bark were joined
without employing a needle, the sewing actually could more correctly be
termed lacing, rather than stitching. But for the nonresinous barks, which
could stand little sewing or lacing, perhaps lashing is the better term.
Before steel tools became available to the Indians, the woodwork required in
constructing a birch-bark canoe represented great labor, since stone tools
having poor cutting characteristics were used. Selection of the proper wood
was therefore a vital consideration. In most sections of the bark canoe area,
the northern white cedar was the most sought-for wood for canoe
construction. This timber had the excellent characteristic of splitting cleanly
and readily when dry and well-seasoned. As a result, the Indian could utilize fallen timber of this species, windblown or torn up in spring floods; with the crude means available he could fell a suitable tree well in advance of his needs; or he could girdle the tree so that it would die and season on the stump, then fell it at his convenience. If split properly, ribs of white cedar
could be bent and set in shape by the use of hot water. In many areas the
ribs, sheathing, and the gunwale members of bark canoes were made of this
wood, as were also the headboards and stem pieces.
Figure 7