Deep Learning Toolbox™
Getting Started Guide
R2021b
How to Contact MathWorks
Phone: 508-647-7000
Contents

Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . viii

Getting Started
Deep Learning Toolbox Product Description . . . . . . . . . . . . . . . . . . . . . . . . 1-2
Cluster Data with a Self-Organizing Map . . . . . . . . . . . . . . . . . . . . . . . . . 1-75
Defining a Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1-75
Cluster Data Using the Neural Net Clustering App . . . . . . . . . . . . . . . . . 1-75
Cluster Data Using Command-Line Functions . . . . . . . . . . . . . . . . . . . . . 1-81
Acknowledgments
The authors would like to thank the following people:
Joe Hicklin of MathWorks for getting Howard into neural network research years ago at the
University of Idaho, for encouraging Howard and Mark to write the toolbox, for providing crucial help
in getting the first toolbox Version 1.0 out the door, for continuing to help with the toolbox in many
ways, and for being such a good friend.
Mary Ann Freeman of MathWorks for general support and for her leadership of a great team of
people we enjoy working with.
Rakesh Kumar of MathWorks for cheerfully providing technical and practical help, encouragement,
ideas and always going the extra mile for us.
Orlando De Jesús of Oklahoma State University for his excellent work in developing and
programming the dynamic training algorithms described in “Time Series and Dynamic Systems” and
in programming the neural network controllers described in “Neural Network Control Systems”.
Martin T. Hagan, Howard B. Demuth, and Mark Hudson Beale for permission to include various
problems, examples, and other material from Neural Network Design, January, 1996.
1 Getting Started

Deep Learning Toolbox Product Description
Deep Learning Toolbox provides a framework for designing and implementing deep neural networks
with algorithms, pretrained models, and apps. You can use convolutional neural networks (ConvNets,
CNNs) and long short-term memory (LSTM) networks to perform classification and regression on
image, time-series, and text data. You can build network architectures such as generative adversarial
networks (GANs) and Siamese networks using automatic differentiation, custom training loops, and
shared weights. With the Deep Network Designer app, you can design, analyze, and train networks
graphically. The Experiment Manager app helps you manage multiple deep learning experiments,
keep track of training parameters, analyze results, and compare code from different experiments. You
can visualize layer activations and graphically monitor training progress.
You can exchange models with TensorFlow™ and PyTorch through the ONNX™ format and import
models from TensorFlow-Keras and Caffe. The toolbox supports transfer learning with DarkNet-53,
ResNet-50, NASNet, SqueezeNet and many other pretrained models.
You can speed up training on a single- or multiple-GPU workstation (with Parallel Computing
Toolbox™), or scale up to clusters and clouds, including NVIDIA® GPU Cloud and Amazon EC2® GPU
instances (with MATLAB® Parallel Server™).
Get Started with Deep Network Designer

Extract the MathWorks Merch data set, and then open Deep Network Designer.

unzip('MerchData.zip');
deepNetworkDesigner
Load a pretrained GoogLeNet network by selecting it from the Deep Network Designer Start Page. If
you need to download the network, then click Install to open the Add-On Explorer.
Deep Network Designer displays a zoomed-out view of the whole network. Explore the network plot.
To zoom in with the mouse, use Ctrl+scroll wheel.
To load the data into Deep Network Designer, on the Data tab, click Import Data > Import Image
Data. The Import Image Data dialog box opens.
In the Data source list, select Folder. Click Browse and select the extracted MerchData folder.
The dialog box also allows you to split the validation data from within the app. Divide the data into
70% training data and 30% validation data.
Specify augmentation operations to perform on the training images. For this example, apply a random
reflection in the x-axis, a random rotation from the range [-90,90] degrees, and a random rescaling
from the range [1,2].
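If you prefer to set up the same augmentation programmatically, a minimal sketch uses imageDataAugmenter with the settings above. This is illustrative only; the app performs this step for you, and the sketch assumes an image datastore imdsTrain containing the training images.

% Random x-reflection, rotation in [-90 90] degrees, and scaling in [1 2]
augmenter = imageDataAugmenter( ...
    'RandXReflection',true, ...
    'RandRotation',[-90 90], ...
    'RandScale',[1 2]);

% Resize to the 224-by-224 GoogLeNet input size while augmenting
augimdsTrain = augmentedImageDatastore([224 224],imdsTrain, ...
    'DataAugmentation',augmenter);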
Using Deep Network Designer, you can visually inspect the distribution of the training and validation
data in the Data tab. You can see that, in this example, there are five classes in the data set. You can
also view random observations from each class.
Deep Network Designer resizes the images during training to match the network input size. To view
the network input size, in the Designer tab, click the imageInputLayer. This network has an input
size of 224-by-224.
To retrain a pretrained network to classify new images, replace the last learnable layer and the final
classification layer with new layers adapted to the new data set. In GoogLeNet, these layers have the
names 'loss3-classifier' and 'output', respectively.
In the Designer tab, drag a new fullyConnectedLayer from the Layer Library onto the canvas.
Set OutputSize to the number of classes in the new data, in this example, 5.
Edit learning rates to learn faster in the new layers than in the transferred layers. Set
WeightLearnRateFactor and BiasLearnRateFactor to 10. Delete the last fully connected layer
and connect your new layer instead.
Replace the output layer. Scroll to the end of the Layer Library and drag a new
classificationLayer onto the canvas. Delete the original output layer and connect your new
layer instead.
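You can also make the same edits at the command line. The following is a minimal sketch (not code generated by the app) that assumes the GoogLeNet support package is installed and uses the layer names given above; the new layer names are arbitrary.

net = googlenet;
lgraph = layerGraph(net);

% New learnable layer: 5 classes, learning faster than the transferred layers
newFc = fullyConnectedLayer(5, ...
    'Name','new_fc', ...
    'WeightLearnRateFactor',10, ...
    'BiasLearnRateFactor',10);
lgraph = replaceLayer(lgraph,'loss3-classifier',newFc);

% New classification output layer; classes are set automatically at training time
newOut = classificationLayer('Name','new_classoutput');
lgraph = replaceLayer(lgraph,'output',newOut);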
Check Network
Check your network by clicking Analyze. The network is ready for training if Deep Learning Network
Analyzer reports zero errors.
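The command-line counterpart of the Analyze button is analyzeNetwork. For example, assuming the lgraph variable from the sketch above:

analyzeNetwork(lgraph)   % Opens the Network Analyzer report; zero errors means ready to train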
Train Network
To train the network with the default settings, on the Training tab, click Train.
If you want greater control over the training, click Training Options and choose the settings to train
with. The default training options are better suited for large data sets. For small data sets, use
smaller values for the mini-batch size and the validation frequency. For more information on selecting
training options, see trainingOptions.
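As a rough command-line illustration of this advice (the values are illustrative and not taken from this example), training options for a small data set might look like the following.

% Smaller mini-batch size and more frequent validation for a small data set
options = trainingOptions('sgdm', ...
    'MiniBatchSize',11, ...        % illustrative value
    'MaxEpochs',8, ...             % illustrative value
    'ValidationFrequency',5, ...   % illustrative value
    'Verbose',false);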
To train the network with the specified training options, click Close and then click Train.
Deep Network Designer allows you to visualize and monitor the training progress. You can then edit
the training options and retrain the network, if required.
To export the results from training, on the Training tab, select Export > Export Trained Network
and Results. Deep Network Designer exports the trained network as the variable
trainedNetwork_1 and the training info as the variable trainInfoStruct_1.
You can also generate MATLAB code, which recreates the network and the training options used. On
the Training tab, select Export > Generate Code for Training.
Use the exported network to classify a new test image.

I = imread("MerchDataTest.jpg");
[YPred,probs] = classify(trainedNetwork_1,I);
imshow(I)
label = YPred;
title(string(label) + ", " + num2str(100*max(probs),3) + "%");
For more information, including on other pretrained networks, see Deep Network Designer.
See Also
Deep Network Designer
More About
• “Create Simple Image Classification Network Using Deep Network Designer” on page 1-29
• “Build Networks with Deep Network Designer”
• “Deep Learning Tips and Tricks”
• “List of Deep Learning Layers”
Try Deep Learning in 10 Lines of MATLAB Code
1 Run these commands to get the downloads if needed, connect to the webcam, and get a
pretrained neural network.
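A minimal version of these commands (matching the webcam and alexnet functions referenced below and used in the loop in step 2) is:

camera = webcam;   % Connect to the webcam
net = alexnet;     % Load the pretrained network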
If you need to install the webcam and alexnet add-ons, a message from each function appears
with a link to help you download the free add-ons using Add-On Explorer. Alternatively, see Deep
Learning Toolbox Model for AlexNet Network and MATLAB Support Package for USB Webcams.
After you install Deep Learning Toolbox Model for AlexNet Network, you can use it to classify
images. AlexNet is a pretrained convolutional neural network (CNN) that has been trained on
more than a million images and can classify images into 1000 object categories (for example,
keyboard, mouse, coffee mug, pencil, and many animals).
2 Run the following code to show and classify live images. Point the webcam at an object and the
neural network reports what class of object it thinks the webcam is showing. It will keep
classifying images until you press Ctrl+C. The code resizes the image for the network using
imresize.
while true
    im = snapshot(camera);       % Take a picture
    image(im);                   % Show the picture
    im = imresize(im,[227 227]); % Resize the picture for alexnet
    label = classify(net,im);    % Classify the picture
    title(char(label));          % Show the class label
    drawnow
end
In this example, the network correctly classifies a coffee mug. Experiment with objects in your
surroundings to see how accurate the network is.
To watch a video of this example, see Deep Learning in 11 Lines of MATLAB Code.
To learn how to extend this example and show the probability scores of classes, see “Classify
Webcam Images Using Deep Learning”.
For next steps in deep learning, you can use the pretrained network for other tasks. Solve new
classification problems on your image data with transfer learning or feature extraction. For
examples, see “Start Deep Learning Faster Using Transfer Learning” and “Train Classifiers Using
Features Extracted from Pretrained Networks”. To try other pretrained networks, see
“Pretrained Deep Neural Networks”.
See Also
trainNetwork | trainingOptions | alexnet
More About
• “Classify Webcam Images Using Deep Learning”
• “Classify Image Using Pretrained Network” on page 1-15
• “Get Started with Transfer Learning” on page 1-17
• “Transfer Learning with Deep Network Designer”
• “Create Simple Image Classification Network” on page 1-26
• “Create Simple Sequence Classification Network Using Deep Network Designer” on page 1-34
Classify Image Using Pretrained Network
GoogLeNet has been trained on over a million images and can classify images into 1000 object
categories (such as keyboard, coffee mug, pencil, and many animals). The network has learned rich
feature representations for a wide range of images. The network takes an image as input, and then
outputs a label for the object in the image together with the probabilities for each of the object
categories.
Load the pretrained GoogLeNet network. You can also choose to load a different pretrained network
for image classification. This step requires the Deep Learning Toolbox™ Model for GoogLeNet
Network support package. If you do not have the required support packages installed, then the
software provides a download link.
net = googlenet;
The image that you want to classify must have the same size as the input size of the network. For
GoogLeNet, the network input size is the InputSize property of the image input layer.
Read the image that you want to classify and resize it to the input size of the network. This resizing
slightly changes the aspect ratio of the image.
I = imread("peppers.png");
inputSize = net.Layers(1).InputSize;
I = imresize(I,inputSize(1:2));
label = classify(net,I);
figure
imshow(I)
title(string(label))
For a more detailed example showing how to also display the top predictions with their associated
probabilities, see “Classify Image Using GoogLeNet”.
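As a brief sketch of that idea (the variable names below are illustrative), request the class scores from classify and list the largest ones:

[label,scores] = classify(net,I);
classNames = net.Layers(end).Classes;   % Class names stored in the output layer

[topScores,idx] = maxk(scores,5);       % Five highest scores and their indices
topClasses = classNames(idx)            % The corresponding class labels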
For next steps in deep learning, you can use the pretrained network for other tasks. Solve new
classification problems on your image data with transfer learning or feature extraction. For examples,
see “Start Deep Learning Faster Using Transfer Learning” and “Train Classifiers Using Features
Extracted from Pretrained Networks”. To try other pretrained networks, see “Pretrained Deep Neural
Networks”.
References
1 Szegedy, Christian, Wei Liu, Yangqing Jia, Pierre Sermanet, Scott Reed, Dragomir Anguelov,
Dumitru Erhan, Vincent Vanhoucke, and Andrew Rabinovich. "Going deeper with convolutions."
In Proceedings of the IEEE conference on computer vision and pattern recognition, pp. 1-9. 2015.
2 BVLC GoogLeNet Model. https://github.com/BVLC/caffe/tree/master/models/bvlc_googlenet
See Also
googlenet | classify | Deep Network Designer
More About
• “Classify Image Using GoogLeNet”
• “Try Deep Learning in 10 Lines of MATLAB Code” on page 1-13
• “Get Started with Transfer Learning” on page 1-17
• “Create Simple Image Classification Network Using Deep Network Designer” on page 1-29
• “Transfer Learning with Deep Network Designer”
• “Create Simple Image Classification Network” on page 1-26
• “Create Simple Sequence Classification Network Using Deep Network Designer” on page 1-34
Get Started with Transfer Learning
Transfer learning is commonly used in deep learning applications. You can take a pretrained network
and use it as a starting point to learn a new task. Fine-tuning a network with transfer learning is
usually much faster and easier than training a network with randomly initialized weights from
scratch. You can quickly transfer learned features to a new task using a smaller number of training
images.
Extract Data
In the workspace, extract the MathWorks Merch data set. This is a small data set containing 75
images of MathWorks merchandise, belonging to five different classes (cap, cube, playing cards,
screwdriver, and torch).
unzip("MerchData.zip");
deepNetworkDesigner
Select SqueezeNet from the list of pretrained networks and click Open.
Explore the network plot. To zoom in with the mouse, use Ctrl+scroll wheel. To pan, use the arrow
keys, or hold down the scroll wheel and drag the mouse. Select a layer to view its properties.
Deselect all layers to view the network summary in the Properties pane.
Import Data
To load the data into Deep Network Designer, on the Data tab, click Import Data > Import Image
Data. The Import Image Data dialog box opens.
In the Data source list, select Folder. Click Browse and select the extracted MerchData folder.
Divide the data into 70% training data and 30% validation data.
Specify augmentation operations to perform on the training images. Data augmentation helps prevent
the network from overfitting and memorizing the exact details of the training images. For this
example, apply a random reflection in the x-axis, a random rotation from the range [-90,90] degrees,
and a random rescaling from the range [1,2].
To retrain SqueezeNet to classify new images, replace the last 2-D convolutional layer and the final
classification layer of the network. In SqueezeNet, these layers have the names 'conv10' and
'ClassificationLayer_predictions', respectively.
On the Designer pane, drag a new convolution2dLayer onto the canvas. To match the original
convolutional layer, set FilterSize to 1,1. Edit NumFilters to be the number of classes in the
new data, in this example, 5.
Change the learning rates so that learning is faster in the new layer than in the transferred layers by
setting WeightLearnRateFactor and BiasLearnRateFactor to 10.
Delete the last 2-D convolutional layer and connect your new layer instead.
Replace the output layer. Scroll to the end of the Layer Library and drag a new
classificationLayer onto the canvas. Delete the original output layer and connect your new
layer in its place.
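For reference, a minimal command-line sketch of the same replacement (not code generated by the app) is shown below. It assumes the SqueezeNet support package is installed and uses the layer names given above; the new layer names are arbitrary.

net = squeezenet;
lgraph = layerGraph(net);

% 1-by-1 convolution with one filter per new class, learning faster than the transferred layers
newConv = convolution2dLayer([1 1],5, ...
    'Name','new_conv10', ...
    'WeightLearnRateFactor',10, ...
    'BiasLearnRateFactor',10);
lgraph = replaceLayer(lgraph,'conv10',newConv);

% New classification output layer
newOut = classificationLayer('Name','new_classoutput');
lgraph = replaceLayer(lgraph,'ClassificationLayer_predictions',newOut);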
Train Network
To choose the training options, select the Training tab and click Training Options. Set the initial
learn rate to a small value to slow down learning in the transferred layers. In the previous step, you
increased the learning rate factors for the 2-D convolutional layer to speed up learning in the new
final layers. This combination of learning rate settings results in fast learning only in the new layers
and slower learning in the other layers.
To train the network with the specified training options, click Close and then click Train.
Deep Network Designer allows you to visualize and monitor the training progress. You can then edit
the training options and retrain the network, if required.
To export the results from training, on the Training tab, select Export > Export Trained Network
and Results. Deep Network Designer exports the trained network as the variable
trainedNetwork_1 and the training info as the variable trainInfoStruct_1.
You can also generate MATLAB code, which recreates the network and the training options used. On
the Training tab, select Export > Generate Code for Training. Examine the MATLAB code to learn
how to programmatically prepare the data for training, create the network architecture, and train the
network.
Use the exported network to classify a new test image.

I = imread("MerchDataTest.jpg");
[YPred,probs] = classify(trainedNetwork_1,I);
imshow(I)
label = YPred;
title(string(label) + ", " + num2str(100*max(probs),3) + "%");
References
[1] ImageNet. http://www.image-net.org
[2] Iandola, Forrest N., Song Han, Matthew W. Moskewicz, Khalid Ashraf, William J. Dally, and Kurt
Keutzer. "SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and <0.5 MB model
size." Preprint, submitted November 4, 2016. https://arxiv.org/abs/1602.07360.
See Also
trainNetwork | trainingOptions | squeezenet | Deep Network Designer
More About
• “Try Deep Learning in 10 Lines of MATLAB Code” on page 1-13
• “Classify Image Using Pretrained Network” on page 1-15
• “Transfer Learning with Deep Network Designer”
• “Create Simple Image Classification Network Using Deep Network Designer” on page 1-29
• “Create Simple Image Classification Network” on page 1-26
• “Create Simple Sequence Classification Network Using Deep Network Designer”
Create Simple Image Classification Network
For an example showing how to interactively create and train a simple image classification network,
see “Create Simple Image Classification Network Using Deep Network Designer” on page 1-29.
Load Data
Load the digit sample data as an image datastore. The imageDatastore function automatically
labels the images based on folder names.
digitDatasetPath = fullfile(matlabroot,'toolbox','nnet','nndemos', ...
'nndatasets','DigitDataset');
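For example, create the datastore from this path as follows (the variable name imds matches its later use; the name-value settings are typical for folder-labeled data):

imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders',true, ...       % Include the digit subfolders
    'LabelSource','foldernames');       % Label each image by its folder name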
Divide the data into training and validation data sets, so that each category in the training set
contains 750 images, and the validation set contains the remaining images from each label.
splitEachLabel splits the image datastore into two new datastores for training and validation.
numTrainFiles = 750;
[imdsTrain,imdsValidation] = splitEachLabel(imds,numTrainFiles,'randomize');
Define the convolutional neural network architecture. Specify the size of the images in the input layer
of the network and the number of classes in the fully connected layer before the classification layer.
Each image is 28-by-28-by-1 pixels and there are 10 classes.
inputSize = [28 28 1];
numClasses = 10;
layers = [
    imageInputLayer(inputSize)
    convolution2dLayer(5,20)
    batchNormalizationLayer
    reluLayer
    fullyConnectedLayer(numClasses)
    softmaxLayer
    classificationLayer];
For more information about deep learning layers, see “List of Deep Learning Layers”.
Train Network
By default, trainNetwork uses a GPU if one is available; otherwise, it uses a CPU. Training on a
GPU requires Parallel Computing Toolbox™ and a supported GPU device. For information on
supported devices, see “GPU Support by Release” (Parallel Computing Toolbox). You can also specify
the execution environment by using the 'ExecutionEnvironment' name-value pair argument of
trainingOptions.
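The options argument used below can be defined with trainingOptions, for example as follows (the specific values here are illustrative):

options = trainingOptions('sgdm', ...
    'MaxEpochs',4, ...                    % illustrative value
    'ValidationData',imdsValidation, ...
    'ValidationFrequency',30, ...         % illustrative value
    'Verbose',false, ...
    'Plots','training-progress');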
net = trainNetwork(imdsTrain,layers,options);
For more information about training options, see “Set Up Parameters and Train Convolutional Neural
Network”.
Test Network

Classify the validation images and calculate the accuracy.
YPred = classify(net,imdsValidation);
YValidation = imdsValidation.Labels;
accuracy = mean(YPred == YValidation)
accuracy = 0.9892
For next steps in deep learning, you can try using a pretrained network for other tasks. Solve new
classification problems on your image data with transfer learning or feature extraction. For examples,
see “Start Deep Learning Faster Using Transfer Learning” and “Train Classifiers Using Features
Extracted from Pretrained Networks”. To learn more about pretrained networks, see “Pretrained
Deep Neural Networks”.
See Also
trainNetwork | trainingOptions
More About
• “Start Deep Learning Faster Using Transfer Learning”
• “Create Simple Image Classification Network Using Deep Network Designer” on page 1-29
• “Try Deep Learning in 10 Lines of MATLAB Code” on page 1-13
• “Classify Image Using Pretrained Network” on page 1-15
• “Get Started with Transfer Learning” on page 1-17
• “Transfer Learning with Deep Network Designer”
• “Create Simple Sequence Classification Network Using Deep Network Designer” on page 1-34
Create Simple Image Classification Network Using Deep Network Designer
Load Data
Load the digit sample data as an image datastore. The imageDatastore function automatically
labels the images based on folder names. The data set has 10 classes and each image in the data set
is 28-by-28-by-1 pixels.
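Before importing, the datastore must exist in the workspace as imds. For example, using the same digit data set path as in the previous example:

digitDatasetPath = fullfile(matlabroot,'toolbox','nnet','nndemos', ...
    'nndatasets','DigitDataset');
imds = imageDatastore(digitDatasetPath, ...
    'IncludeSubfolders',true, ...
    'LabelSource','foldernames');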
Open Deep Network Designer. Create a network, import and visualize data, and train the network
using Deep Network Designer.
deepNetworkDesigner
To import the image datastore, select the Data tab and click Import Data > Import Image Data.
Select imds as the data source. Set aside 30% of the training data to use as validation data.
Randomly allocate observations to the training and validation sets by selecting Randomize.
In the Designer pane, define the convolutional neural network architecture. Drag layers from the
Layer Library and connect them. To quickly search for layers, use the Filter layers search box in
the Layer Library pane. To edit the properties of a layer, click the layer and edit the values in the
Properties pane.
For more information about deep learning layers, see “List of Deep Learning Layers”.
Train Network
On the Training tab, click Training Options. For this example, set the maximum number of epochs
to 5 and keep the other default settings. Set the training options by clicking Close. For more
information about training options, see “Set Up Parameters and Train Convolutional Neural
Network”.
To train the network with the specified training options, click Train.

The accuracy is the fraction of labels that the network predicts correctly. In this case, more than 97% of the predicted labels match the true labels of the validation set.
To export the trained network to the workspace, on the Training tab, click Export.