2. Transfer Learning
Transfer learning is a research problem in machine learning that
focuses on storing knowledge gained while solving one problem
and applying it to a different but related problem. For example,
knowledge gained while learning to recognise cars could apply
when trying to recognise trucks.
4. JUPYTER NOTEBOOK
The Jupyter Notebook is an interactive computing environment that enables
users to author notebook documents combining live code, narrative text,
equations, and visualizations.
Jupyter Notebook is great for the following use cases:
learn and try out Python
data processing / transformation
Python is a popular platform used for research and development of production systems. It is a vast
language with a number of modules, packages, and libraries that provide multiple ways of achieving a
task.
Python and its libraries, such as NumPy, SciPy, Scikit-Learn, and Matplotlib, are used in data science
and data analysis. They are also extensively used for creating scalable machine learning algorithms.
Python implements popular machine learning techniques such as classification, regression,
recommendation, and clustering.
Python offers ready-made frameworks for performing data mining tasks on large volumes of data
effectively in less time. It includes several implementations of algorithms such as
linear regression, logistic regression, Naïve Bayes, k-means, k-nearest neighbours, and random forest.
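As a small illustration of one of the techniques listed above, here is a hedged sketch of logistic regression with Scikit-Learn on synthetic data (the dataset sizes and random seed are arbitrary choices for the example):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic binary classification data, split into train and test sets.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Fit logistic regression and report held-out accuracy.
clf = LogisticRegression().fit(X_train, y_train)
score = clf.score(X_test, y_test)
print(round(score, 2))
```

The same `fit`/`score` pattern applies to the other Scikit-Learn estimators (Naïve Bayes, k-nearest neighbours, random forest, and so on).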
6. Convolutional Neural Network
A specific kind of deep neural network is the convolutional network, which is commonly
referred to as CNN or ConvNet. It is a deep, feed-forward artificial neural network. Remember that
feed-forward neural networks are also called multi-layer perceptrons (MLPs), which are the
quintessential deep learning models. The models are called "feed-forward" because information
flows straight through the model. There are no feedback connections in which outputs of the model
are fed back into itself.
Convolutional neural networks have been one of the most influential innovations in the field of
computer vision. They have performed much better than traditional computer vision techniques and have
produced state-of-the-art results. These neural networks have proven successful in many
different real-life case studies and applications, such as:
Image classification, object detection, segmentation, face recognition;
Self driving cars that leverage CNN based vision systems;
Classification of crystal structure using a convolutional neural network;
And many more.
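A minimal feed-forward ConvNet can be sketched in PyTorch as follows. The layer sizes and the 32x32 input resolution are illustrative assumptions, not a prescribed architecture:

```python
import torch
import torch.nn as nn

# A small feed-forward ConvNet for 32x32 RGB images: two conv blocks
# followed by a linear classifier. Information flows forward only;
# there are no feedback connections.
class SmallConvNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

out = SmallConvNet()(torch.randn(4, 3, 32, 32))
print(out.shape)  # torch.Size([4, 10])
```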
Residual Networks are important because (1) they have shown superior performance on ImageNet
and (2) they have shown that you can train extremely deep neural networks. The first
result is an indicator of the value of pass-through network elements. The second result also has
ramifications for recurrent networks, because RNNs are implicitly deep.
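The pass-through element at the heart of a ResNet is the skip connection. Below is a minimal sketch of a basic residual block (channel counts are arbitrary for illustration):

```python
import torch
import torch.nn as nn

# A basic residual block: the input is added back to the block's output,
# giving a pass-through path that lets signals and gradients flow
# through very deep stacks of layers.
class ResidualBlock(nn.Module):
    def __init__(self, channels):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.relu = nn.ReLU()

    def forward(self, x):
        out = self.relu(self.conv1(x))
        out = self.conv2(out)
        return self.relu(out + x)  # skip connection: add the input back

x = torch.randn(1, 8, 16, 16)
y = ResidualBlock(8)(x)
print(y.shape)  # same shape as the input: torch.Size([1, 8, 16, 16])
```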
The name PyTorch is inspired by the popular library Torch, which was written in Lua.
The first key feature of PyTorch is imperative programming: an imperative program
performs computation as you type it.
The second key feature of PyTorch is dynamic computational graphs. PyTorch is
define-by-run, which means the graph structure is generated at runtime. Such graphs arise
whenever the amount of work that needs to be done is variable.
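Both features can be seen in a few lines. In this sketch, an ordinary Python `while` loop decides how many operations run for a given input, and the graph is built as the code executes:

```python
import torch

# Imperative, define-by-run style: each line executes immediately, and
# the graph is built as the code runs, so plain Python control flow
# can change the amount of work done per input.
x = torch.randn(3, requires_grad=True)
y = x
while y.norm() < 10:   # data-dependent number of iterations
    y = y * 2
y.sum().backward()     # backprop through whatever graph was built
print(x.grad)          # gradients exist regardless of loop count
```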
9. Data Flow
1. Downloading the dataset
2. Data augmentation
3. Downloading a pre-trained ResNet model (transfer learning)
4. Training the model on the dataset
5. Decaying the learning rate every nth epoch
10. Imports needed
from __future__ import print_function, division
import torch
import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler
import numpy as np
from torchvision import datasets, models, transforms
import matplotlib.pyplot as plt
11. Data Augmentation
Data augmentation is a process where you make changes to existing photos, like
adjusting the colours, flipping them horizontally or vertically, scaling, cropping, and more.
PyTorch provides a very useful package called torchvision.transforms, which
offers many methods that help apply data augmentation. transforms
comes with a Compose method which takes a list of transformations.