Ankit Gupta

Dec 9, 2018 • 0 likes • 168 views

Technology

B.Tech mini-project: Transfer Learning Using PyTorch, an ant/bee image classifier built with PyTorch. Source: https://github.com/ankitAMD/1Ant_Bees_classification_Pytorch

- 1. Transfer Learning Using PyTorch (Ankit Gupta, Adarsh Pratik)
- 2. Transfer learning is a research problem in machine learning that focuses on storing knowledge gained while solving one problem and applying it to a different but related problem. For example, knowledge gained while learning to recognise cars could apply when trying to recognise trucks.
- 3. Framework used: PyTorch. Software/technology used: Jupyter Notebook, Python, convolutional neural networks, a pre-trained ResNet model.
- 4. JUPYTER NOTEBOOK The Jupyter Notebook is an interactive computing environment that lets users author notebook documents combining live code, interactive widgets, plots, narrative text, equations, images, and video. Jupyter Notebook is great for use cases such as learning and trying out Python, data processing/transformation, numeric simulation, statistical modeling, and machine learning.
- 5. Python Python is a popular platform used for the research and development of production systems. It is a vast language with a number of modules, packages, and libraries that provide multiple ways of achieving a task. Python and its libraries such as NumPy, SciPy, Scikit-Learn, and Matplotlib are used in data science and data analysis, and are also extensively used for creating scalable machine learning algorithms. Python implements popular machine learning techniques such as classification, regression, recommendation, and clustering, and offers ready-made frameworks for performing data mining tasks on large volumes of data effectively and in less time. It includes several implementations achieved through algorithms such as linear regression, logistic regression, Naïve Bayes, k-means, k-nearest neighbors, and random forests.
- 6. Convolutional Neural Network A specific kind of deep neural network is the convolutional network, commonly referred to as a CNN or ConvNet. It is a deep, feed-forward artificial neural network. Remember that feed-forward neural networks are also called multi-layer perceptrons (MLPs), which are the quintessential deep learning models. The models are called "feed-forward" because information flows straight through the model; there are no feedback connections in which outputs of the model are fed back into itself. Convolutional neural networks have been one of the most influential innovations in the field of computer vision. They have performed far better than traditional computer vision methods and have produced state-of-the-art results. These networks have proven successful in many real-life case studies and applications, such as: image classification, object detection, segmentation, and face recognition; self-driving cars that leverage CNN-based vision systems; classification of crystal structures; and many more.
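To make the feed-forward idea concrete, here is a minimal toy ConvNet in PyTorch (layer sizes are hypothetical, not the project's model):

```python
import torch
from torch import nn

class TinyCNN(nn.Module):
    """A minimal feed-forward ConvNet: conv -> ReLU -> pool -> linear."""

    def __init__(self, num_classes=2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1),  # 3-channel RGB in
            nn.ReLU(),
            nn.MaxPool2d(2),                            # 224x224 -> 112x112
        )
        self.classifier = nn.Linear(8 * 112 * 112, num_classes)

    def forward(self, x):
        # Information flows straight through; no feedback connections.
        x = self.features(x)
        return self.classifier(x.flatten(1))

out = TinyCNN()(torch.randn(1, 3, 224, 224))  # one fake 224x224 image
```

The forward pass maps a `(1, 3, 224, 224)` batch to `(1, 2)` class scores, the same input/output shape the ant/bee classifier works with.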
- 7. ResNet Residual Networks are important because (1) they have shown superior performance on ImageNet and (2) they have shown that you can train extremely deep neural networks. The first result indicates the value of pass-through connections between network elements. The second result also has ramifications for recurrent networks, because RNNs are implicitly deep.
- 8. PyTorch The name PyTorch is inspired by the popular library Torch, which was written in Lua. The first key feature of PyTorch is imperative programming: an imperative program performs computation as you type it. The second key feature of PyTorch is dynamic computational graphs: PyTorch is define-by-run, which means the graph structure is generated at runtime. Such graphs arise whenever the amount of work that needs to be done is variable.
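A small illustration of define-by-run (a made-up function, not from the project): the number of operations depends on a runtime value, yet autograd still backpropagates through exactly the operations that ran.

```python
import torch

def double_until_big(x):
    # The loop count is decided at runtime, so the computational
    # graph is built on the fly as operations execute.
    while x.norm() < 10:
        x = x * 2
    return x

x = torch.ones(3, requires_grad=True)   # norm ~1.73, so 3 doublings run
y = double_until_big(x).sum()
y.backward()                            # traces whatever ops actually ran
```

Here three doublings execute before the norm exceeds 10, so the gradient of each input element is 8.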
- 9. Data Flow 1. Download the dataset. 2. Apply data augmentation. 3. Download a pre-trained ResNet model (transfer learning). 4. Train the model on the dataset. 5. Decay the learning rate every nth epoch.
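Step 5 can be done with `torch.optim.lr_scheduler.StepLR`, which multiplies the learning rate by `gamma` every `step_size` epochs. A self-contained sketch (the toy parameter and epoch count are illustrative; the tutorial-style values `step_size=7`, `gamma=0.1` are assumed):

```python
import torch
from torch import optim
from torch.optim import lr_scheduler

# A toy parameter just so we can build an optimizer.
param = torch.zeros(1, requires_grad=True)
optimizer = optim.SGD([param], lr=0.001, momentum=0.9)

# Multiply the learning rate by 0.1 every 7 epochs.
scheduler = lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.1)

for epoch in range(14):
    optimizer.step()      # the training pass would go here
    scheduler.step()      # advance the decay schedule once per epoch
```

After 14 epochs the schedule has stepped down twice, so the learning rate is 0.001 × 0.1² = 1e-5.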
- 10. Imports needed

```python
from __future__ import print_function, division
import torch
import torch.nn as nn
import torch.optim as optim
from torch.optim import lr_scheduler
import numpy as np
import torchvision
from torchvision import datasets, models, transforms
import matplotlib.pyplot as plt
import time
import os
import copy
```
- 11. Data Augmentation Data augmentation is a process where you make changes to existing photos, such as adjusting the colors, flipping them horizontally or vertically, scaling, cropping, and many more. PyTorch provides a very useful module called torchvision.transforms, which offers many methods for applying data augmentation. transforms comes with a Compose method that takes a list of transformations.
- 12. Training and validation data

```python
data_transforms = {
    'train': transforms.Compose([
        transforms.RandomResizedCrop(224),
        transforms.RandomHorizontalFlip(),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ]),
    'val': transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ]),
}
```
- 13. Visualising images after augmentation
- 14. Train and Evaluate

```python
model_ft = train_model(model_ft, criterion, optimizer_ft,
                       exp_lr_scheduler, num_epochs=25)
```
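`train_model` is defined in the notebook; a minimal sketch of what such a loop might look like is below (the notebook's real version also tracks validation accuracy and keeps the best weights; the model, data, and hyperparameters here are toy stand-ins):

```python
import torch
from torch import nn, optim

def train_model(model, criterion, optimizer, scheduler, loader, num_epochs=5):
    """Basic training loop: forward, loss, backward, step, then lr decay."""
    for epoch in range(num_epochs):
        model.train()
        for inputs, labels in loader:
            optimizer.zero_grad()
            loss = criterion(model(inputs), labels)
            loss.backward()
            optimizer.step()
        scheduler.step()  # decay the learning rate once per epoch
    return model

torch.manual_seed(0)
X = torch.randn(8, 4)            # toy features standing in for image batches
y = torch.randint(0, 2, (8,))    # toy ant/bee labels
model = nn.Linear(4, 2)          # toy model standing in for the ResNet
optimizer = optim.SGD(model.parameters(), lr=0.01, momentum=0.9)
scheduler = optim.lr_scheduler.StepLR(optimizer, step_size=7, gamma=0.1)
model = train_model(model, nn.CrossEntropyLoss(), optimizer, scheduler, [(X, y)])
```

The slide's call passes the fine-tuned ResNet, a cross-entropy criterion, an SGD optimizer, and the step scheduler into the same kind of loop for 25 epochs.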
- 15. Conclusion

```python
visualize_model(model_conv)

plt.ioff()
plt.show()
```
- 17. Source code of the project at: https://github.com/ankitAMD/1Ant_Bees_classification_Pytorch