
HML: Historical View and Trends of Deep Learning

1,110 views

Published on

This covers the first chapter of the Deep Learning book, presented at the Houston Machine Learning meetup.

Published in: Science

HML: Historical View and Trends of Deep Learning

  1. Historical View and Trends of Deep Learning ("Deep Learning", Chapter 1)
  2. New Year Resolution
  3. Survey: Topics you want to learn (Deep Learning, Reinforcement Learning, NLP, Forecasting, Ensemble)
  4. HML 2018 Roadmap:
     1. Introduction (Chapter 1), historical view and trends of deep learning – Yan Xu
     2. Linear algebra and probability (Chapters 2 & 3) – Cheng Zhan
     3. Numerical computation and machine learning basics (Chapters 4 & 5) – Linda MacPhee-Cobb
     4. Deep feedforward neural nets and regularization (Chapters 6 & 7) – Licheng Zhang
     5. Quantum machine learning – Nicholas Teague
     6. Optimization for training models (Chapter 8)
     7. Convolutional networks (Chapter 9)
     8. Sequence modeling I (Chapter 10)
     9. Sequence modeling II (Chapter 10)
     ......
  5. Outline: Representation Learning • Historical Waves • Current Trends of Deep Learning • Research Trends
  6. Representation Matters
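     A quick way to see why representation matters, in the spirit of the Cartesian-vs-polar example in Chapter 1 of the book: the same linear classifier fails or succeeds depending only on how the data is represented. The toy data, model choice, and library calls below are illustrative assumptions, not taken from the slides.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        # Two concentric rings: class 0 = inner ring, class 1 = outer ring.
        rng = np.random.default_rng(0)
        n = 500
        theta = rng.uniform(0, 2 * np.pi, 2 * n)
        r = np.concatenate([rng.normal(1.0, 0.1, n), rng.normal(2.0, 0.1, n)])
        y = np.concatenate([np.zeros(n), np.ones(n)])

        X_cartesian = np.stack([r * np.cos(theta), r * np.sin(theta)], axis=1)
        X_polar = np.stack([r, theta], axis=1)

        # The same linear model fails on (x, y) but succeeds on (r, theta),
        # because in polar coordinates the radius alone separates the classes.
        clf = LogisticRegression()
        print("Cartesian accuracy:", clf.fit(X_cartesian, y).score(X_cartesian, y))  # ~0.5
        print("Polar accuracy:    ", clf.fit(X_polar, y).score(X_polar, y))          # ~1.0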
  7. Illustration of Deep Learning: nested simple mappings
  8. Computational Graphs: depth = 3 vs. depth = 1
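     The "nested simple mappings" picture can be made concrete in a few lines. The sketch below (plain numpy, an arbitrary two-layer network with made-up shapes) shows the output as a composition of simple mappings, with depth counting how many mappings are chained in the computational graph.

        import numpy as np

        def mapping(x, W, b):
            # One simple mapping: an affine transform followed by an elementwise nonlinearity.
            return np.maximum(0.0, W @ x + b)

        rng = np.random.default_rng(0)
        x = rng.normal(size=4)                        # input features
        W1, b1 = rng.normal(size=(8, 4)), np.zeros(8)
        W2, b2 = rng.normal(size=(3, 8)), np.zeros(3)

        h = mapping(x, W1, b1)                        # first mapping (hidden representation)
        out = W2 @ h + b2                             # second mapping (output)
        print(out.shape)                              # (3,)

     The "depth = 3 vs. depth = 1" contrast on the slide is a convention question: the same model has different depths depending on what counts as one node, as in the book's logistic regression example, where the whole model can be a single node or every elementary operation can be its own node.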
  9. Machine Learning and AI
  10. Representation Learning: able to learn from data
  11. Historical Waves: • A long and rich history. • The amount of available training data has increased. • Deep learning models have grown in size over time. • Deep learning has solved increasingly complicated applications with increasing accuracy.
  12. Historical Waves
  13. Historical Waves (source: https://beamandrew.github.io/deeplearning/2017/02/23/deep_learning_101_part1.html)
  14. Historical Waves: McCulloch-Pitts neuron (1943) • The perceptron (1958, 1962) • ADALINE, stochastic gradient descent (1960) • Neocognitron (1980) • Distributed representation (1986) • Back-propagation algorithm (1986) • Convolutional neural network (1998) • Sequence models (1991, 1994) • Long Short-Term Memory (LSTM) (1997) • Deep belief network, pretraining (2006) • Using GPUs for deep learning (2005, 2009)
  15. Perceptrons: First-Generation Neural Networks (https://www.coursera.org/learn/neural-networks/lecture/pgU1w/perceptrons-the-first-generation-of-neural-networks-8-min)
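     As a reminder of what that first generation actually computed, here is a minimal sketch of the classic perceptron learning rule: weights are nudged only when a point is misclassified. The toy data, learning rate, and epoch count are illustrative choices, not from the lecture.

        import numpy as np

        def train_perceptron(X, y, epochs=20, lr=1.0):
            """y must be +1/-1; a bias term is folded in as an extra constant feature."""
            Xb = np.hstack([X, np.ones((len(X), 1))])
            w = np.zeros(Xb.shape[1])
            for _ in range(epochs):
                for xi, yi in zip(Xb, y):
                    if yi * (w @ xi) <= 0:     # misclassified (or on the boundary)
                        w += lr * yi * xi      # perceptron update
            return w

        # Linearly separable toy problem: class is the sign of x0 + x1.
        rng = np.random.default_rng(1)
        X = rng.normal(size=(200, 2))
        y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
        w = train_perceptron(X, y)
        pred = np.sign(np.hstack([X, np.ones((200, 1))]) @ w)
        print("training accuracy:", (pred == y).mean())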
  16. Current Trends: Growing Datasets
  17. Connections per Neuron
  18. Number of Neurons
  19. Deep Learning Frameworks
  20. ImageNet Challenge
  21. SQuAD Challenge: the Stanford Question Answering Dataset (SQuAD) • The answer to every question is a segment of text from the corresponding Wikipedia reading passage. • 100,000+ question-answer pairs on 500+ articles. • Metric: Exact Match
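     For concreteness, the Exact Match metric named on this slide can be sketched in a few lines: a prediction scores 1 only if it equals one of the gold answers after light normalization. The normalization below (lowercasing, stripping punctuation, articles, and extra whitespace) mirrors the spirit of the official SQuAD evaluation script, though the helper names are mine.

        import re
        import string

        def normalize(text):
            # Lowercase, drop punctuation, drop articles, collapse whitespace.
            text = text.lower()
            text = "".join(ch for ch in text if ch not in string.punctuation)
            text = re.sub(r"\b(a|an|the)\b", " ", text)
            return " ".join(text.split())

        def exact_match(prediction, gold_answers):
            return float(any(normalize(prediction) == normalize(g) for g in gold_answers))

        print(exact_match("The Eiffel Tower", ["Eiffel Tower"]))  # 1.0
        print(exact_match("a tower in Paris", ["Eiffel Tower"]))  # 0.0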
  22. Game AI
  23. Research Trends: • Generative models • Domain alignment • Learning to learn (meta-learning) • Neural networks and graphs • Program induction (source: "Deep Learning: Practice and Trends", NIPS 2017)
  24. Generative Models vs. Discriminative Models
      • Generative models: Naïve Bayes, Gaussian mixture, latent Dirichlet allocation, generative adversarial networks
      • Discriminative models: logistic regression, support vector machines, boosting, neural networks
      (Deep Generative Models tutorial, UAI 2017: https://danilorezendedotcom.files.wordpress.com/2017/09/deepgenmodelstutorial.pdf)
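     The split on this slide can be illustrated with one small experiment: a generative model fits p(x | y) and p(y) and classifies via Bayes' rule, while a discriminative model fits p(y | x) directly. The sketch below uses scikit-learn on made-up Gaussian data purely for illustration.

        import numpy as np
        from sklearn.naive_bayes import GaussianNB
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        X0 = rng.normal(loc=[-1, -1], scale=1.0, size=(300, 2))  # class 0
        X1 = rng.normal(loc=[+1, +1], scale=1.0, size=(300, 2))  # class 1
        X = np.vstack([X0, X1])
        y = np.array([0] * 300 + [1] * 300)

        generative = GaussianNB().fit(X, y)            # models p(x | y) and p(y)
        discriminative = LogisticRegression().fit(X, y)  # models p(y | x) directly
        print("Naive Bayes accuracy:        ", generative.score(X, y))
        print("Logistic regression accuracy:", discriminative.score(X, y))

        # The generative model also carries a density estimate, e.g. its
        # per-class feature means:
        print("estimated class means:", generative.theta_)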
  25. Domain Alignment
  26. Learning to Learn (Meta-Learning)
  27. Neural Networks and Graphs
  28. Message Passing Neural Networks: predicting DFT (density functional theory) targets, 13 properties, with MPNNs (Gilmer et al., ICML 2017)
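     To make the message-passing idea concrete, here is a minimal sketch of a few propagation steps on a toy graph: each node aggregates messages from its neighbors and then updates its state. The adjacency matrix, feature sizes, and message/update functions are simple placeholders in the spirit of the MPNN framework, not the architecture from Gilmer et al.

        import numpy as np

        # Toy graph: 4 nodes, undirected edges as an adjacency matrix.
        A = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]], dtype=float)

        rng = np.random.default_rng(0)
        h = rng.normal(size=(4, 8))        # initial node states (4 nodes, 8 features)
        W_msg = rng.normal(size=(8, 8))    # message function parameters
        W_upd = rng.normal(size=(16, 8))   # update function parameters

        for _ in range(3):                 # T message-passing steps
            messages = A @ (h @ W_msg)     # sum of transformed neighbor states
            h = np.tanh(np.hstack([h, messages]) @ W_upd)  # update node states

        graph_readout = h.sum(axis=0)      # simple readout: sum over all nodes
        print(graph_readout.shape)         # (8,)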
  29. Program Induction: "RobustFill: Neural Program Learning under Noisy I/O" (2017)
  30. Summary
      • Representation learning
      • Historical waves: ADALINE, stochastic gradient descent (1960); back-propagation algorithm (1986); deep belief network, pretraining (2006)
      • Current trends of deep learning: increasing data sets; increasing number of neurons and connections per neuron; increasing accuracy on various tasks in vision, NLP, games, etc.
      • Research trends: generative models; domain alignment; meta-learning; graphs as input; program induction
  31. References
      • Deep Learning book, Chapter 1: http://www.deeplearningbook.org/
      • NIPS 2017 slides and videos ("Deep Learning: Practice and Trends"): https://github.com/hindupuravinash/nips2017
      • Andrew L. Beam: https://beamandrew.github.io/deeplearning/2017/02/23/deep_learning_101_part1.html
  32. Thank You • Slides: https://www.slideshare.net/xuyangela • https://www.meetup.com/Houston-Machine-Learning/ • Feel free to message me if you want to lead a session!
