HML: Historical View and Trends of Deep Learning

This presentation covers Chapter 1 of the Deep Learning book and was presented at the Houston Machine Learning meetup.


1. Historical View and Trends of Deep Learning ("Deep Learning", Chapter 1)
2. New Year Resolution
3. Survey: Topics you want to learn – Deep Learning, Reinforcement Learning, NLP, Forecasting, Ensemble
4. HML 2018 Roadmap:
   1. Introduction (Chapter 1): Historical view and trends of deep learning – Yan Xu
   2. Linear algebra and probability (Chapters 2 & 3) – Cheng Zhan
   3. Numerical computation and machine learning basics (Chapters 4 & 5) – Linda MacPhee-Cobb
   4. Deep feedforward neural nets and regularization (Chapters 6 & 7) – Licheng Zhang
   5. Quantum Machine Learning – Nicholas Teague
   6. Optimization for training models (Chapter 8)
   7. Convolutional Networks (Chapter 9)
   8. Sequence modeling I (Chapter 10)
   9. Sequence modeling II (Chapter 10)
   ...
5. Outline: Representation Learning • Historical Waves • Current Trends of Deep Learning • Research Trends
6. Representation Matters
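The point of this slide is that the same task can be easy or hard depending on how the data are represented. A toy sketch of that idea (my own illustrative example, not taken from the slides): a linear classifier cannot separate two concentric rings of points in Cartesian coordinates, but separates them perfectly once each point is re-expressed in polar coordinates.

```python
# Toy illustration of "representation matters": the same linear classifier
# fails on raw Cartesian coordinates but succeeds on polar coordinates.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def ring(radius, n=500):
    """Sample n noisy points on a circle of the given radius."""
    theta = rng.uniform(0, 2 * np.pi, n)
    r = radius + rng.normal(scale=0.1, size=n)
    return np.column_stack([r * np.cos(theta), r * np.sin(theta)])

X = np.vstack([ring(1.0), ring(2.0)])        # two concentric rings
y = np.array([0] * 500 + [1] * 500)          # label = which ring

# Same model, two representations of the same data.
cartesian = LogisticRegression().fit(X, y)
X_polar = np.column_stack([np.hypot(X[:, 0], X[:, 1]),     # radius
                           np.arctan2(X[:, 1], X[:, 0])])  # angle
polar = LogisticRegression().fit(X_polar, y)

print("Cartesian features:", cartesian.score(X, y))        # ~0.5 (chance)
print("Polar features:    ", polar.score(X_polar, y))      # ~1.0
```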
7. Illustration of Deep Learning: nested simple mappings
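A minimal sketch of the "nested simple mappings" picture, using an arbitrary toy network of my own: each layer is a simple affine map followed by a nonlinearity, and the deep model is just their composition f3(f2(f1(x))).

```python
# Deep learning as a composition of simple mappings (toy, untrained weights).
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

def layer(x, out_dim):
    """One simple mapping: random affine transform + ReLU (untrained)."""
    W = 0.1 * rng.normal(size=(out_dim, x.shape[0]))
    b = np.zeros(out_dim)
    return relu(W @ x + b)

x = rng.normal(size=4)    # raw input
h1 = layer(x, 8)          # first representation, f1(x)
h2 = layer(h1, 8)         # representation of the representation, f2(f1(x))
y = layer(h2, 2)          # final nested mapping, f3(f2(f1(x)))
print(y)
```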
8. Computational Graphs: depth = 3 vs. depth = 1
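The "depth = 3" versus "depth = 1" contrast on this slide reflects that depth depends on how the computational graph is drawn. A small sketch of my own using logistic regression: counted as a single sigmoid-of-affine node the model has depth 1; counted one node per elementary operation it has depth 3, yet both graphs compute the same value.

```python
# Same function, two computational graphs of different depth.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b = np.array([0.5, -0.3]), 0.1
x = np.array([1.0, 2.0])

# Depth 1: the whole model is one composite node.
y_depth1 = sigmoid(w @ x + b)

# Depth 3: one node per elementary operation.
u1 = w * x                  # node 1: elementwise multiply
u2 = u1.sum() + b           # node 2: sum and add bias
y_depth3 = sigmoid(u2)      # node 3: sigmoid

print(np.isclose(y_depth1, y_depth3))   # True: identical outputs
```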
9. Machine Learning and AI
10. Representation Learning: able to learn from data
11. Historical Waves:
    • Deep learning has had a long and rich history.
    • The amount of available training data has increased.
    • Deep learning models have grown in size over time.
    • Deep learning has solved increasingly complicated applications with increasing accuracy.
12. Historical Waves (figure)
13. Historical Waves (figure). Source: https://beamandrew.github.io/deeplearning/2017/02/23/deep_learning_101_part1.html
14. Historical Waves:
    McCulloch-Pitts neuron (1943)
    The perceptron (1958, 1962)
    ADALINE, stochastic gradient descent (1960)
    Neocognitron (1980)
    Distributed representation (1986)
    Back-propagation algorithm (1986)
    Convolutional neural network (1998)
    Sequence models (1991, 1994)
    Long Short-Term Memory (LSTM) (1997)
    Deep belief network, pretraining (2006)
    Using GPUs for deep learning (2005, 2009)
15. Perceptrons: First-Generation Neural Networks. https://www.coursera.org/learn/neural-networks/lecture/pgU1w/perceptrons-the-first-generation-of-neural-networks-8-min
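As a companion to this slide, here is a minimal sketch of the classic perceptron learning rule on a toy AND problem; the dataset, learning rate, and number of passes are my own illustrative choices.

```python
# Rosenblatt-style perceptron learning the AND function.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 0, 0, 1])              # AND of the two inputs

w, b = np.zeros(2), 0.0
for _ in range(10):                     # a few passes over the data
    for xi, target in zip(X, y):
        pred = int(w @ xi + b > 0)      # hard-threshold unit
        error = target - pred
        w += error * xi                 # perceptron weight update
        b += error

print([int(w @ xi + b > 0) for xi in X])   # -> [0, 0, 0, 1]
```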
16. Current Trends: Growing Datasets
17. Connections per Neuron
18. Number of Neurons
19. Deep Learning Frameworks
20. ImageNet Challenge
21. SQuAD Challenge: Stanford Question Answering Dataset (SQuAD)
    • The answer to every question is a segment of text from the corresponding Wikipedia reading passage.
    • 100,000+ question-answer pairs on 500+ articles.
    • Evaluation metric: Exact Match.
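The Exact Match (EM) score named on this slide counts a prediction as correct only if it equals one of the reference answers after light normalization. A simplified sketch of such a metric is below; the normalization steps (lowercasing, stripping punctuation, articles, and extra whitespace) follow the recipe commonly used for SQuAD, but this is not the official evaluation script.

```python
# Simplified Exact Match metric for extractive question answering.
import re
import string

def normalize(text):
    """Lowercase, drop punctuation and English articles, collapse whitespace."""
    text = text.lower()
    text = "".join(ch for ch in text if ch not in string.punctuation)
    text = re.sub(r"\b(a|an|the)\b", " ", text)
    return " ".join(text.split())

def exact_match(prediction, references):
    """True if the prediction matches any reference answer after normalization."""
    return any(normalize(prediction) == normalize(ref) for ref in references)

print(exact_match("The Eiffel Tower", ["Eiffel Tower"]))   # True
print(exact_match("Paris, France", ["Paris"]))             # False
```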
22. Game AI
23. Research Trends:
    • Generative models
    • Domain alignment
    • Learning to Learn (Meta-Learning)
    • Neural networks and graphs
    • Program Induction
    Source: "Deep Learning: Practice and Trends", NIPS 2017
24. Generative Models
    Generative models: Naïve Bayes, Gaussian mixture, latent Dirichlet allocation, generative adversarial networks
    Discriminative models: logistic regression, support vector machines, boosting, neural networks
    Reference: "Deep Generative Models" tutorial, UAI 2017 – https://danilorezendedotcom.files.wordpress.com/2017/09/deepgenmodelstutorial.pdf
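A quick sketch contrasting one model from each column above: Gaussian Naïve Bayes is generative (it models p(x, y) and classifies through Bayes' rule), while logistic regression is discriminative (it models p(y | x) directly). The dataset is synthetic and purely illustrative.

```python
# Generative vs. discriminative classifier on the same synthetic data.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

generative = GaussianNB().fit(X_tr, y_tr)               # learns p(x | y) and p(y)
discriminative = LogisticRegression().fit(X_tr, y_tr)   # learns p(y | x) directly

print("Gaussian Naive Bayes accuracy:", generative.score(X_te, y_te))
print("Logistic regression accuracy: ", discriminative.score(X_te, y_te))
```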
25. Domain Alignment
26. Learning to Learn (Meta-Learning)
27. Neural Networks and Graphs
28. Message Passing Neural Networks: Predicting DFT (density functional theory) targets with MPNNs – 13 properties (Gilmer et al., ICML 2017)
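A schematic sketch of one message-passing step in the MPNN framework of Gilmer et al.: each node aggregates messages from its neighbors and then updates its hidden state. The graph, the message function, and the update function below are toy stand-ins of my own (simple linear maps), not the learned networks used in the paper.

```python
# One message-passing step on a toy graph (schematic, untrained).
import numpy as np

rng = np.random.default_rng(0)
n_nodes, d = 4, 8
adjacency = np.array([[0, 1, 1, 0],
                      [1, 0, 1, 0],
                      [1, 1, 0, 1],
                      [0, 0, 1, 0]])          # toy undirected graph
h = rng.normal(size=(n_nodes, d))             # node hidden states h_v
W_msg = 0.1 * rng.normal(size=(d, d))         # stand-in for the message function M
W_upd = 0.1 * rng.normal(size=(2 * d, d))     # stand-in for the update function U

# m_v = sum over neighbors w of M(h_w): aggregate neighbor messages.
messages = adjacency @ (h @ W_msg)

# h_v <- U(h_v, m_v): concatenate state and message, apply the update map.
h = np.tanh(np.concatenate([h, messages], axis=1) @ W_upd)
print(h.shape)   # (4, 8): updated node states after one step
```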
29. Program Induction: RobustFill – Neural Program Learning under Noisy I/O (2017)
30. Summary
    • Representation Learning
    • Historical Waves
      o ADALINE, stochastic gradient descent (1960)
      o Back-propagation algorithm (1986)
      o Deep belief network, pretraining (2006)
    • Current Trends of Deep Learning
      o Increasing data sets
      o Increasing number of neurons and number of connections per neuron
      o Increasing accuracy on various tasks in vision, NLP, games, etc.
    • Research Trends
      o Generative models
      o Domain alignment
      o Meta-learning
      o Graphs as input
      o Program induction
31. References
    • Deep Learning Book, Chapter 1: http://www.deeplearningbook.org/
    • NIPS 2017 slides and videos ("Deep Learning: Practice and Trends"): https://github.com/hindupuravinash/nips2017
    • Andrew L. Beam: https://beamandrew.github.io/deeplearning/2017/02/23/deep_learning_101_part1.html
32. Thank You
    Slides: https://www.slideshare.net/xuyangela
    Meetup: https://www.meetup.com/Houston-Machine-Learning/
    Feel free to message me if you want to lead a session!
