Deep Learning Basics: Introduction and Overview

2,299,555 views
Published 2019-01-11
An introductory lecture for MIT course 6.S094 on the basics of deep learning, covering a few key ideas, subfields, and the big picture of why neural networks have inspired and energized an entirely new generation of researchers. For more lecture videos on deep learning, reinforcement learning (RL), and artificial intelligence (AI & AGI), plus podcast conversations, visit our website or follow the TensorFlow code tutorials in our GitHub repo.

INFO:
Website: deeplearning.mit.edu/
GitHub: github.com/lexfridman/mit-deep-learning
Slides: bit.ly/deep-learning-basics-slides
Playlist: bit.ly/deep-learning-playlist
Blog post: link.medium.com/TkE476jw2T

OUTLINE:
0:00 - Introduction
0:53 - Deep learning in one slide
4:55 - History of ideas and tools
9:43 - Simple example in TensorFlow
11:36 - TensorFlow in one slide
13:32 - Deep learning is representation learning
16:02 - Why deep learning (and why not)
22:00 - Challenges for supervised learning
38:27 - Key low-level concepts
46:15 - Higher-level methods
1:06:00 - Toward artificial general intelligence
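The outline's "Simple example in TensorFlow" (9:43) refers to the notebook in the GitHub repo above; as a library-free sketch of the core ideas the lecture covers (a neuron's forward pass and learning as an optimization problem via gradient descent), here is a single sigmoid neuron trained on a toy OR dataset with NumPy. The data, seed, learning rate, and iteration count are illustrative choices, not the lecture's actual code:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy dataset: learn the OR of two binary inputs (illustrative only).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 1.0])

rng = np.random.default_rng(0)
w = rng.normal(size=2)   # weights
b = 0.0                  # bias

lr = 1.0
for _ in range(5000):
    # Forward pass: weighted sum of inputs, then sigmoid activation.
    p = sigmoid(X @ w + b)
    # Gradient of mean squared error w.r.t. w and b
    # (the chain rule -- backpropagation for a single neuron).
    err = p - y
    grad = err * p * (1 - p)
    w -= lr * (X.T @ grad) / len(y)
    b -= lr * grad.mean()

print(np.round(sigmoid(X @ w + b)))  # trained predictions round to [0, 1, 1, 1]
```

The same loop scales conceptually to deep networks: stack many such neurons into layers, and let automatic differentiation (as in TensorFlow) compute the gradients instead of deriving them by hand.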

CONNECT:
- If you enjoyed this video, please subscribe to this channel.
- Twitter: twitter.com/lexfridman
- LinkedIn: www.linkedin.com/in/lexfridman
- Facebook: www.facebook.com/lexfridman
- Instagram: www.instagram.com/lexfridman

All Comments (21)
  • @lexfridman
    First lecture in the 2019 deep learning series! It's humbling to have the opportunity to teach at MIT and exciting to be part of the AI community. Thank you all for the support and great discussions over the past few years. It's been an amazing ride.
  • 3 years later... he never would have guessed he would be best buds with Joe Rogan and David Goggins, and go on to interview Ye and others. Crazy
  • @abrar4466
    I slept listening to you this morning and saw my mom reading deep learning books in my dream.
  • @maceovikasmr569
    When she says “go deeper” but you’re all out of PowerPoint slides
  • @BruceW779
    This might be 4 years old but it is still incredibly helpful in understanding the current state of ML and ANN. Thank you Lex.
  • @franktfrisby
    I really admire the work that Lex is doing both at MIT and his podcast!
  • 0:48 Deep Learning Basics Summary
    5:00 Visualization of 3% of the neurons and 0.001% of the synapses in the brain
    6:26 History of Deep Learning Ideas and Milestones
    9:13 History of DL Tools
    11:36 TensorFlow in One Slide
    13:32 Deep Learning is Representation Learning
    16:05 Why Deep Learning? Scalable Machine Learning
    17:10 Gartner Hype Cycle
    18:18 Why Not Deep Learning?
    21:59 Challenges of Deep Learning
    29:20 Deep Learning from Human and Machine
    30:00 Data Augmentation
    31:36 Deep Learning: Training and Testing
    32:10 How Neural Networks Learn: Backpropagation
    32:28 Regression vs. Classification
    32:54 Multi-Class vs. Multi-Label
    33:13 What Can We Do with Deep Learning?
    33:45 Neuron: Biological Inspiration for Computation
    34:14 Biological and Artificial Neural Networks + Biological Inspiration for Computation
    35:55 Neuron: Forward Pass
    36:40 Combining Neurons in Hidden Layers: The "Emergent" Power to Approximate
    37:37 Neural Networks and Parallelism
    38:00 Compute Hardware
    38:27 Activation Functions
    39:00 Backpropagation
    40:07 Learning is an Optimization Problem
    41:34 Overfitting and Regularization
    42:58 Regularization: Early Stoppage
    44:04 Normalization
    44:32 Convolutional Neural Networks: Image Classification
    47:52 Object Detection / Localization
    50:03 Semantic Segmentation
    51:27 Transfer Learning
    52:27 Autoencoders
    55:05 Generative Adversarial Networks (GANs)
    57:03 Word Embeddings (Word2Vec)
    58:58 Recurrent Neural Networks
    59:49 Long Short-Term Memory (LSTM) Networks: Pick What to Forget and What to Remember
    1:00:15 Bidirectional RNN
    1:00:50 Encoder-Decoder Architecture
    1:01:38 Attention
    1:02:10 AutoML and Neural Architecture Search (NASNet)
    1:04:40 Deep Reinforcement Learning
    1:06:00 Toward Artificial General Intelligence
  • @eni4ever
    Amazing talk! Thank you, Lex! What an exciting time to be alive...
  • Thank you so much, Lex. This will help us a lot, especially students who can't afford paid online courses and have no one nearby to teach them.
  • @heyitsbruno
    Watching this in 2023, after the advancements of generative pretrained models, is mind-blowing. Things have advanced so much in 4 years.
  • @ArseniyCat
    Thank you for your honesty, Dr. Fridman. Brilliant and thought-provoking to those who can ask questions to answer.
  • @Rahul-tg9gj
    Superb lecture. The guy speaks as if he sells dreams. Great confidence and knowledge.
  • @BenjaminGolding
    This is a great rundown of the general DL basics. Really good lecture
  • @pwnangel12
    Thank you for being such an amazing source of information and learning.
  • @ZaneMcFate
    This is an extremely useful resource; thank you for sharing this!
  • @leunglicken2680
  • This lecture is awesome and really inspiring. I've been a fan for years now, Lex, and I'm really happy to see your success. I just wanted to point out that I believe your analysis of "one-shot learning" re: human bipedal locomotion might be a little off base. The learning and development process that leads to bipedalism is characterized by a list of precursors like crawling, sitting up, and standing up. This process usually takes between 1 and 2 years. This time (and the hundreds if not thousands of reps that come with it) is needed to build from the ground up both the requisite muscular strength and the requisite neural pathways for these coordinations to be possible. The process can be accelerated through coordination-specific training on the part of the parents (which occurs quite often). Errors in this process lead to hardcore biomechanical problems down the road (e.g. requiring knee replacement at 55). Bipedalism is pretty complex, and is way harder than quadrupedalism, which would fall more within the scope of your one-shot learning claim.
  • @Rivali0us
    Thank you, Lex, for all your contributions and for sharing so much on YouTube. My life would not be the same without your podcast series.
  • @souravsahoo1582
    You know what, Lex will revolutionize the world. A great scientist and a fluent speaker; it's always a pleasure to listen to Lex 😍😍