Video Lectures
- Lecture 1 - First Order Differential Equation
- Lecture 2 - Gaussian Noise & Brownian Motion
- Lecture 3 - Stochastic Differential Equation Part 1
- Lecture 3 - Stochastic Differential Equation Part 2
- Lecture 4 - Probability, Conditional Probability and Random Variables
- Lecture 5 - Kalman Filter Theory and Application
- Lecture 5-1 - Kalman Filter - part 1
- Lecture 5-2 - Review Stochastic Model and Kalman Filter - part 2
- Lecture 5-3 - Kalman Filter - part 3
- Lecture 5-4 - Kalman Filter - part 4
- Lecture 6 - Optimization 1 (unconstrained)
- Lecture 7 - Linear Algebra 1 (least squares)
- Lecture 8 - Optimization 2 (constrained) - part 1
- Lecture 8 - Optimization 2 (constrained) - part 2
- Lecture 9-0 - Introduction to Machine Learning
- Lecture 9-1 - Classical Machine Learning - part 1
- Lecture 9-2 - Classical Machine Learning - part 2
- Lecture 9-4 - Introduction to Machine Learning - Practice
- Lecture 10 - Neural Networks
- Lecture 11 - Convolutional Neural Networks
- Lecture 12 - Introduction to Deep Learning
- Lecture 12-1 - Recurrent Neural Networks and Transformers
- Lecture 12-2 - Convolutional Neural Networks
- Lecture 12-3 - Deep Generative Modeling
- Lecture 12-4 - Reinforcement Learning
- Lecture 12-5 - Deep Learning New Frontiers
- Lecture 12-6 - LiDAR for Autonomous Driving
- Lecture 12-7 - Automatic Speech Recognition
- Lecture 12-8 - AI for Science
Lecture Slides
- Lecture 1 - First Order Differential Equation
- Lecture 2 - Gaussian Noise & Brownian Motion
- Lecture 3 - Stochastic Differential Equation Part 2
- Lecture 4 - Probability, Conditional Probability and Random Variables
- Lecture 5 - Kalman Filter
- Lecture 6 - Optimization 1 (unconstrained)
- Lecture 7 - Linear Algebra 1 (least squares)
- Lecture 8 - Optimization 2 (constrained) - part 1
- Lecture 8 - Optimization 2 (constrained) - part 2
- Lecture 10 - Neural Networks
- Introduction to Machine Learning
- Lecture 12 - Introduction to Deep Learning
MIT Introduction to Deep Learning 6.S191: Lecture 1
*New 2022 Edition*
Foundations of Deep Learning
Lecturer: Alexander Amini
For all lectures, slides, and lab materials: http://introtodeeplearning.com/
Lecture Outline
0:00 - Introduction
6:35 - Course information
9:51 - Why deep learning?
12:30 - The perceptron
14:31 - Activation functions
17:03 - Perceptron example
20:25 - From perceptrons to neural networks
26:37 - Applying neural networks
29:18 - Loss functions
31:19 - Training and gradient descent
35:46 - Backpropagation
38:55 - Setting the learning rate
41:37 - Batched gradient descent
43:45 - Regularization: dropout and early stopping
47:58 - Summary
Subscribe to stay up to date with new deep learning lectures at MIT, or follow us on @MITDeepLearning on Twitter and Instagram to stay fully-connected!!
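The outline above names the core topics of the lecture: the perceptron, activation functions, loss functions, gradient descent, and backpropagation. As a rough companion to those topics (not taken from the lecture code or slides), the sketch below shows a single sigmoid perceptron trained with one hand-derived gradient-descent step on a squared-error loss; the toy data, parameter values, and learning rate are illustrative assumptions.

```python
import numpy as np

# Minimal illustrative sketch (not from the course materials):
# a single perceptron y = sigmoid(w . x + b), one backpropagation
# pass by hand, and one gradient-descent parameter update.

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def perceptron(x, w, b):
    # Weighted sum followed by a nonlinear activation function.
    return sigmoid(np.dot(w, x) + b)

# Toy data point and parameters (illustrative values).
x = np.array([1.0, 2.0])
w = np.array([0.1, -0.3])
b = 0.05
target = 1.0
lr = 0.1  # learning rate

# Forward pass and squared-error loss.
y = perceptron(x, w, b)
loss = 0.5 * (y - target) ** 2

# Backpropagation: chain rule through loss -> sigmoid -> parameters.
dloss_dy = y - target
dy_dz = y * (1.0 - y)          # derivative of the sigmoid
grad_w = dloss_dy * dy_dz * x  # dL/dw
grad_b = dloss_dy * dy_dz      # dL/db

# One gradient-descent update.
w -= lr * grad_w
b -= lr * grad_b
print(f"loss={loss:.4f}, updated w={w}, b={b:.4f}")
```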