❗ The content presented here is sourced directly from the YouTube platform. For comprehensive course details, including enrollment information, click the 'Go to class' link on our website.
Updated on July 21st, 2023
MIT 6.S191: Recurrent Neural Networks, Transformers, and Attention provides an introduction to deep learning for sequential data. Lecturer Ava Amini guides students through the lecture, which covers sequence modeling, neurons with recurrence, recurrent neural networks (RNNs), unfolding RNNs, design criteria for sequential modeling, a word-prediction example, backpropagation through time, gradient issues, long short-term memory (LSTM) networks, RNN applications, attention fundamentals, the intuition behind attention, the relationship between attention and search, learning attention with neural networks, and scaling attention and its applications. All lecture slides and lab materials can be found at http://introtodeeplearning.com. Students are encouraged to subscribe to stay up to date with new deep learning lectures at MIT, or to follow @MITDeepLearning on Twitter and Instagram to stay fully-connected.
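As a small taste of the attention mechanism the lecture builds toward, here is a minimal NumPy sketch of scaled dot-product attention. This is not the course's lab code; the function name, shapes, and random inputs are illustrative only.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Illustrative scaled dot-product attention over one sequence."""
    # Scores: similarity of each query vector against every key vector,
    # scaled by sqrt of the key dimension to stabilize gradients.
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    # Softmax turns each row of scores into attention weights summing to 1.
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Output: attention-weighted combination of the value vectors.
    return weights @ V, weights

# Toy example: a sequence of 3 positions with 4-dimensional embeddings.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out, w = scaled_dot_product_attention(Q, K, V)
print(out.shape)       # (3, 4)
print(w.sum(axis=-1))  # each row of attention weights sums to ~1
```

The lecture's lab materials at http://introtodeeplearning.com cover the full transformer version of this idea, including learned query/key/value projections.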