Lecture part A
This lecture introduces Neural Machine Translation (NMT) with the help of an example. We then discuss language modelling, model architecture, and NMT inference. Further, we discuss the challenges posed by the diversity of the world's languages and the resulting need for low-resource machine translation. Finally, we examine a case study in low-resource MT, the challenges it faces, and the different stages of the research cycle and how they can be applied to machine translation.
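As a minimal illustration of NMT inference, the sketch below runs greedy decoding: at each step the decoder's most probable next token is emitted until an end-of-sequence token appears. The vocabulary, the `log_probs` function, and its deterministic scores are illustrative stand-ins for a trained decoder (a real model would also condition on the encoded source sentence).

```python
import math

# Hypothetical stand-in for a trained NMT decoder: given the tokens
# generated so far, return log-probabilities over a tiny vocabulary.
VOCAB = ["<eos>", "le", "chat", "noir"]

def log_probs(prefix):
    # Deterministic dummy scores that prefer emitting the vocabulary in
    # order, then <eos>. Illustrative only, not a trained model.
    step = len(prefix)
    target = step + 1 if step + 1 < len(VOCAB) else 0  # fall back to <eos>
    scores = [5.0 if i == target else 0.0 for i in range(len(VOCAB))]
    log_z = math.log(sum(math.exp(s) for s in scores))  # normalizer
    return [s - log_z for s in scores]

def greedy_decode(max_len=10):
    """Greedy NMT inference: pick the most probable token at each step."""
    out = []
    for _ in range(max_len):
        lp = log_probs(out)
        best = max(range(len(VOCAB)), key=lambda i: lp[i])
        if VOCAB[best] == "<eos>":
            break
        out.append(best)
    return [VOCAB[i] for i in out]

print(greedy_decode())  # → ['le', 'chat', 'noir']
```

Greedy decoding is the simplest inference strategy; beam search, which keeps several partial hypotheses, is the more common choice in practice.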
Lecture part B
This week’s lecture was a guest lecture by Marc’Aurelio Ranzato, a research scientist and manager at the Facebook AI Research (FAIR) lab, where he works on enabling machines to learn with weaker supervision and to efficiently transfer knowledge across tasks. The first half of Lecture B focuses on understanding low-resource machine translation, and the second half discusses potential domain mismatches in machine learning and machine translation.
We introduced the state transition function and how to model a physical system with state and control. We discussed how to achieve optimal control by inference using the Kelley-Bryson algorithm, which relies on backpropagation through time and gradient descent. Finally, we walked through the Optimization_Path_Planner notebook, in which various cost functions are defined and path planning is implemented to guide a tricycle to reach the desired position at the specified speed.
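The Kelley-Bryson idea above can be sketched on a toy 1-D system (not the notebook's tricycle model): roll the state forward through the transition function, backpropagate the cost gradient through time by hand, and update the control sequence by gradient descent. The horizon, cost weights, and quadratic cost below are illustrative assumptions; the notebook itself uses richer cost functions and automatic differentiation.

```python
T = 10          # horizon length
target = 5.0    # desired final position
lam = 0.1       # weight on control effort
lr = 0.01       # gradient-descent step size

u = [0.0] * T   # control sequence to optimize

for _ in range(300):
    # Forward pass through the state transition: x_{t+1} = x_t + u_t, x_0 = 0.
    x = [0.0]
    for t in range(T):
        x.append(x[t] + u[t])

    # Cost: C = (x_T - target)^2 + lam * sum_t u_t^2
    # Backward pass (backprop through time): start from dC/dx_T and chain
    # backwards. Since dx_{t+1}/dx_t = 1, the adjoint passes through unchanged.
    dx = 2.0 * (x[T] - target)
    grad = [dx + 2.0 * lam * u[t] for t in range(T)]  # dC/du_t

    # Gradient descent on the controls.
    u = [u[t] - lr * grad[t] for t in range(T)]

x_final = sum(u)  # x_T equals the sum of controls, since x_0 = 0
print(x_final)    # close to the target, held back slightly by the effort penalty
```

The effort penalty keeps the optimum just short of the target (here about 4.95 rather than 5.0), the same trade-off the notebook's cost functions encode between reaching the goal and keeping controls small.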