Neural MT systems generate translations one word at a time, yet they can still produce fluent output because each word is chosen based on all of the words generated so far. Typically, these systems are trained only to predict the next word correctly given the previous words. One systematic problem with this word-by-word approach to training and translating is that the translations are often too short and omit important content. In the paper Neural Machine Translation with Reconstruction, the authors describe a clever new way to train and translate: during training, their system is encouraged not only to generate each next word correctly but also to reconstruct the original source sentence from the translation it generated. In this way, the model is rewarded for producing a translation that covers all of the content in the original source.
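To make the idea concrete, here is a toy sketch (not the authors' code) of what a reconstruction-augmented training objective looks like: the usual translation loss plus a weighted penalty for failing to recover the source words. The function name, the toy probability inputs, and the weight `lam` are all illustrative assumptions, standing in for the full encoder-decoder-reconstructor model.

```python
import math

def reconstruction_objective(trans_probs, recon_probs, lam=1.0):
    """Toy combined training objective (a sketch, not the paper's implementation).

    trans_probs: per-token probabilities the decoder assigned to the
        reference translation words.
    recon_probs: per-token probabilities a reconstructor assigned to
        the original source words, given the translation.
    lam: weight on the reconstruction term (hypothetical hyperparameter).

    Returns the loss to minimize: translation negative log-likelihood
    plus lam times the reconstruction negative log-likelihood.
    """
    trans_nll = -sum(math.log(p) for p in trans_probs)
    recon_nll = -sum(math.log(p) for p in recon_probs)
    return trans_nll + lam * recon_nll

# A translation that drops source content may still score well on the
# translation term alone, but the reconstructor cannot recover the omitted
# source words, so the combined loss is higher.
faithful = reconstruction_objective([0.9, 0.8, 0.9], [0.80, 0.90, 0.85])
dropped  = reconstruction_objective([0.9, 0.8, 0.9], [0.80, 0.10, 0.85])
```

With `lam=0` this reduces to the standard next-word training objective; raising `lam` increasingly rewards translations from which the source can be rebuilt, which is the intuition behind penalizing omissions.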
Neural Machine Translation is everywhere (and not just on this blog). Translators want to know how it will affect their livelihood, and in-house localization managers want to know how to make it work for their translation strategy. Whether you're looking to assess the business applications of neural machine translation or to peek under the hood and see how all the gears fit together, these NMT videos can help you wrap your head around the rising tide that is neural machine translation.