Transformer vs. DeepL: Attention-Based Approaches to Machine Translation

Machines can now pick up nuanced information and translate sentences more naturally. Back in December 2016, the New York Times published an article on the Google Brain team and how neural networks had elevated the accuracy of Google Translate to a near-human level. Still, these systems, built on complex architectures involving recurrent neural networks (RNNs) and convolutional neural networks (CNNs), were computationally expensive and limited by their sequential nature. To illustrate, take the following sentences, in which the word “bank” has a different meaning depending on context: “I arrived at the bank after crossing the road.” “I arrived at […]