In this tutorial, we are going to look at one of the coolest applications of LSTMs: Seq2Seq models. The canonical example of Seq2Seq is machine translation, and Seq2Seq models are in fact what power Google Translate.
We are going to build a Seq2Seq model that takes in a string containing an arithmetic expression (e.g. “10 + 12”) and returns the answer as a string (“22”). What makes this remarkable is that the model knows nothing about arithmetic: it learns to ‘translate’ the question string into the answer string, character by character.
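Before looking at the model, it helps to see what the training data looks like. Here is a minimal sketch of how question/answer pairs might be generated; the helper `make_pair` and the fixed padding lengths are illustrative assumptions, not necessarily how the course code does it.

```python
import random

def make_pair(max_digits=2):
    """Hypothetical helper: one padded question/answer string pair."""
    a = random.randint(0, 10 ** max_digits - 1)
    b = random.randint(0, 10 ** max_digits - 1)
    question = f"{a} + {b}".ljust(2 * max_digits + 3)  # e.g. "10 + 12"
    answer = str(a + b).ljust(max_digits + 1)          # e.g. "22 " (sums can reach 3 digits)
    return question, answer

print(make_pair())  # e.g. ('10 + 12', '22 ')
```

Padding every string to a fixed length matters: the LSTM is trained on batches, so all questions need the same shape.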
Let’s walk through how the LSTM works in our simple “10 + 12” = “22” model.
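To make the architecture concrete, here is a minimal Keras sketch of the kind of encoder-decoder LSTM such a model uses. The layer sizes, character set, and sequence lengths are assumptions for illustration; the actual train.py may differ.

```python
from tensorflow import keras
from tensorflow.keras import layers

CHARS = "0123456789+ "         # every character the model can see (assumed)
VOCAB = len(CHARS)             # 12
Q_LEN, A_LEN = 7, 3            # "99 + 99" is 7 chars; "198" is at most 3

model = keras.Sequential([
    keras.Input(shape=(Q_LEN, VOCAB)),         # one-hot encoded question
    layers.LSTM(128),                          # encoder: reads the question into one vector
    layers.RepeatVector(A_LEN),                # feed that vector once per answer character
    layers.LSTM(128, return_sequences=True),   # decoder: unrolls the vector into output steps
    layers.Dense(VOCAB, activation="softmax"), # per-step distribution over characters
])
model.compile(loss="categorical_crossentropy", optimizer="adam",
              metrics=["accuracy"])
model.summary()
```

The key idea is the bottleneck: the entire question is compressed into the encoder’s final state, and the decoder must reconstruct the answer from that single vector alone.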
Go into the seq2seq directory and open up train.py. There you will see the full code for implementing our arithmetic model.
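If you want to experiment outside of train.py, the remaining glue is a one-hot vectorizer and a call to fit(). This continues the sketches above (it reuses make_pair, CHARS, VOCAB, Q_LEN, A_LEN, and model) and is again an illustration, not the course’s exact pipeline.

```python
import numpy as np

IDX = {c: i for i, c in enumerate(CHARS)}

def encode(s, length):
    """One-hot encode a padded string into a (length, VOCAB) array."""
    x = np.zeros((length, VOCAB))
    for i, c in enumerate(s):
        x[i, IDX[c]] = 1.0
    return x

pairs = [make_pair() for _ in range(50_000)]
x = np.array([encode(q, Q_LEN) for q, _ in pairs])
y = np.array([encode(a, A_LEN) for _, a in pairs])

model.fit(x, y, batch_size=128, epochs=10, validation_split=0.1)
```

After a handful of epochs on data like this, a model of this shape typically starts producing correct sums for most two-digit questions, which you can compare against what train.py gives you.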
Congratulations! You have finished this course and learned a ton about the inner workings of RNNs, LSTMs, word embeddings, and Seq2Seq models. You’re now ready to try building your own models based on the code shown in these classes.