Question 1
Which of the following is a common application of sequence-to-sequence models?
Question 2
What is the primary function of the 'forget gate' in an LSTM unit?
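For reference, the forget gate's effect on the cell state can be sketched in a few lines of plain Python. This is a toy illustration with made-up gate pre-activations, not a trained LSTM: the gate squashes each pre-activation through a sigmoid and then scales the matching component of the previous cell state, so values near 0 erase that memory component and values near 1 preserve it.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def apply_forget_gate(c_prev, f_t):
    """Scale each element of the previous cell state c_{t-1} by its
    forget-gate activation f_t (elementwise product, as in an LSTM)."""
    return [f * c for f, c in zip(f_t, c_prev)]

c_prev = [2.0, -1.0, 0.5]                      # previous cell state c_{t-1}
f_t = [sigmoid(z) for z in [5.0, -5.0, 0.0]]   # toy pre-activations -> roughly [1, 0, 0.5]
c_scaled = apply_forget_gate(c_prev, f_t)      # first component kept, second erased, third halved
```

In a real LSTM the pre-activations come from a learned affine map of the previous hidden state and current input; here they are hard-coded purely to show the keep/erase behavior.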
Question 3
In the context of Recurrent Neural Networks, what does 'unrolling' the network refer to?
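Unrolling can be made concrete with a toy scalar RNN (no nonlinearity, invented weights): the recurrence is written out as one step per timestep, so a length-T sequence becomes a T-step feed-forward chain in which every copy of the cell reuses the same weights.

```python
def rnn_step(h_prev, x, w_h=0.5, w_x=1.0):
    """One recurrent step. The SAME weights (w_h, w_x) are reused at
    every timestep -- that weight sharing is the point of unrolling."""
    return w_h * h_prev + w_x * x

def run_unrolled(inputs, h0=0.0):
    """Apply the step once per timestep; the loop body is the 'unrolled'
    copy of the cell for that timestep."""
    h = h0
    states = []
    for x in inputs:
        h = rnn_step(h, x)
        states.append(h)
    return states

states = run_unrolled([1.0, 2.0, 3.0])  # -> [1.0, 2.5, 4.25]
```

Backpropagation through time differentiates exactly this unrolled chain, which is why gradients flow through every timestep's copy of the shared weights.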
Question 4
Which type of gate in a GRU unit combines the functionality of the forget and input gates from an LSTM?
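The dual role of the GRU's update gate can be sketched as a single interpolation, shown here with hand-picked gate values rather than a learned sigmoid output: one gate value z simultaneously decides how much old state to discard (the LSTM forget gate's job) and how much candidate state to write (the input gate's job).

```python
def gru_interpolate(h_prev, h_cand, z):
    """New hidden state as a convex combination per element:
    (1 - z) * h_prev  -- how much old state survives (forget-gate role)
    z * h_cand        -- how much new content is written (input-gate role)
    """
    return [(1.0 - zi) * hp + zi * hc
            for zi, hp, hc in zip(z, h_prev, h_cand)]

h_prev = [1.0, 1.0]   # previous hidden state
h_cand = [0.0, 3.0]   # candidate hidden state
z = [0.25, 0.5]       # toy update-gate values in (0, 1); a real GRU computes z with a sigmoid
h_new = gru_interpolate(h_prev, h_cand, z)  # -> [0.75, 2.0]
```

Because a single value controls both ends of the interpolation, the GRU needs one fewer gate (and fewer parameters) than the LSTM.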
Question 5
What is the primary purpose of an 'embedding layer' when working with text data in recurrent networks?
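An embedding layer is, mechanically, a table lookup that replaces each discrete token with a dense vector before the recurrent layers see the input. The sketch below uses random (untrained) vectors and an invented three-word vocabulary purely to show the lookup; in a real model the table entries are learned parameters.

```python
import random

def make_embedding(vocab, dim, seed=0):
    """Build a token -> dense-vector table. Random vectors stand in for
    the learned parameters of a real embedding layer."""
    rng = random.Random(seed)
    return {tok: [rng.uniform(-1, 1) for _ in range(dim)] for tok in vocab}

def embed(tokens, table):
    """The lookup an embedding layer performs: each token becomes its
    dense vector, giving the RNN a fixed-size continuous input per step."""
    return [table[t] for t in tokens]

table = make_embedding(["the", "cat", "sat"], dim=4)
vectors = embed(["the", "cat"], table)  # 2 tokens -> 2 vectors of length 4
```

Dense vectors let similar tokens end up close together after training, which sparse one-hot inputs cannot express.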