Week 3 Quiz
Due Date: Jan 4, 2:59 PM +07
Attempts: 3 every 8 hours
To Pass: 80% or higher
Grade: 100%
We keep your highest score
Why does word sequence make a large difference when determining the semantics of language?
How do Recurrent Neural Networks help you understand the impact of sequence on meaning?
How does an LSTM help understand meaning when words that qualify each other aren’t necessarily beside each other in a sentence?
Which Keras layer type allows LSTMs to look forward and backward in a sentence?
What’s the output shape of a bidirectional LSTM layer with 64 units?
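As a study aid for this question, the shape arithmetic can be sketched in plain Python: a Bidirectional wrapper runs two copies of the inner LSTM (one forward, one backward) and, under Keras's default `merge_mode='concat'`, concatenates their outputs, doubling the feature dimension. The helper below is an illustrative sketch of that rule, not a Keras API.

```python
def bidirectional_feature_dim(units, merge_mode="concat"):
    """Feature dimension after wrapping an RNN of `units` units in a
    Bidirectional wrapper (illustrative helper, not a library function)."""
    if merge_mode == "concat":
        # forward and backward outputs are concatenated side by side
        return 2 * units
    # 'sum', 'mul', and 'ave' merge modes keep the original width
    return units

print(bidirectional_feature_dim(64))  # concat mode doubles 64 -> 128
```

So a bidirectional LSTM with 64 units yields 128 output features per step.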
When stacking LSTMs, how do you instruct an LSTM to feed the next one in the sequence?
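For the stacking question, the key distinction is whether a recurrent layer emits its full output sequence or only its final-step output. A minimal plain-Python sketch of the shapes involved, assuming Keras-style `(batch, timesteps, units)` conventions:

```python
def lstm_output_shape(batch, timesteps, units, return_sequences=False):
    """Output shape of an LSTM layer under Keras-style conventions.

    With return_sequences=True the layer emits one output per timestep
    (a 3-D tensor), which is what a following recurrent layer needs as
    input; with the default False it emits only the final-step output.
    """
    if return_sequences:
        return (batch, timesteps, units)
    return (batch, units)

# The lower LSTM in a stack must pass a full sequence upward:
print(lstm_output_shape(32, 120, 64, return_sequences=True))  # (32, 120, 64)
print(lstm_output_shape(32, 120, 64))                         # (32, 64)
```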
If a sentence has 120 tokens in it, and a Conv1D layer with 128 filters and a kernel size of 5 is passed over it, what’s the output shape?
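The Conv1D shape here follows the standard 'valid'-padding formula, output_length = (input_length - kernel_size) // stride + 1: with 120 tokens and a kernel of 5 that leaves 116 timesteps, each with 128 filter channels. A small illustrative helper (not a library function):

```python
def conv1d_output_shape(input_length, filters, kernel_size, stride=1):
    """Output (timesteps, channels) of a Conv1D with 'valid' padding."""
    out_len = (input_length - kernel_size) // stride + 1
    return (out_len, filters)

print(conv1d_output_shape(120, 128, 5))  # (116, 128)
```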
What’s the best way to avoid overfitting in NLP datasets?