Test Your Transformers Knowledge: ML Practice Quiz
kylie genner
Created 6/10/2024
Do you think you know everything about Transformers and Machine Learning? Put your knowledge to the test with our practice quiz based on the attention mechanism in the 'Attention is All You Need' paper!
1. What kind of model is the Transformer?
A model architecture eschewing recurrence
A recurrent neural network model
2. How many parallel attention layers, or heads, does the Transformer use?
8
16
3. What is the benefit of multi-head attention?
Allows the model to jointly attend to information from different representation subspaces
Allows the model to perform a single attention function more efficiently
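
(For context on questions 2 and 3: below is a minimal NumPy sketch of multi-head attention with the base model's 8 heads. It is not code from the paper; the function names, dimensions, and random weights are illustrative assumptions. The point is that each head attends over its own slice of the representation, and the head outputs are then concatenated and projected back.)

```python
# Minimal illustrative sketch of multi-head attention (8 heads), not the
# paper's reference implementation. Weights are random stand-ins for the
# learned projections W_q, W_k, W_v, W_o.
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # similarity logits
    return softmax(scores) @ V                      # weighted sum of values

def multi_head_attention(x, num_heads=8):
    seq_len, d_model = x.shape
    d_k = d_model // num_heads          # each head works in a smaller subspace
    rng = np.random.default_rng(0)
    heads = []
    for _ in range(num_heads):
        W_q, W_k, W_v = (rng.standard_normal((d_model, d_k)) for _ in range(3))
        heads.append(scaled_dot_product_attention(x @ W_q, x @ W_k, x @ W_v))
    W_o = rng.standard_normal((num_heads * d_k, d_model))
    return np.concatenate(heads, axis=-1) @ W_o     # concatenate heads, project

out = multi_head_attention(np.random.default_rng(1).standard_normal((10, 512)))
print(out.shape)  # (10, 512)
```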
4. Which tasks has self-attention been used successfully in?