Self-Attention | Gabriele Sarti
ICLR 2020 Trends: Better & Faster Transformers for Natural Language Processing
A summary of promising directions from ICLR 2020 for better and faster pretrained transformer language models.