Self-Attention | Gabriele Sarti
ICLR 2020 Trends: Better & Faster Transformers for Natural Language Processing
A summary of promising directions from ICLR 2020 for better and faster pretrained transformer language models.