Deep Learning | Gabriele Sarti
Deep Learning
ICLR 2020 Trends: Better & Faster Transformers for Natural Language Processing
A summary of promising directions from ICLR 2020 for better and faster pretrained transformer language models.
Covid-19 Semantic Browser
A semantic browser for SARS-CoV-2 and COVID-19 powered by neural language models.
Neural Language Models: the New Frontier of Natural Language Understanding
An overview of the latest advances in the field of NLP, with a focus on neural models and language understanding.
Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks
Is it possible to induce sparsity in neural networks while preserving their performance? An overview of the latest advances in making neural approaches more parsimonious.