Italian | Gabriele Sarti


IT5: Large-scale Text-to-text Pretraining for Italian Language Understanding and Generation

IT5 models are the first encoder-decoder transformers pretrained on more than 40 billion Italian words.

Contrastive Language-Image Pre-training for the Italian Language

We present the first CLIP model for the Italian Language (CLIP-Italian), trained on more than 1.4 million image-text pairs.

Contrastive Image-Text Pretraining for Italian

The first CLIP model pretrained for the Italian language.

Teaching NLP with Bracelets and Restaurant Menus: An Interactive Workshop for Italian Students

We developed an interactive workshop designed to introduce NLP and computational linguistics to Italian high-school students.