We present IT5, the first family of encoder-decoder transformer models pretrained specifically for the Italian language on a corpus of more than 40 billion words, reaching state-of-the-art performance on most Italian conditional language generation tasks.
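As a minimal sketch of how an encoder-decoder model like IT5 can be used for conditional generation, the snippet below loads a checkpoint with the Hugging Face transformers library and generates text from an Italian prompt. The Hub identifier "gsarti/it5-base", the summarization-style prefix, and the assumption of a task-specific fine-tuned checkpoint are all illustrative, not guaranteed properties of the released models.

```python
# Sketch: conditional generation with a T5-style encoder-decoder such as IT5.
# The checkpoint name and task prefix below are assumptions for illustration.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "gsarti/it5-base"  # assumed Hub identifier
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# Illustrative summarization-style input; a real setup would use a checkpoint
# fine-tuned on the target generation task.
text = "Riassumi: il consiglio comunale ha approvato il nuovo piano per la mobilità urbana."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model.generate(**inputs, max_new_tokens=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```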
We present CLIP-Italian, the first CLIP model pretrained on the Italian language, trained on more than 1.4 million image-text pairs.
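The sketch below shows the kind of zero-shot image-text matching a CLIP-style dual encoder enables: an image is scored against candidate Italian captions via the model's image-text similarity logits. The Hub identifier "clip-italian/clip-italian" and the use of the VisionTextDualEncoderModel class are assumptions about how such a checkpoint might be packaged, not a statement of the released model's actual API.

```python
# Sketch: zero-shot image-caption matching with a CLIP-style dual encoder.
# Checkpoint name and loading classes are assumptions for illustration.
import torch
from PIL import Image
from transformers import AutoProcessor, VisionTextDualEncoderModel

model_name = "clip-italian/clip-italian"  # assumed Hub identifier
model = VisionTextDualEncoderModel.from_pretrained(model_name)
processor = AutoProcessor.from_pretrained(model_name)

image = Image.open("foto.jpg")  # placeholder local image
captions = ["una foto di un gatto", "una foto di un cane"]

inputs = processor(text=captions, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# logits_per_image holds the image's similarity to each caption;
# softmax turns the scores into a distribution over candidate captions.
probs = outputs.logits_per_image.softmax(dim=-1)
print(dict(zip(captions, probs.squeeze().tolist())))
```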
We developed an interactive workshop that illustrates the basic principles of NLP and computational linguistics to Italian high school students aged 13 to 18. The workshop takes the form of a game in which participants play the role of machines that must solve some of the most common problems a computer faces in understanding language.