
HuggingFace

Non Verbis, Sed Rebus: Large Language Models are Weak Solvers of Italian Rebuses

We evaluate the rebus-solving capabilities of large language models on a new Italian dataset.
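
If the accompanying dataset is distributed on the Hugging Face Hub, it can be loaded with the `datasets` library. A minimal sketch, assuming the identifier `gsarti/eureka-rebus` and a `train` split (both unverified here):

```python
# Minimal sketch: loading a rebus dataset from the Hugging Face Hub.
# The dataset id "gsarti/eureka-rebus" and the "train" split are assumptions;
# a configuration name may also be required as a second positional argument.
from datasets import load_dataset

rebuses = load_dataset("gsarti/eureka-rebus", split="train")
print(rebuses[0])  # inspect one rebus puzzle and its gold solution
```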

IT5: Text-to-text Pretraining for Italian Language Understanding and Generation

The IT5 models are the first encoder-decoder transformers pretrained on more than 40 billion Italian words.
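
The checkpoints can be loaded directly with 🤗 Transformers. A minimal sketch, assuming `gsarti/it5-base` is one of the released checkpoint identifiers:

```python
# Minimal sketch: loading an IT5 checkpoint with Hugging Face Transformers.
# The model id "gsarti/it5-base" is assumed to be a released checkpoint.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("gsarti/it5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("gsarti/it5-base")

# The pretrained model is intended to be fine-tuned on downstream Italian
# text-to-text tasks (e.g. summarization or style transfer) before use.
```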

Inseq: An Interpretability Toolkit for Sequence Generation Models

We present Inseq, a Python library to democratize access to interpretability analyses of sequence generation models.

Inseq: An Interpretability Toolkit for Sequence Generation Models

An open-source library to democratize access to model interpretability for sequence generation models.
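
A minimal sketch of a typical Inseq workflow, using `Helsinki-NLP/opus-mt-en-it` purely as an illustrative model choice:

```python
# Minimal sketch: attribute a translation model's generation with Inseq.
# The model id "Helsinki-NLP/opus-mt-en-it" is an illustrative choice only.
import inseq

# Wrap a Hugging Face model together with an attribution method
model = inseq.load_model("Helsinki-NLP/opus-mt-en-it", "integrated_gradients")

# Attribute a generation and visualize token-level importance scores
out = model.attribute("Hello everyone, how are you today?")
out.show()
```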

Contrastive Language-Image Pre-training for the Italian Language

We present the first CLIP model for the Italian language (CLIP-Italian), trained on more than 1.4 million image-text pairs.

Contrastive Image-Text Pretraining for Italian

The first CLIP model pretrained for the Italian language.
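
A minimal sketch of zero-shot image-text scoring with the model, assuming the Hub identifier `clip-italian/clip-italian` and compatibility with the `VisionTextDualEncoderModel` and `AutoProcessor` classes (not confirmed here):

```python
# Minimal sketch: scoring Italian captions against an image with a CLIP-style
# dual encoder. The hub id "clip-italian/clip-italian" and its compatibility
# with VisionTextDualEncoderModel/AutoProcessor are assumptions.
from PIL import Image
from transformers import VisionTextDualEncoderModel, AutoProcessor

model = VisionTextDualEncoderModel.from_pretrained("clip-italian/clip-italian")
processor = AutoProcessor.from_pretrained("clip-italian/clip-italian")

image = Image.open("gatto.jpg")  # hypothetical local image file
inputs = processor(
    text=["una foto di un gatto", "una foto di un cane"],
    images=image,
    return_tensors="pt",
    padding=True,
)
outputs = model(**inputs)
print(outputs.logits_per_image.softmax(dim=-1))  # caption matching probabilities
```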