Italian Transformers Under the Linguistic Lens


In this paper we present an in-depth investigation of the linguistic knowledge encoded by the transformer models currently available for the Italian language. In particular, we investigate whether and how different probing model architectures affect the performance of Italian transformers in encoding a wide spectrum of linguistic features. Moreover, we explore how this implicit knowledge varies across textual genres.

In Proceedings of the Seventh Italian Conference on Computational Linguistics (CLiC-it 2020)