Multilingual

Multi-property Steering of Large Language Models with Dynamic Activation Composition

We propose Dynamic Activation Composition, an adaptive approach for multi-property activation steering of LLMs.

IT5: Text-to-text Pretraining for Italian Language Understanding and Generation

The IT5 models are the first encoder-decoder transformers pretrained on more than 40 billion Italian words.

RAMP: Retrieval and Attribute-Marking Enhanced Prompting for Attribute-Controlled Translation

We introduce Retrieval and Attribute-Marking enhanced Prompting (RAMP) to perform attribute-controlled MT with multilingual LLMs.

DivEMT: Neural Machine Translation Post-Editing Effort Across Typologically Diverse Languages

DivEMT is a publicly available post-editing study of neural machine translation across a typologically diverse set of target languages.