We propose Dynamic Activation Composition, an adaptive approach for multi-property activation steering of LLMs.
The IT5 models are the first encoder-decoder transformers pretrained on more than 40 billion Italian words.
We introduce Retrieval and Attribute-Marking enhanced Prompting (RAMP) to perform attribute-controlled MT with multilingual LLMs.
DivEMT is a publicly available post-editing study of Neural Machine Translation over a typologically diverse set of target languages.