"DOCmT5: Document-Level Pretraining of Multilingual Language Models."

Chia-Hsuan Lee et al. (2022)

DOI: 10.18653/V1/2022.FINDINGS-NAACL.32

access: open

type: Conference or Workshop Paper

metadata version: 2022-08-01
