"Improved Knowledge Distillation for Pre-trained Language Models via ..."

Chenglong Wang et al. (2022)


DOI: 10.18653/V1/2022.FINDINGS-EMNLP.464

access: open

type: Conference or Workshop Paper

