"Improved Knowledge Distillation for Pre-trained Language Models via ..."

Chenglong Wang et al. (2023)

DOI: 10.48550/ARXIV.2302.00444

access: open

type: Informal or Other Publication

metadata version: 2023-02-09