"General Cross-Architecture Distillation of Pretrained Language Models into ..."

Lukas Galke et al. (2022)

DOI: 10.1109/IJCNN55064.2022.9892144

access: closed

type: Conference or Workshop Paper

metadata version: 2022-10-10