"MiniLMv2: Multi-Head Self-Attention Relation Distillation for Compressing ..."

Wenhui Wang et al. (2020)

access: open

type: Informal or Other Publication

metadata version: 2024-04-19
