"Self-Distillation for Model Stacking Unlocks Cross-Lingual NLU in 200+ ..."

Fabian David Schmidt et al. (2024)


DOI:

access: open

type: Conference or Workshop Paper

metadata version: 2024-11-18