"Knowledge Base Grounded Pre-trained Language Models via Distillation."

Raphaël Sourty et al. (2024)


DOI: 10.1145/3605098.3635888

access: closed

type: Conference or Workshop Paper

