"Private Model Compression via Knowledge Distillation."

Ji Wang et al. (2019)

Details and statistics

DOI: 10.1609/AAAI.V33I01.33011190

access: open

type: Conference or Workshop Paper

metadata version: 2024-05-22
