"Class Attention Transfer Based Knowledge Distillation."

Ziyao Guo et al. (2023)

DOI: 10.1109/CVPR52729.2023.01142

access: closed

type: Conference Paper (IEEE/CVF Conference on Computer Vision and Pattern Recognition, CVPR 2023)
