"BERT is to NLP what AlexNet is to CV: Can Pre-Trained Language Models ..."

Asahi Ushio et al. (2021)

Details

DOI: 10.18653/v1/2021.acl-long.280

access: open

type: Conference or Workshop Paper

metadata version: 2021-08-09