"LexMAE: Lexicon-Bottlenecked Pretraining for Large-Scale Retrieval."

Tao Shen et al. (2022)


DOI: 10.48550/ARXIV.2208.14754

access: open

type: Informal or Other Publication

metadata version: 2023-06-26