"LLaMA-MoE: Building Mixture-of-Experts from LLaMA with Continual Pre-Training."

Tong Zhu et al. (2024)


DOI: 10.18653/v1/2024.emnlp-main.890

access: open

type: Conference or Workshop Paper

metadata version: 2025-06-13