


Jacob Nielsen, Peter Schneider-Kamp, Lukas Galke: Continual Quantization-Aware Pre-Training: When to transition from 16-bit to 1.58-bit pre-training for BitNet language models? CoRR abs/2502.11895 (2025)
