


- Minsoo Kim, Sihwa Lee, Sukjin Hong, Du-Seong Chang, Jungwook Choi: Understanding and Improving Knowledge Distillation for Quantization Aware Training of Large Transformer Encoders. EMNLP 2022: 6713-6725
