"Multi-head Knowledge Distillation for Model Compression."

Huan Wang et al. (2020)

access: open

type: Informal or Other Publication

metadata version: 2022-05-18