"Self-Distillation into Self-Attention Heads for Improving ..."
Ye-Rin Jeoung et al. (2023)
- Ye-Rin Jeoung, Jeong-Hwan Choi, Ju-Seok Seong, Jehyun Kyung, Joon-Hyuk Chang: Self-Distillation into Self-Attention Heads for Improving Transformer-based End-to-End Neural Speaker Diarization. INTERSPEECH 2023: 3197-3201