INTERSPEECH 2021

Multi-domain Knowledge Distillation via Uncertainty-Matching for End-to-End ASR Models

Ho-Gyeong Kim, Min-Joong Lee, Hoshik Lee, Tae Gyoon Kang, Jihyun Lee, Eunho Yang and Sung Ju Hwang

Knowledge Distillation basically matches predictive distributions of st...