Distilling Linguistic Context for Language Model Compression
Geondo Park, Gyeongman Kim, Eunho Yang
EMNLP 2021

A computationally expensive and memory-intensive neural network lies behind the recent success of language representation learning. Knowledge distillation, a major technique...