line/LINE-DistilBERT-Japanese

DistilBERT model pre-trained on 131 GB of Japanese web text. The teacher model is a BERT-base model built in-house at LINE.
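A minimal usage sketch with the Hugging Face `transformers` library. The Hub model ID `line-corporation/line-distilbert-base-japanese` and the need for `trust_remote_code=True` (for a custom Japanese tokenizer) are assumptions not stated in this description:

```python
# Sketch: loading the distilled model for feature extraction.
# The Hub ID and the trust_remote_code flag below are assumptions.
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "line-corporation/line-distilbert-base-japanese"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID)

# Encode a Japanese sentence and inspect the contextual embeddings.
text = "LINEのDistilBERTモデルで文をエンコードする。"
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, 768) for a base-size model
```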