BERT for Sequence-to-Sequence Multi-label Text Classification

Published: 4 December 2020
Channel: AIST Conference
1,506 views · 18 likes

Ramil Yarullin and Pavel Serdyukov


We study the BERT language representation model and the sequence generation model with BERT encoder for the multi-label text classification task. We show that the Sequence Generating BERT model achieves decent results in significantly fewer training epochs compared to the standard BERT. We also introduce and experimentally examine a mixed model, an ensemble of BERT and Sequence Generating BERT models. Our experiments demonstrate that the proposed model outperforms current baselines in several metrics on three well-studied multi-label classification datasets with English texts and two private Yandex Taxi datasets with Russian texts.
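The abstract does not spell out the architecture, but the idea of a "Sequence Generating BERT" can be illustrated with a short sketch: a BERT encoder conditions a small autoregressive decoder that emits label IDs one at a time until an end-of-sequence token, so the predicted label set is read off the generated sequence. Everything below is an illustrative assumption, not the authors' implementation: the label vocabulary, the GRU decoder, and all hyperparameters are placeholders.

```python
# Minimal sketch (not the authors' code): BERT encoder + autoregressive
# label decoder for multi-label classification as sequence generation.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

LABELS = ["sports", "politics", "tech"]      # hypothetical label vocabulary
BOS, EOS = len(LABELS), len(LABELS) + 1      # special decoder-side tokens
VOCAB = len(LABELS) + 2

class SeqGenBert(nn.Module):
    def __init__(self, bert_name="bert-base-uncased", hidden=768):
        super().__init__()
        self.encoder = BertModel.from_pretrained(bert_name)
        self.embed = nn.Embedding(VOCAB, hidden)
        self.decoder = nn.GRU(hidden, hidden, batch_first=True)
        self.out = nn.Linear(hidden, VOCAB)

    def forward(self, input_ids, attention_mask, label_seq):
        # Teacher forcing: the gold label sequence (starting with BOS) is fed
        # to the decoder; the [CLS] state initializes the decoder hidden state.
        enc = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        h0 = enc.last_hidden_state[:, :1, :].transpose(0, 1).contiguous()
        dec_out, _ = self.decoder(self.embed(label_seq), h0)
        return self.out(dec_out)             # (batch, seq_len, VOCAB) logits

    @torch.no_grad()
    def generate(self, input_ids, attention_mask, max_labels=8):
        # Greedy decoding for a single example: emit labels until EOS.
        enc = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        h = enc.last_hidden_state[:, :1, :].transpose(0, 1).contiguous()
        tok = torch.full((1, 1), BOS, dtype=torch.long)
        labels = []
        for _ in range(max_labels):
            out, h = self.decoder(self.embed(tok), h)
            tok = self.out(out).argmax(-1)
            if tok.item() >= len(LABELS):    # EOS (or stray BOS): stop
                break
            labels.append(LABELS[tok.item()])
        return labels

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = SeqGenBert()
batch = tokenizer(["The match ended 2-1 after extra time."], return_tensors="pt")
print(model.generate(batch["input_ids"], batch["attention_mask"]))
```

Training such a model would minimize cross-entropy between the logits from forward and the gold label sequence shifted left by one position. The mixed model from the abstract then combines this generator with a standard multi-label BERT classifier; one plausible scheme is averaging per-label scores from the two models, though the abstract does not specify how the ensemble is formed.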

