BERT for Sequence-to-Sequence Multi-label Text Classification

Published: 04 December 2020
on channel: AIST Conference


Ramil Yarullin and Pavel Serdyukov


We study the BERT language representation model and a sequence generation model with a BERT encoder for the multi-label text classification task. We show that the Sequence Generating BERT model achieves decent results in significantly fewer training epochs than standard BERT. We also introduce and experimentally examine a mixed model, an ensemble of the BERT and Sequence Generating BERT models. Our experiments demonstrate that the proposed model outperforms current baselines on several metrics across three well-studied multi-label classification datasets with English texts and two private Yandex Taxi datasets with Russian texts.
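
To make the standard BERT baseline from the abstract concrete, here is a minimal sketch of a multi-label BERT classifier, assuming PyTorch and the Hugging Face transformers library. The class name, the bert-base-uncased checkpoint, and the one-vs-all binary cross-entropy training objective are illustrative assumptions for a typical setup, not the authors' implementation.

    import torch
    import torch.nn as nn
    from transformers import BertModel

    class BertMultiLabelClassifier(nn.Module):
        """Hypothetical baseline: a linear head over BERT's [CLS] vector,
        so each label is scored independently (standard multi-label setup)."""

        def __init__(self, num_labels: int, model_name: str = "bert-base-uncased"):
            super().__init__()
            self.bert = BertModel.from_pretrained(model_name)
            self.head = nn.Linear(self.bert.config.hidden_size, num_labels)

        def forward(self, input_ids: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
            out = self.bert(input_ids=input_ids, attention_mask=attention_mask)
            cls_vec = out.last_hidden_state[:, 0]  # [CLS] token representation
            return self.head(cls_vec)              # raw per-label logits

    # Training typically uses binary cross-entropy over multi-hot targets:
    # loss_fn = nn.BCEWithLogitsLoss()
    # loss = loss_fn(model(input_ids, attention_mask), multi_hot_targets.float())

By contrast, the Sequence Generating BERT model attaches an autoregressive decoder to the BERT encoder and emits the label set one label at a time, and the mixed model ensembles the two; one plausible mixing scheme is a convex combination of the two models' per-label probabilities, though the exact scheme used is specified in the paper itself.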

