In this video we present BERT, a transformer-based language model. BERT is pre-trained in a self-supervised manner on a large text corpus. After pre-training, we can use transfer learning and fine-tune the model on a new task, obtaining good performance even with a limited annotated dataset for the specific task we would like to solve (e.g., a text classification task).
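To make the self-supervised pre-training idea concrete, here is a minimal stdlib-only sketch of the masked-language-modeling corruption step: tokens are masked at random and the model is trained to predict them. The 15% masking rate follows the BERT paper; the simplification of always substituting `[MASK]` (the real recipe sometimes keeps the original token or swaps in a random one) and the function name `mask_tokens` are ours, for illustration only.

```python
import random

def mask_tokens(tokens, mask_prob=0.15, mask_token="[MASK]", seed=0):
    """Simplified BERT-style masked-language-modeling corruption.

    Each token is selected with probability mask_prob; a selected token
    is replaced by [MASK] and becomes a prediction target for the model.
    """
    rng = random.Random(seed)
    corrupted, targets = [], []
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            corrupted.append(mask_token)
            targets.append((i, tok))  # the model must predict tok at position i
        else:
            corrupted.append(tok)
    return corrupted, targets

tokens = "the model learns to fill in missing words".split()
corrupted, targets = mask_tokens(tokens)
```

Because no labels are needed for this objective, any large unannotated corpus can serve as training data; the labeled data is only required later, during fine-tuning.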
The original paper: https://arxiv.org/abs/1810.04805.
Slides used in video: https://chalmersuniversity.box.com/s/....
Video "BERT: transfer learning for NLP" by Lennart Svensson, published 10 September 2021.