In Episode 7 we’ll tackle two important pieces of BERT’s internal architecture: (1) the “Feed Forward Network”, which makes up the second half of each Encoder layer, and (2) the “Positional Encoding Vectors”, which allow BERT to incorporate information about the relative positions of the words in a sentence.
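As a quick preview, the feed-forward sublayer is just two linear transformations with a nonlinearity in between, applied independently at each token position. Here is a minimal NumPy sketch (my own illustration, not code from the episode); the dimensions in the comments and the GELU activation match BERT-base, whereas the original Transformer paper uses ReLU:

```python
import numpy as np

def gelu(x):
    # Gaussian Error Linear Unit, the activation BERT uses
    # (tanh approximation from the original BERT code).
    return 0.5 * x * (1 + np.tanh(np.sqrt(2 / np.pi) * (x + 0.044715 * x**3)))

def feed_forward(x, W1, b1, W2, b2):
    # Position-wise FFN: expand each token vector to the intermediate
    # size, apply the nonlinearity, then project back down.
    # In BERT-base: d_model = 768, d_ff = 3072.
    # x: (seq_len, d_model), W1: (d_model, d_ff), W2: (d_ff, d_model)
    return gelu(x @ W1 + b1) @ W2 + b2
```

Note that the same weights are applied at every position; only the self-attention sublayer mixes information across tokens.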
==== Series Playlist ====
• BERT Research Series
==== Updates ====
Sign up to hear about new content across my blog and channel: https://www.chrismccormick.ai/subscribe
==== References ====
Here is the blog post I referenced, which goes into more of the math behind positional encoding: https://kazemnejad.com/blog/transform...
Note that the author follows the paper’s definition of the functions (where the sine and cosine signals are interleaved), whereas Jay Alammar’s post follows the code implementation (where the sine and cosine signals are concatenated).
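To make the distinction concrete, here is a minimal NumPy sketch (my own, not taken from either post) that can produce both layouts. The two matrices contain exactly the same sine and cosine values; only the column ordering differs:

```python
import numpy as np

def positional_encoding(max_len, d_model, interleave=True):
    """Sinusoidal positional encodings (assumes d_model is even).

    interleave=True  -> paper's definition: sin/cos alternate across
                        dimensions (0, 2, 4, ... sine; 1, 3, 5, ... cosine).
    interleave=False -> code-style layout: all sines in the first half
                        of the dimensions, all cosines in the second half.
    """
    positions = np.arange(max_len)[:, np.newaxis]    # (max_len, 1)
    dims = np.arange(d_model // 2)[np.newaxis, :]    # (1, d_model / 2)

    # Each dimension pair gets its own wavelength, from 2*pi up to
    # 10000 * 2*pi, following the formula in "Attention Is All You Need".
    angles = positions / np.power(10000, 2 * dims / d_model)

    pe = np.zeros((max_len, d_model))
    if interleave:
        pe[:, 0::2] = np.sin(angles)   # even dimensions
        pe[:, 1::2] = np.cos(angles)   # odd dimensions
    else:
        pe[:, :d_model // 2] = np.sin(angles)   # first half
        pe[:, d_model // 2:] = np.cos(angles)   # second half
    return pe

# e.g. encodings for a 128-token sequence at BERT-base's hidden size:
pe = positional_encoding(128, 768)
```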