In episode 7 we’ll tackle two important pieces of BERT’s internal architecture: (1) the “Feed Forward Network”, which is the second half of the Encoder, and (2) the “Positional Encoding Vectors”, which allow BERT to incorporate information about the relative positions of the words in a sentence.
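For reference, the Feed Forward Network is just two linear layers applied independently at each position, with a GELU activation in between. Here is a minimal numpy sketch, not BERT's actual code; the weight names are placeholders, and BERT-base uses d_model = 768 and d_ff = 3072:

import numpy as np

def feed_forward(x, W1, b1, W2, b2):
    # Position-wise FFN: x is (seq_len, d_model).
    # W1: (d_model, d_ff), W2: (d_ff, d_model).
    h = x @ W1 + b1  # Expand each position to d_ff dimensions.
    # GELU activation, using the tanh approximation from the BERT code.
    h = 0.5 * h * (1.0 + np.tanh(np.sqrt(2.0 / np.pi) * (h + 0.044715 * h**3)))
    return h @ W2 + b2  # Project back down to d_model.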
==== Series Playlist ====
• BERT Research Series
==== Updates ====
Sign up to hear about new content across my blog and channel: https://www.chrismccormick.ai/subscribe
==== References ====
Here is the blog post I referenced, which goes into more of the math behind positional encoding: https://kazemnejad.com/blog/transform...
Note that the author follows the paper’s definition of the functions (where the sine and cosine signals are interleaved), whereas Jay Alammar’s post follows the code implementation (where the sine and cosine signals are concatenated).
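To make that distinction concrete, here is a small numpy sketch of the sinusoidal encoding from the original Transformer paper, with a flag (my own, for illustration) that switches between the interleaved layout from the paper and the concatenated layout seen in many implementations. It assumes d_model is even:

import numpy as np

def positional_encoding(max_len, d_model, interleave=True):
    # One frequency per pair of dimensions: pos / 10000^(2i / d_model).
    pos = np.arange(max_len)[:, None]       # (max_len, 1)
    i = np.arange(d_model // 2)[None, :]    # (1, d_model / 2)
    angles = pos / np.power(10000, 2 * i / d_model)
    pe = np.zeros((max_len, d_model))
    if interleave:
        # Paper's definition: sine on even dims, cosine on odd dims.
        pe[:, 0::2] = np.sin(angles)
        pe[:, 1::2] = np.cos(angles)
    else:
        # Common code implementation: a sine block, then a cosine block.
        pe[:, :d_model // 2] = np.sin(angles)
        pe[:, d_model // 2:] = np.cos(angles)
    return pe

Either way, each row still contains the same set of sine and cosine values; only the ordering of the dimensions differs.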