In this video, I will talk about the Embedding module of PyTorch. It has many applications in natural language processing and also when working with categorical variables. I will explain some of its features, such as the padding index (padding_idx) and the maximum norm (max_norm). In the second part of the video, I will use the Embedding module to represent the characters of the English alphabet and build a text-generating model (a minimal sketch of such a model follows the chapter list below). Once we train the model, we will look into how the character embeddings evolved over the epochs.
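A minimal sketch of the two features mentioned above, padding_idx and max_norm, on torch.nn.Embedding; the table sizes and token ids here are purely illustrative and are not taken from the video's code:

import torch

# Embedding table with 10 rows of dimension 3; row 0 is reserved for padding.
emb = torch.nn.Embedding(num_embeddings=10, embedding_dim=3, padding_idx=0)

tokens = torch.tensor([[0, 4, 7], [2, 0, 0]])  # 0 = padding token
vectors = emb(tokens)            # shape: (2, 3, 3)
print(vectors[0, 0])             # the padding row is all zeros and receives no gradient

# max_norm renormalizes any looked-up row whose L2 norm exceeds 1.0.
emb_capped = torch.nn.Embedding(10, 3, max_norm=1.0)
print(emb_capped(tokens).norm(dim=-1))  # all norms <= 1.0 after the lookup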
Code: https://github.com/jankrepl/mildlyove...
00:00 Intro
01:23 BERT example
01:56 Behavior explained (IPython)
04:25 Intro character-level model
05:29 Dataset implementation
08:53 Network implementation
12:12 Text generating function
14:00 Training script implementation
17:55 Launching and analyzing results
18:31 Visualization of results
20:31 Outro
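For orientation, here is a rough sketch of the kind of character-level network the chapters walk through: characters go through an Embedding layer, a recurrent layer summarizes the window, and a linear layer predicts the next character. The module choices, vocabulary size, and dimensions below are assumptions for illustration, not the exact implementation from the repository:

import torch
import torch.nn as nn

class CharModel(nn.Module):
    def __init__(self, vocab_size=28, embedding_dim=2, hidden_dim=16):
        super().__init__()
        # padding_idx=0 keeps the padding character at the zero vector
        self.embedding = nn.Embedding(vocab_size, embedding_dim, padding_idx=0)
        self.lstm = nn.LSTM(embedding_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, vocab_size)

    def forward(self, x):
        emb = self.embedding(x)        # (batch, window, embedding_dim)
        _, (h_n, _) = self.lstm(emb)   # h_n: (1, batch, hidden_dim)
        return self.fc(h_n[-1])        # logits over the next character

model = CharModel()
dummy_batch = torch.randint(0, 28, (4, 10))  # 4 windows of 10 character ids
print(model(dummy_batch).shape)              # torch.Size([4, 28])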
If you have any video suggestions or you just wanna chat, feel free to join the Discord server: / discord
Twitter: / moverfitted
Credits logo animation
Title: Conjungation · Author: Uncle Milk · Source: / unclemilk · License: https://creativecommons.org/licenses/... · Download (9MB): https://auboutdufil.com/?id=600