Word embedding is the first step in many neural networks, including Transformers (like ChatGPT) and other state-of-the-art models. Here we learn how to code a stand-alone word embedding network from scratch and with nn.Linear. We then learn how to load and use pre-trained word embedding values with nn.Embedding.
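The nn.Linear approach can be sketched roughly like this (a minimal illustration, not the video's exact code; the vocabulary and sizes are made up): each word is one-hot encoded, and a bias-free linear layer maps that one-hot vector to its embedding, so the layer's weights are the embedding values.

```python
import torch
import torch.nn as nn

vocab = ["troll2", "is", "great", "gymkata"]  # hypothetical 4-word vocabulary
vocab_size = len(vocab)
embedding_dim = 2  # 2 dimensions so the values are easy to graph

# With no bias, nn.Linear times a one-hot vector just selects one
# column of the weight matrix -- that column is the word's embedding.
embed = nn.Linear(vocab_size, embedding_dim, bias=False)

one_hot = torch.eye(vocab_size)  # one one-hot row vector per word
embeddings = embed(one_hot)      # shape: (4, 2), one embedding per word
print(embeddings.shape)
```

In practice the weights would then be trained with a prediction task so that similar words end up with similar embedding values.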
NOTE: This StatQuest assumes that you are already familiar with Word Embedding; if not, check out the 'Quest: • Word Embedding and Word2Vec, Clearly ...
If you'd like to support StatQuest, please consider...
Patreon: / statquest
...or...
YouTube Membership: / @statquest
...buying my book, a study guide, a t-shirt or hoodie, or a song from the StatQuest store...
https://statquest.org/statquest-store/
...or just donating to StatQuest!
paypal: https://www.paypal.me/statquest
venmo: @JoshStarmer
Lastly, if you want to keep up with me as I research and create new StatQuests, follow me on twitter:
/ joshuastarmer
0:00 Awesome song and introduction
1:53 Importing modules
2:48 Encoding the training data
6:55 Word Embedding from scratch
16:54 Graphing the embedding values
20:37 Word Embedding with nn.Linear
21:17 Printing out predicted words
28:12 Loading and using pre-trained Embedding values with nn.Embedding
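For the last chapter above, loading pre-trained embedding values with nn.Embedding can be sketched like this (a hedged example with invented weight values, not the video's exact code): nn.Embedding.from_pretrained() wraps an existing weight table, and indexing it with token IDs looks up the corresponding rows.

```python
import torch
import torch.nn as nn

# Pretend these 2-D embedding values came from a previously trained model.
pretrained_weights = torch.tensor([[ 1.0,  2.0],
                                   [-1.0,  0.5],
                                   [ 0.3, -0.7],
                                   [ 2.0,  1.0]])

# from_pretrained() builds the lookup table (frozen by default).
embed = nn.Embedding.from_pretrained(pretrained_weights)

token_ids = torch.tensor([0, 3])  # look up words by their integer index
vectors = embed(token_ids)        # returns rows 0 and 3 of the table
print(vectors)
```

Unlike the nn.Linear version, nn.Embedding skips the one-hot step entirely and indexes the weight table directly, which is why it is the standard choice in Transformers.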
#StatQuest #neuralnetworks #transformers