The concept of word embeddings is a central one in natural language processing (NLP). It's a method of representing words numerically -- as lists of numbers that capture their meaning. Word2vec is an algorithm (a couple of algorithms, actually) for creating word vectors, and it helped popularize this concept. In this video, Jay takes you on a guided tour of The Illustrated Word2Vec, an article explaining the method and how it came to be developed.
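To make the idea concrete, here is a minimal sketch of what "words as lists of numbers" buys you: words with similar meanings get vectors that point in similar directions, which we can measure with cosine similarity. The vectors below are made-up toy values (real word2vec embeddings typically have 100-300 dimensions learned from text), so this only illustrates the comparison, not the training.

```python
import math

# Toy 4-dimensional "embeddings" -- made-up numbers for illustration only.
# Real word2vec vectors are learned from large text corpora.
vectors = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.8, 0.9, 0.1, 0.3],
    "apple": [0.1, 0.1, 0.9, 0.8],
}

def cosine_similarity(a, b):
    """Score how closely two word vectors point in the same direction (max 1.0)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "king" should score closer to "queen" than to "apple".
print(cosine_similarity(vectors["king"], vectors["queen"]))
print(cosine_similarity(vectors["king"], vectors["apple"]))
```

With these toy numbers, king/queen scores near 1.0 while king/apple is noticeably lower -- the same comparison the article performs with real learned vectors.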
The article: https://jalammar.github.io/illustrate...
The talk: • Intuition & Use-Cases of Embeddings i...
Word2vec paper: https://proceedings.neurips.cc/paper/...
By Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean
---
Twitter: / jayalammar
Blog: https://jalammar.github.io/
Mailing List: https://jayalammar.substack.com/
---
More videos by Jay:
Language Processing with BERT: The 3 Minute Intro (Deep learning for NLP)
• Language Processing with BERT: The 3 ...
Explainable AI Cheat Sheet - Five Key Categories
• Explainable AI Cheat Sheet - Five Key...
The Narrated Transformer Language Model
• The Narrated Transformer Language Model
Video by Jay Alammar, published 14 September 2022.