The Illustrated Word2vec - A Gentle Intro to Word Embeddings in Machine Learning

Published: September 14, 2022
Channel: Jay Alammar
49,493 views
1k likes

The concept of word embeddings is a central one in natural language processing (NLP). It's a method of representing words numerically, as lists of numbers that capture their meaning. Word2vec is an algorithm (a couple of algorithms, actually) for creating word vectors, and it helped popularize this concept. In this video, Jay takes you on a guided tour of The Illustrated Word2vec, an article explaining the method and how it came to be developed.
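To make the idea of "lists of numbers that capture meaning" concrete, here is a minimal sketch using hypothetical toy vectors (real word2vec embeddings are learned from large text corpora and typically have 100-300 dimensions; the values and the 4-dimensional space below are invented purely for illustration). It shows cosine similarity and the famous king - man + woman ≈ queen analogy the article discusses:

```python
from math import sqrt

# Hypothetical 4-dimensional word vectors, hand-picked for illustration only.
# Real word2vec vectors are learned from text, not assigned by hand.
vectors = {
    "king":  [0.9, 0.8, 0.1, 0.2],
    "queen": [0.9, 0.1, 0.8, 0.2],
    "man":   [0.1, 0.9, 0.1, 0.1],
    "woman": [0.1, 0.1, 0.9, 0.1],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means same direction, near 0 means unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = sqrt(sum(x * x for x in a))
    norm_b = sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Vector arithmetic on meanings: king - man + woman
analogy = [k - m + w for k, m, w in
           zip(vectors["king"], vectors["man"], vectors["woman"])]

# The nearest word vector to the result is "queen" in this toy space.
best = max(vectors, key=lambda word: cosine(vectors[word], analogy))
print(best)  # → queen
```

In practice the query words (king, man, woman) are usually excluded from the candidate set when searching for the nearest neighbor; in this toy example "queen" wins either way.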

The article: https://jalammar.github.io/illustrate...
The talk:    • Intuition & Use-Cases of Embeddings i...  

Word2vec paper: https://proceedings.neurips.cc/paper/...
By Tomas Mikolov, Ilya Sutskever, Kai Chen, Greg Corrado, and Jeffrey Dean

---

Twitter:   / jayalammar  
Blog: https://jalammar.github.io/
Mailing List: https://jayalammar.substack.com/

---


More videos by Jay:

Language Processing with BERT: The 3 Minute Intro (Deep learning for NLP)
   • Language Processing with BERT: The 3 ...  

Explainable AI Cheat Sheet - Five Key Categories
   • Explainable AI Cheat Sheet - Five Key...  

The Narrated Transformer Language Model
   • The Narrated Transformer Language Model  

