SELF-ATTENTION in NLP | How does it work? - Explained

Published: October 12, 2022
on the channel Data Science Garage
2,206 views
48 likes

This video explains how Self-Attention works in NLP (Natural Language Processing). The main idea is to calculate the relationship between the current token (word) and the previous tokens in the same text sequence.

Self-attention is broadly used in areas such as text summarization and text generation. It was proposed by researchers at Google Research and Google Brain to address the challenges encoder-decoder architectures face when dealing with long sequences.

The self-attention model allows inputs to interact with each other (i.e., to compute attention over all other inputs). To better understand the self-attention mechanism, I suggest checking out material about Transformers in NLP, because the Transformer model revolutionized the implementation of attention by dispensing with recurrence and convolutions and instead relying solely on a self-attention mechanism.
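For reference, the Transformer paper ("Attention Is All You Need") formalizes this as scaled dot-product attention: Attention(Q, K, V) = softmax(Q·Kᵀ / √d_k)·V, where d_k is the dimension of the key vectors and the softmax turns the raw scores into weights that sum to 1.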

You can read more about it here: https://machinelearningmastery.com/th...

As always in attention calculation (as we demonstrated with Encoder-Decoder attention in the last video lesson), we need to calculate a dot product between Q (Queries) and K (Keys) transposed.
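To make the computation concrete, here is a minimal NumPy sketch of single-head self-attention. The weight matrices Wq, Wk, Wv and the toy sizes are illustrative assumptions for this description, not values from the video:

import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Project input embeddings into Queries, Keys, and Values
    Q = X @ Wq
    K = X @ Wk
    V = X @ Wv
    d_k = K.shape[-1]
    # Dot product of Q with K transposed, scaled by sqrt(d_k)
    scores = Q @ K.T / np.sqrt(d_k)
    # Row-wise softmax turns the scores into attention weights
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output row is a weighted sum of all Value vectors
    return weights @ V

# Toy example: a sequence of 3 tokens with 4-dimensional embeddings
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 4))
Wq, Wk, Wv = (rng.normal(size=(4, 4)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (3, 4)

Running this prints (3, 4): one updated 4-dimensional vector per token, where each vector now mixes in information from every other token in the sequence.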

The content of the video:
0:00 - Intro to attention and other mechanisms
0:48 - Schematic explanation of Self-Attention
6:40 - Quick example

Learn more in this video right now and enjoy!
I hope the attention intuition introduced in this video helps you understand how the self-attention mechanism works in more detail. If it does, I'll be very glad to hear it.

The first lesson: Encoder-Decoder Attention in NLP: • ENCODER-DECODER Attention in NLP | Ho...

#attention #nlp #transformers #selfattention #dotproduct #ai #datascience #datasciencegarage #softmax
@DataScienceGarage

