SELF-ATTENTION in NLP | How does it work? - Explained

Published: 12 October 2022
on channel: Data Science Garage
Views: 2,206
Likes: 48

This video clearly explains how Self-Attention works in NLP (Natural Language Processing). The main idea is to calculate the relationship between a given token (word) and the preceding tokens in the same text sequence.

Self-attention is widely used in areas such as text summarization and text generation. Self-attention was proposed by researchers at Google Research and Google Brain. It was introduced to address the difficulties that encoder-decoder architectures face when dealing with long sequences.

The self-attention model allows inputs to interact with each other (i.e., each input computes attention over all the other inputs). To better understand the self-attention mechanism, I suggest reviewing material about Transformers in NLP, because the Transformer model revolutionized the implementation of attention by dispensing with recurrence and convolutions and instead relying solely on a self-attention mechanism.
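To make the "inputs interact with each other" idea concrete, here is a minimal NumPy sketch (not code from the video; the matrices W_q, W_k, W_v and all dimensions are illustrative assumptions): in self-attention the Queries, Keys and Values are all projections of the same input sequence, so every token can attend to every other token in that sequence.

import numpy as np

np.random.seed(0)
X = np.random.randn(4, 6)        # toy sequence: 4 tokens, 6-dim embeddings

d_k = 6                          # illustrative query/key dimensionality
W_q = np.random.randn(6, d_k)    # learned projection matrices
W_k = np.random.randn(6, d_k)    # (random here, just to show the shapes)
W_v = np.random.randn(6, d_k)

# The "self" in self-attention: Q, K and V all come from the SAME sequence X,
# unlike encoder-decoder attention, where the Queries come from the decoder side.
Q = X @ W_q    # (4, d_k)
K = X @ W_k    # (4, d_k)
V = X @ W_v    # (4, d_k)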

You can read more about it here: https://machinelearningmastery.com/th...

As always in attention calculation (as we demonstrated with Encoder-Decoder attention in the previous video lesson), we need to compute a dot product between Q (Queries) and the transpose of K (Keys).
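As a rough illustration of that calculation (again a NumPy sketch rather than the video's exact example; the scaling by sqrt(d_k) and the softmax follow the standard scaled dot-product formulation), the attention weights come from softmax(Q K^T / sqrt(d_k)) and the output is a weighted sum of the Values:

import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)   # numerically stable softmax
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(Q, K, V):
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # dot product of Queries with Keys transposed
    weights = softmax(scores, axis=-1)  # each row: one token's attention over all tokens
    return weights @ V, weights         # output: weighted sum of the Values

np.random.seed(1)
Q = np.random.randn(4, 6)   # toy Queries, Keys and Values for 4 tokens
K = np.random.randn(4, 6)
V = np.random.randn(4, 6)

output, weights = self_attention(Q, K, V)
print(weights.round(2))   # 4x4 attention matrix, rows sum to 1
print(output.shape)       # (4, 6): one context vector per token

Each row of the weights matrix shows how strongly one token attends to every token in the sequence.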

The content of the video:
0:00 - Intro to attention and other mechanisms
0:48 - Principal scheme explanation of Self-Attention
6:40 - Quick example

Learn more in this video right now and enjoy!
I hope the attention intuition introduced in this video helps you understand how the self-attention mechanism works in more detail. If so, I would be very glad.

The first lesson: Encoder-Decoder Attention in NLP :    • ENCODER-DECODER Attention in NLP | Ho...  

#attention #nlp #transformers #selfattention #dotproduct #ai #datascience #datasciencegarage #softmax
‪@DataScienceGarage‬
