AI Reading List (by Ilya Sutskever) - Part 3

Published: June 12, 2024
Channel: DataMListic
1,621 views
56 likes

In the third part of the AI reading list series, we continue with the next five items from the list that Ilya Sutskever, former OpenAI chief scientist, gave to John Carmack. Ilya reportedly added: "If you really learn all of these, you'll know 90% of what matters today."

References
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
AI Reading List - Part 1:    • AI Reading List (by Ilya Sutskever) -...  
AI Reading List - Part 2:    • AI Reading List (by Ilya Sutskever) -...  
Transformer Self-Attention Mechanism Explained:    • Transformer Self-Attention Mechanism ...  
Effective Approaches to Attention-based Neural Machine Translation paper: https://arxiv.org/abs/1508.04025

Reading List
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Multi-Scale Context Aggregation by Dilated Convolutions: https://arxiv.org/pdf/1511.07122
Neural Message Passing for Quantum Chemistry: https://arxiv.org/pdf/1704.01212
Attention Is All You Need: https://arxiv.org/pdf/1706.03762
Neural Machine Translation by Jointly Learning to Align and Translate: https://arxiv.org/pdf/1409.0473
Identity Mappings in Deep Residual Networks: https://arxiv.org/pdf/1603.05027
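At the heart of item 3, "Attention Is All You Need", is scaled dot-product attention. As a rough orientation before reading the paper, here is a minimal NumPy sketch of that operation (the function name, shapes, and toy data below are illustrative, not from the paper or the video):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)               # (n_q, n_k) similarity scores
    scores -= scores.max(axis=-1, keepdims=True)  # shift for numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1 (softmax)
    return weights @ V                            # weighted sum of value vectors

# Toy example: 3 query tokens attending over 4 key/value tokens, d_k = 2.
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 2))
K = rng.normal(size=(4, 2))
V = rng.normal(size=(4, 2))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 2): one output vector per query
```

Each output row is a convex combination of the value vectors, weighted by how strongly the query matches each key; the paper stacks this into multi-head attention with learned projections.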

Related Videos
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
Why Language Models Hallucinate:    • Why Language Models Hallucinate  
Grounding DINO, Open-Set Object Detection:    • Object Detection Part 8: Grounding DI...  
Transformer Self-Attention Mechanism Explained:    • Transformer Self-Attention Mechanism ...  
How to Fine-tune Large Language Models Like ChatGPT with Low-Rank Adaptation (LoRA):    • How to Fine-tune Large Language Model...  
Multi-Head Attention (MHA), Multi-Query Attention (MQA), Grouped Query Attention (GQA) Explained:    • Multi-Head Attention (MHA), Multi-Que...  
LLM Prompt Engineering with Random Sampling: Temperature, Top-k, Top-p:    • LLM Prompt Engineering with Random Sa...  
The Era of 1-bit LLMs: All Large Language Models are in 1.58 Bits:    • The Era of 1-bit LLMs: All Large Lang...  

Contents
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
00:00 - Intro
00:21 - Item 1: Dilated Convolutions
01:10 - Item 2: Neural Message Passing for Quantum Chemistry
01:35 - Item 3: Attention Is All You Need
02:42 - Item 4: NMT by Jointly Learning to Align and Translate
03:12 - Item 5: Identity Mappings in Deep Residual Networks
04:30 - Outro

Follow Me
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
🐦 Twitter: @datamlistic
📸 Instagram: @datamlistic
📱 TikTok: @datamlistic

Channel Support
▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬▬
The best way to support the channel is to share the content. ;)

If you'd like to also support the channel financially, donating the price of a coffee is always warmly welcomed! (completely optional and voluntary)
► Patreon:   / datamlistic  
► Bitcoin (BTC): 3C6Pkzyb5CjAUYrJxmpCaaNPVRgRVxxyTq
► Ethereum (ETH): 0x9Ac4eB94386C3e02b96599C05B7a8C71773c9281
► Cardano (ADA): addr1v95rfxlslfzkvd8sr3exkh7st4qmgj4ywf5zcaxgqgdyunsj5juw5
► Tether (USDT): 0xeC261d9b2EE4B6997a6a424067af165BAA4afE1a

#seq2seq #attentionmechanism #dilatedconvolutions #identitymappings #messagepassing

