Event page: https://www.meetup.com/Deep-Learning-...
Hello dear friends 🌎 🌍,
We hope you are enjoying our latest sessions 🏖️ on deploying models and cool applications to our phones 📱 and edge devices 🕹️
Our friend Dmitri has offered to lead us through some very interesting papers, code, and content, all on the attention mechanism in deep learning models and transformers! 😲
Attention and Transformers Part 2/2
- Presentation slides: https://docs.google.com/presentation/...
Quick overview of the previous session
Describe the zoo of transformer models (BERT, RoBERTa, T5, GPT-1/2/3, etc.). Distinguish between encoder-decoder and decoder-only (generative) models (see the short code sketch after the resource list below). Reference (a) TF Hub for BERT variants, (b) the Hugging Face library 💪
Bidirectional Encoder Representations from Transformers (BERT):
https://tfhub.dev/google/collections/...
Collection of BERT experts fine-tuned on different datasets:
https://tfhub.dev/google/collections/...
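If you want to try these out before the session, here is a minimal sketch (our own illustration, not taken from the slides) of loading a BERT encoder from TF Hub; the two model handles are example choices, and any variant from the collections above can be swapped in:

```python
# Minimal sketch: load a BERT preprocessor and encoder from TF Hub.
# Assumes: pip install tensorflow tensorflow_hub tensorflow_text
import tensorflow as tf
import tensorflow_hub as hub
import tensorflow_text  # registers the ops the preprocessing model needs

# Example handles, not necessarily the ones covered in the session.
preprocess = hub.KerasLayer("https://tfhub.dev/tensorflow/bert_en_uncased_preprocess/3")
encoder = hub.KerasLayer("https://tfhub.dev/tensorflow/bert_en_uncased_L-12_H-768_A-12/4")

sentences = tf.constant(["Attention is all you need."])
outputs = encoder(preprocess(sentences))
print(outputs["pooled_output"].shape)    # (1, 768) sentence-level embedding
print(outputs["sequence_output"].shape)  # (1, 128, 768) per-token embeddings
```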
Hugging Face: State-of-the-art Natural Language Processing for JAX, PyTorch, and TensorFlow
https://huggingface.co/transformers
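As a taste of the Hugging Face API, here is a tiny sketch (again our own example, assuming `pip install transformers` plus a backend such as PyTorch) that runs an encoder-only BERT checkpoint through the fill-mask pipeline:

```python
# Minimal sketch: masked-word prediction with a BERT checkpoint.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill("Attention is all you [MASK].")[:3]:
    print(pred["token_str"], round(pred["score"], 3))  # top predictions for the masked word
```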
Run a TF tutorial notebook implementing BERT: https://www.tensorflow.org/text/tutor... 📚
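And to preview the encoder-decoder vs decoder-only distinction mentioned above, here is a short sketch (our own illustration, not from the presentation) contrasting T5 and GPT-2 via Hugging Face transformers:

```python
# Minimal sketch: encoder-decoder (T5) vs decoder-only / generative (GPT-2).
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM, AutoModelForCausalLM

# T5: an encoder reads the whole input, a decoder generates the output sequence.
t5_tok = AutoTokenizer.from_pretrained("t5-small")
t5 = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
ids = t5_tok("translate English to German: Attention is all you need.", return_tensors="pt").input_ids
print(t5_tok.decode(t5.generate(ids, max_new_tokens=20)[0], skip_special_tokens=True))

# GPT-2: a single decoder stack continues the prompt one token at a time.
gpt_tok = AutoTokenizer.from_pretrained("gpt2")
gpt = AutoModelForCausalLM.from_pretrained("gpt2")
ids = gpt_tok("Attention is all you", return_tensors="pt").input_ids
print(gpt_tok.decode(gpt.generate(ids, max_new_tokens=20, do_sample=False)[0]))
```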
The recording of this cool event 😎 is available at:
https://bit.ly/dla-transformers
Join us on Slack:
https://join.slack.com/t/deeplearning...
Spread the word about our meetup 🎉
Our Deep Learning YouTube playlists are there for you as well; feel free to share and subscribe 😀 The TF Data and Deployment playlist is available at:
http://bit.ly/dla-tf-data-deployment
Are you excited to join us? See you soon!
Best,
Dmitri, Robert, George, and David