Understanding and Applying BERT | Bidirectional Encoder Representations from Transformers | NLP | Py

Published: 05 July 2022
on channel: Spencer Pao

BERT is an open-source machine learning framework for natural language processing (NLP) developed by the Google AI team. It has led to state-of-the-art systems that have made significant breakthroughs on common problems such as natural language inference, question answering, sentiment analysis, and text summarization.

I go through the basic theory, architecture, and implementation, and in no time you will be conversational in this brilliant architecture!

Feel free to support me! Do know that just viewing my content is plenty of support! 😍
☕Consider supporting me! https://ko-fi.com/spencerpao ☕

Watch Next?
Transformer →    • Transformers EXPLAINED! Neural Networ...  
LSTM →    • [LSTM] Applying and Understanding Lon...  
Sentiment Analysis →    • NLP Sentiment Analysis in Python  

🔗 My Links 🔗
BERT Notebook: https://github.com/SpencerPao/Natural...
Google's Notebook: https://colab.research.google.com/git...
Github: https://github.com/SpencerPao/spencer...
My Website: https://spencerpao.github.io/

📓 Requirements 🧐
Intermediate or advanced Python knowledge
Google Account
Google Paper on BERT → https://arxiv.org/abs/1810.04805

⌛ Timeline ⌛
0:00 - BERT Importance
1:05 - BERT Architecture
1:39 - Pre-training Phase MLM and NSP
5:25 - Fine-tuning
6:58 - BERT Code Implementation (CMD or Notebook)
9:51 - Create Tokenizer and Important Features
11:45 - Transforming Text to BERT input
12:38 - Fine Tuning Model, Testing, and Predictions
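The pre-training and input-preparation steps in the timeline can be sketched in plain Python. This is a toy illustration, not the real tokenizer: actual BERT uses a ~30k-token WordPiece vocabulary (e.g. via Hugging Face's BertTokenizer), while here a tiny whitespace-based vocabulary stands in so the [CLS]/[SEP] layout, segment ids, attention mask, and the paper's 80/10/10 MLM masking rule are visible end to end.

```python
import random

# Toy vocabulary standing in for BERT's WordPiece vocab (assumption: real BERT
# tokenizes into subwords; here we split on whitespace for brevity).
SPECIALS = ["[PAD]", "[UNK]", "[CLS]", "[SEP]", "[MASK]"]
WORDS = ["the", "cat", "sat", "on", "mat", "it", "purred"]
VOCAB = {tok: i for i, tok in enumerate(SPECIALS + WORDS)}

def encode_pair(sent_a, sent_b, max_len=16):
    """Build BERT-style inputs for a sentence pair: [CLS] A [SEP] B [SEP].

    Returns (input_ids, token_type_ids, attention_mask), padded to max_len.
    The pair layout is what the NSP pre-training objective consumes."""
    ids, segs = [VOCAB["[CLS]"]], [0]
    for seg_id, sent in ((0, sent_a), (1, sent_b)):
        for w in sent.lower().split():
            ids.append(VOCAB.get(w, VOCAB["[UNK]"]))
            segs.append(seg_id)
        ids.append(VOCAB["[SEP]"])
        segs.append(seg_id)
    attn = [1] * len(ids)
    pad = max_len - len(ids)
    return ids + [VOCAB["[PAD]"]] * pad, segs + [0] * pad, attn + [0] * pad

def mlm_mask(ids, p=0.15, rng=random.Random(0)):
    """Apply the BERT paper's MLM rule to non-special tokens: of the ~15%
    selected, 80% become [MASK], 10% a random token, 10% stay unchanged.
    Labels hold the original id at selected positions, -100 elsewhere
    (the conventional 'ignore' index for the loss)."""
    masked, labels = list(ids), [-100] * len(ids)
    for i, tok in enumerate(ids):
        if tok < len(SPECIALS) or rng.random() >= p:
            continue  # never mask [CLS]/[SEP]/[PAD]; keep 85% of the rest
        labels[i] = tok
        r = rng.random()
        if r < 0.8:
            masked[i] = VOCAB["[MASK]"]
        elif r < 0.9:
            masked[i] = rng.randrange(len(SPECIALS), len(VOCAB))

    return masked, labels

ids, segs, attn = encode_pair("the cat sat on the mat", "it purred")
masked, labels = mlm_mask(ids)
```

Fine-tuning then reuses the same input format: the pre-trained encoder is kept, the MLM/NSP heads are dropped, and a small task head (e.g. a classifier on the [CLS] position) is trained on labeled data.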

🏷️Tags🏷️:
Machine Learning, BERT, Bidirectional Encoder Representations from Transformers, Statistics, Jupyter notebook, python, Natural language processing, NLP, transformer, encoder, google, AI, google AI, tutorial, how to, code, machine, GPU, google colab, github, pretraining, fine tuning, sentiment, twitter, predictions, AUC, MLM, NSP, Masked Language Model, Next Sentence Prediction

🔔Current Subs🔔:
3,033
