how-to-understand-bert 🤗

In this repository, I have collected various sources, visualizations, and code examples related to BERT. I put it together while learning the material myself, so perhaps I should have called it "how-I-understand-bert" 🤔


Theory 🙌

So, I started my long journey into BERT with the blog posts and videos produced by Chris McCormick. After watching the first video, it became clear that I first needed to read more about Attention and the Transformer.

  1. Attention

  2. Transformer
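Before diving into the links above, it may help to see the core operation they all build on. Below is a minimal, self-contained sketch of scaled dot-product attention, the mechanism at the heart of the Transformer; the vectors and names here are made up for illustration.

```python
# Toy sketch of scaled dot-product attention:
# Attention(Q, K, V) = softmax(QK^T / sqrt(d)) V
# (pure Python, single query, tiny made-up vectors)
import math

def softmax(xs):
    m = max(xs)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values):
    """Weighted average of `values`, weighted by how well `query` matches each key."""
    d = len(query)
    # dot product of the query with each key, scaled by sqrt(d)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d) for key in keys]
    weights = softmax(scores)
    dim = len(values[0])
    return [sum(w * v[i] for w, v in zip(weights, values)) for i in range(dim)]

# One query attending over two key/value pairs; the query matches
# the first key better, so the output leans toward the first value.
q = [1.0, 0.0]
K = [[1.0, 0.0], [0.0, 1.0]]
V = [[10.0, 0.0], [0.0, 10.0]]
out = attention(q, K, V)
```

Running this, the first component of `out` dominates, since the query aligns with the first key; that weighting behavior is exactly what the Attention posts above explain in depth.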


Next, let's talk about tokenization in the context of BERT.

  • [en] WordPiece Embeddings - Part II of BERT Research by Chris McCormick ❤️

  • [en] Tokenizers: How machines read ❤️

  • also, you can find various examples in the vocabulary folder ❤️‍🔥
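The sources above describe WordPiece in detail; the core idea can be sketched in a few lines. This is a toy greedy longest-match-first tokenizer with a tiny, made-up vocabulary (real BERT vocabularies have ~30k entries), where subwords that continue a word carry the "##" prefix:

```python
# Minimal sketch of WordPiece tokenization (greedy longest-match-first).
# TOY_VOCAB is invented for illustration; BERT ships a much larger vocab file.
TOY_VOCAB = {"[UNK]", "play", "##ing", "##ed", "un", "##believ", "##able"}

def wordpiece_tokenize(word, vocab=TOY_VOCAB):
    tokens, start = [], 0
    while start < len(word):
        end = len(word)
        cur = None
        # try the longest remaining substring first, shrinking from the right
        while start < end:
            piece = word[start:end]
            if start > 0:
                piece = "##" + piece  # continuation subwords are prefixed
            if piece in vocab:
                cur = piece
                break
            end -= 1
        if cur is None:
            return ["[UNK]"]  # no subword matched: the whole word is unknown
        tokens.append(cur)
        start = end
    return tokens
```

For example, `wordpiece_tokenize("playing")` yields `["play", "##ing"]`, and a word with no matching pieces falls back to `["[UNK]"]` — the same behavior you can observe in the vocabulary examples in this repo.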


And finally, let's read more about BERT.
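One concrete detail worth knowing before reading further: BERT packs its input as `[CLS] sentence A [SEP] sentence B [SEP]`, with segment ids distinguishing the two sentences. A small sketch (toy token lists; in practice the tokens come from the WordPiece tokenizer):

```python
# Sketch of BERT's input packing for a sentence pair:
# [CLS] tokens_a [SEP] tokens_b [SEP], with segment ids 0 / 1.
def build_bert_input(tokens_a, tokens_b=None):
    tokens = ["[CLS]"] + tokens_a + ["[SEP]"]
    segment_ids = [0] * len(tokens)  # sentence A (and its special tokens) -> 0
    if tokens_b:
        tokens += tokens_b + ["[SEP]"]
        segment_ids += [1] * (len(tokens_b) + 1)  # sentence B -> 1
    return tokens, segment_ids

toks, segs = build_bert_input(["how", "to"], ["understand", "bert"])
```

The `[CLS]` token's final hidden state is what BERT uses for classification tasks, which is why it always sits at position 0.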


Examples 💪


Additional sources 📚