Introduction to Transformers for NLP

Chapter 1 introduces natural language generation and natural language understanding.

Chapters 3 and 4 show how tokenizers work and walk through sentiment analysis with BERT via the finiteautomata/bertweet-base-sentiment-analysis model.
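
In transformers terms that boils down to a few lines; a minimal sketch (the example sentences are mine, not the book's):

```python
from transformers import AutoTokenizer, pipeline

# Peek at how a tokenizer splits raw text into subword tokens (Chapter 3)
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
print(tokenizer.tokenize("Transformers make tokenization easy to inspect."))

# Sentiment analysis with the BERTweet-based model the book uses (Chapter 4);
# the model is downloaded from the Hugging Face Hub on first run
sentiment = pipeline(
    "sentiment-analysis",
    model="finiteautomata/bertweet-base-sentiment-analysis",
)
print(sentiment("I absolutely loved this movie!"))  # labels for this model: POS / NEU / NEG
```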

Lots of goodies in Chapter 5:

  • Walk-through of setting up Gradio on huggingface.co.
  • An example of a chatbot using microsoft/DialoGPT-medium (a Gradio + DialoGPT sketch follows this list).
  • Abstractive text summarization using google/pegasus-xsum (pipeline sketches for this and the next two items also follow the list).
  • Zero-shot learning: taking a pretrained model from Hugging Face that was trained on one dataset and using it for inference on examples it has never seen before.
  • T5 = Text-to-Text Transfer Transformer, which frames every task as text in, text out.
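
Roughly what a minimal Gradio chatbot around DialoGPT could look like as a Space's app.py; this is my own single-turn sketch, not the book's exact code (the Space would also need a requirements.txt listing gradio, transformers, and torch):

```python
import gradio as gr
from transformers import AutoModelForCausalLM, AutoTokenizer

# DialoGPT-medium chatbot; the model is pulled from the Hugging Face Hub on first run
tokenizer = AutoTokenizer.from_pretrained("microsoft/DialoGPT-medium")
model = AutoModelForCausalLM.from_pretrained("microsoft/DialoGPT-medium")

def respond(message):
    # Encode the user message, ending with the EOS token DialoGPT expects
    input_ids = tokenizer.encode(message + tokenizer.eos_token, return_tensors="pt")
    # Generate a single-turn reply (no chat history is kept in this sketch)
    output_ids = model.generate(
        input_ids,
        max_length=200,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Drop the prompt tokens and decode only the newly generated reply
    return tokenizer.decode(output_ids[0, input_ids.shape[-1]:], skip_special_tokens=True)

# Minimal text-in / text-out UI; on a Hugging Face Space this file would be app.py
gr.Interface(fn=respond, inputs="text", outputs="text", title="DialoGPT chatbot").launch()
```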

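The summarization, zero-shot, and T5 items all map onto ready-made pipeline tasks; a rough sketch, where the zero-shot checkpoint (facebook/bart-large-mnli), the candidate labels, and the example inputs are my own illustrative choices:

```python
from transformers import pipeline

# Abstractive summarization with PEGASUS fine-tuned on XSum
summarizer = pipeline("summarization", model="google/pegasus-xsum")
article = "The James Webb Space Telescope has captured new images of distant galaxies, giving astronomers fresh data on the early universe."
print(summarizer(article)[0]["summary_text"])

# Zero-shot classification: the candidate labels were never seen at training time;
# facebook/bart-large-mnli is the pipeline's usual default, the book may use another checkpoint
zero_shot = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
print(zero_shot(
    "The new GPU generation doubles memory bandwidth.",
    candidate_labels=["technology", "sports", "politics"],
))

# T5 frames every task as text-to-text, so the task is named in the prompt itself
t5 = pipeline("text2text-generation", model="t5-small")
print(t5("translate English to German: The book is on the table."))
```
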
Chapter 6 is about fine-tuning the pre-trained bert-base-cased model on IMDb reviews to classify them as positive or negative.
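
A condensed sketch of that fine-tuning loop with the datasets library and the Trainer API; the subset sizes and training arguments here are my own illustrative choices rather than the book's settings:

```python
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

# IMDb reviews, labelled 0 (negative) / 1 (positive)
dataset = load_dataset("imdb")
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")

def tokenize(batch):
    return tokenizer(batch["text"], padding="max_length", truncation=True)

# Small subsets keep the sketch fast; the book trains on more data
train_ds = dataset["train"].shuffle(seed=42).select(range(2000)).map(tokenize, batched=True)
eval_ds = dataset["test"].shuffle(seed=42).select(range(500)).map(tokenize, batched=True)

model = AutoModelForSequenceClassification.from_pretrained("bert-base-cased", num_labels=2)

args = TrainingArguments(
    output_dir="bert-imdb",
    num_train_epochs=1,
    per_device_train_batch_size=8,
)

Trainer(model=model, args=args, train_dataset=train_ds, eval_dataset=eval_ds).train()
```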

Source code for the book is at https://github.com/Apress/intro-transformers-nlp