Support Materials
This page collects supplementary materials that complement our sessions and provide additional background on generative models and related topics.
Course Materials
Lecture Slides
Code Examples
Recommended Reading
"Attention Is All You Need" (Vaswani et al., 2017)
The original transformer paper that revolutionized NLP.
"BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" (Devlin et al., 2018)
The paper introducing BERT, which advanced the state of the art on many NLP tasks.
"Language Models are Few-Shot Learners" (Brown et al., 2020)
The GPT-3 paper exploring the capabilities of large language models.
Useful Tools
Here are some tools and resources that can help you in your research and development work:
Hugging Face Transformers
Library providing thousands of pre-trained models for NLP tasks.
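As a quick illustration of how the library is typically used, here is a minimal sketch of loading a pre-trained model through the high-level `pipeline` API (the task and the default model it pulls are illustrative, not course-specific; assumes `pip install transformers torch` and network access for the first download):

```python
# Minimal sketch using Hugging Face Transformers.
# A pipeline bundles a tokenizer and a pre-trained model for a given task;
# with no model specified, it downloads a default checkpoint on first use.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("Transformers have transformed NLP research.")
print(result)  # a list of {"label": ..., "score": ...} dictionaries
```

The same `pipeline` entry point covers other tasks (e.g. `"text-generation"`, `"fill-mask"`), which makes it a convenient way to experiment with the models discussed in the lectures before working with tokenizers and models directly.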