Support Materials

This page provides supplementary materials to support your learning. These resources complement our sessions and offer additional background on generative models and related topics.

Course Materials

Lecture Slides

Introduction to NLP

Basic concepts and history of Natural Language Processing

Download PDF

Transformers Architecture

Deep dive into transformer models and attention mechanisms

Download PDF

Fine-tuning Techniques

Advanced methods for adapting pre-trained models

Download PDF
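
One common method in this family can be sketched in a few lines: freezing the pre-trained weights and training only a small task head (a minimal illustration with a stand-in encoder, not code from the slides; whether the slides cover this exact method is an assumption):

```python
import torch
import torch.nn as nn

# Stand-in for a pre-trained encoder; in practice this would be a loaded checkpoint
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True),
    num_layers=2,
)

# Freeze the pre-trained weights...
for param in encoder.parameters():
    param.requires_grad = False

# ...and train only a lightweight classification head on top
head = nn.Linear(64, 2)
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

features = encoder(torch.randn(8, 16, 64))   # (batch, seq_len, d_model)
logits = head(features.mean(dim=1))          # pool over the sequence, then classify
```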

Code Examples

PyTorch Basics Notebook

Introduction to PyTorch for deep learning

Download Notebook
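
As a taste of what the notebook covers, here is a minimal sketch of the two ideas everything else builds on, tensors and autograd (a standalone illustration, not code from the notebook):

```python
import torch

# A tensor that tracks gradients through computation
x = torch.randn(3, requires_grad=True)

# A scalar function of x: y = sum(x^2)
y = (x ** 2).sum()

# Backpropagation computes dy/dx = 2x
y.backward()
print(torch.allclose(x.grad, 2 * x))  # True
```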

LSTM Implementation

Step-by-step implementation of LSTM networks

Download Notebook
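
For orientation before working through the step-by-step version, here is what an LSTM-based sequence classifier looks like using PyTorch's built-in nn.LSTM (a minimal sketch, not the notebook's code):

```python
import torch
import torch.nn as nn

class LSTMClassifier(nn.Module):
    """Toy classifier: embed tokens, run an LSTM, classify from the final hidden state."""
    def __init__(self, vocab_size=1000, embed_dim=64, hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, token_ids):          # token_ids: (batch, seq_len)
        x = self.embed(token_ids)          # (batch, seq_len, embed_dim)
        _, (h_n, _) = self.lstm(x)         # h_n: (1, batch, hidden_dim)
        return self.fc(h_n[-1])            # (batch, num_classes)

model = LSTMClassifier()
logits = model(torch.randint(0, 1000, (4, 20)))  # 4 sequences of length 20
print(logits.shape)                              # torch.Size([4, 2])
```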

Transformer from Scratch

Building a transformer model from scratch in PyTorch

Download Notebook
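
The core building block of that notebook is attention; as a preview, a minimal sketch of scaled dot-product attention from the Vaswani et al. paper listed below (an illustration, not the notebook's code):

```python
import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V"""
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)      # (batch, seq_q, seq_k)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)                    # attention weights
    return weights @ v                                     # (batch, seq_q, d_v)

q = k = v = torch.randn(1, 5, 16)            # self-attention over 5 positions
out = scaled_dot_product_attention(q, k, v)
print(out.shape)                             # torch.Size([1, 5, 16])
```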

Recommended Reading

"Attention Is All You Need" (Vaswani et al., 2017)

The original transformer paper that revolutionized NLP.

View Paper

"BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding" (Devlin et al., 2018)

The paper introducing BERT, which advanced the state of the art on many NLP tasks.

View Paper

"Language Models are Few-Shot Learners" (Brown et al., 2020)

The GPT-3 paper, which explores the few-shot capabilities of large language models.

View Paper

Useful Tools

Here are some tools and resources that can help you in your research and development work:

Hugging Face Transformers

Library providing thousands of pre-trained models for NLP tasks.

Visit Website
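
Getting started takes only a few lines with the high-level pipeline API (the default model downloads automatically on first use; the exact output will vary):

```python
from transformers import pipeline

# Loads a default pre-trained sentiment model on first run
classifier = pipeline("sentiment-analysis")
print(classifier("Attention really is all you need."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```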

PyTorch Lightning

Lightweight PyTorch wrapper for high-performance AI research.

Visit Website
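
Lightning's core idea is moving the training loop into a LightningModule; a minimal sketch of the structure (data loading omitted, and the module is a deliberately trivial regressor):

```python
import torch
import torch.nn as nn
import lightning as L  # newer package name; older installs use `import pytorch_lightning as pl`

class LitRegressor(L.LightningModule):
    def __init__(self):
        super().__init__()
        self.model = nn.Linear(10, 1)

    def training_step(self, batch, batch_idx):
        x, y = batch
        loss = nn.functional.mse_loss(self.model(x), y)
        self.log("train_loss", loss)  # Lightning routes this to the configured logger
        return loss

    def configure_optimizers(self):
        return torch.optim.Adam(self.parameters(), lr=1e-3)

# The Trainer then owns devices, checkpointing, and the loop itself, e.g.:
# L.Trainer(max_epochs=5).fit(LitRegressor(), train_dataloader)
```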

Weights & Biases

MLOps platform for experiment tracking and visualization.

Visit Website
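
A typical integration is a handful of calls around the training loop (requires a free account and `wandb login`; the project name and metrics below are hypothetical):

```python
import wandb

run = wandb.init(project="course-experiments",  # hypothetical project name
                 config={"lr": 1e-3, "epochs": 5})

for epoch in range(run.config.epochs):
    loss = 1.0 / (epoch + 1)                    # stand-in for a real training loss
    wandb.log({"epoch": epoch, "loss": loss})

run.finish()
```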

Papers With Code

Free resource linking ML papers to their code implementations.

Visit Website