# Commented Transformers

Highly commented implementations of Transformers in PyTorch for the *Creating a Transformer From Scratch* series:

1. The Attention Mechanism (a minimal sketch of the core computation follows this list)
2. The Rest of the Transformer
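
The heart of the first post is scaled dot-product attention. The sketch below is a minimal, generic version for orientation only, not the repository's implementation (tensor shapes and the optional mask argument are assumptions):

```python
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, head_dim)
    scale = q.size(-1) ** -0.5
    # Similarity of every query with every key: (batch, heads, seq, seq)
    scores = (q @ k.transpose(-2, -1)) * scale
    if mask is not None:
        # Disallowed positions get -inf so softmax assigns them zero weight
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)
    # Weighted sum of values: (batch, heads, seq_len, head_dim)
    return weights @ v
```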

The `layers` folder contains implementations of Bidirectional Attention, Causal Attention, and `CausalCrossAttention`.
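
The difference between the bidirectional and causal variants comes down to masking. A small illustrative snippet (not taken from the repo's layer classes):

```python
import torch

seq_len = 4
# Bidirectional attention (BERT-style) uses no mask: every token attends
# to every other token. Causal attention (GPT-style) restricts each token
# to itself and earlier positions via a lower-triangular mask:
causal_mask = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool))
print(causal_mask)
# tensor([[ True, False, False, False],
#         [ True,  True, False, False],
#         [ True,  True,  True, False],
#         [ True,  True,  True,  True]])
```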

The `models` folder contains single-file implementations of GPT-2 and BERT. Both models are compatible with `torch.compile(..., fullgraph=True)`.
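
With `fullgraph=True`, the compiler must capture the entire forward pass as one graph and raises an error on any graph break, which is what the compatibility claim above guarantees. A hedged usage sketch with a stand-in module (the real GPT-2 and BERT classes and their constructor arguments live in the `models` files and are not reproduced here):

```python
import torch
import torch.nn as nn

# Toy stand-in for one of the repository's models.
model = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)

# fullgraph=True errors out if the forward pass cannot be compiled
# as a single graph (i.e. if any graph break occurs).
compiled = torch.compile(model, fullgraph=True)

x = torch.randn(2, 16, 64)  # (batch, seq_len, d_model)
out = compiled(x)
```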