Attention & Transformers
In this module, you'll learn:
- What the Attention Mechanism is
- What Transformers are
- The variations of the Attention Mechanism (Self-Attention, Cross-Attention, ...)
- The difference between a Vision Transformer and a regular Transformer
- Why Transformers have the potential to replace Convolutions in Computer Vision
- What Queries, Keys, and Values are in Attention (a minimal sketch follows this outline)
- How Multi-Head Attention works (also sketched below)
9 Lessons
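
To make the Query/Key/Value terminology concrete before the lessons begin, here is a minimal NumPy sketch of scaled dot-product attention. The function and variable names are illustrative assumptions, not code from the course:

```python
import numpy as np

def softmax(x, axis=-1):
    # Subtract the row max for numerical stability before exponentiating.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v) -> (n_q, d_v)."""
    d_k = Q.shape[-1]
    # Similarity of every query with every key, scaled by sqrt(d_k)
    # so the softmax does not saturate when d_k is large.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)   # each row sums to 1
    return weights @ V                   # weighted average of the values

# Self-attention: queries, keys, and values all come from the same sequence.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))              # 4 tokens, model dimension 8
print(scaled_dot_product_attention(x, x, x).shape)  # (4, 8)
```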
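
And a matching sketch of Multi-Head Attention, which runs several attention operations in parallel on lower-dimensional projections of the input and concatenates the results. Again, the names (`multi_head_attention`, `W_q`, `W_k`, `W_v`, `W_o`) and toy shapes are assumptions for illustration:

```python
import numpy as np

def multi_head_attention(x, W_q, W_k, W_v, W_o, n_heads):
    """x: (n, d_model); each W_*: (d_model, d_model); returns (n, d_model)."""
    n, d_model = x.shape
    d_head = d_model // n_heads
    # Project once, then split the feature dimension into heads:
    # (n, d_model) -> (n_heads, n, d_head)
    def project(W):
        return (x @ W).reshape(n, n_heads, d_head).transpose(1, 0, 2)
    Q, K, V = project(W_q), project(W_k), project(W_v)
    # Scaled dot-product attention, run independently per head.
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    scores -= scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    heads = weights @ V                              # (n_heads, n, d_head)
    # Concatenate the heads back along the feature axis and mix them with W_o.
    concat = heads.transpose(1, 0, 2).reshape(n, d_model)
    return concat @ W_o

rng = np.random.default_rng(0)
d_model, n_heads = 16, 4
x = rng.normal(size=(6, d_model))                    # 6 tokens
Ws = [rng.normal(size=(d_model, d_model)) * 0.1 for _ in range(4)]
print(multi_head_attention(x, *Ws, n_heads).shape)   # (6, 16)
```

Splitting `d_model` across heads keeps the total computation roughly the same as single-head attention while letting each head attend to different relationships in the sequence.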