This presentation covers: Power Transformers, Ideal Transformers, Transformer Current Relationships, Transformer Voltage Relationships, Transformer Power and Impedance, Real Transformer Models, Identifying Model Parameters, The Per-Unit System, Example Problems, Three-Phase Per-Unit System.
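Among the topics above, the per-unit system normalizes every quantity by a chosen base value so that transformer ratios drop out of calculations. As a minimal sketch (the base values below are illustrative assumptions, not from the presentation):

```python
def to_per_unit(value, base):
    """Express a quantity in per-unit: actual value divided by its base."""
    return value / base

# Assumed illustrative bases: S_base = 100 MVA, V_base = 138 kV
S_base = 100e6                    # volt-amperes
V_base = 138e3                    # volts
Z_base = V_base**2 / S_base       # ohms; impedance base follows from Z = V^2 / S

# A 95.22-ohm line impedance expressed on this base
Z_pu = to_per_unit(95.22, Z_base)
print(round(Z_pu, 3))             # -> 0.5
```

Once all quantities are on a common base, networks with multiple voltage levels can be analyzed as if the transformers were removed.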
Transformers model long-range dependencies between input sequence elements. This presentation covers: What is a Transformer?, Self-Attention, Query, Key, Value, Position Encoding, Encoder-Decoder.
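The query, key, and value projections listed above combine in scaled dot-product self-attention: each token's query is compared against every token's key, and the resulting weights mix the values. A minimal NumPy sketch (the random projection matrices stand in for learned parameters):

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence.

    X: (seq_len, d_model); Wq, Wk, Wv project tokens to queries, keys, values.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # pairwise compatibility
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # mix values by attention

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))                          # 4 tokens, d_model = 8
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)
print(out.shape)                                     # (4, 8)
```

Because every token attends to every other token in one step, dependencies between distant positions are captured without the sequential bottleneck of a recurrent network.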
The topics in this presentation are as follows: Transformer overview, Self-attention, Multi-head attention, Common transformer ingredients, Pioneering transformer: machine translation.
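Multi-head attention, listed among the topics above, runs several attention operations in parallel on lower-dimensional projections and concatenates the results, letting different heads specialize in different relations. A toy sketch (random matrices are placeholders for learned projections, an assumption for illustration):

```python
import numpy as np

def multi_head_attention(X, num_heads):
    """Toy multi-head self-attention: split d_model across heads,
    attend independently per head, then concatenate the head outputs."""
    seq_len, d_model = X.shape
    d_head = d_model // num_heads                    # dimension per head
    rng = np.random.default_rng(1)
    heads = []
    for _ in range(num_heads):
        Wq, Wk, Wv = (rng.normal(size=(d_model, d_head)) for _ in range(3))
        Q, K, V = X @ Wq, X @ Wk, X @ Wv
        s = Q @ K.T / np.sqrt(d_head)                # scaled dot products
        w = np.exp(s - s.max(axis=-1, keepdims=True))
        w /= w.sum(axis=-1, keepdims=True)           # softmax per query
        heads.append(w @ V)
    return np.concatenate(heads, axis=-1)            # back to (seq_len, d_model)

X = np.random.default_rng(0).normal(size=(4, 8))
out = multi_head_attention(X, num_heads=2)
print(out.shape)                                     # (4, 8)
```

A real implementation would also apply a learned output projection after concatenation; it is omitted here to keep the sketch minimal.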
Author(s): Danna Gurari, University of Colorado Boulder