Transformers
Transformers are electrical devices that transfer energy between circuits through electromagnetic induction. They typically consist of primary and secondary windings wound around a common magnetic core. When alternating current flows in the primary winding, the changing magnetic flux in the core induces a voltage in the secondary winding, allowing voltage to be stepped up or down according to the turns ratio. This enables efficient long-distance power transmission and safe local distribution. Transformers operate only with alternating (or otherwise time-varying) current and transfer energy electromagnetically rather than mechanically. Sizes range from small signal transformers to large power transformers rated in kilovolt-amperes (kVA) or megavolt-amperes (MVA). Efficiency is high, typically 95–99%, with core (iron) and winding (copper) losses as the main sources of inefficiency. Common types include step-up, step-down, and isolation transformers, used in power networks, electronic equipment, and many electrical appliances.
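The step-up/step-down behaviour follows from the shared core flux: in an ideal (lossless) transformer the secondary voltage scales with the turns ratio N_s/N_p while the current scales inversely, so apparent power is conserved. The sketch below illustrates that arithmetic with hypothetical values; the function name and numbers are illustrative, not taken from the text above.

```python
def ideal_transformer(v_primary, i_primary, n_primary, n_secondary):
    """Ideal (lossless) transformer: voltage scales with the turns ratio,
    current scales inversely, so apparent power (V * I) is conserved."""
    ratio = n_secondary / n_primary
    v_secondary = v_primary * ratio   # step-up if ratio > 1, step-down if < 1
    i_secondary = i_primary / ratio   # current changes in the opposite direction
    return v_secondary, i_secondary

# Example: 240 V, 10 A on the primary with a 20:1 step-down winding
v_s, i_s = ideal_transformer(240.0, 10.0, n_primary=20, n_secondary=1)
print(v_s, i_s)  # 12.0 V, 200.0 A -> 2400 VA on both sides
```

A real transformer falls slightly short of this because of the core and winding losses noted above, which is why practical efficiency sits in the 95–99% range rather than at 100%.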
In machine learning, transformers are a class of neural network architectures designed for processing sequences using self-attention, which lets every position in a sequence attend directly to every other position rather than relying on recurrence or convolution.
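As an illustration only (not drawn from the text above), the core operation of such models is scaled dot-product attention. The sketch below is a minimal NumPy version with hypothetical toy dimensions, not a production implementation.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Each query position mixes the value vectors, weighted by the
    similarity of its query to every key (row-wise softmax of Q K^T)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # (seq_len, seq_len) similarities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # numerically stable softmax
    return weights @ V                               # (seq_len, d_v) attended output

# Toy example: 4 tokens with 8-dimensional query/key/value vectors
rng = np.random.default_rng(0)
Q, K, V = (rng.normal(size=(4, 8)) for _ in range(3))
print(scaled_dot_product_attention(Q, K, V).shape)   # (4, 8)
```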