Transformer

August 23, 2025

A neural network architecture built on self‑attention and feed‑forward layers; it models long‑range dependencies efficiently and is the foundation of modern LLMs.
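A minimal NumPy sketch of the scaled dot‑product self‑attention at the core of the architecture (function and weight names are illustrative, not from any particular library):

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a sequence x of shape (seq_len, d_model)."""
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    d_k = q.shape[-1]
    scores = q @ k.T / np.sqrt(d_k)                      # pairwise token affinities
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)       # softmax over key positions
    return weights @ v                                   # each output mixes all positions

rng = np.random.default_rng(0)
seq_len, d_model = 4, 8
x = rng.standard_normal((seq_len, d_model))
w_q, w_k, w_v = (rng.standard_normal((d_model, d_model)) for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)
print(out.shape)  # (4, 8): same sequence length, same model dimension
```

Because every position attends to every other position in one step, distant tokens interact directly rather than through a long recurrent chain, which is what makes long‑range dependencies cheap to model.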