To generate an entire screenful of image data live, we design special hardware that can process many vertices, triangles, & "fragment" pixels (the candidate output pixels) in parallel.
In contrast to CPUs, as we stuffed ever more transistors into our microchips, GPUs got conceptually simpler. So how does a modern GPU work?
In short: we stuff it full of "compute units", each cycling between several work items in flight, so that while one item waits on a memory fetch (served via a cache hierarchy) or some other source of latency, another can make progress.
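A toy sketch of that cycling, with Python generators standing in for work items (all names here are illustrative, not a real GPU's scheduler):

```python
# Toy model of GPU latency hiding: a compute unit keeps several work
# items in flight and switches to another item whenever one stalls
# on a (simulated) memory fetch.

def work_item(name):
    # Each yield models a stall: issue a load, then wait for it.
    for _ in range(2):
        yield  # stalled on a memory fetch; the unit switches away
        # ...compute on the fetched data would happen here...

def compute_unit(names):
    """Round-robin between in-flight items instead of idling."""
    schedule = []  # order in which items got a turn on the unit
    in_flight = [(name, work_item(name)) for name in names]
    while in_flight:
        name, gen = in_flight.pop(0)
        try:
            next(gen)                       # run until the item stalls
            schedule.append(name)
            in_flight.append((name, gen))   # re-queue; switch items
        except StopIteration:
            pass                            # item finished, drop it

    return schedule

# Four items interleave: no single stall leaves the unit idle.
print(compute_unit(["a", "b", "c", "d"]))
# → ['a', 'b', 'c', 'd', 'a', 'b', 'c', 'd']
```

The point of the toy: the unit's throughput comes from having enough items in flight to cover each item's stalls, not from making any single item faster.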
1/4?